Compare commits

...

305 Commits

Author SHA1 Message Date
Tamara Dahlgren
2f7d8947ba Test-Builder refactor: move do_test from PackageBase to PackageTest 2024-11-18 18:07:34 -08:00
Tamara Dahlgren
9a604bf615 Add set_current_specs to TestSuite and PackageTest
These changes allow removing references to test_suite from builder.py,
deferring them to PackageTest, which ensures the test_suite is set
for the builder's phase tests.

Also added comments to explain the need for test-related attributes
to the builder and package_base.
2024-11-15 17:26:26 -08:00
Tamara Dahlgren
f1487caa92 Testing-Builder refactor: move phase_tests to builder.py 2024-11-15 13:33:11 -08:00
Tamara Dahlgren
de3fb78477 Testing-Builder refactor: move print_message to tty/log.py 2024-11-15 13:16:50 -08:00
Veselin Dobrev
ac0ed2c4cc [mfem] Add a patch for MFEM v4.7 that adds support for SUNDIALS v7 (#47591) 2024-11-15 08:50:44 -06:00
Harmen Stoppels
66a93b5433 Add missing llnl.* imports (#47618) 2024-11-15 15:49:25 +01:00
Harmen Stoppels
b7993317ea Improve type hints for package API (#47576)
by disentangling `package_base`, `builder` and `directives`.
2024-11-15 09:13:10 +01:00
etiennemlb
66622ec4d0 py-easybuild-framework: add python forward compat bound (#47597) 2024-11-14 23:47:23 -07:00
Pranav Sivaraman
9b2cd1b208 yyjson: new package (#47563)
* yyjson: new package

* [@spackbot] updating style on behalf of pranav-sivaraman

---------

Co-authored-by: pranav-sivaraman <pranav-sivaraman@users.noreply.github.com>
2024-11-14 11:01:21 -08:00
Dominic Hofer
9888683a21 eccodes: add v2.38.0 (#47581)
* eccodes: Add 2.38.0

* Update var/spack/repos/builtin/packages/eccodes/package.py
2024-11-14 10:57:59 -08:00
Massimiliano Culpo
fb46c7a72d Rework spack.database.InstallStatuses into a flag (#47321) 2024-11-14 15:43:31 +01:00
Massimiliano Culpo
c0196cde39 Remove support for PGI compilers (#47195) 2024-11-14 09:17:41 +01:00
Todd Gamblin
d091172d67 Spec: prefer a splice-specific method to __len__ (#47585)
Automatic splicing saw `Spec` grow a `__len__` method, but it's only used
in one place and it's not clear the semantics are useful elsewhere. It also
runs the risk of Specs one day being confused for other types of containers.

Rather than introduce a new function for one algorithm, let's use a more
specific method in the splice code.

- [x] Use topological ordering in `_resolve_automatic_splices` instead of 
      sorting by node count
- [x] delete `Spec.__len__()` and `Spec.__bool__()`

---------

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
Co-authored-by: Greg Becker <becker33@llnl.gov>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-11-13 23:20:03 -08:00
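A minimal sketch of the topological ordering used above in `_resolve_automatic_splices`, shown on a plain dependency dict rather than Spack's `Spec` graph (all names here are illustrative, not the actual implementation):

```python
from typing import Dict, List, Set

def topological_order(deps: Dict[str, List[str]]) -> List[str]:
    """Order nodes so each appears after all of its dependencies.

    `deps` maps a node to the nodes it depends on; the graph is assumed
    acyclic, as a concrete spec's dependency graph is.
    """
    order: List[str] = []
    visited: Set[str] = set()

    def visit(node: str) -> None:
        if node in visited:
            return
        visited.add(node)
        for dep in deps.get(node, []):
            visit(dep)      # visit dependencies first...
        order.append(node)  # ...then the node that needs them

    for node in deps:
        visit(node)
    return order

print(topological_order({"app": ["libfoo"], "libfoo": ["zlib"], "zlib": []}))
# -> ['zlib', 'libfoo', 'app']
```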
psakievich
ab51369087 Update tutorial version (#47593) 2024-11-14 08:15:11 +01:00
Matthieu Dorier
1cea82b629 xfsprogs: fix dependency on liburcu (#47582)
* xfsprogs: fix dependency on liburcu

* xfsprogs: fix install rules.d

* xfsprogs: edited xfsprogs requirement on liburcu

* xfsprogs: many more versions
2024-11-13 16:17:20 -06:00
H. Joe Lee
2abb711337 hermes-shm: remove duplicate line (#47575)
close #10
2024-11-13 09:29:25 -07:00
Todd Gamblin
6f948eb847 spack spec: simplify and unify output (#47574)
`spack spec` output has looked like this for a while:

```console
> spack spec /v5fn6xo /wd2p2v7
Input spec
--------------------------------
 -   /v5fn6xo

Concretized
--------------------------------
[+]  openssl@3.3.1%apple-clang@16.0.0~docs+shared build_system=generic certs=mozilla arch=darwin-sequoia-m1
[+]      ^ca-certificates-mozilla@2023-05-30%apple-clang@16.0.0 build_system=generic arch=darwin-sequoia-m1
...

Input spec
--------------------------------
 -   /wd2p2v7

Concretized
--------------------------------
[+]  py-six@1.16.0%apple-clang@16.0.0 build_system=python_pip arch=darwin-sequoia-m1
[+]      ^py-pip@23.1.2%apple-clang@16.0.0 build_system=generic arch=darwin-sequoia-m1
```

But the input spec is right there on the CLI, and it doesn't add anything to the output.
Also, since #44843, specs concretized in the CLI line can be unified, so it makes sense
to display them as we did in #44489 -- as one multi-root tree instead of as multiple
single-root trees.

With this PR, concretize output now looks like this:

```console
> spack spec /v5fn6xo /wd2p2v7
[+]  openssl@3.3.1%apple-clang@16.0.0~docs+shared build_system=generic certs=mozilla arch=darwin-sequoia-m1
[+]      ^ca-certificates-mozilla@2023-05-30%apple-clang@16.0.0 build_system=generic arch=darwin-sequoia-m1
[+]      ^gmake@4.4.1%apple-clang@16.0.0~guile build_system=generic arch=darwin-sequoia-m1
[+]      ^perl@5.40.0%apple-clang@16.0.0+cpanm+opcode+open+shared+threads build_system=generic arch=darwin-sequoia-m1
[+]          ^berkeley-db@18.1.40%apple-clang@16.0.0+cxx~docs+stl build_system=autotools patches=26090f4,b231fcc arch=darwin-sequoia-m1
[+]          ^bzip2@1.0.8%apple-clang@16.0.0~debug~pic+shared build_system=generic arch=darwin-sequoia-m1
[+]              ^diffutils@3.10%apple-clang@16.0.0 build_system=autotools arch=darwin-sequoia-m1
[+]                  ^libiconv@1.17%apple-clang@16.0.0 build_system=autotools libs=shared,static arch=darwin-sequoia-m1
[+]          ^gdbm@1.23%apple-clang@16.0.0 build_system=autotools arch=darwin-sequoia-m1
[+]              ^readline@8.2%apple-clang@16.0.0 build_system=autotools patches=bbf97f1 arch=darwin-sequoia-m1
[+]                  ^ncurses@6.5%apple-clang@16.0.0~symlinks+termlib abi=none build_system=autotools patches=7a351bc arch=darwin-sequoia-m1
[+]                      ^pkgconf@2.2.0%apple-clang@16.0.0 build_system=autotools arch=darwin-sequoia-m1
[+]      ^zlib-ng@2.2.1%apple-clang@16.0.0+compat+new_strategies+opt+pic+shared build_system=autotools arch=darwin-sequoia-m1
[+]          ^gnuconfig@2022-09-17%apple-clang@16.0.0 build_system=generic arch=darwin-sequoia-m1
[+]  py-six@1.16.0%apple-clang@16.0.0 build_system=python_pip arch=darwin-sequoia-m1
[+]      ^py-pip@23.1.2%apple-clang@16.0.0 build_system=generic arch=darwin-sequoia-m1
[+]      ^py-setuptools@69.2.0%apple-clang@16.0.0 build_system=generic arch=darwin-sequoia-m1
[-]      ^py-wheel@0.41.2%apple-clang@16.0.0 build_system=generic arch=darwin-sequoia-m1
...
```

No input spec is displayed -- just the concretization output, shown as one consolidated
tree with multiple roots.

- [x] remove "Input Spec" section and "Concretized" header from `spack spec` output
- [x] print concretized specs as one BFS tree instead of multiple

---------

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2024-11-13 08:21:16 -07:00
Alec Scott
93bf0634f3 nlopt: reformat for best practices (#47340) 2024-11-13 08:20:56 -07:00
Luca Heltai
badb3cedcd dealii: add v9.6.0 (#45554)
Co-authored-by: eugeneswalker <eugenesunsetwalker@gmail.com>
Co-authored-by: Satish Balay <balay@mcs.anl.gov>
2024-11-13 15:34:27 +01:00
Harmen Stoppels
be918817d6 bump version to 0.24.0.dev0 (#47578) 2024-11-13 13:05:14 +01:00
Harmen Stoppels
41d9f687f6 missing and redundant imports (#47577) 2024-11-13 13:03:09 +01:00
dslarm
9642b04513 Add SVE as a variant for Neoverse N2 (#47567)
Defaults to true, but this should be benchmarked to test whether that is the correct decision.
2024-11-12 22:09:05 -07:00
John Gouwar
bf16f0bf74 Add solver capability for synthesizing splices of ABI compatible packages. (#46729)
This PR provides 2 complementary features:
1. An augmentation to the package language to express ABI compatibility relationships among packages. 
2. An extension to the concretizer that can synthesize splices between ABI compatible packages.

1.  The `can_splice` directive and ABI compatibility 
We augment the package language with a single directive: `can_splice`. Here is an example of a package `Foo` exercising the `can_splice` directive:

```python
class Foo(Package):
    version("1.0")
    version("1.1")
    variant("compat", default=True)
    variant("json", default=False)
    variant("pic", default=False)
    can_splice("foo@1.0", when="@1.1")
    can_splice("bar@1.0", when="@1.0+compat")
    can_splice("baz@1.0+compat", when="@1.0+compat", match_variants="*")
    can_splice("quux@1.0", when="@1.1~compat", match_variants="json")
```

Explanations of the uses of each directive:
- `can_splice("foo@1.0", when="@1.1")`: If `foo@1.0` is the dependency of an already installed spec and `foo@1.1` could be a valid dependency for the parent spec, then `foo@1.1` can be spliced in for `foo@1.0` in the parent spec.
- `can_splice("bar@1.0", when="@1.0+compat")`: If `bar@1.0` is the dependency of an already installed spec and `foo@1.0+compat` could be a valid dependency for the parent spec, then `foo@1.0+compat` can be spliced in for `bar@1.0` in the parent spec.
- `can_splice("baz@1.0+compat", when="@1.0+compat", match_variants="*")`: If `baz@1.0+compat` is the dependency of an already installed spec and `foo@1.0+compat` could be a valid dependency for the parent spec, then `foo@1.0+compat` can be spliced in for `baz@1.0+compat` in the parent spec, provided that they have the same value for all other variants (regardless of what those values are).
- `can_splice("quux@1.0", when="@1.1~compat", match_variants="json")`: If `quux@1.0` is the dependency of an already installed spec and `foo@1.1~compat` could be a valid dependency for the parent spec, then `foo@1.1~compat` can be spliced in for `quux@1.0` in the parent spec, provided that they have the same value for their `json` variant.

2. Augmenting the solver to synthesize splices
### Changes to the hash encoding in `asp.py`
Previously, when including concrete specs in the solve, they would have the following form:

```
installed_hash("foo", "xxxyyy")
imposed_constraint("xxxyyy", "foo", "attr1", ...)
imposed_constraint("xxxyyy", "foo", "attr2", ...)
% etc.
```

Concrete specs now have the following form:

```
installed_hash("foo", "xxxyyy")
hash_attr("xxxyyy", "foo", "attr1", ...)
hash_attr("xxxyyy", "foo", "attr2", ...)
```

This transformation allows us to control which constraints are imposed when we select a hash, to facilitate the splicing of dependencies. 

2.1 Compiling `can_splice` directives in `asp.py`
Consider the concrete spec:

```
foo@2.72%gcc@11.4 arch=linux-ubuntu22.04-icelake build_system=autotools ^bar ...
```

It will emit the following facts for reuse (below is a subset):

```
installed_hash("foo", "xxxyyy")
hash_attr("xxxyyy", "hash", "foo", "xxxyyy")
hash_attr("xxxyyy", "version", "foo", "2.72")
hash_attr("xxxyyy", "node_os", "ubuntu22.04")
hash_attr("xxxyyy", "hash", "bar", "zzzqqq")
hash_attr("xxxyyy", "depends_on", "foo", "bar", "link")
```

Rules that derive `abi_splice_conditions_hold` will be generated from
uses of the `can_splice` directive. They will have the following form:

```
can_splice("foo@1.0.0+a", when="@1.0.1+a", match_variants=["b"]) --->

abi_splice_conditions_hold(0, node(SID, "foo"), "foo", BaseHash) :-
  installed_hash("foo", BaseHash),
  attr("node", node(SID, SpliceName)),
  attr("node_version_satisfies", node(SID, "foo"), "1.0.1"),
  hash_attr("hash", "node_version_satisfies", "foo", "1.0.1"),
  attr("variant_value", node(SID, "foo"), "a", "True"),
  hash_attr("hash", "variant_value", "foo", "a", "True"),
  attr("variant_value", node(SID, "foo"), "b", VariVar0),
  hash_attr("hash", "variant_value", "foo", "b", VariVar0).
```


2.2 Synthesizing splices in `concretize.lp` and `splices.lp`

The ASP solver generates "splice_at_hash" attrs to indicate that a particular node has a splice in one of its immediate dependencies. 

Splices can be introduced in the dependencies of concrete specs when `splices.lp` is conditionally loaded (based on the config option `concretizer:splice:True`).

2.3 Constructing spliced specs in `asp.py`

The method `SpecBuilder._resolve_splices` is a top-down, memoized implementation of hybrid splicing. This is an optimization over the more general `Spec.splice`, since the solver gives a global view of exactly which specs can be shared, ensuring the minimal number of splicing operations.
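As a rough illustration of that top-down memoized pattern (shapes and names are hypothetical stand-ins, not the real `SpecBuilder` internals):

```python
from typing import Dict, List, Tuple

Node = Tuple[str, tuple]  # (spec name, resolved dependencies)

def resolve_spliced(name: str, splices: Dict[str, str],
                    deps: Dict[str, List[str]],
                    cache: Dict[str, Node]) -> Node:
    """Resolve a node top-down, applying splices and memoizing results,
    so a dependency shared by many parents is spliced exactly once."""
    if name in cache:
        return cache[name]
    target = splices.get(name, name)  # swap in the replacement, if any
    children = tuple(resolve_spliced(d, splices, deps, cache)
                     for d in deps.get(target, []))
    cache[name] = (target, children)
    return cache[name]

# resolve_spliced("app", {"zlib": "zlib-ng"},
#                 {"app": ["zlib"], "zlib-ng": []}, {})
# -> ("app", (("zlib-ng", ()),))
```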

Misc changes to facilitate configuration and benchmarking 
- Added the method `Solver.solve_with_stats` to expose timers from the public interface for easier benchmarking 
- Added the boolean config option `concretizer:splice` to conditionally load splicing behavior 

Co-authored-by: Greg Becker <becker33@llnl.gov>
2024-11-12 20:51:19 -08:00
v
ad518d975c py-nugraph, ph5concat, py-numl: Add new nugraph packages (#47315) 2024-11-13 01:34:11 +01:00
SXS Bot
a76e3f2030 spectre: add v2024.03.19 (#43275)
Co-authored-by: sxs-bot <sxs-bot@users.noreply.github.com>
2024-11-12 15:16:27 -07:00
Greg Becker
1809b81e1d parse_specs: special case for concretizing lookups quickly (#47556)
We added unification semantics for parsing specs from the CLI, but there are a couple
of special cases in which we can avoid calls to the concretizer for speed when the
specs can all be resolved by lookups.

- [x] special case 1: solving a single spec

- [x] special case 2: all specs are either concrete (come from a file) or have an abstract
      hash. In this case if concretizer:unify:true we need an additional check to confirm
      the specs are compatible.

- [x] add a parameterized test for unifying on the CI

---------

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2024-11-12 15:04:47 -07:00
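A schematic of the fast path described above, with stand-in types and helpers rather than the actual `parse_specs` implementation:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class FakeSpec:
    """Hypothetical stand-in for a parsed spec."""
    name: str
    concrete: bool = False               # e.g. read from a spec file
    abstract_hash: Optional[str] = None  # e.g. /v5fn6xo on the CLI

def lookup(spec: FakeSpec) -> FakeSpec:
    """Stand-in for resolving a spec by database/hash lookup."""
    return FakeSpec(spec.name, concrete=True)

def solve_together(specs: List[FakeSpec]) -> List[FakeSpec]:
    """Stand-in for a full (unified) concretizer solve."""
    return [FakeSpec(s.name, concrete=True) for s in specs]

def resolve(specs: List[FakeSpec], unify: bool) -> List[FakeSpec]:
    if len(specs) == 1:              # special case 1: a single spec
        return solve_together(specs)
    if all(s.concrete or s.abstract_hash for s in specs):
        resolved = [s if s.concrete else lookup(s) for s in specs]
        # with concretizer:unify:true, an extra compatibility check
        # among `resolved` is still needed before returning
        return resolved
    return solve_together(specs)     # general case: call the solver
```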
Alec Scott
a02b40b670 restic: add v0.17.3 (#47553) 2024-11-12 14:15:53 -07:00
Alec Scott
6d8fdbcf82 direnv: add v2.35.0 (#47551) 2024-11-12 13:54:19 -07:00
Paul Gessinger
3dadf569a4 geomodel: Allow configuring C++ standard (#47422)
* geomodel: Allow configuring C++ standard

* drop c++11
2024-11-12 14:41:14 -05:00
Alec Scott
751585f1e3 glab: add v1.48.0 (#47552) 2024-11-12 12:07:34 -07:00
Wouter Deconinck
f6d6a5a480 parsec: update urls (#47416)
* parsec: update urls
* parsec: fix homepage
2024-11-12 11:31:57 -07:00
Matthieu Dorier
57a1ebc77e xfsprogs: fix dependency on gettext (#47547)
* xfsprogs: fix dependency on gettext

* changed dependency on gettext in xfsprogs

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-11-12 11:20:48 -07:00
Wouter Deconinck
acdcd1016a openssh: add v9.9p1 (#47555) 2024-11-12 10:04:12 -08:00
Matthieu Dorier
e7c9bb5258 py-constantly: add v23.10.4 (#47548)
* py-constantly: added version 23.10.4
* py-constantly: fixed dependency on py-versioneer
* py-constantly: updated py-versioneer dependency

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-11-12 09:53:34 -08:00
teddy
e083acdc5d costo: new package; to fix the build, add pkgconfig dep to vtk (#47121)
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2024-11-12 17:04:20 +01:00
Sebastian Pipping
99fd37931c expat: Add 2.6.4 with security fixes + deprecate vulnerable 2.6.3 (#47521) 2024-11-12 07:10:00 -07:00
Harmen Stoppels
00e68af794 llvm-amdgpu: add missing dependency on libxml2 (#47560) 2024-11-12 14:51:33 +01:00
Harmen Stoppels
e33cbac01f getting_started.rst: fix list of spack deps (#47557) 2024-11-12 08:59:07 +01:00
Wouter Deconinck
ada4c208d4 py-cryptography: add v43.0.3 (switch to maturin) (#47546)
* py-cryptography: add v43.0.3 (switch to maturin)
* py-cryptography: deny some setuptools versions
* py-cryptography: depends_on py-setuptools-rust when @42, no range

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-11-11 22:09:33 -07:00
Xavier Delaruelle
91310d3ae6 environment-modules: add version 5.5.0 (#47543)
This new version is compatible with Tcl 9.0. It also requires
'util-linux' for new logging capabilities.
2024-11-11 21:45:03 -07:00
Tim Haines
def1613741 gdb: add version 15.2 (#47540) 2024-11-11 21:44:33 -07:00
Mosè Giordano
ac703bc88d prometheus: add v2.55.1 (#47544) 2024-11-11 21:34:32 -07:00
Mikael Simberg
f0f5ffa9de libunwind: Add 1.7.2, 1.8.1, and new *-stable branches (#47412)
* libunwind: Add 1.7.2 and 1.8.1
* libunwind: Remove deprecated 1.1 version
* libunwind: Add newer *-stable branches: Remove 1.5-stable branch as well as cleanup.
* libunwind: Use GitHub url for all versions
* libunwind: Add conflict for PPC and 1.8.*
* libunwind: Add conflict for aarch64 and 1.8:
   Build fails with

   aarch64/Gos-linux.c: In function '_ULaarch64_local_resume':
   aarch64/Gos-linux.c:147:1: error: x29 cannot be used in asm here
    }
    ^
   aarch64/Gos-linux.c:147:1: error: x29 cannot be used in asm here
   make[2]: *** [Makefile:4795: aarch64/Los-linux.lo] Error 1
2024-11-11 18:17:36 -08:00
Alberto Sartori
65929888de justbuild: add version 1.4.0 (#47410) 2024-11-11 18:15:07 -08:00
Luke Diorio-Toth
2987efa93c packages: new versions (diamond, py-alive-progress, py-bakta, py-deepsig-biocomp), new packages (py-pyhmmer, py-pyrodigal) (#47277)
* added updated versions
* added pyhmmer
* updated infernal
* fix blast-plus for apple-clang
* fix py-biopython build on apple-clang
* remove erroneous biopython dep: build issue is with python 3.8, not biopython
* deepsig python 3.9: expanding unnecessary python restrictions
* add pyrodigal
* fix unnecessarily strict diamond version
* builds and updates: blast-plus indexing broken, still need to test db download and bakta pipeline
* builds and runs
* revert blast-plus changes: remove my personal hacks to get blast-plus to build
2024-11-11 18:13:46 -08:00
Tim Haines
37de92e7a2 extrae: Update dyninst dependency (#47359) 2024-11-11 18:09:03 -08:00
Sreenivasa Murthy Kolam
42fd1cafe6 Fix the build error during compilation of rocdecode package (#47283)
* fix the build error during compilation of rocdecode; it was dependent on the libva-devel package
* address review comment
* address review changes; commit the changes
2024-11-11 18:05:21 -08:00
MatthewLieber
370694f112 osu-micro-benchmarks: add v7.5 (#47423)
* Adding sha for 7.4 release of OSU Micro Benchmarks
* Adds the sha256sum for the OSU mirco benchmarks 7.5 release.
2024-11-11 18:01:39 -08:00
Wouter Deconinck
fc7125fdf3 py-fsspec-xrootd: new package (#47405)
* py-fsspec-xrootd: new package
* py-fsspec-xrootd: depends_on python@3.8:
2024-11-11 17:58:18 -08:00
Stephen Herbener
3fed708618 openmpi: add two_level_namespace variant for MacOS (#47202)
* Add two_level_namespace variant (default is disabled) for MacOS to enable building
executables and libraries with two level namespace enabled.
* Addressed reviewer comments.
* Moved two_level_namespace variant ahead of the patch that uses that variant to
get concretize to work properly.
* Removed extra print statements
2024-11-11 17:55:28 -08:00
renjithravindrankannath
0614ded2ef Removing args to get libraries added in RPATH (#47465) 2024-11-11 17:54:19 -08:00
Satish Balay
e38e51a6bc superlu-dist: add v9.1.0, v9.0.0 (#47461)
Fix typo wrt @xiaoyeli
2024-11-11 19:52:19 -06:00
Wouter Deconinck
c44c938caf rsyslog: add v8.2410.0 (fix CVE) (#47511)
* rsyslog: add v8.2410.0
2024-11-11 17:50:02 -08:00
Wouter Deconinck
cdaacce4db varnish-cache: add v7.6.1 (#47513) 2024-11-11 17:47:53 -08:00
Wouter Deconinck
b98e5886e5 py-pyppeteer: new package (#47375)
* py-pyppeteer: new package
* py-pyee: new package (v11.1.1, v12.0.0)
2024-11-11 17:44:28 -08:00
Wouter Deconinck
09a88ad3bd xerces-c: add v3.3.0 (#47522) 2024-11-11 17:30:30 -08:00
Wouter Deconinck
4d91d3f77f scitokens-cpp: add v1.1.2 (#47523) 2024-11-11 17:28:06 -08:00
Wouter Deconinck
b748907a61 pixman: add v0.44.0 (switch to meson) (#47529)
* pixman: add v0.44.0 (switch to meson)
2024-11-11 17:26:29 -08:00
Wouter Deconinck
cbd9fad66e xtrans: add v1.5.2 (#47530) 2024-11-11 17:22:32 -08:00
Wouter Deconinck
82dd33c04c git: add v2.46.2, v2.47.0 (#47534) 2024-11-11 17:21:27 -08:00
teddy
31b2b790e7 py-non-regression-test-tools: add v1.1.4 (#47520)
Co-authored-by: t. chantrait <teddy.chantrait@cea.fr>
2024-11-11 18:57:00 -06:00
Alec Scott
9fd698edcb fzf: add v0.56.2 (#47549) 2024-11-11 16:15:34 -08:00
Alec Scott
247446a8f3 bfs: add v4.0.4 (#47550) 2024-11-11 16:12:48 -08:00
Paul Gessinger
993f743245 soqt: new package (#47443)
* soqt: Add SoQt package

The geomodel package needs this if visualization is turned on.

* make qt versions explicit

* use virtual dependency for qt

* pr feedback

Remove myself as maintainer
Remove v1.6.0
Remove unused qt variant
2024-11-11 17:03:08 -06:00
Harmen Stoppels
786f8dfcce openmpi: fix detection (#47541)
Take a simpler approach to listing variant options -- store them in variables instead of trying to
roundtrip them through metadata dictionaries.
2024-11-11 14:14:38 -08:00
Harmen Stoppels
4691301eba Compiler.default_libc: early exit on darwin/win (#47554)
* Compiler.default_libc: early exit on darwin/win

* use .cc when testing c++ compiler if c compiler is missing
2024-11-11 14:12:43 -08:00
eugeneswalker
a55073e7b0 vtk-m %oneapi@2025: cxxflags add -Wno-error=missing-template-arg-list-after-template-kw (#47477) 2024-11-11 13:57:45 -08:00
Harmen Stoppels
484c9cf47c py-pillow: patch for disabling optional deps (#47542) 2024-11-11 11:55:47 -07:00
Peter Scheibel
9ed5e1de8e Bugfix: spack find -x in environments (#46798)
This addresses part [1] of #46345

#44713 introduced a bug where all non-spec query parameters like date
ranges, -x, etc. were ignored when an env was active.

This fixes that issue and adds tests for it.

---------

Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2024-11-11 10:13:31 -08:00
Massimiliano Culpo
4eb7b998e8 Move concretization tests to the same folder (#47539)
* Move concretization tests to the same folder

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>

* Fix for clingo-cffi

---------

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-11-11 19:01:24 +01:00
Satish Balay
3b423a67a2 butterflypack: add v3.2.0, strumpack: add v8.0.0 (#47462)
* butterflypack: add v3.2.0

* strumpack: add v8.0.0

* restrict fj patch to @1.2.0

* Update var/spack/repos/builtin/packages/butterflypack/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-11-11 11:18:03 -06:00
kwryankrattiger
b803dabb2c mirrors: allow username/password as environment variables (#46549)
`spack mirror add` and `set` now have flags `--oci-username-variable`, `--oci-password-variable`, `--s3-access-key-id-variable`, `--s3-access-key-secret-variable`, `--s3-access-token-variable`, which allow users to specify an environment variable in which a username or password is stored.

Storing plain text passwords in config files is considered deprecated.

The schema for mirrors.yaml has changed, notably the `access_pair` list is generally replaced with a dictionary of `{id: ..., secret_variable: ...}` or `{id_variable: ..., secret_variable: ...}`.
2024-11-11 16:34:39 +01:00
v
33dd894eff py-oracledb: add v1.4.2, v2.3.0, v2.4.1 (#47313)
The py-oracledb package only has a single outdated version available in its recipe. This PR adds a much broader range of versions and their corresponding checksums.

* add more versions of py-oracledb
* update py-oracledb recipe
* add py-cython version dependencies
* tweak py-cython version dependencies
* remove older versions of py-oracledb
2024-11-11 07:10:09 -08:00
Satish Balay
f458392c1b petsc: use --with-exodusii-dir [as exodus does not have 'libs()' to provide value for --with-exodusii-lib] (#47506) 2024-11-11 08:52:20 -06:00
Wouter Deconinck
8c962a94b0 vbfnlo: add v3.0; depends on tcsh (build) (#47532)
* vbfnlo: depends on tcsh (build)

* vbfnlo: add v3.0

* vbfnlo: comment

Co-authored-by: Valentin Volkl <valentin.volkl@cern.ch>

---------

Co-authored-by: Valentin Volkl <valentin.volkl@cern.ch>
2024-11-11 07:24:44 -07:00
Wouter Deconinck
8b165c2cfe py-gosam: add v2.1.2 (#47533) 2024-11-11 14:28:52 +01:00
Mikael Simberg
01edde35be ut: add 2.1.0 and 2.1.1 (#47538) 2024-11-11 14:07:08 +01:00
Massimiliano Culpo
84d33fccce llvm: filter clang-ocl from the executables being probed (#47536)
This filters any selected executable ending with `-ocl` from the list of executables being probed as candidates for external `llvm` installations.

I couldn't reproduce the entire issue, but with a simple script:
```
#!/bin/bash

touch foo.o
echo "clang version 10.0.0-4ubuntu1 "
echo "Target: x86_64-pc-linux-gnu"
echo "Thread model: posix"
echo "InstalledDir: /usr/bin"
exit 0
```
I noticed the executable was still probed:
```
$ spack -d compiler find /tmp/ocl
[ ... ]
==> [2024-11-11-08:38:41.933618] '/tmp/ocl/bin/clang-ocl' '--version'
```
and `foo.o` was left in the working directory. With this change, instead the executable is filtered out of the list on which we run `--version`, so `clang-ocl --version` is not run by Spack.
2024-11-11 05:05:01 -07:00
Todd Gamblin
c4a5a996a5 solver: avoid parsing specs in setup
- [x] Get rid of a call to `parser.quote_if_needed()` during solver setup, which
      introduces a circular import and also isn't necessary.

- [x] Rename `spack.variant.Value` to `spack.variant.ConditionalValue`, as it is *only*
      used for conditional values. This makes it much easier to understand some of the
      logic for variant definitions.

Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-11-11 01:54:57 -08:00
Todd Gamblin
6961514122 imports: move conditional to directives.py
`conditional()`, which defines conditional variant values, and the other ways to declare
variant values should probably be in a layer above `spack.variant`. This does the simple
thing and moves *just* `conditional()` to `spack.directives` to avoid a circular import.

We can revisit the public variant interface later, when we split packages from core.

Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-11-11 01:54:57 -08:00
Harmen Stoppels
a9e6074996 filesystem.py find: return directories and improve performance (#47537) 2024-11-11 09:43:23 +00:00
Giuncan
30db764449 lua: always generate pcfile without patch and remove +pcfile variant (#47353)
* lua: add +pcfile support for @5.4: versions, without using a version-dependent patch

* lua: always generate pcfile, remove +pcfile variant from all packages

* lua: minor fixes

* rpm: minor fix
2024-11-10 20:12:15 -06:00
Wouter Deconinck
f5b8b0ac5d mbedtls: add v2.28.9, v3.6.2 (fix CVEs) (#46637)
* mbedtls: add v2.28.9, v3.6.1 (fix CVEs)
* mbedtls: add v3.6.2
2024-11-10 15:09:06 -08:00
Dave Keeshan
913dcd97bc verilator: add v5.030 (#47455)
* Add 5.030 and remove the requirement to patch verilator; the problem has been fixed in this rev

* Update var/spack/repos/builtin/packages/verilator/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-11-10 16:07:12 -07:00
Adam J. Stewart
68570b7587 GDAL: add v3.10.0 (#47472) 2024-11-10 14:34:20 -06:00
Stephen Nicholas Swatman
2da4366ba6 benchmark: enable shared libraries by default (#47368)
* benchmark: enable shared libraries by default

The existing behaviour of Google Benchmark yields static objects, which
are of little use for most projects. This PR changes the spec to use
dynamic libraries instead.

* Add shared variant
2024-11-10 14:12:23 -06:00
Adam J. Stewart
2713b0c216 py-kornia: add v0.7.4 (#47435) 2024-11-10 13:21:01 -06:00
Matthieu Dorier
16b01c5661 librdkafka: added missing dependency on curl (#47500)
* librdkafka: added missing dependency on curl

This PR adds a missing dependency on curl in librdkafka.

* librdkafka: added dependency on openssl and zlib
2024-11-10 13:06:41 -06:00
dependabot[bot]
ebd4ef934c build(deps): bump types-six in /.github/workflows/requirements/style (#47454)
Bumps [types-six](https://github.com/python/typeshed) from 1.16.21.20241009 to 1.16.21.20241105.
- [Commits](https://github.com/python/typeshed/commits)

---
updated-dependencies:
- dependency-name: types-six
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-11-10 13:03:37 -06:00
Kaan
97b5ec6e4f Add support for Codeplay AMD Plugin for Intel OneAPI Compilers (#46749)
* Added support for Codeplay AMD Plugin for Intel OneAPI Compilers

* [@spackbot] updating style on behalf of kaanolgu

* Adding 2025.0.0

* removed HOME and XDG_RUNTIME_DIR

* [@spackbot] updating style on behalf of kaanolgu

---------

Co-authored-by: Kaan Olgu <kaan.olgu@bristol.ac.uk>
2024-11-10 11:51:39 -07:00
Dave Keeshan
4c9bc8d879 Add v0.47 (#47456) 2024-11-10 11:51:07 -06:00
Chris Marsh
825fd1ccf6 Disable the optional flexiblas support: the system flexiblas may otherwise be picked up, since flexiblas is not a dependency and the build chain to support it is not set up. As this does not seem to be needed with the Spack blas and lapack, it is easier to disable (#47514) 2024-11-10 10:47:12 -07:00
Matthieu Dorier
33109ce9b9 lksctp-tools: added version 1.0.21 (#47493)
Adds version 1.0.21 of lksctp-tools
2024-11-10 11:11:13 -06:00
Adam J. Stewart
fb5910d139 py-torchmetrics: add v1.5.2 (#47497) 2024-11-10 10:53:15 -06:00
JStewart28
fa6b8a4ceb beatnik: add v1.1 (#47361)
Co-authored-by: Patrick Bridges <patrickb314@gmail.com>
2024-11-09 13:43:55 -07:00
Dom Heinzeller
97acf2614a cprnc: set install rpath and add v1.0.8 (#47505) 2024-11-09 15:39:55 +01:00
eugeneswalker
e99bf48d28 Revert "upcxx %oneapi@2025: cxxflags add -Wno-error=missing-template-arg-list-after-template-kw (#47503)" (#47512)
This reverts commit 4322cf56b1.
2024-11-09 06:12:46 -08:00
Massimiliano Culpo
b97015b791 ci: ci/all must always run, and fail if any job has status "fail" or "canceled" (#47517)
This means it succeeds when both jobs have either status "success"
or status "skipped"
2024-11-09 06:04:51 -08:00
Seth R. Johnson
1884520f7b root: fix macos build (#47483)
No ROOT `builtin` should ever be set to true if possible, because that
builds an existing library that spack may not know about.

Furthermore, using `builtin_glew` forces the package to be on, even when
not building x/gl/aqua on macos. This causes build failures.

Caused by https://github.com/spack/spack/pull/45632#issuecomment-2276311748 .
2024-11-09 07:30:38 -06:00
Todd Gamblin
7fbfb0f6dc Revert "fix patched dependencies across repositories (#42463)" (#47519)
This reverts commit da1d533877.
2024-11-09 10:25:25 +01:00
Massimiliano Culpo
11d276ab6f Fix style checks on develop (#47518)
`mypy` checks have been accidentally broken by #47213
2024-11-08 23:50:37 -08:00
Greg Becker
da1d533877 fix patched dependencies across repositories (#42463)
Currently, if a package has a dependency from another repository and patches it,
generation of the patch cache will fail. Concretization succeeds if a fixed patch
cache is in place.

- [x] don't assume that patched dependencies are in the same repo when indexing
- [x] add some test fixtures to support multi-repo tests.

---------

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2024-11-08 18:07:40 -07:00
Harmen Stoppels
c6997e11a7 spack.compiler/spack.util.libc: add caching (#47213)
* spack.compiler: cache output

* compute libc from the dynamic linker at most once per spack process

* wrap compiler cache entry in class, add type hints

* test compiler caching

* ensure tests do not populate user cache, and fix 2 tests

* avoid recursion: cache lookup -> compute key -> cflags -> real_version -> cache lookup

* allow compiler execution in test that depends on get_real_version
2024-11-08 16:25:02 -08:00
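A minimal sketch of the once-per-process caching idea above, using `functools.lru_cache` as a stand-in for the commit's cache-entry class:

```python
import functools
import subprocess

@functools.lru_cache(maxsize=None)
def compiler_output(exe: str, flag: str) -> str:
    """Run a compiler at most once per (exe, flag) pair per process;
    repeated calls hit the cache instead of spawning the compiler."""
    return subprocess.run([exe, flag], capture_output=True,
                          text=True, check=False).stdout
```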
eugeneswalker
4322cf56b1 upcxx %oneapi@2025: cxxflags add -Wno-error=missing-template-arg-list-after-template-kw (#47503) 2024-11-08 15:12:50 -07:00
Harmen Stoppels
907a37145f llnl.util.filesystem: multiple entrypoints and max_depth (#47495)
If a package `foo` doesn't implement `libs`, the default was to search recursively for `libfoo` whenever asking for `spec[foo].libs` (this also happens automatically if a package includes `foo` as a link dependency).

This can lead to some strange behavior:
1. A package that is normally used as a build dependency (e.g. `cmake` at one point) is referenced like
   `depends_on(cmake)` which leads to a fully-recursive search for `libcmake` (this can take
   "forever" when CMake is registered as an external with a prefix like `/usr`, particularly on NFS mounts).
2. A similar hang can occur if a package is registered as an external with an incorrect prefix

- [x] Update the default library search to stop after a maximum depth (by default, search
  the root prefix and each directory in it, but no lower).

The following is a list of known changes to `find` compared to `develop`:

1. Matching directories are no longer returned -- `find` consistently only finds non-dirs, 
   even at `max_depth`
2. Symlinked directories are followed (needed to support max_depth)
3. `find(..., "dir/*.txt")` is allowed, for finding files inside certain dirs. These "complex"
   patterns are delegated to `glob`, like they are on `develop`.
4. `root` and `files` arguments both support generic sequences, and `root`
   allows both `str` and `path` types. This allows us to specify multiple entry points to `find`.

---------

Co-authored-by: Peter Scheibel <scheibel1@llnl.gov>
2024-11-08 13:55:53 -08:00
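A sketch of a depth-limited search over multiple roots (simplified: it omits the deduplication of symlinked directories that the real implementation needs):

```python
import os
from typing import Iterable, Iterator

def find_files(roots: Iterable[str], suffix: str, max_depth: int) -> Iterator[str]:
    """Yield files whose name ends with `suffix`, descending at most
    `max_depth` directory levels below each root."""
    stack = [(root, 0) for root in roots]
    while stack:
        path, depth = stack.pop()
        try:
            entries = list(os.scandir(path))
        except OSError:
            continue  # unreadable or vanished directory
        for entry in entries:
            if entry.is_dir(follow_symlinks=True):
                if depth < max_depth:
                    stack.append((entry.path, depth + 1))
            elif entry.name.endswith(suffix):
                yield entry.path

# find_files(["/usr"], ".so", max_depth=1) searches /usr and each
# directory directly inside it, but no lower -- avoiding the fully
# recursive scan described above.
```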
Harmen Stoppels
4778d2d332 Add missing imports (#47496) 2024-11-08 17:51:58 +01:00
Mikael Simberg
eb256476d2 pika: add 0.30.0 (#47498) 2024-11-08 17:07:50 +01:00
Alec Scott
ff26d2f833 spack env track command (#41897)
This PR adds a sub-command to `spack env` (`track`) which allows users to add/link
anonymous environments into their installation as named environments. This allows
users to more easily track their installed packages and the environments they're
dependencies of. For example, with the addition of #41731 it's now easier to remove
all packages not required by any environments with:

```
spack gc -bE
```

#### Usage
```
spack env track /path/to/env
==> Linked environment in /path/to/env
==> You can activate this environment with:
==>     spack env activate env
```

By default `track /path/to/env` will use the last directory in the path as the name of 
the environment. However users may customize the name of the linked environment
with `-n | --name`. Shown below.
```
spack env track /path/to/env --name foo 
==> Tracking environment in /path/to/env
==> You can activate this environment with:
==>     spack env activate foo
```

When removing a linked environment, Spack will remove the link to the environment
but will keep the structure of the environment within the directory. This will allow
users to remove a linked environment from their installation without deleting it from
a shared repository.

There is a `spack env untrack` command that can be used to *only* untrack a tracked
environment -- it will fail if it is used on a managed environment.  Users can also use
`spack env remove` to untrack an environment.

This allows users to continue to share environments in git repositories  while also having
the dependencies of those environments be remembered by Spack.

---------

Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2024-11-08 00:16:01 -08:00
Harmen Stoppels
ed916ffe6c Revert "filesystem.py: add max_depth argument to find (#41945)"
This reverts commit 38c8069ab4.
2024-11-07 13:09:10 -08:00
Harmen Stoppels
4fbdf2f2c0 Revert "llnl.util.filesystem.find: restore old error handling (#47463)"
This reverts commit a31c525778.
2024-11-07 13:09:10 -08:00
Harmen Stoppels
60ba61f6b2 Revert "llnl.util.filesystem.find: multiple entrypoints (#47436)"
This reverts commit 73219e4b02.
2024-11-07 13:09:10 -08:00
Chris White
0a4563fd02 silo package: update patch (#47457)
Update patch based on LLNL/Silo#319 to fix build of 4.10.2
2024-11-07 12:49:26 -08:00
Marc T. Henry de Frahan
754408ca2b Add fast farm variant to openfast (#47486) 2024-11-07 13:21:38 -07:00
Harmen Stoppels
0d817878ea spec.py: fix comparison with multivalued variants (#47485) 2024-11-07 19:29:37 +00:00
eugeneswalker
bf11fb037b loki%oneapi@2025: -Wno-error=missing-template-arg-list-after-template-kw (#47475) 2024-11-06 18:49:35 -07:00
Marc T. Henry de Frahan
074b845cd3 Add amr-wind versions (#47479) 2024-11-06 17:30:29 -07:00
eugeneswalker
dd26732897 legion%oneapi@2025: cxxflags add -Wno-error=missing-template-arg-list-after-template-kw (#47478) 2024-11-06 16:26:04 -08:00
eugeneswalker
3665c5c01b slate %oneapi@2025: cxxflags: add -Wno-error=missing-template-arg-list-after-template-kw (#47476) 2024-11-06 16:25:47 -08:00
Harmen Stoppels
73219e4b02 llnl.util.filesystem.find: multiple entrypoints (#47436)
You can now provide multiple roots to a single `find()` call and all of
them will be searched. The roots can overlap (e.g. can be parents of one
another).

This also adds a library function for taking a set of regular expression
patterns and creating a single OR expression (and that library function
is used in `find` to improve its performance).
2024-11-06 15:22:26 -08:00
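The OR-combination helper mentioned above can be sketched as follows (a simplified stand-in, not the actual library function):

```python
import re
from typing import Iterable, Pattern

def make_or_regex(patterns: Iterable[str]) -> Pattern:
    """Combine several patterns into one, so each file name is tested
    against a single compiled regex instead of many in a loop."""
    return re.compile("|".join(f"(?:{p})" for p in patterns))

matcher = make_or_regex([r"libfoo\.so.*", r"libbar\.so.*"])
assert matcher.match("libfoo.so.1")
assert not matcher.match("libbaz.so")
```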
Jon Rood
57a90c91a4 nalu-wind: fix hypre constraints (#47474) 2024-11-06 15:47:44 -07:00
Cameron Smith
8f4a0718bf omega-h: new version and cuda conflicts for prior versions (#47473)
* omegah: add version 10.8.6

* omegah: cuda without kokkos conflict

* omegah: test with latest version in ci
2024-11-06 22:36:59 +00:00
Wouter Deconinck
9049ffdc7a gsoap: add v2.8.135 (#47415)
* gsoap: add v2.8.135
2024-11-06 12:52:30 -07:00
eugeneswalker
d1f313342e tau: add v2.34 (#47471) 2024-11-06 10:15:52 -07:00
Massimiliano Culpo
e62cf9c45b Fix spack -c <override> when env active (#47403)
Set command line scopes last in _main, so they are higher scopes

Restore the global configuration in a spawned process by inspecting
the result of ctx.get_start_method()

Add the ability to pass a mp.context to PackageInstallContext.

Add shell-tests to check overriding the configuration:
- Using both -c and -C from command line
- With and without an environment active
2024-11-06 17:18:58 +01:00
Wouter Deconinck
ee2723dc46 rivet: add through v4.0.2 (incl yoda: add through v2.0.2) (#47383)
* yoda: add v2.0.1, v2.0.2

* rivet: add v3.1.9, v3.1.10, v4.0.0, v4.0.1, v4.0.2

* rivet: yoda@:1 when @:3; conflicts hepmc3@3.3.0 when @:4.0.0

* rivet: fix style

* rivet: hepmc=2 only when @:3; use libs.directories[0]

* hepmc3: def libs

* [@spackbot] updating style on behalf of wdconinc

---------

Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-11-06 09:09:40 -07:00
Harmen Stoppels
d09b185522 Fix various bootstrap/concretizer import issues (#47467) 2024-11-06 14:35:04 +00:00
Harmen Stoppels
a31c525778 llnl.util.filesystem.find: restore old error handling (#47463) 2024-11-06 11:49:14 +01:00
Thomas Madlener
2aa5a16433 edm4hep: Add json variant for newer versions (#47180)
* edm4hep: Add json variant for newer versions

An explicit option has been added to EDM4hep, so we now expose it via a
variant as well. We keep the old behavior, where we unconditionally
depended on nlohmann-json and implicitly built JSON support if we could
detect it at the cmake stage.

* Fix condition statement in when clause

* Use open version range to avoid fixing to single version

---------

Co-authored-by: Valentin Volkl <valentin.volkl@cern.ch>
2024-11-06 03:42:34 -07:00
Richarda Butler
0c164d2740 Feature: Allow variants to propagate if not available in source pkg (#42931)
Variants can now be propagated from a dependent package to (transitive) dependencies,
even if the source or transitive dependencies do not have the propagated variants.

For example, here `zlib` doesn't have a `guile` variant, but `gmake` does:
```
$ spack spec zlib++guile
 -   zlib@1.3%gcc@12.2.0+optimize+pic+shared build_system=makefile arch=linux-rhel8-broadwell
 -       ^gcc-runtime@12.2.0%gcc@12.2.0 build_system=generic arch=linux-rhel8-broadwell
 -       ^gmake@4.4.1%gcc@12.2.0+guile build_system=generic arch=linux-rhel8-broadwell
```

Adding this property has some strange ramifications for `satisfies()`. In particular:
* The abstract specs `pkg++variant` and `pkg+variant`  do not intersect, because `+variant`
  implies that `pkg` *has* the variant, but `++variant` does not.
* This means that `spec.satisfies("++foo")` is `True` if:
    * for concrete specs: `spec` and its dependencies all have `foo` set if it exists
    * for abstract specs: no dependency of `spec`  has `~foo` (i.e. no dependency contradicts `++foo`).
* This also means that `Spec("++foo").satisfies("+foo")` is `False` -- we only know after concretization.

The `satisfies()` semantics may be surprising, but this is the cost of introducing non-subset
semantics (which are more useful than proper subsets here).

- [x] Change checks for variants
- [x] Resolve conflicts
- [x] Add tests
- [x] Add documentation

---------

Co-authored-by: Gregory Becker <becker33@llnl.gov>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-11-06 00:53:52 -08:00
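The `satisfies()` point above can be checked directly in a Spack Python session (assuming `spack.spec` is importable):

```python
from spack.spec import Spec

# Propagation alone doesn't guarantee the package has the variant,
# so this only resolves after concretization:
assert not Spec("++foo").satisfies("+foo")
```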
Jon Rood
801390f6be masa: add versions and update dependencies (#47447)
* masa: add versions

* masa: update dependencies
2024-11-05 13:08:21 -07:00
wspear
c601692bc7 Fix filter_compiler_wrappers with mpi (#47448) 2024-11-05 13:08:10 -07:00
Harmen Stoppels
2b9c6790f2 omega-h: fix versioning and cuda compat (#47433) 2024-11-05 15:50:48 +01:00
Martin Lang
09ae2516d5 cgal: add v6.0.1 (#47285) 2024-11-05 14:20:10 +01:00
Harmen Stoppels
eb9ff5d7a7 paraview: add forward compat bound with cuda (#47430) 2024-11-05 13:25:19 +01:00
Harmen Stoppels
dadb30f0e2 libc.py: detect glibc also in chinese locale (#47434) 2024-11-05 12:30:32 +01:00
Harmen Stoppels
d45f682573 Revert "Ci generate on change (#47318)" (#47431)
This reverts commit 1462c35761.
2024-11-05 10:51:12 +01:00
David Boehme
b7601f3042 Add Adiak v0.4.1 (#47429) 2024-11-05 01:51:19 -07:00
Mosè Giordano
6b5a479d1e extrae: fix build with gcc@14 (#47407) 2024-11-05 09:42:06 +01:00
Harmen Stoppels
1297dd7fbc py-torchaudio, py-torchtext: rpath fixup (#47404)
* py-torchaudio, py-torchtext: rpath fixup

also add missing dependency on Spack ffmpeg to torchaudio.

* py-torchaudio: add torio rpath
2024-11-05 09:36:26 +01:00
Adam J. Stewart
75c169d870 py-tensorflow: add v2.18.0 (#47211) 2024-11-05 09:04:07 +01:00
Harmen Stoppels
afe431cfb5 py-python-ptrace: missing forward compat bound (#47401) 2024-11-05 07:50:38 +01:00
Massimiliano Culpo
14bc900e9d spack.concretize: add type-hints, remove kwargs (#47382)
Also remove find_spec, which was used by the old concretizer.
Currently, it seems to be used only in tests.
2024-11-05 07:46:49 +01:00
Howard Pritchard
e42e541605 openmpi: add 4.1.7 (#47427)
Signed-off-by: Howard Pritchard <howardp@lanl.gov>
2024-11-04 22:23:16 -07:00
Wouter Deconinck
9310fcabd8 sst-dumpi: fix homepage (#47420) 2024-11-04 22:13:17 -07:00
Wouter Deconinck
6822f99cc6 quicksilver: fix homepage (#47419) 2024-11-04 22:12:57 -07:00
Wouter Deconinck
703cd6a313 py-eventlet: fix url (#47418)
* py-eventlet: fix url
* py-eventlet: fix checksum
2024-11-04 22:08:17 -07:00
Wouter Deconinck
5b59a53545 py-configspace: fix homepage (#47417) 2024-11-04 21:59:04 -07:00
Mosè Giordano
b862eec6bc extrae: add more versions (#47408) 2024-11-04 21:43:02 -07:00
Mosè Giordano
dcc199ae63 extrae: fix typo (#47406) 2024-11-04 21:34:08 -07:00
Paul Gessinger
f8da72cffe pythia8: Include patch for C++20 / Clang (#47400)
* pythia8: Include patch for C++20 / Clang

Pythia8 vendors some FJCore sources that are as of Pythia8 312
incompatible with C++20 on clang. This adds a patch that makes it
compatible in these scenarios

* Add issue link

* rename setup_cxxstd function

* Remove an accidental printout

* Apply patch to all compilers, add lower bound
2024-11-04 20:34:47 -06:00
Matthieu Dorier
8650ba3cea prometheus-cpp: added package prometheus-cpp (#47384)
* prometheus-cpp: added package prometheus-cpp
* prometheus-cpp: edited PR for style
2024-11-04 17:55:29 -08:00
Pranav Sivaraman
54aaa95a35 flux: new package (#47392) 2024-11-04 17:51:23 -08:00
Edward Hartnett
5a29c9d82b added g2c-2.0.0 (#47399) 2024-11-04 17:48:48 -08:00
Tim Haines
c8873ea35c dyninst: patch broken builds for 10.0.0:12.2.0 (#47339)
* dyninst: patch broken builds for 10.0.0:12.3.0
* Only apply before 12.3.0
2024-11-04 15:33:41 -08:00
Paul R. C. Kent
c7659df4af libxc: add v7.0.0 (#47263) 2024-11-04 15:30:55 -08:00
Sreenivasa Murthy Kolam
0de6c17477 fix the error libroctx64.so.o not found when executing MIOpenDriver (#47196) 2024-11-04 11:54:48 -08:00
Darren Bolduc
6924c530e2 google-cloud-cpp: add v2.29.0, v2.30.0 (#47146)
* google-cloud-cpp: add v2.29.0; fix cxx-std versions
* d'oh, single value for the variant
2024-11-04 11:50:54 -08:00
Peter Scheibel
38c8069ab4 filesystem.py: add max_depth argument to find (#41945)
* `find(..., max_depth=...)` can be used to control how many directories at most to descend into below the starting point
* `find` now enters every unique (symlinked) directory once at the lowest depth
* `find` is now repeatable: it traverses the directory tree in a deterministic order
2024-11-04 20:31:57 +01:00
Todd Gamblin
5cc07522ab cc: parse RPATHs when in ld mode
In the pure `ld` case, we weren't actually parsing `RPATH` arguments separately as we
do for `ccld`. Fix this by adding *another* nested case statement for raw `RPATH`
parsing.

There are now 3 places where we deal with `-rpath` and friends, but I don't see a great
way to unify them, as `-Wl,`, `-Xlinker`, and raw `-rpath` arguments are all ever so
slightly different.

Also, this fixes the ordering of assertions to make `pytest` diffs more intelligible.
The meaning of `+` and `-` in diffs changed in `pytest` 6.0 and the "preferred" order
for assertions became `assert actual == expected` instead of the other way around.

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-11-04 19:52:08 +01:00
Todd Gamblin
575a006ca3 cc: simplify ordered list handling
`cc` divides most paths up into system paths, spack managed paths, and other paths.
This gets really repetitive and makes the code hard to read. Simplify the script
by adding some functions to do most of the redundant work for us.

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-11-04 19:52:08 +01:00
John Gouwar
23ac56edfb Times spec building and timing to public concretizer API (#47310)
This PR has two small contributions:
- It adds another phase to the timer for concretization, "construct_specs", to actually see the time the concretizer spends interpreting the `clingo` output to build the Python object for a concretized spec.
- It adds the method `Solver.solve_with_stats` to expose the timers that were already in the concretizer to the public solver API. `Solver.solve` just becomes a special case of `Solver.solve_with_stats` that throws away the timing output (which is what it was already doing).  

These changes will make it easier to benchmark concretizer performance and provide a more complete picture of the time spent in the concretizer by including the time spent interpreting clingo output.
2024-11-04 09:48:18 -08:00
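The relationship between the two methods follows a common wrapper pattern; a hedged sketch with simplified, illustrative signatures:

```python
class SolverSketch:
    """Illustration only: `solve` delegates to `solve_with_stats`
    and discards the timing information."""

    def solve_with_stats(self, specs):
        timer = {"setup": 0.1, "solve": 0.5, "construct_specs": 0.05}
        result = list(specs)  # stand-in for the real solve
        return result, timer

    def solve(self, specs):
        result, _timer = self.solve_with_stats(specs)  # timing thrown away
        return result
```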
Harmen Stoppels
8c3068809f papi: add forward compat bound for cuda (#47409) 2024-11-04 17:32:47 +01:00
Stephen Nicholas Swatman
2214fc855d geant4-data: symlink only specific data dirs (#47367)
Currently, the `geant4-data` spec creates symlinks to all of its
dependencies, and it does so by globbing their `share/` directories.
This works very well for the way Spack installs these, but it doesn't
work for anybody wanting to use e.g. the Geant4 data on CVMFS. See pull
request #47298. This commit changes the way the `geant4-data` spec
works. It no longer blindly globs directories and makes symlinks, but it
asks its dependencies specifically for the name of their data directory.
This should allow Spack to use the CVMFS installations as externals.
2024-11-04 15:38:12 +00:00
Harmen Stoppels
d44bdc40c9 boost: require +icu when +locale (#47396) 2024-11-04 16:12:16 +01:00
Stephen Nicholas Swatman
e952f6be8e acts dependencies: new versions as of 2024/11/01 (#47366)
* acts dependencies: new versions as of 2024/11/01

Includes new versions of ACTS itself, Detray, and Vecmem.

* Bind TBB
2024-11-04 06:48:08 -07:00
Wouter Deconinck
b95936f752 zabbix: add v5.0.44, v6.0.34, v7.0.4 (fix CVEs) (#47001)
* zabbix: add v5.0.44, v6.0.34, v7.0.4 (fix CVEs)

* [@spackbot] updating style on behalf of wdconinc

* zabbix: use f-string

* zabbix: fix f-string quoting

* zabbix: use mysql-client

* @wdconinc, this fixes the mysql client virtual for me

---------

Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2024-11-04 07:20:09 -06:00
Harmen Stoppels
8d0856d1cc packaging_guide.rst: explain forward and backward compat before the less common cases (#47402)
The idea is to go from most to least used: backward compat -> forward compat -> pinning on major or major.minor version -> pinning specific, concrete versions.

Further, the following

```python
   # backward compatibility with Python
   depends_on("python@3.8:")
   depends_on("python@3.9:", when="@1.2:")
   depends_on("python@3.10:", when="@1.4:")

   # forward compatibility with Python
   depends_on("python@:3.12", when="@:1.10")
   depends_on("python@:3.13", when="@:1.12")
   depends_on("python@:3.14")
```

is better than disjoint `when` ranges, which cause repetition of the rules on dependencies and require frequent editing of existing lines after new releases are done:

```python
   depends_on("python@3.8:3.12", when="@:1.1")
   depends_on("python@3.9:3.12", when="@1.2:1.3")
   depends_on("python@3.10:3.12", when="@1.4:1.10")
   depends_on("python@3.10:3.13", when="@1.11:1.12")
   depends_on("python@3.10:3.14", when="@1.13:")
2024-11-04 13:52:05 +01:00
Teague Sterling
10f7014add vep-cache: new package (#44523)
* py-uvloop: add v3.8.14, v3.9.15, v3.10.3 and dependencies

* rollback

* vep: add v110,v111,v112

* vep-cache: add v110,v111,v112

* Cleanup

* Reorganizigng

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* Update package.py

* Update package.py

* [@spackbot] updating style on behalf of teaguesterling

* Update package.py

* Update package.py

* Update package.py

* [@spackbot] updating style on behalf of teaguesterling

* Update package.py

* [@spackbot] updating style on behalf of teaguesterling

* Fix scoping and syntax issues

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* fix styles

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* fix variants

* fixing up variant issues and cleaning up resource code

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* fixing unused imports

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* Apply suggestions from code review

Co-authored-by: Arne Becker <101113822+EbiArnie@users.noreply.github.com>

* fixing vep dependencies

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* Fixing resources

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* Fixing issue where resources are not downloaded

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* vep-cache fixing downloads

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* defaulting to using VEP installer

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* Removing resource-based cache installation and simplifying package. Resources without checksums don't work (anymore?) and calculating them would be difficult

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

---------

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
Co-authored-by: Arne Becker <101113822+EbiArnie@users.noreply.github.com>
2024-11-04 11:37:21 +00:00
Harmen Stoppels
c9ed91758d tcsh: add missing libxcrypt dependency (#47398) 2024-11-04 03:42:34 -07:00
Harmen Stoppels
2c1d74db9b krb5: disable missing keyutils dependency (#47397) 2024-11-04 03:20:33 -07:00
Harmen Stoppels
5b93466340 libssh2: fix crypto (#47393) 2024-11-04 02:48:35 -07:00
Martin Lang
1ee344c75c bigdft-futile: fix compilation for @1.9.5~mpi (#47292)
When compiled without MPI support, a fake mpi header is autogenerated during configure/build. The header is missing one symbol in version 1.9.5. The problem has since been fixed upstream.

A similar problem also occurs for 1.9.4. Unfortunately, the patch does not work for 1.9.4, and I don't know whether further fixes would be required for 1.9.4. Therefore, only the newest version 1.9.5 is patched.
2024-11-04 10:47:54 +01:00
afzpatel
754011643c rocal and rocm-openmp-extras: fix build failures (#47314) 2024-11-04 10:41:07 +01:00
Cédric Chevalier
2148292bdb kokkos and kokkos-kernels: use new urls for v4.4 and above (#47330) 2024-11-04 10:39:16 +01:00
Harmen Stoppels
cf3576a9bb suite-sparse: fix missing rpaths for dependencies (#47394) 2024-11-04 02:38:16 -07:00
Martin Lang
a86f164835 nlopt: new version 2.8.0 (#47289) 2024-11-04 10:35:11 +01:00
Martin Lang
2782ae6d7e libpspio: new version 0.4.1 (#47287) 2024-11-04 10:34:22 +01:00
Wouter Deconinck
b1fd6dbb6d libxml2: add v2.11.9, v2.12.9, v2.13.4 (#47297)
Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-11-04 10:32:34 +01:00
Andrey Prokopenko
18936771ff arborx: remove Trilinos dependency for @1.6: (#47305) 2024-11-04 10:29:32 +01:00
Brian Spilner
9a94ea7dfe icon: add a maintainer (#47323) 2024-11-04 10:26:28 +01:00
Larry Knox
a93bd6cee4 hdf5: add develop-2.0 (#47299)
Update HDF5 version for develop branch to develop-2.0 to match the new
version in the develop branch.

Remove develop-1.16 as it has been decided to make next release HDF5 2.0.0.
2024-11-04 10:21:17 +01:00
Paul R. C. Kent
4c247e206c llvm: add v19.1.3 (#47325) 2024-11-04 10:18:24 +01:00
Weiqun Zhang
fcdaccfeb6 amrex: add v24.11 (#47371) 2024-11-04 10:06:49 +01:00
Wouter Deconinck
2fc056e27c py-flask-compress: add v1.14 (#47373) 2024-11-04 10:02:15 +01:00
Wouter Deconinck
417c48b07a py-flask-cors: add v4.0.0 (#47374) 2024-11-04 10:01:36 +01:00
Christophe Prud'homme
f05033b0d2 cpr: add +pic and +shared variants (#47281) 2024-11-04 10:00:26 +01:00
Cameron Smith
d63f06e4b7 pumi: add version 2.2.9 (#47380) 2024-11-04 09:58:40 +01:00
Wouter Deconinck
8296aaf175 minizip: add v1.3.1 (#47379) 2024-11-04 09:57:40 +01:00
Wouter Deconinck
86ebcabd46 cups: add v2.4.11 (#47390) 2024-11-04 09:55:33 +01:00
Wouter Deconinck
87329639f2 elasticsearch, kibana, logstash: add v8.15.2 (#46873) 2024-11-04 09:50:41 +01:00
Harmen Stoppels
0acd6ae7b2 lua-luaposix: add missing libxcrypt dependency (#47395) 2024-11-04 01:47:47 -07:00
Massimiliano Culpo
395c911689 Specs: propagated variants affect == equality (#47376)
This PR changes the semantic of == for spec so that:

hdf5++mpi == hdf5+mpi

won't hold true anymore. It also changes the constrain semantics, so that a
non-propagating variant always overrides a propagating variant. This means:

(hdf5++mpi).constrain(hdf5+mpi) -> hdf5+mpi

Before we had a very weird semantic, that was supposed to be tested by unit-tests:

(libelf++debug).constrain(libelf+debug+foo) -> libelf++debug++foo

This semantic has been dropped, as it was never really tested due to the == bug.
2024-11-03 22:35:16 -08:00
Wouter Deconinck
2664303d7a pythia8: add v8.312 (#47389)
* pythia8: add v8.312

* pythia8: update homepage url
2024-11-03 23:48:34 +01:00
Wouter Deconinck
ff9568fa2f sherpa: add v3.0.1 (#47388)
* sherpa: add v3.0.1

* sherpa: no depends_on py-setuptools
2024-11-03 15:32:35 -07:00
eugeneswalker
632c009569 e4s ci stacks: reduce package prefs (#47381) 2024-11-03 00:18:13 +01:00
Paul Gessinger
55918c31d2 root: require +opengl when +aqua is on (#47349)
According to https://github.com/root-project/root/issues/7160, if
`-Dcocoa=ON`, the build must also be configured with `-Dopengl=ON`, since
otherwise the build encounters missing includes. This is/was a silent
failure in ROOT CMake, but I believe it has been made an explicit failure
some time this year.
2024-11-02 11:13:37 +01:00
Tamara Dahlgren
b8461f3d2d Remove ignored config:install_missing_compilers from unit tests (#47357) 2024-11-02 09:36:05 +01:00
Massimiliano Culpo
133895e785 Rework the schema for reusing environments (#47364)
Currently, the schema reads:

  from:
    - type:
        environment: path_or_name

but this can't be extended easily to other types, e.g. to buildcaches,
without duplicating the extension keys. Use instead:

  from:
    - type: environment
      path: path_or_name
2024-11-02 09:03:42 +01:00
dependabot[bot]
19e3ab83cf build(deps): bump python-levenshtein in /lib/spack/docs (#47372)
Bumps [python-levenshtein](https://github.com/rapidfuzz/python-Levenshtein) from 0.26.0 to 0.26.1.
- [Release notes](https://github.com/rapidfuzz/python-Levenshtein/releases)
- [Changelog](https://github.com/rapidfuzz/python-Levenshtein/blob/main/HISTORY.md)
- [Commits](https://github.com/rapidfuzz/python-Levenshtein/compare/v0.26.0...v0.26.1)

---
updated-dependencies:
- dependency-name: python-levenshtein
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-11-01 23:55:18 +00:00
Greg Becker
e42a4a8bac parse_specs: unify specs based on concretizer:unify (#44843)
Currently, the `concretizer:unify:` config option only affects environments.

With this PR, it now affects any group of specs given to a command using the `parse_specs(*, concretize=True)` interface.

- [x] implementation in `parse_specs`
- [x] tests
- [x] ensure all commands that accept multiple specs and concretize use `parse_specs` interface

---------

Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-11-01 23:49:26 +00:00
kwryankrattiger
1462c35761 CI generate on change (#47318)
* don't concretize in CI if changed packages are not in stacks

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>

* Generate noop job when no specs to rebuild due to untouched pruning

* Add test to verify skipping generate creates a noop job

* Changed debug for early exit

---------

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2024-11-01 22:07:23 +00:00
Massimiliano Culpo
0cf8cb70f4 Fix pickle round-trip of specs propagating variants (#47351)
This changes `Spec` serialization to include information about propagation for abstract specs.
This was previously not included in the JSON representation for abstract specs, and couldn't be
stored.

Now, there is a separate `propagate` dictionary alongside the `parameters` dictionary. This isn't
beautiful, but when we bump the spec version for Spack `v0.24`, we can clean up this and other
aspects of the schema.
2024-11-01 13:43:16 -07:00
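As a rough illustration, a serialized abstract-spec node might now look like this (a hand-written sketch; only the two dictionaries named above come from the description, the rest of the schema is simplified):

```python
node = {
    "name": "hdf5",
    "parameters": {"mpi": True},  # variant values, as before
    "propagate": {"mpi": True},   # new: records which variants propagate
}
```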
Marc T. Henry de Frahan
7b2450c22a Add openfast version 3.5.4 (#47369)
* Add openfast version 3.5.4

* remove commits
2024-11-01 13:53:59 -06:00
Paul R. C. Kent
8f09f523cc cp2k: protect 2024.3 against newer libxc (#47363)
* cp2k: protect against newer libxc

* Compat bound for libxc
2024-11-01 11:40:10 -06:00
Stephen Nicholas Swatman
24d3ed8c18 geant4: make downloading data dependency optional (#47298)
* geant4: make downloading data dependency optional

This PR makes downloading the data repository of the Geant4 spec
optional by adding a sticky, default-enabled variant which controls the
dependency on `geant4-data`. This should not change the default
behaviour, but should allow users to choose whether or not they want the
data directory.

* Add comment

* Update env variable

* Generic docs

* Buildable false
2024-11-01 15:41:34 +00:00
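A minimal sketch of the sticky-variant pattern described above (a simplified recipe, not the actual `geant4` package):

```python
from spack.package import *

class Geant4(CMakePackage):
    # sticky=True: the concretizer may not flip this variant on its own,
    # so the default only changes when a user asks for ~data explicitly.
    variant("data", default=True, sticky=True,
            description="Install the Geant4 data repository")

    # the data dependency is only pulled in when the variant is enabled
    depends_on("geant4-data", type="run", when="+data")
```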
Kenneth Moreland
492c52089f adios2: fix mgard variant (#47223)
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2024-11-01 14:18:13 +01:00
dependabot[bot]
5df7dc88fc build(deps): bump docutils from 0.20.1 to 0.21.2 in /lib/spack/docs (#45592)
Bumps [docutils](https://docutils.sourceforge.io) from 0.20.1 to 0.21.2.

---
updated-dependencies:
- dependency-name: docutils
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-11-01 06:23:10 -05:00
Matt Thompson
4a75c3c87a mapl: add 2.50.2, 2.47.1 tweaks (#47324) 2024-11-01 06:30:00 +01:00
Eric Müller
35aa02771a verilator: add 5.028, fix builds when using gcc on newer versions (#47168) 2024-11-01 05:26:55 +01:00
G-Ragghianti
b38a29f4df New versions for slate, lapackpp, and blaspp (#47334) 2024-11-01 04:59:30 +01:00
joscot-linaro
9a25a58219 linaro-forge: added 24.0.6 version (#47348) 2024-11-01 04:56:02 +01:00
Paul R. C. Kent
c0c9743300 py-ase: add v3.23.0 (#47337) 2024-11-01 04:38:40 +01:00
Adam J. Stewart
a69af3c71f py-rasterio: add v1.4.2 (#47344) 2024-11-01 03:34:09 +01:00
Julien Cortial
cb92d70d6d mmg: add v5.8.0 (#47356) 2024-11-01 03:29:54 +01:00
Vicente Bolea
76ed4578e7 adios2: add v2.10.2 release and fix build of older versions (#47235)
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2024-11-01 02:21:25 +01:00
Kaan
504cc808d6 Babelstream v5.0 Spack Package Updates (#41019)
- Merging sycl2020usm and sycl2020acc into sycl2020 and the submodel=acc/usm variant is introduced
- implementation is renamed to option
- impl ( fortran implementation options) renamed to foption
- sycl_compiler_implementation and thrust_backend
- stddata,stdindices,stdranges to a single std with std_submodel introduction
- std_use_tbb to be boolean; also changed model filtering algorithm to make sure that it only picks model names
- Modified comments to clear confusion with cuda_arch cc_ and sm_ prefix appends
- Deleted duplicate of cuda_arch definition from +omp
- CMAKE_CXX_COMPILER moved to be shared arg between all models except tbb and thrust
- Replaced sys.exit with InstallError and created a dictionary to simplify things and eliminate excess code lines doing same checks
- Replaced the -mcpu flags to -march since it is deprecated now
- Replaced platform.machine with spec.target
-  Removing raja_backend, introducing openmp_flag, removing -march flags, clearing debugging print(), removing excess `if ___ in self.spec.variants` checks
- [FIX] Issue where Thrust couldn't find correct compiler (it requires nvcc)
-  [FIX] Fortran unsupported check to match the full string
- [FIX] RAJA cuda_arch to be with sm_ not cc_
- dir= option is no longer needed for kokkos
- dir is no longer needed
-  [omp] Adding clang support for nvidia offload
- SYCL2020 offload to nvidia GPU
- changing model dependency to be languages rather than build system
- removing hardcoded arch flags and replacing with archspec
- removing cpu_arch from acc model

---------

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
Co-authored-by: Greg Becker <becker33@llnl.gov>
Co-authored-by: Kaan Olgu <kaan.olgu@bristol.ac.uk>
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2024-10-31 18:42:40 -06:00
Mosè Giordano
8076134c91 nvidia-nsight-systems: new package (#47355)
Co-authored-by: Scot Halverson <shalverson@nvidia.com>
2024-10-31 18:32:43 -06:00
Tobias Ribizel
b4b3320f71 typst: new package (#47293) 2024-11-01 00:05:00 +01:00
Paolo
e35bc1f82d acfl, armpl-cc: add v24.10 (#47167)
* Introduce support for ArmPL and ACfL 24.10
   This patch introduces the possibility of installing armpl-gcc
   and acfl 24.10 through Spack. It also addresses one issue observed
   after PR https://github.com/spack/spack/pull/46594
* Fix Github action issues.
   - Remove default URL
   - Reinstate default OS for ACfL to RHEL.
2024-10-31 14:51:47 -07:00
Tim Haines
0de1ddcbe8 cbtf: Update Boost dependencies (#47131) 2024-10-31 15:33:04 -06:00
Harmen Stoppels
e3aca49e25 database.py: remove process unsafe update_explicit (#47358)
Fixes an issue reported where `spack env depfile` + `make -j` would
non-deterministically refuse to mark all environment roots explicit.

`update_explicit` had the pattern

```python
rec = self._data[key]
with self.write_transaction():
    rec.explicit = explicit
```

but `write_transaction` may reinitialize `self._data`, meaning that
mutating `rec` won't mutate `self._data`, and the changes won't be
persisted.

Instead, use `mark` which has a correct implementation.

Also avoids the essentially incorrect early return in `update_explicit`
which is a pattern I don't think belongs in database.py: it branches on
possibly stale data to realize there is nothing to change, but in reality
it requires a write transaction to know that for a fact, but that would
defeat the purpose. So, leave this optimization to the call site.
2024-10-31 13:58:42 -07:00
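A sketch of the corrected call (the `mark` signature is assumed from the description above, given a `Database` instance and a concrete `spec`):

```python
# `mark` looks the record up inside its own write transaction, so the
# mutation survives even if self._data is reinitialized.
database.mark(spec, "explicit", True)
```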
Wouter Deconinck
94c29e1cfc mcpp: add v2.7.2-25-g619046f with CVE patches (#47301) 2024-10-31 20:57:56 +01:00
kwryankrattiger
0c00a297e1 Concretize reuse: reuse specs from environment (#45139)
The already concrete specs in an environment are now among the reusable specs for the concretizer.

This includes concrete specs from all include_concrete environments.

In addition to this change to the default reuse, `environment` is added as a reuse type for 
the concretizer config. This allows users to specify:

spack:
  concretizer:
    # Reuse from this environment (including included concrete) but not elsewhere
    reuse:
      from:
      - type: environment
    # or reuse from only my_env included environment
    reuse:
      from:
      - type:
          environment: my_env
    # or reuse from everywhere
    reuse: true

If reuse is specified from a specific environment, only specs from that environment will be reused.
If the reused environment is not specified via include_concrete, the concrete specs will be retried
at concretization time to be reused.

Signed-off-by: Ryan Krattiger <ryan.krattiger@kitware.com>
Co-authored-by: Gregory Becker <becker33@llnl.gov>
2024-10-31 10:31:34 -07:00
Martin Lang
c6a1ec996c gsl: new version 2.8 (#47286) 2024-10-31 16:55:02 +01:00
Antonio Cervone
0437c5314e salome,-med,-medcoupling: new versions, new/changed variants (#46576)
* boost: boost.python does not support numpy@2 yet
2024-10-31 09:52:00 -06:00
Adam J. Stewart
ffde309a99 py-ipympl: add v0.9.4 (#47193)
* py-ipympl: add v0.9.4

* Add node/npm dependencies at runtime

* node-js: fix build with older GCC

* Change CLANG flags too

* Add supported compiler versions

* Deprecate older version
2024-10-31 09:28:59 -06:00
Tim Haines
a08b4ae538 extrae: update checksums, fix build (-lintl), minor modernisation (#47343)
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2024-10-31 09:24:06 -06:00
Jon Rood
404b1c6c19 nalu-wind: put bounds on yaml-cpp versions. (#47341) 2024-10-31 09:16:10 -06:00
Richard Berger
275339ab4c kokkos: add cmake_lang variant, require at least one active backend (#43517) 2024-10-31 08:55:09 -06:00
Wouter Deconinck
877930c4ef minio: add v2024-10-13T13-34-11Z (#47303) 2024-10-31 08:22:39 -05:00
Wouter Deconinck
89d0215d5b optipng: add v0.7.8 (#47311)
* optipng: add v0.7.8

* optipng: mv for_aarch64.patch for_aarch64_0.7.7.patch

* optipng: add for_aarch64_0.7.8.patch

* optipng: deprecate v0.7.7

* optipng: fix style
2024-10-31 08:22:01 -05:00
Seth R. Johnson
f003d8c0c3 vecgeom: new version 1.2.9 (#47306) 2024-10-31 13:12:26 +00:00
Adam J. Stewart
6ab92b119d Docs: remove reference to pyspack (#47346) 2024-10-31 11:15:51 +01:00
Alec Scott
f809b56f81 mpidiff: new package (#47335)
* mpidiff: new package

* fix style with black

* Add variants, docs, and examples variants. Remove options that are not really options in the build.
2024-10-30 15:31:20 -07:00
Alex Hedges
ec058556ad Remove trailing spaces in default YAML files (#47328)
caught by `prettier`
2024-10-30 22:32:54 +01:00
Greg Becker
ce78e8a1f8 verdict: new package (#47333)
* add verdict package

Co-authored-by: becker33 <becker33@users.noreply.github.com>
Co-authored-by: Alec Scott <scott112@llnl.gov>
2024-10-30 15:07:12 -06:00
Harmen Stoppels
c3435b4e7d hooks: run in clear, fixed order (#47329)
Currently the order in which hooks are run is arbitrary.

This can be fixed by sorted(list_modules(...)), but I think it is much
clearer to just have a static list.

Hooks are not extensible other than by modifying Spack code, which
means it's unlikely people maintain custom hooks, since they'd have
to fork Spack. And if they fork Spack, they might as well add an entry
to the list when they're continuously rebasing.
2024-10-30 18:57:49 +00:00
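An illustrative sketch of the static-list approach (the module names here are made up, not Spack's actual hook list):

```python
import importlib

_HOOK_MODULES = ["module_file_generation", "sbang", "permissions_setters"]

def run_hook(function_name, *args, **kwargs):
    for module_name in _HOOK_MODULES:  # clear, fixed order
        module = importlib.import_module(f"spack.hooks.{module_name}")
        hook = getattr(module, function_name, None)
        if callable(hook):
            hook(*args, **kwargs)
```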
Adam J. Stewart
02d2c4a9ff PyTorch: add v2.5.1 (#47326) 2024-10-30 19:33:09 +01:00
Cameron Stanavige
9d03170cb2 scr: release v3.1.0, including components (#45737)
SCR and the SCR components have new releases

- AXL v0.9.0
  - MPI variant added to AXL package
- ER v0.5.0
- KVTREE v1.5.0
- Rankstr v0.4.0
- Shuffile v0.4.0
- Spatha v0.4.0
- dtcmp v1.1.5
- lwgrp v1.0.6
- Redset v0.4.0
  - New variants added to Redset

- SCR v3.1.0
  - Added Flux resource manager
  - Added pthreads variant
  - Removed deprecated release candidates and references
  - Cleaned up component dependency versions
  - Updated versions within variants and cleaned up cmake_args
2024-10-30 16:08:01 +01:00
Harmen Stoppels
8892c878ce types: remove singleton union in globals (#47282) 2024-10-30 13:48:32 +01:00
Harmen Stoppels
cbf4d3967a add std_pip_args global to the audit list (#47320) 2024-10-30 13:14:15 +01:00
Harmen Stoppels
8bc0b2e086 Spec.__str__: use full hash (#47322)
The idea is that `spack -e env add ./concrete-spec.json` would list the
full hash in the specs, so that (a) it's not ambiguous and (b) it could
in principle result in a constant-time lookup instead of a linear-time
substring match in large build caches.
2024-10-30 12:44:51 +01:00
Massimiliano Culpo
354615d491 Spec.dependencies: allow to filter on virtuals (#47284)
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-10-30 12:15:01 +01:00
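Presumably usable along these lines (the keyword name is inferred from the PR title, not checked against the implementation):

```python
# only traverse dependency edges that provide the "mpi" virtual
mpi_edges = spec.dependencies(virtuals=["mpi"])
```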
Veselin Dobrev
9ac261af58 Add latest OpenSSL versions. Deprecate previous versions. (#47316) 2024-10-30 09:43:31 +01:00
Alex Hedges
34b2f28a5e Fix malformed RST link in documentation (#47309) 2024-10-30 09:40:35 +01:00
Alex Hedges
8a10eff757 Fix errors found by running spack audit externals (#47308)
The problem was that `+` is part of the regex grammar, so it needs to be
escaped.
2024-10-30 09:38:36 +01:00
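A small self-contained illustration of the point (not the audited code itself):

```python
import re

# "+" is a quantifier in regex grammar; escape it to match a literal plus.
assert re.search(r"g\+\+", "compilers: gcc, g++")
# re.escape() handles this generically for externally supplied strings:
assert re.search(re.escape("g++"), "compilers: gcc, g++")
```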
Alex Hedges
44d09f2b2b Fix typo in default concretizer.yaml (#47307)
This was caught by `codespell` when I copied the config file into an
internal repository.
2024-10-30 09:35:30 +01:00
Harmen Stoppels
161b2d7cb0 std_pip_args: use PythonPipBuilder.std_args(...) instead (#47260) 2024-10-30 09:18:56 +01:00
Wouter Deconinck
4de5b664cd r-*: add new versions (#46093)
* r-*: updates to latest versions
* r-*: add new dependencies
* r-proj: fix docstring line length
* r-list: add homepage
* r-*: add more dependencies
* r-rmpi: use virtual dependencies, conflict openmpi@5:
* r-cairo: require cairo +png; +pdf for some versions; cairo +fc when +ft
* r-proj: set LD_LIBRARY_PATH since rpath not respected
2024-10-29 18:27:43 -06:00
Wouter Deconinck
5d0c6c3350 rocketmq: add v5.3.1 (#46976)
and it installs.
2024-10-29 22:48:42 +00:00
Jon Rood
8391c8eb87 nalu-wind: version 2.1.0 requires trilinos 15.1.1 (#47296) 2024-10-29 15:17:10 -06:00
Georgia Stuart
3108849121 Add new seqfu version (#47294)
Signed-off-by: Georgia Stuart <gstuart@umass.edu>
2024-10-29 15:16:45 -06:00
Andrey Perestoronin
52471bab02 Added packages for the intel-2025.0.0 release (#47264)
* Added packages for the intel-2025.0.0 release

* fix style

* pin mkl to 2024.2.2

until e4s can upgrade to the 2025 compiler and the ginkgo compatibility issue can be resolved.

---------

Co-authored-by: Robert Cohn <rscohn2@gmail.com>
2024-10-29 12:50:30 -06:00
Massimiliano Culpo
b8e3246e89 llnl.util.lang: add classes to help with deprecations (#47279)
* Add a descriptor to have a class level constant

This descriptor helps intercept places where we set a value on instances.
It does not really behave like "const" in C-like languages, but is the
simplest implementation that might still be useful.

* Add a descriptor to deprecate properties/attributes of an object

This descriptor is used as a base class. Derived classes may implement a
factory to return an adaptor to the attribute being deprecated. The
descriptor can either warn, or raise an error, when usage of the deprecated
attribute is intercepted.

---------

Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2024-10-29 19:06:26 +01:00
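An illustrative sketch of the deprecation-descriptor idea (not the actual llnl.util.lang implementation):

```python
import warnings

class DeprecatedAttribute:
    """Base descriptor: warn or raise when a deprecated attribute is used."""

    def __init__(self, name, error=False):
        self.name = name
        self.error = error

    def factory(self, instance):
        # derived classes return an adaptor for the deprecated attribute
        raise NotImplementedError

    def __get__(self, instance, owner=None):
        msg = f"'{self.name}' is deprecated"
        if self.error:
            raise AttributeError(msg)
        warnings.warn(msg, DeprecationWarning, stacklevel=2)
        return self.factory(instance)
```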
Paul R. C. Kent
60cb628283 py-pycifrw: add v4.4.6 (#47262) 2024-10-29 10:34:03 -07:00
James Taliaferro
5bca7187a5 py-uv: new version (#47275)
This adds the latest version of py-uv. While working on this, I also
found that uv (including older versions) has a build dependency on cmake,
which was not specified in the package, so I added it as a dependency.

I also found that on my machine, the build process had trouble finding cmake,
so I set the path to it explicitly as an environment variable.
2024-10-29 10:30:13 -07:00
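A hedged sketch of the two changes described (the environment variable name is an assumption for illustration, not the actual recipe):

```python
from spack.package import *

class PyUv(PythonPackage):
    depends_on("cmake", type="build")  # uv needs cmake at build time

    def setup_build_environment(self, env):
        # point the build at Spack's cmake explicitly
        env.set("CMAKE", self.spec["cmake"].prefix.bin.cmake)
```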
Stephen Nicholas Swatman
65daf17b54 acts dependencies: new versions as of 2024/10/28 (#47265)
This commit adds a new version of ACTS and detray.
2024-10-29 10:28:18 -07:00
Valentin Volkl
d776dead56 gaudi: gdb doesn't build on macos/arm64 (#47266)
* gaudi: gdb doesn't build on macos/arm64
* fix imports
2024-10-29 10:26:35 -07:00
Satish Balay
741a4a5d4f petsc, py-petsc4py: add v3.22.1 (#47267) 2024-10-29 10:25:21 -07:00
Paul R. C. Kent
dbe7b6bc6b quantum-espresso: add v7.4.0 (#47268) 2024-10-29 10:23:20 -07:00
Bernhard Kaindl
ffc904aa6b r-textshaping,r-ragg: Add dep on pkgconfig, type="build" and handle freetype@2.13.3 (#47091)
* r-textshaping: build-dep on pkgconfig to find harfbuzz
* r-ragg: Fix build with freetype@2.13.3
2024-10-29 10:22:10 -07:00
Christoph Junghans
f889b2a95e gromacs: add missing sycl constraint (#47274) 2024-10-29 10:19:36 -07:00
Lin Guo
7f609ba934 wrf: add v4.6.0 and v4.6.1 versions (#47203) 2024-10-29 10:18:06 -07:00
Massimiliano Culpo
ffd7830bfa cbflib: improve syntax for querying %gcc version (#47280)
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-10-29 15:33:41 +01:00
Jose E. Roman
20a6b22f78 New patch release SLEPc 3.22.1 (#47257) 2024-10-29 09:06:32 -05:00
teddy
1bff2f7034 py-mpi4py: add v4.0.1 (#47261)
Co-authored-by: t. chantrait <teddy.chantrait@cea.fr>
2024-10-29 09:06:01 -05:00
Adam J. Stewart
ca48233ef7 py-jupyter-core: set environment variables for extensions (#47192)
* py-jupyter-core: set environment variables for extensions

* Changes committed by gh-spack-pr

---------

Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2024-10-29 14:45:26 +01:00
Martin Lang
c302049b5d spglib: new version 2.5.0 (#47291) 2024-10-29 07:03:12 -06:00
Cédric Chevalier
360dbe41f7 kokkos: async malloc (#46464) 2024-10-29 13:28:12 +01:00
Harmen Stoppels
ea1aa0714b bootstrap: do not consider source when metadata file missing (#47278) 2024-10-29 10:57:31 +01:00
Harmen Stoppels
7af1a3d240 std_meson_args: deprecate (#47259) 2024-10-29 07:54:28 +01:00
Harmen Stoppels
962115b386 builder.py: builder_cls should be associated to spack.pkg module (#47269) 2024-10-29 07:53:06 +01:00
Harmen Stoppels
f81ca0cd89 directives_meta.py: use startswith to test module part of spack.pkg (#47270) 2024-10-29 07:51:36 +01:00
Jon Rood
25a5585f7d exawind: remove generated fortran dependencies (#47276) 2024-10-28 21:26:11 -06:00
Greg Becker
e81ce18cad cmd/solve: use interface from cmd/spec (#47182)
Currently, `spack solve` has different spec selection semantics than `spack spec`.
`spack solve` currently does not allow specifying a single spec when an environment is active.

This PR modifies `spack solve` to inherit the interface from `spack spec`, and to use
the same spec selection logic. This will allow for better use of `spack solve --show opt`
for debugging.

---------

Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2024-10-28 19:28:03 -07:00
Christoph Junghans
d48d993ae7 lammps: add heffte support (#47254)
* lammps: add heffte support
* Add Richard's suggestion
2024-10-28 15:05:17 -07:00
Paul R. C. Kent
fbd5c3d589 py-pyscf: add v2.7.0 (#47272) 2024-10-28 16:05:44 -05:00
Christophe Prud'homme
11aa02b37a feelpp/spack#11 (#47243) 2024-10-28 13:25:29 +01:00
Harmen Stoppels
b9ebf8cc9c gdk-pixbuf/atk: delete old versions, make mesonpackage (#47258)
* gdk-pixbuf: delete old versions, make mesonpackage

The goal is to get rid of the `std_meson_args` global, but clean up the package
while at it.

`setup_dependent_run_environment` was removed because it did not depend
on the dependent spec, and would result in repeated env variable
changes.

* atk: idem

* fix a dependent
2024-10-28 12:21:59 +01:00
Miroslav Stoyanov
1229d5a3cc tasmanian: add v8.1 (#47221)
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-10-28 05:19:50 -06:00
teddy
be5a096665 py-non-regression-test-tools: add v1.1.2 & remove v1.0.2 (tag removed) (#47256)
Co-authored-by: t. chantrait <teddy.chantrait@cea.fr>
2024-10-28 05:06:23 -06:00
Adam J. Stewart
32ce278a51 ML CI: Linux aarch64 (#39666)
* ML CI: Linux aarch64

* Add config files

* No aarch64 tag

* Don't specify image

* Use amazonlinux image

Co-authored-by: kwryankrattiger <80296582+kwryankrattiger@users.noreply.github.com>

* Update and require

* GCC is too old

* Fix some builds

* xgboost doesn't support old GCC + cuda

* Run on newer Ubuntu

* Remove mxnet

* Try aarch64 range

* Use main branch

* Conflict applies to all targets

* cuda only required when +cuda

* Use tagged version

* Comment out tf-estimator

* Add ROCm, use newer Ubuntu

* Remove ROCm

---------

Co-authored-by: kwryankrattiger <80296582+kwryankrattiger@users.noreply.github.com>
2024-10-28 10:30:07 +01:00
Adam J. Stewart
e83536de38 bazel: add Apple Clang 16 conflict (#47228) 2024-10-28 09:11:59 +01:00
jmlapre
ff058377c5 sst: update core, elements, macro to 14.1.0 (#47184) 2024-10-28 09:10:46 +01:00
Adam J. Stewart
e855bb011d py-hatchet: add v1.4.0 (#47222) 2024-10-28 09:05:16 +01:00
Bernhard Kaindl
dbab4828ed py-mgmetis: fails to build with mpi4py @4: depend on @3 (#47236) 2024-10-28 09:04:18 +01:00
Christophe Prud'homme
fac92dceca cpr: add versions up to v1.11 (#47242) 2024-10-28 08:39:16 +01:00
Wouter Deconinck
035b890b17 pango: add v1.54.0 (#47244) 2024-10-28 08:37:21 +01:00
Wouter Deconinck
2a7e5cafa1 geode: add v1.13.8, v1.14.3, v1.15.1 (#47253) 2024-10-28 08:36:26 +01:00
Wouter Deconinck
49845760b6 less: add v661, v668 (#47252) 2024-10-27 19:47:35 -06:00
Wouter Deconinck
ce6255c0bb nano: add v8.1, v8.2 (and v6.4) (#47245)
* nano: add v8.1, v8.2

* nano: depends on gettext

* nano: add v6.4
2024-10-28 01:21:18 +01:00
Diego Alvarez S.
f0d54ba39d Add nextflow 24.10.0 (#47251) 2024-10-27 17:32:20 -06:00
Harmen Stoppels
2ec4281c4f Remove a few redundant imports (#47250)
* remove self-imports

* remove unused imports
2024-10-27 15:40:05 -06:00
Harmen Stoppels
e80d75cbe3 gha: circular imports: pin (#47248) 2024-10-27 21:34:32 +01:00
Greg Becker
47e70c5c3a explicit splice: do not fail for bad config replacement if target not matched (#46925)
Originally, concretization failed if the splice config pointed to an invalid replacement.

This PR defers the check until we know the splice is needed, so that irrelevant splices
with bad config cannot stop concretization.

While I was at it, I improved an error message from an assert to a ValueError.
2024-10-27 11:35:10 -07:00
William R Tobin
fea2171672 silo: resolve hdf5 develop-X.Y branch versions (#39344) 2024-10-27 06:44:20 -06:00
Beat Reichenbach
12a475e648 feat: Add OpenColorIO option to OpenImageIO (#47237)
* feat: Add OpenColorIO option to OpenImageIO

* style: Pep 8

---------

Co-authored-by: Beat Reichenbach <beatreichenbach@users.noreply.github.com>
2024-10-27 09:37:49 +01:00
Wouter Deconinck
c348891c07 pango: deprecate @:1.44 due to CVE (#47232) 2024-10-27 06:56:13 +01:00
Jeff Hammond
019e90ab36 NWChem: add TCE_CUDA option (#47191)
Signed-off-by: Jeff Hammond <jehammond@nvidia.com>
2024-10-27 06:54:06 +01:00
Jeff Hammond
19137b2653 add the USE_F90_ALLOCATABLE option to Spack (#47190)
Signed-off-by: Jeff Hammond <jehammond@nvidia.com>
2024-10-27 06:53:48 +01:00
Andrew W Elble
2761e650fa gem5: new package (#47218) 2024-10-27 06:28:26 +01:00
Asa
84ea389017 py-olcf-velocity: new package (#47215)
* Add package py-olcf-velocity

* Removed trailing newline

* Fixed packages description line length

---------

Co-authored-by: Asa Rentschler <rentschleraj@ornl.gov>
2024-10-27 05:36:23 +01:00
Adam J. Stewart
17f07523f5 py-alive-progress: support newer Python (#47220) 2024-10-27 05:24:51 +01:00
Pranav Sivaraman
bd2ddb8909 byte-lite: new package (#47234)
* byte-lite: new package

* byte-lite: style adjustments
2024-10-27 05:22:58 +01:00
Adam J. Stewart
f5db757e66 adios2: mark conflict with newer Python@3.11 for @:2.7 (#47219) 2024-10-27 04:40:38 +01:00
1172 changed files with 14115 additions and 6462 deletions

View File

@@ -83,10 +83,17 @@ jobs:
all-prechecks:
needs: [ prechecks ]
if: ${{ always() }}
runs-on: ubuntu-latest
steps:
- name: Success
run: "true"
run: |
if [ "${{ needs.prechecks.result }}" == "failure" ] || [ "${{ needs.prechecks.result }}" == "canceled" ]; then
echo "Unit tests failed."
exit 1
else
exit 0
fi
coverage:
needs: [ unit-tests, prechecks ]
@@ -94,8 +101,19 @@ jobs:
secrets: inherit
all:
needs: [ coverage, bootstrap ]
needs: [ unit-tests, coverage, bootstrap ]
if: ${{ always() }}
runs-on: ubuntu-latest
# See https://docs.github.com/en/actions/writing-workflows/choosing-what-your-workflow-does/accessing-contextual-information-about-workflow-runs#needs-context
steps:
- name: Success
run: "true"
- name: Status summary
run: |
if [ "${{ needs.unit-tests.result }}" == "failure" ] || [ "${{ needs.unit-tests.result }}" == "canceled" ]; then
echo "Unit tests failed."
exit 1
elif [ "${{ needs.bootstrap.result }}" == "failure" ] || [ "${{ needs.bootstrap.result }}" == "canceled" ]; then
echo "Bootstrap tests failed."
exit 1
else
exit 0
fi

View File

@@ -3,5 +3,5 @@ clingo==5.7.1
flake8==7.1.1
isort==5.13.2
mypy==1.8.0
types-six==1.16.21.20241009
types-six==1.16.21.20241105
vermin==1.6.0

View File

@@ -174,7 +174,7 @@ jobs:
spack bootstrap disable github-actions-v0.6
spack bootstrap status
spack solve zlib
spack unit-test --verbose --cov --cov-config=pyproject.toml --cov-report=xml:coverage.xml lib/spack/spack/test/concretize.py
spack unit-test --verbose --cov --cov-config=pyproject.toml --cov-report=xml:coverage.xml lib/spack/spack/test/concretization/core.py
- uses: actions/upload-artifact@b4b15b8c7c6ac21ea08fcf65892d2ee8f75cf882
with:
name: coverage-clingo-cffi

View File

@@ -121,7 +121,7 @@ jobs:
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
repository: haampie/circular-import-fighter
ref: 555519c6fd5564fd2eb844e7b87e84f4d12602e2
ref: 9f60f51bc7134e0be73f27623f1b0357d1718427
path: circular-import-fighter
- name: Install dependencies
working-directory: circular-import-fighter

View File

@@ -39,11 +39,19 @@ concretizer:
# Option to deal with possible duplicate nodes (i.e. different nodes from the same package) in the DAG.
duplicates:
# "none": allows a single node for any package in the DAG.
# "minimal": allows the duplication of 'build-tools' nodes only (e.g. py-setuptools, cmake etc.)
# "minimal": allows the duplication of 'build-tools' nodes only
# (e.g. py-setuptools, cmake etc.)
# "full" (experimental): allows separation of the entire build-tool stack (e.g. the entire "cmake" subDAG)
strategy: minimal
# Option to specify compatiblity between operating systems for reuse of compilers and packages
# Specified as a key: [list] where the key is the os that is being targeted, and the list contains the OS's
# it can reuse. Note this is a directional compatibility so mutual compatibility between two OS's
# Option to specify compatibility between operating systems for reuse of compilers and packages
# Specified as a key: [list] where the key is the os that is being targeted, and the list contains the OS's
# it can reuse. Note this is a directional compatibility so mutual compatibility between two OS's
# requires two entries i.e. os_compatible: {sonoma: [monterey], monterey: [sonoma]}
os_compatible: {}
# Option to specify whether to support splicing. Splicing allows for
# the relinking of concrete package dependencies in order to better
# reuse already built packages with ABI compatible dependencies
splice:
explicit: []
automatic: false

View File

@@ -40,9 +40,9 @@ packages:
jpeg: [libjpeg-turbo, libjpeg]
lapack: [openblas, amdlibflame]
libc: [glibc, musl]
libgfortran: [ gcc-runtime ]
libgfortran: [gcc-runtime]
libglx: [mesa+glx]
libifcore: [ intel-oneapi-runtime ]
libifcore: [intel-oneapi-runtime]
libllvm: [llvm]
lua-lang: [lua, lua-luajit-openresty, lua-luajit]
luajit: [lua-luajit-openresty, lua-luajit]

View File

@@ -1359,6 +1359,10 @@ For example, for the ``stackstart`` variant:
mpileaks stackstart==4 # variant will be propagated to dependencies
mpileaks stackstart=4 # only mpileaks will have this variant value
Spack also allows variants to be propagated from a package that does
not have that variant.
^^^^^^^^^^^^^^
Compiler Flags
^^^^^^^^^^^^^^

View File

@@ -237,3 +237,35 @@ is optional -- by default, splices will be transitive.
``mpich/abcdef`` instead of ``mvapich2`` as the MPI provider. Spack
will warn the user in this case, but will not fail the
concretization.
.. _automatic_splicing:
^^^^^^^^^^^^^^^^^^
Automatic Splicing
^^^^^^^^^^^^^^^^^^
The Spack solver can be configured to do automatic splicing for
ABI-compatible packages. Automatic splices are enabled in the concretizer
config section
.. code-block:: yaml
concretizer:
splice:
automatic: True
Packages can include ABI-compatibility information using the
``can_splice`` directive. See :ref:`the packaging
guide<abi_compatibility>` for instructions on specifying ABI
compatibility using the ``can_splice`` directive.
.. note::
The ``can_splice`` directive is experimental and may be changed in
future versions.
When automatic splicing is enabled, the concretizer will combine any
number of ABI-compatible specs if possible to reuse installed packages
and packages available from binary caches. The end result of these
specs is equivalent to a series of transitive/intransitive splices,
but the series may be non-obvious.

View File

@@ -210,16 +210,18 @@ def setup(sphinx):
# Spack classes that are private and we don't want to expose
("py:class", "spack.provider_index._IndexBase"),
("py:class", "spack.repo._PrependFileLoader"),
("py:class", "spack.build_systems._checks.BaseBuilder"),
("py:class", "spack.build_systems._checks.BuilderWithDefaults"),
# Spack classes that intersphinx is unable to resolve
("py:class", "spack.version.StandardVersion"),
("py:class", "spack.spec.DependencySpec"),
("py:class", "spack.spec.ArchSpec"),
("py:class", "spack.spec.InstallStatus"),
("py:class", "spack.spec.SpecfileReaderBase"),
("py:class", "spack.install_test.Pb"),
("py:class", "spack.filesystem_view.SimpleFilesystemView"),
("py:class", "spack.traverse.EdgeAndDepth"),
("py:class", "archspec.cpu.microarchitecture.Microarchitecture"),
("py:class", "spack.compiler.CompilerCache"),
# TypeVar that is not handled correctly
("py:class", "llnl.util.lang.T"),
]

View File

@@ -184,7 +184,7 @@ Style Tests
Spack uses `Flake8 <http://flake8.pycqa.org/en/latest/>`_ to test for
`PEP 8 <https://www.python.org/dev/peps/pep-0008/>`_ conformance and
`mypy <https://mypy.readthedocs.io/en/stable/>` for type checking. PEP 8 is
`mypy <https://mypy.readthedocs.io/en/stable/>`_ for type checking. PEP 8 is
a series of style guides for Python that provide suggestions for everything
from variable naming to indentation. In order to limit the number of PRs that
were mostly style changes, we decided to enforce PEP 8 conformance. Your PR

View File

@@ -333,13 +333,9 @@ inserting them at different places in the spack code base. Whenever a hook
type triggers by way of a function call, we find all the hooks of that type,
and run them.
Spack defines hooks by way of a module at ``lib/spack/spack/hooks`` where we can define
types of hooks in the ``__init__.py``, and then python files in that folder
can use hook functions. The files are automatically parsed, so if you write
a new file for some integration (e.g., ``lib/spack/spack/hooks/myintegration.py``
you can then write hook functions in that file that will be automatically detected,
and run whenever your hook is called. This section will cover the basic kind
of hooks, and how to write them.
Spack defines hooks by way of a module in the ``lib/spack/spack/hooks`` directory.
This module has to be registered in ``__init__.py`` so that Spack is aware of it.
This section will cover the basic kind of hooks, and how to write them.
^^^^^^^^^^^^^^
Types of Hooks

View File

@@ -1042,7 +1042,7 @@ file snippet we define a view named ``mpis``, rooted at
``/path/to/view`` in which all projections use the package name,
version, and compiler name to determine the path for a given
package. This view selects all packages that depend on MPI, and
excludes those built with the PGI compiler at version 18.5.
excludes those built with the GCC compiler at version 18.5.
The root specs with their (transitive) link and run type dependencies
will be put in the view due to the ``link: all`` option,
and the files in the view will be symlinks to the spack install
@@ -1056,7 +1056,7 @@ directories.
mpis:
root: /path/to/view
select: [^mpi]
exclude: ['%pgi@18.5']
exclude: ['%gcc@18.5']
projections:
all: '{name}/{version}-{compiler.name}'
link: all

View File

@@ -35,7 +35,7 @@ A build matrix showing which packages are working on which systems is shown belo
.. code-block:: console
apt update
apt install build-essential ca-certificates coreutils curl environment-modules gfortran git gpg lsb-release python3 python3-distutils python3-venv unzip zip
apt install bzip2 ca-certificates file g++ gcc gfortran git gzip lsb-release patch python3 tar unzip xz-utils zstd
.. tab-item:: RHEL
@@ -43,14 +43,14 @@ A build matrix showing which packages are working on which systems is shown belo
dnf install epel-release
dnf group install "Development Tools"
dnf install curl findutils gcc-gfortran gnupg2 hostname iproute redhat-lsb-core python3 python3-pip python3-setuptools unzip python3-boto3
dnf install gcc-gfortran redhat-lsb-core python3 unzip
.. tab-item:: macOS Brew
.. code-block:: console
brew update
brew install curl gcc git gnupg zip
brew install gcc git zip
------------
Installation
@@ -283,10 +283,6 @@ compilers`` or ``spack compiler list``:
intel@14.0.1 intel@13.0.1 intel@12.1.2 intel@10.1
-- clang -------------------------------------------------------
clang@3.4 clang@3.3 clang@3.2 clang@3.1
-- pgi ---------------------------------------------------------
pgi@14.3-0 pgi@13.2-0 pgi@12.1-0 pgi@10.9-0 pgi@8.0-1
pgi@13.10-0 pgi@13.1-1 pgi@11.10-0 pgi@10.2-0 pgi@7.1-3
pgi@13.6-0 pgi@12.8-0 pgi@11.1-0 pgi@9.0-4 pgi@7.0-6
Any of these compilers can be used to build Spack packages. More on
how this is done is in :ref:`sec-specs`.
@@ -806,65 +802,6 @@ flags to the ``icc`` command:
spec: intel@15.0.24.4.9.3
^^^
PGI
^^^
PGI comes with two sets of compilers for C++ and Fortran,
distinguishable by their names. "Old" compilers:
.. code-block:: yaml
cc: /soft/pgi/15.10/linux86-64/15.10/bin/pgcc
cxx: /soft/pgi/15.10/linux86-64/15.10/bin/pgCC
f77: /soft/pgi/15.10/linux86-64/15.10/bin/pgf77
fc: /soft/pgi/15.10/linux86-64/15.10/bin/pgf90
"New" compilers:
.. code-block:: yaml
cc: /soft/pgi/15.10/linux86-64/15.10/bin/pgcc
cxx: /soft/pgi/15.10/linux86-64/15.10/bin/pgc++
f77: /soft/pgi/15.10/linux86-64/15.10/bin/pgfortran
fc: /soft/pgi/15.10/linux86-64/15.10/bin/pgfortran
Older installations of PGI contains just the old compilers; whereas
newer installations contain the old and the new. The new compiler is
considered preferable, as some packages
(``hdf``) will not build with the old compiler.
When auto-detecting a PGI compiler, there are cases where Spack will
find the old compilers, when you really want it to find the new
compilers. It is best to check this ``compilers.yaml``; and if the old
compilers are being used, change ``pgf77`` and ``pgf90`` to
``pgfortran``.
Other issues:
* There are reports that some packages will not build with PGI,
including ``libpciaccess`` and ``openssl``. A workaround is to
build these packages with another compiler and then use them as
dependencies for PGI-build packages. For example:
.. code-block:: console
$ spack install openmpi%pgi ^libpciaccess%gcc
* PGI requires a license to use; see :ref:`licensed-compilers` for more
information on installation.
.. note::
It is believed the problem with HDF 4 is that everything is
compiled with the ``F77`` compiler, but at some point some Fortran
90 code slipped in there. So compilers that can handle both FORTRAN
77 and Fortran 90 (``gfortran``, ``pgfortran``, etc) are fine. But
compilers specific to one or the other (``pgf77``, ``pgf90``) won't
work.
^^^
NAG
^^^

View File

@@ -12,10 +12,6 @@
Spack
===================
.. epigraph::
`These are docs for the Spack package manager. For sphere packing, see` `pyspack <https://pyspack.readthedocs.io>`_.
Spack is a package management tool designed to support multiple
versions and configurations of software on a wide variety of platforms
and environments. It was designed for large supercomputing centers,

View File

@@ -1267,7 +1267,7 @@ Git fetching supports the following parameters to ``version``:
This feature requires ``git`` to be version ``2.25.0`` or later but is useful for
large repositories that have separate portions that can be built independently.
If paths provided are directories then all the subdirectories and associated files
will also be cloned.
will also be cloned.
Only one of ``tag``, ``branch``, or ``commit`` can be used at a time.
@@ -1367,8 +1367,8 @@ Submodules
git-submodule``.
Sparse-Checkout
You can supply ``git_sparse_paths`` at the package or version level to utilize git's
sparse-checkout feature. This will only clone the paths that are specified in the
You can supply ``git_sparse_paths`` at the package or version level to utilize git's
sparse-checkout feature. This will only clone the paths that are specified in the
``git_sparse_paths`` attribute for the package along with the files in the top level directory.
This feature allows you to only clone what you need from a large repository.
Note that this is a newer feature in git and requires git ``2.25.0`` or greater.
@@ -1928,71 +1928,29 @@ to the empty list.
String. A URL pointing to license setup instructions for the software.
Defaults to the empty string.
For example, let's take a look at the package for the PGI compilers.
For example, let's take a look at the Arm Forge package.
.. code-block:: python
# Licensing
license_required = True
license_comment = "#"
license_files = ["license.dat"]
license_vars = ["PGROUPD_LICENSE_FILE", "LM_LICENSE_FILE"]
license_url = "http://www.pgroup.com/doc/pgiinstall.pdf"
license_comment = "#"
license_files = ["licences/Licence"]
license_vars = [
"ALLINEA_LICENSE_DIR",
"ALLINEA_LICENCE_DIR",
"ALLINEA_LICENSE_FILE",
"ALLINEA_LICENCE_FILE",
]
license_url = "https://developer.arm.com/documentation/101169/latest/Use-Arm-Licence-Server"
As you can see, PGI requires a license. Its license manager, FlexNet, uses
the ``#`` symbol to denote a comment. It expects the license file to be
named ``license.dat`` and to be located directly in the installation prefix.
If you would like the installation file to be located elsewhere, simply set
``PGROUPD_LICENSE_FILE`` or ``LM_LICENSE_FILE`` after installation. For
further instructions on installation and licensing, see the URL provided.
Arm Forge requires a license. Its license manager uses the ``#`` symbol to denote a comment.
It expects the license file to be named ``License`` and to be located in a ``licenses`` directory
in the installation prefix.
Let's walk through a sample PGI installation to see exactly what Spack is
and isn't capable of. Since PGI does not provide a download URL, it must
be downloaded manually. It can either be added to a mirror or located in
the current directory when ``spack install pgi`` is run. See :ref:`mirrors`
for instructions on setting up a mirror.
After running ``spack install pgi``, the first thing that will happen is
Spack will create a global license file located at
``$SPACK_ROOT/etc/spack/licenses/pgi/license.dat``. It will then open up the
file using :ref:`your favorite editor <controlling-the-editor>`. It will look like
this:
.. code-block:: sh
# A license is required to use pgi.
#
# The recommended solution is to store your license key in this global
# license file. After installation, the following symlink(s) will be
# added to point to this file (relative to the installation prefix):
#
# license.dat
#
# Alternatively, use one of the following environment variable(s):
#
# PGROUPD_LICENSE_FILE
# LM_LICENSE_FILE
#
# If you choose to store your license in a non-standard location, you may
# set one of these variable(s) to the full pathname to the license file, or
# port@host if you store your license keys on a dedicated license server.
# You will likely want to set this variable in a module file so that it
# gets loaded every time someone tries to use pgi.
#
# For further information on how to acquire a license, please refer to:
#
# http://www.pgroup.com/doc/pgiinstall.pdf
#
# You may enter your license below.
You can add your license directly to this file, or tell FlexNet to use a
license stored on a separate license server. Here is an example that
points to a license server called licman1:
.. code-block:: none
SERVER licman1.mcs.anl.gov 00163eb7fba5 27200
USE_SERVER
If you would like the installation file to be located elsewhere, simply set ``ALLINEA_LICENSE_DIR`` or
one of the other license variables after installation. For further instructions on installation and
licensing, see the URL provided.
If your package requires the license to install, you can reference the
location of this global license using ``self.global_license_file``.
@@ -2392,7 +2350,7 @@ by the ``--jobs`` option:
.. code-block:: python
:emphasize-lines: 7, 11
:linenos:
class Xios(Package):
...
def install(self, spec, prefix):
@@ -2503,15 +2461,14 @@ with. For example, suppose that in the ``libdwarf`` package you write:
depends_on("libelf@0.8")
Now ``libdwarf`` will require ``libelf`` at *exactly* version ``0.8``.
You can also specify a requirement for a particular variant or for
specific compiler flags:
Now ``libdwarf`` will require ``libelf`` in the range ``0.8``, which
includes patch versions ``0.8.1``, ``0.8.2``, etc. Apart from version
restrictions, you can also specify variants if this package requires
optional features of the dependency.
.. code-block:: python
depends_on("libelf@0.8+debug")
depends_on("libelf debug=True")
depends_on("libelf cppflags='-fPIC'")
depends_on("libelf@0.8 +parser +pic")
Both users *and* package authors can use the same spec syntax to refer
to different package configurations. Users use the spec syntax on the
@@ -2519,46 +2476,82 @@ command line to find installed packages or to install packages with
particular constraints, and package authors can use specs to describe
relationships between packages.
^^^^^^^^^^^^^^
Version ranges
^^^^^^^^^^^^^^
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Specifying backward and forward compatibility
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Although some packages require a specific version for their dependencies,
most can be built with a range of versions. For example, if you are
writing a package for a legacy Python module that only works with Python
2.4 through 2.6, this would look like:
Packages are often compatible with a range of versions of their
dependencies. This is typically referred to as backward and forward
compatibility. Spack allows you to specify this in the ``depends_on``
directive using version ranges.
**Backwards compatibility** means that the package requires at least a
certain version of its dependency:
.. code-block:: python
depends_on("python@2.4:2.6")
depends_on("python@3.10:")
Version ranges in Spack are *inclusive*, so ``2.4:2.6`` means any version
greater than or equal to ``2.4`` and up to and including any ``2.6.x``. If
you want to specify that a package works with any version of Python 3 (or
higher), this would look like:
In this case, the package requires Python 3.10 or newer.
Commonly, packages drop support for older versions of a dependency as
they release new versions. In Spack you can conveniently add every
backward compatibility rule as a separate line:
.. code-block:: python
depends_on("python@3:")
# backward compatibility with Python
depends_on("python@3.8:")
depends_on("python@3.9:", when="@1.2:")
depends_on("python@3.10:", when="@1.4:")
Here we leave out the upper bound. If you want to say that a package
requires Python 2, you can similarly leave out the lower bound:
This means that in general we need Python 3.8 or newer; from version
1.2 onwards we need Python 3.9 or newer; from version 1.4 onwards we
need Python 3.10 or newer. Notice that it's fine to have overlapping
ranges in the ``when`` clauses.
**Forward compatibility** means that the package requires at most a
certain version of its dependency. Forward compatibility rules are
necessary when there are breaking changes in the dependency that the
package cannot handle. In Spack we often add forward compatibility
bounds only at the time a new, breaking version of a dependency is
released. As with backward compatibility, it is typical to see a list
of forward compatibility bounds in a package file as separate lines:
.. code-block:: python
depends_on("python@:2")
# forward compatibility with Python
depends_on("python@:3.12", when="@:1.10")
depends_on("python@:3.13", when="@:1.12")
Notice that we didn't use ``@:3``. Version ranges are *inclusive*, so
``@:3`` means "up to and including any 3.x version".
Notice how the ``:`` now appears before the version number both in the
dependency and in the ``when`` clause. This tells Spack that in general
we need Python 3.13 or older up to version ``1.12.x``, and up to version
``1.10.x`` we need Python 3.12 or older. Said differently, forward compatibility
with Python 3.13 was added in version 1.11, while version 1.13 added forward
compatibility with Python 3.14.
You can also simply write
Notice that a version range ``@:3.12`` includes *any* patch version
number ``3.12.x``, which is often useful when specifying forward compatibility
bounds.
So far we have seen open-ended version ranges, which is by far the most
common use case. It is also possible to specify both a lower and an upper bound
on the version of a dependency, like this:
.. code-block:: python
depends_on("python@2.7")
depends_on("python@3.10:3.12")
to tell Spack that the package needs Python 2.7.x. This is equivalent to
``@2.7:2.7``.
There is a short syntax to specify that a package is compatible with, say, any
``3.x`` version:
.. code-block:: python
depends_on("python@3")
The above is equivalent to ``depends_on("python@3:3")``, which means at least
Python version 3 and at most any version ``3.x.y``.
In very rare cases, you may need to specify an exact version, for example
if you need to distinguish between ``3.2`` and ``3.2.1``:
@@ -2932,9 +2925,9 @@ make sense during the build phase may not be needed at runtime, and vice versa.
it makes sense to let a dependency set the environment variables for its dependents. To allow all
this, Spack provides four different methods that can be overridden in a package:
1. :meth:`setup_build_environment <spack.builder.Builder.setup_build_environment>`
1. :meth:`setup_build_environment <spack.builder.BaseBuilder.setup_build_environment>`
2. :meth:`setup_run_environment <spack.package_base.PackageBase.setup_run_environment>`
3. :meth:`setup_dependent_build_environment <spack.builder.Builder.setup_dependent_build_environment>`
3. :meth:`setup_dependent_build_environment <spack.builder.BaseBuilder.setup_dependent_build_environment>`
4. :meth:`setup_dependent_run_environment <spack.package_base.PackageBase.setup_dependent_run_environment>`
The Qt package, for instance, uses this call:
@@ -5385,7 +5378,7 @@ by build recipes. Examples of checking :ref:`variant settings <variants>` and
determine whether it needs to also set up build dependencies (see
:ref:`test-build-tests`).
The ``MyPackage`` package below provides two basic test examples:
The ``MyPackage`` package below provides two basic test examples:
``test_example`` and ``test_example2``. The first runs the installed
``example`` and ensures its output contains an expected string. The second
runs ``example2`` without checking output so is only concerned with confirming
@@ -5702,7 +5695,7 @@ subdirectory of the installation prefix. They are automatically copied to
the appropriate relative paths under the test stage directory prior to
executing stand-alone tests.
.. tip::
.. tip::
*Perform test-related conversions once when copying files.*
@@ -7078,6 +7071,46 @@ might write:
CXXFLAGS += -I$DWARF_PREFIX/include
CXXFLAGS += -L$DWARF_PREFIX/lib
.. _abi_compatibility:
----------------------------
Specifying ABI Compatibility
----------------------------
Packages can include ABI-compatibility information using the
``can_splice`` directive. For example, if ``Foo`` version 1.1 can
always replace version 1.0, then the package could have:
.. code-block:: python
can_splice("foo@1.0", when="@1.1")
For virtual packages, packages can also specify ABI-compatibility with
other packages providing the same virtual. For example, ``zlib-ng``
could specify:
.. code-block:: python
can_splice("zlib@1.3.1", when="@2.2+compat")
Some packages have ABI-compatibility that is dependent on matching
variant values, either for all variants or for some set of
ABI-relevant variants. In those cases, it is not necessary to specify
the full combinatorial explosion. The ``match_variants`` keyword can
cover all single-value variants.
.. code-block:: python
can_splice("foo@1.1", when="@1.2", match_variants=["bar"]) # any value for bar as long as they're the same
can_splice("foo@1.2", when="@1.3", match_variants="*") # any variant values if all single-value variants match
The concretizer will use ABI compatibility to determine automatic
splices when :ref:`automatic splicing<automatic_splicing>` is enabled.
.. note::
The ``can_splice`` directive is experimental, and may be replaced
by a higher-level interface in future versions of Spack.
.. _package_class_structure:

View File

@@ -2,8 +2,8 @@ sphinx==8.1.3
sphinxcontrib-programoutput==0.17
sphinx_design==0.6.1
sphinx-rtd-theme==3.0.1
python-levenshtein==0.26.0
docutils==0.20.1
python-levenshtein==0.26.1
docutils==0.21.2
pygments==2.18.0
urllib3==2.2.3
pytest==8.3.3

lib/spack/env/cc (vendored, 238 changes)
View File

@@ -101,10 +101,9 @@ setsep() {
esac
}
# prepend LISTNAME ELEMENT [SEP]
# prepend LISTNAME ELEMENT
#
# Prepend ELEMENT to the list stored in the variable LISTNAME,
# assuming the list is separated by SEP.
# Prepend ELEMENT to the list stored in the variable LISTNAME.
# Handles empty lists and single-element lists.
prepend() {
varname="$1"
@@ -238,6 +237,36 @@ esac
}
"
# path_list functions. Path_lists have 3 parts: spack_store_<list>, <list> and system_<list>,
# which are used to prioritize paths when assembling the final command line.
# init_path_lists LISTNAME
# Set <LISTNAME>, spack_store_<LISTNAME>, and system_<LISTNAME> to "".
init_path_lists() {
eval "spack_store_$1=\"\""
eval "$1=\"\""
eval "system_$1=\"\""
}
# assign_path_lists LISTNAME1 LISTNAME2
# Copy contents of LISTNAME2 into LISTNAME1, for each path_list prefix.
assign_path_lists() {
eval "spack_store_$1=\"\${spack_store_$2}\""
eval "$1=\"\${$2}\""
eval "system_$1=\"\${system_$2}\""
}
# append_path_lists LISTNAME ELT
# Append the provided ELT to the appropriate list, based on the result of path_order().
append_path_lists() {
path_order "$2"
case $? in
0) eval "append spack_store_$1 \"\$2\"" ;;
1) eval "append $1 \"\$2\"" ;;
2) eval "append system_$1 \"\$2\"" ;;
esac
}
# Check if optional parameters are defined
# If we aren't asking for debug flags, don't add them
if [ -z "${SPACK_ADD_DEBUG_FLAGS:-}" ]; then
@@ -470,12 +499,7 @@ input_command="$*"
parse_Wl() {
while [ $# -ne 0 ]; do
if [ "$wl_expect_rpath" = yes ]; then
path_order "$1"
case $? in
0) append return_spack_store_rpath_dirs_list "$1" ;;
1) append return_rpath_dirs_list "$1" ;;
2) append return_system_rpath_dirs_list "$1" ;;
esac
append_path_lists return_rpath_dirs_list "$1"
wl_expect_rpath=no
else
case "$1" in
@@ -484,24 +508,14 @@ parse_Wl() {
if [ -z "$arg" ]; then
shift; continue
fi
path_order "$arg"
case $? in
0) append return_spack_store_rpath_dirs_list "$arg" ;;
1) append return_rpath_dirs_list "$arg" ;;
2) append return_system_rpath_dirs_list "$arg" ;;
esac
append_path_lists return_rpath_dirs_list "$arg"
;;
--rpath=*)
arg="${1#--rpath=}"
if [ -z "$arg" ]; then
shift; continue
fi
path_order "$arg"
case $? in
0) append return_spack_store_rpath_dirs_list "$arg" ;;
1) append return_rpath_dirs_list "$arg" ;;
2) append return_system_rpath_dirs_list "$arg" ;;
esac
append_path_lists return_rpath_dirs_list "$arg"
;;
-rpath|--rpath)
wl_expect_rpath=yes
@@ -509,8 +523,7 @@ parse_Wl() {
"$dtags_to_strip")
;;
-Wl)
# Nested -Wl,-Wl means we're in NAG compiler territory, we don't support
# it.
# Nested -Wl,-Wl means we're in NAG compiler territory. We don't support it.
return 1
;;
*)
@@ -529,21 +542,10 @@ categorize_arguments() {
return_other_args_list=""
return_isystem_was_used=""
return_isystem_spack_store_include_dirs_list=""
return_isystem_system_include_dirs_list=""
return_isystem_include_dirs_list=""
return_spack_store_include_dirs_list=""
return_system_include_dirs_list=""
return_include_dirs_list=""
return_spack_store_lib_dirs_list=""
return_system_lib_dirs_list=""
return_lib_dirs_list=""
return_spack_store_rpath_dirs_list=""
return_system_rpath_dirs_list=""
return_rpath_dirs_list=""
init_path_lists return_isystem_include_dirs_list
init_path_lists return_include_dirs_list
init_path_lists return_lib_dirs_list
init_path_lists return_rpath_dirs_list
# Global state for keeping track of -Wl,-rpath -Wl,/path
wl_expect_rpath=no
@@ -609,32 +611,17 @@ categorize_arguments() {
arg="${1#-isystem}"
return_isystem_was_used=true
if [ -z "$arg" ]; then shift; arg="$1"; fi
path_order "$arg"
case $? in
0) append return_isystem_spack_store_include_dirs_list "$arg" ;;
1) append return_isystem_include_dirs_list "$arg" ;;
2) append return_isystem_system_include_dirs_list "$arg" ;;
esac
append_path_lists return_isystem_include_dirs_list "$arg"
;;
-I*)
arg="${1#-I}"
if [ -z "$arg" ]; then shift; arg="$1"; fi
path_order "$arg"
case $? in
0) append return_spack_store_include_dirs_list "$arg" ;;
1) append return_include_dirs_list "$arg" ;;
2) append return_system_include_dirs_list "$arg" ;;
esac
append_path_lists return_include_dirs_list "$arg"
;;
-L*)
arg="${1#-L}"
if [ -z "$arg" ]; then shift; arg="$1"; fi
path_order "$arg"
case $? in
0) append return_spack_store_lib_dirs_list "$arg" ;;
1) append return_lib_dirs_list "$arg" ;;
2) append return_system_lib_dirs_list "$arg" ;;
esac
append_path_lists return_lib_dirs_list "$arg"
;;
-l*)
# -loopopt=0 is generated erroneously in autoconf <= 2.69,
@@ -667,32 +654,17 @@ categorize_arguments() {
break
elif [ "$xlinker_expect_rpath" = yes ]; then
# Register the path of -Xlinker -rpath <other args> -Xlinker <path>
path_order "$1"
case $? in
0) append return_spack_store_rpath_dirs_list "$1" ;;
1) append return_rpath_dirs_list "$1" ;;
2) append return_system_rpath_dirs_list "$1" ;;
esac
append_path_lists return_rpath_dirs_list "$1"
xlinker_expect_rpath=no
else
case "$1" in
-rpath=*)
arg="${1#-rpath=}"
path_order "$arg"
case $? in
0) append return_spack_store_rpath_dirs_list "$arg" ;;
1) append return_rpath_dirs_list "$arg" ;;
2) append return_system_rpath_dirs_list "$arg" ;;
esac
append_path_lists return_rpath_dirs_list "$arg"
;;
--rpath=*)
arg="${1#--rpath=}"
path_order "$arg"
case $? in
0) append return_spack_store_rpath_dirs_list "$arg" ;;
1) append return_rpath_dirs_list "$arg" ;;
2) append return_system_rpath_dirs_list "$arg" ;;
esac
append_path_lists return_rpath_dirs_list "$arg"
;;
-rpath|--rpath)
xlinker_expect_rpath=yes
@@ -709,7 +681,36 @@ categorize_arguments() {
"$dtags_to_strip")
;;
*)
append return_other_args_list "$1"
# if mode is not ld, we can just add to other args
if [ "$mode" != "ld" ]; then
append return_other_args_list "$1"
shift
continue
fi
# if we're in linker mode, we need to parse raw RPATH args
case "$1" in
-rpath=*)
arg="${1#-rpath=}"
append_path_lists return_rpath_dirs_list "$arg"
;;
--rpath=*)
arg="${1#--rpath=}"
append_path_lists return_rpath_dirs_list "$arg"
;;
-rpath|--rpath)
if [ $# -eq 1 ]; then
# -rpath without value: let the linker raise an error.
append return_other_args_list "$1"
break
fi
shift
append_path_lists return_rpath_dirs_list "$1"
;;
*)
append return_other_args_list "$1"
;;
esac
;;
esac
shift
@@ -731,21 +732,10 @@ categorize_arguments() {
categorize_arguments "$@"
spack_store_include_dirs_list="$return_spack_store_include_dirs_list"
system_include_dirs_list="$return_system_include_dirs_list"
include_dirs_list="$return_include_dirs_list"
spack_store_lib_dirs_list="$return_spack_store_lib_dirs_list"
system_lib_dirs_list="$return_system_lib_dirs_list"
lib_dirs_list="$return_lib_dirs_list"
spack_store_rpath_dirs_list="$return_spack_store_rpath_dirs_list"
system_rpath_dirs_list="$return_system_rpath_dirs_list"
rpath_dirs_list="$return_rpath_dirs_list"
isystem_spack_store_include_dirs_list="$return_isystem_spack_store_include_dirs_list"
isystem_system_include_dirs_list="$return_isystem_system_include_dirs_list"
isystem_include_dirs_list="$return_isystem_include_dirs_list"
assign_path_lists isystem_include_dirs_list return_isystem_include_dirs_list
assign_path_lists include_dirs_list return_include_dirs_list
assign_path_lists lib_dirs_list return_lib_dirs_list
assign_path_lists rpath_dirs_list return_rpath_dirs_list
isystem_was_used="$return_isystem_was_used"
other_args_list="$return_other_args_list"
@@ -821,21 +811,10 @@ IFS="$lsep"
categorize_arguments $spack_flags_list
unset IFS
spack_flags_isystem_spack_store_include_dirs_list="$return_isystem_spack_store_include_dirs_list"
spack_flags_isystem_system_include_dirs_list="$return_isystem_system_include_dirs_list"
spack_flags_isystem_include_dirs_list="$return_isystem_include_dirs_list"
spack_flags_spack_store_include_dirs_list="$return_spack_store_include_dirs_list"
spack_flags_system_include_dirs_list="$return_system_include_dirs_list"
spack_flags_include_dirs_list="$return_include_dirs_list"
spack_flags_spack_store_lib_dirs_list="$return_spack_store_lib_dirs_list"
spack_flags_system_lib_dirs_list="$return_system_lib_dirs_list"
spack_flags_lib_dirs_list="$return_lib_dirs_list"
spack_flags_spack_store_rpath_dirs_list="$return_spack_store_rpath_dirs_list"
spack_flags_system_rpath_dirs_list="$return_system_rpath_dirs_list"
spack_flags_rpath_dirs_list="$return_rpath_dirs_list"
assign_path_lists spack_flags_isystem_include_dirs_list return_isystem_include_dirs_list
assign_path_lists spack_flags_include_dirs_list return_include_dirs_list
assign_path_lists spack_flags_lib_dirs_list return_lib_dirs_list
assign_path_lists spack_flags_rpath_dirs_list return_rpath_dirs_list
spack_flags_isystem_was_used="$return_isystem_was_used"
spack_flags_other_args_list="$return_other_args_list"
@@ -894,7 +873,7 @@ esac
case "$mode" in
cpp|cc|as|ccld)
if [ "$spack_flags_isystem_was_used" = "true" ] || [ "$isystem_was_used" = "true" ]; then
extend isystem_spack_store_include_dirs_list SPACK_STORE_INCLUDE_DIRS
extend spack_store_isystem_include_dirs_list SPACK_STORE_INCLUDE_DIRS
extend isystem_include_dirs_list SPACK_INCLUDE_DIRS
else
extend spack_store_include_dirs_list SPACK_STORE_INCLUDE_DIRS
@@ -910,64 +889,63 @@ args_list="$flags_list"
# Include search paths partitioned by (in store, non-system, system)
# NOTE: adding ${lsep} to the prefix here turns every added element into two arguments
extend args_list spack_flags_spack_store_include_dirs_list -I
extend args_list spack_store_spack_flags_include_dirs_list -I
extend args_list spack_store_include_dirs_list -I
extend args_list spack_flags_include_dirs_list -I
extend args_list include_dirs_list -I
extend args_list spack_flags_isystem_spack_store_include_dirs_list "-isystem${lsep}"
extend args_list isystem_spack_store_include_dirs_list "-isystem${lsep}"
extend args_list spack_store_spack_flags_isystem_include_dirs_list "-isystem${lsep}"
extend args_list spack_store_isystem_include_dirs_list "-isystem${lsep}"
extend args_list spack_flags_isystem_include_dirs_list "-isystem${lsep}"
extend args_list isystem_include_dirs_list "-isystem${lsep}"
extend args_list spack_flags_system_include_dirs_list -I
extend args_list system_spack_flags_include_dirs_list -I
extend args_list system_include_dirs_list -I
extend args_list spack_flags_isystem_system_include_dirs_list "-isystem${lsep}"
extend args_list isystem_system_include_dirs_list "-isystem${lsep}"
extend args_list system_spack_flags_isystem_include_dirs_list "-isystem${lsep}"
extend args_list system_isystem_include_dirs_list "-isystem${lsep}"
# Library search paths partitioned by (in store, non-system, system)
extend args_list spack_flags_spack_store_lib_dirs_list "-L"
extend args_list spack_store_spack_flags_lib_dirs_list "-L"
extend args_list spack_store_lib_dirs_list "-L"
extend args_list spack_flags_lib_dirs_list "-L"
extend args_list lib_dirs_list "-L"
extend args_list spack_flags_system_lib_dirs_list "-L"
extend args_list system_spack_flags_lib_dirs_list "-L"
extend args_list system_lib_dirs_list "-L"
# RPATHs arguments
rpath_prefix=""
case "$mode" in
ccld)
if [ -n "$dtags_to_add" ] ; then
append args_list "$linker_arg$dtags_to_add"
fi
extend args_list spack_flags_spack_store_rpath_dirs_list "$rpath"
extend args_list spack_store_rpath_dirs_list "$rpath"
extend args_list spack_flags_rpath_dirs_list "$rpath"
extend args_list rpath_dirs_list "$rpath"
extend args_list spack_flags_system_rpath_dirs_list "$rpath"
extend args_list system_rpath_dirs_list "$rpath"
rpath_prefix="$rpath"
;;
ld)
if [ -n "$dtags_to_add" ] ; then
append args_list "$dtags_to_add"
fi
extend args_list spack_flags_spack_store_rpath_dirs_list "-rpath${lsep}"
extend args_list spack_store_rpath_dirs_list "-rpath${lsep}"
extend args_list spack_flags_rpath_dirs_list "-rpath${lsep}"
extend args_list rpath_dirs_list "-rpath${lsep}"
extend args_list spack_flags_system_rpath_dirs_list "-rpath${lsep}"
extend args_list system_rpath_dirs_list "-rpath${lsep}"
rpath_prefix="-rpath${lsep}"
;;
esac
# if mode is ccld or ld, extend RPATH lists with the prefix determined above
if [ -n "$rpath_prefix" ]; then
extend args_list spack_store_spack_flags_rpath_dirs_list "$rpath_prefix"
extend args_list spack_store_rpath_dirs_list "$rpath_prefix"
extend args_list spack_flags_rpath_dirs_list "$rpath_prefix"
extend args_list rpath_dirs_list "$rpath_prefix"
extend args_list system_spack_flags_rpath_dirs_list "$rpath_prefix"
extend args_list system_rpath_dirs_list "$rpath_prefix"
fi
# Other arguments from the input command
extend args_list other_args_list
extend args_list spack_flags_other_args_list


@@ -20,11 +20,23 @@
import tempfile
from contextlib import contextmanager
from itertools import accumulate
from typing import Callable, Iterable, List, Match, Optional, Tuple, Union
from typing import (
Callable,
Deque,
Dict,
Iterable,
List,
Match,
Optional,
Sequence,
Set,
Tuple,
Union,
)
import llnl.util.symlink
from llnl.util import tty
from llnl.util.lang import dedupe, memoized
from llnl.util.lang import dedupe, fnmatch_translate_multiple, memoized
from llnl.util.symlink import islink, readlink, resolve_link_target_relative_to_the_link, symlink
from ..path import path_to_os_path, system_path_filter
@@ -85,6 +97,8 @@
"visit_directory_tree",
]
Path = Union[str, pathlib.Path]
if sys.version_info < (3, 7, 4):
# monkeypatch shutil.copystat to fix PermissionError when copying read-only
# files on Lustre when using Python < 3.7.4
@@ -1673,105 +1687,203 @@ def find_first(root: str, files: Union[Iterable[str], str], bfs_depth: int = 2)
return FindFirstFile(root, *files, bfs_depth=bfs_depth).find()
def find(root, files, recursive=True):
"""Search for ``files`` starting from the ``root`` directory.
Like GNU/BSD find but written entirely in Python.
Examples:
.. code-block:: console
$ find /usr -name python
is equivalent to:
>>> find('/usr', 'python')
.. code-block:: console
$ find /usr/local/bin -maxdepth 1 -name python
is equivalent to:
>>> find('/usr/local/bin', 'python', recursive=False)
def find(
root: Union[Path, Sequence[Path]],
files: Union[str, Sequence[str]],
recursive: bool = True,
max_depth: Optional[int] = None,
) -> List[str]:
"""Finds all files matching the patterns from ``files`` starting from ``root``. This function
returns a deterministic result for the same input and directory structure when run multiple
times. Symlinked directories are followed, and unique directories are searched only once. Each
matching file is returned only once, at the lowest depth at which it is found, in case multiple
paths exist due to symlinked directories.
Accepts any glob characters accepted by fnmatch:
========== ====================================
Pattern Meaning
========== ====================================
``*`` matches everything
``*`` matches one or more characters
``?`` matches any single character
``[seq]`` matches any character in ``seq``
``[!seq]`` matches any character not in ``seq``
========== ====================================
Parameters:
root (str): The root directory to start searching from
files (str or collections.abc.Sequence): Library name(s) to search for
recursive (bool): if False search only root folder,
if True descends top-down from the root. Defaults to True.
Examples:
Returns:
list: The files that have been found
>>> find("/usr", "*.txt", recursive=True, max_depth=2)
finds all files with the extension ``.txt`` in the directory ``/usr`` and subdirectories up to
depth 2.
>>> find(["/usr", "/var"], ["*.txt", "*.log"], recursive=True)
finds all files with the extension ``.txt`` or ``.log`` in the directories ``/usr`` and
``/var`` at any depth.
>>> find("/usr", "GL/*.h", recursive=True)
finds all header files in a directory GL at any depth in the directory ``/usr``.
Parameters:
root: One or more root directories to start searching from
files: One or more filename patterns to search for
recursive: if False search only root, if True descends from roots. Defaults to True.
max_depth: if set, don't search below this depth. Cannot be set if recursive is False
Returns a list of absolute, matching file paths.
"""
if isinstance(root, (str, pathlib.Path)):
root = [root]
elif not isinstance(root, collections.abc.Sequence):
raise TypeError(f"'root' arg must be a path or a sequence of paths, not '{type(root)}']")
if isinstance(files, str):
files = [files]
elif not isinstance(files, collections.abc.Sequence):
raise TypeError(f"'files' arg must be str or a sequence of str, not '{type(files)}']")
if recursive:
tty.debug(f"Find (recursive): {root} {str(files)}")
result = _find_recursive(root, files)
else:
tty.debug(f"Find (not recursive): {root} {str(files)}")
result = _find_non_recursive(root, files)
# If recursive is false, max_depth can only be None or 0
if max_depth and not recursive:
raise ValueError(f"max_depth ({max_depth}) cannot be set if recursive is False")
tty.debug(f"Find complete: {root} {str(files)}")
tty.debug(f"Find (max depth = {max_depth}): {root} {files}")
if not recursive:
max_depth = 0
elif max_depth is None:
max_depth = sys.maxsize
result = _find_max_depth(root, files, max_depth)
tty.debug(f"Find complete: {root} {files}")
return result
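
A minimal usage sketch of the revised API (paths and patterns are illustrative, not taken from this changeset):

    from llnl.util.filesystem import find

    # Search two roots for headers and logs, at most two directory levels deep.
    matches = find(["/usr", "/opt"], ["*.h", "*.log"], max_depth=2)

    # recursive=False is still supported and now maps internally to max_depth=0.
    top_level = find("/usr/local/bin", "python", recursive=False)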
@system_path_filter
def _find_recursive(root, search_files):
# The variable here is **on purpose** a defaultdict. The idea is that
# we want to poke the filesystem as little as possible, but still maintain
# stability in the order of the answer. Thus we are recording each library
# found in a key, and reconstructing the stable order later.
found_files = collections.defaultdict(list)
# Make the path absolute to have os.walk also return an absolute path
root = os.path.abspath(root)
for path, _, list_files in os.walk(root):
for search_file in search_files:
matches = glob.glob(os.path.join(path, search_file))
matches = [os.path.join(path, x) for x in matches]
found_files[search_file].extend(matches)
answer = []
for search_file in search_files:
answer.extend(found_files[search_file])
return answer
def _log_file_access_issue(e: OSError, path: str) -> None:
errno_name = errno.errorcode.get(e.errno, "UNKNOWN")
tty.debug(f"find must skip {path}: {errno_name} {e}")
@system_path_filter
def _find_non_recursive(root, search_files):
# The variable here is **on purpose** a defaultdict as os.list_dir
# can return files in any order (does not preserve stability)
found_files = collections.defaultdict(list)
def _file_id(s: os.stat_result) -> Tuple[int, int]:
# Note: on windows, st_ino is the file index and st_dev is the volume serial number. See
# https://github.com/python/cpython/blob/3.9/Python/fileutils.c
return (s.st_ino, s.st_dev)
# Make the path absolute to have absolute path returned
root = os.path.abspath(root)
for search_file in search_files:
matches = glob.glob(os.path.join(root, search_file))
matches = [os.path.join(root, x) for x in matches]
found_files[search_file].extend(matches)
def _dedupe_files(paths: List[str]) -> List[str]:
"""Deduplicate files by inode and device, dropping files that cannot be accessed."""
unique_files: List[str] = []
# tuple of (inode, device) for each file without following symlinks
visited: Set[Tuple[int, int]] = set()
for path in paths:
try:
stat_info = os.lstat(path)
except OSError as e:
_log_file_access_issue(e, path)
continue
file_id = _file_id(stat_info)
if file_id not in visited:
unique_files.append(path)
visited.add(file_id)
return unique_files
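
A small sketch of the dedupe behavior on POSIX (paths are illustrative): identity is the (st_ino, st_dev) pair from os.lstat, so two hard links to the same file collapse to one entry, while distinct symlinks keep their own inodes and are retained.

    import os
    from llnl.util.filesystem import _dedupe_files

    os.makedirs("/tmp/demo", exist_ok=True)
    open("/tmp/demo/a", "w").close()
    if not os.path.exists("/tmp/demo/b"):
        os.link("/tmp/demo/a", "/tmp/demo/b")  # hard link: same (st_ino, st_dev)
    assert _dedupe_files(["/tmp/demo/a", "/tmp/demo/b"]) == ["/tmp/demo/a"]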
answer = []
for search_file in search_files:
answer.extend(found_files[search_file])
return answer
def _find_max_depth(
roots: Sequence[Path], globs: Sequence[str], max_depth: int = sys.maxsize
) -> List[str]:
"""See ``find`` for the public API."""
# We optimize for the common case of simple filename only patterns: a single, combined regex
# is used. For complex patterns that include path components, we use a slower glob call from
# every directory we visit within max_depth.
filename_only_patterns = {
f"pattern_{i}": os.path.normcase(x) for i, x in enumerate(globs) if "/" not in x
}
complex_patterns = {f"pattern_{i}": x for i, x in enumerate(globs) if "/" in x}
regex = re.compile(fnmatch_translate_multiple(filename_only_patterns))
# Ordered dictionary that keeps track of what pattern found which files
matched_paths: Dict[str, List[str]] = {f"pattern_{i}": [] for i, _ in enumerate(globs)}
# Ensure returned paths are always absolute
roots = [os.path.abspath(r) for r in roots]
# Breadth-first search queue. Each element is a tuple of (depth, dir)
dir_queue: Deque[Tuple[int, str]] = collections.deque()
# Set of visited directories. Each element is a tuple of (inode, device)
visited_dirs: Set[Tuple[int, int]] = set()
for root in roots:
try:
stat_root = os.stat(root)
except OSError as e:
_log_file_access_issue(e, root)
continue
dir_id = _file_id(stat_root)
if dir_id not in visited_dirs:
dir_queue.appendleft((0, root))
visited_dirs.add(dir_id)
while dir_queue:
depth, curr_dir = dir_queue.pop()
try:
dir_iter = os.scandir(curr_dir)
except OSError as e:
_log_file_access_issue(e, curr_dir)
continue
# Use glob.glob for complex patterns.
for pattern_name, pattern in complex_patterns.items():
matched_paths[pattern_name].extend(
path for path in glob.glob(os.path.join(curr_dir, pattern))
)
# List of subdirectories by path and (inode, device) tuple
subdirs: List[Tuple[str, Tuple[int, int]]] = []
with dir_iter:
for dir_entry in dir_iter:
# Match filename only patterns
if filename_only_patterns:
m = regex.match(os.path.normcase(dir_entry.name))
if m:
for pattern_name in filename_only_patterns:
if m.group(pattern_name):
matched_paths[pattern_name].append(dir_entry.path)
break
# Collect subdirectories
if depth >= max_depth:
continue
try:
if not dir_entry.is_dir(follow_symlinks=True):
continue
if sys.platform == "win32":
# Note: st_ino/st_dev on DirEntry.stat are not set on Windows, so we have
# to call os.stat
stat_info = os.stat(dir_entry.path, follow_symlinks=True)
else:
stat_info = dir_entry.stat(follow_symlinks=True)
except OSError as e:
# Possible permission issue, or a symlink that cannot be resolved (ELOOP).
_log_file_access_issue(e, dir_entry.path)
continue
subdirs.append((dir_entry.path, _file_id(stat_info)))
# Enqueue subdirectories in a deterministic order
if subdirs:
subdirs.sort(key=lambda s: os.path.basename(s[0]))
for subdir, subdir_id in subdirs:
if subdir_id not in visited_dirs:
dir_queue.appendleft((depth + 1, subdir))
visited_dirs.add(subdir_id)
# Sort the matched paths for deterministic output
for paths in matched_paths.values():
paths.sort()
all_matching_paths = [path for paths in matched_paths.values() for path in paths]
# We only dedupe files if we have any complex patterns, since only they can match the same file
# multiple times
return _dedupe_files(all_matching_paths) if complex_patterns else all_matching_paths
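
To make the two code paths above concrete, a quick sketch of the pattern partition (patterns are illustrative): filename-only globs are folded into a single combined regex, while patterns containing a path separator fall back to per-directory glob calls.

    globs = ["*.so", "GL/*.h"]
    filename_only = {f"pattern_{i}": g for i, g in enumerate(globs) if "/" not in g}
    complex_patterns = {f"pattern_{i}": g for i, g in enumerate(globs) if "/" in g}
    assert filename_only == {"pattern_0": "*.so"}
    assert complex_patterns == {"pattern_1": "GL/*.h"}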
# Utilities for libraries and headers
@@ -2210,7 +2322,9 @@ def find_system_libraries(libraries, shared=True):
return libraries_found
def find_libraries(libraries, root, shared=True, recursive=False, runtime=True):
def find_libraries(
libraries, root, shared=True, recursive=False, runtime=True, max_depth: Optional[int] = None
):
"""Returns an iterable of full paths to libraries found in a root dir.
Accepts any glob characters accepted by fnmatch:
@@ -2231,6 +2345,8 @@ def find_libraries(libraries, root, shared=True, recursive=False, runtime=True):
otherwise for static. Defaults to True.
recursive (bool): if False search only root folder,
if True descends top-down from the root. Defaults to False.
max_depth (int): if set, don't search below this depth. Cannot be set
if recursive is False
runtime (bool): Windows only option, no-op elsewhere. If true,
search for runtime shared libs (.DLL), otherwise, search
for .Lib files. If shared is false, this has no meaning.
@@ -2239,6 +2355,7 @@ def find_libraries(libraries, root, shared=True, recursive=False, runtime=True):
Returns:
LibraryList: The libraries that have been found
"""
if isinstance(libraries, str):
libraries = [libraries]
elif not isinstance(libraries, collections.abc.Sequence):
@@ -2271,8 +2388,10 @@ def find_libraries(libraries, root, shared=True, recursive=False, runtime=True):
libraries = ["{0}.{1}".format(lib, suffix) for lib in libraries for suffix in suffixes]
if not recursive:
if max_depth:
raise ValueError(f"max_depth ({max_depth}) cannot be set if recursive is False")
# If not recursive, look for the libraries directly in root
return LibraryList(find(root, libraries, False))
return LibraryList(find(root, libraries, recursive=False))
# To speedup the search for external packages configured e.g. in /usr,
# perform first non-recursive search in root/lib then in root/lib64 and
@@ -2290,7 +2409,7 @@ def find_libraries(libraries, root, shared=True, recursive=False, runtime=True):
if found_libs:
break
else:
found_libs = find(root, libraries, True)
found_libs = find(root, libraries, recursive=True, max_depth=max_depth)
return LibraryList(found_libs)
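
A minimal usage sketch of the extended signature (library name and root are illustrative):

    from llnl.util.filesystem import find_libraries

    # Limit the recursive search to two directory levels below the root.
    libs = find_libraries("libz", "/usr", shared=True, recursive=True, max_depth=2)
    print(libs.directories)  # LibraryList also exposes names and link flags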


@@ -5,14 +5,17 @@
import collections.abc
import contextlib
import fnmatch
import functools
import itertools
import os
import re
import sys
import traceback
import typing
import warnings
from datetime import datetime, timedelta
from typing import Callable, Iterable, List, Tuple, TypeVar
from typing import Callable, Dict, Iterable, List, Tuple, TypeVar
# Ignore emacs backups when listing modules
ignore_modules = r"^\.#|~$"
@@ -858,6 +861,19 @@ def elide_list(line_list: List[str], max_num: int = 10) -> List[str]:
return line_list
if sys.version_info >= (3, 9):
PatternStr = re.Pattern[str]
else:
PatternStr = typing.Pattern[str]
def fnmatch_translate_multiple(named_patterns: Dict[str, str]) -> str:
"""Similar to ``fnmatch.translate``, but takes an ordered dictionary where keys are pattern
names, and values are filename patterns. The output is a regex that matches any of the
patterns in order, and named capture groups are used to identify which pattern matched."""
return "|".join(f"(?P<{n}>{fnmatch.translate(p)})" for n, p in named_patterns.items())
@contextlib.contextmanager
def nullcontext(*args, **kwargs):
"""Empty context manager.
@@ -870,15 +886,6 @@ class UnhashableArguments(TypeError):
"""Raise when an @memoized function receives unhashable arg or kwarg values."""
def enum(**kwargs):
"""Return an enum-like class.
Args:
**kwargs: explicit dictionary of enums
"""
return type("Enum", (object,), kwargs)
T = TypeVar("T")
@@ -914,6 +921,21 @@ def ensure_last(lst, *elements):
lst.append(lst.pop(lst.index(elt)))
class Const:
"""Class level constant, raises when trying to set the attribute"""
__slots__ = ["value"]
def __init__(self, value):
self.value = value
def __get__(self, instance, owner):
return self.value
def __set__(self, instance, value):
raise TypeError(f"Const value does not support assignment [value={self.value}]")
class TypedMutableSequence(collections.abc.MutableSequence):
"""Base class that behaves like a list, just with a different type.
@@ -1018,3 +1040,42 @@ def __init__(self, callback):
def __get__(self, instance, owner):
return self.callback(owner)
class DeprecatedProperty:
"""Data descriptor to error or warn when a deprecated property is accessed.
Derived classes must define a factory method to return an adaptor for the deprecated
property, if the descriptor is not set to error.
"""
__slots__ = ["name"]
#: 0 - Nothing
#: 1 - Warning
#: 2 - Error
error_lvl = 0
def __init__(self, name: str) -> None:
self.name = name
def __get__(self, instance, owner):
if instance is None:
return self
if self.error_lvl == 1:
warnings.warn(
f"accessing the '{self.name}' property of '{instance}', which is deprecated"
)
elif self.error_lvl == 2:
raise AttributeError(f"cannot access the '{self.name}' attribute of '{instance}'")
return self.factory(instance, owner)
def __set__(self, instance, value):
raise TypeError(
f"the deprecated property '{self.name}' of '{instance}' does not support assignment"
)
def factory(self, instance, owner):
raise NotImplementedError("must be implemented by derived classes")
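
A minimal sketch of a concrete subclass (all names are illustrative): the factory returns the replacement object, and error_lvl on the subclass decides whether access is silent, warns, or raises.

    class DeprecatedPrefix(DeprecatedProperty):
        def factory(self, instance, owner):
            return instance.home  # hand back the replacement attribute

    class Pkg:
        home = "/opt/pkg"
        prefix = DeprecatedPrefix("prefix")

    DeprecatedPrefix.error_lvl = 1     # 1 = warn on access, 2 = raise
    assert Pkg().prefix == "/opt/pkg"  # emits a warning, then returns the adaptor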


@@ -21,7 +21,7 @@
from multiprocessing.connection import Connection
from threading import Thread
from types import ModuleType
from typing import Callable, Optional
from typing import Callable, Optional, Union
import llnl.util.tty as tty
@@ -1022,3 +1022,22 @@ def wrapped(*args, **kwargs):
def _input_available(f):
return f in select.select([f], [], [], 0)[0]
LogType = Union[nixlog, winlog]
def print_message(logger: LogType, msg: str, verbose: bool = False):
"""Print the message to the log, optionally echoing.
Args:
logger: instance of the output logger (e.g. nixlog or winlog)
msg: message being output
verbose: ``True`` forces the message to echo to the terminal in addition to the
log; ``False`` (the default) leaves echoing to the logger's current settings
"""
if verbose:
with logger.force_echo():
tty.info(msg, format="g")
else:
tty.info(msg, format="g")


@@ -11,7 +11,7 @@
import spack.util.git
#: PEP440 canonical <major>.<minor>.<micro>.<devN> string
__version__ = "0.23.0.dev0"
__version__ = "0.24.0.dev0"
spack_version = __version__


@@ -714,17 +714,16 @@ def _ensure_env_methods_are_ported_to_builders(pkgs, error_cls):
for pkg_name in pkgs:
pkg_cls = spack.repo.PATH.get_pkg_class(pkg_name)
# values are either Value objects (for conditional values) or the values themselves
# values are either ConditionalValue objects or the values themselves
build_system_names = set(
v.value if isinstance(v, spack.variant.Value) else v
v.value if isinstance(v, spack.variant.ConditionalValue) else v
for _, variant in pkg_cls.variant_definitions("build_system")
for v in variant.values
)
builder_cls_names = [spack.builder.BUILDER_CLS[x].__name__ for x in build_system_names]
module = pkg_cls.module
has_builders_in_package_py = any(
getattr(module, name, False) for name in builder_cls_names
spack.builder.get_builder_class(pkg_cls, name) for name in builder_cls_names
)
if not has_builders_in_package_py:
continue
@@ -806,7 +805,7 @@ def _uses_deprecated_globals(pkgs, error_cls):
file = spack.repo.PATH.filename_for_package_name(pkg_name)
tree = ast.parse(open(file).read())
visitor = DeprecatedMagicGlobals(("std_cmake_args",))
visitor = DeprecatedMagicGlobals(("std_cmake_args", "std_meson_args", "std_pip_args"))
visitor.visit(tree)
if visitor.references_to_globals:
errors.append(

View File

@@ -87,6 +87,8 @@
from spack.stage import Stage
from spack.util.executable import which
from .enums import InstallRecordStatus
BUILD_CACHE_RELATIVE_PATH = "build_cache"
BUILD_CACHE_KEYS_RELATIVE_PATH = "_pgp"
@@ -252,7 +254,7 @@ def _associate_built_specs_with_mirror(self, cache_key, mirror_url):
spec_list = [
s
for s in db.query_local(installed=any)
for s in db.query_local(installed=InstallRecordStatus.ANY)
if s.external or db.query_local_by_spec_hash(s.dag_hash()).in_buildcache
]
@@ -1182,6 +1184,9 @@ def __init__(self, mirror: spack.mirror.Mirror, force: bool, update_index: bool)
self.tmpdir: str
self.executor: concurrent.futures.Executor
# Verify if the mirror meets the requirements to push
self.mirror.ensure_mirror_usable("push")
def __enter__(self):
self._tmpdir = tempfile.TemporaryDirectory(dir=spack.stage.get_stage_root())
self._executor = spack.util.parallel.make_concurrent_executor()


@@ -45,7 +45,6 @@
import spack.util.executable
import spack.util.path
import spack.util.spack_yaml
import spack.util.url
import spack.version
from spack.installer import PackageInstaller
@@ -603,7 +602,10 @@ def bootstrapping_sources(scope: Optional[str] = None):
current = copy.copy(entry)
metadata_dir = spack.util.path.canonicalize_path(entry["metadata"])
metadata_yaml = os.path.join(metadata_dir, METADATA_YAML_FILENAME)
with open(metadata_yaml, encoding="utf-8") as stream:
current.update(spack.util.spack_yaml.load(stream))
list_of_sources.append(current)
try:
with open(metadata_yaml, encoding="utf-8") as stream:
current.update(spack.util.spack_yaml.load(stream))
list_of_sources.append(current)
except OSError:
pass
return list_of_sources

View File

@@ -56,7 +56,6 @@
from llnl.util.symlink import symlink
from llnl.util.tty.color import cescape, colorize
import spack.build_systems._checks
import spack.build_systems.cmake
import spack.build_systems.meson
import spack.build_systems.python
@@ -1375,7 +1374,7 @@ def exitcode_msg(p):
return child_result
CONTEXT_BASES = (spack.package_base.PackageBase, spack.build_systems._checks.BaseBuilder)
CONTEXT_BASES = (spack.package_base.PackageBase, spack.builder.Builder)
def get_package_context(traceback, context=3):

View File

@@ -9,6 +9,7 @@
import spack.builder
import spack.error
import spack.phase_callbacks
import spack.relocate
import spack.spec
import spack.store
@@ -63,7 +64,7 @@ def apply_macos_rpath_fixups(builder: spack.builder.Builder):
def ensure_build_dependencies_or_raise(
spec: spack.spec.Spec, dependencies: List[spack.spec.Spec], error_msg: str
spec: spack.spec.Spec, dependencies: List[str], error_msg: str
):
"""Ensure that some build dependencies are present in the concrete spec.
@@ -71,7 +72,7 @@ def ensure_build_dependencies_or_raise(
Args:
spec: concrete spec to be checked.
dependencies: list of abstract specs to be satisfied
dependencies: list of package names of required build dependencies
error_msg: brief error message to be prepended to a longer description
Raises:
@@ -111,7 +112,7 @@ def execute_build_time_tests(builder: spack.builder.Builder):
if not builder.pkg.run_tests or not builder.build_time_test_callbacks:
return
builder.pkg.tester.phase_tests(builder, "build", builder.build_time_test_callbacks)
builder.phase_tests("build", builder.build_time_test_callbacks)
def execute_install_time_tests(builder: spack.builder.Builder):
@@ -124,11 +125,11 @@ def execute_install_time_tests(builder: spack.builder.Builder):
if not builder.pkg.run_tests or not builder.install_time_test_callbacks:
return
builder.pkg.tester.phase_tests(builder, "install", builder.install_time_test_callbacks)
builder.phase_tests("install", builder.install_time_test_callbacks)
class BaseBuilder(spack.builder.Builder):
"""Base class for builders to register common checks"""
class BuilderWithDefaults(spack.builder.Builder):
"""Base class for all specific builders with common callbacks registered."""
# Check that self.prefix is there after installation
spack.builder.run_after("install")(sanity_check_prefix)
spack.phase_callbacks.run_after("install")(sanity_check_prefix)


@@ -6,7 +6,7 @@
import os.path
import stat
import subprocess
from typing import List
from typing import Callable, List, Optional, Set, Tuple, Union
import llnl.util.filesystem as fs
import llnl.util.tty as tty
@@ -15,6 +15,9 @@
import spack.builder
import spack.error
import spack.package_base
import spack.phase_callbacks
import spack.spec
import spack.util.prefix
from spack.directives import build_system, conflicts, depends_on
from spack.multimethod import when
from spack.operating_systems.mac_os import macos_version
@@ -22,7 +25,7 @@
from spack.version import Version
from ._checks import (
BaseBuilder,
BuilderWithDefaults,
apply_macos_rpath_fixups,
ensure_build_dependencies_or_raise,
execute_build_time_tests,
@@ -69,14 +72,14 @@ def flags_to_build_system_args(self, flags):
# Legacy methods (used by too many packages to change them,
# need to forward to the builder)
def enable_or_disable(self, *args, **kwargs):
return self.builder.enable_or_disable(*args, **kwargs)
return spack.builder.create(self).enable_or_disable(*args, **kwargs)
def with_or_without(self, *args, **kwargs):
return self.builder.with_or_without(*args, **kwargs)
return spack.builder.create(self).with_or_without(*args, **kwargs)
@spack.builder.builder("autotools")
class AutotoolsBuilder(BaseBuilder):
class AutotoolsBuilder(BuilderWithDefaults):
"""The autotools builder encodes the default way of installing software built
with autotools. It has four phases that can be overridden, if need be:
@@ -157,7 +160,7 @@ class AutotoolsBuilder(BaseBuilder):
install_libtool_archives = False
@property
def patch_config_files(self):
def patch_config_files(self) -> bool:
"""Whether to update old ``config.guess`` and ``config.sub`` files
distributed with the tarball.
@@ -177,7 +180,7 @@ def patch_config_files(self):
)
@property
def _removed_la_files_log(self):
def _removed_la_files_log(self) -> str:
"""File containing the list of removed libtool archives"""
build_dir = self.build_directory
if not os.path.isabs(self.build_directory):
@@ -185,15 +188,15 @@ def _removed_la_files_log(self):
return os.path.join(build_dir, "removed_la_files.txt")
@property
def archive_files(self):
def archive_files(self) -> List[str]:
"""Files to archive for packages based on autotools"""
files = [os.path.join(self.build_directory, "config.log")]
if not self.install_libtool_archives:
files.append(self._removed_la_files_log)
return files
@spack.builder.run_after("autoreconf")
def _do_patch_config_files(self):
@spack.phase_callbacks.run_after("autoreconf")
def _do_patch_config_files(self) -> None:
"""Some packages ship with older config.guess/config.sub files and need to
have these updated when installed on a newer architecture.
@@ -294,7 +297,7 @@ def runs_ok(script_abs_path):
and set the prefix to the directory containing the `config.guess` and
`config.sub` files.
"""
raise spack.error.InstallError(msg.format(", ".join(to_be_found), self.name))
raise spack.error.InstallError(msg.format(", ".join(to_be_found), self.pkg.name))
# Copy the good files over the bad ones
for abs_path in to_be_patched:
@@ -304,8 +307,8 @@ def runs_ok(script_abs_path):
fs.copy(substitutes[name], abs_path)
os.chmod(abs_path, mode)
@spack.builder.run_before("configure")
def _patch_usr_bin_file(self):
@spack.phase_callbacks.run_before("configure")
def _patch_usr_bin_file(self) -> None:
"""On NixOS file is not available in /usr/bin/file. Patch configure
scripts to use file from path."""
@@ -316,8 +319,8 @@ def _patch_usr_bin_file(self):
with fs.keep_modification_time(*x.filenames):
x.filter(regex="/usr/bin/file", repl="file", string=True)
@spack.builder.run_before("configure")
def _set_autotools_environment_variables(self):
@spack.phase_callbacks.run_before("configure")
def _set_autotools_environment_variables(self) -> None:
"""Many autotools builds use a version of mknod.m4 that fails when
running as root unless FORCE_UNSAFE_CONFIGURE is set to 1.
@@ -330,8 +333,8 @@ def _set_autotools_environment_variables(self):
"""
os.environ["FORCE_UNSAFE_CONFIGURE"] = "1"
@spack.builder.run_before("configure")
def _do_patch_libtool_configure(self):
@spack.phase_callbacks.run_before("configure")
def _do_patch_libtool_configure(self) -> None:
"""Patch bugs that propagate from libtool macros into "configure" and
further into "libtool". Note that patches that can be fixed by patching
"libtool" directly should be implemented in the _do_patch_libtool method
@@ -358,8 +361,8 @@ def _do_patch_libtool_configure(self):
# Support Libtool 2.4.2 and older:
x.filter(regex=r'^(\s*test \$p = "-R")(; then\s*)$', repl=r'\1 || test x-l = x"$p"\2')
@spack.builder.run_after("configure")
def _do_patch_libtool(self):
@spack.phase_callbacks.run_after("configure")
def _do_patch_libtool(self) -> None:
"""If configure generates a "libtool" script that does not correctly
detect the compiler (and patch_libtool is set), patch in the correct
values for libtool variables.
@@ -507,27 +510,64 @@ def _do_patch_libtool(self):
)
@property
def configure_directory(self):
def configure_directory(self) -> str:
"""Return the directory where 'configure' resides."""
return self.pkg.stage.source_path
@property
def configure_abs_path(self):
def configure_abs_path(self) -> str:
# Absolute path to configure
configure_abs_path = os.path.join(os.path.abspath(self.configure_directory), "configure")
return configure_abs_path
@property
def build_directory(self):
def build_directory(self) -> str:
"""Override to provide another place to build the package"""
return self.configure_directory
@spack.builder.run_before("autoreconf")
def delete_configure_to_force_update(self):
@spack.phase_callbacks.run_before("autoreconf")
def delete_configure_to_force_update(self) -> None:
if self.force_autoreconf:
fs.force_remove(self.configure_abs_path)
def autoreconf(self, pkg, spec, prefix):
@property
def autoreconf_search_path_args(self) -> List[str]:
"""Search path includes for autoreconf. Add an -I flag for all `aclocal` dirs
of build deps, skips the default path of automake, move external include
flags to the back, since they might pull in unrelated m4 files shadowing
spack dependencies."""
return _autoreconf_search_path_args(self.spec)
@spack.phase_callbacks.run_after("autoreconf")
def set_configure_or_die(self) -> None:
"""Ensure the presence of a "configure" script, or raise. If the "configure"
is found, a module level attribute is set.
Raises:
RuntimeError: if the "configure" script is not found
"""
# Check if the "configure" script is there. If not raise a RuntimeError.
if not os.path.exists(self.configure_abs_path):
msg = "configure script not found in {0}"
raise RuntimeError(msg.format(self.configure_directory))
# Monkey-patch the configure script in the corresponding module
globals_for_pkg = spack.build_environment.ModuleChangePropagator(self.pkg)
globals_for_pkg.configure = Executable(self.configure_abs_path)
globals_for_pkg.propagate_changes_to_mro()
def configure_args(self) -> List[str]:
"""Return the list of all the arguments that must be passed to configure,
except ``--prefix`` which will be pre-pended to the list.
"""
return []
def autoreconf(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
) -> None:
"""Not needed usually, configure should be already there"""
# If configure exists nothing needs to be done
@@ -554,39 +594,12 @@ def autoreconf(self, pkg, spec, prefix):
autoreconf_args += self.autoreconf_extra_args
self.pkg.module.autoreconf(*autoreconf_args)
@property
def autoreconf_search_path_args(self):
"""Search path includes for autoreconf. Add an -I flag for all `aclocal` dirs
of build deps, skips the default path of automake, move external include
flags to the back, since they might pull in unrelated m4 files shadowing
spack dependencies."""
return _autoreconf_search_path_args(self.spec)
@spack.builder.run_after("autoreconf")
def set_configure_or_die(self):
"""Ensure the presence of a "configure" script, or raise. If the "configure"
is found, a module level attribute is set.
Raises:
RuntimeError: if the "configure" script is not found
"""
# Check if the "configure" script is there. If not raise a RuntimeError.
if not os.path.exists(self.configure_abs_path):
msg = "configure script not found in {0}"
raise RuntimeError(msg.format(self.configure_directory))
# Monkey-patch the configure script in the corresponding module
globals_for_pkg = spack.build_environment.ModuleChangePropagator(self.pkg)
globals_for_pkg.configure = Executable(self.configure_abs_path)
globals_for_pkg.propagate_changes_to_mro()
def configure_args(self):
"""Return the list of all the arguments that must be passed to configure,
except ``--prefix`` which will be pre-pended to the list.
"""
return []
def configure(self, pkg, spec, prefix):
def configure(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
) -> None:
"""Run "configure", with the arguments specified by the builder and an
appropriately set prefix.
"""
@@ -597,7 +610,12 @@ def configure(self, pkg, spec, prefix):
with fs.working_dir(self.build_directory, create=True):
pkg.module.configure(*options)
def build(self, pkg, spec, prefix):
def build(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
) -> None:
"""Run "make" on the build targets specified by the builder."""
# See https://autotools.io/automake/silent.html
params = ["V=1"]
@@ -605,41 +623,49 @@ def build(self, pkg, spec, prefix):
with fs.working_dir(self.build_directory):
pkg.module.make(*params)
def install(self, pkg, spec, prefix):
def install(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
) -> None:
"""Run "make" on the install targets specified by the builder."""
with fs.working_dir(self.build_directory):
pkg.module.make(*self.install_targets)
spack.builder.run_after("build")(execute_build_time_tests)
spack.phase_callbacks.run_after("build")(execute_build_time_tests)
def check(self):
def check(self) -> None:
"""Run "make" on the ``test`` and ``check`` targets, if found."""
with fs.working_dir(self.build_directory):
self.pkg._if_make_target_execute("test")
self.pkg._if_make_target_execute("check")
def _activate_or_not(
self, name, activation_word, deactivation_word, activation_value=None, variant=None
):
self,
name: str,
activation_word: str,
deactivation_word: str,
activation_value: Optional[Union[Callable, str]] = None,
variant=None,
) -> List[str]:
"""This function contain the current implementation details of
:meth:`~spack.build_systems.autotools.AutotoolsBuilder.with_or_without` and
:meth:`~spack.build_systems.autotools.AutotoolsBuilder.enable_or_disable`.
Args:
name (str): name of the option that is being activated or not
activation_word (str): the default activation word ('with' in the
case of ``with_or_without``)
deactivation_word (str): the default deactivation word ('without'
in the case of ``with_or_without``)
activation_value (typing.Callable): callable that accepts a single
value. This value is either one of the allowed values for a
multi-valued variant or the name of a bool-valued variant.
name: name of the option that is being activated or not
activation_word: the default activation word ('with' in the case of
``with_or_without``)
deactivation_word: the default deactivation word ('without' in the case of
``with_or_without``)
activation_value: callable that accepts a single value. This value is either one of the
allowed values for a multi-valued variant or the name of a bool-valued variant.
Returns the parameter to be used when the value is activated.
The special value 'prefix' can also be assigned and will return
The special value "prefix" can also be assigned and will return
``spec[name].prefix`` as activation parameter.
variant (str): name of the variant that is being processed
(if different from option name)
variant: name of the variant that is being processed (if different from option name)
Examples:
@@ -647,19 +673,19 @@ def _activate_or_not(
.. code-block:: python
variant('foo', values=('x', 'y'), description='')
variant('bar', default=True, description='')
variant('ba_z', default=True, description='')
variant("foo", values=("x", "y"), description=")
variant("bar", default=True, description=")
variant("ba_z", default=True, description=")
calling this function like:
.. code-block:: python
_activate_or_not(
'foo', 'with', 'without', activation_value='prefix'
"foo", "with", "without", activation_value="prefix"
)
_activate_or_not('bar', 'with', 'without')
_activate_or_not('ba-z', 'with', 'without', variant='ba_z')
_activate_or_not("bar", "with", "without")
_activate_or_not("ba-z", "with", "without", variant="ba_z")
will generate the following configuration options:
@@ -679,8 +705,8 @@ def _activate_or_not(
Raises:
KeyError: if name is not among known variants
"""
spec = self.pkg.spec
args = []
spec: spack.spec.Spec = self.pkg.spec
args: List[str] = []
if activation_value == "prefix":
activation_value = lambda x: spec[x].prefix
@@ -698,7 +724,7 @@ def _activate_or_not(
# Create a list of pairs. Each pair includes a configuration
# option and whether or not that option is activated
vdef = self.pkg.get_variant(variant)
if set(vdef.values) == set((True, False)):
if set(vdef.values) == set((True, False)): # type: ignore
# BoolValuedVariant carry information about a single option.
# Nonetheless, for uniformity of treatment we'll package them
# in an iterable of one element.
@@ -709,14 +735,12 @@ def _activate_or_not(
# package's build system. It excludes values which have special
# meanings and do not correspond to features (e.g. "none")
feature_values = getattr(vdef.values, "feature_values", None) or vdef.values
options = [(value, f"{variant}={value}" in spec) for value in feature_values]
options = [(v, f"{variant}={v}" in spec) for v in feature_values] # type: ignore
# For each allowed value in the list of values
for option_value, activated in options:
# Search for an override in the package for this value
override_name = "{0}_or_{1}_{2}".format(
activation_word, deactivation_word, option_value
)
override_name = f"{activation_word}_or_{deactivation_word}_{option_value}"
line_generator = getattr(self, override_name, None) or getattr(
self.pkg, override_name, None
)
@@ -725,19 +749,24 @@ def _activate_or_not(
def _default_generator(is_activated):
if is_activated:
line = "--{0}-{1}".format(activation_word, option_value)
line = f"--{activation_word}-{option_value}"
if activation_value is not None and activation_value(
option_value
): # NOQA=ignore=E501
line += "={0}".format(activation_value(option_value))
line = f"{line}={activation_value(option_value)}"
return line
return "--{0}-{1}".format(deactivation_word, option_value)
return f"--{deactivation_word}-{option_value}"
line_generator = _default_generator
args.append(line_generator(activated))
return args
def with_or_without(self, name, activation_value=None, variant=None):
def with_or_without(
self,
name: str,
activation_value: Optional[Union[Callable, str]] = None,
variant: Optional[str] = None,
) -> List[str]:
"""Inspects a variant and returns the arguments that activate
or deactivate the selected feature(s) for the configure options.
@@ -752,12 +781,11 @@ def with_or_without(self, name, activation_value=None, variant=None):
``variant=value`` is in the spec.
Args:
name (str): name of a valid multi-valued variant
activation_value (typing.Callable): callable that accepts a single
value and returns the parameter to be used leading to an entry
of the type ``--with-{name}={parameter}``.
name: name of a valid multi-valued variant
activation_value: callable that accepts a single value and returns the parameter to be
used leading to an entry of the type ``--with-{name}={parameter}``.
The special value 'prefix' can also be assigned and will return
The special value "prefix" can also be assigned and will return
``spec[name].prefix`` as activation parameter.
Returns:
@@ -765,18 +793,22 @@ def with_or_without(self, name, activation_value=None, variant=None):
"""
return self._activate_or_not(name, "with", "without", activation_value, variant)
def enable_or_disable(self, name, activation_value=None, variant=None):
def enable_or_disable(
self,
name: str,
activation_value: Optional[Union[Callable, str]] = None,
variant: Optional[str] = None,
) -> List[str]:
"""Same as
:meth:`~spack.build_systems.autotools.AutotoolsBuilder.with_or_without`
but substitute ``with`` with ``enable`` and ``without`` with ``disable``.
Args:
name (str): name of a valid multi-valued variant
activation_value (typing.Callable): if present accepts a single value
and returns the parameter to be used leading to an entry of the
type ``--enable-{name}={parameter}``
name: name of a valid multi-valued variant
activation_value: if present accepts a single value and returns the parameter to be
used leading to an entry of the type ``--enable-{name}={parameter}``
The special value 'prefix' can also be assigned and will return
The special value "prefix" can also be assigned and will return
``spec[name].prefix`` as activation parameter.
Returns:
@@ -784,15 +816,15 @@ def enable_or_disable(self, name, activation_value=None, variant=None):
"""
return self._activate_or_not(name, "enable", "disable", activation_value, variant)
spack.builder.run_after("install")(execute_install_time_tests)
spack.phase_callbacks.run_after("install")(execute_install_time_tests)
def installcheck(self):
def installcheck(self) -> None:
"""Run "make" on the ``installcheck`` target, if found."""
with fs.working_dir(self.build_directory):
self.pkg._if_make_target_execute("installcheck")
@spack.builder.run_after("install")
def remove_libtool_archives(self):
@spack.phase_callbacks.run_after("install")
def remove_libtool_archives(self) -> None:
"""Remove all .la files in prefix sub-folders if the package sets
``install_libtool_archives`` to be False.
"""
@@ -814,12 +846,13 @@ def setup_build_environment(self, env):
env.set("MACOSX_DEPLOYMENT_TARGET", "10.16")
# On macOS, force rpaths for shared library IDs and remove duplicate rpaths
spack.builder.run_after("install", when="platform=darwin")(apply_macos_rpath_fixups)
spack.phase_callbacks.run_after("install", when="platform=darwin")(apply_macos_rpath_fixups)
def _autoreconf_search_path_args(spec):
dirs_seen = set()
flags_spack, flags_external = [], []
def _autoreconf_search_path_args(spec: spack.spec.Spec) -> List[str]:
dirs_seen: Set[Tuple[int, int]] = set()
flags_spack: List[str] = []
flags_external: List[str] = []
# We don't want to add an include flag for automake's default search path.
for automake in spec.dependencies(name="automake", deptype="build"):


@@ -10,7 +10,7 @@
import llnl.util.filesystem as fs
import llnl.util.tty as tty
import spack.builder
import spack.phase_callbacks
from .cmake import CMakeBuilder, CMakePackage
@@ -332,7 +332,7 @@ def std_cmake_args(self):
args.extend(["-C", self.cache_path])
return args
@spack.builder.run_after("install")
@spack.phase_callbacks.run_after("install")
def install_cmake_cache(self):
fs.mkdirp(self.pkg.spec.prefix.share.cmake)
fs.install(self.cache_path, self.pkg.spec.prefix.share.cmake)

View File

@@ -7,10 +7,11 @@
import spack.builder
import spack.package_base
import spack.phase_callbacks
from spack.directives import build_system, depends_on
from spack.multimethod import when
from ._checks import BaseBuilder, execute_install_time_tests
from ._checks import BuilderWithDefaults, execute_install_time_tests
class CargoPackage(spack.package_base.PackageBase):
@@ -27,7 +28,7 @@ class CargoPackage(spack.package_base.PackageBase):
@spack.builder.builder("cargo")
class CargoBuilder(BaseBuilder):
class CargoBuilder(BuilderWithDefaults):
"""The Cargo builder encodes the most common way of building software with
a Rust Cargo.toml file. It has two phases that can be overridden, if need be:
@@ -77,7 +78,7 @@ def install(self, pkg, spec, prefix):
with fs.working_dir(self.build_directory):
fs.install_tree("out", prefix)
spack.builder.run_after("install")(execute_install_time_tests)
spack.phase_callbacks.run_after("install")(execute_install_time_tests)
def check(self):
"""Run "cargo test"."""


@@ -9,7 +9,7 @@
import re
import sys
from itertools import chain
from typing import List, Optional, Set, Tuple
from typing import Any, List, Optional, Set, Tuple
import llnl.util.filesystem as fs
from llnl.util.lang import stable_partition
@@ -18,11 +18,14 @@
import spack.deptypes as dt
import spack.error
import spack.package_base
import spack.phase_callbacks
import spack.spec
import spack.util.prefix
from spack.directives import build_system, conflicts, depends_on, variant
from spack.multimethod import when
from spack.util.environment import filter_system_paths
from ._checks import BaseBuilder, execute_build_time_tests
from ._checks import BuilderWithDefaults, execute_build_time_tests
# Regex to extract the primary generator from the CMake generator
# string.
@@ -48,9 +51,9 @@ def _maybe_set_python_hints(pkg: spack.package_base.PackageBase, args: List[str]
python_executable = pkg.spec["python"].command.path
args.extend(
[
CMakeBuilder.define("PYTHON_EXECUTABLE", python_executable),
CMakeBuilder.define("Python_EXECUTABLE", python_executable),
CMakeBuilder.define("Python3_EXECUTABLE", python_executable),
define("PYTHON_EXECUTABLE", python_executable),
define("Python_EXECUTABLE", python_executable),
define("Python3_EXECUTABLE", python_executable),
]
)
@@ -85,7 +88,7 @@ def _conditional_cmake_defaults(pkg: spack.package_base.PackageBase, args: List[
ipo = False
if cmake.satisfies("@3.9:"):
args.append(CMakeBuilder.define("CMAKE_INTERPROCEDURAL_OPTIMIZATION", ipo))
args.append(define("CMAKE_INTERPROCEDURAL_OPTIMIZATION", ipo))
# Disable Package Registry: export(PACKAGE) may put files in the user's home directory, and
# find_package may search there. This is not what we want.
@@ -93,30 +96,36 @@ def _conditional_cmake_defaults(pkg: spack.package_base.PackageBase, args: List[
# Do not populate CMake User Package Registry
if cmake.satisfies("@3.15:"):
# see https://cmake.org/cmake/help/latest/policy/CMP0090.html
args.append(CMakeBuilder.define("CMAKE_POLICY_DEFAULT_CMP0090", "NEW"))
args.append(define("CMAKE_POLICY_DEFAULT_CMP0090", "NEW"))
elif cmake.satisfies("@3.1:"):
# see https://cmake.org/cmake/help/latest/variable/CMAKE_EXPORT_NO_PACKAGE_REGISTRY.html
args.append(CMakeBuilder.define("CMAKE_EXPORT_NO_PACKAGE_REGISTRY", True))
args.append(define("CMAKE_EXPORT_NO_PACKAGE_REGISTRY", True))
# Do not use CMake User/System Package Registry
# https://cmake.org/cmake/help/latest/manual/cmake-packages.7.html#disabling-the-package-registry
if cmake.satisfies("@3.16:"):
args.append(CMakeBuilder.define("CMAKE_FIND_USE_PACKAGE_REGISTRY", False))
args.append(define("CMAKE_FIND_USE_PACKAGE_REGISTRY", False))
elif cmake.satisfies("@3.1:3.15"):
args.append(CMakeBuilder.define("CMAKE_FIND_PACKAGE_NO_PACKAGE_REGISTRY", False))
args.append(CMakeBuilder.define("CMAKE_FIND_PACKAGE_NO_SYSTEM_PACKAGE_REGISTRY", False))
args.append(define("CMAKE_FIND_PACKAGE_NO_PACKAGE_REGISTRY", False))
args.append(define("CMAKE_FIND_PACKAGE_NO_SYSTEM_PACKAGE_REGISTRY", False))
# Export a compilation database if supported.
if _supports_compilation_databases(pkg):
args.append(CMakeBuilder.define("CMAKE_EXPORT_COMPILE_COMMANDS", True))
args.append(define("CMAKE_EXPORT_COMPILE_COMMANDS", True))
# Enable MACOSX_RPATH by default when cmake_minimum_required < 3
# https://cmake.org/cmake/help/latest/policy/CMP0042.html
if pkg.spec.satisfies("platform=darwin") and cmake.satisfies("@3:"):
args.append(CMakeBuilder.define("CMAKE_POLICY_DEFAULT_CMP0042", "NEW"))
args.append(define("CMAKE_POLICY_DEFAULT_CMP0042", "NEW"))
# Disable find package's config mode for versions of Boost that
# didn't provide it. See https://github.com/spack/spack/issues/20169
# and https://cmake.org/cmake/help/latest/module/FindBoost.html
if pkg.spec.satisfies("^boost@:1.69.0"):
args.append(define("Boost_NO_BOOST_CMAKE", True))
def generator(*names: str, default: Optional[str] = None):
def generator(*names: str, default: Optional[str] = None) -> None:
"""The build system generator to use.
See ``cmake --help`` for a list of valid generators.
@@ -263,15 +272,15 @@ def flags_to_build_system_args(self, flags):
# Legacy methods (used by too many packages to change them,
# need to forward to the builder)
def define(self, *args, **kwargs):
return self.builder.define(*args, **kwargs)
def define(self, cmake_var: str, value: Any) -> str:
return define(cmake_var, value)
def define_from_variant(self, *args, **kwargs):
return self.builder.define_from_variant(*args, **kwargs)
def define_from_variant(self, cmake_var: str, variant: Optional[str] = None) -> str:
return define_from_variant(self, cmake_var, variant)
@spack.builder.builder("cmake")
class CMakeBuilder(BaseBuilder):
class CMakeBuilder(BuilderWithDefaults):
"""The cmake builder encodes the default way of building software with CMake. IT
has three phases that can be overridden:
@@ -321,15 +330,15 @@ class CMakeBuilder(BaseBuilder):
build_time_test_callbacks = ["check"]
@property
def archive_files(self):
def archive_files(self) -> List[str]:
"""Files to archive for packages based on CMake"""
files = [os.path.join(self.build_directory, "CMakeCache.txt")]
if _supports_compilation_databases(self):
if _supports_compilation_databases(self.pkg):
files.append(os.path.join(self.build_directory, "compile_commands.json"))
return files
@property
def root_cmakelists_dir(self):
def root_cmakelists_dir(self) -> str:
"""The relative path to the directory containing CMakeLists.txt
This path is relative to the root of the extracted tarball,
@@ -338,16 +347,17 @@ def root_cmakelists_dir(self):
return self.pkg.stage.source_path
@property
def generator(self):
def generator(self) -> str:
if self.spec.satisfies("generator=make"):
return "Unix Makefiles"
if self.spec.satisfies("generator=ninja"):
return "Ninja"
msg = f'{self.spec.format()} has an unsupported value for the "generator" variant'
raise ValueError(msg)
raise ValueError(
f'{self.spec.format()} has an unsupported value for the "generator" variant'
)
@property
def std_cmake_args(self):
def std_cmake_args(self) -> List[str]:
"""Standard cmake arguments provided as a property for
convenience of package writers
"""
@@ -356,7 +366,9 @@ def std_cmake_args(self):
return args
@staticmethod
def std_args(pkg, generator=None):
def std_args(
pkg: spack.package_base.PackageBase, generator: Optional[str] = None
) -> List[str]:
"""Computes the standard cmake arguments for a generic package"""
default_generator = "Ninja" if sys.platform == "win32" else "Unix Makefiles"
generator = generator or default_generator
@@ -373,7 +385,6 @@ def std_args(pkg, generator=None):
except KeyError:
build_type = "RelWithDebInfo"
define = CMakeBuilder.define
args = [
"-G",
generator,
@@ -405,152 +416,31 @@ def std_args(pkg, generator=None):
return args
@staticmethod
def define_cuda_architectures(pkg):
"""Returns the str ``-DCMAKE_CUDA_ARCHITECTURES:STRING=(expanded cuda_arch)``.
``cuda_arch`` is a variant composed of a list of target CUDA architectures, and
it is declared in the cuda package.
This method is a no-op for cmake<3.18 and when the ``cuda_arch`` variant is not set.
"""
cmake_flag = str()
if "cuda_arch" in pkg.spec.variants and pkg.spec.satisfies("^cmake@3.18:"):
cmake_flag = CMakeBuilder.define(
"CMAKE_CUDA_ARCHITECTURES", pkg.spec.variants["cuda_arch"].value
)
return cmake_flag
def define_cuda_architectures(pkg: spack.package_base.PackageBase) -> str:
return define_cuda_architectures(pkg)
@staticmethod
def define_hip_architectures(pkg):
"""Returns the str ``-DCMAKE_HIP_ARCHITECTURES:STRING=(expanded amdgpu_target)``.
``amdgpu_target`` is a variant composed of a list of the target HIP
architectures, and it is declared in the rocm package.
This method is a no-op for cmake<3.21 and when the ``amdgpu_target`` variant is
not set.
"""
cmake_flag = str()
if "amdgpu_target" in pkg.spec.variants and pkg.spec.satisfies("^cmake@3.21:"):
cmake_flag = CMakeBuilder.define(
"CMAKE_HIP_ARCHITECTURES", pkg.spec.variants["amdgpu_target"].value
)
return cmake_flag
def define_hip_architectures(pkg: spack.package_base.PackageBase) -> str:
return define_hip_architectures(pkg)
@staticmethod
def define(cmake_var, value):
"""Return a CMake command line argument that defines a variable.
def define(cmake_var: str, value: Any) -> str:
return define(cmake_var, value)
The resulting argument will convert boolean values to OFF/ON
and lists/tuples to CMake semicolon-separated string lists. All other
values will be interpreted as strings.
Examples:
.. code-block:: python
[define('BUILD_SHARED_LIBS', True),
define('CMAKE_CXX_STANDARD', 14),
define('swr', ['avx', 'avx2'])]
will generate the following configuration options:
.. code-block:: console
["-DBUILD_SHARED_LIBS:BOOL=ON",
"-DCMAKE_CXX_STANDARD:STRING=14",
"-DSWR:STRING=avx;avx2]
"""
# Create a list of pairs. Each pair includes a configuration
# option and whether or not that option is activated
if isinstance(value, bool):
kind = "BOOL"
value = "ON" if value else "OFF"
else:
kind = "STRING"
if isinstance(value, collections.abc.Sequence) and not isinstance(value, str):
value = ";".join(str(v) for v in value)
else:
value = str(value)
return "".join(["-D", cmake_var, ":", kind, "=", value])
def define_from_variant(self, cmake_var, variant=None):
"""Return a CMake command line argument from the given variant's value.
The optional ``variant`` argument defaults to the lower-case transform
of ``cmake_var``.
This utility function is similar to
:meth:`~spack.build_systems.autotools.AutotoolsBuilder.with_or_without`.
Examples:
Given a package with:
.. code-block:: python
variant('cxxstd', default='11', values=('11', '14'),
multi=False, description='')
variant('shared', default=True, description='')
variant('swr', values=any_combination_of('avx', 'avx2'),
description='')
calling this function like:
.. code-block:: python
[self.define_from_variant('BUILD_SHARED_LIBS', 'shared'),
self.define_from_variant('CMAKE_CXX_STANDARD', 'cxxstd'),
self.define_from_variant('SWR')]
will generate the following configuration options:
.. code-block:: console
["-DBUILD_SHARED_LIBS:BOOL=ON",
"-DCMAKE_CXX_STANDARD:STRING=14",
"-DSWR:STRING=avx;avx2]
for ``<spec-name> cxxstd=14 +shared swr=avx,avx2``
Note: if the provided variant is conditional, and the condition is not met,
this function returns an empty string. CMake discards empty strings
provided on the command line.
"""
if variant is None:
variant = cmake_var.lower()
if not self.pkg.has_variant(variant):
raise KeyError('"{0}" is not a variant of "{1}"'.format(variant, self.pkg.name))
if variant not in self.pkg.spec.variants:
return ""
value = self.pkg.spec.variants[variant].value
if isinstance(value, (tuple, list)):
# Sort multi-valued variants for reproducibility
value = sorted(value)
return self.define(cmake_var, value)
def define_from_variant(self, cmake_var: str, variant: Optional[str] = None) -> str:
return define_from_variant(self.pkg, cmake_var, variant)
@property
def build_dirname(self):
def build_dirname(self) -> str:
"""Directory name to use when building the package."""
return "spack-build-%s" % self.pkg.spec.dag_hash(7)
return f"spack-build-{self.pkg.spec.dag_hash(7)}"
@property
def build_directory(self):
def build_directory(self) -> str:
"""Full-path to the directory to use when building the package."""
return os.path.join(self.pkg.stage.path, self.build_dirname)
def cmake_args(self):
def cmake_args(self) -> List[str]:
"""List of all the arguments that must be passed to cmake, except:
* CMAKE_INSTALL_PREFIX
@@ -560,7 +450,12 @@ def cmake_args(self):
"""
return []
def cmake(self, pkg, spec, prefix):
def cmake(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
) -> None:
"""Runs ``cmake`` in the build directory"""
# skip cmake phase if it is an incremental develop build
@@ -575,7 +470,12 @@ def cmake(self, pkg, spec, prefix):
with fs.working_dir(self.build_directory, create=True):
pkg.module.cmake(*options)
def build(self, pkg, spec, prefix):
def build(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
) -> None:
"""Make the build targets"""
with fs.working_dir(self.build_directory):
if self.generator == "Unix Makefiles":
@@ -584,7 +484,12 @@ def build(self, pkg, spec, prefix):
self.build_targets.append("-v")
pkg.module.ninja(*self.build_targets)
def install(self, pkg, spec, prefix):
def install(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
) -> None:
"""Make the install targets"""
with fs.working_dir(self.build_directory):
if self.generator == "Unix Makefiles":
@@ -592,9 +497,9 @@ def install(self, pkg, spec, prefix):
elif self.generator == "Ninja":
pkg.module.ninja(*self.install_targets)
spack.builder.run_after("build")(execute_build_time_tests)
spack.phase_callbacks.run_after("build")(execute_build_time_tests)
def check(self):
def check(self) -> None:
"""Search the CMake-generated files for the targets ``test`` and ``check``,
and runs them if found.
"""
@@ -605,3 +510,133 @@ def check(self):
elif self.generator == "Ninja":
self.pkg._if_ninja_target_execute("test", jobs_env="CTEST_PARALLEL_LEVEL")
self.pkg._if_ninja_target_execute("check")
def define(cmake_var: str, value: Any) -> str:
"""Return a CMake command line argument that defines a variable.
The resulting argument will convert boolean values to OFF/ON and lists/tuples to CMake
semicolon-separated string lists. All other values will be interpreted as strings.
Examples:
.. code-block:: python
[define("BUILD_SHARED_LIBS", True),
define("CMAKE_CXX_STANDARD", 14),
define("swr", ["avx", "avx2"])]
will generate the following configuration options:
.. code-block:: console
["-DBUILD_SHARED_LIBS:BOOL=ON",
"-DCMAKE_CXX_STANDARD:STRING=14",
"-DSWR:STRING=avx;avx2]
"""
# Create a list of pairs. Each pair includes a configuration
# option and whether or not that option is activated
if isinstance(value, bool):
kind = "BOOL"
value = "ON" if value else "OFF"
else:
kind = "STRING"
if isinstance(value, collections.abc.Sequence) and not isinstance(value, str):
value = ";".join(str(v) for v in value)
else:
value = str(value)
return "".join(["-D", cmake_var, ":", kind, "=", value])
def define_from_variant(
pkg: spack.package_base.PackageBase, cmake_var: str, variant: Optional[str] = None
) -> str:
"""Return a CMake command line argument from the given variant's value.
The optional ``variant`` argument defaults to the lower-case transform
of ``cmake_var``.
Examples:
Given a package with:
.. code-block:: python
variant("cxxstd", default="11", values=("11", "14"),
multi=False, description="")
variant("shared", default=True, description="")
variant("swr", values=any_combination_of("avx", "avx2"),
description="")
calling this function like:
.. code-block:: python
[
self.define_from_variant("BUILD_SHARED_LIBS", "shared"),
self.define_from_variant("CMAKE_CXX_STANDARD", "cxxstd"),
self.define_from_variant("SWR"),
]
will generate the following configuration options:
.. code-block:: console
[
"-DBUILD_SHARED_LIBS:BOOL=ON",
"-DCMAKE_CXX_STANDARD:STRING=14",
"-DSWR:STRING=avx;avx2",
]
for ``<spec-name> cxxstd=14 +shared swr=avx,avx2``
Note: if the provided variant is conditional, and the condition is not met, this function
returns an empty string. CMake discards empty strings provided on the command line.
"""
if variant is None:
variant = cmake_var.lower()
if not pkg.has_variant(variant):
raise KeyError('"{0}" is not a variant of "{1}"'.format(variant, pkg.name))
if variant not in pkg.spec.variants:
return ""
value = pkg.spec.variants[variant].value
if isinstance(value, (tuple, list)):
# Sort multi-valued variants for reproducibility
value = sorted(value)
return define(cmake_var, value)
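# A hedged usage sketch for the module-level ``define_from_variant`` above
# (``pkg`` is a hypothetical package whose concretized spec is
# ``<name> cxxstd=14 +shared``); the builder method of the same name simply
# forwards here with ``self.pkg``:
#
#     define_from_variant(pkg, "CMAKE_CXX_STANDARD", "cxxstd")
#     # -> "-DCMAKE_CXX_STANDARD:STRING=14"
#     define_from_variant(pkg, "BUILD_SHARED_LIBS", "shared")
#     # -> "-DBUILD_SHARED_LIBS:BOOL=ON"
#     define_from_variant(pkg, "CMAKE_SWR")
#     # -> KeyError: "cmake_swr" is not a variant of the package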
def define_hip_architectures(pkg: spack.package_base.PackageBase) -> str:
"""Returns the str ``-DCMAKE_HIP_ARCHITECTURES:STRING=(expanded amdgpu_target)``.
``amdgpu_target`` is a variant composed of a list of the target HIP
architectures, declared in the rocm package.
This function is a no-op for cmake<3.21 and when the ``amdgpu_target``
variant is not set.
"""
if "amdgpu_target" in pkg.spec.variants and pkg.spec.satisfies("^cmake@3.21:"):
return define("CMAKE_HIP_ARCHITECTURES", pkg.spec.variants["amdgpu_target"].value)
return ""
def define_cuda_architectures(pkg: spack.package_base.PackageBase) -> str:
"""Returns the str ``-DCMAKE_CUDA_ARCHITECTURES:STRING=(expanded cuda_arch)``.
``cuda_arch`` is a variant composed of a list of target CUDA architectures,
declared in the cuda package.
This function is a no-op for cmake<3.18 and when the ``cuda_arch`` variant is not set.
"""
if "cuda_arch" in pkg.spec.variants and pkg.spec.satisfies("^cmake@3.18:"):
return define("CMAKE_CUDA_ARCHITECTURES", pkg.spec.variants["cuda_arch"].value)
return ""


@@ -180,13 +180,6 @@ def compute_capabilities(arch_list: Iterable[str]) -> List[str]:
conflicts("%gcc@7:", when="+cuda ^cuda@:9.1 target=x86_64:")
conflicts("%gcc@8:", when="+cuda ^cuda@:10.0.130 target=x86_64:")
conflicts("%gcc@9:", when="+cuda ^cuda@:10.2.89 target=x86_64:")
conflicts("%pgi@:14.8", when="+cuda ^cuda@:7.0.27 target=x86_64:")
conflicts("%pgi@:15.3,15.5:", when="+cuda ^cuda@7.5 target=x86_64:")
conflicts("%pgi@:16.2,16.0:16.3", when="+cuda ^cuda@8 target=x86_64:")
conflicts("%pgi@:15,18:", when="+cuda ^cuda@9.0:9.1 target=x86_64:")
conflicts("%pgi@:16,19:", when="+cuda ^cuda@9.2.88:10.0 target=x86_64:")
conflicts("%pgi@:17,20:", when="+cuda ^cuda@10.1.105:10.2.89 target=x86_64:")
conflicts("%pgi@:17,21:", when="+cuda ^cuda@11.0.2:11.1.0 target=x86_64:")
conflicts("%clang@:3.4", when="+cuda ^cuda@:7.5 target=x86_64:")
conflicts("%clang@:3.7,4:", when="+cuda ^cuda@8.0:9.0 target=x86_64:")
conflicts("%clang@:3.7,4.1:", when="+cuda ^cuda@9.1 target=x86_64:")
@@ -212,9 +205,6 @@ def compute_capabilities(arch_list: Iterable[str]) -> List[str]:
conflicts("%gcc@8:", when="+cuda ^cuda@:10.0.130 target=ppc64le:")
conflicts("%gcc@9:", when="+cuda ^cuda@:10.1.243 target=ppc64le:")
# officially, CUDA 11.0.2 only supports the system GCC 8.3 on ppc64le
conflicts("%pgi", when="+cuda ^cuda@:8 target=ppc64le:")
conflicts("%pgi@:16", when="+cuda ^cuda@:9.1.185 target=ppc64le:")
conflicts("%pgi@:17", when="+cuda ^cuda@:10 target=ppc64le:")
conflicts("%clang@4:", when="+cuda ^cuda@:9.0.176 target=ppc64le:")
conflicts("%clang@5:", when="+cuda ^cuda@:9.1 target=ppc64le:")
conflicts("%clang@6:", when="+cuda ^cuda@:9.2 target=ppc64le:")


@@ -7,8 +7,9 @@
import spack.builder
import spack.directives
import spack.package_base
import spack.phase_callbacks
from ._checks import BaseBuilder, apply_macos_rpath_fixups, execute_install_time_tests
from ._checks import BuilderWithDefaults, apply_macos_rpath_fixups, execute_install_time_tests
class Package(spack.package_base.PackageBase):
@@ -26,7 +27,7 @@ class Package(spack.package_base.PackageBase):
@spack.builder.builder("generic")
class GenericBuilder(BaseBuilder):
class GenericBuilder(BuilderWithDefaults):
"""A builder for a generic build system, that require packagers
to implement an "install" phase.
"""
@@ -44,7 +45,7 @@ class GenericBuilder(BaseBuilder):
install_time_test_callbacks = []
# On macOS, force rpaths for shared library IDs and remove duplicate rpaths
spack.builder.run_after("install", when="platform=darwin")(apply_macos_rpath_fixups)
spack.phase_callbacks.run_after("install", when="platform=darwin")(apply_macos_rpath_fixups)
# unconditionally perform any post-install phase tests
spack.builder.run_after("install")(execute_install_time_tests)
spack.phase_callbacks.run_after("install")(execute_install_time_tests)


@@ -7,10 +7,11 @@
import spack.builder
import spack.package_base
import spack.phase_callbacks
from spack.directives import build_system, extends
from spack.multimethod import when
from ._checks import BaseBuilder, execute_install_time_tests
from ._checks import BuilderWithDefaults, execute_install_time_tests
class GoPackage(spack.package_base.PackageBase):
@@ -32,7 +33,7 @@ class GoPackage(spack.package_base.PackageBase):
@spack.builder.builder("go")
class GoBuilder(BaseBuilder):
class GoBuilder(BuilderWithDefaults):
"""The Go builder encodes the most common way of building software with
a golang go.mod file. It has two phases that can be overridden, if need be:
@@ -99,7 +100,7 @@ def install(self, pkg, spec, prefix):
fs.mkdirp(prefix.bin)
fs.install(pkg.name, prefix.bin)
spack.builder.run_after("install")(execute_install_time_tests)
spack.phase_callbacks.run_after("install")(execute_install_time_tests)
def check(self):
"""Run ``go test .`` in the source directory"""


@@ -22,8 +22,8 @@
install,
)
import spack.builder
import spack.error
import spack.phase_callbacks
from spack.build_environment import dso_suffix
from spack.error import InstallError
from spack.util.environment import EnvironmentModifications
@@ -1163,7 +1163,7 @@ def _determine_license_type(self):
debug_print(license_type)
return license_type
@spack.builder.run_before("install")
@spack.phase_callbacks.run_before("install")
def configure(self):
"""Generates the silent.cfg file to pass to installer.sh.
@@ -1250,7 +1250,7 @@ def install(self, spec, prefix):
for f in glob.glob("%s/intel*log" % tmpdir):
install(f, dst)
@spack.builder.run_after("install")
@spack.phase_callbacks.run_after("install")
def validate_install(self):
# Sometimes the installer exits with an error but doesn't pass a
# non-zero exit code to spack. Check for the existence of a 'bin'
@@ -1258,7 +1258,7 @@ def validate_install(self):
if not os.path.exists(self.prefix.bin):
raise InstallError("The installer has failed to install anything.")
@spack.builder.run_after("install")
@spack.phase_callbacks.run_after("install")
def configure_rpath(self):
if "+rpath" not in self.spec:
return
@@ -1276,7 +1276,7 @@ def configure_rpath(self):
with open(compiler_cfg, "w") as fh:
fh.write("-Xlinker -rpath={0}\n".format(compilers_lib_dir))
@spack.builder.run_after("install")
@spack.phase_callbacks.run_after("install")
def configure_auto_dispatch(self):
if self._has_compilers:
if "auto_dispatch=none" in self.spec:
@@ -1300,7 +1300,7 @@ def configure_auto_dispatch(self):
with open(compiler_cfg, "a") as fh:
fh.write("-ax{0}\n".format(",".join(ad)))
@spack.builder.run_after("install")
@spack.phase_callbacks.run_after("install")
def filter_compiler_wrappers(self):
if ("+mpi" in self.spec or self.provides("mpi")) and "~newdtags" in self.spec:
bin_dir = self.component_bin_dir("mpi")
@@ -1308,7 +1308,7 @@ def filter_compiler_wrappers(self):
f = os.path.join(bin_dir, f)
filter_file("-Xlinker --enable-new-dtags", " ", f, string=True)
@spack.builder.run_after("install")
@spack.phase_callbacks.run_after("install")
def uninstall_ism(self):
# The "Intel(R) Software Improvement Program" [ahem] gets installed,
# apparently regardless of PHONEHOME_SEND_USAGE_DATA.
@@ -1340,7 +1340,7 @@ def base_lib_dir(self):
debug_print(d)
return d
@spack.builder.run_after("install")
@spack.phase_callbacks.run_after("install")
def modify_LLVMgold_rpath(self):
"""Add libimf.so and other required libraries to the RUNPATH of LLVMgold.so.


@@ -8,11 +8,14 @@
import spack.builder
import spack.package_base
import spack.phase_callbacks
import spack.spec
import spack.util.prefix
from spack.directives import build_system, conflicts, depends_on
from spack.multimethod import when
from ._checks import (
BaseBuilder,
BuilderWithDefaults,
apply_macos_rpath_fixups,
execute_build_time_tests,
execute_install_time_tests,
@@ -36,7 +39,7 @@ class MakefilePackage(spack.package_base.PackageBase):
@spack.builder.builder("makefile")
class MakefileBuilder(BaseBuilder):
class MakefileBuilder(BuilderWithDefaults):
"""The Makefile builder encodes the most common way of building software with
Makefiles. It has three phases that can be overridden, if need be:
@@ -91,35 +94,50 @@ class MakefileBuilder(BaseBuilder):
install_time_test_callbacks = ["installcheck"]
@property
def build_directory(self):
def build_directory(self) -> str:
"""Return the directory containing the main Makefile."""
return self.pkg.stage.source_path
def edit(self, pkg, spec, prefix):
def edit(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
) -> None:
"""Edit the Makefile before calling make. The default is a no-op."""
pass
def build(self, pkg, spec, prefix):
def build(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
) -> None:
"""Run "make" on the build targets specified by the builder."""
with fs.working_dir(self.build_directory):
pkg.module.make(*self.build_targets)
def install(self, pkg, spec, prefix):
def install(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
) -> None:
"""Run "make" on the install targets specified by the builder."""
with fs.working_dir(self.build_directory):
pkg.module.make(*self.install_targets)
spack.builder.run_after("build")(execute_build_time_tests)
spack.phase_callbacks.run_after("build")(execute_build_time_tests)
def check(self):
def check(self) -> None:
"""Run "make" on the ``test`` and ``check`` targets, if found."""
with fs.working_dir(self.build_directory):
self.pkg._if_make_target_execute("test")
self.pkg._if_make_target_execute("check")
spack.builder.run_after("install")(execute_install_time_tests)
spack.phase_callbacks.run_after("install")(execute_install_time_tests)
def installcheck(self):
def installcheck(self) -> None:
"""Searches the Makefile for an ``installcheck`` target
and runs it if found.
"""
@@ -127,4 +145,4 @@ def installcheck(self):
self.pkg._if_make_target_execute("installcheck")
# On macOS, force rpaths for shared library IDs and remove duplicate rpaths
spack.builder.run_after("install", when="platform=darwin")(apply_macos_rpath_fixups)
spack.phase_callbacks.run_after("install", when="platform=darwin")(apply_macos_rpath_fixups)


@@ -10,7 +10,7 @@
from spack.multimethod import when
from spack.util.executable import which
from ._checks import BaseBuilder
from ._checks import BuilderWithDefaults
class MavenPackage(spack.package_base.PackageBase):
@@ -34,7 +34,7 @@ class MavenPackage(spack.package_base.PackageBase):
@spack.builder.builder("maven")
class MavenBuilder(BaseBuilder):
class MavenBuilder(BuilderWithDefaults):
"""The Maven builder encodes the default way to build software with Maven.
It has two phases that can be overridden, if need be:


@@ -9,10 +9,13 @@
import spack.builder
import spack.package_base
import spack.phase_callbacks
import spack.spec
import spack.util.prefix
from spack.directives import build_system, conflicts, depends_on, variant
from spack.multimethod import when
from ._checks import BaseBuilder, execute_build_time_tests
from ._checks import BuilderWithDefaults, execute_build_time_tests
class MesonPackage(spack.package_base.PackageBase):
@@ -62,7 +65,7 @@ def flags_to_build_system_args(self, flags):
@spack.builder.builder("meson")
class MesonBuilder(BaseBuilder):
class MesonBuilder(BuilderWithDefaults):
"""The Meson builder encodes the default way to build software with Meson.
The builder has three phases that can be overridden, if need be:
@@ -112,7 +115,7 @@ def archive_files(self):
return [os.path.join(self.build_directory, "meson-logs", "meson-log.txt")]
@property
def root_mesonlists_dir(self):
def root_mesonlists_dir(self) -> str:
"""Relative path to the directory containing meson.build
This path is relative to the root of the extracted tarball,
@@ -121,7 +124,7 @@ def root_mesonlists_dir(self):
return self.pkg.stage.source_path
@property
def std_meson_args(self):
def std_meson_args(self) -> List[str]:
"""Standard meson arguments provided as a property for convenience
of package writers.
"""
@@ -132,7 +135,7 @@ def std_meson_args(self):
return std_meson_args
@staticmethod
def std_args(pkg):
def std_args(pkg) -> List[str]:
"""Standard meson arguments for a generic package."""
try:
build_type = pkg.spec.variants["buildtype"].value
@@ -172,7 +175,7 @@ def build_directory(self):
"""Directory to use when building the package."""
return os.path.join(self.pkg.stage.path, self.build_dirname)
def meson_args(self):
def meson_args(self) -> List[str]:
"""List of arguments that must be passed to meson, except:
* ``--prefix``
@@ -185,7 +188,12 @@ def meson_args(self):
"""
return []
def meson(self, pkg, spec, prefix):
def meson(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
) -> None:
"""Run ``meson`` in the build directory"""
options = []
if self.spec["meson"].satisfies("@0.64:"):
@@ -196,21 +204,31 @@ def meson(self, pkg, spec, prefix):
with fs.working_dir(self.build_directory, create=True):
pkg.module.meson(*options)
def build(self, pkg, spec, prefix):
def build(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
) -> None:
"""Make the build targets"""
options = ["-v"]
options += self.build_targets
with fs.working_dir(self.build_directory):
pkg.module.ninja(*options)
def install(self, pkg, spec, prefix):
def install(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
) -> None:
"""Make the install targets"""
with fs.working_dir(self.build_directory):
pkg.module.ninja(*self.install_targets)
spack.builder.run_after("build")(execute_build_time_tests)
spack.phase_callbacks.run_after("build")(execute_build_time_tests)
def check(self):
def check(self) -> None:
"""Search Meson-generated files for the target ``test`` and run it if found."""
with fs.working_dir(self.build_directory):
self.pkg._if_ninja_target_execute("test")


@@ -10,7 +10,7 @@
import spack.package_base
from spack.directives import build_system, conflicts
from ._checks import BaseBuilder
from ._checks import BuilderWithDefaults
class MSBuildPackage(spack.package_base.PackageBase):
@@ -26,7 +26,7 @@ class MSBuildPackage(spack.package_base.PackageBase):
@spack.builder.builder("msbuild")
class MSBuildBuilder(BaseBuilder):
class MSBuildBuilder(BuilderWithDefaults):
"""The MSBuild builder encodes the most common way of building software with
Microsoft's MSBuild tool. It has two phases that can be overridden, if need be:


@@ -10,7 +10,7 @@
import spack.package_base
from spack.directives import build_system, conflicts
from ._checks import BaseBuilder
from ._checks import BuilderWithDefaults
class NMakePackage(spack.package_base.PackageBase):
@@ -26,7 +26,7 @@ class NMakePackage(spack.package_base.PackageBase):
@spack.builder.builder("nmake")
class NMakeBuilder(BaseBuilder):
class NMakeBuilder(BuilderWithDefaults):
"""The NMake builder encodes the most common way of building software with
Microsoft's NMake tool. It has two phases that can be overridden, if need be:


@@ -7,7 +7,7 @@
from spack.directives import build_system, extends
from spack.multimethod import when
from ._checks import BaseBuilder
from ._checks import BuilderWithDefaults
class OctavePackage(spack.package_base.PackageBase):
@@ -29,7 +29,7 @@ class OctavePackage(spack.package_base.PackageBase):
@spack.builder.builder("octave")
class OctaveBuilder(BaseBuilder):
class OctaveBuilder(BuilderWithDefaults):
"""The octave builder provides the following phases that can be overridden:
1. :py:meth:`~.OctaveBuilder.install`


@@ -10,11 +10,12 @@
import spack.builder
import spack.package_base
import spack.phase_callbacks
from spack.directives import build_system, extends
from spack.install_test import SkipTest, test_part
from spack.util.executable import Executable
from ._checks import BaseBuilder, execute_build_time_tests
from ._checks import BuilderWithDefaults, execute_build_time_tests
class PerlPackage(spack.package_base.PackageBase):
@@ -84,7 +85,7 @@ def test_use(self):
@spack.builder.builder("perl")
class PerlBuilder(BaseBuilder):
class PerlBuilder(BuilderWithDefaults):
"""The perl builder provides four phases that can be overridden, if required:
1. :py:meth:`~.PerlBuilder.configure`
@@ -163,7 +164,7 @@ def configure(self, pkg, spec, prefix):
# Build.PL may be too long causing the build to fail. Patching the shebang
# does not happen until after install so set '/usr/bin/env perl' here in
# the Build script.
@spack.builder.run_after("configure")
@spack.phase_callbacks.run_after("configure")
def fix_shebang(self):
if self.build_method == "Build.PL":
pattern = "#!{0}".format(self.spec["perl"].command.path)
@@ -175,7 +176,7 @@ def build(self, pkg, spec, prefix):
self.build_executable()
# Ensure that tests run after build (if requested):
spack.builder.run_after("build")(execute_build_time_tests)
spack.phase_callbacks.run_after("build")(execute_build_time_tests)
def check(self):
"""Runs built-in tests of a Perl package."""


@@ -24,6 +24,7 @@
import spack.detection
import spack.multimethod
import spack.package_base
import spack.phase_callbacks
import spack.platforms
import spack.repo
import spack.spec
@@ -34,7 +35,7 @@
from spack.spec import Spec
from spack.util.prefix import Prefix
from ._checks import BaseBuilder, execute_install_time_tests
from ._checks import BuilderWithDefaults, execute_install_time_tests
def _flatten_dict(dictionary: Mapping[str, object]) -> Iterable[str]:
@@ -374,7 +375,7 @@ def list_url(cls) -> Optional[str]: # type: ignore[override]
return None
@property
def python_spec(self):
def python_spec(self) -> Spec:
"""Get python-venv if it exists or python otherwise."""
python, *_ = self.spec.dependencies("python-venv") or self.spec.dependencies("python")
return python
@@ -425,7 +426,7 @@ def libs(self) -> LibraryList:
@spack.builder.builder("python_pip")
class PythonPipBuilder(BaseBuilder):
class PythonPipBuilder(BuilderWithDefaults):
phases = ("install",)
#: Names associated with package methods in the old build-system format
@@ -543,4 +544,4 @@ def install(self, pkg: PythonPackage, spec: Spec, prefix: Prefix) -> None:
with fs.working_dir(self.build_directory):
pip(*args)
spack.builder.run_after("install")(execute_install_time_tests)
spack.phase_callbacks.run_after("install")(execute_install_time_tests)


@@ -6,9 +6,10 @@
import spack.builder
import spack.package_base
import spack.phase_callbacks
from spack.directives import build_system, depends_on
from ._checks import BaseBuilder, execute_build_time_tests
from ._checks import BuilderWithDefaults, execute_build_time_tests
class QMakePackage(spack.package_base.PackageBase):
@@ -30,7 +31,7 @@ class QMakePackage(spack.package_base.PackageBase):
@spack.builder.builder("qmake")
class QMakeBuilder(BaseBuilder):
class QMakeBuilder(BuilderWithDefaults):
"""The qmake builder provides three phases that can be overridden:
1. :py:meth:`~.QMakeBuilder.qmake`
@@ -81,4 +82,4 @@ def check(self):
with working_dir(self.build_directory):
self.pkg._if_make_target_execute("check")
spack.builder.run_after("build")(execute_build_time_tests)
spack.phase_callbacks.run_after("build")(execute_build_time_tests)


@@ -8,7 +8,7 @@
import spack.package_base
from spack.directives import build_system, extends, maintainers
from ._checks import BaseBuilder
from ._checks import BuilderWithDefaults
class RubyPackage(spack.package_base.PackageBase):
@@ -28,7 +28,7 @@ class RubyPackage(spack.package_base.PackageBase):
@spack.builder.builder("ruby")
class RubyBuilder(BaseBuilder):
class RubyBuilder(BuilderWithDefaults):
"""The Ruby builder provides two phases that can be overridden if required:
#. :py:meth:`~.RubyBuilder.build`


@@ -4,9 +4,10 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import spack.builder
import spack.package_base
import spack.phase_callbacks
from spack.directives import build_system, depends_on
from ._checks import BaseBuilder, execute_build_time_tests
from ._checks import BuilderWithDefaults, execute_build_time_tests
class SConsPackage(spack.package_base.PackageBase):
@@ -28,7 +29,7 @@ class SConsPackage(spack.package_base.PackageBase):
@spack.builder.builder("scons")
class SConsBuilder(BaseBuilder):
class SConsBuilder(BuilderWithDefaults):
"""The Scons builder provides the following phases that can be overridden:
1. :py:meth:`~.SConsBuilder.build`
@@ -79,4 +80,4 @@ def build_test(self):
"""
pass
spack.builder.run_after("build")(execute_build_time_tests)
spack.phase_callbacks.run_after("build")(execute_build_time_tests)


@@ -11,11 +11,12 @@
import spack.builder
import spack.install_test
import spack.package_base
import spack.phase_callbacks
from spack.directives import build_system, depends_on, extends
from spack.multimethod import when
from spack.util.executable import Executable
from ._checks import BaseBuilder, execute_install_time_tests
from ._checks import BuilderWithDefaults, execute_install_time_tests
class SIPPackage(spack.package_base.PackageBase):
@@ -103,7 +104,7 @@ def test_imports(self):
@spack.builder.builder("sip")
class SIPBuilder(BaseBuilder):
class SIPBuilder(BuilderWithDefaults):
"""The SIP builder provides the following phases that can be overridden:
* configure
@@ -170,4 +171,4 @@ def install_args(self):
"""Arguments to pass to install."""
return []
spack.builder.run_after("install")(execute_install_time_tests)
spack.phase_callbacks.run_after("install")(execute_install_time_tests)


@@ -6,9 +6,10 @@
import spack.builder
import spack.package_base
import spack.phase_callbacks
from spack.directives import build_system, depends_on
from ._checks import BaseBuilder, execute_build_time_tests, execute_install_time_tests
from ._checks import BuilderWithDefaults, execute_build_time_tests, execute_install_time_tests
class WafPackage(spack.package_base.PackageBase):
@@ -30,7 +31,7 @@ class WafPackage(spack.package_base.PackageBase):
@spack.builder.builder("waf")
class WafBuilder(BaseBuilder):
class WafBuilder(BuilderWithDefaults):
"""The WAF builder provides the following phases that can be overridden:
* configure
@@ -136,7 +137,7 @@ def build_test(self):
"""
pass
spack.builder.run_after("build")(execute_build_time_tests)
spack.phase_callbacks.run_after("build")(execute_build_time_tests)
def install_test(self):
"""Run unit tests after install.
@@ -146,4 +147,4 @@ def install_test(self):
"""
pass
spack.builder.run_after("install")(execute_install_time_tests)
spack.phase_callbacks.run_after("install")(execute_install_time_tests)


@@ -6,43 +6,35 @@
import collections.abc
import copy
import functools
from typing import List, Optional, Tuple
from typing import Dict, List, Optional, Tuple, Type
from llnl.util import lang
import llnl.util.tty as tty
import llnl.util.tty.log as log
import spack.config
import spack.error
import spack.install_test
import spack.multimethod
import spack.package_base
import spack.phase_callbacks
import spack.repo
import spack.spec
import spack.util.environment
#: Builder classes, as registered by the "builder" decorator
BUILDER_CLS = {}
#: An object of this kind is a shared global state used to collect callbacks during
#: class definition time, and is flushed when the class object is created at the end
#: of the class definition
#:
#: Args:
#: attribute_name (str): name of the attribute that will be attached to the builder
#: callbacks (list): container used to temporarily aggregate the callbacks
CallbackTemporaryStage = collections.namedtuple(
"CallbackTemporaryStage", ["attribute_name", "callbacks"]
)
#: Shared global state to aggregate "@run_before" callbacks
_RUN_BEFORE = CallbackTemporaryStage(attribute_name="run_before_callbacks", callbacks=[])
#: Shared global state to aggregate "@run_after" callbacks
_RUN_AFTER = CallbackTemporaryStage(attribute_name="run_after_callbacks", callbacks=[])
BUILDER_CLS: Dict[str, Type["Builder"]] = {}
#: Map id(pkg) to a builder, to avoid creating multiple
#: builders for the same package object.
_BUILDERS = {}
_BUILDERS: Dict[int, "Builder"] = {}
def builder(build_system_name):
def builder(build_system_name: str):
"""Class decorator used to register the default builder
for a given build-system.
Args:
build_system_name (str): name of the build-system
build_system_name: name of the build-system
"""
def _decorator(cls):
@@ -53,13 +45,9 @@ def _decorator(cls):
return _decorator
def create(pkg):
"""Given a package object with an associated concrete spec,
return the builder object that can install it.
Args:
pkg (spack.package_base.PackageBase): package for which we want the builder
"""
def create(pkg: spack.package_base.PackageBase) -> "Builder":
"""Given a package object with an associated concrete spec, return the builder object that can
install it."""
if id(pkg) not in _BUILDERS:
_BUILDERS[id(pkg)] = _create(pkg)
return _BUILDERS[id(pkg)]
@@ -74,7 +62,15 @@ def __call__(self, spec, prefix):
return self.phase_fn(self.builder.pkg, spec, prefix)
def _create(pkg):
def get_builder_class(pkg, name: str) -> Optional[Type["Builder"]]:
"""Return the builder class if a package module defines it."""
cls = getattr(pkg.module, name, None)
if cls and cls.__module__.startswith(spack.repo.ROOT_PYTHON_NAMESPACE):
return cls
return None
def _create(pkg: spack.package_base.PackageBase) -> "Builder":
"""Return a new builder object for the package object being passed as argument.
The function inspects the build-system used by the package object and tries to:
@@ -94,14 +90,15 @@ class hierarchy (look at AspellDictPackage for an example of that)
to look for build-related methods in the ``*Package``.
Args:
pkg (spack.package_base.PackageBase): package object for which we need a builder
pkg: package object for which we need a builder
"""
package_buildsystem = buildsystem_name(pkg)
default_builder_cls = BUILDER_CLS[package_buildsystem]
builder_cls_name = default_builder_cls.__name__
builder_cls = getattr(pkg.module, builder_cls_name, None)
if builder_cls:
return builder_cls(pkg)
builder_class = get_builder_class(pkg, builder_cls_name)
if builder_class:
return builder_class(pkg)
# Specialized version of a given buildsystem can subclass some
# base classes and specialize certain phases or methods or attributes.
@@ -130,11 +127,16 @@ def __init__(self, wrapped_pkg_object, root_builder):
new_cls_name,
bases,
{
# boolean to indicate whether install-time tests are run
"run_tests": property(lambda x: x.wrapped_package_object.run_tests),
# boolean to indicate whether the package's stand-alone tests
# require a compiler
"test_requires_compiler": property(
lambda x: x.wrapped_package_object.test_requires_compiler
),
# TestSuite instance the spec is a part of
"test_suite": property(lambda x: x.wrapped_package_object.test_suite),
# PackageTest instance to manage the spec's testing
"tester": property(lambda x: x.wrapped_package_object.tester),
},
)
@@ -158,8 +160,8 @@ def __forward(self, *args, **kwargs):
# with the same name is defined in the Package, it will override this definition
# (when _ForwardToBaseBuilder is initialized)
for method_name in (
base_cls.phases
+ base_cls.legacy_methods
base_cls.phases # type: ignore
+ base_cls.legacy_methods # type: ignore
+ getattr(base_cls, "legacy_long_methods", tuple())
+ ("setup_build_environment", "setup_dependent_build_environment")
):
@@ -171,14 +173,14 @@ def __forward(self):
return __forward
for attribute_name in base_cls.legacy_attributes:
for attribute_name in base_cls.legacy_attributes: # type: ignore
setattr(
_ForwardToBaseBuilder,
attribute_name,
property(forward_property_to_getattr(attribute_name)),
)
class Adapter(base_cls, metaclass=_PackageAdapterMeta):
class Adapter(base_cls, metaclass=_PackageAdapterMeta): # type: ignore
def __init__(self, pkg):
# Deal with custom phases in packages here
if hasattr(pkg, "phases"):
@@ -203,99 +205,18 @@ def setup_dependent_build_environment(self, env, dependent_spec):
return Adapter(pkg)
def buildsystem_name(pkg):
def buildsystem_name(pkg: spack.package_base.PackageBase) -> str:
"""Given a package object with an associated concrete spec,
return the name of its build system.
Args:
pkg (spack.package_base.PackageBase): package for which we want
the build system name
"""
return the name of its build system."""
try:
return pkg.spec.variants["build_system"].value
except KeyError:
# We are reading an old spec without the build_system variant
return pkg.legacy_buildsystem
class PhaseCallbacksMeta(type):
"""Permit to register arbitrary functions during class definition and run them
later, before or after a given install phase.
Each method decorated with ``run_before`` or ``run_after`` gets temporarily
stored in a global shared state when a class being defined is parsed by the Python
interpreter. At class definition time that temporary storage gets flushed and a list
of callbacks is attached to the class being defined.
"""
def __new__(mcs, name, bases, attr_dict):
for temporary_stage in (_RUN_BEFORE, _RUN_AFTER):
staged_callbacks = temporary_stage.callbacks
# Here we have an adapter from an old-style package. This means there is no
# hierarchy of builders, and every callback that had to be combined between
# *Package and *Builder has been combined already by _PackageAdapterMeta
if name == "Adapter":
continue
# If we are here we have callbacks. To get a complete list, we accumulate all the
# callbacks from base classes, we deduplicate them, then prepend what we have
# registered here.
#
# The order should be:
# 1. Callbacks are registered in order within the same class
# 2. Callbacks defined in derived classes precede those defined in base
# classes
callbacks_from_base = []
for base in bases:
current_callbacks = getattr(base, temporary_stage.attribute_name, None)
if not current_callbacks:
continue
callbacks_from_base.extend(current_callbacks)
callbacks_from_base = list(lang.dedupe(callbacks_from_base))
# Set the callbacks in this class and flush the temporary stage
attr_dict[temporary_stage.attribute_name] = staged_callbacks[:] + callbacks_from_base
del temporary_stage.callbacks[:]
return super(PhaseCallbacksMeta, mcs).__new__(mcs, name, bases, attr_dict)
@staticmethod
def run_after(phase, when=None):
"""Decorator to register a function for running after a given phase.
Args:
phase (str): phase after which the function must run.
when (str): condition under which the function is run (if None, it is always run).
"""
def _decorator(fn):
key = (phase, when)
item = (key, fn)
_RUN_AFTER.callbacks.append(item)
return fn
return _decorator
@staticmethod
def run_before(phase, when=None):
"""Decorator to register a function for running before a given phase.
Args:
phase (str): phase before which the function must run.
when (str): condition under which the function is run (if None, it is always run).
"""
def _decorator(fn):
key = (phase, when)
item = (key, fn)
_RUN_BEFORE.callbacks.append(item)
return fn
return _decorator
return pkg.legacy_buildsystem # type: ignore
class BuilderMeta(
PhaseCallbacksMeta,
spack.phase_callbacks.PhaseCallbacksMeta,
spack.multimethod.MultiMethodMeta,
type(collections.abc.Sequence), # type: ignore
):
@@ -390,8 +311,12 @@ def __new__(mcs, name, bases, attr_dict):
)
combine_callbacks = _PackageAdapterMeta.combine_callbacks
attr_dict[_RUN_BEFORE.attribute_name] = combine_callbacks(_RUN_BEFORE.attribute_name)
attr_dict[_RUN_AFTER.attribute_name] = combine_callbacks(_RUN_AFTER.attribute_name)
attr_dict[spack.phase_callbacks._RUN_BEFORE.attribute_name] = combine_callbacks(
spack.phase_callbacks._RUN_BEFORE.attribute_name
)
attr_dict[spack.phase_callbacks._RUN_AFTER.attribute_name] = combine_callbacks(
spack.phase_callbacks._RUN_AFTER.attribute_name
)
return super(_PackageAdapterMeta, mcs).__new__(mcs, name, bases, attr_dict)
@@ -411,8 +336,8 @@ def __init__(self, name, builder):
self.name = name
self.builder = builder
self.phase_fn = self._select_phase_fn()
self.run_before = self._make_callbacks(_RUN_BEFORE.attribute_name)
self.run_after = self._make_callbacks(_RUN_AFTER.attribute_name)
self.run_before = self._make_callbacks(spack.phase_callbacks._RUN_BEFORE.attribute_name)
self.run_after = self._make_callbacks(spack.phase_callbacks._RUN_AFTER.attribute_name)
def _make_callbacks(self, callbacks_attribute):
result = []
@@ -473,15 +398,103 @@ def copy(self):
return copy.deepcopy(self)
class Builder(collections.abc.Sequence, metaclass=BuilderMeta):
"""A builder is a class that, given a package object (i.e. associated with
concrete spec), knows how to install it.
class BaseBuilder(metaclass=BuilderMeta):
"""An interface for builders, without any phases defined. This class is exposed in the package
API, so that packagers can create a single class to define ``setup_build_environment`` and
``@run_before`` and ``@run_after`` callbacks that can be shared among different builders.
The builder behaves like a sequence, and when iterated over return the
"phases" of the installation in the correct order.
Example:
Args:
pkg (spack.package_base.PackageBase): package object to be built
.. code-block:: python
class AnyBuilder(BaseBuilder):
@run_after("install")
def fixup_install(self):
# do something after the package is installed
pass
def setup_build_environment(self, env):
env.set("MY_ENV_VAR", "my_value")
class CMakeBuilder(cmake.CMakeBuilder, AnyBuilder):
pass
class AutotoolsBuilder(autotools.AutotoolsBuilder, AnyBuilder):
pass
"""
def __init__(self, pkg: spack.package_base.PackageBase) -> None:
self.pkg = pkg
@property
def spec(self) -> spack.spec.Spec:
return self.pkg.spec
@property
def stage(self):
return self.pkg.stage
@property
def prefix(self):
return self.pkg.prefix
def setup_build_environment(
self, env: spack.util.environment.EnvironmentModifications
) -> None:
"""Sets up the build environment for a package.
This method will be called before the current package prefix exists in
Spack's store.
Args:
env: environment modifications to be applied when the package is built. Package authors
can call methods on it to alter the build environment.
"""
if not hasattr(super(), "setup_build_environment"):
return
super().setup_build_environment(env) # type: ignore
def setup_dependent_build_environment(
self, env: spack.util.environment.EnvironmentModifications, dependent_spec: spack.spec.Spec
) -> None:
"""Sets up the build environment of a package that depends on this one.
This is similar to ``setup_build_environment``, but it is used to modify the build
environment of a package that *depends* on this one.
This gives packages the ability to set environment variables for the build of the
dependent, which can be useful to provide search hints for headers or libraries if they are
not in standard locations.
This method will be called before the dependent package prefix exists in Spack's store.
Args:
env: environment modifications to be applied when the dependent package is built.
Package authors can call methods on it to alter the build environment.
dependent_spec: the spec of the dependent package about to be built. This allows the
extendee (self) to query the dependent's state. Note that *this* package's spec is
available as ``self.spec``
"""
if not hasattr(super(), "setup_dependent_build_environment"):
return
super().setup_dependent_build_environment(env, dependent_spec) # type: ignore
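# A hedged, purely illustrative override (hypothetical extension-style
# package; ``prepend_path`` is an existing EnvironmentModifications helper):
#
#     def setup_dependent_build_environment(self, env, dependent_spec):
#         # hypothetical layout: expose our libraries to the dependent build
#         env.prepend_path("PYTHONPATH", self.prefix.lib)
#         super().setup_dependent_build_environment(env, dependent_spec)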
def __repr__(self):
fmt = "{name}{/hash:7}"
return f"{self.__class__.__name__}({self.spec.format(fmt)})"
def __str__(self):
fmt = "{name}{/hash:7}"
return f'"{self.__class__.__name__}" builder for "{self.spec.format(fmt)}"'
class Builder(BaseBuilder, collections.abc.Sequence):
"""A builder is a class that, given a package object (i.e. associated with concrete spec),
knows how to install it and perform install-time checks.
The builder behaves like a sequence, and when iterated over returns the "phases" of the
installation in the correct order.
"""
#: Sequence of phases. Must be defined in derived classes
@@ -496,79 +509,19 @@ class Builder(collections.abc.Sequence, metaclass=BuilderMeta):
build_time_test_callbacks: List[str]
install_time_test_callbacks: List[str]
#: List of glob expressions. Each expression must either be
#: absolute or relative to the package source path.
#: Matching artifacts found at the end of the build process will be
#: copied in the same directory tree as _spack_build_logfile and
#: _spack_build_envfile.
archive_files: List[str] = []
#: List of glob expressions. Each expression must either be absolute or relative to the package
#: source path. Matching artifacts found at the end of the build process will be copied in the
#: same directory tree as _spack_build_logfile and _spack_build_envfile.
@property
def archive_files(self) -> List[str]:
return []
def __init__(self, pkg):
self.pkg = pkg
def __init__(self, pkg: spack.package_base.PackageBase) -> None:
super().__init__(pkg)
self.callbacks = {}
for phase in self.phases:
self.callbacks[phase] = InstallationPhase(phase, self)
@property
def spec(self):
return self.pkg.spec
@property
def stage(self):
return self.pkg.stage
@property
def prefix(self):
return self.pkg.prefix
def setup_build_environment(self, env):
"""Sets up the build environment for a package.
This method will be called before the current package prefix exists in
Spack's store.
Args:
env (spack.util.environment.EnvironmentModifications): environment
modifications to be applied when the package is built. Package authors
can call methods on it to alter the build environment.
"""
if not hasattr(super(), "setup_build_environment"):
return
super().setup_build_environment(env)
def setup_dependent_build_environment(self, env, dependent_spec):
"""Sets up the build environment of packages that depend on this one.
This is similar to ``setup_build_environment``, but it is used to
modify the build environments of packages that *depend* on this one.
This gives packages like Python and others that follow the extension
model a way to implement common environment or compile-time settings
for dependencies.
This method will be called before the dependent package prefix exists
in Spack's store.
Examples:
1. Installing python modules generally requires ``PYTHONPATH``
to point to the ``lib/pythonX.Y/site-packages`` directory in the
module's install prefix. This method could be used to set that
variable.
Args:
env (spack.util.environment.EnvironmentModifications): environment
modifications to be applied when the dependent package is built.
Package authors can call methods on it to alter the build environment.
dependent_spec (spack.spec.Spec): the spec of the dependent package
about to be built. This allows the extendee (self) to query
the dependent's state. Note that *this* package's spec is
available as ``self.spec``
"""
if not hasattr(super(), "setup_dependent_build_environment"):
return
super().setup_dependent_build_environment(env, dependent_spec)
def __getitem__(self, idx):
key = self.phases[idx]
return self.callbacks[key]
@@ -576,15 +529,51 @@ def __getitem__(self, idx):
def __len__(self):
return len(self.phases)
def __repr__(self):
msg = "{0}({1})"
return msg.format(type(self).__name__, self.pkg.spec.format("{name}/{hash:7}"))
def phase_tests(self, phase_name: str, method_names: List[str]):
"""Execute the package's phase-time tests.
def __str__(self):
msg = '"{0}" builder for "{1}"'
return msg.format(type(self).build_system, self.pkg.spec.format("{name}/{hash:7}"))
This process uses the same test setup and logging used for
stand-alone tests for consistency.
Args:
phase_name: the name of the build-time phase (e.g., ``build``, ``install``)
method_names: phase-specific callback method names
"""
verbose = tty.is_verbose()
fail_fast = spack.config.get("config:fail_fast", False)
# Export these names as standalone to be used in packages
run_after = PhaseCallbacksMeta.run_after
run_before = PhaseCallbacksMeta.run_before
tester = self.pkg.tester
with tester.test_logger(verbose=verbose, externals=False) as logger:
# Report running each of the methods in the build log
log.print_message(logger, f"Running {phase_name}-time tests", verbose)
tester.set_current_specs(self.pkg.spec, self.pkg.spec)
have_tests = any(name.startswith("test_") for name in method_names)
if have_tests:
spack.install_test.copy_test_files(self.pkg, self.pkg.spec)
for name in method_names:
try:
# Prefer the method in the package over the builder's.
# We need this primarily to pick up arbitrarily named test
# methods but also some build-time checks.
fn = getattr(self.pkg, name, getattr(self, name))
msg = f"RUN-TESTS: {phase_name}-time tests [{name}]"
log.print_message(logger, msg, verbose)
fn()
except AttributeError as e:
msg = f"RUN-TESTS: method not implemented [{name}]"
log.print_message(logger, msg, verbose)
tester.add_failure(e, msg)
if fail_fast:
break
if have_tests:
log.print_message(logger, "Completed testing", verbose)
# Raise exception if any failures encountered
tester.handle_failures()
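# A speculative sketch of how ``phase_tests`` is driven (the real wiring
# lives in _checks.py, not shown in this diff): the execute_build_time_tests
# callback registered via run_after("build") throughout plausibly reduces to
#
#     def execute_build_time_tests(builder):
#         builder.phase_tests("build", builder.build_time_test_callbacks)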


@@ -5,7 +5,6 @@
"""Caches used by Spack to store data"""
import os
from typing import Union
import llnl.util.lang
from llnl.util.filesystem import mkdirp
@@ -32,12 +31,8 @@ def _misc_cache():
return spack.util.file_cache.FileCache(path)
FileCacheType = Union[spack.util.file_cache.FileCache, llnl.util.lang.Singleton]
#: Spack's cache for small data
MISC_CACHE: Union[spack.util.file_cache.FileCache, llnl.util.lang.Singleton] = (
llnl.util.lang.Singleton(_misc_cache)
)
MISC_CACHE: spack.util.file_cache.FileCache = llnl.util.lang.Singleton(_misc_cache) # type: ignore
def fetch_cache_location():
@@ -74,6 +69,4 @@ def store(self, fetcher, relative_dest):
#: Spack's local cache for downloaded source archives
FETCH_CACHE: Union[spack.fetch_strategy.FsCache, llnl.util.lang.Singleton] = (
llnl.util.lang.Singleton(_fetch_cache)
)
FETCH_CACHE: spack.fetch_strategy.FsCache = llnl.util.lang.Singleton(_fetch_cache) # type: ignore


@@ -32,6 +32,7 @@
import spack
import spack.binary_distribution as bindist
import spack.builder
import spack.concretize
import spack.config as cfg
import spack.error
@@ -1387,7 +1388,11 @@ def copy_stage_logs_to_artifacts(job_spec: spack.spec.Spec, job_log_dir: str) ->
stage_dir = job_pkg.stage.path
tty.debug(f"stage dir: {stage_dir}")
for file in [job_pkg.log_path, job_pkg.env_mods_path, *job_pkg.builder.archive_files]:
for file in [
job_pkg.log_path,
job_pkg.env_mods_path,
*spack.builder.create(job_pkg).archive_files,
]:
copy_files_to_artifacts(file, job_log_dir)


@@ -8,7 +8,8 @@
import os
import re
import sys
from typing import List, Union
from collections import Counter
from typing import List, Optional, Union
import llnl.string
import llnl.util.tty as tty
@@ -17,12 +18,14 @@
from llnl.util.tty.colify import colify
from llnl.util.tty.color import colorize
import spack.concretize
import spack.config # breaks a cycle.
import spack.environment as ev
import spack.error
import spack.extensions
import spack.parser
import spack.paths
import spack.repo
import spack.spec
import spack.store
import spack.traverse as traverse
@@ -30,6 +33,8 @@
import spack.util.spack_json as sjson
import spack.util.spack_yaml as syaml
from ..enums import InstallRecordStatus
# cmd has a submodule called "list" so preserve the python list module
python_list = list
@@ -173,10 +178,66 @@ def parse_specs(
arg_string = " ".join([quote_kvp(arg) for arg in args])
specs = spack.parser.parse(arg_string)
for spec in specs:
if concretize:
spec.concretize(tests=tests)
return specs
if not concretize:
return specs
to_concretize = [(s, None) for s in specs]
return _concretize_spec_pairs(to_concretize, tests=tests)
def _concretize_spec_pairs(to_concretize, tests=False):
"""Helper method that concretizes abstract specs from a list of abstract,concrete pairs.
Any spec with a concrete spec associated with it will concretize to that spec. Any spec
with ``None`` for its concrete spec will be newly concretized. This method respects unification
rules from config."""
unify = spack.config.get("concretizer:unify", False)
# Special case for concretizing a single spec
if len(to_concretize) == 1:
abstract, concrete = to_concretize[0]
return [concrete or abstract.concretized()]
# Special case if every spec is either concrete or has an abstract hash
if all(
concrete or abstract.concrete or abstract.abstract_hash
for abstract, concrete in to_concretize
):
# Get all the concrete specs
ret = [
concrete or (abstract if abstract.concrete else abstract.lookup_hash())
for abstract, concrete in to_concretize
]
# If unify: true, check that specs don't conflict
# Since all concrete, "when_possible" is not relevant
if unify is True: # True, "when_possible", False are possible values
runtimes = spack.repo.PATH.packages_with_tags("runtime")
specs_per_name = Counter(
spec.name
for spec in traverse.traverse_nodes(
ret, deptype=("link", "run"), key=traverse.by_dag_hash
)
if spec.name not in runtimes # runtimes are allowed multiple times
)
conflicts = sorted(name for name, count in specs_per_name.items() if count > 1)
if conflicts:
raise spack.error.SpecError(
"Specs conflict and `concretizer:unify` is configured true.",
f" specs depend on multiple versions of {', '.join(conflicts)}",
)
return ret
# Standard case
concretize_method = spack.concretize.concretize_separately # unify: false
if unify is True:
concretize_method = spack.concretize.concretize_together
elif unify == "when_possible":
concretize_method = spack.concretize.concretize_together_when_possible
concretized = concretize_method(to_concretize, tests=tests)
return [concrete for _, concrete in concretized]
def matching_spec_from_env(spec):
@@ -192,39 +253,64 @@ def matching_spec_from_env(spec):
return spec.concretized()
def disambiguate_spec(spec, env, local=False, installed=True, first=False):
def matching_specs_from_env(specs):
"""
Same as ``matching_spec_from_env`` but respects spec unification rules.
For each spec, if there is a matching spec in the environment it is used. If no
matching spec is found, this will return the given spec but concretized in the
context of the active environment and other given specs, with unification rules applied.
"""
env = ev.active_environment()
spec_pairs = [(spec, env.matching_spec(spec) if env else None) for spec in specs]
additional_concrete_specs = (
[(concrete, concrete) for _, concrete in env.concretized_specs()] if env else []
)
return _concretize_spec_pairs(spec_pairs + additional_concrete_specs)[: len(spec_pairs)]
def disambiguate_spec(
spec: spack.spec.Spec,
env: Optional[ev.Environment],
local: bool = False,
installed: Union[bool, InstallRecordStatus] = True,
first: bool = False,
) -> spack.spec.Spec:
"""Given a spec, figure out which installed package it refers to.
Arguments:
spec (spack.spec.Spec): a spec to disambiguate
env (spack.environment.Environment): a spack environment,
if one is active, or None if no environment is active
local (bool): do not search chained spack instances
installed (bool or spack.database.InstallStatus or typing.Iterable):
install status argument passed to database query.
See ``spack.database.Database._query`` for details.
Args:
spec: a spec to disambiguate
env: a spack environment, if one is active, or None if no environment is active
local: do not search chained spack instances
installed: install status argument passed to database query.
first: returns the first matching spec, even if more than one match is found
"""
hashes = env.all_hashes() if env else None
return disambiguate_spec_from_hashes(spec, hashes, local, installed, first)
def disambiguate_spec_from_hashes(spec, hashes, local=False, installed=True, first=False):
def disambiguate_spec_from_hashes(
spec: spack.spec.Spec,
hashes: List[str],
local: bool = False,
installed: Union[bool, InstallRecordStatus] = True,
first: bool = False,
) -> spack.spec.Spec:
"""Given a spec and a list of hashes, get concrete spec the spec refers to.
Arguments:
spec (spack.spec.Spec): a spec to disambiguate
hashes (typing.Iterable): a set of hashes of specs among which to disambiguate
local (bool): do not search chained spack instances
installed (bool or spack.database.InstallStatus or typing.Iterable):
install status argument passed to database query.
See ``spack.database.Database._query`` for details.
spec: a spec to disambiguate
hashes: a set of hashes of specs among which to disambiguate
local: if True, do not search chained spack instances
installed: install status argument passed to database query.
first: returns the first matching spec, even if more than one match is found
"""
if local:
matching_specs = spack.store.STORE.db.query_local(spec, hashes=hashes, installed=installed)
else:
matching_specs = spack.store.STORE.db.query(spec, hashes=hashes, installed=installed)
if not matching_specs:
tty.die("Spec '%s' matches no installed packages." % spec)
tty.die(f"Spec '{spec}' matches no installed packages.")
elif first:
return matching_specs[0]
@@ -509,6 +595,18 @@ def __init__(self, name):
super().__init__("{0} is not a permissible Spack command name.".format(name))
class MultipleSpecsMatch(Exception):
"""Raised when multiple specs match a constraint, in a context where
this is not allowed.
"""
class NoSpecMatches(Exception):
"""Raised when no spec matches a constraint, in a context where
this is not allowed.
"""
########################################
# argparse types for argument validation
########################################


@@ -34,6 +34,8 @@
from spack.cmd.common import arguments
from spack.spec import Spec, save_dependency_specfiles
from ..enums import InstallRecordStatus
description = "create, download and install binary packages"
section = "packaging"
level = "long"
@@ -308,7 +310,10 @@ def setup_parser(subparser: argparse.ArgumentParser):
def _matching_specs(specs: List[Spec]) -> List[Spec]:
"""Disambiguate specs and return a list of matching specs"""
return [spack.cmd.disambiguate_spec(s, ev.active_environment(), installed=any) for s in specs]
return [
spack.cmd.disambiguate_spec(s, ev.active_environment(), installed=InstallRecordStatus.ANY)
for s in specs
]
def _format_spec(spec: Spec) -> str:


@@ -105,7 +105,8 @@ def clean(parser, args):
# Then do the cleaning falling through the cases
if args.specs:
specs = spack.cmd.parse_specs(args.specs, concretize=False)
specs = list(spack.cmd.matching_spec_from_env(x) for x in specs)
specs = spack.cmd.matching_specs_from_env(specs)
for spec in specs:
msg = "Cleaning build stage [{0}]"
tty.msg(msg.format(spec.short_spec))


@@ -581,23 +581,51 @@ def add_concretizer_args(subparser):
def add_connection_args(subparser, add_help):
subparser.add_argument(
"--s3-access-key-id", help="ID string to use to connect to this S3 mirror"
def add_argument_string_or_variable(parser, arg: str, *, deprecate_str: bool = True, **kwargs):
group = parser.add_mutually_exclusive_group()
group.add_argument(arg, **kwargs)
# Update help string
if "help" in kwargs:
kwargs["help"] = "environment variable containing " + kwargs["help"]
group.add_argument(arg + "-variable", **kwargs)
s3_connection_parser = subparser.add_argument_group("S3 Connection")
add_argument_string_or_variable(
s3_connection_parser,
"--s3-access-key-id",
help="ID string to use to connect to this S3 mirror",
)
subparser.add_argument(
"--s3-access-key-secret", help="secret string to use to connect to this S3 mirror"
add_argument_string_or_variable(
s3_connection_parser,
"--s3-access-key-secret",
help="secret string to use to connect to this S3 mirror",
)
subparser.add_argument(
"--s3-access-token", help="access token to use to connect to this S3 mirror"
add_argument_string_or_variable(
s3_connection_parser,
"--s3-access-token",
help="access token to use to connect to this S3 mirror",
)
subparser.add_argument(
s3_connection_parser.add_argument(
"--s3-profile", help="S3 profile name to use to connect to this S3 mirror", default=None
)
subparser.add_argument(
s3_connection_parser.add_argument(
"--s3-endpoint-url", help="endpoint URL to use to connect to this S3 mirror"
)
subparser.add_argument("--oci-username", help="username to use to connect to this OCI mirror")
subparser.add_argument("--oci-password", help="password to use to connect to this OCI mirror")
oci_connection_parser = subparser.add_argument_group("OCI Connection")
add_argument_string_or_variable(
oci_connection_parser,
"--oci-username",
deprecate_str=False,
help="username to use to connect to this OCI mirror",
)
add_argument_string_or_variable(
oci_connection_parser,
"--oci-password",
help="password to use to connect to this OCI mirror",
)
def use_buildcache(cli_arg_value):

View File
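
The `add_argument_string_or_variable` helper above pairs every secret-bearing option with a `--<option>-variable` companion and lets argparse reject passing both at once. A self-contained sketch of the same pattern (the option name is taken from the hunk; the parser here is a throwaway):

```python
import argparse

def add_argument_string_or_variable(parser, arg: str, **kwargs):
    group = parser.add_mutually_exclusive_group()
    group.add_argument(arg, **kwargs)
    if "help" in kwargs:  # the -variable twin documents the env var instead
        kwargs["help"] = "environment variable containing " + kwargs["help"]
    group.add_argument(arg + "-variable", **kwargs)

parser = argparse.ArgumentParser()
add_argument_string_or_variable(parser, "--s3-access-key-id", help="ID string")

parser.parse_args(["--s3-access-key-id-variable", "MY_S3_KEY_ID"])  # accepted
# Passing both flags makes argparse exit with "not allowed with argument ...".
```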

@@ -23,9 +23,10 @@
import spack.installer
import spack.store
from spack.cmd.common import arguments
from spack.database import InstallStatuses
from spack.error import SpackError
from ..enums import InstallRecordStatus
description = "replace one package with another via symlinks"
section = "admin"
level = "long"
@@ -95,8 +96,12 @@ def deprecate(parser, args):
if len(specs) != 2:
raise SpackError("spack deprecate requires exactly two specs")
install_query = [InstallStatuses.INSTALLED, InstallStatuses.DEPRECATED]
deprecate = spack.cmd.disambiguate_spec(specs[0], env, local=True, installed=install_query)
deprecate = spack.cmd.disambiguate_spec(
specs[0],
env,
local=True,
installed=(InstallRecordStatus.INSTALLED | InstallRecordStatus.DEPRECATED),
)
if args.install:
deprecator = specs[1].concretized()

View File

@@ -10,11 +10,12 @@
import sys
import tempfile
from pathlib import Path
from typing import List, Optional
from typing import List, Optional, Set
import llnl.string as string
import llnl.util.filesystem as fs
import llnl.util.tty as tty
from llnl.util.symlink import islink, symlink
from llnl.util.tty.colify import colify
from llnl.util.tty.color import cescape, colorize
@@ -50,6 +51,8 @@
"update",
"revert",
"depfile",
"track",
"untrack",
]
@@ -446,6 +449,193 @@ def env_deactivate(args):
sys.stdout.write(cmds)
#
# env track
#
def env_track_setup_parser(subparser):
"""track an environment from a directory in Spack"""
subparser.add_argument("-n", "--name", help="custom environment name")
subparser.add_argument("dir", help="path to environment")
arguments.add_common_arguments(subparser, ["yes_to_all"])
def env_track(args):
src_path = os.path.abspath(args.dir)
if not ev.is_env_dir(src_path):
tty.die("Cannot track environment. Path doesn't contain an environment")
if args.name:
name = args.name
else:
name = os.path.basename(src_path)
try:
dst_path = ev.environment_dir_from_name(name, exists_ok=False)
except ev.SpackEnvironmentError:
tty.die(
f"An environment named {name} already exists. Set a name with:"
"\n\n"
f" spack env track --name NAME {src_path}\n"
)
symlink(src_path, dst_path)
tty.msg(f"Tracking environment in {src_path}")
tty.msg(
"You can now activate this environment with the following command:\n\n"
f" spack env activate {name}\n"
)
#
# env remove & untrack helpers
#
def filter_managed_env_names(env_names: Set[str]) -> Set[str]:
tracked_env_names = {e for e in env_names if islink(ev.environment_dir_from_name(e))}
managed_env_names = env_names - tracked_env_names
num_managed_envs = len(managed_env_names)
managed_envs_str = " ".join(managed_env_names)
if num_managed_envs >= 2:
tty.error(
f"The following are not tracked environments. "
"To remove them completely run,"
"\n\n"
f" spack env rm {managed_envs_str}\n"
)
elif num_managed_envs > 0:
tty.error(
f"'{managed_envs_str}' is not a tracked env. "
"To remove it completely run,"
"\n\n"
f" spack env rm {managed_envs_str}\n"
)
return tracked_env_names
def get_valid_envs(env_names: Set[str]) -> Set[ev.Environment]:
valid_envs = set()
for env_name in env_names:
try:
env = ev.read(env_name)
valid_envs.add(env)
except (spack.config.ConfigFormatError, ev.SpackEnvironmentConfigError):
pass
return valid_envs
def _env_untrack_or_remove(
env_names: List[str], remove: bool = False, force: bool = False, yes_to_all: bool = False
):
all_env_names = set(ev.all_environment_names())
known_env_names = set(env_names).intersection(all_env_names)
unknown_env_names = set(env_names) - known_env_names
# print error for unknown environments
for env_name in unknown_env_names:
tty.error(f"Environment '{env_name}' does not exist")
# if only unlinking is allowed, remove all environments
# which do not point internally at symlinks
if not remove:
env_names_to_remove = filter_managed_env_names(known_env_names)
else:
env_names_to_remove = known_env_names
# initialize all environments with valid spack.yaml configs
all_valid_envs = get_valid_envs(all_env_names)
# build a task list of environments and bad env names to remove
envs_to_remove = [e for e in all_valid_envs if e.name in env_names_to_remove]
bad_env_names_to_remove = env_names_to_remove - {e.name for e in envs_to_remove}
for remove_env in envs_to_remove:
for env in all_valid_envs:
# don't check if an environment is included in itself
if env.name == remove_env.name:
continue
# check if an environment is included in another
if remove_env.path in env.included_concrete_envs:
msg = f"Environment '{remove_env.name}' is used by environment '{env.name}'"
if force:
tty.warn(msg)
else:
tty.error(msg)
envs_to_remove.remove(remove_env)
# ask the user if they really want to remove the known environments
# force should do the same as yes to all here following the semantics of rm
if not (yes_to_all or force) and (envs_to_remove or bad_env_names_to_remove):
environments = string.plural(len(env_names_to_remove), "environment", show_n=False)
envs = string.comma_and(list(env_names_to_remove))
answer = tty.get_yes_or_no(
f"Really {'remove' if remove else 'untrack'} {environments} {envs}?", default=False
)
if not answer:
tty.die("Will not remove any environments")
# keep track of the environments we remove for later printing the exit code
removed_env_names = []
for env in envs_to_remove:
name = env.name
if not force and env.active:
tty.error(
f"Environment '{name}' can't be "
f"{'removed' if remove else 'untracked'} while activated."
)
continue
# Get path to check if environment is a tracked / symlinked environment
if islink(env.path):
real_env_path = os.path.realpath(env.path)
os.unlink(env.path)
tty.msg(
f"Sucessfully untracked environment '{name}', "
"but it can still be found at:\n\n"
f" {real_env_path}\n"
)
else:
env.destroy()
tty.msg(f"Successfully removed environment '{name}'")
removed_env_names.append(env.name)
for bad_env_name in bad_env_names_to_remove:
shutil.rmtree(
spack.environment.environment.environment_dir_from_name(bad_env_name, exists_ok=True)
)
tty.msg(f"Successfully removed environment '{bad_env_name}'")
removed_env_names.append(bad_env_name)
# Following the design of linux rm we should exit with a status of 1
# anytime we cannot delete every environment the user asks for.
# However, we should still process all the environments we know about
# and delete them instead of failing on the first unknown environment.
if len(removed_env_names) < len(known_env_names):
sys.exit(1)
#
# env untrack
#
def env_untrack_setup_parser(subparser):
"""track an environment from a directory in Spack"""
subparser.add_argument("env", nargs="+", help="tracked environment name")
subparser.add_argument(
"-f", "--force", action="store_true", help="force unlink even when environment is active"
)
arguments.add_common_arguments(subparser, ["yes_to_all"])
def env_untrack(args):
_env_untrack_or_remove(
env_names=args.env, force=args.force, yes_to_all=args.yes_to_all, remove=False
)
#
# env remove
#
@@ -471,54 +661,9 @@ def env_remove_setup_parser(subparser):
def env_remove(args):
"""remove existing environment(s)"""
remove_envs = []
valid_envs = []
bad_envs = []
for env_name in ev.all_environment_names():
try:
env = ev.read(env_name)
valid_envs.append(env)
if env_name in args.rm_env:
remove_envs.append(env)
except (spack.config.ConfigFormatError, ev.SpackEnvironmentConfigError):
if env_name in args.rm_env:
bad_envs.append(env_name)
# Check if remove_env is included from another env before trying to remove
for env in valid_envs:
for remove_env in remove_envs:
# don't check if environment is included to itself
if env.name == remove_env.name:
continue
if remove_env.path in env.included_concrete_envs:
msg = f'Environment "{remove_env.name}" is being used by environment "{env.name}"'
if args.force:
tty.warn(msg)
else:
tty.die(msg)
if not args.yes_to_all:
environments = string.plural(len(args.rm_env), "environment", show_n=False)
envs = string.comma_and(args.rm_env)
answer = tty.get_yes_or_no(f"Really remove {environments} {envs}?", default=False)
if not answer:
tty.die("Will not remove any environments")
for env in remove_envs:
name = env.name
if env.active:
tty.die(f"Environment {name} can't be removed while activated.")
env.destroy()
tty.msg(f"Successfully removed environment '{name}'")
for bad_env_name in bad_envs:
shutil.rmtree(
spack.environment.environment.environment_dir_from_name(bad_env_name, exists_ok=True)
)
tty.msg(f"Successfully removed environment '{bad_env_name}'")
_env_untrack_or_remove(
env_names=args.rm_env, remove=True, force=args.force, yes_to_all=args.yes_to_all
)
#

View File
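
On disk, `env track` and `env untrack` reduce to symlink management inside Spack's managed-environments directory; only removal (`spack env rm`) destroys content. A hedged model of just the filesystem part (the root path and function names here are made up for illustration):

```python
import os

ENVS_ROOT = "/tmp/spack-envs"  # stand-in for Spack's managed environments dir

def track(src_path: str, name: str) -> str:
    """Register an external environment as a symlink under ENVS_ROOT."""
    dst = os.path.join(ENVS_ROOT, name)
    os.symlink(os.path.abspath(src_path), dst)
    return dst

def untrack(env_path: str) -> str:
    """Drop the symlink but leave the real environment directory in place."""
    real_path = os.path.realpath(env_path)
    os.unlink(env_path)
    return real_path
```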

@@ -17,7 +17,8 @@
import spack.spec
import spack.store
from spack.cmd.common import arguments
from spack.database import InstallStatuses
from ..enums import InstallRecordStatus
description = "list and search installed packages"
section = "basic"
@@ -137,21 +138,22 @@ def setup_parser(subparser):
subparser.add_argument(
"--loaded", action="store_true", help="show only packages loaded in the user environment"
)
subparser.add_argument(
only_missing_or_deprecated = subparser.add_mutually_exclusive_group()
only_missing_or_deprecated.add_argument(
"-M",
"--only-missing",
action="store_true",
dest="only_missing",
help="show only missing dependencies",
)
only_missing_or_deprecated.add_argument(
"--only-deprecated", action="store_true", help="show only deprecated packages"
)
subparser.add_argument(
"--deprecated",
action="store_true",
help="show deprecated packages as well as installed specs",
)
subparser.add_argument(
"--only-deprecated", action="store_true", help="show only deprecated packages"
)
subparser.add_argument(
"--install-tree",
action="store",
@@ -165,14 +167,23 @@ def setup_parser(subparser):
def query_arguments(args):
# Set up query arguments.
installed = []
if not (args.only_missing or args.only_deprecated):
installed.append(InstallStatuses.INSTALLED)
if (args.deprecated or args.only_deprecated) and not args.only_missing:
installed.append(InstallStatuses.DEPRECATED)
if (args.missing or args.only_missing) and not args.only_deprecated:
installed.append(InstallStatuses.MISSING)
if args.only_missing and (args.deprecated or args.missing):
raise RuntimeError("cannot use --only-missing with --deprecated, or --missing")
if args.only_deprecated and (args.deprecated or args.missing):
raise RuntimeError("cannot use --only-deprecated with --deprecated, or --missing")
installed = InstallRecordStatus.INSTALLED
if args.only_missing:
installed = InstallRecordStatus.MISSING
elif args.only_deprecated:
installed = InstallRecordStatus.DEPRECATED
if args.missing:
installed |= InstallRecordStatus.MISSING
if args.deprecated:
installed |= InstallRecordStatus.DEPRECATED
predicate_fn = None
if args.unknown:
@@ -222,11 +233,9 @@ def decorator(spec, fmt):
def display_env(env, args, decorator, results):
"""Display extra find output when running in an environment.
Find in an environment outputs 2 or 3 sections:
1. Root specs
2. Concretized roots (if asked for with -c)
3. Installed specs
In an environment, `spack find` outputs a preliminary section
showing the root specs of the environment (this is in addition
to the section listing out specs matching the query parameters).
"""
tty.msg("In environment %s" % env.name)
@@ -299,6 +308,56 @@ def root_decorator(spec, string):
print()
def _find_query(args, env):
q_args = query_arguments(args)
concretized_but_not_installed = list()
if env:
all_env_specs = env.all_specs()
if args.constraint:
init_specs = cmd.parse_specs(args.constraint)
env_specs = env.all_matching_specs(*init_specs)
else:
env_specs = all_env_specs
spec_hashes = set(x.dag_hash() for x in env_specs)
specs_meeting_q_args = set(spack.store.STORE.db.query(hashes=spec_hashes, **q_args))
results = list()
with spack.store.STORE.db.read_transaction():
for spec in env_specs:
if not spec.installed:
concretized_but_not_installed.append(spec)
if spec in specs_meeting_q_args:
results.append(spec)
else:
results = args.specs(**q_args)
# use groups by default except with format.
if args.groups is None:
args.groups = not args.format
# Exit early with an error code if no package matches the constraint
if concretized_but_not_installed and args.show_concretized:
pass
elif results:
pass
elif args.constraint:
raise cmd.NoSpecMatches()
# If tags have been specified on the command line, filter by tags
if args.tags:
packages_with_tags = spack.repo.PATH.packages_with_tags(*args.tags)
results = [x for x in results if x.name in packages_with_tags]
concretized_but_not_installed = [
x for x in concretized_but_not_installed if x.name in packages_with_tags
]
if args.loaded:
results = cmd.filter_loaded_specs(results)
return results, concretized_but_not_installed
def find(parser, args):
env = ev.active_environment()
@@ -307,34 +366,12 @@ def find(parser, args):
if not env and args.show_concretized:
tty.die("-c / --show-concretized requires an active environment")
if env:
if args.constraint:
init_specs = spack.cmd.parse_specs(args.constraint)
results = env.all_matching_specs(*init_specs)
else:
results = env.all_specs()
else:
q_args = query_arguments(args)
results = args.specs(**q_args)
decorator = make_env_decorator(env) if env else lambda s, f: f
# use groups by default except with format.
if args.groups is None:
args.groups = not args.format
# Exit early with an error code if no package matches the constraint
if not results and args.constraint:
constraint_str = " ".join(str(s) for s in args.constraint_specs)
tty.die(f"No package matches the query: {constraint_str}")
# If tags have been specified on the command line, filter by tags
if args.tags:
packages_with_tags = spack.repo.PATH.packages_with_tags(*args.tags)
results = [x for x in results if x.name in packages_with_tags]
if args.loaded:
results = spack.cmd.filter_loaded_specs(results)
try:
results, concretized_but_not_installed = _find_query(args, env)
except cmd.NoSpecMatches:
# Note: this uses args.constraint vs. args.constraint_specs because
# the latter only exists if you call args.specs()
tty.die(f"No package matches the query: {' '.join(args.constraint)}")
if args.install_status or args.show_concretized:
status_fn = spack.spec.Spec.install_status
@@ -345,14 +382,16 @@ def find(parser, args):
if args.json:
cmd.display_specs_as_json(results, deps=args.deps)
else:
decorator = make_env_decorator(env) if env else lambda s, f: f
if not args.format:
if env:
display_env(env, args, decorator, results)
if not args.only_roots:
display_results = results
if not args.show_concretized:
display_results = list(x for x in results if x.installed)
display_results = list(results)
if args.show_concretized:
display_results += concretized_but_not_installed
cmd.display_specs(
display_results, args, decorator=decorator, all_headers=True, status_fn=status_fn
)
@@ -370,13 +409,9 @@ def find(parser, args):
concretized_suffix += " (show with `spack find -c`)"
pkg_type = "loaded" if args.loaded else "installed"
spack.cmd.print_how_many_pkgs(
list(x for x in results if x.installed), pkg_type, suffix=installed_suffix
)
cmd.print_how_many_pkgs(results, pkg_type, suffix=installed_suffix)
if env:
spack.cmd.print_how_many_pkgs(
list(x for x in results if not x.installed),
"concretized",
suffix=concretized_suffix,
cmd.print_how_many_pkgs(
concretized_but_not_installed, "concretized", suffix=concretized_suffix
)

View File
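
With statuses as a flag, `query_arguments` can start from a base status and widen it, instead of assembling a list under interacting conditionals. The new logic restated against the stand-in enum from earlier (values still illustrative):

```python
import enum

class InstallRecordStatus(enum.Flag):  # stand-in again, not Spack's enum
    INSTALLED = 1
    DEPRECATED = 2
    MISSING = 4
    ANY = INSTALLED | DEPRECATED | MISSING

def installed_query(only_missing=False, only_deprecated=False,
                    missing=False, deprecated=False):
    installed = InstallRecordStatus.INSTALLED
    if only_missing:
        installed = InstallRecordStatus.MISSING
    elif only_deprecated:
        installed = InstallRecordStatus.DEPRECATED
    if missing:
        installed |= InstallRecordStatus.MISSING
    if deprecated:
        installed |= InstallRecordStatus.DEPRECATED
    return installed

assert installed_query() == InstallRecordStatus.INSTALLED
assert installed_query(missing=True, deprecated=True) == InstallRecordStatus.ANY
```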

@@ -78,8 +78,8 @@
boxlib @B{dim=2} boxlib built for 2 dimensions
libdwarf @g{%intel} ^libelf@g{%gcc}
libdwarf, built with intel compiler, linked to libelf built with gcc
mvapich2 @g{%pgi} @B{fabrics=psm,mrail,sock}
mvapich2, built with pgi compiler, with support for multiple fabrics
mvapich2 @g{%gcc} @B{fabrics=psm,mrail,sock}
mvapich2, built with gcc compiler, with support for multiple fabrics
"""

View File

@@ -11,6 +11,7 @@
import llnl.util.tty.color as color
from llnl.util.tty.colify import colify
import spack.builder
import spack.deptypes as dt
import spack.fetch_strategy as fs
import spack.install_test
@@ -202,11 +203,13 @@ def print_namespace(pkg, args):
def print_phases(pkg, args):
"""output installation phases"""
if hasattr(pkg.builder, "phases") and pkg.builder.phases:
builder = spack.builder.create(pkg)
if hasattr(builder, "phases") and builder.phases:
color.cprint("")
color.cprint(section_title("Installation Phases:"))
phase_str = ""
for phase in pkg.builder.phases:
for phase in builder.phases:
phase_str += " {0}".format(phase)
color.cprint(phase_str)

View File

@@ -10,7 +10,8 @@
import spack.cmd
import spack.store
from spack.cmd.common import arguments
from spack.database import InstallStatuses
from ..enums import InstallRecordStatus
description = "mark packages as explicitly or implicitly installed"
section = "admin"
@@ -67,8 +68,7 @@ def find_matching_specs(specs, allow_multiple_matches=False):
has_errors = False
for spec in specs:
install_query = [InstallStatuses.INSTALLED]
matching = spack.store.STORE.db.query_local(spec, installed=install_query)
matching = spack.store.STORE.db.query_local(spec, installed=InstallRecordStatus.INSTALLED)
# For each spec provided, make sure it refers to only one package.
# Fail and ask user to be unambiguous if it doesn't
if not allow_multiple_matches and len(matching) > 1:
@@ -98,8 +98,9 @@ def do_mark(specs, explicit):
specs (list): list of specs to be marked
explicit (bool): whether to mark specs as explicitly installed
"""
for spec in specs:
spack.store.STORE.db.update_explicit(spec, explicit)
with spack.store.STORE.db.write_transaction():
for spec in specs:
spack.store.STORE.db.mark(spec, "explicit", explicit)
def mark_specs(args, specs):

View File
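
The `do_mark` change above is a batching fix: taking one `write_transaction` around the loop means one lock acquisition and one database write instead of one per spec. A toy re-entrant transaction, assuming the usual outermost-writer-flushes contract (not Spack's actual lock implementation):

```python
import contextlib

class ToyDB:
    def __init__(self) -> None:
        self._depth = 0
        self.flushes = 0

    @contextlib.contextmanager
    def write_transaction(self):
        self._depth += 1
        try:
            yield self
        finally:
            self._depth -= 1
            if self._depth == 0:
                self.flushes += 1  # persist only when the outermost scope exits

db = ToyDB()
with db.write_transaction():
    for _ in range(100):
        with db.write_transaction():  # nested scopes reuse the open transaction
            pass                      # ... mark one spec ...
assert db.flushes == 1  # versus 100 flushes with per-spec transactions
```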

@@ -231,31 +231,133 @@ def setup_parser(subparser):
)
def _configure_access_pair(
args, id_tok, id_variable_tok, secret_tok, secret_variable_tok, default=None
):
"""Configure the access_pair options"""
# Check if any of the arguments are set to update this access_pair.
# If none are set, then skip computing the new access pair
args_id = getattr(args, id_tok)
args_id_variable = getattr(args, id_variable_tok)
args_secret = getattr(args, secret_tok)
args_secret_variable = getattr(args, secret_variable_tok)
if not any([args_id, args_id_variable, args_secret, args_secret_variable]):
return None
def _default_value(id_):
if isinstance(default, list):
return default[0] if id_ == "id" else default[1]
elif isinstance(default, dict):
return default.get(id_)
else:
return None
def _default_variable(id_):
if isinstance(default, dict):
return default.get(id_ + "_variable")
else:
return None
id_ = None
id_variable = None
secret = None
secret_variable = None
# Get the value/default value if the argument of the inverse
if not args_id_variable:
id_ = getattr(args, id_tok) or _default_value("id")
if not args_id:
id_variable = getattr(args, id_variable_tok) or _default_variable("id")
if not args_secret_variable:
secret = getattr(args, secret_tok) or _default_value("secret")
if not args_secret:
secret_variable = getattr(args, secret_variable_tok) or _default_variable("secret")
if (id_ or id_variable) and (secret or secret_variable):
if secret:
if not id_:
raise SpackError("Cannot add mirror with a variable id and text secret")
return [id_, secret]
else:
return dict(
[
(("id", id_) if id_ else ("id_variable", id_variable)),
("secret_variable", secret_variable),
]
)
else:
if id_ or id_variable or secret or secret_variable:
id_arg_tok = id_tok.replace("_", "-")
secret_arg_tok = secret_tok.replace("_", "-")
tty.warn(
"Expected both parts of the access pair to be specified. "
f"(i.e. --{id_arg_tok} and --{secret_arg_tok})"
)
return None
def mirror_add(args):
"""add a mirror to Spack"""
if (
args.s3_access_key_id
or args.s3_access_key_secret
or args.s3_access_token
or args.s3_access_key_id_variable
or args.s3_access_key_secret_variable
or args.s3_access_token_variable
or args.s3_profile
or args.s3_endpoint_url
or args.type
or args.oci_username
or args.oci_password
or args.oci_username_variable
or args.oci_password_variable
or args.autopush
or args.signed is not None
):
connection = {"url": args.url}
if args.s3_access_key_id and args.s3_access_key_secret:
connection["access_pair"] = [args.s3_access_key_id, args.s3_access_key_secret]
# S3 Connection
if args.s3_access_key_secret:
tty.warn(
"Configuring mirror secrets as plain text with --s3-access-key-secret is "
"deprecated. Use --s3-access-key-secret-variable instead"
)
if args.oci_password:
tty.warn(
"Configuring mirror secrets as plain text with --oci-password is deprecated. "
"Use --oci-password-variable instead"
)
access_pair = _configure_access_pair(
args,
"s3_access_key_id",
"s3_access_key_id_variable",
"s3_access_key_secret",
"s3_access_key_secret_variable",
)
if access_pair:
connection["access_pair"] = access_pair
if args.s3_access_token:
connection["access_token"] = args.s3_access_token
elif args.s3_access_token_variable:
connection["access_token_variable"] = args.s3_access_token_variable
if args.s3_profile:
connection["profile"] = args.s3_profile
if args.s3_endpoint_url:
connection["endpoint_url"] = args.s3_endpoint_url
if args.oci_username and args.oci_password:
connection["access_pair"] = [args.oci_username, args.oci_password]
# OCI Connection
access_pair = _configure_access_pair(
args, "oci_username", "oci_username_variable", "oci_password", "oci_password_variable"
)
if access_pair:
connection["access_pair"] = access_pair
if args.type:
connection["binary"] = "binary" in args.type
connection["source"] = "source" in args.type
@@ -285,16 +387,35 @@ def _configure_mirror(args):
changes = {}
if args.url:
changes["url"] = args.url
if args.s3_access_key_id and args.s3_access_key_secret:
changes["access_pair"] = [args.s3_access_key_id, args.s3_access_key_secret]
default_access_pair = entry._get_value("access_pair", direction or "fetch")
# TODO: Init access_pair args with the fetch/push/base values in the current mirror state
access_pair = _configure_access_pair(
args,
"s3_access_key_id",
"s3_access_key_id_variable",
"s3_access_key_secret",
"s3_access_key_secret_variable",
default=default_access_pair,
)
if access_pair:
changes["access_pair"] = access_pair
if args.s3_access_token:
changes["access_token"] = args.s3_access_token
if args.s3_profile:
changes["profile"] = args.s3_profile
if args.s3_endpoint_url:
changes["endpoint_url"] = args.s3_endpoint_url
if args.oci_username and args.oci_password:
changes["access_pair"] = [args.oci_username, args.oci_password]
access_pair = _configure_access_pair(
args,
"oci_username",
"oci_username_variable",
"oci_password",
"oci_password_variable",
default=default_access_pair,
)
if access_pair:
changes["access_pair"] = access_pair
if getattr(args, "signed", None) is not None:
changes["signed"] = args.signed
if getattr(args, "autopush", None) is not None:

View File
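
Depending on which flags are passed, `_configure_access_pair` produces one of two shapes for `access_pair` in the mirror config: the legacy two-element list (plain-text secret, now warned about) or a dict that defers the secret to an environment variable. A sketch with placeholder values:

```python
# Legacy shape: the secret is stored in the config file itself (deprecated).
plain_text = {"url": "s3://my-mirror", "access_pair": ["<key-id>", "<secret>"]}

# New shape: the id may be a literal ("id") or a variable ("id_variable");
# the secret is always an environment variable name resolved at fetch/push.
from_environment = {
    "url": "s3://my-mirror",
    "access_pair": {"id": "<key-id>", "secret_variable": "MY_MIRROR_SECRET"},
}
```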

@@ -19,6 +19,7 @@
import spack.modules
import spack.modules.common
import spack.repo
from spack.cmd import MultipleSpecsMatch, NoSpecMatches
from spack.cmd.common import arguments
description = "manipulate module files"
@@ -91,18 +92,6 @@ def add_loads_arguments(subparser):
arguments.add_common_arguments(subparser, ["recurse_dependencies"])
class MultipleSpecsMatch(Exception):
"""Raised when multiple specs match a constraint, in a context where
this is not allowed.
"""
class NoSpecMatches(Exception):
"""Raised when no spec matches a constraint, in a context where
this is not allowed.
"""
def one_spec_or_raise(specs):
"""Ensures exactly one spec has been selected, or raises the appropriate
exception.

View File

@@ -33,8 +33,9 @@ def patch(parser, args):
spack.config.set("config:checksum", False, scope="command_line")
specs = spack.cmd.parse_specs(args.specs, concretize=False)
specs = spack.cmd.matching_specs_from_env(specs)
for spec in specs:
_patch(spack.cmd.matching_spec_from_env(spec).package)
_patch(spec.package)
def _patch_env(env: ev.Environment):

View File

@@ -3,7 +3,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import argparse
import re
import sys
@@ -12,13 +11,12 @@
import spack
import spack.cmd
import spack.cmd.common.arguments
import spack.cmd.spec
import spack.config
import spack.environment
import spack.hash_types as ht
import spack.solver.asp as asp
import spack.spec
from spack.cmd.common import arguments
description = "concretize a specs using an ASP solver"
section = "developer"
@@ -41,42 +39,6 @@ def setup_parser(subparser):
" solutions models found by asp program\n"
" all all of the above",
)
# Below are arguments w.r.t. spec display (like spack spec)
arguments.add_common_arguments(subparser, ["long", "very_long", "namespaces"])
install_status_group = subparser.add_mutually_exclusive_group()
arguments.add_common_arguments(install_status_group, ["install_status", "no_install_status"])
subparser.add_argument(
"-y",
"--yaml",
action="store_const",
dest="format",
default=None,
const="yaml",
help="print concrete spec as yaml",
)
subparser.add_argument(
"-j",
"--json",
action="store_const",
dest="format",
default=None,
const="json",
help="print concrete spec as json",
)
subparser.add_argument(
"-c",
"--cover",
action="store",
default="nodes",
choices=["nodes", "edges", "paths"],
help="how extensively to traverse the DAG (default: nodes)",
)
subparser.add_argument(
"-t", "--types", action="store_true", default=False, help="show dependency types"
)
subparser.add_argument(
"--timers",
action="store_true",
@@ -86,9 +48,8 @@ def setup_parser(subparser):
subparser.add_argument(
"--stats", action="store_true", default=False, help="print out statistics from clingo"
)
subparser.add_argument("specs", nargs=argparse.REMAINDER, help="specs of packages")
spack.cmd.common.arguments.add_concretizer_args(subparser)
spack.cmd.spec.setup_parser(subparser)
def _process_result(result, show, required_format, kwargs):
@@ -164,11 +125,12 @@ def solve(parser, args):
# If we have an active environment, pick the specs from there
env = spack.environment.active_environment()
if env and args.specs:
msg = "cannot give explicit specs when an environment is active"
raise RuntimeError(msg)
specs = list(env.user_specs) if env else spack.cmd.parse_specs(args.specs)
if args.specs:
specs = spack.cmd.parse_specs(args.specs)
elif env:
specs = list(env.user_specs)
else:
tty.die("spack solve requires at least one spec or an active environment")
solver = asp.Solver()
output = sys.stdout if "asp" in show else None

View File
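
The control-flow change above moves `spack solve` from "error if both CLI specs and an environment are present" to a precedence: explicit specs win, the environment's user specs are the fallback, and having neither is fatal. Compactly, as a stand-in function:

```python
def pick_specs(cli_specs, env_user_specs=None):
    """Stand-in for the spec-selection logic in `spack solve`."""
    if cli_specs:
        return cli_specs
    if env_user_specs is not None:  # an active environment supplies its roots
        return list(env_user_specs)
    raise SystemExit("spack solve requires at least one spec or an active environment")

assert pick_specs(["zlib"], env_user_specs=["hdf5"]) == ["zlib"]
assert pick_specs([], env_user_specs=["hdf5"]) == ["hdf5"]
```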

@@ -82,64 +82,44 @@ def spec(parser, args):
if args.namespaces:
fmt = "{namespace}." + fmt
tree_kwargs = {
"cover": args.cover,
"format": fmt,
"hashlen": None if args.very_long else 7,
"show_types": args.types,
"status_fn": install_status_fn if args.install_status else None,
}
# use a read transaction if we are getting install status for every
# spec in the DAG. This avoids repeatedly querying the DB.
tree_context = lang.nullcontext
if args.install_status:
tree_context = spack.store.STORE.db.read_transaction
# Use command line specified specs, otherwise try to use environment specs.
env = ev.active_environment()
if args.specs:
input_specs = spack.cmd.parse_specs(args.specs)
concretized_specs = spack.cmd.parse_specs(args.specs, concretize=True)
specs = list(zip(input_specs, concretized_specs))
concrete_specs = spack.cmd.parse_specs(args.specs, concretize=True)
elif env:
env.concretize()
concrete_specs = env.concrete_roots()
else:
env = ev.active_environment()
if env:
env.concretize()
specs = env.concretized_specs()
tty.die("spack spec requires at least one spec or an active environment")
# environments are printed together in a combined tree() invocation,
# except when using --yaml or --json, which we print spec by spec below.
if not args.format:
tree_kwargs["key"] = spack.traverse.by_dag_hash
tree_kwargs["hashes"] = args.long or args.very_long
print(spack.spec.tree([concrete for _, concrete in specs], **tree_kwargs))
return
else:
tty.die("spack spec requires at least one spec or an active environment")
for input, output in specs:
# With --yaml or --json, just print the raw specs to output
if args.format:
# With --yaml, --json, or --format, just print the raw specs to output
if args.format:
for spec in concrete_specs:
if args.format == "yaml":
# use write because to_yaml already has a newline.
sys.stdout.write(output.to_yaml(hash=ht.dag_hash))
sys.stdout.write(spec.to_yaml(hash=ht.dag_hash))
elif args.format == "json":
print(output.to_json(hash=ht.dag_hash))
print(spec.to_json(hash=ht.dag_hash))
else:
print(output.format(args.format))
continue
print(spec.format(args.format))
return
with tree_context():
# Only show the headers for input specs that are not concrete to avoid
# repeated output. This happens because parse_specs outputs concrete
# specs for `/hash` inputs.
if not input.concrete:
tree_kwargs["hashes"] = False # Always False for input spec
print("Input spec")
print("--------------------------------")
print(input.tree(**tree_kwargs))
print("Concretized")
print("--------------------------------")
tree_kwargs["hashes"] = args.long or args.very_long
print(output.tree(**tree_kwargs))
with tree_context():
print(
spack.spec.tree(
concrete_specs,
cover=args.cover,
format=fmt,
hashlen=None if args.very_long else 7,
show_types=args.types,
status_fn=install_status_fn if args.install_status else None,
hashes=args.long or args.very_long,
key=spack.traverse.by_dag_hash,
)
)

View File

@@ -47,8 +47,8 @@ def stage(parser, args):
if len(specs) > 1 and custom_path:
tty.die("`--path` requires a single spec, but multiple were provided")
specs = spack.cmd.matching_specs_from_env(specs)
for spec in specs:
spec = spack.cmd.matching_spec_from_env(spec)
pkg = spec.package
if custom_path:

View File

@@ -24,7 +24,7 @@
# tutorial configuration parameters
tutorial_branch = "releases/v0.22"
tutorial_branch = "releases/v0.23"
tutorial_mirror = "file:///mirror"
tutorial_key = os.path.join(spack.paths.share_path, "keys", "tutorial.pub")

View File

@@ -17,7 +17,8 @@
import spack.store
import spack.traverse as traverse
from spack.cmd.common import arguments
from spack.database import InstallStatuses
from ..enums import InstallRecordStatus
description = "remove installed packages"
section = "build"
@@ -99,12 +100,14 @@ def find_matching_specs(
hashes = env.all_hashes() if env else None
# List of specs that match expressions given via command line
specs_from_cli: List["spack.spec.Spec"] = []
specs_from_cli: List[spack.spec.Spec] = []
has_errors = False
for spec in specs:
install_query = [InstallStatuses.INSTALLED, InstallStatuses.DEPRECATED]
matching = spack.store.STORE.db.query_local(
spec, hashes=hashes, installed=install_query, origin=origin
spec,
hashes=hashes,
installed=(InstallRecordStatus.INSTALLED | InstallRecordStatus.DEPRECATED),
origin=origin,
)
# For each spec provided, make sure it refers to only one package.
# Fail and ask user to be unambiguous if it doesn't

View File

@@ -4,20 +4,23 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import contextlib
import hashlib
import itertools
import json
import os
import platform
import re
import shutil
import sys
import tempfile
from typing import List, Optional, Sequence
from typing import Dict, List, Optional, Sequence
import llnl.path
import llnl.util.lang
import llnl.util.tty as tty
from llnl.util.filesystem import path_contains_subdirectory, paths_containing_libs
import spack.caches
import spack.error
import spack.schema.environment
import spack.spec
@@ -26,6 +29,7 @@
import spack.util.module_cmd
import spack.version
from spack.util.environment import filter_system_paths
from spack.util.file_cache import FileCache
__all__ = ["Compiler"]
@@ -34,7 +38,7 @@
@llnl.util.lang.memoized
def _get_compiler_version_output(compiler_path, version_arg, ignore_errors=()):
def _get_compiler_version_output(compiler_path, version_arg, ignore_errors=()) -> str:
"""Invokes the compiler at a given path passing a single
version argument and returns the output.
@@ -57,7 +61,7 @@ def _get_compiler_version_output(compiler_path, version_arg, ignore_errors=()):
return output
def get_compiler_version_output(compiler_path, *args, **kwargs):
def get_compiler_version_output(compiler_path, *args, **kwargs) -> str:
"""Wrapper for _get_compiler_version_output()."""
# This ensures that we memoize compiler output by *absolute path*,
# not just executable name. If we don't do this, and the path changes
@@ -290,6 +294,7 @@ def __init__(
self.environment = environment or {}
self.extra_rpaths = extra_rpaths or []
self.enable_implicit_rpaths = enable_implicit_rpaths
self.cache = COMPILER_CACHE
self.cc = paths[0]
self.cxx = paths[1]
@@ -390,15 +395,11 @@ def real_version(self):
E.g. C++11 flag checks.
"""
if not self._real_version:
try:
real_version = spack.version.Version(self.get_real_version())
if real_version == spack.version.Version("unknown"):
return self.version
self._real_version = real_version
except spack.util.executable.ProcessError:
self._real_version = self.version
return self._real_version
real_version_str = self.cache.get(self).real_version
if not real_version_str or real_version_str == "unknown":
return self.version
return spack.version.StandardVersion.from_string(real_version_str)
def implicit_rpaths(self) -> List[str]:
if self.enable_implicit_rpaths is False:
@@ -427,6 +428,11 @@ def default_dynamic_linker(self) -> Optional[str]:
@property
def default_libc(self) -> Optional["spack.spec.Spec"]:
"""Determine libc targeted by the compiler from link line"""
# technically this should be testing the target platform of the compiler, but we don't have
# that, so stick to host platform for now.
if sys.platform in ("darwin", "win32"):
return None
dynamic_linker = self.default_dynamic_linker
if not dynamic_linker:
@@ -445,19 +451,23 @@ def required_libs(self):
@property
def compiler_verbose_output(self) -> Optional[str]:
"""Verbose output from compiling a dummy C source file. Output is cached."""
if not hasattr(self, "_compile_c_source_output"):
self._compile_c_source_output = self._compile_dummy_c_source()
return self._compile_c_source_output
return self.cache.get(self).c_compiler_output
def _compile_dummy_c_source(self) -> Optional[str]:
cc = self.cc if self.cc else self.cxx
if self.cc:
cc = self.cc
ext = "c"
else:
cc = self.cxx
ext = "cc"
if not cc or not self.verbose_flag:
return None
try:
tmpdir = tempfile.mkdtemp(prefix="spack-implicit-link-info")
fout = os.path.join(tmpdir, "output")
fin = os.path.join(tmpdir, "main.c")
fin = os.path.join(tmpdir, f"main.{ext}")
with open(fin, "w") as csource:
csource.write(
@@ -559,7 +569,7 @@ def fc_pic_flag(self):
# Note: This is not a class method. The class methods are used to detect
# compilers on PATH based systems, and do not set up the run environment of
# the compiler. This method can be called on `module` based systems as well
def get_real_version(self):
def get_real_version(self) -> str:
"""Query the compiler for its version.
This is the "real" compiler version, regardless of what is in the
@@ -569,14 +579,17 @@ def get_real_version(self):
modifications) to enable the compiler to run properly on any platform.
"""
cc = spack.util.executable.Executable(self.cc)
with self.compiler_environment():
output = cc(
self.version_argument,
output=str,
error=str,
ignore_errors=tuple(self.ignore_version_errors),
)
return self.extract_version_from_output(output)
try:
with self.compiler_environment():
output = cc(
self.version_argument,
output=str,
error=str,
ignore_errors=tuple(self.ignore_version_errors),
)
return self.extract_version_from_output(output)
except spack.util.executable.ProcessError:
return "unknown"
@property
def prefix(self):
@@ -603,7 +616,7 @@ def default_version(cls, cc):
@classmethod
@llnl.util.lang.memoized
def extract_version_from_output(cls, output):
def extract_version_from_output(cls, output: str) -> str:
"""Extracts the version from compiler's output."""
match = re.search(cls.version_regex, output)
return match.group(1) if match else "unknown"
@@ -732,3 +745,106 @@ def __init__(self, compiler, feature, flag_name, ver_string=None):
)
+ " implement the {0} property and submit a pull request or issue.".format(flag_name),
)
class CompilerCacheEntry:
"""Deserialized cache entry for a compiler"""
__slots__ = ["c_compiler_output", "real_version"]
def __init__(self, c_compiler_output: Optional[str], real_version: str):
self.c_compiler_output = c_compiler_output
self.real_version = real_version
@classmethod
def from_dict(cls, data: Dict[str, Optional[str]]):
if not isinstance(data, dict):
raise ValueError(f"Invalid {cls.__name__} data")
c_compiler_output = data.get("c_compiler_output")
real_version = data.get("real_version")
if not isinstance(real_version, str) or not isinstance(
c_compiler_output, (str, type(None))
):
raise ValueError(f"Invalid {cls.__name__} data")
return cls(c_compiler_output, real_version)
class CompilerCache:
"""Base class for compiler output cache. Default implementation does not cache anything."""
def value(self, compiler: Compiler) -> Dict[str, Optional[str]]:
return {
"c_compiler_output": compiler._compile_dummy_c_source(),
"real_version": compiler.get_real_version(),
}
def get(self, compiler: Compiler) -> CompilerCacheEntry:
return CompilerCacheEntry.from_dict(self.value(compiler))
class FileCompilerCache(CompilerCache):
"""Cache for compiler output, which is used to determine implicit link paths, the default libc
version, and the compiler version."""
name = os.path.join("compilers", "compilers.json")
def __init__(self, cache: "FileCache") -> None:
self.cache = cache
self.cache.init_entry(self.name)
self._data: Dict[str, Dict[str, Optional[str]]] = {}
def _get_entry(self, key: str) -> Optional[CompilerCacheEntry]:
try:
return CompilerCacheEntry.from_dict(self._data[key])
except ValueError:
del self._data[key]
except KeyError:
pass
return None
def get(self, compiler: Compiler) -> CompilerCacheEntry:
# Cache hit
try:
with self.cache.read_transaction(self.name) as f:
assert f is not None
self._data = json.loads(f.read())
assert isinstance(self._data, dict)
except (json.JSONDecodeError, AssertionError):
self._data = {}
key = self._key(compiler)
value = self._get_entry(key)
if value is not None:
return value
# Cache miss
with self.cache.write_transaction(self.name) as (old, new):
try:
assert old is not None
self._data = json.loads(old.read())
assert isinstance(self._data, dict)
except (json.JSONDecodeError, AssertionError):
self._data = {}
# Use cache entry that may have been created by another process in the meantime.
entry = self._get_entry(key)
# Finally compute the cache entry
if entry is None:
self._data[key] = self.value(compiler)
entry = CompilerCacheEntry.from_dict(self._data[key])
new.write(json.dumps(self._data, separators=(",", ":")))
return entry
def _key(self, compiler: Compiler) -> str:
as_bytes = json.dumps(compiler.to_dict(), separators=(",", ":")).encode("utf-8")
return hashlib.sha256(as_bytes).hexdigest()
def _make_compiler_cache():
return FileCompilerCache(spack.caches.MISC_CACHE)
COMPILER_CACHE: CompilerCache = llnl.util.lang.Singleton(_make_compiler_cache) # type: ignore

View File
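
`FileCompilerCache` keys entries by a digest of the compiler's serialized definition, so any edit to its paths, flags, or modules naturally invalidates the cached probe results. The key computation in isolation (`compiler_dict` is a made-up serialization, not the exact `to_dict()` output):

```python
import hashlib
import json

compiler_dict = {"spec": "gcc@12.3.0", "paths": {"cc": "/usr/bin/gcc"}}

as_bytes = json.dumps(compiler_dict, separators=(",", ":")).encode("utf-8")
key = hashlib.sha256(as_bytes).hexdigest()

# Each entry stores only the two expensive probes cached above:
entry = {"c_compiler_output": "<verbose link line>", "real_version": "12.3.0"}
cache = {key: entry}
print(json.dumps(cache, separators=(",", ":")))
```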

@@ -116,5 +116,5 @@ def fflags(self):
def _handle_default_flag_addtions(self):
# This is a known issue for AOCC 3.0 see:
# https://developer.amd.com/wp-content/resources/AOCC-3.0-Install-Guide.pdf
if self.real_version.satisfies(ver("3.0.0")):
if self.version.satisfies(ver("3.0.0")):
return "-Wno-unused-command-line-argument " "-mllvm -eliminate-similar-expr=false"

View File

@@ -16,7 +16,6 @@
("gfortran", os.path.join("clang", "gfortran")),
("xlf_r", os.path.join("xl_r", "xlf_r")),
("xlf", os.path.join("xl", "xlf")),
("pgfortran", os.path.join("pgi", "pgfortran")),
("ifort", os.path.join("intel", "ifort")),
]
@@ -25,7 +24,6 @@
("gfortran", os.path.join("clang", "gfortran")),
("xlf90_r", os.path.join("xl_r", "xlf90_r")),
("xlf90", os.path.join("xl", "xlf90")),
("pgfortran", os.path.join("pgi", "pgfortran")),
("ifort", os.path.join("intel", "ifort")),
]

View File

@@ -1,77 +0,0 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
from spack.compiler import Compiler, UnsupportedCompilerFlag
from spack.version import Version
class Pgi(Compiler):
# Named wrapper links within build_env_path
link_paths = {
"cc": os.path.join("pgi", "pgcc"),
"cxx": os.path.join("pgi", "pgc++"),
"f77": os.path.join("pgi", "pgfortran"),
"fc": os.path.join("pgi", "pgfortran"),
}
version_argument = "-V"
ignore_version_errors = [2] # `pgcc -V` on PowerPC annoyingly returns 2
version_regex = r"pg[^ ]* ([0-9.]+)-[0-9]+ (LLVM )?[^ ]+ target on "
@property
def verbose_flag(self):
return "-v"
@property
def debug_flags(self):
return ["-g", "-gopt"]
@property
def opt_flags(self):
return ["-O", "-O0", "-O1", "-O2", "-O3", "-O4"]
@property
def openmp_flag(self):
return "-mp"
@property
def cxx11_flag(self):
return "-std=c++11"
@property
def cc_pic_flag(self):
return "-fpic"
@property
def cxx_pic_flag(self):
return "-fpic"
@property
def f77_pic_flag(self):
return "-fpic"
@property
def fc_pic_flag(self):
return "-fpic"
required_libs = ["libpgc", "libpgf90"]
@property
def c99_flag(self):
if self.real_version >= Version("12.10"):
return "-c99"
raise UnsupportedCompilerFlag(self, "the C99 standard", "c99_flag", "< 12.10")
@property
def c11_flag(self):
if self.real_version >= Version("15.3"):
return "-c11"
raise UnsupportedCompilerFlag(self, "the C11 standard", "c11_flag", "< 15.3")
@property
def stdcxx_libs(self):
return ("-pgc++libs",)

View File

@@ -2,14 +2,20 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""
(DEPRECATED) Used to contain the code for the original concretizer
"""
"""High-level functions to concretize list of specs"""
import sys
import time
from contextlib import contextmanager
from itertools import chain
from typing import Iterable, Optional, Sequence, Tuple, Union
import llnl.util.tty as tty
import spack.compilers
import spack.config
import spack.error
import spack.repo
import spack.util.parallel
from spack.spec import ArchSpec, CompilerSpec, Spec
CHECK_COMPILER_EXISTENCE = True
@@ -30,67 +36,167 @@ def enable_compiler_existence_check():
CHECK_COMPILER_EXISTENCE = saved
def find_spec(spec, condition, default=None):
"""Searches the dag from spec in an intelligent order and looks
for a spec that matches a condition"""
# First search parents, then search children
deptype = ("build", "link")
dagiter = chain(
spec.traverse(direction="parents", deptype=deptype, root=False),
spec.traverse(direction="children", deptype=deptype, root=False),
)
visited = set()
for relative in dagiter:
if condition(relative):
return relative
visited.add(id(relative))
# Then search all other relatives in the DAG *except* spec
for relative in spec.root.traverse(deptype="all"):
if relative is spec:
continue
if id(relative) in visited:
continue
if condition(relative):
return relative
# Finally search spec itself.
if condition(spec):
return spec
return default # Nothing matched the condition; return default.
SpecPair = Tuple[Spec, Spec]
SpecLike = Union[Spec, str]
TestsType = Union[bool, Iterable[str]]
def concretize_specs_together(*abstract_specs, **kwargs):
def concretize_specs_together(
abstract_specs: Sequence[SpecLike], tests: TestsType = False
) -> Sequence[Spec]:
"""Given a number of specs as input, tries to concretize them together.
Args:
tests (bool or list or set): False to run no tests, True to test
all packages, or a list of package names to run tests for some
*abstract_specs: abstract specs to be concretized, given either
as Specs or strings
Returns:
List of concretized specs
abstract_specs: abstract specs to be concretized
tests: list of package names for which to consider test dependencies. If True, all nodes
will have test dependencies. If False, test dependencies will be disregarded.
"""
import spack.solver.asp
allow_deprecated = spack.config.get("config:deprecated", False)
solver = spack.solver.asp.Solver()
result = solver.solve(
abstract_specs, tests=kwargs.get("tests", False), allow_deprecated=allow_deprecated
)
result = solver.solve(abstract_specs, tests=tests, allow_deprecated=allow_deprecated)
return [s.copy() for s in result.specs]
def concretize_together(
spec_list: Sequence[SpecPair], tests: TestsType = False
) -> Sequence[SpecPair]:
"""Given a number of specs as input, tries to concretize them together.
Args:
spec_list: list of tuples to concretize. First entry is abstract spec, second entry is
already concrete spec or None if not yet concretized
tests: list of package names for which to consider test dependencies. If True, all nodes
will have test dependencies. If False, test dependencies will be disregarded.
"""
to_concretize = [concrete if concrete else abstract for abstract, concrete in spec_list]
abstract_specs = [abstract for abstract, _ in spec_list]
concrete_specs = concretize_specs_together(to_concretize, tests=tests)
return list(zip(abstract_specs, concrete_specs))
def concretize_together_when_possible(
spec_list: Sequence[SpecPair], tests: TestsType = False
) -> Sequence[SpecPair]:
"""Given a number of specs as input, tries to concretize them together to the extent possible.
See documentation for ``unify: when_possible`` concretization for the precise definition of
"to the extent possible".
Args:
spec_list: list of tuples to concretize. First entry is abstract spec, second entry is
already concrete spec or None if not yet concretized
tests: list of package names for which to consider test dependencies. If True, all nodes
will have test dependencies. If False, test dependencies will be disregarded.
"""
import spack.solver.asp
to_concretize = [concrete if concrete else abstract for abstract, concrete in spec_list]
old_concrete_to_abstract = {
concrete: abstract for (abstract, concrete) in spec_list if concrete
}
result_by_user_spec = {}
solver = spack.solver.asp.Solver()
allow_deprecated = spack.config.get("config:deprecated", False)
for result in solver.solve_in_rounds(
to_concretize, tests=tests, allow_deprecated=allow_deprecated
):
result_by_user_spec.update(result.specs_by_input)
# If the "abstract" spec is a concrete spec from the previous concretization
# translate it back to an abstract spec. Otherwise, keep the abstract spec
return [
(old_concrete_to_abstract.get(abstract, abstract), concrete)
for abstract, concrete in sorted(result_by_user_spec.items())
]
def concretize_separately(
spec_list: Sequence[SpecPair], tests: TestsType = False
) -> Sequence[SpecPair]:
"""Concretizes the input specs separately from each other.
Args:
spec_list: list of tuples to concretize. First entry is abstract spec, second entry is
already concrete spec or None if not yet concretized
tests: list of package names for which to consider test dependencies. If True, all nodes
will have test dependencies. If False, test dependencies will be disregarded.
"""
import spack.bootstrap
to_concretize = [abstract for abstract, concrete in spec_list if not concrete]
args = [
(i, str(abstract), tests)
for i, abstract in enumerate(to_concretize)
if not abstract.concrete
]
ret = [(i, abstract) for i, abstract in enumerate(to_concretize) if abstract.concrete]
# Ensure we don't try to bootstrap clingo in parallel
with spack.bootstrap.ensure_bootstrap_configuration():
spack.bootstrap.ensure_clingo_importable_or_raise()
# Ensure all the indexes have been built or updated, since
# otherwise the processes in the pool may timeout on waiting
# for a write lock. We do this indirectly by retrieving the
# provider index, which should in turn trigger the update of
# all the indexes if there's any need for that.
_ = spack.repo.PATH.provider_index
# Ensure we have compilers in compilers.yaml to avoid that
# processes try to write the config file in parallel
_ = spack.compilers.all_compilers_config(spack.config.CONFIG)
# Early return if there is nothing to do
if len(args) == 0:
# Still have to combine the things that were passed in as abstract with the things
# that were passed in as pairs
return [(abstract, concrete) for abstract, (_, concrete) in zip(to_concretize, ret)] + [
(abstract, concrete) for abstract, concrete in spec_list if concrete
]
# Solve the environment in parallel on Linux
# TODO: support parallel concretization on macOS and Windows
num_procs = min(len(args), spack.config.determine_number_of_jobs(parallel=True))
for j, (i, concrete, duration) in enumerate(
spack.util.parallel.imap_unordered(
_concretize_task, args, processes=num_procs, debug=tty.is_debug(), maxtaskperchild=1
)
):
ret.append((i, concrete))
percentage = (j + 1) / len(args) * 100
tty.verbose(
f"{duration:6.1f}s [{percentage:3.0f}%] {concrete.cformat('{hash:7}')} "
f"{to_concretize[i].colored_str}"
)
sys.stdout.flush()
# Add specs in original order
ret.sort(key=lambda x: x[0])
return [(abstract, concrete) for abstract, (_, concrete) in zip(to_concretize, ret)] + [
(abstract, concrete) for abstract, concrete in spec_list if concrete
]
def _concretize_task(packed_arguments: Tuple[int, str, TestsType]) -> Tuple[int, Spec, float]:
index, spec_str, tests = packed_arguments
with tty.SuppressOutput(msg_enabled=False):
start = time.time()
spec = Spec(spec_str).concretized(tests=tests)
return index, spec, time.time() - start
class UnavailableCompilerVersionError(spack.error.SpackError):
"""Raised when there is no available compiler that satisfies a
compiler spec."""
def __init__(self, compiler_spec, arch=None):
err_msg = "No compilers with spec {0} found".format(compiler_spec)
def __init__(self, compiler_spec: CompilerSpec, arch: Optional[ArchSpec] = None) -> None:
err_msg = f"No compilers with spec {compiler_spec} found"
if arch:
err_msg += " for operating system {0} and target {1}.".format(arch.os, arch.target)
err_msg += f" for operating system {arch.os} and target {arch.target}."
super().__init__(
err_msg,

View File
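
A Spack-free model of the bookkeeping in `concretize_separately`: pairs that already carry a concrete spec are set aside, the rest go to the worker pool (which returns `(index, result)` tuples out of order), and everything is stitched back together in input order. Specs are plain strings here, purely for illustration:

```python
from typing import List, Optional, Tuple

SpecPair = Tuple[str, Optional[str]]  # (abstract, concrete-or-None), as strings

def reassemble(spec_list: List[SpecPair],
               solved: List[Tuple[int, str]]) -> List[SpecPair]:
    to_concretize = [abstract for abstract, concrete in spec_list if not concrete]
    solved.sort(key=lambda x: x[0])  # restore original submission order
    fresh = [(abstract, concrete)
             for abstract, (_, concrete) in zip(to_concretize, solved)]
    return fresh + [(a, c) for a, c in spec_list if c]

pairs = [("zlib", None), ("cmake", "cmake@3.27.9"), ("hdf5", None)]
assert reassemble(pairs, solved=[(1, "hdf5@1.14.3"), (0, "zlib@1.3")]) == [
    ("zlib", "zlib@1.3"), ("hdf5", "hdf5@1.14.3"), ("cmake", "cmake@3.27.9")
]
```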

@@ -427,6 +427,10 @@ def __init__(self, *scopes: ConfigScope) -> None:
self.push_scope(scope)
self.format_updates: Dict[str, List[ConfigScope]] = collections.defaultdict(list)
def ensure_unwrapped(self) -> "Configuration":
"""Ensure we unwrap this object from any dynamic wrapper (like Singleton)"""
return self
@_config_mutator
def push_scope(self, scope: ConfigScope) -> None:
"""Add a higher precedence scope to the Configuration."""
@@ -714,7 +718,7 @@ def print_section(self, section: str, blame: bool = False, *, scope=None) -> Non
@contextlib.contextmanager
def override(
path_or_scope: Union[ConfigScope, str], value: Optional[Any] = None
) -> Generator[Union[lang.Singleton, Configuration], None, None]:
) -> Generator[Configuration, None, None]:
"""Simple way to override config settings within a context.
Arguments:
@@ -752,13 +756,7 @@ def override(
assert scope is overrides
#: configuration scopes added on the command line set by ``spack.main.main()``
COMMAND_LINE_SCOPES: List[str] = []
def _add_platform_scope(
cfg: Union[Configuration, lang.Singleton], name: str, path: str, writable: bool = True
) -> None:
def _add_platform_scope(cfg: Configuration, name: str, path: str, writable: bool = True) -> None:
"""Add a platform-specific subdirectory for the current platform."""
platform = spack.platforms.host().name
scope = DirectoryConfigScope(
@@ -792,9 +790,7 @@ def config_paths_from_entry_points() -> List[Tuple[str, str]]:
return config_paths
def _add_command_line_scopes(
cfg: Union[Configuration, lang.Singleton], command_line_scopes: List[str]
) -> None:
def _add_command_line_scopes(cfg: Configuration, command_line_scopes: List[str]) -> None:
"""Add additional scopes from the --config-scope argument, either envs or dirs."""
import spack.environment.environment as env # circular import
@@ -864,18 +860,11 @@ def create() -> Configuration:
# Each scope can have per-platfom overrides in subdirectories
_add_platform_scope(cfg, name, path)
# add command-line scopes
_add_command_line_scopes(cfg, COMMAND_LINE_SCOPES)
# we make a special scope for spack commands so that they can
# override configuration options.
cfg.push_scope(InternalConfigScope("command_line"))
return cfg
#: This is the singleton configuration instance for Spack.
CONFIG: Union[Configuration, lang.Singleton] = lang.Singleton(create)
CONFIG: Configuration = lang.Singleton(create) # type: ignore
def add_from_file(filename: str, scope: Optional[str] = None) -> None:

View File
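
The `ensure_unwrapped` hook exists because `CONFIG` is a lazy `Singleton` proxy that only becomes a real `Configuration` on first attribute access; forwarding `ensure_unwrapped()` through the proxy hands callers the concrete object, which is what lets the type annotations drop `Union[Configuration, lang.Singleton]`. A toy version of that contract (the real proxy lives in `llnl.util.lang`):

```python
class Singleton:
    """Toy lazy proxy; attribute access instantiates and forwards."""

    def __init__(self, factory):
        self._factory = factory
        self._instance = None

    def __getattr__(self, name):
        if self._instance is None:
            self._instance = self._factory()
        return getattr(self._instance, name)

class Configuration:
    def ensure_unwrapped(self) -> "Configuration":
        return self  # the real object is its own unwrapped form

CONFIG = Singleton(Configuration)
assert type(CONFIG.ensure_unwrapped()) is Configuration
```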

@@ -69,6 +69,8 @@
from spack.error import SpackError
from spack.util.crypto import bit_length
from .enums import InstallRecordStatus
# TODO: Provide an API for automatically retrying a build after detecting and
# TODO: clearing a failure.
@@ -160,36 +162,12 @@ def converter(self, spec_like, *args, **kwargs):
return converter
class InstallStatus(str):
pass
class InstallStatuses:
INSTALLED = InstallStatus("installed")
DEPRECATED = InstallStatus("deprecated")
MISSING = InstallStatus("missing")
@classmethod
def canonicalize(cls, query_arg):
if query_arg is True:
return [cls.INSTALLED]
if query_arg is False:
return [cls.MISSING]
if query_arg is any:
return [cls.INSTALLED, cls.DEPRECATED, cls.MISSING]
if isinstance(query_arg, InstallStatus):
return [query_arg]
try:
statuses = list(query_arg)
if all(isinstance(x, InstallStatus) for x in statuses):
return statuses
except TypeError:
pass
raise TypeError(
"installation query must be `any`, boolean, "
"InstallStatus, or iterable of InstallStatus"
)
def normalize_query(installed: Union[bool, InstallRecordStatus]) -> InstallRecordStatus:
if installed is True:
installed = InstallRecordStatus.INSTALLED
elif installed is False:
installed = InstallRecordStatus.MISSING
return installed
class InstallRecord:
@@ -227,8 +205,8 @@ def __init__(
installation_time: Optional[float] = None,
deprecated_for: Optional[str] = None,
in_buildcache: bool = False,
origin=None,
):
origin: Optional[str] = None,
) -> None:
self.spec = spec
self.path = str(path) if path else None
self.installed = bool(installed)
@@ -239,14 +217,12 @@ def __init__(
self.in_buildcache = in_buildcache
self.origin = origin
def install_type_matches(self, installed):
installed = InstallStatuses.canonicalize(installed)
def install_type_matches(self, installed: InstallRecordStatus) -> bool:
if self.installed:
return InstallStatuses.INSTALLED in installed
return InstallRecordStatus.INSTALLED in installed
elif self.deprecated_for:
return InstallStatuses.DEPRECATED in installed
else:
return InstallStatuses.MISSING in installed
return InstallRecordStatus.DEPRECATED in installed
return InstallRecordStatus.MISSING in installed
def to_dict(self, include_fields=DEFAULT_INSTALL_RECORD_FIELDS):
rec_dict = {}
@@ -1336,7 +1312,7 @@ def _deprecate(self, spec: "spack.spec.Spec", deprecator: "spack.spec.Spec") ->
self._data[spec_key] = spec_rec
@_autospec
def mark(self, spec: "spack.spec.Spec", key, value) -> None:
def mark(self, spec: "spack.spec.Spec", key: str, value: Any) -> None:
"""Mark an arbitrary record on a spec."""
with self.write_transaction():
return self._mark(spec, key, value)
@@ -1396,7 +1372,13 @@ def installed_extensions_for(self, extendee_spec: "spack.spec.Spec"):
if spec.package.extends(extendee_spec):
yield spec.package
def _get_by_hash_local(self, dag_hash, default=None, installed=any):
def _get_by_hash_local(
self,
dag_hash: str,
default: Optional[List["spack.spec.Spec"]] = None,
installed: Union[bool, InstallRecordStatus] = InstallRecordStatus.ANY,
) -> Optional[List["spack.spec.Spec"]]:
installed = normalize_query(installed)
# hash is a full hash and is in the data somewhere
if dag_hash in self._data:
rec = self._data[dag_hash]
@@ -1405,8 +1387,7 @@ def _get_by_hash_local(self, dag_hash, default=None, installed=any):
else:
return default
# check if hash is a prefix of some installed (or previously
# installed) spec.
# check if hash is a prefix of some installed (or previously installed) spec.
matches = [
record.spec
for h, record in self._data.items()
@@ -1418,52 +1399,43 @@ def _get_by_hash_local(self, dag_hash, default=None, installed=any):
# nothing found
return default
def get_by_hash_local(self, dag_hash, default=None, installed=any):
def get_by_hash_local(
self,
dag_hash: str,
default: Optional[List["spack.spec.Spec"]] = None,
installed: Union[bool, InstallRecordStatus] = InstallRecordStatus.ANY,
) -> Optional[List["spack.spec.Spec"]]:
"""Look up a spec in *this DB* by DAG hash, or by a DAG hash prefix.
Arguments:
dag_hash (str): hash (or hash prefix) to look up
default (object or None): default value to return if dag_hash is
not in the DB (default: None)
installed (bool or InstallStatus or typing.Iterable or None):
if ``True``, includes only installed
specs in the search; if ``False`` only missing specs, and if
``any``, all specs in database. If an InstallStatus or iterable
of InstallStatus, returns specs whose install status
(installed, deprecated, or missing) matches (one of) the
InstallStatus. (default: any)
Args:
dag_hash: hash (or hash prefix) to look up
default: default value to return if dag_hash is not in the DB
installed: if ``True``, includes only installed specs in the search; if ``False``
only missing specs. Otherwise, an ``InstallRecordStatus`` flag.
``installed`` defaults to ``any`` so that we can refer to any
known hash. Note that ``query()`` and ``query_one()`` differ in
that they only return installed specs by default.
Returns:
(list): a list of specs matching the hash or hash prefix
``installed`` defaults to ``InstallRecordStatus.ANY`` so we can refer to any known hash.
``query()`` and ``query_one()`` differ in that they only return installed specs by default.
"""
with self.read_transaction():
return self._get_by_hash_local(dag_hash, default=default, installed=installed)
def get_by_hash(self, dag_hash, default=None, installed=any):
def get_by_hash(
self,
dag_hash: str,
default: Optional[List["spack.spec.Spec"]] = None,
installed: Union[bool, InstallRecordStatus] = InstallRecordStatus.ANY,
) -> Optional[List["spack.spec.Spec"]]:
"""Look up a spec by DAG hash, or by a DAG hash prefix.
Arguments:
dag_hash (str): hash (or hash prefix) to look up
default (object or None): default value to return if dag_hash is
not in the DB (default: None)
installed (bool or InstallStatus or typing.Iterable or None):
if ``True``, includes only installed specs in the search; if ``False``
only missing specs, and if ``any``, all specs in database. If an
InstallStatus or iterable of InstallStatus, returns specs whose install
status (installed, deprecated, or missing) matches (one of) the
InstallStatus. (default: any)
Args:
dag_hash: hash (or hash prefix) to look up
default: default value to return if dag_hash is not in the DB
installed: if ``True``, includes only installed specs in the search; if ``False``
only missing specs. Otherwise, an ``InstallRecordStatus`` flag.
``installed`` defaults to ``any`` so that we can refer to any
known hash. Note that ``query()`` and ``query_one()`` differ in
that they only return installed specs by default.
Returns:
(list): a list of specs matching the hash or hash prefix
``installed`` defaults to ``InstallRecordStatus.ANY`` so we can refer to any known hash.
``query()`` and ``query_one()`` differ in that they only return installed specs by default.
"""
@@ -1483,7 +1455,7 @@ def _query(
query_spec: Optional[Union[str, "spack.spec.Spec"]] = None,
*,
predicate_fn: Optional[SelectType] = None,
installed: Union[bool, InstallStatus, List[InstallStatus]] = True,
installed: Union[bool, InstallRecordStatus] = True,
explicit: Optional[bool] = None,
start_date: Optional[datetime.datetime] = None,
end_date: Optional[datetime.datetime] = None,
@@ -1491,6 +1463,7 @@ def _query(
in_buildcache: Optional[bool] = None,
origin: Optional[str] = None,
) -> List["spack.spec.Spec"]:
installed = normalize_query(installed)
# Restrict the set of records over which we iterate first
matching_hashes = self._data
@@ -1560,7 +1533,7 @@ def query_local(
query_spec: Optional[Union[str, "spack.spec.Spec"]] = None,
*,
predicate_fn: Optional[SelectType] = None,
installed: Union[bool, InstallStatus, List[InstallStatus]] = True,
installed: Union[bool, InstallRecordStatus] = True,
explicit: Optional[bool] = None,
start_date: Optional[datetime.datetime] = None,
end_date: Optional[datetime.datetime] = None,
@@ -1620,7 +1593,7 @@ def query(
query_spec: Optional[Union[str, "spack.spec.Spec"]] = None,
*,
predicate_fn: Optional[SelectType] = None,
installed: Union[bool, InstallStatus, List[InstallStatus]] = True,
installed: Union[bool, InstallRecordStatus] = True,
explicit: Optional[bool] = None,
start_date: Optional[datetime.datetime] = None,
end_date: Optional[datetime.datetime] = None,
@@ -1628,7 +1601,7 @@ def query(
hashes: Optional[List[str]] = None,
origin: Optional[str] = None,
install_tree: str = "all",
):
) -> List["spack.spec.Spec"]:
"""Queries the Spack database including all upstream databases.
Args:
@@ -1709,13 +1682,14 @@ def query(
)
results = list(local_results) + list(x for x in upstream_results if x not in local_results)
return sorted(results)
results.sort()
return results
def query_one(
self,
query_spec: Optional[Union[str, "spack.spec.Spec"]],
predicate_fn: Optional[SelectType] = None,
installed: Union[bool, InstallStatus, List[InstallStatus]] = True,
installed: Union[bool, InstallRecordStatus] = True,
) -> Optional["spack.spec.Spec"]:
"""Query for exactly one spec that matches the query spec.
@@ -1771,24 +1745,6 @@ def root(key, record):
if id(rec.spec) not in needed and rec.installed
]
def update_explicit(self, spec, explicit):
"""
Update the spec's explicit state in the database.
Args:
spec (spack.spec.Spec): the spec whose install record is being updated
explicit (bool): ``True`` if the package was requested explicitly
by the user, ``False`` if it was pulled in as a dependency of
an explicit package.
"""
rec = self.get_record(spec)
if explicit != rec.explicit:
with self.write_transaction():
message = "{s.name}@{s.version} : marking the package {0}"
status = "explicit" if explicit else "implicit"
tty.debug(message.format(status, s=spec))
rec.explicit = explicit
class NoUpstreamVisitor:
"""Gives edges to upstream specs, but does follow edges from upstream specs."""


@@ -34,12 +34,13 @@ class OpenMpi(Package):
import collections.abc
import os.path
import re
from typing import TYPE_CHECKING, Any, Callable, List, Optional, Tuple, Union
from typing import Any, Callable, List, Optional, Tuple, Union
import llnl.util.lang
import llnl.util.tty.color
import spack.deptypes as dt
import spack.package_base
import spack.patch
import spack.spec
import spack.util.crypto
@@ -56,14 +57,10 @@ class OpenMpi(Package):
VersionLookupError,
)
if TYPE_CHECKING:
import spack.package_base
__all__ = [
"DirectiveError",
"DirectiveMeta",
"DisableRedistribute",
"version",
"conditional",
"conflicts",
"depends_on",
"extends",
@@ -76,6 +73,7 @@ class OpenMpi(Package):
"build_system",
"requires",
"redistribute",
"can_splice",
]
_patch_order_index = 0
@@ -83,15 +81,15 @@ class OpenMpi(Package):
SpecType = str
DepType = Union[Tuple[str, ...], str]
WhenType = Optional[Union["spack.spec.Spec", str, bool]]
Patcher = Callable[[Union["spack.package_base.PackageBase", Dependency]], None]
WhenType = Optional[Union[spack.spec.Spec, str, bool]]
Patcher = Callable[[Union[spack.package_base.PackageBase, Dependency]], None]
PatchesType = Optional[Union[Patcher, str, List[Union[Patcher, str]]]]
SUPPORTED_LANGUAGES = ("fortran", "cxx", "c")
def _make_when_spec(value: WhenType) -> Optional["spack.spec.Spec"]:
def _make_when_spec(value: WhenType) -> Optional[spack.spec.Spec]:
"""Create a ``Spec`` that indicates when a directive should be applied.
Directives with ``when`` specs, e.g.:
@@ -136,7 +134,7 @@ def _make_when_spec(value: WhenType) -> Optional["spack.spec.Spec"]:
return spack.spec.Spec(value)
SubmoduleCallback = Callable[["spack.package_base.PackageBase"], Union[str, List[str], bool]]
SubmoduleCallback = Callable[[spack.package_base.PackageBase], Union[str, List[str], bool]]
directive = DirectiveMeta.directive
@@ -252,8 +250,8 @@ def _execute_version(pkg, ver, **kwargs):
def _depends_on(
pkg: "spack.package_base.PackageBase",
spec: "spack.spec.Spec",
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
*,
when: WhenType = None,
type: DepType = dt.DEFAULT_TYPES,
@@ -332,7 +330,7 @@ def conflicts(conflict_spec: SpecType, when: WhenType = None, msg: Optional[str]
msg (str): optional user defined message
"""
def _execute_conflicts(pkg: "spack.package_base.PackageBase"):
def _execute_conflicts(pkg: spack.package_base.PackageBase):
# If when is not specified the conflict always holds
when_spec = _make_when_spec(when)
if not when_spec:
@@ -373,19 +371,12 @@ def depends_on(
assert type == "build", "languages must be of 'build' type"
return _language(lang_spec_str=spec, when=when)
def _execute_depends_on(pkg: "spack.package_base.PackageBase"):
def _execute_depends_on(pkg: spack.package_base.PackageBase):
_depends_on(pkg, dep_spec, when=when, type=type, patches=patches)
return _execute_depends_on
#: Store whether a given Spec source/binary should not be redistributed.
class DisableRedistribute:
def __init__(self, source, binary):
self.source = source
self.binary = binary
@directive("disable_redistribute")
def redistribute(source=None, binary=None, when: WhenType = None):
"""Can be used inside a Package definition to declare that
@@ -402,7 +393,7 @@ def redistribute(source=None, binary=None, when: WhenType = None):
def _execute_redistribute(
pkg: "spack.package_base.PackageBase", source=None, binary=None, when: WhenType = None
pkg: spack.package_base.PackageBase, source=None, binary=None, when: WhenType = None
):
if source is None and binary is None:
return
@@ -432,7 +423,7 @@ def _execute_redistribute(
if not binary:
disable.binary = True
else:
pkg.disable_redistribute[when_spec] = DisableRedistribute(
pkg.disable_redistribute[when_spec] = spack.package_base.DisableRedistribute(
source=not source, binary=not binary
)
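For illustration, a hedged sketch of the directive in a package recipe (the package and the `@develop` constraint are hypothetical):

```python
class Example(Package):
    # never redistribute binaries of this package, and do not allow
    # source mirroring of the development branch
    redistribute(binary=False)
    redistribute(source=False, when="@develop")
```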
@@ -478,7 +469,7 @@ def provides(*specs: SpecType, when: WhenType = None):
when: condition when this provides clause needs to be considered
"""
def _execute_provides(pkg: "spack.package_base.PackageBase"):
def _execute_provides(pkg: spack.package_base.PackageBase):
import spack.parser # Avoid circular dependency
when_spec = _make_when_spec(when)
@@ -504,6 +495,43 @@ def _execute_provides(pkg: "spack.package_base.PackageBase"):
return _execute_provides
@directive("splice_specs")
def can_splice(
target: SpecType, *, when: SpecType, match_variants: Union[None, str, List[str]] = None
):
"""Packages can declare whether they are ABI-compatible with another package
and thus can be spliced into concrete versions of that package.
Args:
target: The spec that the current package is ABI-compatible with.
when: An anonymous spec constraining the current package to the versions
for which it is ABI-compatible with target.
match_variants: A list of variants that must match between the target spec
and the current package, with the special value '*' matching all variants.
For example, if a variant named json is defined on both packages, they are
ABI-compatible whenever they agree on the json variant (regardless of
whether it is turned on or off). Note that this cannot be applied to
multi-valued variants, and multi-valued variants will be skipped by '*'.
"""
def _execute_can_splice(pkg: spack.package_base.PackageBase):
when_spec = _make_when_spec(when)
if isinstance(match_variants, str) and match_variants != "*":
raise ValueError(
"* is the only valid string for match_variants "
"if looking to provide a single variant, use "
f"[{match_variants}] instead"
)
if when_spec is None:
return
pkg.splice_specs[when_spec] = (spack.spec.Spec(target), match_variants)
return _execute_can_splice
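A sketch of the new directive in a package recipe, following the docstring above (the package name and versions are hypothetical):

```python
class Foo(Package):
    variant("json", default=False, description="Enable JSON support")

    # foo@1.1 can be spliced in for foo@1.0 whenever both specs agree
    # on the json variant
    can_splice("foo@1.0", when="@1.1", match_variants=["json"])
```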
@directive("patches")
def patch(
url_or_filename: str,
@@ -530,7 +558,7 @@ def patch(
compressed URL patches)
"""
def _execute_patch(pkg_or_dep: Union["spack.package_base.PackageBase", Dependency]):
def _execute_patch(pkg_or_dep: Union[spack.package_base.PackageBase, Dependency]):
pkg = pkg_or_dep
if isinstance(pkg, Dependency):
pkg = pkg.pkg
@@ -577,6 +605,15 @@ def _execute_patch(pkg_or_dep: Union["spack.package_base.PackageBase", Dependenc
return _execute_patch
def conditional(*values: List[Any], when: Optional[WhenType] = None):
"""Conditional values that can be used in variant declarations."""
# _make_when_spec returns None when the condition is statically false.
when = _make_when_spec(when)
return spack.variant.ConditionalVariantValues(
spack.variant.ConditionalValue(x, when=when) for x in values
)
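For example, a hedged sketch of `conditional` inside a variant declaration (the variant name, values, and constraint are hypothetical):

```python
class Example(Package):
    # the "quad" value is only allowed on version 2.0 and later
    variant(
        "precision",
        default="double",
        values=("double", conditional("quad", when="@2.0:")),
        description="Floating-point precision",
    )
```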
@directive("variants")
def variant(
name: str,
@@ -845,7 +882,7 @@ def requires(*requirement_specs: str, policy="one_of", when=None, msg=None):
msg: optional user defined message
"""
def _execute_requires(pkg: "spack.package_base.PackageBase"):
def _execute_requires(pkg: spack.package_base.PackageBase):
if policy not in ("one_of", "any_of"):
err_msg = (
f"the 'policy' argument of the 'requires' directive in {pkg.name} is set "
@@ -870,7 +907,7 @@ def _execute_requires(pkg: "spack.package_base.PackageBase"):
def _language(lang_spec_str: str, *, when: Optional[Union[str, bool]] = None):
"""Temporary implementation of language virtuals, until compilers are proper dependencies."""
def _execute_languages(pkg: "spack.package_base.PackageBase"):
def _execute_languages(pkg: spack.package_base.PackageBase):
when_spec = _make_when_spec(when)
if not when_spec:
return


@@ -10,6 +10,7 @@
import llnl.util.lang
import spack.error
import spack.repo
import spack.spec
#: Names of possible directives. This list is mostly populated using the @directive decorator.
@@ -63,7 +64,7 @@ def __init__(cls, name, bases, attr_dict):
# The instance is being initialized: if it is a package we must ensure
# that the directives are called to set it up.
if "spack.pkg" in cls.__module__:
if cls.__module__.startswith(spack.repo.ROOT_PYTHON_NAMESPACE):
# Ensure the presence of the dictionaries associated with the directives.
# All dictionaries are defaultdicts that create lists for missing keys.
for d in DirectiveMeta._directive_dict_names:

lib/spack/spack/enums.py (new file, 15 lines)

@@ -0,0 +1,15 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Enumerations used throughout Spack"""
import enum
class InstallRecordStatus(enum.Flag):
"""Enum flag to facilitate querying status from the DB"""
INSTALLED = enum.auto()
DEPRECATED = enum.auto()
MISSING = enum.auto()
ANY = INSTALLED | DEPRECATED | MISSING
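Because this is an `enum.Flag`, statuses combine with `|` and support membership tests, which is what the rewritten `install_type_matches` above relies on:

```python
status = InstallRecordStatus.DEPRECATED
assert status in InstallRecordStatus.ANY
assert status in (InstallRecordStatus.DEPRECATED | InstallRecordStatus.MISSING)
assert InstallRecordStatus.INSTALLED not in status
```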


@@ -473,6 +473,7 @@
active_environment,
all_environment_names,
all_environments,
as_env_dir,
create,
create_in_dir,
deactivate,
@@ -480,6 +481,7 @@
default_view_name,
display_specs,
environment_dir_from_name,
environment_from_name_or_dir,
exists,
initialize_environment_dir,
installed_specs,
@@ -507,6 +509,7 @@
"active_environment",
"all_environment_names",
"all_environments",
"as_env_dir",
"create",
"create_in_dir",
"deactivate",
@@ -514,6 +517,7 @@
"default_view_name",
"display_specs",
"environment_dir_from_name",
"environment_from_name_or_dir",
"exists",
"initialize_environment_dir",
"installed_specs",


@@ -11,22 +11,19 @@
import re
import shutil
import stat
import sys
import time
import urllib.parse
import urllib.request
import warnings
from typing import Any, Dict, Iterable, List, Optional, Set, Tuple, Union
from typing import Any, Dict, Iterable, List, Optional, Sequence, Tuple, Union
import llnl.util.filesystem as fs
import llnl.util.tty as tty
import llnl.util.tty.color as clr
from llnl.util.link_tree import ConflictingSpecsError
from llnl.util.symlink import readlink, symlink
from llnl.util.symlink import islink, readlink, symlink
import spack
import spack.caches
import spack.compilers
import spack.concretize
import spack.config
import spack.deptypes as dt
@@ -45,7 +42,6 @@
import spack.util.environment
import spack.util.hash
import spack.util.lock as lk
import spack.util.parallel
import spack.util.path
import spack.util.spack_json as sjson
import spack.util.spack_yaml as syaml
@@ -57,6 +53,8 @@
from spack.spec_list import SpecList
from spack.util.path import substitute_path_variables
SpecPair = spack.concretize.SpecPair
#: environment variable used to indicate the active environment
spack_env_var = "SPACK_ENV"
@@ -277,6 +275,22 @@ def is_env_dir(path):
return os.path.isdir(path) and os.path.exists(os.path.join(path, manifest_name))
def as_env_dir(name_or_dir):
"""Translate an environment name or directory to the environment directory"""
if is_env_dir(name_or_dir):
return name_or_dir
else:
validate_env_name(name_or_dir)
if not exists(name_or_dir):
raise SpackEnvironmentError("no such environment '%s'" % name_or_dir)
return root(name_or_dir)
def environment_from_name_or_dir(name_or_dir):
"""Get an environment with the supplied name."""
return Environment(as_env_dir(name_or_dir))
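A small usage sketch (the environment name and path are hypothetical):

```python
env = environment_from_name_or_dir("myenv")            # by name
env = environment_from_name_or_dir("/path/to/envdir")  # by directory
```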
def read(name):
"""Get an environment with the supplied name."""
validate_env_name(name)
@@ -654,7 +668,7 @@ def from_dict(base_path, d):
@property
def _current_root(self):
if not os.path.islink(self.root):
if not islink(self.root):
return None
root = readlink(self.root)
@@ -1494,7 +1508,7 @@ def deconcretize(self, spec: spack.spec.Spec, concrete: bool = True):
def _get_specs_to_concretize(
self,
) -> Tuple[Set[spack.spec.Spec], Set[spack.spec.Spec], List[spack.spec.Spec]]:
) -> Tuple[List[spack.spec.Spec], List[spack.spec.Spec], List[SpecPair]]:
"""Compute specs to concretize for unify:true and unify:when_possible.
This includes new user specs and any already concretized specs.
@@ -1504,23 +1518,20 @@ def _get_specs_to_concretize(
"""
# Exit early if the set of concretized specs is the set of user specs
new_user_specs = set(self.user_specs) - set(self.concretized_user_specs)
kept_user_specs = set(self.user_specs) & set(self.concretized_user_specs)
new_user_specs = list(set(self.user_specs) - set(self.concretized_user_specs))
kept_user_specs = list(set(self.user_specs) & set(self.concretized_user_specs))
kept_user_specs += self.included_user_specs
if not new_user_specs:
return new_user_specs, kept_user_specs, []
concrete_specs_to_keep = [
concrete
specs_to_concretize = [(s, None) for s in new_user_specs] + [
(abstract, concrete)
for abstract, concrete in self.concretized_specs()
if abstract in kept_user_specs
]
specs_to_concretize = list(new_user_specs) + concrete_specs_to_keep
return new_user_specs, kept_user_specs, specs_to_concretize
def _concretize_together_where_possible(
self, tests: bool = False
) -> List[Tuple[spack.spec.Spec, spack.spec.Spec]]:
def _concretize_together_where_possible(self, tests: bool = False) -> Sequence[SpecPair]:
# Avoid cyclic dependency
import spack.solver.asp
@@ -1529,36 +1540,26 @@ def _concretize_together_where_possible(
if not new_user_specs:
return []
old_concrete_to_abstract = {
concrete: abstract for (abstract, concrete) in self.concretized_specs()
}
self.concretized_user_specs = []
self.concretized_order = []
self.specs_by_hash = {}
result_by_user_spec = {}
solver = spack.solver.asp.Solver()
allow_deprecated = spack.config.get("config:deprecated", False)
for result in solver.solve_in_rounds(
specs_to_concretize, tests=tests, allow_deprecated=allow_deprecated
):
result_by_user_spec.update(result.specs_by_input)
ret = []
result = spack.concretize.concretize_together_when_possible(
specs_to_concretize, tests=tests
)
for abstract, concrete in result:
# Only add to the environment if it's from this environment (not included in)
if abstract in self.user_specs:
self._add_concrete_spec(abstract, concrete)
result = []
for abstract, concrete in sorted(result_by_user_spec.items()):
# If the "abstract" spec is a concrete spec from the previous concretization
# translate it back to an abstract spec. Otherwise, keep the abstract spec
abstract = old_concrete_to_abstract.get(abstract, abstract)
# Return only the new specs
if abstract in new_user_specs:
result.append((abstract, concrete))
self._add_concrete_spec(abstract, concrete)
ret.append((abstract, concrete))
return result
return ret
def _concretize_together(
self, tests: bool = False
) -> List[Tuple[spack.spec.Spec, spack.spec.Spec]]:
def _concretize_together(self, tests: bool = False) -> Sequence[SpecPair]:
"""Concretization strategy that concretizes all the specs
in the same DAG.
"""
@@ -1572,8 +1573,8 @@ def _concretize_together(
self.specs_by_hash = {}
try:
concrete_specs: List[spack.spec.Spec] = spack.concretize.concretize_specs_together(
*specs_to_concretize, tests=tests
concretized_specs = spack.concretize.concretize_together(
specs_to_concretize, tests=tests
)
except spack.error.UnsatisfiableSpecError as e:
# "Enhance" the error message for multiple root specs, suggest a less strict
@@ -1591,14 +1592,13 @@ def _concretize_together(
)
raise
# set() | set() does not preserve ordering, even though sets are ordered
ordered_user_specs = list(new_user_specs) + list(kept_user_specs)
concretized_specs = [x for x in zip(ordered_user_specs, concrete_specs)]
for abstract, concrete in concretized_specs:
self._add_concrete_spec(abstract, concrete)
# Don't add if it's just included
if abstract in self.user_specs:
self._add_concrete_spec(abstract, concrete)
# zip truncates the longer list, which is exactly what we want here
return list(zip(new_user_specs, concrete_specs))
# Return the portion of the return value that is new
return concretized_specs[: len(new_user_specs)]
def _concretize_separately(self, tests=False):
"""Concretization strategy that concretizes separately one
@@ -1620,71 +1620,16 @@ def _concretize_separately(self, tests=False):
concrete = old_specs_by_hash[h]
self._add_concrete_spec(s, concrete, new=False)
# Concretize any new user specs that we haven't concretized yet
args, root_specs, i = [], [], 0
for uspec in self.user_specs:
if uspec not in old_concretized_user_specs:
root_specs.append(uspec)
args.append((i, str(uspec), tests))
i += 1
to_concretize = [
(root, None) for root in self.user_specs if root not in old_concretized_user_specs
]
concretized_specs = spack.concretize.concretize_separately(to_concretize, tests=tests)
# Ensure we don't try to bootstrap clingo in parallel
with spack.bootstrap.ensure_bootstrap_configuration():
spack.bootstrap.ensure_clingo_importable_or_raise()
# Ensure all the indexes have been built or updated, since
# otherwise the processes in the pool may timeout on waiting
# for a write lock. We do this indirectly by retrieving the
# provider index, which should in turn trigger the update of
# all the indexes if there's any need for that.
_ = spack.repo.PATH.provider_index
# Ensure we have compilers in compilers.yaml to avoid that
# processes try to write the config file in parallel
_ = spack.compilers.all_compilers_config(spack.config.CONFIG)
# Early return if there is nothing to do
if len(args) == 0:
return []
# Solve the environment in parallel on Linux
start = time.time()
num_procs = min(len(args), spack.config.determine_number_of_jobs(parallel=True))
# TODO: support parallel concretization on macOS and Windows
msg = "Starting concretization"
if sys.platform not in ("darwin", "win32") and num_procs > 1:
msg += f" pool with {num_procs} processes"
tty.msg(msg)
batch = []
for j, (i, concrete, duration) in enumerate(
spack.util.parallel.imap_unordered(
_concretize_task,
args,
processes=num_procs,
debug=tty.is_debug(),
maxtaskperchild=1,
)
):
batch.append((i, concrete))
percentage = (j + 1) / len(args) * 100
tty.verbose(
f"{duration:6.1f}s [{percentage:3.0f}%] {concrete.cformat('{hash:7}')} "
f"{root_specs[i].colored_str}"
)
sys.stdout.flush()
# Add specs in original order
batch.sort(key=lambda x: x[0])
by_hash = {} # for attaching information on test dependencies
for root, (_, concrete) in zip(root_specs, batch):
self._add_concrete_spec(root, concrete)
by_hash = {}
for abstract, concrete in concretized_specs:
self._add_concrete_spec(abstract, concrete)
by_hash[concrete.dag_hash()] = concrete
finish = time.time()
tty.msg(f"Environment concretized in {finish - start:.2f} seconds")
# Unify the specs objects, so we get correct references to all parents
self._read_lockfile_dict(self._to_lockfile_dict())
@@ -1704,11 +1649,7 @@ def _concretize_separately(self, tests=False):
test_dependency.copy(), depflag=dt.TEST, virtuals=current_edge.virtuals
)
results = [
(abstract, self.specs_by_hash[h])
for abstract, h in zip(self.concretized_user_specs, self.concretized_order)
]
return results
return concretized_specs
@property
def default_view(self):
@@ -2515,14 +2456,6 @@ def display_specs(specs):
print(tree_string)
def _concretize_task(packed_arguments) -> Tuple[int, Spec, float]:
index, spec_str, tests = packed_arguments
with tty.SuppressOutput(msg_enabled=False):
start = time.time()
spec = Spec(spec_str).concretized(tests=tests)
return index, spec, time.time() - start
def make_repo_path(root):
"""Make a RepoPath from the repo subdirectories in an environment."""
path = spack.repo.RepoPath(cache=spack.caches.MISC_CACHE)


@@ -21,43 +21,40 @@
features.
"""
import importlib
from llnl.util.lang import ensure_last, list_modules
import spack.paths
import types
from typing import List, Optional
class _HookRunner:
#: Stores all hooks on first call, shared among
#: all HookRunner objects
_hooks = None
#: Order in which hooks are executed
HOOK_ORDER = [
"spack.hooks.module_file_generation",
"spack.hooks.licensing",
"spack.hooks.sbang",
"spack.hooks.windows_runtime_linkage",
"spack.hooks.drop_redundant_rpaths",
"spack.hooks.absolutify_elf_sonames",
"spack.hooks.permissions_setters",
# after all mutations to the install prefix, write metadata
"spack.hooks.write_install_manifest",
# after all metadata is written
"spack.hooks.autopush",
]
#: Contains all hook modules after first call, shared among all HookRunner objects
_hooks: Optional[List[types.ModuleType]] = None
def __init__(self, hook_name):
self.hook_name = hook_name
@classmethod
def _populate_hooks(cls):
# Lazily populate the list of hooks
cls._hooks = []
relative_names = list(list_modules(spack.paths.hooks_path))
# Ensure that write_install_manifest comes last
ensure_last(relative_names, "absolutify_elf_sonames", "write_install_manifest")
for name in relative_names:
module_name = __name__ + "." + name
module_obj = importlib.import_module(module_name)
cls._hooks.append((module_name, module_obj))
@property
def hooks(self):
def hooks(self) -> List[types.ModuleType]:
if not self._hooks:
self._populate_hooks()
self._hooks = [importlib.import_module(module_name) for module_name in self.HOOK_ORDER]
return self._hooks
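Each entry in `HOOK_ORDER` is a plain module defining functions named after hook points; a hedged sketch of a hypothetical hook module:

```python
# lib/spack/spack/hooks/example.py (hypothetical)
def post_install(spec, explicit):
    # runs after each package install; activating the hook only
    # requires listing this module in HOOK_ORDER
    pass
```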
def __call__(self, *args, **kwargs):
for _, module in self.hooks:
for module in self.hooks:
if hasattr(module, self.hook_name):
hook = getattr(module, self.hook_name)
if hasattr(hook, "__call__"):


@@ -17,13 +17,13 @@
import llnl.util.filesystem as fs
import llnl.util.tty as tty
import llnl.util.tty.log
import llnl.util.tty.log as log
from llnl.string import plural
from llnl.util.lang import nullcontext
from llnl.util.tty.color import colorize
import spack.build_environment
import spack.builder
import spack.compilers
import spack.config
import spack.error
import spack.package_base
@@ -51,7 +51,6 @@
ListOrStringType = Union[str, List[str]]
LogType = Union[llnl.util.tty.log.nixlog, llnl.util.tty.log.winlog]
Pb = TypeVar("Pb", bound="spack.package_base.PackageBase")
PackageObjectOrClass = Union[Pb, Type[Pb]]
@@ -208,22 +207,6 @@ def install_test_root(pkg: Pb):
return os.path.join(pkg.metadata_dir, "test")
def print_message(logger: LogType, msg: str, verbose: bool = False):
"""Print the message to the log, optionally echoing.
Args:
logger: instance of the output logger (e.g. nixlog or winlog)
msg: message being output
verbose: ``True`` displays verbose output, ``False`` suppresses
it (``False`` is default)
"""
if verbose:
with logger.force_echo():
tty.info(msg, format="g")
else:
tty.info(msg, format="g")
def overall_status(current_status: "TestStatus", substatuses: List["TestStatus"]) -> "TestStatus":
"""Determine the overall status based on the current and associated sub status values.
@@ -270,15 +253,16 @@ def __init__(self, pkg: Pb):
self.test_log_file: str
self.pkg_id: str
if pkg.test_suite:
if self.pkg.test_suite is not None:
# Running stand-alone tests
self.test_log_file = pkg.test_suite.log_file_for_spec(pkg.spec)
self.tested_file = pkg.test_suite.tested_file_for_spec(pkg.spec)
self.pkg_id = pkg.test_suite.test_pkg_id(pkg.spec)
suite = self.pkg.test_suite
self.test_log_file = suite.log_file_for_spec(pkg.spec) # type: ignore[union-attr]
self.tested_file = suite.tested_file_for_spec(pkg.spec) # type: ignore[union-attr]
self.pkg_id = suite.test_pkg_id(pkg.spec) # type: ignore[union-attr]
else:
# Running phase-time tests for a single package whose results are
# retained in the package's stage directory.
pkg.test_suite = TestSuite([pkg.spec])
self.pkg.test_suite = TestSuite([pkg.spec])
self.test_log_file = fs.join_path(pkg.stage.path, spack_install_test_log)
self.pkg_id = pkg.spec.format("{name}-{version}-{hash:7}")
@@ -286,10 +270,10 @@ def __init__(self, pkg: Pb):
self._logger = None
@property
def logger(self) -> Optional[LogType]:
def logger(self) -> Optional[log.LogType]:
"""The current logger or, if none, sets to one."""
if not self._logger:
self._logger = llnl.util.tty.log.log_output(self.test_log_file)
self._logger = log.log_output(self.test_log_file)
return self._logger
@@ -306,7 +290,7 @@ def test_logger(self, verbose: bool = False, externals: bool = False):
fs.touch(self.test_log_file) # Otherwise log_parse complains
fs.set_install_permissions(self.test_log_file)
with llnl.util.tty.log.log_output(self.test_log_file, verbose) as self._logger:
with log.log_output(self.test_log_file, verbose) as self._logger:
with self.logger.force_echo(): # type: ignore[union-attr]
tty.msg("Testing package " + colorize(r"@*g{" + self.pkg_id + r"}"))
@@ -332,6 +316,13 @@ def add_failure(self, exception: Exception, msg: str):
"""Add the failure details to the current list."""
self.test_failures.append((exception, msg))
def set_current_specs(self, base_spec: spack.spec.Spec, test_spec: spack.spec.Spec):
# Ignore the union-attr check for test_suite since the constructor of this
# class ensures it is never None.
test_suite = self.pkg.test_suite
test_suite.current_base_spec = base_spec # type: ignore[union-attr]
test_suite.current_test_spec = test_spec # type: ignore[union-attr]
def status(self, name: str, status: "TestStatus", msg: Optional[str] = None):
"""Track and print the test status for the test part name."""
part_name = f"{self.pkg.__class__.__name__}::{name}"
@@ -353,65 +344,54 @@ def status(self, name: str, status: "TestStatus", msg: Optional[str] = None):
self.test_parts[part_name] = status
self.counts[status] += 1
def phase_tests(
self, builder: spack.builder.Builder, phase_name: str, method_names: List[str]
):
"""Execute the builder's package phase-time tests.
def handle_failures(self):
"""Raise exception if any failures were collected during testing
Args:
builder: builder for package being tested
phase_name: the name of the build-time phase (e.g., ``build``, ``install``)
method_names: phase-specific callback method names
Raises:
TestFailure: test failures were collected
"""
verbose = tty.is_verbose()
fail_fast = spack.config.get("config:fail_fast", False)
if self.test_failures:
raise TestFailure(self.test_failures)
with self.test_logger(verbose=verbose, externals=False) as logger:
# Report running each of the methods in the build log
print_message(logger, f"Running {phase_name}-time tests", verbose)
builder.pkg.test_suite.current_test_spec = builder.pkg.spec
builder.pkg.test_suite.current_base_spec = builder.pkg.spec
have_tests = any(name.startswith("test_") for name in method_names)
if have_tests:
copy_test_files(builder.pkg, builder.pkg.spec)
for name in method_names:
try:
# Prefer the method in the package over the builder's.
# We need this primarily to pick up arbitrarily named test
# methods but also some build-time checks.
fn = getattr(builder.pkg, name, getattr(builder, name))
msg = f"RUN-TESTS: {phase_name}-time tests [{name}]"
print_message(logger, msg, verbose)
fn()
except AttributeError as e:
msg = f"RUN-TESTS: method not implemented [{name}]"
print_message(logger, msg, verbose)
self.add_failure(e, msg)
if fail_fast:
break
if have_tests:
print_message(logger, "Completed testing", verbose)
# Raise any collected failures here
if self.test_failures:
raise TestFailure(self.test_failures)
def stand_alone_tests(self, kwargs):
def stand_alone_tests(self, dirty=False, externals=False):
"""Run the package's stand-alone tests.
Args:
kwargs (dict): arguments to be used by the test process
"""
import spack.build_environment
spack.build_environment.start_build_process(self.pkg, test_process, kwargs)
Raises:
AttributeError: required test_requires_compiler attribute is missing
"""
pkg = self.pkg
spec = pkg.spec
pkg_spec = spec.format("{name}-{version}-{hash:7}")
if not hasattr(pkg, "test_requires_compiler"):
raise AttributeError(
f"Cannot run tests for {pkg_spec}: missing required "
"test_requires_compiler attribute"
)
if pkg.test_requires_compiler:
compilers = spack.compilers.compilers_for_spec(
spec.compiler, arch_spec=spec.architecture
)
if not compilers:
tty.error(
f"Skipping tests for package {pkg_spec}\n"
f"Package test requires missing compiler {spec.compiler}"
)
return
kwargs = {
"dirty": dirty,
"fake": False,
"context": "test",
"externals": externals,
"verbose": tty.is_verbose(),
}
spack.build_environment.start_build_process(pkg, test_process, kwargs)
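With this change callers invoke the tester directly, as `TestSuite.__call__` now does (see below):

```python
# replaces the removed PackageBase.do_test(...)
pkg.tester.stand_alone_tests(dirty=False, externals=False)
```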
def parts(self) -> int:
"""The total number of (checked) test parts."""
@@ -703,10 +683,9 @@ def process_test_parts(pkg: Pb, test_specs: List[spack.spec.Spec], verbose: bool
):
test_fn(pkg)
# If fail-fast was on, we error out above
# If we collect errors, raise them in batch here
if tester.test_failures:
raise TestFailure(tester.test_failures)
# If fail-fast was on, we errored out above
# If we collected errors, raise them in batch here
tester.handle_failures()
finally:
if tester.ran_tests():
@@ -732,12 +711,12 @@ def test_process(pkg: Pb, kwargs):
with pkg.tester.test_logger(verbose, externals) as logger:
if pkg.spec.external and not externals:
print_message(logger, "Skipped tests for external package", verbose)
log.print_message(logger, "Skipped tests for external package", verbose)
pkg.tester.status(pkg.spec.name, TestStatus.SKIPPED)
return
if not pkg.spec.installed:
print_message(logger, "Skipped not installed package", verbose)
log.print_message(logger, "Skipped not installed package", verbose)
pkg.tester.status(pkg.spec.name, TestStatus.SKIPPED)
return
@@ -764,7 +743,7 @@ def virtuals(pkg):
# hack for compilers that are not dependencies (yet)
# TODO: this all eventually goes away
c_names = ("gcc", "intel", "intel-parallel-studio", "pgi")
c_names = ("gcc", "intel", "intel-parallel-studio")
if pkg.name in c_names:
v_names.extend(["c", "cxx", "fortran"])
if pkg.spec.satisfies("llvm+clang"):
@@ -862,7 +841,7 @@ def __init__(self, specs, alias=None):
# even if they contain the same spec
self.specs = [spec.copy() for spec in specs]
self.current_test_spec = None # spec currently tested, can be virtual
self.current_base_spec = None # spec currently running do_test
self.current_base_spec = None # spec currently running tests
self.alias = alias
self._hash = None
@@ -886,6 +865,10 @@ def content_hash(self):
self._hash = b32_hash
return self._hash
def set_current_specs(self, base_spec: spack.spec.Spec, test_spec: spack.spec.Spec):
self.current_base_spec = base_spec
self.current_test_spec = test_spec
def __call__(self, *args, **kwargs):
self.write_reproducibility_data()
@@ -895,18 +878,16 @@ def __call__(self, *args, **kwargs):
externals = kwargs.get("externals", False)
for spec in self.specs:
pkg = spec.package
try:
if spec.package.test_suite:
if pkg.test_suite:
raise TestSuiteSpecError(
"Package {} cannot be run in two test suites at once".format(
spec.package.name
)
f"Package {pkg.name} cannot be run in two test suites at once"
)
# Set up the test suite to know which test is running
spec.package.test_suite = self
self.current_base_spec = spec
self.current_test_spec = spec
pkg.test_suite = self
self.set_current_specs(spec, spec)
# setup per-test directory in the stage dir
test_dir = self.test_dir_for_spec(spec)
@@ -915,7 +896,7 @@ def __call__(self, *args, **kwargs):
fs.mkdirp(test_dir)
# run the package tests
spec.package.do_test(dirty=dirty, externals=externals)
pkg.tester.stand_alone_tests(dirty=dirty, externals=externals)
# Clean up on success
if remove_directory:
@@ -949,8 +930,7 @@ def __call__(self, *args, **kwargs):
finally:
spec.package.test_suite = None
self.current_test_spec = None
self.current_base_spec = None
self.set_current_specs(None, None)
write_test_summary(self.counts)


@@ -50,6 +50,7 @@
import spack.binary_distribution as binary_distribution
import spack.build_environment
import spack.builder
import spack.config
import spack.database
import spack.deptypes as dt
@@ -212,7 +213,7 @@ def _check_last_phase(pkg: "spack.package_base.PackageBase") -> None:
Raises:
``BadInstallPhase`` if stop_before or last phase is invalid
"""
phases = pkg.builder.phases # type: ignore[attr-defined]
phases = spack.builder.create(pkg).phases # type: ignore[attr-defined]
if pkg.stop_before_phase and pkg.stop_before_phase not in phases: # type: ignore[attr-defined]
raise BadInstallPhase(pkg.name, pkg.stop_before_phase) # type: ignore[attr-defined]
@@ -412,7 +413,7 @@ def _process_external_package(pkg: "spack.package_base.PackageBase", explicit: b
tty.debug(f"{pre} already registered in DB")
record = spack.store.STORE.db.get_record(spec)
if explicit and not record.explicit:
spack.store.STORE.db.update_explicit(spec, explicit)
spack.store.STORE.db.mark(spec, "explicit", True)
except KeyError:
# If not, register it and generate the module file.
@@ -661,7 +662,7 @@ def log(pkg: "spack.package_base.PackageBase") -> None:
spack.store.STORE.layout.metadata_path(pkg.spec), "archived-files"
)
for glob_expr in pkg.builder.archive_files:
for glob_expr in spack.builder.create(pkg).archive_files:
# Check that we are trying to copy things that are
# in the stage tree (not arbitrary files)
abs_expr = os.path.realpath(glob_expr)
@@ -1507,8 +1508,8 @@ def _prepare_for_install(self, task: Task) -> None:
self._update_installed(task)
# Only update the explicit entry once for the explicit package
if task.explicit:
spack.store.STORE.db.update_explicit(task.pkg.spec, True)
if task.explicit and not rec.explicit:
spack.store.STORE.db.mark(task.pkg.spec, "explicit", True)
def _cleanup_all_tasks(self) -> None:
"""Cleanup all tasks to include releasing their locks."""
@@ -2394,7 +2395,6 @@ def _install_source(self) -> None:
fs.install_tree(pkg.stage.source_path, src_target)
def _real_install(self) -> None:
import spack.builder
pkg = self.pkg


@@ -911,13 +911,6 @@ def _main(argv=None):
# Make spack load / env activate work on macOS
restore_macos_dyld_vars()
# make spack.config aware of any command line configuration scopes
if args.config_scopes:
spack.config.COMMAND_LINE_SCOPES = args.config_scopes
# ensure options on spack command come before everything
setup_main_options(args)
# activate an environment if one was specified on the command line
env_format_error = None
if not args.no_env:
@@ -931,6 +924,12 @@ def _main(argv=None):
e.print_context()
env_format_error = e
# Push scopes from the command line last
if args.config_scopes:
spack.config._add_command_line_scopes(spack.config.CONFIG, args.config_scopes)
spack.config.CONFIG.push_scope(spack.config.InternalConfigScope("command_line"))
setup_main_options(args)
# ------------------------------------------------------------------------
# Things that require configuration should go below here
# ------------------------------------------------------------------------


@@ -18,7 +18,7 @@
import sys
import traceback
import urllib.parse
from typing import List, Optional, Union
from typing import Any, Dict, Optional, Tuple, Union
import llnl.url
import llnl.util.symlink
@@ -154,8 +154,66 @@ def push_url(self):
"""Get the valid, canonicalized fetch URL"""
return self.get_url("push")
def ensure_mirror_usable(self, direction: str = "push"):
access_pair = self._get_value("access_pair", direction)
access_token_variable = self._get_value("access_token_variable", direction)
errors = []
# Verify that credentials referencing environment variables can be resolved
if access_pair and isinstance(access_pair, dict):
if "id_variable" in access_pair and access_pair["id_variable"] not in os.environ:
errors.append(f"id_variable {access_pair['id_variable']} not set in environment")
if "secret_variable" in access_pair:
if access_pair["secret_variable"] not in os.environ:
errors.append(
f"environment variable `{access_pair['secret_variable']}` "
"(secret_variable) not set"
)
if access_token_variable:
if access_token_variable not in os.environ:
errors.append(
f"environment variable `{access_pair['access_token_variable']}` "
"(access_token_variable) not set"
)
if errors:
msg = f"invalid {direction} configuration for mirror {self.name}: "
msg += "\n ".join(errors)
raise MirrorError(msg)
def _update_connection_dict(self, current_data: dict, new_data: dict, top_level: bool):
keys = ["url", "access_pair", "access_token", "profile", "endpoint_url"]
# Only allow one to exist in the config
if "access_token" in current_data and "access_token_variable" in new_data:
current_data.pop("access_token")
elif "access_token_variable" in current_data and "access_token" in new_data:
current_data.pop("access_token_variable")
# If updating to a new access_pair that is the deprecated list, warn
warn_deprecated_access_pair = False
if "access_pair" in new_data:
warn_deprecated_access_pair = isinstance(new_data["access_pair"], list)
# If not updating the current access_pair, and it is the deprecated list, warn
elif "access_pair" in current_data:
warn_deprecated_access_pair = isinstance(current_data["access_pair"], list)
if warn_deprecated_access_pair:
tty.warn(
f"in mirror {self.name}: support for plain text secrets in config files "
"(access_pair: [id, secret]) is deprecated and will be removed in a future Spack "
"version. Use environment variables instead (access_pair: "
"{id: ..., secret_variable: ...})"
)
keys = [
"url",
"access_pair",
"access_token",
"access_token_variable",
"profile",
"endpoint_url",
]
if top_level:
keys += ["binary", "source", "signed", "autopush"]
changed = False
@@ -271,11 +329,53 @@ def get_url(self, direction: str) -> str:
return _url_or_path_to_url(url)
def get_access_token(self, direction: str) -> Optional[str]:
return self._get_value("access_token", direction)
def get_credentials(self, direction: str) -> Dict[str, Any]:
"""Get the mirror credentials from the mirror config
def get_access_pair(self, direction: str) -> Optional[List]:
return self._get_value("access_pair", direction)
Args:
direction: fetch or push mirror config
Returns:
Dictionary from credential type string to value
Credential Type Map:
access_token -> str
access_pair -> tuple(str,str)
profile -> str
"""
creddict: Dict[str, Any] = {}
access_token = self.get_access_token(direction)
if access_token:
creddict["access_token"] = access_token
access_pair = self.get_access_pair(direction)
if access_pair:
creddict.update({"access_pair": access_pair})
profile = self.get_profile(direction)
if profile:
creddict["profile"] = profile
return creddict
def get_access_token(self, direction: str) -> Optional[str]:
tok = self._get_value("access_token_variable", direction)
if tok:
return os.environ.get(tok)
else:
return self._get_value("access_token", direction)
return None
def get_access_pair(self, direction: str) -> Optional[Tuple[str, str]]:
pair = self._get_value("access_pair", direction)
if isinstance(pair, (tuple, list)) and len(pair) == 2:
return (pair[0], pair[1]) if all(pair) else None
elif isinstance(pair, dict):
id_ = os.environ.get(pair["id_variable"]) if "id_variable" in pair else pair["id"]
secret = os.environ.get(pair["secret_variable"])
return (id_, secret) if id_ and secret else None
else:
return None
def get_profile(self, direction: str) -> Optional[str]:
return self._get_value("profile", direction)
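A hedged sketch of resolving credentials through the new API (the mirror object and environment variable are hypothetical):

```python
import os

os.environ["MY_MIRROR_SECRET"] = "s3cr3t"  # typically set by CI
creds = mirror.get_credentials("push")
# -> {"access_pair": ("my-id", "s3cr3t")} when the config contains
#    access_pair: {id: my-id, secret_variable: MY_MIRROR_SECRET}
```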
@@ -756,7 +856,7 @@ def create_mirror_from_package_object(pkg_obj, mirror_cache, mirror_stats):
def require_mirror_name(mirror_name):
"""Find a mirror by name and raise if it does not exist"""
mirror = spack.mirror.MirrorCollection().get(mirror_name)
mirror = MirrorCollection().get(mirror_name)
if not mirror:
raise ValueError(f'no mirror named "{mirror_name}"')
return mirror


@@ -10,7 +10,7 @@
import llnl.util.filesystem
import spack.builder
import spack.phase_callbacks
def filter_compiler_wrappers(*files, **kwargs):
@@ -111,4 +111,4 @@ def _filter_compiler_wrappers_impl(pkg_or_builder):
if pkg.compiler.name == "nag":
x.filter("-Wl,--enable-new-dtags", "", **filter_kwargs)
spack.builder.run_after(after)(_filter_compiler_wrappers_impl)
spack.phase_callbacks.run_after(after)(_filter_compiler_wrappers_impl)


@@ -39,7 +39,7 @@
import llnl.util.filesystem
import llnl.util.tty as tty
from llnl.util.lang import dedupe, memoized
from llnl.util.lang import Singleton, dedupe, memoized
import spack.build_environment
import spack.config
@@ -246,7 +246,7 @@ def _generate_upstream_module_index():
return UpstreamModuleIndex(spack.store.STORE.db, module_indices)
upstream_module_index = llnl.util.lang.Singleton(_generate_upstream_module_index)
upstream_module_index = Singleton(_generate_upstream_module_index)
ModuleIndexEntry = collections.namedtuple("ModuleIndexEntry", ["path", "use_name"])


@@ -377,9 +377,10 @@ def credentials_from_mirrors(
# Prefer push credentials over fetch. Unlikely that those are different
# but our config format allows it.
for direction in ("push", "fetch"):
pair = mirror.get_access_pair(direction)
if pair is None:
pair = mirror.get_credentials(direction).get("access_pair")
if not pair:
continue
url = mirror.get_url(direction)
if not url.startswith("oci://"):
continue


@@ -74,7 +74,7 @@
from spack.build_systems.sourceware import SourcewarePackage
from spack.build_systems.waf import WafPackage
from spack.build_systems.xorg import XorgPackage
from spack.builder import run_after, run_before
from spack.builder import BaseBuilder
from spack.config import determine_number_of_jobs
from spack.deptypes import ALL_TYPES as all_deptypes
from spack.directives import *
@@ -100,15 +100,11 @@
on_package_attributes,
)
from spack.package_completions import *
from spack.phase_callbacks import run_after, run_before
from spack.spec import InvalidSpecDetected, Spec
from spack.util.executable import *
from spack.util.filesystem import file_command, fix_darwin_install_name, mime_type
from spack.variant import (
any_combination_of,
auto_or_any_combination_of,
conditional,
disjoint_sets,
)
from spack.variant import any_combination_of, auto_or_any_combination_of, disjoint_sets
from spack.version import Version, ver
# These are just here for editor support; they will be replaced when the build env


@@ -32,30 +32,28 @@
from llnl.util.lang import classproperty, memoized
from llnl.util.link_tree import LinkTree
import spack.build_environment
import spack.builder
import spack.compilers
import spack.config
import spack.dependency
import spack.deptypes as dt
import spack.directives
import spack.directives_meta
import spack.error
import spack.fetch_strategy as fs
import spack.hooks
import spack.mirror
import spack.multimethod
import spack.patch
import spack.phase_callbacks
import spack.repo
import spack.spec
import spack.store
import spack.url
import spack.util.environment
import spack.util.executable
import spack.util.path
import spack.util.web
import spack.variant
from spack.error import InstallError, NoURLError, PackageError
from spack.filesystem_view import YamlFilesystemView
from spack.install_test import PackageTest, TestSuite
from spack.solver.version_order import concretization_version_order
from spack.stage import DevelopStage, ResourceStage, Stage, StageComposite, compute_stage_name
from spack.util.package_hash import package_hash
@@ -301,9 +299,9 @@ def determine_variants(cls, objs, version_str):
class PackageMeta(
spack.builder.PhaseCallbacksMeta,
spack.phase_callbacks.PhaseCallbacksMeta,
DetectablePackageMeta,
spack.directives.DirectiveMeta,
spack.directives_meta.DirectiveMeta,
spack.multimethod.MultiMethodMeta,
):
"""
@@ -455,7 +453,7 @@ def _names(when_indexed_dictionary: WhenDict) -> List[str]:
return sorted(all_names)
WhenVariantList = List[Tuple["spack.spec.Spec", "spack.variant.Variant"]]
WhenVariantList = List[Tuple[spack.spec.Spec, spack.variant.Variant]]
def _remove_overridden_vdefs(variant_defs: WhenVariantList) -> None:
@@ -494,41 +492,14 @@ class Hipblas:
i += 1
class RedistributionMixin:
"""Logic for determining whether a Package is source/binary
redistributable.
"""
#: Store whether a given Spec source/binary should not be
#: redistributed.
disable_redistribute: Dict["spack.spec.Spec", "spack.directives.DisableRedistribute"]
# Source redistribution must be determined before concretization
# (because source mirrors work with un-concretized Specs).
@classmethod
def redistribute_source(cls, spec):
"""Whether it should be possible to add the source of this
package to a Spack mirror.
"""
for when_spec, disable_redistribute in cls.disable_redistribute.items():
if disable_redistribute.source and spec.satisfies(when_spec):
return False
return True
@property
def redistribute_binary(self):
"""Whether it should be possible to create a binary out of an
installed instance of this package.
"""
for when_spec, disable_redistribute in self.__class__.disable_redistribute.items():
if disable_redistribute.binary and self.spec.satisfies(when_spec):
return False
return True
#: Store whether a given Spec source/binary should not be redistributed.
class DisableRedistribute:
def __init__(self, source, binary):
self.source = source
self.binary = binary
class PackageBase(WindowsRPath, PackageViewMixin, RedistributionMixin, metaclass=PackageMeta):
class PackageBase(WindowsRPath, PackageViewMixin, metaclass=PackageMeta):
"""This is the superclass for all spack packages.
***The Package class***
@@ -614,16 +585,20 @@ class PackageBase(WindowsRPath, PackageViewMixin, RedistributionMixin, metaclass
# Declare versions dictionary as placeholder for values.
# This allows analysis tools to correctly interpret the class attributes.
versions: dict
dependencies: Dict["spack.spec.Spec", Dict[str, "spack.dependency.Dependency"]]
conflicts: Dict["spack.spec.Spec", List[Tuple["spack.spec.Spec", Optional[str]]]]
dependencies: Dict[spack.spec.Spec, Dict[str, spack.dependency.Dependency]]
conflicts: Dict[spack.spec.Spec, List[Tuple[spack.spec.Spec, Optional[str]]]]
requirements: Dict[
"spack.spec.Spec", List[Tuple[Tuple["spack.spec.Spec", ...], str, Optional[str]]]
spack.spec.Spec, List[Tuple[Tuple[spack.spec.Spec, ...], str, Optional[str]]]
]
provided: Dict["spack.spec.Spec", Set["spack.spec.Spec"]]
provided_together: Dict["spack.spec.Spec", List[Set[str]]]
patches: Dict["spack.spec.Spec", List["spack.patch.Patch"]]
variants: Dict["spack.spec.Spec", Dict[str, "spack.variant.Variant"]]
languages: Dict["spack.spec.Spec", Set[str]]
provided: Dict[spack.spec.Spec, Set[spack.spec.Spec]]
provided_together: Dict[spack.spec.Spec, List[Set[str]]]
patches: Dict[spack.spec.Spec, List[spack.patch.Patch]]
variants: Dict[spack.spec.Spec, Dict[str, spack.variant.Variant]]
languages: Dict[spack.spec.Spec, Set[str]]
splice_specs: Dict[spack.spec.Spec, Tuple[spack.spec.Spec, Union[None, str, List[str]]]]
#: Store whether a given Spec source/binary should not be redistributed.
disable_redistribute: Dict[spack.spec.Spec, DisableRedistribute]
#: By default, packages are not virtual
#: Virtual packages override this attribute
@@ -737,12 +712,12 @@ class PackageBase(WindowsRPath, PackageViewMixin, RedistributionMixin, metaclass
#: are available to build a custom test code.
test_requires_compiler: bool = False
#: TestSuite instance used to manage stand-alone tests for 1+ specs.
test_suite: Optional["TestSuite"] = None
#: The spec's TestSuite instance, which is used to manage its testing.
test_suite: Optional[Any] = None
def __init__(self, spec):
# this determines how the package should be built.
self.spec: "spack.spec.Spec" = spec
self.spec: spack.spec.Spec = spec
# Allow custom staging paths for packages
self.path = None
@@ -760,7 +735,7 @@ def __init__(self, spec):
# init internal variables
self._stage: Optional[StageComposite] = None
self._fetcher = None
self._tester: Optional["PackageTest"] = None
self._tester: Optional[Any] = None
# Set up timing variables
self._fetch_time = 0.0
@@ -810,9 +785,7 @@ def variant_definitions(cls, name: str) -> WhenVariantList:
return defs
@classmethod
def variant_items(
cls,
) -> Iterable[Tuple["spack.spec.Spec", Dict[str, "spack.variant.Variant"]]]:
def variant_items(cls) -> Iterable[Tuple[spack.spec.Spec, Dict[str, spack.variant.Variant]]]:
"""Iterate over ``cls.variants.items()`` with overridden definitions removed."""
# Note: This is quadratic in the average number of variant definitions per name.
# That is likely close to linear in practice, as there are few variants with
@@ -830,7 +803,7 @@ def variant_items(
if filtered_variants_by_name:
yield when, filtered_variants_by_name
def get_variant(self, name: str) -> "spack.variant.Variant":
def get_variant(self, name: str) -> spack.variant.Variant:
"""Get the highest precedence variant definition matching this package's spec.
Arguments:
@@ -1005,6 +978,26 @@ def global_license_file(self):
self.global_license_dir, self.name, os.path.basename(self.license_files[0])
)
# Source redistribution must be determined before concretization (because source mirrors work
# with abstract specs).
@classmethod
def redistribute_source(cls, spec):
"""Whether it should be possible to add the source of this
package to a Spack mirror."""
for when_spec, disable_redistribute in cls.disable_redistribute.items():
if disable_redistribute.source and spec.satisfies(when_spec):
return False
return True
@property
def redistribute_binary(self):
"""Whether it should be possible to create a binary out of an installed instance of this
package."""
for when_spec, disable_redistribute in self.disable_redistribute.items():
if disable_redistribute.binary and self.spec.satisfies(when_spec):
return False
return True
# NOTE: return type should be Optional[Literal['all', 'specific', 'none']] in
# Python 3.8+, but we still support 3.6.
@property
@@ -1017,7 +1010,7 @@ def keep_werror(self) -> Optional[str]:
* ``"none"``: filter out all ``-Werror*`` flags.
* ``None``: respect the user's configuration (``"none"`` by default).
"""
if self.spec.satisfies("%nvhpc@:23.3") or self.spec.satisfies("%pgi"):
if self.spec.satisfies("%nvhpc@:23.3"):
# Filtering works by replacing -Werror with -Wno-error, but older nvhpc and
# PGI do not understand -Wno-error, so we disable filtering.
return "all"
@@ -1354,11 +1347,13 @@ def archive_install_test_log(self):
@property
def tester(self):
import spack.install_test
if not self.spec.versions.concrete:
raise ValueError("Cannot retrieve tester for package without concrete version.")
if not self._tester:
self._tester = PackageTest(self)
self._tester = spack.install_test.PackageTest(self)
return self._tester
@property
@@ -1948,29 +1943,6 @@ def _resource_stage(self, resource):
resource_stage_folder = "-".join(pieces)
return resource_stage_folder
def do_test(self, dirty=False, externals=False):
if self.test_requires_compiler:
compilers = spack.compilers.compilers_for_spec(
self.spec.compiler, arch_spec=self.spec.architecture
)
if not compilers:
tty.error(
"Skipping tests for package %s\n"
% self.spec.format("{name}-{version}-{hash:7}")
+ "Package test requires missing compiler %s" % self.spec.compiler
)
return
kwargs = {
"dirty": dirty,
"fake": False,
"context": "test",
"externals": externals,
"verbose": tty.is_verbose(),
}
self.tester.stand_alone_tests(kwargs)
def unit_test_check(self):
"""Hook for unit tests to assert things about package internals.
@@ -2015,72 +1987,58 @@ def build_system_flags(
"""
return None, None, flags
def setup_run_environment(self, env):
def setup_run_environment(self, env: spack.util.environment.EnvironmentModifications) -> None:
"""Sets up the run environment for a package.
Args:
env (spack.util.environment.EnvironmentModifications): environment
modifications to be applied when the package is run. Package authors
env: environment modifications to be applied when the package is run. Package authors
can call methods on it to alter the run environment.
"""
pass
def setup_dependent_run_environment(self, env, dependent_spec):
def setup_dependent_run_environment(
self, env: spack.util.environment.EnvironmentModifications, dependent_spec: spack.spec.Spec
) -> None:
"""Sets up the run environment of packages that depend on this one.
This is similar to ``setup_run_environment``, but it is used to
modify the run environments of packages that *depend* on this one.
This is similar to ``setup_run_environment``, but it is used to modify the run environment
of a package that *depends* on this one.
This gives packages like Python and others that follow the extension
model a way to implement common environment or run-time settings
for dependencies.
This gives packages like Python and others that follow the extension model a way to
implement common environment or run-time settings for dependencies.
Args:
env (spack.util.environment.EnvironmentModifications): environment
modifications to be applied when the dependent package is run.
Package authors can call methods on it to alter the build environment.
env: environment modifications to be applied when the dependent package is run. Package
authors can call methods on it to alter the build environment.
dependent_spec (spack.spec.Spec): The spec of the dependent package
about to be run. This allows the extendee (self) to query
the dependent's state. Note that *this* package's spec is
dependent_spec: The spec of the dependent package about to be run. This allows the
extendee (self) to query the dependent's state. Note that *this* package's spec is
available as ``self.spec``
"""
pass
def setup_dependent_package(self, module, dependent_spec):
"""Set up Python module-scope variables for dependent packages.
def setup_dependent_package(self, module, dependent_spec: spack.spec.Spec) -> None:
"""Set up module-scope global variables for dependent packages.
Called before the install() method of dependents.
Default implementation does nothing, but this can be
overridden by an extendable package to set up the module of
its extensions. This is useful if there are some common steps
to installing all extensions for a certain package.
This function is called when setting up the build and run environments of a DAG.
Examples:
1. Extensions often need to invoke the ``python`` interpreter
from the Python installation being extended. This routine
can put a ``python()`` Executable object in the module scope
for the extension package to simplify extension installs.
1. Extensions often need to invoke the ``python`` interpreter from the Python installation
being extended. This routine can put a ``python`` Executable as a global in the module
scope for the extension package to simplify extension installs.
2. MPI compilers could set some variables in the dependent's
scope that point to ``mpicc``, ``mpicxx``, etc., allowing
them to be called by common name regardless of which MPI is used.
3. BLAS/LAPACK implementations can set some variables
indicating the path to their libraries, since these
paths differ by BLAS/LAPACK implementation.
2. MPI compilers could set some variables in the dependent's scope that point to ``mpicc``,
``mpicxx``, etc., allowing them to be called by common name regardless of which MPI is
used.
Args:
module (spack.package_base.PackageBase.module): The Python ``module``
object of the dependent package. Packages can use this to set
module-scope variables for the dependent to use.
module: The Python ``module`` object of the dependent package. Packages can use this to
set module-scope variables for the dependent to use.
dependent_spec (spack.spec.Spec): The spec of the dependent package
about to be built. This allows the extendee (self) to
query the dependent's state. Note that *this*
package's spec is available as ``self.spec``.
dependent_spec: The spec of the dependent package about to be built. This allows the
extendee (self) to query the dependent's state. Note that *this* package's spec is
available as ``self.spec``.
"""
pass
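As a concrete illustration of example 1 above, an extendee's package class can implement this hook roughly as follows (a sketch, not the actual `python` package code):

```python
# Hedged sketch of the hook described above, as it would appear inside the
# extendee's package class; Executable is spack.util.executable.Executable.
from spack.util.executable import Executable

def setup_dependent_package(self, module, dependent_spec):
    # Give extension packages a module-scope `python` they can call directly.
    module.python = Executable(self.prefix.bin.python)
```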
@@ -2107,7 +2065,7 @@ def flag_handler(self, var: FLAG_HANDLER_TYPE) -> None:
# arguments. This is implemented for build system classes where
# appropriate and will otherwise raise a NotImplementedError.
def flags_to_build_system_args(self, flags):
def flags_to_build_system_args(self, flags: Dict[str, List[str]]) -> None:
# Takes flags as a dict mapping flag names to lists of values
if any(v for v in flags.values()):
msg = "The {0} build system".format(self.__class__.__name__)
@@ -2310,10 +2268,6 @@ def rpath_args(self):
"""
return " ".join("-Wl,-rpath,%s" % p for p in self.rpath)
@property
def builder(self):
return spack.builder.create(self)
inject_flags = PackageBase.inject_flags
env_flags = PackageBase.env_flags

View File

@@ -0,0 +1,105 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import collections
import llnl.util.lang as lang
#: An object of this kind is a shared global state used to collect callbacks during
#: class definition time, and is flushed when the class object is created at the end
#: of the class definition
#:
#: Args:
#: attribute_name (str): name of the attribute that will be attached to the builder
#: callbacks (list): container used to temporarily aggregate the callbacks
CallbackTemporaryStage = collections.namedtuple(
"CallbackTemporaryStage", ["attribute_name", "callbacks"]
)
#: Shared global state to aggregate "@run_before" callbacks
_RUN_BEFORE = CallbackTemporaryStage(attribute_name="run_before_callbacks", callbacks=[])
#: Shared global state to aggregate "@run_after" callbacks
_RUN_AFTER = CallbackTemporaryStage(attribute_name="run_after_callbacks", callbacks=[])
class PhaseCallbacksMeta(type):
"""Permit to register arbitrary functions during class definition and run them
later, before or after a given install phase.
Each method decorated with ``run_before`` or ``run_after`` gets temporarily
stored in a global shared state when a class being defined is parsed by the Python
interpreter. At class definition time that temporary storage gets flushed and a list
of callbacks is attached to the class being defined.
"""
def __new__(mcs, name, bases, attr_dict):
for temporary_stage in (_RUN_BEFORE, _RUN_AFTER):
staged_callbacks = temporary_stage.callbacks
# Here we have an adapter from an old-style package. This means there is no
# hierarchy of builders, and every callback that had to be combined between
# *Package and *Builder has been combined already by _PackageAdapterMeta
if name == "Adapter":
continue
# If we are here we have callbacks. To get a complete list, we accumulate all the
# callbacks from base classes, we deduplicate them, then prepend what we have
# registered here.
#
# The order should be:
# 1. Callbacks are registered in order within the same class
# 2. Callbacks defined in derived classes precede those defined in base
# classes
callbacks_from_base = []
for base in bases:
current_callbacks = getattr(base, temporary_stage.attribute_name, None)
if not current_callbacks:
continue
callbacks_from_base.extend(current_callbacks)
callbacks_from_base = list(lang.dedupe(callbacks_from_base))
# Set the callbacks in this class and flush the temporary stage
attr_dict[temporary_stage.attribute_name] = staged_callbacks[:] + callbacks_from_base
del temporary_stage.callbacks[:]
return super(PhaseCallbacksMeta, mcs).__new__(mcs, name, bases, attr_dict)
@staticmethod
def run_after(phase, when=None):
"""Decorator to register a function for running after a given phase.
Args:
phase (str): phase after which the function must run.
when (str): condition under which the function is run (if None, it is always run).
"""
def _decorator(fn):
key = (phase, when)
item = (key, fn)
_RUN_AFTER.callbacks.append(item)
return fn
return _decorator
@staticmethod
def run_before(phase, when=None):
"""Decorator to register a function for running before a given phase.
Args:
phase (str): phase before which the function must run.
when (str): condition under which the function is run (if None, it is always run).
"""
def _decorator(fn):
key = (phase, when)
item = (key, fn)
_RUN_BEFORE.callbacks.append(item)
return fn
return _decorator
# Export these names as standalone to be used in packages
run_after = PhaseCallbacksMeta.run_after
run_before = PhaseCallbacksMeta.run_before
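For reference, a builder (or old-style package) uses these exported decorators roughly like this; the phase name must be one of the class's ``phases`` (a sketch, class names illustrative):

```python
# Minimal sketch using the decorators exported above.
class MyBuilder(Builder):  # illustrative base class
    phases = ("install",)

    @run_before("install")
    def sanity_check(self):
        ...  # always runs just before the install phase

    @run_after("install", when="+docs")
    def install_docs(self):
        ...  # runs after install, only when the spec has +docs
```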

View File

@@ -39,9 +39,9 @@
import spack.error
import spack.patch
import spack.provider_index
import spack.repo
import spack.spec
import spack.tag
import spack.util.file_cache
import spack.util.git
import spack.util.naming as nm
import spack.util.path
@@ -216,9 +216,9 @@ def compute_loader(self, fullname):
def packages_path():
"""Get the test repo if it is active, otherwise the builtin repo."""
try:
return spack.repo.PATH.get_repo("builtin.mock").packages_path
except spack.repo.UnknownNamespaceError:
return spack.repo.PATH.get_repo("builtin").packages_path
return PATH.get_repo("builtin.mock").packages_path
except UnknownNamespaceError:
return PATH.get_repo("builtin").packages_path
class GitExe:
@@ -314,7 +314,7 @@ def add_package_to_git_stage(packages):
git = GitExe()
for pkg_name in packages:
filename = spack.repo.PATH.filename_for_package_name(pkg_name)
filename = PATH.filename_for_package_name(pkg_name)
if not os.path.isfile(filename):
tty.die("No such package: %s. Path does not exist:" % pkg_name, filename)
@@ -590,7 +590,7 @@ def __init__(
self,
package_checker: FastPackageChecker,
namespace: str,
cache: "spack.caches.FileCacheType",
cache: spack.util.file_cache.FileCache,
):
self.checker = package_checker
self.packages_path = self.checker.packages_path
@@ -683,7 +683,7 @@ class RepoPath:
def __init__(
self,
*repos: Union[str, "Repo"],
cache: Optional["spack.caches.FileCacheType"],
cache: Optional[spack.util.file_cache.FileCache],
overrides: Optional[Dict[str, Any]] = None,
) -> None:
self.repos: List[Repo] = []
@@ -965,7 +965,7 @@ def __init__(
self,
root: str,
*,
cache: "spack.caches.FileCacheType",
cache: spack.util.file_cache.FileCache,
overrides: Optional[Dict[str, Any]] = None,
) -> None:
"""Instantiate a package repository from a filesystem path.
@@ -1440,9 +1440,7 @@ def _path(configuration=None):
return create(configuration=configuration)
def create(
configuration: Union["spack.config.Configuration", llnl.util.lang.Singleton]
) -> RepoPath:
def create(configuration: spack.config.Configuration) -> RepoPath:
"""Create a RepoPath from a configuration object.
Args:
@@ -1465,7 +1463,7 @@ def create(
#: Singleton repo path instance
PATH: Union[RepoPath, llnl.util.lang.Singleton] = llnl.util.lang.Singleton(_path)
PATH: RepoPath = llnl.util.lang.Singleton(_path) # type: ignore
# Add the finder to sys.meta_path
REPOS_FINDER = ReposFinder()
@@ -1585,7 +1583,7 @@ def __init__(self, name, repo=None):
long_msg = "Use 'spack create' to create a new package."
if not repo:
repo = spack.repo.PATH
repo = PATH
# We need to compare the base package name
pkg_name = name.rsplit(".", 1)[-1]

View File

@@ -204,7 +204,7 @@ def extract_package_from_signature(self, instance, *args, **kwargs):
class TestInfoCollector(InfoCollector):
"""Collect information for the PackageBase.do_test method.
"""Collect information for the PackageTest.stand_alone_tests method.
Args:
specs: specs whose install information will be recorded
@@ -214,7 +214,7 @@ class TestInfoCollector(InfoCollector):
dir: str
def __init__(self, specs: List[spack.spec.Spec], record_directory: str):
super().__init__(spack.package_base.PackageBase, "do_test", specs)
super().__init__(spack.install_test.PackageTest, "stand_alone_tests", specs)
self.dir = record_directory
def on_success(self, pkg, kwargs, package_record):
@@ -233,7 +233,7 @@ def fetch_log(self, pkg: spack.package_base.PackageBase):
return f"Cannot open log for {pkg.spec.cshort_spec}"
def extract_package_from_signature(self, instance, *args, **kwargs):
return instance
return instance.pkg
@contextlib.contextmanager

View File

@@ -33,8 +33,14 @@
"properties": {
"type": {
"type": "string",
"enum": ["local", "buildcache", "external"],
"enum": [
"local",
"buildcache",
"external",
"environment",
],
},
"path": {"type": "string"},
"include": LIST_OF_SPECS,
"exclude": LIST_OF_SPECS,
},
@@ -72,7 +78,8 @@
"transitive": {"type": "boolean", "default": False},
},
},
}
},
"automatic": {"type": "boolean"},
},
},
"duplicates": {

View File

@@ -19,6 +19,8 @@
#: Top level key in a manifest file
TOP_LEVEL_KEY = "spack"
include_concrete = {"type": "array", "default": [], "items": {"type": "string"}}
properties: Dict[str, Any] = {
"spack": {
"type": "object",
@@ -31,7 +33,7 @@
{
"include": {"type": "array", "default": [], "items": {"type": "string"}},
"specs": spec_list_schema,
"include_concrete": {"type": "array", "default": [], "items": {"type": "string"}},
"include_concrete": include_concrete,
},
),
}
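For reference, an environment manifest matching this schema parses to roughly the following (path illustrative):

```python
# Illustrative data validating against the schema fragment above.
manifest = {
    "spack": {
        "specs": ["zlib", "hdf5+mpi"],
        # paths to environments whose concrete specs should be included
        "include_concrete": ["/path/to/upstream/environment"],
    }
}
```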

View File

@@ -15,14 +15,42 @@
"url": {"type": "string"},
# todo: replace this with named keys "username" / "password" or "id" / "secret"
"access_pair": {
"type": "array",
"items": {"type": ["string", "null"], "minItems": 2, "maxItems": 2},
"oneOf": [
{
"type": "array",
"items": {"minItems": 2, "maxItems": 2, "type": ["string", "null"]},
}, # deprecated
{
"type": "object",
"required": ["secret_variable"],
# Only allow id or id_variable to be set, not both
"oneOf": [{"required": ["id"]}, {"required": ["id_variable"]}],
"properties": {
"id": {"type": "string"},
"id_variable": {"type": "string"},
"secret_variable": {"type": "string"},
},
},
]
},
"access_token": {"type": ["string", "null"]},
"profile": {"type": ["string", "null"]},
"endpoint_url": {"type": ["string", "null"]},
"access_token": {"type": ["string", "null"]}, # deprecated
"access_token_variable": {"type": ["string", "null"]},
}
connection_ext = {
"deprecatedProperties": [
{
"names": ["access_token"],
"message": "Use of plain text `access_token` in mirror config is deprecated, use "
"environment variables instead (access_token_variable)",
"error": False,
}
]
}
#: Mirror connection inside pull/push keys
fetch_and_push = {
"anyOf": [
@@ -31,6 +59,7 @@
"type": "object",
"additionalProperties": False,
"properties": {**connection}, # type: ignore
**connection_ext, # type: ignore
},
]
}
@@ -49,6 +78,7 @@
"autopush": {"type": "boolean"},
**connection, # type: ignore
},
**connection_ext, # type: ignore
}
#: Properties for inclusion in other schemas
@@ -70,3 +100,28 @@
"additionalProperties": False,
"properties": properties,
}
def update(data):
import jsonschema
errors = []
def check_access_pair(name, section):
if not section or not isinstance(section, dict):
return
if "access_token" in section and "access_token_variable" in section:
errors.append(
f'{name}: mirror credential "access_token" conflicts with "access_token_variable"'
)
# Check all of the sections
for name, section in data.items():
check_access_pair(name, section)
if isinstance(section, dict):
check_access_pair(name, section.get("fetch"))
check_access_pair(name, section.get("push"))
if errors:
raise jsonschema.ValidationError("\n".join(errors))
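Putting the pieces together, a mirror entry using the new variable-based credentials parses to roughly the following (bucket and variable names illustrative); note that `update` above rejects entries that set both `access_token` and `access_token_variable`:

```python
# Illustrative mirror data accepted by the schema above.
mirrors = {
    "my-mirror": {
        "url": "s3://example-bucket/mirror",
        "access_pair": {
            "id_variable": "AWS_ACCESS_KEY_ID",          # env var with the key id
            "secret_variable": "AWS_SECRET_ACCESS_KEY",  # secret never in plain text
        },
    }
}
```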

View File

@@ -27,7 +27,6 @@
import spack
import spack.binary_distribution
import spack.bootstrap.core
import spack.compilers
import spack.concretize
import spack.config
@@ -53,6 +52,7 @@
from .core import (
AspFunction,
AspVar,
NodeArgument,
ast_sym,
ast_type,
@@ -515,6 +515,8 @@ def _compute_specs_from_answer_set(self):
best = min(self.answers)
opt, _, answer = best
for input_spec in self.abstract_specs:
# The specs must be unified to get here, so it is safe to associate any satisfying spec
# with the input. Multiple inputs may be matched to the same concrete spec
node = SpecBuilder.make_node(pkg=input_spec.name)
if input_spec.virtual:
providers = [
@@ -523,12 +525,14 @@ def _compute_specs_from_answer_set(self):
node = SpecBuilder.make_node(pkg=providers[0])
candidate = answer.get(node)
if candidate and candidate.build_spec.satisfies(input_spec):
if not candidate.satisfies(input_spec):
tty.warn(
"explicit splice configuration has caused the concretized spec"
f" {candidate} not to satisfy the input spec {input_spec}"
)
if candidate and candidate.satisfies(input_spec):
self._concrete_specs.append(answer[node])
self._concrete_specs_by_input[input_spec] = answer[node]
elif candidate and candidate.build_spec.satisfies(input_spec):
tty.warn(
"explicit splice configuration has caused the concretized spec"
f" {candidate} not to satisfy the input spec {input_spec}"
)
self._concrete_specs.append(answer[node])
self._concrete_specs_by_input[input_spec] = answer[node]
else:
@@ -814,7 +818,7 @@ def solve(self, setup, specs, reuse=None, output=None, control=None, allow_depre
solve, and the internal statistics from clingo.
"""
# avoid circular import
import spack.bootstrap
import spack.bootstrap.core
output = output or DEFAULT_OUTPUT_CONFIGURATION
timer = spack.util.timer.Timer()
@@ -853,6 +857,8 @@ def solve(self, setup, specs, reuse=None, output=None, control=None, allow_depre
self.control.load(os.path.join(parent_dir, "libc_compatibility.lp"))
else:
self.control.load(os.path.join(parent_dir, "os_compatibility.lp"))
if setup.enable_splicing:
self.control.load(os.path.join(parent_dir, "splices.lp"))
timer.stop("load")
@@ -887,6 +893,7 @@ def on_model(model):
result.satisfiable = solve_result.satisfiable
if result.satisfiable:
timer.start("construct_specs")
# get the best model
builder = SpecBuilder(specs, hash_lookup=setup.reusable_and_possible)
min_cost, best_model = min(models)
@@ -911,7 +918,8 @@ def on_model(model):
# record the possible dependencies in the solve
result.possible_dependencies = setup.pkgs
timer.stop("construct_specs")
timer.stop()
elif cores:
result.control = self.control
result.cores.extend(cores)
@@ -1163,6 +1171,9 @@ def __init__(self, tests: bool = False):
# list of unique libc specs targeted by compilers (or an educated guess if no compiler)
self.libcs: List[spack.spec.Spec] = []
# If true, we have to load the code for synthesizing splices
self.enable_splicing: bool = spack.config.CONFIG.get("concretizer:splice:automatic")
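The configuration that enables this (as parsed from YAML) would look roughly like:

```python
# Illustrative parsed configuration matching the lookup above.
concretizer = {"splice": {"automatic": True}}
```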
def pkg_version_rules(self, pkg):
"""Output declared versions of a package.
@@ -1333,6 +1344,10 @@ def pkg_rules(self, pkg, tests):
# dependencies
self.package_dependencies_rules(pkg)
# splices
if self.enable_splicing:
self.package_splice_rules(pkg)
# virtual preferences
self.virtual_preferences(
pkg.name,
@@ -1433,14 +1448,13 @@ def define_variant(
for value in sorted(values):
pkg_fact(fn.variant_possible_value(vid, value))
# when=True means unconditional, so no need for conditional values
if getattr(value, "when", True) is True:
# we're done here for unconditional values
if not isinstance(value, vt.ConditionalValue):
continue
# now we have to handle conditional values
quoted_value = spack.parser.quote_if_needed(str(value))
vstring = f"{name}={quoted_value}"
variant_has_value = spack.spec.Spec(vstring)
# make a spec indicating whether the variant has this conditional value
variant_has_value = spack.spec.Spec()
variant_has_value.variants[name] = spack.variant.AbstractVariant(name, value.value)
if value.when:
# the conditional value is always "possible", but it imposes its when condition as
@@ -1451,10 +1465,12 @@ def define_variant(
imposed_spec=value.when,
required_name=pkg.name,
imposed_name=pkg.name,
msg=f"{pkg.name} variant {name} has value '{quoted_value}' when {value.when}",
msg=f"{pkg.name} variant {name} has value '{value.value}' when {value.when}",
)
else:
# We know the value is never allowed statically (when was false), but we can't just
vstring = f"{name}='{value.value}'"
# We know the value is never allowed statically (when was None), but we can't just
# ignore it b/c it could come in as a possible value and we need a good error msg.
# So, it's a conflict -- if the value is somehow used, it'll trigger an error.
trigger_id = self.condition(
@@ -1670,6 +1686,94 @@ def dependency_holds(input_spec, requirements):
self.gen.newline()
def _gen_match_variant_splice_constraints(
self,
pkg,
cond_spec: "spack.spec.Spec",
splice_spec: "spack.spec.Spec",
hash_asp_var: "AspVar",
splice_node,
match_variants: List[str],
):
# If there are no variants to match, no constraints are needed
variant_constraints = []
for i, variant_name in enumerate(match_variants):
vari_defs = pkg.variant_definitions(variant_name)
# the spliceable config of the package always includes the variant
if vari_defs != [] and any(cond_spec.satisfies(s) for (s, _) in vari_defs):
variant = vari_defs[0][1]
if variant.multi:
continue # cannot automatically match multi-valued variants
value_var = AspVar(f"VariValue{i}")
attr_constraint = fn.attr("variant_value", splice_node, variant_name, value_var)
hash_attr_constraint = fn.hash_attr(
hash_asp_var, "variant_value", splice_spec.name, variant_name, value_var
)
variant_constraints.append(attr_constraint)
variant_constraints.append(hash_attr_constraint)
return variant_constraints
def package_splice_rules(self, pkg):
self.gen.h2("Splice rules")
for i, (cond, (spec_to_splice, match_variants)) in enumerate(
sorted(pkg.splice_specs.items())
):
with named_spec(cond, pkg.name):
self.version_constraints.add((cond.name, cond.versions))
self.version_constraints.add((spec_to_splice.name, spec_to_splice.versions))
hash_var = AspVar("Hash")
splice_node = fn.node(AspVar("NID"), cond.name)
when_spec_attrs = [
fn.attr(c.args[0], splice_node, *(c.args[2:]))
for c in self.spec_clauses(cond, body=True, required_from=None)
if c.args[0] != "node"
]
splice_spec_hash_attrs = [
fn.hash_attr(hash_var, *(c.args))
for c in self.spec_clauses(spec_to_splice, body=True, required_from=None)
if c.args[0] != "node"
]
if match_variants is None:
variant_constraints = []
elif match_variants == "*":
filt_match_variants = set()
for map in pkg.variants.values():
for k in map:
filt_match_variants.add(k)
filt_match_variants = list(filt_match_variants)
variant_constraints = self._gen_match_variant_splice_constraints(
pkg, cond, spec_to_splice, hash_var, splice_node, filt_match_variants
)
else:
if any(
v in cond.variants or v in spec_to_splice.variants for v in match_variants
):
raise Exception(
"Overlap between match_variants and explicitly set variants"
)
variant_constraints = self._gen_match_variant_splice_constraints(
pkg, cond, spec_to_splice, hash_var, splice_node, match_variants
)
rule_head = fn.abi_splice_conditions_hold(
i, splice_node, spec_to_splice.name, hash_var
)
rule_body_components = (
[
# splice_set_fact,
fn.attr("node", splice_node),
fn.installed_hash(spec_to_splice.name, hash_var),
]
+ when_spec_attrs
+ splice_spec_hash_attrs
+ variant_constraints
)
rule_body = ",\n ".join(str(r) for r in rule_body_components)
rule = f"{rule_head} :-\n {rule_body}."
self.gen.append(rule)
self.gen.newline()
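The `splice_specs` consumed here are declared in package code; assuming the `can_splice` directive is what populates that mapping, a declaration looks roughly like this (versions illustrative):

```python
# Hedged sketch: a package declaring that it can be spliced in for an
# older installed version, with all single-valued variants matching.
class Zlib(Package):
    version("1.3.1")
    version("1.3")

    can_splice("zlib@1.3", when="@1.3.1", match_variants="*")
```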
def virtual_preferences(self, pkg_name, func):
"""Call func(vspec, provider, i) for each of pkg's provider prefs."""
config = spack.config.get("packages")
@@ -2028,9 +2132,12 @@ def _spec_clauses(
for variant_def in variant_defs:
self.variant_values_from_specs.add((spec.name, id(variant_def), value))
clauses.append(f.variant_value(spec.name, vname, value))
if variant.propagate:
clauses.append(f.propagate(spec.name, fn.variant_value(vname, value)))
if self.pkg_class(spec.name).has_variant(vname):
clauses.append(f.variant_value(spec.name, vname, value))
else:
clauses.append(f.variant_value(spec.name, vname, value))
# compiler and compiler version
if spec.compiler:
@@ -2529,8 +2636,9 @@ def concrete_specs(self):
for h, spec in self.reusable_and_possible.explicit_items():
# this indicates that there is a spec like this installed
self.gen.fact(fn.installed_hash(spec.name, h))
# this describes what constraints it imposes on the solve
self.impose(h, spec, body=True)
# indirection layer between hash constraints and imposition to allow for splicing
for pred in self.spec_clauses(spec, body=True, required_from=None):
self.gen.fact(fn.hash_attr(h, *pred.args))
self.gen.newline()
# Declare as possible parts of specs that are not in package.py
# - Add versions to possible versions
@@ -2616,6 +2724,7 @@ def setup(
)
for name, info in env.dev_specs.items()
)
specs = tuple(specs) # ensure compatible types to add
self.gen.h1("Reusable concrete specs")
@@ -3470,6 +3579,14 @@ def consume_facts(self):
self._setup.effect_rules()
# This should be a dataclass, but dataclasses don't work on Python 3.6
class Splice:
def __init__(self, splice_node: NodeArgument, child_name: str, child_hash: str):
self.splice_node = splice_node
self.child_name = child_name
self.child_hash = child_hash
class SpecBuilder:
"""Class with actions to rebuild a spec from ASP results."""
@@ -3505,10 +3622,11 @@ def make_node(*, pkg: str) -> NodeArgument:
"""
return NodeArgument(id="0", pkg=pkg)
def __init__(
self, specs: List[spack.spec.Spec], *, hash_lookup: Optional[ConcreteSpecsByHash] = None
):
def __init__(self, specs, hash_lookup=None):
self._specs: Dict[NodeArgument, spack.spec.Spec] = {}
# Matches parent nodes to splice node
self._splices: Dict[NodeArgument, List[Splice]] = {}
self._result = None
self._command_line_specs = specs
self._flag_sources: Dict[Tuple[NodeArgument, str], Set[str]] = collections.defaultdict(
@@ -3592,16 +3710,8 @@ def external_spec_selected(self, node, idx):
def depends_on(self, parent_node, dependency_node, type):
dependency_spec = self._specs[dependency_node]
edges = self._specs[parent_node].edges_to_dependencies(name=dependency_spec.name)
edges = [x for x in edges if id(x.spec) == id(dependency_spec)]
depflag = dt.flag_from_string(type)
if not edges:
self._specs[parent_node].add_dependency_edge(
self._specs[dependency_node], depflag=depflag, virtuals=()
)
else:
edges[0].update_deptypes(depflag=depflag)
self._specs[parent_node].add_dependency_edge(dependency_spec, depflag=depflag, virtuals=())
def virtual_on_edge(self, parent_node, provider_node, virtual):
dependencies = self._specs[parent_node].edges_to_dependencies(name=(provider_node.pkg))
@@ -3718,6 +3828,57 @@ def _order_index(flag_group):
def deprecated(self, node: NodeArgument, version: str) -> None:
tty.warn(f'using "{node.pkg}@{version}" which is a deprecated version')
def splice_at_hash(
self,
parent_node: NodeArgument,
splice_node: NodeArgument,
child_name: str,
child_hash: str,
):
splice = Splice(splice_node, child_name=child_name, child_hash=child_hash)
self._splices.setdefault(parent_node, []).append(splice)
def _resolve_automatic_splices(self):
"""After all of the specs have been concretized, apply all immediate splices.
Use reverse topological order to ensure that all dependencies are resolved
before their parents, allowing for maximal sharing and minimal copying.
"""
fixed_specs = {}
# create a mapping from dag hash to an integer representing position in reverse topo order.
specs = self._specs.values()
topo_order = list(traverse.traverse_nodes(specs, order="topo", key=traverse.by_dag_hash))
topo_lookup = {spec.dag_hash(): index for index, spec in enumerate(reversed(topo_order))}
# iterate over specs, children before parents
for node, spec in sorted(self._specs.items(), key=lambda x: topo_lookup[x[1].dag_hash()]):
immediate = self._splices.get(node, [])
if not immediate and not any(
edge.spec in fixed_specs for edge in spec.edges_to_dependencies()
):
continue
new_spec = spec.copy(deps=False)
new_spec.build_spec = spec
for edge in spec.edges_to_dependencies():
depflag = edge.depflag & ~dt.BUILD
if any(edge.spec.dag_hash() == splice.child_hash for splice in immediate):
splice = [s for s in immediate if s.child_hash == edge.spec.dag_hash()][0]
new_spec.add_dependency_edge(
self._specs[splice.splice_node], depflag=depflag, virtuals=edge.virtuals
)
elif edge.spec in fixed_specs:
new_spec.add_dependency_edge(
fixed_specs[edge.spec], depflag=depflag, virtuals=edge.virtuals
)
else:
new_spec.add_dependency_edge(
edge.spec, depflag=depflag, virtuals=edge.virtuals
)
self._specs[node] = new_spec
fixed_specs[spec] = new_spec
@staticmethod
def sort_fn(function_tuple) -> Tuple[int, int]:
"""Ensure attributes are evaluated in the correct order.
@@ -3747,7 +3908,6 @@ def build_specs(self, function_tuples):
# them here so that directives that build objects (like node and
# node_compiler) are called in the right order.
self.function_tuples = sorted(set(function_tuples), key=self.sort_fn)
self._specs = {}
for name, args in self.function_tuples:
if SpecBuilder.ignored_attributes.match(name):
@@ -3777,10 +3937,14 @@ def build_specs(self, function_tuples):
continue
# if we've already gotten a concrete spec for this pkg,
# do not bother calling actions on it
# do not bother calling actions on it except for node_flag_source,
# since node_flag_source is tracking information not in the spec itself
# we also need to keep track of splicing information.
spec = self._specs.get(args[0])
if spec and spec.concrete:
continue
do_not_ignore_attrs = ["node_flag_source", "splice_at_hash"]
if name not in do_not_ignore_attrs:
continue
action(*args)
@@ -3790,7 +3954,7 @@ def build_specs(self, function_tuples):
# inject patches -- note that we can't use set() to unique the
# roots here, because the specs aren't complete, and the hash
# function will loop forever.
roots = [spec.root for spec in self._specs.values() if not spec.root.installed]
roots = [spec.root for spec in self._specs.values()]
roots = dict((id(r), r) for r in roots)
for root in roots.values():
spack.spec.Spec.inject_patches_variant(root)
@@ -3806,6 +3970,8 @@ def build_specs(self, function_tuples):
for root in roots.values():
root._finalize_concretization()
self._resolve_automatic_splices()
for s in self._specs.values():
spack.spec.Spec.ensure_no_deprecated(s)
@@ -3820,7 +3986,6 @@ def build_specs(self, function_tuples):
)
specs = self.execute_explicit_splices()
return specs
def execute_explicit_splices(self):
@@ -3829,8 +3994,16 @@ def execute_explicit_splices(self):
for splice_set in splice_config:
target = splice_set["target"]
replacement = spack.spec.Spec(splice_set["replacement"])
assert replacement.abstract_hash
replacement.replace_hash()
if not replacement.abstract_hash:
location = getattr(
splice_set["replacement"], "_start_mark", " at unknown line number"
)
msg = f"Explicit splice replacement '{replacement}' does not include a hash.\n"
msg += f"{location}\n\n"
msg += " Splice replacements must be specified by hash"
raise InvalidSpliceError(msg)
transitive = splice_set.get("transitive", False)
splice_triples.append((target, replacement, transitive))
@@ -3841,6 +4014,10 @@ def execute_explicit_splices(self):
if target in current_spec:
# matches root or non-root
# e.g. mvapich2%gcc
# On the first iteration, we need to replace the abstract hash
if not replacement.concrete:
replacement.replace_hash()
current_spec = current_spec.splice(replacement, transitive)
new_key = NodeArgument(id=key.id, pkg=current_spec.name)
specs[new_key] = current_spec
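For reference, an explicit splice entry that passes the hash check above parses to roughly the following (hash suffix illustrative):

```python
# Illustrative splice configuration; the replacement must carry a hash.
splice_config = [
    {
        "target": "mpi",                   # replaced wherever it matches
        "replacement": "mvapich2/abcdef",  # trailing /hash is required
        "transitive": False,
    }
]
```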
@@ -3966,7 +4143,7 @@ def selected_specs(self) -> List[spack.spec.Spec]:
return [s for s in self.factory() if self.is_selected(s)]
@staticmethod
def from_store(configuration, include, exclude) -> "SpecFilter":
def from_store(configuration, *, include, exclude) -> "SpecFilter":
"""Constructs a filter that takes the specs from the current store."""
packages = _external_config_with_implicit_externals(configuration)
is_reusable = functools.partial(_is_reusable, packages=packages, local=True)
@@ -3974,7 +4151,7 @@ def from_store(configuration, include, exclude) -> "SpecFilter":
return SpecFilter(factory=factory, is_usable=is_reusable, include=include, exclude=exclude)
@staticmethod
def from_buildcache(configuration, include, exclude) -> "SpecFilter":
def from_buildcache(configuration, *, include, exclude) -> "SpecFilter":
"""Constructs a filter that takes the specs from the configured buildcaches."""
packages = _external_config_with_implicit_externals(configuration)
is_reusable = functools.partial(_is_reusable, packages=packages, local=False)
@@ -3982,6 +4159,29 @@ def from_buildcache(configuration, include, exclude) -> "SpecFilter":
factory=_specs_from_mirror, is_usable=is_reusable, include=include, exclude=exclude
)
@staticmethod
def from_environment(configuration, *, include, exclude, env) -> "SpecFilter":
packages = _external_config_with_implicit_externals(configuration)
is_reusable = functools.partial(_is_reusable, packages=packages, local=True)
factory = functools.partial(_specs_from_environment, env=env)
return SpecFilter(factory=factory, is_usable=is_reusable, include=include, exclude=exclude)
@staticmethod
def from_environment_included_concrete(
configuration,
*,
include: List[str],
exclude: List[str],
env: ev.Environment,
included_concrete: str,
) -> "SpecFilter":
packages = _external_config_with_implicit_externals(configuration)
is_reusable = functools.partial(_is_reusable, packages=packages, local=True)
factory = functools.partial(
_specs_from_environment_included_concrete, env=env, included_concrete=included_concrete
)
return SpecFilter(factory=factory, is_usable=is_reusable, include=include, exclude=exclude)
def _specs_from_store(configuration):
store = spack.store.create(configuration)
@@ -3999,6 +4199,23 @@ def _specs_from_mirror():
return []
def _specs_from_environment(env):
"""Return all concrete specs from the environment. This includes all included concrete"""
if env:
return [concrete for _, concrete in env.concretized_specs()]
else:
return []
def _specs_from_environment_included_concrete(env, included_concrete):
"""Return only concrete specs from the environment included from the included_concrete"""
if env:
assert included_concrete in env.included_concrete_envs
return [concrete for concrete in env.included_specs_by_hash[included_concrete].values()]
else:
return []
class ReuseStrategy(enum.Enum):
ROOTS = enum.auto()
DEPENDENCIES = enum.auto()
@@ -4028,6 +4245,12 @@ def __init__(self, configuration: spack.config.Configuration) -> None:
SpecFilter.from_buildcache(
configuration=self.configuration, include=[], exclude=[]
),
SpecFilter.from_environment(
configuration=self.configuration,
include=[],
exclude=[],
env=ev.active_environment(), # includes all concrete includes
),
]
)
else:
@@ -4042,7 +4265,46 @@ def __init__(self, configuration: spack.config.Configuration) -> None:
for source in reuse_yaml.get("from", default_sources):
include = source.get("include", default_include)
exclude = source.get("exclude", default_exclude)
if source["type"] == "local":
if source["type"] == "environment" and "path" in source:
env_dir = ev.as_env_dir(source["path"])
active_env = ev.active_environment()
if active_env and env_dir in active_env.included_concrete_envs:
# If environment is included as a concrete environment, use the local copy
# of specs in the active environment.
# note: included concrete environments are only updated at concretization
# time, and reuse needs to match the included specs.
self.reuse_sources.append(
SpecFilter.from_environment_included_concrete(
self.configuration,
include=include,
exclude=exclude,
env=active_env,
included_concrete=env_dir,
)
)
else:
# If the environment is not included as a concrete environment, use the
# current specs from its lockfile.
self.reuse_sources.append(
SpecFilter.from_environment(
self.configuration,
include=include,
exclude=exclude,
env=ev.environment_from_name_or_dir(env_dir),
)
)
elif source["type"] == "environment":
# reusing from the current environment implicitly reuses from all of the
# included concrete environments
self.reuse_sources.append(
SpecFilter.from_environment(
self.configuration,
include=include,
exclude=exclude,
env=ev.active_environment(),
)
)
elif source["type"] == "local":
self.reuse_sources.append(
SpecFilter.from_store(self.configuration, include=include, exclude=exclude)
)
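Concretely, a reuse configuration exercising these branches might parse to the following (path illustrative):

```python
# Illustrative parsed concretizer:reuse data for the branches above.
reuse = {
    "roots": True,
    "from": [
        {"type": "environment", "path": "/path/to/other/env"},  # first branch
        {"type": "environment"},  # active environment (second branch)
        {"type": "local"},        # installed store (third branch)
    ],
}
```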
@@ -4060,7 +4322,6 @@ def reusable_specs(self, specs: List[spack.spec.Spec]) -> List[spack.spec.Spec]:
result = []
for reuse_source in self.reuse_sources:
result.extend(reuse_source.selected_specs())
# If we only want to reuse dependencies, remove the root specs
if self.reuse_strategy == ReuseStrategy.DEPENDENCIES:
result = [spec for spec in result if not any(root in spec for root in specs)]
@@ -4091,7 +4352,7 @@ def _check_input_and_extract_concrete_specs(specs):
spack.spec.Spec.ensure_valid_variants(s)
return reusable
def solve(
def solve_with_stats(
self,
specs,
out=None,
@@ -4102,6 +4363,8 @@ def solve(
allow_deprecated=False,
):
"""
Concretize a set of specs and track the timing and statistics for the solve
Arguments:
specs (list): List of ``Spec`` objects to solve for.
out: Optionally write the generated ASP program to a file-like object.
@@ -4113,15 +4376,22 @@ def solve(
setup_only (bool): if True, stop after setup and don't solve (default False).
allow_deprecated (bool): allow deprecated versions in the solve
"""
# Check upfront that the variants are admissible
specs = [s.lookup_hash() for s in specs]
reusable_specs = self._check_input_and_extract_concrete_specs(specs)
reusable_specs.extend(self.selector.reusable_specs(specs))
setup = SpackSolverSetup(tests=tests)
output = OutputConfiguration(timers=timers, stats=stats, out=out, setup_only=setup_only)
result, _, _ = self.driver.solve(
return self.driver.solve(
setup, specs, reuse=reusable_specs, output=output, allow_deprecated=allow_deprecated
)
def solve(self, specs, **kwargs):
"""
Convenience function for concretizing a set of specs and ignoring timing
and statistics. Uses the same kwargs as solve_with_stats.
"""
result, _, _ = self.solve_with_stats(specs, **kwargs)
return result
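Callers that need timing keep the richer entry point, while `solve` stays drop-in compatible; roughly (assuming the enclosing `Solver` class):

```python
# Sketch of the two entry points refactored above.
solver = Solver()
specs = [spack.spec.Spec("zlib")]

result = solver.solve(specs)                           # result only
result, timer, stats = solver.solve_with_stats(specs)  # plus timing and stats
```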
def solve_in_rounds(
@@ -4221,8 +4491,11 @@ def __init__(self, provided, conflicts):
super().__init__(msg)
self.provided = provided
# Add attribute expected of the superclass interface
self.required = None
self.constraint_type = None
self.provided = provided
class InvalidSpliceError(spack.error.SpackError):
"""For cases in which the splice configuration is invalid."""

View File

@@ -57,6 +57,12 @@
internal_error("provider with no virtual node").
:- provider(PackageNode, _), not attr("node", PackageNode),
internal_error("provider with no real node").
:- node_has_variant(PackageNode, _, _), not attr("node", PackageNode),
internal_error("node has variant for a non-node").
:- attr("variant_set", PackageNode, _, _), not attr("node", PackageNode),
internal_error("variant_set for a non-node").
:- variant_is_propagated(PackageNode, _), not attr("node", PackageNode),
internal_error("variant_is_propagated for a non-node").
:- attr("root", node(ID, PackageNode)), ID > min_dupe_id,
internal_error("root with a non-minimal duplicate ID").
@@ -575,7 +581,8 @@ attr("virtual_on_edge", PackageNode, ProviderNode, Virtual)
% or used somewhere
:- attr("virtual_node", node(_, Virtual)),
not attr("virtual_on_incoming_edges", _, Virtual),
not attr("virtual_root", node(_, Virtual)).
not attr("virtual_root", node(_, Virtual)),
internal_error("virtual node does not match incoming edge").
attr("virtual_on_incoming_edges", ProviderNode, Virtual)
:- attr("virtual_on_edge", _, ProviderNode, Virtual).
@@ -629,7 +636,8 @@ do_not_impose(EffectID, node(X, Package))
virtual_condition_holds(_, PossibleProvider, Virtual),
PossibleProvider != ProviderNode,
explicitly_requested_root(PossibleProvider),
not explicitly_requested_root(ProviderNode).
not explicitly_requested_root(ProviderNode),
internal_error("If a root can provide a virtual, it must be the provider").
% A package cannot be the actual provider for a virtual if it does not
% fulfill the conditions to provide that virtual
@@ -772,7 +780,8 @@ required_provider(Provider, Virtual)
pkg_fact(Virtual, condition_effect(ConditionID, EffectID)),
imposed_constraint(EffectID, "node", Provider).
:- provider(node(Y, Package), node(X, Virtual)), required_provider(Provider, Virtual), Package != Provider.
:- provider(node(Y, Package), node(X, Virtual)), required_provider(Provider, Virtual), Package != Provider,
internal_error("If a provider is required the concretizer must use it").
% TODO: the following choice rule allows the solver to add compiler
% flags if their only source is from a requirement. This is overly-specific
@@ -852,7 +861,8 @@ variant_defined(PackageNode, Name) :- variant_definition(PackageNode, Name, _).
% for two or more variant definitions, this prefers the last one defined.
:- node_has_variant(node(NodeID, Package), Name, SelectedVariantID),
variant_definition(node(NodeID, Package), Name, VariantID),
VariantID > SelectedVariantID.
VariantID > SelectedVariantID,
internal_error("If the solver picks a variant descriptor it must use that variant descriptor").
% B: Associating applicable package rules with nodes
@@ -969,6 +979,7 @@ error(100, "{0} variant '{1}' cannot have values '{2}' and '{3}' as they come fr
:- attr("variant_set", node(ID, Package), Variant, Value),
not attr("variant_value", node(ID, Package), Variant, Value).
internal_error("If a variant is set to a value it must have that value").
% The rules below allow us to prefer default values for variants
% whenever possible. If a variant is set in a spec, or if it is
@@ -979,7 +990,7 @@ variant_not_default(node(ID, Package), Variant, Value)
% variants set explicitly on the CLI don't count as non-default
not attr("variant_set", node(ID, Package), Variant, Value),
% variant values forced by propagation don't count as non-default
not propagate(node(ID, Package), variant_value(Variant, Value)),
not propagate(node(ID, Package), variant_value(Variant, Value, _)),
% variants set on externals that we could use don't count as non-default
% this makes spack prefer to use an external over rebuilding with the
% default configuration
@@ -991,7 +1002,7 @@ variant_default_not_used(node(ID, Package), Variant, Value)
:- variant_default_value(node(ID, Package), Variant, Value),
node_has_variant(node(ID, Package), Variant, _),
not attr("variant_value", node(ID, Package), Variant, Value),
not propagate(node(ID, Package), variant_value(Variant, _)),
not propagate(node(ID, Package), variant_value(Variant, _, _)),
attr("node", node(ID, Package)).
% The variant is set in an external spec
@@ -1036,10 +1047,14 @@ variant_single_value(PackageNode, Variant)
% Propagation semantics
%-----------------------------------------------------------------------------
non_default_propagation(variant_value(Name, Value)) :- attr("propagate", RootNode, variant_value(Name, Value)).
% Propagation roots have a corresponding attr("propagate", ...)
propagate(RootNode, PropagatedAttribute) :- attr("propagate", RootNode, PropagatedAttribute).
propagate(RootNode, PropagatedAttribute) :- attr("propagate", RootNode, PropagatedAttribute), not non_default_propagation(PropagatedAttribute).
propagate(RootNode, PropagatedAttribute, EdgeTypes) :- attr("propagate", RootNode, PropagatedAttribute, EdgeTypes).
% Special case variants, to inject the source node in the propagated attribute
propagate(RootNode, variant_value(Name, Value, RootNode)) :- attr("propagate", RootNode, variant_value(Name, Value)).
% Propagate an attribute along edges to child nodes
propagate(ChildNode, PropagatedAttribute) :-
@@ -1061,21 +1076,53 @@ propagate(ChildNode, PropagatedAttribute, edge_types(DepType1, DepType2)) :-
% If a variant is propagated, and can be accepted, set its value
attr("variant_selected", PackageNode, Variant, Value, VariantType, VariantID) :-
propagate(PackageNode, variant_value(Variant, Value)),
propagate(PackageNode, variant_value(Variant, Value, _)),
node_has_variant(PackageNode, Variant, VariantID),
variant_type(VariantID, VariantType),
variant_possible_value(PackageNode, Variant, Value),
not attr("variant_set", PackageNode, Variant).
variant_possible_value(PackageNode, Variant, Value).
% If a variant is propagated, we cannot have extraneous values
variant_is_propagated(PackageNode, Variant) :-
attr("variant_value", PackageNode, Variant, Value),
propagate(PackageNode, variant_value(Variant, Value)),
propagate(PackageNode, variant_value(Variant, Value, _)),
not attr("variant_set", PackageNode, Variant).
:- variant_is_propagated(PackageNode, Variant),
attr("variant_selected", PackageNode, Variant, Value, _, _),
not propagate(PackageNode, variant_value(Variant, Value)).
not propagate(PackageNode, variant_value(Variant, Value, _)).
error(100, "{0} and {1} cannot both propagate variant '{2}' to the shared dependency: {3}",
Package1, Package2, Variant, Dependency) :-
% The variant is a singlevalued variant
variant_single_value(node(X, Package1), Variant),
% Dependency is trying to propagate Variant with different values and is not the source package
propagate(node(Z, Dependency), variant_value(Variant, Value1, node(X, Package1))),
propagate(node(Z, Dependency), variant_value(Variant, Value2, node(Y, Package2))),
% Package1 and Package2 and their values are different
Package1 > Package2, Value1 != Value2,
not propagate(node(Z, Dependency), variant_value(Variant, _, node(Z, Dependency))).
% Cannot propagate the same variant from two different packages if one is a dependency of the other
error(100, "{0} and {1} cannot both propagate variant '{2}'", Package1, Package2, Variant) :-
% The variant is a single-valued variant
variant_single_value(node(X, Package1), Variant),
% Package1 and Package2 and their values are different
Package1 != Package2, Value1 != Value2,
% Package2 is set to propagate the value from Package1
propagate(node(Y, Package2), variant_value(Variant, Value2, node(X, Package2))),
propagate(node(Y, Package2), variant_value(Variant, Value1, node(X, Package1))),
variant_is_propagated(node(Y, Package2), Variant).
% Cannot propagate a variant if a different value was set for it in a dependency
error(100, "Cannot propagate the variant '{0}' from the package: {1} because package: {2} is set to exclude it", Variant, Source, Package) :-
% Package has a Variant and Source is propagating Variant
attr("variant_set", node(X, Package), Variant, Value1),
% The packages and values are different
Source != Package, Value1 != Value2,
% The variant is a single-valued variant
variant_single_value(node(X, Package), Variant),
% A different value is being propagated from somewhere else
propagate(node(X, Package), variant_value(Variant, Value2, node(Y, Source))).
%----
% Flags
@@ -1402,25 +1449,71 @@ attr("node_flag", PackageNode, NodeFlag) :- attr("node_flag_set", PackageNode, N
%-----------------------------------------------------------------------------
% Installed packages
% Installed Packages
%-----------------------------------------------------------------------------
% the solver is free to choose at most one installed hash for each package
{ attr("hash", node(ID, Package), Hash) : installed_hash(Package, Hash) } 1
:- attr("node", node(ID, Package)), internal_error("Package must resolve to at most one hash").
#defined installed_hash/2.
#defined abi_splice_conditions_hold/4.
% These are the previously concretized attributes of the installed package as
% a hash. It has the general form:
% hash_attr(Hash, Attribute, PackageName, Args*)
#defined hash_attr/3.
#defined hash_attr/4.
#defined hash_attr/5.
#defined hash_attr/6.
#defined hash_attr/7.
{ attr("hash", node(ID, PackageName), Hash): installed_hash(PackageName, Hash) } 1 :-
attr("node", node(ID, PackageName)),
internal_error("Package must resolve to at most 1 hash").
% you can't choose an installed hash for a dev spec
:- attr("hash", PackageNode, Hash), attr("variant_value", PackageNode, "dev_path", _).
% You can't install a hash if it is not installed
:- attr("hash", node(ID, Package), Hash), not installed_hash(Package, Hash).
% This should be redundant given the constraint above
:- attr("node", PackageNode), 2 { attr("hash", PackageNode, Hash) }.
% if a hash is selected, we impose all the constraints that implies
impose(Hash, PackageNode) :- attr("hash", PackageNode, Hash).
% hash_attrs are versions, but can_splice_attr are usually node_version_satisfies
hash_attr(Hash, "node_version_satisfies", PackageName, Constraint) :-
hash_attr(Hash, "version", PackageName, Version),
pkg_fact(PackageName, version_satisfies(Constraint, Version)).
% This recovers the exact semantics of hash reuse. hash and depends_on are where
% splices are decided, and virtual_on_edge can result in name changes, which is
% why they are all treated separately.
imposed_constraint(Hash, Attr, PackageName) :-
hash_attr(Hash, Attr, PackageName).
imposed_constraint(Hash, Attr, PackageName, A1) :-
hash_attr(Hash, Attr, PackageName, A1), Attr != "hash".
imposed_constraint(Hash, Attr, PackageName, Arg1, Arg2) :-
hash_attr(Hash, Attr, PackageName, Arg1, Arg2),
Attr != "depends_on",
Attr != "virtual_on_edge".
imposed_constraint(Hash, Attr, PackageName, A1, A2, A3) :-
hash_attr(Hash, Attr, PackageName, A1, A2, A3).
imposed_constraint(Hash, "hash", PackageName, Hash) :- installed_hash(PackageName, Hash).
% Without splicing, we simply recover the exact semantics
imposed_constraint(ParentHash, "hash", ChildName, ChildHash) :-
hash_attr(ParentHash, "hash", ChildName, ChildHash),
ChildHash != ParentHash,
not abi_splice_conditions_hold(_, _, ChildName, ChildHash).
imposed_constraint(Hash, "depends_on", PackageName, DepName, Type) :-
hash_attr(Hash, "depends_on", PackageName, DepName, Type),
hash_attr(Hash, "hash", DepName, DepHash),
not attr("splice_at_hash", _, _, DepName, DepHash).
imposed_constraint(Hash, "virtual_on_edge", PackageName, DepName, VirtName) :-
hash_attr(Hash, "virtual_on_edge", PackageName, DepName, VirtName),
not attr("splice_at_hash", _, _, DepName,_).
% Rules pertaining to attr("splice_at_hash") and abi_splice_conditions_hold will
% be conditionally loaded from splices.lp
impose(Hash, PackageNode) :- attr("hash", PackageNode, Hash), attr("node", PackageNode).
% If there is not a hash for a package, we build it.
build(PackageNode) :- attr("node", PackageNode), not concrete(PackageNode).
% if we haven't selected a hash for a package, we'll be building it
build(PackageNode) :- not attr("hash", PackageNode, _), attr("node", PackageNode).
% Minimizing builds is tricky. We want a minimizing criterion
@@ -1433,6 +1526,7 @@ build(PackageNode) :- not attr("hash", PackageNode, _), attr("node", PackageNode
% criteria for built specs -- so that they take precedence over the otherwise
% topmost-priority criterion to reuse what is installed.
%
% The priority ranges are:
% 1000+ Optimizations for concretization errors
% 300 - 1000 Highest priority optimizations for valid solutions
@@ -1458,12 +1552,10 @@ build_priority(PackageNode, 0) :- not build(PackageNode), attr("node", Package
pkg_fact(Package, version_declared(Version, Weight, "installed")),
not optimize_for_reuse().
#defined installed_hash/2.
% This statement, which is a hidden feature of clingo, lets us avoid cycles in the DAG
#edge (A, B) : depends_on(A, B).
%-----------------------------------------------------------------
% Optimization to avoid errors
%-----------------------------------------------------------------

View File

@@ -44,6 +44,17 @@ def _id(thing: Any) -> Union[str, AspObject]:
return f'"{str(thing)}"'
class AspVar(AspObject):
"""Represents a variable in an ASP rule, allows for conditionally generating
rules"""
def __init__(self, name: str):
self.name = name
def __str__(self) -> str:
return str(self.name)
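In the solver setup this lets generated rules contain unbound ASP variables, as in the splice rules earlier; for example:

```python
# Sketch: AspVar renders as a bare ASP variable name, so functions built
# with it read as rule components rather than ground facts.
hash_var = AspVar("Hash")
str(hash_var)                             # -> "Hash"
str(fn.installed_hash("zlib", hash_var))  # roughly: installed_hash("zlib",Hash)
```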
@lang.key_ordering
class AspFunction(AspObject):
"""A term in the ASP logic program"""
@@ -88,6 +99,8 @@ def _argify(self, arg: Any) -> Any:
return clingo().Number(arg)
elif isinstance(arg, AspFunction):
return clingo().Function(arg.name, [self._argify(x) for x in arg.args], positive=True)
elif isinstance(arg, AspVar):
return clingo().Variable(arg.name)
return clingo().String(str(arg))
def symbol(self):

View File

@@ -15,7 +15,6 @@
#show attr/4.
#show attr/5.
#show attr/6.
% names of optimization criteria
#show opt_criterion/2.
