Compare commits

...

116 Commits

Author SHA1 Message Date
Gregory Becker
f932135f08 node_flag_propagation_candidate: special cases for multiple node attrs 2024-01-26 00:26:06 -08:00
RikkiButler20
b7e7db4890 [@spackbot] updating style on behalf of RikkiButler20 2024-01-16 18:34:04 +00:00
Kayla Butler
f4762cfe7b Fix attribute error
Compiler flag test still fails
2023-11-17 13:51:25 -08:00
Richarda Butler
be5f1bd9d7 Merge branch 'develop' into bugfix/compiler-flag-propagation 2023-11-07 13:11:46 -08:00
Kayla Butler
dbb36dbbf1 Change method name 2023-11-07 12:32:00 -08:00
Richarda Butler
5774df6b7a Propagate variant across nodes that don't have that variant (#38512)
Before this PR, variants were not propagated to leaf nodes that could accept
the propagated value, if some intermediate node couldn't accept it.

This PR fixes that issue by marking nodes as "candidate" for propagation
and by setting the variant only if it can be accepted by the node.

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2023-11-07 21:04:41 +01:00
Harmen Stoppels
3a5c1eb5f3 tutorial pipeline: force gcc@12.3.0 (#40937) 2023-11-07 20:53:44 +01:00
Harmen Stoppels
3a2ec729f7 Ensure global command line arguments end up in args like before (#40929) 2023-11-07 20:35:56 +01:00
Jacob King
a093f4a8ce superlu-dist: add +parmetis variant. (#40746)
* Expose the ability to make parmetis an optional superlu-dist dependency in
Spack package management.

* rename parmetis variant: Enable ParMETIS library

---------

Co-authored-by: eugeneswalker <eugenesunsetwalker@gmail.com>
2023-11-07 10:21:38 -08:00
Scott Wittenburg
b8302a8277 ci: do not retry timed out build jobs (#40936) 2023-11-07 17:44:28 +00:00
Massimiliano Culpo
32f319157d Update the branch for the tutorial command (#40934) 2023-11-07 16:59:48 +00:00
Harmen Stoppels
75dfad8788 catch exceptions in which_string (#40935) 2023-11-07 17:17:31 +01:00
Vanessasaurus
f3ba20db26 fix configure args for darshan-runtime (#40873)
Problem: the current configure arguments are added as lists to a list,
when each item needs to be added as an individual string.
Solution: ensure we add each item (string) separately.

Signed-off-by: vsoch <vsoch@users.noreply.github.com>
Co-authored-by: vsoch <vsoch@users.noreply.github.com>
2023-11-07 07:00:28 -08:00
Rob Falgout
6301edbd5d Update package.py for new release 2.30.0 (#40907) 2023-11-07 07:58:00 -07:00
Massimiliano Culpo
c232bf435a Change container labeling so that "latest" is the latest tag (#40593)
* Use `major.minor.patch`, `major.minor`, `major` in tags

* Ensure `latest` is the semver largest version, and not "latest in time"

* Remove Ubuntu 18.04 from the list of images
2023-11-07 11:53:36 +01:00
Massimiliano Culpo
f3537bc66b ASP: targets, compilers and providers soft-preferences are only global (#31261)
Modify the packages.yaml schema so that soft-preferences on targets,
compilers and providers can only be specified under the "all" attribute.
This makes them effectively global preferences.

Version preferences instead can only be specified under a package
specific section.

If a preference attribute is found in a section where it should
not be, it will be ignored and a warning is printed to screen.
2023-11-07 07:46:06 +01:00
Massimiliano Culpo
4004f27bc0 archspec: update to v0.2.2 (#40917)
Adds support for Neoverse V2
2023-11-07 07:44:52 +01:00
Todd Gamblin
910190f55b database: optimize query() by skipping unnecessary virtual checks (#40898)
Most queries will end up calling `spec.satisfies(query)` on everything in the DB, which
will cause Spack to ask whether the query spec is virtual if its name doesn't match the
target spec's. This can be expensive, because it can cause Spack to check if any new
virtuals showed up in *all* the packages it knows about. That can currently trigger
thousands of `stat()` calls.

We can avoid the virtual check for most successful queries if we consider that if there
*is* a match by name, the query spec *can't* be virtual. This PR adds an optimization to
the query loop to save any comparisons that would trigger a virtual check for last.

- [x] Add a `deferred` list to the `query()` loop.
- [x] First run through the `query()` loop *only* checks for name matches.
- [x] Query loop now returns early if there's a name match, skipping most `satisfies()` calls.
- [x] Second run through the `deferred()` list only runs if query spec is virtual.
- [x] Fix up handling of concrete specs.
- [x] Add test for querying virtuals in DB.
- [x] Avoid allocating deferred if not necessary.
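
A minimal, self-contained sketch of the deferral pattern described above; the names `specs_in_db` and `is_virtual` are illustrative stand-ins, not Spack's actual API:

```python
def query(query_spec, specs_in_db, is_virtual):
    results, deferred = [], []
    for spec in specs_in_db:
        if spec.name == query_spec.name:
            # A name match means the query spec can't be virtual, so this
            # satisfies() call never triggers the expensive virtual check.
            if spec.satisfies(query_spec):
                results.append(spec)
        else:
            deferred.append(spec)
    # Only pay for the virtual check when the query spec is actually
    # virtual; otherwise the deferred specs can't match at all.
    if is_virtual(query_spec.name):
        results.extend(s for s in deferred if s.satisfies(query_spec))
    return results
```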

---------

Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2023-11-07 01:00:37 +00:00
Harmen Stoppels
4ce80b95f3 spack compiler find --[no]-mixed-toolchain (#40902)
Currently there's some hacky logic in the AppleClang compiler that makes
it also accept `gfortran` as a fortran compiler if `flang` is not found.

This is guarded by `if sys.platform` checks so that it only applies to
Darwin.

But on Linux the feature of detecting mixed toolchains is highly
requested too, because it's rather annoying to run into a failed build of
`openblas` after dozens of minutes of compiling its dependencies, just
because clang doesn't have a fortran compiler.

In particular in CI where the system compilers may change during system
updates, it's typically impossible to fix compilers in a hand-written
compilers.yaml config file: the config will almost certainly be outdated
sooner or later, and maintaining one config file per target machine and
writing logic to select the correct config is rather undesirable too.

---

This PR introduces a flag `spack compiler find --mixed-toolchain` that
fills out missing `fc` and `f77` entries in `clang` / `apple-clang` by
picking the best matching `gcc`.

It is enabled by default on macOS, but not on Linux, matching current
behavior of `spack compiler find`.

The "best matching gcc" logic and compiler path updates are identical to
how compiler path dictionaries are currently flattened "horizontally"
(per compiler id). This just adds logic to do the same "vertically"
(across different compiler ids).

So, with this change on Ubuntu 22.04:

```
$ spack compiler find --mixed-toolchain
==> Added 6 new compilers to /home/harmen/.spack/linux/compilers.yaml
    gcc@13.1.0  gcc@12.3.0  gcc@11.4.0  gcc@10.5.0  clang@16.0.0  clang@15.0.7
==> Compilers are defined in the following files:
    /home/harmen/.spack/linux/compilers.yaml

```

you finally get:

```
compilers:
- compiler:
    spec: clang@=15.0.7
    paths:
      cc: /usr/bin/clang
      cxx: /usr/bin/clang++
      f77: /usr/bin/gfortran
      fc: /usr/bin/gfortran
    flags: {}
    operating_system: ubuntu23.04
    target: x86_64
    modules: []
    environment: {}
    extra_rpaths: []
- compiler:
    spec: clang@=16.0.0
    paths:
      cc: /usr/bin/clang-16
      cxx: /usr/bin/clang++-16
      f77: /usr/bin/gfortran
      fc: /usr/bin/gfortran
    flags: {}
    operating_system: ubuntu23.04
    target: x86_64
    modules: []
    environment: {}
    extra_rpaths: []
```

The "best gcc" is automatically default system gcc, since it has no
suffixes / prefixes.
2023-11-06 15:17:31 -08:00
Sinan
8f1f9048ec package/qgis: add latest ltr (#40752)
* package/qgis: add latest ltr

* fix bug

* [@spackbot] updating style on behalf of Sinan81

* make flake happy

---------

Co-authored-by: sbulut <sbulut@3vgeomatics.com>
Co-authored-by: Sinan81 <Sinan81@users.noreply.github.com>
2023-11-06 15:55:20 -07:00
Harmen Stoppels
e7372a54a1 docs: expand section about relocation, suggest padding (#40909) 2023-11-06 14:49:54 -08:00
Michael Kuhn
5074b7e922 Add support for aliases (#17229)
Add a new config section: `config:aliases`, which is a dictionary mapping aliases
to commands.

For instance:


```yaml
config:
    aliases:
        sp: spec -I
```

will define a new command `sp` that will execute `spec` with the `-I`
argument. 

Aliases cannot override existing commands, and this is ensured with a test.

We cannot currently alias subcommands. Spack will warn about any aliases
containing a space, but will not error, which leaves room for subcommand
aliases in the future.

---------

Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2023-11-06 14:37:46 -08:00
Harmen Stoppels
461eb944bd Don't let runtime env variables of compiler like deps leak into the build environment (#40916)
* Test that setup_run_environment changes to CC/CXX/FC/F77 are dropped in build env

* compilers set in run env shouldn't impact build

Adds `drop` to EnvironmentModifications courtesy of @haampie, and uses
it to clear modifications of CC, CXX, F77 and FC made by
`setup_{,dependent_}run_environment` routines when producing an
environment in BUILD context.
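
A minimal sketch of the idea, assuming a `drop` method on `EnvironmentModifications` that removes pending modifications by variable name (the exact signature may differ):

```python
from spack.util.environment import EnvironmentModifications

env_mods = EnvironmentModifications()
# Changes a compiler-like dependency made in setup_run_environment:
env_mods.set("CC", "/path/to/clang")
env_mods.set("FC", "/path/to/gfortran")
# In BUILD context, compiler variables set by run-environment hooks are
# dropped so they cannot leak into the build environment:
env_mods.drop("CC", "CXX", "F77", "FC")
```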

* comment / style

* comment

---------

Co-authored-by: Tom Scogland <scogland1@llnl.gov>
2023-11-06 14:30:27 -08:00
Harmen Stoppels
4700108b5b fix prefix_inspections keys in example (#40904) 2023-11-06 13:22:13 -08:00
Harmen Stoppels
3384181868 docs: mention public build cache for GHA (#40908) 2023-11-06 13:21:16 -08:00
Vicente Bolea
f0f6e54b29 adios2: add v2.9.2 release (#40832) 2023-11-06 12:15:29 -08:00
Harmen Stoppels
a2f00886e9 defaults/modules.yaml: hide implicits (#40906) 2023-11-06 10:37:29 -08:00
Harmen Stoppels
1235084c20 Introduce default_args context manager (#39964)
This adds a rather trivial context manager that lets you deduplicate repeated
arguments in directives, e.g.

```python
depends_on("py-x@1", when="@1", type=("build", "run"))
depends_on("py-x@2", when="@2", type=("build", "run"))
depends_on("py-x@3", when="@3", type=("build", "run"))
depends_on("py-x@4", when="@4", type=("build", "run"))
```

can be condensed to

```python
with default_args(type=("build", "run")):
    depends_on("py-x@1", when="@1")
    depends_on("py-x@2", when="@2")
    depends_on("py-x@3", when="@3")
    depends_on("py-x@4", when="@4")
```

The advantage is that it's clearer for humans; the downside is that it's less clear for type checkers due to type erasure.
2023-11-06 10:22:29 -08:00
Greg Becker
b5538960c3 error messages: condition chaining (#40173)
Create chains of causation for error messages.

The current implementation covers only some of the many errors the concretizer can present. The rest will need to be filled out over time, but this demonstrates the capability.

The basic idea is to associate conditions in the solver with one another in causal relationships, and to associate errors with the proximate causes of their facts in the condition graph. Then we can construct causal trees to explain errors, which will hopefully present users with useful information to avoid the error or report issues.

Technically, this is implemented as a secondary solve. The concretizer computes the optimal model, and if the optimal model contains an error, then a secondary solve computes causation information about the error(s) in the concretizer output.

Examples:

$ spack solve hdf5 ^cmake@3.0.1
==> Error: concretization failed for the following reasons:

   1. Cannot satisfy 'cmake@3.0.1'
   2. Cannot satisfy 'cmake@3.0.1'
        required because hdf5 ^cmake@3.0.1 requested from CLI 
   3. Cannot satisfy 'cmake@3.18:' and 'cmake@3.0.1
        required because hdf5 ^cmake@3.0.1 requested from CLI 
        required because hdf5 depends on cmake@3.18: when @1.13: 
          required because hdf5 ^cmake@3.0.1 requested from CLI 
   4. Cannot satisfy 'cmake@3.12:' and 'cmake@3.0.1
        required because hdf5 depends on cmake@3.12: 
          required because hdf5 ^cmake@3.0.1 requested from CLI 
        required because hdf5 ^cmake@3.0.1 requested from CLI

$ spack spec cmake ^curl~ldap   # <-- with curl configured non-buildable and an external with `+ldap`
==> Error: concretization failed for the following reasons:

   1. Attempted to use external for 'curl' which does not satisfy any configured external spec
   2. Attempted to build package curl which is not buildable and does not have a satisfying external
        attr('variant_value', 'curl', 'ldap', 'True') is an external constraint for curl which was not satisfied
   3. Attempted to build package curl which is not buildable and does not have a satisfying external
        attr('variant_value', 'curl', 'gssapi', 'True') is an external constraint for curl which was not satisfied
   4. Attempted to build package curl which is not buildable and does not have a satisfying external
        'curl+ldap' is an external constraint for curl which was not satisfied
        'curl~ldap' required
        required because cmake ^curl~ldap requested from CLI 

$ spack solve yambo+mpi ^hdf5~mpi
==> Error: concretization failed for the following reasons:

   1. 'hdf5' required multiple values for single-valued variant 'mpi'
   2. 'hdf5' required multiple values for single-valued variant 'mpi'
    Requested '~mpi' and '+mpi'
        required because yambo depends on hdf5+mpi when +mpi 
          required because yambo+mpi ^hdf5~mpi requested from CLI 
        required because yambo+mpi ^hdf5~mpi requested from CLI 
   3. 'hdf5' required multiple values for single-valued variant 'mpi'
    Requested '~mpi' and '+mpi'
        required because netcdf-c depends on hdf5+mpi when +mpi 
          required because netcdf-fortran depends on netcdf-c 
            required because yambo depends on netcdf-fortran 
              required because yambo+mpi ^hdf5~mpi requested from CLI 
          required because netcdf-fortran depends on netcdf-c@4.7.4: when @4.5.3: 
            required because yambo depends on netcdf-fortran 
              required because yambo+mpi ^hdf5~mpi requested from CLI 
          required because yambo depends on netcdf-c 
            required because yambo+mpi ^hdf5~mpi requested from CLI 
          required because yambo depends on netcdf-c+mpi when +mpi 
            required because yambo+mpi ^hdf5~mpi requested from CLI 
        required because yambo+mpi ^hdf5~mpi requested from CLI 

Future work:

In addition to fleshing out the causes of other errors, I would like to find a way to associate different components of the error messages with different causes. In this example it's pretty easy to infer which part is which, but I'm not confident that will always be the case. 

See the previous PR #34500 for discussion of how the condition chains are incomplete. In the future, we may need custom logic for individual attributes to associate some important choice rules with conditions such that clingo choices or other derivations can be part of the explanation.

---------

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2023-11-06 09:55:21 -08:00
Michael Kuhn
d3d82e8d6b c-blosc2: add v2.11.1 (#40889) 2023-11-06 09:48:42 -08:00
Tamara Dahlgren
17a9198c78 Environments: remove environments created with SpackYAMLErrors (#40878) 2023-11-06 18:48:28 +01:00
Juan Miguel Carceller
c6c689be28 pythia8: fix configure args (#40644)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2023-11-06 09:33:23 -08:00
AMD Toolchain Support
ab563c09d2 enable threading in amdlibflame (#40852)
Co-authored-by: vkallesh <Vijay-teekinavar.Kallesh@amd.com>
2023-11-06 09:20:19 -08:00
Sergio Sánchez Ramírez
abdac36fd5 Add Python as build dependency of Julia (#40903) 2023-11-06 09:03:38 -07:00
Harmen Stoppels
b8a18f0a78 mpich: remove unnecessary tuples and upperbounds (#40899)
* mpich: remove unnecessary tuples

* remove redundant :3.3.99 upperbound
2023-11-06 07:58:50 -07:00
Wouter Deconinck
17656b2ea0 qt: new version 5.15.11 (#40884)
* qt: new version 5.15.11

* qt: open end patch for qtlocation when gcc-10:
2023-11-06 06:08:19 -07:00
Harmen Stoppels
3c641c8509 spack env activate: create & activate default environment without args (#40756)
This PR implements the concept of "default environment", which doesn't have to be
created explicitly. The aim is to lower the barrier for adopting environments.

To (create and) activate the default environment, run

```
$ spack env activate
```

This mimics the behavior of

```
$ cd
```

which brings you to your home directory.

This is not a breaking change, since `spack env activate` without arguments
currently errors. It is similar to the already existing `spack env activate --temp`
command which always creates an env in a temporary directory, the difference
is that the default environment is a managed / named environment named `default`.

The name `default` is not a reserved name, it's just that `spack env activate`
creates it for you if you don't have it already.

With this change, you can get started with environments faster:

```
$ spack env activate [--prompt]
$ spack install --add x y z
```

instead of

```
$ spack env create default
==> Created environment 'default in /Users/harmenstoppels/spack/var/spack/environments/default
==> You can activate this environment with:
==>   spack env activate default
$ spack env activate [--prompt] default 
$ spack install --add x y z
```

Notice that Spack supports switching (but not stacking) environments, so the
parallel with `cd` is pretty clear:

```
$ spack env activate named_env
$ spack env status
==> In environment named_env
$ spack env activate
$ spack env status
==> In environment default
```
2023-11-05 22:53:26 -08:00
Michael Kuhn
141c7de5d8 Add command and package suggestions (#40895)
* Add command suggestions

This adds suggestions of similar commands in case users mistype a
command. Before:
```
$ spack spack
==> Error: spack is not a recognized Spack command or extension command; check with `spack commands`.
```
After:
```
$ spack spack
==> Error: spack is not a recognized Spack command or extension command; check with `spack commands`.

Did you mean one of the following commands?
  spec
  patch
```

* Add package name suggestions

* Remove suggestion to run spack clean -m
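
For reference, a hypothetical sketch of how such "did you mean" hints can be computed with Python's standard `difflib`; Spack's actual implementation may differ:

```python
import difflib

KNOWN_COMMANDS = ["spec", "patch", "install", "concretize"]

def suggestions(mistyped, commands=KNOWN_COMMANDS, n=2):
    # Return up to n known commands whose similarity ratio to the
    # mistyped word is at least the cutoff, best matches first.
    return difflib.get_close_matches(mistyped, commands, n=n, cutoff=0.5)

print(suggestions("spack"))  # -> ['spec', 'patch']
```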
2023-11-05 14:32:09 -08:00
Todd Gamblin
f6b23b4653 bugfix: compress aliases for first command in completion (#40890)
This completes to `spack concretize`:

```
spack conc<tab>
```

but this still gets hung up on the difference between `concretize` and `concretise`:

```
spack -e . conc<tab>
```

We were checking `"$COMP_CWORD" = 1`, which tracks the word on the command line
including any flags and their args, but we should track `"$COMP_CWORD_NO_FLAGS" = 1` to
figure out if the arg we're completing is the first real command.
2023-11-05 10:15:37 +00:00
Harmen Stoppels
4755b28398 Hidden modules: always append hash (#40868) 2023-11-05 08:56:11 +01:00
Tamara Dahlgren
c9dfb9b0fd Environments: Add support for including definitions files (#33960)
This PR adds support for including separate definitions from `spack.yaml`.

Supporting the inclusion of files with definitions enables users to make
curated/standardized collections of packages that can be re-used by others.
2023-11-05 00:47:06 -07:00
Veselin Dobrev
5a67c578b7 mfem: allow cuda/rocm builds with superlu-dist built without cuda/rocm (#40847) 2023-11-04 20:15:56 -05:00
Michael Kuhn
e47be18acb c-blosc: add v1.21.5 (#40888) 2023-11-04 16:51:37 -07:00
Harmen Stoppels
6593d22c4e spack.modules.common: pass spec to SetupContext (#40886)
Currently module globals aren't set before running
`setup_[dependent_]run_environment` to compute environment modifications
for module files. This commit fixes that.
2023-11-04 20:42:47 +00:00
Massimiliano Culpo
f51dad976e hdf5-vol-async: better specify dependency condition (#40882) 2023-11-04 20:31:52 +01:00
Cameron Rutherford
ff8cd597e0 hiop: fix cuda constraints (#40875) 2023-11-04 13:09:59 -05:00
eugeneswalker
fd22d109a6 sundials +sycl: add cxxflags=-fsycl via flag_handler (#40845) 2023-11-04 08:55:19 -05:00
zv-io
88ee3a0fba linux-headers: support multiple versions (#40877)
The download URL for linux-headers was hardcoded to 4.x;
we need to derive the correct URL from the version number.
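
A hedged sketch of the kind of recipe change this implies, deriving the `v<major>.x` directory from the version (illustrative only; the actual package code may differ):

```python
def url_for_version(self, version):
    # kernel.org organizes tarballs by major version: v4.x, v5.x, v6.x, ...
    return f"https://cdn.kernel.org/pub/linux/kernel/v{version[0]}.x/linux-{version}.tar.xz"
```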
2023-11-04 12:21:12 +01:00
Massimiliano Culpo
f50377de7f environment: solve one spec per child process (#40876)
Looking at the memory profiles of concurrent solves
for environments with unify:false, it seems memory
only ramps up.

This exchange on the potassco mailing list:
 https://sourceforge.net/p/potassco/mailman/potassco-users/thread/b55b5b8c2e8945409abb3fa3c935c27e%40lohn.at/#msg36517698

suggests that clingo doesn't release memory
until the end of the application.

Since with unify:false we distribute work across child
processes, we set maxtasksperchild=1 so that memory
is cleaned up after each solve.
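
In terms of Python's multiprocessing API, the fix amounts to something like the following sketch (`solve_one_spec` is a stand-in name):

```python
import multiprocessing

def solve_one_spec(spec):
    ...  # run a single clingo solve in this child process

def solve_all(specs):
    # maxtasksperchild=1 replaces each worker after one solve, so memory
    # retained by clingo is released when the child process exits.
    with multiprocessing.Pool(maxtasksperchild=1) as pool:
        return pool.map(solve_one_spec, specs)
```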
2023-11-03 23:10:42 +00:00
Adam J. Stewart
8e96d3a051 GDAL: add v3.7.3 (#40865) 2023-11-03 22:59:52 +01:00
Richarda Butler
8fc1ba2d7a Bugfix: propagation of multivalued variants (#39833)
Don't encourage use of default value if propagating a multivalued variant.
2023-11-03 12:09:39 -07:00
Massimiliano Culpo
668a5b45e5 clingo-bootstrap: force setuptools through variant (#40866) 2023-11-03 16:53:45 +01:00
Andrew W Elble
70171d6caf squashfuse: remove url_for_version (#40862)
0.5.0 tarball now has the 'v' removed from the name
2023-11-03 10:34:25 -04:00
Thomas-Ulrich
0f1898c82a xdmf3: fix compilation with hdf5@1.10 and above (#37551) 2023-11-03 14:23:49 +01:00
Massimiliano Culpo
db16335aec ASP-based solver: fix for unsplittable providers (#40859)
Some providers must provide virtuals "together", i.e.
if they provide one virtual of a set, they must also
be the providers of the others.

There was a bug though, where we were not checking if
the other virtuals in the set were needed at all in
the DAG.

This commit fixes the bug.
2023-11-03 12:56:37 +01:00
Harmen Stoppels
3082ce6a22 oci parsing: make image name case insensitive (#40858) 2023-11-03 12:50:30 +01:00
George Young
fe0cf80e05 py-spython: updating to @0.3.1 (#40839)
* py-spython: updating to @0.3.1

* Adding `when=` for py-semver

---------

Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2023-11-03 05:07:58 -06:00
Thomas-Ulrich
a5e6097af7 fix typo in packaging guide (#40853) 2023-11-03 09:56:13 +01:00
eugeneswalker
d4a1618e07 tau: update 2.33 hash, add syscall variant (#40851)
Co-authored-by: wspear <wjspear@gmail.com>
2023-11-03 07:58:00 +01:00
Veselin Dobrev
48a21970d1 MFEM: add logic to find CUDA math-libs when using HPC SDK installation (#40815)
* mfem: add logic to find CUDA math-libs when using HPC SDK installation

* [@spackbot] updating style on behalf of v-dobrev
2023-11-02 20:19:11 -07:00
Martin Aumüller
864d47043c qt-svg: new package for Qt6 SVG module (#40834)
enables loading of SVG icons by providing the plugin used by qt-base
2023-11-02 17:05:54 -07:00
Martin Aumüller
c2af2bcac3 qt-*: add v6.5.3 & v6.6.0 (#40833) 2023-11-02 19:52:15 -04:00
Martin Aumüller
7c79c744b6 libtheora: fix build on macos (#40840)
* libtheora: regenerate Makefile.in during autoreconf

The patch to inhibit running of configure would exit autogen.sh so early
that it did not yet run autoconf/automake/...
Instead of patching autogen.sh, just pass -V as an argument, as this is
passed on to configure and lets it print its version instead of
configuring the build tree.

Also drop arguments from autogen.sh, as they are unused when configure
does not run.

* libtheora: fix build on macos

Apply upstream patches in order to avoid unresolved symbols during building of libtheoraenc.
These patches require re-running automake/autoconf/...

Error messages:
libtool: link: /Users/ma/git/spack/lib/spack/env/clang/clang -dynamiclib  -o .libs/libtheoraenc.1.dylib  .libs/apiwrapper.o .libs/fragment.o .libs/idct.o .libs/internal.o .libs/state.o .libs/quant.o .libs/analyze.o .libs/fdct.o .libs/encfrag.o .libs/encapiwrapper.o .libs/encinfo.o .libs/encode.o .libs/enquant.o .libs/huffenc.o .libs/mathops.o .libs/mcenc.o .libs/rate.o .libs/tokenize.o   -L/opt/spack/darwin-sonoma-m1/apple-clang-15.0.0/libtheora-1.1.1-uflq3jvysewnrmlj5x5tvltst65ho3v4/lib -logg -lm  -Wl,-exported_symbols_list -Wl,/var/folders/zv/qr55pmd9065glf0mcltpx5bm000102/T/ma/spack-stage/spack-stage-libtheora-1.1.1-uflq3jvysewnrmlj5x5tvltst65ho3v4/spack-src/lib/theoraenc.exp   -install_name  /opt/spack/darwin-sonoma-m1/apple-clang-15.0.0/libtheora-1.1.1-uflq3jvysewnrmlj5x5tvltst65ho3v4/lib/libtheoraenc.1.dylib -compatibility_version 3 -current_version 3.2
ld: warning: search path '/opt/spack/darwin-sonoma-m1/apple-clang-15.0.0/libtheora-1.1.1-uflq3jvysewnrmlj5x5tvltst65ho3v4/lib' not found
ld: Undefined symbols:
  _th_comment_add, referenced from:
      _theora_comment_add in apiwrapper.o
  _th_comment_add_tag, referenced from:
      _theora_comment_add_tag in apiwrapper.o
  _th_comment_clear, referenced from:
      _theora_comment_clear in apiwrapper.o
  _th_comment_init, referenced from:
      _theora_comment_init in apiwrapper.o
  _th_comment_query, referenced from:
      _theora_comment_query in apiwrapper.o
  _th_comment_query_count, referenced from:
      _theora_comment_query_count in apiwrapper.o

* libtheora: add git versions

`stable` was chosen as the version name for the theora-1.1 branch so that it sorts between 1.1.x and master

* libtheora: remove unused patch

thanks to @michaelkuhn for noticing
2023-11-03 00:08:22 +01:00
garylawson
94d143763e Update Anaconda3 -- add version 2023.09-0 for x86_64, aarch64, and ppc64le (#40622)
* Add 2023.09-0 for x86_64, aarch64, and ppc64le
   Extend the anaconda3 package.py to support aarch64 and ppc64le, and
   add the latest version of anaconda3 to each new platform, including the existing x86_64.
* formatting
2023-11-02 16:42:44 -06:00
Vanessasaurus
6f9425c593 Automated deployment to update package flux-sched 2023-10-18 (#40596)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
Co-authored-by: Tom Scogland <scogland1@llnl.gov>
2023-11-02 13:16:39 -07:00
Nicolas Cornu
05953e4491 highfive: 2.8.0 (#40837)
Co-authored-by: Nicolas Cornu <me@alkino.fr>
2023-11-02 14:03:44 -06:00
Sergey Kosukhin
6b236f130c eccodes: rename variant 'definitions' to 'extra_definitions' (#36186) 2023-11-02 13:28:31 -06:00
Greg Becker
fa08de669e bugfix: computing NodeID2 in requirement node_flag_source (#40846) 2023-11-02 20:17:54 +01:00
Seth R. Johnson
c2193b5470 py-pint: new versions 0.21, 0.22 (#40745)
* py-pint: new versions 0.21, 0.22

* Address feedback

* Fix dumb typo

* Add typing extension requirement
2023-11-02 14:13:19 -05:00
Chris Richardson
b5b94d89d3 Update to latest version (#40778) 2023-11-02 14:07:44 -05:00
vucoda
dd57b58c2f py-pyside2: fix to build with newer llvm and to use spack install headers (#40544)
* Fix py-pyside2 to build with newer llvm and to use spack libglx and libxcb headers where system headers are missing

pyside2 needs LLVM_INSTALL_DIR to be set when using llvm@11:, and expects system headers for libglx and libxcb; it won't build otherwise.

* Fix styling

* remove raw string type

* Update var/spack/repos/builtin/packages/py-pyside2/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-11-02 14:03:18 -05:00
Chris Richardson
29a30963b3 Fixes to ffcx @0.6.0 (#40787) 2023-11-02 14:02:07 -05:00
Jordan Ogas
3447e425f0 add charliecloud 0.35 (#40842)
* add charliecloud 0.35
* fix linter rage
* fix linter rage?
2023-11-02 11:23:49 -07:00
Juan Miguel Carceller
518da16833 Gaudi: Add a few versions and a dependency on tbb after 37.1 (#40802)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2023-11-02 11:15:27 -07:00
Paul R. C. Kent
4633327e60 llvm: add 17.0.2-4 (#40820) 2023-11-02 17:00:35 +01:00
Harmen Stoppels
6930176ac6 clingo ^python@3.12: revisit distutils fix (#40844) 2023-11-02 16:48:21 +01:00
Adam J. Stewart
bb64b22066 PyTorch: build with external sleef (#40763)
Co-authored-by: adamjstewart <adamjstewart@users.noreply.github.com>
2023-11-02 16:09:49 +01:00
Harmen Stoppels
8b0ab67de4 depfile: deal with empty / non-concrete env (#40816) 2023-11-02 16:04:35 +01:00
Satish Balay
dbf21bf843 exago: update petsc dependency (#40831) 2023-11-02 07:29:37 -07:00
Harmen Stoppels
af3a29596e go/rust bootstrap: no versions if unsupported arch (#40841)
The lookup in a dictionary causes KeyError on package load for
unsupported architectures such as i386 and ppc big endian.
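
A small illustration of the failure mode and the graceful alternative; `_checksums` and the arch keys here are made up for the example:

```python
_checksums = {"x86_64": "sha256-...", "aarch64": "sha256-..."}

arch = "i386"
# Before: a plain dict lookup raises KeyError at package load time.
#     sha = _checksums[arch]
# After: .get() returns None, so no versions are offered on
# unsupported architectures instead of crashing the package load.
sha = _checksums.get(arch)
```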
2023-11-02 08:13:13 -06:00
Harmen Stoppels
80944d22f7 spack external find: fix multi-arch troubles (#33973) 2023-11-02 09:45:31 +01:00
Tamara Dahlgren
f56efaff3e env remove: add a unit test removing two environments (#40814) 2023-11-02 08:51:08 +01:00
Martin Aumüller
83bb2002b4 openscenegraph: support more file formats (#39897) 2023-11-02 08:41:03 +01:00
Massimiliano Culpo
16fa3b9f07 Cherry-picking virtual dependencies (#35322)
This PR makes it possible to select only a subset of virtual dependencies from a spec that _may_ provide more. To select providers, a syntax to specify edge attributes is introduced:
```
hdf5 ^[virtuals=mpi] mpich
```
With that syntax we can concretize specs like:
```console
$ spack spec strumpack ^[virtuals=mpi] intel-parallel-studio+mkl ^[virtuals=lapack] openblas
```

On `develop` this would currently fail with:
```console
$ spack spec strumpack ^intel-parallel-studio+mkl ^openblas
==> Error: Spec cannot include multiple providers for virtual 'blas'
    Requested 'intel-parallel-studio' and 'openblas'
```

In package recipes, virtual specs that are declared in the same `provides` directive need to be provided _together_. This means that e.g. `openblas`, which has:
```python
provides("blas", "lapack")
```
needs to provide both `lapack` and `blas` when requested to provide at least one of them.

## Additional notes

This capability is needed to model compilers. Assuming that languages are treated like virtual dependencies, we might want e.g. to use LLVM to compile C/C++ and Gnu GCC to compile Fortran. This can be accomplished by the following[^1]:
```
hdf5 ^[virtuals=c,cxx] llvm ^[virtuals=fortran] gcc
```

[^1]: We plan to add some syntactic sugar around this syntax, and reuse the `%` sigil to avoid having a lot of boilerplate around compilers.

Modifications:
- [x] Add syntax to interact with edge attributes from spec literals
- [x] Add concretization logic to be able to cherry-pick virtual dependencies
- [x] Extend semantic of the `provides` directive to express when virtuals need to be provided together
- [x] Add unit-tests and documentation
2023-11-01 23:35:23 -07:00
Thomas Madlener
6cd2241e49 edm4hep: Add 0.10.1 tag and update maintainers (#40829)
* edm4hep: add latest tag
* edm4hep: Add myself as maintainer
2023-11-01 23:04:00 -06:00
RikkiButler20
88368dea40 [@spackbot] updating style on behalf of RikkiButler20 2023-11-02 02:25:24 +00:00
Kayla Butler
968223758a Update & add test
Test for propagating compiler flags that aren't on the root spec
2023-11-01 18:47:49 -07:00
snehring
6af45230b4 ceres-solver: adding version 2.2.0 (#40824)
* ceres-solver: adding version 2.2.0
* ceres-solver: adding suite-sparse dep
2023-11-01 17:47:55 -07:00
snehring
a8285f0eec vcftools: add v0.1.16 (#40805)
* vcftools: adding new version 0.1.16

* Update var/spack/repos/builtin/packages/vcftools/package.py

Co-authored-by: Alec Scott <alec@bcs.sh>

---------

Co-authored-by: Alec Scott <alec@bcs.sh>
2023-11-01 16:33:12 -07:00
Adam J. Stewart
e7456e1aab py-matplotlib: add v3.8.1 (#40819) 2023-11-01 16:33:00 -07:00
Jeremy L Thompson
dd636dd3fb libCEED v0.12.0, Ratel v0.3.0 (#40822)
* ratel - add v0.3.0
* libceed - add version 0.12.0
2023-11-01 16:29:18 -07:00
Mikael Simberg
a73c95b734 pika: Add 0.20.0 (#40817) 2023-11-01 17:19:56 -06:00
Kayla Butler
177976b762 Remove compiler flag propagation from concretize.lp 2023-10-20 15:02:46 -07:00
Kayla Butler
3c9bac5861 Create a separate file for compiler flag propagation 2023-10-20 12:00:05 -07:00
Kayla Butler
b8c373f51e Check all dependencies for test 2023-10-05 11:03:48 -07:00
Kayla Butler
a27da63d3e Change test to widen spec checks 2023-10-05 09:41:07 -07:00
Kayla Butler
8a825667e4 Specify that Packages are PackageNodes 2023-10-05 09:25:58 -07:00
Richarda Butler
9db3d9dd6a Merge branch 'develop' into bugfix/compiler-flag-propagation 2023-09-29 13:35:38 -07:00
Richarda Butler
69bbccef6e Merge branch 'develop' into bugfix/compiler-flag-propagation 2023-09-28 14:38:54 -07:00
Richarda Butler
4bf6325930 Merge branch 'develop' into bugfix/compiler-flag-propagation 2023-09-25 08:37:41 -07:00
Richarda Butler
8a56c096d3 Merge branch 'develop' into bugfix/compiler-flag-propagation 2023-09-11 13:27:25 -07:00
Richarda Butler
8760d3885a Merge branch 'develop' into bugfix/compiler-flag-propagation 2023-09-07 13:52:08 -07:00
Richarda Butler
1740c575a2 Merge branch 'develop' into bugfix/compiler-flag-propagation 2023-09-06 11:05:17 -07:00
Richarda Butler
d882116f75 Merge branch 'develop' into bugfix/compiler-flag-propagation 2023-09-05 09:20:32 -07:00
Richarda Butler
90393b77d9 Merge branch 'develop' into bugfix/compiler-flag-propagation 2023-08-31 09:20:48 -07:00
Richarda Butler
2096f35f76 Merge branch 'develop' into bugfix/compiler-flag-propagation 2023-08-30 14:27:36 -07:00
Kayla Butler
8897e1e11f Only specified compiler flag propagates 2023-08-29 13:18:12 -07:00
Kayla Butler
f0df745f54 Make sure literals have the right node id when propagating 2023-08-28 15:07:16 -07:00
RikkiButler20
06c516c580 [@spackbot] updating style on behalf of RikkiButler20 2023-08-24 15:24:06 -07:00
Kayla Butler
0d9b416bc5 Check that compiler flags can propagate 2023-08-24 15:22:50 -07:00
Kayla Butler
a9be08f560 Make sure to get the value from source package 2023-08-24 15:16:04 -07:00
Kayla Butler
eb62129654 Fix error in concretizer 2023-08-24 15:16:03 -07:00
Kayla Butler
92ff8cfd45 Simplify node_flag_propagate and propagate past 1st level dependencies 2023-08-24 15:14:53 -07:00
Kayla Butler
b0d13f8fcc Making compiler flag propagation test uniform 2023-08-24 14:49:57 -07:00
Kayla Butler
e101c77768 Change test for not passing compiler flag 2023-08-24 14:49:56 -07:00
Kayla Butler
fe5ea193fc Updating test and put in a fix 2023-08-24 14:42:57 -07:00
193 changed files with 4653 additions and 1489 deletions

View File

@@ -159,6 +159,9 @@ jobs:
brew install cmake bison@2.7 tree
- name: Checkout
uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236 # @v2
with:
python-version: "3.12"
- name: Bootstrap clingo
run: |
source share/spack/setup-env.sh

View File

@@ -38,12 +38,11 @@ jobs:
# Meaning of the various items in the matrix list
# 0: Container name (e.g. ubuntu-bionic)
# 1: Platforms to build for
# 2: Base image (e.g. ubuntu:18.04)
# 2: Base image (e.g. ubuntu:22.04)
dockerfile: [[amazon-linux, 'linux/amd64,linux/arm64', 'amazonlinux:2'],
[centos7, 'linux/amd64,linux/arm64,linux/ppc64le', 'centos:7'],
[centos-stream, 'linux/amd64,linux/arm64,linux/ppc64le', 'centos:stream'],
[leap15, 'linux/amd64,linux/arm64,linux/ppc64le', 'opensuse/leap:15'],
[ubuntu-bionic, 'linux/amd64,linux/arm64,linux/ppc64le', 'ubuntu:18.04'],
[ubuntu-focal, 'linux/amd64,linux/arm64,linux/ppc64le', 'ubuntu:20.04'],
[ubuntu-jammy, 'linux/amd64,linux/arm64,linux/ppc64le', 'ubuntu:22.04'],
[almalinux8, 'linux/amd64,linux/arm64,linux/ppc64le', 'almalinux:8'],
@@ -58,18 +57,20 @@ jobs:
- name: Checkout
uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
- name: Set Container Tag Normal (Nightly)
run: |
container="${{ matrix.dockerfile[0] }}:latest"
echo "container=${container}" >> $GITHUB_ENV
echo "versioned=${container}" >> $GITHUB_ENV
# On a new release create a container with the same tag as the release.
- name: Set Container Tag on Release
if: github.event_name == 'release'
run: |
versioned="${{matrix.dockerfile[0]}}:${GITHUB_REF##*/}"
echo "versioned=${versioned}" >> $GITHUB_ENV
- uses: docker/metadata-action@96383f45573cb7f253c731d3b3ab81c87ef81934
id: docker_meta
with:
images: |
ghcr.io/${{ github.repository_owner }}/${{ matrix.dockerfile[0] }}
${{ github.repository_owner }}/${{ matrix.dockerfile[0] }}
tags: |
type=schedule,pattern=nightly
type=schedule,pattern=develop
type=semver,pattern={{version}}
type=semver,pattern={{major}}.{{minor}}
type=semver,pattern={{major}}
type=ref,event=branch
type=ref,event=pr
- name: Generate the Dockerfile
env:
@@ -92,13 +93,13 @@ jobs:
path: dockerfiles
- name: Set up QEMU
uses: docker/setup-qemu-action@68827325e0b33c7199eb31dd4e31fbe9023e06e3 # @v1
uses: docker/setup-qemu-action@68827325e0b33c7199eb31dd4e31fbe9023e06e3
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@f95db51fddba0c2d1ec667646a06c2ce06100226 # @v1
uses: docker/setup-buildx-action@f95db51fddba0c2d1ec667646a06c2ce06100226
- name: Log in to GitHub Container Registry
uses: docker/login-action@343f7c4344506bcbf9b4de18042ae17996df046d # @v1
uses: docker/login-action@343f7c4344506bcbf9b4de18042ae17996df046d
with:
registry: ghcr.io
username: ${{ github.actor }}
@@ -106,21 +107,18 @@ jobs:
- name: Log in to DockerHub
if: github.event_name != 'pull_request'
uses: docker/login-action@343f7c4344506bcbf9b4de18042ae17996df046d # @v1
uses: docker/login-action@343f7c4344506bcbf9b4de18042ae17996df046d
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Build & Deploy ${{ matrix.dockerfile[0] }}
uses: docker/build-push-action@0565240e2d4ab88bba5387d719585280857ece09 # @v2
uses: docker/build-push-action@0565240e2d4ab88bba5387d719585280857ece09
with:
context: dockerfiles/${{ matrix.dockerfile[0] }}
platforms: ${{ matrix.dockerfile[1] }}
push: ${{ github.event_name != 'pull_request' }}
cache-from: type=gha
cache-to: type=gha,mode=max
tags: |
spack/${{ env.container }}
spack/${{ env.versioned }}
ghcr.io/spack/${{ env.container }}
ghcr.io/spack/${{ env.versioned }}
tags: ${{ steps.docker_meta.outputs.tags }}
labels: ${{ steps.docker_meta.outputs.labels }}

View File

@@ -229,3 +229,11 @@ config:
flags:
# Whether to keep -Werror flags active in package builds.
keep_werror: 'none'
# A mapping of aliases that can be used to define new commands. For instance,
# `sp: spec -I` will define a new command `sp` that will execute `spec` with
# the `-I` argument. Aliases cannot override existing commands.
aliases:
concretise: concretize
containerise: containerize
rm: remove

View File

@@ -46,10 +46,12 @@ modules:
tcl:
all:
autoload: direct
hide_implicits: true
# Default configurations if lmod is enabled
lmod:
all:
autoload: direct
hide_implicits: true
hierarchy:
- mpi

View File

@@ -1526,6 +1526,30 @@ any MPI implementation will do. If another package depends on
error. Likewise, if you try to plug in some package that doesn't
provide MPI, Spack will raise an error.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Explicit binding of virtual dependencies
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
There are packages that provide more than just one virtual dependency. When interacting with them, users
might want to utilize just a subset of what they could provide, and use other providers for virtuals they
need.
It is possible to be more explicit and tell Spack which dependency should provide which virtual, using a
special syntax:
.. code-block:: console
$ spack spec strumpack ^[virtuals=mpi] intel-parallel-studio+mkl ^[virtuals=lapack] openblas
Concretizing the spec above produces the following DAG:
.. figure:: images/strumpack_virtuals.svg
:scale: 60 %
:align: center
where ``intel-parallel-studio`` *could* provide ``mpi``, ``lapack``, and ``blas`` but is used only for the former. The ``lapack``
and ``blas`` dependencies are satisfied by ``openblas``.
^^^^^^^^^^^^^^^^^^^^^^^^
Specifying Specs by Hash
^^^^^^^^^^^^^^^^^^^^^^^^

View File

@@ -155,6 +155,33 @@ List of popular build caches
* `Extreme-scale Scientific Software Stack (E4S) <https://e4s-project.github.io/>`_: `build cache <https://oaciss.uoregon.edu/e4s/inventory.html>`_
----------
Relocation
----------
When using buildcaches across different machines, it is likely that the install
root will be different from the one used to build the binaries.
To address this issue, Spack automatically relocates all paths encoded in binaries
and scripts to their new location upon install.
Note that there are some cases where this is not possible: if binaries are built in
a relatively short path, and then installed to a longer path, there may not be enough
space in the binary to encode the new path. In this case, Spack will fail to install
the package from the build cache, and a source build is required.
To reduce the likelihood of this happening, it is highly recommended to add padding to
the install root during the build, as specified in the :ref:`config <config-yaml>`
section of the configuration:
.. code-block:: yaml
config:
install_tree:
root: /opt/spack
padded_length: 128
-----------------------------------------
OCI / Docker V2 registries as build cache
@@ -216,29 +243,34 @@ other system dependencies. However, they are still compatible with tools like
are `alternative drivers <https://docs.docker.com/storage/storagedriver/>`_.
------------------------------------
Using a buildcache in GitHub Actions
Spack build cache for GitHub Actions
------------------------------------
GitHub Actions is a popular CI/CD platform for building and testing software,
but each CI job has limited resources, making from source builds too slow for
many applications. Spack build caches can be used to share binaries between CI
runs, speeding up CI significantly.
To significantly speed up Spack in GitHub Actions, binaries can be cached in
GitHub Packages. This service is an OCI registry that can be linked to a GitHub
repository.
A typical workflow is to include a ``spack.yaml`` environment in your repository
that specifies the packages to install:
that specifies the packages to install, the target architecture, and the build
cache to use under ``mirrors``:
.. code-block:: yaml
spack:
specs: [pkg-x, pkg-y]
packages:
all:
require: target=x86_64_v2
mirrors:
github_packages: oci://ghcr.io/<user>/<repo>
specs:
- python@3.11
config:
install_tree:
root: /opt/spack
padded_length: 128
packages:
all:
require: target=x86_64_v2
mirrors:
local-buildcache: oci://ghcr.io/<organization>/<repository>
And a GitHub action that sets up Spack, installs packages from the build cache
or from sources, and pushes newly built binaries to the build cache:
A GitHub action can then be used to install the packages and push them to the
build cache:
.. code-block:: yaml
@@ -252,26 +284,35 @@ or from sources, and pushes newly built binaries to the build cache:
jobs:
example:
runs-on: ubuntu-22.04
permissions:
packages: write
steps:
- name: Checkout
uses: actions/checkout@v3
- name: Install Spack
run: |
git clone --depth=1 https://github.com/spack/spack.git
echo "$PWD/spack/bin/" >> "$GITHUB_PATH"
- name: Checkout Spack
uses: actions/checkout@v3
with:
repository: spack/spack
path: spack
- name: Setup Spack
run: echo "$PWD/spack/bin" >> "$GITHUB_PATH"
- name: Concretize
run: spack -e . concretize
- name: Install
run: spack -e . install --no-check-signature --fail-fast
run: spack -e . install --no-check-signature
- name: Run tests
run: ./my_view/bin/python3 -c 'print("hello world")'
- name: Push to buildcache
run: |
spack -e . mirror set --oci-username <user> --oci-password "${{ secrets.GITHUB_TOKEN }}" github_packages
spack -e . buildcache push --base-image ubuntu:22.04 --unsigned --update-index github_packages
if: always()
spack -e . mirror set --oci-username ${{ github.actor }} --oci-password "${{ secrets.GITHUB_TOKEN }}" local-buildcache
spack -e . buildcache push --base-image ubuntu:22.04 --unsigned --update-index local-buildcache
if: ${{ !cancelled() }}
The first time this action runs, it will build the packages from source and
push them to the build cache. Subsequent runs will pull the binaries from the
@@ -281,15 +322,15 @@ over source builds.
The build cache entries appear in the GitHub Packages section of your repository,
and contain instructions for pulling and running them with ``docker`` or ``podman``.
----------
Relocation
----------
Initial build and later installation do not necessarily happen at the same
location. Spack provides a relocation capability and corrects for RPATHs and
non-relocatable scripts. However, many packages compile paths into binary
artifacts directly. In such cases, the build instructions of this package would
need to be adjusted for better re-locatability.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Using Spack's public build cache for GitHub Actions
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Spack offers a public build cache for GitHub Actions with a set of common packages,
which lets you get started quickly. See the following resources for more information:
* `spack/github-actions-buildcache <https://github.com/spack/github-actions-buildcache>`_
.. _cmd-spack-buildcache:

View File

@@ -526,56 +526,52 @@ Package Preferences
In some cases package requirements can be too strong, and package
preferences are the better option. Package preferences do not impose
constraints on packages for particular versions or variants values,
they rather only set defaults -- the concretizer is free to change
them if it must due to other constraints. Also note that package
preferences are of lower priority than reuse of already installed
packages.
they rather only set defaults. The concretizer is free to change
them if it must, due to other constraints, and also prefers reusing
installed packages over building new ones that are a better match for
preferences.
Here's an example ``packages.yaml`` file that sets preferred packages:
Most package preferences (``compilers``, ``target`` and ``providers``)
can only be set globally under the ``all`` section of ``packages.yaml``:
.. code-block:: yaml
packages:
all:
compiler: [gcc@12.2.0, clang@12:, oneapi@2023:]
target: [x86_64_v3]
providers:
mpi: [mvapich2, mpich, openmpi]
These preferences override Spack's default and effectively reorder priorities
when looking for the best compiler, target or virtual package provider. Each
preference takes an ordered list of spec constraints, with earlier entries in
the list being preferred over later entries.
In the example above all packages prefer to be compiled with ``gcc@12.2.0``,
to target the ``x86_64_v3`` microarchitecture and to use ``mvapich2`` if they
depend on ``mpi``.
The ``variants`` and ``version`` preferences can be set under
package specific sections of the ``packages.yaml`` file:
.. code-block:: yaml
packages:
opencv:
compiler: [gcc@4.9]
variants: +debug
gperftools:
version: [2.2, 2.4, 2.3]
all:
compiler: [gcc@4.4.7, 'gcc@4.6:', intel, clang, pgi]
target: [sandybridge]
providers:
mpi: [mvapich2, mpich, openmpi]
At a high level, this example is specifying how packages are preferably
concretized. The opencv package should prefer using GCC 4.9 and
be built with debug options. The gperftools package should prefer version
2.2 over 2.4. Every package on the system should prefer mvapich2 for
its MPI and GCC 4.4.7 (except for opencv, which overrides this by preferring GCC 4.9).
These options are used to fill in implicit defaults. Any of them can be overwritten
on the command line if explicitly requested.
In this case, the preference for ``opencv`` is to build with debug options, while
``gperftools`` prefers version 2.2 over 2.4.
Package preferences accept the follow keys or components under
the specific package (or ``all``) section: ``compiler``, ``variants``,
``version``, ``providers``, and ``target``. Each component has an
ordered list of spec ``constraints``, with earlier entries in the
list being preferred over later entries.
Any preference can be overwritten on the command line if explicitly requested.
Sometimes a package installation may have constraints that forbid
the first concretization rule, in which case Spack will use the first
legal concretization rule. Going back to the example, if a user
requests gperftools 2.3 or later, then Spack will install version 2.4
as the 2.4 version of gperftools is preferred over 2.3.
An explicit concretization rule in the preferred section will always
take preference over unlisted concretizations. In the above example,
xlc isn't listed in the compiler list. Every listed compiler from
gcc to pgi will thus be preferred over the xlc compiler.
The syntax for the ``provider`` section differs slightly from other
concretization rules. A provider lists a value that packages may
``depends_on`` (e.g, MPI) and a list of rules for fulfilling that
dependency.
Preferences cannot overcome explicit constraints, as they only set a preferred
ordering among homogeneous attribute values. Going back to the example, if
``gperftools@2.3:`` was requested, then Spack will install version 2.4
since the most preferred version 2.2 is prohibited by the version constraint.
.. _package_permissions:

View File

@@ -304,3 +304,17 @@ To work properly, this requires your terminal to reset its title after
Spack has finished its work, otherwise Spack's status information will
remain in the terminal's title indefinitely. Most terminals should already
be set up this way and clear Spack's status information.
-----------
``aliases``
-----------
Aliases can be used to define new Spack commands. They can be either shortcuts
for longer commands or include specific arguments for convenience. For instance,
if users want to use ``spack install``'s ``-v`` argument all the time, they can
create a new alias called ``inst`` that will always call ``install -v``:
.. code-block:: yaml
aliases:
inst: install -v

View File

@@ -0,0 +1,534 @@
[SVG figure: images/strumpack_virtuals.svg, the concretized DAG for `strumpack ^[virtuals=mpi] intel-parallel-studio+mkl ^[virtuals=lapack] openblas`; netlib-scalapack links to openblas via an edge labeled virtuals=blas,lapack and to intel-parallel-studio via an edge labeled virtuals=mpi, with cmake, pkgconf, perl, bzip2, and gdbm among the remaining dependency nodes]
<text text-anchor="middle" x="665.0696" y="-512.6" font-family="Monaco" font-size="24.00" fill="#000000">berkeley-db@18.1.40%gcc@9.4.0/gkw4dg2</text>
</g>
<!-- ywrpvv2hgooeepdke33exkqrtdpd5gkl&#45;&gt;gkw4dg2p7rdnhru3m6lcnsjbzyr7g3hb -->
<g id="edge23" class="edge">
<title>ywrpvv2hgooeepdke33exkqrtdpd5gkl-&gt;gkw4dg2p7rdnhru3m6lcnsjbzyr7g3hb</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M664.0696,-635.2072C664.0696,-616.1263 664.0696,-593.5257 664.0696,-573.4046"/>
<path fill="none" stroke="#dc143c" stroke-width="2" d="M666.0696,-635.2072C666.0696,-616.1263 666.0696,-593.5257 666.0696,-573.4046"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="668.5697,-573.1403 665.0696,-563.1403 661.5697,-573.1404 668.5697,-573.1403"/>
</g>
<!-- nizxi5u5bbrzhzwfy2qb7hatlhuswlrz -->
<g id="node24" class="node">
<title>nizxi5u5bbrzhzwfy2qb7hatlhuswlrz</title>
<path fill="#add8e6" stroke="#000000" stroke-width="4" d="M2195.2248,-563.1002C2195.2248,-563.1002 1840.9144,-563.1002 1840.9144,-563.1002 1834.9144,-563.1002 1828.9144,-557.1002 1828.9144,-551.1002 1828.9144,-551.1002 1828.9144,-488.4998 1828.9144,-488.4998 1828.9144,-482.4998 1834.9144,-476.4998 1840.9144,-476.4998 1840.9144,-476.4998 2195.2248,-476.4998 2195.2248,-476.4998 2201.2248,-476.4998 2207.2248,-482.4998 2207.2248,-488.4998 2207.2248,-488.4998 2207.2248,-551.1002 2207.2248,-551.1002 2207.2248,-557.1002 2201.2248,-563.1002 2195.2248,-563.1002"/>
<text text-anchor="middle" x="2018.0696" y="-512.6" font-family="Monaco" font-size="24.00" fill="#000000">zlib@1.2.13%gcc@9.4.0/nizxi5u</text>
</g>
<!-- ywrpvv2hgooeepdke33exkqrtdpd5gkl&#45;&gt;nizxi5u5bbrzhzwfy2qb7hatlhuswlrz -->
<g id="edge4" class="edge">
<title>ywrpvv2hgooeepdke33exkqrtdpd5gkl-&gt;nizxi5u5bbrzhzwfy2qb7hatlhuswlrz</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M861.3292,-654.5584C1116.9929,-624.5514 1561.4447,-572.3867 1818.5758,-542.2075"/>
<path fill="none" stroke="#dc143c" stroke-width="2" d="M861.5624,-656.5447C1117.2261,-626.5378 1561.6778,-574.373 1818.8089,-544.1939"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="1819.373,-546.6449 1828.8968,-542.003 1818.5569,-539.6926 1819.373,-546.6449"/>
</g>
<!-- idvshq5nqmygzd4uo62mdispwgxsw7id -->
<g id="node4" class="node">
<title>idvshq5nqmygzd4uo62mdispwgxsw7id</title>
<path fill="#add8e6" stroke="#000000" stroke-width="4" d="M2383.212,-1674.7002C2383.212,-1674.7002 1972.9272,-1674.7002 1972.9272,-1674.7002 1966.9272,-1674.7002 1960.9272,-1668.7002 1960.9272,-1662.7002 1960.9272,-1662.7002 1960.9272,-1600.0998 1960.9272,-1600.0998 1960.9272,-1594.0998 1966.9272,-1588.0998 1972.9272,-1588.0998 1972.9272,-1588.0998 2383.212,-1588.0998 2383.212,-1588.0998 2389.212,-1588.0998 2395.212,-1594.0998 2395.212,-1600.0998 2395.212,-1600.0998 2395.212,-1662.7002 2395.212,-1662.7002 2395.212,-1668.7002 2389.212,-1674.7002 2383.212,-1674.7002"/>
<text text-anchor="middle" x="2178.0696" y="-1624.2" font-family="Monaco" font-size="24.00" fill="#000000">strumpack@7.0.1%gcc@9.4.0/idvshq5</text>
</g>
<!-- idvshq5nqmygzd4uo62mdispwgxsw7id&#45;&gt;hkcrbrtf2qex6rvzuok5tzdrbam55pdn -->
<g id="edge33" class="edge">
<title>idvshq5nqmygzd4uo62mdispwgxsw7id-&gt;hkcrbrtf2qex6rvzuok5tzdrbam55pdn</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M2177.0696,-1587.8598C2177.0696,-1500.5185 2177.0696,-1304.1624 2177.0696,-1208.8885"/>
<path fill="none" stroke="#dc143c" stroke-width="2" d="M2179.0696,-1587.8598C2179.0696,-1500.5185 2179.0696,-1304.1624 2179.0696,-1208.8885"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="2181.5697,-1208.611 2178.0696,-1198.611 2174.5697,-1208.611 2181.5697,-1208.611"/>
<text text-anchor="middle" x="2125.9224" y="-1397.5399" font-family="Times,serif" font-size="14.00" fill="#000000">virtuals=scalapack</text>
</g>
<!-- idvshq5nqmygzd4uo62mdispwgxsw7id&#45;&gt;o524gebsxavobkte3k5fglgwnedfkadf -->
<g id="edge8" class="edge">
<title>idvshq5nqmygzd4uo62mdispwgxsw7id-&gt;o524gebsxavobkte3k5fglgwnedfkadf</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M1960.6199,-1629.1097C1600.5855,-1621.4505 897.1143,-1596.5054 662.748,-1516.9469 459.8544,-1447.9506 281.1117,-1289.236 401.2427,-1111.0377 418.213,-1086.3492 472.759,-1062.01 530.3793,-1041.9698"/>
<path fill="none" stroke="#dc143c" stroke-width="2" d="M1960.6625,-1627.1101C1600.6564,-1619.4517 897.1852,-1594.5067 663.3912,-1515.0531 461.1823,-1446.4551 282.4397,-1287.7405 402.8965,-1112.1623 419.028,-1088.1757 473.574,-1063.8364 531.0362,-1043.8589"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="532.0142,-1046.1665 540.3395,-1039.6137 529.7449,-1039.5445 532.0142,-1046.1665"/>
<text text-anchor="middle" x="1175.5163" y="-1600.8866" font-family="Times,serif" font-size="14.00" fill="#000000">virtuals=blas,lapack</text>
</g>
<!-- imopnxjmv7cwzyiecdw2saq42qvpnauh -->
<g id="node12" class="node">
<title>imopnxjmv7cwzyiecdw2saq42qvpnauh</title>
<path fill="#add8e6" stroke="#000000" stroke-width="4" d="M3003.3872,-1357.1002C3003.3872,-1357.1002 2606.752,-1357.1002 2606.752,-1357.1002 2600.752,-1357.1002 2594.752,-1351.1002 2594.752,-1345.1002 2594.752,-1345.1002 2594.752,-1282.4998 2594.752,-1282.4998 2594.752,-1276.4998 2600.752,-1270.4998 2606.752,-1270.4998 2606.752,-1270.4998 3003.3872,-1270.4998 3003.3872,-1270.4998 3009.3872,-1270.4998 3015.3872,-1276.4998 3015.3872,-1282.4998 3015.3872,-1282.4998 3015.3872,-1345.1002 3015.3872,-1345.1002 3015.3872,-1351.1002 3009.3872,-1357.1002 3003.3872,-1357.1002"/>
<text text-anchor="middle" x="2805.0696" y="-1306.6" font-family="Monaco" font-size="24.00" fill="#000000">parmetis@4.0.3%gcc@9.4.0/imopnxj</text>
</g>
<!-- idvshq5nqmygzd4uo62mdispwgxsw7id&#45;&gt;imopnxjmv7cwzyiecdw2saq42qvpnauh -->
<g id="edge51" class="edge">
<title>idvshq5nqmygzd4uo62mdispwgxsw7id-&gt;imopnxjmv7cwzyiecdw2saq42qvpnauh</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M2393.6993,-1587.0809C2455.3565,-1569.7539 2521.1771,-1546.2699 2577.5864,-1515.1245 2649.1588,-1475.6656 2717.4141,-1409.6691 2759.9512,-1363.9364"/>
<path fill="none" stroke="#dc143c" stroke-width="2" d="M2394.2404,-1589.0062C2456.0286,-1571.6376 2521.8491,-1548.1536 2578.5528,-1516.8755 2650.5491,-1477.1034 2718.8043,-1411.107 2761.4156,-1365.2986"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="2763.3454,-1366.8938 2767.5512,-1357.1695 2758.1992,-1362.1485 2763.3454,-1366.8938"/>
</g>
<!-- ern66gyp6qmhmpod4jaynxx4weoberfm -->
<g id="node13" class="node">
<title>ern66gyp6qmhmpod4jaynxx4weoberfm</title>
<path fill="#add8e6" stroke="#000000" stroke-width="4" d="M2928.3784,-1198.3002C2928.3784,-1198.3002 2563.7608,-1198.3002 2563.7608,-1198.3002 2557.7608,-1198.3002 2551.7608,-1192.3002 2551.7608,-1186.3002 2551.7608,-1186.3002 2551.7608,-1123.6998 2551.7608,-1123.6998 2551.7608,-1117.6998 2557.7608,-1111.6998 2563.7608,-1111.6998 2563.7608,-1111.6998 2928.3784,-1111.6998 2928.3784,-1111.6998 2934.3784,-1111.6998 2940.3784,-1117.6998 2940.3784,-1123.6998 2940.3784,-1123.6998 2940.3784,-1186.3002 2940.3784,-1186.3002 2940.3784,-1192.3002 2934.3784,-1198.3002 2928.3784,-1198.3002"/>
<text text-anchor="middle" x="2746.0696" y="-1147.8" font-family="Monaco" font-size="24.00" fill="#000000">metis@5.1.0%gcc@9.4.0/ern66gy</text>
</g>
<!-- idvshq5nqmygzd4uo62mdispwgxsw7id&#45;&gt;ern66gyp6qmhmpod4jaynxx4weoberfm -->
<g id="edge25" class="edge">
<title>idvshq5nqmygzd4uo62mdispwgxsw7id-&gt;ern66gyp6qmhmpod4jaynxx4weoberfm</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M2371.6269,-1587.103C2443.5875,-1567.249 2513.691,-1542.0963 2537.3223,-1515.3355 2611.3482,-1433.6645 2525.4748,-1364.8484 2585.2274,-1269.8608 2602.2478,-1243.3473 2627.3929,-1221.1402 2652.8797,-1203.3777"/>
<path fill="none" stroke="#dc143c" stroke-width="2" d="M2372.1589,-1589.0309C2444.2629,-1569.1315 2514.3664,-1543.9788 2538.8169,-1516.6645 2612.5989,-1432.1038 2526.7255,-1363.2878 2586.9118,-1270.9392 2603.5717,-1244.8464 2628.7168,-1222.6393 2654.0229,-1205.0188"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="2655.7411,-1206.8749 2662.0621,-1198.3722 2651.8184,-1201.0773 2655.7411,-1206.8749"/>
</g>
<!-- nqiyrxlid6tikfpvoqdpvsjt5drs2obf -->
<g id="node14" class="node">
<title>nqiyrxlid6tikfpvoqdpvsjt5drs2obf</title>
<path fill="#add8e6" stroke="#000000" stroke-width="4" d="M1964.017,-1357.1002C1964.017,-1357.1002 1532.1222,-1357.1002 1532.1222,-1357.1002 1526.1222,-1357.1002 1520.1222,-1351.1002 1520.1222,-1345.1002 1520.1222,-1345.1002 1520.1222,-1282.4998 1520.1222,-1282.4998 1520.1222,-1276.4998 1526.1222,-1270.4998 1532.1222,-1270.4998 1532.1222,-1270.4998 1964.017,-1270.4998 1964.017,-1270.4998 1970.017,-1270.4998 1976.017,-1276.4998 1976.017,-1282.4998 1976.017,-1282.4998 1976.017,-1345.1002 1976.017,-1345.1002 1976.017,-1351.1002 1970.017,-1357.1002 1964.017,-1357.1002"/>
<text text-anchor="middle" x="1748.0696" y="-1306.6" font-family="Monaco" font-size="24.00" fill="#000000">butterflypack@2.2.2%gcc@9.4.0/nqiyrxl</text>
</g>
<!-- idvshq5nqmygzd4uo62mdispwgxsw7id&#45;&gt;nqiyrxlid6tikfpvoqdpvsjt5drs2obf -->
<g id="edge26" class="edge">
<title>idvshq5nqmygzd4uo62mdispwgxsw7id-&gt;nqiyrxlid6tikfpvoqdpvsjt5drs2obf</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M2118.5874,-1588.7094C2039.1194,-1530.0139 1897.9154,-1425.72 1814.4793,-1364.0937"/>
<path fill="none" stroke="#dc143c" stroke-width="2" d="M2119.7757,-1587.1006C2040.3076,-1528.4052 1899.1036,-1424.1112 1815.6675,-1362.485"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="1817.0581,-1360.404 1806.9348,-1357.2781 1812.8992,-1366.0347 1817.0581,-1360.404"/>
</g>
<!-- 4bu62kyfuh4ikdkuyxfxjxanf7e7qopu -->
<g id="node16" class="node">
<title>4bu62kyfuh4ikdkuyxfxjxanf7e7qopu</title>
<path fill="#add8e6" stroke="#000000" stroke-width="4" d="M1106.2192,-1515.9002C1106.2192,-1515.9002 683.92,-1515.9002 683.92,-1515.9002 677.92,-1515.9002 671.92,-1509.9002 671.92,-1503.9002 671.92,-1503.9002 671.92,-1441.2998 671.92,-1441.2998 671.92,-1435.2998 677.92,-1429.2998 683.92,-1429.2998 683.92,-1429.2998 1106.2192,-1429.2998 1106.2192,-1429.2998 1112.2192,-1429.2998 1118.2192,-1435.2998 1118.2192,-1441.2998 1118.2192,-1441.2998 1118.2192,-1503.9002 1118.2192,-1503.9002 1118.2192,-1509.9002 1112.2192,-1515.9002 1106.2192,-1515.9002"/>
<text text-anchor="middle" x="895.0696" y="-1465.4" font-family="Monaco" font-size="24.00" fill="#000000">slate@2022.07.00%gcc@9.4.0/4bu62ky</text>
</g>
<!-- idvshq5nqmygzd4uo62mdispwgxsw7id&#45;&gt;4bu62kyfuh4ikdkuyxfxjxanf7e7qopu -->
<g id="edge5" class="edge">
<title>idvshq5nqmygzd4uo62mdispwgxsw7id-&gt;4bu62kyfuh4ikdkuyxfxjxanf7e7qopu</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M1960.6663,-1605.4991C1729.5518,-1576.8935 1365.2868,-1531.8075 1128.237,-1502.4673"/>
<path fill="none" stroke="#dc143c" stroke-width="2" d="M1960.912,-1603.5143C1729.7975,-1574.9086 1365.5325,-1529.8227 1128.4827,-1500.4825"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="1128.5789,-1497.9754 1118.2247,-1500.2204 1127.719,-1504.9224 1128.5789,-1497.9754"/>
</g>
<!-- idvshq5nqmygzd4uo62mdispwgxsw7id&#45;&gt;2w3nq3n3hcj2tqlvcpewsryamltlu5tw -->
<g id="edge20" class="edge">
<title>idvshq5nqmygzd4uo62mdispwgxsw7id-&gt;2w3nq3n3hcj2tqlvcpewsryamltlu5tw</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M2395.1113,-1591.5061C2621.5772,-1545.7968 2953.3457,-1462.5053 3023.2362,-1356.6473 3049.986,-1316.785 3021.2047,-1131.5143 3003.3326,-1112.2759 2971.8969,-1077.7826 2884.3944,-1052.6467 2789.1441,-1034.9179"/>
<path fill="none" stroke="#dc143c" stroke-width="2" d="M2395.507,-1593.4665C2622.0642,-1547.7366 2953.8327,-1464.4452 3024.903,-1357.7527 3051.9623,-1316.478 3023.181,-1131.2073 3004.8066,-1110.9241 2972.4491,-1075.8603 2884.9466,-1050.7244 2789.5102,-1032.9517"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="2789.9449,-1030.4898 2779.4781,-1032.132 2788.6845,-1037.3754 2789.9449,-1030.4898"/>
<text text-anchor="middle" x="2611.7445" y="-1537.8321" font-family="Times,serif" font-size="14.00" fill="#000000">virtuals=mpi</text>
</g>
<!-- 7rzbmgoxhmm2jhellkgcjmn62uklf22x -->
<g id="node25" class="node">
<title>7rzbmgoxhmm2jhellkgcjmn62uklf22x</title>
<path fill="#add8e6" stroke="#000000" stroke-width="4" d="M1749.1952,-1515.9002C1749.1952,-1515.9002 1398.944,-1515.9002 1398.944,-1515.9002 1392.944,-1515.9002 1386.944,-1509.9002 1386.944,-1503.9002 1386.944,-1503.9002 1386.944,-1441.2998 1386.944,-1441.2998 1386.944,-1435.2998 1392.944,-1429.2998 1398.944,-1429.2998 1398.944,-1429.2998 1749.1952,-1429.2998 1749.1952,-1429.2998 1755.1952,-1429.2998 1761.1952,-1435.2998 1761.1952,-1441.2998 1761.1952,-1441.2998 1761.1952,-1503.9002 1761.1952,-1503.9002 1761.1952,-1509.9002 1755.1952,-1515.9002 1749.1952,-1515.9002"/>
<text text-anchor="middle" x="1574.0696" y="-1465.4" font-family="Monaco" font-size="24.00" fill="#000000">zfp@0.5.5%gcc@9.4.0/7rzbmgo</text>
</g>
<!-- idvshq5nqmygzd4uo62mdispwgxsw7id&#45;&gt;7rzbmgoxhmm2jhellkgcjmn62uklf22x -->
<g id="edge36" class="edge">
<title>idvshq5nqmygzd4uo62mdispwgxsw7id-&gt;7rzbmgoxhmm2jhellkgcjmn62uklf22x</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M2012.7697,-1588.9743C1930.7903,-1567.4208 1831.729,-1541.3762 1748.4742,-1519.4874"/>
<path fill="none" stroke="#dc143c" stroke-width="2" d="M2013.2782,-1587.0401C1931.2989,-1565.4866 1832.2376,-1539.442 1748.9827,-1517.5531"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="1749.477,-1515.0982 1738.9157,-1515.9403 1747.697,-1521.8681 1749.477,-1515.0982"/>
</g>
<!-- idvshq5nqmygzd4uo62mdispwgxsw7id&#45;&gt;gguve5icmo5e4cw5o3hvvfsxremc46if -->
<g id="edge3" class="edge">
<title>idvshq5nqmygzd4uo62mdispwgxsw7id-&gt;gguve5icmo5e4cw5o3hvvfsxremc46if</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M2229.2864,-1587.9836C2336.2076,-1492.3172 2562.5717,-1260.0833 2429.0696,-1111.6 2372.2327,-1048.3851 1860.8259,-1017.0375 1561.5401,-1003.9799"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="1561.5673,-1000.4779 1551.4253,-1003.5421 1561.2645,-1007.4714 1561.5673,-1000.4779"/>
</g>
<!-- mujlx42xgttdc6u6rmiftsktpsrcmpbs -->
<g id="node5" class="node">
<title>mujlx42xgttdc6u6rmiftsktpsrcmpbs</title>
<path fill="#add8e6" stroke="#000000" stroke-width="4" d="M912.4048,-1198.3002C912.4048,-1198.3002 475.7344,-1198.3002 475.7344,-1198.3002 469.7344,-1198.3002 463.7344,-1192.3002 463.7344,-1186.3002 463.7344,-1186.3002 463.7344,-1123.6998 463.7344,-1123.6998 463.7344,-1117.6998 469.7344,-1111.6998 475.7344,-1111.6998 475.7344,-1111.6998 912.4048,-1111.6998 912.4048,-1111.6998 918.4048,-1111.6998 924.4048,-1117.6998 924.4048,-1123.6998 924.4048,-1123.6998 924.4048,-1186.3002 924.4048,-1186.3002 924.4048,-1192.3002 918.4048,-1198.3002 912.4048,-1198.3002"/>
<text text-anchor="middle" x="694.0696" y="-1147.8" font-family="Monaco" font-size="24.00" fill="#000000">blaspp@2022.07.00%gcc@9.4.0/mujlx42</text>
</g>
<!-- mujlx42xgttdc6u6rmiftsktpsrcmpbs&#45;&gt;o524gebsxavobkte3k5fglgwnedfkadf -->
<g id="edge16" class="edge">
<title>mujlx42xgttdc6u6rmiftsktpsrcmpbs-&gt;o524gebsxavobkte3k5fglgwnedfkadf</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M693.0696,-1111.6072C693.0696,-1092.5263 693.0696,-1069.9257 693.0696,-1049.8046"/>
<path fill="none" stroke="#dc143c" stroke-width="2" d="M695.0696,-1111.6072C695.0696,-1092.5263 695.0696,-1069.9257 695.0696,-1049.8046"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="697.5697,-1049.5403 694.0696,-1039.5403 690.5697,-1049.5404 697.5697,-1049.5403"/>
<text text-anchor="middle" x="657.8516" y="-1079.8482" font-family="Times,serif" font-size="14.00" fill="#000000">virtuals=blas</text>
</g>
<!-- mujlx42xgttdc6u6rmiftsktpsrcmpbs&#45;&gt;gguve5icmo5e4cw5o3hvvfsxremc46if -->
<g id="edge28" class="edge">
<title>mujlx42xgttdc6u6rmiftsktpsrcmpbs-&gt;gguve5icmo5e4cw5o3hvvfsxremc46if</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M872.2315,-1111.6072C960.9952,-1089.988 1068.311,-1063.8504 1158.3512,-1041.9204"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="1159.2354,-1045.3074 1168.1232,-1039.5403 1157.5789,-1038.5062 1159.2354,-1045.3074"/>
</g>
<!-- htzjns66gmq6pjofohp26djmjnpbegho -->
<g id="node6" class="node">
<title>htzjns66gmq6pjofohp26djmjnpbegho</title>
<path fill="#ff7f50" stroke="#000000" stroke-width="4" d="M2663.3553,-880.7002C2663.3553,-880.7002 2270.7839,-880.7002 2270.7839,-880.7002 2264.7839,-880.7002 2258.7839,-874.7002 2258.7839,-868.7002 2258.7839,-868.7002 2258.7839,-806.0998 2258.7839,-806.0998 2258.7839,-800.0998 2264.7839,-794.0998 2270.7839,-794.0998 2270.7839,-794.0998 2663.3553,-794.0998 2663.3553,-794.0998 2669.3553,-794.0998 2675.3553,-800.0998 2675.3553,-806.0998 2675.3553,-806.0998 2675.3553,-868.7002 2675.3553,-868.7002 2675.3553,-874.7002 2669.3553,-880.7002 2663.3553,-880.7002"/>
<text text-anchor="middle" x="2467.0696" y="-830.2" font-family="Monaco" font-size="24.00" fill="#000000">patchelf@0.16.1%gcc@9.4.0/htzjns6</text>
</g>
<!-- xm3ldz3y3msfdc3hzshvxpbpg5hnt6o6 -->
<g id="node15" class="node">
<title>xm3ldz3y3msfdc3hzshvxpbpg5hnt6o6</title>
<path fill="#ff7f50" stroke="#000000" stroke-width="4" d="M394.2232,-404.3002C394.2232,-404.3002 17.916,-404.3002 17.916,-404.3002 11.916,-404.3002 5.916,-398.3002 5.916,-392.3002 5.916,-392.3002 5.916,-329.6998 5.916,-329.6998 5.916,-323.6998 11.916,-317.6998 17.916,-317.6998 17.916,-317.6998 394.2232,-317.6998 394.2232,-317.6998 400.2232,-317.6998 406.2232,-323.6998 406.2232,-329.6998 406.2232,-329.6998 406.2232,-392.3002 406.2232,-392.3002 406.2232,-398.3002 400.2232,-404.3002 394.2232,-404.3002"/>
<text text-anchor="middle" x="206.0696" y="-353.8" font-family="Monaco" font-size="24.00" fill="#000000">diffutils@3.8%gcc@9.4.0/xm3ldz3</text>
</g>
<!-- h3ujmb3ts4kxxxv77knh2knuystuerbx&#45;&gt;xm3ldz3y3msfdc3hzshvxpbpg5hnt6o6 -->
<g id="edge1" class="edge">
<title>h3ujmb3ts4kxxxv77knh2knuystuerbx-&gt;xm3ldz3y3msfdc3hzshvxpbpg5hnt6o6</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M206.0696,-476.4072C206.0696,-457.3263 206.0696,-434.7257 206.0696,-414.6046"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="209.5697,-414.3403 206.0696,-404.3403 202.5697,-414.3404 209.5697,-414.3403"/>
</g>
<!-- o524gebsxavobkte3k5fglgwnedfkadf&#45;&gt;ywrpvv2hgooeepdke33exkqrtdpd5gkl -->
<g id="edge11" class="edge">
<title>o524gebsxavobkte3k5fglgwnedfkadf-&gt;ywrpvv2hgooeepdke33exkqrtdpd5gkl</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M690.0981,-952.705C684.8522,-895.2533 675.6173,-794.1153 669.9514,-732.0637"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="673.4345,-731.7184 669.0396,-722.0781 666.4635,-732.355 673.4345,-731.7184"/>
</g>
<!-- 4vsmjofkhntilgzh4zebluqak5mdsu3x -->
<g id="node9" class="node">
<title>4vsmjofkhntilgzh4zebluqak5mdsu3x</title>
<path fill="#ff7f50" stroke="#000000" stroke-width="4" d="M1977.9121,-721.9002C1977.9121,-721.9002 1386.2271,-721.9002 1386.2271,-721.9002 1380.2271,-721.9002 1374.2271,-715.9002 1374.2271,-709.9002 1374.2271,-709.9002 1374.2271,-647.2998 1374.2271,-647.2998 1374.2271,-641.2998 1380.2271,-635.2998 1386.2271,-635.2998 1386.2271,-635.2998 1977.9121,-635.2998 1977.9121,-635.2998 1983.9121,-635.2998 1989.9121,-641.2998 1989.9121,-647.2998 1989.9121,-647.2998 1989.9121,-709.9002 1989.9121,-709.9002 1989.9121,-715.9002 1983.9121,-721.9002 1977.9121,-721.9002"/>
<text text-anchor="middle" x="1682.0696" y="-671.4" font-family="Monaco" font-size="24.00" fill="#000000">ca-certificates-mozilla@2023-01-10%gcc@9.4.0/4vsmjof</text>
</g>
<!-- xiro2z6na56qdd4czjhj54eag3ekbiow -->
<g id="node10" class="node">
<title>xiro2z6na56qdd4czjhj54eag3ekbiow</title>
<path fill="#add8e6" stroke="#000000" stroke-width="4" d="M988.1824,-1357.1002C988.1824,-1357.1002 533.9568,-1357.1002 533.9568,-1357.1002 527.9568,-1357.1002 521.9568,-1351.1002 521.9568,-1345.1002 521.9568,-1345.1002 521.9568,-1282.4998 521.9568,-1282.4998 521.9568,-1276.4998 527.9568,-1270.4998 533.9568,-1270.4998 533.9568,-1270.4998 988.1824,-1270.4998 988.1824,-1270.4998 994.1824,-1270.4998 1000.1824,-1276.4998 1000.1824,-1282.4998 1000.1824,-1282.4998 1000.1824,-1345.1002 1000.1824,-1345.1002 1000.1824,-1351.1002 994.1824,-1357.1002 988.1824,-1357.1002"/>
<text text-anchor="middle" x="761.0696" y="-1306.6" font-family="Monaco" font-size="24.00" fill="#000000">lapackpp@2022.07.00%gcc@9.4.0/xiro2z6</text>
</g>
<!-- xiro2z6na56qdd4czjhj54eag3ekbiow&#45;&gt;mujlx42xgttdc6u6rmiftsktpsrcmpbs -->
<g id="edge37" class="edge">
<title>xiro2z6na56qdd4czjhj54eag3ekbiow-&gt;mujlx42xgttdc6u6rmiftsktpsrcmpbs</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M741.8402,-1270.7959C733.6789,-1251.4525 723.9915,-1228.4917 715.4149,-1208.1641"/>
<path fill="none" stroke="#dc143c" stroke-width="2" d="M743.6829,-1270.0185C735.5216,-1250.675 725.8342,-1227.7143 717.2576,-1207.3866"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="719.4676,-1206.1933 712.3555,-1198.3403 713.0181,-1208.9144 719.4676,-1206.1933"/>
</g>
<!-- xiro2z6na56qdd4czjhj54eag3ekbiow&#45;&gt;o524gebsxavobkte3k5fglgwnedfkadf -->
<g id="edge35" class="edge">
<title>xiro2z6na56qdd4czjhj54eag3ekbiow-&gt;o524gebsxavobkte3k5fglgwnedfkadf</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M597.2326,-1271.3826C534.1471,-1251.0571 472.8527,-1225.5904 454.2471,-1198.9688 432.1275,-1166.6075 433.5639,-1144.2113 454.2226,-1111.0684 472.6194,-1081.8657 500.3255,-1060.004 530.6572,-1043.4601"/>
<path fill="none" stroke="#dc143c" stroke-width="2" d="M597.8458,-1269.4789C534.9144,-1249.2102 473.6201,-1223.7435 455.8921,-1197.8312 434.1234,-1166.7355 435.5598,-1144.3393 455.9166,-1112.1316 473.8583,-1083.4358 501.5644,-1061.5741 531.6142,-1045.2163"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="532.9062,-1047.362 540.1422,-1039.6231 529.6595,-1041.1605 532.9062,-1047.362"/>
<text text-anchor="middle" x="474.3109" y="-1250.2598" font-family="Times,serif" font-size="14.00" fill="#000000">virtuals=blas,lapack</text>
</g>
<!-- xiro2z6na56qdd4czjhj54eag3ekbiow&#45;&gt;gguve5icmo5e4cw5o3hvvfsxremc46if -->
<g id="edge45" class="edge">
<title>xiro2z6na56qdd4czjhj54eag3ekbiow-&gt;gguve5icmo5e4cw5o3hvvfsxremc46if</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M833.5823,-1270.3956C865.3249,-1250.0918 902.2709,-1224.6296 933.0696,-1198.4 973.2414,-1164.1878 969.8532,-1140.395 1014.0696,-1111.6 1058.5051,-1082.6623 1111.0286,-1060.0733 1161.029,-1042.8573"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="1162.313,-1046.1177 1170.6621,-1039.5953 1160.0678,-1039.4876 1162.313,-1046.1177"/>
</g>
<!-- j5rupoqliu7kasm6xndl7ui32wgawkru -->
<g id="node11" class="node">
<title>j5rupoqliu7kasm6xndl7ui32wgawkru</title>
<path fill="#add8e6" stroke="#000000" stroke-width="4" d="M1527.3625,-245.5002C1527.3625,-245.5002 1164.7767,-245.5002 1164.7767,-245.5002 1158.7767,-245.5002 1152.7767,-239.5002 1152.7767,-233.5002 1152.7767,-233.5002 1152.7767,-170.8998 1152.7767,-170.8998 1152.7767,-164.8998 1158.7767,-158.8998 1164.7767,-158.8998 1164.7767,-158.8998 1527.3625,-158.8998 1527.3625,-158.8998 1533.3625,-158.8998 1539.3625,-164.8998 1539.3625,-170.8998 1539.3625,-170.8998 1539.3625,-233.5002 1539.3625,-233.5002 1539.3625,-239.5002 1533.3625,-245.5002 1527.3625,-245.5002"/>
<text text-anchor="middle" x="1346.0696" y="-195" font-family="Monaco" font-size="24.00" fill="#000000">ncurses@6.4%gcc@9.4.0/j5rupoq</text>
</g>
<!-- j5rupoqliu7kasm6xndl7ui32wgawkru&#45;&gt;i4avrindvhcamhurzbfdaggbj2zgsrrh -->
<g id="edge15" class="edge">
<title>j5rupoqliu7kasm6xndl7ui32wgawkru-&gt;i4avrindvhcamhurzbfdaggbj2zgsrrh</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M1346.0696,-158.8072C1346.0696,-139.7263 1346.0696,-117.1257 1346.0696,-97.0046"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="1349.5697,-96.7403 1346.0696,-86.7403 1342.5697,-96.7404 1349.5697,-96.7403"/>
<text text-anchor="middle" x="1292.7436" y="-127.0482" font-family="Times,serif" font-size="14.00" fill="#000000">virtuals=pkgconfig</text>
</g>
<!-- imopnxjmv7cwzyiecdw2saq42qvpnauh&#45;&gt;ern66gyp6qmhmpod4jaynxx4weoberfm -->
<g id="edge19" class="edge">
<title>imopnxjmv7cwzyiecdw2saq42qvpnauh-&gt;ern66gyp6qmhmpod4jaynxx4weoberfm</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M2788.0102,-1270.7555C2780.8234,-1251.412 2772.2926,-1228.4513 2764.7402,-1208.1236"/>
<path fill="none" stroke="#dc143c" stroke-width="2" d="M2789.885,-1270.0589C2782.6982,-1250.7155 2774.1674,-1227.7547 2766.615,-1207.4271"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="2768.9358,-1206.4953 2762.1721,-1198.3403 2762.3741,-1208.9332 2768.9358,-1206.4953"/>
</g>
<!-- imopnxjmv7cwzyiecdw2saq42qvpnauh&#45;&gt;2w3nq3n3hcj2tqlvcpewsryamltlu5tw -->
<g id="edge12" class="edge">
<title>imopnxjmv7cwzyiecdw2saq42qvpnauh-&gt;2w3nq3n3hcj2tqlvcpewsryamltlu5tw</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M2907.2846,-1269.5018C2936.475,-1251.8137 2964.9158,-1228.1116 2981.1904,-1197.9236 2999.477,-1164.2363 3005.2125,-1141.4693 2981.289,-1112.225 2954.5472,-1078.5579 2876.5297,-1053.8974 2789.2983,-1036.3535"/>
<path fill="none" stroke="#dc143c" stroke-width="2" d="M2908.3216,-1271.2119C2937.7554,-1253.3501 2966.1962,-1229.648 2982.9488,-1198.8764 3001.4164,-1164.7249 3007.1519,-1141.9579 2982.8502,-1110.975 2955.15,-1076.6509 2877.1325,-1051.9904 2789.6927,-1034.3928"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="2790.125,-1031.93 2779.6364,-1033.4269 2788.7692,-1038.7974 2790.125,-1031.93"/>
<text text-anchor="middle" x="2836.0561" y="-1059.5023" font-family="Times,serif" font-size="14.00" fill="#000000">virtuals=mpi</text>
</g>
<!-- imopnxjmv7cwzyiecdw2saq42qvpnauh&#45;&gt;gguve5icmo5e4cw5o3hvvfsxremc46if -->
<g id="edge49" class="edge">
<title>imopnxjmv7cwzyiecdw2saq42qvpnauh-&gt;gguve5icmo5e4cw5o3hvvfsxremc46if</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M2883.731,-1270.4691C2909.4451,-1251.9243 2934.9956,-1227.7144 2949.0696,-1198.4 2965.7663,-1163.6227 2975.3506,-1139.841 2949.0696,-1111.6 2925.7161,-1086.5049 1993.0368,-1031.9055 1561.3071,-1007.9103"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="1561.3813,-1004.4092 1551.2026,-1007.3492 1560.9931,-1011.3984 1561.3813,-1004.4092"/>
</g>
<!-- ern66gyp6qmhmpod4jaynxx4weoberfm&#45;&gt;gguve5icmo5e4cw5o3hvvfsxremc46if -->
<g id="edge50" class="edge">
<title>ern66gyp6qmhmpod4jaynxx4weoberfm-&gt;gguve5icmo5e4cw5o3hvvfsxremc46if</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M2551.6031,-1113.7387C2547.0531,-1112.9948 2542.537,-1112.2802 2538.0696,-1111.6 2198.5338,-1059.8997 1800.8632,-1026.8711 1561.4583,-1009.9443"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="1561.4619,-1006.436 1551.2407,-1009.2249 1560.9702,-1013.4187 1561.4619,-1006.436"/>
</g>
<!-- nqiyrxlid6tikfpvoqdpvsjt5drs2obf&#45;&gt;hkcrbrtf2qex6rvzuok5tzdrbam55pdn -->
<g id="edge34" class="edge">
<title>nqiyrxlid6tikfpvoqdpvsjt5drs2obf-&gt;hkcrbrtf2qex6rvzuok5tzdrbam55pdn</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M1865.2226,-1269.4691C1922.6966,-1248.2438 1991.964,-1222.6632 2050.6644,-1200.985"/>
<path fill="none" stroke="#dc143c" stroke-width="2" d="M1865.9154,-1271.3453C1923.3894,-1250.12 1992.6569,-1224.5394 2051.3572,-1202.8612"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="2052.5441,-1205.088 2060.7123,-1198.3403 2050.119,-1198.5215 2052.5441,-1205.088"/>
<text text-anchor="middle" x="1910.9073" y="-1238.6056" font-family="Times,serif" font-size="14.00" fill="#000000">virtuals=scalapack</text>
</g>
<!-- nqiyrxlid6tikfpvoqdpvsjt5drs2obf&#45;&gt;o524gebsxavobkte3k5fglgwnedfkadf -->
<g id="edge52" class="edge">
<title>nqiyrxlid6tikfpvoqdpvsjt5drs2obf-&gt;o524gebsxavobkte3k5fglgwnedfkadf</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M1519.9696,-1290.6844C1394.6018,-1273.3057 1237.6631,-1244.7294 1102.7507,-1199.3478 1021.8138,-1171.8729 1008.1992,-1149.8608 932.6248,-1112.4956 887.1715,-1089.9216 836.578,-1065.4054 793.6914,-1044.8018"/>
<path fill="none" stroke="#dc143c" stroke-width="2" d="M1520.2442,-1288.7034C1394.9601,-1271.3381 1238.0214,-1242.7618 1103.3885,-1197.4522 1023.5148,-1170.8208 1009.9002,-1148.8087 933.5144,-1110.7044 888.0436,-1088.1218 837.4502,-1063.6056 794.5574,-1042.999"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="795.6235,-1040.7377 785.0938,-1039.565 792.5939,-1047.0482 795.6235,-1040.7377"/>
<text text-anchor="middle" x="1046.8307" y="-1202.5988" font-family="Times,serif" font-size="14.00" fill="#000000">virtuals=blas,lapack</text>
</g>
<!-- lfh3aovn65e66cs24qiehq3nd2ddojef -->
<g id="node21" class="node">
<title>lfh3aovn65e66cs24qiehq3nd2ddojef</title>
<path fill="#add8e6" stroke="#000000" stroke-width="4" d="M1547.9922,-1198.3002C1547.9922,-1198.3002 1144.147,-1198.3002 1144.147,-1198.3002 1138.147,-1198.3002 1132.147,-1192.3002 1132.147,-1186.3002 1132.147,-1186.3002 1132.147,-1123.6998 1132.147,-1123.6998 1132.147,-1117.6998 1138.147,-1111.6998 1144.147,-1111.6998 1144.147,-1111.6998 1547.9922,-1111.6998 1547.9922,-1111.6998 1553.9922,-1111.6998 1559.9922,-1117.6998 1559.9922,-1123.6998 1559.9922,-1123.6998 1559.9922,-1186.3002 1559.9922,-1186.3002 1559.9922,-1192.3002 1553.9922,-1198.3002 1547.9922,-1198.3002"/>
<text text-anchor="middle" x="1346.0696" y="-1147.8" font-family="Monaco" font-size="24.00" fill="#000000">arpack-ng@3.8.0%gcc@9.4.0/lfh3aov</text>
</g>
<!-- nqiyrxlid6tikfpvoqdpvsjt5drs2obf&#45;&gt;lfh3aovn65e66cs24qiehq3nd2ddojef -->
<g id="edge46" class="edge">
<title>nqiyrxlid6tikfpvoqdpvsjt5drs2obf-&gt;lfh3aovn65e66cs24qiehq3nd2ddojef</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M1637.8539,-1271.3373C1584.2332,-1250.1557 1519.6324,-1224.6368 1464.827,-1202.9873"/>
<path fill="none" stroke="#dc143c" stroke-width="2" d="M1638.5887,-1269.4771C1584.968,-1248.2956 1520.3672,-1222.7767 1465.5618,-1201.1272"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="1466.3716,-1198.7592 1455.785,-1198.3403 1463.7998,-1205.2696 1466.3716,-1198.7592"/>
</g>
<!-- 57joith2sqq6sehge54vlloyolm36mdu -->
<g id="node22" class="node">
<title>57joith2sqq6sehge54vlloyolm36mdu</title>
<path fill="#ff7f50" stroke="#000000" stroke-width="4" d="M1906.2352,-1198.3002C1906.2352,-1198.3002 1589.904,-1198.3002 1589.904,-1198.3002 1583.904,-1198.3002 1577.904,-1192.3002 1577.904,-1186.3002 1577.904,-1186.3002 1577.904,-1123.6998 1577.904,-1123.6998 1577.904,-1117.6998 1583.904,-1111.6998 1589.904,-1111.6998 1589.904,-1111.6998 1906.2352,-1111.6998 1906.2352,-1111.6998 1912.2352,-1111.6998 1918.2352,-1117.6998 1918.2352,-1123.6998 1918.2352,-1123.6998 1918.2352,-1186.3002 1918.2352,-1186.3002 1918.2352,-1192.3002 1912.2352,-1198.3002 1906.2352,-1198.3002"/>
<text text-anchor="middle" x="1748.0696" y="-1147.8" font-family="Monaco" font-size="24.00" fill="#000000">sed@4.8%gcc@9.4.0/57joith</text>
</g>
<!-- nqiyrxlid6tikfpvoqdpvsjt5drs2obf&#45;&gt;57joith2sqq6sehge54vlloyolm36mdu -->
<g id="edge27" class="edge">
<title>nqiyrxlid6tikfpvoqdpvsjt5drs2obf-&gt;57joith2sqq6sehge54vlloyolm36mdu</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M1748.0696,-1270.4072C1748.0696,-1251.3263 1748.0696,-1228.7257 1748.0696,-1208.6046"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="1751.5697,-1208.3403 1748.0696,-1198.3403 1744.5697,-1208.3404 1751.5697,-1208.3403"/>
</g>
<!-- nqiyrxlid6tikfpvoqdpvsjt5drs2obf&#45;&gt;2w3nq3n3hcj2tqlvcpewsryamltlu5tw -->
<g id="edge24" class="edge">
<title>nqiyrxlid6tikfpvoqdpvsjt5drs2obf-&gt;2w3nq3n3hcj2tqlvcpewsryamltlu5tw</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M1975.9734,-1301.684C2148.2819,-1288.3961 2365.6859,-1259.5384 2428.3689,-1197.6866 2466.9261,-1160.1438 2472.9783,-1095.7153 2471.5152,-1049.9701"/>
<path fill="none" stroke="#dc143c" stroke-width="2" d="M1976.1272,-1303.678C2148.5451,-1290.3788 2365.949,-1261.521 2429.7703,-1199.1134 2468.9173,-1160.3309 2474.9695,-1095.9024 2473.5142,-1049.9065"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="2476.0078,-1049.7027 2472.0657,-1039.8686 2469.0147,-1050.0146 2476.0078,-1049.7027"/>
<text text-anchor="middle" x="2207.8884" y="-1273.0053" font-family="Times,serif" font-size="14.00" fill="#000000">virtuals=mpi</text>
</g>
<!-- nqiyrxlid6tikfpvoqdpvsjt5drs2obf&#45;&gt;gguve5icmo5e4cw5o3hvvfsxremc46if -->
<g id="edge6" class="edge">
<title>nqiyrxlid6tikfpvoqdpvsjt5drs2obf-&gt;gguve5icmo5e4cw5o3hvvfsxremc46if</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M1520.1614,-1301.6771C1362.9712,-1287.992 1173.582,-1259.0928 1123.0696,-1198.4 1098.3914,-1168.7481 1103.0165,-1144.5563 1123.0696,-1111.6 1140.5998,-1082.79 1167.9002,-1060.8539 1197.4647,-1044.2681"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="1199.1408,-1047.3408 1206.2789,-1039.5114 1195.8163,-1041.1806 1199.1408,-1047.3408"/>
</g>
<!-- ogcucq2eod3xusvvied5ol2iobui4nsb -->
<g id="node18" class="node">
<title>ogcucq2eod3xusvvied5ol2iobui4nsb</title>
<path fill="#ff7f50" stroke="#000000" stroke-width="4" d="M400.2088,-245.5002C400.2088,-245.5002 11.9304,-245.5002 11.9304,-245.5002 5.9304,-245.5002 -.0696,-239.5002 -.0696,-233.5002 -.0696,-233.5002 -.0696,-170.8998 -.0696,-170.8998 -.0696,-164.8998 5.9304,-158.8998 11.9304,-158.8998 11.9304,-158.8998 400.2088,-158.8998 400.2088,-158.8998 406.2088,-158.8998 412.2088,-164.8998 412.2088,-170.8998 412.2088,-170.8998 412.2088,-233.5002 412.2088,-233.5002 412.2088,-239.5002 406.2088,-245.5002 400.2088,-245.5002"/>
<text text-anchor="middle" x="206.0696" y="-195" font-family="Monaco" font-size="24.00" fill="#000000">libiconv@1.17%gcc@9.4.0/ogcucq2</text>
</g>
<!-- xm3ldz3y3msfdc3hzshvxpbpg5hnt6o6&#45;&gt;ogcucq2eod3xusvvied5ol2iobui4nsb -->
<g id="edge47" class="edge">
<title>xm3ldz3y3msfdc3hzshvxpbpg5hnt6o6-&gt;ogcucq2eod3xusvvied5ol2iobui4nsb</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M205.0696,-317.6072C205.0696,-298.5263 205.0696,-275.9257 205.0696,-255.8046"/>
<path fill="none" stroke="#dc143c" stroke-width="2" d="M207.0696,-317.6072C207.0696,-298.5263 207.0696,-275.9257 207.0696,-255.8046"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="209.5697,-255.5403 206.0696,-245.5403 202.5697,-255.5404 209.5697,-255.5403"/>
<text text-anchor="middle" x="165.5739" y="-285.8482" font-family="Times,serif" font-size="14.00" fill="#000000">virtuals=iconv</text>
</g>
<!-- 4bu62kyfuh4ikdkuyxfxjxanf7e7qopu&#45;&gt;mujlx42xgttdc6u6rmiftsktpsrcmpbs -->
<g id="edge42" class="edge">
<title>4bu62kyfuh4ikdkuyxfxjxanf7e7qopu-&gt;mujlx42xgttdc6u6rmiftsktpsrcmpbs</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M672.6614,-1430.2151C600.7916,-1411.3548 534.1254,-1386.9583 512.2667,-1357.7962 489.0909,-1326.029 493.54,-1304.0273 512.1928,-1269.9192 527.5256,-1242.0821 552.3382,-1220.1508 578.9347,-1203.0434"/>
<path fill="none" stroke="#dc143c" stroke-width="2" d="M673.169,-1428.2806C601.4789,-1409.4766 534.8127,-1385.0802 513.8725,-1356.6038 491.0512,-1326.4254 495.5003,-1304.4237 513.9464,-1270.8808 528.8502,-1243.5806 553.6627,-1221.6493 580.016,-1204.7259"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="581.46,-1206.7724 588.1193,-1198.532 577.7747,-1200.8211 581.46,-1206.7724"/>
</g>
<!-- 4bu62kyfuh4ikdkuyxfxjxanf7e7qopu&#45;&gt;o524gebsxavobkte3k5fglgwnedfkadf -->
<g id="edge43" class="edge">
<title>4bu62kyfuh4ikdkuyxfxjxanf7e7qopu-&gt;o524gebsxavobkte3k5fglgwnedfkadf</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M680.4783,-1430.2246C600.8632,-1410.3933 522.8724,-1385.2921 493.3877,-1357.9314 411.1392,-1281.1573 374.1678,-1206.1582 435.2305,-1111.0561 454.3431,-1081.6726 482.5021,-1059.8261 513.5088,-1043.3725"/>
<path fill="none" stroke="#dc143c" stroke-width="2" d="M680.9617,-1428.2839C601.476,-1408.4895 523.4851,-1383.3883 494.7515,-1356.4686 412.9331,-1280.273 375.9616,-1205.2739 436.9087,-1112.1439 455.569,-1083.2528 483.728,-1061.4063 514.4455,-1045.1396"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="515.8631,-1047.2236 523.1893,-1039.5699 512.6893,-1040.9844 515.8631,-1047.2236"/>
<text text-anchor="middle" x="453.0969" y="-1356.92" font-family="Times,serif" font-size="14.00" fill="#000000">virtuals=blas</text>
</g>
<!-- 4bu62kyfuh4ikdkuyxfxjxanf7e7qopu&#45;&gt;xiro2z6na56qdd4czjhj54eag3ekbiow -->
<g id="edge38" class="edge">
<title>4bu62kyfuh4ikdkuyxfxjxanf7e7qopu-&gt;xiro2z6na56qdd4czjhj54eag3ekbiow</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M857.6892,-1429.8521C840.9235,-1409.9835 820.9375,-1386.2985 803.4466,-1365.5705"/>
<path fill="none" stroke="#dc143c" stroke-width="2" d="M859.2178,-1428.5623C842.4521,-1408.6937 822.466,-1385.0087 804.9751,-1364.2807"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="806.7654,-1362.5258 797.6414,-1357.1403 801.4156,-1367.0402 806.7654,-1362.5258"/>
</g>
<!-- 4bu62kyfuh4ikdkuyxfxjxanf7e7qopu&#45;&gt;2w3nq3n3hcj2tqlvcpewsryamltlu5tw -->
<g id="edge13" class="edge">
<title>4bu62kyfuh4ikdkuyxfxjxanf7e7qopu-&gt;2w3nq3n3hcj2tqlvcpewsryamltlu5tw</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M1118.1783,-1450.5735C1412.4221,-1422.447 1902.6188,-1374.0528 1984.8578,-1356.2227 2203.916,-1308.9943 2329.6342,-1377.1305 2461.2658,-1197.8052 2492.3675,-1156.1664 2488.743,-1094.1171 2480.3694,-1050.0521"/>
<path fill="none" stroke="#dc143c" stroke-width="2" d="M1118.3686,-1452.5644C1412.6186,-1424.4374 1902.8153,-1376.0432 1985.2814,-1358.1773 2202.963,-1310.7526 2328.6812,-1378.8889 2462.8734,-1198.9948 2494.3641,-1156.0498 2490.7395,-1094.0005 2482.3343,-1049.6791"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="2484.7438,-1048.9818 2479.3189,-1039.8812 2477.8845,-1050.3784 2484.7438,-1048.9818"/>
<text text-anchor="middle" x="1820.4407" y="-1379.7188" font-family="Times,serif" font-size="14.00" fill="#000000">virtuals=mpi</text>
</g>
<!-- 4bu62kyfuh4ikdkuyxfxjxanf7e7qopu&#45;&gt;gguve5icmo5e4cw5o3hvvfsxremc46if -->
<g id="edge32" class="edge">
<title>4bu62kyfuh4ikdkuyxfxjxanf7e7qopu-&gt;gguve5icmo5e4cw5o3hvvfsxremc46if</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M947.2173,-1428.5496C968.7089,-1408.5917 992.2747,-1383.3345 1008.2117,-1356.6861 1067.0588,-1259.8646 1008.3745,-1197.6371 1084.3226,-1110.9351 1110.3076,-1081.7965 1144.7149,-1059.7578 1180.1804,-1043.0531"/>
<path fill="none" stroke="#daa520" stroke-width="2" d="M948.5783,-1430.0151C970.1712,-1409.9561 993.737,-1384.6989 1009.9275,-1357.7139 1068.5139,-1258.4924 1009.8295,-1196.2649 1085.8166,-1112.2649 1111.3864,-1083.4807 1145.7936,-1061.442 1181.0322,-1044.8626"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="1182.4567,-1046.9607 1190.1008,-1039.6246 1179.5503,-1040.5926 1182.4567,-1046.9607"/>
</g>
<!-- 5xerf6imlgo4xlubacr4mljacc3edexo -->
<g id="node17" class="node">
<title>5xerf6imlgo4xlubacr4mljacc3edexo</title>
<path fill="#add8e6" stroke="#000000" stroke-width="4" d="M1822.3657,-880.7002C1822.3657,-880.7002 1437.7735,-880.7002 1437.7735,-880.7002 1431.7735,-880.7002 1425.7735,-874.7002 1425.7735,-868.7002 1425.7735,-868.7002 1425.7735,-806.0998 1425.7735,-806.0998 1425.7735,-800.0998 1431.7735,-794.0998 1437.7735,-794.0998 1437.7735,-794.0998 1822.3657,-794.0998 1822.3657,-794.0998 1828.3657,-794.0998 1834.3657,-800.0998 1834.3657,-806.0998 1834.3657,-806.0998 1834.3657,-868.7002 1834.3657,-868.7002 1834.3657,-874.7002 1828.3657,-880.7002 1822.3657,-880.7002"/>
<text text-anchor="middle" x="1630.0696" y="-830.2" font-family="Monaco" font-size="24.00" fill="#000000">openssl@1.1.1s%gcc@9.4.0/5xerf6i</text>
</g>
<!-- 5xerf6imlgo4xlubacr4mljacc3edexo&#45;&gt;ywrpvv2hgooeepdke33exkqrtdpd5gkl -->
<g id="edge22" class="edge">
<title>5xerf6imlgo4xlubacr4mljacc3edexo-&gt;ywrpvv2hgooeepdke33exkqrtdpd5gkl</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M1425.7129,-803.7711C1262.7545,-776.9548 1035.5151,-739.5603 871.9084,-712.6373"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="872.1525,-709.1305 861.7169,-710.9602 871.0158,-716.0376 872.1525,-709.1305"/>
</g>
<!-- 5xerf6imlgo4xlubacr4mljacc3edexo&#45;&gt;4vsmjofkhntilgzh4zebluqak5mdsu3x -->
<g id="edge48" class="edge">
<title>5xerf6imlgo4xlubacr4mljacc3edexo-&gt;4vsmjofkhntilgzh4zebluqak5mdsu3x</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M1644.2788,-794.0072C1650.5843,-774.7513 1658.0636,-751.9107 1664.6976,-731.6514"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="1668.0917,-732.533 1667.8776,-721.9403 1661.4393,-730.3546 1668.0917,-732.533"/>
</g>
<!-- 5xerf6imlgo4xlubacr4mljacc3edexo&#45;&gt;nizxi5u5bbrzhzwfy2qb7hatlhuswlrz -->
<g id="edge41" class="edge">
<title>5xerf6imlgo4xlubacr4mljacc3edexo-&gt;nizxi5u5bbrzhzwfy2qb7hatlhuswlrz</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M1834.3289,-793.5645C1906.6817,-774.1673 1975.9199,-749.2273 1998.2925,-721.3707 2031.5218,-680.681 2032.1636,-617.9031 2027.044,-573.3921"/>
<path fill="none" stroke="#dc143c" stroke-width="2" d="M1834.8468,-795.4962C1907.3595,-776.0489 1976.5977,-751.1089 1999.8467,-722.6293 2033.5217,-680.7015 2034.1635,-617.9235 2029.0309,-573.1639"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="2031.4885,-572.6712 2026.7474,-563.1964 2024.5451,-573.5598 2031.4885,-572.6712"/>
</g>
<!-- v32wejd4d5lc6uka4qlrogwh5xae2h3r -->
<g id="node26" class="node">
<title>v32wejd4d5lc6uka4qlrogwh5xae2h3r</title>
<path fill="#ff7f50" stroke="#000000" stroke-width="4" d="M1306.1776,-404.3002C1306.1776,-404.3002 929.9616,-404.3002 929.9616,-404.3002 923.9616,-404.3002 917.9616,-398.3002 917.9616,-392.3002 917.9616,-392.3002 917.9616,-329.6998 917.9616,-329.6998 917.9616,-323.6998 923.9616,-317.6998 929.9616,-317.6998 929.9616,-317.6998 1306.1776,-317.6998 1306.1776,-317.6998 1312.1776,-317.6998 1318.1776,-323.6998 1318.1776,-329.6998 1318.1776,-329.6998 1318.1776,-392.3002 1318.1776,-392.3002 1318.1776,-398.3002 1312.1776,-404.3002 1306.1776,-404.3002"/>
<text text-anchor="middle" x="1118.0696" y="-353.8" font-family="Monaco" font-size="24.00" fill="#000000">readline@8.2%gcc@9.4.0/v32wejd</text>
</g>
<!-- uabgssx6lsgrevwbttslldnr5nzguprj&#45;&gt;v32wejd4d5lc6uka4qlrogwh5xae2h3r -->
<g id="edge7" class="edge">
<title>uabgssx6lsgrevwbttslldnr5nzguprj-&gt;v32wejd4d5lc6uka4qlrogwh5xae2h3r</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M1117.0696,-476.4072C1117.0696,-457.3263 1117.0696,-434.7257 1117.0696,-414.6046"/>
<path fill="none" stroke="#dc143c" stroke-width="2" d="M1119.0696,-476.4072C1119.0696,-457.3263 1119.0696,-434.7257 1119.0696,-414.6046"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="1121.5697,-414.3403 1118.0696,-404.3403 1114.5697,-414.3404 1121.5697,-414.3403"/>
</g>
<!-- lfh3aovn65e66cs24qiehq3nd2ddojef&#45;&gt;o524gebsxavobkte3k5fglgwnedfkadf -->
<g id="edge14" class="edge">
<title>lfh3aovn65e66cs24qiehq3nd2ddojef-&gt;o524gebsxavobkte3k5fglgwnedfkadf</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M1167.6711,-1112.5788C1078.9073,-1090.9596 971.5916,-1064.822 881.5513,-1042.892"/>
<path fill="none" stroke="#dc143c" stroke-width="2" d="M1168.1444,-1110.6356C1079.3806,-1089.0165 972.0649,-1062.8788 882.0246,-1040.9488"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="882.5603,-1038.5062 872.016,-1039.5403 880.9038,-1045.3074 882.5603,-1038.5062"/>
<text text-anchor="middle" x="963.904" y="-1079.817" font-family="Times,serif" font-size="14.00" fill="#000000">virtuals=blas,lapack</text>
</g>
<!-- lfh3aovn65e66cs24qiehq3nd2ddojef&#45;&gt;2w3nq3n3hcj2tqlvcpewsryamltlu5tw -->
<g id="edge31" class="edge">
<title>lfh3aovn65e66cs24qiehq3nd2ddojef-&gt;2w3nq3n3hcj2tqlvcpewsryamltlu5tw</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M1559.7922,-1112.1043C1562.8511,-1111.5975 1565.8904,-1111.1002 1568.9103,-1110.6128 1759.2182,-1079.8992 1973.2397,-1052.1328 2144.6143,-1031.5343"/>
<path fill="none" stroke="#dc143c" stroke-width="2" d="M1560.1191,-1114.0774C1563.1741,-1113.5712 1566.2134,-1113.0739 1569.2289,-1112.5872 1759.4755,-1081.8826 1973.497,-1054.1161 2144.8529,-1033.52"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="2145.1529,-1036.002 2154.6648,-1031.3357 2144.3191,-1029.0518 2145.1529,-1036.002"/>
<text text-anchor="middle" x="1828.178" y="-1072.4692" font-family="Times,serif" font-size="14.00" fill="#000000">virtuals=mpi</text>
</g>
<!-- lfh3aovn65e66cs24qiehq3nd2ddojef&#45;&gt;gguve5icmo5e4cw5o3hvvfsxremc46if -->
<g id="edge21" class="edge">
<title>lfh3aovn65e66cs24qiehq3nd2ddojef-&gt;gguve5icmo5e4cw5o3hvvfsxremc46if</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M1346.0696,-1111.6072C1346.0696,-1092.5263 1346.0696,-1069.9257 1346.0696,-1049.8046"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="1349.5697,-1049.5403 1346.0696,-1039.5403 1342.5697,-1049.5404 1349.5697,-1049.5403"/>
</g>
<!-- 2w3nq3n3hcj2tqlvcpewsryamltlu5tw&#45;&gt;htzjns66gmq6pjofohp26djmjnpbegho -->
<g id="edge30" class="edge">
<title>2w3nq3n3hcj2tqlvcpewsryamltlu5tw-&gt;htzjns66gmq6pjofohp26djmjnpbegho</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M2467.0696,-952.8072C2467.0696,-933.7263 2467.0696,-911.1257 2467.0696,-891.0046"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="2470.5697,-890.7403 2467.0696,-880.7403 2463.5697,-890.7404 2470.5697,-890.7403"/>
</g>
<!-- 7rzbmgoxhmm2jhellkgcjmn62uklf22x&#45;&gt;gguve5icmo5e4cw5o3hvvfsxremc46if -->
<g id="edge2" class="edge">
<title>7rzbmgoxhmm2jhellkgcjmn62uklf22x-&gt;gguve5icmo5e4cw5o3hvvfsxremc46if</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M1422.351,-1429.2133C1312.2528,-1388.8872 1171.1589,-1316.8265 1103.0696,-1198.4 1083.8409,-1164.956 1082.4563,-1144.2088 1103.0696,-1111.6 1121.4102,-1082.5864 1149.2483,-1060.7204 1179.6189,-1044.2895"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="1181.4205,-1047.2977 1188.6801,-1039.5809 1178.1927,-1041.0863 1181.4205,-1047.2977"/>
</g>
<!-- v32wejd4d5lc6uka4qlrogwh5xae2h3r&#45;&gt;j5rupoqliu7kasm6xndl7ui32wgawkru -->
<g id="edge39" class="edge">
<title>v32wejd4d5lc6uka4qlrogwh5xae2h3r-&gt;j5rupoqliu7kasm6xndl7ui32wgawkru</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M1179.8001,-316.7866C1209.2065,-296.3053 1244.4355,-271.7686 1274.8343,-250.5961"/>
<path fill="none" stroke="#dc143c" stroke-width="2" d="M1180.9431,-318.4278C1210.3495,-297.9465 1245.5785,-273.4098 1275.9774,-252.2373"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="1277.6375,-254.1277 1283.8429,-245.5403 1273.6367,-248.3836 1277.6375,-254.1277"/>
</g>
<!-- gguve5icmo5e4cw5o3hvvfsxremc46if&#45;&gt;j5rupoqliu7kasm6xndl7ui32wgawkru -->
<g id="edge18" class="edge">
<title>gguve5icmo5e4cw5o3hvvfsxremc46if-&gt;j5rupoqliu7kasm6xndl7ui32wgawkru</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M1345.0696,-952.7909C1345.0696,-891.6316 1345.0696,-776.6094 1345.0696,-678.6 1345.0696,-678.6 1345.0696,-678.6 1345.0696,-519.8 1345.0696,-426.9591 1345.0696,-318.8523 1345.0696,-255.7237"/>
<path fill="none" stroke="#dc143c" stroke-width="2" d="M1347.0696,-952.7909C1347.0696,-891.6316 1347.0696,-776.6094 1347.0696,-678.6 1347.0696,-678.6 1347.0696,-678.6 1347.0696,-519.8 1347.0696,-426.9591 1347.0696,-318.8523 1347.0696,-255.7237"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="1349.5697,-255.6091 1346.0696,-245.6091 1342.5697,-255.6092 1349.5697,-255.6091"/>
</g>
<!-- gguve5icmo5e4cw5o3hvvfsxremc46if&#45;&gt;5xerf6imlgo4xlubacr4mljacc3edexo -->
<g id="edge40" class="edge">
<title>gguve5icmo5e4cw5o3hvvfsxremc46if-&gt;5xerf6imlgo4xlubacr4mljacc3edexo</title>
<path fill="none" stroke="#1e90ff" stroke-width="2" d="M1423.1858,-951.9344C1460.2844,-931.1905 1504.8229,-906.2866 1543.0151,-884.9312"/>
<path fill="none" stroke="#dc143c" stroke-width="2" d="M1424.1619,-953.68C1461.2605,-932.9361 1505.799,-908.0322 1543.9912,-886.6769"/>
<polygon fill="#1e90ff" stroke="#1e90ff" stroke-width="2" points="1545.5391,-888.6757 1552.5592,-880.7403 1542.1228,-882.5659 1545.5391,-888.6757"/>
</g>
</g>
</svg>

After

Width:  |  Height:  |  Size: 58 KiB

View File

@@ -519,11 +519,11 @@ inspections and customize them per-module-set.
 modules:
   prefix_inspections:
-    bin:
+    ./bin:
     - PATH
-    man:
+    ./man:
     - MANPATH
-    '':
+    ./:
     - CMAKE_PREFIX_PATH
 Prefix inspections are only applied if the relative path inside the
@@ -579,7 +579,7 @@ the view.
 view_relative_modules:
   use_view: my_view
 prefix_inspections:
-  bin:
+  ./bin:
   - PATH
 view:
   my_view:
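
Aside (a hedged sketch, not part of the diff): the new leading ./ spelling makes explicit that keys under prefix_inspections are paths relative to each installation prefix, and a variable is only set when that relative path exists in the prefix. So a package installed at /opt/spack/foo with a bin subdirectory gets /opt/spack/foo/bin prepended to PATH, and further mappings follow the same pattern (the ./lib/pkgconfig entry below is illustrative):

    modules:
      prefix_inspections:
        ./bin:
        - PATH
        ./lib/pkgconfig:
        - PKG_CONFIG_PATH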

View File

@@ -2352,7 +2352,7 @@ the following at the command line of a bash shell:
 .. code-block:: console

-   $ for i in {1..12}; do nohup spack install -j 4 mpich@3.3.2 >> mpich_install.txt 2>&1 &; done
+   $ for i in {1..12}; do nohup spack install -j 4 mpich@3.3.2 >> mpich_install.txt 2>&1 & done

 .. note::
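
Aside (not part of the diff): in bash, & terminates a command just as ; does, so the two cannot be combined and &; is a syntax error; the fix simply drops the stray semicolon so that done follows the backgrounded command directly. A minimal reproduction in an interactive bash shell (job number and PID are illustrative):

    $ true &; echo ok
    bash: syntax error near unexpected token `;'
    $ true & echo ok
    [1] 12345
    ok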
@@ -2974,6 +2974,33 @@ The ``provides("mpi")`` call tells Spack that the ``mpich`` package
 can be used to satisfy the dependency of any package that
 ``depends_on("mpi")``.

+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+Providing multiple virtuals simultaneously
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+Packages can provide more than one virtual dependency. Sometimes, due to implementation details,
+there are subsets of those virtuals that need to be provided together by the same package.
+
+A well-known example is ``openblas``, which provides both the ``lapack`` and ``blas`` API in a single ``libopenblas``
+library. A package that needs ``lapack`` and ``blas`` must either use ``openblas`` to provide both, or not use
+``openblas`` at all. It cannot pick one or the other.
+
+To express this constraint in a package, the two virtual dependencies must be listed in the same ``provides`` directive:
+
+.. code-block:: python
+
+   provides('blas', 'lapack')
+
+This makes it impossible to select ``openblas`` as a provider for one of the two
+virtual dependencies and not for the other. If you try to, Spack will report an error:
+
+.. code-block:: console
+
+   $ spack spec netlib-scalapack ^[virtuals=lapack] openblas ^[virtuals=blas] atlas
+   ==> Error: concretization failed for the following reasons:
+
+      1. Package 'openblas' needs to provide both 'lapack' and 'blas' together, but provides only 'lapack'
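
Aside (a hedged sketch, not taken from the Spack repository; the package name, URL, and checksum below are placeholders): in a package.py, the combined directive sits alongside the usual metadata, so the concretizer can only ever select the package as a provider of both virtuals at once:

    from spack.package import *  # brings Package, version, provides into scope


    class MyBlasExample(Package):
        """Hypothetical provider whose single library implements both APIs."""

        homepage = "https://example.com/my-blas"                # placeholder
        url = "https://example.com/my-blas/my-blas-1.0.tar.gz"  # placeholder

        version("1.0", sha256="0" * 64)  # placeholder checksum

        # Both virtuals come from the same library, so they are declared
        # in one directive and cannot be picked apart at concretization.
        provides("blas", "lapack")

This is exactly the situation the error message above guards against: a spec that tries to route only one of the two virtuals through such a package cannot concretize.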
^^^^^^^^^^^^^^^^^^^^
Versioned Interfaces
^^^^^^^^^^^^^^^^^^^^
@@ -3476,6 +3503,56 @@ is equivalent to:
 Constraints from nested context managers are also combined together, but they are rarely
 needed or recommended.

+.. _default_args:
+
+------------------------
+Common default arguments
+------------------------
+
+Similarly, if directives have a common set of default arguments, you can
+group them together in a ``with default_args()`` block:
+
+.. code-block:: python
+
+   class PyExample(PythonPackage):
+
+       with default_args(type=("build", "run")):
+           depends_on("py-foo")
+           depends_on("py-foo@2:", when="@2:")
+           depends_on("py-bar")
+           depends_on("py-bz")
+
+The above is short for:
+
+.. code-block:: python
+
+   class PyExample(PythonPackage):
+
+       depends_on("py-foo", type=("build", "run"))
+       depends_on("py-foo@2:", when="@2:", type=("build", "run"))
+       depends_on("py-bar", type=("build", "run"))
+       depends_on("py-bz", type=("build", "run"))
+
+.. note::
+
+   The ``with when()`` context manager is composable, while ``with default_args()``
+   merely overrides the default. For example:
+
+   .. code-block:: python
+
+      with default_args(when="+feature"):
+          depends_on("foo")
+          depends_on("bar")
+          depends_on("baz", when="+baz")
+
+   is equivalent to:
+
+   .. code-block:: python
+
+      depends_on("foo", when="+feature")
+      depends_on("bar", when="+feature")
+      depends_on("baz", when="+baz")  # Note: not when="+feature+baz"
+
 .. _install-method:

 ------------------

View File

@@ -18,7 +18,7 @@
 * Homepage: https://pypi.python.org/pypi/archspec
 * Usage: Labeling, comparison and detection of microarchitectures
-* Version: 0.2.1 (commit df43a1834460bf94516136951c4729a3100603ec)
+* Version: 0.2.2 (commit 1dc58a5776dd77e6fc6e4ba5626af5b1fb24996e)

 astunparse
 ----------------

View File

@@ -1,2 +1,2 @@
 """Init file to avoid namespace packages"""
-__version__ = "0.2.1"
+__version__ = "0.2.2"

View File

@@ -2318,6 +2318,26 @@
]
}
},
"power10": {
"from": ["power9"],
"vendor": "IBM",
"generation": 10,
"features": [],
"compilers": {
"gcc": [
{
"versions": "11.1:",
"flags": "-mcpu={name} -mtune={name}"
}
],
"clang": [
{
"versions": "11.0:",
"flags": "-mcpu={name} -mtune={name}"
}
]
}
},
"ppc64le": {
"from": [],
"vendor": "generic",
@@ -2405,6 +2425,29 @@
]
}
},
"power10le": {
"from": ["power9le"],
"vendor": "IBM",
"generation": 10,
"features": [],
"compilers": {
"gcc": [
{
"name": "power10",
"versions": "11.1:",
"flags": "-mcpu={name} -mtune={name}"
}
],
"clang": [
{
"versions": "11.0:",
"family": "ppc64le",
"name": "power10",
"flags": "-mcpu={name} -mtune={name}"
}
]
}
},
"aarch64": {
"from": [],
"vendor": "generic",
@@ -2592,6 +2635,37 @@
]
}
},
"armv9.0a": {
"from": ["armv8.5a"],
"vendor": "generic",
"features": [],
"compilers": {
"gcc": [
{
"versions": "12:",
"flags": "-march=armv9-a -mtune=generic"
}
],
"clang": [
{
"versions": "14:",
"flags": "-march=armv9-a -mtune=generic"
}
],
"apple-clang": [
{
"versions": ":",
"flags": "-march=armv9-a -mtune=generic"
}
],
"arm": [
{
"versions": ":",
"flags": "-march=armv9-a -mtune=generic"
}
]
}
},
"thunderx2": {
"from": ["armv8.1a"],
"vendor": "Cavium",
@@ -2813,8 +2887,12 @@
],
"arm" : [
{
"versions": "20:",
"versions": "20:21.9",
"flags" : "-march=armv8.2-a+fp16+rcpc+dotprod+crypto"
},
{
"versions": "22:",
"flags" : "-mcpu=neoverse-n1"
}
],
"nvhpc" : [
@@ -2942,7 +3020,7 @@
},
{
"versions": "22:",
"flags" : "-march=armv8.4-a+sve+ssbs+fp16+bf16+crypto+i8mm+rng"
"flags" : "-mcpu=neoverse-v1"
}
],
"nvhpc" : [
@@ -2954,6 +3032,126 @@
]
}
},
"neoverse_v2": {
"from": ["neoverse_n1", "armv9.0a"],
"vendor": "ARM",
"features": [
"fp",
"asimd",
"evtstrm",
"aes",
"pmull",
"sha1",
"sha2",
"crc32",
"atomics",
"fphp",
"asimdhp",
"cpuid",
"asimdrdm",
"jscvt",
"fcma",
"lrcpc",
"dcpop",
"sha3",
"sm3",
"sm4",
"asimddp",
"sha512",
"sve",
"asimdfhm",
"dit",
"uscat",
"ilrcpc",
"flagm",
"ssbs",
"sb",
"paca",
"pacg",
"dcpodp",
"sve2",
"sveaes",
"svepmull",
"svebitperm",
"svesha3",
"svesm4",
"flagm2",
"frint",
"svei8mm",
"svebf16",
"i8mm",
"bf16",
"dgh",
"bti"
],
"compilers" : {
"gcc": [
{
"versions": "4.8:5.99",
"flags": "-march=armv8-a"
},
{
"versions": "6:6.99",
"flags" : "-march=armv8.1-a"
},
{
"versions": "7.0:7.99",
"flags" : "-march=armv8.2-a -mtune=cortex-a72"
},
{
"versions": "8.0:8.99",
"flags" : "-march=armv8.4-a+sve -mtune=cortex-a72"
},
{
"versions": "9.0:9.99",
"flags" : "-march=armv8.5-a+sve -mtune=cortex-a76"
},
{
"versions": "10.0:11.99",
"flags" : "-march=armv8.5-a+sve+sve2+i8mm+bf16 -mtune=cortex-a77"
},
{
"versions": "12.0:12.99",
"flags" : "-march=armv9-a+i8mm+bf16 -mtune=cortex-a710"
},
{
"versions": "13.0:",
"flags" : "-mcpu=neoverse-v2"
}
],
"clang" : [
{
"versions": "9.0:10.99",
"flags" : "-march=armv8.5-a+sve"
},
{
"versions": "11.0:13.99",
"flags" : "-march=armv8.5-a+sve+sve2+i8mm+bf16"
},
{
"versions": "14.0:15.99",
"flags" : "-march=armv9-a+i8mm+bf16"
},
{
"versions": "16.0:",
"flags" : "-mcpu=neoverse-v2"
}
],
"arm" : [
{
"versions": "23.04.0:",
"flags" : "-mcpu=neoverse-v2"
}
],
"nvhpc" : [
{
"versions": "23.3:",
"name": "neoverse-v2",
"flags": "-tp {name}"
}
]
}
},
"m1": {
"from": ["armv8.4a"],
"vendor": "Apple",

View File

@@ -143,7 +143,9 @@ def _bootstrap_config_scopes() -> Sequence["spack.config.ConfigScope"]:
def _add_compilers_if_missing() -> None:
arch = spack.spec.ArchSpec.frontend_arch()
if not spack.compilers.compilers_for_arch(arch):
new_compilers = spack.compilers.find_new_compilers()
new_compilers = spack.compilers.find_new_compilers(
mixed_toolchain=sys.platform == "darwin"
)
if new_compilers:
spack.compilers.add_compilers_to_config(new_compilers, init_config=False)

View File

@@ -291,6 +291,10 @@ def try_import(self, module: str, abstract_spec_str: str) -> bool:
with spack_python_interpreter():
# Add hint to use frontend operating system on Cray
concrete_spec = spack.spec.Spec(abstract_spec_str + " ^" + spec_for_current_python())
# This is needed to help the old concretizer take the `setuptools` dependency
# only when bootstrapping from sources on Python 3.12
if spec_for_current_python() == "python@3.12":
concrete_spec.constrain("+force_setuptools")
if module == "clingo":
# TODO: remove when the old concretizer is deprecated # pylint: disable=fixme

View File

@@ -1016,10 +1016,17 @@ def get_env_modifications(self) -> EnvironmentModifications:
self._make_runnable(dspec, env)
if self.should_setup_run_env & flag:
run_env_mods = EnvironmentModifications()
for spec in dspec.dependents(deptype=dt.LINK | dt.RUN):
if id(spec) in self.nodes_in_subdag:
pkg.setup_dependent_run_environment(env, spec)
pkg.setup_run_environment(env)
pkg.setup_dependent_run_environment(run_env_mods, spec)
pkg.setup_run_environment(run_env_mods)
if self.context == Context.BUILD:
# Don't let the runtime environment of compiler-like dependencies leak into the
# build env
run_env_mods.drop("CC", "CXX", "F77", "FC")
env.extend(run_env_mods)
return env
def _make_buildtime_detectable(self, dep: spack.spec.Spec, env: EnvironmentModifications):

View File

@@ -46,7 +46,22 @@
from spack.reporters import CDash, CDashConfiguration
from spack.reporters.cdash import build_stamp as cdash_build_stamp
JOB_RETRY_CONDITIONS = ["always"]
# See https://docs.gitlab.com/ee/ci/yaml/#retry for descriptions of conditions
JOB_RETRY_CONDITIONS = [
# "always",
"unknown_failure",
"script_failure",
"api_failure",
"stuck_or_timeout_failure",
"runner_system_failure",
"runner_unsupported",
"stale_schedule",
# "job_execution_timeout",
"archived_failure",
"unmet_prerequisites",
"scheduler_failure",
"data_integrity_failure",
]
TEMP_STORAGE_MIRROR_NAME = "ci_temporary_mirror"
SPACK_RESERVED_TAGS = ["public", "protected", "notary"]

View File

@@ -796,7 +796,9 @@ def names(args: Namespace, out: IO) -> None:
commands = copy.copy(spack.cmd.all_commands())
if args.aliases:
commands.extend(spack.main.aliases.keys())
aliases = spack.config.get("config:aliases")
if aliases:
commands.extend(aliases.keys())
colify(commands, output=out)
@@ -812,8 +814,10 @@ def bash(args: Namespace, out: IO) -> None:
parser = spack.main.make_argument_parser()
spack.main.add_all_commands(parser)
aliases = ";".join(f"{key}:{val}" for key, val in spack.main.aliases.items())
out.write(f'SPACK_ALIASES="{aliases}"\n\n')
aliases_config = spack.config.get("config:aliases")
if aliases_config:
aliases = ";".join(f"{key}:{val}" for key, val in aliases_config.items())
out.write(f'SPACK_ALIASES="{aliases}"\n\n')
writer = BashCompletionWriter(parser.prog, out, args.aliases)
writer.write(parser)

View File

@@ -31,6 +31,19 @@ def setup_parser(subparser):
aliases=["add"],
help="search the system for compilers to add to Spack configuration",
)
mixed_toolchain_group = find_parser.add_mutually_exclusive_group()
mixed_toolchain_group.add_argument(
"--mixed-toolchain",
action="store_true",
default=sys.platform == "darwin",
help="Allow mixed toolchains (for example: clang, clang++, gfortran)",
)
mixed_toolchain_group.add_argument(
"--no-mixed-toolchain",
action="store_false",
dest="mixed_toolchain",
help="Do not allow mixed toolchains (for example: clang, clang++, gfortran)",
)
find_parser.add_argument("add_paths", nargs=argparse.REMAINDER)
find_parser.add_argument(
"--scope",
@@ -86,7 +99,9 @@ def compiler_find(args):
# Below scope=None because we want new compilers that don't appear
# in any other configuration.
new_compilers = spack.compilers.find_new_compilers(paths, scope=None)
new_compilers = spack.compilers.find_new_compilers(
paths, scope=None, mixed_toolchain=args.mixed_toolchain
)
if new_compilers:
spack.compilers.add_compilers_to_config(new_compilers, scope=args.scope, init_config=False)
n = len(new_compilers)
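Based on the options defined above, the new behavior can also be requested
explicitly (usage sketch; on macOS the flag is already on by default):

    $ spack compiler find --mixed-toolchain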

View File

@@ -407,7 +407,9 @@ def config_prefer_upstream(args):
pkgs = {}
for spec in pref_specs:
# Collect all the upstream compilers and versions for this package.
pkg = pkgs.get(spec.name, {"version": [], "compiler": []})
pkg = pkgs.get(spec.name, {"version": []})
all = pkgs.get("all", {"compiler": []})
pkgs["all"] = all
pkgs[spec.name] = pkg
# We have no existing variant if this is our first added version.
@@ -418,8 +420,8 @@ def config_prefer_upstream(args):
pkg["version"].append(version)
compiler = str(spec.compiler)
if compiler not in pkg["compiler"]:
pkg["compiler"].append(compiler)
if compiler not in all["compiler"]:
all["compiler"].append(compiler)
# Get and list all the variants that differ from the default.
variants = []

View File

@@ -99,10 +99,7 @@ def dev_build(self, args):
spec = specs[0]
if not spack.repo.PATH.exists(spec.name):
tty.die(
"No package for '{0}' was found.".format(spec.name),
" Use `spack create` to create a new package",
)
raise spack.repo.UnknownPackageError(spec.name)
if not spec.versions.concrete_range_as_version:
tty.die(

View File

@@ -43,10 +43,7 @@ def edit_package(name, repo_path, namespace):
if not os.access(path, os.R_OK):
tty.die("Insufficient permissions on '%s'!" % path)
else:
tty.die(
"No package for '{0}' was found.".format(spec.name),
" Use `spack create` to create a new package",
)
raise spack.repo.UnknownPackageError(spec.name)
editor(path)

View File

@@ -5,6 +5,7 @@
import argparse
import os
import shlex
import shutil
import sys
import tempfile
@@ -144,10 +145,13 @@ def create_temp_env_directory():
return tempfile.mkdtemp(prefix="spack-")
def env_activate(args):
if not args.activate_env and not args.dir and not args.temp:
tty.die("spack env activate requires an environment name, directory, or --temp")
def _tty_info(msg):
"""tty.info like function that prints the equivalent printf statement for eval."""
decorated = f'{colorize("@*b{==>}")} {msg}\n'
print(f"printf {shlex.quote(decorated)};")
def env_activate(args):
if not args.shell:
spack.cmd.common.shell_init_instructions(
"spack env activate", " eval `spack env activate {sh_arg} [...]`"
@@ -160,12 +164,25 @@ def env_activate(args):
env_name_or_dir = args.activate_env or args.dir
# When executing `spack env activate` without further arguments, activate
# the default environment. It's created when it doesn't exist yet.
if not env_name_or_dir and not args.temp:
short_name = "default"
if not ev.exists(short_name):
ev.create(short_name)
action = "Created and activated"
else:
action = "Activated"
env_path = ev.root(short_name)
_tty_info(f"{action} default environment in {env_path}")
# Temporary environment
if args.temp:
elif args.temp:
env = create_temp_env_directory()
env_path = os.path.abspath(env)
short_name = os.path.basename(env_path)
ev.create_in_dir(env).write(regenerate=False)
_tty_info(f"Created and activated temporary environment in {env_path}")
# Managed environment
elif ev.exists(env_name_or_dir) and not args.dir:
@@ -385,7 +402,7 @@ def env_remove(args):
try:
env = ev.read(env_name)
read_envs.append(env)
except spack.config.ConfigFormatError:
except (spack.config.ConfigFormatError, ev.SpackEnvironmentConfigError):
bad_envs.append(env_name)
if not args.yes_to_all:
@@ -553,8 +570,8 @@ def env_update_setup_parser(subparser):
def env_update(args):
manifest_file = ev.manifest_file(args.update_env)
backup_file = manifest_file + ".bkp"
needs_update = not ev.is_latest_format(manifest_file)
needs_update = not ev.is_latest_format(manifest_file)
if not needs_update:
tty.msg('No update needed for the environment "{0}"'.format(args.update_env))
return
@@ -672,18 +689,31 @@ def env_depfile(args):
# Currently only make is supported.
spack.cmd.require_active_env(cmd_name="env depfile")
env = ev.active_environment()
# What things do we build when running make? By default, we build the
# root specs. If specific specs are provided as input, we build those.
filter_specs = spack.cmd.parse_specs(args.specs) if args.specs else None
template = spack.tengine.make_environment().get_template(os.path.join("depfile", "Makefile"))
model = depfile.MakefileModel.from_env(
ev.active_environment(),
env,
filter_specs=filter_specs,
pkg_buildcache=depfile.UseBuildCache.from_string(args.use_buildcache[0]),
dep_buildcache=depfile.UseBuildCache.from_string(args.use_buildcache[1]),
make_prefix=args.make_prefix,
jobserver=args.jobserver,
)
# Warn in case we're generating a depfile for an empty environment. We don't automatically
# concretize; the user should do that explicitly. Could be changed in the future if requested.
if model.empty:
if not env.user_specs:
tty.warn("no specs in the environment")
elif filter_specs is not None:
tty.warn("no concrete matching specs found in environment")
else:
tty.warn("environment is not concretized. Run `spack concretize` first")
makefile = template.render(model.to_dict())
# Finally write to stdout/file.

View File

@@ -23,7 +23,7 @@
# tutorial configuration parameters
tutorial_branch = "releases/v0.20"
tutorial_branch = "releases/v0.21"
tutorial_mirror = "file:///mirror"
tutorial_key = os.path.join(spack.paths.share_path, "keys", "tutorial.pub")

View File

@@ -10,7 +10,7 @@
import itertools
import multiprocessing.pool
import os
from typing import Dict, List
from typing import Dict, List, Optional, Tuple
import archspec.cpu
@@ -21,6 +21,7 @@
import spack.compiler
import spack.config
import spack.error
import spack.operating_systems
import spack.paths
import spack.platforms
import spack.spec
@@ -223,13 +224,16 @@ def all_compiler_specs(scope=None, init_config=True):
]
def find_compilers(path_hints=None):
def find_compilers(
path_hints: Optional[List[str]] = None, *, mixed_toolchain=False
) -> List["spack.compiler.Compiler"]:
"""Return the list of compilers found in the paths given as arguments.
Args:
path_hints (list or None): list of path hints where to look for.
A sensible default based on the ``PATH`` environment variable
will be used if the value is None
path_hints: list of path hints where to look for. A sensible default based on the ``PATH``
environment variable will be used if the value is None
mixed_toolchain: allow mixing compilers from different toolchains if otherwise missing for
a certain language
"""
if path_hints is None:
path_hints = get_path("PATH")
@@ -250,7 +254,7 @@ def find_compilers(path_hints=None):
finally:
tp.close()
def valid_version(item):
def valid_version(item: Tuple[Optional[DetectVersionArgs], Optional[str]]) -> bool:
value, error = item
if error is None:
return True
@@ -262,25 +266,37 @@ def valid_version(item):
pass
return False
def remove_errors(item):
def remove_errors(
item: Tuple[Optional[DetectVersionArgs], Optional[str]]
) -> DetectVersionArgs:
value, _ = item
assert value is not None
return value
return make_compiler_list(map(remove_errors, filter(valid_version, detected_versions)))
return make_compiler_list(
[remove_errors(detected) for detected in detected_versions if valid_version(detected)],
mixed_toolchain=mixed_toolchain,
)
def find_new_compilers(path_hints=None, scope=None):
def find_new_compilers(
path_hints: Optional[List[str]] = None,
scope: Optional[str] = None,
*,
mixed_toolchain: bool = False,
):
"""Same as ``find_compilers`` but return only the compilers that are not
already in compilers.yaml.
Args:
path_hints (list or None): list of path hints where to look for.
A sensible default based on the ``PATH`` environment variable
will be used if the value is None
scope (str): scope to look for a compiler. If None consider the
merged configuration.
path_hints: list of path hints where to look for. A sensible default based on the ``PATH``
environment variable will be used if the value is None
scope: scope to look for a compiler. If None consider the merged configuration.
mixed_toolchain: allow mixing compilers from different toolchains if otherwise missing for
a certain language
"""
compilers = find_compilers(path_hints)
compilers = find_compilers(path_hints, mixed_toolchain=mixed_toolchain)
return select_new_compilers(compilers, scope)
@@ -638,7 +654,9 @@ def all_compiler_types():
)
def arguments_to_detect_version_fn(operating_system, paths):
def arguments_to_detect_version_fn(
operating_system: spack.operating_systems.OperatingSystem, paths: List[str]
) -> List[DetectVersionArgs]:
"""Returns a list of DetectVersionArgs tuples to be used in a
corresponding function to detect compiler versions.
@@ -646,8 +664,7 @@ def arguments_to_detect_version_fn(operating_system, paths):
function by providing a method called with the same name.
Args:
operating_system (spack.operating_systems.OperatingSystem): the operating system
on which we are looking for compilers
operating_system: the operating system on which we are looking for compilers
paths: paths to search for compilers
Returns:
@@ -656,10 +673,10 @@ def arguments_to_detect_version_fn(operating_system, paths):
compilers in this OS.
"""
def _default(search_paths):
command_arguments = []
def _default(search_paths: List[str]) -> List[DetectVersionArgs]:
command_arguments: List[DetectVersionArgs] = []
files_to_be_tested = fs.files_in(*search_paths)
for compiler_name in spack.compilers.supported_compilers_for_host_platform():
for compiler_name in supported_compilers_for_host_platform():
compiler_cls = class_for_compiler_name(compiler_name)
for language in ("cc", "cxx", "f77", "fc"):
@@ -684,7 +701,9 @@ def _default(search_paths):
return fn(paths)
def detect_version(detect_version_args):
def detect_version(
detect_version_args: DetectVersionArgs,
) -> Tuple[Optional[DetectVersionArgs], Optional[str]]:
"""Computes the version of a compiler and adds it to the information
passed as input.
@@ -693,8 +712,7 @@ def detect_version(detect_version_args):
needs to be checked by the code dispatching the calls.
Args:
detect_version_args (DetectVersionArgs): information on the
compiler for which we should detect the version.
detect_version_args: information on the compiler for which we should detect the version.
Returns:
A ``(DetectVersionArgs, error)`` tuple. If ``error`` is ``None`` the
@@ -710,7 +728,7 @@ def _default(fn_args):
path = fn_args.path
# Get compiler names and the callback to detect their versions
callback = getattr(compiler_cls, "{0}_version".format(language))
callback = getattr(compiler_cls, f"{language}_version")
try:
version = callback(path)
@@ -736,13 +754,15 @@ def _default(fn_args):
return fn(detect_version_args)
def make_compiler_list(detected_versions):
def make_compiler_list(
detected_versions: List[DetectVersionArgs], mixed_toolchain: bool = False
) -> List["spack.compiler.Compiler"]:
"""Process a list of detected versions and turn them into a list of
compiler specs.
Args:
detected_versions (list): list of DetectVersionArgs containing a
valid version
detected_versions: list of DetectVersionArgs containing a valid version
mixed_toolchain: allow mixing compilers from different toolchains if language is missing
Returns:
list: list of Compiler objects
@@ -751,7 +771,7 @@ def make_compiler_list(detected_versions):
sorted_compilers = sorted(detected_versions, key=group_fn)
# Gather items in a dictionary by the id, name variation and language
compilers_d = {}
compilers_d: Dict[CompilerID, Dict[NameVariation, dict]] = {}
for sort_key, group in itertools.groupby(sorted_compilers, key=group_fn):
compiler_id, name_variation, language = sort_key
by_compiler_id = compilers_d.setdefault(compiler_id, {})
@@ -760,7 +780,7 @@ def make_compiler_list(detected_versions):
def _default_make_compilers(cmp_id, paths):
operating_system, compiler_name, version = cmp_id
compiler_cls = spack.compilers.class_for_compiler_name(compiler_name)
compiler_cls = class_for_compiler_name(compiler_name)
spec = spack.spec.CompilerSpec(compiler_cls.name, f"={version}")
paths = [paths.get(x, None) for x in ("cc", "cxx", "f77", "fc")]
# TODO: johnwparent - revist the following line as per discussion at:
@@ -782,13 +802,14 @@ def _default_make_compilers(cmp_id, paths):
getattr(variation, "suffix", None),
)
compilers = []
# Flatten to a list of compiler id, primary variation and compiler dictionary
flat_compilers: List[Tuple[CompilerID, NameVariation, dict]] = []
for compiler_id, by_compiler_id in compilers_d.items():
ordered = sorted(by_compiler_id, key=sort_fn)
selected_variation = ordered[0]
selected = by_compiler_id[selected_variation]
# fill any missing parts from subsequent entries
# Fill any missing parts from subsequent entries (without mixing toolchains)
for lang in ["cxx", "f77", "fc"]:
if lang not in selected:
next_lang = next(
@@ -797,14 +818,63 @@ def _default_make_compilers(cmp_id, paths):
if next_lang:
selected[lang] = next_lang
operating_system, _, _ = compiler_id
make_compilers = getattr(operating_system, "make_compilers", _default_make_compilers)
flat_compilers.append((compiler_id, selected_variation, selected))
compilers.extend(make_compilers(compiler_id, selected))
# Next, fill out the blanks of missing compilers by creating a mixed toolchain (if requested)
if mixed_toolchain:
make_mixed_toolchain(flat_compilers)
# Finally, create the compiler list
compilers = []
for compiler_id, _, compiler in flat_compilers:
make_compilers = getattr(compiler_id.os, "make_compilers", _default_make_compilers)
compilers.extend(make_compilers(compiler_id, compiler))
return compilers
def make_mixed_toolchain(compilers: List[Tuple[CompilerID, NameVariation, dict]]) -> None:
"""Add missing compilers across toolchains when they are missing for a particular language.
This currently only adds the most sensible gfortran to (apple)-clang if it doesn't have a
fortran compiler (no flang)."""
# First collect the clangs that are missing a fortran compiler
clangs_without_flang = [
(id, variation, compiler)
for id, variation, compiler in compilers
if id.compiler_name in ("clang", "apple-clang")
and "f77" not in compiler
and "fc" not in compiler
]
if not clangs_without_flang:
return
# Filter on GCCs with fortran compiler
gccs_with_fortran = [
(id, variation, compiler)
for id, variation, compiler in compilers
if id.compiler_name == "gcc" and "f77" in compiler and "fc" in compiler
]
# Sort these GCCs by "best variation" (no prefix / suffix first)
gccs_with_fortran.sort(
key=lambda x: (getattr(x[1], "prefix", None), getattr(x[1], "suffix", None))
)
# Attach the optimal GCC fortran compiler to the clangs that don't have one
for clang_id, _, clang_compiler in clangs_without_flang:
gcc_compiler = next(
(gcc[2] for gcc in gccs_with_fortran if gcc[0].os == clang_id.os), None
)
if not gcc_compiler:
continue
# Update the fc / f77 entries
clang_compiler["f77"] = gcc_compiler["f77"]
clang_compiler["fc"] = gcc_compiler["fc"]
def is_mixed_toolchain(compiler):
"""Returns True if the current compiler is a mixed toolchain,
False otherwise.

View File

@@ -5,7 +5,6 @@
import os
import re
import sys
import llnl.util.lang
@@ -114,17 +113,6 @@ def extract_version_from_output(cls, output):
return ".".join(match.groups())
return "unknown"
@classmethod
def fc_version(cls, fortran_compiler):
if sys.platform == "darwin":
return cls.default_version("clang")
return cls.default_version(fortran_compiler)
@classmethod
def f77_version(cls, f77):
return cls.fc_version(f77)
@property
def stdcxx_libs(self):
return ("-lstdc++",)

View File

@@ -5,7 +5,6 @@
import os
import re
import sys
import llnl.util.lang
@@ -39,10 +38,10 @@ class Clang(Compiler):
cxx_names = ["clang++"]
# Subclasses use possible names of Fortran 77 compiler
f77_names = ["flang", "gfortran", "xlf_r"]
f77_names = ["flang"]
# Subclasses use possible names of Fortran 90 compiler
fc_names = ["flang", "gfortran", "xlf90_r"]
fc_names = ["flang"]
version_argument = "--version"
@@ -182,16 +181,3 @@ def extract_version_from_output(cls, output):
if match:
ver = match.group(match.lastindex)
return ver
@classmethod
def fc_version(cls, fc):
# We could map from gcc/gfortran version to clang version, but on macOS
# we normally mix any version of gfortran with any version of clang.
if sys.platform == "darwin":
return cls.default_version("clang")
else:
return cls.default_version(fc)
@classmethod
def f77_version(cls, f77):
return cls.fc_version(f77)

View File

@@ -69,6 +69,7 @@
SECTION_SCHEMAS = {
"compilers": spack.schema.compilers.schema,
"concretizer": spack.schema.concretizer.schema,
"definitions": spack.schema.definitions.schema,
"mirrors": spack.schema.mirrors.schema,
"repos": spack.schema.repos.schema,
"packages": spack.schema.packages.schema,
@@ -994,6 +995,7 @@ def read_config_file(filename, schema=None):
key = next(iter(data))
schema = _ALL_SCHEMAS[key]
validate(data, schema)
return data
except StopIteration:

View File

@@ -1522,14 +1522,18 @@ def _query(
# TODO: like installed and known that can be queried? Or are
# TODO: these really special cases that only belong here?
# Just look up concrete specs with hashes; no fancy search.
if isinstance(query_spec, spack.spec.Spec) and query_spec.concrete:
# TODO: handling of hashes restriction is not particularly elegant.
hash_key = query_spec.dag_hash()
if hash_key in self._data and (not hashes or hash_key in hashes):
return [self._data[hash_key].spec]
else:
return []
if query_spec is not any:
if not isinstance(query_spec, spack.spec.Spec):
query_spec = spack.spec.Spec(query_spec)
# Just look up concrete specs with hashes; no fancy search.
if query_spec.concrete:
# TODO: handling of hashes restriction is not particularly elegant.
hash_key = query_spec.dag_hash()
if hash_key in self._data and (not hashes or hash_key in hashes):
return [self._data[hash_key].spec]
else:
return []
# Abstract specs require more work -- currently we test
# against everything.
@@ -1537,6 +1541,9 @@ def _query(
start_date = start_date or datetime.datetime.min
end_date = end_date or datetime.datetime.max
# save specs whose name doesn't match for last, to avoid a virtual check
deferred = []
for key, rec in self._data.items():
if hashes is not None and rec.spec.dag_hash() not in hashes:
continue
@@ -1561,8 +1568,26 @@ def _query(
if not (start_date < inst_date < end_date):
continue
if query_spec is any or rec.spec.satisfies(query_spec):
if query_spec is any:
results.append(rec.spec)
continue
# check anon specs and exact name matches first
if not query_spec.name or rec.spec.name == query_spec.name:
if rec.spec.satisfies(query_spec):
results.append(rec.spec)
# save potential virtual matches for later, but not if we already found a match
elif not results:
deferred.append(rec.spec)
# Checking for virtuals is expensive, so we save it for last and only if needed.
# If we get here, we didn't find anything in the DB that matched by name.
# If we did find something, the query spec can't be virtual b/c we matched an actual
# package installation, so skip the virtual check entirely. If we *didn't* find anything,
# check all the deferred specs *if* the query is virtual.
if not results and query_spec is not any and deferred and query_spec.virtual:
results = [spec for spec in deferred if spec.satisfies(query_spec)]
return results
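# Net effect (sketch): a query for a virtual such as "blas" only pays for the
# expensive satisfies() checks on the deferred records when nothing in the DB
# matched the query by name.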

View File

@@ -15,9 +15,12 @@
from typing import Dict, List, Optional, Set, Tuple
import llnl.util.filesystem
import llnl.util.lang
import llnl.util.tty
import spack.util.elf as elf_utils
import spack.util.environment
import spack.util.environment as environment
import spack.util.ld_so_conf
from .common import (
@@ -57,6 +60,11 @@ def common_windows_package_paths(pkg_cls=None) -> List[str]:
return paths
def file_identifier(path):
s = os.stat(path)
return (s.st_dev, s.st_ino)
def executables_in_path(path_hints: List[str]) -> Dict[str, str]:
"""Get the paths of all executables available from the current PATH.
@@ -75,12 +83,40 @@ def executables_in_path(path_hints: List[str]) -> Dict[str, str]:
return path_to_dict(search_paths)
def get_elf_compat(path):
"""For ELF files, get a triplet (EI_CLASS, EI_DATA, e_machine) and see if
it is host-compatible."""
# On ELF platforms, we try to be a bit smarter when it comes to shared
# libraries, by dropping those that are not host compatible.
with open(path, "rb") as f:
elf = elf_utils.parse_elf(f, only_header=True)
return (elf.is_64_bit, elf.is_little_endian, elf.elf_hdr.e_machine)
def accept_elf(path, host_compat):
"""Accept an ELF file if the header matches the given compat triplet,
obtained with :py:func:`get_elf_compat`. In case it's not an ELF (e.g.
static library, or some arbitrary file, fall back to is_readable_file)."""
# Fast path: assume libraries at least have .so in their basename.
# Note: don't replace with splitext, because of libsmth.so.1.2.3 file names.
if ".so" not in os.path.basename(path):
return llnl.util.filesystem.is_readable_file(path)
try:
return host_compat == get_elf_compat(path)
except (OSError, elf_utils.ElfParsingError):
return llnl.util.filesystem.is_readable_file(path)
def libraries_in_ld_and_system_library_path(
path_hints: Optional[List[str]] = None,
) -> Dict[str, str]:
"""Get the paths of all libraries available from LD_LIBRARY_PATH,
LIBRARY_PATH, DYLD_LIBRARY_PATH, DYLD_FALLBACK_LIBRARY_PATH, and
standard system library paths.
"""Get the paths of all libraries available from ``path_hints`` or the
following defaults:
- Environment variables (Linux: ``LD_LIBRARY_PATH``, Darwin: ``DYLD_LIBRARY_PATH``,
and ``DYLD_FALLBACK_LIBRARY_PATH``)
- Dynamic linker default paths (glibc: ld.so.conf, musl: ld-musl-<arch>.path)
- Default system library paths.
For convenience, this is constructed as a dictionary where the keys are
the library paths and the values are the names of the libraries
@@ -94,17 +130,45 @@ def libraries_in_ld_and_system_library_path(
constructed based on the set of LD_LIBRARY_PATH, LIBRARY_PATH,
DYLD_LIBRARY_PATH, and DYLD_FALLBACK_LIBRARY_PATH environment
variables as well as the standard system library paths.
path_hints (list): list of paths to be searched. If ``None``, the default
system paths are used.
"""
default_lib_search_paths = (
spack.util.environment.get_path("LD_LIBRARY_PATH")
+ spack.util.environment.get_path("DYLD_LIBRARY_PATH")
+ spack.util.environment.get_path("DYLD_FALLBACK_LIBRARY_PATH")
+ spack.util.ld_so_conf.host_dynamic_linker_search_paths()
)
path_hints = path_hints if path_hints is not None else default_lib_search_paths
if path_hints:
search_paths = llnl.util.filesystem.search_paths_for_libraries(*path_hints)
else:
search_paths = []
search_paths = llnl.util.filesystem.search_paths_for_libraries(*path_hints)
return path_to_dict(search_paths)
# Environment variables
if sys.platform == "darwin":
search_paths.extend(environment.get_path("DYLD_LIBRARY_PATH"))
search_paths.extend(environment.get_path("DYLD_FALLBACK_LIBRARY_PATH"))
elif sys.platform.startswith("linux"):
search_paths.extend(environment.get_path("LD_LIBRARY_PATH"))
# Dynamic linker paths
search_paths.extend(spack.util.ld_so_conf.host_dynamic_linker_search_paths())
# Drop redundant paths
search_paths = list(filter(os.path.isdir, search_paths))
# Make sure we don't doubly list /usr/lib and /lib etc.
search_paths = list(llnl.util.lang.dedupe(search_paths, key=file_identifier))
try:
host_compat = get_elf_compat(sys.executable)
accept = lambda path: accept_elf(path, host_compat)
except (OSError, elf_utils.ElfParsingError):
accept = llnl.util.filesystem.is_readable_file
path_to_lib = {}
# Reverse order of search directories so that a lib in the first
# search path entry overrides later entries
for search_path in reversed(search_paths):
for lib in os.listdir(search_path):
lib_path = os.path.join(search_path, lib)
if accept(lib_path):
path_to_lib[lib_path] = lib
return path_to_lib
def libraries_in_windows_paths(path_hints: Optional[List[str]] = None) -> Dict[str, str]:

View File

@@ -137,6 +137,7 @@ class DirectiveMeta(type):
_directive_dict_names: Set[str] = set()
_directives_to_be_executed: List[str] = []
_when_constraints_from_context: List[str] = []
_default_args: List[dict] = []
def __new__(cls, name, bases, attr_dict):
# Initialize the attribute containing the list of directives
@@ -199,6 +200,16 @@ def pop_from_context():
"""Pop the last constraint from the context"""
return DirectiveMeta._when_constraints_from_context.pop()
@staticmethod
def push_default_args(default_args):
"""Push default arguments"""
DirectiveMeta._default_args.append(default_args)
@staticmethod
def pop_default_args():
"""Pop default arguments"""
return DirectiveMeta._default_args.pop()
@staticmethod
def directive(dicts=None):
"""Decorator for Spack directives.
@@ -259,7 +270,13 @@ def _decorator(decorated_function):
directive_names.append(decorated_function.__name__)
@functools.wraps(decorated_function)
def _wrapper(*args, **kwargs):
def _wrapper(*args, **_kwargs):
# First merge default args with kwargs
kwargs = dict()
for default_args in DirectiveMeta._default_args:
kwargs.update(default_args)
kwargs.update(_kwargs)
# Inject when arguments from the context
if DirectiveMeta._when_constraints_from_context:
# Check that directives not yet supporting the when= argument
@@ -573,17 +590,21 @@ def _execute_extends(pkg):
return _execute_extends
@directive("provided")
def provides(*specs, **kwargs):
"""Allows packages to provide a virtual dependency. If a package provides
'mpi', other packages can declare that they depend on "mpi", and spack
can use the providing package to satisfy the dependency.
@directive(dicts=("provided", "provided_together"))
def provides(*specs, when: Optional[str] = None):
"""Allows packages to provide a virtual dependency.
If a package provides "mpi", other packages can declare that they depend on "mpi",
and spack can use the providing package to satisfy the dependency.
Args:
*specs: virtual specs provided by this package
when: condition when this provides clause needs to be considered
"""
def _execute_provides(pkg):
import spack.parser # Avoid circular dependency
when = kwargs.get("when")
when_spec = make_when_spec(when)
if not when_spec:
return
@@ -591,15 +612,18 @@ def _execute_provides(pkg):
# ``when`` specs for ``provides()`` need a name, as they are used
# to build the ProviderIndex.
when_spec.name = pkg.name
spec_objs = [spack.spec.Spec(x) for x in specs]
spec_names = [x.name for x in spec_objs]
if len(spec_names) > 1:
pkg.provided_together.setdefault(when_spec, []).append(set(spec_names))
for string in specs:
for provided_spec in spack.parser.parse(string):
if pkg.name == provided_spec.name:
raise CircularReferenceError("Package '%s' cannot provide itself." % pkg.name)
for provided_spec in spec_objs:
if pkg.name == provided_spec.name:
raise CircularReferenceError("Package '%s' cannot provide itself." % pkg.name)
if provided_spec not in pkg.provided:
pkg.provided[provided_spec] = set()
pkg.provided[provided_spec].add(when_spec)
if provided_spec not in pkg.provided:
pkg.provided[provided_spec] = set()
pkg.provided[provided_spec].add(when_spec)
return _execute_provides

View File

@@ -339,6 +339,7 @@
from .environment import (
TOP_LEVEL_KEY,
Environment,
SpackEnvironmentConfigError,
SpackEnvironmentError,
SpackEnvironmentViewError,
activate,
@@ -372,6 +373,7 @@
__all__ = [
"TOP_LEVEL_KEY",
"Environment",
"SpackEnvironmentConfigError",
"SpackEnvironmentError",
"SpackEnvironmentViewError",
"activate",

View File

@@ -232,6 +232,10 @@ def to_dict(self):
"pkg_ids": " ".join(self.all_pkg_identifiers),
}
@property
def empty(self):
return len(self.roots) == 0
@staticmethod
def from_env(
env: ev.Environment,
@@ -254,15 +258,10 @@ def from_env(
jobserver: when enabled, make will invoke Spack with jobserver support. For
dry-run this should be disabled.
"""
# If no specs are provided as a filter, build all the specs in the environment.
if filter_specs:
entrypoints = [env.matching_spec(s) for s in filter_specs]
else:
entrypoints = [s for _, s in env.concretized_specs()]
roots = env.all_matching_specs(*filter_specs) if filter_specs else env.concrete_roots()
visitor = DepfileSpecVisitor(pkg_buildcache, dep_buildcache)
traverse.traverse_breadth_first_with_visitor(
entrypoints, traverse.CoverNodesVisitor(visitor, key=lambda s: s.dag_hash())
roots, traverse.CoverNodesVisitor(visitor, key=lambda s: s.dag_hash())
)
return MakefileModel(env, entrypoints, visitor.adjacency_list, make_prefix, jobserver)
return MakefileModel(env, roots, visitor.adjacency_list, make_prefix, jobserver)

View File

@@ -342,7 +342,7 @@ def create_in_dir(
manifest.flush()
except spack.config.ConfigFormatError as e:
except (spack.config.ConfigFormatError, SpackEnvironmentConfigError) as e:
shutil.rmtree(manifest_dir)
raise e
@@ -396,7 +396,13 @@ def all_environments():
def _read_yaml(str_or_file):
"""Read YAML from a file for round-trip parsing."""
data = syaml.load_config(str_or_file)
try:
data = syaml.load_config(str_or_file)
except syaml.SpackYAMLError as e:
raise SpackEnvironmentConfigError(
f"Invalid environment configuration detected: {e.message}"
)
filename = getattr(str_or_file, "name", None)
default_data = spack.config.validate(data, spack.schema.env.schema, filename)
return data, default_data
@@ -781,10 +787,18 @@ def _re_read(self):
"""Reinitialize the environment object."""
self.clear(re_read=True)
self.manifest = EnvironmentManifestFile(self.path)
self._read()
self._read(re_read=True)
def _read(self):
self._construct_state_from_manifest()
def _read(self, re_read=False):
# If the manifest has included files, then some of the information
# (e.g., definitions) MAY be in those files. So we need to ensure
# the config is populated with any associated spec lists in order
# to fully construct the manifest state.
includes = self.manifest[TOP_LEVEL_KEY].get("include", [])
if includes and not re_read:
prepare_config_scope(self)
self._construct_state_from_manifest(re_read)
if os.path.exists(self.lock_path):
with open(self.lock_path) as f:
@@ -798,21 +812,30 @@ def write_transaction(self):
"""Get a write lock context manager for use in a `with` block."""
return lk.WriteTransaction(self.txlock, acquire=self._re_read)
def _construct_state_from_manifest(self):
def _process_definition(self, item):
"""Process a single spec definition item."""
entry = copy.deepcopy(item)
when = _eval_conditional(entry.pop("when", "True"))
assert len(entry) == 1
if when:
name, spec_list = next(iter(entry.items()))
user_specs = SpecList(name, spec_list, self.spec_lists.copy())
if name in self.spec_lists:
self.spec_lists[name].extend(user_specs)
else:
self.spec_lists[name] = user_specs
def _construct_state_from_manifest(self, re_read=False):
"""Read manifest file and set up user specs."""
self.spec_lists = collections.OrderedDict()
if not re_read:
for item in spack.config.get("definitions", []):
self._process_definition(item)
env_configuration = self.manifest[TOP_LEVEL_KEY]
for item in env_configuration.get("definitions", []):
entry = copy.deepcopy(item)
when = _eval_conditional(entry.pop("when", "True"))
assert len(entry) == 1
if when:
name, spec_list = next(iter(entry.items()))
user_specs = SpecList(name, spec_list, self.spec_lists.copy())
if name in self.spec_lists:
self.spec_lists[name].extend(user_specs)
else:
self.spec_lists[name] = user_specs
self._process_definition(item)
spec_list = env_configuration.get(user_speclist_name, [])
user_specs = SpecList(
@@ -857,7 +880,9 @@ def clear(self, re_read=False):
yaml, and need to be maintained when re-reading an existing
environment.
"""
self.spec_lists = {user_speclist_name: SpecList()} # specs from yaml
self.spec_lists = collections.OrderedDict()
self.spec_lists[user_speclist_name] = SpecList()
self.dev_specs = {} # dev-build specs from yaml
self.concretized_user_specs = [] # user specs from last concretize
self.concretized_order = [] # roots of last concretize, in order
@@ -1006,7 +1031,8 @@ def included_config_scopes(self):
elif include_url.scheme:
raise ValueError(
"Unsupported URL scheme for environment include: {}".format(config_path)
f"Unsupported URL scheme ({include_url.scheme}) for "
f"environment include: {config_path}"
)
# treat relative paths as relative to the environment
@@ -1068,8 +1094,10 @@ def update_stale_references(self, from_list=None):
from_list = next(iter(self.spec_lists.keys()))
index = list(self.spec_lists.keys()).index(from_list)
# spec_lists is an OrderedDict, all list entries after the modified
# list may refer to the modified list. Update stale references
# spec_lists is an OrderedDict to ensure lists read from the manifest
# are maintained in order; hence, all list entries after the modified
# list may refer to the modified list, requiring stale references to be
# updated.
for i, (name, speclist) in enumerate(
list(self.spec_lists.items())[index + 1 :], index + 1
):
@@ -1167,7 +1195,7 @@ def change_existing_spec(
def remove(self, query_spec, list_name=user_speclist_name, force=False):
"""Remove specs from an environment that match a query_spec"""
err_msg_header = (
f"cannot remove {query_spec} from '{list_name}' definition "
f"Cannot remove '{query_spec}' from '{list_name}' definition "
f"in {self.manifest.manifest_file}"
)
query_spec = Spec(query_spec)
@@ -1198,11 +1226,10 @@ def remove(self, query_spec, list_name=user_speclist_name, force=False):
list_to_change.remove(spec)
self.update_stale_references(list_name)
new_specs = set(self.user_specs)
except spack.spec_list.SpecListError:
except spack.spec_list.SpecListError as e:
# define new specs list
new_specs = set(self.user_specs)
msg = f"Spec '{spec}' is part of a spec matrix and "
msg += f"cannot be removed from list '{list_to_change}'."
msg = str(e)
if force:
msg += " It will be removed from the concrete specs."
# Mock new specs, so we can remove this spec from concrete spec lists
@@ -1525,7 +1552,11 @@ def _concretize_separately(self, tests=False):
batch = []
for j, (i, concrete, duration) in enumerate(
spack.util.parallel.imap_unordered(
_concretize_task, args, processes=num_procs, debug=tty.is_debug()
_concretize_task,
args,
processes=num_procs,
debug=tty.is_debug(),
maxtaskperchild=1,
)
):
batch.append((i, concrete))
@@ -2063,7 +2094,7 @@ def matching_spec(self, spec):
def removed_specs(self):
"""Tuples of (user spec, concrete spec) for all specs that will be
removed on nexg concretize."""
removed on next concretize."""
needed = set()
for s, c in self.concretized_specs():
if s in self.user_specs:
@@ -2722,7 +2753,7 @@ def override_user_spec(self, user_spec: str, idx: int) -> None:
self.changed = True
def add_definition(self, user_spec: str, list_name: str) -> None:
"""Appends a user spec to the first active definition mathing the name passed as argument.
"""Appends a user spec to the first active definition matching the name passed as argument.
Args:
user_spec: user spec to be appended
@@ -2935,3 +2966,7 @@ class SpackEnvironmentError(spack.error.SpackError):
class SpackEnvironmentViewError(SpackEnvironmentError):
"""Class for errors regarding view generation."""
class SpackEnvironmentConfigError(SpackEnvironmentError):
"""Class for Spack environment-specific configuration errors."""

View File

@@ -5,6 +5,7 @@
"""Service functions and classes to implement the hooks
for Spack's command extensions.
"""
import difflib
import importlib
import os
import re
@@ -176,10 +177,19 @@ class CommandNotFoundError(spack.error.SpackError):
"""
def __init__(self, cmd_name):
super().__init__(
msg = (
"{0} is not a recognized Spack command or extension command;"
" check with `spack commands`.".format(cmd_name)
)
long_msg = None
similar = difflib.get_close_matches(cmd_name, spack.cmd.all_commands())
if 1 <= len(similar) <= 5:
long_msg = "\nDid you mean one of the following commands?\n "
long_msg += "\n ".join(similar)
super().__init__(msg, long_msg)
class ExtensionNamingError(spack.error.SpackError):

View File

@@ -528,10 +528,15 @@ def node_entry(self, node):
def edge_entry(self, edge):
colormap = {"build": "dodgerblue", "link": "crimson", "run": "goldenrod"}
label = ""
if edge.virtuals:
label = f" xlabel=\"virtuals={','.join(edge.virtuals)}\""
return (
edge.parent.dag_hash(),
edge.spec.dag_hash(),
f"[color=\"{':'.join(colormap[x] for x in dt.flag_to_tuple(edge.depflag))}\"]",
f"[color=\"{':'.join(colormap[x] for x in dt.flag_to_tuple(edge.depflag))}\""
+ label
+ "]",
)

View File

@@ -3,17 +3,22 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from typing import Optional, Set
from llnl.util import tty
import spack.config
import spack.modules
import spack.spec
def _for_each_enabled(spec, method_name, explicit=None):
def _for_each_enabled(
spec: spack.spec.Spec, method_name: str, explicit: Optional[bool] = None
) -> None:
"""Calls a method for each enabled module"""
set_names = set(spack.config.get("modules", {}).keys())
set_names: Set[str] = set(spack.config.get("modules", {}).keys())
for name in set_names:
enabled = spack.config.get("modules:%s:enable" % name)
enabled = spack.config.get(f"modules:{name}:enable")
if not enabled:
tty.debug("NO MODULE WRITTEN: list of enabled module files is empty")
continue
@@ -28,7 +33,7 @@ def _for_each_enabled(spec, method_name, explicit=None):
tty.warn(msg.format(method_name, str(e)))
def post_install(spec, explicit):
def post_install(spec, explicit: bool):
import spack.environment as ev # break import cycle
if ev.active_environment():

View File

@@ -16,11 +16,13 @@
import os.path
import pstats
import re
import shlex
import signal
import subprocess as sp
import sys
import traceback
import warnings
from typing import List, Tuple
import archspec.cpu
@@ -49,9 +51,6 @@
#: names of profile statistics
stat_names = pstats.Stats.sort_arg_dict_default
#: top-level aliases for Spack commands
aliases = {"concretise": "concretize", "containerise": "containerize", "rm": "remove"}
#: help levels in order of detail (i.e., number of commands shown)
levels = ["short", "long"]
@@ -359,7 +358,10 @@ def add_command(self, cmd_name):
module = spack.cmd.get_module(cmd_name)
# build a list of aliases
alias_list = [k for k, v in aliases.items() if v == cmd_name]
alias_list = []
aliases = spack.config.get("config:aliases")
if aliases:
alias_list = [k for k, v in aliases.items() if shlex.split(v)[0] == cmd_name]
subparser = self.subparsers.add_parser(
cmd_name,
@@ -670,7 +672,6 @@ def __init__(self, command_name, subprocess=False):
Windows, where it is always False.
"""
self.parser = make_argument_parser()
self.command = self.parser.add_command(command_name)
self.command_name = command_name
# TODO: figure out how to support this on windows
self.subprocess = subprocess if sys.platform != "win32" else False
@@ -702,13 +703,14 @@ def __call__(self, *argv, **kwargs):
if self.subprocess:
p = sp.Popen(
[spack.paths.spack_script, self.command_name] + prepend + list(argv),
[spack.paths.spack_script] + prepend + [self.command_name] + list(argv),
stdout=sp.PIPE,
stderr=sp.STDOUT,
)
out, self.returncode = p.communicate()
out = out.decode()
else:
command = self.parser.add_command(self.command_name)
args, unknown = self.parser.parse_known_args(
prepend + [self.command_name] + list(argv)
)
@@ -716,7 +718,7 @@ def __call__(self, *argv, **kwargs):
out = io.StringIO()
try:
with log_output(out, echo=True):
self.returncode = _invoke_command(self.command, self.parser, args, unknown)
self.returncode = _invoke_command(command, self.parser, args, unknown)
except SystemExit as e:
self.returncode = e.code
@@ -870,6 +872,46 @@ def restore_macos_dyld_vars():
os.environ[dyld_var] = os.environ[stored_var_name]
def resolve_alias(cmd_name: str, cmd: List[str]) -> Tuple[str, List[str]]:
"""Resolves aliases in the given command.
Args:
cmd_name: command name.
cmd: command line arguments.
Returns:
new command name and arguments.
"""
all_commands = spack.cmd.all_commands()
aliases = spack.config.get("config:aliases")
if aliases:
for key, value in aliases.items():
if " " in key:
tty.warn(
f"Alias '{key}' (mapping to '{value}') contains a space"
", which is not supported."
)
if key in all_commands:
tty.warn(
f"Alias '{key}' (mapping to '{value}') attempts to override"
" built-in command."
)
if cmd_name not in all_commands:
alias = None
if aliases:
alias = aliases.get(cmd_name)
if alias is not None:
alias_parts = shlex.split(alias)
cmd_name = alias_parts[0]
cmd = alias_parts + cmd[1:]
return cmd_name, cmd
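# Usage sketch (hypothetical alias, not part of this changeset): given
# config:aliases == {"sp": "spec --reuse"}, resolve_alias("sp", ["sp", "hdf5"])
# returns ("spec", ["spec", "--reuse", "hdf5"]).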
def _main(argv=None):
"""Logic for the main entry point for the Spack command.
@@ -962,7 +1004,7 @@ def _main(argv=None):
# Try to load the particular command the caller asked for.
cmd_name = args.command[0]
cmd_name = aliases.get(cmd_name, cmd_name)
cmd_name, args.command = resolve_alias(cmd_name, args.command)
# set up a bootstrap context, if asked.
# bootstrap context needs to include parsing the command, b/c things
@@ -974,10 +1016,10 @@ def _main(argv=None):
bootstrap_context = bootstrap.ensure_bootstrap_configuration()
with bootstrap_context:
return finish_parse_and_run(parser, cmd_name, env_format_error)
return finish_parse_and_run(parser, cmd_name, args.command, env_format_error)
def finish_parse_and_run(parser, cmd_name, env_format_error):
def finish_parse_and_run(parser, cmd_name, cmd, env_format_error):
"""Finish parsing after we know the command to run."""
# add the found command to the parser and re-run then re-parse
command = parser.add_command(cmd_name)

View File

@@ -7,10 +7,15 @@
include Tcl non-hierarchical modules, Lua hierarchical modules, and others.
"""
from .common import disable_modules
from typing import Dict, Type
from .common import BaseModuleFileWriter, disable_modules
from .lmod import LmodModulefileWriter
from .tcl import TclModulefileWriter
__all__ = ["TclModulefileWriter", "LmodModulefileWriter", "disable_modules"]
module_types = {"tcl": TclModulefileWriter, "lmod": LmodModulefileWriter}
module_types: Dict[str, Type[BaseModuleFileWriter]] = {
"tcl": TclModulefileWriter,
"lmod": LmodModulefileWriter,
}

View File

@@ -35,7 +35,7 @@
import os.path
import re
import string
from typing import Optional
from typing import List, Optional
import llnl.util.filesystem
import llnl.util.tty as tty
@@ -50,6 +50,7 @@
import spack.projections as proj
import spack.repo
import spack.schema.environment
import spack.spec
import spack.store
import spack.tengine as tengine
import spack.util.environment
@@ -395,16 +396,14 @@ class BaseConfiguration:
default_projections = {"all": "{name}/{version}-{compiler.name}-{compiler.version}"}
def __init__(self, spec, module_set_name, explicit=None):
def __init__(self, spec: spack.spec.Spec, module_set_name: str, explicit: bool) -> None:
# Module where type(self) is defined
self.module = inspect.getmodule(self)
m = inspect.getmodule(self)
assert m is not None # make mypy happy
self.module = m
# Spec for which we want to generate a module file
self.spec = spec
self.name = module_set_name
# Software installation has been explicitly asked (get this information from
# db when querying an existing module, like during a refresh or rm operations)
if explicit is None:
explicit = spec._installed_explicitly()
self.explicit = explicit
# Dictionary of configuration options that should be applied
# to the spec
@@ -458,7 +457,11 @@ def suffixes(self):
if constraint in self.spec:
suffixes.append(suffix)
suffixes = list(dedupe(suffixes))
if self.hash:
# For hidden modules we can always add a fixed-length hash as a suffix, since it guards
# against file name clashes, and the module is not exposed to the user anyway.
if self.hidden:
suffixes.append(self.spec.dag_hash(length=7))
elif self.hash:
suffixes.append(self.hash)
return suffixes
@@ -551,8 +554,7 @@ def exclude_env_vars(self):
def _create_list_for(self, what):
include = []
for item in self.conf[what]:
conf = type(self)(item, self.name)
if not conf.excluded:
if not self.module.make_configuration(item, self.name).excluded:
include.append(item)
return include
@@ -731,7 +733,9 @@ def environment_modifications(self):
# for that to work, globals have to be set on the package modules, and the
# whole chain of setup_dependent_package has to be followed from leaf to spec.
# So: just run it here, but don't collect env mods.
spack.build_environment.SetupContext(context=Context.RUN).set_all_package_py_globals()
spack.build_environment.SetupContext(
spec, context=Context.RUN
).set_all_package_py_globals()
# Then run setup_dependent_run_environment before setup_run_environment.
for dep in spec.dependencies(deptype=("link", "run")):
@@ -824,8 +828,7 @@ def autoload(self):
def _create_module_list_of(self, what):
m = self.conf.module
name = self.conf.name
explicit = self.conf.explicit
return [m.make_layout(x, name, explicit).use_name for x in getattr(self.conf, what)]
return [m.make_layout(x, name).use_name for x in getattr(self.conf, what)]
@tengine.context_property
def verbose(self):
@@ -834,12 +837,19 @@ def verbose(self):
class BaseModuleFileWriter:
def __init__(self, spec, module_set_name, explicit=None):
default_template: str
hide_cmd_format: str
modulerc_header: List[str]
def __init__(
self, spec: spack.spec.Spec, module_set_name: str, explicit: Optional[bool] = None
) -> None:
self.spec = spec
# This class is meant to be derived. Get the module of the
# actual writer.
self.module = inspect.getmodule(self)
assert self.module is not None # make mypy happy
m = self.module
# Create the triplet of configuration/layout/context

View File

@@ -6,8 +6,7 @@
import collections
import itertools
import os.path
import posixpath
from typing import Any, Dict, List
from typing import Dict, List, Optional, Tuple
import llnl.util.filesystem as fs
import llnl.util.lang as lang
@@ -24,18 +23,19 @@
#: lmod specific part of the configuration
def configuration(module_set_name):
config_path = "modules:%s:lmod" % module_set_name
config = spack.config.get(config_path, {})
return config
def configuration(module_set_name: str) -> dict:
return spack.config.get(f"modules:{module_set_name}:lmod", {})
# Caches the configuration {spec_hash: configuration}
configuration_registry: Dict[str, Any] = {}
configuration_registry: Dict[Tuple[str, str, bool], BaseConfiguration] = {}
def make_configuration(spec, module_set_name, explicit):
def make_configuration(
spec: spack.spec.Spec, module_set_name: str, explicit: Optional[bool] = None
) -> BaseConfiguration:
"""Returns the lmod configuration for spec"""
explicit = bool(spec._installed_explicitly()) if explicit is None else explicit
key = (spec.dag_hash(), module_set_name, explicit)
try:
return configuration_registry[key]
@@ -45,16 +45,18 @@ def make_configuration(spec, module_set_name, explicit):
)
def make_layout(spec, module_set_name, explicit):
def make_layout(
spec: spack.spec.Spec, module_set_name: str, explicit: Optional[bool] = None
) -> BaseFileLayout:
"""Returns the layout information for spec"""
conf = make_configuration(spec, module_set_name, explicit)
return LmodFileLayout(conf)
return LmodFileLayout(make_configuration(spec, module_set_name, explicit))
def make_context(spec, module_set_name, explicit):
def make_context(
spec: spack.spec.Spec, module_set_name: str, explicit: Optional[bool] = None
) -> BaseContext:
"""Returns the context information for spec"""
conf = make_configuration(spec, module_set_name, explicit)
return LmodContext(conf)
return LmodContext(make_configuration(spec, module_set_name, explicit))
def guess_core_compilers(name, store=False) -> List[spack.spec.CompilerSpec]:
@@ -97,10 +99,7 @@ def guess_core_compilers(name, store=False) -> List[spack.spec.CompilerSpec]:
class LmodConfiguration(BaseConfiguration):
"""Configuration class for lmod module files."""
# Note: Posixpath is used here as well as below as opposed to
# os.path.join due to spack.spec.Spec.format
# requiring forward slash path seperators at this stage
default_projections = {"all": posixpath.join("{name}", "{version}")}
default_projections = {"all": "{name}/{version}"}
@property
def core_compilers(self) -> List[spack.spec.CompilerSpec]:
@@ -274,19 +273,16 @@ def filename(self):
hierarchy_name = os.path.join(*parts)
# Compute the absolute path
fullname = os.path.join(
return os.path.join(
self.arch_dirname, # root for lmod files on this architecture
hierarchy_name, # relative path
".".join([self.use_name, self.extension]), # file name
f"{self.use_name}.{self.extension}", # file name
)
return fullname
@property
def modulerc(self):
"""Returns the modulerc file associated with current module file"""
return os.path.join(
os.path.dirname(self.filename), ".".join([".modulerc", self.extension])
)
return os.path.join(os.path.dirname(self.filename), f".modulerc.{self.extension}")
def token_to_path(self, name, value):
"""Transforms a hierarchy token into the corresponding path part.
@@ -319,9 +315,7 @@ def path_part_fmt(token):
# we need to append a hash to the version to distinguish
# among flavors of the same library (e.g. openblas~openmp vs.
# openblas+openmp)
path = path_part_fmt(token=value)
path = "-".join([path, value.dag_hash(length=7)])
return path
return f"{path_part_fmt(token=value)}-{value.dag_hash(length=7)}"
@property
def available_path_parts(self):
@@ -333,8 +327,7 @@ def available_path_parts(self):
# List of services that are part of the hierarchy
hierarchy = self.conf.hierarchy_tokens
# Tokenize each part that is both in the hierarchy and available
parts = [self.token_to_path(x, available[x]) for x in hierarchy if x in available]
return parts
return [self.token_to_path(x, available[x]) for x in hierarchy if x in available]
@property
@lang.memoized
@@ -452,7 +445,7 @@ def missing(self):
@lang.memoized
def unlocked_paths(self):
"""Returns the list of paths that are unlocked unconditionally."""
layout = make_layout(self.spec, self.conf.name, self.conf.explicit)
layout = make_layout(self.spec, self.conf.name)
return [os.path.join(*parts) for parts in layout.unlocked_paths[None]]
@tengine.context_property
@@ -460,7 +453,7 @@ def conditionally_unlocked_paths(self):
"""Returns the list of paths that are unlocked conditionally.
Each item in the list is a tuple with the structure (condition, path).
"""
layout = make_layout(self.spec, self.conf.name, self.conf.explicit)
layout = make_layout(self.spec, self.conf.name)
value = []
conditional_paths = layout.unlocked_paths
conditional_paths.pop(None)
@@ -482,9 +475,9 @@ def manipulate_path(token):
class LmodModulefileWriter(BaseModuleFileWriter):
"""Writer class for lmod module files."""
default_template = posixpath.join("modules", "modulefile.lua")
default_template = "modules/modulefile.lua"
modulerc_header: list = []
modulerc_header = []
hide_cmd_format = 'hide_version("%s")'

View File

@@ -7,28 +7,29 @@
non-hierarchical modules.
"""
import os.path
import posixpath
from typing import Any, Dict
from typing import Dict, Optional, Tuple
import spack.config
import spack.spec
import spack.tengine as tengine
from .common import BaseConfiguration, BaseContext, BaseFileLayout, BaseModuleFileWriter
#: Tcl specific part of the configuration
def configuration(module_set_name):
config_path = "modules:%s:tcl" % module_set_name
config = spack.config.get(config_path, {})
return config
def configuration(module_set_name: str) -> dict:
return spack.config.get(f"modules:{module_set_name}:tcl", {})
# Caches the configuration {spec_hash: configuration}
configuration_registry: Dict[str, Any] = {}
configuration_registry: Dict[Tuple[str, str, bool], BaseConfiguration] = {}
def make_configuration(spec, module_set_name, explicit):
def make_configuration(
spec: spack.spec.Spec, module_set_name: str, explicit: Optional[bool] = None
) -> BaseConfiguration:
"""Returns the tcl configuration for spec"""
explicit = bool(spec._installed_explicitly()) if explicit is None else explicit
key = (spec.dag_hash(), module_set_name, explicit)
try:
return configuration_registry[key]
@@ -38,16 +39,18 @@ def make_configuration(spec, module_set_name, explicit):
)
def make_layout(spec, module_set_name, explicit):
def make_layout(
spec: spack.spec.Spec, module_set_name: str, explicit: Optional[bool] = None
) -> BaseFileLayout:
"""Returns the layout information for spec"""
conf = make_configuration(spec, module_set_name, explicit)
return TclFileLayout(conf)
return TclFileLayout(make_configuration(spec, module_set_name, explicit))
def make_context(spec, module_set_name, explicit):
def make_context(
spec: spack.spec.Spec, module_set_name: str, explicit: Optional[bool] = None
) -> BaseContext:
"""Returns the context information for spec"""
conf = make_configuration(spec, module_set_name, explicit)
return TclContext(conf)
return TclContext(make_configuration(spec, module_set_name, explicit))
class TclConfiguration(BaseConfiguration):
@@ -75,10 +78,7 @@ def prerequisites(self):
class TclModulefileWriter(BaseModuleFileWriter):
"""Writer class for tcl module files."""
# Note: posixpath is used here instead of
# os.path.join because spack.spec.Spec.format
# requires forward-slash path separators at this stage
default_template = posixpath.join("modules", "modulefile.tcl")
default_template = "modules/modulefile.tcl"
modulerc_header = ["#%Module4.7"]

View File

@@ -26,6 +26,7 @@
"""
import functools
import inspect
from contextlib import contextmanager
from llnl.util.lang import caller_locals
@@ -271,6 +272,13 @@ def __exit__(self, exc_type, exc_val, exc_tb):
spack.directives.DirectiveMeta.pop_from_context()
@contextmanager
def default_args(**kwargs):
spack.directives.DirectiveMeta.push_default_args(kwargs)
yield
spack.directives.DirectiveMeta.pop_default_args()
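The new default_args context manager pushes shared keyword arguments onto every directive evaluated in its body. A minimal sketch of how a recipe might use it (the package and dependency names are illustrative; default_args is re-exported from spack.package elsewhere in this diff):

```python
from spack.package import *


class Example(Package):
    """Hypothetical recipe showing shared directive defaults."""

    # Both directives inherit type="build" while the context is active.
    with default_args(type="build"):
        depends_on("cmake")
        depends_on("ninja")
```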
class MultiMethodError(spack.error.SpackError):
"""Superclass for multimethod dispatch errors"""

View File

@@ -9,8 +9,10 @@
import spack.spec
# all the building blocks
alphanumeric = r"[a-z0-9]+"
# Note: Docker is stricter (no uppercase allowed). We parse image names *with* uppercase
# and normalize, so: example.com/Organization/Name -> example.com/organization/name. Tags are
# case-sensitive, though.
alphanumeric_with_uppercase = r"[a-zA-Z0-9]+"
separator = r"(?:[._]|__|[-]+)"
localhost = r"localhost"
domainNameComponent = r"(?:[a-zA-Z0-9]|[a-zA-Z0-9][a-zA-Z0-9-]*[a-zA-Z0-9])"
@@ -25,7 +27,7 @@
domainAndPort = rf"{host}{optionalPort}"
# image name
pathComponent = rf"{alphanumeric}(?:{separator}{alphanumeric})*"
pathComponent = rf"{alphanumeric_with_uppercase}(?:{separator}{alphanumeric_with_uppercase})*"
remoteName = rf"{pathComponent}(?:\/{pathComponent})*"
namePat = rf"(?:{domainAndPort}\/)?{remoteName}"
@@ -130,6 +132,11 @@ def from_string(cls, string) -> "ImageReference":
name = f"{domain}/{name}"
domain = "index.docker.io"
# Lowercase the image name. This is enforced by Docker, although the OCI spec isn't clear.
# We do this anyway because, for example, in GitHub Actions the <organization>/<repository>
# part can have uppercase, and may be interpolated when specifying the relevant OCI image.
name = name.lower()
if not tag:
tag = "latest"

View File

@@ -85,7 +85,7 @@
UpstreamPackageError,
)
from spack.mixins import filter_compiler_wrappers
from spack.multimethod import when
from spack.multimethod import default_args, when
from spack.package_base import (
DependencyConflictError,
build_system_flags,

View File

@@ -6,7 +6,7 @@
Here is the EBNF grammar for a spec::
spec = [name] [node_options] { ^ node } |
spec = [name] [node_options] { ^[edge_properties] node } |
[name] [node_options] hash |
filename
@@ -14,7 +14,8 @@
[name] [node_options] hash |
filename
node_options = [@(version_list|version_pair)] [%compiler] { variant }
node_options = [@(version_list|version_pair)] [%compiler] { variant }
edge_properties = [ { bool_variant | key_value } ]
hash = / id
filename = (.|/|[a-zA-Z0-9-_]*/)([a-zA-Z0-9-_./]*)(.json|.yaml)
@@ -64,6 +65,7 @@
from llnl.util.tty import color
import spack.deptypes
import spack.error
import spack.spec
import spack.version
@@ -126,6 +128,8 @@ class TokenType(TokenBase):
"""
# Dependency
START_EDGE_PROPERTIES = r"(?:\^\[)"
END_EDGE_PROPERTIES = r"(?:\])"
DEPENDENCY = r"(?:\^)"
# Version
VERSION_HASH_PAIR = rf"(?:@(?:{GIT_VERSION_PATTERN})=(?:{VERSION}))"
@@ -280,16 +284,15 @@ def next_spec(
initial_spec = initial_spec or spack.spec.Spec()
root_spec = SpecNodeParser(self.ctx).parse(initial_spec)
while True:
if self.ctx.accept(TokenType.DEPENDENCY):
dependency = SpecNodeParser(self.ctx).parse()
if dependency is None:
msg = (
"this dependency sigil needs to be followed by a package name "
"or a node attribute (version, variant, etc.)"
)
raise SpecParsingError(msg, self.ctx.current_token, self.literal_str)
if self.ctx.accept(TokenType.START_EDGE_PROPERTIES):
edge_properties = EdgeAttributeParser(self.ctx, self.literal_str).parse()
edge_properties.setdefault("depflag", 0)
edge_properties.setdefault("virtuals", ())
dependency = self._parse_node(root_spec)
root_spec._add_dependency(dependency, **edge_properties)
elif self.ctx.accept(TokenType.DEPENDENCY):
dependency = self._parse_node(root_spec)
root_spec._add_dependency(dependency, depflag=0, virtuals=())
else:
@@ -297,6 +300,18 @@ def next_spec(
return root_spec
def _parse_node(self, root_spec):
dependency = SpecNodeParser(self.ctx).parse()
if dependency is None:
msg = (
"the dependency sigil and any optional edge attributes must be followed by a "
"package name or a node attribute (version, variant, etc.)"
)
raise SpecParsingError(msg, self.ctx.current_token, self.literal_str)
if root_spec.concrete:
raise spack.spec.RedundantSpecError(root_spec, "^" + str(dependency))
return dependency
def all_specs(self) -> List["spack.spec.Spec"]:
"""Return all the specs that remain to be parsed"""
return list(iter(self.next_spec, None))
@@ -438,6 +453,41 @@ def parse(self, initial_spec: "spack.spec.Spec") -> "spack.spec.Spec":
return initial_spec
class EdgeAttributeParser:
__slots__ = "ctx", "literal_str"
def __init__(self, ctx, literal_str):
self.ctx = ctx
self.literal_str = literal_str
def parse(self):
attributes = {}
while True:
if self.ctx.accept(TokenType.KEY_VALUE_PAIR):
name, value = self.ctx.current_token.value.split("=", maxsplit=1)
name = name.strip("'\" ")
value = value.strip("'\" ").split(",")
attributes[name] = value
if name not in ("deptypes", "virtuals"):
msg = (
"the only edge attributes that are currently accepted "
'are "deptypes" and "virtuals"'
)
raise SpecParsingError(msg, self.ctx.current_token, self.literal_str)
# TODO: Add code to accept bool variants here as soon as use variants are implemented
elif self.ctx.accept(TokenType.END_EDGE_PROPERTIES):
break
else:
msg = "unexpected token in edge attributes"
raise SpecParsingError(msg, self.ctx.next_token, self.literal_str)
# Turn deptypes=... to depflag representation
if "deptypes" in attributes:
deptype_string = attributes.pop("deptypes")
attributes["depflag"] = spack.deptypes.canonicalize(deptype_string)
return attributes
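Putting the new grammar and EdgeAttributeParser together, a spec string can now attach properties to a dependency edge. A hypothetical usage of this module's parse() entry point (defined below); combining both attributes in one bracket is assumed to tokenize as two key-value pairs:

```python
# "virtuals" and "deptypes" are the only accepted edge attributes;
# "deptypes=..." is canonicalized to a depflag before the edge is added.
specs = parse('hdf5 ^[virtuals=mpi deptypes=build,link] openmpi')
```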
def parse(text: str) -> List["spack.spec.Spec"]:
"""Parse text into a list of strings

View File

@@ -3,7 +3,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Classes and functions to manage providers of virtual dependencies"""
import itertools
from typing import Dict, List, Optional, Set
import spack.error
@@ -11,33 +10,6 @@
import spack.util.spack_json as sjson
def _cross_provider_maps(lmap, rmap):
"""Return a dictionary that combines constraint requests from both input.
Args:
lmap: main provider map
rmap: provider map with additional constraints
"""
# TODO: this is pretty darned nasty, and inefficient, but there
# TODO: are not that many vdeps in most specs.
result = {}
for lspec, rspec in itertools.product(lmap, rmap):
try:
constrained = lspec.constrained(rspec)
except spack.error.UnsatisfiableSpecError:
continue
# lp and rp are left and right provider specs.
for lp_spec, rp_spec in itertools.product(lmap[lspec], rmap[rspec]):
if lp_spec.name == rp_spec.name:
try:
const = lp_spec.constrained(rp_spec, deps=False)
result.setdefault(constrained, set()).add(const)
except spack.error.UnsatisfiableSpecError:
continue
return result
class _IndexBase:
#: This is a dict of dicts used for finding providers of particular
#: virtual dependencies. The dict of dicts looks like:
@@ -81,29 +53,6 @@ def providers_for(self, virtual_spec):
def __contains__(self, name):
return name in self.providers
def satisfies(self, other):
"""Determine if the providers of virtual specs are compatible.
Args:
other: another provider index
Returns:
True if the providers are compatible, False otherwise.
"""
common = set(self.providers) & set(other.providers)
if not common:
return True
# This ensures that some provider in other COULD satisfy the
# vpkg constraints on self.
result = {}
for name in common:
crossed = _cross_provider_maps(self.providers[name], other.providers[name])
if crossed:
result[name] = crossed
return all(c in result for c in common)
def __eq__(self, other):
return self.providers == other.providers

View File

@@ -6,6 +6,7 @@
import abc
import collections.abc
import contextlib
import difflib
import errno
import functools
import importlib
@@ -1516,7 +1517,18 @@ def __init__(self, name, repo=None):
long_msg = "Did you mean to specify a filename with './{0}'?"
long_msg = long_msg.format(name)
else:
long_msg = "You may need to run 'spack clean -m'."
long_msg = "Use 'spack create' to create a new package."
if not repo:
repo = spack.repo.PATH
# We need to compare the base package name
pkg_name = name.rsplit(".", 1)[-1]
similar = difflib.get_close_matches(pkg_name, repo.all_package_names())
if 1 <= len(similar) <= 5:
long_msg += "\n\nDid you mean one of the following packages?\n "
long_msg += "\n ".join(similar)
super().__init__(msg, long_msg)
self.name = name
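The suggestions rely on difflib's similarity cutoff, e.g.:

```python
import difflib

# Only candidates above the default 0.6 similarity ratio are returned.
difflib.get_close_matches("hdf", ["hdf5", "netcdf", "zlib"])  # -> ['hdf5']
```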

View File

@@ -62,3 +62,25 @@ def _deprecated_properties(validator, deprecated, instance, schema):
Validator = llnl.util.lang.Singleton(_make_validator)
spec_list_schema = {
"type": "array",
"default": [],
"items": {
"anyOf": [
{
"type": "object",
"additionalProperties": False,
"properties": {
"matrix": {
"type": "array",
"items": {"type": "array", "items": {"type": "string"}},
},
"exclude": {"type": "array", "items": {"type": "string"}},
},
},
{"type": "string"},
{"type": "null"},
]
},
}
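For reference, an illustrative value accepted by spec_list_schema (not taken from this diff) mixes plain spec strings with matrix/exclude objects:

```python
spec_list = [
    "zlib",  # a plain spec string
    {
        # Cartesian product of the rows, minus excluded combinations.
        "matrix": [["hdf5", "mpich"], ["%gcc", "%clang"]],
        "exclude": ["hdf5%clang"],
    },
]
```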

View File

@@ -92,6 +92,7 @@
"url_fetch_method": {"type": "string", "enum": ["urllib", "curl"]},
"additional_external_search_paths": {"type": "array", "items": {"type": "string"}},
"binary_index_ttl": {"type": "integer", "minimum": 0},
"aliases": {"type": "object", "patternProperties": {r"\w[\w-]*": {"type": "string"}}},
},
"deprecatedProperties": {
"properties": ["terminal_title"],

View File

@@ -0,0 +1,34 @@
# Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Schema for definitions
.. literalinclude:: _spack_root/lib/spack/spack/schema/definitions.py
:lines: 13-
"""
import spack.schema
#: Properties for inclusion in other schemas
properties = {
"definitions": {
"type": "array",
"default": [],
"items": {
"type": "object",
"properties": {"when": {"type": "string"}},
"patternProperties": {r"^(?!when$)\w*": spack.schema.spec_list_schema},
},
}
}
#: Full schema with metadata
schema = {
"$schema": "http://json-schema.org/draft-07/schema#",
"title": "Spack definitions configuration file schema",
"type": "object",
"additionalProperties": False,
"properties": properties,
}
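An illustrative document this schema accepts; every key other than the optional "when" guard maps a definition name to a spec list (the names below are made up):

```python
definitions = {
    "definitions": [
        {"my_packages": ["zlib", "hdf5"]},
        # The "when" string conditionally enables the definition.
        {"when": 'platform == "linux"', "my_packages": ["libelf"]},
    ]
}
```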

View File

@@ -12,34 +12,11 @@
import spack.schema.gitlab_ci # DEPRECATED
import spack.schema.merged
import spack.schema.packages
import spack.schema.projections
#: Top level key in a manifest file
TOP_LEVEL_KEY = "spack"
spec_list_schema = {
"type": "array",
"default": [],
"items": {
"anyOf": [
{
"type": "object",
"additionalProperties": False,
"properties": {
"matrix": {
"type": "array",
"items": {"type": "array", "items": {"type": "string"}},
},
"exclude": {"type": "array", "items": {"type": "string"}},
},
},
{"type": "string"},
{"type": "null"},
]
},
}
projections_scheme = spack.schema.projections.properties["projections"]
schema = {
@@ -75,16 +52,7 @@
}
},
},
"definitions": {
"type": "array",
"default": [],
"items": {
"type": "object",
"properties": {"when": {"type": "string"}},
"patternProperties": {r"^(?!when$)\w*": spec_list_schema},
},
},
"specs": spec_list_schema,
"specs": spack.schema.spec_list_schema,
"view": {
"anyOf": [
{"type": "boolean"},

View File

@@ -17,6 +17,7 @@
import spack.schema.concretizer
import spack.schema.config
import spack.schema.container
import spack.schema.definitions
import spack.schema.mirrors
import spack.schema.modules
import spack.schema.packages
@@ -32,6 +33,7 @@
spack.schema.config.properties,
spack.schema.container.properties,
spack.schema.ci.properties,
spack.schema.definitions.properties,
spack.schema.mirrors.properties,
spack.schema.modules.properties,
spack.schema.packages.properties,

View File

@@ -8,6 +8,66 @@
:lines: 13-
"""
permissions = {
"type": "object",
"additionalProperties": False,
"properties": {
"read": {"type": "string", "enum": ["user", "group", "world"]},
"write": {"type": "string", "enum": ["user", "group", "world"]},
"group": {"type": "string"},
},
}
variants = {"oneOf": [{"type": "string"}, {"type": "array", "items": {"type": "string"}}]}
requirements = {
"oneOf": [
# 'require' can be a list of requirement_groups.
# each requirement group is a list of one or more
# specs. Either at least one or exactly one spec
# in the group must be satisfied (depending on
# whether you use "any_of" or "one_of",
# respectively)
{
"type": "array",
"items": {
"oneOf": [
{
"type": "object",
"additionalProperties": False,
"properties": {
"one_of": {"type": "array", "items": {"type": "string"}},
"any_of": {"type": "array", "items": {"type": "string"}},
"spec": {"type": "string"},
"message": {"type": "string"},
"when": {"type": "string"},
},
},
{"type": "string"},
]
},
},
# Shorthand for a single requirement group with
# one member
{"type": "string"},
]
}
permissions = {
"type": "object",
"additionalProperties": False,
"properties": {
"read": {"type": "string", "enum": ["user", "group", "world"]},
"write": {"type": "string", "enum": ["user", "group", "world"]},
"group": {"type": "string"},
},
}
package_attributes = {
"type": "object",
"additionalProperties": False,
"patternProperties": {r"\w+": {}},
}
#: Properties for inclusion in other schemas
properties = {
@@ -15,57 +75,14 @@
"type": "object",
"default": {},
"additionalProperties": False,
"patternProperties": {
r"\w[\w-]*": { # package name
"properties": {
"all": { # package name
"type": "object",
"default": {},
"additionalProperties": False,
"properties": {
"require": {
"oneOf": [
# 'require' can be a list of requirement_groups.
# each requirement group is a list of one or more
# specs. Either at least one or exactly one spec
# in the group must be satisfied (depending on
# whether you use "any_of" or "one_of",
# respectively)
{
"type": "array",
"items": {
"oneOf": [
{
"type": "object",
"additionalProperties": False,
"properties": {
"one_of": {
"type": "array",
"items": {"type": "string"},
},
"any_of": {
"type": "array",
"items": {"type": "string"},
},
"spec": {"type": "string"},
"message": {"type": "string"},
"when": {"type": "string"},
},
},
{"type": "string"},
]
},
},
# Shorthand for a single requirement group with
# one member
{"type": "string"},
]
},
"version": {
"type": "array",
"default": [],
# version strings (the type should be string; numbers are still possible
# but deprecated, to avoid issues with e.g. 3.10 -> 3.1)
"items": {"anyOf": [{"type": "string"}, {"type": "number"}]},
},
"require": requirements,
"version": {}, # Here only to warn users on ignored properties
"target": {
"type": "array",
"default": [],
@@ -78,22 +95,10 @@
"items": {"type": "string"},
}, # compiler specs
"buildable": {"type": "boolean", "default": True},
"permissions": {
"type": "object",
"additionalProperties": False,
"properties": {
"read": {"type": "string", "enum": ["user", "group", "world"]},
"write": {"type": "string", "enum": ["user", "group", "world"]},
"group": {"type": "string"},
},
},
"permissions": permissions,
# If 'get_full_repo' is promoted to a Package-level
# attribute, it could be useful to set it here
"package_attributes": {
"type": "object",
"additionalProperties": False,
"patternProperties": {r"\w+": {}},
},
"package_attributes": package_attributes,
"providers": {
"type": "object",
"default": {},
@@ -106,12 +111,40 @@
}
},
},
"variants": {
"oneOf": [
{"type": "string"},
{"type": "array", "items": {"type": "string"}},
]
"variants": variants,
},
"deprecatedProperties": {
"properties": ["version"],
"message": "setting version preferences in the 'all' section of packages.yaml "
"is deprecated and will be removed in v0.22\n\n\tThese preferences "
"will be ignored by Spack. You can set them only in package specific sections "
"of the same file.\n",
"error": False,
},
}
},
"patternProperties": {
r"(?!^all$)(^\w[\w-]*)": { # package name
"type": "object",
"default": {},
"additionalProperties": False,
"properties": {
"require": requirements,
"version": {
"type": "array",
"default": [],
# version strings
"items": {"anyOf": [{"type": "string"}, {"type": "number"}]},
},
"target": {}, # Here only to warn users on ignored properties
"compiler": {}, # Here only to warn users on ignored properties
"buildable": {"type": "boolean", "default": True},
"permissions": permissions,
# If 'get_full_repo' is promoted to a Package-level
# attribute, it could be useful to set it here
"package_attributes": package_attributes,
"providers": {}, # Here only to warn users on ignored properties
"variants": variants,
"externals": {
"type": "array",
"items": {
@@ -127,6 +160,14 @@
},
},
},
"deprecatedProperties": {
"properties": ["target", "compiler", "providers"],
"message": "setting compiler, target or provider preferences in a package "
"specific section of packages.yaml is deprecated, and will be removed in "
"v0.22.\n\n\tThese preferences will be ignored by Spack. You "
"can set them only in the 'all' section of the same file.\n",
"error": False,
},
}
},
}
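To summarize the split this schema enforces: target, compiler, and provider soft preferences are honored only under "all", while version preferences are honored only in package-specific sections. An illustrative configuration (package and compiler names are examples):

```python
packages = {
    "all": {
        # Soft preferences: only valid here, in the "all" section.
        "target": ["x86_64"],
        "compiler": ["gcc@12"],
        "providers": {"mpi": ["openmpi"]},
    },
    # Version preferences: only valid in package-specific sections.
    "hdf5": {"version": ["1.14.1"], "variants": "+mpi"},
}
```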

View File

@@ -8,11 +8,12 @@
import enum
import itertools
import os
import pathlib
import pprint
import re
import types
import warnings
from typing import Dict, List, NamedTuple, Optional, Sequence, Tuple, Union
from typing import Callable, Dict, List, NamedTuple, Optional, Sequence, Set, Tuple, Union
import archspec.cpu
@@ -337,6 +338,13 @@ def __getattr__(self, name):
fn = AspFunctionBuilder()
TransformFunction = Callable[[spack.spec.Spec, List[AspFunction]], List[AspFunction]]
def remove_node(spec: spack.spec.Spec, facts: List[AspFunction]) -> List[AspFunction]:
"""Transformation that removes all "node" and "virtual_node" from the input list of facts."""
return list(filter(lambda x: x.args[0] not in ("node", "virtual_node"), facts))
def _create_counter(specs, tests):
strategy = spack.config.CONFIG.get("concretizer:duplicates:strategy", "none")
@@ -684,7 +692,7 @@ def extract_args(model, predicate_name):
class ErrorHandler:
def __init__(self, model):
self.model = model
self.error_args = extract_args(model, "error")
self.full_model = None
def multiple_values_error(self, attribute, pkg):
return f'Cannot select a single "{attribute}" for package "{pkg}"'
@@ -692,6 +700,48 @@ def multiple_values_error(self, attribute, pkg):
def no_value_error(self, attribute, pkg):
return f'Cannot select a single "{attribute}" for package "{pkg}"'
def _get_cause_tree(
self,
cause: Tuple[str, str],
conditions: Dict[str, str],
condition_causes: List[Tuple[Tuple[str, str], Tuple[str, str]]],
seen: Set,
indent: str = " ",
) -> List[str]:
"""
Implementation of recursion for self.get_cause_tree. Much of this operates on tuples
(condition_id, set_id) in which the latter id means that the condition represented by
the former held in the condition set represented by the latter.
"""
seen = set(seen) | set(cause)
parents = [c for e, c in condition_causes if e == cause and c not in seen]
local = "required because %s " % conditions[cause[0]]
return [indent + local] + [
c
for parent in parents
for c in self._get_cause_tree(
parent, conditions, condition_causes, seen, indent=indent + " "
)
]
def get_cause_tree(self, cause: Tuple[str, str]) -> List[str]:
"""
Get the cause tree associated with the given cause.
Arguments:
cause: The root cause of the tree (final condition)
Returns:
A list of strings describing the causes, formatted to display tree structure.
"""
conditions: Dict[str, str] = dict(extract_args(self.full_model, "condition_reason"))
condition_causes: List[Tuple[Tuple[str, str], Tuple[str, str]]] = list(
((Effect, EID), (Cause, CID))
for Effect, EID, Cause, CID in extract_args(self.full_model, "condition_cause")
)
return self._get_cause_tree(cause, conditions, condition_causes, set())
def handle_error(self, msg, *args):
"""Handle an error state derived by the solver."""
if msg == "multiple_values_error":
@@ -700,14 +750,31 @@ def handle_error(self, msg, *args):
if msg == "no_value_error":
return self.no_value_error(*args)
try:
idx = args.index("startcauses")
except ValueError:
msg_args = args
causes = []
else:
msg_args = args[:idx]
cause_args = args[idx + 1 :]
cause_args_conditions = cause_args[::2]
cause_args_ids = cause_args[1::2]
causes = list(zip(cause_args_conditions, cause_args_ids))
msg = msg.format(*msg_args)
# For variant formatting, we sometimes have to construct specs
# to format values properly. Find/replace all occurrences of
# Spec(...) with the string representation of the spec mentioned
msg = msg.format(*args)
specs_to_construct = re.findall(r"Spec\(([^)]*)\)", msg)
for spec_str in specs_to_construct:
msg = msg.replace("Spec(%s)" % spec_str, str(spack.spec.Spec(spec_str)))
for cause in set(causes):
for c in self.get_cause_tree(cause):
msg += f"\n{c}"
return msg
def message(self, errors) -> str:
@@ -719,11 +786,31 @@ def message(self, errors) -> str:
return "\n".join([header] + messages)
def raise_if_errors(self):
if not self.error_args:
initial_error_args = extract_args(self.model, "error")
if not initial_error_args:
return
error_causation = clingo.Control()
parent_dir = pathlib.Path(__file__).parent
errors_lp = parent_dir / "error_messages.lp"
def on_model(model):
self.full_model = model.symbols(shown=True, terms=True)
with error_causation.backend() as backend:
for atom in self.model:
atom_id = backend.add_atom(atom)
backend.add_rule([atom_id], [], choice=False)
error_causation.load(str(errors_lp))
error_causation.ground([("base", []), ("error_messages", [])])
_ = error_causation.solve(on_model=on_model)
# No choices so there will be only one model
error_args = extract_args(self.full_model, "error")
errors = sorted(
[(int(priority), msg, args) for priority, msg, *args in self.error_args], reverse=True
[(int(priority), msg, args) for priority, msg, *args in error_args], reverse=True
)
msg = self.message(errors)
raise UnsatisfiableSpecError(msg)
@@ -861,6 +948,12 @@ def visit(node):
self.control.load(os.path.join(parent_dir, "display.lp"))
if not setup.concretize_everything:
self.control.load(os.path.join(parent_dir, "when_possible.lp"))
for spec in specs:
if self._compiler_flags_has_propagation(spec.compiler_flags):
self.control.load(os.path.join(parent_dir, "propagation.lp"))
break
timer.stop("load")
# Grounding is the first step in the solve -- it turns our facts
@@ -924,7 +1017,7 @@ def on_model(model):
if sym.name not in ("attr", "error", "opt_criterion"):
tty.debug(
"UNKNOWN SYMBOL: %s(%s)"
% (sym.name, ", ".join(intermediate_repr(sym.arguments)))
% (sym.name, ", ".join([str(s) for s in intermediate_repr(sym.arguments)]))
)
elif cores:
@@ -941,6 +1034,12 @@ def on_model(model):
return result, timer, self.control.statistics
def _compiler_flags_has_propagation(self, flags):
for _, flag_vals in flags.items():
if any(val.propagate for val in flag_vals):
return True
return False
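The guard above loads propagation.lp only when some flag was requested with the propagating syntax. A hedged sketch (assuming "==" marks a compiler flag for propagation, as the spec syntax elsewhere in Spack suggests):

```python
import spack.spec

s = spack.spec.Spec('zlib cflags=="-O3"')  # '==' requests propagation (assumed)
# Mirrors _compiler_flags_has_propagation for a single spec:
has_propagation = any(v.propagate for vals in s.compiler_flags.values() for v in vals)
```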
class ConcreteSpecsByHash(collections.abc.Mapping):
"""Mapping containing concrete specs keyed by DAG hash.
@@ -1116,7 +1215,7 @@ def conflict_rules(self, pkg):
default_msg = "{0}: '{1}' conflicts with '{2}'"
no_constraint_msg = "{0}: conflicts with '{1}'"
for trigger, constraints in pkg.conflicts.items():
trigger_msg = "conflict trigger %s" % str(trigger)
trigger_msg = f"conflict is triggered when {str(trigger)}"
trigger_spec = spack.spec.Spec(trigger)
trigger_id = self.condition(
trigger_spec, name=trigger_spec.name or pkg.name, msg=trigger_msg
@@ -1128,7 +1227,11 @@ def conflict_rules(self, pkg):
conflict_msg = no_constraint_msg.format(pkg.name, trigger)
else:
conflict_msg = default_msg.format(pkg.name, trigger, constraint)
constraint_msg = "conflict constraint %s" % str(constraint)
spec_for_msg = (
spack.spec.Spec(pkg.name) if constraint == spack.spec.Spec() else constraint
)
constraint_msg = f"conflict applies to spec {str(spec_for_msg)}"
constraint_id = self.condition(constraint, name=pkg.name, msg=constraint_msg)
self.gen.fact(
fn.pkg_fact(pkg.name, fn.conflict(trigger_id, constraint_id, conflict_msg))
@@ -1167,32 +1270,9 @@ def compiler_facts(self):
matches = sorted(indexed_possible_compilers, key=lambda x: ppk(x[1].spec))
for weight, (compiler_id, cspec) in enumerate(matches):
f = fn.default_compiler_preference(compiler_id, weight)
f = fn.compiler_weight(compiler_id, weight)
self.gen.fact(f)
def package_compiler_defaults(self, pkg):
"""Facts about packages' compiler prefs."""
packages = spack.config.get("packages")
pkg_prefs = packages.get(pkg.name)
if not pkg_prefs or "compiler" not in pkg_prefs:
return
compiler_list = self.possible_compilers
compiler_list = sorted(compiler_list, key=lambda x: (x.name, x.version), reverse=True)
ppk = spack.package_prefs.PackagePrefs(pkg.name, "compiler", all=False)
matches = sorted(compiler_list, key=lambda x: ppk(x.spec))
for i, compiler in enumerate(reversed(matches)):
self.gen.fact(
fn.pkg_fact(
pkg.name,
fn.node_compiler_preference(
compiler.spec.name, compiler.spec.version, -i * 100
),
)
)
def package_requirement_rules(self, pkg):
rules = self.requirement_rules_from_package_py(pkg)
rules.extend(self.requirement_rules_from_packages_yaml(pkg))
@@ -1284,9 +1364,6 @@ def pkg_rules(self, pkg, tests):
# conflicts
self.conflict_rules(pkg)
# default compilers for this package
self.package_compiler_defaults(pkg)
# virtuals
self.package_provider_rules(pkg)
@@ -1310,7 +1387,7 @@ def trigger_rules(self):
self.gen.h2("Trigger conditions")
for name in self._trigger_cache:
cache = self._trigger_cache[name]
for spec_str, (trigger_id, requirements) in cache.items():
for (spec_str, _), (trigger_id, requirements) in cache.items():
self.gen.fact(fn.pkg_fact(name, fn.trigger_id(trigger_id)))
self.gen.fact(fn.pkg_fact(name, fn.trigger_msg(spec_str)))
for predicate in requirements:
@@ -1323,7 +1400,7 @@ def effect_rules(self):
self.gen.h2("Imposed requirements")
for name in self._effect_cache:
cache = self._effect_cache[name]
for spec_str, (effect_id, requirements) in cache.items():
for (spec_str, _), (effect_id, requirements) in cache.items():
self.gen.fact(fn.pkg_fact(name, fn.effect_id(effect_id)))
self.gen.fact(fn.pkg_fact(name, fn.effect_msg(spec_str)))
for predicate in requirements:
@@ -1422,18 +1499,26 @@ def variant_rules(self, pkg):
self.gen.newline()
def condition(self, required_spec, imposed_spec=None, name=None, msg=None, node=False):
def condition(
self,
required_spec: spack.spec.Spec,
imposed_spec: Optional[spack.spec.Spec] = None,
name: Optional[str] = None,
msg: Optional[str] = None,
transform_required: Optional[TransformFunction] = None,
transform_imposed: Optional[TransformFunction] = remove_node,
):
"""Generate facts for a dependency or virtual provider condition.
Arguments:
required_spec (spack.spec.Spec): the spec that triggers this condition
imposed_spec (spack.spec.Spec or None): the spec with constraints that
are imposed when this condition is triggered
name (str or None): name for `required_spec` (required if
required_spec is anonymous, ignored if not)
msg (str or None): description of the condition
node (bool): if False does not emit "node" or "virtual_node" requirements
from the imposed spec
required_spec: the constraints that triggers this condition
imposed_spec: the constraints that are imposed when this condition is triggered
name: name for `required_spec` (required if required_spec is anonymous, ignored if not)
msg: description of the condition
transform_required: transformation applied to facts from the required spec. Defaults
to leave facts as they are.
transform_imposed: transformation applied to facts from the imposed spec. Defaults
to removing "node" and "virtual_node" facts.
Returns:
int: id of the condition created by this function
"""
@@ -1451,10 +1536,14 @@ def condition(self, required_spec, imposed_spec=None, name=None, msg=None, node=
cache = self._trigger_cache[named_cond.name]
named_cond_key = str(named_cond)
named_cond_key = (str(named_cond), transform_required)
if named_cond_key not in cache:
trigger_id = next(self._trigger_id_counter)
requirements = self.spec_clauses(named_cond, body=True, required_from=name)
if transform_required:
requirements = transform_required(named_cond, requirements)
cache[named_cond_key] = (trigger_id, requirements)
trigger_id, requirements = cache[named_cond_key]
self.gen.fact(fn.pkg_fact(named_cond.name, fn.condition_trigger(condition_id, trigger_id)))
@@ -1463,14 +1552,14 @@ def condition(self, required_spec, imposed_spec=None, name=None, msg=None, node=
return condition_id
cache = self._effect_cache[named_cond.name]
imposed_spec_key = str(imposed_spec)
imposed_spec_key = (str(imposed_spec), transform_imposed)
if imposed_spec_key not in cache:
effect_id = next(self._effect_id_counter)
requirements = self.spec_clauses(imposed_spec, body=False, required_from=name)
if not node:
requirements = list(
filter(lambda x: x.args[0] not in ("node", "virtual_node"), requirements)
)
if transform_imposed:
requirements = transform_imposed(imposed_spec, requirements)
cache[imposed_spec_key] = (effect_id, requirements)
effect_id, requirements = cache[imposed_spec_key]
self.gen.fact(fn.pkg_fact(named_cond.name, fn.condition_effect(condition_id, effect_id)))
@@ -1501,6 +1590,17 @@ def package_provider_rules(self, pkg):
)
self.gen.newline()
for when, sets_of_virtuals in pkg.provided_together.items():
condition_id = self.condition(
when, name=pkg.name, msg="Virtuals are provided together"
)
for set_id, virtuals_together in enumerate(sets_of_virtuals):
for name in virtuals_together:
self.gen.fact(
fn.pkg_fact(pkg.name, fn.provided_together(condition_id, set_id, name))
)
self.gen.newline()
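The provided_together facts come from packages that declare several virtuals in a single directive: if such a node is selected to provide one of them, it must provide the others too. For instance (a hypothetical excerpt following a real recipe pattern):

```python
class Openblas(Package):
    # blas and lapack are provided together: a node chosen as the blas
    # provider must also be the lapack provider.
    provides("blas", "lapack")
```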
def package_dependencies_rules(self, pkg):
"""Translate 'depends_on' directives into ASP logic."""
for _, conditions in sorted(pkg.dependencies.items()):
@@ -1519,21 +1619,32 @@ def package_dependencies_rules(self, pkg):
if not depflag:
continue
msg = "%s depends on %s" % (pkg.name, dep.spec.name)
msg = f"{pkg.name} depends on {dep.spec}"
if cond != spack.spec.Spec():
msg += " when %s" % cond
msg += f" when {cond}"
condition_id = self.condition(cond, dep.spec, pkg.name, msg)
self.gen.fact(
fn.pkg_fact(pkg.name, fn.dependency_condition(condition_id, dep.spec.name))
)
def track_dependencies(input_spec, requirements):
return requirements + [fn.attr("track_dependencies", input_spec.name)]
for t in dt.ALL_FLAGS:
if t & depflag:
# there is a declared dependency of type t
self.gen.fact(fn.dependency_type(condition_id, dt.flag_to_string(t)))
def dependency_holds(input_spec, requirements):
return remove_node(input_spec, requirements) + [
fn.attr(
"dependency_holds", pkg.name, input_spec.name, dt.flag_to_string(t)
)
for t in dt.ALL_FLAGS
if t & depflag
]
self.condition(
cond,
dep.spec,
name=pkg.name,
msg=msg,
transform_required=track_dependencies,
transform_imposed=dependency_holds,
)
self.gen.newline()
@@ -1548,6 +1659,7 @@ def virtual_preferences(self, pkg_name, func):
for i, provider in enumerate(providers):
provider_name = spack.spec.Spec(provider).name
func(vspec, provider_name, i)
self.gen.newline()
def provider_defaults(self):
self.gen.h2("Default virtual providers")
@@ -1628,8 +1740,17 @@ def emit_facts_from_requirement_rules(self, rules: List[RequirementRule]):
when_spec = spack.spec.Spec(pkg_name)
try:
# With virtual we want to emit "node" and "virtual_node" in imposed specs
transform: Optional[TransformFunction] = remove_node
if virtual:
transform = None
member_id = self.condition(
required_spec=when_spec, imposed_spec=spec, name=pkg_name, node=virtual
required_spec=when_spec,
imposed_spec=spec,
name=pkg_name,
transform_imposed=transform,
msg=f"{spec_str} is a requirement for package {pkg_name}",
)
except Exception as e:
# Do not raise if the rule comes from the 'all' subsection, since usability
@@ -1692,8 +1813,16 @@ def external_packages(self):
# Declare external conditions with a local index into packages.yaml
for local_idx, spec in enumerate(external_specs):
msg = "%s available as external when satisfying %s" % (spec.name, spec)
condition_id = self.condition(spec, msg=msg)
self.gen.fact(fn.pkg_fact(pkg_name, fn.possible_external(condition_id, local_idx)))
def external_imposition(input_spec, _):
return [fn.attr("external_conditions_hold", input_spec.name, local_idx)]
self.condition(
spec,
spack.spec.Spec(spec.name),
msg=msg,
transform_imposed=external_imposition,
)
self.possible_versions[spec.name].add(spec.version)
self.gen.newline()
@@ -1723,8 +1852,8 @@ def preferred_variants(self, pkg_name):
fn.variant_default_value_from_packages_yaml(pkg_name, variant.name, value)
)
def target_preferences(self, pkg_name):
key_fn = spack.package_prefs.PackagePrefs(pkg_name, "target")
def target_preferences(self):
key_fn = spack.package_prefs.PackagePrefs("all", "target")
if not self.target_specs_cache:
self.target_specs_cache = [
@@ -1734,17 +1863,25 @@ def target_preferences(self, pkg_name):
package_targets = self.target_specs_cache[:]
package_targets.sort(key=key_fn)
offset = 0
best_default = self.default_targets[0][1]
for i, preferred in enumerate(package_targets):
if str(preferred.architecture.target) == best_default and i != 0:
offset = 100
self.gen.fact(
fn.pkg_fact(
pkg_name, fn.target_weight(str(preferred.architecture.target), i + offset)
)
)
self.gen.fact(fn.target_weight(str(preferred.architecture.target), i))
def flag_defaults(self):
self.gen.h2("Compiler flag defaults")
# types of flags that can be on specs
for flag in spack.spec.FlagMap.valid_compiler_flags():
self.gen.fact(fn.flag_type(flag))
self.gen.newline()
# flags from compilers.yaml
compilers = all_compilers_in_config()
for compiler in compilers:
for name, flags in compiler.flags.items():
for flag in flags:
self.gen.fact(
fn.compiler_version_flag(compiler.name, compiler.version, name, flag)
)
def spec_clauses(self, *args, **kwargs):
"""Wrap a call to `_spec_clauses()` into a try/except block that
@@ -1796,8 +1933,8 @@ class Head:
node_compiler_version = fn.attr("node_compiler_version_set")
node_flag = fn.attr("node_flag_set")
node_flag_source = fn.attr("node_flag_source")
node_flag_propagate = fn.attr("node_flag_propagate")
variant_propagate = fn.attr("variant_propagate")
node_flag_propagation_candidate = fn.attr("node_flag_propagation_candidate")
variant_propagation_candidate = fn.attr("variant_propagation_candidate")
class Body:
node = fn.attr("node")
@@ -1810,8 +1947,8 @@ class Body:
node_compiler_version = fn.attr("node_compiler_version")
node_flag = fn.attr("node_flag")
node_flag_source = fn.attr("node_flag_source")
node_flag_propagate = fn.attr("node_flag_propagate")
variant_propagate = fn.attr("variant_propagate")
node_flag_propagation_candidate = fn.attr("node_flag_propagation_candidate")
variant_propagation_candidate = fn.attr("variant_propagation_candidate")
f = Body if body else Head
@@ -1860,7 +1997,9 @@ class Body:
clauses.append(f.variant_value(spec.name, vname, value))
if variant.propagate:
clauses.append(f.variant_propagate(spec.name, vname, value, spec.name))
clauses.append(
f.variant_propagation_candidate(spec.name, vname, value, spec.name)
)
# Tell the concretizer that this is a possible value for the
# variant, to account for things like int/str values where we
@@ -1893,7 +2032,9 @@ class Body:
clauses.append(f.node_flag(spec.name, flag_type, flag))
clauses.append(f.node_flag_source(spec.name, flag_type, spec.name))
if not spec.concrete and flag.propagate is True:
clauses.append(f.node_flag_propagate(spec.name, flag_type))
clauses.append(
f.node_flag_propagation_candidate(spec.name, flag_type, flag, spec.name)
)
# dependencies
if spec.concrete:
@@ -1902,6 +2043,16 @@ class Body:
clauses.append(fn.attr("package_hash", spec.name, spec._package_hash))
clauses.append(fn.attr("hash", spec.name, spec.dag_hash()))
edges = spec.edges_from_dependents()
virtuals = [x for x in itertools.chain.from_iterable([edge.virtuals for edge in edges])]
if not body:
for virtual in virtuals:
clauses.append(fn.attr("provider_set", spec.name, virtual))
clauses.append(fn.attr("virtual_node", virtual))
else:
for virtual in virtuals:
clauses.append(fn.attr("virtual_on_incoming_edges", spec.name, virtual))
# add all clauses from dependencies
if transitive:
# TODO: Eventually distinguish 2 deps on the same pkg (build and link)
@@ -2188,6 +2339,8 @@ def target_defaults(self, specs):
self.default_targets = list(sorted(set(self.default_targets)))
self.target_preferences()
def virtual_providers(self):
self.gen.h2("Virtual providers")
msg = (
@@ -2509,7 +2662,6 @@ def setup(
self.pkg_rules(pkg, tests=self.tests)
self.gen.h2("Package preferences: %s" % pkg)
self.preferred_variants(pkg)
self.target_preferences(pkg)
self.gen.h1("Develop specs")
# Inject dev_path from environment
@@ -2535,20 +2687,45 @@ def setup(
self.define_target_constraints()
def literal_specs(self, specs):
for idx, spec in enumerate(specs):
for spec in specs:
self.gen.h2("Spec: %s" % str(spec))
self.gen.fact(fn.literal(idx))
condition_id = next(self._condition_id_counter)
trigger_id = next(self._trigger_id_counter)
self.gen.fact(fn.literal(idx, "virtual_root" if spec.virtual else "root", spec.name))
for clause in self.spec_clauses(spec):
self.gen.fact(fn.literal(idx, *clause.args))
if clause.args[0] == "variant_set":
self.gen.fact(
fn.literal(idx, "variant_default_value_from_cli", *clause.args[1:])
# Special condition triggered by "literal_solved"
self.gen.fact(fn.literal(trigger_id))
self.gen.fact(fn.pkg_fact(spec.name, fn.condition_trigger(condition_id, trigger_id)))
self.gen.fact(fn.condition_reason(condition_id, f"{spec} requested from CLI"))
# Effect imposes the spec
imposed_spec_key = str(spec), None
cache = self._effect_cache[spec.name]
msg = (
"literal specs have different requirements. clear cache before computing literals"
)
assert imposed_spec_key not in cache, msg
effect_id = next(self._effect_id_counter)
requirements = self.spec_clauses(spec)
root_name = spec.name
for clause in requirements:
clause_name = clause.args[0]
if clause_name == "variant_set":
requirements.append(
fn.attr("variant_default_value_from_cli", *clause.args[1:])
)
elif clause_name in ("node", "virtual_node", "hash"):
# These facts are needed to compute the "condition_set" of the root
pkg_name = clause.args[1]
self.gen.fact(fn.mentioned_in_literal(trigger_id, root_name, pkg_name))
requirements.append(fn.attr("virtual_root" if spec.virtual else "root", spec.name))
cache[imposed_spec_key] = (effect_id, requirements)
self.gen.fact(fn.pkg_fact(spec.name, fn.condition_effect(condition_id, effect_id)))
if self.concretize_everything:
self.gen.fact(fn.solve_literal(idx))
self.gen.fact(fn.solve_literal(trigger_id))
self.effect_rules()
def validate_and_define_versions_from_requirements(
self, *, allow_deprecated: bool, require_checksum: bool
@@ -3124,10 +3301,11 @@ def __init__(self, provided, conflicts):
msg = (
"Spack concretizer internal error. Please submit a bug report and include the "
"command, environment if applicable and the following error message."
f"\n {provided} is unsatisfiable, errors are:"
f"\n {provided} is unsatisfiable"
)
msg += "".join([f"\n {conflict}" for conflict in conflicts])
if conflicts:
msg += ", errors are:" + "".join([f"\n {conflict}" for conflict in conflicts])
super(spack.error.UnsatisfiableSpecError, self).__init__(msg)

View File

@@ -10,9 +10,8 @@
% ID of the nodes in the "root" link-run sub-DAG
#const min_dupe_id = 0.
#const link_run = 0.
#const direct_link_run =1.
#const direct_build = 2.
#const direct_link_run = 0.
#const direct_build = 1.
% Allow clingo to create nodes
{ attr("node", node(0..X-1, Package)) } :- max_dupes(Package, X), not virtual(Package).
@@ -30,23 +29,21 @@
:- attr("variant_value", PackageNode, _, _), not attr("node", PackageNode).
:- attr("node_flag_compiler_default", PackageNode), not attr("node", PackageNode).
:- attr("node_flag", PackageNode, _, _), not attr("node", PackageNode).
:- attr("node_flag_source", PackageNode, _, _), not attr("node", PackageNode).
:- attr("no_flags", PackageNode, _), not attr("node", PackageNode).
:- attr("external_spec_selected", PackageNode, _), not attr("node", PackageNode).
:- attr("depends_on", ParentNode, _, _), not attr("node", ParentNode).
:- attr("depends_on", _, ChildNode, _), not attr("node", ChildNode).
:- attr("node_flag_source", ParentNode, _, _), not attr("node", ParentNode).
:- attr("node_flag_source", _, _, ChildNode), not attr("node", ChildNode).
:- attr("virtual_node", VirtualNode), not provider(_, VirtualNode), internal_error("virtual node with no provider").
:- provider(_, VirtualNode), not attr("virtual_node", VirtualNode), internal_error("provider with no virtual node").
:- provider(PackageNode, _), not attr("node", PackageNode), internal_error("provider with no real node").
:- attr("virtual_node", VirtualNode), not provider(_, VirtualNode).
:- provider(_, VirtualNode), not attr("virtual_node", VirtualNode).
:- provider(PackageNode, _), not attr("node", PackageNode).
:- attr("root", node(ID, PackageNode)), ID > min_dupe_id.
:- attr("root", node(ID, PackageNode)), ID > min_dupe_id, internal_error("root with a non-minimal duplicate ID").
% Nodes in the "root" unification set cannot depend on non-root nodes if the dependency is "link" or "run"
:- attr("depends_on", node(min_dupe_id, Package), node(ID, _), "link"), ID != min_dupe_id, unification_set("root", node(min_dupe_id, Package)).
:- attr("depends_on", node(min_dupe_id, Package), node(ID, _), "run"), ID != min_dupe_id, unification_set("root", node(min_dupe_id, Package)).
:- attr("depends_on", node(min_dupe_id, Package), node(ID, _), "link"), ID != min_dupe_id, unification_set("root", node(min_dupe_id, Package)), internal_error("link dependency out of the root unification set").
:- attr("depends_on", node(min_dupe_id, Package), node(ID, _), "run"), ID != min_dupe_id, unification_set("root", node(min_dupe_id, Package)), internal_error("run dependency out of the root unification set").
% Rules on "unification sets", i.e. on sets of nodes allowing a single configuration of any given package
unify(SetID, PackageName) :- unification_set(SetID, node(_, PackageName)).
@@ -86,22 +83,24 @@ unification_set(SetID, VirtualNode)
%----
% In the "root" unification set only ID = 0 are allowed
:- unification_set("root", node(ID, _)), ID != 0.
:- unification_set("root", node(ID, _)), ID != 0, internal_error("root unification set has node with non-zero unification set ID").
% In the "root" unification set we allow only packages from the link-run possible subDAG
:- unification_set("root", node(_, Package)), not possible_in_link_run(Package), not virtual(Package).
:- unification_set("root", node(_, Package)), not possible_in_link_run(Package), not virtual(Package), internal_error("package outside possible link/run graph in root unification set").
% Each node must belong to at least one unification set
:- attr("node", PackageNode), not unification_set(_, PackageNode).
:- attr("node", PackageNode), not unification_set(_, PackageNode), internal_error("node belongs to no unification set").
% Cannot have a node with an ID, if lower ID of the same package are not used
:- attr("node", node(ID1, Package)),
not attr("node", node(ID2, Package)),
max_dupes(Package, X), ID1=0..X-1, ID2=0..X-1, ID2 < ID1.
max_dupes(Package, X), ID1=0..X-1, ID2=0..X-1, ID2 < ID1,
internal_error("node skipped id number").
:- attr("virtual_node", node(ID1, Package)),
not attr("virtual_node", node(ID2, Package)),
max_dupes(Package, X), ID1=0..X-1, ID2=0..X-1, ID2 < ID1.
max_dupes(Package, X), ID1=0..X-1, ID2=0..X-1, ID2 < ID1,
internal_error("virtual node skipped id number").
%-----------------------------------------------------------------------------
% Map literal input specs to facts that drive the solve
@@ -113,26 +112,30 @@ unification_set(SetID, VirtualNode)
multiple_nodes_attribute("node_flag_source").
multiple_nodes_attribute("depends_on").
multiple_nodes_attribute("virtual_on_edge").
multiple_nodes_attribute("provider_set").
% Map constraint on the literal ID to facts on the node
attr(Name, node(min_dupe_id, A1)) :- literal(LiteralID, Name, A1), solve_literal(LiteralID).
attr(Name, node(min_dupe_id, A1), A2) :- literal(LiteralID, Name, A1, A2), solve_literal(LiteralID).
attr(Name, node(min_dupe_id, A1), A2, A3) :- literal(LiteralID, Name, A1, A2, A3), solve_literal(LiteralID), not multiple_nodes_attribute(Name).
attr(Name, node(min_dupe_id, A1), A2, A3, A4) :- literal(LiteralID, Name, A1, A2, A3, A4), solve_literal(LiteralID).
trigger_condition_holds(TriggerID, node(min_dupe_id, Package)) :-
solve_literal(TriggerID),
pkg_fact(Package, condition_trigger(_, TriggerID)),
literal(TriggerID).
% Special cases where nodes occur in arguments other than A1
attr("node_flag_source", node(min_dupe_id, A1), A2, node(min_dupe_id, A3)) :- literal(LiteralID, "node_flag_source", A1, A2, A3), solve_literal(LiteralID).
attr("depends_on", node(min_dupe_id, A1), node(min_dupe_id, A2), A3) :- literal(LiteralID, "depends_on", A1, A2, A3), solve_literal(LiteralID).
trigger_node(TriggerID, Node, Node) :-
trigger_condition_holds(TriggerID, Node),
literal(TriggerID).
% Since we trigger the existence of literal nodes from a condition, we need to construct
% the condition_set/2 manually below
mentioned_in_literal(Root, Mentioned) :- mentioned_in_literal(TriggerID, Root, Mentioned), solve_literal(TriggerID).
condition_set(node(min_dupe_id, Root), node(min_dupe_id, Mentioned)) :- mentioned_in_literal(Root, Mentioned).
% Discriminate between "roots" that have been explicitly requested, and roots that are deduced from "virtual roots"
explicitly_requested_root(node(min_dupe_id, A1)) :- literal(LiteralID, "root", A1), solve_literal(LiteralID).
explicitly_requested_root(node(min_dupe_id, Package)) :-
solve_literal(TriggerID),
trigger_and_effect(Package, TriggerID, EffectID),
imposed_constraint(EffectID, "root", Package).
#defined concretize_everything/0.
#defined literal/1.
#defined literal/3.
#defined literal/4.
#defined literal/5.
#defined literal/6.
% Attributes for node packages which must have a single value
attr_single_value("version").
@@ -230,7 +233,8 @@ possible_version_weight(node(ID, Package), Weight)
1 { version_weight(node(ID, Package), Weight) : pkg_fact(Package, version_declared(Version, Weight)) } 1
:- attr("version", node(ID, Package), Version),
attr("node", node(ID, Package)).
attr("node", node(ID, Package)),
internal_error("version weights must exist and be unique").
% node_version_satisfies implies that exactly one of the satisfying versions
% is the package's version, and vice versa.
@@ -244,7 +248,8 @@ possible_version_weight(node(ID, Package), Weight)
% bound on the choice rule to avoid false positives with the error below
1 { attr("version", node(ID, Package), Version) : pkg_fact(Package, version_satisfies(Constraint, Version)) }
:- attr("node_version_satisfies", node(ID, Package), Constraint),
pkg_fact(Package, version_satisfies(Constraint, _)).
pkg_fact(Package, version_satisfies(Constraint, _)),
internal_error("must choose a single version to satisfy version constraints").
% More specific error message if the version cannot satisfy some constraint
% Otherwise covered by `no_version_error` and `versions_conflict_error`.
@@ -313,9 +318,11 @@ trigger_condition_holds(ID, RequestorNode) :-
attr(Name, node(X, A1)) : condition_requirement(ID, Name, A1), condition_nodes(ID, PackageNode, node(X, A1));
attr(Name, node(X, A1), A2) : condition_requirement(ID, Name, A1, A2), condition_nodes(ID, PackageNode, node(X, A1));
attr(Name, node(X, A1), A2, A3) : condition_requirement(ID, Name, A1, A2, A3), condition_nodes(ID, PackageNode, node(X, A1)), not multiple_nodes_attribute(Name);
attr(Name, node(X, A1), A2, A3, A4) : condition_requirement(ID, Name, A1, A2, A3, A4), condition_nodes(ID, PackageNode, node(X, A1));
attr(Name, node(X, A1), A2, A3, A4) : condition_requirement(ID, Name, A1, A2, A3, A4), condition_nodes(ID, PackageNode, node(X, A1)), not multiple_nodes_attribute(Name);
% Special cases
attr("node_flag_source", node(X, A1), A2, node(Y, A3)) : condition_requirement(ID, "node_flag_source", A1, A2, A3), condition_nodes(ID, PackageNode, node(X, A1)), condition_nodes(ID, PackageNode, node(Y, A3));
attr("node_flag_propagation_candidate", node(X, A1), A2, A3, node(Y, A4)) : condition_reuqirement(ID, "node_flag_propagation_candidate", A1, A2, A3, A4), condition_nodes(ID, PackageNode, node(X, A1)), condition_nodes(ID, PackageNode, node(Y, A4));
% If the structure of the program could ever change so that node_flag_source can appear in a condition, we will need a case for it here.
not cannot_hold(ID, PackageNode).
condition_holds(ConditionID, node(X, Package))
@@ -357,7 +364,7 @@ imposed_nodes(ConditionID, PackageNode, node(X, A1))
% Conditions that hold may impose constraints on other specs
attr(Name, node(X, A1)) :- impose(ID, PackageNode), imposed_constraint(ID, Name, A1), imposed_nodes(ID, PackageNode, node(X, A1)).
attr(Name, node(X, A1), A2) :- impose(ID, PackageNode), imposed_constraint(ID, Name, A1, A2), imposed_nodes(ID, PackageNode, node(X, A1)).
attr(Name, node(X, A1), A2) :- impose(ID, PackageNode), imposed_constraint(ID, Name, A1, A2), imposed_nodes(ID, PackageNode, node(X, A1)), not multiple_nodes_attribute(Name).
attr(Name, node(X, A1), A2, A3) :- impose(ID, PackageNode), imposed_constraint(ID, Name, A1, A2, A3), imposed_nodes(ID, PackageNode, node(X, A1)), not multiple_nodes_attribute(Name).
attr(Name, node(X, A1), A2, A3, A4) :- impose(ID, PackageNode), imposed_constraint(ID, Name, A1, A2, A3, A4), imposed_nodes(ID, PackageNode, node(X, A1)).
@@ -368,6 +375,25 @@ attr("node_flag_source", node(X, A1), A2, node(Y, A3))
imposed_constraint(ID, "node_flag_source", A1, A2, A3),
condition_set(node(Y, A3), node(X, A1)).
% For node flag propagation, we need to look at the condition_set of the source, since it is the ancestor
% of the package on which we want to impose the constraint
attr("node_flag_propagation_candidate", node(X, A1), A2, A3, node(Y, A4))
:- impose(ID, node(X, A1)),
imposed_constraint(ID, "node_flag_propagation_candidate", A1, A2, A3, A4),
condition_set(node(Y, A4), node(X, A1)).
% If the structure of the program could ever change so that node_flag_source can appear in a condition, we will need a case for it here.
% Provider set is relevant only for literals, since it's the only place where `^[virtuals=foo] bar`
% might appear in the HEAD of a rule
attr("provider_set", node(min_dupe_id, Provider), node(min_dupe_id, Virtual))
:- solve_literal(TriggerID),
trigger_and_effect(_, TriggerID, EffectID),
impose(EffectID, _),
imposed_constraint(EffectID, "provider_set", Provider, Virtual).
provider(ProviderNode, VirtualNode) :- attr("provider_set", ProviderNode, VirtualNode).
% Here we can't use the condition set because it's a recursive definition that doesn't define the
% node index and leads to unsatisfiability. Hence we say that one and only one node index must
% satisfy the dependency.
@@ -427,24 +453,11 @@ depends_on(PackageNode, DependencyNode) :- attr("depends_on", PackageNode, Depen
% concrete. We chop off dependencies for externals, and dependencies of
% concrete specs don't need to be resolved -- they arise from the concrete
% specs themselves.
dependency_holds(node(NodeID, Package), Dependency, Type) :-
pkg_fact(Package, dependency_condition(ID, Dependency)),
dependency_type(ID, Type),
build(node(NodeID, Package)),
not external(node(NodeID, Package)),
condition_holds(ID, node(NodeID, Package)).
% We cut off dependencies of externals (as we don't really know them).
% Don't impose constraints on dependencies that don't exist.
do_not_impose(EffectID, node(NodeID, Package)) :-
not dependency_holds(node(NodeID, Package), Dependency, _),
attr("node", node(NodeID, Package)),
pkg_fact(Package, dependency_condition(ID, Dependency)),
pkg_fact(Package, condition_effect(ID, EffectID)).
attr("track_dependencies", Node) :- build(Node), not external(Node).
% If a dependency holds on a package node, there must be one and only one dependency node satisfying it
1 { attr("depends_on", PackageNode, node(0..Y-1, Dependency), Type) : max_dupes(Dependency, Y) } 1
:- dependency_holds(PackageNode, Dependency, Type),
:- attr("dependency_holds", PackageNode, Dependency, Type),
not virtual(Dependency).
% all nodes in the graph must be reachable from some root
@@ -476,10 +489,25 @@ error(1, Msg)
% Virtual dependencies
%-----------------------------------------------------------------------------
% If the provider is set from the command line, its weight is 0
possible_provider_weight(ProviderNode, VirtualNode, 0, "Set on the command line")
:- attr("provider_set", ProviderNode, VirtualNode).
% Enforces all virtuals to be provided, if multiple of them are provided together
error(100, "Package '{0}' needs to provide both '{1}' and '{2}' together, but provides only '{1}'", Package, Virtual1, Virtual2)
:- condition_holds(ID, node(X, Package)),
pkg_fact(Package, provided_together(ID, SetID, Virtual1)),
pkg_fact(Package, provided_together(ID, SetID, Virtual2)),
Virtual1 != Virtual2,
attr("virtual_on_incoming_edges", node(X, Package), Virtual1),
not attr("virtual_on_incoming_edges", node(X, Package), Virtual2),
attr("virtual_node", node(_, Virtual1)),
attr("virtual_node", node(_, Virtual2)).
% if a package depends on a virtual, it's not external and we have a
% provider for that virtual then it depends on the provider
node_depends_on_virtual(PackageNode, Virtual, Type)
:- dependency_holds(PackageNode, Virtual, Type),
:- attr("dependency_holds", PackageNode, Virtual, Type),
virtual(Virtual),
not external(PackageNode).
@@ -489,11 +517,14 @@ node_depends_on_virtual(PackageNode, Virtual) :- node_depends_on_virtual(Package
:- node_depends_on_virtual(PackageNode, Virtual, Type).
attr("virtual_on_edge", PackageNode, ProviderNode, Virtual)
:- dependency_holds(PackageNode, Virtual, Type),
:- attr("dependency_holds", PackageNode, Virtual, Type),
attr("depends_on", PackageNode, ProviderNode, Type),
provider(ProviderNode, node(_, Virtual)),
not external(PackageNode).
attr("virtual_on_incoming_edges", ProviderNode, Virtual)
:- attr("virtual_on_edge", _, ProviderNode, Virtual).
% dependencies on virtuals also imply that the virtual is a virtual node
1 { attr("virtual_node", node(0..X-1, Virtual)) : max_dupes(Virtual, X) }
:- node_depends_on_virtual(PackageNode, Virtual).
@@ -501,6 +532,10 @@ attr("virtual_on_edge", PackageNode, ProviderNode, Virtual)
% If there's a virtual node, we must select one and only one provider.
% The provider must be selected among the possible providers.
error(100, "'{0}' cannot be a provider for the '{1}' virtual", Package, Virtual)
:- attr("provider_set", node(min_dupe_id, Package), node(min_dupe_id, Virtual)),
not virtual_condition_holds( node(min_dupe_id, Package), Virtual).
error(100, "Cannot find valid provider for virtual {0}", Virtual)
:- attr("virtual_node", node(X, Virtual)),
not provider(_, node(X, Virtual)).
@@ -521,20 +556,6 @@ attr("root", PackageNode) :- attr("virtual_root", VirtualNode), provider(Package
attr("node", PackageNode), virtual_condition_holds(PackageNode, Virtual) } 1
:- attr("virtual_node", node(X, Virtual)).
% If a spec is selected as a provider, it is selected for all the virtuals it could provide
:- provider(PackageNode, node(X, Virtual1)),
virtual_condition_holds(PackageNode, Virtual2),
Virtual2 != Virtual1,
unification_set(SetID, PackageNode),
unification_set(SetID, node(X, Virtual2)),
not provider(PackageNode, node(X, Virtual2)).
% If a spec is a dependency, and could provide a needed virtual, it must be a provider
:- node_depends_on_virtual(PackageNode, Virtual),
depends_on(PackageNode, PossibleProviderNode),
virtual_condition_holds(PossibleProviderNode, Virtual),
not attr("virtual_on_edge", PackageNode, PossibleProviderNode, Virtual).
% The provider provides the virtual if some provider condition holds.
virtual_condition_holds(node(ProviderID, Provider), Virtual) :- virtual_condition_holds(ID, node(ProviderID, Provider), Virtual).
virtual_condition_holds(ID, node(ProviderID, Provider), Virtual) :-
@@ -561,6 +582,8 @@ do_not_impose(EffectID, node(X, Package))
not virtual_condition_holds(PackageNode, Virtual),
internal_error("Virtual when provides not respected").
#defined provided_together/4.
%-----------------------------------------------------------------------------
% Virtual dependency weights
%-----------------------------------------------------------------------------
@@ -577,21 +600,15 @@ possible_provider_weight(DependencyNode, VirtualNode, 0, "external")
:- provider(DependencyNode, VirtualNode),
external(DependencyNode).
% A provider mentioned in packages.yaml can use a weight
% according to its priority in the list of providers
possible_provider_weight(node(DependencyID, Dependency), node(VirtualID, Virtual), Weight, "packages_yaml")
:- provider(node(DependencyID, Dependency), node(VirtualID, Virtual)),
depends_on(node(ID, Package), node(DependencyID, Dependency)),
pkg_fact(Package, provider_preference(Virtual, Dependency, Weight)).
% A provider mentioned in the default configuration can use a weight
% according to its priority in the list of providers
possible_provider_weight(node(DependencyID, Dependency), node(VirtualID, Virtual), Weight, "default")
:- provider(node(DependencyID, Dependency), node(VirtualID, Virtual)),
default_provider_preference(Virtual, Dependency, Weight).
possible_provider_weight(node(ProviderID, Provider), node(VirtualID, Virtual), Weight, "default")
:- provider(node(ProviderID, Provider), node(VirtualID, Virtual)),
default_provider_preference(Virtual, Provider, Weight).
% Any provider can use 100 as a weight, which is very high and discourages its use
possible_provider_weight(node(DependencyID, Dependency), VirtualNode, 100, "fallback") :- provider(node(DependencyID, Dependency), VirtualNode).
possible_provider_weight(node(ProviderID, Provider), VirtualNode, 100, "fallback")
:- provider(node(ProviderID, Provider), VirtualNode).
% do not warn if the generated program contains none of these.
#defined virtual/1.
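Taken together, these rules put each candidate provider into a weight tier: 0 when set on the command line or external, its configured position when listed in packages.yaml or the default preferences, and 100 as a last-resort fallback. A toy sketch of how minimization then selects a provider (the weights below are assumed, not computed by Spack):

candidate_weights = {
    "mpich":   [0, 100],   # preferred in packages.yaml, plus the fallback tier
    "openmpi": [1, 100],   # second in the default preference list
    "zmpi":    [100],      # fallback tier only
}
best = min(candidate_weights, key=lambda p: min(candidate_weights[p]))
print(best)                # mpich: the lowest reachable tier wins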
@@ -609,11 +626,11 @@ possible_provider_weight(node(DependencyID, Dependency), VirtualNode, 100, "fall
pkg_fact(Package, version_declared(Version, Weight, "external")) }
:- external(node(ID, Package)).
error(100, "Attempted to use external for '{0}' which does not satisfy any configured external spec", Package)
error(100, "Attempted to use external for '{0}' which does not satisfy any configured external spec version", Package)
:- external(node(ID, Package)),
not external_version(node(ID, Package), _, _).
error(100, "Attempted to use external for '{0}' which does not satisfy any configured external spec", Package)
error(100, "Attempted to use external for '{0}' which does not satisfy a unique configured external spec version", Package)
:- external(node(ID, Package)),
2 { external_version(node(ID, Package), Version, Weight) }.
@@ -642,18 +659,15 @@ external(PackageNode) :- attr("external_spec_selected", PackageNode, _).
% determine if an external spec has been selected
attr("external_spec_selected", node(ID, Package), LocalIndex) :-
external_conditions_hold(node(ID, Package), LocalIndex),
attr("external_conditions_hold", node(ID, Package), LocalIndex),
attr("node", node(ID, Package)),
not attr("hash", node(ID, Package), _).
external_conditions_hold(node(PackageID, Package), LocalIndex) :-
pkg_fact(Package, possible_external(ID, LocalIndex)), condition_holds(ID, node(PackageID, Package)).
% it cannot happen that a spec is external, but none of the external specs'
% conditions hold.
error(100, "Attempted to use external for '{0}' which does not satisfy any configured external spec", Package)
:- external(node(ID, Package)),
not external_conditions_hold(node(ID, Package), _).
not attr("external_conditions_hold", node(ID, Package), _).
%-----------------------------------------------------------------------------
% Config required semantics
@@ -707,7 +721,7 @@ requirement_group_satisfied(node(ID, Package), X) :-
activate_requirement(node(NodeID1, Package1), RequirementID),
pkg_fact(Package1, condition_effect(ConditionID, EffectID)),
imposed_constraint(EffectID, "node_flag_source", Package1, FlagType, Package2),
imposed_packages(NodeID2, Package2).
imposed_nodes(EffectID, node(NodeID2, Package2), node(NodeID1, Package1)).
requirement_weight(node(ID, Package), Group, W) :-
W = #min {
@@ -754,23 +768,36 @@ node_has_variant(node(ID, Package), Variant) :-
pkg_fact(Package, variant(Variant)),
attr("node", node(ID, Package)).
attr("variant_propagate", PackageNode, Variant, Value, Source) :-
% Variant propagation is forwarded to dependencies
attr("variant_propagation_candidate", PackageNode, Variant, Value, Source) :-
attr("node", PackageNode),
depends_on(ParentNode, PackageNode),
attr("variant_propagate", ParentNode, Variant, Value, Source),
not attr("variant_set", PackageNode, Variant).
attr("variant_value", node(_, Source), Variant, Value),
attr("variant_propagation_candidate", ParentNode, Variant, _, Source).
attr("variant_value", node(ID, Package), Variant, Value) :-
attr("node", node(ID, Package)),
% If the node is a candidate and it has the variant and value,
% then that variant and value should be propagated
attr("variant_propagate", node(ID, Package), Variant, Value, Source) :-
attr("variant_propagation_candidate", node(ID, Package), Variant, Value, Source),
node_has_variant(node(ID, Package), Variant),
attr("variant_propagate", node(ID, Package), Variant, Value, _),
pkg_fact(Package, variant_possible_value(Variant, Value)).
pkg_fact(Package, variant_possible_value(Variant, Value)),
not attr("variant_set", node(ID, Package), Variant).
% Propagate the value if the corresponding attribute is present
attr("variant_value", PackageNode, Variant, Value) :- attr("variant_propagate", PackageNode, Variant, Value, _).
% If a variant is propagated, we cannot have extraneous values (this is for multi-valued variants)
variant_is_propagated(PackageNode, Variant) :- attr("variant_propagate", PackageNode, Variant, _, _).
:- variant_is_propagated(PackageNode, Variant),
attr("variant_value", PackageNode, Variant, Value),
not attr("variant_propagate", PackageNode, Variant, Value, _).
% Cannot receive different values from different sources on the same variant
error(100, "{0} and {1} cannot both propagate variant '{2}' to package {3} with values '{4}' and '{5}'", Source1, Source2, Variant, Package, Value1, Value2) :-
attr("variant_propagate", node(X, Package), Variant, Value1, Source1),
attr("variant_propagate", node(X, Package), Variant, Value2, Source2),
node_has_variant(node(X, Package), Variant),
Value1 < Value2.
Value1 < Value2, Source1 < Source2.
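The candidate/propagate split walks values down the DAG: every descendant of a propagating node becomes a candidate, but the value only sticks where the package has the variant, accepts the value, and the user has not set it explicitly. A self-contained sketch of that walk over a toy DAG (the nodes and variant tables are invented for illustration):

children = {"mpileaks": ["callpath"], "callpath": ["dyninst"], "dyninst": []}
has_variant = {"mpileaks": {"debug"}, "callpath": set(), "dyninst": {"debug"}}
explicitly_set = set()                       # (package, variant) pairs

def propagate(source, variant, value):
    propagated, stack = {}, [source]
    while stack:                             # every descendant is a candidate
        node = stack.pop()
        stack.extend(children[node])
        if node == source:
            continue
        if variant in has_variant[node] and (node, variant) not in explicitly_set:
            propagated[node] = value         # ...but the value sticks only here
    return propagated

print(propagate("mpileaks", "debug", True))  # {'dyninst': True}; callpath is skipped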
% a variant cannot be set if it is not a variant on the package
error(100, "Cannot set variant '{0}' for package '{1}' because the variant condition cannot be satisfied for the given spec", Variant, Package)
@@ -866,13 +893,15 @@ variant_default_not_used(node(ID, Package), Variant, Value)
:- variant_default_value(Package, Variant, Value),
node_has_variant(node(ID, Package), Variant),
not attr("variant_value", node(ID, Package), Variant, Value),
not attr("variant_propagate", node(ID, Package), Variant, _, _),
attr("node", node(ID, Package)).
% The variant is set in an external spec
external_with_variant_set(node(NodeID, Package), Variant, Value)
:- attr("variant_value", node(NodeID, Package), Variant, Value),
condition_requirement(ID, "variant_value", Package, Variant, Value),
pkg_fact(Package, possible_external(ID, _)),
condition_requirement(TriggerID, "variant_value", Package, Variant, Value),
trigger_and_effect(Package, TriggerID, EffectID),
imposed_constraint(EffectID, "external_conditions_hold", Package, _),
external(node(NodeID, Package)),
attr("node", node(NodeID, Package)).
@@ -1048,7 +1077,7 @@ attr("node_target", PackageNode, Target)
node_target_weight(node(ID, Package), Weight)
:- attr("node", node(ID, Package)),
attr("node_target", node(ID, Package), Target),
pkg_fact(Package, target_weight(Target, Weight)).
target_weight(Target, Weight).
% compatibility rules for targets among nodes
node_target_match(ParentNode, DependencyNode)
@@ -1170,23 +1199,17 @@ compiler_mismatch_required(PackageNode, DependencyNode)
#defined allow_compiler/2.
% compilers weighted by preference according to packages.yaml
compiler_weight(node(ID, Package), Weight)
node_compiler_weight(node(ID, Package), Weight)
:- node_compiler(node(ID, Package), CompilerID),
compiler_name(CompilerID, Compiler),
compiler_version(CompilerID, V),
pkg_fact(Package, node_compiler_preference(Compiler, V, Weight)).
compiler_weight(node(ID, Package), Weight)
compiler_weight(CompilerID, Weight).
node_compiler_weight(node(ID, Package), 100)
:- node_compiler(node(ID, Package), CompilerID),
compiler_name(CompilerID, Compiler),
compiler_version(CompilerID, V),
not pkg_fact(Package, node_compiler_preference(Compiler, V, _)),
default_compiler_preference(CompilerID, Weight).
compiler_weight(node(ID, Package), 100)
:- node_compiler(node(ID, Package), CompilerID),
compiler_name(CompilerID, Compiler),
compiler_version(CompilerID, V),
not pkg_fact(Package, node_compiler_preference(Compiler, V, _)),
not default_compiler_preference(CompilerID, _).
not compiler_weight(CompilerID, _).
% For the time being, be strict and reuse only if the compiler matches one we have on the system
error(100, "Compiler {1}@{2} requested for {0} cannot be found. Set install_missing_compilers:true if intended.", Package, Compiler, Version)
@@ -1194,52 +1217,22 @@ error(100, "Compiler {1}@{2} requested for {0} cannot be found. Set install_miss
not node_compiler(node(ID, Package), _).
#defined node_compiler_preference/4.
#defined default_compiler_preference/3.
#defined compiler_weight/3.
%-----------------------------------------------------------------------------
% Compiler flags
%-----------------------------------------------------------------------------
% propagate flags when compilers match
can_inherit_flags(PackageNode, DependencyNode, FlagType)
:- same_compiler(PackageNode, DependencyNode),
not attr("node_flag_set", DependencyNode, FlagType, _),
flag_type(FlagType).
same_compiler(PackageNode, DependencyNode)
:- depends_on(PackageNode, DependencyNode),
node_compiler(PackageNode, CompilerID),
node_compiler(DependencyNode, CompilerID),
compiler_id(CompilerID).
node_flag_inherited(DependencyNode, FlagType, Flag)
:- attr("node_flag_set", PackageNode, FlagType, Flag),
can_inherit_flags(PackageNode, DependencyNode, FlagType),
attr("node_flag_propagate", PackageNode, FlagType).
% Ensure propagation
:- node_flag_inherited(PackageNode, FlagType, Flag),
can_inherit_flags(PackageNode, DependencyNode, FlagType),
attr("node_flag_propagate", PackageNode, FlagType).
error(100, "{0} and {1} cannot both propagate compiler flags '{2}' to {3}", Source1, Source2, Package, FlagType) :-
depends_on(Source1, Package),
depends_on(Source2, Package),
attr("node_flag_propagate", Source1, FlagType),
attr("node_flag_propagate", Source2, FlagType),
can_inherit_flags(Source1, Package, FlagType),
can_inherit_flags(Source2, Package, FlagType),
Source1 < Source2.
% remember where flags came from
attr("node_flag_source", PackageNode, FlagType, PackageNode)
:- attr("node_flag_set", PackageNode, FlagType, _).
attr("node_flag_source", DependencyNode, FlagType, Q)
:- attr("node_flag_source", PackageNode, FlagType, Q),
node_flag_inherited(DependencyNode, FlagType, _),
attr("node_flag_propagate", PackageNode, FlagType).
% compiler flags from compilers.yaml are put on nodes if compiler matches
attr("node_flag", PackageNode, FlagType, Flag)
:- compiler_flag(CompilerID, FlagType, Flag),
@@ -1260,7 +1253,30 @@ attr("node_flag_compiler_default", PackageNode)
% if a flag is set to something or inherited, it's included
attr("node_flag", PackageNode, FlagType, Flag) :- attr("node_flag_set", PackageNode, FlagType, Flag).
attr("node_flag", PackageNode, FlagType, Flag) :- node_flag_inherited(PackageNode, FlagType, Flag).
attr("node_flag_propagate", PackageNode, FlagType, Flag, Source) :-
attr("node_flag_propagation_candidate", PackageNode, FlagType, Flag, Source),
not attr("node_flag_set", PackageNode, FlagType, _).
% source is a node() attribute
attr("node_flag_propagation_candidate", PackageNode, FlagType, Flag, Source) :-
same_compiler(ParentNode, PackageNode),
attr("node_flag_propagation_candidate", ParentNode, FlagType, _, Source),
attr("node_flag", Source, FlagType, Flag),
flag_type(FlagType).
error(100, "{0} and {1} cannot both propagate compiler flags '{2}' to {3}", Source1, Source2, PackageNode, FlagType) :-
same_compiler(Source1, PackageNode),
same_compiler(Source2, PackageNode),
attr("node_flag_propagate", _, FlagType, _, Source1),
attr("node_flag_propagate", _, FlagType, _, Source2),
Source1 < Source2.
attr("node_flag_source", PackageNode, FlagType, Q)
:- attr("node_flag_propagate", PackageNode, FlagType, _, Q).
attr("node_flag", PackageNode, FlagType, Flag) :- attr("node_flag_propagate", PackageNode, FlagType, Flag, _).
% if no node flags are set for a type, there are no flags.
attr("no_flags", PackageNode, FlagType)
@@ -1518,7 +1534,7 @@ opt_criterion(15, "non-preferred compilers").
#minimize{ 0@15: #true }.
#minimize{
Weight@15+Priority,PackageNode
: compiler_weight(PackageNode, Weight),
: node_compiler_weight(PackageNode, Weight),
build_priority(PackageNode, Priority)
}.

View File

@@ -24,4 +24,29 @@
#show error/5.
#show error/6.
% for error causation
#show condition_reason/2.
% For error messages to use later
#show pkg_fact/2.
#show condition_holds/2.
#show imposed_constraint/3.
#show imposed_constraint/4.
#show imposed_constraint/5.
#show imposed_constraint/6.
#show condition_requirement/3.
#show condition_requirement/4.
#show condition_requirement/5.
#show condition_requirement/6.
#show node_has_variant/2.
#show build/1.
#show external/1.
#show external_version/3.
#show trigger_and_effect/3.
#show unification_set/2.
#show provider/2.
#show condition_nodes/3.
#show trigger_node/3.
#show imposed_nodes/3.
% debug

View File

@@ -0,0 +1,239 @@
% Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
% Spack Project Developers. See the top-level COPYRIGHT file for details.
%
% SPDX-License-Identifier: (Apache-2.0 OR MIT)
%=============================================================================
% This logic program adds detailed error messages to Spack's concretizer
%=============================================================================
#program error_messages.
% Create a causal tree between trigger conditions by locating the effect conditions
% that are triggers for another condition. Condition2 is caused by Condition1
condition_cause(Condition2, ID2, Condition1, ID1) :-
condition_holds(Condition2, node(ID2, Package2)),
pkg_fact(Package2, condition_trigger(Condition2, Trigger)),
condition_requirement(Trigger, Name, Package),
condition_nodes(Trigger, TriggerNode, node(ID, Package)),
trigger_node(Trigger, TriggerNode, node(ID2, Package2)),
attr(Name, node(ID, Package)),
condition_holds(Condition1, node(ID1, Package1)),
pkg_fact(Package1, condition_effect(Condition1, Effect)),
imposed_constraint(Effect, Name, Package),
imposed_nodes(Effect, node(ID1, Package1), node(ID, Package)).
condition_cause(Condition2, ID2, Condition1, ID1) :-
condition_holds(Condition2, node(ID2, Package2)),
pkg_fact(Package2, condition_trigger(Condition2, Trigger)),
condition_requirement(Trigger, Name, Package, A1),
condition_nodes(Trigger, TriggerNode, node(ID, Package)),
trigger_node(Trigger, TriggerNode, node(ID2, Package2)),
attr(Name, node(ID, Package), A1),
condition_holds(Condition1, node(ID1, Package1)),
pkg_fact(Package1, condition_effect(Condition1, Effect)),
imposed_constraint(Effect, Name, Package, A1),
imposed_nodes(Effect, node(ID1, Package1), node(ID, Package)).
condition_cause(Condition2, ID2, Condition1, ID1) :-
condition_holds(Condition2, node(ID2, Package2)),
pkg_fact(Package2, condition_trigger(Condition2, Trigger)),
condition_requirement(Trigger, Name, Package, A1, A2),
condition_nodes(Trigger, TriggerNode, node(ID, Package)),
trigger_node(Trigger, TriggerNode, node(ID2, Package2)),
attr(Name, node(ID, Package), A1, A2),
condition_holds(Condition1, node(ID1, Package1)),
pkg_fact(Package1, condition_effect(Condition1, Effect)),
imposed_constraint(Effect, Name, Package, A1, A2),
imposed_nodes(Effect, node(ID1, Package1), node(ID, Package)).
condition_cause(Condition2, ID2, Condition1, ID1) :-
condition_holds(Condition2, node(ID2, Package2)),
pkg_fact(Package2, condition_trigger(Condition2, Trigger)),
condition_requirement(Trigger, Name, Package, A1, A2, A3),
condition_nodes(Trigger, TriggerNode, node(ID, Package)),
trigger_node(Trigger, TriggerNode, node(ID2, Package2)),
attr(Name, node(ID, Package), A1, A2, A3),
condition_holds(Condition1, node(ID1, Package1)),
pkg_fact(Package1, condition_effect(Condition1, Effect)),
imposed_constraint(Effect, Name, Package, A1, A2, A3),
imposed_nodes(Effect, node(ID1, Package1), node(ID, Package)).
% special condition cause for dependency conditions
% we can't simply impose the existence of the node for dependency conditions
% because we need to allow for the choice of which dupe ID the node gets
condition_cause(Condition2, ID2, Condition1, ID1) :-
condition_holds(Condition2, node(ID2, Package2)),
pkg_fact(Package2, condition_trigger(Condition2, Trigger)),
condition_requirement(Trigger, "node", Package),
condition_nodes(Trigger, TriggerNode, node(ID, Package)),
trigger_node(Trigger, TriggerNode, node(ID2, Package2)),
attr("node", node(ID, Package)),
condition_holds(Condition1, node(ID1, Package1)),
pkg_fact(Package1, condition_effect(Condition1, Effect)),
imposed_constraint(Effect, "dependency_holds", Parent, Package, Type),
imposed_nodes(Effect, node(ID1, Package1), node(ID, Package)),
attr("depends_on", node(X, Parent), node(ID, Package), Type).
% The literal startcauses is used to separate the variables that are part of the error from the
% ones describing the causal tree of the error. After startcauses, each successive pair must be
% a condition and a condition_set id for which it holds.
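On the consuming side this convention suggests a simple split: everything before startcauses belongs to the message, and what follows pairs up as (condition, condition_set). A hypothetical standalone parser for that shape (the atom layout follows the comment above; the function name is invented):

def split_error_args(args):
    """Split an error atom's arguments at the 'startcauses' marker."""
    if "startcauses" not in args:
        return args, []
    idx = args.index("startcauses")
    msg_args, cause_args = args[:idx], args[idx + 1:]
    return msg_args, list(zip(cause_args[::2], cause_args[1::2]))

msg, causes = split_error_args(["hdf5", "@1.12:", "startcauses", 7, 0, 12, 1])
print(msg)      # ['hdf5', '@1.12:']
print(causes)   # [(7, 0), (12, 1)]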
% More specific error message if the version cannot satisfy some constraint
% Otherwise covered by `no_version_error` and `versions_conflict_error`.
error(1, "Cannot satisfy '{0}@{1}'", Package, Constraint, startcauses, ConstraintCause, CauseID)
:- attr("node_version_satisfies", node(ID, Package), Constraint),
pkg_fact(TriggerPkg, condition_effect(ConstraintCause, EffectID)),
imposed_constraint(EffectID, "node_version_satisfies", Package, Constraint),
condition_holds(ConstraintCause, node(CauseID, TriggerPkg)),
attr("version", node(ID, Package), Version),
not pkg_fact(Package, version_satisfies(Constraint, Version)).
error(0, "Cannot satisfy '{0}@{1}' and '{0}@{2}", Package, Constraint1, Constraint2, startcauses, Cause1, C1ID, Cause2, C2ID)
:- attr("node_version_satisfies", node(ID, Package), Constraint1),
pkg_fact(TriggerPkg1, condition_effect(Cause1, EffectID1)),
imposed_constraint(EffectID1, "node_version_satisfies", Package, Constraint1),
condition_holds(Cause1, node(C1ID, TriggerPkg1)),
% two constraints
attr("node_version_satisfies", node(ID, Package), Constraint2),
pkg_fact(TriggerPkg2, condition_effect(Cause2, EffectID2)),
imposed_constraint(EffectID2, "node_version_satisfies", Package, Constraint2),
condition_holds(Cause2, node(C2ID, TriggerPkg2)),
% version chosen
attr("version", node(ID, Package), Version),
% version satisfies one but not the other
pkg_fact(Package, version_satisfies(Constraint1, Version)),
not pkg_fact(Package, version_satisfies(Constraint2, Version)).
% causation tracking error for no or multiple virtual providers
error(0, "Cannot find a valid provider for virtual {0}", Virtual, startcauses, Cause, CID)
:- attr("virtual_node", node(X, Virtual)),
not provider(_, node(X, Virtual)),
imposed_constraint(EID, "dependency_holds", Parent, Virtual, Type),
pkg_fact(TriggerPkg, condition_effect(Cause, EID)),
condition_holds(Cause, node(CID, TriggerPkg)).
% At most one variant value for single-valued variants
error(0, "'{0}' required multiple values for single-valued variant '{1}'\n Requested 'Spec({1}={2})' and 'Spec({1}={3})'", Package, Variant, Value1, Value2, startcauses, Cause1, X, Cause2, X)
:- attr("node", node(X, Package)),
node_has_variant(node(X, Package), Variant),
pkg_fact(Package, variant_single_value(Variant)),
build(node(X, Package)),
attr("variant_value", node(X, Package), Variant, Value1),
imposed_constraint(EID1, "variant_set", Package, Variant, Value1),
pkg_fact(TriggerPkg1, condition_effect(Cause1, EID1)),
condition_holds(Cause1, node(X, TriggerPkg1)),
attr("variant_value", node(X, Package), Variant, Value2),
imposed_constraint(EID2, "variant_set", Package, Variant, Value2),
pkg_fact(TriggerPkg2, condition_effect(Cause2, EID2)),
condition_holds(Cause2, node(X, TriggerPkg2)),
Value1 < Value2. % see [1] in concretize.lp
% Externals have to specify external conditions
error(0, "Attempted to use external for {0} which does not satisfy any configured external spec version", Package, startcauses, ExternalCause, CID)
:- external(node(ID, Package)),
attr("external_spec_selected", node(ID, Package), Index),
imposed_constraint(EID, "external_conditions_hold", Package, Index),
pkg_fact(TriggerPkg, condition_effect(ExternalCause, EID)),
condition_holds(ExternalCause, node(CID, TriggerPkg)),
not external_version(node(ID, Package), _, _).
error(0, "Attempted to build package {0} which is not buildable and does not have a satisfying external\n attr('{1}', '{2}') is an external constraint for {0} which was not satisfied", Package, Name, A1)
:- external(node(ID, Package)),
not attr("external_conditions_hold", node(ID, Package), _),
imposed_constraint(EID, "external_conditions_hold", Package, _),
trigger_and_effect(Package, TID, EID),
condition_requirement(TID, Name, A1),
not attr(Name, node(_, A1)).
error(0, "Attempted to build package {0} which is not buildable and does not have a satisfying external\n attr('{1}', '{2}', '{3}') is an external constraint for {0} which was not satisfied", Package, Name, A1, A2)
:- external(node(ID, Package)),
not attr("external_conditions_hold", node(ID, Package), _),
imposed_constraint(EID, "external_conditions_hold", Package, _),
trigger_and_effect(Package, TID, EID),
condition_requirement(TID, Name, A1, A2),
not attr(Name, node(_, A1), A2).
error(0, "Attempted to build package {0} which is not buildable and does not have a satisfying external\n attr('{1}', '{2}', '{3}', '{4}') is an external constraint for {0} which was not satisfied", Package, Name, A1, A2, A3)
:- external(node(ID, Package)),
not attr("external_conditions_hold", node(ID, Package), _),
imposed_constraint(EID, "external_conditions_hold", Package, _),
trigger_and_effect(Package, TID, EID),
condition_requirement(TID, Name, A1, A2, A3),
not attr(Name, node(_, A1), A2, A3).
error(0, "Attempted to build package {0} which is not buildable and does not have a satisfying external\n 'Spec({0} {1}={2})' is an external constraint for {0} which was not satisfied\n 'Spec({0} {1}={3})' required", Package, Variant, Value, OtherValue, startcauses, OtherValueCause, CID)
:- external(node(ID, Package)),
not attr("external_conditions_hold", node(ID, Package), _),
imposed_constraint(EID, "external_conditions_hold", Package, _),
trigger_and_effect(Package, TID, EID),
condition_requirement(TID, "variant_value", Package, Variant, Value),
not attr("variant_value", node(ID, Package), Variant, Value),
attr("variant_value", node(ID, Package), Variant, OtherValue),
imposed_constraint(EID2, "variant_set", Package, Variant, OtherValue),
pkg_fact(TriggerPkg, condition_effect(OtherValueCause, EID2)),
condition_holds(OtherValueCause, node(CID, TriggerPkg)).
error(0, "Attempted to build package {0} which is not buildable and does not have a satisfying external\n attr('{1}', '{2}', '{3}', '{4}', '{5}') is an external constraint for {0} which was not satisfied", Package, Name, A1, A2, A3, A4)
:- external(node(ID, Package)),
not attr("external_conditions_hold", node(ID, Package), _),
imposed_constraint(EID, "external_conditions_hold", Package, _),
trigger_and_effect(Package, TID, EID),
condition_requirement(TID, Name, A1, A2, A3, A4),
not attr(Name, node(_, A1), A2, A3, A4).
% error message with causes for conflicts
error(0, Msg, startcauses, TriggerID, ID1, ConstraintID, ID2)
:- attr("node", node(ID, Package)),
pkg_fact(Package, conflict(TriggerID, ConstraintID, Msg)),
% node(ID1, TriggerPackage) is node(ID2, Package) in most, but not all, cases
condition_holds(TriggerID, node(ID1, TriggerPackage)),
condition_holds(ConstraintID, node(ID2, Package)),
unification_set(X, node(ID2, Package)),
unification_set(X, node(ID1, TriggerPackage)),
not external(node(ID, Package)), % ignore conflicts for externals
not attr("hash", node(ID, Package), _). % ignore conflicts for installed packages
% variables to show
#show error/2.
#show error/3.
#show error/4.
#show error/5.
#show error/6.
#show error/7.
#show error/8.
#show error/9.
#show error/10.
#show error/11.
#show condition_cause/4.
#show condition_reason/2.
% Define all variables used to avoid warnings at runtime when the model doesn't happen to have one
#defined error/2.
#defined error/3.
#defined error/4.
#defined error/5.
#defined error/6.
#defined attr/2.
#defined attr/3.
#defined attr/4.
#defined attr/5.
#defined pkg_fact/2.
#defined imposed_constraint/3.
#defined imposed_constraint/4.
#defined imposed_constraint/5.
#defined imposed_constraint/6.
#defined condition_requirement/3.
#defined condition_requirement/4.
#defined condition_requirement/5.
#defined condition_requirement/6.
#defined condition_holds/2.
#defined unification_set/2.
#defined external/1.
#defined trigger_and_effect/3.
#defined build/1.
#defined node_has_variant/2.
#defined provider/2.
#defined external_version/3.

View File

@@ -11,19 +11,14 @@
%-----------------
% Domain heuristic
%-----------------
#heuristic attr("hash", node(0, Package), Hash) : literal(_, "root", Package). [45, init]
#heuristic attr("root", node(0, Package)) : literal(_, "root", Package). [45, true]
#heuristic attr("node", node(0, Package)) : literal(_, "root", Package). [45, true]
#heuristic attr("node", node(0, Package)) : literal(_, "node", Package). [45, true]
% Root node
#heuristic attr("version", node(0, Package), Version) : pkg_fact(Package, version_declared(Version, 0)), attr("root", node(0, Package)). [35, true]
#heuristic version_weight(node(0, Package), 0) : pkg_fact(Package, version_declared(Version, 0)), attr("root", node(0, Package)). [35, true]
#heuristic attr("variant_value", node(0, Package), Variant, Value) : variant_default_value(Package, Variant, Value), attr("root", node(0, Package)). [35, true]
#heuristic attr("node_target", node(0, Package), Target) : pkg_fact(Package, target_weight(Target, 0)), attr("root", node(0, Package)). [35, true]
#heuristic attr("node_target", node(0, Package), Target) : target_weight(Target, 0), attr("root", node(0, Package)). [35, true]
#heuristic node_target_weight(node(0, Package), 0) : attr("root", node(0, Package)). [35, true]
#heuristic node_compiler(node(0, Package), CompilerID) : default_compiler_preference(ID, 0), compiler_id(ID), attr("root", node(0, Package)). [35, true]
#heuristic node_compiler(node(0, Package), CompilerID) : compiler_weight(ID, 0), compiler_id(ID), attr("root", node(0, Package)). [35, true]
% Providers
#heuristic attr("node", node(0, Package)) : default_provider_preference(Virtual, Package, 0), possible_in_link_run(Package). [30, true]

View File

@@ -13,7 +13,7 @@
#heuristic attr("variant_value", node(ID, Package), Variant, Value) : variant_default_value(Package, Variant, Value), attr("node", node(ID, Package)), ID > 0. [25-5*ID, true]
#heuristic attr("node_target", node(ID, Package), Target) : pkg_fact(Package, target_weight(Target, 0)), attr("node", node(ID, Package)), ID > 0. [25-5*ID, true]
#heuristic node_target_weight(node(ID, Package), 0) : attr("node", node(ID, Package)), ID > 0. [25-5*ID, true]
#heuristic node_compiler(node(ID, Package), CompilerID) : default_compiler_preference(CompilerID, 0), compiler_id(CompilerID), attr("node", node(ID, Package)), ID > 0. [25-5*ID, true]
#heuristic node_compiler(node(ID, Package), CompilerID) : compiler_weight(CompilerID, 0), compiler_id(CompilerID), attr("node", node(ID, Package)), ID > 0. [25-5*ID, true]
% node(ID, _), split build dependencies
#heuristic attr("version", node(ID, Package), Version) : pkg_fact(Package, version_declared(Version, 0)), attr("node", node(ID, Package)), multiple_unification_sets(Package), ID > 0. [25, true]
@@ -21,4 +21,4 @@
#heuristic attr("variant_value", node(ID, Package), Variant, Value) : variant_default_value(Package, Variant, Value), attr("node", node(ID, Package)), multiple_unification_sets(Package), ID > 0. [25, true]
#heuristic attr("node_target", node(ID, Package), Target) : pkg_fact(Package, target_weight(Target, 0)), attr("node", node(ID, Package)), multiple_unification_sets(Package), ID > 0. [25, true]
#heuristic node_target_weight(node(ID, Package), 0) : attr("node", node(ID, Package)), multiple_unification_sets(Package), ID > 0. [25, true]
#heuristic node_compiler(node(ID, Package), CompilerID) : default_compiler_preference(CompilerID, 0), compiler_id(CompilerID), attr("node", node(ID, Package)), multiple_unification_sets(Package), ID > 0. [25, true]
#heuristic node_compiler(node(ID, Package), CompilerID) : compiler_weight(CompilerID, 0), compiler_id(CompilerID), attr("node", node(ID, Package)), multiple_unification_sets(Package), ID > 0. [25, true]

View File

@@ -0,0 +1,31 @@
% Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
% Spack Project Developers. See the top-level COPYRIGHT file for details.
%
% SPDX-License-Identifier: (Apache-2.0 OR MIT)
%=============================================================================
% TODO: Later
%=============================================================================
% propagate flags when compilers match
attr("node_flag_propagate", PackageNode, FlagType, Flag, Source) :- % source is a node() attribute
attr("node_flag_propagation_candidate", PackageNode, FlagType, Flag, Source),
not attr("node_flag_set", PackageNode, FlagType, _).
attr("node_flag_propagation_candidate", PackageNode, FlagType, Flag, Source) :- % source is a node() attribute
same_compiler(ParentNode, PackageNode),
attr("node_flag_propagation_candidate", ParentNode, FlagType, _, Source),
attr("node_flag", Source, FlagType, Flag),
flag_type(FlagType).
error(100, "{0} and {1} cannot both propagate compiler flags '{2}' to {3}", Source1, Source2, PackageNode, FlagType) :-
same_compiler(Source1, PackageNode),
same_compiler(Source2, PackageNode),
attr("node_flag_propagate", _, FlagType, _, Source1),
attr("node_flag_propagate", _, FlagType, _, Source2),
Source1 < Source2.
attr("node_flag_source", PackageNode, FlagType, Q)
:- attr("node_flag_propagate", PackageNode, FlagType, _, Q).
attr("node_flag", PackageNode, FlagType, Flag) :- attr("node_flag_propagate", PackageNode, FlagType, Flag, _).

View File

@@ -59,7 +59,7 @@
import re
import socket
import warnings
from typing import Any, Callable, Dict, List, Optional, Tuple, Union
from typing import Any, Callable, Dict, List, Optional, Set, Tuple, Union
import llnl.path
import llnl.string
@@ -1464,6 +1464,26 @@ def edges_to_dependencies(self, name=None, depflag: dt.DepFlag = dt.ALL):
"""
return [d for d in self._dependencies.select(child=name, depflag=depflag)]
@property
def edge_attributes(self) -> str:
"""Helper method to print edge attributes in spec literals"""
edges = self.edges_from_dependents()
if not edges:
return ""
union = DependencySpec(parent=Spec(), spec=self, depflag=0, virtuals=())
for edge in edges:
union.update_deptypes(edge.depflag)
union.update_virtuals(edge.virtuals)
deptypes_str = (
f"deptypes={','.join(dt.flag_to_tuple(union.depflag))}" if union.depflag else ""
)
virtuals_str = f"virtuals={','.join(union.virtuals)}" if union.virtuals else ""
if not deptypes_str and not virtuals_str:
return ""
result = f"{deptypes_str} {virtuals_str}".strip()
return f"[{result}]"
def dependencies(self, name=None, deptype: Union[dt.DepTypes, dt.DepFlag] = dt.ALL):
"""Return a list of direct dependencies (nodes in the DAG).
@@ -3688,8 +3708,15 @@ def intersects(self, other: Union[str, "Spec"], deps: bool = True) -> bool:
if other.concrete and self.concrete:
return self.dag_hash() == other.dag_hash()
self_hash = self.dag_hash() if self.concrete else self.abstract_hash
other_hash = other.dag_hash() if other.concrete else other.abstract_hash
elif self.concrete:
return self.satisfies(other)
elif other.concrete:
return other.satisfies(self)
# From here we know both self and other are not concrete
self_hash = self.abstract_hash
other_hash = other.abstract_hash
if (
self_hash
@@ -3778,10 +3805,6 @@ def _intersects_dependencies(self, other):
repository=spack.repo.PATH, specs=other.traverse(), restrict=True
)
# This handles cases where there are already providers for both vpkgs
if not self_index.satisfies(other_index):
return False
# These two loops handle cases where there is an overly restrictive
# vpkg in one spec for a provider in the other (e.g., mpi@3: is not
# compatible with mpich2)
@@ -3879,7 +3902,46 @@ def satisfies(self, other: Union[str, "Spec"], deps: bool = True) -> bool:
return False
# If we arrived here, then rhs is abstract. At the moment we don't care about the edge
# structure of an abstract DAG - hence the deps=False parameter.
# structure of an abstract DAG, so we check if any edge could satisfy the properties
# we ask for.
lhs_edges: Dict[str, Set[DependencySpec]] = collections.defaultdict(set)
for rhs_edge in other.traverse_edges(root=False, cover="edges"):
# If we are checking for ^mpi we need to verify if there is any edge
if rhs_edge.spec.virtual:
rhs_edge.update_virtuals(virtuals=(rhs_edge.spec.name,))
if not rhs_edge.virtuals:
continue
if not lhs_edges:
# Construct a map of the link/run subDAG + direct "build" edges,
# keyed by dependency name
for lhs_edge in self.traverse_edges(
root=False, cover="edges", deptype=("link", "run")
):
lhs_edges[lhs_edge.spec.name].add(lhs_edge)
for virtual_name in lhs_edge.virtuals:
lhs_edges[virtual_name].add(lhs_edge)
build_edges = self.edges_to_dependencies(depflag=dt.BUILD)
for lhs_edge in build_edges:
lhs_edges[lhs_edge.spec.name].add(lhs_edge)
for virtual_name in lhs_edge.virtuals:
lhs_edges[virtual_name].add(lhs_edge)
# We don't have edges to this dependency
current_dependency_name = rhs_edge.spec.name
if current_dependency_name not in lhs_edges:
return False
for virtual in rhs_edge.virtuals:
has_virtual = any(
virtual in edge.virtuals for edge in lhs_edges[current_dependency_name]
)
if not has_virtual:
return False
# Edges have been checked above already, hence deps=False
return all(
any(lhs.satisfies(rhs, deps=False) for lhs in self.traverse(root=False))
for rhs in other.traverse(root=False)
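The edge check amounts to: index every link/run (plus direct build) edge of the LHS by both real and virtual name, then require each RHS edge's virtuals to appear on some matching LHS edge. A simplified standalone version using (name, virtuals) tuples instead of DependencySpec objects:

from collections import defaultdict

def edges_satisfy(lhs_edges, rhs_edges):
    by_name = defaultdict(set)
    for name, virtuals in lhs_edges:     # index by real and virtual names
        by_name[name] |= set(virtuals)
        for v in virtuals:
            by_name[v] |= set(virtuals)
    for name, virtuals in rhs_edges:
        if name not in by_name:          # no edge to this dependency at all
            return False
        if any(v not in by_name[name] for v in virtuals):
            return False
    return True

print(edges_satisfy([("mpich", ("mpi",))], [("mpich", ("mpi",))]))  # True
print(edges_satisfy([("mpich", ())], [("mpich", ("mpi",))]))        # False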
@@ -4081,9 +4143,7 @@ def __getitem__(self, name):
"""
query_parameters = name.split(":")
if len(query_parameters) > 2:
msg = "key has more than one ':' symbol."
msg += " At most one is admitted."
raise KeyError(msg)
raise KeyError("key has more than one ':' symbol. At most one is admitted.")
name, query_parameters = query_parameters[0], query_parameters[1:]
if query_parameters:
@@ -4108,11 +4168,17 @@ def __getitem__(self, name):
itertools.chain(
# Regular specs
(x for x in order() if x.name == name),
(
x
for x in order()
if (not x.virtual)
and any(name in edge.virtuals for edge in x.edges_from_dependents())
),
(x for x in order() if (not x.virtual) and x.package.provides(name)),
)
)
except StopIteration:
raise KeyError("No spec with name %s in %s" % (name, self))
raise KeyError(f"No spec with name {name} in {self}")
if self._concrete:
return SpecBuildInterface(value, name, query_parameters)
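The chained generators give the lookup a precedence order: an exact name match first, then a node whose incoming edges carry the requested virtual, then any package that could provide it. A toy version of that chain over plain dicts (the node shape is invented for illustration):

import itertools

def lookup(nodes, name):
    order = lambda: iter(nodes)
    try:
        return next(itertools.chain(
            (x for x in order() if x["name"] == name),
            (x for x in order() if name in x["edge_virtuals"]),
            (x for x in order() if name in x["provides"]),
        ))
    except StopIteration:
        raise KeyError(f"No spec with name {name}")

nodes = [{"name": "mpich", "edge_virtuals": {"mpi"}, "provides": {"mpi"}}]
print(lookup(nodes, "mpi")["name"])     # mpich, found via the virtual on its edge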
@@ -4490,17 +4556,27 @@ def format_path(
return str(path_ctor(*output_path_components))
def __str__(self):
sorted_nodes = [self] + sorted(
root_str = [self.format()]
sorted_dependencies = sorted(
self.traverse(root=False), key=lambda x: (x.name, x.abstract_hash)
)
return " ^".join(d.format() for d in sorted_nodes).strip()
sorted_dependencies = [
d.format("{edge_attributes} " + DEFAULT_FORMAT) for d in sorted_dependencies
]
spec_str = " ^".join(root_str + sorted_dependencies)
return spec_str.strip()
@property
def colored_str(self):
sorted_nodes = [self] + sorted(
root_str = [self.cformat()]
sorted_dependencies = sorted(
self.traverse(root=False), key=lambda x: (x.name, x.abstract_hash)
)
return " ^".join(d.cformat() for d in sorted_nodes).strip()
sorted_dependencies = [
d.cformat("{edge_attributes} " + DISPLAY_FORMAT) for d in sorted_dependencies
]
spec_str = " ^".join(root_str + sorted_dependencies)
return spec_str.strip()
def install_status(self):
"""Helper for tree to print DB install status."""

View File

@@ -93,8 +93,8 @@ def remove(self, spec):
if (isinstance(s, str) and not s.startswith("$")) and Spec(s) == Spec(spec)
]
if not remove:
msg = "Cannot remove %s from SpecList %s\n" % (spec, self.name)
msg += "Either %s is not in %s or %s is " % (spec, self.name, spec)
msg = f"Cannot remove {spec} from SpecList {self.name}.\n"
msg += f"Either {spec} is not in {self.name} or {spec} is "
msg += "expanded from a matrix and cannot be removed directly."
raise SpecListError(msg)
@@ -133,9 +133,8 @@ def _parse_reference(self, name):
# Make sure the reference is valid
if name not in self._reference:
msg = "SpecList %s refers to " % self.name
msg += "named list %s " % name
msg += "which does not appear in its reference dict"
msg = f"SpecList '{self.name}' refers to named list '{name}'"
msg += " which does not appear in its reference dict."
raise UndefinedReferenceError(msg)
return (name, sigil)

View File

@@ -642,3 +642,28 @@ def test_effective_deptype_run_environment(default_mock_concretization):
for spec, effective_type in spack.build_environment.effective_deptypes(s, context=Context.RUN):
assert effective_type & expected_flags.pop(spec.name) == effective_type
assert not expected_flags, f"Missing {expected_flags.keys()} from effective_deptypes"
def test_monkey_patching_works_across_virtual(default_mock_concretization):
"""Assert that a monkeypatched attribute is found regardless we access through the
real name or the virtual name.
"""
s = default_mock_concretization("mpileaks ^mpich")
s["mpich"].foo = "foo"
assert s["mpich"].foo == "foo"
assert s["mpi"].foo == "foo"
def test_clear_compiler_related_runtime_variables_of_build_deps(default_mock_concretization):
"""Verify that Spack drops CC, CXX, FC and F77 from the dependencies related build environment
variable changes if they are set in setup_run_environment. Spack manages those variables
elsewhere."""
s = default_mock_concretization("build-env-compiler-var-a")
ctx = spack.build_environment.SetupContext(s, context=Context.BUILD)
result = {}
ctx.get_env_modifications().apply_modifications(result)
assert "CC" not in result
assert "CXX" not in result
assert "FC" not in result
assert "F77" not in result
assert result["ANOTHER_VAR"] == "this-should-be-present"

View File

@@ -58,6 +58,24 @@ def test_subcommands():
assert "spack compiler add" in out2
@pytest.mark.not_on_windows("subprocess not supported on Windows")
def test_override_alias():
"""Test that spack commands cannot be overriden by aliases."""
install = spack.main.SpackCommand("install", subprocess=True)
instal = spack.main.SpackCommand("instal", subprocess=True)
out = install(fail_on_error=False, global_args=["-c", "config:aliases:install:find"])
assert "install requires a package argument or active environment" in out
assert "Alias 'install' (mapping to 'find') attempts to override built-in command" in out
out = install(fail_on_error=False, global_args=["-c", "config:aliases:foo bar:find"])
assert "Alias 'foo bar' (mapping to 'find') contains a space, which is not supported" in out
out = instal(fail_on_error=False, global_args=["-c", "config:aliases:instal:find"])
assert "install requires a package argument or active environment" not in out
def test_rst():
"""Do some simple sanity checks of the rst writer."""
out1 = commands("--format=rst")

View File

@@ -4,12 +4,14 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import shutil
import sys
import pytest
import spack.cmd.compiler
import spack.compilers
import spack.main
import spack.spec
import spack.util.pattern
import spack.version
compiler = spack.main.SpackCommand("compiler")
@@ -146,7 +148,7 @@ def test_compiler_add(mutable_config, mock_packages, mock_executable):
compilers_before_find = set(spack.compilers.all_compiler_specs())
args = spack.util.pattern.Bunch(
all=None, compiler_spec=None, add_paths=[str(root_dir)], scope=None
all=None, compiler_spec=None, add_paths=[str(root_dir)], scope=None, mixed_toolchain=False
)
spack.cmd.compiler.compiler_find(args)
compilers_after_find = set(spack.compilers.all_compiler_specs())
@@ -159,10 +161,15 @@ def test_compiler_add(mutable_config, mock_packages, mock_executable):
@pytest.mark.not_on_windows("Cannot execute bash script on Windows")
@pytest.mark.regression("17590")
def test_compiler_find_mixed_suffixes(no_compilers_yaml, working_env, compilers_dir):
@pytest.mark.parametrize("mixed_toolchain", [True, False])
def test_compiler_find_mixed_suffixes(
mixed_toolchain, no_compilers_yaml, working_env, compilers_dir
):
"""Ensure that we'll mix compilers with different suffixes when necessary."""
os.environ["PATH"] = str(compilers_dir)
output = compiler("find", "--scope=site")
output = compiler(
"find", "--scope=site", "--mixed-toolchain" if mixed_toolchain else "--no-mixed-toolchain"
)
assert "clang@11.0.0" in output
assert "gcc@8.4.0" in output
@@ -176,9 +183,8 @@ def test_compiler_find_mixed_suffixes(no_compilers_yaml, working_env, compilers_
assert clang["paths"] == {
"cc": str(compilers_dir / "clang"),
"cxx": str(compilers_dir / "clang++"),
# we only auto-detect mixed clang on macos
"f77": gfortran_path if sys.platform == "darwin" else None,
"fc": gfortran_path if sys.platform == "darwin" else None,
"f77": gfortran_path if mixed_toolchain else None,
"fc": gfortran_path if mixed_toolchain else None,
}
assert gcc["paths"] == {

View File

@@ -215,10 +215,10 @@ def test_config_add_override_leaf(mutable_empty_config):
def test_config_add_update_dict(mutable_empty_config):
config("add", "packages:all:version:[1.0.0]")
config("add", "packages:hdf5:version:[1.0.0]")
output = config("get", "packages")
expected = "packages:\n all:\n version: [1.0.0]\n"
expected = "packages:\n hdf5:\n version: [1.0.0]\n"
assert output == expected
@@ -352,8 +352,7 @@ def test_config_add_update_dict_from_file(mutable_empty_config, tmpdir):
contents = """spack:
packages:
all:
version:
- 1.0.0
target: [x86_64]
"""
# create temp file and add it to config
@@ -368,8 +367,7 @@ def test_config_add_update_dict_from_file(mutable_empty_config, tmpdir):
# added config comes before prior config
expected = """packages:
all:
version:
- 1.0.0
target: [x86_64]
compiler: [gcc]
"""
@@ -381,7 +379,7 @@ def test_config_add_invalid_file_fails(tmpdir):
# invalid because version requires a list
contents = """spack:
packages:
all:
hdf5:
version: 1.0.0
"""
@@ -631,14 +629,11 @@ def test_config_prefer_upstream(
packages = syaml.load(open(cfg_file))["packages"]
# Make sure only the non-default variants are set.
assert packages["boost"] == {
"compiler": ["gcc@=10.2.1"],
"variants": "+debug +graph",
"version": ["1.63.0"],
}
assert packages["dependency-install"] == {"compiler": ["gcc@=10.2.1"], "version": ["2.0"]}
assert packages["all"] == {"compiler": ["gcc@=10.2.1"]}
assert packages["boost"] == {"variants": "+debug +graph", "version": ["1.63.0"]}
assert packages["dependency-install"] == {"version": ["2.0"]}
# Ensure that neither variant gets listed for hdf5, since they conflict
assert packages["hdf5"] == {"compiler": ["gcc@=10.2.1"], "version": ["2.3"]}
assert packages["hdf5"] == {"version": ["2.3"]}
# Make sure a message about the conflicting hdf5's was given.
assert "- hdf5" in output

View File

@@ -14,7 +14,14 @@
dependencies = SpackCommand("dependencies")
mpis = ["low-priority-provider", "mpich", "mpich2", "multi-provider-mpi", "zmpi"]
mpis = [
"intel-parallel-studio",
"low-priority-provider",
"mpich",
"mpich2",
"multi-provider-mpi",
"zmpi",
]
mpi_deps = ["fake"]

View File

@@ -163,8 +163,15 @@ def test_dev_build_fails_multiple_specs(mock_packages):
def test_dev_build_fails_nonexistent_package_name(mock_packages):
output = dev_build("no_such_package", fail_on_error=False)
assert "No package for 'no_such_package' was found" in output
output = ""
try:
dev_build("no_such_package")
assert False, "no exception was raised!"
except spack.repo.UnknownPackageError as e:
output = e.message
assert "Package 'no_such_package' not found" in output
def test_dev_build_fails_no_version(mock_packages):

View File

@@ -14,6 +14,7 @@
import llnl.util.filesystem as fs
import llnl.util.link_tree
import llnl.util.tty as tty
import spack.cmd.env
import spack.config
@@ -631,7 +632,7 @@ def test_env_view_external_prefix(tmp_path, mutable_database, mock_packages):
manifest_dir.mkdir(parents=True, exist_ok=False)
manifest_file = manifest_dir / ev.manifest_name
manifest_file.write_text(
"""
"""\
spack:
specs:
- a
@@ -719,38 +720,25 @@ def test_env_with_config(environment_from_manifest):
def test_with_config_bad_include(environment_from_manifest):
"""Confirm missing include paths raise expected exception and error."""
e = environment_from_manifest(
"""
with pytest.raises(spack.config.ConfigFileError, match="2 missing include path"):
e = environment_from_manifest(
"""
spack:
include:
- /no/such/directory
- no/such/file.yaml
"""
)
with pytest.raises(spack.config.ConfigFileError, match="2 missing include path"):
)
with e:
e.concretize()
assert ev.active_environment() is None
def test_env_with_include_config_files_same_basename(environment_from_manifest):
e = environment_from_manifest(
"""
spack:
include:
- ./path/to/included-config.yaml
- ./second/path/to/include-config.yaml
specs:
- libelf
- mpileaks
"""
)
e = ev.read("test")
fs.mkdirp(os.path.join(e.path, "path", "to"))
with open(os.path.join(e.path, "./path/to/included-config.yaml"), "w") as f:
def test_env_with_include_config_files_same_basename(tmp_path, environment_from_manifest):
file1 = fs.join_path(tmp_path, "path", "to", "included-config.yaml")
fs.mkdirp(os.path.dirname(file1))
with open(file1, "w") as f:
f.write(
"""\
packages:
@@ -759,8 +747,9 @@ def test_env_with_include_config_files_same_basename(environment_from_manifest):
"""
)
fs.mkdirp(os.path.join(e.path, "second", "path", "to"))
with open(os.path.join(e.path, "./second/path/to/include-config.yaml"), "w") as f:
file2 = fs.join_path(tmp_path, "second", "path", "included-config.yaml")
fs.mkdirp(os.path.dirname(file2))
with open(file2, "w") as f:
f.write(
"""\
packages:
@@ -769,6 +758,18 @@ def test_env_with_include_config_files_same_basename(environment_from_manifest):
"""
)
e = environment_from_manifest(
f"""
spack:
include:
- {file1}
- {file2}
specs:
- libelf
- mpileaks
"""
)
with e:
e.concretize()
@@ -805,12 +806,18 @@ def mpileaks_env_config(include_path):
)
def test_env_with_included_config_file(environment_from_manifest, packages_file):
def test_env_with_included_config_file(mutable_mock_env_path, packages_file):
"""Test inclusion of a relative packages configuration file added to an
existing environment.
"""
env_root = mutable_mock_env_path
fs.mkdirp(env_root)
include_filename = "included-config.yaml"
e = environment_from_manifest(
included_path = env_root / include_filename
shutil.move(packages_file.strpath, included_path)
spack_yaml = env_root / ev.manifest_name
spack_yaml.write_text(
f"""\
spack:
include:
@@ -820,9 +827,7 @@ def test_env_with_included_config_file(environment_from_manifest, packages_file)
"""
)
included_path = os.path.join(e.path, include_filename)
shutil.move(packages_file.strpath, included_path)
e = ev.Environment(env_root)
with e:
e.concretize()
@@ -855,68 +860,67 @@ def test_env_with_included_config_missing_file(tmpdir, mutable_empty_config):
with spack_yaml.open("w") as f:
f.write("spack:\n include:\n - {0}\n".format(missing_file.strpath))
env = ev.Environment(tmpdir.strpath)
with pytest.raises(spack.config.ConfigError, match="missing include path"):
ev.activate(env)
ev.Environment(tmpdir.strpath)
def test_env_with_included_config_scope(environment_from_manifest, packages_file):
def test_env_with_included_config_scope(mutable_mock_env_path, packages_file):
"""Test inclusion of a package file from the environment's configuration
stage directory. This test is intended to represent a case where a remote
file has already been staged."""
config_scope_path = os.path.join(ev.root("test"), "config")
# Configure the environment to include file(s) from the environment's
# remote configuration stage directory.
e = environment_from_manifest(mpileaks_env_config(config_scope_path))
env_root = mutable_mock_env_path
config_scope_path = env_root / "config"
# Copy the packages.yaml file to the environment configuration
# directory, so it is picked up during concretization. (Using
# copy instead of rename in case the fixture scope changes.)
fs.mkdirp(config_scope_path)
include_filename = os.path.basename(packages_file.strpath)
included_path = os.path.join(config_scope_path, include_filename)
included_path = config_scope_path / include_filename
fs.copy(packages_file.strpath, included_path)
# Configure the environment to include file(s) from the environment's
# remote configuration stage directory.
spack_yaml = env_root / ev.manifest_name
spack_yaml.write_text(mpileaks_env_config(config_scope_path))
# Ensure the concretized environment reflects contents of the
# packages.yaml file.
e = ev.Environment(env_root)
with e:
e.concretize()
assert any(x.satisfies("mpileaks@2.2") for x in e._get_environment_specs())
def test_env_with_included_config_var_path(environment_from_manifest, packages_file):
def test_env_with_included_config_var_path(tmpdir, packages_file):
"""Test inclusion of a package configuration file with path variables
"staged" in the environment's configuration stage directory."""
config_var_path = os.path.join("$tempdir", "included-config.yaml")
e = environment_from_manifest(mpileaks_env_config(config_var_path))
included_file = packages_file.strpath
env_path = pathlib.PosixPath(tmpdir)
config_var_path = os.path.join("$tempdir", "included-packages.yaml")
spack_yaml = env_path / ev.manifest_name
spack_yaml.write_text(mpileaks_env_config(config_var_path))
config_real_path = substitute_path_variables(config_var_path)
fs.mkdirp(os.path.dirname(config_real_path))
shutil.move(packages_file.strpath, config_real_path)
shutil.move(included_file, config_real_path)
assert os.path.exists(config_real_path)
e = ev.Environment(env_path)
with e:
e.concretize()
assert any(x.satisfies("mpileaks@2.2") for x in e._get_environment_specs())
def test_env_config_precedence(environment_from_manifest):
e = environment_from_manifest(
"""
spack:
packages:
libelf:
version: ["0.8.12"]
include:
- ./included-config.yaml
specs:
- mpileaks
"""
)
with open(os.path.join(e.path, "included-config.yaml"), "w") as f:
def test_env_with_included_config_precedence(tmp_path):
"""Test included scope and manifest precedence when including a package
configuration file."""
included_file = "included-packages.yaml"
included_path = tmp_path / included_file
with open(included_path, "w") as f:
f.write(
"""\
packages:
@@ -927,29 +931,50 @@ def test_env_config_precedence(environment_from_manifest):
"""
)
with e:
e.concretize()
# ensure included scope took effect
assert any(x.satisfies("mpileaks@2.2") for x in e._get_environment_specs())
# ensure env file takes precedence
assert any(x.satisfies("libelf@0.8.12") for x in e._get_environment_specs())
def test_included_config_precedence(environment_from_manifest):
e = environment_from_manifest(
"""
spack_yaml = tmp_path / ev.manifest_name
spack_yaml.write_text(
f"""\
spack:
packages:
libelf:
version: ["0.8.12"]
include:
- ./high-config.yaml # this one should take precedence
- ./low-config.yaml
- {os.path.join(".", included_file)}
specs:
- mpileaks
"""
)
with open(os.path.join(e.path, "high-config.yaml"), "w") as f:
e = ev.Environment(tmp_path)
with e:
e.concretize()
specs = e._get_environment_specs()
# ensure included scope took effect
assert any(x.satisfies("mpileaks@2.2") for x in specs)
# ensure env file takes precedence
assert any(x.satisfies("libelf@0.8.12") for x in specs)
def test_env_with_included_configs_precedence(tmp_path):
"""Test precendence of multiple included configuration files."""
file1 = "high-config.yaml"
file2 = "low-config.yaml"
spack_yaml = tmp_path / ev.manifest_name
spack_yaml.write_text(
f"""\
spack:
include:
- {os.path.join(".", file1)} # this one should take precedence
- {os.path.join(".", file2)}
specs:
- mpileaks
"""
)
with open(tmp_path / file1, "w") as f:
f.write(
"""\
packages:
@@ -958,7 +983,7 @@ def test_included_config_precedence(environment_from_manifest):
"""
)
with open(os.path.join(e.path, "low-config.yaml"), "w") as f:
with open(tmp_path / file2, "w") as f:
f.write(
"""\
packages:
@@ -969,38 +994,23 @@ def test_included_config_precedence(environment_from_manifest):
"""
)
e = ev.Environment(tmp_path)
with e:
e.concretize()
specs = e._get_environment_specs()
assert any(x.satisfies("mpileaks@2.2") for x in e._get_environment_specs())
# ensure included package spec took precedence over manifest spec
assert any(x.satisfies("mpileaks@2.2") for x in specs)
assert any([x.satisfies("libelf@0.8.10") for x in e._get_environment_specs()])
# ensure first included package spec took precedence over one from second
assert any(x.satisfies("libelf@0.8.10") for x in specs)
def test_bad_env_yaml_format(tmpdir):
filename = str(tmpdir.join("spack.yaml"))
with open(filename, "w") as f:
f.write(
"""\
spack:
spacks:
- mpileaks
"""
)
with tmpdir.as_cwd():
with pytest.raises(spack.config.ConfigFormatError) as e:
env("create", "test", "./spack.yaml")
assert "'spacks' was unexpected" in str(e)
assert "test" not in env("list")
def test_bad_env_yaml_format_remove():
@pytest.mark.regression("39248")
def test_bad_env_yaml_format_remove(mutable_mock_env_path):
badenv = "badenv"
env("create", badenv)
tmpdir = spack.environment.environment.environment_dir_from_name(badenv, exists_ok=True)
filename = os.path.join(tmpdir, "spack.yaml")
filename = mutable_mock_env_path / "spack.yaml"
with open(filename, "w") as f:
f.write(
"""\
@@ -1013,6 +1023,88 @@ def test_bad_env_yaml_format_remove():
assert badenv not in env("list")
@pytest.mark.regression("39248")
@pytest.mark.parametrize(
"error,message,contents",
[
(
spack.config.ConfigFormatError,
"not of type",
"""\
spack:
specs: mpi@2.0
""",
),
(
ev.SpackEnvironmentConfigError,
"duplicate key",
"""\
spack:
packages:
all:
providers:
mpi: [mvapich2]
mpi: [mpich]
""",
),
(
spack.config.ConfigFormatError,
"'specks' was unexpected",
"""\
spack:
specks:
- libdwarf
""",
),
],
)
def test_bad_env_yaml_create_fails(tmp_path, mutable_mock_env_path, error, message, contents):
"""Ensure creation with invalid yaml does NOT create or leave the environment."""
filename = tmp_path / ev.manifest_name
filename.write_text(contents)
env_name = "bad_env"
with pytest.raises(error, match=message):
env("create", env_name, str(filename))
assert env_name not in env("list")
manifest = mutable_mock_env_path / env_name / ev.manifest_name
assert not os.path.exists(str(manifest))
@pytest.mark.regression("39248")
@pytest.mark.parametrize("answer", ["-y", ""])
def test_multi_env_remove(mutable_mock_env_path, monkeypatch, answer):
"""Test removal (or not) of a valid and invalid environment"""
remove_environment = answer == "-y"
monkeypatch.setattr(tty, "get_yes_or_no", lambda prompt, default: remove_environment)
environments = ["goodenv", "badenv"]
for e in environments:
env("create", e)
# Ensure the bad environment contains invalid yaml
filename = mutable_mock_env_path / environments[1] / ev.manifest_name
filename.write_text(
"""\
- libdwarf
"""
)
assert all(e in env("list") for e in environments)
args = [answer] if answer else []
args.extend(environments)
output = env("remove", *args, fail_on_error=False)
if remove_environment is True:
# Successfully removed (and reported removal) of *both* environments
assert not all(e in env("list") for e in environments)
assert output.count("Successfully removed") == len(environments)
else:
# Not removing any of the environments
assert all(e in env("list") for e in environments)
def test_env_loads(install_mockery, mock_fetch):
env("create", "test")
@@ -1549,11 +1641,10 @@ def test_stack_yaml_remove_from_list(tmpdir):
assert Spec("callpath") in test.user_specs
def test_stack_yaml_remove_from_list_force(tmp_path):
spack_yaml = tmp_path / ev.manifest_name
spack_yaml.write_text(
"""\
spack:
definitions:
- packages: [mpileaks, callpath]
@@ -1562,20 +1653,20 @@ def test_stack_yaml_remove_from_list_force(tmpdir):
- [$packages]
- [^mpich, ^zmpi]
"""
)
env("create", "test", str(spack_yaml))
with ev.read("test"):
concretize()
remove("-f", "-l", "packages", "mpileaks")
find_output = find("-c")
assert "mpileaks" not in find_output
test = ev.read("test")
assert len(test.user_specs) == 2
assert Spec("callpath ^zmpi") in test.user_specs
assert Spec("callpath ^mpich") in test.user_specs
def test_stack_yaml_remove_from_matrix_no_effect(tmpdir):
@@ -1621,7 +1712,7 @@ def test_stack_yaml_force_remove_from_matrix(tmpdir):
with tmpdir.as_cwd():
env("create", "test", "./spack.yaml")
with ev.read("test") as e:
e.concretize()
before_user = e.user_specs.specs
before_conc = e.concretized_user_specs
@@ -2461,8 +2552,12 @@ def test_concretize_user_specs_together():
e.remove("mpich")
e.add("mpich2")
exc_cls = spack.error.SpackError
if spack.config.get("config:concretizer") == "clingo":
exc_cls = spack.error.UnsatisfiableSpecError
# Concretizing without invalidating the concrete spec for mpileaks fails
with pytest.raises(exc_cls):
e.concretize()
e.concretize(force=True)
@@ -2494,9 +2589,12 @@ def test_duplicate_packages_raise_when_concretizing_together():
e.add("mpileaks~opt")
e.add("mpich")
exc_cls, match = spack.error.SpackError, None
if spack.config.get("config:concretizer") == "clingo":
exc_cls = spack.error.UnsatisfiableSpecError
match = r"You could consider setting `concretizer:unify`"
with pytest.raises(exc_cls, match=match):
e.concretize()
@@ -2523,7 +2621,7 @@ def test_env_write_only_non_default_nested(tmpdir):
- matrix:
- [mpileaks]
packages:
all:
compiler: [gcc]
view: true
"""
@@ -2861,6 +2959,25 @@ def test_activate_temp(monkeypatch, tmpdir):
assert ev.is_env_dir(str(tmpdir))
def test_activate_default(monkeypatch):
"""Tests whether `spack env activate` creates / activates the default
environment"""
assert not ev.exists("default")
# Activating it the first time should create it
env("activate", "--sh")
env("deactivate", "--sh")
assert ev.exists("default")
# Activating it while it already exists should work
env("activate", "--sh")
env("deactivate", "--sh")
assert ev.exists("default")
env("remove", "-y", "default")
assert not ev.exists("default")
def test_env_view_fail_if_symlink_points_elsewhere(tmpdir, install_mockery, mock_fetch):
view = str(tmpdir.join("view"))
# Put a symlink to an actual directory in view
@@ -3346,6 +3463,20 @@ def test_spack_package_ids_variable(tmpdir, mock_packages):
assert "post-install: {}".format(s.dag_hash()) in out
def test_depfile_empty_does_not_error(tmp_path):
# For empty environments Spack should create a depfile that does nothing
make = Executable("make")
makefile = str(tmp_path / "Makefile")
env("create", "test")
with ev.read("test"):
env("depfile", "-o", makefile)
make("-f", makefile)
assert make.returncode == 0
def test_unify_when_possible_works_around_conflicts():
e = ev.create("coconcretization")
e.unify = "when_possible"


@@ -349,11 +349,25 @@ def test_compiler_flags_differ_identical_compilers(self):
spec.concretize()
assert spec.satisfies("cflags=-O2")
@pytest.mark.only_clingo(
"Optional compiler propagation isn't deprecated for original concretizer"
)
def test_concretize_compiler_flag_propagate(self):
spec = Spec("hypre cflags=='-g' ^openblas")
spec = Spec("callpath cflags=='-g'")
spec.concretize()
assert spec.satisfies("^openblas cflags='-g'")
for dep in spec.traverse():
assert dep.satisfies("cflags='-g'")
@pytest.mark.only_clingo(
"Optional compiler propagation isn't deprecated for original concretizer"
)
def test_concretize_non_root_compiler_flag_propagate(self):
spec = Spec("callpath ^dyninst cflags=='-g'")
spec.concretize()
assert spec.satisfies("^libdwarf cflags='-g'")
assert spec.satisfies("^libelf cflags='-g'")
@pytest.mark.only_clingo(
"Optional compiler propagation isn't deprecated for original concretizer"
@@ -362,17 +376,30 @@ def test_concretize_compiler_flag_does_not_propagate(self):
spec = Spec("hypre cflags='-g' ^openblas")
spec.concretize()
for dep in spec.traverse(root=False):
assert not dep.satisfies("cflags='-g'")
@pytest.mark.only_clingo(
"Optional compiler propagation isn't deprecated for original concretizer"
)
def test_concretize_propagate_compiler_flag_not_passed_to_dependent(self):
spec = Spec("hypre cflags=='-g' ^openblas cflags='-O3'")
spec = Spec("callpath cflags=='-g' ^dyninst cflags='-O3'")
spec.concretize()
assert set(spec.compiler_flags["cflags"]) == set(["-g"])
assert spec.satisfies("^openblas cflags='-O3'")
assert spec.satisfies("^dyninst cflags='-O3'")
assert spec.satisfies("^libelf cflags='-g'")
@pytest.mark.only_clingo(
"Optional compiler propagation isn't deprecated for original concretizer"
)
def test_concretize_propagate_specified_compiler_flag(self):
spec = Spec("callpath cflags=='-g' cxxflags='-O3'")
spec.concretize()
for dep in spec.traverse(root=False):
assert dep.satisfies("cflags='-g'")
assert not dep.satisfies("cxxflags='-O3'")
def test_mixing_compilers_only_affects_subdag(self):
spack.config.set("packages:all:compiler", ["clang", "gcc"])
@@ -458,19 +485,66 @@ def test_concretize_two_virtuals_with_dual_provider_and_a_conflict(self):
@pytest.mark.only_clingo(
"Optional compiler propagation isn't deprecated for original concretizer"
)
@pytest.mark.parametrize(
"spec_str,expected_propagation",
[
("hypre~~shared ^openblas+shared", [("hypre", "~shared"), ("openblas", "+shared")]),
# Propagates past a node that doesn't have the variant
("hypre~~shared ^openblas", [("hypre", "~shared"), ("openblas", "~shared")]),
(
"ascent~~shared +adios2",
[("ascent", "~shared"), ("adios2", "~shared"), ("bzip2", "~shared")],
),
# Propagates below a node that uses the other value explicitly
(
"ascent~~shared +adios2 ^adios2+shared",
[("ascent", "~shared"), ("adios2", "+shared"), ("bzip2", "~shared")],
),
(
"ascent++shared +adios2 ^adios2~shared",
[("ascent", "+shared"), ("adios2", "~shared"), ("bzip2", "+shared")],
),
],
)
def test_concretize_propagate_disabled_variant(self, spec_str, expected_propagation):
"""Tests various patterns of boolean variant propagation"""
spec = Spec(spec_str).concretized()
for key, expected_satisfies in expected_propagation:
assert spec[key].satisfies(expected_satisfies)
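A minimal sketch of the propagation syntax exercised above, assuming the mock repository packages (hypre, openblas) used throughout these tests:

from spack.spec import Spec

# "~~shared" requests propagation of the disabled value to every descendant
# that defines a "shared" variant, not just to direct dependencies.
spec = Spec("hypre~~shared ^openblas").concretized()
assert spec.satisfies("~shared")
assert spec["openblas"].satisfies("~shared")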
@pytest.mark.only_clingo(
"Optional compiler propagation isn't deprecated for original concretizer"
)
def test_concretize_propagated_variant_is_not_passed_to_dependent(self):
"""Test a package variant value was passed from its parent."""
spec = Spec("hypre~~shared ^openblas+shared")
spec = Spec("ascent~~shared +adios2 ^adios2+shared")
spec.concretize()
assert spec.satisfies("^openblas+shared")
assert spec.satisfies("^adios2+shared")
assert spec.satisfies("^bzip2~shared")
@pytest.mark.only_clingo(
"Optional compiler propagation isn't deprecated for original concretizer"
)
def test_concretize_propagate_specified_variant(self):
"""Test that only the specified variant is propagated to the dependencies"""
spec = Spec("parent-foo-bar ~~foo")
spec.concretize()
assert spec.satisfies("~foo") and spec.satisfies("^dependency-foo-bar~foo")
assert spec.satisfies("+bar") and not spec.satisfies("^dependency-foo-bar+bar")
@pytest.mark.only_clingo("Original concretizer is allowed to forego variant propagation")
def test_concretize_propagate_multivalue_variant(self):
"""Test that multivalue variants are propagating the specified value(s)
to their dependecies. The dependencies should not have the default value"""
spec = Spec("multivalue-variant foo==baz,fee")
spec.concretize()
assert spec.satisfies("^a foo=baz,fee")
assert spec.satisfies("^b foo=baz,fee")
assert not spec.satisfies("^a foo=bar")
assert not spec.satisfies("^b foo=bar")
def test_no_matching_compiler_specs(self, mock_low_high_config):
# only relevant when not building compilers as needed
@@ -1838,7 +1912,8 @@ def test_installed_specs_disregard_conflicts(self, mutable_database, monkeypatch
# If we concretize with --reuse it is not, since "mpich~debug" was already installed
with spack.config.override("concretizer:reuse", True):
s = Spec("mpich").concretized()
assert s.satisfies("~debug")
assert s.installed
assert s.satisfies("~debug"), s
@pytest.mark.regression("32471")
@pytest.mark.only_clingo("Use case not supported by the original concretizer")
@@ -2132,14 +2207,16 @@ def test_reuse_python_from_cli_and_extension_from_db(self, mutable_database):
@pytest.fixture()
def duplicates_test_repository():
repository_path = os.path.join(spack.paths.repos_path, "duplicates.test")
with spack.repo.use_repositories(repository_path) as mock_repo:
yield mock_repo
@pytest.mark.usefixtures("mutable_config", "duplicates_test_repository")
@pytest.mark.only_clingo("Not supported by the original concretizer")
class TestConcretizeSeparately:
"""Collects test on separate concretization"""
@pytest.mark.parametrize("strategy", ["minimal", "full"])
def test_two_gmake(self, strategy):
"""Tests that we can concretize a spec with nodes using the same build
@@ -2320,3 +2397,53 @@ def test_adding_specs(self, input_specs, default_mock_concretization):
assert node == container[node.dag_hash()]
assert node.dag_hash() in container
assert node is not container[node.dag_hash()]
@pytest.fixture()
def edges_test_repository():
repository_path = os.path.join(spack.paths.repos_path, "edges.test")
with spack.repo.use_repositories(repository_path) as mock_repo:
yield mock_repo
@pytest.mark.usefixtures("mutable_config", "edges_test_repository")
@pytest.mark.only_clingo("Edge properties not supported by the original concretizer")
class TestConcretizeEdges:
"""Collects tests on edge properties"""
@pytest.mark.parametrize(
"spec_str,expected_satisfies,expected_not_satisfies",
[
("conditional-edge", ["^zlib@2.0"], ["^zlib-api"]),
("conditional-edge~foo", ["^zlib@2.0"], ["^zlib-api"]),
(
"conditional-edge+foo",
["^zlib@1.0", "^zlib-api", "^[virtuals=zlib-api] zlib"],
["^[virtuals=mpi] zlib"],
),
],
)
def test_condition_triggered_by_edge_property(
self, spec_str, expected_satisfies, expected_not_satisfies
):
"""Tests that we can enforce constraints based on edge attributes"""
s = Spec(spec_str).concretized()
for expected in expected_satisfies:
assert s.satisfies(expected), str(expected)
for not_expected in expected_not_satisfies:
assert not s.satisfies(not_expected), str(not_expected)
def test_virtuals_provided_together_but_only_one_required_in_dag(self):
"""Tests that we can use a provider that provides more than one virtual together,
and is providing only one, iff the others are not needed in the DAG.
o blas-only-client
| [virtual=blas]
o openblas (provides blas and lapack together)
"""
s = Spec("blas-only-client ^openblas").concretized()
assert s.satisfies("^[virtuals=blas] openblas")
assert not s.satisfies("^[virtuals=blas,lapack] openblas")


@@ -0,0 +1,68 @@
# Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import pytest
import spack.solver.asp
import spack.spec
pytestmark = [
pytest.mark.not_on_windows("Windows uses old concretizer"),
pytest.mark.only_clingo("Original concretizer does not support configuration requirements"),
]
version_error_messages = [
"Cannot satisfy 'fftw@:1.0' and 'fftw@1.1:",
" required because quantum-espresso depends on fftw@:1.0",
" required because quantum-espresso ^fftw@1.1: requested from CLI",
" required because quantum-espresso ^fftw@1.1: requested from CLI",
]
external_error_messages = [
(
"Attempted to build package quantum-espresso which is not buildable and does not have"
" a satisfying external"
),
(
" 'quantum-espresso~veritas' is an external constraint for quantum-espresso"
" which was not satisfied"
),
" 'quantum-espresso+veritas' required",
" required because quantum-espresso+veritas requested from CLI",
]
variant_error_messages = [
"'fftw' required multiple values for single-valued variant 'mpi'",
" Requested '~mpi' and '+mpi'",
" required because quantum-espresso depends on fftw+mpi when +invino",
" required because quantum-espresso+invino ^fftw~mpi requested from CLI",
" required because quantum-espresso+invino ^fftw~mpi requested from CLI",
]
external_config = {
"packages:quantum-espresso": {
"buildable": False,
"externals": [{"spec": "quantum-espresso@1.0~veritas", "prefix": "/path/to/qe"}],
}
}
@pytest.mark.parametrize(
"error_messages,config_set,spec",
[
(version_error_messages, {}, "quantum-espresso^fftw@1.1:"),
(external_error_messages, external_config, "quantum-espresso+veritas"),
(variant_error_messages, {}, "quantum-espresso+invino^fftw~mpi"),
],
)
def test_error_messages(error_messages, config_set, spec, mock_packages, mutable_config):
for path, conf in config_set.items():
spack.config.set(path, conf)
with pytest.raises(spack.solver.asp.UnsatisfiableSpecError) as e:
_ = spack.spec.Spec(spec).concretized()
for em in error_messages:
assert em in str(e.value)
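A hedged sketch of how these messages reach a caller (quantum-espresso and fftw are mock packages; the exception type matches the one asserted above):

import spack.solver.asp
import spack.spec

try:
    spack.spec.Spec("quantum-espresso ^fftw@1.1:").concretized()
except spack.solver.asp.UnsatisfiableSpecError as e:
    print(str(e))  # carries the "required because ..." chain asserted above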


@@ -105,17 +105,13 @@ def test_preferred_variants_from_wildcard(self):
@pytest.mark.parametrize(
"compiler_str,spec_str",
[("gcc@4.5.0", "mpileaks"), ("clang@12.0.0", "mpileaks"), ("gcc@4.5.0", "openmpi")],
[("gcc@=4.5.0", "mpileaks"), ("clang@=12.0.0", "mpileaks"), ("gcc@=4.5.0", "openmpi")],
)
def test_preferred_compilers(self, compiler_str, spec_str):
"""Test preferred compilers are applied correctly"""
update_packages("all", "compiler", [compiler_str])
spec = spack.spec.Spec(spec_str).concretized()
assert spec.compiler == CompilerSpec(compiler_str)
@pytest.mark.only_clingo("Use case not supported by the original concretizer")
def test_preferred_target(self, mutable_mock_repo):
@@ -124,7 +120,7 @@ def test_preferred_target(self, mutable_mock_repo):
default = str(spec.target)
preferred = str(spec.target.family)
update_packages("mpich", "target", [preferred])
update_packages("all", "target", [preferred])
spec = concretize("mpich")
assert str(spec.target) == preferred
@@ -132,7 +128,7 @@ def test_preferred_target(self, mutable_mock_repo):
assert str(spec["mpileaks"].target) == preferred
assert str(spec["mpich"].target) == preferred
update_packages("mpileaks", "target", [default])
update_packages("all", "target", [default])
spec = concretize("mpileaks")
assert str(spec["mpileaks"].target) == default
assert str(spec["mpich"].target) == default


@@ -78,7 +78,7 @@ def env_yaml(tmpdir):
verify_ssl: False
dirty: False
packages:
all:
compiler: [ 'gcc@4.5.3' ]
repos:
- /x/y/z
@@ -942,7 +942,7 @@ def test_single_file_scope(config, env_yaml):
# from the single-file config
assert spack.config.get("config:verify_ssl") is False
assert spack.config.get("config:dirty") is False
assert spack.config.get("packages:libelf:compiler") == ["gcc@4.5.3"]
assert spack.config.get("packages:all:compiler") == ["gcc@4.5.3"]
# from the lower config scopes
assert spack.config.get("config:checksum") is True
@@ -965,7 +965,7 @@ def test_single_file_scope_section_override(tmpdir, config):
config:
verify_ssl: False
packages::
all:
compiler: [ 'gcc@4.5.3' ]
repos:
- /x/y/z
@@ -977,7 +977,7 @@ def test_single_file_scope_section_override(tmpdir, config):
with spack.config.override(scope):
# from the single-file config
assert spack.config.get("config:verify_ssl") is False
assert spack.config.get("packages:libelf:compiler") == ["gcc@4.5.3"]
assert spack.config.get("packages:all:compiler") == ["gcc@4.5.3"]
# from the lower config scopes
assert spack.config.get("config:checksum") is True


@@ -2,6 +2,7 @@ enable:
- lmod
lmod:
hide_implicits: true
hash_length: 0
core_compilers:
- 'clang@3.3'
hierarchy:


@@ -4,5 +4,6 @@ enable:
- tcl
tcl:
exclude_implicits: true
hash_length: 0
all:
autoload: direct


@@ -2,5 +2,6 @@ enable:
- tcl
tcl:
hide_implicits: true
hash_length: 0
all:
autoload: direct


@@ -803,6 +803,14 @@ def test_query_spec_with_non_conditional_virtual_dependency(database):
assert len(results) == 1
def test_query_virtual_spec(database):
"""Make sure we can query for virtuals in the DB"""
results = spack.store.STORE.db.query_local("mpi")
assert len(results) == 3
names = [s.name for s in results]
assert all(name in names for name in ["mpich", "mpich2", "zmpi"])
def test_failed_spec_path_error(database):
"""Ensure spec not concrete check is covered."""
s = spack.spec.Spec("a")


@@ -18,6 +18,7 @@
SpackEnvironmentViewError,
_error_on_nonempty_view_dir,
)
from spack.spec_list import UndefinedReferenceError
pytestmark = pytest.mark.not_on_windows("Envs are not supported on windows")
@@ -716,3 +717,64 @@ def test_variant_propagation_with_unify_false(tmp_path, mock_packages):
root = env.matching_spec("parent-foo")
for node in root.traverse():
assert node.satisfies("+foo")
def test_env_with_include_defs(mutable_mock_env_path, mock_packages):
"""Test environment with included definitions file."""
env_path = mutable_mock_env_path
env_path.mkdir()
defs_file = env_path / "definitions.yaml"
defs_file.write_text(
"""definitions:
- core_specs: [libdwarf, libelf]
- compilers: ['%gcc']
"""
)
spack_yaml = env_path / ev.manifest_name
spack_yaml.write_text(
f"""spack:
include:
- file://{defs_file}
definitions:
- my_packages: [zlib]
specs:
- matrix:
- [$core_specs]
- [$compilers]
- $my_packages
"""
)
e = ev.Environment(env_path)
with e:
e.concretize()
def test_env_with_include_def_missing(mutable_mock_env_path, mock_packages):
"""Test environment with included definitions file that is missing a definition."""
env_path = mutable_mock_env_path
env_path.mkdir()
filename = "missing-def.yaml"
defs_file = env_path / filename
defs_file.write_text("definitions:\n- my_compilers: ['%gcc']\n")
spack_yaml = env_path / ev.manifest_name
spack_yaml.write_text(
f"""spack:
include:
- file://{defs_file}
specs:
- matrix:
- [$core_specs]
- [$my_compilers]
"""
)
e = ev.Environment(env_path)
with e:
with pytest.raises(UndefinedReferenceError, match=r"which does not appear"):
e.concretize()


@@ -435,7 +435,7 @@ def test_modules_no_arch(self, factory, module_configuration):
assert str(spec.os) not in path
def test_hide_implicits(self, module_configuration, temporary_store):
"""Tests the addition and removal of hide command in modulerc."""
module_configuration("hide_implicits")
@@ -446,29 +446,42 @@ def test_hide_implicits(self, module_configuration):
writer.write()
assert os.path.exists(writer.layout.modulerc)
with open(writer.layout.modulerc) as f:
content = [line.strip() for line in f.readlines()]
hide_implicit_mpileaks = f'hide_version("{writer.layout.use_name}")'
assert len([x for x in content if hide_implicit_mpileaks == x]) == 1
# The direct dependencies are all implicitly installed, and they should all be hidden,
# except for mpich, which is the provider for mpi, which is in the hierarchy, and therefore
# can't be hidden. All other hidden modules should have a 7 character hash (the config
# hash_length = 0 only applies to exposed modules).
with open(writer.layout.filename) as f:
depends_statements = [line.strip() for line in f.readlines() if "depends_on" in line]
for dep in spec.dependencies(deptype=("link", "run")):
if dep.satisfies("mpi"):
assert not any(dep.dag_hash(7) in line for line in depends_statements)
else:
assert any(dep.dag_hash(7) in line for line in depends_statements)
# when mpileaks becomes explicit, its file name changes (hash_length = 0), meaning an
# extra module file is created; the old one still exists and remains hidden.
writer = writer_cls(spec, "default", True)
writer.write()
assert os.path.exists(writer.layout.modulerc)
with open(writer.layout.modulerc) as f:
content = [line.strip() for line in f.readlines()]
assert hide_implicit_mpileaks in content # old, implicit mpileaks is still hidden
assert f'hide_version("{writer.layout.use_name}")' not in content
# after removing both the implicit and explicit module, the modulerc file would be empty
# and should be removed.
writer_cls(spec, "default", False).remove()
writer_cls(spec, "default", True).remove()
assert not os.path.exists(writer.layout.modulerc)
assert not os.path.exists(writer.layout.filename)
# implicit module is removed
writer = writer_cls(spec, "default", False)
writer.write()
assert os.path.exists(writer.layout.filename)
assert os.path.exists(writer.layout.modulerc)
writer.remove()
@@ -486,35 +499,19 @@ def test_hide_implicits(self, module_configuration):
writer_alt2.write(overwrite=True)
assert os.path.exists(writer.layout.modulerc)
with open(writer.layout.modulerc) as f:
content = [line.strip() for line in f.readlines()]
hide_cmd = f'hide_version("{writer.layout.use_name}")'
hide_cmd_alt1 = f'hide_version("{writer_alt1.layout.use_name}")'
hide_cmd_alt2 = f'hide_version("{writer_alt2.layout.use_name}")'
assert len([x for x in content if hide_cmd == x]) == 1
assert len([x for x in content if hide_cmd_alt1 == x]) == 1
assert len([x for x in content if hide_cmd_alt2 == x]) == 1
# one version is removed
writer_alt1.remove()
writer_alt2 = writer_cls(spec_alt2, "default", True)
writer_alt2.write(overwrite=True)
assert os.path.exists(writer.layout.modulerc)
with open(writer.layout.modulerc) as f:
content = [line.strip() for line in f.readlines()]
assert len([x for x in content if hide_cmd == x]) == 1
assert len([x for x in content if hide_cmd_alt1 == x]) == 0
assert len([x for x in content if hide_cmd_alt2 == x]) == 1


@@ -488,7 +488,7 @@ def test_modules_no_arch(self, factory, module_configuration):
assert str(spec.os) not in path
def test_hide_implicits(self, module_configuration, temporary_store):
"""Tests the addition and removal of hide command in modulerc."""
module_configuration("hide_implicits")
@@ -499,29 +499,37 @@ def test_hide_implicits(self, module_configuration):
writer.write()
assert os.path.exists(writer.layout.modulerc)
with open(writer.layout.modulerc) as f:
content = [line.strip() for line in f.readlines()]
hide_implicit_mpileaks = f"module-hide --soft --hidden-loaded {writer.layout.use_name}"
assert len([x for x in content if hide_implicit_mpileaks == x]) == 1
# The direct dependencies are all implicit, and their depends-on statements should use a
# fixed 7 character hash, even though the config is set to hash_length = 0.
with open(writer.layout.filename) as f:
depends_statements = [line.strip() for line in f.readlines() if "depends-on" in line]
for dep in spec.dependencies(deptype=("link", "run")):
assert any(dep.dag_hash(7) in line for line in depends_statements)
# when mpileaks becomes explicit, its file name changes (hash_length = 0), meaning an
# extra module file is created; the old one still exists and remains hidden.
writer = writer_cls(spec, "default", True)
writer.write()
assert os.path.exists(writer.layout.modulerc)
with open(writer.layout.modulerc) as f:
content = [line.strip() for line in f.readlines()]
assert hide_implicit_mpileaks in content # old, implicit mpileaks is still hidden
assert f"module-hide --soft --hidden-loaded {writer.layout.use_name}" not in content
# after removing both the implicit and explicit module, the modulerc file would be empty
# and should be removed.
writer_cls(spec, "default", False).remove()
writer_cls(spec, "default", True).remove()
assert not os.path.exists(writer.layout.modulerc)
assert not os.path.exists(writer.layout.filename)
# implicit module is removed
writer = writer_cls(spec, "default", False)
writer.write()
assert os.path.exists(writer.layout.filename)
assert os.path.exists(writer.layout.modulerc)
writer.remove()
@@ -539,35 +547,19 @@ def test_hide_implicits(self, module_configuration):
writer_alt2.write(overwrite=True)
assert os.path.exists(writer.layout.modulerc)
with open(writer.layout.modulerc) as f:
content = [line.strip() for line in f.readlines()]
hide_cmd = f"module-hide --soft --hidden-loaded {writer.layout.use_name}"
hide_cmd_alt1 = f"module-hide --soft --hidden-loaded {writer_alt1.layout.use_name}"
hide_cmd_alt2 = f"module-hide --soft --hidden-loaded {writer_alt2.layout.use_name}"
assert len([x for x in content if hide_cmd == x]) == 1
assert len([x for x in content if hide_cmd_alt1 == x]) == 1
assert len([x for x in content if hide_cmd_alt2 == x]) == 1
# one version is removed
writer_alt1.remove()
writer_alt2 = writer_cls(spec_alt2, "default", True)
writer_alt2.write(overwrite=True)
assert os.path.exists(writer.layout.modulerc)
with open(writer.layout.modulerc) as f:
content = [line.strip() for line in f.readlines()]
assert len([x for x in content if hide_cmd == x]) == 1
assert len([x for x in content if hide_cmd_alt1 == x]) == 0
assert len([x for x in content if hide_cmd_alt2 == x]) == 1


@@ -34,6 +34,10 @@
("myname:1234/myimage:abc", ("myname:1234", "myimage", "abc", None)),
("localhost/myimage:abc", ("localhost", "myimage", "abc", None)),
("localhost:1234/myimage:abc", ("localhost:1234", "myimage", "abc", None)),
(
"example.com/UPPERCASE/lowercase:AbC",
("example.com", "uppercase/lowercase", "AbC", None),
),
],
)
def test_name_parsing(image_ref, expected):


@@ -37,6 +37,7 @@ def mpileaks_possible_deps(mock_packages, mpi_names):
"low-priority-provider": set(),
"dyninst": set(["libdwarf", "libelf"]),
"fake": set(),
"intel-parallel-studio": set(),
"libdwarf": set(["libelf"]),
"libelf": set(),
"mpich": set(),


@@ -80,7 +80,17 @@ def test_module_suffixes(module_suffixes_schema):
@pytest.mark.regression("10246")
@pytest.mark.parametrize(
"config_name",
["compilers", "config", "env", "merged", "mirrors", "modules", "packages", "repos"],
[
"compilers",
"config",
"definitions",
"env",
"merged",
"mirrors",
"modules",
"packages",
"repos",
],
)
def test_schema_validation(meta_schema, config_name):
import importlib


@@ -532,6 +532,7 @@ def test_normalize_mpileaks(self):
assert not spec.eq_dag(expected_normalized, deptypes=True)
assert not spec.eq_dag(non_unique_nodes, deptypes=True)
@pytest.mark.xfail(reason="String representation changed")
def test_normalize_with_virtual_package(self):
spec = Spec("mpileaks ^mpi ^libelf@1.8.11 ^libdwarf")
spec.normalize()


@@ -294,13 +294,10 @@ def test_concrete_specs_which_satisfies_abstract(self, lhs, rhs, default_mock_co
("foo@4.0%pgi@4.5", "@1:3%pgi@4.4:4.6"),
("builtin.mock.mpich", "builtin.mpich"),
("mpileaks ^builtin.mock.mpich", "^builtin.mpich"),
("mpileaks^mpich", "^zmpi"),
("mpileaks^zmpi", "^mpich"),
("mpileaks^mpich@1.2", "^mpich@2.0"),
("mpileaks^mpich@4.0^callpath@1.5", "^mpich@1:3^callpath@1.4:1.6"),
("mpileaks^mpich@2.0^callpath@1.7", "^mpich@1:3^callpath@1.4:1.6"),
("mpileaks^mpich@4.0^callpath@1.7", "^mpich@1:3^callpath@1.4:1.6"),
("mpileaks^mpich", "^zmpi"),
("mpileaks^mpi@3", "^mpi@1.2:1.6"),
("mpileaks^mpi@3:", "^mpich2@1.4"),
("mpileaks^mpi@3:", "^mpich2"),
@@ -338,30 +335,30 @@ def test_constraining_abstract_specs_with_empty_intersection(self, lhs, rhs):
rhs.constrain(lhs)
@pytest.mark.parametrize(
"lhs,rhs,intersection_expected",
"lhs,rhs",
[
("mpich", "mpich +foo", True),
("mpich", "mpich~foo", True),
("mpich", "mpich foo=1", True),
("mpich", "mpich++foo", True),
("mpich", "mpich~~foo", True),
("mpich", "mpich foo==1", True),
("mpich", "mpich +foo"),
("mpich", "mpich~foo"),
("mpich", "mpich foo=1"),
("mpich", "mpich++foo"),
("mpich", "mpich~~foo"),
("mpich", "mpich foo==1"),
# Flags semantics is currently different from other variant
("mpich", 'mpich cflags="-O3"', True),
("mpich cflags=-O3", 'mpich cflags="-O3 -Ofast"', False),
("mpich cflags=-O2", 'mpich cflags="-O3"', False),
("multivalue-variant foo=bar", "multivalue-variant +foo", False),
("multivalue-variant foo=bar", "multivalue-variant ~foo", False),
("multivalue-variant fee=bar", "multivalue-variant fee=baz", False),
("mpich", 'mpich cflags="-O3"'),
("mpich cflags=-O3", 'mpich cflags="-O3 -Ofast"'),
("mpich cflags=-O2", 'mpich cflags="-O3"'),
("multivalue-variant foo=bar", "multivalue-variant +foo"),
("multivalue-variant foo=bar", "multivalue-variant ~foo"),
("multivalue-variant fee=bar", "multivalue-variant fee=baz"),
],
)
def test_concrete_specs_which_do_not_satisfy_abstract(
self, lhs, rhs, default_mock_concretization
):
lhs, rhs = default_mock_concretization(lhs), Spec(rhs)
assert lhs.intersects(rhs) is False
assert rhs.intersects(lhs) is False
assert not lhs.satisfies(rhs)
assert not rhs.satisfies(lhs)
@@ -483,10 +480,14 @@ def test_intersects_virtual(self):
assert Spec("mpich2").intersects(Spec("mpi"))
assert Spec("zmpi").intersects(Spec("mpi"))
def test_intersects_virtual_providers(self):
"""Tests that we can always intersect virtual providers from abstract specs.
Concretization will give meaning to virtuals, and eventually forbid certain
configurations.
"""
assert Spec("netlib-lapack ^openblas").intersects("netlib-lapack ^openblas")
assert not Spec("netlib-lapack ^netlib-blas").intersects("netlib-lapack ^openblas")
assert not Spec("netlib-lapack ^openblas").intersects("netlib-lapack ^netlib-blas")
assert Spec("netlib-lapack ^netlib-blas").intersects("netlib-lapack ^openblas")
assert Spec("netlib-lapack ^openblas").intersects("netlib-lapack ^netlib-blas")
assert Spec("netlib-lapack ^netlib-blas").intersects("netlib-lapack ^netlib-blas")
def test_intersectable_concrete_specs_must_have_the_same_hash(self):
@@ -1006,6 +1007,103 @@ def test_spec_override(self):
assert new_spec.compiler_flags["cflags"] == ["-O2"]
assert new_spec.compiler_flags["cxxflags"] == ["-O1"]
@pytest.mark.parametrize(
"spec_str,specs_in_dag",
[
("hdf5 ^[virtuals=mpi] mpich", [("mpich", "mpich"), ("mpi", "mpich")]),
# Try different combinations with packages that provides a
# disjoint set of virtual dependencies
(
"netlib-scalapack ^mpich ^openblas-with-lapack",
[
("mpi", "mpich"),
("lapack", "openblas-with-lapack"),
("blas", "openblas-with-lapack"),
],
),
(
"netlib-scalapack ^[virtuals=mpi] mpich ^openblas-with-lapack",
[
("mpi", "mpich"),
("lapack", "openblas-with-lapack"),
("blas", "openblas-with-lapack"),
],
),
(
"netlib-scalapack ^mpich ^[virtuals=lapack] openblas-with-lapack",
[
("mpi", "mpich"),
("lapack", "openblas-with-lapack"),
("blas", "openblas-with-lapack"),
],
),
(
"netlib-scalapack ^[virtuals=mpi] mpich ^[virtuals=lapack] openblas-with-lapack",
[
("mpi", "mpich"),
("lapack", "openblas-with-lapack"),
("blas", "openblas-with-lapack"),
],
),
# Test that we can mix dependencies that provide an overlapping
# sets of virtual dependencies
(
"netlib-scalapack ^[virtuals=mpi] intel-parallel-studio "
"^[virtuals=lapack] openblas-with-lapack",
[
("mpi", "intel-parallel-studio"),
("lapack", "openblas-with-lapack"),
("blas", "openblas-with-lapack"),
],
),
(
"netlib-scalapack ^[virtuals=mpi] intel-parallel-studio ^openblas-with-lapack",
[
("mpi", "intel-parallel-studio"),
("lapack", "openblas-with-lapack"),
("blas", "openblas-with-lapack"),
],
),
(
"netlib-scalapack ^intel-parallel-studio ^[virtuals=lapack] openblas-with-lapack",
[
("mpi", "intel-parallel-studio"),
("lapack", "openblas-with-lapack"),
("blas", "openblas-with-lapack"),
],
),
# Test that we can bind more than one virtual to the same provider
(
"netlib-scalapack ^[virtuals=lapack,blas] openblas-with-lapack",
[("lapack", "openblas-with-lapack"), ("blas", "openblas-with-lapack")],
),
],
)
def test_virtual_deps_bindings(self, default_mock_concretization, spec_str, specs_in_dag):
if spack.config.get("config:concretizer") == "original":
pytest.skip("Use case not supported by the original concretizer")
s = default_mock_concretization(spec_str)
for label, expected in specs_in_dag:
assert label in s
assert s[label].satisfies(expected), label
@pytest.mark.parametrize(
"spec_str",
[
# openblas-with-lapack needs to provide blas and lapack together
"netlib-scalapack ^[virtuals=blas] intel-parallel-studio ^openblas-with-lapack",
# intel-* provides blas and lapack together, openblas can provide blas only
"netlib-scalapack ^[virtuals=lapack] intel-parallel-studio ^openblas",
],
)
def test_unsatisfiable_virtual_deps_bindings(self, spec_str):
if spack.config.get("config:concretizer") == "original":
pytest.skip("Use case not supported by the original concretizer")
with pytest.raises(spack.solver.asp.UnsatisfiableSpecError):
Spec(spec_str).concretized()
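A small sketch of the ^[virtuals=...] edge syntax these cases rely on, assuming the same mock repository:

from spack.spec import Spec

# Binding mpich as the mpi provider on the edge makes both lookups agree.
s = Spec("hdf5 ^[virtuals=mpi] mpich").concretized()
assert s["mpi"].name == "mpich"
assert s.satisfies("^[virtuals=mpi] mpich")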
@pytest.mark.parametrize(
"spec_str,format_str,expected",


@@ -530,6 +530,26 @@ def _specfile_for(spec_str, filename):
[Token(TokenType.VERSION, value="@:0.4"), Token(TokenType.COMPILER, value="% nvhpc")],
"@:0.4%nvhpc",
),
(
"^[virtuals=mpi] openmpi",
[
Token(TokenType.START_EDGE_PROPERTIES, value="^["),
Token(TokenType.KEY_VALUE_PAIR, value="virtuals=mpi"),
Token(TokenType.END_EDGE_PROPERTIES, value="]"),
Token(TokenType.UNQUALIFIED_PACKAGE_NAME, value="openmpi"),
],
"^[virtuals=mpi] openmpi",
),
(
"^[deptypes=link,build] zlib",
[
Token(TokenType.START_EDGE_PROPERTIES, value="^["),
Token(TokenType.KEY_VALUE_PAIR, value="deptypes=link,build"),
Token(TokenType.END_EDGE_PROPERTIES, value="]"),
Token(TokenType.UNQUALIFIED_PACKAGE_NAME, value="zlib"),
],
"^[deptypes=build,link] zlib",
),
(
"zlib@git.foo/bar",
[
@@ -923,6 +943,9 @@ def test_disambiguate_hash_by_spec(spec1, spec2, constraint, mock_packages, monk
("x platform=test platform=test", spack.spec.DuplicateArchitectureError),
("x os=fe platform=test target=fe os=fe", spack.spec.DuplicateArchitectureError),
("x target=be platform=test os=be os=fe", spack.spec.DuplicateArchitectureError),
("^[@foo] zlib", spack.parser.SpecParsingError),
# TODO: Remove this as soon as use variants are added and we can parse custom attributes
("^[foo=bar] zlib", spack.parser.SpecParsingError),
],
)
def test_error_conditions(text, exc_cls):


@@ -120,6 +120,21 @@ def test_parser_doesnt_deal_with_nonzero_offset():
elf.parse_elf(elf_at_offset_one)
def test_only_header():
# When passing only_header=True parsing a file that is literally just a header
# without any sections/segments should not error.
# 32 bit
elf_32 = elf.parse_elf(io.BytesIO(b"\x7fELF\x01\x01" + b"\x00" * 46), only_header=True)
assert not elf_32.is_64_bit
assert elf_32.is_little_endian
# 64 bit
elf_64 = elf.parse_elf(io.BytesIO(b"\x7fELF\x02\x01" + b"\x00" * 58), only_header=True)
assert elf_64.is_64_bit
assert elf_64.is_little_endian
@pytest.mark.requires_executables("gcc")
@skip_unless_linux
def test_elf_get_and_replace_rpaths(binary_with_rpaths):


@@ -377,7 +377,7 @@ def parse_header(f, elf):
elf.elf_hdr = ElfHeader._make(unpack(elf_header_fmt, data))
def _do_parse_elf(f, interpreter=True, dynamic_section=True):
def _do_parse_elf(f, interpreter=True, dynamic_section=True, only_header=False):
# We don't (yet?) allow parsing ELF files at a nonzero offset, we just
# jump to absolute offsets as they are specified in the ELF file.
if f.tell() != 0:
@@ -386,6 +386,9 @@ def _do_parse_elf(f, interpreter=True, dynamic_section=True):
elf = ElfFile()
parse_header(f, elf)
if only_header:
return elf
# We don't handle anything but executables and shared libraries now.
if elf.elf_hdr.e_type not in (ELF_CONSTANTS.ET_EXEC, ELF_CONSTANTS.ET_DYN):
raise ElfParsingError("Not an ET_DYN or ET_EXEC type")
@@ -403,11 +406,11 @@ def _do_parse_elf(f, interpreter=True, dynamic_section=True):
return elf
def parse_elf(f, interpreter=False, dynamic_section=False):
def parse_elf(f, interpreter=False, dynamic_section=False, only_header=False):
"""Given a file handle f for an ELF file opened in binary mode, return an ElfFile
object that is stores data about rpaths"""
try:
return _do_parse_elf(f, interpreter, dynamic_section)
return _do_parse_elf(f, interpreter, dynamic_section, only_header)
except (DeprecationWarning, struct.error):
# According to the docs old versions of Python can throw DeprecationWarning
# instead of struct.error.
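A usage sketch for the new flag (the binary path is illustrative): with only_header=True parsing stops right after the ELF header, so word size and endianness can be read even from files that are not complete executables or shared libraries.

from spack.util.elf import parse_elf

with open("/usr/bin/true", "rb") as f:  # any ELF file; path is illustrative
    header = parse_elf(f, only_header=True)
print(header.is_64_bit, header.is_little_endian)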


@@ -596,6 +596,14 @@ def group_by_name(self) -> Dict[str, ModificationList]:
modifications[item.name].append(item)
return modifications
def drop(self, *name) -> bool:
"""Drop all modifications to the variable with the given name."""
old_mods = self.env_modifications
new_mods = [x for x in self.env_modifications if x.name not in name]
self.env_modifications = new_mods
return len(old_mods) != len(new_mods)
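A minimal usage sketch for the new drop() method (variable names are illustrative):

from spack.util.environment import EnvironmentModifications

mods = EnvironmentModifications()
mods.set("FOO", "bar")
mods.append_path("PATH", "/opt/view/bin")

# drop() discards every queued modification to the named variables and
# reports whether anything was actually removed.
assert mods.drop("FOO") is True
assert mods.drop("FOO") is False  # nothing left to drop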
def is_unset(self, variable_name: str) -> bool:
"""Returns True if the last modification to a variable is to unset it, False otherwise."""
modifications = self.group_by_name()


@@ -330,8 +330,11 @@ def add_extra_search_paths(paths):
for candidate_item in candidate_items:
for directory in search_paths:
exe = directory / candidate_item
if exe.is_file() and os.access(str(exe), os.X_OK):
return str(exe)
try:
if exe.is_file() and os.access(str(exe), os.X_OK):
return str(exe)
except OSError:
pass
if required:
raise CommandNotFoundError("spack requires '%s'. Make sure it is in your path." % args[0])
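A minimal reproduction of why the guard is needed, not Spack code: Path.is_file() propagates OSError for names the operating system rejects outright (for example ENAMETOOLONG) instead of returning False.

import pathlib

try:
    pathlib.Path("x" * 100_000).is_file()
except OSError as exc:
    print(f"is_file() raised instead of returning False: {exc}")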


@@ -6,6 +6,7 @@
import os
import sys
import traceback
from typing import Optional
class ErrorFromWorker:
@@ -53,7 +54,9 @@ def __call__(self, *args, **kwargs):
return value
def imap_unordered(f, list_of_args, *, processes: int, debug=False):
def imap_unordered(
f, list_of_args, *, processes: int, maxtaskperchild: Optional[int] = None, debug=False
):
"""Wrapper around multiprocessing.Pool.imap_unordered.
Args:
@@ -62,6 +65,8 @@ def imap_unordered(f, list_of_args, *, processes: int, debug=False):
processes: maximum number of processes allowed
debug: if False, raise an exception containing just the error messages
from workers, if True an exception with complete stacktraces
maxtaskperchild: number of tasks a child process executes before being
killed and replaced
Raises:
RuntimeError: if any error occurred in the worker processes
@@ -70,7 +75,7 @@ def imap_unordered(f, list_of_args, *, processes: int, debug=False):
yield from map(f, list_of_args)
return
with multiprocessing.Pool(processes) as p:
with multiprocessing.Pool(processes, maxtasksperchild=maxtaskperchild) as p:
for result in p.imap_unordered(Task(f), list_of_args):
if isinstance(result, ErrorFromWorker):
raise RuntimeError(result.stacktrace if debug else str(result))
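A hedged usage sketch (the spack.util.parallel import path is an assumption; maxtaskperchild is forwarded to multiprocessing.Pool as maxtasksperchild, so each worker is recycled after that many tasks, bounding memory growth):

from spack.util.parallel import imap_unordered

def square(x):
    return x * x

# Results arrive in completion order; workers are replaced after 8 tasks each.
# On spawn-based platforms this needs to run under `if __name__ == "__main__":`.
results = sorted(imap_unordered(square, list(range(32)), processes=4, maxtaskperchild=8))
assert results == [x * x for x in range(32)]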


@@ -370,7 +370,7 @@ _spack_compress_aliases() {
# If there are zero or one completions, don't do anything
# If this isn't the first argument, bail because aliases currently only apply
# to top-level commands.
if [ "${#COMPREPLY[@]}" -le "1" ] || [ "$COMP_CWORD" != "1" ]; then
if [ "${#COMPREPLY[@]}" -le "1" ] || [ "$COMP_CWORD_NO_FLAGS" != "1" ]; then
return
fi


@@ -150,7 +150,7 @@ spack:
- swig@4.0.2-fortran
- sz3
- tasmanian
- tau +mpi +python +syscall
- trilinos +amesos +amesos2 +anasazi +aztec +belos +boost +epetra +epetraext +ifpack +ifpack2 +intrepid +intrepid2 +isorropia +kokkos +ml +minitensor +muelu +nox +piro +phalanx +rol +rythmos +sacado +stk +shards +shylu +stokhos +stratimikos +teko +tempus +tpetra +trilinoscouplings +zoltan +zoltan2 +superlu-dist gotype=long_long
- turbine
- umap
@@ -186,7 +186,7 @@ spack:
- flux-core +cuda
- hpctoolkit +cuda
- papi +cuda
- tau +mpi +cuda +syscall
# --
# - bricks +cuda # not respecting target=aarch64?
# - legion +cuda # legion: needs NVIDIA driver


@@ -153,7 +153,7 @@ spack:
- superlu-dist
- sz3
- tasmanian
- tau +mpi +python +syscall
- trilinos +amesos +amesos2 +anasazi +aztec +belos +boost +epetra +epetraext +ifpack +ifpack2 +intrepid +intrepid2 +isorropia +kokkos +ml +minitensor +muelu +nox +piro +phalanx +rol +rythmos +sacado +stk +shards +shylu +stokhos +stratimikos +teko +tempus +tpetra +trilinoscouplings +zoltan +zoltan2 +superlu-dist gotype=long_long
- turbine
- umap
@@ -200,11 +200,11 @@ spack:
- kokkos +sycl +openmp cxxstd=17 +tests +examples
- kokkos-kernels build_type=Release %oneapi ^kokkos +sycl +openmp cxxstd=17 +tests +examples
- slate +sycl
- sundials +sycl cxxstd=17 +examples-install
- tau +mpi +opencl +level_zero ~pdt +syscall # tau: requires libdrm.so to be installed
# --
# - ginkgo +oneapi # InstallError: Ginkgo's oneAPI backend requires theDPC++ compiler as main CXX compiler.
# - hpctoolkit +level_zero # dyninst@12.3.0%gcc: /usr/bin/ld: libiberty/./d-demangle.c:142: undefined reference to `_intel_fast_memcpy'; can't mix intel-tbb@%oneapi with dyninst%gcc
- py-scipy


@@ -150,7 +150,7 @@ spack:
- swig@4.0.2-fortran
- sz3
- tasmanian
- tau +mpi +python # +syscall fails: https://github.com/spack/spack/pull/40830#issuecomment-1790799772; tau: has issue with `spack env depfile` build
- trilinos +amesos +amesos2 +anasazi +aztec +belos +boost +epetra +epetraext +ifpack +ifpack2 +intrepid +intrepid2 +isorropia +kokkos +ml +minitensor +muelu +nox +piro +phalanx +rol +rythmos +sacado +stk +shards +shylu +stokhos +stratimikos +teko +tempus +tpetra +trilinoscouplings +zoltan +zoltan2 +superlu-dist gotype=long_long
- turbine
- umap


@@ -240,7 +240,7 @@ spack:
specs:
# ROCM NOARCH
- hpctoolkit +rocm
- tau +mpi +rocm +syscall # tau: has issue with `spack env depfile` build
# ROCM 908
- adios2 +kokkos +rocm amdgpu_target=gfx908


@@ -51,6 +51,8 @@ spack:
require: "@3.4.4"
vtk-m:
require: "+examples"
visit:
require: "~gui"
cuda:
version: [11.8.0]
paraview:
@@ -157,7 +159,7 @@ spack:
- swig@4.0.2-fortran
- sz3
- tasmanian
- tau +mpi +python +syscall
- trilinos +amesos +amesos2 +anasazi +aztec +belos +boost +epetra +epetraext +ifpack +ifpack2 +intrepid +intrepid2 +isorropia +kokkos +ml +minitensor +muelu +nox +piro +phalanx +rol +rythmos +sacado +stk +shards +shylu +stokhos +stratimikos +teko +tempus +tpetra +trilinoscouplings +zoltan +zoltan2 +superlu-dist gotype=long_long
- turbine
- umap
@@ -192,7 +194,7 @@ spack:
- flux-core +cuda
- hpctoolkit +cuda
- papi +cuda
- tau +mpi +cuda +syscall
# --
# - legion +cuda # legion: needs NVIDIA driver
@@ -289,7 +291,7 @@ spack:
# ROCM NOARCH
- hpctoolkit +rocm
- tau +mpi +rocm +syscall # tau: has issue with `spack env depfile` build
# ROCM 908
- adios2 +kokkos +rocm amdgpu_target=gfx908


@@ -18,7 +18,7 @@ spack:
- hdf5+hl+mpi ^mpich
- trilinos
- trilinos +hdf5 ^hdf5+hl+mpi ^mpich
- gcc@12.3.0
- mpileaks
- lmod
- macsio@1.1+scr ^scr@2.0.0~fortran ^silo~fortran ^hdf5~fortran
