Compare commits

...

41 Commits

Author SHA1 Message Date
Harmen Stoppels
2bfcc69fa8 Set version to v0.23.1 2025-02-19 16:04:31 +01:00
kwryankrattiger
af62d91457 gdk-pixbuf: Point at gitlab instead of broken mirror (#47825) 2025-02-19 14:37:47 +01:00
Harmen Stoppels
f32a74491e spec.py: ensure spec.extra_attributes is {} if is null in json (#48896) 2025-02-19 14:37:47 +01:00
Harmen Stoppels
4f80f07b9a spec.py: fix hash change due to None vs {} (#48854)
* Fix hash change due to None vs {}

* Enforce null for empty list of external_modules
2025-02-19 14:37:47 +01:00
eugeneswalker
4a8ae59a9e e4s cray rhel ci stack: re-enable and update for new cpe (#47697)
* e4s cray rhel ci stack: re-enable and update for new cpe, should fix cray libsci issue

* only run e4s-cray-rhel stack

* Make Autotools build_system point at correct build_directory

* remove selective enable of cray-rhel stacks

* restore SPACK_CI_DISABLE_STACKS

* use dot prefix to hide cray-sles jobs instead of comment-out

---------

Co-authored-by: Ryan Krattiger <ryan.krattiger@kitware.com>
2025-02-19 14:37:47 +01:00
Harmen Stoppels
6f7e881b69 fix year dependent license verification check 2025-02-19 14:37:47 +01:00
Till Ehrengruber
24bcaec6c3 oci/opener.py: respect system proxy settings (#48783) 2025-02-19 14:37:47 +01:00
Massimiliano Culpo
1c9bb36fdf spec.py: fix ArchSpec.intersects (#48741)
Fixes a bug where `x86_64:` and `ppc64le:` intersected, and `x86_64:` and `:haswell` did not.
2025-02-19 14:37:47 +01:00
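A minimal, self-contained sketch of the intended semantics (illustrative only, not Spack's actual `ArchSpec` code; the family and generation tables are toy values):

```python
# Toy model: a target range intersects another only if both ranges lie in the
# same microarchitecture family and their generation intervals overlap.
FAMILY = {"x86_64": "x86_64", "haswell": "x86_64", "ppc64le": "ppc64le"}
GEN = {"x86_64": 0, "haswell": 10, "ppc64le": 0}  # toy ordering within a family

def intersects(lo_a, hi_a, lo_b, hi_b):
    """Each range is (lo, hi); None means the range is open on that side."""
    named = [t for t in (lo_a, hi_a, lo_b, hi_b) if t is not None]
    if len({FAMILY[t] for t in named}) > 1:
        return False  # e.g. "x86_64:" vs "ppc64le:" -> disjoint families
    lo = max((GEN[t] for t in (lo_a, lo_b) if t is not None), default=0)
    hi = min((GEN[t] for t in (hi_a, hi_b) if t is not None), default=float("inf"))
    return lo <= hi

assert not intersects("x86_64", None, "ppc64le", None)  # was wrongly True before the fix
assert intersects("x86_64", None, None, "haswell")      # was wrongly False before the fix
```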
Harmen Stoppels
044d1b12bb autotools.py: set lt_cv_apple_cc_single_mod=yes (#48671)
Since macOS 15 `ld -single_module` warns with a deprecation message,
which makes configure scripts believe the flag is unsupported. That
in turn triggers a code path where `archive_cmds` is set to

```
$CC -r -keep_private_externs -nostdlib ... -dynamiclib
```

instead of just

```
$CC -dynamiclib ...
```

This code path was meant to trigger only on ancient macOS <= 14.4 where
libtool had to add `-single_module`, which is the default since macOS
14.4, and is now apparently deprecated because the flag has been a no-op
for more than 15 years.

The wrong `archive_cmds` causes actual problems when combined with a bug in
OpenMPI's compiler wrapper (`CC=mpicc`), which appends `-rpath` flags that
cause an error when combined with the `-r` flag added by
autotools.

Spack's compiler wrapper doesn't do this, but it's likely there are
other compiler wrappers out there that are not aware that `-r` and
`-rpath` cannot be combined.

The fix is to change defaults: `lt_cv_apple_cc_single_mod=yes`.
2025-02-19 14:37:47 +01:00
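Outside Spack, the same workaround can be applied when configuring a libtool-based package by hand, by pre-seeding the cache variable on the configure command line (a sketch; the package and prefix are placeholders):

```
$ ./configure --prefix=/opt/mypkg lt_cv_apple_cc_single_mod=yes
```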
Harmen Stoppels
536468783d spec.py: make hashing of extra_attributes order independent (#48615) 2025-02-19 14:37:47 +01:00
Harmen Stoppels
a298296237 relocate_text: fix return value (#48568) 2025-02-19 14:37:47 +01:00
Harmen Stoppels
1db11ff06b spec.py: fix return type of concretized() (#48504) 2025-02-19 14:37:47 +01:00
Harmen Stoppels
b621c045a7 cmd/__init__.py: pass tests in case n=1 (#48417) 2025-02-19 14:37:47 +01:00
Wouter Deconinck
734ada1f10 doc: ensure getting_started has bootstrap list output in correct place (#48281) 2025-02-19 14:37:47 +01:00
psakievich
71712465e8 Ensure command_line scope is always last (#48255) 2025-02-19 14:37:47 +01:00
Massimiliano Culpo
7cc67b5c23 bootstrap mirror: fix references from v0.4 to v0.6 (#48235) 2025-02-19 14:37:47 +01:00
Harmen Stoppels
1f9b0e39b9 llnl.util.lang: remove testing literal backtrace output (#48209) 2025-02-19 14:37:47 +01:00
Harmen Stoppels
d012683c1b docs: advertise --oci-username-variable and --oci-password-variable (#48189) 2025-02-19 14:37:47 +01:00
Todd Gamblin
8b3c3e9165 Make unit tests work on ubuntu 24.04 (#48151)
`kcov` was removed in Ubuntu 24.04, and it is no longer
installable via `apt` in our CI images. Install it via
Linuxbrew instead, at least until it comes back to Ubuntu.

`subversion` is also not installed on Ubuntu 24.04 by default,
so we have to install it manually.

- [x] Add linuxbrew to linux tests
- [x] Install `kcov` with brew
- [x] Install subversion with `apt`

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2025-02-19 14:37:47 +01:00
Harmen Stoppels
dd94a44b6a gha: fix git safe.directory config (#48041) 2025-02-19 14:37:47 +01:00
Harmen Stoppels
95d190e354 filter_file: make tempfile later (#48108)
* filter_file: make tempfile later

* also add a `.` after the filename
2025-02-19 14:37:47 +01:00
Harmen Stoppels
f124409d8a filter_file: fix various bugs (#48038)
* `f.tell` on a `TextIOWrapper` does not return the offset in bytes, but
  an opaque integer that can only be used for `f.seek` on the same
  object. Spack assumes it's a byte offset.
* Do not open in a locale dependent way, but assume utf-8 (and allow
  users to override that)
* Use tempfile to generate a backup/temporary file in a safe way
* Comparison between None and str is valid and on purpose.
2025-02-19 14:37:47 +01:00
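A standalone illustration of the first point (plain Python, not Spack code): the value returned by `tell()` on a text stream is only guaranteed to be usable with `seek()` on that same stream; only the binary handle reports byte offsets.

```python
import tempfile

with tempfile.NamedTemporaryFile("w", encoding="utf-8", suffix=".txt", delete=False) as f:
    f.write("héllo\nworld\n")  # "héllo\n" is 7 bytes in UTF-8 ("é" takes two)
    name = f.name

with open(name, "r", encoding="utf-8") as text:
    text.readline()
    cookie = text.tell()  # opaque cookie, only valid for text.seek(cookie)

with open(name, "rb") as binary:
    binary.readline()
    offset = binary.tell()  # a genuine byte offset (7 here)

print(cookie, offset)  # the two values are not guaranteed to agree
```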
Harmen Stoppels
bee2132c04 log.py: improve utf-8 handling, and non-utf-8 output (#48005) 2025-02-19 14:37:47 +01:00
Harmen Stoppels
d0ea33fa67 stage.py: improve path to url (#47898) 2025-02-19 14:37:47 +01:00
Carson Woods
4e913876b9 bug fix: updated warning message to reflect impending v1.0 release (#47887) 2025-02-19 14:37:47 +01:00
Harmen Stoppels
c25e43ce61 darwin: preserve hardlinks on codesign/install_name_tool (#47808) 2025-02-19 14:37:47 +01:00
Wouter Deconinck
85146d875b qt-base: fix rpath for dependents (#47424)
Ensure that `CMAKE_INSTALL_RPATH_USE_LINK_PATH=ON` works in qt packages.
2025-02-19 14:37:47 +01:00
Harmen Stoppels
c8167eec5d Set version to 0.23.1.dev0 2025-02-19 14:37:47 +01:00
Gregory Becker
c6d4037758 update version number to 0.23.0 2024-11-17 00:59:27 -05:00
Gregory Becker
08f1cf9ae2 Update CHANGELOG.md for v0.23.0 2024-11-17 00:59:27 -05:00
Todd Gamblin
48dfa3c95e Spec: prefer a splice-specific method to __len__ (#47585)
Automatic splicing made `Spec` grow a `__len__` method, but it's only used
in one place, and it's not clear the semantics are useful elsewhere. It also
runs the risk of Specs one day being confused for other types of containers.

Rather than introduce a new function for one algorithm, let's use a more
specific method in the splice code.

- [x] Use topological ordering in `_resolve_automatic_splices` instead of 
      sorting by node count
- [x] delete `Spec.__len__()` and `Spec.__bool__()`

---------

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
Co-authored-by: Greg Becker <becker33@llnl.gov>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-11-14 08:24:06 +01:00
Todd Gamblin
e5c411d8f0 spack spec: simplify and unify output (#47574)
`spack spec` output has looked like this for a while:

```console
> spack spec /v5fn6xo /wd2p2v7
Input spec
--------------------------------
 -   /v5fn6xo

Concretized
--------------------------------
[+]  openssl@3.3.1%apple-clang@16.0.0~docs+shared build_system=generic certs=mozilla arch=darwin-sequoia-m1
[+]      ^ca-certificates-mozilla@2023-05-30%apple-clang@16.0.0 build_system=generic arch=darwin-sequoia-m1
...

Input spec
--------------------------------
 -   /wd2p2v7

Concretized
--------------------------------
[+]  py-six@1.16.0%apple-clang@16.0.0 build_system=python_pip arch=darwin-sequoia-m1
[+]      ^py-pip@23.1.2%apple-clang@16.0.0 build_system=generic arch=darwin-sequoia-m1
```

But the input spec is right there on the CLI, and it doesn't add anything to the output.
Also, since #44843, specs concretized in the CLI line can be unified, so it makes sense
to display them as we did in #44489 -- as one multi-root tree instead of as multiple
single-root trees.

With this PR, concretize output now looks like this:

```console
> spack spec /v5fn6xo /wd2p2v7
[+]  openssl@3.3.1%apple-clang@16.0.0~docs+shared build_system=generic certs=mozilla arch=darwin-sequoia-m1
[+]      ^ca-certificates-mozilla@2023-05-30%apple-clang@16.0.0 build_system=generic arch=darwin-sequoia-m1
[+]      ^gmake@4.4.1%apple-clang@16.0.0~guile build_system=generic arch=darwin-sequoia-m1
[+]      ^perl@5.40.0%apple-clang@16.0.0+cpanm+opcode+open+shared+threads build_system=generic arch=darwin-sequoia-m1
[+]          ^berkeley-db@18.1.40%apple-clang@16.0.0+cxx~docs+stl build_system=autotools patches=26090f4,b231fcc arch=darwin-sequoia-m1
[+]          ^bzip2@1.0.8%apple-clang@16.0.0~debug~pic+shared build_system=generic arch=darwin-sequoia-m1
[+]              ^diffutils@3.10%apple-clang@16.0.0 build_system=autotools arch=darwin-sequoia-m1
[+]                  ^libiconv@1.17%apple-clang@16.0.0 build_system=autotools libs=shared,static arch=darwin-sequoia-m1
[+]          ^gdbm@1.23%apple-clang@16.0.0 build_system=autotools arch=darwin-sequoia-m1
[+]              ^readline@8.2%apple-clang@16.0.0 build_system=autotools patches=bbf97f1 arch=darwin-sequoia-m1
[+]                  ^ncurses@6.5%apple-clang@16.0.0~symlinks+termlib abi=none build_system=autotools patches=7a351bc arch=darwin-sequoia-m1
[+]                      ^pkgconf@2.2.0%apple-clang@16.0.0 build_system=autotools arch=darwin-sequoia-m1
[+]      ^zlib-ng@2.2.1%apple-clang@16.0.0+compat+new_strategies+opt+pic+shared build_system=autotools arch=darwin-sequoia-m1
[+]          ^gnuconfig@2022-09-17%apple-clang@16.0.0 build_system=generic arch=darwin-sequoia-m1
[+]  py-six@1.16.0%apple-clang@16.0.0 build_system=python_pip arch=darwin-sequoia-m1
[+]      ^py-pip@23.1.2%apple-clang@16.0.0 build_system=generic arch=darwin-sequoia-m1
[+]      ^py-setuptools@69.2.0%apple-clang@16.0.0 build_system=generic arch=darwin-sequoia-m1
[-]      ^py-wheel@0.41.2%apple-clang@16.0.0 build_system=generic arch=darwin-sequoia-m1
...
```

With no input spec displayed -- just the concretization output shown as one consolidated
tree and multiple roots.

- [x] remove "Input Spec" section and "Concretized" header from `spack spec` output
- [x] print concretized specs as one BFS tree instead of multiple

---------

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2024-11-14 08:16:16 +01:00
psakievich
020e30f3e6 Update tutorial version (#47593) 2024-11-14 08:15:37 +01:00
Harmen Stoppels
181c404af5 missing and redundant imports (#47577) 2024-11-13 13:03:29 +01:00
dslarm
9642b04513 Add SVE as a variant for Neoverse N2. Default to true, but should be (#47567)
benchmarked to test if that is a correct decision.
2024-11-12 22:09:05 -07:00
John Gouwar
bf16f0bf74 Add solver capability for synthesizing splices of ABI compatible packages. (#46729)
This PR provides 2 complementary features:
1. An augmentation to the package language to express ABI compatibility relationships among packages. 
2. An extension to the concretizer that can synthesize splices between ABI compatible packages.

1.  The `can_splice` directive and ABI compatibility 
We augment the package language with a single directive: `can_splice`. Here is an example of a package `Foo` exercising the `can_splice` directive:

```python
class Foo(Package):
    version("1.0")
    version("1.1")
    variant("compat", default=True)
    variant("json", default=False)
    variant("pic", default=False)
    can_splice("foo@1.0", when="@1.1")
    can_splice("bar@1.0", when="@1.0+compat")
    can_splice("baz@1.0+compat", when="@1.0+compat", match_variants="*")
    can_splice("quux@1.0", when="@1.1~compat", match_variants="json")
```

Explanations of the uses of each directive: 
- `can_splice("foo@1.0", when="@1.1")`: If `foo@1.0` is the dependency of an already installed spec and `foo@1.1` could be a valid dependency for the parent spec, then `foo@1.1` can be spliced in for `foo@1.0` in the parent spec.
- `can_splice("bar@1.0", when="@1.0+compat")`: If `bar@1.0` is the dependency of an already installed spec and `foo@1.0+compat` could be a valid dependency for the parent spec, then `foo@1.0+compat` can be spliced in for `bar@1.0` in the parent spec.
- `can_splice("baz@1.0+compat", when="@1.0+compat", match_variants="*")`: If `baz@1.0+compat` is the dependency of an already installed spec and `foo@1.0+compat` could be a valid dependency for the parent spec, then `foo@1.0+compat` can be spliced in for `baz@1.0+compat` in the parent spec, provided that they have the same value for all other variants (regardless of what those values are).
- `can_splice("quux@1.0", when="@1.1~compat", match_variants="json")`: If `quux@1.0` is the dependency of an already installed spec and `foo@1.1~compat` could be a valid dependency for the parent spec, then `foo@1.1~compat` can be spliced in for `quux@1.0` in the parent spec, provided that they have the same value for their `json` variant.

2. Augmenting the solver to synthesize splices
### Changes to the hash encoding in `asp.py`
Previously, when including concrete specs in the solve, they would have the following form:

```
installed_hash("foo", "xxxyyy")
imposed_constraint("xxxyyy", "foo", "attr1", ...)
imposed_constraint("xxxyyy", "foo", "attr2", ...)
% etc.
```

Concrete specs now have the following form:

```
installed_hash("foo", "xxxyyy")
hash_attr("xxxyyy", "foo", "attr1", ...)
hash_attr("xxxyyy", "foo", "attr2", ...)
```

This transformation allows us to control which constraints are imposed when we select a hash, to facilitate the splicing of dependencies. 

2.1 Compiling `can_splice` directives in `asp.py`
Consider the concrete spec:

```
foo@2.72%gcc@11.4 arch=linux-ubuntu22.04-icelake build_system=autotools ^bar ...
```

It will emit the following facts for reuse (below is a subset):

```
installed_hash("foo", "xxxyyy")
hash_attr("xxxyyy", "hash", "foo", "xxxyyy")
hash_attr("xxxyyy", "version", "foo", "2.72")
hash_attr("xxxyyy", "node_os", "ubuntu22.04")
hash_attr("xxxyyy", "hash", "bar", "zzzqqq")
hash_attr("xxxyyy", "depends_on", "foo", "bar", "link")
```

Rules that derive `abi_splice_conditions_hold` will be generated from
use of the `can_splice` directive. They will have the following form:

```
can_splice("foo@1.0.0+a", when="@1.0.1+a", match_variants=["b"]) --->

abi_splice_conditions_hold(0, node(SID, "foo"), "foo", BaseHash) :-
  installed_hash("foo", BaseHash),
  attr("node", node(SID, SpliceName)),
  attr("node_version_satisfies", node(SID, "foo"), "1.0.1"),
  hash_attr("hash", "node_version_satisfies", "foo", "1.0.1"),
  attr("variant_value", node(SID, "foo"), "a", "True"),
  hash_attr("hash", "variant_value", "foo", "a", "True"),
  attr("variant_value", node(SID, "foo"), "b", VariVar0),
  hash_attr("hash", "variant_value", "foo", "b", VariVar0).
```


2.2 Synthesizing splices in `concretize.lp` and `splices.lp`

The ASP solver generates `splice_at_hash` attrs to indicate that a particular node has a splice in one of its immediate dependencies.

Splices can be introduced in the dependencies of concrete specs when `splices.lp` is conditionally loaded (based on the config option `concretizer:splice:True`).

2.3 Constructing spliced specs in `asp.py`

The method `SpecBuilder._resolve_splices` implements a top-down memoized implementation of hybrid splicing. This is an optimization over the more general `Spec.splice`, since the solver gives a global view of exactly which specs can be shared, to ensure the minimal number of splicing operations. 

Misc changes to facilitate configuration and benchmarking 
- Added the method `Solver.solve_with_stats` to expose timers from the public interface for easier benchmarking 
- Added the boolean config option `concretizer:splice` to conditionally load splicing behavior 

Co-authored-by: Greg Becker <becker33@llnl.gov>
2024-11-12 20:51:19 -08:00
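A minimal sketch of the configuration knobs this commit adds (the key layout matches the `concretizer.yaml` change shown further down this page):

```yaml
concretizer:
  splice:
    explicit: []      # hash-based replacements, from the earlier explicit interface
    automatic: true   # conditionally loads splices.lp so the solver can synthesize splices
```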
v
ad518d975c py-nugraph, ph5concat, py-numl: Add new nugraph packages (#47315) 2024-11-13 01:34:11 +01:00
SXS Bot
a76e3f2030 spectre: add v2024.03.19 (#43275)
Co-authored-by: sxs-bot <sxs-bot@users.noreply.github.com>
2024-11-12 15:16:27 -07:00
Greg Becker
1809b81e1d parse_specs: special case for concretizing lookups quickly (#47556)
We added unification semantics for parsing specs from the CLI, but there are a couple
of special cases in which we can avoid calls to the concretizer for speed when the
specs can all be resolved by lookups.

- [x] special case 1: solving a single spec

- [x] special case 2: all specs are either concrete (come from a file) or have an abstract
      hash. In this case if `concretizer:unify:true` we need an additional check to confirm
      the specs are compatible.

- [x] add a parameterized test for unifying on the CI

---------

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2024-11-12 15:04:47 -07:00
Alec Scott
a02b40b670 restic: add v0.17.3 (#47553) 2024-11-12 14:15:53 -07:00
Alec Scott
6d8fdbcf82 direnv: add v2.35.0 (#47551) 2024-11-12 13:54:19 -07:00
78 changed files with 2145 additions and 509 deletions

View File

@@ -52,7 +52,13 @@ jobs:
           # Needed for unit tests
           sudo apt-get -y install \
             coreutils cvs gfortran graphviz gnupg2 mercurial ninja-build \
-            cmake bison libbison-dev kcov
+            cmake bison libbison-dev subversion
+      # On ubuntu 24.04, kcov was removed. It may come back in some future Ubuntu
+      - name: Set up Homebrew
+        id: set-up-homebrew
+        uses: Homebrew/actions/setup-homebrew@40e9946c182a64b3db1bf51be0dcb915f7802aa9
+      - name: Install kcov with brew
+        run: "brew install kcov"
       - name: Install Python packages
         run: |
           pip install --upgrade pip setuptools pytest pytest-xdist pytest-cov
@@ -99,7 +105,13 @@ jobs:
         run: |
           sudo apt-get -y update
           # Needed for shell tests
-          sudo apt-get install -y coreutils kcov csh zsh tcsh fish dash bash
+          sudo apt-get install -y coreutils csh zsh tcsh fish dash bash subversion
+      # On ubuntu 24.04, kcov was removed. It may come back in some future Ubuntu
+      - name: Set up Homebrew
+        id: set-up-homebrew
+        uses: Homebrew/actions/setup-homebrew@40e9946c182a64b3db1bf51be0dcb915f7802aa9
+      - name: Install kcov with brew
+        run: "brew install kcov"
       - name: Install Python packages
         run: |
           pip install --upgrade pip setuptools pytest coverage[toml] pytest-xdist
@@ -134,7 +146,7 @@ jobs:
       - name: Setup repo and non-root user
         run: |
           git --version
-          git config --global --add safe.directory /__w/spack/spack
+          git config --global --add safe.directory '*'
           git fetch --unshallow
           . .github/workflows/bin/setup_git.sh
           useradd spack-test

View File

@@ -74,7 +74,7 @@
       - name: Setup repo and non-root user
         run: |
           git --version
-          git config --global --add safe.directory /__w/spack/spack
+          git config --global --add safe.directory '*'
           git fetch --unshallow
           . .github/workflows/bin/setup_git.sh
           useradd spack-test

View File

@@ -1,3 +1,395 @@
# v0.23.1 (2025-02-19)
## Bugfixes
- Fix a correctness issue of `ArchSpec.intersects` (#48741)
- Make `extra_attributes` order independent in Spec hashing (#48615, #48854)
- Fix issue where system proxy settings were not respected in OCI build caches (#48783)
- Fix an issue where the `--test` concretizer flag was not forwarded correctly (#48417)
- Fix an issue where `codesign` and `install_name_tool` would not preserve hardlinks on
Darwin (#47808)
- Fix an issue on Darwin where codesign would run on unmodified binaries (#48568)
- Patch configure scripts generated with libtool < 2.5.4, to avoid redundant flags when
creating shared libraries on Darwin (#48671)
- Fix issue related to mirror URL paths on Windows (#47898)
- Ensure proper UTF-8 encoding/decoding in logging (#48005)
- Fix issues related to `filter_file` (#48038, #48108)
- Fix issue related to creating bootstrap source mirrors (#48235)
- Fix issue where command line config arguments were not always top level (#48255)
- Fix an incorrect typehint of `concretized()` (#48504)
- Improve mention of next Spack version in warning (#47887)
- Tests: fix forward compatibility with Python 3.13 (#48209)
- Docs: encourage use of `--oci-username-variable` and `--oci-password-variable` (#48189)
- Docs: ensure Getting Started has bootstrap list output in correct place (#48281)
- CI: allow GitHub actions to run on forks of Spack with different project name (#48041)
- CI: make unit tests work on Ubuntu 24.04 (#48151)
- CI: re-enable cray pipelines (#47697)
## Package updates
- `qt-base`: fix rpath for dependents (#47424)
- `gdk-pixbuf`: fix outdated URL (#47825)
# v0.23.0 (2024-11-13)
`v0.23.0` is a major feature release.
We are planning to make this the last major release before Spack `v1.0`
in June 2025. Alongside `v0.23`, we will be making pre-releases (alpha,
beta, etc.) of `v1.0`, and we encourage users to try them and send us
feedback, either on GitHub or on Slack. You can track the road to
`v1.0` here:
* https://github.com/spack/spack/releases
* https://github.com/spack/spack/discussions/30634
## Features in this Release
1. **Language virtuals**
Your packages can now explicitly depend on the languages they require.
Historically, Spack has considered C, C++, and Fortran compiler
dependencies to be implicit. In `v0.23`, you should ensure that
new packages add relevant C, C++, and Fortran dependencies like this:
```python
depends_on("c", type="build")
depends_on("cxx", type="build")
depends_on("fortran", type="build")
```
We encourage you to add these annotations to your packages now, to prepare
for Spack `v1.0.0`. In `v1.0.0`, these annotations will be necessary for
your package to use C, C++, and Fortran compilers. Note that you should
*not* add language dependencies to packages that don't need them, e.g.,
pure python packages.
We have already auto-generated these dependencies for packages in the
`builtin` repository (see #45217), based on the types of source files
present in each package's source code. We *may* have added too many or too
few language dependencies, so please submit pull requests to correct
packages if you find that the language dependencies are incorrect.
Note that we have also backported support for these dependencies to
`v0.21.3` and `v0.22.2`, to make all of them forward-compatible with
`v0.23`. This should allow you to move easily between older and newer Spack
releases without breaking your packages.
2. **Spec splicing**
We are working to make binary installation more seamless in Spack. `v0.23`
introduces "splicing", which allows users to deploy binaries using local,
optimized versions of a binary interface, even if they were not built with
that interface. For example, this would allow you to build binaries in the
cloud using `mpich` and install them on a system using a local, optimized
version of `mvapich2` *without rebuilding*. Spack preserves full provenance
for the installed packages and knows that they were built one way but
deployed another.
Our intent is to leverage this across many key HPC binary packages,
e.g. MPI, CUDA, ROCm, and libfabric.
Fundamentally, splicing allows Spack to redeploy an existing spec with
different dependencies than how it was built. There are two interfaces to
splicing.
a. Explicit Splicing
#39136 introduced the explicit splicing interface. In the
concretizer config, you can specify a target spec and a replacement
by hash.
```yaml
concretizer:
splice:
explicit:
- target: mpi
replacement: mpich/abcdef
```
Here, every installation that would normally use the target spec will
instead use its replacement. Above, any spec using *any* `mpi` will be
spliced to depend on the specific `mpich` installation requested. This
*can* go wrong if you try to replace something built with, e.g.,
`openmpi` with `mpich`, and it is on the user to ensure ABI
compatibility between target and replacement specs. This currently
requires some expertise to use, but it will allow users to reuse the
binaries they create across more machines and environments.
b. Automatic Splicing (experimental)
#46729 introduced automatic splicing. In the concretizer config, enable
automatic splicing:
```yaml
concretizer:
splice:
automatic: true
```
or run:
```console
spack config add concretizer:splice:automatic:true
```
The concretizer will select splices for ABI compatibility to maximize
package reuse. Packages can denote ABI compatibility using the
`can_splice` directive. No packages in Spack yet use this directive, so
if you want to use this feature you will need to add `can_splice`
annotations to your packages. We are working on ways to add more ABI
compatibility information to the Spack package repository, and this
directive may change in the future.
See the documentation for more details:
* https://spack.readthedocs.io/en/latest/build_settings.html#splicing
* https://spack.readthedocs.io/en/latest/packaging_guide.html#specifying-abi-compatibility
3. Broader variant propagation
Since #42931, you can specify propagated variants like `hdf5
build_type==RelWithDebInfo` or `trilinos ++openmp` to propagate a variant
to all dependencies for which it is relevant. This is valid *even* if the
variant does not exist on the package or its dependencies.
See https://spack.readthedocs.io/en/latest/basic_usage.html#variants.
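For instance (a sketch using the specs quoted above):
```console
$ spack spec hdf5 build_type==RelWithDebInfo   # '==' propagates the variant to dependencies
$ spack spec trilinos ++openmp                 # '++' propagates a boolean variant
```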
4. Query specs by namespace
#45416 allows a package's namespace (indicating the repository it came from)
to be treated like a variant. You can request packages from particular repos
like this:
```console
spack find zlib namespace=builtin
spack find zlib namespace=myrepo
```
Previously, the spec syntax only allowed namespaces to be prefixes of spec
names, e.g. `builtin.zlib`. The previous syntax still works.
5. `spack spec` respects environment settings and `unify:true`
`spack spec` did not previously respect environment lockfiles or
unification settings, which made it difficult to see exactly how a spec
would concretize within an environment. Now it does, so the output you get
with `spack spec` will be *the same* as what your environment will
concretize to when you run `spack concretize`. Similarly, if you provide
multiple specs on the command line with `spack spec`, it will concretize
them together if `unify:true` is set.
See #47556 and #44843.
6. Less noisy `spack spec` output
`spack spec` previously showed output like this:
```console
> spack spec /v5fn6xo
Input spec
--------------------------------
- /v5fn6xo
Concretized
--------------------------------
[+] openssl@3.3.1%apple-clang@16.0.0~docs+shared arch=darwin-sequoia-m1
...
```
But the input spec is redundant: we run `spack spec` precisely to concretize
that input. `spack spec` now *only* shows the concretized spec. See #47574.
7. Better output for `spack find -c`
In an environment, `spack find -c` lets you search the concretized, but not
yet installed, specs, just as you would the installed ones. As with `spack
spec`, this should make it easier for you to see what *will* be built
before building and installing it. See #44713.
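A sketch of the workflow inside an activated environment (the package name is illustrative):
```console
$ spack concretize     # concretize the environment's specs
$ spack find -c hdf5   # inspect concretized-but-not-installed specs before installing
```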
8. `spack -C <env>`: use an environment's configuration without activation
Spack environments allow you to associate:
1. a set of (possibly concretized) specs, and
2. configuration
When you activate an environment, you're using both of these. Previously, we
supported:
* `spack -e <env>` to run spack in the context of a specific environment, and
* `spack -C <directory>` to run spack using a directory with configuration files.
You can now also pass an environment to `spack -C` to use *only* the environment's
configuration, but not the specs or lockfile. See #45046.
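A sketch contrasting the two flags (environment name and path hypothetical):
```console
$ spack -e myenv install           # use the environment's specs *and* configuration
$ spack -C ~/envs/myenv spec zlib  # use only the environment's configuration
```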
## New commands, options, and directives
* The new `spack env track` command (#41897) takes a non-managed Spack
environment and adds a symlink to Spack's `$environments_root` directory, so
that it will be included for reference counting for commands like `spack
uninstall` and `spack gc`. If you use free-standing directory environments,
this is useful for preventing Spack from removing things required by your
environments. You can undo this tracking with the `spack env untrack`
command.
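A sketch of the round trip (directory and environment names hypothetical):
```console
$ spack env track ~/projects/my-env   # symlink the directory into $environments_root
$ spack env untrack my-env            # undo the tracking
```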
* Add `-t` short option for `spack --backtrace` (#47227)
`spack -d / --debug` enables backtraces on error, but it can be very
verbose, and sometimes you just want the backtrace. `spack -t / --backtrace`
provides that option.
* `gc`: restrict to specific specs (#46790)
If you only want to garbage-collect specific packages, you can now provide
them on the command line. This gives users finer-grained control over what
is uninstalled.
* OCI buildcaches now support `--only=package`. You can now push *just* a
package and not its dependencies to an OCI registry. This allows dependents
of non-redistributable specs to be stored in OCI registries without an
error. See #45775.
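For example (mirror name hypothetical):
```console
$ spack buildcache push --only=package my_registry <spec>
```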
## Notable refactors
* Variants are now fully conditional
The `variants` dictionary on packages was previously keyed by variant name,
and allowed only one definition of any given variant. Spack is now smart
enough to understand that variants may have different values and defaults
for different versions. For example, `warpx` prior to `23.06` only supported
builds for one dimensionality, and newer `warpx` versions could be built
with support for many different dimensions:
```python
variant(
"dims",
default="3",
values=("1", "2", "3", "rz"),
multi=False,
description="Number of spatial dimensions",
when="@:23.05",
)
variant(
"dims",
default="1,2,rz,3",
values=("1", "2", "3", "rz"),
multi=True,
description="Number of spatial dimensions",
when="@23.06:",
)
```
Previously, the default for the old version of `warpx` was not respected and
had to be specified manually. Now, Spack will select the right variant
definition for each version at concretization time. This allows variants to
evolve more smoothly over time. See #44425 for details.
## Highlighted bugfixes
1. Externals no longer override the preferred provider (#45025).
External definitions could interfere with package preferences. Now, if
`openmpi` is the preferred `mpi`, and an external `mpich` is defined, a new
`openmpi` *will* be built if building it is possible. Previously we would
prefer `mpich` despite the preference.
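A sketch of the kind of configuration this fixes (standard `packages.yaml` layout; version and prefix illustrative). With this config, Spack now builds `openmpi` as the `mpi` provider instead of reaching for the external `mpich`:
```yaml
packages:
  all:
    providers:
      mpi: [openmpi]   # openmpi is the preferred mpi provider
  mpich:
    externals:
    - spec: mpich@4.1
      prefix: /usr
```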
2. Composable `cflags` (#41049).
This release fixes a longstanding bug where concretization would fail if
different `cflags` were specified in `packages.yaml`,
`compilers.yaml`, or on the CLI. Flags and their ordering are now tracked
in the concretizer, and flags from multiple sources are merged.
3. Fix concretizer unification for included environments (#45139).
## Deprecations, removals, and syntax changes
1. The old concretizer has been removed from Spack, along with the
`config:concretizer` config option. Spack will emit a warning if the option
is present in user configuration, since it now has no effect. Spack now
uses a simpler bootstrapping mechanism, where a JSON prototype is tweaked
slightly to get an initial concrete spec to download. See #45215.
2. Best-effort expansion of spec matrices has been removed. This feature did
not work with the "new" ASP-based concretizer, and did not work with
`unify: True` or `unify: when_possible`. Use the
[exclude key](https://spack.readthedocs.io/en/latest/environments.html#spec-matrices)
for the environment to exclude invalid components, or use multiple spec
matrices to combine the list of specs for which the constraint is valid and
the list of specs for which it is not. See #40792.
3. The old Cray `platform` (based on Cray PE modules) has been removed, and
`platform=cray` is no longer supported. Since `v0.19`, Spack has handled
Cray machines like Linux clusters with extra packages, and we have
encouraged using this option to support Cray. The new approach allows us to
correctly handle Cray machines with non-SLES operating systems, and it is
much more reliable than making assumptions about Cray modules. See the
`v0.19` release notes and #43796 for more details.
4. The `config:install_missing_compilers` config option has been deprecated,
and it is a no-op when set in `v0.23`. Our new compiler dependency model
will replace it with a much more reliable and robust mechanism in `v1.0`.
See #46237.
5. Config options that were deprecated in `v0.21` have been removed in `v0.23`. You
can now only specify preferences for `compilers`, `targets`, and
`providers` globally via the `packages:all:` section. Similarly, you can
only specify `versions:` locally for a specific package. See #44061 and
#31261 for details.
6. Spack's old test interface has been removed (#45752), having been
deprecated in `v0.22.0` (#34236). All `builtin` packages have been updated
to use the new interface. See the [stand-alone test documentation](
https://spack.readthedocs.io/en/latest/packaging_guide.html#stand-alone-tests).
7. The `spack versions --safe-only` option, deprecated since `v0.21.0`, has
been removed. See #45765.
8. The `--dependencies` and `--optimize` arguments to `spack ci` have been
deprecated. See #45005.
## Binary caches
1. Public binary caches now include an ML stack for Linux/aarch64 (#39666). We
now build an ML stack for Linux/aarch64 for all pull requests and on
develop. The ML stack includes both CPU-only and CUDA builds for Horovod,
Hugging Face, JAX, Keras, PyTorch, scikit-learn, TensorBoard, and
TensorFlow, as well as related packages. The CPU-only stack also includes XGBoost.
See https://cache.spack.io/tag/develop/?stack=ml-linux-aarch64-cuda.
2. There is also now a stack of developer tools for macOS (#46910), which is
analogous to the Linux devtools stack. You can use this to avoid building
many common build dependencies. See
https://cache.spack.io/tag/develop/?stack=developer-tools-darwin.
## Architecture support
* archspec has been updated to `v0.2.5`, with support for `zen5`
* Spack's CUDA package now supports the Grace Hopper `9.0a` compute capability (#45540)
## Windows
* Windows bootstrapping: `file` and `gpg` (#41810)
* `scripts` directory added to PATH on Windows for python extensions (#45427)
* Fix `spack load --list` and `spack unload` on Windows (#35720)
## Other notable changes
* Bugfix: `spack find -x` in environments (#46798)
* Spec splices are now robust to duplicate nodes with the same name in a spec (#46382)
* Cache per-compiler libc calculations for performance (#47213)
* Fixed a bug in external detection for openmpi (#47541)
* Mirror configuration allows username/password as environment variables (#46549)
* Default library search caps maximum depth (#41945)
* Unify interface for `spack spec` and `spack solve` commands (#47182)
* Spack no longer RPATHs directories in the default library search path (#44686)
* Improved performance of Spack database (#46554)
* Enable package reuse for packages with versions from git refs (#43859)
* Improved handling for `uuid` virtual on macOS (#43002)
* Improved tracking of task queueing/requeueing in the installer (#46293)
## Spack community stats
* Over 2,000 pull requests updated package recipes
* 8,307 total packages, 329 new since `v0.22.0`
* 140 new Python packages
* 14 new R packages
* 373 people contributed to this release
* 357 committers to packages
* 60 committers to core
# v0.22.2 (2024-09-21)
## Bugfixes

@@ -419,7 +811,7 @@
- spack graph: fix coloring with environments (#41240)
- spack info: sort variants in --variants-by-name (#41389)
- Spec.format: error on old style format strings (#41934)
- ASP-based solver:
  - fix infinite recursion when computing concretization errors (#41061)
  - don't error for type mismatch on preferences (#41138)
  - don't emit spurious debug output (#41218)

View File

@@ -39,7 +39,8 @@ concretizer:
   # Option to deal with possible duplicate nodes (i.e. different nodes from the same package) in the DAG.
   duplicates:
     # "none": allows a single node for any package in the DAG.
-    # "minimal": allows the duplication of 'build-tools' nodes only (e.g. py-setuptools, cmake etc.)
+    # "minimal": allows the duplication of 'build-tools' nodes only
+    #   (e.g. py-setuptools, cmake etc.)
     # "full" (experimental): allows separation of the entire build-tool stack (e.g. the entire "cmake" subDAG)
     strategy: minimal
   # Option to specify compatibility between operating systems for reuse of compilers and packages
@@ -47,3 +48,10 @@ concretizer:
   # it can reuse. Note this is a directional compatibility so mutual compatibility between two OS's
   # requires two entries i.e. os_compatible: {sonoma: [monterey], monterey: [sonoma]}
   os_compatible: {}
+
+  # Option to specify whether to support splicing. Splicing allows for
+  # the relinking of concrete package dependencies in order to better
+  # reuse already built packages with ABI compatible dependencies
+  splice:
+    explicit: []
+    automatic: false

View File

@@ -265,25 +265,30 @@ infrastructure, or to cache Spack built binaries in Github Actions and
 GitLab CI.
 
 To get started, configure an OCI mirror using ``oci://`` as the scheme,
-and optionally specify a username and password (or personal access token):
+and optionally specify variables that hold the username and password (or
+personal access token) for the registry:
 
 .. code-block:: console
 
-    $ spack mirror add --oci-username username --oci-password password my_registry oci://example.com/my_image
+    $ spack mirror add --oci-username-variable REGISTRY_USER \
+        --oci-password-variable REGISTRY_TOKEN \
+        my_registry oci://example.com/my_image
 
 Spack follows the naming conventions of Docker, with Dockerhub as the default
 registry. To use Dockerhub, you can omit the registry domain:
 
 .. code-block:: console
 
-    $ spack mirror add --oci-username username --oci-password password my_registry oci://username/my_image
+    $ spack mirror add ... my_registry oci://username/my_image
 
 From here, you can use the mirror as any other build cache:
 
 .. code-block:: console
 
+    $ export REGISTRY_USER=...
+    $ export REGISTRY_TOKEN=...
+
     $ spack buildcache push my_registry <specs...>  # push to the registry
-    $ spack install <specs...>  # install from the registry
+    $ spack install <specs...>  # or install from the registry
 
 A unique feature of buildcaches on top of OCI registries is that it's incredibly
 easy to generate a runnable container image with the binaries installed. This

View File

@@ -237,3 +237,35 @@ is optional -- by default, splices will be transitive.
 ``mpich/abcdef`` instead of ``mvapich2`` as the MPI provider.  Spack
 will warn the user in this case, but will not fail the
 concretization.
+
+.. _automatic_splicing:
+
+^^^^^^^^^^^^^^^^^^
+Automatic Splicing
+^^^^^^^^^^^^^^^^^^
+
+The Spack solver can be configured to do automatic splicing for
+ABI-compatible packages. Automatic splices are enabled in the concretizer
+config section
+
+.. code-block:: yaml
+
+   concretizer:
+     splice:
+       automatic: True
+
+Packages can include ABI-compatibility information using the
+``can_splice`` directive. See :ref:`the packaging
+guide<abi_compatibility>` for instructions on specifying ABI
+compatibility using the ``can_splice`` directive.
+
+.. note::
+
+   The ``can_splice`` directive is experimental and may be changed in
+   future versions.
+
+When automatic splicing is enabled, the concretizer will combine any
+number of ABI-compatible specs if possible to reuse installed packages
+and packages available from binary caches. The end result of these
+specs is equivalent to a series of transitive/intransitive splices,
+but the series may be non-obvious.

View File

@@ -38,9 +38,11 @@ just have to configure an OCI registry and run ``spack buildcache push``.
     spack -e . install
 
     # Configure the registry
-    spack -e . mirror add --oci-username ... --oci-password ... container-registry oci://example.com/name/image
+    spack -e . mirror add --oci-username-variable REGISTRY_USER \
+        --oci-password-variable REGISTRY_TOKEN \
+        container-registry oci://example.com/name/image
 
-    # Push the image
+    # Push the image (do set REGISTRY_USER and REGISTRY_TOKEN)
     spack -e . buildcache push --update-index --base-image ubuntu:22.04 --tag my_env container-registry
 
 The resulting container image can then be run as follows:

View File

@@ -148,20 +148,22 @@ The first time you concretize a spec, Spack will bootstrap automatically:
    --------------------------------
    zlib@1.2.13%gcc@9.4.0+optimize+pic+shared build_system=makefile arch=linux-ubuntu20.04-icelake
 
+The default bootstrap behavior is to use pre-built binaries. You can verify the
+active bootstrap repositories with:
+
+.. command-output:: spack bootstrap list
+
 If for security concerns you cannot bootstrap ``clingo`` from pre-built
 binaries, you have to disable fetching the binaries we generated with Github Actions.
 
 .. code-block:: console
 
-   $ spack bootstrap disable github-actions-v0.4
-   ==> "github-actions-v0.4" is now disabled and will not be used for bootstrapping
-   $ spack bootstrap disable github-actions-v0.3
-   ==> "github-actions-v0.3" is now disabled and will not be used for bootstrapping
+   $ spack bootstrap disable github-actions-v0.6
+   ==> "github-actions-v0.6" is now disabled and will not be used for bootstrapping
+   $ spack bootstrap disable github-actions-v0.5
+   ==> "github-actions-v0.5" is now disabled and will not be used for bootstrapping
 
-You can verify that the new settings are effective with:
-
-.. command-output:: spack bootstrap list
+You can verify that the new settings are effective with ``spack bootstrap list``.
 
 .. note::

View File

@@ -1267,7 +1267,7 @@ Git fetching supports the following parameters to ``version``:
   This feature requires ``git`` to be version ``2.25.0`` or later but is useful for
   large repositories that have separate portions that can be built independently.
   If paths provided are directories then all the subdirectories and associated files
   will also be cloned.
 
 Only one of ``tag``, ``branch``, or ``commit`` can be used at a time.

@@ -1367,8 +1367,8 @@ Submodules
   git-submodule``.
 
 Sparse-Checkout
   You can supply ``git_sparse_paths`` at the package or version level to utilize git's
   sparse-checkout feature. This will only clone the paths that are specified in the
   ``git_sparse_paths`` attribute for the package along with the files in the top level directory.
   This feature allows you to only clone what you need from a large repository.
   Note that this is a newer feature in git and requires git ``2.25.0`` or greater.

@@ -2392,7 +2392,7 @@ by the ``--jobs`` option:
 .. code-block:: python
    :emphasize-lines: 7, 11
    :linenos:
 
    class Xios(Package):
       ...
       def install(self, spec, prefix):

@@ -5420,7 +5420,7 @@ by build recipes. Examples of checking :ref:`variant settings <variants>` and
 determine whether it needs to also set up build dependencies (see
 :ref:`test-build-tests`).
 
 The ``MyPackage`` package below provides two basic test examples:
 ``test_example`` and ``test_example2``. The first runs the installed
 ``example`` and ensures its output contains an expected string. The second
 runs ``example2`` without checking output so is only concerned with confirming

@@ -5737,7 +5737,7 @@ subdirectory of the installation prefix. They are automatically copied to
 the appropriate relative paths under the test stage directory prior to
 executing stand-alone tests.
 
 .. tip::
 
    *Perform test-related conversions once when copying files.*
@@ -7113,6 +7113,46 @@ might write:
    CXXFLAGS += -I$DWARF_PREFIX/include
    CXXFLAGS += -L$DWARF_PREFIX/lib
 
+.. _abi_compatibility:
+
+----------------------------
+Specifying ABI Compatibility
+----------------------------
+
+Packages can include ABI-compatibility information using the
+``can_splice`` directive. For example, if ``Foo`` version 1.1 can
+always replace version 1.0, then the package could have:
+
+.. code-block:: python
+
+   can_splice("foo@1.0", when="@1.1")
+
+For virtual packages, packages can also specify ABI-compatibility with
+other packages providing the same virtual. For example, ``zlib-ng``
+could specify:
+
+.. code-block:: python
+
+   can_splice("zlib@1.3.1", when="@2.2+compat")
+
+Some packages have ABI-compatibility that is dependent on matching
+variant values, either for all variants or for some set of
+ABI-relevant variants. In those cases, it is not necessary to specify
+the full combinatorial explosion. The ``match_variants`` keyword can
+cover all single-value variants.
+
+.. code-block:: python
+
+   can_splice("foo@1.1", when="@1.2", match_variants=["bar"])  # any value for bar as long as they're the same
+   can_splice("foo@1.2", when="@1.3", match_variants="*")  # any variant values if all single-value variants match
+
+The concretizer will use ABI compatibility to determine automatic
+splices when :ref:`automatic splicing<automatic_splicing>` is enabled.
+
+.. note::
+
+   The ``can_splice`` directive is experimental, and may be replaced
+   by a higher-level interface in future versions of Spack.
+
 .. _package_class_structure:

View File

@@ -24,6 +24,7 @@
     Callable,
     Deque,
     Dict,
+    Generator,
     Iterable,
     List,
     Match,
@@ -300,35 +301,32 @@ def filter_file(
     ignore_absent: bool = False,
     start_at: Optional[str] = None,
     stop_at: Optional[str] = None,
+    encoding: Optional[str] = "utf-8",
 ) -> None:
     r"""Like sed, but uses python regular expressions.
 
-    Filters every line of each file through regex and replaces the file
-    with a filtered version. Preserves mode of filtered files.
+    Filters every line of each file through regex and replaces the file with a filtered version.
+    Preserves mode of filtered files.
 
-    As with re.sub, ``repl`` can be either a string or a callable.
-    If it is a callable, it is passed the match object and should
-    return a suitable replacement string. If it is a string, it
-    can contain ``\1``, ``\2``, etc. to represent back-substitution
-    as sed would allow.
+    As with re.sub, ``repl`` can be either a string or a callable. If it is a callable, it is
+    passed the match object and should return a suitable replacement string. If it is a string, it
+    can contain ``\1``, ``\2``, etc. to represent back-substitution as sed would allow.
 
     Args:
-        regex (str): The regular expression to search for
-        repl (str): The string to replace matches with
-        *filenames: One or more files to search and replace
-        string (bool): Treat regex as a plain string. Default is False
-        backup (bool): Make backup file(s) suffixed with ``~``. Default is False
-        ignore_absent (bool): Ignore any files that don't exist.
-            Default is False
-        start_at (str): Marker used to start applying the replacements. If a
-            text line matches this marker filtering is started at the next line.
-            All contents before the marker and the marker itself are copied
-            verbatim. Default is to start filtering from the first line of the
-            file.
-        stop_at (str): Marker used to stop scanning the file further. If a text
-            line matches this marker filtering is stopped and the rest of the
-            file is copied verbatim. Default is to filter until the end of the
-            file.
+        regex: The regular expression to search for
+        repl: The string to replace matches with
+        *filenames: One or more files to search and replace
+        string: Treat regex as a plain string. Default is False
+        backup: Make backup file(s) suffixed with ``~``. Default is False
+        ignore_absent: Ignore any files that don't exist. Default is False
+        start_at: Marker used to start applying the replacements. If a text line matches this
+            marker filtering is started at the next line. All contents before the marker and the
+            marker itself are copied verbatim. Default is to start filtering from the first line of
+            the file.
+        stop_at: Marker used to stop scanning the file further. If a text line matches this marker
+            filtering is stopped and the rest of the file is copied verbatim. Default is to filter
+            until the end of the file.
+        encoding: The encoding to use when reading and writing the files. Default is None, which
+            uses the system's default encoding.
     """
     # Allow strings to use \1, \2, etc. for replacement, like sed
     if not callable(repl):
@@ -344,72 +342,56 @@ def groupid_to_group(x):
     if string:
         regex = re.escape(regex)
 
-    for filename in path_to_os_path(*filenames):
-        msg = 'FILTER FILE: {0} [replacing "{1}"]'
-        tty.debug(msg.format(filename, regex))
-
-        backup_filename = filename + "~"
-        tmp_filename = filename + ".spack~"
-
-        if ignore_absent and not os.path.exists(filename):
-            msg = 'FILTER FILE: file "{0}" not found. Skipping to next file.'
-            tty.debug(msg.format(filename))
-            continue
+    regex_compiled = re.compile(regex)
+    for path in path_to_os_path(*filenames):
+        if ignore_absent and not os.path.exists(path):
+            tty.debug(f'FILTER FILE: file "{path}" not found. Skipping to next file.')
+            continue
+        else:
+            tty.debug(f'FILTER FILE: {path} [replacing "{regex}"]')
 
-        # Create backup file. Don't overwrite an existing backup
-        # file in case this file is being filtered multiple times.
-        if not os.path.exists(backup_filename):
-            shutil.copy(filename, backup_filename)
-
-        # Create a temporary file to read from. We cannot use backup_filename
-        # in case filter_file is invoked multiple times on the same file.
-        shutil.copy(filename, tmp_filename)
+        fd, temp_path = tempfile.mkstemp(
+            prefix=f"{os.path.basename(path)}.", dir=os.path.dirname(path)
+        )
+        os.close(fd)
+        shutil.copy(path, temp_path)
+        errored = False
 
         try:
-            # Open as a text file and filter until the end of the file is
-            # reached, or we found a marker in the line if it was specified
-            #
-            # To avoid translating line endings (\n to \r\n and vice-versa)
-            # we force os.open to ignore translations and use the line endings
-            # the file comes with
-            with open(tmp_filename, mode="r", errors="surrogateescape", newline="") as input_file:
-                with open(filename, mode="w", errors="surrogateescape", newline="") as output_file:
-                    do_filtering = start_at is None
-                    # Using iter and readline is a workaround needed not to
-                    # disable input_file.tell(), which will happen if we call
-                    # input_file.next() implicitly via the for loop
-                    for line in iter(input_file.readline, ""):
-                        if stop_at is not None:
-                            current_position = input_file.tell()
-                            if stop_at == line.strip():
-                                output_file.write(line)
-                                break
-                        if do_filtering:
-                            filtered_line = re.sub(regex, repl, line)
-                            output_file.write(filtered_line)
-                        else:
-                            do_filtering = start_at == line.strip()
-                            output_file.write(line)
-                    else:
-                        current_position = None
-
-            # If we stopped filtering at some point, reopen the file in
-            # binary mode and copy verbatim the remaining part
-            if current_position and stop_at:
-                with open(tmp_filename, mode="rb") as input_binary_buffer:
-                    input_binary_buffer.seek(current_position)
-                    with open(filename, mode="ab") as output_binary_buffer:
-                        output_binary_buffer.writelines(input_binary_buffer.readlines())
-
+            # Open as a text file and filter until the end of the file is reached, or we found a
+            # marker in the line if it was specified. To avoid translating line endings (\n to
+            # \r\n and vice-versa) use newline="".
+            with open(
+                temp_path, mode="r", errors="surrogateescape", newline="", encoding=encoding
+            ) as input_file, open(
+                path, mode="w", errors="surrogateescape", newline="", encoding=encoding
+            ) as output_file:
+                if start_at is None and stop_at is None:  # common case, avoids branching in loop
+                    for line in input_file:
+                        output_file.write(re.sub(regex_compiled, repl, line))
+                else:
+                    # state is -1 before start_at; 0 between; 1 after stop_at
+                    state = 0 if start_at is None else -1
+                    for line in input_file:
+                        if state == 0:
+                            if stop_at == line.strip():
+                                state = 1
+                            else:
+                                line = re.sub(regex_compiled, repl, line)
+                        elif state == -1 and start_at == line.strip():
+                            state = 0
+                        output_file.write(line)
         except BaseException:
-            # clean up the original file on failure.
-            shutil.move(backup_filename, filename)
+            # restore the original file
+            os.rename(temp_path, path)
+            errored = True
             raise
-
         finally:
-            os.remove(tmp_filename)
-
-            if not backup and os.path.exists(backup_filename):
-                os.remove(backup_filename)
+            if not errored and not backup:
+                os.unlink(temp_path)
class FileFilter: class FileFilter:
@@ -2838,6 +2820,25 @@ def temporary_dir(
         remove_directory_contents(tmp_dir)
 
 
+@contextmanager
+def edit_in_place_through_temporary_file(file_path: str) -> Generator[str, None, None]:
+    """Context manager for modifying ``file_path`` in place, preserving its inode and hardlinks,
+    for functions or external tools that do not support in-place editing. Notice that this function
+    is unsafe in that it works with paths instead of file descriptors, but this is by design,
+    since we assume the call site will create a new inode at the same path."""
+    tmp_fd, tmp_path = tempfile.mkstemp(
+        dir=os.path.dirname(file_path), prefix=f"{os.path.basename(file_path)}."
+    )
+    # windows cannot replace a file with open fds, so close since the call site needs to replace.
+    os.close(tmp_fd)
+    try:
+        shutil.copyfile(file_path, tmp_path, follow_symlinks=True)
+        yield tmp_path
+        shutil.copyfile(tmp_path, file_path, follow_symlinks=True)
+    finally:
+        os.unlink(tmp_path)
+
+
 def filesummary(path, print_bytes=16) -> Tuple[int, bytes]:
     """Create a small summary of the given file. Does not error
     when file does not exist.
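A hypothetical use of the context manager added above (the tool invocation is illustrative; this mirrors the Darwin hardlink-preserving `codesign` fix from #47808):

```python
import subprocess

from llnl.util.filesystem import edit_in_place_through_temporary_file

binary_path = "/opt/spack/opt/.../bin/tool"  # placeholder path
# codesign replaces its input file, which would break hardlinks; editing a
# temporary copy and copying the bytes back preserves the original inode.
with edit_in_place_through_temporary_file(binary_path) as tmp:
    subprocess.run(["codesign", "-f", "-s", "-", tmp], check=True)
```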

View File

@@ -879,10 +879,13 @@ def _writer_daemon(
     write_fd.close()
 
     # 1. Use line buffering (3rd param = 1) since Python 3 has a bug
-    # that prevents unbuffered text I/O.
-    # 2. Python 3.x before 3.7 does not open with UTF-8 encoding by default
+    # that prevents unbuffered text I/O. [needs citation]
+    # 2. Enforce a UTF-8 interpretation of build process output with errors replaced by '?'.
+    #    The downside is that the log file will not contain the exact output of the build process.
     # 3. closefd=False because Connection has "ownership"
-    read_file = os.fdopen(read_fd.fileno(), "r", 1, encoding="utf-8", closefd=False)
+    read_file = os.fdopen(
+        read_fd.fileno(), "r", 1, encoding="utf-8", errors="replace", closefd=False
+    )
 
     if stdin_fd:
         stdin_file = os.fdopen(stdin_fd.fileno(), closefd=False)
@@ -928,11 +931,7 @@ def _writer_daemon(
    try:
        while line_count < 100:
            # Handle output from the calling process.
-            try:
-                line = _retry(read_file.readline)()
-            except UnicodeDecodeError:
-                # installs like --test=root gpgme produce non-UTF8 logs
-                line = "<line lost: output was not encoded as UTF-8>\n"
+            line = _retry(read_file.readline)()

            if not line:
                return
@@ -946,6 +945,13 @@ def _writer_daemon(
            output_line = clean_line
            if filter_fn:
                output_line = filter_fn(clean_line)
enc = sys.stdout.encoding
if enc != "utf-8":
# On Python 3.6 and 3.7-3.14 with non-{utf-8,C} locale stdout
# may not be able to handle utf-8 output. We do an inefficient
# dance of re-encoding with errors replaced, so stdout.write
# does not raise.
output_line = output_line.encode(enc, "replace").decode(enc)
            sys.stdout.write(output_line)

        # Stripped output to log file.
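The re-encoding dance above can be read in isolation: round-trip the text through stdout's own encoding with errors replaced, so the subsequent `sys.stdout.write` cannot raise `UnicodeEncodeError`. A standalone sketch of the same idea:

```
import sys

def write_safely(text: str) -> None:
    # if stdout is not UTF-8, substitute anything its encoding cannot
    # represent before writing, instead of raising
    enc = sys.stdout.encoding
    if enc != "utf-8":
        text = text.encode(enc, "replace").decode(enc)
    sys.stdout.write(text)

write_safely("build output with non-ascii: libpríma\n")
```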

View File

@@ -11,7 +11,7 @@
import spack.util.git

#: PEP440 canonical <major>.<minor>.<micro>.<devN> string
-__version__ = "0.23.0.dev0"
+__version__ = "0.23.1"
spack_version = __version__

View File

@@ -1366,14 +1366,8 @@ def _test_detection_by_executable(pkgs, debug_log, error_cls):
    def _compare_extra_attribute(_expected, _detected, *, _spec):
        result = []
-        # Check items are of the same type
-        if not isinstance(_detected, type(_expected)):
-            _summary = f'{pkg_name}: error when trying to detect "{_expected}"'
-            _details = [f"{_detected} was detected instead"]
-            return [error_cls(summary=_summary, details=_details)]
-
        # If they are string expected is a regex
-        if isinstance(_expected, str):
+        if isinstance(_expected, str) and isinstance(_detected, str):
            try:
                _regex = re.compile(_expected)
            except re.error:
@@ -1389,7 +1383,7 @@ def _compare_extra_attribute(_expected, _detected, *, _spec):
                _details = [f"{_detected} does not match the regex"]
                return [error_cls(summary=_summary, details=_details)]

-        if isinstance(_expected, dict):
+        elif isinstance(_expected, dict) and isinstance(_detected, dict):
            _not_detected = set(_expected.keys()) - set(_detected.keys())
            if _not_detected:
                _summary = f"{pkg_name}: cannot detect some attributes for spec {_spec}"
@@ -1404,6 +1398,10 @@ def _compare_extra_attribute(_expected, _detected, *, _spec):
                result.extend(
                    _compare_extra_attribute(_expected[_key], _detected[_key], _spec=_spec)
                )
+        else:
+            _summary = f'{pkg_name}: error when trying to detect "{_expected}"'
+            _details = [f"{_detected} was detected instead"]
+            return [error_cls(summary=_summary, details=_details)]

        return result

View File

@@ -2332,7 +2332,9 @@ def is_backup_file(file):
    if not codesign:
        return
    for binary in changed_files:
-        codesign("-fs-", binary)
+        # preserve the original inode by running codesign on a copy
+        with fsys.edit_in_place_through_temporary_file(binary) as tmp_binary:
+            codesign("-fs-", tmp_binary)

    # If we are installing back to the same location
    # relocate the sbang location if the spack directory changed

View File

@@ -357,6 +357,13 @@ def _do_patch_libtool_configure(self):
        )
        # Support Libtool 2.4.2 and older:
        x.filter(regex=r'^(\s*test \$p = "-R")(; then\s*)$', repl=r'\1 || test x-l = x"$p"\2')
# Configure scripts generated with libtool < 2.5.4 have a faulty test for the
# -single_module linker flag. A deprecation warning makes it think the default is
# -multi_module, triggering it to use problematic linker flags (such as ld -r). The
# linker default is `-single_module` from (ancient) macOS 10.4, so override by setting
# `lt_cv_apple_cc_single_mod=yes`. See the fix in libtool commit
# 82f7f52123e4e7e50721049f7fa6f9b870e09c9d.
x.filter("lt_cv_apple_cc_single_mod=no", "lt_cv_apple_cc_single_mod=yes", string=True)
    @spack.builder.run_after("configure")
    def _do_patch_libtool(self):

View File

@@ -8,6 +8,7 @@
import os
import re
import sys
+from collections import Counter
from typing import List, Union

import llnl.string
@@ -24,6 +25,7 @@
import spack.extensions
import spack.parser
import spack.paths
+import spack.repo
import spack.spec
import spack.store
import spack.traverse as traverse
@@ -165,7 +167,9 @@ def quote_kvp(string: str) -> str:
def parse_specs(
-    args: Union[str, List[str]], concretize: bool = False, tests: bool = False
+    args: Union[str, List[str]],
+    concretize: bool = False,
+    tests: spack.concretize.TestsType = False,
) -> List[spack.spec.Spec]:
    """Convenience function for parsing arguments from specs. Handles common
    exceptions and dies if there are errors.
@@ -177,11 +181,13 @@ def parse_specs(
    if not concretize:
        return specs

-    to_concretize = [(s, None) for s in specs]
+    to_concretize: List[spack.concretize.SpecPairInput] = [(s, None) for s in specs]
    return _concretize_spec_pairs(to_concretize, tests=tests)


-def _concretize_spec_pairs(to_concretize, tests=False):
+def _concretize_spec_pairs(
+    to_concretize: List[spack.concretize.SpecPairInput], tests: spack.concretize.TestsType = False
+) -> List[spack.spec.Spec]:
    """Helper method that concretizes abstract specs from a list of abstract,concrete pairs.

    Any spec with a concrete spec associated with it will concretize to that spec. Any spec
@@ -189,6 +195,43 @@ def _concretize_spec_pairs(to_concretize, tests=False):
rules from config.""" rules from config."""
unify = spack.config.get("concretizer:unify", False) unify = spack.config.get("concretizer:unify", False)
# Special case for concretizing a single spec
if len(to_concretize) == 1:
abstract, concrete = to_concretize[0]
return [concrete or abstract.concretized(tests=tests)]
# Special case if every spec is either concrete or has an abstract hash
if all(
concrete or abstract.concrete or abstract.abstract_hash
for abstract, concrete in to_concretize
):
# Get all the concrete specs
ret = [
concrete or (abstract if abstract.concrete else abstract.lookup_hash())
for abstract, concrete in to_concretize
]
# If unify: true, check that specs don't conflict
# Since all concrete, "when_possible" is not relevant
if unify is True: # True, "when_possible", False are possible values
runtimes = spack.repo.PATH.packages_with_tags("runtime")
specs_per_name = Counter(
spec.name
for spec in traverse.traverse_nodes(
ret, deptype=("link", "run"), key=traverse.by_dag_hash
)
if spec.name not in runtimes # runtimes are allowed multiple times
)
conflicts = sorted(name for name, count in specs_per_name.items() if count > 1)
if conflicts:
raise spack.error.SpecError(
"Specs conflict and `concretizer:unify` is configured true.",
f" specs depend on multiple versions of {', '.join(conflicts)}",
)
return ret
# Standard case
    concretize_method = spack.concretize.concretize_separately  # unify: false
    if unify is True:
        concretize_method = spack.concretize.concretize_together
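The `unify: true` fast path above reduces to counting node names across the already-concrete DAGs, exempting runtime packages. The same check in isolation (names are illustrative):

```
from collections import Counter

def conflicting_names(node_names, runtimes=frozenset()):
    # count every link/run node name, then report non-runtime duplicates
    counts = Counter(name for name in node_names if name not in runtimes)
    return sorted(name for name, count in counts.items() if count > 1)

# two roots pinning different zlib versions both contribute a "zlib" node:
assert conflicting_names(["zlib", "cmake", "zlib"]) == ["zlib"]
```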

View File

@@ -29,7 +29,7 @@
# Tarball to be downloaded if binary packages are requested in a local mirror
-BINARY_TARBALL = "https://github.com/spack/spack-bootstrap-mirrors/releases/download/v0.4/bootstrap-buildcache.tar.gz"
+BINARY_TARBALL = "https://github.com/spack/spack-bootstrap-mirrors/releases/download/v0.6/bootstrap-buildcache.tar.gz"

#: Subdirectory where to create the mirror
LOCAL_MIRROR_DIR = "bootstrap_cache"
@@ -51,9 +51,9 @@
    },
}

-CLINGO_JSON = "$spack/share/spack/bootstrap/github-actions-v0.4/clingo.json"
-GNUPG_JSON = "$spack/share/spack/bootstrap/github-actions-v0.4/gnupg.json"
-PATCHELF_JSON = "$spack/share/spack/bootstrap/github-actions-v0.4/patchelf.json"
+CLINGO_JSON = "$spack/share/spack/bootstrap/github-actions-v0.6/clingo.json"
+GNUPG_JSON = "$spack/share/spack/bootstrap/github-actions-v0.6/gnupg.json"
+PATCHELF_JSON = "$spack/share/spack/bootstrap/github-actions-v0.6/patchelf.json"

# Metadata for a generated source mirror
SOURCE_METADATA = {

View File

@@ -528,6 +528,7 @@ def __call__(self, parser, namespace, values, option_string):
        # the const from the constructor or a value from the CLI.
        # Note that this is only called if the argument is actually
        # specified on the command line.
+        spack.config.CONFIG.ensure_scope_ordering()
        spack.config.set(self.config_path, self.const, scope="command_line")

View File

@@ -3,7 +3,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

-import datetime
import os
import re
from collections import defaultdict
@@ -97,7 +96,7 @@ def list_files(args):
OLD_LICENSE, SPDX_MISMATCH, GENERAL_MISMATCH = range(1, 4)

#: Latest year that copyright applies. UPDATE THIS when bumping copyright.
-latest_year = datetime.date.today().year
+latest_year = 2024  # year of 0.23 release
strict_date = r"Copyright 2013-%s" % latest_year

#: regexes for valid license lines at tops of files

View File

@@ -82,14 +82,6 @@ def spec(parser, args):
    if args.namespaces:
        fmt = "{namespace}." + fmt

-    tree_kwargs = {
-        "cover": args.cover,
-        "format": fmt,
-        "hashlen": None if args.very_long else 7,
-        "show_types": args.types,
-        "status_fn": install_status_fn if args.install_status else None,
-    }
-
    # use a read transaction if we are getting install status for every
    # spec in the DAG. This avoids repeatedly querying the DB.
    tree_context = lang.nullcontext
@@ -99,46 +91,35 @@ def spec(parser, args):
    env = ev.active_environment()
    if args.specs:
-        input_specs = spack.cmd.parse_specs(args.specs)
-        concretized_specs = spack.cmd.parse_specs(args.specs, concretize=True)
-        specs = list(zip(input_specs, concretized_specs))
+        concrete_specs = spack.cmd.parse_specs(args.specs, concretize=True)
    elif env:
        env.concretize()
-        specs = env.concretized_specs()
-
-        if not args.format:
-            # environments are printed together in a combined tree() invocation,
-            # except when using --yaml or --json, which we print spec by spec below.
-            tree_kwargs["key"] = spack.traverse.by_dag_hash
-            tree_kwargs["hashes"] = args.long or args.very_long
-            print(spack.spec.tree([concrete for _, concrete in specs], **tree_kwargs))
-            return
+        concrete_specs = env.concrete_roots()
    else:
        tty.die("spack spec requires at least one spec or an active environment")

-    for input, output in specs:
-        # With --yaml or --json, just print the raw specs to output
-        if args.format:
-            if args.format == "yaml":
-                # use write because to_yaml already has a newline.
-                sys.stdout.write(output.to_yaml(hash=ht.dag_hash))
-            elif args.format == "json":
-                print(output.to_json(hash=ht.dag_hash))
-            else:
-                print(output.format(args.format))
-            continue
-
-        with tree_context():
-            # Only show the headers for input specs that are not concrete to avoid
-            # repeated output. This happens because parse_specs outputs concrete
-            # specs for `/hash` inputs.
-            if not input.concrete:
-                tree_kwargs["hashes"] = False  # Always False for input spec
-                print("Input spec")
-                print("--------------------------------")
-                print(input.tree(**tree_kwargs))
-
-                print("Concretized")
-                print("--------------------------------")
-
-            tree_kwargs["hashes"] = args.long or args.very_long
-            print(output.tree(**tree_kwargs))
+    # With --yaml, --json, or --format, just print the raw specs to output
+    if args.format:
+        for spec in concrete_specs:
+            if args.format == "yaml":
+                # use write because to_yaml already has a newline.
+                sys.stdout.write(spec.to_yaml(hash=ht.dag_hash))
+            elif args.format == "json":
+                print(spec.to_json(hash=ht.dag_hash))
+            else:
+                print(spec.format(args.format))
+        return
+
+    with tree_context():
+        print(
+            spack.spec.tree(
+                concrete_specs,
+                cover=args.cover,
+                format=fmt,
+                hashlen=None if args.very_long else 7,
+                show_types=args.types,
+                status_fn=install_status_fn if args.install_status else None,
+                hashes=args.long or args.very_long,
+                key=spack.traverse.by_dag_hash,
+            )
+        )

View File

@@ -24,7 +24,7 @@
# tutorial configuration parameters
-tutorial_branch = "releases/v0.22"
+tutorial_branch = "releases/v0.23"
tutorial_mirror = "file:///mirror"
tutorial_key = os.path.join(spack.paths.share_path, "keys", "tutorial.pub")

View File

@@ -6,7 +6,7 @@
import sys
import time
from contextlib import contextmanager
-from typing import Iterable, Optional, Sequence, Tuple, Union
+from typing import Iterable, List, Optional, Sequence, Tuple, Union

import llnl.util.tty as tty
@@ -36,6 +36,7 @@ def enable_compiler_existence_check():
        CHECK_COMPILER_EXISTENCE = saved


+SpecPairInput = Tuple[Spec, Optional[Spec]]
SpecPair = Tuple[Spec, Spec]
SpecLike = Union[Spec, str]
TestsType = Union[bool, Iterable[str]]
@@ -60,8 +61,8 @@ def concretize_specs_together(
def concretize_together(
-    spec_list: Sequence[SpecPair], tests: TestsType = False
-) -> Sequence[SpecPair]:
+    spec_list: Sequence[SpecPairInput], tests: TestsType = False
+) -> List[SpecPair]:
    """Given a number of specs as input, tries to concretize them together.

    Args:
@@ -77,8 +78,8 @@ def concretize_together(
def concretize_together_when_possible(
-    spec_list: Sequence[SpecPair], tests: TestsType = False
-) -> Sequence[SpecPair]:
+    spec_list: Sequence[SpecPairInput], tests: TestsType = False
+) -> List[SpecPair]:
    """Given a number of specs as input, tries to concretize them together to the extent possible.

    See documentation for ``unify: when_possible`` concretization for the precise definition of
@@ -114,8 +115,8 @@ def concretize_together_when_possible(
def concretize_separately(
-    spec_list: Sequence[SpecPair], tests: TestsType = False
-) -> Sequence[SpecPair]:
+    spec_list: Sequence[SpecPairInput], tests: TestsType = False
+) -> List[SpecPair]:
    """Concretizes the input specs separately from each other.

    Args:

View File

@@ -431,6 +431,19 @@ def ensure_unwrapped(self) -> "Configuration":
"""Ensure we unwrap this object from any dynamic wrapper (like Singleton)""" """Ensure we unwrap this object from any dynamic wrapper (like Singleton)"""
return self return self
def highest(self) -> ConfigScope:
"""Scope with highest precedence"""
return next(reversed(self.scopes.values())) # type: ignore
@_config_mutator
def ensure_scope_ordering(self):
"""Ensure that scope order matches documented precedent"""
# FIXME: We also need to consider that custom configurations and other orderings
# may not be preserved correctly
if "command_line" in self.scopes:
# TODO (when dropping python 3.6): self.scopes.move_to_end
self.scopes["command_line"] = self.remove_scope("command_line")
    @_config_mutator
    def push_scope(self, scope: ConfigScope) -> None:
        """Add a higher precedence scope to the Configuration."""

View File

@@ -77,6 +77,7 @@ class OpenMpi(Package):
"build_system", "build_system",
"requires", "requires",
"redistribute", "redistribute",
"can_splice",
] ]
_patch_order_index = 0 _patch_order_index = 0
@@ -505,6 +506,43 @@ def _execute_provides(pkg: "spack.package_base.PackageBase"):
    return _execute_provides
@directive("splice_specs")
def can_splice(
target: SpecType, *, when: SpecType, match_variants: Union[None, str, List[str]] = None
):
"""Packages can declare whether they are ABI-compatible with another package
and thus can be spliced into concrete versions of that package.
Args:
target: The spec that the current package is ABI-compatible with.
when: An anonymous spec constraining current package for when it is
ABI-compatible with target.
match_variants: A list of variants that must match
between target spec and current package, with special value '*'
which matches all variants. Example: a variant is defined on both
packages called json, and they are ABI-compatible whenever they agree on
the json variant (regardless of whether it is turned on or off). Note
that this cannot be applied to multi-valued variants and multi-valued
variants will be skipped by '*'.
"""
def _execute_can_splice(pkg: "spack.package_base.PackageBase"):
when_spec = _make_when_spec(when)
if isinstance(match_variants, str) and match_variants != "*":
raise ValueError(
"* is the only valid string for match_variants "
"if looking to provide a single variant, use "
f"[{match_variants}] instead"
)
if when_spec is None:
return
pkg.splice_specs[when_spec] = (spack.spec.Spec(target), match_variants)
return _execute_can_splice
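A hypothetical `package.py` illustrating the directive (package name, URL, and checksums are placeholders): `foo@1.1` declares it can be spliced in for `foo@1.0`, provided both specs agree on the `json` variant.

```
from spack.package import *

class Foo(Package):
    homepage = "https://example.com/foo"
    url = "https://example.com/foo-1.1.tar.gz"

    version("1.1", sha256="0" * 64)  # placeholder checksum
    version("1.0", sha256="1" * 64)  # placeholder checksum

    variant("json", default=True, description="enable JSON support")

    # ABI-compatible replacement for foo@1.0 whenever this package is @1.1,
    # as long as the two specs agree on the value of the json variant
    can_splice("foo@1.0", when="@1.1", match_variants=["json"])
```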
@directive("patches") @directive("patches")
def patch( def patch(
url_or_filename: str, url_or_filename: str,

View File

@@ -3044,11 +3044,13 @@ def prepare_config_scope(self) -> None:
"""Add the manifest's scopes to the global configuration search path.""" """Add the manifest's scopes to the global configuration search path."""
for scope in self.env_config_scopes: for scope in self.env_config_scopes:
spack.config.CONFIG.push_scope(scope) spack.config.CONFIG.push_scope(scope)
spack.config.CONFIG.ensure_scope_ordering()
def deactivate_config_scope(self) -> None: def deactivate_config_scope(self) -> None:
"""Remove any of the manifest's scopes from the global config path.""" """Remove any of the manifest's scopes from the global config path."""
for scope in self.env_config_scopes: for scope in self.env_config_scopes:
spack.config.CONFIG.remove_scope(scope.name) spack.config.CONFIG.remove_scope(scope.name)
spack.config.CONFIG.ensure_scope_ordering()
    @contextlib.contextmanager
    def use_config(self):

View File

@@ -180,7 +180,7 @@ def ensure_mirror_usable(self, direction: str = "push"):
        if errors:
            msg = f"invalid {direction} configuration for mirror {self.name}: "
            msg += "\n ".join(errors)
-            raise spack.mirror.MirrorError(msg)
+            raise MirrorError(msg)

    def _update_connection_dict(self, current_data: dict, new_data: dict, top_level: bool):
        # Only allow one to exist in the config

View File

@@ -397,6 +397,7 @@ def create_opener():
"""Create an opener that can handle OCI authentication.""" """Create an opener that can handle OCI authentication."""
opener = urllib.request.OpenerDirector() opener = urllib.request.OpenerDirector()
for handler in [ for handler in [
urllib.request.ProxyHandler(),
urllib.request.UnknownHandler(), urllib.request.UnknownHandler(),
urllib.request.HTTPSHandler(context=spack.util.web.ssl_create_default_context()), urllib.request.HTTPSHandler(context=spack.util.web.ssl_create_default_context()),
spack.util.web.SpackHTTPDefaultErrorHandler(), spack.util.web.SpackHTTPDefaultErrorHandler(),
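A `ProxyHandler` constructed with no arguments reads the system proxy settings (e.g. the `https_proxy`/`no_proxy` environment variables), which is what makes the OCI opener honor them. In isolation:

```
import urllib.request

# with https_proxy=http://proxy.example.com:8080 exported, requests opened
# through this opener are routed via the proxy (URLs are illustrative)
opener = urllib.request.build_opener(urllib.request.ProxyHandler())
# opener.open("https://registry.example.com/v2/")
```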

View File

@@ -622,6 +622,7 @@ class PackageBase(WindowsRPath, PackageViewMixin, RedistributionMixin, metaclass
    patches: Dict["spack.spec.Spec", List["spack.patch.Patch"]]
    variants: Dict["spack.spec.Spec", Dict[str, "spack.variant.Variant"]]
    languages: Dict["spack.spec.Spec", Set[str]]
+    splice_specs: Dict["spack.spec.Spec", Tuple["spack.spec.Spec", Union[None, str, List[str]]]]

    #: By default, packages are not virtual
    #: Virtual packages override this attribute

View File

@@ -13,6 +13,7 @@
import macholib.mach_o
import macholib.MachO

+import llnl.util.filesystem as fs
import llnl.util.lang
import llnl.util.tty as tty
from llnl.util.lang import memoized
@@ -275,10 +276,10 @@ def modify_macho_object(cur_path, rpaths, deps, idpath, paths_to_paths):
    # Deduplicate and flatten
    args = list(itertools.chain.from_iterable(llnl.util.lang.dedupe(args)))
+    install_name_tool = executable.Executable("install_name_tool")
    if args:
-        args.append(str(cur_path))
-        install_name_tool = executable.Executable("install_name_tool")
-        install_name_tool(*args)
+        with fs.edit_in_place_through_temporary_file(cur_path) as temp_path:
+            install_name_tool(*args, temp_path)


def macholib_get_paths(cur_path):
@@ -717,8 +718,8 @@ def fixup_macos_rpath(root, filename):
        # No fixes needed
        return False

-    args.append(abspath)
-    executable.Executable("install_name_tool")(*args)
+    with fs.edit_in_place_through_temporary_file(abspath) as temp_path:
+        executable.Executable("install_name_tool")(*args, temp_path)
    return True

View File

@@ -209,7 +209,7 @@ def _apply_to_file(self, f):
        # but it's nasty to deal with matches across boundaries, so let's stick to
        # something simple.

-        modified = True
+        modified = False

        for match in self.regex.finditer(f.read()):
            # The matching prefix (old) and its replacement (new)

View File

@@ -78,7 +78,8 @@
"transitive": {"type": "boolean", "default": False}, "transitive": {"type": "boolean", "default": False},
}, },
}, },
} },
"automatic": {"type": "boolean"},
}, },
}, },
"duplicates": { "duplicates": {

View File

@@ -106,8 +106,8 @@
    {
        "names": ["install_missing_compilers"],
        "message": "The config:install_missing_compilers option has been deprecated in "
-        "Spack v0.23, and is currently ignored. It will be removed from config in "
-        "Spack v0.25.",
+        "Spack v0.23, and is currently ignored. It will be removed from config after "
+        "Spack v1.0.",
        "error": False,
    },
],

View File

@@ -52,6 +52,7 @@
from .core import (
    AspFunction,
+    AspVar,
    NodeArgument,
    ast_sym,
    ast_type,
@@ -524,12 +525,14 @@ def _compute_specs_from_answer_set(self):
                node = SpecBuilder.make_node(pkg=providers[0])
            candidate = answer.get(node)

-            if candidate and candidate.build_spec.satisfies(input_spec):
-                if not candidate.satisfies(input_spec):
-                    tty.warn(
-                        "explicit splice configuration has caused the concretized spec"
-                        f" {candidate} not to satisfy the input spec {input_spec}"
-                    )
+            if candidate and candidate.satisfies(input_spec):
+                self._concrete_specs.append(answer[node])
+                self._concrete_specs_by_input[input_spec] = answer[node]
+            elif candidate and candidate.build_spec.satisfies(input_spec):
+                tty.warn(
+                    "explicit splice configuration has caused the concretized spec"
+                    f" {candidate} not to satisfy the input spec {input_spec}"
+                )
                self._concrete_specs.append(answer[node])
                self._concrete_specs_by_input[input_spec] = answer[node]
            else:
@@ -854,6 +857,8 @@ def solve(self, setup, specs, reuse=None, output=None, control=None, allow_depre
self.control.load(os.path.join(parent_dir, "libc_compatibility.lp")) self.control.load(os.path.join(parent_dir, "libc_compatibility.lp"))
else: else:
self.control.load(os.path.join(parent_dir, "os_compatibility.lp")) self.control.load(os.path.join(parent_dir, "os_compatibility.lp"))
if setup.enable_splicing:
self.control.load(os.path.join(parent_dir, "splices.lp"))
timer.stop("load") timer.stop("load")
@@ -1166,6 +1171,9 @@ def __init__(self, tests: bool = False):
        # list of unique libc specs targeted by compilers (or an educated guess if no compiler)
        self.libcs: List[spack.spec.Spec] = []

+        # If true, we have to load the code for synthesizing splices
+        self.enable_splicing: bool = spack.config.CONFIG.get("concretizer:splice:automatic")
    def pkg_version_rules(self, pkg):
        """Output declared versions of a package.
@@ -1336,6 +1344,10 @@ def pkg_rules(self, pkg, tests):
        # dependencies
        self.package_dependencies_rules(pkg)

+        # splices
+        if self.enable_splicing:
+            self.package_splice_rules(pkg)

        # virtual preferences
        self.virtual_preferences(
            pkg.name,
@@ -1674,6 +1686,94 @@ def dependency_holds(input_spec, requirements):
            self.gen.newline()
def _gen_match_variant_splice_constraints(
self,
pkg,
cond_spec: "spack.spec.Spec",
splice_spec: "spack.spec.Spec",
hash_asp_var: "AspVar",
splice_node,
match_variants: List[str],
):
# If there are no variants to match, no constraints are needed
variant_constraints = []
for i, variant_name in enumerate(match_variants):
vari_defs = pkg.variant_definitions(variant_name)
# the spliceable config of the package always includes the variant
if vari_defs != [] and any(cond_spec.satisfies(s) for (s, _) in vari_defs):
variant = vari_defs[0][1]
if variant.multi:
continue # cannot automatically match multi-valued variants
value_var = AspVar(f"VariValue{i}")
attr_constraint = fn.attr("variant_value", splice_node, variant_name, value_var)
hash_attr_constraint = fn.hash_attr(
hash_asp_var, "variant_value", splice_spec.name, variant_name, value_var
)
variant_constraints.append(attr_constraint)
variant_constraints.append(hash_attr_constraint)
return variant_constraints
def package_splice_rules(self, pkg):
self.gen.h2("Splice rules")
for i, (cond, (spec_to_splice, match_variants)) in enumerate(
sorted(pkg.splice_specs.items())
):
with named_spec(cond, pkg.name):
self.version_constraints.add((cond.name, cond.versions))
self.version_constraints.add((spec_to_splice.name, spec_to_splice.versions))
hash_var = AspVar("Hash")
splice_node = fn.node(AspVar("NID"), cond.name)
when_spec_attrs = [
fn.attr(c.args[0], splice_node, *(c.args[2:]))
for c in self.spec_clauses(cond, body=True, required_from=None)
if c.args[0] != "node"
]
splice_spec_hash_attrs = [
fn.hash_attr(hash_var, *(c.args))
for c in self.spec_clauses(spec_to_splice, body=True, required_from=None)
if c.args[0] != "node"
]
if match_variants is None:
variant_constraints = []
elif match_variants == "*":
filt_match_variants = set()
for map in pkg.variants.values():
for k in map:
filt_match_variants.add(k)
filt_match_variants = list(filt_match_variants)
variant_constraints = self._gen_match_variant_splice_constraints(
pkg, cond, spec_to_splice, hash_var, splice_node, filt_match_variants
)
else:
if any(
v in cond.variants or v in spec_to_splice.variants for v in match_variants
):
raise Exception(
"Overlap between match_variants and explicitly set variants"
)
variant_constraints = self._gen_match_variant_splice_constraints(
pkg, cond, spec_to_splice, hash_var, splice_node, match_variants
)
rule_head = fn.abi_splice_conditions_hold(
i, splice_node, spec_to_splice.name, hash_var
)
rule_body_components = (
[
# splice_set_fact,
fn.attr("node", splice_node),
fn.installed_hash(spec_to_splice.name, hash_var),
]
+ when_spec_attrs
+ splice_spec_hash_attrs
+ variant_constraints
)
rule_body = ",\n ".join(str(r) for r in rule_body_components)
rule = f"{rule_head} :-\n {rule_body}."
self.gen.append(rule)
self.gen.newline()
    def virtual_preferences(self, pkg_name, func):
        """Call func(vspec, provider, i) for each of pkg's provider prefs."""
        config = spack.config.get("packages")
@@ -2536,8 +2636,9 @@ def concrete_specs(self):
        for h, spec in self.reusable_and_possible.explicit_items():
            # this indicates that there is a spec like this installed
            self.gen.fact(fn.installed_hash(spec.name, h))
-            # this describes what constraints it imposes on the solve
-            self.impose(h, spec, body=True)
+            # indirection layer between hash constraints and imposition to allow for splicing
+            for pred in self.spec_clauses(spec, body=True, required_from=None):
+                self.gen.fact(fn.hash_attr(h, *pred.args))
            self.gen.newline()
        # Declare as possible parts of specs that are not in package.py
        # - Add versions to possible versions
@@ -3478,6 +3579,14 @@ def consume_facts(self):
        self._setup.effect_rules()
# This should be a dataclass, but dataclasses don't work on Python 3.6
class Splice:
def __init__(self, splice_node: NodeArgument, child_name: str, child_hash: str):
self.splice_node = splice_node
self.child_name = child_name
self.child_hash = child_hash
class SpecBuilder:
    """Class with actions to rebuild a spec from ASP results."""
@@ -3513,10 +3622,11 @@ def make_node(*, pkg: str) -> NodeArgument:
""" """
return NodeArgument(id="0", pkg=pkg) return NodeArgument(id="0", pkg=pkg)
def __init__( def __init__(self, specs, hash_lookup=None):
self, specs: List[spack.spec.Spec], *, hash_lookup: Optional[ConcreteSpecsByHash] = None
):
self._specs: Dict[NodeArgument, spack.spec.Spec] = {} self._specs: Dict[NodeArgument, spack.spec.Spec] = {}
# Matches parent nodes to splice node
self._splices: Dict[NodeArgument, List[Splice]] = {}
self._result = None self._result = None
self._command_line_specs = specs self._command_line_specs = specs
self._flag_sources: Dict[Tuple[NodeArgument, str], Set[str]] = collections.defaultdict( self._flag_sources: Dict[Tuple[NodeArgument, str], Set[str]] = collections.defaultdict(
@@ -3600,16 +3710,8 @@ def external_spec_selected(self, node, idx):
    def depends_on(self, parent_node, dependency_node, type):
        dependency_spec = self._specs[dependency_node]
-        edges = self._specs[parent_node].edges_to_dependencies(name=dependency_spec.name)
-        edges = [x for x in edges if id(x.spec) == id(dependency_spec)]
        depflag = dt.flag_from_string(type)
+        self._specs[parent_node].add_dependency_edge(dependency_spec, depflag=depflag, virtuals=())

-        if not edges:
-            self._specs[parent_node].add_dependency_edge(
-                self._specs[dependency_node], depflag=depflag, virtuals=()
-            )
-        else:
-            edges[0].update_deptypes(depflag=depflag)
    def virtual_on_edge(self, parent_node, provider_node, virtual):
        dependencies = self._specs[parent_node].edges_to_dependencies(name=(provider_node.pkg))
@@ -3726,6 +3828,57 @@ def _order_index(flag_group):
    def deprecated(self, node: NodeArgument, version: str) -> None:
        tty.warn(f'using "{node.pkg}@{version}" which is a deprecated version')
def splice_at_hash(
self,
parent_node: NodeArgument,
splice_node: NodeArgument,
child_name: str,
child_hash: str,
):
splice = Splice(splice_node, child_name=child_name, child_hash=child_hash)
self._splices.setdefault(parent_node, []).append(splice)
def _resolve_automatic_splices(self):
"""After all of the specs have been concretized, apply all immediate splices.
Use reverse topological order to ensure that all dependencies are resolved
before their parents, allowing for maximal sharing and minimal copying.
"""
fixed_specs = {}
# create a mapping from dag hash to an integer representing position in reverse topo order.
specs = self._specs.values()
topo_order = list(traverse.traverse_nodes(specs, order="topo", key=traverse.by_dag_hash))
topo_lookup = {spec.dag_hash(): index for index, spec in enumerate(reversed(topo_order))}
# iterate over specs, children before parents
for node, spec in sorted(self._specs.items(), key=lambda x: topo_lookup[x[1].dag_hash()]):
immediate = self._splices.get(node, [])
if not immediate and not any(
edge.spec in fixed_specs for edge in spec.edges_to_dependencies()
):
continue
new_spec = spec.copy(deps=False)
new_spec.build_spec = spec
for edge in spec.edges_to_dependencies():
depflag = edge.depflag & ~dt.BUILD
if any(edge.spec.dag_hash() == splice.child_hash for splice in immediate):
splice = [s for s in immediate if s.child_hash == edge.spec.dag_hash()][0]
new_spec.add_dependency_edge(
self._specs[splice.splice_node], depflag=depflag, virtuals=edge.virtuals
)
elif edge.spec in fixed_specs:
new_spec.add_dependency_edge(
fixed_specs[edge.spec], depflag=depflag, virtuals=edge.virtuals
)
else:
new_spec.add_dependency_edge(
edge.spec, depflag=depflag, virtuals=edge.virtuals
)
self._specs[node] = new_spec
fixed_specs[spec] = new_spec
    @staticmethod
    def sort_fn(function_tuple) -> Tuple[int, int]:
        """Ensure attributes are evaluated in the correct order.
@@ -3755,7 +3908,6 @@ def build_specs(self, function_tuples):
        # them here so that directives that build objects (like node and
        # node_compiler) are called in the right order.
        self.function_tuples = sorted(set(function_tuples), key=self.sort_fn)

        self._specs = {}
        for name, args in self.function_tuples:
            if SpecBuilder.ignored_attributes.match(name):
@@ -3785,10 +3937,14 @@ def build_specs(self, function_tuples):
                continue

            # if we've already gotten a concrete spec for this pkg,
-            # do not bother calling actions on it
+            # do not bother calling actions on it except for node_flag_source,
+            # since node_flag_source is tracking information not in the spec itself
+            # we also need to keep track of splicing information.
            spec = self._specs.get(args[0])
            if spec and spec.concrete:
-                continue
+                do_not_ignore_attrs = ["node_flag_source", "splice_at_hash"]
+                if name not in do_not_ignore_attrs:
+                    continue

            action(*args)
@@ -3798,7 +3954,7 @@ def build_specs(self, function_tuples):
        # inject patches -- note that we can't use set() to unique the
        # roots here, because the specs aren't complete, and the hash
        # function will loop forever.
-        roots = [spec.root for spec in self._specs.values() if not spec.root.installed]
+        roots = [spec.root for spec in self._specs.values()]
        roots = dict((id(r), r) for r in roots)
        for root in roots.values():
            spack.spec.Spec.inject_patches_variant(root)
@@ -3814,6 +3970,8 @@ def build_specs(self, function_tuples):
        for root in roots.values():
            root._finalize_concretization()

+        self._resolve_automatic_splices()
+
        for s in self._specs.values():
            spack.spec.Spec.ensure_no_deprecated(s)
@@ -3828,7 +3986,6 @@ def build_specs(self, function_tuples):
        )

        specs = self.execute_explicit_splices()
        return specs

    def execute_explicit_splices(self):
@@ -4165,7 +4322,6 @@ def reusable_specs(self, specs: List[spack.spec.Spec]) -> List[spack.spec.Spec]:
        result = []
        for reuse_source in self.reuse_sources:
            result.extend(reuse_source.selected_specs())

        # If we only want to reuse dependencies, remove the root specs
        if self.reuse_strategy == ReuseStrategy.DEPENDENCIES:
            result = [spec for spec in result if not any(root in spec for root in specs)]
@@ -4335,11 +4491,10 @@ def __init__(self, provided, conflicts):
        super().__init__(msg)

+        self.provided = provided
        # Add attribute expected of the superclass interface
        self.required = None
        self.constraint_type = None
-        self.provided = provided


class InvalidSpliceError(spack.error.SpackError):

View File

@@ -1449,25 +1449,71 @@ attr("node_flag", PackageNode, NodeFlag) :- attr("node_flag_set", PackageNode, N
%-----------------------------------------------------------------------------
-% Installed packages
+% Installed Packages
%-----------------------------------------------------------------------------

-% the solver is free to choose at most one installed hash for each package
-{ attr("hash", node(ID, Package), Hash) : installed_hash(Package, Hash) } 1
-  :- attr("node", node(ID, Package)), internal_error("Package must resolve to at most one hash").
+#defined installed_hash/2.
+#defined abi_splice_conditions_hold/4.
+
+% These are the previously concretized attributes of the installed package as
+% a hash. It has the general form:
+%  hash_attr(Hash, Attribute, PackageName, Args*)
+#defined hash_attr/3.
+#defined hash_attr/4.
+#defined hash_attr/5.
+#defined hash_attr/6.
+#defined hash_attr/7.
+
+{ attr("hash", node(ID, PackageName), Hash): installed_hash(PackageName, Hash) } 1 :-
+  attr("node", node(ID, PackageName)),
+  internal_error("Package must resolve to at most 1 hash").

% you can't choose an installed hash for a dev spec
:- attr("hash", PackageNode, Hash), attr("variant_value", PackageNode, "dev_path", _).

% You can't install a hash, if it is not installed
:- attr("hash", node(ID, Package), Hash), not installed_hash(Package, Hash).

+% This should be redundant given the constraint above
+:- attr("node", PackageNode), 2 { attr("hash", PackageNode, Hash) }.

-% if a hash is selected, we impose all the constraints that implies
-impose(Hash, PackageNode) :- attr("hash", PackageNode, Hash).
+% hash_attrs are versions, but can_splice_attr are usually node_version_satisfies
+hash_attr(Hash, "node_version_satisfies", PackageName, Constraint) :-
+  hash_attr(Hash, "version", PackageName, Version),
+  pkg_fact(PackageName, version_satisfies(Constraint, Version)).
+
+% This recovers the exact semantics for hash reuse. hash and depends_on are where
+% splices are decided, and virtual_on_edge can result in name-changes, which is
+% why they are all treated separately.
+imposed_constraint(Hash, Attr, PackageName) :-
+  hash_attr(Hash, Attr, PackageName).
+imposed_constraint(Hash, Attr, PackageName, A1) :-
+  hash_attr(Hash, Attr, PackageName, A1), Attr != "hash".
+imposed_constraint(Hash, Attr, PackageName, Arg1, Arg2) :-
+  hash_attr(Hash, Attr, PackageName, Arg1, Arg2),
+  Attr != "depends_on",
+  Attr != "virtual_on_edge".
+imposed_constraint(Hash, Attr, PackageName, A1, A2, A3) :-
+  hash_attr(Hash, Attr, PackageName, A1, A2, A3).
+imposed_constraint(Hash, "hash", PackageName, Hash) :- installed_hash(PackageName, Hash).
+% Without splicing, we simply recover the exact semantics
+imposed_constraint(ParentHash, "hash", ChildName, ChildHash) :-
+  hash_attr(ParentHash, "hash", ChildName, ChildHash),
+  ChildHash != ParentHash,
+  not abi_splice_conditions_hold(_, _, ChildName, ChildHash).
+
+imposed_constraint(Hash, "depends_on", PackageName, DepName, Type) :-
+  hash_attr(Hash, "depends_on", PackageName, DepName, Type),
+  hash_attr(Hash, "hash", DepName, DepHash),
+  not attr("splice_at_hash", _, _, DepName, DepHash).
+
+imposed_constraint(Hash, "virtual_on_edge", PackageName, DepName, VirtName) :-
+  hash_attr(Hash, "virtual_on_edge", PackageName, DepName, VirtName),
+  not attr("splice_at_hash", _, _, DepName, _).
+
+% Rules pertaining to attr("splice_at_hash") and abi_splice_conditions_hold will
+% be conditionally loaded from splices.lp
+
+impose(Hash, PackageNode) :- attr("hash", PackageNode, Hash), attr("node", PackageNode).

-% if we haven't selected a hash for a package, we'll be building it
-build(PackageNode) :- not attr("hash", PackageNode, _), attr("node", PackageNode).
+% If there is not a hash for a package, we build it.
+build(PackageNode) :- attr("node", PackageNode), not concrete(PackageNode).
% Minimizing builds is tricky. We want a minimizing criterion % Minimizing builds is tricky. We want a minimizing criterion
@@ -1480,6 +1526,7 @@ build(PackageNode) :- not attr("hash", PackageNode, _), attr("node", PackageNode
% criteria for built specs -- so that they take precedence over the otherwise
% topmost-priority criterion to reuse what is installed.
%
% The priority ranges are:
% 1000+ Optimizations for concretization errors % 1000+ Optimizations for concretization errors
% 300 - 1000 Highest priority optimizations for valid solutions % 300 - 1000 Highest priority optimizations for valid solutions
@@ -1505,12 +1552,10 @@ build_priority(PackageNode, 0) :- not build(PackageNode), attr("node", Package
  pkg_fact(Package, version_declared(Version, Weight, "installed")),
  not optimize_for_reuse().

-#defined installed_hash/2.

% This statement, which is a hidden feature of clingo, let us avoid cycles in the DAG
#edge (A, B) : depends_on(A, B).

%-----------------------------------------------------------------
% Optimization to avoid errors
%-----------------------------------------------------------------

View File

@@ -44,6 +44,17 @@ def _id(thing: Any) -> Union[str, AspObject]:
return f'"{str(thing)}"' return f'"{str(thing)}"'
class AspVar(AspObject):
"""Represents a variable in an ASP rule, allows for conditionally generating
rules"""
def __init__(self, name: str):
self.name = name
def __str__(self) -> str:
return str(self.name)
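The point of `AspVar` is that it renders as a bare capitalized name, so clingo treats it as a variable when the generated rule is parsed, whereas strings are quoted into constants. A quick sketch (import path assumed from the surrounding file):

```
from spack.solver.core import AspVar

hash_var = AspVar("Hash")
print(str(hash_var))                           # -> Hash
print(f'installed_hash("foo", {hash_var}).')   # -> installed_hash("foo", Hash).
```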
@lang.key_ordering
class AspFunction(AspObject):
    """A term in the ASP logic program"""
@@ -88,6 +99,8 @@ def _argify(self, arg: Any) -> Any:
            return clingo().Number(arg)
        elif isinstance(arg, AspFunction):
            return clingo().Function(arg.name, [self._argify(x) for x in arg.args], positive=True)
+        elif isinstance(arg, AspVar):
+            return clingo().Variable(arg.name)
        return clingo().String(str(arg))

    def symbol(self):

View File

@@ -15,7 +15,6 @@
#show attr/4.
#show attr/5.
#show attr/6.

% names of optimization criteria
#show opt_criterion/2.

View File

@@ -0,0 +1,56 @@
% Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
% Spack Project Developers. See the top-level COPYRIGHT file for details.
%
% SPDX-License-Identifier: (Apache-2.0 OR MIT)
%=============================================================================
% These rules are conditionally loaded to handle the synthesis of spliced
% packages.
% =============================================================================
% Consider the concrete spec:
% foo@2.72%gcc@11.4 arch=linux-ubuntu22.04-icelake build_system=autotools ^bar ...
% It will emit the following facts for reuse (below is a subset)
% installed_hash("foo", "xxxyyy")
% hash_attr("xxxyyy", "hash", "foo", "xxxyyy")
% hash_attr("xxxyyy", "version", "foo", "2.72")
% hash_attr("xxxyyy", "node_os", "ubuntu22.04")
% hash_attr("xxxyyy", "hash", "bar", "zzzqqq")
% hash_attr("xxxyyy", "depends_on", "foo", "bar", "link")
% Rules that derive abi_splice_conditions_hold will be generated from
% use of the `can_splice` directive. They will have the following form:
%  can_splice("foo@1.0.0+a", when="@1.0.1+a", match_variants=["b"]) --->
%  abi_splice_conditions_hold(0, node(SID, "foo"), "foo", BaseHash) :-
% installed_hash("foo", BaseHash),
% attr("node", node(SID, SpliceName)),
% attr("node_version_satisfies", node(SID, "foo"), "1.0.1"),
% hash_attr("hash", "node_version_satisfies", "foo", "1.0.1"),
% attr("variant_value", node(SID, "foo"), "a", "True"),
% hash_attr("hash", "variant_value", "foo", "a", "True"),
% attr("variant_value", node(SID, "foo"), "b", VariVar0),
% hash_attr("hash", "variant_value", "foo", "b", VariVar0),
% If the splice is valid (i.e. abi_splice_conditions_hold is derived) in the
% dependency of a concrete spec, the solver is free to choose whether to continue
% with the exact hash semantics by simply imposing the child hash, or to introduce
% a spliced node as the dependency instead
{ imposed_constraint(ParentHash, "hash", ChildName, ChildHash) } :-
hash_attr(ParentHash, "hash", ChildName, ChildHash),
abi_splice_conditions_hold(_, node(SID, SpliceName), ChildName, ChildHash).
attr("splice_at_hash", ParentNode, node(SID, SpliceName), ChildName, ChildHash) :-
attr("hash", ParentNode, ParentHash),
hash_attr(ParentHash, "hash", ChildName, ChildHash),
abi_splice_conditions_hold(_, node(SID, SpliceName), ChildName, ChildHash),
ParentHash != ChildHash,
not imposed_constraint(ParentHash, "hash", ChildName, ChildHash).
% Names and virtual providers may change when a dependency is spliced in
imposed_constraint(Hash, "dependency_holds", ParentName, SpliceName, Type) :-
hash_attr(Hash, "depends_on", ParentName, DepName, Type),
hash_attr(Hash, "hash", DepName, DepHash),
attr("splice_at_hash", node(ID, ParentName), node(SID, SpliceName), DepName, DepHash).
imposed_constraint(Hash, "virtual_on_edge", ParentName, SpliceName, VirtName) :-
hash_attr(Hash, "virtual_on_edge", ParentName, DepName, VirtName),
attr("splice_at_hash", node(ID, ParentName), node(SID, SpliceName), DepName, DepHash).

View File

@@ -59,7 +59,7 @@
import re
import socket
import warnings
-from typing import Any, Callable, Dict, List, Match, Optional, Set, Tuple, Union
+from typing import Any, Callable, Dict, Iterable, List, Match, Optional, Set, Tuple, Union

import archspec.cpu
@@ -448,6 +448,9 @@ def _target_satisfies(self, other: "ArchSpec", strict: bool) -> bool:
        return bool(self._target_intersection(other))

    def _target_constrain(self, other: "ArchSpec") -> bool:
+        if self.target is None and other.target is None:
+            return False
+
        if not other._target_satisfies(self, strict=False):
            raise UnsatisfiableArchitectureSpecError(self, other)
@@ -496,21 +499,56 @@ def _target_intersection(self, other):
                if (not s_min or o_comp >= s_min) and (not s_max or o_comp <= s_max):
                    results.append(o_min)
            else:
-                # Take intersection of two ranges
-                # Lots of comparisons needed
-                _s_min = _make_microarchitecture(s_min)
-                _s_max = _make_microarchitecture(s_max)
-                _o_min = _make_microarchitecture(o_min)
-                _o_max = _make_microarchitecture(o_max)
-
-                n_min = s_min if _s_min >= _o_min else o_min
-                n_max = s_max if _s_max <= _o_max else o_max
-                _n_min = _make_microarchitecture(n_min)
-                _n_max = _make_microarchitecture(n_max)
-                if _n_min == _n_max:
-                    results.append(n_min)
-                elif not n_min or not n_max or _n_min < _n_max:
-                    results.append("%s:%s" % (n_min, n_max))
+                # Take the "min" of the two max, if there is a partial ordering.
+                n_max = ""
+                if s_max and o_max:
+                    _s_max = _make_microarchitecture(s_max)
+                    _o_max = _make_microarchitecture(o_max)
+                    if _s_max.family != _o_max.family:
+                        continue
+                    if _s_max <= _o_max:
+                        n_max = s_max
+                    elif _o_max < _s_max:
+                        n_max = o_max
+                    else:
+                        continue
+                elif s_max:
+                    n_max = s_max
+                elif o_max:
+                    n_max = o_max
+
+                # Take the "max" of the two min.
+                n_min = ""
+                if s_min and o_min:
+                    _s_min = _make_microarchitecture(s_min)
+                    _o_min = _make_microarchitecture(o_min)
+                    if _s_min.family != _o_min.family:
+                        continue
+                    if _s_min >= _o_min:
+                        n_min = s_min
+                    elif _o_min > _s_min:
+                        n_min = o_min
+                    else:
+                        continue
+                elif s_min:
+                    n_min = s_min
+                elif o_min:
+                    n_min = o_min
+
+                if n_min and n_max:
+                    _n_min = _make_microarchitecture(n_min)
+                    _n_max = _make_microarchitecture(n_max)
+                    if _n_min.family != _n_max.family or not _n_min <= _n_max:
+                        continue
+                    if n_min == n_max:
+                        results.append(n_min)
+                    else:
+                        results.append(f"{n_min}:{n_max}")
+                elif n_min:
+                    results.append(f"{n_min}:")
+                elif n_max:
+                    results.append(f":{n_max}")

        return results
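A worked example of the new range logic, checked directly against archspec: `x86_64:` (a lower bound only) and `:haswell` (an upper bound only) now intersect to `x86_64:haswell` because both ends share a family and are ordered, while ranges whose ends sit in different families produce no intersection at all.

```
import archspec.cpu

x86_64 = archspec.cpu.TARGETS["x86_64"]
haswell = archspec.cpu.TARGETS["haswell"]
ppc64le = archspec.cpu.TARGETS["ppc64le"]

# same family and ordered: the combined range x86_64:haswell is non-empty
assert x86_64.family == haswell.family and x86_64 <= haswell
# different families: ppc64le: and x86_64: do not intersect
assert ppc64le.family != x86_64.family
```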
    def constrain(self, other: "ArchSpec") -> bool:
@@ -1500,9 +1538,8 @@ def __init__(
        self._external_path = external_path
        self.external_modules = Spec._format_module_list(external_modules)

-        # This attribute is used to store custom information for
-        # external specs. None signal that it was not set yet.
-        self.extra_attributes = None
+        # This attribute is used to store custom information for external specs.
+        self.extra_attributes: dict = {}

        # This attribute holds the original build copy of the spec if it is
        # deployed differently than it was built. None signals that the spec
@@ -2217,8 +2254,8 @@ def to_node_dict(self, hash=ht.dag_hash):
d["external"] = syaml.syaml_dict( d["external"] = syaml.syaml_dict(
[ [
("path", self.external_path), ("path", self.external_path),
("module", self.external_modules), ("module", self.external_modules or None),
("extra_attributes", self.extra_attributes), ("extra_attributes", syaml.sorted_dict(self.extra_attributes)),
] ]
) )
@@ -2828,7 +2865,7 @@ def ensure_no_deprecated(root):
msg += " For each package listed, choose another spec\n" msg += " For each package listed, choose another spec\n"
raise SpecDeprecatedError(msg) raise SpecDeprecatedError(msg)
def concretize(self, tests: Union[bool, List[str]] = False) -> None: def concretize(self, tests: Union[bool, Iterable[str]] = False) -> None:
"""Concretize the current spec. """Concretize the current spec.
Args: Args:
@@ -2907,7 +2944,7 @@ def _mark_concrete(self, value=True):
            if (not value) and s.concrete and s.installed:
                continue
            elif not value:
-                s.clear_cached_hashes()
+                s.clear_caches()
            s._mark_root_concrete(value)

    def _finalize_concretization(self):
@@ -2956,7 +2993,7 @@ def _finalize_concretization(self):
        for spec in self.traverse():
            spec._cached_hash(ht.dag_hash)

-    def concretized(self, tests=False):
+    def concretized(self, tests: Union[bool, Iterable[str]] = False) -> "Spec":
        """This is a non-destructive version of concretize().

        First clones, then returns a concrete version of this package
@@ -3079,18 +3116,13 @@ def constrain(self, other, deps=True):
            if not self.variants[v].compatible(other.variants[v]):
                raise vt.UnsatisfiableVariantSpecError(self.variants[v], other.variants[v])

-        # TODO: Check out the logic here
        sarch, oarch = self.architecture, other.architecture
-        if sarch is not None and oarch is not None:
-            if sarch.platform is not None and oarch.platform is not None:
-                if sarch.platform != oarch.platform:
-                    raise UnsatisfiableArchitectureSpecError(sarch, oarch)
-            if sarch.os is not None and oarch.os is not None:
-                if sarch.os != oarch.os:
-                    raise UnsatisfiableArchitectureSpecError(sarch, oarch)
-            if sarch.target is not None and oarch.target is not None:
-                if sarch.target != oarch.target:
-                    raise UnsatisfiableArchitectureSpecError(sarch, oarch)
+        if (
+            sarch is not None
+            and oarch is not None
+            and not self.architecture.intersects(other.architecture)
+        ):
+            raise UnsatisfiableArchitectureSpecError(sarch, oarch)

        changed = False
@@ -3113,18 +3145,12 @@ def constrain(self, other, deps=True):
         changed |= self.compiler_flags.constrain(other.compiler_flags)

-        old = str(self.architecture)
         sarch, oarch = self.architecture, other.architecture
-        if sarch is None or other.architecture is None:
-            self.architecture = sarch or oarch
-        else:
-            if sarch.platform is None or oarch.platform is None:
-                self.architecture.platform = sarch.platform or oarch.platform
-            if sarch.os is None or oarch.os is None:
-                sarch.os = sarch.os or oarch.os
-            if sarch.target is None or oarch.target is None:
-                sarch.target = sarch.target or oarch.target
-        changed |= str(self.architecture) != old
+        if sarch is not None and oarch is not None:
+            changed |= self.architecture.constrain(other.architecture)
+        elif oarch is not None:
+            self.architecture = oarch
+            changed = True

         if deps:
             changed |= self._constrain_dependencies(other)
@@ -4256,7 +4282,7 @@ def _splice_detach_and_add_dependents(self, replacement, context):
         for ancestor in ancestors_in_context:
             # Only set it if it hasn't been spliced before
             ancestor._build_spec = ancestor._build_spec or ancestor.copy()
-            ancestor.clear_cached_hashes(ignore=(ht.package_hash.attr,))
+            ancestor.clear_caches(ignore=(ht.package_hash.attr,))
             for edge in ancestor.edges_to_dependencies(depflag=dt.BUILD):
                 if edge.depflag & ~dt.BUILD:
                     edge.depflag &= ~dt.BUILD
@@ -4450,7 +4476,7 @@ def mask_build_deps(in_spec):
         return spec

-    def clear_cached_hashes(self, ignore=()):
+    def clear_caches(self, ignore=()):
         """
         Clears all cached hashes in a Spec, while preserving other properties.
         """
@@ -4458,7 +4484,9 @@ def clear_cached_hashes(self, ignore=()):
             if h.attr not in ignore:
                 if hasattr(self, h.attr):
                     setattr(self, h.attr, None)
-        self._dunder_hash = None
+        for attr in ("_dunder_hash", "_prefix"):
+            if attr not in ignore:
+                setattr(self, attr, None)

     def __hash__(self):
         # If the spec is concrete, we leverage the process hash and just use
@@ -4833,8 +4861,8 @@ def from_node_dict(cls, node):
             spec.external_modules = node["external"]["module"]
             if spec.external_modules is False:
                 spec.external_modules = None
-            spec.extra_attributes = node["external"].get(
-                "extra_attributes", syaml.syaml_dict()
-            )
+            spec.extra_attributes = (
+                node["external"].get("extra_attributes") or syaml.syaml_dict()
+            )

         # specs read in are concrete unless marked abstract
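The `or` guard matters because a spec file may carry an explicit `"extra_attributes": null`; with `.get(key, default)` the default is ignored whenever the key exists, so the `None` would leak through. A plain-Python illustration:

    node = {"external": {"path": "/usr", "module": None, "extra_attributes": None}}
    assert node["external"].get("extra_attributes", {}) is None    # key exists, default unused
    assert (node["external"].get("extra_attributes") or {}) == {}  # null coerced to empty dict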

View File

@@ -487,7 +487,7 @@ def _generate_fetchers(self, mirror_only=False) -> Generator["fs.FetchStrategy",
         # Insert fetchers in the order that the URLs are provided.
         fetchers[:0] = (
             fs.from_url_scheme(
-                url_util.join(mirror.fetch_url, self.mirror_layout.path),
+                url_util.join(mirror.fetch_url, *self.mirror_layout.path.split(os.sep)),
                 checksum=digest,
                 expand=expand,
                 extension=extension,

View File

@@ -0,0 +1,247 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
""" Test ABI-based splicing of dependencies """
from typing import List
import pytest
import spack.config
import spack.deptypes as dt
import spack.solver.asp
from spack.installer import PackageInstaller
from spack.spec import Spec
class CacheManager:
def __init__(self, specs: List[str]) -> None:
self.req_specs = specs
        self.concr_specs: List[Spec] = []
def __enter__(self):
self.concr_specs = [Spec(s).concretized() for s in self.req_specs]
for s in self.concr_specs:
PackageInstaller([s.package], fake=True, explicit=True).install()
def __exit__(self, exc_type, exc_val, exc_tb):
for s in self.concr_specs:
s.package.do_uninstall()
# macOS and Windows only work if you pass this function pointer rather than a
# closure
def _mock_has_runtime_dependencies(_x):
return True
def _make_specs_non_buildable(specs: List[str]):
output_config = {}
for spec in specs:
output_config[spec] = {"buildable": False}
return output_config
@pytest.fixture
def splicing_setup(mutable_database, mock_packages, monkeypatch):
spack.config.set("concretizer:reuse", True)
monkeypatch.setattr(
spack.solver.asp, "_has_runtime_dependencies", _mock_has_runtime_dependencies
)
def _enable_splicing():
spack.config.set("concretizer:splice", {"automatic": True})
def _has_build_dependency(spec: Spec, name: str):
return any(s.name == name for s in spec.dependencies(None, dt.BUILD))
def test_simple_reuse(splicing_setup):
with CacheManager(["splice-z@1.0.0+compat"]):
spack.config.set("packages", _make_specs_non_buildable(["splice-z"]))
assert Spec("splice-z").concretized().satisfies(Spec("splice-z"))
def test_simple_dep_reuse(splicing_setup):
with CacheManager(["splice-z@1.0.0+compat"]):
spack.config.set("packages", _make_specs_non_buildable(["splice-z"]))
assert Spec("splice-h@1").concretized().satisfies(Spec("splice-h@1"))
def test_splice_installed_hash(splicing_setup):
cache = [
"splice-t@1 ^splice-h@1.0.0+compat ^splice-z@1.0.0",
"splice-h@1.0.2+compat ^splice-z@1.0.0",
]
with CacheManager(cache):
packages_config = _make_specs_non_buildable(["splice-t", "splice-h"])
spack.config.set("packages", packages_config)
goal_spec = Spec("splice-t@1 ^splice-h@1.0.2+compat ^splice-z@1.0.0")
with pytest.raises(Exception):
goal_spec.concretized()
_enable_splicing()
assert goal_spec.concretized().satisfies(goal_spec)
def test_splice_build_splice_node(splicing_setup):
with CacheManager(["splice-t@1 ^splice-h@1.0.0+compat ^splice-z@1.0.0+compat"]):
spack.config.set("packages", _make_specs_non_buildable(["splice-t"]))
goal_spec = Spec("splice-t@1 ^splice-h@1.0.2+compat ^splice-z@1.0.0+compat")
with pytest.raises(Exception):
goal_spec.concretized()
_enable_splicing()
assert goal_spec.concretized().satisfies(goal_spec)
def test_double_splice(splicing_setup):
cache = [
"splice-t@1 ^splice-h@1.0.0+compat ^splice-z@1.0.0+compat",
"splice-h@1.0.2+compat ^splice-z@1.0.1+compat",
"splice-z@1.0.2+compat",
]
with CacheManager(cache):
freeze_builds_config = _make_specs_non_buildable(["splice-t", "splice-h", "splice-z"])
spack.config.set("packages", freeze_builds_config)
goal_spec = Spec("splice-t@1 ^splice-h@1.0.2+compat ^splice-z@1.0.2+compat")
with pytest.raises(Exception):
goal_spec.concretized()
_enable_splicing()
assert goal_spec.concretized().satisfies(goal_spec)
# The next two tests are mirrors of one another
def test_virtual_multi_splices_in(splicing_setup):
cache = [
"depends-on-virtual-with-abi ^virtual-abi-1",
"depends-on-virtual-with-abi ^virtual-abi-2",
]
goal_specs = [
"depends-on-virtual-with-abi ^virtual-abi-multi abi=one",
"depends-on-virtual-with-abi ^virtual-abi-multi abi=two",
]
with CacheManager(cache):
spack.config.set("packages", _make_specs_non_buildable(["depends-on-virtual-with-abi"]))
for gs in goal_specs:
with pytest.raises(Exception):
Spec(gs).concretized()
_enable_splicing()
for gs in goal_specs:
assert Spec(gs).concretized().satisfies(gs)
def test_virtual_multi_can_be_spliced(splicing_setup):
cache = [
"depends-on-virtual-with-abi ^virtual-abi-multi abi=one",
"depends-on-virtual-with-abi ^virtual-abi-multi abi=two",
]
goal_specs = [
"depends-on-virtual-with-abi ^virtual-abi-1",
"depends-on-virtual-with-abi ^virtual-abi-2",
]
with CacheManager(cache):
spack.config.set("packages", _make_specs_non_buildable(["depends-on-virtual-with-abi"]))
        for gs in goal_specs:
            with pytest.raises(Exception):
                Spec(gs).concretized()
_enable_splicing()
for gs in goal_specs:
assert Spec(gs).concretized().satisfies(gs)
def test_manyvariant_star_matching_variant_splice(splicing_setup):
cache = [
# can_splice("manyvariants@1.0.0", when="@1.0.1", match_variants="*")
"depends-on-manyvariants ^manyvariants@1.0.0+a+b c=v1 d=v2",
"depends-on-manyvariants ^manyvariants@1.0.0~a~b c=v3 d=v3",
]
goal_specs = [
Spec("depends-on-manyvariants ^manyvariants@1.0.1+a+b c=v1 d=v2"),
Spec("depends-on-manyvariants ^manyvariants@1.0.1~a~b c=v3 d=v3"),
]
with CacheManager(cache):
freeze_build_config = {"depends-on-manyvariants": {"buildable": False}}
spack.config.set("packages", freeze_build_config)
for goal in goal_specs:
with pytest.raises(Exception):
goal.concretized()
_enable_splicing()
for goal in goal_specs:
assert goal.concretized().satisfies(goal)
def test_manyvariant_limited_matching(splicing_setup):
cache = [
# can_splice("manyvariants@2.0.0+a~b", when="@2.0.1~a+b", match_variants=["c", "d"])
"depends-on-manyvariants@2.0 ^manyvariants@2.0.0+a~b c=v3 d=v2",
# can_splice("manyvariants@2.0.0 c=v1 d=v1", when="@2.0.1+a+b")
"depends-on-manyvariants@2.0 ^manyvariants@2.0.0~a~b c=v1 d=v1",
]
goal_specs = [
Spec("depends-on-manyvariants@2.0 ^manyvariants@2.0.1~a+b c=v3 d=v2"),
Spec("depends-on-manyvariants@2.0 ^manyvariants@2.0.1+a+b c=v3 d=v3"),
]
with CacheManager(cache):
freeze_build_config = {"depends-on-manyvariants": {"buildable": False}}
spack.config.set("packages", freeze_build_config)
for s in goal_specs:
with pytest.raises(Exception):
s.concretized()
_enable_splicing()
for s in goal_specs:
assert s.concretized().satisfies(s)
def test_external_splice_same_name(splicing_setup):
cache = [
"splice-h@1.0.0 ^splice-z@1.0.0+compat",
"splice-t@1.0 ^splice-h@1.0.1 ^splice-z@1.0.1+compat",
]
packages_yaml = {
"splice-z": {"externals": [{"spec": "splice-z@1.0.2+compat", "prefix": "/usr"}]}
}
goal_specs = [
Spec("splice-h@1.0.0 ^splice-z@1.0.2"),
Spec("splice-t@1.0 ^splice-h@1.0.1 ^splice-z@1.0.2"),
]
with CacheManager(cache):
spack.config.set("packages", packages_yaml)
_enable_splicing()
for s in goal_specs:
assert s.concretized().satisfies(s)
def test_spliced_build_deps_only_in_build_spec(splicing_setup):
cache = ["splice-t@1.0 ^splice-h@1.0.1 ^splice-z@1.0.0"]
goal_spec = Spec("splice-t@1.0 ^splice-h@1.0.2 ^splice-z@1.0.0")
with CacheManager(cache):
_enable_splicing()
concr_goal = goal_spec.concretized()
build_spec = concr_goal._build_spec
# Spec has been spliced
assert build_spec is not None
# Build spec has spliced build dependencies
assert _has_build_dependency(build_spec, "splice-h")
assert _has_build_dependency(build_spec, "splice-z")
# Spliced build dependencies are removed
assert len(concr_goal.dependencies(None, dt.BUILD)) == 0
def test_spliced_transitive_dependency(splicing_setup):
cache = ["splice-depends-on-t@1.0 ^splice-h@1.0.1"]
goal_spec = Spec("splice-depends-on-t^splice-h@1.0.2")
with CacheManager(cache):
spack.config.set("packages", _make_specs_non_buildable(["splice-depends-on-t"]))
_enable_splicing()
concr_goal = goal_spec.concretized()
# Spec has been spliced
assert concr_goal._build_spec is not None
assert concr_goal["splice-t"]._build_spec is not None
assert concr_goal.satisfies(goal_spec)
# Spliced build dependencies are removed
assert len(concr_goal.dependencies(None, dt.BUILD)) == 0

View File

@@ -4,10 +4,17 @@
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)
 import pytest

+import spack.config
+import spack.environment as ev
+import spack.error
+import spack.solver.asp as asp
+import spack.store
 from spack.cmd import (
     CommandNameError,
     PythonNameError,
     cmd_name,
+    matching_specs_from_env,
+    parse_specs,
     python_name,
     require_cmd_name,
     require_python_name,
@@ -34,3 +41,99 @@ def test_require_cmd_name():
     with pytest.raises(CommandNameError):
         require_cmd_name("okey_dokey")
     require_cmd_name(cmd_name("okey_dokey"))
@pytest.mark.parametrize(
"unify,spec_strs,error",
[
# single spec
(True, ["zmpi"], None),
(False, ["mpileaks"], None),
# multiple specs, some from hash some from file
(True, ["zmpi", "mpileaks^zmpi", "libelf"], None),
(True, ["mpileaks^zmpi", "mpileaks^mpich", "libelf"], spack.error.SpecError),
(False, ["mpileaks^zmpi", "mpileaks^mpich", "libelf"], None),
],
)
def test_special_cases_concretization_parse_specs(
unify, spec_strs, error, monkeypatch, mutable_config, mutable_database, tmpdir
):
"""Test that special cases in parse_specs(concretize=True) bypass solver"""
# monkeypatch to ensure we do not call the actual concretizer
def _fail(*args, **kwargs):
assert False
monkeypatch.setattr(asp.SpackSolverSetup, "setup", _fail)
spack.config.set("concretizer:unify", unify)
args = [f"/{spack.store.STORE.db.query(s)[0].dag_hash()}" for s in spec_strs]
if len(args) > 1:
# We convert the last one to a specfile input
filename = tmpdir.join("spec.json")
spec = parse_specs(args[-1], concretize=True)[0]
with open(filename, "w") as f:
spec.to_json(f)
args[-1] = str(filename)
if error:
with pytest.raises(error):
parse_specs(args, concretize=True)
else:
# assertion error from monkeypatch above if test fails
parse_specs(args, concretize=True)
@pytest.mark.parametrize(
"unify,spec_strs,error",
[
# single spec
(True, ["zmpi"], None),
(False, ["mpileaks"], None),
# multiple specs, some from hash some from file
(True, ["zmpi", "mpileaks^zmpi", "libelf"], None),
(True, ["mpileaks^zmpi", "mpileaks^mpich", "libelf"], spack.error.SpecError),
(False, ["mpileaks^zmpi", "mpileaks^mpich", "libelf"], None),
],
)
def test_special_cases_concretization_matching_specs_from_env(
unify,
spec_strs,
error,
monkeypatch,
mutable_config,
mutable_database,
tmpdir,
mutable_mock_env_path,
):
"""Test that special cases in parse_specs(concretize=True) bypass solver"""
# monkeypatch to ensure we do not call the actual concretizer
def _fail(*args, **kwargs):
assert False
monkeypatch.setattr(asp.SpackSolverSetup, "setup", _fail)
spack.config.set("concretizer:unify", unify)
ev.create("test")
env = ev.read("test")
args = [f"/{spack.store.STORE.db.query(s)[0].dag_hash()}" for s in spec_strs]
if len(args) > 1:
# We convert the last one to a specfile input
filename = tmpdir.join("spec.json")
spec = parse_specs(args[-1], concretize=True)[0]
with open(filename, "w") as f:
spec.to_json(f)
args[-1] = str(filename)
with env:
specs = parse_specs(args, concretize=False)
if error:
with pytest.raises(error):
matching_specs_from_env(specs)
else:
# assertion error from monkeypatch above if test fails
matching_specs_from_env(specs)

View File

@@ -311,7 +311,20 @@ def test_pkg_grep(mock_packages, capfd):
     output, _ = capfd.readouterr()
     assert output.strip() == "\n".join(
         spack.repo.PATH.get_pkg_class(name).module.__file__
-        for name in ["splice-a", "splice-h", "splice-t", "splice-vh", "splice-vt", "splice-z"]
+        for name in [
+            "depends-on-manyvariants",
+            "manyvariants",
+            "splice-a",
+            "splice-depends-on-t",
+            "splice-h",
+            "splice-t",
+            "splice-vh",
+            "splice-vt",
+            "splice-z",
+            "virtual-abi-1",
+            "virtual-abi-2",
+            "virtual-abi-multi",
+        ]
     )

     # ensure that this string isn't found

View File

@@ -7,6 +7,7 @@
 import pytest

+import spack.config
 import spack.environment as ev
 import spack.error
 import spack.spec
@@ -179,3 +180,43 @@ def test_spec_version_assigned_git_ref_as_version(name, version, error):
     else:
         output = spec(name + "@" + version)
         assert version in output
@pytest.mark.parametrize(
"unify, spec_hash_args, match, error",
[
        # success cases with unify:true
        (True, ["mpileaks_mpich"], "mpich", None),
        (True, ["mpileaks_zmpi"], "zmpi", None),
        (True, ["mpileaks_mpich", "dyninst"], "mpich", None),
        (True, ["mpileaks_zmpi", "dyninst"], "zmpi", None),
        # same success cases with unify:false
        (False, ["mpileaks_mpich"], "mpich", None),
        (False, ["mpileaks_zmpi"], "zmpi", None),
        (False, ["mpileaks_mpich", "dyninst"], "mpich", None),
        (False, ["mpileaks_zmpi", "dyninst"], "zmpi", None),
        # conflicting specs: error with unify:true, success with unify:false
        (True, ["mpileaks_mpich", "mpileaks_zmpi"], "callpath, mpileaks", spack.error.SpecError),
        (False, ["mpileaks_mpich", "mpileaks_zmpi"], "zmpi", None),
],
)
def test_spec_unification_from_cli(
install_mockery, mutable_config, mutable_database, unify, spec_hash_args, match, error
):
"""Ensure specs grouped together on the CLI are concretized together when unify:true."""
spack.config.set("concretizer:unify", unify)
db = spack.store.STORE.db
spec_lookup = {
"mpileaks_mpich": db.query_one("mpileaks ^mpich").dag_hash(),
"mpileaks_zmpi": db.query_one("mpileaks ^zmpi").dag_hash(),
"dyninst": db.query_one("dyninst").dag_hash(),
}
hashes = [f"/{spec_lookup[name]}" for name in spec_hash_args]
if error:
with pytest.raises(error, match=match):
output = spec(*hashes)
else:
output = spec(*hashes)
assert match in output

View File

@@ -1532,3 +1532,30 @@ def test_config_path_dsl(path, it_should_work, expected_parsed):
     else:
         with pytest.raises(ValueError):
             spack.config.ConfigPath._validate(path)
@pytest.mark.regression("48254")
def test_env_activation_preserves_config_scopes(mutable_mock_env_path):
"""Check that the "command_line" scope remains the highest priority scope, when we activate,
or deactivate, environments.
"""
expected_cl_scope = spack.config.CONFIG.highest()
assert expected_cl_scope.name == "command_line"
# Creating an environment pushes a new scope
ev.create("test")
with ev.read("test"):
assert spack.config.CONFIG.highest() == expected_cl_scope
# No active environment pops the scope
with ev.no_active_environment():
assert spack.config.CONFIG.highest() == expected_cl_scope
assert spack.config.CONFIG.highest() == expected_cl_scope
# Switch the environment to another one
ev.create("test-2")
with ev.read("test-2"):
assert spack.config.CONFIG.highest() == expected_cl_scope
assert spack.config.CONFIG.highest() == expected_cl_scope
assert spack.config.CONFIG.highest() == expected_cl_scope

View File

@@ -1249,3 +1249,14 @@ def test_find_input_types(tmp_path: pathlib.Path):
     with pytest.raises(TypeError):
         fs.find(1, "file.txt")  # type: ignore
def test_edit_in_place_through_temporary_file(tmp_path):
(tmp_path / "example.txt").write_text("Hello")
current_ino = os.stat(tmp_path / "example.txt").st_ino
with fs.edit_in_place_through_temporary_file(tmp_path / "example.txt") as temporary:
os.unlink(temporary)
with open(temporary, "w") as f:
f.write("World")
assert (tmp_path / "example.txt").read_text() == "World"
assert os.stat(tmp_path / "example.txt").st_ino == current_ino
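For reference, a minimal sketch of a helper with the behavior this test checks; this is an assumed shape, not necessarily Spack's exact implementation. The key point is that the final write goes through the original file's existing inode instead of a rename:

    import contextlib
    import os
    import shutil
    import tempfile

    @contextlib.contextmanager
    def edit_in_place_through_temporary_file(path):
        fd, tmp = tempfile.mkstemp(dir=os.path.dirname(path))
        os.close(fd)
        try:
            shutil.copy2(path, tmp)
            yield tmp
            # copy bytes back instead of renaming, so st_ino of `path` is preserved
            with open(tmp, "rb") as src, open(path, "wb") as dst:
                shutil.copyfileobj(src, dst)
        finally:
            os.unlink(tmp)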

View File

@@ -298,30 +298,6 @@ def inner():
         top-level raised TypeError: ok"""
     )

-    full_message = h.grouped_message(with_tracebacks=True)
-    no_line_numbers = re.sub(r"line [0-9]+,", "line xxx,", full_message)
-    assert (
-        no_line_numbers
-        == dedent(
-            """\
-            due to the following failures:
-            inner method raised ValueError: wow!
-              File "{0}", \
-            line xxx, in test_grouped_exception
-                inner()
-              File "{0}", \
-            line xxx, in inner
-                raise ValueError("wow!")
-
-            top-level raised TypeError: ok
-              File "{0}", \
-            line xxx, in test_grouped_exception
-                raise TypeError("ok")
-            """
-        ).format(__file__)
-    )
-

 def test_grouped_exception_base_type():
     h = llnl.util.lang.GroupedExceptionHandler()

View File

@@ -57,18 +57,16 @@ def test_log_python_output_without_echo(capfd, tmpdir):
     assert capfd.readouterr()[0] == ""


-def test_log_python_output_with_invalid_utf8(capfd, tmpdir):
-    with tmpdir.as_cwd():
-        with log.log_output("foo.txt"):
-            sys.stdout.buffer.write(b"\xc3\x28\n")
-
-        expected = b"<line lost: output was not encoded as UTF-8>\n"
-        with open("foo.txt", "rb") as f:
-            written = f.read()
-        assert written == expected
-
-        # nothing on stdout or stderr
-        assert capfd.readouterr()[0] == ""
+def test_log_python_output_with_invalid_utf8(capfd, tmp_path):
+    tmp_file = str(tmp_path / "foo.txt")
+    with log.log_output(tmp_file, echo=True):
+        sys.stdout.buffer.write(b"\xc3helloworld\n")
+
+    # we should be able to read this as valid utf-8
+    with open(tmp_file, "r", encoding="utf-8") as f:
+        assert f.read() == "�helloworld\n"
+
+    assert capfd.readouterr().out == "�helloworld\n"


 def test_log_python_output_and_echo_output(capfd, tmpdir):
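The new expected output follows directly from UTF-8 decoding with errors="replace": the stray 0xC3 byte starts a two-byte sequence that 'h' cannot continue, so it decodes to U+FFFD (whose UTF-8 encoding is the EF BF BD some renderers show literally). A standalone check:

    data = b"\xc3helloworld\n"
    text = data.decode("utf-8", errors="replace")
    assert text == "\ufffdhelloworld\n"  # U+FFFD REPLACEMENT CHARACTER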

View File

@@ -138,3 +138,19 @@ def test_round_trip_configuration(initial_content, expected_final_content, tmp_p
         expected_final_content = initial_content

     assert final_content.getvalue() == expected_final_content
def test_sorted_dict():
assert syaml.sorted_dict(
{
"z": 0,
"y": [{"x": 0, "w": [2, 1, 0]}, 0],
"v": ({"u": 0, "t": 0, "s": 0}, 0, {"r": 0, "q": 0}),
"p": 0,
}
) == {
"p": 0,
"v": ({"s": 0, "t": 0, "u": 0}, 0, {"q": 0, "r": 0}),
"y": [{"w": [2, 1, 0], "x": 0}, 0],
"z": 0,
}

View File

@@ -1763,8 +1763,8 @@ def test_package_hash_affects_dunder_and_dag_hash(mock_packages, default_mock_co
     assert a1.dag_hash() == a2.dag_hash()
     assert a1.process_hash() == a2.process_hash()

-    a1.clear_cached_hashes()
-    a2.clear_cached_hashes()
+    a1.clear_caches()
+    a2.clear_caches()

     # tweak the dag hash of one of these specs
     new_hash = "00000000000000000000000000000000"
@@ -1842,6 +1842,16 @@ def test_abstract_contains_semantic(lhs, rhs, expected, mock_packages):
         # Different virtuals intersect if there is at least one package providing both
         (Spec, "mpi", "lapack", (True, False, False)),
         (Spec, "mpi", "pkgconfig", (False, False, False)),
+        # Intersection among target ranges for different architectures
+        (Spec, "target=x86_64:", "target=ppc64le:", (False, False, False)),
+        (Spec, "target=x86_64:", "target=:power9", (False, False, False)),
+        (Spec, "target=:haswell", "target=:power9", (False, False, False)),
+        (Spec, "target=:haswell", "target=ppc64le:", (False, False, False)),
+        # Intersection among target ranges for the same architecture
+        (Spec, "target=:haswell", "target=x86_64:", (True, True, True)),
+        (Spec, "target=:haswell", "target=x86_64_v4:", (False, False, False)),
+        # Edge case of uarch that split in a diamond structure, from a common ancestor
+        (Spec, "target=:cascadelake", "target=:cannonlake", (False, False, False)),
     ],
 )
 def test_intersects_and_satisfies(factory, lhs_str, rhs_str, results):
@@ -1891,6 +1901,16 @@ def test_intersects_and_satisfies(factory, lhs_str, rhs_str, results):
         # Flags
         (Spec, "cppflags=-foo", "cppflags=-foo", False, "cppflags=-foo"),
         (Spec, "cppflags=-foo", "cflags=-foo", True, "cppflags=-foo cflags=-foo"),
+        # Target ranges
+        (Spec, "target=x86_64:", "target=x86_64:", False, "target=x86_64:"),
+        (Spec, "target=x86_64:", "target=:haswell", True, "target=x86_64:haswell"),
+        (
+            Spec,
+            "target=x86_64:haswell",
+            "target=x86_64_v2:icelake",
+            True,
+            "target=x86_64_v2:haswell",
+        ),
     ],
 )
 def test_constrain(factory, lhs_str, rhs_str, result, constrained_str):

View File

@@ -21,6 +21,7 @@
 import pytest
 import ruamel.yaml

+import spack.config
 import spack.hash_types as ht
 import spack.paths
 import spack.repo
@@ -144,86 +145,83 @@ def descend_and_check(iterable, level=0):
     assert level >= 5


-def test_ordered_read_not_required_for_consistent_dag_hash(config, mock_packages):
+@pytest.mark.parametrize("spec_str", ["mpileaks ^zmpi", "dttop", "dtuse"])
+def test_ordered_read_not_required_for_consistent_dag_hash(
+    spec_str, mutable_config: spack.config.Configuration, mock_packages
+):
     """Make sure ordered serialization isn't required to preserve hashes.

-    For consistent hashes, we require that YAML and json documents
-    have their keys serialized in a deterministic order. However, we
-    don't want to require them to be serialized in order. This
-    ensures that is not required.
-    """
-    specs = ["mpileaks ^zmpi", "dttop", "dtuse"]
-    for spec in specs:
-        spec = Spec(spec)
-        spec.concretize()
-
-        #
-        # Dict & corresponding YAML & JSON from the original spec.
-        #
-        spec_dict = spec.to_dict()
-        spec_yaml = spec.to_yaml()
-        spec_json = spec.to_json()
-
-        #
-        # Make a spec with reversed OrderedDicts for every
-        # OrderedDict in the original.
-        #
-        reversed_spec_dict = reverse_all_dicts(spec.to_dict())
-
-        #
-        # Dump to YAML and JSON
-        #
-        yaml_string = syaml.dump(spec_dict, default_flow_style=False)
-        reversed_yaml_string = syaml.dump(reversed_spec_dict, default_flow_style=False)
-        json_string = sjson.dump(spec_dict)
-        reversed_json_string = sjson.dump(reversed_spec_dict)
-
-        #
-        # Do many consistency checks
-        #
-
-        # spec yaml is ordered like the spec dict
-        assert yaml_string == spec_yaml
-        assert json_string == spec_json
-
-        # reversed string is different from the original, so it
-        # *would* generate a different hash
-        assert yaml_string != reversed_yaml_string
-        assert json_string != reversed_json_string
-
-        # build specs from the "wrongly" ordered data
-        round_trip_yaml_spec = Spec.from_yaml(yaml_string)
-        round_trip_json_spec = Spec.from_json(json_string)
-        round_trip_reversed_yaml_spec = Spec.from_yaml(reversed_yaml_string)
-        round_trip_reversed_json_spec = Spec.from_yaml(reversed_json_string)
-
-        # Strip spec if we stripped the yaml
-        spec = spec.copy(deps=ht.dag_hash.depflag)
-
-        # specs are equal to the original
-        assert spec == round_trip_yaml_spec
-        assert spec == round_trip_json_spec
-        assert spec == round_trip_reversed_yaml_spec
-        assert spec == round_trip_reversed_json_spec
-        assert round_trip_yaml_spec == round_trip_reversed_yaml_spec
-        assert round_trip_json_spec == round_trip_reversed_json_spec
-
-        # dag_hashes are equal
-        assert spec.dag_hash() == round_trip_yaml_spec.dag_hash()
-        assert spec.dag_hash() == round_trip_json_spec.dag_hash()
-        assert spec.dag_hash() == round_trip_reversed_yaml_spec.dag_hash()
-        assert spec.dag_hash() == round_trip_reversed_json_spec.dag_hash()
-
-        # dag_hash is equal after round-trip by dag_hash
-        spec.concretize()
-        round_trip_yaml_spec.concretize()
-        round_trip_json_spec.concretize()
-        round_trip_reversed_yaml_spec.concretize()
-        round_trip_reversed_json_spec.concretize()
-        assert spec.dag_hash() == round_trip_yaml_spec.dag_hash()
-        assert spec.dag_hash() == round_trip_json_spec.dag_hash()
-        assert spec.dag_hash() == round_trip_reversed_yaml_spec.dag_hash()
-        assert spec.dag_hash() == round_trip_reversed_json_spec.dag_hash()
+    For consistent hashes, we require that YAML and JSON serializations have their keys in a
+    deterministic order. However, we don't want to require them to be serialized in order. This
+    ensures that is not required."""
+    # Make sure that `extra_attributes` of externals is order independent for hashing.
+    extra_attributes = {
+        "compilers": {"c": "/some/path/bin/cc", "cxx": "/some/path/bin/c++"},
+        "foo": "bar",
+        "baz": "qux",
+    }
+    mutable_config.set(
+        "packages:dtuse",
+        {
+            "buildable": False,
+            "externals": [
+                {"spec": "dtuse@=1.0", "prefix": "/usr", "extra_attributes": extra_attributes}
+            ],
+        },
+    )
+
+    spec = spack.spec.Spec(spec_str).concretized()
+
+    if spec_str == "dtuse":
+        assert spec.external and spec.extra_attributes == extra_attributes
+
+    spec_dict = spec.to_dict(hash=ht.dag_hash)
+    spec_yaml = spec.to_yaml()
+    spec_json = spec.to_json()
+
+    # Make a spec with dict keys reversed recursively
+    spec_dict_rev = reverse_all_dicts(spec_dict)
+
+    # Dump to YAML and JSON
+    yaml_string = syaml.dump(spec_dict, default_flow_style=False)
+    yaml_string_rev = syaml.dump(spec_dict_rev, default_flow_style=False)
+    json_string = sjson.dump(spec_dict)
+    json_string_rev = sjson.dump(spec_dict_rev)
+
+    # spec yaml is ordered like the spec dict
+    assert yaml_string == spec_yaml
+    assert json_string == spec_json
+
+    # reversed string is different from the original, so it *would* generate a different hash
+    assert yaml_string != yaml_string_rev
+    assert json_string != json_string_rev
+
+    # build specs from the "wrongly" ordered data
+    from_yaml = Spec.from_yaml(yaml_string)
+    from_json = Spec.from_json(json_string)
+    from_yaml_rev = Spec.from_yaml(yaml_string_rev)
+    from_json_rev = Spec.from_json(json_string_rev)
+
+    # Strip spec if we stripped the yaml
+    spec = spec.copy(deps=ht.dag_hash.depflag)
+
+    # specs and their hashes are equal to the original
+    assert (
+        spec.process_hash()
+        == from_yaml.process_hash()
+        == from_json.process_hash()
+        == from_yaml_rev.process_hash()
+        == from_json_rev.process_hash()
+    )
+    assert (
+        spec.dag_hash()
+        == from_yaml.dag_hash()
+        == from_json.dag_hash()
+        == from_yaml_rev.dag_hash()
+        == from_json_rev.dag_hash()
+    )
+    assert spec == from_yaml == from_json == from_yaml_rev == from_json_rev


 @pytest.mark.parametrize("module", [spack.spec, spack.version])
@@ -294,13 +292,10 @@ def visit_Call(self, node):
 def reverse_all_dicts(data):
     """Descend into data and reverse all the dictionaries"""
     if isinstance(data, dict):
-        return syaml_dict(
-            reversed([(reverse_all_dicts(k), reverse_all_dicts(v)) for k, v in data.items()])
-        )
+        return type(data)((k, reverse_all_dicts(v)) for k, v in reversed(list(data.items())))
     elif isinstance(data, (list, tuple)):
         return type(data)(reverse_all_dicts(elt) for elt in data)
-    else:
-        return data
+    return data
def check_specs_equal(original_spec, spec_yaml_path): def check_specs_equal(original_spec, spec_yaml_path):

View File

@@ -13,7 +13,7 @@
 import sys

 from llnl.util import tty
-from llnl.util.filesystem import join_path
+from llnl.util.filesystem import edit_in_place_through_temporary_file
 from llnl.util.lang import memoized

 from spack.util.executable import Executable, which
@@ -81,12 +81,11 @@ def fix_darwin_install_name(path):
     Parameters:
         path (str): directory in which .dylib files are located
     """
-    libs = glob.glob(join_path(path, "*.dylib"))
+    libs = glob.glob(os.path.join(path, "*.dylib"))
+    install_name_tool = Executable("install_name_tool")
+    otool = Executable("otool")
     for lib in libs:
-        # fix install name first:
-        install_name_tool = Executable("install_name_tool")
-        install_name_tool("-id", lib, lib)
-        otool = Executable("otool")
+        args = ["-id", lib]
         long_deps = otool("-L", lib, output=str).split("\n")
         deps = [dep.partition(" ")[0][1::] for dep in long_deps[2:-1]]
         # fix all dependencies:
@@ -98,5 +97,8 @@ def fix_darwin_install_name(path):
                 # but we don't know builddir (nor how symbolic links look
                 # in builddir). We thus only compare the basenames.
                 if os.path.basename(dep) == os.path.basename(loc):
-                    install_name_tool("-change", dep, loc, lib)
+                    args.extend(("-change", dep, loc))
                     break

+        with edit_in_place_through_temporary_file(lib) as tmp:
+            install_name_tool(*args, tmp)
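The net effect is a single `install_name_tool` invocation per library, applied to a temporary copy whose contents are then written back over the original. An illustrative invocation (paths are made up):

    install_name_tool -id /prefix/lib/libfoo.dylib \
        -change @rpath/libbar.dylib /prefix/lib/libbar.dylib \
        /tmp/tmpXXXX/libfoo.dylib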

View File

@@ -447,20 +447,13 @@ def _dump_annotated(handler, data, stream=None):
     return getvalue()


-def sorted_dict(dict_like):
-    """Return an ordered dict with all the fields sorted recursively.
-
-    Args:
-        dict_like (dict): dictionary to be sorted
-
-    Returns:
-        dictionary sorted recursively
-    """
-    result = syaml_dict(sorted(dict_like.items()))
-    for key, value in result.items():
-        if isinstance(value, collections.abc.Mapping):
-            result[key] = sorted_dict(value)
-    return result
+def sorted_dict(data):
+    """Descend into data and sort all dictionary keys."""
+    if isinstance(data, dict):
+        return type(data)((k, sorted_dict(v)) for k, v in sorted(data.items()))
+    elif isinstance(data, (list, tuple)):
+        return type(data)(sorted_dict(v) for v in data)
+    return data


 def extract_comments(data):

View File

@@ -14,10 +14,10 @@ default:
image: { "name": "ghcr.io/spack/e4s-ubuntu-18.04:v2021-10-18", "entrypoint": [""] } image: { "name": "ghcr.io/spack/e4s-ubuntu-18.04:v2021-10-18", "entrypoint": [""] }
# CI Platform-Arch # CI Platform-Arch
.cray_rhel_zen4: .cray_rhel_x86_64_v3:
variables: variables:
SPACK_TARGET_PLATFORM: "cray-rhel" SPACK_TARGET_PLATFORM: "cray-rhel"
SPACK_TARGET_ARCH: "zen4" SPACK_TARGET_ARCH: "x86_64_v3"
.cray_sles_zen4: .cray_sles_zen4:
variables: variables:
@@ -876,7 +876,7 @@ aws-pcluster-build-neoverse_v1:
- cat /proc/meminfo | grep 'MemTotal\|MemFree' || true - cat /proc/meminfo | grep 'MemTotal\|MemFree' || true
.generate-cray-rhel: .generate-cray-rhel:
tags: [ "cray-rhel-zen4", "public" ] tags: [ "cray-rhel-x86_64_v3", "public" ]
extends: [ ".generate-cray" ] extends: [ ".generate-cray" ]
.generate-cray-sles: .generate-cray-sles:
@@ -888,7 +888,7 @@ aws-pcluster-build-neoverse_v1:
# E4S - Cray RHEL # E4S - Cray RHEL
####################################### #######################################
.e4s-cray-rhel: .e4s-cray-rhel:
extends: [ ".cray_rhel_zen4" ] extends: [ ".cray_rhel_x86_64_v3" ]
variables: variables:
SPACK_CI_STACK_NAME: e4s-cray-rhel SPACK_CI_STACK_NAME: e4s-cray-rhel
@@ -896,7 +896,6 @@ e4s-cray-rhel-generate:
extends: [ ".generate-cray-rhel", ".e4s-cray-rhel" ] extends: [ ".generate-cray-rhel", ".e4s-cray-rhel" ]
e4s-cray-rhel-build: e4s-cray-rhel-build:
allow_failure: true # libsci_cray.so broken, misses DT_NEEDED for libdl.so
extends: [ ".build", ".e4s-cray-rhel" ] extends: [ ".build", ".e4s-cray-rhel" ]
trigger: trigger:
include: include:
@@ -915,10 +914,10 @@ e4s-cray-rhel-build:
variables: variables:
SPACK_CI_STACK_NAME: e4s-cray-sles SPACK_CI_STACK_NAME: e4s-cray-sles
e4s-cray-sles-generate: .e4s-cray-sles-generate:
extends: [ ".generate-cray-sles", ".e4s-cray-sles" ] extends: [ ".generate-cray-sles", ".e4s-cray-sles" ]
e4s-cray-sles-build: .e4s-cray-sles-build:
allow_failure: true # libsci_cray.so broken, misses DT_NEEDED for libdl.so allow_failure: true # libsci_cray.so broken, misses DT_NEEDED for libdl.so
extends: [ ".build", ".e4s-cray-sles" ] extends: [ ".build", ".e4s-cray-sles" ]
trigger: trigger:

View File

@@ -1,31 +1,27 @@
 compilers:
 - compiler:
-    spec: cce@15.0.1
+    spec: cce@=18.0.0
     paths:
-      cc: cc
-      cxx: CC
-      f77: ftn
-      fc: ftn
+      cc: /opt/cray/pe/cce/18.0.0/bin/craycc
+      cxx: /opt/cray/pe/cce/18.0.0/bin/crayCC
+      f77: /opt/cray/pe/cce/18.0.0/bin/crayftn
+      fc: /opt/cray/pe/cce/18.0.0/bin/crayftn
     flags: {}
     operating_system: rhel8
-    target: any
-    modules:
-    - PrgEnv-cray/8.3.3
-    - cce/15.0.1
-    environment:
-      set:
-        MACHTYPE: x86_64
+    target: x86_64
+    modules: []
+    environment: {}
+    extra_rpaths: []
 - compiler:
-    spec: gcc@11.2.0
+    spec: gcc@=8.5.0
     paths:
-      cc: gcc
-      cxx: g++
-      f77: gfortran
-      fc: gfortran
+      cc: /usr/bin/gcc
+      cxx: /usr/bin/g++
+      f77: /usr/bin/gfortran
+      fc: /usr/bin/gfortran
     flags: {}
     operating_system: rhel8
-    target: any
-    modules:
-    - PrgEnv-gnu
-    - gcc/11.2.0
-    environment: {}
+    target: x86_64
+    modules: []
+    environment: {}
+    extra_rpaths: []

View File

@@ -1,16 +1,15 @@
 packages:
-  # EXTERNALS
   cray-mpich:
     buildable: false
     externals:
-    - spec: cray-mpich@8.1.25 %cce@15.0.1
-      prefix: /opt/cray/pe/mpich/8.1.25/ofi/cray/10.0
+    - spec: cray-mpich@8.1.30 %cce
+      prefix: /opt/cray/pe/mpich/8.1.30/ofi/cray/18.0
       modules:
-      - cray-mpich/8.1.25
+      - cray-mpich/8.1.30
   cray-libsci:
     buildable: false
     externals:
-    - spec: cray-libsci@23.02.1.1 %cce@15.0.1
-      prefix: /opt/cray/pe/libsci/23.02.1.1/CRAY/9.0/x86_64/
+    - spec: cray-libsci@24.07.0 %cce
+      prefix: /opt/cray/pe/libsci/24.07.0/CRAY/18.0/x86_64/
       modules:
-      - cray-libsci/23.02.1.1
+      - cray-libsci/24.07.0

View File

@@ -0,0 +1,4 @@
ci:
pipeline-gen:
- build-job:
tags: ["cray-rhel-x86_64_v3"]

View File

@@ -1,4 +0,0 @@
ci:
pipeline-gen:
- build-job:
tags: ["cray-rhel-zen4"]

View File

@@ -10,8 +10,7 @@ spack:
   packages:
     all:
-      prefer:
-      - "%cce"
+      require: "%cce@18.0.0 target=x86_64_v3"
       compiler: [cce]
       providers:
         blas: [cray-libsci]
@@ -19,17 +18,15 @@ spack:
         mpi: [cray-mpich]
         tbb: [intel-tbb]
         scalapack: [netlib-scalapack]
-      target: [zen4]
       variants: +mpi
+    ncurses:
+      require: +termlib ldflags=-Wl,--undefined-version
     tbb:
       require: "intel-tbb"
     binutils:
       variants: +ld +gold +headers +libiberty ~nls
     boost:
       variants: +python +filesystem +iostreams +system
-    cuda:
-      version: [11.7.0]
     elfutils:
       variants: ~nls
       require: "%gcc"
@@ -39,20 +36,14 @@ spack:
       variants: +fortran +hl +shared
     libfabric:
       variants: fabrics=sockets,tcp,udp,rxm
-    libunwind:
-      variants: +pic +xz
     mgard:
       require:
       - "@2023-01-10:"
     mpich:
       variants: ~wrapperrpath
-    ncurses:
-      variants: +termlib
     paraview:
       # Don't build GUI support or GLX rendering for HPC/container deployments
-      require: "@5.11 ~qt ^[virtuals=gl] osmesa"
+      require: "~qt ^[virtuals=gl] osmesa"
-    python:
-      version: [3.8.13]
     trilinos:
       require:
       - one_of: [+amesos +amesos2 +anasazi +aztec +boost +epetra +epetraext +ifpack
@@ -63,12 +54,6 @@ spack:
       - one_of: [~ml ~muelu ~zoltan2 ~teko, +ml +muelu +zoltan2 +teko]
       - one_of: [+superlu-dist, ~superlu-dist]
       - one_of: [+shylu, ~shylu]
-    xz:
-      variants: +pic
-    mesa:
-      version: [21.3.8]
-    unzip:
-      require: "%gcc"

   specs:
   # CPU
@@ -76,62 +61,43 @@ spack:
   - aml
   - arborx
   - argobots
-  - bolt
-  - butterflypack
   - boost +python +filesystem +iostreams +system
   - cabana
-  - caliper
   - chai
-  - charliecloud
   - conduit
-  # - cp2k +mpi # libxsmm: ftn-78 ftn: ERROR in command linel; The -f option has an invalid argument, "tree-vectorize".
   - datatransferkit
   - flecsi
   - flit
-  - flux-core
-  - fortrilinos
   - ginkgo
   - globalarrays
   - gmp
   - gotcha
   - h5bench
   - hdf5-vol-async
-  - hdf5-vol-cache
+  - hdf5-vol-cache cflags=-Wno-error=incompatible-function-pointer-types
   - hdf5-vol-log
   - heffte +fftw
-  - hpx max_cpu_count=512 networking=mpi
   - hypre
   - kokkos +openmp
   - kokkos-kernels +openmp
-  - lammps
   - legion
   - libnrm
-  #- libpressio +bitgrooming +bzip2 ~cuda ~cusz +fpzip +hdf5 +libdistributed +lua +openmp +python +sz +sz3 +unix +zfp +json +remote +netcdf +mgard # mgard:
   - libquo
   - libunwind
   - mercury
   - metall
   - mfem
-  # - mgard +serial +openmp +timing +unstructured ~cuda # mgard
   - mpark-variant
-  - mpifileutils ~xattr
+  - mpifileutils ~xattr cflags=-Wno-error=implicit-function-declaration
   - nccmp
   - nco
-  - netlib-scalapack
+  - netlib-scalapack cflags=-Wno-error=implicit-function-declaration
-  - omega-h
-  - openmpi
   - openpmd-api ^adios2~mgard
-  - papi
   - papyrus
   - pdt
   - petsc
-  - plumed
   - precice
   - pumi
-  - py-h5py +mpi
-  - py-h5py ~mpi
-  - py-libensemble +mpi +nlopt
-  - py-petsc4py
   - qthreads scheduler=distrib
   - raja
   - slate ~cuda
@@ -144,8 +110,7 @@ spack:
   - swig@4.0.2-fortran
   - sz3
   - tasmanian
-  - tau +mpi +python
-  - trilinos@13.0.1 +belos +ifpack2 +stokhos
+  - trilinos +belos +ifpack2 +stokhos
   - turbine
   - umap
   - umpire
@@ -155,27 +120,47 @@ spack:
   # - alquimia # pflotran: petsc-3.19.4-c6pmpdtpzarytxo434zf76jqdkhdyn37/lib/petsc/conf/rules:169: material_aux.o] Error 1: fortran errors
   # - amrex # disabled temporarily pending resolution of unreproducible CI failure
   # - axom # axom: CMake Error at axom/sidre/cmake_install.cmake:154 (file): file INSTALL cannot find "/tmp/gitlab-runner-2/spack-stage/spack-stage-axom-0.8.1-jvol6riu34vuyqvrd5ft2gyhrxdqvf63/spack-build-jvol6ri/lib/fortran/axom_spio.mod": No such file or directory.
+  # - bolt # ld.lld: error: CMakeFiles/bolt-omp.dir/kmp_gsupport.cpp.o: symbol GOMP_atomic_end@@GOMP_1.0 has undefined version GOMP_1.0
   # - bricks # bricks: clang-15: error: clang frontend command failed with exit code 134 (use -v to see invocation)
+  # - butterflypack ^netlib-scalapack cflags=-Wno-error=implicit-function-declaration # ftn-2116 ftn: INTERNAL "driver" was terminated due to receipt of signal 01: Hangup.
+  # - caliper # papi: papi_internal.c:124:3: error: use of undeclared identifier '_papi_hwi_my_thread'; did you mean '_papi_hwi_read'?
+  # - charliecloud # libxcrypt-4.4.35: ld.lld: error: version script assignment of 'XCRYPT_2.0' to symbol 'xcrypt_r' failed: symbol not defined
+  # - cp2k +mpi # libxsmm: ftn-78 ftn: ERROR in command linel; The -f option has an invalid argument, "tree-vectorize".
   # - dealii # llvm@14.0.6: ?; intel-tbb@2020.3: clang-15: error: unknown argument: '-flifetime-dse=1'; assimp@5.2.5: clang-15: error: clang frontend command failed with exit code 134 (use -v to see invocation)
   # - dyninst # requires %gcc
   # - ecp-data-vis-sdk ~cuda ~rocm +adios2 +ascent +cinema +darshan +faodel +hdf5 +paraview +pnetcdf +sz +unifyfs +veloc ~visit +vtkm +zfp ^hdf5@1.14 # llvm@14.0.6: ?;
   # - exaworks # rust: ld.lld: error: relocation R_X86_64_32 cannot be used against local symbol; recompile with -fPIC'; defined in /opt/cray/pe/cce/15.0.1/cce/x86_64/lib/no_mmap.o, referenced by /opt/cray/pe/cce/15.0.1/cce/x86_64/lib/no_mmap.o:(__no_mmap_for_malloc)
+  # - flux-core # libxcrypt-4.4.35: ld.lld: error: version script assignment of 'XCRYPT_2.0' to symbol 'xcrypt_r' failed: symbol not defined
+  # - fortrilinos # trilinos-14.0.0: packages/teuchos/core/src/Teuchos_BigUIntDecl.hpp:67:8: error: no type named 'uint32_t' in namespace 'std'
   # - gasnet # configure error: User requested --enable-ofi but I don't know how to build ofi programs for your system
   # - gptune # py-scipy: meson.build:82:0: ERROR: Unknown compiler(s): [['/home/gitlab-runner-3/builds/dWfnZWPh/0/spack/spack/lib/spack/env/cce/ftn']]
   # - hpctoolkit # dyninst requires %gcc
+  # - hpx max_cpu_count=512 networking=mpi # libxcrypt-4.4.35
+  # - lammps # lammps-20240829.1: Reversed (or previously applied) patch detected! Assume -R? [n]
+  # - libpressio +bitgrooming +bzip2 ~cuda ~cusz +fpzip +hdf5 +libdistributed +lua +openmp +python +sz +sz3 +unix +zfp +json +remote +netcdf +mgard # mgard:
+  # - mgard +serial +openmp +timing +unstructured ~cuda # mgard
   # - nrm # py-scipy: meson.build:82:0: ERROR: Unknown compiler(s): [['/home/gitlab-runner-3/builds/dWfnZWPh/0/spack/spack/lib/spack/env/cce/ftn']]
   # - nvhpc # requires %gcc
+  # - omega-h # trilinos-13.4.1: packages/kokkos/core/src/impl/Kokkos_MemoryPool.cpp:112:48: error: unknown type name 'uint32_t'
+  # - openmpi # libxcrypt-4.4.35: ld.lld: error: version script assignment of 'XCRYPT_2.0' to symbol 'xcrypt_r' failed: symbol not defined
+  # - papi # papi_internal.c:124:3: error: use of undeclared identifier '_papi_hwi_my_thread'; did you mean '_papi_hwi_read'?
   # - parsec ~cuda # parsec: parsec/fortran/CMakeFiles/parsec_fortran.dir/parsecf.F90.o: ftn-2103 ftn: WARNING in command line. The -W extra option is not supported or invalid and will be ignored.
   # - phist # fortran_bindings/CMakeFiles/phist_fort.dir/phist_testing.F90.o: ftn-78 ftn: ERROR in command line. The -f option has an invalid argument, "no-math-errno".
   # - plasma # %cce conflict
+  # - plumed # libxcrypt-4.4.35: ld.lld: error: version script assignment of 'XCRYPT_2.0' to symbol 'xcrypt_r' failed: symbol not defined
+  # - py-h5py +mpi # libxcrypt-4.4.35: ld.lld: error: version script assignment of 'XCRYPT_2.0' to symbol 'xcrypt_r' failed: symbol not defined
+  # - py-h5py ~mpi # libxcrypt-4.4.35: ld.lld: error: version script assignment of 'XCRYPT_2.0' to symbol 'xcrypt_r' failed: symbol not defined
   # - py-jupyterhub # rust: ld.lld: error: relocation R_X86_64_32 cannot be used against local symbol; recompile with -fPIC'; defined in /opt/cray/pe/cce/15.0.1/cce/x86_64/lib/no_mmap.o, referenced by /opt/cray/pe/cce/15.0.1/cce/x86_64/lib/no_mmap.o:(__no_mmap_for_malloc)
+  # - py-libensemble +mpi +nlopt # libxcrypt-4.4.35: ld.lld: error: version script assignment of 'XCRYPT_2.0' to symbol 'xcrypt_r' failed: symbol not defined
+  # - py-petsc4py # libxcrypt-4.4.35: ld.lld: error: version script assignment of 'XCRYPT_2.0' to symbol 'xcrypt_r' failed: symbol not defined
   # - quantum-espresso # quantum-espresso: CMake Error at cmake/FindSCALAPACK.cmake:503 (message): A required library with SCALAPACK API not found. Please specify library
   # - scr # scr: make[2]: *** [examples/CMakeFiles/test_ckpt_F.dir/build.make:112: examples/test_ckpt_F] Error 1: /opt/cray/pe/cce/15.0.1/binutils/x86_64/x86_64-pc-linux-gnu/bin/ld: /opt/cray/pe/mpich/8.1.25/ofi/cray/10.0/lib/libmpi_cray.so: undefined reference to `PMI_Barrier'
   # - strumpack ~slate # strumpack: [test/CMakeFiles/test_HSS_seq.dir/build.make:117: test/test_HSS_seq] Error 1: ld.lld: error: undefined reference due to --no-allow-shlib-undefined: mpi_abort_
+  # - tau +mpi +python # libelf: configure: error: installation or configuration problem: C compiler cannot create executables.; papi: papi_internal.c:124:3: error: use of undeclared identifier '_papi_hwi_my_thread'; did you mean '_papi_hwi_read'?
   # - upcxx # upcxx: configure error: User requested --enable-ofi but I don't know how to build ofi programs for your system
   # - variorum # variorum: /opt/cray/pe/cce/15.0.1/binutils/x86_64/x86_64-pc-linux-gnu/bin/ld: /opt/cray/pe/lib64/libpals.so.0: undefined reference to `json_array_append_new@@libjansson.so.4'
-  # - xyce +mpi +shared +pymi +pymi_static_tpls ^trilinos~shylu # openblas: ftn-2307 ftn: ERROR in command line: The "-m" option must be followed by 0, 1, 2, 3 or 4.; make[2]: *** [<builtin>: spotrf2.o] Error 1; make[1]: *** [Makefile:27: lapacklib] Error 2; make: *** [Makefile:250: netlib] Error 2
   # - warpx +python # py-scipy: meson.build:82:0: ERROR: Unknown compiler(s): [['/home/gitlab-runner-3/builds/dWfnZWPh/0/spack/spack/lib/spack/env/cce/ftn']]
+  # - xyce +mpi +shared +pymi +pymi_static_tpls ^trilinos~shylu # openblas: ftn-2307 ftn: ERROR in command line: The "-m" option must be followed by 0, 1, 2, 3 or 4.; make[2]: *** [<builtin>: spotrf2.o] Error 1; make[1]: *** [Makefile:27: lapacklib] Error 2; make: *** [Makefile:250: netlib] Error 2

   cdash:
     build-group: E4S Cray

View File

@@ -36,7 +36,7 @@ export QA_DIR=$(realpath $QA_DIR)
cd "$SPACK_ROOT" cd "$SPACK_ROOT"
# Run bash tests with coverage enabled, but pipe output to /dev/null # Run bash tests with coverage enabled, but pipe output to /dev/null
# because it seems that kcov seems to undo the script's redirection # because it seems that kcov undoes the script's redirection
if [ "$COVERAGE" = true ]; then if [ "$COVERAGE" = true ]; then
kcov "$SPACK_ROOT/coverage" "$QA_DIR/setup-env-test.sh" &> /dev/null kcov "$SPACK_ROOT/coverage" "$QA_DIR/setup-env-test.sh" &> /dev/null
kcov "$SPACK_ROOT/coverage" "$QA_DIR/completion-test.sh" &> /dev/null kcov "$SPACK_ROOT/coverage" "$QA_DIR/completion-test.sh" &> /dev/null

View File

@@ -0,0 +1,25 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import *
class DependsOnManyvariants(Package):
"""
A package with a dependency on `manyvariants`, so that `manyvariants` can
be spliced in tests.
"""
homepage = "https://www.test.com"
has_code = False
version("1.0")
version("2.0")
depends_on("manyvariants@1.0", when="@1.0")
depends_on("manyvariants@2.0", when="@2.0")
def install(self, spec, prefix):
touch(prefix.bar)

View File

@@ -0,0 +1,19 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import *
class DependsOnVirtualWithAbi(Package):
"""
This has a virtual dependency on `virtual-with-abi`, mostly for testing
automatic splicing of providers.
"""
homepage = "https://www.example.com"
has_code = False
version("1.0")
depends_on("virtual-with-abi")

View File

@@ -0,0 +1,33 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import *
class Manyvariants(Package):
"""
A package with 4 different variants of different arities to test the
`match_variants` argument to `can_splice`
"""
homepage = "https://www.test.com"
has_code = False
version("2.0.1")
version("2.0.0")
version("1.0.1")
version("1.0.0")
variant("a", default=True)
variant("b", default=False)
variant("c", values=("v1", "v2", "v3"), multi=False, default="v1")
variant("d", values=("v1", "v2", "v3"), multi=False, default="v1")
can_splice("manyvariants@1.0.0", when="@1.0.1", match_variants="*")
can_splice("manyvariants@2.0.0+a~b", when="@2.0.1~a+b", match_variants=["c", "d"])
can_splice("manyvariants@2.0.0 c=v1 d=v1", when="@2.0.1+a+b")
def install(self, spec, prefix):
touch(prefix.bar)

View File

@@ -0,0 +1,22 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import *
class SpliceDependsOnT(Package):
"""Package that depends on splice-t"""
homepage = "http://www.example.com"
url = "http://www.example.com/splice-depends-on-t-1.0.tar.gz"
version("1.0", md5="0123456789abcdef0123456789abcdef")
depends_on("splice-t")
def install(self, spec, prefix):
with open(prefix.join("splice-depends-on-t"), "w") as f:
f.write("splice-depends-on-t: {0}".format(prefix))
f.write("splice-t: {0}".format(spec["splice-t"].prefix))

View File

@@ -12,17 +12,24 @@ class SpliceH(Package):
homepage = "http://www.example.com" homepage = "http://www.example.com"
url = "http://www.example.com/splice-h-1.0.tar.gz" url = "http://www.example.com/splice-h-1.0.tar.gz"
version("1.0", md5="0123456789abcdef0123456789abcdef") version("1.0.2")
version("1.0.1")
version("1.0.0")
variant("foo", default=False, description="nope") variant("foo", default=False, description="nope")
variant("bar", default=False, description="nope") variant("bar", default=False, description="nope")
variant("baz", default=False, description="nope") variant("baz", default=False, description="nope")
variant("compat", default=True, description="nope")
depends_on("splice-z") depends_on("splice-z")
depends_on("splice-z+foo", when="+foo") depends_on("splice-z+foo", when="+foo")
provides("something") provides("something")
provides("somethingelse") provides("somethingelse")
provides("virtual-abi")
can_splice("splice-h@1.0.0 +compat", when="@1.0.1 +compat")
can_splice("splice-h@1.0.0:1.0.1 +compat", when="@1.0.2 +compat")
def install(self, spec, prefix): def install(self, spec, prefix):
with open(prefix.join("splice-h"), "w") as f: with open(prefix.join("splice-h"), "w") as f:

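The second directive above uses a version range as the splice target: `splice-h@1.0.0:1.0.1` covers both older releases, so a `@1.0.2 +compat` build claims compatibility with either, while the `+compat` condition on both sides keeps `~compat` builds out. A hedged sketch of the inclusive-range check (Spack's `:` range syntax reduced to tuple comparison for illustration; the helper is invented):

# Invented helper mirroring the "1.0.0:1.0.1" range in the directive above.
def in_range(version, lo, hi):
    as_tuple = lambda v: tuple(int(p) for p in v.split("."))
    return as_tuple(lo) <= as_tuple(version) <= as_tuple(hi)

assert in_range("1.0.0", "1.0.0", "1.0.1")
assert in_range("1.0.1", "1.0.0", "1.0.1")
assert not in_range("1.0.2", "1.0.0", "1.0.1")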

@@ -12,10 +12,16 @@ class SpliceZ(Package):
     homepage = "http://www.example.com"
     url = "http://www.example.com/splice-z-1.0.tar.gz"
 
-    version("1.0", md5="0123456789abcdef0123456789abcdef")
+    version("1.0.2")
+    version("1.0.1")
+    version("1.0.0")
 
     variant("foo", default=False, description="nope")
     variant("bar", default=False, description="nope")
+    variant("compat", default=True, description="nope")
+
+    can_splice("splice-z@1.0.0 +compat", when="@1.0.1 +compat")
+    can_splice("splice-z@1.0.0:1.0.1 +compat", when="@1.0.2 +compat")
 
     def install(self, spec, prefix):
         with open(prefix.join("splice-z"), "w") as f:


@@ -0,0 +1,25 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

from spack.package import *


class VirtualAbi1(Package):
    """
    This package provides `virtual-with-abi` and is conditionally ABI
    compatible with `virtual-abi-multi`
    """

    homepage = "https://www.example.com"
    has_code = False

    version("1.0")

    provides("virtual-with-abi")

    can_splice("virtual-abi-multi@1.0 abi=one", when="@1.0")

    def install(self, spec, prefix):
        touch(prefix.foo)


@@ -0,0 +1,25 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

from spack.package import *


class VirtualAbi2(Package):
    """
    This package provides `virtual-with-abi` and is conditionally ABI
    compatible with `virtual-abi-multi`
    """

    homepage = "https://www.example.com"
    has_code = False

    version("1.0")

    provides("virtual-with-abi")

    can_splice("virtual-abi-multi@1.0 abi=two", when="@1.0")

    def install(self, spec, prefix):
        touch(prefix.foo)


@@ -0,0 +1,29 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

from spack.package import *


class VirtualAbiMulti(Package):
    """
    This package provides `virtual-with-abi` and is ABI compatible with
    either `virtual-abi-1` or `virtual-abi-2`, depending on the value of
    its `abi` variant
    """

    homepage = "https://www.example.com"
    has_code = False

    version("1.0")

    variant("abi", default="custom", multi=False, values=("one", "two", "custom"))

    provides("virtual-with-abi")

    can_splice("virtual-abi-1@1.0", when="@1.0 abi=one")
    can_splice("virtual-abi-2@1.0", when="@1.0 abi=two")

    def install(self, spec, prefix):
        touch(prefix.foo)

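Taken together with `virtual-abi-1` and `virtual-abi-2` above, each compatible pairing is declared in both directions, since `can_splice` is one-way. A hedged summary of the relation these three mock providers encode (the table and helper are illustrative, not Spack code):

# Replacement spec -> installed specs it may be spliced in for,
# as declared by the can_splice directives across the three packages.
CAN_SPLICE = {
    "virtual-abi-1@1.0": ["virtual-abi-multi@1.0 abi=one"],
    "virtual-abi-2@1.0": ["virtual-abi-multi@1.0 abi=two"],
    "virtual-abi-multi@1.0 abi=one": ["virtual-abi-1@1.0"],
    "virtual-abi-multi@1.0 abi=two": ["virtual-abi-2@1.0"],
}

def may_replace(replacement, installed):
    return installed in CAN_SPLICE.get(replacement, [])

assert may_replace("virtual-abi-multi@1.0 abi=one", "virtual-abi-1@1.0")
assert not may_replace("virtual-abi-multi@1.0 abi=custom", "virtual-abi-1@1.0")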

@@ -0,0 +1,16 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

from spack.package import *


class VirtualWithAbi(Package):
    """Virtual package for mocking an interface with stable ABI."""

    homepage = "https://www.abi.org/"
    virtual = True

    def test_hello(self):
        print("Hello there!")


@@ -16,6 +16,7 @@ class Direnv(GoPackage):
     license("MIT")
 
+    version("2.35.0", sha256="a7aaec49d1b305f0745dad364af967fb3dc9bb5befc9f29d268d528b5a474e57")
     version("2.34.0", sha256="3d7067e71500e95d69eac86a271a6b6fc3f2f2817ba0e9a589524bf3e73e007c")
     version("2.33.0", sha256="8ef18051aa6bdcd6b59f04f02acdd0b78849b8ddbdbd372d4957af7889c903ea")
     version("2.32.3", sha256="c66f6d1000f28f919c6106b5dcdd0a0e54fb553602c63c60bf59d9bbdf8bd33c")


@@ -12,33 +12,36 @@ class GdkPixbuf(MesonPackage):
     GTK+ 2 but it was split off into a separate package in preparation for the change to GTK+ 3."""
 
     homepage = "https://gitlab.gnome.org/GNOME/gdk-pixbuf"
-    url = "https://ftp.acc.umu.se/pub/gnome/sources/gdk-pixbuf/2.40/gdk-pixbuf-2.40.0.tar.xz"
     git = "https://gitlab.gnome.org/GNOME/gdk-pixbuf"
-    list_url = "https://ftp.acc.umu.se/pub/gnome/sources/gdk-pixbuf/"
+    url = "https://gitlab.gnome.org/GNOME/gdk-pixbuf/-/archive/2.40.0/gdk-pixbuf-2.40.0.tar.gz"
+    # Falling back to the gitlab source since the mirror here seems to be broken
+    # url = "https://ftp.acc.umu.se/pub/gnome/sources/gdk-pixbuf/2.40/gdk-pixbuf-2.40.0.tar.xz"
+    # list_url = "https://ftp.acc.umu.se/pub/gnome/sources/gdk-pixbuf/"
     list_depth = 1
 
     license("LGPL-2.1-or-later", checked_by="wdconinc")
 
-    version("2.42.12", sha256="b9505b3445b9a7e48ced34760c3bcb73e966df3ac94c95a148cb669ab748e3c7")
+    version("2.42.12", sha256="d41966831b3d291fcdfe31f683bea4b3f03241d591ddbe550b5db873af3da364")
     # https://nvd.nist.gov/vuln/detail/CVE-2022-48622
     version(
         "2.42.10",
-        sha256="ee9b6c75d13ba096907a2e3c6b27b61bcd17f5c7ebeab5a5b439d2f2e39fe44b",
+        sha256="87a086c51d9705698b22bd598a795efaccf61e4db3a96f439dcb3cd90506dab8",
         deprecated=True,
     )
     version(
         "2.42.9",
-        sha256="28f7958e7bf29a32d4e963556d241d0a41a6786582ff6a5ad11665e0347fc962",
+        sha256="226d950375907857b23c5946ae6d30128f08cd75f65f14b14334c7a9fb686e36",
         deprecated=True,
     )
     version(
         "2.42.6",
-        sha256="c4a6b75b7ed8f58ca48da830b9fa00ed96d668d3ab4b1f723dcf902f78bde77f",
+        sha256="c4f3a84a04bc7c5f4fbd97dce7976ab648c60628f72ad4c7b79edce2bbdb494d",
         deprecated=True,
     )
     version(
         "2.42.2",
-        sha256="83c66a1cfd591d7680c144d2922c5955d38b4db336d7cd3ee109f7bcf9afef15",
+        sha256="249b977279f761979104d7befbb5ee23f1661e29d19a36da5875f3a97952d13f",
         deprecated=True,
     )


@@ -183,7 +183,7 @@ class Gromacs(CMakePackage, CudaPackage):
"sve", "sve",
default=True, default=True,
description="Enable SVE on aarch64 if available", description="Enable SVE on aarch64 if available",
when="target=neoverse_v1:,neoverse_v2:", when="target=neoverse_v1:,neoverse_v2:,neoverse_n2:",
) )
variant( variant(
"sve", default=True, description="Enable SVE on aarch64 if available", when="target=a64fx" "sve", default=True, description="Enable SVE on aarch64 if available", when="target=a64fx"


@@ -0,0 +1,35 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

from spack.package import *


class Ph5concat(AutotoolsPackage):
    """Parallel Data Concatenation for High Energy Physics Data Analysis"""

    homepage = "https://github.com/NU-CUCIS/ph5concat"
    url = "https://github.com/NU-CUCIS/ph5concat/archive/v1.1.0.tar.gz"

    maintainers("vhewes")

    version("1.1.0", sha256="cecc22325a56771cda1fc186e6bd1f9bde2957beca3fa9a387d55462efd5254f")

    depends_on("autoconf", type="build")
    depends_on("automake", type="build")
    depends_on("libtool", type="build")
    depends_on("zlib")
    depends_on("hdf5+hl+mpi@1.10.4:1.12")
    depends_on("mpich")

    variant("profiling", default=False, description="Enable profiling support")

    def setup_build_environment(self, env):
        env.set("LIBS", "-ldl -lz")

    def configure_args(self):
        args = [f"--with-{pkg}={self.spec[pkg].prefix}" for pkg in ("hdf5", "mpich")]
        args.extend(self.enable_or_disable("profiling"))
        return args

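For a concrete sense of what `configure_args` yields: the comprehension points configure at the hdf5 and mpich install prefixes, and `enable_or_disable("profiling")` expands to `--enable-profiling` or `--disable-profiling` depending on the variant. A sketch with invented prefixes (the real values come from `self.spec[pkg].prefix`):

# Hypothetical prefixes, for illustration only.
prefixes = {"hdf5": "/opt/spack/hdf5-1.12.2", "mpich": "/opt/spack/mpich-4.2.3"}
args = [f"--with-{pkg}={prefix}" for pkg, prefix in prefixes.items()]
args.append("--enable-profiling")  # "--disable-profiling" for ~profiling
print(args)
# ['--with-hdf5=/opt/spack/hdf5-1.12.2',
#  '--with-mpich=/opt/spack/mpich-4.2.3',
#  '--enable-profiling']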

@@ -0,0 +1,34 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

from spack.package import *


class PyNugraph(PythonPackage):
    """Graph Neural Network for neutrino physics event reconstruction"""

    pypi = "nugraph/nugraph-24.7.1.tar.gz"

    maintainers("vhewes")

    license("MIT", checked_by="vhewes")

    version("24.7.1", sha256="e1449e4a37049cc774ad026d4f2db339eb60bb59109a11920bb65a4061915de8")
    version("24.7.0", sha256="b95d93a1cbcd280a3529ce4782ef778b982d9d4edcc19f522442c38144895f65")
    version("24.4.0", sha256="5f888d065819b1ec7c33e7f829ad65eb963db2cf109a5d31b4caef49c004f86f")
    version("24.2.0", sha256="4765ea73b384e95a38a598499e77d805541e415049da9f6f46193f8bc281208a")
    version("23.11.1", sha256="b160996fca9615b2c7e6ed02fb780af5edaa97f6cdafd45abdf65ea0c7a6f2ca")
    version("23.11.0", sha256="a1e01a8c3143fc8db2cf8a3584d192a738d89eb865b1d52cd2994b24bd4175ec")
    version("23.10.0", sha256="8a0219318c6bd6d0d240e419ef88cdedd7e944276f0cce430d9ece423e06f1b8")

    depends_on("py-flit-core", type="build")
    depends_on("py-matplotlib")
    depends_on("py-numl")
    depends_on("py-pynvml")
    depends_on("py-seaborn")
    depends_on("py-pytorch-lightning")

    extends("python")


@@ -0,0 +1,40 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

from spack.package import *


class PyNuml(PythonPackage):
    """Standardised ML input processing for particle physics"""

    pypi = "pynuml/pynuml-24.7.1.tar.gz"

    maintainers("vhewes")

    license("MIT", checked_by="vhewes")

    version("24.7.1", sha256="20d2f1a07887473e67c79ecc3804b8012e22b78883199fdb0d07bb1b725b6ab0")
    version("24.7.0", sha256="d47f71ead6861278595b79d04c554da4998d5c4c50587e4c90231f50db0f2e81")
    version("24.6.0", sha256="357d2b0e0b9ca179514d177278620e5ac57bed37bfb6d145c172150126432613")
    version("23.11.0", sha256="1a7e61864cfeb0b27c6a93646c33e3f457bbc384eb86aee4df76b5e02898d02f")
    version("23.9.0", sha256="77ea8c9df541351adeb249594cce27d742973ee82a0d7f2ad8cdcffa9d3fa6b1")
    version("23.8.0", sha256="0896797f3f70b3a6d3d74f7a3e7fe5eaf59a2000a47ffc7ac08b73be0aa15706")
    version("23.7.0", sha256="5449dd09a7e046d036e12c7971e61d2862cdb79c7932144b038288fc05ca50a8")
    version("23.6.1", sha256="fdb23a9d4f1b83b06cc35b07608fe4c2e55f8307ac47851cccc21a20b69ab674")
    version("23.6.0", sha256="fcc1546b9489584f2635f6418c5e1a43f6bdf02dd5c46b7afa09ea5f247524a2")
    version("23.5.2", sha256="d83576c8e25e22cc9ba68a35b9690ea861f7a4c09db65ca134849c89fba9b330")
    version("23.5.1", sha256="73ef1bea1022b9ebddec35ac7d66c1394003aa5e63a4ec99bfa14d4f833e04a4")
    version("23.5.0", sha256="dccb774932813ddc788b1d27e52e251d9db6ea16b303596bfa0955ae51098674")

    depends_on("py-flit-core", type="build")
    depends_on("mpich")
    depends_on("py-h5py +mpi")
    depends_on("py-pandas")
    depends_on("py-particle")
    depends_on("py-plotly")
    depends_on("py-torch-geometric")

    extends("python")


@@ -65,9 +65,15 @@ def cmake_args(self):
             if re_qt.match(dep.name):
                 qt_prefix_path.append(self.spec[dep.name].prefix)
 
-        # Now append all qt-* dependency prefixex into a prefix path
+        # Now append all qt-* dependency prefixes into a prefix path
         args.append(self.define("QT_ADDITIONAL_PACKAGES_PREFIX_PATH", ":".join(qt_prefix_path)))
 
+        # Make our CMAKE_INSTALL_RPATH redundant:
+        # for prefix of current package ($ORIGIN/../lib type of rpaths),
+        args.append(self.define("QT_DISABLE_RPATH", True))
+        # for prefixes of dependencies
+        args.append(self.define("QT_NO_DISABLE_CMAKE_INSTALL_RPATH_USE_LINK_PATH", True))
+
         return args
 
     @run_after("install")

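As a rough illustration of what the two boolean defines contribute to the cmake command line (a sketch mirroring how Spack's `define` helper renders booleans; the helper below is a stand-in, not taken from this diff):

# Invented stand-in for the boolean rendering done by self.define.
def define(var, value):
    return f"-D{var}:BOOL={'ON' if value else 'OFF'}"

print(define("QT_DISABLE_RPATH", True))
print(define("QT_NO_DISABLE_CMAKE_INSTALL_RPATH_USE_LINK_PATH", True))
# -DQT_DISABLE_RPATH:BOOL=ON
# -DQT_NO_DISABLE_CMAKE_INSTALL_RPATH_USE_LINK_PATH:BOOL=ON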

@@ -16,6 +16,7 @@ class Restic(GoPackage):
     license("BSD-2-Clause")
 
+    version("0.17.3", sha256="bf0dd73edfae531c24070e2e7833938613f7b179ed165e6b681098edfdf286c8")
     version("0.17.1", sha256="cba3a5759690d11dae4b5620c44f56be17a5688e32c9856776db8a9a93d6d59a")
     version("0.16.4", sha256="d736a57972bb7ee3398cf6b45f30e5455d51266f5305987534b45a4ef505f965")
     version("0.16.3", sha256="a94d6c1feb0034fcff3e8b4f2d65c0678f906fc21a1cf2d435341f69e7e7af52")


@@ -31,6 +31,9 @@ class Spectre(CMakePackage):
     license("MIT")
 
     version("develop", branch="develop")
+    version(
+        "2024.03.19", sha256="42a25c8827b56268d9826239cde521491be19318d83785b35cd0265a9f6a1f7c"
+    )
     version(
         "2024.09.29", sha256="b5e84b4564ad7cd2e069a24c6c472aab342753fe8393242eceba378b52226acb"
     )