Compare commits

...

34 Commits

Author SHA1 Message Date
Harmen Stoppels
2bfcc69fa8 Set version to v0.23.1 2025-02-19 16:04:31 +01:00
kwryankrattiger
af62d91457 gdk-pixbuf: Point at gitlab instead of broken mirror (#47825) 2025-02-19 14:37:47 +01:00
Harmen Stoppels
f32a74491e spec.py: ensure spec.extra_attributes is {} if is null in json (#48896) 2025-02-19 14:37:47 +01:00
Harmen Stoppels
4f80f07b9a spec.py: fix hash change due to None vs {} (#48854)
* Fix hash change due to None vs {}

* Enforce null for empty list of external_modules
2025-02-19 14:37:47 +01:00
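A minimal sketch (not Spack's actual hashing code) of why `None` and `{}` must be canonicalized before hashing serialized metadata:

```python
import hashlib
import json

def digest(metadata: dict) -> str:
    # Canonical JSON, then hash: any textual difference changes the digest.
    return hashlib.sha256(json.dumps(metadata, sort_keys=True).encode()).hexdigest()

# "null" and "{}" serialize differently, so an empty dict must be normalized
# to one canonical form before hashing.
assert digest({"extra_attributes": None}) != digest({"extra_attributes": {}})
```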
eugeneswalker
4a8ae59a9e e4s cray rhel ci stack: re-enable and update for new cpe (#47697)
* e4s cray rhel ci stack: re-enable and update for new cpe, should fix cray libsci issue

* only run e4s-cray-rhel stack

* Make Autotools build_system point at correct build_directory

* remove selective enable of cray-rhel stacks

* restore SPACK_CI_DISABLE_STACKS

* use dot prefix to hide cray-sles jobs instead of comment-out

---------

Co-authored-by: Ryan Krattiger <ryan.krattiger@kitware.com>
2025-02-19 14:37:47 +01:00
Harmen Stoppels
6f7e881b69 fix year dependent license verification check 2025-02-19 14:37:47 +01:00
Till Ehrengruber
24bcaec6c3 oci/opener.py: respect system proxy settings (#48783) 2025-02-19 14:37:47 +01:00
Massimiliano Culpo
1c9bb36fdf spec.py: fix ArchSpec.intersects (#48741)
Fixes a bug where `x86_64:` and `ppc64le:` intersected, while `x86_64:` and `:haswell` did not.
2025-02-19 14:37:47 +01:00
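A toy model of the restored invariant (illustrative data structures, not Spack's implementation): target ranges intersect only when their bounds are comparable in the microarchitecture hierarchy.

```python
# Each microarchitecture maps to itself plus its ancestors (simplified).
LINEAGE = {
    "x86_64": {"x86_64"},
    "haswell": {"haswell", "x86_64"},
    "ppc64le": {"ppc64le"},
}

def le(a: str, b: str) -> bool:
    """a <= b in the partial order: a is b itself or one of b's ancestors."""
    return a in LINEAGE[b]

def lo_hi_intersect(lo: str, hi: str) -> bool:
    """Does 'lo:' intersect ':hi'? Only if lo <= hi."""
    return le(lo, hi)

def lo_lo_intersect(lo_a: str, lo_b: str) -> bool:
    """Do 'lo_a:' and 'lo_b:' intersect? Only if the bounds are comparable."""
    return le(lo_a, lo_b) or le(lo_b, lo_a)

assert lo_hi_intersect("x86_64", "haswell")      # x86_64: and :haswell do overlap
assert not lo_lo_intersect("x86_64", "ppc64le")  # disjoint families never overlap
```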
Harmen Stoppels
044d1b12bb autotools.py: set lt_cv_apple_cc_single_mod=yes (#48671)
Since macOS 15 `ld -single_module` warns with a deprecation message,
which makes configure scripts believe the flag is unsupported. That
in turn triggers a code path where `archive_cmds` is set to

```
$CC -r -keep_private_externs -nostdlib ... -dynamiclib
```

instead of just

```
$CC -dynamiclib ...
```

This code path was meant to trigger only on ancient macOS <= 10.4 where
libtool had to add `-single_module`, which has been the linker default
since macOS 10.4, and is now apparently deprecated because the flag has
been a no-op for more than 15 years.

The wrong `archive_cmds` causes actual problems when combined with a bug
in OpenMPI's compiler wrapper (`CC=mpicc`), which appends `-rpath` flags
that cause an error when combined with the `-r` flag added by the
autotools.

Spack's compiler wrapper doesn't do this, but it's likely there are
other compiler wrappers out there that are not aware that `-r` and
`-rpath` cannot be combined.

The fix is to change defaults: `lt_cv_apple_cc_single_mod=yes`.
2025-02-19 14:37:47 +01:00
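For context, an autoconf cache variable set in the environment preseeds the corresponding configure check, so the faulty probe never runs (an illustrative invocation; the actual fix, visible in the build-system diff below, filters the generated configure script instead):

```console
$ lt_cv_apple_cc_single_mod=yes ./configure
```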
Harmen Stoppels
536468783d spec.py: make hashing of extra_attributes order independent (#48615) 2025-02-19 14:37:47 +01:00
Harmen Stoppels
a298296237 relocate_text: fix return value (#48568) 2025-02-19 14:37:47 +01:00
Harmen Stoppels
1db11ff06b spec.py: fix return type of concretized() (#48504) 2025-02-19 14:37:47 +01:00
Harmen Stoppels
b621c045a7 cmd/__init__.py: pass tests in case n=1 (#48417) 2025-02-19 14:37:47 +01:00
Wouter Deconinck
734ada1f10 doc: ensure getting_started has bootstrap list output in correct place (#48281) 2025-02-19 14:37:47 +01:00
psakievich
71712465e8 Ensure command_line scope is always last (#48255) 2025-02-19 14:37:47 +01:00
Massimiliano Culpo
7cc67b5c23 bootstrap mirror: fix references from v0.4 to v0.6 (#48235) 2025-02-19 14:37:47 +01:00
Harmen Stoppels
1f9b0e39b9 llnl.util.lang: remove testing literal backtrace output (#48209) 2025-02-19 14:37:47 +01:00
Harmen Stoppels
d012683c1b docs: advertise --oci-username-variable and --oci-password-variable (#48189) 2025-02-19 14:37:47 +01:00
Todd Gamblin
8b3c3e9165 Make unit tests work on ubuntu 24.04 (#48151)
`kcov` was removed in Ubuntu 24.04, and it is no longer
installable via `apt` in our CI images. Install it via
Linuxbrew instead, at least until it comes back to Ubuntu.

`subversion` is also not installed on ubuntu 24 by default,
so we have to install it manually.

- [x] Add linuxbrew to linux tests
- [x] Install `kcov` with brew
- [x] Install subversion with `apt`

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2025-02-19 14:37:47 +01:00
Harmen Stoppels
dd94a44b6a gha: fix git safe.directory config (#48041) 2025-02-19 14:37:47 +01:00
Harmen Stoppels
95d190e354 filter_file: make tempfile later (#48108)
* filter_file: make tempfile later

* also add a `.` after the filename
2025-02-19 14:37:47 +01:00
Harmen Stoppels
f124409d8a filter_file: fix various bugs (#48038)
* `f.tell` on a `TextIOWrapper` does not return the offset in bytes, but
  an opaque integer that can only be used for `f.seek` on the same
  object. Spack assumes it's a byte offset.
* Do not open in a locale dependent way, but assume utf-8 (and allow
  users to override that)
* Use tempfile to generate a backup/temporary file in a safe way
* Comparison between None and str is valid and on purpose.
2025-02-19 14:37:47 +01:00
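A short demonstration of the first bullet (illustrative data, not Spack code): the value `tell()` returns on a text stream is an opaque cookie for `seek()` on that same stream, not a byte offset into the underlying file.

```python
import io

raw = io.BytesIO("héllo\nwörld\n".encode("utf-8"))
text = io.TextIOWrapper(raw, encoding="utf-8", newline="")

text.readline()
cookie = text.tell()  # opaque cookie encoding decoder state, not a byte count
text.seek(cookie)     # only valid on this same object
assert text.readline() == "wörld\n"
# raw.seek(cookie) would be unsound: the cookie need not equal a byte offset.
```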
Harmen Stoppels
bee2132c04 log.py: improve utf-8 handling, and non-utf-8 output (#48005) 2025-02-19 14:37:47 +01:00
Harmen Stoppels
d0ea33fa67 stage.py: improve path to url (#47898) 2025-02-19 14:37:47 +01:00
Carson Woods
4e913876b9 bug fix: updated warning message to reflect impending v1.0 release (#47887) 2025-02-19 14:37:47 +01:00
Harmen Stoppels
c25e43ce61 darwin: preserve hardlinks on codesign/install_name_tool (#47808) 2025-02-19 14:37:47 +01:00
Wouter Deconinck
85146d875b qt-base: fix rpath for dependents (#47424)
ensure that CMAKE_INSTALL_RPATH_USE_LINK_PATH=ON works in qt packages.
2025-02-19 14:37:47 +01:00
Harmen Stoppels
c8167eec5d Set version to 0.23.1.dev0 2025-02-19 14:37:47 +01:00
Gregory Becker
c6d4037758 update version number to 0.23.0 2024-11-17 00:59:27 -05:00
Gregory Becker
08f1cf9ae2 Update CHANGELOG.md for v0.23.0 2024-11-17 00:59:27 -05:00
Todd Gamblin
48dfa3c95e Spec: prefer a splice-specific method to __len__ (#47585)
Automatic splicing made `Spec` grow a `__len__` method, but it's only used
in one place, and it's not clear the semantics are useful elsewhere. It also
runs the risk of `Spec`s one day being confused for other types of containers.

Rather than introduce a new function for one algorithm, let's use a more
specific method in the splice code.

- [x] Use topological ordering in `_resolve_automatic_splices` instead of 
      sorting by node count
- [x] delete `Spec.__len__()` and `Spec.__bool__()`

---------

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
Co-authored-by: Greg Becker <becker33@llnl.gov>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-11-14 08:24:06 +01:00
Todd Gamblin
e5c411d8f0 spack spec: simplify and unify output (#47574)
`spack spec` output has looked like this for a while:

```console
> spack spec /v5fn6xo /wd2p2v7
Input spec
--------------------------------
 -   /v5fn6xo

Concretized
--------------------------------
[+]  openssl@3.3.1%apple-clang@16.0.0~docs+shared build_system=generic certs=mozilla arch=darwin-sequoia-m1
[+]      ^ca-certificates-mozilla@2023-05-30%apple-clang@16.0.0 build_system=generic arch=darwin-sequoia-m1
...

Input spec
--------------------------------
 -   /wd2p2v7

Concretized
--------------------------------
[+]  py-six@1.16.0%apple-clang@16.0.0 build_system=python_pip arch=darwin-sequoia-m1
[+]      ^py-pip@23.1.2%apple-clang@16.0.0 build_system=generic arch=darwin-sequoia-m1
```

But the input spec is right there on the CLI, and it doesn't add anything to the output.
Also, since #44843, specs concretized on the command line can be unified, so it makes sense
to display them as we did in #44489 -- as one multi-root tree instead of as multiple
single-root trees.

With this PR, concretize output now looks like this:

```console
> spack spec /v5fn6xo /wd2p2v7
[+]  openssl@3.3.1%apple-clang@16.0.0~docs+shared build_system=generic certs=mozilla arch=darwin-sequoia-m1
[+]      ^ca-certificates-mozilla@2023-05-30%apple-clang@16.0.0 build_system=generic arch=darwin-sequoia-m1
[+]      ^gmake@4.4.1%apple-clang@16.0.0~guile build_system=generic arch=darwin-sequoia-m1
[+]      ^perl@5.40.0%apple-clang@16.0.0+cpanm+opcode+open+shared+threads build_system=generic arch=darwin-sequoia-m1
[+]          ^berkeley-db@18.1.40%apple-clang@16.0.0+cxx~docs+stl build_system=autotools patches=26090f4,b231fcc arch=darwin-sequoia-m1
[+]          ^bzip2@1.0.8%apple-clang@16.0.0~debug~pic+shared build_system=generic arch=darwin-sequoia-m1
[+]              ^diffutils@3.10%apple-clang@16.0.0 build_system=autotools arch=darwin-sequoia-m1
[+]                  ^libiconv@1.17%apple-clang@16.0.0 build_system=autotools libs=shared,static arch=darwin-sequoia-m1
[+]          ^gdbm@1.23%apple-clang@16.0.0 build_system=autotools arch=darwin-sequoia-m1
[+]              ^readline@8.2%apple-clang@16.0.0 build_system=autotools patches=bbf97f1 arch=darwin-sequoia-m1
[+]                  ^ncurses@6.5%apple-clang@16.0.0~symlinks+termlib abi=none build_system=autotools patches=7a351bc arch=darwin-sequoia-m1
[+]                      ^pkgconf@2.2.0%apple-clang@16.0.0 build_system=autotools arch=darwin-sequoia-m1
[+]      ^zlib-ng@2.2.1%apple-clang@16.0.0+compat+new_strategies+opt+pic+shared build_system=autotools arch=darwin-sequoia-m1
[+]          ^gnuconfig@2022-09-17%apple-clang@16.0.0 build_system=generic arch=darwin-sequoia-m1
[+]  py-six@1.16.0%apple-clang@16.0.0 build_system=python_pip arch=darwin-sequoia-m1
[+]      ^py-pip@23.1.2%apple-clang@16.0.0 build_system=generic arch=darwin-sequoia-m1
[+]      ^py-setuptools@69.2.0%apple-clang@16.0.0 build_system=generic arch=darwin-sequoia-m1
[-]      ^py-wheel@0.41.2%apple-clang@16.0.0 build_system=generic arch=darwin-sequoia-m1
...
```

With no input spec displayed -- just the concretization output shown as one consolidated
tree with multiple roots.

- [x] remove "Input Spec" section and "Concretized" header from `spack spec` output
- [x] print concretized specs as one BFS tree instead of multiple

---------

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2024-11-14 08:16:16 +01:00
psakievich
020e30f3e6 Update tutorial version (#47593) 2024-11-14 08:15:37 +01:00
Harmen Stoppels
181c404af5 missing and redundant imports (#47577) 2024-11-13 13:03:29 +01:00
53 changed files with 993 additions and 476 deletions

View File

@@ -52,7 +52,13 @@ jobs:
           # Needed for unit tests
           sudo apt-get -y install \
             coreutils cvs gfortran graphviz gnupg2 mercurial ninja-build \
-            cmake bison libbison-dev kcov
+            cmake bison libbison-dev subversion
+      # On ubuntu 24.04, kcov was removed. It may come back in some future Ubuntu
+      - name: Set up Homebrew
+        id: set-up-homebrew
+        uses: Homebrew/actions/setup-homebrew@40e9946c182a64b3db1bf51be0dcb915f7802aa9
+      - name: Install kcov with brew
+        run: "brew install kcov"
       - name: Install Python packages
         run: |
           pip install --upgrade pip setuptools pytest pytest-xdist pytest-cov
@@ -99,7 +105,13 @@ jobs:
         run: |
           sudo apt-get -y update
           # Needed for shell tests
-          sudo apt-get install -y coreutils kcov csh zsh tcsh fish dash bash
+          sudo apt-get install -y coreutils csh zsh tcsh fish dash bash subversion
+      # On ubuntu 24.04, kcov was removed. It may come back in some future Ubuntu
+      - name: Set up Homebrew
+        id: set-up-homebrew
+        uses: Homebrew/actions/setup-homebrew@40e9946c182a64b3db1bf51be0dcb915f7802aa9
+      - name: Install kcov with brew
+        run: "brew install kcov"
       - name: Install Python packages
         run: |
           pip install --upgrade pip setuptools pytest coverage[toml] pytest-xdist
@@ -134,7 +146,7 @@ jobs:
       - name: Setup repo and non-root user
         run: |
           git --version
-          git config --global --add safe.directory /__w/spack/spack
+          git config --global --add safe.directory '*'
           git fetch --unshallow
           . .github/workflows/bin/setup_git.sh
           useradd spack-test

View File

@@ -74,7 +74,7 @@ jobs:
       - name: Setup repo and non-root user
         run: |
           git --version
-          git config --global --add safe.directory /__w/spack/spack
+          git config --global --add safe.directory '*'
           git fetch --unshallow
           . .github/workflows/bin/setup_git.sh
           useradd spack-test

View File

@@ -1,3 +1,395 @@
# v0.23.1 (2025-02-19)
## Bugfixes
- Fix a correctness issue in `ArchSpec.intersects` (#48741)
- Make `extra_attributes` order independent in Spec hashing (#48615, #48854)
- Fix issue where system proxy settings were not respected in OCI build caches (#48783)
- Fix an issue where the `--test` concretizer flag was not forwarded correctly (#48417)
- Fix an issue where `codesign` and `install_name_tool` would not preserve hardlinks on
Darwin (#47808)
- Fix an issue on Darwin where codesign would run on unmodified binaries (#48568)
- Patch configure scripts generated with libtool < 2.5.4, to avoid redundant flags when
creating shared libraries on Darwin (#48671)
- Fix issue related to mirror URL paths on Windows (#47898)
- Ensure proper UTF-8 encoding/decoding in logging (#48005)
- Fix issues related to `filter_file` (#48038, #48108)
- Fix issue related to creating bootstrap source mirrors (#48235)
- Fix issue where command line config arguments were not always top level (#48255)
- Fix an incorrect typehint of `concretized()` (#48504)
- Improve mention of next Spack version in warning (#47887)
- Tests: fix forward compatibility with Python 3.13 (#48209)
- Docs: encourage use of `--oci-username-variable` and `--oci-password-variable` (#48189)
- Docs: ensure Getting Started has bootstrap list output in correct place (#48281)
- CI: allow GitHub actions to run on forks of Spack with different project name (#48041)
- CI: make unit tests work on Ubuntu 24.04 (#48151)
- CI: re-enable cray pipelines (#47697)
## Package updates
- `qt-base`: fix rpath for dependents (#47424)
- `gdk-pixbuf`: fix outdated URL (#47825)
# v0.23.0 (2024-11-13)
`v0.23.0` is a major feature release.
We are planning to make this the last major release before Spack `v1.0`
in June 2025. Alongside `v0.23`, we will be making pre-releases (alpha,
beta, etc.) of `v1.0`, and we encourage users to try them and send us
feedback, either on GitHub or on Slack. You can track the road to
`v1.0` here:
* https://github.com/spack/spack/releases
* https://github.com/spack/spack/discussions/30634
## Features in this Release
1. **Language virtuals**
Your packages can now explicitly depend on the languages they require.
Historically, Spack has considered C, C++, and Fortran compiler
dependencies to be implicit. In `v0.23`, you should ensure that
new packages add relevant C, C++, and Fortran dependencies like this:
```python
depends_on("c", type="build")
depends_on("cxx", type="build")
depends_on("fortran", type="build")
```
We encourage you to add these annotations to your packages now, to prepare
for Spack `v1.0.0`. In `v1.0.0`, these annotations will be necessary for
your package to use C, C++, and Fortran compilers. Note that you should
*not* add language dependencies to packages that don't need them, e.g.,
pure Python packages.
We have already auto-generated these dependencies for packages in the
`builtin` repository (see #45217), based on the types of source files
present in each package's source code. We *may* have added too many or too
few language dependencies, so please submit pull requests to correct
packages if you find that the language dependencies are incorrect.
Note that we have also backported support for these dependencies to
`v0.21.3` and `v0.22.2`, to make all of them forward-compatible with
`v0.23`. This should allow you to move easily between older and newer Spack
releases without breaking your packages.
2. **Spec splicing**
We are working to make binary installation more seamless in Spack. `v0.23`
introduces "splicing", which allows users to deploy binaries using local,
optimized versions of a binary interface, even if they were not built with
that interface. For example, this would allow you to build binaries in the
cloud using `mpich` and install them on a system using a local, optimized
version of `mvapich2` *without rebuilding*. Spack preserves full provenance
for the installed packages and knows that they were built one way but
deployed another.
Our intent is to leverage this across many key HPC binary packages,
e.g. MPI, CUDA, ROCm, and libfabric.
Fundamentally, splicing allows Spack to redeploy an existing spec with
different dependencies than how it was built. There are two interfaces to
splicing.
a. Explicit Splicing
#39136 introduced the explicit splicing interface. In the
concretizer config, you can specify a target spec and a replacement
by hash.
```yaml
concretizer:
  splice:
    explicit:
    - target: mpi
      replacement: mpich/abcdef
```
Here, every installation that would normally use the target spec will
instead use its replacement. Above, any spec using *any* `mpi` will be
spliced to depend on the specific `mpich` installation requested. This
*can* go wrong if you try to replace something built with, e.g.,
`openmpi` with `mpich`, and it is on the user to ensure ABI
compatibility between target and replacement specs. This currently
requires some expertise to use, but it will allow users to reuse the
binaries they create across more machines and environments.
b. Automatic Splicing (experimental)
#46729 introduced automatic splicing. In the concretizer config, enable
automatic splicing:
```yaml
concretizer:
  splice:
    automatic: true
```
or run:
```console
spack config add concretizer:splice:automatic:true
```
The concretizer will select splices for ABI compatibility to maximize
package reuse. Packages can denote ABI compatibility using the
`can_splice` directive. No packages in Spack yet use this directive, so
if you want to use this feature you will need to add `can_splice`
annotations to your packages. We are working on ways to add more ABI
compatibility information to the Spack package repository, and this
directive may change in the future.
See the documentation for more details:
* https://spack.readthedocs.io/en/latest/build_settings.html#splicing
* https://spack.readthedocs.io/en/latest/packaging_guide.html#specifying-abi-compatibility
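A hypothetical `can_splice` annotation (the directive exists, but the package, versions, and ABI claim here are made up for illustration) would look like:
```python
# In a package recipe: declare that this package at version 2.3.7 is
# ABI-compatible with, and may be spliced in for, mpich@4.1 (hypothetical).
can_splice("mpich@4.1", when="@2.3.7")
```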
3. **Broader variant propagation**
Since #42931, you can specify propagated variants like `hdf5
build_type==RelWithDebInfo` or `trilinos ++openmp` to propagate a variant
to all dependencies for which it is relevant. This is valid *even* if the
variant does not exist on the package or its dependencies.
See https://spack.readthedocs.io/en/latest/basic_usage.html#variants.
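For example (an illustrative pair of commands using the syntax above):
```console
$ spack install trilinos ++openmp                 # force +openmp onto relevant dependencies
$ spack install hdf5 build_type==RelWithDebInfo   # propagate the build type downward
```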
4. **Query specs by namespace**
#45416 allows a package's namespace (indicating the repository it came from)
to be treated like a variant. You can request packages from particular repos
like this:
```console
spack find zlib namespace=builtin
spack find zlib namespace=myrepo
```
Previously, the spec syntax only allowed namespaces to be prefixes of spec
names, e.g. `builtin.zlib`. The previous syntax still works.
5. **`spack spec` respects environment settings and `unify:true`**
`spack spec` did not previously respect environment lockfiles or
unification settings, which made it difficult to see exactly how a spec
would concretize within an environment. Now it does, so the output you get
with `spack spec` will be *the same* as what your environment will
concretize to when you run `spack concretize`. Similarly, if you provide
multiple specs on the command line with `spack spec`, it will concretize
them together if `unify:true` is set.
See #47556 and #44843.
6. **Less noisy `spack spec` output**
`spack spec` previously showed output like this:
```console
> spack spec /v5fn6xo
Input spec
--------------------------------
 -   /v5fn6xo

Concretized
--------------------------------
[+]  openssl@3.3.1%apple-clang@16.0.0~docs+shared arch=darwin-sequoia-m1
...
```
But the input spec is redundant, and we know we run `spack spec` to concretize
the input spec. `spack spec` now *only* shows the concretized spec. See #47574.
7. **Better output for `spack find -c`**
In an environment, `spack find -c` lets you search the concretized, but not
yet installed, specs, just as you would the installed ones. As with `spack
spec`, this should make it easier for you to see what *will* be built
before building and installing it. See #44713.
8. **`spack -C <env>`: use an environment's configuration without activation**
Spack environments allow you to associate:
1. a set of (possibly concretized) specs, and
2. configuration
When you activate an environment, you're using both of these. Previously, we
supported:
* `spack -e <env>` to run spack in the context of a specific environment, and
* `spack -C <directory>` to run spack using a directory with configuration files.
You can now also pass an environment to `spack -C` to use *only* the environment's
configuration, but not the specs or lockfile. See #45046.
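For instance (illustrative paths):
```console
$ spack -e /path/to/myenv install           # use the environment's specs and config
$ spack -C /path/to/myenv spec hdf5         # use only the environment's configuration
```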
## New commands, options, and directives
* The new `spack env track` command (#41897) takes a non-managed Spack
environment and adds a symlink to Spack's `$environments_root` directory, so
that it will be included for reference counting for commands like `spack
uninstall` and `spack gc`. If you use free-standing directory environments,
this is useful for preventing Spack from removing things required by your
environments. You can undo this tracking with the `spack env untrack`
command.
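For example (hypothetical paths and names):
```console
$ spack env track /scratch/my-project-env   # register an existing directory environment
$ spack env untrack my-project-env          # undo the tracking
```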
* Add `-t` short option for `spack --backtrace` (#47227)
`spack -d / --debug` enables backtraces on error, but it can be very
verbose, and sometimes you just want the backtrace. `spack -t / --backtrace`
provides that option.
* `gc`: restrict to specific specs (#46790)
If you only want to garbage-collect specific packages, you can now provide
them on the command line. This gives users finer-grained control over what
is uninstalled.
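For example (illustrative packages):
```console
$ spack gc zlib libpng   # garbage-collect only unneeded specs of the named packages
```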
* OCI buildcaches now support `--only=package`. You can now push *just* a
package and not its dependencies to an OCI registry. This allows dependents
of non-redistributable specs to be stored in OCI registries without an
error. See #45775.
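For example (illustrative registry and spec):
```console
$ spack buildcache push --only=package my_registry some-unredistributable-pkg
```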
## Notable refactors
* Variants are now fully conditional
The `variants` dictionary on packages was previously keyed by variant name,
and allowed only one definition of any given variant. Spack is now smart
enough to understand that variants may have different values and defaults
for different versions. For example, `warpx` prior to `23.06` only supported
builds for one dimensionality, while newer `warpx` versions can be built
with support for many different dimensions:
```python
variant(
    "dims",
    default="3",
    values=("1", "2", "3", "rz"),
    multi=False,
    description="Number of spatial dimensions",
    when="@:23.05",
)
variant(
    "dims",
    default="1,2,rz,3",
    values=("1", "2", "3", "rz"),
    multi=True,
    description="Number of spatial dimensions",
    when="@23.06:",
)
```
Previously, the default for the old version of `warpx` was not respected and
had to be specified manually. Now, Spack will select the right variant
definition for each version at concretization time. This allows variants to
evolve more smoothly over time. See #44425 for details.
## Highlighted bugfixes
1. Externals no longer override the preferred provider (#45025).
External definitions could interfere with package preferences. Now, if
`openmpi` is the preferred `mpi`, and an external `mpich` is defined, a new
`openmpi` *will* be built if building it is possible. Previously we would
prefer `mpich` despite the preference.
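The scenario, sketched as configuration (an illustrative `packages.yaml`):
```yaml
packages:
  all:
    providers:
      mpi: [openmpi]      # openmpi is the preferred mpi provider
  mpich:
    externals:
    - spec: mpich@4.1.2   # hypothetical external
      prefix: /usr
```
With the fix, the preferred `openmpi` is still built from source when possible, instead of being silently replaced by the external `mpich`.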
2. Composable `cflags` (#41049).
This release fixes a longstanding bug where concretization would fail if
different `cflags` were specified in `packages.yaml`,
`compilers.yaml`, or on the CLI. Flags and their ordering are now tracked
in the concretizer and flags from multiple sources will be merged.
3. Fix concretizer unification for included environments (#45139).
## Deprecations, removals, and syntax changes
1. The old concretizer has been removed from Spack, along with the
`config:concretizer` config option. Spack will emit a warning if the option
is present in user configuration, since it now has no effect. Spack now
uses a simpler bootstrapping mechanism, where a JSON prototype is tweaked
slightly to get an initial concrete spec to download. See #45215.
2. Best-effort expansion of spec matrices has been removed. This feature did
not work with the "new" ASP-based concretizer, and did not work with
`unify: True` or `unify: when_possible`. Use the
[exclude key](https://spack.readthedocs.io/en/latest/environments.html#spec-matrices)
for the environment to exclude invalid components, or use multiple spec
matrices to combine the list of specs for which the constraint is valid and
the list of specs for which it is not. See #40792.
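For reference, the `exclude` key looks like this in an environment manifest (illustrative specs):
```yaml
spack:
  specs:
  - matrix:
    - [hdf5, zlib]
    - ["%gcc@12", "%clang@15"]
    exclude:
    - hdf5%clang@15   # drop the one invalid combination explicitly
```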
3. The old Cray `platform` (based on Cray PE modules) has been removed, and
`platform=cray` is no longer supported. Since `v0.19`, Spack has handled
Cray machines like Linux clusters with extra packages, and we have
encouraged using this option to support Cray. The new approach allows us to
correctly handle Cray machines with non-SLES operating systems, and it is
much more reliable than making assumptions about Cray modules. See the
`v0.19` release notes and #43796 for more details.
4. The `config:install_missing_compilers` config option has been deprecated,
and it is a no-op when set in `v0.23`. Our new compiler dependency model
will replace it with a much more reliable and robust mechanism in `v1.0`.
See #46237.
5. Config options that were deprecated in `v0.21` have been removed in `v0.23`. You
can now only specify preferences for `compilers`, `targets`, and
`providers` globally via the `packages:all:` section. Similarly, you can
only specify `versions:` locally for a specific package. See #44061 and
#31261 for details.
6. Spack's old test interface has been removed (#45752), having been
deprecated in `v0.22.0` (#34236). All `builtin` packages have been updated
to use the new interface. See the [stand-alone test documentation](
https://spack.readthedocs.io/en/latest/packaging_guide.html#stand-alone-tests).
7. The `spack versions --safe-only` option, deprecated since `v0.21.0`, has
been removed. See #45765.
8. The `--dependencies` and `--optimize` arguments to `spack ci` have been
deprecated. See #45005.
## Binary caches
1. Public binary caches now include an ML stack for Linux/aarch64 (#39666).
We now build an ML stack for Linux/aarch64 for all pull requests and on
develop. The ML stack includes both CPU-only and CUDA builds for Horovod,
Hugging Face, JAX, Keras, PyTorch, scikit-learn, TensorBoard, TensorFlow,
and related packages. The CPU-only stack also includes XGBoost.
See https://cache.spack.io/tag/develop/?stack=ml-linux-aarch64-cuda.
2. There is also now a stack of developer tools for macOS (#46910), which is
analogous to the Linux devtools stack. You can use this to avoid building
many common build dependencies. See
https://cache.spack.io/tag/develop/?stack=developer-tools-darwin.
## Architecture support
* archspec has been updated to `v0.2.5`, with support for `zen5`
* Spack's CUDA package now supports the Grace Hopper `9.0a` compute capability (#45540)
## Windows
* Windows bootstrapping: `file` and `gpg` (#41810)
* `scripts` directory added to PATH on Windows for python extensions (#45427)
* Fix `spack load --list` and `spack unload` on Windows (#35720)
## Other notable changes
* Bugfix: `spack find -x` in environments (#46798)
* Spec splices are now robust to duplicate nodes with the same name in a spec (#46382)
* Cache per-compiler libc calculations for performance (#47213)
* Fixed a bug in external detection for openmpi (#47541)
* Mirror configuration allows username/password as environment variables (#46549)
* Default library search caps maximum depth (#41945)
* Unify interface for `spack spec` and `spack solve` commands (#47182)
* Spack no longer RPATHs directories in the default library search path (#44686)
* Improved performance of Spack database (#46554)
* Enable package reuse for packages with versions from git refs (#43859)
* Improved handling for `uuid` virtual on macOS (#43002)
* Improved tracking of task queueing/requeueing in the installer (#46293)
## Spack community stats
* Over 2,000 pull requests updated package recipes
* 8,307 total packages, 329 new since `v0.22.0`
* 140 new Python packages
* 14 new R packages
* 373 people contributed to this release
* 357 committers to packages
* 60 committers to core
# v0.22.2 (2024-09-21)
## Bugfixes

View File

@@ -265,25 +265,30 @@ infrastructure, or to cache Spack built binaries in Github Actions and
 GitLab CI.

 To get started, configure an OCI mirror using ``oci://`` as the scheme,
-and optionally specify a username and password (or personal access token):
+and optionally specify variables that hold the username and password (or
+personal access token) for the registry:

 .. code-block:: console

-    $ spack mirror add --oci-username username --oci-password password my_registry oci://example.com/my_image
+    $ spack mirror add --oci-username-variable REGISTRY_USER \
+          --oci-password-variable REGISTRY_TOKEN \
+          my_registry oci://example.com/my_image

 Spack follows the naming conventions of Docker, with Dockerhub as the default
 registry. To use Dockerhub, you can omit the registry domain:

 .. code-block:: console

-    $ spack mirror add --oci-username username --oci-password password my_registry oci://username/my_image
+    $ spack mirror add ... my_registry oci://username/my_image

 From here, you can use the mirror as any other build cache:

 .. code-block:: console

+    $ export REGISTRY_USER=...
+    $ export REGISTRY_TOKEN=...
     $ spack buildcache push my_registry <specs...>  # push to the registry
-    $ spack install <specs...> # install from the registry
+    $ spack install <specs...> # or install from the registry

 A unique feature of buildcaches on top of OCI registries is that it's incredibly
 easy to generate get a runnable container image with the binaries installed. This

View File

@@ -38,9 +38,11 @@ just have to configure and OCI registry and run ``spack buildcache push``.
    spack -e . install

    # Configure the registry
-   spack -e . mirror add --oci-username ... --oci-password ... container-registry oci://example.com/name/image
+   spack -e . mirror add --oci-username-variable REGISTRY_USER \
+       --oci-password-variable REGISTRY_TOKEN \
+       container-registry oci://example.com/name/image

-   # Push the image
+   # Push the image (do set REGISTRY_USER and REGISTRY_TOKEN)
    spack -e . buildcache push --update-index --base-image ubuntu:22.04 --tag my_env container-registry

 The resulting container image can then be run as follows:

View File

@@ -148,20 +148,22 @@ The first time you concretize a spec, Spack will bootstrap automatically:
    --------------------------------
    zlib@1.2.13%gcc@9.4.0+optimize+pic+shared build_system=makefile arch=linux-ubuntu20.04-icelake

+The default bootstrap behavior is to use pre-built binaries. You can verify the
+active bootstrap repositories with:
+
+.. command-output:: spack bootstrap list
+
 If for security concerns you cannot bootstrap ``clingo`` from pre-built
 binaries, you have to disable fetching the binaries we generated with Github Actions.

 .. code-block:: console

-   $ spack bootstrap disable github-actions-v0.4
-   ==> "github-actions-v0.4" is now disabled and will not be used for bootstrapping
-   $ spack bootstrap disable github-actions-v0.3
-   ==> "github-actions-v0.3" is now disabled and will not be used for bootstrapping
+   $ spack bootstrap disable github-actions-v0.6
+   ==> "github-actions-v0.6" is now disabled and will not be used for bootstrapping
+   $ spack bootstrap disable github-actions-v0.5
+   ==> "github-actions-v0.5" is now disabled and will not be used for bootstrapping

-You can verify that the new settings are effective with:
-
-.. command-output:: spack bootstrap list
+You can verify that the new settings are effective with ``spack bootstrap list``.

 .. note::

View File

@@ -24,6 +24,7 @@
     Callable,
     Deque,
     Dict,
+    Generator,
     Iterable,
     List,
     Match,
@@ -300,35 +301,32 @@ def filter_file(
     ignore_absent: bool = False,
     start_at: Optional[str] = None,
     stop_at: Optional[str] = None,
+    encoding: Optional[str] = "utf-8",
 ) -> None:
     r"""Like sed, but uses python regular expressions.

-    Filters every line of each file through regex and replaces the file
-    with a filtered version. Preserves mode of filtered files.
+    Filters every line of each file through regex and replaces the file with a filtered version.
+    Preserves mode of filtered files.

-    As with re.sub, ``repl`` can be either a string or a callable.
-    If it is a callable, it is passed the match object and should
-    return a suitable replacement string. If it is a string, it
-    can contain ``\1``, ``\2``, etc. to represent back-substitution
-    as sed would allow.
+    As with re.sub, ``repl`` can be either a string or a callable. If it is a callable, it is
+    passed the match object and should return a suitable replacement string. If it is a string, it
+    can contain ``\1``, ``\2``, etc. to represent back-substitution as sed would allow.

     Args:
-        regex (str): The regular expression to search for
-        repl (str): The string to replace matches with
-        *filenames: One or more files to search and replace
-        string (bool): Treat regex as a plain string. Default it False
-        backup (bool): Make backup file(s) suffixed with ``~``. Default is False
-        ignore_absent (bool): Ignore any files that don't exist.
-            Default is False
-        start_at (str): Marker used to start applying the replacements. If a
-            text line matches this marker filtering is started at the next line.
-            All contents before the marker and the marker itself are copied
-            verbatim. Default is to start filtering from the first line of the
-            file.
-        stop_at (str): Marker used to stop scanning the file further. If a text
-            line matches this marker filtering is stopped and the rest of the
-            file is copied verbatim. Default is to filter until the end of the
-            file.
+        regex: The regular expression to search for
+        repl: The string to replace matches with
+        *filenames: One or more files to search and replace
+        string: Treat regex as a plain string. Default it False
+        backup: Make backup file(s) suffixed with ``~``. Default is False
+        ignore_absent: Ignore any files that don't exist. Default is False
+        start_at: Marker used to start applying the replacements. If a text line matches this
+            marker filtering is started at the next line. All contents before the marker and the
+            marker itself are copied verbatim. Default is to start filtering from the first line of
+            the file.
+        stop_at: Marker used to stop scanning the file further. If a text line matches this marker
+            filtering is stopped and the rest of the file is copied verbatim. Default is to filter
+            until the end of the file.
+        encoding: The encoding to use when reading and writing the files. Default is None, which
+            uses the system's default encoding.
     """
     # Allow strings to use \1, \2, etc. for replacement, like sed
     if not callable(repl):
@@ -344,72 +342,56 @@ def groupid_to_group(x):
     if string:
         regex = re.escape(regex)

-    for filename in path_to_os_path(*filenames):
-        msg = 'FILTER FILE: {0} [replacing "{1}"]'
-        tty.debug(msg.format(filename, regex))
-
-        backup_filename = filename + "~"
-        tmp_filename = filename + ".spack~"
-
-        if ignore_absent and not os.path.exists(filename):
-            msg = 'FILTER FILE: file "{0}" not found. Skipping to next file.'
-            tty.debug(msg.format(filename))
+    regex_compiled = re.compile(regex)
+    for path in path_to_os_path(*filenames):
+        if ignore_absent and not os.path.exists(path):
+            tty.debug(f'FILTER FILE: file "{path}" not found. Skipping to next file.')
             continue
+        else:
+            tty.debug(f'FILTER FILE: {path} [replacing "{regex}"]')

-        # Create backup file. Don't overwrite an existing backup
-        # file in case this file is being filtered multiple times.
-        if not os.path.exists(backup_filename):
-            shutil.copy(filename, backup_filename)
-
-        # Create a temporary file to read from. We cannot use backup_filename
-        # in case filter_file is invoked multiple times on the same file.
-        shutil.copy(filename, tmp_filename)
+        fd, temp_path = tempfile.mkstemp(
+            prefix=f"{os.path.basename(path)}.", dir=os.path.dirname(path)
+        )
+        os.close(fd)
+        shutil.copy(path, temp_path)
+        errored = False

         try:
-            # Open as a text file and filter until the end of the file is
-            # reached, or we found a marker in the line if it was specified
-            #
-            # To avoid translating line endings (\n to \r\n and vice-versa)
-            # we force os.open to ignore translations and use the line endings
-            # the file comes with
-            with open(tmp_filename, mode="r", errors="surrogateescape", newline="") as input_file:
-                with open(filename, mode="w", errors="surrogateescape", newline="") as output_file:
-                    do_filtering = start_at is None
-                    # Using iter and readline is a workaround needed not to
-                    # disable input_file.tell(), which will happen if we call
-                    # input_file.next() implicitly via the for loop
-                    for line in iter(input_file.readline, ""):
-                        if stop_at is not None:
-                            current_position = input_file.tell()
-                            if stop_at == line.strip():
-                                output_file.write(line)
-                                break
-                        if do_filtering:
-                            filtered_line = re.sub(regex, repl, line)
-                            output_file.write(filtered_line)
-                        else:
-                            do_filtering = start_at == line.strip()
-                            output_file.write(line)
-                    else:
-                        current_position = None
-
-            # If we stopped filtering at some point, reopen the file in
-            # binary mode and copy verbatim the remaining part
-            if current_position and stop_at:
-                with open(tmp_filename, mode="rb") as input_binary_buffer:
-                    input_binary_buffer.seek(current_position)
-                    with open(filename, mode="ab") as output_binary_buffer:
-                        output_binary_buffer.writelines(input_binary_buffer.readlines())
-
+            # Open as a text file and filter until the end of the file is reached, or we found a
+            # marker in the line if it was specified. To avoid translating line endings (\n to
+            # \r\n and vice-versa) use newline="".
+            with open(
+                temp_path, mode="r", errors="surrogateescape", newline="", encoding=encoding
+            ) as input_file, open(
+                path, mode="w", errors="surrogateescape", newline="", encoding=encoding
+            ) as output_file:
+                if start_at is None and stop_at is None:  # common case, avoids branching in loop
+                    for line in input_file:
+                        output_file.write(re.sub(regex_compiled, repl, line))
+                else:
+                    # state is -1 before start_at; 0 between; 1 after stop_at
+                    state = 0 if start_at is None else -1
+                    for line in input_file:
+                        if state == 0:
+                            if stop_at == line.strip():
+                                state = 1
+                            else:
+                                line = re.sub(regex_compiled, repl, line)
+                        elif state == -1 and start_at == line.strip():
+                            state = 0
+                        output_file.write(line)
         except BaseException:
-            # clean up the original file on failure.
-            shutil.move(backup_filename, filename)
+            # restore the original file
+            os.rename(temp_path, path)
+            errored = True
             raise
-
         finally:
-            os.remove(tmp_filename)
-
-            if not backup and os.path.exists(backup_filename):
-                os.remove(backup_filename)
+            if not errored and not backup:
+                os.unlink(temp_path)


 class FileFilter:
@@ -2838,6 +2820,25 @@ def temporary_dir(
         remove_directory_contents(tmp_dir)


+@contextmanager
+def edit_in_place_through_temporary_file(file_path: str) -> Generator[str, None, None]:
+    """Context manager for modifying ``file_path`` in place, preserving its inode and hardlinks,
+    for functions or external tools that do not support in-place editing. Notice that this function
+    is unsafe in that it works with paths instead of a file descriptors, but this is by design,
+    since we assume the call site will create a new inode at the same path."""
+    tmp_fd, tmp_path = tempfile.mkstemp(
+        dir=os.path.dirname(file_path), prefix=f"{os.path.basename(file_path)}."
+    )
+    # windows cannot replace a file with open fds, so close since the call site needs to replace.
+    os.close(tmp_fd)
+    try:
+        shutil.copyfile(file_path, tmp_path, follow_symlinks=True)
+        yield tmp_path
+        shutil.copyfile(tmp_path, file_path, follow_symlinks=True)
+    finally:
+        os.unlink(tmp_path)
+
+
 def filesummary(path, print_bytes=16) -> Tuple[int, bytes]:
     """Create a small summary of the given file. Does not error
     when file does not exist.

View File

@@ -879,10 +879,13 @@ def _writer_daemon(
     write_fd.close()

     # 1. Use line buffering (3rd param = 1) since Python 3 has a bug
-    #    that prevents unbuffered text I/O.
-    # 2. Python 3.x before 3.7 does not open with UTF-8 encoding by default
+    #    that prevents unbuffered text I/O. [needs citation]
+    # 2. Enforce a UTF-8 interpretation of build process output with errors replaced by '?'.
+    #    The downside is that the log file will not contain the exact output of the build process.
     # 3. closefd=False because Connection has "ownership"
-    read_file = os.fdopen(read_fd.fileno(), "r", 1, encoding="utf-8", closefd=False)
+    read_file = os.fdopen(
+        read_fd.fileno(), "r", 1, encoding="utf-8", errors="replace", closefd=False
+    )

     if stdin_fd:
         stdin_file = os.fdopen(stdin_fd.fileno(), closefd=False)
@@ -928,11 +931,7 @@ def _writer_daemon(
         try:
             while line_count < 100:
                 # Handle output from the calling process.
-                try:
-                    line = _retry(read_file.readline)()
-                except UnicodeDecodeError:
-                    # installs like --test=root gpgme produce non-UTF8 logs
-                    line = "<line lost: output was not encoded as UTF-8>\n"
+                line = _retry(read_file.readline)()

                 if not line:
                     return
@@ -946,6 +945,13 @@ def _writer_daemon(
                     output_line = clean_line
                     if filter_fn:
                         output_line = filter_fn(clean_line)
+                    enc = sys.stdout.encoding
+                    if enc != "utf-8":
+                        # On Python 3.6 and 3.7-3.14 with non-{utf-8,C} locale stdout
+                        # may not be able to handle utf-8 output. We do an inefficient
+                        # dance of re-encoding with errors replaced, so stdout.write
+                        # does not raise.
+                        output_line = output_line.encode(enc, "replace").decode(enc)
                     sys.stdout.write(output_line)

                 # Stripped output to log file.

View File

@@ -11,7 +11,7 @@
 import spack.util.git

 #: PEP440 canonical <major>.<minor>.<micro>.<devN> string
-__version__ = "0.23.0.dev0"
+__version__ = "0.23.1"

 spack_version = __version__

View File

@@ -1366,14 +1366,8 @@ def _test_detection_by_executable(pkgs, debug_log, error_cls):
     def _compare_extra_attribute(_expected, _detected, *, _spec):
         result = []
-        # Check items are of the same type
-        if not isinstance(_detected, type(_expected)):
-            _summary = f'{pkg_name}: error when trying to detect "{_expected}"'
-            _details = [f"{_detected} was detected instead"]
-            return [error_cls(summary=_summary, details=_details)]
-
         # If they are string expected is a regex
-        if isinstance(_expected, str):
+        if isinstance(_expected, str) and isinstance(_detected, str):
             try:
                 _regex = re.compile(_expected)
             except re.error:
@@ -1389,7 +1383,7 @@ def _compare_extra_attribute(_expected, _detected, *, _spec):
                 _details = [f"{_detected} does not match the regex"]
                 return [error_cls(summary=_summary, details=_details)]

-        if isinstance(_expected, dict):
+        elif isinstance(_expected, dict) and isinstance(_detected, dict):
             _not_detected = set(_expected.keys()) - set(_detected.keys())
             if _not_detected:
                 _summary = f"{pkg_name}: cannot detect some attributes for spec {_spec}"
@@ -1404,6 +1398,10 @@ def _compare_extra_attribute(_expected, _detected, *, _spec):
                 result.extend(
                     _compare_extra_attribute(_expected[_key], _detected[_key], _spec=_spec)
                 )
+        else:
+            _summary = f'{pkg_name}: error when trying to detect "{_expected}"'
+            _details = [f"{_detected} was detected instead"]
+            return [error_cls(summary=_summary, details=_details)]

         return result

View File

@@ -2332,7 +2332,9 @@ def is_backup_file(file):
     if not codesign:
         return
     for binary in changed_files:
-        codesign("-fs-", binary)
+        # preserve the original inode by running codesign on a copy
+        with fsys.edit_in_place_through_temporary_file(binary) as tmp_binary:
+            codesign("-fs-", tmp_binary)

     # If we are installing back to the same location
     # relocate the sbang location if the spack directory changed

View File

@@ -357,6 +357,13 @@ def _do_patch_libtool_configure(self):
         )
         # Support Libtool 2.4.2 and older:
         x.filter(regex=r'^(\s*test \$p = "-R")(; then\s*)$', repl=r'\1 || test x-l = x"$p"\2')
+        # Configure scripts generated with libtool < 2.5.4 have a faulty test for the
+        # -single_module linker flag. A deprecation warning makes it think the default is
+        # -multi_module, triggering it to use problematic linker flags (such as ld -r). The
+        # linker default is `-single_module` from (ancient) macOS 10.4, so override by setting
+        # `lt_cv_apple_cc_single_mod=yes`. See the fix in libtool commit
+        # 82f7f52123e4e7e50721049f7fa6f9b870e09c9d.
+        x.filter("lt_cv_apple_cc_single_mod=no", "lt_cv_apple_cc_single_mod=yes", string=True)

     @spack.builder.run_after("configure")
     def _do_patch_libtool(self):

View File

@@ -25,6 +25,7 @@
 import spack.extensions
 import spack.parser
 import spack.paths
+import spack.repo
 import spack.spec
 import spack.store
 import spack.traverse as traverse
@@ -166,7 +167,9 @@ def quote_kvp(string: str) -> str:

 def parse_specs(
-    args: Union[str, List[str]], concretize: bool = False, tests: bool = False
+    args: Union[str, List[str]],
+    concretize: bool = False,
+    tests: spack.concretize.TestsType = False,
 ) -> List[spack.spec.Spec]:
     """Convenience function for parsing arguments from specs. Handles common
     exceptions and dies if there are errors.
@@ -178,11 +181,13 @@ def parse_specs(
     if not concretize:
         return specs

-    to_concretize = [(s, None) for s in specs]
+    to_concretize: List[spack.concretize.SpecPairInput] = [(s, None) for s in specs]
     return _concretize_spec_pairs(to_concretize, tests=tests)


-def _concretize_spec_pairs(to_concretize, tests=False):
+def _concretize_spec_pairs(
+    to_concretize: List[spack.concretize.SpecPairInput], tests: spack.concretize.TestsType = False
+) -> List[spack.spec.Spec]:
     """Helper method that concretizes abstract specs from a list of abstract,concrete pairs.

     Any spec with a concrete spec associated with it will concretize to that spec. Any spec
@@ -193,7 +198,7 @@ def _concretize_spec_pairs(to_concretize, tests=False):
     # Special case for concretizing a single spec
     if len(to_concretize) == 1:
         abstract, concrete = to_concretize[0]
-        return [concrete or abstract.concretized()]
+        return [concrete or abstract.concretized(tests=tests)]

     # Special case if every spec is either concrete or has an abstract hash
     if all(

View File

@@ -29,7 +29,7 @@

 # Tarball to be downloaded if binary packages are requested in a local mirror
-BINARY_TARBALL = "https://github.com/spack/spack-bootstrap-mirrors/releases/download/v0.4/bootstrap-buildcache.tar.gz"
+BINARY_TARBALL = "https://github.com/spack/spack-bootstrap-mirrors/releases/download/v0.6/bootstrap-buildcache.tar.gz"

 #: Subdirectory where to create the mirror
 LOCAL_MIRROR_DIR = "bootstrap_cache"
@@ -51,9 +51,9 @@
     },
 }

-CLINGO_JSON = "$spack/share/spack/bootstrap/github-actions-v0.4/clingo.json"
-GNUPG_JSON = "$spack/share/spack/bootstrap/github-actions-v0.4/gnupg.json"
-PATCHELF_JSON = "$spack/share/spack/bootstrap/github-actions-v0.4/patchelf.json"
+CLINGO_JSON = "$spack/share/spack/bootstrap/github-actions-v0.6/clingo.json"
+GNUPG_JSON = "$spack/share/spack/bootstrap/github-actions-v0.6/gnupg.json"
+PATCHELF_JSON = "$spack/share/spack/bootstrap/github-actions-v0.6/patchelf.json"

 # Metadata for a generated source mirror
 SOURCE_METADATA = {

View File

@@ -528,6 +528,7 @@ def __call__(self, parser, namespace, values, option_string):
         # the const from the constructor or a value from the CLI.
         # Note that this is only called if the argument is actually
         # specified on the command line.
+        spack.config.CONFIG.ensure_scope_ordering()
         spack.config.set(self.config_path, self.const, scope="command_line")

View File

@@ -3,7 +3,6 @@
 #
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)

-import datetime
 import os
 import re
 from collections import defaultdict
@@ -97,7 +96,7 @@ def list_files(args):
 OLD_LICENSE, SPDX_MISMATCH, GENERAL_MISMATCH = range(1, 4)

 #: Latest year that copyright applies. UPDATE THIS when bumping copyright.
-latest_year = datetime.date.today().year
+latest_year = 2024  # year of 0.23 release
 strict_date = r"Copyright 2013-%s" % latest_year

 #: regexes for valid license lines at tops of files

View File

@@ -82,14 +82,6 @@ def spec(parser, args):
     if args.namespaces:
         fmt = "{namespace}." + fmt

-    tree_kwargs = {
-        "cover": args.cover,
-        "format": fmt,
-        "hashlen": None if args.very_long else 7,
-        "show_types": args.types,
-        "status_fn": install_status_fn if args.install_status else None,
-    }
-
     # use a read transaction if we are getting install status for every
     # spec in the DAG. This avoids repeatedly querying the DB.
     tree_context = lang.nullcontext
@@ -99,46 +91,35 @@ def spec(parser, args):
     env = ev.active_environment()

     if args.specs:
-        input_specs = spack.cmd.parse_specs(args.specs)
-        concretized_specs = spack.cmd.parse_specs(args.specs, concretize=True)
-        specs = list(zip(input_specs, concretized_specs))
+        concrete_specs = spack.cmd.parse_specs(args.specs, concretize=True)
     elif env:
         env.concretize()
-        specs = env.concretized_specs()
-
-        if not args.format:
-            # environments are printed together in a combined tree() invocation,
-            # except when using --yaml or --json, which we print spec by spec below.
-            tree_kwargs["key"] = spack.traverse.by_dag_hash
-            tree_kwargs["hashes"] = args.long or args.very_long
-            print(spack.spec.tree([concrete for _, concrete in specs], **tree_kwargs))
-            return
+        concrete_specs = env.concrete_roots()
     else:
         tty.die("spack spec requires at least one spec or an active environment")

-    for input, output in specs:
-        # With --yaml or --json, just print the raw specs to output
-        if args.format:
+    # With --yaml, --json, or --format, just print the raw specs to output
+    if args.format:
+        for spec in concrete_specs:
             if args.format == "yaml":
                 # use write because to_yaml already has a newline.
-                sys.stdout.write(output.to_yaml(hash=ht.dag_hash))
+                sys.stdout.write(spec.to_yaml(hash=ht.dag_hash))
             elif args.format == "json":
-                print(output.to_json(hash=ht.dag_hash))
+                print(spec.to_json(hash=ht.dag_hash))
             else:
-                print(output.format(args.format))
-            continue
-
-        with tree_context():
-            # Only show the headers for input specs that are not concrete to avoid
-            # repeated output. This happens because parse_specs outputs concrete
-            # specs for `/hash` inputs.
-            if not input.concrete:
-                tree_kwargs["hashes"] = False  # Always False for input spec
-                print("Input spec")
-                print("--------------------------------")
-                print(input.tree(**tree_kwargs))
-
-                print("Concretized")
-                print("--------------------------------")
-
-            tree_kwargs["hashes"] = args.long or args.very_long
-            print(output.tree(**tree_kwargs))
+                print(spec.format(args.format))
+        return

+    with tree_context():
+        print(
+            spack.spec.tree(
+                concrete_specs,
+                cover=args.cover,
+                format=fmt,
+                hashlen=None if args.very_long else 7,
+                show_types=args.types,
+                status_fn=install_status_fn if args.install_status else None,
+                hashes=args.long or args.very_long,
+                key=spack.traverse.by_dag_hash,
+            )
+        )

View File

@@ -24,7 +24,7 @@

 # tutorial configuration parameters
-tutorial_branch = "releases/v0.22"
+tutorial_branch = "releases/v0.23"
 tutorial_mirror = "file:///mirror"
 tutorial_key = os.path.join(spack.paths.share_path, "keys", "tutorial.pub")

View File

@@ -6,7 +6,7 @@
 import sys
 import time
 from contextlib import contextmanager
-from typing import Iterable, Optional, Sequence, Tuple, Union
+from typing import Iterable, List, Optional, Sequence, Tuple, Union

 import llnl.util.tty as tty

@@ -36,6 +36,7 @@ def enable_compiler_existence_check():
     CHECK_COMPILER_EXISTENCE = saved

+SpecPairInput = Tuple[Spec, Optional[Spec]]
 SpecPair = Tuple[Spec, Spec]
 SpecLike = Union[Spec, str]
 TestsType = Union[bool, Iterable[str]]

@@ -60,8 +61,8 @@ def concretize_specs_together(

 def concretize_together(
-    spec_list: Sequence[SpecPair], tests: TestsType = False
-) -> Sequence[SpecPair]:
+    spec_list: Sequence[SpecPairInput], tests: TestsType = False
+) -> List[SpecPair]:
     """Given a number of specs as input, tries to concretize them together.

     Args:

@@ -77,8 +78,8 @@ def concretize_together(

 def concretize_together_when_possible(
-    spec_list: Sequence[SpecPair], tests: TestsType = False
-) -> Sequence[SpecPair]:
+    spec_list: Sequence[SpecPairInput], tests: TestsType = False
+) -> List[SpecPair]:
     """Given a number of specs as input, tries to concretize them together to the extent possible.

     See documentation for ``unify: when_possible`` concretization for the precise definition of

@@ -114,8 +115,8 @@ def concretize_together_when_possible(

 def concretize_separately(
-    spec_list: Sequence[SpecPair], tests: TestsType = False
-) -> Sequence[SpecPair]:
+    spec_list: Sequence[SpecPairInput], tests: TestsType = False
+) -> List[SpecPair]:
     """Concretizes the input specs separately from each other.

     Args:
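
The typing change above loosens what callers may pass: the second element of each input pair may now be `None` (not yet concretized), while the return type still promises fully concrete pairs. A minimal sketch, with a hypothetical `zlib` root and the call left commented since it requires a full Spack session:

```
from typing import List, Optional, Tuple

from spack.spec import Spec

SpecPairInput = Tuple[Spec, Optional[Spec]]  # what callers may now pass
SpecPair = Tuple[Spec, Spec]                 # what the solvers promise to return

pairs: List[SpecPairInput] = [(Spec("zlib"), None)]
# e.g.: result: List[SpecPair] = concretize_separately(pairs, tests=False)
```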


@@ -431,6 +431,19 @@ def ensure_unwrapped(self) -> "Configuration":
         """Ensure we unwrap this object from any dynamic wrapper (like Singleton)"""
         return self

+    def highest(self) -> ConfigScope:
+        """Scope with highest precedence"""
+        return next(reversed(self.scopes.values()))  # type: ignore
+
+    @_config_mutator
+    def ensure_scope_ordering(self):
+        """Ensure that scope order matches documented precedent"""
+        # FIXME: We also need to consider that custom configurations and other orderings
+        # may not be preserved correctly
+        if "command_line" in self.scopes:
+            # TODO (when dropping python 3.6): self.scopes.move_to_end
+            self.scopes["command_line"] = self.remove_scope("command_line")
+
     @_config_mutator
     def push_scope(self, scope: ConfigScope) -> None:
         """Add a higher precedence scope to the Configuration."""


@@ -3044,11 +3044,13 @@ def prepare_config_scope(self) -> None:
         """Add the manifest's scopes to the global configuration search path."""
         for scope in self.env_config_scopes:
             spack.config.CONFIG.push_scope(scope)
+        spack.config.CONFIG.ensure_scope_ordering()

     def deactivate_config_scope(self) -> None:
         """Remove any of the manifest's scopes from the global config path."""
         for scope in self.env_config_scopes:
             spack.config.CONFIG.remove_scope(scope.name)
+        spack.config.CONFIG.ensure_scope_ordering()

     @contextlib.contextmanager
     def use_config(self):


@@ -180,7 +180,7 @@ def ensure_mirror_usable(self, direction: str = "push"):
         if errors:
             msg = f"invalid {direction} configuration for mirror {self.name}: "
             msg += "\n ".join(errors)
-            raise spack.mirror.MirrorError(msg)
+            raise MirrorError(msg)

     def _update_connection_dict(self, current_data: dict, new_data: dict, top_level: bool):
         # Only allow one to exist in the config


@@ -397,6 +397,7 @@ def create_opener():
     """Create an opener that can handle OCI authentication."""
     opener = urllib.request.OpenerDirector()
     for handler in [
+        urllib.request.ProxyHandler(),
         urllib.request.UnknownHandler(),
         urllib.request.HTTPSHandler(context=spack.util.web.ssl_create_default_context()),
         spack.util.web.SpackHTTPDefaultErrorHandler(),
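
A bare `ProxyHandler()` is enough to respect system settings because, constructed without a mapping, it falls back to `urllib.request.getproxies()`, which reads the `*_proxy` environment variables. A quick check, using a hypothetical proxy URL:

```
import os
import urllib.request

os.environ["https_proxy"] = "http://proxy.example.com:3128"  # hypothetical
handler = urllib.request.ProxyHandler()  # no mapping given -> uses getproxies()
assert handler.proxies.get("https") == "http://proxy.example.com:3128"
```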


@@ -13,6 +13,7 @@
 import macholib.mach_o
 import macholib.MachO

+import llnl.util.filesystem as fs
 import llnl.util.lang
 import llnl.util.tty as tty
 from llnl.util.lang import memoized

@@ -275,10 +276,10 @@ def modify_macho_object(cur_path, rpaths, deps, idpath, paths_to_paths):
     # Deduplicate and flatten
     args = list(itertools.chain.from_iterable(llnl.util.lang.dedupe(args)))
-    if args:
-        args.append(str(cur_path))
-        install_name_tool = executable.Executable("install_name_tool")
-        install_name_tool(*args)
+    install_name_tool = executable.Executable("install_name_tool")
+    if args:
+        with fs.edit_in_place_through_temporary_file(cur_path) as temp_path:
+            install_name_tool(*args, temp_path)

 def macholib_get_paths(cur_path):

@@ -717,8 +718,8 @@ def fixup_macos_rpath(root, filename):
         # No fixes needed
         return False

-    args.append(abspath)
-    executable.Executable("install_name_tool")(*args)
+    with fs.edit_in_place_through_temporary_file(abspath) as temp_path:
+        executable.Executable("install_name_tool")(*args, temp_path)
     return True
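
`edit_in_place_through_temporary_file` is the helper these hunks switch to: it hands the caller a temporary copy to edit, then writes the edited bytes back over the original so its inode is preserved. A simplified sketch of that contract; the real implementation in `llnl.util.filesystem` may differ in detail:

```
import os
import shutil
import tempfile
from contextlib import contextmanager

@contextmanager
def edit_in_place_through_temporary_file(path):
    # Seed a private temporary copy next to the original file.
    with tempfile.NamedTemporaryFile(
        delete=False, dir=os.path.dirname(os.path.abspath(path))
    ) as tmp:
        tmp_path = tmp.name
    shutil.copy2(path, tmp_path)
    try:
        yield tmp_path  # caller (e.g. install_name_tool) edits the copy
        # Copy the edited bytes back; the original inode stays intact.
        shutil.copyfile(tmp_path, path)
    finally:
        os.unlink(tmp_path)
```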


@@ -209,7 +209,7 @@ def _apply_to_file(self, f):
         # but it's nasty to deal with matches across boundaries, so let's stick to
         # something simple.

-        modified = True
+        modified = False

         for match in self.regex.finditer(f.read()):
             # The matching prefix (old) and its replacement (new)


@@ -106,8 +106,8 @@
     {
         "names": ["install_missing_compilers"],
         "message": "The config:install_missing_compilers option has been deprecated in "
-        "Spack v0.23, and is currently ignored. It will be removed from config in "
-        "Spack v0.25.",
+        "Spack v0.23, and is currently ignored. It will be removed from config after "
+        "Spack v1.0.",
         "error": False,
     },
 ],


@@ -3839,12 +3839,21 @@ def splice_at_hash(
         self._splices.setdefault(parent_node, []).append(splice)

     def _resolve_automatic_splices(self):
-        """After all of the specs have been concretized, apply all immediate
-        splices in size order. This ensures that all dependencies are resolved
+        """After all of the specs have been concretized, apply all immediate splices.
+
+        Use reverse topological order to ensure that all dependencies are resolved
         before their parents, allowing for maximal sharing and minimal copying.
         """
         fixed_specs = {}
-        for node, spec in sorted(self._specs.items(), key=lambda x: len(x[1])):
+
+        # create a mapping from dag hash to an integer representing position in reverse topo order.
+        specs = self._specs.values()
+        topo_order = list(traverse.traverse_nodes(specs, order="topo", key=traverse.by_dag_hash))
+        topo_lookup = {spec.dag_hash(): index for index, spec in enumerate(reversed(topo_order))}
+
+        # iterate over specs, children before parents
+        for node, spec in sorted(self._specs.items(), key=lambda x: topo_lookup[x[1].dag_hash()]):
             immediate = self._splices.get(node, [])
             if not immediate and not any(
                 edge.spec in fixed_specs for edge in spec.edges_to_dependencies()
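
The ordering trick above is easy to miss: a rank table built from the *reversed* topological order makes a plain `sorted()` visit children before parents. The idea in isolation, with hypothetical node names:

```
order = ["root", "mid", "leaf"]  # topological order: parents first
rank = {name: i for i, name in enumerate(reversed(order))}
assert sorted(order, key=rank.__getitem__) == ["leaf", "mid", "root"]
```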


@@ -448,6 +448,9 @@ def _target_satisfies(self, other: "ArchSpec", strict: bool) -> bool:
         return bool(self._target_intersection(other))

     def _target_constrain(self, other: "ArchSpec") -> bool:
+        if self.target is None and other.target is None:
+            return False
+
         if not other._target_satisfies(self, strict=False):
             raise UnsatisfiableArchitectureSpecError(self, other)

@@ -496,21 +499,56 @@ def _target_intersection(self, other):
                     if (not s_min or o_comp >= s_min) and (not s_max or o_comp <= s_max):
                         results.append(o_min)
                 else:
-                    # Take intersection of two ranges
-                    # Lots of comparisons needed
-                    _s_min = _make_microarchitecture(s_min)
-                    _s_max = _make_microarchitecture(s_max)
-                    _o_min = _make_microarchitecture(o_min)
-                    _o_max = _make_microarchitecture(o_max)
-
-                    n_min = s_min if _s_min >= _o_min else o_min
-                    n_max = s_max if _s_max <= _o_max else o_max
-                    _n_min = _make_microarchitecture(n_min)
-                    _n_max = _make_microarchitecture(n_max)
-                    if _n_min == _n_max:
-                        results.append(n_min)
-                    elif not n_min or not n_max or _n_min < _n_max:
-                        results.append("%s:%s" % (n_min, n_max))
+                    # Take the "min" of the two max, if there is a partial ordering.
+                    n_max = ""
+                    if s_max and o_max:
+                        _s_max = _make_microarchitecture(s_max)
+                        _o_max = _make_microarchitecture(o_max)
+                        if _s_max.family != _o_max.family:
+                            continue
+                        if _s_max <= _o_max:
+                            n_max = s_max
+                        elif _o_max < _s_max:
+                            n_max = o_max
+                        else:
+                            continue
+                    elif s_max:
+                        n_max = s_max
+                    elif o_max:
+                        n_max = o_max
+
+                    # Take the "max" of the two min.
+                    n_min = ""
+                    if s_min and o_min:
+                        _s_min = _make_microarchitecture(s_min)
+                        _o_min = _make_microarchitecture(o_min)
+                        if _s_min.family != _o_min.family:
+                            continue
+                        if _s_min >= _o_min:
+                            n_min = s_min
+                        elif _o_min > _s_min:
+                            n_min = o_min
+                        else:
+                            continue
+                    elif s_min:
+                        n_min = s_min
+                    elif o_min:
+                        n_min = o_min
+
+                    if n_min and n_max:
+                        _n_min = _make_microarchitecture(n_min)
+                        _n_max = _make_microarchitecture(n_max)
+                        if _n_min.family != _n_max.family or not _n_min <= _n_max:
+                            continue
+                        if n_min == n_max:
+                            results.append(n_min)
+                        else:
+                            results.append(f"{n_min}:{n_max}")
+                    elif n_min:
+                        results.append(f"{n_min}:")
+                    elif n_max:
+                        results.append(f":{n_max}")

         return results

     def constrain(self, other: "ArchSpec") -> bool:
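
A worked example of the corrected range semantics, mirroring the new test cases later in this diff: ranges from different microarchitecture families no longer intersect, and constraining overlapping ranges keeps the tighter bound on each end.

```
from spack.spec import Spec

assert not Spec("target=x86_64:").intersects(Spec("target=ppc64le:"))
assert Spec("target=:haswell").intersects(Spec("target=x86_64:"))

s = Spec("target=x86_64:haswell")
assert s.constrain(Spec("target=x86_64_v2:icelake"))  # True: the spec changed
assert s == Spec("target=x86_64_v2:haswell")
```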
@@ -1431,8 +1469,6 @@ def tree(

 class Spec:
     #: Cache for spec's prefix, computed lazily in the corresponding property
     _prefix = None
-    #: Cache for spec's length, computed lazily in the corresponding property
-    _length = None
     abstract_hash = None

     @staticmethod

@@ -1502,9 +1538,8 @@ def __init__(
         self._external_path = external_path
         self.external_modules = Spec._format_module_list(external_modules)

-        # This attribute is used to store custom information for
-        # external specs. None signal that it was not set yet.
-        self.extra_attributes = None
+        # This attribute is used to store custom information for external specs.
+        self.extra_attributes: dict = {}

         # This attribute holds the original build copy of the spec if it is
         # deployed differently than it was built. None signals that the spec

@@ -2219,8 +2254,8 @@ def to_node_dict(self, hash=ht.dag_hash):
             d["external"] = syaml.syaml_dict(
                 [
                     ("path", self.external_path),
-                    ("module", self.external_modules),
-                    ("extra_attributes", self.extra_attributes),
+                    ("module", self.external_modules or None),
+                    ("extra_attributes", syaml.sorted_dict(self.extra_attributes)),
                 ]
             )

@@ -2958,7 +2993,7 @@ def _finalize_concretization(self):
         for spec in self.traverse():
             spec._cached_hash(ht.dag_hash)

-    def concretized(self, tests: Union[bool, Iterable[str]] = False) -> "spack.spec.Spec":
+    def concretized(self, tests: Union[bool, Iterable[str]] = False) -> "Spec":
         """This is a non-destructive version of concretize().

         First clones, then returns a concrete version of this package

@@ -3081,17 +3116,12 @@ def constrain(self, other, deps=True):
             if not self.variants[v].compatible(other.variants[v]):
                 raise vt.UnsatisfiableVariantSpecError(self.variants[v], other.variants[v])

-        # TODO: Check out the logic here
         sarch, oarch = self.architecture, other.architecture
-        if sarch is not None and oarch is not None:
-            if sarch.platform is not None and oarch.platform is not None:
-                if sarch.platform != oarch.platform:
-                    raise UnsatisfiableArchitectureSpecError(sarch, oarch)
-            if sarch.os is not None and oarch.os is not None:
-                if sarch.os != oarch.os:
-                    raise UnsatisfiableArchitectureSpecError(sarch, oarch)
-            if sarch.target is not None and oarch.target is not None:
-                if sarch.target != oarch.target:
-                    raise UnsatisfiableArchitectureSpecError(sarch, oarch)
+        if (
+            sarch is not None
+            and oarch is not None
+            and not self.architecture.intersects(other.architecture)
+        ):
+            raise UnsatisfiableArchitectureSpecError(sarch, oarch)

         changed = False

@@ -3115,18 +3145,12 @@ def constrain(self, other, deps=True):
         changed |= self.compiler_flags.constrain(other.compiler_flags)

-        old = str(self.architecture)
         sarch, oarch = self.architecture, other.architecture
-        if sarch is None or other.architecture is None:
-            self.architecture = sarch or oarch
-        else:
-            if sarch.platform is None or oarch.platform is None:
-                self.architecture.platform = sarch.platform or oarch.platform
-            if sarch.os is None or oarch.os is None:
-                sarch.os = sarch.os or oarch.os
-            if sarch.target is None or oarch.target is None:
-                sarch.target = sarch.target or oarch.target
-        changed |= str(self.architecture) != old
+        if sarch is not None and oarch is not None:
+            changed |= self.architecture.constrain(other.architecture)
+        elif oarch is not None:
+            self.architecture = oarch
+            changed = True

         if deps:
             changed |= self._constrain_dependencies(other)

@@ -3702,18 +3726,6 @@ def __getitem__(self, name: str):

         return child

-    def __len__(self):
-        if not self.concrete:
-            raise spack.error.SpecError(f"Cannot get length of abstract spec: {self}")
-
-        if not self._length:
-            self._length = 1 + sum(len(dep) for dep in self.dependencies())
-        return self._length
-
-    def __bool__(self):
-        # Need to define this so __len__ isn't used by default
-        return True
-
     def __contains__(self, spec):
         """True if this spec or some dependency satisfies the spec.

@@ -4472,7 +4484,7 @@ def clear_caches(self, ignore=()):
             if h.attr not in ignore:
                 if hasattr(self, h.attr):
                     setattr(self, h.attr, None)
-        for attr in ("_dunder_hash", "_prefix", "_length"):
+        for attr in ("_dunder_hash", "_prefix"):
             if attr not in ignore:
                 setattr(self, attr, None)

@@ -4849,8 +4861,8 @@ def from_node_dict(cls, node):
             spec.external_modules = node["external"]["module"]
             if spec.external_modules is False:
                 spec.external_modules = None
-            spec.extra_attributes = node["external"].get(
-                "extra_attributes", syaml.syaml_dict()
+            spec.extra_attributes = (
+                node["external"].get("extra_attributes") or syaml.syaml_dict()
             )

         # specs read in are concrete unless marked abstract


@@ -487,7 +487,7 @@ def _generate_fetchers(self, mirror_only=False) -> Generator["fs.FetchStrategy",
         # Insert fetchers in the order that the URLs are provided.
         fetchers[:0] = (
             fs.from_url_scheme(
-                url_util.join(mirror.fetch_url, self.mirror_layout.path),
+                url_util.join(mirror.fetch_url, *self.mirror_layout.path.split(os.sep)),
                 checksum=digest,
                 expand=expand,
                 extension=extension,
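
The point of the `split(os.sep)` here: mirror layout paths are OS paths, so on Windows they contain backslashes, which must not end up verbatim in a URL. A small sketch with a hypothetical mirror URL and layout path:

```
import os

import spack.util.url as url_util

layout_path = os.path.join("blobs", "sha256", "abc123")  # hypothetical layout path
url = url_util.join("https://mirror.example.com", *layout_path.split(os.sep))
# -> "https://mirror.example.com/blobs/sha256/abc123" on every platform
```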


@@ -10,9 +10,6 @@
 import spack.config
 import spack.deptypes as dt
-import spack.package_base
-import spack.paths
-import spack.repo
 import spack.solver.asp
 from spack.installer import PackageInstaller
 from spack.spec import Spec

@@ -232,3 +229,19 @@ def test_spliced_build_deps_only_in_build_spec(splicing_setup):
     assert _has_build_dependency(build_spec, "splice-z")
     # Spliced build dependencies are removed
     assert len(concr_goal.dependencies(None, dt.BUILD)) == 0
+
+
+def test_spliced_transitive_dependency(splicing_setup):
+    cache = ["splice-depends-on-t@1.0 ^splice-h@1.0.1"]
+    goal_spec = Spec("splice-depends-on-t^splice-h@1.0.2")
+
+    with CacheManager(cache):
+        spack.config.set("packages", _make_specs_non_buildable(["splice-depends-on-t"]))
+        _enable_splicing()
+        concr_goal = goal_spec.concretized()
+
+    # Spec has been spliced
+    assert concr_goal._build_spec is not None
+    assert concr_goal["splice-t"]._build_spec is not None
+    assert concr_goal.satisfies(goal_spec)
+
+    # Spliced build dependencies are removed
+    assert len(concr_goal.dependencies(None, dt.BUILD)) == 0


@@ -4,9 +4,11 @@
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)
 import pytest

+import spack.config
 import spack.environment as ev
 import spack.error
 import spack.solver.asp as asp
+import spack.store
 from spack.cmd import (
     CommandNameError,
     PythonNameError,


@@ -315,6 +315,7 @@ def test_pkg_grep(mock_packages, capfd):
             "depends-on-manyvariants",
             "manyvariants",
             "splice-a",
+            "splice-depends-on-t",
             "splice-h",
             "splice-t",
             "splice-vh",


@@ -7,6 +7,7 @@
 import pytest

+import spack.config
 import spack.environment as ev
 import spack.error
 import spack.spec


@@ -1532,3 +1532,30 @@ def test_config_path_dsl(path, it_should_work, expected_parsed):
     else:
         with pytest.raises(ValueError):
             spack.config.ConfigPath._validate(path)
+
+
+@pytest.mark.regression("48254")
+def test_env_activation_preserves_config_scopes(mutable_mock_env_path):
+    """Check that the "command_line" scope remains the highest priority scope, when we activate,
+    or deactivate, environments.
+    """
+    expected_cl_scope = spack.config.CONFIG.highest()
+    assert expected_cl_scope.name == "command_line"
+
+    # Creating an environment pushes a new scope
+    ev.create("test")
+    with ev.read("test"):
+        assert spack.config.CONFIG.highest() == expected_cl_scope
+
+        # No active environment pops the scope
+        with ev.no_active_environment():
+            assert spack.config.CONFIG.highest() == expected_cl_scope
+        assert spack.config.CONFIG.highest() == expected_cl_scope
+
+        # Switch the environment to another one
+        ev.create("test-2")
+        with ev.read("test-2"):
+            assert spack.config.CONFIG.highest() == expected_cl_scope
+        assert spack.config.CONFIG.highest() == expected_cl_scope
+
+    assert spack.config.CONFIG.highest() == expected_cl_scope


@@ -1249,3 +1249,14 @@ def test_find_input_types(tmp_path: pathlib.Path):
     with pytest.raises(TypeError):
         fs.find(1, "file.txt")  # type: ignore
+
+
+def test_edit_in_place_through_temporary_file(tmp_path):
+    (tmp_path / "example.txt").write_text("Hello")
+    current_ino = os.stat(tmp_path / "example.txt").st_ino
+    with fs.edit_in_place_through_temporary_file(tmp_path / "example.txt") as temporary:
+        os.unlink(temporary)
+        with open(temporary, "w") as f:
+            f.write("World")
+    assert (tmp_path / "example.txt").read_text() == "World"
+    assert os.stat(tmp_path / "example.txt").st_ino == current_ino


@@ -298,30 +298,6 @@ def inner():
     top-level raised TypeError: ok"""
     )

-    full_message = h.grouped_message(with_tracebacks=True)
-    no_line_numbers = re.sub(r"line [0-9]+,", "line xxx,", full_message)
-
-    assert (
-        no_line_numbers
-        == dedent(
-            """\
-            due to the following failures:
-            inner method raised ValueError: wow!
-              File "{0}", \
-line xxx, in test_grouped_exception
-                inner()
-              File "{0}", \
-line xxx, in inner
-                raise ValueError("wow!")
-
-            top-level raised TypeError: ok
-              File "{0}", \
-line xxx, in test_grouped_exception
-                raise TypeError("ok")
-            """
-        ).format(__file__)
-    )
-

 def test_grouped_exception_base_type():
     h = llnl.util.lang.GroupedExceptionHandler()


@@ -57,18 +57,16 @@ def test_log_python_output_without_echo(capfd, tmpdir):
     assert capfd.readouterr()[0] == ""

-def test_log_python_output_with_invalid_utf8(capfd, tmpdir):
-    with tmpdir.as_cwd():
-        with log.log_output("foo.txt"):
-            sys.stdout.buffer.write(b"\xc3\x28\n")
-
-        expected = b"<line lost: output was not encoded as UTF-8>\n"
-        with open("foo.txt", "rb") as f:
-            written = f.read()
-        assert written == expected
-
-        # nothing on stdout or stderr
-        assert capfd.readouterr()[0] == ""
+def test_log_python_output_with_invalid_utf8(capfd, tmp_path):
+    tmp_file = str(tmp_path / "foo.txt")
+    with log.log_output(tmp_file, echo=True):
+        sys.stdout.buffer.write(b"\xc3helloworld\n")
+
+    # we should be able to read this as valid utf-8
+    with open(tmp_file, "r", encoding="utf-8") as f:
+        assert f.read() == "�helloworld\n"
+
+    assert capfd.readouterr().out == "�helloworld\n"

 def test_log_python_output_and_echo_output(capfd, tmpdir):
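
The new expectation follows directly from Python's "replace" error handler: an invalid start byte decodes to the U+FFFD replacement character instead of the whole line being dropped.

```
assert b"\xc3helloworld\n".decode("utf-8", errors="replace") == "\ufffdhelloworld\n"
```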


@@ -138,3 +138,19 @@ def test_round_trip_configuration(initial_content, expected_final_content, tmp_p
         expected_final_content = initial_content

     assert final_content.getvalue() == expected_final_content
+
+
+def test_sorted_dict():
+    assert syaml.sorted_dict(
+        {
+            "z": 0,
+            "y": [{"x": 0, "w": [2, 1, 0]}, 0],
+            "v": ({"u": 0, "t": 0, "s": 0}, 0, {"r": 0, "q": 0}),
+            "p": 0,
+        }
+    ) == {
+        "p": 0,
+        "v": ({"s": 0, "t": 0, "u": 0}, 0, {"q": 0, "r": 0}),
+        "y": [{"w": [2, 1, 0], "x": 0}, 0],
+        "z": 0,
+    }


@@ -1842,6 +1842,16 @@ def test_abstract_contains_semantic(lhs, rhs, expected, mock_packages):
         # Different virtuals intersect if there is at least package providing both
         (Spec, "mpi", "lapack", (True, False, False)),
         (Spec, "mpi", "pkgconfig", (False, False, False)),
+        # Intersection among target ranges for different architectures
+        (Spec, "target=x86_64:", "target=ppc64le:", (False, False, False)),
+        (Spec, "target=x86_64:", "target=:power9", (False, False, False)),
+        (Spec, "target=:haswell", "target=:power9", (False, False, False)),
+        (Spec, "target=:haswell", "target=ppc64le:", (False, False, False)),
+        # Intersection among target ranges for the same architecture
+        (Spec, "target=:haswell", "target=x86_64:", (True, True, True)),
+        (Spec, "target=:haswell", "target=x86_64_v4:", (False, False, False)),
+        # Edge case of uarch that split in a diamond structure, from a common ancestor
+        (Spec, "target=:cascadelake", "target=:cannonlake", (False, False, False)),
     ],
 )
 def test_intersects_and_satisfies(factory, lhs_str, rhs_str, results):

@@ -1891,6 +1901,16 @@ def test_intersects_and_satisfies(factory, lhs_str, rhs_str, results):
         # Flags
         (Spec, "cppflags=-foo", "cppflags=-foo", False, "cppflags=-foo"),
         (Spec, "cppflags=-foo", "cflags=-foo", True, "cppflags=-foo cflags=-foo"),
+        # Target ranges
+        (Spec, "target=x86_64:", "target=x86_64:", False, "target=x86_64:"),
+        (Spec, "target=x86_64:", "target=:haswell", True, "target=x86_64:haswell"),
+        (
+            Spec,
+            "target=x86_64:haswell",
+            "target=x86_64_v2:icelake",
+            True,
+            "target=x86_64_v2:haswell",
+        ),
     ],
 )
 def test_constrain(factory, lhs_str, rhs_str, result, constrained_str):


@@ -21,6 +21,7 @@
 import pytest
 import ruamel.yaml

+import spack.config
 import spack.hash_types as ht
 import spack.paths
 import spack.repo

@@ -144,86 +145,83 @@ def descend_and_check(iterable, level=0):
     assert level >= 5

-def test_ordered_read_not_required_for_consistent_dag_hash(config, mock_packages):
-    """Make sure ordered serialization isn't required to preserve hashes.
-
-    For consistent hashes, we require that YAML and json documents
-    have their keys serialized in a deterministic order. However, we
-    don't want to require them to be serialized in order. This
-    ensures that is not required.
-    """
-    specs = ["mpileaks ^zmpi", "dttop", "dtuse"]
-    for spec in specs:
-        spec = Spec(spec)
-        spec.concretize()
-
-        #
-        # Dict & corresponding YAML & JSON from the original spec.
-        #
-        spec_dict = spec.to_dict()
-        spec_yaml = spec.to_yaml()
-        spec_json = spec.to_json()
-
-        #
-        # Make a spec with reversed OrderedDicts for every
-        # OrderedDict in the original.
-        #
-        reversed_spec_dict = reverse_all_dicts(spec.to_dict())
-
-        #
-        # Dump to YAML and JSON
-        #
-        yaml_string = syaml.dump(spec_dict, default_flow_style=False)
-        reversed_yaml_string = syaml.dump(reversed_spec_dict, default_flow_style=False)
-        json_string = sjson.dump(spec_dict)
-        reversed_json_string = sjson.dump(reversed_spec_dict)
-
-        #
-        # Do many consistency checks
-        #
-
-        # spec yaml is ordered like the spec dict
-        assert yaml_string == spec_yaml
-        assert json_string == spec_json
-
-        # reversed string is different from the original, so it
-        # *would* generate a different hash
-        assert yaml_string != reversed_yaml_string
-        assert json_string != reversed_json_string
-
-        # build specs from the "wrongly" ordered data
-        round_trip_yaml_spec = Spec.from_yaml(yaml_string)
-        round_trip_json_spec = Spec.from_json(json_string)
-        round_trip_reversed_yaml_spec = Spec.from_yaml(reversed_yaml_string)
-        round_trip_reversed_json_spec = Spec.from_yaml(reversed_json_string)
-
-        # Strip spec if we stripped the yaml
-        spec = spec.copy(deps=ht.dag_hash.depflag)
-
-        # specs are equal to the original
-        assert spec == round_trip_yaml_spec
-        assert spec == round_trip_json_spec
-        assert spec == round_trip_reversed_yaml_spec
-        assert spec == round_trip_reversed_json_spec
-        assert round_trip_yaml_spec == round_trip_reversed_yaml_spec
-        assert round_trip_json_spec == round_trip_reversed_json_spec
-
-        # dag_hashes are equal
-        assert spec.dag_hash() == round_trip_yaml_spec.dag_hash()
-        assert spec.dag_hash() == round_trip_json_spec.dag_hash()
-        assert spec.dag_hash() == round_trip_reversed_yaml_spec.dag_hash()
-        assert spec.dag_hash() == round_trip_reversed_json_spec.dag_hash()
-
-        # dag_hash is equal after round-trip by dag_hash
-        spec.concretize()
-        round_trip_yaml_spec.concretize()
-        round_trip_json_spec.concretize()
-        round_trip_reversed_yaml_spec.concretize()
-        round_trip_reversed_json_spec.concretize()
-        assert spec.dag_hash() == round_trip_yaml_spec.dag_hash()
-        assert spec.dag_hash() == round_trip_json_spec.dag_hash()
-        assert spec.dag_hash() == round_trip_reversed_yaml_spec.dag_hash()
-        assert spec.dag_hash() == round_trip_reversed_json_spec.dag_hash()
+@pytest.mark.parametrize("spec_str", ["mpileaks ^zmpi", "dttop", "dtuse"])
+def test_ordered_read_not_required_for_consistent_dag_hash(
+    spec_str, mutable_config: spack.config.Configuration, mock_packages
+):
+    """Make sure ordered serialization isn't required to preserve hashes.
+
+    For consistent hashes, we require that YAML and JSON serializations have their keys in a
+    deterministic order. However, we don't want to require them to be serialized in order. This
+    ensures that is not required."""
+    # Make sure that `extra_attributes` of externals is order independent for hashing.
+    extra_attributes = {
+        "compilers": {"c": "/some/path/bin/cc", "cxx": "/some/path/bin/c++"},
+        "foo": "bar",
+        "baz": "qux",
+    }
+    mutable_config.set(
+        "packages:dtuse",
+        {
+            "buildable": False,
+            "externals": [
+                {"spec": "dtuse@=1.0", "prefix": "/usr", "extra_attributes": extra_attributes}
+            ],
+        },
+    )
+    spec = spack.spec.Spec(spec_str).concretized()
+
+    if spec_str == "dtuse":
+        assert spec.external and spec.extra_attributes == extra_attributes
+
+    spec_dict = spec.to_dict(hash=ht.dag_hash)
+    spec_yaml = spec.to_yaml()
+    spec_json = spec.to_json()
+
+    # Make a spec with dict keys reversed recursively
+    spec_dict_rev = reverse_all_dicts(spec_dict)
+
+    # Dump to YAML and JSON
+    yaml_string = syaml.dump(spec_dict, default_flow_style=False)
+    yaml_string_rev = syaml.dump(spec_dict_rev, default_flow_style=False)
+    json_string = sjson.dump(spec_dict)
+    json_string_rev = sjson.dump(spec_dict_rev)
+
+    # spec yaml is ordered like the spec dict
+    assert yaml_string == spec_yaml
+    assert json_string == spec_json
+
+    # reversed string is different from the original, so it *would* generate a different hash
+    assert yaml_string != yaml_string_rev
+    assert json_string != json_string_rev
+
+    # build specs from the "wrongly" ordered data
+    from_yaml = Spec.from_yaml(yaml_string)
+    from_json = Spec.from_json(json_string)
+    from_yaml_rev = Spec.from_yaml(yaml_string_rev)
+    from_json_rev = Spec.from_json(json_string_rev)
+
+    # Strip spec if we stripped the yaml
+    spec = spec.copy(deps=ht.dag_hash.depflag)
+
+    # specs and their hashes are equal to the original
+    assert (
+        spec.process_hash()
+        == from_yaml.process_hash()
+        == from_json.process_hash()
+        == from_yaml_rev.process_hash()
+        == from_json_rev.process_hash()
+    )
+    assert (
+        spec.dag_hash()
+        == from_yaml.dag_hash()
+        == from_json.dag_hash()
+        == from_yaml_rev.dag_hash()
+        == from_json_rev.dag_hash()
+    )
+    assert spec == from_yaml == from_json == from_yaml_rev == from_json_rev

 @pytest.mark.parametrize("module", [spack.spec, spack.version])

@@ -294,12 +292,9 @@ def visit_Call(self, node):
 def reverse_all_dicts(data):
     """Descend into data and reverse all the dictionaries"""
     if isinstance(data, dict):
-        return syaml_dict(
-            reversed([(reverse_all_dicts(k), reverse_all_dicts(v)) for k, v in data.items()])
-        )
+        return type(data)((k, reverse_all_dicts(v)) for k, v in reversed(list(data.items())))
     elif isinstance(data, (list, tuple)):
         return type(data)(reverse_all_dicts(elt) for elt in data)
-    else:
-        return data
+    return data
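
`reverse_all_dicts` now preserves the mapping type and only reverses key order, which is exactly what the hash assertions above exercise. A quick, runnable illustration with toy data (the helper is copied here so the snippet is self-contained):

```
def reverse_all_dicts(data):
    if isinstance(data, dict):
        return type(data)((k, reverse_all_dicts(v)) for k, v in reversed(list(data.items())))
    elif isinstance(data, (list, tuple)):
        return type(data)(reverse_all_dicts(elt) for elt in data)
    return data

data = {"a": 1, "b": {"c": 2, "d": 3}}
rev = reverse_all_dicts(data)
assert list(rev) == ["b", "a"] and list(rev["b"]) == ["d", "c"]
# equality ignores key order, so hashes must not depend on it either
assert rev == data
```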


@@ -13,7 +13,7 @@
 import sys

 from llnl.util import tty
-from llnl.util.filesystem import join_path
+from llnl.util.filesystem import edit_in_place_through_temporary_file
 from llnl.util.lang import memoized

 from spack.util.executable import Executable, which

@@ -81,12 +81,11 @@ def fix_darwin_install_name(path):
     Parameters:
         path (str): directory in which .dylib files are located
     """
-    libs = glob.glob(join_path(path, "*.dylib"))
-    for lib in libs:
-        # fix install name first:
-        install_name_tool = Executable("install_name_tool")
-        install_name_tool("-id", lib, lib)
-        otool = Executable("otool")
+    libs = glob.glob(os.path.join(path, "*.dylib"))
+    install_name_tool = Executable("install_name_tool")
+    otool = Executable("otool")
+    for lib in libs:
+        args = ["-id", lib]
         long_deps = otool("-L", lib, output=str).split("\n")
         deps = [dep.partition(" ")[0][1::] for dep in long_deps[2:-1]]
         # fix all dependencies:

@@ -98,5 +97,8 @@ def fix_darwin_install_name(path):
             # but we don't know builddir (nor how symbolic links look
             # in builddir). We thus only compare the basenames.
             if os.path.basename(dep) == os.path.basename(loc):
-                install_name_tool("-change", dep, loc, lib)
+                args.extend(("-change", dep, loc))
                 break
+
+        with edit_in_place_through_temporary_file(lib) as tmp:
+            install_name_tool(*args, tmp)


@@ -447,20 +447,13 @@ def _dump_annotated(handler, data, stream=None):
     return getvalue()

-def sorted_dict(dict_like):
-    """Return an ordered dict with all the fields sorted recursively.
-
-    Args:
-        dict_like (dict): dictionary to be sorted
-
-    Returns:
-        dictionary sorted recursively
-    """
-    result = syaml_dict(sorted(dict_like.items()))
-    for key, value in result.items():
-        if isinstance(value, collections.abc.Mapping):
-            result[key] = sorted_dict(value)
-    return result
+def sorted_dict(data):
+    """Descend into data and sort all dictionary keys."""
+    if isinstance(data, dict):
+        return type(data)((k, sorted_dict(v)) for k, v in sorted(data.items()))
+    elif isinstance(data, (list, tuple)):
+        return type(data)(sorted_dict(v) for v in data)
+    return data

 def extract_comments(data):
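
Unlike the old version, the rewrite also recurses through lists and tuples, so dictionaries nested inside sequences get sorted too. A short demonstration that recursive key sorting makes serialization, and hence hashing, independent of insertion order (plain dicts stand in for the syaml types):

```
import json

def sorted_dict(data):
    if isinstance(data, dict):
        return type(data)((k, sorted_dict(v)) for k, v in sorted(data.items()))
    elif isinstance(data, (list, tuple)):
        return type(data)(sorted_dict(v) for v in data)
    return data

a = {"z": 1, "a": {"d": 0, "c": 1}}
b = {"a": {"c": 1, "d": 0}, "z": 1}  # same content, different insertion order
assert json.dumps(sorted_dict(a)) == json.dumps(sorted_dict(b))
```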


@@ -14,10 +14,10 @@ default:
   image: { "name": "ghcr.io/spack/e4s-ubuntu-18.04:v2021-10-18", "entrypoint": [""] }

 # CI Platform-Arch
-.cray_rhel_zen4:
+.cray_rhel_x86_64_v3:
   variables:
     SPACK_TARGET_PLATFORM: "cray-rhel"
-    SPACK_TARGET_ARCH: "zen4"
+    SPACK_TARGET_ARCH: "x86_64_v3"

 .cray_sles_zen4:
   variables:

@@ -876,7 +876,7 @@ aws-pcluster-build-neoverse_v1:
     - cat /proc/meminfo | grep 'MemTotal\|MemFree' || true

 .generate-cray-rhel:
-  tags: [ "cray-rhel-zen4", "public" ]
+  tags: [ "cray-rhel-x86_64_v3", "public" ]
   extends: [ ".generate-cray" ]

 .generate-cray-sles:

@@ -888,7 +888,7 @@ aws-pcluster-build-neoverse_v1:
 # E4S - Cray RHEL
 #######################################
 .e4s-cray-rhel:
-  extends: [ ".cray_rhel_zen4" ]
+  extends: [ ".cray_rhel_x86_64_v3" ]
   variables:
     SPACK_CI_STACK_NAME: e4s-cray-rhel

@@ -896,7 +896,6 @@ e4s-cray-rhel-generate:
   extends: [ ".generate-cray-rhel", ".e4s-cray-rhel" ]

 e4s-cray-rhel-build:
-  allow_failure: true # libsci_cray.so broken, misses DT_NEEDED for libdl.so
   extends: [ ".build", ".e4s-cray-rhel" ]
   trigger:
     include:

@@ -915,10 +914,10 @@ e4s-cray-rhel-build:
   variables:
     SPACK_CI_STACK_NAME: e4s-cray-sles

-e4s-cray-sles-generate:
+.e4s-cray-sles-generate:
   extends: [ ".generate-cray-sles", ".e4s-cray-sles" ]

-e4s-cray-sles-build:
+.e4s-cray-sles-build:
   allow_failure: true # libsci_cray.so broken, misses DT_NEEDED for libdl.so
   extends: [ ".build", ".e4s-cray-sles" ]
   trigger:


@@ -1,31 +1,27 @@
 compilers:
 - compiler:
-    spec: cce@15.0.1
+    spec: cce@=18.0.0
     paths:
-      cc: cc
-      cxx: CC
-      f77: ftn
-      fc: ftn
+      cc: /opt/cray/pe/cce/18.0.0/bin/craycc
+      cxx: /opt/cray/pe/cce/18.0.0/bin/crayCC
+      f77: /opt/cray/pe/cce/18.0.0/bin/crayftn
+      fc: /opt/cray/pe/cce/18.0.0/bin/crayftn
     flags: {}
     operating_system: rhel8
-    target: any
-    modules:
-    - PrgEnv-cray/8.3.3
-    - cce/15.0.1
-    environment:
-      set:
-        MACHTYPE: x86_64
-- compiler:
-    spec: gcc@11.2.0
-    paths:
-      cc: gcc
-      cxx: g++
-      f77: gfortran
-      fc: gfortran
-    flags: {}
-    operating_system: rhel8
-    target: any
-    modules:
-    - PrgEnv-gnu
-    - gcc/11.2.0
+    target: x86_64
+    modules: []
     environment: {}
+    extra_rpaths: []
+- compiler:
+    spec: gcc@=8.5.0
+    paths:
+      cc: /usr/bin/gcc
+      cxx: /usr/bin/g++
+      f77: /usr/bin/gfortran
+      fc: /usr/bin/gfortran
+    flags: {}
+    operating_system: rhel8
+    target: x86_64
+    modules: []
+    environment: {}
+    extra_rpaths: []


@@ -1,16 +1,15 @@
 packages:
-  # EXTERNALS
   cray-mpich:
     buildable: false
     externals:
-    - spec: cray-mpich@8.1.25 %cce@15.0.1
-      prefix: /opt/cray/pe/mpich/8.1.25/ofi/cray/10.0
+    - spec: cray-mpich@8.1.30 %cce
+      prefix: /opt/cray/pe/mpich/8.1.30/ofi/cray/18.0
       modules:
-      - cray-mpich/8.1.25
+      - cray-mpich/8.1.30
   cray-libsci:
     buildable: false
     externals:
-    - spec: cray-libsci@23.02.1.1 %cce@15.0.1
-      prefix: /opt/cray/pe/libsci/23.02.1.1/CRAY/9.0/x86_64/
+    - spec: cray-libsci@24.07.0 %cce
+      prefix: /opt/cray/pe/libsci/24.07.0/CRAY/18.0/x86_64/
       modules:
-      - cray-libsci/23.02.1.1
+      - cray-libsci/24.07.0


@@ -0,0 +1,4 @@
+ci:
+  pipeline-gen:
+  - build-job:
+      tags: ["cray-rhel-x86_64_v3"]


@@ -1,4 +0,0 @@
-ci:
-  pipeline-gen:
-  - build-job:
-      tags: ["cray-rhel-zen4"]


@@ -10,8 +10,7 @@ spack:
   packages:
     all:
-      prefer:
-      - "%cce"
+      require: "%cce@18.0.0 target=x86_64_v3"
       compiler: [cce]
       providers:
         blas: [cray-libsci]

@@ -19,17 +18,15 @@ spack:
         mpi: [cray-mpich]
         tbb: [intel-tbb]
         scalapack: [netlib-scalapack]
-      target: [zen4]
       variants: +mpi
+    ncurses:
+      require: +termlib ldflags=-Wl,--undefined-version
     tbb:
       require: "intel-tbb"
     binutils:
       variants: +ld +gold +headers +libiberty ~nls
     boost:
       variants: +python +filesystem +iostreams +system
-    cuda:
-      version: [11.7.0]
     elfutils:
       variants: ~nls
       require: "%gcc"

@@ -39,20 +36,14 @@ spack:
       variants: +fortran +hl +shared
     libfabric:
       variants: fabrics=sockets,tcp,udp,rxm
-    libunwind:
-      variants: +pic +xz
     mgard:
       require:
       - "@2023-01-10:"
     mpich:
       variants: ~wrapperrpath
-    ncurses:
-      variants: +termlib
     paraview:
       # Don't build GUI support or GLX rendering for HPC/container deployments
-      require: "@5.11 ~qt ^[virtuals=gl] osmesa"
+      require: "~qt ^[virtuals=gl] osmesa"
-    python:
-      version: [3.8.13]
     trilinos:
       require:
       - one_of: [+amesos +amesos2 +anasazi +aztec +boost +epetra +epetraext +ifpack

@@ -63,12 +54,6 @@ spack:
       - one_of: [~ml ~muelu ~zoltan2 ~teko, +ml +muelu +zoltan2 +teko]
       - one_of: [+superlu-dist, ~superlu-dist]
       - one_of: [+shylu, ~shylu]
-    xz:
-      variants: +pic
-    mesa:
-      version: [21.3.8]
-    unzip:
-      require: "%gcc"

   specs:
   # CPU

@@ -76,62 +61,43 @@ spack:
   - aml
   - arborx
   - argobots
-  - bolt
-  - butterflypack
   - boost +python +filesystem +iostreams +system
   - cabana
-  - caliper
   - chai
-  - charliecloud
   - conduit
-  # - cp2k +mpi # libxsmm: ftn-78 ftn: ERROR in command linel; The -f option has an invalid argument, "tree-vectorize".
   - datatransferkit
   - flecsi
   - flit
-  - flux-core
-  - fortrilinos
   - ginkgo
   - globalarrays
   - gmp
   - gotcha
   - h5bench
   - hdf5-vol-async
-  - hdf5-vol-cache
+  - hdf5-vol-cache cflags=-Wno-error=incompatible-function-pointer-types
   - hdf5-vol-log
   - heffte +fftw
-  - hpx max_cpu_count=512 networking=mpi
   - hypre
   - kokkos +openmp
   - kokkos-kernels +openmp
-  - lammps
   - legion
   - libnrm
-  #- libpressio +bitgrooming +bzip2 ~cuda ~cusz +fpzip +hdf5 +libdistributed +lua +openmp +python +sz +sz3 +unix +zfp +json +remote +netcdf +mgard # mgard:
   - libquo
   - libunwind
   - mercury
   - metall
   - mfem
-  # - mgard +serial +openmp +timing +unstructured ~cuda # mgard
   - mpark-variant
-  - mpifileutils ~xattr
+  - mpifileutils ~xattr cflags=-Wno-error=implicit-function-declaration
   - nccmp
   - nco
-  - netlib-scalapack
+  - netlib-scalapack cflags=-Wno-error=implicit-function-declaration
-  - omega-h
-  - openmpi
   - openpmd-api ^adios2~mgard
-  - papi
   - papyrus
   - pdt
   - petsc
-  - plumed
   - precice
   - pumi
-  - py-h5py +mpi
-  - py-h5py ~mpi
-  - py-libensemble +mpi +nlopt
-  - py-petsc4py
   - qthreads scheduler=distrib
   - raja
   - slate ~cuda

@@ -144,8 +110,7 @@ spack:
   - swig@4.0.2-fortran
   - sz3
   - tasmanian
-  - tau +mpi +python
-  - trilinos@13.0.1 +belos +ifpack2 +stokhos
+  - trilinos +belos +ifpack2 +stokhos
   - turbine
   - umap
   - umpire

@@ -155,27 +120,47 @@ spack:
   # - alquimia # pflotran: petsc-3.19.4-c6pmpdtpzarytxo434zf76jqdkhdyn37/lib/petsc/conf/rules:169: material_aux.o] Error 1: fortran errors
   # - amrex # disabled temporarily pending resolution of unreproducible CI failure
   # - axom # axom: CMake Error at axom/sidre/cmake_install.cmake:154 (file): file INSTALL cannot find "/tmp/gitlab-runner-2/spack-stage/spack-stage-axom-0.8.1-jvol6riu34vuyqvrd5ft2gyhrxdqvf63/spack-build-jvol6ri/lib/fortran/axom_spio.mod": No such file or directory.
+  # - bolt # ld.lld: error: CMakeFiles/bolt-omp.dir/kmp_gsupport.cpp.o: symbol GOMP_atomic_end@@GOMP_1.0 has undefined version GOMP_1.0
   # - bricks # bricks: clang-15: error: clang frontend command failed with exit code 134 (use -v to see invocation)
+  # - butterflypack ^netlib-scalapack cflags=-Wno-error=implicit-function-declaration # ftn-2116 ftn: INTERNAL "driver" was terminated due to receipt of signal 01: Hangup.
+  # - caliper # papi: papi_internal.c:124:3: error: use of undeclared identifier '_papi_hwi_my_thread'; did you mean '_papi_hwi_read'?
+  # - charliecloud # libxcrypt-4.4.35: ld.lld: error: version script assignment of 'XCRYPT_2.0' to symbol 'xcrypt_r' failed: symbol not defined
+  # - cp2k +mpi # libxsmm: ftn-78 ftn: ERROR in command linel; The -f option has an invalid argument, "tree-vectorize".
   # - dealii # llvm@14.0.6: ?; intel-tbb@2020.3: clang-15: error: unknown argument: '-flifetime-dse=1'; assimp@5.2.5: clang-15: error: clang frontend command failed with exit code 134 (use -v to see invocation)
   # - dyninst # requires %gcc
   # - ecp-data-vis-sdk ~cuda ~rocm +adios2 +ascent +cinema +darshan +faodel +hdf5 +paraview +pnetcdf +sz +unifyfs +veloc ~visit +vtkm +zfp ^hdf5@1.14 # llvm@14.0.6: ?;
   # - exaworks # rust: ld.lld: error: relocation R_X86_64_32 cannot be used against local symbol; recompile with -fPIC'; defined in /opt/cray/pe/cce/15.0.1/cce/x86_64/lib/no_mmap.o, referenced by /opt/cray/pe/cce/15.0.1/cce/x86_64/lib/no_mmap.o:(__no_mmap_for_malloc)
+  # - flux-core # libxcrypt-4.4.35: ld.lld: error: version script assignment of 'XCRYPT_2.0' to symbol 'xcrypt_r' failed: symbol not defined
+  # - fortrilinos # trilinos-14.0.0: packages/teuchos/core/src/Teuchos_BigUIntDecl.hpp:67:8: error: no type named 'uint32_t' in namespace 'std'
   # - gasnet # configure error: User requested --enable-ofi but I don't know how to build ofi programs for your system
   # - gptune # py-scipy: meson.build:82:0: ERROR: Unknown compiler(s): [['/home/gitlab-runner-3/builds/dWfnZWPh/0/spack/spack/lib/spack/env/cce/ftn']]
   # - hpctoolkit # dyninst requires %gcc
+  # - hpx max_cpu_count=512 networking=mpi # libxcrypt-4.4.35
+  # - lammps # lammps-20240829.1: Reversed (or previously applied) patch detected! Assume -R? [n]
+  # - libpressio +bitgrooming +bzip2 ~cuda ~cusz +fpzip +hdf5 +libdistributed +lua +openmp +python +sz +sz3 +unix +zfp +json +remote +netcdf +mgard # mgard:
+  # - mgard +serial +openmp +timing +unstructured ~cuda # mgard
   # - nrm # py-scipy: meson.build:82:0: ERROR: Unknown compiler(s): [['/home/gitlab-runner-3/builds/dWfnZWPh/0/spack/spack/lib/spack/env/cce/ftn']]
   # - nvhpc # requires %gcc
+  # - omega-h # trilinos-13.4.1: packages/kokkos/core/src/impl/Kokkos_MemoryPool.cpp:112:48: error: unknown type name 'uint32_t'
+  # - openmpi # libxcrypt-4.4.35: ld.lld: error: version script assignment of 'XCRYPT_2.0' to symbol 'xcrypt_r' failed: symbol not defined
+  # - papi # papi_internal.c:124:3: error: use of undeclared identifier '_papi_hwi_my_thread'; did you mean '_papi_hwi_read'?
   # - parsec ~cuda # parsec: parsec/fortran/CMakeFiles/parsec_fortran.dir/parsecf.F90.o: ftn-2103 ftn: WARNING in command line. The -W extra option is not supported or invalid and will be ignored.
   # - phist # fortran_bindings/CMakeFiles/phist_fort.dir/phist_testing.F90.o: ftn-78 ftn: ERROR in command line. The -f option has an invalid argument, "no-math-errno".
   # - plasma # %cce conflict
+  # - plumed # libxcrypt-4.4.35: ld.lld: error: version script assignment of 'XCRYPT_2.0' to symbol 'xcrypt_r' failed: symbol not defined
+  # - py-h5py +mpi # libxcrypt-4.4.35: ld.lld: error: version script assignment of 'XCRYPT_2.0' to symbol 'xcrypt_r' failed: symbol not defined
+  # - py-h5py ~mpi # libxcrypt-4.4.35: ld.lld: error: version script assignment of 'XCRYPT_2.0' to symbol 'xcrypt_r' failed: symbol not defined
   # - py-jupyterhub # rust: ld.lld: error: relocation R_X86_64_32 cannot be used against local symbol; recompile with -fPIC'; defined in /opt/cray/pe/cce/15.0.1/cce/x86_64/lib/no_mmap.o, referenced by /opt/cray/pe/cce/15.0.1/cce/x86_64/lib/no_mmap.o:(__no_mmap_for_malloc)
+  # - py-libensemble +mpi +nlopt # libxcrypt-4.4.35: ld.lld: error: version script assignment of 'XCRYPT_2.0' to symbol 'xcrypt_r' failed: symbol not defined
+  # - py-petsc4py # libxcrypt-4.4.35: ld.lld: error: version script assignment of 'XCRYPT_2.0' to symbol 'xcrypt_r' failed: symbol not defined
   # - quantum-espresso # quantum-espresso: CMake Error at cmake/FindSCALAPACK.cmake:503 (message): A required library with SCALAPACK API not found. Please specify library
   # - scr # scr: make[2]: *** [examples/CMakeFiles/test_ckpt_F.dir/build.make:112: examples/test_ckpt_F] Error 1: /opt/cray/pe/cce/15.0.1/binutils/x86_64/x86_64-pc-linux-gnu/bin/ld: /opt/cray/pe/mpich/8.1.25/ofi/cray/10.0/lib/libmpi_cray.so: undefined reference to `PMI_Barrier'
   # - strumpack ~slate # strumpack: [test/CMakeFiles/test_HSS_seq.dir/build.make:117: test/test_HSS_seq] Error 1: ld.lld: error: undefined reference due to --no-allow-shlib-undefined: mpi_abort_
+  # - tau +mpi +python # libelf: configure: error: installation or configuration problem: C compiler cannot create executables.; papi: papi_internal.c:124:3: error: use of undeclared identifier '_papi_hwi_my_thread'; did you mean '_papi_hwi_read'?
   # - upcxx # upcxx: configure error: User requested --enable-ofi but I don't know how to build ofi programs for your system
   # - variorum # variorum: /opt/cray/pe/cce/15.0.1/binutils/x86_64/x86_64-pc-linux-gnu/bin/ld: /opt/cray/pe/lib64/libpals.so.0: undefined reference to `json_array_append_new@@libjansson.so.4'
-  # - xyce +mpi +shared +pymi +pymi_static_tpls ^trilinos~shylu # openblas: ftn-2307 ftn: ERROR in command line: The "-m" option must be followed by 0, 1, 2, 3 or 4.; make[2]: *** [<builtin>: spotrf2.o] Error 1; make[1]: *** [Makefile:27: lapacklib] Error 2; make: *** [Makefile:250: netlib] Error 2
   # - warpx +python # py-scipy: meson.build:82:0: ERROR: Unknown compiler(s): [['/home/gitlab-runner-3/builds/dWfnZWPh/0/spack/spack/lib/spack/env/cce/ftn']]
+  # - xyce +mpi +shared +pymi +pymi_static_tpls ^trilinos~shylu # openblas: ftn-2307 ftn: ERROR in command line: The "-m" option must be followed by 0, 1, 2, 3 or 4.; make[2]: *** [<builtin>: spotrf2.o] Error 1; make[1]: *** [Makefile:27: lapacklib] Error 2; make: *** [Makefile:250: netlib] Error 2

   cdash:
     build-group: E4S Cray


@@ -36,7 +36,7 @@ export QA_DIR=$(realpath $QA_DIR)
 cd "$SPACK_ROOT"

 # Run bash tests with coverage enabled, but pipe output to /dev/null
-# because it seems that kcov seems to undo the script's redirection
+# because it seems that kcov undoes the script's redirection
 if [ "$COVERAGE" = true ]; then
     kcov "$SPACK_ROOT/coverage" "$QA_DIR/setup-env-test.sh" &> /dev/null
     kcov "$SPACK_ROOT/coverage" "$QA_DIR/completion-test.sh" &> /dev/null


@@ -0,0 +1,22 @@
+# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
+# Spack Project Developers. See the top-level COPYRIGHT file for details.
+#
+# SPDX-License-Identifier: (Apache-2.0 OR MIT)
+
+from spack.package import *
+
+
+class SpliceDependsOnT(Package):
+    """Package that depends on splice-t"""
+
+    homepage = "http://www.example.com"
+    url = "http://www.example.com/splice-depends-on-t-1.0.tar.gz"
+
+    version("1.0", md5="0123456789abcdef0123456789abcdef")
+
+    depends_on("splice-t")
+
+    def install(self, spec, prefix):
+        with open(prefix.join("splice-depends-on-t"), "w") as f:
+            f.write("splice-depends-on-t: {0}".format(prefix))
+            f.write("splice-t: {0}".format(spec["splice-t"].prefix))


@@ -12,33 +12,36 @@ class GdkPixbuf(MesonPackage):
     GTK+ 2 but it was split off into a separate package in preparation for the change to GTK+ 3."""

     homepage = "https://gitlab.gnome.org/GNOME/gdk-pixbuf"
-    url = "https://ftp.acc.umu.se/pub/gnome/sources/gdk-pixbuf/2.40/gdk-pixbuf-2.40.0.tar.xz"
     git = "https://gitlab.gnome.org/GNOME/gdk-pixbuf"
-    list_url = "https://ftp.acc.umu.se/pub/gnome/sources/gdk-pixbuf/"
+    url = "https://gitlab.gnome.org/GNOME/gdk-pixbuf/-/archive/2.40.0/gdk-pixbuf-2.40.0.tar.gz"
+    # Falling back to the gitlab source since the mirror here seems to be broken
+    # url = "https://ftp.acc.umu.se/pub/gnome/sources/gdk-pixbuf/2.40/gdk-pixbuf-2.40.0.tar.xz"
+    # list_url = "https://ftp.acc.umu.se/pub/gnome/sources/gdk-pixbuf/"
     list_depth = 1

     license("LGPL-2.1-or-later", checked_by="wdconinc")

-    version("2.42.12", sha256="b9505b3445b9a7e48ced34760c3bcb73e966df3ac94c95a148cb669ab748e3c7")
+    version("2.42.12", sha256="d41966831b3d291fcdfe31f683bea4b3f03241d591ddbe550b5db873af3da364")
     # https://nvd.nist.gov/vuln/detail/CVE-2022-48622
     version(
         "2.42.10",
-        sha256="ee9b6c75d13ba096907a2e3c6b27b61bcd17f5c7ebeab5a5b439d2f2e39fe44b",
+        sha256="87a086c51d9705698b22bd598a795efaccf61e4db3a96f439dcb3cd90506dab8",
         deprecated=True,
     )
     version(
         "2.42.9",
-        sha256="28f7958e7bf29a32d4e963556d241d0a41a6786582ff6a5ad11665e0347fc962",
+        sha256="226d950375907857b23c5946ae6d30128f08cd75f65f14b14334c7a9fb686e36",
         deprecated=True,
     )
     version(
         "2.42.6",
-        sha256="c4a6b75b7ed8f58ca48da830b9fa00ed96d668d3ab4b1f723dcf902f78bde77f",
+        sha256="c4f3a84a04bc7c5f4fbd97dce7976ab648c60628f72ad4c7b79edce2bbdb494d",
         deprecated=True,
     )
     version(
         "2.42.2",
-        sha256="83c66a1cfd591d7680c144d2922c5955d38b4db336d7cd3ee109f7bcf9afef15",
+        sha256="249b977279f761979104d7befbb5ee23f1661e29d19a36da5875f3a97952d13f",
         deprecated=True,
     )


@@ -65,9 +65,15 @@ def cmake_args(self):
             if re_qt.match(dep.name):
                 qt_prefix_path.append(self.spec[dep.name].prefix)

-        # Now append all qt-* dependency prefixex into a prefix path
+        # Now append all qt-* dependency prefixes into a prefix path
         args.append(self.define("QT_ADDITIONAL_PACKAGES_PREFIX_PATH", ":".join(qt_prefix_path)))

+        # Make our CMAKE_INSTALL_RPATH redundant:
+        # for prefix of current package ($ORIGIN/../lib type of rpaths),
+        args.append(self.define("QT_DISABLE_RPATH", True))
+        # for prefixes of dependencies
+        args.append(self.define("QT_NO_DISABLE_CMAKE_INSTALL_RPATH_USE_LINK_PATH", True))
+
         return args

     @run_after("install")