Compare commits

...

34 Commits

Author SHA1 Message Date
Harmen Stoppels
2bfcc69fa8 Set version to v0.23.1 2025-02-19 16:04:31 +01:00
kwryankrattiger
af62d91457 gdk-pixbuf: Point at gitlab instead of broken mirror (#47825) 2025-02-19 14:37:47 +01:00
Harmen Stoppels
f32a74491e spec.py: ensure spec.extra_attributes is {} if is null in json (#48896) 2025-02-19 14:37:47 +01:00
Harmen Stoppels
4f80f07b9a spec.py: fix hash change due to None vs {} (#48854)
* Fix hash change due to None vs {}

* Enforce null for empty list of external_modules
2025-02-19 14:37:47 +01:00
eugeneswalker
4a8ae59a9e e4s cray rhel ci stack: re-enable and update for new cpe (#47697)
* e4s cray rhel ci stack: re-enable and update for new cpe, should fix cray libsci issue

* only run e4s-cray-rhel stack

* Make Autotools build_system point at correct build_directory

* remove selective enable of cray-rhel stacks

* restore SPACK_CI_DISABLE_STACKS

* use dot prefix to hide cray-sles jobs instead of comment-out

---------

Co-authored-by: Ryan Krattiger <ryan.krattiger@kitware.com>
2025-02-19 14:37:47 +01:00
Harmen Stoppels
6f7e881b69 fix year dependent license verification check 2025-02-19 14:37:47 +01:00
Till Ehrengruber
24bcaec6c3 oci/opener.py: respect system proxy settings (#48783) 2025-02-19 14:37:47 +01:00
Massimiliano Culpo
1c9bb36fdf spec.py: fix ArchSpec.intersects (#48741)
Fixes a bug where `x86_64:` and `ppc64le:` intersected, and `x86_64:` and `:haswell` did not.
2025-02-19 14:37:47 +01:00
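A hedged illustration of the corrected behavior (the spec strings below are
invented for this example, not taken from Spack's test suite):

```python
from spack.spec import Spec

# Open-ended target ranges in different families must not intersect:
assert not Spec("target=x86_64:").intersects(Spec("target=ppc64le:"))
# A lower-bounded and an upper-bounded range in the same family do
# (haswell is an x86_64 microarchitecture):
assert Spec("target=x86_64:").intersects(Spec("target=:haswell"))
```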
Harmen Stoppels
044d1b12bb autotools.py: set lt_cv_apple_cc_single_mod=yes (#48671)
Since macOS 15 `ld -single_module` warns with a deprecation message,
which makes configure scripts believe the flag is unsupported. That
in turn triggers a code path where `archive_cmds` is set to

```
$CC -r -keep_private_externs -nostdlib ... -dynamiclib
```

instead of just

```
$CC -dynamiclib ...
```

This code path was meant to trigger only on ancient macOS <= 10.4 where
libtool had to add `-single_module`, which is the default since macOS
10.4, and is now apparently deprecated because the flag is a no-op for
more than 15 years.

The wrong `archive_cmds` causes actual problems combined with a bug in
OpenMPI's compiler wrapper (`CC=mpicc`), which appends `-rpath` flags,
which cause an error when combined with the `-r` flag added by the
autotools.

Spack's compiler wrapper doesn't do this, but it's likely there are
other compiler wrappers out there that are not aware that `-r` and
`-rpath` cannot be combined.

The fix is to change defaults: `lt_cv_apple_cc_single_mod=yes`.
2025-02-19 14:37:47 +01:00
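Outside of Spack, the same workaround can be sketched by pre-seeding libtool's
configure cache variable, assuming a configure script generated with
libtool < 2.5.4:

```console
$ ./configure lt_cv_apple_cc_single_mod=yes
```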
Harmen Stoppels
536468783d spec.py: make hashing of extra_attributes order independent (#48615) 2025-02-19 14:37:47 +01:00
Harmen Stoppels
a298296237 relocate_text: fix return value (#48568) 2025-02-19 14:37:47 +01:00
Harmen Stoppels
1db11ff06b spec.py: fix return type of concretized() (#48504) 2025-02-19 14:37:47 +01:00
Harmen Stoppels
b621c045a7 cmd/__init__.py: pass tests in case n=1 (#48417) 2025-02-19 14:37:47 +01:00
Wouter Deconinck
734ada1f10 doc: ensure getting_started has bootstrap list output in correct place (#48281) 2025-02-19 14:37:47 +01:00
psakievich
71712465e8 Ensure command_line scope is always last (#48255) 2025-02-19 14:37:47 +01:00
Massimiliano Culpo
7cc67b5c23 bootstrap mirror: fix references from v0.4 to v0.6 (#48235) 2025-02-19 14:37:47 +01:00
Harmen Stoppels
1f9b0e39b9 llnl.util.lang: remove testing literal backtrace output (#48209) 2025-02-19 14:37:47 +01:00
Harmen Stoppels
d012683c1b docs: advertise --oci-username-variable and --oci-password-variable (#48189) 2025-02-19 14:37:47 +01:00
Todd Gamblin
8b3c3e9165 Make unit tests work on ubuntu 24.04 (#48151)
`kcov` was removed in Ubuntu 24.04, and it is no longer
installable via `apt` in our CI images. Install it via
Linuxbrew instead, at least until it comes back to Ubuntu.

`subversion` is also not installed on ubuntu 24 by default,
so we have to install it manually.

- [x] Add linuxbrew to linux tests
- [x] Install `kcov` with brew
- [x] Install subversion with `apt`

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2025-02-19 14:37:47 +01:00
Harmen Stoppels
dd94a44b6a gha: fix git safe.directory config (#48041) 2025-02-19 14:37:47 +01:00
Harmen Stoppels
95d190e354 filter_file: make tempfile later (#48108)
* filter_file: make tempfile later

* also add a `.` after the filename
2025-02-19 14:37:47 +01:00
Harmen Stoppels
f124409d8a filter_file: fix various bugs (#48038)
* `f.tell` on a `TextIOWrapper` does not return the offset in bytes, but
  an opaque integer that can only be used for `f.seek` on the same
  object. Spack assumes it's a byte offset.
* Do not open in a locale dependent way, but assume utf-8 (and allow
  users to override that)
* Use tempfile to generate a backup/temporary file in a safe way
* Comparison between None and str is valid and on purpose.
2025-02-19 14:37:47 +01:00
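The first bullet is easy to demonstrate in isolation; a minimal sketch (the
file name is hypothetical):

```python
# tell() on a text-mode file returns an opaque cookie, not a byte offset;
# it is only meaningful as an argument to seek() on the same object.
with open("build.log", "r", encoding="utf-8") as f:
    f.readline()
    cookie = f.tell()  # opaque cookie
    f.seek(cookie)     # valid: same TextIOWrapper
# Treating `cookie` as a byte offset into the file (e.g. seeking a separate
# binary handle to it) is the bug this commit fixes.
```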
Harmen Stoppels
bee2132c04 log.py: improve utf-8 handling, and non-utf-8 output (#48005) 2025-02-19 14:37:47 +01:00
Harmen Stoppels
d0ea33fa67 stage.py: improve path to url (#47898) 2025-02-19 14:37:47 +01:00
Carson Woods
4e913876b9 bug fix: updated warning message to reflect impending v1.0 release (#47887) 2025-02-19 14:37:47 +01:00
Harmen Stoppels
c25e43ce61 darwin: preserve hardlinks on codesign/install_name_tool (#47808) 2025-02-19 14:37:47 +01:00
Wouter Deconinck
85146d875b qt-base: fix rpath for dependents (#47424)
ensure that CMAKE_INSTALL_RPATH_USE_LINK_PATH=ON works in qt packages.
2025-02-19 14:37:47 +01:00
Harmen Stoppels
c8167eec5d Set version to 0.23.1.dev0 2025-02-19 14:37:47 +01:00
Gregory Becker
c6d4037758
update version number to 0.23.0 2024-11-17 00:59:27 -05:00
Gregory Becker
08f1cf9ae2
Update CHANGELOG.md for v0.23.0 2024-11-17 00:59:27 -05:00
Todd Gamblin
48dfa3c95e Spec: prefer a splice-specific method to __len__ (#47585)
Automatic splicing saw `Spec` grow a `__len__` method, but it's only used
in one place, and it's not clear the semantics are useful elsewhere. It also
runs the risk of Specs one day being confused for other types of containers.

Rather than introduce a new function for one algorithm, let's use a more
specific method in the splice code.

- [x] Use topological ordering in `_resolve_automatic_splices` instead of 
      sorting by node count
- [x] delete `Spec.__len__()` and `Spec.__bool__()`

---------

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
Co-authored-by: Greg Becker <becker33@llnl.gov>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-11-14 08:24:06 +01:00
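A paraphrase of the new ordering (see the `_resolve_automatic_splices` hunk
later on this page; this sketch is illustrative, not verbatim):

```python
# Build a children-before-parents rank from a reverse topological order,
# so every dependency is spliced before its dependents.
topo = list(traverse.traverse_nodes(specs, order="topo", key=traverse.by_dag_hash))
rank = {s.dag_hash(): i for i, s in enumerate(reversed(topo))}
for node, spec in sorted(self._specs.items(), key=lambda x: rank[x[1].dag_hash()]):
    ...  # apply splices for this node
```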
Todd Gamblin
e5c411d8f0 spack spec: simplify and unify output (#47574)
`spack spec` output has looked like this for a while:

```console
> spack spec /v5fn6xo /wd2p2v7
Input spec
--------------------------------
 -   /v5fn6xo

Concretized
--------------------------------
[+]  openssl@3.3.1%apple-clang@16.0.0~docs+shared build_system=generic certs=mozilla arch=darwin-sequoia-m1
[+]      ^ca-certificates-mozilla@2023-05-30%apple-clang@16.0.0 build_system=generic arch=darwin-sequoia-m1
...

Input spec
--------------------------------
 -   /wd2p2v7

Concretized
--------------------------------
[+]  py-six@1.16.0%apple-clang@16.0.0 build_system=python_pip arch=darwin-sequoia-m1
[+]      ^py-pip@23.1.2%apple-clang@16.0.0 build_system=generic arch=darwin-sequoia-m1
```

But the input spec is right there on the CLI, and it doesn't add anything to the output.
Also, since #44843, specs concretized in the CLI line can be unified, so it makes sense
to display them as we did in #44489 -- as one multi-root tree instead of as multiple
single-root trees.

With this PR, concretize output now looks like this:

```console
> spack spec /v5fn6xo /wd2p2v7
[+]  openssl@3.3.1%apple-clang@16.0.0~docs+shared build_system=generic certs=mozilla arch=darwin-sequoia-m1
[+]      ^ca-certificates-mozilla@2023-05-30%apple-clang@16.0.0 build_system=generic arch=darwin-sequoia-m1
[+]      ^gmake@4.4.1%apple-clang@16.0.0~guile build_system=generic arch=darwin-sequoia-m1
[+]      ^perl@5.40.0%apple-clang@16.0.0+cpanm+opcode+open+shared+threads build_system=generic arch=darwin-sequoia-m1
[+]          ^berkeley-db@18.1.40%apple-clang@16.0.0+cxx~docs+stl build_system=autotools patches=26090f4,b231fcc arch=darwin-sequoia-m1
[+]          ^bzip2@1.0.8%apple-clang@16.0.0~debug~pic+shared build_system=generic arch=darwin-sequoia-m1
[+]              ^diffutils@3.10%apple-clang@16.0.0 build_system=autotools arch=darwin-sequoia-m1
[+]                  ^libiconv@1.17%apple-clang@16.0.0 build_system=autotools libs=shared,static arch=darwin-sequoia-m1
[+]          ^gdbm@1.23%apple-clang@16.0.0 build_system=autotools arch=darwin-sequoia-m1
[+]              ^readline@8.2%apple-clang@16.0.0 build_system=autotools patches=bbf97f1 arch=darwin-sequoia-m1
[+]                  ^ncurses@6.5%apple-clang@16.0.0~symlinks+termlib abi=none build_system=autotools patches=7a351bc arch=darwin-sequoia-m1
[+]                      ^pkgconf@2.2.0%apple-clang@16.0.0 build_system=autotools arch=darwin-sequoia-m1
[+]      ^zlib-ng@2.2.1%apple-clang@16.0.0+compat+new_strategies+opt+pic+shared build_system=autotools arch=darwin-sequoia-m1
[+]          ^gnuconfig@2022-09-17%apple-clang@16.0.0 build_system=generic arch=darwin-sequoia-m1
[+]  py-six@1.16.0%apple-clang@16.0.0 build_system=python_pip arch=darwin-sequoia-m1
[+]      ^py-pip@23.1.2%apple-clang@16.0.0 build_system=generic arch=darwin-sequoia-m1
[+]      ^py-setuptools@69.2.0%apple-clang@16.0.0 build_system=generic arch=darwin-sequoia-m1
[-]      ^py-wheel@0.41.2%apple-clang@16.0.0 build_system=generic arch=darwin-sequoia-m1
...
```

No input spec is displayed -- just the concretization output, shown as one
consolidated tree with multiple roots.

- [x] remove "Input Spec" section and "Concretized" header from `spack spec` output
- [x] print concretized specs as one BFS tree instead of multiple

---------

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2024-11-14 08:16:16 +01:00
psakievich
020e30f3e6 Update tutorial version (#47593) 2024-11-14 08:15:37 +01:00
Harmen Stoppels
181c404af5 missing and redundant imports (#47577) 2024-11-13 13:03:29 +01:00
53 changed files with 993 additions and 476 deletions

View File

@ -52,7 +52,13 @@ jobs:
# Needed for unit tests
sudo apt-get -y install \
coreutils cvs gfortran graphviz gnupg2 mercurial ninja-build \
cmake bison libbison-dev kcov
cmake bison libbison-dev subversion
# On ubuntu 24.04, kcov was removed. It may come back in some future Ubuntu
- name: Set up Homebrew
id: set-up-homebrew
uses: Homebrew/actions/setup-homebrew@40e9946c182a64b3db1bf51be0dcb915f7802aa9
- name: Install kcov with brew
run: "brew install kcov"
- name: Install Python packages
run: |
pip install --upgrade pip setuptools pytest pytest-xdist pytest-cov
@ -99,7 +105,13 @@ jobs:
run: |
sudo apt-get -y update
# Needed for shell tests
sudo apt-get install -y coreutils kcov csh zsh tcsh fish dash bash
sudo apt-get install -y coreutils csh zsh tcsh fish dash bash subversion
# On ubuntu 24.04, kcov was removed. It may come back in some future Ubuntu
- name: Set up Homebrew
id: set-up-homebrew
uses: Homebrew/actions/setup-homebrew@40e9946c182a64b3db1bf51be0dcb915f7802aa9
- name: Install kcov with brew
run: "brew install kcov"
- name: Install Python packages
run: |
pip install --upgrade pip setuptools pytest coverage[toml] pytest-xdist
@ -134,7 +146,7 @@ jobs:
- name: Setup repo and non-root user
run: |
git --version
git config --global --add safe.directory /__w/spack/spack
git config --global --add safe.directory '*'
git fetch --unshallow
. .github/workflows/bin/setup_git.sh
useradd spack-test

View File

@ -74,7 +74,7 @@ jobs:
- name: Setup repo and non-root user
run: |
git --version
git config --global --add safe.directory /__w/spack/spack
git config --global --add safe.directory '*'
git fetch --unshallow
. .github/workflows/bin/setup_git.sh
useradd spack-test

View File

@ -1,3 +1,395 @@
# v0.23.1 (2025-02-19)
## Bugfixes
- Fix a correctness issue of `ArchSpec.intersects` (#48741)
- Make `extra_attributes` order independent in Spec hashing (#48615, #48854)
- Fix issue where system proxy settings were not respected in OCI build caches (#48783)
- Fix an issue where the `--test` concretizer flag was not forwarded correctly (#48417)
- Fix an issue where `codesign` and `install_name_tool` would not preserve hardlinks on
Darwin (#47808)
- Fix an issue on Darwin where codesign would run on unmodified binaries (#48568)
- Patch configure scripts generated with libtool < 2.5.4, to avoid redundant flags when
creating shared libraries on Darwin (#48671)
- Fix issue related to mirror URL paths on Windows (#47898)
- Ensure proper UTF-8 encoding/decoding in logging (#48005)
- Fix issues related to `filter_file` (#48038, #48108)
- Fix issue related to creating bootstrap source mirrors (#48235)
- Fix issue where command line config arguments were not always top level (#48255)
- Fix an incorrect typehint of `concretized()` (#48504)
- Improve mention of next Spack version in warning (#47887)
- Tests: fix forward compatibility with Python 3.13 (#48209)
- Docs: encourage use of `--oci-username-variable` and `--oci-password-variable` (#48189)
- Docs: ensure Getting Started has bootstrap list output in correct place (#48281)
- CI: allow GitHub actions to run on forks of Spack with different project name (#48041)
- CI: make unit tests work on Ubuntu 24.04 (#48151)
- CI: re-enable cray pipelines (#47697)
## Package updates
- `qt-base`: fix rpath for dependents (#47424)
- `gdk-pixbuf`: fix outdated URL (#47825)
# v0.23.0 (2024-11-13)
`v0.23.0` is a major feature release.
We are planning to make this the last major release before Spack `v1.0`
in June 2025. Alongside `v0.23`, we will be making pre-releases (alpha,
beta, etc.) of `v1.0`, and we encourage users to try them and send us
feedback, either on GitHub or on Slack. You can track the road to
`v1.0` here:
* https://github.com/spack/spack/releases
* https://github.com/spack/spack/discussions/30634
## Features in this Release
1. **Language virtuals**
Your packages can now explicitly depend on the languages they require.
Historically, Spack has considered C, C++, and Fortran compiler
dependencies to be implicit. In `v0.23`, you should ensure that
new packages add relevant C, C++, and Fortran dependencies like this:
```python
depends_on("c", type="build")
depends_on("cxx", type="build")
depends_on("fortran", type="build")
```
We encourage you to add these annotations to your packages now, to prepare
for Spack `v1.0.0`. In `v1.0.0`, these annotations will be necessary for
your package to use C, C++, and Fortran compilers. Note that you should
*not* add language dependencies to packages that don't need them, e.g.,
pure python packages.
We have already auto-generated these dependencies for packages in the
`builtin` repository (see #45217), based on the types of source files
present in each package's source code. We *may* have added too many or too
few language dependencies, so please submit pull requests to correct
packages if you find that the language dependencies are incorrect.
Note that we have also backported support for these dependencies to
`v0.21.3` and `v0.22.2`, to make all of them forward-compatible with
`v0.23`. This should allow you to move easily between older and newer Spack
releases without breaking your packages.
2. **Spec splicing**
We are working to make binary installation more seamless in Spack. `v0.23`
introduces "splicing", which allows users to deploy binaries using local,
optimized versions of a binary interface, even if they were not built with
that interface. For example, this would allow you to build binaries in the
cloud using `mpich` and install them on a system using a local, optimized
version of `mvapich2` *without rebuilding*. Spack preserves full provenance
for the installed packages and knows that they were built one way but
deployed another.
Our intent is to leverage this across many key HPC binary packages,
e.g. MPI, CUDA, ROCm, and libfabric.
Fundamentally, splicing allows Spack to redeploy an existing spec with
different dependencies than how it was built. There are two interfaces to
splicing.
a. Explicit Splicing
#39136 introduced the explicit splicing interface. In the
concretizer config, you can specify a target spec and a replacement
by hash.
```yaml
concretizer:
splice:
explicit:
- target: mpi
replacement: mpich/abcdef
```
Here, every installation that would normally use the target spec will
instead use its replacement. Above, any spec using *any* `mpi` will be
spliced to depend on the specific `mpich` installation requested. This
*can* go wrong if you try to replace something built with, e.g.,
`openmpi` with `mpich`, and it is on the user to ensure ABI
compatibility between target and replacement specs. This currently
requires some expertise to use, but it will allow users to reuse the
binaries they create across more machines and environments.
b. Automatic Splicing (experimental)
#46729 introduced automatic splicing. In the concretizer config, enable
automatic splicing:
```yaml
concretizer:
splice:
automatic: true
```
or run:
```console
spack config add concretizer:splice:automatic:true
```
The concretizer will select splices for ABI compatibility to maximize
package reuse. Packages can denote ABI compatibility using the
`can_splice` directive. No packages in Spack yet use this directive, so
if you want to use this feature you will need to add `can_splice`
annotations to your packages. We are working on ways to add more ABI
compatibility information to the Spack package repository, and this
directive may change in the future.
See the documentation for more details:
* https://spack.readthedocs.io/en/latest/build_settings.html#splicing
* https://spack.readthedocs.io/en/latest/packaging_guide.html#specifying-abi-compatibility
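A hypothetical `package.py` excerpt showing the directive (the package,
versions, and target spec are invented; no builtin package uses `can_splice`
yet, and the directive may change):

```python
class Mvapich2(Package):
    version("2.3.7")

    # Declare that mvapich2@2.3.7 can be spliced in for a spec that was
    # built against mpich@4.1 (i.e. the two are ABI-compatible).
    can_splice("mpich@4.1", when="@2.3.7")
```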
3. Broader variant propagation
Since #42931, you can specify propagated variants like `hdf5
build_type==RelWithDebInfo` or `trilinos ++openmp` to propagate a variant
to all dependencies for which it is relevant. This is valid *even* if the
variant does not exist on the package or its dependencies.
See https://spack.readthedocs.io/en/latest/basic_usage.html#variants.
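For example (a sketch; the packages only illustrate the syntax):

```console
$ spack spec hdf5 build_type==RelWithDebInfo   # `==` propagates the variant
$ spack spec trilinos ++openmp                 # `++` propagates a boolean variant
```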
4. Query specs by namespace
#45416 allows a package's namespace (indicating the repository it came from)
to be treated like a variant. You can request packages from particular repos
like this:
```console
spack find zlib namespace=builtin
spack find zlib namespace=myrepo
```
Previously, the spec syntax only allowed namespaces to be prefixes of spec
names, e.g. `builtin.zlib`. The previous syntax still works.
5. `spack spec` respects environment settings and `unify:true`
`spack spec` did not previously respect environment lockfiles or
unification settings, which made it difficult to see exactly how a spec
would concretize within an environment. Now it does, so the output you get
with `spack spec` will be *the same* as what your environment will
concretize to when you run `spack concretize`. Similarly, if you provide
multiple specs on the command line with `spack spec`, it will concretize
them together if `unify:true` is set.
See #47556 and #44843.
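A sketch of the new behavior (environment and package names are hypothetical):

```console
$ spack -e myenv spec hdf5 zlib   # with unify:true, both are concretized together,
                                  # respecting myenv's lockfile and configuration
```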
6. Less noisy `spack spec` output
`spack spec` previously showed output like this:
```console
> spack spec /v5fn6xo
Input spec
--------------------------------
- /v5fn6xo
Concretized
--------------------------------
[+] openssl@3.3.1%apple-clang@16.0.0~docs+shared arch=darwin-sequoia-m1
...
```
But the input spec is redundant, and we know we run `spack spec` to concretize
the input spec. `spack spec` now *only* shows the concretized spec. See #47574.
7. Better output for `spack find -c`
In an environment, `spack find -c` lets you search the concretized, but not
yet installed, specs, just as you would the installed ones. As with `spack
spec`, this should make it easier for you to see what *will* be built
before building and installing it. See #44713.
8. `spack -C <env>`: use an environment's configuration without activation
Spack environments allow you to associate:
1. a set of (possibly concretized) specs, and
2. configuration
When you activate an environment, you're using both of these. Previously, we
supported:
* `spack -e <env>` to run spack in the context of a specific environment, and
* `spack -C <directory>` to run spack using a directory with configuration files.
You can now also pass an environment to `spack -C` to use *only* the environment's
configuration, but not the specs or lockfile. See #45046.
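A minimal usage sketch (the environment path is hypothetical):

```console
$ spack -C ~/myproject/env spec zlib   # env's config applies; its specs/lockfile do not
```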
## New commands, options, and directives
* The new `spack env track` command (#41897) takes a non-managed Spack
environment and adds a symlink to Spack's `$environments_root` directory, so
that it will be included for reference counting for commands like `spack
uninstall` and `spack gc`. If you use free-standing directory environments,
this is useful for preventing Spack from removing things required by your
environments. You can undo this tracking with the `spack env untrack`
command.
* Add `-t` short option for `spack --backtrace` (#47227)
`spack -d / --debug` enables backtraces on error, but it can be very
verbose, and sometimes you just want the backtrace. `spack -t / --backtrace`
provides that option.
* `gc`: restrict to specific specs (#46790)
If you only want to garbage-collect specific packages, you can now provide
them on the command line. This gives users finer-grained control over what
is uninstalled.
* oci buildcaches now support `--only=package`. You can now push *just* a
package and not its dependencies to an OCI registry. This allows dependents
of non-redistributable specs to be stored in OCI registries without an
error. See #45775.
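Hedged usage sketches for the commands and options above (paths, spec names,
and mirror names are hypothetical):

```console
$ spack env track ~/projects/my-env    # register a free-standing environment
$ spack env untrack my-env             # stop tracking it again
$ spack -t install zlib                # backtrace on error, without full -d output
$ spack gc zlib                        # garbage-collect only specs related to zlib
$ spack buildcache push --only=package my_registry zlib   # push zlib, not its deps
```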
## Notable refactors
* Variants are now fully conditional
The `variants` dictionary on packages was previously keyed by variant name,
and allowed only one definition of any given variant. Spack is now smart
enough to understand that variants may have different values and defaults
for different versions. For example, `warpx` prior to `23.06` only supported
builds for one dimensionality, and newer `warpx` versions could be built
with support for many different dimensions:
```python
variant(
"dims",
default="3",
values=("1", "2", "3", "rz"),
multi=False,
description="Number of spatial dimensions",
when="@:23.05",
)
variant(
"dims",
default="1,2,rz,3",
values=("1", "2", "3", "rz"),
multi=True,
description="Number of spatial dimensions",
when="@23.06:",
)
```
Previously, the default for the old version of `warpx` was not respected and
had to be specified manually. Now, Spack will select the right variant
definition for each version at concretization time. This allows variants to
evolve more smoothly over time. See #44425 for details.
## Highlighted bugfixes
1. Externals no longer override the preferred provider (#45025).
External definitions could interfere with package preferences. Now, if
`openmpi` is the preferred `mpi`, and an external `mpich` is defined, a new
`openmpi` *will* be built if building it is possible. Previously we would
prefer `mpich` despite the preference.
2. Composable `cflags` (#41049).
This release fixes a longstanding bug where concretization would fail if
different `cflags` were specified in `packages.yaml`,
`compilers.yaml`, or on the CLI. Flags and their ordering are now tracked
in the concretizer and flags from multiple sources will be merged.
3. Fix concretizer Unification for included environments (#45139).
## Deprecations, removals, and syntax changes
1. The old concretizer has been removed from Spack, along with the
`config:concretizer` config option. Spack will emit a warning if the option
is present in user configuration, since it now has no effect. Spack now
uses a simpler bootstrapping mechanism, where a JSON prototype is tweaked
slightly to get an initial concrete spec to download. See #45215.
2. Best-effort expansion of spec matrices has been removed. This feature did
not work with the "new" ASP-based concretizer, and did not work with
`unify: True` or `unify: when_possible`. Use the
[exclude key](https://spack.readthedocs.io/en/latest/environments.html#spec-matrices)
for the environment to exclude invalid components, or use multiple spec
matrices to combine the list of specs for which the constraint is valid and
the list of specs for which it is not. See #40792.
3. The old Cray `platform` (based on Cray PE modules) has been removed, and
`platform=cray` is no longer supported. Since `v0.19`, Spack has handled
Cray machines like Linux clusters with extra packages, and we have
encouraged using this option to support Cray. The new approach allows us to
correctly handle Cray machines with non-SLES operating systems, and it is
much more reliable than making assumptions about Cray modules. See the
`v0.19` release notes and #43796 for more details.
4. The `config:install_missing_compilers` config option has been deprecated,
and it is a no-op when set in `v0.23`. Our new compiler dependency model
will replace it with a much more reliable and robust mechanism in `v1.0`.
See #46237.
5. Config options that were deprecated in `v0.21` have been removed in `v0.23`. You
can now only specify preferences for `compilers`, `targets`, and
`providers` globally via the `packages:all:` section. Similarly, you can
only specify `versions:` locally for a specific package. See #44061 and
#31261 for details.
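A sketch of the remaining global preference syntax (values are illustrative):

```yaml
packages:
  all:
    compiler: [gcc@12.3.0]
    target: [x86_64_v3]
    providers:
      mpi: [openmpi]
```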
6. Spack's old test interface has been removed (#45752), having been
deprecated in `v0.22.0` (#34236). All `builtin` packages have been updated
to use the new interface. See the [stand-alone test documentation](
https://spack.readthedocs.io/en/latest/packaging_guide.html#stand-alone-tests).
7. The `spack versions --safe-only` option, deprecated since `v0.21.0`, has
been removed. See #45765.
* The `--dependencies` and `--optimize` arguments to `spack ci` have been
deprecated. See #45005.
## Binary caches
1. Public binary caches now include an ML stack for Linux/aarch64 (#39666). We
now build an ML stack for Linux/aarch64 for all pull requests and on
develop. The ML stack includes both CPU-only and CUDA builds for Horovod,
Hugging Face, JAX, Keras, PyTorch, scikit-learn, TensorBoard, and
TensorFlow, and related packages. The CPU-only stack also includes XGBoost.
See https://cache.spack.io/tag/develop/?stack=ml-linux-aarch64-cuda.
2. There is also now a stack of developer tools for macOS (#46910), which is
analogous to the Linux devtools stack. You can use this to avoid building
many common build dependencies. See
https://cache.spack.io/tag/develop/?stack=developer-tools-darwin.
## Architecture support
* archspec has been updated to `v0.2.5`, with support for `zen5`
* Spack's CUDA package now supports the Grace Hopper `9.0a` compute capability (#45540)
## Windows
* Windows bootstrapping: `file` and `gpg` (#41810)
* `scripts` directory added to PATH on Windows for python extensions (#45427)
* Fix `spack load --list` and `spack unload` on Windows (#35720)
## Other notable changes
* Bugfix: `spack find -x` in environments (#46798)
* Spec splices are now robust to duplicate nodes with the same name in a spec (#46382)
* Cache per-compiler libc calculations for performance (#47213)
* Fixed a bug in external detection for openmpi (#47541)
* Mirror configuration allows username/password as environment variables (#46549)
* Default library search caps maximum depth (#41945)
* Unify interface for `spack spec` and `spack solve` commands (#47182)
* Spack no longer RPATHs directories in the default library search path (#44686)
* Improved performance of Spack database (#46554)
* Enable package reuse for packages with versions from git refs (#43859)
* Improved handling for `uuid` virtual on macos (#43002)
* Improved tracking of task queueing/requeueing in the installer (#46293)
## Spack community stats
* Over 2,000 pull requests updated package recipes
* 8,307 total packages, 329 new since `v0.22.0`
* 140 new Python packages
* 14 new R packages
* 373 people contributed to this release
* 357 committers to packages
* 60 committers to core
# v0.22.2 (2024-09-21)
## Bugfixes

View File

@ -265,25 +265,30 @@ infrastructure, or to cache Spack built binaries in Github Actions and
GitLab CI.
To get started, configure an OCI mirror using ``oci://`` as the scheme,
and optionally specify a username and password (or personal access token):
and optionally specify variables that hold the username and password (or
personal access token) for the registry:
.. code-block:: console
$ spack mirror add --oci-username username --oci-password password my_registry oci://example.com/my_image
$ spack mirror add --oci-username-variable REGISTRY_USER \
--oci-password-variable REGISTRY_TOKEN \
my_registry oci://example.com/my_image
Spack follows the naming conventions of Docker, with Dockerhub as the default
registry. To use Dockerhub, you can omit the registry domain:
.. code-block:: console
$ spack mirror add --oci-username username --oci-password password my_registry oci://username/my_image
$ spack mirror add ... my_registry oci://username/my_image
From here, you can use the mirror as any other build cache:
.. code-block:: console
$ export REGISTRY_USER=...
$ export REGISTRY_TOKEN=...
$ spack buildcache push my_registry <specs...> # push to the registry
$ spack install <specs...> # install from the registry
$ spack install <specs...> # or install from the registry
A unique feature of buildcaches on top of OCI registries is that it's incredibly
easy to generate a runnable container image with the binaries installed. This

View File

@ -38,9 +38,11 @@ just have to configure an OCI registry and run ``spack buildcache push``.
spack -e . install
# Configure the registry
spack -e . mirror add --oci-username ... --oci-password ... container-registry oci://example.com/name/image
spack -e . mirror add --oci-username-variable REGISTRY_USER \
--oci-password-variable REGISTRY_TOKEN \
container-registry oci://example.com/name/image
# Push the image
# Push the image (do set REGISTRY_USER and REGISTRY_TOKEN)
spack -e . buildcache push --update-index --base-image ubuntu:22.04 --tag my_env container-registry
The resulting container image can then be run as follows:

View File

@ -148,20 +148,22 @@ The first time you concretize a spec, Spack will bootstrap automatically:
--------------------------------
zlib@1.2.13%gcc@9.4.0+optimize+pic+shared build_system=makefile arch=linux-ubuntu20.04-icelake
The default bootstrap behavior is to use pre-built binaries. You can verify the
active bootstrap repositories with:
.. command-output:: spack bootstrap list
If for security concerns you cannot bootstrap ``clingo`` from pre-built
binaries, you have to disable fetching the binaries we generated with Github Actions.
.. code-block:: console
$ spack bootstrap disable github-actions-v0.4
==> "github-actions-v0.4" is now disabled and will not be used for bootstrapping
$ spack bootstrap disable github-actions-v0.3
==> "github-actions-v0.3" is now disabled and will not be used for bootstrapping
You can verify that the new settings are effective with:
.. command-output:: spack bootstrap list
$ spack bootstrap disable github-actions-v0.6
==> "github-actions-v0.6" is now disabled and will not be used for bootstrapping
$ spack bootstrap disable github-actions-v0.5
==> "github-actions-v0.5" is now disabled and will not be used for bootstrapping
You can verify that the new settings are effective with ``spack bootstrap list``.
.. note::

View File

@ -24,6 +24,7 @@
Callable,
Deque,
Dict,
Generator,
Iterable,
List,
Match,
@ -300,35 +301,32 @@ def filter_file(
ignore_absent: bool = False,
start_at: Optional[str] = None,
stop_at: Optional[str] = None,
encoding: Optional[str] = "utf-8",
) -> None:
r"""Like sed, but uses python regular expressions.
Filters every line of each file through regex and replaces the file
with a filtered version. Preserves mode of filtered files.
Filters every line of each file through regex and replaces the file with a filtered version.
Preserves mode of filtered files.
As with re.sub, ``repl`` can be either a string or a callable.
If it is a callable, it is passed the match object and should
return a suitable replacement string. If it is a string, it
can contain ``\1``, ``\2``, etc. to represent back-substitution
as sed would allow.
As with re.sub, ``repl`` can be either a string or a callable. If it is a callable, it is
passed the match object and should return a suitable replacement string. If it is a string, it
can contain ``\1``, ``\2``, etc. to represent back-substitution as sed would allow.
Args:
regex (str): The regular expression to search for
repl (str): The string to replace matches with
*filenames: One or more files to search and replace
string (bool): Treat regex as a plain string. Default it False
backup (bool): Make backup file(s) suffixed with ``~``. Default is False
ignore_absent (bool): Ignore any files that don't exist.
Default is False
start_at (str): Marker used to start applying the replacements. If a
text line matches this marker filtering is started at the next line.
All contents before the marker and the marker itself are copied
verbatim. Default is to start filtering from the first line of the
file.
stop_at (str): Marker used to stop scanning the file further. If a text
line matches this marker filtering is stopped and the rest of the
file is copied verbatim. Default is to filter until the end of the
file.
regex: The regular expression to search for
repl: The string to replace matches with
*filenames: One or more files to search and replace
string: Treat regex as a plain string. Default is False
backup: Make backup file(s) suffixed with ``~``. Default is False
ignore_absent: Ignore any files that don't exist. Default is False
start_at: Marker used to start applying the replacements. If a text line matches this
marker filtering is started at the next line. All contents before the marker and the
marker itself are copied verbatim. Default is to start filtering from the first line of
the file.
stop_at: Marker used to stop scanning the file further. If a text line matches this marker
filtering is stopped and the rest of the file is copied verbatim. Default is to filter
until the end of the file.
encoding: The encoding to use when reading and writing the files. Default is ``utf-8``;
pass None to use the system's default encoding.
"""
# Allow strings to use \1, \2, etc. for replacement, like sed
if not callable(repl):
@ -344,72 +342,56 @@ def groupid_to_group(x):
if string:
regex = re.escape(regex)
for filename in path_to_os_path(*filenames):
msg = 'FILTER FILE: {0} [replacing "{1}"]'
tty.debug(msg.format(filename, regex))
backup_filename = filename + "~"
tmp_filename = filename + ".spack~"
if ignore_absent and not os.path.exists(filename):
msg = 'FILTER FILE: file "{0}" not found. Skipping to next file.'
tty.debug(msg.format(filename))
regex_compiled = re.compile(regex)
for path in path_to_os_path(*filenames):
if ignore_absent and not os.path.exists(path):
tty.debug(f'FILTER FILE: file "{path}" not found. Skipping to next file.')
continue
else:
tty.debug(f'FILTER FILE: {path} [replacing "{regex}"]')
# Create backup file. Don't overwrite an existing backup
# file in case this file is being filtered multiple times.
if not os.path.exists(backup_filename):
shutil.copy(filename, backup_filename)
fd, temp_path = tempfile.mkstemp(
prefix=f"{os.path.basename(path)}.", dir=os.path.dirname(path)
)
os.close(fd)
# Create a temporary file to read from. We cannot use backup_filename
# in case filter_file is invoked multiple times on the same file.
shutil.copy(filename, tmp_filename)
shutil.copy(path, temp_path)
errored = False
try:
# Open as a text file and filter until the end of the file is
# reached, or we found a marker in the line if it was specified
#
# To avoid translating line endings (\n to \r\n and vice-versa)
# we force os.open to ignore translations and use the line endings
# the file comes with
with open(tmp_filename, mode="r", errors="surrogateescape", newline="") as input_file:
with open(filename, mode="w", errors="surrogateescape", newline="") as output_file:
do_filtering = start_at is None
# Using iter and readline is a workaround needed not to
# disable input_file.tell(), which will happen if we call
# input_file.next() implicitly via the for loop
for line in iter(input_file.readline, ""):
if stop_at is not None:
current_position = input_file.tell()
# Open as a text file and filter until the end of the file is reached, or we found a
# marker in the line if it was specified. To avoid translating line endings (\n to
# \r\n and vice-versa) use newline="".
with open(
temp_path, mode="r", errors="surrogateescape", newline="", encoding=encoding
) as input_file, open(
path, mode="w", errors="surrogateescape", newline="", encoding=encoding
) as output_file:
if start_at is None and stop_at is None: # common case, avoids branching in loop
for line in input_file:
output_file.write(re.sub(regex_compiled, repl, line))
else:
# state is -1 before start_at; 0 between; 1 after stop_at
state = 0 if start_at is None else -1
for line in input_file:
if state == 0:
if stop_at == line.strip():
output_file.write(line)
break
if do_filtering:
filtered_line = re.sub(regex, repl, line)
output_file.write(filtered_line)
else:
do_filtering = start_at == line.strip()
output_file.write(line)
else:
current_position = None
# If we stopped filtering at some point, reopen the file in
# binary mode and copy verbatim the remaining part
if current_position and stop_at:
with open(tmp_filename, mode="rb") as input_binary_buffer:
input_binary_buffer.seek(current_position)
with open(filename, mode="ab") as output_binary_buffer:
output_binary_buffer.writelines(input_binary_buffer.readlines())
state = 1
else:
line = re.sub(regex_compiled, repl, line)
elif state == -1 and start_at == line.strip():
state = 0
output_file.write(line)
except BaseException:
# clean up the original file on failure.
shutil.move(backup_filename, filename)
# restore the original file
os.rename(temp_path, path)
errored = True
raise
finally:
os.remove(tmp_filename)
if not backup and os.path.exists(backup_filename):
os.remove(backup_filename)
if not errored and not backup:
os.unlink(temp_path)
class FileFilter:
@ -2838,6 +2820,25 @@ def temporary_dir(
remove_directory_contents(tmp_dir)
@contextmanager
def edit_in_place_through_temporary_file(file_path: str) -> Generator[str, None, None]:
"""Context manager for modifying ``file_path`` in place, preserving its inode and hardlinks,
for functions or external tools that do not support in-place editing. Notice that this function
is unsafe in that it works with paths instead of file descriptors, but this is by design,
since we assume the call site will create a new inode at the same path."""
tmp_fd, tmp_path = tempfile.mkstemp(
dir=os.path.dirname(file_path), prefix=f"{os.path.basename(file_path)}."
)
# windows cannot replace a file with open fds, so close since the call site needs to replace.
os.close(tmp_fd)
try:
shutil.copyfile(file_path, tmp_path, follow_symlinks=True)
yield tmp_path
shutil.copyfile(tmp_path, file_path, follow_symlinks=True)
finally:
os.unlink(tmp_path)
def filesummary(path, print_bytes=16) -> Tuple[int, bytes]:
"""Create a small summary of the given file. Does not error
when file does not exist.

View File

@ -879,10 +879,13 @@ def _writer_daemon(
write_fd.close()
# 1. Use line buffering (3rd param = 1) since Python 3 has a bug
# that prevents unbuffered text I/O.
# 2. Python 3.x before 3.7 does not open with UTF-8 encoding by default
# that prevents unbuffered text I/O. [needs citation]
# 2. Enforce a UTF-8 interpretation of build process output with errors replaced by '?'.
# The downside is that the log file will not contain the exact output of the build process.
# 3. closefd=False because Connection has "ownership"
read_file = os.fdopen(read_fd.fileno(), "r", 1, encoding="utf-8", closefd=False)
read_file = os.fdopen(
read_fd.fileno(), "r", 1, encoding="utf-8", errors="replace", closefd=False
)
if stdin_fd:
stdin_file = os.fdopen(stdin_fd.fileno(), closefd=False)
@ -928,11 +931,7 @@ def _writer_daemon(
try:
while line_count < 100:
# Handle output from the calling process.
try:
line = _retry(read_file.readline)()
except UnicodeDecodeError:
# installs like --test=root gpgme produce non-UTF8 logs
line = "<line lost: output was not encoded as UTF-8>\n"
line = _retry(read_file.readline)()
if not line:
return
@ -946,6 +945,13 @@ def _writer_daemon(
output_line = clean_line
if filter_fn:
output_line = filter_fn(clean_line)
enc = sys.stdout.encoding
if enc != "utf-8":
# On Python 3.6 and 3.7-3.14 with non-{utf-8,C} locale stdout
# may not be able to handle utf-8 output. We do an inefficient
# dance of re-encoding with errors replaced, so stdout.write
# does not raise.
output_line = output_line.encode(enc, "replace").decode(enc)
sys.stdout.write(output_line)
# Stripped output to log file.

View File

@ -11,7 +11,7 @@
import spack.util.git
#: PEP440 canonical <major>.<minor>.<micro>.<devN> string
__version__ = "0.23.0.dev0"
__version__ = "0.23.1"
spack_version = __version__

View File

@ -1366,14 +1366,8 @@ def _test_detection_by_executable(pkgs, debug_log, error_cls):
def _compare_extra_attribute(_expected, _detected, *, _spec):
result = []
# Check items are of the same type
if not isinstance(_detected, type(_expected)):
_summary = f'{pkg_name}: error when trying to detect "{_expected}"'
_details = [f"{_detected} was detected instead"]
return [error_cls(summary=_summary, details=_details)]
# If they are string expected is a regex
if isinstance(_expected, str):
if isinstance(_expected, str) and isinstance(_detected, str):
try:
_regex = re.compile(_expected)
except re.error:
@ -1389,7 +1383,7 @@ def _compare_extra_attribute(_expected, _detected, *, _spec):
_details = [f"{_detected} does not match the regex"]
return [error_cls(summary=_summary, details=_details)]
if isinstance(_expected, dict):
elif isinstance(_expected, dict) and isinstance(_detected, dict):
_not_detected = set(_expected.keys()) - set(_detected.keys())
if _not_detected:
_summary = f"{pkg_name}: cannot detect some attributes for spec {_spec}"
@ -1404,6 +1398,10 @@ def _compare_extra_attribute(_expected, _detected, *, _spec):
result.extend(
_compare_extra_attribute(_expected[_key], _detected[_key], _spec=_spec)
)
else:
_summary = f'{pkg_name}: error when trying to detect "{_expected}"'
_details = [f"{_detected} was detected instead"]
return [error_cls(summary=_summary, details=_details)]
return result

View File

@ -2332,7 +2332,9 @@ def is_backup_file(file):
if not codesign:
return
for binary in changed_files:
codesign("-fs-", binary)
# preserve the original inode by running codesign on a copy
with fsys.edit_in_place_through_temporary_file(binary) as tmp_binary:
codesign("-fs-", tmp_binary)
# If we are installing back to the same location
# relocate the sbang location if the spack directory changed

View File

@ -357,6 +357,13 @@ def _do_patch_libtool_configure(self):
)
# Support Libtool 2.4.2 and older:
x.filter(regex=r'^(\s*test \$p = "-R")(; then\s*)$', repl=r'\1 || test x-l = x"$p"\2')
# Configure scripts generated with libtool < 2.5.4 have a faulty test for the
# -single_module linker flag. A deprecation warning makes it think the default is
# -multi_module, triggering it to use problematic linker flags (such as ld -r). The
# linker default is `-single_module` from (ancient) macOS 10.4, so override by setting
# `lt_cv_apple_cc_single_mod=yes`. See the fix in libtool commit
# 82f7f52123e4e7e50721049f7fa6f9b870e09c9d.
x.filter("lt_cv_apple_cc_single_mod=no", "lt_cv_apple_cc_single_mod=yes", string=True)
@spack.builder.run_after("configure")
def _do_patch_libtool(self):

View File

@ -25,6 +25,7 @@
import spack.extensions
import spack.parser
import spack.paths
import spack.repo
import spack.spec
import spack.store
import spack.traverse as traverse
@ -166,7 +167,9 @@ def quote_kvp(string: str) -> str:
def parse_specs(
args: Union[str, List[str]], concretize: bool = False, tests: bool = False
args: Union[str, List[str]],
concretize: bool = False,
tests: spack.concretize.TestsType = False,
) -> List[spack.spec.Spec]:
"""Convenience function for parsing arguments from specs. Handles common
exceptions and dies if there are errors.
@ -178,11 +181,13 @@ def parse_specs(
if not concretize:
return specs
to_concretize = [(s, None) for s in specs]
to_concretize: List[spack.concretize.SpecPairInput] = [(s, None) for s in specs]
return _concretize_spec_pairs(to_concretize, tests=tests)
def _concretize_spec_pairs(to_concretize, tests=False):
def _concretize_spec_pairs(
to_concretize: List[spack.concretize.SpecPairInput], tests: spack.concretize.TestsType = False
) -> List[spack.spec.Spec]:
"""Helper method that concretizes abstract specs from a list of abstract,concrete pairs.
Any spec with a concrete spec associated with it will concretize to that spec. Any spec
@ -193,7 +198,7 @@ def _concretize_spec_pairs(to_concretize, tests=False):
# Special case for concretizing a single spec
if len(to_concretize) == 1:
abstract, concrete = to_concretize[0]
return [concrete or abstract.concretized()]
return [concrete or abstract.concretized(tests=tests)]
# Special case if every spec is either concrete or has an abstract hash
if all(

View File

@ -29,7 +29,7 @@
# Tarball to be downloaded if binary packages are requested in a local mirror
BINARY_TARBALL = "https://github.com/spack/spack-bootstrap-mirrors/releases/download/v0.4/bootstrap-buildcache.tar.gz"
BINARY_TARBALL = "https://github.com/spack/spack-bootstrap-mirrors/releases/download/v0.6/bootstrap-buildcache.tar.gz"
#: Subdirectory where to create the mirror
LOCAL_MIRROR_DIR = "bootstrap_cache"
@ -51,9 +51,9 @@
},
}
CLINGO_JSON = "$spack/share/spack/bootstrap/github-actions-v0.4/clingo.json"
GNUPG_JSON = "$spack/share/spack/bootstrap/github-actions-v0.4/gnupg.json"
PATCHELF_JSON = "$spack/share/spack/bootstrap/github-actions-v0.4/patchelf.json"
CLINGO_JSON = "$spack/share/spack/bootstrap/github-actions-v0.6/clingo.json"
GNUPG_JSON = "$spack/share/spack/bootstrap/github-actions-v0.6/gnupg.json"
PATCHELF_JSON = "$spack/share/spack/bootstrap/github-actions-v0.6/patchelf.json"
# Metadata for a generated source mirror
SOURCE_METADATA = {

View File

@ -528,6 +528,7 @@ def __call__(self, parser, namespace, values, option_string):
# the const from the constructor or a value from the CLI.
# Note that this is only called if the argument is actually
# specified on the command line.
spack.config.CONFIG.ensure_scope_ordering()
spack.config.set(self.config_path, self.const, scope="command_line")

View File

@ -3,7 +3,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import datetime
import os
import re
from collections import defaultdict
@ -97,7 +96,7 @@ def list_files(args):
OLD_LICENSE, SPDX_MISMATCH, GENERAL_MISMATCH = range(1, 4)
#: Latest year that copyright applies. UPDATE THIS when bumping copyright.
latest_year = datetime.date.today().year
latest_year = 2024 # year of 0.23 release
strict_date = r"Copyright 2013-%s" % latest_year
#: regexes for valid license lines at tops of files

View File

@ -82,14 +82,6 @@ def spec(parser, args):
if args.namespaces:
fmt = "{namespace}." + fmt
tree_kwargs = {
"cover": args.cover,
"format": fmt,
"hashlen": None if args.very_long else 7,
"show_types": args.types,
"status_fn": install_status_fn if args.install_status else None,
}
# use a read transaction if we are getting install status for every
# spec in the DAG. This avoids repeatedly querying the DB.
tree_context = lang.nullcontext
@ -99,46 +91,35 @@ def spec(parser, args):
env = ev.active_environment()
if args.specs:
input_specs = spack.cmd.parse_specs(args.specs)
concretized_specs = spack.cmd.parse_specs(args.specs, concretize=True)
specs = list(zip(input_specs, concretized_specs))
concrete_specs = spack.cmd.parse_specs(args.specs, concretize=True)
elif env:
env.concretize()
specs = env.concretized_specs()
if not args.format:
# environments are printed together in a combined tree() invocation,
# except when using --yaml or --json, which we print spec by spec below.
tree_kwargs["key"] = spack.traverse.by_dag_hash
tree_kwargs["hashes"] = args.long or args.very_long
print(spack.spec.tree([concrete for _, concrete in specs], **tree_kwargs))
return
concrete_specs = env.concrete_roots()
else:
tty.die("spack spec requires at least one spec or an active environment")
for input, output in specs:
# With --yaml or --json, just print the raw specs to output
if args.format:
# With --yaml, --json, or --format, just print the raw specs to output
if args.format:
for spec in concrete_specs:
if args.format == "yaml":
# use write because to_yaml already has a newline.
sys.stdout.write(output.to_yaml(hash=ht.dag_hash))
sys.stdout.write(spec.to_yaml(hash=ht.dag_hash))
elif args.format == "json":
print(output.to_json(hash=ht.dag_hash))
print(spec.to_json(hash=ht.dag_hash))
else:
print(output.format(args.format))
continue
print(spec.format(args.format))
return
with tree_context():
# Only show the headers for input specs that are not concrete to avoid
# repeated output. This happens because parse_specs outputs concrete
# specs for `/hash` inputs.
if not input.concrete:
tree_kwargs["hashes"] = False # Always False for input spec
print("Input spec")
print("--------------------------------")
print(input.tree(**tree_kwargs))
print("Concretized")
print("--------------------------------")
tree_kwargs["hashes"] = args.long or args.very_long
print(output.tree(**tree_kwargs))
with tree_context():
print(
spack.spec.tree(
concrete_specs,
cover=args.cover,
format=fmt,
hashlen=None if args.very_long else 7,
show_types=args.types,
status_fn=install_status_fn if args.install_status else None,
hashes=args.long or args.very_long,
key=spack.traverse.by_dag_hash,
)
)

View File

@ -24,7 +24,7 @@
# tutorial configuration parameters
tutorial_branch = "releases/v0.22"
tutorial_branch = "releases/v0.23"
tutorial_mirror = "file:///mirror"
tutorial_key = os.path.join(spack.paths.share_path, "keys", "tutorial.pub")

View File

@ -6,7 +6,7 @@
import sys
import time
from contextlib import contextmanager
from typing import Iterable, Optional, Sequence, Tuple, Union
from typing import Iterable, List, Optional, Sequence, Tuple, Union
import llnl.util.tty as tty
@ -36,6 +36,7 @@ def enable_compiler_existence_check():
CHECK_COMPILER_EXISTENCE = saved
SpecPairInput = Tuple[Spec, Optional[Spec]]
SpecPair = Tuple[Spec, Spec]
SpecLike = Union[Spec, str]
TestsType = Union[bool, Iterable[str]]
@ -60,8 +61,8 @@ def concretize_specs_together(
def concretize_together(
spec_list: Sequence[SpecPair], tests: TestsType = False
) -> Sequence[SpecPair]:
spec_list: Sequence[SpecPairInput], tests: TestsType = False
) -> List[SpecPair]:
"""Given a number of specs as input, tries to concretize them together.
Args:
@ -77,8 +78,8 @@ def concretize_together(
def concretize_together_when_possible(
spec_list: Sequence[SpecPair], tests: TestsType = False
) -> Sequence[SpecPair]:
spec_list: Sequence[SpecPairInput], tests: TestsType = False
) -> List[SpecPair]:
"""Given a number of specs as input, tries to concretize them together to the extent possible.
See documentation for ``unify: when_possible`` concretization for the precise definition of
@ -114,8 +115,8 @@ def concretize_together_when_possible(
def concretize_separately(
spec_list: Sequence[SpecPair], tests: TestsType = False
) -> Sequence[SpecPair]:
spec_list: Sequence[SpecPairInput], tests: TestsType = False
) -> List[SpecPair]:
"""Concretizes the input specs separately from each other.
Args:

View File

@ -431,6 +431,19 @@ def ensure_unwrapped(self) -> "Configuration":
"""Ensure we unwrap this object from any dynamic wrapper (like Singleton)"""
return self
def highest(self) -> ConfigScope:
"""Scope with highest precedence"""
return next(reversed(self.scopes.values())) # type: ignore
@_config_mutator
def ensure_scope_ordering(self):
"""Ensure that scope order matches documented precedent"""
# FIXME: We also need to consider that custom configurations and other orderings
# may not be preserved correctly
if "command_line" in self.scopes:
# TODO (when dropping python 3.6): self.scopes.move_to_end
self.scopes["command_line"] = self.remove_scope("command_line")
@_config_mutator
def push_scope(self, scope: ConfigScope) -> None:
"""Add a higher precedence scope to the Configuration."""

View File

@ -3044,11 +3044,13 @@ def prepare_config_scope(self) -> None:
"""Add the manifest's scopes to the global configuration search path."""
for scope in self.env_config_scopes:
spack.config.CONFIG.push_scope(scope)
spack.config.CONFIG.ensure_scope_ordering()
def deactivate_config_scope(self) -> None:
"""Remove any of the manifest's scopes from the global config path."""
for scope in self.env_config_scopes:
spack.config.CONFIG.remove_scope(scope.name)
spack.config.CONFIG.ensure_scope_ordering()
@contextlib.contextmanager
def use_config(self):

View File

@ -180,7 +180,7 @@ def ensure_mirror_usable(self, direction: str = "push"):
if errors:
msg = f"invalid {direction} configuration for mirror {self.name}: "
msg += "\n ".join(errors)
raise spack.mirror.MirrorError(msg)
raise MirrorError(msg)
def _update_connection_dict(self, current_data: dict, new_data: dict, top_level: bool):
# Only allow one to exist in the config

View File

@ -397,6 +397,7 @@ def create_opener():
"""Create an opener that can handle OCI authentication."""
opener = urllib.request.OpenerDirector()
for handler in [
urllib.request.ProxyHandler(),
urllib.request.UnknownHandler(),
urllib.request.HTTPSHandler(context=spack.util.web.ssl_create_default_context()),
spack.util.web.SpackHTTPDefaultErrorHandler(),

View File

@ -13,6 +13,7 @@
import macholib.mach_o
import macholib.MachO
import llnl.util.filesystem as fs
import llnl.util.lang
import llnl.util.tty as tty
from llnl.util.lang import memoized
@ -275,10 +276,10 @@ def modify_macho_object(cur_path, rpaths, deps, idpath, paths_to_paths):
# Deduplicate and flatten
args = list(itertools.chain.from_iterable(llnl.util.lang.dedupe(args)))
install_name_tool = executable.Executable("install_name_tool")
if args:
args.append(str(cur_path))
install_name_tool = executable.Executable("install_name_tool")
install_name_tool(*args)
with fs.edit_in_place_through_temporary_file(cur_path) as temp_path:
install_name_tool(*args, temp_path)
def macholib_get_paths(cur_path):
@ -717,8 +718,8 @@ def fixup_macos_rpath(root, filename):
# No fixes needed
return False
args.append(abspath)
executable.Executable("install_name_tool")(*args)
with fs.edit_in_place_through_temporary_file(abspath) as temp_path:
executable.Executable("install_name_tool")(*args, temp_path)
return True

View File

@ -209,7 +209,7 @@ def _apply_to_file(self, f):
# but it's nasty to deal with matches across boundaries, so let's stick to
# something simple.
modified = True
modified = False
for match in self.regex.finditer(f.read()):
# The matching prefix (old) and its replacement (new)

View File

@ -106,8 +106,8 @@
{
"names": ["install_missing_compilers"],
"message": "The config:install_missing_compilers option has been deprecated in "
"Spack v0.23, and is currently ignored. It will be removed from config in "
"Spack v0.25.",
"Spack v0.23, and is currently ignored. It will be removed from config after "
"Spack v1.0.",
"error": False,
},
],

View File

@ -3839,12 +3839,21 @@ def splice_at_hash(
self._splices.setdefault(parent_node, []).append(splice)
def _resolve_automatic_splices(self):
"""After all of the specs have been concretized, apply all immediate
splices in size order. This ensures that all dependencies are resolved
"""After all of the specs have been concretized, apply all immediate splices.
Use reverse topological order to ensure that all dependencies are resolved
before their parents, allowing for maximal sharing and minimal copying.
"""
fixed_specs = {}
for node, spec in sorted(self._specs.items(), key=lambda x: len(x[1])):
# create a mapping from dag hash to an integer representing position in reverse topo order.
specs = self._specs.values()
topo_order = list(traverse.traverse_nodes(specs, order="topo", key=traverse.by_dag_hash))
topo_lookup = {spec.dag_hash(): index for index, spec in enumerate(reversed(topo_order))}
# iterate over specs, children before parents
for node, spec in sorted(self._specs.items(), key=lambda x: topo_lookup[x[1].dag_hash()]):
immediate = self._splices.get(node, [])
if not immediate and not any(
edge.spec in fixed_specs for edge in spec.edges_to_dependencies()

View File

@ -448,6 +448,9 @@ def _target_satisfies(self, other: "ArchSpec", strict: bool) -> bool:
return bool(self._target_intersection(other))
def _target_constrain(self, other: "ArchSpec") -> bool:
if self.target is None and other.target is None:
return False
if not other._target_satisfies(self, strict=False):
raise UnsatisfiableArchitectureSpecError(self, other)
@ -496,21 +499,56 @@ def _target_intersection(self, other):
if (not s_min or o_comp >= s_min) and (not s_max or o_comp <= s_max):
results.append(o_min)
else:
# Take intersection of two ranges
# Lots of comparisons needed
_s_min = _make_microarchitecture(s_min)
_s_max = _make_microarchitecture(s_max)
_o_min = _make_microarchitecture(o_min)
_o_max = _make_microarchitecture(o_max)
# Take the "min" of the two max, if there is a partial ordering.
n_max = ""
if s_max and o_max:
_s_max = _make_microarchitecture(s_max)
_o_max = _make_microarchitecture(o_max)
if _s_max.family != _o_max.family:
continue
if _s_max <= _o_max:
n_max = s_max
elif _o_max < _s_max:
n_max = o_max
else:
continue
elif s_max:
n_max = s_max
elif o_max:
n_max = o_max
# Take the "max" of the two min.
n_min = ""
if s_min and o_min:
_s_min = _make_microarchitecture(s_min)
_o_min = _make_microarchitecture(o_min)
if _s_min.family != _o_min.family:
continue
if _s_min >= _o_min:
n_min = s_min
elif _o_min > _s_min:
n_min = o_min
else:
continue
elif s_min:
n_min = s_min
elif o_min:
n_min = o_min
if n_min and n_max:
_n_min = _make_microarchitecture(n_min)
_n_max = _make_microarchitecture(n_max)
if _n_min.family != _n_max.family or not _n_min <= _n_max:
continue
if n_min == n_max:
results.append(n_min)
else:
results.append(f"{n_min}:{n_max}")
elif n_min:
results.append(f"{n_min}:")
elif n_max:
results.append(f":{n_max}")
n_min = s_min if _s_min >= _o_min else o_min
n_max = s_max if _s_max <= _o_max else o_max
_n_min = _make_microarchitecture(n_min)
_n_max = _make_microarchitecture(n_max)
if _n_min == _n_max:
results.append(n_min)
elif not n_min or not n_max or _n_min < _n_max:
results.append("%s:%s" % (n_min, n_max))
return results
def constrain(self, other: "ArchSpec") -> bool:
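The rewritten `_target_intersection` computes, for each pair of ranges, the max of the two lower bounds and the min of the two upper bounds, skipping pairs whose bounds belong to different microarchitecture families or are incomparable in the partial order. Over a total order the core idea reduces to this sketch (illustrative, not Spack's code; integers stand in for microarchitectures):

```python
def intersect_ranges(a, b):
    """Intersect closed ranges (lo, hi); None means unbounded on that side."""
    (a_lo, a_hi), (b_lo, b_hi) = a, b
    lo = b_lo if a_lo is None else a_lo if b_lo is None else max(a_lo, b_lo)
    hi = b_hi if a_hi is None else a_hi if b_hi is None else min(a_hi, b_hi)
    if lo is not None and hi is not None and lo > hi:
        return None  # disjoint
    return lo, hi

# By analogy with x86_64:haswell ∩ x86_64_v2:icelake -> x86_64_v2:haswell:
assert intersect_ranges((0, 3), (1, 5)) == (1, 3)
assert intersect_ranges((0, None), (None, 2)) == (0, 2)
```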
@ -1431,8 +1469,6 @@ def tree(
class Spec:
#: Cache for spec's prefix, computed lazily in the corresponding property
_prefix = None
#: Cache for spec's length, computed lazily in the corresponding property
_length = None
abstract_hash = None
@staticmethod
@ -1502,9 +1538,8 @@ def __init__(
self._external_path = external_path
self.external_modules = Spec._format_module_list(external_modules)
# This attribute is used to store custom information for
# external specs. None signal that it was not set yet.
self.extra_attributes = None
# This attribute is used to store custom information for external specs.
self.extra_attributes: dict = {}
# This attribute holds the original build copy of the spec if it is
# deployed differently than it was built. None signals that the spec
@ -2219,8 +2254,8 @@ def to_node_dict(self, hash=ht.dag_hash):
d["external"] = syaml.syaml_dict(
[
("path", self.external_path),
("module", self.external_modules),
("extra_attributes", self.extra_attributes),
("module", self.external_modules or None),
("extra_attributes", syaml.sorted_dict(self.extra_attributes)),
]
)
@ -2958,7 +2993,7 @@ def _finalize_concretization(self):
for spec in self.traverse():
spec._cached_hash(ht.dag_hash)
def concretized(self, tests: Union[bool, Iterable[str]] = False) -> "spack.spec.Spec":
def concretized(self, tests: Union[bool, Iterable[str]] = False) -> "Spec":
"""This is a non-destructive version of concretize().
First clones, then returns a concrete version of this package
@ -3081,18 +3116,13 @@ def constrain(self, other, deps=True):
if not self.variants[v].compatible(other.variants[v]):
raise vt.UnsatisfiableVariantSpecError(self.variants[v], other.variants[v])
# TODO: Check out the logic here
sarch, oarch = self.architecture, other.architecture
if sarch is not None and oarch is not None:
if sarch.platform is not None and oarch.platform is not None:
if sarch.platform != oarch.platform:
raise UnsatisfiableArchitectureSpecError(sarch, oarch)
if sarch.os is not None and oarch.os is not None:
if sarch.os != oarch.os:
raise UnsatisfiableArchitectureSpecError(sarch, oarch)
if sarch.target is not None and oarch.target is not None:
if sarch.target != oarch.target:
raise UnsatisfiableArchitectureSpecError(sarch, oarch)
if (
sarch is not None
and oarch is not None
and not self.architecture.intersects(other.architecture)
):
raise UnsatisfiableArchitectureSpecError(sarch, oarch)
changed = False
@ -3115,18 +3145,12 @@ def constrain(self, other, deps=True):
changed |= self.compiler_flags.constrain(other.compiler_flags)
old = str(self.architecture)
sarch, oarch = self.architecture, other.architecture
if sarch is None or other.architecture is None:
self.architecture = sarch or oarch
else:
if sarch.platform is None or oarch.platform is None:
self.architecture.platform = sarch.platform or oarch.platform
if sarch.os is None or oarch.os is None:
sarch.os = sarch.os or oarch.os
if sarch.target is None or oarch.target is None:
sarch.target = sarch.target or oarch.target
changed |= str(self.architecture) != old
if sarch is not None and oarch is not None:
changed |= self.architecture.constrain(other.architecture)
elif oarch is not None:
self.architecture = oarch
changed = True
if deps:
changed |= self._constrain_dependencies(other)
@ -3702,18 +3726,6 @@ def __getitem__(self, name: str):
return child
def __len__(self):
if not self.concrete:
raise spack.error.SpecError(f"Cannot get length of abstract spec: {self}")
if not self._length:
self._length = 1 + sum(len(dep) for dep in self.dependencies())
return self._length
def __bool__(self):
# Need to define this so __len__ isn't used by default
return True
def __contains__(self, spec):
"""True if this spec or some dependency satisfies the spec.
@ -4472,7 +4484,7 @@ def clear_caches(self, ignore=()):
if h.attr not in ignore:
if hasattr(self, h.attr):
setattr(self, h.attr, None)
for attr in ("_dunder_hash", "_prefix", "_length"):
for attr in ("_dunder_hash", "_prefix"):
if attr not in ignore:
setattr(self, attr, None)
@ -4849,8 +4861,8 @@ def from_node_dict(cls, node):
spec.external_modules = node["external"]["module"]
if spec.external_modules is False:
spec.external_modules = None
spec.extra_attributes = node["external"].get(
"extra_attributes", syaml.syaml_dict()
spec.extra_attributes = (
node["external"].get("extra_attributes") or syaml.syaml_dict()
)
# specs read in are concrete unless marked abstract

View File

@ -487,7 +487,7 @@ def _generate_fetchers(self, mirror_only=False) -> Generator["fs.FetchStrategy",
# Insert fetchers in the order that the URLs are provided.
fetchers[:0] = (
fs.from_url_scheme(
url_util.join(mirror.fetch_url, self.mirror_layout.path),
url_util.join(mirror.fetch_url, *self.mirror_layout.path.split(os.sep)),
checksum=digest,
expand=expand,
extension=extension,
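The fix splits the local layout path on `os.sep` before joining it onto the mirror URL, so that Windows path separators never leak into URLs. A minimal sketch of the idea (the helper name and URLs are illustrative, not Spack's `url_util` API):

```python
import os
import posixpath

def mirror_url(fetch_url, layout_path):
    # URLs always use "/" separators, so a local layout path must be split
    # on os.sep first (on Windows os.sep is "\\", which must not appear in URLs)
    return posixpath.join(fetch_url, *layout_path.split(os.sep))

# On POSIX the split is a no-op; on Windows it converts "\\" to "/".
path = os.sep.join(["_source-cache", "archive"])
assert mirror_url("https://mirror.example.com/cache", path) == \
    "https://mirror.example.com/cache/_source-cache/archive"
```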

View File

@ -10,9 +10,6 @@
import spack.config
import spack.deptypes as dt
import spack.package_base
import spack.paths
import spack.repo
import spack.solver.asp
from spack.installer import PackageInstaller
from spack.spec import Spec
@ -232,3 +229,19 @@ def test_spliced_build_deps_only_in_build_spec(splicing_setup):
assert _has_build_dependency(build_spec, "splice-z")
# Spliced build dependencies are removed
assert len(concr_goal.dependencies(None, dt.BUILD)) == 0
def test_spliced_transitive_dependency(splicing_setup):
cache = ["splice-depends-on-t@1.0 ^splice-h@1.0.1"]
goal_spec = Spec("splice-depends-on-t^splice-h@1.0.2")
with CacheManager(cache):
spack.config.set("packages", _make_specs_non_buildable(["splice-depends-on-t"]))
_enable_splicing()
concr_goal = goal_spec.concretized()
# Spec has been spliced
assert concr_goal._build_spec is not None
assert concr_goal["splice-t"]._build_spec is not None
assert concr_goal.satisfies(goal_spec)
# Spliced build dependencies are removed
assert len(concr_goal.dependencies(None, dt.BUILD)) == 0

View File

@ -4,9 +4,11 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import pytest
import spack.config
import spack.environment as ev
import spack.error
import spack.solver.asp as asp
import spack.store
from spack.cmd import (
CommandNameError,
PythonNameError,

View File

@ -315,6 +315,7 @@ def test_pkg_grep(mock_packages, capfd):
"depends-on-manyvariants",
"manyvariants",
"splice-a",
"splice-depends-on-t",
"splice-h",
"splice-t",
"splice-vh",

View File

@ -7,6 +7,7 @@
import pytest
import spack.config
import spack.environment as ev
import spack.error
import spack.spec

View File

@ -1532,3 +1532,30 @@ def test_config_path_dsl(path, it_should_work, expected_parsed):
else:
with pytest.raises(ValueError):
spack.config.ConfigPath._validate(path)
@pytest.mark.regression("48254")
def test_env_activation_preserves_config_scopes(mutable_mock_env_path):
"""Check that the "command_line" scope remains the highest priority scope, when we activate,
or deactivate, environments.
"""
expected_cl_scope = spack.config.CONFIG.highest()
assert expected_cl_scope.name == "command_line"
# Creating an environment pushes a new scope
ev.create("test")
with ev.read("test"):
assert spack.config.CONFIG.highest() == expected_cl_scope
# No active environment pops the scope
with ev.no_active_environment():
assert spack.config.CONFIG.highest() == expected_cl_scope
assert spack.config.CONFIG.highest() == expected_cl_scope
# Switch the environment to another one
ev.create("test-2")
with ev.read("test-2"):
assert spack.config.CONFIG.highest() == expected_cl_scope
assert spack.config.CONFIG.highest() == expected_cl_scope
assert spack.config.CONFIG.highest() == expected_cl_scope
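A toy model of the invariant this test checks, assuming (as the assertions imply) that environment scopes are pushed below `command_line` rather than on top of it; this is not Spack's API:

```python
# Scopes listed from lowest to highest priority.
scopes = ["defaults", "system", "site", "user", "command_line"]

def push_env_scope(scopes, name):
    scopes.insert(len(scopes) - 1, name)  # keep command_line on top

push_env_scope(scopes, "env:test")
assert scopes[-1] == "command_line"  # highest-priority scope is unchanged
```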

View File

@ -1249,3 +1249,14 @@ def test_find_input_types(tmp_path: pathlib.Path):
with pytest.raises(TypeError):
fs.find(1, "file.txt") # type: ignore
def test_edit_in_place_through_temporary_file(tmp_path):
(tmp_path / "example.txt").write_text("Hello")
current_ino = os.stat(tmp_path / "example.txt").st_ino
with fs.edit_in_place_through_temporary_file(tmp_path / "example.txt") as temporary:
os.unlink(temporary)
with open(temporary, "w") as f:
f.write("World")
assert (tmp_path / "example.txt").read_text() == "World"
assert os.stat(tmp_path / "example.txt").st_ino == current_ino
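For context, a sketch of the semantics this test pins down, under the assumption (borne out by the assertions) that the context manager yields a temporary path and copies its final contents back into the original file, preserving the inode; this is illustrative, not the actual implementation:

```python
import contextlib
import os
import shutil
import tempfile

@contextlib.contextmanager
def edit_in_place_through_temporary_file(path):
    # Callers edit a temporary copy; its contents are written back into the
    # original file, so the original inode survives the edit.
    fd, tmp = tempfile.mkstemp(dir=os.path.dirname(os.path.abspath(path)))
    os.close(fd)
    shutil.copyfile(path, tmp)
    try:
        yield tmp
        with open(tmp, "rb") as src, open(path, "wb") as dst:
            shutil.copyfileobj(src, dst)
    finally:
        os.unlink(tmp)
```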

View File

@ -298,30 +298,6 @@ def inner():
top-level raised TypeError: ok"""
)
full_message = h.grouped_message(with_tracebacks=True)
no_line_numbers = re.sub(r"line [0-9]+,", "line xxx,", full_message)
assert (
no_line_numbers
== dedent(
"""\
due to the following failures:
inner method raised ValueError: wow!
File "{0}", \
line xxx, in test_grouped_exception
inner()
File "{0}", \
line xxx, in inner
raise ValueError("wow!")
top-level raised TypeError: ok
File "{0}", \
line xxx, in test_grouped_exception
raise TypeError("ok")
"""
).format(__file__)
)
def test_grouped_exception_base_type():
h = llnl.util.lang.GroupedExceptionHandler()

View File

@ -57,18 +57,16 @@ def test_log_python_output_without_echo(capfd, tmpdir):
assert capfd.readouterr()[0] == ""
def test_log_python_output_with_invalid_utf8(capfd, tmpdir):
with tmpdir.as_cwd():
with log.log_output("foo.txt"):
sys.stdout.buffer.write(b"\xc3\x28\n")
def test_log_python_output_with_invalid_utf8(capfd, tmp_path):
tmp_file = str(tmp_path / "foo.txt")
with log.log_output(tmp_file, echo=True):
sys.stdout.buffer.write(b"\xc3helloworld\n")
expected = b"<line lost: output was not encoded as UTF-8>\n"
with open("foo.txt", "rb") as f:
written = f.read()
assert written == expected
# we should be able to read this as valid utf-8
with open(tmp_file, "r", encoding="utf-8") as f:
assert f.read() == "�helloworld\n"
# nothing on stdout or stderr
assert capfd.readouterr()[0] == ""
assert capfd.readouterr().out == "�helloworld\n"
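The expected string is consistent with decoding through `errors="replace"`: `\xc3` is a UTF-8 lead byte with no valid continuation, so it becomes U+FFFD instead of the whole line being discarded:

```python
assert b"\xc3helloworld\n".decode("utf-8", errors="replace") == "\ufffdhelloworld\n"
```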
def test_log_python_output_and_echo_output(capfd, tmpdir):

View File

@ -138,3 +138,19 @@ def test_round_trip_configuration(initial_content, expected_final_content, tmp_p
expected_final_content = initial_content
assert final_content.getvalue() == expected_final_content
def test_sorted_dict():
assert syaml.sorted_dict(
{
"z": 0,
"y": [{"x": 0, "w": [2, 1, 0]}, 0],
"v": ({"u": 0, "t": 0, "s": 0}, 0, {"r": 0, "q": 0}),
"p": 0,
}
) == {
"p": 0,
"v": ({"s": 0, "t": 0, "u": 0}, 0, {"q": 0, "r": 0}),
"y": [{"w": [2, 1, 0], "x": 0}, 0],
"z": 0,
}

View File

@ -1842,6 +1842,16 @@ def test_abstract_contains_semantic(lhs, rhs, expected, mock_packages):
# Different virtuals intersect if there is at least package providing both
(Spec, "mpi", "lapack", (True, False, False)),
(Spec, "mpi", "pkgconfig", (False, False, False)),
# Intersection among target ranges for different architectures
(Spec, "target=x86_64:", "target=ppc64le:", (False, False, False)),
(Spec, "target=x86_64:", "target=:power9", (False, False, False)),
(Spec, "target=:haswell", "target=:power9", (False, False, False)),
(Spec, "target=:haswell", "target=ppc64le:", (False, False, False)),
# Intersection among target ranges for the same architecture
(Spec, "target=:haswell", "target=x86_64:", (True, True, True)),
(Spec, "target=:haswell", "target=x86_64_v4:", (False, False, False)),
# Edge case of uarches that split in a diamond structure from a common ancestor
(Spec, "target=:cascadelake", "target=:cannonlake", (False, False, False)),
],
)
def test_intersects_and_satisfies(factory, lhs_str, rhs_str, results):
@ -1891,6 +1901,16 @@ def test_intersects_and_satisfies(factory, lhs_str, rhs_str, results):
# Flags
(Spec, "cppflags=-foo", "cppflags=-foo", False, "cppflags=-foo"),
(Spec, "cppflags=-foo", "cflags=-foo", True, "cppflags=-foo cflags=-foo"),
# Target ranges
(Spec, "target=x86_64:", "target=x86_64:", False, "target=x86_64:"),
(Spec, "target=x86_64:", "target=:haswell", True, "target=x86_64:haswell"),
(
Spec,
"target=x86_64:haswell",
"target=x86_64_v2:icelake",
True,
"target=x86_64_v2:haswell",
),
],
)
def test_constrain(factory, lhs_str, rhs_str, result, constrained_str):

View File

@ -21,6 +21,7 @@
import pytest
import ruamel.yaml
import spack.config
import spack.hash_types as ht
import spack.paths
import spack.repo
@ -144,86 +145,83 @@ def descend_and_check(iterable, level=0):
assert level >= 5
def test_ordered_read_not_required_for_consistent_dag_hash(config, mock_packages):
@pytest.mark.parametrize("spec_str", ["mpileaks ^zmpi", "dttop", "dtuse"])
def test_ordered_read_not_required_for_consistent_dag_hash(
spec_str, mutable_config: spack.config.Configuration, mock_packages
):
"""Make sure ordered serialization isn't required to preserve hashes.
For consistent hashes, we require that YAML and json documents
have their keys serialized in a deterministic order. However, we
don't want to require them to be serialized in order. This
ensures that is not required.
"""
specs = ["mpileaks ^zmpi", "dttop", "dtuse"]
for spec in specs:
spec = Spec(spec)
spec.concretize()
For consistent hashes, we require that YAML and JSON serializations have their keys in a
deterministic order. However, we don't want to require them to be serialized in order. This
ensures that is not required."""
#
# Dict & corresponding YAML & JSON from the original spec.
#
spec_dict = spec.to_dict()
spec_yaml = spec.to_yaml()
spec_json = spec.to_json()
# Make sure that `extra_attributes` of externals is order independent for hashing.
extra_attributes = {
"compilers": {"c": "/some/path/bin/cc", "cxx": "/some/path/bin/c++"},
"foo": "bar",
"baz": "qux",
}
mutable_config.set(
"packages:dtuse",
{
"buildable": False,
"externals": [
{"spec": "dtuse@=1.0", "prefix": "/usr", "extra_attributes": extra_attributes}
],
},
)
#
# Make a spec with reversed OrderedDicts for every
# OrderedDict in the original.
#
reversed_spec_dict = reverse_all_dicts(spec.to_dict())
spec = spack.spec.Spec(spec_str).concretized()
#
# Dump to YAML and JSON
#
yaml_string = syaml.dump(spec_dict, default_flow_style=False)
reversed_yaml_string = syaml.dump(reversed_spec_dict, default_flow_style=False)
json_string = sjson.dump(spec_dict)
reversed_json_string = sjson.dump(reversed_spec_dict)
if spec_str == "dtuse":
assert spec.external and spec.extra_attributes == extra_attributes
#
# Do many consistency checks
#
spec_dict = spec.to_dict(hash=ht.dag_hash)
spec_yaml = spec.to_yaml()
spec_json = spec.to_json()
# spec yaml is ordered like the spec dict
assert yaml_string == spec_yaml
assert json_string == spec_json
# Make a spec with dict keys reversed recursively
spec_dict_rev = reverse_all_dicts(spec_dict)
# reversed string is different from the original, so it
# *would* generate a different hash
assert yaml_string != reversed_yaml_string
assert json_string != reversed_json_string
# Dump to YAML and JSON
yaml_string = syaml.dump(spec_dict, default_flow_style=False)
yaml_string_rev = syaml.dump(spec_dict_rev, default_flow_style=False)
json_string = sjson.dump(spec_dict)
json_string_rev = sjson.dump(spec_dict_rev)
# build specs from the "wrongly" ordered data
round_trip_yaml_spec = Spec.from_yaml(yaml_string)
round_trip_json_spec = Spec.from_json(json_string)
round_trip_reversed_yaml_spec = Spec.from_yaml(reversed_yaml_string)
round_trip_reversed_json_spec = Spec.from_yaml(reversed_json_string)
# spec yaml is ordered like the spec dict
assert yaml_string == spec_yaml
assert json_string == spec_json
# Strip spec if we stripped the yaml
spec = spec.copy(deps=ht.dag_hash.depflag)
# reversed string is different from the original, so it *would* generate a different hash
assert yaml_string != yaml_string_rev
assert json_string != json_string_rev
# specs are equal to the original
assert spec == round_trip_yaml_spec
assert spec == round_trip_json_spec
# build specs from the "wrongly" ordered data
from_yaml = Spec.from_yaml(yaml_string)
from_json = Spec.from_json(json_string)
from_yaml_rev = Spec.from_yaml(yaml_string_rev)
from_json_rev = Spec.from_json(json_string_rev)
assert spec == round_trip_reversed_yaml_spec
assert spec == round_trip_reversed_json_spec
assert round_trip_yaml_spec == round_trip_reversed_yaml_spec
assert round_trip_json_spec == round_trip_reversed_json_spec
# dag_hashes are equal
assert spec.dag_hash() == round_trip_yaml_spec.dag_hash()
assert spec.dag_hash() == round_trip_json_spec.dag_hash()
assert spec.dag_hash() == round_trip_reversed_yaml_spec.dag_hash()
assert spec.dag_hash() == round_trip_reversed_json_spec.dag_hash()
# Strip spec if we stripped the yaml
spec = spec.copy(deps=ht.dag_hash.depflag)
# dag_hash is equal after round-trip by dag_hash
spec.concretize()
round_trip_yaml_spec.concretize()
round_trip_json_spec.concretize()
round_trip_reversed_yaml_spec.concretize()
round_trip_reversed_json_spec.concretize()
assert spec.dag_hash() == round_trip_yaml_spec.dag_hash()
assert spec.dag_hash() == round_trip_json_spec.dag_hash()
assert spec.dag_hash() == round_trip_reversed_yaml_spec.dag_hash()
assert spec.dag_hash() == round_trip_reversed_json_spec.dag_hash()
# specs and their hashes are equal to the original
assert (
spec.process_hash()
== from_yaml.process_hash()
== from_json.process_hash()
== from_yaml_rev.process_hash()
== from_json_rev.process_hash()
)
assert (
spec.dag_hash()
== from_yaml.dag_hash()
== from_json.dag_hash()
== from_yaml_rev.dag_hash()
== from_json_rev.dag_hash()
)
assert spec == from_yaml == from_json == from_yaml_rev == from_json_rev
@pytest.mark.parametrize("module", [spack.spec, spack.version])
@ -294,13 +292,10 @@ def visit_Call(self, node):
def reverse_all_dicts(data):
"""Descend into data and reverse all the dictionaries"""
if isinstance(data, dict):
return syaml_dict(
reversed([(reverse_all_dicts(k), reverse_all_dicts(v)) for k, v in data.items()])
)
return type(data)((k, reverse_all_dicts(v)) for k, v in reversed(list(data.items())))
elif isinstance(data, (list, tuple)):
return type(data)(reverse_all_dicts(elt) for elt in data)
else:
return data
return data
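A short usage sketch of the simplified helper (key order must be checked explicitly, since dict equality ignores it):

```python
data = {"a": 1, "b": {"c": 2, "d": 3}}
rev = reverse_all_dicts(data)
assert list(rev) == ["b", "a"]
assert list(rev["b"]) == ["d", "c"]
```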
def check_specs_equal(original_spec, spec_yaml_path):

View File

@ -13,7 +13,7 @@
import sys
from llnl.util import tty
from llnl.util.filesystem import join_path
from llnl.util.filesystem import edit_in_place_through_temporary_file
from llnl.util.lang import memoized
from spack.util.executable import Executable, which
@ -81,12 +81,11 @@ def fix_darwin_install_name(path):
Parameters:
path (str): directory in which .dylib files are located
"""
libs = glob.glob(join_path(path, "*.dylib"))
libs = glob.glob(os.path.join(path, "*.dylib"))
install_name_tool = Executable("install_name_tool")
otool = Executable("otool")
for lib in libs:
# fix install name first:
install_name_tool = Executable("install_name_tool")
install_name_tool("-id", lib, lib)
otool = Executable("otool")
args = ["-id", lib]
long_deps = otool("-L", lib, output=str).split("\n")
deps = [dep.partition(" ")[0][1::] for dep in long_deps[2:-1]]
# fix all dependencies:
@ -98,5 +97,8 @@ def fix_darwin_install_name(path):
# but we don't know builddir (nor how symbolic links look
# in builddir). We thus only compare the basenames.
if os.path.basename(dep) == os.path.basename(loc):
install_name_tool("-change", dep, loc, lib)
args.extend(("-change", dep, loc))
break
with edit_in_place_through_temporary_file(lib) as tmp:
install_name_tool(*args, tmp)

View File

@ -447,20 +447,13 @@ def _dump_annotated(handler, data, stream=None):
return getvalue()
def sorted_dict(dict_like):
"""Return an ordered dict with all the fields sorted recursively.
Args:
dict_like (dict): dictionary to be sorted
Returns:
dictionary sorted recursively
"""
result = syaml_dict(sorted(dict_like.items()))
for key, value in result.items():
if isinstance(value, collections.abc.Mapping):
result[key] = sorted_dict(value)
return result
def sorted_dict(data):
"""Descend into data and sort all dictionary keys."""
if isinstance(data, dict):
return type(data)((k, sorted_dict(v)) for k, v in sorted(data.items()))
elif isinstance(data, (list, tuple)):
return type(data)(sorted_dict(v) for v in data)
return data
def extract_comments(data):
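Recursive sorting is what makes hashing of `extra_attributes` order independent: once keys are sorted at every level, two dicts that differ only in insertion order serialize identically. A sketch (the `hashlib`/`json` digest here is illustrative; Spack hashes its own JSON form):

```python
import hashlib
import json

a = {"foo": "bar", "compilers": {"cxx": "/bin/c++", "c": "/bin/cc"}}
b = {"compilers": {"c": "/bin/cc", "cxx": "/bin/c++"}, "foo": "bar"}

def digest(d):
    return hashlib.sha256(json.dumps(sorted_dict(d)).encode()).hexdigest()

# key order no longer influences the serialized form, hence not the hash
assert digest(a) == digest(b)
```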

View File

@ -14,10 +14,10 @@ default:
image: { "name": "ghcr.io/spack/e4s-ubuntu-18.04:v2021-10-18", "entrypoint": [""] }
# CI Platform-Arch
.cray_rhel_zen4:
.cray_rhel_x86_64_v3:
variables:
SPACK_TARGET_PLATFORM: "cray-rhel"
SPACK_TARGET_ARCH: "zen4"
SPACK_TARGET_ARCH: "x86_64_v3"
.cray_sles_zen4:
variables:
@ -876,7 +876,7 @@ aws-pcluster-build-neoverse_v1:
- cat /proc/meminfo | grep 'MemTotal\|MemFree' || true
.generate-cray-rhel:
tags: [ "cray-rhel-zen4", "public" ]
tags: [ "cray-rhel-x86_64_v3", "public" ]
extends: [ ".generate-cray" ]
.generate-cray-sles:
@ -888,7 +888,7 @@ aws-pcluster-build-neoverse_v1:
# E4S - Cray RHEL
#######################################
.e4s-cray-rhel:
extends: [ ".cray_rhel_zen4" ]
extends: [ ".cray_rhel_x86_64_v3" ]
variables:
SPACK_CI_STACK_NAME: e4s-cray-rhel
@ -896,7 +896,6 @@ e4s-cray-rhel-generate:
extends: [ ".generate-cray-rhel", ".e4s-cray-rhel" ]
e4s-cray-rhel-build:
allow_failure: true # libsci_cray.so broken, misses DT_NEEDED for libdl.so
extends: [ ".build", ".e4s-cray-rhel" ]
trigger:
include:
@ -915,10 +914,10 @@ e4s-cray-rhel-build:
variables:
SPACK_CI_STACK_NAME: e4s-cray-sles
e4s-cray-sles-generate:
.e4s-cray-sles-generate:
extends: [ ".generate-cray-sles", ".e4s-cray-sles" ]
e4s-cray-sles-build:
.e4s-cray-sles-build:
allow_failure: true # libsci_cray.so broken, misses DT_NEEDED for libdl.so
extends: [ ".build", ".e4s-cray-sles" ]
trigger:

View File

@ -1,31 +1,27 @@
compilers:
- compiler:
spec: cce@15.0.1
spec: cce@=18.0.0
paths:
cc: cc
cxx: CC
f77: ftn
fc: ftn
cc: /opt/cray/pe/cce/18.0.0/bin/craycc
cxx: /opt/cray/pe/cce/18.0.0/bin/crayCC
f77: /opt/cray/pe/cce/18.0.0/bin/crayftn
fc: /opt/cray/pe/cce/18.0.0/bin/crayftn
flags: {}
operating_system: rhel8
target: any
modules:
- PrgEnv-cray/8.3.3
- cce/15.0.1
environment:
set:
MACHTYPE: x86_64
- compiler:
spec: gcc@11.2.0
paths:
cc: gcc
cxx: g++
f77: gfortran
fc: gfortran
flags: {}
operating_system: rhel8
target: any
modules:
- PrgEnv-gnu
- gcc/11.2.0
target: x86_64
modules: []
environment: {}
extra_rpaths: []
- compiler:
spec: gcc@=8.5.0
paths:
cc: /usr/bin/gcc
cxx: /usr/bin/g++
f77: /usr/bin/gfortran
fc: /usr/bin/gfortran
flags: {}
operating_system: rhel8
target: x86_64
modules: []
environment: {}
extra_rpaths: []

View File

@ -1,16 +1,15 @@
packages:
# EXTERNALS
cray-mpich:
buildable: false
externals:
- spec: cray-mpich@8.1.25 %cce@15.0.1
prefix: /opt/cray/pe/mpich/8.1.25/ofi/cray/10.0
- spec: cray-mpich@8.1.30 %cce
prefix: /opt/cray/pe/mpich/8.1.30/ofi/cray/18.0
modules:
- cray-mpich/8.1.25
- cray-mpich/8.1.30
cray-libsci:
buildable: false
externals:
- spec: cray-libsci@23.02.1.1 %cce@15.0.1
prefix: /opt/cray/pe/libsci/23.02.1.1/CRAY/9.0/x86_64/
- spec: cray-libsci@24.07.0 %cce
prefix: /opt/cray/pe/libsci/24.07.0/CRAY/18.0/x86_64/
modules:
- cray-libsci/23.02.1.1
- cray-libsci/24.07.0

View File

@ -0,0 +1,4 @@
ci:
pipeline-gen:
- build-job:
tags: ["cray-rhel-x86_64_v3"]

View File

@ -1,4 +0,0 @@
ci:
pipeline-gen:
- build-job:
tags: ["cray-rhel-zen4"]

View File

@ -10,8 +10,7 @@ spack:
packages:
all:
prefer:
- "%cce"
require: "%cce@18.0.0 target=x86_64_v3"
compiler: [cce]
providers:
blas: [cray-libsci]
@ -19,17 +18,15 @@ spack:
mpi: [cray-mpich]
tbb: [intel-tbb]
scalapack: [netlib-scalapack]
target: [zen4]
variants: +mpi
ncurses:
require: +termlib ldflags=-Wl,--undefined-version
tbb:
require: "intel-tbb"
binutils:
variants: +ld +gold +headers +libiberty ~nls
boost:
variants: +python +filesystem +iostreams +system
cuda:
version: [11.7.0]
elfutils:
variants: ~nls
require: "%gcc"
@ -39,20 +36,14 @@ spack:
variants: +fortran +hl +shared
libfabric:
variants: fabrics=sockets,tcp,udp,rxm
libunwind:
variants: +pic +xz
mgard:
require:
- "@2023-01-10:"
mpich:
variants: ~wrapperrpath
ncurses:
variants: +termlib
paraview:
# Don't build GUI support or GLX rendering for HPC/container deployments
require: "@5.11 ~qt ^[virtuals=gl] osmesa"
python:
version: [3.8.13]
require: "~qt ^[virtuals=gl] osmesa"
trilinos:
require:
- one_of: [+amesos +amesos2 +anasazi +aztec +boost +epetra +epetraext +ifpack
@ -63,12 +54,6 @@ spack:
- one_of: [~ml ~muelu ~zoltan2 ~teko, +ml +muelu +zoltan2 +teko]
- one_of: [+superlu-dist, ~superlu-dist]
- one_of: [+shylu, ~shylu]
xz:
variants: +pic
mesa:
version: [21.3.8]
unzip:
require: "%gcc"
specs:
# CPU
@ -76,62 +61,43 @@ spack:
- aml
- arborx
- argobots
- bolt
- butterflypack
- boost +python +filesystem +iostreams +system
- cabana
- caliper
- chai
- charliecloud
- conduit
# - cp2k +mpi # libxsmm: ftn-78 ftn: ERROR in command line; The -f option has an invalid argument, "tree-vectorize".
- datatransferkit
- flecsi
- flit
- flux-core
- fortrilinos
- ginkgo
- globalarrays
- gmp
- gotcha
- h5bench
- hdf5-vol-async
- hdf5-vol-cache
- hdf5-vol-cache cflags=-Wno-error=incompatible-function-pointer-types
- hdf5-vol-log
- heffte +fftw
- hpx max_cpu_count=512 networking=mpi
- hypre
- kokkos +openmp
- kokkos-kernels +openmp
- lammps
- legion
- libnrm
#- libpressio +bitgrooming +bzip2 ~cuda ~cusz +fpzip +hdf5 +libdistributed +lua +openmp +python +sz +sz3 +unix +zfp +json +remote +netcdf +mgard # mgard:
- libquo
- libunwind
- mercury
- metall
- mfem
# - mgard +serial +openmp +timing +unstructured ~cuda # mgard
- mpark-variant
- mpifileutils ~xattr
- mpifileutils ~xattr cflags=-Wno-error=implicit-function-declaration
- nccmp
- nco
- netlib-scalapack
- omega-h
- openmpi
- netlib-scalapack cflags=-Wno-error=implicit-function-declaration
- openpmd-api ^adios2~mgard
- papi
- papyrus
- pdt
- petsc
- plumed
- precice
- pumi
- py-h5py +mpi
- py-h5py ~mpi
- py-libensemble +mpi +nlopt
- py-petsc4py
- qthreads scheduler=distrib
- raja
- slate ~cuda
@ -144,8 +110,7 @@ spack:
- swig@4.0.2-fortran
- sz3
- tasmanian
- tau +mpi +python
- trilinos@13.0.1 +belos +ifpack2 +stokhos
- trilinos +belos +ifpack2 +stokhos
- turbine
- umap
- umpire
@ -155,27 +120,47 @@ spack:
# - alquimia # pflotran: petsc-3.19.4-c6pmpdtpzarytxo434zf76jqdkhdyn37/lib/petsc/conf/rules:169: material_aux.o] Error 1: fortran errors
# - amrex # disabled temporarily pending resolution of unreproducible CI failure
# - axom # axom: CMake Error at axom/sidre/cmake_install.cmake:154 (file): file INSTALL cannot find "/tmp/gitlab-runner-2/spack-stage/spack-stage-axom-0.8.1-jvol6riu34vuyqvrd5ft2gyhrxdqvf63/spack-build-jvol6ri/lib/fortran/axom_spio.mod": No such file or directory.
# - bolt # ld.lld: error: CMakeFiles/bolt-omp.dir/kmp_gsupport.cpp.o: symbol GOMP_atomic_end@@GOMP_1.0 has undefined version GOMP_1.0
# - bricks # bricks: clang-15: error: clang frontend command failed with exit code 134 (use -v to see invocation)
# - butterflypack ^netlib-scalapack cflags=-Wno-error=implicit-function-declaration # ftn-2116 ftn: INTERNAL "driver" was terminated due to receipt of signal 01: Hangup.
# - caliper # papi: papi_internal.c:124:3: error: use of undeclared identifier '_papi_hwi_my_thread'; did you mean '_papi_hwi_read'?
# - charliecloud # libxcrypt-4.4.35: ld.lld: error: version script assignment of 'XCRYPT_2.0' to symbol 'xcrypt_r' failed: symbol not defined
# - cp2k +mpi # libxsmm: ftn-78 ftn: ERROR in command line; The -f option has an invalid argument, "tree-vectorize".
# - dealii # llvm@14.0.6: ?; intel-tbb@2020.3: clang-15: error: unknown argument: '-flifetime-dse=1'; assimp@5.2.5: clang-15: error: clang frontend command failed with exit code 134 (use -v to see invocation)
# - dyninst # requires %gcc
# - ecp-data-vis-sdk ~cuda ~rocm +adios2 +ascent +cinema +darshan +faodel +hdf5 +paraview +pnetcdf +sz +unifyfs +veloc ~visit +vtkm +zfp ^hdf5@1.14 # llvm@14.0.6: ?;
# - exaworks # rust: ld.lld: error: relocation R_X86_64_32 cannot be used against local symbol; recompile with -fPIC'; defined in /opt/cray/pe/cce/15.0.1/cce/x86_64/lib/no_mmap.o, referenced by /opt/cray/pe/cce/15.0.1/cce/x86_64/lib/no_mmap.o:(__no_mmap_for_malloc)
# - flux-core # libxcrypt-4.4.35: ld.lld: error: version script assignment of 'XCRYPT_2.0' to symbol 'xcrypt_r' failed: symbol not defined
# - fortrilinos # trilinos-14.0.0: packages/teuchos/core/src/Teuchos_BigUIntDecl.hpp:67:8: error: no type named 'uint32_t' in namespace 'std'
# - gasnet # configure error: User requested --enable-ofi but I don't know how to build ofi programs for your system
# - gptune # py-scipy: meson.build:82:0: ERROR: Unknown compiler(s): [['/home/gitlab-runner-3/builds/dWfnZWPh/0/spack/spack/lib/spack/env/cce/ftn']]
# - hpctoolkit # dyninst requires %gcc
# - hpx max_cpu_count=512 networking=mpi # libxcrypt-4.4.35
# - lammps # lammps-20240829.1: Reversed (or previously applied) patch detected! Assume -R? [n]
# - libpressio +bitgrooming +bzip2 ~cuda ~cusz +fpzip +hdf5 +libdistributed +lua +openmp +python +sz +sz3 +unix +zfp +json +remote +netcdf +mgard # mgard:
# - mgard +serial +openmp +timing +unstructured ~cuda # mgard
# - nrm # py-scipy: meson.build:82:0: ERROR: Unknown compiler(s): [['/home/gitlab-runner-3/builds/dWfnZWPh/0/spack/spack/lib/spack/env/cce/ftn']]
# - nvhpc # requires %gcc
# - omega-h # trilinos-13.4.1: packages/kokkos/core/src/impl/Kokkos_MemoryPool.cpp:112:48: error: unknown type name 'uint32_t'
# - openmpi # libxcrypt-4.4.35: ld.lld: error: version script assignment of 'XCRYPT_2.0' to symbol 'xcrypt_r' failed: symbol not defined
# - papi # papi_internal.c:124:3: error: use of undeclared identifier '_papi_hwi_my_thread'; did you mean '_papi_hwi_read'?
# - parsec ~cuda # parsec: parsec/fortran/CMakeFiles/parsec_fortran.dir/parsecf.F90.o: ftn-2103 ftn: WARNING in command line. The -W extra option is not supported or invalid and will be ignored.
# - phist # fortran_bindings/CMakeFiles/phist_fort.dir/phist_testing.F90.o: ftn-78 ftn: ERROR in command line. The -f option has an invalid argument, "no-math-errno".
# - plasma # %cce conflict
# - plumed # libxcrypt-4.4.35: ld.lld: error: version script assignment of 'XCRYPT_2.0' to symbol 'xcrypt_r' failed: symbol not defined
# - py-h5py +mpi # libxcrypt-4.4.35: ld.lld: error: version script assignment of 'XCRYPT_2.0' to symbol 'xcrypt_r' failed: symbol not defined
# - py-h5py ~mpi # libxcrypt-4.4.35: ld.lld: error: version script assignment of 'XCRYPT_2.0' to symbol 'xcrypt_r' failed: symbol not defined
# - py-jupyterhub # rust: ld.lld: error: relocation R_X86_64_32 cannot be used against local symbol; recompile with -fPIC'; defined in /opt/cray/pe/cce/15.0.1/cce/x86_64/lib/no_mmap.o, referenced by /opt/cray/pe/cce/15.0.1/cce/x86_64/lib/no_mmap.o:(__no_mmap_for_malloc)
# - py-libensemble +mpi +nlopt # libxcrypt-4.4.35: ld.lld: error: version script assignment of 'XCRYPT_2.0' to symbol 'xcrypt_r' failed: symbol not defined
# - py-petsc4py # libxcrypt-4.4.35: ld.lld: error: version script assignment of 'XCRYPT_2.0' to symbol 'xcrypt_r' failed: symbol not defined
# - quantum-espresso # quantum-espresso: CMake Error at cmake/FindSCALAPACK.cmake:503 (message): A required library with SCALAPACK API not found. Please specify library
# - scr # scr: make[2]: *** [examples/CMakeFiles/test_ckpt_F.dir/build.make:112: examples/test_ckpt_F] Error 1: /opt/cray/pe/cce/15.0.1/binutils/x86_64/x86_64-pc-linux-gnu/bin/ld: /opt/cray/pe/mpich/8.1.25/ofi/cray/10.0/lib/libmpi_cray.so: undefined reference to `PMI_Barrier'
# - strumpack ~slate # strumpack: [test/CMakeFiles/test_HSS_seq.dir/build.make:117: test/test_HSS_seq] Error 1: ld.lld: error: undefined reference due to --no-allow-shlib-undefined: mpi_abort_
# - tau +mpi +python # libelf: configure: error: installation or configuration problem: C compiler cannot create executables.; papi: papi_internal.c:124:3: error: use of undeclared identifier '_papi_hwi_my_thread'; did you mean '_papi_hwi_read'?
# - upcxx # upcxx: configure error: User requested --enable-ofi but I don't know how to build ofi programs for your system
# - variorum # variorum: /opt/cray/pe/cce/15.0.1/binutils/x86_64/x86_64-pc-linux-gnu/bin/ld: /opt/cray/pe/lib64/libpals.so.0: undefined reference to `json_array_append_new@@libjansson.so.4'
# - xyce +mpi +shared +pymi +pymi_static_tpls ^trilinos~shylu # openblas: ftn-2307 ftn: ERROR in command line: The "-m" option must be followed by 0, 1, 2, 3 or 4.; make[2]: *** [<builtin>: spotrf2.o] Error 1; make[1]: *** [Makefile:27: lapacklib] Error 2; make: *** [Makefile:250: netlib] Error 2
# - warpx +python # py-scipy: meson.build:82:0: ERROR: Unknown compiler(s): [['/home/gitlab-runner-3/builds/dWfnZWPh/0/spack/spack/lib/spack/env/cce/ftn']]
# - xyce +mpi +shared +pymi +pymi_static_tpls ^trilinos~shylu # openblas: ftn-2307 ftn: ERROR in command line: The "-m" option must be followed by 0, 1, 2, 3 or 4.; make[2]: *** [<builtin>: spotrf2.o] Error 1; make[1]: *** [Makefile:27: lapacklib] Error 2; make: *** [Makefile:250: netlib] Error 2
cdash:
build-group: E4S Cray

View File

@ -36,7 +36,7 @@ export QA_DIR=$(realpath $QA_DIR)
cd "$SPACK_ROOT"
# Run bash tests with coverage enabled, but pipe output to /dev/null
# because it seems that kcov seems to undo the script's redirection
# because it seems that kcov undoes the script's redirection
if [ "$COVERAGE" = true ]; then
kcov "$SPACK_ROOT/coverage" "$QA_DIR/setup-env-test.sh" &> /dev/null
kcov "$SPACK_ROOT/coverage" "$QA_DIR/completion-test.sh" &> /dev/null

View File

@ -0,0 +1,22 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import *
class SpliceDependsOnT(Package):
"""Package that depends on splice-t"""
homepage = "http://www.example.com"
url = "http://www.example.com/splice-depends-on-t-1.0.tar.gz"
version("1.0", md5="0123456789abcdef0123456789abcdef")
depends_on("splice-t")
def install(self, spec, prefix):
with open(prefix.join("splice-depends-on-t"), "w") as f:
f.write("splice-depends-on-t: {0}".format(prefix))
f.write("splice-t: {0}".format(spec["splice-t"].prefix))

View File

@ -12,33 +12,36 @@ class GdkPixbuf(MesonPackage):
GTK+ 2 but it was split off into a separate package in preparation for the change to GTK+ 3."""
homepage = "https://gitlab.gnome.org/GNOME/gdk-pixbuf"
url = "https://ftp.acc.umu.se/pub/gnome/sources/gdk-pixbuf/2.40/gdk-pixbuf-2.40.0.tar.xz"
git = "https://gitlab.gnome.org/GNOME/gdk-pixbuf"
list_url = "https://ftp.acc.umu.se/pub/gnome/sources/gdk-pixbuf/"
url = "https://gitlab.gnome.org/GNOME/gdk-pixbuf/-/archive/2.40.0/gdk-pixbuf-2.40.0.tar.gz"
# Falling back to the gitlab source since the mirror here seems to be broken
# url = "https://ftp.acc.umu.se/pub/gnome/sources/gdk-pixbuf/2.40/gdk-pixbuf-2.40.0.tar.xz"
# list_url = "https://ftp.acc.umu.se/pub/gnome/sources/gdk-pixbuf/"
list_depth = 1
license("LGPL-2.1-or-later", checked_by="wdconinc")
version("2.42.12", sha256="b9505b3445b9a7e48ced34760c3bcb73e966df3ac94c95a148cb669ab748e3c7")
version("2.42.12", sha256="d41966831b3d291fcdfe31f683bea4b3f03241d591ddbe550b5db873af3da364")
# https://nvd.nist.gov/vuln/detail/CVE-2022-48622
version(
"2.42.10",
sha256="ee9b6c75d13ba096907a2e3c6b27b61bcd17f5c7ebeab5a5b439d2f2e39fe44b",
sha256="87a086c51d9705698b22bd598a795efaccf61e4db3a96f439dcb3cd90506dab8",
deprecated=True,
)
version(
"2.42.9",
sha256="28f7958e7bf29a32d4e963556d241d0a41a6786582ff6a5ad11665e0347fc962",
sha256="226d950375907857b23c5946ae6d30128f08cd75f65f14b14334c7a9fb686e36",
deprecated=True,
)
version(
"2.42.6",
sha256="c4a6b75b7ed8f58ca48da830b9fa00ed96d668d3ab4b1f723dcf902f78bde77f",
sha256="c4f3a84a04bc7c5f4fbd97dce7976ab648c60628f72ad4c7b79edce2bbdb494d",
deprecated=True,
)
version(
"2.42.2",
sha256="83c66a1cfd591d7680c144d2922c5955d38b4db336d7cd3ee109f7bcf9afef15",
sha256="249b977279f761979104d7befbb5ee23f1661e29d19a36da5875f3a97952d13f",
deprecated=True,
)

View File

@ -65,9 +65,15 @@ def cmake_args(self):
if re_qt.match(dep.name):
qt_prefix_path.append(self.spec[dep.name].prefix)
# Now append all qt-* dependency prefixex into a prefix path
# Now append all qt-* dependency prefixes into a prefix path
args.append(self.define("QT_ADDITIONAL_PACKAGES_PREFIX_PATH", ":".join(qt_prefix_path)))
# Make our CMAKE_INSTALL_RPATH redundant:
# for the prefix of the current package ($ORIGIN/../lib type of rpaths),
args.append(self.define("QT_DISABLE_RPATH", True))
# for prefixes of dependencies
args.append(self.define("QT_NO_DISABLE_CMAKE_INSTALL_RPATH_USE_LINK_PATH", True))
return args
@run_after("install")