Compare commits


55 Commits

Author SHA1 Message Date
Todd Gamblin
12e3665df3 Bump version on develop to v0.23dev0 (#44137) 2024-05-11 18:01:50 +02:00
Todd Gamblin
fa4778b9fc changelog: add changes from 0.21.1 and 0.21.2 (#44136)
These changes were added to the release branch but did not make it onto `develop`.
2024-05-11 17:45:14 +02:00
Harmen Stoppels
66d297d420 oci: improve default_retry (#44132)
Apparently urllib can throw a range of different exceptions:

1. HTTPError
2. URLError with e.reason set to the actual exception
3. TimeoutError from getresponse, which is not wrapped
2024-05-11 15:43:32 +02:00
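
An illustrative sketch of a wrapper that normalizes those three shapes (hypothetical helper, not Spack's actual `default_retry`):

```python
import urllib.error


def with_default_retry(fn, retries=3):
    """Retry ``fn``, unwrapping the three failure shapes urllib can produce."""

    def wrapper(*args, **kwargs):
        for attempt in range(1, retries + 1):
            try:
                return fn(*args, **kwargs)
            except urllib.error.HTTPError:
                raise  # 1. the server answered; retrying rarely helps
            except urllib.error.URLError as e:
                # 2. the underlying cause is wrapped in e.reason
                if attempt == retries or not isinstance(e.reason, TimeoutError):
                    raise
            except TimeoutError:
                # 3. getresponse() raises TimeoutError without wrapping it
                if attempt == retries:
                    raise

    return wrapper
```
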
James Taliaferro
56251c11f3 qpdf: new package (#44066)
* New package: qpdf

* Update var/spack/repos/builtin/packages/qpdf/package.py

Co-authored-by: Alec Scott <hi@alecbcs.com>

* Fix format of cmake_args

---------

Co-authored-by: Alec Scott <hi@alecbcs.com>
2024-05-10 14:32:57 -06:00
James Smillie
40bf9a179b New package: py-nuitka (#44079) 2024-05-10 12:07:02 -07:00
John W. Parent
095aba0b9f Buildcache/ensure symlinks proper prefix (#43851)
* archive: relative links only

Ensure all links written into tarfiles generated from Spack prefixes do not contain symlinks pointing outside the prefix

* binary_distribution: limit extraction to prefix

Ensure files extracted from spackballs are not links pointing outside of the prefix

* Ensure rpaths are properly set on Windows

* hard error on extraction of absolute links

* refactor for non link-modifying approach

* Restore tarball extraction to original impl

* use custom readlink

* cleanup symlink module

* make lstrip
2024-05-10 13:00:40 -05:00
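
A hedged sketch of the invariant the first two bullets describe — a tar member's link must resolve inside the archive root (illustrative helper, not the PR's implementation):

```python
import os
import tarfile


def link_stays_inside_archive(member: tarfile.TarInfo) -> bool:
    """True if a symlink member cannot escape the archive's root directory."""
    if not member.issym():
        return True
    if os.path.isabs(member.linkname):
        return False  # absolute targets point outside the prefix by construction
    resolved = os.path.normpath(os.path.join(os.path.dirname(member.name), member.linkname))
    return not resolved.startswith(os.pardir)
```
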
John W. Parent
4270136598 Windows: Non config changes to support Gitlab CI (#43965)
* Quote python for shlex

* Remove python path quoting patch

* spack env: Allow `C` "protocol" for config_path

When running Spack on Windows, a path beginning with `C://...` is a valid path; a sketch of the distinction follows this entry.

* Remove makefile from ci rebuild

* GPG use llnl.util.filesystem.getuid

* Cleanup process_command

* Remove unused lines

* Fix typo in encode_path

* Double quote arguments

* Cleanup process_command

* Pass cdash args with =

* Escape parens in CMD script

* escape parens doesn't only apply to paths

* Install deps

* sfn prefix

* use sfn with libxml2

* Add hash to dep install

* WIP

* Review

* Changes missed in prior review commit

* Style

* Ensure we handle Windows paths with config scopes

* clarify docstring

* No more MAKE_COMMAND

* syntax cleanup

* Actually correct is_path_url

* Correct call

* raise on other errors

* url2path behaves differently on unix

* Ensure proper quoting

* actually prepend slash in slash_hash

---------

Co-authored-by: Ryan Krattiger <ryan.krattiger@kitware.com>
Co-authored-by: Mike VanDenburgh <michael.vandenburgh@kitware.com>
2024-05-10 13:00:13 -05:00
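
For the `C` "protocol" bullet above: a hypothetical sketch of telling a Windows drive-letter path apart from a URL scheme (the helper name and regex are assumptions, not the PR's code):

```python
import re


def looks_like_windows_path(value: str) -> bool:
    """A leading drive letter means a filesystem path, not a one-letter URL scheme."""
    return re.match(r"^[A-Za-z]:[/\\]", value) is not None


assert looks_like_windows_path("C://Users/spack/config")
assert not looks_like_windows_path("https://mirror.spack.io")
```
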
Juan Miguel Carceller
f73d7d2dce dd4hep: cleanup recipe, remove deprecated versions and patches (#44110)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-05-10 07:40:38 -07:00
Juan Miguel Carceller
567566da08 edm4hep: cleanup recipe, remove deprecated versions and patches (#44113)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-05-10 07:39:06 -07:00
Chris Marsh
30a9ab749d libtheora: Remove unneeded autoreconf section (#44063)
* Remove autoreconf section that was causing issues with libtool mismatch. Fixes issue #43498
2024-05-10 10:27:20 -04:00
Mikael Simberg
8160a96b27 dla-future: Add 0.4.1 (#44115)
* dla-future: Add 0.4.1

* Use ninja as generator in dla-future
2024-05-10 15:21:47 +02:00
Mikael Simberg
10414d3e6c pika: Add 0.25.0 (#44114) 2024-05-10 14:37:31 +02:00
Stephen Sachs
1d96c09094 libhugetlbfs: Fix the build with an update to 2.24 (#44059)
Co-authored-by: Stephen Sachs <stesachs@amazon.com>
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2024-05-10 10:50:36 +02:00
Harmen Stoppels
e7112fbc6a PythonExtension: fix issue where package does not extend python (#44109) 2024-05-10 10:47:37 +02:00
Tom Scogland
b79761b7eb flux-sched: set the version if the ver file is missing (#44068)
* flux-sched: set the version if the ver file is missing

problem: flux-sched needs a version, it normally gets this from a
release tarball or from git tags, but if using a source archive or a git
clone without tags the version is missing

solution: set the version through cmake based on the version spack sees
when the version file is missing

* Update var/spack/repos/builtin/packages/flux-sched/package.py

Co-authored-by: Alec Scott <hi@alecbcs.com>

---------

Co-authored-by: Alec Scott <hi@alecbcs.com>
2024-05-09 11:50:42 -07:00
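
A minimal sketch of the recipe-side fix described above, assuming a CMake cache variable named `FLUX_SCHED_VER` (the real variable name may differ):

```python
from spack.package import *


class FluxSched(CMakePackage):
    """Sketch only: hand CMake the version Spack already resolved when the
    source tree carries no release version file."""

    def cmake_args(self):
        return [self.define("FLUX_SCHED_VER", str(self.spec.version))]
```
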
Christopher Christofi
3381899c69 miniforge3: add new versions with mamba installation (#43995)
* miniforge3: add new version with mamba installation
* fix styling
* update maintainers
* Fix variant directive ordering
2024-05-09 12:48:40 -06:00
Massimiliano Culpo
c7cf5eabc1 Fix filtering external specs (#44093)
When an include filter on externals is present, implicitly
include libcs.

Also, do not penalize deprecated versions if they come
from externals.
2024-05-09 18:50:15 +02:00
Juan Miguel Carceller
d88fa5cf8e pythia8: add a cxxstd variant (#44077)
* pythia8: add a cxxstd variant
* Add multi=False, fix regexp

---------

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-05-09 09:32:38 -07:00
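
The `multi=False` bullet corresponds to a single-valued variant; a sketch of the pattern (the value list is an assumption):

```python
from spack.package import *


class Pythia8(AutotoolsPackage):
    """Sketch of a single-valued C++-standard variant."""

    variant(
        "cxxstd",
        default="11",
        values=("11", "14", "17", "20"),
        multi=False,  # exactly one standard may be selected
        description="C++ standard used for the build",
    )
```
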
Richard Berger
2ed0e3d737 kokkos-nvcc-wrapper: add missing versions (#44089) 2024-05-09 09:23:16 -07:00
Julien Cortial
506a40cac1 mmg: Build & install shared libraries, add 5.7.2 & 5.7.3 (#43749) 2024-05-09 16:59:53 +02:00
Rémi Lacroix
447739fcef Universal Ctags: Add version 6.1.20240505.0 (#44058) 2024-05-09 15:55:41 +02:00
Massimiliano Culpo
e60f6f4a6e CI/Update macOS runners: macos-latest switched to macos-14 (#44094)
macos-latest switched to macos-14, so now we are running
two identical jobs.
2024-05-09 14:28:17 +02:00
Vanessasaurus
7df35d0da0 package: libsodium update URL to GitHub (#44090)
Problem: the libsodium download endpoints are not reliable,
they fail multiple times a day for several of our automated
pipelines.
Solution: use the GitHub releases and branches.

Signed-off-by: vsoch <vsoch@users.noreply.github.com>
2024-05-09 14:26:02 +02:00
Luc Berger
71b035ece1 Kokkos: adding new release 4.3.01 (#44086) 2024-05-09 03:33:35 -06:00
Ashwin Kumar Karnad
86a134235e Octopus 14.1: Add new version hash (#44083)
* octopus: add hash for new version
* octopus: add --enable-silent-deprecation flag when @14.1:
2024-05-09 03:18:22 -06:00
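
A sketch of how the version-gated configure flag from the second bullet might look in the recipe (hedged; not the merged code):

```python
from spack.package import *


class Octopus(AutotoolsPackage):
    """Sketch: only pass the flag to releases that understand it."""

    def configure_args(self):
        args = []
        if self.spec.satisfies("@14.1:"):
            args.append("--enable-silent-deprecation")
        return args
```
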
Jon Rood
24cd0da7fb amr-wind: add missing variants (#44078)
* amr-wind: add missing variants

* Fix copy and paste.

* waves2amr is only available on amr-wind@2.0

* Style.

* Use satisfies.
2024-05-09 02:38:16 -06:00
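
A hedged sketch combining the last two bullets — gate the variant on the versions that support it and use `Spec.satisfies` rather than string comparisons (the CMake option name is assumed):

```python
from spack.package import *


class AmrWind(CMakePackage):
    """Sketch of a version-gated variant wired to a CMake option."""

    variant("waves2amr", default=False, when="@2.0:", description="Wave-to-AMR coupling")

    def cmake_args(self):
        return [self.define_from_variant("AMR_WIND_ENABLE_WAVES2AMR", "waves2amr")]
```
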
alvaro-sch
762833a663 Orca (standalone) 5.0.4 (#44011)
* orca: added latest version 5.0.4
* Fixed openmpi versions
2024-05-08 15:36:27 -07:00
Ken Raffenetti
636d479e5f mpich: Add license (#43821) 2024-05-08 12:07:46 -07:00
Sreenivasa Murthy Kolam
f2184f26fa hipsparselt: new package (#44080)
* Initial commit for adding hipsparselt recipe
* correct the style errors
* remove master version
2024-05-08 12:03:21 -07:00
Harmen Stoppels
e1686eef7c gcc: use -idirafter for libc headers (#44081)
GCC C++ headers like cstdlib use `#include_next <stdlib.h>` to wrap libc
headers. We're using `-isystem` for libc, which puts those headers too
early in the search path. `-idirafter` fixes this so `include_next`
works.
2024-05-08 20:45:04 +02:00
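
A small illustrative helper showing the flag choice the message explains; the function and its name are hypothetical:

```python
def libc_include_flags(libc_prefix: str) -> list:
    """Put libc headers *after* GCC's own search path so that the C++
    wrapper headers' `#include_next <stdlib.h>` still resolves."""
    # -isystem would place libc headers ahead of GCC's fixed includes and
    # break include_next; -idirafter appends them to the end of the order.
    return ["-idirafter", f"{libc_prefix}/include"]
```
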
Adam J. Stewart
314893982e JAX: add v0.4.27, NCCL variant (#44071) 2024-05-08 11:36:24 -07:00
Jake Koester
9ab6c30a3d add flag to turn off building tests and examples (#44075) 2024-05-08 11:33:49 -07:00
Adam J. Stewart
ddf94291d4 py-jsonargparse: add v4.28.0 (#44074) 2024-05-08 11:31:07 -07:00
Jon Rood
5d1038c512 hypre: add cublas and rocblas variants (#44038)
* Add cublas and rocblas variants to hypre.

* Fix mistakes.

* Remove newline.

* Address suggestions.
2024-05-08 12:04:11 -06:00
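
A sketch of the variant shape such a change typically adds — GPU BLAS backends tied to the matching GPU stack (names hedged, not hypre's exact recipe):

```python
from spack.package import *


class Hypre(AutotoolsPackage):
    """Sketch: GPU-BLAS variants only make sense with the matching GPU runtime."""

    variant("cublas", default=False, when="+cuda", description="Use cuBLAS")
    variant("rocblas", default=False, when="+rocm", description="Use rocBLAS")

    depends_on("rocblas", when="+rocblas")
```
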
renjithravindrankannath
2e40c88d50 Bump up the version for ROCm-6.1.0 (#43843)
* Bump up the version for ROCm-6.1.0
* Style check error correction and patch files
* Update for rocm-openmp-extras 6.1
* updating rocm-openmp-extras and math libraries with 6.1
* Style check error correction
* updating hipcub, hipfort & miopen-hip for 6.1
* Rocm-openmp-extras and some mathlib updates
* iAudit error correction and rocmlir update
* Updating dependency on suite-sparse and adding path
* Style check error correction
* hip-tensor 6.1.0 update
* rdc 6.1 needs grpc 1.59.1
* rvs 6.1 updates and patch
2024-05-08 10:26:30 -07:00
dependabot[bot]
2bcba57757 build(deps): bump actions/checkout from 4.1.4 to 4.1.5 (#44045)
Bumps [actions/checkout](https://github.com/actions/checkout) from 4.1.4 to 4.1.5.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](0ad4b8fada...44c2b7a8a4)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-08 09:22:48 -07:00
shanedsnyder
37330e5e2b darshan-runtime,darshan-util,py-darshan: add darshan-3.4.5 packages (#43989)
* add darshan-3.4.5 packages, update URLs
* py-setuptools version switches for py-darshan
* more py-darshan test dependencies
* try a conditional importlib_resources dependency
2024-05-08 09:06:24 -07:00
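
The "conditional importlib_resources dependency" bullet is the standard backport pattern: `importlib.resources` is in the standard library from Python 3.9, so only older interpreters need the package. A hedged sketch:

```python
from spack.package import *


class PyDarshan(PythonPackage):
    """Sketch of a Python-version-conditional backport dependency."""

    depends_on("py-importlib-resources", when="^python@:3.8", type=("build", "run"))
```
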
snehring
b4411cf2db iq-tree: add new version, delete duplicate package (#44043)
* iq-tree: add new version, delete duplicate package
* Replace iqtree2 dependency
   orthofinder: replace iqtree2 with iq-tree@2
   py-phylophlan: replace iqtree2 with iq-tree@2
2024-05-08 09:02:06 -07:00
Harmen Stoppels
65d1ae083c gitlab ci: tutorial: add julia and vim (#44073) 2024-05-08 14:18:12 +02:00
andriish
0b8faa3918 cernlib: add 2023.08.14.0-free (#40211)
* Update CERNLIB

Update CERNLIB

* Update package.py

* Update package.py

* [@spackbot] updating style on behalf of andriish

* cernlib: merge crypto->crypt patches

* cernlib: depends_on xbae when `@2023:`

* cernlib: patch for Xbae and Xm link order (DSO)

* [@spackbot] updating style on behalf of andriish

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-05-08 06:58:07 -05:00
Alec Scott
f077c7e33b unmaintained pkg bump: 2024-05-05 (#44021)
* unmaintained pkgs: bump versions

* Changes following review feedback

* glm: update cmake dependency

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-05-08 05:53:49 -06:00
Massimiliano Culpo
9d7410d22e gcc: add v14.1.0 (#44053) 2024-05-08 12:14:44 +02:00
Tamara Dahlgren
e295730d0e mold: Replace maintainer (#44047)
* Remove maintainer at his request

* Add msimberg as the maintainer
2024-05-08 09:29:43 +02:00
Todd Gamblin
868327ee14 r: patch R-CVE-2024-27322 for r@3.5:4.3.3 (#44050)
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-05-08 08:56:37 +02:00
Massimiliano Culpo
f5430b16bc Bump removal version in deprecation messages (#44064) 2024-05-08 08:49:14 +02:00
Tamara Dahlgren
2446695113 Remove dead environment creation code (#44065) 2024-05-07 22:49:06 -06:00
snehring
e0c6cca65c orca: switching to xz archives, removing old version (#44035) 2024-05-07 15:18:53 -07:00
Auriane R
84ed4cd331 Add transformer engine package (#43982)
* Add py-flash-attn@2.4.2
* Add py-transformer-engine package

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-05-07 14:56:34 -07:00
Juan Miguel Carceller
f6d50f790e gaudi: Add version 38.1 (#44055)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-05-07 14:20:02 -07:00
Tim Haines
d3c3d23d1e Dyninst: update compiler requirements (#44033)
As of 13.0.0, Dyninst can now build with any Linux compiler.
2024-05-07 15:03:17 -06:00
Vicente Bolea
33752c2b55 fix(adios2): fix missing stdint include in 2.7 (#43786) 2024-05-07 15:52:20 -05:00
Harmen Stoppels
26759249ca gitlab: dont build paraview for neoverse v2 (#44060) 2024-05-07 18:59:12 +02:00
dependabot[bot]
8b4cbbe7b3 build(deps): bump pygments from 2.17.2 to 2.18.0 in /lib/spack/docs (#44044)
Bumps [pygments](https://github.com/pygments/pygments) from 2.17.2 to 2.18.0.
- [Release notes](https://github.com/pygments/pygments/releases)
- [Changelog](https://github.com/pygments/pygments/blob/master/CHANGES)
- [Commits](https://github.com/pygments/pygments/compare/2.17.2...2.18.0)

---
updated-dependencies:
- dependency-name: pygments
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-07 18:55:04 +02:00
Richarda Butler
be71f9fdc4 Include concrete environments with include_concrete (#33768)
Add the ability to include any number of (potentially nested) concrete environments, e.g.:

```yaml
spack:
  specs: []
  concretizer:
    unify: true
  include_concrete:
  - /path/to/environment1
  - /path/to/environment2
```

or, from the CLI:

```console
   $ spack env create myenv
   $ spack -e myenv add python
   $ spack -e myenv concretize
   $ spack env create --include-concrete myenv included_env
```

The contents of included concrete environments' spack.lock files are
included in the environment's lock file at creation time. Any changes
to included concrete environments are only reflected once the including environment
is re-concretized against the re-concretized included environments.

- [x] Concretize included envs
- [x] Save concrete specs in memory by hash
- [x] Add included envs to combined env's lock file
- [x] Add test
- [x] Update documentation

    Co-authored-by: Kayla Butler <butler59@llnl.gov>
    Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
    Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
    Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-05-07 09:32:40 -07:00
Massimiliano Culpo
05c1e7ecc2 Update the tutorial command to point to releases/v0.22 (#44056) 2024-05-07 17:56:29 +02:00
377 changed files with 3970 additions and 4184 deletions

@@ -28,7 +28,7 @@ jobs:
       run:
         shell: ${{ matrix.system.shell }}
     steps:
-      - uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
+      - uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
       - uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
         with:
           python-version: ${{inputs.python_version}}

@@ -37,7 +37,7 @@ jobs:
         make patch unzip which xz python3 python3-devel tree \
         cmake bison
     - name: Checkout
-      uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
+      uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
       with:
         fetch-depth: 0
     - name: Bootstrap clingo
@@ -60,7 +60,7 @@ jobs:
       run: |
         brew install cmake bison tree
     - name: Checkout
-      uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
+      uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
       with:
         fetch-depth: 0
     - uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
@@ -92,7 +92,7 @@ jobs:
       run: |
         sudo rm -rf $(which gpg) $(which gpg2) $(which patchelf)
     - name: Checkout
-      uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
+      uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
       with:
         fetch-depth: 0
     - name: Bootstrap GnuPG
@@ -121,7 +121,7 @@ jobs:
       run: |
         sudo rm -rf $(which gpg) $(which gpg2) $(which patchelf)
     - name: Checkout
-      uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
+      uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
       with:
         fetch-depth: 0
     - uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d

@@ -56,7 +56,7 @@ jobs:
     if: github.repository == 'spack/spack'
     steps:
     - name: Checkout
-      uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
+      uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
     - uses: docker/metadata-action@8e5442c4ef9f78752691e2d8f8d19755c6f78e81
       id: docker_meta

@@ -36,7 +36,7 @@ jobs:
       core: ${{ steps.filter.outputs.core }}
       packages: ${{ steps.filter.outputs.packages }}
     steps:
-    - uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
+    - uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
       if: ${{ github.event_name == 'push' }}
       with:
         fetch-depth: 0
@@ -77,8 +77,13 @@ jobs:
     needs: [ prechecks, changes ]
     uses: ./.github/workflows/unit_tests.yaml
     secrets: inherit
+  windows:
+    if: ${{ github.repository == 'spack/spack' && needs.changes.outputs.core == 'true' }}
+    needs: [ prechecks ]
+    uses: ./.github/workflows/windows_python.yml
+    secrets: inherit
   all:
-    needs: [ unit-tests, bootstrap ]
+    needs: [ windows, unit-tests, bootstrap ]
     runs-on: ubuntu-latest
     steps:
     - name: Success

@@ -14,7 +14,7 @@ jobs:
   build-paraview-deps:
     runs-on: windows-latest
     steps:
-    - uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
+    - uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
       with:
         fetch-depth: 0
     - uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d

@@ -51,7 +51,7 @@ jobs:
         on_develop: false
     steps:
-    - uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
+    - uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
       with:
         fetch-depth: 0
     - uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
@@ -100,7 +100,7 @@ jobs:
   shell:
     runs-on: ubuntu-latest
     steps:
-    - uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
+    - uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
       with:
         fetch-depth: 0
     - uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
@@ -141,7 +141,7 @@ jobs:
         dnf install -y \
           bzip2 curl file gcc-c++ gcc gcc-gfortran git gnupg2 gzip \
           make patch tcl unzip which xz
-    - uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
+    - uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
     - name: Setup repo and non-root user
       run: |
         git --version
@@ -160,7 +160,7 @@ jobs:
   clingo-cffi:
     runs-on: ubuntu-latest
     steps:
-    - uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
+    - uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
       with:
         fetch-depth: 0
     - uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
@@ -198,7 +198,7 @@ jobs:
         os: [macos-13, macos-14]
         python-version: ["3.11"]
     steps:
-    - uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
+    - uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
       with:
         fetch-depth: 0
     - uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
@@ -228,34 +228,3 @@ jobs:
           flags: unittests,macos
           token: ${{ secrets.CODECOV_TOKEN }}
           verbose: true
-  # Run unit tests on Windows
-  windows:
-    defaults:
-      run:
-        shell:
-          powershell Invoke-Expression -Command "./share/spack/qa/windows_test_setup.ps1"; {0}
-    runs-on: windows-latest
-    steps:
-    - uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
-      with:
-        fetch-depth: 0
-    - uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
-      with:
-        python-version: 3.9
-    - name: Install Python packages
-      run: |
-        python -m pip install --upgrade pip pywin32 setuptools pytest-cov clingo
-    - name: Create local develop
-      run: |
-        ./.github/workflows/setup_git.ps1
-    - name: Unit Test
-      run: |
-        spack unit-test -x --verbose --cov --cov-config=pyproject.toml
-        ./share/spack/qa/validate_last_exit.ps1
-        coverage combine -a
-        coverage xml
-    - uses: codecov/codecov-action@125fc84a9a348dbcf27191600683ec096ec9021c
-      with:
-        flags: unittests,windows
-        token: ${{ secrets.CODECOV_TOKEN }}
-        verbose: true

@@ -18,7 +18,7 @@ jobs:
   validate:
     runs-on: ubuntu-latest
     steps:
-    - uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
+    - uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
    - uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
       with:
         python-version: '3.11'
@@ -35,7 +35,7 @@ jobs:
   style:
     runs-on: ubuntu-latest
     steps:
-    - uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
+    - uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
       with:
         fetch-depth: 0
     - uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
@@ -70,7 +70,7 @@ jobs:
         dnf install -y \
           bzip2 curl file gcc-c++ gcc gcc-gfortran git gnupg2 gzip \
           make patch tcl unzip which xz
-    - uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
+    - uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
     - name: Setup repo and non-root user
       run: |
         git --version

.github/workflows/windows_python.yml (new file, 83 lines)

@@ -0,0 +1,83 @@
+name: windows
+on:
+  workflow_call:
+concurrency:
+  group: windows-${{github.ref}}-${{github.event.pull_request.number || github.run_number}}
+  cancel-in-progress: true
+defaults:
+  run:
+    shell:
+      powershell Invoke-Expression -Command "./share/spack/qa/windows_test_setup.ps1"; {0}
+jobs:
+  unit-tests:
+    runs-on: windows-latest
+    steps:
+    - uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
+      with:
+        fetch-depth: 0
+    - uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
+      with:
+        python-version: 3.9
+    - name: Install Python packages
+      run: |
+        python -m pip install --upgrade pip pywin32 setuptools pytest-cov clingo
+    - name: Create local develop
+      run: |
+        ./.github/workflows/setup_git.ps1
+    - name: Unit Test
+      run: |
+        spack unit-test -x --verbose --cov --cov-config=pyproject.toml --ignore=lib/spack/spack/test/cmd
+        ./share/spack/qa/validate_last_exit.ps1
+        coverage combine -a
+        coverage xml
+    - uses: codecov/codecov-action@5ecb98a3c6b747ed38dc09f787459979aebb39be
+      with:
+        flags: unittests,windows
+        token: ${{ secrets.CODECOV_TOKEN }}
+        verbose: true
+  unit-tests-cmd:
+    runs-on: windows-latest
+    steps:
+    - uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
+      with:
+        fetch-depth: 0
+    - uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
+      with:
+        python-version: 3.9
+    - name: Install Python packages
+      run: |
+        python -m pip install --upgrade pip pywin32 setuptools coverage pytest-cov clingo
+    - name: Create local develop
+      run: |
+        ./.github/workflows/setup_git.ps1
+    - name: Command Unit Test
+      run: |
+        spack unit-test -x --verbose --cov --cov-config=pyproject.toml lib/spack/spack/test/cmd
+        ./share/spack/qa/validate_last_exit.ps1
+        coverage combine -a
+        coverage xml
+    - uses: codecov/codecov-action@5ecb98a3c6b747ed38dc09f787459979aebb39be
+      with:
+        flags: unittests,windows
+        token: ${{ secrets.CODECOV_TOKEN }}
+        verbose: true
+  build-abseil:
+    runs-on: windows-latest
+    steps:
+    - uses: actions/checkout@44c2b7a8a4ea60a981eaca3cf939b5f4305c123b
+      with:
+        fetch-depth: 0
+    - uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
+      with:
+        python-version: 3.9
+    - name: Install Python packages
+      run: |
+        python -m pip install --upgrade pip pywin32 setuptools coverage
+    - name: Build Test
+      run: |
+        spack compiler find
+        spack -d external find cmake ninja
+        spack -d install abseil-cpp


@@ -1,387 +1,3 @@
# v0.22.2 (2024-09-21)
## Bugfixes
- Forward compatibility with Spack 0.23 packages with language dependencies (#45205, #45191)
- Forward compatibility with `urllib` from Python 3.12.6+ (#46453, #46483)
- Bump vendored `archspec` for better aarch64 support (#45721, #46445)
- Support macOS Sequoia (#45018, #45127)
- Fix regression in `{variants.X}` and `{variants.X.value}` format strings (#46206)
- Ensure shell escaping of environment variable values in load and activate commands (#42780)
- Fix an issue where `spec[pkg]` considers specs outside the current DAG (#45090)
- Do not halt concretization on unknown variants in externals (#45326)
- Improve validation of `develop` config section (#46485)
- Explicitly disable `ccache` if turned off in config, to avoid cache pollution (#45275)
- Improve backwards compatibility in `include_concrete` (#45766)
- Fix issue where package tags were sometimes repeated (#45160)
- Make `setup-env.sh` "sourced only" by dropping execution bits (#45641)
- Make certain source/binary fetch errors recoverable instead of a hard error (#45683)
- Remove debug statements in package hash computation (#45235)
- Remove redundant clingo warnings (#45269)
- Remove hard-coded layout version (#45645)
- Do not initialize previous store state in `use_store` (#45268)
- Docs improvements (#46475)
## Package updates
- `chapel` major update (#42197, #44931, #45304)
# v0.22.1 (2024-07-04)
## Bugfixes
- Fix reuse of externals on Linux (#44316)
- Ensure parent gcc-runtime version >= child (#44834, #44870)
- Ensure the latest gcc-runtime is rpath'ed when multiple exist among link deps (#44219)
- Improve version detection of glibc (#44154)
- Improve heuristics for solver (#44893, #44976, #45023)
- Make strong preferences override reuse (#44373)
- Reduce verbosity when C compiler is missing (#44182)
- Make missing ccache executable an error when required (#44740)
- Make every environment view containing `python` a `venv` (#44382)
- Fix external detection for compilers with os but no target (#44156)
- Fix version optimization for roots (#44272)
- Handle common implementations of pagination of tags in OCI build caches (#43136)
- Apply fetched patches to develop specs (#44950)
- Avoid Windows wrappers for filesystem utilities on non-Windows (#44126)
- Fix issue with long filenames in build caches on Windows (#43851)
- Fix formatting issue in `spack audit` (#45045)
- CI fixes (#44582, #43965, #43967, #44279, #44213)
## Package updates
- protobuf: fix 3.4:3.21 patch checksum (#44443)
- protobuf: update hash for patch needed when="@3.4:3.21" (#44210)
- git: bump v2.39 to 2.45; deprecate unsafe versions (#44248)
- gcc: use -rpath {rpath_dir} not -rpath={rpath dir} (#44315)
- Remove mesa18 and libosmesa (#44264)
- Enforce consistency of `gl` providers (#44307)
- Require libiconv for iconv (#44335, #45026).
Notice that glibc/musl also provide iconv, but are not guaranteed to be
complete. Set `packages:iconv:require:[glibc]` to restore the old behavior.
- py-matplotlib: qualify when to do a post install (#44191)
- rust: fix v1.78.0 instructions (#44127)
- suite-sparse: improve setting of the `libs` property (#44214)
- netlib-lapack: provide blas and lapack together (#44981)
# v0.22.0 (2024-05-12)
`v0.22.0` is a major feature release.
## Features in this release
1. **Compiler dependencies**
We are in the process of making compilers proper dependencies in Spack, and a number
of changes in `v0.22` support that effort. You may notice nodes in your dependency
graphs for compiler runtime libraries like `gcc-runtime` or `libgfortran`, and you
may notice that Spack graphs now include `libc`. We've also begun moving compiler
configuration from `compilers.yaml` to `packages.yaml` to make it consistent with
other externals. We are trying to do this with the least disruption possible, so
your existing `compilers.yaml` files should still work. We expect to be done with
this transition by the `v0.23` release in November.
* #41104: Packages compiled with `%gcc` on Linux, macOS and FreeBSD now depend on a
new package `gcc-runtime`, which contains a copy of the shared compiler runtime
libraries. This enables gcc runtime libraries to be installed and relocated when
using a build cache. When building minimal Spack-generated container images it is
no longer necessary to install libgfortran, libgomp etc. using the system package
manager.
* #42062: Packages compiled with `%oneapi` now depend on a new package
`intel-oneapi-runtime`. This is similar to `gcc-runtime`, and the runtimes can
provide virtuals and compilers can inject dependencies on virtuals into compiled
packages. This allows us to model library soname compatibility and allows
compilers like `%oneapi` to provide virtuals like `sycl` (which can also be
provided by standalone libraries). Note that until we have an agreement in place
with intel, Intel packages are marked `redistribute(source=False, binary=False)`
and must be downloaded outside of Spack.
* #43272: changes to the optimization criteria of the solver improve the hit-rate of
buildcaches by a fair amount. The solver uses more relaxed compatibility rules and will
not try to strictly match compilers or targets of reused specs. Users can still
enforce the previous strict behavior with `require:` sections in `packages.yaml`.
Note that to enforce correct linking, Spack will *not* reuse old `%gcc` and
`%oneapi` specs that do not have the runtime libraries as a dependency.
* #43539: Spack will reuse specs built with compilers that are *not* explicitly
configured in `compilers.yaml`. Because we can now keep runtime libraries in build
cache, we do not require you to also have a local configured compiler to *use* the
runtime libraries. This improves reuse in buildcaches and avoids conflicts with OS
updates that happen underneath Spack.
* #43190: binary compatibility on `linux` is now based on the `libc` version,
instead of on the `os` tag. Spack builds now detect the host `libc` (`glibc` or
`musl`) and add it as an implicit external node in the dependency graph. Binaries
with a `libc` with the same name and a version less than or equal to that of the
detected `libc` can be reused. This is only on `linux`, not `macos` or `Windows`.
* #43464: each package that can provide a compiler is now detectable using `spack
external find`. External packages defining compiler paths are effectively used as
compilers, and `spack external find -t compiler` can be used as a substitute for
`spack compiler find`. More details on this transition are in
[the docs](https://spack.readthedocs.io/en/latest/getting_started.html#manual-compiler-configuration)
2. **Improved `spack find` UI for Environments**
If you're working in an environment, you likely care about:
* What are the roots
* Which ones are installed / not installed
* What's been added that still needs to be concretized
We've tweaked `spack find` in environments to show this information much more
clearly. Installation status is shown next to each root, so you can see what is
installed. Roots are also shown in bold in the list of installed packages. There is
also a new option for `spack find -r` / `--only-roots` that will only show env
roots, if you don't want to look at all the installed specs.
More details in #42334.
3. **Improved command-line string quoting**
We are making some breaking changes to how Spack parses specs on the CLI in order to
respect shell quoting instead of trying to fight it. If you (sadly) had to write
something like this on the command line:
```
spack install zlib cflags=\"-O2 -g\"
```
That will now result in an error, but you can now write what you probably expected
to work in the first place:
```
spack install zlib cflags="-O2 -g"
```
Quoted values can now include special characters, so you can supply flags like:
```
spack install zlib ldflags='-Wl,-rpath=$ORIGIN/_libs'
```
To reduce ambiguity in parsing, we now require that you *not* put spaces around `=`
and `==` when specifying flags or variants. This would not have broken before but will now
result in an error:
```
spack install zlib cflags = "-O2 -g"
```
More details and discussion in #30634.
4. **Revert default `spack install` behavior to `--reuse`**
We changed the default concretizer behavior from `--reuse` to `--reuse-deps` in
#30990 (in `v0.20`), which meant that *every* `spack install` invocation would
attempt to build a new version of the requested package / any environment roots.
While this is a common ask for *upgrading* and for *developer* workflows, we don't
think it should be the default for a package manager.
We are going to try to stick to this policy:
1. Prioritize reuse and build as little as possible by default.
2. Only upgrade or install duplicates if they are explicitly asked for, or if there
is a known security issue that necessitates an upgrade.
With the install command you now have three options:
* `--reuse` (default): reuse as many existing installations as possible.
* `--reuse-deps` / `--fresh-roots`: upgrade (freshen) roots but reuse dependencies if possible.
* `--fresh`: install fresh versions of requested packages (roots) and their dependencies.
We've also introduced `--fresh-roots` as an alias for `--reuse-deps` to make it more clear
that it may give you fresh versions. More details in #41302 and #43988.
5. **More control over reused specs**
You can now control which packages to reuse and how. There is a new
`concretizer:reuse` config option, which accepts the following properties:
- `roots`: `true` to reuse roots, `false` to reuse just dependencies
- `exclude`: list of constraints used to select which specs *not* to reuse
- `include`: list of constraints used to select which specs *to* reuse
- `from`: list of sources for reused specs (some combination of `local`,
`buildcache`, or `external`)
For example, to reuse only specs compiled with GCC, you could write:
```yaml
concretizer:
  reuse:
    roots: true
    include:
    - "%gcc"
```
Or, if `openmpi` must be used from externals, and it must be the only external used:
```yaml
concretizer:
  reuse:
    roots: true
    from:
    - type: local
      exclude: ["openmpi"]
    - type: buildcache
      exclude: ["openmpi"]
    - type: external
      include: ["openmpi"]
```
6. **New `redistribute()` directive**
Some packages can't be redistributed in source or binary form. We need an explicit
way to say that in a package.
Now there is a `redistribute()` directive so that package authors can write:
```python
class MyPackage(Package):
    redistribute(source=False, binary=False)
```
Like other directives, this works with `when=`:
```python
class MyPackage(Package):
    # 12.0 and higher are proprietary
    redistribute(source=False, binary=False, when="@12.0:")
    # can't redistribute when we depend on some proprietary dependency
    redistribute(source=False, binary=False, when="^proprietary-dependency")
```
More in #20185.
7. **New `conflict:` and `prefer:` syntax for package preferences**
Previously, you could express conflicts and preferences in `packages.yaml` through
some contortions with `require:`:
```yaml
packages:
  zlib-ng:
    require:
    - one_of: ["%clang", "@:"]  # conflict on %clang
    - any_of: ["+shared", "@:"]  # strong preference for +shared
```
You can now use `conflict:` and `prefer:` for a much more readable configuration:
```yaml
packages:
  zlib-ng:
    conflict:
    - "%clang"
    prefer:
    - "+shared"
```
See [the documentation](https://spack.readthedocs.io/en/latest/packages_yaml.html#conflicts-and-strong-preferences)
and #41832 for more details.
8. **`include_concrete` in environments**
You may want to build on the *concrete* contents of another environment without
changing that environment. You can now include the concrete specs from another
environment's `spack.lock` with `include_concrete`:
```yaml
spack:
  specs: []
  concretizer:
    unify: true
  include_concrete:
  - /path/to/environment1
  - /path/to/environment2
```
Now, when *this* environment is concretized, it will bring in the already concrete
specs from `environment1` and `environment2`, and build on top of them without
changing them. This is useful if you have phased deployments, where old deployments
should not be modified but you want to use as many of them as possible. More details
in #33768.
9. **`python-venv` isolation**
Spack has unique requirements for Python because it:
1. installs every package in its own independent directory, and
2. allows users to register *external* python installations.
External installations may contain their own installed packages that can interfere
with Spack installations, and some distributions (Debian and Ubuntu) even change the
`sysconfig` in ways that alter the installation layout of installed Python packages
(e.g., with the addition of a `/local` prefix on Debian or Ubuntu). To isolate Spack
from these and other issues, we now insert a small `python-venv` package in between
`python` and packages that need to install Python code. This isolates Spack's build
environment, isolates Spack from any issues with an external python, and resolves a
large number of issues we've had with Python installations.
See #40773 for further details.
## New commands, options, and directives
* Allow packages to be pushed to build cache after install from source (#42423)
* `spack develop`: stage build artifacts in same root as non-dev builds #41373
* Don't delete `spack develop` build artifacts after install (#43424)
* `spack find`: add options for local/upstream only (#42999)
* `spack logs`: print log files for packages (either partially built or installed) (#42202)
* `patch`: support reversing patches (#43040)
* `develop`: Add -b/--build-directory option to set build_directory package attribute (#39606)
* `spack list`: add `--namespace` / `--repo` option (#41948)
* directives: add `checked_by` field to `license()`, add some license checks
* `spack gc`: add options for environments and build dependencies (#41731)
* Add `--create` to `spack env activate` (#40896)
## Performance improvements
* environment.py: fix excessive re-reads (#43746)
* ruamel yaml: fix quadratic complexity bug (#43745)
* Refactor to improve `spec format` speed (#43712)
* Do not acquire a write lock on the env post install if no views (#43505)
* asp.py: fewer calls to `spec.copy()` (#43715)
* spec.py: early return in `__str__`
* avoid `jinja2` import at startup unless needed (#43237)
## Other new features of note
* `archspec`: update to `v0.2.4`: support for Windows, bugfixes for `neoverse-v1` and
`neoverse-v2` detection.
* `spack config get`/`blame`: with no args, show entire config
* `spack env create <env>`: dir if dir-like (#44024)
* ASP-based solver: update os compatibility for macOS (#43862)
* Add handling of custom ssl certs in urllib ops (#42953)
* Add ability to rename environments (#43296)
* Add config option and compiler support to reuse across OS's (#42693)
* Support for prereleases (#43140)
* Only reuse externals when configured (#41707)
* Environments: Add support for including views (#42250)
## Binary caches
* Build cache: make signed/unsigned a mirror property (#41507)
* tools stack
## Removals, deprecations, and syntax changes
* remove `dpcpp` compiler and package (#43418)
* spack load: remove --only argument (#42120)
## Notable Bugfixes
* repo.py: drop deleted packages from provider cache (#43779)
* Allow `+` in module file names (#41999)
* `cmd/python`: use runpy to allow multiprocessing in scripts (#41789)
* Show extension commands with spack -h (#41726)
* Support environment variable expansion inside module projections (#42917)
* Alert user to failed concretizations (#42655)
* shell: fix zsh color formatting for PS1 in environments (#39497)
* spack mirror create --all: include patches (#41579)
## Spack community stats
* 7,994 total packages; 525 since `v0.21.0`
* 178 new Python packages, 5 new R packages
* 358 people contributed to this release
* 344 committers to packages
* 45 committers to core
# v0.21.2 (2024-03-01)
## Bugfixes
@@ -411,7 +27,7 @@
- spack graph: fix coloring with environments (#41240)
- spack info: sort variants in --variants-by-name (#41389)
- Spec.format: error on old style format strings (#41934)
- ASP-based solver:
  - fix infinite recursion when computing concretization errors (#41061)
  - don't error for type mismatch on preferences (#41138)
  - don't emit spurious debug output (#41218)

@@ -144,5 +144,3 @@ switch($SpackSubCommand)
     "unload" {Invoke-SpackLoad}
     default {python "$Env:SPACK_ROOT/bin/spack" $SpackCMD_params $SpackSubCommand $SpackSubCommandArgs}
 }
-exit $LASTEXITCODE

@@ -0,0 +1,19 @@
+# -------------------------------------------------------------------------
+# This file controls default concretization preferences for Spack.
+#
+# Settings here are versioned with Spack and are intended to provide
+# sensible defaults out of the box. Spack maintainers should edit this
+# file to keep it current.
+#
+# Users can override these settings by editing the following files.
+#
+# Per-spack-instance settings (overrides defaults):
+#   $SPACK_ROOT/etc/spack/packages.yaml
+#
+# Per-user settings (overrides default and site settings):
+#   ~/.spack/packages.yaml
+# -------------------------------------------------------------------------
+packages:
+  all:
+    providers:
+      iconv: [glibc, musl, libiconv]

@@ -1,3 +1,19 @@
+# -------------------------------------------------------------------------
+# This file controls default concretization preferences for Spack.
+#
+# Settings here are versioned with Spack and are intended to provide
+# sensible defaults out of the box. Spack maintainers should edit this
+# file to keep it current.
+#
+# Users can override these settings by editing the following files.
+#
+# Per-spack-instance settings (overrides defaults):
+#   $SPACK_ROOT/etc/spack/packages.yaml
+#
+# Per-user settings (overrides default and site settings):
+#   ~/.spack/packages.yaml
+# -------------------------------------------------------------------------
 packages:
-  iconv:
-    require: [libiconv]
+  all:
+    providers:
+      iconv: [glibc, musl, libiconv]

@@ -38,9 +38,10 @@ packages:
     lapack: [openblas, amdlibflame]
     libc: [glibc, musl]
     libgfortran: [ gcc-runtime ]
-    libglx: [mesa+glx]
+    libglx: [mesa+glx, mesa18+glx]
     libifcore: [ intel-oneapi-runtime ]
     libllvm: [llvm]
+    libosmesa: [mesa+osmesa, mesa18+osmesa]
     lua-lang: [lua, lua-luajit-openresty, lua-luajit]
     luajit: [lua-luajit-openresty, lua-luajit]
     mariadb-client: [mariadb-c-client, mariadb]

@@ -5,9 +5,9 @@
 .. chain:
-=============================================
-Chaining Spack Installations (upstreams.yaml)
-=============================================
+============================
+Chaining Spack Installations
+============================
 You can point your Spack installation to another installation to use any
 packages that are installed there. To register the other Spack instance,

@@ -4,7 +4,7 @@ sphinx_design==0.5.0
 sphinx-rtd-theme==2.0.0
 python-levenshtein==0.25.1
 docutils==0.20.1
-pygments==2.17.2
+pygments==2.18.0
 urllib3==2.2.1
 pytest==8.2.0
 isort==5.13.2

@@ -18,7 +18,7 @@
 * Homepage: https://pypi.python.org/pypi/archspec
 * Usage: Labeling, comparison and detection of microarchitectures
-* Version: 0.2.5-dev (commit cbb1fd5eb397a70d466e5160b393b87b0dbcc78f)
+* Version: 0.2.4 (commit 48b92512b9ce203ded0ebd1ac41b42593e931f7c)
 astunparse
 ----------------

@@ -47,11 +47,7 @@ def decorator(factory):
 def partial_uarch(
-    name: str = "",
-    vendor: str = "",
-    features: Optional[Set[str]] = None,
-    generation: int = 0,
-    cpu_part: str = "",
+    name: str = "", vendor: str = "", features: Optional[Set[str]] = None, generation: int = 0
 ) -> Microarchitecture:
     """Construct a partial microarchitecture, from information gathered during system scan."""
     return Microarchitecture(
@@ -61,7 +57,6 @@ def partial_uarch(
         features=features or set(),
         compilers={},
         generation=generation,
-        cpu_part=cpu_part,
     )
@@ -95,7 +90,6 @@ def proc_cpuinfo() -> Microarchitecture:
         return partial_uarch(
             vendor=_canonicalize_aarch64_vendor(data),
             features=_feature_set(data, key="Features"),
-            cpu_part=data.get("CPU part", ""),
         )
     if architecture in (PPC64LE, PPC64):
@@ -351,10 +345,6 @@ def sorting_fn(item):
     generic_candidates = [c for c in candidates if c.vendor == "generic"]
     best_generic = max(generic_candidates, key=sorting_fn)
-    # Relevant for AArch64. Filter on "cpu_part" if we have any match
-    if info.cpu_part != "" and any(c for c in candidates if info.cpu_part == c.cpu_part):
-        candidates = [c for c in candidates if info.cpu_part == c.cpu_part]
     # Filter the candidates to be descendant of the best generic candidate.
     # This is to avoid that the lack of a niche feature that can be disabled
     # from e.g. BIOS prevents detection of a reasonably performant architecture

@@ -2,7 +2,9 @@
 # Archspec Project Developers. See the top-level COPYRIGHT file for details.
 #
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)
-"""Types and functions to manage information on CPU microarchitectures."""
+"""Types and functions to manage information
+on CPU microarchitectures.
+"""
 import functools
 import platform
 import re
@@ -63,24 +65,21 @@ class Microarchitecture:
             passed in as argument above.
           * versions: versions that support this micro-architecture.
-        generation (int): generation of the micro-architecture, if relevant.
-        cpu_part (str): cpu part of the architecture, if relevant.
+        generation (int): generation of the micro-architecture, if
+            relevant.
     """
-    # pylint: disable=too-many-arguments,too-many-instance-attributes
+    # pylint: disable=too-many-arguments
     #: Aliases for micro-architecture's features
     feature_aliases = FEATURE_ALIASES
-    def __init__(self, name, parents, vendor, features, compilers, generation=0, cpu_part=""):
+    def __init__(self, name, parents, vendor, features, compilers, generation=0):
         self.name = name
         self.parents = parents
         self.vendor = vendor
         self.features = features
         self.compilers = compilers
-        # Only relevant for PowerPC
         self.generation = generation
-        # Only relevant for AArch64
-        self.cpu_part = cpu_part
         # Cache the ancestor computation
         self._ancestors = None
@@ -112,7 +111,6 @@ def __eq__(self, other):
             and self.parents == other.parents  # avoid ancestors here
             and self.compilers == other.compilers
             and self.generation == other.generation
-            and self.cpu_part == other.cpu_part
         )
     @coerce_target_names
@@ -145,8 +143,7 @@ def __repr__(self):
         cls_name = self.__class__.__name__
         fmt = (
             cls_name + "({0.name!r}, {0.parents!r}, {0.vendor!r}, "
-            "{0.features!r}, {0.compilers!r}, generation={0.generation!r}, "
-            "cpu_part={0.cpu_part!r})"
+            "{0.features!r}, {0.compilers!r}, {0.generation!r})"
         )
         return fmt.format(self)
@@ -193,7 +190,6 @@ def to_dict(self):
             "generation": self.generation,
             "parents": [str(x) for x in self.parents],
             "compilers": self.compilers,
-            "cpupart": self.cpu_part,
         }
     @staticmethod
@@ -206,7 +202,6 @@ def from_dict(data) -> "Microarchitecture":
             features=set(data["features"]),
             compilers=data.get("compilers", {}),
             generation=data.get("generation", 0),
-            cpu_part=data.get("cpupart", ""),
         )
     def optimization_flags(self, compiler, version):
@@ -365,11 +360,8 @@ def fill_target_from_dict(name, data, targets):
         features = set(values["features"])
         compilers = values.get("compilers", {})
         generation = values.get("generation", 0)
-        cpu_part = values.get("cpupart", "")
-        targets[name] = Microarchitecture(
-            name, parents, vendor, features, compilers, generation=generation, cpu_part=cpu_part
-        )
+        targets[name] = Microarchitecture(name, parents, vendor, features, compilers, generation)
     known_targets = {}
     data = archspec.cpu.schema.TARGETS_JSON["microarchitectures"]

@@ -2225,14 +2225,10 @@
       ],
       "nvhpc": [
         {
-          "versions": "21.11:23.8",
+          "versions": "21.11:",
           "name": "zen3",
           "flags": "-tp {name}",
-          "warnings": "zen4 is not fully supported by nvhpc versions < 23.9, falling back to zen3"
-        },
-        {
-          "versions": "23.9:",
-          "flags": "-tp {name}"
+          "warnings": "zen4 is not fully supported by nvhpc yet, falling back to zen3"
         }
       ]
     }
@@ -2715,8 +2711,7 @@
           "flags": "-mcpu=thunderx2t99"
         }
       ]
-    },
-    "cpupart": "0x0af"
+    }
   },
   "a64fx": {
     "from": ["armv8.2a"],
@@ -2784,8 +2779,7 @@
           "flags": "-march=armv8.2-a+crc+crypto+fp16+sve"
         }
       ]
-    },
-    "cpupart": "0x001"
+    }
   },
   "cortex_a72": {
     "from": ["aarch64"],
@@ -2822,8 +2816,7 @@
           "flags" : "-mcpu=cortex-a72"
         }
       ]
-    },
-    "cpupart": "0xd08"
+    }
   },
   "neoverse_n1": {
     "from": ["cortex_a72", "armv8.2a"],
@@ -2844,7 +2837,8 @@
           "asimdrdm",
           "lrcpc",
           "dcpop",
-          "asimddp"
+          "asimddp",
+          "ssbs"
         ],
         "compilers" : {
           "gcc": [
@@ -2908,8 +2902,7 @@
           "flags": "-tp {name}"
         }
       ]
-    },
-    "cpupart": "0xd0c"
+    }
   },
   "neoverse_v1": {
     "from": ["neoverse_n1", "armv8.4a"],
@@ -2933,6 +2926,8 @@
           "lrcpc",
           "dcpop",
           "sha3",
+          "sm3",
+          "sm4",
           "asimddp",
           "sha512",
           "sve",
@@ -2941,6 +2936,7 @@
           "uscat",
           "ilrcpc",
           "flagm",
+          "ssbs",
           "dcpodp",
           "svei8mm",
           "svebf16",
@@ -3008,7 +3004,7 @@
           },
           {
             "versions": "11:",
-            "flags" : "-march=armv8.4-a+sve+fp16+bf16+crypto+i8mm+rng"
+            "flags" : "-march=armv8.4-a+sve+ssbs+fp16+bf16+crypto+i8mm+rng"
           },
           {
             "versions": "12:",
@@ -3032,8 +3028,7 @@
           "flags": "-tp {name}"
         }
       ]
-    },
-    "cpupart": "0xd40"
+    }
   },
   "neoverse_v2": {
     "from": ["neoverse_n1", "armv9.0a"],
@@ -3057,22 +3052,32 @@
           "lrcpc",
           "dcpop",
           "sha3",
+          "sm3",
+          "sm4",
           "asimddp",
           "sha512",
           "sve",
           "asimdfhm",
+          "dit",
           "uscat",
           "ilrcpc",
           "flagm",
+          "ssbs",
           "sb",
           "dcpodp",
           "sve2",
+          "sveaes",
+          "svepmull",
+          "svebitperm",
+          "svesha3",
+          "svesm4",
           "flagm2",
           "frint",
           "svei8mm",
           "svebf16",
           "i8mm",
-          "bf16"
+          "bf16",
+          "dgh"
         ],
         "compilers" : {
           "gcc": [
@@ -3097,19 +3102,15 @@
             "flags" : "-march=armv8.5-a+sve -mtune=cortex-a76"
           },
           {
-            "versions": "10.0:11.3.99",
+            "versions": "10.0:11.99",
             "flags" : "-march=armv8.5-a+sve+sve2+i8mm+bf16 -mtune=cortex-a77"
           },
-          {
-            "versions": "11.4:11.99",
-            "flags" : "-mcpu=neoverse-v2"
-          },
           {
-            "versions": "12.0:12.2.99",
+            "versions": "12.0:12.99",
             "flags" : "-march=armv9-a+i8mm+bf16 -mtune=cortex-a710"
           },
           {
-            "versions": "12.3:",
+            "versions": "13.0:",
             "flags" : "-mcpu=neoverse-v2"
           }
         ],
@@ -3144,112 +3145,7 @@
           "flags": "-tp {name}"
         }
       ]
-    },
-    "cpupart": "0xd4f"
-  },
-  "neoverse_n2": {
-    "from": ["neoverse_n1", "armv9.0a"],
-    "vendor": "ARM",
-    "features": [
-      "fp",
-      "asimd",
-      "evtstrm",
-      "aes",
-      "pmull",
-      "sha1",
-      "sha2",
-      "crc32",
-      "atomics",
-      "fphp",
-      "asimdhp",
-      "cpuid",
-      "asimdrdm",
-      "jscvt",
-      "fcma",
-      "lrcpc",
-      "dcpop",
-      "sha3",
-      "asimddp",
-      "sha512",
-      "sve",
-      "asimdfhm",
-      "uscat",
-      "ilrcpc",
-      "flagm",
-      "sb",
-      "dcpodp",
-      "sve2",
-      "flagm2",
-      "frint",
-      "svei8mm",
-      "svebf16",
-      "i8mm",
-      "bf16"
-    ],
-    "compilers" : {
-      "gcc": [
-        {
-          "versions": "4.8:5.99",
-          "flags": "-march=armv8-a"
-        },
-        {
-          "versions": "6:6.99",
-          "flags" : "-march=armv8.1-a"
-        },
-        {
-          "versions": "7.0:7.99",
-          "flags" : "-march=armv8.2-a -mtune=cortex-a72"
-        },
-        {
-          "versions": "8.0:8.99",
-          "flags" : "-march=armv8.4-a+sve -mtune=cortex-a72"
-        },
-        {
-          "versions": "9.0:9.99",
-          "flags" : "-march=armv8.5-a+sve -mtune=cortex-a76"
-        },
-        {
-          "versions": "10.0:10.99",
-          "flags" : "-march=armv8.5-a+sve+sve2+i8mm+bf16 -mtune=cortex-a77"
-        },
-        {
-          "versions": "11.0:",
-          "flags" : "-mcpu=neoverse-n2"
-        }
-      ],
-      "clang" : [
-        {
-          "versions": "9.0:10.99",
-          "flags" : "-march=armv8.5-a+sve"
-        },
-        {
-          "versions": "11.0:13.99",
-          "flags" : "-march=armv8.5-a+sve+sve2+i8mm+bf16"
-        },
-        {
-          "versions": "14.0:15.99",
-          "flags" : "-march=armv9-a+i8mm+bf16"
-        },
-        {
-          "versions": "16.0:",
-          "flags" : "-mcpu=neoverse-n2"
-        }
-      ],
-      "arm" : [
-        {
-          "versions": "23.04.0:",
-          "flags" : "-mcpu=neoverse-n2"
-        }
-      ],
-      "nvhpc" : [
-        {
-          "versions": "23.3:",
-          "name": "neoverse-n1",
-          "flags": "-tp {name}"
-        }
-      ]
-    },
-    "cpupart": "0xd49"
+    }
   },
   "m1": {
     "from": ["armv8.4a"],
@@ -3315,8 +3211,7 @@
           "flags" : "-mcpu=apple-m1"
         }
       ]
-    },
-    "cpupart": "0x022"
+    }
   },
   "m2": {
     "from": ["m1", "armv8.5a"],
@@ -3394,8 +3289,7 @@
           "flags" : "-mcpu=apple-m2"
         }
       ]
-    },
-    "cpupart": "0x032"
+    }
   },
   "arm": {
     "from": [],


@@ -52,9 +52,6 @@
} }
} }
} }
},
"cpupart": {
"type": "string"
} }
}, },
"required": [ "required": [
@@ -110,4 +107,4 @@
"additionalProperties": false "additionalProperties": false
} }
} }
} }


@@ -843,7 +843,7 @@ def copy_tree(
 if islink(s):
 link_target = resolve_link_target_relative_to_the_link(s)
 if symlinks:
-target = readlink(s)
+target = os.readlink(s)
 if os.path.isabs(target):

 def escaped_path(path):
@@ -2531,14 +2531,8 @@ def establish_link(self):
 # for each binary install dir in self.pkg (i.e. pkg.prefix.bin, pkg.prefix.lib)
 # install a symlink to each dependent library
-# do not rpath for system libraries included in the dag
-# we should not be modifying libraries managed by the Windows system
-# as this will negatively impact linker behavior and can result in permission
-# errors if those system libs are not modifiable by Spack
-if "windows-system" not in getattr(self.pkg, "tags", []):
-for library, lib_dir in itertools.product(self.rpaths, self.library_dependents):
-self._link(library, lib_dir)
+for library, lib_dir in itertools.product(self.rpaths, self.library_dependents):
+self._link(library, lib_dir)

 @system_path_filter
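
The guard on the `-` side of the hunk above skips Spack's symlink-based rpath emulation for packages tagged as Windows system libraries, since relinking OS-managed libraries risks permission errors and broken linker behavior. A minimal sketch of that guard, with illustrative names rather than Spack's actual classes:

    import itertools

    def establish_links(pkg, rpaths, library_dependents, link):
        # packages tagged "windows-system" are managed by the OS toolchain;
        # do not create links into or out of them
        if "windows-system" in getattr(pkg, "tags", []):
            return
        for library, lib_dir in itertools.product(rpaths, library_dependents):
            link(library, lib_dir)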


@@ -4,7 +4,7 @@
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)

 #: PEP440 canonical <major>.<minor>.<micro>.<devN> string
-__version__ = "0.22.2"
+__version__ = "0.23.0.dev0"
 spack_version = __version__


@@ -779,7 +779,7 @@ def check_virtual_with_variants(spec, msg):
 return
 error = error_cls(
 f"{pkg_name}: {msg}",
-[f"remove variants from '{spec}' in depends_on directive in {filename}"],
+f"remove variants from '{spec}' in depends_on directive in {filename}",
 )
 errors.append(error)


@@ -23,6 +23,7 @@
 import warnings
 from contextlib import closing
 from typing import Dict, Iterable, List, NamedTuple, Optional, Set, Tuple
+from urllib.error import HTTPError, URLError

 import llnl.util.filesystem as fsys
 import llnl.util.lang
@@ -898,8 +899,9 @@ def url_read_method(url):
 try:
 _, _, spec_file = web_util.read_from_url(url)
 contents = codecs.getreader("utf-8")(spec_file).read()
-except web_util.SpackWebError as e:
-tty.error(f"Error reading specfile: {url}: {e}")
+except (URLError, web_util.SpackWebError) as url_err:
+tty.error("Error reading specfile: {0}".format(url))
+tty.error(url_err)
 return contents

 try:
@@ -2039,17 +2041,21 @@ def try_direct_fetch(spec, mirrors=None):
 try:
 _, _, fs = web_util.read_from_url(buildcache_fetch_url_signed_json)
 specfile_is_signed = True
-except web_util.SpackWebError as e1:
+except (URLError, web_util.SpackWebError, HTTPError) as url_err:
 try:
 _, _, fs = web_util.read_from_url(buildcache_fetch_url_json)
-except web_util.SpackWebError as e2:
+except (URLError, web_util.SpackWebError, HTTPError) as url_err_x:
 tty.debug(
-f"Did not find {specfile_name} on {buildcache_fetch_url_signed_json}",
-e1,
+"Did not find {0} on {1}".format(
+specfile_name, buildcache_fetch_url_signed_json
+),
+url_err,
 level=2,
 )
 tty.debug(
-f"Did not find {specfile_name} on {buildcache_fetch_url_json}", e2, level=2
+"Did not find {0} on {1}".format(specfile_name, buildcache_fetch_url_json),
+url_err_x,
+level=2,
 )
 continue
 specfile_contents = codecs.getreader("utf-8")(fs).read()
@@ -2134,9 +2140,6 @@ def get_keys(install=False, trust=False, force=False, mirrors=None):
 for mirror in mirror_collection.values():
 fetch_url = mirror.fetch_url
-# TODO: oci:// does not support signing.
-if fetch_url.startswith("oci://"):
-continue
 keys_url = url_util.join(
 fetch_url, BUILD_CACHE_RELATIVE_PATH, BUILD_CACHE_KEYS_RELATIVE_PATH
 )
@@ -2147,12 +2150,19 @@ def get_keys(install=False, trust=False, force=False, mirrors=None):
 try:
 _, _, json_file = web_util.read_from_url(keys_index)
 json_index = sjson.load(codecs.getreader("utf-8")(json_file))
-except web_util.SpackWebError as url_err:
+except (URLError, web_util.SpackWebError) as url_err:
 if web_util.url_exists(keys_index):
+err_msg = [
+"Unable to find public keys in {0},",
+" caught exception attempting to read from {1}.",
+]
 tty.error(
-f"Unable to find public keys in {url_util.format(fetch_url)},"
-f" caught exception attempting to read from {url_util.format(keys_index)}."
+"".join(err_msg).format(
+url_util.format(fetch_url), url_util.format(keys_index)
+)
 )
 tty.debug(url_err)
 continue
@@ -2432,7 +2442,7 @@ def get_remote_hash(self):
 url_index_hash = url_util.join(self.url, BUILD_CACHE_RELATIVE_PATH, "index.json.hash")
 try:
 response = self.urlopen(urllib.request.Request(url_index_hash, headers=self.headers))
-except (TimeoutError, urllib.error.URLError):
+except urllib.error.URLError:
 return None

 # Validate the hash
@@ -2454,7 +2464,7 @@ def conditional_fetch(self) -> FetchIndexResult:
 try:
 response = self.urlopen(urllib.request.Request(url_index, headers=self.headers))
-except (TimeoutError, urllib.error.URLError) as e:
+except urllib.error.URLError as e:
 raise FetchIndexError("Could not fetch index from {}".format(url_index), e) from e

 try:
@@ -2495,7 +2505,10 @@ def __init__(self, url, etag, urlopen=web_util.urlopen):
 def conditional_fetch(self) -> FetchIndexResult:
 # Just do a conditional fetch immediately
 url = url_util.join(self.url, BUILD_CACHE_RELATIVE_PATH, "index.json")
-headers = {"User-Agent": web_util.SPACK_USER_AGENT, "If-None-Match": f'"{self.etag}"'}
+headers = {
+"User-Agent": web_util.SPACK_USER_AGENT,
+"If-None-Match": '"{}"'.format(self.etag),
+}

 try:
 response = self.urlopen(urllib.request.Request(url, headers=headers))
@@ -2503,14 +2516,14 @@ def conditional_fetch(self) -> FetchIndexResult:
 if e.getcode() == 304:
 # Not modified; that means fresh.
 return FetchIndexResult(etag=None, hash=None, data=None, fresh=True)
-raise FetchIndexError(f"Could not fetch index {url}", e) from e
-except (TimeoutError, urllib.error.URLError) as e:
-raise FetchIndexError(f"Could not fetch index {url}", e) from e
+raise FetchIndexError("Could not fetch index {}".format(url), e) from e
+except urllib.error.URLError as e:
+raise FetchIndexError("Could not fetch index {}".format(url), e) from e

 try:
 result = codecs.getreader("utf-8")(response).read()
 except ValueError as e:
-raise FetchIndexError(f"Remote index {url} is invalid", e) from e
+raise FetchIndexError("Remote index {} is invalid".format(url), e) from e

 headers = response.headers
 etag_header_value = headers.get("Etag", None) or headers.get("etag", None)
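
Both sides of this hunk implement an ETag-based conditional fetch: the cached ETag is sent as an If-None-Match header, and a 304 response means the cached index is still fresh. A standalone sketch of the pattern, using plain urllib and illustrative names:

    import urllib.error
    import urllib.request

    def fetch_if_changed(url, etag):
        request = urllib.request.Request(url, headers={"If-None-Match": f'"{etag}"'})
        try:
            response = urllib.request.urlopen(request)
        except urllib.error.HTTPError as e:
            if e.getcode() == 304:
                return None  # not modified; the cached copy is fresh
            raise
        return response.read()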
@@ -2541,19 +2554,21 @@ def conditional_fetch(self) -> FetchIndexResult:
 headers={"Accept": "application/vnd.oci.image.manifest.v1+json"},
 )
 )
-except (TimeoutError, urllib.error.URLError) as e:
-raise FetchIndexError(f"Could not fetch manifest from {url_manifest}", e) from e
+except urllib.error.URLError as e:
+raise FetchIndexError(
+"Could not fetch manifest from {}".format(url_manifest), e
+) from e

 try:
 manifest = json.loads(response.read())
 except Exception as e:
-raise FetchIndexError(f"Remote index {url_manifest} is invalid", e) from e
+raise FetchIndexError("Remote index {} is invalid".format(url_manifest), e) from e

 # Get first blob hash, which should be the index.json
 try:
 index_digest = spack.oci.image.Digest.from_string(manifest["layers"][0]["digest"])
 except Exception as e:
-raise FetchIndexError(f"Remote index {url_manifest} is invalid", e) from e
+raise FetchIndexError("Remote index {} is invalid".format(url_manifest), e) from e

 # Fresh?
 if index_digest.digest == self.local_hash:
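
The recurring `except (TimeoutError, urllib.error.URLError)` pattern on the `-` side of these hunks reflects that urllib can fail in several distinct ways: HTTPError for status codes, URLError (often wrapping the real cause in e.reason), and a bare TimeoutError surfaced from the underlying socket. A hedged sketch of a retry helper built on that observation, not Spack's actual helper:

    import urllib.error
    import urllib.request

    def fetch_with_retry(url, attempts=3, timeout=10):
        for attempt in range(attempts):
            try:
                return urllib.request.urlopen(url, timeout=timeout)
            except urllib.error.HTTPError:
                raise  # HTTP status errors are unlikely to be transient
            except (TimeoutError, urllib.error.URLError):
                # note: HTTPError must be caught first, it subclasses URLError
                if attempt == attempts - 1:
                    raise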


@@ -43,7 +43,7 @@
 from collections import defaultdict
 from enum import Flag, auto
 from itertools import chain
-from typing import Dict, List, Set, Tuple
+from typing import List, Set, Tuple

 import llnl.util.tty as tty
 from llnl.string import plural
@@ -72,7 +72,6 @@
 import spack.store
 import spack.subprocess_context
 import spack.user_environment
-import spack.util.executable
 import spack.util.path
 import spack.util.pattern
 from spack import traverse
@@ -480,12 +479,12 @@ def set_wrapper_variables(pkg, env):
 env.set(SPACK_DEBUG_LOG_ID, pkg.spec.format("{name}-{hash:7}"))
 env.set(SPACK_DEBUG_LOG_DIR, spack.main.spack_working_dir)

-# Find ccache binary and hand it to build environment
 if spack.config.get("config:ccache"):
-# Enable ccache in the compiler wrapper
-env.set(SPACK_CCACHE_BINARY, spack.util.executable.which_string("ccache", required=True))
-else:
-# Avoid cache pollution if a build system forces `ccache <compiler wrapper invocation>`.
-env.set("CCACHE_DISABLE", "1")
+ccache = Executable("ccache")
+if not ccache:
+raise RuntimeError("No ccache binary found in PATH")
+env.set(SPACK_CCACHE_BINARY, ccache)

 # Gather information about various types of dependencies
 link_deps = set(pkg.spec.traverse(root=False, deptype=("link")))
@@ -731,28 +730,12 @@ def _static_to_shared_library(arch, compiler, static_lib, shared_lib=None, **kwa
 return compiler(*compiler_args, output=compiler_output)

-def _get_rpath_deps_from_spec(
-spec: spack.spec.Spec, transitive_rpaths: bool
-) -> List[spack.spec.Spec]:
-if not transitive_rpaths:
-return spec.dependencies(deptype=dt.LINK)
-
-by_name: Dict[str, spack.spec.Spec] = {}
-
-for dep in spec.traverse(root=False, deptype=dt.LINK):
-lookup = by_name.get(dep.name)
-if lookup is None:
-by_name[dep.name] = dep
-elif lookup.version < dep.version:
-by_name[dep.name] = dep
-
-return list(by_name.values())
-
-def get_rpath_deps(pkg: spack.package_base.PackageBase) -> List[spack.spec.Spec]:
-"""Return immediate or transitive dependencies (depending on the package) that need to be
-rpath'ed. If a package occurs multiple times, the newest version is kept."""
-return _get_rpath_deps_from_spec(pkg.spec, pkg.transitive_rpaths)
+def get_rpath_deps(pkg):
+"""Return immediate or transitive RPATHs depending on the package."""
+if pkg.transitive_rpaths:
+return [d for d in pkg.spec.traverse(root=False, deptype=("link"))]
+else:
+return pkg.spec.dependencies(deptype="link")

 def get_rpaths(pkg):
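
The `-` side of the last hunk de-duplicates link dependencies by package name, keeping only the newest version so a single rpath entry per package survives. The core of that idea in isolation, with toy objects exposing .name and a comparable .version:

    def dedupe_newest(deps):
        by_name = {}
        for dep in deps:
            cur = by_name.get(dep.name)
            if cur is None or cur.version < dep.version:
                by_name[dep.name] = dep
        return list(by_name.values())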


@@ -44,7 +44,6 @@
 from spack import traverse
 from spack.error import SpackError
 from spack.reporters import CDash, CDashConfiguration
-from spack.reporters.cdash import SPACK_CDASH_TIMEOUT
 from spack.reporters.cdash import build_stamp as cdash_build_stamp

 # See https://docs.gitlab.com/ee/ci/yaml/#retry for descriptions of conditions
@@ -684,22 +683,6 @@ def generate_gitlab_ci_yaml(
 "instead.",
 )

-def ensure_expected_target_path(path):
-"""Returns passed paths with all Windows path separators exchanged
-for posix separators only if copy_only_pipeline is enabled
-
-This is required as copy_only_pipelines are a unique scenario where
-the generate job and child pipelines are run on different platforms.
-To make this compatible w/ Windows, we cannot write Windows style path separators
-that will be consumed on by the Posix copy job runner.
-TODO (johnwparent): Refactor config + cli read/write to deal only in posix
-style paths
-"""
-if copy_only_pipeline and path:
-path = path.replace("\\", "/")
-return path
-
 pipeline_mirrors = spack.mirror.MirrorCollection(binary=True)
 deprecated_mirror_config = False
 buildcache_destination = None
@@ -823,7 +806,7 @@ def ensure_expected_target_path(path):
 if scope not in include_scopes and scope not in env_includes:
 include_scopes.insert(0, scope)
 env_includes.extend(include_scopes)
-env_yaml_root["spack"]["include"] = [ensure_expected_target_path(i) for i in env_includes]
+env_yaml_root["spack"]["include"] = env_includes

 if "gitlab-ci" in env_yaml_root["spack"] and "ci" not in env_yaml_root["spack"]:
 env_yaml_root["spack"]["ci"] = env_yaml_root["spack"].pop("gitlab-ci")
@@ -1111,7 +1094,7 @@ def main_script_replacements(cmd):
 if cdash_handler and cdash_handler.auth_token:
 try:
 cdash_handler.populate_buildgroup(all_job_names)
-except (SpackError, HTTPError, URLError, TimeoutError) as err:
+except (SpackError, HTTPError, URLError) as err:
 tty.warn(f"Problem populating buildgroup: {err}")
 else:
 tty.warn("Unable to populate buildgroup without CDash credentials")
@@ -1244,9 +1227,6 @@ def main_script_replacements(cmd):
 "SPACK_REBUILD_EVERYTHING": str(rebuild_everything),
 "SPACK_REQUIRE_SIGNING": os.environ.get("SPACK_REQUIRE_SIGNING", "False"),
 }
-output_vars = output_object["variables"]
-for item, val in output_vars.items():
-output_vars[item] = ensure_expected_target_path(val)

 # TODO: Remove this block in Spack 0.23
 if deprecated_mirror_config and remote_mirror_override:
@@ -1303,6 +1283,7 @@ def main_script_replacements(cmd):
 sorted_output = {}
 for output_key, output_value in sorted(output_object.items()):
 sorted_output[output_key] = output_value
+
 if known_broken_specs_encountered:
 tty.error("This pipeline generated hashes known to be broken on develop:")
 display_broken_spec_messages(broken_specs_url, known_broken_specs_encountered)
@@ -1525,7 +1506,7 @@ def download_and_extract_artifacts(url, work_dir):
 request = Request(url, headers=headers)
 request.get_method = lambda: "GET"
-response = opener.open(request, timeout=SPACK_CDASH_TIMEOUT)
+response = opener.open(request)
 response_code = response.getcode()

 if response_code != 200:
@@ -2095,7 +2076,7 @@ def read_broken_spec(broken_spec_url):
 """
 try:
 _, _, fs = web_util.read_from_url(broken_spec_url)
-except web_util.SpackWebError:
+except (URLError, web_util.SpackWebError, HTTPError):
 tty.warn(f"Unable to read broken spec from {broken_spec_url}")
 return None
@@ -2273,7 +2254,7 @@ def create_buildgroup(self, opener, headers, url, group_name, group_type):
 request = Request(url, data=enc_data, headers=headers)

-response = opener.open(request, timeout=SPACK_CDASH_TIMEOUT)
+response = opener.open(request)
 response_code = response.getcode()

 if response_code not in [200, 201]:
@@ -2319,7 +2300,7 @@ def populate_buildgroup(self, job_names):
 request = Request(url, data=enc_data, headers=headers)
 request.get_method = lambda: "PUT"
-response = opener.open(request, timeout=SPACK_CDASH_TIMEOUT)
+response = opener.open(request)
 response_code = response.getcode()

 if response_code != 200:
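
The ensure_expected_target_path helper on the `-` side above exists because a pipeline generated on Windows can be consumed by POSIX runners, so any backslash separators written into the generated YAML must be flipped. Its essence, as a self-contained sketch:

    def ensure_posix_path(path, copy_only_pipeline=True):
        # only rewrite separators when the generate job and the child
        # pipeline may run on different platforms
        if copy_only_pipeline and path:
            path = path.replace("\\", "/")
        return path

    assert ensure_posix_path("C:\\spack\\config.yaml") == "C:/spack/config.yaml"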


@@ -13,6 +13,7 @@
 import shutil
 import sys
 import tempfile
+import urllib.request
 from typing import Dict, List, Optional, Tuple, Union

 import llnl.util.tty as tty
@@ -53,7 +54,6 @@
 from spack.oci.oci import (
 copy_missing_layers_with_retry,
 get_manifest_and_config_with_retry,
-list_tags,
 upload_blob_with_retry,
 upload_manifest_with_retry,
 )
@@ -813,7 +813,7 @@ def _push_oci(
 def extra_config(spec: Spec):
 spec_dict = spec.to_dict(hash=ht.dag_hash)
-spec_dict["buildcache_layout_version"] = bindist.CURRENT_BUILD_CACHE_LAYOUT_VERSION
+spec_dict["buildcache_layout_version"] = 1
 spec_dict["binary_cache_checksum"] = {
 "hash_algorithm": "sha256",
 "hash": checksums[spec.dag_hash()].compressed_digest.digest,
@@ -856,7 +856,10 @@ def _config_from_tag(image_ref: ImageReference, tag: str) -> Optional[dict]:
 def _update_index_oci(image_ref: ImageReference, tmpdir: str, pool: MaybePool) -> None:
-tags = list_tags(image_ref)
+request = urllib.request.Request(url=image_ref.tags_url())
+response = spack.oci.opener.urlopen(request)
+spack.oci.opener.ensure_status(request, response, 200)
+tags = json.load(response)["tags"]

 # Fetch all image config files in parallel
 spec_dicts = pool.starmap(
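
The inlined request on the `+` side talks directly to the OCI distribution API, which exposes GET /v2/<name>/tags/list and returns a JSON body of the form {"tags": [...]}. A minimal standalone sketch, where the registry and image name are placeholders:

    import json
    import urllib.request

    def list_tags(registry, name):
        url = f"https://{registry}/v2/{name}/tags/list"
        with urllib.request.urlopen(url) as response:
            if response.status != 200:
                raise RuntimeError(f"unexpected status {response.status}")
            return json.load(response)["tags"]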


@@ -11,6 +11,7 @@
 from argparse import ArgumentParser, Namespace
 from typing import IO, Any, Callable, Dict, Iterable, List, Optional, Sequence, Set, Tuple, Union

+import llnl.util.filesystem as fs
 import llnl.util.tty as tty
 from llnl.util.argparsewriter import ArgparseRstWriter, ArgparseWriter, Command
 from llnl.util.tty.colify import colify
@@ -866,6 +867,9 @@ def _commands(parser: ArgumentParser, args: Namespace) -> None:
 prepend_header(args, f)
 formatter(args, f)
+
+if args.update_completion:
+fs.set_executable(args.update)
 else:
 prepend_header(args, sys.stdout)
 formatter(args, sys.stdout)


@@ -468,30 +468,32 @@ def env_remove(args):
 This removes an environment managed by Spack. Directory environments
 and manifests embedded in repositories should be removed manually.
 """
-remove_envs = []
+read_envs = []
 valid_envs = []
 bad_envs = []
+invalid_envs = []

 for env_name in ev.all_environment_names():
 try:
 env = ev.read(env_name)
-valid_envs.append(env)
+valid_envs.append(env_name)

 if env_name in args.rm_env:
-remove_envs.append(env)
+read_envs.append(env)
 except (spack.config.ConfigFormatError, ev.SpackEnvironmentConfigError):
+invalid_envs.append(env_name)
+
 if env_name in args.rm_env:
 bad_envs.append(env_name)

-# Check if remove_env is included from another env before trying to remove
-for env in valid_envs:
-for remove_env in remove_envs:
+# Check if env is linked to another before trying to remove
+for name in valid_envs:
 # don't check if environment is included to itself
-if env.name == remove_env.name:
+if name == env_name:
 continue
-
-if remove_env.path in env.included_concrete_envs:
-msg = f'Environment "{remove_env.name}" is being used by environment "{env.name}"'
+environ = ev.Environment(ev.root(name))
+if ev.root(env_name) in environ.included_concrete_envs:
+msg = f'Environment "{env_name}" is being used by environment "{name}"'
 if args.force:
 tty.warn(msg)
 else:
@@ -504,7 +506,7 @@ def env_remove(args):
 if not answer:
 tty.die("Will not remove any environments")

-for env in remove_envs:
+for env in read_envs:
 name = env.name
 if env.active:
 tty.die(f"Environment {name} can't be removed while activated.")


@@ -38,10 +38,10 @@
 import spack.cmd
 import spack.environment as ev
-import spack.filesystem_view as fsv
 import spack.schema.projections
 import spack.store
 from spack.config import validate
+from spack.filesystem_view import YamlFilesystemView, view_func_parser
 from spack.util import spack_yaml as s_yaml

 description = "project packages to a compact naming scheme on the filesystem"
@@ -193,13 +193,17 @@ def view(parser, args):
 ordered_projections = {}

 # What method are we using for this view
-link_type = args.action if args.action in actions_link else "symlink"
-view = fsv.YamlFilesystemView(
+if args.action in actions_link:
+link_fn = view_func_parser(args.action)
+else:
+link_fn = view_func_parser("symlink")
+
+view = YamlFilesystemView(
 path,
 spack.store.STORE.layout,
 projections=ordered_projections,
 ignore_conflicts=getattr(args, "ignore_conflicts", False),
-link_type=link_type,
+link=link_fn,
 verbose=args.verbose,
 )


@@ -220,10 +220,10 @@ def _compiler_config_from_external(config):
 operating_system = host_platform.operating_system("default_os")
 target = host_platform.target("default_target").microarchitecture
 else:
-target = spec.architecture.target
+target = spec.target
 if not target:
-target = spack.platforms.host().target("default_target")
-target = target.microarchitecture
+host_platform = spack.platforms.host()
+target = host_platform.target("default_target").microarchitecture

 operating_system = spec.os
 if not operating_system:


@@ -97,7 +97,7 @@ class OpenMpi(Package):
 PatchesType = Optional[Union[Patcher, str, List[Union[Patcher, str]]]]

-SUPPORTED_LANGUAGES = ("fortran", "cxx", "c")
+SUPPORTED_LANGUAGES = ("fortran", "cxx")

 def _make_when_spec(value: WhenType) -> Optional["spack.spec.Spec"]:


@@ -15,7 +15,6 @@
 import llnl.util.filesystem as fs
 import llnl.util.tty as tty
-from llnl.util.symlink import readlink

 import spack.config
 import spack.hash_types as ht
@@ -182,7 +181,7 @@ def deprecated_file_path(self, deprecated_spec, deprecator_spec=None):
 base_dir = (
 self.path_for_spec(deprecator_spec)
 if deprecator_spec
-else readlink(deprecated_spec.prefix)
+else os.readlink(deprecated_spec.prefix)
 )

 yaml_path = os.path.join(


@@ -22,7 +22,7 @@
 import llnl.util.tty as tty
 import llnl.util.tty.color as clr
 from llnl.util.link_tree import ConflictingSpecsError
-from llnl.util.symlink import readlink, symlink
+from llnl.util.symlink import symlink

 import spack.compilers
 import spack.concretize
@@ -30,7 +30,6 @@
 import spack.deptypes as dt
 import spack.error
 import spack.fetch_strategy
-import spack.filesystem_view as fsv
 import spack.hash_types as ht
 import spack.hooks
 import spack.main
@@ -53,6 +52,7 @@
 import spack.util.url
 import spack.version
 from spack import traverse
+from spack.filesystem_view import SimpleFilesystemView, inverse_view_func_parser, view_func_parser
 from spack.installer import PackageInstaller
 from spack.schema.env import TOP_LEVEL_KEY
 from spack.spec import Spec
@@ -606,7 +606,7 @@ def __init__(
 self.projections = projections
 self.select = select
 self.exclude = exclude
-self.link_type = fsv.canonicalize_link_type(link_type)
+self.link_type = view_func_parser(link_type)
 self.link = link

 def select_fn(self, spec):
@@ -640,7 +640,7 @@ def to_dict(self):
 if self.exclude:
 ret["exclude"] = self.exclude
 if self.link_type:
-ret["link_type"] = self.link_type
+ret["link_type"] = inverse_view_func_parser(self.link_type)
 if self.link != default_view_link:
 ret["link"] = self.link
 return ret
@@ -662,7 +662,7 @@ def _current_root(self):
 if not os.path.islink(self.root):
 return None

-root = readlink(self.root)
+root = os.readlink(self.root)
 if os.path.isabs(root):
 return root
@@ -690,7 +690,7 @@ def get_projection_for_spec(self, spec):
 to exist on the filesystem."""
 return self._view(self.root).get_projection_for_spec(spec)

-def view(self, new: Optional[str] = None) -> fsv.SimpleFilesystemView:
+def view(self, new: Optional[str] = None) -> SimpleFilesystemView:
 """
 Returns a view object for the *underlying* view directory. This means that the
 self.root symlink is followed, and that the view has to exist on the filesystem
@@ -710,14 +710,14 @@ def view(self, new: Optional[str] = None) -> fsv.SimpleFilesystemView:
 )
 return self._view(path)

-def _view(self, root: str) -> fsv.SimpleFilesystemView:
+def _view(self, root: str) -> SimpleFilesystemView:
 """Returns a view object for a given root dir."""
-return fsv.SimpleFilesystemView(
+return SimpleFilesystemView(
 root,
 spack.store.STORE.layout,
 ignore_conflicts=True,
 projections=self.projections,
-link_type=self.link_type,
+link=self.link_type,
 )

 def __contains__(self, spec):
@@ -1190,6 +1190,7 @@ def scope_name(self):
 def include_concrete_envs(self):
 """Copy and save the included envs' specs internally"""
+lockfile_meta = None
 root_hash_seen = set()
 concrete_hash_seen = set()
 self.included_concrete_spec_data = {}
@@ -1200,26 +1201,37 @@ def include_concrete_envs(self):
 raise SpackEnvironmentError(f"Unable to find env at {env_path}")

 env = Environment(env_path)
-self.included_concrete_spec_data[env_path] = {"roots": [], "concrete_specs": {}}
+
+with open(env.lock_path) as f:
+lockfile_as_dict = env._read_lockfile(f)
+
+# Lockfile_meta must match each env and use at least format version 5
+if lockfile_meta is None:
+lockfile_meta = lockfile_as_dict["_meta"]
+elif lockfile_meta != lockfile_as_dict["_meta"]:
+raise SpackEnvironmentError("All lockfile _meta values must match")
+elif lockfile_meta["lockfile-version"] < 5:
+raise SpackEnvironmentError("The lockfile format must be at version 5 or higher")

 # Copy unique root specs from env
-for root_dict in env._concrete_roots_dict():
+self.included_concrete_spec_data[env_path] = {"roots": []}
+for root_dict in lockfile_as_dict["roots"]:
 if root_dict["hash"] not in root_hash_seen:
 self.included_concrete_spec_data[env_path]["roots"].append(root_dict)
 root_hash_seen.add(root_dict["hash"])

 # Copy unique concrete specs from env
-for dag_hash, spec_details in env._concrete_specs_dict().items():
-if dag_hash not in concrete_hash_seen:
-self.included_concrete_spec_data[env_path]["concrete_specs"].update(
-{dag_hash: spec_details}
+for concrete_spec in lockfile_as_dict["concrete_specs"]:
+if concrete_spec not in concrete_hash_seen:
+self.included_concrete_spec_data[env_path].update(
+{"concrete_specs": lockfile_as_dict["concrete_specs"]}
 )
-concrete_hash_seen.add(dag_hash)
+concrete_hash_seen.add(concrete_spec)

-# Copy transitive include data
-transitive = env.included_concrete_spec_data
-if transitive:
-self.included_concrete_spec_data[env_path]["include_concrete"] = transitive
+if "include_concrete" in lockfile_as_dict.keys():
+self.included_concrete_spec_data[env_path]["include_concrete"] = lockfile_as_dict[
+"include_concrete"
+]

 self._read_lockfile_dict(self._to_lockfile_dict())
 self.write()
@@ -2132,23 +2144,16 @@ def _get_environment_specs(self, recurse_dependencies=True):
 return specs

-def _concrete_specs_dict(self):
+def _to_lockfile_dict(self):
+"""Create a dictionary to store a lockfile for this environment."""
 concrete_specs = {}
 for s in traverse.traverse_nodes(self.specs_by_hash.values(), key=traverse.by_dag_hash):
 spec_dict = s.node_dict_with_hashes(hash=ht.dag_hash)
 # Assumes no legacy formats, since this was just created.
 spec_dict[ht.dag_hash.name] = s.dag_hash()
 concrete_specs[s.dag_hash()] = spec_dict

-return concrete_specs
-
-def _concrete_roots_dict(self):
 hash_spec_list = zip(self.concretized_order, self.concretized_user_specs)
-return [{"hash": h, "spec": str(s)} for h, s in hash_spec_list]
-
-def _to_lockfile_dict(self):
-"""Create a dictionary to store a lockfile for this environment."""
-concrete_specs = self._concrete_specs_dict()
-root_specs = self._concrete_roots_dict()

 spack_dict = {"version": spack.spack_version}
 spack_commit = spack.main.get_spack_commit()
@@ -2169,7 +2174,7 @@ def _to_lockfile_dict(self):
 # spack version information
 "spack": spack_dict,
 # users specs + hashes are the 'roots' of the environment
-"roots": root_specs,
+"roots": [{"hash": h, "spec": str(s)} for h, s in hash_spec_list],
 # Concrete specs by hash, including dependencies
 "concrete_specs": concrete_specs,
 }
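
The `+` side of the include_concrete_envs hunk validates that every included environment shares one lockfile _meta block and uses format version 5 or newer. The check, reduced to plain dicts (a sketch, not the exact Spack control flow):

    def check_lockfile_meta(lockfile_dicts):
        meta = None
        for lock in lockfile_dicts:
            if meta is None:
                meta = lock["_meta"]
            elif meta != lock["_meta"]:
                raise ValueError("All lockfile _meta values must match")
            if meta["lockfile-version"] < 5:
                raise ValueError("The lockfile format must be at version 5 or higher")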


@@ -554,7 +554,7 @@ def fetch(self):
 try:
 response = self._urlopen(self.url)
-except (TimeoutError, urllib.error.URLError) as e:
+except urllib.error.URLError as e:
 # clean up archive on failure.
 if self.archive_file:
 os.remove(self.archive_file)


@@ -10,9 +10,8 @@
 import shutil
 import stat
 import sys
-from typing import Callable, Dict, Optional
+from typing import Optional

-from llnl.string import comma_or
 from llnl.util import tty
 from llnl.util.filesystem import (
 mkdirp,
@@ -50,20 +49,19 @@
 _projections_path = ".spack/projections.yaml"

-LinkCallbackType = Callable[[str, str, "FilesystemView", Optional["spack.spec.Spec"]], None]
-
-def view_symlink(src: str, dst: str, *args, **kwargs) -> None:
+def view_symlink(src, dst, **kwargs):
+# keyword arguments are irrelevant
+# here to fit required call signature
 symlink(src, dst)

-def view_hardlink(src: str, dst: str, *args, **kwargs) -> None:
+def view_hardlink(src, dst, **kwargs):
+# keyword arguments are irrelevant
+# here to fit required call signature
 os.link(src, dst)

-def view_copy(
-src: str, dst: str, view: "FilesystemView", spec: Optional["spack.spec.Spec"] = None
-) -> None:
+def view_copy(src: str, dst: str, view, spec: Optional[spack.spec.Spec] = None):
 """
 Copy a file from src to dst.
@@ -106,40 +104,27 @@ def view_copy(
 tty.debug(f"Can't change the permissions for {dst}")

-#: supported string values for `link_type` in an env, mapped to canonical values
-_LINK_TYPES = {
-"hardlink": "hardlink",
-"hard": "hardlink",
-"copy": "copy",
-"relocate": "copy",
-"add": "symlink",
-"symlink": "symlink",
-"soft": "symlink",
-}
-
-_VALID_LINK_TYPES = sorted(set(_LINK_TYPES.values()))
-
-def canonicalize_link_type(link_type: str) -> str:
-"""Return canonical"""
-canonical = _LINK_TYPES.get(link_type)
-if not canonical:
-raise ValueError(
-f"Invalid link type: '{link_type}. Must be one of {comma_or(_VALID_LINK_TYPES)}'"
-)
-return canonical
-
-def function_for_link_type(link_type: str) -> LinkCallbackType:
-link_type = canonicalize_link_type(link_type)
-if link_type == "hardlink":
+def view_func_parser(parsed_name):
+# What method are we using for this view
+if parsed_name in ("hardlink", "hard"):
 return view_hardlink
-elif link_type == "symlink":
-return view_symlink
-elif link_type == "copy":
+elif parsed_name in ("copy", "relocate"):
 return view_copy
+elif parsed_name in ("add", "symlink", "soft"):
+return view_symlink
+else:
+raise ValueError(f"invalid link type for view: '{parsed_name}'")

-assert False, "invalid link type" # need mypy Literal values
+def inverse_view_func_parser(view_type):
+# get string based on view type
+if view_type is view_hardlink:
+link_name = "hardlink"
+elif view_type is view_copy:
+link_name = "copy"
+else:
+link_name = "symlink"
+return link_name

 class FilesystemView:
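
The `-` side of the hunk above funnels user-facing aliases through one canonical mapping before a link callback is chosen. That mapping is easy to exercise in isolation:

    _LINK_TYPES = {
        "hardlink": "hardlink", "hard": "hardlink",
        "copy": "copy", "relocate": "copy",
        "add": "symlink", "symlink": "symlink", "soft": "symlink",
    }

    def canonicalize_link_type(link_type):
        canonical = _LINK_TYPES.get(link_type)
        if not canonical:
            raise ValueError(f"invalid link type: {link_type!r}")
        return canonical

    assert canonicalize_link_type("soft") == "symlink"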
@@ -155,16 +140,7 @@ class FilesystemView:
 directory structure.
 """

-def __init__(
-self,
-root: str,
-layout: "spack.directory_layout.DirectoryLayout",
-*,
-projections: Optional[Dict] = None,
-ignore_conflicts: bool = False,
-verbose: bool = False,
-link_type: str = "symlink",
-):
+def __init__(self, root, layout, **kwargs):
 """
 Initialize a filesystem view under the given `root` directory with
 corresponding directory `layout`.
@@ -173,14 +149,15 @@ def __init__(
 """
 self._root = root
 self.layout = layout
-self.projections = {} if projections is None else projections
-self.ignore_conflicts = ignore_conflicts
-self.verbose = verbose
+
+self.projections = kwargs.get("projections", {})
+
+self.ignore_conflicts = kwargs.get("ignore_conflicts", False)
+self.verbose = kwargs.get("verbose", False)

 # Setup link function to include view
-self.link_type = link_type
-self.link = ft.partial(function_for_link_type(link_type), view=self)
+link_func = kwargs.get("link", view_symlink)
+self.link = ft.partial(link_func, view=self)

 def add_specs(self, *specs, **kwargs):
 """
@@ -278,24 +255,8 @@ class YamlFilesystemView(FilesystemView):
 Filesystem view to work with a yaml based directory layout.
 """

-def __init__(
-self,
-root: str,
-layout: "spack.directory_layout.DirectoryLayout",
-*,
-projections: Optional[Dict] = None,
-ignore_conflicts: bool = False,
-verbose: bool = False,
-link_type: str = "symlink",
-):
-super().__init__(
-root,
-layout,
-projections=projections,
-ignore_conflicts=ignore_conflicts,
-verbose=verbose,
-link_type=link_type,
-)
+def __init__(self, root, layout, **kwargs):
+super().__init__(root, layout, **kwargs)

 # Super class gets projections from the kwargs
 # YAML specific to get projections from YAML file
@@ -677,6 +638,9 @@ class SimpleFilesystemView(FilesystemView):
 """A simple and partial implementation of FilesystemView focused on performance and immutable
 views, where specs cannot be removed after they were added."""

+def __init__(self, root, layout, **kwargs):
+super().__init__(root, layout, **kwargs)
+
 def _sanity_check_view_projection(self, specs):
 """A very common issue is that we end up with two specs of the same package, that project
 to the same prefix. We want to catch that as early as possible and give a sensible error to
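
On both sides, FilesystemView binds the chosen link callback to the view with functools.partial, so later code can simply call self.link(src, dst). A sketch of that wiring with toy classes, not Spack's:

    import functools as ft

    def view_symlink(src, dst, view=None):
        print(f"symlink {src} -> {dst} for {view!r}")

    class View:
        def __init__(self, link_func=view_symlink):
            # partial application injects the view; callers pass only src/dst
            self.link = ft.partial(link_func, view=self)

    View().link("a", "b")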


@@ -11,7 +11,7 @@
 import urllib.parse
 import urllib.request
 from http.client import HTTPResponse
-from typing import List, NamedTuple, Tuple
+from typing import NamedTuple, Tuple
 from urllib.request import Request

 import llnl.util.tty as tty
@@ -27,7 +27,6 @@
 import spack.stage
 import spack.traverse
 import spack.util.crypto
-import spack.util.url

 from .image import Digest, ImageReference
@@ -70,42 +69,6 @@ def with_query_param(url: str, param: str, value: str) -> str:
 )

-def list_tags(ref: ImageReference, _urlopen: spack.oci.opener.MaybeOpen = None) -> List[str]:
-"""Retrieves the list of tags associated with an image, handling pagination."""
-_urlopen = _urlopen or spack.oci.opener.urlopen
-tags = set()
-fetch_url = ref.tags_url()
-
-while True:
-# Fetch tags
-request = Request(url=fetch_url)
-response = _urlopen(request)
-spack.oci.opener.ensure_status(request, response, 200)
-tags.update(json.load(response)["tags"])
-
-# Check for pagination
-link_header = response.headers["Link"]
-
-if link_header is None:
-break
-
-tty.debug(f"OCI tag pagination: {link_header}")
-
-rel_next_value = spack.util.url.parse_link_rel_next(link_header)
-
-if rel_next_value is None:
-break
-
-rel_next = urllib.parse.urlparse(rel_next_value)
-
-if rel_next.scheme not in ("https", ""):
-break
-
-fetch_url = ref.endpoint(rel_next_value)
-
-return sorted(tags)
-
 def upload_blob(
 ref: ImageReference,
 file: str,


@@ -143,7 +143,6 @@ def __init__(self):
 "12": "monterey",
 "13": "ventura",
 "14": "sonoma",
-"15": "sequoia",
 }

 version = macos_version()


@@ -161,11 +161,7 @@ def windows_establish_runtime_linkage(self):
 Performs symlinking to incorporate rpath dependencies to Windows runtime search paths
 """
-# If spec is an external, we should not be modifying its bin directory, as we would
-# be doing in this method
-# Spack should in general not modify things it has not installed
-# we can reasonably expect externals to have their link interface properly established
-if sys.platform == "win32" and not self.spec.external:
+if sys.platform == "win32":
 self.win_rpath.add_library_dependent(*self.win_add_library_dependent())
 self.win_rpath.add_rpath(*self.win_add_rpath())
 self.win_rpath.establish_link()
@@ -199,10 +195,10 @@ def __init__(cls, name, bases, attr_dict):
 # assumed to be detectable
 if hasattr(cls, "executables") or hasattr(cls, "libraries"):
 # Append a tag to each detectable package, so that finding them is faster
-if not hasattr(cls, "tags"):
+if hasattr(cls, "tags"):
+getattr(cls, "tags").append(DetectablePackageMeta.TAG)
+else:
 setattr(cls, "tags", [DetectablePackageMeta.TAG])
-elif DetectablePackageMeta.TAG not in cls.tags:
-cls.tags.append(DetectablePackageMeta.TAG)

 @classmethod
 def platform_executables(cls):
@@ -1119,9 +1115,10 @@ def _make_stage(self):
 if not link_format:
 link_format = "build-{arch}-{hash:7}"
 stage_link = self.spec.format_path(link_format)
-source_stage = DevelopStage(compute_stage_name(self.spec), dev_path, stage_link)
-else:
-source_stage = self._make_root_stage(self.fetcher)
+return DevelopStage(compute_stage_name(self.spec), dev_path, stage_link)
+
+# To fetch the current version
+source_stage = self._make_root_stage(self.fetcher)

 # all_stages is source + resources + patches
 all_stages = StageComposite()
@@ -1450,8 +1447,10 @@ def do_fetch(self, mirror_only=False):
 return

 checksum = spack.config.get("config:checksum")
+fetch = self.stage.needs_fetching
 if (
 checksum
+and fetch
 and (self.version not in self.versions)
 and (not isinstance(self.version, GitVersion))
 ):
@@ -1558,11 +1557,13 @@ def do_patch(self):
 tty.debug("Patching failed last time. Restaging.")
 self.stage.restage()
 else:
-# develop specs may have patch failures but should never be restaged
-tty.warn(
-f"A patch failure was detected in {self.name}."
-" Build errors may occur due to this."
+# develop specs/ DIYStages may have patch failures but
+# should never be restaged
+msg = (
+"A patch failure was detected in %s." % self.name
++ " Build errors may occur due to this."
 )
+tty.warn(msg)
 return
@@ -2445,18 +2446,9 @@ def rpath(self):
 # on Windows, libraries of runtime interest are typically
 # stored in the bin directory
-# Do not include Windows system libraries in the rpath interface
-# these libraries are handled automatically by VS/VCVARS and adding
-# Spack derived system libs into the link path or address space of a program
-# can result in conflicting versions, which makes Spack packages less useable
 if sys.platform == "win32":
 rpaths = [self.prefix.bin]
-rpaths.extend(
-d.prefix.bin
-for d in deps
-if os.path.isdir(d.prefix.bin)
-and "windows-system" not in getattr(d.package, "tags", [])
-)
+rpaths.extend(d.prefix.bin for d in deps if os.path.isdir(d.prefix.bin))
 else:
 rpaths = [self.prefix.lib, self.prefix.lib64]
 rpaths.extend(d.prefix.lib for d in deps if os.path.isdir(d.prefix.lib))
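
The rpath property above picks bin/ on Windows, where runtime DLLs live, and lib//lib64 elsewhere. The same shape with plain paths (a sketch; prefix handling in Spack is richer):

    import os
    import sys

    def runtime_search_dirs(prefix, dep_prefixes):
        sub = "bin" if sys.platform == "win32" else "lib"
        dirs = [os.path.join(prefix, sub)]
        if sub == "lib":
            dirs.append(os.path.join(prefix, "lib64"))
        dirs += [
            os.path.join(d, sub)
            for d in dep_prefixes
            if os.path.isdir(os.path.join(d, sub))
        ]
        return dirs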


@@ -10,7 +10,6 @@
 import archspec.cpu

 import llnl.util.tty as tty
-from llnl.util.symlink import readlink

 import spack.target
 import spack.version
@@ -134,7 +133,7 @@ def craype_type_and_version(cls):
 # Take the default version from known symlink path
 default_path = os.path.join(craype_dir, "default")
 if os.path.islink(default_path):
-version = spack.version.Version(readlink(default_path))
+version = spack.version.Version(os.readlink(default_path))
 return (craype_type, version)

 # If no default version, sort available versions and return latest


@@ -566,7 +566,7 @@ def make_link_relative(new_links, orig_links):
 orig_links (list): original links
 """
 for new_link, orig_link in zip(new_links, orig_links):
-target = readlink(orig_link)
+target = os.readlink(orig_link)
 relative_target = os.path.relpath(target, os.path.dirname(orig_link))
 os.unlink(new_link)
 symlink(relative_target, new_link)


@@ -241,7 +241,7 @@ def get_all_package_diffs(type, rev1="HEAD^1", rev2="HEAD"):
 Arguments:
-type (str): String containing one or more of 'A', 'R', 'C'
+type (str): String containing one or more of 'A', 'B', 'C'
 rev1 (str): Revision to compare against, default is 'HEAD^'
 rev2 (str): Revision to compare to rev1, default is 'HEAD'
@@ -264,7 +264,7 @@ def get_all_package_diffs(type, rev1="HEAD^1", rev2="HEAD"):
 lines = [] if not out else re.split(r"\s+", out)
 changed = set()
 for path in lines:
-pkg_name, _, _ = path.partition("/")
+pkg_name, _, _ = path.partition(os.sep)
 if pkg_name not in added and pkg_name not in removed:
 changed.add(pkg_name)
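
The `-` side partitions on a literal "/" because git emits forward slashes in path output on every platform, including Windows, so os.sep would mis-split there. For example, with a path relative to the packages directory:

    path = "zlib/package.py"  # as emitted by git, with forward slashes
    pkg_name, _, _ = path.partition("/")
    assert pkg_name == "zlib"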


@@ -58,8 +58,7 @@
 # Initialize data structures common to each phase's report.
 CDASH_PHASES = set(MAP_PHASES_TO_CDASH.values())
 CDASH_PHASES.add("update")
-# CDash request timeout in seconds
-SPACK_CDASH_TIMEOUT = 45
+

 CDashConfiguration = collections.namedtuple(
 "CDashConfiguration", ["upload_url", "packages", "build", "site", "buildstamp", "track"]
@@ -448,7 +447,7 @@ def upload(self, filename):
 # By default, urllib2 only support GET and POST.
 # CDash expects this file to be uploaded via PUT.
 request.get_method = lambda: "PUT"
-response = opener.open(request, timeout=SPACK_CDASH_TIMEOUT)
+response = opener.open(request)
 if self.current_package_name not in self.buildIds:
 resp_value = response.read()
 if isinstance(resp_value, bytes):


@@ -9,7 +9,7 @@
 import tempfile
 from collections import OrderedDict

-from llnl.util.symlink import readlink, symlink
+from llnl.util.symlink import symlink

 import spack.binary_distribution as bindist
 import spack.error
@@ -26,7 +26,7 @@ def _relocate_spliced_links(links, orig_prefix, new_prefix):
 in our case. This still needs to be called after the copy to destination
 because it expects the new directory structure to be in place."""
 for link in links:
-link_target = readlink(os.path.join(orig_prefix, link))
+link_target = os.readlink(os.path.join(orig_prefix, link))
 link_target = re.sub("^" + orig_prefix, new_prefix, link_target)
 new_link_path = os.path.join(new_prefix, link)
 os.unlink(new_link_path)
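
_relocate_spliced_links rewrites each link target by substituting the old prefix with the new one before re-creating the link. The substitution on its own, with illustrative prefixes:

    import re

    old_prefix, new_prefix = "/opt/spack/old", "/opt/spack/new"
    target = "/opt/spack/old/lib/libfoo.so"
    relocated = re.sub("^" + old_prefix, new_prefix, target)
    assert relocated == "/opt/spack/new/lib/libfoo.so"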


@@ -13,7 +13,6 @@
 r"\w[\w-]*": {
 "type": "object",
 "additionalProperties": False,
-"required": ["spec"],
 "properties": {"spec": {"type": "string"}, "path": {"type": "string"}},
 }
 },


@@ -314,10 +314,6 @@ def using_libc_compatibility() -> bool:
return spack.platforms.host().name == "linux" return spack.platforms.host().name == "linux"
def c_compiler_runs(compiler: spack.compiler.Compiler) -> bool:
return compiler.compiler_verbose_output is not None
def extend_flag_list(flag_list, new_flags): def extend_flag_list(flag_list, new_flags):
"""Extend a list of flags, preserving order and precedence. """Extend a list of flags, preserving order and precedence.
@@ -844,6 +840,8 @@ def solve(self, setup, specs, reuse=None, output=None, control=None, allow_depre
parent_dir = os.path.dirname(__file__) parent_dir = os.path.dirname(__file__)
self.control.load(os.path.join(parent_dir, "concretize.lp")) self.control.load(os.path.join(parent_dir, "concretize.lp"))
self.control.load(os.path.join(parent_dir, "heuristic.lp")) self.control.load(os.path.join(parent_dir, "heuristic.lp"))
if spack.config.CONFIG.get("concretizer:duplicates:strategy", "none") != "none":
self.control.load(os.path.join(parent_dir, "heuristic_separate.lp"))
self.control.load(os.path.join(parent_dir, "display.lp")) self.control.load(os.path.join(parent_dir, "display.lp"))
if not setup.concretize_everything: if not setup.concretize_everything:
self.control.load(os.path.join(parent_dir, "when_possible.lp")) self.control.load(os.path.join(parent_dir, "when_possible.lp"))
@@ -1433,14 +1431,16 @@ def condition(
# caller, we won't emit partial facts. # caller, we won't emit partial facts.
condition_id = next(self._id_counter) condition_id = next(self._id_counter)
self.gen.fact(fn.pkg_fact(required_spec.name, fn.condition(condition_id)))
self.gen.fact(fn.condition_reason(condition_id, msg))
trigger_id = self._get_condition_id( trigger_id = self._get_condition_id(
required_spec, cache=self._trigger_cache, body=True, transform=transform_required required_spec, cache=self._trigger_cache, body=True, transform=transform_required
) )
self.gen.fact(fn.pkg_fact(required_spec.name, fn.condition(condition_id)))
self.gen.fact(fn.condition_reason(condition_id, msg))
self.gen.fact( self.gen.fact(
fn.pkg_fact(required_spec.name, fn.condition_trigger(condition_id, trigger_id)) fn.pkg_fact(required_spec.name, fn.condition_trigger(condition_id, trigger_id))
) )
if not imposed_spec: if not imposed_spec:
return condition_id return condition_id
@@ -1689,43 +1689,19 @@ def external_packages(self):
        spack.spec.parse_with_version_concrete(x["spec"]) for x in externals
    ]

-   selected_externals = set()
+   external_specs = []
    if spec_filters:
        for current_filter in spec_filters:
            current_filter.factory = lambda: candidate_specs
-           selected_externals.update(current_filter.selected_specs())
+           external_specs.extend(current_filter.selected_specs())
+   else:
+       external_specs.extend(candidate_specs)

-   # Emit facts for externals specs. Note that "local_idx" is the index of the spec
-   # in packages:<pkg_name>:externals. This means:
-   #
-   #     packages:<pkg_name>:externals[local_idx].spec == spec
-   external_versions = []
-   for local_idx, spec in enumerate(candidate_specs):
-       msg = f"{spec.name} available as external when satisfying {spec}"
-
-       if spec_filters and spec not in selected_externals:
-           continue
-
-       if not spec.versions.concrete:
-           warnings.warn(f"cannot use the external spec {spec}: needs a concrete version")
-           continue
-
-       def external_imposition(input_spec, requirements):
-           return requirements + [
-               fn.attr("external_conditions_hold", input_spec.name, local_idx)
-           ]
-
-       try:
-           self.condition(spec, spec, msg=msg, transform_imposed=external_imposition)
-       except (spack.error.SpecError, RuntimeError) as e:
-           warnings.warn(f"while setting up external spec {spec}: {e}")
-           continue
-       external_versions.append((spec.version, local_idx))
-       self.possible_versions[spec.name].add(spec.version)
-       self.gen.newline()

    # Order the external versions to prefer more recent versions
    # even if specs in packages.yaml are not ordered that way
+   external_versions = [
+       (x.version, external_id) for external_id, x in enumerate(external_specs)
+   ]
    external_versions = [
        (v, idx, external_id)
        for idx, (v, external_id) in enumerate(sorted(external_versions, reverse=True))

@@ -1735,6 +1711,19 @@ def external_imposition(input_spec, requirements):
            DeclaredVersion(version=version, idx=idx, origin=Provenance.EXTERNAL)
        )

+   # Declare external conditions with a local index into packages.yaml
+   for local_idx, spec in enumerate(external_specs):
+       msg = "%s available as external when satisfying %s" % (spec.name, spec)
+
+       def external_imposition(input_spec, requirements):
+           return requirements + [
+               fn.attr("external_conditions_hold", input_spec.name, local_idx)
+           ]
+
+       self.condition(spec, spec, msg=msg, transform_imposed=external_imposition)
+       self.possible_versions[spec.name].add(spec.version)
+       self.gen.newline()

    self.trigger_rules()
    self.effect_rules()
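Both sides of this hunk end up ranking external versions so that newer versions get lower weights regardless of their order in packages.yaml. A toy illustration of that sort-and-enumerate idiom (plain tuples stand in for Spack's version objects):

# (version, local_idx) pairs as they appear in packages.yaml
external_versions = [((1, 2), 0), ((3, 0), 1), ((2, 5), 2)]
ranked = [
    (v, weight, local_idx)
    for weight, (v, local_idx) in enumerate(sorted(external_versions, reverse=True))
]
print(ranked)  # [((3, 0), 0, 1), ((2, 5), 1, 2), ((1, 2), 2, 0)]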
@@ -1887,8 +1876,11 @@ def _spec_clauses(
    )
    clauses.append(f.variant_value(spec.name, vname, value))
    if variant.propagate:
-       clauses.append(f.propagate(spec.name, fn.variant_value(vname, value)))
+       clauses.append(
+           f.variant_propagation_candidate(spec.name, vname, value, spec.name)
+       )

    # Tell the concretizer that this is a possible value for the
    # variant, to account for things like int/str values where we
@@ -1943,11 +1935,6 @@ def _spec_clauses(
    for virtual in virtuals:
        clauses.append(fn.attr("virtual_on_incoming_edges", spec.name, virtual))

-   # If the spec is external and concrete, we allow all the libcs on the system
-   if spec.external and spec.concrete and using_libc_compatibility():
-       for libc in self.libcs:
-           clauses.append(fn.attr("compatible_libc", spec.name, libc.name, libc.version))

    # add all clauses from dependencies
    if transitive:
        # TODO: Eventually distinguish 2 deps on the same pkg (build and link)
@@ -2743,7 +2730,7 @@ class _Head:
    node_flag = fn.attr("node_flag_set")
    node_flag_source = fn.attr("node_flag_source")
    node_flag_propagate = fn.attr("node_flag_propagate")
-   propagate = fn.attr("propagate")
+   variant_propagation_candidate = fn.attr("variant_propagation_candidate")

class _Body:
@@ -2760,7 +2747,7 @@ class _Body:
    node_flag = fn.attr("node_flag")
    node_flag_source = fn.attr("node_flag_source")
    node_flag_propagate = fn.attr("node_flag_propagate")
-   propagate = fn.attr("propagate")
+   variant_propagation_candidate = fn.attr("variant_propagation_candidate")

class ProblemInstanceBuilder:
@@ -2988,13 +2975,6 @@ class CompilerParser:
    def __init__(self, configuration) -> None:
        self.compilers: Set[KnownCompiler] = set()
        for c in all_compilers_in_config(configuration):
-           if using_libc_compatibility() and not c_compiler_runs(c):
-               tty.debug(
-                   f"the C compiler {c.cc} does not exist, or does not run correctly."
-                   f" The compiler {c.spec} will not be used during concretization."
-               )
-               continue
-
            if using_libc_compatibility() and not c.default_libc:
                warnings.warn(
                    f"cannot detect libc from {c.spec}. The compiler will not be used "
@@ -3234,39 +3214,6 @@ def requires(self, impose: str, *, when: str):
        self.runtime_conditions.add((imposed_spec, when_spec))
        self.reset()
def propagate(self, constraint_str: str, *, when: str):
msg = "the 'propagate' method can be called only with pkg('*')"
assert self.current_package == "*", msg
when_spec = spack.spec.Spec(when)
assert when_spec.name is None, "only anonymous when specs are accepted"
placeholder = "XXX"
node_variable = "node(ID, Package)"
when_spec.name = placeholder
body_clauses = self._setup.spec_clauses(when_spec, body=True)
body_str = (
f" {f',{os.linesep} '.join(str(x) for x in body_clauses)},\n"
f" not external({node_variable}),\n"
f" not runtime(Package)"
).replace(f'"{placeholder}"', f"{node_variable}")
constraint_spec = spack.spec.Spec(constraint_str)
assert constraint_spec.name is None, "only anonymous constraint specs are accepted"
constraint_spec.name = placeholder
constraint_clauses = self._setup.spec_clauses(constraint_spec, body=False)
for clause in constraint_clauses:
if clause.args[0] == "node_compiler_version_satisfies":
self._setup.compiler_version_constraints.add(constraint_spec.compiler)
args = f'"{constraint_spec.compiler.name}", "{constraint_spec.compiler.versions}"'
head_str = f"propagate({node_variable}, node_compiler_version_satisfies({args}))"
rule = f"{head_str} :-\n{body_str}.\n\n"
self.rules.append(rule)
self.reset()
    def consume_facts(self):
        """Consume the facts collected by this object, and emits rules and
        facts for the runtimes.

View File

@@ -811,6 +811,37 @@ node_has_variant(node(ID, Package), Variant) :-
  pkg_fact(Package, variant(Variant)),
  attr("node", node(ID, Package)).
% Variant propagation is forwarded to dependencies
attr("variant_propagation_candidate", PackageNode, Variant, Value, Source) :-
attr("node", PackageNode),
depends_on(ParentNode, PackageNode),
attr("variant_value", node(_, Source), Variant, Value),
attr("variant_propagation_candidate", ParentNode, Variant, _, Source).
% If the node is a candidate, and it has the variant and value,
% then those variant and value should be propagated
attr("variant_propagate", node(ID, Package), Variant, Value, Source) :-
attr("variant_propagation_candidate", node(ID, Package), Variant, Value, Source),
node_has_variant(node(ID, Package), Variant),
pkg_fact(Package, variant_possible_value(Variant, Value)),
not attr("variant_set", node(ID, Package), Variant).
% Propagate the value, if there is the corresponding attribute
attr("variant_value", PackageNode, Variant, Value) :- attr("variant_propagate", PackageNode, Variant, Value, _).
% If a variant is propagated, we cannot have extraneous values (this is for multi valued variants)
variant_is_propagated(PackageNode, Variant) :- attr("variant_propagate", PackageNode, Variant, _, _).
:- variant_is_propagated(PackageNode, Variant),
attr("variant_value", PackageNode, Variant, Value),
not attr("variant_propagate", PackageNode, Variant, Value, _).
% Cannot receive different values from different sources on the same variant
error(100, "{0} and {1} cannot both propagate variant '{2}' to package {3} with values '{4}' and '{5}'", Source1, Source2, Variant, Package, Value1, Value2) :-
attr("variant_propagate", node(X, Package), Variant, Value1, Source1),
attr("variant_propagate", node(X, Package), Variant, Value2, Source2),
node_has_variant(node(X, Package), Variant),
Value1 < Value2, Source1 < Source2.
% a variant cannot be set if it is not a variant on the package
error(100, "Cannot set variant '{0}' for package '{1}' because the variant condition cannot be satisfied for the given spec", Variant, Package)
  :- attr("variant_set", node(X, Package), Variant),
@@ -888,7 +919,7 @@ variant_not_default(node(ID, Package), Variant, Value)
  % variants set explicitly on the CLI don't count as non-default
  not attr("variant_set", node(ID, Package), Variant, Value),
  % variant values forced by propagation don't count as non-default
- not propagate(node(ID, Package), variant_value(Variant, Value)),
+ not attr("variant_propagate", node(ID, Package), Variant, Value, _),
  % variants set on externals that we could use don't count as non-default
  % this makes spack prefer to use an external over rebuilding with the
  % default configuration
@@ -901,7 +932,7 @@ variant_default_not_used(node(ID, Package), Variant, Value)
  :- variant_default_value(Package, Variant, Value),
     node_has_variant(node(ID, Package), Variant),
     not attr("variant_value", node(ID, Package), Variant, Value),
-    not propagate(node(ID, Package), variant_value(Variant, _)),
+    not attr("variant_propagate", node(ID, Package), Variant, _, _),
     attr("node", node(ID, Package)).

% The variant is set in an external spec
@@ -958,67 +989,6 @@ pkg_fact(Package, variant_single_value("dev_path"))
#defined variant_default_value/3.
#defined variant_default_value_from_packages_yaml/3.
%-----------------------------------------------------------------------------
% Propagation semantics
%-----------------------------------------------------------------------------
% Propagation roots have a corresponding attr("propagate", ...)
propagate(RootNode, PropagatedAttribute) :- attr("propagate", RootNode, PropagatedAttribute).
% Propagate an attribute along edges to child nodes
propagate(ChildNode, PropagatedAttribute) :-
propagate(ParentNode, PropagatedAttribute),
depends_on(ParentNode, ChildNode).
%-----------------------------------------------------------------------------
% Activation of propagated values
%-----------------------------------------------------------------------------
%----
% Variants
%----
% If a variant is propagated, and can be accepted, set its value
attr("variant_value", node(ID, Package), Variant, Value) :-
propagate(node(ID, Package), variant_value(Variant, Value)),
node_has_variant(node(ID, Package), Variant),
pkg_fact(Package, variant_possible_value(Variant, Value)),
not attr("variant_set", node(ID, Package), Variant).
% If a variant is propagated, we cannot have extraneous values
variant_is_propagated(PackageNode, Variant) :-
attr("variant_value", PackageNode, Variant, Value),
propagate(PackageNode, variant_value(Variant, Value)),
not attr("variant_set", PackageNode, Variant).
:- variant_is_propagated(PackageNode, Variant),
attr("variant_value", PackageNode, Variant, Value),
not propagate(PackageNode, variant_value(Variant, Value)).
%----
% Compiler constraints
%----
attr("node_compiler_version_satisfies", node(ID, Package), Compiler, Version) :-
propagate(node(ID, Package), node_compiler_version_satisfies(Compiler, Version)),
node_compiler(node(ID, Package), CompilerID),
compiler_name(CompilerID, Compiler),
not runtime(Package),
not external(Package).
%-----------------------------------------------------------------------------
% Runtimes
%-----------------------------------------------------------------------------
% Check whether the DAG has any built package
has_built_packages() :- build(X), not external(X).
% If we build packages, the runtime nodes must use an available compiler
1 { node_compiler(PackageNode, CompilerID) : build(PackageNode), not external(PackageNode) } :-
has_built_packages(),
runtime(RuntimePackage),
node_compiler(node(_, RuntimePackage), CompilerID).
%-----------------------------------------------------------------------------
% Platform semantics
%-----------------------------------------------------------------------------
@@ -1120,18 +1090,10 @@ attr("node_target", PackageNode, Target)
:- attr("node", PackageNode), attr("node_target_set", PackageNode, Target). :- attr("node", PackageNode), attr("node_target_set", PackageNode, Target).
% each node has the weight of its assigned target % each node has the weight of its assigned target
target_weight(Target, 0) node_target_weight(node(ID, Package), Weight)
:- attr("node", PackageNode), :- attr("node", node(ID, Package)),
attr("node_target", PackageNode, Target), attr("node_target", node(ID, Package), Target),
attr("node_target_set", PackageNode, Target). target_weight(Target, Weight).
node_target_weight(PackageNode, MinWeight)
:- attr("node", PackageNode),
attr("node_target", PackageNode, Target),
target(Target),
MinWeight = #min { Weight : target_weight(Target, Weight) }.
:- attr("node_target", PackageNode, Target), not node_target_weight(PackageNode, _).
% compatibility rules for targets among nodes % compatibility rules for targets among nodes
node_target_match(ParentNode, DependencyNode) node_target_match(ParentNode, DependencyNode)
@@ -1193,12 +1155,12 @@ error(10, "No valid compiler for {0} satisfies '%{1}'", Package, Compiler)
% If the compiler of a node must satisfy a constraint, then its version
% must be chosen among the ones that satisfy said constraint
-error(100, "Package {0} cannot satisfy '%{1}@{2}'", Package, Compiler, Constraint)
+error(100, "No valid version for '{0}' compiler '{1}' satisfies '@{2}'", Package, Compiler, Constraint)
  :- attr("node", node(X, Package)),
     attr("node_compiler_version_satisfies", node(X, Package), Compiler, Constraint),
     not compiler_version_satisfies(Compiler, Constraint, _).

-error(100, "Package {0} cannot satisfy '%{1}@{2}'", Package, Compiler, Constraint)
+error(100, "No valid version for '{0}' compiler '{1}' satisfies '@{2}'", Package, Compiler, Constraint)
  :- attr("node", node(X, Package)),
     attr("node_compiler_version_satisfies", node(X, Package), Compiler, Constraint),
     not compiler_version_satisfies(Compiler, Constraint, ID),
@@ -1383,10 +1345,8 @@ build(PackageNode) :- not attr("hash", PackageNode, _), attr("node", PackageNode
% topmost-priority criterion to reuse what is installed.
%
% The priority ranges are:
-%  1000+      Optimizations for concretization errors
-%  300 - 1000 Highest priority optimizations for valid solutions
-%  200 - 299  Shifted priorities for build nodes; correspond to priorities 0 - 99.
-%  100 - 199  Unshifted priorities. Currently only includes minimizing #builds and minimizing dupes.
+%  200+       Shifted priorities for build nodes; correspond to priorities 0 - 99.
+%  100 - 199  Unshifted priorities. Currently only includes minimizing #builds.
%    0 -  99  Priorities for non-built nodes.
build_priority(PackageNode, 200) :- build(PackageNode), attr("node", PackageNode).
build_priority(PackageNode, 0) :- not build(PackageNode), attr("node", PackageNode).
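The comment rewrite above only changes the documented priority bands; the shifting scheme itself is the same on both sides. As far as I can tell from the comment, a criterion declared at priority P in the 0-99 band counts at P for reused nodes and at P + 200 for nodes that must be built, which is why criteria below are emitted twice (e.g. `0@75` with `0@275`). A sketch of the arithmetic, with values taken only from the comment:

def effective_priority(criterion_priority: int, is_build_node: bool) -> int:
    # build nodes get build_priority 200, reused nodes get 0
    return criterion_priority + (200 if is_build_node else 0)

assert effective_priority(70, True) == 270
assert effective_priority(70, False) == 70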
@@ -1434,16 +1394,6 @@ build_priority(PackageNode, 0) :- not build(PackageNode), attr("node", Package
% 2. a `#minimize{ 0@2 : #true }.` statement that ensures the criterion
%    is displayed (clingo doesn't display sums over empty sets by default)
% A condition group specifies one or more specs that must be satisfied.
% Specs declared first are preferred, so we assign increasing weights and
% minimize the weights.
opt_criterion(310, "requirement weight").
#minimize{ 0@310: #true }.
#minimize {
Weight@310,PackageNode,Group
: requirement_weight(PackageNode, Group, Weight)
}.
% Try hard to reuse installed packages (i.e., minimize the number built)
opt_criterion(110, "number of packages to build (vs. reuse)").
#minimize { 0@110: #true }.
@@ -1455,6 +1405,18 @@ opt_criterion(100, "number of nodes from the same package").
#minimize { ID@100,Package : attr("virtual_node", node(ID, Package)) }.
#defined optimize_for_reuse/0.
% A condition group specifies one or more specs that must be satisfied.
% Specs declared first are preferred, so we assign increasing weights and
% minimize the weights.
opt_criterion(75, "requirement weight").
#minimize{ 0@275: #true }.
#minimize{ 0@75: #true }.
#minimize {
Weight@75+Priority,PackageNode,Group
: requirement_weight(PackageNode, Group, Weight),
build_priority(PackageNode, Priority)
}.
% Minimize the number of deprecated versions being used
opt_criterion(73, "deprecated versions used").
#minimize{ 0@273: #true }.
@@ -1470,11 +1432,11 @@ opt_criterion(73, "deprecated versions used").
% 1. Version weight
% 2. Number of variants with a non default value, if not set
%    for the root package.
-opt_criterion(70, "version badness (roots)").
+opt_criterion(70, "version weight").
#minimize{ 0@270: #true }.
#minimize{ 0@70: #true }.
#minimize {
-    Weight@70+Priority,PackageNode
+    Weight@70+Priority
    : attr("root", PackageNode),
      version_weight(PackageNode, Weight),
      build_priority(PackageNode, Priority)
@@ -1534,7 +1496,7 @@ opt_criterion(45, "preferred providers (non-roots)").
}.

% Try to minimize the number of compiler mismatches in the DAG.
-opt_criterion(40, "compiler mismatches that are not required").
+opt_criterion(40, "compiler mismatches that are not from CLI").
#minimize{ 0@240: #true }.
#minimize{ 0@40: #true }.
#minimize{
@@ -1544,7 +1506,7 @@ opt_criterion(40, "compiler mismatches that are not required").
    not runtime(Dependency)
}.

-opt_criterion(39, "compiler mismatches that are required").
+opt_criterion(39, "compiler mismatches that are not from CLI").
#minimize{ 0@239: #true }.
#minimize{ 0@39: #true }.
#minimize{
@@ -1564,14 +1526,13 @@ opt_criterion(30, "non-preferred OS's").
}.

% Choose more recent versions for nodes
-opt_criterion(25, "version badness (non roots)").
+opt_criterion(25, "version badness").
#minimize{ 0@225: #true }.
#minimize{ 0@25: #true }.
#minimize{
    Weight@25+Priority,node(X, Package)
    : version_weight(node(X, Package), Weight),
      build_priority(node(X, Package), Priority),
-     not attr("root", node(X, Package)),
      not runtime(Package)
}.

View File

@@ -4,35 +4,21 @@
% SPDX-License-Identifier: (Apache-2.0 OR MIT)

%=============================================================================
-% Heuristic to speed-up solves
+% Heuristic to speed-up solves (node with ID 0)
%=============================================================================

-% No duplicates by default (most of them will be true)
-#heuristic attr("node", node(PackageID, Package)). [100, init]
-#heuristic attr("node", node(PackageID, Package)). [ 2, factor]
-#heuristic attr("virtual_node", node(VirtualID, Virtual)). [100, init]
-#heuristic attr("node", node(1..X-1, Package)) : max_dupes(Package, X), not virtual(Package), X > 1. [-1, sign]
-#heuristic attr("virtual_node", node(1..X-1, Package)) : max_dupes(Package, X), virtual(Package) , X > 1. [-1, sign]
-
-% Pick preferred version
-#heuristic attr("version", node(PackageID, Package), Version) : pkg_fact(Package, version_declared(Version, Weight)), attr("node", node(PackageID, Package)). [40, init]
-#heuristic version_weight(node(PackageID, Package), 0) : pkg_fact(Package, version_declared(Version, 0 )), attr("node", node(PackageID, Package)). [ 1, sign]
-#heuristic attr("version", node(PackageID, Package), Version) : pkg_fact(Package, version_declared(Version, 0 )), attr("node", node(PackageID, Package)). [ 1, sign]
-#heuristic attr("version", node(PackageID, Package), Version) : pkg_fact(Package, version_declared(Version, Weight)), attr("node", node(PackageID, Package)), Weight > 0. [-1, sign]
-
-% Use default variants
-#heuristic attr("variant_value", node(PackageID, Package), Variant, Value) : variant_default_value(Package, Variant, Value), attr("node", node(PackageID, Package)). [40, true]
-#heuristic attr("variant_value", node(PackageID, Package), Variant, Value) : not variant_default_value(Package, Variant, Value), attr("node", node(PackageID, Package)). [40, false]
-
-% Use default operating system and platform
-#heuristic attr("node_os", node(PackageID, Package), OS) : os(OS, 0), attr("root", node(PackageID, Package)). [40, true]
-#heuristic attr("node_platform", node(PackageID, Package), Platform) : allowed_platform(Platform), attr("root", node(PackageID, Package)). [40, true]
-
-% Use default targets
-#heuristic attr("node_target", node(PackageID, Package), Target) : target_weight(Target, Weight), attr("node", node(PackageID, Package)). [30, init]
-#heuristic attr("node_target", node(PackageID, Package), Target) : target_weight(Target, Weight), attr("node", node(PackageID, Package)). [ 2, factor]
-#heuristic attr("node_target", node(PackageID, Package), Target) : target_weight(Target, 0), attr("node", node(PackageID, Package)). [ 1, sign]
-#heuristic attr("node_target", node(PackageID, Package), Target) : target_weight(Target, Weight), attr("node", node(PackageID, Package)), Weight > 0. [-1, sign]
-
-% Use the default compilers
-#heuristic node_compiler(node(PackageID, Package), ID) : compiler_weight(ID, 0), compiler_id(ID), attr("node", node(PackageID, Package)). [30, init]
+%-----------------
+% Domain heuristic
+%-----------------
+
+% Root node
+#heuristic attr("version", node(0, Package), Version) : pkg_fact(Package, version_declared(Version, 0)), attr("root", node(0, Package)). [35, true]
+#heuristic version_weight(node(0, Package), 0) : pkg_fact(Package, version_declared(Version, 0)), attr("root", node(0, Package)). [35, true]
+#heuristic attr("variant_value", node(0, Package), Variant, Value) : variant_default_value(Package, Variant, Value), attr("root", node(0, Package)). [35, true]
+#heuristic attr("node_target", node(0, Package), Target) : target_weight(Target, 0), attr("root", node(0, Package)). [35, true]
+#heuristic node_target_weight(node(0, Package), 0) : attr("root", node(0, Package)). [35, true]
+#heuristic node_compiler(node(0, Package), CompilerID) : compiler_weight(ID, 0), compiler_id(ID), attr("root", node(0, Package)). [35, true]
+
+% Providers
+#heuristic attr("node", node(0, Package)) : default_provider_preference(Virtual, Package, 0), possible_in_link_run(Package). [30, true]

View File

@@ -0,0 +1,24 @@
% Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
% Spack Project Developers. See the top-level COPYRIGHT file for details.
%
% SPDX-License-Identifier: (Apache-2.0 OR MIT)
%=============================================================================
% Heuristic to speed-up solves (node with ID > 0)
%=============================================================================
% node(ID, _)
#heuristic attr("version", node(ID, Package), Version) : pkg_fact(Package, version_declared(Version, 0)), attr("node", node(ID, Package)), ID > 0. [25-5*ID, true]
#heuristic version_weight(node(ID, Package), 0) : pkg_fact(Package, version_declared(Version, 0)), attr("node", node(ID, Package)), ID > 0. [25-5*ID, true]
#heuristic attr("variant_value", node(ID, Package), Variant, Value) : variant_default_value(Package, Variant, Value), attr("node", node(ID, Package)), ID > 0. [25-5*ID, true]
#heuristic attr("node_target", node(ID, Package), Target) : pkg_fact(Package, target_weight(Target, 0)), attr("node", node(ID, Package)), ID > 0. [25-5*ID, true]
#heuristic node_target_weight(node(ID, Package), 0) : attr("node", node(ID, Package)), ID > 0. [25-5*ID, true]
#heuristic node_compiler(node(ID, Package), CompilerID) : compiler_weight(CompilerID, 0), compiler_id(CompilerID), attr("node", node(ID, Package)), ID > 0. [25-5*ID, true]
% node(ID, _), split build dependencies
#heuristic attr("version", node(ID, Package), Version) : pkg_fact(Package, version_declared(Version, 0)), attr("node", node(ID, Package)), multiple_unification_sets(Package), ID > 0. [25, true]
#heuristic version_weight(node(ID, Package), 0) : pkg_fact(Package, version_declared(Version, 0)), attr("node", node(ID, Package)), multiple_unification_sets(Package), ID > 0. [25, true]
#heuristic attr("variant_value", node(ID, Package), Variant, Value) : variant_default_value(Package, Variant, Value), attr("node", node(ID, Package)), multiple_unification_sets(Package), ID > 0. [25, true]
#heuristic attr("node_target", node(ID, Package), Target) : pkg_fact(Package, target_weight(Target, 0)), attr("node", node(ID, Package)), multiple_unification_sets(Package), ID > 0. [25, true]
#heuristic node_target_weight(node(ID, Package), 0) : attr("node", node(ID, Package)), multiple_unification_sets(Package), ID > 0. [25, true]
#heuristic node_compiler(node(ID, Package), CompilerID) : compiler_weight(CompilerID, 0), compiler_id(CompilerID), attr("node", node(ID, Package)), multiple_unification_sets(Package), ID > 0. [25, true]
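The `[25-5*ID, true]` modifier above makes the hint progressively weaker for each additional duplicate node, so the solver commits to low-ID duplicates first. The resulting weights, for illustration:

for node_id in range(1, 6):
    print(node_id, 25 - 5 * node_id)
# 1 -> 20, 2 -> 15, 3 -> 10, 4 -> 5, 5 -> 0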

View File

@@ -10,13 +10,15 @@
%=============================================================================

% A package cannot be reused if the libc is not compatible with it
-error(100, "Cannot reuse {0} since we cannot determine libc compatibility", ReusedPackage)
-  :- provider(node(X, LibcPackage), node(0, "libc")),
+:- provider(node(X, LibcPackage), node(0, "libc")),
   attr("version", node(X, LibcPackage), LibcVersion),
   attr("hash", node(R, ReusedPackage), Hash),
   % Libc packages can be reused without the "compatible_libc" attribute
   ReusedPackage != LibcPackage,
   not attr("compatible_libc", node(R, ReusedPackage), LibcPackage, LibcVersion).

+% Check whether the DAG has any built package
+has_built_packages() :- build(X), not external(X).

% A libc is needed in the DAG
:- has_built_packages(), not provider(_, node(0, "libc")).

View File

@@ -12,7 +12,6 @@
%=============================================================================

% macOS
-os_compatible("sequoia", "sonoma").
os_compatible("sonoma", "ventura").
os_compatible("ventura", "monterey").
os_compatible("monterey", "bigsur").

View File

@@ -4164,21 +4164,29 @@ def __getitem__(self, name: str):
            csv = query_parameters.pop().strip()
            query_parameters = re.split(r"\s*,\s*", csv)

+       # In some cases a package appears multiple times in the same DAG for *distinct*
+       # specs. For example, a build-type dependency may itself depend on a package
+       # the current spec depends on, but their specs may differ. Therefore we iterate
+       # in an order here that prioritizes the build, test and runtime dependencies;
+       # only when we don't find the package do we consider the full DAG.
        order = lambda: itertools.chain(
-           self.traverse_edges(deptype=dt.LINK, order="breadth", cover="edges"),
-           self.edges_to_dependencies(depflag=dt.BUILD | dt.RUN | dt.TEST),
-           self.traverse_edges(deptype=dt.ALL, order="breadth", cover="edges"),
+           self.traverse(deptype="link"),
+           self.dependencies(deptype=dt.BUILD | dt.RUN | dt.TEST),
+           self.traverse(),  # fall back to a full search
        )

-       # Consider runtime dependencies and direct build/test deps before transitive dependencies,
-       # and prefer matches closest to the root.
        try:
            child: Spec = next(
-               e.spec
-               for e in itertools.chain(
-                   (e for e in order() if e.spec.name == name or name in e.virtuals),
-                   # for historical reasons
-                   (e for e in order() if e.spec.concrete and e.spec.package.provides(name)),
+               itertools.chain(
+                   # Regular specs
+                   (x for x in order() if x.name == name),
+                   (
+                       x
+                       for x in order()
+                       if (not x.virtual)
+                       and any(name in edge.virtuals for edge in x.edges_from_dependents())
+                   ),
+                   (x for x in order() if (not x.virtual) and x.package.provides(name)),
                )
            )
        except StopIteration:
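Both variants of `__getitem__` share one pattern: chain several traversals lazily and take the first hit, so an earlier (cheaper, closer) traversal always wins over the full-DAG fallback. Stripped to its bones, with illustrative names rather than Spack's:

import itertools

def first_match(name, link_deps, direct_deps, all_deps):
    order = lambda: itertools.chain(link_deps, direct_deps, all_deps)
    try:
        return next(s for s in order() if s == name)
    except StopIteration:
        raise KeyError(name)

assert first_match("zlib", ["zlib"], [], ["cmake", "zlib"]) == "zlib"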
@@ -4420,12 +4428,9 @@ def format_attribute(match_object: Match) -> str:
if part.startswith("_"): if part.startswith("_"):
raise SpecFormatStringError("Attempted to format private attribute") raise SpecFormatStringError("Attempted to format private attribute")
else: else:
if isinstance(current, vt.VariantMap): if part == "variants" and isinstance(current, vt.VariantMap):
# subscript instead of getattr for variant names # subscript instead of getattr for variant names
try: current = current[part]
current = current[part]
except KeyError:
raise SpecFormatStringError(f"Variant '{part}' does not exist")
else: else:
# aliases # aliases
if part == "arch": if part == "arch":

View File

@@ -346,6 +346,8 @@ class Stage(LockableStagingDir):
    similar, and are intended to persist for only one run of spack.
    """

+   #: Most staging is managed by Spack. DIYStage is one exception.
+   needs_fetching = True
    requires_patch_success = True

    def __init__(
@@ -770,6 +772,8 @@ def __init__(self):
                "cache_mirror",
                "steal_source",
                "disable_mirrors",
+               "needs_fetching",
+               "requires_patch_success",
            ]
        )
@@ -808,10 +812,6 @@ def path(self):
    def archive_file(self):
        return self[0].archive_file

-   @property
-   def requires_patch_success(self):
-       return self[0].requires_patch_success

    @property
    def keep(self):
        return self[0].keep
@@ -822,7 +822,64 @@ def keep(self, value):
            item.keep = value
class DIYStage:
"""
Simple class that allows any directory to be a spack stage. Consequently,
it does not expect or require that the source path adhere to the standard
directory naming convention.
"""
needs_fetching = False
requires_patch_success = False
def __init__(self, path):
if path is None:
raise ValueError("Cannot construct DIYStage without a path.")
elif not os.path.isdir(path):
raise StagePathError("The stage path directory does not exist:", path)
self.archive_file = None
self.path = path
self.source_path = path
self.created = True
# DIY stages do nothing as context managers.
def __enter__(self):
pass
def __exit__(self, exc_type, exc_val, exc_tb):
pass
def fetch(self, *args, **kwargs):
tty.debug("No need to fetch for DIY.")
def check(self):
tty.debug("No checksum needed for DIY.")
def expand_archive(self):
tty.debug("Using source directory: {0}".format(self.source_path))
@property
def expanded(self):
"""Returns True since the source_path must exist."""
return True
def restage(self):
raise RestageError("Cannot restage a DIY stage.")
def create(self):
self.created = True
def destroy(self):
# No need to destroy DIY stage.
pass
def cache_local(self):
tty.debug("Sources for DIY stages are not cached")
class DevelopStage(LockableStagingDir):
+   needs_fetching = False
    requires_patch_success = False

    def __init__(self, name, dev_path, reference_link):
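Hypothetical use of the re-added DIYStage, wrapping an existing source tree so it can stand in for a fetched-and-expanded stage (the path below is made up):

stage = DIYStage("/home/user/src/mypackage")  # directory must already exist
with stage:
    stage.fetch()           # no-op: sources are already in place
    stage.expand_archive()  # no-op: just reports the source directory
    assert stage.expanded and stage.source_path == "/home/user/src/mypackage"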

View File

@@ -371,6 +371,7 @@ def use_store(
    data.update(extra_data)

    # Swap the store with the one just constructed and return it
-   ensure_singleton_created()
    spack.config.CONFIG.push_scope(
        spack.config.InternalConfigScope(name=scope_name, data={"config": {"install_tree": data}})
    )

View File

@@ -218,12 +218,10 @@ def test_satisfy_strict_constraint_when_not_concrete(architecture_tuple, constra
    str(archspec.cpu.host().family) != "x86_64", reason="tests are for x86_64 uarch ranges"
)
def test_concretize_target_ranges(root_target_range, dep_target_range, result, monkeypatch):
-   spec = Spec(
-       f"pkg-a %gcc@10 foobar=bar target={root_target_range} ^pkg-b target={dep_target_range}"
-   )
+   spec = Spec(f"a %gcc@10 foobar=bar target={root_target_range} ^b target={dep_target_range}")
    with spack.concretize.disable_compiler_existence_check():
        spec.concretize()
-   assert spec.target == spec["pkg-b"].target == result
+   assert spec.target == spec["b"].target == result

@pytest.mark.parametrize(
View File

@@ -22,7 +22,6 @@
import archspec.cpu

from llnl.util.filesystem import join_path, visit_directory_tree
-from llnl.util.symlink import readlink

import spack.binary_distribution as bindist
import spack.caches
@@ -1063,10 +1062,10 @@ def test_tarball_common_prefix(dummy_prefix, tmpdir):
        assert set(os.listdir(os.path.join("prefix2", "share"))) == {"file"}

        # Relative symlink should still be correct
-       assert readlink(os.path.join("prefix2", "bin", "relative_app_link")) == "app"
+       assert os.readlink(os.path.join("prefix2", "bin", "relative_app_link")) == "app"

        # Absolute symlink should remain absolute -- this is for relocation to fix up.
-       assert readlink(os.path.join("prefix2", "bin", "absolute_app_link")) == os.path.join(
+       assert os.readlink(os.path.join("prefix2", "bin", "absolute_app_link")) == os.path.join(
            dummy_prefix, "bin", "app"
        )

View File

@@ -228,25 +228,3 @@ def test_source_is_disabled(mutable_config):
spack.config.add("bootstrap:trusted:{0}:{1}".format(conf["name"], False)) spack.config.add("bootstrap:trusted:{0}:{1}".format(conf["name"], False))
with pytest.raises(ValueError): with pytest.raises(ValueError):
spack.bootstrap.core.source_is_enabled_or_raise(conf) spack.bootstrap.core.source_is_enabled_or_raise(conf)
@pytest.mark.regression("45247")
def test_use_store_does_not_try_writing_outside_root(tmp_path, monkeypatch, mutable_config):
"""Tests that when we use the 'use_store' context manager, there is no attempt at creating
a Store outside the given root.
"""
initial_store = mutable_config.get("config:install_tree:root")
user_store = tmp_path / "store"
fn = spack.store.Store.__init__
def _checked_init(self, root, *args, **kwargs):
fn(self, root, *args, **kwargs)
assert self.root == str(user_store)
monkeypatch.setattr(spack.store.Store, "__init__", _checked_init)
spack.store.reinitialize()
with spack.store.use_store(user_store):
assert spack.config.CONFIG.get("config:install_tree:root") == str(user_store)
assert spack.config.CONFIG.get("config:install_tree:root") == initial_store

View File

@@ -14,7 +14,6 @@
import spack.build_environment
import spack.config
-import spack.deptypes as dt
import spack.package_base
import spack.spec
import spack.util.spack_yaml as syaml
@@ -457,14 +456,14 @@ def test_parallel_false_is_not_propagating(default_mock_concretization):
    # a foobar=bar (parallel = False)
    # |
    # b (parallel =True)
-   s = default_mock_concretization("pkg-a foobar=bar")
+   s = default_mock_concretization("a foobar=bar")

    spack.build_environment.set_package_py_globals(s.package, context=Context.BUILD)
-   assert s["pkg-a"].package.module.make_jobs == 1
+   assert s["a"].package.module.make_jobs == 1

-   spack.build_environment.set_package_py_globals(s["pkg-b"].package, context=Context.BUILD)
-   assert s["pkg-b"].package.module.make_jobs == spack.build_environment.determine_number_of_jobs(
-       parallel=s["pkg-b"].package.parallel
+   spack.build_environment.set_package_py_globals(s["b"].package, context=Context.BUILD)
+   assert s["b"].package.module.make_jobs == spack.build_environment.determine_number_of_jobs(
+       parallel=s["b"].package.parallel
    )
@@ -560,7 +559,7 @@ def test_dirty_disable_module_unload(config, mock_packages, working_env, mock_mo
    """Test that on CRAY platform 'module unload' is not called if the 'dirty'
    option is on.
    """
-   s = spack.spec.Spec("pkg-a").concretized()
+   s = spack.spec.Spec("a").concretized()

    # If called with "dirty" we don't unload modules, so no calls to the
    # `module` function on Cray
@@ -717,21 +716,3 @@ def test_build_system_globals_only_set_on_root_during_build(default_mock_concret
    for depth, spec in root.traverse(depth=True, root=True):
        for variable in build_variables:
            assert hasattr(spec.package.module, variable) == should_be_set(depth)
def test_rpath_with_duplicate_link_deps():
"""If we have two instances of one package in the same link sub-dag, only the newest version is
rpath'ed. This is for runtime support without splicing."""
runtime_1 = spack.spec.Spec("runtime@=1.0")
runtime_2 = spack.spec.Spec("runtime@=2.0")
child = spack.spec.Spec("child@=1.0")
root = spack.spec.Spec("root@=1.0")
root.add_dependency_edge(child, depflag=dt.LINK, virtuals=())
root.add_dependency_edge(runtime_2, depflag=dt.LINK, virtuals=())
child.add_dependency_edge(runtime_1, depflag=dt.LINK, virtuals=())
rpath_deps = spack.build_environment._get_rpath_deps_from_spec(root, transitive_rpaths=True)
assert child in rpath_deps
assert runtime_2 in rpath_deps
assert runtime_1 not in rpath_deps

View File

@@ -97,7 +97,7 @@ def test_negative_ninja_check(self, input_dir, test_dir, concretize_and_setup):
@pytest.mark.usefixtures("config", "mock_packages") @pytest.mark.usefixtures("config", "mock_packages")
class TestAutotoolsPackage: class TestAutotoolsPackage:
def test_with_or_without(self, default_mock_concretization): def test_with_or_without(self, default_mock_concretization):
s = default_mock_concretization("pkg-a") s = default_mock_concretization("a")
options = s.package.with_or_without("foo") options = s.package.with_or_without("foo")
# Ensure that values that are not representing a feature # Ensure that values that are not representing a feature
@@ -129,7 +129,7 @@ def activate(value):
assert "--without-lorem-ipsum" in options assert "--without-lorem-ipsum" in options
def test_none_is_allowed(self, default_mock_concretization): def test_none_is_allowed(self, default_mock_concretization):
s = default_mock_concretization("pkg-a foo=none") s = default_mock_concretization("a foo=none")
options = s.package.with_or_without("foo") options = s.package.with_or_without("foo")
# Ensure that values that are not representing a feature # Ensure that values that are not representing a feature

View File

@@ -51,7 +51,7 @@ def __init__(self, response_code=200, content_to_read=[]):
        self._content = content_to_read
        self._read = [False for c in content_to_read]

-   def open(self, request, data=None, timeout=object()):
+   def open(self, request):
        return self

    def getcode(self):

View File

@@ -106,24 +106,24 @@ def test_specs_staging(config, tmpdir):
""" """
builder = repo.MockRepositoryBuilder(tmpdir) builder = repo.MockRepositoryBuilder(tmpdir)
builder.add_package("pkg-g") builder.add_package("g")
builder.add_package("pkg-f") builder.add_package("f")
builder.add_package("pkg-e") builder.add_package("e")
builder.add_package("pkg-d", dependencies=[("pkg-f", None, None), ("pkg-g", None, None)]) builder.add_package("d", dependencies=[("f", None, None), ("g", None, None)])
builder.add_package("pkg-c") builder.add_package("c")
builder.add_package("pkg-b", dependencies=[("pkg-d", None, None), ("pkg-e", None, None)]) builder.add_package("b", dependencies=[("d", None, None), ("e", None, None)])
builder.add_package("pkg-a", dependencies=[("pkg-b", None, None), ("pkg-c", None, None)]) builder.add_package("a", dependencies=[("b", None, None), ("c", None, None)])
with repo.use_repositories(builder.root): with repo.use_repositories(builder.root):
spec_a = Spec("pkg-a").concretized() spec_a = Spec("a").concretized()
spec_a_label = ci._spec_ci_label(spec_a) spec_a_label = ci._spec_ci_label(spec_a)
spec_b_label = ci._spec_ci_label(spec_a["pkg-b"]) spec_b_label = ci._spec_ci_label(spec_a["b"])
spec_c_label = ci._spec_ci_label(spec_a["pkg-c"]) spec_c_label = ci._spec_ci_label(spec_a["c"])
spec_d_label = ci._spec_ci_label(spec_a["pkg-d"]) spec_d_label = ci._spec_ci_label(spec_a["d"])
spec_e_label = ci._spec_ci_label(spec_a["pkg-e"]) spec_e_label = ci._spec_ci_label(spec_a["e"])
spec_f_label = ci._spec_ci_label(spec_a["pkg-f"]) spec_f_label = ci._spec_ci_label(spec_a["f"])
spec_g_label = ci._spec_ci_label(spec_a["pkg-g"]) spec_g_label = ci._spec_ci_label(spec_a["g"])
spec_labels, dependencies, stages = ci.stage_spec_jobs([spec_a]) spec_labels, dependencies, stages = ci.stage_spec_jobs([spec_a])
@@ -1290,7 +1290,7 @@ def test_ci_generate_override_runner_attrs(
  spack:
    specs:
      - flatten-deps
-     - pkg-a
+     - a
    mirrors:
      some-mirror: https://my.fake.mirror
    ci:
@@ -1307,12 +1307,12 @@ def test_ci_generate_override_runner_attrs(
          - match:
              - dependency-install
          - match:
-             - pkg-a
+             - a
            build-job:
              tags:
                - specific-a-2
          - match:
-             - pkg-a
+             - a
            build-job-remove:
              tags:
                - toplevel2
@@ -1372,8 +1372,8 @@ def test_ci_generate_override_runner_attrs(
        assert global_vars["SPACK_CHECKOUT_VERSION"] == git_version or "v0.20.0.test0"

        for ci_key in yaml_contents.keys():
-           if ci_key.startswith("pkg-a"):
-               # Make sure pkg-a's attributes override variables, and all the
+           if ci_key.startswith("a"):
+               # Make sure a's attributes override variables, and all the
                # scripts. Also, make sure the 'toplevel' tag doesn't
                # appear twice, but that a's specific extra tag does appear
                the_elt = yaml_contents[ci_key]
@@ -1830,7 +1830,7 @@ def test_ci_generate_read_broken_specs_url(
    tmpdir, mutable_mock_env_path, install_mockery, mock_packages, monkeypatch, ci_base_environment
):
    """Verify that `broken-specs-url` works as intended"""
-   spec_a = Spec("pkg-a")
+   spec_a = Spec("a")
    spec_a.concretize()
    a_dag_hash = spec_a.dag_hash()
@@ -1856,7 +1856,7 @@ def test_ci_generate_read_broken_specs_url(
  spack:
    specs:
      - flatten-deps
-     - pkg-a
+     - a
    mirrors:
      some-mirror: https://my.fake.mirror
    ci:
@@ -1864,9 +1864,9 @@ def test_ci_generate_read_broken_specs_url(
      pipeline-gen:
        - submapping:
          - match:
-             - pkg-a
+             - a
              - flatten-deps
-             - pkg-b
+             - b
              - dependency-install
            build-job:
              tags:

View File

@@ -81,14 +81,14 @@ def test_match_spec_env(mock_packages, mutable_mock_env_path):
""" """
# Initial sanity check: we are planning on choosing a non-default # Initial sanity check: we are planning on choosing a non-default
# value, so make sure that is in fact not the default. # value, so make sure that is in fact not the default.
check_defaults = spack.cmd.parse_specs(["pkg-a"], concretize=True)[0] check_defaults = spack.cmd.parse_specs(["a"], concretize=True)[0]
assert not check_defaults.satisfies("foobar=baz") assert not check_defaults.satisfies("foobar=baz")
e = ev.create("test") e = ev.create("test")
e.add("pkg-a foobar=baz") e.add("a foobar=baz")
e.concretize() e.concretize()
with e: with e:
env_spec = spack.cmd.matching_spec_from_env(spack.cmd.parse_specs(["pkg-a"])[0]) env_spec = spack.cmd.matching_spec_from_env(spack.cmd.parse_specs(["a"])[0])
assert env_spec.satisfies("foobar=baz") assert env_spec.satisfies("foobar=baz")
assert env_spec.concrete assert env_spec.concrete
@@ -96,12 +96,12 @@ def test_match_spec_env(mock_packages, mutable_mock_env_path):
@pytest.mark.usefixtures("config") @pytest.mark.usefixtures("config")
def test_multiple_env_match_raises_error(mock_packages, mutable_mock_env_path): def test_multiple_env_match_raises_error(mock_packages, mutable_mock_env_path):
e = ev.create("test") e = ev.create("test")
e.add("pkg-a foobar=baz") e.add("a foobar=baz")
e.add("pkg-a foobar=fee") e.add("a foobar=fee")
e.concretize() e.concretize()
with e: with e:
with pytest.raises(ev.SpackEnvironmentError) as exc_info: with pytest.raises(ev.SpackEnvironmentError) as exc_info:
spack.cmd.matching_spec_from_env(spack.cmd.parse_specs(["pkg-a"])[0]) spack.cmd.matching_spec_from_env(spack.cmd.parse_specs(["a"])[0])
assert "matches multiple specs" in exc_info.value.message assert "matches multiple specs" in exc_info.value.message
@@ -109,16 +109,16 @@ def test_multiple_env_match_raises_error(mock_packages, mutable_mock_env_path):
@pytest.mark.usefixtures("config") @pytest.mark.usefixtures("config")
def test_root_and_dep_match_returns_root(mock_packages, mutable_mock_env_path): def test_root_and_dep_match_returns_root(mock_packages, mutable_mock_env_path):
e = ev.create("test") e = ev.create("test")
e.add("pkg-b@0.9") e.add("b@0.9")
e.add("pkg-a foobar=bar") # Depends on b, should choose b@1.0 e.add("a foobar=bar") # Depends on b, should choose b@1.0
e.concretize() e.concretize()
with e: with e:
# This query matches the root b and b as a dependency of a. In that # This query matches the root b and b as a dependency of a. In that
# case the root instance should be preferred. # case the root instance should be preferred.
env_spec1 = spack.cmd.matching_spec_from_env(spack.cmd.parse_specs(["pkg-b"])[0]) env_spec1 = spack.cmd.matching_spec_from_env(spack.cmd.parse_specs(["b"])[0])
assert env_spec1.satisfies("@0.9") assert env_spec1.satisfies("@0.9")
env_spec2 = spack.cmd.matching_spec_from_env(spack.cmd.parse_specs(["pkg-b@1.0"])[0]) env_spec2 = spack.cmd.matching_spec_from_env(spack.cmd.parse_specs(["b@1.0"])[0])
assert env_spec2 assert env_spec2

View File

@@ -51,8 +51,8 @@ def test_concretize_root_test_dependencies_are_concretized(unify, mutable_mock_e
with ev.read("test") as e: with ev.read("test") as e:
e.unify = unify e.unify = unify
add("pkg-a") add("a")
add("pkg-b") add("b")
concretize("--test", "root") concretize("--test", "root")
assert e.matching_spec("test-dependency") assert e.matching_spec("test-dependency")

View File

@@ -15,26 +15,26 @@
def test_env(mutable_mock_env_path, config, mock_packages):
    ev.create("test")
    with ev.read("test") as e:
-       e.add("pkg-a@2.0 foobar=bar ^pkg-b@1.0")
-       e.add("pkg-a@1.0 foobar=bar ^pkg-b@0.9")
+       e.add("a@2.0 foobar=bar ^b@1.0")
+       e.add("a@1.0 foobar=bar ^b@0.9")
        e.concretize()
        e.write()

def test_deconcretize_dep(test_env):
    with ev.read("test") as e:
-       deconcretize("-y", "pkg-b@1.0")
+       deconcretize("-y", "b@1.0")
        specs = [s for s, _ in e.concretized_specs()]

    assert len(specs) == 1
-   assert specs[0].satisfies("pkg-a@1.0")
+   assert specs[0].satisfies("a@1.0")

def test_deconcretize_all_dep(test_env):
    with ev.read("test") as e:
        with pytest.raises(SpackCommandError):
-           deconcretize("-y", "pkg-b")
-       deconcretize("-y", "--all", "pkg-b")
+           deconcretize("-y", "b")
+       deconcretize("-y", "--all", "b")
        specs = [s for s, _ in e.concretized_specs()]

    assert len(specs) == 0
@@ -42,27 +42,27 @@ def test_deconcretize_all_dep(test_env):
def test_deconcretize_root(test_env):
    with ev.read("test") as e:
-       output = deconcretize("-y", "--root", "pkg-b@1.0")
+       output = deconcretize("-y", "--root", "b@1.0")
        assert "No matching specs to deconcretize" in output
        assert len(e.concretized_order) == 2

-       deconcretize("-y", "--root", "pkg-a@2.0")
+       deconcretize("-y", "--root", "a@2.0")
        specs = [s for s, _ in e.concretized_specs()]

    assert len(specs) == 1
-   assert specs[0].satisfies("pkg-a@1.0")
+   assert specs[0].satisfies("a@1.0")

def test_deconcretize_all_root(test_env):
    with ev.read("test") as e:
        with pytest.raises(SpackCommandError):
-           deconcretize("-y", "--root", "pkg-a")
+           deconcretize("-y", "--root", "a")

-       output = deconcretize("-y", "--root", "--all", "pkg-b")
+       output = deconcretize("-y", "--root", "--all", "b")
        assert "No matching specs to deconcretize" in output
        assert len(e.concretized_order) == 2

-       deconcretize("-y", "--root", "--all", "pkg-a")
+       deconcretize("-y", "--root", "--all", "a")
        specs = [s for s, _ in e.concretized_specs()]

    assert len(specs) == 0

View File

@@ -15,7 +15,6 @@
import llnl.util.filesystem as fs
import llnl.util.link_tree
import llnl.util.tty as tty
-from llnl.util.symlink import readlink

import spack.cmd.env
import spack.config
@@ -28,9 +27,7 @@
import spack.package_base
import spack.paths
import spack.repo
-import spack.store
import spack.util.spack_json as sjson
-import spack.util.spack_yaml
from spack.cmd.env import _env_create
from spack.main import SpackCommand, SpackCommandError
from spack.spec import Spec
@@ -503,7 +500,7 @@ def test_env_install_two_specs_same_dep(install_mockery, mock_fetch, tmpdir, cap
"""\ """\
spack: spack:
specs: specs:
- pkg-a - a
- depb - depb
""" """
) )
@@ -522,8 +519,8 @@ def test_env_install_two_specs_same_dep(install_mockery, mock_fetch, tmpdir, cap
depb = spack.store.STORE.db.query_one("depb", installed=True) depb = spack.store.STORE.db.query_one("depb", installed=True)
assert depb, "Expected depb to be installed" assert depb, "Expected depb to be installed"
a = spack.store.STORE.db.query_one("pkg-a", installed=True) a = spack.store.STORE.db.query_one("a", installed=True)
assert a, "Expected pkg-a to be installed" assert a, "Expected a to be installed"
def test_remove_after_concretize(): def test_remove_after_concretize():
@@ -827,7 +824,7 @@ def test_env_view_external_prefix(tmp_path, mutable_database, mock_packages):
"""\ """\
spack: spack:
specs: specs:
- pkg-a - a
view: true view: true
""" """
) )
@@ -835,9 +832,9 @@ def test_env_view_external_prefix(tmp_path, mutable_database, mock_packages):
external_config = io.StringIO( external_config = io.StringIO(
"""\ """\
packages: packages:
pkg-a: a:
externals: externals:
- spec: pkg-a@2.0 - spec: a@2.0
prefix: {a_prefix} prefix: {a_prefix}
buildable: false buildable: false
""".format( """.format(
@@ -1739,17 +1736,6 @@ def test_env_include_concrete_env_yaml(env_name):
    assert test.path in combined_yaml["include_concrete"]
@pytest.mark.regression("45766")
@pytest.mark.parametrize("format", ["v1", "v2", "v3"])
def test_env_include_concrete_old_env(format, tmpdir):
lockfile = os.path.join(spack.paths.test_path, "data", "legacy_env", f"{format}.lock")
# create an env from old .lock file -- this does not update the format
env("create", "old-env", lockfile)
env("create", "--include-concrete", "old-env", "test")
assert ev.read("old-env").all_specs() == ev.read("test").all_specs()
def test_env_bad_include_concrete_env(): def test_env_bad_include_concrete_env():
with pytest.raises(ev.SpackEnvironmentError): with pytest.raises(ev.SpackEnvironmentError):
env("create", "--include-concrete", "nonexistant_env", "combined_env") env("create", "--include-concrete", "nonexistant_env", "combined_env")
@@ -4428,8 +4414,8 @@ def test_env_view_resolves_identical_file_conflicts(tmp_path, install_mockery, m
     #   view-file/bin/
     #     x  # expect this x to be linked
-    assert readlink(tmp_path / "view" / "bin" / "x") == bottom.bin.x
-    assert readlink(tmp_path / "view" / "bin" / "y") == top.bin.y
+    assert os.readlink(tmp_path / "view" / "bin" / "x") == bottom.bin.x
+    assert os.readlink(tmp_path / "view" / "bin" / "y") == top.bin.y


 def test_env_view_ignores_different_file_conflicts(tmp_path, install_mockery, mock_fetch):
@@ -4440,4 +4426,4 @@ def test_env_view_ignores_different_file_conflicts(tmp_path, install_mockery, mo
     install()
     prefix_dependent = e.matching_spec("view-ignore-conflict").prefix
     # The dependent's file is linked into the view
-    assert readlink(tmp_path / "view" / "bin" / "x") == prefix_dependent.bin.x
+    assert os.readlink(tmp_path / "view" / "bin" / "x") == prefix_dependent.bin.x

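The only behavioral change in the two view tests above is which `readlink` resolves the links that a view plants in its `bin/` directory. A self-contained sketch of the assertion being made, using plain `os.readlink` on a hypothetical layout (POSIX assumed; this is not Spack's own helper):

    import os
    import tempfile

    with tempfile.TemporaryDirectory() as tmp:
        target = os.path.join(tmp, "prefix", "bin", "x")  # file inside a prefix
        link = os.path.join(tmp, "x")                     # link inside a "view"
        os.makedirs(os.path.dirname(target))
        open(target, "w").close()
        os.symlink(target, link)
        # the view's link must point at the file in the winning prefix
        assert os.readlink(link) == target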

@@ -89,7 +89,7 @@ def check(pkg):
         assert pkg.run_tests

     monkeypatch.setattr(spack.package_base.PackageBase, "unit_test_check", check)
-    install("--test=all", "pkg-a")
+    install("--test=all", "a")


 def test_install_package_already_installed(
@@ -570,58 +570,61 @@ def test_cdash_upload_build_error(tmpdir, mock_fetch, install_mockery, capfd):
 @pytest.mark.disable_clean_stage_check
 def test_cdash_upload_clean_build(tmpdir, mock_fetch, install_mockery, capfd):
     # capfd interferes with Spack's capturing of e.g., Build.xml output
-    with capfd.disabled(), tmpdir.as_cwd():
-        install("--log-file=cdash_reports", "--log-format=cdash", "pkg-a")
-        report_dir = tmpdir.join("cdash_reports")
-        assert report_dir in tmpdir.listdir()
-        report_file = report_dir.join("pkg-a_Build.xml")
-        assert report_file in report_dir.listdir()
-        content = report_file.open().read()
-        assert "</Build>" in content
-        assert "<Text>" not in content
+    with capfd.disabled():
+        with tmpdir.as_cwd():
+            install("--log-file=cdash_reports", "--log-format=cdash", "a")
+            report_dir = tmpdir.join("cdash_reports")
+            assert report_dir in tmpdir.listdir()
+            report_file = report_dir.join("a_Build.xml")
+            assert report_file in report_dir.listdir()
+            content = report_file.open().read()
+            assert "</Build>" in content
+            assert "<Text>" not in content


 @pytest.mark.disable_clean_stage_check
 def test_cdash_upload_extra_params(tmpdir, mock_fetch, install_mockery, capfd):
     # capfd interferes with Spack's capture of e.g., Build.xml output
-    with capfd.disabled(), tmpdir.as_cwd():
-        install(
-            "--log-file=cdash_reports",
-            "--log-format=cdash",
-            "--cdash-build=my_custom_build",
-            "--cdash-site=my_custom_site",
-            "--cdash-track=my_custom_track",
-            "pkg-a",
-        )
-        report_dir = tmpdir.join("cdash_reports")
-        assert report_dir in tmpdir.listdir()
-        report_file = report_dir.join("pkg-a_Build.xml")
-        assert report_file in report_dir.listdir()
-        content = report_file.open().read()
-        assert 'Site BuildName="my_custom_build - pkg-a"' in content
-        assert 'Name="my_custom_site"' in content
-        assert "-my_custom_track" in content
+    with capfd.disabled():
+        with tmpdir.as_cwd():
+            install(
+                "--log-file=cdash_reports",
+                "--log-format=cdash",
+                "--cdash-build=my_custom_build",
+                "--cdash-site=my_custom_site",
+                "--cdash-track=my_custom_track",
+                "a",
+            )
+            report_dir = tmpdir.join("cdash_reports")
+            assert report_dir in tmpdir.listdir()
+            report_file = report_dir.join("a_Build.xml")
+            assert report_file in report_dir.listdir()
+            content = report_file.open().read()
+            assert 'Site BuildName="my_custom_build - a"' in content
+            assert 'Name="my_custom_site"' in content
+            assert "-my_custom_track" in content


 @pytest.mark.disable_clean_stage_check
 def test_cdash_buildstamp_param(tmpdir, mock_fetch, install_mockery, capfd):
     # capfd interferes with Spack's capture of e.g., Build.xml output
-    with capfd.disabled(), tmpdir.as_cwd():
-        cdash_track = "some_mocked_track"
-        buildstamp_format = "%Y%m%d-%H%M-{0}".format(cdash_track)
-        buildstamp = time.strftime(buildstamp_format, time.localtime(int(time.time())))
-        install(
-            "--log-file=cdash_reports",
-            "--log-format=cdash",
-            "--cdash-buildstamp={0}".format(buildstamp),
-            "pkg-a",
-        )
-        report_dir = tmpdir.join("cdash_reports")
-        assert report_dir in tmpdir.listdir()
-        report_file = report_dir.join("pkg-a_Build.xml")
-        assert report_file in report_dir.listdir()
-        content = report_file.open().read()
-        assert buildstamp in content
+    with capfd.disabled():
+        with tmpdir.as_cwd():
+            cdash_track = "some_mocked_track"
+            buildstamp_format = "%Y%m%d-%H%M-{0}".format(cdash_track)
+            buildstamp = time.strftime(buildstamp_format, time.localtime(int(time.time())))
+            install(
+                "--log-file=cdash_reports",
+                "--log-format=cdash",
+                "--cdash-buildstamp={0}".format(buildstamp),
+                "a",
+            )
+            report_dir = tmpdir.join("cdash_reports")
+            assert report_dir in tmpdir.listdir()
+            report_file = report_dir.join("a_Build.xml")
+            assert report_file in report_dir.listdir()
+            content = report_file.open().read()
+            assert buildstamp in content


 @pytest.mark.disable_clean_stage_check
@@ -629,37 +632,38 @@ def test_cdash_install_from_spec_json(
     tmpdir, mock_fetch, install_mockery, capfd, mock_packages, mock_archive, config
 ):
     # capfd interferes with Spack's capturing
-    with capfd.disabled(), tmpdir.as_cwd():
-        spec_json_path = str(tmpdir.join("spec.json"))
-        pkg_spec = Spec("pkg-a")
-        pkg_spec.concretize()
-        with open(spec_json_path, "w") as fd:
-            fd.write(pkg_spec.to_json(hash=ht.dag_hash))
-        install(
-            "--log-format=cdash",
-            "--log-file=cdash_reports",
-            "--cdash-build=my_custom_build",
-            "--cdash-site=my_custom_site",
-            "--cdash-track=my_custom_track",
-            "-f",
-            spec_json_path,
-        )
-        report_dir = tmpdir.join("cdash_reports")
-        assert report_dir in tmpdir.listdir()
-        report_file = report_dir.join("pkg-a_Configure.xml")
-        assert report_file in report_dir.listdir()
-        content = report_file.open().read()
-        install_command_regex = re.compile(
-            r"<ConfigureCommand>(.+)</ConfigureCommand>", re.MULTILINE | re.DOTALL
-        )
-        m = install_command_regex.search(content)
-        assert m
-        install_command = m.group(1)
-        assert "pkg-a@" in install_command
+    with capfd.disabled():
+        with tmpdir.as_cwd():
+            spec_json_path = str(tmpdir.join("spec.json"))
+            pkg_spec = Spec("a")
+            pkg_spec.concretize()
+            with open(spec_json_path, "w") as fd:
+                fd.write(pkg_spec.to_json(hash=ht.dag_hash))
+            install(
+                "--log-format=cdash",
+                "--log-file=cdash_reports",
+                "--cdash-build=my_custom_build",
+                "--cdash-site=my_custom_site",
+                "--cdash-track=my_custom_track",
+                "-f",
+                spec_json_path,
+            )
+            report_dir = tmpdir.join("cdash_reports")
+            assert report_dir in tmpdir.listdir()
+            report_file = report_dir.join("a_Configure.xml")
+            assert report_file in report_dir.listdir()
+            content = report_file.open().read()
+            install_command_regex = re.compile(
+                r"<ConfigureCommand>(.+)</ConfigureCommand>", re.MULTILINE | re.DOTALL
+            )
+            m = install_command_regex.search(content)
+            assert m
+            install_command = m.group(1)
+            assert "a@" in install_command


 @pytest.mark.disable_clean_stage_check
@@ -791,15 +795,15 @@ def test_install_no_add_in_env(tmpdir, mock_fetch, install_mockery, mutable_mock
     #   ^libdwarf
     #   ^mpich
     # libelf@0.8.10
-    # pkg-a~bvv
-    #   ^pkg-b
-    # pkg-a
-    #   ^pkg-b
+    # a~bvv
+    #   ^b
+    # a
+    #   ^b
     e = ev.create("test", with_view=False)
     e.add("mpileaks")
     e.add("libelf@0.8.10")  # so env has both root and dep libelf specs
-    e.add("pkg-a")
-    e.add("pkg-a ~bvv")
+    e.add("a")
+    e.add("a ~bvv")
     e.concretize()
     e.write()
     env_specs = e.all_specs()
@@ -810,9 +814,9 @@ def test_install_no_add_in_env(tmpdir, mock_fetch, install_mockery, mutable_mock
     # First find and remember some target concrete specs in the environment
     for e_spec in env_specs:
-        if e_spec.satisfies(Spec("pkg-a ~bvv")):
+        if e_spec.satisfies(Spec("a ~bvv")):
             a_spec = e_spec
-        elif e_spec.name == "pkg-b":
+        elif e_spec.name == "b":
             b_spec = e_spec
         elif e_spec.satisfies(Spec("mpi")):
             mpi_spec = e_spec
@@ -835,8 +839,8 @@ def test_install_no_add_in_env(tmpdir, mock_fetch, install_mockery, mutable_mock
     assert "You can add specs to the environment with 'spack add " in inst_out

     # Without --add, ensure that two packages "a" get installed
-    inst_out = install("pkg-a", output=str)
-    assert len([x for x in e.all_specs() if x.installed and x.name == "pkg-a"]) == 2
+    inst_out = install("a", output=str)
+    assert len([x for x in e.all_specs() if x.installed and x.name == "a"]) == 2

     # Install an unambiguous dependency spec (that already exists as a dep
     # in the environment) and make sure it gets installed (w/ deps),
@@ -869,7 +873,7 @@ def test_install_no_add_in_env(tmpdir, mock_fetch, install_mockery, mutable_mock
     # root of the environment as well as installed.
     assert b_spec not in e.roots()

-    install("--add", "pkg-b")
+    install("--add", "b")

     assert b_spec in e.roots()
     assert b_spec not in e.uninstalled_specs()
@@ -904,7 +908,7 @@ def test_cdash_auth_token(tmpdir, mock_fetch, install_mockery, monkeypatch, capf
     # capfd interferes with Spack's capturing
     with tmpdir.as_cwd(), capfd.disabled():
         monkeypatch.setenv("SPACK_CDASH_AUTH_TOKEN", "asdf")
-        out = install("-v", "--log-file=cdash_reports", "--log-format=cdash", "pkg-a")
+        out = install("-v", "--log-file=cdash_reports", "--log-format=cdash", "a")
         assert "Using CDash auth token from environment" in out
@@ -912,25 +916,26 @@ def test_cdash_auth_token(tmpdir, mock_fetch, install_mockery, monkeypatch, capf
 @pytest.mark.disable_clean_stage_check
 def test_cdash_configure_warning(tmpdir, mock_fetch, install_mockery, capfd):
     # capfd interferes with Spack's capturing of e.g., Build.xml output
-    with capfd.disabled(), tmpdir.as_cwd():
-        # Test would fail if install raised an error.
-        # Ensure that even on non-x86_64 architectures, there are no
-        # dependencies installed
-        spec = Spec("configure-warning").concretized()
-        spec.clear_dependencies()
-        specfile = "./spec.json"
-        with open(specfile, "w") as f:
-            f.write(spec.to_json())
-        install("--log-file=cdash_reports", "--log-format=cdash", specfile)
-        # Verify Configure.xml exists with expected contents.
-        report_dir = tmpdir.join("cdash_reports")
-        assert report_dir in tmpdir.listdir()
-        report_file = report_dir.join("Configure.xml")
-        assert report_file in report_dir.listdir()
-        content = report_file.open().read()
-        assert "foo: No such file or directory" in content
+    with capfd.disabled():
+        with tmpdir.as_cwd():
+            # Test would fail if install raised an error.
+            # Ensure that even on non-x86_64 architectures, there are no
+            # dependencies installed
+            spec = spack.spec.Spec("configure-warning").concretized()
+            spec.clear_dependencies()
+            specfile = "./spec.json"
+            with open(specfile, "w") as f:
+                f.write(spec.to_json())
+            install("--log-file=cdash_reports", "--log-format=cdash", specfile)
+            # Verify Configure.xml exists with expected contents.
+            report_dir = tmpdir.join("cdash_reports")
+            assert report_dir in tmpdir.listdir()
+            report_file = report_dir.join("Configure.xml")
+            assert report_file in report_dir.listdir()
+            content = report_file.open().read()
+            assert "foo: No such file or directory" in content


 @pytest.mark.not_on_windows("ArchSpec gives test platform debian rather than windows")
@@ -947,7 +952,7 @@ def test_compiler_bootstrap(
     assert CompilerSpec("gcc@=12.0") not in compilers.all_compiler_specs()

     # Test succeeds if it does not raise an error
-    install("pkg-a%gcc@=12.0")
+    install("a%gcc@=12.0")


 @pytest.mark.not_on_windows("Binary mirrors not supported on windows")
@@ -987,8 +992,8 @@ def test_compiler_bootstrap_from_binary_mirror(
     # Now make sure that when the compiler is installed from binary mirror,
     # it also gets configured as a compiler. Test succeeds if it does not
     # raise an error
-    install("--no-check-signature", "--cache-only", "--only", "dependencies", "pkg-b%gcc@=10.2.0")
-    install("--no-cache", "--only", "package", "pkg-b%gcc@10.2.0")
+    install("--no-check-signature", "--cache-only", "--only", "dependencies", "b%gcc@=10.2.0")
+    install("--no-cache", "--only", "package", "b%gcc@10.2.0")


 @pytest.mark.not_on_windows("ArchSpec gives test platform debian rather than windows")
@@ -1008,7 +1013,7 @@ def test_compiler_bootstrap_already_installed(
     # Test succeeds if it does not raise an error
     install("gcc@=12.0")
-    install("pkg-a%gcc@=12.0")
+    install("a%gcc@=12.0")


 def test_install_fails_no_args(tmpdir):
@@ -1190,7 +1195,7 @@ def test_report_filename_for_cdash(install_mockery_mutable_config, mock_fetch):
     parser = argparse.ArgumentParser()
     spack.cmd.install.setup_parser(parser)
     args = parser.parse_args(
-        ["--cdash-upload-url", "https://blahblah/submit.php?project=debugging", "pkg-a"]
+        ["--cdash-upload-url", "https://blahblah/submit.php?project=debugging", "a"]
     )
     specs = spack.cmd.install.concrete_specs_from_cli(args, {})
     filename = spack.cmd.install.report_filename(args, specs)

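One detail worth pulling out of the CDash hunks above: the buildstamp is plain `strftime` output with the track name appended, so the value the test greps for can be reproduced standalone:

    import time

    cdash_track = "some_mocked_track"
    buildstamp_format = "%Y%m%d-%H%M-{0}".format(cdash_track)
    buildstamp = time.strftime(buildstamp_format, time.localtime(int(time.time())))
    # e.g. "20240511-1801-some_mocked_track"; the test only asserts that this
    # exact string appears in the generated Build.xml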

@@ -121,7 +121,7 @@ def test_maintainers_list_packages(mock_packages, capfd):
 def test_maintainers_list_fails(mock_packages, capfd):
-    out = maintainers("pkg-a", fail_on_error=False)
+    out = maintainers("a", fail_on_error=False)
     assert not out
     assert maintainers.returncode == 1


@@ -11,7 +11,6 @@
 import spack.config
 import spack.main
 import spack.modules
-import spack.spec
 import spack.store

 module = spack.main.SpackCommand("module")
@@ -179,8 +178,8 @@ def test_setdefault_command(mutable_database, mutable_config):
         }
     }
     spack.config.set("modules", data)
-    # Install two different versions of pkg-a
-    other_spec, preferred = "pkg-a@1.0", "pkg-a@2.0"
+    # Install two different versions of a package
+    other_spec, preferred = "a@1.0", "a@2.0"

     spack.spec.Spec(other_spec).concretized().package.do_install(fake=True)
     spack.spec.Spec(preferred).concretized().package.do_install(fake=True)

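The setdefault test above leans on `fake=True` installs, which register a concretized spec in the database without building anything. The two-version setup, in isolation (a sketch; spec names as on the old side of the hunk, Spack fixtures assumed):

    other_spec, preferred = "pkg-a@1.0", "pkg-a@2.0"
    for spec_str in (other_spec, preferred):
        # concretize, then record a fake (empty) installation in the DB
        spack.spec.Spec(spec_str).concretized().package.do_install(fake=True)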

@@ -28,8 +28,8 @@ def install(self, spec, prefix):
         pass
     """

-abc = {"mockpkg-a", "mockpkg-b", "mockpkg-c"}
-abd = {"mockpkg-a", "mockpkg-b", "mockpkg-d"}
+abc = set(("pkg-a", "pkg-b", "pkg-c"))
+abd = set(("pkg-a", "pkg-b", "pkg-d"))


 # Force all tests to use a git repository *in* the mock packages repo.
@@ -53,33 +53,27 @@ def mock_pkg_git_repo(git, tmpdir_factory):
     git("config", "user.name", "Spack Testing")
     git("-c", "commit.gpgsign=false", "commit", "-m", "initial mock repo commit")

-    # add commit with mockpkg-a, mockpkg-b, mockpkg-c packages
-    mkdirp("mockpkg-a", "mockpkg-b", "mockpkg-c")
-    with open("mockpkg-a/package.py", "w") as f:
-        f.write(pkg_template.format(name="PkgA"))
-    with open("mockpkg-b/package.py", "w") as f:
-        f.write(pkg_template.format(name="PkgB"))
-    with open("mockpkg-c/package.py", "w") as f:
-        f.write(pkg_template.format(name="PkgC"))
-    git("add", "mockpkg-a", "mockpkg-b", "mockpkg-c")
-    git("-c", "commit.gpgsign=false", "commit", "-m", "add mockpkg-a, mockpkg-b, mockpkg-c")
-
-    # remove mockpkg-c, add mockpkg-d
-    with open("mockpkg-b/package.py", "a") as f:
-        f.write("\n# change mockpkg-b")
-    git("add", "mockpkg-b")
-    mkdirp("mockpkg-d")
-    with open("mockpkg-d/package.py", "w") as f:
-        f.write(pkg_template.format(name="PkgD"))
-    git("add", "mockpkg-d")
-    git("rm", "-rf", "mockpkg-c")
-    git(
-        "-c",
-        "commit.gpgsign=false",
-        "commit",
-        "-m",
-        "change mockpkg-b, remove mockpkg-c, add mockpkg-d",
-    )
+    # add commit with pkg-a, pkg-b, pkg-c packages
+    mkdirp("pkg-a", "pkg-b", "pkg-c")
+    with open("pkg-a/package.py", "w") as f:
+        f.write(pkg_template.format(name="PkgA"))
+    with open("pkg-b/package.py", "w") as f:
+        f.write(pkg_template.format(name="PkgB"))
+    with open("pkg-c/package.py", "w") as f:
+        f.write(pkg_template.format(name="PkgC"))
+    git("add", "pkg-a", "pkg-b", "pkg-c")
+    git("-c", "commit.gpgsign=false", "commit", "-m", "add pkg-a, pkg-b, pkg-c")
+
+    # remove pkg-c, add pkg-d
+    with open("pkg-b/package.py", "a") as f:
+        f.write("\n# change pkg-b")
+    git("add", "pkg-b")
+    mkdirp("pkg-d")
+    with open("pkg-d/package.py", "w") as f:
+        f.write(pkg_template.format(name="PkgD"))
+    git("add", "pkg-d")
+    git("rm", "-rf", "pkg-c")
+    git("-c", "commit.gpgsign=false", "commit", "-m", "change pkg-b, remove pkg-c, add pkg-d")

     with spack.repo.use_repositories(str(repo_path)):
         yield mock_repo_packages
@@ -92,11 +86,12 @@ def mock_pkg_names():
     # Be sure to include virtual packages since packages with stand-alone
     # tests may inherit additional tests from the virtuals they provide,
     # such as packages that implement `mpi`.
-    return {
-        name
-        for name in repo.all_package_names(include_virtuals=True)
-        if not name.startswith("mockpkg-")
-    }
+    names = set(
+        name
+        for name in repo.all_package_names(include_virtuals=True)
+        if not name.startswith("pkg-")
+    )
+    return names


 def split(output):
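The `mock_pkg_names` hunk is a pure style change: a generator expression passed to `set()` on one side, a set comprehension on the other. Both produce the same set; only the syntax differs:

    names = ["pkg-a", "mockpkg-a", "mockpkg-b"]

    via_constructor = set(n for n in names if not n.startswith("mockpkg-"))
    via_comprehension = {n for n in names if not n.startswith("mockpkg-")}
    assert via_constructor == via_comprehension == {"pkg-a"}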
@@ -118,17 +113,17 @@ def test_mock_packages_path(mock_packages):
 def test_pkg_add(git, mock_pkg_git_repo):
     with working_dir(mock_pkg_git_repo):
-        mkdirp("mockpkg-e")
-        with open("mockpkg-e/package.py", "w") as f:
+        mkdirp("pkg-e")
+        with open("pkg-e/package.py", "w") as f:
             f.write(pkg_template.format(name="PkgE"))

-    pkg("add", "mockpkg-e")
+    pkg("add", "pkg-e")

     with working_dir(mock_pkg_git_repo):
         try:
-            assert "A mockpkg-e/package.py" in git("status", "--short", output=str)
+            assert "A pkg-e/package.py" in git("status", "--short", output=str)
         finally:
-            shutil.rmtree("mockpkg-e")
+            shutil.rmtree("pkg-e")
             # Removing a package mid-run disrupts Spack's caching
             if spack.repo.PATH.repos[0]._fast_package_checker:
                 spack.repo.PATH.repos[0]._fast_package_checker.invalidate()
@@ -143,10 +138,10 @@ def test_pkg_list(mock_pkg_git_repo, mock_pkg_names):
     assert sorted(mock_pkg_names) == sorted(out)

     out = split(pkg("list", "HEAD^"))
-    assert sorted(mock_pkg_names.union(["mockpkg-a", "mockpkg-b", "mockpkg-c"])) == sorted(out)
+    assert sorted(mock_pkg_names.union(["pkg-a", "pkg-b", "pkg-c"])) == sorted(out)

     out = split(pkg("list", "HEAD"))
-    assert sorted(mock_pkg_names.union(["mockpkg-a", "mockpkg-b", "mockpkg-d"])) == sorted(out)
+    assert sorted(mock_pkg_names.union(["pkg-a", "pkg-b", "pkg-d"])) == sorted(out)

     # test with three dots to make sure pkg calls `git merge-base`
     out = split(pkg("list", "HEAD^^..."))
@@ -156,25 +151,25 @@ def test_pkg_list(mock_pkg_git_repo, mock_pkg_names):
 @pytest.mark.not_on_windows("stdout format conflict")
 def test_pkg_diff(mock_pkg_git_repo, mock_pkg_names):
     out = split(pkg("diff", "HEAD^^", "HEAD^"))
-    assert out == ["HEAD^:", "mockpkg-a", "mockpkg-b", "mockpkg-c"]
+    assert out == ["HEAD^:", "pkg-a", "pkg-b", "pkg-c"]

     out = split(pkg("diff", "HEAD^^", "HEAD"))
-    assert out == ["HEAD:", "mockpkg-a", "mockpkg-b", "mockpkg-d"]
+    assert out == ["HEAD:", "pkg-a", "pkg-b", "pkg-d"]

     out = split(pkg("diff", "HEAD^", "HEAD"))
-    assert out == ["HEAD^:", "mockpkg-c", "HEAD:", "mockpkg-d"]
+    assert out == ["HEAD^:", "pkg-c", "HEAD:", "pkg-d"]


 @pytest.mark.not_on_windows("stdout format conflict")
 def test_pkg_added(mock_pkg_git_repo):
     out = split(pkg("added", "HEAD^^", "HEAD^"))
-    assert ["mockpkg-a", "mockpkg-b", "mockpkg-c"] == out
+    assert ["pkg-a", "pkg-b", "pkg-c"] == out

     out = split(pkg("added", "HEAD^^", "HEAD"))
-    assert ["mockpkg-a", "mockpkg-b", "mockpkg-d"] == out
+    assert ["pkg-a", "pkg-b", "pkg-d"] == out

     out = split(pkg("added", "HEAD^", "HEAD"))
-    assert ["mockpkg-d"] == out
+    assert ["pkg-d"] == out

     out = split(pkg("added", "HEAD", "HEAD"))
     assert out == []
@@ -189,7 +184,7 @@ def test_pkg_removed(mock_pkg_git_repo):
     assert out == []

     out = split(pkg("removed", "HEAD^", "HEAD"))
-    assert out == ["mockpkg-c"]
+    assert out == ["pkg-c"]


 @pytest.mark.not_on_windows("stdout format conflict")
@@ -201,34 +196,34 @@ def test_pkg_changed(mock_pkg_git_repo):
     assert out == []

     out = split(pkg("changed", "--type", "a", "HEAD^^", "HEAD^"))
-    assert out == ["mockpkg-a", "mockpkg-b", "mockpkg-c"]
+    assert out == ["pkg-a", "pkg-b", "pkg-c"]

     out = split(pkg("changed", "--type", "r", "HEAD^^", "HEAD^"))
     assert out == []

     out = split(pkg("changed", "--type", "ar", "HEAD^^", "HEAD^"))
-    assert out == ["mockpkg-a", "mockpkg-b", "mockpkg-c"]
+    assert out == ["pkg-a", "pkg-b", "pkg-c"]

     out = split(pkg("changed", "--type", "arc", "HEAD^^", "HEAD^"))
-    assert out == ["mockpkg-a", "mockpkg-b", "mockpkg-c"]
+    assert out == ["pkg-a", "pkg-b", "pkg-c"]

     out = split(pkg("changed", "HEAD^", "HEAD"))
-    assert out == ["mockpkg-b"]
+    assert out == ["pkg-b"]

     out = split(pkg("changed", "--type", "c", "HEAD^", "HEAD"))
-    assert out == ["mockpkg-b"]
+    assert out == ["pkg-b"]

     out = split(pkg("changed", "--type", "a", "HEAD^", "HEAD"))
-    assert out == ["mockpkg-d"]
+    assert out == ["pkg-d"]

     out = split(pkg("changed", "--type", "r", "HEAD^", "HEAD"))
-    assert out == ["mockpkg-c"]
+    assert out == ["pkg-c"]

     out = split(pkg("changed", "--type", "ar", "HEAD^", "HEAD"))
-    assert out == ["mockpkg-c", "mockpkg-d"]
+    assert out == ["pkg-c", "pkg-d"]

     out = split(pkg("changed", "--type", "arc", "HEAD^", "HEAD"))
-    assert out == ["mockpkg-b", "mockpkg-c", "mockpkg-d"]
+    assert out == ["pkg-b", "pkg-c", "pkg-d"]

     # invalid type argument
     with pytest.raises(spack.main.SpackCommandError):
@@ -294,7 +289,7 @@ def test_pkg_canonical_source(mock_packages):
 def test_pkg_hash(mock_packages):
-    output = pkg("hash", "pkg-a", "pkg-b").strip().split()
+    output = pkg("hash", "a", "b").strip().split()
     assert len(output) == 2 and all(len(elt) == 32 for elt in output)

     output = pkg("hash", "multimethod").strip().split()


@@ -58,7 +58,7 @@ def test_spec_concretizer_args(mutable_config, mutable_database, do_not_check_ru
 def test_spec_parse_dependency_variant_value():
     """Verify that we can provide multiple key=value variants to multiple separate
     packages within a spec string."""
-    output = spec("multivalue-variant fee=barbaz ^ pkg-a foobar=baz")
+    output = spec("multivalue-variant fee=barbaz ^ a foobar=baz")

     assert "fee=barbaz" in output
     assert "foobar=baz" in output


@@ -10,14 +10,10 @@
 from llnl.util.filesystem import copy_tree

-import spack.cmd.common.arguments
 import spack.cmd.install
-import spack.cmd.test
 import spack.config
-import spack.install_test
 import spack.package_base
 import spack.paths
-import spack.spec
 import spack.store
 from spack.install_test import TestStatus
 from spack.main import SpackCommand


@@ -24,8 +24,6 @@
 import spack.platforms
 import spack.repo
 import spack.solver.asp
-import spack.store
-import spack.util.file_cache
 import spack.util.libc
 import spack.variant as vt
 from spack.concretize import find_spec
@@ -406,7 +404,7 @@ def test_compiler_flags_from_compiler_and_dependent(self):
     def test_compiler_flags_differ_identical_compilers(self, mutable_config, clang12_with_flags):
         mutable_config.set("compilers", [clang12_with_flags])
         # Correct arch to use test compiler that has flags
-        spec = Spec("pkg-a %clang@12.2.0 platform=test os=fe target=fe")
+        spec = Spec("a %clang@12.2.0 platform=test os=fe target=fe")

         # Get the compiler that matches the spec (
         compiler = spack.compilers.compiler_for_spec("clang@=12.2.0", spec.architecture)
@@ -475,7 +473,7 @@ def test_architecture_deep_inheritance(self, mock_targets, compiler_factory):
             assert s.architecture.target == spec.architecture.target

     def test_compiler_flags_from_user_are_grouped(self):
-        spec = Spec('pkg-a%gcc cflags="-O -foo-flag foo-val" platform=test')
+        spec = Spec('a%gcc cflags="-O -foo-flag foo-val" platform=test')
         spec.concretize()
         cflags = spec.compiler_flags["cflags"]
         assert any(x == "-foo-flag foo-val" for x in cflags)
@@ -583,20 +581,20 @@ def test_concretize_propagate_multivalue_variant(self):
         spec = Spec("multivalue-variant foo==baz,fee")
         spec.concretize()

-        assert spec.satisfies("^pkg-a foo=baz,fee")
-        assert spec.satisfies("^pkg-b foo=baz,fee")
-        assert not spec.satisfies("^pkg-a foo=bar")
-        assert not spec.satisfies("^pkg-b foo=bar")
+        assert spec.satisfies("^a foo=baz,fee")
+        assert spec.satisfies("^b foo=baz,fee")
+        assert not spec.satisfies("^a foo=bar")
+        assert not spec.satisfies("^b foo=bar")

     def test_no_matching_compiler_specs(self, mock_low_high_config):
         # only relevant when not building compilers as needed
         with spack.concretize.enable_compiler_existence_check():
-            s = Spec("pkg-a %gcc@=0.0.0")
+            s = Spec("a %gcc@=0.0.0")
             with pytest.raises(spack.concretize.UnavailableCompilerVersionError):
                 s.concretize()

     def test_no_compilers_for_arch(self):
-        s = Spec("pkg-a arch=linux-rhel0-x86_64")
+        s = Spec("a arch=linux-rhel0-x86_64")
         with pytest.raises(spack.error.SpackError):
             s.concretize()
@@ -805,7 +803,7 @@ def test_regression_issue_7941(self):
         # The string representation of a spec containing
         # an explicit multi-valued variant and a dependency
         # might be parsed differently than the originating spec
-        s = Spec("pkg-a foobar=bar ^pkg-b")
+        s = Spec("a foobar=bar ^b")
         t = Spec(str(s))

         s.concretize()
@@ -1185,14 +1183,14 @@ def test_conditional_provides_or_depends_on(self):
         [
             # Check that True is treated correctly and attaches test deps
             # to all nodes in the DAG
-            ("pkg-a", True, ["pkg-a"], []),
-            ("pkg-a foobar=bar", True, ["pkg-a", "pkg-b"], []),
+            ("a", True, ["a"], []),
+            ("a foobar=bar", True, ["a", "b"], []),
             # Check that a list of names activates the dependency only for
             # packages in that list
-            ("pkg-a foobar=bar", ["pkg-a"], ["pkg-a"], ["pkg-b"]),
-            ("pkg-a foobar=bar", ["pkg-b"], ["pkg-b"], ["pkg-a"]),
+            ("a foobar=bar", ["a"], ["a"], ["b"]),
+            ("a foobar=bar", ["b"], ["b"], ["a"]),
             # Check that False disregard test dependencies
-            ("pkg-a foobar=bar", False, [], ["pkg-a", "pkg-b"]),
+            ("a foobar=bar", False, [], ["a", "b"]),
         ],
     )
     def test_activating_test_dependencies(self, spec_str, tests_arg, with_dep, without_dep):
@@ -1251,7 +1249,7 @@ def test_custom_compiler_version(self, mutable_config, compiler_factory, monkeyp
             "compilers", [compiler_factory(spec="gcc@10foo", operating_system="redhat6")]
         )
         monkeypatch.setattr(spack.compiler.Compiler, "real_version", "10.2.1")
-        s = Spec("pkg-a %gcc@10foo os=redhat6").concretized()
+        s = Spec("a %gcc@10foo os=redhat6").concretized()
         assert "%gcc@10foo" in s

     def test_all_patches_applied(self):
@@ -1395,10 +1393,10 @@ def test_no_reuse_when_variant_condition_does_not_hold(self, mutable_database, m
     @pytest.mark.only_clingo("Use case not supported by the original concretizer")
     def test_reuse_with_flags(self, mutable_database, mutable_config):
         spack.config.set("concretizer:reuse", True)
-        spec = Spec("pkg-a cflags=-g cxxflags=-g").concretized()
+        spec = Spec("a cflags=-g cxxflags=-g").concretized()
         spack.store.STORE.db.add(spec, None)

-        testspec = Spec("pkg-a cflags=-g")
+        testspec = Spec("a cflags=-g")
         testspec.concretize()
         assert testspec == spec
@@ -1741,49 +1739,49 @@ def test_reuse_with_unknown_namespace_dont_raise(
         self, temporary_store, mock_custom_repository
     ):
         with spack.repo.use_repositories(mock_custom_repository, override=False):
-            s = Spec("pkg-c").concretized()
+            s = Spec("c").concretized()
             assert s.namespace != "builtin.mock"
             s.package.do_install(fake=True, explicit=True)

         with spack.config.override("concretizer:reuse", True):
-            s = Spec("pkg-c").concretized()
+            s = Spec("c").concretized()
         assert s.namespace == "builtin.mock"

     @pytest.mark.regression("28259")
     def test_reuse_with_unknown_package_dont_raise(self, tmpdir, temporary_store, monkeypatch):
         builder = spack.repo.MockRepositoryBuilder(tmpdir.mkdir("mock.repo"), namespace="myrepo")
-        builder.add_package("pkg-c")
+        builder.add_package("c")
         with spack.repo.use_repositories(builder.root, override=False):
-            s = Spec("pkg-c").concretized()
+            s = Spec("c").concretized()
             assert s.namespace == "myrepo"
             s.package.do_install(fake=True, explicit=True)

-        del sys.modules["spack.pkg.myrepo.pkg-c"]
+        del sys.modules["spack.pkg.myrepo.c"]
         del sys.modules["spack.pkg.myrepo"]
-        builder.remove("pkg-c")
+        builder.remove("c")
         with spack.repo.use_repositories(builder.root, override=False) as repos:
             # TODO (INJECT CONFIGURATION): unclear why the cache needs to be invalidated explicitly
             repos.repos[0]._pkg_checker.invalidate()
             with spack.config.override("concretizer:reuse", True):
-                s = Spec("pkg-c").concretized()
+                s = Spec("c").concretized()
             assert s.namespace == "builtin.mock"

     @pytest.mark.parametrize(
-        "specs,expected,libc_offset",
+        "specs,expected",
         [
-            (["libelf", "libelf@0.8.10"], 1, 1),
-            (["libdwarf%gcc", "libelf%clang"], 2, 1),
-            (["libdwarf%gcc", "libdwarf%clang"], 3, 2),
-            (["libdwarf^libelf@0.8.12", "libdwarf^libelf@0.8.13"], 4, 1),
-            (["hdf5", "zmpi"], 3, 1),
-            (["hdf5", "mpich"], 2, 1),
-            (["hdf5^zmpi", "mpich"], 4, 1),
-            (["mpi", "zmpi"], 2, 1),
-            (["mpi", "mpich"], 1, 1),
+            (["libelf", "libelf@0.8.10"], 1),
+            (["libdwarf%gcc", "libelf%clang"], 2),
+            (["libdwarf%gcc", "libdwarf%clang"], 3),
+            (["libdwarf^libelf@0.8.12", "libdwarf^libelf@0.8.13"], 4),
+            (["hdf5", "zmpi"], 3),
+            (["hdf5", "mpich"], 2),
+            (["hdf5^zmpi", "mpich"], 4),
+            (["mpi", "zmpi"], 2),
+            (["mpi", "mpich"], 1),
         ],
     )
     @pytest.mark.only_clingo("Original concretizer cannot concretize in rounds")
-    def test_best_effort_coconcretize(self, specs, expected, libc_offset):
+    def test_best_effort_coconcretize(self, specs, expected):
         specs = [Spec(s) for s in specs]
         solver = spack.solver.asp.Solver()
         solver.reuse = False
@@ -1792,9 +1790,7 @@ def test_best_effort_coconcretize(self, specs, expected, libc_offset):
         for s in result.specs:
             concrete_specs.update(s.traverse())

-        if not spack.solver.asp.using_libc_compatibility():
-            libc_offset = 0
-
+        libc_offset = 1 if spack.solver.asp.using_libc_compatibility() else 0
         assert len(concrete_specs) == expected + libc_offset

     @pytest.mark.parametrize(
@@ -1896,20 +1892,20 @@ def test_misleading_error_message_on_version(self, mutable_database):
     @pytest.mark.only_clingo("Use case not supported by the original concretizer")
     def test_version_weight_and_provenance(self):
         """Test package preferences during coconcretization."""
-        reusable_specs = [Spec(spec_str).concretized() for spec_str in ("pkg-b@0.9", "pkg-b@1.0")]
-        root_spec = Spec("pkg-a foobar=bar")
+        reusable_specs = [Spec(spec_str).concretized() for spec_str in ("b@0.9", "b@1.0")]
+        root_spec = Spec("a foobar=bar")

         with spack.config.override("concretizer:reuse", True):
             solver = spack.solver.asp.Solver()
             setup = spack.solver.asp.SpackSolverSetup()
             result, _, _ = solver.driver.solve(setup, [root_spec], reuse=reusable_specs)
-            # The result here should have a single spec to build ('pkg-a')
-            # and it should be using pkg-b@1.0 with a version badness of 2
+            # The result here should have a single spec to build ('a')
+            # and it should be using b@1.0 with a version badness of 2
             # The provenance is:
-            # version_declared("pkg-b","1.0",0,"package_py").
-            # version_declared("pkg-b","0.9",1,"package_py").
-            # version_declared("pkg-b","1.0",2,"installed").
-            # version_declared("pkg-b","0.9",3,"installed").
+            # version_declared("b","1.0",0,"package_py").
+            # version_declared("b","0.9",1,"package_py").
+            # version_declared("b","1.0",2,"installed").
+            # version_declared("b","0.9",3,"installed").
             #
             # Depending on the target, it may also use gnuconfig
             result_spec = result.specs[0]
@@ -1918,16 +1914,16 @@ def test_version_weight_and_provenance(self):
             libc_offset = 1 if spack.solver.asp.using_libc_compatibility() else 0
             criteria = [
                 (num_specs - 1 - libc_offset, None, "number of packages to build (vs. reuse)"),
-                (2, 0, "version badness (non roots)"),
+                (2, 0, "version badness"),
             ]

             for criterion in criteria:
-                assert criterion in result.criteria, criterion
-            assert result_spec.satisfies("^pkg-b@1.0")
+                assert criterion in result.criteria, result_spec
+            assert result_spec.satisfies("^b@1.0")

     @pytest.mark.only_clingo("Use case not supported by the original concretizer")
     def test_reuse_succeeds_with_config_compatible_os(self):
-        root_spec = Spec("pkg-b")
+        root_spec = Spec("b")
         s = root_spec.concretized()
         other_os = s.copy()
         mock_os = "ubuntu2204"
@@ -2191,7 +2187,7 @@ def test_external_python_extension_find_unified_python(self):
         "specs",
         [
             ["mpileaks^ callpath ^dyninst@8.1.1:8 ^mpich2@1.3:1"],
-            ["multivalue-variant ^pkg-a@2:2"],
+            ["multivalue-variant ^a@2:2"],
             ["v1-consumer ^conditional-provider@1:1 +disable-v1"],
         ],
     )
@@ -2230,9 +2226,9 @@ def test_unsolved_specs_raises_error(self, monkeypatch, mock_packages, config):
     def test_clear_error_when_unknown_compiler_requested(self, mock_packages, config):
         """Tests that the solver can report a case where the compiler cannot be set"""
         with pytest.raises(
-            spack.error.UnsatisfiableSpecError, match="Cannot set the required compiler: pkg-a%foo"
+            spack.error.UnsatisfiableSpecError, match="Cannot set the required compiler: a%foo"
         ):
-            Spec("pkg-a %foo").concretized()
+            Spec("a %foo").concretized()

     @pytest.mark.regression("36339")
     def test_compiler_match_constraints_when_selected(self):
@@ -2268,7 +2264,7 @@ def test_compiler_match_constraints_when_selected(self):
             },
         ]
         spack.config.set("compilers", compiler_configuration)
-        s = Spec("pkg-a %gcc@:11").concretized()
+        s = Spec("a %gcc@:11").concretized()
         assert s.compiler.version == ver("=11.1.0"), s

     @pytest.mark.regression("36339")
@@ -2289,7 +2285,7 @@ def test_compiler_with_custom_non_numeric_version(self, mock_executable):
             }
         ]
         spack.config.set("compilers", compiler_configuration)
-        s = Spec("pkg-a %gcc@foo").concretized()
+        s = Spec("a %gcc@foo").concretized()
         assert s.compiler.version == ver("=foo")

     @pytest.mark.regression("36628")
@@ -2315,7 +2311,7 @@ def test_concretization_with_compilers_supporting_target_any(self):
         ]
         with spack.config.override("compilers", compiler_configuration):
-            s = Spec("pkg-a").concretized()
+            s = spack.spec.Spec("a").concretized()
         assert s.satisfies("%gcc@12.1.0")

     @pytest.mark.parametrize("spec_str", ["mpileaks", "mpileaks ^mpich"])
@@ -2350,7 +2346,7 @@ def test_dont_define_new_version_from_input_if_checksum_required(self, working_e
         with pytest.raises(spack.error.UnsatisfiableSpecError):
             # normally spack concretizes to @=3.0 if it's not defined in package.py, except
             # when checksums are required
-            Spec("pkg-a@=3.0").concretized()
+            Spec("a@=3.0").concretized()

     @pytest.mark.regression("39570")
     @pytest.mark.db
@@ -2450,7 +2446,7 @@ def _default_libc(self):
             spack.util.libc, "libc_from_current_python_process", lambda: Spec("glibc@=2.28")
         )
         mutable_config.set("config:install_missing_compilers", True)
-        s = Spec("pkg-a %gcc@=13.2.0").concretized()
+        s = Spec("a %gcc@=13.2.0").concretized()
        assert s.satisfies("%gcc@13.2.0")

     @pytest.mark.regression("43267")
@@ -2550,79 +2546,6 @@ def test_include_specs_from_externals_and_libcs(
         assert result["deprecated-versions"].satisfies("@1.0.0")


-    @pytest.mark.regression("44085")
-    @pytest.mark.only_clingo("Use case not supported by the original concretizer")
-    def test_can_reuse_concrete_externals_for_dependents(self, mutable_config, tmp_path):
-        """Test that external specs that are in the DB can be reused. This means they are
-        preferred to concretizing another external from packages.yaml
-        """
-        packages_yaml = {
-            "externaltool": {"externals": [{"spec": "externaltool@2.0", "prefix": "/fake/path"}]}
-        }
-        mutable_config.set("packages", packages_yaml)
-        # Concretize with gcc@9 to get a suboptimal spec, since we have gcc@10 available
-        external_spec = Spec("externaltool@2 %gcc@9").concretized()
-        assert external_spec.external
-
-        root_specs = [Spec("sombrero")]
-        with spack.config.override("concretizer:reuse", True):
-            solver = spack.solver.asp.Solver()
-            setup = spack.solver.asp.SpackSolverSetup()
-            result, _, _ = solver.driver.solve(setup, root_specs, reuse=[external_spec])
-
-        assert len(result.specs) == 1
-        sombrero = result.specs[0]
-        assert sombrero["externaltool"].dag_hash() == external_spec.dag_hash()
-
-    @pytest.mark.regression("45321")
-    @pytest.mark.parametrize(
-        "corrupted_str",
-        [
-            "cmake@3.4.3 foo=bar",  # cmake has no variant "foo"
-            "mvdefaults@1.0 foo=a,d",  # variant "foo" has no value "d"
-            "cmake %gcc",  # spec has no version
-        ],
-    )
-    def test_corrupted_external_does_not_halt_concretization(self, corrupted_str, mutable_config):
-        """Tests that having a wrong variant in an external spec doesn't stop concretization"""
-        corrupted_spec = Spec(corrupted_str)
-        packages_yaml = {
-            f"{corrupted_spec.name}": {
-                "externals": [{"spec": corrupted_str, "prefix": "/dev/null"}]
-            }
-        }
-        mutable_config.set("packages", packages_yaml)
-        # Assert we don't raise due to the corrupted external entry above
-        s = Spec("pkg-a").concretized()
-        assert s.concrete
-
-    @pytest.mark.regression("44828")
-    @pytest.mark.not_on_windows("Tests use linux paths")
-    def test_correct_external_is_selected_from_packages_yaml(self, mutable_config):
-        """Tests that when filtering external specs, the correct external is selected to
-        reconstruct the prefix, and other external attributes.
-        """
-        packages_yaml = {
-            "cmake": {
-                "externals": [
-                    {"spec": "cmake@3.23.1 %gcc", "prefix": "/tmp/prefix1"},
-                    {"spec": "cmake@3.23.1 %clang", "prefix": "/tmp/prefix2"},
-                ]
-            }
-        }
-        concretizer_yaml = {
-            "reuse": {"roots": True, "from": [{"type": "external", "exclude": ["%gcc"]}]}
-        }
-
-        mutable_config.set("packages", packages_yaml)
-        mutable_config.set("concretizer", concretizer_yaml)
-
-        s = Spec("cmake").concretized()
-        # Check that we got the properties from the right external
-        assert s.external
-        assert s.satisfies("%clang")
-        assert s.prefix == "/tmp/prefix2"
-
-
 @pytest.fixture()
 def duplicates_test_repository():
@@ -2824,9 +2747,7 @@ def test_drop_moving_targets(v_str, v_opts, checksummed):
 class TestConcreteSpecsByHash:
     """Tests the container of concrete specs"""

-    @pytest.mark.parametrize(
-        "input_specs", [["pkg-a"], ["pkg-a foobar=bar", "pkg-b"], ["pkg-a foobar=baz", "pkg-b"]]
-    )
+    @pytest.mark.parametrize("input_specs", [["a"], ["a foobar=bar", "b"], ["a foobar=baz", "b"]])
     def test_adding_specs(self, input_specs, default_mock_concretization):
         """Tests that concrete specs in the container are equivalent, but stored as different
         objects in memory.


@@ -9,7 +9,6 @@
 import archspec.cpu

-import spack.config
 import spack.paths
 import spack.repo
 import spack.solver.asp
@@ -48,8 +47,8 @@ def enable_runtimes():
 def test_correct_gcc_runtime_is_injected_as_dependency(runtime_repo):
-    s = spack.spec.Spec("pkg-a%gcc@10.2.1 ^pkg-b%gcc@9.4.0").concretized()
-    a, b = s["pkg-a"], s["pkg-b"]
+    s = spack.spec.Spec("a%gcc@10.2.1 ^b%gcc@9.4.0").concretized()
+    a, b = s["a"], s["b"]

     # Both a and b should depend on the same gcc-runtime directly
     assert a.dependencies("gcc-runtime") == b.dependencies("gcc-runtime")
@@ -62,16 +61,16 @@ def test_correct_gcc_runtime_is_injected_as_dependency(runtime_repo):
 def test_external_nodes_do_not_have_runtimes(runtime_repo, mutable_config, tmp_path):
     """Tests that external nodes don't have runtime dependencies."""
-    packages_yaml = {"pkg-b": {"externals": [{"spec": "pkg-b@1.0", "prefix": f"{str(tmp_path)}"}]}}
+    packages_yaml = {"b": {"externals": [{"spec": "b@1.0", "prefix": f"{str(tmp_path)}"}]}}
     spack.config.set("packages", packages_yaml)

-    s = spack.spec.Spec("pkg-a%gcc@10.2.1").concretized()
+    s = spack.spec.Spec("a%gcc@10.2.1").concretized()

-    a, b = s["pkg-a"], s["pkg-b"]
+    a, b = s["a"], s["b"]

     # Since b is an external, it doesn't depend on gcc-runtime
     assert a.dependencies("gcc-runtime")
-    assert a.dependencies("pkg-b")
+    assert a.dependencies("b")
     assert not b.dependencies("gcc-runtime")
@@ -79,36 +78,23 @@ def test_external_nodes_do_not_have_runtimes(runtime_repo, mutable_config, tmp_p
     "root_str,reused_str,expected,nruntime",
     [
         # The reused runtime is older than we need, thus we'll add a more recent one for a
-        (
-            "pkg-a%gcc@10.2.1",
-            "pkg-b%gcc@9.4.0",
-            {"pkg-a": "gcc-runtime@10.2.1", "pkg-b": "gcc-runtime@9.4.0"},
-            2,
-        ),
-        # The root is compiled with an older compiler, thus we'll NOT reuse the runtime from b
-        (
-            "pkg-a%gcc@9.4.0",
-            "pkg-b%gcc@10.2.1",
-            {"pkg-a": "gcc-runtime@9.4.0", "pkg-b": "gcc-runtime@9.4.0"},
-            1,
-        ),
+        ("a%gcc@10.2.1", "b%gcc@9.4.0", {"a": "gcc-runtime@10.2.1", "b": "gcc-runtime@9.4.0"}, 2),
+        # The root is compiled with an older compiler, thus we'll reuse the runtime from b
+        ("a%gcc@9.4.0", "b%gcc@10.2.1", {"a": "gcc-runtime@10.2.1", "b": "gcc-runtime@10.2.1"}, 1),
         # Same as before, but tests that we can reuse from a more generic target
         pytest.param(
-            "pkg-a%gcc@9.4.0",
-            "pkg-b%gcc@10.2.1 target=x86_64",
-            {"pkg-a": "gcc-runtime@9.4.0", "pkg-b": "gcc-runtime@9.4.0"},
+            "a%gcc@9.4.0",
+            "b%gcc@10.2.1 target=x86_64",
+            {"a": "gcc-runtime@10.2.1 target=x86_64", "b": "gcc-runtime@10.2.1 target=x86_64"},
             1,
             marks=pytest.mark.skipif(
                 str(archspec.cpu.host().family) != "x86_64", reason="test data is x86_64 specific"
             ),
         ),
         pytest.param(
-            "pkg-a%gcc@10.2.1",
-            "pkg-b%gcc@9.4.0 target=x86_64",
-            {
-                "pkg-a": "gcc-runtime@10.2.1 target=x86_64",
-                "pkg-b": "gcc-runtime@9.4.0 target=x86_64",
-            },
+            "a%gcc@10.2.1",
+            "b%gcc@9.4.0 target=x86_64",
+            {"a": "gcc-runtime@10.2.1 target=x86_64", "b": "gcc-runtime@9.4.0 target=x86_64"},
             2,
             marks=pytest.mark.skipif(
                 str(archspec.cpu.host().family) != "x86_64", reason="test data is x86_64 specific"
@@ -116,19 +102,17 @@ def test_external_nodes_do_not_have_runtimes(runtime_repo, mutable_config, tmp_p
             ),
         ),
     ],
 )
-@pytest.mark.regression("44444")
 def test_reusing_specs_with_gcc_runtime(root_str, reused_str, expected, nruntime, runtime_repo):
     """Tests that we can reuse specs with a "gcc-runtime" leaf node. In particular, checks
     that the semantic for gcc-runtimes versions accounts for reused packages too.
-
-    Reusable runtime versions should be lower, or equal, to that of parent nodes.
     """
     root, reused_spec = _concretize_with_reuse(root_str=root_str, reused_str=reused_str)
+    assert f"{expected['b']}" in reused_spec
     runtime_a = root.dependencies("gcc-runtime")[0]
-    assert runtime_a.satisfies(expected["pkg-a"])
-    runtime_b = root["pkg-b"].dependencies("gcc-runtime")[0]
-    assert runtime_b.satisfies(expected["pkg-b"])
+    assert runtime_a.satisfies(expected["a"])
+    runtime_b = root["b"].dependencies("gcc-runtime")[0]
+    assert runtime_b.satisfies(expected["b"])

     runtimes = [x for x in root.traverse() if x.name == "gcc-runtime"]
     assert len(runtimes) == nruntime
@@ -139,7 +123,8 @@ def test_reusing_specs_with_gcc_runtime(root_str, reused_str, expected, nruntime
     [
         # Ensure that, whether we have multiple runtimes in the DAG or not,
         # we always link only the latest version
-        ("pkg-a%gcc@10.2.1", "pkg-b%gcc@9.4.0", ["gcc-runtime@10.2.1"], ["gcc-runtime@9.4.0"])
+        ("a%gcc@10.2.1", "b%gcc@9.4.0", ["gcc-runtime@10.2.1"], ["gcc-runtime@9.4.0"]),
+        ("a%gcc@9.4.0", "b%gcc@10.2.1", ["gcc-runtime@10.2.1"], ["gcc-runtime@9.4.0"]),
     ],
 )
 def test_views_can_handle_duplicate_runtime_nodes(


@@ -512,5 +512,5 @@ def test_default_preference_variant_different_type_does_not_error(self):
         packages.yaml doesn't fail with an error.
         """
         with spack.config.override("packages:all", {"variants": "+foo"}):
-            s = Spec("pkg-a").concretized()
+            s = Spec("a").concretized()
         assert s.satisfies("foo=bar")


@@ -103,6 +103,23 @@ def test_repo(_create_test_repo, monkeypatch, mock_stage):
     yield mock_repo_path

+class MakeStage:
+    def __init__(self, stage):
+        self.stage = stage
+
+    def __call__(self, *args, **kwargs):
+        return self.stage
+
+
+@pytest.fixture
+def fake_installs(monkeypatch, tmpdir):
+    stage_path = str(tmpdir.ensure("fake-stage", dir=True))
+    universal_unused_stage = spack.stage.DIYStage(stage_path)
+    monkeypatch.setattr(
+        spack.build_systems.generic.Package, "_make_stage", MakeStage(universal_unused_stage)
+    )
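The fake_installs fixture introduced above relies on a common monkeypatch idiom: replace a method on a class with a callable object that always returns one canned instance, so every package under test shares the same (unused) stage. A minimal sketch of the same idiom with placeholder names (not Spack's API):

    import pytest

    class Widget:
        def make_resource(self):
            return object()  # normally expensive to build

    class CannedResource:
        # Callable stand-in that always hands back the same object.
        def __init__(self, resource):
            self.resource = resource

        def __call__(self, *args, **kwargs):
            return self.resource

    @pytest.fixture
    def fake_resource(monkeypatch):
        shared = object()
        monkeypatch.setattr(Widget, "make_resource", CannedResource(shared))
        return shared

    def test_widgets_share_resource(fake_resource):
        assert Widget().make_resource() is fake_resource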
 def test_one_package_multiple_reqs(concretize_scope, test_repo):
     conf_str = """\
 packages:
@@ -497,7 +514,7 @@ def test_oneof_ordering(concretize_scope, test_repo):
     assert s2.satisfies("@2.5")

-def test_reuse_oneof(concretize_scope, _create_test_repo, mutable_database, mock_fetch):
+def test_reuse_oneof(concretize_scope, _create_test_repo, mutable_database, fake_installs):
     conf_str = """\
 packages:
   y:
@@ -927,9 +944,9 @@ def test_default_requirements_semantic(packages_yaml, concretize_scope, mock_pac
         Spec("zlib ~shared").concretized()

     # A spec without the shared variant still concretizes
-    s = Spec("pkg-a").concretized()
-    assert not s.satisfies("pkg-a +shared")
-    assert not s.satisfies("pkg-a ~shared")
+    s = Spec("a").concretized()
+    assert not s.satisfies("a +shared")
+    assert not s.satisfies("a ~shared")

 @pytest.mark.parametrize(
@@ -1159,46 +1176,3 @@ def test_forward_multi_valued_variant_using_requires(
     for constraint in not_expected:
         assert not s.satisfies(constraint)

-def test_strong_preferences_higher_priority_than_reuse(concretize_scope, mock_packages):
-    """Tests that strong preferences have a higher priority than reusing specs."""
-    reused_spec = Spec("adios2~bzip2").concretized()
-    reuse_nodes = list(reused_spec.traverse())
-    root_specs = [Spec("ascent+adios2")]
-
-    # Check that without further configuration adios2 is reused
-    with spack.config.override("concretizer:reuse", True):
-        solver = spack.solver.asp.Solver()
-        setup = spack.solver.asp.SpackSolverSetup()
-        result, _, _ = solver.driver.solve(setup, root_specs, reuse=reuse_nodes)
-        ascent = result.specs[0]
-    assert ascent["adios2"].dag_hash() == reused_spec.dag_hash(), ascent
-
-    # If we stick a preference, adios2 is not reused
-    update_packages_config(
-        """
-    packages:
-      adios2:
-        prefer:
-        - "+bzip2"
-"""
-    )
-    with spack.config.override("concretizer:reuse", True):
-        solver = spack.solver.asp.Solver()
-        setup = spack.solver.asp.SpackSolverSetup()
-        result, _, _ = solver.driver.solve(setup, root_specs, reuse=reuse_nodes)
-        ascent = result.specs[0]
-    assert ascent["adios2"].dag_hash() != reused_spec.dag_hash()
-    assert ascent["adios2"].satisfies("+bzip2")
-
-    # A preference is still a preference, so we can override it from input
-    with spack.config.override("concretizer:reuse", True):
-        solver = spack.solver.asp.Solver()
-        setup = spack.solver.asp.SpackSolverSetup()
-        result, _, _ = solver.driver.solve(
-            setup, [Spec("ascent+adios2 ^adios2~bzip2")], reuse=reuse_nodes
-        )
-        ascent = result.specs[0]
-    assert ascent["adios2"].dag_hash() == reused_spec.dag_hash(), ascent
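Both versions of the removed test drive the solver under spack.config.override, which applies a configuration value only for the duration of the with block. A rough sketch of how a scoped override like this can be built in plain Python (illustrative only; Spack's real implementation manages layered configuration scopes):

    import contextlib

    CONFIG = {"concretizer:reuse": False}

    @contextlib.contextmanager
    def override(path, value):
        # Temporarily set CONFIG[path], restoring the previous value on exit.
        missing = object()
        previous = CONFIG.get(path, missing)
        CONFIG[path] = value
        try:
            yield
        finally:
            if previous is missing:
                del CONFIG[path]
            else:
                CONFIG[path] = previous

    with override("concretizer:reuse", True):
        assert CONFIG["concretizer:reuse"] is True
    assert CONFIG["concretizer:reuse"] is False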

View File

@@ -1211,13 +1211,13 @@ def test_license_dir_config(mutable_config, mock_packages):
     expected_dir = spack.paths.default_license_dir
     assert spack.config.get("config:license_dir") == expected_dir
     assert spack.package_base.PackageBase.global_license_dir == expected_dir
-    assert spack.repo.PATH.get_pkg_class("pkg-a").global_license_dir == expected_dir
+    assert spack.repo.PATH.get_pkg_class("a").global_license_dir == expected_dir

     rel_path = os.path.join(os.path.sep, "foo", "bar", "baz")
     spack.config.set("config:license_dir", rel_path)
     assert spack.config.get("config:license_dir") == rel_path
     assert spack.package_base.PackageBase.global_license_dir == rel_path
-    assert spack.repo.PATH.get_pkg_class("pkg-a").global_license_dir == rel_path
+    assert spack.repo.PATH.get_pkg_class("a").global_license_dir == rel_path

 @pytest.mark.regression("22547")

View File

@@ -595,7 +595,7 @@ def mutable_mock_repo(mock_repo_path, request):
 def mock_custom_repository(tmpdir, mutable_mock_repo):
     """Create a custom repository with a single package "c" and return its path."""
     builder = spack.repo.MockRepositoryBuilder(tmpdir.mkdir("myrepo"))
-    builder.add_package("pkg-c")
+    builder.add_package("c")
     return builder.root

@@ -2053,11 +2053,3 @@ def _true(x):
 @pytest.fixture()
 def do_not_check_runtimes_on_reuse(monkeypatch):
     monkeypatch.setattr(spack.solver.asp, "_has_runtime_dependencies", _true)

-@pytest.fixture(autouse=True, scope="session")
-def _c_compiler_always_exists():
-    fn = spack.solver.asp.c_compiler_runs
-    spack.solver.asp.c_compiler_runs = _true
-    yield
-    spack.solver.asp.c_compiler_runs = fn
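The removed _c_compiler_always_exists fixture is the classic save/patch/yield/restore pattern for a module attribute. It patches by hand rather than with monkeypatch because pytest's built-in monkeypatch fixture is function-scoped and cannot be used from a session-scoped fixture. A generic sketch of the pattern (module and attribute are placeholders):

    import math

    import pytest

    @pytest.fixture(autouse=True, scope="session")
    def _sqrt_always_four():
        original = math.sqrt       # save the real attribute
        math.sqrt = lambda x: 4.0  # patch it for the whole session
        yield
        math.sqrt = original       # restore on teardown

    def test_patched():
        assert math.sqrt(81) == 4.0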

View File

@@ -58,22 +58,16 @@ def upstream_and_downstream_db(tmpdir, gen_mock_layout):
 @pytest.mark.parametrize(
     "install_tree,result",
-    [
-        ("all", ["pkg-b", "pkg-c"]),
-        ("upstream", ["pkg-c"]),
-        ("local", ["pkg-b"]),
-        ("{u}", ["pkg-c"]),
-        ("{d}", ["pkg-b"]),
-    ],
+    [("all", ["b", "c"]), ("upstream", ["c"]), ("local", ["b"]), ("{u}", ["c"]), ("{d}", ["b"])],
 )
 def test_query_by_install_tree(
     install_tree, result, upstream_and_downstream_db, mock_packages, monkeypatch, config
 ):
     up_write_db, up_db, up_layout, down_db, down_layout = upstream_and_downstream_db

-    # Set the upstream DB to contain "pkg-c" and downstream to contain "pkg-b")
-    b = spack.spec.Spec("pkg-b").concretized()
-    c = spack.spec.Spec("pkg-c").concretized()
+    # Set the upstream DB to contain "c" and downstream to contain "b")
+    b = spack.spec.Spec("b").concretized()
+    c = spack.spec.Spec("c").concretized()
     up_write_db.add(c, up_layout)
     up_db._read()
     down_db.add(b, down_layout)

@@ -92,7 +86,7 @@ def test_spec_installed_upstream(
     # a known installed spec should say that it's installed
     with spack.repo.use_repositories(mock_custom_repository):
-        spec = spack.spec.Spec("pkg-c").concretized()
+        spec = spack.spec.Spec("c").concretized()
         assert not spec.installed
         assert not spec.installed_upstream

@@ -854,7 +848,7 @@ def test_query_virtual_spec(database):
 def test_failed_spec_path_error(database):
     """Ensure spec not concrete check is covered."""
-    s = spack.spec.Spec("pkg-a")
+    s = spack.spec.Spec("a")
     with pytest.raises(AssertionError, match="concrete spec required"):
         spack.store.STORE.failure_tracker.mark(s)

@@ -869,7 +863,7 @@ def _is(self, spec):
     # Pretend the spec has been failure locked
     monkeypatch.setattr(spack.database.FailureTracker, "lock_taken", _is)

-    s = spack.spec.Spec("pkg-a").concretized()
+    s = spack.spec.Spec("a").concretized()
     spack.store.STORE.failure_tracker.clear(s)
     out = capfd.readouterr()[0]
     assert "Retaining failure marking" in out

@@ -887,7 +881,7 @@ def _is(self, spec):
     # Ensure raise OSError when try to remove the non-existent marking
     monkeypatch.setattr(spack.database.FailureTracker, "persistent_mark", _is)

-    s = spack.spec.Spec("pkg-a").concretized()
+    s = default_mock_concretization("a")
     spack.store.STORE.failure_tracker.clear(s, force=True)
     out = capfd.readouterr()[1]
     assert "Removing failure marking despite lock" in out

@@ -901,16 +895,15 @@ def test_mark_failed(default_mock_concretization, mutable_database, monkeypatch,
     def _raise_exc(lock):
         raise lk.LockTimeoutError("write", "/mock-lock", 1.234, 10)

-    # Ensure attempt to acquire write lock on the mark raises the exception
-    monkeypatch.setattr(lk.Lock, "acquire_write", _raise_exc)
     with tmpdir.as_cwd():
-        s = spack.spec.Spec("pkg-a").concretized()
+        s = default_mock_concretization("a")
+        # Ensure attempt to acquire write lock on the mark raises the exception
+        monkeypatch.setattr(lk.Lock, "acquire_write", _raise_exc)
         spack.store.STORE.failure_tracker.mark(s)

         out = str(capsys.readouterr()[1])
-        assert "Unable to mark pkg-a as failed" in out
+        assert "Unable to mark a as failed" in out

     spack.store.STORE.failure_tracker.clear_all()

@@ -919,7 +912,7 @@ def _raise_exc(lock):
 def test_prefix_failed(default_mock_concretization, mutable_database, monkeypatch):
     """Add coverage to failed operation."""
-    s = spack.spec.Spec("pkg-a").concretized()
+    s = default_mock_concretization("a")

     # Confirm the spec is not already marked as failed
     assert not spack.store.STORE.failure_tracker.has_failed(s)

@@ -943,7 +936,7 @@ def test_prefix_write_lock_error(default_mock_concretization, mutable_database,
     def _raise(db, spec):
         raise lk.LockError("Mock lock error")

-    s = spack.spec.Spec("pkg-a").concretized()
+    s = default_mock_concretization("a")

     # Ensure subsequent lock operations fail
     monkeypatch.setattr(lk.Lock, "acquire_write", _raise)

@@ -1119,7 +1112,7 @@ def test_database_read_works_with_trailing_data(tmp_path, default_mock_concretiz
     # Populate a database
     root = str(tmp_path)
     db = spack.database.Database(root)
-    spec = default_mock_concretization("pkg-a")
+    spec = default_mock_concretization("a")
     db.add(spec, directory_layout=None)
     specs_in_db = db.query_local()
     assert spec in specs_in_db
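A recurring technique in the failure-tracker tests above: monkeypatch a lock primitive so it raises, then assert on the reported output rather than on an exception. A compact generic sketch of the idea (the Lock class here is a stand-in, not Spack's or llnl.util's):

    import pytest

    class Lock:
        def acquire_write(self):
            return True

    def mark_failed(lock, messages):
        try:
            lock.acquire_write()
        except TimeoutError:
            messages.append("Unable to mark as failed")

    def test_mark_failed_reports_timeout(monkeypatch):
        def _raise(self):
            raise TimeoutError("mock timeout")

        monkeypatch.setattr(Lock, "acquire_write", _raise)
        messages = []
        mark_failed(Lock(), messages)
        assert "Unable to mark as failed" in messages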

View File

@@ -31,7 +31,7 @@ def test_true_directives_exist(mock_packages):
     assert cls.dependencies
     assert "extendee" in cls.dependencies[spack.spec.Spec()]
-    assert "pkg-b" in cls.dependencies[spack.spec.Spec()]
+    assert "b" in cls.dependencies[spack.spec.Spec()]

     assert cls.resources
     assert spack.spec.Spec() in cls.resources

@@ -44,7 +44,7 @@ def test_constraints_from_context(mock_packages):
     pkg_cls = spack.repo.PATH.get_pkg_class("with-constraint-met")

     assert pkg_cls.dependencies
-    assert "pkg-b" in pkg_cls.dependencies[spack.spec.Spec("@1.0")]
+    assert "b" in pkg_cls.dependencies[spack.spec.Spec("@1.0")]

     assert pkg_cls.conflicts
     assert (spack.spec.Spec("%gcc"), None) in pkg_cls.conflicts[spack.spec.Spec("+foo@1.0")]

@@ -55,7 +55,7 @@ def test_constraints_from_context_are_merged(mock_packages):
     pkg_cls = spack.repo.PATH.get_pkg_class("with-constraint-met")

     assert pkg_cls.dependencies
-    assert "pkg-c" in pkg_cls.dependencies[spack.spec.Spec("@0.14:15 ^pkg-b@3.8:4.0")]
+    assert "c" in pkg_cls.dependencies[spack.spec.Spec("@0.14:15 ^b@3.8:4.0")]

 @pytest.mark.regression("27754")

@@ -69,7 +69,7 @@ def test_extends_spec(config, mock_packages):
 @pytest.mark.regression("34368")
 def test_error_on_anonymous_dependency(config, mock_packages):
-    pkg = spack.repo.PATH.get_pkg_class("pkg-a")
+    pkg = spack.repo.PATH.get_pkg_class("a")
     with pytest.raises(spack.directives.DependencyError):
         spack.directives._depends_on(pkg, "@4.5")

View File

@@ -383,10 +383,10 @@ def test_can_add_specs_to_environment_without_specs_attribute(tmp_path, mock_pac
     """
     )
     env = ev.Environment(tmp_path)
-    env.add("pkg-a")
+    env.add("a")

     assert len(env.user_specs) == 1
-    assert env.manifest.pristine_yaml_content["spack"]["specs"] == ["pkg-a"]
+    assert env.manifest.pristine_yaml_content["spack"]["specs"] == ["a"]

 @pytest.mark.parametrize(
@@ -584,7 +584,7 @@ def test_conflicts_with_packages_that_are_not_dependencies(
     spack:
       specs:
       - {spec_str}
-      - pkg-b
+      - b
       concretizer:
         unify: true
     """

@@ -712,7 +712,7 @@ def test_variant_propagation_with_unify_false(tmp_path, mock_packages, config):
     spack:
       specs:
       - parent-foo ++foo
-      - pkg-c
+      - c
       concretizer:
         unify: false
     """

@@ -797,10 +797,10 @@ def test_deconcretize_then_concretize_does_not_error(mutable_mock_env_path, mock
     """spack:
       specs:
       # These two specs concretize to the same hash
-      - pkg-c
-      - pkg-c@1.0
+      - c
+      - c@1.0
       # Spec used to trigger the bug
-      - pkg-a
+      - a
       concretizer:
         unify: true
     """

@@ -808,38 +808,8 @@ def test_deconcretize_then_concretize_does_not_error(mutable_mock_env_path, mock
     e = ev.Environment(mutable_mock_env_path)
     with e:
         e.concretize()
-        e.deconcretize(spack.spec.Spec("pkg-a"), concrete=False)
+        e.deconcretize(spack.spec.Spec("a"), concrete=False)
         e.concretize()
     assert len(e.concrete_roots()) == 3
-    all_root_hashes = {x.dag_hash() for x in e.concrete_roots()}
+    all_root_hashes = set(x.dag_hash() for x in e.concrete_roots())
     assert len(all_root_hashes) == 2

-@pytest.mark.regression("44216")
-@pytest.mark.only_clingo()
-def test_root_version_weights_for_old_versions(mutable_mock_env_path, mock_packages):
-    """Tests that, when we select two old versions of root specs that have the same version
-    optimization penalty, both are considered.
-    """
-    mutable_mock_env_path.mkdir()
-    spack_yaml = mutable_mock_env_path / ev.manifest_name
-    spack_yaml.write_text(
-        """spack:
-    specs:
-    # allow any version, but the most recent
-    - bowtie@:1.3
-    # allows only the third most recent, so penalty is 2
-    - gcc@1
-    concretizer:
-        unify: true
-"""
-    )
-    e = ev.Environment(mutable_mock_env_path)
-    with e:
-        e.concretize()
-    bowtie = [x for x in e.concrete_roots() if x.name == "bowtie"][0]
-    gcc = [x for x in e.concrete_roots() if x.name == "gcc"][0]
-    assert bowtie.satisfies("@=1.3.0")
-    assert gcc.satisfies("@=1.0")

View File

@@ -132,7 +132,7 @@ def test_hms(sec, result):
 def test_get_dependent_ids(install_mockery, mock_packages):
     # Concretize the parent package, which handles the dependency too
-    spec = spack.spec.Spec("pkg-a")
+    spec = spack.spec.Spec("a")
     spec.concretize()
     assert spec.concrete

@@ -223,11 +223,11 @@ def _spec(spec, unsigned=False, mirrors_for_spec=None):
     # Skip database updates
     monkeypatch.setattr(spack.database.Database, "add", _noop)

-    spec = spack.spec.Spec("pkg-a").concretized()
+    spec = spack.spec.Spec("a").concretized()
     assert inst._process_binary_cache_tarball(spec.package, explicit=False, unsigned=False)

     out = capfd.readouterr()[0]
-    assert "Extracting pkg-a" in out
+    assert "Extracting a" in out
     assert "from binary cache" in out

@@ -278,7 +278,7 @@ def test_installer_prune_built_build_deps(install_mockery, monkeypatch, tmpdir):
     @property
     def _mock_installed(self):
-        return self.name == "pkg-c"
+        return self.name in ["c"]

     # Mock the installed property to say that (b) is installed
     monkeypatch.setattr(spack.spec.Spec, "installed", _mock_installed)

@@ -286,25 +286,24 @@ def _mock_installed(self):
     # Create mock repository with packages (a), (b), (c), (d), and (e)
     builder = spack.repo.MockRepositoryBuilder(tmpdir.mkdir("mock-repo"))
-    builder.add_package("pkg-a", dependencies=[("pkg-b", "build", None), ("pkg-c", "build", None)])
-    builder.add_package("pkg-b", dependencies=[("pkg-d", "build", None)])
-    builder.add_package(
-        "pkg-c",
-        dependencies=[("pkg-d", "build", None), ("pkg-e", "all", None), ("pkg-f", "build", None)],
-    )
-    builder.add_package("pkg-d")
-    builder.add_package("pkg-e")
-    builder.add_package("pkg-f")
+    builder.add_package("a", dependencies=[("b", "build", None), ("c", "build", None)])
+    builder.add_package("b", dependencies=[("d", "build", None)])
+    builder.add_package(
+        "c", dependencies=[("d", "build", None), ("e", "all", None), ("f", "build", None)]
+    )
+    builder.add_package("d")
+    builder.add_package("e")
+    builder.add_package("f")

     with spack.repo.use_repositories(builder.root):
-        const_arg = installer_args(["pkg-a"], {})
+        const_arg = installer_args(["a"], {})
         installer = create_installer(const_arg)
         installer._init_queue()

         # Assert that (c) is not in the build_pq
-        result = {task.pkg_id[:5] for _, task in installer.build_pq}
-        expected = {"pkg-a", "pkg-b", "pkg-c", "pkg-d", "pkg-e"}
+        result = set([task.pkg_id[0] for _, task in installer.build_pq])
+        expected = set(["a", "b", "c", "d", "e"])
         assert result == expected
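One detail in the hunk above deserves a note: a task's pkg_id starts with the package name, so a single character (pkg_id[0]) is enough to identify single-letter mock packages, while the "pkg-" prefixed names need the first five characters (pkg_id[:5]). A hypothetical illustration, assuming an id of the form "<name>-<version>-<hash>":

    pkg_id = "pkg-a-1.0-abcdef"
    assert pkg_id[:5] == "pkg-a"       # five characters for "pkg-" names
    assert "a-1.0-abcdef"[0] == "a"    # one character sufficed before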
@@ -419,7 +418,8 @@ def test_ensure_locked_have(install_mockery, tmpdir, capsys):
 @pytest.mark.parametrize("lock_type,reads,writes", [("read", 1, 0), ("write", 0, 1)])
 def test_ensure_locked_new_lock(install_mockery, tmpdir, lock_type, reads, writes):
-    const_arg = installer_args(["pkg-a"], {})
+    pkg_id = "a"
+    const_arg = installer_args([pkg_id], {})
     installer = create_installer(const_arg)
     spec = installer.build_requests[0].pkg.spec
     with tmpdir.as_cwd():

@@ -438,7 +438,8 @@ def _pl(db, spec, timeout):
         lock.default_timeout = 1e-9 if timeout is None else None
         return lock

-    const_arg = installer_args(["pkg-a"], {})
+    pkg_id = "a"
+    const_arg = installer_args([pkg_id], {})
     installer = create_installer(const_arg)
     spec = installer.build_requests[0].pkg.spec

@@ -493,7 +494,7 @@ def test_packages_needed_to_bootstrap_compiler_packages(install_mockery, monkeyp
     spec.concretize()

     def _conc_spec(compiler):
-        return spack.spec.Spec("pkg-a").concretized()
+        return spack.spec.Spec("a").concretized()

     # Ensure we can get past functions that are precluding obtaining
     # packages.

@@ -601,7 +602,7 @@ def test_clear_failures_success(tmpdir):
     """Test the clear_failures happy path."""
     failures = spack.database.FailureTracker(str(tmpdir), default_timeout=0.1)

-    spec = spack.spec.Spec("pkg-a")
+    spec = spack.spec.Spec("a")
     spec._mark_concrete()

     # Set up a test prefix failure lock

@@ -627,7 +628,7 @@ def test_clear_failures_errs(tmpdir, capsys):
     """Test the clear_failures exception paths."""
     failures = spack.database.FailureTracker(str(tmpdir), default_timeout=0.1)
-    spec = spack.spec.Spec("pkg-a")
+    spec = spack.spec.Spec("a")
     spec._mark_concrete()
     failures.mark(spec)

@@ -689,11 +690,11 @@ def test_check_deps_status_install_failure(install_mockery):
     """Tests that checking the dependency status on a request to install
     'a' fails, if we mark the dependency as failed.
     """
-    s = spack.spec.Spec("pkg-a").concretized()
+    s = spack.spec.Spec("a").concretized()
     for dep in s.traverse(root=False):
         spack.store.STORE.failure_tracker.mark(dep)

-    const_arg = installer_args(["pkg-a"], {})
+    const_arg = installer_args(["a"], {})
     installer = create_installer(const_arg)
     request = installer.build_requests[0]

@@ -702,7 +703,7 @@ def test_check_deps_status_install_failure(install_mockery):
 def test_check_deps_status_write_locked(install_mockery, monkeypatch):
-    const_arg = installer_args(["pkg-a"], {})
+    const_arg = installer_args(["a"], {})
     installer = create_installer(const_arg)
     request = installer.build_requests[0]

@@ -714,7 +715,7 @@ def test_check_deps_status_write_locked(install_mockery, monkeypatch):
 def test_check_deps_status_external(install_mockery, monkeypatch):
-    const_arg = installer_args(["pkg-a"], {})
+    const_arg = installer_args(["a"], {})
     installer = create_installer(const_arg)
     request = installer.build_requests[0]

@@ -727,7 +728,7 @@ def test_check_deps_status_external(install_mockery, monkeypatch):
 def test_check_deps_status_upstream(install_mockery, monkeypatch):
-    const_arg = installer_args(["pkg-a"], {})
+    const_arg = installer_args(["a"], {})
     installer = create_installer(const_arg)
     request = installer.build_requests[0]

@@ -804,7 +805,7 @@ def test_install_task_add_compiler(install_mockery, monkeypatch, capfd):
     def _add(_compilers):
         tty.msg(config_msg)

-    const_arg = installer_args(["pkg-a"], {})
+    const_arg = installer_args(["a"], {})
     installer = create_installer(const_arg)
     task = create_build_task(installer.build_requests[0].pkg)
     task.compiler = True

@@ -842,7 +843,7 @@ def test_release_lock_write_n_exception(install_mockery, tmpdir, capsys):
 @pytest.mark.parametrize("installed", [True, False])
 def test_push_task_skip_processed(install_mockery, installed):
     """Test to ensure skip re-queueing a processed package."""
-    const_arg = installer_args(["pkg-a"], {})
+    const_arg = installer_args(["a"], {})
     installer = create_installer(const_arg)
     assert len(list(installer.build_tasks)) == 0

@@ -860,7 +861,7 @@ def test_push_task_skip_processed(install_mockery, installed):
 def test_requeue_task(install_mockery, capfd):
     """Test to ensure cover _requeue_task."""
-    const_arg = installer_args(["pkg-a"], {})
+    const_arg = installer_args(["a"], {})
     installer = create_installer(const_arg)
     task = create_build_task(installer.build_requests[0].pkg)

@@ -878,7 +879,7 @@ def test_requeue_task(install_mockery, capfd):
     assert qtask.attempts == task.attempts + 1

     out = capfd.readouterr()[1]
-    assert "Installing pkg-a" in out
+    assert "Installing a" in out
     assert " in progress by another process" in out

@@ -891,17 +892,17 @@ def _mktask(pkg):
     def _rmtask(installer, pkg_id):
         raise RuntimeError("Raise an exception to test except path")

-    const_arg = installer_args(["pkg-a"], {})
+    const_arg = installer_args(["a"], {})
     installer = create_installer(const_arg)
     spec = installer.build_requests[0].pkg.spec

     # Cover task removal happy path
-    installer.build_tasks["pkg-a"] = _mktask(spec.package)
+    installer.build_tasks["a"] = _mktask(spec.package)
     installer._cleanup_all_tasks()
     assert len(installer.build_tasks) == 0

     # Cover task removal exception path
-    installer.build_tasks["pkg-a"] = _mktask(spec.package)
+    installer.build_tasks["a"] = _mktask(spec.package)
     monkeypatch.setattr(inst.PackageInstaller, "_remove_task", _rmtask)
     installer._cleanup_all_tasks()
     assert len(installer.build_tasks) == 1

@@ -995,7 +996,7 @@ def test_install_uninstalled_deps(install_mockery, monkeypatch, capsys):
 def test_install_failed(install_mockery, monkeypatch, capsys):
     """Test install with failed install."""
-    const_arg = installer_args(["pkg-b"], {})
+    const_arg = installer_args(["b"], {})
     installer = create_installer(const_arg)

     # Make sure the package is identified as failed

@@ -1011,7 +1012,7 @@ def test_install_failed(install_mockery, monkeypatch, capsys):
 def test_install_failed_not_fast(install_mockery, monkeypatch, capsys):
     """Test install with failed install."""
-    const_arg = installer_args(["pkg-a"], {"fail_fast": False})
+    const_arg = installer_args(["a"], {"fail_fast": False})
     installer = create_installer(const_arg)

     # Make sure the package is identified as failed

@@ -1022,12 +1023,12 @@ def test_install_failed_not_fast(install_mockery, monkeypatch, capsys):
     out = str(capsys.readouterr())
     assert "failed to install" in out
-    assert "Skipping build of pkg-a" in out
+    assert "Skipping build of a" in out

 def test_install_fail_on_interrupt(install_mockery, monkeypatch):
     """Test ctrl-c interrupted install."""
-    spec_name = "pkg-a"
+    spec_name = "a"
     err_msg = "mock keyboard interrupt for {0}".format(spec_name)

     def _interrupt(installer, task, install_status, **kwargs):

@@ -1045,13 +1046,13 @@ def _interrupt(installer, task, install_status, **kwargs):
     with pytest.raises(KeyboardInterrupt, match=err_msg):
         installer.install()

-    assert "pkg-b" in installer.installed  # ensure dependency of pkg-a is 'installed'
+    assert "b" in installer.installed  # ensure dependency of a is 'installed'
     assert spec_name not in installer.installed

 def test_install_fail_single(install_mockery, monkeypatch):
     """Test expected results for failure of single package."""
-    spec_name = "pkg-a"
+    spec_name = "a"
     err_msg = "mock internal package build error for {0}".format(spec_name)

     class MyBuildException(Exception):

@@ -1072,13 +1073,13 @@ def _install(installer, task, install_status, **kwargs):
     with pytest.raises(MyBuildException, match=err_msg):
         installer.install()

-    assert "pkg-b" in installer.installed  # ensure dependency of a is 'installed'
+    assert "b" in installer.installed  # ensure dependency of a is 'installed'
     assert spec_name not in installer.installed

 def test_install_fail_multi(install_mockery, monkeypatch):
     """Test expected results for failure of multiple packages."""
-    spec_name = "pkg-c"
+    spec_name = "c"
     err_msg = "mock internal package build error"

     class MyBuildException(Exception):

@@ -1090,7 +1091,7 @@ def _install(installer, task, install_status, **kwargs):
     else:
         installer.installed.add(task.pkg.name)

-    const_arg = installer_args([spec_name, "pkg-a"], {})
+    const_arg = installer_args([spec_name, "a"], {})
     installer = create_installer(const_arg)

     # Raise a KeyboardInterrupt error to trigger early termination

@@ -1099,14 +1100,14 @@ def _install(installer, task, install_status, **kwargs):
     with pytest.raises(inst.InstallError, match="Installation request failed"):
         installer.install()

-    assert "pkg-a" in installer.installed  # ensure the second spec installed
+    assert "a" in installer.installed  # ensure the second spec installed
     assert spec_name not in installer.installed

 def test_install_fail_fast_on_detect(install_mockery, monkeypatch, capsys):
     """Test fail_fast install when an install failure is detected."""
-    const_arg = installer_args(["pkg-b"], {"fail_fast": False})
-    const_arg.extend(installer_args(["pkg-c"], {"fail_fast": True}))
+    const_arg = installer_args(["b"], {"fail_fast": False})
+    const_arg.extend(installer_args(["c"], {"fail_fast": True}))
     installer = create_installer(const_arg)
     pkg_ids = [inst.package_id(spec) for spec, _ in const_arg]

@@ -1136,7 +1137,7 @@ def _test_install_fail_fast_on_except_patch(installer, **kwargs):
 @pytest.mark.disable_clean_stage_check
 def test_install_fail_fast_on_except(install_mockery, monkeypatch, capsys):
     """Test fail_fast install when an install failure results from an error."""
-    const_arg = installer_args(["pkg-a"], {"fail_fast": True})
+    const_arg = installer_args(["a"], {"fail_fast": True})
     installer = create_installer(const_arg)

     # Raise a non-KeyboardInterrupt exception to trigger fast failure.

@@ -1151,7 +1152,7 @@ def test_install_fail_fast_on_except(install_mockery, monkeypatch, capsys):
         installer.install()

     out = str(capsys.readouterr())
-    assert "Skipping build of pkg-a" in out
+    assert "Skipping build of a" in out

 def test_install_lock_failures(install_mockery, monkeypatch, capfd):

@@ -1160,7 +1161,7 @@ def test_install_lock_failures(install_mockery, monkeypatch, capfd):
     def _requeued(installer, task, install_status):
         tty.msg("requeued {0}".format(task.pkg.spec.name))

-    const_arg = installer_args(["pkg-b"], {})
+    const_arg = installer_args(["b"], {})
     installer = create_installer(const_arg)

     # Ensure we never acquire a lock

@@ -1180,7 +1181,7 @@ def _requeued(installer, task, install_status):
 def test_install_lock_installed_requeue(install_mockery, monkeypatch, capfd):
     """Cover basic install handling for installed package."""
-    const_arg = installer_args(["pkg-b"], {})
+    const_arg = installer_args(["b"], {})
     b, _ = const_arg[0]
     installer = create_installer(const_arg)
     b_pkg_id = inst.package_id(b)

@@ -1236,7 +1237,7 @@ def _requeued(installer, task, install_status):
     # Ensure we don't continually requeue the task
     monkeypatch.setattr(inst.PackageInstaller, "_requeue_task", _requeued)

-    const_arg = installer_args(["pkg-b"], {})
+    const_arg = installer_args(["b"], {})
     installer = create_installer(const_arg)

     with pytest.raises(inst.InstallError, match="request failed"):

@@ -1252,7 +1253,7 @@ def _requeued(installer, task, install_status):
 def test_install_skip_patch(install_mockery, mock_fetch):
     """Test the skip_patch install path."""
-    spec_name = "pkg-b"
+    spec_name = "b"
     const_arg = installer_args([spec_name], {"fake": False, "skip_patch": True})
     installer = create_installer(const_arg)

@@ -1279,7 +1280,7 @@ def test_overwrite_install_backup_success(temporary_store, config, mock_packages
     of the original prefix, and leave the original spec marked installed.
     """
     # Get a build task. TODO: refactor this to avoid calling internal methods
-    const_arg = installer_args(["pkg-b"], {})
+    const_arg = installer_args(["b"])
     installer = create_installer(const_arg)
     installer._init_queue()
     task = installer._pop_task()

@@ -1340,7 +1341,7 @@ def remove(self, spec):
         self.called = True

     # Get a build task. TODO: refactor this to avoid calling internal methods
-    const_arg = installer_args(["pkg-b"], {})
+    const_arg = installer_args(["b"])
     installer = create_installer(const_arg)
     installer._init_queue()
     task = installer._pop_task()

@@ -1369,8 +1370,8 @@ def test_term_status_line():
     # accept that. `with log_output(buf)` doesn't really work because it trims output
     # and we actually want to test for escape sequences etc.
     x = inst.TermStatusLine(enabled=True)
-    x.add("pkg-a")
-    x.add("pkg-b")
+    x.add("a")
+    x.add("b")
     x.clear()

View File

@@ -14,7 +14,7 @@
 import pytest

 import llnl.util.filesystem as fs
-from llnl.util.symlink import islink, readlink, symlink
+from llnl.util.symlink import islink, symlink

 import spack.paths

@@ -181,7 +181,7 @@ def test_symlinks_true(self, stage):
         assert os.path.exists("dest/a/b2")
         with fs.working_dir("dest/a"):
-            assert os.path.exists(readlink("b2"))
+            assert os.path.exists(os.readlink("b2"))

         assert os.path.realpath("dest/f/2") == os.path.abspath("dest/a/b/2")
         assert os.path.realpath("dest/2") == os.path.abspath("dest/1")

@@ -281,7 +281,7 @@ def test_allow_broken_symlinks(self, stage):
         symlink("nonexistant.txt", "source/broken", allow_broken_symlinks=True)
         fs.install_tree("source", "dest", symlinks=True, allow_broken_symlinks=True)
         assert os.path.islink("dest/broken")
-        assert not os.path.exists(readlink("dest/broken"))
+        assert not os.path.exists(os.readlink("dest/broken"))

     def test_glob_src(self, stage):
         """Test using a glob as the source."""

View File

@@ -289,8 +289,8 @@ def test_mirror_cache_symlinks(tmpdir):
 @pytest.mark.parametrize(
     "specs,expected_specs",
     [
-        (["pkg-a"], ["pkg-a@=1.0", "pkg-a@=2.0"]),
-        (["pkg-a", "brillig"], ["pkg-a@=1.0", "pkg-a@=2.0", "brillig@=1.0.0", "brillig@=2.0.0"]),
+        (["a"], ["a@=1.0", "a@=2.0"]),
+        (["a", "brillig"], ["a@=1.0", "a@=2.0", "brillig@=1.0.0", "brillig@=2.0.0"]),
     ],
 )
 def test_get_all_versions(specs, expected_specs):

View File

@@ -7,8 +7,6 @@
 import pytest

-from llnl.util.symlink import readlink
-
 import spack.cmd.modules
 import spack.config
 import spack.error

@@ -80,7 +78,7 @@ def test_modules_default_symlink(
     link_path = os.path.join(os.path.dirname(mock_module_filename), "default")
     assert os.path.islink(link_path)
-    assert readlink(link_path) == mock_module_filename
+    assert os.readlink(link_path) == mock_module_filename

     generator.remove()
     assert not os.path.lexists(link_path)

View File

@@ -151,9 +151,7 @@ class InMemoryOCIRegistry(DummyServer):
     A third option is to use the chunked upload, but this is not implemented here, because
     it's typically a major performance hit in upload speed, so we're not using it in Spack."""

-    def __init__(
-        self, domain: str, allow_single_post: bool = True, tags_per_page: int = 100
-    ) -> None:
+    def __init__(self, domain: str, allow_single_post: bool = True) -> None:
         super().__init__(domain)
         self.router.register("GET", r"/v2/", self.index)
         self.router.register("HEAD", r"/v2/(?P<name>.+)/blobs/(?P<digest>.+)", self.head_blob)

@@ -167,9 +165,6 @@ def __init__(
         # If True, allow single POST upload, not all registries support this
         self.allow_single_post = allow_single_post

-        # How many tags are returned in a single request
-        self.tags_per_page = tags_per_page
-
         # Used for POST + PUT upload. This is a map from session ID to image name
         self.sessions: Dict[str, str] = {}

@@ -285,34 +280,10 @@ def handle_upload(self, req: Request, name: str, digest: Digest):
         return MockHTTPResponse(201, "Created", headers={"Location": f"/v2/{name}/blobs/{digest}"})

     def list_tags(self, req: Request, name: str):
-        # Paginate using Link headers, this was added to the spec in the following commit:
-        # https://github.com/opencontainers/distribution-spec/commit/2ed79d930ecec11dd755dc8190409a3b10f01ca9
         # List all tags, exclude digests.
-        all_tags = sorted(
-            _tag for _name, _tag in self.manifests.keys() if _name == name and ":" not in _tag
-        )
-        query = urllib.parse.parse_qs(urllib.parse.urlparse(req.full_url).query)
-        n = int(query["n"][0]) if "n" in query else self.tags_per_page
-        if "last" in query:
-            try:
-                offset = all_tags.index(query["last"][0]) + 1
-            except ValueError:
-                return MockHTTPResponse(404, "Not found")
-        else:
-            offset = 0
-        tags = all_tags[offset : offset + n]
-        if offset + n < len(all_tags):
-            headers = {"Link": f'</v2/{name}/tags/list?last={tags[-1]}&n={n}>; rel="next"'}
-        else:
-            headers = None
-        return MockHTTPResponse.with_json(200, "OK", headers=headers, body={"tags": tags})
+        tags = [_tag for _name, _tag in self.manifests.keys() if _name == name and ":" not in _tag]
+        tags.sort()
+        return MockHTTPResponse.with_json(200, "OK", body={"tags": tags})

 class DummyServerUrllibHandler(urllib.request.BaseHandler):
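The paginated list_tags removed above follows the OCI distribution spec's Link-header scheme: each response returns at most n tags, and a Link: </v2/<name>/tags/list?last=...&n=...>; rel="next" header points at the next page, keyed by the last tag already returned. A client gathers the full list by following those headers until none is present; a minimal hedged sketch with urllib (endpoint shape assumed from the server code above):

    import json
    import re
    import urllib.request

    def list_all_tags(base_url, name, urlopen=urllib.request.urlopen):
        # Follow rel="next" Link headers until every page has been consumed.
        url = f"{base_url}/v2/{name}/tags/list"
        tags = []
        while url:
            response = urlopen(url)
            tags.extend(json.loads(response.read())["tags"])
            link = response.headers.get("Link")
            match = re.match(r'<(.+)>;\s*rel="next"', link) if link else None
            url = f"{base_url}{match.group(1)}" if match else None
        return tags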

View File

@@ -6,7 +6,6 @@
 import hashlib
 import json
-import random
 import urllib.error
 import urllib.parse
 import urllib.request

@@ -20,7 +19,6 @@
     copy_missing_layers,
     get_manifest_and_config,
     image_from_mirror,
-    list_tags,
     upload_blob,
     upload_manifest,
 )

@@ -672,31 +670,3 @@ def test_retry(url, max_retries, expect_failure, expect_requests):
     assert len(server.requests) == expect_requests
     assert sleep_time == [2**i for i in range(expect_requests - 1)]

-def test_list_tags():
-    # Follows a relatively new rewording of the OCI distribution spec, which is not yet tagged.
-    # https://github.com/opencontainers/distribution-spec/commit/2ed79d930ecec11dd755dc8190409a3b10f01ca9
-    N = 20
-    urlopen = create_opener(InMemoryOCIRegistry("example.com", tags_per_page=5)).open
-    image = ImageReference.from_string("example.com/image")
-    to_tag = lambda i: f"tag-{i:02}"
-
-    # Create N tags in arbitrary order
-    _tags_to_create = [to_tag(i) for i in range(N)]
-    random.shuffle(_tags_to_create)
-    for tag in _tags_to_create:
-        upload_manifest(image.with_tag(tag), default_manifest(), tag=True, _urlopen=urlopen)
-
-    # list_tags should return all tags from all pages in order
-    tags = list_tags(image, urlopen)
-    assert len(tags) == N
-    assert [to_tag(i) for i in range(N)] == tags
-
-    # Test a single request, which should give the first 5 tags
-    assert json.loads(urlopen(image.tags_url()).read())["tags"] == [to_tag(i) for i in range(5)]
-
-    # Test response at an offset, which should exclude the `last` tag.
-    assert json.loads(urlopen(image.tags_url() + f"?last={to_tag(N - 3)}").read())["tags"] == [
-        to_tag(i) for i in range(N - 2, N)
-    ]

View File

@@ -13,44 +13,28 @@
         # Normalize simple conditionals
         ("optional-dep-test", {"optional-dep-test": None}),
         ("optional-dep-test~a", {"optional-dep-test~a": None}),
-        ("optional-dep-test+a", {"optional-dep-test+a": {"pkg-a": None}}),
-        ("optional-dep-test a=true", {"optional-dep-test a=true": {"pkg-a": None}}),
-        ("optional-dep-test a=true", {"optional-dep-test+a": {"pkg-a": None}}),
-        ("optional-dep-test@1.1", {"optional-dep-test@1.1": {"pkg-b": None}}),
-        ("optional-dep-test%intel", {"optional-dep-test%intel": {"pkg-c": None}}),
-        (
-            "optional-dep-test%intel@64.1",
-            {"optional-dep-test%intel@64.1": {"pkg-c": None, "pkg-d": None}},
-        ),
+        ("optional-dep-test+a", {"optional-dep-test+a": {"a": None}}),
+        ("optional-dep-test a=true", {"optional-dep-test a=true": {"a": None}}),
+        ("optional-dep-test a=true", {"optional-dep-test+a": {"a": None}}),
+        ("optional-dep-test@1.1", {"optional-dep-test@1.1": {"b": None}}),
+        ("optional-dep-test%intel", {"optional-dep-test%intel": {"c": None}}),
+        ("optional-dep-test%intel@64.1", {"optional-dep-test%intel@64.1": {"c": None, "d": None}}),
         (
             "optional-dep-test%intel@64.1.2",
-            {"optional-dep-test%intel@64.1.2": {"pkg-c": None, "pkg-d": None}},
+            {"optional-dep-test%intel@64.1.2": {"c": None, "d": None}},
         ),
-        ("optional-dep-test%clang@35", {"optional-dep-test%clang@35": {"pkg-e": None}}),
+        ("optional-dep-test%clang@35", {"optional-dep-test%clang@35": {"e": None}}),
         # Normalize multiple conditionals
-        ("optional-dep-test+a@1.1", {"optional-dep-test+a@1.1": {"pkg-a": None, "pkg-b": None}}),
-        (
-            "optional-dep-test+a%intel",
-            {"optional-dep-test+a%intel": {"pkg-a": None, "pkg-c": None}},
-        ),
-        (
-            "optional-dep-test@1.1%intel",
-            {"optional-dep-test@1.1%intel": {"pkg-b": None, "pkg-c": None}},
-        ),
+        ("optional-dep-test+a@1.1", {"optional-dep-test+a@1.1": {"a": None, "b": None}}),
+        ("optional-dep-test+a%intel", {"optional-dep-test+a%intel": {"a": None, "c": None}}),
+        ("optional-dep-test@1.1%intel", {"optional-dep-test@1.1%intel": {"b": None, "c": None}}),
         (
             "optional-dep-test@1.1%intel@64.1.2+a",
-            {
-                "optional-dep-test@1.1%intel@64.1.2+a": {
-                    "pkg-a": None,
-                    "pkg-b": None,
-                    "pkg-c": None,
-                    "pkg-d": None,
-                }
-            },
+            {"optional-dep-test@1.1%intel@64.1.2+a": {"a": None, "b": None, "c": None, "d": None}},
         ),
         (
             "optional-dep-test@1.1%clang@36.5+a",
-            {"optional-dep-test@1.1%clang@36.5+a": {"pkg-b": None, "pkg-a": None, "pkg-e": None}},
+            {"optional-dep-test@1.1%clang@36.5+a": {"b": None, "a": None, "e": None}},
         ),
         # Chained MPI
         (
@@ -60,10 +44,7 @@
         # Each of these dependencies comes from a conditional
         # dependency on another. This requires iterating to evaluate
         # the whole chain.
-        (
-            "optional-dep-test+f",
-            {"optional-dep-test+f": {"pkg-f": None, "pkg-g": None, "mpi": None}},
-        ),
+        ("optional-dep-test+f", {"optional-dep-test+f": {"f": None, "g": None, "mpi": None}}),
     ]
 )
 def spec_and_expected(request):

@@ -82,12 +63,12 @@ def test_normalize(spec_and_expected, config, mock_packages):
 def test_default_variant(config, mock_packages):
     spec = Spec("optional-dep-test-3")
     spec.concretize()
-    assert "pkg-a" in spec
+    assert "a" in spec

     spec = Spec("optional-dep-test-3~var")
     spec.concretize()
-    assert "pkg-a" in spec
+    assert "a" in spec

     spec = Spec("optional-dep-test-3+var")
     spec.concretize()
-    assert "pkg-b" in spec
+    assert "b" in spec

View File

@@ -21,7 +21,6 @@
 import spack.install_test
 import spack.package_base
 import spack.repo
-import spack.spec

 from spack.build_systems.generic import Package
 from spack.installer import InstallError

@@ -143,19 +142,19 @@ def setup_install_test(source_paths, test_root):
     "spec,sources,extras,expect",
     [
         (
-            "pkg-a",
+            "a",
             ["example/a.c"],  # Source(s)
             ["example/a.c"],  # Extra test source
             ["example/a.c"],
         ),  # Test install dir source(s)
         (
-            "pkg-b",
+            "b",
             ["test/b.cpp", "test/b.hpp", "example/b.txt"],  # Source(s)
             ["test"],  # Extra test source
             ["test/b.cpp", "test/b.hpp"],
         ),  # Test install dir source
         (
-            "pkg-c",
+            "c",
             ["examples/a.py", "examples/b.py", "examples/c.py", "tests/d.py"],
             ["examples/b.py", "tests"],
             ["examples/b.py", "tests/d.py"],

@@ -203,7 +202,7 @@ def test_cache_extra_sources(install_mockery, spec, sources, extras, expect):
 def test_cache_extra_sources_fails(install_mockery):
-    s = spack.spec.Spec("pkg-a").concretized()
+    s = spack.spec.Spec("a").concretized()
     s.package.spec.concretize()

     with pytest.raises(InstallError) as exc_info:

@@ -227,7 +226,7 @@ class URLsPackage(spack.package.Package):
     url = "https://www.example.com/url-package-1.0.tgz"
     urls = ["https://www.example.com/archive"]

-    s = spack.spec.Spec("pkg-a")
+    s = spack.spec.Spec("a")
     with pytest.raises(ValueError, match="defines both"):
         URLsPackage(s)

@@ -237,7 +236,7 @@ class LicensedPackage(spack.package.Package):
     extendees = None  # currently a required attribute for is_extension()
     license_files = None

-    s = spack.spec.Spec("pkg-a")
+    s = spack.spec.Spec("a")
     pkg = LicensedPackage(s)
     assert pkg.global_license_file is None

@@ -250,21 +249,21 @@ class BaseTestPackage(Package):
 def test_package_version_fails():
-    s = spack.spec.Spec("pkg-a")
+    s = spack.spec.Spec("a")
     pkg = BaseTestPackage(s)
     with pytest.raises(ValueError, match="does not have a concrete version"):
         pkg.version()

 def test_package_tester_fails():
-    s = spack.spec.Spec("pkg-a")
+    s = spack.spec.Spec("a")
     pkg = BaseTestPackage(s)
     with pytest.raises(ValueError, match="without concrete version"):
         pkg.tester()

 def test_package_fetcher_fails():
-    s = spack.spec.Spec("pkg-a")
+    s = spack.spec.Spec("a")
     pkg = BaseTestPackage(s)
     with pytest.raises(ValueError, match="without concrete version"):
         pkg.fetcher

@@ -276,7 +275,7 @@ def compilers(compiler, arch_spec):
     monkeypatch.setattr(spack.compilers, "compilers_for_spec", compilers)

-    s = spack.spec.Spec("pkg-a")
+    s = spack.spec.Spec("a")
     pkg = BaseTestPackage(s)
     pkg.test_requires_compiler = True
     pkg.do_test()
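All three *_fails tests above rest on the same invariant: a Spec parsed from a string stays abstract until it is concretized, and version-dependent properties refuse to operate on abstract specs. A tiny hedged illustration using the same API (this assumes a repo providing package "a", as in these tests):

    import spack.spec

    s = spack.spec.Spec("a")  # abstract: no version pinned yet
    assert not s.concrete

    s = s.concretized()       # the concretizer pins version, compiler, arch, ...
    assert s.concrete
    s.version                 # safe only on a concrete spec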

View File

@@ -16,7 +16,7 @@
 import pytest

 from llnl.util import filesystem as fs
-from llnl.util.symlink import readlink, symlink
+from llnl.util.symlink import symlink

 import spack.binary_distribution as bindist
 import spack.cmd.buildcache as buildcache
@@ -181,12 +181,12 @@ def test_relocate_links(tmpdir):
     relocate_links(["to_self", "to_dependency", "to_system"], prefix_to_prefix)

     # These two are relocated
-    assert readlink("to_self") == str(tmpdir.join("new_prefix_a", "file"))
-    assert readlink("to_dependency") == str(tmpdir.join("new_prefix_b", "file"))
+    assert os.readlink("to_self") == str(tmpdir.join("new_prefix_a", "file"))
+    assert os.readlink("to_dependency") == str(tmpdir.join("new_prefix_b", "file"))

     # These two are not.
-    assert readlink("to_system") == system_path
-    assert readlink("to_self_but_relative") == "relative"
+    assert os.readlink("to_system") == system_path
+    assert os.readlink("to_self_but_relative") == "relative"


 def test_needs_relocation():
@@ -517,7 +517,7 @@ def test_manual_download(
     def _instr(pkg):
         return f"Download instructions for {pkg.spec.name}"

-    spec = default_mock_concretization("pkg-a")
+    spec = default_mock_concretization("a")
     spec.package.manual_download = manual
     if instr:
         monkeypatch.setattr(spack.package_base.PackageBase, "download_instr", _instr)
@@ -543,7 +543,7 @@ def test_fetch_without_code_is_noop(
     default_mock_concretization, install_mockery, fetching_not_allowed
 ):
     """do_fetch for packages without code should be a no-op"""
-    pkg = default_mock_concretization("pkg-a").package
+    pkg = default_mock_concretization("a").package
     pkg.has_code = False
     pkg.do_fetch()
@@ -552,7 +552,7 @@ def test_fetch_external_package_is_noop(
     default_mock_concretization, install_mockery, fetching_not_allowed
 ):
     """do_fetch for packages without code should be a no-op"""
-    spec = default_mock_concretization("pkg-a")
+    spec = default_mock_concretization("a")
     spec.external_path = "/some/where"
     assert spec.external
     spec.package.do_fetch()
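A minimal sketch of the relocation behavior the first hunk above exercises, assuming prefix_to_prefix is a plain mapping from old install prefixes to new ones (the fixture that builds it lies outside this hunk) and a POSIX filesystem:

    import os
    from spack.relocate import relocate_links

    # A link that points into a known old prefix is rewritten...
    os.symlink("/old_prefix_a/file", "to_self")
    # ...while a link to a system path outside any old prefix is left alone.
    os.symlink("/usr/lib/libfoo.so", "to_system")

    relocate_links(["to_self", "to_system"], {"/old_prefix_a": "/new_prefix_a"})

    assert os.readlink("to_self") == "/new_prefix_a/file"
    assert os.readlink("to_system") == "/usr/lib/libfoo.so"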


@@ -270,9 +270,12 @@ def trigger_bad_patch(pkg):
 def test_patch_failure_develop_spec_exits_gracefully(
     mock_packages, config, install_mockery, mock_fetch, tmpdir, mock_stage
 ):
-    """ensure that a failing patch does not trigger exceptions for develop specs"""
+    """
+    ensure that a failing patch does not trigger exceptions
+    for develop specs
+    """

-    spec = Spec(f"patch-a-dependency ^libelf dev_path={tmpdir}")
+    spec = Spec("patch-a-dependency " "^libelf dev_path=%s" % str(tmpdir))
     spec.concretize()
     libelf = spec["libelf"]
     assert "patches" in list(libelf.variants.keys())


@@ -9,8 +9,6 @@
 import spack.package_base
 import spack.paths
 import spack.repo
-import spack.spec
-import spack.util.file_cache


 @pytest.fixture(params=["packages", "", "foo"])
@@ -32,25 +30,25 @@ def extra_repo(tmpdir_factory, request):
 def test_repo_getpkg(mutable_mock_repo):
-    mutable_mock_repo.get_pkg_class("pkg-a")
-    mutable_mock_repo.get_pkg_class("builtin.mock.pkg-a")
+    mutable_mock_repo.get_pkg_class("a")
+    mutable_mock_repo.get_pkg_class("builtin.mock.a")


 def test_repo_multi_getpkg(mutable_mock_repo, extra_repo):
     mutable_mock_repo.put_first(extra_repo[0])
-    mutable_mock_repo.get_pkg_class("pkg-a")
-    mutable_mock_repo.get_pkg_class("builtin.mock.pkg-a")
+    mutable_mock_repo.get_pkg_class("a")
+    mutable_mock_repo.get_pkg_class("builtin.mock.a")


 def test_repo_multi_getpkgclass(mutable_mock_repo, extra_repo):
     mutable_mock_repo.put_first(extra_repo[0])
-    mutable_mock_repo.get_pkg_class("pkg-a")
-    mutable_mock_repo.get_pkg_class("builtin.mock.pkg-a")
+    mutable_mock_repo.get_pkg_class("a")
+    mutable_mock_repo.get_pkg_class("builtin.mock.a")


 def test_repo_pkg_with_unknown_namespace(mutable_mock_repo):
     with pytest.raises(spack.repo.UnknownNamespaceError):
-        mutable_mock_repo.get_pkg_class("unknown.pkg-a")
+        mutable_mock_repo.get_pkg_class("unknown.a")


 def test_repo_unknown_pkg(mutable_mock_repo):
@@ -144,14 +142,14 @@ def test_get_all_mock_packages(mock_packages):
 def test_repo_path_handles_package_removal(tmpdir, mock_packages):
     builder = spack.repo.MockRepositoryBuilder(tmpdir, namespace="removal")
-    builder.add_package("pkg-c")
+    builder.add_package("c")
     with spack.repo.use_repositories(builder.root, override=False) as repos:
-        r = repos.repo_for_pkg("pkg-c")
+        r = repos.repo_for_pkg("c")
         assert r.namespace == "removal"

-    builder.remove("pkg-c")
+    builder.remove("c")
     with spack.repo.use_repositories(builder.root, override=False) as repos:
-        r = repos.repo_for_pkg("pkg-c")
+        r = repos.repo_for_pkg("c")
         assert r.namespace == "builtin.mock"


@@ -138,19 +138,19 @@ def test_specify_preinstalled_dep(tmpdir, monkeypatch):
     transitive dependency that is only supplied by the preinstalled package.
     """
     builder = spack.repo.MockRepositoryBuilder(tmpdir)
-    builder.add_package("pkg-c")
-    builder.add_package("pkg-b", dependencies=[("pkg-c", None, None)])
-    builder.add_package("pkg-a", dependencies=[("pkg-b", None, None)])
+    builder.add_package("c")
+    builder.add_package("b", dependencies=[("c", None, None)])
+    builder.add_package("a", dependencies=[("b", None, None)])

     with spack.repo.use_repositories(builder.root):
-        b_spec = Spec("pkg-b").concretized()
-        monkeypatch.setattr(Spec, "installed", property(lambda x: x.name != "pkg-a"))
+        b_spec = Spec("b").concretized()
+        monkeypatch.setattr(Spec, "installed", property(lambda x: x.name != "a"))

-        a_spec = Spec("pkg-a")
+        a_spec = Spec("a")
         a_spec._add_dependency(b_spec, depflag=dt.BUILD | dt.LINK, virtuals=())
         a_spec.concretize()

-        assert {x.name for x in a_spec.traverse()} == {"pkg-a", "pkg-b", "pkg-c"}
+        assert set(x.name for x in a_spec.traverse()) == set(["a", "b", "c"])


 @pytest.mark.usefixtures("config")
@@ -982,15 +982,15 @@ def test_synthetic_construction_of_split_dependencies_from_same_package(mock_pac
     # Construct in a synthetic way (i.e. without using the solver)
     # the following spec:
     #
-    #         pkg-b
+    #         b
     #   build /  \ link,run
-    #  pkg-c@2.0  pkg-c@1.0
+    #   c@2.0     c@1.0
     #
     # To demonstrate that a spec can now hold two direct
     # dependencies from the same package
-    root = Spec("pkg-b").concretized()
-    link_run_spec = Spec("pkg-c@=1.0").concretized()
-    build_spec = Spec("pkg-c@=2.0").concretized()
+    root = Spec("b").concretized()
+    link_run_spec = Spec("c@=1.0").concretized()
+    build_spec = Spec("c@=2.0").concretized()

     root.add_dependency_edge(link_run_spec, depflag=dt.LINK, virtuals=())
     root.add_dependency_edge(link_run_spec, depflag=dt.RUN, virtuals=())
@@ -998,10 +998,10 @@ def test_synthetic_construction_of_split_dependencies_from_same_package(mock_pac
     # Check dependencies from the perspective of root
     assert len(root.dependencies()) == 2
-    assert all(x.name == "pkg-c" for x in root.dependencies())
-    assert "@2.0" in root.dependencies(name="pkg-c", deptype=dt.BUILD)[0]
-    assert "@1.0" in root.dependencies(name="pkg-c", deptype=dt.LINK | dt.RUN)[0]
+    assert all(x.name == "c" for x in root.dependencies())
+    assert "@2.0" in root.dependencies(name="c", deptype=dt.BUILD)[0]
+    assert "@1.0" in root.dependencies(name="c", deptype=dt.LINK | dt.RUN)[0]

     # Check parent from the perspective of the dependencies
     assert len(build_spec.dependents()) == 1
@@ -1013,30 +1013,30 @@ def test_synthetic_construction_of_split_dependencies_from_same_package(mock_pac
 def test_synthetic_construction_bootstrapping(mock_packages, config):
     # Construct the following spec:
     #
-    #   pkg-b@2.0
+    #   b@2.0
     #     | build
-    #   pkg-b@1.0
+    #   b@1.0
     #
-    root = Spec("pkg-b@=2.0").concretized()
-    bootstrap = Spec("pkg-b@=1.0").concretized()
+    root = Spec("b@=2.0").concretized()
+    bootstrap = Spec("b@=1.0").concretized()

     root.add_dependency_edge(bootstrap, depflag=dt.BUILD, virtuals=())

     assert len(root.dependencies()) == 1
-    assert root.dependencies()[0].name == "pkg-b"
-    assert root.name == "pkg-b"
+    assert root.dependencies()[0].name == "b"
+    assert root.name == "b"


 def test_addition_of_different_deptypes_in_multiple_calls(mock_packages, config):
     # Construct the following spec:
     #
-    #   pkg-b@2.0
+    #   b@2.0
     #     | build,link,run
-    #   pkg-b@1.0
+    #   b@1.0
     #
     # with three calls and check we always have a single edge
-    root = Spec("pkg-b@=2.0").concretized()
-    bootstrap = Spec("pkg-b@=1.0").concretized()
+    root = Spec("b@=2.0").concretized()
+    bootstrap = Spec("b@=1.0").concretized()

     for current_depflag in (dt.BUILD, dt.LINK, dt.RUN):
         root.add_dependency_edge(bootstrap, depflag=current_depflag, virtuals=())
@@ -1063,9 +1063,9 @@ def test_addition_of_different_deptypes_in_multiple_calls(mock_packages, config)
 def test_adding_same_deptype_with_the_same_name_raises(
     mock_packages, config, c1_depflag, c2_depflag
 ):
-    p = Spec("pkg-b@=2.0").concretized()
-    c1 = Spec("pkg-b@=1.0").concretized()
-    c2 = Spec("pkg-b@=2.0").concretized()
+    p = Spec("b@=2.0").concretized()
+    c1 = Spec("b@=1.0").concretized()
+    c2 = Spec("b@=2.0").concretized()

     p.add_dependency_edge(c1, depflag=c1_depflag, virtuals=())
     with pytest.raises(spack.error.SpackError):
@@ -1105,23 +1105,3 @@ def test_indexing_prefers_direct_or_transitive_link_deps():
     # Ensure that the full DAG is still searched
     assert root["a2"]
-
-
-def test_getitem_sticks_to_subdag():
-    """Test that indexing on Spec by virtual does not traverse outside the dag, which happens in
-    the unlikely case someone would rewrite __getitem__ in terms of edges_from_dependents instead
-    of edges_to_dependencies."""
-    x, y, z = Spec("x"), Spec("y"), Spec("z")
-    x.add_dependency_edge(z, depflag=dt.LINK, virtuals=("virtual",))
-    y.add_dependency_edge(z, depflag=dt.LINK, virtuals=())
-    assert x["virtual"].name == "z"
-    with pytest.raises(KeyError):
-        y["virtual"]
-
-
-def test_getitem_finds_transitive_virtual():
-    x, y, z = Spec("x"), Spec("y"), Spec("z")
-    x.add_dependency_edge(z, depflag=dt.LINK, virtuals=())
-    x.add_dependency_edge(y, depflag=dt.LINK, virtuals=())
-    y.add_dependency_edge(z, depflag=dt.LINK, virtuals=("virtual",))
-    assert x["virtual"].name == "z"


@@ -373,7 +373,7 @@ def test_satisfies_single_valued_variant(self):
         https://github.com/spack/spack/pull/2386#issuecomment-282147639
         is handled correctly.
         """
-        a = Spec("pkg-a foobar=bar")
+        a = Spec("a foobar=bar")
         a.concretize()
         assert a.satisfies("foobar=bar")
@@ -390,21 +390,21 @@ def test_satisfies_single_valued_variant(self):
         assert "foo=bar" in a

         # Check that conditional dependencies are treated correctly
-        assert "^pkg-b" in a
+        assert "^b" in a

     def test_unsatisfied_single_valued_variant(self):
-        a = Spec("pkg-a foobar=baz")
+        a = Spec("a foobar=baz")
         a.concretize()
-        assert "^pkg-b" not in a
+        assert "^b" not in a

         mv = Spec("multivalue-variant")
         mv.concretize()
-        assert "pkg-a@1.0" not in mv
+        assert "a@1.0" not in mv

     def test_indirect_unsatisfied_single_valued_variant(self):
         spec = Spec("singlevalue-variant-dependent")
         spec.concretize()
-        assert "pkg-a@1.0" not in spec
+        assert "a@1.0" not in spec

     def test_unsatisfiable_multi_value_variant(self, default_mock_concretization):
         # Semantics for a multi-valued variant is different
@@ -672,13 +672,6 @@ def test_spec_formatting(self, default_mock_concretization):
             ("{/hash}", "/", lambda s: "/" + s.dag_hash()),
         ]

-        variants_segments = [
-            ("{variants.debug}", spec, "debug"),
-            ("{variants.foo}", spec, "foo"),
-            ("{^pkg-a.variants.bvv}", spec["pkg-a"], "bvv"),
-            ("{^pkg-a.variants.foo}", spec["pkg-a"], "foo"),
-        ]
-
         other_segments = [
             ("{spack_root}", spack.paths.spack_root),
             ("{spack_install}", spack.store.STORE.layout.root),
@@ -706,12 +699,6 @@ def check_prop(check_spec, fmt_str, prop, getter):
             callpath, fmt_str = depify("callpath", named_str, sigil)
             assert spec.format(fmt_str) == getter(callpath)

-        for named_str, test_spec, variant_name in variants_segments:
-            assert test_spec.format(named_str) == str(test_spec.variants[variant_name])
-            assert test_spec.format(named_str[:-1] + ".value}") == str(
-                test_spec.variants[variant_name].value
-            )
-
         for named_str, expected in other_segments:
             actual = spec.format(named_str)
             assert expected == actual
@@ -744,7 +731,6 @@ def test_spec_formatting_sigil_mismatches(self, default_mock_concretization, fmt
             r"{dag_hash}",
             r"{foo}",
             r"{+variants.debug}",
-            r"{variants.this_variant_does_not_exist}",
         ],
     )
     def test_spec_formatting_bad_formats(self, default_mock_concretization, fmt_str):
@@ -1000,8 +986,8 @@ def test_splice_swap_names_mismatch_virtuals(self, default_mock_concretization,
             spec.splice(dep, transitive)

     def test_spec_override(self):
-        init_spec = Spec("pkg-a foo=baz foobar=baz cflags=-O3 cxxflags=-O1")
-        change_spec = Spec("pkg-a foo=fee cflags=-O2")
+        init_spec = Spec("a foo=baz foobar=baz cflags=-O3 cxxflags=-O1")
+        change_spec = Spec("a foo=fee cflags=-O2")
         new_spec = Spec.override(init_spec, change_spec)
         new_spec.concretize()
         assert "foo=fee" in new_spec
@@ -1283,15 +1269,15 @@ def test_spec_installed(default_mock_concretization, database):
     spec = Spec("not-a-real-package")
     assert not spec.installed

-    # pkg-a is not in the mock DB and is not installed
-    spec = default_mock_concretization("pkg-a")
+    # 'a' is not in the mock DB and is not installed
+    spec = default_mock_concretization("a")
     assert not spec.installed


 @pytest.mark.regression("30678")
 def test_call_dag_hash_on_old_dag_hash_spec(mock_packages, default_mock_concretization):
     # create a concrete spec
-    a = default_mock_concretization("pkg-a")
+    a = default_mock_concretization("a")
     dag_hashes = {spec.name: spec.dag_hash() for spec in a.traverse()}

     # make it look like an old DAG hash spec with no package hash on the spec.
@@ -1350,8 +1336,8 @@ def test_unsupported_compiler():
 def test_package_hash_affects_dunder_and_dag_hash(mock_packages, default_mock_concretization):
-    a1 = default_mock_concretization("pkg-a")
-    a2 = default_mock_concretization("pkg-a")
+    a1 = default_mock_concretization("a")
+    a2 = default_mock_concretization("a")

     assert hash(a1) == hash(a2)
     assert a1.dag_hash() == a2.dag_hash()
@@ -1375,8 +1361,8 @@ def test_intersects_and_satisfies_on_concretized_spec(default_mock_concretizatio
     """Test that a spec obtained by concretizing an abstract spec, satisfies the abstract spec
     but not vice-versa.
     """
-    a1 = default_mock_concretization("pkg-a@1.0")
-    a2 = Spec("pkg-a@1.0")
+    a1 = default_mock_concretization("a@1.0")
+    a2 = Spec("a@1.0")

     assert a1.intersects(a2)
     assert a2.intersects(a1)
@@ -1502,17 +1488,17 @@ def test_constrain(factory, lhs_str, rhs_str, result, constrained_str):
 def test_abstract_hash_intersects_and_satisfies(default_mock_concretization):
-    concrete: Spec = default_mock_concretization("pkg-a")
+    concrete: Spec = default_mock_concretization("a")
     hash = concrete.dag_hash()
     hash_5 = hash[:5]
     hash_6 = hash[:6]
     # abstract hash that doesn't have a common prefix with the others.
     hash_other = f"{'a' if hash_5[0] == 'b' else 'b'}{hash_5[1:]}"

-    abstract_5 = Spec(f"pkg-a/{hash_5}")
-    abstract_6 = Spec(f"pkg-a/{hash_6}")
-    abstract_none = Spec(f"pkg-a/{hash_other}")
-    abstract = Spec("pkg-a")
+    abstract_5 = Spec(f"a/{hash_5}")
+    abstract_6 = Spec(f"a/{hash_6}")
+    abstract_none = Spec(f"a/{hash_other}")
+    abstract = Spec("a")

     def assert_subset(a: Spec, b: Spec):
         assert a.intersects(b) and b.intersects(a) and a.satisfies(b) and not b.satisfies(a)
@@ -1549,6 +1535,6 @@ def test_edge_equality_does_not_depend_on_virtual_order():
 def test_old_format_strings_trigger_error(default_mock_concretization):
-    s = Spec("pkg-a").concretized()
+    s = Spec("a").concretized()
     with pytest.raises(SpecFormatStringError):
         s.format("${PACKAGE}-${VERSION}-${HASH}")


@@ -759,7 +759,7 @@ def test_spec_by_hash_tokens(text, tokens):
 @pytest.mark.db
 def test_spec_by_hash(database, monkeypatch, config):
     mpileaks = database.query_one("mpileaks ^zmpi")
-    b = spack.spec.Spec("pkg-b").concretized()
+    b = spack.spec.Spec("b").concretized()
     monkeypatch.setattr(spack.binary_distribution, "update_cache_and_get_specs", lambda: [b])

     hash_str = f"/{mpileaks.dag_hash()}"
@@ -856,7 +856,7 @@ def test_ambiguous_hash(mutable_database, default_mock_concretization, config):
     In the past this ambiguity error would happen during parse time."""
     # This is a very sketchy as manually setting hashes easily breaks invariants
-    x1 = spack.spec.Spec("pkg-a").concretized()
+    x1 = default_mock_concretization("a")
     x2 = x1.copy()
     x1._hash = "xyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyy"
     x1._process_hash = "xyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyy"
@@ -874,7 +874,7 @@ def test_ambiguous_hash(mutable_database, default_mock_concretization, config):
         s1.lookup_hash()

     # ambiguity in first hash character AND spec name
-    s2 = SpecParser("pkg-a/x").next_spec()
+    s2 = SpecParser("a/x").next_spec()
     with pytest.raises(spack.spec.AmbiguousHashError):
         s2.lookup_hash()


@@ -314,23 +314,23 @@ def test_save_dependency_spec_jsons_subset(tmpdir, config):
     output_path = str(tmpdir.mkdir("spec_jsons"))

     builder = spack.repo.MockRepositoryBuilder(tmpdir.mkdir("mock-repo"))
-    builder.add_package("pkg-g")
-    builder.add_package("pkg-f")
-    builder.add_package("pkg-e")
-    builder.add_package("pkg-d", dependencies=[("pkg-f", None, None), ("pkg-g", None, None)])
-    builder.add_package("pkg-c")
-    builder.add_package("pkg-b", dependencies=[("pkg-d", None, None), ("pkg-e", None, None)])
-    builder.add_package("pkg-a", dependencies=[("pkg-b", None, None), ("pkg-c", None, None)])
+    builder.add_package("g")
+    builder.add_package("f")
+    builder.add_package("e")
+    builder.add_package("d", dependencies=[("f", None, None), ("g", None, None)])
+    builder.add_package("c")
+    builder.add_package("b", dependencies=[("d", None, None), ("e", None, None)])
+    builder.add_package("a", dependencies=[("b", None, None), ("c", None, None)])

     with spack.repo.use_repositories(builder.root):
-        spec_a = Spec("pkg-a").concretized()
-        b_spec = spec_a["pkg-b"]
-        c_spec = spec_a["pkg-c"]
+        spec_a = Spec("a").concretized()
+        b_spec = spec_a["b"]
+        c_spec = spec_a["c"]

-        save_dependency_specfiles(spec_a, output_path, [Spec("pkg-b"), Spec("pkg-c")])
+        save_dependency_specfiles(spec_a, output_path, [Spec("b"), Spec("c")])

-        assert check_specs_equal(b_spec, os.path.join(output_path, "pkg-b.json"))
-        assert check_specs_equal(c_spec, os.path.join(output_path, "pkg-c.json"))
+        assert check_specs_equal(b_spec, os.path.join(output_path, "b.json"))
+        assert check_specs_equal(c_spec, os.path.join(output_path, "c.json"))


 def test_legacy_yaml(tmpdir, install_mockery, mock_packages):


@@ -15,7 +15,6 @@
 import pytest

 from llnl.util.filesystem import getuid, mkdirp, partition_path, touch, working_dir
-from llnl.util.symlink import readlink

 import spack.error
 import spack.paths
@@ -23,7 +22,7 @@
 import spack.util.executable
 import spack.util.url as url_util
 from spack.resource import Resource
-from spack.stage import DevelopStage, ResourceStage, Stage, StageComposite
+from spack.stage import DevelopStage, DIYStage, ResourceStage, Stage, StageComposite
 from spack.util.path import canonicalize_path

 # The following values are used for common fetch and stage mocking fixtures:
@@ -146,8 +145,9 @@ def check_destroy(stage, stage_name):
     assert not os.path.exists(stage_path)

     # tmp stage needs to remove tmp dir too.
-    target = os.path.realpath(stage_path)
-    assert not os.path.exists(target)
+    if not isinstance(stage, DIYStage):
+        target = os.path.realpath(stage_path)
+        assert not os.path.exists(target)


 def check_setup(stage, stage_name, archive):
@@ -800,6 +800,62 @@ def test_stage_constructor_with_path(self, tmpdir):
         with Stage("file:///does-not-exist", path=testpath) as stage:
             assert stage.path == testpath

+    def test_diystage_path_none(self):
+        """Ensure DIYStage for path=None behaves as expected."""
+        with pytest.raises(ValueError):
+            DIYStage(None)
+
+    def test_diystage_path_invalid(self):
+        """Ensure DIYStage for an invalid path behaves as expected."""
+        with pytest.raises(spack.stage.StagePathError):
+            DIYStage("/path/does/not/exist")
+
+    def test_diystage_path_valid(self, tmpdir):
+        """Ensure DIYStage for a valid path behaves as expected."""
+        path = str(tmpdir)
+        stage = DIYStage(path)
+        assert stage.path == path
+        assert stage.source_path == path
+
+        # Order doesn't really matter for DIYStage since they are
+        # basically NOOPs; however, call each since they are part
+        # of the normal stage usage and to ensure full test coverage.
+        stage.create()  # Only sets the flag value
+        assert stage.created
+
+        stage.cache_local()  # Only outputs a message
+        stage.fetch()  # Only outputs a message
+        stage.check()  # Only outputs a message
+
+        stage.expand_archive()  # Only outputs a message
+        assert stage.expanded  # The path/source_path does exist
+
+        with pytest.raises(spack.stage.RestageError):
+            stage.restage()
+
+        stage.destroy()  # A no-op
+        assert stage.path == path  # Ensure can still access attributes
+        assert os.path.exists(stage.source_path)  # Ensure path still exists
+
+    def test_diystage_preserve_file(self, tmpdir):
+        """Ensure DIYStage preserves an existing file."""
+        # Write a file to the temporary directory
+        fn = tmpdir.join(_readme_fn)
+        fn.write(_readme_contents)
+
+        # Instantiate the DIYStage and ensure the above file is unchanged.
+        path = str(tmpdir)
+        stage = DIYStage(path)
+        assert os.path.isdir(path)
+        assert os.path.isfile(str(fn))
+
+        stage.create()  # Only sets the flag value
+
+        readmefn = str(fn)
+        assert os.path.isfile(readmefn)
+        with open(readmefn) as _file:
+            _file.read() == _readme_contents
+

 def _create_files_from_tree(base, tree):
     for name, content in tree.items():
@@ -816,7 +872,7 @@ def _create_files_from_tree(base, tree):
 def _create_tree_from_dir_recursive(path):
     if os.path.islink(path):
-        return readlink(path)
+        return os.readlink(path)
     elif os.path.isdir(path):
         tree = {}
         for name in os.listdir(path):
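Condensing the tests added above, the DIYStage contract looks roughly like this (a sketch, assuming a Spack version that still ships DIYStage and an existing source directory):

    import os
    from spack.stage import DIYStage

    stage = DIYStage("/some/existing/dir")  # StagePathError if the path is missing
    stage.create()                          # only sets stage.created
    stage.fetch()                           # no-op apart from a message
    stage.expand_archive()                  # no-op; stage.expanded is True
    assert stage.source_path == stage.path  # sources are used in place
    stage.destroy()                         # a no-op: the directory survives
    assert os.path.exists(stage.path)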


@@ -519,7 +519,7 @@ def test_find_required_file(tmpdir):
 def test_packagetest_fails(mock_packages):
     MyPackage = collections.namedtuple("MyPackage", ["spec"])

-    s = spack.spec.Spec("pkg-a")
+    s = spack.spec.Spec("a")
     pkg = MyPackage(s)
     with pytest.raises(ValueError, match="require a concrete package"):
         spack.install_test.PackageTest(pkg)


@@ -160,13 +160,22 @@ def test_reverse_environment_modifications(working_env):
     assert os.environ == start_env


-def test_shell_modifications_are_properly_escaped():
-    """Test that variable values are properly escaped so that they can safely be eval'd."""
-    changes = envutil.EnvironmentModifications()
-    changes.set("VAR", "$PATH")
-    changes.append_path("VAR", "$ANOTHER_PATH")
-    changes.set("RM_RF", "$(rm -rf /)")
-
-    script = changes.shell_modifications(shell="sh")
-    assert f"export VAR='$PATH{os.pathsep}$ANOTHER_PATH'" in script
-    assert "export RM_RF='$(rm -rf /)'" in script
+def test_escape_double_quotes_in_shell_modifications():
+    to_validate = envutil.EnvironmentModifications()
+
+    to_validate.set("VAR", "$PATH")
+    to_validate.append_path("VAR", "$ANOTHER_PATH")
+
+    to_validate.set("QUOTED_VAR", '"MY_VAL"')
+
+    if sys.platform == "win32":
+        cmds = to_validate.shell_modifications(shell="bat")
+        assert r'set "VAR=$PATH;$ANOTHER_PATH"' in cmds
+        assert r'set "QUOTED_VAR="MY_VAL"' in cmds
+        cmds = to_validate.shell_modifications(shell="pwsh")
+        assert "$Env:VAR='$PATH;$ANOTHER_PATH'" in cmds
+        assert "$Env:QUOTED_VAR='\"MY_VAL\"'" in cmds
+    else:
+        cmds = to_validate.shell_modifications()
+        assert 'export VAR="$PATH:$ANOTHER_PATH"' in cmds
+        assert r'export QUOTED_VAR="\"MY_VAL\""' in cmds
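Both versions of this test drive the same small API (envutil is spack.util.environment, as imported at the top of the module); a minimal sketch of the round trip:

    import spack.util.environment as envutil

    mods = envutil.EnvironmentModifications()
    mods.set("VAR", "$PATH")
    mods.append_path("VAR", "$ANOTHER_PATH")

    # Render the queued changes as shell code; values are quoted so the target
    # shell neither expands nor executes them when the script is sourced.
    script = mods.shell_modifications(shell="sh")
    print(script)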


@@ -8,8 +8,6 @@
 import os.path
 import urllib.parse

-import pytest
-
 import spack.util.path
 import spack.util.url as url_util
@@ -47,63 +45,155 @@ def test_relative_path_to_file_url(tmpdir):
     assert os.path.samefile(roundtrip, path)


-@pytest.mark.parametrize("resolve_href", [True, False])
-@pytest.mark.parametrize("scheme", ["http", "s3", "gs", "file", "oci"])
-def test_url_join_absolute(scheme, resolve_href):
-    """Test that joining a URL with an absolute path works the same for schemes we care about, and
-    whether we work in web browser mode or not."""
-    netloc = "" if scheme == "file" else "example.com"
-    a1 = url_util.join(f"{scheme}://{netloc}/a/b/c", "/d/e/f", resolve_href=resolve_href)
-    a2 = url_util.join(f"{scheme}://{netloc}/a/b/c", "/d", "e", "f", resolve_href=resolve_href)
-    assert a1 == a2 == f"{scheme}://{netloc}/d/e/f"
-
-    b1 = url_util.join(f"{scheme}://{netloc}/a", "https://b.com/b", resolve_href=resolve_href)
-    b2 = url_util.join(f"{scheme}://{netloc}/a", "https://b.com", "b", resolve_href=resolve_href)
-    assert b1 == b2 == "https://b.com/b"
-
-
-@pytest.mark.parametrize("scheme", ["http", "s3", "gs"])
-def test_url_join_up(scheme):
-    """Test that the netloc component is preserved when going .. up in the path."""
-    a1 = url_util.join(f"{scheme}://netloc/a/b.html", "c", resolve_href=True)
-    assert a1 == f"{scheme}://netloc/a/c"
-    b1 = url_util.join(f"{scheme}://netloc/a/b.html", "../c", resolve_href=True)
-    b2 = url_util.join(f"{scheme}://netloc/a/b.html", "..", "c", resolve_href=True)
-    assert b1 == b2 == f"{scheme}://netloc/c"
-    c1 = url_util.join(f"{scheme}://netloc/a/b.html", "../../c", resolve_href=True)
-    c2 = url_util.join(f"{scheme}://netloc/a/b.html", "..", "..", "c", resolve_href=True)
-    assert c1 == c2 == f"{scheme}://netloc/c"
-
-    d1 = url_util.join(f"{scheme}://netloc/a/b", "c", resolve_href=False)
-    assert d1 == f"{scheme}://netloc/a/b/c"
-    d2 = url_util.join(f"{scheme}://netloc/a/b", "../c", resolve_href=False)
-    d3 = url_util.join(f"{scheme}://netloc/a/b", "..", "c", resolve_href=False)
-    assert d2 == d3 == f"{scheme}://netloc/a/c"
-    e1 = url_util.join(f"{scheme}://netloc/a/b", "../../c", resolve_href=False)
-    e2 = url_util.join(f"{scheme}://netloc/a/b", "..", "..", "c", resolve_href=False)
-    assert e1 == e2 == f"{scheme}://netloc/c"
-    f1 = url_util.join(f"{scheme}://netloc/a/b", "../../../c", resolve_href=False)
-    f2 = url_util.join(f"{scheme}://netloc/a/b", "..", "..", "..", "c", resolve_href=False)
-    assert f1 == f2 == f"{scheme}://netloc/c"
-
-
-@pytest.mark.parametrize("scheme", ["http", "https", "ftp", "s3", "gs", "file"])
-def test_url_join_resolve_href(scheme):
-    """test that `resolve_href=True` behaves like a web browser at the base page, and
-    `resolve_href=False` behaves like joining paths in a file system at the base directory."""
-    # these are equivalent because of the trailing /
-    netloc = "" if scheme == "file" else "netloc"
-    a1 = url_util.join(f"{scheme}://{netloc}/my/path/", "other/path", resolve_href=True)
-    a2 = url_util.join(f"{scheme}://{netloc}/my/path/", "other", "path", resolve_href=True)
-    assert a1 == a2 == f"{scheme}://{netloc}/my/path/other/path"
-    b1 = url_util.join(f"{scheme}://{netloc}/my/path", "other/path", resolve_href=False)
-    b2 = url_util.join(f"{scheme}://{netloc}/my/path", "other", "path", resolve_href=False)
-    assert b1 == b2 == f"{scheme}://{netloc}/my/path/other/path"
-
-    # this is like a web browser: relative to /my.
-    c1 = url_util.join(f"{scheme}://{netloc}/my/path", "other/path", resolve_href=True)
-    c2 = url_util.join(f"{scheme}://{netloc}/my/path", "other", "path", resolve_href=True)
-    assert c1 == c2 == f"{scheme}://{netloc}/my/other/path"
+def test_url_join_local_paths():
+    # Resolve local link against page URL
+
+    # wrong:
+    assert (
+        url_util.join("s3://bucket/index.html", "../other-bucket/document.txt")
+        == "s3://bucket/other-bucket/document.txt"
+    )
+
+    # correct - need to specify resolve_href=True:
+    assert (
+        url_util.join("s3://bucket/index.html", "../other-bucket/document.txt", resolve_href=True)
+        == "s3://other-bucket/document.txt"
+    )
+
+    # same as above: make sure several components are joined together correctly
+    assert (
+        url_util.join(
+            # with resolve_href=True, first arg is the base url; can not be
+            # broken up
+            "s3://bucket/index.html",
+            # with resolve_href=True, remaining arguments are the components of
+            # the local href that needs to be resolved
+            "..",
+            "other-bucket",
+            "document.txt",
+            resolve_href=True,
+        )
+        == "s3://other-bucket/document.txt"
+    )
+
+    # Append local path components to prefix URL
+
+    # wrong:
+    assert (
+        url_util.join("https://mirror.spack.io/build_cache", "my-package", resolve_href=True)
+        == "https://mirror.spack.io/my-package"
+    )
+
+    # correct - Need to specify resolve_href=False:
+    assert (
+        url_util.join("https://mirror.spack.io/build_cache", "my-package", resolve_href=False)
+        == "https://mirror.spack.io/build_cache/my-package"
+    )
+
+    # same as above; make sure resolve_href=False is default
+    assert (
+        url_util.join("https://mirror.spack.io/build_cache", "my-package")
+        == "https://mirror.spack.io/build_cache/my-package"
+    )
+
+    # same as above: make sure several components are joined together correctly
+    assert (
+        url_util.join(
+            # with resolve_href=False, first arg is just a prefix. No
+            # resolution is done. So, there should be no difference between
+            # join('/a/b/c', 'd/e'),
+            # join('/a/b', 'c', 'd/e'),
+            # join('/a', 'b/c', 'd', 'e'), etc.
+            "https://mirror.spack.io",
+            "build_cache",
+            "my-package",
+        )
+        == "https://mirror.spack.io/build_cache/my-package"
+    )
+
+    # For s3:// URLs, the "netloc" (bucket) is considered part of the path.
+    # Make sure join() can cross bucket boundaries in this case.
+    args = ["s3://bucket/a/b", "new-bucket", "c"]
+    assert url_util.join(*args) == "s3://bucket/a/b/new-bucket/c"
+
+    args.insert(1, "..")
+    assert url_util.join(*args) == "s3://bucket/a/new-bucket/c"
+
+    args.insert(1, "..")
+    assert url_util.join(*args) == "s3://bucket/new-bucket/c"
+
+    # new-bucket is now the "netloc" (bucket name)
+    args.insert(1, "..")
+    assert url_util.join(*args) == "s3://new-bucket/c"
+
+
+def test_url_join_absolute_paths():
+    # Handling absolute path components is a little tricky. To this end, we
+    # distinguish "absolute path components", from the more-familiar concept of
+    # "absolute paths" as they are understood for local filesystem paths.
+    #
+    # - All absolute paths are absolute path components. Joining a URL with
+    #   these components has the effect of completely replacing the path of the
+    #   URL with the absolute path. These components do not specify a URL
+    #   scheme, so the scheme of the URL procuced when joining them depend on
+    #   those provided by components that came before it (file:// assumed if no
+    #   such scheme is provided).
+
+    # For eaxmple:
+    p = "/path/to/resource"
+    # ...is an absolute path
+
+    # http:// URL
+    assert url_util.join("http://example.com/a/b/c", p) == "http://example.com/path/to/resource"
+
+    # s3:// URL
+    # also notice how the netloc is treated as part of the path for s3:// URLs
+    assert url_util.join("s3://example.com/a/b/c", p) == "s3://path/to/resource"
+
+    # - URL components that specify a scheme are always absolute path
+    #   components. Joining a base URL with these components effectively
+    #   discards the base URL and "resets" the joining logic starting at the
+    #   component in question and using it as the new base URL.
+
+    # For eaxmple:
+    p = "http://example.com/path/to"
+    # ...is an http:// URL
+
+    join_result = url_util.join(p, "resource")
+    assert join_result == "http://example.com/path/to/resource"
+
+    # works as if everything before the http:// URL was left out
+    assert url_util.join("literally", "does", "not", "matter", p, "resource") == join_result
+
+    assert url_util.join("file:///a/b/c", "./d") == "file:///a/b/c/d"
+
+    # Finally, resolve_href should have no effect for how absolute path
+    # components are handled because local hrefs can not be absolute path
+    # components.
+    args = [
+        "s3://does",
+        "not",
+        "matter",
+        "http://example.com",
+        "also",
+        "does",
+        "not",
+        "matter",
+        "/path",
+    ]
+    expected = "http://example.com/path"
+    assert url_util.join(*args, resolve_href=True) == expected
+    assert url_util.join(*args, resolve_href=False) == expected
+
+    # resolve_href only matters for the local path components at the end of the
+    # argument list.
+    args[-1] = "/path/to/page"
+    args.extend(("..", "..", "resource"))
+
+    assert url_util.join(*args, resolve_href=True) == "http://example.com/resource"
+    assert url_util.join(*args, resolve_href=False) == "http://example.com/path/resource"


 def test_default_download_name():
@@ -117,29 +207,3 @@ def test_default_download_name_dot_dot():
     assert url_util.default_download_filename("https://example.com/.") == "_"
     assert url_util.default_download_filename("https://example.com/..") == "_."
     assert url_util.default_download_filename("https://example.com/.abcdef") == "_abcdef"
-
-
-def test_parse_link_rel_next():
-    parse = url_util.parse_link_rel_next
-    assert parse(r'</abc>; rel="next"') == "/abc"
-    assert parse(r'</abc>; x=y; rel="next", </def>; x=y; rel="prev"') == "/abc"
-    assert parse(r'</abc>; rel="prev"; x=y, </def>; x=y; rel="next"') == "/def"
-
-    # example from RFC5988
-    assert (
-        parse(
-            r"""</TheBook/chapter2>; title*=UTF-8'de'letztes%20Kapitel; rel="previous","""
-            r"""</TheBook/chapter4>; title*=UTF-8'de'n%c3%a4chstes%20Kapitel; rel="next" """
-        )
-        == "/TheBook/chapter4"
-    )
-
-    assert (
-        parse(r"""<https://example.com/example>; key=";a=b, </c/d>; e=f"; rel="next" """)
-        == "https://example.com/example"
-    )
-
-    assert parse("https://example.com/example") is None
-    assert parse("<https://example.com/example; broken=broken") is None
-    assert parse("https://example.com/example; rel=prev") is None
-    assert parse("https://example.com/example; a=b; c=d; g=h") is None


@@ -33,8 +33,8 @@ def test_view_with_spec_not_contributing_files(mock_packages, tmpdir):
     layout = DirectoryLayout(view_dir)
     view = SimpleFilesystemView(view_dir, layout)

-    a = Spec("pkg-a")
-    b = Spec("pkg-b")
+    a = Spec("a")
+    b = Spec("b")
     a.prefix = os.path.join(tmpdir, "a")
     b.prefix = os.path.join(tmpdir, "b")
     a._mark_concrete()

Some files were not shown because too many files have changed in this diff.