Compare commits

119 Commits

Author SHA1 Message Date
Massimiliano Culpo
1aaf606cd1 Update version number
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-03-01 08:49:29 +01:00
Massimiliano Culpo
0f9b1c85dd Update command completion
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-02-28 23:34:57 +01:00
Massimiliano Culpo
eacdfef38f Account for language on flag propagation
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-02-28 23:34:57 +01:00
Massimiliano Culpo
1316e4a2e3 concretize.lp: minor clean up of node_compiler
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-02-28 23:34:56 +01:00
Massimiliano Culpo
de51c6b894 spack verify: use --fake in test installs 2025-02-28 23:34:56 +01:00
Harmen Stoppels
9fb5878ebb improve static check of direct deps 2025-02-28 23:34:55 +01:00
Massimiliano Culpo
cfaf130115 Fix style issue
Possible bug in mypy?
2025-02-28 23:34:55 +01:00
Massimiliano Culpo
5dbbb52579 spec: remove {%compiler} in default format 2025-02-28 23:34:55 +01:00
Harmen Stoppels
964baf9402 remove leftover debug statements 2025-02-28 23:34:54 +01:00
Harmen Stoppels
2e2f76819c config.py: undo reintroduction of ConfigurationType 2025-02-28 23:34:54 +01:00
Massimiliano Culpo
848816efa4 cray-libsci: fix finding library
The fix is brittle, but not more brittle
than before, where users had to specify
the "correct" compiler ans prefix in the
external definition
2025-02-28 23:34:53 +01:00
Massimiliano Culpo
035131749b macOS: improve error message for required compilers 2025-02-28 23:34:53 +01:00
Massimiliano Culpo
05cf9b32a5 Add unit-tests on runtimes 2025-02-28 23:34:52 +01:00
Massimiliano Culpo
29d273d86e Disregard only default compiler requirements for runtimes 2025-02-28 23:34:52 +01:00
Massimiliano Culpo
e37a7b6c91 gromacs: fix issue with ^gcc 2025-02-28 23:34:51 +01:00
Massimiliano Culpo
626c5c59c6 Relax rule on gcc-runtime
It might happen that gcc-runtime is just a dep
of another runtime, and not used as a compiler.
2025-02-28 23:34:51 +01:00
Massimiliano Culpo
229945ed27 compiler-wrapper: add missing links on Cray 2025-02-28 23:34:51 +01:00
Massimiliano Culpo
6989bf9661 py-petsc4py: patch again for wrong assumption on LDSHARED 2025-02-28 23:34:50 +01:00
Massimiliano Culpo
bab2053d54 darwin: try to use better defaults 2025-02-28 23:34:50 +01:00
Massimiliano Culpo
cc8447968e gcc-runtime: fix case of only gfortran in macOS 2025-02-28 23:34:49 +01:00
Massimiliano Culpo
ae1f39339b Fix failing test for self dependency
TO BE CHECKED
2025-02-28 23:34:49 +01:00
Massimiliano Culpo
30ebf3595b Don't use compilers from buildcache if reuse is false 2025-02-28 23:34:48 +01:00
Massimiliano Culpo
7800c4c51b Allow using compilers from build caches 2025-02-28 23:34:48 +01:00
Massimiliano Culpo
476f2a63e2 Add support for externals with compiler specified
Compiler annotation is taken into account when
selecting externals as dependencies.
2025-02-28 23:34:47 +01:00
Massimiliano Culpo
745a0fac8a Improve default selection of compilers
This splits optimization on providers, and
puts selection of default compilers at a lower
priority than selection of e.g. the mpi or
lapack provider.

This way, on systems where two or more
compilers are used (e.g. macOS), clingo will
not make counterintuitive choices for mpi or
lapack just to minimize the penalty for
using different compilers on the same node.
2025-02-28 23:34:47 +01:00
Massimiliano Culpo
660fff39eb rpcsvc-proto: point to compiler-wrapper, instead of Spack dir
https://gitlab.spack.io/spack/spack/-/jobs/15003090
2025-02-28 23:34:47 +01:00
Massimiliano Culpo
d4b5eb2be6 Improve heuristic.lp 2025-02-28 23:34:46 +01:00
Massimiliano Culpo
ddeab9879e Improve stability of DAG when using multiple compilers for the same language 2025-02-28 23:34:46 +01:00
Massimiliano Culpo
5920d31b25 unit-test: remove test that is superseded by new semantic 2025-02-28 23:34:45 +01:00
Massimiliano Culpo
f75555136b No flag propagation when compilers are different
Flag propagation is not activated on nodes with
a compiler that is different from the propagation
source
2025-02-28 23:34:45 +01:00
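
A hedged sketch of what this means in practice (package names and compiler assignments below are illustrative; `==` is Spack's flag-propagation syntax):

from spack.spec import Spec

# cflags=="-O3" requests propagation of the flag to dependencies; per this
# commit, propagation stops at nodes built with a compiler different from
# the propagation source.
spec = Spec('mpileaks cflags=="-O3" %gcc ^callpath %llvm')
# mpileaks and its %gcc-built dependencies receive -O3; callpath, built
# with llvm, does not.
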
Massimiliano Culpo
922b9b0e50 solver: test that we can build llvm%gcc 2025-02-28 23:34:44 +01:00
Harmen Stoppels
da916a944e remove unused code 2025-02-28 23:34:44 +01:00
Harmen Stoppels
6d224d8a6f simplify imports 2025-02-28 23:34:43 +01:00
Massimiliano Culpo
15f0871a6f Add an alias for SpecfileLatest 2025-02-28 23:34:43 +01:00
Massimiliano Culpo
99489c236f Modify granularity of CachedCMakeBuilder.cache_name 2025-02-28 23:34:42 +01:00
Massimiliano Culpo
e94ee8b2f3 Fix version numbers 2025-02-28 23:34:42 +01:00
Massimiliano Culpo
ac72170e91 Spack fails with an error if the local database is at an outdated version
With this commit, Spack fails with an error and an informative message
if the local store is at an outdated version. This usually happens when
people upgrade their Spack version but retain the same configuration as
before.

Since updating the DB without the user's explicit consent would make the
local store incompatible with previous versions of Spack, we opt to error
out and give the user the choice to either reindex, or update the
configuration.
2025-02-28 23:34:42 +01:00
Massimiliano Culpo
fc2793f98f Improve info messages 2025-02-28 23:34:41 +01:00
Massimiliano Culpo
a8dd481bbf Allow using compilers from the local store
To do this we introduce a new fact, that is true
when a compiler is used as a link dependency.

If we don't have this fact, we enforce only run
dependencies in the ASP problem.
2025-02-28 23:34:41 +01:00
Massimiliano Culpo
56c685e374 compiler-wrapper: install in libexec/spack instead of spack/bin 2025-02-28 23:34:40 +01:00
Massimiliano Culpo
42c5cc4dc8 Fix for compiler search and conversion from compilers.yaml
* Don't write to platform scope
* Remove leftover print about compilers.yaml
* Avoid merging all the scopes when writing compilers
2025-02-28 23:34:40 +01:00
Massimiliano Culpo
e0d889fb91 Remove dead code after "spack compiler" reworking 2025-02-28 23:34:39 +01:00
Massimiliano Culpo
4566ad9c2b Tolerate compiler tokens in external specs 2025-02-28 23:34:39 +01:00
Massimiliano Culpo
dc12cfde75 Remove artifacts of frequent rebase
fixup
2025-02-28 23:34:38 +01:00
Massimiliano Culpo
b3cc4b4cb3 Fix non-determinism when using mixed compilers 2025-02-28 23:34:38 +01:00
Massimiliano Culpo
bb65d495d9 Fix setting build dependencies of a build dependency from cli 2025-02-28 23:34:38 +01:00
Massimiliano Culpo
8ccf626306 compiler-wrapper: decouple the package from Spack configuration 2025-02-28 23:34:37 +01:00
Massimiliano Culpo
11065ff318 Remove attribute spuriously added during rebase 2025-02-28 23:34:37 +01:00
Harmen Stoppels
b9b7ef424c Fix test_pkg_flags_from_compiler_and_none 2025-02-28 23:34:36 +01:00
Massimiliano Culpo
91b09cfeb6 Better handling of legacy compilers.yaml
After this commit, entries in compilers.yaml are converted
to entries in packages.yaml, only if no other compiler is present,
when Spack tries to automatically initialize the compiler
configuration.
2025-02-28 23:34:36 +01:00
Massimiliano Culpo
57b8167ead Spec: relax a few type-hints to accept a Sequence 2025-02-28 23:34:35 +01:00
Massimiliano Culpo
aa10284a0a Flatten the default store projection 2025-02-28 23:34:35 +01:00
Massimiliano Culpo
ec8c6e565d Update pipeline configurations
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-02-28 23:34:34 +01:00
Massimiliano Culpo
324d427292 builtin: reduce boilerplate when accessing package attributes 2025-02-28 23:34:34 +01:00
Massimiliano Culpo
4bd9ff2ef0 Use a stub compiler wrapper on windows 2025-02-28 23:34:34 +01:00
Massimiliano Culpo
95115d4290 Turn the compiler wrapper into a package
Remove the compiler wrappers from core Spack, and move
the relevant code into the "compiler-wrapper" package.
2025-02-28 23:34:33 +01:00
Massimiliano Culpo
e5f8049f3d builtin: add cc, cxx, and fortran properties
These properties are implemented in the same way in each compiler
package, at least for external specs. So, push the implementation
to the base class for now.

This needs to be revisited, to make the base class less dependent
on C, C++, and Fortran.
2025-02-28 23:34:33 +01:00
Massimiliano Culpo
da48fdd864 mapl: fix conditional on gfortran 2025-02-28 23:34:32 +01:00
Massimiliano Culpo
f8117e8182 builtin: minimal fix for _get_host_config_path 2025-02-28 23:34:32 +01:00
Massimiliano Culpo
533973671b builtin: fix for Windows pipelines 2025-02-28 23:34:31 +01:00
Massimiliano Culpo
cf9a148708 builtin: changes to packages 2025-02-28 23:34:31 +01:00
Massimiliano Culpo
d6f0ce3e5a lmod: do not init compilers 2025-02-28 23:34:30 +01:00
Massimiliano Culpo
d6cb54da4b Write repo caches in specfile specific files
In this way we'll never encounter weird errors when bumping the
specfile version. There are other unrelated issues with repo caches,
but those can be resolved separately.
2025-02-28 23:34:30 +01:00
Massimiliano Culpo
f6851a56e8 compiler: add cc, cxx, and fortran properties
These properties are implemented in the same way in each compiler
package, at least for external specs. So, push the implementation
to the base class for now.

This needs to be revisited, to make the base class less dependent
on C, C++, and Fortran.
2025-02-28 23:34:29 +01:00
Massimiliano Culpo
95820a91b3 Fix print in spack compiler info 2025-02-28 23:34:29 +01:00
Harmen Stoppels
c3ddea9061 delete redundant spack.concretize.CHECK_COMPILER_EXISTENCE 2025-02-28 23:34:28 +01:00
Massimiliano Culpo
0bd8ca4e08 Raise UnsupportedCompilerFlag when a flag is not supported 2025-02-28 23:34:28 +01:00
Massimiliano Culpo
2d40025ae3 Remove SPACK_COMPILER_SPEC from the environment 2025-02-28 23:34:28 +01:00
Massimiliano Culpo
47d01c086c Prepend compiler wrappers path last, so we don't risk finding externals 2025-02-28 23:34:27 +01:00
Massimiliano Culpo
00c04bd36a Recover splicing semantic 2025-02-28 23:34:27 +01:00
Massimiliano Culpo
0d6a5c0f06 solver: temporarily enforce compilers to be externals 2025-02-28 23:34:26 +01:00
Massimiliano Culpo
f379b304a1 Remove rule already accounted for by the gcc package 2025-02-28 23:34:26 +01:00
Massimiliano Culpo
ec97e7e6fe Allow different target flags for different compilers 2025-02-28 23:34:25 +01:00
Massimiliano Culpo
7f093d129b unit-test: make Spec.compiler behavior stricter
Now the adaptor will raise if the Spec has no C, C++,
or Fortran compiler.
2025-02-28 23:34:25 +01:00
Massimiliano Culpo
3a67dfd9e8 Remove a test that should fail according to concretization rules 2025-02-28 23:34:25 +01:00
Massimiliano Culpo
dfd28bc5c0 Add a unit-test for satisfies and __getitem__ semantic 2025-02-28 23:34:24 +01:00
Massimiliano Culpo
0d8549e282 Add a unit-test for compiler self-dependencies 2025-02-28 23:34:24 +01:00
Massimiliano Culpo
895e3c453e Exempt "compilers" and "runtimes" from default requirements 2025-02-28 23:34:23 +01:00
Massimiliano Culpo
7a429af479 unit-tests: mark a few tests as xfail, or skip, for now
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-02-28 23:34:23 +01:00
Massimiliano Culpo
32fc8c351d Fix setting SPACK_TARGET_ARGS for concrete specs 2025-02-28 23:34:22 +01:00
Massimiliano Culpo
eb85f2e862 unit-test: fix reading Cray manifest files 2025-02-28 23:34:22 +01:00
Massimiliano Culpo
28d42eed5e unit-tests: fix most unit tests to account for the new model
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-02-28 23:34:21 +01:00
Massimiliano Culpo
f79354c312 Fix setting SPACK_TARGET_ARGS for concrete specs 2025-02-28 23:34:21 +01:00
Massimiliano Culpo
5492b9cc6d Fix concretization of julia
That package depends on llvm as a library, and the rule on compatible
targets for compilers was getting in the way.
2025-02-28 23:34:21 +01:00
Massimiliano Culpo
5260acc53b Modify Spec.short_spec to remove compiler info 2025-02-28 23:34:20 +01:00
Massimiliano Culpo
040b827dad Make Spec.compiler behavior stricter
Now the adaptor will raise if the Spec has no C, C++,
or Fortran compiler.
2025-02-28 23:34:20 +01:00
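
A hypothetical sketch of the stricter adaptor (API and package names assumed for illustration; semantics per the commit message):

from spack.spec import Spec

s = Spec("hdf5 %gcc").concretized()
s.compiler  # resolved from the C/C++/Fortran compiler node, e.g. gcc@12

p = Spec("pure-data-pkg").concretized()  # hypothetical package with no compiled language
p.compiler  # now raises, instead of returning an empty compiler spec
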
Massimiliano Culpo
54bca16130 asp: fix intel-oneapi-compilers-classic 2025-02-28 23:34:19 +01:00
Massimiliano Culpo
bec6b06c16 Spec.__contains__: traverse only lin/run + direct build 2025-02-28 23:34:19 +01:00
Massimiliano Culpo
27e2e146e2 Exempt "compilers" and "runtimes" from default requirements 2025-02-28 23:34:18 +01:00
Massimiliano Culpo
1ddc0e6b52 Allow self concretization to bootstrap compilers 2025-02-28 23:34:18 +01:00
Massimiliano Culpo
f56aaf1fc3 Add more constraint to providers 2025-02-28 23:34:18 +01:00
Massimiliano Culpo
5b3f4387b3 Fix for duplicate glibc in concretization 2025-02-28 23:34:17 +01:00
Massimiliano Culpo
55196252dd Improve reporting when bootstrapping from source
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-02-28 23:34:17 +01:00
Massimiliano Culpo
d3a7a73a00 Improve error messages for statically checked specs
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-02-28 23:34:16 +01:00
Massimiliano Culpo
21afe2af1f spec: implemented direct satisfy semantic
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-02-28 23:34:16 +01:00
Massimiliano Culpo
646c2f42c4 Fixup binary cache reuse
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-02-28 23:34:15 +01:00
Massimiliano Culpo
1ab3e8c776 Write adaptors for CompilerSpec and Compiler
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-02-28 23:34:15 +01:00
Massimiliano Culpo
49978d5b6c Fix reading Cray manifest files
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-02-28 23:34:14 +01:00
Massimiliano Culpo
a1866d7a4b (WIP) Fix LMod module generation
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-02-28 23:34:14 +01:00
Massimiliano Culpo
6674ce6dc4 (WIP) Remove deprecated argument for Spec.format
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-02-28 23:34:14 +01:00
Massimiliano Culpo
f729353ac3 fixup: spec copies compiler annotation
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-02-28 23:34:13 +01:00
Massimiliano Culpo
73e0cf07cb Restore bootstrapping from binaries
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-02-28 23:34:13 +01:00
Massimiliano Culpo
8842df3f94 Restore bootstrapping from sources
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-02-28 23:34:12 +01:00
Massimiliano Culpo
8d3132b26b spec: change semantic of __getitem__
Now __getitem__ can pick items in the transitive link/run graph,
or from direct build dependencies.

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-02-28 23:34:12 +01:00
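
Based on the commit message, a hedged sketch of the new lookup rule (package names illustrative):

from spack.spec import Spec

root = Spec("hdf5 +mpi %gcc").concretized()
root["mpi"]  # found in the transitive link/run graph
root["gcc"]  # found among the root's direct build dependencies
# build-only dependencies of dependencies are no longer reachable this way
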
Massimiliano Culpo
e342de41b2 spec: bump specfile format to v5
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-02-28 23:34:11 +01:00
Massimiliano Culpo
0415390270 Overhaul of the spack.compilers package
Now the package contains modules that help with using, or
detecting, compiler packages.

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-02-28 23:34:11 +01:00
Massimiliano Culpo
5b7caba4a6 Remove spack.compilers Python modules
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-02-28 23:34:10 +01:00
Massimiliano Culpo
f59c120e0a (WIP) Install mechanism 2025-02-28 23:34:10 +01:00
Massimiliano Culpo
a0cae04302 (WIP) Recover bootstrapping from binaries on linux 2025-02-28 23:34:10 +01:00
Massimiliano Culpo
496ae0bb31 unit-tests: fix concretization and spack compiler tests 2025-02-28 23:34:09 +01:00
Massimiliano Culpo
4c06f83c60 solver: first working implementation of compiler as nodes
This commit changes the model to treat compilers as nodes, and
drops the concept of a "compiler" as a bundle of a C, C++, and
Fortran compiler.

Implementation does not rely on `Compiler` or `CompilerSpec`.
2025-02-28 23:34:09 +01:00
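
In other words, a compiler request now adds an ordinary dependency edge instead of filling a per-node compiler field. A hedged illustration (DAG shape approximate):

from spack.spec import Spec

s = Spec("hdf5 %gcc@12")  # %gcc@12 is parsed as a direct build dependency
# after concretization the DAG looks roughly like:
#   hdf5
#   |-- gcc@12       (build)
#   |-- gcc-runtime  (link)
#   `-- zlib-ng      (link), itself built with gcc@12
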
Massimiliano Culpo
5c66cc71fe builtin.mock et al. : changes to packages
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-02-28 23:34:08 +01:00
Massimiliano Culpo
0b11775529 Deprecate packages:all:compiler and update default configs
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-02-28 23:34:08 +01:00
Massimiliano Culpo
a10f3295bc directives: remove workaround for the c, cxx and fortran language
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-02-28 23:34:07 +01:00
Massimiliano Culpo
285926cb69 Overhaul the spack compiler command
This reverts commit 2c47dddbc1.

Now, `spack compiler` writes to packages.yaml by default. Entries
in the old `compilers.yaml` are converted to external specs as a way to
support legacy configuration.

Since this operation is expensive, an environment variable can be
used to enforce the deprecation of `compilers.yaml`.

The --mixed-toolchain option has been deprecated, since it stops
making sense once compilers are treated as nodes.
2025-02-28 23:34:07 +01:00
Massimiliano Culpo
da02a4a606 Allow reading old JSON files 2025-02-28 23:34:06 +01:00
Massimiliano Culpo
37dd777a51 parse_with_version_concrete: remove compiler= switch 2025-02-28 23:34:06 +01:00
Massimiliano Culpo
7dc824d1ff Make CompilerSpec raise on __init__ 2025-02-28 23:34:06 +01:00
Massimiliano Culpo
78744b11ae parser: parse compilers as direct build deps 2025-02-28 23:34:02 +01:00
550 changed files with 4695 additions and 8442 deletions

@@ -9,7 +9,6 @@ on:
branches:
- develop
- releases/**
merge_group:
concurrency:
group: ci-${{github.ref}}-${{github.event.pull_request.number || github.run_number}}
@@ -26,17 +25,13 @@ jobs:
packages: ${{ steps.filter.outputs.packages }}
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
if: ${{ github.event_name == 'push' || github.event_name == 'merge_group' }}
if: ${{ github.event_name == 'push' }}
with:
fetch-depth: 0
# For pull requests it's not necessary to checkout the code
- uses: dorny/paths-filter@de90cc6fb38fc0963ad72b210f1f284cd68cea36
id: filter
with:
# For merge group events, compare against the target branch (main)
base: ${{ github.event_name == 'merge_group' && github.event.merge_group.base_ref || '' }}
# For merge group events, use the merge group head ref
ref: ${{ github.event_name == 'merge_group' && github.event.merge_group.head_sha || github.ref }}
# See https://github.com/dorny/paths-filter/issues/56 for the syntax used below
# Don't run if we only modified packages in the
# built-in repository or documentation
@@ -81,11 +76,10 @@ jobs:
prechecks:
needs: [ changes ]
uses: ./.github/workflows/prechecks.yml
uses: ./.github/workflows/valid-style.yml
secrets: inherit
with:
with_coverage: ${{ needs.changes.outputs.core }}
with_packages: ${{ needs.changes.outputs.packages }}
import-check:
needs: [ changes ]
@@ -99,7 +93,7 @@ jobs:
- name: Success
run: |
if [ "${{ needs.prechecks.result }}" == "failure" ] || [ "${{ needs.prechecks.result }}" == "canceled" ]; then
echo "Unit tests failed."
exit 1
else
exit 0
@@ -107,7 +101,6 @@ jobs:
coverage:
needs: [ unit-tests, prechecks ]
if: ${{ needs.changes.outputs.core }}
uses: ./.github/workflows/coverage.yml
secrets: inherit
@@ -120,10 +113,10 @@ jobs:
- name: Status summary
run: |
if [ "${{ needs.unit-tests.result }}" == "failure" ] || [ "${{ needs.unit-tests.result }}" == "canceled" ]; then
echo "Unit tests failed."
exit 1
elif [ "${{ needs.bootstrap.result }}" == "failure" ] || [ "${{ needs.bootstrap.result }}" == "canceled" ]; then
echo "Bootstrap tests failed."
exit 1
else
exit 0

@@ -1,7 +1,7 @@
black==25.1.0
clingo==5.7.1
flake8==7.1.2
isort==6.0.1
isort==6.0.0
mypy==1.15.0
types-six==1.17.0.20250304
types-six==1.17.0.20241205
vermin==1.6.0

@@ -1,4 +1,4 @@
name: prechecks
name: style
on:
workflow_call:
@@ -6,9 +6,6 @@ on:
with_coverage:
required: true
type: string
with_packages:
required: true
type: string
concurrency:
group: style-${{github.ref}}-${{github.event.pull_request.number || github.run_number}}
@@ -33,7 +30,6 @@ jobs:
run: vermin --backport importlib --backport argparse --violations --backport typing -t=3.6- -vvv lib/spack/spack/ lib/spack/llnl/ bin/
- name: vermin (Repositories)
run: vermin --backport importlib --backport argparse --violations --backport typing -t=3.6- -vvv var/spack/repos
# Run style checks on the files that have been changed
style:
runs-on: ubuntu-latest
@@ -57,25 +53,12 @@ jobs:
- name: Run style tests
run: |
share/spack/qa/run-style-tests
audit:
uses: ./.github/workflows/audit.yaml
secrets: inherit
with:
with_coverage: ${{ inputs.with_coverage }}
python_version: '3.13'
verify-checksums:
if: ${{ inputs.with_packages == 'true' }}
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 2
- name: Verify Added Checksums
run: |
bin/spack ci verify-versions HEAD^1 HEAD
# Check that spack can bootstrap the development environment on Python 3.6 - RHEL8
bootstrap-dev-rhel8:
runs-on: ubuntu-latest

.gitignore

@@ -201,6 +201,7 @@ tramp
# Org-mode
.org-id-locations
*_archive
# flymake-mode
*_flymake.*

@@ -0,0 +1,2 @@
concretizer:
static_analysis: true

@@ -50,11 +50,8 @@ packages:
- spec: apple-libuuid@1353.100.2
prefix: /Library/Developer/CommandLineTools/SDKs/MacOSX.sdk
c:
prefer:
- apple-clang
require: apple-clang
cxx:
prefer:
- apple-clang
require: apple-clang
fortran:
prefer:
- gcc
require: gcc

@@ -19,14 +19,14 @@ packages:
awk: [gawk]
armci: [armcimpi]
blas: [openblas, amdblis]
c: [gcc, llvm, intel-oneapi-compilers]
cxx: [gcc, llvm, intel-oneapi-compilers]
c: [gcc, llvm, intel-oneapi-compilers, xl, aocc]
cxx: [gcc, llvm, intel-oneapi-compilers, xl, aocc]
D: [ldc]
daal: [intel-oneapi-daal]
elf: [elfutils]
fftw-api: [fftw, amdfftw]
flame: [libflame, amdlibflame]
fortran: [gcc, llvm, intel-oneapi-compilers]
fortran: [gcc, llvm]
fortran-rt: [gcc-runtime, intel-oneapi-runtime]
fuse: [libfuse]
gl: [glx, osmesa]

@@ -14,7 +14,6 @@ case you want to skip directly to specific docs:
* :ref:`compilers.yaml <compiler-config>`
* :ref:`concretizer.yaml <concretizer-options>`
* :ref:`config.yaml <config-yaml>`
* :ref:`include.yaml <include-yaml>`
* :ref:`mirrors.yaml <mirrors>`
* :ref:`modules.yaml <modules>`
* :ref:`packages.yaml <packages-config>`

@@ -457,13 +457,6 @@ developed package in the environment are concretized to match the
version (and other constraints) passed as the spec argument to the
``spack develop`` command.
When working deep in the graph it is often desirable to have multiple specs marked
as ``develop`` so you don't have to restage and/or do full rebuilds each time you
call ``spack install``. The ``--recursive`` flag can be used in these scenarios
to ensure that all the dependents of the initial spec you provide are also marked
as develop specs. The ``--recursive`` flag requires a pre-concretized environment
so the graph can be traversed from the supplied spec all the way to the root specs.
For packages with ``git`` attributes, git branches, tags, and commits can
also be used as valid concrete versions (see :ref:`version-specifier`).
This means that for a package ``foo``, ``spack develop foo@git.main`` will clone
@@ -677,45 +670,24 @@ This configuration sets the default compiler for all packages to
Included configurations
^^^^^^^^^^^^^^^^^^^^^^^
Spack environments allow an ``include`` heading in their yaml schema.
This heading pulls in external configuration files and applies them to
the environment.
.. code-block:: yaml
spack:
include:
- environment/relative/path/to/config.yaml
- relative/path/to/config.yaml
- https://github.com/path/to/raw/config/compilers.yaml
- /absolute/path/to/packages.yaml
- path: /path/to/$os/$target/environment
optional: true
- path: /path/to/os-specific/config-dir
when: os == "ventura"
Included configuration files are required *unless* they are explicitly optional
or the entry's condition evaluates to ``false``. Optional includes are specified
with the ``optional`` clause and conditional with the ``when`` clause. (See
:ref:`include-yaml` for more information on optional and conditional entries.)
Files are listed using paths to individual files or directories containing them.
Path entries may be absolute or relative to the environment or specified as
URLs. URLs to individual files need link to the **raw** form of the file's
contents (e.g., `GitHub
<https://docs.github.com/en/repositories/working-with-files/using-files/viewing-and-understanding-files#viewing-or-copying-the-raw-file-content>`_
or `GitLab
<https://docs.gitlab.com/ee/api/repository_files.html#get-raw-file-from-repository>`_).
Only the ``file``, ``ftp``, ``http`` and ``https`` protocols (or schemes) are
supported. Spack-specific, environment and user path variables can be used.
(See :ref:`config-file-variables` for more information.)
.. warning::
Recursive includes are not currently processed in a breadth-first manner
so the value of a configuration option that is altered by multiple included
files may not be what you expect. This will be addressed in a future
update.
Environments can include files or URLs. File paths can be relative or
absolute. URLs include the path to the text for individual files or
can be the path to a directory containing configuration files.
Spack supports ``file``, ``http``, ``https`` and ``ftp`` protocols (or
schemes). Spack-specific, environment and user path variables may be
used in these paths. See :ref:`config-file-variables` for more information.
^^^^^^^^^^^^^^^^^^^^^^^^
Configuration precedence

@@ -30,7 +30,7 @@ than always choosing the latest versions or default variants.
.. note::
As a rule of thumb: requirements + constraints > strong preferences > reuse > preferences > defaults.
As a rule of thumb: requirements + constraints > reuse > preferences > defaults.
The following set of criteria (from lowest to highest precedence) explain
common cases where concretization output may seem surprising at first.
@@ -56,19 +56,7 @@ common cases where concretization output may seem surprising at first.
concretizer:
reuse: dependencies # other options are 'true' and 'false'
3. :ref:`Strong preferences <package-strong-preferences>` configured in ``packages.yaml``
are higher priority than reuse, and can be used to strongly prefer a specific version
or variant, without erroring out if it's not possible. Strong preferences are specified
as follows:
.. code-block:: yaml
packages:
foo:
prefer:
- "@1.1: ~mpi"
4. :ref:`Package requirements <package-requirements>` configured in ``packages.yaml``,
3. :ref:`Package requirements <package-requirements>` configured in ``packages.yaml``,
and constraints from the command line as well as ``package.py`` files override all
of the above. Requirements are specified as follows:
@@ -78,8 +66,6 @@ common cases where concretization output may seem surprising at first.
foo:
require:
- "@1.2: +mpi"
conflicts:
- "@1.4"
Requirements and constraints restrict the set of possible solutions, while reuse
behavior and preferences influence what an optimal solution looks like.

@@ -1,51 +0,0 @@
.. Copyright Spack Project Developers. See COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)
.. _include-yaml:
===============================
Include Settings (include.yaml)
===============================
Spack allows you to include configuration files through ``include.yaml``.
Using the ``include:`` heading results in pulling in external configuration
information to be used by any Spack command.
Included configuration files are required *unless* they are explicitly optional
or the entry's condition evaluates to ``false``. Optional includes are specified
with the ``optional`` clause and conditional with the ``when`` clause. For
example,
.. code-block:: yaml
include:
- /path/to/a/required/config.yaml
- path: /path/to/$os/$target/config
optional: true
- path: /path/to/os-specific/config-dir
when: os == "ventura"
shows all three. The first entry, ``/path/to/a/required/config.yaml``,
indicates that the included ``config.yaml`` file is required (so must exist).
Use of ``optional: true`` for ``/path/to/$os/$target/config`` means
the path is only included if it exists. The condition ``os == "ventura"``
in the ``when`` clause for ``/path/to/os-specific/config-dir`` means the
path is only included when the operating system (``os``) is ``ventura``.
The same conditions and variables in `Spec List References
<https://spack.readthedocs.io/en/latest/environments.html#spec-list-references>`_
can be used for conditional activation in the ``when`` clauses.
Included files can be specified by path or by their parent directory.
Paths may be absolute, relative (to the configuration file including the path),
or specified as URLs. Only the ``file``, ``ftp``, ``http`` and ``https`` protocols (or
schemes) are supported. Spack-specific, environment and user path variables
can be used. (See :ref:`config-file-variables` for more information.)
.. warning::
Recursive includes are not currently processed in a breadth-first manner
so the value of a configuration option that is altered by multiple included
files may not be what you expect. This will be addressed in a future
update.

@@ -71,7 +71,6 @@ or refer to the full manual below.
configuration
config_yaml
include_yaml
packages_yaml
build_settings
environments

@@ -486,8 +486,6 @@ present. For instance with a configuration like:
you will use ``mvapich2~cuda %gcc`` as an ``mpi`` provider.
.. _package-strong-preferences:
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Conflicts and strong preferences
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

@@ -1,13 +1,13 @@
sphinx==8.2.3
sphinx==8.2.1
sphinxcontrib-programoutput==0.18
sphinx_design==0.6.1
sphinx-rtd-theme==3.0.2
python-levenshtein==0.27.1
python-levenshtein==0.26.1
docutils==0.21.2
pygments==2.19.1
urllib3==2.3.0
pytest==8.3.5
isort==6.0.1
pytest==8.3.4
isort==6.0.0
black==25.1.0
flake8==7.1.2
mypy==1.11.1

@@ -11,7 +11,6 @@
import re
import sys
import traceback
import types
import typing
import warnings
from datetime import datetime, timedelta
@@ -708,24 +707,14 @@ def __init__(self, wrapped_object):
class Singleton:
"""Wrapper for lazily initialized singleton objects."""
"""Simple wrapper for lazily initialized singleton objects."""
def __init__(self, factory: Callable[[], object]):
def __init__(self, factory):
"""Create a new singleton to be inited with the factory function.
Most factories will simply create the object to be initialized and
return it.
In some cases, e.g. when bootstrapping some global state, the singleton
may need to be initialized incrementally. If the factory returns a generator
instead of a regular object, the singleton will assign each result yielded by
the generator to the singleton instance. This allows methods called by
the factory in later stages to refer back to the singleton.
Args:
factory (function): function taking no arguments that creates the
singleton instance.
factory (function): function taking no arguments that
creates the singleton instance.
"""
self.factory = factory
self._instance = None
@@ -733,16 +722,7 @@ def __init__(self, factory: Callable[[], object]):
@property
def instance(self):
if self._instance is None:
instance = self.factory()
if isinstance(instance, types.GeneratorType):
# if it's a generator, assign every value
for value in instance:
self._instance = value
else:
# if not, just assign the result like a normal singleton
self._instance = instance
self._instance = self.factory()
return self._instance
def __getattr__(self, name):
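
For context, the generator-based initialization removed above let a factory yield intermediate values, so code run between yields could already refer back to the singleton. A self-contained sketch of that pattern, mirroring the removed code:

import types
from typing import Callable

class Singleton:
    def __init__(self, factory: Callable[[], object]):
        self.factory = factory
        self._instance = None

    @property
    def instance(self):
        if self._instance is None:
            instance = self.factory()
            if isinstance(instance, types.GeneratorType):
                # assign every yielded value, so later stages of the
                # factory can see a partially initialized singleton
                for value in instance:
                    self._instance = value
            else:
                self._instance = instance
        return self._instance

def bootstrap():
    state = {"ready": False}
    yield state           # first stage: the singleton becomes visible
    state["ready"] = True
    yield state           # final value

GLOBAL = Singleton(bootstrap)
assert GLOBAL.instance["ready"] is True
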

@@ -10,21 +10,9 @@
import spack.util.git
#: PEP440 canonical <major>.<minor>.<micro>.<devN> string
__version__ = "1.0.0.dev0"
__version__ = "1.0.0-alpha.4"
spack_version = __version__
#: The current Package API version implemented by this version of Spack. The Package API defines
#: the Python interface for packages as well as the layout of package repositories. The minor
#: version is incremented when the package API is extended in a backwards-compatible way. The major
#: version is incremented upon breaking changes. This version is changed independently from the
#: Spack version.
package_api_version = (1, 0)
#: The minimum Package API version that this version of Spack is compatible with. This should
#: always be a tuple of the form ``(major, 0)``, since compatibility with vX.Y implies
#: compatibility with vX.0.
min_package_api_version = (1, 0)
def __try_int(v):
try:
@@ -91,6 +79,4 @@ def get_short_version() -> str:
"get_version",
"get_spack_commit",
"get_short_version",
"package_api_version",
"min_package_api_version",
]

@@ -1,20 +0,0 @@
# Copyright Spack Project Developers. See COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Alias names to convert legacy compilers to builtin packages and vice-versa"""
BUILTIN_TO_LEGACY_COMPILER = {
"llvm": "clang",
"intel-oneapi-compilers": "oneapi",
"llvm-amdgpu": "rocmcc",
"intel-oneapi-compiler-classic": "intel",
"acfl": "arm",
}
LEGACY_COMPILER_TO_BUILTIN = {
"clang": "llvm",
"oneapi": "intel-oneapi-compilers",
"rocmcc": "llvm-amdgpu",
"intel": "intel-oneapi-compiler-classic",
"arm": "acfl",
}

@@ -636,7 +636,14 @@ def tarball_directory_name(spec):
Return name of the tarball directory according to the convention
<os>-<architecture>/<compiler>/<package>-<version>/
"""
return spec.format_path("{architecture}/{compiler.name}-{compiler.version}/{name}-{version}")
if spec.original_spec_format() < 5:
compiler = spec.annotations.compiler_node_attribute
assert compiler is not None, "a compiler spec is expected"
return spec.format_path(
f"{spec.architecture}/{compiler.name}-{compiler.version}/{spec.name}-{spec.version}"
)
return spec.format_path(f"{spec.architecture.platform}/{spec.name}-{spec.version}")
def tarball_name(spec, ext):
@@ -644,9 +651,17 @@ def tarball_name(spec, ext):
Return the name of the tarfile according to the convention
<os>-<architecture>-<package>-<dag_hash><ext>
"""
spec_formatted = spec.format_path(
"{architecture}-{compiler.name}-{compiler.version}-{name}-{version}-{hash}"
)
if spec.original_spec_format() < 5:
compiler = spec.annotations.compiler_node_attribute
assert compiler is not None, "a compiler spec is expected"
spec_formatted = (
f"{spec.architecture}-{compiler.name}-{compiler.version}-{spec.name}"
f"-{spec.version}-{spec.dag_hash()}"
)
else:
spec_formatted = (
f"{spec.architecture.platform}-{spec.name}-{spec.version}-{spec.dag_hash()}"
)
return f"{spec_formatted}{ext}"
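
A worked example of the two layouts (values illustrative, not taken from a real build cache):

# specfile v4 and older -- the compiler annotation is part of the path:
#   tarball_directory_name: linux-ubuntu22.04-x86_64/gcc-12.3.0/zlib-1.3
#   tarball_name:           linux-ubuntu22.04-x86_64-gcc-12.3.0-zlib-1.3-<dag_hash><ext>
# specfile v5 -- the compiler is a DAG node, so only the platform remains:
#   tarball_directory_name: linux/zlib-1.3
#   tarball_name:           linux-zlib-1.3-<dag_hash><ext>
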

@@ -234,6 +234,10 @@ def _root_spec(spec_str: str) -> str:
# Add a compiler and platform requirement to the root spec.
platform = str(spack.platforms.host())
if platform == "windows":
spec_str += " %msvc"
elif platform == "freebsd":
spec_str += " %clang"
spec_str += f" platform={platform}"
target = archspec.cpu.host().family
spec_str += f" target={target}"

@@ -113,7 +113,7 @@
# set_wrapper_variables and used to pass parameters to
# Spack's compiler wrappers.
#
SPACK_COMPILER_WRAPPER_PATH = "SPACK_COMPILER_WRAPPER_PATH"
SPACK_ENV_PATH = "SPACK_ENV_PATH"
SPACK_MANAGED_DIRS = "SPACK_MANAGED_DIRS"
SPACK_INCLUDE_DIRS = "SPACK_INCLUDE_DIRS"
SPACK_LINK_DIRS = "SPACK_LINK_DIRS"
@@ -715,6 +715,21 @@ def get_rpath_deps(pkg: spack.package_base.PackageBase) -> List[spack.spec.Spec]
return _get_rpath_deps_from_spec(pkg.spec, pkg.transitive_rpaths)
def load_external_modules(pkg):
"""Traverse a package's spec DAG and load any external modules.
Traverse a package's dependencies and load any external modules
associated with them.
Args:
pkg (spack.package_base.PackageBase): package to load deps for
"""
for dep in list(pkg.spec.traverse()):
external_modules = dep.external_modules or []
for external_module in external_modules:
load_module(external_module)
def setup_package(pkg, dirty, context: Context = Context.BUILD):
"""Execute all environment setup routines."""
if context not in (Context.BUILD, Context.TEST):
@@ -748,10 +763,8 @@ def setup_package(pkg, dirty, context: Context = Context.BUILD):
tty.debug("setup_package: adding compiler wrappers paths")
env_by_name = env_mods.group_by_name()
for x in env_by_name["SPACK_COMPILER_WRAPPER_PATH"]:
assert isinstance(
x, PrependPath
), "unexpected setting used for SPACK_COMPILER_WRAPPER_PATH"
for x in env_by_name["SPACK_ENV_PATH"]:
assert isinstance(x, PrependPath), "unexpected setting used for SPACK_ENV_PATH"
env_mods.prepend_path("PATH", x.value)
# Check whether we want to force RPATH or RUNPATH
@@ -779,7 +792,7 @@ def setup_package(pkg, dirty, context: Context = Context.BUILD):
# Load modules on an already clean environment, just before applying Spack's
# own environment modifications. This ensures Spack controls CC/CXX/... variables.
load_external_modules(setup_context)
load_external_modules(pkg)
# Make sure nothing's strange about the Spack environment.
validate(env_mods, tty.warn)
@@ -1076,21 +1089,6 @@ def _make_runnable(self, dep: spack.spec.Spec, env: EnvironmentModifications):
env.prepend_path("PATH", bin_dir)
def load_external_modules(context: SetupContext) -> None:
"""Traverse a package's spec DAG and load any external modules.
Traverse a package's dependencies and load any external modules
associated with them.
Args:
context: A populated SetupContext object
"""
for spec, _ in context.external:
external_modules = spec.external_modules or []
for external_module in external_modules:
load_module(external_module)
def _setup_pkg_and_run(
serialized_pkg: "spack.subprocess_context.PackageInstallContext",
function: Callable,

@@ -276,24 +276,17 @@ def initconfig_hardware_entries(self):
entries.append("# ROCm")
entries.append("#------------------{0}\n".format("-" * 30))
if spec.satisfies("^blt@0.7:"):
rocm_root = os.path.dirname(spec["llvm-amdgpu"].prefix)
entries.append(cmake_cache_path("ROCM_PATH", rocm_root))
else:
# Explicitly setting HIP_ROOT_DIR may be a patch that is no longer necessary
entries.append(cmake_cache_path("HIP_ROOT_DIR", "{0}".format(spec["hip"].prefix)))
llvm_bin = spec["llvm-amdgpu"].prefix.bin
llvm_prefix = spec["llvm-amdgpu"].prefix
# Some ROCm systems seem to point to /<path>/rocm-<ver>/ and
# others point to /<path>/rocm-<ver>/llvm
if os.path.basename(os.path.normpath(llvm_prefix)) != "llvm":
llvm_bin = os.path.join(llvm_prefix, "llvm/bin/")
entries.append(
cmake_cache_filepath(
"CMAKE_HIP_COMPILER", os.path.join(llvm_bin, "amdclang++")
)
)
# Explicitly setting HIP_ROOT_DIR may be a patch that is no longer necessary
entries.append(cmake_cache_path("HIP_ROOT_DIR", "{0}".format(spec["hip"].prefix)))
llvm_bin = spec["llvm-amdgpu"].prefix.bin
llvm_prefix = spec["llvm-amdgpu"].prefix
# Some ROCm systems seem to point to /<path>/rocm-<ver>/ and
# others point to /<path>/rocm-<ver>/llvm
if os.path.basename(os.path.normpath(llvm_prefix)) != "llvm":
llvm_bin = os.path.join(llvm_prefix, "llvm/bin/")
entries.append(
cmake_cache_filepath("CMAKE_HIP_COMPILER", os.path.join(llvm_bin, "clang++"))
)
archs = self.spec.variants["amdgpu_target"].value
if archs[0] != "none":
arch_str = ";".join(archs)

@@ -45,7 +45,7 @@ class CompilerPackage(spack.package_base.PackageBase):
compiler_languages: Sequence[str] = ["c", "cxx", "fortran"]
#: Relative path to compiler wrappers
compiler_wrapper_link_paths: Dict[str, str] = {}
link_paths: Dict[str, str] = {}
def __init__(self, spec: "spack.spec.Spec"):
super().__init__(spec)
@@ -159,7 +159,7 @@ def determine_variants(cls, exes: Sequence[Path], version_str: str) -> Tuple:
#: Flag to activate OpenMP support
openmp_flag: str = "-fopenmp"
implicit_rpath_libs: List[str] = []
required_libs: List[str] = []
def standard_flag(self, *, language: str, standard: str) -> str:
"""Returns the flag used to enforce a given standard for a language"""

@@ -6,7 +6,6 @@
import codecs
import json
import os
import pathlib
import re
import shutil
import stat
@@ -14,7 +13,7 @@
import tempfile
import zipfile
from collections import namedtuple
from typing import Callable, Dict, List, Set, Union
from typing import Callable, Dict, List, Set
from urllib.request import Request
import llnl.path
@@ -24,6 +23,7 @@
import spack
import spack.binary_distribution as bindist
import spack.builder
import spack.config as cfg
import spack.environment as ev
import spack.error
@@ -32,7 +32,6 @@
import spack.paths
import spack.repo
import spack.spec
import spack.store
import spack.util.git
import spack.util.gpg as gpg_util
import spack.util.spack_yaml as syaml
@@ -41,7 +40,6 @@
from spack import traverse
from spack.error import SpackError
from spack.reporters.cdash import SPACK_CDASH_TIMEOUT
from spack.version import GitVersion, StandardVersion
from .common import (
IS_WINDOWS,
@@ -80,45 +78,6 @@ def get_change_revisions():
return None, None
def get_added_versions(
checksums_version_dict: Dict[str, Union[StandardVersion, GitVersion]],
path: str,
from_ref: str = "HEAD~1",
to_ref: str = "HEAD",
) -> List[Union[StandardVersion, GitVersion]]:
"""Get a list of the versions added between `from_ref` and `to_ref`.
Args:
checksums_version_dict (Dict): all package versions keyed by known checksums.
path (str): path to the package.py
from_ref (str): oldest git ref, defaults to `HEAD~1`
to_ref (str): newer git ref, defaults to `HEAD`
Returns: list of versions added between refs
"""
git_exe = spack.util.git.git(required=True)
# Gather git diff
diff_lines = git_exe("diff", from_ref, to_ref, "--", path, output=str).split("\n")
# Store added and removed versions
# Removed versions are tracked here to determine when versions are moved in a file
# and show up as both added and removed in a git diff.
added_checksums = set()
removed_checksums = set()
# Scrape diff for modified versions and prune added versions if they show up
# as also removed (which means they've actually just moved in the file and
# we shouldn't need to rechecksum them)
for checksum in checksums_version_dict.keys():
for line in diff_lines:
if checksum in line:
if line.startswith("+"):
added_checksums.add(checksum)
if line.startswith("-"):
removed_checksums.add(checksum)
return [checksums_version_dict[c] for c in added_checksums - removed_checksums]
def get_stack_changed(env_path, rev1="HEAD^", rev2="HEAD"):
"""Given an environment manifest path and two revisions to compare, return
whether or not the stack was changed. Returns True if the environment
@@ -264,7 +223,7 @@ def rebuild_filter(s: spack.spec.Spec) -> RebuildDecision:
def _format_pruning_message(spec: spack.spec.Spec, prune: bool, reasons: List[str]) -> str:
reason_msg = ", ".join(reasons)
spec_fmt = "{name}{@version}{/hash:7}{%compiler}"
spec_fmt = "{name}{@version}{%compiler}{/hash:7}"
if not prune:
status = colorize("@*g{[x]} ")
@@ -620,25 +579,22 @@ def copy_stage_logs_to_artifacts(job_spec: spack.spec.Spec, job_log_dir: str) ->
tty.debug(f"job spec: {job_spec}")
try:
package_metadata_root = pathlib.Path(spack.store.STORE.layout.metadata_path(job_spec))
except spack.error.SpackError as e:
tty.error(f"Cannot copy logs: {str(e)}")
pkg_cls = spack.repo.PATH.get_pkg_class(job_spec.name)
job_pkg = pkg_cls(job_spec)
tty.debug(f"job package: {job_pkg}")
except AssertionError:
msg = f"Cannot copy stage logs: job spec ({job_spec}) must be concrete"
tty.error(msg)
return
# Get the package's archived files
archive_files = []
archive_root = package_metadata_root / "archived-files"
if archive_root.is_dir():
archive_files = [f for f in archive_root.rglob("*") if f.is_file()]
else:
msg = "Cannot copy package archived files: archived-files must be a directory"
tty.warn(msg)
build_log_zipped = package_metadata_root / "spack-build-out.txt.gz"
build_env_mods = package_metadata_root / "spack-build-env.txt"
for f in [build_log_zipped, build_env_mods, *archive_files]:
copy_files_to_artifacts(str(f), job_log_dir)
stage_dir = job_pkg.stage.path
tty.debug(f"stage dir: {stage_dir}")
for file in [
job_pkg.log_path,
job_pkg.env_mods_path,
*spack.builder.create(job_pkg).archive_files,
]:
copy_files_to_artifacts(file, job_log_dir)
def copy_test_logs_to_artifacts(test_stage, job_test_dir):

@@ -330,7 +330,7 @@ def ensure_single_spec_or_die(spec, matching_specs):
if len(matching_specs) <= 1:
return
format_string = "{name}{@version}{ arch=architecture} {%compiler.name}{@compiler.version}"
format_string = "{name}{@version}{%compiler.name}{@compiler.version}{ arch=architecture}"
args = ["%s matches multiple packages." % spec, "Matching packages:"]
args += [
colorize(" @K{%s} " % s.dag_hash(7)) + s.cformat(format_string) for s in matching_specs
@@ -477,7 +477,7 @@ def get_arg(name, default=None):
if flags:
ffmt += " {compiler_flags}"
vfmt = "{variants}" if variants else ""
format_string = nfmt + "{@version}" + vfmt + ffmt
format_string = nfmt + "{@version}" + ffmt + vfmt
def fmt(s, depth=0):
"""Formatter function for all output specs"""

@@ -4,15 +4,12 @@
import json
import os
import re
import shutil
import sys
from typing import Dict
from urllib.parse import urlparse, urlunparse
import llnl.util.filesystem as fs
import llnl.util.tty as tty
import llnl.util.tty.color as clr
from llnl.util import tty
import spack.binary_distribution as bindist
import spack.ci as spack_ci
@@ -21,22 +18,12 @@
import spack.cmd.common.arguments
import spack.config as cfg
import spack.environment as ev
import spack.error
import spack.fetch_strategy
import spack.hash_types as ht
import spack.mirrors.mirror
import spack.package_base
import spack.paths
import spack.repo
import spack.spec
import spack.stage
import spack.util.executable
import spack.util.git
import spack.util.gpg as gpg_util
import spack.util.timer as timer
import spack.util.url as url_util
import spack.util.web as web_util
import spack.version
description = "manage continuous integration pipelines"
section = "build"
@@ -45,7 +32,6 @@
SPACK_COMMAND = "spack"
INSTALL_FAIL_CODE = 1
FAILED_CREATE_BUILDCACHE_CODE = 100
BUILTIN = re.compile(r"var\/spack\/repos\/builtin\/packages\/([^\/]+)\/package\.py")
def deindent(desc):
@@ -205,16 +191,6 @@ def setup_parser(subparser):
reproduce.set_defaults(func=ci_reproduce)
# Verify checksums inside of ci workflows
verify_versions = subparsers.add_parser(
"verify-versions",
description=deindent(ci_verify_versions.__doc__),
help=spack.cmd.first_line(ci_verify_versions.__doc__),
)
verify_versions.add_argument("from_ref", help="git ref from which start looking at changes")
verify_versions.add_argument("to_ref", help="git ref to end looking at changes")
verify_versions.set_defaults(func=ci_verify_versions)
def ci_generate(args):
"""generate jobs file from a CI-aware spack file
@@ -451,7 +427,7 @@ def ci_rebuild(args):
# Arguments when installing the root from sources
deps_install_args = install_args + ["--only=dependencies"]
root_install_args = install_args + ["--only=package"]
root_install_args = install_args + ["--keep-stage", "--only=package"]
if cdash_handler:
# Add additional arguments to `spack install` for CDash reporting.
@@ -488,7 +464,8 @@ def ci_rebuild(args):
job_spec.to_dict(hash=ht.dag_hash),
)
# Copy logs and archived files from the install metadata (.spack) directory to artifacts now
# We generated the "spack install ..." command to "--keep-stage", copy
# any logs from the staging directory to artifacts now
spack_ci.copy_stage_logs_to_artifacts(job_spec, job_log_dir)
# If the installation succeeded and we're running stand-alone tests for
@@ -683,159 +660,6 @@ def _gitlab_artifacts_url(url: str) -> str:
return urlunparse(parsed._replace(path="/".join(parts), fragment="", query=""))
def validate_standard_versions(
pkg: spack.package_base.PackageBase, versions: spack.version.VersionList
) -> bool:
"""Get and test the checksum of a package version based on a tarball.
Args:
pkg spack.package_base.PackageBase: Spack package for which to validate a version checksum
versions spack.version.VersionList: list of package versions to validate
Returns: bool: result of the validation. True is valid and false is failed.
"""
url_dict: Dict[spack.version.StandardVersion, str] = {}
for version in versions:
url = pkg.find_valid_url_for_version(version)
url_dict[version] = url
version_hashes = spack.stage.get_checksums_for_versions(
url_dict, pkg.name, fetch_options=pkg.fetch_options
)
valid_checksums = True
for version, sha in version_hashes.items():
if sha != pkg.versions[version]["sha256"]:
tty.error(
f"Invalid checksum found {pkg.name}@{version}\n"
f" [package.py] {pkg.versions[version]['sha256']}\n"
f" [Downloaded] {sha}"
)
valid_checksums = False
continue
tty.info(f"Validated {pkg.name}@{version} --> {sha}")
return valid_checksums
def validate_git_versions(
pkg: spack.package_base.PackageBase, versions: spack.version.VersionList
) -> bool:
"""Get and test the commit and tag of a package version based on a git repository.
Args:
pkg spack.package_base.PackageBase: Spack package for which to validate a version
versions spack.version.VersionList: list of package versions to validate
Returns: bool: result of the validation. True is valid and false is failed.
"""
valid_commit = True
for version in versions:
fetcher = spack.fetch_strategy.for_package_version(pkg, version)
with spack.stage.Stage(fetcher) as stage:
known_commit = pkg.versions[version]["commit"]
try:
stage.fetch()
except spack.error.FetchError:
tty.error(
f"Invalid commit for {pkg.name}@{version}\n"
f" {known_commit} could not be checked out in the git repository."
)
valid_commit = False
continue
# Test if the specified tag matches the commit in the package.py
# We retrieve the commit associated with a tag and compare it to the
# commit that is located in the package.py file.
if "tag" in pkg.versions[version]:
tag = pkg.versions[version]["tag"]
try:
with fs.working_dir(stage.source_path):
found_commit = fetcher.git(
"rev-list", "-n", "1", tag, output=str, error=str
).strip()
except spack.util.executable.ProcessError:
tty.error(
f"Invalid tag for {pkg.name}@{version}\n"
f" {tag} could not be found in the git repository."
)
valid_commit = False
continue
if found_commit != known_commit:
tty.error(
f"Mismatched tag <-> commit found for {pkg.name}@{version}\n"
f" [package.py] {known_commit}\n"
f" [Downloaded] {found_commit}"
)
valid_commit = False
continue
# If we have downloaded the repository, found the commit, and compared
# the tag (if specified) we can conclude that the version is pointing
# at what we would expect.
tty.info(f"Validated {pkg.name}@{version} --> {known_commit}")
return valid_commit
def ci_verify_versions(args):
"""validate version checksum & commits between git refs
This command takes from_ref and to_ref arguments, then parses the git
diff between the two to determine which packages have been modified,
and verifies the new checksums inside of them.
"""
with fs.working_dir(spack.paths.prefix):
# We use HEAD^1 explicitly on the merge commit created by
# GitHub Actions. However HEAD~1 is a safer default for the helper function.
files = spack.util.git.get_modified_files(from_ref=args.from_ref, to_ref=args.to_ref)
# Get a list of package names from the modified files.
pkgs = [(m.group(1), p) for p in files for m in [BUILTIN.search(p)] if m]
failed_version = False
for pkg_name, path in pkgs:
spec = spack.spec.Spec(pkg_name)
pkg = spack.repo.PATH.get_pkg_class(spec.name)(spec)
# Skip checking manual download packages and trust the maintainers
if pkg.manual_download:
tty.warn(f"Skipping manual download package: {pkg_name}")
continue
# Store versions checksums / commits for future loop
checksums_version_dict = {}
commits_version_dict = {}
for version in pkg.versions:
# If the package version defines a sha256 we'll use that as the high entropy
# string to detect which versions have been added between from_ref and to_ref
if "sha256" in pkg.versions[version]:
checksums_version_dict[pkg.versions[version]["sha256"]] = version
# If a package version instead defines a commit we'll use that as a
# high entropy string to detect new versions.
elif "commit" in pkg.versions[version]:
commits_version_dict[pkg.versions[version]["commit"]] = version
# TODO: enforce that every version has a commit or a sha256 defined if it is not
# an infinite version (there are a lot of packages where this doesn't work yet.)
with fs.working_dir(spack.paths.prefix):
added_checksums = spack_ci.get_added_versions(
checksums_version_dict, path, from_ref=args.from_ref, to_ref=args.to_ref
)
added_commits = spack_ci.get_added_versions(
commits_version_dict, path, from_ref=args.from_ref, to_ref=args.to_ref
)
if added_checksums:
failed_version = not validate_standard_versions(pkg, added_checksums) or failed_version
if added_commits:
failed_version = not validate_git_versions(pkg, added_commits) or failed_version
if failed_version:
sys.exit(1)
def ci(parser, args):
if args.func:
return args.func(args)

@@ -350,12 +350,9 @@ def _config_change(config_path, match_spec_str=None):
if spack.config.get(key_path, scope=scope):
ideal_scope_to_modify = scope
break
# If we find our key in a specific scope, that's the one we want
# to modify. Otherwise we use the default write scope.
write_scope = ideal_scope_to_modify or spack.config.default_modify_scope()
update_path = f"{key_path}:[{str(spec)}]"
spack.config.add(update_path, scope=write_scope)
spack.config.add(update_path, scope=ideal_scope_to_modify)
else:
raise ValueError("'config change' can currently only change 'require' sections")

@@ -55,7 +55,7 @@ def dependencies(parser, args):
env = ev.active_environment()
spec = spack.cmd.disambiguate_spec(specs[0], env)
format_string = "{name}{@version}{/hash:7}{%compiler}"
format_string = "{name}{@version}{%compiler}{/hash:7}"
if sys.stdout.isatty():
tty.msg("Dependencies of %s" % spec.format(format_string, color=True))
deps = spack.store.STORE.db.installed_relatives(

@@ -93,7 +93,7 @@ def dependents(parser, args):
env = ev.active_environment()
spec = spack.cmd.disambiguate_spec(specs[0], env)
format_string = "{name}{@version}{/hash:7}{%compiler}"
format_string = "{name}{@version}{%compiler}{/hash:7}"
if sys.stdout.isatty():
tty.msg("Dependents of %s" % spec.cformat(format_string))
deps = spack.store.STORE.db.installed_relatives(spec, "parents", args.transitive)

@@ -3,13 +3,11 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import shutil
from typing import Optional
import llnl.util.tty as tty
import spack.cmd
import spack.config
import spack.environment
import spack.fetch_strategy
import spack.repo
import spack.spec
@@ -33,33 +31,37 @@ def setup_parser(subparser):
"--no-clone",
action="store_false",
dest="clone",
default=None,
help="do not clone, the package already exists at the source path",
)
clone_group.add_argument(
"--clone",
action="store_true",
dest="clone",
default=True,
help=(
"(default) clone the package unless the path already exists, "
"use --force to overwrite"
),
default=None,
help="clone the package even if the path already exists",
)
subparser.add_argument(
"-f", "--force", help="remove any files or directories that block cloning source code"
)
subparser.add_argument(
"-r",
"--recursive",
action="store_true",
help="traverse nodes of the graph to mark everything up to the root as a develop spec",
)
arguments.add_common_arguments(subparser, ["spec"])
def _update_config(spec, path):
find_fn = lambda section: spec.name in section
entry = {"spec": str(spec)}
if path != spec.name:
entry["path"] = path
def change_fn(section):
section[spec.name] = entry
spack.config.change_or_add("develop", find_fn, change_fn)
def _retrieve_develop_source(spec: spack.spec.Spec, abspath: str) -> None:
# "steal" the source code via staging API. We ask for a stage
# to be created, then copy it afterwards somewhere else. It would be
@@ -81,43 +83,44 @@ def _retrieve_develop_source(spec: spack.spec.Spec, abspath: str) -> None:
package.stage.steal_source(abspath)
def assure_concrete_spec(env: spack.environment.Environment, spec: spack.spec.Spec):
version = spec.versions.concrete_range_as_version
if not version:
# first check environment for a matching concrete spec
matching_specs = env.all_matching_specs(spec)
if matching_specs:
version = matching_specs[0].version
test_spec = spack.spec.Spec(f"{spec}@{version}")
for m_spec in matching_specs:
if not m_spec.satisfies(test_spec):
raise SpackError(
f"{spec.name}: has multiple concrete instances in the graph that can't be"
" satisfied by a single develop spec. To use `spack develop` ensure one"
" of the following:"
f"\n a) {spec.name} nodes can satisfy the same develop spec (minimally "
"this means they all share the same version)"
f"\n b) Provide a concrete develop spec ({spec.name}@[version]) to clearly"
" indicate what should be developed"
)
else:
# look up the maximum version so infinity versions are preferred for develop
version = max(spec.package_class.versions.keys())
tty.msg(f"Defaulting to highest version: {spec.name}@{version}")
spec.versions = spack.version.VersionList([version])
def develop(parser, args):
# Note: we could put develop specs in any scope, but I assume
# users would only ever want to do this for either (a) an active
# env or (b) a specified config file (e.g. that is included by
# an environment)
# TODO: when https://github.com/spack/spack/pull/35307 is merged,
# an active env is not required if a scope is specified
env = spack.cmd.require_active_env(cmd_name="develop")
if not args.spec:
if args.clone is False:
raise SpackError("No spec provided to spack develop command")
# download all dev specs
for name, entry in env.dev_specs.items():
path = entry.get("path", name)
abspath = spack.util.path.canonicalize_path(path, default_wd=env.path)
def setup_src_code(spec: spack.spec.Spec, src_path: str, clone: bool = True, force: bool = False):
"""
Handle checking, cloning or overwriting source code
"""
assert spec.versions
if os.path.exists(abspath):
msg = "Skipping developer download of %s" % entry["spec"]
msg += " because its path already exists."
tty.msg(msg)
continue
if clone:
_clone(spec, src_path, force)
# Both old syntax `spack develop pkg@x` and new syntax `spack develop pkg@=x`
# are currently supported.
spec = spack.spec.parse_with_version_concrete(entry["spec"])
_retrieve_develop_source(spec, abspath)
if not clone and not os.path.exists(src_path):
raise SpackError(f"Provided path {src_path} does not exist")
if not env.dev_specs:
tty.warn("No develop specs to download")
return
specs = spack.cmd.parse_specs(args.spec)
if len(specs) > 1:
raise SpackError("spack develop requires at most one named spec")
spec = specs[0]
version = spec.versions.concrete_range_as_version
if not version:
@@ -126,114 +129,40 @@ def setup_src_code(spec: spack.spec.Spec, src_path: str, clone: bool = True, for
tty.msg(f"Defaulting to highest version: {spec.name}@{version}")
spec.versions = spack.version.VersionList([version])
# If user does not specify --path, we choose to create a directory in the
# active environment's directory, named after the spec
path = args.path or spec.name
if not os.path.isabs(path):
abspath = spack.util.path.canonicalize_path(path, default_wd=env.path)
else:
abspath = path
def _update_config(spec, path):
find_fn = lambda section: spec.name in section
# clone default: only if the path doesn't exist
clone = args.clone
if clone is None:
clone = not os.path.exists(abspath)
entry = {"spec": str(spec)}
if path and path != spec.name:
entry["path"] = path
if not clone and not os.path.exists(abspath):
raise SpackError("Provided path %s does not exist" % abspath)
def change_fn(section):
section[spec.name] = entry
if clone:
if os.path.exists(abspath):
if args.force:
shutil.rmtree(abspath)
else:
msg = "Path %s already exists and cannot be cloned to." % abspath
msg += " Use `spack develop -f` to overwrite."
raise SpackError(msg)
spack.config.change_or_add("develop", find_fn, change_fn)
def update_env(
env: spack.environment.Environment,
spec: spack.spec.Spec,
specified_path: Optional[str] = None,
build_dir: Optional[str] = None,
):
"""
Update the spack.yaml file with additions or changes from a develop call
"""
tty.debug(f"Updating develop config for {env.name} transactionally")
if not specified_path:
dev_entry = env.dev_specs.get(spec.name)
if dev_entry:
specified_path = dev_entry.get("path", None)
_retrieve_develop_source(spec, abspath)
tty.debug("Updating develop config for {0} transactionally".format(env.name))
with env.write_transaction():
if build_dir is not None:
if args.build_directory is not None:
spack.config.add(
f"packages:{spec.name}:package_attributes:build_directory:{build_dir}",
"packages:{}:package_attributes:build_directory:{}".format(
spec.name, args.build_directory
),
env.scope_name,
)
# add develop spec and update path
_update_config(spec, specified_path)
def _clone(spec: spack.spec.Spec, abspath: str, force: bool = False):
if os.path.exists(abspath):
if force:
shutil.rmtree(abspath)
else:
msg = f"Skipping developer download of {spec.name}"
msg += f" because its path {abspath} already exists."
tty.msg(msg)
return
# cloning can take a while and it's nice to get a message for the longer clones
tty.msg(f"Cloning source code for {spec}")
_retrieve_develop_source(spec, abspath)
def _abs_code_path(
env: spack.environment.Environment, spec: spack.spec.Spec, path: Optional[str] = None
):
src_path = path if path else spec.name
return spack.util.path.canonicalize_path(src_path, default_wd=env.path)
def _dev_spec_generator(args, env):
"""
Generator function to loop over all the develop specs based on how the command is called.
If no specs are supplied then loop over the develop specs listed in the environment.
"""
if not args.spec:
if args.clone is False:
raise SpackError("No spec provided to spack develop command")
for name, entry in env.dev_specs.items():
path = entry.get("path", name)
abspath = spack.util.path.canonicalize_path(path, default_wd=env.path)
# Both old syntax `spack develop pkg@x` and new syntax `spack develop pkg@=x`
# are currently supported.
spec = spack.spec.parse_with_version_concrete(entry["spec"])
yield spec, abspath
else:
specs = spack.cmd.parse_specs(args.spec)
if (args.path or args.build_directory) and len(specs) > 1:
raise SpackError(
"spack develop requires at most one named spec when using the --path or"
" --build-directory arguments"
)
for spec in specs:
if args.recursive:
concrete_specs = env.all_matching_specs(spec)
if not concrete_specs:
tty.warn(
f"{spec.name} has no matching concrete specs in the environment and "
"will be skipped. `spack develop --recursive` requires a concretized"
" environment"
)
else:
for s in concrete_specs:
for node_spec in s.traverse(direction="parents", root=True):
tty.debug(f"Recursive develop for {node_spec.name}")
yield node_spec, _abs_code_path(env, node_spec, args.path)
else:
yield spec, _abs_code_path(env, spec, args.path)
def develop(parser, args):
env = spack.cmd.require_active_env(cmd_name="develop")
for spec, abspath in _dev_spec_generator(args, env):
assure_concrete_spec(env, spec)
setup_src_code(spec, abspath, clone=args.clone, force=args.force)
update_env(env, spec, args.path, args.build_directory)
_update_config(spec, path)

View File

@@ -73,7 +73,7 @@
boxlib @B{dim=2} boxlib built for 2 dimensions
libdwarf @g{%intel} ^libelf@g{%gcc}
libdwarf, built with intel compiler, linked to libelf built with gcc
mvapich2 @B{fabrics=psm,mrail,sock} @g{%gcc}
mvapich2 @g{%gcc} @B{fabrics=psm,mrail,sock}
mvapich2, built with gcc compiler, with support for multiple fabrics
"""

View File

@@ -383,10 +383,8 @@ def modules_cmd(parser, args, module_type, callbacks=callbacks):
query = " ".join(str(s) for s in args.constraint_specs)
msg = f"the constraint '{query}' matches multiple packages:\n"
for s in specs:
spec_fmt = (
"{hash:7} {name}{@version}{compiler_flags}{variants}"
"{arch=architecture} {%compiler}"
)
spec_fmt = "{hash:7} {name}{@version}{%compiler}"
spec_fmt += "{compiler_flags}{variants}{arch=architecture}"
msg += "\t" + s.cformat(spec_fmt) + "\n"
tty.die(msg, "In this context exactly *one* match is needed.")

View File

@@ -1,12 +1,7 @@
# Copyright Spack Project Developers. See COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import shutil
from llnl.util import tty
import spack.database
import spack.store
description = "rebuild Spack's package database"
@@ -15,11 +10,4 @@
def reindex(parser, args):
current_index = spack.store.STORE.db._index_path
if os.path.isfile(current_index):
backup = f"{current_index}.bkp"
shutil.copy(current_index, backup)
tty.msg(f"Created a back-up copy of the DB at {backup}")
spack.store.STORE.reindex()
tty.msg(f"The DB at {current_index} has been reindex to v{spack.database._DB_VERSION}")

View File

@@ -6,9 +6,8 @@
import os
import re
import sys
import warnings
from itertools import islice, zip_longest
from typing import Callable, Dict, List, Optional
from typing import Dict, List, Optional
import llnl.util.tty as tty
import llnl.util.tty.color as color
@@ -17,9 +16,6 @@
import spack.paths
import spack.repo
import spack.util.git
import spack.util.spack_yaml
from spack.spec_parser import SPEC_TOKENIZER, SpecTokens
from spack.tokenize import Token
from spack.util.executable import Executable, which
description = "runs source code style checks on spack"
@@ -202,13 +198,6 @@ def setup_parser(subparser):
action="append",
help="specify tools to skip (choose from %s)" % ", ".join(tool_names),
)
subparser.add_argument(
"--spec-strings",
action="store_true",
help="upgrade spec strings in Python, JSON and YAML files for compatibility with Spack "
"v1.0 and v0.x. Example: spack style --spec-strings $(git ls-files). Note: this flag "
"will be removed in Spack v1.0.",
)
subparser.add_argument("files", nargs=argparse.REMAINDER, help="specific files to check")
@@ -518,196 +507,7 @@ def _bootstrap_dev_dependencies():
spack.bootstrap.ensure_environment_dependencies()
IS_PROBABLY_COMPILER = re.compile(r"%[a-zA-Z_][a-zA-Z0-9\-]")
def _spec_str_reorder_compiler(idx: int, blocks: List[List[Token]]) -> None:
# only move the compiler to the back if it exists and is not already at the end
if not 0 <= idx < len(blocks) - 1:
return
# if there's only whitespace after the compiler, don't move it
if all(token.kind == SpecTokens.WS for block in blocks[idx + 1 :] for token in block):
return
# rotate left and always add at least one WS token between compiler and previous token
compiler_block = blocks.pop(idx)
if compiler_block[0].kind != SpecTokens.WS:
compiler_block.insert(0, Token(SpecTokens.WS, " "))
# delete the WS tokens from the new first block if it was at the very start, to prevent leading
# WS tokens.
while idx == 0 and blocks[0][0].kind == SpecTokens.WS:
blocks[0].pop(0)
blocks.append(compiler_block)
def _spec_str_format(spec_str: str) -> Optional[str]:
"""Given any string, try to parse as spec string, and rotate the compiler token to the end
of each spec instance. Returns the formatted string if it was changed, otherwise None."""
# We parse blocks of tokens that include leading whitespace, and move the compiler block to
# the end when we hit a dependency ^... or the end of a string.
# [@3.1][ +foo][ +bar][ %gcc@3.1][ +baz]
# [@3.1][ +foo][ +bar][ +baz][ %gcc@3.1]
current_block: List[Token] = []
blocks: List[List[Token]] = []
compiler_block_idx = -1
in_edge_attr = False
for token in SPEC_TOKENIZER.tokenize(spec_str):
if token.kind == SpecTokens.UNEXPECTED:
# parsing error, we cannot fix this string.
return None
elif token.kind in (SpecTokens.COMPILER, SpecTokens.COMPILER_AND_VERSION):
# multiple compilers are not supported in Spack v0.x, so early return
if compiler_block_idx != -1:
return None
current_block.append(token)
blocks.append(current_block)
current_block = []
compiler_block_idx = len(blocks) - 1
elif token.kind in (
SpecTokens.START_EDGE_PROPERTIES,
SpecTokens.DEPENDENCY,
SpecTokens.UNQUALIFIED_PACKAGE_NAME,
SpecTokens.FULLY_QUALIFIED_PACKAGE_NAME,
):
_spec_str_reorder_compiler(compiler_block_idx, blocks)
compiler_block_idx = -1
if token.kind == SpecTokens.START_EDGE_PROPERTIES:
in_edge_attr = True
current_block.append(token)
blocks.append(current_block)
current_block = []
elif token.kind == SpecTokens.END_EDGE_PROPERTIES:
in_edge_attr = False
current_block.append(token)
blocks.append(current_block)
current_block = []
elif in_edge_attr:
current_block.append(token)
elif token.kind in (
SpecTokens.VERSION_HASH_PAIR,
SpecTokens.GIT_VERSION,
SpecTokens.VERSION,
SpecTokens.PROPAGATED_BOOL_VARIANT,
SpecTokens.BOOL_VARIANT,
SpecTokens.PROPAGATED_KEY_VALUE_PAIR,
SpecTokens.KEY_VALUE_PAIR,
SpecTokens.DAG_HASH,
):
current_block.append(token)
blocks.append(current_block)
current_block = []
elif token.kind == SpecTokens.WS:
current_block.append(token)
else:
raise ValueError(f"unexpected token {token}")
if current_block:
blocks.append(current_block)
_spec_str_reorder_compiler(compiler_block_idx, blocks)
new_spec_str = "".join(token.value for block in blocks for token in block)
return new_spec_str if spec_str != new_spec_str else None
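For illustration, a hypothetical use of the helper above (the spec strings and results are assumptions based on the rotation rule it documents, not output taken from this diff):
# Sketch: rotate a single compiler token to the end of the spec string.
assert _spec_str_format("hdf5@1.14 %gcc@12.3 +mpi") == "hdf5@1.14 +mpi %gcc@12.3"
# Strings that need no change (compiler already last) yield None.
assert _spec_str_format("hdf5@1.14 +mpi %gcc@12.3") is None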
SpecStrHandler = Callable[[str, int, int, str, str], None]
def _spec_str_default_handler(path: str, line: int, col: int, old: str, new: str):
"""A SpecStrHandler that prints formatted spec strings and their locations."""
print(f"{path}:{line}:{col}: `{old}` -> `{new}`")
def _spec_str_fix_handler(path: str, line: int, col: int, old: str, new: str):
"""A SpecStrHandler that updates formatted spec strings in files."""
with open(path, "r", encoding="utf-8") as f:
lines = f.readlines()
new_line = lines[line - 1].replace(old, new)
if new_line == lines[line - 1]:
tty.warn(f"{path}:{line}:{col}: could not apply fix: `{old}` -> `{new}`")
return
lines[line - 1] = new_line
print(f"{path}:{line}:{col}: fixed `{old}` -> `{new}`")
with open(path, "w", encoding="utf-8") as f:
f.writelines(lines)
def _spec_str_ast(path: str, tree: ast.AST, handler: SpecStrHandler) -> None:
"""Walk the AST of a Python file and apply handler to formatted spec strings."""
has_constant = sys.version_info >= (3, 8)
for node in ast.walk(tree):
if has_constant and isinstance(node, ast.Constant) and isinstance(node.value, str):
current_str = node.value
elif not has_constant and isinstance(node, ast.Str):
current_str = node.s
else:
continue
if not IS_PROBABLY_COMPILER.search(current_str):
continue
new = _spec_str_format(current_str)
if new is not None:
handler(path, node.lineno, node.col_offset, current_str, new)
def _spec_str_json_and_yaml(path: str, data: dict, handler: SpecStrHandler) -> None:
"""Walk a YAML or JSON data structure and apply handler to formatted spec strings."""
queue = [data]
seen = set()
while queue:
current = queue.pop(0)
if id(current) in seen:
continue
seen.add(id(current))
if isinstance(current, dict):
queue.extend(current.values())
queue.extend(current.keys())
elif isinstance(current, list):
queue.extend(current)
elif isinstance(current, str) and IS_PROBABLY_COMPILER.search(current):
new = _spec_str_format(current)
if new is not None:
mark = getattr(current, "_start_mark", None)
if mark:
line, col = mark.line + 1, mark.column + 1
else:
line, col = 0, 0
handler(path, line, col, current, new)
def _check_spec_strings(
paths: List[str], handler: SpecStrHandler = _spec_str_default_handler
) -> None:
"""Open Python, JSON and YAML files, and format their string literals that look like spec
strings. A handler is called for each formatting, which can be used to print or apply fixes."""
for path in paths:
is_json_or_yaml = path.endswith(".json") or path.endswith(".yaml") or path.endswith(".yml")
is_python = path.endswith(".py")
if not is_json_or_yaml and not is_python:
continue
try:
with open(path, "r", encoding="utf-8") as f:
# skip files that are likely too large to be user code or config
if os.fstat(f.fileno()).st_size > 1024 * 1024:
warnings.warn(f"skipping {path}: too large.")
continue
if is_json_or_yaml:
_spec_str_json_and_yaml(path, spack.util.spack_yaml.load_config(f), handler)
elif is_python:
_spec_str_ast(path, ast.parse(f.read()), handler)
except (OSError, spack.util.spack_yaml.SpackYAMLError, SyntaxError, ValueError):
warnings.warn(f"skipping {path}")
continue
def style(parser, args):
if args.spec_strings:
if not args.files:
tty.die("No files provided to check spec strings.")
handler = _spec_str_fix_handler if args.fix else _spec_str_default_handler
return _check_spec_strings(args.files, handler)
# save initial working directory for relativizing paths later
args.initial_working_dir = os.getcwd()

View File

@@ -17,7 +17,6 @@
pytest = None # type: ignore
import llnl.util.filesystem
import llnl.util.tty as tty
import llnl.util.tty.color as color
from llnl.util.tty.colify import colify
@@ -237,12 +236,6 @@ def unit_test(parser, args, unknown_args):
pytest_root = spack.extensions.load_extension(args.extension)
if args.numprocesses is not None and args.numprocesses > 1:
try:
import xdist # noqa: F401
except ImportError:
tty.error("parallel unit-test requires pytest-xdist module")
return 1
pytest_args.extend(
[
"--dist",

View File

@@ -25,6 +25,15 @@
from spack.operating_systems import windows_os
from spack.util.environment import get_path
package_name_to_compiler_name = {
"llvm": "clang",
"intel-oneapi-compilers": "oneapi",
"llvm-amdgpu": "rocmcc",
"intel-oneapi-compilers-classic": "intel",
"acfl": "arm",
}
#: Tag used to identify packages providing a compiler
COMPILER_TAG = "compiler"

View File

@@ -250,11 +250,7 @@ def implicit_rpaths(self) -> List[str]:
return []
link_dirs = parse_non_system_link_dirs(output)
all_required_libs = list(self.spec.package.implicit_rpath_libs) + [
"libc",
"libc++",
"libstdc++",
]
all_required_libs = list(self.spec.package.required_libs) + ["libc", "libc++", "libstdc++"]
dynamic_linker = self.default_dynamic_linker()
result = DefaultDynamicLinkerFilter(dynamic_linker)(
paths_containing_libs(link_dirs, all_required_libs)

View File

@@ -32,10 +32,9 @@
import copy
import functools
import os
import os.path
import re
import sys
from typing import Any, Callable, Dict, Generator, List, NamedTuple, Optional, Tuple, Union
from typing import Any, Callable, Dict, Generator, List, Optional, Tuple, Union
import jsonschema
@@ -43,6 +42,7 @@
import spack.error
import spack.paths
import spack.platforms
import spack.schema
import spack.schema.bootstrap
import spack.schema.cdash
@@ -54,18 +54,17 @@
import spack.schema.develop
import spack.schema.env
import spack.schema.env_vars
import spack.schema.include
import spack.schema.merged
import spack.schema.mirrors
import spack.schema.modules
import spack.schema.packages
import spack.schema.repos
import spack.schema.upstreams
import spack.schema.view
import spack.util.remote_file_cache as rfc_util
# Hacked yaml for configuration files preserves line numbers.
import spack.util.spack_yaml as syaml
import spack.util.web as web_util
from spack.util.cpus import cpus_available
from spack.util.spack_yaml import get_mark_from_yaml_data
from .enums import ConfigScopePriority
@@ -75,7 +74,6 @@
"concretizer": spack.schema.concretizer.schema,
"definitions": spack.schema.definitions.schema,
"env_vars": spack.schema.env_vars.schema,
"include": spack.schema.include.schema,
"view": spack.schema.view.schema,
"develop": spack.schema.develop.schema,
"mirrors": spack.schema.mirrors.schema,
@@ -123,17 +121,6 @@
#: Type used for raw YAML configuration
YamlConfigDict = Dict[str, Any]
#: prefix for name of included configuration scopes
INCLUDE_SCOPE_PREFIX = "include"
#: safeguard for recursive includes -- maximum include depth
MAX_RECURSIVE_INCLUDES = 100
def _include_cache_location():
"""Location to cache included configuration files."""
return os.path.join(spack.paths.user_cache_path, "includes")
class ConfigScope:
def __init__(self, name: str) -> None:
@@ -141,25 +128,6 @@ def __init__(self, name: str) -> None:
self.writable = False
self.sections = syaml.syaml_dict()
#: names of any included scopes
self._included_scopes: Optional[List["ConfigScope"]] = None
@property
def included_scopes(self) -> List["ConfigScope"]:
"""Memoized list of included scopes, in the order they appear in this scope."""
if self._included_scopes is None:
self._included_scopes = []
includes = self.get_section("include")
if includes:
include_paths = [included_path(data) for data in includes["include"]]
for path in include_paths:
included_scope = include_path_scope(path)
if included_scope:
self._included_scopes.append(included_scope)
return self._included_scopes
def get_section_filename(self, section: str) -> str:
raise NotImplementedError
@@ -465,9 +433,7 @@ def highest(self) -> ConfigScope:
return next(self.scopes.reversed_values()) # type: ignore
@_config_mutator
def push_scope(
self, scope: ConfigScope, priority: Optional[int] = None, _depth: int = 0
) -> None:
def push_scope(self, scope: ConfigScope, priority: Optional[int] = None) -> None:
"""Adds a scope to the Configuration, at a given priority.
If a priority is not given, it is assumed to be the current highest priority.
@@ -476,44 +442,18 @@ def push_scope(
scope: scope to be added
priority: priority of the scope
"""
# TODO: As a follow on to #48784, change this to create a graph of the
# TODO: includes AND ensure properly sorted such that the order included
# TODO: at the highest level is reflected in the value of an option that
# TODO: is set in multiple included files.
# before pushing the scope itself, push any included scopes recursively, at same priority
for included_scope in reversed(scope.included_scopes):
if _depth + 1 > MAX_RECURSIVE_INCLUDES: # make sure we're not recursing endlessly
mark = ""
if hasattr(included_scope, "path") and syaml.marked(included_scope.path):
mark = included_scope.path._start_mark # type: ignore
raise RecursiveIncludeError(
f"Maximum include recursion exceeded in {included_scope.name}", str(mark)
)
# record this inclusion so that remove_scope() can use it
self.push_scope(included_scope, priority=priority, _depth=_depth + 1)
tty.debug(f"[CONFIGURATION: PUSH SCOPE]: {str(scope)}, priority={priority}", level=2)
self.scopes.add(scope.name, value=scope, priority=priority)
@_config_mutator
def remove_scope(self, scope_name: str) -> Optional[ConfigScope]:
"""Removes a scope by name, and returns it. If the scope does not exist, returns None."""
try:
scope = self.scopes.remove(scope_name)
tty.debug(f"[CONFIGURATION: REMOVE SCOPE]: {str(scope)}", level=2)
tty.debug(f"[CONFIGURATION: POP SCOPE]: {str(scope)}", level=2)
except KeyError as e:
tty.debug(f"[CONFIGURATION: REMOVE SCOPE]: {e}", level=2)
tty.debug(f"[CONFIGURATION: POP SCOPE]: {e}", level=2)
return None
# transitively remove included scopes
for included_scope in scope.included_scopes:
assert (
included_scope.name in self.scopes
), f"Included scope '{included_scope.name}' was never added to configuration!"
self.remove_scope(included_scope.name)
return scope
@property
@@ -823,8 +763,6 @@ def _add_platform_scope(
cfg: Configuration, name: str, path: str, priority: ConfigScopePriority, writable: bool = True
) -> None:
"""Add a platform-specific subdirectory for the current platform."""
import spack.platforms # circular dependency
platform = spack.platforms.host().name
scope = DirectoryConfigScope(
f"{name}/{platform}", os.path.join(path, platform), writable=writable
@@ -832,75 +770,6 @@ def _add_platform_scope(
cfg.push_scope(scope, priority=priority)
#: Class describing an include path, which may be conditioned on a restricted
#: Python expression that evaluates to a boolean, or explicitly marked as
#: optional.
class IncludePath(NamedTuple):
path: str
when: str
sha256: str
optional: bool
def included_path(entry: Union[str, dict]) -> IncludePath:
"""Convert the included path entry into an IncludePath.
Args:
entry: include configuration entry
Returns: converted entry, where an empty ``when`` means the path is
not conditionally included
"""
if isinstance(entry, str):
return IncludePath(path=entry, sha256="", when="", optional=False)
path = entry["path"]
sha256 = entry.get("sha256", "")
when = entry.get("when", "")
optional = entry.get("optional", False)
return IncludePath(path=path, sha256=sha256, when=when, optional=optional)
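A minimal sketch of the two accepted entry forms (the paths and the condition are hypothetical):
# String entries are unconditional and required:
included_path("local/config.yaml")
# -> IncludePath(path="local/config.yaml", when="", sha256="", optional=False)
# Dict entries may carry a condition, a checksum, and an optional flag:
included_path({"path": "cfg/darwin.yaml", "when": "platform == 'darwin'", "optional": True})
# -> IncludePath(path="cfg/darwin.yaml", when="platform == 'darwin'", sha256="", optional=True)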
def include_path_scope(include: IncludePath) -> Optional[ConfigScope]:
"""Instantiate an appropriate configuration scope for the given path.
Args:
include: optional include path
Returns: configuration scope
Raises:
ValueError: the included path has an unsupported URL scheme, is required
but does not exist, or the configuration stage directory argument is missing
ConfigFileError: unable to access remote configuration file(s)
"""
# circular dependencies
import spack.spec
if (not include.when) or spack.spec.eval_conditional(include.when):
config_path = rfc_util.local_path(include.path, include.sha256, _include_cache_location)
if not config_path:
raise ConfigFileError(f"Unable to fetch remote configuration from {include.path}")
if os.path.isdir(config_path):
# directories are treated as regular ConfigScopes
config_name = f"{INCLUDE_SCOPE_PREFIX}:{os.path.basename(config_path)}"
tty.debug(f"Creating DirectoryConfigScope {config_name} for '{config_path}'")
return DirectoryConfigScope(config_name, config_path)
if os.path.exists(config_path):
# files are assumed to be SingleFileScopes
config_name = f"{INCLUDE_SCOPE_PREFIX}:{config_path}"
tty.debug(f"Creating SingleFileScope {config_name} for '{config_path}'")
return SingleFileScope(config_name, config_path, spack.schema.merged.schema)
if not include.optional:
path = f" at ({config_path})" if config_path != include.path else ""
raise ValueError(f"Required path ({include.path}) does not exist{path}")
return None
def config_paths_from_entry_points() -> List[Tuple[str, str]]:
"""Load configuration paths from entry points
@@ -926,7 +795,7 @@ def config_paths_from_entry_points() -> List[Tuple[str, str]]:
return config_paths
def create_incremental() -> Generator[Configuration, None, None]:
def create() -> Configuration:
"""Singleton Configuration instance.
This constructs one instance associated with this module and returns
@@ -970,25 +839,11 @@ def create_incremental() -> Generator[Configuration, None, None]:
# Each scope can have per-platform overrides in subdirectories
_add_platform_scope(cfg, name, path, priority=ConfigScopePriority.CONFIG_FILES)
# yield the config incrementally so that each config level's init code can get
# data from the one below. This can be tricky, but it enables us to have a
# single unified config system.
#
# TODO: think about whether we want to restrict what types of config can be used
# at each level. e.g., we may want to just more forcibly disallow remote
# config (which uses ssl and other config options) for some of the scopes,
# to make the bootstrap issues more explicit, even if allowing config scope
# init to reference lower scopes is more flexible.
yield cfg
def create() -> Configuration:
"""Create a configuration using create_incremental(), return the last yielded result."""
return list(create_incremental())[-1]
return cfg
#: This is the singleton configuration instance for Spack.
CONFIG: Configuration = lang.Singleton(create_incremental) # type: ignore
CONFIG: Configuration = lang.Singleton(create) # type: ignore
def add_from_file(filename: str, scope: Optional[str] = None) -> None:
@@ -1084,8 +939,7 @@ def set(path: str, value: Any, scope: Optional[str] = None) -> None:
Accepts the path syntax described in ``get()``.
"""
result = CONFIG.set(path, value, scope)
return result
return CONFIG.set(path, value, scope)
def scopes() -> lang.PriorityOrderedMapping[str, ConfigScope]:
@@ -1608,6 +1462,120 @@ def create_from(*scopes_or_paths: Union[ScopeWithOptionalPriority, str]) -> Conf
return result
def raw_github_gitlab_url(url: str) -> str:
"""Transform a github URL to the raw form to avoid undesirable html.
Args:
url: url to be converted to raw form
Returns:
Raw github/gitlab url or the original url
"""
# Note: we rely on GitHub to redirect the 'raw' URL returned here to the
# actual URL under https://raw.githubusercontent.com/, with '/blob' (or
# '/blame', if present) removed.
if "github" in url or "gitlab" in url:
return url.replace("/blob/", "/raw/")
return url
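A usage sketch (the URLs are hypothetical): '/blob/' URLs on github/gitlab hosts are rewritten, anything else passes through.
raw_github_gitlab_url("https://github.com/org/repo/blob/develop/etc/config.yaml")
# -> "https://github.com/org/repo/raw/develop/etc/config.yaml"
raw_github_gitlab_url("https://example.com/config.yaml")  # returned unchanged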
def collect_urls(base_url: str) -> list:
"""Return a list of configuration URLs.
Arguments:
base_url: URL for a configuration (yaml) file or a directory
containing yaml file(s)
Returns:
List of configuration file(s) or empty list if none
"""
if not base_url:
return []
extension = ".yaml"
if base_url.endswith(extension):
return [base_url]
# Collect configuration URLs if the base_url is a "directory".
_, links = web_util.spider(base_url, 0)
return [link for link in links if link.endswith(extension)]
def fetch_remote_configs(url: str, dest_dir: str, skip_existing: bool = True) -> str:
"""Retrieve configuration file(s) at the specified URL.
Arguments:
url: URL for a configuration (yaml) file or a directory containing
yaml file(s)
dest_dir: destination directory
skip_existing: Skip files that already exist in dest_dir if
``True``; otherwise, replace those files
Returns:
Path to the corresponding file if URL is or contains a
single file and it is the only file in the destination directory or
the root (dest_dir) directory if multiple configuration files exist
or are retrieved.
"""
def _fetch_file(url):
raw = raw_github_gitlab_url(url)
tty.debug(f"Reading config from url {raw}")
return web_util.fetch_url_text(raw, dest_dir=dest_dir)
if not url:
raise ConfigFileError("Cannot retrieve configuration without a URL")
# Return the local path to the cached configuration file OR to the
# directory containing the cached configuration files.
config_links = collect_urls(url)
existing_files = os.listdir(dest_dir) if os.path.isdir(dest_dir) else []
paths = []
for config_url in config_links:
basename = os.path.basename(config_url)
if skip_existing and basename in existing_files:
tty.warn(
f"Will not fetch configuration from {config_url} since a "
f"version already exists in {dest_dir}"
)
path = os.path.join(dest_dir, basename)
else:
path = _fetch_file(config_url)
if path:
paths.append(path)
if paths:
return dest_dir if len(paths) > 1 else paths[0]
raise ConfigFileError(f"Cannot retrieve configuration (yaml) from {url}")
def get_mark_from_yaml_data(obj):
"""Try to get ``spack.util.spack_yaml`` mark from YAML data.
We try the object, and if that fails we try its first member (if it's a container).
Returns:
mark if one is found, otherwise None.
"""
# mark of object itself
mark = getattr(obj, "_start_mark", None)
if mark:
return mark
# mark of first member if it is a container
if isinstance(obj, (list, dict)):
first_member = next(iter(obj), None)
if first_member:
mark = getattr(first_member, "_start_mark", None)
return mark
def determine_number_of_jobs(
*,
parallel: bool = False,
@@ -1712,7 +1680,3 @@ def get_path(path, data):
# give up and return None if nothing worked
return None
class RecursiveIncludeError(spack.error.SpackError):
"""Too many levels of recursive includes."""

View File

@@ -649,7 +649,7 @@ def __init__(
@property
def db_version(self) -> vn.ConcreteVersion:
if self._db_version is None:
raise AttributeError("version not set -- DB has not been read yet")
raise AttributeError("db version is not yet set")
return self._db_version
@db_version.setter
@@ -896,7 +896,7 @@ def _handle_current_version_read(self, check, db):
def _handle_old_db_versions_read(self, check, db, *, reindex: bool):
if reindex is False and not self.is_upstream:
self.raise_explicit_database_upgrade_error()
self.raise_explicit_database_upgrade()
if not self.is_readable():
raise DatabaseNotReadableError(
@@ -909,16 +909,13 @@ def is_readable(self) -> bool:
"""Returns true if this DB can be read without reindexing"""
return (self.db_version, _DB_VERSION) in _REINDEX_NOT_NEEDED_ON_READ
def raise_explicit_database_upgrade_error(self):
def raise_explicit_database_upgrade(self):
"""Raises an ExplicitDatabaseUpgradeError with an appropriate message"""
raise ExplicitDatabaseUpgradeError(
f"database is v{self.db_version}, but Spack v{spack.__version__} needs v{_DB_VERSION}",
long_message=(
f"\nChange config:install_tree:root to use a different store, or use `spack "
f"reindex` to migrate the store at {self.root} to version {_DB_VERSION}.\n\n"
f"If you decide to migrate the store, note that:\n"
f"1. The operation cannot be reverted, and\n"
f"2. Older Spack versions will not be able to read the store anymore\n"
f"\nUse `spack reindex` to upgrade the store at {self.root} to version "
f"{_DB_VERSION}, or change config:install_tree:root to use a different store"
),
)
@@ -1163,7 +1160,7 @@ def _add(
installation_time:
Date and time of installation
allow_missing: if True, don't warn when installation is not found on disk
This is useful when installing specs without build/test deps.
This is useful when installing specs without build deps.
"""
if not spec.concrete:
raise NonConcreteSpecAddError("Specs added to DB must be concrete.")
@@ -1183,8 +1180,10 @@ def _add(
edge.spec,
explicit=False,
installation_time=installation_time,
# allow missing build / test only deps
allow_missing=allow_missing or edge.depflag & (dt.BUILD | dt.TEST) == edge.depflag,
# allow missing build-only deps. This prevents excessive warnings when a spec is
# installed, and its build dep is missing a build dep; there's no need to install
# the build dep's build dep first, and there's no need to warn about it missing.
allow_missing=allow_missing or edge.depflag == dt.BUILD,
)
# Make sure the directory layout agrees whether the spec is installed

View File

@@ -7,7 +7,6 @@
import collections
import concurrent.futures
import os
import pathlib
import re
import sys
import traceback
@@ -16,7 +15,6 @@
import llnl.util.filesystem
import llnl.util.lang
import llnl.util.symlink
import llnl.util.tty
import spack.error
@@ -72,21 +70,13 @@ def dedupe_paths(paths: List[str]) -> List[str]:
"""Deduplicate paths based on inode and device number. In case the list contains first a
symlink and then the directory it points to, the symlink is replaced with the directory path.
This ensures that we pick for example ``/usr/bin`` over ``/bin`` if the latter is a symlink to
the former."""
the former`."""
seen: Dict[Tuple[int, int], str] = {}
linked_parent_check = lambda x: any(
[llnl.util.symlink.islink(str(y)) for y in pathlib.Path(x).parents]
)
for path in paths:
identifier = file_identifier(path)
if identifier not in seen:
seen[identifier] = path
# we also want to deprioritize paths if they contain a symlink in any parent
# (not just the basedir): e.g. oneapi has "latest/bin",
# where "latest" is a symlink to 2025.0"
elif not (llnl.util.symlink.islink(path) or linked_parent_check(path)):
elif not os.path.islink(path):
seen[identifier] = path
return list(seen.values())
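A minimal sketch of the behavior, assuming /bin is a symlink to /usr/bin (as on many Linux distributions):
# The real directory wins over the symlink, regardless of input order.
dedupe_paths(["/bin", "/usr/bin"])  # -> ["/usr/bin"]
dedupe_paths(["/usr/bin", "/bin"])  # -> ["/usr/bin"]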

View File

@@ -457,7 +457,8 @@ def _execute_extends(pkg):
if dep_spec.name == "python" and not pkg.name == "python-venv":
_depends_on(pkg, spack.spec.Spec("python-venv"), when=when, type=("build", "run"))
pkg.extendees[dep_spec.name] = (dep_spec, when_spec)
# TODO: the values of the extendees dictionary are not used. Remove in next refactor.
pkg.extendees[dep_spec.name] = (dep_spec, None)
return _execute_extends

View File

@@ -9,7 +9,7 @@
`spack.lock` format
===================
Spack environments have existed since Spack ``v0.12.0``, and there have been different
Spack environments have existed since Spack ``v0.12.0``, and there have been 4 different
``spack.lock`` formats since then. The formats are documented here.
The high-level format of a Spack lockfile hasn't changed much between versions, but the
@@ -53,44 +53,31 @@
  - ``v3``
  - ``v4``
  - ``v5``
  - ``v6``
* - ``v0.12:0.14``
  - ✅
  -
  -
  -
  -
  -
* - ``v0.15:0.16``
  - ✅
  - ✅
  -
  -
  -
  -
* - ``v0.17``
  - ✅
  - ✅
  - ✅
  -
  -
  -
* - ``v0.18:``
  - ✅
  - ✅
  - ✅
  - ✅
  -
  -
* - ``v0.22:v0.23``
  - ✅
  - ✅
  - ✅
  - ✅
  - ✅
  -
* - ``v1.0:``
  - ✅
* - ``v0.22:``
  - ✅
  - ✅
  - ✅
@@ -472,78 +459,6 @@
}
}
}
Version 6
---------
Version 6 uses specs where compilers are modeled as real dependencies, not as a node attribute.
It doesn't change the top-level lockfile format.
As part of Spack v1.0, compilers stopped being a node attribute and became a build-only dependency. Packages may
declare a dependency on the c, cxx, or fortran languages, which are now treated as virtuals, and compilers are
providers for one or more of those languages. Compilers can also inject runtime dependencies on the node being
compiled. The compiler-wrapper is explicitly represented as a node in the DAG and enters the hash.
.. code-block:: json
{
"_meta": {
"file-type": "spack-lockfile",
"lockfile-version": 6,
"specfile-version": 5
},
"spack": {
"version": "1.0.0.dev0",
"type": "git",
"commit": "395b34f17417132389a6a8ee4dbf831c4a04f642"
},
"roots": [
{
"hash": "tivmbe3xjw7oqv4c3jv3v4jw42a7cajq",
"spec": "zlib-ng"
}
],
"concrete_specs": {
"tivmbe3xjw7oqv4c3jv3v4jw42a7cajq": {
"name": "zlib-ng",
"version": "2.2.3",
"<other attributes>": {}
}
"dependencies": [
{
"name": "compiler-wrapper",
"hash": "n5lamxu36f4cx4sm7m7gocalctve4mcx",
"parameters": {
"deptypes": [
"build"
],
"virtuals": []
}
},
{
"name": "gcc",
"hash": "b375mbpprxze4vxy4ho7aixhuchsime2",
"parameters": {
"deptypes": [
"build"
],
"virtuals": [
"c",
"cxx"
]
}
},
{
"<other dependencies>": {}
}
],
"annotations": {
"original_specfile_version": 5
}
}
}
}
"""
from .environment import (

View File

@@ -10,6 +10,8 @@
import re
import shutil
import stat
import urllib.parse
import urllib.request
import warnings
from typing import Any, Dict, Iterable, List, Optional, Sequence, Tuple, Union
@@ -30,6 +32,7 @@
import spack.paths
import spack.repo
import spack.schema.env
import spack.schema.merged
import spack.spec
import spack.spec_list
import spack.store
@@ -40,6 +43,7 @@
import spack.util.path
import spack.util.spack_json as sjson
import spack.util.spack_yaml as syaml
import spack.util.url
from spack import traverse
from spack.installer import PackageInstaller
from spack.schema.env import TOP_LEVEL_KEY
@@ -386,7 +390,6 @@ def create_in_dir(
# dev paths in this environment to refer to their original
# locations.
_rewrite_relative_dev_paths_on_relocation(env, init_file_dir)
_rewrite_relative_repos_paths_on_relocation(env, init_file_dir)
return env
@@ -403,8 +406,8 @@ def _rewrite_relative_dev_paths_on_relocation(env, init_file_dir):
dev_path = substitute_path_variables(entry["path"])
expanded_path = spack.util.path.canonicalize_path(dev_path, default_wd=init_file_dir)
# Skip if the substituted and expanded path is the same (e.g. when absolute)
if entry["path"] == expanded_path:
# Skip if the expanded path is the same (e.g. when absolute)
if dev_path == expanded_path:
continue
tty.debug("Expanding develop path for {0} to {1}".format(name, expanded_path))
@@ -419,34 +422,6 @@ def _rewrite_relative_dev_paths_on_relocation(env, init_file_dir):
env._re_read()
def _rewrite_relative_repos_paths_on_relocation(env, init_file_dir):
"""When initializing the environment from a manifest file and we plan
to store the environment in a different directory, we have to rewrite
relative repo paths to absolute ones and expand environment variables."""
with env:
repos_specs = spack.config.get("repos", default={}, scope=env.scope_name)
if not repos_specs:
return
for i, entry in enumerate(repos_specs):
repo_path = substitute_path_variables(entry)
expanded_path = spack.util.path.canonicalize_path(repo_path, default_wd=init_file_dir)
# Skip if the substituted and expanded path is the same (e.g. when absolute)
if entry == expanded_path:
continue
tty.debug("Expanding repo path for {0} to {1}".format(entry, expanded_path))
repos_specs[i] = expanded_path
spack.config.set("repos", repos_specs, scope=env.scope_name)
env.repos_specs = None
# If we changed the environment's spack.yaml scope, that will not be reflected
# in the manifest that we read
env._re_read()
def environment_dir_from_name(name: str, exists_ok: bool = True) -> str:
"""Returns the directory associated with a named environment.
@@ -574,6 +549,13 @@ def _write_yaml(data, str_or_file):
syaml.dump_config(data, str_or_file, default_flow_style=False)
def _eval_conditional(string):
"""Evaluate conditional definitions using restricted variable scope."""
valid_variables = spack.spec.get_host_environment()
valid_variables.update({"re": re, "env": os.environ})
return eval(string, valid_variables)
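A sketch of the kind of strings this evaluates (the conditions are hypothetical, and assume get_host_environment() exposes a "platform" variable):
# 're' and 'env' are injected explicitly above; other names come from the host environment.
_eval_conditional("platform == 'linux'")
_eval_conditional("re.search(r'login', env.get('HOSTNAME', '')) is not None")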
def _is_dev_spec_and_has_changed(spec):
"""Check if the passed spec is a dev build and whether it has changed since the
last installation"""
@@ -1006,7 +988,7 @@ def _process_definition(self, entry):
"""Process a single spec definition item."""
when_string = entry.get("when")
if when_string is not None:
when = spack.spec.eval_conditional(when_string)
when = _eval_conditional(when_string)
assert len([x for x in entry if x != "when"]) == 1
else:
when = True
@@ -1129,6 +1111,11 @@ def user_specs(self):
@property
def dev_specs(self):
if not self._dev_specs:
self._dev_specs = self._read_dev_specs()
return self._dev_specs
def _read_dev_specs(self):
dev_specs = {}
dev_config = spack.config.get("develop", {})
for name, entry in dev_config.items():
@@ -1546,6 +1533,9 @@ def _get_specs_to_concretize(
return new_user_specs, kept_user_specs, specs_to_concretize
def _concretize_together_where_possible(self, tests: bool = False) -> Sequence[SpecPair]:
# Avoid cyclic dependency
import spack.solver.asp
# Exit early if the set of concretized specs is the set of user specs
new_user_specs, _, specs_to_concretize = self._get_specs_to_concretize()
if not new_user_specs:
@@ -2656,23 +2646,20 @@ def _ensure_env_dir():
# error handling for bad manifests is handled on other code paths
return
# TODO: make this recursive
includes = manifest[TOP_LEVEL_KEY].get("include", [])
for include in includes:
included_path = spack.config.included_path(include)
path = included_path.path
if os.path.isabs(path):
if os.path.isabs(include):
continue
abspath = pathlib.Path(os.path.normpath(environment_dir / path))
abspath = pathlib.Path(os.path.normpath(environment_dir / include))
common_path = pathlib.Path(os.path.commonpath([environment_dir, abspath]))
if common_path != environment_dir:
tty.debug(f"Will not copy relative include file from outside environment: {path}")
tty.debug(f"Will not copy relative include from outside environment: {include}")
continue
orig_abspath = os.path.normpath(envfile.parent / path)
orig_abspath = os.path.normpath(envfile.parent / include)
if not os.path.exists(orig_abspath):
tty.warn(f"Included file does not exist; will not copy: '{path}'")
tty.warn(f"Included file does not exist; will not copy: '{include}'")
continue
fs.touchp(abspath)
@@ -2895,7 +2882,7 @@ def extract_name(_item):
continue
condition_str = item.get("when", "True")
if not spack.spec.eval_conditional(condition_str):
if not _eval_conditional(condition_str):
continue
yield idx, item
@@ -2956,20 +2943,127 @@ def __iter__(self):
def __str__(self):
return str(self.manifest_file)
@property
def included_config_scopes(self) -> List[spack.config.ConfigScope]:
"""List of included configuration scopes from the manifest.
Scopes are listed in the YAML file in order from highest to
lowest precedence, so configuration from an earlier scope will take
precedence over later ones.
This routine returns them in the order they should be pushed onto
the internal scope stack (so, in reverse, from lowest to highest).
Returns: Configuration scopes associated with the environment manifest
Raises:
SpackEnvironmentError: if the manifest includes a remote file but
no configuration stage directory has been identified
"""
scopes: List[spack.config.ConfigScope] = []
# load config scopes added via 'include:', in reverse so that
# highest-precedence scopes are last.
includes = self[TOP_LEVEL_KEY].get("include", [])
missing = []
for i, config_path in enumerate(reversed(includes)):
# allow paths to contain spack config/environment variables, etc.
config_path = substitute_path_variables(config_path)
include_url = urllib.parse.urlparse(config_path)
# If scheme is not valid, config_path is not a url
# of a type Spack is generally aware
if spack.util.url.validate_scheme(include_url.scheme):
# Transform file:// URLs to direct includes.
if include_url.scheme == "file":
config_path = urllib.request.url2pathname(include_url.path)
# Any other URL should be fetched.
elif include_url.scheme in ("http", "https", "ftp"):
# Stage any remote configuration file(s)
staged_configs = (
os.listdir(self.config_stage_dir)
if os.path.exists(self.config_stage_dir)
else []
)
remote_path = urllib.request.url2pathname(include_url.path)
basename = os.path.basename(remote_path)
if basename in staged_configs:
# Do NOT re-stage configuration files over existing
# ones with the same name since there is a risk of
# losing changes (e.g., from 'spack config update').
tty.warn(
"Will not re-stage configuration from {0} to avoid "
"losing changes to the already staged file of the "
"same name.".format(remote_path)
)
# Note that the configuration stage directory is
# flattened to ensure a single copy of each
# configuration file.
config_path = self.config_stage_dir
if basename.endswith(".yaml"):
config_path = os.path.join(config_path, basename)
else:
staged_path = spack.config.fetch_remote_configs(
config_path, str(self.config_stage_dir), skip_existing=True
)
if not staged_path:
raise SpackEnvironmentError(
"Unable to fetch remote configuration {0}".format(config_path)
)
config_path = staged_path
elif include_url.scheme:
raise ValueError(
f"Unsupported URL scheme ({include_url.scheme}) for "
f"environment include: {config_path}"
)
# treat relative paths as relative to the environment
if not os.path.isabs(config_path):
config_path = os.path.join(self.manifest_dir, config_path)
config_path = os.path.normpath(os.path.realpath(config_path))
if os.path.isdir(config_path):
# directories are treated as regular ConfigScopes
config_name = f"env:{self.name}:{os.path.basename(config_path)}"
tty.debug(f"Creating DirectoryConfigScope {config_name} for '{config_path}'")
scopes.append(spack.config.DirectoryConfigScope(config_name, config_path))
elif os.path.exists(config_path):
# files are assumed to be SingleFileScopes
config_name = f"env:{self.name}:{config_path}"
tty.debug(f"Creating SingleFileScope {config_name} for '{config_path}'")
scopes.append(
spack.config.SingleFileScope(
config_name, config_path, spack.schema.merged.schema
)
)
else:
missing.append(config_path)
continue
if missing:
msg = "Detected {0} missing include path(s):".format(len(missing))
msg += "\n {0}".format("\n ".join(missing))
raise spack.config.ConfigFileError(msg)
return scopes
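As a hypothetical illustration of the ordering rule in the docstring above (file names are assumptions):
# spack.yaml (sketch):
#   spack:
#     include:
#     - first.yaml    # listed first -> highest precedence
#     - second.yaml   # listed last  -> lowest precedence
# included_config_scopes returns [<scope for second.yaml>, <scope for first.yaml>],
# so first.yaml is pushed last and wins where the two files set the same option.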
@property
def env_config_scopes(self) -> List[spack.config.ConfigScope]:
"""A list of all configuration scopes for the environment manifest. On the first call this
instantiates all the scopes; on subsequent calls it returns the cached list."""
if self._config_scopes is not None:
return self._config_scopes
scopes: List[spack.config.ConfigScope] = [
*self.included_config_scopes,
spack.config.SingleFileScope(
self.scope_name,
str(self.manifest_file),
spack.schema.env.schema,
yaml_path=[TOP_LEVEL_KEY],
)
),
]
ensure_no_disallowed_env_config_mods(scopes)
self._config_scopes = scopes

View File

@@ -8,7 +8,6 @@
import llnl.util.tty as tty
from llnl.util.tty.color import colorize
import spack.config
import spack.environment as ev
import spack.repo
import spack.schema.environment
@@ -159,8 +158,7 @@ def activate(
# become PATH variables.
#
with env.manifest.use_config():
env_vars_yaml = spack.config.get("env_vars", None)
env_vars_yaml = env.manifest.configuration.get("env_vars", None)
if env_vars_yaml:
env_mods.extend(spack.schema.environment.parse(env_vars_yaml))
@@ -197,8 +195,7 @@ def deactivate() -> EnvironmentModifications:
if active is None:
return env_mods
with active.manifest.use_config():
env_vars_yaml = spack.config.get("env_vars", None)
env_vars_yaml = active.manifest.configuration.get("env_vars", None)
if env_vars_yaml:
env_mods.extend(spack.schema.environment.parse(env_vars_yaml).reversed())

View File

@@ -643,7 +643,7 @@ def print_status(self, *specs, **kwargs):
specs.sort()
abbreviated = [
s.cformat("{name}{@version}{compiler_flags}{variants}{%compiler}")
s.cformat("{name}{@version}{%compiler}{compiler_flags}{variants}")
for s in specs
]

View File

@@ -482,7 +482,7 @@ class SimpleDAG(DotGraphBuilder):
"""Simple DOT graph, with nodes colored uniformly and edges without properties"""
def node_entry(self, node):
format_option = "{name}{@version}{/hash:7}{%compiler}"
format_option = "{name}{@version}{%compiler}{/hash:7}"
return node.dag_hash(), f'[label="{node.format(format_option)}"]'
def edge_entry(self, edge):
@@ -515,7 +515,7 @@ def visit(self, edge):
super().visit(edge)
def node_entry(self, node):
node_str = node.format("{name}{@version}{/hash:7}{%compiler}")
node_str = node.format("{name}{@version}{%compiler}{/hash:7}")
options = f'[label="{node_str}", group="build_dependencies", fillcolor="coral"]'
if node.dag_hash() in self.main_unified_space:
options = f'[label="{node_str}", group="main_psid"]'

View File

@@ -6,7 +6,7 @@
import spack.deptypes as dt
import spack.repo
HASHES = []
hashes = []
class SpecHashDescriptor:
@@ -23,7 +23,7 @@ def __init__(self, depflag: dt.DepFlag, package_hash, name, override=None):
self.depflag = depflag
self.package_hash = package_hash
self.name = name
HASHES.append(self)
hashes.append(self)
# Allow spec hashes to have an alternate computation method
self.override = override
@@ -43,9 +43,13 @@ def __repr__(self):
)
#: The DAG hash includes all inputs that can affect how a package is built.
dag_hash = SpecHashDescriptor(
depflag=dt.BUILD | dt.LINK | dt.RUN | dt.TEST, package_hash=True, name="hash"
#: Spack's deployment hash. Includes all inputs that can affect how a package is built.
dag_hash = SpecHashDescriptor(depflag=dt.BUILD | dt.LINK | dt.RUN, package_hash=True, name="hash")
#: Hash descriptor used only to transfer a DAG, as is, across processes
process_hash = SpecHashDescriptor(
depflag=dt.BUILD | dt.LINK | dt.RUN | dt.TEST, package_hash=True, name="process_hash"
)

View File

@@ -21,6 +21,7 @@
from llnl.util.lang import nullcontext
from llnl.util.tty.color import colorize
import spack.build_environment
import spack.config
import spack.error
import spack.package_base
@@ -397,7 +398,7 @@ def stand_alone_tests(self, kwargs):
Args:
kwargs (dict): arguments to be used by the test process
"""
import spack.build_environment # avoid circular dependency
import spack.build_environment
spack.build_environment.start_build_process(self.pkg, test_process, kwargs)
@@ -462,8 +463,6 @@ def write_tested_status(self):
@contextlib.contextmanager
def test_part(pkg: Pb, test_name: str, purpose: str, work_dir: str = ".", verbose: bool = False):
import spack.build_environment # avoid circular dependency
wdir = "." if work_dir is None else work_dir
tester = pkg.tester
assert test_name and test_name.startswith(

View File

@@ -564,12 +564,6 @@ def __init__(self, configuration):
def spec(self):
return self.conf.spec
@tengine.context_property
def tags(self):
if not hasattr(self.spec.package, "tags"):
return []
return self.spec.package.tags
@tengine.context_property
def timestamp(self):
return datetime.datetime.now()

View File

@@ -19,7 +19,6 @@
import spack.spec
import spack.tengine as tengine
import spack.util.environment
from spack.aliases import BUILTIN_TO_LEGACY_COMPILER
from .common import BaseConfiguration, BaseContext, BaseFileLayout, BaseModuleFileWriter
@@ -224,9 +223,9 @@ def provides(self):
# If it is in the list of supported compilers family -> compiler
if self.spec.name in spack.compilers.config.supported_compilers():
provides["compiler"] = spack.spec.Spec(self.spec.format("{name}{@versions}"))
elif self.spec.name in BUILTIN_TO_LEGACY_COMPILER:
elif self.spec.name in spack.compilers.config.package_name_to_compiler_name:
# If it is the package for a supported compiler, but of a different name
cname = BUILTIN_TO_LEGACY_COMPILER[self.spec.name]
cname = spack.compilers.config.package_name_to_compiler_name[self.spec.name]
provides["compiler"] = spack.spec.Spec(cname, self.spec.versions)
# All the other tokens in the hierarchy must be virtual dependencies

View File

@@ -47,7 +47,6 @@
import spack.store
import spack.url
import spack.util.environment
import spack.util.executable
import spack.util.path
import spack.util.web
import spack.variant
@@ -1290,13 +1289,12 @@ def extendee_spec(self):
if not self.extendees:
return None
deps = []
# If the extendee is in the spec's deps already, return that.
deps = [
dep
for dep in self.spec.dependencies(deptype=("link", "run"))
for d, when in self.extendees.values()
if dep.satisfies(d) and self.spec.satisfies(when)
]
for dep in self.spec.traverse(deptype=("link", "run")):
if dep.name in self.extendees:
deps.append(dep)
if deps:
assert len(deps) == 1
@@ -1373,14 +1371,6 @@ def prefix(self):
def home(self):
return self.prefix
@property
def command(self) -> spack.util.executable.Executable:
"""Returns the main executable for this package."""
path = os.path.join(self.home.bin, self.spec.name)
if fsys.is_exe(path):
return spack.util.executable.Executable(path)
raise RuntimeError(f"Unable to locate {self.spec.name} command in {self.home.bin}")
def url_version(self, version):
"""
Given a version, this returns a string that should be substituted

View File

@@ -32,7 +32,6 @@
import llnl.util.tty as tty
from llnl.util.filesystem import working_dir
import spack
import spack.caches
import spack.config
import spack.error
@@ -50,8 +49,6 @@
#: Package modules are imported as spack.pkg.<repo-namespace>.<pkg-name>
ROOT_PYTHON_NAMESPACE = "spack.pkg"
_API_REGEX = re.compile(r"^v(\d+)\.(\d+)$")
def python_package_for_repo(namespace):
"""Returns the full namespace of a repository, given its relative one
@@ -914,52 +911,19 @@ def __reduce__(self):
return RepoPath.unmarshal, self.marshal()
def _parse_package_api_version(
config: Dict[str, Any],
min_api: Tuple[int, int] = spack.min_package_api_version,
max_api: Tuple[int, int] = spack.package_api_version,
) -> Tuple[int, int]:
api = config.get("api")
if api is None:
package_api = (1, 0)
else:
if not isinstance(api, str):
raise BadRepoError(f"Invalid Package API version '{api}'. Must be of the form vX.Y")
api_match = _API_REGEX.match(api)
if api_match is None:
raise BadRepoError(f"Invalid Package API version '{api}'. Must be of the form vX.Y")
package_api = (int(api_match.group(1)), int(api_match.group(2)))
if min_api <= package_api <= max_api:
return package_api
min_str = ".".join(str(i) for i in min_api)
max_str = ".".join(str(i) for i in max_api)
curr_str = ".".join(str(i) for i in package_api)
raise BadRepoError(
f"Package API v{curr_str} is not supported by this version of Spack ("
f"must be between v{min_str} and v{max_str})"
)
class Repo:
"""Class representing a package repository in the filesystem.
Each package repository must have a top-level configuration file called `repo.yaml`.
Each package repository must have a top-level configuration file
called `repo.yaml`.
It contains the following keys:
Currently, `repo.yaml` must define:
`namespace`:
A Python namespace where the repository's packages should live.
`subdirectory`:
An optional subdirectory name where packages are placed
`api`:
A string of the form vX.Y that indicates the Package API version. The default is "v1.0".
For the repo to be compatible with the current version of Spack, the version must be
greater than or equal to :py:data:`spack.min_package_api_version` and less than or equal to
:py:data:`spack.package_api_version`.
"""
def __init__(
@@ -996,7 +960,7 @@ def check(condition, msg):
f"{os.path.join(root, repo_config_name)} must define a namespace.",
)
self.namespace: str = config["namespace"]
self.namespace = config["namespace"]
check(
re.match(r"[a-zA-Z][a-zA-Z0-9_.]+", self.namespace),
f"Invalid namespace '{self.namespace}' in repo '{self.root}'. "
@@ -1009,14 +973,12 @@ def check(condition, msg):
# Keep name components around for checking prefixes.
self._names = self.full_namespace.split(".")
packages_dir: str = config.get("subdirectory", packages_dir_name)
packages_dir = config.get("subdirectory", packages_dir_name)
self.packages_path = os.path.join(self.root, packages_dir)
check(
os.path.isdir(self.packages_path), f"No directory '{packages_dir}' found in '{root}'"
)
self.package_api = _parse_package_api_version(config)
# Class attribute overrides by package name
self.overrides = overrides or {}
@@ -1066,7 +1028,7 @@ def is_prefix(self, fullname: str) -> bool:
parts = fullname.split(".")
return self._names[: len(parts)] == parts
def _read_config(self) -> Dict[str, Any]:
def _read_config(self) -> Dict[str, str]:
"""Check for a YAML config file in this db's root directory."""
try:
with open(self.config_file, encoding="utf-8") as reponame_file:
@@ -1408,8 +1370,6 @@ def create_repo(root, namespace=None, subdir=packages_dir_name):
config.write(f" namespace: '{namespace}'\n")
if subdir != packages_dir_name:
config.write(f" subdirectory: '{subdir}'\n")
x, y = spack.package_api_version
config.write(f" api: v{x}.{y}\n")
except OSError as e:
# try to clean up.

View File

@@ -29,7 +29,11 @@
# merged configuration scope schemas
spack.schema.merged.properties,
# extra environment schema properties
{"specs": spec_list_schema, "include_concrete": include_concrete},
{
"include": {"type": "array", "default": [], "items": {"type": "string"}},
"specs": spec_list_schema,
"include_concrete": include_concrete,
},
),
}
}

View File

@@ -1,41 +0,0 @@
# Copyright Spack Project Developers. See COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Schema for include.yaml configuration file.
.. literalinclude:: _spack_root/lib/spack/spack/schema/include.py
:lines: 12-
"""
from typing import Any, Dict
#: Properties for inclusion in other schemas
properties: Dict[str, Any] = {
"include": {
"type": "array",
"default": [],
"additionalProperties": False,
"items": {
"anyOf": [
{
"type": "object",
"properties": {
"when": {"type": "string"},
"path": {"type": "string"},
"sha256": {"type": "string"},
"optional": {"type": "boolean"},
},
"required": ["path"],
"additionalProperties": False,
},
{"type": "string"},
]
},
}
}
#: Full schema with metadata
schema = {
"$schema": "http://json-schema.org/draft-07/schema#",
"title": "Spack include configuration file schema",
"properties": properties,
}
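As a hedged illustration of what this schema accepts, an include section may mix bare path strings with conditional entries; all paths, the condition, and the checksum below are hypothetical placeholders:

    include:
    - /path/to/local/config-dir
    - path: https://example.com/extra-config.yaml
      sha256: <hex digest of the remote file>
      when: <string condition>
      optional: true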

View File

@@ -21,7 +21,6 @@
import spack.schema.definitions
import spack.schema.develop
import spack.schema.env_vars
import spack.schema.include
import spack.schema.mirrors
import spack.schema.modules
import spack.schema.packages
@@ -41,7 +40,6 @@
spack.schema.definitions.properties,
spack.schema.develop.properties,
spack.schema.env_vars.properties,
spack.schema.include.properties,
spack.schema.mirrors.properties,
spack.schema.modules.properties,
spack.schema.packages.properties,
@@ -50,6 +48,7 @@
spack.schema.view.properties,
)
#: Full schema with metadata
schema = {
"$schema": "http://json-schema.org/draft-07/schema#",

View File

@@ -49,6 +49,7 @@
import spack.deptypes as dt
import spack.environment as ev
import spack.error
import spack.hash_types as ht
import spack.package_base
import spack.package_prefs
import spack.patch
@@ -562,6 +563,7 @@ def to_dict(self, test: bool = False) -> dict:
serial_node_arg = (
lambda node_dict: f"""{{"id": "{node_dict.id}", "pkg": "{node_dict.pkg}"}}"""
)
spec_hash_type = ht.process_hash if test else ht.dag_hash
ret = dict()
ret["asp"] = self.asp
ret["criteria"] = self.criteria
@@ -575,14 +577,14 @@ def to_dict(self, test: bool = False) -> dict:
serial_answer = answer[:2]
serial_answer_dict = {}
for node, spec in answer[2].items():
serial_answer_dict[serial_node_arg(node)] = spec.to_dict()
serial_answer_dict[serial_node_arg(node)] = spec.to_dict(hash=spec_hash_type)
serial_answer = serial_answer + (serial_answer_dict,)
serial_answers.append(serial_answer)
ret["answers"] = serial_answers
ret["specs_by_input"] = {}
input_specs = {} if not self.specs_by_input else self.specs_by_input
for input, spec in input_specs.items():
ret["specs_by_input"][str(input)] = spec.to_dict()
ret["specs_by_input"][str(input)] = spec.to_dict(hash=spec_hash_type)
return ret
@staticmethod
@@ -642,9 +644,10 @@ class ConcretizationCache:
"""
def __init__(self, root: Union[str, None] = None):
root = root or spack.config.get(
"config:concretization_cache:url", spack.paths.default_conc_cache_path
)
if not root:
root = spack.config.get(
"config:concretization_cache:url", spack.paths.default_conc_cache_path
)
self.root = pathlib.Path(spack.util.path.canonicalize_path(root))
self._fc = FileCache(self.root)
self._cache_manifest = ".cache_manifest"
@@ -1186,7 +1189,7 @@ def solve(self, setup, specs, reuse=None, output=None, control=None, allow_depre
full_path = lambda x: os.path.join(parent_dir, x)
abs_control_files = [full_path(x) for x in control_files]
for ctrl_file in abs_control_files:
with open(ctrl_file, "r", encoding="utf-8") as f:
with open(ctrl_file, "r+", encoding="utf-8") as f:
problem_repr += "\n" + f.read()
result = None
@@ -1504,7 +1507,6 @@ def __init__(self, tests: bool = False):
)
self.possible_compilers: List[spack.spec.Spec] = []
self.rejected_compilers: Set[spack.spec.Spec] = set()
self.possible_oses: Set = set()
self.variant_values_from_specs: Set = set()
self.version_constraints: Set = set()
@@ -2165,8 +2167,8 @@ def emit_facts_from_requirement_rules(self, rules: List[RequirementRule]):
spec.attach_git_version_lookup()
when_spec = spec
if virtual and spec.name != pkg_name:
when_spec = spack.spec.Spec(f"^[virtuals={pkg_name}] {spec.name}")
if virtual:
when_spec = spack.spec.Spec(pkg_name)
try:
context = ConditionContext()
@@ -2264,13 +2266,6 @@ def external_packages(self):
for local_idx, spec in enumerate(candidate_specs):
msg = f"{spec.name} available as external when satisfying {spec}"
if any(x.satisfies(spec) for x in self.rejected_compilers):
tty.debug(
f"[{__name__}]: not considering {spec} as external, since "
f"it's a non-working compiler"
)
continue
if spec_filters and spec not in selected_externals:
continue
@@ -3053,9 +3048,7 @@ def setup(
compilers_from_reuse = {
x for x in reuse if x.name in supported_compilers and not x.external
}
candidate_compilers, self.rejected_compilers = possible_compilers(
configuration=spack.config.CONFIG
)
candidate_compilers = possible_compilers(configuration=spack.config.CONFIG)
for x in candidate_compilers:
if x.external or x in reuse:
continue
@@ -3394,7 +3387,8 @@ def __init__(self):
self.asp_problem = []
def fact(self, atom: AspFunction) -> None:
self.asp_problem.append(f"{atom}.\n")
symbol = atom.symbol() if hasattr(atom, "symbol") else atom
self.asp_problem.append(f"{str(symbol)}.\n")
def append(self, rule: str) -> None:
self.asp_problem.append(rule)
@@ -3423,13 +3417,12 @@ def value(self) -> str:
return "".join(self.asp_problem)
def possible_compilers(*, configuration) -> Tuple[Set["spack.spec.Spec"], Set["spack.spec.Spec"]]:
result, rejected = set(), set()
def possible_compilers(*, configuration) -> Set["spack.spec.Spec"]:
result = set()
# Compilers defined in configuration
for c in spack.compilers.config.all_compilers_from(configuration):
if using_libc_compatibility() and not c_compiler_runs(c):
rejected.add(c)
try:
compiler = c.extra_attributes["compilers"]["c"]
tty.debug(
@@ -3442,7 +3435,6 @@ def possible_compilers(*, configuration) -> Tuple[Set["spack.spec.Spec"], Set["s
continue
if using_libc_compatibility() and not CompilerPropertyDetector(c).default_libc():
rejected.add(c)
warnings.warn(
f"cannot detect libc from {c}. The compiler will not be used "
f"during concretization."
@@ -3450,7 +3442,9 @@ def possible_compilers(*, configuration) -> Tuple[Set["spack.spec.Spec"], Set["s
continue
if c in result:
tty.debug(f"[{__name__}] duplicate {c.long_spec} compiler found")
warnings.warn(
f"duplicate {c.long_spec} compiler found. Edit your packages.yaml to remove it."
)
continue
result.add(c)
@@ -3460,7 +3454,7 @@ def possible_compilers(*, configuration) -> Tuple[Set["spack.spec.Spec"], Set["s
for pkg_name in supported_compilers:
result.update(spack.store.STORE.db.query(pkg_name))
return result, rejected
return result
class RuntimePropertyRecorder:
@@ -3638,6 +3632,7 @@ def propagate(self, constraint_str: str, *, when: str):
body_str, node_variable = self.rule_body_from(when_spec)
constraint_spec = spack.spec.Spec(constraint_str)
# constraint_spec.name = placeholder
constraint_clauses = self._setup.spec_clauses(constraint_spec, body=False)
for clause in constraint_clauses:
if clause.args[0] == "node_version_satisfies":
@@ -4264,8 +4259,6 @@ def _is_reusable(spec: spack.spec.Spec, packages, local: bool) -> bool:
def _has_runtime_dependencies(spec: spack.spec.Spec) -> bool:
# TODO (compiler as nodes): this function contains specific names from builtin, and should
# be made more general
if "gcc" in spec and "gcc-runtime" not in spec:
return False
@@ -4702,7 +4695,3 @@ def __init__(self, provided, conflicts):
class InvalidSpliceError(spack.error.SpackError):
"""For cases in which the splice configuration is invalid."""
class NoCompilerFoundError(spack.error.SpackError):
"""Raised when there is no possible compiler"""

View File

@@ -314,6 +314,12 @@ possible_version_weight(node(ID, Package), Weight)
{ attr("version", node(ID, Package), Version) : pkg_fact(Package, version_satisfies(Constraint, Version)) }
:- attr("node_version_satisfies", node(ID, Package), Constraint).
% If there is at least one version that satisfies the constraint, impose a lower
% bound on the choice rule to avoid false positives with the error below
{ attr("version", node(ID, Package), Version) : pkg_fact(Package, version_satisfies(Constraint, Version)) }
:- attr("node_version_satisfies", node(ID, Package), Constraint),
pkg_fact(Package, version_satisfies(Constraint, _)).
% More specific error message if the version cannot satisfy some constraint
% Otherwise covered by `no_version_error` and `versions_conflict_error`.
error(1, "Cannot satisfy '{0}@{1}'", Package, Constraint)
@@ -498,9 +504,6 @@ attr("node_version_satisfies", node(X, BuildDependency), Constraint) :-
attr("depends_on", node(X, Parent), node(Y, BuildDependency), "build") :- build_requirement(node(X, Parent), node(Y, BuildDependency)).
1 { attr("provider_set", node(X, BuildDependency), node(0..Y-1, Virtual)) : max_dupes(Virtual, Y) } 1 :-
attr("build_requirement", ParentNode, build_requirement("provider_set", BuildDependency, Virtual)),
build_requirement(ParentNode, node(X, BuildDependency)).
% Reconstruct virtual dependencies for reused specs
attr("virtual_on_edge", node(X, A1), node(Y, A2), Virtual)
@@ -694,13 +697,6 @@ attr("virtual_on_edge", PackageNode, ProviderNode, Virtual)
attr("virtual_on_incoming_edges", ProviderNode, Virtual)
:- attr("virtual_on_edge", _, ProviderNode, Virtual).
% This is needed to allow requirements on virtuals,
% when a virtual root is requested
attr("virtual_on_incoming_edges", ProviderNode, Virtual)
:- attr("virtual_root", node(min_dupe_id, Virtual)),
attr("root", ProviderNode),
provider(ProviderNode, node(min_dupe_id, Virtual)).
% dependencies on virtuals also imply that the virtual is a virtual node
1 { attr("virtual_node", node(0..X-1, Virtual)) : max_dupes(Virtual, X) }
:- node_depends_on_virtual(PackageNode, Virtual).
@@ -709,8 +705,8 @@ attr("virtual_on_incoming_edges", ProviderNode, Virtual)
% The provider must be selected among the possible providers.
error(100, "'{0}' cannot be a provider for the '{1}' virtual", Package, Virtual)
:- attr("provider_set", node(X, Package), node(Y, Virtual)),
not virtual_condition_holds( node(X, Package), Virtual).
:- attr("provider_set", node(min_dupe_id, Package), node(min_dupe_id, Virtual)),
not virtual_condition_holds( node(min_dupe_id, Package), Virtual).
error(100, "Cannot find valid provider for virtual {0}", Virtual)
:- attr("virtual_node", node(X, Virtual)),
@@ -1071,14 +1067,12 @@ error(100, "Cannot set variant '{0}' for package '{1}' because the variant condi
build(node(ID, Package)).
% at most one variant value for single-valued variants.
error(100, "'{0}' requires conflicting variant values 'Spec({1}={2})' and 'Spec({1}={3})'", Package, Variant, Value1, Value2)
error(100, "'{0}' required multiple values for single-valued variant '{1}'", Package, Variant)
:- attr("node", node(ID, Package)),
node_has_variant(node(ID, Package), Variant, _),
variant_single_value(node(ID, Package), Variant),
attr("variant_value", node(ID, Package), Variant, Value1),
attr("variant_value", node(ID, Package), Variant, Value2),
Value1 < Value2,
build(node(ID, Package)).
build(node(ID, Package)),
2 { attr("variant_value", node(ID, Package), Variant, Value) }.
error(100, "No valid value for variant '{1}' of package '{0}'", Package, Variant)
:- attr("node", node(ID, Package)),
@@ -1421,7 +1415,6 @@ compiler(Compiler) :- compiler_supports_target(Compiler, _, _).
language("c").
language("cxx").
language("fortran").
language_runtime("fortran-rt").
error(10, "Only external, or concrete, compilers are allowed for the {0} language", Language)
:- provider(ProviderNode, node(_, Language)),
@@ -1684,7 +1677,7 @@ opt_criterion(60, "preferred providers for roots").
#minimize{
Weight@60+Priority,ProviderNode,X,Virtual
: provider_weight(ProviderNode, node(X, Virtual), Weight),
attr("root", ProviderNode), not language(Virtual), not language_runtime(Virtual),
attr("root", ProviderNode), not language(Virtual),
build_priority(ProviderNode, Priority)
}.
@@ -1717,7 +1710,7 @@ opt_criterion(48, "preferred providers (non-roots)").
#minimize{
Weight@48+Priority,ProviderNode,X,Virtual
: provider_weight(ProviderNode, node(X, Virtual), Weight),
not attr("root", ProviderNode), not language(Virtual), not language_runtime(Virtual),
not attr("root", ProviderNode), not language(Virtual),
build_priority(ProviderNode, Priority)
}.
@@ -1810,15 +1803,6 @@ opt_criterion(5, "non-preferred targets").
not runtime(Package)
}.
opt_criterion(4, "preferred providers (language runtimes)").
#minimize{ 0@204: #true }.
#minimize{ 0@4: #true }.
#minimize{
Weight@4+Priority,ProviderNode,X,Virtual
: provider_weight(ProviderNode, node(X, Virtual), Weight),
language_runtime(Virtual),
build_priority(ProviderNode, Priority)
}.
% Choose more recent versions for runtimes
opt_criterion(3, "version badness (runtimes)").

View File

@@ -31,19 +31,16 @@ class AspObject:
"""Object representing a piece of ASP code."""
def _id(thing: Any) -> Union[str, int, AspObject]:
def _id(thing: Any) -> Union[str, AspObject]:
"""Quote string if needed for it to be a valid identifier."""
if isinstance(thing, bool):
return f'"{thing}"'
elif isinstance(thing, (AspObject, int)):
if isinstance(thing, AspObject):
return thing
elif isinstance(thing, bool):
return f'"{str(thing)}"'
elif isinstance(thing, int):
return str(thing)
else:
if isinstance(thing, str):
# escape characters that cannot be in clingo strings
thing = thing.replace("\\", r"\\")
thing = thing.replace("\n", r"\n")
thing = thing.replace('"', r"\"")
return f'"{thing}"'
return f'"{str(thing)}"'
class AspVar(AspObject):
@@ -93,9 +90,26 @@ def __call__(self, *args: Any) -> "AspFunction":
"""
return AspFunction(self.name, self.args + args)
def _argify(self, arg: Any) -> Any:
"""Turn the argument into an appropriate clingo symbol"""
if isinstance(arg, bool):
return clingo().String(str(arg))
elif isinstance(arg, int):
return clingo().Number(arg)
elif isinstance(arg, AspFunction):
return clingo().Function(arg.name, [self._argify(x) for x in arg.args], positive=True)
elif isinstance(arg, AspVar):
return clingo().Variable(arg.name)
return clingo().String(str(arg))
def symbol(self):
"""Return a clingo symbol for this function"""
return clingo().Function(
self.name, [self._argify(arg) for arg in self.args], positive=True
)
def __str__(self) -> str:
args = f"({','.join(str(_id(arg)) for arg in self.args)})"
return f"{self.name}{args}"
return f"{self.name}({', '.join(str(_id(arg)) for arg in self.args)})"
def __repr__(self) -> str:
return str(self)

View File

@@ -117,7 +117,7 @@ error(0, "Cannot find a valid provider for virtual {0}", Virtual, startcauses, C
condition_holds(Cause, node(CID, TriggerPkg)).
% At most one variant value for single-valued variants
error(0, "'{0}' requires conflicting variant values 'Spec({1}={2})' and 'Spec({1}={3})'", Package, Variant, Value1, Value2, startcauses, Cause1, X, Cause2, X)
error(0, "'{0}' required multiple values for single-valued variant '{1}'\n Requested 'Spec({1}={2})' and 'Spec({1}={3})'", Package, Variant, Value1, Value2, startcauses, Cause1, X, Cause2, X)
:- attr("node", node(X, Package)),
node_has_variant(node(X, Package), Variant, VariantID),
variant_single_value(node(X, Package), Variant),

View File

@@ -8,10 +8,6 @@
% These rules are used on Linux
%=============================================================================
% Non-libc reused specs must be host libc compatible. In case we build packages, we get a
% host compatible libc provider from other rules. If nothing is built, there is no libc provider,
% since it's pruned from reusable specs, meaning we have to explicitly impose reused specs are host
% compatible.
% A package cannot be reused if it needs a libc that is not compatible with the current one
error(100, "Cannot reuse {0} since we cannot determine libc compatibility", ReusedPackage)
@@ -28,6 +24,14 @@ error(100, "Cannot reuse {0} since we cannot determine libc compatibility", Reus
attr("needs_libc", node(R, ReusedPackage)),
not attr("compatible_libc", node(R, ReusedPackage), _, _).
% Non-libc reused specs must be host libc compatible. In case we build packages, we get a
% host compatible libc provider from other rules. If nothing is built, there is no libc provider,
% since it's pruned from reusable specs, meaning we have to explicitly impose reused specs are host
% compatible.
%:- attr("hash", node(R, ReusedPackage), Hash),
% not provider(node(R, ReusedPackage), node(0, "libc")),
% not attr("compatible_libc", node(R, ReusedPackage), _, _).
% The libc provider must be one that a compiler can target
:- has_built_packages(),
provider(node(X, LibcPackage), node(0, "libc")),

View File

@@ -11,7 +11,7 @@
import spack.package_base
import spack.repo
import spack.spec
from spack.util.spack_yaml import get_mark_from_yaml_data
from spack.config import get_mark_from_yaml_data
class RequirementKind(enum.Enum):
@@ -69,29 +69,18 @@ def rules_from_package_py(self, pkg: spack.package_base.PackageBase) -> List[Req
return rules
def rules_from_virtual(self, virtual_str: str) -> List[RequirementRule]:
kind, requests = self._raw_yaml_data(virtual_str, section="require", virtual=True)
result = self._rules_from_requirements(virtual_str, requests, kind=kind)
kind, requests = self._raw_yaml_data(virtual_str, section="prefer", virtual=True)
result.extend(self._rules_from_preferences(virtual_str, preferences=requests, kind=kind))
kind, requests = self._raw_yaml_data(virtual_str, section="conflict", virtual=True)
result.extend(self._rules_from_conflicts(virtual_str, conflicts=requests, kind=kind))
return result
requirements = self.config.get("packages", {}).get(virtual_str, {}).get("require", [])
return self._rules_from_requirements(
virtual_str, requirements, kind=RequirementKind.VIRTUAL
)
def rules_from_require(self, pkg: spack.package_base.PackageBase) -> List[RequirementRule]:
kind, requirements = self._raw_yaml_data(pkg.name, section="require")
kind, requirements = self._raw_yaml_data(pkg, section="require")
return self._rules_from_requirements(pkg.name, requirements, kind=kind)
def rules_from_prefer(self, pkg: spack.package_base.PackageBase) -> List[RequirementRule]:
kind, preferences = self._raw_yaml_data(pkg.name, section="prefer")
return self._rules_from_preferences(pkg.name, preferences=preferences, kind=kind)
def _rules_from_preferences(
self, pkg_name: str, *, preferences, kind: RequirementKind
) -> List[RequirementRule]:
result = []
kind, preferences = self._raw_yaml_data(pkg, section="prefer")
for item in preferences:
spec, condition, message = self._parse_prefer_conflict_item(item)
result.append(
@@ -100,7 +89,7 @@ def _rules_from_preferences(
# require:
# - any_of: [spec_str, "@:"]
RequirementRule(
pkg_name=pkg_name,
pkg_name=pkg.name,
policy="any_of",
requirements=[spec, spack.spec.Spec("@:")],
kind=kind,
@@ -111,13 +100,8 @@ def _rules_from_preferences(
return result
def rules_from_conflict(self, pkg: spack.package_base.PackageBase) -> List[RequirementRule]:
kind, conflicts = self._raw_yaml_data(pkg.name, section="conflict")
return self._rules_from_conflicts(pkg.name, conflicts=conflicts, kind=kind)
def _rules_from_conflicts(
self, pkg_name: str, *, conflicts, kind: RequirementKind
) -> List[RequirementRule]:
result = []
kind, conflicts = self._raw_yaml_data(pkg, section="conflict")
for item in conflicts:
spec, condition, message = self._parse_prefer_conflict_item(item)
result.append(
@@ -126,7 +110,7 @@ def _rules_from_conflicts(
# require:
# - one_of: [spec_str, "@:"]
RequirementRule(
pkg_name=pkg_name,
pkg_name=pkg.name,
policy="one_of",
requirements=[spec, spack.spec.Spec("@:")],
kind=kind,
@@ -148,14 +132,10 @@ def _parse_prefer_conflict_item(self, item):
message = item.get("message")
return spec, condition, message
def _raw_yaml_data(self, pkg_name: str, *, section: str, virtual: bool = False):
def _raw_yaml_data(self, pkg: spack.package_base.PackageBase, *, section: str):
config = self.config.get("packages")
data = config.get(pkg_name, {}).get(section, [])
data = config.get(pkg.name, {}).get(section, [])
kind = RequirementKind.PACKAGE
if virtual:
return RequirementKind.VIRTUAL, data
if not data:
data = config.get("all", {}).get(section, [])
kind = RequirementKind.DEFAULT
@@ -188,8 +168,7 @@ def _rules_from_requirements(
# validate specs from YAML first, and fail with line numbers if parsing fails.
constraints = [
parse_spec_from_yaml_string(constraint, named=kind == RequirementKind.VIRTUAL)
for constraint in constraints
parse_spec_from_yaml_string(constraint) for constraint in constraints
]
when_str = requirement.get("when")
when = parse_spec_from_yaml_string(when_str) if when_str else spack.spec.Spec()
@@ -247,37 +226,21 @@ def reject_requirement_constraint(
return False
def parse_spec_from_yaml_string(string: str, *, named: bool = False) -> spack.spec.Spec:
def parse_spec_from_yaml_string(string: str) -> spack.spec.Spec:
"""Parse a spec from YAML and add file/line info to errors, if it's available.
Parse a ``Spec`` from the supplied string, but also intercept any syntax errors and
add file/line information for debugging using file/line annotations from the string.
Args:
Arguments:
string: a string representing a ``Spec`` from config YAML.
named: if True, the spec must have a name
"""
try:
result = spack.spec.Spec(string)
return spack.spec.Spec(string)
except spack.error.SpecSyntaxError as e:
mark = get_mark_from_yaml_data(string)
if mark:
msg = f"{mark.name}:{mark.line + 1}: {str(e)}"
raise spack.error.SpecSyntaxError(msg) from e
raise e
if named is True and not result.name:
msg = f"expected a named spec, but got '{string}' instead"
mark = get_mark_from_yaml_data(string)
# Add a hint in case it's dependencies
deps = result.dependencies()
if len(deps) == 1:
msg = f"{msg}. Did you mean '{deps[0]}'?"
if mark:
msg = f"{mark.name}:{mark.line + 1}: {msg}"
raise spack.error.SpackError(msg)
return result
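A usage sketch for either variant of the function above (spec strings hypothetical): a well-formed string parses normally, while a syntax error in a config-sourced string gets a file:line prefix from its YAML mark:

    s = parse_spec_from_yaml_string("cmake@3.27:")   # -> a Spec
    parse_spec_from_yaml_string("cmake @@")          # SpecSyntaxError; the message is
                                                     # prefixed with "<file>:<line>:"
                                                     # when the string carries a YAML mark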

View File

@@ -86,7 +86,6 @@
import llnl.util.tty.color as clr
import spack
import spack.aliases
import spack.compilers.flags
import spack.deptypes as dt
import spack.error
@@ -98,6 +97,7 @@
import spack.spec_parser
import spack.store
import spack.traverse
import spack.util.executable
import spack.util.hash
import spack.util.prefix
import spack.util.spack_json as sjson
@@ -699,11 +699,6 @@ def __init__(self):
super().__init__(name="compiler")
def factory(self, instance, owner):
if instance.original_spec_format() < 5:
compiler = instance.annotations.compiler_node_attribute
assert compiler is not None, "a compiler spec is expected"
return CompilerSpec(compiler)
for language in ("c", "cxx", "fortran"):
deps = instance.dependencies(virtuals=language)
if deps:
@@ -1072,6 +1067,28 @@ def clear(self):
self.edges.clear()
def _command_default_handler(spec: "Spec"):
"""Default handler when looking for the 'command' attribute.
Tries to search for ``spec.name`` in the ``spec.home.bin`` directory.
Parameters:
spec: spec that is being queried
Returns:
Executable: An executable of the command
Raises:
RuntimeError: If the command is not found
"""
home = getattr(spec.package, "home")
path = os.path.join(home.bin, spec.name)
if fs.is_exe(path):
return spack.util.executable.Executable(path)
raise RuntimeError(f"Unable to locate {spec.name} command in {home.bin}")
def _headers_default_handler(spec: "Spec"):
"""Default handler when looking for the 'headers' attribute.
@@ -1275,7 +1292,9 @@ class SpecBuildInterface(lang.ObjectWrapper):
home = ForwardQueryToPackage("home", default_handler=None)
headers = ForwardQueryToPackage("headers", default_handler=_headers_default_handler)
libs = ForwardQueryToPackage("libs", default_handler=_libs_default_handler)
command = ForwardQueryToPackage("command", default_handler=None, _indirect=True)
command = ForwardQueryToPackage(
"command", default_handler=_command_default_handler, _indirect=True
)
def __init__(
self,
@@ -1474,7 +1493,7 @@ def __init__(self, spec_like=None, *, external_path=None, external_modules=None)
self.abstract_hash = None
# initial values for all spec hash types
for h in ht.HASHES:
for h in ht.hashes:
setattr(self, h.attr, None)
# cache for spec's prefix, computed lazily by prefix property
@@ -1625,16 +1644,12 @@ def edge_attributes(self) -> str:
return ""
union = DependencySpec(parent=Spec(), spec=self, depflag=0, virtuals=())
all_direct_edges = all(x.direct for x in edges)
for edge in edges:
union.update_deptypes(edge.depflag)
union.update_virtuals(edge.virtuals)
deptypes_str = ""
if not all_direct_edges and union.depflag:
deptypes_str = f"deptypes={','.join(dt.flag_to_tuple(union.depflag))}"
deptypes_str = (
f"deptypes={','.join(dt.flag_to_tuple(union.depflag))}" if union.depflag else ""
)
virtuals_str = f"virtuals={','.join(union.virtuals)}" if union.virtuals else ""
if not deptypes_str and not virtuals_str:
return ""
@@ -2074,19 +2089,21 @@ def traverse_edges(
def long_spec(self):
"""Returns a string of the spec with the dependencies completely
enumerated."""
name_conversion = {
"llvm": "clang",
"intel-oneapi-compilers": "oneapi",
"llvm-amdgpu": "rocmcc",
"intel-oneapi-compiler-classic": "intel",
"acfl": "arm",
}
parts = [self.format()]
direct, transitive = lang.stable_partition(
self.edges_to_dependencies(), predicate_fn=lambda x: x.direct
)
for item in sorted(direct, key=lambda x: x.spec.name):
current_name = item.spec.name
new_name = spack.aliases.BUILTIN_TO_LEGACY_COMPILER.get(current_name, current_name)
# note: depflag not allowed, currently, on "direct" edges
edge_attributes = ""
if item.virtuals:
edge_attributes = item.spec.format("{edge_attributes}") + " "
parts.append(f"%{edge_attributes}{item.spec.format()}".replace(current_name, new_name))
new_name = name_conversion.get(current_name, current_name)
parts.append(f"%{item.spec.format()}".replace(current_name, new_name))
for item in sorted(transitive, key=lambda x: x.spec.name):
# Recurse to attach build deps in order
edge_attributes = ""
@@ -2183,16 +2200,30 @@ def package_hash(self):
def dag_hash(self, length=None):
"""This is Spack's default hash, used to identify installations.
Same as the full hash (includes package hash and build/link/run deps).
Tells us when package files and any dependencies have changed.
NOTE: Versions of Spack prior to 0.18 only included link and run deps.
NOTE: Versions of Spack prior to 1.0 did not include test deps.
"""
return self._cached_hash(ht.dag_hash, length)
def process_hash(self, length=None):
"""Hash used to transfer specs among processes.
This hash includes build and test dependencies and is only used to
serialize a spec and pass it around among processes.
"""
return self._cached_hash(ht.process_hash, length)
def dag_hash_bit_prefix(self, bits):
"""Get the first <bits> bits of the DAG hash as an integer type."""
return spack.util.hash.base32_prefix_bits(self.dag_hash(), bits)
def process_hash_bit_prefix(self, bits):
"""Get the first <bits> bits of the DAG hash as an integer type."""
return spack.util.hash.base32_prefix_bits(self.process_hash(), bits)
def _lookup_hash(self):
"""Lookup just one spec with an abstract hash, returning a spec from the the environment,
store, or finally, binary caches."""
@@ -3119,8 +3150,6 @@ def _constrain_dependencies(self, other: "Spec") -> bool:
raise UnconstrainableDependencySpecError(other)
# Handle common first-order constraints directly
# Note: This doesn't handle constraining transitive dependencies with the same name
# as direct dependencies
changed = False
common_dependencies = {x.name for x in self.dependencies()}
common_dependencies &= {x.name for x in other.dependencies()}
@@ -3410,18 +3439,7 @@ def satisfies(self, other: Union[str, "Spec"], deps: bool = True) -> bool:
# Note: this relies on abstract specs from string not being deeper than 2 levels
# e.g. in foo %fee ^bar %baz we cannot go deeper than "baz" and e.g. specify its
# dependencies too.
#
# We also need to account for cases like gcc@<new> %gcc@<old> where the parent
# name is the same as the child name
#
# The same assumptions hold on Spec.constrain, and Spec.intersect
current_node = self
if rhs_edge.parent.name is not None and rhs_edge.parent.name != rhs_edge.spec.name:
try:
current_node = self[rhs_edge.parent.name]
except KeyError:
return False
current_node = self if rhs_edge.parent.name is None else self[rhs_edge.parent.name]
candidates = current_node.dependencies(
name=rhs_edge.spec.name,
deptype=rhs_edge.depflag,
@@ -3430,8 +3448,6 @@ def satisfies(self, other: Union[str, "Spec"], deps: bool = True) -> bool:
if not candidates or not any(x.satisfies(rhs_edge.spec) for x in candidates):
return False
continue
if not rhs_edge.virtuals:
continue
@@ -3576,11 +3592,11 @@ def _dup(self, other: "Spec", deps: Union[bool, dt.DepTypes, dt.DepFlag] = True)
if self._concrete:
self._dunder_hash = other._dunder_hash
for h in ht.HASHES:
for h in ht.hashes:
setattr(self, h.attr, getattr(other, h.attr, None))
else:
self._dunder_hash = None
for h in ht.HASHES:
for h in ht.hashes:
setattr(self, h.attr, None)
return changed
@@ -3666,8 +3682,8 @@ def __getitem__(self, name: str):
# Consider all direct dependencies and transitive runtime dependencies
order = itertools.chain(
self.edges_to_dependencies(depflag=dt.BUILD | dt.TEST),
self.traverse_edges(deptype=dt.LINK | dt.RUN, order="breadth", cover="edges"),
self.edges_to_dependencies(depflag=dt.BUILD | dt.TEST),
)
try:
@@ -3774,6 +3790,16 @@ def _cmp_iter(self):
# serialized before the hash change and one after, are considered different.
yield self.dag_hash() if self.concrete else None
# This needs to be in _cmp_iter so that no specs with different process hashes
# are considered the same by `__hash__` or `__eq__`.
#
# TODO: We should eventually unify the `_cmp_*` methods with `to_node_dict` so
# TODO: there aren't two sources of truth, but this needs some thought, since
# TODO: they exist for speed. We should benchmark whether it's really worth
# TODO: having two types of hashing now that we use `json` instead of `yaml` for
# TODO: spec hashing.
yield self.process_hash() if self.concrete else None
def deps():
for dep in sorted(itertools.chain.from_iterable(self._dependencies.values())):
yield dep.spec.name
@@ -4427,7 +4453,7 @@ def clear_caches(self, ignore: Tuple[str, ...] = ()) -> None:
"""
Clears all cached hashes in a Spec, while preserving other properties.
"""
for h in ht.HASHES:
for h in ht.hashes:
if h.attr not in ignore:
if hasattr(self, h.attr):
setattr(self, h.attr, None)
@@ -4436,12 +4462,18 @@ def clear_caches(self, ignore: Tuple[str, ...] = ()) -> None:
setattr(self, attr, None)
def __hash__(self):
# If the spec is concrete, we leverage the dag hash and just use a 64-bit prefix of it.
# The dag hash has the advantage that it's computed once per concrete spec, and it's saved
# -- so if we read concrete specs we don't need to recompute the whole hash.
# If the spec is concrete, we leverage the process hash and just use
# a 64-bit prefix of it. The process hash has the advantage that it's
# computed once per concrete spec, and it's saved -- so if we read
# concrete specs we don't need to recompute the whole hash. This is
# good for large, unchanging specs.
#
# We use the process hash instead of the DAG hash here because the DAG
# hash includes the package hash, which can cause infinite recursion,
# and which isn't defined unless the spec has a known package.
if self.concrete:
if not self._dunder_hash:
self._dunder_hash = self.dag_hash_bit_prefix(64)
self._dunder_hash = self.process_hash_bit_prefix(64)
return self._dunder_hash
# This is the normal hash for lazy_lexicographic_ordering. It's
@@ -4450,7 +4482,7 @@ def __hash__(self):
return hash(lang.tuplify(self._cmp_iter))
def __reduce__(self):
return Spec.from_dict, (self.to_dict(hash=ht.dag_hash),)
return Spec.from_dict, (self.to_dict(hash=ht.process_hash),)
def attach_git_version_lookup(self):
# Add a git lookup method for GitVersions
@@ -4464,9 +4496,6 @@ def original_spec_format(self) -> int:
"""Returns the spec format originally used for this spec."""
return self.annotations.original_spec_format
def has_virtual_dependency(self, virtual: str) -> bool:
return bool(self.dependencies(virtuals=(virtual,)))
class VariantMap(lang.HashableMap):
"""Map containing variant instances. New values can be added only
@@ -4779,7 +4808,7 @@ def from_node_dict(cls, node):
spec = Spec()
name, node = cls.name_and_data(node)
for h in ht.HASHES:
for h in ht.hashes:
setattr(spec, h.attr, node.get(h.name, None))
spec.name = name
@@ -4975,7 +5004,7 @@ def read_specfile_dep_specs(cls, deps, hash_type=ht.dag_hash.name):
"""
for dep_name, elt in deps.items():
if isinstance(elt, dict):
for h in ht.HASHES:
for h in ht.hashes:
if h.name in elt:
dep_hash, deptypes = elt[h.name], elt["type"]
hash_type = h.name
@@ -5020,7 +5049,7 @@ def read_specfile_dep_specs(cls, deps, hash_type=ht.dag_hash.name):
dep_name = dep["name"]
if isinstance(elt, dict):
# new format: elements of dependency spec are keyed.
for h in ht.HASHES:
for h in ht.hashes:
if h.name in elt:
dep_hash, deptypes, hash_type, virtuals = cls.extract_info_from_dep(elt, h)
break
@@ -5144,13 +5173,6 @@ def get_host_environment() -> Dict[str, Any]:
}
def eval_conditional(string):
"""Evaluate conditional definitions using restricted variable scope."""
valid_variables = get_host_environment()
valid_variables.update({"re": re, "env": os.environ})
return eval(string, valid_variables)
class SpecParseError(spack.error.SpecError):
"""Wrapper for ParseError for when we're parsing specs."""

View File

@@ -60,19 +60,14 @@
import pathlib
import re
import sys
import traceback
import warnings
from typing import Iterator, List, Optional, Tuple
from typing import Iterator, List, Optional
from llnl.util.tty import color
import spack.deptypes
import spack.error
import spack.paths
import spack.spec
import spack.util.spack_yaml
import spack.version
from spack.aliases import LEGACY_COMPILER_TO_BUILTIN
from spack.tokenize import Token, TokenBase, Tokenizer
#: Valid name for specs and variants. Here we are not using
@@ -102,9 +97,6 @@
#: Regex with groups to use for splitting (optionally propagated) key-value pairs
SPLIT_KVP = re.compile(rf"^({NAME})(==?)(.*)$")
#: Regex with groups to use for splitting %[virtuals=...] tokens
SPLIT_COMPILER_TOKEN = re.compile(rf"^%\[virtuals=({VALUE}|{QUOTED_VALUE})]\s*(.*)$")
#: A filename starts either with a "." or a "/" or a "{name}/, or on Windows, a drive letter
#: followed by a colon and "\" or "." or {name}\
WINDOWS_FILENAME = r"(?:\.|[a-zA-Z0-9-_]*\\|[a-zA-Z]:\\)(?:[a-zA-Z0-9-_\.\\]*)(?:\.json|\.yaml)"
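As a small illustration of SPLIT_KVP (token value hypothetical): it splits a key-value token into name, operator, and raw value, where a double == marks propagation, matching the PROPAGATED_KEY_VALUE_PAIR handling later in this file:

    m = SPLIT_KVP.match("cflags==-O3")
    name, op, value = m.groups()   # ("cflags", "==", "-O3")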
@@ -140,11 +132,6 @@ class SpecTokens(TokenBase):
# Compilers
COMPILER_AND_VERSION = rf"(?:%\s*(?:{NAME})(?:[\s]*)@\s*(?:{VERSION_LIST}))"
COMPILER = rf"(?:%\s*(?:{NAME}))"
COMPILER_AND_VERSION_WITH_VIRTUALS = (
rf"(?:%\[virtuals=(?:{VALUE}|{QUOTED_VALUE})\]"
rf"\s*(?:{NAME})(?:[\s]*)@\s*(?:{VERSION_LIST}))"
)
COMPILER_WITH_VIRTUALS = rf"(?:%\[virtuals=(?:{VALUE}|{QUOTED_VALUE})\]\s*(?:{NAME}))"
# FILENAME
FILENAME = rf"(?:{FILENAME})"
# Package name
@@ -217,32 +204,6 @@ def __init__(self, tokens: List[Token], text: str):
super().__init__(message)
def _warn_about_variant_after_compiler(literal_str: str, issues: List[str]):
"""Issue a warning if variant or other token is preceded by a compiler token. The warning is
only issued if it's actionable: either we know the config file it originates from, or we have
call site that's not internal to Spack."""
ignore = [spack.paths.lib_path, spack.paths.bin_path]
mark = spack.util.spack_yaml.get_mark_from_yaml_data(literal_str)
issue_str = ", ".join(issues)
error = f"{issue_str} in `{literal_str}`"
# warning from config file
if mark:
warnings.warn(f"{mark.name}:{mark.line + 1}: {error}")
return
# warning from hopefully package.py
for frame in reversed(traceback.extract_stack()):
if frame.lineno and not any(frame.filename.startswith(path) for path in ignore):
warnings.warn_explicit(
error,
category=spack.error.SpackAPIWarning,
filename=frame.filename,
lineno=frame.lineno,
)
return
class SpecParser:
"""Parse text into specs"""
@@ -281,31 +242,26 @@ def add_dependency(dep, **edge_properties):
raise SpecParsingError(str(e), self.ctx.current_token, self.literal_str) from e
initial_spec = initial_spec or spack.spec.Spec()
root_spec, parser_warnings = SpecNodeParser(self.ctx, self.literal_str).parse(initial_spec)
root_spec = SpecNodeParser(self.ctx, self.literal_str).parse(initial_spec)
while True:
if self.ctx.accept(SpecTokens.START_EDGE_PROPERTIES):
edge_properties = EdgeAttributeParser(self.ctx, self.literal_str).parse()
edge_properties.setdefault("depflag", 0)
edge_properties.setdefault("virtuals", ())
dependency, warnings = self._parse_node(root_spec)
parser_warnings.extend(warnings)
dependency = self._parse_node(root_spec)
add_dependency(dependency, **edge_properties)
elif self.ctx.accept(SpecTokens.DEPENDENCY):
dependency, warnings = self._parse_node(root_spec)
parser_warnings.extend(warnings)
dependency = self._parse_node(root_spec)
add_dependency(dependency, depflag=0, virtuals=())
else:
break
if parser_warnings:
_warn_about_variant_after_compiler(self.literal_str, parser_warnings)
return root_spec
def _parse_node(self, root_spec):
dependency, parser_warnings = SpecNodeParser(self.ctx, self.literal_str).parse()
dependency = SpecNodeParser(self.ctx, self.literal_str).parse()
if dependency is None:
msg = (
"the dependency sigil and any optional edge attributes must be followed by a "
@@ -314,7 +270,7 @@ def _parse_node(self, root_spec):
raise SpecParsingError(msg, self.ctx.current_token, self.literal_str)
if root_spec.concrete:
raise spack.spec.RedundantSpecError(root_spec, "^" + str(dependency))
return dependency, parser_warnings
return dependency
def all_specs(self) -> List["spack.spec.Spec"]:
"""Return all the specs that remain to be parsed"""
@@ -333,7 +289,7 @@ def __init__(self, ctx, literal_str):
def parse(
self, initial_spec: Optional["spack.spec.Spec"] = None
) -> Tuple["spack.spec.Spec", List[str]]:
) -> Optional["spack.spec.Spec"]:
"""Parse a single spec node from a stream of tokens
Args:
@@ -342,15 +298,12 @@ def parse(
Return
The object passed as argument
"""
parser_warnings: List[str] = []
last_compiler = None
if not self.ctx.next_token or self.ctx.expect(SpecTokens.DEPENDENCY):
return initial_spec
if initial_spec is None:
initial_spec = spack.spec.Spec()
if not self.ctx.next_token or self.ctx.expect(SpecTokens.DEPENDENCY):
return initial_spec, parser_warnings
# If we start with a package name we have a named spec; we cannot
# accept another package name afterwards in a node
if self.ctx.accept(SpecTokens.UNQUALIFIED_PACKAGE_NAME):
@@ -364,7 +317,7 @@ def parse(
initial_spec.namespace = namespace
elif self.ctx.accept(SpecTokens.FILENAME):
return FileParser(self.ctx).parse(initial_spec), parser_warnings
return FileParser(self.ctx).parse(initial_spec)
def raise_parsing_error(string: str, cause: Optional[Exception] = None):
"""Raise a spec parsing error with token context."""
@@ -377,40 +330,25 @@ def add_flag(name: str, value: str, propagate: bool):
except Exception as e:
raise_parsing_error(str(e), e)
def warn_if_after_compiler(token: str):
"""Register a warning for %compiler followed by +variant that will in the future apply
to the compiler instead of the current root."""
if last_compiler:
parser_warnings.append(f"`{token}` should go before `{last_compiler}`")
while True:
if (
self.ctx.accept(SpecTokens.COMPILER)
or self.ctx.accept(SpecTokens.COMPILER_AND_VERSION)
or self.ctx.accept(SpecTokens.COMPILER_WITH_VIRTUALS)
or self.ctx.accept(SpecTokens.COMPILER_AND_VERSION_WITH_VIRTUALS)
if self.ctx.accept(SpecTokens.COMPILER) or self.ctx.accept(
SpecTokens.COMPILER_AND_VERSION
):
current_token = self.ctx.current_token
if current_token.kind in (
SpecTokens.COMPILER_WITH_VIRTUALS,
SpecTokens.COMPILER_AND_VERSION_WITH_VIRTUALS,
):
m = SPLIT_COMPILER_TOKEN.match(current_token.value)
assert m, "SPLIT_COMPILER_TOKEN and COMPILER_* do not agree."
virtuals_str, compiler_str = m.groups()
virtuals = tuple(virtuals_str.strip("'\" ").split(","))
else:
virtuals = tuple()
compiler_str = current_token.value[1:]
build_dependency = spack.spec.Spec(self.ctx.current_token.value[1:])
name_conversion = {
"clang": "llvm",
"oneapi": "intel-oneapi-compilers",
"rocmcc": "llvm-amdgpu",
"intel": "intel-oneapi-compiler-classic",
"arm": "acfl",
}
build_dependency = spack.spec.Spec(compiler_str)
if build_dependency.name in LEGACY_COMPILER_TO_BUILTIN:
build_dependency.name = LEGACY_COMPILER_TO_BUILTIN[build_dependency.name]
if build_dependency.name in name_conversion:
build_dependency.name = name_conversion[build_dependency.name]
initial_spec._add_dependency(
build_dependency, depflag=spack.deptypes.BUILD, virtuals=virtuals, direct=True
build_dependency, depflag=spack.deptypes.BUILD, virtuals=(), direct=True
)
last_compiler = self.ctx.current_token.value
elif (
self.ctx.accept(SpecTokens.VERSION_HASH_PAIR)
@@ -425,17 +363,14 @@ def warn_if_after_compiler(token: str):
)
initial_spec.attach_git_version_lookup()
self.has_version = True
warn_if_after_compiler(self.ctx.current_token.value)
elif self.ctx.accept(SpecTokens.BOOL_VARIANT):
variant_value = self.ctx.current_token.value[0] == "+"
add_flag(self.ctx.current_token.value[1:].strip(), variant_value, propagate=False)
warn_if_after_compiler(self.ctx.current_token.value)
elif self.ctx.accept(SpecTokens.PROPAGATED_BOOL_VARIANT):
variant_value = self.ctx.current_token.value[0:2] == "++"
add_flag(self.ctx.current_token.value[2:].strip(), variant_value, propagate=True)
warn_if_after_compiler(self.ctx.current_token.value)
elif self.ctx.accept(SpecTokens.KEY_VALUE_PAIR):
match = SPLIT_KVP.match(self.ctx.current_token.value)
@@ -443,7 +378,6 @@ def warn_if_after_compiler(token: str):
name, _, value = match.groups()
add_flag(name, strip_quotes_and_unescape(value), propagate=False)
warn_if_after_compiler(self.ctx.current_token.value)
elif self.ctx.accept(SpecTokens.PROPAGATED_KEY_VALUE_PAIR):
match = SPLIT_KVP.match(self.ctx.current_token.value)
@@ -451,19 +385,17 @@ def warn_if_after_compiler(token: str):
name, _, value = match.groups()
add_flag(name, strip_quotes_and_unescape(value), propagate=True)
warn_if_after_compiler(self.ctx.current_token.value)
elif self.ctx.expect(SpecTokens.DAG_HASH):
if initial_spec.abstract_hash:
break
self.ctx.accept(SpecTokens.DAG_HASH)
initial_spec.abstract_hash = self.ctx.current_token.value[1:]
warn_if_after_compiler(self.ctx.current_token.value)
else:
break
return initial_spec, parser_warnings
return initial_spec
class FileParser:
@@ -553,18 +485,23 @@ def parse_one_or_raise(
text (str): text to be parsed
initial_spec: buffer into which to parse the spec. If None a new one will be created.
"""
parser = SpecParser(text)
stripped_text = text.strip()
parser = SpecParser(stripped_text)
result = parser.next_spec(initial_spec)
next_token = parser.ctx.next_token
last_token = parser.ctx.current_token
if next_token:
message = f"expected a single spec, but got more:\n{text}"
underline = f"\n{' ' * next_token.start}{'^' * len(next_token.value)}"
message += color.colorize(f"@*r{{{underline}}}")
if last_token is not None and last_token.end != len(stripped_text):
message = "a single spec was requested, but parsed more than one:"
message += f"\n{text}"
if last_token is not None:
underline = f"\n{' ' * last_token.end}{'^' * (len(text) - last_token.end)}"
message += color.colorize(f"@*r{{{underline}}}")
raise ValueError(message)
if result is None:
raise ValueError("expected a single spec, but got none")
message = "a single spec was requested, but none was parsed:"
message += f"\n{text}"
raise ValueError(message)
return result
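A hedged sketch of the error behavior above (spec strings hypothetical):

    parse_one_or_raise("zlib@1.3")         # -> a single Spec
    parse_one_or_raise("zlib@1.3 cmake")   # ValueError: parsed more than one spec
    parse_one_or_raise("   ")              # ValueError: none was parsed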

View File

@@ -129,6 +129,6 @@ def test_satisfy_strict_constraint_when_not_concrete(architecture_tuple, constra
)
def test_concretize_target_ranges(root_target_range, dep_target_range, result, monkeypatch):
spec = spack.concretize.concretize_one(
f"pkg-a foobar=bar target={root_target_range} %gcc@10 ^pkg-b target={dep_target_range}"
f"pkg-a %gcc@10 foobar=bar target={root_target_range} ^pkg-b target={dep_target_range}"
)
assert spec.target == spec["pkg-b"].target == result

View File

@@ -197,11 +197,7 @@ def dummy_prefix(tmpdir):
@pytest.mark.requires_executables(*required_executables)
@pytest.mark.maybeslow
@pytest.mark.usefixtures(
"default_config",
"cache_directory",
"install_dir_default_layout",
"temporary_mirror",
"mutable_mock_env_path",
"default_config", "cache_directory", "install_dir_default_layout", "temporary_mirror"
)
def test_default_rpaths_create_install_default_layout(temporary_mirror_dir):
"""
@@ -273,11 +269,7 @@ def test_default_rpaths_install_nondefault_layout(temporary_mirror_dir):
@pytest.mark.maybeslow
@pytest.mark.nomockstage
@pytest.mark.usefixtures(
"default_config",
"cache_directory",
"install_dir_default_layout",
"temporary_mirror",
"mutable_mock_env_path",
"default_config", "cache_directory", "install_dir_default_layout", "temporary_mirror"
)
def test_relative_rpaths_install_default_layout(temporary_mirror_dir):
"""
@@ -569,6 +561,7 @@ def test_FetchCacheError_only_accepts_lists_of_errors():
def test_FetchCacheError_pretty_printing_multiple():
e = bindist.FetchCacheError([RuntimeError("Oops!"), TypeError("Trouble!")])
str_e = str(e)
print("'" + str_e + "'")
assert "Multiple errors" in str_e
assert "Error 1: RuntimeError: Oops!" in str_e
assert "Error 2: TypeError: Trouble!" in str_e

View File

@@ -48,7 +48,7 @@ def build_environment(monkeypatch, wrapper_dir, tmp_path):
monkeypatch.setenv("SPACK_FC", realcc)
monkeypatch.setenv("SPACK_PREFIX", prefix)
monkeypatch.setenv("SPACK_COMPILER_WRAPPER_PATH", "test")
monkeypatch.setenv("SPACK_ENV_PATH", "test")
monkeypatch.setenv("SPACK_DEBUG_LOG_DIR", ".")
monkeypatch.setenv("SPACK_DEBUG_LOG_ID", "foo-hashabc")
monkeypatch.setenv("SPACK_SHORT_SPEC", "foo@1.2 arch=linux-rhel6-x86_64 /hashabc")
@@ -312,7 +312,7 @@ def test_spack_paths_before_module_paths(
mutable_config.set("packages", {"gcc": {"externals": [gcc_entry]}})
module_path = os.path.join("path", "to", "module")
monkeypatch.setenv("SPACK_COMPILER_WRAPPER_PATH", wrapper_dir)
monkeypatch.setenv("SPACK_ENV_PATH", wrapper_dir)
def _set_wrong_cc(x):
os.environ["PATH"] = module_path + os.pathsep + os.environ["PATH"]

View File

@@ -146,7 +146,7 @@ def wrapper_environment(working_env):
SPACK_CXX=real_cc,
SPACK_FC=real_cc,
SPACK_PREFIX=pkg_prefix,
SPACK_COMPILER_WRAPPER_PATH="test",
SPACK_ENV_PATH="test",
SPACK_DEBUG_LOG_DIR=".",
SPACK_DEBUG_LOG_ID="foo-hashabc",
SPACK_SHORT_SPEC="foo@1.2 arch=linux-rhel6-x86_64 /hashabc",
@@ -737,19 +737,19 @@ def test_expected_args_with_flags(wrapper_environment, wrapper_flags, wrapper_di
def test_system_path_cleanup(wrapper_environment, wrapper_dir):
"""Ensure SPACK_COMPILER_WRAPPER_PATH is removed from PATH, even with trailing /
"""Ensure SPACK_ENV_PATH is removed from PATH, even with trailing /
The compiler wrapper has to ensure that it is not called in a nested way,
as would happen when gcc's collect2 looks in PATH for ld.
To prevent nested calls, the compiler wrapper removes the elements
of SPACK_COMPILER_WRAPPER_PATH from PATH. Autotest's generated testsuite appends
of SPACK_ENV_PATH from PATH. Autotest's generated testsuite appends
a / to each element of PATH when adding AUTOTEST_PATH.
Thus, ensure that PATH cleanup works even with trailing /.
"""
cc = wrapper_dir / "cc"
system_path = "/bin:/usr/bin:/usr/local/bin"
with set_env(SPACK_COMPILER_WRAPPER_PATH=str(wrapper_dir), SPACK_CC="true"):
with set_env(SPACK_ENV_PATH=str(wrapper_dir), SPACK_CC="true"):
with set_env(PATH=str(wrapper_dir) + ":" + system_path):
check_env_var(cc, "PATH", system_path)
with set_env(PATH=str(wrapper_dir) + "/:" + system_path):

View File

@@ -18,7 +18,6 @@
import spack.repo as repo
import spack.util.git
from spack.test.conftest import MockHTTPResponse
from spack.version import Version
pytestmark = [pytest.mark.usefixtures("mock_packages")]
@@ -31,43 +30,6 @@ def repro_dir(tmp_path):
yield result
def test_get_added_versions_new_checksum(mock_git_package_changes):
repo_path, filename, commits = mock_git_package_changes
checksum_versions = {
"3f6576971397b379d4205ae5451ff5a68edf6c103b2f03c4188ed7075fbb5f04": Version("2.1.5"),
"a0293475e6a44a3f6c045229fe50f69dc0eebc62a42405a51f19d46a5541e77a": Version("2.1.4"),
"6c0853bb27738b811f2b4d4af095323c3d5ce36ceed6b50e5f773204fb8f7200": Version("2.0.7"),
"86993903527d9b12fc543335c19c1d33a93797b3d4d37648b5addae83679ecd8": Version("2.0.0"),
}
with fs.working_dir(str(repo_path)):
added_versions = ci.get_added_versions(
checksum_versions, filename, from_ref=commits[-1], to_ref=commits[-2]
)
assert len(added_versions) == 1
assert added_versions[0] == Version("2.1.5")
def test_get_added_versions_new_commit(mock_git_package_changes):
repo_path, filename, commits = mock_git_package_changes
checksum_versions = {
"74253725f884e2424a0dd8ae3f69896d5377f325": Version("2.1.6"),
"3f6576971397b379d4205ae5451ff5a68edf6c103b2f03c4188ed7075fbb5f04": Version("2.1.5"),
"a0293475e6a44a3f6c045229fe50f69dc0eebc62a42405a51f19d46a5541e77a": Version("2.1.4"),
"6c0853bb27738b811f2b4d4af095323c3d5ce36ceed6b50e5f773204fb8f7200": Version("2.0.7"),
"86993903527d9b12fc543335c19c1d33a93797b3d4d37648b5addae83679ecd8": Version("2.0.0"),
}
with fs.working_dir(str(repo_path)):
added_versions = ci.get_added_versions(
checksum_versions, filename, from_ref=commits[2], to_ref=commits[1]
)
assert len(added_versions) == 1
assert added_versions[0] == Version("2.1.6")
def test_pipeline_dag(config, tmpdir):
r"""Test creation, pruning, and traversal of PipelineDAG using the
following package dependency graph:

View File

@@ -214,7 +214,9 @@ def verify_mirror_contents():
if in_env_pkg in p:
found_pkg = True
assert found_pkg, f"Expected to find {in_env_pkg} in {dest_mirror_dir}"
if not found_pkg:
print("Expected to find {0} in {1}".format(in_env_pkg, dest_mirror_dir))
assert False
# Install a package and put it in the buildcache
s = spack.concretize.concretize_one(out_env_pkg)

View File

@@ -22,11 +22,7 @@
import spack.hash_types as ht
import spack.main
import spack.paths as spack_paths
import spack.repo
import spack.spec
import spack.stage
import spack.util.spack_yaml as syaml
import spack.version
from spack.ci import gitlab as gitlab_generator
from spack.ci.common import PipelineDag, PipelineOptions, SpackCIConfig
from spack.ci.generator_registry import generator
@@ -871,7 +867,7 @@ def test_push_to_build_cache(
logs_dir = scratch / "logs_dir"
logs_dir.mkdir()
ci.copy_stage_logs_to_artifacts(concrete_spec, str(logs_dir))
assert "spack-build-out.txt.gz" in os.listdir(logs_dir)
assert "spack-build-out.txt" in os.listdir(logs_dir)
dl_dir = scratch / "download_dir"
buildcache_cmd("download", "--spec-file", json_path, "--path", str(dl_dir))
@@ -1845,216 +1841,3 @@ def test_ci_generate_alternate_target(
assert pipeline_doc.startswith("unittestpipeline")
assert "externaltest" in pipeline_doc
@pytest.fixture
def fetch_versions_match(monkeypatch):
"""Fake successful checksums returned from downloaded tarballs."""
def get_checksums_for_versions(url_by_version, package_name, **kwargs):
pkg_cls = spack.repo.PATH.get_pkg_class(package_name)
return {v: pkg_cls.versions[v]["sha256"] for v in url_by_version}
monkeypatch.setattr(spack.stage, "get_checksums_for_versions", get_checksums_for_versions)
@pytest.fixture
def fetch_versions_invalid(monkeypatch):
"""Fake successful checksums returned from downloaded tarballs."""
def get_checksums_for_versions(url_by_version, package_name, **kwargs):
return {
v: "abcdef1234567890abcdef1234567890abcdef1234567890abcdef1234567890"
for v in url_by_version
}
monkeypatch.setattr(spack.stage, "get_checksums_for_versions", get_checksums_for_versions)
@pytest.mark.parametrize("versions", [["2.1.4"], ["2.1.4", "2.1.5"]])
def test_ci_validate_standard_versions_valid(capfd, mock_packages, fetch_versions_match, versions):
spec = spack.spec.Spec("diff-test")
pkg = spack.repo.PATH.get_pkg_class(spec.name)(spec)
version_list = [spack.version.Version(v) for v in versions]
assert spack.cmd.ci.validate_standard_versions(pkg, version_list)
out, err = capfd.readouterr()
for version in versions:
assert f"Validated diff-test@{version}" in out
@pytest.mark.parametrize("versions", [["2.1.4"], ["2.1.4", "2.1.5"]])
def test_ci_validate_standard_versions_invalid(
capfd, mock_packages, fetch_versions_invalid, versions
):
spec = spack.spec.Spec("diff-test")
pkg = spack.repo.PATH.get_pkg_class(spec.name)(spec)
version_list = [spack.version.Version(v) for v in versions]
assert spack.cmd.ci.validate_standard_versions(pkg, version_list) is False
out, err = capfd.readouterr()
for version in versions:
assert f"Invalid checksum found diff-test@{version}" in err
@pytest.mark.parametrize("versions", [[("1.0", -2)], [("1.1", -4), ("2.0", -6)]])
def test_ci_validate_git_versions_valid(
capfd, monkeypatch, mock_packages, mock_git_version_info, versions
):
spec = spack.spec.Spec("diff-test")
pkg = spack.repo.PATH.get_pkg_class(spec.name)(spec)
version_list = [spack.version.Version(v) for v, _ in versions]
repo_path, filename, commits = mock_git_version_info
version_commit_dict = {
spack.version.Version(v): {"tag": f"v{v}", "commit": commits[c]} for v, c in versions
}
pkg_class = spec.package_class
monkeypatch.setattr(pkg_class, "git", repo_path)
monkeypatch.setattr(pkg_class, "versions", version_commit_dict)
assert spack.cmd.ci.validate_git_versions(pkg, version_list)
out, err = capfd.readouterr()
for version in version_list:
assert f"Validated diff-test@{version}" in out
@pytest.mark.parametrize("versions", [[("1.0", -3)], [("1.1", -5), ("2.0", -5)]])
def test_ci_validate_git_versions_bad_tag(
capfd, monkeypatch, mock_packages, mock_git_version_info, versions
):
spec = spack.spec.Spec("diff-test")
pkg = spack.repo.PATH.get_pkg_class(spec.name)(spec)
version_list = [spack.version.Version(v) for v, _ in versions]
repo_path, filename, commits = mock_git_version_info
version_commit_dict = {
spack.version.Version(v): {"tag": f"v{v}", "commit": commits[c]} for v, c in versions
}
pkg_class = spec.package_class
monkeypatch.setattr(pkg_class, "git", repo_path)
monkeypatch.setattr(pkg_class, "versions", version_commit_dict)
assert spack.cmd.ci.validate_git_versions(pkg, version_list) is False
out, err = capfd.readouterr()
for version in version_list:
assert f"Mismatched tag <-> commit found for diff-test@{version}" in err
@pytest.mark.parametrize("versions", [[("1.0", -2)], [("1.1", -4), ("2.0", -6), ("3.0", -6)]])
def test_ci_validate_git_versions_invalid(
capfd, monkeypatch, mock_packages, mock_git_version_info, versions
):
spec = spack.spec.Spec("diff-test")
pkg = spack.repo.PATH.get_pkg_class(spec.name)(spec)
version_list = [spack.version.Version(v) for v, _ in versions]
repo_path, filename, commits = mock_git_version_info
version_commit_dict = {
spack.version.Version(v): {
"tag": f"v{v}",
"commit": "abcdefabcdefabcdefabcdefabcdefabcdefabc",
}
for v, c in versions
}
pkg_class = spec.package_class
monkeypatch.setattr(pkg_class, "git", repo_path)
monkeypatch.setattr(pkg_class, "versions", version_commit_dict)
assert spack.cmd.ci.validate_git_versions(pkg, version_list) is False
out, err = capfd.readouterr()
for version in version_list:
assert f"Invalid commit for diff-test@{version}" in err
@pytest.fixture
def verify_standard_versions_valid(monkeypatch):
def validate_standard_versions(pkg, versions):
for version in versions:
print(f"Validated {pkg.name}@{version}")
return True
monkeypatch.setattr(spack.cmd.ci, "validate_standard_versions", validate_standard_versions)
@pytest.fixture
def verify_git_versions_valid(monkeypatch):
def validate_git_versions(pkg, versions):
for version in versions:
print(f"Validated {pkg.name}@{version}")
return True
monkeypatch.setattr(spack.cmd.ci, "validate_git_versions", validate_git_versions)
@pytest.fixture
def verify_standard_versions_invalid(monkeypatch):
def validate_standard_versions(pkg, versions):
for version in versions:
print(f"Invalid checksum found {pkg.name}@{version}")
return False
monkeypatch.setattr(spack.cmd.ci, "validate_standard_versions", validate_standard_versions)
@pytest.fixture
def verify_git_versions_invalid(monkeypatch):
def validate_git_versions(pkg, versions):
for version in versions:
print(f"Invalid commit for {pkg.name}@{version}")
return False
monkeypatch.setattr(spack.cmd.ci, "validate_git_versions", validate_git_versions)
def test_ci_verify_versions_valid(
monkeypatch,
mock_packages,
mock_git_package_changes,
verify_standard_versions_valid,
verify_git_versions_valid,
):
repo_path, _, commits = mock_git_package_changes
monkeypatch.setattr(spack.paths, "prefix", repo_path)
out = ci_cmd("verify-versions", commits[-1], commits[-3])
assert "Validated diff-test@2.1.5" in out
assert "Validated diff-test@2.1.6" in out
def test_ci_verify_versions_standard_invalid(
monkeypatch,
mock_packages,
mock_git_package_changes,
verify_standard_versions_invalid,
verify_git_versions_invalid,
):
repo_path, _, commits = mock_git_package_changes
monkeypatch.setattr(spack.paths, "prefix", repo_path)
out = ci_cmd("verify-versions", commits[-1], commits[-3], fail_on_error=False)
assert "Invalid checksum found diff-test@2.1.5" in out
assert "Invalid commit for diff-test@2.1.6" in out
def test_ci_verify_versions_manual_package(monkeypatch, mock_packages, mock_git_package_changes):
repo_path, _, commits = mock_git_package_changes
monkeypatch.setattr(spack.paths, "prefix", repo_path)
pkg_class = spack.spec.Spec("diff-test").package_class
monkeypatch.setattr(pkg_class, "manual_download", True)
out = ci_cmd("verify-versions", commits[-1], commits[-2])
assert "Skipping manual download package: diff-test" in out

View File

@@ -5,7 +5,6 @@
import filecmp
import os
import shutil
import textwrap
import pytest
@@ -260,25 +259,15 @@ def test_update_completion_arg(shell, tmpdir, monkeypatch):
def test_updated_completion_scripts(shell, tmpdir):
"""Make sure our shell tab completion scripts remain up-to-date."""
width = 72
lines = textwrap.wrap(
msg = (
"It looks like Spack's command-line interface has been modified. "
"If differences are more than your global 'include:' scopes, please "
"update Spack's shell tab completion scripts by running:",
width,
"Please update Spack's shell tab completion scripts by running:\n\n"
" spack commands --update-completion\n\n"
"and adding the changed files to your pull request."
)
lines.append("\n spack commands --update-completion\n")
lines.extend(
textwrap.wrap(
"and adding the changed files (minus your global 'include:' scopes) "
"to your pull request.",
width,
)
)
msg = "\n".join(lines)
header = os.path.join(spack.paths.share_path, shell, f"spack-completion.{shell}")
script = f"spack-completion.{shell}"
script = "spack-completion.{0}".format(shell)
old_script = os.path.join(spack.paths.share_path, script)
new_script = str(tmpdir.join(script))

View File

@@ -213,7 +213,7 @@ def test_config_add_update_dict(mutable_empty_config):
def test_config_with_c_argument(mutable_empty_config):
# I don't know how to add a spack argument to a Spack Command, so we test this way
config_file = "config:install_tree:root:/path/to/config.yaml"
config_file = "config:install_root:root:/path/to/config.yaml"
parser = spack.main.make_argument_parser()
args = parser.parse_args(["-c", config_file])
assert config_file in args.config_vars
@@ -221,7 +221,7 @@ def test_config_with_c_argument(mutable_empty_config):
# Add the path to the config
config("add", args.config_vars[0], scope="command_line")
output = config("get", "config")
assert "config:\n install_tree:\n root: /path/to/config.yaml" in output
assert "config:\n install_root:\n root: /path/to/config.yaml" in output
def test_config_add_ordered_dict(mutable_empty_config):

View File

@@ -15,9 +15,6 @@
deprecate = SpackCommand("deprecate")
find = SpackCommand("find")
# Unit tests should not be affected by the user's managed environments
pytestmark = pytest.mark.usefixtures("mutable_mock_env_path")
def test_deprecate(mock_packages, mock_archive, mock_fetch, install_mockery):
install("--fake", "libelf@0.8.13")

View File

@@ -16,7 +16,6 @@
import spack.stage
import spack.util.git
import spack.util.path
from spack.error import SpackError
from spack.main import SpackCommand
add = SpackCommand("add")
@@ -160,7 +159,6 @@ def check_path(stage, dest):
# Create path to allow develop to modify env
fs.mkdirp(abspath)
develop("--no-clone", "-p", path, "mpich@1.0")
self.check_develop(e, spack.spec.Spec("mpich@=1.0"), path)
# Remove path to ensure develop with no args runs staging code
os.rmdir(abspath)
@@ -220,40 +218,6 @@ def test_develop_full_git_repo(
assert len(commits) > 1
def test_recursive(mutable_mock_env_path, install_mockery, mock_fetch):
env("create", "test")
with ev.read("test") as e:
add("indirect-mpich@1.0")
e.concretize()
specs = e.all_specs()
assert len(specs) > 1
develop("--recursive", "mpich")
expected_dev_specs = ["mpich", "direct-mpich", "indirect-mpich"]
for spec in expected_dev_specs:
assert spec in e.dev_specs
def test_develop_fails_with_multiple_concrete_versions(
mutable_mock_env_path, install_mockery, mock_fetch
):
env("create", "test")
with ev.read("test") as e:
add("indirect-mpich@1.0")
add("indirect-mpich@0.9")
e.unify = False
e.concretize()
with pytest.raises(SpackError) as develop_error:
develop("indirect-mpich", fail_on_error=True)
error_str = "has multiple concrete instances in the graph"
assert error_str in str(develop_error.value)
def test_concretize_dev_path_with_at_symbol_in_env(mutable_mock_env_path, tmpdir, mock_packages):
spec_like = "develop-test@develop"

View File

@@ -1067,17 +1067,13 @@ def test_init_from_yaml_relative_includes(tmp_path):
assert os.path.exists(os.path.join(e2.path, f))
# TODO: Should we be supporting relative path rewrites when creating new env from existing?
# TODO: If so, then this should confirm that the absolute include paths in the new env exist.
def test_init_from_yaml_relative_includes_outside_env(tmp_path):
"""Ensure relative includes to files outside the environment fail."""
files = ["../outside_env/repos.yaml"]
files = ["../outside_env_not_copied/repos.yaml"]
manifest = f"""
spack:
specs: []
include:
- path: {files[0]}
include: {files}
"""
# subdir to ensure parent of environment dir is not shared
@@ -1090,7 +1086,7 @@ def test_init_from_yaml_relative_includes_outside_env(tmp_path):
for f in files:
fs.touchp(e1_path / f)
with pytest.raises(ValueError, match="does not exist"):
with pytest.raises(spack.config.ConfigFileError, match="Detected 1 missing include"):
_ = _env_create("test2", init_file=e1_manifest)
@@ -1190,14 +1186,14 @@ def test_env_with_config(environment_from_manifest):
def test_with_config_bad_include_create(environment_from_manifest):
"""Confirm missing required include raises expected exception."""
err = "does not exist"
with pytest.raises(ValueError, match=err):
"""Confirm missing include paths raise expected exception and error."""
with pytest.raises(spack.config.ConfigFileError, match="2 missing include path"):
environment_from_manifest(
"""
spack:
include:
- /no/such/directory
- no/such/file.yaml
"""
)
@@ -1207,25 +1203,34 @@ def test_with_config_bad_include_activate(environment_from_manifest, tmpdir):
include1 = env_root / "include1.yaml"
include1.touch()
abs_include_path = os.path.abspath(tmpdir.join("subdir").ensure("include2.yaml"))
spack_yaml = env_root / ev.manifest_name
spack_yaml.write_text(
"""
f"""
spack:
include:
- ./include1.yaml
- {abs_include_path}
"""
)
with ev.Environment(env_root) as e:
e.concretize()
# We've created an environment with included config file (which does
# exist). Now we remove it and check that we get a sensible error.
# we've created an environment with some included config files (which do
# in fact exist): now we remove them and check that we get a sensible
# error message
os.remove(abs_include_path)
os.remove(include1)
with pytest.raises(ValueError, match="does not exist"):
with pytest.raises(spack.config.ConfigFileError) as exc:
ev.activate(ev.Environment(env_root))
err = exc.value.message
assert "missing include" in err
assert abs_include_path in err
assert "include1.yaml" in err
assert ev.active_environment() is None
@@ -1333,10 +1338,8 @@ def test_config_change_existing(mutable_mock_env_path, tmp_path, mock_packages,
included file scope.
"""
env_path = tmp_path / "test_config"
fs.mkdirp(env_path)
included_file = "included-packages.yaml"
included_path = env_path / included_file
included_path = tmp_path / included_file
with open(included_path, "w", encoding="utf-8") as f:
f.write(
"""\
@@ -1352,7 +1355,7 @@ def test_config_change_existing(mutable_mock_env_path, tmp_path, mock_packages,
"""
)
spack_yaml = env_path / ev.manifest_name
spack_yaml = tmp_path / ev.manifest_name
spack_yaml.write_text(
f"""\
spack:
@@ -1366,8 +1369,7 @@ def test_config_change_existing(mutable_mock_env_path, tmp_path, mock_packages,
"""
)
mutable_config.set("config:misc_cache", str(tmp_path / "cache"))
e = ev.Environment(env_path)
e = ev.Environment(tmp_path)
with e:
# List of requirements, flip a variant
config("change", "packages:mpich:require:~debug")
@@ -1457,6 +1459,19 @@ def test_env_with_included_config_file_url(tmpdir, mutable_empty_config, package
assert cfg["mpileaks"]["version"] == ["2.2"]
def test_env_with_included_config_missing_file(tmpdir, mutable_empty_config):
"""Test inclusion of a missing configuration file raises FetchError
noting missing file."""
spack_yaml = tmpdir.join("spack.yaml")
missing_file = tmpdir.join("packages.yaml")
with spack_yaml.open("w") as f:
f.write("spack:\n include:\n - {0}\n".format(missing_file.strpath))
with pytest.raises(spack.error.ConfigError, match="missing include path"):
ev.Environment(tmpdir.strpath)
def test_env_with_included_config_scope(mutable_mock_env_path, packages_file):
"""Test inclusion of a package file from the environment's configuration
stage directory. This test is intended to represent a case where a remote
@@ -1551,7 +1566,7 @@ def test_env_with_included_config_precedence(tmp_path):
def test_env_with_included_configs_precedence(tmp_path):
"""Test precedence of multiple included configuration files."""
"""Test precendence of multiple included configuration files."""
file1 = "high-config.yaml"
file2 = "low-config.yaml"
@@ -1779,7 +1794,7 @@ def test_roots_display_with_variants():
with ev.read("test"):
out = find(output=str)
assert "boost+shared" in out
assert "boost +shared" in out
def test_uninstall_keeps_in_env(mock_stage, mock_fetch, install_mockery):
@@ -3065,26 +3080,14 @@ def test_stack_view_activate_from_default(
def test_envvar_set_in_activate(tmp_path, mock_packages, install_mockery):
spack_yaml = tmp_path / "spack.yaml"
env_vars_yaml = tmp_path / "env_vars.yaml"
env_vars_yaml.write_text(
"""
env_vars:
set:
CONFIG_ENVAR_SET_IN_ENV_LOAD: "True"
"""
)
spack_yaml.write_text(
"""
spack:
include:
- env_vars.yaml
specs:
- cmake%gcc
env_vars:
set:
SPACK_ENVAR_SET_IN_ENV_LOAD: "True"
ENVAR_SET_IN_ENV_LOAD: "True"
"""
)
@@ -3095,16 +3098,12 @@ def test_envvar_set_in_activate(tmp_path, mock_packages, install_mockery):
test_env = ev.read("test")
output = env("activate", "--sh", "test")
assert "SPACK_ENVAR_SET_IN_ENV_LOAD=True" in output
assert "CONFIG_ENVAR_SET_IN_ENV_LOAD=True" in output
assert "ENVAR_SET_IN_ENV_LOAD=True" in output
with test_env:
with spack.util.environment.set_env(
SPACK_ENVAR_SET_IN_ENV_LOAD="True", CONFIG_ENVAR_SET_IN_ENV_LOAD="True"
):
with spack.util.environment.set_env(ENVAR_SET_IN_ENV_LOAD="True"):
output = env("deactivate", "--sh")
assert "unset SPACK_ENVAR_SET_IN_ENV_LOAD" in output
assert "unset CONFIG_ENVAR_SET_IN_ENV_LOAD" in output
assert "unset ENVAR_SET_IN_ENV_LOAD" in output
def test_stack_view_no_activate_without_default(
@@ -4278,31 +4277,21 @@ def test_unify_when_possible_works_around_conflicts():
assert len([x for x in e.all_specs() if x.satisfies("mpich")]) == 1
# Using mock_include_cache to ensure the "remote" file is cached in a temporary
# location rather than polluting the user cache.
def test_env_include_packages_url(
tmpdir, mutable_empty_config, mock_fetch_url_text, mock_curl_configs, mock_include_cache
tmpdir, mutable_empty_config, mock_spider_configs, mock_curl_configs
):
"""Test inclusion of a (GitHub) URL."""
develop_url = "https://github.com/fake/fake/blob/develop/"
default_packages = develop_url + "etc/fake/defaults/packages.yaml"
sha256 = "8b69d9c6e983dfb8bac2ddc3910a86265cffdd9c85f905c716d426ec5b0d9847"
spack_yaml = tmpdir.join("spack.yaml")
with spack_yaml.open("w") as f:
f.write(
f"""\
spack:
include:
- path: {default_packages}
sha256: {sha256}
"""
)
f.write("spack:\n include:\n - {0}\n".format(default_packages))
assert os.path.isfile(spack_yaml.strpath)
with spack.config.override("config:url_fetch_method", "curl"):
env = ev.Environment(tmpdir.strpath)
ev.activate(env)
# Make sure a setting from test/data/config/packages.yaml is present
cfg = spack.config.get("packages")
assert "mpich" in cfg["all"]["providers"]["mpi"]
@@ -4371,7 +4360,7 @@ def test_env_view_disabled(tmp_path, mutable_mock_env_path):
@pytest.mark.parametrize("first", ["false", "true", "custom"])
def test_env_include_mixed_views(tmp_path, mutable_config, mutable_mock_env_path, first):
def test_env_include_mixed_views(tmp_path, mutable_mock_env_path, mutable_config, first):
"""Ensure including path and boolean views in different combinations result
in the creation of only the first view if it is not disabled."""
false_yaml = tmp_path / "false-view.yaml"

View File

@@ -712,11 +712,10 @@ def test_install_deps_then_package(tmpdir, mock_fetch, install_mockery):
assert os.path.exists(root.prefix)
# Unit tests should not be affected by the user's managed environments
@pytest.mark.not_on_windows("Environment views not supported on windows. Revisit after #34701")
@pytest.mark.regression("12002")
def test_install_only_dependencies_in_env(
tmpdir, mutable_mock_env_path, mock_fetch, install_mockery
tmpdir, mock_fetch, install_mockery, mutable_mock_env_path
):
env("create", "test")
@@ -730,10 +729,9 @@ def test_install_only_dependencies_in_env(
assert not os.path.exists(root.prefix)
# Unit tests should not be affected by the user's managed environments
@pytest.mark.regression("12002")
def test_install_only_dependencies_of_all_in_env(
tmpdir, mutable_mock_env_path, mock_fetch, install_mockery
tmpdir, mock_fetch, install_mockery, mutable_mock_env_path
):
env("create", "--without-view", "test")
@@ -753,8 +751,7 @@ def test_install_only_dependencies_of_all_in_env(
assert os.path.exists(dep.prefix)
# Unit tests should not be affected by the user's managed environments
def test_install_no_add_in_env(tmpdir, mutable_mock_env_path, mock_fetch, install_mockery):
def test_install_no_add_in_env(tmpdir, mock_fetch, install_mockery, mutable_mock_env_path):
# To test behavior of --add option, we create the following environment:
#
# mpileaks
@@ -895,6 +892,7 @@ def test_cdash_configure_warning(tmpdir, mock_fetch, install_mockery, capfd):
specfile = "./spec.json"
with open(specfile, "w", encoding="utf-8") as f:
f.write(spec.to_json())
print(spec.to_json())
install("--log-file=cdash_reports", "--log-format=cdash", specfile)
# Verify Configure.xml exists with expected contents.
report_dir = tmpdir.join("cdash_reports")
@@ -929,10 +927,9 @@ def test_install_fails_no_args_suggests_env_activation(tmpdir):
assert "using the `spack.yaml` in this directory" in output
# Unit tests should not be affected by the user's managed environments
@pytest.mark.not_on_windows("Environment views not supported on windows. Revisit after #34701")
def test_install_env_with_tests_all(
tmpdir, mutable_mock_env_path, mock_packages, mock_fetch, install_mockery
tmpdir, mock_packages, mock_fetch, install_mockery, mutable_mock_env_path
):
env("create", "test")
with ev.read("test"):
@@ -942,10 +939,9 @@ def test_install_env_with_tests_all(
assert os.path.exists(test_dep.prefix)
# Unit tests should not be affected by the user's managed environments
@pytest.mark.not_on_windows("Environment views not supported on windows. Revisit after #34701")
def test_install_env_with_tests_root(
tmpdir, mutable_mock_env_path, mock_packages, mock_fetch, install_mockery
tmpdir, mock_packages, mock_fetch, install_mockery, mutable_mock_env_path
):
env("create", "test")
with ev.read("test"):
@@ -955,10 +951,9 @@ def test_install_env_with_tests_root(
assert not os.path.exists(test_dep.prefix)
# Unit tests should not be affected by the user's managed environments
@pytest.mark.not_on_windows("Environment views not supported on windows. Revisit after #34701")
def test_install_empty_env(
tmpdir, mutable_mock_env_path, mock_packages, mock_fetch, install_mockery
tmpdir, mock_packages, mock_fetch, install_mockery, mutable_mock_env_path
):
env_name = "empty"
env("create", env_name)
@@ -994,17 +989,9 @@ def test_installation_fail_tests(install_mockery, mock_fetch, name, method):
assert "See test log for details" in output
# Unit tests should not be affected by the user's managed environments
@pytest.mark.not_on_windows("Buildcache not supported on windows")
def test_install_use_buildcache(
capsys,
mutable_mock_env_path,
mock_packages,
mock_fetch,
mock_archive,
mock_binary_index,
tmpdir,
install_mockery,
capsys, mock_packages, mock_fetch, mock_archive, mock_binary_index, tmpdir, install_mockery
):
"""
Make sure installing with use-buildcache behaves correctly.

View File

@@ -12,9 +12,6 @@
install = SpackCommand("install")
uninstall = SpackCommand("uninstall")
# Unit tests should not be affected by the user's managed environments
pytestmark = pytest.mark.usefixtures("mutable_mock_env_path")
@pytest.mark.db
def test_mark_mode_required(mutable_database):

View File

@@ -38,9 +38,8 @@ def test_regression_8083(tmpdir, capfd, mock_packages, mock_fetch, config):
assert "as it is an external spec" in output
# Unit tests should not be affected by the user's managed environments
@pytest.mark.regression("12345")
def test_mirror_from_env(mutable_mock_env_path, tmp_path, mock_packages, mock_fetch):
def test_mirror_from_env(tmp_path, mock_packages, mock_fetch, mutable_mock_env_path):
mirror_dir = str(tmp_path / "mirror")
env_name = "test"
@@ -343,16 +342,8 @@ def test_mirror_name_collision(mutable_config):
mirror("add", "first", "1")
# Unit tests should not be affected by the user's managed environments
def test_mirror_destroy(
mutable_mock_env_path,
install_mockery,
mock_packages,
mock_fetch,
mock_archive,
mutable_config,
monkeypatch,
tmpdir,
install_mockery, mock_packages, mock_fetch, mock_archive, mutable_config, monkeypatch, tmpdir
):
# Create a temp mirror directory for buildcache usage
mirror_dir = tmpdir.join("mirror_dir")

View File

@@ -5,13 +5,9 @@
import pytest
import spack.config
import spack.environment as ev
import spack.main
from spack.main import SpackCommand
repo = spack.main.SpackCommand("repo")
env = SpackCommand("env")
def test_help_option():
@@ -37,33 +33,3 @@ def test_create_add_list_remove(mutable_config, tmpdir):
repo("remove", "--scope=site", str(tmpdir))
output = repo("list", "--scope=site", output=str)
assert "mockrepo" not in output
def test_env_repo_path_vars_substitution(
tmpdir, install_mockery, mutable_mock_env_path, monkeypatch
):
"""Test Spack correctly substitues repo paths with environment variables when creating an
environment from a manifest file."""
monkeypatch.setenv("CUSTOM_REPO_PATH", ".")
# setup environment from spack.yaml
envdir = tmpdir.mkdir("env")
with envdir.as_cwd():
with open("spack.yaml", "w", encoding="utf-8") as f:
f.write(
"""\
spack:
specs: []
repos:
- $CUSTOM_REPO_PATH
"""
)
# creating env from manifest file
env("create", "test", "./spack.yaml")
# check that repo path was correctly substituted with the environment variable
current_dir = os.getcwd()
with ev.read("test") as newenv:
repos_specs = spack.config.get("repos", default={}, scope=newenv.scope_name)
assert current_dir in repos_specs
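The removed test above relied on Spack expanding environment variables in repos entries when an environment is created from a manifest. The core mechanic is ordinary variable expansion; a toy illustration with the standard library (Spack's own path substitution handles more tokens than this):

import os

os.environ["CUSTOM_REPO_PATH"] = "."
assert os.path.expandvars("$CUSTOM_REPO_PATH") == "."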

View File

@@ -13,10 +13,7 @@
import spack.store
from spack.main import SpackCommand, SpackCommandError
# Unit tests should not be affected by the user's managed environments
pytestmark = pytest.mark.usefixtures(
"mutable_mock_env_path", "mutable_config", "mutable_mock_repo"
)
pytestmark = pytest.mark.usefixtures("mutable_config", "mutable_mock_repo")
spec = SpackCommand("spec")

View File

@@ -409,108 +409,3 @@ def test_case_sensitive_imports(tmp_path: pathlib.Path):
def test_pkg_imports():
assert spack.cmd.style._module_part(spack.paths.prefix, "spack.pkg.builtin.boost") is None
assert spack.cmd.style._module_part(spack.paths.prefix, "spack.pkg") is None
def test_spec_strings(tmp_path):
(tmp_path / "example.py").write_text(
"""\
def func(x):
print("dont fix %s me" % x, 3)
return x.satisfies("+foo %gcc +bar") and x.satisfies("%gcc +baz")
"""
)
(tmp_path / "example.json").write_text(
"""\
{
"spec": [
"+foo %gcc +bar~nope ^dep %clang +yup @3.2 target=x86_64 /abcdef ^another %gcc ",
"%gcc +baz"
],
"%gcc x=y": 2
}
"""
)
(tmp_path / "example.yaml").write_text(
"""\
spec:
- "+foo %gcc +bar"
- "%gcc +baz"
- "this is fine %clang"
"%gcc x=y": 2
"""
)
issues = set()
def collect_issues(path: str, line: int, col: int, old: str, new: str):
issues.add((path, line, col, old, new))
# check for issues with custom handler
spack.cmd.style._check_spec_strings(
[
str(tmp_path / "nonexistent.py"),
str(tmp_path / "example.py"),
str(tmp_path / "example.json"),
str(tmp_path / "example.yaml"),
],
handler=collect_issues,
)
assert issues == {
(
str(tmp_path / "example.json"),
3,
9,
"+foo %gcc +bar~nope ^dep %clang +yup @3.2 target=x86_64 /abcdef ^another %gcc ",
"+foo +bar~nope %gcc ^dep +yup @3.2 target=x86_64 /abcdef %clang ^another %gcc ",
),
(str(tmp_path / "example.json"), 4, 9, "%gcc +baz", "+baz %gcc"),
(str(tmp_path / "example.json"), 6, 5, "%gcc x=y", "x=y %gcc"),
(str(tmp_path / "example.py"), 3, 23, "+foo %gcc +bar", "+foo +bar %gcc"),
(str(tmp_path / "example.py"), 3, 57, "%gcc +baz", "+baz %gcc"),
(str(tmp_path / "example.yaml"), 2, 5, "+foo %gcc +bar", "+foo +bar %gcc"),
(str(tmp_path / "example.yaml"), 3, 5, "%gcc +baz", "+baz %gcc"),
(str(tmp_path / "example.yaml"), 5, 1, "%gcc x=y", "x=y %gcc"),
}
# fix the issues in the files
spack.cmd.style._check_spec_strings(
[
str(tmp_path / "nonexistent.py"),
str(tmp_path / "example.py"),
str(tmp_path / "example.json"),
str(tmp_path / "example.yaml"),
],
handler=spack.cmd.style._spec_str_fix_handler,
)
assert (
(tmp_path / "example.json").read_text()
== """\
{
"spec": [
"+foo +bar~nope %gcc ^dep +yup @3.2 target=x86_64 /abcdef %clang ^another %gcc ",
"+baz %gcc"
],
"x=y %gcc": 2
}
"""
)
assert (
(tmp_path / "example.py").read_text()
== """\
def func(x):
print("dont fix %s me" % x, 3)
return x.satisfies("+foo +bar %gcc") and x.satisfies("+baz %gcc")
"""
)
assert (
(tmp_path / "example.yaml").read_text()
== """\
spec:
- "+foo +bar %gcc"
- "+baz %gcc"
- "this is fine %clang"
"x=y %gcc": 2
"""
)
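The fixture data above encodes the rewrite rule the style check applies: within each ^-separated node of a spec string, %compiler tokens are moved after variants, flags, and other attributes. A toy reordering function capturing just that rule, not Spack's actual implementation (which also tracks quoting and source offsets):

def reorder_compiler_tokens(spec_str: str) -> str:
    # Within each ^-separated node, move %compiler tokens to the end.
    fixed = []
    for node in spec_str.split("^"):
        tokens = node.split()
        compilers = [t for t in tokens if t.startswith("%")]
        others = [t for t in tokens if not t.startswith("%")]
        fixed.append(" ".join(others + compilers))
    return " ^".join(fixed)

assert reorder_compiler_tokens("+foo %gcc +bar") == "+foo +bar %gcc"
assert reorder_compiler_tokens("%gcc +baz") == "+baz %gcc"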

View File

@@ -16,9 +16,6 @@
uninstall = SpackCommand("uninstall")
install = SpackCommand("install")
# Unit tests should not be affected by the user's managed environments
pytestmark = pytest.mark.usefixtures("mutable_mock_env_path")
class MockArgs:
def __init__(self, packages, all=False, force=False, dependents=False):
@@ -220,7 +217,9 @@ class TestUninstallFromEnv:
find = SpackCommand("find")
@pytest.fixture(scope="function")
def environment_setup(self, mock_packages, mutable_database, install_mockery):
def environment_setup(
self, mutable_mock_env_path, mock_packages, mutable_database, install_mockery
):
TestUninstallFromEnv.env("create", "e1")
e1 = spack.environment.read("e1")
with e1:

View File

@@ -50,7 +50,7 @@ def test_list_long(capsys):
def test_list_long_with_pytest_arg(capsys):
with capsys.disabled():
output = spack_test("--list-long", cmd_test_py)
print(output)
assert "unit_test.py::\n" in output
assert "test_list" in output
assert "test_list_with_pytest_arg" in output

View File

@@ -49,6 +49,7 @@ def test_single_file_verify_cmd(tmpdir):
sjson.dump({filepath: data}, f)
results = verify("manifest", "-f", filepath, fail_on_error=False)
print(results)
assert not results
os.utime(filepath, (0, 0))

View File

@@ -59,6 +59,13 @@ def mock_fetch_remote_versions(*args, **kwargs):
assert v.strip(" \n\t") == "99.99.99\n 3.2.1"
@pytest.mark.maybeslow
def test_no_versions():
"""Test a package for which no remote versions are available."""
versions("converge")
@pytest.mark.maybeslow
def test_no_unchecksummed_versions():
"""Test a package for which no unchecksummed versions are available."""

View File

@@ -82,7 +82,7 @@ def test_external_nodes_do_not_have_runtimes(runtime_repo, mutable_config, tmp_p
# Same as before, but tests that we can reuse from a more generic target
pytest.param(
"pkg-a%gcc@9.4.0",
"pkg-b target=x86_64 %gcc@10.2.1",
"pkg-b%gcc@10.2.1 target=x86_64",
{"pkg-a": "gcc-runtime@9.4.0", "pkg-b": "gcc-runtime@9.4.0"},
1,
marks=pytest.mark.skipif(
@@ -91,7 +91,7 @@ def test_external_nodes_do_not_have_runtimes(runtime_repo, mutable_config, tmp_p
),
pytest.param(
"pkg-a%gcc@10.2.1",
"pkg-b target=x86_64 %gcc@9.4.0",
"pkg-b%gcc@9.4.0 target=x86_64",
{
"pkg-a": "gcc-runtime@10.2.1 target=core2",
"pkg-b": "gcc-runtime@9.4.0 target=x86_64",

View File

@@ -116,7 +116,7 @@ def binary_compatibility(monkeypatch, request):
"mpileaks ^mpi@1.2:2",
# conflict not triggered
"conflict",
"conflict~foo%clang",
"conflict%clang~foo",
"conflict-parent%gcc",
]
)
@@ -378,8 +378,8 @@ def test_different_compilers_get_different_flags(
t = archspec.cpu.host().family
client = spack.concretize.concretize_one(
Spec(
f"cmake-client platform=test os=redhat6 target={t} %gcc@11.1.0"
f" ^cmake platform=test os=redhat6 target={t} %clang@12.2.0"
f"cmake-client %gcc@11.1.0 platform=test os=redhat6 target={t}"
f" ^cmake %clang@12.2.0 platform=test os=redhat6 target={t}"
)
)
cmake = client["cmake"]
@@ -394,7 +394,7 @@ def test_spec_flags_maintain_order(self, mutable_config, gcc11_with_flags):
for successive concretizations.
"""
mutable_config.set("packages", {"gcc": {"externals": [gcc11_with_flags]}})
spec_str = "libelf os=redhat6 %gcc@11.1.0"
spec_str = "libelf %gcc@11.1.0 os=redhat6"
for _ in range(3):
s = spack.concretize.concretize_one(spec_str)
assert all(
@@ -439,8 +439,7 @@ def test_mixing_compilers_only_affects_subdag(self):
where the compiler is not forced.
"""
spec = spack.concretize.concretize_one("dt-diamond%clang ^dt-diamond-bottom%gcc")
# This is intended to traverse the "root" unification set, and check compilers
# on the nodes in the set
print(spec.tree())
for x in spec.traverse(deptype=("link", "run")):
if "c" not in x or not x.name.startswith("dt-diamond"):
continue
@@ -466,13 +465,13 @@ def test_architecture_deep_inheritance(self, mock_targets, compiler_factory):
"""
cnl_compiler = compiler_factory(spec="gcc@4.5.0 os=CNL target=nocona")
with spack.config.override("packages", {"gcc": {"externals": [cnl_compiler]}}):
spec_str = "mpileaks os=CNL target=nocona %gcc@4.5.0 ^dyninst os=CNL ^callpath os=CNL"
spec_str = "mpileaks %gcc@4.5.0 os=CNL target=nocona ^dyninst os=CNL ^callpath os=CNL"
spec = spack.concretize.concretize_one(spec_str)
for s in spec.traverse(root=False, deptype=("link", "run")):
assert s.architecture.target == spec.architecture.target
def test_compiler_flags_from_user_are_grouped(self):
spec = Spec('pkg-a cflags="-O -foo-flag foo-val" platform=test %gcc')
spec = Spec('pkg-a%gcc cflags="-O -foo-flag foo-val" platform=test')
spec = spack.concretize.concretize_one(spec)
cflags = spec.compiler_flags["cflags"]
assert any(x == "-foo-flag foo-val" for x in cflags)
@@ -816,7 +815,7 @@ def test_external_and_virtual(self, mutable_config):
)
def test_compiler_child(self):
s = Spec("mpileaks target=x86_64 %clang ^dyninst%gcc")
s = Spec("mpileaks%clang target=x86_64 ^dyninst%gcc")
s = spack.concretize.concretize_one(s)
assert s["mpileaks"].satisfies("%clang")
assert s["dyninst"].satisfies("%gcc")
@@ -957,9 +956,9 @@ def test_noversion_pkg(self, spec):
"gcc@4.4.7 languages=c,c++,fortran",
"core2",
),
("mpileaks target=x86_64: %gcc@=4.8", "gcc@4.8 languages=c,c++,fortran", "haswell"),
("mpileaks%gcc@=4.8 target=x86_64:", "gcc@4.8 languages=c,c++,fortran", "haswell"),
(
"mpileaks target=x86_64: %gcc@=5.3.0",
"mpileaks%gcc@=5.3.0 target=x86_64:",
"gcc@5.3.0 languages=c,c++,fortran",
"broadwell",
),
@@ -1233,7 +1232,7 @@ def test_compiler_match_is_preferred_to_newer_version(self, compiler_factory):
with spack.config.override(
"packages", {"gcc": {"externals": [compiler_factory(spec="gcc@10.1.0 os=redhat6")]}}
):
spec_str = "simple-inheritance+openblas os=redhat6 %gcc@10.1.0"
spec_str = "simple-inheritance+openblas %gcc@10.1.0 os=redhat6"
s = spack.concretize.concretize_one(spec_str)
assert "openblas@0.2.15" in s
assert s["openblas"].satisfies("%gcc@10.1.0")
@@ -1773,6 +1772,10 @@ def test_best_effort_coconcretize(self, specs, checks):
for s in result.specs:
concrete_specs.update(s.traverse())
for x in concrete_specs:
print(x.tree(hashes=True))
print()
for matching_spec, expected_count in checks.items():
matches = [x for x in concrete_specs if x.satisfies(matching_spec)]
assert len(matches) == expected_count
@@ -1921,6 +1924,7 @@ def test_git_ref_version_is_equivalent_to_specified_version(self, git_ref):
s = Spec("develop-branch-version@git.%s=develop" % git_ref)
c = spack.concretize.concretize_one(s)
assert git_ref in str(c)
print(str(c))
assert s.satisfies("@develop")
assert s.satisfies("@0.1:")
@@ -2527,7 +2531,7 @@ def test_cannot_reuse_host_incompatible_libc(self):
setup = spack.solver.asp.SpackSolverSetup()
result, _, _ = solver.driver.solve(setup, [Spec("pkg-b")], reuse=[fst, snd])
assert len(result.specs) == 1
assert result.specs[0] == snd
assert result.specs[0] == snd, result.specs[0].tree()
@pytest.mark.regression("45321")
@pytest.mark.parametrize(
@@ -2864,7 +2868,7 @@ def test_virtuals_provided_together_but_only_one_required_in_dag(self):
def test_reusable_externals_match(mock_packages, tmpdir):
spec = Spec("mpich@4.1~debug build_system=generic arch=linux-ubuntu23.04-zen2 %gcc@13.1.0")
spec = Spec("mpich@4.1%gcc@13.1.0~debug build_system=generic arch=linux-ubuntu23.04-zen2")
spec.external_path = tmpdir.strpath
spec.external_modules = ["mpich/4.1"]
spec._mark_concrete()
@@ -2882,7 +2886,7 @@ def test_reusable_externals_match(mock_packages, tmpdir):
def test_reusable_externals_match_virtual(mock_packages, tmpdir):
spec = Spec("mpich@4.1~debug build_system=generic arch=linux-ubuntu23.04-zen2 %gcc@13.1.0")
spec = Spec("mpich@4.1%gcc@13.1.0~debug build_system=generic arch=linux-ubuntu23.04-zen2")
spec.external_path = tmpdir.strpath
spec.external_modules = ["mpich/4.1"]
spec._mark_concrete()
@@ -2900,7 +2904,7 @@ def test_reusable_externals_match_virtual(mock_packages, tmpdir):
def test_reusable_externals_different_prefix(mock_packages, tmpdir):
spec = Spec("mpich@4.1~debug build_system=generic arch=linux-ubuntu23.04-zen2 %gcc@13.1.0")
spec = Spec("mpich@4.1%gcc@13.1.0~debug build_system=generic arch=linux-ubuntu23.04-zen2")
spec.external_path = "/other/path"
spec.external_modules = ["mpich/4.1"]
spec._mark_concrete()
@@ -2919,7 +2923,7 @@ def test_reusable_externals_different_prefix(mock_packages, tmpdir):
@pytest.mark.parametrize("modules", [None, ["mpich/4.1", "libfabric/1.19"]])
def test_reusable_externals_different_modules(mock_packages, tmpdir, modules):
spec = Spec("mpich@4.1~debug build_system=generic arch=linux-ubuntu23.04-zen2 %gcc@13.1.0")
spec = Spec("mpich@4.1%gcc@13.1.0~debug build_system=generic arch=linux-ubuntu23.04-zen2")
spec.external_path = tmpdir.strpath
spec.external_modules = modules
spec._mark_concrete()
@@ -2937,7 +2941,7 @@ def test_reusable_externals_different_modules(mock_packages, tmpdir, modules):
def test_reusable_externals_different_spec(mock_packages, tmpdir):
spec = Spec("mpich@4.1~debug build_system=generic arch=linux-ubuntu23.04-zen2 %gcc@13.1.0")
spec = Spec("mpich@4.1%gcc@13.1.0~debug build_system=generic arch=linux-ubuntu23.04-zen2")
spec.external_path = tmpdir.strpath
spec._mark_concrete()
assert not spack.solver.asp._is_reusable(
@@ -3210,8 +3214,9 @@ def test_duplicate_compiler_in_externals(mutable_config, mock_packages):
def test_compiler_can_depend_on_themselves_to_build(config, mock_packages):
"""Tests that a compiler can depend on "itself" to bootstrap."""
"""Tests that a compiler can depend on itself to bootstrap."""
s = Spec("gcc@14 %gcc@9.4.0").concretized()
print(s.tree())
assert s.satisfies("gcc@14")
assert s.satisfies("^gcc-runtime@9.4.0")
@@ -3240,6 +3245,7 @@ def test_compiler_attribute_is_tolerated_in_externals(mutable_config, mock_packa
def test_compiler_can_be_built_with_other_compilers(config, mock_packages):
"""Tests that a compiler can be built also with another compiler."""
s = Spec("llvm@18 +clang %gcc").concretized()
print(s.tree())
assert s.satisfies("llvm@18")
c_compiler = s.dependencies(virtuals=("c",))
@@ -3310,29 +3316,3 @@ def test_compiler_match_for_externals_with_versions(
s = spack.concretize.concretize_one(spec_str)
libelf = s["libelf"]
assert libelf.external and libelf.external_path == str(tmp_path / expected)
def test_specifying_compilers_with_virtuals_syntax(default_mock_concretization):
"""Tests that we can pin compilers to nodes using the %[virtuals=...] syntax"""
# clang will be used for both C and C++, since they are provided together
mpich = default_mock_concretization("mpich %[virtuals=fortran] gcc %clang")
assert mpich["fortran"].satisfies("gcc")
assert mpich["c"].satisfies("llvm")
assert mpich["cxx"].satisfies("llvm")
# gcc is the default compiler
mpileaks = default_mock_concretization(
"mpileaks ^libdwarf %gcc ^mpich %[virtuals=fortran] gcc %clang"
)
assert mpileaks["c"].satisfies("gcc")
libdwarf = mpileaks["libdwarf"]
assert libdwarf["c"].satisfies("gcc")
assert libdwarf["c"].satisfies("gcc")
mpich = mpileaks["mpi"]
assert mpich["fortran"].satisfies("gcc")
assert mpich["c"].satisfies("llvm")
assert mpich["cxx"].satisfies("llvm")

View File

@@ -29,7 +29,8 @@
]
variant_error_messages = [
"'fftw' requires conflicting variant values '~mpi' and '+mpi'",
"'fftw' required multiple values for single-valued variant 'mpi'",
" Requested '~mpi' and '+mpi'",
" required because quantum-espresso depends on fftw+mpi when +invino",
" required because quantum-espresso+invino ^fftw~mpi requested explicitly",
" required because quantum-espresso+invino ^fftw~mpi requested explicitly",
@@ -59,4 +60,5 @@ def test_error_messages(error_messages, config_set, spec, mock_packages, mutable
_ = spack.concretize.concretize_one(spec)
for em in error_messages:
print(e.value)
assert em in str(e.value)

View File

@@ -93,7 +93,7 @@ def test_mix_spec_and_compiler_cfg(concretize_scope, test_repo):
conf_str = _compiler_cfg_one_entry_with_cflags("-Wall")
update_concretize_scope(conf_str, "packages")
s1 = spack.concretize.concretize_one('y cflags="-O2" %gcc@12.100.100')
s1 = spack.concretize.concretize_one('y %gcc@12.100.100 cflags="-O2"')
assert s1.satisfies('cflags="-Wall -O2"')
@@ -194,7 +194,7 @@ def test_propagate_and_compiler_cfg(concretize_scope, test_repo):
conf_str = _compiler_cfg_one_entry_with_cflags("-f2")
update_concretize_scope(conf_str, "packages")
root_spec = spack.concretize.concretize_one("v cflags=='-f1' %gcc@12.100.100")
root_spec = spack.concretize.concretize_one("v %gcc@12.100.100 cflags=='-f1'")
assert root_spec["y"].satisfies("cflags='-f1 -f2'")
@@ -237,7 +237,7 @@ def test_dev_mix_flags(tmp_path, concretize_scope, mutable_mock_env_path, test_r
env_content = f"""\
spack:
specs:
- y cflags=='-fsanitize=address' %gcc@12.100.100
- y %gcc@12.100.100 cflags=='-fsanitize=address'
develop:
y:
spec: y cflags=='-fsanitize=address'

View File

@@ -359,10 +359,10 @@ def test_one_package_multiple_oneof_groups(concretize_scope, test_repo):
update_packages_config(conf_str)
s1 = spack.concretize.concretize_one("y@2.5")
assert s1.satisfies("~shared%clang")
assert s1.satisfies("%clang~shared")
s2 = spack.concretize.concretize_one("y@2.4")
assert s2.satisfies("+shared%gcc")
assert s2.satisfies("%gcc+shared")
@pytest.mark.regression("34241")
@@ -377,14 +377,11 @@ def test_require_cflags(concretize_scope, mock_packages):
"""
update_packages_config(conf_str)
mpich2 = spack.concretize.concretize_one("mpich2")
assert mpich2.satisfies("cflags=-g")
spec_mpich2 = spack.concretize.concretize_one("mpich2")
assert spec_mpich2.satisfies("cflags=-g")
mpileaks = spack.concretize.concretize_one("mpileaks")
assert mpileaks["mpi"].satisfies("mpich cflags=-O1")
mpi = spack.concretize.concretize_one("mpi")
assert mpi.satisfies("mpich cflags=-O1")
spec_mpi = spack.concretize.concretize_one("mpi")
assert spec_mpi.satisfies("mpich cflags=-O1")
def test_requirements_for_package_that_is_not_needed(concretize_scope, test_repo):
@@ -504,7 +501,7 @@ def test_default_requirements_with_all(spec_str, requirement_str, concretize_sco
"requirements,expectations",
[
(("%gcc", "%clang"), ("%gcc", "%clang")),
(("~shared%gcc", "@1.0"), ("~shared%gcc", "@1.0+shared")),
(("%gcc~shared", "@1.0"), ("%gcc~shared", "@1.0+shared")),
],
)
def test_default_and_package_specific_requirements(
@@ -758,7 +755,7 @@ def test_skip_requirement_when_default_requirement_condition_cannot_be_met(
update_packages_config(packages_yaml)
s = spack.concretize.concretize_one("mpileaks")
assert s.satisfies("+shared %clang")
assert s.satisfies("%clang+shared")
# Sanity checks that 'callpath' doesn't have the shared variant, but that didn't
# cause failures during concretization.
assert "shared" not in s["callpath"].variants
@@ -771,8 +768,7 @@ def test_requires_directive(mock_packages, config):
s = spack.concretize.concretize_one("requires_clang_or_gcc %gcc")
assert s.satisfies("%gcc")
s = spack.concretize.concretize_one("requires_clang_or_gcc %clang")
# Test both the real package (llvm) and its alias (clang)
assert s.satisfies("%llvm") and s.satisfies("%clang")
assert s.satisfies("%llvm")
# This package can only be compiled with clang
s = spack.concretize.concretize_one("requires_clang")
@@ -948,8 +944,8 @@ def test_requiring_package_on_multiple_virtuals(concretize_scope, mock_packages)
- "%clang"
""",
"multivalue-variant",
["%[virtuals=c] llvm"],
["%gcc"],
["llvm"],
["gcc"],
),
(
"""
@@ -959,8 +955,8 @@ def test_requiring_package_on_multiple_virtuals(concretize_scope, mock_packages)
- "%clang"
""",
"multivalue-variant %gcc",
["%[virtuals=c] gcc"],
["%llvm"],
["gcc"],
["llvm"],
),
# Test parsing objects instead of strings
(
@@ -971,58 +967,12 @@ def test_requiring_package_on_multiple_virtuals(concretize_scope, mock_packages)
- spec: "%clang"
""",
"multivalue-variant",
["%[virtuals=c] llvm"],
["%gcc"],
),
# Test using preferences on virtuals
(
"""
packages:
all:
providers:
mpi: [mpich]
mpi:
prefer:
- zmpi
""",
"mpileaks",
["^[virtuals=mpi] zmpi"],
["^[virtuals=mpi] mpich"],
),
(
"""
packages:
all:
providers:
mpi: [mpich]
mpi:
prefer:
- zmpi
""",
"mpileaks ^[virtuals=mpi] mpich",
["^[virtuals=mpi] mpich"],
["^[virtuals=mpi] zmpi"],
),
# Tests that strong preferences can be overridden by requirements
(
"""
packages:
all:
providers:
mpi: [zmpi]
mpi:
require:
- mpich
prefer:
- zmpi
""",
"mpileaks",
["^[virtuals=mpi] mpich"],
["^[virtuals=mpi] zmpi"],
["llvm"],
["gcc"],
),
],
)
def test_strong_preferences_packages_yaml(
def test_compiler_strong_preferences_packages_yaml(
packages_yaml, spec_str, expected, not_expected, concretize_scope, mock_packages
):
"""Tests that strong preferences are taken into account for compilers."""
@@ -1030,10 +980,10 @@ def test_strong_preferences_packages_yaml(
s = spack.concretize.concretize_one(spec_str)
for constraint in expected:
assert s.satisfies(constraint)
assert s.dependencies(deptype="build", name=constraint)
for constraint in not_expected:
assert not s.satisfies(constraint)
assert not s.dependencies(deptype="build", name=constraint)
@pytest.mark.parametrize(
@@ -1070,16 +1020,6 @@ def test_strong_preferences_packages_yaml(
""",
"multivalue-variant@=2.3 %clang",
),
# Test using conflict on virtual
(
"""
packages:
mpi:
conflict:
- mpich
""",
"mpileaks ^[virtuals=mpi] mpich",
),
],
)
def test_conflict_packages_yaml(packages_yaml, spec_str, concretize_scope, mock_packages):
@@ -1173,69 +1113,3 @@ def test_strong_preferences_higher_priority_than_reuse(concretize_scope, mock_pa
)
ascent = result.specs[0]
assert ascent["adios2"].dag_hash() == reused_spec.dag_hash(), ascent
@pytest.mark.parametrize(
"packages_yaml,err_match",
[
(
"""
packages:
mpi:
require:
- "+bzip2"
""",
"expected a named spec",
),
(
"""
packages:
mpi:
require:
- one_of: ["+bzip2", openmpi]
""",
"expected a named spec",
),
(
"""
packages:
mpi:
require:
- "^mpich"
""",
"Did you mean",
),
],
)
def test_anonymous_spec_cannot_be_used_in_virtual_requirements(
packages_yaml, err_match, concretize_scope, mock_packages
):
"""Tests that using anonymous specs in requirements for virtual packages raises an
appropriate error message.
"""
update_packages_config(packages_yaml)
with pytest.raises(spack.error.SpackError, match=err_match):
spack.concretize.concretize_one("mpileaks")
def test_virtual_requirement_respects_any_of(concretize_scope, mock_packages):
"""Tests that "any of" requirements can be used with virtuals"""
conf_str = """\
packages:
mpi:
require:
- any_of: ["mpich2", "mpich"]
"""
update_packages_config(conf_str)
s = spack.concretize.concretize_one("mpileaks")
assert s.satisfies("^[virtuals=mpi] mpich2")
s = spack.concretize.concretize_one("mpileaks ^mpich2")
assert s.satisfies("^[virtuals=mpi] mpich2")
s = spack.concretize.concretize_one("mpileaks ^mpich")
assert s.satisfies("^[virtuals=mpi] mpich")
with pytest.raises(spack.error.SpackError):
spack.concretize.concretize_one("mpileaks ^[virtuals=mpi] zmpi")
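Together, the removed tests pin down the contract for virtual requirements: entries must be named specs, and any_of groups are honored while still respecting explicit user choices. A minimal sketch of a valid versus an invalid packages.yaml fragment, mirroring the strings above:

valid = """\
packages:
  mpi:
    require:
    - any_of: ["mpich2", "mpich"]
"""

# Anonymous specs are rejected with "expected a named spec".
invalid = """\
packages:
  mpi:
    require:
    - "+bzip2"
"""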

View File

@@ -11,7 +11,8 @@
import pytest
from llnl.util.filesystem import join_path, touch
import llnl.util.tty as tty
from llnl.util.filesystem import join_path, touch, touchp
import spack
import spack.config
@@ -25,7 +26,6 @@
import spack.schema.compilers
import spack.schema.config
import spack.schema.env
import spack.schema.include
import spack.schema.mirrors
import spack.schema.repos
import spack.spec
@@ -51,9 +51,22 @@
config_override_list = {"config": {"build_stage:": ["pathd", "pathe"]}}
config_merge_dict = {"config": {"aliases": {"ls": "find", "dev": "develop"}}}
config_merge_dict = {"config": {"info": {"a": 3, "b": 4}}}
config_override_dict = {"config": {"aliases:": {"be": "build-env", "deps": "dependencies"}}}
config_override_dict = {"config": {"info:": {"a": 7, "c": 9}}}
@pytest.fixture()
def write_config_file(tmpdir):
"""Returns a function that writes a config file."""
def _write(config, data, scope):
config_yaml = tmpdir.join(scope, config + ".yaml")
config_yaml.ensure()
with config_yaml.open("w") as f:
syaml.dump_config(data, f)
return _write
@pytest.fixture()
@@ -1023,16 +1036,6 @@ def test_bad_config_yaml(tmpdir):
)
def test_bad_include_yaml(tmpdir):
with pytest.raises(spack.config.ConfigFormatError, match="is not of type"):
check_schema(
spack.schema.include.schema,
"""\
include: $HOME/include.yaml
""",
)
def test_bad_mirrors_yaml(tmpdir):
with pytest.raises(spack.config.ConfigFormatError):
check_schema(
@@ -1097,9 +1100,9 @@ def test_internal_config_section_override(mock_low_high_config, write_config_fil
def test_internal_config_dict_override(mock_low_high_config, write_config_file):
write_config_file("config", config_merge_dict, "low")
wanted_dict = config_override_dict["config"]["aliases:"]
wanted_dict = config_override_dict["config"]["info:"]
mock_low_high_config.push_scope(spack.config.InternalConfigScope("high", config_override_dict))
assert mock_low_high_config.get("config:aliases") == wanted_dict
assert mock_low_high_config.get("config:info") == wanted_dict
def test_internal_config_list_override(mock_low_high_config, write_config_file):
@@ -1131,10 +1134,10 @@ def test_set_list_override(mock_low_high_config, write_config_file):
def test_set_dict_override(mock_low_high_config, write_config_file):
write_config_file("config", config_merge_dict, "low")
wanted_dict = config_override_dict["config"]["aliases:"]
with spack.config.override("config:aliases:", wanted_dict):
assert wanted_dict == mock_low_high_config.get("config:aliases")
assert config_merge_dict["config"]["aliases"] == mock_low_high_config.get("config:aliases")
wanted_dict = config_override_dict["config"]["info:"]
with spack.config.override("config:info:", wanted_dict):
assert wanted_dict == mock_low_high_config.get("config:info")
assert config_merge_dict["config"]["info"] == mock_low_high_config.get("config:info")
def test_set_bad_path(config):
@@ -1260,6 +1263,134 @@ def test_user_cache_path_is_default_when_env_var_is_empty(working_env):
assert os.path.expanduser("~%s.spack" % os.sep) == spack.paths._get_user_cache_path()
github_url = "https://github.com/fake/fake/{0}/develop"
gitlab_url = "https://gitlab.fake.io/user/repo/-/blob/config/defaults"
@pytest.mark.parametrize(
"url,isfile",
[
(github_url.format("tree"), False),
("{0}/README.md".format(github_url.format("blob")), True),
("{0}/etc/fake/defaults/packages.yaml".format(github_url.format("blob")), True),
(gitlab_url, False),
(None, False),
],
)
def test_config_collect_urls(mutable_empty_config, mock_spider_configs, url, isfile):
with spack.config.override("config:url_fetch_method", "curl"):
urls = spack.config.collect_urls(url)
if url:
if isfile:
expected = 1 if url.endswith(".yaml") else 0
assert len(urls) == expected
else:
# Expect multiple configuration files for a "directory"
assert len(urls) > 1
else:
assert not urls
@pytest.mark.parametrize(
"url,isfile,fail",
[
(github_url.format("tree"), False, False),
(gitlab_url, False, False),
("{0}/README.md".format(github_url.format("blob")), True, True),
("{0}/packages.yaml".format(gitlab_url), True, False),
(None, False, True),
],
)
def test_config_fetch_remote_configs(
tmpdir, mutable_empty_config, mock_collect_urls, mock_curl_configs, url, isfile, fail
):
def _has_content(filename):
# The first element of all configuration files for this test happens to
# be the basename of the file, so this check leverages that feature. If
# that changes, then this check will need to change accordingly.
element = "{0}:".format(os.path.splitext(os.path.basename(filename))[0])
with open(filename, "r", encoding="utf-8") as fd:
for line in fd:
if element in line:
return True
tty.debug("Expected {0} in '{1}'".format(element, filename))
return False
dest_dir = join_path(tmpdir.strpath, "defaults")
if fail:
msg = "Cannot retrieve configuration"
with spack.config.override("config:url_fetch_method", "curl"):
with pytest.raises(spack.config.ConfigFileError, match=msg):
spack.config.fetch_remote_configs(url, dest_dir)
else:
with spack.config.override("config:url_fetch_method", "curl"):
path = spack.config.fetch_remote_configs(url, dest_dir)
assert os.path.exists(path)
if isfile:
# Ensure correct file is "fetched"
assert os.path.basename(path) == os.path.basename(url)
# Ensure contents of the file has expected config element
assert _has_content(path)
else:
for filename in os.listdir(path):
assert _has_content(join_path(path, filename))
@pytest.fixture(scope="function")
def mock_collect_urls(mock_config_data, monkeypatch):
"""Mock the collection of URLs to avoid mocking spider."""
_, config_files = mock_config_data
def _collect(base_url):
if not base_url:
return []
ext = os.path.splitext(base_url)[1]
if ext:
return [base_url] if ext == ".yaml" else []
return [join_path(base_url, f) for f in config_files]
monkeypatch.setattr(spack.config, "collect_urls", _collect)
yield
@pytest.mark.parametrize(
"url,skip",
[(github_url.format("tree"), True), ("{0}/compilers.yaml".format(gitlab_url), True)],
)
def test_config_fetch_remote_configs_skip(
tmpdir, mutable_empty_config, mock_collect_urls, mock_curl_configs, url, skip
):
"""Ensure skip fetching remote config file if it already exists when
required and not skipping if replacing it."""
def check_contents(filename, expected):
with open(filename, "r", encoding="utf-8") as fd:
lines = fd.readlines()
if expected:
assert lines[0] == "compilers:"
else:
assert not lines
dest_dir = join_path(tmpdir.strpath, "defaults")
filename = "compilers.yaml"
# Create a stage directory with an empty configuration file
path = join_path(dest_dir, filename)
touchp(path)
# Do NOT replace the existing cached configuration file if skipping
expected = None if skip else "compilers:"
with spack.config.override("config:url_fetch_method", "curl"):
path = spack.config.fetch_remote_configs(url, dest_dir, skip)
result_filename = path if path.endswith(".yaml") else join_path(path, filename)
check_contents(result_filename, expected)
def test_config_file_dir_failure(tmpdir, mutable_empty_config):
with pytest.raises(spack.config.ConfigFileError, match="not a file"):
spack.config.read_config_file(tmpdir.strpath)
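The mock_collect_urls fixture above fakes the URL-collection step with a simple rule: a .yaml extension means a single config file, any other extension is ignored, and an extension-less URL is treated as a directory of config files. A standalone toy version of that rule (the example.test URLs are hypothetical):

import os

def collect(base_url, known_files):
    if not base_url:
        return []
    ext = os.path.splitext(base_url)[1]
    if ext:
        return [base_url] if ext == ".yaml" else []
    return ["{0}/{1}".format(base_url, f) for f in known_files]

assert collect(None, []) == []
assert collect("https://example.test/packages.yaml", []) == ["https://example.test/packages.yaml"]
assert collect("https://example.test/defaults", ["packages.yaml"]) == [
    "https://example.test/defaults/packages.yaml"
]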

View File

@@ -30,15 +30,7 @@
import llnl.util.lang
import llnl.util.lock
import llnl.util.tty as tty
from llnl.util.filesystem import (
copy,
copy_tree,
join_path,
mkdirp,
remove_linked_tree,
touchp,
working_dir,
)
from llnl.util.filesystem import copy_tree, mkdirp, remove_linked_tree, touchp, working_dir
import spack.binary_distribution
import spack.bootstrap.core
@@ -73,7 +65,6 @@
from spack.installer import PackageInstaller
from spack.main import SpackCommand
from spack.util.pattern import Bunch
from spack.util.remote_file_cache import raw_github_gitlab_url
from ..enums import ConfigScopePriority
@@ -161,7 +152,7 @@ def mock_git_version_info(git, tmpdir, override_git_repos_cache_path):
version tags on multiple branches, and version order is not equal to time
order or topological order.
"""
repo_path = str(tmpdir.mkdir("git_version_info_repo"))
repo_path = str(tmpdir.mkdir("git_repo"))
filename = "file.txt"
def commit(message):
@@ -242,84 +233,6 @@ def latest_commit():
yield repo_path, filename, commits
@pytest.fixture
def mock_git_package_changes(git, tmpdir, override_git_repos_cache_path):
"""Create a mock git repo with known structure of package edits
The structure of commits in this repo is as follows::
o diff-test: modification to make manual download package
|
o diff-test: add v1.2 (from a git ref)
|
o diff-test: add v1.1 (from source tarball)
|
o diff-test: new package (testing multiple added versions)
The repo consists of a single package.py file for DiffTest.
Important attributes of the repo for test coverage are: multiple package
versions are added with some coming from a tarball and some from git refs.
"""
repo_path = str(tmpdir.mkdir("git_package_changes_repo"))
filename = "var/spack/repos/builtin/packages/diff-test/package.py"
def commit(message):
global commit_counter
git(
"commit",
"--no-gpg-sign",
"--date",
"2020-01-%02d 12:0:00 +0300" % commit_counter,
"-am",
message,
)
commit_counter += 1
with working_dir(repo_path):
git("init")
git("config", "user.name", "Spack")
git("config", "user.email", "spack@spack.io")
commits = []
def latest_commit():
return git("rev-list", "-n1", "HEAD", output=str, error=str).strip()
os.makedirs(os.path.dirname(filename))
# add pkg-a as a new package to the repository
shutil.copy2(f"{spack.paths.test_path}/data/conftest/diff-test/package-0.txt", filename)
git("add", filename)
commit("diff-test: new package")
commits.append(latest_commit())
# add v2.1.5 to pkg-a
shutil.copy2(f"{spack.paths.test_path}/data/conftest/diff-test/package-1.txt", filename)
git("add", filename)
commit("diff-test: add v2.1.5")
commits.append(latest_commit())
# add v2.1.6 to pkg-a
shutil.copy2(f"{spack.paths.test_path}/data/conftest/diff-test/package-2.txt", filename)
git("add", filename)
commit("diff-test: add v2.1.6")
commits.append(latest_commit())
# convert pkg-a to a manual download package
shutil.copy2(f"{spack.paths.test_path}/data/conftest/diff-test/package-3.txt", filename)
git("add", filename)
commit("diff-test: modification to make manual download package")
commits.append(latest_commit())
# The commits are ordered with the last commit first in the list
commits = list(reversed(commits))
# Return the git directory to install, the filename used, and the commits
yield repo_path, filename, commits
@pytest.fixture(autouse=True)
def clear_recorded_monkeypatches():
yield
@@ -1732,7 +1645,7 @@ def installation_dir_with_headers(tmpdir_factory):
##########
@pytest.fixture(params=["conflict+foo%clang", "conflict-parent@0.9^conflict~foo"])
@pytest.fixture(params=["conflict%clang+foo", "conflict-parent@0.9^conflict~foo"])
def conflict_spec(request):
"""Specs which violate constraints specified with the "conflicts"
directive in the "conflict" package.
@@ -1947,21 +1860,35 @@ def __call__(self, *args, **kwargs):
@pytest.fixture(scope="function")
def mock_fetch_url_text(tmpdir, mock_config_data, monkeypatch):
"""Mock spack.util.web.fetch_url_text."""
def mock_spider_configs(mock_config_data, monkeypatch):
"""
Mock retrieval of configuration file URLs from the web by grabbing
them from the test data configuration directory.
"""
config_data_dir, config_files = mock_config_data
stage_dir, config_files = mock_config_data
def _spider(*args, **kwargs):
root_urls = args[0]
if not root_urls:
return [], set()
def _fetch_text_file(url, dest_dir):
raw_url = raw_github_gitlab_url(url)
mkdirp(dest_dir)
basename = os.path.basename(raw_url)
src = join_path(stage_dir, basename)
dest = join_path(dest_dir, basename)
copy(src, dest)
return dest
root_urls = [root_urls] if isinstance(root_urls, str) else root_urls
monkeypatch.setattr(spack.util.web, "fetch_url_text", _fetch_text_file)
# Any URL with an extension will be treated like a file; otherwise,
# it is considered a directory/folder and we'll grab all available
# files.
urls = []
for url in root_urls:
if os.path.splitext(url)[1]:
urls.append(url)
else:
urls.extend([os.path.join(url, f) for f in config_files])
return [], set(urls)
monkeypatch.setattr(spack.util.web, "spider", _spider)
yield
@pytest.fixture(scope="function")
@@ -2178,6 +2105,7 @@ def _c_compiler_always_exists():
@pytest.fixture(scope="session")
def mock_test_cache(tmp_path_factory):
cache_dir = tmp_path_factory.mktemp("cache")
print(cache_dir)
return spack.util.file_cache.FileCache(cache_dir)
@@ -2229,30 +2157,6 @@ def mock_runtimes(config, mock_packages):
return mock_packages.packages_with_tags("runtime")
@pytest.fixture()
def write_config_file(tmpdir):
"""Returns a function that writes a config file."""
def _write(config, data, scope):
config_yaml = tmpdir.join(scope, config + ".yaml")
config_yaml.ensure()
with config_yaml.open("w") as f:
syaml.dump_config(data, f)
return config_yaml
return _write
def _include_cache_root():
return join_path(str(tempfile.mkdtemp()), "user_cache", "includes")
@pytest.fixture()
def mock_include_cache(monkeypatch):
"""Override the include cache directory so tests don't pollute user cache."""
monkeypatch.setattr(spack.config, "_include_cache_location", _include_cache_root)
@pytest.fixture()
def wrapper_dir(install_mockery):
"""Installs the compiler wrapper and returns the prefix where the script is installed."""

View File

@@ -1,19 +0,0 @@
# Copyright Spack Project Developers. See COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import *
class DiffTest(AutotoolsPackage):
"""zlib replacement with optimizations for next generation systems."""
homepage = "https://github.com/zlib-ng/zlib-ng"
url = "https://github.com/zlib-ng/zlib-ng/archive/2.0.0.tar.gz"
git = "https://github.com/zlib-ng/zlib-ng.git"
license("Zlib")
version("2.1.4", sha256="a0293475e6a44a3f6c045229fe50f69dc0eebc62a42405a51f19d46a5541e77a")
version("2.0.0", sha256="86993903527d9b12fc543335c19c1d33a93797b3d4d37648b5addae83679ecd8")
version("2.0.7", sha256="6c0853bb27738b811f2b4d4af095323c3d5ce36ceed6b50e5f773204fb8f7200")

View File

@@ -1,20 +0,0 @@
# Copyright Spack Project Developers. See COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import *
class DiffTest(AutotoolsPackage):
"""zlib replacement with optimizations for next generation systems."""
homepage = "https://github.com/zlib-ng/zlib-ng"
url = "https://github.com/zlib-ng/zlib-ng/archive/2.0.0.tar.gz"
git = "https://github.com/zlib-ng/zlib-ng.git"
license("Zlib")
version("2.1.5", sha256="3f6576971397b379d4205ae5451ff5a68edf6c103b2f03c4188ed7075fbb5f04")
version("2.1.4", sha256="a0293475e6a44a3f6c045229fe50f69dc0eebc62a42405a51f19d46a5541e77a")
version("2.0.7", sha256="6c0853bb27738b811f2b4d4af095323c3d5ce36ceed6b50e5f773204fb8f7200")
version("2.0.0", sha256="86993903527d9b12fc543335c19c1d33a93797b3d4d37648b5addae83679ecd8")

View File

@@ -1,21 +0,0 @@
# Copyright Spack Project Developers. See COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import *
class DiffTest(AutotoolsPackage):
"""zlib replacement with optimizations for next generation systems."""
homepage = "https://github.com/zlib-ng/zlib-ng"
url = "https://github.com/zlib-ng/zlib-ng/archive/2.0.0.tar.gz"
git = "https://github.com/zlib-ng/zlib-ng.git"
license("Zlib")
version("2.1.6", tag="2.1.6", commit="74253725f884e2424a0dd8ae3f69896d5377f325")
version("2.1.5", sha256="3f6576971397b379d4205ae5451ff5a68edf6c103b2f03c4188ed7075fbb5f04")
version("2.1.4", sha256="a0293475e6a44a3f6c045229fe50f69dc0eebc62a42405a51f19d46a5541e77a")
version("2.0.7", sha256="6c0853bb27738b811f2b4d4af095323c3d5ce36ceed6b50e5f773204fb8f7200")
version("2.0.0", sha256="86993903527d9b12fc543335c19c1d33a93797b3d4d37648b5addae83679ecd8")

View File

@@ -1,23 +0,0 @@
# Copyright Spack Project Developers. See COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import *
class DiffTest(AutotoolsPackage):
"""zlib replacement with optimizations for next generation systems."""
homepage = "https://github.com/zlib-ng/zlib-ng"
url = "https://github.com/zlib-ng/zlib-ng/archive/2.0.0.tar.gz"
git = "https://github.com/zlib-ng/zlib-ng.git"
license("Zlib")
manual_download = True
version("2.1.6", tag="2.1.6", commit="74253725f884e2424a0dd8ae3f69896d5377f325")
version("2.1.5", sha256="3f6576971397b379d4205ae5451ff5a68edf6c103b2f03c4188ed7075fbb5f04")
version("2.1.4", sha256="a0293475e6a44a3f6c045229fe50f69dc0eebc62a42405a51f19d46a5541e77a")
version("2.0.7", sha256="6c0853bb27738b811f2b4d4af095323c3d5ce36ceed6b50e5f773204fb8f7200")
version("2.0.0", sha256="86993903527d9b12fc543335c19c1d33a93797b3d4d37648b5addae83679ecd8")

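The four deleted revisions above show the mock package growing one release at a time; the last two also contrast the two version() directive forms used here. Side by side, as copied from the revisions above:

    version("2.1.5", sha256="3f6576971397b379d4205ae5451ff5a68edf6c103b2f03c4188ed7075fbb5f04")  # archive fetched from `url`, verified by checksum
    version("2.1.6", tag="2.1.6", commit="74253725f884e2424a0dd8ae3f69896d5377f325")  # cloned from `git`, pinned to a tag plus commit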
View File

@@ -67,20 +67,6 @@ def test_extends_spec(config, mock_packages):
    assert extender.package.extends(extendee)


@pytest.mark.regression("48024")
def test_conditionally_extends_transitive_dep(config, mock_packages):
    spec = spack.spec.Spec("conditionally-extends-transitive-dep").concretized()
    assert not spec.package.extendee_spec


@pytest.mark.regression("48025")
def test_conditionally_extends_direct_dep(config, mock_packages):
    spec = spack.spec.Spec("conditionally-extends-direct-dep").concretized()
    assert not spec.package.extendee_spec


@pytest.mark.regression("34368")
def test_error_on_anonymous_dependency(config, mock_packages):
    pkg = spack.repo.PATH.get_pkg_class("pkg-a")

View File

@@ -12,7 +12,6 @@
import spack.config
import spack.environment as ev
import spack.platforms
import spack.solver.asp
import spack.spec
from spack.environment.environment import (
@@ -924,53 +923,6 @@ def test_environment_from_name_or_dir(mock_packages, mutable_mock_env_path, tmp_
        _ = ev.environment_from_name_or_dir("fake-env")


def test_env_include_configs(mutable_mock_env_path, mock_packages):
    """check config and package values using new include schema"""
    env_path = mutable_mock_env_path
    env_path.mkdir()

    this_os = spack.platforms.host().default_os
    config_root = env_path / this_os
    config_root.mkdir()
    config_path = str(config_root / "config.yaml")
    with open(config_path, "w", encoding="utf-8") as f:
        f.write(
            """\
config:
  verify_ssl: False
"""
        )

    packages_path = str(env_path / "packages.yaml")
    with open(packages_path, "w", encoding="utf-8") as f:
        f.write(
            """\
packages:
  python:
    require:
    - spec: "@3.11:"
"""
        )

    spack_yaml = env_path / ev.manifest_name
    spack_yaml.write_text(
        f"""\
spack:
  include:
  - path: {config_path}
    optional: true
  - path: {packages_path}
"""
    )

    e = ev.Environment(env_path)
    with e.manifest.use_config():
        assert not spack.config.get("config:verify_ssl")
        python_reqs = spack.config.get("packages")["python"]["require"]
        req_specs = set(x["spec"] for x in python_reqs)
        assert req_specs == set(["@3.11:"])


def test_using_multiple_compilers_on_a_node_is_discouraged(
    tmp_path, mutable_config, mock_packages
):

View File

@@ -680,19 +680,13 @@ def test_install_spliced_build_spec_installed(install_mockery, capfd, mock_fetch
        assert node.build_spec.installed


# Unit tests should not be affected by the user's managed environments
@pytest.mark.not_on_windows("lacking windows support for binary installs")
@pytest.mark.parametrize("transitive", [True, False])
@pytest.mark.parametrize(
    "root_str", ["splice-t^splice-h~foo", "splice-h~foo", "splice-vt^splice-a"]
)
def test_install_splice_root_from_binary(
    mutable_mock_env_path,
    install_mockery,
    mock_fetch,
    mutable_temporary_mirror,
    transitive,
    root_str,
    install_mockery, mock_fetch, mutable_temporary_mirror, transitive, root_str
):
    """Test installing a spliced spec with the root available in binary cache"""
    # Test splicing and rewiring a spec with the same name, different hash.
@@ -983,6 +977,7 @@ class MyBuildException(Exception):
def _install_fail_my_build_exception(installer, task, install_status, **kwargs):
    print(task, task.pkg.name)
    if task.pkg.name == "pkg-a":
        raise MyBuildException("mock internal package build error for pkg-a")
    else:
View File

@@ -3,9 +3,6 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import os.path

import pytest

import llnl.util.filesystem as fs
@@ -16,10 +13,8 @@
import spack.error
import spack.main
import spack.paths
import spack.platforms
import spack.util.executable as exe
import spack.util.git
import spack.util.spack_yaml as syaml

pytestmark = pytest.mark.not_on_windows(
    "Test functionality supported but tests are failing on Win"
@@ -172,163 +167,3 @@ def test_add_command_line_scope_env(tmp_path, mutable_mock_env_path):
    assert config.get("config:install_tree:root") == "/tmp/first"
    assert ev.active_environment() is None  # shouldn't cause an environment to be activated


def test_include_cfg(mock_low_high_config, write_config_file, tmpdir):
    cfg1_path = str(tmpdir.join("include1.yaml"))
    with open(cfg1_path, "w", encoding="utf-8") as f:
        f.write(
            """\
config:
  verify_ssl: False
  dirty: True
packages:
  python:
    require:
    - spec: "@3.11:"
"""
        )

    def python_cfg(_spec):
        return f"""\
packages:
  python:
    require:
    - spec: {_spec}
"""

    def write_python_cfg(_spec, _cfg_name):
        cfg_path = str(tmpdir.join(_cfg_name))
        with open(cfg_path, "w", encoding="utf-8") as f:
            f.write(python_cfg(_spec))
        return cfg_path

    # This config will not be included
    cfg2_path = write_python_cfg("+shared", "include2.yaml")

    # The config will point to this using substitutable variables,
    # namely $os; we expect that Spack resolves these variables
    # into the actual path of the config
    this_os = spack.platforms.host().default_os
    cfg3_expanded_path = os.path.join(str(tmpdir), f"{this_os}", "include3.yaml")
    fs.mkdirp(os.path.dirname(cfg3_expanded_path))
    with open(cfg3_expanded_path, "w", encoding="utf-8") as f:
        f.write(python_cfg("+ssl"))
    cfg3_abstract_path = os.path.join(str(tmpdir), "$os", "include3.yaml")

    # This will be included unconditionally
    cfg4_path = write_python_cfg("+tk", "include4.yaml")

    # This config will not exist, and the config will explicitly
    # allow this
    cfg5_path = os.path.join(str(tmpdir), "non-existent.yaml")

    include_entries = [
        {"path": f"{cfg1_path}", "when": f'os == "{this_os}"'},
        {"path": f"{cfg2_path}", "when": "False"},
        {"path": cfg3_abstract_path},
        cfg4_path,
        {"path": cfg5_path, "optional": True},
    ]
    include_cfg = {"include": include_entries}
    filename = write_config_file("include", include_cfg, "low")

    assert not spack.config.get("config:dirty")
    spack.main.add_command_line_scopes(mock_low_high_config, [os.path.dirname(filename)])
    assert spack.config.get("config:dirty")
    python_reqs = spack.config.get("packages")["python"]["require"]
    req_specs = set(x["spec"] for x in python_reqs)
    assert req_specs == set(["@3.11:", "+ssl", "+tk"])
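For orientation, the include.yaml that write_config_file produces for these entries would look roughly like the sketch below (tmpdir paths abbreviated and the host os written as a placeholder; not verbatim output):

    include:
    - path: <tmpdir>/include1.yaml
      when: os == "<host os>"
    - path: <tmpdir>/include2.yaml
      when: 'False'
    - path: <tmpdir>/$os/include3.yaml
    - <tmpdir>/include4.yaml
    - path: <tmpdir>/non-existent.yaml
      optional: true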


def test_include_duplicate_source(tmpdir, mutable_config):
    """Check precedence when include.yaml files have the same path."""
    include_yaml = "debug.yaml"
    include_list = {"include": [f"./{include_yaml}"]}

    system_filename = mutable_config.get_config_filename("system", "include")
    site_filename = mutable_config.get_config_filename("site", "include")

    def write_configs(include_path, debug_data):
        fs.mkdirp(os.path.dirname(include_path))
        with open(include_path, "w", encoding="utf-8") as f:
            syaml.dump_config(include_list, f)

        debug_path = fs.join_path(os.path.dirname(include_path), include_yaml)
        with open(debug_path, "w", encoding="utf-8") as f:
            syaml.dump_config(debug_data, f)

    system_config = {"config": {"debug": False}}
    write_configs(system_filename, system_config)
    spack.main.add_command_line_scopes(mutable_config, [os.path.dirname(system_filename)])

    site_config = {"config": {"debug": True}}
    write_configs(site_filename, site_config)
    spack.main.add_command_line_scopes(mutable_config, [os.path.dirname(site_filename)])

    # Ensure the last value pushed onto the scope stack is the one that takes effect
    assert mutable_config.get("config:debug") == site_config["config"]["debug"]


def test_include_recurse_limit(tmpdir, mutable_config):
    """Ensure the recursion limit is hit for a self-including config file."""
    include_yaml = "include.yaml"
    include_list = {"include": [f"./{include_yaml}"]}
    include_path = str(tmpdir.join(include_yaml))
    with open(include_path, "w", encoding="utf-8") as f:
        syaml.dump_config(include_list, f)

    with pytest.raises(spack.config.RecursiveIncludeError, match="recursion exceeded"):
        spack.main.add_command_line_scopes(mutable_config, [os.path.dirname(include_path)])


# TODO: Fix this once recursive includes are processed in the expected order.
@pytest.mark.parametrize("child,expected", [("b", True), ("c", False)])
def test_include_recurse_diamond(tmpdir, mutable_config, child, expected):
    """Demonstrate that a parent's include value overrides a child's in a diamond include.

    Check that the value set by b or c overrides that set by d.
    """
    configs_root = tmpdir.join("configs")
    configs_root.mkdir()

    def write(path, contents):
        with open(path, "w", encoding="utf-8") as f:
            f.write(contents)

    def debug_contents(value):
        return f"config:\n debug: {value}\n"

    def include_contents(paths):
        indent = "\n - "
        values = indent.join([str(p) for p in paths])
        return f"include:{indent}{values}"

    a_yaml = tmpdir.join("a.yaml")
    b_yaml = configs_root.join("b.yaml")
    c_yaml = configs_root.join("c.yaml")
    d_yaml = configs_root.join("d.yaml")
    debug_yaml = configs_root.join("enable_debug.yaml")

    write(debug_yaml, debug_contents("true"))

    a_contents = f"""\
include:
- {b_yaml}
- {c_yaml}
"""
    write(a_yaml, a_contents)
    write(d_yaml, debug_contents("false"))
    write(b_yaml, include_contents([debug_yaml, d_yaml] if child == "b" else [d_yaml]))
    write(c_yaml, include_contents([debug_yaml, d_yaml] if child == "c" else [d_yaml]))

    spack.main.add_command_line_scopes(mutable_config, [str(tmpdir)])
    try:
        assert mutable_config.get("config:debug") is expected
    except AssertionError:
        pytest.xfail("recursive includes are not processed in the expected order")
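For orientation, the two parametrized include diamonds built above, as a sketch:

    child == "b":  a.yaml -> b.yaml -> enable_debug.yaml (true), d.yaml (false)
                   a.yaml -> c.yaml -> d.yaml (false)                            => expected True
    child == "c":  a.yaml -> b.yaml -> d.yaml (false)
                   a.yaml -> c.yaml -> enable_debug.yaml (true), d.yaml (false)  => expected False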

View File

@@ -38,9 +38,9 @@
{"optional-dep-test@1.1%intel": {"pkg-b": None, "pkg-c": None}},
),
(
"optional-dep-test@1.1+a%intel@64.1.2",
"optional-dep-test@1.1%intel@64.1.2+a",
{
"optional-dep-test@1.1+a%intel@64.1.2": {
"optional-dep-test@1.1%intel@64.1.2+a": {
"pkg-a": None,
"pkg-b": None,
"pkg-c": None,
@@ -49,8 +49,8 @@
        },
    ),
    (
        "optional-dep-test@1.1+a%clang@36.5",
        {"optional-dep-test@1.1+a%clang@36.5": {"pkg-b": None, "pkg-a": None, "pkg-e": None}},
        "optional-dep-test@1.1%clang@36.5+a",
        {"optional-dep-test@1.1%clang@36.5+a": {"pkg-b": None, "pkg-a": None, "pkg-e": None}},
    ),
    # Chained MPI
    (

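This hunk only moves the "+a" variant token after the compiler annotation in the test spec strings; as the unchanged expected-dependency dicts imply, both spellings denote the same spec on this branch. A minimal sketch of that equivalence, assuming a checkout of this branch on PYTHONPATH and that a variant following the compiler annotation still attaches to the root package:

    import spack.spec

    # Both orderings parse to the same abstract spec; only the canonical
    # string rendering places the compiler annotation before the variants.
    assert spack.spec.Spec("optional-dep-test@1.1+a%clang@36.5") == spack.spec.Spec(
        "optional-dep-test@1.1%clang@36.5+a"
    )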
Some files were not shown because too many files have changed in this diff.