Compare commits


59 Commits

Author SHA1 Message Date
Massimiliano Culpo
e30fedab10 Update CHANGELOG and version 2024-01-12 10:16:58 +01:00
eugeneswalker
a7c6df1b5a e4s ci: disable gpu test stack (#41296) 2024-01-12 10:16:58 +01:00
Harmen Stoppels
19e86088cd binary_distribution.py: support build cache layout 2 (#41773)
Add forward compatibility for tarballs created by Spack 0.22, which
use build cache layout version 2.

Spack 0.21 continues to produce build cache layout version 1 tarballs.

Build cache layout version 2 also lists parent directories of the
package prefix in the tarball, which is required for certain container
runtimes.
2024-01-11 09:40:22 +01:00
Harmen Stoppels
0295b466d7 Dont expect __qualname__ to exist (#41989) 2024-01-11 09:40:22 +01:00
Harmen Stoppels
68e9547615 installer.py: do not tty.die when cache only fails (#41990) 2024-01-11 09:40:22 +01:00
Harmen Stoppels
0ab9c8b904 installer.py: don't dereference stage before installing from binaries (#41986)
This fixes an issue where pkg.stage throws because a patch cannot be found,
but the patch is redundant because the spec is reused from a build cache and
will be installed from existing binaries.
2024-01-11 09:40:22 +01:00
Miguel Dias Costa
dd768bb6c3 update BerkeleyGW source urls (#38218)
* update url for BerkeleyGW version 3.0.1
* update source urls and add version 3.1.0 to berkeleygw package
2024-01-11 09:40:22 +01:00
Harmen Stoppels
b15f9d011c Spec.format: error on old style format strings (#41934) 2024-01-11 09:40:22 +01:00
Massimiliano Culpo
4f7cce68b8 ASP-based solver: don't error for type mismatch on preferences (#41138)
This commit discards type mismatches or failures to validate a package preference during concretization. The values discarded are logged as debug level messages. It also adds a config audit to help users spot misconfigurations in packages.yaml preferences.
2024-01-11 09:40:22 +01:00
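As an aside, a minimal sketch of running the configuration audit mentioned in the commit above (the `spack audit` subcommands shown exist in this release; output is omitted and the package name is illustrative):

```
# Check configuration files, including packages.yaml preferences, for issues
$ spack audit configs

# Audit package recipes as well, e.g. to catch bad variants or when= constraints
$ spack audit packages hdf5
```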
Massimiliano Culpo
f39a1e5fc8 Fix an issue with deconcretization/reconcretization of environments (#41294) 2024-01-11 09:40:22 +01:00
Massimiliano Culpo
df2a3bd531 ASP-based solver: use a unique ID counter (#41290)
* solver: use a unique counter for condition, triggers and effects

* Do not reset counters when re-running setup

  What we need is just a unique ID; it doesn't need
  to start from zero every time.
2024-01-11 09:40:22 +01:00
Todd Gamblin
e29049d9c0 bugfix: sort variants in spack info --variants-by-name (#41389)
This was missed while backporting the new `spack info` command from #40326.

Variants should be sorted by name when invoking `spack info --variants-by-name`.
2024-01-11 09:40:22 +01:00
Stephen Sachs
de2249c334 clingo-bootstrap: use new Spack API for environment modifications (#41574) 2024-01-11 09:40:22 +01:00
Harmen Stoppels
03d7643480 tests: fix more cases of env variables (#41226) 2024-01-11 09:40:22 +01:00
Massimiliano Culpo
9f2b8eef7a Refactor a test to not use the "working_env" fixture (#41308)
Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2024-01-11 09:40:22 +01:00
Harmen Stoppels
f57ac8d2da tests: fix issue with os.environ binding (#41342) 2024-01-11 09:40:22 +01:00
Harmen Stoppels
539fa5c39a test_variant_propagation_with_unify_false: missing fixture (#41345) 2024-01-11 09:40:22 +01:00
Harmen Stoppels
50710c2b6e tests: fix side effects of default_config fixture (#41361)
* tests: default_config drop scope

* use default_config elsewhere

* use parse_install_tree for missing defaults in default config
2024-01-11 09:40:22 +01:00
Harmen Stoppels
f1b2515ce1 tests: add missing mutable db (#41359) 2024-01-11 09:40:22 +01:00
Harmen Stoppels
37f65e769c unit tests: replace /bin/bash with /bin/sh (#41495) 2024-01-11 09:40:22 +01:00
Harmen Stoppels
16b600c193 tests: use temporary_store (#41369) 2024-01-11 09:40:22 +01:00
Harmen Stoppels
bb62f71aa0 asp.py: remove "CLI" reference (#41718)
Can also be an environment root, or programmatically
`Spec("x").concretized()`.
2024-01-11 09:40:22 +01:00
Juan Miguel Carceller
fff8e16cdd Add a webgui patch (#41404)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-01-11 09:40:22 +01:00
Dave Keeshan
33c287eed8 Fix filter_compiler_wrapper where compiler is None (#41502)
Fix filter_compiler_wrapper for cases where the compiler returned is None. This happens on some installed gcc systems that do not have Fortran built in as standard, e.g. gcc@11.4.0 on Ubuntu 22.04
2024-01-11 09:40:22 +01:00
Robert Cohn
2d5ccd3068 handle use of an unconfigured compiler (#41213) 2024-01-10 20:28:14 +01:00
Michael Kuhn
125085c580 Fix multi-word aliases (#41126)
PR #40929 reverted the argument parsing to make `spack --verbose
install` work again. It looks like `--verbose` is the only instance
where this kind of argument inheritance is used since all other commands
override arguments with the same name instead. For instance, `spack
--bootstrap clean` does not invoke `spack clean --bootstrap`.

Therefore, fix multi-word aliases again by parsing the resolved
arguments and explicitly passing down `args.verbose` to commands.
2024-01-10 20:28:14 +01:00
Massimiliano Culpo
e70e401be1 spack graph: fix coloring with environments (#41240)
If we use all specs, we won't correctly color build-only dependencies
2024-01-10 20:28:14 +01:00
John W. Parent
e8bbd7763c MSVC preview version breaks clingo build (#41185)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2024-01-10 20:28:14 +01:00
Harmen Stoppels
712db8c85c setup_platform_environment before package env mods (#41205)
This roughly restores the order of operations from Spack 0.20,
where `AutotoolsPackage.setup_build_environment` would override
the env variable set in `setup_platform_environment` on macOS.
2024-01-10 20:28:14 +01:00
Massimiliano Culpo
47c560d526 ASP-based solver: don't emit spurious debug output (#41218)
When improving the error message, we started #showing a lot more
symbols in the answer set, but we forgot to suppress the debug
messages warning about UNKNOWN SYMBOLs.
2024-01-10 20:28:14 +01:00
Harmen Stoppels
5b386cf9b1 test_which: do not mutate os.environ 2024-01-10 20:28:14 +01:00
Harmen Stoppels
2cbe84b1ee docs: document how spack picks a version / variant (#41070) 2024-01-10 20:28:14 +01:00
Massimiliano Culpo
e1f98fd206 Improve the error message for deprecated preferences (#41075)
Improves the warning for deprecated preferences, and adds a configuration
audit to get files:lines details of the issues.

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-01-10 20:28:14 +01:00
Massimiliano Culpo
d5ef4f8c83 Add audit check to spot when= arguments using wrong named specs (#41107)
* Add audit check to spot when= arguments using named specs

* Fix package issues caught by the new audit
2024-01-10 20:28:14 +01:00
Harmen Stoppels
b49022822d docs: packages config on separate page, demote bootstrapping (#41085) 2024-01-10 20:28:14 +01:00
Massimiliano Culpo
198bd87914 Fix infinite recursion when computing concretization errors (#41061) 2024-01-10 20:28:14 +01:00
Harmen Stoppels
080a781b81 Set version to 0.21.1.dev0 2024-01-10 20:28:14 +01:00
Todd Gamblin
65d3221a9c Update version and CHANGELOG.md for v0.21.0
Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2023-11-11 03:32:22 -08:00
Todd Gamblin
f6b17f6329 update release branch for tutorial command 2023-11-11 03:32:22 -08:00
Greg Becker
09e9bb5c3d spack deconcretize command (#38803)
We have two ways to concretize now:
* `spack concretize` concretizes only the root specs that are not concrete in the environment.
* `spack concretize -f` eliminates all cached concretization data and reconcretizes the *entire* environment.

This PR adds `spack deconcretize`, which eliminates cached concretization data for a spec.  This allows
users greater control over what is preserved from their `spack.lock` file and what is reused when not
using `spack concretize -f`.  If you want to update a spec installed in your environment, you can call
`spack deconcretize` on it, and that spec and any relevant dependents will be removed from the lock file.

`spack deconcretize` has two options:
* `--root`: limits deconcretized specs to *specific* roots in the environment. You can use this to
  deconcretize exactly one root in a `unify: false` environment.  i.e., if the root `foo` is a dependent
  of the root `bar`, `spack deconcretize bar` will *not* deconcretize `foo`.
* `--all`: deconcretize *all* specs that match the input spec. By default `spack deconcretize`
  will complain about multiple matches, like `spack uninstall`.
2023-11-11 03:32:22 -08:00
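A minimal usage sketch for the new command described above, assuming an environment with hypothetical roots `foo` and `bar`:

```
# Drop the cached concretization of a single root in a unify:false environment
$ spack deconcretize --root bar

# Deconcretize every matching spec instead of erroring on multiple matches
$ spack deconcretize --all foo

# Re-concretize only what is no longer concrete, reusing the rest of spack.lock
$ spack concretize
```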
Massimiliano Culpo
f6dc557764 builtin.repo: fix ^mkl pattern in minor packages (#41003)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2023-11-11 03:32:22 -08:00
Massimiliano Culpo
74172fb9d2 gromacs et al: fix ^mkl pattern (#41002)
The ^mkl pattern was used to refer to three packages
even though none of the software using it actually
depended on "mkl".

This pattern, which follows Hyrum's law, is now being
removed in favor of a more explicit one.

In this PR gromacs, abinit, lammps, and quantum-espresso
are modified.

Intel packages are also modified to provide "lapack"
and "blas" together.
2023-11-11 03:32:22 -08:00
Harmen Stoppels
c266e69cde env: compute env mods only for installed roots (#40997)
And improve the error message (load vs unload).

Of course you could have some uninstalled dependency too, but as long as
it doesn't implement `setup_run_environment` etc., I don't think it hurts
to attempt to load the root anyway, given that failure to do so is a
warning, not a fatal error.
2023-11-11 03:32:22 -08:00
Todd Gamblin
fe57ec2ab7 info: rework spack info command to display variants better (#40998)
This changes variant display to use a much more legible format, and to use screen space
much better (particularly on narrow terminals). It also adds color to the variant display
to match other parts of `spack info`.

Descriptions and variant value lists that were frequently squished into a tiny column
before now have closer to the full terminal width.

This change also preserves any whitespace formatting present in `package.py`, so package
maintainers can make easier-to-read descriptions of variant values if they want. For
example, `gasnet` has had a nice description of the `conduits` variant for a while, but
it was wrapped and made illegible by `spack info`. That is now fixed and the original
newlines are kept.

Conditional variants are grouped by their when clauses by default, but if you do not
like the grouping, you can display all the variants in order with `--variants-by-name`.
I'm not sure when people will prefer this, but it makes it easier to tell that a
particular variant is/isn't there. I do think grouping by `when` is the better default.
2023-11-11 03:32:22 -08:00
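A hedged example of the two display modes described above, using `gasnet` (which the commit mentions) as the package:

```
# Default: conditional variants grouped by their when clauses
$ spack info gasnet

# Flat listing, sorted by variant name
$ spack info --variants-by-name gasnet
```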
Adam J. Stewart
3c3476a176 py-black: add v23.10: (#40959) 2023-11-11 03:32:22 -08:00
Scott Wittenburg
67f20c3e5c buildcache: skip unrecognized metadata files (#40941)
This commit improves forward compatibility of Spack with newer build cache metadata formats.

Before this commit, invalid or unrecognized metadata was a fatal error; now it just causes
a mirror to be skipped.

Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2023-11-11 03:32:22 -08:00
Satish Balay
1baf712b87 intel-oneapi-mkl: do not set __INTEL_POST_CFLAGS env variable (#40947)
This triggers warnings from the icx compiler, which break the petsc configure:

$ I_MPI_CC=icx /opt/intel/oneapi/mpi/2021.7.0/bin/mpiicc -E a.c > /dev/null
$ __INTEL_POST_CFLAGS=-Wl,-rpath,/opt/intel/oneapi/mkl/2022.2.0/lib/intel64 I_MPI_CC=icx /opt/intel/oneapi/mpi/2021.7.0/bin/mpiicc -E a.c > /dev/null
icx: warning: -Wl,-rpath,/opt/intel/oneapi/mkl/2022.2.0/lib/intel64: 'linker' input unused [-Wunused-command-line-argument]
2023-11-09 11:44:57 +01:00
Massimiliano Culpo
c73ec0b36d modules: remove deprecated code and test data (#40966)
This removes a few deprecated attributes from the
schema of the "modules" section. Test data for
deprecated options is removed as well.
2023-11-09 11:44:57 +01:00
Harmen Stoppels
9d58d5e645 modules: restore exclude_implicits (#40958) 2023-11-09 11:44:57 +01:00
Massimiliano Culpo
fc5fd7fc60 tcl: filter compiler wrappers to avoid pointing to Spack (#40946) 2023-11-09 11:44:57 +01:00
Tom Vander Aa
efa59510a8 libevent: always autogen.sh (#40945)
The libevent release tarballs ship with a `configure` script generated by an old `libtool`. The `libtool` generated by `configure` is not compatible with `MACOSX_DEPLOYMENT_TARGET` > 10. Regenerating the `configure` scripts fixes the build on macOS.

Original configure contains:
```
    case $host_os in
    rhapsody* | darwin1.[012])
      _lt_dar_allow_undefined='$wl-undefined ${wl}suppress' ;;
    darwin1.*)
      _lt_dar_allow_undefined='$wl-flat_namespace $wl-undefined ${wl}suppress' ;;
    darwin*) # darwin 5.x on
      # if running on 10.5 or later, the deployment target defaults
      # to the OS version, if on x86, and 10.4, the deployment
      # target defaults to 10.4. Don't you love it?
      case ${MACOSX_DEPLOYMENT_TARGET-10.0},$host in
        10.0,*86*-darwin8*|10.0,*-darwin[91]*)
          _lt_dar_allow_undefined='$wl-undefined ${wl}dynamic_lookup' ;;
        10.[012][,.]*)
          _lt_dar_allow_undefined='$wl-flat_namespace $wl-undefined ${wl}suppress' ;;
        10.*)
          _lt_dar_allow_undefined='$wl-undefined ${wl}dynamic_lookup' ;;
      esac
```

After re-running `autogen.sh`:
```
    case $host_os in
    rhapsody* | darwin1.[012])
      _lt_dar_allow_undefined='$wl-undefined ${wl}suppress' ;;
    darwin1.*)
      _lt_dar_allow_undefined='$wl-flat_namespace $wl-undefined ${wl}suppress' ;;
    darwin*)
      case $MACOSX_DEPLOYMENT_TARGET,$host in
        10.[012],*|,*powerpc*-darwin[5-8]*)
          _lt_dar_allow_undefined='$wl-flat_namespace $wl-undefined ${wl}suppress' ;;
        *)
          _lt_dar_allow_undefined='$wl-undefined ${wl}dynamic_lookup' ;;
      esac
```
2023-11-09 11:44:57 +01:00
Harmen Stoppels
6094ee5eae Revert "defaults/modules.yaml: hide implicits (#40906)" (#40955)
This reverts commit a2f00886e9.
2023-11-09 11:44:57 +01:00
Greg Becker
f7cacdbf40 tutorial stack: update for changes to the basics section for SC23 (#40942) 2023-11-08 08:56:33 +01:00
Harmen Stoppels
5152738084 tutorial: use lmod@8.7.18 because @8.7.19: has bugs (#40939) 2023-11-08 08:56:33 +01:00
Richarda Butler
9ba8d60789 Propagate variant across nodes that don't have that variant (#38512)
Before this PR, variants were not propagated to leaf nodes that could accept
the propagated value, if some intermediate node couldn't accept it.

This PR fixes that issue by marking nodes as "candidate" for propagation
and by setting the variant only if it can be accepted by the node.

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2023-11-08 08:56:33 +01:00
Harmen Stoppels
3c8cd8d30c Ensure global command line arguments end up in args like before (#40929) 2023-11-08 08:56:33 +01:00
Harmen Stoppels
53a31bbbbf tutorial pipeline: force gcc@12.3.0 (#40937) 2023-11-08 08:56:33 +01:00
Harmen Stoppels
bac314a4f1 spack tutorial: use backports/v0.21.0 branch 2023-11-08 08:56:33 +01:00
Harmen Stoppels
10ba172611 catch exceptions in which_string (#40935) 2023-11-08 08:56:33 +01:00
562 changed files with 2800 additions and 7752 deletions

View File

@@ -23,7 +23,7 @@ jobs:
operating_system: ["ubuntu-latest", "macos-latest"]
steps:
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c # @v2
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236 # @v2
with:
python-version: ${{inputs.python_version}}
- name: Install Python packages

View File

@@ -159,7 +159,7 @@ jobs:
brew install cmake bison@2.7 tree
- name: Checkout
uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c # @v2
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236 # @v2
with:
python-version: "3.12"
- name: Bootstrap clingo

View File

@@ -57,7 +57,7 @@ jobs:
- name: Checkout
uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
- uses: docker/metadata-action@31cebacef4805868f9ce9a0cb03ee36c32df2ac4
- uses: docker/metadata-action@96383f45573cb7f253c731d3b3ab81c87ef81934
id: docker_meta
with:
images: |
@@ -113,7 +113,7 @@ jobs:
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Build & Deploy ${{ matrix.dockerfile[0] }}
uses: docker/build-push-action@4a13e500e55cf31b7a5d59a38ab2040ab0f42f56
uses: docker/build-push-action@0565240e2d4ab88bba5387d719585280857ece09
with:
context: dockerfiles/${{ matrix.dockerfile[0] }}
platforms: ${{ matrix.dockerfile[1] }}

View File

@@ -17,7 +17,7 @@ jobs:
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
with:
fetch-depth: 0
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236
with:
python-version: 3.9
- name: Install Python packages

View File

@@ -1,7 +1,7 @@
black==23.11.0
black==23.10.1
clingo==5.6.2
flake8==6.1.0
isort==5.12.0
mypy==1.7.1
mypy==1.6.1
types-six==1.16.21.9
vermin==1.6.0
vermin==1.5.2

View File

@@ -54,7 +54,7 @@ jobs:
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c # @v2
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236 # @v2
with:
python-version: ${{ matrix.python-version }}
- name: Install System packages
@@ -101,7 +101,7 @@ jobs:
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c # @v2
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236 # @v2
with:
python-version: '3.11'
- name: Install System packages
@@ -159,7 +159,7 @@ jobs:
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c # @v2
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236 # @v2
with:
python-version: '3.11'
- name: Install System packages
@@ -194,7 +194,7 @@ jobs:
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c # @v2
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236 # @v2
with:
python-version: ${{ matrix.python-version }}
- name: Install Python packages
@@ -215,7 +215,7 @@ jobs:
$(which spack) bootstrap disable spack-install
$(which spack) solve zlib
common_args=(--dist loadfile --tx '4*popen//python=./bin/spack-tmpconfig python -u ./bin/spack python' -x)
$(which spack) unit-test --verbose --cov --cov-config=pyproject.toml --cov-report=xml:coverage.xml "${common_args[@]}"
$(which spack) unit-test --cov --cov-config=pyproject.toml --cov-report=xml:coverage.xml "${common_args[@]}"
- uses: codecov/codecov-action@eaaf4bedf32dbdc6b720b63067d99c4d77d6047d
with:
flags: unittests,macos

View File

@@ -19,7 +19,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236
with:
python-version: '3.11'
cache: 'pip'
@@ -38,7 +38,7 @@ jobs:
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
with:
fetch-depth: 0
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236
with:
python-version: '3.11'
cache: 'pip'

View File

@@ -18,7 +18,7 @@ jobs:
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
with:
fetch-depth: 0
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236
with:
python-version: 3.9
- name: Install Python packages
@@ -42,7 +42,7 @@ jobs:
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
with:
fetch-depth: 0
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236
with:
python-version: 3.9
- name: Install Python packages
@@ -66,7 +66,7 @@ jobs:
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
with:
fetch-depth: 0
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236
with:
python-version: 3.9
- name: Install Python packages

View File

@@ -1,3 +1,29 @@
# v0.21.1 (2024-01-11)
## New features
- Add support for reading buildcaches created by Spack v0.22 (#41773)
## Bugfixes
- spack graph: fix coloring with environments (#41240)
- spack info: sort variants in --variants-by-name (#41389)
- Spec.format: error on old style format strings (#41934)
- ASP-based solver:
- fix infinite recursion when computing concretization errors (#41061)
- don't error for type mismatch on preferences (#41138)
- don't emit spurious debug output (#41218)
- Improve the error message for deprecated preferences (#41075)
- Fix MSVC preview version breaking clingo build on Windows (#41185)
- Fix multi-word aliases (#41126)
- Add a warning for unconfigured compiler (#41213)
- environment: fix an issue with deconcretization/reconcretization of specs (#41294)
- buildcache: don't error if a patch is missing, when installing from binaries (#41986)
- Multiple improvements to unit-tests (#41215,#41369,#41495,#41359,#41361,#41345,#41342,#41308,#41226)
## Package updates
- root: add a webgui patch to address security issue (#41404)
- BerkeleyGW: update source urls (#38218)
# v0.21.0 (2023-11-11)
`v0.21.0` is a major feature release.

View File

@@ -66,11 +66,10 @@ Resources:
* **Matrix space**: [#spack-space:matrix.org](https://matrix.to/#/#spack-space:matrix.org):
[bridged](https://github.com/matrix-org/matrix-appservice-slack#matrix-appservice-slack) to Slack.
* [**Github Discussions**](https://github.com/spack/spack/discussions):
for Q&A and discussions. Note the pinned discussions for announcements.
not just for discussions, but also Q&A.
* **Mailing list**: [groups.google.com/d/forum/spack](https://groups.google.com/d/forum/spack)
* **Twitter**: [@spackpm](https://twitter.com/spackpm). Be sure to
`@mention` us!
* **Mailing list**: [groups.google.com/d/forum/spack](https://groups.google.com/d/forum/spack):
only for announcements. Please use other venues for discussions.
Contributing
------------------------

View File

@@ -50,4 +50,4 @@ packages:
# Apple bundles libuuid in libsystem_c version 1353.100.2,
# although the version number used here isn't critical
- spec: apple-libuuid@1353.100.2
prefix: /Library/Developer/CommandLineTools/SDKs/MacOSX.sdk
prefix: /Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX.sdk

View File

@@ -182,7 +182,6 @@ section of the configuration:
padded_length: 128
.. _binary_caches_oci:
-----------------------------------------
OCI / Docker V2 registries as build cache

View File

@@ -82,7 +82,7 @@ class already contains:
.. code-block:: python
depends_on("cmake", type="build")
depends_on('cmake', type='build')
If you need to specify a particular version requirement, you can
@@ -90,7 +90,7 @@ override this in your package:
.. code-block:: python
depends_on("cmake@2.8.12:", type="build")
depends_on('cmake@2.8.12:', type='build')
^^^^^^^^^^^^^^^^^^^
@@ -137,10 +137,10 @@ and without the :meth:`~spack.build_systems.cmake.CMakeBuilder.define` and
def cmake_args(self):
args = [
"-DWHATEVER:STRING=somevalue",
self.define("ENABLE_BROKEN_FEATURE", False),
self.define_from_variant("DETECT_HDF5", "hdf5"),
self.define_from_variant("THREADS"), # True if +threads
'-DWHATEVER:STRING=somevalue',
self.define('ENABLE_BROKEN_FEATURE', False),
self.define_from_variant('DETECT_HDF5', 'hdf5'),
self.define_from_variant('THREADS'), # True if +threads
]
return args
@@ -151,10 +151,10 @@ and CMake simply ignores the empty command line argument. For example the follow
.. code-block:: python
variant("example", default=True, when="@2.0:")
variant('example', default=True, when='@2.0:')
def cmake_args(self):
return [self.define_from_variant("EXAMPLE", "example")]
return [self.define_from_variant('EXAMPLE', 'example')]
will generate ``'cmake' '-DEXAMPLE=ON' ...`` when `@2.0: +example` is met, but will
result in ``'cmake' '' ...`` when the spec version is below ``2.0``.
@@ -193,9 +193,9 @@ a variant to control this:
.. code-block:: python
variant("build_type", default="RelWithDebInfo",
description="CMake build type",
values=("Debug", "Release", "RelWithDebInfo", "MinSizeRel"))
variant('build_type', default='RelWithDebInfo',
description='CMake build type',
values=('Debug', 'Release', 'RelWithDebInfo', 'MinSizeRel'))
However, not every CMake package accepts all four of these options.
Grep the ``CMakeLists.txt`` file to see if the default values are
@@ -205,9 +205,9 @@ package overrides the default variant with:
.. code-block:: python
variant("build_type", default="DebugRelease",
description="The build type to build",
values=("Debug", "Release", "DebugRelease"))
variant('build_type', default='DebugRelease',
description='The build type to build',
values=('Debug', 'Release', 'DebugRelease'))
For more information on ``CMAKE_BUILD_TYPE``, see:
https://cmake.org/cmake/help/latest/variable/CMAKE_BUILD_TYPE.html
@@ -250,7 +250,7 @@ generator is Ninja. To switch to the Ninja generator, simply add:
.. code-block:: python
generator = "Ninja"
generator = 'Ninja'
``CMakePackage`` defaults to "Unix Makefiles". If you switch to the
@@ -258,7 +258,7 @@ Ninja generator, make sure to add:
.. code-block:: python
depends_on("ninja", type="build")
depends_on('ninja', type='build')
to the package as well. Aside from that, you shouldn't need to do
anything else. Spack will automatically detect that you are using
@@ -288,7 +288,7 @@ like so:
.. code-block:: python
root_cmakelists_dir = "src"
root_cmakelists_dir = 'src'
Note that this path is relative to the root of the extracted tarball,
@@ -304,7 +304,7 @@ different sub-directory, simply override ``build_directory`` like so:
.. code-block:: python
build_directory = "my-build"
build_directory = 'my-build'
^^^^^^^^^^^^^^^^^^^^^^^^^
Build and install targets
@@ -324,8 +324,8 @@ library or build the documentation, you can add these like so:
.. code-block:: python
build_targets = ["all", "docs"]
install_targets = ["install", "docs"]
build_targets = ['all', 'docs']
install_targets = ['install', 'docs']
^^^^^^^
Testing

View File

@@ -53,24 +53,18 @@ Install the oneAPI compilers::
Add the compilers to your ``compilers.yaml`` so spack can use them::
spack compiler add `spack location -i intel-oneapi-compilers`/compiler/latest/bin
spack compiler add `spack location -i intel-oneapi-compilers`/compiler/latest/linux/bin/intel64
spack compiler add `spack location -i intel-oneapi-compilers`/compiler/latest/linux/bin
Verify that the compilers are available::
spack compiler list
Note that 2024 and later releases do not include ``icc``. Before 2024,
the package layout was different::
spack compiler add `spack location -i intel-oneapi-compilers`/compiler/latest/linux/bin/intel64
spack compiler add `spack location -i intel-oneapi-compilers`/compiler/latest/linux/bin
The ``intel-oneapi-compilers`` package includes 2 families of
compilers:
* ``intel``: ``icc``, ``icpc``, ``ifort``. Intel's *classic*
compilers. 2024 and later releases contain ``ifort``, but not
``icc`` and ``icpc``.
compilers.
* ``oneapi``: ``icx``, ``icpx``, ``ifx``. Intel's new generation of
compilers based on LLVM.
@@ -95,8 +89,8 @@ Install the oneAPI compilers::
Add the compilers to your ``compilers.yaml`` so Spack can use them::
spack compiler add `spack location -i intel-oneapi-compilers`/compiler/latest/bin
spack compiler add `spack location -i intel-oneapi-compilers`/compiler/latest/bin
spack compiler add `spack location -i intel-oneapi-compilers`/compiler/latest/linux/bin/intel64
spack compiler add `spack location -i intel-oneapi-compilers`/compiler/latest/linux/bin
Verify that the compilers are available::
@@ -152,7 +146,8 @@ Compilers
To use the compilers, add some information about the installation to
``compilers.yaml``. For most users, it is sufficient to do::
spack compiler add /opt/intel/oneapi/compiler/latest/bin
spack compiler add /opt/intel/oneapi/compiler/latest/linux/bin/intel64
spack compiler add /opt/intel/oneapi/compiler/latest/linux/bin
Adapt the paths above if you did not install the tools in the default
location. After adding the compilers, using them is the same
@@ -161,12 +156,6 @@ Another option is to manually add the configuration to
``compilers.yaml`` as described in :ref:`Compiler configuration
<compiler-config>`.
Before 2024, the directory structure was different::
spack compiler add /opt/intel/oneapi/compiler/latest/linux/bin/intel64
spack compiler add /opt/intel/oneapi/compiler/latest/linux/bin
Libraries
---------

View File

@@ -934,9 +934,9 @@ a *virtual* ``mkl`` package is declared in Spack.
.. code-block:: python
# Examples for absolute and conditional dependencies:
depends_on("mkl")
depends_on("mkl", when="+mkl")
depends_on("mkl", when="fftw=mkl")
depends_on('mkl')
depends_on('mkl', when='+mkl')
depends_on('mkl', when='fftw=mkl')
The ``MKLROOT`` environment variable (part of the documented API) will be set
during all stages of client package installation, and is available to both
@@ -972,8 +972,8 @@ a *virtual* ``mkl`` package is declared in Spack.
def configure_args(self):
args = []
...
args.append("--with-blas=%s" % self.spec["blas"].libs.ld_flags)
args.append("--with-lapack=%s" % self.spec["lapack"].libs.ld_flags)
args.append('--with-blas=%s' % self.spec['blas'].libs.ld_flags)
args.append('--with-lapack=%s' % self.spec['lapack'].libs.ld_flags)
...
.. tip::
@@ -989,13 +989,13 @@ a *virtual* ``mkl`` package is declared in Spack.
.. code-block:: python
self.spec["blas"].headers.include_flags
self.spec['blas'].headers.include_flags
and to generate linker options (``-L<dir> -llibname ...``), use the same as above,
.. code-block:: python
self.spec["blas"].libs.ld_flags
self.spec['blas'].libs.ld_flags
See
:ref:`MakefilePackage <makefilepackage>`

View File

@@ -88,7 +88,7 @@ override the ``luarocks_args`` method like so:
.. code-block:: python
def luarocks_args(self):
return ["flag1", "flag2"]
return ['flag1', 'flag2']
One common use of this is to override warnings or flags for newer compilers, as in:

View File

@@ -48,8 +48,8 @@ class automatically adds the following dependencies:
.. code-block:: python
depends_on("java", type=("build", "run"))
depends_on("maven", type="build")
depends_on('java', type=('build', 'run'))
depends_on('maven', type='build')
In the ``pom.xml`` file, you may see sections like:
@@ -72,8 +72,8 @@ should add:
.. code-block:: python
depends_on("java@7:", type="build")
depends_on("maven@3.5.4:", type="build")
depends_on('java@7:', type='build')
depends_on('maven@3.5.4:', type='build')
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -88,9 +88,9 @@ the build phase. For example:
def build_args(self):
return [
"-Pdist,native",
"-Dtar",
"-Dmaven.javadoc.skip=true"
'-Pdist,native',
'-Dtar',
'-Dmaven.javadoc.skip=true'
]

View File

@@ -86,8 +86,8 @@ the ``MesonPackage`` base class already contains:
.. code-block:: python
depends_on("meson", type="build")
depends_on("ninja", type="build")
depends_on('meson', type='build')
depends_on('ninja', type='build')
If you need to specify a particular version requirement, you can
@@ -95,8 +95,8 @@ override this in your package:
.. code-block:: python
depends_on("meson@0.43.0:", type="build")
depends_on("ninja", type="build")
depends_on('meson@0.43.0:', type='build')
depends_on('ninja', type='build')
^^^^^^^^^^^^^^^^^^^
@@ -121,7 +121,7 @@ override the ``meson_args`` method like so:
.. code-block:: python
def meson_args(self):
return ["--warnlevel=3"]
return ['--warnlevel=3']
This method can be used to pass flags as well as variables.

View File

@@ -118,7 +118,7 @@ so ``PerlPackage`` contains:
.. code-block:: python
extends("perl")
extends('perl')
If your package requires a specific version of Perl, you should
@@ -132,14 +132,14 @@ properly. If your package uses ``Makefile.PL`` to build, add:
.. code-block:: python
depends_on("perl-extutils-makemaker", type="build")
depends_on('perl-extutils-makemaker', type='build')
If your package uses ``Build.PL`` to build, add:
.. code-block:: python
depends_on("perl-module-build", type="build")
depends_on('perl-module-build', type='build')
^^^^^^^^^^^^^^^^^
@@ -165,11 +165,11 @@ arguments to ``Makefile.PL`` or ``Build.PL`` by overriding
.. code-block:: python
def configure_args(self):
expat = self.spec["expat"].prefix
expat = self.spec['expat'].prefix
return [
"EXPATLIBPATH={0}".format(expat.lib),
"EXPATINCPATH={0}".format(expat.include),
'EXPATLIBPATH={0}'.format(expat.lib),
'EXPATINCPATH={0}'.format(expat.include),
]

View File

@@ -83,7 +83,7 @@ base class already contains:
.. code-block:: python
depends_on("qt", type="build")
depends_on('qt', type='build')
If you want to specify a particular version requirement, or need to
@@ -91,7 +91,7 @@ link to the ``qt`` libraries, you can override this in your package:
.. code-block:: python
depends_on("qt@5.6.0:")
depends_on('qt@5.6.0:')
^^^^^^^^^^^^^^^^^^^^^^^^^^
Passing arguments to qmake
@@ -103,7 +103,7 @@ override the ``qmake_args`` method like so:
.. code-block:: python
def qmake_args(self):
return ["-recursive"]
return ['-recursive']
This method can be used to pass flags as well as variables.
@@ -118,7 +118,7 @@ sub-directory by adding the following to the package:
.. code-block:: python
build_directory = "src"
build_directory = 'src'
^^^^^^^^^^^^^^^^^^^^^^

View File

@@ -163,28 +163,28 @@ attributes that can be used to set ``homepage``, ``url``, ``list_url``, and
.. code-block:: python
cran = "caret"
cran = 'caret'
is equivalent to:
.. code-block:: python
homepage = "https://cloud.r-project.org/package=caret"
url = "https://cloud.r-project.org/src/contrib/caret_6.0-86.tar.gz"
list_url = "https://cloud.r-project.org/src/contrib/Archive/caret"
homepage = 'https://cloud.r-project.org/package=caret'
url = 'https://cloud.r-project.org/src/contrib/caret_6.0-86.tar.gz'
list_url = 'https://cloud.r-project.org/src/contrib/Archive/caret'
Likewise, the following ``bioc`` attribute:
.. code-block:: python
bioc = "BiocVersion"
bioc = 'BiocVersion'
is equivalent to:
.. code-block:: python
homepage = "https://bioconductor.org/packages/BiocVersion/"
git = "https://git.bioconductor.org/packages/BiocVersion"
homepage = 'https://bioconductor.org/packages/BiocVersion/'
git = 'https://git.bioconductor.org/packages/BiocVersion'
^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -200,7 +200,7 @@ base class contains:
.. code-block:: python
extends("r")
extends('r')
Take a close look at the homepage for ``caret``. If you look at the
@@ -209,7 +209,7 @@ You should add this to your package like so:
.. code-block:: python
depends_on("r@3.2.0:", type=("build", "run"))
depends_on('r@3.2.0:', type=('build', 'run'))
^^^^^^^^^^^^^^
@@ -227,7 +227,7 @@ and list all of their dependencies in the following sections:
* LinkingTo
As far as Spack is concerned, all 3 of these dependency types
correspond to ``type=("build", "run")``, so you don't have to worry
correspond to ``type=('build', 'run')``, so you don't have to worry
about the details. If you are curious what they mean,
https://github.com/spack/spack/issues/2951 has a pretty good summary:
@@ -330,7 +330,7 @@ the dependency:
.. code-block:: python
depends_on("r-lattice@0.20:", type=("build", "run"))
depends_on('r-lattice@0.20:', type=('build', 'run'))
^^^^^^^^^^^^^^^^^^
@@ -361,20 +361,20 @@ like so:
.. code-block:: python
def configure_args(self):
mpi_name = self.spec["mpi"].name
mpi_name = self.spec['mpi'].name
# The type of MPI. Supported values are:
# OPENMPI, LAM, MPICH, MPICH2, or CRAY
if mpi_name == "openmpi":
Rmpi_type = "OPENMPI"
elif mpi_name == "mpich":
Rmpi_type = "MPICH2"
if mpi_name == 'openmpi':
Rmpi_type = 'OPENMPI'
elif mpi_name == 'mpich':
Rmpi_type = 'MPICH2'
else:
raise InstallError("Unsupported MPI type")
raise InstallError('Unsupported MPI type')
return [
"--with-Rmpi-type={0}".format(Rmpi_type),
"--with-mpi={0}".format(spec["mpi"].prefix),
'--with-Rmpi-type={0}'.format(Rmpi_type),
'--with-mpi={0}'.format(spec['mpi'].prefix),
]

View File

@@ -84,8 +84,8 @@ The ``*.gemspec`` file may contain something like:
.. code-block:: ruby
summary = "An implementation of the AsciiDoc text processor and publishing toolchain"
description = "A fast, open source text processor and publishing toolchain for converting AsciiDoc content to HTML 5, DocBook 5, and other formats."
summary = 'An implementation of the AsciiDoc text processor and publishing toolchain'
description = 'A fast, open source text processor and publishing toolchain for converting AsciiDoc content to HTML 5, DocBook 5, and other formats.'
Either of these can be used for the description of the Spack package.
@@ -98,7 +98,7 @@ The ``*.gemspec`` file may contain something like:
.. code-block:: ruby
homepage = "https://asciidoctor.org"
homepage = 'https://asciidoctor.org'
This should be used as the official homepage of the Spack package.
@@ -112,21 +112,21 @@ the base class contains:
.. code-block:: python
extends("ruby")
extends('ruby')
The ``*.gemspec`` file may contain something like:
.. code-block:: ruby
required_ruby_version = ">= 2.3.0"
required_ruby_version = '>= 2.3.0'
This can be added to the Spack package using:
.. code-block:: python
depends_on("ruby@2.3.0:", type=("build", "run"))
depends_on('ruby@2.3.0:', type=('build', 'run'))
^^^^^^^^^^^^^^^^^

View File

@@ -124,7 +124,7 @@ are wrong, you can provide the names yourself by overriding
.. code-block:: python
import_modules = ["PyQt5"]
import_modules = ['PyQt5']
These tests often catch missing dependencies and non-RPATHed

View File

@@ -63,8 +63,8 @@ run package-specific unit tests.
.. code-block:: python
def installtest(self):
with working_dir("test"):
pytest = which("py.test")
with working_dir('test'):
pytest = which('py.test')
pytest()
@@ -93,7 +93,7 @@ the following dependency automatically:
.. code-block:: python
depends_on("python@2.5:", type="build")
depends_on('python@2.5:', type='build')
Waf only supports Python 2.5 and up.
@@ -113,7 +113,7 @@ phase, you can use:
args = []
if self.run_tests:
args.append("--test")
args.append('--test')
return args

View File

@@ -243,11 +243,9 @@ lower-precedence settings. Completely ignoring higher-level configuration
options is supported with the ``::`` notation for keys (see
:ref:`config-overrides` below).
There are also special notations for string concatenation and precedence override:
* ``+:`` will force *prepending* strings or lists. For lists, this is the default behavior.
* ``-:`` works similarly, but for *appending* values.
There are also special notations for string concatenation and precedence override.
Using the ``+:`` notation can be used to force *prepending* strings or lists. For lists, this is identical
to the default behavior. Using the ``-:`` works similarly, but for *appending* values.
:ref:`config-prepend-append`
^^^^^^^^^^^

View File

@@ -24,16 +24,6 @@ image, or to set up a proper entrypoint to run the image. These tasks are
usually both necessary and repetitive, so Spack comes with a command
to generate recipes for container images starting from a ``spack.yaml``.
.. seealso::
This page is a reference for generating recipes to build container images.
It means that your environment is built from scratch inside the container
runtime.
Since v0.21, Spack can also create container images from existing package installations
on your host system. See :ref:`binary_caches_oci` for more information on
that topic.
--------------------
A Quick Introduction
--------------------

View File

@@ -9,42 +9,46 @@
Custom Extensions
=================
*Spack extensions* allow you to extend Spack capabilities by deploying your
*Spack extensions* permit you to extend Spack capabilities by deploying your
own custom commands or logic in an arbitrary location on your filesystem.
This might be extremely useful e.g. to develop and maintain a command whose purpose is
too specific to be considered for reintegration into the mainline or to
evolve a command through its early stages before starting a discussion to merge
it upstream.
From Spack's point of view an extension is any path in your filesystem which
respects the following naming and layout for files:
respects a prescribed naming and layout for files:
.. code-block:: console
spack-scripting/ # The top level directory must match the format 'spack-{extension_name}'
├── pytest.ini # Optional file if the extension ships its own tests
├── scripting # Folder that may contain modules that are needed for the extension commands
│   ├── cmd # Folder containing extension commands
│   │   └── filter.py # A new command that will be available
│   └── functions.py # Module with internal details
└── tests # Tests for this extension
│   ├── cmd # Folder containing extension commands
│   └── filter.py # A new command that will be available
├── tests # Tests for this extension
│ ├── conftest.py
│ └── test_filter.py
└── templates # Templates that may be needed by the extension
In the example above, the extension is named *scripting*. It adds an additional command
(``spack filter``) and unit tests to verify its behavior.
In the example above the extension named *scripting* adds an additional command (``filter``)
and unit tests to verify its behavior. The code for this example can be
obtained by cloning the corresponding git repository:
The extension can import any core Spack module in its implementation. When loaded by
the ``spack`` command, the extension itself is imported as a Python package in the
``spack.extensions`` namespace. In the example above, since the extension is named
"scripting", the corresponding Python module is ``spack.extensions.scripting``.
The code for this example extension can be obtained by cloning the corresponding git repository:
.. TODO: write an ad-hoc "hello world" extension and make it part of the spack organization
.. code-block:: console
$ git -C /tmp clone https://github.com/spack/spack-scripting.git
$ cd ~/
$ mkdir tmp && cd tmp
$ git clone https://github.com/alalazo/spack-scripting.git
Cloning into 'spack-scripting'...
remote: Counting objects: 11, done.
remote: Compressing objects: 100% (7/7), done.
remote: Total 11 (delta 0), reused 11 (delta 0), pack-reused 0
Receiving objects: 100% (11/11), done.
As you can see by inspecting the sources, Python modules that are part of the extension
can import any core Spack module.
---------------------------------
Configure Spack to Use Extensions
@@ -57,7 +61,7 @@ paths to ``config.yaml``. In the case of our example this means ensuring that:
config:
extensions:
- /tmp/spack-scripting
- ~/tmp/spack-scripting
is part of your configuration file. Once this is setup any command that the extension provides
will be available from the command line:
@@ -82,32 +86,37 @@ will be available from the command line:
--implicit select specs that are not installed or were installed implicitly
--output OUTPUT where to dump the result
The corresponding unit tests can be run giving the appropriate options to ``spack unit-test``:
The corresponding unit tests can be run giving the appropriate options
to ``spack unit-test``:
.. code-block:: console
$ spack unit-test --extension=scripting
========================================== test session starts ===========================================
platform linux -- Python 3.11.5, pytest-7.4.3, pluggy-1.3.0
rootdir: /home/culpo/github/spack-scripting
configfile: pytest.ini
testpaths: tests
plugins: xdist-3.5.0
============================================================== test session starts ===============================================================
platform linux2 -- Python 2.7.15rc1, pytest-3.2.5, py-1.4.34, pluggy-0.4.0
rootdir: /home/mculpo/tmp/spack-scripting, inifile: pytest.ini
collected 5 items
tests/test_filter.py ..... [100%]
tests/test_filter.py ...XX
============================================================ short test summary info =============================================================
XPASS tests/test_filter.py::test_filtering_specs[flags3-specs3-expected3]
XPASS tests/test_filter.py::test_filtering_specs[flags4-specs4-expected4]
========================================== slowest 30 durations ==========================================
2.31s setup tests/test_filter.py::test_filtering_specs[kwargs0-specs0-expected0]
0.57s call tests/test_filter.py::test_filtering_specs[kwargs2-specs2-expected2]
0.56s call tests/test_filter.py::test_filtering_specs[kwargs4-specs4-expected4]
0.54s call tests/test_filter.py::test_filtering_specs[kwargs3-specs3-expected3]
0.54s call tests/test_filter.py::test_filtering_specs[kwargs1-specs1-expected1]
0.48s call tests/test_filter.py::test_filtering_specs[kwargs0-specs0-expected0]
0.01s setup tests/test_filter.py::test_filtering_specs[kwargs4-specs4-expected4]
0.01s setup tests/test_filter.py::test_filtering_specs[kwargs2-specs2-expected2]
0.01s setup tests/test_filter.py::test_filtering_specs[kwargs1-specs1-expected1]
0.01s setup tests/test_filter.py::test_filtering_specs[kwargs3-specs3-expected3]
(5 durations < 0.005s hidden. Use -vv to show these durations.)
=========================================== 5 passed in 5.06s ============================================
=========================================================== slowest 20 test durations ============================================================
3.74s setup tests/test_filter.py::test_filtering_specs[flags0-specs0-expected0]
0.17s call tests/test_filter.py::test_filtering_specs[flags3-specs3-expected3]
0.16s call tests/test_filter.py::test_filtering_specs[flags2-specs2-expected2]
0.15s call tests/test_filter.py::test_filtering_specs[flags1-specs1-expected1]
0.13s call tests/test_filter.py::test_filtering_specs[flags4-specs4-expected4]
0.08s call tests/test_filter.py::test_filtering_specs[flags0-specs0-expected0]
0.04s teardown tests/test_filter.py::test_filtering_specs[flags4-specs4-expected4]
0.00s setup tests/test_filter.py::test_filtering_specs[flags4-specs4-expected4]
0.00s setup tests/test_filter.py::test_filtering_specs[flags3-specs3-expected3]
0.00s setup tests/test_filter.py::test_filtering_specs[flags1-specs1-expected1]
0.00s setup tests/test_filter.py::test_filtering_specs[flags2-specs2-expected2]
0.00s teardown tests/test_filter.py::test_filtering_specs[flags2-specs2-expected2]
0.00s teardown tests/test_filter.py::test_filtering_specs[flags1-specs1-expected1]
0.00s teardown tests/test_filter.py::test_filtering_specs[flags0-specs0-expected0]
0.00s teardown tests/test_filter.py::test_filtering_specs[flags3-specs3-expected3]
====================================================== 3 passed, 2 xpassed in 4.51 seconds =======================================================

View File

@@ -111,28 +111,3 @@ CUDA is split into fewer components and is simpler to specify:
prefix: /opt/cuda/cuda-11.0.2/
where ``/opt/cuda/cuda-11.0.2/lib/`` contains ``libcudart.so``.
-----------------------------------
Using an External OpenGL API
-----------------------------------
Depending on whether we have a graphics card or not, we may choose to use OSMesa or GLX to implement the OpenGL API.
If a graphics card is unavailable, OSMesa is recommended and can typically be built with Spack.
However, if we prefer to utilize the system GLX tailored to our graphics card, we need to declare it as an external. Here's how to do it:
.. code-block:: yaml
packages:
libglx:
require: [opengl]
opengl:
buildable: false
externals:
- prefix: /usr/
spec: opengl@4.6
Note that the prefix has to be the root of both the libraries and the headers, i.e. /usr, not the path to the lib directory.
To know which spec for opengl is available, use ``cd /usr/include/GL && grep -Ri gl_version``.

View File

@@ -383,33 +383,7 @@ like this:
which means every spec will be required to use ``clang`` as a compiler.
Requirements on variants for all packages are possible too, but note that they
are only enforced for those packages that define these variants, otherwise they
are disregarded. For example:
.. code-block:: yaml
packages:
all:
require:
- "+shared"
- "+cuda"
will just enforce ``+shared`` on ``zlib``, which has a boolean ``shared`` variant but
no ``cuda`` variant.
Constraints in a single spec literal are always considered as a whole, so in a case like:
.. code-block:: yaml
packages:
all:
require: "+shared +cuda"
the default requirement will be enforced only if a package has both a ``cuda`` and
a ``shared`` variant, and will never be partially enforced.
Finally, ``all`` represents a *default set of requirements* -
Note that in this case ``all`` represents a *default set of requirements* -
if there are specific package requirements, then the default requirements
under ``all`` are disregarded. For example, with a configuration like this:
@@ -417,18 +391,12 @@ under ``all`` are disregarded. For example, with a configuration like this:
packages:
all:
require:
- 'build_type=Debug'
- '%clang'
require: '%clang'
cmake:
require:
- 'build_type=Debug'
- '%gcc'
require: '%gcc'
Spack requires ``cmake`` to use ``gcc`` and all other nodes (including ``cmake``
dependencies) to use ``clang``. If enforcing ``build_type=Debug`` is needed also
on ``cmake``, it must be repeated in the specific ``cmake`` requirements.
dependencies) to use ``clang``.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Setting requirements on virtual specs

View File

@@ -2337,7 +2337,7 @@ window while a batch job is running ``spack install`` on the same or
overlapping dependencies without any process trying to re-do the work of
another.
For example, if you are using Slurm, you could launch an installation
For example, if you are using SLURM, you could launch an installation
of ``mpich`` using the following command:
.. code-block:: console

View File

@@ -1,13 +1,13 @@
sphinx==7.2.6
sphinxcontrib-programoutput==0.17
sphinx_design==0.5.0
sphinx-rtd-theme==2.0.0
sphinx-rtd-theme==1.3.0
python-levenshtein==0.23.0
docutils==0.20.1
pygments==2.17.2
urllib3==2.1.0
docutils==0.18.1
pygments==2.16.1
urllib3==2.0.7
pytest==7.4.3
isort==5.12.0
black==23.11.0
black==23.10.1
flake8==6.1.0
mypy==1.7.1
mypy==1.6.1

View File

@@ -1047,9 +1047,9 @@ def __bool__(self):
"""Whether any exceptions were handled."""
return bool(self.exceptions)
def forward(self, context: str, base: type = BaseException) -> "GroupedExceptionForwarder":
def forward(self, context: str) -> "GroupedExceptionForwarder":
"""Return a contextmanager which extracts tracebacks and prefixes a message."""
return GroupedExceptionForwarder(context, self, base)
return GroupedExceptionForwarder(context, self)
def _receive_forwarded(self, context: str, exc: Exception, tb: List[str]):
self.exceptions.append((context, exc, tb))
@@ -1072,18 +1072,15 @@ class GroupedExceptionForwarder:
"""A contextmanager to capture exceptions and forward them to a
GroupedExceptionHandler."""
def __init__(self, context: str, handler: GroupedExceptionHandler, base: type):
def __init__(self, context: str, handler: GroupedExceptionHandler):
self._context = context
self._handler = handler
self._base = base
def __enter__(self):
return None
def __exit__(self, exc_type, exc_value, tb):
if exc_value is not None:
if not issubclass(exc_type, self._base):
return False
self._handler._receive_forwarded(self._context, exc_value, traceback.format_tb(tb))
# Suppress any exception from being re-raised:

View File

@@ -4,7 +4,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
#: PEP440 canonical <major>.<minor>.<micro>.<devN> string
__version__ = "0.22.0.dev0"
__version__ = "0.21.1"
spack_version = __version__

View File

@@ -726,37 +726,13 @@ def _unknown_variants_in_directives(pkgs, error_cls):
@package_directives
def _issues_in_depends_on_directive(pkgs, error_cls):
"""Reports issues with 'depends_on' directives.
Issues might be unknown dependencies, unknown variants or variant values, or declaration
of nested dependencies.
"""
def _unknown_variants_in_dependencies(pkgs, error_cls):
"""Report unknown dependencies and wrong variants for dependencies"""
errors = []
for pkg_name in pkgs:
pkg_cls = spack.repo.PATH.get_pkg_class(pkg_name)
filename = spack.repo.PATH.filename_for_package_name(pkg_name)
for dependency_name, dependency_data in pkg_cls.dependencies.items():
# Check if there are nested dependencies declared. We don't want directives like:
#
# depends_on('foo+bar ^fee+baz')
#
# but we'd like to have two dependencies listed instead.
for when, dependency_edge in dependency_data.items():
dependency_spec = dependency_edge.spec
nested_dependencies = dependency_spec.dependencies()
if nested_dependencies:
summary = (
f"{pkg_name}: invalid nested dependency "
f"declaration '{str(dependency_spec)}'"
)
details = [
f"split depends_on('{str(dependency_spec)}', when='{str(when)}') "
f"into {len(nested_dependencies) + 1} directives",
f"in {filename}",
]
errors.append(error_cls(summary=summary, details=details))
# No need to analyze virtual packages
if spack.repo.PATH.is_virtual(dependency_name):
continue

View File

@@ -69,6 +69,7 @@
BUILD_CACHE_RELATIVE_PATH = "build_cache"
BUILD_CACHE_KEYS_RELATIVE_PATH = "_pgp"
CURRENT_BUILD_CACHE_LAYOUT_VERSION = 1
FORWARD_COMPAT_BUILD_CACHE_LAYOUT_VERSION = 2
class BuildCacheDatabase(spack_db.Database):
@@ -230,11 +231,7 @@ def _associate_built_specs_with_mirror(self, cache_key, mirror_url):
)
return
spec_list = [
s
for s in db.query_local(installed=any, in_buildcache=any)
if s.external or db.query_local_by_spec_hash(s.dag_hash()).in_buildcache
]
spec_list = db.query_local(installed=False, in_buildcache=True)
for indexed_spec in spec_list:
dag_hash = indexed_spec.dag_hash()
@@ -1700,7 +1697,7 @@ def download_tarball(spec, unsigned=False, mirrors_for_spec=None):
try:
_get_valid_spec_file(
local_specfile_stage.save_filename,
CURRENT_BUILD_CACHE_LAYOUT_VERSION,
FORWARD_COMPAT_BUILD_CACHE_LAYOUT_VERSION,
)
except InvalidMetadataFile as e:
tty.warn(
@@ -1741,7 +1738,7 @@ def download_tarball(spec, unsigned=False, mirrors_for_spec=None):
try:
_get_valid_spec_file(
local_specfile_path, CURRENT_BUILD_CACHE_LAYOUT_VERSION
local_specfile_path, FORWARD_COMPAT_BUILD_CACHE_LAYOUT_VERSION
)
except InvalidMetadataFile as e:
tty.warn(
@@ -2030,11 +2027,12 @@ def _extract_inner_tarball(spec, filename, extract_to, unsigned, remote_checksum
def _tar_strip_component(tar: tarfile.TarFile, prefix: str):
"""Strip the top-level directory `prefix` from the member names in a tarfile."""
"""Yield all members of tarfile that start with given prefix, and strip that prefix (including
symlinks)"""
# Including trailing /, otherwise we end up with absolute paths.
regex = re.compile(re.escape(prefix) + "/*")
# Remove the top-level directory from the member (link)names.
# Only yield members in the package prefix.
# Note: when a tarfile is created, relative in-prefix symlinks are
# expanded to matching member names of tarfile entries. So, we have
# to ensure that those are updated too.
@@ -2042,12 +2040,14 @@ def _tar_strip_component(tar: tarfile.TarFile, prefix: str):
# them.
for m in tar.getmembers():
result = regex.match(m.name)
assert result is not None
if not result:
continue
m.name = m.name[result.end() :]
if m.linkname:
result = regex.match(m.linkname)
if result:
m.linkname = m.linkname[result.end() :]
yield m
def extract_tarball(spec, download_result, unsigned=False, force=False, timer=timer.NULL_TIMER):
@@ -2071,7 +2071,7 @@ def extract_tarball(spec, download_result, unsigned=False, force=False, timer=ti
specfile_path = download_result["specfile_stage"].save_filename
spec_dict, layout_version = _get_valid_spec_file(
specfile_path, CURRENT_BUILD_CACHE_LAYOUT_VERSION
specfile_path, FORWARD_COMPAT_BUILD_CACHE_LAYOUT_VERSION
)
bchecksum = spec_dict["binary_cache_checksum"]
@@ -2090,7 +2090,7 @@ def extract_tarball(spec, download_result, unsigned=False, force=False, timer=ti
_delete_staged_downloads(download_result)
shutil.rmtree(tmpdir)
raise e
elif layout_version == 1:
elif 1 <= layout_version <= 2:
# Newer buildcache layout: the .spack file contains just
# in the install tree, the signature, if it exists, is
# wrapped around the spec.json at the root. If sig verify
@@ -2117,8 +2117,10 @@ def extract_tarball(spec, download_result, unsigned=False, force=False, timer=ti
try:
with closing(tarfile.open(tarfile_path, "r")) as tar:
# Remove install prefix from tarfile to extract directly into spec.prefix
_tar_strip_component(tar, prefix=_ensure_common_prefix(tar))
tar.extractall(path=spec.prefix)
tar.extractall(
path=spec.prefix,
members=_tar_strip_component(tar, prefix=_ensure_common_prefix(tar)),
)
except Exception:
shutil.rmtree(spec.prefix, ignore_errors=True)
_delete_staged_downloads(download_result)
@@ -2153,20 +2155,47 @@ def extract_tarball(spec, download_result, unsigned=False, force=False, timer=ti
def _ensure_common_prefix(tar: tarfile.TarFile) -> str:
# Get the shortest length directory.
common_prefix = min((e.name for e in tar.getmembers() if e.isdir()), key=len, default=None)
# Find the lowest `binary_distribution` file (hard-coded forward slash is on purpose).
binary_distribution = min(
(
e.name
for e in tar.getmembers()
if e.isfile() and e.name.endswith(".spack/binary_distribution")
),
key=len,
default=None,
)
if common_prefix is None:
raise ValueError("Tarball does not contain a common prefix")
if binary_distribution is None:
raise ValueError("Tarball is not a Spack package, missing binary_distribution file")
# Validate that each file starts with the prefix
pkg_path = pathlib.PurePosixPath(binary_distribution).parent.parent
# Even the most ancient Spack versions have required listing the dir of the package itself, so
# guard against broken tarballs where `path.parent.parent` is empty.
if pkg_path == pathlib.PurePosixPath():
raise ValueError("Invalid tarball, missing package prefix dir")
pkg_prefix = str(pkg_path)
# Ensure all tar entries are in the pkg_prefix dir, and if they're not, they should be parent
# dirs of it.
has_prefix = False
for member in tar.getmembers():
if not member.name.startswith(common_prefix):
raise ValueError(
f"Tarball contains file {member.name} outside of prefix {common_prefix}"
)
stripped = member.name.rstrip("/")
if not (
stripped.startswith(pkg_prefix) or member.isdir() and pkg_prefix.startswith(stripped)
):
raise ValueError(f"Tarball contains file {stripped} outside of prefix {pkg_prefix}")
if member.isdir() and stripped == pkg_prefix:
has_prefix = True
return common_prefix
# This is technically not required, but let's be defensive about the existence of the package
# prefix dir.
if not has_prefix:
raise ValueError(f"Tarball does not contain a common prefix {pkg_prefix}")
return pkg_prefix
def install_root_node(spec, unsigned=False, force=False, sha256=None):
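Note: the two helpers above work together: _ensure_common_prefix locates the package prefix by finding the .spack/binary_distribution file, and _tar_strip_component is a generator handed to extractall(members=...), so entries are filtered and renamed on the fly. A minimal standalone sketch of the same members-filter technique (the function, archive, and path names here are hypothetical, not Spack's API):

import re
import tarfile

def strip_prefix_members(tar: tarfile.TarFile, prefix: str):
    """Yield members under `prefix` with that prefix removed from their names."""
    pattern = re.compile(re.escape(prefix) + "/*")
    for member in tar.getmembers():
        match = pattern.match(member.name)
        if not match:
            continue  # skip entries outside the prefix, e.g. listed parent dirs
        member.name = member.name[match.end():]
        # relative in-prefix symlinks are stored with expanded targets,
        # so their linknames need the same treatment
        if member.linkname:
            link_match = pattern.match(member.linkname)
            if link_match:
                member.linkname = member.linkname[link_match.end():]
        yield member

with tarfile.open("example-package.tar.gz") as tar:
    tar.extractall(
        path="/tmp/install-prefix",
        members=strip_prefix_members(tar, "opt/spack/example-package-1.0"),
    )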

View File

@@ -16,7 +16,6 @@
import llnl.util.filesystem as fs
from llnl.util import tty
import spack.platforms
import spack.store
import spack.util.environment
import spack.util.executable
@@ -207,19 +206,17 @@ def _root_spec(spec_str: str) -> str:
"""Add a proper compiler and target to a spec used during bootstrapping.
Args:
spec_str: spec to be bootstrapped. Must be without compiler and target.
spec_str (str): spec to be bootstrapped. Must be without compiler and target.
"""
# Add a compiler requirement to the root spec.
platform = str(spack.platforms.host())
if platform == "darwin":
# Add a proper compiler hint to the root spec. We use GCC for
# everything but MacOS and Windows.
if str(spack.platforms.host()) == "darwin":
spec_str += " %apple-clang"
elif platform == "windows":
elif str(spack.platforms.host()) == "windows":
# TODO (johnwparent): Remove version constraint when clingo patch is up
spec_str += " %msvc@:19.37"
elif platform == "linux":
else:
spec_str += " %gcc"
elif platform == "freebsd":
spec_str += " %clang"
target = archspec.cpu.host().family
spec_str += f" target={target}"

View File

@@ -386,7 +386,7 @@ def ensure_module_importable_or_raise(module: str, abstract_spec: Optional[str]
exception_handler = GroupedExceptionHandler()
for current_config in bootstrapping_sources():
with exception_handler.forward(current_config["name"], Exception):
with exception_handler.forward(current_config["name"]):
source_is_enabled_or_raise(current_config)
current_bootstrapper = create_bootstrapper(current_config)
if current_bootstrapper.try_import(module, abstract_spec):
@@ -441,7 +441,7 @@ def ensure_executables_in_path_or_raise(
exception_handler = GroupedExceptionHandler()
for current_config in bootstrapping_sources():
with exception_handler.forward(current_config["name"], Exception):
with exception_handler.forward(current_config["name"]):
source_is_enabled_or_raise(current_config)
current_bootstrapper = create_bootstrapper(current_config)
if current_bootstrapper.try_search_path(executables, abstract_spec):
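Both loops rely on the same pattern: each bootstrapping source is tried inside a context manager that records, rather than raises, its failure, and a combined error is raised only if every source fails. A generic sketch of that pattern, independent of Spack's GroupedExceptionHandler (all names here are hypothetical):

import contextlib
import traceback

class GroupedErrors:
    def __init__(self):
        self.errors = []

    @contextlib.contextmanager
    def forward(self, context: str):
        try:
            yield
        except Exception:
            self.errors.append((context, traceback.format_exc()))

sources = ["github-actions", "install-from-sources"]
handler = GroupedErrors()
for name in sources:
    with handler.forward(name):
        raise RuntimeError(f"{name} is unreachable")  # pretend this source failed
if len(handler.errors) == len(sources):
    summary = "\n".join(f"{ctx}:\n{tb}" for ctx, tb in handler.errors)
    raise RuntimeError("could not bootstrap from any source:\n" + summary)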

View File

@@ -19,6 +19,7 @@
import spack.tengine
import spack.util.cpus
import spack.util.executable
from spack.environment import depfile
from ._common import _root_spec
from .config import root_path, spec_for_current_python, store_path
@@ -85,9 +86,12 @@ def __init__(self) -> None:
super().__init__(self.environment_root())
def update_installations(self) -> None:
"""Update the installations of this environment."""
log_enabled = tty.is_debug() or tty.is_verbose()
with tty.SuppressOutput(msg_enabled=log_enabled, warn_enabled=log_enabled):
"""Update the installations of this environment.
The update is done using a depfile on Linux and macOS, and using the ``install_all``
method of environments on Windows.
"""
with tty.SuppressOutput(msg_enabled=False, warn_enabled=False):
specs = self.concretize()
if specs:
colorized_specs = [
@@ -96,9 +100,11 @@ def update_installations(self) -> None:
]
tty.msg(f"[BOOTSTRAPPING] Installing dependencies ({', '.join(colorized_specs)})")
self.write(regenerate=False)
with tty.SuppressOutput(msg_enabled=log_enabled, warn_enabled=log_enabled):
if sys.platform == "win32":
self.install_all()
self.write(regenerate=True)
else:
self._install_with_depfile()
self.write(regenerate=True)
def update_syspath_and_environ(self) -> None:
"""Update ``sys.path`` and the PATH, PYTHONPATH environment variables to point to
@@ -116,6 +122,25 @@ def update_syspath_and_environ(self) -> None:
+ [str(x) for x in self.pythonpaths()]
)
def _install_with_depfile(self) -> None:
model = depfile.MakefileModel.from_env(self)
template = spack.tengine.make_environment().get_template(
os.path.join("depfile", "Makefile")
)
makefile = self.environment_root() / "Makefile"
makefile.write_text(template.render(model.to_dict()))
make = spack.util.executable.which("make")
kwargs = {}
if not tty.is_debug():
kwargs = {"output": os.devnull, "error": os.devnull}
make(
"-C",
str(self.environment_root()),
"-j",
str(spack.util.cpus.determine_number_of_jobs(parallel=True)),
**kwargs,
)
def _write_spack_yaml_file(self) -> None:
tty.msg(
"[BOOTSTRAPPING] Spack has missing dependencies, creating a bootstrapping environment"
@@ -136,7 +161,7 @@ def _write_spack_yaml_file(self) -> None:
def isort_root_spec() -> str:
"""Return the root spec used to bootstrap isort"""
return _root_spec("py-isort@5")
return _root_spec("py-isort@4.3.5:")
def mypy_root_spec() -> str:

View File

@@ -66,6 +66,7 @@ def _core_requirements() -> List[RequiredResponseType]:
_core_system_exes = {
"make": _missing("make", "required to build software from sources"),
"patch": _missing("patch", "required to patch source code before building"),
"bash": _missing("bash", "required for Spack compiler wrapper"),
"tar": _missing("tar", "required to manage code archives"),
"gzip": _missing("gzip", "required to compress/decompress code archives"),
"unzip": _missing("unzip", "required to compress/decompress code archives"),

View File

@@ -1333,7 +1333,7 @@ def make_stack(tb, stack=None):
# don't provide context if the code is actually in the base classes.
obj = frame.f_locals["self"]
func = getattr(obj, tb.tb_frame.f_code.co_name, "")
if func:
if func and hasattr(func, "__qualname__"):
typename, *_ = func.__qualname__.partition(".")
if isinstance(obj, CONTEXT_BASES) and typename not in basenames:
break
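The extra hasattr check matters because not every callable exposes __qualname__; functools.partial objects, for example, do not, and the unguarded attribute access would itself raise while building the error context. A minimal demonstration:

import functools

def build_phase():
    pass

wrapped = functools.partial(build_phase)
print(hasattr(build_phase, "__qualname__"))  # True
print(hasattr(wrapped, "__qualname__"))      # False -> AttributeError without the guard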

View File

@@ -9,7 +9,6 @@
import llnl.util.filesystem as fs
import llnl.util.tty as tty
import spack.build_environment
import spack.builder
from .cmake import CMakeBuilder, CMakePackage
@@ -35,11 +34,6 @@ def cmake_cache_option(name, boolean_value, comment="", force=False):
return 'set({0} {1} CACHE BOOL "{2}"{3})\n'.format(name, value, comment, force_str)
def cmake_cache_filepath(name, value, comment=""):
"""Generate a string for a cmake cache variable of type FILEPATH"""
return 'set({0} "{1}" CACHE FILEPATH "{2}")\n'.format(name, value, comment)
class CachedCMakeBuilder(CMakeBuilder):
#: Phases of a Cached CMake package
#: Note: the initconfig phase is used for developer builds as a final phase to stop on
@@ -263,15 +257,6 @@ def initconfig_hardware_entries(self):
entries.append(
cmake_cache_path("HIP_CXX_COMPILER", "{0}".format(self.spec["hip"].hipcc))
)
llvm_bin = spec["llvm-amdgpu"].prefix.bin
llvm_prefix = spec["llvm-amdgpu"].prefix
# Some ROCm systems seem to point to /<path>/rocm-<ver>/ and
# others point to /<path>/rocm-<ver>/llvm
if os.path.basename(os.path.normpath(llvm_prefix)) != "llvm":
llvm_bin = os.path.join(llvm_prefix, "llvm/bin/")
entries.append(
cmake_cache_filepath("CMAKE_HIP_COMPILER", os.path.join(llvm_bin, "clang++"))
)
archs = self.spec.variants["amdgpu_target"].value
if archs[0] != "none":
arch_str = ";".join(archs)
@@ -286,29 +271,13 @@ def initconfig_hardware_entries(self):
def std_initconfig_entries(self):
cmake_prefix_path_env = os.environ["CMAKE_PREFIX_PATH"]
cmake_prefix_path = cmake_prefix_path_env.replace(os.pathsep, ";")
cmake_rpaths_env = spack.build_environment.get_rpaths(self.pkg)
cmake_rpaths_path = ";".join(cmake_rpaths_env)
complete_rpath_list = cmake_rpaths_path
if "SPACK_COMPILER_EXTRA_RPATHS" in os.environ:
spack_extra_rpaths_env = os.environ["SPACK_COMPILER_EXTRA_RPATHS"]
spack_extra_rpaths_path = spack_extra_rpaths_env.replace(os.pathsep, ";")
complete_rpath_list = "{0};{1}".format(complete_rpath_list, spack_extra_rpaths_path)
if "SPACK_COMPILER_IMPLICIT_RPATHS" in os.environ:
spack_implicit_rpaths_env = os.environ["SPACK_COMPILER_IMPLICIT_RPATHS"]
spack_implicit_rpaths_path = spack_implicit_rpaths_env.replace(os.pathsep, ";")
complete_rpath_list = "{0};{1}".format(complete_rpath_list, spack_implicit_rpaths_path)
return [
"#------------------{0}".format("-" * 60),
"# !!!! This is a generated file, edit at own risk !!!!",
"#------------------{0}".format("-" * 60),
"# CMake executable path: {0}".format(self.pkg.spec["cmake"].command.path),
"#------------------{0}\n".format("-" * 60),
cmake_cache_string("CMAKE_PREFIX_PATH", cmake_prefix_path),
cmake_cache_string("CMAKE_INSTALL_RPATH_USE_LINK_PATH", "ON"),
cmake_cache_string("CMAKE_BUILD_RPATH", complete_rpath_list),
cmake_cache_string("CMAKE_INSTALL_RPATH", complete_rpath_list),
cmake_cache_path("CMAKE_PREFIX_PATH", cmake_prefix_path),
self.define_cmake_cache_from_variant("CMAKE_BUILD_TYPE", "build_type"),
]
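For reference, the cache helpers shown above expand to plain set() lines in the generated initconfig file. The values below are made up, the ON/OFF mapping for booleans and the PATH variant's format are assumed by analogy with the FILEPATH helper:

cmake_cache_option("ENABLE_TESTS", True)
#  -> 'set(ENABLE_TESTS ON CACHE BOOL "")\n'
cmake_cache_filepath("CMAKE_HIP_COMPILER", "/opt/rocm/llvm/bin/clang++")
#  -> 'set(CMAKE_HIP_COMPILER "/opt/rocm/llvm/bin/clang++" CACHE FILEPATH "")\n'
cmake_cache_path("CMAKE_PREFIX_PATH", "/opt/spack/view;/usr/local")
#  -> 'set(CMAKE_PREFIX_PATH "/opt/spack/view;/usr/local" CACHE PATH "")\n'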

View File

@@ -1,89 +0,0 @@
# Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import inspect
import llnl.util.filesystem as fs
import spack.builder
import spack.package_base
from spack.directives import build_system, depends_on
from spack.multimethod import when
from ._checks import BaseBuilder, execute_install_time_tests
class CargoPackage(spack.package_base.PackageBase):
"""Specialized class for packages built using a Makefiles."""
#: This attribute is used in UI queries that need to know the build
#: system base class
build_system_class = "CargoPackage"
build_system("cargo")
with when("build_system=cargo"):
depends_on("rust", type="build")
@spack.builder.builder("cargo")
class CargoBuilder(BaseBuilder):
"""The Cargo builder encodes the most common way of building software with
a Rust ``Cargo.toml`` file. It has two phases that can be overridden, if need be:
1. :py:meth:`~.CargoBuilder.build`
2. :py:meth:`~.CargoBuilder.install`
For a finer tuning you may override:
+-----------------------------------------------+----------------------+
| **Method** | **Purpose** |
+===============================================+======================+
| :py:meth:`~.CargoBuilder.build_args` | Specify arguments |
| | to ``cargo install`` |
+-----------------------------------------------+----------------------+
| :py:meth:`~.CargoBuilder.check_args` | Specify arguments |
| | to ``cargo test`` |
+-----------------------------------------------+----------------------+
"""
phases = ("build", "install")
#: Callback names for install-time test
install_time_test_callbacks = ["check"]
@property
def build_directory(self):
"""Return the directory containing the main Cargo.toml."""
return self.pkg.stage.source_path
@property
def build_args(self):
"""Arguments for ``cargo build``."""
return []
@property
def check_args(self):
"""Argument for ``cargo test`` during check phase"""
return []
def build(self, pkg, spec, prefix):
"""Runs ``cargo install`` in the source directory"""
with fs.working_dir(self.build_directory):
inspect.getmodule(pkg).cargo(
"install", "--root", "out", "--path", ".", *self.build_args
)
def install(self, pkg, spec, prefix):
"""Copy build files into package prefix."""
with fs.working_dir(self.build_directory):
fs.install_tree("out", prefix)
spack.builder.run_after("install")(execute_install_time_tests)
def check(self):
"""Run "cargo test"."""
with fs.working_dir(self.build_directory):
inspect.getmodule(self.pkg).cargo("test", *self.check_args)
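A package recipe built on this class only needs the usual metadata; the build and install phases come from the builder. A hypothetical example (name, URL, and checksum are placeholders, not a real Spack package):

from spack.package import *

class Hello(CargoPackage):
    """Hypothetical crate-based application."""

    homepage = "https://example.com/hello"
    url = "https://example.com/hello/archive/v1.0.0.tar.gz"

    version("1.0.0", sha256="0" * 64)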

View File

@@ -136,12 +136,12 @@ def cuda_flags(arch_list):
conflicts("%gcc@11:", when="+cuda ^cuda@:11.4.0")
conflicts("%gcc@11.2:", when="+cuda ^cuda@:11.5")
conflicts("%gcc@12:", when="+cuda ^cuda@:11.8")
conflicts("%gcc@13:", when="+cuda ^cuda@:12.3")
conflicts("%gcc@13:", when="+cuda ^cuda@:12.1")
conflicts("%clang@12:", when="+cuda ^cuda@:11.4.0")
conflicts("%clang@13:", when="+cuda ^cuda@:11.5")
conflicts("%clang@14:", when="+cuda ^cuda@:11.7")
conflicts("%clang@15:", when="+cuda ^cuda@:12.0")
conflicts("%clang@16:", when="+cuda ^cuda@:12.3")
conflicts("%clang@16:", when="+cuda ^cuda@:12.1")
# https://gist.github.com/ax3l/9489132#gistcomment-3860114
conflicts("%gcc@10", when="+cuda ^cuda@:11.4.0")

View File

@@ -1,98 +0,0 @@
# Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import inspect
import llnl.util.filesystem as fs
import spack.builder
import spack.package_base
from spack.directives import build_system, extends
from spack.multimethod import when
from ._checks import BaseBuilder, execute_install_time_tests
class GoPackage(spack.package_base.PackageBase):
"""Specialized class for packages built using the Go toolchain."""
#: This attribute is used in UI queries that need to know the build
#: system base class
build_system_class = "GoPackage"
#: Legacy buildsystem attribute used to deserialize and install old specs
legacy_buildsystem = "go"
build_system("go")
with when("build_system=go"):
# TODO: this seems like it should be depends_on, see
# setup_dependent_build_environment in go for why I kept it like this
extends("go@1.14:", type="build")
@spack.builder.builder("go")
class GoBuilder(BaseBuilder):
"""The Go builder encodes the most common way of building software with
a Go ``go.mod`` file. It has two phases that can be overridden, if need be:
1. :py:meth:`~.GoBuilder.build`
2. :py:meth:`~.GoBuilder.install`
For a finer tuning you may override:
+-----------------------------------------------+--------------------+
| **Method** | **Purpose** |
+===============================================+====================+
| :py:meth:`~.GoBuilder.build_args` | Specify arguments |
| | to ``go build`` |
+-----------------------------------------------+--------------------+
| :py:meth:`~.GoBuilder.check_args` | Specify arguments |
| | to ``go test`` |
+-----------------------------------------------+--------------------+
"""
phases = ("build", "install")
#: Callback names for install-time test
install_time_test_callbacks = ["check"]
def setup_build_environment(self, env):
env.set("GO111MODULE", "on")
env.set("GOTOOLCHAIN", "local")
@property
def build_directory(self):
"""Return the directory containing the main go.mod."""
return self.pkg.stage.source_path
@property
def build_args(self):
"""Arguments for ``go build``."""
# Pass -ldflags "-s -w" by default: -s omits the symbol table and debug info, -w omits DWARF
return ["-ldflags", "-s -w", "-o", f"{self.pkg.name}"]
@property
def check_args(self):
"""Argument for ``go test`` during check phase"""
return []
def build(self, pkg, spec, prefix):
"""Runs ``go build`` in the source directory"""
with fs.working_dir(self.build_directory):
inspect.getmodule(pkg).go("build", *self.build_args)
def install(self, pkg, spec, prefix):
"""Install built binaries into prefix bin."""
with fs.working_dir(self.build_directory):
fs.mkdirp(prefix.bin)
fs.install(pkg.name, prefix.bin)
spack.builder.run_after("install")(execute_install_time_tests)
def check(self):
"""Run ``go test .`` in the source directory"""
with fs.working_dir(self.build_directory):
inspect.getmodule(self.pkg).go("test", *self.check_args)

View File

@@ -7,9 +7,9 @@
import os
import platform
import shutil
from os.path import basename, isdir
from os.path import basename, dirname, isdir
from llnl.util.filesystem import HeaderList, find_libraries, join_path, mkdirp
from llnl.util.filesystem import find_headers, find_libraries, join_path, mkdirp
from llnl.util.link_tree import LinkTree
from spack.directives import conflicts, variant
@@ -55,21 +55,10 @@ def component_dir(self):
"""Subdirectory for this component in the install prefix."""
raise NotImplementedError
@property
def v2_layout_versions(self):
"""Version that implements the v2 directory layout."""
raise NotImplementedError
@property
def v2_layout(self):
"""Returns true if this version implements the v2 directory layout."""
return self.spec.satisfies(self.v2_layout_versions)
@property
def component_prefix(self):
"""Path to component <prefix>/<component>/<version>."""
v = self.spec.version.up_to(2) if self.v2_layout else self.spec.version
return self.prefix.join(self.component_dir).join(str(v))
return self.prefix.join(join_path(self.component_dir, self.spec.version))
@property
def env_script_args(self):
@@ -123,9 +112,8 @@ def install_component(self, installer_path):
shutil.rmtree("/var/intel/installercache", ignore_errors=True)
# Some installers have a bug and do not return an error code when failing
install_dir = self.component_prefix
if not isdir(install_dir):
raise RuntimeError("install failed to directory: {0}".format(install_dir))
if not isdir(join_path(self.prefix, self.component_dir)):
raise RuntimeError("install failed")
def setup_run_environment(self, env):
"""Adds environment variables to the generated module file.
@@ -140,7 +128,7 @@ def setup_run_environment(self, env):
if "~envmods" not in self.spec:
env.extend(
EnvironmentModifications.from_sourcing_file(
self.component_prefix.env.join("vars.sh"), *self.env_script_args
join_path(self.component_prefix, "env", "vars.sh"), *self.env_script_args
)
)
@@ -179,40 +167,16 @@ class IntelOneApiLibraryPackage(IntelOneApiPackage):
"""
def header_directories(self, dirs):
h = HeaderList([])
h.directories = dirs
return h
@property
def headers(self):
return self.header_directories(
[self.component_prefix.include, self.component_prefix.include.join(self.component_dir)]
)
include_path = join_path(self.component_prefix, "include")
return find_headers("*", include_path, recursive=True)
@property
def libs(self):
# for v2_layout all libraries are in the top level, v1 sometimes put them in intel64
return find_libraries("*", root=self.component_prefix.lib, recursive=not self.v2_layout)
class IntelOneApiLibraryPackageWithSdk(IntelOneApiPackage):
"""Base class for Intel oneAPI library packages with SDK components.
Contains some convenient default implementations for libraries
that expose functionality in sdk subdirectories.
Implement the method directly in the package if something
different is needed.
"""
@property
def headers(self):
return self.header_directories([self.component_prefix.sdk.include])
@property
def libs(self):
return find_libraries("*", self.component_prefix.sdk.lib64)
lib_path = join_path(self.component_prefix, "lib", "intel64")
lib_path = lib_path if isdir(lib_path) else dirname(lib_path)
return find_libraries("*", root=lib_path, shared=True, recursive=True)
class IntelOneApiStaticLibraryList:

View File

@@ -6,14 +6,13 @@
import os
import re
import shutil
from typing import Iterable, List, Mapping, Optional
from typing import Optional
import archspec
import llnl.util.filesystem as fs
import llnl.util.lang as lang
import llnl.util.tty as tty
from llnl.util.filesystem import HeaderList, LibraryList
import spack.builder
import spack.config
@@ -26,18 +25,14 @@
from spack.directives import build_system, depends_on, extends, maintainers
from spack.error import NoHeadersError, NoLibrariesError
from spack.install_test import test_part
from spack.spec import Spec
from spack.util.prefix import Prefix
from ._checks import BaseBuilder, execute_install_time_tests
def _flatten_dict(dictionary: Mapping[str, object]) -> Iterable[str]:
def _flatten_dict(dictionary):
"""Iterable that yields KEY=VALUE paths through a dictionary.
Args:
dictionary: Possibly nested dictionary of arbitrary keys and values.
Yields:
A single path through the dictionary.
"""
@@ -55,7 +50,7 @@ class PythonExtension(spack.package_base.PackageBase):
maintainers("adamjstewart")
@property
def import_modules(self) -> Iterable[str]:
def import_modules(self):
"""Names of modules that the Python package provides.
These are used to test whether or not the installation succeeded.
@@ -70,7 +65,7 @@ def import_modules(self) -> Iterable[str]:
detected, this property can be overridden by the package.
Returns:
List of strings of module names.
list: list of strings of module names
"""
modules = []
pkg = self.spec["python"].package
@@ -107,14 +102,14 @@ def import_modules(self) -> Iterable[str]:
return modules
@property
def skip_modules(self) -> Iterable[str]:
def skip_modules(self):
"""Names of modules that should be skipped when running tests.
These are a subset of import_modules. If a module has submodules,
they are skipped as well (meaning a.b is skipped if a is contained).
Returns:
List of strings of module names.
list: list of strings of module names
"""
return []
@@ -190,12 +185,12 @@ def remove_files_from_view(self, view, merge_map):
view.remove_files(to_remove)
def test_imports(self) -> None:
def test_imports(self):
"""Attempts to import modules of the installed package."""
# Make sure we are importing the installed modules,
# not the ones in the source directory
python = inspect.getmodule(self).python # type: ignore[union-attr]
python = inspect.getmodule(self).python
for module in self.import_modules:
with test_part(
self,
@@ -320,27 +315,24 @@ class PythonPackage(PythonExtension):
py_namespace: Optional[str] = None
@lang.classproperty
def homepage(cls) -> Optional[str]: # type: ignore[override]
def homepage(cls):
if cls.pypi:
name = cls.pypi.split("/")[0]
return f"https://pypi.org/project/{name}/"
return None
return "https://pypi.org/project/" + name + "/"
@lang.classproperty
def url(cls) -> Optional[str]:
def url(cls):
if cls.pypi:
return f"https://files.pythonhosted.org/packages/source/{cls.pypi[0]}/{cls.pypi}"
return None
return "https://files.pythonhosted.org/packages/source/" + cls.pypi[0] + "/" + cls.pypi
@lang.classproperty
def list_url(cls) -> Optional[str]: # type: ignore[override]
def list_url(cls):
if cls.pypi:
name = cls.pypi.split("/")[0]
return f"https://pypi.org/simple/{name}/"
return None
return "https://pypi.org/simple/" + name + "/"
@property
def headers(self) -> HeaderList:
def headers(self):
"""Discover header files in platlib."""
# Remove py- prefix in package name
@@ -358,7 +350,7 @@ def headers(self) -> HeaderList:
raise NoHeadersError(msg.format(self.spec.name, include, platlib))
@property
def libs(self) -> LibraryList:
def libs(self):
"""Discover libraries in platlib."""
# Remove py- prefix in package name
@@ -392,7 +384,7 @@ class PythonPipBuilder(BaseBuilder):
install_time_test_callbacks = ["test"]
@staticmethod
def std_args(cls) -> List[str]:
def std_args(cls):
return [
# Verbose
"-vvv",
@@ -417,7 +409,7 @@ def std_args(cls) -> List[str]:
]
@property
def build_directory(self) -> str:
def build_directory(self):
"""The root directory of the Python package.
This is usually the directory containing one of the following files:
@@ -428,51 +420,51 @@ def build_directory(self) -> str:
"""
return self.pkg.stage.source_path
def config_settings(self, spec: Spec, prefix: Prefix) -> Mapping[str, object]:
def config_settings(self, spec, prefix):
"""Configuration settings to be passed to the PEP 517 build backend.
Requires pip 22.1 or newer for keys that appear only a single time,
or pip 23.1 or newer if the same key appears multiple times.
Args:
spec: Build spec.
prefix: Installation prefix.
spec (spack.spec.Spec): build spec
prefix (spack.util.prefix.Prefix): installation prefix
Returns:
Possibly nested dictionary of KEY, VALUE settings.
dict: Possibly nested dictionary of KEY, VALUE settings
"""
return {}
def install_options(self, spec: Spec, prefix: Prefix) -> Iterable[str]:
def install_options(self, spec, prefix):
"""Extra arguments to be supplied to the setup.py install command.
Requires pip 23.0 or older.
Args:
spec: Build spec.
prefix: Installation prefix.
spec (spack.spec.Spec): build spec
prefix (spack.util.prefix.Prefix): installation prefix
Returns:
List of options.
list: list of options
"""
return []
def global_options(self, spec: Spec, prefix: Prefix) -> Iterable[str]:
def global_options(self, spec, prefix):
"""Extra global options to be supplied to the setup.py call before the install
or bdist_wheel command.
Deprecated in pip 23.1.
Args:
spec: Build spec.
prefix: Installation prefix.
spec (spack.spec.Spec): build spec
prefix (spack.util.prefix.Prefix): installation prefix
Returns:
List of options.
list: list of options
"""
return []
def install(self, pkg: PythonPackage, spec: Spec, prefix: Prefix) -> None:
def install(self, pkg, spec, prefix):
"""Install everything from build directory."""
args = PythonPipBuilder.std_args(pkg) + [f"--prefix={prefix}"]
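In practice, config_settings is the hook packages use to pass backend options through pip's --config-settings flag (pip 22.1+), with _flatten_dict turning the possibly nested dictionary into KEY=VALUE strings. A rough sketch of the intent; the package keys and the exact flattening shown are illustrative, not a verbatim trace:

def config_settings(self, spec, prefix):
    # e.g. forward options to a meson-python build backend
    return {"builddir": "build", "setup-args": {"-Dblas": "openblas"}}

# roughly flattens to, and is passed to pip as:
#   --config-settings=builddir=build
#   --config-settings=setup-args=-Dblas=openblas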

View File

@@ -46,22 +46,7 @@
from spack.reporters import CDash, CDashConfiguration
from spack.reporters.cdash import build_stamp as cdash_build_stamp
# See https://docs.gitlab.com/ee/ci/yaml/#retry for descriptions of conditions
JOB_RETRY_CONDITIONS = [
# "always",
"unknown_failure",
"script_failure",
"api_failure",
"stuck_or_timeout_failure",
"runner_system_failure",
"runner_unsupported",
"stale_schedule",
# "job_execution_timeout",
"archived_failure",
"unmet_prerequisites",
"scheduler_failure",
"data_integrity_failure",
]
JOB_RETRY_CONDITIONS = ["always"]
TEMP_STORAGE_MIRROR_NAME = "ci_temporary_mirror"
SPACK_RESERVED_TAGS = ["public", "protected", "notary"]
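These conditions end up in the retry: block of each generated GitLab job; schematically (the surrounding job fields are illustrative, not the exact generator output):

job = {
    "script": ["spack ci rebuild"],
    "retry": {"max": 2, "when": JOB_RETRY_CONDITIONS},
}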

View File

@@ -6,7 +6,7 @@
import llnl.util.tty as tty
import spack.cmd
from spack.cmd.common import arguments
import spack.cmd.common.arguments as arguments
description = "add a spec to an environment"
section = "environments"

View File

@@ -15,13 +15,13 @@
import spack.bootstrap
import spack.bootstrap.config
import spack.bootstrap.core
import spack.cmd.common.arguments
import spack.config
import spack.main
import spack.mirror
import spack.spec
import spack.stage
import spack.util.path
from spack.cmd.common import arguments
description = "manage bootstrap configuration"
section = "system"
@@ -68,8 +68,12 @@
def _add_scope_option(parser):
scopes = spack.config.scopes()
parser.add_argument(
"--scope", action=arguments.ConfigScope, help="configuration scope to read/modify"
"--scope",
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
help="configuration scope to read/modify",
)
@@ -102,7 +106,7 @@ def setup_parser(subparser):
disable.add_argument("name", help="name of the source to be disabled", nargs="?", default=None)
reset = sp.add_parser("reset", help="reset bootstrapping configuration to Spack defaults")
arguments.add_common_arguments(reset, ["yes_to_all"])
spack.cmd.common.arguments.add_common_arguments(reset, ["yes_to_all"])
root = sp.add_parser("root", help="get/set the root bootstrap directory")
_add_scope_option(root)

View File

@@ -21,6 +21,7 @@
import spack.binary_distribution as bindist
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.config
import spack.environment as ev
import spack.error
@@ -39,7 +40,6 @@
import spack.util.web as web_util
from spack.build_environment import determine_number_of_jobs
from spack.cmd import display_specs
from spack.cmd.common import arguments
from spack.oci.image import (
Digest,
ImageReference,
@@ -182,22 +182,23 @@ def setup_parser(subparser: argparse.ArgumentParser):
)
# used to construct scope arguments below
scopes = spack.config.scopes()
check.add_argument(
"--scope",
action=arguments.ConfigScope,
default=lambda: spack.config.default_modify_scope(),
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_modify_scope(),
help="configuration scope containing mirrors to check",
)
# Unfortunately there are 3 ways to do the same thing here:
check_specs = check.add_mutually_exclusive_group()
check_specs.add_argument(
check_spec_or_specfile = check.add_mutually_exclusive_group(required=True)
check_spec_or_specfile.add_argument(
"-s", "--spec", help="check single spec instead of release specs file"
)
check_specs.add_argument(
check_spec_or_specfile.add_argument(
"--spec-file",
help="check single spec from json or yaml file instead of release specs file",
)
arguments.add_common_arguments(check, ["specs"])
check.set_defaults(func=check_fn)
@@ -815,24 +816,15 @@ def check_fn(args: argparse.Namespace):
exit code is non-zero, then at least one of the indicated specs needs to be rebuilt
"""
if args.spec_file:
specs_arg = (
args.spec_file if os.path.sep in args.spec_file else os.path.join(".", args.spec_file)
)
tty.warn(
"The flag `--spec-file` is deprecated and will be removed in Spack 0.22. "
f"Use `spack buildcache check {specs_arg}` instead."
"Use --spec instead."
)
elif args.spec:
specs_arg = args.spec
tty.warn(
"The flag `--spec` is deprecated and will be removed in Spack 0.23. "
f"Use `spack buildcache check {specs_arg}` instead."
)
else:
specs_arg = args.specs
if specs_arg:
specs = _matching_specs(spack.cmd.parse_specs(specs_arg))
specs = spack.cmd.parse_specs(args.spec or args.spec_file)
if specs:
specs = _matching_specs(specs)
else:
specs = spack.cmd.require_active_env("buildcache check").all_specs()

View File

@@ -4,7 +4,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import spack.cmd
from spack.cmd.common import arguments
import spack.cmd.common.arguments as arguments
description = "change an existing spec in an environment"
section = "environments"

View File

@@ -16,7 +16,6 @@
import spack.cmd.buildcache as buildcache
import spack.config as cfg
import spack.environment as ev
import spack.environment.depfile
import spack.hash_types as ht
import spack.mirror
import spack.util.gpg as gpg_util
@@ -607,9 +606,7 @@ def ci_rebuild(args):
"SPACK_INSTALL_FLAGS={}".format(args_to_string(deps_install_args)),
"-j$(nproc)",
"install-deps/{}".format(
spack.environment.depfile.MakefileSpec(job_spec).safe_format(
"{name}-{version}-{hash}"
)
ev.depfile.MakefileSpec(job_spec).safe_format("{name}-{version}-{hash}")
),
],
spack_cmd + ["install"] + root_install_args,

View File

@@ -12,13 +12,13 @@
import spack.bootstrap
import spack.caches
import spack.cmd.common.arguments as arguments
import spack.cmd.test
import spack.config
import spack.repo
import spack.stage
import spack.store
import spack.util.path
from spack.cmd.common import arguments
from spack.paths import lib_path, var_path
description = "remove temporary build files and/or downloaded archives"

View File

@@ -124,33 +124,6 @@ def __call__(self, parser, namespace, values, option_string=None):
setattr(namespace, self.dest, deptype)
class ConfigScope(argparse.Action):
"""Pick the currently configured config scopes."""
def __init__(self, *args, **kwargs) -> None:
kwargs.setdefault("metavar", spack.config.SCOPES_METAVAR)
super().__init__(*args, **kwargs)
@property
def default(self):
return self._default() if callable(self._default) else self._default
@default.setter
def default(self, value):
self._default = value
@property
def choices(self):
return spack.config.scopes().keys()
@choices.setter
def choices(self, value):
pass
def __call__(self, parser, namespace, values, option_string=None):
setattr(namespace, self.dest, values)
def _cdash_reporter(namespace):
"""Helper function to create a CDash reporter. This function gets an early reference to the
argparse namespace under construction, so it can later use it to create the object.
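The ConfigScope action above relies on laziness: argparse reads action.default and action.choices at parse time, so exposing them as properties defers the call into spack.config until the command actually runs. A self-contained sketch of the same trick with hypothetical names:

import argparse

def available_scopes():
    # stand-in for spack.config.scopes(); pretend this is only valid
    # once configuration has been fully initialized
    return ["defaults", "system", "site", "user"]

class LazyScopeAction(argparse.Action):
    @property
    def default(self):
        return self._default() if callable(self._default) else self._default

    @default.setter
    def default(self, value):
        self._default = value

    @property
    def choices(self):
        return available_scopes()

    @choices.setter
    def choices(self, value):
        pass  # always computed on demand

    def __call__(self, parser, namespace, values, option_string=None):
        setattr(namespace, self.dest, values)

parser = argparse.ArgumentParser()
parser.add_argument("--scope", action=LazyScopeAction, default=lambda: available_scopes()[-1])
print(parser.parse_args([]).scope)                   # "user", computed at parse time
print(parser.parse_args(["--scope", "site"]).scope)  # "site"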

View File

@@ -8,13 +8,13 @@
import llnl.util.tty as tty
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.deptypes as dt
import spack.error
import spack.paths
import spack.spec
import spack.store
from spack import build_environment, traverse
from spack.cmd.common import arguments
from spack.context import Context
from spack.util.environment import dump_environment, pickle_environment

View File

@@ -14,7 +14,6 @@
import spack.compilers
import spack.config
import spack.spec
from spack.cmd.common import arguments
description = "manage compilers"
section = "system"
@@ -24,6 +23,8 @@
def setup_parser(subparser):
sp = subparser.add_subparsers(metavar="SUBCOMMAND", dest="compiler_command")
scopes = spack.config.scopes()
# Find
find_parser = sp.add_parser(
"find",
@@ -46,8 +47,9 @@ def setup_parser(subparser):
find_parser.add_argument("add_paths", nargs=argparse.REMAINDER)
find_parser.add_argument(
"--scope",
action=arguments.ConfigScope,
default=lambda: spack.config.default_modify_scope("compilers"),
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_modify_scope("compilers"),
help="configuration scope to modify",
)
@@ -58,15 +60,20 @@ def setup_parser(subparser):
)
remove_parser.add_argument("compiler_spec")
remove_parser.add_argument(
"--scope", action=arguments.ConfigScope, default=None, help="configuration scope to modify"
"--scope",
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
default=None,
help="configuration scope to modify",
)
# List
list_parser = sp.add_parser("list", help="list available compilers")
list_parser.add_argument(
"--scope",
action=arguments.ConfigScope,
default=lambda: spack.config.default_list_scope(),
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_list_scope(),
help="configuration scope to read from",
)
@@ -75,8 +82,9 @@ def setup_parser(subparser):
info_parser.add_argument("compiler_spec")
info_parser.add_argument(
"--scope",
action=arguments.ConfigScope,
default=lambda: spack.config.default_list_scope(),
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_list_scope(),
help="configuration scope to read from",
)

View File

@@ -3,7 +3,7 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.cmd.common import arguments
import spack.config
from spack.cmd.compiler import compiler_list
description = "list available compilers"
@@ -12,8 +12,13 @@
def setup_parser(subparser):
scopes = spack.config.scopes()
subparser.add_argument(
"--scope", action=arguments.ConfigScope, help="configuration scope to read/modify"
"--scope",
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
help="configuration scope to read/modify",
)

View File

@@ -10,6 +10,7 @@
import llnl.util.filesystem as fs
import llnl.util.tty as tty
import spack.cmd.common.arguments
import spack.config
import spack.environment as ev
import spack.repo
@@ -17,7 +18,6 @@
import spack.schema.packages
import spack.store
import spack.util.spack_yaml as syaml
from spack.cmd.common import arguments
from spack.util.editor import editor
description = "get and set configuration options"
@@ -26,9 +26,14 @@
def setup_parser(subparser):
scopes = spack.config.scopes()
# User can only choose one
subparser.add_argument(
"--scope", action=arguments.ConfigScope, help="configuration scope to read/modify"
"--scope",
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
help="configuration scope to read/modify",
)
sp = subparser.add_subparsers(metavar="SUBCOMMAND", dest="config_command")
@@ -96,13 +101,13 @@ def setup_parser(subparser):
setup_parser.add_parser = add_parser
update = sp.add_parser("update", help="update configuration files to the latest format")
arguments.add_common_arguments(update, ["yes_to_all"])
spack.cmd.common.arguments.add_common_arguments(update, ["yes_to_all"])
update.add_argument("section", help="section to update")
revert = sp.add_parser(
"revert", help="revert configuration files to their state before update"
)
arguments.add_common_arguments(revert, ["yes_to_all"])
spack.cmd.common.arguments.add_common_arguments(revert, ["yes_to_all"])
revert.add_argument("section", help="section to update")

View File

@@ -172,14 +172,6 @@ def configure_args(self):
return args"""
class CargoPackageTemplate(PackageTemplate):
"""Provides appropriate overrides for cargo-based packages"""
base_class_name = "CargoPackage"
body_def = ""
class CMakePackageTemplate(PackageTemplate):
"""Provides appropriate overrides for CMake-based packages"""
@@ -194,14 +186,6 @@ def cmake_args(self):
return args"""
class GoPackageTemplate(PackageTemplate):
"""Provides appropriate overrides for Go-module-based packages"""
base_class_name = "GoPackage"
body_def = ""
class LuaPackageTemplate(PackageTemplate):
"""Provides appropriate overrides for LuaRocks-based packages"""
@@ -591,30 +575,28 @@ def __init__(self, name, *args, **kwargs):
templates = {
"autoreconf": AutoreconfPackageTemplate,
"autotools": AutotoolsPackageTemplate,
"bazel": BazelPackageTemplate,
"bundle": BundlePackageTemplate,
"cargo": CargoPackageTemplate,
"autoreconf": AutoreconfPackageTemplate,
"cmake": CMakePackageTemplate,
"generic": PackageTemplate,
"go": GoPackageTemplate,
"intel": IntelPackageTemplate,
"lua": LuaPackageTemplate,
"makefile": MakefilePackageTemplate,
"maven": MavenPackageTemplate,
"meson": MesonPackageTemplate,
"octave": OctavePackageTemplate,
"perlbuild": PerlbuildPackageTemplate,
"perlmake": PerlmakePackageTemplate,
"python": PythonPackageTemplate,
"bundle": BundlePackageTemplate,
"qmake": QMakePackageTemplate,
"maven": MavenPackageTemplate,
"scons": SconsPackageTemplate,
"waf": WafPackageTemplate,
"bazel": BazelPackageTemplate,
"python": PythonPackageTemplate,
"r": RPackageTemplate,
"racket": RacketPackageTemplate,
"perlmake": PerlmakePackageTemplate,
"perlbuild": PerlbuildPackageTemplate,
"octave": OctavePackageTemplate,
"ruby": RubyPackageTemplate,
"scons": SconsPackageTemplate,
"makefile": MakefilePackageTemplate,
"intel": IntelPackageTemplate,
"meson": MesonPackageTemplate,
"lua": LuaPackageTemplate,
"sip": SIPPackageTemplate,
"waf": WafPackageTemplate,
"generic": PackageTemplate,
}
@@ -697,8 +679,6 @@ def __call__(self, stage, url):
clues = [
(r"/CMakeLists\.txt$", "cmake"),
(r"/NAMESPACE$", "r"),
(r"/Cargo\.toml$", "cargo"),
(r"/go\.mod$", "go"),
(r"/configure$", "autotools"),
(r"/configure\.(in|ac)$", "autoreconf"),
(r"/Makefile\.am$", "autoreconf"),

View File

@@ -10,10 +10,10 @@
import llnl.util.tty as tty
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.cmd.common.confirmation as confirmation
import spack.environment as ev
import spack.spec
from spack.cmd.common import arguments
description = "remove specs from the concretized lockfile of an environment"
section = "environments"

View File

@@ -9,11 +9,11 @@
from llnl.util.tty.colify import colify
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.environment as ev
import spack.package_base
import spack.repo
import spack.store
from spack.cmd.common import arguments
description = "show dependencies of a package"
section = "basic"

View File

@@ -9,10 +9,10 @@
from llnl.util.tty.colify import colify
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.environment as ev
import spack.repo
import spack.store
from spack.cmd.common import arguments
description = "show packages that depend on another"
section = "basic"

View File

@@ -20,9 +20,9 @@
from llnl.util.symlink import symlink
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.environment as ev
import spack.store
from spack.cmd.common import arguments
from spack.database import InstallStatuses
from spack.error import SpackError

View File

@@ -9,9 +9,9 @@
import llnl.util.tty as tty
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.config
import spack.repo
from spack.cmd.common import arguments
description = "developer build: build from code in current working directory"
section = "build"

View File

@@ -8,10 +8,10 @@
import llnl.util.tty as tty
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.spec
import spack.util.path
import spack.version
from spack.cmd.common import arguments
from spack.error import SpackError
description = "add a spec to an environment's dev-build information"

View File

@@ -10,11 +10,11 @@
from llnl.util.tty.color import cprint, get_color_when
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.environment as ev
import spack.solver.asp as asp
import spack.util.environment
import spack.util.spack_json as sjson
from spack.cmd.common import arguments
description = "compare two specs"
section = "basic"
@@ -200,8 +200,6 @@ def diff(parser, args):
specs = []
for spec in spack.cmd.parse_specs(args.specs):
# If the spec has a hash, check it before disambiguating
spec.replace_hash()
if spec.concrete:
specs.append(spec)
else:

View File

@@ -20,6 +20,7 @@
import spack.cmd
import spack.cmd.common
import spack.cmd.common.arguments
import spack.cmd.common.arguments as arguments
import spack.cmd.install
import spack.cmd.modules
import spack.cmd.uninstall
@@ -30,7 +31,6 @@
import spack.schema.env
import spack.spec
import spack.tengine
from spack.cmd.common import arguments
from spack.util.environment import EnvironmentModifications
description = "manage virtual environments"

View File

@@ -10,10 +10,10 @@
from llnl.util.tty.colify import colify
import spack.cmd as cmd
import spack.cmd.common.arguments as arguments
import spack.environment as ev
import spack.repo
import spack.store
from spack.cmd.common import arguments
description = "list extensions for package"
section = "extensions"

View File

@@ -14,12 +14,12 @@
import spack
import spack.cmd
import spack.cmd.common.arguments
import spack.config
import spack.cray_manifest as cray_manifest
import spack.detection
import spack.error
import spack.util.environment
from spack.cmd.common import arguments
description = "manage external packages in Spack configuration"
section = "config"
@@ -29,6 +29,8 @@
def setup_parser(subparser):
sp = subparser.add_subparsers(metavar="SUBCOMMAND", dest="external_command")
scopes = spack.config.scopes()
find_parser = sp.add_parser("find", help="add external packages to packages.yaml")
find_parser.add_argument(
"--not-buildable",
@@ -46,14 +48,15 @@ def setup_parser(subparser):
)
find_parser.add_argument(
"--scope",
action=arguments.ConfigScope,
default=lambda: spack.config.default_modify_scope("packages"),
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_modify_scope("packages"),
help="configuration scope to modify",
)
find_parser.add_argument(
"--all", action="store_true", help="search for all packages that Spack knows about"
)
arguments.add_common_arguments(find_parser, ["tags", "jobs"])
spack.cmd.common.arguments.add_common_arguments(find_parser, ["tags", "jobs"])
find_parser.add_argument("packages", nargs=argparse.REMAINDER)
find_parser.epilog = (
'The search is by default on packages tagged with the "build-tools" or '

View File

@@ -6,11 +6,11 @@
import llnl.util.tty as tty
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.config
import spack.environment as ev
import spack.repo
import spack.traverse
from spack.cmd.common import arguments
description = "fetch archives for packages"
section = "build"

View File

@@ -12,9 +12,9 @@
import spack.bootstrap
import spack.cmd as cmd
import spack.cmd.common.arguments as arguments
import spack.environment as ev
import spack.repo
from spack.cmd.common import arguments
from spack.database import InstallStatuses
description = "list and search installed packages"

View File

@@ -7,11 +7,11 @@
import os
import spack.binary_distribution
import spack.cmd.common.arguments as arguments
import spack.mirror
import spack.paths
import spack.util.gpg
import spack.util.url
from spack.cmd.common import arguments
description = "handle GPG actions for spack"
section = "packaging"

View File

@@ -5,10 +5,10 @@
from llnl.util import tty
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.config
import spack.environment as ev
import spack.store
from spack.cmd.common import arguments
from spack.graph import DAGWithDependencyTypes, SimpleDAG, graph_ascii, graph_dot, static_graph_dot
description = "generate graphs of package dependency relationships"

View File

@@ -11,13 +11,13 @@
import llnl.util.tty.color as color
from llnl.util.tty.colify import colify
import spack.cmd.common.arguments as arguments
import spack.deptypes as dt
import spack.fetch_strategy as fs
import spack.install_test
import spack.repo
import spack.spec
import spack.version
from spack.cmd.common import arguments
from spack.package_base import preferred_version
description = "get detailed information on a particular package"
@@ -139,7 +139,7 @@ def lines(self):
yield " " + self.fmt % t
def print_dependencies(pkg, args):
def print_dependencies(pkg):
"""output build, link, and run package dependencies"""
for deptype in ("build", "link", "run"):
@@ -152,7 +152,7 @@ def print_dependencies(pkg, args):
color.cprint(" None")
def print_detectable(pkg, args):
def print_detectable(pkg):
"""output information on external detection"""
color.cprint("")
@@ -180,7 +180,7 @@ def print_detectable(pkg, args):
color.cprint(" False")
def print_maintainers(pkg, args):
def print_maintainers(pkg):
"""output package maintainers"""
if len(pkg.maintainers) > 0:
@@ -189,7 +189,7 @@ def print_maintainers(pkg, args):
color.cprint(section_title("Maintainers: ") + mnt)
def print_phases(pkg, args):
def print_phases(pkg):
"""output installation phases"""
if hasattr(pkg.builder, "phases") and pkg.builder.phases:
@@ -201,7 +201,7 @@ def print_phases(pkg, args):
color.cprint(phase_str)
def print_tags(pkg, args):
def print_tags(pkg):
"""output package tags"""
color.cprint("")
@@ -213,7 +213,7 @@ def print_tags(pkg, args):
color.cprint(" None")
def print_tests(pkg, args):
def print_tests(pkg):
"""output relevant build-time and stand-alone tests"""
# Some built-in base packages (e.g., Autotools) define callback (e.g.,
@@ -407,15 +407,12 @@ def print_variants_by_name(pkg):
sys.stdout.write("\n")
def print_variants(pkg, args):
def print_variants(pkg):
"""output variants"""
if args.variants_by_name:
print_variants_by_name(pkg)
else:
print_variants_grouped_by_when(pkg)
print_variants_grouped_by_when(pkg)
def print_versions(pkg, args):
def print_versions(pkg):
"""output versions"""
color.cprint("")
@@ -468,7 +465,7 @@ def get_url(version):
color.cprint(line)
def print_virtuals(pkg, args):
def print_virtuals(pkg):
"""output virtual packages"""
color.cprint("")
@@ -491,7 +488,7 @@ def print_virtuals(pkg, args):
color.cprint(" None")
def print_licenses(pkg, args):
def print_licenses(pkg):
"""Output the licenses of the project."""
color.cprint("")
@@ -526,13 +523,17 @@ def info(parser, args):
if getattr(pkg, "homepage"):
color.cprint(section_title("Homepage: ") + pkg.homepage)
_print_variants = (
print_variants_by_name if args.variants_by_name else print_variants_grouped_by_when
)
# Now output optional information in expected order
sections = [
(args.all or args.maintainers, print_maintainers),
(args.all or args.detectable, print_detectable),
(args.all or args.tags, print_tags),
(args.all or not args.no_versions, print_versions),
(args.all or not args.no_variants, print_variants),
(args.all or not args.no_variants, _print_variants),
(args.all or args.phases, print_phases),
(args.all or not args.no_dependencies, print_dependencies),
(args.all or args.virtuals, print_virtuals),
@@ -541,6 +542,6 @@ def info(parser, args):
]
for print_it, func in sections:
if print_it:
func(pkg, args)
func(pkg)
color.cprint("")

View File

@@ -14,6 +14,7 @@
import spack.build_environment
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.config
import spack.environment as ev
import spack.fetch_strategy
@@ -22,7 +23,6 @@
import spack.report
import spack.spec
import spack.store
from spack.cmd.common import arguments
from spack.error import SpackError
from spack.installer import PackageInstaller

View File

@@ -15,9 +15,9 @@
import llnl.util.tty as tty
from llnl.util.tty.colify import colify
import spack.cmd.common.arguments as arguments
import spack.deptypes as dt
import spack.repo
from spack.cmd.common import arguments
from spack.version import VersionList
description = "list and search available packages"

View File

@@ -8,12 +8,12 @@
import llnl.util.tty as tty
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.cmd.find
import spack.environment as ev
import spack.store
import spack.user_environment as uenv
import spack.util.environment
from spack.cmd.common import arguments
description = "add package to the user environment"
section = "user environment"

View File

@@ -9,11 +9,11 @@
import spack.builder
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.environment as ev
import spack.paths
import spack.repo
import spack.stage
from spack.cmd.common import arguments
description = "print out locations of packages and spack directories"
section = "basic"

View File

@@ -8,11 +8,11 @@
from llnl.util import tty
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.error
import spack.package_base
import spack.repo
import spack.store
from spack.cmd.common import arguments
from spack.database import InstallStatuses
description = "mark packages as explicitly or implicitly installed"

View File

@@ -11,6 +11,7 @@
import spack.caches
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.concretize
import spack.config
import spack.environment as ev
@@ -19,7 +20,6 @@
import spack.spec
import spack.util.path
import spack.util.web as web_util
from spack.cmd.common import arguments
from spack.error import SpackError
description = "manage mirrors (source and binary)"
@@ -88,14 +88,18 @@ def setup_parser(subparser):
"--mirror-url", metavar="mirror_url", type=str, help="find mirror to destroy by url"
)
# used to construct scope arguments below
scopes = spack.config.scopes()
# Add
add_parser = sp.add_parser("add", help=mirror_add.__doc__)
add_parser.add_argument("name", help="mnemonic name for mirror", metavar="mirror")
add_parser.add_argument("url", help="url of mirror directory from 'spack mirror create'")
add_parser.add_argument(
"--scope",
action=arguments.ConfigScope,
default=lambda: spack.config.default_modify_scope(),
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_modify_scope(),
help="configuration scope to modify",
)
add_parser.add_argument(
@@ -113,8 +117,9 @@ def setup_parser(subparser):
remove_parser.add_argument("name", help="mnemonic name for mirror", metavar="mirror")
remove_parser.add_argument(
"--scope",
action=arguments.ConfigScope,
default=lambda: spack.config.default_modify_scope(),
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_modify_scope(),
help="configuration scope to modify",
)
@@ -131,8 +136,9 @@ def setup_parser(subparser):
)
set_url_parser.add_argument(
"--scope",
action=arguments.ConfigScope,
default=lambda: spack.config.default_modify_scope(),
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_modify_scope(),
help="configuration scope to modify",
)
arguments.add_connection_args(set_url_parser, False)
@@ -159,8 +165,9 @@ def setup_parser(subparser):
set_parser.add_argument("--url", help="url of mirror directory from 'spack mirror create'")
set_parser.add_argument(
"--scope",
action=arguments.ConfigScope,
default=lambda: spack.config.default_modify_scope(),
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_modify_scope(),
help="configuration scope to modify",
)
arguments.add_connection_args(set_parser, False)
@@ -169,8 +176,9 @@ def setup_parser(subparser):
list_parser = sp.add_parser("list", help=mirror_list.__doc__)
list_parser.add_argument(
"--scope",
action=arguments.ConfigScope,
default=lambda: spack.config.default_list_scope(),
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_list_scope(),
help="configuration scope to read from",
)

View File

@@ -14,11 +14,11 @@
from llnl.util.tty import color
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.config
import spack.modules
import spack.modules.common
import spack.repo
from spack.cmd.common import arguments
description = "manipulate module files"
section = "environment"

View File

@@ -6,12 +6,12 @@
import llnl.util.tty as tty
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.config
import spack.environment as ev
import spack.package_base
import spack.repo
import spack.traverse
from spack.cmd.common import arguments
description = "patch expanded archive sources in preparation for install"
section = "build"

View File

@@ -12,11 +12,11 @@
from llnl.util.tty.colify import colify
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.paths
import spack.repo
import spack.util.executable as exe
import spack.util.package_hash as ph
from spack.cmd.common import arguments
description = "query packages associated with particular git revisions"
section = "developer"

View File

@@ -6,7 +6,7 @@
import llnl.util.tty as tty
import spack.cmd
from spack.cmd.common import arguments
import spack.cmd.common.arguments as arguments
description = "remove specs from an environment"
section = "environments"

View File

@@ -11,7 +11,6 @@
import spack.config
import spack.repo
import spack.util.path
from spack.cmd.common import arguments
description = "manage package source repositories"
section = "config"
@@ -20,6 +19,7 @@
def setup_parser(subparser):
sp = subparser.add_subparsers(metavar="SUBCOMMAND", dest="repo_command")
scopes = spack.config.scopes()
# Create
create_parser = sp.add_parser("create", help=repo_create.__doc__)
@@ -43,8 +43,9 @@ def setup_parser(subparser):
list_parser = sp.add_parser("list", help=repo_list.__doc__)
list_parser.add_argument(
"--scope",
action=arguments.ConfigScope,
default=lambda: spack.config.default_list_scope(),
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_list_scope(),
help="configuration scope to read from",
)
@@ -53,8 +54,9 @@ def setup_parser(subparser):
add_parser.add_argument("path", help="path to a Spack package repository directory")
add_parser.add_argument(
"--scope",
action=arguments.ConfigScope,
default=lambda: spack.config.default_modify_scope(),
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_modify_scope(),
help="configuration scope to modify",
)
@@ -65,8 +67,9 @@ def setup_parser(subparser):
)
remove_parser.add_argument(
"--scope",
action=arguments.ConfigScope,
default=lambda: spack.config.default_modify_scope(),
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_modify_scope(),
help="configuration scope to modify",
)

View File

@@ -6,8 +6,8 @@
import llnl.util.tty as tty
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.repo
from spack.cmd.common import arguments
description = "revert checked out package source code"
section = "build"

View File

@@ -12,12 +12,12 @@
import spack
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.config
import spack.environment
import spack.hash_types as ht
import spack.package_base
import spack.solver.asp as asp
from spack.cmd.common import arguments
description = "concretize a specs using an ASP solver"
section = "developer"

View File

@@ -10,11 +10,11 @@
import spack
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.environment as ev
import spack.hash_types as ht
import spack.spec
import spack.store
from spack.cmd.common import arguments
description = "show what would be installed, given a spec"
section = "build"

View File

@@ -8,13 +8,13 @@
import llnl.util.tty as tty
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.config
import spack.environment as ev
import spack.package_base
import spack.repo
import spack.stage
import spack.traverse
from spack.cmd.common import arguments
description = "expand downloaded archive in preparation for install"
section = "build"

@@ -15,12 +15,12 @@
from llnl.util.tty import colify
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.environment as ev
import spack.install_test
import spack.package_base
import spack.repo
import spack.report
from spack.cmd.common import arguments
description = "run spack's tests for an install"
section = "admin"

@@ -10,11 +10,11 @@
from llnl.util.filesystem import working_dir
import spack
import spack.cmd.common.arguments as arguments
import spack.config
import spack.paths
import spack.util.git
import spack.util.gpg
from spack.cmd.common import arguments
from spack.util.spack_yaml import syaml_dict
description = "set up spack for our tutorial (WARNING: modifies config!)"

@@ -6,7 +6,7 @@
import llnl.util.tty as tty
import spack.cmd
from spack.cmd.common import arguments
import spack.cmd.common.arguments as arguments
description = "remove specs from an environment"
section = "environments"

@@ -10,13 +10,13 @@
from llnl.util.tty.colify import colify
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.cmd.common.confirmation as confirmation
import spack.environment as ev
import spack.package_base
import spack.spec
import spack.store
import spack.traverse as traverse
from spack.cmd.common import arguments
from spack.database import InstallStatuses
description = "remove installed packages"

@@ -227,7 +227,9 @@ def unit_test(parser, args, unknown_args):
# has been used, then test that extension.
pytest_root = spack.paths.spack_root
if args.extension:
pytest_root = spack.extensions.load_extension(args.extension)
target = args.extension
extensions = spack.extensions.get_extension_paths()
pytest_root = spack.extensions.path_for_extension(target, *extensions)
# pytest.ini lives in the root of the spack repository.
with llnl.util.filesystem.working_dir(pytest_root):
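The unit-test hunk above replaces a single `load_extension` call with an explicit lookup via `get_extension_paths` and `path_for_extension`. The sketch below mirrors that lookup pattern under the assumption that extensions live in directories named `spack-<name>`; the helpers are simplified stand-ins, not the real `spack.extensions` module.

```python
# Simplified stand-ins for the extension-path lookup pattern shown above.
import os
from typing import List


def get_extension_paths() -> List[str]:
    # Stand-in: Spack reads extension paths from its configuration.
    return ["/opt/ext/spack-myext", "/home/user/spack-scripting"]


def path_for_extension(target: str, *paths: str) -> str:
    """Return the configured path whose directory name matches the extension name."""
    for path in paths:
        if os.path.basename(path) == f"spack-{target}":
            return path
    raise KeyError(f"no configured extension named {target!r}")


extensions = get_extension_paths()
pytest_root = path_for_extension("scripting", *extensions)
print(pytest_root)  # /home/user/spack-scripting
```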

@@ -7,10 +7,10 @@
import sys
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.error
import spack.user_environment as uenv
import spack.util.environment
from spack.cmd.common import arguments
description = "remove package from the user environment"
section = "user environment"

@@ -8,9 +8,9 @@
import llnl.util.tty as tty
from llnl.util.tty.colify import colify
import spack.cmd.common.arguments as arguments
import spack.repo
import spack.spec
from spack.cmd.common import arguments
from spack.version import infinity_versions, ver
description = "list available versions of a package"

@@ -40,6 +40,7 @@ def debug_flags(self):
"-gdwarf-5",
"-gline-tables-only",
"-gmodules",
"-gz",
"-g",
]

@@ -20,16 +20,16 @@ def __init__(self, *args, **kwargs):
self.version_argument = "-V"
# Subclasses use possible names of C compiler
cc_names = ["craycc"]
cc_names = ["craycc", "cc"]
# Subclasses use possible names of C++ compiler
cxx_names = ["crayCC"]
cxx_names = ["crayCC", "CC"]
# Subclasses use possible names of Fortran 77 compiler
f77_names = ["crayftn"]
f77_names = ["crayftn", "ftn"]
# Subclasses use possible names of Fortran 90 compiler
fc_names = ["crayftn"]
fc_names = ["crayftn", "ftn"]
# MacPorts builds gcc versions with prefixes and -mp-X.Y suffixes.
suffixes = [r"-mp-\d\.\d"]

@@ -55,6 +55,7 @@ def debug_flags(self):
"-gdwarf-5",
"-gline-tables-only",
"-gmodules",
"-gz",
"-g",
]

@@ -85,14 +85,6 @@ def cxx14_flag(self):
else:
return "-std=c++14"
@property
def cxx17_flag(self):
# https://www.intel.com/content/www/us/en/developer/articles/news/c17-features-supported-by-c-compiler.html
if self.real_version < Version("19"):
raise UnsupportedCompilerFlag(self, "the C++17 standard", "cxx17_flag", "< 19")
else:
return "-std=c++17"
@property
def c99_flag(self):
if self.real_version < Version("12"):

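The Intel compiler hunk above shows the version-gated flag pattern used throughout Spack's compiler classes: a property either returns the flag or raises when the compiler release predates support for it. A minimal sketch of the pattern follows, using a plain tuple for the version and a bare exception class instead of Spack's `Version` and `UnsupportedCompilerFlag` types.

```python
# Minimal sketch of a version-gated compiler flag property (not Spack's Compiler class).
class UnsupportedCompilerFlag(Exception):
    pass


class IntelLikeCompiler:
    def __init__(self, real_version: tuple):
        self.real_version = real_version  # e.g. (19, 0, 1)

    @property
    def cxx17_flag(self) -> str:
        # Per the hunk above, C++17 support requires version 19 or newer.
        if self.real_version < (19,):
            raise UnsupportedCompilerFlag("C++17 requires compiler version 19 or newer")
        return "-std=c++17"


print(IntelLikeCompiler((2021, 4)).cxx17_flag)  # -std=c++17
```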
Some files were not shown because too many files have changed in this diff.