Compare commits


636 Commits

Author SHA1 Message Date
Martin Diehl
bc4ecccfbf damask: add version 3.0.0-alpha8 (#41444)
* damask 3.0.0-alpha8

* ensuring correct Python versions for py-damask
2023-12-22 14:26:52 -06:00
Adam J. Stewart
9ee4876eb2 py-lightning: add v2.1.3 (#41809) 2023-12-22 14:13:45 -06:00
Patrick Broderick
d96f8efb9c FFTX final ECP update (#41812)
* build(deps): bump actions/setup-python from 2.2.2 to 2.3.1

Bumps [actions/setup-python](https://github.com/actions/setup-python) from 2.2.2 to 2.3.1.
- [Release notes](https://github.com/actions/setup-python/releases)
- [Commits](dc73133d4d...f382193329)

---
updated-dependencies:
- dependency-name: actions/setup-python
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>

* build(deps): bump docker/build-push-action from 2.8.0 to 2.9.0

Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 2.8.0 to 2.9.0.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](1814d3dfb3...7f9d37fa54)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>

* build(deps): bump actions/upload-artifact from 2 to 3

Bumps [actions/upload-artifact](https://github.com/actions/upload-artifact) from 2 to 3.
- [Release notes](https://github.com/actions/upload-artifact/releases)
- [Commits](https://github.com/actions/upload-artifact/compare/v2...v3)

---
updated-dependencies:
- dependency-name: actions/upload-artifact
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>

* build(deps): bump actions/download-artifact from 2 to 3

Bumps [actions/download-artifact](https://github.com/actions/download-artifact) from 2 to 3.
- [Release notes](https://github.com/actions/download-artifact/releases)
- [Commits](https://github.com/actions/download-artifact/compare/v2...v3)

---
updated-dependencies:
- dependency-name: actions/download-artifact
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>

* build(deps): bump actions/setup-python from 2 to 3.1.2

Bumps [actions/setup-python](https://github.com/actions/setup-python) from 2 to 3.1.2.
- [Release notes](https://github.com/actions/setup-python/releases)
- [Commits](https://github.com/actions/setup-python/compare/v2...98f2ad02fd48d057ee3b4d4f66525b231c3e52b6)

---
updated-dependencies:
- dependency-name: actions/setup-python
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>

* build(deps): bump codecov/codecov-action from 2.1.0 to 3.1.0

Bumps [codecov/codecov-action](https://github.com/codecov/codecov-action) from 2.1.0 to 3.1.0.
- [Release notes](https://github.com/codecov/codecov-action/releases)
- [Changelog](https://github.com/codecov/codecov-action/blob/master/CHANGELOG.md)
- [Commits](f32b3a3741...81cd2dc814)

---
updated-dependencies:
- dependency-name: codecov/codecov-action
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>

* build(deps): bump actions/checkout from 2 to 3.0.2

Bumps [actions/checkout](https://github.com/actions/checkout) from 2 to 3.0.2.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](https://github.com/actions/checkout/compare/v2...2541b1294d2704b0964813337f33b291d3f8596b)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>

* build(deps): bump docker/build-push-action from 2.10.0 to 4.0.0

Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 2.10.0 to 4.0.0.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](ac9327eae2...3b5e8027fc)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>

* build(deps): bump docker/setup-buildx-action from 2.2.1 to 2.4.1

Bumps [docker/setup-buildx-action](https://github.com/docker/setup-buildx-action) from 2.2.1 to 2.4.1.
- [Release notes](https://github.com/docker/setup-buildx-action/releases)
- [Commits](8c0edbc76e...f03ac48505)

---
updated-dependencies:
- dependency-name: docker/setup-buildx-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>

* build(deps): bump docutils from 0.18.1 to 0.20.1 in /lib/spack/docs

Bumps [docutils](https://docutils.sourceforge.io/) from 0.18.1 to 0.20.1.

---
updated-dependencies:
- dependency-name: docutils
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>

* Bump mypy from 1.5.0 to 1.5.1 in /lib/spack/docs

Bumps [mypy](https://github.com/python/mypy) from 1.5.0 to 1.5.1.
- [Commits](https://github.com/python/mypy/compare/v1.5.0...v1.5.1)

---
updated-dependencies:
- dependency-name: mypy
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>

* Bump sphinx-rtd-theme from 1.2.2 to 1.3.0 in /lib/spack/docs

Bumps [sphinx-rtd-theme](https://github.com/readthedocs/sphinx_rtd_theme) from 1.2.2 to 1.3.0.
- [Changelog](https://github.com/readthedocs/sphinx_rtd_theme/blob/master/docs/changelog.rst)
- [Commits](https://github.com/readthedocs/sphinx_rtd_theme/compare/1.2.2...1.3.0)

---
updated-dependencies:
- dependency-name: sphinx-rtd-theme
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>

* Bump actions/upload-artifact from 3.1.2 to 3.1.3

Bumps [actions/upload-artifact](https://github.com/actions/upload-artifact) from 3.1.2 to 3.1.3.
- [Release notes](https://github.com/actions/upload-artifact/releases)
- [Commits](0b7f8abb15...a8a3f3ad30)

---
updated-dependencies:
- dependency-name: actions/upload-artifact
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>

* Bump docker/build-push-action from 4.1.1 to 5.0.0

Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 4.1.1 to 5.0.0.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](2eb1c1961a...0565240e2d)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>

* Bump docker/setup-qemu-action from 2.2.0 to 3.0.0

Bumps [docker/setup-qemu-action](https://github.com/docker/setup-qemu-action) from 2.2.0 to 3.0.0.
- [Release notes](https://github.com/docker/setup-qemu-action/releases)
- [Commits](2b82ce82d5...68827325e0)

---
updated-dependencies:
- dependency-name: docker/setup-qemu-action
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>

* Bump docker/setup-buildx-action from 2.9.1 to 3.0.0

Bumps [docker/setup-buildx-action](https://github.com/docker/setup-buildx-action) from 2.9.1 to 3.0.0.
- [Release notes](https://github.com/docker/setup-buildx-action/releases)
- [Commits](4c0219f9ac...f95db51fdd)

---
updated-dependencies:
- dependency-name: docker/setup-buildx-action
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>

* Bump actions/checkout from 3.5.3 to 4.1.0

Bumps [actions/checkout](https://github.com/actions/checkout) from 3.5.3 to 4.1.0.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](c85c95e3d7...8ade135a41)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>

* Bump pytest from 7.4.0 to 7.4.2 in /lib/spack/docs

Bumps [pytest](https://github.com/pytest-dev/pytest) from 7.4.0 to 7.4.2.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/main/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/7.4.0...7.4.2)

---
updated-dependencies:
- dependency-name: pytest
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>

* Bump sphinx from 6.2.1 to 7.2.6 in /lib/spack/docs

Bumps [sphinx](https://github.com/sphinx-doc/sphinx) from 6.2.1 to 7.2.6.
- [Release notes](https://github.com/sphinx-doc/sphinx/releases)
- [Changelog](https://github.com/sphinx-doc/sphinx/blob/master/CHANGES.rst)
- [Commits](https://github.com/sphinx-doc/sphinx/compare/v6.2.1...v7.2.6)

---
updated-dependencies:
- dependency-name: sphinx
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>

* Update tarball and version info for Final ECP versions of FFTX/Spiral repos

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-12-22 14:12:35 -06:00
Peter Scheibel
a2dc11acd3 update an everywhere-but-windows check to include freebsd (#41819) 2023-12-22 14:11:04 -06:00
Alec Scott
2f0df0131c Fix a couple typos in the docs (#41822) 2023-12-22 14:08:53 -06:00
Adam J. Stewart
dd8941abc9 py-keras: add v3.0.2 (#41827) 2023-12-22 14:07:09 -06:00
Adam J. Stewart
b341030a0f py-mypy: add v1.8.0 (#41831) 2023-12-22 14:05:43 -06:00
Mark Olesen
1d6cea6af2 openfoam: add version 2312 (#41828) 2023-12-22 14:04:02 -06:00
Wileam Y. Phan
327a7a4031 Add PAPI 7.0.1 and 7.1.0 (#38443)
* Add PAPI 7.0.1

* Add comment about skipping PAPI 7.0.0

* Add patch to avoid adding Intel ifort/ifx flag on Cray ftn

* Modify patch to include Cray-specific flags

* Adjust recipe to always apply patch for 7.0.1

* Expand Cray compiler checks in patch

* Forgot to update recipe

* Adjust recipe so it looks for hipcc in the correct path

* Revert "Adjust recipe so it looks for hipcc in the correct path"

This reverts commit 0db3df4fe2.

* Patch HIP_PATH to work with Spack-built HIP

* Patch LDFLAGS with llvm-amdgpu path

* Forgot the depends_on line

* libomptarget only builds with clang

* Try a self-consistent build of llvm-amdgpu

* Try making llvm-amdgpu depend on llvm for llvmoffloadarch library

* Update prereq to use rocm-openmp-extras instead

* Refactor llvm-amdgpu to use a version dict

* Fix typo

* Hack to exclude older versions without matching rocm-openmp-extras

* Add PAPI 7.1.0

* Revert changes to llvm-amdgpu

* Fix PAPI 7.1.0 checksum
2023-12-22 13:58:06 -06:00
Alec Scott
6ac75f47e8 Use white logo on dark backgrounds (#41836) 2023-12-22 12:47:48 -07:00
Alec Scott
843346ce1b Update README.md formatting (#41813)
* Update README.md formatting

* Add docs and bootstrapping status and left align

* Update vertical badge spacing

* Update vertical logo spacing
2023-12-22 08:09:54 -08:00
Adam J. Stewart
23f03966b4 Remove deprecated versions from packages (#41031)
For now, this only includes packages that I personally maintain.

Notable removals:

* Anaconda 2
* Catalyst
* Ancient numpy/scipy
* Ancient PyTorch
* Ancient Bazel/TF
2023-12-22 08:01:33 -08:00
Tim Fuller
4540980337 Fix variant initialization logic to allow proper handling of values="*" (#40406)
Co-authored-by: psakiev <psakiev@sandia.gov>
Co-authored-by: tjfulle <tjfulle@users.noreply.github.com>
2023-12-22 16:10:43 +01:00
Laurent Aphecetche
ec9d08e71e llvm: fix llvm@14 build with apple-clang-15 (#40191)
* llvm: fix llvm@14 build with apple-clang-15

* fix formatting

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-12-22 05:03:23 -07:00
Simon Pintarelli
397c066464 costa: add missing dependency (#41829) 2023-12-22 12:44:18 +01:00
Adam J. Stewart
4d1b5d6a88 GDAL: add v3.8.2 (#41795) 2023-12-22 10:11:11 +01:00
pelesh
0cae943b5c ReSolve: Add version 0.99.1 (#41802)
* ReSolve: Add version 0.99.1

* [@spackbot] updating style on behalf of pelesh

---------

Co-authored-by: pelesh <pelesh@users.noreply.github.com>
2023-12-22 10:07:30 +01:00
Sam Gillingham
78c6c607db kealib: add version 1.5.3 (#41804) 2023-12-22 10:02:07 +01:00
Todd Gamblin
0da1fae709 Consolidate definition of Spack's extra sys.path components (#41816)
To work properly, Spack requires a few directories from its repository to be added to
`sys.path`. Previously these were buried in `spack_installable.main.main()`, but it's
sometimes useful to get the paths separately, e.g., if you want to set up your own
functioning spack environment.

With this change, adding the paths is much simpler:

```python
import sys

from spack_installable.main import get_spack_sys_paths

sys.path[:0] = get_spack_sys_paths(spack_prefix)
```

- [x] Add `get_spack_sys_paths()` method with extra paths in order.
- [x] Refactor `spack_installable.main.main()` to use it.
2023-12-21 16:25:12 -08:00
John W. Parent
94fc2314f1 sqlite: add NMake build system for Windows (#41761) 2023-12-21 15:18:01 -07:00
John W. Parent
05761de8c7 Improve error reporting when Clingo install is broken (#41181)
With an improper/incomplete/broken installation of Clingo, it can be
importable but not have any of the expected attributes. Improve error
reporting in this case.
2023-12-21 13:56:09 -08:00
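
A minimal sketch of the kind of defensive check described in the commit above (the attribute probed and the message are assumptions, not the code from #41181):

```python
# A broken or incomplete clingo installation may import fine yet be missing
# attributes a working install must have, such as `Control`.
try:
    import clingo

    clingo.Control  # probe an attribute expected from a healthy install
except (ImportError, AttributeError) as exc:
    raise RuntimeError(f"clingo installation is broken or incomplete: {exc}") from exc
```
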
John W. Parent
ecdf3ff297 Tcl: add nmake system for Windows (#41759) 2023-12-21 13:15:49 -08:00
Massimiliano Culpo
ea7e3e4f9f Compilers can inject first order rules into the solver
* Restore PackageBase class, and modify only ASP

  This prevents a noticeable slowdown in concretization
  due to the number of directives involved.

* Fix issue with 'clang' being preferred to 'gcc',
  due to runtime version weights

* Constraints on runtimes are declared by compilers

  The declaration of available runtime versions, and of
  their compatibility constraints are in the associated
  compiler class.

Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2023-12-21 12:22:58 -08:00
Harmen Stoppels
8371bb4e19 gcc-runtime: add separate package for gcc runtime libs
The gcc-runtime package adds a separate node for gcc's dynamic runtime
libraries.

This should help with:

1. binary caches where rpaths for compiler support libs cannot be
   relocated because the compiler is missing on the target system
2. creating "minimal" container images

The package is versioned like `gcc` (in principle it could be
unversioned, but Spack doesn't always guarantee not mixing compilers)
2023-12-21 12:22:58 -08:00
Todd Gamblin
0a5f2fc94d specs: Better error messages for badly quoted specs (#41805)
If you are calling Spack from the python API, you might have written something like this
before #41529:

```
find = SpackCommand("find")
find('--format={name}', 'gromacs', '+rocm', 'amdgpu_target="gfx90a"')
```

But with the breaking change in #41529, you should write:

```
find = SpackCommand("find")
find('--format={name}', 'gromacs', '+rocm', 'amdgpu_target=gfx90a')
```

Note that we don't need quotes in Python strings, and that this is what would come in
via argv if you typed a quoted variant on the CLI.

The error messages for strings like this are not great -- you get something like this:

```
==> No package matches the query: gromacs+rocm amdgpu_target="gfx90a"
```

Which doesn't indicate that the issue might be your quoting. This is because we were
simply outputting the argv we got, instead of using spec.format() to output the error
message. This PR fixes such errors to use `spec.format()` and to look like this:

```
==> No package matches the query: gromacs+rocm amdgpu_target='"gfx90a"'
```

So users should have an easier time understanding that Spack considers the variant value
to contain quotes here.

- [x] update ConstraintAction to store parsed Specs
- [x] refactor commands to display formatted parsed Specs instead of raw input
2023-12-21 09:06:06 -08:00
Andrey Perestoronin
45b2c207db intel-oneapi-compilers and intel-oneapi-ccl: added new version to packages (#41807)
* added new packages

* compiler package

* fix link in ccl

* fix other links in ccl
2023-12-21 08:23:08 -07:00
Harmen Stoppels
e7f897f959 ci: use "strong preference" idiom for compilers (#41806)
to avoid duplication of conflicts / requirements in config
2023-12-21 12:50:35 +01:00
Harmen Stoppels
1aaab97a16 Only reuse externals when configured (#41707)
Users expect that changes to the externals sections in packages.yaml config apply immediately, but reuse concretization caused this not to be the case. With this commit, the concretizer is only allowed to reuse externals previously imported from config if identical config exists.
2023-12-20 19:21:15 +00:00
Christopher Christofi
3053e701c0 fix attribute error in perl build-system (#41628) 2023-12-20 09:42:45 -08:00
Massimiliano Culpo
20572fb87b Add missing import to packages (#41791) 2023-12-20 09:33:12 -07:00
Harmen Stoppels
7e2e063979 containers.rst: small docs improvement (#41792) 2023-12-20 11:54:41 +01:00
Harmen Stoppels
16e27ba4a6 spack buildcache push --tag: create container image with multiple roots (#41077)
This PR adds a flag `--tag/-t` to `buildcache push`, which you can use like

```
$ spack mirror add my-oci-registry oci://example.com/hello/world
$ spack -e my_env buildcache push --base-image ubuntu:22.04 --tag my_custom_tag my-oci-registry
```

and lets users ship a full, installed environment as a minimal container image where each image layer is one Spack package, on top of a base image of choice. The image can then be used as

```
$ docker run -it --rm example.com/hello/world:my_custom_tag
```

Apart from environments, users can also pick arbitrary installed spec from their database, for instance:

```
$ spack buildcache push --base-image ubuntu:22.04 --tag some_specs my-oci-registry gcc@12 cmake
$ docker run -it --rm example.com/hello/world:some_specs
```

It has many advantages over `spack containerize`:

1. No external tools required (`docker`, `buildah`, ...)
2. Creates images from locally installed Spack packages (No need to rebuild inside `docker build`, where troubleshooting build failures is notoriously hard)
3. No need for multistage builds (Spack just tarballs existing installations of runtime deps)
4. Reduced storage size / composability: when pushing multiple environments with common specs, container image layers are shared.
5. Automatic build cache: later `spack install` of the env elsewhere speeds up since the containerized environment is a build cache
2023-12-20 11:31:41 +01:00
Massimiliano Culpo
2fda288cc5 Fujitsu packages: require %fj (#41755)
These packages were written before the "requires" directive, and so they
conflict with all compilers except Fujitsu to express that they
_require_ `%fj`.
2023-12-20 11:15:36 +01:00
Massimiliano Culpo
9986652b27 Add detection tests for XL compilers (#41743) 2023-12-20 11:15:15 +01:00
Chris Marsh
fd46923216 GDAL: Ensure a spack libproj is used instead of a system libproj (#41785)
* Ensure a spack libproj is used instead of a system libproj when libproj < 8.
spack/spack/issues/41299

* Fix style as per ci-bot

* Fix style as per ci-bot

* Ensure 3.5:3.8.

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-12-19 23:43:20 -07:00
Andrey Perestoronin
bb2975b7f1 intel-oneapi-compilers 2024.0.2 (#41778)
* new compiler packages

* Fix ifort version number

---------

Co-authored-by: Robert Cohn <rscohn2@gmail.com>
2023-12-19 23:08:20 -07:00
Annop Wongwathanarat
1168f19e60 Require target=x86_64 for some packages (#41633)
This resolves issue #41148
2023-12-19 17:50:49 -07:00
Peter Scheibel
5d50ad3941 "spack diff": add ignore option for dependencies (#41711)
* add trim function to `Spec` and `--ignore` option to 'spack diff'

Allows user to compare two specs while ignoring the sub-DAG of a particular dependency, e.g.

spack diff --ignore=mpi --ignore=zlib trilinos/abcdef trilinos/fedcba

to focus on differences closer to the root of the software stack
2023-12-19 16:37:44 -08:00
kwryankrattiger
a43156a861 CI: Disable downloading artifacts from upstream jobs (#41432)
* CI: Disable downloading artifacts from upstream jobs

* CI: Default .base-jobs are `when:manual`
2023-12-19 15:53:28 -07:00
Harmen Stoppels
ec2729706b environment_modifications_for_specs: do not mutate spec.prefix (#41737)
Sometimes env variables computed in `setup_run_environment` depend on checks
against files in `spec.prefix`, but Spack temporarily projects `spec.prefix` to
the view.

This is problematic for two reasons:

1. Some packages iterate over `<prefix>/bin`: they expect only the current
   package's executables, but find all linked in the view, leading to false
   positives.
2. Some packages test for `os.path.islink(...)`, which is always true in a view

`gcc` is an example that does both.

This PR lets Spack compute the environment modifications using the original
prefix, and projects to the view afterwards
2023-12-19 23:33:16 +01:00
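
As a hedged illustration of the failure mode described in the commit above (hypothetical package code, not the actual `gcc` recipe), a `setup_run_environment` like this misbehaves when `spec.prefix` is projected to a view:

```python
import os

from spack.package import *


class Mypkg(Package):
    """Hypothetical package illustrating the issue."""

    def setup_run_environment(self, env):
        # With spec.prefix projected to the view, os.listdir() returns every
        # executable linked into the view (false positives), and
        # os.path.islink() is True for all of them.
        own = [
            f
            for f in os.listdir(self.prefix.bin)
            if not os.path.islink(os.path.join(self.prefix.bin, f))
        ]
        if own:
            env.set("MYPKG_DEFAULT_EXE", os.path.join(self.prefix.bin, own[0]))
```
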
Dom Heinzeller
494d3f9002 Skip 'icc.patch' in var/spack/repos/builtin/packages/py-gevent/package.py for py-gevent@23.7.0+ (#41568) 2023-12-19 12:44:29 -06:00
Harmen Stoppels
4f8b856145 e4s: add julia (#41768) 2023-12-19 18:17:54 +01:00
Massimiliano Culpo
0eca79e7e4 Add an audit to prevent virtual packages with variants specified (#41747)
Currently, a virtual spec is composed of just a name and a version. When a virtual spec contains other components, such as variants, Spack won't emit warnings or errors but will silently drop them - which is unexpected by users.
2023-12-19 18:05:33 +01:00
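
A hedged example of the pattern this audit now reports (hypothetical package, not taken from the PR):

```python
from spack.package import *


class Demo(Package):
    """Hypothetical package with a variant on a virtual dependency."""

    # 'mpi' is a virtual package; earlier Spack versions silently dropped the
    # '+cuda' variant here, which the new audit now reports.
    depends_on("mpi+cuda")
```
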
Ben Wibking
f245bde772 adios2: fix build failure in 2.7.1 (#41753) 2023-12-19 17:58:05 +01:00
Mikael Simberg
4aee067bb0 umpire: backport -fcompare-debug-second flag removal (#41506) 2023-12-19 17:24:52 +01:00
Martin Aumüller
cc25a0e561 ffmpeg: mostly build fixes (#41050) 2023-12-19 17:23:23 +01:00
Aiden Grossman
3f063153f0 openblas: add patches to build with clang (#39138) 2023-12-19 17:22:29 +01:00
Dr Marco Claudio De La Pierre
aa350a4ed1 removing deprecated recipes tower-agent and tower-cli, as nf-prefixed recipes are available (#41576)
Signed-off-by: Dr Marco Claudio De La Pierre <marco.delapierre@seqera.io>
2023-12-19 17:16:26 +01:00
Brian Van Essen
e36bee41a0 lbann: relax the requirement on protobuf (#41591) 2023-12-19 17:16:04 +01:00
dependabot[bot]
138d0c7a13 build(deps): bump black from 23.11.0 to 23.12.0 in /lib/spack/docs (#41615)
Bumps [black](https://github.com/psf/black) from 23.11.0 to 23.12.0.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/23.11.0...23.12.0)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-12-19 17:14:05 +01:00
Thomas-Ulrich
a688479564 easi: specify better the impalajit dependency (#41637) 2023-12-19 17:08:22 +01:00
Wouter Deconinck
5ead4c2d56 pcre: ensure consistency between autotools and cmake builds (#41644)
Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2023-12-19 16:57:59 +01:00
Richard Berger
2e18fbbdeb legion: do not set HIP_PATH env variable (#41660)
* legion: do not set HIP_PATH env variable

* flecsi: workaround Legion CMake for +rocm
2023-12-19 16:14:19 +01:00
Howard Pritchard
02eafeee03 openmpi: allow external libevent in general case (#41686)
add an internal-libevent variant as an opt-out.

related to #41549

Signed-off-by: Howard Pritchard <howardp@lanl.gov>
2023-12-19 14:54:25 +01:00
wspear
812a43621b tau: v2.33.1 and later require otf2 v3 (#41691) 2023-12-19 12:20:47 +01:00
Richard Berger
0fe338b526 legion: inject correct mpicc to embedded GASnet slingshot11 config (#41701) 2023-12-19 12:20:05 +01:00
Tom Payerle
3dc02e55e6 ufs-weather-model: add build dependency (#41724) 2023-12-19 04:18:48 -07:00
Adam J. Stewart
7023edb37c PyTorch: update ecosystem (#41713) 2023-12-19 12:18:25 +01:00
James Beal
f1fdaca345 samtools: add v1.19 (#41634)
Co-authored-by: James Beal <jb23@sanger.ac.uk>
2023-12-19 12:13:57 +01:00
Mikael Simberg
d4454e54dc ut: add v2.0.0 and v2.0.1 (#41771) 2023-12-19 04:13:33 -07:00
Paul Kuberry
969718d176 xyce: remove CMake test for all compilers (#41679) 2023-12-19 12:10:16 +01:00
Jack Morrison
0a9179fddb intel-mpi-benchmarks: add v2021.7, v2021.6, v2021.5, v2021.4 (#41730) 2023-12-19 11:42:13 +01:00
Pramod Kumbhar
b5b0a76991 creduce: fix build of @develop (#41258) 2023-12-19 11:30:42 +01:00
Christopher Christofi
59b39f3eba uthash: add new package (#41732) 2023-12-19 11:26:13 +01:00
Wouter Deconinck
7a0c4e8017 acts: new versions 31.* (#41733)
This adds three new versions in the 31.* series. Release notes of 31.0.0 at https://github.com/acts-project/acts/releases/tag/v31.0.0. No changes to the CMakeLists.txt files that need addressing in the package recipe.

The only new feature I'm a bit concerned about is https://github.com/acts-project/acts/pull/2626, which replaces testing for C++20 concepts support by the feature-testing macro `__cpp_concepts`, which is also a C++20 feature. So technically we now should require `cxxstd=20` even though Acts itself still allows (and defaults to) 17. Judging by https://en.cppreference.com/w/cpp/compiler_support/20, the support for feature-testing macros was added very early by most compilers.
2023-12-19 11:13:07 +01:00
Thomas Madlener
1ddf4ee6ba whizard: fix support for building with hepmc output (#41538) 2023-12-19 11:12:27 +01:00
dependabot[bot]
12d0507cb7 build(deps): bump black in /.github/workflows/style (#41616)
Bumps [black](https://github.com/psf/black) from 23.11.0 to 23.12.0.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/23.11.0...23.12.0)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-12-19 10:47:57 +01:00
Wouter Deconinck
cf99912352 flexiblas: explicitly set SYSCONFDIR (#41748)
As of CMake 3.4, [GNUInstallDirs](https://cmake.org/cmake/help/latest/module/GNUInstallDirs.html) treats `SYSCONFDIR` differently for a prefix that starts with `/opt`: "the SYSCONFDIR value etc becomes /etc/opt/...". In the case of flexiblas, that results in failing attempts to write files to a system directory.

Since [flexiblas version 1](0f2d2c7659 (diff-1e7de1ae2d059d21e1dd75d5812d5a34b0222cef273b7c3a2af62eb747f9d20aR16)), we can override SYSCONFDIR with our own defines.
2023-12-19 10:47:16 +01:00
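
A minimal sketch of that kind of override in a CMake-based recipe (the exact define name and value used by the flexiblas package may differ):

```python
from spack.package import *


class Flexiblas(CMakePackage):
    """Sketch only: keep SYSCONFDIR inside the install prefix."""

    def cmake_args(self):
        # GNUInstallDirs maps SYSCONFDIR to /etc/opt/... for /opt prefixes;
        # overriding it avoids writes to a system directory.
        return [self.define("SYSCONFDIR", self.prefix.etc)]
```
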
snehring
9723fe88f5 rebayes: add v1.2.2 (#41749) 2023-12-19 10:44:16 +01:00
Christopher Christofi
2439ff56a5 kalign: add v3.4.0 (#41758)
Co-authored-by: Rocco Meli <r.meli@bluemail.ch>
2023-12-19 10:35:23 +01:00
Massimiliano Culpo
2ef8d09fc7 spack config get/blame: with no args, show entire config
This PR changes the default behavior of `spack config get` and `spack config blame`
to print a flattened version of the entire spack configuration, including any active 
environment, if the commands are invoked with no section arguments.

The new behavior is used in Gitlab CI to help debug CI configuration, but it can also
be useful when asking for more information in issues, or when simply debugging Spack.
2023-12-19 01:26:53 -08:00
Mosè Giordano
e5e767b300 julia: set compatibility with suite-sparse (#41754) 2023-12-19 10:17:06 +01:00
Rocco Meli
1c6b38f36d gnina: add version 1.1 (#41762) 2023-12-19 09:39:46 +01:00
Arne Becker
091cd47caa tnftp: new package (#41763) 2023-12-19 09:38:42 +01:00
John W. Parent
1ebf1a0c6c libxml2: correct improper use of base builder meta (#41760) 2023-12-19 09:34:59 +01:00
Greg Becker
56761649a2 environment modifications for externals (#41723)
* allow externals to configure environment modifications

* docs for external env modification

---------

Co-authored-by: becker33 <becker33@users.noreply.github.com>
2023-12-18 22:24:15 -06:00
dependabot[bot]
6a19cf1b42 build(deps): bump docker/metadata-action from 5.3.0 to 5.4.0 (#41764)
Bumps [docker/metadata-action](https://github.com/docker/metadata-action) from 5.3.0 to 5.4.0.
- [Release notes](https://github.com/docker/metadata-action/releases)
- [Commits](31cebacef4...9dc751fe24)

---
updated-dependencies:
- dependency-name: docker/metadata-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-12-18 16:31:02 -08:00
Thomas Madlener
ef4274ed2e podio: Add latest tag 0.17.4 (#41735) 2023-12-18 22:37:00 +01:00
dependabot[bot]
88b8fc63ef build(deps): bump isort in /.github/workflows/style (#41650)
Bumps [isort](https://github.com/pycqa/isort) from 5.12.0 to 5.13.2.
- [Release notes](https://github.com/pycqa/isort/releases)
- [Changelog](https://github.com/PyCQA/isort/blob/main/CHANGELOG.md)
- [Commits](https://github.com/pycqa/isort/compare/5.12.0...5.13.2)

---
updated-dependencies:
- dependency-name: isort
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-12-18 10:23:51 -08:00
dependabot[bot]
639a6a6897 build(deps): bump isort from 5.12.0 to 5.13.2 in /lib/spack/docs (#41651)
Bumps [isort](https://github.com/pycqa/isort) from 5.12.0 to 5.13.2.
- [Release notes](https://github.com/pycqa/isort/releases)
- [Changelog](https://github.com/PyCQA/isort/blob/main/CHANGELOG.md)
- [Commits](https://github.com/pycqa/isort/compare/5.12.0...5.13.2)

---
updated-dependencies:
- dependency-name: isort
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-12-18 10:23:18 -08:00
Massimiliano Culpo
af96fef1da spack.config: cleanup and add type hints (#41741) 2023-12-18 17:05:36 +01:00
Michael Kuhn
7550a41660 gcc: fix run environment variables not being exported in environments (#41729)
Since views use symlinks, all compiler binaries were skipped in this
case. Instead, only skip them if their target does not exist.
2023-12-18 16:19:38 +01:00
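
A small sketch of the check described in the commit above (helper name is illustrative):

```python
import os


def is_dangling_symlink(path):
    # A symlinked compiler binary in a view is usable as long as its target
    # exists; only skip links whose target is missing.
    return os.path.islink(path) and not os.path.exists(path)
```
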
Mikael Simberg
ffd2a34d9e pika-algorithms: Add upper bound for pika version (#41736) 2023-12-18 15:12:36 +01:00
Michael Kuhn
6a74a82e19 glib: add v2.78.3 (#41697) 2023-12-18 10:06:23 +01:00
Harmen Stoppels
ebb7c5ac8f asp.py: remove "CLI" reference (#41718)
Can also be an environment root, or programmatically
`Spec("x").concretized()`.
2023-12-18 09:52:12 +01:00
Peter Scheibel
14c7bfe9ce spack develop: convert to config (#35273)
Convert the 'develop' section of an environment to a dedicated configuration section.
This means for example that instead of having to define `develop` specs in the
`spack.yaml`, the environment can `include:` another `develop.yaml` configuration
which specifies which specs should be developed in the environment.

This change is not expected to be disruptive given that existing environment `spack.yaml`
files will conform to the new schema.

(Update 11/28/2023) I have implemented the `develop`/`undevelop` commands in terms
of more-generic modification functions added to the `config` module: `change_or_add`
and `update_all`. It is assumed that the semantics added here (described in 11/18 update)
would be desirable to extend to other config update actions (e.g. adding compilers, 
changing package requirements, adding mirrors).

(Update 11/18/2023) I have updated this such that `spack develop`, and
`spack undevelop` to potentially modify all writable scopes, like 
https://github.com/spack/spack/pull/41147. https://github.com/spack/spack/pull/35307
will be useful for modifying included scopes, but generally speaking specifying a 
`--scope` will not be required for `spack develop`: `spack develop` will add new 
develop specs to whatever scope already has develop specs defined, or to the
highest-priority writable scope (which should be the env scope).

TODOs:

- [x] If you `spack undevelop` a package which is mentioned at multiple layers of
      configuration, then currently this would only modify one of them. That's not
      technically a new issue (has always existed for configuration modification), but
      may be confusing to users when presented via an interface other than `spack config set`
- [x] Need to add (or confirm) the ability to modify individual config files by providing
      a path (rather than using a scope identifier as a key to retrieve associated config).
- [x] `spack develop` adds new develop specs to the scope that defines them
      (potentially skipping higher priority scopes to e.g. augment included scope files)

---------

Co-authored-by: scheibelp <scheibelp@users.noreply.github.com>
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2023-12-18 00:47:53 -08:00
Christopher Christofi
ed52b505d4 py-plum-dispatch: add new package (#41536)
* py-plum-dispatch: add new package

* Update var/spack/repos/builtin/packages/py-plum-dispatch/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-12-17 12:18:02 -06:00
Wouter Deconinck
b111064e22 py-htgettoken: use os.environ, avoid AttributeError (#41717)
* py-htgettoken: use os.environ, avoid AttributeError

This avoids the following error:
```
Warning: could not load runtime environment due to AttributeError: 'EnvironmentModifications' object has no attribute 'get'
```

* py-htgettoken: allow for undefined variables

* py-htgettoken: use dict get()

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-12-17 12:17:23 -06:00
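
A tiny sketch of the pattern referred to above (the variable name is illustrative):

```python
import os

# Reading os.environ (a dict) avoids the AttributeError seen when .get() was
# called on an EnvironmentModifications object, and the default avoids a
# KeyError when the variable is undefined.
token_file = os.environ.get("BEARER_TOKEN_FILE", "")
```
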
Wouter Deconinck
17d47accf9 qt: apply patch for apple-clang@15: (#41695) 2023-12-17 07:35:46 -05:00
Christopher Christofi
d7fb298a6b py-pyopenssl: add version 18.0.0 (#41709) 2023-12-17 01:05:38 -06:00
Christopher Christofi
107ea768ab py-sortedcontainers: add new version 2.4.0 (#41652) 2023-12-17 01:01:52 -06:00
Christopher Christofi
8797dd35f7 py-python-utils: add version 2.7.1 (#41653) 2023-12-17 00:59:59 -06:00
Christopher Christofi
0596a46cd9 py-pyasn1: add version 0.4.3 (#41654) 2023-12-17 00:59:19 -06:00
Christopher Christofi
9d406463d4 py-cryptography: add version 2.8 (#41621) 2023-12-17 00:58:52 -06:00
Christopher Christofi
86906bf5b3 py-lxml: add new version 4.4.2 (#41623) 2023-12-17 00:55:56 -06:00
Christopher Christofi
03ddccbc93 py-progressbar2: add version 3.43.1 (#41624) 2023-12-17 00:55:23 -06:00
Garth N. Wells
12db37906b py-fenics-dolfinx: update for v0.7.2 (#41394)
* Update for v0.7.2

* Dependency fix

* Dep type fix
2023-12-17 00:46:33 -06:00
Christopher Christofi
b158a15754 py-optax: add new package (#41278)
* py-optax: add new package with version 0.1.7

* Update var/spack/repos/builtin/packages/py-optax/package.py

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>

---------

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>
2023-12-16 23:43:02 -07:00
Thomas Madlener
b82f78003c py-jupyterlab: Use the correct version dependency for jinja2 (#41543) 2023-12-17 00:21:24 -06:00
Henrique Finger Zimerman
49616d3020 Add new python package to spack - py-pygame (game development package) (#41477)
Add pygame to spack
2023-12-17 00:06:44 -06:00
Lydéric Debusschère
8467f8ae8a py-poetry: Add version 1.6.1 (#41291)
* py-poetry: Add version 1.6.1

* py-poetry-core: Add version 1.7.0

* py-dulwich: Add version 0.21.6

* py-installer: Add version 0.7.0

* py-keyring: Add version 24.3.0

* py-poetry-plugin-export: Add version 1.6.0

* py-cachecontrol: Add version 0.13.0

* py-xattr: Add version 0.10.1, py-poetry dependency on darwin platform

* py-cachecontrol: fix typo

* py-cachecontrol: add version 0.13.1

* py-dulwich: remove version constraint on python, sort dependencies, add py-typing-extensions dependency

* py-poetry-core: add version constraint on python

* py-poetry-plugin-export: fix python dependency, sort dependencies

* py-poetry: sort dependencies, fix dependencies per review

* py-cachecontrol: fix typo

* py-poetry-plugin-export: comment out py-poetry dependency; py-poetry: fix py-build dependency

---------

Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2023-12-16 23:58:52 -06:00
Kyle Gerheiser
5b6137d91a libfabric: Add uring variant (#41563)
* libfabric: Add uring variant
* Remove tcp fabric requirement for uring
* Fix style and use spec.satisfies
2023-12-15 12:51:29 -08:00
Arne Becker
b7edcbecd7 perl-rose-object: New package (#41715)
Adds Rose::Object
2023-12-15 10:22:28 -08:00
Arne Becker
5ccbe68f16 perl-compress-lzo: New package (#41716)
Adds Compress::LZO
2023-12-15 10:20:46 -08:00
Arne Becker
9fe4cef89e perl-dbd-oracle: New package (#41719) 2023-12-15 10:18:04 -08:00
John W. Parent
165c6cef08 clingo: patch clingo to allow for build with modern msvc (#41188) 2023-12-15 10:43:08 -07:00
Sean Koyama
0efd5287c4 mpifileutils: add DAOS variant (#35618)
* mpifileutils: add DAOS variant
* mpifileutils: Add daos dep when +daos
  Add dependency on DAOS when +daos
  Pass DAOS prefix to ensure the correct DAOS is found during configuration
* Change `in` to `satisfies` for boolean variants

---------

Co-authored-by: Ryan Krattiger <ryan.krattiger@kitware.com>
2023-12-15 10:38:01 -07:00
Vanessasaurus
b1ab01280a Automated deployment to update package flux-sched 2023-12-14 (#41658)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2023-12-14 12:52:31 -08:00
Arne Becker
ab84876e2c perl-proc-daemon: New package (#41666) 2023-12-14 12:51:43 -08:00
Arne Becker
e2d5be83e7 perl-test-base: New package (#41668)
Adds Test::Base
2023-12-14 12:49:43 -08:00
Arne Becker
85cdf37d3b perl-datetime-format-strptime: New package (#41676)
* perl-datetime-format-strptime: New package
  Adds package:
  - perl-datetime-format-strptime
  And adds these because they are test dependencies:
  - perl-test-file-sharedir
  - perl-test2-plugin-nowarnings
  - perl-test2-suite
  And modifies these to enable build time tests:
  - perl-b-hooks-endofscope
  - perl-class-singleton
  - perl-datetime-locale
  - perl-datetime-timezone
  - perl-file-sharedir
  - perl-namespace-autoclean
  - perl-namespace-clean
  - perl-params-validationcompiler
  - perl-specio
* Add myself as maintainer
2023-12-14 12:47:40 -08:00
Arne Becker
06521b44b6 perl-common-sense: New package (#41677) 2023-12-14 12:20:47 -08:00
Arne Becker
1e5325eea0 perl-time-clock: New package (#41678) 2023-12-14 12:19:06 -08:00
Arne Becker
0995a29c5c perl-sql-reservedwords: New package (#41685) 2023-12-14 12:07:12 -08:00
Adam J. Stewart
133d6e2656 py-pandas: add v2.1.4 (#41590) 2023-12-14 13:53:05 -06:00
Adam J. Stewart
36117444aa py-black: add v23.12.0 (#41589) 2023-12-14 13:52:47 -06:00
Andrey Perestoronin
330a9a7c9a intel-oneapi-compilers 2023.2.3: added new version to dpcpp package (#41680)
* add new cpp compiler version

* empty ftn for 2023.2.3

* OLD ftn in 2023.2.3 version

* tolerate missing fortran compiler

---------

Co-authored-by: Robert Cohn <robert.s.cohn@intel.com>
2023-12-14 11:03:18 -07:00
Todd Gamblin
0dc3fc2d21 spec parser: precompile quoting-related regular expressions (#41657)
This adds a small (~5%) performance improvement to Spec parsing.

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2023-12-14 05:08:08 -07:00
Dom Heinzeller
a972314fa6 Fix spack compiler wrappers in ESMF's esmf.mk on Cray when using cc, CC, ftn (#41640) 2023-12-14 09:42:34 +01:00
Wouter Deconinck
16d1ed3591 geant4: new version 11.2.0 (#41643)
* geant4: new version 11.2.0

* geant4: depends_on geant4-data@11.2:

* geant4-data: new version 11.2.0

* g4abla: new version 3.3

* g4emlow: new version 8.4

* g4incl: new version 1.1

* geant4: depends_on vecgeom@1.2.6:

* geant4: depends_on qt@5.9: when @11.2: +qt

* vecgeom: new version 1.2.6
2023-12-14 09:18:04 +01:00
Arne Becker
5c25f16df2 perl-cpanel-json-xs: New package (#41646) 2023-12-13 18:13:58 -08:00
Arne Becker
b3ccaa81a7 perl-mock-config: New package (#41647) 2023-12-13 18:13:04 -08:00
Arne Becker
a0041731a3 perl-ref-util: New package (#41648) 2023-12-13 18:12:02 -08:00
Todd Gamblin
a690b8c27c Improve parsing of quoted flags and variants in specs (#41529)
This PR does several things:

- [x] Allow any character to appear in the quoted values of variants and flags.
- [x] Allow easier passing of quoted flags on the command line, e.g. `cflags="-O2 -g"`.
- [x] Handle quoting better in spec output, using single quotes around double 
      quotes and vice versa.
- [x] Disallow spaces around `=` and `==` when parsing variants and flags.

## Motivation

This PR is motivated by the issues above and by ORNL's 
[tips for launching at scale on Frontier](https://docs.olcf.ornl.gov/systems/frontier_user_guide.html#tips-for-launching-at-scale).
ORNL recommends using `sbcast --send-libs` to broadcast executables and their
libraries to compute nodes when running large jobs (e.g., 80k ranks). For an
executable named `exe`, `sbcast --send-libs` stores the needed libraries in a
directory alongside the executable called `exe_libs`. ORNL recommends pointing
`LD_LIBRARY_PATH` at that directory so that `exe` will find the local libraries and
not overwhelm the filesystem.

There are other ways to mitigate this problem:
* You could build with `RUNPATH` using `spack config add config:shared_linking:type:runpath`,
  which would make `LD_LIBRARY_PATH` take precedence over Spack's `RUNPATHs`.
  I don't recommend this one because `RUNPATH` can cause many other things to go wrong.
* You could use `spack config add config:shared_linking:bind:true`, added in #31948, which
  will greatly reduce the filesystem load for large jobs by pointing `DT_NEEDED` entries in
  ELF *directly* at the needed `.so` files instead of relying on `RPATH` search via soname.
  I have not experimented with this at 80,000 ranks, but it should help quite a bit.
* You could use [Spindle](https://github.com/hpc/Spindle) (as LLNL does on its machines)
  which should transparently fix this without any changes to your executable and without
  any need to use `sbcast` or other tools.

But we want to support the `sbcast` use case as well.

## `sbcast` and Spack

Spack's `RPATHs` break the `sbcast` fix because they're considered with higher precedence
than `LD_LIBRARY_PATH`. So Spack applications will still end up hitting the shared filesystem
when searching for libraries. We can avoid this by injecting some `ldflags` into the build, e.g.,
if we were going to launch, say, `LAMMPS` at scale, we could add another `RPATH`
specifically for use with `sbcast`:

    spack install lammps ldflags='-Wl,-rpath=$ORIGIN/lmp_libs'

This will put the `lmp_libs` directory alongside `LAMMPS`'s `lmp` executable first in the
`RPATH`, so it will be searched before any directories on the shared filesystem.

## Issues with quoting

Before this PR, the command above would've errored out for two reasons:

1. `$` wasn't an allowed character in our spec parser.
2. You would've had to double quote the flags to get them to pass through correctly:

       spack install lammps ldflags='"-Wl,-rpath=$ORIGIN/lmp_libs"'

This is ugly and I don't think many users will easily figure it out. The behavior was added in
#29282, and it improved parsing of specs passed as a single string, e.g.:

    spack install 'lammps ldflags="-Wl,-rpath=$ORIGIN/lmp_libs"'

but a lot of users are naturally going to try to quote arguments *directly* on the command
line, without quoting their entire spec. #29282 used a heuristic to detect unquoted flags
and warn the user, but the warning could be confusing. In particular, if you wrote
`cflags="-O2 -g"` on the command line, it would break the flags up, warn, and tell you
that you could fix the issue by writing `cflags="-O2 -g"` even though you just wrote
that. It's telling you to *quote* that value, but the user has to know to double quote.

## New heuristic for quoted arguments from the CLI

There are only two places where we allow arbitrary quoted strings in specs: flags and
variant values, so this PR adds a simpler heuristic to the CLI parser: if an argument in
`sys.argv` starts with `name=...`, then we assume the whole argument is quoted.

This means you can write:

    spack install bzip2 cflags="-O2 -g"

directly on the command line, without multiple levels of quoting. This also works:

    spack install 'bzip2 cflags="-O2 -g"'

The only place where this heuristic runs into ambiguity is if you attempt to pass
anonymous specs that start with `name=...` as one large string. e.g., this will be
interpreted as one large flag value:

    spack find 'cflags="-O2 -g" ~bar +baz'

This sets `cflags` to `"-O2 -g" ~bar +baz`, which is likely not what you wanted. You
can fix this easily by either removing the quotes:

    spack find cflags="-O2 -g" ~bar +baz

Or by adding a space at the start, which has the same effect:

    spack find ' cflags="-O2 -g" ~bar +baz'

You may wonder why we don't just look for quotes inside of flag arguments, and the
reason is that you *might* want them there.  If you are passing arguments like:

    spack install zlib cppflags="-D DEBUG_MSG1='quick fox' -D DEBUG_MSG2='lazy dog'"

You *need* the quotes there. So we've opted for one potentially confusing, but easily
fixed outcome vs. limiting what you can put in your quoted strings.

## Quotes in formatted spec output

In addition to being more lenient about characters accepted in quoted strings, this PR fixes
up spec formatting a bit. We now format quoted strings in specs with single quotes, unless
the string has a single quote in it, in which case we JSON-escape the string (i.e., we add
`\` before `"` and `\`).  

    zlib cflags='-D FOO="bar"'
    zlib cflags="-D FOO='bar'"
    zlib cflags="-D FOO='bar' BAR=\"baz\""
2023-12-13 16:36:22 -08:00
yizeyi18
a1fa862c3f camp: fixing build issue (#41400)
* adding necessary headers, to fix https://github.com/spack/spack/issues/41398

* deleting something imported by accident

* [@spackbot] updating style on behalf of yizeyi18

* undo commit 7688fed according to suggestion from @msimberg

* patching camp@:2022.10.1 for compatibility with gcc-13

* adding the patch

* fixing paths in the patch

* [@spackbot] updating style on behalf of yizeyi18

* Update camp patch using LLNL/camp@05e1c35

Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>

* changing patch name

---------

Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>
2023-12-13 16:28:57 -08:00
Harmen Stoppels
80f31829a8 python: don't run mkdirp in setup_dependent_package (#41603)
`setup_dependent_package` is not a build phase; it should just set
globals for a package.

It's called during setup of the runtime environment of packages, and there
have been reports of it actually failing due to a read-only file system
(not sure under what exact conditions that is possible).
2023-12-13 23:54:28 +01:00
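
A hedged sketch of what this hook should be limited to (the attribute and path are illustrative, not the exact ones in the python package):

```python
from spack.package import *


class Python(Package):
    """Sketch only."""

    def setup_dependent_package(self, module, dependent_spec):
        # Only inject globals for the dependent package; no filesystem side
        # effects (mkdirp etc.), since this hook also runs when setting up
        # run environments, possibly on a read-only filesystem.
        module.python = Executable(self.prefix.bin.python)
```
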
Arne Becker
84436f10ba perl-config-tiny: New package (#41584) 2023-12-13 14:43:03 -07:00
Arne Becker
660485709d perl-b-cow: New package (#41596) 2023-12-13 12:06:14 -08:00
Arne Becker
251dce05c9 perl-throwable: New package (#41597) 2023-12-13 12:05:17 -08:00
Arne Becker
ecd05fdfb4 perl-scope-guard: New package (#41598) 2023-12-13 12:02:50 -08:00
Arne Becker
9ffcf36444 perl-test-sharedfork: New package (#41599) 2023-12-13 12:01:54 -08:00
Arne Becker
07258a7c80 perl-safe-isa: New package (#41600) 2023-12-13 12:01:10 -08:00
Arne Becker
395e53a5e0 perl-proc-processtable: New package (#41601) 2023-12-13 11:59:05 -08:00
Arne Becker
77c331c753 perl-net-ip: New package (#41606) 2023-12-13 11:57:17 -08:00
Arne Becker
ee5481a861 perl-any-uri-escape: New package (#41607) 2023-12-13 11:56:11 -08:00
Arne Becker
f7ec061c64 perl-term-table: New package (#41608) 2023-12-13 11:47:08 -08:00
Arne Becker
7cb70ff4b1 perl-test-pod: New package (#41609) 2023-12-13 11:45:59 -08:00
Arne Becker
4a661f3255 perl-spiffy: New package (#41610) 2023-12-13 11:44:36 -08:00
Arne Becker
7037240879 perl-ipc-system-simple: New package (#41611) 2023-12-13 11:43:21 -08:00
Arne Becker
0e96dfaeef perl-mime-types: New package (#41612) 2023-12-13 11:41:48 -08:00
Arne Becker
a0a2cd6a1a perl-convert-nls-date-format: New package (#41613) 2023-12-13 11:39:11 -08:00
Arne Becker
170c05bebb perl-module-pluggable: New package (#41614) 2023-12-13 11:04:27 -08:00
Arne Becker
bdf68b7ac0 perl-email-date-format: New package (#41617) 2023-12-13 11:03:24 -08:00
Arne Becker
c176de94e2 perl-heap: New package (#41618) 2023-12-13 11:00:55 -08:00
Harmen Stoppels
f63dbbe75d spack mirror create --all: include patches (#41579) 2023-12-13 20:00:44 +01:00
Arne Becker
a0c7b10c76 perl-log-any: New package (#41619) 2023-12-13 10:59:54 -08:00
Arne Becker
2dc3bf0164 perl-http-cookiejar: New package (#41620) 2023-12-13 10:57:55 -08:00
Harmen Stoppels
9bf6e05d02 Revert "[protobuf] New versions, explicit cxxstd variant (#41459)" (#41635)
This reverts commit b82bd8e6b6.
2023-12-13 15:02:25 +01:00
Massimiliano Culpo
cd283846af mysql: add v8.0.35, fix build (#41602) 2023-12-13 12:10:33 +01:00
Taillefumier Mathieu
03625c1c95 Add pic variant when building the library (#41631)
* Add pic variant when building the library

* make pretty

* Probably better approach
2023-12-13 01:38:13 -07:00
Wouter Deconinck
f01774f1d4 hepmc3: fix from_variant -> self.define (#41605)
* hepmc3: fix from_variant -> self.define
* hepmc3: str on versions
2023-12-13 07:02:48 +01:00
Christopher Christofi
965860d1f8 perl-getopt-argvfile: add new package with version 1.11 (#41625) 2023-12-12 20:33:25 -07:00
Christopher Christofi
c4baf4e199 perl-parselex: add new package with version 2.21 (#41626) 2023-12-12 20:08:28 -07:00
Aiden Grossman
dd82227ae7 Remove MCT license annotation (#41593)
This license annotation is currently invalid as it specifies a URL
rather than an SPDX expression. Remove it for now until we have a
consensus on how to represent this case.
2023-12-12 17:52:42 -07:00
fpruvost
a9028630a5 pastix: new release v6.3.2 (#41585) 2023-12-12 15:57:11 -07:00
Arne Becker
789c85ed8b perl-clone-pp: New package (#41586) 2023-12-12 15:46:25 -07:00
James Taliaferro
cf9d36fd64 kakoune: add v2023.08.05 (#41443) 2023-12-12 23:02:20 +01:00
Arne Becker
ef7ce46649 perl-ipc-run3: New package (#41583) 2023-12-12 14:29:30 -07:00
pabloaledo
334a50662f Update bioconductor packages (#41227)
Signed-off-by: Pablo <pablo.aledo@seqera.io>
2023-12-12 22:04:45 +01:00
Dom Heinzeller
d68e73d006 New package Model Coupling Toolkit (MCT) (#41564)
* New package Model Coupling Toolkit (MCT)
* Remove ~mpi variant from mct, build is not working correctly
* Remove boilerplate stuff from var/spack/repos/builtin/packages/mct/package.py
2023-12-12 11:26:57 -08:00
Adam J. Stewart
7d7f097295 py-pyvista: add v0.42.3 (#41246) 2023-12-12 10:58:15 -08:00
Massimiliano Culpo
37cdcc7172 mysql: fix issue when using old core API call (#41573)
MySQL was performing a core API call to `Spec.flat_dependencies`
when setting up the build environment. This function is an
implementation detail of the old concretizer, where multiple nodes
from the same package are not allowed.

This PR uses a more idiomatic way to check if "python" is
in the DAG.

For reference, see #11356 to check why the call was introduced.
2023-12-12 10:40:31 -08:00
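
A minimal sketch of the more idiomatic check mentioned above (assumption: the actual recipe code may differ):

```python
def depends_on_python(spec):
    # Membership testing searches the whole DAG of a concretized spec,
    # without relying on Spec.flat_dependencies(), an old-concretizer
    # implementation detail.
    return "python" in spec
```
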
Thomas Madlener
0a40bb72e8 genfit: Add latest tags and update root dependency (#41572) 2023-12-12 10:30:06 -08:00
James Beal
24b6edac89 bowtie2 add latest version (#41580)
Co-authored-by: James Beal <jb23@sanger.ac.uk>
2023-12-12 10:23:17 -08:00
Arne Becker
3e7acf3e61 perl-type-tiny: New package (#41582) 2023-12-12 10:21:56 -08:00
Harmen Stoppels
ede36512e7 gcc: simplify patch when range (#41587) 2023-12-12 10:03:41 -07:00
jmuddnv
e06b169720 NVIDIA HPC SDK: add v23.11 (#41125) 2023-12-12 17:40:53 +01:00
Stephen Sachs
7ed968d42c clingo-bootstrap: use new Spack API for environment modifications (#41574) 2023-12-12 17:28:15 +01:00
Cameron Rutherford
c673b9245c exago: Add v1.2.0 and patches for builds without python or tests. (#41350)
Co-authored-by: Satish Balay <balay@mcs.anl.gov>
2023-12-12 10:17:26 -06:00
Mikael Simberg
27c0dab5ca fmt: Add patch to allow compilation with clang in CUDA mode (#41578) 2023-12-12 08:33:44 -07:00
Chris Green
b82bd8e6b6 [protobuf] New versions, explicit cxxstd variant (#41459)
* [protobuf] New versions, explicit cxxstd variant
* New versions 3.15.8, 3.25.0, 3.25.1.
* New explicit variant `cxxstd` with support for older Protobuf
  versions.
* Support testing.
* Use Protobuf's `protobuf_BUILD_SHARED_LIBS` instead of
  `BUILD_SHARED_LIBS`.
* Support building with LLVM/Clang's `libc++`.
* Address audit issue
* Variant default does not honor `when` clause
* Use `self.spec.satisfies()` instead of `with when()`
* Fix silliness; improve consistency
* Today was apparently a go-back-to-bed day
2023-12-11 22:08:40 -07:00
Sreenivasa Murthy Kolam
5351382501 Bump up the version for ROCm-5.7.0 and ROCm-5.7.1 releases. (#40724)
* initial commit for rocm-5.7.0 and 5.7.1 releases
* bump up the version for 5.7.0 and 5.7.1 releases
* update recipes to support 5.7.0 and 5.7.1 releases
* bump up the version for ROCm 5.7.0 and ROCm-5.7.1 releases
* bump up the version for composable-kernel amd miopen-hip
* fix style errors
* fix style errors in hip etc
* renaming composable-kernel recipe
* changes for composable_kernel
* Revert "renaming composable-kernel recipe"
  This reverts commit 0cf6c6debf.
* Revert "changes for composable_kernel"
  This reverts commit 05272a10a7.
* bump up the version for hiprand
* using the checksum for hiprand-5.7.1
* bump up the version for 5.7.0 and 5.7.1 releases
* fix style errors
* fix merge conflicts with the develop.
* temp workaround for the error seen with rocm-5.7.0 when trying
  to generate the dependency file for runtime/legion/legion_redop.cu
* fix build issue(work around) with legion
* add patch for migraphx package to turn off ck
* update to  hip recipe
* fix hip-path detection inside llvm clang driver
* update llvm-amdgpu and rocm-validation-suite recipes
* fix style errors
* bump up the version for amdsmi for rocm-5.7.0 release
* add support for gfx941,gfx942 for rocm-5.7.0 release onwards
* revert changes to rocm.py file
* added gfx941 and gfx942 to rocm.py and add the gfx942 to kokkos and new checksum
  the new version seem to support gfx942
* bump up the version for rccl for 5.7.1
* update the patch for rocm-openmp-extras for 5.7.0
* update mivisionx recipe for 5.7.0 release
* add new dependencies for rocfft tests
* port the fix for avx build, the start address of values_ buffer in KernelParameters is not
  correct as it is computed based on 16-byte alignment
* set HIP_PATH=ROCM_PATH for 5.7.0 onwards
* address review comments
* revert adding xnack- and xnack+ to gfx940,gfx941,gfx942 as the prechecks were failing
2023-12-11 14:49:19 -08:00
Harmen Stoppels
8c29e90fa9 Build cache: make signed/unsigned a mirror property (#41507)
* Add `signed` property to mirror config

* make unsigned a tri-state: true/false overrides mirror config, none takes mirror config

* test commands

* Document this

* add a test
2023-12-11 15:14:59 -06:00
Tamara Dahlgren
045f398f3d Add missing build-system/custom phases to the CDash map (#41439) 2023-12-11 11:06:19 -08:00
Chris Green
6986e70877 [abseil-cpp] New version 20230802.1 (#41457) 2023-12-11 11:04:46 -08:00
wspear
4d59e746fd otf2: add v3.0.3 (#41499)
* Add otf2 version 3.0.3
* Update var/spack/repos/builtin/packages/otf2/package.py

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2023-12-11 11:02:58 -08:00
Arne Becker
f6de34f9db New package: perl-test-file (#41510) 2023-12-11 10:52:24 -08:00
Arne Becker
0f0adb71d0 perl-class-tiny: New package (#41513) 2023-12-11 10:49:38 -08:00
Joe Schoonover
4ec958c5c6 Add new version of feq-parse (#41515)
The new feq-parse version includes fixes for the ifort and ifx compilers.
Additionally, evaluation of parser objects with multidimensional arrays is
now supported.
2023-12-11 10:47:42 -08:00
Kyle Gerheiser
2aa07fa557 libfabric: Add ucx provider variant (#41524) 2023-12-11 10:46:18 -08:00
Arne Becker
239d343588 perl-test-nowarnings: New package (#41539)
* perl-test-nowarnings: New package
* Add myself as maintainer
2023-12-11 10:37:52 -08:00
Andrey Perestoronin
44604708ad intel-oneapi-compilers 2024.0.1: added new version to packages (#41555)
* add new packages

* fix version
2023-12-11 11:33:42 -07:00
Arne Becker
f0109e4afe perl-test-weaken: New package (#41540) 2023-12-11 10:33:04 -08:00
Arne Becker
b8f90e1bdc perl-test-without-module: New package (#41541) 2023-12-11 10:31:44 -08:00
James Beal
8b9064e5e4 samtools/htslib add latest version (#41545)
* samtools/htslib add latest version
* Given that I work at the same institute as the authors, I am willing to review changes; if something is complex I can ask them over tea.

---------

Co-authored-by: James Beal <jb23@sanger.ac.uk>
2023-12-11 10:29:08 -08:00
James Beal
ce79785c10 igv add latest version (#41546)
Co-authored-by: James Beal <jb23@sanger.ac.uk>
2023-12-11 10:26:29 -08:00
Christopher Christofi
af378c7f31 perl-bio-db-hts: add new package with version 3.01 (#41554)
* perl-bio-db-hts: add new package with version 3.01
* fix styling
2023-12-11 11:25:20 -07:00
Dom Heinzeller
cf50bfb7c2 Add eckit@1.24.5, ecmwf-atlas@{0.35.0,0.35.1,0.36.0} (#41547)
* Add eckit@1.24.5
* Add ecmwf-atlas@0.35.1
* Add ecmwf-atlas@0.36.0
2023-12-11 10:20:39 -08:00
Dom Heinzeller
620e090ff5 Add @climbfuji to fms maintainers (#41550) 2023-12-11 10:18:05 -08:00
Massimiliano Culpo
c4d86a9c2e apple-clang: add new package (#41485)
* apple-clang: added new package
* Add a maintainer
* Use f-strings, remove leftover comment
2023-12-11 10:06:27 -08:00
Kyle Gerheiser
3b74b894c7 fabtests: Add versions and update maintainer (#41525) 2023-12-11 18:55:02 +01:00
Dan Lipsa
3fa8afc036 Add v5.0.0 to SENSEI (#41551)
Co-authored-by: Dan Lipsa <dan.lipsa@savannah.khq.kitware.com>
2023-12-11 11:12:47 -06:00
Garth N. Wells
60628075cb Update for v0.7.2 (#41393) 2023-12-11 09:10:52 -08:00
Chris Green
9e4fab277b [root] New variants, patches (#41548)
* New variants:
  - `tmvz-cpu`
  - `tmvz-gpu`
  - `tmvz-pymva`
  - `tmvz-sofie`

* Improve X-related dependencies.

* Improve TMVA-related dependencies with more specificity.

* Patch possible missing standard header include in Eve7.

* Patch Protobuf handling to support new Protobuf-provided CMake config
  files required to handle transitive `abseil-cpp` dependence.

* Add missing terminal newline to `webgui` patch to remove patch
  warning.

* Handle deprecated/removed build options.
2023-12-11 09:18:10 -07:00
Christopher Christofi
5588e328f7 fpocket: improve recipe (#41532) 2023-12-11 15:45:52 +01:00
Rocco Meli
93a1fc90c9 add cmake constraint (#41542) 2023-12-11 15:36:29 +01:00
Patrick Gartung
7297721e78 Revert "[root] New variants, checksum changes, sundry improvements (#41463)" (#41544)
This reverts commit 7d45e132a6.
2023-12-11 15:34:24 +01:00
Felix Werner
eb57d96ea9 geant4/geant4-data: add v10.0.4 (#41478)
* geant4/geant4-data: add builtin_clhep variant and v10.0.4.

* geant4: revert addition of builtin_clhep variant.

* geant4: fix vecgeom variant only being available for v10.3 and above.
2023-12-11 14:21:25 +01:00
Rocco Meli
ce09642922 netlib-scalapack: add git attribute and master version (#41537) 2023-12-11 03:38:28 -07:00
Arne Becker
cd88eb1ed0 kyotocabinet: add new package (#41512) 2023-12-11 03:38:12 -07:00
Thomas Helfer
826df84baf tfel: fix v3.4.5 checksum (#41523) 2023-12-11 03:27:59 -07:00
Alex Richert
0a4b365a7d fms: add v2023.04 (#41475) 2023-12-11 11:15:01 +01:00
Harmen Stoppels
a2ed4704e7 Link to GitHub Action spack/setup-spack in docs (#41509) 2023-12-11 11:12:40 +01:00
Harmen Stoppels
28b49d5d2f unit tests: replace /bin/bash with /bin/sh (#41495) 2023-12-11 11:11:27 +01:00
Adam J. Stewart
16bb4c360a PyTorch: disable sleef dep for now (#41508) 2023-12-11 11:00:24 +01:00
Thomas Gruber
cfd58bdafe Likwid: likwid-icx-mem-group-fix.patch only for 5.2.* versions (#41514) 2023-12-11 10:48:02 +01:00
Matthieu Dorier
53493ceab1 leveldb: turning benchmark and tests off (#41518) 2023-12-11 10:41:03 +01:00
Dave Keeshan
64cd429cc8 Fix filter_compiler_wrapper where compiler is None (#41502)
Fix filter_compiler_wrapper for cases where the compiler returned is None. This happens on some installed gcc systems that do not have Fortran built in as standard, e.g. gcc@11.4.0 on Ubuntu 22.04.
2023-12-11 10:31:56 +01:00
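
The fix described above boils down to skipping compiler slots that are not configured; a minimal sketch, assuming a simple mapping from language to compiler path (names are illustrative, not Spack's internals):

```python
def wrappers_to_filter(compilers):
    """Keep only the compiler slots that are actually configured."""
    return {lang: path for lang, path in compilers.items() if path is not None}


# e.g. a system gcc@11.4.0 on Ubuntu 22.04 without gfortran installed:
print(wrappers_to_filter({"cc": "/usr/bin/gcc", "cxx": "/usr/bin/g++", "fc": None}))
# -> {'cc': '/usr/bin/gcc', 'cxx': '/usr/bin/g++'}
```
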
Harmen Stoppels
525809632e petsc: improve hipsparse compat (#40311)
Co-authored-by: Satish Balay <balay@mcs.anl.gov>
2023-12-11 10:30:14 +01:00
Vanessasaurus
a6c32c80ab flux-core: add v0.57.0 (#41520)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2023-12-11 10:25:02 +01:00
Todd Gamblin
57ad848f47 commands: better install status help formatting (#41527)
Before (hard to read, doesn't fit on small terminals):

```console
  -I, --install-status  show install status of packages

                        packages can be: installed [+], missing and needed by an installed package [-], installed in an upstream instance [^], or not installed (no annotation)
```

After (fits in 80 columns):

```console
  -I, --install-status  show install status of packages
                        [+] installed       [^] installed in an upstream
                         -  not installed   [-] missing dep of installed package
```
2023-12-11 10:17:37 +01:00
Brian Spilner
15623d8077 cdo: add v2.3.0 (#41479) 2023-12-11 09:45:16 +01:00
Thomas Madlener
c352db7645 vecgeom: Use correct checksum for version 1.2.5 (#41530)
See https://gitlab.cern.ch/VecGeom/VecGeom/-/releases/v1.2.5
2023-12-11 08:53:44 +01:00
Brian Van Essen
5d999d0e4f Add logic to cache the RPATH variables in CachedCMakePackages. (#41417) 2023-12-08 09:27:44 -08:00
Seth R. Johnson
694a1ff340 celeritas: new version 0.4.1 (#41504)
* celeritas: new version 0.4.1

* Mark correct versions as deprecated
2023-12-08 09:56:18 +00:00
Richard Berger
4ec451cfed flecsi: remove ^legion network=gasnet restriction (#41494) 2023-12-07 19:03:04 -07:00
Julien Cortial
a77eca7f88 cdt: Add versions 1.3.6 and 1.4.0 (#41490) 2023-12-07 14:27:39 -07:00
Greg Becker
14ac2b063a cce compiler: remove vestigial compiler names (#41303) 2023-12-07 15:17:03 -06:00
Tamara Dahlgren
edf4d6659d add missing endtime property to CDash (#41498) 2023-12-07 22:06:46 +01:00
Lydéric Debusschère
6531fbf425 py-mpldock: new package (#41316)
* py-mpldock: new package

* py-mpldock: remove version constraint on python

---------

Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2023-12-07 21:24:05 +01:00
Victor Brunini
0a6045eadf Fix cdash reporter time stamps (#38825)
* Fix cdash reporter time stamps (#38818).
   The cdash reporter is created before packages are installed so save the
   starttime then instead of the endtime.
* Use endtime instead of starttime for the endtime of update

---------

Co-authored-by: Tamara Dahlgren <dahlgren1@llnl.gov>
2023-12-07 19:32:10 +00:00
Todd Gamblin
5722a13af0 Spack mailing list is now announcement-only (#41496)
Participation in the venerable Spack google group has dwindled, though we still have
540+ subscribers there.  I've made the mailing list announcement-only, and I've given
a few maintainers posting privileges.

This PR adds some notes to the README indicating that the mailing list is only for
announcements.
2023-12-07 19:22:14 +00:00
Brian Van Essen
9f1223e7a3 Bugfix spectrum-mpi module generation (#41466)
* Ensure that additional environment variables are set when a module
file is generated.

* Fixed the detection of the opal_prefix / MPI_ROOT field to use ompi_info.
---------

Co-authored-by: Greg Becker <becker33@llnl.gov>
2023-12-07 11:06:07 -08:00
Vijay M
5beef28444 Update homepage URL, add 5.5.1 version, remove bad version hashes, and other minor changes (#41492) 2023-12-07 10:52:06 -08:00
snehring
e618a93f3d iqtree2: add new version 2.2.2.7 and new variant lsd2 (#41467)
* iqtree2: add new version 2.2.2.7 and new variant lsd2
* iqtree2: reorder variant and resource
2023-12-07 11:30:14 -07:00
Garth N. Wells
3f0ec5c580 Update UFCx for v0.7.0. (#41392) 2023-12-07 10:20:34 -08:00
Kyle Knoepfel
14392efc6d Permit shared-library for libbacktrace (#41454) 2023-12-07 10:17:31 -08:00
John W. Parent
d7406aaaa5 CMake: v3.26.6 (#41282) 2023-12-07 10:25:42 -07:00
Harmen Stoppels
5a7e691ae2 freebsd (#41480) 2023-12-07 09:19:55 -08:00
Dave Keeshan
b9f63ab40b opensta: add new package (#41484)
* Add opensta; it allows 2 variants, zlib and cudd, both enabled by default (see the sketch below)
* Remove unused import, os
2023-12-07 09:18:58 -08:00
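
A minimal sketch of how such variants are typically declared in a recipe; the descriptions and exact dependency specs are placeholders, not the actual package text:

```python
from spack.package import *


class OpenstaSketch(CMakePackage):
    """Illustrative excerpt only: two optional features, both enabled by default."""

    variant("zlib", default=True, description="Build with zlib support")
    variant("cudd", default=True, description="Build with the CUDD BDD library")

    depends_on("zlib-api", when="+zlib")
    depends_on("cudd", when="+cudd")
```
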
Auriane R
4417b1f9ee Update pika package to use f-strings (#41483) 2023-12-07 10:14:33 -07:00
Adam J. Stewart
04f14166cb py-keras: add v3.0.1 (#41486) 2023-12-07 09:05:42 -08:00
Robert Cohn
223a54098e [intel-mkl,intel-ipp,intel-daal]: deprecate packages (#41488) 2023-12-07 09:01:02 -08:00
Satish Balay
ea505e2d26 petsc: add variant +zoltan (#41472) 2023-12-07 10:51:00 -06:00
Ataf Fazledin Ahamed
e2b51e01be traverse.py: use > 0 instead of >= 0 (#41482)
Signed-off-by: fazledyn-or <ataf@openrefactory.com>
2023-12-07 09:38:54 -07:00
jmlapre
a04ee77f77 trilinos: replace pytrilinos2 variant with python (#41435)
* depends_on python

There is an ill-named variant "python" that enables the pytrilinos1
variant.  This made it through our testing but broke on our actual
CI test machines.

* adjust "python" variant based on Trilinos version

For Trilinos <= 14, enable PyTrilinos(1). For later versions
of Trilinos, enable PyTrilinos2.

We still support directly enabling PyTrilinos2 via the "pytrilinos2"
variant.

* remove pytrilinos2 variant

* correct depends_on constraints
2023-12-07 10:28:13 -05:00
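
A hedged sketch of the version-dependent wiring described above; the CMake option names and version boundary are assumptions for illustration, not the exact package code:

```python
from spack.package import *


class TrilinosSketch(CMakePackage):
    """Illustrative excerpt: one 'python' variant mapped to different backends."""

    variant("python", default=False, description="Build Python wrappers")

    def cmake_args(self):
        args = []
        if self.spec.satisfies("+python"):
            if self.spec.satisfies("@:14"):
                # Trilinos <= 14: enable PyTrilinos(1)
                args.append(self.define("Trilinos_ENABLE_PyTrilinos", True))
            else:
                # Later Trilinos: enable PyTrilinos2
                args.append(self.define("Trilinos_ENABLE_PyTrilinos2", True))
        return args
```
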
Jordan Galby
bb03ce7281 Do not use depfile in bootstrap (#41458)
- we don't have a fallback if make is not installed
- we assume file system locking works
- we don't verify that make is GNU make (bootstrapping fails on FreeBSD as a result)
- there are some weird race conditions in writing spack.yaml on concurrent spack install
- the view is updated after every package install instead of once after the whole environment install.
2023-12-07 10:09:49 +00:00
Massimiliano Culpo
31640652c7 audit: forbid nested dependencies in depends_on declarations (#41428)
Forbid nested dependencies in depends_on declarations, by running an audit in CI.

Fix the packages not passing the new audit:
- amd-aocl
- exago
- palace
- shapemapper
- xsdk-examples

ginkgo: add a commit sha to v1.5.0.glu_experimental
2023-12-07 10:21:01 +01:00
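
For illustration, the audited pattern and one possible rewrite might look like the following; the spec names are hypothetical and the rewrite is not necessarily what each fixed package did:

```python
from spack.package import *


class NestedDepSketch(Package):
    """Hypothetical recipe illustrating the audited pattern."""

    # Rejected by the new audit: a transitive ("nested") constraint inside depends_on
    # depends_on("hypre ^openblas threads=openmp")

    # One possible rewrite: keep the direct dependency plain and state the
    # transitive constraint as a separate, conditional directive
    depends_on("hypre")
    depends_on("openblas threads=openmp", when="^openblas")
```
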
Samuel Browne
0ff0e8944e trilinos: add v15.0.0 (#41465) 2023-12-07 10:07:02 +01:00
Jack Morrison
a877d812d0 rdma-core: Add new versions 41.5, 42.5, 43.4, 44.4, 45.3, 46.2, 47.1, 49.0 (#41473) 2023-12-07 09:58:06 +01:00
dependabot[bot]
24a59ffd36 build(deps): bump actions/setup-python from 4.8.0 to 5.0.0 (#41474)
Bumps [actions/setup-python](https://github.com/actions/setup-python) from 4.8.0 to 5.0.0.
- [Release notes](https://github.com/actions/setup-python/releases)
- [Commits](b64ffcaf5b...0a5c615913)

---
updated-dependencies:
- dependency-name: actions/setup-python
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-12-07 09:55:55 +01:00
Dave Keeshan
57f46f0375 cudd: add new package (#41476) 2023-12-07 09:37:22 +01:00
Chris Green
7d45e132a6 [root] New variants, checksum changes, sundry improvements (#41463)
* New variants:
  - `tmvz-cpu`
  - `tmvz-gpu`
  - `tmvz-pymva`
  - `tmvz-sofie`

* Improve X-related dependencies.

* Improve TMVA-related dependencies with more specificity.

* Patch possible missing standard header include in Eve7.

* Patch Protobuf handling to support new Protobuf-provided CMake config
  files required to handle transitive `abseil-cpp` dependence.

* Add missing terminal newline to `webgui` patch to remove patch
  warning.

* Handle deprecated/removed build options.

* Handle unwanted system paths in various `PATH`-like environment
  variables.
2023-12-06 18:39:51 -06:00
Vicente Bolea
e7ac676417 paraview: dropping patch since changes exists (#41462) 2023-12-06 16:18:46 -07:00
Adam J. Stewart
94ba152ef5 py-torchmetrics: add v1.2.1 (#41456) 2023-12-06 12:48:28 -07:00
Massimiliano Culpo
5404a5bb82 llvm: reformulate a when condition to avoid tautology (#41461)
The condition on swig can be interpreted as "true if true,
false if false" and gives clingo the option to add swig
or not.

If no other optimization criteria break the tie, then
the concretization is non-deterministic.
2023-12-06 19:12:42 +01:00
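
A hypothetical illustration of such a tautology (not the actual llvm directive): when the trigger of a conditional dependency is the dependency itself, the solver can satisfy it either way.

```python
from spack.package import *


class TautologySketch(Package):
    """Hypothetical recipe showing a 'true if true, false if false' condition."""

    # The trigger references the dependency itself: the directive only applies
    # when swig is already in the DAG, and only this directive can put it there,
    # so the solver may freely add swig or leave it out.
    depends_on("swig@4:", when="^swig")

    # A reformulation ties the dependency to a real property of the package, e.g.:
    # depends_on("swig@4:", when="+python")
```
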
Dave Keeshan
b522d8f610 yosys: add new package (#41416)
* Add EDA tools, yosys to Spack
* Add maintainers
* Move from format to f-strings
2023-12-06 09:47:58 -08:00
Thomas Madlener
2a57c11d28 lcio: add version 2.20.2, sio: add version 0.2 (#41451) 2023-12-06 09:46:18 -08:00
Thomas Madlener
1aa3a641ee edm4hep: Update cmake version dependency for newer versions (#41450) 2023-12-06 09:43:24 -08:00
yizeyi18
6feba1590c nwchem: add libxc/elpa support (#41376)
* added external libxc/elpa choice
* fixed formatting issues and 1 unused variant found by reviewer
* try to fix a string formatting issue
* try to fix some other string formatting issues
* fixed 1 flake8 style issue
* use explicit fftw-api@3
2023-12-06 09:38:46 -08:00
Chris Green
58a7912435 [catch2] Sundry improvements including C++20 support (#41199)
* Add `url_list` to facilitate finding new versions.

* `cxxstd` is not meaningful when `@:2.99.99` as it was a header-only
  package before v3.

* Support C++20/23, remove C++14 support.

* Add @greenc-FNAL to maintainers.

* Add CMake arguments to support testing, build of extras and examples.
2023-12-06 11:07:27 -06:00
yizeyi18
03ae2eb223 dla-future: add a patch (#41409)
* using std::int64_t needs include cstdint in gcc-13

Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>
2023-12-06 16:55:40 +01:00
Julien Cortial
013f0d3a13 Only build tests for proj package if required (#41065)
* Only build tests for proj package if required

Even if tests are not explicitly required to be built, proj builds them
anyway and tries to download Google Test (see the sketch after this entry).

* proj: fix name of test activation flag

* proj: Always set test activation flag

* proj: Patch test activation logic for versions 5.x
2023-12-06 09:32:39 -06:00
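
A minimal sketch of the pattern described above, assuming the standard CMake `BUILD_TESTING` toggle; the commit notes that the real flag name differs across proj versions, so treat the spelling as an assumption:

```python
from spack.package import *


class ProjSketch(CMakePackage):
    """Illustrative excerpt: always set the test toggle explicitly."""

    def cmake_args(self):
        # Passing the flag unconditionally keeps the build from defaulting the
        # tests to ON and trying to download Google Test.
        return [self.define("BUILD_TESTING", self.run_tests)]
```
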
Eric Berquist
3e68aa0b2f py-pre-commit: add 3.5.0 (#41438) 2023-12-06 09:11:32 -06:00
Gavin John
1da0d0342b Add new versions of py-quast (#40788)
* Add new versions of py-quast

* Update hashes

* Add joblib and simplejson

* Fat finger

* Update dependency type
2023-12-06 09:04:17 -06:00
Lydéric Debusschère
6f7d91aebf py-python-pptx: new package (#41315)
* py-python-pptx: new package

* py-python-pptx: use pil instead of pillow, remove version constraint on python

---------

Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2023-12-06 08:43:51 -06:00
Lydéric Debusschère
071c74d185 py-tldextract: new package (#41330)
* py-tldextract: new package

* py-tldextract: add version 5.1.1

* py-tldextract: fix version constraint on py-setuptools-scm

---------

Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2023-12-06 08:42:48 -06:00
Juan Miguel Carceller
51435d6d69 ruff: add version 0.1.6 (#41355)
* Add a new version of ruff

* Add a comment about where the dependency can be found

---------

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2023-12-06 08:41:20 -06:00
Jordan Galby
8ce110e069 bootstrap: Don't catch Ctrl-C (#41449) 2023-12-06 14:58:14 +01:00
Auriane R
90aee11c33 Add pika 0.21.0 release (#41446) 2023-12-06 03:48:26 -07:00
Harmen Stoppels
4fc73bd7f3 minimal support for freebsd (#41434) 2023-12-06 10:27:22 +00:00
Dom Heinzeller
f7fc4b201d Update py-werkzeug version dependency for py-graphene-tornado@2.6.1 (#41426)
* Update py-werkzeug version dependency for py-graphene-tornado@2.6.1

* Add note on diverging version requirements for py-werkzeug in py-graphene-tornado
2023-12-06 10:28:58 +01:00
Jim Edwards
84999b6996 mpiserial: rework installation (#40762) 2023-12-06 09:39:08 +01:00
Harmen Stoppels
b0f193071d bootstrap status: no bash (#41431) 2023-12-06 09:20:59 +01:00
dependabot[bot]
d1c3374ccb build(deps): bump actions/setup-python from 4.7.1 to 4.8.0 (#41441)
Bumps [actions/setup-python](https://github.com/actions/setup-python) from 4.7.1 to 4.8.0.
- [Release notes](https://github.com/actions/setup-python/releases)
- [Commits](65d7f2d534...b64ffcaf5b)

---
updated-dependencies:
- dependency-name: actions/setup-python
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-12-06 09:20:14 +01:00
Richard Berger
cba8ba0466 legion: correct cuda dependency for cr version (#41119)
* legion: correct cuda dependency for cr version
* Update var/spack/repos/builtin/packages/legion/package.py

---------

Co-authored-by: Davis Herring <herring@lanl.gov>
2023-12-05 18:36:50 -08:00
Wouter Deconinck
d50f8d7b19 root: sha256 change on latest versions (#41401)
* root: sha256 change on latest version
* root: sha256 change on 6.28.10
* root: replace 6.26.12 by 6.26.14
* root: hash for 6.26.14
2023-12-05 18:20:27 -08:00
kjrstory
969fbbfb5a foam-extend: add v5.0 (#40480)
* foam-extend: add new version
* deprecated versions
2023-12-05 18:11:59 -08:00
Robert Cohn
1cd5397b12 [intel-parallel-studio] Deprecate entire package (#41430) 2023-12-05 17:56:49 -08:00
psakievich
1829dbd7b6 CDash: Spack dumps stage errors to configure phase (#41436) 2023-12-05 22:05:39 +00:00
Philippe Virouleau
9e3b231e6f julia: fix LLVM paches hashes (#41410) 2023-12-05 22:53:40 +01:00
Alec Scott
5911a677d4 py-gidgethub: add new package (#41286)
* py-gidgethub: add new package

* Add main branch version and scope flit/flit-core dependency

* Update var/spack/repos/builtin/packages/py-gidgethub/package.py

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>

* Add optional dependencies as variants of package

* Add git url for main version

* Fix variant and dependency ordering

---------

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>
2023-12-05 21:26:42 +01:00
Alec Scott
bb60bb4f7a py-gidgetlab: add new package (#41338)
* gidgetlab: add new package

* Convert both cachetools and aiohttp to optional deps with variants

* Fix forgotten variant conditional on cachetools dependency

* Add git url and main version for dev workflows

* Fix variant and dependency ordering

* Remove cachetools variant and merge dependency with aiohttp variant
2023-12-05 21:24:38 +01:00
Victoria Cherkas
ddec75315e Add maintainers to fdb, eckit, ecbuild, metkit, eccodes (#41433)
* Add maintainers to fdb
* Add maintainers to eckit
* Add maintainers to metkit
* Add maintainers to eccodes
* Add maintainers to ecbuild
* Add climbfuji to eccodes maintainers
2023-12-05 13:18:41 -07:00
Veselin Dobrev
8bcb1f8766 MFEM: Add a patch to fix the +gslib+shared+miniapps build (#41399)
* [mfem] Add a patch to resolve issue #41382

* [mfem] Spack CI wants "patch URL must end with ?full_index=1"
2023-12-05 10:26:48 -08:00
Matthew Thompson
5a0ac4ba94 Update versions of GFE packages (#41429) 2023-12-05 10:24:51 -07:00
Lydéric Debusschère
673689d53b py-sphinx: add versions 7.2.4, 7.2.5 and 7.2.6 (#41411)
Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2023-12-05 15:07:22 +01:00
Brian Vanderwende
ace8e17f02 libdap4: add explicit RPC dependency (#40019) 2023-12-05 13:11:19 +01:00
Billae
eb9c63541a documentation: add instructions on how to use external opengl (#40987) 2023-12-05 12:59:41 +01:00
Kensuke WATANABE
b9f4d9f6fc sirius: fix build error with Fujitsu compiler (#41101) 2023-12-05 12:54:03 +01:00
Lydéric Debusschère
eda3522ce8 py-sphinxcontrib-moderncmakedomain: add new package (#41331)
Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2023-12-05 12:47:25 +01:00
Harmen Stoppels
3cefd73fcc spack buildcache check: use same interface as push (#41378) 2023-12-05 12:44:50 +01:00
Alberto Invernizzi
3547bcb517 openfoam-org: fix for being able to build manually checked out repository (#41000)
* grep WM_PROJECT_VERSION from etc/bashrc

This fixes the problem when building from a manually checked out repo
which might have a different version with respect to the one defined in the spack
package (e.g. anything later than 5.0 is known as 5.x by the build
system)

* patch applies to just 5.0, in newer versions it is already addressed

In `5.20171030` there's a commit

c66fba323c

very similar (almost identical) to what the patch `50-etc.patch` does.

So the patch should not be applied to versions other than `5.0`, otherwise it errors.

References:
- https://github.com/OpenFOAM/OpenFOAM-5.x/commits/20171030/etc/bashrc
- 197d9d3bf2/etc/bashrc (L45-L47)
2023-12-05 12:37:57 +01:00
Todd Gamblin
53b528f649 bugfix: sort variants in spack info --variants-by-name (#41389)
This was missed while backporting the new `spack info` command from #40326.

Variants should be sorted by name when invoking `spack info --variants-by-name`.
2023-12-05 12:31:40 +01:00
Mark W. Krentel
798770f9e5 hpctoolkit: add conflict for recent intel-xed (#41413)
Intel made an incompatible change in XED in 2023.08.21 that breaks
hpctoolkit (at run time).  Hpctoolkit develop can adapt (soon will),
but older versions must use xed@:2023.07.09.
2023-12-05 11:17:37 +01:00
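
In recipe terms, the constraint described above might be expressed roughly as follows; the `when` bound is illustrative, not the exact directive added:

```python
from spack.package import *


class HpctoolkitSketch(AutotoolsPackage):
    """Illustrative excerpt: pin affected releases to the older XED encoding."""

    # intel-xed changed incompatibly in 2023.08.21; keep affected hpctoolkit
    # releases on the last compatible xed
    depends_on("intel-xed@:2023.07.09", when="@:2023.08")
```
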
Jim Edwards
4a920243a0 cprnc: update sha256 for github artifacts (#41418) 2023-12-05 11:13:14 +01:00
Ben Wibking
8727195b84 openblas: fix macOS build when using XCode 15 or newer (#41420)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2023-12-05 09:53:54 +01:00
Massimiliano Culpo
456f2ca40f extensions: improve docs, fix unit-tests (#41425) 2023-12-05 09:49:35 +01:00
dependabot[bot]
b4258aaa25 build(deps): bump docker/metadata-action from 5.2.0 to 5.3.0 (#41423)
Bumps [docker/metadata-action](https://github.com/docker/metadata-action) from 5.2.0 to 5.3.0.
- [Release notes](https://github.com/docker/metadata-action/releases)
- [Commits](e6428a5c4e...31cebacef4)

---
updated-dependencies:
- dependency-name: docker/metadata-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-12-05 09:48:29 +01:00
Mitch B
5d9647544a arrow: add versions up to v14.0.1 (#41424) 2023-12-05 09:48:03 +01:00
Brian Van Essen
1fdb6a3e7e Updating the LBANN, Hydrogen, and DiHydrogen recipes (#41390)
* Updating the LBANN, Hydrogen, and DiHydrogen recipes for both new
variants and to make sure that RPATHs are properly set up.

Co-authored-by: bvanessen <bvanessen@users.noreply.github.com>
2023-12-05 09:31:51 +01:00
Robert Cohn
7c77b3a4b2 [intel] deprecate all versions (#41412)
Deprecating intel package, which contains intel classic compilers. This package has not been updated in 3 years. Please use intel-oneapi-compilers instead.
2023-12-04 21:52:55 -07:00
Ye Luo
eb4b8292b6 A few changes to quantum-espresso (#41225)
* gipaw.x installed by cmake if version >= 5c4a4ce.
  gipaw.x will only be installed with cmake if the qe-gipaw version
  is >= 5c4a4ce. Currently, QE source uses the older f5823521 one.
  Here is a patch to the submodule_commit_hash_records to use a newer
  qe-gipaw version.
* Update package.py
* Delete var/spack/repos/builtin/packages/quantum-espresso/gipaw-eccee44.patch
* Update package.py
* Restoring gipaw-eccee44 patch
* Update package.py
* Add fox variant in quantum-espresso
* Fix an issue introduced in #36484. Patches are 7.1 only.
* Change plugin handling.
* formatting.
* Typo correction
* Refine conflict

---------

Co-authored-by: S. Alexis Paz <alexis.paz@gmail.com>
2023-12-04 11:37:05 -08:00
John Biddiscombe
16bc58ea49 Add EGL support to ParaView and Glew (#39800)
* Add EGL support to ParaView and Glew

add a package for egl that provides GL but also adds
EGL libs and headers for projects that need them

Fix a header problem with the opengl package

Format files using black

* better description for egl variant description

Co-authored-by: Vicente Bolea <vicente.bolea@gmail.com>

* better check/setup of non egl variant dependencies

Co-authored-by: Vicente Bolea <vicente.bolea@gmail.com>

* Add biddisco as maintainer

* Fix unused var style warning

* Add egl conflicts for other gl providers

---------

Co-authored-by: Vicente Bolea <vicente.bolea@gmail.com>
2023-12-04 10:53:06 -06:00
Alec Scott
6028ce8bc1 direnv: add v2.33.0 (#41397) 2023-12-04 14:12:13 +01:00
Harmen Stoppels
349e7e4c37 zlib-ng: add v2.1.5 (#41402) 2023-12-04 12:44:10 +01:00
Harmen Stoppels
a982118c1f ci.py: fix missing import (#41391) 2023-12-04 12:14:59 +01:00
Adam J. Stewart
40d12ed7e2 PythonPackage: type hints (#40539)
* PythonPackage: nested config_settings, type hints

* No need to quote PythonPackage

* Use narrower types for now until needed
2023-12-04 10:53:53 +00:00
James Smillie
9e0720207a Windows: fix kit base path and reference to windows registry key (#41388)
* Proper handling of argument passed as semicolon-separated str
* Fix reference to windows registry key in win-wdk
2023-12-03 15:35:13 -08:00
Jaelyn Litzinger
88e738c343 Allow exago to use hiop@develop past v1.0.1 (#41384) 2023-12-02 14:06:22 -06:00
Cameron Rutherford
8bbc2e2ade resolve: add package with cuda and rocm support (#40871) 2023-12-01 20:49:11 -06:00
Julien Cortial
1509e54435 Add MUMPS versions 5.6.0, 5.6.1 and 5.6.2 (#41386)
The patch for version 5.5.x still applies to 5.6.x.
2023-12-01 18:48:00 -07:00
Dom Heinzeller
ca164d6619 Fix curl install using Intel compilers (#41380)
When using Intel to build curl, add 'CFLAGS=-we147' to the configure
args to fix error 'compiler does not halt on function prototype
mismatch'
2023-12-01 17:24:05 -07:00
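
A minimal sketch of the change as described; the real recipe may splice the flag in differently:

```python
from spack.package import *


class CurlSketch(AutotoolsPackage):
    """Illustrative excerpt: make the Intel compiler halt on prototype mismatches."""

    def configure_args(self):
        args = []
        if self.spec.satisfies("%intel"):
            # configure's detection logic needs the compiler to error out here
            args.append("CFLAGS=-we147")
        return args
```
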
Felix Werner
a632576231 Add XCDF. (#41379) 2023-12-01 14:56:18 -08:00
Felix Werner
70b16cfb59 Add PhotoSpline. (#41374) 2023-12-01 14:44:53 -08:00
Brian Vanderwende
1d89d4dc13 MET fixes for 11.1 and HDF4 support (#41372)
* MET fixes for 11.1 and HDF4 support
* Fix zlib reference in MET
2023-12-01 14:42:36 -08:00
Dewi
bc8a0f56ed removed cmake build version pointing to fork (#41368) 2023-12-01 14:39:45 -08:00
Jack Morrison
4e09396f8a Libfabric: Introduce OPX provider conflict for v1.20.0 (#41343)
* Libfabric: Introduce OPX provider conflict for v1.20.0
* Add message to libfabric 1.20.0 opx provider conflict
2023-12-01 14:35:54 -08:00
Weiqun Zhang
0d488c6e4f amrex: add v23.12 (#41385) 2023-12-01 22:45:58 +01:00
Erik Heeren
50e76bc3d3 py-pyglet: version bump (#41082)
* py-pyglet: version bump

* py-pyglet: use zip instead of whl, update dependencies

* py-pyglet: 2.0.9 and 2.0.10 zips should be downloaded from github

* py-pyglet: style

* py-pyglet: use virtual packages in dependencies

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>

* py-pyglet: doesn't depend on py-future any more

* py-pyglet: remove glx dependency

* py-pyglet: back to the pypi zipfiles with patch instead

---------

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>
2023-12-01 21:44:30 +01:00
Dave Keeshan
a543fd79f1 verible: add new package (#41270)
* Add initial version of verible to spack
* Update to use an explicit URL path for each release, as the release tag includes extra data; also add the bottom-most supported gcc, gcc9
2023-12-01 10:48:52 -08:00
Adam J. Stewart
ab50aa61db py-keras: add v3.0.0 (#41356)
* py-keras: add v3.0.0

* Older keras actually requires protobuf

* Correct url_for_version

* Capitalization is important

* Keep pil and pydot deps
2023-12-01 18:23:58 +00:00
Richard Berger
d06a102e69 Various FleCSI updates (#41068)
* flecsi: remove deprecated versions
* flecsi: add explicit conflict for backend=hpx +hdf5
* flecsi: propagate +openmp to kokkos and legion
* flecsi: remove doc variant prior to @2.2
   It wouldn't do anything meaningful and won't install the documentation.

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2023-12-01 09:41:33 -08:00
Rohit Goswami
b0f0d2f1fb dftbplus: Update and add upstream maintainer (#33243)
* dftbp: Update and add upstream maintainer
* dftbp: Trust in the hybrid cmake builds
* dftbp: Handle scalapack better
* dftbp: Refactor as per review
* dftbp: Build shared for python
* dftbp: Address review comments
* dftbp: Add another maintainer
* dftbp: Fix typo
* dftbp: Arpack for serial builds only
* dftbp: Update option docs
* dftbp: Update documentation for elsi
* dftbp: Add comment for context
* dftbp: Tighter bounds on python
* dftbp: Add negf only when shared
* dftbp: Fix typo
* dftbp: Update sha256
* dftbp: Add when directive for cmake and ninja
* dftbp: Enforce comment

---------

Co-authored-by: Tamara Dahlgren <dahlgren1@llnl.gov>
Co-authored-by: awvwgk <awvwgk@users.noreply.github.com>
Co-authored-by: iamashwin99 <iamashwin99@users.noreply.github.com>
Co-authored-by: Ashwin Kumar Karnad <46030335+iamashwin99@users.noreply.github.com>
Co-authored-by: Sebastian Ehlert <28669218+awvwgk@users.noreply.github.com>
Co-authored-by: tldahlgren <tldahlgren@users.noreply.github.com>
2023-12-01 09:34:11 -08:00
Victoria Cherkas
6029b600f0 eccodes: add v2.32.0, v2.31.0 (#40770)
* eccodes new versions and dependencies
* Suggested changes for multiple variant defaults
* Update var/spack/repos/builtin/packages/eccodes/package.py

---------

Co-authored-by: Sergey Kosukhin <skosukhin@gmail.com>
2023-12-01 09:25:31 -08:00
dependabot[bot]
e6107e336c build(deps): bump docutils from 0.18.1 to 0.20.1 in /lib/spack/docs (#38174)
Bumps [docutils](https://docutils.sourceforge.io/) from 0.18.1 to 0.20.1.

---
updated-dependencies:
- dependency-name: docutils
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-12-01 17:16:34 +01:00
Lydéric Debusschère
9ef57c3c86 py-cma: new package (#41326)
* py-pycma: new package

* rename py-pycma in py-cma; py-cma: use pypi instead of github sources

---------

Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2023-12-01 08:54:02 -06:00
Lydéric Debusschère
3b021bb4ac py-mahotas: new package (#41329)
* py-mahotas: new package

* py-mahotas: relax version constraint on numpy

---------

Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2023-12-01 08:52:37 -06:00
Richard Berger
42bac83c2e LAMMPS updates (#40879)
* lammps: add new stable version 20230802.1

* lammps: add missing potential download for +mesont

* lammps: fix python package install

* Update var/spack/repos/builtin/packages/lammps/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* lammps: py-numpy and py-mpi4py should be build and run deps

* lammps: add new 20231121 release

- MPIIO package has been removed -> disable mpiio variant
- LAMMPS_EXCEPTIONS is now always on -> disable exceptions variant
- CMake 3.16+ is now required
- Kokkos 4.1.0 is now supported

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-12-01 07:13:56 -07:00
Lydéric Debusschère
7e38e9e515 py-xmlplain: new package (#41324)
Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2023-12-01 08:02:24 -06:00
Lydéric Debusschère
cf5ffedc23 py-requests-file: new package (#41328)
Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2023-12-01 08:00:23 -06:00
Lydéric Debusschère
3a9aea753d [add] py-autodocsumm: new package (#41309)
Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2023-12-01 07:53:57 -06:00
Lydéric Debusschère
c61da8381c py-sphinx-jinja2-compat: new package (#41310)
* [add] py-sphinx-jinja2-compat: new package

* py-whey-pth: new package, dependence of py-sphinx-jinja2-compat

---------

Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2023-12-01 07:53:02 -06:00
Lydéric Debusschère
8de814eddf py-sphinx-removed-in: new package (#41325)
* py-sphinx-removed-in: new package

* py-sphinx-removed-in: fix dependence

---------

Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2023-12-01 07:43:40 -06:00
Satish Balay
c9341a2532 petsc, py-petsc4py: add v3.20.2 (#41366) 2023-12-01 07:42:48 -06:00
Manuela Kuhn
8b202b3fb2 py-urllib3: add 2.1.0 and 2.0.7 (#41358) 2023-12-01 07:39:18 -06:00
Dom Heinzeller
2ecc260e0e Update py-cylc-flow (add version 8.2.3) (#41209)
* Add missing runtime dependency on py-colorama to py-ansimarkup

* Add py-metomi-isodatetime@3.1.0

* New package py-graphql-relay

* Update py-cylc-flow, add version 8.2.3

* Fix merge conflict

* Revert mistake in var/spack/repos/builtin/packages/py-cylc-flow/package.py

* Update py-metomi-isodatetime dependencies for py-cylc-flow

* Add 'climbfuji' to list of maintainers for py-cylc-flow
2023-12-01 07:33:34 -06:00
Christopher Christofi
6130fe8f57 py-beartype: new package with versions 0.15.0 and 0.16.2 (#39759)
* py-beartype: new package with version 0.15.0

* Update var/spack/repos/builtin/packages/py-beartype/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-beartype: depend on python 3.8 or higher

* py-beartype: add new version 0.16.2

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-12-01 07:29:30 -06:00
Adam J. Stewart
d3aa7a620e py-mpi4py: fix build with Apple Clang (#41362)
* py-mpi4py: fix build with Apple Clang

* [@spackbot] updating style on behalf of adamjstewart

---------

Co-authored-by: adamjstewart <adamjstewart@users.noreply.github.com>
2023-12-01 05:33:14 -06:00
dependabot[bot]
2794e14870 build(deps): bump pygments from 2.17.1 to 2.17.2 in /lib/spack/docs (#41212)
Bumps [pygments](https://github.com/pygments/pygments) from 2.17.1 to 2.17.2.
- [Release notes](https://github.com/pygments/pygments/releases)
- [Changelog](https://github.com/pygments/pygments/blob/2.17.2/CHANGES)
- [Commits](https://github.com/pygments/pygments/compare/2.17.1...2.17.2)

---
updated-dependencies:
- dependency-name: pygments
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-12-01 12:29:42 +01:00
dependabot[bot]
cc1e990c7e build(deps): bump sphinx-rtd-theme in /lib/spack/docs (#41305)
Bumps [sphinx-rtd-theme](https://github.com/readthedocs/sphinx_rtd_theme) from 1.3.0 to 2.0.0.
- [Changelog](https://github.com/readthedocs/sphinx_rtd_theme/blob/master/docs/changelog.rst)
- [Commits](https://github.com/readthedocs/sphinx_rtd_theme/compare/1.3.0...2.0.0)

---
updated-dependencies:
- dependency-name: sphinx-rtd-theme
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-12-01 12:29:22 +01:00
dependabot[bot]
59e6b0b100 build(deps): bump mypy from 1.7.0 to 1.7.1 in /lib/spack/docs (#41243)
Bumps [mypy](https://github.com/python/mypy) from 1.7.0 to 1.7.1.
- [Changelog](https://github.com/python/mypy/blob/master/CHANGELOG.md)
- [Commits](https://github.com/python/mypy/compare/v1.7.0...v1.7.1)

---
updated-dependencies:
- dependency-name: mypy
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-12-01 09:22:23 +01:00
dependabot[bot]
ec53d02814 build(deps): bump docker/metadata-action from 5.0.0 to 5.2.0 (#41371)
Bumps [docker/metadata-action](https://github.com/docker/metadata-action) from 5.0.0 to 5.2.0.
- [Release notes](https://github.com/docker/metadata-action/releases)
- [Commits](96383f4557...e6428a5c4e)

---
updated-dependencies:
- dependency-name: docker/metadata-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-12-01 09:21:56 +01:00
dependabot[bot]
389c77cf83 build(deps): bump mypy from 1.6.1 to 1.7.1 in /.github/workflows/style (#41242)
Bumps [mypy](https://github.com/python/mypy) from 1.6.1 to 1.7.1.
- [Changelog](https://github.com/python/mypy/blob/master/CHANGELOG.md)
- [Commits](https://github.com/python/mypy/compare/v1.6.1...v1.7.1)

---
updated-dependencies:
- dependency-name: mypy
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-12-01 09:20:56 +01:00
Vicente Bolea
17c87b4c29 vtk-m: bump vtk-m 2.1.0 (#41351)
* vtk-m: bump vtk-m 2.1.0

* Update package.py

* Update package.py
2023-11-30 20:18:15 -08:00
Thomas Helfer
91453c5ba0 Add support for new versions of TFEL and MGIS (#41357)
* Add new versions to TFEL and MGIS
2023-11-30 19:47:57 -07:00
Robert Cohn
a587a10c86 [intel-mpi] deprecation (#41322) 2023-11-30 19:52:42 -05:00
Derek Ryan Strong
cfe77fcd90 r: add license and missing versions and fix rmath build directory (#41260)
* Add R license and missing versions
* Fix rmath build directory
2023-11-30 16:18:02 -08:00
Matthieu Dorier
6cf36a1817 mochi-margo: added version 0.15.0 (#41319) 2023-11-30 15:32:24 -08:00
Matthieu Dorier
ee40cfa830 mochi-thallium: adding a few new versions (#41323)
* mochi-thallium: added a few newer versions
* mochi-thallium: added constraint on the version of margo required
* removed thallium 0.12 which needs updating
* mochi-thallium: fixed hash for version 0.12.0
* removed thallium 0.12 which needs updating (again)
2023-11-30 15:19:32 -08:00
Harmen Stoppels
04f64d4ac6 tests: use temporary_store (#41369) 2023-11-30 15:11:35 -08:00
jmlapre
779fef7d41 trilinos: new pytrilinos2 variant (#40615) 2023-11-30 14:08:58 -07:00
Richard Berger
5be3ca396b Singularity-EOS update (#41333)
* singularity-eos: deprecate v1.6 versions and remove unused code
* singularity-eos: add v1.8.0
2023-11-30 12:55:25 -08:00
Dave Keeshan
e420441bc2 Fix flex for build and link, limit gcc to 7 or greater (#41335) 2023-11-30 12:51:48 -08:00
Dave Keeshan
a039dc16fa Move compiler renaming to filter_compiler_wrappers (#41336) 2023-11-30 12:48:59 -08:00
Christopher Christofi
b31c89b110 perl-carp-assert: add new package with version 0.22 (#41347)
* perl-carp-assert: add new package with version 0.22
* fix style
2023-11-30 12:46:24 -08:00
Christopher Christofi
bc4f3d6cbd perl-parse-yapp: add new package with version 1.21 (#41348) 2023-11-30 12:37:26 -08:00
Matthias Wolf
40209506b7 acfl: truncate version number (#41354)
When using `spack external find acfl`, we get the full version string
with 4 components in `packages.yaml`.  This PR truncates the version
number when finding the `armpl` component to be able to run without
intervention.
2023-11-30 12:32:10 -08:00
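
A hedged sketch of the truncation described above; the helper name, the example version value, and the component count are assumptions:

```python
def truncate_external_version(full_version: str, components: int = 3) -> str:
    """Drop trailing components from a detected external version string."""
    return ".".join(full_version.split(".")[:components])


# A 4-component version as written to packages.yaml by `spack external find acfl`
# (the example value is made up), reduced to what the armpl lookup expects:
print(truncate_external_version("23.04.1.0"))  # -> '23.04.1'
```
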
Massimiliano Culpo
6ff07c7753 Fix issue with latest mypy (#41363) 2023-11-30 20:31:03 +01:00
Adam J. Stewart
d874c6d79c py-tensorflow-estimator: add new versions (#41364) 2023-11-30 10:54:11 -08:00
Adam J. Stewart
927e739e0a GDAL: add v3.8.1 (#41365) 2023-11-30 10:51:42 -08:00
Tom Scogland
dd607d11d5 developer tools stack try 2 (#40921)
* developer tools stack try 2

This version is actually in use locally and has largely stabilized, at
least on x86.  Some packages are still a challenge on ppc64le, but maybe
worth keeping this working as a set.

* add packages, try to get container with newer gcc

* remove reuse: true

* try to get cmake to build on medium, 25 minutes is too long

* add lsd package and add to dev tools stack

* clean up fzf dependency and sorting

* Update share/spack/gitlab/cloud_pipelines/stacks/developer_tools/spack.yaml

* cuda: add 12.3.0 (#40827)

* Switch to dashes

* yet more underscores

---------

Co-authored-by: Paul R. C. Kent <kentpr@ornl.gov>
2023-11-30 18:32:21 +00:00
Harmen Stoppels
d436e97fc6 reuse concretization: allow externals from remote when locally configured (#35975)
This looks to me like the best compromise regarding externals in a
build cache. I wouldn't want `spack install` on my machine to install
specs that were marked external on another. At the same time there are
centers that control the target systems on which spack is used, and
would want to use externals in buildcaches.

As a solution, reuse concretization will now consider those externals
used in buildcaches that match a locally configured external in
packages.yaml.

So for example person A installs and pushes specs with this config:

```yaml
packages:
  ncurses:
    externals:
    - spec: ncurses@6.0.12345 +feature
      prefix: /usr
```

and person B concretizes and installs using that buildcache with the
following config:

```yaml
packages:
  ncurses:
    externals:
    - spec: ncurses@6
      prefix: /usr
```

the spec will be reused (or rather, will be considered for reuse...)
2023-11-30 09:38:05 -08:00
Harmen Stoppels
f3983d60c2 tests: add missing mutable db (#41359) 2023-11-30 18:37:35 +01:00
Harmen Stoppels
40e705d39e tests: fix side effects of default_config fixture (#41361)
* tests: default_config drop scope

* use default_config elsewhere

* use parse_install_tree for missing defaults in default config
2023-11-30 18:19:10 +01:00
Harmen Stoppels
d92457467a test_variant_propagation_with_unify_false: missing fixture (#41345) 2023-11-30 17:27:53 +01:00
Harmen Stoppels
4c2734fe14 --scope: lazy defaults (#41353) 2023-11-30 15:35:21 +01:00
Taillefumier Mathieu
34d791189d Update SIRIUS version for CP2K master (#41264)
* Update SIRIUS version for CP2K master

* Update var/spack/repos/builtin/packages/cp2k/package.py

Co-authored-by: Rocco Meli <r.meli@bluemail.ch>

---------

Co-authored-by: Rocco Meli <r.meli@bluemail.ch>
2023-11-30 04:08:22 -07:00
Vinícius
eec9eced1b simgrid: add v3.34 and v3.35 (#41340) 2023-11-30 11:23:57 +01:00
Christopher Christofi
3bc8a7aa5f use double quotes where spack style finds errors (#41349) 2023-11-30 09:46:02 +01:00
Massimiliano Culpo
3b045c289d Fix a typo in an integrity constraint (#41334) 2023-11-30 09:44:51 +01:00
Harmen Stoppels
4b93c57d44 argparse: make scope choices lazy s.t. validation in tests works (#41344) 2023-11-30 08:37:11 +01:00
Harmen Stoppels
377e7de0d2 tests: fix issue with os.environ binding (#41342) 2023-11-30 07:18:41 +01:00
Greg Sjaardema
0a9c84dd25 SEACAS: new release (#41273)
Replace the last release due to build issues on Windows; the tag was applied incorrectly.
2023-11-29 21:38:05 -07:00
Chris White
2ececcd03e MFEM: add mpi link dir (#41337)
Also fix netcdf-c zlib reference
2023-11-29 21:32:50 -07:00
Adam J. Stewart
f4f67adf49 py-numba: add v0.58.1 (#41262)
* py-numba: add v0.58.1

* Passing tests
2023-11-30 00:17:34 +01:00
Adam J. Stewart
220898b4de py-pygeos: add v0.14 (#41248)
* py-pygeos: add v0.14

* Python is also a link dep
2023-11-30 00:15:07 +01:00
Luc Berger
450f938056 kokkos: add v4.2.00 (#41203)
* Kokkos: adding version 4.2.00 to the package
* Kokkos: adding AMD GPU arch
* kokkos@4.2.00 +sycl: patch numeric traits unit test

---------

Co-authored-by: eugeneswalker <eugenesunsetwalker@gmail.com>
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2023-11-29 22:47:15 +00:00
Massimiliano Culpo
889b729e52 Refactor a test to not use the "working_env" fixture (#41308)
Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2023-11-29 22:14:57 +01:00
Sergio Alexis Paz
3987604b89 Add gipaw when building quantum-espresso with cmake (#41142)
* gipaw.x installed by cmake if version >= 5c4a4ce.

gipaw.x will only be installed with cmake if the qe-gipaw version
is >= 5c4a4ce. Currently, QE source uses the older f5823521 one.
Here is a patch to the submodule_commit_hash_records to use a newer
qe-gipaw version.
2023-11-29 12:16:27 -08:00
Andrey Perestoronin
b6f8cb821c intel-oneapi-ccl 2021.11.1: added new version to packages (#41300)
* added new packages

* align release with ccl patch

* revert version on compiler classics
2023-11-29 10:09:32 -07:00
Mikael Simberg
72216b503f dla-future: Add conflicts for compilation issues pre-0.3.1 (#41317)
* dla-future: Add conflict for pedantic warnings turning into errors

* dla-future: Add conflicts for nvcc/fmt/Umpire compilation issues
2023-11-29 15:10:27 +01:00
Harmen Stoppels
c06f353f55 Simplify _create_mock_configuration_scopes (#41318) 2023-11-29 14:06:41 +01:00
Mikael Simberg
367ca3f0ec fmt: Add git attribute (#41293) 2023-11-29 12:50:28 +01:00
Harmen Stoppels
2c4bc287b8 cuda: fix compiler conflicts (#41304) 2023-11-29 12:14:39 +01:00
Harmen Stoppels
fcb3d62093 unit tests: use --verbose to see order on macos (#41314) 2023-11-29 11:29:17 +01:00
John W. Parent
306377684b CMake: add v3.27.9 (#41301) 2023-11-29 02:19:29 -07:00
Massimiliano Culpo
29b75a7ace Fix an issue with deconcretization/reconcretization of environments (#41294) 2023-11-29 09:09:16 +01:00
Paul R. C. Kent
4b41b11c30 cuda: add 12.3.0 (#40827) 2023-11-28 13:28:28 -08:00
afzpatel
92e0d42b64 update hipblas, rocalution, rocsolver, rocsparse to new syntax (#40135)
* initial commit to update hipblas, rocalution, rocsolver, rocsparse to new syntax
* add rocblas test changes and fixes for hipblas and rocsolver tests
* fix styling
* remove updates for rocblas
2023-11-28 12:49:07 -08:00
afzpatel
1ebd37d20c fix hip tests and bump hip-examples to 5.6.1 (#40928)
* Initial commit to fix hip tests and bump hip-examples to 5.6.1
* fix styling
* add installation of hip samples to share folder
2023-11-28 12:17:10 -08:00
Adam J. Stewart
b719c905f1 apple-libuuid: update installation directory (#40416)
* apple-libuuid: update installation directory

Copy design of Apple GL
2023-11-28 10:13:55 -08:00
eugeneswalker
430b2dff5c e4s ci: disable gpu test stack (#41296) 2023-11-28 18:02:00 +01:00
Tom Payerle
ef8e6a969c vtk: Restrict application of patch xdmf2-hdf51.13.2.patch (#40266)
The changes in patch xdmf2-hdf51.13.2.patch have effectively
been added to vtk@9.2.3 (commit e81a2fe)

So restrict application of patch fo @9:9.2
2023-11-28 11:27:58 -05:00
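
In recipe terms, the restriction amounts to something like the following sketch; the version range is copied from the message above and the actual directive may differ:

```python
from spack.package import *


class VtkSketch(CMakePackage):
    """Illustrative excerpt: only patch the versions that still need the fix."""

    # The change landed upstream in vtk@9.2.3, so restrict the patch to the
    # range stated in the message above
    patch("xdmf2-hdf51.13.2.patch", when="@9:9.2")
```
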
Massimiliano Culpo
0e65e84768 ASP-based solver: use a unique ID counter (#41290)
* solver: use a unique counter for condition, triggers and effects

* Do not reset counters when re-running setup

  What we need is just a unique ID; it doesn't need
  to start from zero every time.
2023-11-28 16:28:54 +01:00
Seth R. Johnson
e0da7154ad celeritas: new version 0.4.0 (#41288) 2023-11-28 14:44:15 +00:00
Juan Miguel Carceller
6f08daf670 root: add a webgui patch (#41289)
* root: add a webgui patch

* Update var/spack/repos/builtin/packages/root/package.py

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>

* Add also the versions that don't need the webgui patch

* Fix hash

---------

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2023-11-28 07:33:48 -07:00
Tom Scogland
c2d29ca38c libvips requires pkg-config to find glib (#41184) 2023-11-28 14:42:16 +01:00
Raffaele Solcà
f037ef7451 Fix elpa flags (missing optimization) (#41252)
Setting CFLAGS/FCFLAGS overrides the default optimization flags.

This commit brings them back.
2023-11-28 11:31:30 +01:00
dependabot[bot]
f84557a81b build(deps): bump vermin from 1.5.2 to 1.6.0 in /.github/workflows/style (#41285)
Bumps [vermin](https://github.com/netromdk/vermin) from 1.5.2 to 1.6.0.
- [Release notes](https://github.com/netromdk/vermin/releases)
- [Commits](https://github.com/netromdk/vermin/compare/v1.5.2...v1.6.0)

---
updated-dependencies:
- dependency-name: vermin
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-11-28 09:38:38 +00:00
Alec Scott
18efd808da GoPackage: add new build system for Go packages (#41164)
Co-authored-by: Tom Scogland <scogland1@llnl.gov>
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2023-11-28 10:33:46 +01:00
Wouter Deconinck
5299b84319 qt-*: new versions 6.6.1 (#41281)
* qt-*: new versions 6.6.1

* qt-quick3d: fixup
2023-11-28 02:18:54 -07:00
Adam J. Stewart
70fb0b35e5 py-transformers: add v4.35.2 (#41266) 2023-11-28 10:17:52 +01:00
Alec Scott
4f7f3cbbdf smee-client: add new package (#41280) 2023-11-28 08:34:49 +01:00
Jose E. Roman
4205ac74e8 slepc: add v3.20.1 (#41274) 2023-11-27 20:53:45 -06:00
Jack Morrison
72ed14e4a9 libfabric: Add version 1.20.0 (#41277) 2023-11-27 19:04:44 -07:00
Dave Keeshan
ed54359454 Move compiler renaming to filter_compiler_wrappers (#41275) 2023-11-27 18:16:03 -07:00
Dave Keeshan
ea610d3fe2 iverilog-vpi: filter compiler wrappers from a few files (#41244) 2023-11-27 17:44:44 -07:00
John W. Parent
cd33becebc CMake: add version 3.27.8 (#41094) 2023-11-27 15:09:44 -07:00
Jim Edwards
8ccfe9f710 cprnc: add new package (#41237) 2023-11-27 14:02:04 -07:00
Eric Berquist
abc294e3a2 Update SST packages to 13.1.0 (#41220)
* Update SST packages to 13.1.0

* Allow mismatch between sst-core dependency and current macro version

* SST does not work with Python 3.12 yet

* Sanity check install binaries for sst-core

* Elements compiles with OTF2 but not OTF

* Version bounds in specs are inclusive

* Remove not-strictly-necessary file check
2023-11-27 14:55:22 -06:00
Alec Scott
c482534c1d CargoPackage: add new build system for Cargo packages (#41192)
Co-authored-by: Tom Scogland <scogland1@llnl.gov>
2023-11-27 20:15:16 +00:00
Robert Cohn
fbec91e491 handle use of an unconfigured compiler (#41213) 2023-11-27 11:57:10 -07:00
Andrey Perestoronin
3d744e7c95 intel-oneapi 2024.0.0: added new version to packages (#41135)
* oneapi 2024.0.0 release

* oneapi v2 directory support and some cleanups

* sycl abi change requires 2024 compilers for packages that use sycl

---------

Co-authored-by: Robert Cohn <robert.s.cohn@intel.com>
2023-11-27 13:13:04 -05:00
Dave Keeshan
b4bafbbf7e verilator: add v5.018 (#41256)
Add all versions since 4.108, deprecate previous versions (issues with flex),
and switch from veripool to GitHub for releases
2023-11-27 10:24:49 -07:00
Juan Miguel Carceller
bd3a1d28bf Simplify a few CMakePackages by removing redundant directives (#41163)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2023-11-27 17:43:39 +01:00
Harmen Stoppels
e0ef78b26e docs: refer to oci build cache from containers.rst (#41269) 2023-11-27 16:36:00 +00:00
Satish Balay
d768e6ea5c Balay/xsdk 1.0.0 updates (#41180)
* superlu-dist: add v8.2.1 for xsdk

* heffte, phist build fixes on tioga

* exago: build fixes on polaris

---------

Co-authored-by: Veselin Dobrev <dobrev@llnl.gov>
2023-11-27 09:57:41 -06:00
Harmen Stoppels
8d0e0d5c77 tests: fix more cases of env variables (#41226) 2023-11-27 16:37:31 +01:00
Anton Kozhevnikov
13b711f620 [sirius] update spack recipe; add v7.5.0 (#41233)
* update spack recipe

* [@spackbot] updating style on behalf of toxa81

* change  from @develop to @7.5.0

* return dependency on boost_filesystem

* return dependency on boost_filesystem

* remove boost filesystem as agreed by @RMeli and  @simonpintarelli

---------

Co-authored-by: toxa81 <toxa81@users.noreply.github.com>
2023-11-27 16:28:19 +01:00
Nisarg Patel
d76a774957 libpsm3: add v11.5.1.1 (#41231) 2023-11-27 15:52:19 +01:00
Brian Vanderwende
ee8e40003b vapor: add new recipe (#40707) 2023-11-27 15:48:04 +01:00
Adam J. Stewart
dc715d9840 py-llvmlite: add new versions (#41247) 2023-11-27 15:43:49 +01:00
Derek Ryan Strong
89173b6d24 fpart: add license and variants (#41257) 2023-11-27 15:25:07 +01:00
stepanvanecek
848d270548 sys-sage: update repo url, rework recipe (#41005)
Co-authored-by: Stepan Vanecek <stepan.vanecek@tum.de>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2023-11-27 15:21:57 +01:00
fpruvost
ea347b6468 pastix: add v6.3.1 (#41265) 2023-11-27 15:03:04 +01:00
Dave Keeshan
b1b4ef6d1b Add patch so that ccache can compile with the standard gcc@12 version (#41249) 2023-11-27 14:33:30 +01:00
Thomas Madlener
c564b2d969 googletest: Add 1.13.0 and 1.14.0 tags (#41253)
* Add latest tags for googletest

* Implement proper url_for_version

* Fix hashes for older versions
2023-11-27 07:12:49 -05:00
Massimiliano Culpo
343517e794 Improve semantic for packages:all:require (#41239)
An `all` requirement is emitted for a package if all variants referenced are defined by it. Otherwise, the constraint is rejected.
2023-11-27 12:41:16 +01:00
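
A small sketch of the stated rule, purely illustrative (not Spack's implementation): the requirement applies only when the variants it references are a subset of the package's own variants.

```python
def all_requirement_applies(referenced_variants: set, package_variants: set) -> bool:
    """An `all:` requirement is emitted for a package only if every variant it
    references is defined by that package; otherwise it is rejected."""
    return referenced_variants <= package_variants


print(all_requirement_applies({"shared"}, {"shared", "pic"}))  # True: requirement emitted
print(all_requirement_applies({"cuda"}, {"shared", "pic"}))    # False: rejected for this package
```
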
Harmen Stoppels
6fff0d4aed libxsmm: relax arch requirement (#41193)
* libxsmm: relax arch requirement

* libxsmm: add a fixed commit from main
2023-11-27 11:55:33 +01:00
Alberto Invernizzi
34bce3f490 Remove old conflict with gcc@10.3.0 (#41254)
The conflict is captured in CudaPackage and redundant in umpire
2023-11-27 09:29:19 +01:00
Rocco Meli
7cb70e3258 force cp2k cuda/rocm variant on elpa (#41241) 2023-11-27 09:08:50 +01:00
Chris Richardson
df777dbbaa py-fenics-basix: update for main and future 0.8.0 (#40838)
* Update to latest version

* Add dependency

* revert

* address PR comments

* Correct dependencies for 0.7 to 0.8 transition

* Fix cmake line.

* Update nanobind dep

---------

Co-authored-by: Matt Archer <ma595@cam.ac.uk>
Co-authored-by: Jack S. Hale <mail@jackhale.co.uk>
Co-authored-by: Garth N. Wells <gnw20@cam.ac.uk>
2023-11-25 16:42:54 -06:00
Morten Kristensen
f28ccae3df py-vermin: add latest version 1.6.0 (#41261) 2023-11-25 14:42:58 -07:00
svengoldberg
ecdc296ef8 py-heat: add new package (#39394)
* heat: Create new spack package

* t8code: Add maintainer

* t8code: Add variant descriptions

* t8code: Add second maintainer

* t8code: Add another maintainer

* heat: Changes after review

* heat: Fix style test error

* heat: Delete obsolete install_options and re-add package homepage

* heat: Add dependency on py-setuptools

---------

Co-authored-by: Sven Goldberg <sven.goldberg@dlr.de>
2023-11-25 08:56:27 -06:00
Erik Heeren
9b5c85e919 py-sqlalchemy-utils, py-sql-alchemy: version bump (#41081)
* py-sqlalchemy-utils, py-sql-alchemy: version bump

* Update var/spack/repos/builtin/packages/py-sqlalchemy-utils/package.py

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>

---------

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>
2023-11-25 08:53:00 -06:00
Erik Heeren
ea8dcb73db py-dictdiffer: version bump (#41080)
* py-dictdiffer: version bump

* py-dictdiffer: removed runtime py-setuptools dependency in 0.9.0

* Update var/spack/repos/builtin/packages/py-dictdiffer/package.py

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>

---------

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>
2023-11-25 08:51:13 -06:00
Moritz Kern
089e117904 Update py-neo (#39213)
* add 0.12.0

* remove whitespace

* update deps

* Update var/spack/repos/builtin/packages/py-neo/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* add dep for python 3.8+

* add dep for python 3.8+ with 0.12.0

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-11-25 08:40:10 -06:00
Wouter Deconinck
1456d9b727 py-stashcp: new package (#41091)
* py-stashcp: new package

* py-stashcp: depends_on py-urllib3

* py-stashcp: comment as suggested in review

* Update var/spack/repos/builtin/packages/py-stashcp/package.py
2023-11-25 08:31:56 -06:00
Wouter Deconinck
c485709f62 iwyu: new versions up 0.21 (depends_on llvm-17) (#41235) 2023-11-24 10:09:24 -05:00
Michael Kuhn
7db386a018 Fix multi-word aliases (#41126)
PR #40929 reverted the argument parsing to make `spack --verbose
install` work again. It looks like `--verbose` is the only instance
where this kind of argument inheritance is used since all other commands
override arguments with the same name instead. For instance, `spack
--bootstrap clean` does not invoke `spack clean --bootstrap`.

Therefore, fix multi-word aliases again by parsing the resolved
arguments and instead explicitly pass down `args.verbose` to commands.
2023-11-24 15:56:42 +01:00
Massimiliano Culpo
92d076e683 spack graph: fix coloring with environments (#41240)
If we use all specs, we won't correctly color build-only dependencies
2023-11-24 10:08:21 +01:00
joscot-linaro
f70ef51f1a linaro-forge: update for 23.1 (#41236) 2023-11-24 09:15:47 +01:00
Jógvan Magnus Haugaard Olsen
e9d968d95f pfunit: add version 4.7.4 (#41232) 2023-11-24 09:04:37 +01:00
Nathalie Furmento
918d6baed4 starpu: add release 1.4.2 (#41238) 2023-11-24 09:02:51 +01:00
Loris Ercole
624df2a1bb nlcglib: pass cuda_arch setting to kokkos dependency (#39725)
When building with `+cuda`, the specified `cuda_arch` was not passed to kokkos,
leading to a wrong concretization.
2023-11-23 17:49:00 +01:00
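The usual Spack idiom for forwarding `cuda_arch` to a dependency, as described in the commit above, is sketched below. This is a hedged illustration of the pattern only; the class name is a placeholder and the exact nlcglib recipe may differ.

```python
from spack.package import *


class NlcglibLike(CMakePackage, CudaPackage):
    """Hypothetical package illustrating cuda_arch forwarding to kokkos."""

    depends_on("kokkos")
    depends_on("kokkos +cuda", when="+cuda")
    # Forward every requested cuda_arch value to kokkos so that both
    # packages are concretized and built for the same GPU architecture.
    for arch in CudaPackage.cuda_arch_values:
        depends_on(f"kokkos cuda_arch={arch}", when=f"+cuda cuda_arch={arch}")
```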
Massimiliano Culpo
ee0d3a3be2 ASP-based solver: don't error for type mismatch on preferences (#41138)
This commit discards type mismatches or failures to validate a package preference during concretization. The values discarded are logged as debug level messages. It also adds a config audit to help users spot misconfigurations in packages.yaml preferences.
2023-11-23 11:30:39 +01:00
Mark Abraham
de64ce5541 rdma-core: add new variants for a library without Python dependencies (#41195)
These variants allow packages that use rdma-core as a library to avoid
dependencies on python infrastructure that is not useful to them.
2023-11-23 09:52:57 +01:00
Raffaele Solcà
f556e52bf6 Add dla-future 0.3.1 (#41219) 2023-11-23 09:23:20 +01:00
Todd Gamblin
81e7d39bd2 Update CHANGELOG.md from v0.21.0
Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2023-11-22 14:39:05 -08:00
Harmen Stoppels
61055d9ee5 test_which: do not mutate os.environ 2023-11-22 14:22:37 -08:00
afzpatel
c1a8bb2a12 composable-kernel, migraphx: Fix build on CentOS8 (#41206) 2023-11-22 20:43:45 +01:00
Juan Miguel Carceller
7e9ddca0ff gloo: add a patch for building with gcc 12 (#41169) 2023-11-22 20:39:01 +01:00
Jen Herting
7cad4bb8d9 [nettle] depend on spack managed openssl (#40783) 2023-11-22 20:37:36 +01:00
Thomas Madlener
2d71c6bb8e dd4hep: add v1.27.1 (#41202)
* Make sure that geant4 comes with cxxstd that should be OK
2023-11-22 20:29:46 +01:00
Alec Scott
285de8ad4d fzf: add v0.44.1 (#41204) 2023-11-22 20:21:53 +01:00
Manuela Kuhn
89a0ea01a7 pulseaudio: add missing m4 dependency (#41216) 2023-11-22 20:20:59 +01:00
Stephen Sachs
e49f55ba53 aocc: help compiler find include paths and libstdc++.so (#40450)
Add --gcc-toolchain option by default.
Only add these paths if c++ libs and include files are available and the compiler was built with gcc
2023-11-22 09:59:20 -08:00
Thomas Madlener
3c54177c5d edm4hep: add latest tag for 0.10.2 (#41201) 2023-11-22 10:16:39 -07:00
Harmen Stoppels
24a38e6782 setup_platform_environment before package env mods (#41205)
This roughly restores the order of operations from Spack 0.20,
where `AutotoolsPackage.setup_build_environment` would
override the env variable set in `setup_platform_environment` on
macOS.
2023-11-22 17:32:13 +01:00
Massimiliano Culpo
3cf7f7b800 ASP-based solver: don't emit spurious debug output (#41218)
When improving the error messages, we started `#show`ing a lot more
symbols in the answer set, but we forgot to suppress the debug
messages warning about UNKNOWN SYMBOLs.
2023-11-22 16:06:46 +01:00
LucaMarradi
d7e756a26b onnxruntime: fix the call to as_string() operator (#41087)
* onnxruntime: fix the call to as_string() operator

* Update var/spack/repos/builtin/packages/py-onnxruntime/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* Update var/spack/repos/builtin/packages/py-onnxruntime/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-onnxruntime/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-onnxruntime: rm now-unused stringpiece_1_10.patch

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-11-22 07:23:52 -06:00
Tom Scogland
721f15bbeb hub: add v2.14.2, update to go module (#41183)
* packages/hub: add new version, update to module

Hub now uses a Go module to build and needs different env vars; the
previously packaged version was very, very old. Deprecate the old
versions so we can clean out that old build logic once a Spack release
has passed.

* cleanup suggested by @adamjstewart
2023-11-22 07:22:22 -06:00
Michael Kuhn
efa316aafa perl: add missing gmake dependency (#41210) 2023-11-22 11:17:47 +01:00
Michael Kuhn
6ac23545ec python: add missing gmake dependency (#41211) 2023-11-22 11:17:27 +01:00
Alex Richert
432f5d64e3 Add cxx17_flag to intel.py (#41207)
* Add cxx17_flag to intel.py
2023-11-21 18:08:02 -07:00
Chris Richardson
f2192a48ce Update py-scikit-build-core to version 0.6.1 (#40779)
* Update to latest version

* Fix linebreak

* Make suggested changes

* bumped to 0.6.1

* Update var/spack/repos/builtin/packages/py-scikit-build-core/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Chris Richardson <cnr12@cam.ac.uk>
Co-authored-by: Matt Archer <ma595@cam.ac.uk>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-11-21 13:17:07 -06:00
Alec Scott
70bed662fc gettext: add v0.22.4 (#41189) 2023-11-21 05:19:16 -07:00
dependabot[bot]
ae38987cb4 build(deps): bump pygments from 2.16.1 to 2.17.1 in /lib/spack/docs (#41191)
Bumps [pygments](https://github.com/pygments/pygments) from 2.16.1 to 2.17.1.
- [Release notes](https://github.com/pygments/pygments/releases)
- [Changelog](https://github.com/pygments/pygments/blob/master/CHANGES)
- [Commits](https://github.com/pygments/pygments/compare/2.16.1...2.17.1)

---
updated-dependencies:
- dependency-name: pygments
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-11-21 01:12:50 -07:00
Massimiliano Culpo
b361ffbe22 spack style: fix isort on sl:7 (#41133)
Bump the minimum version required for isort. This should fix
an issue reported on Scientific Linux 7, and due to:

https://github.com/PyCQA/isort/issues/1363
2023-11-21 06:24:37 +01:00
Mark W. Krentel
964440a08b elfutils: add version 0.190 (#41187) 2023-11-20 21:10:21 -07:00
Wouter Deconinck
aeb1bec8f3 qt-base: have QtBase provide qmake, not QtPackage (#41186) 2023-11-20 20:05:11 -07:00
Mark Abraham
cf163eecc5 gromacs: fix newly added variant (#41178)
In practice, one can only compile for the Intel Data Center Max GPU
via a SYCL build and the oneAPI compiler. This is unlikely to change,
so we can be explicit about that.
2023-11-20 19:49:34 -07:00
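One hedged way to encode "SYCL implies the oneAPI compiler" in a recipe is the `requires` directive; the sketch below illustrates the idea, with a placeholder package and variant name rather than the actual gromacs change.

```python
from spack.package import *


class GpuAppLike(CMakePackage):
    """Hypothetical package tying a compiler requirement to a variant."""

    variant("sycl", default=False, description="Build SYCL support for Intel GPUs")

    # Be explicit: the SYCL build is only known to work with the oneAPI compiler.
    requires("%oneapi", when="+sycl", msg="SYCL support requires the oneAPI compiler")
```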
Harmen Stoppels
7ec62d117e py-grpcio* do not assume lib / header dir (#41182) 2023-11-20 15:34:29 -06:00
John W. Parent
5154d69629 MSVC preview version breaks clingo build (#41185)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2023-11-20 21:55:07 +01:00
Alec Scott
d272c49fb6 rust: add v1.73.0 and add support for external openssl certs (#41161)
Co-authored-by: Tom Scogland <scogland1@llnl.gov>
2023-11-20 21:35:26 +01:00
Alec Scott
868a3c43e4 llvm: Remove python bindings when >= v17 (#41160)
Co-authored-by: Tom Scogland <scogland1@llnl.gov>
2023-11-20 14:10:13 +01:00
Marc Perache
73858df14d Catch2: add variant to choose cxx standard (#40996) 2023-11-20 11:31:17 +01:00
Sergio Sánchez Ramírez
8003f18709 openblas: optimize flags for A64FX (#41093) 2023-11-20 11:20:11 +01:00
Adam J. Stewart
87a9b428e5 py-scipy: add v1.11.4 (#41158) 2023-11-20 11:10:28 +01:00
Scott Wittenburg
714a362f94 mpich: support ch3:sock for a non busy-polling option (#40964) 2023-11-20 11:00:39 +01:00
Alec Scott
4636d6ec62 restic: add v0.16.2 (#41168) 2023-11-20 10:46:59 +01:00
Alec Scott
a015078c36 bfs: add v3.0.4 (#41165) 2023-11-20 10:00:31 +01:00
Alec Scott
ec2a0c8847 rclone: add v1.64.2 (#41166) 2023-11-20 10:00:15 +01:00
Alec Scott
cfae42a514 glab: add v1.35.0 (#41167) 2023-11-20 09:59:55 +01:00
iarspider
2c74ac5b2b Remove a maintainer from CMS packages (#41170) 2023-11-20 01:27:38 -07:00
Wouter Deconinck
df1111c24a sherpa: only enable_or_disable in v3: (#41162) 2023-11-20 09:20:21 +01:00
Harmen Stoppels
55d2ee9160 docs: document how spack picks a version / variant (#41070) 2023-11-20 09:00:53 +01:00
Alec Scott
edda2ef419 npm: only depend on libvips when @6, remove deprecated versions (#41159)
Co-authored-by: Tom Scogland <scogland1@llnl.gov>
2023-11-19 10:13:51 -07:00
Ethan Williams
6159168079 elbencho: add new version and git master branch (#41136)
* elbencho add new version and git master branch

* Update var/spack/repos/builtin/packages/elbencho/package.py

Co-authored-by: Alec Scott <alec@bcs.sh>

* formatting fix requested by @alecbcs

* remove whitespace added in blank line by github auto resolve

---------

Co-authored-by: Ethan W <mail@ethanwilliams.xyz>
Co-authored-by: Alec Scott <alec@bcs.sh>
2023-11-19 08:24:06 -07:00
eugeneswalker
2870b6002c e4s oneapi stack: turn on +sycl: ginkgo, heffte, petsc, upcxx, warpx (#41157)
* e4s oneapi stack: turn on +sycl: ginkgo, heffte, petsc, upcxx, warpx

* comment out warpx; build fails; add note
2023-11-18 22:21:20 -08:00
Christoph Junghans
73a715ad75 votca: add v2023 (#41100) 2023-11-18 16:51:46 -07:00
Mark Abraham
6ca49549d9 gromacs: Add new variants and clarify existing ones (#41115)
* gromacs: Add new variants and clarify existing ones

Add new variants that reflect existing capabilities and defaults in
the upstream build system. Add other existing constraints that were
not yet specified.

* conform to style

* Fix missing hyphens

* Correct cmake variable names
2023-11-18 16:48:00 -07:00
Wouter Deconinck
50051b5619 geant4: new version 11.1.3 (#41112)
* geant4: new version 11.1.3

Release notes: https://geant4.web.cern.ch/download/release-notes/notes-v11.1.3.txt

* geant4: cmake patch with expat fix only until 11.1.2.
2023-11-18 15:39:28 -07:00
dependabot[bot]
3907838e1d build(deps): bump docker/build-push-action from 5.0.0 to 5.1.0 (#41149)
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 5.0.0 to 5.1.0.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](0565240e2d...4a13e500e5)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-11-18 15:30:12 -07:00
Mark Abraham
f12b877e51 heffte: add sycl variant (#41132)
* heffte: add sycl variant

This targets the oneAPI SYCL compiler with oneMKL as FFT
implementation library.

* Require oneAPI compiler for sycl variant
2023-11-18 09:21:50 -07:00
Massimiliano Culpo
a701b24ad3 libksba: add v1.6.5 (#41129) 2023-11-18 09:18:19 -07:00
Adam J. Stewart
c60a806f0e py-matplotlib: add v3.7.4, v3.8.2 (#41156) 2023-11-18 09:16:51 -07:00
Vanessasaurus
df7747eb9a Automated deployment to update package flux-sched 2023-11-18 (#41153)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2023-11-18 09:13:49 -07:00
Vanessasaurus
81130274f4 Automated deployment to update package flux-core 2023-11-18 (#41154)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2023-11-18 09:13:06 -07:00
Vanessasaurus
2428c10703 Automated deployment to update package flux-security 2023-11-18 (#41152)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2023-11-18 09:11:36 -07:00
Wouter Deconinck
d171f314c7 py-pygithub: new versions, dependencies (#41072)
* py-pygithub: new versions, dependencies

* py-pygithub: reordered dependencies per requirements.txt

* py-pygithub: depends on py-setuptools-scm
2023-11-18 08:57:46 -07:00
Wouter Deconinck
16f4c53cd4 py-bokeh: new version 3.3.1, and supporting packages (#41089)
* py-bokeh: new version 3.3.1

* py-xyzservices: new package

* py-bokeh, py-xyzservices: update homepages

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>

* py-bokeh: depends on newer py-numpy

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>

* Apply suggestions from code review

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-11-18 08:47:45 -07:00
Moritz Kern
e8f09713be Update py-elephant (#39200)
* add elephant version v0.12.0 and 0.13.0

* update copyright

* reformat according to black format errors

* restore maintainers directive

* Update var/spack/repos/builtin/packages/py-elephant/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* add dependency python 3.8+

* sorted dependencies

* sort dependencies from newest to oldest

* add deps for @master

* removed dependency for master, since it is included in 0.12.0:

* removed dependency for python 3.7+ , since 3.7+ is the lowest supported version anyway

* removed specific deps for master, since master is always newer than all stable releases

* updated numpy dependency for Elephant 0.12.0:

* Update var/spack/repos/builtin/packages/py-elephant/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-elephant/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* removed upper bounds for py-quantities, omitting v0.14.0

* add elephant v0.14.0

* update required quantities version

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-11-18 07:42:57 -07:00
Lydéric Debusschère
063c28e559 py-cleo: add versions 2.0.0 2.0.1; add maintainers (#40611)
* py-cleo: add versions 2.0.0 2.0.1; add maintainers

* py-cleo: add forgotten dependence

* py-cleo: update from review: remove preferred version, remove a dependence, fix py-rapidfuzz version

* py-cleo: deprecated version 1.0.0a5; add version 1.0.0; update dependences

* py-cleo: add version 2.1.0; update version range of dependences

* py-crashtest: add version 0.4.1, dependence of py-cleo

* py-cleo: update dependence

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-cleo: update dependence py-clikit

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-cleo: update dependence py-rapidfuzz

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-rapidfuzz: add version 2.2.0 dependence of py-cleo@2

* py-cleo: fix version range of py-crashtest

* py-rapidfuzz: fix dependences; add py-rapidfuzz-capi and py-jarowinkler

---------

Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-11-18 08:28:56 -06:00
Hariharan Devarajan
31ec1be85f Release DLIO Profiler Py 0.0.2 (#41127) 2023-11-18 07:58:56 -06:00
Erik Heeren
7137c43407 py-jsonpath-ng: version bump (#41078)
* py-jsonpath-ng: version bump

* py-jsonpath-ng: fix typo in version number
2023-11-18 07:54:50 -06:00
Erik Heeren
f474c87814 py-gitpython: version bump (#41079) 2023-11-18 07:53:55 -06:00
Juan Miguel Carceller
edf872c94b py-pyh5py: reorder dependencies from newest version to oldest (#41137)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2023-11-18 07:48:34 -06:00
Michael Kuhn
223e5b8ca2 Fix invalid escape sequences (#41130)
Using Python 3.12 in a freshly cloned Spack repository results in
warnings such as this:
```
==> Warning: invalid escape sequence '\$'
==> Warning: invalid escape sequence '\('
==> Warning: invalid escape sequence '\.'
==> Warning: invalid escape sequence '\.'
```
These will turn into errors in 3.13, so fix them. None of them actually
need to be regexes, so convert them into normal strings.
2023-11-18 07:43:35 -06:00
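For illustration, here is a minimal example of the warning and the two common fixes (a raw string when the value really is a regex, a plain string otherwise). The literals below are made up, not the ones touched in the commit.

```python
import re

# Under Python 3.12 the following line emits
# "SyntaxWarning: invalid escape sequence '\$'" and will become an error later:
#   pattern = "\$\("
# Fix 1: use a raw string when the value feeds a regex.
pattern = r"\$\("
print(bool(re.search(pattern, "cost: $(price)")))  # True

# Fix 2: use a normal string when no regex is involved ('$' and '(' need no escape).
plain = "$("
print(plain in "cost: $(price)")  # True
```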
Moritz Kern
cb764ce41c Update py-quantities (#39201)
* add version 0.14.1

* formatting

* style checks

* fix style errors

* remove old versions

* fix typo

* style

* update maintainers directive

* sort dependencies from newest to oldest

* Update var/spack/repos/builtin/packages/py-quantities/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-quantities/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* sort dependencies from newest to oldest

* removed upper bounds for python version

* Update var/spack/repos/builtin/packages/py-quantities/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* remove dependency on Python 3.7 +, since 3.7 is the lowest supported version anyway

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-11-18 07:12:58 -06:00
Brian Van Essen
9383953f76 Hydrogen package: avoid newer openblas on power (#41143) 2023-11-17 14:20:33 -08:00
Stephen Herbener
9ad134c594 Add cxxstd variant to ecflow, update with latest ecflow version (#41120)
* Added recent versions to ecflow/package.py, as well as a cxxstd variant that is
needed to set BOOST_NO_CXX98_FUNCTION_BASE appropriately when building with the C++17 standard.
* Fixed a pep8 style error in the ecflow package.py script.
* Removed the cxxstd variant again, since the ecflow cmake configuration already specifies
the C++17 standard for newer versions. The use of the BOOST_NO_CXX98_FUNCTION_BASE
define is now triggered by the ecflow version.
2023-11-17 13:34:17 -08:00
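A hedged sketch of triggering the define by package version rather than by a variant follows; the version bound and the way the flag is injected are assumptions for illustration, not the exact ecflow recipe.

```python
from spack.package import *


class EcflowLike(CMakePackage):
    """Hypothetical package gating a preprocessor define on the package version."""

    def cmake_args(self):
        args = []
        # Newer releases already build as C++17, so only they need the Boost
        # workaround; the version bound here is illustrative.
        if self.spec.satisfies("@5.11:"):
            args.append(self.define("CMAKE_CXX_FLAGS", "-DBOOST_NO_CXX98_FUNCTION_BASE"))
        return args
```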
Mark Abraham
ec8bd38c4e Permit packages that depend on Intel oneAPI packages to access sdk (#41117)
* Permit packages that depend on Intel oneAPI packages to access sdk

* Implement and use IntelOneapiLibraryPackageWithSdk

* Restore libs property to IntelOneapiLibraryPackage

* Conform to style

* Provide new class to infrastructure

* Treat sdk/include as the main include
2023-11-17 14:59:04 +00:00
Wouter Deconinck
81e73b4dd4 root: new version 6.30.00 (#41118)
* root: new version 6.30.00

There is a new release of ROOT, v6.30.00, with release notes at https://root.cern/doc/v630/release-notes.html.

In addition to some deprecations of build options, this updates the C++ standard to 17 or higher (well, 20), and increases the vc minimum version.

* vc: new version 1.4.4

* [@spackbot] updating style on behalf of wdconinc

---------

Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2023-11-17 08:49:01 -06:00
Thomas Madlener
53c7edb0ad lcio: Add latest tag 2.20.1 (#41102) 2023-11-17 08:24:43 -06:00
Massimiliano Culpo
54ab0872f2 py-archspec: add v0.2.2 (#41110) 2023-11-17 11:22:32 +01:00
Harmen Stoppels
1b66cbacf0 llvm: patch missing cstdint include (#41108)
* llvm: patch missing cstdint include
2023-11-17 10:17:25 +01:00
Kelly (KT) Thompson
a3d6714c8b [doxygen] Add versions 1.9.7 and 1.9.8. (#41123)
* [doxygen] Add versions 1.9.7 and 1.9.8.
* Fix hash for 1.9.8.
2023-11-16 19:44:17 -07:00
bk
da2e53b2ee r-rlang: add v1.1.1, v1.1.2 (#41114) 2023-11-16 19:24:20 -07:00
Weiqun Zhang
3e060cce60 Update amrex maintainers (#41122) 2023-11-16 15:06:54 -08:00
Wouter Deconinck
00df2368a3 clhep: new version 2.4.7.1 (#41113) 2023-11-16 15:57:29 -07:00
Massimiliano Culpo
ef689ea586 libgcrypt: add v1.10.3 (#41111) 2023-11-16 15:46:44 -07:00
Thomas Madlener
4991a60eac podio: Add latest tag 0.17.3 (#41103) 2023-11-16 14:41:24 -08:00
Tim Wickberg
67c2c80cf4 Use preferred capitalization of "Slurm" (#41109)
https://slurm.schedmd.com/faq.html#acronym
2023-11-16 22:40:54 +00:00
Massimiliano Culpo
0cde944ccc Improve the error message for deprecated preferences (#41075)
Improves the warning for deprecated preferences, and adds a configuration
audit to get files:lines details of the issues.

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2023-11-16 23:30:29 +01:00
Adam J. Stewart
1927ca1f35 Update PyTorch ecosystem (#41105) 2023-11-16 10:59:13 -08:00
Adam J. Stewart
765df31381 py-lightning: add v2.1.2 (#41106) 2023-11-16 10:52:21 -08:00
Sinan
ba091e00b3 package/lemon: improve (#40971)
* package/lemon: improve
* fix bug
* final improvements
* use f strings for boolean options, add soplex as TODO
* leave +coin as TODO
* depends on bzip2 when +coin
* tidy

---------

Co-authored-by: sbulut <sbulut@3vgeomatics.com>
Co-authored-by: Sinan81 <Sinan81@users.noreply.github.com>
2023-11-16 10:36:29 -08:00
afzpatel
8d2e76e8b5 enable rocAL and add MIVisionX tests (#39630)
* initial commit to enable rocAL and add MIVisionX tests
* fix styling
* updated checksum for libjpeg patches
* update for 5.6
* use satisfies for checking spec version
2023-11-16 10:29:00 -08:00
Martin Aumüller
0798bd0915 Updates for Ospray@3.0.0 (#41054)
* rkcommon: add v1.12.0
* openimagedenoise: add v2.1.0
* openvkl: 1.3.2 only compatible with rkcommon@:1.11
* openvkl: add v2.0.0
* ospray: add v3.0.0
* paraview: not yet compatible with ospray@3
2023-11-16 10:04:02 -08:00
Massimiliano Culpo
1e1cb68b84 Add audit check to spot when= arguments using wrong named specs (#41107)
* Add audit check to spot when= arguments using named specs

* Fix package issues caught by the new audit
2023-11-16 14:19:05 +01:00
Auriane R
495252f7f6 Add patch for libffi@3.4.4 since failing to install using clang@15 (#41083) 2023-11-16 10:04:46 +01:00
Sergio Sánchez Ramírez
66dea1d396 Update package.py (#41092) 2023-11-15 17:38:28 -07:00
Daniel Arndt
2f4046308f deal.II: Require at least taskflow 3.4 (#41095) 2023-11-15 16:16:53 -08:00
Alberto Sartori
95321f4f3a justbuild: add version v1.2.3 (#41084) 2023-11-15 15:57:06 -08:00
Satish Balay
6eae4b9714 taskflow: add v3.6.0 (#41098) 2023-11-15 15:20:38 -08:00
Harmen Stoppels
2f24aeb7f6 docs: packages config on separate page, demote bootstrapping (#41085) 2023-11-15 15:49:16 +00:00
Rocco Meli
1d30e78b54 cp2k: add hipfft and hipblas explicitly (#41074) 2023-11-15 01:44:38 -07:00
Jonathon Anderson
b3146559fb hpctoolkit: Add depends on autotools for @develop (#41067) 2023-11-15 09:19:02 +01:00
moloney
84e33b496f mrtrix3: fix some issues w/ 3.0.3 and add 3.0.4 (#41036) 2023-11-15 09:13:21 +01:00
Jonathon Anderson
de850e97e8 libevent: call autoreconf directly instead of via autogen.sh (#41057) 2023-11-15 09:11:49 +01:00
kwryankrattiger
c7157d13a8 ParaView: Add release candidate 5.12.0-RC1 (#41009)
* ParaView: Add release candidate 5.12.0-RC1

* [@spackbot] updating style on behalf of kwryankrattiger
2023-11-15 08:27:27 +01:00
Gerhard Theurich
d97d73fad1 esmf: add v8.6.0 (#41066) 2023-11-14 22:23:37 -07:00
Julien Cortial
9792625d1f Fix typo in mumps recipe (#41062)
* Fix typo in mumps recipe
* Adopt mumps package
2023-11-14 10:43:40 -07:00
Satish Balay
43a94e981a xsdk: add version 1.0.0 (#40825)
xsdk: add +sycl variant - with amrex, arborx, ginkgo, petsc, sundials

xsdk: add +pflotran variant

xsdk: enable hypre+rocm

xsdk: enable superlu-dist for GPU - but use trilinos~superlu-dist [as that breaks builds]

xsdk: dealii: disable oce as it can cause intel-tbb-2017.6 to be picked up for some builds (for ex: gcc=13) and result in subsequent build failures
2023-11-14 11:00:19 -06:00
Thomas-Ulrich
ee1a2d94ad bison: conflict %oneapi due to possible miscompilation (#40860) 2023-11-14 17:58:17 +01:00
Harmen Stoppels
25eca56909 gmake: fix bootstrap (#41060) 2023-11-14 17:44:48 +01:00
Massimiliano Culpo
2ac128a3ad Add papyrus to the list of broken tests (#40923)
* Disable papyrus in the neoverse v1 pipeline
   See https://gitlab.spack.io/spack/spack/-/jobs/8983875
   The job is hanging on tests for 6 hrs.
* Add papyrus to broken tests instead of removing it
2023-11-14 07:37:29 -08:00
Massimiliano Culpo
1255620a14 Fix infinite recursion when computing concretization errors (#41061) 2023-11-14 14:44:58 +01:00
Harmen Stoppels
18ebef60aa R: cleanup recipe and fix linking to lapack libraries (#41040) 2023-11-14 14:44:36 +01:00
Dennis Klein
6fc8679fb4 fairmq: add v1.8.1 (#41007) 2023-11-14 12:55:09 +01:00
Massimiliano Culpo
8a8dcb9479 modules: unit-tests without polluted user scope (#41041) 2023-11-14 10:29:28 +00:00
Adam J. Stewart
a6179f26b9 GDAL: add v3.8.0 (#41047) 2023-11-14 03:01:31 -06:00
Martin Aumüller
0dc73884c7 ispc: add v1.21 and v1.21.1 (#41053) 2023-11-14 09:38:08 +01:00
dependabot[bot]
a80b4fd20d build(deps): bump urllib3 from 2.0.7 to 2.1.0 in /lib/spack/docs (#41055)
Bumps [urllib3](https://github.com/urllib3/urllib3) from 2.0.7 to 2.1.0.
- [Release notes](https://github.com/urllib3/urllib3/releases)
- [Changelog](https://github.com/urllib3/urllib3/blob/main/CHANGES.rst)
- [Commits](https://github.com/urllib3/urllib3/compare/2.0.7...2.1.0)

---
updated-dependencies:
- dependency-name: urllib3
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-11-14 09:33:41 +01:00
Wouter Deconinck
c264cf12a2 dd4hep: avoid IndexError in setup_run_environment (#41051)
Some environments may have `dd4hep` as a concretized package without having it installed (yet). For those environments, `dd4hep` has a `libs` property that is an empty list. Nevertheless, it can be added to a run environment (for example when `dd4hep` is part of an environment). This results in an IndexError:
```
==> Warning: couldn't load runtime environment due to IndexError: list index out of range
```
To avoid the IndexError, only prepend the `dd4hep` libs if there are actually libs found.
2023-11-14 09:25:56 +01:00
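The guard described above might look like the following hedged sketch; the class name is a placeholder and the real dd4hep recipe may prepend different variables.

```python
from spack.package import *


class Dd4hepLike(CMakePackage):
    """Hypothetical package guarding against an empty libs list."""

    def setup_run_environment(self, env):
        libs = self.libs  # an empty LibraryList when the spec is not installed yet
        if libs:
            # Only prepend when at least one library was found, avoiding the
            # IndexError raised by indexing an empty list.
            env.prepend_path("LD_LIBRARY_PATH", libs.directories[0])
```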
Raffaele Solcà
769474fcb0 DLA-future: add v0.3.0 (#41042) 2023-11-14 09:25:08 +01:00
Adam J. Stewart
ab60bfe36a py-numpy: add v1.26.2 (#41046) 2023-11-13 16:41:05 -07:00
John W. Parent
8bcc3e2820 CMake Package: support building ~ownlibs on Windows (#38758) 2023-11-13 14:26:33 -08:00
Hariharan Devarajan
388f141a92 Release Brahma v0.0.2 (#40994) 2023-11-13 14:25:12 -08:00
Todd Gamblin
f74b083a15 info: improve coverage (#41001)
Tests didn't cover the new `--variants-by-name` parameter in #40998.
Add some parameterization to hit that.

This changeset makes me think that the main section-printing loop in `spack info` isn't
factored so well. It makes it difficult to pass different arguments to different helper
functions.  I could break it out into if statements if folks think that would be cleaner.
2023-11-13 13:45:18 -08:00
heatherkellyucl
5b9d260054 gzip: deprecate <1.13 for vulnerability (#41044) 2023-11-13 13:38:16 -07:00
Greg Becker
4bd47d89db spack diff: allow hashes from mirrors (#41043) 2023-11-13 12:27:52 -08:00
Daniel Arndt
96f3c76052 dealii: add v9.5.0, v9.5.1 (#40747)
* dealii: 9.5.0

* kokkos+cuda_lambda

* dealii ^kokkos@3.7: require +cuda +cuda_lambda +wrapper

* Added 9.5.1, try ~cgal when +cuda

* Forward Cuda architecture request

* Remove workaround

* Try not enforcing the Kokkos compiler

* Enforce using nvcc_wrapper with Trilinos+Cuda

* Don't define CMAKE_*_COMPILER to point to MPI wrappers

* Use the same compiler as Trilinos/Kokkos

* Only check for Trilinos compiler

* Disable Trilinos+Cuda

* Disable Cuda support

* Try CUDA build without ninja

* Combined examples and examples_compile

* Use f-string for cuda_arch

* p -> _package

* Indentation

* Fix up f-string

---------

Co-authored-by: Luca Heltai <luca.heltai@sissa.it>
Co-authored-by: eugeneswalker <eugenesunsetwalker@gmail.com>
2023-11-13 12:29:55 -06:00
H. Joe Lee
9c74eda61f hdf5: add a new variant for enabling sub-filing VFD (#40804) 2023-11-13 11:18:02 -07:00
dependabot[bot]
d9de93a0fc build(deps): bump black from 23.10.1 to 23.11.0 in /lib/spack/docs (#40967)
Bumps [black](https://github.com/psf/black) from 23.10.1 to 23.11.0.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/23.10.1...23.11.0)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-11-13 08:18:06 -07:00
Wouter Deconinck
3892fadbf6 qwt: conflict with qt-base (Qt6) (#40883) 2023-11-13 12:42:37 +01:00
Glenn Horton-Smith
62b32080a8 epics-base: patch to avoid failure on "perl xsubpp" when "xsubpp" otherwise works fine. (#40849) 2023-11-13 12:29:51 +01:00
Wanlin Wang
09d66168c4 riscv-gnu-toolchain: add v2023.09.13 -> v2023.10.18 (#40854) 2023-11-13 12:19:09 +01:00
Tuomas Koskela
d7869da36b conquest: add build system changes and library paths (#40718) 2023-11-13 12:13:53 +01:00
Mikael Simberg
b6864fb1c3 Add license directives to various packages (#41039) 2023-11-13 04:03:48 -07:00
Harmen Stoppels
e6125061e1 Compiler.debug_flags: drop -gz (#40900)
That flag enables compression of the debug symbols; it doesn't toggle them on
or off.
2023-11-13 11:33:40 +01:00
dependabot[bot]
491bd48897 build(deps): bump black in /.github/workflows/style (#40968)
Bumps [black](https://github.com/psf/black) from 23.10.1 to 23.11.0.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/23.10.1...23.11.0)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-11-13 11:18:52 +01:00
Victoria Cherkas
ad4878f770 metkit: add v1.10.2 and v1.10.17 (#40668) 2023-11-13 10:56:52 +01:00
Satish Balay
420eff11b7 superlu-dist: add v8.2.0 (#41004) 2023-11-13 10:55:05 +01:00
Adam J. Stewart
15e7aaf94d py-mypy: add v1.4:v1.7 (#41015) 2023-11-13 10:33:33 +01:00
Adam J. Stewart
bd6c5ec82d py-pandas: add v2.1.3 (#41017) 2023-11-13 10:26:56 +01:00
dependabot[bot]
4e171453c0 build(deps): bump mypy from 1.6.1 to 1.7.0 in /lib/spack/docs (#41020)
Bumps [mypy](https://github.com/python/mypy) from 1.6.1 to 1.7.0.
- [Changelog](https://github.com/python/mypy/blob/master/CHANGELOG.md)
- [Commits](https://github.com/python/mypy/compare/v1.6.1...v1.7.0)

---
updated-dependencies:
- dependency-name: mypy
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-11-13 10:13:26 +01:00
Adam J. Stewart
420bce5cd2 GEOS: add new versions (#41030) 2023-11-13 10:09:58 +01:00
Thomas Gruber
b4f6c49bc0 likwid: add 5.3.0 version (#41008) 2023-11-13 10:09:39 +01:00
David Gardner
da4f2776d2 sundials: add license directive (#41028) 2023-11-13 10:08:28 +01:00
Adam J. Stewart
e2f274a634 PyTorch: allow +openmp on macOS (#41025) 2023-11-13 10:07:18 +01:00
Christian Glusa
15dcd3c65c py-pynucleus: Add variant, modify dependencies (#41006) 2023-11-11 16:24:12 -06:00
Matthew Archer
49c2894def update to latest version (#40905) 2023-11-11 16:16:45 -06:00
Stephen Hudson
1ae37f6720 libEnsemble: add v1.1.0 (#40969) 2023-11-11 16:15:43 -06:00
Adrien Berchet
15f6368c7f Add geomdl package (#40933) 2023-11-11 15:55:08 -06:00
Terry Cojean
57b63228ce Ginkgo: 1.7.0, change compatibility, update option oneapi->sycl (#40874)
Signed-off-by: Terry Cojean <terry.cojean@kit.edu>
2023-11-11 09:00:52 -06:00
Greg Becker
13abfb7013 spack deconcretize command (#38803)
We have two ways to concretize now:
* `spack concretize` concretizes only the root specs that are not concrete in the environment.
* `spack concretize -f` eliminates all cached concretization data and reconcretizes the *entire* environment.

This PR adds `spack deconcretize`, which eliminates cached concretization data for a spec.  This allows
users greater control over what is preserved from their `spack.lock` file and what is reused when not
using `spack concretize -f`.  If you want to update a spec installed in your environment, you can call
`spack deconcretize` on it, and that spec and any relevant dependents will be removed from the lock file.

`spack deconcretize` has two options:
* `--root`: limits deconcretized specs to *specific* roots in the environment. You can use this to
  deconcretize exactly one root in a `unify: false` environment, i.e., if root `foo` is a dependent
  of root `bar` (both being roots), `spack deconcretize bar` will *not* deconcretize `foo`.
* `--all`: deconcretize *all* specs that match the input spec. By default `spack deconcretize`
  will complain about multiple matches, like `spack uninstall`.
2023-11-10 14:55:35 -08:00
Nils Lehmann
b41fc1ec79 new release (#41010) 2023-11-10 11:52:59 -07:00
Henri Menke
124e41da23 libpspio 0.3.0 (#40953)
Co-authored-by: Alec Scott <alec@bcs.sh>
2023-11-10 09:48:50 -08:00
Massimiliano Culpo
f6039d1d45 builtin.repo: fix ^mkl pattern in minor packages (#41003)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2023-11-10 17:18:24 +01:00
Victoria Cherkas
8871bd5ba5 fdb: add dependency on eckit later release (#40737)
* depends_on("eckit@1.24.4:", when="@5.11.22:")

* Update var/spack/repos/builtin/packages/fdb/package.py

Co-authored-by: Alec Scott <alec@bcs.sh>

* make latest tagged release the default install

* revert f258f46660

---------

Co-authored-by: Alec Scott <alec@bcs.sh>
2023-11-10 07:54:25 -08:00
Cody Balos
efe85755d8 alquimia: apply patch for iso_c_binding to latest version (#40989) 2023-11-10 08:31:38 -06:00
Cody Balos
7aaa17856d pflotran: tweak for building with xsdk rocm/hip (#40990) 2023-11-10 08:30:35 -06:00
Massimiliano Culpo
fbf02b561a gromacs et al: fix ^mkl pattern (#41002)
The ^mkl pattern was used to refer to three packages
even though none of the software using it actually depended
on "mkl".

This pattern, which follows Hyrum's law, is now being
removed in favor of a more explicit one.

In this PR gromacs, abinit, lammps, and quantum-espresso
are modified.

Intel packages are also modified to provide "lapack"
and "blas" together.
2023-11-10 13:56:04 +00:00
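The "more explicit" pattern might look like the hedged sketch below: consumers inspect which concrete package provides the virtual they depend on instead of matching `^mkl`. The class name, the `USE_MKL` option, and the provider list are placeholders for illustration, not the actual edits to gromacs and friends.

```python
from spack.package import *

# Intel math packages that provide the blas/lapack virtuals; list is illustrative.
INTEL_MATH_LIBRARIES = ("intel-mkl", "intel-oneapi-mkl", "intel-parallel-studio")


class ConsumerLike(CMakePackage):
    """Hypothetical consumer that depends on the virtuals, not on mkl itself."""

    depends_on("blas")
    depends_on("lapack")

    def cmake_args(self):
        # Ask which concrete package satisfies the "blas" virtual instead of
        # testing self.spec.satisfies("^mkl").
        if self.spec["blas"].name in INTEL_MATH_LIBRARIES:
            return [self.define("USE_MKL", True)]
        return []
```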
Harmen Stoppels
4027a2139b env: compute env mods only for installed roots (#40997)
And improve the error message (load vs unload).

Of course you could have some uninstalled dependency too, but as long as
it doesn't implement `setup_run_environment` etc, I don't think it hurts
to attempt to load the root anyways, given that failure to do so is a
warning, not a fatal error.
2023-11-10 12:32:48 +01:00
Todd Gamblin
f0ced1af42 info: rework spack info command to display variants better (#40998)
This changes variant display to use a much more legible format, and to use screen space
much better (particularly on narrow terminals). It also adds color the variant display
to match other parts of `spack info`.

Descriptions and variant value lists that were frequently squished into a tiny column
before now have closer to the full terminal width.

This change also preserves any whitespace formatting present in `package.py`, so package
maintainers can make easier-to-read descriptions of variant values if they want. For
example, `gasnet` has had a nice description of the `conduits` variant for a while, but
it was wrapped and made illegible by `spack info`. That is now fixed and the original
newlines are kept.

Conditional variants are grouped by their when clauses by default, but if you do not
like the grouping, you can display all the variants in order with `--variants-by-name`.
I'm not sure when people will prefer this, but it makes it easier to tell that a
particular variant is/isn't there. I do think grouping by `when` is the better default.
2023-11-10 12:31:28 +01:00
David Boehme
2e45edf4e3 Add adiak v0.4.0 (#40993)
* Add adiak v0.4.0
* Fix style checks
2023-11-09 21:23:00 -07:00
Adam J. Stewart
4bcfb01566 py-black: add v23.10: (#40959) 2023-11-10 00:10:28 +01:00
Adam J. Stewart
b8bb8a70ce PyTorch: specify CUDA root directory (#40855) 2023-11-09 14:25:54 -08:00
Hariharan Devarajan
dd2b436b5a new release cpp-logger v0.0.2 (#40972) 2023-11-09 13:08:04 -08:00
Hariharan Devarajan
da2cc2351c Release Gotcha v1.0.5 (#40973) 2023-11-09 13:06:56 -08:00
eugeneswalker
383ec19a0c Revert "Deactivate Cray sles, due to unavailable runner (#40291)" (#40910)
This reverts commit 4b06862a7f.
2023-11-09 12:24:18 -08:00
Todd Gamblin
45f8a0e42c docs: tweak formatting of +: and -: operators (#40988)
Just trying to make these stand out a bit more in the docs.
2023-11-09 19:55:29 +00:00
Dom Heinzeller
4636a7f14f Add symlinks for hdf5 library names when built in debug mode (#40965)
* Add symlinks for hdf5 library names when built in debug mode
* Only apply bug fix for debug libs when build type is Debug
2023-11-09 11:40:53 -08:00
Kelly (KT) Thompson
38f3f57a54 [lcov] Add build and runtime deps necessary for lcov@2.0.0: (#40974)
* [lcov] Add build and runtime deps necessary for lcov@2.0.0:
   + Many additional Perl package dependencies are required for the new version of lcov.
   + Some of the new dependencies were not known to spack until now.
* Style fix
2023-11-09 11:37:38 -08:00
Satish Balay
b17d7cd0e6 mfem: add hipblas dependency for superlu-dist (#40981) 2023-11-09 11:19:48 -08:00
Satish Balay
b5e2f23b6c hypre: add in hipblas dependency due to superlu-dist (#40980) 2023-11-09 11:03:03 -08:00
Brian Van Essen
7a4df732e1 DiHydrogen, Hydrogen, and Aluminum CachedCMakePackage (#39714) 2023-11-09 19:08:37 +01:00
George Young
7e6aaf9458 py-macs3: adding zlib dependency (#40979) 2023-11-09 16:44:24 +01:00
Cody Balos
2d35d29e0f sundials: add v6.6.2 (#40920) 2023-11-09 07:38:40 -06:00
Scott Wittenburg
1baed0d833 buildcache: skip unrecognized metadata files (#40941)
This commit improves forward compatibility of Spack with newer build cache metadata formats.

Before this commit, invalid or unrecognized metadata would be fatal errors, now they just cause
a mirror to be skipped.

Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2023-11-09 13:30:41 +00:00
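A simplified, hedged sketch of the behavior change follows; the function names are hypothetical stand-ins, not Spack's buildcache API. Unreadable metadata from one mirror is logged and the mirror skipped, instead of aborting the whole lookup.

```python
def specs_from_mirrors(mirrors, read_metadata):
    """Collect metadata per mirror, skipping mirrors with unrecognized formats."""
    found = []
    for mirror in mirrors:
        try:
            found.append(read_metadata(mirror))
        except ValueError as err:  # invalid or unrecognized metadata layout
            print(f"Skipping mirror {mirror}: {err}")
    return found
```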
Harmen Stoppels
cadc2a1aa5 Set version to 0.22.0.dev0 (#40975) 2023-11-09 10:02:29 +01:00
Satish Balay
78449ba92b intel-oneapi-mkl: do not set __INTEL_POST_CFLAGS env variable (#40947)
This triggers warnings from the icx compiler, which break the petsc configure step:

$ I_MPI_CC=icx /opt/intel/oneapi/mpi/2021.7.0/bin/mpiicc -E a.c > /dev/null
$ __INTEL_POST_CFLAGS=-Wl,-rpath,/opt/intel/oneapi/mkl/2022.2.0/lib/intel64 I_MPI_CC=icx /opt/intel/oneapi/mpi/2021.7.0/bin/mpiicc -E a.c > /dev/null
icx: warning: -Wl,-rpath,/opt/intel/oneapi/mkl/2022.2.0/lib/intel64: 'linker' input unused [-Wunused-command-line-argument]
2023-11-09 08:40:12 +01:00
Massimiliano Culpo
26d6bfbb7f modules: remove deprecated code and test data (#40966)
This removes a few deprecated attributes from the
schema of the "modules" section. Test data for
deprecated options is removed as well.
2023-11-09 08:15:46 +01:00
Sergio Sánchez Ramírez
3405fe60f1 libgit2: add python as test dependency (#40863)
Libgit2 requires python as a build dependency. I was getting an error because it was falling back to the system Python, which is compiled with Intel compilers, and thus `libgit2` was failing because it couldn't find `libimf.so` (which doesn't make sense).

Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2023-11-08 23:20:55 +01:00
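In recipe terms the fix is essentially a one-liner; a minimal hedged sketch, with a placeholder class name:

```python
from spack.package import *


class Libgit2Like(CMakePackage):
    """Hypothetical package pulling in Spack's Python for build and tests."""

    # Use Spack's python rather than whatever happens to be on the system,
    # so the build and test suite do not pick up an Intel-compiled interpreter.
    depends_on("python", type=("build", "test"))
```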
Harmen Stoppels
53c266b161 modules: restore exclude_implicits (#40958) 2023-11-08 22:56:55 +01:00
Thomas Madlener
ed8ecc469e podio: Add the latest tag (0.17.2) (#40956)
* podio: Add myself as maintainer
* podio: Add 0.17.2 tag
2023-11-08 14:53:23 -07:00
Harmen Stoppels
b2840acd52 Revert "defaults/modules.yaml: hide implicits (#40906)" (#40955)
This reverts commit a2f00886e9.
2023-11-08 14:33:50 -07:00
Tom Vander Aa
c35250b313 libevent: always autogen.sh (#40945)
The libevent release tarballs ship with a `configure` script generated by an old `libtool`. The `libtool` generated by `configure` is not compatible with `MACOSX_DEPLOYMENT_TARGET` > 10. Regenerating the `configure` scripts fixes the build on macOS.

Original configure contains:
```
    case $host_os in
    rhapsody* | darwin1.[012])
      _lt_dar_allow_undefined='$wl-undefined ${wl}suppress' ;;
    darwin1.*)
      _lt_dar_allow_undefined='$wl-flat_namespace $wl-undefined ${wl}suppress' ;;
    darwin*) # darwin 5.x on
      # if running on 10.5 or later, the deployment target defaults
      # to the OS version, if on x86, and 10.4, the deployment
      # target defaults to 10.4. Don't you love it?
      case ${MACOSX_DEPLOYMENT_TARGET-10.0},$host in
        10.0,*86*-darwin8*|10.0,*-darwin[91]*)
          _lt_dar_allow_undefined='$wl-undefined ${wl}dynamic_lookup' ;;
        10.[012][,.]*)
          _lt_dar_allow_undefined='$wl-flat_namespace $wl-undefined ${wl}suppress' ;;
        10.*)
          _lt_dar_allow_undefined='$wl-undefined ${wl}dynamic_lookup' ;;
      esac
```

After re-running `autogen.sh`:
```
    case $host_os in
    rhapsody* | darwin1.[012])
      _lt_dar_allow_undefined='$wl-undefined ${wl}suppress' ;;
    darwin1.*)
      _lt_dar_allow_undefined='$wl-flat_namespace $wl-undefined ${wl}suppress' ;;
    darwin*)
      case $MACOSX_DEPLOYMENT_TARGET,$host in
        10.[012],*|,*powerpc*-darwin[5-8]*)
          _lt_dar_allow_undefined='$wl-flat_namespace $wl-undefined ${wl}suppress' ;;
        *)
          _lt_dar_allow_undefined='$wl-undefined ${wl}dynamic_lookup' ;;
      esac
```
2023-11-08 22:33:09 +01:00
Adam J. Stewart
e114853115 py-lightning: add v2.1.1 (#40957) 2023-11-08 13:15:23 -08:00
Cameron Smith
89fc9a9d47 lcov: add version2, embed perl path in binaries (#39342)
* lcov: add version2, perl dep at build and runtime
* lcov: add runtime deps
* namespace-autoclean: new perl package
* datetime: dep on autoclean
* formatting
2023-11-08 11:23:23 -08:00
Massimiliano Culpo
afc693645a tcl: filter compiler wrappers to avoid pointing to Spack (#40946) 2023-11-08 19:38:41 +01:00
downloadico
4ac0e511ad abinit: add v9.10.3 (#40919)
* abinit: add v9.10.3
Changed configure arguments for specifying how to use Wannier90 for versions
after 9.8.
When the mpi variant is requested, set the F90 environment variable to point
to the MPI Fortran wrapper when building versions after 9.8 instead of FC.
---------

Co-authored-by: Alec Scott <hi@alecbcs.com>
2023-11-08 10:15:49 -08:00
Henri Menke
b0355d6cc0 ScaFaCoS 1.0.4 (#40948) 2023-11-08 10:17:58 -07:00
Konstantinos Parasyris
300d53d6f8 Add new tag on AMS (#40949) 2023-11-08 08:52:53 -08:00
Greg Becker
0b344e0fd3 tutorial stack: update for changes to the basics section for SC23 (#40942) 2023-11-07 23:46:57 -08:00
Peter Scheibel
15adb308bf RAJA package: find libs (#40885) 2023-11-08 08:33:04 +01:00
Michael Kuhn
050d565375 julia: constrain patchelf version (#40938)
* julia: constrain patchelf version

patchelf@0.18 breaks (at least) `libjulia-internal.so`, leading to
errors like:
```
$ julia --version
ERROR: Unable to load dependent library $SPACK/opt/spack/linux-centos8-x86_64_v3/gcc-12.3.0/julia-1.9.2-6hf5qx2q27jth2fkm6kgqmfdlhzzw6pl/bin/../lib/julia/libjulia-internal.so.1
Message:$SPACK/opt/spack/linux-centos8-x86_64_v3/gcc-12.3.0/julia-1.9.2-6hf5qx2q27jth2fkm6kgqmfdlhzzw6pl/bin/../lib/julia/libjulia-internal.so.1: ELF load command address/offset not properly aligned
```

* patchelf: prefer v0.17.x since v0.18 breaks libraries
2023-11-08 08:13:54 +01:00
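In a recipe this kind of constraint is usually expressed as a version-ranged build dependency; a minimal hedged sketch (the lower bound and class name are assumptions):

```python
from spack.package import *


class JuliaLike(MakefilePackage):
    """Hypothetical package pinning patchelf below the broken release."""

    # patchelf 0.18 misaligns ELF load commands in libjulia-internal.so,
    # so stay on the 0.17 series (lower bound chosen for illustration).
    depends_on("patchelf@0.13:0.17", type="build")
```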
Matthew Thompson
f6ef2c254e mapl: add v2.41 and v2.42 (#40870)
* mapl: add 2.41 and 2.42

* Conflict MPICH 3
2023-11-07 17:36:11 -08:00
Freifrau von Bleifrei
62c27b1924 discotec: add compression variant (#40925) 2023-11-07 14:58:48 -08:00
SWAT Team (JSC)
2ff0766aa4 adds cubew 4.8.1, cubelib 4.8.1 and cubegui 4.8.1, 4.8.2 (#40612)
* exago: fix v1.5.1 tag; only allow python up to 3.10 for @:1.5 (#40676)
* exago: fix v1.5.1 tag; only allow python up to 3.10 for @:1.5 due to pybind error with py 3.11
* hiop@:1.0 +cuda: constrain to cuda@:11.9
* fixes syntax of maintainers

---------

Co-authored-by: eugeneswalker <38933153+eugeneswalker@users.noreply.github.com>
2023-11-07 14:40:36 -08:00
Mark W. Krentel
dc245e87f9 intel-xed: fix git hash for mbuild, add version 2023.10.11 (#40922)
* intel-xed: fix git hash for mbuild, add version 2023.10.11
   Fixes #40912
* Fix the git commit hash for mbuild 2022.04.17.  This was broken in
   commit eef9939c21 by mixing up the hashes for xed versus mbuild.
* Add versions 2023.08.21 and 2023.10.11.
* fix style
2023-11-07 14:36:42 -08:00
Harmen Stoppels
c1f134e2a0 tutorial: use lmod@8.7.18 because @8.7.19: has bugs (#40939) 2023-11-07 23:04:45 +01:00
Mosè Giordano
391940d2eb julia: Add v1.9.3 (#40911) 2023-11-07 22:06:12 +01:00
Adam J. Stewart
8c061e51e3 sleef: build shared libs (#40893) 2023-11-07 21:48:59 +01:00
Richarda Butler
5774df6b7a Propagate variant across nodes that don't have that variant (#38512)
Before this PR, variant were not propagated to leaf nodes that could accept 
the propagated value, if some intermediate node couldn't accept it.

This PR fixes that issue by marking nodes as "candidate" for propagation
and by setting the variant only if it can be accepted by the node.

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2023-11-07 21:04:41 +01:00
Harmen Stoppels
3a5c1eb5f3 tutorial pipeline: force gcc@12.3.0 (#40937) 2023-11-07 20:53:44 +01:00
Harmen Stoppels
3a2ec729f7 Ensure global command line arguments end up in args like before (#40929) 2023-11-07 20:35:56 +01:00
Jacob King
a093f4a8ce superlu-dist: add +parmetis variant. (#40746)
* Expose ability to make parmetis an optional superlu-dist dependency to
spack package management.

* rename parmetis variant: Enable ParMETIS library

---------

Co-authored-by: eugeneswalker <eugenesunsetwalker@gmail.com>
2023-11-07 10:21:38 -08:00
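The optional-dependency pattern described above usually combines a variant, a conditional `depends_on`, and a build-system toggle; a hedged sketch, where the CMake option name and class name are assumptions:

```python
from spack.package import *


class SuperluDistLike(CMakePackage):
    """Hypothetical package with an optional ParMETIS dependency."""

    variant("parmetis", default=True, description="Enable ParMETIS library")
    depends_on("parmetis", when="+parmetis")

    def cmake_args(self):
        # Map the Spack variant onto the build system's own switch.
        return [self.define_from_variant("TPL_ENABLE_PARMETISLIB", "parmetis")]
```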
Scott Wittenburg
b8302a8277 ci: do not retry timed out build jobs (#40936) 2023-11-07 17:44:28 +00:00
Massimiliano Culpo
32f319157d Update the branch for the tutorial command (#40934) 2023-11-07 16:59:48 +00:00
Harmen Stoppels
75dfad8788 catch exceptions in which_string (#40935) 2023-11-07 17:17:31 +01:00
Vanessasaurus
f3ba20db26 fix configure args for darshan-runtime (#40873)
Problem: the current configure arguments are added as lists to a list,
but they need to be added as strings to the same list.
Solution: ensure we add each item (string) separately.

Signed-off-by: vsoch <vsoch@users.noreply.github.com>
Co-authored-by: vsoch <vsoch@users.noreply.github.com>
2023-11-07 07:00:28 -08:00
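The bug class is easy to reproduce in plain Python; below is an illustrative sketch (the option strings are made up), showing the list-in-a-list mistake and the fix.

```python
def configure_args():
    args = []
    # Buggy: this appends a single *list* element, so the build later receives
    # a non-string argument.
    #   args.append(["--with-log-path=/tmp", "--with-jobid-env=NONE"])
    # Fixed: add each option as its own string.
    args.append("--with-log-path=/tmp")
    args.append("--with-jobid-env=NONE")
    return args


print(configure_args())  # ['--with-log-path=/tmp', '--with-jobid-env=NONE']
```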
Rob Falgout
6301edbd5d Update package.py for new release 2.30.0 (#40907) 2023-11-07 07:58:00 -07:00
957 changed files with 17563 additions and 8702 deletions

View File

@@ -23,7 +23,7 @@ jobs:
operating_system: ["ubuntu-latest", "macos-latest"]
steps:
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236 # @v2
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c # @v2
with:
python-version: ${{inputs.python_version}}
- name: Install Python packages

View File

@@ -159,7 +159,7 @@ jobs:
brew install cmake bison@2.7 tree
- name: Checkout
uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236 # @v2
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c # @v2
with:
python-version: "3.12"
- name: Bootstrap clingo

View File

@@ -57,7 +57,7 @@ jobs:
- name: Checkout
uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
- uses: docker/metadata-action@96383f45573cb7f253c731d3b3ab81c87ef81934
- uses: docker/metadata-action@9dc751fe249ad99385a2583ee0d084c400eee04e
id: docker_meta
with:
images: |
@@ -113,7 +113,7 @@ jobs:
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Build & Deploy ${{ matrix.dockerfile[0] }}
uses: docker/build-push-action@0565240e2d4ab88bba5387d719585280857ece09
uses: docker/build-push-action@4a13e500e55cf31b7a5d59a38ab2040ab0f42f56
with:
context: dockerfiles/${{ matrix.dockerfile[0] }}
platforms: ${{ matrix.dockerfile[1] }}

View File

@@ -17,7 +17,7 @@ jobs:
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
with:
fetch-depth: 0
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c
with:
python-version: 3.9
- name: Install Python packages

View File

@@ -1,7 +1,7 @@
black==23.10.1
black==23.12.0
clingo==5.6.2
flake8==6.1.0
isort==5.12.0
mypy==1.6.1
isort==5.13.2
mypy==1.7.1
types-six==1.16.21.9
vermin==1.5.2
vermin==1.6.0

View File

@@ -54,7 +54,7 @@ jobs:
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236 # @v2
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c # @v2
with:
python-version: ${{ matrix.python-version }}
- name: Install System packages
@@ -101,7 +101,7 @@ jobs:
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236 # @v2
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c # @v2
with:
python-version: '3.11'
- name: Install System packages
@@ -159,7 +159,7 @@ jobs:
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236 # @v2
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c # @v2
with:
python-version: '3.11'
- name: Install System packages
@@ -194,7 +194,7 @@ jobs:
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236 # @v2
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c # @v2
with:
python-version: ${{ matrix.python-version }}
- name: Install Python packages
@@ -215,7 +215,7 @@ jobs:
$(which spack) bootstrap disable spack-install
$(which spack) solve zlib
common_args=(--dist loadfile --tx '4*popen//python=./bin/spack-tmpconfig python -u ./bin/spack python' -x)
$(which spack) unit-test --cov --cov-config=pyproject.toml --cov-report=xml:coverage.xml "${common_args[@]}"
$(which spack) unit-test --verbose --cov --cov-config=pyproject.toml --cov-report=xml:coverage.xml "${common_args[@]}"
- uses: codecov/codecov-action@eaaf4bedf32dbdc6b720b63067d99c4d77d6047d
with:
flags: unittests,macos

View File

@@ -19,7 +19,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c
with:
python-version: '3.11'
cache: 'pip'
@@ -38,7 +38,7 @@ jobs:
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
with:
fetch-depth: 0
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c
with:
python-version: '3.11'
cache: 'pip'

View File

@@ -18,7 +18,7 @@ jobs:
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
with:
fetch-depth: 0
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c
with:
python-version: 3.9
- name: Install Python packages
@@ -42,7 +42,7 @@ jobs:
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
with:
fetch-depth: 0
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c
with:
python-version: 3.9
- name: Install Python packages
@@ -66,7 +66,7 @@ jobs:
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
with:
fetch-depth: 0
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c
with:
python-version: 3.9
- name: Install Python packages

View File

@@ -1,3 +1,290 @@
# v0.21.0 (2023-11-11)
`v0.21.0` is a major feature release.
## Features in this release
1. **Better error messages with condition chaining**
In v0.18, we added better error messages that could tell you what problem happened,
but they couldn't tell you *why* it happened. `0.21` adds *condition chaining* to the
solver, and Spack can now trace back through the conditions that led to an error and
build a tree of potential causes and where they came from. For example:
```console
$ spack solve hdf5 ^cmake@3.0.1
==> Error: concretization failed for the following reasons:
1. Cannot satisfy 'cmake@3.0.1'
2. Cannot satisfy 'cmake@3.0.1'
required because hdf5 ^cmake@3.0.1 requested from CLI
3. Cannot satisfy 'cmake@3.18:' and 'cmake@3.0.1
required because hdf5 ^cmake@3.0.1 requested from CLI
required because hdf5 depends on cmake@3.18: when @1.13:
required because hdf5 ^cmake@3.0.1 requested from CLI
4. Cannot satisfy 'cmake@3.12:' and 'cmake@3.0.1
required because hdf5 depends on cmake@3.12:
required because hdf5 ^cmake@3.0.1 requested from CLI
required because hdf5 ^cmake@3.0.1 requested from CLI
```
More details in #40173.
2. **OCI build caches**
You can now use an arbitrary [OCI](https://opencontainers.org) registry as a build
cache:
```console
$ spack mirror add my_registry oci://user/image # Dockerhub
$ spack mirror add my_registry oci://ghcr.io/haampie/spack-test # GHCR
$ spack mirror set --push --oci-username ... --oci-password ... my_registry # set login creds
$ spack buildcache push my_registry [specs...]
```
And you can optionally add a base image to get *runnable* images:
```console
$ spack buildcache push --base-image ubuntu:23.04 my_registry python
Pushed ... as [image]:python-3.11.2-65txfcpqbmpawclvtasuog4yzmxwaoia.spack
$ docker run --rm -it [image]:python-3.11.2-65txfcpqbmpawclvtasuog4yzmxwaoia.spack
```
This creates a container image from the Spack installations on the host system,
without the need to run `spack install` from a `Dockerfile` or `sif` file. It also
addresses the inconvenience of losing binaries of dependencies when `RUN spack
install` fails inside `docker build`.
Further, the container image layers and build cache tarballs are the same files. This
means that `spack install` and `docker pull` use the exact same underlying binaries.
If you previously used `spack install` inside of `docker build`, this feature helps
you save storage by a factor of two.
More details in #38358.
3. **Multiple versions of build dependencies**
Increasingly, complex package builds require multiple versions of some build
dependencies. For example, Python packages frequently require very specific versions
of `setuptools`, `cython`, and sometimes different physics packages require different
versions of Python to build. The concretizer enforced that every solve was *unified*,
i.e., that there only be one version of every package. The concretizer now supports
"duplicate" nodes for *build dependencies*, but enforces unification through
transitive link and run dependencies. This will allow it to better resolve complex
dependency graphs in ecosystems like Python, and it also gets us very close to
modeling compilers as proper dependencies.
This change required a major overhaul of the concretizer, as well as a number of
performance optimizations. See #38447, #39621.
4. **Cherry-picking virtual dependencies**
You can now select only a subset of virtual dependencies from a spec that may provide
more. For example, if you want `mpich` to be your `mpi` provider, you can be explicit
by writing:
```
hdf5 ^[virtuals=mpi] mpich
```
Or, if you want to use, e.g., `intel-parallel-studio` for `blas` along with an external
`lapack` like `openblas`, you could write:
```
strumpack ^[virtuals=mpi] intel-parallel-studio+mkl ^[virtuals=lapack] openblas
```
The `virtuals=mpi` is an edge attribute, and dependency edges in Spack graphs now
track which virtuals they satisfied. More details in #17229 and #35322.
Note for packaging: in Spack 0.21 `spec.satisfies("^virtual")` is true if and only if
the package specifies `depends_on("virtual")`. This is different from Spack 0.20,
where depending on a provider implied depending on the virtual provided. See #41002
for an example where `^mkl` was being used to test for several `mkl` providers in a
package that did not depend on `mkl`.
5. **License directive**
Spack packages can now have license metadata, with the new `license()` directive:
```python
license("Apache-2.0")
```
Licenses use [SPDX identifiers](https://spdx.org/licenses), and you can use SPDX
expressions to combine them:
```python
license("Apache-2.0 OR MIT")
```
Like other directives in Spack, it's conditional, so you can handle complex cases like
Spack itself:
```python
license("LGPL-2.1", when="@:0.11")
license("Apache-2.0 OR MIT", when="@0.12:")
```
More details in #39346, #40598.
6. **`spack deconcretize` command**
We are getting close to having a `spack update` command for environments, but we're
not quite there yet. This is the next best thing. `spack deconcretize` gives you
control over what you want to update in an already concrete environment. If you have
an environment built with, say, `meson`, and you want to update your `meson` version,
you can run:
```console
spack deconcretize meson
```
and have everything that depends on `meson` rebuilt the next time you run `spack
concretize`. In a future Spack version, we'll handle all of this in a single command,
but for now you can use this to drop bits of your lockfile and resolve your
dependencies again. More in #38803.
7. **UI Improvements**
The venerable `spack info` command was looking shabby compared to the rest of Spack's
UI, so we reworked it to have a bit more flair. `spack info` now makes much better
use of terminal space and shows variants, their values, and their descriptions much
more clearly. Conditional variants are grouped separately so you can more easily
understand how packages are structured. More in #40998.
`spack checksum` now allows you to filter versions from your editor, or by version
range. It also notifies you about potential download URL changes. See #40403.
8. **Environments can include definitions**
Spack did not previously support using `include:` with the
[definitions](https://spack.readthedocs.io/en/latest/environments.html#spec-list-references)
section of an environment, but now it does. You can use this to curate lists of specs
and more easily reuse them across environments. See #33960.
9. **Aliases**
You can now add aliases to Spack commands in `config.yaml`, e.g. this might enshrine
your favorite args to `spack find` as `spack f`:
```yaml
config:
  aliases:
    f: find -lv
```
See #17229.
10. **Improved autoloading of modules**
Spack 0.20 was the first release to enable autoloading of direct dependencies in
module files.
The downside of this was that `module avail` and `module load` tab completion would
show users too many modules to choose from, and many users disabled generating
modules for dependencies through `exclude_implicits: true`. Further, it was
necessary to keep hashes in module names to avoid file name clashes.
In this release, you can start using `hide_implicits: true` instead, which exposes
only explicitly installed packages to the user, while still autoloading
dependencies. On top of that, you can safely use `hash_length: 0`, as this config
now only applies to the modules exposed to the user -- you don't have to worry about
file name clashes for hidden dependencies.
Note: for `tcl` this feature requires Modules 4.7 or higher.
11. **Updated container labeling**
Nightly Docker images from the `develop` branch will now be tagged as `:develop` and
`:nightly`. The `:latest` tag is no longer associated with `:develop`, but with the
latest stable release. Releases will be tagged with `:{major}`, `:{major}.{minor}`
and `:{major}.{minor}.{patch}`. `ubuntu:18.04` has also been removed from the list of
generated Docker images, as it is no longer supported. See #40593.
## Other new commands and directives
* `spack env activate` without arguments now loads a `default` environment that you do
not have to create (#40756).
* `spack find -H` / `--hashes`: a new shortcut for piping `spack find` output to
other commands (#38663)
* Add `spack checksum --verify`, fix `--add` (#38458)
* New `default_args` context manager factors out common args for directives (#39964); see the sketch after this list
* `spack compiler find --[no]-mixed-toolchain` lets you easily mix `clang` and
`gfortran` on Linux (#40902)
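To illustrate the new `default_args` context manager mentioned above, here is a minimal sketch of how it might be used in a `package.py` (the package class and dependency specs are hypothetical, not taken from #39964):
```python
# Hypothetical package.py snippet; the names below are available via
# `from spack.package import *` in a Spack package file.
class Example(CMakePackage):
    # Both directives inherit type="build" from the context manager,
    # instead of repeating the argument on every line.
    with default_args(type="build"):
        depends_on("cmake@3.20:")
        depends_on("ninja")
```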
## Performance improvements
* `spack external find` execution is now much faster (#39843)
* `spack location -i` now much faster on success (#40898)
* Drop redundant rpaths post install (#38976)
* ASP-based solver: avoid cycles in clingo using hidden directive (#40720)
* Fix multiple quadratic complexity issues in environments (#38771)
## Other new features of note
* archspec: update to v0.2.2, support for Sapphire Rapids, Power10, Neoverse V2 (#40917)
* Propagate variants across nodes that don't have that variant (#38512)
* Implement fish completion (#29549)
* Can now distinguish between source/binary mirror; don't ping mirror.spack.io as much (#34523)
* Improve status reporting on install (add [n/total] display) (#37903)
## Windows
This release has the best Windows support of any Spack release yet, with numerous
improvements and much larger swaths of tests passing:
* MSVC and SDK improvements (#37711, #37930, #38500, #39823, #39180)
* Windows external finding: update default paths; treat .bat as executable on Windows (#39850)
* Windows decompression: fix removal of intermediate file (#38958)
* Windows: executable/path handling (#37762)
* Windows build systems: use ninja and enable tests (#33589)
* Windows testing (#36970, #36972, #36973, #36840, #36977, #36792, #36834, #34696, #36971)
* Windows PowerShell support (#39118, #37951)
* Windows symlinking and libraries (#39933, #38599, #34701, #38578)
## Notable refactors
* User-specified flags take precedence over others in Spack compiler wrappers (#37376)
* Improve setup of build, run, and test environments (#35737, #40916)
* `make` is no longer a required system dependency of Spack (#40380)
* Support Python 3.12 (#40404, #40155, #40153)
* docs: Replace package list with packages.spack.io (#40251)
* Drop Python 2 constructs in Spack (#38720, #38718, #38703)
## Binary cache and stack updates
* e4s arm stack: duplicate and target neoverse v1 (#40369)
* Add macOS ML CI stacks (#36586)
* E4S Cray CI Stack (#37837)
* e4s cray: expand spec list (#38947)
* e4s cray sles ci: expand spec list (#39081)
## Removals, deprecations, and syntax changes
* ASP: targets, compilers and providers soft-preferences are only global (#31261)
* Parser: fix ambiguity with whitespace in version ranges (#40344)
* Module file generation is disabled by default; you'll need to enable it to use it (#37258)
* Remove deprecated "extra_instructions" option for containers (#40365)
* Stand-alone test feature deprecation postponed to v0.22 (#40600)
* buildcache push: make `--allow-root` the default and deprecate the option (#38878)
## Notable Bugfixes
* Bugfix: propagation of multivalued variants (#39833)
* Allow `/` in git versions (#39398)
* Fetch & patch: actually acquire stage lock, and many more issues (#38903)
* Environment/depfile: better escaping of targets with Git versions (#37560)
* Prevent "spack external find" to error out on wrong permissions (#38755)
* lmod: allow core compiler to be specified with a version range (#37789)
## Spack community stats
* 7,469 total packages, 303 new since `v0.20.0`
* 150 new Python packages
* 34 new R packages
* 353 people contributed to this release
* 336 committers to packages
* 65 committers to core
# v0.20.3 (2023-10-31)
## Bugfixes

View File

@@ -1,13 +1,34 @@
# <img src="https://cdn.rawgit.com/spack/spack/develop/share/spack/logo/spack-logo.svg" width="64" valign="middle" alt="Spack"/> Spack
<div align="left">
[![Unit Tests](https://github.com/spack/spack/workflows/linux%20tests/badge.svg)](https://github.com/spack/spack/actions)
[![Bootstrapping](https://github.com/spack/spack/actions/workflows/bootstrap.yml/badge.svg)](https://github.com/spack/spack/actions/workflows/bootstrap.yml)
[![codecov](https://codecov.io/gh/spack/spack/branch/develop/graph/badge.svg)](https://codecov.io/gh/spack/spack)
[![Containers](https://github.com/spack/spack/actions/workflows/build-containers.yml/badge.svg)](https://github.com/spack/spack/actions/workflows/build-containers.yml)
[![Read the Docs](https://readthedocs.org/projects/spack/badge/?version=latest)](https://spack.readthedocs.io)
[![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)
[![Slack](https://slack.spack.io/badge.svg)](https://slack.spack.io)
[![Matrix](https://img.shields.io/matrix/spack-space%3Amatrix.org?label=Matrix)](https://matrix.to/#/#spack-space:matrix.org)
<h2>
<picture>
<source media="(prefers-color-scheme: dark)" srcset="https://cdn.rawgit.com/spack/spack/develop/share/spack/logo/spack-logo-white-text.svg" width="250">
<source media="(prefers-color-scheme: light)" srcset="https://cdn.rawgit.com/spack/spack/develop/share/spack/logo/spack-logo-text.svg" width="250">
<img alt="Spack" src="https://cdn.rawgit.com/spack/spack/develop/share/spack/logo/spack-logo-text.svg" width="250">
</picture>
<br>
<br clear="all">
<a href="https://github.com/spack/spack/actions/workflows/ci.yml"><img src="https://github.com/spack/spack/workflows/ci/badge.svg" alt="CI Status"></a>
<a href="https://github.com/spack/spack/actions/workflows/bootstrapping.yml"><img src="https://github.com/spack/spack/actions/workflows/bootstrap.yml/badge.svg" alt="Bootstrap Status"></a>
<a href="https://github.com/spack/spack/actions/workflows/build-containers.yml"><img src="https://github.com/spack/spack/actions/workflows/build-containers.yml/badge.svg" alt="Containers Status"></a>
<a href="https://spack.readthedocs.io"><img src="https://readthedocs.org/projects/spack/badge/?version=latest" alt="Documentation Status"></a>
<a href="https://codecov.io/gh/spack/spack"><img src="https://codecov.io/gh/spack/spack/branch/develop/graph/badge.svg" alt="Code coverage"/></a>
<a href="https://slack.spack.io"><img src="https://slack.spack.io/badge.svg" alt="Slack"/></a>
<a href="https://matrix.to/#/#spack-space:matrix.org"><img src="https://img.shields.io/matrix/spack-space%3Amatrix.org?label=matrix" alt="Matrix"/></a>
</h2>
**[Getting Started] &nbsp;&nbsp; [Config] &nbsp;&nbsp; [Community] &nbsp;&nbsp; [Contributing] &nbsp;&nbsp; [Packaging Guide]**
[Getting Started]: https://spack.readthedocs.io/en/latest/getting_started.html
[Config]: https://spack.readthedocs.io/en/latest/configuration.html
[Community]: #community
[Contributing]: https://spack.readthedocs.io/en/latest/contribution_guide.html
[Packaging Guide]: https://spack.readthedocs.io/en/latest/packaging_guide.html
</div>
Spack is a multi-platform package manager that builds and installs
multiple versions and configurations of software. It works on Linux,
@@ -66,10 +87,11 @@ Resources:
* **Matrix space**: [#spack-space:matrix.org](https://matrix.to/#/#spack-space:matrix.org):
[bridged](https://github.com/matrix-org/matrix-appservice-slack#matrix-appservice-slack) to Slack.
* [**Github Discussions**](https://github.com/spack/spack/discussions):
not just for discussions, but also Q&A.
* **Mailing list**: [groups.google.com/d/forum/spack](https://groups.google.com/d/forum/spack)
for Q&A and discussions. Note the pinned discussions for announcements.
* **Twitter**: [@spackpm](https://twitter.com/spackpm). Be sure to
`@mention` us!
* **Mailing list**: [groups.google.com/d/forum/spack](https://groups.google.com/d/forum/spack):
only for announcements. Please use other venues for discussions.
Contributing
------------------------

View File

@@ -50,4 +50,4 @@ packages:
# Apple bundles libuuid in libsystem_c version 1353.100.2,
# although the version number used here isn't critical
- spec: apple-libuuid@1353.100.2
prefix: /Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX.sdk
prefix: /Library/Developer/CommandLineTools/SDKs/MacOSX.sdk

View File

@@ -46,12 +46,10 @@ modules:
tcl:
all:
autoload: direct
hide_implicits: true
# Default configurations if lmod is enabled
lmod:
all:
autoload: direct
hide_implicits: true
hierarchy:
- mpi

View File

@@ -153,7 +153,43 @@ keyring, and trusting all downloaded keys.
List of popular build caches
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
* `Extreme-scale Scientific Software Stack (E4S) <https://e4s-project.github.io/>`_: `build cache <https://oaciss.uoregon.edu/e4s/inventory.html>`_
-------------------
Build cache signing
-------------------
By default, Spack adds a cryptographic signature to each package pushed to
a build cache and verifies the signature when installing from a build cache.
Keys for signing can be managed with the :ref:`spack gpg <cmd-spack-gpg>` command,
as well as ``spack buildcache keys`` as mentioned above.
You can disable signing when pushing with ``spack buildcache push --unsigned``,
and disable verification when installing from any build cache with
``spack install --no-check-signature``.
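For example, the two flags mentioned above are used as follows (an illustrative sketch; ``<mirror>`` and ``<spec>`` are placeholders):

.. code-block:: console

   $ spack buildcache push --unsigned <mirror> <spec>   # push without signing
   $ spack install --no-check-signature <spec>          # install without verifying signatures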
Alternatively, signing and verification can be enabled or disabled on a per build cache
basis:
.. code-block:: console
$ spack mirror add --signed <name> <url> # enable signing and verification
$ spack mirror add --unsigned <name> <url> # disable signing and verification
$ spack mirror set --signed <name> # enable signing and verification for an existing mirror
$ spack mirror set --unsigned <name> # disable signing and verification for an existing mirror
Or you can directly edit the ``mirrors.yaml`` configuration file:
.. code-block:: yaml
mirrors:
  <name>:
    url: <url>
    signed: false # disable signing and verification
See also :ref:`mirrors`.
----------
Relocation
@@ -182,6 +218,7 @@ section of the configuration:
padded_length: 128
.. _binary_caches_oci:
-----------------------------------------
OCI / Docker V2 registries as build cache
@@ -250,87 +287,13 @@ To significantly speed up Spack in GitHub Actions, binaries can be cached in
GitHub Packages. This service is an OCI registry that can be linked to a GitHub
repository.
A typical workflow is to include a ``spack.yaml`` environment in your repository
that specifies the packages to install, the target architecture, and the build
cache to use under ``mirrors``:
.. code-block:: yaml
   spack:
     specs:
     - python@3.11

     config:
       install_tree:
         root: /opt/spack
         padded_length: 128

     packages:
       all:
         require: target=x86_64_v2

     mirrors:
       local-buildcache: oci://ghcr.io/<organization>/<repository>
A GitHub action can then be used to install the packages and push them to the
build cache:
.. code-block:: yaml
   name: Install Spack packages

   on: push

   env:
     SPACK_COLOR: always

   jobs:
     example:
       runs-on: ubuntu-22.04
       permissions:
         packages: write
       steps:
         - name: Checkout
           uses: actions/checkout@v3

         - name: Checkout Spack
           uses: actions/checkout@v3
           with:
             repository: spack/spack
             path: spack

         - name: Setup Spack
           run: echo "$PWD/spack/bin" >> "$GITHUB_PATH"

         - name: Concretize
           run: spack -e . concretize

         - name: Install
           run: spack -e . install --no-check-signature

         - name: Run tests
           run: ./my_view/bin/python3 -c 'print("hello world")'

         - name: Push to buildcache
           run: |
             spack -e . mirror set --oci-username ${{ github.actor }} --oci-password "${{ secrets.GITHUB_TOKEN }}" local-buildcache
             spack -e . buildcache push --base-image ubuntu:22.04 --unsigned --update-index local-buildcache
           if: ${{ !cancelled() }}
The first time this action runs, it will build the packages from source and
push them to the build cache. Subsequent runs will pull the binaries from the
build cache. The concretizer will ensure that prebuilt binaries are favored
over source builds.
The build cache entries appear in the GitHub Packages section of your repository,
and contain instructions for pulling and running them with ``docker`` or ``podman``.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Using Spack's public build cache for GitHub Actions
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Spack offers a public build cache for GitHub Actions with a set of common packages,
which lets you get started quickly. See the following resources for more information:
* `spack/github-actions-buildcache <https://github.com/spack/github-actions-buildcache>`_
* `spack/setup-spack <https://github.com/spack/setup-spack>`_ for setting up Spack in GitHub
Actions
* `spack/github-actions-buildcache <https://github.com/spack/github-actions-buildcache>`_ for
more details on the public build cache
.. _cmd-spack-buildcache:

View File

@@ -37,7 +37,11 @@ to enable reuse for a single installation, and you can use:
spack install --fresh <spec>
to do a fresh install if ``reuse`` is enabled by default.
``reuse: true`` is the default.
``reuse: dependencies`` is the default.
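For reference, this setting lives under the ``concretizer`` section of the configuration, e.g. in ``concretizer.yaml`` (a minimal sketch):

.. code-block:: yaml

   concretizer:
     # 'dependencies' reuses already-installed dependencies only; 'true' also
     # reuses root specs, while 'false' forces fresh concretization.
     reuse: dependencies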
.. seealso::
FAQ: :ref:`Why does Spack pick particular versions and variants? <faq-concretizer-precedence>`
------------------------------------------
Selection of the target microarchitectures
@@ -99,547 +103,3 @@ while `py-numpy` still needs an older version:
Up to Spack v0.20 ``duplicates:strategy:none`` was the default (and only) behavior. From Spack v0.21 the
default behavior is ``duplicates:strategy:minimal``.
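Expressed as configuration, the new default corresponds to the following sketch:

.. code-block:: yaml

   concretizer:
     duplicates:
       # 'none' reproduces the fully unified pre-v0.21 behavior;
       # 'minimal' allows duplicate build dependencies where needed.
       strategy: minimal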
.. _build-settings:
================================
Package Settings (packages.yaml)
================================
Spack allows you to customize how your software is built through the
``packages.yaml`` file. Using it, you can make Spack prefer particular
implementations of virtual dependencies (e.g., MPI or BLAS/LAPACK),
or you can make it prefer to build with particular compilers. You can
also tell Spack to use *external* software installations already
present on your system.
At a high level, the ``packages.yaml`` file is structured like this:
.. code-block:: yaml
   packages:
     package1:
       # settings for package1
     package2:
       # settings for package2
     # ...
     all:
       # settings that apply to all packages.
So you can either set build preferences specifically for *one* package,
or you can specify that certain settings should apply to *all* packages.
The types of settings you can customize are described in detail below.
Spack's build defaults are in the default
``etc/spack/defaults/packages.yaml`` file. You can override them in
``~/.spack/packages.yaml`` or ``etc/spack/packages.yaml``. For more
details on how this works, see :ref:`configuration-scopes`.
.. _sec-external-packages:
-----------------
External Packages
-----------------
Spack can be configured to use externally-installed
packages rather than building its own packages. This may be desirable
if machines ship with system packages, such as a customized MPI
that should be used instead of Spack building its own MPI.
External packages are configured through the ``packages.yaml`` file.
Here's an example of an external configuration:
.. code-block:: yaml
   packages:
     openmpi:
       externals:
       - spec: "openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64"
         prefix: /opt/openmpi-1.4.3
       - spec: "openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64+debug"
         prefix: /opt/openmpi-1.4.3-debug
       - spec: "openmpi@1.6.5%intel@10.1 arch=linux-debian7-x86_64"
         prefix: /opt/openmpi-1.6.5-intel
This example lists three installations of OpenMPI, one built with GCC,
one built with GCC and debug information, and another built with Intel.
If Spack is asked to build a package that uses one of these MPIs as a
dependency, it will use the pre-installed OpenMPI in
the given directory. Note that the specified path is the top-level
install prefix, not the ``bin`` subdirectory.
``packages.yaml`` can also be used to specify modules to load instead
of the installation prefixes. The following example says that module
``CMake/3.7.2`` provides cmake version 3.7.2.
.. code-block:: yaml
   cmake:
     externals:
     - spec: cmake@3.7.2
       modules:
       - CMake/3.7.2
Each ``packages.yaml`` begins with a ``packages:`` attribute, followed
by a list of package names. To specify externals, add an ``externals:``
attribute under the package name, which lists externals.
Each external should specify a ``spec:`` string that should be as
well-defined as reasonably possible. If a
package lacks a spec component, such as missing a compiler or
package version, then Spack will guess the missing component based
on its most-favored packages, and it may guess incorrectly.
Each package version and compiler listed in an external should
have entries in Spack's packages and compiler configuration, even
though the package and compiler may not ever be built.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Prevent packages from being built from sources
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Adding an external spec in ``packages.yaml`` allows Spack to use an external location,
but it does not prevent Spack from building packages from sources. In the above example,
Spack might choose for many valid reasons to start building and linking with the
latest version of OpenMPI rather than continue using the pre-installed OpenMPI versions.
To prevent this, the ``packages.yaml`` configuration also allows packages
to be flagged as non-buildable. The previous example could be modified to
be:
.. code-block:: yaml
   packages:
     openmpi:
       externals:
       - spec: "openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64"
         prefix: /opt/openmpi-1.4.3
       - spec: "openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64+debug"
         prefix: /opt/openmpi-1.4.3-debug
       - spec: "openmpi@1.6.5%intel@10.1 arch=linux-debian7-x86_64"
         prefix: /opt/openmpi-1.6.5-intel
       buildable: False
The addition of the ``buildable`` flag tells Spack that it should never build
its own version of OpenMPI from sources, and it will instead always rely on a pre-built
OpenMPI.
.. note::
If ``concretizer:reuse`` is on (see :ref:`concretizer-options` for more information on that flag)
pre-built specs include specs already available from a local store, an upstream store, a registered
buildcache or specs marked as externals in ``packages.yaml``. If ``concretizer:reuse`` is off, only
external specs in ``packages.yaml`` are included in the list of pre-built specs.
If an external module is specified as not buildable, then Spack will load the
external module into the build environment which can be used for linking.
The ``buildable`` flag does not need to be paired with external packages.
It can also be used on its own to forbid packages that may be
buggy or otherwise undesirable.
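For example, a package can be marked non-buildable on its own, without listing any externals (the package name below is purely illustrative):

.. code-block:: yaml

   packages:
     somebuggypackage:   # hypothetical package that Spack should never build
       buildable: False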
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Non-buildable virtual packages
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Virtual packages in Spack can also be specified as not buildable, and
external implementations can be provided. In the example above,
OpenMPI is configured as not buildable, but Spack will often prefer
other MPI implementations over the externally available OpenMPI. Each
MPI provider could be marked as not buildable individually, but it is
more convenient to configure the ``mpi`` virtual package itself:
.. code-block:: yaml
   packages:
     mpi:
       buildable: False
     openmpi:
       externals:
       - spec: "openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64"
         prefix: /opt/openmpi-1.4.3
       - spec: "openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64+debug"
         prefix: /opt/openmpi-1.4.3-debug
       - spec: "openmpi@1.6.5%intel@10.1 arch=linux-debian7-x86_64"
         prefix: /opt/openmpi-1.6.5-intel
Spack can then use any of the listed external implementations of MPI
to satisfy a dependency, and will choose depending on the compiler and
architecture.
In cases where the concretizer is configured to reuse specs, and other ``mpi`` providers
(available via stores or buildcaches) are not wanted, Spack can be configured to require
specs matching only the available externals:
.. code-block:: yaml
   packages:
     mpi:
       buildable: False
       require:
       - one_of: [
           "openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64",
           "openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64+debug",
           "openmpi@1.6.5%intel@10.1 arch=linux-debian7-x86_64"
         ]
     openmpi:
       externals:
       - spec: "openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64"
         prefix: /opt/openmpi-1.4.3
       - spec: "openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64+debug"
         prefix: /opt/openmpi-1.4.3-debug
       - spec: "openmpi@1.6.5%intel@10.1 arch=linux-debian7-x86_64"
         prefix: /opt/openmpi-1.6.5-intel
This configuration prevents any spec using MPI and originating from stores or buildcaches from being reused,
unless it matches the requirements under ``packages:mpi:require``. For more information on requirements see
:ref:`package-requirements`.
.. _cmd-spack-external-find:
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Automatically Find External Packages
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
You can run the :ref:`spack external find <spack-external-find>` command
to search for system-provided packages and add them to ``packages.yaml``.
After running this command your ``packages.yaml`` may include new entries:
.. code-block:: yaml
   packages:
     cmake:
       externals:
       - spec: cmake@3.17.2
         prefix: /usr
This is generally useful for detecting a small set of commonly-used packages;
for now it is mostly limited to finding build-only dependencies.
Specific limitations include:
* Packages are not discoverable by default: For a package to be
discoverable with ``spack external find``, it needs to add special
logic. See :ref:`here <make-package-findable>` for more details.
* The logic does not search through module files; it can only detect
packages with executables defined in ``PATH``. You can help Spack locate
externals that use module files by loading the associated modules for
the packages you want Spack to know about before running
``spack external find``.
* Spack does not overwrite existing entries in the package configuration:
If there is an external defined for a spec at any configuration scope,
then Spack will not add a new external entry (``spack config blame packages``
can help locate all external entries).
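For reference, the commands mentioned above can be invoked as follows (illustrative):

.. code-block:: console

   $ spack external find cmake      # detect a specific package on the system
   $ spack config blame packages    # show which scope defines each external entry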
.. _package-requirements:
--------------------
Package Requirements
--------------------
Spack can be configured to always use certain compilers, package
versions, and variants during concretization through package
requirements.
Package requirements are useful when you find yourself repeatedly
specifying the same constraints on the command line, and wish that
Spack respects these constraints whether you mention them explicitly
or not. Another use case is specifying constraints that should apply
to all root specs in an environment, without having to repeat the
constraint everywhere.
Apart from that, requirements config is more flexible than constraints
on the command line, because it can specify constraints on packages
*when they occur* as a dependency. In contrast, on the command line it
is not possible to specify constraints on dependencies while also keeping
those dependencies optional.
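For example, a requirement placed on a package applies whenever that package appears in the DAG, even when it is only pulled in as a dependency; it does not force the package to be installed at all (the package and version below are illustrative):

.. code-block:: yaml

   packages:
     zlib:
       require: "@1.3:"   # constrains zlib only when something depends on it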
^^^^^^^^^^^^^^^^^^^
Requirements syntax
^^^^^^^^^^^^^^^^^^^
The package requirements configuration is specified in ``packages.yaml``,
keyed by package name and expressed using the Spec syntax. In the simplest
case you can specify attributes that you always want the package to have
by providing a single spec string to ``require``:
.. code-block:: yaml
   packages:
     libfabric:
       require: "@1.13.2"
In the above example, ``libfabric`` will always build with version 1.13.2. If you
need to compose multiple configuration scopes ``require`` accepts a list of
strings:
.. code-block:: yaml
   packages:
     libfabric:
       require:
       - "@1.13.2"
       - "%gcc"
In this case ``libfabric`` will always build with version 1.13.2 **and** using GCC
as a compiler.
For more complex use cases, ``require`` also accepts a list of objects. These objects
must have either an ``any_of`` or a ``one_of`` field containing a list of spec strings,
and they can optionally have ``when`` and ``message`` attributes:
.. code-block:: yaml
   packages:
     openmpi:
       require:
       - any_of: ["@4.1.5", "%gcc"]
         message: "in this example only 4.1.5 can build with other compilers"
``any_of`` is a list of specs. One of those specs must be satisfied
and it is also allowed for the concretized spec to match more than one.
In the above example, that means you could build ``openmpi@4.1.5%gcc``,
``openmpi@4.1.5%clang`` or ``openmpi@3.9%gcc``, but
not ``openmpi@3.9%clang``.
If a custom message is provided, and the requirement is not satisfiable,
Spack will print the custom error message:
.. code-block:: console
$ spack spec openmpi@3.9%clang
==> Error: in this example only 4.1.5 can build with other compilers
We could express a similar requirement using the ``when`` attribute:
.. code-block:: yaml
   packages:
     openmpi:
       require:
       - any_of: ["%gcc"]
         when: "@:4.1.4"
         message: "in this example only 4.1.5 can build with other compilers"
In the example above, if the version turns out to be 4.1.4 or less, we require the compiler to be GCC.
For readability, Spack also allows a ``spec`` key accepting a string when there is only a single
constraint:
.. code-block:: yaml
   packages:
     openmpi:
       require:
       - spec: "%gcc"
         when: "@:4.1.4"
         message: "in this example only 4.1.5 can build with other compilers"
This code snippet and the one before it are semantically equivalent.
Finally, instead of ``any_of`` you can use ``one_of`` which also takes a list of specs. The final
concretized spec must match one and only one of them:
.. code-block:: yaml
   packages:
     mpich:
       require:
       - one_of: ["+cuda", "+rocm"]
In the example above, that means you could build ``mpich+cuda`` or ``mpich+rocm`` but not ``mpich+cuda+rocm``.
.. note::
For ``any_of`` and ``one_of``, the order of specs indicates a
preference: items that appear earlier in the list are preferred
(note that these preferences can be ignored in favor of others).
.. note::
When using a conditional requirement, Spack is allowed to actively avoid the triggering
condition (the ``when=...`` spec) if that leads to a concrete spec with better scores in
the optimization criteria. To check the current optimization criteria and their
priorities you can run ``spack solve zlib``.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Setting default requirements
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
You can also set default requirements for all packages under ``all``
like this:
.. code-block:: yaml
   packages:
     all:
       require: '%clang'
which means every spec will be required to use ``clang`` as a compiler.
Note that in this case ``all`` represents a *default set of requirements* -
if there are specific package requirements, then the default requirements
under ``all`` are disregarded. For example, with a configuration like this:
.. code-block:: yaml
   packages:
     all:
       require: '%clang'
     cmake:
       require: '%gcc'
Spack requires ``cmake`` to use ``gcc`` and all other nodes (including ``cmake``
dependencies) to use ``clang``.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Setting requirements on virtual specs
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
A requirement on a virtual spec applies whenever that virtual is present in the DAG.
This can be useful for fixing which virtual provider you want to use:
.. code-block:: yaml
   packages:
     mpi:
       require: 'mvapich2 %gcc'
With the configuration above the only allowed ``mpi`` provider is ``mvapich2 %gcc``.
Requirements on the virtual spec and on the specific provider are both applied, if
present. For instance with a configuration like:
.. code-block:: yaml
   packages:
     mpi:
       require: 'mvapich2 %gcc'
     mvapich2:
       require: '~cuda'
you will use ``mvapich2~cuda %gcc`` as an ``mpi`` provider.
.. _package-preferences:
-------------------
Package Preferences
-------------------
In some cases package requirements can be too strong, and package
preferences are the better option. Package preferences do not impose
constraints on packages for particular versions or variant values;
they only set defaults. The concretizer is free to change
them if it must, due to other constraints, and also prefers reusing
installed packages over building new ones that are a better match for
preferences.
Most package preferences (``compilers``, ``target`` and ``providers``)
can only be set globally under the ``all`` section of ``packages.yaml``:
.. code-block:: yaml
   packages:
     all:
       compiler: [gcc@12.2.0, clang@12:, oneapi@2023:]
       target: [x86_64_v3]
       providers:
         mpi: [mvapich2, mpich, openmpi]
These preferences override Spack's default and effectively reorder priorities
when looking for the best compiler, target or virtual package provider. Each
preference takes an ordered list of spec constraints, with earlier entries in
the list being preferred over later entries.
In the example above all packages prefer to be compiled with ``gcc@12.2.0``,
to target the ``x86_64_v3`` microarchitecture and to use ``mvapich2`` if they
depend on ``mpi``.
The ``variants`` and ``version`` preferences can be set under
package specific sections of the ``packages.yaml`` file:
.. code-block:: yaml
   packages:
     opencv:
       variants: +debug
     gperftools:
       version: [2.2, 2.4, 2.3]
In this case, the preference for ``opencv`` is to build with debug options, while
``gperftools`` prefers version 2.2 over 2.4.
Any preference can be overwritten on the command line if explicitly requested.
Preferences cannot overcome explicit constraints, as they only set a preferred
ordering among homogeneous attribute values. Going back to the example, if
``gperftools@2.3:`` was requested, then Spack will install version 2.4
since the most preferred version 2.2 is prohibited by the version constraint.
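Continuing the example above, the effect can be checked with ``spack spec`` (illustrative):

.. code-block:: console

   $ spack spec gperftools         # concretizes to the preferred version 2.2
   $ spack spec gperftools@2.3:    # concretizes to 2.4, since 2.2 is excluded by the range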
.. _package_permissions:
-------------------
Package Permissions
-------------------
Spack can be configured to assign permissions to the files installed
by a package.
In the ``packages.yaml`` file under ``permissions``, the attributes
``read``, ``write``, and ``group`` control the package
permissions. These attributes can be set per-package, or for all
packages under ``all``. If permissions are set under ``all`` and for a
specific package, the package-specific settings take precedence.
The ``read`` and ``write`` attributes take one of ``user``, ``group``,
or ``world``.
.. code-block:: yaml
   packages:
     all:
       permissions:
         write: group
         group: spack
     my_app:
       permissions:
         read: group
         group: my_team
The permissions settings describe the broadest level of access to
installations of the specified packages. The execute permissions of
the file are set to the same level as read permissions for those files
that are executable. The default setting for ``read`` is ``world``,
and for ``write`` is ``user``. In the example above, installations of
``my_app`` will be installed with user and group permissions but no
world permissions, and owned by the group ``my_team``. All other
packages will be installed with user and group write privileges, and
world read privileges. Those packages will be owned by the group
``spack``.
The ``group`` attribute assigns a Unix-style group to a package. All
files installed by the package will be owned by the assigned group,
and the sticky group bit will be set on the install prefix and all
directories inside the install prefix. This will ensure that even
manually placed files within the install prefix are owned by the
assigned group. If no group is assigned, Spack leaves the OS default
group behavior unchanged.
----------------------------
Assigning Package Attributes
----------------------------
You can assign class-level attributes in the configuration:
.. code-block:: yaml
   packages:
     mpileaks:
       # Override existing attributes
       url: http://www.somewhereelse.com/mpileaks-1.0.tar.gz
       # ... or add new ones
       x: 1
Attributes set this way will be accessible to any method executed
in the package.py file (e.g. the ``install()`` method). Values for these
attributes may be any value parsable by YAML.
These can only be applied to specific packages, not "all" or
virtual packages.
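A minimal sketch of how such an attribute might then be read inside ``package.py`` (the body of ``install`` below is hypothetical):

.. code-block:: python

   def install(self, spec, prefix):
       # 'x' was injected through packages.yaml above and behaves like
       # any other class-level attribute of the package.
       if self.x == 1:
           mkdirp(prefix.bin)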

View File

@@ -82,7 +82,7 @@ class already contains:
.. code-block:: python
depends_on('cmake', type='build')
depends_on("cmake", type="build")
If you need to specify a particular version requirement, you can
@@ -90,7 +90,7 @@ override this in your package:
.. code-block:: python
depends_on('cmake@2.8.12:', type='build')
depends_on("cmake@2.8.12:", type="build")
^^^^^^^^^^^^^^^^^^^
@@ -137,10 +137,10 @@ and without the :meth:`~spack.build_systems.cmake.CMakeBuilder.define` and
def cmake_args(self):
args = [
'-DWHATEVER:STRING=somevalue',
self.define('ENABLE_BROKEN_FEATURE', False),
self.define_from_variant('DETECT_HDF5', 'hdf5'),
self.define_from_variant('THREADS'), # True if +threads
"-DWHATEVER:STRING=somevalue",
self.define("ENABLE_BROKEN_FEATURE", False),
self.define_from_variant("DETECT_HDF5", "hdf5"),
self.define_from_variant("THREADS"), # True if +threads
]
return args
@@ -151,10 +151,10 @@ and CMake simply ignores the empty command line argument. For example the follow
.. code-block:: python
variant('example', default=True, when='@2.0:')
variant("example", default=True, when="@2.0:")
def cmake_args(self):
return [self.define_from_variant('EXAMPLE', 'example')]
return [self.define_from_variant("EXAMPLE", "example")]
will generate ``'cmake' '-DEXAMPLE=ON' ...`` when `@2.0: +example` is met, but will
result in ``'cmake' '' ...`` when the spec version is below ``2.0``.
@@ -193,9 +193,9 @@ a variant to control this:
.. code-block:: python
variant('build_type', default='RelWithDebInfo',
description='CMake build type',
values=('Debug', 'Release', 'RelWithDebInfo', 'MinSizeRel'))
variant("build_type", default="RelWithDebInfo",
description="CMake build type",
values=("Debug", "Release", "RelWithDebInfo", "MinSizeRel"))
However, not every CMake package accepts all four of these options.
Grep the ``CMakeLists.txt`` file to see if the default values are
@@ -205,9 +205,9 @@ package overrides the default variant with:
.. code-block:: python
variant('build_type', default='DebugRelease',
description='The build type to build',
values=('Debug', 'Release', 'DebugRelease'))
variant("build_type", default="DebugRelease",
description="The build type to build",
values=("Debug", "Release", "DebugRelease"))
For more information on ``CMAKE_BUILD_TYPE``, see:
https://cmake.org/cmake/help/latest/variable/CMAKE_BUILD_TYPE.html
@@ -250,7 +250,7 @@ generator is Ninja. To switch to the Ninja generator, simply add:
.. code-block:: python
generator = 'Ninja'
generator = "Ninja"
``CMakePackage`` defaults to "Unix Makefiles". If you switch to the
@@ -258,7 +258,7 @@ Ninja generator, make sure to add:
.. code-block:: python
depends_on('ninja', type='build')
depends_on("ninja", type="build")
to the package as well. Aside from that, you shouldn't need to do
anything else. Spack will automatically detect that you are using
@@ -288,7 +288,7 @@ like so:
.. code-block:: python
root_cmakelists_dir = 'src'
root_cmakelists_dir = "src"
Note that this path is relative to the root of the extracted tarball,
@@ -304,7 +304,7 @@ different sub-directory, simply override ``build_directory`` like so:
.. code-block:: python
build_directory = 'my-build'
build_directory = "my-build"
^^^^^^^^^^^^^^^^^^^^^^^^^
Build and install targets
@@ -324,8 +324,8 @@ library or build the documentation, you can add these like so:
.. code-block:: python
build_targets = ['all', 'docs']
install_targets = ['install', 'docs']
build_targets = ["all", "docs"]
install_targets = ["install", "docs"]
^^^^^^^
Testing

View File

@@ -53,18 +53,24 @@ Install the oneAPI compilers::
Add the compilers to your ``compilers.yaml`` so spack can use them::
spack compiler add `spack location -i intel-oneapi-compilers`/compiler/latest/linux/bin/intel64
spack compiler add `spack location -i intel-oneapi-compilers`/compiler/latest/linux/bin
spack compiler add `spack location -i intel-oneapi-compilers`/compiler/latest/bin
Verify that the compilers are available::
spack compiler list
Note that 2024 and later releases do not include ``icc``. Before 2024,
the package layout was different::
spack compiler add `spack location -i intel-oneapi-compilers`/compiler/latest/linux/bin/intel64
spack compiler add `spack location -i intel-oneapi-compilers`/compiler/latest/linux/bin
The ``intel-oneapi-compilers`` package includes 2 families of
compilers:
* ``intel``: ``icc``, ``icpc``, ``ifort``. Intel's *classic*
compilers.
compilers. 2024 and later releases contain ``ifort``, but not
``icc`` and ``icpc``.
* ``oneapi``: ``icx``, ``icpx``, ``ifx``. Intel's new generation of
compilers based on LLVM.
@@ -89,8 +95,8 @@ Install the oneAPI compilers::
Add the compilers to your ``compilers.yaml`` so Spack can use them::
spack compiler add `spack location -i intel-oneapi-compilers`/compiler/latest/linux/bin/intel64
spack compiler add `spack location -i intel-oneapi-compilers`/compiler/latest/linux/bin
spack compiler add `spack location -i intel-oneapi-compilers`/compiler/latest/bin
Verify that the compilers are available::
@@ -146,8 +152,7 @@ Compilers
To use the compilers, add some information about the installation to
``compilers.yaml``. For most users, it is sufficient to do::
spack compiler add /opt/intel/oneapi/compiler/latest/linux/bin/intel64
spack compiler add /opt/intel/oneapi/compiler/latest/linux/bin
spack compiler add /opt/intel/oneapi/compiler/latest/bin
Adapt the paths above if you did not install the tools in the default
location. After adding the compilers, using them is the same
@@ -156,6 +161,12 @@ Another option is to manually add the configuration to
``compilers.yaml`` as described in :ref:`Compiler configuration
<compiler-config>`.
Before 2024, the directory structure was different::
spack compiler add /opt/intel/oneapi/compiler/latest/linux/bin/intel64
spack compiler add /opt/intel/oneapi/compiler/latest/linux/bin
Libraries
---------

View File

@@ -90,7 +90,7 @@ and optimizers do require a paid license. In Spack, they are packaged as:
TODO: Confirm and possible change(!) the scope of MPI components (runtime
vs. devel) in current (and previous?) *cluster/professional/composer*
editions, i.e., presence in downloads, possibly subject to license
coverage(!); see `disussion in PR #4300
coverage(!); see `discussion in PR #4300
<https://github.com/spack/spack/pull/4300#issuecomment-305582898>`_. [NB:
An "mpi" subdirectory is not indicative of the full MPI SDK being present
(i.e., ``mpicc``, ..., and header files). The directory may just as well
@@ -392,7 +392,7 @@ See section
:ref:`Configuration Scopes <configuration-scopes>`
for an explanation about the different files
and section
:ref:`Build customization <build-settings>`
:ref:`Build customization <packages-config>`
for specifics and examples for ``packages.yaml`` files.
.. If your system administrator did not provide modules for pre-installed Intel
@@ -934,9 +934,9 @@ a *virtual* ``mkl`` package is declared in Spack.
.. code-block:: python
# Examples for absolute and conditional dependencies:
depends_on('mkl')
depends_on('mkl', when='+mkl')
depends_on('mkl', when='fftw=mkl')
depends_on("mkl")
depends_on("mkl", when="+mkl")
depends_on("mkl", when="fftw=mkl")
The ``MKLROOT`` environment variable (part of the documented API) will be set
during all stages of client package installation, and is available to both
@@ -972,8 +972,8 @@ a *virtual* ``mkl`` package is declared in Spack.
def configure_args(self):
args = []
...
args.append('--with-blas=%s' % self.spec['blas'].libs.ld_flags)
args.append('--with-lapack=%s' % self.spec['lapack'].libs.ld_flags)
args.append("--with-blas=%s" % self.spec["blas"].libs.ld_flags)
args.append("--with-lapack=%s" % self.spec["lapack"].libs.ld_flags)
...
.. tip::
@@ -989,13 +989,13 @@ a *virtual* ``mkl`` package is declared in Spack.
.. code-block:: python
self.spec['blas'].headers.include_flags
self.spec["blas"].headers.include_flags
and to generate linker options (``-L<dir> -llibname ...``), use the same as above,
.. code-block:: python
self.spec['blas'].libs.ld_flags
self.spec["blas"].libs.ld_flags
See
:ref:`MakefilePackage <makefilepackage>`

View File

@@ -88,7 +88,7 @@ override the ``luarocks_args`` method like so:
.. code-block:: python
def luarocks_args(self):
return ['flag1', 'flag2']
return ["flag1", "flag2"]
One common use of this is to override warnings or flags for newer compilers, as in:

View File

@@ -48,8 +48,8 @@ class automatically adds the following dependencies:
.. code-block:: python
depends_on('java', type=('build', 'run'))
depends_on('maven', type='build')
depends_on("java", type=("build", "run"))
depends_on("maven", type="build")
In the ``pom.xml`` file, you may see sections like:
@@ -72,8 +72,8 @@ should add:
.. code-block:: python
depends_on('java@7:', type='build')
depends_on('maven@3.5.4:', type='build')
depends_on("java@7:", type="build")
depends_on("maven@3.5.4:", type="build")
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -88,9 +88,9 @@ the build phase. For example:
def build_args(self):
return [
'-Pdist,native',
'-Dtar',
'-Dmaven.javadoc.skip=true'
"-Pdist,native",
"-Dtar",
"-Dmaven.javadoc.skip=true"
]

View File

@@ -86,8 +86,8 @@ the ``MesonPackage`` base class already contains:
.. code-block:: python
depends_on('meson', type='build')
depends_on('ninja', type='build')
depends_on("meson", type="build")
depends_on("ninja", type="build")
If you need to specify a particular version requirement, you can
@@ -95,8 +95,8 @@ override this in your package:
.. code-block:: python
depends_on('meson@0.43.0:', type='build')
depends_on('ninja', type='build')
depends_on("meson@0.43.0:", type="build")
depends_on("ninja", type="build")
^^^^^^^^^^^^^^^^^^^
@@ -121,7 +121,7 @@ override the ``meson_args`` method like so:
.. code-block:: python
def meson_args(self):
return ['--warnlevel=3']
return ["--warnlevel=3"]
This method can be used to pass flags as well as variables.

View File

@@ -118,7 +118,7 @@ so ``PerlPackage`` contains:
.. code-block:: python
extends('perl')
extends("perl")
If your package requires a specific version of Perl, you should
@@ -132,14 +132,14 @@ properly. If your package uses ``Makefile.PL`` to build, add:
.. code-block:: python
depends_on('perl-extutils-makemaker', type='build')
depends_on("perl-extutils-makemaker", type="build")
If your package uses ``Build.PL`` to build, add:
.. code-block:: python
depends_on('perl-module-build', type='build')
depends_on("perl-module-build", type="build")
^^^^^^^^^^^^^^^^^
@@ -165,11 +165,11 @@ arguments to ``Makefile.PL`` or ``Build.PL`` by overriding
.. code-block:: python
def configure_args(self):
expat = self.spec['expat'].prefix
expat = self.spec["expat"].prefix
return [
'EXPATLIBPATH={0}'.format(expat.lib),
'EXPATINCPATH={0}'.format(expat.include),
"EXPATLIBPATH={0}".format(expat.lib),
"EXPATINCPATH={0}".format(expat.include),
]

View File

@@ -83,7 +83,7 @@ base class already contains:
.. code-block:: python
depends_on('qt', type='build')
depends_on("qt", type="build")
If you want to specify a particular version requirement, or need to
@@ -91,7 +91,7 @@ link to the ``qt`` libraries, you can override this in your package:
.. code-block:: python
depends_on('qt@5.6.0:')
depends_on("qt@5.6.0:")
^^^^^^^^^^^^^^^^^^^^^^^^^^
Passing arguments to qmake
@@ -103,7 +103,7 @@ override the ``qmake_args`` method like so:
.. code-block:: python
def qmake_args(self):
return ['-recursive']
return ["-recursive"]
This method can be used to pass flags as well as variables.
@@ -118,7 +118,7 @@ sub-directory by adding the following to the package:
.. code-block:: python
build_directory = 'src'
build_directory = "src"
^^^^^^^^^^^^^^^^^^^^^^

View File

@@ -163,28 +163,28 @@ attributes that can be used to set ``homepage``, ``url``, ``list_url``, and
.. code-block:: python
cran = 'caret'
cran = "caret"
is equivalent to:
.. code-block:: python
homepage = 'https://cloud.r-project.org/package=caret'
url = 'https://cloud.r-project.org/src/contrib/caret_6.0-86.tar.gz'
list_url = 'https://cloud.r-project.org/src/contrib/Archive/caret'
homepage = "https://cloud.r-project.org/package=caret"
url = "https://cloud.r-project.org/src/contrib/caret_6.0-86.tar.gz"
list_url = "https://cloud.r-project.org/src/contrib/Archive/caret"
Likewise, the following ``bioc`` attribute:
.. code-block:: python
bioc = 'BiocVersion'
bioc = "BiocVersion"
is equivalent to:
.. code-block:: python
homepage = 'https://bioconductor.org/packages/BiocVersion/'
git = 'https://git.bioconductor.org/packages/BiocVersion'
homepage = "https://bioconductor.org/packages/BiocVersion/"
git = "https://git.bioconductor.org/packages/BiocVersion"
^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -200,7 +200,7 @@ base class contains:
.. code-block:: python
extends('r')
extends("r")
Take a close look at the homepage for ``caret``. If you look at the
@@ -209,7 +209,7 @@ You should add this to your package like so:
.. code-block:: python
depends_on('r@3.2.0:', type=('build', 'run'))
depends_on("r@3.2.0:", type=("build", "run"))
^^^^^^^^^^^^^^
@@ -227,7 +227,7 @@ and list all of their dependencies in the following sections:
* LinkingTo
As far as Spack is concerned, all 3 of these dependency types
correspond to ``type=('build', 'run')``, so you don't have to worry
correspond to ``type=("build", "run")``, so you don't have to worry
about the details. If you are curious what they mean,
https://github.com/spack/spack/issues/2951 has a pretty good summary:
@@ -330,7 +330,7 @@ the dependency:
.. code-block:: python
depends_on('r-lattice@0.20:', type=('build', 'run'))
depends_on("r-lattice@0.20:", type=("build", "run"))
^^^^^^^^^^^^^^^^^^
@@ -361,20 +361,20 @@ like so:
.. code-block:: python
def configure_args(self):
mpi_name = self.spec['mpi'].name
mpi_name = self.spec["mpi"].name
# The type of MPI. Supported values are:
# OPENMPI, LAM, MPICH, MPICH2, or CRAY
if mpi_name == 'openmpi':
Rmpi_type = 'OPENMPI'
elif mpi_name == 'mpich':
Rmpi_type = 'MPICH2'
if mpi_name == "openmpi":
Rmpi_type = "OPENMPI"
elif mpi_name == "mpich":
Rmpi_type = "MPICH2"
else:
raise InstallError('Unsupported MPI type')
raise InstallError("Unsupported MPI type")
return [
'--with-Rmpi-type={0}'.format(Rmpi_type),
'--with-mpi={0}'.format(spec['mpi'].prefix),
"--with-Rmpi-type={0}".format(Rmpi_type),
"--with-mpi={0}".format(spec["mpi"].prefix),
]

View File

@@ -84,8 +84,8 @@ The ``*.gemspec`` file may contain something like:
.. code-block:: ruby
summary = 'An implementation of the AsciiDoc text processor and publishing toolchain'
description = 'A fast, open source text processor and publishing toolchain for converting AsciiDoc content to HTML 5, DocBook 5, and other formats.'
summary = "An implementation of the AsciiDoc text processor and publishing toolchain"
description = "A fast, open source text processor and publishing toolchain for converting AsciiDoc content to HTML 5, DocBook 5, and other formats."
Either of these can be used for the description of the Spack package.
@@ -98,7 +98,7 @@ The ``*.gemspec`` file may contain something like:
.. code-block:: ruby
homepage = 'https://asciidoctor.org'
homepage = "https://asciidoctor.org"
This should be used as the official homepage of the Spack package.
@@ -112,21 +112,21 @@ the base class contains:
.. code-block:: python
extends('ruby')
extends("ruby")
The ``*.gemspec`` file may contain something like:
.. code-block:: ruby
required_ruby_version = '>= 2.3.0'
required_ruby_version = ">= 2.3.0"
This can be added to the Spack package using:
.. code-block:: python
depends_on('ruby@2.3.0:', type=('build', 'run'))
depends_on("ruby@2.3.0:", type=("build", "run"))
^^^^^^^^^^^^^^^^^

View File

@@ -124,7 +124,7 @@ are wrong, you can provide the names yourself by overriding
.. code-block:: python
import_modules = ['PyQt5']
import_modules = ["PyQt5"]
These tests often catch missing dependencies and non-RPATHed

View File

@@ -63,8 +63,8 @@ run package-specific unit tests.
.. code-block:: python
def installtest(self):
with working_dir('test'):
pytest = which('py.test')
with working_dir("test"):
pytest = which("py.test")
pytest()
@@ -93,7 +93,7 @@ the following dependency automatically:
.. code-block:: python
depends_on('python@2.5:', type='build')
depends_on("python@2.5:", type="build")
Waf only supports Python 2.5 and up.
@@ -113,7 +113,7 @@ phase, you can use:
args = []
if self.run_tests:
args.append('--test')
args.append("--test")
return args

View File

@@ -17,7 +17,7 @@ case you want to skip directly to specific docs:
* :ref:`config.yaml <config-yaml>`
* :ref:`mirrors.yaml <mirrors>`
* :ref:`modules.yaml <modules>`
* :ref:`packages.yaml <build-settings>`
* :ref:`packages.yaml <packages-config>`
* :ref:`repos.yaml <repositories>`
You can also add any of these as inline configuration in the YAML
@@ -243,9 +243,11 @@ lower-precedence settings. Completely ignoring higher-level configuration
options is supported with the ``::`` notation for keys (see
:ref:`config-overrides` below).
There are also special notations for string concatenation and precendense override.
Using the ``+:`` notation can be used to force *prepending* strings or lists. For lists, this is identical
to the default behavior. Using the ``-:`` works similarly, but for *appending* values.
There are also special notations for string concatenation and precedence override:
* ``+:`` will force *prepending* strings or lists. For lists, this is the default behavior.
* ``-:`` works similarly, but for *appending* values.
:ref:`config-prepend-append`
^^^^^^^^^^^

View File

@@ -9,24 +9,96 @@
Container Images
================
Spack :ref:`environments` are a great tool to create container images, but
preparing one that is suitable for production requires some more boilerplate
than just:
Spack :ref:`environments` can easily be turned into container images. This page
outlines two ways in which this can be done:
1. By installing the environment on the host system, and copying the installations
into the container image. This approach does not require any tools like Docker
or Singularity to be installed.
2. By generating a Docker or Singularity recipe that can be used to build the
container image. In this approach, Spack builds the software inside the
container runtime, not on the host system.
The first approach is easiest if you already have an installed environment,
while the second gives more control over the container image.
---------------------------
From existing installations
---------------------------
If you already have a Spack environment installed on your system, you can
share the binaries as an OCI-compatible container image. To get started, you
just have to configure an OCI registry and run ``spack buildcache push``.
.. code-block:: console
# Create and install an environment in the current directory
spack env create -d .
spack -e . add pkg-a pkg-b
spack -e . install
# Configure the registry
spack -e . mirror add --oci-username ... --oci-password ... container-registry oci://example.com/name/image
# Push the image
spack -e . buildcache push --update-index --base-image ubuntu:22.04 --tag my_env container-registry
The resulting container image can then be run as follows:
.. code-block:: console
$ docker run -it example.com/name/image:my_env
The image generated by Spack consists of the specified base image with each package from the
environment as a separate layer on top. The image is minimal by construction: it only contains
the environment roots and their runtime dependencies.
.. note::
When using registries like GHCR and Docker Hub, the ``--oci-password`` flag is not
the password for your account, but a personal access token you need to generate separately.
The specified ``--base-image`` should have a libc that is compatible with the host system.
For example if your host system is Ubuntu 20.04, you can use ``ubuntu:20.04``, ``ubuntu:22.04``
or newer: the libc in the container image must be at least the version of the host system,
assuming ABI compatibility. It is also perfectly fine to use a completely different
Linux distribution as long as the libc is compatible.
For convenience, Spack also turns the OCI registry into a :ref:`build cache <binary_caches_oci>`,
so that future ``spack install`` of the environment will simply pull the binaries from the
registry instead of doing source builds. The flag ``--update-index`` is needed to make Spack
take the build cache into account when concretizing.
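For example, a later run of the same environment (on another machine, or after clearing the local store) pulls the prebuilt binaries instead of compiling from source:

.. code-block:: console

   $ spack -e . concretize -f   # picks up the build cache via the registered mirror
   $ spack -e . install         # installs from the registry's build cache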
.. note::
When generating container images in CI, the approach above is recommended when CI jobs
already run in a sandboxed environment. You can simply use ``spack`` directly
in the CI job and push the resulting image to a registry. Subsequent CI jobs should
run faster because Spack can install from the same registry instead of rebuilding from
sources.
---------------------------------------------
Generating recipes for Docker and Singularity
---------------------------------------------
Apart from copying existing installations into container images, Spack can also
generate recipes for container images. This is useful if you want to run Spack
itself in a sandboxed environment instead of on the host system.
Since recipes need a little bit more boilerplate than
.. code-block:: docker
COPY spack.yaml /environment
RUN spack -e /environment install
Additional actions may be needed to minimize the size of the
container, or to update the system software that is installed in the base
image, or to set up a proper entrypoint to run the image. These tasks are
usually both necessary and repetitive, so Spack comes with a command
to generate recipes for container images starting from a ``spack.yaml``.
Spack provides a command to generate customizable recipes for container images. Customizations
include minimizing the size of the image, installing packages in the base image using the system
package manager, and setting up a proper entrypoint to run the image.
--------------------
~~~~~~~~~~~~~~~~~~~~
A Quick Introduction
--------------------
~~~~~~~~~~~~~~~~~~~~
Consider having a Spack environment like the following:
@@ -37,8 +109,8 @@ Consider having a Spack environment like the following:
- gromacs+mpi
- mpich
Producing a ``Dockerfile`` from it is as simple as moving to the directory
where the ``spack.yaml`` file is stored and giving the following command:
Producing a ``Dockerfile`` from it is as simple as changing directories to
where the ``spack.yaml`` file is stored and running the following command:
.. code-block:: console
@@ -104,9 +176,9 @@ configuration are discussed in details in the sections below.
.. _container_spack_images:
--------------------------
~~~~~~~~~~~~~~~~~~~~~~~~~~
Spack Images on Docker Hub
--------------------------
~~~~~~~~~~~~~~~~~~~~~~~~~~
Docker images with Spack preinstalled and ready to be used are
built when a release is tagged, or nightly on ``develop``. The images
@@ -176,9 +248,9 @@ by Spack use them as default base images for their ``build`` stage,
although mechanisms to use custom base images provided by users are
available to accommodate complex use cases.
---------------------------------
Creating Images From Environments
---------------------------------
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Configuring the Container Recipe
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Any Spack Environment can be used for the automatic generation of container
recipes. Sensible defaults are provided for things like the base image or the
@@ -219,18 +291,18 @@ under the ``container`` attribute of environments:
A detailed description of the options available can be found in the :ref:`container_config_options` section.
-------------------
~~~~~~~~~~~~~~~~~~~
Setting Base Images
-------------------
~~~~~~~~~~~~~~~~~~~
The ``images`` subsection is used to select both the image where
Spack builds the software and the image where the built software
is installed. This attribute can be set in different ways, and
which one to use depends on the use case at hand.
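In the simplest case this boils down to choosing an operating system and a Spack version under the
``container`` attribute of the environment (the values below are illustrative):

.. code-block:: yaml

   container:
     images:
       os: "ubuntu:22.04"
       spack: develop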
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
""""""""""""""""""""""""""""""""""""""""
Use Official Spack Images From Dockerhub
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
""""""""""""""""""""""""""""""""""""""""
To generate a recipe that uses an official Docker image from the
Spack organization to build the software and the corresponding official OS image
@@ -435,9 +507,9 @@ responsibility to ensure that:
Therefore we don't recommend its use in cases that can be otherwise
covered by the simplified mode shown first.
----------------------------
~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Singularity Definition Files
----------------------------
~~~~~~~~~~~~~~~~~~~~~~~~~~~~
In addition to producing recipes in ``Dockerfile`` format, Spack can produce
Singularity Definition Files by simply changing the value of the ``format``
@@ -458,9 +530,9 @@ attribute:
The minimum version of Singularity required to build a SIF (Singularity Image Format)
image from the recipes generated by Spack is ``3.5.3``.
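Switching the output format is a one-line change under the same ``container`` attribute, e.g.:

.. code-block:: yaml

   container:
     format: singularity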
------------------------------
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Extending the Jinja2 Templates
------------------------------
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
The Dockerfile and the Singularity definition file that Spack can generate are based on
a few Jinja2 templates that are rendered according to the environment being containerized.
@@ -581,9 +653,9 @@ The recipe that gets generated contains the two extra instruction that we added
.. _container_config_options:
-----------------------
~~~~~~~~~~~~~~~~~~~~~~~
Configuration Reference
-----------------------
~~~~~~~~~~~~~~~~~~~~~~~
The tables below describe all the configuration options that are currently supported
to customize the generation of container recipes:
@@ -680,13 +752,13 @@ to customize the generation of container recipes:
- Description string
- No
--------------
~~~~~~~~~~~~~~
Best Practices
--------------
~~~~~~~~~~~~~~
^^^
"""
MPI
^^^
"""
Because OpenMPI, Spack's default MPI implementation, depends on Fortran,
consider adding ``gfortran`` to the ``apt-get install`` list.
@@ -697,9 +769,9 @@ For execution on HPC clusters, it can be helpful to import the docker
image into Singularity in order to start a program with an *external*
MPI. Otherwise, also add ``openssh-server`` to the ``apt-get install`` list.
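One way to do this is through the ``os_packages`` section of the container configuration. The sketch
below assumes a Debian/Ubuntu base image, so the exact package names may differ on other systems:

.. code-block:: yaml

   container:
     os_packages:
       final:
       - gfortran
       - openssh-server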
^^^^
""""
CUDA
^^^^
""""
Starting from CUDA 9.0, Nvidia provides minimal CUDA images based on
Ubuntu. Please see `their instructions <https://hub.docker.com/r/nvidia/cuda/>`_.
Avoid double-installing CUDA by adding, e.g.
@@ -718,9 +790,9 @@ to your ``spack.yaml``.
Users will either need ``nvidia-docker`` or e.g. Singularity to *execute*
device kernels.
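A minimal sketch of such a ``spack.yaml`` entry, assuming the base image ships the CUDA runtime under
``/usr/local/cuda`` (both the version and the prefix are illustrative):

.. code-block:: yaml

   spack:
     packages:
       cuda:
         buildable: false
         externals:
         - spec: cuda@11.0.2
           prefix: /usr/local/cuda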
^^^^^^^^^^^^^^^^^^^^^^^^^
"""""""""""""""""""""""""
Docker on Windows and OSX
^^^^^^^^^^^^^^^^^^^^^^^^^
"""""""""""""""""""""""""
On Mac OS and Windows, Docker runs on a hypervisor that is not allocated much
memory by default, and some Spack packages may fail to build due to lack of


@@ -9,46 +9,42 @@
Custom Extensions
=================
*Spack extensions* permit you to extend Spack capabilities by deploying your
*Spack extensions* allow you to extend Spack capabilities by deploying your
own custom commands or logic in an arbitrary location on your filesystem.
This might be extremely useful, e.g., to develop and maintain a command whose purpose is
too specific to be considered for reintegration into the mainline, or to
evolve a command through its early stages before starting a discussion to merge
it upstream.
From Spack's point of view an extension is any path in your filesystem which
respects a prescribed naming and layout for files:
respects the following naming and layout for files:
.. code-block:: console
spack-scripting/ # The top level directory must match the format 'spack-{extension_name}'
├── pytest.ini # Optional file if the extension ships its own tests
├── scripting # Folder that may contain modules that are needed for the extension commands
│   ├── cmd # Folder containing extension commands
│   │   └── filter.py # A new command that will be available
│   └── functions.py # Module with internal details
├── tests # Tests for this extension
│   ├── conftest.py
│   └── test_filter.py
└── templates # Templates that may be needed by the extension
In the example above the extension named *scripting* adds an additional command (``filter``)
and unit tests to verify its behavior. The code for this example can be
obtained by cloning the corresponding git repository:
In the example above, the extension is named *scripting*. It adds an additional command
(``spack filter``) and unit tests to verify its behavior.
.. TODO: write an ad-hoc "hello world" extension and make it part of the spack organization
The extension can import any core Spack module in its implementation. When loaded by
the ``spack`` command, the extension itself is imported as a Python package in the
``spack.extensions`` namespace. In the example above, since the extension is named
"scripting", the corresponding Python module is ``spack.extensions.scripting``.
The code for this example extension can be obtained by cloning the corresponding git repository:
.. code-block:: console
$ cd ~/
$ mkdir tmp && cd tmp
$ git clone https://github.com/alalazo/spack-scripting.git
Cloning into 'spack-scripting'...
remote: Counting objects: 11, done.
remote: Compressing objects: 100% (7/7), done.
remote: Total 11 (delta 0), reused 11 (delta 0), pack-reused 0
Receiving objects: 100% (11/11), done.
As you can see by inspecting the sources, Python modules that are part of the extension
can import any core Spack module.
$ git -C /tmp clone https://github.com/spack/spack-scripting.git
---------------------------------
Configure Spack to Use Extensions
@@ -61,7 +57,7 @@ paths to ``config.yaml``. In the case of our example this means ensuring that:
config:
extensions:
- ~/tmp/spack-scripting
- /tmp/spack-scripting
is part of your configuration file. Once this is set up, any command that the extension provides
will be available from the command line:
@@ -86,37 +82,32 @@ will be available from the command line:
--implicit select specs that are not installed or were installed implicitly
--output OUTPUT where to dump the result
The corresponding unit tests can be run giving the appropriate options
to ``spack unit-test``:
The corresponding unit tests can be run giving the appropriate options to ``spack unit-test``:
.. code-block:: console
$ spack unit-test --extension=scripting
============================================================== test session starts ===============================================================
platform linux2 -- Python 2.7.15rc1, pytest-3.2.5, py-1.4.34, pluggy-0.4.0
rootdir: /home/mculpo/tmp/spack-scripting, inifile: pytest.ini
========================================== test session starts ===========================================
platform linux -- Python 3.11.5, pytest-7.4.3, pluggy-1.3.0
rootdir: /home/culpo/github/spack-scripting
configfile: pytest.ini
testpaths: tests
plugins: xdist-3.5.0
collected 5 items
tests/test_filter.py ...XX
============================================================ short test summary info =============================================================
XPASS tests/test_filter.py::test_filtering_specs[flags3-specs3-expected3]
XPASS tests/test_filter.py::test_filtering_specs[flags4-specs4-expected4]
tests/test_filter.py ..... [100%]
=========================================================== slowest 20 test durations ============================================================
3.74s setup tests/test_filter.py::test_filtering_specs[flags0-specs0-expected0]
0.17s call tests/test_filter.py::test_filtering_specs[flags3-specs3-expected3]
0.16s call tests/test_filter.py::test_filtering_specs[flags2-specs2-expected2]
0.15s call tests/test_filter.py::test_filtering_specs[flags1-specs1-expected1]
0.13s call tests/test_filter.py::test_filtering_specs[flags4-specs4-expected4]
0.08s call tests/test_filter.py::test_filtering_specs[flags0-specs0-expected0]
0.04s teardown tests/test_filter.py::test_filtering_specs[flags4-specs4-expected4]
0.00s setup tests/test_filter.py::test_filtering_specs[flags4-specs4-expected4]
0.00s setup tests/test_filter.py::test_filtering_specs[flags3-specs3-expected3]
0.00s setup tests/test_filter.py::test_filtering_specs[flags1-specs1-expected1]
0.00s setup tests/test_filter.py::test_filtering_specs[flags2-specs2-expected2]
0.00s teardown tests/test_filter.py::test_filtering_specs[flags2-specs2-expected2]
0.00s teardown tests/test_filter.py::test_filtering_specs[flags1-specs1-expected1]
0.00s teardown tests/test_filter.py::test_filtering_specs[flags0-specs0-expected0]
0.00s teardown tests/test_filter.py::test_filtering_specs[flags3-specs3-expected3]
====================================================== 3 passed, 2 xpassed in 4.51 seconds =======================================================
========================================== slowest 30 durations ==========================================
2.31s setup tests/test_filter.py::test_filtering_specs[kwargs0-specs0-expected0]
0.57s call tests/test_filter.py::test_filtering_specs[kwargs2-specs2-expected2]
0.56s call tests/test_filter.py::test_filtering_specs[kwargs4-specs4-expected4]
0.54s call tests/test_filter.py::test_filtering_specs[kwargs3-specs3-expected3]
0.54s call tests/test_filter.py::test_filtering_specs[kwargs1-specs1-expected1]
0.48s call tests/test_filter.py::test_filtering_specs[kwargs0-specs0-expected0]
0.01s setup tests/test_filter.py::test_filtering_specs[kwargs4-specs4-expected4]
0.01s setup tests/test_filter.py::test_filtering_specs[kwargs2-specs2-expected2]
0.01s setup tests/test_filter.py::test_filtering_specs[kwargs1-specs1-expected1]
0.01s setup tests/test_filter.py::test_filtering_specs[kwargs3-specs3-expected3]
(5 durations < 0.005s hidden. Use -vv to show these durations.)
=========================================== 5 passed in 5.06s ============================================


@@ -0,0 +1,77 @@
.. Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)
==========================
Frequently Asked Questions
==========================
This page contains answers to frequently asked questions about Spack.
If you have questions that are not answered here, feel free to ask on
`Slack <https://slack.spack.io>`_ or `GitHub Discussions
<https://github.com/spack/spack/discussions>`_. If you've learned the
answer to a question that you think should be here, please consider
contributing to this page.
.. _faq-concretizer-precedence:
-----------------------------------------------------
Why does Spack pick particular versions and variants?
-----------------------------------------------------
This question comes up in a variety of forms:
1. Why does Spack seem to ignore my package preferences from ``packages.yaml`` config?
2. Why does Spack toggle a variant instead of using the default from the ``package.py`` file?
The short answer is that Spack always picks an optimal configuration
based on a complex set of criteria\ [#f1]_. These criteria are more nuanced
than always choosing the latest versions or default variants.
.. note::
As a rule of thumb: requirements + constraints > reuse > preferences > defaults.
The following set of criteria (from lowest to highest precedence) explain
common cases where concretization output may seem surprising at first.
1. :ref:`Package preferences <package-preferences>` configured in ``packages.yaml``
override variant defaults from ``package.py`` files, and influence the optimal
ordering of versions. Preferences are specified as follows:
.. code-block:: yaml
packages:
foo:
version: [1.0, 1.1]
variants: ~mpi
2. :ref:`Reuse concretization <concretizer-options>` configured in ``concretizer.yaml``
overrides preferences, since it's typically faster to reuse an existing spec than to
build a preferred one from sources. When build caches are enabled, specs may be reused
from a remote location too. Reuse concretization is configured as follows:
.. code-block:: yaml
concretizer:
reuse: dependencies # other options are 'true' and 'false'
3. :ref:`Package requirements <package-requirements>` configured in ``packages.yaml``,
and constraints from the command line as well as ``package.py`` files override all
of the above. Requirements are specified as follows:
.. code-block:: yaml
packages:
foo:
require:
- "@1.2: +mpi"
Requirements and constraints restrict the set of possible solutions, while reuse
behavior and preferences influence what an optimal solution looks like.
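To see which optimization criteria were applied to a particular concretization, you can compare the
output of the following commands (``zlib`` is just an example package):

.. code-block:: console

   $ spack spec zlib    # show the concretized spec
   $ spack solve zlib   # also show the optimization criteria and their priorities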
.. rubric:: Footnotes
.. [#f1] The exact list of criteria can be retrieved with the ``spack solve`` command.


@@ -111,3 +111,28 @@ CUDA is split into fewer components and is simpler to specify:
prefix: /opt/cuda/cuda-11.0.2/
where ``/opt/cuda/cuda-11.0.2/lib/`` contains ``libcudart.so``.
-----------------------------------
Using an External OpenGL API
-----------------------------------
Depending on whether we have a graphics card or not, we may choose to use OSMesa or GLX to implement the OpenGL API.
If a graphics card is unavailable, OSMesa is recommended and can typically be built with Spack.
However, if we prefer to utilize the system GLX tailored to our graphics card, we need to declare it as an external. Here's how to do it:
.. code-block:: yaml
packages:
libglx:
require: [opengl]
opengl:
buildable: false
externals:
- prefix: /usr/
spec: opengl@4.6
Note that the prefix has to be the root of both the libraries and the headers, i.e. ``/usr``, not the path to the ``lib`` directory.
To find out which ``opengl`` spec is available, use ``cd /usr/include/GL && grep -Ri gl_version``.


@@ -55,6 +55,7 @@ or refer to the full manual below.
getting_started
basic_usage
replace_conda_homebrew
frequently_asked_questions
.. toctree::
:maxdepth: 2
@@ -70,7 +71,7 @@ or refer to the full manual below.
configuration
config_yaml
bootstrapping
packages_yaml
build_settings
environments
containers
@@ -78,6 +79,7 @@ or refer to the full manual below.
module_file_support
repositories
binary_caches
bootstrapping
command_index
chain
extensions


@@ -0,0 +1,620 @@
.. Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)
.. _packages-config:
================================
Package Settings (packages.yaml)
================================
Spack allows you to customize how your software is built through the
``packages.yaml`` file. Using it, you can make Spack prefer particular
implementations of virtual dependencies (e.g., MPI or BLAS/LAPACK),
or you can make it prefer to build with particular compilers. You can
also tell Spack to use *external* software installations already
present on your system.
At a high level, the ``packages.yaml`` file is structured like this:
.. code-block:: yaml
packages:
package1:
# settings for package1
package2:
# settings for package2
# ...
all:
# settings that apply to all packages.
So you can either set build preferences specifically for *one* package,
or you can specify that certain settings should apply to *all* packages.
The types of settings you can customize are described in detail below.
Spack's build defaults are in the default
``etc/spack/defaults/packages.yaml`` file. You can override them in
``~/.spack/packages.yaml`` or ``etc/spack/packages.yaml``. For more
details on how this works, see :ref:`configuration-scopes`.
.. _sec-external-packages:
-----------------
External Packages
-----------------
Spack can be configured to use externally-installed
packages rather than building its own packages. This may be desirable
if machines ship with system packages, such as a customized MPI
that should be used instead of Spack building its own MPI.
External packages are configured through the ``packages.yaml`` file.
Here's an example of an external configuration:
.. code-block:: yaml
packages:
openmpi:
externals:
- spec: "openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64"
prefix: /opt/openmpi-1.4.3
- spec: "openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64+debug"
prefix: /opt/openmpi-1.4.3-debug
- spec: "openmpi@1.6.5%intel@10.1 arch=linux-debian7-x86_64"
prefix: /opt/openmpi-1.6.5-intel
This example lists three installations of OpenMPI, one built with GCC,
one built with GCC and debug information, and another built with Intel.
If Spack is asked to build a package that uses one of these MPIs as a
dependency, it will use the pre-installed OpenMPI in
the given directory. Note that the specified path is the top-level
install prefix, not the ``bin`` subdirectory.
``packages.yaml`` can also be used to specify modules to load instead
of the installation prefixes. The following example says that module
``CMake/3.7.2`` provides cmake version 3.7.2.
.. code-block:: yaml
cmake:
externals:
- spec: cmake@3.7.2
modules:
- CMake/3.7.2
Each ``packages.yaml`` begins with a ``packages:`` attribute, followed
by a list of package names. To specify externals, add an ``externals:``
attribute under the package name, which lists externals.
Each external should specify a ``spec:`` string that should be as
well-defined as reasonably possible. If a
package lacks a spec component, such as missing a compiler or
package version, then Spack will guess the missing component based
on its most-favored packages, and it may guess incorrectly.
Each package version and compiler listed in an external should
have entries in Spack's packages and compiler configuration, even
though the package and compiler may not ever be built.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Extra attributes for external packages
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Sometimes external packages require additional attributes to be used
effectively. This information can be defined on a per-package basis
and stored in the ``extra_attributes`` section of the external package
configuration. In addition to per-package information, this section
can be used to define environment modifications to be performed
whenever the package is used. For example, if an external package is
built without ``rpath`` support, it may require ``LD_LIBRARY_PATH``
settings to find its dependencies. This could be configured as
follows:
.. code-block:: yaml
packages:
mpich:
externals:
- spec: mpich@3.3 %clang@12.0.0 +hwloc
prefix: /path/to/mpich
extra_attributes:
environment:
prepend_path:
LD_LIBRARY_PATH: /path/to/hwloc/lib64
See :ref:`configuration_environment_variables` for more information on
how to configure environment modifications in Spack config files.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Prevent packages from being built from sources
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Adding an external spec in ``packages.yaml`` allows Spack to use an external location,
but it does not prevent Spack from building packages from sources. In the above example,
Spack might choose for many valid reasons to start building and linking with the
latest version of OpenMPI rather than continue using the pre-installed OpenMPI versions.
To prevent this, the ``packages.yaml`` configuration also allows packages
to be flagged as non-buildable. The previous example could be modified to
be:
.. code-block:: yaml
packages:
openmpi:
externals:
- spec: "openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64"
prefix: /opt/openmpi-1.4.3
- spec: "openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64+debug"
prefix: /opt/openmpi-1.4.3-debug
- spec: "openmpi@1.6.5%intel@10.1 arch=linux-debian7-x86_64"
prefix: /opt/openmpi-1.6.5-intel
buildable: False
The addition of the ``buildable`` flag tells Spack that it should never build
its own version of OpenMPI from sources, and it will instead always rely on a pre-built
OpenMPI.
.. note::
If ``concretizer:reuse`` is on (see :ref:`concretizer-options` for more information on that flag)
pre-built specs include specs already available from a local store, an upstream store, a registered
buildcache or specs marked as externals in ``packages.yaml``. If ``concretizer:reuse`` is off, only
external specs in ``packages.yaml`` are included in the list of pre-built specs.
If an external module is specified as not buildable, then Spack will load the
external module into the build environment, where it can be used for linking.
The ``buildable`` attribute does not need to be paired with external packages.
It can also be used on its own to forbid packages that may be
buggy or otherwise undesirable.
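For example, the following hypothetical entry forbids building a package without declaring any
external for it:

.. code-block:: yaml

   packages:
     openssl:
       buildable: false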
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Non-buildable virtual packages
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Virtual packages in Spack can also be specified as not buildable, and
external implementations can be provided. In the example above,
OpenMPI is configured as not buildable, but Spack will often prefer
other MPI implementations over the externally available OpenMPI. Spack
can be configured with every MPI provider marked as not buildable individually, but
it is more convenient to mark the ``mpi`` virtual package itself as not buildable:
.. code-block:: yaml
packages:
mpi:
buildable: False
openmpi:
externals:
- spec: "openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64"
prefix: /opt/openmpi-1.4.3
- spec: "openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64+debug"
prefix: /opt/openmpi-1.4.3-debug
- spec: "openmpi@1.6.5%intel@10.1 arch=linux-debian7-x86_64"
prefix: /opt/openmpi-1.6.5-intel
Spack can then use any of the listed external implementations of MPI
to satisfy a dependency, and will choose depending on the compiler and
architecture.
In cases where the concretizer is configured to reuse specs, and other ``mpi`` providers
(available via stores or buildcaches) are not wanted, Spack can be configured to require
specs matching only the available externals:
.. code-block:: yaml
packages:
mpi:
buildable: False
require:
- one_of: [
"openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64",
"openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64+debug",
"openmpi@1.6.5%intel@10.1 arch=linux-debian7-x86_64"
]
openmpi:
externals:
- spec: "openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64"
prefix: /opt/openmpi-1.4.3
- spec: "openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64+debug"
prefix: /opt/openmpi-1.4.3-debug
- spec: "openmpi@1.6.5%intel@10.1 arch=linux-debian7-x86_64"
prefix: /opt/openmpi-1.6.5-intel
This configuration prevents any spec that uses MPI and originates from stores or buildcaches from being reused,
unless it matches the requirements under ``packages:mpi:require``. For more information on requirements see
:ref:`package-requirements`.
.. _cmd-spack-external-find:
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Automatically Find External Packages
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
You can run the :ref:`spack external find <spack-external-find>` command
to search for system-provided packages and add them to ``packages.yaml``.
After running this command your ``packages.yaml`` may include new entries:
.. code-block:: yaml
packages:
cmake:
externals:
- spec: cmake@3.17.2
prefix: /usr
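The search can also be restricted to specific packages (invocation only; the entries that get
detected depend on your system):

.. code-block:: console

   $ spack external find cmake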
This is generally useful for detecting a small set of commonly used packages;
for now it is mostly limited to finding build-only dependencies.
Specific limitations include:
* Packages are not discoverable by default: For a package to be
discoverable with ``spack external find``, it needs to add special
logic. See :ref:`here <make-package-findable>` for more details.
* The logic does not search through module files; it can only detect
packages with executables defined in ``PATH``; you can help Spack locate
externals which use module files by loading any associated modules for
packages that you want Spack to know about before running
``spack external find``.
* Spack does not overwrite existing entries in the package configuration:
If there is an external defined for a spec at any configuration scope,
then Spack will not add a new external entry (``spack config blame packages``
can help locate all external entries).
.. _package-requirements:
--------------------
Package Requirements
--------------------
Spack can be configured to always use certain compilers, package
versions, and variants during concretization through package
requirements.
Package requirements are useful when you find yourself repeatedly
specifying the same constraints on the command line, and wish that
Spack respects these constraints whether you mention them explicitly
or not. Another use case is specifying constraints that should apply
to all root specs in an environment, without having to repeat the
constraint everywhere.
Apart from that, requirements config is more flexible than constraints
on the command line, because it can specify constraints on packages
*when they occur* as a dependency. In contrast, on the command line it
is not possible to specify constraints on dependencies while also keeping
those dependencies optional.
.. seealso::
FAQ: :ref:`Why does Spack pick particular versions and variants? <faq-concretizer-precedence>`
^^^^^^^^^^^^^^^^^^^
Requirements syntax
^^^^^^^^^^^^^^^^^^^
The package requirements configuration is specified in ``packages.yaml``,
keyed by package name and expressed using the Spec syntax. In the simplest
case you can specify attributes that you always want the package to have
by providing a single spec string to ``require``:
.. code-block:: yaml
packages:
libfabric:
require: "@1.13.2"
In the above example, ``libfabric`` will always build with version 1.13.2. If you
need to compose multiple configuration scopes, ``require`` accepts a list of
strings:
.. code-block:: yaml
packages:
libfabric:
require:
- "@1.13.2"
- "%gcc"
In this case ``libfabric`` will always build with version 1.13.2 **and** using GCC
as a compiler.
For more complex use cases, ``require`` also accepts a list of objects. These objects
must have either an ``any_of`` or a ``one_of`` field containing a list of spec strings,
and they can optionally have a ``when`` and a ``message`` attribute:
.. code-block:: yaml
packages:
openmpi:
require:
- any_of: ["@4.1.5", "%gcc"]
message: "in this example only 4.1.5 can build with other compilers"
``any_of`` is a list of specs. At least one of those specs must be satisfied,
and the concretized spec is also allowed to match more than one.
In the above example, that means you could build ``openmpi@4.1.5%gcc``,
``openmpi@4.1.5%clang`` or ``openmpi@3.9%gcc``, but
not ``openmpi@3.9%clang``.
If a custom message is provided, and the requirement is not satisfiable,
Spack will print the custom error message:
.. code-block:: console
$ spack spec openmpi@3.9%clang
==> Error: in this example only 4.1.5 can build with other compilers
We could express a similar requirement using the ``when`` attribute:
.. code-block:: yaml
packages:
openmpi:
require:
- any_of: ["%gcc"]
when: "@:4.1.4"
message: "in this example only 4.1.5 can build with other compilers"
In the example above, if the version turns out to be 4.1.4 or less, we require the compiler to be GCC.
For readability, Spack also allows a ``spec`` key accepting a string when there is only a single
constraint:
.. code-block:: yaml
packages:
openmpi:
require:
- spec: "%gcc"
when: "@:4.1.4"
message: "in this example only 4.1.5 can build with other compilers"
This code snippet and the one before it are semantically equivalent.
Finally, instead of ``any_of`` you can use ``one_of`` which also takes a list of specs. The final
concretized spec must match one and only one of them:
.. code-block:: yaml
packages:
mpich:
require:
- one_of: ["+cuda", "+rocm"]
In the example above, that means you could build ``mpich+cuda`` or ``mpich+rocm`` but not ``mpich+cuda+rocm``.
.. note::
For ``any_of`` and ``one_of``, the order of specs indicates a
preference: items that appear earlier in the list are preferred
(note that these preferences can be ignored in favor of others).
.. note::
When using a conditional requirement, Spack is allowed to actively avoid the triggering
condition (the ``when=...`` spec) if that leads to a concrete spec with better scores in
the optimization criteria. To check the current optimization criteria and their
priorities you can run ``spack solve zlib``.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Setting default requirements
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
You can also set default requirements for all packages under ``all``
like this:
.. code-block:: yaml
packages:
all:
require: '%clang'
which means every spec will be required to use ``clang`` as a compiler.
Requirements on variants for all packages are possible too, but note that they
are only enforced for those packages that define these variants; otherwise they
are disregarded. For example:
.. code-block:: yaml
packages:
all:
require:
- "+shared"
- "+cuda"
will just enforce ``+shared`` on ``zlib``, which has a boolean ``shared`` variant but
no ``cuda`` variant.
Constraints in a single spec literal are always considered as a whole, so in a case like:
.. code-block:: yaml
packages:
all:
require: "+shared +cuda"
the default requirement will be enforced only if a package has both a ``cuda`` and
a ``shared`` variant, and will never be partially enforced.
Finally, ``all`` represents a *default set of requirements* -
if there are specific package requirements, then the default requirements
under ``all`` are disregarded. For example, with a configuration like this:
.. code-block:: yaml
packages:
all:
require:
- 'build_type=Debug'
- '%clang'
cmake:
require:
- 'build_type=Debug'
- '%gcc'
Spack requires ``cmake`` to use ``gcc`` and all other nodes (including ``cmake``
dependencies) to use ``clang``. If enforcing ``build_type=Debug`` is needed also
on ``cmake``, it must be repeated in the specific ``cmake`` requirements.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Setting requirements on virtual specs
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
A requirement on a virtual spec applies whenever that virtual is present in the DAG.
This can be useful for fixing which virtual provider you want to use:
.. code-block:: yaml
packages:
mpi:
require: 'mvapich2 %gcc'
With the configuration above the only allowed ``mpi`` provider is ``mvapich2 %gcc``.
Requirements on the virtual spec and on the specific provider are both applied, if
present. For instance with a configuration like:
.. code-block:: yaml
packages:
mpi:
require: 'mvapich2 %gcc'
mvapich2:
require: '~cuda'
you will use ``mvapich2~cuda %gcc`` as an ``mpi`` provider.
.. _package-preferences:
-------------------
Package Preferences
-------------------
In some cases package requirements can be too strong, and package
preferences are the better option. Package preferences do not impose
constraints on packages for particular versions or variant values;
they only set defaults. The concretizer is free to change
them if it must, due to other constraints, and it also prefers reusing
installed packages over building new ones that are a better match for
preferences.
.. seealso::
FAQ: :ref:`Why does Spack pick particular versions and variants? <faq-concretizer-precedence>`
Most package preferences (``compilers``, ``target`` and ``providers``)
can only be set globally under the ``all`` section of ``packages.yaml``:
.. code-block:: yaml
packages:
all:
compiler: [gcc@12.2.0, clang@12:, oneapi@2023:]
target: [x86_64_v3]
providers:
mpi: [mvapich2, mpich, openmpi]
These preferences override Spack's default and effectively reorder priorities
when looking for the best compiler, target or virtual package provider. Each
preference takes an ordered list of spec constraints, with earlier entries in
the list being preferred over later entries.
In the example above all packages prefer to be compiled with ``gcc@12.2.0``,
to target the ``x86_64_v3`` microarchitecture and to use ``mvapich2`` if they
depend on ``mpi``.
The ``variants`` and ``version`` preferences can be set under
package specific sections of the ``packages.yaml`` file:
.. code-block:: yaml
packages:
opencv:
variants: +debug
gperftools:
version: [2.2, 2.4, 2.3]
In this case, the preference for ``opencv`` is to build with debug options, while
``gperftools`` prefers version 2.2 over 2.4.
Any preference can be overwritten on the command line if explicitly requested.
Preferences cannot overcome explicit constraints, as they only set a preferred
ordering among homogeneous attribute values. Going back to the example, if
``gperftools@2.3:`` was requested, then Spack will install version 2.4
since the most preferred version 2.2 is prohibited by the version constraint.
.. _package_permissions:
-------------------
Package Permissions
-------------------
Spack can be configured to assign permissions to the files installed
by a package.
In the ``packages.yaml`` file under ``permissions``, the attributes
``read``, ``write``, and ``group`` control the package
permissions. These attributes can be set per-package, or for all
packages under ``all``. If permissions are set under ``all`` and for a
specific package, the package-specific settings take precedence.
The ``read`` and ``write`` attributes take one of ``user``, ``group``,
and ``world``.
.. code-block:: yaml
packages:
all:
permissions:
write: group
group: spack
my_app:
permissions:
read: group
group: my_team
The permissions settings describe the broadest level of access to
installations of the specified packages. The execute permissions of
the file are set to the same level as read permissions for those files
that are executable. The default setting for ``read`` is ``world``,
and for ``write`` is ``user``. In the example above, installations of
``my_app`` will be installed with user and group permissions but no
world permissions, and owned by the group ``my_team``. All other
packages will be installed with user and group write privileges, and
world read privileges. Those packages will be owned by the group
``spack``.
The ``group`` attribute assigns a Unix-style group to a package. All
files installed by the package will be owned by the assigned group,
and the sticky group bit will be set on the install prefix and all
directories inside the install prefix. This will ensure that even
manually placed files within the install prefix are owned by the
assigned group. If no group is assigned, Spack falls back to the
operating system's default behavior.
----------------------------
Assigning Package Attributes
----------------------------
You can assign class-level attributes in the configuration:
.. code-block:: yaml
packages:
mpileaks:
# Override existing attributes
url: http://www.somewhereelse.com/mpileaks-1.0.tar.gz
# ... or add new ones
x: 1
Attributes set this way will be accessible to any method executed
in the package.py file (e.g. the ``install()`` method). Values for these
attributes may be any value parseable by YAML.
These can only be applied to specific packages, not "all" or
virtual packages.
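As a rough sketch of how such an attribute might then be read inside ``package.py`` (``x`` is the
hypothetical attribute from the example above, and the access pattern shown is an assumption):

.. code-block:: python

   # hypothetical excerpt from the mpileaks package.py
   from spack.package import *


   class Mpileaks(Package):
       """Example package; attributes assigned in packages.yaml are attached here."""

       def install(self, spec, prefix):
           # 'x' was assigned in packages.yaml and reads like a regular attribute
           print("x =", self.x)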


@@ -2337,7 +2337,7 @@ window while a batch job is running ``spack install`` on the same or
overlapping dependencies without any process trying to re-do the work of
another.
For example, if you are using SLURM, you could launch an installation
For example, if you are using Slurm, you could launch an installation
of ``mpich`` using the following command:
.. code-block:: console
@@ -5284,7 +5284,7 @@ installed example.
example = which(self.prefix.bin.example)
example()
Output showing the identification of each test part after runnig the tests
Output showing the identification of each test part after running the tests
is illustrated below.
.. code-block:: console
@@ -5781,7 +5781,7 @@ with those implemented in the package itself.
* - `Cxx
<https://github.com/spack/spack/blob/develop/var/spack/repos/builtin/packages/cxx>`_
- Compiles and runs several ``hello`` programs
* - `Fortan
* - `Fortran
<https://github.com/spack/spack/blob/develop/var/spack/repos/builtin/packages/fortran>`_
- Compiles and runs ``hello`` programs (``F`` and ``f90``)
* - `Mpi


@@ -1,13 +1,13 @@
sphinx==7.2.6
sphinxcontrib-programoutput==0.17
sphinx_design==0.5.0
sphinx-rtd-theme==1.3.0
sphinx-rtd-theme==2.0.0
python-levenshtein==0.23.0
docutils==0.18.1
pygments==2.16.1
urllib3==2.0.7
docutils==0.20.1
pygments==2.17.2
urllib3==2.1.0
pytest==7.4.3
isort==5.12.0
black==23.10.1
isort==5.13.2
black==23.12.0
flake8==6.1.0
mypy==1.6.1
mypy==1.7.1


@@ -142,7 +142,7 @@ Reputational Key
----------------
The Reputational Key is the public facing key used to sign complete groups of
development and release packages. Only one key pair exsits in this class of
development and release packages. Only one key pair exists in this class of
keys. In contrast to the Intermediate CI Key the Reputational Key *should* be
used to verify package integrity. At the end of develop and release pipeline a
final pipeline job pulls down all signed package metadata built by the pipeline,
@@ -272,7 +272,7 @@ Internal Implementation
The technical implementation of the pipeline signing process includes components
defined in Amazon Web Services, the Kubernetes cluster, at affiliated
institutions, and the GitLab/GitLab Runner deployment. We present the techincal
institutions, and the GitLab/GitLab Runner deployment. We present the technical
implementation in two interdependent sections. The first addresses how secrets
are managed through the lifecycle of a develop or release pipeline. The second
section describes how Gitlab Runner and pipelines are configured and managed to
@@ -295,7 +295,7 @@ infrastructure.
-----------------------
Multiple intermediate CI signing keys exist, one Intermediate CI Key for jobs
run in AWS, and one key for each affiliated institution (e.g. Univerity of
run in AWS, and one key for each affiliated institution (e.g. University of
Oregon). Here we describe how the Intermediate CI Key is managed in AWS:
The Intermediate CI Key (including the Signing Intermediate CI Private Key) is
@@ -305,7 +305,7 @@ contains an ASCII-armored export of just the *public* components of the
Reputational Key. This secret also contains the *public* components of each of
the affiliated institutions' Intermediate CI Key. These are potentially needed
to verify dependent packages which may have been found in the public mirror or
built by a protected job running on an affiliated institution's infrastrcuture
built by a protected job running on an affiliated institution's infrastructure
in an earlier stage of the pipeline.
Procedurally the ``spack-intermediate-ci-signing-key`` secret is used in


@@ -1047,9 +1047,9 @@ def __bool__(self):
"""Whether any exceptions were handled."""
return bool(self.exceptions)
def forward(self, context: str) -> "GroupedExceptionForwarder":
def forward(self, context: str, base: type = BaseException) -> "GroupedExceptionForwarder":
"""Return a contextmanager which extracts tracebacks and prefixes a message."""
return GroupedExceptionForwarder(context, self)
return GroupedExceptionForwarder(context, self, base)
def _receive_forwarded(self, context: str, exc: Exception, tb: List[str]):
self.exceptions.append((context, exc, tb))
@@ -1072,15 +1072,18 @@ class GroupedExceptionForwarder:
"""A contextmanager to capture exceptions and forward them to a
GroupedExceptionHandler."""
def __init__(self, context: str, handler: GroupedExceptionHandler):
def __init__(self, context: str, handler: GroupedExceptionHandler, base: type):
self._context = context
self._handler = handler
self._base = base
def __enter__(self):
return None
def __exit__(self, exc_type, exc_value, tb):
if exc_value is not None:
if not issubclass(exc_type, self._base):
return False
self._handler._receive_forwarded(self._context, exc_value, traceback.format_tb(tb))
# Suppress any exception from being re-raised:


@@ -4,7 +4,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
#: PEP440 canonical <major>.<minor>.<micro>.<devN> string
__version__ = "0.21.0.dev0"
__version__ = "0.22.0.dev0"
spack_version = __version__


@@ -40,6 +40,7 @@ def _search_duplicate_compilers(error_cls):
import collections.abc
import glob
import inspect
import io
import itertools
import pathlib
import pickle
@@ -54,6 +55,7 @@ def _search_duplicate_compilers(error_cls):
import spack.repo
import spack.spec
import spack.util.crypto
import spack.util.spack_yaml as syaml
import spack.variant
#: Map an audit tag to a list of callables implementing checks
@@ -250,6 +252,88 @@ def _search_duplicate_specs_in_externals(error_cls):
return errors
@config_packages
def _deprecated_preferences(error_cls):
"""Search package preferences deprecated in v0.21 (and slated for removal in v0.22)"""
# TODO (v0.22): remove this audit as the attributes will not be allowed in config
errors = []
packages_yaml = spack.config.CONFIG.get_config("packages")
def make_error(attribute_name, config_data, summary):
s = io.StringIO()
s.write("Occurring in the following file:\n")
dict_view = syaml.syaml_dict((k, v) for k, v in config_data.items() if k == attribute_name)
syaml.dump_config(dict_view, stream=s, blame=True)
return error_cls(summary=summary, details=[s.getvalue()])
if "all" in packages_yaml and "version" in packages_yaml["all"]:
summary = "Using the deprecated 'version' attribute under 'packages:all'"
errors.append(make_error("version", packages_yaml["all"], summary))
for package_name in packages_yaml:
if package_name == "all":
continue
package_conf = packages_yaml[package_name]
for attribute in ("compiler", "providers", "target"):
if attribute not in package_conf:
continue
summary = (
f"Using the deprecated '{attribute}' attribute " f"under 'packages:{package_name}'"
)
errors.append(make_error(attribute, package_conf, summary))
return errors
@config_packages
def _avoid_mismatched_variants(error_cls):
"""Warns if variant preferences have mismatched types or names."""
errors = []
packages_yaml = spack.config.CONFIG.get_config("packages")
def make_error(config_data, summary):
s = io.StringIO()
s.write("Occurring in the following file:\n")
syaml.dump_config(config_data, stream=s, blame=True)
return error_cls(summary=summary, details=[s.getvalue()])
for pkg_name in packages_yaml:
# 'all:' must be more forgiving, since it is setting defaults for everything
if pkg_name == "all" or "variants" not in packages_yaml[pkg_name]:
continue
preferences = packages_yaml[pkg_name]["variants"]
if not isinstance(preferences, list):
preferences = [preferences]
for variants in preferences:
current_spec = spack.spec.Spec(variants)
pkg_cls = spack.repo.PATH.get_pkg_class(pkg_name)
for variant in current_spec.variants.values():
# Variant does not exist at all
if variant.name not in pkg_cls.variants:
summary = (
f"Setting a preference for the '{pkg_name}' package to the "
f"non-existing variant '{variant.name}'"
)
errors.append(make_error(preferences, summary))
continue
# Variant cannot accept this value
s = spack.spec.Spec(pkg_name)
try:
s.update_variant_validate(variant.name, variant.value)
except Exception:
summary = (
f"Setting the variant '{variant.name}' of the '{pkg_name}' package "
f"to the invalid value '{str(variant)}'"
)
errors.append(make_error(preferences, summary))
return errors
#: Sanity checks on package directives
package_directives = AuditClass(
group="packages",
@@ -642,13 +726,46 @@ def _unknown_variants_in_directives(pkgs, error_cls):
@package_directives
def _unknown_variants_in_dependencies(pkgs, error_cls):
"""Report unknown dependencies and wrong variants for dependencies"""
def _issues_in_depends_on_directive(pkgs, error_cls):
"""Reports issues with 'depends_on' directives.
Issues might be unknown dependencies, unknown variants or variant values, or declaration
of nested dependencies.
"""
errors = []
for pkg_name in pkgs:
pkg_cls = spack.repo.PATH.get_pkg_class(pkg_name)
filename = spack.repo.PATH.filename_for_package_name(pkg_name)
for dependency_name, dependency_data in pkg_cls.dependencies.items():
# Check if there are nested dependencies declared. We don't want directives like:
#
# depends_on('foo+bar ^fee+baz')
#
# but we'd like to have two dependencies listed instead.
for when, dependency_edge in dependency_data.items():
dependency_spec = dependency_edge.spec
nested_dependencies = dependency_spec.dependencies()
if nested_dependencies:
summary = (
f"{pkg_name}: invalid nested dependency "
f"declaration '{str(dependency_spec)}'"
)
details = [
f"split depends_on('{str(dependency_spec)}', when='{str(when)}') "
f"into {len(nested_dependencies) + 1} directives",
f"in {filename}",
]
errors.append(error_cls(summary=summary, details=details))
for s in (dependency_spec, when):
if s.virtual and s.variants:
summary = f"{pkg_name}: virtual dependency cannot have variants"
details = [
f"remove variants from '{str(s)}' in depends_on directive",
f"in {filename}",
]
errors.append(error_cls(summary=summary, details=details))
# No need to analyze virtual packages
if spack.repo.PATH.is_virtual(dependency_name):
continue
@@ -776,7 +893,7 @@ def _version_constraints_are_satisfiable_by_some_version_in_repo(pkgs, error_cls
)
except Exception:
summary = (
"{0}: dependency on {1} cannot be satisfied " "by known versions of {1.name}"
"{0}: dependency on {1} cannot be satisfied by known versions of {1.name}"
).format(pkg_name, s)
details = ["happening in " + filename]
if dependency_pkg_cls is not None:
@@ -818,6 +935,53 @@ def _analyze_variants_in_directive(pkg, constraint, directive, error_cls):
return errors
@package_directives
def _named_specs_in_when_arguments(pkgs, error_cls):
"""Reports named specs in the 'when=' attribute of a directive.
Note that 'conflicts' is the only directive allowing that.
"""
errors = []
for pkg_name in pkgs:
pkg_cls = spack.repo.PATH.get_pkg_class(pkg_name)
def _extracts_errors(triggers, summary):
_errors = []
for trigger in list(triggers):
when_spec = spack.spec.Spec(trigger)
if when_spec.name is not None and when_spec.name != pkg_name:
details = [f"using '{trigger}', should be '^{trigger}'"]
_errors.append(error_cls(summary=summary, details=details))
return _errors
for dname, triggers in pkg_cls.dependencies.items():
summary = f"{pkg_name}: wrong 'when=' condition for the '{dname}' dependency"
errors.extend(_extracts_errors(triggers, summary))
for vname, (variant, triggers) in pkg_cls.variants.items():
summary = f"{pkg_name}: wrong 'when=' condition for the '{vname}' variant"
errors.extend(_extracts_errors(triggers, summary))
for provided, triggers in pkg_cls.provided.items():
summary = f"{pkg_name}: wrong 'when=' condition for the '{provided}' virtual"
errors.extend(_extracts_errors(triggers, summary))
for _, triggers in pkg_cls.requirements.items():
triggers = [when_spec for when_spec, _, _ in triggers]
summary = f"{pkg_name}: wrong 'when=' condition in 'requires' directive"
errors.extend(_extracts_errors(triggers, summary))
triggers = list(pkg_cls.patches)
summary = f"{pkg_name}: wrong 'when=' condition in 'patch' directives"
errors.extend(_extracts_errors(triggers, summary))
triggers = list(pkg_cls.resources)
summary = f"{pkg_name}: wrong 'when=' condition in 'resource' directives"
errors.extend(_extracts_errors(triggers, summary))
return llnl.util.lang.dedupe(errors)
#: Sanity checks on package directives
external_detection = AuditClass(
group="externals",


@@ -25,7 +25,7 @@
import warnings
from contextlib import closing, contextmanager
from gzip import GzipFile
from typing import Dict, List, NamedTuple, Optional, Set, Tuple
from typing import Dict, Iterable, List, NamedTuple, Optional, Set, Tuple
from urllib.error import HTTPError, URLError
import llnl.util.filesystem as fsys
@@ -66,8 +66,9 @@
from spack.stage import Stage
from spack.util.executable import which
_build_cache_relative_path = "build_cache"
_build_cache_keys_relative_path = "_pgp"
BUILD_CACHE_RELATIVE_PATH = "build_cache"
BUILD_CACHE_KEYS_RELATIVE_PATH = "_pgp"
CURRENT_BUILD_CACHE_LAYOUT_VERSION = 1
class BuildCacheDatabase(spack_db.Database):
@@ -229,7 +230,11 @@ def _associate_built_specs_with_mirror(self, cache_key, mirror_url):
)
return
spec_list = db.query_local(installed=False, in_buildcache=True)
spec_list = [
s
for s in db.query_local(installed=any, in_buildcache=any)
if s.external or db.query_local_by_spec_hash(s.dag_hash()).in_buildcache
]
for indexed_spec in spec_list:
dag_hash = indexed_spec.dag_hash()
@@ -481,7 +486,7 @@ def _fetch_and_cache_index(self, mirror_url, cache_entry={}):
scheme = urllib.parse.urlparse(mirror_url).scheme
if scheme != "oci" and not web_util.url_exists(
url_util.join(mirror_url, _build_cache_relative_path, "index.json")
url_util.join(mirror_url, BUILD_CACHE_RELATIVE_PATH, "index.json")
):
return False
@@ -600,6 +605,10 @@ def __init__(self, msg):
super().__init__(msg)
class InvalidMetadataFile(spack.error.SpackError):
pass
class UnsignedPackageException(spack.error.SpackError):
"""
Raised if installation of unsigned package is attempted without
@@ -614,11 +623,11 @@ def compute_hash(data):
def build_cache_relative_path():
return _build_cache_relative_path
return BUILD_CACHE_RELATIVE_PATH
def build_cache_keys_relative_path():
return _build_cache_keys_relative_path
return BUILD_CACHE_KEYS_RELATIVE_PATH
def build_cache_prefix(prefix):
@@ -1401,7 +1410,7 @@ def _build_tarball_in_stage_dir(spec: Spec, out_url: str, stage_dir: str, option
spec_dict = sjson.load(content)
else:
raise ValueError("{0} not a valid spec file type".format(spec_file))
spec_dict["buildcache_layout_version"] = 1
spec_dict["buildcache_layout_version"] = CURRENT_BUILD_CACHE_LAYOUT_VERSION
spec_dict["binary_cache_checksum"] = {"hash_algorithm": "sha256", "hash": checksum}
with open(specfile_path, "w") as outfile:
@@ -1560,14 +1569,50 @@ def _delete_staged_downloads(download_result):
download_result["specfile_stage"].destroy()
def download_tarball(spec, unsigned=False, mirrors_for_spec=None):
def _get_valid_spec_file(path: str, max_supported_layout: int) -> Tuple[Dict, int]:
"""Read and validate a spec file, returning the spec dict with its layout version, or raising
InvalidMetadataFile if invalid."""
try:
with open(path, "rb") as f:
binary_content = f.read()
except OSError:
raise InvalidMetadataFile(f"No such file: {path}")
# In the future we may support transparently decompressing compressed spec files.
if binary_content[:2] == b"\x1f\x8b":
raise InvalidMetadataFile("Compressed spec files are not supported")
try:
as_string = binary_content.decode("utf-8")
if path.endswith(".json.sig"):
spec_dict = Spec.extract_json_from_clearsig(as_string)
else:
spec_dict = json.loads(as_string)
except Exception as e:
raise InvalidMetadataFile(f"Could not parse {path} due to: {e}") from e
# Ensure this version is not too new.
try:
layout_version = int(spec_dict.get("buildcache_layout_version", 0))
except ValueError as e:
raise InvalidMetadataFile("Could not parse layout version") from e
if layout_version > max_supported_layout:
raise InvalidMetadataFile(
f"Layout version {layout_version} is too new for this version of Spack"
)
return spec_dict, layout_version
def download_tarball(spec, unsigned: Optional[bool] = False, mirrors_for_spec=None):
"""
Download binary tarball for given package into stage area, returning
path to downloaded tarball if successful, None otherwise.
Args:
spec (spack.spec.Spec): Concrete spec
unsigned (bool): Whether or not to require signed binaries
unsigned: if ``True`` or ``False`` override the mirror signature verification defaults
mirrors_for_spec (list): Optional list of concrete specs and mirrors
obtained by calling binary_distribution.get_mirrors_for_spec().
These will be checked in order first before looking in other
@@ -1588,7 +1633,9 @@ def download_tarball(spec, unsigned=False, mirrors_for_spec=None):
"signature_verified": "true-if-binary-pkg-was-already-verified"
}
"""
configured_mirrors = spack.mirror.MirrorCollection(binary=True).values()
configured_mirrors: Iterable[spack.mirror.Mirror] = spack.mirror.MirrorCollection(
binary=True
).values()
if not configured_mirrors:
tty.die("Please add a spack mirror to allow download of pre-compiled packages.")
@@ -1606,8 +1653,16 @@ def download_tarball(spec, unsigned=False, mirrors_for_spec=None):
# mirror for the spec twice though.
try_first = [i["mirror_url"] for i in mirrors_for_spec] if mirrors_for_spec else []
try_next = [i.fetch_url for i in configured_mirrors if i.fetch_url not in try_first]
mirror_urls = try_first + try_next
mirrors = try_first + try_next
# TODO: turn `mirrors_for_spec` into a list of Mirror instances, instead of doing that here.
def fetch_url_to_mirror(url):
for mirror in configured_mirrors:
if mirror.fetch_url == url:
return mirror
return spack.mirror.Mirror(url)
mirrors = [fetch_url_to_mirror(url) for url in mirror_urls]
tried_to_verify_sigs = []
@@ -1616,14 +1671,17 @@ def download_tarball(spec, unsigned=False, mirrors_for_spec=None):
# we remove support for deprecated spec formats and buildcache layouts.
for try_signed in (True, False):
for mirror in mirrors:
# Override mirror's default if
currently_unsigned = unsigned if unsigned is not None else not mirror.signed
# If it's an OCI index, do things differently, since we cannot compose URLs.
parsed = urllib.parse.urlparse(mirror)
fetch_url = mirror.fetch_url
# TODO: refactor this to some "nice" place.
if parsed.scheme == "oci":
ref = spack.oci.image.ImageReference.from_string(mirror[len("oci://") :]).with_tag(
spack.oci.image.default_tag(spec)
)
if fetch_url.startswith("oci://"):
ref = spack.oci.image.ImageReference.from_string(
fetch_url[len("oci://") :]
).with_tag(spack.oci.image.default_tag(spec))
# Fetch the manifest
try:
@@ -1652,6 +1710,18 @@ def download_tarball(spec, unsigned=False, mirrors_for_spec=None):
try:
local_specfile_stage.fetch()
local_specfile_stage.check()
try:
_get_valid_spec_file(
local_specfile_stage.save_filename,
CURRENT_BUILD_CACHE_LAYOUT_VERSION,
)
except InvalidMetadataFile as e:
tty.warn(
f"Ignoring binary package for {spec.name}/{spec.dag_hash()[:7]} "
f"from {fetch_url} due to invalid metadata file: {e}"
)
local_specfile_stage.destroy()
continue
except Exception:
continue
local_specfile_stage.cache_local()
@@ -1670,28 +1740,43 @@ def download_tarball(spec, unsigned=False, mirrors_for_spec=None):
"tarball_stage": tarball_stage,
"specfile_stage": local_specfile_stage,
"signature_verified": False,
"signature_required": not currently_unsigned,
}
else:
ext = "json.sig" if try_signed else "json"
specfile_path = url_util.join(mirror, _build_cache_relative_path, specfile_prefix)
specfile_path = url_util.join(
fetch_url, BUILD_CACHE_RELATIVE_PATH, specfile_prefix
)
specfile_url = f"{specfile_path}.{ext}"
spackfile_url = url_util.join(mirror, _build_cache_relative_path, tarball)
spackfile_url = url_util.join(fetch_url, BUILD_CACHE_RELATIVE_PATH, tarball)
local_specfile_stage = try_fetch(specfile_url)
if local_specfile_stage:
local_specfile_path = local_specfile_stage.save_filename
signature_verified = False
if try_signed and not unsigned:
try:
_get_valid_spec_file(
local_specfile_path, CURRENT_BUILD_CACHE_LAYOUT_VERSION
)
except InvalidMetadataFile as e:
tty.warn(
f"Ignoring binary package for {spec.name}/{spec.dag_hash()[:7]} "
f"from {fetch_url} due to invalid metadata file: {e}"
)
local_specfile_stage.destroy()
continue
if try_signed and not currently_unsigned:
# If we found a signed specfile at the root, try to verify
# the signature immediately. We will not download the
# tarball if we could not verify the signature.
tried_to_verify_sigs.append(specfile_url)
signature_verified = try_verify(local_specfile_path)
if not signature_verified:
tty.warn("Failed to verify: {0}".format(specfile_url))
tty.warn(f"Failed to verify: {specfile_url}")
if unsigned or signature_verified or not try_signed:
if currently_unsigned or signature_verified or not try_signed:
# We will download the tarball in one of three cases:
# 1. user asked for --no-check-signature
# 2. user didn't ask for --no-check-signature, but we
@@ -1714,6 +1799,7 @@ def download_tarball(spec, unsigned=False, mirrors_for_spec=None):
"tarball_stage": tarball_stage,
"specfile_stage": local_specfile_stage,
"signature_verified": signature_verified,
"signature_required": not currently_unsigned,
}
local_specfile_stage.destroy()
@@ -1912,7 +1998,7 @@ def is_backup_file(file):
relocate.relocate_text(text_names, prefix_to_prefix_text)
def _extract_inner_tarball(spec, filename, extract_to, unsigned, remote_checksum):
def _extract_inner_tarball(spec, filename, extract_to, signature_required: bool, remote_checksum):
stagepath = os.path.dirname(filename)
spackfile_name = tarball_name(spec, ".spack")
spackfile_path = os.path.join(stagepath, spackfile_name)
@@ -1932,7 +2018,7 @@ def _extract_inner_tarball(spec, filename, extract_to, unsigned, remote_checksum
else:
raise ValueError("Cannot find spec file for {0}.".format(extract_to))
if not unsigned:
if signature_required:
if os.path.exists("%s.asc" % specfile_path):
suppress = config.get("config:suppress_gpg_warnings", False)
try:
@@ -1981,7 +2067,7 @@ def _tar_strip_component(tar: tarfile.TarFile, prefix: str):
m.linkname = m.linkname[result.end() :]
def extract_tarball(spec, download_result, unsigned=False, force=False, timer=timer.NULL_TIMER):
def extract_tarball(spec, download_result, force=False, timer=timer.NULL_TIMER):
"""
extract binary tarball for given package into install area
"""
@@ -2001,35 +2087,30 @@ def extract_tarball(spec, download_result, unsigned=False, force=False, timer=ti
)
specfile_path = download_result["specfile_stage"].save_filename
with open(specfile_path, "r") as inputfile:
content = inputfile.read()
if specfile_path.endswith(".json.sig"):
spec_dict = Spec.extract_json_from_clearsig(content)
else:
spec_dict = sjson.load(content)
spec_dict, layout_version = _get_valid_spec_file(
specfile_path, CURRENT_BUILD_CACHE_LAYOUT_VERSION
)
bchecksum = spec_dict["binary_cache_checksum"]
filename = download_result["tarball_stage"].save_filename
signature_verified = download_result["signature_verified"]
signature_verified: bool = download_result["signature_verified"]
signature_required: bool = download_result["signature_required"]
tmpdir = None
if (
"buildcache_layout_version" not in spec_dict
or int(spec_dict["buildcache_layout_version"]) < 1
):
if layout_version == 0:
# Handle the older buildcache layout where the .spack file
# contains a spec json, maybe an .asc file (signature),
# and another tarball containing the actual install tree.
tmpdir = tempfile.mkdtemp()
try:
tarfile_path = _extract_inner_tarball(spec, filename, tmpdir, unsigned, bchecksum)
tarfile_path = _extract_inner_tarball(
spec, filename, tmpdir, signature_required, bchecksum
)
except Exception as e:
_delete_staged_downloads(download_result)
shutil.rmtree(tmpdir)
raise e
else:
elif layout_version == 1:
# Newer buildcache layout: the .spack file contains just
# the install tree; the signature, if it exists, is
# wrapped around the spec.json at the root. If sig verify
@@ -2037,9 +2118,10 @@ def extract_tarball(spec, download_result, unsigned=False, force=False, timer=ti
# the tarball.
tarfile_path = filename
if not unsigned and not signature_verified:
if signature_required and not signature_verified:
raise UnsignedPackageException(
"To install unsigned packages, use the --no-check-signature option."
"To install unsigned packages, use the --no-check-signature option, "
"or configure the mirror with signed: false."
)
# compute the sha256 checksum of the tarball
@@ -2053,7 +2135,6 @@ def extract_tarball(spec, download_result, unsigned=False, force=False, timer=ti
raise NoChecksumException(
tarfile_path, size, contents, "sha256", expected, local_checksum
)
try:
with closing(tarfile.open(tarfile_path, "r")) as tar:
# Remove install prefix from tarfile to extract directly into spec.prefix
@@ -2153,7 +2234,7 @@ def install_root_node(spec, unsigned=False, force=False, sha256=None):
# don't print long padded paths while extracting/relocating binaries
with spack.util.path.filter_padding():
tty.msg('Installing "{0}" from a buildcache'.format(spec.format()))
extract_tarball(spec, download_result, unsigned, force)
extract_tarball(spec, download_result, force)
spack.hooks.post_install(spec, False)
spack.store.STORE.db.add(spec, spack.store.STORE.layout)
@@ -2184,10 +2265,10 @@ def try_direct_fetch(spec, mirrors=None):
for mirror in binary_mirrors:
buildcache_fetch_url_json = url_util.join(
mirror.fetch_url, _build_cache_relative_path, specfile_name
mirror.fetch_url, BUILD_CACHE_RELATIVE_PATH, specfile_name
)
buildcache_fetch_url_signed_json = url_util.join(
mirror.fetch_url, _build_cache_relative_path, signed_specfile_name
mirror.fetch_url, BUILD_CACHE_RELATIVE_PATH, signed_specfile_name
)
try:
_, _, fs = web_util.read_from_url(buildcache_fetch_url_signed_json)
@@ -2292,7 +2373,7 @@ def get_keys(install=False, trust=False, force=False, mirrors=None):
for mirror in mirror_collection.values():
fetch_url = mirror.fetch_url
keys_url = url_util.join(
fetch_url, _build_cache_relative_path, _build_cache_keys_relative_path
fetch_url, BUILD_CACHE_RELATIVE_PATH, BUILD_CACHE_KEYS_RELATIVE_PATH
)
keys_index = url_util.join(keys_url, "index.json")
@@ -2357,7 +2438,7 @@ def push_keys(*mirrors, **kwargs):
for mirror in mirrors:
push_url = getattr(mirror, "push_url", mirror)
keys_url = url_util.join(
push_url, _build_cache_relative_path, _build_cache_keys_relative_path
push_url, BUILD_CACHE_RELATIVE_PATH, BUILD_CACHE_KEYS_RELATIVE_PATH
)
keys_local = url_util.local_file_path(keys_url)
@@ -2495,11 +2576,11 @@ def download_buildcache_entry(file_descriptions, mirror_url=None):
)
if mirror_url:
mirror_root = os.path.join(mirror_url, _build_cache_relative_path)
mirror_root = os.path.join(mirror_url, BUILD_CACHE_RELATIVE_PATH)
return _download_buildcache_entry(mirror_root, file_descriptions)
for mirror in spack.mirror.MirrorCollection(binary=True).values():
mirror_root = os.path.join(mirror.fetch_url, _build_cache_relative_path)
mirror_root = os.path.join(mirror.fetch_url, BUILD_CACHE_RELATIVE_PATH)
if _download_buildcache_entry(mirror_root, file_descriptions):
return True
@@ -2590,7 +2671,7 @@ def __init__(self, url, local_hash, urlopen=web_util.urlopen):
def get_remote_hash(self):
# Failure to fetch index.json.hash is not fatal
url_index_hash = url_util.join(self.url, _build_cache_relative_path, "index.json.hash")
url_index_hash = url_util.join(self.url, BUILD_CACHE_RELATIVE_PATH, "index.json.hash")
try:
response = self.urlopen(urllib.request.Request(url_index_hash, headers=self.headers))
except urllib.error.URLError:
@@ -2611,7 +2692,7 @@ def conditional_fetch(self) -> FetchIndexResult:
return FetchIndexResult(etag=None, hash=None, data=None, fresh=True)
# Otherwise, download index.json
url_index = url_util.join(self.url, _build_cache_relative_path, "index.json")
url_index = url_util.join(self.url, BUILD_CACHE_RELATIVE_PATH, "index.json")
try:
response = self.urlopen(urllib.request.Request(url_index, headers=self.headers))
@@ -2655,7 +2736,7 @@ def __init__(self, url, etag, urlopen=web_util.urlopen):
def conditional_fetch(self) -> FetchIndexResult:
# Just do a conditional fetch immediately
url = url_util.join(self.url, _build_cache_relative_path, "index.json")
url = url_util.join(self.url, BUILD_CACHE_RELATIVE_PATH, "index.json")
headers = {
"User-Agent": web_util.SPACK_USER_AGENT,
"If-None-Match": '"{}"'.format(self.etag),

View File

@@ -16,6 +16,7 @@
import llnl.util.filesystem as fs
from llnl.util import tty
import spack.platforms
import spack.store
import spack.util.environment
import spack.util.executable
@@ -206,16 +207,19 @@ def _root_spec(spec_str: str) -> str:
"""Add a proper compiler and target to a spec used during bootstrapping.
Args:
spec_str (str): spec to be bootstrapped. Must be without compiler and target.
spec_str: spec to be bootstrapped. Must be without compiler and target.
"""
# Add a proper compiler hint to the root spec. We use GCC for
# everything but MacOS and Windows.
if str(spack.platforms.host()) == "darwin":
# Add a compiler requirement to the root spec.
platform = str(spack.platforms.host())
if platform == "darwin":
spec_str += " %apple-clang"
elif str(spack.platforms.host()) == "windows":
spec_str += " %msvc"
else:
elif platform == "windows":
# TODO (johnwparent): Remove version constraint when clingo patch is up
spec_str += " %msvc@:19.37"
elif platform == "linux":
spec_str += " %gcc"
elif platform == "freebsd":
spec_str += " %clang"
target = archspec.cpu.host().family
spec_str += f" target={target}"

View File

@@ -386,7 +386,7 @@ def ensure_module_importable_or_raise(module: str, abstract_spec: Optional[str]
exception_handler = GroupedExceptionHandler()
for current_config in bootstrapping_sources():
with exception_handler.forward(current_config["name"]):
with exception_handler.forward(current_config["name"], Exception):
source_is_enabled_or_raise(current_config)
current_bootstrapper = create_bootstrapper(current_config)
if current_bootstrapper.try_import(module, abstract_spec):
@@ -441,7 +441,7 @@ def ensure_executables_in_path_or_raise(
exception_handler = GroupedExceptionHandler()
for current_config in bootstrapping_sources():
with exception_handler.forward(current_config["name"]):
with exception_handler.forward(current_config["name"], Exception):
source_is_enabled_or_raise(current_config)
current_bootstrapper = create_bootstrapper(current_config)
if current_bootstrapper.try_search_path(executables, abstract_spec):

View File

@@ -19,7 +19,6 @@
import spack.tengine
import spack.util.cpus
import spack.util.executable
from spack.environment import depfile
from ._common import _root_spec
from .config import root_path, spec_for_current_python, store_path
@@ -86,12 +85,9 @@ def __init__(self) -> None:
super().__init__(self.environment_root())
def update_installations(self) -> None:
"""Update the installations of this environment.
The update is done using a depfile on Linux and macOS, and using the ``install_all``
method of environments on Windows.
"""
with tty.SuppressOutput(msg_enabled=False, warn_enabled=False):
"""Update the installations of this environment."""
log_enabled = tty.is_debug() or tty.is_verbose()
with tty.SuppressOutput(msg_enabled=log_enabled, warn_enabled=log_enabled):
specs = self.concretize()
if specs:
colorized_specs = [
@@ -100,11 +96,9 @@ def update_installations(self) -> None:
]
tty.msg(f"[BOOTSTRAPPING] Installing dependencies ({', '.join(colorized_specs)})")
self.write(regenerate=False)
if sys.platform == "win32":
with tty.SuppressOutput(msg_enabled=log_enabled, warn_enabled=log_enabled):
self.install_all()
else:
self._install_with_depfile()
self.write(regenerate=True)
self.write(regenerate=True)
def update_syspath_and_environ(self) -> None:
"""Update ``sys.path`` and the PATH, PYTHONPATH environment variables to point to
@@ -122,25 +116,6 @@ def update_syspath_and_environ(self) -> None:
+ [str(x) for x in self.pythonpaths()]
)
def _install_with_depfile(self) -> None:
model = depfile.MakefileModel.from_env(self)
template = spack.tengine.make_environment().get_template(
os.path.join("depfile", "Makefile")
)
makefile = self.environment_root() / "Makefile"
makefile.write_text(template.render(model.to_dict()))
make = spack.util.executable.which("make")
kwargs = {}
if not tty.is_debug():
kwargs = {"output": os.devnull, "error": os.devnull}
make(
"-C",
str(self.environment_root()),
"-j",
str(spack.util.cpus.determine_number_of_jobs(parallel=True)),
**kwargs,
)
def _write_spack_yaml_file(self) -> None:
tty.msg(
"[BOOTSTRAPPING] Spack has missing dependencies, creating a bootstrapping environment"
@@ -161,7 +136,7 @@ def _write_spack_yaml_file(self) -> None:
def isort_root_spec() -> str:
"""Return the root spec used to bootstrap isort"""
return _root_spec("py-isort@4.3.5:")
return _root_spec("py-isort@5")
def mypy_root_spec() -> str:

View File

@@ -66,7 +66,6 @@ def _core_requirements() -> List[RequiredResponseType]:
_core_system_exes = {
"make": _missing("make", "required to build software from sources"),
"patch": _missing("patch", "required to patch source code before building"),
"bash": _missing("bash", "required for Spack compiler wrapper"),
"tar": _missing("tar", "required to manage code archives"),
"gzip": _missing("gzip", "required to compress/decompress code archives"),
"unzip": _missing("unzip", "required to compress/decompress code archives"),

View File

@@ -324,19 +324,29 @@ def set_compiler_environment_variables(pkg, env):
# ttyout, ttyerr, etc.
link_dir = spack.paths.build_env_path
# Set SPACK compiler variables so that our wrapper knows what to call
# Set SPACK compiler variables so that our wrapper knows what to
# call. If there is no compiler configured then use a default
# wrapper which will emit an error if it is used.
if compiler.cc:
env.set("SPACK_CC", compiler.cc)
env.set("CC", os.path.join(link_dir, compiler.link_paths["cc"]))
else:
env.set("CC", os.path.join(link_dir, "cc"))
if compiler.cxx:
env.set("SPACK_CXX", compiler.cxx)
env.set("CXX", os.path.join(link_dir, compiler.link_paths["cxx"]))
else:
env.set("CC", os.path.join(link_dir, "c++"))
if compiler.f77:
env.set("SPACK_F77", compiler.f77)
env.set("F77", os.path.join(link_dir, compiler.link_paths["f77"]))
else:
env.set("F77", os.path.join(link_dir, "f77"))
if compiler.fc:
env.set("SPACK_FC", compiler.fc)
env.set("FC", os.path.join(link_dir, compiler.link_paths["fc"]))
else:
env.set("FC", os.path.join(link_dir, "fc"))
# Set SPACK compiler rpath flags so that our wrapper knows what to use
env.set("SPACK_CC_RPATH_ARG", compiler.cc_rpath_arg)
@@ -743,15 +753,16 @@ def setup_package(pkg, dirty, context: Context = Context.BUILD):
set_compiler_environment_variables(pkg, env_mods)
set_wrapper_variables(pkg, env_mods)
tty.debug("setup_package: grabbing modifications from dependencies")
env_mods.extend(setup_context.get_env_modifications())
tty.debug("setup_package: collected all modifications from dependencies")
# architecture specific setup
# Platform specific setup goes before package specific setup. This is for setting
# defaults like MACOSX_DEPLOYMENT_TARGET on macOS.
platform = spack.platforms.by_name(pkg.spec.architecture.platform)
target = platform.target(pkg.spec.architecture.target)
platform.setup_platform_environment(pkg, env_mods)
tty.debug("setup_package: grabbing modifications from dependencies")
env_mods.extend(setup_context.get_env_modifications())
tty.debug("setup_package: collected all modifications from dependencies")
if context == Context.TEST:
env_mods.prepend_path("PATH", ".")
elif context == Context.BUILD and not dirty and not env_mods.is_unset("CPATH"):
@@ -1021,6 +1032,11 @@ def get_env_modifications(self) -> EnvironmentModifications:
if id(spec) in self.nodes_in_subdag:
pkg.setup_dependent_run_environment(run_env_mods, spec)
pkg.setup_run_environment(run_env_mods)
external_env = (dspec.extra_attributes or {}).get("environment", {})
if external_env:
run_env_mods.extend(spack.schema.environment.parse(external_env))
if self.context == Context.BUILD:
# Don't let the runtime environment of compiler-like dependencies leak into the
# build env
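The last hunk above also makes Spack apply environment modifications declared on external specs through `extra_attributes` when assembling run environments. A hedged sketch of the data it consumes; the `set`/`prepend_path` keys follow Spack's usual environment-modification schema and are assumptions here, not taken from this diff:

import spack.schema.environment

# What an external package entry might carry under extra_attributes in
# packages.yaml (values are purely illustrative):
external_env = {
    "set": {"VENDOR_LICENSE_FILE": "/opt/vendor/license.dat"},
    "prepend_path": {"LD_LIBRARY_PATH": "/opt/vendor/lib"},
}

# Mirrors the new call in get_env_modifications(): turn the dictionary into
# EnvironmentModifications and fold them into the spec's run environment.
mods = spack.schema.environment.parse(external_env)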

View File

@@ -9,6 +9,7 @@
import llnl.util.filesystem as fs
import llnl.util.tty as tty
import spack.build_environment
import spack.builder
from .cmake import CMakeBuilder, CMakePackage
@@ -34,6 +35,11 @@ def cmake_cache_option(name, boolean_value, comment="", force=False):
return 'set({0} {1} CACHE BOOL "{2}"{3})\n'.format(name, value, comment, force_str)
def cmake_cache_filepath(name, value, comment=""):
"""Generate a string for a cmake cache variable of type FILEPATH"""
return 'set({0} "{1}" CACHE FILEPATH "{2}")\n'.format(name, value, comment)
class CachedCMakeBuilder(CMakeBuilder):
#: Phases of a Cached CMake package
#: Note: the initconfig phase is used for developer builds as a final phase to stop on
@@ -257,6 +263,15 @@ def initconfig_hardware_entries(self):
entries.append(
cmake_cache_path("HIP_CXX_COMPILER", "{0}".format(self.spec["hip"].hipcc))
)
llvm_bin = spec["llvm-amdgpu"].prefix.bin
llvm_prefix = spec["llvm-amdgpu"].prefix
# Some ROCm systems seem to point to /<path>/rocm-<ver>/ and
# others point to /<path>/rocm-<ver>/llvm
if os.path.basename(os.path.normpath(llvm_prefix)) != "llvm":
llvm_bin = os.path.join(llvm_prefix, "llvm/bin/")
entries.append(
cmake_cache_filepath("CMAKE_HIP_COMPILER", os.path.join(llvm_bin, "clang++"))
)
archs = self.spec.variants["amdgpu_target"].value
if archs[0] != "none":
arch_str = ";".join(archs)
@@ -271,13 +286,29 @@ def initconfig_hardware_entries(self):
def std_initconfig_entries(self):
cmake_prefix_path_env = os.environ["CMAKE_PREFIX_PATH"]
cmake_prefix_path = cmake_prefix_path_env.replace(os.pathsep, ";")
cmake_rpaths_env = spack.build_environment.get_rpaths(self.pkg)
cmake_rpaths_path = ";".join(cmake_rpaths_env)
complete_rpath_list = cmake_rpaths_path
if "SPACK_COMPILER_EXTRA_RPATHS" in os.environ:
spack_extra_rpaths_env = os.environ["SPACK_COMPILER_EXTRA_RPATHS"]
spack_extra_rpaths_path = spack_extra_rpaths_env.replace(os.pathsep, ";")
complete_rpath_list = "{0};{1}".format(complete_rpath_list, spack_extra_rpaths_path)
if "SPACK_COMPILER_IMPLICIT_RPATHS" in os.environ:
spack_implicit_rpaths_env = os.environ["SPACK_COMPILER_IMPLICIT_RPATHS"]
spack_implicit_rpaths_path = spack_implicit_rpaths_env.replace(os.pathsep, ";")
complete_rpath_list = "{0};{1}".format(complete_rpath_list, spack_implicit_rpaths_path)
return [
"#------------------{0}".format("-" * 60),
"# !!!! This is a generated file, edit at own risk !!!!",
"#------------------{0}".format("-" * 60),
"# CMake executable path: {0}".format(self.pkg.spec["cmake"].command.path),
"#------------------{0}\n".format("-" * 60),
cmake_cache_path("CMAKE_PREFIX_PATH", cmake_prefix_path),
cmake_cache_string("CMAKE_PREFIX_PATH", cmake_prefix_path),
cmake_cache_string("CMAKE_INSTALL_RPATH_USE_LINK_PATH", "ON"),
cmake_cache_string("CMAKE_BUILD_RPATH", complete_rpath_list),
cmake_cache_string("CMAKE_INSTALL_RPATH", complete_rpath_list),
self.define_cmake_cache_from_variant("CMAKE_BUILD_TYPE", "build_type"),
]
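The new rpath entries above splice colon-separated environment lists into CMake's semicolon-separated list format before they are written to the generated initconfig file. A small illustration of that conversion (helper name invented for the example):

import os

def to_cmake_list(value: str) -> str:
    """Convert an os.pathsep-separated list (':' on POSIX) into the ';'-separated
    form CMake expects, as done above for CMAKE_PREFIX_PATH and the rpath lists."""
    return value.replace(os.pathsep, ";")

spack_rpaths = ["/opt/spack/opt/foo/lib", "/opt/spack/opt/foo/lib64"]
extra = "/opt/compiler/lib:/opt/compiler/lib64"  # e.g. SPACK_COMPILER_EXTRA_RPATHS
complete_rpath_list = ";".join([";".join(spack_rpaths), to_cmake_list(extra)])
print(complete_rpath_list)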

View File

@@ -0,0 +1,89 @@
# Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import inspect
import llnl.util.filesystem as fs
import spack.builder
import spack.package_base
from spack.directives import build_system, depends_on
from spack.multimethod import when
from ._checks import BaseBuilder, execute_install_time_tests
class CargoPackage(spack.package_base.PackageBase):
"""Specialized class for packages built using a Makefiles."""
#: This attribute is used in UI queries that need to know the build
#: system base class
build_system_class = "CargoPackage"
build_system("cargo")
with when("build_system=cargo"):
depends_on("rust", type="build")
@spack.builder.builder("cargo")
class CargoBuilder(BaseBuilder):
"""The Cargo builder encodes the most common way of building software with
a Rust Cargo.toml file. It has two phases that can be overridden, if need be:
1. :py:meth:`~.CargoBuilder.build`
2. :py:meth:`~.CargoBuilder.install`
For a finer tuning you may override:
+-----------------------------------------------+----------------------+
| **Method** | **Purpose** |
+===============================================+======================+
| :py:meth:`~.CargoBuilder.build_args` | Specify arguments |
| | to ``cargo install`` |
+-----------------------------------------------+----------------------+
| :py:meth:`~.CargoBuilder.check_args` | Specify arguments |
| | to ``cargo test`` |
+-----------------------------------------------+----------------------+
"""
phases = ("build", "install")
#: Callback names for install-time test
install_time_test_callbacks = ["check"]
@property
def build_directory(self):
"""Return the directory containing the main Cargo.toml."""
return self.pkg.stage.source_path
@property
def build_args(self):
"""Arguments for ``cargo build``."""
return []
@property
def check_args(self):
"""Argument for ``cargo test`` during check phase"""
return []
def build(self, pkg, spec, prefix):
"""Runs ``cargo install`` in the source directory"""
with fs.working_dir(self.build_directory):
inspect.getmodule(pkg).cargo(
"install", "--root", "out", "--path", ".", *self.build_args
)
def install(self, pkg, spec, prefix):
"""Copy build files into package prefix."""
with fs.working_dir(self.build_directory):
fs.install_tree("out", prefix)
spack.builder.run_after("install")(execute_install_time_tests)
def check(self):
"""Run "cargo test"."""
with fs.working_dir(self.build_directory):
inspect.getmodule(self.pkg).cargo("test", *self.check_args)
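For context, a hypothetical recipe using the new CargoPackage base class; the package name, URL, and checksum are placeholders, and the example assumes CargoPackage is re-exported from spack.package like the other build-system base classes:

from spack.package import *

class MyRustTool(CargoPackage):
    """Placeholder recipe for a Rust CLI tool (not a real package)."""

    homepage = "https://example.org/my-rust-tool"
    url = "https://example.org/my-rust-tool-1.0.0.tar.gz"

    version("1.0.0", sha256="0000000000000000000000000000000000000000000000000000000000000000")

    # Nothing else is required: CargoBuilder runs
    # `cargo install --root out --path .` during build() and copies the `out`
    # tree into the prefix during install(); build_args/check_args (see the
    # table above) are the hooks for extra cargo arguments.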

View File

@@ -136,12 +136,12 @@ def cuda_flags(arch_list):
conflicts("%gcc@11:", when="+cuda ^cuda@:11.4.0")
conflicts("%gcc@11.2:", when="+cuda ^cuda@:11.5")
conflicts("%gcc@12:", when="+cuda ^cuda@:11.8")
conflicts("%gcc@13:", when="+cuda ^cuda@:12.1")
conflicts("%gcc@13:", when="+cuda ^cuda@:12.3")
conflicts("%clang@12:", when="+cuda ^cuda@:11.4.0")
conflicts("%clang@13:", when="+cuda ^cuda@:11.5")
conflicts("%clang@14:", when="+cuda ^cuda@:11.7")
conflicts("%clang@15:", when="+cuda ^cuda@:12.0")
conflicts("%clang@16:", when="+cuda ^cuda@:12.1")
conflicts("%clang@16:", when="+cuda ^cuda@:12.3")
# https://gist.github.com/ax3l/9489132#gistcomment-3860114
conflicts("%gcc@10", when="+cuda ^cuda@:11.4.0")

View File

@@ -0,0 +1,98 @@
# Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import inspect
import llnl.util.filesystem as fs
import spack.builder
import spack.package_base
from spack.directives import build_system, extends
from spack.multimethod import when
from ._checks import BaseBuilder, execute_install_time_tests
class GoPackage(spack.package_base.PackageBase):
"""Specialized class for packages built using the Go toolchain."""
#: This attribute is used in UI queries that need to know the build
#: system base class
build_system_class = "GoPackage"
#: Legacy buildsystem attribute used to deserialize and install old specs
legacy_buildsystem = "go"
build_system("go")
with when("build_system=go"):
# TODO: this seems like it should be depends_on, see
# setup_dependent_build_environment in go for why I kept it like this
extends("go@1.14:", type="build")
@spack.builder.builder("go")
class GoBuilder(BaseBuilder):
"""The Go builder encodes the most common way of building software with
a golang go.mod file. It has two phases that can be overridden, if need be:
1. :py:meth:`~.GoBuilder.build`
2. :py:meth:`~.GoBuilder.install`
For a finer tuning you may override:
+-----------------------------------------------+--------------------+
| **Method** | **Purpose** |
+===============================================+====================+
| :py:meth:`~.GoBuilder.build_args` | Specify arguments |
| | to ``go build`` |
+-----------------------------------------------+--------------------+
| :py:meth:`~.GoBuilder.check_args` | Specify arguments |
| | to ``go test`` |
+-----------------------------------------------+--------------------+
"""
phases = ("build", "install")
#: Callback names for install-time test
install_time_test_callbacks = ["check"]
def setup_build_environment(self, env):
env.set("GO111MODULE", "on")
env.set("GOTOOLCHAIN", "local")
@property
def build_directory(self):
"""Return the directory containing the main go.mod."""
return self.pkg.stage.source_path
@property
def build_args(self):
"""Arguments for ``go build``."""
# Pass -ldflags "-s -w" by default to omit the symbol table and DWARF debug info
return ["-ldflags", "-s -w", "-o", f"{self.pkg.name}"]
@property
def check_args(self):
"""Argument for ``go test`` during check phase"""
return []
def build(self, pkg, spec, prefix):
"""Runs ``go build`` in the source directory"""
with fs.working_dir(self.build_directory):
inspect.getmodule(pkg).go("build", *self.build_args)
def install(self, pkg, spec, prefix):
"""Install built binaries into prefix bin."""
with fs.working_dir(self.build_directory):
fs.mkdirp(prefix.bin)
fs.install(pkg.name, prefix.bin)
spack.builder.run_after("install")(execute_install_time_tests)
def check(self):
"""Run ``go test .`` in the source directory"""
with fs.working_dir(self.build_directory):
inspect.getmodule(self.pkg).go("test", *self.check_args)
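And an equally hypothetical recipe for the new GoPackage base class (placeholder name, URL, and checksum; again assuming GoPackage is re-exported from spack.package):

from spack.package import *

class MyGoTool(GoPackage):
    """Placeholder recipe for a Go module with its main package at the repo root."""

    homepage = "https://example.org/my-go-tool"
    url = "https://example.org/my-go-tool-1.0.0.tar.gz"

    version("1.0.0", sha256="0000000000000000000000000000000000000000000000000000000000000000")

    # GoBuilder runs `go build -ldflags "-s -w" -o <name>` by default and
    # installs the resulting binary into prefix.bin, so nothing else is needed
    # for simple tools; override build_args/check_args for extra flags.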

View File

@@ -7,13 +7,12 @@
import os
import platform
import shutil
from os.path import basename, dirname, isdir
from os.path import basename, isdir
from llnl.util.filesystem import find_headers, find_libraries, join_path
from llnl.util.filesystem import HeaderList, find_libraries, join_path, mkdirp
from llnl.util.link_tree import LinkTree
from spack.directives import conflicts, variant
from spack.package import mkdirp
from spack.util.environment import EnvironmentModifications
from spack.util.executable import Executable
@@ -56,10 +55,21 @@ def component_dir(self):
"""Subdirectory for this component in the install prefix."""
raise NotImplementedError
@property
def v2_layout_versions(self):
"""Version that implements the v2 directory layout."""
raise NotImplementedError
@property
def v2_layout(self):
"""Returns true if this version implements the v2 directory layout."""
return self.spec.satisfies(self.v2_layout_versions)
@property
def component_prefix(self):
"""Path to component <prefix>/<component>/<version>."""
return self.prefix.join(join_path(self.component_dir, self.spec.version))
v = self.spec.version.up_to(2) if self.v2_layout else self.spec.version
return self.prefix.join(self.component_dir).join(str(v))
@property
def env_script_args(self):
@@ -113,8 +123,9 @@ def install_component(self, installer_path):
shutil.rmtree("/var/intel/installercache", ignore_errors=True)
# Some installers have a bug and do not return an error code when failing
if not isdir(join_path(self.prefix, self.component_dir)):
raise RuntimeError("install failed")
install_dir = self.component_prefix
if not isdir(install_dir):
raise RuntimeError("install failed to directory: {0}".format(install_dir))
def setup_run_environment(self, env):
"""Adds environment variables to the generated module file.
@@ -129,7 +140,7 @@ def setup_run_environment(self, env):
if "~envmods" not in self.spec:
env.extend(
EnvironmentModifications.from_sourcing_file(
join_path(self.component_prefix, "env", "vars.sh"), *self.env_script_args
self.component_prefix.env.join("vars.sh"), *self.env_script_args
)
)
@@ -168,16 +179,40 @@ class IntelOneApiLibraryPackage(IntelOneApiPackage):
"""
def header_directories(self, dirs):
h = HeaderList([])
h.directories = dirs
return h
@property
def headers(self):
include_path = join_path(self.component_prefix, "include")
return find_headers("*", include_path, recursive=True)
return self.header_directories(
[self.component_prefix.include, self.component_prefix.include.join(self.component_dir)]
)
@property
def libs(self):
lib_path = join_path(self.component_prefix, "lib", "intel64")
lib_path = lib_path if isdir(lib_path) else dirname(lib_path)
return find_libraries("*", root=lib_path, shared=True, recursive=True)
# for the v2 layout all libraries are at the top level; v1 sometimes put them in intel64
return find_libraries("*", root=self.component_prefix.lib, recursive=not self.v2_layout)
class IntelOneApiLibraryPackageWithSdk(IntelOneApiPackage):
"""Base class for Intel oneAPI library packages with SDK components.
Contains some convenient default implementations for libraries
that expose functionality in sdk subdirectories.
Implement the method directly in the package if something
different is needed.
"""
@property
def headers(self):
return self.header_directories([self.component_prefix.sdk.include])
@property
def libs(self):
return find_libraries("*", self.component_prefix.sdk.lib64)
class IntelOneApiStaticLibraryList:
@@ -212,3 +247,7 @@ def link_flags(self):
@property
def ld_flags(self):
return "{0} {1}".format(self.search_flags, self.link_flags)
#: Tuple of Intel math libraries, exported to packages
INTEL_MATH_LIBRARIES = ("intel-mkl", "intel-oneapi-mkl", "intel-parallel-studio")
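A hedged sketch of what a component package is expected to supply for the new layout detection; the class name, component directory, and the "@2024:" boundary below are illustrative assumptions, not taken from a real oneAPI package:

from spack.build_systems.oneapi import IntelOneApiLibraryPackage

class ExampleOneapiComponent(IntelOneApiLibraryPackage):
    @property
    def component_dir(self):
        return "examplelib"

    @property
    def v2_layout_versions(self):
        # Spec constraint matched by v2_layout via self.spec.satisfies(...).
        return "@2024:"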

View File

@@ -10,13 +10,12 @@
import spack.builder
import spack.package_base
from spack.directives import build_system, extends
from spack.package_base import PackageBase
from spack.util.executable import Executable
from ._checks import BaseBuilder, execute_build_time_tests
class PerlPackage(PackageBase):
class PerlPackage(spack.package_base.PackageBase):
"""Specialized class for packages that are built using Perl."""
#: This attribute is used in UI queries that need to know the build
@@ -61,6 +60,30 @@ class PerlBuilder(BaseBuilder):
#: Callback names for build-time test
build_time_test_callbacks = ["check"]
@property
def build_method(self):
"""Searches the package for either a Makefile.PL or Build.PL.
Raises:
RuntimeError: if neither Makefile.PL nor Build.PL exist
"""
if os.path.isfile("Makefile.PL"):
build_method = "Makefile.PL"
elif os.path.isfile("Build.PL"):
build_method = "Build.PL"
else:
raise RuntimeError("Unknown build_method for perl package")
return build_method
@property
def build_executable(self):
"""Returns the executable method to build the perl package"""
if self.build_method == "Makefile.PL":
build_executable = inspect.getmodule(self.pkg).make
elif self.build_method == "Build.PL":
build_executable = Executable(os.path.join(self.pkg.stage.source_path, "Build"))
return build_executable
def configure_args(self):
"""List of arguments passed to :py:meth:`~.PerlBuilder.configure`.
@@ -73,19 +96,7 @@ def configure(self, pkg, spec, prefix):
"""Run Makefile.PL or Build.PL with arguments consisting of
an appropriate installation base directory followed by the
list returned by :py:meth:`~.PerlBuilder.configure_args`.
Raises:
RuntimeError: if neither Makefile.PL nor Build.PL exist
"""
if os.path.isfile("Makefile.PL"):
self.build_method = "Makefile.PL"
self.build_executable = inspect.getmodule(self.pkg).make
elif os.path.isfile("Build.PL"):
self.build_method = "Build.PL"
self.build_executable = Executable(os.path.join(self.pkg.stage.source_path, "Build"))
else:
raise RuntimeError("Unknown build_method for perl package")
if self.build_method == "Makefile.PL":
options = ["Makefile.PL", "INSTALL_BASE={0}".format(prefix)]
elif self.build_method == "Build.PL":

View File

@@ -6,13 +6,14 @@
import os
import re
import shutil
from typing import Optional
from typing import Iterable, List, Mapping, Optional
import archspec
import llnl.util.filesystem as fs
import llnl.util.lang as lang
import llnl.util.tty as tty
from llnl.util.filesystem import HeaderList, LibraryList
import spack.builder
import spack.config
@@ -25,14 +26,18 @@
from spack.directives import build_system, depends_on, extends, maintainers
from spack.error import NoHeadersError, NoLibrariesError
from spack.install_test import test_part
from spack.spec import Spec
from spack.util.prefix import Prefix
from ._checks import BaseBuilder, execute_install_time_tests
def _flatten_dict(dictionary):
def _flatten_dict(dictionary: Mapping[str, object]) -> Iterable[str]:
"""Iterable that yields KEY=VALUE paths through a dictionary.
Args:
dictionary: Possibly nested dictionary of arbitrary keys and values.
Yields:
A single path through the dictionary.
"""
@@ -50,7 +55,7 @@ class PythonExtension(spack.package_base.PackageBase):
maintainers("adamjstewart")
@property
def import_modules(self):
def import_modules(self) -> Iterable[str]:
"""Names of modules that the Python package provides.
These are used to test whether or not the installation succeeded.
@@ -65,7 +70,7 @@ def import_modules(self):
detected, this property can be overridden by the package.
Returns:
list: list of strings of module names
List of strings of module names.
"""
modules = []
pkg = self.spec["python"].package
@@ -102,14 +107,14 @@ def import_modules(self):
return modules
@property
def skip_modules(self):
def skip_modules(self) -> Iterable[str]:
"""Names of modules that should be skipped when running tests.
These are a subset of import_modules. If a module has submodules,
they are skipped as well (meaning a.b is skipped if a is contained).
Returns:
list: list of strings of module names
List of strings of module names.
"""
return []
@@ -185,12 +190,12 @@ def remove_files_from_view(self, view, merge_map):
view.remove_files(to_remove)
def test_imports(self):
def test_imports(self) -> None:
"""Attempts to import modules of the installed package."""
# Make sure we are importing the installed modules,
# not the ones in the source directory
python = inspect.getmodule(self).python
python = inspect.getmodule(self).python # type: ignore[union-attr]
for module in self.import_modules:
with test_part(
self,
@@ -315,24 +320,27 @@ class PythonPackage(PythonExtension):
py_namespace: Optional[str] = None
@lang.classproperty
def homepage(cls):
def homepage(cls) -> Optional[str]: # type: ignore[override]
if cls.pypi:
name = cls.pypi.split("/")[0]
return "https://pypi.org/project/" + name + "/"
return f"https://pypi.org/project/{name}/"
return None
@lang.classproperty
def url(cls):
def url(cls) -> Optional[str]:
if cls.pypi:
return "https://files.pythonhosted.org/packages/source/" + cls.pypi[0] + "/" + cls.pypi
return f"https://files.pythonhosted.org/packages/source/{cls.pypi[0]}/{cls.pypi}"
return None
@lang.classproperty
def list_url(cls):
def list_url(cls) -> Optional[str]: # type: ignore[override]
if cls.pypi:
name = cls.pypi.split("/")[0]
return "https://pypi.org/simple/" + name + "/"
return f"https://pypi.org/simple/{name}/"
return None
@property
def headers(self):
def headers(self) -> HeaderList:
"""Discover header files in platlib."""
# Remove py- prefix in package name
@@ -350,7 +358,7 @@ def headers(self):
raise NoHeadersError(msg.format(self.spec.name, include, platlib))
@property
def libs(self):
def libs(self) -> LibraryList:
"""Discover libraries in platlib."""
# Remove py- prefix in package name
@@ -384,7 +392,7 @@ class PythonPipBuilder(BaseBuilder):
install_time_test_callbacks = ["test"]
@staticmethod
def std_args(cls):
def std_args(cls) -> List[str]:
return [
# Verbose
"-vvv",
@@ -409,7 +417,7 @@ def std_args(cls):
]
@property
def build_directory(self):
def build_directory(self) -> str:
"""The root directory of the Python package.
This is usually the directory containing one of the following files:
@@ -420,51 +428,51 @@ def build_directory(self):
"""
return self.pkg.stage.source_path
def config_settings(self, spec, prefix):
def config_settings(self, spec: Spec, prefix: Prefix) -> Mapping[str, object]:
"""Configuration settings to be passed to the PEP 517 build backend.
Requires pip 22.1 or newer for keys that appear only a single time,
or pip 23.1 or newer if the same key appears multiple times.
Args:
spec (spack.spec.Spec): build spec
prefix (spack.util.prefix.Prefix): installation prefix
spec: Build spec.
prefix: Installation prefix.
Returns:
dict: Possibly nested dictionary of KEY, VALUE settings
Possibly nested dictionary of KEY, VALUE settings.
"""
return {}
def install_options(self, spec, prefix):
def install_options(self, spec: Spec, prefix: Prefix) -> Iterable[str]:
"""Extra arguments to be supplied to the setup.py install command.
Requires pip 23.0 or older.
Args:
spec (spack.spec.Spec): build spec
prefix (spack.util.prefix.Prefix): installation prefix
spec: Build spec.
prefix: Installation prefix.
Returns:
list: list of options
List of options.
"""
return []
def global_options(self, spec, prefix):
def global_options(self, spec: Spec, prefix: Prefix) -> Iterable[str]:
"""Extra global options to be supplied to the setup.py call before the install
or bdist_wheel command.
Deprecated in pip 23.1.
Args:
spec (spack.spec.Spec): build spec
prefix (spack.util.prefix.Prefix): installation prefix
spec: Build spec.
prefix: Installation prefix.
Returns:
list: list of options
List of options.
"""
return []
def install(self, pkg, spec, prefix):
def install(self, pkg: PythonPackage, spec: Spec, prefix: Prefix) -> None:
"""Install everything from build directory."""
args = PythonPipBuilder.std_args(pkg) + [f"--prefix={prefix}"]

View File

@@ -108,6 +108,8 @@ class ROCmPackage(PackageBase):
"gfx90a:xnack+",
"gfx90c",
"gfx940",
"gfx941",
"gfx942",
"gfx1010",
"gfx1011",
"gfx1012",
@@ -168,6 +170,8 @@ def hip_flags(amdgpu_target):
depends_on("llvm-amdgpu@4.3.0:", when="amdgpu_target=gfx90a:xnack-")
depends_on("llvm-amdgpu@4.3.0:", when="amdgpu_target=gfx90a:xnack+")
depends_on("llvm-amdgpu@5.2.0:", when="amdgpu_target=gfx940")
depends_on("llvm-amdgpu@5.7.0:", when="amdgpu_target=gfx941")
depends_on("llvm-amdgpu@5.7.0:", when="amdgpu_target=gfx942")
depends_on("llvm-amdgpu@4.5.0:", when="amdgpu_target=gfx1013")
depends_on("llvm-amdgpu@3.8.0:", when="amdgpu_target=gfx1030")
depends_on("llvm-amdgpu@3.9.0:", when="amdgpu_target=gfx1031")

View File

@@ -46,7 +46,22 @@
from spack.reporters import CDash, CDashConfiguration
from spack.reporters.cdash import build_stamp as cdash_build_stamp
JOB_RETRY_CONDITIONS = ["always"]
# See https://docs.gitlab.com/ee/ci/yaml/#retry for descriptions of conditions
JOB_RETRY_CONDITIONS = [
# "always",
"unknown_failure",
"script_failure",
"api_failure",
"stuck_or_timeout_failure",
"runner_system_failure",
"runner_unsupported",
"stale_schedule",
# "job_execution_timeout",
"archived_failure",
"unmet_prerequisites",
"scheduler_failure",
"data_integrity_failure",
]
TEMP_STORAGE_MIRROR_NAME = "ci_temporary_mirror"
SPACK_RESERVED_TAGS = ["public", "protected", "notary"]
@@ -1238,6 +1253,7 @@ def main_script_replacements(cmd):
op=lambda cmd: cmd.replace("mirror_prefix", temp_storage_url_prefix),
)
cleanup_job["dependencies"] = []
output_object["cleanup"] = cleanup_job
if (
@@ -1261,6 +1277,7 @@ def main_script_replacements(cmd):
if buildcache_destination
else remote_mirror_override or remote_mirror_url
)
signing_job["dependencies"] = []
output_object["sign-pkgs"] = signing_job
@@ -1281,6 +1298,7 @@ def main_script_replacements(cmd):
final_job["when"] = "always"
final_job["retry"] = service_job_retries
final_job["interruptible"] = True
final_job["dependencies"] = []
output_object["rebuild-index"] = final_job

View File

@@ -6,10 +6,8 @@
import argparse
import os
import re
import shlex
import sys
from textwrap import dedent
from typing import List, Match, Tuple
from typing import List, Union
import llnl.string
import llnl.util.tty as tty
@@ -147,89 +145,37 @@ def get_command(cmd_name):
return getattr(get_module(cmd_name), pname)
class _UnquotedFlags:
"""Use a heuristic in `.extract()` to detect whether the user is trying to set
multiple flags like the docker ENV attribute allows (e.g. 'cflags=-Os -pipe').
def quote_kvp(string: str) -> str:
"""For strings like ``name=value`` or ``name==value``, quote and escape the value if needed.
If the heuristic finds a match (which can be checked with `__bool__()`), a warning
message explaining how to quote multiple flags correctly can be generated with
`.report()`.
This is a compromise to respect quoting of key-value pairs on the CLI. The shell
strips quotes from quoted arguments, so we cannot know *exactly* how CLI arguments
were quoted. To compensate, we re-add quotes around anything starting with ``name=``
or ``name==``, and we assume the rest of the argument is the value. This covers the
common cases of passing flags, e.g., ``cflags="-O2 -g"`` on the command line.
"""
match = spack.parser.SPLIT_KVP.match(string)
if not match:
return string
flags_arg_pattern = re.compile(
r'^({0})=([^\'"].*)$'.format("|".join(spack.spec.FlagMap.valid_compiler_flags()))
)
def __init__(self, all_unquoted_flag_pairs: List[Tuple[Match[str], str]]):
self._flag_pairs = all_unquoted_flag_pairs
def __bool__(self) -> bool:
return bool(self._flag_pairs)
@classmethod
def extract(cls, sargs: str) -> "_UnquotedFlags":
all_unquoted_flag_pairs: List[Tuple[Match[str], str]] = []
prev_flags_arg = None
for arg in shlex.split(sargs):
if prev_flags_arg is not None:
all_unquoted_flag_pairs.append((prev_flags_arg, arg))
prev_flags_arg = cls.flags_arg_pattern.match(arg)
return cls(all_unquoted_flag_pairs)
def report(self) -> str:
single_errors = [
"({0}) {1} {2} => {3}".format(
i + 1,
match.group(0),
next_arg,
'{0}="{1} {2}"'.format(match.group(1), match.group(2), next_arg),
)
for i, (match, next_arg) in enumerate(self._flag_pairs)
]
return dedent(
"""\
Some compiler or linker flags were provided without quoting their arguments,
which now causes spack to try to parse the *next* argument as a spec component
such as a variant instead of an additional compiler or linker flag. If the
intent was to set multiple flags, try quoting them together as described below.
Possible flag quotation errors (with the correctly-quoted version after the =>):
{0}"""
).format("\n".join(single_errors))
key, delim, value = match.groups()
return f"{key}{delim}{spack.parser.quote_if_needed(value)}"
def parse_specs(args, **kwargs):
def parse_specs(
args: Union[str, List[str]], concretize: bool = False, tests: bool = False
) -> List[spack.spec.Spec]:
"""Convenience function for parsing arguments from specs. Handles common
exceptions and dies if there are errors.
"""
concretize = kwargs.get("concretize", False)
normalize = kwargs.get("normalize", False)
tests = kwargs.get("tests", False)
args = [args] if isinstance(args, str) else args
arg_string = " ".join([quote_kvp(arg) for arg in args])
sargs = args
if not isinstance(args, str):
sargs = " ".join(args)
unquoted_flags = _UnquotedFlags.extract(sargs)
try:
specs = spack.parser.parse(sargs)
for spec in specs:
if concretize:
spec.concretize(tests=tests) # implies normalize
elif normalize:
spec.normalize(tests=tests)
return specs
except spack.error.SpecError as e:
msg = e.message
if e.long_message:
msg += e.long_message
# Unquoted flags will be read as a variant or hash
if unquoted_flags and ("variant" in msg or "hash" in msg):
msg += "\n\n"
msg += unquoted_flags.report()
raise spack.error.SpackError(msg) from e
specs = spack.parser.parse(arg_string)
for spec in specs:
if concretize:
spec.concretize(tests=tests)
return specs
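A hedged illustration of what quote_kvp does to key-value arguments before they are re-joined and parsed; the exact quoting produced by spack.parser.quote_if_needed is an implementation detail, the point is that a value containing spaces survives the round trip:

import spack.cmd

# argv as received after the shell has already stripped the user's quotes:
argv = ["zlib", "cflags=-O2 -g"]

# quote_kvp() re-quotes the value of name=value / name==value tokens so that
# " ".join(...) followed by spack.parser.parse() keeps both flags together.
print([spack.cmd.quote_kvp(arg) for arg in argv])
# e.g. ['zlib', 'cflags="-O2 -g"']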
def matching_spec_from_env(spec):

View File

@@ -6,7 +6,7 @@
import llnl.util.tty as tty
import spack.cmd
import spack.cmd.common.arguments as arguments
from spack.cmd.common import arguments
description = "add a spec to an environment"
section = "environments"

View File

@@ -2,6 +2,8 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import warnings
import llnl.util.tty as tty
import llnl.util.tty.colify
import llnl.util.tty.color as cl
@@ -52,8 +54,10 @@ def setup_parser(subparser):
def configs(parser, args):
reports = spack.audit.run_group(args.subcommand)
_process_reports(reports)
with warnings.catch_warnings():
warnings.simplefilter("ignore")
reports = spack.audit.run_group(args.subcommand)
_process_reports(reports)
def packages(parser, args):

View File

@@ -15,13 +15,13 @@
import spack.bootstrap
import spack.bootstrap.config
import spack.bootstrap.core
import spack.cmd.common.arguments
import spack.config
import spack.main
import spack.mirror
import spack.spec
import spack.stage
import spack.util.path
from spack.cmd.common import arguments
description = "manage bootstrap configuration"
section = "system"
@@ -68,12 +68,8 @@
def _add_scope_option(parser):
scopes = spack.config.scopes()
parser.add_argument(
"--scope",
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
help="configuration scope to read/modify",
"--scope", action=arguments.ConfigScope, help="configuration scope to read/modify"
)
@@ -106,7 +102,7 @@ def setup_parser(subparser):
disable.add_argument("name", help="name of the source to be disabled", nargs="?", default=None)
reset = sp.add_parser("reset", help="reset bootstrapping configuration to Spack defaults")
spack.cmd.common.arguments.add_common_arguments(reset, ["yes_to_all"])
arguments.add_common_arguments(reset, ["yes_to_all"])
root = sp.add_parser("root", help="get/set the root bootstrap directory")
_add_scope_option(root)

View File

@@ -21,7 +21,6 @@
import spack.binary_distribution as bindist
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.config
import spack.environment as ev
import spack.error
@@ -38,8 +37,10 @@
import spack.util.crypto
import spack.util.url as url_util
import spack.util.web as web_util
from spack import traverse
from spack.build_environment import determine_number_of_jobs
from spack.cmd import display_specs
from spack.cmd.common import arguments
from spack.oci.image import (
Digest,
ImageReference,
@@ -76,7 +77,19 @@ def setup_parser(subparser: argparse.ArgumentParser):
)
push_sign = push.add_mutually_exclusive_group(required=False)
push_sign.add_argument(
"--unsigned", "-u", action="store_true", help="push unsigned buildcache tarballs"
"--unsigned",
"-u",
action="store_false",
dest="signed",
default=None,
help="push unsigned buildcache tarballs",
)
push_sign.add_argument(
"--signed",
action="store_true",
dest="signed",
default=None,
help="push signed buildcache tarballs",
)
push_sign.add_argument(
"--key", "-k", metavar="key", type=str, default=None, help="key for signing"
@@ -110,7 +123,14 @@ def setup_parser(subparser: argparse.ArgumentParser):
help="stop pushing on first failure (default is best effort)",
)
push.add_argument(
"--base-image", default=None, help="specify the base image for the buildcache. "
"--base-image", default=None, help="specify the base image for the buildcache"
)
push.add_argument(
"--tag",
"-t",
default=None,
help="when pushing to an OCI registry, tag an image containing all root specs and their "
"runtime dependencies",
)
arguments.add_common_arguments(push, ["specs", "jobs"])
push.set_defaults(func=push_fn)
@@ -182,23 +202,22 @@ def setup_parser(subparser: argparse.ArgumentParser):
)
# used to construct scope arguments below
scopes = spack.config.scopes()
check.add_argument(
"--scope",
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_modify_scope(),
action=arguments.ConfigScope,
default=lambda: spack.config.default_modify_scope(),
help="configuration scope containing mirrors to check",
)
check_spec_or_specfile = check.add_mutually_exclusive_group(required=True)
check_spec_or_specfile.add_argument(
# Unfortunately there are 3 ways to do the same thing here:
check_specs = check.add_mutually_exclusive_group()
check_specs.add_argument(
"-s", "--spec", help="check single spec instead of release specs file"
)
check_spec_or_specfile.add_argument(
check_specs.add_argument(
"--spec-file",
help="check single spec from json or yaml file instead of release specs file",
)
arguments.add_common_arguments(check, ["specs"])
check.set_defaults(func=check_fn)
@@ -320,26 +339,36 @@ def push_fn(args):
)
if args.specs or args.spec_file:
specs = _matching_specs(spack.cmd.parse_specs(args.specs or args.spec_file))
roots = _matching_specs(spack.cmd.parse_specs(args.specs or args.spec_file))
else:
specs = spack.cmd.require_active_env("buildcache push").all_specs()
roots = spack.cmd.require_active_env(cmd_name="buildcache push").concrete_roots()
if args.allow_root:
tty.warn(
"The flag `--allow-root` is the default in Spack 0.21, will be removed in Spack 0.22"
)
mirror: spack.mirror.Mirror = args.mirror
# Check if this is an OCI image.
try:
image_ref = spack.oci.oci.image_from_mirror(args.mirror)
target_image = spack.oci.oci.image_from_mirror(mirror)
except ValueError:
image_ref = None
target_image = None
push_url = mirror.push_url
# When none of --signed, --unsigned, or --key is specified, use the mirror's default.
if args.signed is None and not args.key:
unsigned = not mirror.signed
else:
unsigned = not (args.key or args.signed)
# For OCI images, we require dependencies to be pushed for now.
if image_ref:
if target_image:
if "dependencies" not in args.things_to_install:
tty.die("Dependencies must be pushed for OCI images.")
if not args.unsigned:
if not unsigned:
tty.warn(
"Code signing is currently not supported for OCI images. "
"Use --unsigned to silence this warning."
@@ -347,26 +376,48 @@ def push_fn(args):
# This is a list of installed, non-external specs.
specs = bindist.specs_to_be_packaged(
specs,
roots,
root="package" in args.things_to_install,
dependencies="dependencies" in args.things_to_install,
)
url = args.mirror.push_url
# When pushing multiple specs, print the url once ahead of time, as well as how
# many specs are being pushed.
if len(specs) > 1:
tty.info(f"Selected {len(specs)} specs to push to {url}")
tty.info(f"Selected {len(specs)} specs to push to {push_url}")
failed = []
# TODO: unify this logic in the future.
if image_ref:
if target_image:
base_image = ImageReference.from_string(args.base_image) if args.base_image else None
with tempfile.TemporaryDirectory(
dir=spack.stage.get_stage_root()
) as tmpdir, _make_pool() as pool:
skipped = _push_oci(args, image_ref, specs, tmpdir, pool)
skipped, base_images, checksums = _push_oci(
target_image=target_image,
base_image=base_image,
installed_specs_with_deps=specs,
force=args.force,
tmpdir=tmpdir,
pool=pool,
)
# Apart from creating manifests for each individual spec, we allow users to create a
# separate image tag for all root specs and their runtime dependencies.
if args.tag:
tagged_image = target_image.with_tag(args.tag)
# _push_oci may not populate base_images if binaries were already in the registry
for spec in roots:
_update_base_images(
base_image=base_image,
target_image=target_image,
spec=spec,
base_image_cache=base_images,
)
_put_manifest(base_images, checksums, tagged_image, tmpdir, None, None, *roots)
tty.info(f"Tagged {tagged_image}")
else:
skipped = []
@@ -374,10 +425,10 @@ def push_fn(args):
try:
bindist.push_or_raise(
spec,
url,
push_url,
bindist.PushOptions(
force=args.force,
unsigned=args.unsigned,
unsigned=unsigned,
key=args.key,
regenerate_index=args.update_index,
),
@@ -385,7 +436,7 @@ def push_fn(args):
msg = f"{_progress(i, len(specs))}Pushed {_format_spec(spec)}"
if len(specs) == 1:
msg += f" to {url}"
msg += f" to {push_url}"
tty.info(msg)
except bindist.NoOverwriteException:
@@ -427,11 +478,11 @@ def push_fn(args):
# Update the index if requested
# TODO: move the update-index logic out of bindist; it should run once after all
# specs are pushed, not once per spec.
if image_ref and len(skipped) < len(specs) and args.update_index:
if target_image and len(skipped) < len(specs) and args.update_index:
with tempfile.TemporaryDirectory(
dir=spack.stage.get_stage_root()
) as tmpdir, _make_pool() as pool:
_update_index_oci(image_ref, tmpdir, pool)
_update_index_oci(target_image, tmpdir, pool)
def _get_spack_binary_blob(image_ref: ImageReference) -> Optional[spack.oci.oci.Blob]:
@@ -497,17 +548,21 @@ def _archspec_to_gooarch(spec: spack.spec.Spec) -> str:
def _put_manifest(
base_images: Dict[str, Tuple[dict, dict]],
checksums: Dict[str, spack.oci.oci.Blob],
spec: spack.spec.Spec,
image_ref: ImageReference,
tmpdir: str,
extra_config: Optional[dict],
annotations: Optional[dict],
*specs: spack.spec.Spec,
):
architecture = _archspec_to_gooarch(spec)
architecture = _archspec_to_gooarch(specs[0])
dependencies = list(
reversed(
list(
s
for s in spec.traverse(order="topo", deptype=("link", "run"), root=True)
for s in traverse.traverse_nodes(
specs, order="topo", deptype=("link", "run"), root=True
)
if not s.external
)
)
@@ -516,7 +571,7 @@ def _put_manifest(
base_manifest, base_config = base_images[architecture]
env = _retrieve_env_dict_from_config(base_config)
spack.user_environment.environment_modifications_for_specs(spec).apply_modifications(env)
spack.user_environment.environment_modifications_for_specs(*specs).apply_modifications(env)
# Create an oci.image.config file
config = copy.deepcopy(base_config)
@@ -528,20 +583,14 @@ def _put_manifest(
# Set the environment variables
config["config"]["Env"] = [f"{k}={v}" for k, v in env.items()]
# From the OCI v1.0 spec:
# > Any extra fields in the Image JSON struct are considered implementation
# > specific and MUST be ignored by any implementations which are unable to
# > interpret them.
# We use this to store the Spack spec, so we can use it to create an index.
spec_dict = spec.to_dict(hash=ht.dag_hash)
spec_dict["buildcache_layout_version"] = 1
spec_dict["binary_cache_checksum"] = {
"hash_algorithm": "sha256",
"hash": checksums[spec.dag_hash()].compressed_digest.digest,
}
config.update(spec_dict)
if extra_config:
# From the OCI v1.0 spec:
# > Any extra fields in the Image JSON struct are considered implementation
# > specific and MUST be ignored by any implementations which are unable to
# > interpret them.
config.update(extra_config)
config_file = os.path.join(tmpdir, f"{spec.dag_hash()}.config.json")
config_file = os.path.join(tmpdir, f"{specs[0].dag_hash()}.config.json")
with open(config_file, "w") as f:
json.dump(config, f, separators=(",", ":"))
@@ -572,48 +621,69 @@ def _put_manifest(
for s in dependencies
),
],
"annotations": {"org.opencontainers.image.description": spec.format()},
}
image_ref_for_spec = image_ref.with_tag(default_tag(spec))
if annotations:
oci_manifest["annotations"] = annotations
# Finally upload the manifest
upload_manifest_with_retry(image_ref_for_spec, oci_manifest=oci_manifest)
upload_manifest_with_retry(image_ref, oci_manifest=oci_manifest)
# delete the config file
os.unlink(config_file)
return image_ref_for_spec
def _update_base_images(
*,
base_image: Optional[ImageReference],
target_image: ImageReference,
spec: spack.spec.Spec,
base_image_cache: Dict[str, Tuple[dict, dict]],
):
"""For a given spec and base image, copy the missing layers of the base image with matching
arch to the registry of the target image. If no base image is specified, create a dummy
manifest and config file."""
architecture = _archspec_to_gooarch(spec)
if architecture in base_image_cache:
return
if base_image is None:
base_image_cache[architecture] = (
default_manifest(),
default_config(architecture, "linux"),
)
else:
base_image_cache[architecture] = copy_missing_layers_with_retry(
base_image, target_image, architecture
)
def _push_oci(
args,
image_ref: ImageReference,
*,
target_image: ImageReference,
base_image: Optional[ImageReference],
installed_specs_with_deps: List[Spec],
tmpdir: str,
pool: multiprocessing.pool.Pool,
) -> List[str]:
force: bool = False,
) -> Tuple[List[str], Dict[str, Tuple[dict, dict]], Dict[str, spack.oci.oci.Blob]]:
"""Push specs to an OCI registry
Args:
args: The command line arguments.
image_ref: The image reference.
target_image: The target OCI image.
base_image: Optional base image, which will be copied to the target registry.
installed_specs_with_deps: The installed specs to push, excluding externals,
including deps, ordered from roots to leaves.
force: Whether to overwrite existing layers and manifests in the buildcache.
Returns:
List[str]: The list of skipped specs (already in the buildcache).
A tuple consisting of the list of skipped specs already in the build cache,
a dictionary mapping architectures to base image manifests and configs,
and a dictionary mapping each spec's dag hash to a blob.
"""
# Reverse the order
installed_specs_with_deps = list(reversed(installed_specs_with_deps))
# The base image to use for the package. When not set, we use
# the OCI registry only for storage, and do not use any base image.
base_image_ref: Optional[ImageReference] = (
ImageReference.from_string(args.base_image) if args.base_image else None
)
# Spec dag hash -> blob
checksums: Dict[str, spack.oci.oci.Blob] = {}
@@ -623,11 +693,11 @@ def _push_oci(
# Specs not uploaded because they already exist
skipped = []
if not args.force:
if not force:
tty.info("Checking for existing specs in the buildcache")
to_be_uploaded = []
tags_to_check = (image_ref.with_tag(default_tag(s)) for s in installed_specs_with_deps)
tags_to_check = (target_image.with_tag(default_tag(s)) for s in installed_specs_with_deps)
available_blobs = pool.map(_get_spack_binary_blob, tags_to_check)
for spec, maybe_blob in zip(installed_specs_with_deps, available_blobs):
@@ -640,46 +710,63 @@ def _push_oci(
to_be_uploaded = installed_specs_with_deps
if not to_be_uploaded:
return skipped
return skipped, base_images, checksums
tty.info(
f"{len(to_be_uploaded)} specs need to be pushed to {image_ref.domain}/{image_ref.name}"
f"{len(to_be_uploaded)} specs need to be pushed to "
f"{target_image.domain}/{target_image.name}"
)
# Upload blobs
new_blobs = pool.starmap(
_push_single_spack_binary_blob, ((image_ref, spec, tmpdir) for spec in to_be_uploaded)
_push_single_spack_binary_blob, ((target_image, spec, tmpdir) for spec in to_be_uploaded)
)
# And update the spec to blob mapping
for spec, blob in zip(to_be_uploaded, new_blobs):
checksums[spec.dag_hash()] = blob
# Copy base image layers, probably fine to do sequentially.
# Copy base images if necessary
for spec in to_be_uploaded:
architecture = _archspec_to_gooarch(spec)
# Get base image details, if we don't have them yet
if architecture in base_images:
continue
if base_image_ref is None:
base_images[architecture] = (default_manifest(), default_config(architecture, "linux"))
else:
base_images[architecture] = copy_missing_layers_with_retry(
base_image_ref, image_ref, architecture
)
_update_base_images(
base_image=base_image,
target_image=target_image,
spec=spec,
base_image_cache=base_images,
)
def extra_config(spec: Spec):
spec_dict = spec.to_dict(hash=ht.dag_hash)
spec_dict["buildcache_layout_version"] = 1
spec_dict["binary_cache_checksum"] = {
"hash_algorithm": "sha256",
"hash": checksums[spec.dag_hash()].compressed_digest.digest,
}
return spec_dict
# Upload manifests
tty.info("Uploading manifests")
pushed_image_ref = pool.starmap(
pool.starmap(
_put_manifest,
((base_images, checksums, spec, image_ref, tmpdir) for spec in to_be_uploaded),
(
(
base_images,
checksums,
target_image.with_tag(default_tag(spec)),
tmpdir,
extra_config(spec),
{"org.opencontainers.image.description": spec.format()},
spec,
)
for spec in to_be_uploaded
),
)
# Print the image names of the top-level specs
for spec, ref in zip(to_be_uploaded, pushed_image_ref):
tty.info(f"Pushed {_format_spec(spec)} to {ref}")
for spec in to_be_uploaded:
tty.info(f"Pushed {_format_spec(spec)} to {target_image.with_tag(default_tag(spec))}")
return skipped
return skipped, base_images, checksums
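For orientation, a hedged sketch of the per-spec configuration fragment built by the extra_config helper above and attached to each uploaded manifest. The spec payload and digest shown here are illustrative; only the two added keys come from the code:

# Rough shape of the extra configuration attached to each uploaded manifest.
spec_dict = {
    "spec": {"name": "zlib", "version": "1.3"},  # hypothetical spec payload
    "buildcache_layout_version": 1,
    "binary_cache_checksum": {"hash_algorithm": "sha256", "hash": "abc123..."},
}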
def _config_from_tag(image_ref: ImageReference, tag: str) -> Optional[dict]:
@@ -816,15 +903,24 @@ def check_fn(args: argparse.Namespace):
exit code is non-zero, then at least one of the indicated specs needs to be rebuilt
"""
if args.spec_file:
specs_arg = (
args.spec_file if os.path.sep in args.spec_file else os.path.join(".", args.spec_file)
)
tty.warn(
"The flag `--spec-file` is deprecated and will be removed in Spack 0.22. "
"Use --spec instead."
f"Use `spack buildcache check {specs_arg}` instead."
)
elif args.spec:
specs_arg = args.spec
tty.warn(
"The flag `--spec` is deprecated and will be removed in Spack 0.23. "
f"Use `spack buildcache check {specs_arg}` instead."
)
else:
specs_arg = args.specs
specs = spack.cmd.parse_specs(args.spec or args.spec_file)
if specs:
specs = _matching_specs(specs)
if specs_arg:
specs = _matching_specs(spack.cmd.parse_specs(specs_arg))
else:
specs = spack.cmd.require_active_env("buildcache check").all_specs()

View File

@@ -4,7 +4,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import spack.cmd
import spack.cmd.common.arguments as arguments
from spack.cmd.common import arguments
description = "change an existing spec in an environment"
section = "environments"

View File

@@ -16,6 +16,7 @@
import spack.cmd.buildcache as buildcache
import spack.config as cfg
import spack.environment as ev
import spack.environment.depfile
import spack.hash_types as ht
import spack.mirror
import spack.util.gpg as gpg_util
@@ -606,7 +607,9 @@ def ci_rebuild(args):
"SPACK_INSTALL_FLAGS={}".format(args_to_string(deps_install_args)),
"-j$(nproc)",
"install-deps/{}".format(
ev.depfile.MakefileSpec(job_spec).safe_format("{name}-{version}-{hash}")
spack.environment.depfile.MakefileSpec(job_spec).safe_format(
"{name}-{version}-{hash}"
)
),
],
spack_cmd + ["install"] + root_install_args,

View File

@@ -12,13 +12,13 @@
import spack.bootstrap
import spack.caches
import spack.cmd.common.arguments as arguments
import spack.cmd.test
import spack.config
import spack.repo
import spack.stage
import spack.store
import spack.util.path
from spack.cmd.common import arguments
from spack.paths import lib_path, var_path
description = "remove temporary build files and/or downloaded archives"

View File

@@ -67,12 +67,13 @@ class ConstraintAction(argparse.Action):
def __call__(self, parser, namespace, values, option_string=None):
# Query specs from command line
self.values = values
namespace.constraint = values
self.constraint = namespace.constraint = values
self.constraint_specs = namespace.constraint_specs = []
namespace.specs = self._specs
def _specs(self, **kwargs):
qspecs = spack.cmd.parse_specs(self.values)
# store parsed specs in spec.constraint after a call to specs()
self.constraint_specs[:] = spack.cmd.parse_specs(self.constraint)
# If an environment is provided, we'll restrict the search to
# only its installed packages.
@@ -81,12 +82,12 @@ def _specs(self, **kwargs):
kwargs["hashes"] = set(env.all_hashes())
# return everything for an empty query.
if not qspecs:
if not self.constraint_specs:
return spack.store.STORE.db.query(**kwargs)
# Return only matching stuff otherwise.
specs = {}
for spec in qspecs:
for spec in self.constraint_specs:
for s in spack.store.STORE.db.query(spec, **kwargs):
# This is fast for already-concrete specs
specs[s.dag_hash()] = s
@@ -124,6 +125,33 @@ def __call__(self, parser, namespace, values, option_string=None):
setattr(namespace, self.dest, deptype)
class ConfigScope(argparse.Action):
"""Pick the currently configured config scopes."""
def __init__(self, *args, **kwargs) -> None:
kwargs.setdefault("metavar", spack.config.SCOPES_METAVAR)
super().__init__(*args, **kwargs)
@property
def default(self):
return self._default() if callable(self._default) else self._default
@default.setter
def default(self, value):
self._default = value
@property
def choices(self):
return spack.config.scopes().keys()
@choices.setter
def choices(self, value):
pass
def __call__(self, parser, namespace, values, option_string=None):
setattr(namespace, self.dest, values)
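The ConfigScope action above defers both its default and its choices until argparse actually asks for them, so configuration scopes are computed at parse time rather than at module import. A stdlib-only sketch of the lazy-default half of that idea (LazyDefault is a hypothetical name; the real action also recomputes choices from spack.config.scopes()):

import argparse

class LazyDefault(argparse.Action):
    """Defer computing the default until argparse reads it."""

    @property
    def default(self):
        return self._default() if callable(self._default) else self._default

    @default.setter
    def default(self, value):
        self._default = value

    def __call__(self, parser, namespace, values, option_string=None):
        setattr(namespace, self.dest, values)

parser = argparse.ArgumentParser()
parser.add_argument("--scope", action=LazyDefault, default=lambda: "defaults")
print(parser.parse_args([]).scope)                   # "defaults", computed lazily
print(parser.parse_args(["--scope", "site"]).scope)  # "site"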
def _cdash_reporter(namespace):
"""Helper function to create a CDash reporter. This function gets an early reference to the
argparse namespace under construction, so it can later use it to create the object.
@@ -357,10 +385,11 @@ def install_status():
"--install-status",
action="store_true",
default=True,
help="show install status of packages\n\npackages can be: "
"installed [+], missing and needed by an installed package [-], "
"installed in an upstream instance [^], "
"or not installed (no annotation)",
help=(
"show install status of packages\n"
"[+] installed [^] installed in an upstream\n"
" - not installed [-] missing dep of installed package\n"
),
)

View File

@@ -0,0 +1,30 @@
# Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import sys
from typing import List
import llnl.util.tty as tty
import spack.cmd
display_args = {"long": True, "show_flags": False, "variants": False, "indent": 4}
def confirm_action(specs: List[spack.spec.Spec], participle: str, noun: str):
"""Display the list of specs to be acted on and ask for confirmation.
Args:
specs: specs to be removed
participle: action expressed as a participle, e.g. "uninstalled"
noun: action expressed as a noun, e.g. "uninstallation"
"""
tty.msg(f"The following {len(specs)} packages will be {participle}:\n")
spack.cmd.display_specs(specs, **display_args)
print("")
answer = tty.get_yes_or_no("Do you want to proceed?", default=False)
if not answer:
tty.msg(f"Aborting {noun}")
sys.exit(0)
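A rough, stdlib-only sketch of the calling pattern this helper supports: a command checks a yes-to-all flag and otherwise prompts before acting. maybe_confirm and the package names are illustrative; the real helper goes through tty and display_specs:

def maybe_confirm(items, yes_to_all: bool) -> None:
    if yes_to_all:
        return
    print(f"The following {len(items)} packages will be uninstalled:")
    for item in items:
        print(f"    {item}")
    if input("Do you want to proceed? [y/N] ").strip().lower() not in ("y", "yes"):
        raise SystemExit(0)

maybe_confirm(["zlib@1.3", "cmake@3.27.7"], yes_to_all=True)  # no prompt in this case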

View File

@@ -8,13 +8,13 @@
import llnl.util.tty as tty
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.deptypes as dt
import spack.error
import spack.paths
import spack.spec
import spack.store
from spack import build_environment, traverse
from spack.cmd.common import arguments
from spack.context import Context
from spack.util.environment import dump_environment, pickle_environment

View File

@@ -14,6 +14,7 @@
import spack.compilers
import spack.config
import spack.spec
from spack.cmd.common import arguments
description = "manage compilers"
section = "system"
@@ -23,8 +24,6 @@
def setup_parser(subparser):
sp = subparser.add_subparsers(metavar="SUBCOMMAND", dest="compiler_command")
scopes = spack.config.scopes()
# Find
find_parser = sp.add_parser(
"find",
@@ -47,9 +46,8 @@ def setup_parser(subparser):
find_parser.add_argument("add_paths", nargs=argparse.REMAINDER)
find_parser.add_argument(
"--scope",
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_modify_scope("compilers"),
action=arguments.ConfigScope,
default=lambda: spack.config.default_modify_scope("compilers"),
help="configuration scope to modify",
)
@@ -60,32 +58,20 @@ def setup_parser(subparser):
)
remove_parser.add_argument("compiler_spec")
remove_parser.add_argument(
"--scope",
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
default=None,
help="configuration scope to modify",
"--scope", action=arguments.ConfigScope, default=None, help="configuration scope to modify"
)
# List
list_parser = sp.add_parser("list", help="list available compilers")
list_parser.add_argument(
"--scope",
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_list_scope(),
help="configuration scope to read from",
"--scope", action=arguments.ConfigScope, help="configuration scope to read from"
)
# Info
info_parser = sp.add_parser("info", help="show compiler paths")
info_parser.add_argument("compiler_spec")
info_parser.add_argument(
"--scope",
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_list_scope(),
help="configuration scope to read from",
"--scope", action=arguments.ConfigScope, help="configuration scope to read from"
)

View File

@@ -3,7 +3,7 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import spack.config
from spack.cmd.common import arguments
from spack.cmd.compiler import compiler_list
description = "list available compilers"
@@ -12,13 +12,8 @@
def setup_parser(subparser):
scopes = spack.config.scopes()
subparser.add_argument(
"--scope",
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
help="configuration scope to read/modify",
"--scope", action=arguments.ConfigScope, help="configuration scope to read/modify"
)

View File

@@ -5,12 +5,12 @@
import collections
import os
import shutil
import sys
from typing import List
import llnl.util.filesystem as fs
import llnl.util.tty as tty
import spack.cmd.common.arguments
import spack.config
import spack.environment as ev
import spack.repo
@@ -18,6 +18,7 @@
import spack.schema.packages
import spack.store
import spack.util.spack_yaml as syaml
from spack.cmd.common import arguments
from spack.util.editor import editor
description = "get and set configuration options"
@@ -26,14 +27,9 @@
def setup_parser(subparser):
scopes = spack.config.scopes()
# User can only choose one
subparser.add_argument(
"--scope",
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
help="configuration scope to read/modify",
"--scope", action=arguments.ConfigScope, help="configuration scope to read/modify"
)
sp = subparser.add_subparsers(metavar="SUBCOMMAND", dest="config_command")
@@ -53,6 +49,7 @@ def setup_parser(subparser):
blame_parser.add_argument(
"section",
help="configuration section to print\n\noptions: %(choices)s",
nargs="?",
metavar="section",
choices=spack.config.SECTION_SCHEMAS,
)
@@ -101,13 +98,13 @@ def setup_parser(subparser):
setup_parser.add_parser = add_parser
update = sp.add_parser("update", help="update configuration files to the latest format")
spack.cmd.common.arguments.add_common_arguments(update, ["yes_to_all"])
arguments.add_common_arguments(update, ["yes_to_all"])
update.add_argument("section", help="section to update")
revert = sp.add_parser(
"revert", help="revert configuration files to their state before update"
)
spack.cmd.common.arguments.add_common_arguments(revert, ["yes_to_all"])
arguments.add_common_arguments(revert, ["yes_to_all"])
revert.add_argument("section", help="section to update")
@@ -136,32 +133,50 @@ def _get_scope_and_section(args):
return scope, section
def print_configuration(args, *, blame: bool) -> None:
if args.scope and args.section is None:
tty.die(f"the argument --scope={args.scope} requires specifying a section.")
if args.section is not None:
spack.config.CONFIG.print_section(args.section, blame=blame, scope=args.scope)
return
print_flattened_configuration(blame=blame)
def print_flattened_configuration(*, blame: bool) -> None:
"""Prints to stdout a flattened version of the configuration.
Args:
blame: if True, shows file provenance for each entry in the configuration.
"""
env = ev.active_environment()
if env is not None:
pristine = env.manifest.pristine_yaml_content
flattened = pristine.copy()
flattened[spack.schema.env.TOP_LEVEL_KEY] = pristine[spack.schema.env.TOP_LEVEL_KEY].copy()
else:
flattened = syaml.syaml_dict()
flattened[spack.schema.env.TOP_LEVEL_KEY] = syaml.syaml_dict()
for config_section in spack.config.SECTION_SCHEMAS:
current = spack.config.get(config_section)
flattened[spack.schema.env.TOP_LEVEL_KEY][config_section] = current
syaml.dump_config(flattened, stream=sys.stdout, default_flow_style=False, blame=blame)
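Roughly, the flattened form nests every configuration section under the environment manifest's top-level key, assumed here to be the usual spack: mapping. json stands in for the YAML dump only to keep this sketch stdlib-only, and the section contents are illustrative:

import json
import sys

flattened = {
    "spack": {
        "config": {"build_jobs": 8},                  # illustrative section contents
        "packages": {"all": {"compiler": ["gcc"]}},
    }
}
json.dump(flattened, sys.stdout, indent=2)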
def config_get(args):
"""Dump merged YAML configuration for a specific section.
With no arguments and an active environment, print the contents of
the environment's manifest file (spack.yaml).
"""
scope, section = _get_scope_and_section(args)
if section is not None:
spack.config.CONFIG.print_section(section)
elif scope and scope.startswith("env:"):
config_file = spack.config.CONFIG.get_config_filename(scope, section)
if os.path.exists(config_file):
with open(config_file) as f:
print(f.read())
else:
tty.die("environment has no %s file" % ev.manifest_name)
else:
tty.die("`spack config get` requires a section argument or an active environment.")
print_configuration(args, blame=False)
def config_blame(args):
"""Print out line-by-line blame of merged YAML."""
spack.config.CONFIG.print_section(args.section, blame=True)
print_configuration(args, blame=True)
def config_edit(args):

View File

@@ -172,6 +172,14 @@ def configure_args(self):
return args"""
class CargoPackageTemplate(PackageTemplate):
"""Provides appropriate overrides for cargo-based packages"""
base_class_name = "CargoPackage"
body_def = ""
class CMakePackageTemplate(PackageTemplate):
"""Provides appropriate overrides for CMake-based packages"""
@@ -186,6 +194,14 @@ def cmake_args(self):
return args"""
class GoPackageTemplate(PackageTemplate):
"""Provides appropriate overrides for Go-module-based packages"""
base_class_name = "GoPackage"
body_def = ""
class LuaPackageTemplate(PackageTemplate):
"""Provides appropriate overrides for LuaRocks-based packages"""
@@ -575,28 +591,30 @@ def __init__(self, name, *args, **kwargs):
templates = {
"autotools": AutotoolsPackageTemplate,
"autoreconf": AutoreconfPackageTemplate,
"cmake": CMakePackageTemplate,
"bundle": BundlePackageTemplate,
"qmake": QMakePackageTemplate,
"maven": MavenPackageTemplate,
"scons": SconsPackageTemplate,
"waf": WafPackageTemplate,
"autotools": AutotoolsPackageTemplate,
"bazel": BazelPackageTemplate,
"bundle": BundlePackageTemplate,
"cargo": CargoPackageTemplate,
"cmake": CMakePackageTemplate,
"generic": PackageTemplate,
"go": GoPackageTemplate,
"intel": IntelPackageTemplate,
"lua": LuaPackageTemplate,
"makefile": MakefilePackageTemplate,
"maven": MavenPackageTemplate,
"meson": MesonPackageTemplate,
"octave": OctavePackageTemplate,
"perlbuild": PerlbuildPackageTemplate,
"perlmake": PerlmakePackageTemplate,
"python": PythonPackageTemplate,
"qmake": QMakePackageTemplate,
"r": RPackageTemplate,
"racket": RacketPackageTemplate,
"perlmake": PerlmakePackageTemplate,
"perlbuild": PerlbuildPackageTemplate,
"octave": OctavePackageTemplate,
"ruby": RubyPackageTemplate,
"makefile": MakefilePackageTemplate,
"intel": IntelPackageTemplate,
"meson": MesonPackageTemplate,
"lua": LuaPackageTemplate,
"scons": SconsPackageTemplate,
"sip": SIPPackageTemplate,
"generic": PackageTemplate,
"waf": WafPackageTemplate,
}
@@ -679,6 +697,8 @@ def __call__(self, stage, url):
clues = [
(r"/CMakeLists\.txt$", "cmake"),
(r"/NAMESPACE$", "r"),
(r"/Cargo\.toml$", "cargo"),
(r"/go\.mod$", "go"),
(r"/configure$", "autotools"),
(r"/configure\.(in|ac)$", "autoreconf"),
(r"/Makefile\.am$", "autoreconf"),

View File

@@ -0,0 +1,103 @@
# Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import argparse
import sys
from typing import List
import llnl.util.tty as tty
import spack.cmd
import spack.cmd.common.confirmation as confirmation
import spack.environment as ev
import spack.spec
from spack.cmd.common import arguments
description = "remove specs from the concretized lockfile of an environment"
section = "environments"
level = "long"
# Arguments for display_specs when we find ambiguity
display_args = {"long": True, "show_flags": False, "variants": False, "indent": 4}
def setup_parser(subparser):
subparser.add_argument(
"--root", action="store_true", help="deconcretize only specific environment roots"
)
arguments.add_common_arguments(subparser, ["yes_to_all", "specs"])
subparser.add_argument(
"-a",
"--all",
action="store_true",
dest="all",
help="deconcretize ALL specs that match each supplied spec",
)
def get_deconcretize_list(
args: argparse.Namespace, specs: List[spack.spec.Spec], env: ev.Environment
) -> List[spack.spec.Spec]:
"""
Get list of environment roots to deconcretize
"""
env_specs = [s for _, s in env.concretized_specs()]
to_deconcretize = []
errors = []
for s in specs:
if args.root:
# find all roots matching given spec
to_deconc = [e for e in env_specs if e.satisfies(s)]
else:
# find all roots matching or depending on a matching spec
to_deconc = [e for e in env_specs if any(d.satisfies(s) for d in e.traverse())]
if len(to_deconc) < 1:
tty.warn(f"No matching specs to deconcretize for {s}")
elif len(to_deconc) > 1 and not args.all:
errors.append((s, to_deconc))
to_deconcretize.extend(to_deconc)
if errors:
for spec, matching in errors:
tty.error(f"{spec} matches multiple concrete specs:")
sys.stderr.write("\n")
spack.cmd.display_specs(matching, output=sys.stderr, **display_args)
sys.stderr.write("\n")
sys.stderr.flush()
tty.die("Use '--all' to deconcretize all matching specs, or be more specific")
return to_deconcretize
def deconcretize_specs(args, specs):
env = spack.cmd.require_active_env(cmd_name="deconcretize")
if args.specs:
deconcretize_list = get_deconcretize_list(args, specs, env)
else:
deconcretize_list = [s for _, s in env.concretized_specs()]
if not args.yes_to_all:
confirmation.confirm_action(deconcretize_list, "deconcretized", "deconcretization")
with env.write_transaction():
for spec in deconcretize_list:
env.deconcretize(spec)
env.write()
def deconcretize(parser, args):
if not args.specs and not args.all:
tty.die(
"deconcretize requires at least one spec argument.",
" Use `spack deconcretize --all` to deconcretize ALL specs.",
)
specs = spack.cmd.parse_specs(args.specs) if args.specs else [any]
deconcretize_specs(args, specs)
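The selection rule in get_deconcretize_list is: with --root, only roots that themselves satisfy the given spec are chosen; otherwise any root whose DAG contains a matching node is chosen. A stdlib-only sketch of that rule, with plain names standing in for Spec objects:

def select_roots(roots, matches, root_only: bool):
    """roots: list of (root_name, [node_names...]); matches: predicate on a name."""
    selected = []
    for root, nodes in roots:
        if root_only:
            if matches(root):
                selected.append(root)
        elif any(matches(n) for n in nodes):
            selected.append(root)
    return selected

roots = [("hdf5", ["hdf5", "zlib", "mpich"]), ("cmake", ["cmake", "openssl"])]
print(select_roots(roots, lambda n: n == "zlib", root_only=False))  # ['hdf5']
print(select_roots(roots, lambda n: n == "zlib", root_only=True))   # []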

View File

@@ -9,11 +9,11 @@
from llnl.util.tty.colify import colify
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.environment as ev
import spack.package_base
import spack.repo
import spack.store
from spack.cmd.common import arguments
description = "show dependencies of a package"
section = "basic"

View File

@@ -9,10 +9,10 @@
from llnl.util.tty.colify import colify
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.environment as ev
import spack.repo
import spack.store
from spack.cmd.common import arguments
description = "show packages that depend on another"
section = "basic"

View File

@@ -20,9 +20,9 @@
from llnl.util.symlink import symlink
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.environment as ev
import spack.store
from spack.cmd.common import arguments
from spack.database import InstallStatuses
from spack.error import SpackError

View File

@@ -9,9 +9,9 @@
import llnl.util.tty as tty
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.config
import spack.repo
from spack.cmd.common import arguments
description = "developer build: build from code in current working directory"
section = "build"

View File

@@ -8,10 +8,10 @@
import llnl.util.tty as tty
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.spec
import spack.util.path
import spack.version
from spack.cmd.common import arguments
from spack.error import SpackError
description = "add a spec to an environment's dev-build information"
@@ -45,10 +45,41 @@ def setup_parser(subparser):
arguments.add_common_arguments(subparser, ["spec"])
def develop(parser, args):
env = spack.cmd.require_active_env(cmd_name="develop")
def _update_config(spec, path):
find_fn = lambda section: spec.name in section
entry = {"spec": str(spec)}
if path != spec.name:
entry["path"] = path
def change_fn(section):
section[spec.name] = entry
spack.config.change_or_add("develop", find_fn, change_fn)
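For reference, a hedged sketch of the develop configuration entry that _update_config writes; the package name and path are illustrative, and "path" only appears when it differs from the package name:

develop_section = {
    "mypkg": {
        "spec": "mypkg@=1.2.3",
        "path": "/home/user/src/mypkg",  # omitted when the path equals the package name
    }
}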
def _retrieve_develop_source(spec, abspath):
# "steal" the source code via staging API. We ask for a stage
# to be created, then copy it afterwards somewhere else. It would be
# better if we can create the `source_path` directly into its final
# destination.
pkg_cls = spack.repo.PATH.get_pkg_class(spec.name)
# We construct a package class ourselves, rather than asking for
# Spec.package, since Spec only allows this when it is concrete
package = pkg_cls(spec)
if isinstance(package.stage[0].fetcher, spack.fetch_strategy.GitFetchStrategy):
package.stage[0].fetcher.get_full_repo = True
# If we retrieved this version before and cached it, we may have
# done so without cloning the full git repo; likewise, any
# mirror might store an instance with truncated history.
package.stage[0].disable_mirrors()
package.stage.steal_source(abspath)
def develop(parser, args):
if not args.spec:
env = spack.cmd.require_active_env(cmd_name="develop")
if args.clone is False:
raise SpackError("No spec provided to spack develop command")
@@ -66,7 +97,7 @@ def develop(parser, args):
# Both old syntax `spack develop pkg@x` and new syntax `spack develop pkg@=x`
# are currently supported.
spec = spack.spec.parse_with_version_concrete(entry["spec"])
env.develop(spec=spec, path=path, clone=True)
_retrieve_develop_source(spec, abspath)
if not env.dev_specs:
tty.warn("No develop specs to download")
@@ -81,12 +112,16 @@ def develop(parser, args):
version = spec.versions.concrete_range_as_version
if not version:
raise SpackError("Packages to develop must have a concrete version")
spec.versions = spack.version.VersionList([version])
# default path is relative path to spec.name
# If user does not specify --path, we choose to create a directory in the
# active environment's directory, named after the spec
path = args.path or spec.name
abspath = spack.util.path.canonicalize_path(path, default_wd=env.path)
if not os.path.isabs(path):
env = spack.cmd.require_active_env(cmd_name="develop")
abspath = spack.util.path.canonicalize_path(path, default_wd=env.path)
else:
abspath = path
# clone default: only if the path doesn't exist
clone = args.clone
@@ -96,15 +131,24 @@ def develop(parser, args):
if not clone and not os.path.exists(abspath):
raise SpackError("Provided path %s does not exist" % abspath)
if clone and os.path.exists(abspath):
if args.force:
shutil.rmtree(abspath)
else:
msg = "Path %s already exists and cannot be cloned to." % abspath
msg += " Use `spack develop -f` to overwrite."
raise SpackError(msg)
if clone:
if os.path.exists(abspath):
if args.force:
shutil.rmtree(abspath)
else:
msg = "Path %s already exists and cannot be cloned to." % abspath
msg += " Use `spack develop -f` to overwrite."
raise SpackError(msg)
_retrieve_develop_source(spec, abspath)
# Note: we could put develop specs in any scope, but I assume
# users would only ever want to do this for either (a) an active
# env or (b) a specified config file (e.g. that is included by
# an environment)
# TODO: when https://github.com/spack/spack/pull/35307 is merged,
# an active env is not required if a scope is specified
env = spack.cmd.require_active_env(cmd_name="develop")
tty.debug("Updating develop config for {0} transactionally".format(env.name))
with env.write_transaction():
changed = env.develop(spec, path, clone)
if changed:
env.write()
_update_config(spec, path)

View File

@@ -10,11 +10,11 @@
from llnl.util.tty.color import cprint, get_color_when
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.environment as ev
import spack.solver.asp as asp
import spack.util.environment
import spack.util.spack_json as sjson
from spack.cmd.common import arguments
description = "compare two specs"
section = "basic"
@@ -44,6 +44,9 @@ def setup_parser(subparser):
action="append",
help="select the attributes to show (defaults to all)",
)
subparser.add_argument(
"--ignore", action="append", help="omit diffs related to these dependencies"
)
def shift(asp_function):
@@ -54,7 +57,7 @@ def shift(asp_function):
return asp.AspFunction(first, rest)
def compare_specs(a, b, to_string=False, color=None):
def compare_specs(a, b, to_string=False, color=None, ignore_packages=None):
"""
Generate a comparison, including diffs (for each side) and an intersection.
@@ -73,6 +76,14 @@ def compare_specs(a, b, to_string=False, color=None):
if color is None:
color = get_color_when()
a = a.copy()
b = b.copy()
if ignore_packages:
for pkg_name in ignore_packages:
a.trim(pkg_name)
b.trim(pkg_name)
# Prepare a solver setup to parse differences
setup = asp.SpackSolverSetup()
@@ -200,6 +211,8 @@ def diff(parser, args):
specs = []
for spec in spack.cmd.parse_specs(args.specs):
# If the spec has a hash, check it before disambiguating
spec.replace_hash()
if spec.concrete:
specs.append(spec)
else:
@@ -207,7 +220,7 @@ def diff(parser, args):
# Calculate the comparison (c)
color = False if args.dump_json else get_color_when()
c = compare_specs(specs[0], specs[1], to_string=True, color=color)
c = compare_specs(specs[0], specs[1], to_string=True, color=color, ignore_packages=args.ignore)
# Default to all attributes
attributes = args.attribute or ["all"]
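A much-simplified, stdlib-only sketch of what ignoring a dependency amounts to: drop the named package from both sides before diffing. The real command trims copies of the two specs (Spec.trim) and then compares ASP facts, so this captures only the intuition, with illustrative spec strings:

def diff_ignoring(a_deps, b_deps, ignore=()):
    # remove ignored packages from both sets, then take the two one-sided differences
    a = {d for d in a_deps if d.split("@")[0] not in ignore}
    b = {d for d in b_deps if d.split("@")[0] not in ignore}
    return sorted(a - b), sorted(b - a)

only_a, only_b = diff_ignoring(
    {"zlib@1.2.13", "openssl@3.1"},
    {"zlib@1.3", "openssl@3.1"},
    ignore=("zlib",),
)
print(only_a, only_b)  # [] [] -- the zlib difference is suppressed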

View File

@@ -20,7 +20,6 @@
import spack.cmd
import spack.cmd.common
import spack.cmd.common.arguments
import spack.cmd.common.arguments as arguments
import spack.cmd.install
import spack.cmd.modules
import spack.cmd.uninstall
@@ -31,6 +30,7 @@
import spack.schema.env
import spack.spec
import spack.tengine
from spack.cmd.common import arguments
from spack.util.environment import EnvironmentModifications
description = "manage virtual environments"

View File

@@ -10,10 +10,10 @@
from llnl.util.tty.colify import colify
import spack.cmd as cmd
import spack.cmd.common.arguments as arguments
import spack.environment as ev
import spack.repo
import spack.store
from spack.cmd.common import arguments
description = "list extensions for package"
section = "extensions"

View File

@@ -14,12 +14,12 @@
import spack
import spack.cmd
import spack.cmd.common.arguments
import spack.config
import spack.cray_manifest as cray_manifest
import spack.detection
import spack.error
import spack.util.environment
from spack.cmd.common import arguments
description = "manage external packages in Spack configuration"
section = "config"
@@ -29,8 +29,6 @@
def setup_parser(subparser):
sp = subparser.add_subparsers(metavar="SUBCOMMAND", dest="external_command")
scopes = spack.config.scopes()
find_parser = sp.add_parser("find", help="add external packages to packages.yaml")
find_parser.add_argument(
"--not-buildable",
@@ -48,15 +46,14 @@ def setup_parser(subparser):
)
find_parser.add_argument(
"--scope",
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_modify_scope("packages"),
action=arguments.ConfigScope,
default=lambda: spack.config.default_modify_scope("packages"),
help="configuration scope to modify",
)
find_parser.add_argument(
"--all", action="store_true", help="search for all packages that Spack knows about"
)
spack.cmd.common.arguments.add_common_arguments(find_parser, ["tags", "jobs"])
arguments.add_common_arguments(find_parser, ["tags", "jobs"])
find_parser.add_argument("packages", nargs=argparse.REMAINDER)
find_parser.epilog = (
'The search is by default on packages tagged with the "build-tools" or '

View File

@@ -6,11 +6,11 @@
import llnl.util.tty as tty
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.config
import spack.environment as ev
import spack.repo
import spack.traverse
from spack.cmd.common import arguments
description = "fetch archives for packages"
section = "build"

View File

@@ -12,9 +12,9 @@
import spack.bootstrap
import spack.cmd as cmd
import spack.cmd.common.arguments as arguments
import spack.environment as ev
import spack.repo
from spack.cmd.common import arguments
from spack.database import InstallStatuses
description = "list and search installed packages"
@@ -261,10 +261,8 @@ def find(parser, args):
# Exit early with an error code if no package matches the constraint
if not results and args.constraint:
msg = "No package matches the query: {0}"
msg = msg.format(" ".join(args.constraint))
tty.msg(msg)
raise SystemExit(1)
constraint_str = " ".join(str(s) for s in args.constraint_specs)
tty.die(f"No package matches the query: {constraint_str}")
# If tags have been specified on the command line, filter by tags
if args.tags:

View File

@@ -6,6 +6,7 @@
import llnl.util.tty as tty
import spack.cmd.common.arguments
import spack.cmd.common.confirmation
import spack.cmd.uninstall
import spack.environment as ev
import spack.store
@@ -41,6 +42,6 @@ def gc(parser, args):
return
if not args.yes_to_all:
spack.cmd.uninstall.confirm_removal(specs)
spack.cmd.common.confirmation.confirm_action(specs, "uninstalled", "uninstallation")
spack.cmd.uninstall.do_uninstall(specs, force=False)

View File

@@ -7,11 +7,11 @@
import os
import spack.binary_distribution
import spack.cmd.common.arguments as arguments
import spack.mirror
import spack.paths
import spack.util.gpg
import spack.util.url
from spack.cmd.common import arguments
description = "handle GPG actions for spack"
section = "packaging"

View File

@@ -5,10 +5,10 @@
from llnl.util import tty
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.config
import spack.environment as ev
import spack.store
from spack.cmd.common import arguments
from spack.graph import DAGWithDependencyTypes, SimpleDAG, graph_ascii, graph_dot, static_graph_dot
description = "generate graphs of package dependency relationships"
@@ -61,7 +61,7 @@ def graph(parser, args):
args.dot = True
env = ev.active_environment()
if env:
specs = env.all_specs()
specs = env.concrete_roots()
else:
specs = spack.store.STORE.db.query()

View File

@@ -3,6 +3,7 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import sys
import textwrap
from itertools import zip_longest
@@ -10,12 +11,13 @@
import llnl.util.tty.color as color
from llnl.util.tty.colify import colify
import spack.cmd.common.arguments as arguments
import spack.deptypes as dt
import spack.fetch_strategy as fs
import spack.install_test
import spack.repo
import spack.spec
import spack.version
from spack.cmd.common import arguments
from spack.package_base import preferred_version
description = "get detailed information on a particular package"
@@ -53,6 +55,7 @@ def setup_parser(subparser):
("--tags", print_tags.__doc__),
("--tests", print_tests.__doc__),
("--virtuals", print_virtuals.__doc__),
("--variants-by-name", "list variants in strict name order; don't group by condition"),
]
for opt, help_comment in options:
subparser.add_argument(opt, action="store_true", help=help_comment)
@@ -77,35 +80,10 @@ def license(s):
class VariantFormatter:
def __init__(self, variants):
self.variants = variants
def __init__(self, pkg):
self.variants = pkg.variants
self.headers = ("Name [Default]", "When", "Allowed values", "Description")
# Formats
fmt_name = "{0} [{1}]"
# Initialize column widths with the length of the
# corresponding headers, as they cannot be shorter
# than that
self.column_widths = [len(x) for x in self.headers]
# Expand columns based on max line lengths
for k, e in variants.items():
v, w = e
candidate_max_widths = (
len(fmt_name.format(k, self.default(v))), # Name [Default]
len(str(w)),
len(v.allowed_values), # Allowed values
len(v.description), # Description
)
self.column_widths = (
max(self.column_widths[0], candidate_max_widths[0]),
max(self.column_widths[1], candidate_max_widths[1]),
max(self.column_widths[2], candidate_max_widths[2]),
max(self.column_widths[3], candidate_max_widths[3]),
)
# Don't let name or possible values be less than max widths
_, cols = tty.terminal_size()
max_name = min(self.column_widths[0], 30)
@@ -137,6 +115,8 @@ def default(self, v):
def lines(self):
if not self.variants:
yield " None"
return
else:
yield " " + self.fmt % self.headers
underline = tuple([w * "=" for w in self.column_widths])
@@ -159,7 +139,7 @@ def lines(self):
yield " " + self.fmt % t
def print_dependencies(pkg):
def print_dependencies(pkg, args):
"""output build, link, and run package dependencies"""
for deptype in ("build", "link", "run"):
@@ -172,7 +152,7 @@ def print_dependencies(pkg):
color.cprint(" None")
def print_detectable(pkg):
def print_detectable(pkg, args):
"""output information on external detection"""
color.cprint("")
@@ -200,7 +180,7 @@ def print_detectable(pkg):
color.cprint(" False")
def print_maintainers(pkg):
def print_maintainers(pkg, args):
"""output package maintainers"""
if len(pkg.maintainers) > 0:
@@ -209,7 +189,7 @@ def print_maintainers(pkg):
color.cprint(section_title("Maintainers: ") + mnt)
def print_phases(pkg):
def print_phases(pkg, args):
"""output installation phases"""
if hasattr(pkg.builder, "phases") and pkg.builder.phases:
@@ -221,7 +201,7 @@ def print_phases(pkg):
color.cprint(phase_str)
def print_tags(pkg):
def print_tags(pkg, args):
"""output package tags"""
color.cprint("")
@@ -233,7 +213,7 @@ def print_tags(pkg):
color.cprint(" None")
def print_tests(pkg):
def print_tests(pkg, args):
"""output relevant build-time and stand-alone tests"""
# Some built-in base packages (e.g., Autotools) define callback (e.g.,
@@ -271,18 +251,171 @@ def print_tests(pkg):
color.cprint(" None")
def print_variants(pkg):
def _fmt_value(v):
if v is None or isinstance(v, bool):
return str(v).lower()
else:
return str(v)
def _fmt_name_and_default(variant):
"""Print colorized name [default] for a variant."""
return color.colorize(f"@c{{{variant.name}}} @C{{[{_fmt_value(variant.default)}]}}")
def _fmt_when(when, indent):
return color.colorize(f"{indent * ' '}@B{{when}} {color.cescape(when)}")
def _fmt_variant_description(variant, width, indent):
"""Format a variant's description, preserving explicit line breaks."""
return "\n".join(
textwrap.fill(
line, width=width, initial_indent=indent * " ", subsequent_indent=indent * " "
)
for line in variant.description.split("\n")
)
def _fmt_variant(variant, max_name_default_len, indent, when=None, out=None):
out = out or sys.stdout
_, cols = tty.terminal_size()
name_and_default = _fmt_name_and_default(variant)
name_default_len = color.clen(name_and_default)
values = variant.values
if not isinstance(variant.values, (tuple, list, spack.variant.DisjointSetsOfValues)):
values = [variant.values]
# put 'none' first, sort the rest by value
sorted_values = sorted(values, key=lambda v: (v != "none", v))
pad = 4 # min padding between 'name [default]' and values
value_indent = (indent + max_name_default_len + pad) * " " # left edge of values
# This preserves any formatting (i.e., newlines) from how the description was
# written in package.py, but still wraps long lines for small terminals.
# This allows some packages to provide detailed help on their variants (see, e.g., gasnet).
formatted_values = "\n".join(
textwrap.wrap(
f"{', '.join(_fmt_value(v) for v in sorted_values)}",
width=cols - 2,
initial_indent=value_indent,
subsequent_indent=value_indent,
)
)
formatted_values = formatted_values[indent + name_default_len + pad :]
# name [default] value1, value2, value3, ...
padding = pad * " "
color.cprint(f"{indent * ' '}{name_and_default}{padding}@c{{{formatted_values}}}", stream=out)
# when <spec>
description_indent = indent + 4
if when is not None and when != spack.spec.Spec():
out.write(_fmt_when(when, description_indent - 2))
out.write("\n")
# description, preserving explicit line breaks from the way it's written in the package file
out.write(_fmt_variant_description(variant, cols - 2, description_indent))
out.write("\n")
def _variants_by_name_when(pkg):
"""Adaptor to get variants keyed by { name: { when: { [Variant...] } }."""
# TODO: replace with pkg.variants_by_name(when=True) when unified directive dicts are merged.
variants = {}
for name, (variant, whens) in sorted(pkg.variants.items()):
for when in whens:
variants.setdefault(name, {}).setdefault(when, []).append(variant)
return variants
def _variants_by_when_name(pkg):
"""Adaptor to get variants keyed by { when: { name: Variant } }"""
# TODO: replace with pkg.variants when unified directive dicts are merged.
variants = {}
for name, (variant, whens) in pkg.variants.items():
for when in whens:
variants.setdefault(when, {})[name] = variant
return variants
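A stdlib-only sketch of the two keyings these adaptors produce, with (name, variant, whens) tuples and plain strings standing in for pkg.variants.items(), Spec, and Variant objects:

definitions = [
    ("cuda", "Variant(cuda)", ["", "+rocm"]),       # illustrative data only
    ("build_type", "Variant(build_type)", [""]),
]

by_name = {}   # { name: { when: [variant, ...] } }
by_when = {}   # { when: { name: variant } }
for name, variant, whens in definitions:
    for when in whens:
        by_name.setdefault(name, {}).setdefault(when, []).append(variant)
        by_when.setdefault(when, {})[name] = variant

print(by_name["cuda"])  # {'': ['Variant(cuda)'], '+rocm': ['Variant(cuda)']}
print(by_when[""])      # {'cuda': 'Variant(cuda)', 'build_type': 'Variant(build_type)'}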
def _print_variants_header(pkg):
"""output variants"""
if not pkg.variants:
print(" None")
return
color.cprint("")
color.cprint(section_title("Variants:"))
formatter = VariantFormatter(pkg.variants)
for line in formatter.lines:
color.cprint(color.cescape(line))
variants_by_name = _variants_by_name_when(pkg)
# Calculate the max length of the "name [default]" part of the variant display
# This lets us know where to print variant values.
max_name_default_len = max(
color.clen(_fmt_name_and_default(variant))
for name, when_variants in variants_by_name.items()
for variants in when_variants.values()
for variant in variants
)
return max_name_default_len, variants_by_name
def print_versions(pkg):
def _unconstrained_ver_first(item):
"""sort key that puts specs with open version ranges first"""
spec, _ = item
return (spack.version.any_version not in spec.versions, spec)
def print_variants_grouped_by_when(pkg):
max_name_default_len, _ = _print_variants_header(pkg)
indent = 4
variants = _variants_by_when_name(pkg)
for when, variants_by_name in sorted(variants.items(), key=_unconstrained_ver_first):
padded_values = max_name_default_len + 4
start_indent = indent
if when != spack.spec.Spec():
sys.stdout.write("\n")
sys.stdout.write(_fmt_when(when, indent))
sys.stdout.write("\n")
# indent names slightly inside 'when', but line up values
padded_values -= 2
start_indent += 2
for name, variant in sorted(variants_by_name.items()):
_fmt_variant(variant, padded_values, start_indent, None, out=sys.stdout)
def print_variants_by_name(pkg):
max_name_default_len, variants_by_name = _print_variants_header(pkg)
max_name_default_len += 4
indent = 4
for name, when_variants in variants_by_name.items():
for when, variants in sorted(when_variants.items(), key=_unconstrained_ver_first):
for variant in variants:
_fmt_variant(variant, max_name_default_len, indent, when, out=sys.stdout)
sys.stdout.write("\n")
def print_variants(pkg, args):
"""output variants"""
if args.variants_by_name:
print_variants_by_name(pkg)
else:
print_variants_grouped_by_when(pkg)
def print_versions(pkg, args):
"""output versions"""
color.cprint("")
@@ -300,18 +433,24 @@ def print_versions(pkg):
pad = padder(pkg.versions, 4)
preferred = preferred_version(pkg)
url = ""
if pkg.has_code:
url = fs.for_package_version(pkg, preferred)
def get_url(version):
try:
return fs.for_package_version(pkg, version)
except spack.fetch_strategy.InvalidArgsError:
return "No URL"
url = get_url(preferred) if pkg.has_code else ""
line = version(" {0}".format(pad(preferred))) + color.cescape(url)
color.cprint(line)
color.cwrite(line)
print()
safe = []
deprecated = []
for v in reversed(sorted(pkg.versions)):
if pkg.has_code:
url = fs.for_package_version(pkg, v)
url = get_url(v)
if pkg.versions[v].get("deprecated", False):
deprecated.append((v, url))
else:
@@ -329,7 +468,7 @@ def print_versions(pkg):
color.cprint(line)
def print_virtuals(pkg):
def print_virtuals(pkg, args):
"""output virtual packages"""
color.cprint("")
@@ -352,7 +491,7 @@ def print_virtuals(pkg):
color.cprint(" None")
def print_licenses(pkg):
def print_licenses(pkg, args):
"""Output the licenses of the project."""
color.cprint("")
@@ -384,7 +523,8 @@ def info(parser, args):
else:
color.cprint(" None")
color.cprint(section_title("Homepage: ") + pkg.homepage)
if getattr(pkg, "homepage"):
color.cprint(section_title("Homepage: ") + pkg.homepage)
# Now output optional information in expected order
sections = [
@@ -401,6 +541,6 @@ def info(parser, args):
]
for print_it, func in sections:
if print_it:
func(pkg)
func(pkg, args)
color.cprint("")

View File

@@ -14,7 +14,6 @@
import spack.build_environment
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.config
import spack.environment as ev
import spack.fetch_strategy
@@ -23,6 +22,7 @@
import spack.report
import spack.spec
import spack.store
from spack.cmd.common import arguments
from spack.error import SpackError
from spack.installer import PackageInstaller
@@ -162,8 +162,8 @@ def setup_parser(subparser):
"--no-check-signature",
action="store_true",
dest="unsigned",
default=False,
help="do not check signatures of binary packages",
default=None,
help="do not check signatures of binary packages (override mirror config)",
)
subparser.add_argument(
"--show-log-on-error",

View File

@@ -15,9 +15,9 @@
import llnl.util.tty as tty
from llnl.util.tty.colify import colify
import spack.cmd.common.arguments as arguments
import spack.deptypes as dt
import spack.repo
from spack.cmd.common import arguments
from spack.version import VersionList
description = "list and search available packages"

View File

@@ -8,12 +8,12 @@
import llnl.util.tty as tty
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.cmd.find
import spack.environment as ev
import spack.store
import spack.user_environment as uenv
import spack.util.environment
from spack.cmd.common import arguments
description = "add package to the user environment"
section = "user environment"
@@ -98,15 +98,15 @@ def load(parser, args):
spack.cmd.display_specs(results)
return
constraint_specs = spack.cmd.parse_specs(args.constraint)
specs = [
spack.cmd.disambiguate_spec(spec, env, first=args.load_first)
for spec in spack.cmd.parse_specs(args.constraint)
spack.cmd.disambiguate_spec(spec, env, first=args.load_first) for spec in constraint_specs
]
if not args.shell:
specs_str = " ".join(args.constraint) or "SPECS"
specs_str = " ".join(str(s) for s in constraint_specs) or "SPECS"
spack.cmd.common.shell_init_instructions(
"spack load", " eval `spack load {sh_arg} %s`" % specs_str
"spack load", f" eval `spack load {{sh_arg}} {specs_str}`"
)
return 1

View File

@@ -9,11 +9,11 @@
import spack.builder
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.environment as ev
import spack.paths
import spack.repo
import spack.stage
from spack.cmd.common import arguments
description = "print out locations of packages and spack directories"
section = "basic"

View File

@@ -8,11 +8,11 @@
from llnl.util import tty
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.error
import spack.package_base
import spack.repo
import spack.store
from spack.cmd.common import arguments
from spack.database import InstallStatuses
description = "mark packages as explicitly or implicitly installed"

View File

@@ -11,7 +11,6 @@
import spack.caches
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.concretize
import spack.config
import spack.environment as ev
@@ -20,6 +19,7 @@
import spack.spec
import spack.util.path
import spack.util.web as web_util
from spack.cmd.common import arguments
from spack.error import SpackError
description = "manage mirrors (source and binary)"
@@ -88,18 +88,14 @@ def setup_parser(subparser):
"--mirror-url", metavar="mirror_url", type=str, help="find mirror to destroy by url"
)
# used to construct scope arguments below
scopes = spack.config.scopes()
# Add
add_parser = sp.add_parser("add", help=mirror_add.__doc__)
add_parser.add_argument("name", help="mnemonic name for mirror", metavar="mirror")
add_parser.add_argument("url", help="url of mirror directory from 'spack mirror create'")
add_parser.add_argument(
"--scope",
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_modify_scope(),
action=arguments.ConfigScope,
default=lambda: spack.config.default_modify_scope(),
help="configuration scope to modify",
)
add_parser.add_argument(
@@ -111,15 +107,31 @@ def setup_parser(subparser):
"and source use `--type binary --type source` (default)"
),
)
add_parser_signed = add_parser.add_mutually_exclusive_group(required=False)
add_parser_signed.add_argument(
"--unsigned",
help="do not require signing and signature verification when pushing and installing from "
"this build cache",
action="store_false",
default=None,
dest="signed",
)
add_parser_signed.add_argument(
"--signed",
help="require signing and signature verification when pushing and installing from this "
"build cache",
action="store_true",
default=None,
dest="signed",
)
arguments.add_connection_args(add_parser, False)
# Remove
remove_parser = sp.add_parser("remove", aliases=["rm"], help=mirror_remove.__doc__)
remove_parser.add_argument("name", help="mnemonic name for mirror", metavar="mirror")
remove_parser.add_argument(
"--scope",
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_modify_scope(),
action=arguments.ConfigScope,
default=lambda: spack.config.default_modify_scope(),
help="configuration scope to modify",
)
@@ -136,9 +148,8 @@ def setup_parser(subparser):
)
set_url_parser.add_argument(
"--scope",
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_modify_scope(),
action=arguments.ConfigScope,
default=lambda: spack.config.default_modify_scope(),
help="configuration scope to modify",
)
arguments.add_connection_args(set_url_parser, False)
@@ -163,11 +174,27 @@ def setup_parser(subparser):
),
)
set_parser.add_argument("--url", help="url of mirror directory from 'spack mirror create'")
set_parser_unsigned = set_parser.add_mutually_exclusive_group(required=False)
set_parser_unsigned.add_argument(
"--unsigned",
help="do not require signing and signature verification when pushing and installing from "
"this build cache",
action="store_false",
default=None,
dest="signed",
)
set_parser_unsigned.add_argument(
"--signed",
help="require signing and signature verification when pushing and installing from this "
"build cache",
action="store_true",
default=None,
dest="signed",
)
set_parser.add_argument(
"--scope",
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_modify_scope(),
action=arguments.ConfigScope,
default=lambda: spack.config.default_modify_scope(),
help="configuration scope to modify",
)
arguments.add_connection_args(set_parser, False)
@@ -175,11 +202,7 @@ def setup_parser(subparser):
# List
list_parser = sp.add_parser("list", help=mirror_list.__doc__)
list_parser.add_argument(
"--scope",
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_list_scope(),
help="configuration scope to read from",
"--scope", action=arguments.ConfigScope, help="configuration scope to read from"
)
@@ -194,6 +217,7 @@ def mirror_add(args):
or args.type
or args.oci_username
or args.oci_password
or args.signed is not None
):
connection = {"url": args.url}
if args.s3_access_key_id and args.s3_access_key_secret:
@@ -209,6 +233,8 @@ def mirror_add(args):
if args.type:
connection["binary"] = "binary" in args.type
connection["source"] = "source" in args.type
if args.signed is not None:
connection["signed"] = args.signed
mirror = spack.mirror.Mirror(connection, name=args.name)
else:
mirror = spack.mirror.Mirror(args.url, name=args.name)
@@ -241,6 +267,8 @@ def _configure_mirror(args):
changes["endpoint_url"] = args.s3_endpoint_url
if args.oci_username and args.oci_password:
changes["access_pair"] = [args.oci_username, args.oci_password]
if getattr(args, "signed", None) is not None:
changes["signed"] = args.signed
# argparse cannot distinguish between --binary and --no-binary when same dest :(
# notice that set-url does not have these args, so getattr
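The --signed/--unsigned pair above shares dest="signed" with default=None, so "not given" stays distinguishable from an explicit choice, and only an explicit choice (signed is not None) is written into the mirror configuration. A stdlib-only sketch of that tri-state pattern:

import argparse

parser = argparse.ArgumentParser()
group = parser.add_mutually_exclusive_group()
group.add_argument("--signed", dest="signed", action="store_true", default=None)
group.add_argument("--unsigned", dest="signed", action="store_false", default=None)

print(parser.parse_args([]).signed)              # None  -> leave mirror config alone
print(parser.parse_args(["--signed"]).signed)    # True
print(parser.parse_args(["--unsigned"]).signed)  # False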

View File

@@ -14,11 +14,11 @@
from llnl.util.tty import color
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.config
import spack.modules
import spack.modules.common
import spack.repo
from spack.cmd.common import arguments
description = "manipulate module files"
section = "environment"
@@ -388,21 +388,15 @@ def modules_cmd(parser, args, module_type, callbacks=callbacks):
callbacks[args.subparser_name](module_type, specs, args)
except MultipleSpecsMatch:
msg = "the constraint '{query}' matches multiple packages:\n"
query = " ".join(str(s) for s in args.constraint_specs)
msg = f"the constraint '{query}' matches multiple packages:\n"
for s in specs:
spec_fmt = "{hash:7} {name}{@version}{%compiler}"
spec_fmt += "{compiler_flags}{variants}{arch=architecture}"
msg += "\t" + s.cformat(spec_fmt) + "\n"
tty.error(msg.format(query=args.constraint))
tty.die(
"In this context exactly **one** match is needed: "
"please specify your constraints better."
)
tty.die(msg, "In this context exactly *one* match is needed.")
except NoSpecMatches:
msg = "the constraint '{query}' matches no package."
tty.error(msg.format(query=args.constraint))
tty.die(
"In this context exactly **one** match is needed: "
"please specify your constraints better."
)
query = " ".join(str(s) for s in args.constraint_specs)
msg = f"the constraint '{query}' matches no package."
tty.die(msg, "In this context exactly *one* match is needed.")

View File

@@ -6,12 +6,12 @@
import llnl.util.tty as tty
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.config
import spack.environment as ev
import spack.package_base
import spack.repo
import spack.traverse
from spack.cmd.common import arguments
description = "patch expanded archive sources in preparation for install"
section = "build"

View File

@@ -12,11 +12,11 @@
from llnl.util.tty.colify import colify
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.paths
import spack.repo
import spack.util.executable as exe
import spack.util.package_hash as ph
from spack.cmd.common import arguments
description = "query packages associated with particular git revisions"
section = "developer"

View File

@@ -6,7 +6,7 @@
import llnl.util.tty as tty
import spack.cmd
import spack.cmd.common.arguments as arguments
from spack.cmd.common import arguments
description = "remove specs from an environment"
section = "environments"

View File

@@ -11,6 +11,7 @@
import spack.config
import spack.repo
import spack.util.path
from spack.cmd.common import arguments
description = "manage package source repositories"
section = "config"
@@ -19,7 +20,6 @@
def setup_parser(subparser):
sp = subparser.add_subparsers(metavar="SUBCOMMAND", dest="repo_command")
scopes = spack.config.scopes()
# Create
create_parser = sp.add_parser("create", help=repo_create.__doc__)
@@ -42,11 +42,7 @@ def setup_parser(subparser):
# List
list_parser = sp.add_parser("list", help=repo_list.__doc__)
list_parser.add_argument(
"--scope",
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_list_scope(),
help="configuration scope to read from",
"--scope", action=arguments.ConfigScope, help="configuration scope to read from"
)
# Add
@@ -54,9 +50,8 @@ def setup_parser(subparser):
add_parser.add_argument("path", help="path to a Spack package repository directory")
add_parser.add_argument(
"--scope",
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_modify_scope(),
action=arguments.ConfigScope,
default=lambda: spack.config.default_modify_scope(),
help="configuration scope to modify",
)
@@ -67,9 +62,8 @@ def setup_parser(subparser):
)
remove_parser.add_argument(
"--scope",
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_modify_scope(),
action=arguments.ConfigScope,
default=lambda: spack.config.default_modify_scope(),
help="configuration scope to modify",
)

View File

@@ -6,8 +6,8 @@
import llnl.util.tty as tty
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.repo
from spack.cmd.common import arguments
description = "revert checked out package source code"
section = "build"

View File

@@ -12,12 +12,12 @@
import spack
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.config
import spack.environment
import spack.hash_types as ht
import spack.package_base
import spack.solver.asp as asp
from spack.cmd.common import arguments
description = "concretize a specs using an ASP solver"
section = "developer"

View File

@@ -10,11 +10,11 @@
import spack
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.environment as ev
import spack.hash_types as ht
import spack.spec
import spack.store
from spack.cmd.common import arguments
description = "show what would be installed, given a spec"
section = "build"

View File

@@ -8,13 +8,13 @@
import llnl.util.tty as tty
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.config
import spack.environment as ev
import spack.package_base
import spack.repo
import spack.stage
import spack.traverse
from spack.cmd.common import arguments
description = "expand downloaded archive in preparation for install"
section = "build"

Some files were not shown because too many files have changed in this diff.