Compare commits

...

465 Commits

Author SHA1 Message Date
Jen Herting
d367b4285a py-dataclasses-json: add new package (#39493)
* [py-dataclasses-json] New package

* [py-dataclasses-json] limiting py-poetry-core versions
2023-09-02 16:27:47 -05:00
Jen Herting
260e735425 [py-preshed] added version 3.0.8 (#38969)
* [py-preshed] added version 3.0.8

* [py-preshed] reordered for consistency
2023-09-02 16:25:41 -05:00
John W. Parent
ca872f9c34 Windows: fix pwsh env activate/deactivate; load/unload (#39118)
These commands are currently broken on powershell (Windows) due to
improper use of the InvokeCommand commandlet and a lack of direct
support for the `--pwsh` argument in `spack load`, `spack unload`,
and `spack env deactivate`.
2023-09-01 11:36:27 -07:00
John W. Parent
b72a268bc5 "spack config add": allow values with a ":" (#39279)
If you wanted to set a configuration option like
`config:install_tree:root` to "C:/path/to/config.yaml", Spack had
trouble parsing this because of the ":" in the value. This adds
logic to allow using quotes to enclose the value, so you can add
`config:install_tree:root:"C:/path/to/config.yaml"`.

Configuration keys should never contain a quote character, so the
presence of any quote is taken to mean that the rest of the string
is specifying the value.
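
The quoting rule is simple enough to sketch in Python. This is an illustrative helper, not Spack's actual parser:

```python
def split_key_and_value(arg: str):
    # Keys never contain quote characters, so the first quote marks
    # where the value begins.
    quotes = [i for i in (arg.find('"'), arg.find("'")) if i != -1]
    if not quotes:
        # No quotes: the value is whatever follows the last ":".
        path, _, value = arg.rpartition(":")
        return path, value
    first = min(quotes)
    path = arg[:first].rstrip(":")
    value = arg[first:].strip("\"'")
    return path, value

print(split_key_and_value('config:install_tree:root:"C:/path/to/config.yaml"'))
# ('config:install_tree:root', 'C:/path/to/config.yaml')
```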
2023-09-01 11:05:02 -07:00
Adam J. Stewart
818195a3bd py-pandas: add v2.1.0 (#39707) 2023-08-31 11:08:44 -05:00
Anton Kozhevnikov
679d41ea66 [NVHPC] add a possibility to control default CUDA version (#38909)
* add a possibility to control default cuda version

* fix style

* style fix

* resolve comment

* resolve comment

* Fix style in nvhpc package.py

---------

Co-authored-by: antonk <antonk@cscs.ch>
Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>
2023-08-31 16:16:31 +02:00
Massimiliano Culpo
86216cc36e environment: improve spack remove matching (#39390)
Search for equivalent specs, not equal strings, when selecting a spec to remove.
2023-08-31 09:28:52 +00:00
dependabot[bot]
ecb7ad493f build(deps): bump sphinx from 7.2.4 to 7.2.5 in /lib/spack/docs (#39716)
Bumps [sphinx](https://github.com/sphinx-doc/sphinx) from 7.2.4 to 7.2.5.
- [Release notes](https://github.com/sphinx-doc/sphinx/releases)
- [Changelog](https://github.com/sphinx-doc/sphinx/blob/master/CHANGES)
- [Commits](https://github.com/sphinx-doc/sphinx/compare/v7.2.4...v7.2.5)

---
updated-dependencies:
- dependency-name: sphinx
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-08-31 10:40:33 +02:00
Harmen Stoppels
fb1e81657c Remove a few local imports in tests (#39719) 2023-08-31 10:40:02 +02:00
Raffaele Solcà
34e4c62e8c dla-future: add v0.2.0 (#39705) 2023-08-31 10:39:22 +02:00
Harmen Stoppels
acb02326aa ASP-based solver: add hidden mode to ignore versions that are moving targets, use that in CI (#39611)
Setting the undocumented variable SPACK_CONCRETIZER_REQUIRE_CHECKSUM
now causes the solver to avoid accounting for versions that are not checksummed.

This feature is used in CI to avoid spurious concretization against e.g. develop branches.
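
A sketch of the gating idea; the data shape and helper name are illustrative, not the solver's internals:

```python
import os

def checksummed_only(versions: dict) -> dict:
    # Drop versions without a checksum when the variable is set.
    if "SPACK_CONCRETIZER_REQUIRE_CHECKSUM" not in os.environ:
        return dict(versions)
    return {v: meta for v, meta in versions.items() if "sha256" in meta}

versions = {"1.2.0": {"sha256": "abc123"}, "develop": {"branch": "develop"}}
os.environ["SPACK_CONCRETIZER_REQUIRE_CHECKSUM"] = "1"
print(checksummed_only(versions))  # only 1.2.0 survives
```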
2023-08-31 08:09:37 +00:00
Pat McCormick
c1756257c2 llvm: fix for Flang variant due to exception handling (#36171)
* The flang project does not support exceptions enabled in the core llvm
library (and developer guidelines explicitly state they should not be
used).  For this reason, when the flang variant is selected,
LLVM_ENABLE_EH needs to be disabled.  In the current main branch of
llvm (and thus future releases), enabling flang and setting
LLVM_ENABLE_EH will cause the overall build to fail.

* Update var/spack/repos/builtin/packages/llvm/package.py

Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>

* fix syntax

---------

Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
Co-authored-by: Satish Balay <balay@mcs.anl.gov>
2023-08-30 23:22:33 -05:00
John W. Parent
1ee7c735ec Windows: oneapi/msvc consistency (#39180)
Currently, OneAPI's setvars scripts effectively disregard any arguments
we're passing to the MSVC vcvars env setup script, and additionally,
completely ignore the requested version of OneAPI, defaulting to whatever
the latest installed on the system is.

This leads to a scenario where we have improperly constructed Windows
native development environments, with potentially multiple versions of
MSVC and OneAPI being loaded or called in the same env. Obviously this is
far from ideal and leads to some fairly inscrutable errors such as
overlapping header files between MSVC and OneAPI and a different version
of OneAPI being called than the env was setup for.

This PR solves this issue by creating a structured invocation of each
relevant script in an order that ensures the correct values are set in
the resultant build env.

The order needs to be (see the sketch after the list):

1. MSVC vcvarsall
2. The compiler specific env.bat script for the relevant version of
   the oneapi compiler we're looking for. The root setvars script seems
   to respect this as well, although it is less explicit
3. The root oneapi setvars script, which sets up everything else the
   oneapi env needs and seems to respect previous env invocations.
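
A rough Python sketch of that ordering: chain the three `call`s in a single cmd session, then capture the resulting environment with `set`. The paths are hypothetical placeholders, and this shows only the idea, not the PR's implementation:

```python
import subprocess

# Hypothetical install locations; the real ones are discovered at runtime.
VCVARSALL = r"C:\msvc\vcvarsall.bat"
ONEAPI_COMPILER_ENV = r"C:\oneapi\compiler\2023.1.0\env\vars.bat"
ONEAPI_SETVARS = r"C:\oneapi\setvars.bat"

def windows_build_env(arch: str = "x64") -> dict:
    script = (
        f'call "{VCVARSALL}" {arch} && '
        f'call "{ONEAPI_COMPILER_ENV}" && '
        f'call "{ONEAPI_SETVARS}" && set'
    )
    out = subprocess.check_output(["cmd", "/c", script], text=True)
    env = {}
    for line in out.splitlines():  # parse `set` output into a dict
        key, sep, value = line.partition("=")
        if sep:
            env[key] = value
    return env
```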
2023-08-30 23:19:38 +00:00
Adam J. Stewart
22deed708e py-llvmlite: add Python version requirements (#39711) 2023-08-30 16:15:44 -07:00
Todd Gamblin
6693dc5eb8 completion: make bash completion work in zsh
`compgen -W` does not behave the same way in zsh as it does in bash; it seems not to
actually generate the completions we want.

- [x] add a zsh equivalent and `_compgen_w` to abstract it away
- [x] use `_compgen_w` instead of `compgen -W`
2023-08-30 12:42:31 -07:00
Todd Gamblin
396f219011 completion: add alias handling
Bash completion is now smarter about handling aliases. In particular, if all completions
for some input command are aliased to the same thing, we'll just complete with that thing.

If you've already *typed* the full alias for a command, we'll complete the alias.

So, for example, here there's more than one real command involved, so all aliases are
shown:

```console
$ spack con
concretise    concretize    config        containerise  containerize
```

Here, there are two possibilities: `concretise` and `concretize`, but both map to
`concretize` so we just complete that:

```console
$ spack conc
concretize
```

And here, the user has already typed `concretis`, so we just go with it as there is only
one option:

```console
$ spack concretis
concretise
```
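
The selection rule is easy to sketch in Python; the real logic lives in the bash completion script, and the names here are illustrative:

```python
ALIASES = {"concretise": "concretize", "containerise": "containerize"}
COMMANDS = ["concretize", "config", "containerize"]

def complete(prefix: str) -> list:
    candidates = [c for c in COMMANDS + list(ALIASES) if c.startswith(prefix)]
    if len(candidates) == 1:
        return candidates  # a single option: go with it, alias or not
    resolved = {ALIASES.get(c, c) for c in candidates}
    if len(resolved) == 1:
        return sorted(resolved)  # every candidate means the same command
    return sorted(candidates)

print(complete("con"))        # all five candidates
print(complete("conc"))       # ['concretize']
print(complete("concretis"))  # ['concretise']
```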
2023-08-30 12:42:31 -07:00
Todd Gamblin
a3ecd7efed Add concretise and containerise aliases for our UK users
From a user:

> Aargh.
> ```
> ==> Error: concretise is not a recognized Spack command or extension command; check with `spack commands`.
> ```

To make things easier for our friends in the UK, this adds `concretise` and
`containerise` aliases for the `spack concretize` and `spack containerize` commands.

- [x] add aliases
- [x] update completions
2023-08-30 12:42:31 -07:00
Chris Green
ae5511afd6 [procps] New versions (#38650)
* [procps] New versions 3.3.14, 3.3.15..3.3.17, 4.0.0..4.0.3

* Put upper bound on patch application

* [Patch was
  merged](https://gitlab.com/procps-ng/procps/-/merge_requests/189) for
  upcoming 4.0.4.
2023-08-30 13:09:06 -05:00
Ryan Marcellino
78fe2c63fa py-parsl: add v2023.08.21 (#39642)
* py-parsl: add v2023.08.21

* fix style

* add missing deps
2023-08-29 18:33:00 -05:00
Dewi
f4f396745e amg2023 package: cmake build (#39618)
Switch from makefile to CMake-based build. CMake support is currently
in a specific branch of the amg2023 repo, so add this branch as a
version in the package.
2023-08-29 15:47:36 -07:00
Loïc Pottier
f3c6d892b1 Upgrading py-pika versions (#39670)
* Upgraded py-pika versions

Signed-off-by: Loïc Pottier <pottier1@llnl.gov>

* Upgraded py-pika versions

Signed-off-by: Loïc Pottier <pottier1@llnl.gov>

* added variants and adding setuptools constraint

Signed-off-by: Loïc Pottier <pottier1@llnl.gov>

* missing setuptools for older versions

Signed-off-by: Loïc Pottier <pottier1@llnl.gov>

---------

Signed-off-by: Loïc Pottier <pottier1@llnl.gov>
2023-08-29 17:20:22 -05:00
Adam J. Stewart
2f5988cec7 libpng: add libs property (#39638) 2023-08-29 14:00:28 -07:00
Tom Payerle
44922f734d boxlib: Add BL_USE_PARTICLES cmake var (#39660)
Set BL_USE_PARTICLES to 1, which should cause the boxlib build to include Particles classes
according to CMakeLists.txt.

This seems to fix #18172
The aforementioned error seems to occur in the cmake phase while processing
CMakeLists.txt in Src/C_ParticleLib, and appears to be due to the
variable containing the list of src files for the add_library() call
being empty unless BL_USE_PARTICLES is set to 1.
2023-08-29 13:57:02 -07:00
G-Ragghianti
144e657c58 Added release 2023.08.25 (#39665) 2023-08-29 13:54:31 -07:00
Jen Herting
6f48fe2b6f [mathematica] added version 13.0.1 (#39675) 2023-08-29 13:47:33 -07:00
Jim Edwards
fcd03adc02 Jedwards/parallelio262 (#39676)
* add parallelio@2.6.2
* add comments
2023-08-29 13:44:54 -07:00
Alberto Sartori
0620b954be justbuild: add v1.2.1 (#39679) 2023-08-29 15:23:22 -04:00
Jen Herting
6174b829f7 [py-tiktoken] added verison 0.3.1 (#39677) 2023-08-29 14:28:39 -04:00
Adam J. Stewart
0d4b1c6a73 py-torchmetrics: add v1.1.1 (#39685) 2023-08-29 13:25:29 -05:00
Matthew Thompson
fb9797bd67 mapl: Fix pflogger dependency (#39683) 2023-08-29 14:22:52 -04:00
Jen Herting
4eee3c12c1 [py-gradio] New package (#39304)
* [py-gradio] New package

* [py-gradio] added deps from requirements.txt

* [py-gradio] py-pillow -> pil
2023-08-29 14:08:54 -04:00
Adam J. Stewart
3e5f9a2138 py-lightly: add v1.4.17 (#39686) 2023-08-29 11:05:02 -07:00
LydDeb
8295a45999 py-distributed: use py-jinja2 versions from 2.10.3 (#39658)
In the depends_on for py-jinja2, require versions from 2.10.3 onward instead of strict equality, per the pyproject.toml of py-distributed 2023.4.1.
2023-08-29 12:09:48 -05:00
Harmen Stoppels
5138c71d34 Revert "Add style tool to fix fish file formatting (#39155)" (#39680)
This reverts commit 70c71e8f93.
2023-08-29 18:12:19 +02:00
Harmen Stoppels
eef9939c21 Automated git version fixes (#39637)
Use full-length commit shas instead of short prefixes, to improve
reproducibility and guard against future clashes on abbreviated hashes,
as well as against compromised repos and man-in-the-middle attacks,
where attackers can easily fabricate a malicious commit whose sha
shares a short prefix.

Abbreviated commit shas in packages are expanded to full length
accordingly.

Versions with just tags now also get a commit sha, which can later be used to
check for retagged commits.
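
Expanding an abbreviated reference can be delegated to git itself; a sketch of the idea (not Spack's exact code), where the `^{commit}` suffix also peels annotated tags to the commit they point at:

```python
import subprocess

def full_commit_sha(repo_dir: str, ref: str) -> str:
    # Resolve a short sha or tag to the full 40-character commit sha.
    return subprocess.check_output(
        ["git", "-C", repo_dir, "rev-parse", f"{ref}^{{commit}}"],
        text=True,
    ).strip()
```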
2023-08-29 16:33:03 +02:00
dependabot[bot]
ffddaabaa0 build(deps): bump sphinx from 7.2.3 to 7.2.4 in /lib/spack/docs (#39668)
Bumps [sphinx](https://github.com/sphinx-doc/sphinx) from 7.2.3 to 7.2.4.
- [Release notes](https://github.com/sphinx-doc/sphinx/releases)
- [Changelog](https://github.com/sphinx-doc/sphinx/blob/master/CHANGES)
- [Commits](https://github.com/sphinx-doc/sphinx/compare/v7.2.3...v7.2.4)

---
updated-dependencies:
- dependency-name: sphinx
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-08-29 14:45:06 +02:00
dependabot[bot]
f664d1edaa build(deps): bump docker/setup-buildx-action from 2.9.1 to 2.10.0 (#39669)
Bumps [docker/setup-buildx-action](https://github.com/docker/setup-buildx-action) from 2.9.1 to 2.10.0.
- [Release notes](https://github.com/docker/setup-buildx-action/releases)
- [Commits](4c0219f9ac...885d1462b8)

---
updated-dependencies:
- dependency-name: docker/setup-buildx-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-08-29 14:44:42 +02:00
Harmen Stoppels
6d5d1562bd fmt: 10.1.1 (#39664) 2023-08-29 10:47:42 +02:00
Adam J. Stewart
70c71e8f93 Add style tool to fix fish file formatting (#39155) 2023-08-28 22:16:44 -05:00
Jonathon Anderson
d9d1eb24f9 modules: copy matched config to prevent bleed (#39421) 2023-08-28 15:03:29 -07:00
kwryankrattiger
cef59ad0bf Patch VTK to enable python 3.8 in VTK 8.2 (#38735)
* VTK: Add patch for python 3.8 support

* CI: Re-enable VisIt in CI

* Configure spec matrix for stack with VisIt

* Add pugixml dep for 8.2.0

* Make VTK and ParaView consistent on proj dep

* OpenMPI 3: provides MP support by default

* Add details on proj dep in ParaView

* Add python 3.8 to test mock repo

* Patches to get VisIt VTK interface

* CI: Disable VisIt with GUI in DAV
2023-08-28 16:56:31 -05:00
Brian Vanderwende
a1e117a98b singularityce: arg handling fix (#39663) 2023-08-28 15:52:57 -04:00
Loris Ercole
cb855d5ffd kokkos: avoid setting multiple cuda_arch (#38723)
The kokkos package does not support building with cuda with multiple
`cuda_arch`. An error is raised if multiple `cuda_arch` are specified.
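
A minimal sketch of that guard; illustrative, not the package's actual code:

```python
def validate_cuda_arch(cuda_archs: tuple):
    if len(cuda_archs) > 1:
        raise ValueError(
            f"kokkos does not support multiple cuda_arch values: {cuda_archs}"
        )

validate_cuda_arch(("80",))         # fine
# validate_cuda_arch(("70", "80"))  # raises ValueError
```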
2023-08-28 14:13:05 -04:00
Matthew Thompson
3866ff0096 mapl: add 2.40, deprecate old versions (#39615)
* Add MAPL 2.40, deprecate old versions
* Undo some variant changes
* style fixes
* Make variants reflect Cmake options
* Remove pnetcdf variant
* Clean up depends_on
2023-08-28 11:11:03 -07:00
Todd Gamblin
6dc167e43d security: change SECURITY.md to recommend GitHub's private reporting (#39651)
GitHub's beta private security issue reporting feature is enabled on the Spack repo now,
so we can change `SECURITY.md` to recommend using it instead of `maintainers@spack.io`.

- [x] Update `SECURITY.md` to direct people to the GitHub security tab.
- [x] Update wording in `SECURITY.md` to say "last two major releases" with a link to
      the releases page, instead of explicitly listing release names. This way we don't have
      to update it (which we keep forgetting to do).
2023-08-28 18:06:17 +00:00
Ashwin Kumar Karnad
0fd085be8e octopus: specify the scalapack library not the directory (#39109)
* octopus: specify the scalapack library not the directory
* octopus: use spec.scalapack.libs.ld_flags
2023-08-28 09:57:24 -07:00
Massimiliano Culpo
74fba221f1 GnuPG: add v2.4.3 (#39654)
Also updates a few dependencies
2023-08-28 10:58:16 -04:00
Harmen Stoppels
deeeb86067 Fix a few issues with git versions / moving targets in packages (#39635) 2023-08-28 15:48:13 +02:00
Jen Herting
98daf5b7ec py-scikit-optimize: add v0.9.0 and move gptune patch (#38738)
Co-authored-by: liuyangzhuan <liuyangzhuan@gmail.com>
2023-08-28 07:43:25 -04:00
Harmen Stoppels
8a3d98b632 Remove mxnet from darwin aarch64 since theres no supported release (#39644) 2023-08-28 12:21:10 +02:00
Harmen Stoppels
0cc945b367 libarchive: use openssl by default (#39640) 2023-08-28 09:23:33 +02:00
Stan Tomov
e732155e8c magma: add v2.7.2 (#39645) 2023-08-28 08:57:59 +02:00
Harmen Stoppels
c07fb833a9 libarchive: new versions (#39646) 2023-08-28 08:52:59 +02:00
Harmen Stoppels
7d566b481f libuv: add versions up to v1.46.0 (#39647) 2023-08-28 08:52:13 +02:00
Christopher Christofi
a72e5e762e py-igraph: new package with version 0.10.6 (#39179) 2023-08-27 14:46:21 -05:00
Nichols A. Romero
0eb22ef770 llvm: ensure runtimes set rpaths (#39641)
Ensure that CMAKE_INSTALL_RPATH_USE_LINK_PATH is propagated to sub-make when building runtimes.
2023-08-27 08:41:59 +02:00
Peter Scheibel
95f78440f1 External ROCm: add example configuration (#39602)
* add an example of an external rocm configuration
* include more info
* generalize section to all GPU support
2023-08-26 15:46:25 -07:00
John W. Parent
74a51aba50 CMake: add versions 3.26.4|5 and 3.27.* (#39628)
Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2023-08-26 21:26:41 +02:00
Daryl W. Grunau
b370ecfbda py-scipy: re-apply scipy/scipy#14935 workaround (#39555)
* re-apply scipy/scipy#14935 workaround

* restrict to %intel and eliminate Version() check

---------

Co-authored-by: Daryl W. Grunau <dwg@lanl.gov>
2023-08-26 08:44:57 -05:00
LydDeb
04d55b7600 py-dask: use py-jinja2 versions from 2.10.3 (#39631)
In the depends_on for py-jinja2, require versions from 2.10.3 onward instead of strict equality, per the pyproject.toml of dask 2023.4.1.
2023-08-26 00:56:39 -05:00
Tim Gymnich
d695438851 Update package.py (#39619) 2023-08-26 00:55:03 -05:00
Nichols A. Romero
f0447d63ad llvm: fix +lldb optional deps and add lua variant (#39579)
* llvm: fix +lldb optional deps, add lua variant

* use good old conflicts

---------

Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2023-08-25 17:46:33 -05:00
Mikael Simberg
e8a7a04f14 Add gperftools 2.12 (#39623) 2023-08-25 14:57:00 -07:00
Paul R. C. Kent
23316f0352 QMCPACK: add v3.17.1 (#39626) 2023-08-25 14:55:36 -07:00
Brian Van Essen
b3433cb872 LBANN package: strip hostname on non-LC systems. (#39601) 2023-08-25 14:55:24 -07:00
James Smillie
349ba83bc6 Windows symlinking support (#38599)
This reapplies 66f7540, which adds support for hardlinks/junctions on
Windows systems where developer mode is not enabled.

The commit was reverted on account of multiple issues:

* Checks added to prevent dangling symlinks were interfering with
  existing CI builds on Linux (i.e. builds that otherwise succeed were
  failing for creating dangling symlinks).
* The logic also updated symlinking to perform redirection of relative
  paths, which led to malformed symlinks.

This commit fixes these issues.
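
A sketch of the fallback idea, assuming CPython's Windows-only `_winapi.CreateJunction` helper; the actual implementation differs in detail:

```python
import os

def create_link(src: str, dst: str):
    if not os.path.exists(src):
        raise FileNotFoundError(src)  # refuse to create a dangling link
    try:
        os.symlink(src, dst, target_is_directory=os.path.isdir(src))
    except OSError:
        # No symlink privilege (e.g. developer mode off): fall back.
        if os.path.isdir(src):
            import _winapi
            _winapi.CreateJunction(src, dst)  # directories -> junction
        else:
            os.link(src, dst)  # files -> hardlink
```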
2023-08-25 12:18:19 -07:00
Thomas-Ulrich
ecfd9ef12b add memalign option to petsc to be used with tandem (#37282) 2023-08-25 09:03:53 -05:00
Loris Ercole
4502351659 intel-oneapi-mkl: add hpcx-mpi to the list of supported MPI libs (#39625)
Co-authored-by: Loris Ercole <a-lercole@microsoft.com>
2023-08-25 08:53:22 -04:00
Adam J. Stewart
8a08f09ac0 py-scipy: add v1.11.2 (#39515) 2023-08-25 06:03:22 -04:00
Erik Heeren
60ecd0374e arrow: explicitly disable RE2 and UTF8PROC flags when not in dependencies (#35353)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2023-08-25 09:04:06 +02:00
Richard Berger
52ccee79d8 legion: drop cray-pmi dependency (#39620)
There are other ways to enforce cray-pmi being loaded in environments
that use cray-mpich. This avoids breaking environments where this was
already the case and avoids forcing them to declare an external.
2023-08-25 08:51:38 +02:00
dependabot[bot]
7f0f1b63d6 build(deps): bump actions/checkout from 3.5.3 to 3.6.0 (#39617)
Bumps [actions/checkout](https://github.com/actions/checkout) from 3.5.3 to 3.6.0.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](c85c95e3d7...f43a0e5ff2)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-08-25 08:27:56 +02:00
Adam J. Stewart
b65f1f22ec py-sphinx: add v7.2.3 (#39612) 2023-08-24 18:30:13 -07:00
Richard Berger
e9efa1df75 legion: depend on cray-pmi when using gasnet and cray-mpich (#39614) 2023-08-24 18:17:24 -07:00
Jonas Thies
884a5b8b07 phist: new version 1.12.0 (#39616) 2023-08-24 18:12:42 -07:00
AMD Toolchain Support
91d674f5d0 AOCL spack recipes for 4.1 release w. new AOCL-Utils library (#39489)
* AOCC and AOCL spack recipes for 4.1 release
* Fix broken checksum
* remove blank line
* Add missing `@when` for 4.1 only function

---------

Co-authored-by: vijay kallesh <Vijay-teekinavar.Kallesh@amd.com>
2023-08-24 18:05:39 -07:00
Matt Drozt
76fbb8cd8f [py-smartsim] New Package (#39306)
* Create a smartsim package

* rm ss 0.4.2

* no py upper bound, add build dep

* add setup_build_env

* add comment to find ml deps lower bounds

* Apply suggestions from code review

Correct dep versions, use `python_purelib`

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* remove the cuda/rocm vars

* point editors to bin deps version constraints

* Apply suggestions from code review

Loosen `py-smartredis` constraint, enforce `setup.cfg` py version

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* style

* rm rai lower bound

* lower bound setuptools

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-08-24 18:03:52 -04:00
Wouter Deconinck
0f3f2a8024 py-particle: new versions through 0.23.0 (#39547)
* py-particle: new versions through 0.23.0

* py-particle: depends_on py-deprecated
2023-08-24 16:22:33 -05:00
Frédéric Simonis
5a5f774369 preCICE: prepare for version 3 (#39486) 2023-08-24 14:04:36 -04:00
Adam J. Stewart
f5212ae139 py-sphinx: add v7.2 (#39513) 2023-08-24 12:07:24 -04:00
Sajid Ali
4b618704bf add magma variant to strumpack (#39224)
* add magma variant to strumpack
* clarify conflicts to be exclusive to rocm/cuda; without it +magma+cuda fails as it is ~rocm
* add missing depends_on for magma variant
2023-08-24 08:39:24 -07:00
Harmen Stoppels
46285d9725 zlib-ng: build system patches (#39559) 2023-08-24 12:13:22 +02:00
eugeneswalker
36852fe348 h5bench: add e4s tag (#39595) 2023-08-24 08:58:07 +02:00
Harmen Stoppels
8914d26867 rebuild-index: fix race condition by avoiding parallelism (#39594) 2023-08-24 08:26:56 +02:00
Massimiliano Culpo
fdea5e7624 Remove leftover attributes from parser (#39574)
#35042 introduced lazy hash parsing, but didn't remove a
few attributes from the parser that were needed only for
concrete specs

This commit removes them, since they are effectively
dead code.
2023-08-24 08:04:43 +02:00
dependabot[bot]
ca1e4d54b5 build(deps): bump sphinx from 7.2.2 to 7.2.3 in /lib/spack/docs (#39603)
Bumps [sphinx](https://github.com/sphinx-doc/sphinx) from 7.2.2 to 7.2.3.
- [Release notes](https://github.com/sphinx-doc/sphinx/releases)
- [Changelog](https://github.com/sphinx-doc/sphinx/blob/master/CHANGES)
- [Commits](https://github.com/sphinx-doc/sphinx/compare/v7.2.2...v7.2.3)

---
updated-dependencies:
- dependency-name: sphinx
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-08-24 07:55:14 +02:00
Ryan Mulhall
656528bbbb fms: add v2023.02 and v2023.01.01 (#39434)
* Add 2023.02 and 2023.01.01 with variant for mpp_io and fms_io deprecation

Co-authored-by: rem1776 <Ryan.Mulhall@noaa.gov>
2023-08-24 03:58:50 +02:00
Alan Orth
4d42e9d1f3 perl-http-message: add dependency on perl-clone (#39567)
Clone is a hard dependency as of HTTP-Message v6.44. This causes
problems in packages like Roary which depend on perl-http-message:

    $ spack load roary
    $ roary
    Use of uninitialized value in require at /var/scratch/spack/opt/spack/linux-centos8-x86_64_v3/gcc-8.5.0/perl-http-message-6.44-lzp5th4jddd3gojkjfli4hljgem2nl26/lib/perl5/HTTP/Headers.pm line 8.
    Can't locate Clone.pm in @INC (you may need to install the Clone module) (@INC contains:  /home/aorth/lib ...

See: https://github.com/libwww-perl/HTTP-Message/blob/master/Changes
2023-08-24 03:33:10 +02:00
Julien Cortial
d058c1d649 Add package mpi-test-suite (#39487) 2023-08-24 03:18:58 +02:00
Martin Aumüller
43854fc2ec ffmpeg: apply upstream fix for build with binutils 2.41 (#39392)
While spack does not yet provide binutils 2.41, they might still be
installed. However, building ffmpeg on x86_64 fails with multiple errors like
this:
./libavcodec/x86/mathops.h:125: Error: operand type mismatch for `shr'

also reported here: https://trac.ffmpeg.org/ticket/10405
2023-08-24 03:11:02 +02:00
Julien Cortial
6a2149df6e Make lcov dependency on perl explicit (#39485) 2023-08-24 03:05:00 +02:00
Brent Huisman
af38d097ac Arbor: version added: v0.9.0 (#39374) 2023-08-24 02:59:53 +02:00
Richard Berger
e67dca73d1 lammps: simplify url_for_version logic (#39554)
Use the stable_versions variable to check stable versions
2023-08-24 02:44:39 +02:00
Richard Berger
2e6ed1e707 slurm: add version 23-02-3-1 and 23-02-4-1 (#39598) 2023-08-23 20:43:21 -04:00
Greg Becker
53d2ffaf83 Add amg2023 package; remove amg package (#39105)
Add amg2023 package

Consolidate existing amg and amg2013 packages (they reference the
same code) under the amg2013 name to minimize confusion between
amg2023 and amg2013.

Co-authored-by: Riyaz Haque <haque1@llnl.gov>
2023-08-23 19:48:44 -04:00
svengoldberg
a95e061fed Include t8code package (#39391)
* t8code: Create new spack package

Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2023-08-23 18:48:21 -04:00
Vicente Bolea
e01b9b38ef adios2: correct c-blosc dependency (#39385)
* adios2: correct c-blosc dependency

* c-blosc2: disable oneapi spec
2023-08-23 14:44:26 -07:00
lorddavidiii
eac15badd3 cfitsio: add new version 4.3.0 (#39588) 2023-08-23 17:32:37 -04:00
Massimiliano Culpo
806b8aa966 Uppercase global constants in spec.py (#39573)
* Uppercase global constants in spec.py

Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2023-08-23 21:26:30 +00:00
Greg Becker
9e5ca525f7 do not warn for invalid package name on repo.yaml if subdirectory: '' (#39439) 2023-08-23 23:23:35 +02:00
Simon Pintarelli
5ea4322f88 elpa 2023.05 (#39585) 2023-08-23 22:39:04 +02:00
Harmen Stoppels
4ca2d8bc19 aws-sdk-cpp: disable flaky tests in build (#39565) 2023-08-23 21:59:22 +02:00
Massimiliano Culpo
e0059ef961 ASP-based solver: split heuristic for duplicate nodes (#39593)
The heuristic for duplicate nodes contains a few typos, and
apparently slows down the solve for specs that have a lot of
sub-optimal choices to be taken.

This is likely because with a lot of sub-optimal choices, the
low priority, flawed heuristic is being used by clingo.

Here I split the heuristic, so complex rules that matter only
if we allow multiple nodes from the same package are used
only in that case.
2023-08-23 19:02:20 +00:00
Cameron Rutherford
7d9fad9576 Add kpp2 branch tag and move to GitHub. (#39503) 2023-08-23 08:55:54 -07:00
Richard Berger
553277a84f FleCSI dependency updates (#39557)
* use full cuda_arch list
* update compiler requirement
* update boost requirements
* propagate +kokkos to legion in non-GPU cases
* add missing graphviz dependency for +doc
2023-08-23 15:40:10 +02:00
Harmen Stoppels
00a3ebd0bb llvm: fix elf dep conditions and cmake bug (#39566) 2023-08-23 07:33:23 -04:00
Julien Cortial
ffc9060e11 Add CDT (Constrained Delaunay Triangulation) library (#39488) 2023-08-23 11:29:25 +02:00
Dom Heinzeller
31d5f56913 Add --fail-fast option for generating build caches (#38496)
Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2023-08-23 11:20:33 +02:00
Adam J. Stewart
bfdebae831 py-nbmake: add v1.4.3 (#39514) 2023-08-23 11:13:10 +02:00
eugeneswalker
aa83fa44e1 libyogrt %oneapi: add cflag=-Wno-error=implicit-function-declaration (#39522) 2023-08-23 11:10:37 +02:00
Martin Aumüller
e56291dd45 embree: new versions and ARM support (#38083) 2023-08-23 11:10:07 +02:00
Paul R. C. Kent
2f52545214 QMCPACK: add v3.17.0 (#39523) 2023-08-23 11:08:40 +02:00
Leopold Talirz
5090023e3a xtb: fix la_backend logic (#38309) 2023-08-23 11:08:13 +02:00
Giuseppe Congiu
d355880110 PAPI misc fixes (#39191)
Spack installs the hsa-rocr-dev and rocprofiler packages into different
directories. However, PAPI typically expects those to be under the same
directory and locates that directory through the PAPI_ROCM_ROOT env
variable.

The PAPI rocm component also allows users to override PAPI_ROCM_ROOT to
locate directly the librocprofiler64.so through the HSA_TOOLS_LIB env
variable that acts directly onto the HSA runtime tools mechanism.

Hence, in order to account for the decoupling of hsa and rocprofiler,
this patch sets HSA_TOOLS_LIB to the full path of librocprofiler64.so.
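
In effect, something like the following sketch; the install prefix is a hypothetical placeholder that Spack would compute from the rocprofiler dependency:

```python
import os

def hsa_tools_lib(rocprofiler_prefix: str) -> str:
    # Full path to librocprofiler64.so inside the rocprofiler prefix.
    return os.path.join(rocprofiler_prefix, "lib", "librocprofiler64.so")

os.environ["HSA_TOOLS_LIB"] = hsa_tools_lib("/opt/spack/rocprofiler")
```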
2023-08-23 10:09:54 +02:00
snehring
1a0434b808 sentieon-genomics: adding version 202308 (#39572) 2023-08-23 10:05:03 +02:00
Laura Weber
c3eec8a36f sublime-text: add v4.4152 (#39577) 2023-08-23 10:04:24 +02:00
eflumerf
25b8cf93d2 xmlrpc-c: add build of xmlrpc tools directory (#39580) 2023-08-23 09:41:27 +02:00
Adam J. Stewart
34ff7605e6 py-torchmetrics: add v1.1.0 (#39582) 2023-08-23 09:34:53 +02:00
Adam J. Stewart
e026fd3613 py-lightly: add v1.4.16 (#39583) 2023-08-23 09:34:15 +02:00
Tamara Dahlgren
3f5f4cfe26 docs: API, f-string, and path updates to test section (#39584) 2023-08-23 09:22:25 +02:00
Mikael Simberg
74fe9ccef3 Update version constraint for HIP patch (#39578) 2023-08-23 09:09:14 +02:00
Wouter Deconinck
fd5a8b2075 py-boost-histogram: new version 1.3.2 (#39558) 2023-08-22 19:34:09 -05:00
Wouter Deconinck
33793445cf py-uhi: new versions 0.3.2, 0.3.3 (#39548) 2023-08-22 19:23:52 -05:00
Wouter Deconinck
f4a144c8ac py-hepunits: new version 2.3.2 (#39546) 2023-08-22 19:21:51 -05:00
Vincent Michaud-Rioux
6c439ec022 Update/py pennylane lightning (#39466)
* Update pennylane-lightning.

* Update Lightning-Kokkos to v0.31

* Constrain scipy version.

* Update var/spack/repos/builtin/packages/py-pennylane-lightning-kokkos/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Fix PLK kokkos dep versioning.

* Move kokkos ver outside backend loop and reformat.

* Update package.py

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-08-22 19:14:31 -05:00
genric
209409189a py-xarray: add v2023.7.0 (#39418)
* py-xarray: add v2023.7.0

* address review comments
2023-08-22 19:13:19 -05:00
Tim Haines
ff900566e0 libiberty: add version 2.41 (#39569) 2023-08-22 16:29:21 -04:00
eugeneswalker
a954a0bb9f tau: add v2.32.1 (#39504) 2023-08-22 16:24:05 -04:00
Kamil Iskra
c21e00f504 spack.caches: make fetch_cache_location lowercase (#39575)
fetch_cache_location was erroneously renamed to FETCH_cache_location
as part of #39428, breaking "spack module create".
2023-08-22 16:18:51 -04:00
Massimiliano Culpo
9ae1317e79 ASP-based solver: use edge properties for reused specs (#39508)
Since #34821 we are annotating virtual dependencies on
DAG edges, and reconstructing virtuals in memory when
we read a concrete spec from previous formats.

Therefore, we can remove a TODO in asp.py, and rely on
"virtual_on_edge" facts to be imposed.
2023-08-22 12:07:51 -07:00
Wouter Deconinck
9f1a30d3b5 veccore: new variant vc (#39542)
* veccore: new variants umesimd and vc

* veccore: remove variant umesimd again
2023-08-22 11:20:19 -04:00
Harmen Stoppels
1340995249 clingo-bootstrap: pgo, lto, allocator optimizations (#34926)
Add support for PGO and LTO for gcc, clang and apple-clang, and add a
patch to allow mimalloc as an allocator in operator new/delete, which
reduces clingo runtime by about 30%.
2023-08-22 14:44:07 +02:00
dependabot[bot]
afebc11742 Bump sphinx from 6.2.1 to 7.2.2 in /lib/spack/docs (#39502)
Bumps [sphinx](https://github.com/sphinx-doc/sphinx) from 6.2.1 to 7.2.2.
- [Release notes](https://github.com/sphinx-doc/sphinx/releases)
- [Changelog](https://github.com/sphinx-doc/sphinx/blob/master/CHANGES)
- [Commits](https://github.com/sphinx-doc/sphinx/compare/v6.2.1...v7.2.2)

---
updated-dependencies:
- dependency-name: sphinx
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-08-22 12:02:58 +02:00
Jim Edwards
34e9fc612c parallelio: add v2.6.1 (#39479) 2023-08-22 09:44:06 +02:00
dependabot[bot]
1d8ff7f742 Bump sphinx-rtd-theme from 1.2.2 to 1.3.0 in /lib/spack/docs (#39562)
Bumps [sphinx-rtd-theme](https://github.com/readthedocs/sphinx_rtd_theme) from 1.2.2 to 1.3.0.
- [Changelog](https://github.com/readthedocs/sphinx_rtd_theme/blob/master/docs/changelog.rst)
- [Commits](https://github.com/readthedocs/sphinx_rtd_theme/compare/1.2.2...1.3.0)

---
updated-dependencies:
- dependency-name: sphinx-rtd-theme
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-08-22 09:32:39 +02:00
Axel Huebl
0e27f05611 openPMD-api: 0.15.2 (#39528)
Latest patch release.
2023-08-21 18:31:53 -07:00
Axel Huebl
19aaa97ff2 C-Blosc2: v2.10.2 (#39537)
This fixes regressions with external dependencies introduced
in 2.10.1
2023-08-21 18:29:58 -07:00
Kamil Iskra
990309355f [bats] Update to v1.10.0 (#39561)
* [bats] Update to v1.10.0
  Original bats has been abandoned for years; nowadays it's
  community-maintained.
* [bats] Fix style
2023-08-21 18:25:35 -07:00
eugeneswalker
2cb66e6e44 tasmanian@:7.9 +cuda conflicts with cuda@12 (#39541) 2023-08-21 13:08:12 -07:00
David Gardner
cfaade098a add sundials v6.6.0 (#39526) 2023-08-21 10:48:58 -07:00
Wouter Deconinck
ed65532e27 singularityce: fix after Executable no longer allows spaces (#39553) 2023-08-21 13:43:11 -04:00
Dr Marco Claudio De La Pierre
696d4a1b85 amdblis recipe: adding variant for a performance flag (#39549)
* amdblis recipe: adding variant for a performance flag

Signed-off-by: Dr Marco De La Pierre <marco.delapierre@gmail.com>

* typo fix

Signed-off-by: Dr Marco De La Pierre <marco.delapierre@gmail.com>

* style fix

Signed-off-by: Dr Marco De La Pierre <marco.delapierre@gmail.com>

* another typo fix

Signed-off-by: Dr Marco De La Pierre <marco.delapierre@gmail.com>

* one more style fix

Signed-off-by: Dr Marco De La Pierre <marco.delapierre@gmail.com>

---------

Signed-off-by: Dr Marco De La Pierre <marco.delapierre@gmail.com>
2023-08-21 10:16:39 -07:00
Taillefumier Mathieu
8def75b414 Update cp2k recipe (#39128)
Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>
Co-authored-by: Rocco Meli <r.meli@bluemail.ch>
2023-08-21 17:09:28 +02:00
Harmen Stoppels
5389db821d aws-sdk-cpp: new versions and greatly reduce install size (#39511) 2023-08-21 14:52:21 +02:00
Wouter Deconinck
0d5ae3a809 cepgen: new version 1.1.0 (#39550) 2023-08-21 08:08:05 +02:00
Wouter Deconinck
b61ad8d2a8 whizard: new versions 3.1.1 and 3.1.2 (#39540) 2023-08-21 08:07:36 +02:00
Wouter Deconinck
b35db020eb zlib: new version 1.3 (#39539) 2023-08-21 08:06:41 +02:00
Wouter Deconinck
ca1d15101e zlib: add git url (#39533) 2023-08-21 08:04:28 +02:00
Brian Spilner
c9ec5fb9ac cdo: add v2.2.2 (#39506)
* update cdo-2.2.2

* add note on hdf5
2023-08-18 15:46:50 -07:00
Todd Gamblin
71abb8c7f0 less: update version, dependencies (#39521) 2023-08-18 15:43:54 -07:00
John W. Parent
4dafae8d17 spack.bat: Fixup CL arg parsing (#39359)
Previous changes to this file stopped directly processing CL args to
stop batch `for` from interpolating batch reserved characters needed in
arguments like URLS. In doing so, we relied on `for` for an easy
"split" operation, however this incorrectly splits paths with spaces in
certain cases. Processing everything ourselves with manual looping via
`goto` statements allows for full control over CL parsing and handling
of both paths with spaces and reserved characters.
2023-08-18 14:29:47 -07:00
Jen Herting
b2b00df5cc [py-gradio-client] New package (#39496) 2023-08-18 12:03:37 -05:00
Jen Herting
114e5d4767 [py-openmim] Beginning work (#39494) 2023-08-18 12:00:48 -05:00
Jen Herting
fd70e7fb31 [py-confection] new package (#39491) 2023-08-18 11:59:47 -05:00
George Young
77760c8ea4 py-multiqc: add 1.15, correct py-matplotlib dependency (#39509)
Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2023-08-18 11:56:20 -05:00
Jen Herting
737a6dcc73 py-markdown-it-py: add linkify variant and v2.2.0 (#39492)
* [py-markdown-it-py] added linkify variant

* [py-markdown-it-py] added version 2.2.0
2023-08-18 11:14:15 -05:00
Seth R. Johnson
3826fe3765 fortrilinos: release 2.3.0 compatible with trilinos@14 (#39500)
* fortrilinos: release 2.3.0 compatible with trilinos@14.0
2023-08-18 12:10:00 -04:00
Benjamin Meyers
edb11941b2 New: py-alpaca-farm, py-alpaca-eval, py-tiktoken; Updated: py-accelerate, py-transformers (#39432) 2023-08-18 10:29:12 -05:00
Axel Huebl
1bd58a8026 WarpX 23.08 (#39407)
* WarpX 23.08

Update WarpX and related Python packages to the latest releases.

* fix style

---------

Co-authored-by: eugeneswalker <eugenesunsetwalker@gmail.com>
2023-08-18 10:03:47 -05:00
Jordan Galby
f8e0c8caed Fix Spack freeze on install child process unexpected exit (#39015)
* Fix spack freezing when a child process becomes defunct

* Rename parent/child pipe to read/write to emphasize non-duplex mode (illustrated below)
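
For reference, `multiprocessing.Pipe(duplex=False)` hands back a receive-only end and a send-only end, which is what the read/write naming reflects; a small sketch, not Spack's code:

```python
import multiprocessing as mp

def child(write_conn):
    write_conn.send("done")
    write_conn.close()

if __name__ == "__main__":
    # duplex=False: the first end can only recv(), the second only send().
    read_conn, write_conn = mp.Pipe(duplex=False)
    p = mp.Process(target=child, args=(write_conn,))
    p.start()
    print(read_conn.recv())  # "done"
    p.join()
```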
2023-08-18 09:41:02 +02:00
Vijay M
d0412c1578 Updating the versions of available tarballs and adding an eigen3 variant as well (#39498) 2023-08-17 22:23:32 -04:00
Harmen Stoppels
ec500adb50 zlib-api: use zlib-ng +compat by default (#39358)
In the HPC package manager, we want the fastest `zlib` implementation by default. `zlib-ng` is up to 4x faster than stock `zlib`, and it can do things like take advantage of AVX-512 instructions. This PR makes `zlib-ng` the default `zlib-api` provider (`zlib-api` was introduced earlier, in #37372).

As far as I can see, the only issues you can encounter are:

1. Build issues with packages that heavily rely on `zlib` internals. In Gitlab CI only one out of hundreds of packages had that issue (it extended zlib with deflate stuff, and used its own copy of zlib sources).
2. Packages that like to detect `zlib-ng` separately and rely on `zlib-ng` internals. The only issue I've found among the hundreds of packages built in CI is `perl`, which tried to report more specific zlib-ng version details and relied on some internals that got refactored. But yeah... that warrants a patch / conflict and is nothing special.

At runtime, you cannot really have any issues, given that zlib and zlib-ng export the exact same symbols (and zlib-ng tests this in their CI).

You can't really have issues with externals when using zlib-ng either. The only type of issue is when system zlib is rather new, and not marked as external; if another external uses new symbols, and Spack builds an older zlib/zlib-ng, then the external might not find the new symbols. But this is a configuration issue, and it's not an issue caused by zlib-ng, as the same would happen with older Spack zlib.

* zlib-api: use zlib-ng +compat by default
* make a trivial change to zlib-ng to trigger a rebuild
* add `haampie` as maintainer
2023-08-17 14:03:14 -07:00
Harmen Stoppels
30f5c74614 py-flake8: bump including deps (#39426) 2023-08-17 21:02:11 +02:00
Massimiliano Culpo
713eb210ac zig: add v0.11.0 (#39484) 2023-08-17 19:18:00 +02:00
Massimiliano Culpo
a022e45866 ASP-based solver: optimize key to intermediate dicts (#39471)
Computing str(spec) is faster than computing hash(spec), and
since all the abstract specs we deal with come from user configuration
they cannot cover DAG structures that are not captured by str() but
are captured by hash().
2023-08-17 14:11:49 +02:00
eugeneswalker
82685a68d9 boost %oneapi: add cxxflags -Wno-error=enum-constexpr-conversion (#39477) 2023-08-17 07:49:20 -04:00
dependabot[bot]
b19691d503 Bump mypy from 1.5.0 to 1.5.1 in /lib/spack/docs (#39478)
Bumps [mypy](https://github.com/python/mypy) from 1.5.0 to 1.5.1.
- [Commits](https://github.com/python/mypy/compare/v1.5.0...v1.5.1)

---
updated-dependencies:
- dependency-name: mypy
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-08-17 12:35:49 +02:00
eugeneswalker
54ea860b37 gmsh %oneapi: add cflag: -Wno-error=implicit-function-declaration (#39476) 2023-08-17 06:09:07 -04:00
eugeneswalker
fb598baa53 py-ruamel-yaml-clib %oneapi: -Wno-error=incompatible-function-pointer-types (#39480) 2023-08-17 11:59:45 +02:00
eugeneswalker
02763e967a sz %oneapi: add cflag=-Wno-error=implicit-function-declaration (#39467) 2023-08-17 02:36:17 -07:00
Seth R. Johnson
2846be315b kokkos: use 'when' instead of 'conflicts' for CUDA variants (#39463) 2023-08-17 05:12:52 -04:00
eugeneswalker
4818b75814 silo %oneapi: add cflags: -Wno-error=int, -Wno-error=int-conversion (#39475) 2023-08-17 04:42:58 -04:00
eugeneswalker
b613bf3855 py-matplotlib %oneapi: add cxxflags=-Wno-error=register (#39469) 2023-08-17 04:13:09 -04:00
Adam J. Stewart
3347372a7b py-deepspeed: add new package (#39427) 2023-08-17 08:47:40 +02:00
Sergey Kosukhin
c417a77a19 snappy: patch and conflict for %nvhpc (#39063) 2023-08-17 08:44:38 +02:00
Martin Aumüller
90d0d0176c botan: version 3 requires newer GCC (#39450)
e.g. 3.1.1 produces this during configuration when trying to install:
  ERROR: This version of Botan requires at least gcc 11.0
2023-08-17 08:34:43 +02:00
Adam J. Stewart
72b9f89504 py-setuptools: document Python 3.12 support (#39449) 2023-08-17 08:31:53 +02:00
Peter Scheibel
a89f1b1bf4 Add debugging statements to file search (#39121)
Co-authored-by: Scheibel <scheibel1@ml-9983616.the-lab.llnl.gov>
2023-08-17 08:31:06 +02:00
Adam J. Stewart
c6e26251a1 py-lightning: add v2.0.7 (#39468) 2023-08-17 08:19:50 +02:00
Harmen Stoppels
190a1bf523 Delay abstract hashes lookup (#39251)
Delay lookup for abstract hashes until concretization time, instead of
until Spec comparison. This has a few advantages:

1. `satisfies` / `intersects` etc don't always know where to resolve the
   abstract hash (in some cases it's wrong to look in the current env,
   db, buildcache, ...). Better to let the call site dictate it.
2. Allows search by abstract hash without triggering a database lookup,
   causing quadratic complexity issues (accidental nested loop during
   search)
3. Simplifies queries against the buildcache, they can now use Spec
   instances instead of strings.

The rules are straightforward (sketched in code below):

1. a satisfies b when b's hash is prefix of a's hash
2. a intersects b when either a's or b's hash is a prefix of b's or a's
   hash respectively
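
The prefix rules are small enough to sketch directly; helper names are illustrative:

```python
def hash_satisfies(a_hash: str, b_hash: str) -> bool:
    # a satisfies b when b's (possibly abbreviated) hash is a prefix of a's.
    return a_hash.startswith(b_hash)

def hash_intersects(a_hash: str, b_hash: str) -> bool:
    # a intersects b when either hash is a prefix of the other.
    return a_hash.startswith(b_hash) or b_hash.startswith(a_hash)

assert hash_satisfies("abcdef123456", "abcdef")
assert hash_intersects("abc", "abcdef123456")
```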
2023-08-17 08:08:50 +02:00
eugeneswalker
e381e166ec py-numcodecs %oneapi: add cflags -Wno-error=implicit-function-declaration (#39454) 2023-08-16 15:52:01 -05:00
eugeneswalker
2f145b2684 pruners-ninja cflags: -Wno-error=implicit-function-declaration (#39452) 2023-08-16 15:51:38 -05:00
eugeneswalker
4c7748e954 amrex+sycl: restructure constraint on %oneapi (#39451) 2023-08-16 11:49:32 -06:00
eugeneswalker
86485dea14 hdf5-vol-cache %oneapi: cflags: add -Wno-error=incompatible-function-pointer-types (#39453) 2023-08-16 09:56:09 -06:00
Todd Kordenbrock
00f8f5898a faodel: update github URL organization to sandialabs (#39446) 2023-08-16 07:25:47 -07:00
Massimiliano Culpo
f41d7a89f3 Extract Package from PackageNode for error messages 2023-08-16 06:20:57 -07:00
Harmen Stoppels
4f07205c63 Avoid sort on singleton list during edge insertion (#39458)
The median length of this list is 1. For reasons I don't know, `.sort()`
still likes to call the key function.

This saves ~9% of total database read time, and the number of calls
goes from 5305 -> 1715.
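
The fix amounts to guarding the sort; a sketch with an illustrative helper:

```python
def add_edge_sorted(edges: list, edge, key):
    edges.append(edge)
    # list.sort() computes the key even for a single element, so skip
    # the call entirely when it cannot change the order.
    if len(edges) > 1:
        edges.sort(key=key)
```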
2023-08-16 14:33:03 +02:00
Massimiliano Culpo
08f9c7670e Do not impose provider conditions, if the node is not a provider (#39456)
* Do not impose provider conditions, if the node is not a provider

fixes #39455

When a node can be a provider of a spec, but is not selected as
a provider, we should not be imposing provider conditions on the
virtual.

* Adjust the integrity constraint, by using the correct atom
2023-08-16 05:15:13 -07:00
Harmen Stoppels
b451791336 json: minify by default (#39457) 2023-08-16 11:26:40 +00:00
Massimiliano Culpo
47f176d635 Add new custom markers to unit tests (#33862)
* Add "only_clingo", "only_original" and "not_on_windows" markers

* Modify tests to use the "not_on_windows" marker

* Mark tests that run only with clingo

* Mark tests that run only with the original concretizer
2023-08-16 09:04:10 +02:00
Patrick Bridges
b6ae751657 Fixed HeFFTe package spec to not do the smoke test prior to 2.2.0 (#39435)
* Fixed HeFFTe package spec to not do the smoke test prior to 2.2.0, where it breaks
* Convert test return to 'raise SkipTest'

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2023-08-15 21:03:14 -04:00
Massimiliano Culpo
9bb5cffc73 Change semantics for providers
If a possible provider is not used to satisfy a vdep,
then it's not a provider of that vdep.
2023-08-15 15:54:37 -07:00
Massimiliano Culpo
135b44ca59 Change "effect_rules" for symmetry with trigger rules
This is done even though, right now, we don't have cases where
the effect is on another package.
2023-08-15 15:54:37 -07:00
Massimiliano Culpo
d3aca68e8f Rework conflicts so that "vendors" is not needed anymore 2023-08-15 15:54:37 -07:00
Massimiliano Culpo
fb83f8ef31 Add a description at the top of lp files 2023-08-15 15:54:37 -07:00
Massimiliano Culpo
f69c18a922 Remove commented out code in lp files 2023-08-15 15:54:37 -07:00
Massimiliano Culpo
b95a9d2e47 Reduce line length in lp file 2023-08-15 15:54:37 -07:00
Massimiliano Culpo
def4d19980 Demote warning to debug message 2023-08-15 15:54:37 -07:00
Massimiliano Culpo
1db91e0ccd Rename "main_node" -> "make_node" 2023-08-15 15:54:37 -07:00
Massimiliano Culpo
34ebe7f53c Rename "*_node" -> "*_dupe" 2023-08-15 15:54:37 -07:00
Massimiliano Culpo
d07d5410f3 Rename "stringify", improve docs 2023-08-15 15:54:37 -07:00
Massimiliano Culpo
1db73eb1f2 Add vendors directive
For the time being this directive prevents the vendored package
from being in the same DAG as the one vendoring it.
2023-08-15 15:54:37 -07:00
Massimiliano Culpo
2da34de519 Add "^" automatically for named conflicts that don't refer to 'this' package
See https://github.com/spack/spack/pull/38447#discussion_r1285291520
2023-08-15 15:54:37 -07:00
Massimiliano Culpo
d237430f47 Inline a few functions that are not needed anymore 2023-08-15 15:54:37 -07:00
Massimiliano Culpo
3f0adae9ef Remove the need for "node_regex" 2023-08-15 15:54:37 -07:00
Massimiliano Culpo
3b4d7bf119 Rename method: "root_node" -> "main_node" 2023-08-15 15:54:37 -07:00
Massimiliano Culpo
b3087b32c6 Rename const: "root_node_id" -> "main_node_id" 2023-08-15 15:54:37 -07:00
Massimiliano Culpo
ad9c90cb2e Rename atom: "special_case" -> "multiple_nodes_attribute" 2023-08-15 15:54:37 -07:00
Massimiliano Culpo
1b0e113a9d Rename atom: "facts" -> "pkg_fact" 2023-08-15 15:54:37 -07:00
Massimiliano Culpo
6df5738482 Simplify "node_has_variant" internal atom. 2023-08-15 15:54:37 -07:00
Massimiliano Culpo
927d831612 Removed leftover TODOs 2023-08-15 15:54:37 -07:00
Massimiliano Culpo
3f3c75e56a ecp-data-vis-sdk: fix building with new encoding 2023-08-15 15:54:37 -07:00
Massimiliano Culpo
9733bb3da8 We cannot require "mpich" as an mpi provider and ask for openmpi 2023-08-15 15:54:37 -07:00
Massimiliano Culpo
1de5117ef1 Improve handling of cases with cycles
To avoid paying the cost of setup and of a full grounding again,
move cycle detection into a separate program and check first if
the solution has cycles.

If it has, ground only the integrity constraint preventing cycles
and solve again.
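
The cheap first phase only needs a plain cycle check over the candidate solution's dependency edges; a standard DFS sketch:

```python
def has_cycle(edges) -> bool:
    graph = {}
    for parent, child in edges:
        graph.setdefault(parent, []).append(child)
    done, in_progress = set(), set()

    def visit(node) -> bool:
        in_progress.add(node)
        for nxt in graph.get(node, []):
            if nxt in in_progress:
                return True  # back edge: cycle found
            if nxt not in done and visit(nxt):
                return True
        in_progress.discard(node)
        done.add(node)
        return False

    return any(visit(n) for n in list(graph) if n not in done)

assert has_cycle([("a", "b"), ("b", "a")])
assert not has_cycle([("a", "b"), ("b", "c")])
```

Only when a cycle is found does the solver ground the integrity constraint and solve again.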
2023-08-15 15:54:37 -07:00
Massimiliano Culpo
cf8f44ae5a Fix ecp-data-vis-sdk: a package cannot impose constraints on transitive deps 2023-08-15 15:54:37 -07:00
Massimiliano Culpo
006e69265e Optimize grounding of "can_inherit_flags" 2023-08-15 15:54:37 -07:00
Massimiliano Culpo
eaec3062a1 Fix computation of max nodes 2023-08-15 15:54:37 -07:00
Massimiliano Culpo
d5eb5106b0 Add unit-tests for use cases requiring separate concretization of build deps 2023-08-15 15:54:37 -07:00
Massimiliano Culpo
9f8edbf6bf Add a new configuration option to select among different concretization modes
The "concretizer" section has been extended with a "duplicates:strategy"
attribute, that can take three values:

- "none": only 1 node per package
- "minimal": allow multiple nodes opf specific packages
- "full": allow full duplication for a build tool
2023-08-15 15:54:37 -07:00
Massimiliano Culpo
a4301badef Fix a few bugs in the encoding when imposing constraints on build deps only
These bugs would show up when we try to split nodes by
imposing different targets or different compilers to all
build dependencies.
2023-08-15 15:54:37 -07:00
Massimiliano Culpo
4565811556 Construct unification sets on demand, improve heuristic 2023-08-15 15:54:37 -07:00
Massimiliano Culpo
b94d54e4d9 Reduce the number of unification sets to only two 2023-08-15 15:54:37 -07:00
Massimiliano Culpo
a410b22098 Make cycle detection optional, to speed-up grounding and solving 2023-08-15 15:54:37 -07:00
Massimiliano Culpo
c1a73878ea Deduplicate trigger and effect conditions in packages
This refactor introduces extra indices for triggers and
effects of a condition, so that the corresponding clauses
are evaluated once for every condition they apply to.
2023-08-15 15:54:37 -07:00
Massimiliano Culpo
ae553051c8 Extract a function to emit variant rules 2023-08-15 15:54:37 -07:00
Massimiliano Culpo
b94e22b284 ASP-based solver: do not optimize on known dimensions
All the solution modes we use imply that we have to solve for all
the literals, except for "when possible".

Here we remove a minimization on the number of literals not
solved, and emit directly a fact when a literal *has* to be
solved.
2023-08-15 15:54:37 -07:00
Massimiliano Culpo
e25dcf73cd Tweak a unit test by allowing a different type of exception to be raised 2023-08-15 15:54:37 -07:00
Massimiliano Culpo
b7cc4bd247 Reduce the dependency types in a solve
Count the maximum number of nodes based on dependency types
2023-08-15 15:54:37 -07:00
Massimiliano Culpo
22c95923e3 Parametrize all the logic program for multiple nodes
Introduce the concept of "condition sets", i.e. the set of packages on which
a package can require / impose conditions. This currently maps to the link/run
sub-dag of each package + its direct build dependencies.

Parametrize the "condition" and "requirement" logic to multiple nodes.
2023-08-15 15:54:37 -07:00
Massimiliano Culpo
c050b99a06 Introduce unification sets
Unification sets are possibly overlapping sets of nodes that
might contain at most a single configuration for any package.
2023-08-15 15:54:37 -07:00
Massimiliano Culpo
60f82685ae Allow clingo to generate edges 2023-08-15 15:54:37 -07:00
Massimiliano Culpo
27ab53b68a Rework the encoding to introduce node(ID, Package) nested facts
So far the encoding has a single ID per package, i.e. all the
facts will be node(0, Package). This will prepare the stage for
extending this logic and having multiple nodes from the same
package in a DAG.
2023-08-15 15:54:37 -07:00
Massimiliano Culpo
907a80ca71 Remove unneeded #defined directives 2023-08-15 15:54:37 -07:00
Massimiliano Culpo
a53cc93016 Remove useless rule
The version_equivalent fact was deleted in #36347,
but the corresponding rule was not removed.
2023-08-15 15:54:37 -07:00
Massimiliano Culpo
6ad0dc3722 Transform many package related facts to use a nested function
Each fact that is deduced from package rules, and starts with
a bare package atom, is transformed into a "facts" atom containing
a nested function.

For instance we transformed

  version_declared(Package, ...) -> facts(Package, version_declared(...))

This allows us to clearly mark facts that represent a rule on the package,
and will be of help later when we'll have to distinguish the cases where
the atom "Package" is being used referred to package rules and not to a
node in the DAG.
2023-08-15 15:54:37 -07:00
afzpatel
87d4bdaa02 adding new rpp package (#38942)
* initial commit for adding new rpp package
* fix styling
2023-08-15 14:04:35 -07:00
Mikael Simberg
36394aab2f Add mold 2.1.0 (#39443) 2023-08-15 09:50:47 -07:00
Larry Knox
358947fc03 Add version HDF5 1.14.2 (#39409)
* Add version HDF5 1.14.2
2023-08-15 11:21:21 -05:00
Andrey Prokopenko
477a3c0ef6 arborx: new version and patch for 1.4 with Trilinos (#39438)
* arborx: patch 1.4 for Trilinos 14.0

* arborx: version 1.4.1
2023-08-15 06:46:41 -07:00
Mikael Simberg
c6c5e11353 Add gperftools 2.11 (#39440) 2023-08-15 10:41:12 +02:00
Massimiliano Culpo
29e2997bd5 spack.caches: uppercase global variables (#39428) 2023-08-15 09:59:02 +02:00
Bryan Herman
41bd6a75d5 squashfs: Add static variant (#39433) 2023-08-15 09:54:27 +02:00
Harmen Stoppels
0976ad3184 spack.config: use all caps for globals (#39424) 2023-08-15 08:19:36 +02:00
Mark W. Krentel
fc1d9ba550 intel-xed: add version 2023.07.09 (#39436) 2023-08-14 22:33:10 -04:00
Richard Berger
61f0088a27 Spiner and ports-of-call updates (#39437)
* ports-of-call: new version 1.5.2
* spiner: new version 1.6.2
2023-08-14 22:22:52 -04:00
markus-ferrell
c202a045e6 Windows: executable/path handling (#37762)
Windows executable paths can have spaces in them, which was leading to
errors when constructing Executable objects: the parser was intended
to handle cases where users could provide an executable along with one
or more space-delimited arguments.

* Executable now assumes that it is constructed with a string argument
  that represents the path to the executable, but no additional arguments.
* Invocations of Executable.__init__ that depended on this have been
  updated (this includes the core, tests, and one instance of builtin
  repository package).
* The error handling for failed invocations of Executable.__call__ now
  includes a check for whether the executable name/path contains a
  space, to help users debug cases where they (now incorrectly)
  concatenate the path and the arguments (see the sketch below).
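
A condensed sketch of the described behavior; not the real class:

```python
import subprocess

class Executable:
    def __init__(self, path: str):
        # Only the path to the executable; arguments come at call time.
        self.path = path

    def __call__(self, *args: str) -> str:
        try:
            return subprocess.check_output([self.path, *args], text=True)
        except (OSError, subprocess.CalledProcessError) as err:
            hint = ""
            if " " in self.path:
                hint = (
                    " (path contains a space: did you concatenate the "
                    "executable and its arguments?)"
                )
            raise RuntimeError(f"failed to run {self.path!r}{hint}") from err
```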
2023-08-14 23:29:12 +00:00
Bryan Herman
843e1e80f0 fuse-overlayfs needs pkgconfig to build (#39431) 2023-08-14 19:19:07 -04:00
Paul Kuberry
643c028308 xyce: Disable CMake test for OneAPI (#39429) 2023-08-14 13:27:33 -07:00
markus-ferrell
d823037c40 Windows: enable "spack install" tests (#34696)
* The module-level skip for tests in `cmd.install` on Windows is removed.
  A few classes of errors still persist:

  * Cdash tests are not working on Windows
  * Tests for failed installs are also not working (this will require
    investigating bugs in output redirection)
  * Environments are not yet supported on Windows

  overall though, this enables testing of most basic uses of "spack install"
* Git repositories cached for version lookups were using a layout that
  mimicked the URL as much as possible. This was useful for listing the
  cache directory and understanding what was present at a glance, but
  the paths were overly long on Windows. On all systems, the layout is
  now a single directory based on a hash of the Git URL and is shortened
  (which ensures a consistent and acceptable length, and also avoids
  special characters; see the sketch after this list).
  * In particular, this removes util.url.parse_git_url and its associated
    test, which were used exclusively for generating the git cache layout
* Bootstrapping is now enabled for unit tests on Windows
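
A sketch of deriving such a cache directory name; the hash algorithm and length are illustrative assumptions:

```python
import hashlib

def git_cache_dir(git_url: str, length: int = 7) -> str:
    # One short, fixed-length directory per URL: consistent length,
    # no special characters, safely under Windows path limits.
    return hashlib.sha256(git_url.encode()).hexdigest()[:length]

print(git_cache_dir("https://github.com/spack/spack.git"))
```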
2023-08-14 13:15:40 -07:00
Bryan Herman
4d945be955 fixed build of libfuse need werror (#39430) 2023-08-14 14:39:40 -04:00
eugeneswalker
a4ac3f2767 trilinos@14.4.0 +kokkos: use external kokkos (#39397) 2023-08-14 14:29:13 -04:00
Massimiliano Culpo
6e31676b29 Fix style issues with latest versions of tools (#39422) 2023-08-14 12:38:59 -04:00
Mikael Simberg
1fff0241f2 Add Boost 1.83.0 (#39415) 2023-08-14 12:28:44 -04:00
Harmen Stoppels
a2a52dfb21 Fix containerize view symlink issue (#39419) 2023-08-14 16:02:48 +00:00
Mikael Simberg
f0ed159a1b Add patch to build silo on nixos (#39375)
Change the shebang in mkinc from /usr/bin/perl to /usr/bin/env perl for
portability to systems that don't necessarily have perl in /usr/bin.
Also adds perl as a build-time dependency.
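
The patch boils down to a one-line substitution; a sketch, not the actual patch file:

```python
import re

def fix_shebang(path: str = "mkinc"):
    with open(path) as f:
        text = f.read()
    # Resolve perl through the environment instead of a fixed path.
    text = re.sub(r"\A#!/usr/bin/perl", "#!/usr/bin/env perl", text)
    with open(path, "w") as f:
        f.write(text)
```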
2023-08-14 17:03:13 +02:00
Mikael Simberg
9bf7fa0067 Add fmt 10.1.0 (#39413) 2023-08-14 17:01:34 +02:00
Mikael Simberg
fbaea0336e mold: add v2.0.0 (#39416) 2023-08-14 10:53:34 -04:00
Brian Van Essen
1673d3e322 Added package for the hipTT library (#39388) 2023-08-14 16:11:30 +02:00
Wouter Deconinck
c7cca3aa8d dd4hep: new version 1.26 (#39230)
Release notes: https://github.com/AIDASoft/DD4hep/releases/tag/v01-26

Diff: https://github.com/AIDASoft/DD4hep/compare/v01-25-01...v01-26

Relevant changes:
- updated requirements on cmake,
- can now support compressed hepmc3.

---------

Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2023-08-14 15:56:40 +02:00
Harmen Stoppels
da46b63a34 Fix broken semver regex (#39414) 2023-08-14 15:38:06 +02:00
Harmen Stoppels
c882214273 spack bootstrap dev: detect git as an external (#39417)
#36770 added git as a dependency to `setuptools-scm`. This in turn makes `git` a
transitive dependency for our bootstrapping process. 

Since `git` may take a long time to build, and is found on most systems, try to 
detect it as an external.
2023-08-14 10:29:15 +00:00
Pariksheet Nanda
2bacab0402 arrow: fix conditional spec expression (#38212) (#38213) 2023-08-14 09:27:50 +02:00
Hao Lyu
0681d9a157 wrf: add v4.5.1 (#39291) 2023-08-14 09:20:56 +02:00
Audrius Kalpokas
887847610e gromacs: add plumed 2.8.3-2.9.0 variants (#39411) 2023-08-14 09:17:16 +02:00
Pariksheet Nanda
282a01ef76 py-scikits-odes: add a new package (#39061) 2023-08-14 02:00:03 +00:00
Adam J. Stewart
151c551781 Python: fix Apple Clang +optimizations build (#39412) 2023-08-13 21:47:47 -04:00
Hariharan Devarajan
abbd1abc1a release 0.0.1 py-dlio-profiler-py (#39377) 2023-08-13 12:20:58 -05:00
Wouter Deconinck
49c505cc14 py-pyliblzma: remove python2 package (#39399)
Last release 13 years ago, before Python 3.2 was released. No dependents or next of kin could be located.
2023-08-13 11:57:52 -05:00
Sergey Kosukhin
237a56a305 autotools: set 'ldlibs' as 'LIBS' (#17254) 2023-08-13 00:38:10 -07:00
Harmen Stoppels
7e7e6c2797 bazel: pretty print is all but pretty (#35963)
* bazel: pretty print is all but pretty
* Don't ask bazel to explain what it's up to
2023-08-12 20:43:09 -04:00
Alex Richert
e67c61aac0 Allow static-only openssl on Linux (#38098) 2023-08-12 20:42:53 +02:00
Scott Wittenburg
1b1ed1b1fa ci: continue to support SPACK_SIGNING_KEY (#39170) 2023-08-12 08:52:46 -07:00
snehring
ec0e51316b usalign: update checksum (#39400) 2023-08-12 08:40:44 -07:00
Vicente Bolea
533821e46f adios2: add mgard variant (#39405) 2023-08-12 08:38:33 -07:00
Wouter Deconinck
6c5d125cb0 cernlib: new variant shared (#39406)
This allows users to install cernlib with static libraries only, instead of both static and shared.
2023-08-12 08:34:48 -07:00
Richard Berger
668fb1201f lammps: new stable versions, deprecate older patch versions (#39317) 2023-08-12 09:57:38 +02:00
Adam J. Stewart
f7918fd8ab Python: remove maintainer (#39384) 2023-08-12 09:08:25 +02:00
Axel Huebl
fc1996e0fa WarpX 23.07 (#39108)
* WarpX 23.07

Update WarpX and related Python packages to the latest releases.

* py-picmistandard: refresh hashes

Keeping even a single `git` version in here confuses the fetcher.
Remove broken old versions.

* Remove `py-warpx` versions with broken PICMI
2023-08-11 17:35:04 -07:00
eugeneswalker
ed3aaafd73 e4s cray sles ci: expand spec list (#39081)
* e4s cray sles ci: expand spec list

* flux-core: disable until python+gary-sles issue is resolved
2023-08-11 23:42:10 +00:00
Wouter Deconinck
63bb2c9bad py-cryptography: does not run-depend on py-setuptools-rust (#39386)
* py-cryptography: does not run-depend on py-setuptools-rust

* py-cryptography: depends_on py-setuptools-rust when @3.4.2:

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-cryptography: re-add depends_on type=run for narrow range

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-08-11 17:53:25 +00:00
Alex Richert
a67455707a ncl: add support for grib (#39277)
Co-authored-by: Alex Richert <alexander.richert@noaa.gov>
2023-08-11 08:42:08 +00:00
Cristian Di Pietrantonio
09ca71dbe0 kokkos: rename 'std' variant to 'cxxstd'. (#39319)
Co-authored-by: Cristian Di Pietrantonio <cdipietrantonio@pawsey.org.au>
2023-08-11 07:18:02 +00:00
Todd Gamblin
ea082539e4 Revert "package import: remove magic import line (#39183)" (#39380)
This reverts commit 8e7c53a8ba.
2023-08-11 09:04:40 +02:00
Massimiliano Culpo
143146f4f3 spack.repo: uppercase the global PATH variable (#39372)
This makes the name of the global variable representing
the repository currently in use uppercase. Doing so is advised
by pylint rules, and helps to identify where the global is used.
2023-08-11 09:04:16 +02:00
Melven Roehrig-Zoellner
ee6ae402aa qt: fedora patch missing includes in qtlocation (#38987) 2023-08-11 08:10:06 +02:00
Adam J. Stewart
0b26b26821 GDAL: fix supported versions of HDF5 (#39378) 2023-08-11 08:02:15 +02:00
dependabot[bot]
c764f9b1ab build(deps): bump mypy from 1.4.1 to 1.5.0 in /lib/spack/docs (#39383)
Bumps [mypy](https://github.com/python/mypy) from 1.4.1 to 1.5.0.
- [Commits](https://github.com/python/mypy/compare/v1.4.1...v1.5.0)

---
updated-dependencies:
- dependency-name: mypy
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-08-11 08:01:34 +02:00
William R Tobin
db19d83ea7 essl: add +lapackforessl variant (#39362) 2023-08-11 08:01:02 +02:00
Gregor Daiß
24256be6d6 sgpp: fix build issues (#39086)
* fix(sgpp): Fix installation phase scons args

* fix(sgpp): Workaround for distutils deprecation

The distutils deprecation warning in Python 3.10 - 3.11 caused
the SGpp SConfigure checks to fail when looking for Python.h.
This commit works around that by adding a patch that simply
disables the warning. It also caps the python dependency version
until distutils is removed from SGpp.

* fix(sgpp): cleanup and simplify

* fix(sgpp): Fix style
2023-08-11 07:55:44 +02:00
Sergey Kosukhin
633723236e mpich: fix building with NVHPC (#39082) 2023-08-11 07:29:16 +02:00
Yan
381f31e69e py-flit: add 3.8 and 3.9 (#39307) 2023-08-10 20:23:58 -04:00
Massimiliano Culpo
9438cac219 Fix conflicts on direct dependencies that don't start with ^ (#39369) 2023-08-10 16:58:05 -04:00
Harmen Stoppels
85cf66f650 git: add new versions, deprecated / drop versions (#39368)
Remove git versions deprecated in Spack 0.19
2023-08-10 21:12:23 +02:00
markus-ferrell
f3c080e546 Windows build systems: use ninja and enable tests (#33589)
* Set default CMake generator is ninja on Windows
* Enable build systems tests (except for autotools/make)
2023-08-10 17:16:23 +00:00
dependabot[bot]
37634f8b08 build(deps): bump pygments from 2.15.1 to 2.16.1 in /lib/spack/docs (#39365)
Bumps [pygments](https://github.com/pygments/pygments) from 2.15.1 to 2.16.1.
- [Release notes](https://github.com/pygments/pygments/releases)
- [Changelog](https://github.com/pygments/pygments/blob/master/CHANGES)
- [Commits](https://github.com/pygments/pygments/compare/2.15.1...2.16.1)

---
updated-dependencies:
- dependency-name: pygments
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-08-10 18:10:32 +02:00
ddeeptimahanti
2ae8bbce9e trilinos: fix @13.2: +python (#39370) 2023-08-10 11:50:33 -04:00
William R Tobin
b8bfaf65bf intel-oneapi-mkl: linking with the +cluster variant is broken for external mkl (#39343)
* Update package.py

Adding `mpi` variant to deal with external oneapi-mkl usage per #38238.

* style conformance

* accept both mpi specification mechanisms

* style conformance

* Update package.py

* Update var/spack/repos/builtin/packages/intel-oneapi-mkl/package.py

Co-authored-by: Robert Cohn <rscohn2@gmail.com>

* rename mpi variant mpi_family

* style conformance

* update help message for mpi_family

---------

Co-authored-by: Robert Cohn <rscohn2@gmail.com>
2023-08-10 10:33:29 -04:00
Alberto Invernizzi
7968cb7fa2 mpich: fix macos problem with -flat-namespace (#35611)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2023-08-10 08:34:33 -04:00
Nicholas Sly
ebc2efdfd2 Add LLVM_ENABLE_RTTI option to llvm-amdgpu package, required by mesa. (#35655) 2023-08-10 07:14:02 -04:00
Harmen Stoppels
ff07fd5ccb Remove qt variant in cmake (#39360)
`cmake+qt` depends on `qt`, which depends on `libmng`, which is a CMake
package, and has been for 4 years. Nobody ever complained about
`cmake+qt` not concretizing... so why pay the solve cost.


Before:

```
    setup          3.779s
    load           0.018s
    ground         2.625s
    solve          4.511s
    total         11.236s
```

After:

```
    setup          3.734s
    load           0.018s
    ground         0.468s
    solve          0.560s
    total          5.080s
```
2023-08-10 05:59:22 -04:00
dependabot[bot]
3f83ef6566 build(deps): bump flake8 from 6.0.0 to 6.1.0 in /lib/spack/docs (#39366)
Bumps [flake8](https://github.com/pycqa/flake8) from 6.0.0 to 6.1.0.
- [Commits](https://github.com/pycqa/flake8/compare/6.0.0...6.1.0)

---
updated-dependencies:
- dependency-name: flake8
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-08-10 05:34:33 -04:00
dependabot[bot]
554ce7f063 build(deps): bump sphinx-design from 0.4.1 to 0.5.0 in /lib/spack/docs (#39367)
Bumps [sphinx-design](https://github.com/executablebooks/sphinx-design) from 0.4.1 to 0.5.0.
- [Release notes](https://github.com/executablebooks/sphinx-design/releases)
- [Changelog](https://github.com/executablebooks/sphinx-design/blob/main/CHANGELOG.md)
- [Commits](https://github.com/executablebooks/sphinx-design/compare/v0.4.1...v0.5.0)

---
updated-dependencies:
- dependency-name: sphinx-design
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-08-10 05:29:52 -04:00
Aiden Grossman
23963779f4 Prefix conflict messages with package name (#39106)
* Prefix conflict messages with package name

This patch prefixes all conflict messages with the package name to
alleviate what was otherwise a very manual process. Note that this patch
is a one line change but has a fairly outsized impact.

* same for requires directive

---------

Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2023-08-10 08:09:00 +00:00
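The gist of that one-line change, as a minimal sketch (the helper name below is hypothetical, not Spack's actual internal function):

```python
def _prefixed(pkg_name: str, msg: str) -> str:
    # Prepend the package name so users can tell at a glance which
    # package a conflict or requirement message came from.
    return f"{pkg_name}: {msg}"


print(_prefixed("mypkg", "cannot build with %gcc@:4"))
```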
Ricard Zarco Badia
45c5af10c3 Fixed broken calls to _if_ninja_target_execute (#38992)
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2023-08-10 09:34:02 +02:00
Harmen Stoppels
532a37e7ba Revert "Spec versions: allow git. references for branches with / (#38239)" (#39354)
This reverts commit 3453259c98.
2023-08-10 08:56:39 +02:00
Cristian Di Pietrantonio
aeb9a92845 ncview: change URL for source code (#39321)
Co-authored-by: Cristian Di Pietrantonio <cdipietrantonio@pawsey.org.au>
2023-08-10 08:38:58 +02:00
dependabot[bot]
a3c7ad7669 build(deps): bump urllib3 from 2.0.3 to 2.0.4 in /lib/spack/docs (#39002)
Bumps [urllib3](https://github.com/urllib3/urllib3) from 2.0.3 to 2.0.4.
- [Release notes](https://github.com/urllib3/urllib3/releases)
- [Changelog](https://github.com/urllib3/urllib3/blob/main/CHANGES.rst)
- [Commits](https://github.com/urllib3/urllib3/compare/2.0.3...2.0.4)

---
updated-dependencies:
- dependency-name: urllib3
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-08-10 05:04:58 +00:00
Juan Miguel Carceller
b99288dcae edm4hep: add edm4hep to PYTHONPATH (#38407)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2023-08-09 18:22:17 -05:00
markus-ferrell
01b7cc5106 Windows: enable more stage tests (#36834)
Enable some already-supported tests (no changes to staging logic).
2023-08-09 14:24:30 -07:00
pabloaledo
f5888d8127 picard: add variant with jvm arguments (#39204)
* picard: add variant with jvm arguments
* fix variant and update wrapper script

---------

Signed-off-by: Pablo <pablo.aledo@seqera.io>
2023-08-09 13:24:34 -07:00
Jennifer Green
77c838ca93 libhio: package update for new sha256 checksum for v1.4.1.5 (#38696) 2023-08-09 12:10:37 -07:00
Alec Scott
11e538d962 emacs: Add v29.1 (#39147) 2023-08-09 11:32:04 -07:00
Axel Huebl
7d444038ee AMReX: 23.06+ Multi-Dim Support (#37695)
* AMReX: 23.06+ Multi-Dim Support

This updates the Spack package to allow installing AMReX, AMReX
modules in E4S deployments, and dependent packages with support for
multiple dimensions. Due to an upstream change in AMReX, we no
longer need to ship three binary-incompatible package variants.

* [E4S] oneAPI AMReX < 23.06 Variant

Work around the auto-concretization to the multi-dim form of `dimensions`,
which only became a multi-valued variant in 23.06+.

* e4s cray rhel ci: temporarily disable amrex build until spurious ci failure can be resolved

---------

Co-authored-by: eugeneswalker <eugenesunsetwalker@gmail.com>
2023-08-09 11:25:43 -07:00
Hariharan Devarajan
c24471834b Release brahma 0.0.1 (#39345)
* release brahma 0.0.1
2023-08-09 11:18:52 -07:00
Andrey Parfenov
b1e33ae37b add info about devito and qe for intel env (#39357)
Signed-off-by: Andrey Parfenov <andrey.parfenov@intel.com>
2023-08-09 12:47:35 -04:00
Jordan Galby
c36617f9da Fix package.py error handling bug (#39017) 2023-08-09 17:47:43 +02:00
dependabot[bot]
deadb64206 build(deps): bump black from 23.1.0 to 23.7.0 in /lib/spack/docs (#38982)
Bumps [black](https://github.com/psf/black) from 23.1.0 to 23.7.0.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/23.1.0...23.7.0)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-08-09 17:19:49 +02:00
Massimiliano Culpo
9eaa88e467 caliper: deprecate old versions (#39352) 2023-08-09 17:11:57 +02:00
Juan Miguel Carceller
bd58801415 dd4hep: fix setting LD_LIBRARY_PATH (#38015)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2023-08-09 16:42:05 +02:00
Juan Miguel Carceller
548a9de671 py-setuptools-scm: add a git dependency (#36770)
* Add a git dependency

* Add the dependency only for building

* Change dependency to build and run

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-08-09 10:38:46 -04:00
Harmen Stoppels
8e7c53a8ba package import: remove magic import line (#39183) 2023-08-09 14:28:06 +00:00
Tamara Dahlgren
5e630174a1 cpmd: convert to new stand-alone test process (#35744) 2023-08-09 15:59:51 +02:00
Cristian Di Pietrantonio
175a65dfba libzmq: do not treat warnings as errors during compilation. (#39325)
The package won't compile with newer compilers because warnings
are converted to errors. Hence, disable such conversion.

Co-authored-by: Cristian Di Pietrantonio <cdipietrantonio@pawsey.org.au>
2023-08-09 15:57:43 +02:00
Adam J. Stewart
39d4c402d5 Fix suffix of tab completion scripts (#39154) 2023-08-09 08:28:55 -05:00
Harmen Stoppels
e51748ee8f zlib-api: new virtual with zlib/zlib-ng as providers (#37372)
Introduces a new virtual zlib-api, which replaces zlib in most packages.

This allows users to switch to zlib-ng by default for better performance.
2023-08-09 09:22:58 -04:00
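In Spack's package DSL a virtual is declared with `provides` and consumed with `depends_on`; a minimal sketch (each class would normally live in its own package.py, and the `when` condition on zlib-ng is an assumption):

```python
from spack.package import *


class Zlib(Package):
    provides("zlib-api")  # classic zlib implements the virtual


class ZlibNg(Package):
    # assumption: the drop-in zlib API is only exposed in compat mode
    provides("zlib-api", when="+compat")


class Curl(Package):
    # consumers now depend on the virtual, not a concrete provider
    depends_on("zlib-api")
```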
Alec Scott
f9457fa80b cxxopts: add v3.1.1 (#37977) 2023-08-09 14:31:10 +02:00
Alec Scott
4cc2ca3e2e rclone: add v1.63.1 and remove previously deprecated versions (#39148) 2023-08-09 14:30:33 +02:00
Massimiliano Culpo
3843001004 Fixed bugs discovered in conflicts directives (#39338) 2023-08-09 14:20:49 +02:00
Cristian Di Pietrantonio
e24bb5dd1c Changes the way MPICC is set (#39327) 2023-08-09 07:04:04 -05:00
Cristian Di Pietrantonio
f6013114eb nwchem: make sure to link both single and double precision fftw libraries. (#39333)
With the previous version of the recipe I would get linking errors due
to missing `-lfftw3f`.

Co-authored-by: Cristian Di Pietrantonio <cdipietrantonio@pawsey.org.au>
2023-08-09 14:03:15 +02:00
Martin Aumüller
bdca875eb3 protobuf: fix build for 3.21 on Centos 8 (#39162)
apply upstream patch
2023-08-09 13:44:05 +02:00
pabloaledo
af8c392de2 fq: add new package (#39205)
Signed-off-by: Pablo <pablo.aledo@seqera.io>
2023-08-09 06:23:44 -04:00
Jonathon Anderson
9aa3b4619b containerize: ensure bootstrap images contain all system dependencies (#36818)
This also makes `spack bootstrap status` exit 1 if some dependency is missing
2023-08-09 09:46:59 +02:00
Cristian Di Pietrantonio
3d733da70a idg: add new package (#39316)
Co-authored-by: Cristian Di Pietrantonio <cdipietrantonio@pawsey.org.au>
2023-08-09 02:45:34 -05:00
Cristian Di Pietrantonio
cda99b792c miniocli: add new package (#39318)
Co-authored-by: Cristian Di Pietrantonio <cdipietrantonio@pawsey.org.au>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2023-08-09 03:43:29 -04:00
Gerhard Theurich
9834bad82e esmf: specify build dependency on CMake when using internal ParallelIO (#39347) 2023-08-09 09:14:28 +02:00
Peter Scheibel
3453259c98 Spec versions: allow git. references for branches with / (#38239) 2023-08-09 09:07:04 +02:00
Adam J. Stewart
ee243b84eb py-lightly: add v1.4.15 (#39348) 2023-08-09 09:01:41 +02:00
Adam J. Stewart
5080e2cb45 py-torchmetrics: add v1.0.3 (#39349) 2023-08-09 09:01:23 +02:00
Cristian Di Pietrantonio
f42ef7aea7 ior: add lustre variant (#39320)
Co-authored-by: Cristian Di Pietrantonio <cdipietrantonio@pawsey.org.au>
2023-08-09 08:59:21 +02:00
Jon Rood
41793673d9 hypre: fix typo in --with-sycl option. (#39351) 2023-08-09 08:37:15 +02:00
Cristian Di Pietrantonio
8b6a6982ee caliper: add KOKKOS variant. (#39312)
Co-authored-by: Cristian Di Pietrantonio <cdipietrantonio@pawsey.org.au>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2023-08-09 08:35:56 +02:00
Cristian Di Pietrantonio
ee74ca6391 beast2: add dependency on libbeagle. (#39311)
Co-authored-by: Cristian Di Pietrantonio <cdipietrantonio@pawsey.org.au>
2023-08-09 08:33:53 +02:00
Cristian Di Pietrantonio
7165e70186 chgcentre: fix compilation error by using explicit casting (#39315)
Co-authored-by: Cristian Di Pietrantonio <cdipietrantonio@pawsey.org.au>
2023-08-09 08:32:58 +02:00
Massimiliano Culpo
97d632a161 Push conflict between rocm and blt down to packages that actually use blt (#39339) 2023-08-09 07:53:17 +02:00
Alec Scott
571919992d go: add v1.20.6 and v1.19.11, deprecate previous versions due to CVE-2023-29405 (#39149) 2023-08-08 16:38:21 -07:00
snehring
99112ad2ad Update py-gtdbtk to 2.3.2 (#39296)
* fastani: adding version 1.34 and changing build_system

* py-gtdbtk: adding version 2.3.2

* fastani: adding requested changes

* py-gtdbtk: adding requested changes

* fastani: remove conditional
2023-08-08 18:33:56 -05:00
Paul Kuberry
75c70c395d trilinos: Add version 14.4.0 (#39340) 2023-08-08 18:48:23 -04:00
Jen Herting
960bdfe612 [py-model-index] new package (#39301) 2023-08-08 17:20:37 -05:00
Jen Herting
97892bda18 [py-opendatalab] New package (#39302) 2023-08-08 17:19:18 -05:00
Jen Herting
cead6ef98d [py-linkify-it-py] New package (#39305)
* [py-linkify-it-py] New package

* [py-linkify-it-py] Added version 1.0.3
2023-08-08 17:13:11 -05:00
eugeneswalker
5d70c0f100 e4s ci: add libpressio (#34222)
* e4s ci: add libpressio

* libpressio ~cuda, +cuda: add +mgard
2023-08-08 12:03:23 -07:00
Adam J. Stewart
361632fc4b Ensure that all variants have a description (#39025)
* Ensure that all variants have a description

* Update mock packages too

* Fix test invocations

* Black fix

* mgard: update variant descriptions

* flake8 fix

* black fix

* Add to audit tests

* Relax type hints

* Older Python support

* Undo all changes to mock packages

* Flake8 fix
2023-08-08 09:29:49 -05:00
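For reference, a variant description is just the `description` keyword on the `variant` directive; a minimal sketch with a hypothetical package:

```python
from spack.package import *


class Example(Package):  # illustrative package, not from the repo
    # The audit now flags variants that omit a human-readable description.
    variant("shared", default=True, description="Build shared libraries")
    variant("cuda", default=False, description="Build with CUDA support")
```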
simonLeary42
6576655137 Name clash error message include hash of clashing packages (#39234)
Co-authored-by: Simon <simonleary@umass.edu>
2023-08-08 14:36:49 +02:00
Mikael Simberg
feb26efecd hpx: add v1.9.1 (#39336) 2023-08-08 05:28:04 -04:00
Mikael Simberg
4752d1cde3 pika-algorithms: add v0.1.4 (#39335) 2023-08-08 05:23:10 -04:00
Paul Kuberry
a07afa6e1a OpenMPI: require compilers for F77 and "modern" Fortran (#39266) 2023-08-08 11:10:27 +02:00
Adam J. Stewart
7327d2913a ML CI: get more packages working on macOS (#39199) 2023-08-08 10:59:24 +02:00
Axel Huebl
8f8a1f7f52 C-Blosc2: add v2.10.1 (#39297)
This adds the latest C-Blosc2 release, the first to add CMake
CONFIG install files for easier downstream usage of the installed
library and CMake targets.
2023-08-08 10:08:12 +02:00
Carlos Bederián
eb8d836e76 aocc: add v4.1.0 (#39280) 2023-08-08 10:07:12 +02:00
Jen Herting
bad8495e16 parallel-hashmap: add new package (#39299) 2023-08-08 10:03:01 +02:00
Cristian Di Pietrantonio
43de7f4881 plumed: add v2.9.0, v2.8.3 (#39324)
Co-authored-by: Cristian Di Pietrantonio <cdipietrantonio@pawsey.org.au>
2023-08-08 09:59:03 +02:00
Hariharan Devarajan
84585ac575 cpp-logger: add new package (#39303) 2023-08-08 09:27:44 +02:00
Harmen Stoppels
86f9d3865b Fix broken inode assertion (#39188) 2023-08-08 09:21:23 +02:00
Adam J. Stewart
834e7b2b0a py-cartopy: add v0.22.0 (#39309) 2023-08-08 08:53:04 +02:00
shanedsnyder
c14f23ddaa Add darshan 3.4.4 release (#39310) 2023-08-08 08:49:27 +02:00
Sinan
49f3681a12 package:py-wand add build env var, add new version (#39281)
* package:py-wand add build env var, add new version

* fix style

* [@spackbot] updating style on behalf of Sinan81

* improve

* improve

---------

Co-authored-by: sbulut <sbulut@3vgeomatics.com>
Co-authored-by: Sinan81 <Sinan81@users.noreply.github.com>
2023-08-07 19:45:48 -05:00
Jen Herting
19e1d10cdf [py-marshmallow] added version 3.19.0 (#38972) 2023-08-07 19:44:44 -05:00
Jen Herting
7caf2a512d [py-srsly] added version 2.4.6 (#38970)
* [py-srsly] added version 2.4.6

* [py-srsly] added dependency on py-catalogue

* [py-srsly] added conflict on newer versions of py-cython
2023-08-07 19:44:34 -05:00
Jen Herting
1f6e3cc8cb [py-pathy] new package (#38968)
* [py-pathy] new package

* [py-pathy] removed lines for unsupported versions of python
2023-08-07 19:41:21 -05:00
Jen Herting
169c4245e0 [py-gensim] added version 4.3.1 (#38272)
* [py-gensim] added version 4.3.1

* [py-gensim] flake8

* [py-gensim] removed version limitation on py-cython
2023-08-07 18:56:53 -05:00
Jen Herting
ee1982010f [py-clean-text] added version 0.6.0 (#38271)
* [py-clean-text] added missing dependency on py-poetry

* [py-clean-text] added version 0.6.0

* [py-clean-text] Made required changes
2023-08-07 18:52:22 -05:00
Melven Roehrig-Zoellner
dd396c4a76 freeglut: set correct library name (#39300) 2023-08-07 19:08:08 -04:00
Jean Luca Bez
235802013d h5bench: include tests for E4S and new version (#37830)
* include tests for h5bench
* update to new test format
* update path
* improve description and detect MPI runner
* fix test name
* Include new release
* update with corrections
* include SLURM dependency and reduce test to one process
* fixes on filters
* fix on oversubscribe
* suggested fixes

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2023-08-07 11:43:32 -07:00
Harmen Stoppels
e1f07e98ae libdeflate: new versions, switch to cmake (#39062) 2023-08-07 13:48:12 -04:00
Adam J. Stewart
60aee6f535 py-torchmetrics: add v1.0.2 (#39274) 2023-08-07 10:43:43 -07:00
Adam J. Stewart
2510dc9e6e py-timm: add v0.9.5 (#39275) 2023-08-07 10:42:14 -07:00
Christopher Christofi
5607dd259b samtools: add 1.17 (#39287) 2023-08-07 10:29:03 -07:00
Harmen Stoppels
7bd5d1fd3c use {%compiler.name}{@compiler.version} in matching_specs (#39247) 2023-08-07 18:54:10 +02:00
Harmen Stoppels
f6104cc3cb spack: bump package (#39184) 2023-08-07 18:52:14 +02:00
kwryankrattiger
b54d286b4a CI: remove redundant sections (#38514)
Refactor gitlab ci configs so that mac and cray jobs can reuse as much
of the higher-level configuration as possible.

* CI: remove redundant sections
* CI: Include base linux CI configs in cray stacks

Relocation and runner mapping is consistent between cray and linux runners.

* Export user cache path in before script
* CI: add GPG root for mac runners
* Disable user configs

Metal runners share a ~ directory

* Disable user config and add configs in activate env
2023-08-07 10:23:16 -06:00
Seth R. Johnson
ea9c488897 celeritas: new version 0.3.1 (#39288) 2023-08-07 12:16:30 +01:00
Massimiliano Culpo
ba1d295023 Extract prefix locks and failure markers from Database (#39024)
This PR extracts two responsibilities from the `Database` class:
1. Managing locks for prefixes during an installation
2. Marking installation failures

and pushes them into their own classes (`SpecLocker` and `FailureMarker`). These responsibilities are also pushed up into the `Store`, leaving `Database` with only the duty of managing `index.json` files.

`SpecLocker` classes no longer share a global list of locks, but locks are per instance. Their identifier is simply `(dag hash, package name)`, and not the spec prefix path, to avoid circular dependencies across Store / Database / Spec.
2023-08-07 06:47:52 -04:00
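A minimal sketch of a per-instance locker keyed by `(dag hash, package name)` as described above; the names and the use of threading locks (rather than Spack's file-based locks) are illustrative assumptions:

```python
import threading
from typing import Dict, Tuple


class SpecLocker:
    """Per-instance locks; no global shared list across instances."""

    def __init__(self) -> None:
        self._locks: Dict[Tuple[str, str], threading.Lock] = {}
        self._mutex = threading.Lock()

    def lock(self, dag_hash: str, pkg_name: str) -> threading.Lock:
        # Keyed by (dag hash, package name) instead of the prefix path,
        # avoiding circular dependencies across Store / Database / Spec.
        key = (dag_hash, pkg_name)
        with self._mutex:
            return self._locks.setdefault(key, threading.Lock())


locker = SpecLocker()
with locker.lock("abcdef", "zlib"):
    pass  # e.g. install into the spec's prefix
```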
Harmen Stoppels
27f04b3544 Picklable HTTPError (#39285) 2023-08-07 10:48:35 +02:00
Brian Van Essen
8cd9497522 Bugfix spack module files (#39282)
* Fixed the LD_LIBRARY_PATH to use the lib64 directory.
* On AMD systems the llvm/lib directory is not properly put into the LD_LIBRARY_PATH.
* Added both lib and lib64 paths for libfabric.
* Split the prepend statements.
2023-08-05 20:26:00 -07:00
Harmen Stoppels
ef544a3b6d Add a more detailed HTTPError (#39187) 2023-08-05 11:16:51 +02:00
Adam J. Stewart
4eed832653 py-pyqt6: add new package (#32696) 2023-08-04 18:49:54 -05:00
Wouter Deconinck
5996aaa4e3 acts: new versions 23.[3-5].0, 24.0.0, 25.0.[0-1], 26.0.0, 27.[0-1].0, 28.0.0 (#37055)
* acts: new version 23.[3-5].0, 24.0.0
- https://github.com/acts-project/acts/compare/v23.2.1...v23.3.0
- https://github.com/acts-project/acts/compare/v23.3.0...v23.4.0
- https://github.com/acts-project/acts/compare/v23.4.0...v23.5.0
- https://github.com/acts-project/acts/compare/v23.5.0...v24.0.0
* acts: update dependencies
* acts: new versions 25.0.0, 25.0.1, 26.0.0, 27.0.0
* [@spackbot] updating style on behalf of wdconinc
* acts: rm duplicate edm4hep variant definition
* acts: depends_on podio 0.6: open ended range
* acts: new version 27.1.0
* acts: depends_on mlpack when +mlpack
* acts: new version 28.0.0
* actsvg: new version 0.4.35
* acts: depends_on actsvg@0.4.35: when @28:

---------

Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2023-08-04 16:47:15 -07:00
Brian Van Essen
4957607005 FIX libfabric module file (#39271)
* Add support to export the LD_LIBRARY_PATH for the libfabric package
and subsequent module files.

Fix the AWS OFI RCCL package so that it prepends the environment
variables.

* Fixed comment
2023-08-04 15:04:54 -07:00
Matt Drozt
78bca131fb [py-smartredis] New Package (#39098)
* Create a spack package for smartredis python client

* make py-SR deps versions match docs

* tie SR v0.4.0 to redis-plus-plus v1.3.5

* looser extension lib deps for concretization

* Apply suggestions from code review

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Address reviewer feedback

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-08-04 09:44:23 -05:00
Brian Van Essen
045c5cea53 Replace code that captures a recipe's generator statement in the cached (#39192)
cmake build.
2023-08-04 05:38:12 -07:00
Alex Richert
ea256145d9 Add MET, METplus packages (#39238)
* Add metplus package
* Add met package
* add metplus develop version
* copyright year var/spack/repos/builtin/packages/met/package.py
* copyright year var/spack/repos/builtin/packages/metplus/package.py

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2023-08-04 06:53:44 -04:00
Davide
0b2098850c julia: add v1.9.2 (#39269) 2023-08-04 10:53:59 +02:00
pabloaledo
d2df0a29ce salmon: update version (#39202)
* salmon: update version

---------

Signed-off-by: Pablo <pablo.aledo@seqera.io>
2023-08-03 17:57:17 -07:00
Vicente Bolea
92e9daec9b adios2: add 2.9.1 release (#39257) 2023-08-03 17:28:55 -04:00
Erik Schnetter
d65437114a openssl: Update to 3.1.2, 3.0.10, 1.1.1v (#39216) 2023-08-03 23:22:35 +02:00
Tamara Dahlgren
9a6e98e729 Revert checksum verification CI test (#39259) 2023-08-03 23:21:39 +02:00
Satish Balay
6515c16432 sowing: add version 1.1.26-p8 (#39258)
* sowing: add version 1.1.26-p8
* add in maintainer
2023-08-03 16:34:19 -04:00
Vanessasaurus
2826ab36f0 Automated deployment to update package flux-core 2023-08-03 (#39237)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2023-08-03 11:07:41 -07:00
Mark W. Krentel
c035512930 hpcviewer: add version 2023.07 (#39236)
Add version 2023.07 and adjust the deprecation line (anything using
Java 8 is now deprecated).
2023-08-03 10:52:58 -07:00
Stephen Sachs
cfadba47d3 patchelf@0.18.0 breaks intel compiler libraries (#39253) 2023-08-03 13:33:48 -04:00
Matthieu Dorier
95391dfe94 kafka: new versions (#39249)
* kafka: version 2.13-3.5.0
* kafka: version 2.13-3.5.1 and 2.13-3.4.1
2023-08-03 10:14:48 -07:00
Matthieu Dorier
382ba99631 librdkafka: version 2.2.0 (#39250) 2023-08-03 10:05:35 -07:00
Alec Scott
f8e25c79bf Add checksum CI test and verify with new versions for glab and gh (#39181) 2023-08-03 09:53:05 -07:00
Matthew Thompson
93b54b79d3 gftl-shared: new version 1.6.1 (#39217)
This PR adds gFTL-shared v1.6.1
2023-08-03 11:32:17 -04:00
Matthew Thompson
ff30efcebc mapl: add package (#39227)
* mapl: add package
* Fix style
2023-08-03 09:40:34 -04:00
Seth R. Johnson
54514682d4 py-sphinx-rtd-theme: avoid concretizing 0.5 with Sphinx 7.0 (#39212)
* py-sphinx-rtd-theme: avoid concretizing 0.5 with Sphinx 7.0

* Update var/spack/repos/builtin/packages/py-sphinx-rtd-theme/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-08-03 09:31:05 -04:00
Auriane R
92a75717f0 Add pika 0.17.0 (#39221) 2023-08-03 09:26:55 -04:00
Matthew Thompson
b9be8e883e pfunit: add versions 4.7.1, 4.7.2, 4.7.3 (#39218)
This adds the latest versions of pFUnit
2023-08-03 09:26:33 -04:00
Seth R. Johnson
da838a7d10 vecgeom: new version 1.2.5 (#39211)
* vecgeom: new version 1.2.5
* Mark previous versions as deprecated
* [@spackbot] updating style on behalf of sethrj
* vecgeom: fix 1.2.4 checksum for change to git describe
2023-08-03 08:59:41 -04:00
pabloaledo
85e5fb9ab7 ucsc-bedgraphtobigwig: add package (#39208)
Signed-off-by: Pablo <pablo.aledo@seqera.io>
2023-08-03 14:16:30 +02:00
Adam J. Stewart
c0c300d773 R: fix build on macOS arm64 (#39228) 2023-08-03 14:15:06 +02:00
Harmen Stoppels
6e933ac7df repo cache: use -inf default instead of 0 (#39214)
FastPackageChecker.modified_since should use a default number < 0

When the repo cache does not exist, Spack uses mtime 0. This causes the repo
cache not to be generated when the repo has mtime 0.

Some popular package managers, Spack included, normalize mtimes to 0 for
reproducible tarballs. So when installing Spack with Spack from a buildcache,
the repo cache is never generated.

Also add some type hints
2023-08-03 14:13:13 +02:00
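Why the `-inf` default matters, as a tiny self-contained illustration (the function name is assumed, not the real `FastPackageChecker` code):

```python
import os


def needs_regeneration(path: str, cached_mtime: float = float("-inf")) -> bool:
    # With a default of 0, a file whose mtime was normalized to 0 (as in
    # reproducible tarballs) compares as unchanged and the cache never
    # updates; -inf guarantees any real mtime triggers regeneration.
    return os.stat(path).st_mtime > cached_mtime
```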
pabloaledo
be679759be ucsc-bedclip: add package (#39209)
Signed-off-by: Pablo <pablo.aledo@seqera.io>
2023-08-03 14:04:37 +02:00
Adam J. Stewart
eace479b1e py-kornia: add v0.7.0 (#39233) 2023-08-03 13:59:31 +02:00
Harmen Stoppels
2069a42ba3 Buildcache commands cleanup, again... (#39203)
* Inform mypy that tty.die is noreturn

* avoid temporary allocation in env

* update spack buildcache save-specfile

* fix spack buildcache check/download/get-buildcache-name

- ensure that required args and mutually exclusive ones are marked as
  such in argparse for better error messages
- deprecate --spec-file everywhere
- use disambiguate for better error messages
2023-08-03 10:44:02 +02:00
George Young
41d2161b5b nextdenovo: new package @2.5.2 (and py-paralleltask) (#39139)
* py-paralleltask: new package @0.2.2

* adding hidden dependency

* nextdenovo: new package @2.5.2

* style

* Update var/spack/repos/builtin/packages/py-paralleltask/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-08-02 19:11:18 -04:00
mschouler
ba936574fc py-melissa-core: add new versions and py-iterative-stats dependency (#38654)
* py-melissa-core: add new versions and py-iterative-stats dependency

* Enhance dependency specification

* Fix dependency specification

* Fix comment alignment

* Improve dependency specification style

* Enhance dependencies

* Fix mpi4py, py-cloudpickle and py-python-hostlist dependencies

* Update var/spack/repos/builtin/packages/py-melissa-core/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-melissa-core/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-melissa-core/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Set develop version as preferred

* Update var/spack/repos/builtin/packages/py-melissa-core/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Marc Schouler <marc.schouler@inria.fr>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-08-02 17:46:50 -05:00
Stephen Hudson
c0d0603baa py-libEnsemble: add v0.10.2 (#39074)
* libEnsemble: add v0.10.2

* Make setuptools build only dep

* Update var/spack/repos/builtin/packages/py-libensemble/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-08-02 14:59:52 -05:00
Adam J. Stewart
edbf12cfa8 Add qmake virtual provider (#38848) 2023-08-02 13:50:37 -05:00
snehring
11b3dac705 openmolcas: adding version 23.06 (#39195) 2023-08-02 11:05:42 -07:00
pabloaledo
a7a5a994dc bioconductor-dupradar, bioconductor-rsubread: add packages (#39206)
* bioconductor-dupradar, bioconductor-rsubread: add packages
* [@spackbot] updating style on behalf of pabloaledo
2023-08-02 10:22:27 -07:00
pabloaledo
8a9a24ce1e tximeta: add package (#39207)
* tximeta: add package
  Signed-off-by: Pablo <pablo.aledo@seqera.io>
* [@spackbot] updating style on behalf of pabloaledo

---------

Signed-off-by: Pablo <pablo.aledo@seqera.io>
2023-08-02 10:16:21 -07:00
Harmen Stoppels
f54974d66e patchelf: 0.18 (#39215) 2023-08-02 09:54:31 -07:00
kwryankrattiger
0b4631a774 CI: Refactor ci reproducer (#37088)
* CI: Refactor ci reproducer

* Autostart container
* Reproducer paths match CI paths
* Generate start scripts for docker and reproducer

* CI: Add interactive and gpg options to reproduce-build

* Interactive mode determines whether the docker container persists
  after running the reproduction.
* GPG path/url allows downloading GPG keys needed for binary
  cache download validation. This is important when running the
  reproducer for protected CI jobs.

* Add exit_on_failure option to CI scripts

* CI: Add runtime option for reproducer
2023-08-02 09:51:12 -07:00
Harmen Stoppels
e7fa6d99bf version: move to module, avoid circular imports (#39077) 2023-08-02 17:47:08 +02:00
Harmen Stoppels
03c0d74139 buildcache extractall: extract directly into spec.prefix (#37441)
- Run `mkdirp` on `spec.prefix`
- Extract directly into `spec.prefix`
  1. No need for `$store/tmp.xxx` where we extract the tarball directly, pray that it has one subdir `<name>-<version>-<hash>`, and then `rm -rf` the package prefix followed by `mv`.
  2. No need to clean up this temp dir in `spack clean`.
  3. Instead figure out package directory prefix from the tarball contents, and strip the tarinfo entries accordingly (kinda like tar --strip-components but more strict)
- Set package dir permissions
- Don't error during error handling when files cannot be removed
- No need to "enrich" spec.json with this tarball-toplevel-path

After this PR, we can in fact tarball packages relative to `/` instead of `spec.prefix/..`, which makes it possible to use Spack tarballs as container layers, where relocation is impossible, and rootfs tarballs are expected.
2023-08-02 17:06:13 +02:00
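A minimal sketch of extracting a tarball directly into `spec.prefix` while strictly stripping a single top-level directory, in the spirit described above; this is an illustration, not the actual buildcache code, and it omits the hardening a real extractor needs against path traversal:

```python
import os
import tarfile


def extract_into_prefix(tarball: str, prefix: str) -> None:
    os.makedirs(prefix, exist_ok=True)
    with tarfile.open(tarball) as tar:
        members = tar.getmembers()
        # Strict: require exactly one top-level directory, e.g.
        # "<name>-<version>-<hash>/", instead of hoping there is one.
        tops = {m.name.split("/", 1)[0] for m in members}
        if len(tops) != 1:
            raise ValueError(f"expected one top-level dir, got {sorted(tops)}")
        top = tops.pop()
        stripped = []
        for m in members:
            if m.name == top:
                continue  # drop the top-level directory entry itself
            m.name = m.name[len(top) + 1 :]  # like tar --strip-components=1
            stripped.append(m)
        tar.extractall(path=prefix, members=stripped)
```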
Wouter Deconinck
a14f4b5a02 feat: move -N/--namespace(s) to common args, allow in buildcache list (#36719)
`spack buildcache list` did not have a way to display the namespace of
packages in the buildcache. This PR adds that functionality.

For consistency's sake, it moves the `-N/--namespace` arg definition to
the `common/arguments.py` and modifies `find`, `solve`, `spec` to use
the common definition.

Previously, `find` was using `--namespace` (singular) to control whether
to display the namespace (it doesn't restrict the search to that
namespace). The other commands were using `--namespaces` (plural). For
backwards compatibility and for consistency with `--deps`, `--tags`,
etc, the plural `--namespaces` was chosen. The argument parser ensures
that `find --namespace` will continue to behave as before.

Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2023-08-02 11:16:14 +00:00
Alec Scott
3be565f49e rust: disable build from downloading a version of LLVM from Rust CI (#39146) 2023-08-02 12:35:52 +02:00
Adam J. Stewart
df2938dfcf Python: fix library/header error msg format (#39171) 2023-08-02 12:14:30 +02:00
Martin Aumüller
5d8482598b dcmtk: new version and pic configuration (#39161)
* dcmtk: checksum 3.6.7

compiles on macos

* dcmtk: support pic configuration

* [@spackbot] updating style on behalf of aumuell

* dcmtk: use define_from_variant for shorter code

* dcmtk: refine conflict

it appears that dcmtk < 3.6.7 only fails on macos/aarch64:
/Library/Developer/CommandLineTools/usr/lib/clang/14.0.3/include/xmmintrin.h:14:2:
  error: "This header is only meant to be used on x86 and x64 architecture"

---------

Co-authored-by: aumuell <aumuell@users.noreply.github.com>
2023-08-01 21:21:47 -07:00
Manuela Kuhn
f079e7fc34 py-gevent: add 23.7.0 and py-greenlet: add 3.0.0a1 (#39164)
* py-gevent: add 23.7.0 and py-greenlet: add 3.0.0a1

* Update var/spack/repos/builtin/packages/py-gevent/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Remove version 1.3.a2

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-08-01 18:21:40 -05:00
sid
3369acc050 py-openai, py-pandas-stubs (#38912)
* simple build of py-openai

* added variants to py-openai

* py-pandas-stubs is a dependency for py-openai

* fixed format and flake8 errors for py-openai

* black format error for py-pandas-stubs

* [@spackbot] updating style on behalf of sidpbury

* made style and format changes to py-openai

* made style and format changes to py-pandas-stubs

* py-types-pytz is a dependency for py-openai

* [@spackbot] updating style on behalf of sidpbury

* updated py-openpyxl for ver 3.0.7 and 3.1.2

* Update var/spack/repos/builtin/packages/py-pandas-stubs/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* ajs requested changes for py-openai

* updated py-openpyxl for supported python

* [@spackbot] updating style on behalf of sidpbury

* updated py-openpyxl

* removed requirement.txt dependencies in  py-openpyxl

* removed python depends on from openpyxl

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-08-01 17:35:41 -04:00
mschouler
26f4fc0f34 Add two latest versions (#39182)
Co-authored-by: Marc Schouler <marc.schouler@inria.fr>
2023-08-01 15:35:55 -05:00
Weiqun Zhang
8c0551c1c0 amrex: add 23.08 (#39190) 2023-08-01 12:15:46 -07:00
Dax Lynch
59866cdb11 Mongodb: Add new package (#39085)
* Added the package

* added dependency

* Update package.py

* Update package.py

Added xz as a dependency
2023-08-01 10:25:16 -07:00
Chris White
8d2a32f66d remove CMAKE_GENERATOR from the host-config because it cannot be overwritten on the command line (#39044) 2023-08-01 08:57:56 -07:00
vucoda
b28ae67369 Fix source bootstrapping for Debian and derivatives (#39107)
* Fix Python package.py for Debian and derivatives to find the system Python library location

When bootstrapping from source, find_library() does not include any search paths that work for Debian and derivatives. Fixes #36666

* Update to pass styling

* Update to styling

* Update python package.py with fake config value for LIBPL

* Update python package.py libpl config_vars entry to follow double quote standard

* styling update
2023-08-01 03:42:46 -04:00
Satish Balay
b8590fbd05 petsc, py-petsc4py: add v3.19.4 (#39169) 2023-07-31 20:57:18 -05:00
Martin Aumüller
9343b9524f qt-base: add conflict for 6.5+ with GCC < 9 (#39158) 2023-07-31 18:36:16 -04:00
Adam J. Stewart
bb0cec1530 py-numpy: add v1.25.2 (#39172) 2023-07-31 16:58:07 -05:00
Alec Scott
d4f41b51f4 Add spack checksum --verify, fix --add (#38458)
* Add rewrite of spack checksum to include --verify and to better add versions to package.py files
* Fix formatting and remove unused import
* Update checksum unit-tests to correctly test multiple versions and add to package
* Remove references to latest in stage.py
* Update bash-completion scripts to fix unit tests failures
* Fix docs generation
* Remove unused url_dict argument from methods
* Reduce chance of redundant remote_versions work
* Add print() before tty.die() to increase error readability
* Update version regular expression to allow for multi-line versions
* Add a few unit tests to improve test coverage
* Update command completion
* Add type hints to added functions and fix a few py-lint suggestions
* Add @no_type_check to prevent mypy from failing on pkg.versions
* Add type hints to format.py and fix unit test
* Black format lib/spack/spack/package_base.py
* Attempt ignoring type errors
* Add optional dict type hint and declare versions in PackageBase
* Refactor util/format.py to allow for url_dict as an optional parameter
* Directly reference PackageBase class instead of using TypeVar
* Fix comment typo

---------

Co-authored-by: Tamara Dahlgren <dahlgren1@llnl.gov>
2023-07-31 21:49:43 +00:00
eugeneswalker
347acf3cc6 update py-cupy to enable ROCm builds and add variant to control optional dependencies (#38919)
* py-cupy updates: add +rocm and +all

* rocm deps are link only

* set parallelism for both +rocm and +cuda

* add missing deps; remove unnecessary deps; uncomment maintainers; get hipcc properly
2023-07-31 16:23:19 -05:00
Manuela Kuhn
65224ad6bc py-terminado: add 0.17.1 (#39165)
* py-terminado: add 0.17.1

* Fix style

* Update var/spack/repos/builtin/packages/py-terminado/package.py

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2023-07-31 14:22:00 -07:00
Andrey Perestoronin
784d56ce05 added new packages (#39166) 2023-07-31 15:38:33 -04:00
Martin Aumüller
b30523fdd8 libtiff: 4.5.1 (#39159) 2023-07-31 14:15:20 -05:00
Martin Aumüller
b46e098696 libgeotiff: new versions (#39160) 2023-07-31 14:14:44 -05:00
Manuela Kuhn
20a7622602 py-rdflib: add 6.3.2 (#39065)
* py-rdflib: add 6.3.2

* Update var/spack/repos/builtin/packages/py-rdflib/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Remove python dependency

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-07-31 13:50:00 -05:00
mschouler
d25f1059dd py-python-hostlist: package addition (#39035)
* Add recipe for py-hostlist

* Fix style

* Fix style

* Add homepage, fix version url and remove unnecessary dependency

* Fix version and remove url

* Rename package and fix git link

---------

Co-authored-by: Marc Schouler <marc.schouler@inria.fr>
2023-07-31 12:39:56 -05:00
Adam J. Stewart
9394fa403e Remove Xcode mock-up (#39020)
* Remove Xcode mock-up

* Remove unused imports
2023-07-31 10:24:04 -07:00
Alec Scott
679c6a606d fzf: add v0.42.0 (#39150) 2023-07-31 09:55:48 -07:00
Adam J. Stewart
27f378601e py-lightning: add v2.0.6 (#39152) 2023-07-31 09:54:35 -05:00
Adam J. Stewart
832ddbdf6d py-sphinx: add v7.1 (#39151) 2023-07-31 09:54:10 -05:00
Martin Aumüller
0286455e1d libjpeg-turbo: checksum 2.1.5.1 & 3.0.0 (#39157) 2023-07-31 07:48:48 -07:00
Adam J. Stewart
4baf489460 py-lightly: add v1.4.14 (#39153) 2023-07-30 19:07:54 -07:00
Christopher Christofi
56c7921430 py-chex: add 0.1.5 (#39102) 2023-07-30 16:30:36 -05:00
Manuela Kuhn
c2288af55c py-csvkit: add 1.1.1 (#39124) 2023-07-30 16:28:33 -05:00
George Young
39cd2f3754 sourmash: new package @4.8.2 (#38571)
* sourmash: new package @4.8.2

* sourmash: new package @4.8.2

* py-bitarray: add 2.7.6, 2.7.4

* Update var/spack/repos/builtin/packages/py-bitstring/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update setuptools dependency

* Adding missing deps

* Update var/spack/repos/builtin/packages/sourmash/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Correcting maturin dep

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/sourmash/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/sourmash/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/sourmash/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Adding dependency types

* Add `@0.14.17` as the last pre-`@1:` release

* Switch to use `python_platlib`

* Update package.py

* Update var/spack/repos/builtin/packages/py-screed/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-07-30 16:13:25 -05:00
Adam J. Stewart
a941ab4acb PyPy: add new package (#38999)
* PyPy: add new package

* Typo fix

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2023-07-30 15:54:45 -05:00
Jordan Galby
048cc711d6 py-pytest-html: Add version 3.2.0 (#38989)
* py-pytest-html: add 3.2.0

* py-pytest-html: Add py-py version requirement

See https://github.com/pytest-dev/pytest-html/blob/v3.2.0/setup.py#L16

* py-pytest-html: Add dependencies from setup.py and pyproject.toml

* py-pytest-html: Add git url

* py-pytest-html: Add conflict with py-pytest@7.2: pending py-pytest-html@4
2023-07-30 15:01:00 -05:00
Tamara Dahlgren
63a5cf78ac Bugfix/ltrace: Add missing elf dependency (#39143) 2023-07-29 23:02:00 -07:00
Filippo Spiga
bef03b9588 Adding NVIDIA HPC SDK 23.7 (#39127) 2023-07-28 18:48:06 -04:00
dslarm
2859f0a7e1 HACCKernels git repository has changed URL. Updating. (#39129) 2023-07-28 16:33:52 -04:00
afzpatel
c1b084d754 Adding optional hip test (#34907)
* Adding optional hip test
* Modifications to run every sample's tests
* Skipping test directories without a Makefile
* fix styling and cleaning code
* fix styling and changed method of iterating through sample folders
* changed to new syntax for standalone tests
* Updates for changes in syntax
2023-07-28 10:10:29 -07:00
Julien Bigot
a8301709a8 libffi: add -Wno-error=implicit-function-declaration for clang >= 16 (#38356) 2023-07-28 10:18:12 -04:00
1260 changed files with 16464 additions and 7664 deletions

View File

@@ -22,7 +22,7 @@ jobs:
matrix:
operating_system: ["ubuntu-latest", "macos-latest"]
steps:
- uses: actions/checkout@c85c95e3d7251135ab7dc9ce3241c5835cc595a9 # @v2
- uses: actions/checkout@f43a0e5ff2bd294095638e18286ca9a3d1956744 # @v2
- uses: actions/setup-python@61a6322f88396a6271a6ee3565807d608ecaddd1 # @v2
with:
python-version: ${{inputs.python_version}}

View File

@@ -24,7 +24,7 @@ jobs:
make patch unzip which xz python3 python3-devel tree \
cmake bison bison-devel libstdc++-static
- name: Checkout
uses: actions/checkout@c85c95e3d7251135ab7dc9ce3241c5835cc595a9
uses: actions/checkout@f43a0e5ff2bd294095638e18286ca9a3d1956744
with:
fetch-depth: 0
- name: Setup non-root user
@@ -62,7 +62,7 @@ jobs:
make patch unzip xz-utils python3 python3-dev tree \
cmake bison
- name: Checkout
uses: actions/checkout@c85c95e3d7251135ab7dc9ce3241c5835cc595a9
uses: actions/checkout@f43a0e5ff2bd294095638e18286ca9a3d1956744
with:
fetch-depth: 0
- name: Setup non-root user
@@ -99,7 +99,7 @@ jobs:
bzip2 curl file g++ gcc gfortran git gnupg2 gzip \
make patch unzip xz-utils python3 python3-dev tree
- name: Checkout
uses: actions/checkout@c85c95e3d7251135ab7dc9ce3241c5835cc595a9
uses: actions/checkout@f43a0e5ff2bd294095638e18286ca9a3d1956744
with:
fetch-depth: 0
- name: Setup non-root user
@@ -133,7 +133,7 @@ jobs:
make patch unzip which xz python3 python3-devel tree \
cmake bison
- name: Checkout
uses: actions/checkout@c85c95e3d7251135ab7dc9ce3241c5835cc595a9
uses: actions/checkout@f43a0e5ff2bd294095638e18286ca9a3d1956744
with:
fetch-depth: 0
- name: Setup repo
@@ -158,7 +158,7 @@ jobs:
run: |
brew install cmake bison@2.7 tree
- name: Checkout
uses: actions/checkout@c85c95e3d7251135ab7dc9ce3241c5835cc595a9
uses: actions/checkout@f43a0e5ff2bd294095638e18286ca9a3d1956744
- name: Bootstrap clingo
run: |
source share/spack/setup-env.sh
@@ -179,7 +179,7 @@ jobs:
run: |
brew install tree
- name: Checkout
uses: actions/checkout@c85c95e3d7251135ab7dc9ce3241c5835cc595a9
uses: actions/checkout@f43a0e5ff2bd294095638e18286ca9a3d1956744
- name: Bootstrap clingo
run: |
set -ex
@@ -204,7 +204,7 @@ jobs:
runs-on: ubuntu-20.04
steps:
- name: Checkout
uses: actions/checkout@c85c95e3d7251135ab7dc9ce3241c5835cc595a9
uses: actions/checkout@f43a0e5ff2bd294095638e18286ca9a3d1956744
with:
fetch-depth: 0
- name: Setup repo
@@ -247,7 +247,7 @@ jobs:
bzip2 curl file g++ gcc patchelf gfortran git gzip \
make patch unzip xz-utils python3 python3-dev tree
- name: Checkout
uses: actions/checkout@c85c95e3d7251135ab7dc9ce3241c5835cc595a9
uses: actions/checkout@f43a0e5ff2bd294095638e18286ca9a3d1956744
with:
fetch-depth: 0
- name: Setup non-root user
@@ -283,7 +283,7 @@ jobs:
make patch unzip xz-utils python3 python3-dev tree \
gawk
- name: Checkout
uses: actions/checkout@c85c95e3d7251135ab7dc9ce3241c5835cc595a9
uses: actions/checkout@f43a0e5ff2bd294095638e18286ca9a3d1956744
with:
fetch-depth: 0
- name: Setup non-root user
@@ -316,7 +316,7 @@ jobs:
# Remove GnuPG since we want to bootstrap it
sudo rm -rf /usr/local/bin/gpg
- name: Checkout
uses: actions/checkout@c85c95e3d7251135ab7dc9ce3241c5835cc595a9
uses: actions/checkout@f43a0e5ff2bd294095638e18286ca9a3d1956744
- name: Bootstrap GnuPG
run: |
source share/spack/setup-env.sh
@@ -333,7 +333,7 @@ jobs:
# Remove GnuPG since we want to bootstrap it
sudo rm -rf /usr/local/bin/gpg
- name: Checkout
uses: actions/checkout@c85c95e3d7251135ab7dc9ce3241c5835cc595a9
uses: actions/checkout@f43a0e5ff2bd294095638e18286ca9a3d1956744
- name: Bootstrap GnuPG
run: |
source share/spack/setup-env.sh

View File

@@ -56,7 +56,7 @@ jobs:
if: github.repository == 'spack/spack'
steps:
- name: Checkout
uses: actions/checkout@c85c95e3d7251135ab7dc9ce3241c5835cc595a9 # @v2
uses: actions/checkout@f43a0e5ff2bd294095638e18286ca9a3d1956744 # @v2
- name: Set Container Tag Normal (Nightly)
run: |
@@ -95,7 +95,7 @@ jobs:
uses: docker/setup-qemu-action@2b82ce82d56a2a04d2637cd93a637ae1b359c0a7 # @v1
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@4c0219f9ac95b02789c1075625400b2acbff50b1 # @v1
uses: docker/setup-buildx-action@885d1462b80bc1c1c7f0b00334ad271f09369c55 # @v1
- name: Log in to GitHub Container Registry
uses: docker/login-action@465a07811f14bebb1938fbed4728c6a1ff8901fc # @v1

View File

@@ -35,7 +35,7 @@ jobs:
core: ${{ steps.filter.outputs.core }}
packages: ${{ steps.filter.outputs.packages }}
steps:
- uses: actions/checkout@c85c95e3d7251135ab7dc9ce3241c5835cc595a9 # @v2
- uses: actions/checkout@f43a0e5ff2bd294095638e18286ca9a3d1956744 # @v2
if: ${{ github.event_name == 'push' }}
with:
fetch-depth: 0

View File

@@ -14,7 +14,7 @@ jobs:
build-paraview-deps:
runs-on: windows-latest
steps:
- uses: actions/checkout@c85c95e3d7251135ab7dc9ce3241c5835cc595a9
- uses: actions/checkout@f43a0e5ff2bd294095638e18286ca9a3d1956744
with:
fetch-depth: 0
- uses: actions/setup-python@61a6322f88396a6271a6ee3565807d608ecaddd1

View File

@@ -47,7 +47,7 @@ jobs:
on_develop: false
steps:
- uses: actions/checkout@c85c95e3d7251135ab7dc9ce3241c5835cc595a9 # @v2
- uses: actions/checkout@f43a0e5ff2bd294095638e18286ca9a3d1956744 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@61a6322f88396a6271a6ee3565807d608ecaddd1 # @v2
@@ -94,7 +94,7 @@ jobs:
shell:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@c85c95e3d7251135ab7dc9ce3241c5835cc595a9 # @v2
- uses: actions/checkout@f43a0e5ff2bd294095638e18286ca9a3d1956744 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@61a6322f88396a6271a6ee3565807d608ecaddd1 # @v2
@@ -133,7 +133,7 @@ jobs:
dnf install -y \
bzip2 curl file gcc-c++ gcc gcc-gfortran git gnupg2 gzip \
make patch tcl unzip which xz
- uses: actions/checkout@c85c95e3d7251135ab7dc9ce3241c5835cc595a9 # @v2
- uses: actions/checkout@f43a0e5ff2bd294095638e18286ca9a3d1956744 # @v2
- name: Setup repo and non-root user
run: |
git --version
@@ -152,7 +152,7 @@ jobs:
clingo-cffi:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@c85c95e3d7251135ab7dc9ce3241c5835cc595a9 # @v2
- uses: actions/checkout@f43a0e5ff2bd294095638e18286ca9a3d1956744 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@61a6322f88396a6271a6ee3565807d608ecaddd1 # @v2
@@ -165,6 +165,7 @@ jobs:
- name: Install Python packages
run: |
pip install --upgrade pip setuptools pytest coverage[toml] pytest-cov clingo pytest-xdist
pip install --upgrade flake8 "isort>=4.3.5" "mypy>=0.900" "click" "black"
- name: Setup git configuration
run: |
# Need this for the git tests to succeed.
@@ -186,7 +187,7 @@ jobs:
matrix:
python-version: ["3.10"]
steps:
- uses: actions/checkout@c85c95e3d7251135ab7dc9ce3241c5835cc595a9 # @v2
- uses: actions/checkout@f43a0e5ff2bd294095638e18286ca9a3d1956744 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@61a6322f88396a6271a6ee3565807d608ecaddd1 # @v2

View File

@@ -18,7 +18,7 @@ jobs:
validate:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@c85c95e3d7251135ab7dc9ce3241c5835cc595a9 # @v2
- uses: actions/checkout@f43a0e5ff2bd294095638e18286ca9a3d1956744 # @v2
- uses: actions/setup-python@61a6322f88396a6271a6ee3565807d608ecaddd1 # @v2
with:
python-version: '3.11'
@@ -35,7 +35,7 @@ jobs:
style:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@c85c95e3d7251135ab7dc9ce3241c5835cc595a9 # @v2
- uses: actions/checkout@f43a0e5ff2bd294095638e18286ca9a3d1956744 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@61a6322f88396a6271a6ee3565807d608ecaddd1 # @v2
@@ -68,7 +68,7 @@ jobs:
dnf install -y \
bzip2 curl file gcc-c++ gcc gcc-gfortran git gnupg2 gzip \
make patch tcl unzip which xz
- uses: actions/checkout@c85c95e3d7251135ab7dc9ce3241c5835cc595a9 # @v2
- uses: actions/checkout@f43a0e5ff2bd294095638e18286ca9a3d1956744 # @v2
- name: Setup repo and non-root user
run: |
git --version

View File

@@ -15,7 +15,7 @@ jobs:
unit-tests:
runs-on: windows-latest
steps:
- uses: actions/checkout@c85c95e3d7251135ab7dc9ce3241c5835cc595a9
- uses: actions/checkout@f43a0e5ff2bd294095638e18286ca9a3d1956744
with:
fetch-depth: 0
- uses: actions/setup-python@61a6322f88396a6271a6ee3565807d608ecaddd1
@@ -39,7 +39,7 @@ jobs:
unit-tests-cmd:
runs-on: windows-latest
steps:
- uses: actions/checkout@c85c95e3d7251135ab7dc9ce3241c5835cc595a9
- uses: actions/checkout@f43a0e5ff2bd294095638e18286ca9a3d1956744
with:
fetch-depth: 0
- uses: actions/setup-python@61a6322f88396a6271a6ee3565807d608ecaddd1
@@ -63,7 +63,7 @@ jobs:
build-abseil:
runs-on: windows-latest
steps:
- uses: actions/checkout@c85c95e3d7251135ab7dc9ce3241c5835cc595a9
- uses: actions/checkout@f43a0e5ff2bd294095638e18286ca9a3d1956744
with:
fetch-depth: 0
- uses: actions/setup-python@61a6322f88396a6271a6ee3565807d608ecaddd1

View File

@@ -2,24 +2,26 @@
## Supported Versions
We provide security updates for the following releases.
We provide security updates for `develop` and for the last two
stable (`0.x`) release series of Spack. Security updates will be
made available as patch (`0.x.1`, `0.x.2`, etc.) releases.
For more on Spack's release structure, see
[`README.md`](https://github.com/spack/spack#releases).
| Version | Supported |
| ------- | ------------------ |
| develop | :white_check_mark: |
| 0.19.x | :white_check_mark: |
| 0.18.x | :white_check_mark: |
## Reporting a Vulnerability
To report a vulnerability or other security
issue, email maintainers@spack.io.
You can report a vulnerability using GitHub's private reporting
feature:
You can expect to hear back within two days.
If your security issue is accepted, we will do
our best to release a fix within a week. If
fixing the issue will take longer than this,
we will discuss timeline options with you.
1. Go to [github.com/spack/spack/security](https://github.com/spack/spack/security).
2. Click "Report a vulnerability" in the upper right corner of that page.
3. Fill out the form and submit your draft security advisory.
More details are available in
[GitHub's docs](https://docs.github.com/en/code-security/security-advisories/guidance-on-reporting-and-writing/privately-reporting-a-security-vulnerability).
You can expect to hear back about security issues within two days.
If your security issue is accepted, we will do our best to release
a fix within a week. If fixing the issue will take longer than
this, we will discuss timeline options with you.

View File

@@ -51,65 +51,43 @@ setlocal enabledelayedexpansion
:: subcommands will never start with '-'
:: everything after the subcommand is an arg
:: we cannot allow batch "for" loop to directly process CL args
:: a number of batch reserved characters are commonly passed to
:: spack and allowing batch's "for" method to process the raw inputs
:: results in a large number of formatting issues
:: instead, treat the entire CLI as one string
:: and split by space manually
:: capture cl args in variable named cl_args
set cl_args=%*
:process_cl_args
rem tokens=1* returns the first processed token produced
rem by tokenizing the input string cl_args on spaces into
rem the named variable %%g
rem While this may look like a for loop, it only
rem executes a single time for each of the cl args
rem the actual iterative loop is performed by the
rem goto process_cl_args stanza
rem we are simply leveraging the "for" method's string
rem tokenization
for /f "tokens=1*" %%g in ("%cl_args%") do (
set t=%%~g
rem remainder of string is composed into %%h
rem these are the cl args yet to be processed
rem assign cl_args var to only the args to be processed
rem effectively discarding the current arg %%g
rem this will be null when we have no further tokens to process
set cl_args=%%h
rem process the first space delineated cl arg
rem of this iteration
if "!t:~0,1!" == "-" (
if defined _sp_subcommand (
rem We already have a subcommand, processing args now
if not defined _sp_args (
set "_sp_args=!t!"
) else (
set "_sp_args=!_sp_args! !t!"
)
) else (
if not defined _sp_flags (
set "_sp_flags=!t!"
shift
) else (
set "_sp_flags=!_sp_flags! !t!"
shift
)
)
) else if not defined _sp_subcommand (
set "_sp_subcommand=!t!"
shift
) else (
rem Set first cl argument (denoted by %1) to be processed
set t=%1
rem shift moves all cl positional arguments left by one
rem meaning %2 is now %1, this allows us to iterate over each
rem argument
shift
rem assign next "first" cl argument to cl_args, will be null when
rem there are no further arguments to process
set cl_args=%1
if "!t:~0,1!" == "-" (
if defined _sp_subcommand (
rem We already have a subcommand, processing args now
if not defined _sp_args (
set "_sp_args=!t!"
shift
) else (
set "_sp_args=!_sp_args! !t!"
shift
)
) else (
if not defined _sp_flags (
set "_sp_flags=!t!"
) else (
set "_sp_flags=!_sp_flags! !t!"
)
)
) else if not defined _sp_subcommand (
set "_sp_subcommand=!t!"
) else (
if not defined _sp_args (
set "_sp_args=!t!"
) else (
set "_sp_args=!_sp_args! !t!"
)
)
rem if this is not nil, we have more tokens to process
rem if this is not null, we have more tokens to process
rem start above process again with remaining unprocessed cl args
if defined cl_args goto :process_cl_args
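For illustration, a rough Python sketch of the tokenization loop above (hypothetical names, not a literal port of the batch code):

def process_cl_args(cl_args: str):
    # Repeatedly split off the first space-delimited token (the
    # "tokens=1*" equivalent) and classify it as a flag, the
    # subcommand, or a subcommand argument.
    flags, subcommand, args = [], None, []
    while cl_args:
        t, _, cl_args = cl_args.partition(" ")
        if t.startswith("-"):
            (args if subcommand else flags).append(t)
        elif subcommand is None:
            subcommand = t
        else:
            args.append(t)
    return flags, subcommand, args

print(process_cl_args("-d install --verbose zlib"))
# (['-d'], 'install', ['--verbose', 'zlib'])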

View File

@@ -39,6 +39,20 @@ function Read-SpackArgs {
return $SpackCMD_params, $SpackSubCommand, $SpackSubCommandArgs
}
function Set-SpackEnv {
# This method processes the return value of $(spack <command>),
# which is a System.Object[] containing
# a list of env modification commands.
# Invoke-Expression can only handle one command at a time,
# so we iterate over the list and invoke each env modification
# expression one at a time
foreach($envop in $args[0]){
Invoke-Expression $envop
}
}
function Invoke-SpackCD {
if (Compare-CommonArgs $SpackSubCommandArgs) {
python $Env:SPACK_ROOT/bin/spack cd -h
@@ -79,7 +93,7 @@ function Invoke-SpackEnv {
}
else {
$SpackEnv = $(python $Env:SPACK_ROOT/bin/spack $SpackCMD_params env activate "--pwsh" $SubCommandSubCommandArgs)
$ExecutionContext.InvokeCommand($SpackEnv)
Set-SpackEnv $SpackEnv
}
}
"deactivate" {
@@ -90,8 +104,8 @@ function Invoke-SpackEnv {
python $Env:SPACK_ROOT/bin/spack env deactivate -h
}
else {
$SpackEnv = $(python $Env:SPACK_ROOT/bin/spack $SpackCMD_params env deactivate --pwsh)
$ExecutionContext.InvokeCommand($SpackEnv)
$SpackEnv = $(python $Env:SPACK_ROOT/bin/spack $SpackCMD_params env deactivate "--pwsh")
Set-SpackEnv $SpackEnv
}
}
default {python $Env:SPACK_ROOT/bin/spack $SpackCMD_params $SpackSubCommand $SpackSubCommandArgs}
@@ -107,8 +121,9 @@ function Invoke-SpackLoad {
python $Env:SPACK_ROOT/bin/spack $SpackCMD_params $SpackSubCommand $SpackSubCommandArgs
}
else {
# python $Env:SPACK_ROOT/bin/spack $SpackCMD_params $SpackSubCommand "--pwsh" $SpackSubCommandArgs
$SpackEnv = $(python $Env:SPACK_ROOT/bin/spack $SpackCMD_params $SpackSubCommand "--pwsh" $SpackSubCommandArgs)
$ExecutionContext.InvokeCommand($SpackEnv)
Set-SpackEnv $SpackEnv
}
}
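For illustration, a Python analogue of the Set-SpackEnv pattern used above; the "NAME=value" strings are hypothetical stand-ins for the PowerShell expressions spack emits with --pwsh:

import os

def set_spack_env(env_ops):
    # Apply each environment modification one at a time, mirroring
    # the Invoke-Expression loop in Set-SpackEnv.
    for op in env_ops:
        name, _, value = op.partition("=")
        os.environ[name] = value

set_spack_env(["SPACK_ENV=/tmp/myenv", "SPACK_ENV_VIEW=default"])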

View File

@@ -36,3 +36,9 @@ concretizer:
# on each root spec, allowing different versions and variants of the same package in
# an environment.
unify: true
# Option to deal with possible duplicate nodes (i.e. different nodes from the same package) in the DAG.
duplicates:
# "none": allows a single node for any package in the DAG.
# "minimal": allows the duplication of 'build-tools' nodes only (e.g. py-setuptools, cmake etc.)
# "full" (experimental): allows separation of the entire build-tool stack (e.g. the entire "cmake" subDAG)
strategy: none

View File

@@ -49,6 +49,7 @@ packages:
pbs: [openpbs, torque]
pil: [py-pillow]
pkgconfig: [pkgconf, pkg-config]
qmake: [qt-base, qt]
rpc: [libtirpc]
scalapack: [netlib-scalapack, amdscalapack]
sycl: [hipsycl]
@@ -59,6 +60,7 @@ packages:
xxd: [xxd-standalone, vim]
yacc: [bison, byacc]
ziglang: [zig]
zlib-api: [zlib-ng+compat, zlib]
permissions:
read: world
write: user

View File

@@ -32,9 +32,14 @@ can't be found. You can readily check if any prerequisite for using Spack is mis
Spack will take care of bootstrapping any missing dependency marked as [B]. Dependencies marked as [-] are instead required to be found on the system.
% echo $?
1
In the case of the output shown above Spack detected that both ``clingo`` and ``gnupg``
are missing and it's giving detailed information on why they are needed and whether
they can be bootstrapped. Running a command that concretize a spec, like:
they can be bootstrapped. The return code of this command summarizes the results: if any
dependencies are missing, the return code is ``1``; otherwise it is ``0``. Running a command that
concretizes a spec, like:
.. code-block:: console
@@ -44,7 +49,7 @@ they can be bootstrapped. Running a command that concretize a spec, like:
==> Installing "clingo-bootstrap@spack%apple-clang@12.0.0~docs~ipo+python build_type=Release arch=darwin-catalina-x86_64" from a buildcache
[ ... ]
triggers the bootstrapping of clingo from pre-built binaries as expected.
automatically triggers the bootstrapping of clingo from pre-built binaries as expected.
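As a minimal sketch of the return-code contract described above (assuming spack is on PATH), a setup script could gate on bootstrap readiness like this:

import subprocess

# Per the docs above: return code 1 when any bootstrap dependency is
# missing, 0 otherwise.
result = subprocess.run(["spack", "bootstrap", "status"], capture_output=True, text=True)
if result.returncode != 0:
    raise SystemExit(f"Bootstrap dependencies missing:\n{result.stdout}")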
Users can also bootstrap all the dependencies needed by Spack in a single command, which
might be useful to setup containers or other similar environments:

View File

@@ -104,11 +104,13 @@ Clone `spack-configs <https://github.com/spack/spack-configs>`_ repo and activat
`Intel oneAPI CPU environment <https://github.com/spack/spack-configs/blob/main/INTEL/CPU/spack.yaml>`_ contains applications tested and validated by Intel; this list is constantly extended, and it currently supports:
- `Devito <https://www.devitoproject.org/>`_
- `GROMACS <https://www.gromacs.org/>`_
- `HPCG <https://www.hpcg-benchmark.org/>`_
- `HPL <https://netlib.org/benchmark/hpl/>`_
- `LAMMPS <https://www.lammps.org/#gsc.tab=0>`_
- `OpenFOAM <https://www.openfoam.com/>`_
- `Quantum Espresso <https://www.quantum-espresso.org/>`_
- `STREAM <https://www.cs.virginia.edu/stream/>`_
- `WRF <https://github.com/wrf-model/WRF>`_

View File

@@ -32,7 +32,7 @@ By default, these phases run:
.. code-block:: console
$ python configure.py --bindir ... --destdir ...
$ sip-build --verbose --target-dir ...
$ make
$ make install
@@ -41,30 +41,30 @@ By default, these phases run:
Important files
^^^^^^^^^^^^^^^
Each SIP package comes with a custom ``configure.py`` build script,
written in Python. This script contains instructions to build the project.
Each SIP package comes with a custom configuration file written in Python.
For newer packages, this is called ``project.py``, while in older packages,
it may be called ``configure.py``. This script contains instructions to build
the project.
^^^^^^^^^^^^^^^^^^^^^^^^^
Build system dependencies
^^^^^^^^^^^^^^^^^^^^^^^^^
``SIPPackage`` requires several dependencies. Python is needed to run
the ``configure.py`` build script, and to run the resulting Python
libraries. Qt is needed to provide the ``qmake`` command. SIP is also
needed to build the package. All of these dependencies are automatically
added via the base class
``SIPPackage`` requires several dependencies. Python and SIP are needed at build-time
to run the aforementioned configure script. Python is also needed at run-time to
actually use the installed Python library. Since we are building Python bindings
for C/C++ libraries, Python is also needed as a link dependency. All of these
dependencies are automatically added via the base class.
.. code-block:: python
extends('python')
extends("python", type=("build", "link", "run"))
depends_on("py-sip", type="build")
depends_on('qt', type='build')
depends_on('py-sip', type='build')
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Passing arguments to ``configure.py``
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Passing arguments to ``sip-build``
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Each phase comes with a ``<phase_args>`` function that can be used to pass
arguments to that particular phase. For example, if you need to pass
@@ -73,10 +73,10 @@ arguments to the configure phase, you can use:
.. code-block:: python
def configure_args(self):
return ['--no-python-dbus']
return ["--no-python-dbus"]
A list of valid options can be found by running ``python configure.py --help``.
A list of valid options can be found by running ``sip-build --help``.
^^^^^^^
Testing

View File

@@ -0,0 +1,113 @@
.. Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)
==========================
Using External GPU Support
==========================
Many packages come with a ``+cuda`` or ``+rocm`` variant. With no added
configuration Spack will download and install the needed components.
It may be preferable to use existing system support: the following sections
help with using a system installation of GPU libraries.
-----------------------------------
Using an External ROCm Installation
-----------------------------------
Spack breaks down ROCm into many separate component packages. The following
is an example ``packages.yaml`` that organizes a consistent set of ROCm
components for use by dependent packages:
.. code-block:: yaml
packages:
all:
compiler: [rocmcc@=5.3.0]
variants: amdgpu_target=gfx90a
hip:
buildable: false
externals:
- spec: hip@5.3.0
prefix: /opt/rocm-5.3.0/hip
hsa-rocr-dev:
buildable: false
externals:
- spec: hsa-rocr-dev@5.3.0
prefix: /opt/rocm-5.3.0/
llvm-amdgpu:
buildable: false
externals:
- spec: llvm-amdgpu@5.3.0
prefix: /opt/rocm-5.3.0/llvm/
comgr:
buildable: false
externals:
- spec: comgr@5.3.0
prefix: /opt/rocm-5.3.0/
hipsparse:
buildable: false
externals:
- spec: hipsparse@5.3.0
prefix: /opt/rocm-5.3.0/
hipblas:
buildable: false
externals:
- spec: hipblas@5.3.0
prefix: /opt/rocm-5.3.0/
rocblas:
buildable: false
externals:
- spec: rocblas@5.3.0
prefix: /opt/rocm-5.3.0/
rocprim:
buildable: false
externals:
- spec: rocprim@5.3.0
prefix: /opt/rocm-5.3.0/rocprim/
This is used in combination with the following compiler definition:
.. code-block:: yaml
compilers:
- compiler:
spec: rocmcc@=5.3.0
paths:
cc: /opt/rocm-5.3.0/bin/amdclang
cxx: /opt/rocm-5.3.0/bin/amdclang++
f77: null
fc: /opt/rocm-5.3.0/bin/amdflang
operating_system: rhel8
target: x86_64
This includes the following considerations:
- Each of the listed externals specifies ``buildable: false`` to force Spack
to use only the externals we defined.
- ``spack external find`` can automatically locate some of the ``hip``/``rocm``
packages, but not all of them, and furthermore not in a manner that
guarantees a complementary set if multiple ROCm installations are available.
- The ``prefix`` is the same for several components, but note that others
require listing one of the subdirectories as a prefix.
-----------------------------------
Using an External CUDA Installation
-----------------------------------
CUDA is split into fewer components and is simpler to specify:
.. code-block:: yaml
packages:
all:
variants:
- cuda_arch=70
cuda:
buildable: false
externals:
- spec: cuda@11.0.2
prefix: /opt/cuda/cuda-11.0.2/
where ``/opt/cuda/cuda-11.0.2/lib/`` contains ``libcudart.so``.

View File

@@ -77,6 +77,7 @@ or refer to the full manual below.
extensions
pipelines
signing
gpu_configuration
.. toctree::
:maxdepth: 2

View File

@@ -2243,7 +2243,7 @@ looks like this:
url = "http://www.openssl.org/source/openssl-1.0.1h.tar.gz"
version("1.0.1h", md5="8d6d684a9430d5cc98a62a5d8fbda8cf")
depends_on("zlib")
depends_on("zlib-api")
parallel = False
@@ -4773,17 +4773,17 @@ For example, running:
results in spack checking that the installation created the following **file**:
* ``self.prefix/bin/reframe``
* ``self.prefix.bin.reframe``
and the following **directories**:
* ``self.prefix/bin``
* ``self.prefix/config``
* ``self.prefix/docs``
* ``self.prefix/reframe``
* ``self.prefix/tutorials``
* ``self.prefix/unittests``
* ``self.prefix/cscs-checks``
* ``self.prefix.bin``
* ``self.prefix.config``
* ``self.prefix.docs``
* ``self.prefix.reframe``
* ``self.prefix.tutorials``
* ``self.prefix.unittests``
* ``self.prefix.cscs-checks``
If **any** of these paths are missing, then Spack considers the installation
to have failed.
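The dotted form works because prefix objects turn attribute access into path joins; a minimal illustrative sketch of the idea (not Spack's actual implementation):

import os

class Prefix(str):
    # A str subclass where attribute access joins path components.
    def __getattr__(self, name):
        return Prefix(os.path.join(self, name))

prefix = Prefix("/opt/reframe")
assert prefix.bin.reframe == os.path.join("/opt/reframe", "bin", "reframe")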
@@ -4927,7 +4927,7 @@ installed executable. The check is implemented as follows:
@on_package_attributes(run_tests=True)
def check_list(self):
with working_dir(self.stage.source_path):
reframe = Executable(join_path(self.prefix, "bin", "reframe"))
reframe = Executable(self.prefix.bin.reframe)
reframe("-l")
.. warning::
@@ -5147,8 +5147,8 @@ embedded test parts.
for example in ["ex1", "ex2"]:
with test_part(
self,
"test_example_{0}".format(example),
purpose="run installed {0}".format(example),
f"test_example_{example}",
purpose=f"run installed {example}",
):
exe = which(join_path(self.prefix.bin, example))
exe()
@@ -5226,11 +5226,10 @@ Below illustrates using this feature to compile an example.
...
cxx = which(os.environ["CXX"])
cxx(
"-L{0}".format(self.prefix.lib),
"-I{0}".format(self.prefix.include),
"{0}.cpp".format(exe),
"-o",
exe
f"-L{self.prefix.lib}",
f"-I{self.prefix.include}",
f"{exe}.cpp",
"-o", exe
)
cxx_example = which(exe)
cxx_example()
@@ -5254,7 +5253,7 @@ Saving build-time files
will be important to maintain them so they work across listed or supported
versions of the package.
You can use the ``cache_extra_test_sources`` method to copy directories
You can use the ``cache_extra_test_sources`` helper to copy directories
and or files from the source build stage directory to the package's
installation directory.
@@ -5262,10 +5261,15 @@ The signature for ``cache_extra_test_sources`` is:
.. code-block:: python
def cache_extra_test_sources(self, srcs):
def cache_extra_test_sources(pkg, srcs):
where each argument has the following meaning:
* ``pkg`` is an instance of the package for the spec under test.
* ``srcs`` is a string *or* a list of strings corresponding to the
paths of subdirectories and/or files needed for stand-alone testing.
where ``srcs`` is a string *or* a list of strings corresponding to the
paths of subdirectories and or files needed for stand-alone testing.
The paths must be relative to the staged source directory. Contents of
subdirectories and files are copied to a special test cache subdirectory
of the installation prefix. They are automatically copied to the appropriate
@@ -5286,21 +5290,18 @@ and using ``foo.c`` in a test method is illustrated below.
srcs = ["tests",
join_path("examples", "foo.c"),
join_path("examples", "bar.c")]
self.cache_extra_test_sources(srcs)
cache_extra_test_sources(self, srcs)
def test_foo(self):
exe = "foo"
src_dir = join_path(
self.test_suite.current_test_cache_dir, "examples"
)
src_dir = self.test_suite.current_test_cache_dir.examples
with working_dir(src_dir):
cc = which(os.environ["CC"])
cc(
"-L{0}".format(self.prefix.lib),
"-I{0}".format(self.prefix.include),
"{0}.c".format(exe),
"-o",
exe
f"-L{self.prefix.lib}",
f"-I{self.prefix.include}",
f"{exe}.c",
"-o", exe
)
foo = which(exe)
foo()
@@ -5326,9 +5327,9 @@ the files using the ``self.test_suite.current_test_cache_dir`` property.
In our example above, test methods can use the following paths to reference
the copy of each entry listed in ``srcs``, respectively:
* ``join_path(self.test_suite.current_test_cache_dir, "tests")``
* ``join_path(self.test_suite.current_test_cache_dir, "examples", "foo.c")``
* ``join_path(self.test_suite.current_test_cache_dir, "examples", "bar.c")``
* ``self.test_suite.current_test_cache_dir.tests``
* ``join_path(self.test_suite.current_test_cache_dir.examples, "foo.c")``
* ``join_path(self.test_suite.current_test_cache_dir.examples, "bar.c")``
.. admonition:: Library packages should build stand-alone tests
@@ -5347,7 +5348,7 @@ the copy of each entry listed in ``srcs``, respectively:
If one or more of the copied files needs to be modified to reference
the installed software, it is recommended that those changes be made
to the cached files **once** in the ``copy_test_sources`` method and
***after** the call to ``self.cache_extra_test_sources()``. This will
**after** the call to ``cache_extra_test_sources()``. This will
reduce the amount of unnecessary work in the test method **and** avoid
problems testing in shared instances and facility deployments.
@@ -5394,7 +5395,7 @@ property as shown below.
"""build and run custom-example"""
data_dir = self.test_suite.current_test_data_dir
exe = "custom-example"
src = datadir.join("{0}.cpp".format(exe))
src = data_dir.join(f"{exe}.cpp")
...
# TODO: Build custom-example using src and exe
...
@@ -5444,7 +5445,7 @@ added to the package's ``test`` subdirectory.
db_filename, ".dump", output=str.split, error=str.split
)
for exp in expected:
assert re.search(exp, out), "Expected '{0}' in output".format(exp)
assert re.search(exp, out), f"Expected '{exp}' in output"
If the file was instead copied from the ``tests`` subdirectory of the staged
source code, the path would be obtained as shown below.
@@ -5494,9 +5495,12 @@ Invoking the method is the equivalent of:
.. code-block:: python
errors = []
for check in expected:
if not re.search(check, actual):
raise RuntimeError("Expected '{0}' in output '{1}'".format(check, actual))
errors.append(f"Expected '{check}' in output '{actual}'")
if errors:
raise RuntimeError("\n ".join(errors))
.. _accessing-files:
@@ -5536,7 +5540,7 @@ repository, and installation.
- ``self.test_suite.test_dir_for_spec(self.spec)``
* - Current Spec's Build-time Files
- ``self.test_suite.current_test_cache_dir``
- ``join_path(self.test_suite.current_test_cache_dir, "examples", "foo.c")``
- ``join_path(self.test_suite.current_test_cache_dir.examples, "foo.c")``
* - Current Spec's Custom Test Files
- ``self.test_suite.current_test_data_dir``
- ``join_path(self.test_suite.current_test_data_dir, "hello.f90")``
@@ -6071,7 +6075,7 @@ in the extra attributes can implement this method like this:
@classmethod
def validate_detected_spec(cls, spec, extra_attributes):
"""Check that "compilers" is in the extra attributes."""
msg = ("the extra attribute "compilers" must be set for "
msg = ("the extra attribute 'compilers' must be set for "
"the detected spec '{0}'".format(spec))
assert "compilers" in extra_attributes, msg

View File

@@ -1,13 +1,13 @@
sphinx==6.2.1
sphinx==7.2.5
sphinxcontrib-programoutput==0.17
sphinx_design==0.4.1
sphinx-rtd-theme==1.2.2
sphinx_design==0.5.0
sphinx-rtd-theme==1.3.0
python-levenshtein==0.21.1
docutils==0.18.1
pygments==2.15.1
urllib3==2.0.3
pygments==2.16.1
urllib3==2.0.4
pytest==7.4.0
isort==5.12.0
black==23.1.0
flake8==6.0.0
mypy==1.4.1
black==23.7.0
flake8==6.1.0
mypy==1.5.1

View File

@@ -217,13 +217,7 @@ file would live in the ``build_cache`` directory of a binary mirror::
"binary_cache_checksum": {
"hash_algorithm": "sha256",
"hash": "4f1e46452c35a5e61bcacca205bae1bfcd60a83a399af201a29c95b7cc3e1423"
},
"buildinfo": {
"relative_prefix":
"linux-ubuntu18.04-haswell/gcc-7.5.0/zlib-1.2.12-llv2ysfdxnppzjrt5ldybb5c52qbmoow",
"relative_rpaths": false
}
}
}
-----BEGIN PGP SIGNATURE-----

View File

@@ -18,11 +18,13 @@
import sys
import tempfile
from contextlib import contextmanager
from itertools import accumulate
from typing import Callable, Iterable, List, Match, Optional, Tuple, Union
import llnl.util.symlink
from llnl.util import tty
from llnl.util.lang import dedupe, memoized
from llnl.util.symlink import islink, symlink
from llnl.util.symlink import islink, readlink, resolve_link_target_relative_to_the_link, symlink
from spack.util.executable import Executable, which
from spack.util.path import path_to_os_path, system_path_filter
@@ -101,7 +103,7 @@ def _nop(args, ns=None, follow_symlinks=None):
pass
# follow symlinks (aka don't not follow symlinks)
follow = follow_symlinks or not (os.path.islink(src) and os.path.islink(dst))
follow = follow_symlinks or not (islink(src) and islink(dst))
if follow:
# use the real function if it exists
def lookup(name):
@@ -169,7 +171,7 @@ def rename(src, dst):
if sys.platform == "win32":
# Windows path existence checks will sometimes fail on junctions/links/symlinks
# so check for that case
if os.path.exists(dst) or os.path.islink(dst):
if os.path.exists(dst) or islink(dst):
os.remove(dst)
os.rename(src, dst)
@@ -566,7 +568,7 @@ def set_install_permissions(path):
# If this points to a file maintained in a Spack prefix, it is assumed that
# this function will be invoked on the target. If the file is outside a
# Spack-maintained prefix, the permissions should not be modified.
if os.path.islink(path):
if islink(path):
return
if os.path.isdir(path):
os.chmod(path, 0o755)
@@ -635,7 +637,7 @@ def chmod_x(entry, perms):
@system_path_filter
def copy_mode(src, dest):
"""Set the mode of dest to that of src unless it is a link."""
if os.path.islink(dest):
if islink(dest):
return
src_mode = os.stat(src).st_mode
dest_mode = os.stat(dest).st_mode
@@ -721,26 +723,12 @@ def install(src, dest):
copy(src, dest, _permissions=True)
@system_path_filter
def resolve_link_target_relative_to_the_link(link):
"""
os.path.isdir uses os.path.exists, which for links will check
the existence of the link target. If the link target is relative to
the link, we need to construct a pathname that is valid from
our cwd (which may not be the same as the link's directory)
"""
target = os.readlink(link)
if os.path.isabs(target):
return target
link_dir = os.path.dirname(os.path.abspath(link))
return os.path.join(link_dir, target)
@system_path_filter
def copy_tree(
src: str,
dest: str,
symlinks: bool = True,
allow_broken_symlinks: bool = sys.platform != "win32",
ignore: Optional[Callable[[str], bool]] = None,
_permissions: bool = False,
):
@@ -763,6 +751,8 @@ def copy_tree(
src (str): the directory to copy
dest (str): the destination directory
symlinks (bool): whether or not to preserve symlinks
allow_broken_symlinks (bool): whether or not to allow broken (dangling) symlinks.
On Windows, setting this to True will raise an exception. Defaults to True on non-Windows platforms.
ignore (typing.Callable): function indicating which files to ignore
_permissions (bool): for internal use only
@@ -770,6 +760,8 @@ def copy_tree(
IOError: if *src* does not match any files or directories
ValueError: if *src* is a parent directory of *dest*
"""
if allow_broken_symlinks and sys.platform == "win32":
raise llnl.util.symlink.SymlinkError("Cannot allow broken symlinks on Windows!")
if _permissions:
tty.debug("Installing {0} to {1}".format(src, dest))
else:
@@ -783,6 +775,11 @@ def copy_tree(
if not files:
raise IOError("No such file or directory: '{0}'".format(src))
# For Windows hard-links and junctions, the source path must exist to make a symlink. Add
# all symlinks to this list while traversing the tree, then create them all
# once traversal is finished.
links = []
for src in files:
abs_src = os.path.abspath(src)
if not abs_src.endswith(os.path.sep):
@@ -805,7 +802,7 @@ def copy_tree(
ignore=ignore,
follow_nonexisting=True,
):
if os.path.islink(s):
if islink(s):
link_target = resolve_link_target_relative_to_the_link(s)
if symlinks:
target = os.readlink(s)
@@ -819,7 +816,9 @@ def escaped_path(path):
tty.debug("Redirecting link {0} to {1}".format(target, new_target))
target = new_target
symlink(target, d)
links.append((target, d, s))
continue
elif os.path.isdir(link_target):
mkdirp(d)
else:
@@ -834,9 +833,17 @@ def escaped_path(path):
set_install_permissions(d)
copy_mode(s, d)
for target, d, s in links:
symlink(target, d, allow_broken_symlinks=allow_broken_symlinks)
if _permissions:
set_install_permissions(d)
copy_mode(s, d)
@system_path_filter
def install_tree(src, dest, symlinks=True, ignore=None):
def install_tree(
src, dest, symlinks=True, ignore=None, allow_broken_symlinks=sys.platform != "win32"
):
"""Recursively install an entire directory tree rooted at *src*.
Same as :py:func:`copy_tree` with the addition of setting proper
@@ -847,12 +854,21 @@ def install_tree(src, dest, symlinks=True, ignore=None):
dest (str): the destination directory
symlinks (bool): whether or not to preserve symlinks
ignore (typing.Callable): function indicating which files to ignore
allow_broken_symlinks (bool): whether or not to allow broken (dangling) symlinks.
On Windows, setting this to True will raise an exception.
Raises:
IOError: if *src* does not match any files or directories
ValueError: if *src* is a parent directory of *dest*
"""
copy_tree(src, dest, symlinks=symlinks, ignore=ignore, _permissions=True)
copy_tree(
src,
dest,
symlinks=symlinks,
allow_broken_symlinks=allow_broken_symlinks,
ignore=ignore,
_permissions=True,
)
@system_path_filter
@@ -1256,7 +1272,12 @@ def traverse_tree(
Keyword Arguments:
order (str): Whether to do pre- or post-order traversal. Accepted
values are 'pre' and 'post'
ignore (typing.Callable): function indicating which files to ignore
ignore (typing.Callable): function indicating which files to ignore. This will also
ignore symlinks if they point to an ignored file (regardless of whether the symlink
is explicitly ignored); note this only supports one layer of indirection (i.e. if
you have x -> y -> z, and z is ignored but x/y are not, then y would be ignored
but not x). To avoid this, make sure the ignore function also ignores the
symlink paths.
follow_nonexisting (bool): Whether to descend into directories in
``src`` that do not exit in ``dest``. Default is True
follow_links (bool): Whether to descend into symlinks in ``src``
@@ -1283,11 +1304,24 @@ def traverse_tree(
dest_child = os.path.join(dest_path, f)
rel_child = os.path.join(rel_path, f)
# If the source path is a link and the link's source is ignored, then ignore the link too,
# but only do this if the ignore is defined.
if ignore is not None:
if islink(source_child) and not follow_links:
target = readlink(source_child)
all_parents = accumulate(target.split(os.sep), lambda x, y: os.path.join(x, y))
if any(map(ignore, all_parents)):
tty.warn(
f"Skipping {source_path} because the source or a part of the source's "
f"path is included in the ignores."
)
continue
# Treat as a directory
# TODO: for symlinks, os.path.isdir looks for the link target. If the
# target is relative to the link, then that may not resolve properly
# relative to our cwd - see resolve_link_target_relative_to_the_link
if os.path.isdir(source_child) and (follow_links or not os.path.islink(source_child)):
if os.path.isdir(source_child) and (follow_links or not islink(source_child)):
# When follow_nonexisting isn't set, don't descend into dirs
# in source that do not exist in dest
if follow_nonexisting or os.path.exists(dest_child):
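A standalone illustration of the accumulate() call above, which expands a link target into every parent prefix that the ignore function is applied to:

import os
from itertools import accumulate

target = os.path.join("x", "y", "z")
parents = list(accumulate(target.split(os.sep), lambda a, b: os.path.join(a, b)))
print(parents)  # ['x', 'x/y', 'x/y/z'] on POSIX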
@@ -1313,7 +1347,11 @@ def traverse_tree(
def lexists_islink_isdir(path):
"""Computes the tuple (lexists(path), islink(path), isdir(path)) in a minimal
number of stat calls."""
number of stat calls on POSIX. On Windows, use the os.path and symlink.islink methods instead."""
if sys.platform == "win32":
if not os.path.lexists(path):
return False, False, False
return os.path.lexists(path), islink(path), os.path.isdir(path)
# First try to lstat, so we know if it's a link or not.
try:
lst = os.lstat(path)
@@ -1528,7 +1566,7 @@ def remove_if_dead_link(path):
Parameters:
path (str): The potential dead link
"""
if os.path.islink(path) and not os.path.exists(path):
if islink(path) and not os.path.exists(path):
os.unlink(path)
@@ -1587,7 +1625,7 @@ def remove_linked_tree(path):
kwargs["onerror"] = readonly_file_handler(ignore_errors=True)
if os.path.exists(path):
if os.path.islink(path):
if islink(path):
shutil.rmtree(os.path.realpath(path), **kwargs)
os.unlink(path)
else:
@@ -1754,9 +1792,14 @@ def find(root, files, recursive=True):
files = [files]
if recursive:
return _find_recursive(root, files)
tty.debug(f"Find (recursive): {root} {str(files)}")
result = _find_recursive(root, files)
else:
return _find_non_recursive(root, files)
tty.debug(f"Find (not recursive): {root} {str(files)}")
result = _find_non_recursive(root, files)
tty.debug(f"Find complete: {root} {str(files)}")
return result
@system_path_filter
@@ -2688,7 +2731,7 @@ def remove_directory_contents(dir):
"""Remove all contents of a directory."""
if os.path.exists(dir):
for entry in [os.path.join(dir, entry) for entry in os.listdir(dir)]:
if os.path.isfile(entry) or os.path.islink(entry):
if os.path.isfile(entry) or islink(entry):
os.unlink(entry)
else:
shutil.rmtree(entry)

View File

@@ -143,7 +143,7 @@ def get_fh(self, path: str) -> IO:
def release_by_stat(self, stat):
key = (stat.st_dev, stat.st_ino, os.getpid())
open_file = self._descriptors.get(key)
assert open_file, "Attempted to close non-existing inode: %s" % stat.st_inode
assert open_file, "Attempted to close non-existing inode: %s" % stat.st_ino
open_file.refs -= 1
if not open_file.refs:

View File

@@ -2,77 +2,188 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import errno
import os
import re
import shutil
import subprocess
import sys
import tempfile
from os.path import exists, join
from llnl.util import lang
from llnl.util import lang, tty
from spack.error import SpackError
from spack.util.path import system_path_filter
if sys.platform == "win32":
from win32file import CreateHardLink
is_windows = sys.platform == "win32"
def symlink(real_path, link_path):
"""
Create a symbolic link.
On Windows, use junctions if os.symlink fails.
def symlink(source_path: str, link_path: str, allow_broken_symlinks: bool = not is_windows):
"""
if sys.platform != "win32":
os.symlink(real_path, link_path)
elif _win32_can_symlink():
# Windows requires target_is_directory=True when the target is a dir.
os.symlink(real_path, link_path, target_is_directory=os.path.isdir(real_path))
else:
try:
# Try to use junctions
_win32_junction(real_path, link_path)
except OSError as e:
if e.errno == errno.EEXIST:
# EEXIST error indicates that file we're trying to "link"
# is already present, don't bother trying to copy which will also fail
# just raise
raise
Create a link.
On non-Windows platforms, and on Windows with System Administrator
privileges, this will be a normal symbolic link via
os.symlink.
On Windows without those privileges the link will be a
junction for a directory and a hard link for a file.
On Windows the various link types are:
Symbolic Link: A link to a file or directory on the
same or different volume (drive letter) or even to
a remote file or directory (using UNC in its path).
Need System Administrator privileges to make these.
Hard Link: A link to a file on the same volume (drive
letter) only. Every file (file's data) has at least 1
hard link (file's name). But when this method creates
a new hard link there will be 2. Deleting all hard
links effectively deletes the file. Don't need System
Administrator privileges.
Junction: A link to a directory on the same or different
volume (drive letter) but not to a remote directory. Don't
need System Administrator privileges.
Parameters:
source_path (str): The real file or directory that the link points to.
Must be absolute OR relative to the link.
link_path (str): The path where the link will exist.
allow_broken_symlinks (bool): On Linux or Mac, don't raise an exception if the source_path
doesn't exist. This will still raise an exception on Windows.
"""
source_path = os.path.normpath(source_path)
win_source_path = source_path
link_path = os.path.normpath(link_path)
# Never allow broken links on Windows.
if sys.platform == "win32" and allow_broken_symlinks:
raise ValueError("allow_broken_symlinks parameter cannot be True on Windows.")
if not allow_broken_symlinks:
# Perform basic checks to make sure symlinking will succeed
if os.path.lexists(link_path):
raise SymlinkError(f"Link path ({link_path}) already exists. Cannot create link.")
if not os.path.exists(source_path):
if os.path.isabs(source_path) and not allow_broken_symlinks:
# An absolute source path that does not exist will result in a broken link.
raise SymlinkError(
f"Source path ({source_path}) is absolute but does not exist. Resulting "
f"link would be broken so not making link."
)
else:
# If all else fails, fall back to copying files
shutil.copyfile(real_path, link_path)
# os.symlink can create a link when the given source path is relative to
# the link path. Emulate this behavior and check to see if the source exists
# relative to the link path ahead of link creation to prevent broken
# links from being made.
link_parent_dir = os.path.dirname(link_path)
relative_path = os.path.join(link_parent_dir, source_path)
if os.path.exists(relative_path):
# In order to work on Windows, the source path needs to be modified to be
# relative because hard links/junctions don't resolve relative paths the same
# way as os.symlink. This is ignored on other operating systems.
win_source_path = relative_path
elif not allow_broken_symlinks:
raise SymlinkError(
f"The source path ({source_path}) is not relative to the link path "
f"({link_path}). Resulting link would be broken so not making link."
)
# Create the symlink
if sys.platform == "win32" and not _windows_can_symlink():
_windows_create_link(win_source_path, link_path)
else:
os.symlink(source_path, link_path, target_is_directory=os.path.isdir(source_path))
def islink(path):
return os.path.islink(path) or _win32_is_junction(path)
def islink(path: str) -> bool:
"""Override os.islink to give correct answer for spack logic.
For Non-Windows: a link can be determined with the os.path.islink method.
Windows-only methods will return false for other operating systems.
For Windows: spack considers symlinks, hard links, and junctions to
all be links, so if any of those are True, return True.
Args:
path (str): path to check if it is a link.
Returns:
bool - whether the path is any kind of link or not.
"""
return any([os.path.islink(path), _windows_is_junction(path), _windows_is_hardlink(path)])
# '_win32' functions based on
# https://github.com/Erotemic/ubelt/blob/master/ubelt/util_links.py
def _win32_junction(path, link):
# junctions require absolute paths
if not os.path.isabs(link):
link = os.path.abspath(link)
def _windows_is_hardlink(path: str) -> bool:
"""Determines if a path is a windows hard link. This is accomplished
by looking at the number of links using os.stat. A non-hard-linked file
will have a st_nlink value of 1, whereas a hard link will have a value
larger than 1. Note that both the original and hard-linked file will
return True because they share the same inode.
# os.symlink will fail if link exists, emulate the behavior here
if exists(link):
raise OSError(errno.EEXIST, "File exists: %s -> %s" % (link, path))
Args:
path (str): Windows path to check for a hard link
if not os.path.isabs(path):
parent = os.path.join(link, os.pardir)
path = os.path.join(parent, path)
path = os.path.abspath(path)
Returns:
bool - Whether the path is a hard link or not.
"""
if sys.platform != "win32" or os.path.islink(path) or not os.path.exists(path):
return False
CreateHardLink(link, path)
return os.stat(path).st_nlink > 1
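A quick standalone demonstration of the st_nlink heuristic used above:

import os
import tempfile

with tempfile.TemporaryDirectory() as tmp:
    original = os.path.join(tmp, "a.txt")
    open(original, "w").close()
    assert os.stat(original).st_nlink == 1  # only the file's own name
    os.link(original, os.path.join(tmp, "b.txt"))
    assert os.stat(original).st_nlink == 2  # now hard-linked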
def _windows_is_junction(path: str) -> bool:
"""Determines if a path is a windows junction. A junction can be
determined using a bitwise AND operation between the file's
attribute bitmask and the known junction bitmask (0x400).
Args:
path (str): A non-file path
Returns:
bool - whether the path is a junction or not.
"""
if sys.platform != "win32" or os.path.islink(path) or os.path.isfile(path):
return False
import ctypes.wintypes
get_file_attributes = ctypes.windll.kernel32.GetFileAttributesW # type: ignore[attr-defined]
get_file_attributes.argtypes = (ctypes.wintypes.LPWSTR,)
get_file_attributes.restype = ctypes.wintypes.DWORD
invalid_file_attributes = 0xFFFFFFFF
reparse_point = 0x400
file_attr = get_file_attributes(str(path))
if file_attr == invalid_file_attributes:
return False
return file_attr & reparse_point > 0
@lang.memoized
def _win32_can_symlink():
def _windows_can_symlink() -> bool:
"""
Determines if windows is able to make a symlink depending on
the system configuration and the level of the user's permissions.
"""
if sys.platform != "win32":
tty.warn("windows_can_symlink method can't be used on non-Windows OS.")
return False
tempdir = tempfile.mkdtemp()
dpath = join(tempdir, "dpath")
fpath = join(tempdir, "fpath.txt")
dpath = os.path.join(tempdir, "dpath")
fpath = os.path.join(tempdir, "fpath.txt")
dlink = join(tempdir, "dlink")
flink = join(tempdir, "flink.txt")
dlink = os.path.join(tempdir, "dlink")
flink = os.path.join(tempdir, "flink.txt")
import llnl.util.filesystem as fs
@@ -96,24 +207,136 @@ def _win32_can_symlink():
return can_symlink_directories and can_symlink_files
def _win32_is_junction(path):
def _windows_create_link(source: str, link: str):
"""
Determines if a path is a win32 junction
Attempts to create a Hard Link or Junction as an alternative
to a symbolic link. This is called when symbolic links cannot
be created.
"""
if os.path.islink(path):
return False
if sys.platform != "win32":
raise SymlinkError("windows_create_link method can't be used on non-Windows OS.")
elif os.path.isdir(source):
_windows_create_junction(source=source, link=link)
elif os.path.isfile(source):
_windows_create_hard_link(path=source, link=link)
else:
raise SymlinkError(
f"Cannot create link from {source}. It is neither a file nor a directory."
)
if sys.platform == "win32":
import ctypes.wintypes
GetFileAttributes = ctypes.windll.kernel32.GetFileAttributesW
GetFileAttributes.argtypes = (ctypes.wintypes.LPWSTR,)
GetFileAttributes.restype = ctypes.wintypes.DWORD
def _windows_create_junction(source: str, link: str):
"""Duly verify that the path and link are eligible to create a junction,
then create the junction.
"""
if sys.platform != "win32":
raise SymlinkError("windows_create_junction method can't be used on non-Windows OS.")
elif not os.path.exists(source):
raise SymlinkError("Source path does not exist, cannot create a junction.")
elif os.path.lexists(link):
raise SymlinkError("Link path already exists, cannot create a junction.")
elif not os.path.isdir(source):
raise SymlinkError("Source path is not a directory, cannot create a junction.")
INVALID_FILE_ATTRIBUTES = 0xFFFFFFFF
FILE_ATTRIBUTE_REPARSE_POINT = 0x400
import subprocess
res = GetFileAttributes(path)
return res != INVALID_FILE_ATTRIBUTES and bool(res & FILE_ATTRIBUTE_REPARSE_POINT)
cmd = ["cmd", "/C", "mklink", "/J", link, source]
proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
out, err = proc.communicate()
tty.debug(out.decode())
if proc.returncode != 0:
err = err.decode()
tty.error(err)
raise SymlinkError("Make junction command returned a non-zero return code.", err)
return False
def _windows_create_hard_link(path: str, link: str):
"""Duly verify that the path and link are eligible to create a hard
link, then create the hard link.
"""
if sys.platform != "win32":
raise SymlinkError("windows_create_hard_link method can't be used on non-Windows OS.")
elif not os.path.exists(path):
raise SymlinkError(f"File path {path} does not exist. Cannot create hard link.")
elif os.path.lexists(link):
raise SymlinkError(f"Link path ({link}) already exists. Cannot create hard link.")
elif not os.path.isfile(path):
raise SymlinkError(f"File path ({link}) is not a file. Cannot create hard link.")
else:
tty.debug(f"Creating hard link {link} pointing to {path}")
CreateHardLink(link, path)
def readlink(path: str):
"""Spack utility to override of os.readlink method to work cross platform"""
if _windows_is_hardlink(path):
return _windows_read_hard_link(path)
elif _windows_is_junction(path):
return _windows_read_junction(path)
else:
return os.readlink(path)
def _windows_read_hard_link(link: str) -> str:
"""Find all of the files that point to the same inode as the link"""
if sys.platform != "win32":
raise SymlinkError("Can't read hard link on non-Windows OS.")
link = os.path.abspath(link)
fsutil_cmd = ["fsutil", "hardlink", "list", link]
proc = subprocess.Popen(fsutil_cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
out, err = proc.communicate()
if proc.returncode != 0:
raise SymlinkError(f"An error occurred while reading hard link: {err.decode()}")
# fsutil response does not include the drive name, so append it back to each linked file.
drive, link_tail = os.path.splitdrive(os.path.abspath(link))
links = set([os.path.join(drive, p) for p in out.decode().splitlines()])
links.remove(link)
if len(links) == 1:
return links.pop()
elif len(links) > 1:
# TODO: How best to handle the case where 3 or more paths point to a single inode?
raise SymlinkError(f"Found multiple paths pointing to the same inode {links}")
else:
raise SymlinkError("Cannot determine hard link source path.")
def _windows_read_junction(link: str):
"""Find the path that a junction points to."""
if sys.platform != "win32":
raise SymlinkError("Can't read junction on non-Windows OS.")
link = os.path.abspath(link)
link_basename = os.path.basename(link)
link_parent = os.path.dirname(link)
fsutil_cmd = ["dir", "/a:l", link_parent]
proc = subprocess.Popen(fsutil_cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
out, err = proc.communicate()
if proc.returncode != 0:
raise SymlinkError(f"An error occurred while reading junction: {err.decode()}")
matches = re.search(rf"<JUNCTION>\s+{link_basename} \[(.*)]", out.decode())
if matches:
return matches.group(1)
else:
raise SymlinkError("Could not find junction path.")
@system_path_filter
def resolve_link_target_relative_to_the_link(link):
"""
os.path.isdir uses os.path.exists, which for links will check
the existence of the link target. If the link target is relative to
the link, we need to construct a pathname that is valid from
our cwd (which may not be the same as the link's directory)
"""
target = readlink(link)
if os.path.isabs(target):
return target
link_dir = os.path.dirname(os.path.abspath(link))
return os.path.join(link_dir, target)
class SymlinkError(SpackError):
"""Exception class for errors raised while creating symlinks,
junctions and hard links
"""

View File

@@ -12,6 +12,7 @@
import traceback
from datetime import datetime
from sys import platform as _platform
from typing import NoReturn
if _platform != "win32":
import fcntl
@@ -244,7 +245,7 @@ def warn(message, *args, **kwargs):
info("Warning: " + str(message), *args, **kwargs)
def die(message, *args, **kwargs):
def die(message, *args, **kwargs) -> NoReturn:
kwargs.setdefault("countback", 4)
error(message, *args, **kwargs)
sys.exit(1)
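A minimal sketch of what the NoReturn annotation buys: type checkers treat code after a call to die() as unreachable, so narrowing survives the call:

import sys
from typing import NoReturn

def die(message: str) -> NoReturn:
    print(f"Error: {message}", file=sys.stderr)
    sys.exit(1)

def read_config(path: str) -> str:
    if not path:
        die("no config path given")
    return open(path).read()  # mypy knows path is non-empty here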

View File

@@ -780,7 +780,7 @@ def __enter__(self):
raise RuntimeError("file argument must be set by __init__ ")
# Open both write and reading on logfile
if type(self.logfile) == io.StringIO:
if isinstance(self.logfile, io.StringIO):
self._ioflag = True
# cannot have two streams on tempfile, so we must make our own
sys.stdout = self.logfile
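The isinstance form also matches subclasses, which the exact type() comparison silently missed; a standalone illustration with a hypothetical subclass:

import io

class BufferedLog(io.StringIO):  # hypothetical StringIO subclass
    pass

log = BufferedLog()
assert isinstance(log, io.StringIO)  # matches subclasses
assert type(log) is not io.StringIO  # the exact-type check would fail here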

View File

@@ -286,7 +286,7 @@ def _check_build_test_callbacks(pkgs, error_cls):
"""Ensure stand-alone test method is not included in build-time callbacks"""
errors = []
for pkg_name in pkgs:
pkg_cls = spack.repo.path.get_pkg_class(pkg_name)
pkg_cls = spack.repo.PATH.get_pkg_class(pkg_name)
test_callbacks = getattr(pkg_cls, "build_time_test_callbacks", None)
# TODO (post-34236): "test*"->"test_*" once remove deprecated methods
@@ -312,7 +312,7 @@ def _check_patch_urls(pkgs, error_cls):
errors = []
for pkg_name in pkgs:
pkg_cls = spack.repo.path.get_pkg_class(pkg_name)
pkg_cls = spack.repo.PATH.get_pkg_class(pkg_name)
for condition, patches in pkg_cls.patches.items():
for patch in patches:
if not isinstance(patch, spack.patch.UrlPatch):
@@ -342,7 +342,7 @@ def _search_for_reserved_attributes_names_in_packages(pkgs, error_cls):
errors = []
for pkg_name in pkgs:
name_definitions = collections.defaultdict(list)
pkg_cls = spack.repo.path.get_pkg_class(pkg_name)
pkg_cls = spack.repo.PATH.get_pkg_class(pkg_name)
for cls_item in inspect.getmro(pkg_cls):
for name in RESERVED_NAMES:
@@ -383,7 +383,7 @@ def _ensure_packages_are_pickeleable(pkgs, error_cls):
"""Ensure that package objects are pickleable"""
errors = []
for pkg_name in pkgs:
pkg_cls = spack.repo.path.get_pkg_class(pkg_name)
pkg_cls = spack.repo.PATH.get_pkg_class(pkg_name)
pkg = pkg_cls(spack.spec.Spec(pkg_name))
try:
pickle.dumps(pkg)
@@ -424,7 +424,7 @@ def _ensure_all_versions_can_produce_a_fetcher(pkgs, error_cls):
"""Ensure all versions in a package can produce a fetcher"""
errors = []
for pkg_name in pkgs:
pkg_cls = spack.repo.path.get_pkg_class(pkg_name)
pkg_cls = spack.repo.PATH.get_pkg_class(pkg_name)
pkg = pkg_cls(spack.spec.Spec(pkg_name))
try:
spack.fetch_strategy.check_pkg_attributes(pkg)
@@ -449,7 +449,7 @@ def _ensure_docstring_and_no_fixme(pkgs, error_cls):
]
for pkg_name in pkgs:
details = []
filename = spack.repo.path.filename_for_package_name(pkg_name)
filename = spack.repo.PATH.filename_for_package_name(pkg_name)
with open(filename, "r") as package_file:
for i, line in enumerate(package_file):
pattern = next((r for r in fixme_regexes if r.search(line)), None)
@@ -461,7 +461,7 @@ def _ensure_docstring_and_no_fixme(pkgs, error_cls):
error_msg = "Package '{}' contains boilerplate that need to be removed"
errors.append(error_cls(error_msg.format(pkg_name), details))
pkg_cls = spack.repo.path.get_pkg_class(pkg_name)
pkg_cls = spack.repo.PATH.get_pkg_class(pkg_name)
if not pkg_cls.__doc__:
error_msg = "Package '{}' miss a docstring"
errors.append(error_cls(error_msg.format(pkg_name), []))
@@ -474,7 +474,7 @@ def _ensure_all_packages_use_sha256_checksums(pkgs, error_cls):
"""Ensure no packages use md5 checksums"""
errors = []
for pkg_name in pkgs:
pkg_cls = spack.repo.path.get_pkg_class(pkg_name)
pkg_cls = spack.repo.PATH.get_pkg_class(pkg_name)
if pkg_cls.manual_download:
continue
@@ -511,7 +511,7 @@ def _ensure_env_methods_are_ported_to_builders(pkgs, error_cls):
"""Ensure that methods modifying the build environment are ported to builder classes."""
errors = []
for pkg_name in pkgs:
pkg_cls = spack.repo.path.get_pkg_class(pkg_name)
pkg_cls = spack.repo.PATH.get_pkg_class(pkg_name)
buildsystem_variant, _ = pkg_cls.variants["build_system"]
buildsystem_names = [getattr(x, "value", x) for x in buildsystem_variant.values]
builder_cls_names = [spack.builder.BUILDER_CLS[x].__name__ for x in buildsystem_names]
@@ -538,7 +538,7 @@ def _linting_package_file(pkgs, error_cls):
"""Check for correctness of links"""
errors = []
for pkg_name in pkgs:
pkg_cls = spack.repo.path.get_pkg_class(pkg_name)
pkg_cls = spack.repo.PATH.get_pkg_class(pkg_name)
# Does the homepage have http, and if so, does https work?
if pkg_cls.homepage.startswith("http://"):
@@ -562,7 +562,7 @@ def _unknown_variants_in_directives(pkgs, error_cls):
"""Report unknown or wrong variants in directives for this package"""
errors = []
for pkg_name in pkgs:
pkg_cls = spack.repo.path.get_pkg_class(pkg_name)
pkg_cls = spack.repo.PATH.get_pkg_class(pkg_name)
# Check "conflicts" directive
for conflict, triggers in pkg_cls.conflicts.items():
@@ -628,15 +628,15 @@ def _unknown_variants_in_dependencies(pkgs, error_cls):
"""Report unknown dependencies and wrong variants for dependencies"""
errors = []
for pkg_name in pkgs:
pkg_cls = spack.repo.path.get_pkg_class(pkg_name)
filename = spack.repo.path.filename_for_package_name(pkg_name)
pkg_cls = spack.repo.PATH.get_pkg_class(pkg_name)
filename = spack.repo.PATH.filename_for_package_name(pkg_name)
for dependency_name, dependency_data in pkg_cls.dependencies.items():
# No need to analyze virtual packages
if spack.repo.path.is_virtual(dependency_name):
if spack.repo.PATH.is_virtual(dependency_name):
continue
try:
dependency_pkg_cls = spack.repo.path.get_pkg_class(dependency_name)
dependency_pkg_cls = spack.repo.PATH.get_pkg_class(dependency_name)
except spack.repo.UnknownPackageError:
# This dependency is completely missing, so report
# and continue the analysis
@@ -675,7 +675,7 @@ def _ensure_variant_defaults_are_parsable(pkgs, error_cls):
"""Ensures that variant defaults are present and parsable from cli"""
errors = []
for pkg_name in pkgs:
pkg_cls = spack.repo.path.get_pkg_class(pkg_name)
pkg_cls = spack.repo.PATH.get_pkg_class(pkg_name)
for variant_name, entry in pkg_cls.variants.items():
variant, _ = entry
default_is_parsable = (
@@ -709,18 +709,33 @@ def _ensure_variant_defaults_are_parsable(pkgs, error_cls):
return errors
@package_directives
def _ensure_variants_have_descriptions(pkgs, error_cls):
"""Ensures that all variants have a description."""
errors = []
for pkg_name in pkgs:
pkg_cls = spack.repo.PATH.get_pkg_class(pkg_name)
for variant_name, entry in pkg_cls.variants.items():
variant, _ = entry
if not variant.description:
error_msg = "Variant '{}' in package '{}' is missing a description"
errors.append(error_cls(error_msg.format(variant_name, pkg_name), []))
return errors
@package_directives
def _version_constraints_are_satisfiable_by_some_version_in_repo(pkgs, error_cls):
"""Report if version constraints used in directives are not satisfiable"""
errors = []
for pkg_name in pkgs:
pkg_cls = spack.repo.path.get_pkg_class(pkg_name)
filename = spack.repo.path.filename_for_package_name(pkg_name)
pkg_cls = spack.repo.PATH.get_pkg_class(pkg_name)
filename = spack.repo.PATH.filename_for_package_name(pkg_name)
dependencies_to_check = []
for dependency_name, dependency_data in pkg_cls.dependencies.items():
# Skip virtual dependencies for the time being, check on
# their versions can be added later
if spack.repo.path.is_virtual(dependency_name):
if spack.repo.PATH.is_virtual(dependency_name):
continue
dependencies_to_check.extend([edge.spec for edge in dependency_data.values()])
@@ -729,7 +744,7 @@ def _version_constraints_are_satisfiable_by_some_version_in_repo(pkgs, error_cls
for s in dependencies_to_check:
dependency_pkg_cls = None
try:
dependency_pkg_cls = spack.repo.path.get_pkg_class(s.name)
dependency_pkg_cls = spack.repo.PATH.get_pkg_class(s.name)
# Some packages have hacks that might cause failures on some platform
# Allow to explicitly set conditions to skip version checks in that case
skip_conditions = getattr(dependency_pkg_cls, "skip_version_audit", [])
@@ -772,7 +787,7 @@ def _analyze_variants_in_directive(pkg, constraint, directive, error_cls):
except variant_exceptions as e:
summary = pkg.name + ': wrong variant in "{0}" directive'
summary = summary.format(directive)
filename = spack.repo.path.filename_for_package_name(pkg.name)
filename = spack.repo.PATH.filename_for_package_name(pkg.name)
error_msg = str(e).strip()
if isinstance(e, KeyError):

View File

@@ -9,7 +9,6 @@
import io
import itertools
import json
import multiprocessing.pool
import os
import re
import shutil
@@ -52,6 +51,7 @@
import spack.util.url as url_util
import spack.util.web as web_util
from spack.caches import misc_cache_location
from spack.package_prefs import get_package_dir_permissions, get_package_group
from spack.relocate_text import utf8_paths_to_single_binary_regex
from spack.spec import Spec
from spack.stage import Stage
@@ -875,32 +875,18 @@ def _read_specs_and_push_index(file_list, read_method, cache_prefix, db, temp_di
db: A spack database used for adding specs and then writing the index.
temp_dir (str): Location to write index.json and hash for pushing
concurrency (int): Number of parallel processes to use when fetching
Return:
None
"""
for file in file_list:
contents = read_method(file)
# Need full spec.json name or this gets confused with index.json.
if file.endswith(".json.sig"):
specfile_json = Spec.extract_json_from_clearsig(contents)
fetched_spec = Spec.from_dict(specfile_json)
elif file.endswith(".json"):
fetched_spec = Spec.from_json(contents)
else:
continue
def _fetch_spec_from_mirror(spec_url):
spec_file_contents = read_method(spec_url)
if spec_file_contents:
# Need full spec.json name or this gets confused with index.json.
if spec_url.endswith(".json.sig"):
specfile_json = Spec.extract_json_from_clearsig(spec_file_contents)
return Spec.from_dict(specfile_json)
if spec_url.endswith(".json"):
return Spec.from_json(spec_file_contents)
tp = multiprocessing.pool.ThreadPool(processes=concurrency)
try:
fetched_specs = tp.map(
llnl.util.lang.star(_fetch_spec_from_mirror), [(f,) for f in file_list]
)
finally:
tp.terminate()
tp.join()
for fetched_spec in fetched_specs:
db.add(fetched_spec, None)
db.mark(fetched_spec, "in_buildcache", True)
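The same pattern, reduced to a standalone sketch with a local stand-in for llnl.util.lang.star (a helper that unpacks an argument tuple into a call):

import multiprocessing.pool

def star(func):
    # Return a wrapper that unpacks a tuple of arguments into func.
    return lambda args: func(*args)

def fetch(url):
    return f"contents of {url}"  # stand-in for read_method(url)

urls = ["a.spec.json", "b.spec.json.sig"]
with multiprocessing.pool.ThreadPool(processes=2) as tp:
    contents = tp.map(star(fetch), [(u,) for u in urls])
print(contents)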
@@ -1312,15 +1298,7 @@ def _build_tarball_in_stage_dir(spec: Spec, out_url: str, stage_dir: str, option
else:
raise ValueError("{0} not a valid spec file type".format(spec_file))
spec_dict["buildcache_layout_version"] = 1
bchecksum = {}
bchecksum["hash_algorithm"] = "sha256"
bchecksum["hash"] = checksum
spec_dict["binary_cache_checksum"] = bchecksum
# Add original install prefix relative to layout root to spec.json.
# This will be used to determine is the directory layout has changed.
buildinfo = {}
buildinfo["relative_prefix"] = os.path.relpath(spec.prefix, spack.store.STORE.layout.root)
spec_dict["buildinfo"] = buildinfo
spec_dict["binary_cache_checksum"] = {"hash_algorithm": "sha256", "hash": checksum}
with open(specfile_path, "w") as outfile:
# Note: when using gpg clear sign, we need to avoid long lines (19995 chars).
@@ -1799,6 +1777,27 @@ def _extract_inner_tarball(spec, filename, extract_to, unsigned, remote_checksum
return tarfile_path
def _tar_strip_component(tar: tarfile.TarFile, prefix: str):
"""Strip the top-level directory `prefix` from the member names in a tarfile."""
# Including trailing /, otherwise we end up with absolute paths.
regex = re.compile(re.escape(prefix) + "/*")
# Remove the top-level directory from the member (link)names.
# Note: when a tarfile is created, relative in-prefix symlinks are
# expanded to matching member names of tarfile entries. So, we have
# to ensure that those are updated too.
# Absolute symlinks are copied verbatim -- relocation should take care of
# them.
for m in tar.getmembers():
result = regex.match(m.name)
assert result is not None
m.name = m.name[result.end() :]
if m.linkname:
result = regex.match(m.linkname)
if result:
m.linkname = m.linkname[result.end() :]
def extract_tarball(spec, download_result, unsigned=False, force=False):
"""
extract binary tarball for given package into install area
@@ -1809,6 +1808,14 @@ def extract_tarball(spec, download_result, unsigned=False, force=False):
else:
raise NoOverwriteException(str(spec.prefix))
# Create the install prefix
fsys.mkdirp(
spec.prefix,
mode=get_package_dir_permissions(spec),
group=get_package_group(spec),
default_perms="parents",
)
specfile_path = download_result["specfile_stage"].save_filename
with open(specfile_path, "r") as inputfile:
@@ -1862,42 +1869,23 @@ def extract_tarball(spec, download_result, unsigned=False, force=False):
tarfile_path, size, contents, "sha256", expected, local_checksum
)
new_relative_prefix = str(os.path.relpath(spec.prefix, spack.store.STORE.layout.root))
# if the original relative prefix is in the spec file use it
buildinfo = spec_dict.get("buildinfo", {})
old_relative_prefix = buildinfo.get("relative_prefix", new_relative_prefix)
rel = buildinfo.get("relative_rpaths")
info = "old relative prefix %s\nnew relative prefix %s\nrelative rpaths %s"
tty.debug(info % (old_relative_prefix, new_relative_prefix, rel), level=2)
# Extract the tarball into the store root, presumably on the same filesystem.
# The directory created is the base directory name of the old prefix.
# Moving the old prefix name to the new prefix location should preserve
# hard links and symbolic links.
extract_tmp = os.path.join(spack.store.STORE.layout.root, ".tmp")
mkdirp(extract_tmp)
extracted_dir = os.path.join(extract_tmp, old_relative_prefix.split(os.path.sep)[-1])
with closing(tarfile.open(tarfile_path, "r")) as tar:
try:
tar.extractall(path=extract_tmp)
except Exception as e:
_delete_staged_downloads(download_result)
shutil.rmtree(extracted_dir)
raise e
try:
shutil.move(extracted_dir, spec.prefix)
except Exception as e:
with closing(tarfile.open(tarfile_path, "r")) as tar:
# Remove install prefix from tarfile to extract directly into spec.prefix
_tar_strip_component(tar, prefix=_ensure_common_prefix(tar))
tar.extractall(path=spec.prefix)
except Exception:
shutil.rmtree(spec.prefix, ignore_errors=True)
_delete_staged_downloads(download_result)
shutil.rmtree(extracted_dir)
raise e
raise
os.remove(tarfile_path)
os.remove(specfile_path)
try:
relocate_package(spec)
except Exception as e:
shutil.rmtree(spec.prefix)
shutil.rmtree(spec.prefix, ignore_errors=True)
raise e
else:
manifest_file = os.path.join(
@@ -1910,12 +1898,29 @@ def extract_tarball(spec, download_result, unsigned=False, force=False):
tty.warn("No manifest file in tarball for spec %s" % spec_id)
finally:
if tmpdir:
shutil.rmtree(tmpdir)
shutil.rmtree(tmpdir, ignore_errors=True)
if os.path.exists(filename):
os.remove(filename)
_delete_staged_downloads(download_result)
def _ensure_common_prefix(tar: tarfile.TarFile) -> str:
# Get the shortest length directory.
common_prefix = min((e.name for e in tar.getmembers() if e.isdir()), key=len, default=None)
if common_prefix is None:
raise ValueError("Tarball does not contain a common prefix")
# Validate that each file starts with the prefix
for member in tar.getmembers():
if not member.name.startswith(common_prefix):
raise ValueError(
f"Tarball contains file {member.name} outside of prefix {common_prefix}"
)
return common_prefix
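For reference, a self-contained sketch of the same shortest-directory check against a tarball built in memory (file names are made up):
import io
import tarfile
buf = io.BytesIO()
with tarfile.open(fileobj=buf, mode="w") as tar:
    top = tarfile.TarInfo("pkg-1.0")              # the common prefix directory
    top.type = tarfile.DIRTYPE
    tar.addfile(top)
    data = b"hello"
    readme = tarfile.TarInfo("pkg-1.0/README")
    readme.size = len(data)
    tar.addfile(readme, io.BytesIO(data))
buf.seek(0)
with tarfile.open(fileobj=buf, mode="r") as tar:
    common = min((e.name for e in tar.getmembers() if e.isdir()), key=len, default=None)
    assert common == "pkg-1.0"
    assert all(m.name.startswith(common) for m in tar.getmembers())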
def install_root_node(spec, unsigned=False, force=False, sha256=None):
"""Install the root node of a concrete spec from a buildcache.
@@ -2363,22 +2368,12 @@ def __init__(self, all_architectures):
self.possible_specs = specs
def __call__(self, spec, **kwargs):
def __call__(self, spec: Spec, **kwargs):
"""
Args:
spec (str): The spec being searched for in its string representation or hash.
spec: The spec being searched for
"""
matches = []
if spec.startswith("/"):
# Matching a DAG hash
query_hash = spec.replace("/", "")
for candidate_spec in self.possible_specs:
if candidate_spec.dag_hash().startswith(query_hash):
matches.append(candidate_spec)
else:
# Matching a spec constraint
matches = [s for s in self.possible_specs if s.satisfies(spec)]
return matches
return [s for s in self.possible_specs if s.satisfies(spec)]
class FetchIndexError(Exception):

View File

@@ -124,9 +124,9 @@ def _read_and_sanitize_configuration() -> Dict[str, Any]:
def _bootstrap_config_scopes() -> Sequence["spack.config.ConfigScope"]:
tty.debug("[BOOTSTRAP CONFIG SCOPE] name=_builtin")
config_scopes: MutableSequence["spack.config.ConfigScope"] = [
spack.config.InternalConfigScope("_builtin", spack.config.config_defaults)
spack.config.InternalConfigScope("_builtin", spack.config.CONFIG_DEFAULTS)
]
configuration_paths = (spack.config.configuration_defaults_path, ("bootstrap", _config_path()))
configuration_paths = (spack.config.CONFIGURATION_DEFAULTS_PATH, ("bootstrap", _config_path()))
for name, path in configuration_paths:
platform = spack.platforms.host().name
platform_scope = spack.config.ConfigScope(

View File

@@ -476,15 +476,22 @@ def ensure_executables_in_path_or_raise(
def _add_externals_if_missing() -> None:
search_list = [
# clingo
spack.repo.path.get_pkg_class("cmake"),
spack.repo.path.get_pkg_class("bison"),
spack.repo.PATH.get_pkg_class("cmake"),
spack.repo.PATH.get_pkg_class("bison"),
# GnuPG
spack.repo.path.get_pkg_class("gawk"),
spack.repo.PATH.get_pkg_class("gawk"),
# develop deps
spack.repo.PATH.get_pkg_class("git"),
]
if IS_WINDOWS:
search_list.append(spack.repo.path.get_pkg_class("winbison"))
detected_packages = spack.detection.by_executable(search_list)
spack.detection.update_configuration(detected_packages, scope="bootstrap")
search_list.append(spack.repo.PATH.get_pkg_class("winbison"))
externals = spack.detection.by_executable(search_list)
# System git is typically deprecated, so mark as non-buildable to force it as external
non_buildable_externals = {k: externals.pop(k) for k in ("git",) if k in externals}
spack.detection.update_configuration(externals, scope="bootstrap", buildable=True)
spack.detection.update_configuration(
non_buildable_externals, scope="bootstrap", buildable=False
)
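The comprehension above relies on `dict.pop` to move entries between dicts in one pass; a toy version of the idiom (detections are made up):
externals = {"cmake": ["/usr/bin/cmake"], "git": ["/usr/bin/git"]}
# move selected keys into a second dict, mutating the first in place
non_buildable = {k: externals.pop(k) for k in ("git",) if k in externals}
assert non_buildable == {"git": ["/usr/bin/git"]}
assert "git" not in externals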
def clingo_root_spec() -> str:

View File

@@ -23,6 +23,7 @@
from ._common import _root_spec
from .config import root_path, spec_for_current_python, store_path
from .core import _add_externals_if_missing
class BootstrapEnvironment(spack.environment.Environment):
@@ -185,6 +186,7 @@ def pytest_root_spec() -> str:
def ensure_environment_dependencies() -> None:
"""Ensure Spack dependencies from the bootstrap environment are installed and ready to use"""
_add_externals_if_missing()
with BootstrapEnvironment() as env:
env.update_installations()
env.update_syspath_and_environ()

View File

@@ -1027,7 +1027,7 @@ def get_cmake_prefix_path(pkg):
def _setup_pkg_and_run(
serialized_pkg, function, kwargs, child_pipe, input_multiprocess_fd, jsfd1, jsfd2
serialized_pkg, function, kwargs, write_pipe, input_multiprocess_fd, jsfd1, jsfd2
):
context = kwargs.get("context", "build")
@@ -1048,12 +1048,12 @@ def _setup_pkg_and_run(
pkg, dirty=kwargs.get("dirty", False), context=context
)
return_value = function(pkg, kwargs)
child_pipe.send(return_value)
write_pipe.send(return_value)
except StopPhase as e:
# Do not create a full ChildError from this, it's not an error
# it's a control statement.
child_pipe.send(e)
write_pipe.send(e)
except BaseException:
# catch ANYTHING that goes wrong in the child process
exc_type, exc, tb = sys.exc_info()
@@ -1102,10 +1102,10 @@ def _setup_pkg_and_run(
context,
package_context,
)
child_pipe.send(ce)
write_pipe.send(ce)
finally:
child_pipe.close()
write_pipe.close()
if input_multiprocess_fd is not None:
input_multiprocess_fd.close()
@@ -1149,7 +1149,7 @@ def child_fun():
For more information on `multiprocessing` child process creation
mechanisms, see https://docs.python.org/3/library/multiprocessing.html#contexts-and-start-methods
"""
parent_pipe, child_pipe = multiprocessing.Pipe()
read_pipe, write_pipe = multiprocessing.Pipe(duplex=False)
input_multiprocess_fd = None
jobserver_fd1 = None
jobserver_fd2 = None
@@ -1174,7 +1174,7 @@ def child_fun():
serialized_pkg,
function,
kwargs,
child_pipe,
write_pipe,
input_multiprocess_fd,
jobserver_fd1,
jobserver_fd2,
@@ -1183,6 +1183,12 @@ def child_fun():
p.start()
# We close the writable end of the pipe now to be sure that p is the
# only process which owns a handle for it. This ensures that when p
# closes its handle for the writable end, read_pipe.recv() will
# promptly report the readable end as being ready.
write_pipe.close()
except InstallError as e:
e.pkg = pkg
raise
@@ -1192,7 +1198,16 @@ def child_fun():
if input_multiprocess_fd is not None:
input_multiprocess_fd.close()
child_result = parent_pipe.recv()
def exitcode_msg(p):
typ = "exit" if p.exitcode >= 0 else "signal"
return f"{typ} {abs(p.exitcode)}"
try:
child_result = read_pipe.recv()
except EOFError:
p.join()
raise InstallError(f"The process has stopped unexpectedly ({exitcode_msg(p)})")
p.join()
# If returns a StopPhase, raise it
@@ -1212,6 +1227,10 @@ def child_fun():
child_result.print_context()
raise child_result
# Fallback. Usually caught beforehand in EOFError above.
if p.exitcode != 0:
raise InstallError(f"The process failed unexpectedly ({exitcode_msg(p)})")
return child_result
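The read-only pipe plus the parent closing its copy of the writable end is what turns a crashed child into a prompt EOFError rather than a hang. A standalone sketch of the pattern (the child body is hypothetical):
import multiprocessing
import os
def child(conn):
    os._exit(1)  # simulate a crash before anything is sent through the pipe
if __name__ == "__main__":
    read_pipe, write_pipe = multiprocessing.Pipe(duplex=False)
    p = multiprocessing.Process(target=child, args=(write_pipe,))
    p.start()
    write_pipe.close()  # only the child owns a writable handle now
    try:
        result = read_pipe.recv()
    except EOFError:
        p.join()
        print(f"The process has stopped unexpectedly (exit {p.exitcode})")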
@@ -1256,9 +1275,8 @@ def make_stack(tb, stack=None):
func = getattr(obj, tb.tb_frame.f_code.co_name, "")
if func:
typename, *_ = func.__qualname__.partition(".")
if isinstance(obj, CONTEXT_BASES) and typename not in basenames:
break
if isinstance(obj, CONTEXT_BASES) and typename not in basenames:
break
else:
return None

View File

@@ -55,7 +55,8 @@ def flags_to_build_system_args(self, flags):
setattr(self, "configure_flag_args", [])
for flag, values in flags.items():
if values:
values_str = "{0}={1}".format(flag.upper(), " ".join(values))
var_name = "LIBS" if flag == "ldlibs" else flag.upper()
values_str = "{0}={1}".format(var_name, " ".join(values))
self.configure_flag_args.append(values_str)
# Spack's fflags are meant for both F77 and FC, therefore we
# additionally set FCFLAGS if required.
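Rendered on toy input, the loop above produces NAME=value strings with ldlibs mapped to LIBS (flag values are made up):
flags = {"cflags": ["-O2", "-g"], "ldlibs": ["-lm"]}
configure_flag_args = []
for flag, values in flags.items():
    if values:
        var_name = "LIBS" if flag == "ldlibs" else flag.upper()
        configure_flag_args.append("{0}={1}".format(var_name, " ".join(values)))
assert configure_flag_args == ["CFLAGS=-O2 -g", "LIBS=-lm"]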

View File

@@ -162,17 +162,6 @@ def initconfig_compiler_entries(self):
libs_string = libs_format_string.format(lang)
entries.append(cmake_cache_string(libs_string, libs_flags))
# Set the generator in the cached config
if self.spec.satisfies("generator=make"):
entries.append(cmake_cache_string("CMAKE_GENERATOR", "Unix Makefiles"))
if self.spec.satisfies("generator=ninja"):
entries.append(cmake_cache_string("CMAKE_GENERATOR", "Ninja"))
entries.append(
cmake_cache_string(
"CMAKE_MAKE_PROGRAM", "{0}/ninja".format(spec["ninja"].prefix.bin)
)
)
return entries
def initconfig_mpi_entries(self):

View File

@@ -248,7 +248,8 @@ def std_cmake_args(self):
@staticmethod
def std_args(pkg, generator=None):
"""Computes the standard cmake arguments for a generic package"""
generator = generator or "Unix Makefiles"
default_generator = "Ninja" if sys.platform == "win32" else "Unix Makefiles"
generator = generator or default_generator
valid_primary_generators = ["Unix Makefiles", "Ninja"]
primary_generator = _extract_primary_generator(generator)
if primary_generator not in valid_primary_generators:

View File

@@ -209,5 +209,5 @@ def install(self, pkg, spec, prefix):
def check(self):
"""Search Meson-generated files for the target ``test`` and run it if found."""
with fs.working_dir(self.build_directory):
self._if_ninja_target_execute("test")
self._if_ninja_target_execute("check")
self.pkg._if_ninja_target_execute("test")
self.pkg._if_ninja_target_execute("check")

View File

@@ -30,7 +30,7 @@
class PythonExtension(spack.package_base.PackageBase):
maintainers("adamjstewart", "pradyunsg")
maintainers("adamjstewart")
@property
def import_modules(self):
@@ -201,7 +201,7 @@ def update_external_dependencies(self, extendee_spec=None):
else:
python = self.get_external_python_for_prefix()
if not python.concrete:
repo = spack.repo.path.repo_for_pkg(python)
repo = spack.repo.PATH.repo_for_pkg(python)
python.namespace = repo.namespace
# Ensure architecture information is present
@@ -301,7 +301,7 @@ def get_external_python_for_prefix(self):
return python_externals_configured[0]
python_externals_detection = spack.detection.by_executable(
[spack.repo.path.get_pkg_class("python")], path_hints=[self.spec.external_path]
[spack.repo.PATH.get_pkg_class("python")], path_hints=[self.spec.external_path]
)
python_externals_detected = [

View File

@@ -28,7 +28,7 @@ class QMakePackage(spack.package_base.PackageBase):
build_system("qmake")
depends_on("qt", type="build", when="build_system=qmake")
depends_on("qmake", type="build", when="build_system=qmake")
@spack.builder.builder("qmake")

View File

@@ -140,8 +140,6 @@ class ROCmPackage(PackageBase):
depends_on("hsa-rocr-dev", when="+rocm")
depends_on("hip +rocm", when="+rocm")
conflicts("^blt@:0.3.6", when="+rocm")
# need amd gpu type for rocm builds
conflicts("amdgpu_target=none", when="+rocm")

View File

@@ -7,13 +7,14 @@
import re
import llnl.util.tty as tty
from llnl.util.filesystem import find, join_path, working_dir
from llnl.util.filesystem import find, working_dir
import spack.builder
import spack.install_test
import spack.package_base
from spack.directives import build_system, depends_on, extends
from spack.multimethod import when
from spack.util.executable import Executable
from ._checks import BaseBuilder, execute_install_time_tests
@@ -39,9 +40,8 @@ class SIPPackage(spack.package_base.PackageBase):
build_system("sip")
with when("build_system=sip"):
extends("python")
depends_on("qt")
depends_on("py-sip")
extends("python", type=("build", "link", "run"))
depends_on("py-sip", type="build")
@property
def import_modules(self):
@@ -113,13 +113,13 @@ class SIPBuilder(BaseBuilder):
* install
The configure phase already adds a set of default flags. To see more
options, run ``python configure.py --help``.
options, run ``sip-build --help``.
"""
phases = ("configure", "build", "install")
#: Names associated with package methods in the old build-system format
legacy_methods = ("configure_file", "configure_args", "build_args", "install_args")
legacy_methods = ("configure_args", "build_args", "install_args")
#: Names associated with package attributes in the old build-system format
legacy_attributes = (
@@ -130,34 +130,17 @@ class SIPBuilder(BaseBuilder):
"build_directory",
)
def configure_file(self):
"""Returns the name of the configure file to use."""
return "configure.py"
build_directory = "build"
def configure(self, pkg, spec, prefix):
"""Configure the package."""
configure = self.configure_file()
args = self.configure_args()
# https://www.riverbankcomputing.com/static/Docs/sip/command_line_tools.html
args = ["--verbose", "--target-dir", inspect.getmodule(self.pkg).python_platlib]
args.extend(self.configure_args())
args.extend(
[
"--verbose",
"--confirm-license",
"--qmake",
spec["qt"].prefix.bin.qmake,
"--sip",
spec["py-sip"].prefix.bin.sip,
"--sip-incdir",
join_path(spec["py-sip"].prefix, spec["python"].package.include),
"--bindir",
prefix.bin,
"--destdir",
inspect.getmodule(self.pkg).python_platlib,
]
)
self.pkg.python(configure, *args)
sip_build = Executable(spec["py-sip"].prefix.bin.join("sip-build"))
sip_build(*args)
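In short, the builder now drives `sip-build` through Spack's `Executable` wrapper instead of running `python configure.py`; a hedged sketch of that call shape, with hypothetical paths standing in for spec["py-sip"].prefix and python_platlib:
from spack.util.executable import Executable
sip_build = Executable("/opt/py-sip/bin/sip-build")   # hypothetical path
sip_build("--verbose", "--target-dir", "/opt/view/lib/python3.10/site-packages")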
def configure_args(self):
"""Arguments to pass to configure."""
@@ -167,7 +150,8 @@ def build(self, pkg, spec, prefix):
"""Build the package."""
args = self.build_args()
inspect.getmodule(self.pkg).make(*args)
with working_dir(self.build_directory):
inspect.getmodule(self.pkg).make(*args)
def build_args(self):
"""Arguments to pass to build."""
@@ -177,21 +161,11 @@ def install(self, pkg, spec, prefix):
"""Install the package."""
args = self.install_args()
inspect.getmodule(self.pkg).make("install", parallel=False, *args)
with working_dir(self.build_directory):
inspect.getmodule(self.pkg).make("install", *args)
def install_args(self):
"""Arguments to pass to install."""
return []
spack.builder.run_after("install")(execute_install_time_tests)
@spack.builder.run_after("install")
def extend_path_setup(self):
# See github issue #14121 and PR #15297
module = self.pkg.spec["py-sip"].variants["module"].value
if module != "sip":
module = module.split(".")[0]
with working_dir(inspect.getmodule(self.pkg).python_platlib):
with open(os.path.join(module, "__init__.py"), "a") as f:
f.write("from pkgutil import extend_path\n")
f.write("__path__ = extend_path(__path__, __name__)\n")

View File

@@ -20,9 +20,9 @@
def misc_cache_location():
"""The ``misc_cache`` is Spack's cache for small data.
"""The ``MISC_CACHE`` is Spack's cache for small data.
Currently the ``misc_cache`` stores indexes for virtual dependency
Currently the ``MISC_CACHE`` stores indexes for virtual dependency
providers and for which packages provide which tags.
"""
path = spack.config.get("config:misc_cache", spack.paths.default_misc_cache_path)
@@ -35,7 +35,7 @@ def _misc_cache():
#: Spack's cache for small data
misc_cache: Union[
MISC_CACHE: Union[
spack.util.file_cache.FileCache, llnl.util.lang.Singleton
] = llnl.util.lang.Singleton(_misc_cache)
@@ -91,6 +91,6 @@ def symlink(self, mirror_ref):
#: Spack's local cache for downloaded source archives
fetch_cache: Union[
FETCH_CACHE: Union[
spack.fetch_strategy.FsCache, llnl.util.lang.Singleton
] = llnl.util.lang.Singleton(_fetch_cache)

View File

@@ -535,7 +535,7 @@ def __job_name(name, suffix=""):
"""Compute the name of a named job with appropriate suffix.
Valid suffixes are '-remove', the empty string, or None
"""
assert type(name) == str
assert isinstance(name, str)
jname = name
if suffix:
@@ -885,7 +885,7 @@ def generate_gitlab_ci_yaml(
cli_scopes = [
os.path.relpath(s.path, concrete_env_dir)
for s in cfg.scopes().values()
if type(s) == cfg.ImmutableConfigScope
if isinstance(s, cfg.ImmutableConfigScope)
and s.path not in env_includes
and os.path.exists(s.path)
]
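Both hunks replace exact-type comparison with isinstance, which also accepts subclasses; a two-class illustration (toy names):
class Scope:
    pass
class ImmutableScope(Scope):
    pass
s = ImmutableScope()
assert type(s) != Scope          # exact-type check rejects the subclass
assert isinstance(s, Scope)      # isinstance accepts it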
@@ -1504,7 +1504,7 @@ def copy_stage_logs_to_artifacts(job_spec: spack.spec.Spec, job_log_dir: str) ->
return
try:
pkg_cls = spack.repo.path.get_pkg_class(job_spec.name)
pkg_cls = spack.repo.PATH.get_pkg_class(job_spec.name)
job_pkg = pkg_cls(job_spec)
tty.debug("job package: {0}".format(job_pkg))
except AssertionError:
@@ -1690,7 +1690,7 @@ def setup_spack_repro_version(repro_dir, checkout_commit, merge_commit=None):
return True
def reproduce_ci_job(url, work_dir):
def reproduce_ci_job(url, work_dir, autostart, gpg_url, runtime):
"""Given a url to gitlab artifacts.zip from a failed 'spack ci rebuild' job,
attempt to set up an environment in which the failure can be reproduced
locally. This entails the following:
@@ -1706,6 +1706,11 @@ def reproduce_ci_job(url, work_dir):
work_dir = os.path.realpath(work_dir)
download_and_extract_artifacts(url, work_dir)
gpg_path = None
if gpg_url:
gpg_path = web_util.fetch_url_text(gpg_url, dest_dir=os.path.join(work_dir, "_pgp"))
rel_gpg_path = gpg_path.replace(work_dir, "").lstrip(os.path.sep)
lock_file = fs.find(work_dir, "spack.lock")[0]
repro_lock_dir = os.path.dirname(lock_file)
@@ -1798,60 +1803,63 @@ def reproduce_ci_job(url, work_dir):
# more faithful reproducer if everything appears to run in the same
# absolute path used during the CI build.
mount_as_dir = "/work"
mounted_workdir = "/reproducer"
if repro_details:
mount_as_dir = repro_details["ci_project_dir"]
mounted_repro_dir = os.path.join(mount_as_dir, rel_repro_dir)
mounted_env_dir = os.path.join(mount_as_dir, relative_concrete_env_dir)
if gpg_path:
mounted_gpg_path = os.path.join(mounted_workdir, rel_gpg_path)
# We will also try to clone spack from your local checkout and
# reproduce the state present during the CI build, and put that into
# the bind-mounted reproducer directory.
# We will also try to clone spack from your local checkout and
# reproduce the state present during the CI build, and put that into
# the bind-mounted reproducer directory.
# Regular expressions for parsing that HEAD commit. If the pipeline
# was on the gitlab spack mirror, it will have been a merge commit made by
# github and pushed by the sync script. If the pipeline was run on some
# environment repo, then the tested spack commit will likely have been
# a regular commit.
commit_1 = None
commit_2 = None
commit_regex = re.compile(r"commit\s+([^\s]+)")
merge_commit_regex = re.compile(r"Merge\s+([^\s]+)\s+into\s+([^\s]+)")
# Regular expressions for parsing that HEAD commit. If the pipeline
# was on the gitlab spack mirror, it will have been a merge commit made by
# github and pushed by the sync script. If the pipeline was run on some
# environment repo, then the tested spack commit will likely have been
# a regular commit.
commit_1 = None
commit_2 = None
commit_regex = re.compile(r"commit\s+([^\s]+)")
merge_commit_regex = re.compile(r"Merge\s+([^\s]+)\s+into\s+([^\s]+)")
# Try the more specific merge commit regex first
m = merge_commit_regex.search(spack_info)
# Try the more specific merge commit regex first
m = merge_commit_regex.search(spack_info)
if m:
# This was a merge commit and we captured the parents
commit_1 = m.group(1)
commit_2 = m.group(2)
else:
# Not a merge commit, just get the commit sha
m = commit_regex.search(spack_info)
if m:
# This was a merge commit and we captured the parents
commit_1 = m.group(1)
commit_2 = m.group(2)
setup_result = False
if commit_1:
if commit_2:
setup_result = setup_spack_repro_version(work_dir, commit_2, merge_commit=commit_1)
else:
# Not a merge commit, just get the commit sha
m = commit_regex.search(spack_info)
if m:
commit_1 = m.group(1)
setup_result = setup_spack_repro_version(work_dir, commit_1)
setup_result = False
if commit_1:
if commit_2:
setup_result = setup_spack_repro_version(work_dir, commit_2, merge_commit=commit_1)
else:
setup_result = setup_spack_repro_version(work_dir, commit_1)
if not setup_result:
setup_msg = """
This can happen if the spack you are using to run this command is not a git
repo, or if it is a git repo, but it does not have the commits needed to
recreate the tested merge commit. If you are trying to reproduce a spack
PR pipeline job failure, try fetching the latest develop commits from
mainline spack and make sure you have the most recent commit of the PR
branch in your local spack repo. Then run this command again.
Alternatively, you can also manually clone spack if you know the version
you want to test.
"""
tty.error(
"Failed to automatically setup the tested version of spack "
"in your local reproduction directory."
)
print(setup_msg)
if not setup_result:
setup_msg = """
This can happen if the spack you are using to run this command is not a git
repo, or if it is a git repo, but it does not have the commits needed to
recreate the tested merge commit. If you are trying to reproduce a spack
PR pipeline job failure, try fetching the latest develop commits from
mainline spack and make sure you have the most recent commit of the PR
branch in your local spack repo. Then run this command again.
Alternatively, you can also manually clone spack if you know the version
you want to test.
"""
tty.error(
"Failed to automatically setup the tested version of spack "
"in your local reproduction directory."
)
print(setup_msg)
# In cases where CI build was run on a shell runner, it might be useful
# to see what tags were applied to the job so the user knows what shell
@@ -1862,45 +1870,92 @@ def reproduce_ci_job(url, work_dir):
job_tags = job_yaml["tags"]
tty.msg("Job ran with the following tags: {0}".format(job_tags))
inst_list = []
entrypoint_script = [
["git", "config", "--global", "--add", "safe.directory", mount_as_dir],
[".", os.path.join(mount_as_dir if job_image else work_dir, "share/spack/setup-env.sh")],
["spack", "gpg", "trust", mounted_gpg_path if job_image else gpg_path] if gpg_path else [],
["spack", "env", "activate", mounted_env_dir if job_image else repro_dir],
[os.path.join(mounted_repro_dir, "install.sh") if job_image else install_script],
]
inst_list = []
# Finally, print out some instructions to reproduce the build
if job_image:
inst_list.append("\nRun the following command:\n\n")
inst_list.append(
" $ docker run --rm --name spack_reproducer -v {0}:{1}:Z -ti {2}\n".format(
work_dir, mount_as_dir, job_image
)
# Allow interactive
entrypoint_script.extend(
[
[
"echo",
"Re-run install script using:\n\t{0}".format(
os.path.join(mounted_repro_dir, "install.sh")
if job_image
else install_script
),
],
# Allow interactive
["exec", "$@"],
]
)
inst_list.append("\nOnce inside the container:\n\n")
process_command(
"entrypoint", entrypoint_script, work_dir, run=False, exit_on_failure=False
)
docker_command = [
[
runtime,
"run",
"-i",
"-t",
"--rm",
"--name",
"spack_reproducer",
"-v",
":".join([work_dir, mounted_workdir, "Z"]),
"-v",
":".join(
[
os.path.join(work_dir, "jobs_scratch_dir"),
os.path.join(mount_as_dir, "jobs_scratch_dir"),
"Z",
]
),
"-v",
":".join([os.path.join(work_dir, "spack"), mount_as_dir, "Z"]),
"--entrypoint",
os.path.join(mounted_workdir, "entrypoint.sh"),
job_image,
"bash",
]
]
autostart = autostart and setup_result
process_command("start", docker_command, work_dir, run=autostart)
if not autostart:
inst_list.append("\nTo run the docker reproducer:\n\n")
inst_list.extend(
[
" - Start the docker container install",
" $ {0}/start.sh".format(work_dir),
]
)
else:
process_command("reproducer", entrypoint_script, work_dir, run=False)
inst_list.append("\nOnce on the tagged runner:\n\n")
inst_list.extend(
[" - Run the reproducer script", " $ {0}/reproducer.sh".format(work_dir)]
)
if not setup_result:
inst_list.append(" - Clone spack and acquire tested commit\n")
inst_list.append("{0}".format(spack_info))
spack_root = "<spack-clone-path>"
else:
spack_root = "{0}/spack".format(mount_as_dir)
inst_list.append("\n - Clone spack and acquire tested commit")
inst_list.append("\n {0}\n".format(spack_info))
inst_list.append("\n")
inst_list.append("\n Path to clone spack: {0}/spack\n\n".format(work_dir))
inst_list.append(" - Activate the environment\n\n")
inst_list.append(" $ source {0}/share/spack/setup-env.sh\n".format(spack_root))
inst_list.append(
" $ spack env activate --without-view {0}\n\n".format(
mounted_env_dir if job_image else repro_dir
)
)
inst_list.append(" - Run the install script\n\n")
inst_list.append(
" $ {0}\n".format(
os.path.join(mounted_repro_dir, "install.sh") if job_image else install_script
)
)
print("".join(inst_list))
tty.msg("".join(inst_list))
def process_command(name, commands, repro_dir):
def process_command(name, commands, repro_dir, run=True, exit_on_failure=True):
"""
Create a script for and run the command. Copy the script to the
reproducibility directory.
@@ -1910,6 +1965,7 @@ def process_command(name, commands, repro_dir):
commands (list): list of arguments for single command or list of lists of
arguments for multiple commands. No shell escape is performed.
repro_dir (str): Job reproducibility directory
run (bool): Run the script and return the exit code if True
Returns: the exit code from processing the command
"""
@@ -1928,7 +1984,8 @@ def process_command(name, commands, repro_dir):
with open(script, "w") as fd:
fd.write("#!/bin/sh\n\n")
fd.write("\n# spack {0} command\n".format(name))
fd.write("set -e\n")
if exit_on_failure:
fd.write("set -e\n")
if os.environ.get("SPACK_VERBOSE_SCRIPT"):
fd.write("set -x\n")
fd.write(full_command)
@@ -1939,19 +1996,27 @@ def process_command(name, commands, repro_dir):
copy_path = os.path.join(repro_dir, script)
shutil.copyfile(script, copy_path)
st = os.stat(copy_path)
os.chmod(copy_path, st.st_mode | stat.S_IEXEC)
# Run the generated install.sh shell script as if it were being run in
# a login shell.
try:
cmd_process = subprocess.Popen(["/bin/sh", "./{0}".format(script)])
cmd_process.wait()
exit_code = cmd_process.returncode
except (ValueError, subprocess.CalledProcessError, OSError) as err:
tty.error("Encountered error running {0} script".format(name))
tty.error(err)
exit_code = 1
exit_code = None
if run:
try:
cmd_process = subprocess.Popen(["/bin/sh", "./{0}".format(script)])
cmd_process.wait()
exit_code = cmd_process.returncode
except (ValueError, subprocess.CalledProcessError, OSError) as err:
tty.error("Encountered error running {0} script".format(name))
tty.error(err)
exit_code = 1
tty.debug("spack {0} exited {1}".format(name, exit_code))
else:
# Delete the script, it is copied to the destination dir
os.remove(script)
tty.debug("spack {0} exited {1}".format(name, exit_code))
return exit_code

View File

@@ -291,7 +291,7 @@ def ensure_single_spec_or_die(spec, matching_specs):
if len(matching_specs) <= 1:
return
format_string = "{name}{@version}{%compiler}{arch=architecture}"
format_string = "{name}{@version}{%compiler.name}{@compiler.version}{arch=architecture}"
args = ["%s matches multiple packages." % spec, "Matching packages:"]
args += [
colorize(" @K{%s} " % s.dag_hash(7)) + s.cformat(format_string) for s in matching_specs
@@ -342,9 +342,9 @@ def iter_groups(specs, indent, all_headers):
print()
header = "%s{%s} / %s{%s}" % (
spack.spec.architecture_color,
spack.spec.ARCHITECTURE_COLOR,
architecture if architecture else "no arch",
spack.spec.compiler_color,
spack.spec.COMPILER_COLOR,
f"{compiler.display_str}" if compiler else "no compiler",
)
@@ -383,7 +383,7 @@ def display_specs(specs, args=None, **kwargs):
deps (bool): Display dependencies with specs
long (bool): Display short hashes with specs
very_long (bool): Display full hashes with specs (supersedes ``long``)
namespace (bool): Print namespaces along with names
namespaces (bool): Print namespaces along with names
show_flags (bool): Show compiler flags with specs
variants (bool): Show variants with specs
indent (int): indent each line this much
@@ -407,7 +407,7 @@ def get_arg(name, default=None):
paths = get_arg("paths", False)
deps = get_arg("deps", False)
hashes = get_arg("long", False)
namespace = get_arg("namespace", False)
namespaces = get_arg("namespaces", False)
flags = get_arg("show_flags", False)
full_compiler = get_arg("show_full_compiler", False)
variants = get_arg("variants", False)
@@ -428,7 +428,7 @@ def get_arg(name, default=None):
format_string = get_arg("format", None)
if format_string is None:
nfmt = "{fullname}" if namespace else "{name}"
nfmt = "{fullname}" if namespaces else "{name}"
ffmt = ""
if full_compiler or flags:
ffmt += "{%compiler.name}"
@@ -584,14 +584,14 @@ def require_active_env(cmd_name):
if env:
return env
else:
tty.die(
"`spack %s` requires an environment" % cmd_name,
"activate an environment first:",
" spack env activate ENV",
"or use:",
" spack -e ENV %s ..." % cmd_name,
)
tty.die(
"`spack %s` requires an environment" % cmd_name,
"activate an environment first:",
" spack env activate ENV",
"or use:",
" spack -e ENV %s ..." % cmd_name,
)
def find_environment(args):

View File

@@ -47,7 +47,7 @@ def configs(parser, args):
def packages(parser, args):
pkgs = args.name or spack.repo.path.all_package_names()
pkgs = args.name or spack.repo.PATH.all_package_names()
reports = spack.audit.run_group(args.subcommand, pkgs=pkgs)
_process_reports(reports)
@@ -57,7 +57,7 @@ def packages_https(parser, args):
if not args.check_all and not args.name:
tty.die("Please specify one or more packages to audit, or --all.")
pkgs = args.name or spack.repo.path.all_package_names()
pkgs = args.name or spack.repo.PATH.all_package_names()
reports = spack.audit.run_group(args.subcommand, pkgs=pkgs)
_process_reports(reports)

View File

@@ -126,7 +126,7 @@ def blame(parser, args):
blame_file = path
if not blame_file:
pkg_cls = spack.repo.path.get_pkg_class(args.package_or_file)
pkg_cls = spack.repo.PATH.get_pkg_class(args.package_or_file)
blame_file = pkg_cls.module.__file__.rstrip("c") # .pyc -> .py
# get git blame for the package

View File

@@ -4,6 +4,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os.path
import shutil
import sys
import tempfile
import llnl.util.filesystem
@@ -68,11 +69,10 @@
def _add_scope_option(parser):
scopes = spack.config.scopes()
scopes_metavar = spack.config.scopes_metavar
parser.add_argument(
"--scope",
choices=scopes,
metavar=scopes_metavar,
metavar=spack.config.SCOPES_METAVAR,
help="configuration scope to read/modify",
)
@@ -169,7 +169,7 @@ def _reset(args):
if not ok_to_continue:
raise RuntimeError("Aborting")
for scope in spack.config.config.file_scopes:
for scope in spack.config.CONFIG.file_scopes:
# The default scope should stay untouched
if scope.name == "defaults":
continue
@@ -186,7 +186,7 @@ def _reset(args):
if os.path.exists(bootstrap_yaml):
shutil.move(bootstrap_yaml, backup_file)
spack.config.config.clear_caches()
spack.config.CONFIG.clear_caches()
def _root(args):
@@ -326,6 +326,7 @@ def _status(args):
if missing:
print(llnl.util.tty.color.colorize(legend))
print()
sys.exit(1)
def _add(args):

View File

@@ -2,12 +2,14 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import argparse
import glob
import json
import os
import shutil
import sys
import tempfile
from typing import List
import llnl.util.tty as tty
import llnl.util.tty.color as clr
@@ -18,7 +20,7 @@
import spack.cmd.common.arguments as arguments
import spack.config
import spack.environment as ev
import spack.hash_types as ht
import spack.error
import spack.mirror
import spack.relocate
import spack.repo
@@ -28,7 +30,6 @@
import spack.util.url as url_util
import spack.util.web as web_util
from spack.cmd import display_specs
from spack.error import SpecError
from spack.spec import Spec, save_dependency_specfiles
from spack.stage import Stage
from spack.util.string import plural
@@ -38,8 +39,8 @@
level = "long"
def setup_parser(subparser):
setup_parser.parser = subparser
def setup_parser(subparser: argparse.ArgumentParser):
setattr(setup_parser, "parser", subparser)
subparsers = subparser.add_subparsers(help="buildcache sub-commands")
push = subparsers.add_parser("push", aliases=["create"], help=push_fn.__doc__)
@@ -78,6 +79,11 @@ def setup_parser(subparser):
"Alternatively, one can decide to build a cache for only the package or only the "
"dependencies",
)
push.add_argument(
"--fail-fast",
action="store_true",
help="stop pushing on first failure (default is best effort)",
)
arguments.add_common_arguments(push, ["specs"])
push.set_defaults(func=push_fn)
@@ -105,7 +111,7 @@ def setup_parser(subparser):
install.set_defaults(func=install_fn)
listcache = subparsers.add_parser("list", help=list_fn.__doc__)
arguments.add_common_arguments(listcache, ["long", "very_long"])
arguments.add_common_arguments(listcache, ["long", "very_long", "namespaces"])
listcache.add_argument(
"-v",
"--variants",
@@ -149,23 +155,20 @@ def setup_parser(subparser):
# used to construct scope arguments below
scopes = spack.config.scopes()
scopes_metavar = spack.config.scopes_metavar
check.add_argument(
"--scope",
choices=scopes,
metavar=scopes_metavar,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_modify_scope(),
help="configuration scope containing mirrors to check",
)
check.add_argument(
"-s", "--spec", default=None, help="check single spec instead of release specs file"
check_spec_or_specfile = check.add_mutually_exclusive_group(required=True)
check_spec_or_specfile.add_argument(
"-s", "--spec", help="check single spec instead of release specs file"
)
check.add_argument(
check_spec_or_specfile.add_argument(
"--spec-file",
default=None,
help="check single spec from json or yaml file instead of release specs file",
)
@@ -173,16 +176,19 @@ def setup_parser(subparser):
# Download tarball and specfile
download = subparsers.add_parser("download", help=download_fn.__doc__)
download.add_argument(
"-s", "--spec", default=None, help="download built tarball for spec from mirror"
download_spec_or_specfile = download.add_mutually_exclusive_group(required=True)
download_spec_or_specfile.add_argument(
"-s", "--spec", help="download built tarball for spec from mirror"
)
download_spec_or_specfile.add_argument(
"--spec-file", help="download built tarball for spec (from json or yaml file) from mirror"
)
download.add_argument(
"--spec-file",
"-p",
"--path",
required=True,
default=None,
help="download built tarball for spec (from json or yaml file) from mirror",
)
download.add_argument(
"-p", "--path", default=None, help="path to directory where tarball should be downloaded"
help="path to directory where tarball should be downloaded",
)
download.set_defaults(func=download_fn)
@@ -190,32 +196,32 @@ def setup_parser(subparser):
getbuildcachename = subparsers.add_parser(
"get-buildcache-name", help=get_buildcache_name_fn.__doc__
)
getbuildcachename.add_argument(
"-s", "--spec", default=None, help="spec string for which buildcache name is desired"
getbuildcachename_spec_or_specfile = getbuildcachename.add_mutually_exclusive_group(
required=True
)
getbuildcachename.add_argument(
"--spec-file",
default=None,
help="path to spec json or yaml file for which buildcache name is desired",
getbuildcachename_spec_or_specfile.add_argument(
"-s", "--spec", help="spec string for which buildcache name is desired"
)
getbuildcachename_spec_or_specfile.add_argument(
"--spec-file", help="path to spec json or yaml file for which buildcache name is desired"
)
getbuildcachename.set_defaults(func=get_buildcache_name_fn)
# Given the root spec, save the yaml of the dependent spec to a file
savespecfile = subparsers.add_parser("save-specfile", help=save_specfile_fn.__doc__)
savespecfile.add_argument("--root-spec", default=None, help="root spec of dependent spec")
savespecfile.add_argument(
"--root-specfile",
default=None,
help="path to json or yaml file containing root spec of dependent spec",
savespecfile_spec_or_specfile = savespecfile.add_mutually_exclusive_group(required=True)
savespecfile_spec_or_specfile.add_argument("--root-spec", help="root spec of dependent spec")
savespecfile_spec_or_specfile.add_argument(
"--root-specfile", help="path to json or yaml file containing root spec of dependent spec"
)
savespecfile.add_argument(
"-s",
"--specs",
default=None,
required=True,
help="list of dependent specs for which saved yaml is desired",
)
savespecfile.add_argument(
"--specfile-dir", default=None, help="path to directory where spec yamls should be saved"
"--specfile-dir", required=True, help="path to directory where spec yamls should be saved"
)
savespecfile.set_defaults(func=save_specfile_fn)
@@ -257,54 +263,24 @@ def setup_parser(subparser):
update_index.set_defaults(func=update_index_fn)
def _matching_specs(specs, spec_file):
"""Return a list of matching specs read from either a spec file (JSON or YAML),
a query over the store or a query over the active environment.
"""
env = ev.active_environment()
hashes = env.all_hashes() if env else None
if spec_file:
return spack.store.specfile_matches(spec_file, hashes=hashes)
if specs:
constraints = spack.cmd.parse_specs(specs)
return spack.store.find(constraints, hashes=hashes)
if env:
return [concrete for _, concrete in env.concretized_specs()]
tty.die(
"build cache file creation requires at least one"
" installed package spec, an active environment,"
" or else a path to a json or yaml file containing a spec"
" to install"
)
def _concrete_spec_from_args(args):
spec_str, specfile_path = args.spec, args.spec_file
if not spec_str and not specfile_path:
tty.error("must provide either spec string or path to YAML or JSON specfile")
sys.exit(1)
if spec_str:
try:
constraints = spack.cmd.parse_specs(spec_str)
spec = spack.store.find(constraints)[0]
spec.concretize()
except SpecError as spec_error:
tty.error("Unable to concretize spec {0}".format(spec_str))
tty.debug(spec_error)
sys.exit(1)
return spec
return Spec.from_specfile(specfile_path)
def _matching_specs(specs: List[Spec]) -> List[Spec]:
"""Disambiguate specs and return a list of matching specs"""
return [spack.cmd.disambiguate_spec(s, ev.active_environment(), installed=any) for s in specs]
def push_fn(args):
"""create a binary package and push it to a mirror"""
if args.spec_file:
tty.warn(
"The flag `--spec-file` is deprecated and will be removed in Spack 0.22. "
"Use positional arguments instead."
)
if args.specs or args.spec_file:
specs = _matching_specs(spack.cmd.parse_specs(args.specs or args.spec_file))
else:
specs = spack.cmd.require_active_env("buildcache push").all_specs()
mirror = arguments.mirror_name_or_url(args.mirror)
if args.allow_root:
@@ -315,7 +291,7 @@ def push_fn(args):
url = mirror.push_url
specs = bindist.specs_to_be_packaged(
_matching_specs(args.specs, args.spec_file),
specs,
root="package" in args.things_to_install,
dependencies="dependencies" in args.things_to_install,
)
@@ -326,6 +302,7 @@ def push_fn(args):
tty.info(f"Selected {len(specs)} specs to push to {url}")
skipped = []
failed = []
# tty printing
color = clr.get_color_when()
@@ -356,11 +333,17 @@ def push_fn(args):
except bindist.NoOverwriteException:
skipped.append(format_spec(spec))
# Catch any other exception unless the fail fast option is set
except Exception as e:
if args.fail_fast or isinstance(e, (bindist.PickKeyException, bindist.NoKeyException)):
raise
failed.append((format_spec(spec), e))
if skipped:
if len(specs) == 1:
tty.info("The spec is already in the buildcache. Use --force to overwrite it.")
elif len(skipped) == len(specs):
tty.info("All specs are already in the buildcache. Use --force to overwite them.")
tty.info("All specs are already in the buildcache. Use --force to overwrite them.")
else:
tty.info(
"The following {} specs were skipped as they already exist in the buildcache:\n"
@@ -370,6 +353,17 @@ def push_fn(args):
)
)
if failed:
if len(failed) == 1:
raise failed[0][1]
raise spack.error.SpackError(
f"The following {len(failed)} errors occurred while pushing specs to the buildcache",
"\n".join(
elide_list([f" {spec}: {e.__class__.__name__}: {e}" for spec, e in failed], 5)
),
)
def install_fn(args):
"""install from a binary package"""
@@ -423,16 +417,21 @@ def preview_fn(args):
def check_fn(args):
"""check specs against remote binary mirror(s) to see if any need to be rebuilt
either a single spec from --spec, or else the full set of release specs. this command uses the
process exit code to indicate its result, specifically, if the exit code is non-zero, then at
least one of the indicated specs needs to be rebuilt
this command uses the process exit code to indicate its result, specifically, if the
exit code is non-zero, then at least one of the indicated specs needs to be rebuilt
"""
if args.spec or args.spec_file:
specs = [_concrete_spec_from_args(args)]
if args.spec_file:
tty.warn(
"The flag `--spec-file` is deprecated and will be removed in Spack 0.22. "
"Use --spec instead."
)
specs = spack.cmd.parse_specs(args.spec or args.spec_file)
if specs:
specs = _matching_specs(specs, specs)
else:
env = spack.cmd.require_active_env(cmd_name="buildcache")
env.concretize()
specs = env.all_specs()
specs = spack.cmd.require_active_env("buildcache check").all_specs()
if not specs:
tty.msg("No specs provided, exiting.")
@@ -462,26 +461,28 @@ def download_fn(args):
code indicates that the command failed to download at least one of the required buildcache
components
"""
if not args.spec and not args.spec_file:
tty.msg("No specs provided, exiting.")
return
if args.spec_file:
tty.warn(
"The flag `--spec-file` is deprecated and will be removed in Spack 0.22. "
"Use --spec instead."
)
if not args.path:
tty.msg("No download path provided, exiting")
return
specs = _matching_specs(spack.cmd.parse_specs(args.spec or args.spec_file))
spec = _concrete_spec_from_args(args)
result = bindist.download_single_spec(spec, args.path)
if len(specs) != 1:
tty.die("a single spec argument is required to download from a buildcache")
if not result:
if not bindist.download_single_spec(specs[0], args.path):
sys.exit(1)
def get_buildcache_name_fn(args):
"""get name (prefix) of buildcache entries for this spec"""
spec = _concrete_spec_from_args(args)
buildcache_name = bindist.tarball_name(spec, "")
print("{0}".format(buildcache_name))
tty.warn("This command is deprecated and will be removed in Spack 0.22.")
specs = _matching_specs(spack.cmd.parse_specs(args.spec or args.spec_file))
if len(specs) != 1:
tty.die("a single spec argument is required to get buildcache name")
print(bindist.tarball_name(specs[0], ""))
def save_specfile_fn(args):
@@ -491,29 +492,24 @@ def save_specfile_fn(args):
successful. if any errors or exceptions are encountered, or if expected command-line arguments
are not provided, then the exit code will be non-zero
"""
if not args.root_spec and not args.root_specfile:
tty.msg("No root spec provided, exiting.")
sys.exit(1)
if not args.specs:
tty.msg("No dependent specs provided, exiting.")
sys.exit(1)
if not args.specfile_dir:
tty.msg("No yaml directory provided, exiting.")
sys.exit(1)
if args.root_specfile:
with open(args.root_specfile) as fd:
root_spec_as_json = fd.read()
spec_format = "yaml" if args.root_specfile.endswith("yaml") else "json"
else:
root_spec = Spec(args.root_spec)
root_spec.concretize()
root_spec_as_json = root_spec.to_json(hash=ht.dag_hash)
spec_format = "json"
tty.warn(
"The flag `--root-specfile` is deprecated and will be removed in Spack 0.22. "
"Use --root-spec instead."
)
specs = spack.cmd.parse_specs(args.root_spec or args.root_specfile)
if len(specs) != 1:
tty.die("a single spec argument is required to save specfile")
root = specs[0]
if not root.concrete:
root.concretize()
save_dependency_specfiles(
root_spec_as_json, args.specfile_dir, args.specs.split(), spec_format
root, args.specfile_dir, dependencies=spack.cmd.parse_specs(args.specs)
)

View File

@@ -4,18 +4,21 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import argparse
import re
import sys
import llnl.util.tty as tty
import llnl.util.lang
from llnl.util import tty
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.repo
import spack.spec
import spack.stage
import spack.util.crypto
from spack.package_base import deprecated_version, preferred_version
from spack.cmd.common import arguments
from spack.package_base import PackageBase, deprecated_version, preferred_version
from spack.util.editor import editor
from spack.util.format import get_version_lines
from spack.util.naming import valid_fully_qualified_module_name
from spack.version import Version
@@ -31,35 +34,38 @@ def setup_parser(subparser):
default=False,
help="don't clean up staging area when command completes",
)
sp = subparser.add_mutually_exclusive_group()
sp.add_argument(
subparser.add_argument(
"-b",
"--batch",
action="store_true",
default=False,
help="don't ask which versions to checksum",
)
sp.add_argument(
subparser.add_argument(
"-l",
"--latest",
action="store_true",
default=False,
help="checksum the latest available version only",
help="checksum the latest available version",
)
sp.add_argument(
subparser.add_argument(
"-p",
"--preferred",
action="store_true",
default=False,
help="checksum the preferred version only",
help="checksum the known Spack preferred version",
)
subparser.add_argument(
modes_parser = subparser.add_mutually_exclusive_group()
modes_parser.add_argument(
"-a",
"--add-to-package",
action="store_true",
default=False,
help="add new versions to package",
)
modes_parser.add_argument(
"--verify", action="store_true", default=False, help="verify known package checksums"
)
arguments.add_common_arguments(subparser, ["package"])
subparser.add_argument(
"versions", nargs=argparse.REMAINDER, help="versions to generate checksums for"
@@ -77,89 +83,174 @@ def checksum(parser, args):
tty.die("`spack checksum` accepts package names, not URLs.")
# Get the package we're going to generate checksums for
pkg_cls = spack.repo.path.get_pkg_class(args.package)
pkg_cls = spack.repo.PATH.get_pkg_class(args.package)
pkg = pkg_cls(spack.spec.Spec(args.package))
# Build a list of versions to checksum
versions = [Version(v) for v in args.versions]
# Define placeholder for remote versions.
# This'll help reduce redundant work if we need to check for the existence
# of remote versions more than once.
remote_versions = None
# Add latest version if requested
if args.latest:
remote_versions = pkg.fetch_remote_versions()
if len(remote_versions) > 0:
latest_version = sorted(remote_versions.keys(), reverse=True)[0]
versions.append(latest_version)
# Add preferred version if requested
if args.preferred:
versions.append(preferred_version(pkg))
# Store a dict of the form version -> URL
url_dict = {}
if not args.versions and args.preferred:
versions = [preferred_version(pkg)]
else:
versions = [Version(v) for v in args.versions]
if versions:
remote_versions = None
for version in versions:
if deprecated_version(pkg, version):
tty.warn("Version {0} is deprecated".format(version))
for version in versions:
if deprecated_version(pkg, version):
tty.warn(f"Version {version} is deprecated")
url = pkg.find_valid_url_for_version(version)
if url is not None:
url_dict[version] = url
continue
# if we get here, it's because no valid url was provided by the package
# do expensive fallback to try to recover
if remote_versions is None:
remote_versions = pkg.fetch_remote_versions()
if version in remote_versions:
url_dict[version] = remote_versions[version]
else:
url_dict = pkg.fetch_remote_versions()
url = pkg.find_valid_url_for_version(version)
if url is not None:
url_dict[version] = url
continue
# if we get here, it's because no valid url was provided by the package
# do expensive fallback to try to recover
if remote_versions is None:
remote_versions = pkg.fetch_remote_versions()
if version in remote_versions:
url_dict[version] = remote_versions[version]
if len(versions) <= 0:
if remote_versions is None:
remote_versions = pkg.fetch_remote_versions()
url_dict = remote_versions
if not url_dict:
tty.die("Could not find any remote versions for {0}".format(pkg.name))
tty.die(f"Could not find any remote versions for {pkg.name}")
version_lines = spack.stage.get_checksums_for_versions(
# print an empty line to create a new output section block
print()
version_hashes = spack.stage.get_checksums_for_versions(
url_dict,
pkg.name,
keep_stage=args.keep_stage,
batch=(args.batch or len(args.versions) > 0 or len(url_dict) == 1),
latest=args.latest,
batch=(args.batch or len(versions) > 0 or len(url_dict) == 1),
fetch_options=pkg.fetch_options,
)
if args.verify:
print_checksum_status(pkg, version_hashes)
sys.exit(0)
# convert dict into package.py version statements
version_lines = get_version_lines(version_hashes, url_dict)
print()
print(version_lines)
print()
if args.add_to_package:
filename = spack.repo.path.filename_for_package_name(pkg.name)
# Make sure we also have a newline after the last version
versions = [v + "\n" for v in version_lines.splitlines()]
versions.append("\n")
# We need to insert the versions in reversed order
versions.reverse()
versions.append(" # FIXME: Added by `spack checksum`\n")
version_line = None
add_versions_to_package(pkg, version_lines)
with open(filename, "r") as f:
lines = f.readlines()
for i in range(len(lines)):
# Black is drunk, so this is what it looks like for now
# See https://github.com/psf/black/issues/2156 for more information
if lines[i].startswith(" # FIXME: Added by `spack checksum`") or lines[
i
].startswith(" version("):
version_line = i
break
if version_line is not None:
for v in versions:
lines.insert(version_line, v)
def print_checksum_status(pkg: PackageBase, version_hashes: dict):
"""
Verify checksums present in version_hashes against those present
in the package's instructions.
with open(filename, "w") as f:
f.writelines(lines)
Args:
pkg (spack.package_base.PackageBase): A package class for a given package in Spack.
version_hashes (dict): A dictionary of the form: version -> checksum.
msg = "opening editor to verify"
"""
results = []
num_verified = 0
failed = False
if not sys.stdout.isatty():
msg = "please verify"
max_len = max(len(str(v)) for v in version_hashes)
num_total = len(version_hashes)
tty.info(
"Added {0} new versions to {1}, "
"{2}.".format(len(versions) - 2, args.package, msg)
)
for version, sha in version_hashes.items():
if version not in pkg.versions:
msg = "No previous checksum"
status = "-"
elif sha == pkg.versions[version]["sha256"]:
msg = "Correct"
status = "="
num_verified += 1
if sys.stdout.isatty():
editor(filename)
else:
tty.warn("Could not add new versions to {0}.".format(args.package))
msg = sha
status = "x"
failed = True
results.append("{0:{1}} {2} {3}".format(str(version), max_len, f"[{status}]", msg))
# Display table of checksum results.
tty.msg(f"Verified {num_verified} of {num_total}", "", *llnl.util.lang.elide_list(results), "")
# Terminate at the end of function to prevent additional output.
if failed:
print()
tty.die("Invalid checksums found.")
def add_versions_to_package(pkg: PackageBase, version_lines: str):
"""
Add checksummed versions to a package's instructions and open the user's
editor so they may double-check the function's work.
Args:
pkg (spack.package_base.PackageBase): A package class for a given package in Spack.
version_lines (str): A string of rendered version lines.
"""
# Get filename and path for package
filename = spack.repo.PATH.filename_for_package_name(pkg.name)
num_versions_added = 0
version_statement_re = re.compile(r"([\t ]+version\([^\)]*\))")
version_re = re.compile(r'[\t ]+version\(\s*"([^"]+)"[^\)]*\)')
# Split rendered version lines into tuple of (version, version_line)
# We reverse sort here to make sure the versions match the version_lines
new_versions = []
for ver_line in version_lines.split("\n"):
match = version_re.match(ver_line)
if match:
new_versions.append((Version(match.group(1)), ver_line))
with open(filename, "r+") as f:
contents = f.read()
split_contents = version_statement_re.split(contents)
for i, subsection in enumerate(split_contents):
# If there are no more versions to add we should exit
if len(new_versions) <= 0:
break
# Check if the section contains a version
contents_version = version_re.match(subsection)
if contents_version is not None:
parsed_version = Version(contents_version.group(1))
if parsed_version < new_versions[0][0]:
split_contents[i:i] = [new_versions.pop(0)[1], " # FIX ME", "\n"]
num_versions_added += 1
elif parsed_version == new_versions[0][0]:
new_versions.pop(0)
# Seek back to the start of the file so we can rewrite the file contents.
f.seek(0)
f.writelines("".join(split_contents))
tty.msg(f"Added {num_versions_added} new versions to {pkg.name}")
tty.msg(f"Open {filename} to review the additions.")
if sys.stdout.isatty():
editor(filename)
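Because version_statement_re has a capturing group, re.split keeps each matched version() statement as its own list element, which is what makes the in-place insertion above possible. A toy demonstration (package contents are made up):
import re
version_statement_re = re.compile(r"([\t ]+version\([^\)]*\))")
contents = '    version("1.1", sha256="aa")\n    version("1.0", sha256="bb")\n'
parts = version_statement_re.split(contents)
# parts alternates between plain text and the captured version() statements
parts[1:1] = ['    version("1.2", sha256="cc")', "\n"]
print("".join(parts))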

View File

@@ -156,11 +156,27 @@ def setup_parser(subparser):
help=spack.cmd.first_line(ci_reproduce.__doc__),
)
reproduce.add_argument("job_url", help="URL of job artifacts bundle")
reproduce.add_argument(
"--runtime",
help="Container runtime to use.",
default="docker",
choices=["docker", "podman"],
)
reproduce.add_argument(
"--working-dir",
help="where to unpack artifacts",
default=os.path.join(os.getcwd(), "ci_reproduction"),
)
reproduce.add_argument(
"-s", "--autostart", help="Run docker reproducer automatically", action="store_true"
)
gpg_group = reproduce.add_mutually_exclusive_group(required=False)
gpg_group.add_argument(
"--gpg-file", help="Path to public GPG key for validating binary cache installs"
)
gpg_group.add_argument(
"--gpg-url", help="URL to public GPG key for validating binary cache installs"
)
reproduce.set_defaults(func=ci_reproduce)
@@ -273,6 +289,10 @@ def ci_rebuild(args):
rebuild_everything = os.environ.get("SPACK_REBUILD_EVERYTHING")
require_signing = os.environ.get("SPACK_REQUIRE_SIGNING")
# If signing key was provided via "SPACK_SIGNING_KEY", then try to import it.
if signing_key:
spack_ci.import_signing_key(signing_key)
# Fail early if signing is required but we don't have a signing key
sign_binaries = require_signing is not None and require_signing.lower() == "true"
if sign_binaries and not spack_ci.can_sign_binaries():
@@ -402,11 +422,6 @@ def ci_rebuild(args):
dst_file = os.path.join(repro_dir, file_name)
shutil.copyfile(src_file, dst_file)
# If signing key was provided via "SPACK_SIGNING_KEY", then try to
# import it.
if signing_key:
spack_ci.import_signing_key(signing_key)
# Write this job's spec json into the reproduction directory, and it will
# also be used in the generated "spack install" command to install the spec
tty.debug("job concrete spec path: {0}".format(job_spec_json_path))
@@ -663,7 +678,7 @@ def ci_rebuild(args):
input_spec=job_spec,
buildcache_mirror_url=buildcache_mirror_url,
pipeline_mirror_url=pipeline_mirror_url,
sign_binaries=sign_binaries,
sign_binaries=spack_ci.can_sign_binaries(),
):
msg = tty.msg if result.success else tty.warn
msg(
@@ -707,7 +722,7 @@ def ci_rebuild(args):
\033[34mTo reproduce this build locally, run:
spack ci reproduce-build {0} [--working-dir <dir>]
spack ci reproduce-build {0} [--working-dir <dir>] [--autostart]
If this project does not have public pipelines, you will need to first:
@@ -733,8 +748,18 @@ def ci_reproduce(args):
"""
job_url = args.job_url
work_dir = args.working_dir
autostart = args.autostart
runtime = args.runtime
return spack_ci.reproduce_ci_job(job_url, work_dir)
# Allow passing a GPG key for reproducing protected CI jobs
if args.gpg_file:
gpg_key_url = url_util.path_to_file_url(args.gpg_file)
elif args.gpg_url:
gpg_key_url = args.gpg_url
else:
gpg_key_url = None
return spack_ci.reproduce_ci_job(job_url, work_dir, autostart, gpg_key_url, runtime)
def ci(parser, args):

View File

@@ -17,6 +17,7 @@
import spack.config
import spack.repo
import spack.stage
import spack.store
import spack.util.path
from spack.paths import lib_path, var_path
@@ -114,22 +115,18 @@ def clean(parser, args):
if args.stage:
tty.msg("Removing all temporary build stages")
spack.stage.purge()
# Temp directory where buildcaches are extracted
extract_tmp = os.path.join(spack.store.STORE.layout.root, ".tmp")
if os.path.exists(extract_tmp):
tty.debug("Removing {0}".format(extract_tmp))
shutil.rmtree(extract_tmp)
if args.downloads:
tty.msg("Removing cached downloads")
spack.caches.fetch_cache.destroy()
spack.caches.FETCH_CACHE.destroy()
if args.failures:
tty.msg("Removing install failure marks")
spack.installer.clear_failures()
spack.store.STORE.failure_tracker.clear_all()
if args.misc_cache:
tty.msg("Removing cached information on repositories")
spack.caches.misc_cache.destroy()
spack.caches.MISC_CACHE.destroy()
if args.python_cache:
tty.msg("Removing python cache files")

View File

@@ -36,13 +36,13 @@
"bash": {
"aliases": True,
"format": "bash",
"header": os.path.join(spack.paths.share_path, "bash", "spack-completion.in"),
"header": os.path.join(spack.paths.share_path, "bash", "spack-completion.bash"),
"update": os.path.join(spack.paths.share_path, "spack-completion.bash"),
},
"fish": {
"aliases": True,
"format": "fish",
"header": os.path.join(spack.paths.share_path, "fish", "spack-completion.in"),
"header": os.path.join(spack.paths.share_path, "fish", "spack-completion.fish"),
"update": os.path.join(spack.paths.share_path, "spack-completion.fish"),
},
}
@@ -812,6 +812,9 @@ def bash(args: Namespace, out: IO) -> None:
parser = spack.main.make_argument_parser()
spack.main.add_all_commands(parser)
aliases = ";".join(f"{key}:{val}" for key, val in spack.main.aliases.items())
out.write(f'SPACK_ALIASES="{aliases}"\n\n')
writer = BashCompletionWriter(parser.prog, out, args.aliases)
writer.write(parser)
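A minimal sketch of the SPACK_ALIASES string written above, assuming a hypothetical alias mapping in place of spack.main.aliases:

aliases = {"rm": "remove"}  # hypothetical stand-in for spack.main.aliases
alias_str = ";".join(f"{key}:{val}" for key, val in aliases.items())
print(f'SPACK_ALIASES="{alias_str}"')  # prints: SPACK_ALIASES="rm:remove"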


@@ -331,6 +331,17 @@ def tags():
)
@arg
def namespaces():
return Args(
"-N",
"--namespaces",
action="store_true",
default=False,
help="show fully qualified package names",
)
@arg
def jobs():
return Args(


@@ -24,7 +24,6 @@ def setup_parser(subparser):
sp = subparser.add_subparsers(metavar="SUBCOMMAND", dest="compiler_command")
scopes = spack.config.scopes()
scopes_metavar = spack.config.scopes_metavar
# Find
find_parser = sp.add_parser(
@@ -36,7 +35,7 @@ def setup_parser(subparser):
find_parser.add_argument(
"--scope",
choices=scopes,
metavar=scopes_metavar,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_modify_scope("compilers"),
help="configuration scope to modify",
)
@@ -50,7 +49,7 @@ def setup_parser(subparser):
remove_parser.add_argument(
"--scope",
choices=scopes,
metavar=scopes_metavar,
metavar=spack.config.SCOPES_METAVAR,
default=None,
help="configuration scope to modify",
)
@@ -60,7 +59,7 @@ def setup_parser(subparser):
list_parser.add_argument(
"--scope",
choices=scopes,
metavar=scopes_metavar,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_list_scope(),
help="configuration scope to read from",
)
@@ -71,7 +70,7 @@ def setup_parser(subparser):
info_parser.add_argument(
"--scope",
choices=scopes,
metavar=scopes_metavar,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_list_scope(),
help="configuration scope to read from",
)
@@ -93,7 +92,7 @@ def compiler_find(args):
n = len(new_compilers)
s = "s" if n > 1 else ""
config = spack.config.config
config = spack.config.CONFIG
filename = config.get_config_filename(args.scope, "compilers")
tty.msg("Added %d new compiler%s to %s" % (n, s, filename))
colify(reversed(sorted(c.spec.display_str for c in new_compilers)), indent=4)
@@ -186,7 +185,7 @@ def compiler_list(args):
os_str = os
if target:
os_str += "-%s" % target
cname = "%s{%s} %s" % (spack.spec.compiler_color, name, os_str)
cname = "%s{%s} %s" % (spack.spec.COMPILER_COLOR, name, os_str)
tty.hline(colorize(cname), char="-")
colify(reversed(sorted(c.spec.display_str for c in compilers)))


@@ -13,12 +13,11 @@
def setup_parser(subparser):
scopes = spack.config.scopes()
scopes_metavar = spack.config.scopes_metavar
subparser.add_argument(
"--scope",
choices=scopes,
metavar=scopes_metavar,
metavar=spack.config.SCOPES_METAVAR,
help="configuration scope to read/modify",
)


@@ -27,13 +27,12 @@
def setup_parser(subparser):
scopes = spack.config.scopes()
scopes_metavar = spack.config.scopes_metavar
# User can only choose one
subparser.add_argument(
"--scope",
choices=scopes,
metavar=scopes_metavar,
metavar=spack.config.SCOPES_METAVAR,
help="configuration scope to read/modify",
)
@@ -45,7 +44,7 @@ def setup_parser(subparser):
help="configuration section to print\n\noptions: %(choices)s",
nargs="?",
metavar="section",
choices=spack.config.section_schemas,
choices=spack.config.SECTION_SCHEMAS,
)
blame_parser = sp.add_parser(
@@ -55,7 +54,7 @@ def setup_parser(subparser):
"section",
help="configuration section to print\n\noptions: %(choices)s",
metavar="section",
choices=spack.config.section_schemas,
choices=spack.config.SECTION_SCHEMAS,
)
edit_parser = sp.add_parser("edit", help="edit configuration file")
@@ -64,7 +63,7 @@ def setup_parser(subparser):
help="configuration section to edit\n\noptions: %(choices)s",
metavar="section",
nargs="?",
choices=spack.config.section_schemas,
choices=spack.config.SECTION_SCHEMAS,
)
edit_parser.add_argument(
"--print-file", action="store_true", help="print the file name that would be edited"
@@ -146,10 +145,10 @@ def config_get(args):
scope, section = _get_scope_and_section(args)
if section is not None:
spack.config.config.print_section(section)
spack.config.CONFIG.print_section(section)
elif scope and scope.startswith("env:"):
config_file = spack.config.config.get_config_filename(scope, section)
config_file = spack.config.CONFIG.get_config_filename(scope, section)
if os.path.exists(config_file):
with open(config_file) as f:
print(f.read())
@@ -162,7 +161,7 @@ def config_get(args):
def config_blame(args):
"""Print out line-by-line blame of merged YAML."""
spack.config.config.print_section(args.section, blame=True)
spack.config.CONFIG.print_section(args.section, blame=True)
def config_edit(args):
@@ -181,7 +180,7 @@ def config_edit(args):
scope, section = _get_scope_and_section(args)
if not scope and not section:
tty.die("`spack config edit` requires a section argument or an active environment.")
config_file = spack.config.config.get_config_filename(scope, section)
config_file = spack.config.CONFIG.get_config_filename(scope, section)
if args.print_file:
print(config_file)
@@ -194,7 +193,7 @@ def config_list(args):
Used primarily for shell tab completion scripts.
"""
print(" ".join(list(spack.config.section_schemas)))
print(" ".join(list(spack.config.SECTION_SCHEMAS)))
def config_add(args):
@@ -251,19 +250,19 @@ def _can_update_config_file(scope: spack.config.ConfigScope, cfg_file):
def config_update(args):
# Read the configuration files
spack.config.config.get_config(args.section, scope=args.scope)
spack.config.CONFIG.get_config(args.section, scope=args.scope)
updates: List[spack.config.ConfigScope] = list(
filter(
lambda s: not isinstance(
s, (spack.config.InternalConfigScope, spack.config.ImmutableConfigScope)
),
spack.config.config.format_updates[args.section],
spack.config.CONFIG.format_updates[args.section],
)
)
cannot_overwrite, skip_system_scope = [], False
for scope in updates:
cfg_file = spack.config.config.get_config_filename(scope.name, args.section)
cfg_file = spack.config.CONFIG.get_config_filename(scope.name, args.section)
can_be_updated = _can_update_config_file(scope, cfg_file)
if not can_be_updated:
if scope.name == "system":
@@ -302,7 +301,7 @@ def config_update(args):
" the latest schema format:\n\n"
)
for scope in updates:
cfg_file = spack.config.config.get_config_filename(scope.name, args.section)
cfg_file = spack.config.CONFIG.get_config_filename(scope.name, args.section)
msg += "\t[scope={0}, file={1}]\n".format(scope.name, cfg_file)
msg += (
"\nIf the configuration files are updated, versions of Spack "
@@ -325,7 +324,7 @@ def config_update(args):
# Make a backup copy and rewrite the file
bkp_file = cfg_file + ".bkp"
shutil.copy(cfg_file, bkp_file)
spack.config.config.update_config(args.section, data, scope=scope.name, force=True)
spack.config.CONFIG.update_config(args.section, data, scope=scope.name, force=True)
tty.msg(f'File "{cfg_file}" update [backup={bkp_file}]')
@@ -337,13 +336,13 @@ def _can_revert_update(scope_dir, cfg_file, bkp_file):
def config_revert(args):
scopes = [args.scope] if args.scope else [x.name for x in spack.config.config.file_scopes]
scopes = [args.scope] if args.scope else [x.name for x in spack.config.CONFIG.file_scopes]
# Search for backup files in the configuration scopes
Entry = collections.namedtuple("Entry", ["scope", "cfg", "bkp"])
to_be_restored, cannot_overwrite = [], []
for scope in scopes:
cfg_file = spack.config.config.get_config_filename(scope, args.section)
cfg_file = spack.config.CONFIG.get_config_filename(scope, args.section)
bkp_file = cfg_file + ".bkp"
# If the backup file doesn't exist, move to the next scope
@@ -457,7 +456,7 @@ def config_prefer_upstream(args):
existing = spack.config.get("packages", scope=scope)
new = spack.config.merge_yaml(existing, pkgs)
spack.config.set("packages", new, scope)
config_file = spack.config.config.get_config_filename(scope, section)
config_file = spack.config.CONFIG.get_config_filename(scope, section)
tty.msg("Updated config at {0}".format(config_file))


@@ -17,6 +17,7 @@
from spack.url import UndetectableNameError, UndetectableVersionError, parse_name, parse_version
from spack.util.editor import editor
from spack.util.executable import ProcessError, which
from spack.util.format import get_version_lines
from spack.util.naming import mod_to_class, simplify_name, valid_fully_qualified_module_name
description = "create a new package file"
@@ -832,13 +833,15 @@ def get_versions(args, name):
version = parse_version(args.url)
url_dict = {version: args.url}
versions = spack.stage.get_checksums_for_versions(
version_hashes = spack.stage.get_checksums_for_versions(
url_dict,
name,
first_stage_function=guesser,
keep_stage=args.keep_stage,
batch=(args.batch or len(url_dict) == 1),
)
versions = get_version_lines(version_hashes, url_dict)
else:
versions = unhashed_versions
@@ -912,11 +915,11 @@ def get_repository(args, name):
)
else:
if spec.namespace:
repo = spack.repo.path.get_repo(spec.namespace, None)
repo = spack.repo.PATH.get_repo(spec.namespace, None)
if not repo:
tty.die("Unknown namespace: '{0}'".format(spec.namespace))
else:
repo = spack.repo.path.first_repo()
repo = spack.repo.PATH.first_repo()
# Set the namespace on the spec if it's not there already
if not spec.namespace:


@@ -47,14 +47,14 @@ def inverted_dependencies():
actual dependents.
"""
dag = {}
for pkg_cls in spack.repo.path.all_package_classes():
for pkg_cls in spack.repo.PATH.all_package_classes():
dag.setdefault(pkg_cls.name, set())
for dep in pkg_cls.dependencies:
deps = [dep]
# expand virtuals if necessary
if spack.repo.path.is_virtual(dep):
deps += [s.name for s in spack.repo.path.providers_for(dep)]
if spack.repo.PATH.is_virtual(dep):
deps += [s.name for s in spack.repo.PATH.providers_for(dep)]
for d in deps:
dag.setdefault(d, set()).add(pkg_cls.name)


@@ -98,7 +98,7 @@ def dev_build(self, args):
tty.die("spack dev-build only takes one spec.")
spec = specs[0]
if not spack.repo.path.exists(spec.name):
if not spack.repo.PATH.exists(spec.name):
tty.die(
"No package for '{0}' was found.".format(spec.name),
" Use `spack create` to create a new package",


@@ -31,9 +31,9 @@ def edit_package(name, repo_path, namespace):
if repo_path:
repo = spack.repo.Repo(repo_path)
elif namespace:
repo = spack.repo.path.get_repo(namespace)
repo = spack.repo.PATH.get_repo(namespace)
else:
repo = spack.repo.path
repo = spack.repo.PATH
path = repo.filename_for_package_name(name)
spec = Spec(name)


@@ -239,6 +239,13 @@ def env_deactivate_setup_parser(subparser):
const="bat",
help="print bat commands to activate the environment",
)
shells.add_argument(
"--pwsh",
action="store_const",
dest="shell",
const="pwsh",
help="print pwsh commands to activate the environment",
)
def env_deactivate(args):


@@ -58,7 +58,7 @@ def extensions(parser, args):
extendable_pkgs = []
for name in spack.repo.all_package_names():
pkg_cls = spack.repo.path.get_pkg_class(name)
pkg_cls = spack.repo.PATH.get_pkg_class(name)
if pkg_cls.extendable:
extendable_pkgs.append(name)
@@ -81,7 +81,7 @@ def extensions(parser, args):
if args.show in ("packages", "all"):
# List package names of extensions
extensions = spack.repo.path.extensions_for(spec)
extensions = spack.repo.PATH.extensions_for(spec)
if not extensions:
tty.msg("%s has no extensions." % spec.cshort_spec)
else:


@@ -13,6 +13,7 @@
import spack
import spack.cmd
import spack.cmd.common.arguments
import spack.config
import spack.cray_manifest as cray_manifest
import spack.detection
import spack.error
@@ -27,7 +28,6 @@ def setup_parser(subparser):
sp = subparser.add_subparsers(metavar="SUBCOMMAND", dest="external_command")
scopes = spack.config.scopes()
scopes_metavar = spack.config.scopes_metavar
find_parser = sp.add_parser("find", help="add external packages to packages.yaml")
find_parser.add_argument(
@@ -47,7 +47,7 @@ def setup_parser(subparser):
find_parser.add_argument(
"--scope",
choices=scopes,
metavar=scopes_metavar,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_modify_scope("packages"),
help="configuration scope to modify",
)
@@ -133,9 +133,9 @@ def external_find(args):
# Add the packages that have been required explicitly
if args.packages:
pkg_cls_to_check = [spack.repo.path.get_pkg_class(pkg) for pkg in args.packages]
pkg_cls_to_check = [spack.repo.PATH.get_pkg_class(pkg) for pkg in args.packages]
if args.tags:
allowed = set(spack.repo.path.packages_with_tags(*args.tags))
allowed = set(spack.repo.PATH.packages_with_tags(*args.tags))
pkg_cls_to_check = [x for x in pkg_cls_to_check if x.name in allowed]
if args.tags and not pkg_cls_to_check:
@@ -144,15 +144,15 @@ def external_find(args):
# Since tags are cached it's much faster to construct what we need
# to search directly, rather than filtering after the fact
pkg_cls_to_check = [
spack.repo.path.get_pkg_class(pkg_name)
spack.repo.PATH.get_pkg_class(pkg_name)
for tag in args.tags
for pkg_name in spack.repo.path.packages_with_tags(tag)
for pkg_name in spack.repo.PATH.packages_with_tags(tag)
]
pkg_cls_to_check = list(set(pkg_cls_to_check))
# If the list of packages is empty, search for every possible package
if not args.tags and not pkg_cls_to_check:
pkg_cls_to_check = list(spack.repo.path.all_package_classes())
pkg_cls_to_check = list(spack.repo.PATH.all_package_classes())
# If the user specified any packages to exclude from external find, add them here
if args.exclude:
@@ -165,7 +165,7 @@ def external_find(args):
detected_packages, scope=args.scope, buildable=not args.not_buildable
)
if new_entries:
path = spack.config.config.get_config_filename(args.scope, "packages")
path = spack.config.CONFIG.get_config_filename(args.scope, "packages")
msg = "The following specs have been detected on this system and added to {0}"
tty.msg(msg.format(path))
spack.cmd.display_specs(new_entries)
@@ -239,7 +239,7 @@ def _collect_and_consume_cray_manifest_files(
def external_list(args):
# Trigger a read of all packages, might take a long time.
list(spack.repo.path.all_package_classes())
list(spack.repo.PATH.all_package_classes())
# Print all the detectable packages
tty.msg("Detectable packages per repository")
for namespace, pkgs in sorted(spack.package_base.detectable_packages.items()):


@@ -67,7 +67,7 @@ def setup_parser(subparser):
help="do not group specs by arch/compiler",
)
arguments.add_common_arguments(subparser, ["long", "very_long", "tags"])
arguments.add_common_arguments(subparser, ["long", "very_long", "tags", "namespaces"])
subparser.add_argument(
"-c",
@@ -140,9 +140,6 @@ def setup_parser(subparser):
subparser.add_argument(
"--only-deprecated", action="store_true", help="show only deprecated packages"
)
subparser.add_argument(
"-N", "--namespace", action="store_true", help="show fully qualified package names"
)
subparser.add_argument("--start-date", help="earliest date of installation [YYYY-MM-DD]")
subparser.add_argument("--end-date", help="latest date of installation [YYYY-MM-DD]")
@@ -230,7 +227,7 @@ def display_env(env, args, decorator, results):
env.user_specs,
root_args,
decorator=lambda s, f: color.colorize("@*{%s}" % f),
namespace=True,
namespaces=True,
show_flags=True,
show_full_compiler=True,
variants=True,
@@ -271,7 +268,7 @@ def find(parser, args):
# If tags have been specified on the command line, filter by tags
if args.tags:
packages_with_tags = spack.repo.path.packages_with_tags(*args.tags)
packages_with_tags = spack.repo.PATH.packages_with_tags(*args.tags)
results = [x for x in results if x.name in packages_with_tags]
if args.loaded:


@@ -64,11 +64,11 @@ def section_title(s):
def version(s):
return spack.spec.version_color + s + plain_format
return spack.spec.VERSION_COLOR + s + plain_format
def variant(s):
return spack.spec.enabled_variant_color + s + plain_format
return spack.spec.ENABLED_VARIANT_COLOR + s + plain_format
class VariantFormatter:
@@ -349,7 +349,7 @@ def print_virtuals(pkg):
def info(parser, args):
spec = spack.spec.Spec(args.package)
pkg_cls = spack.repo.path.get_pkg_class(spec.name)
pkg_cls = spack.repo.PATH.get_pkg_class(spec.name)
pkg = pkg_cls(spec)
# Output core package information


@@ -107,7 +107,7 @@ def match(p, f):
if f.match(p):
return True
pkg_cls = spack.repo.path.get_pkg_class(p)
pkg_cls = spack.repo.PATH.get_pkg_class(p)
if pkg_cls.__doc__:
return f.match(pkg_cls.__doc__)
return False
@@ -159,7 +159,7 @@ def get_dependencies(pkg):
@formatter
def version_json(pkg_names, out):
"""Print all packages with their latest versions."""
pkg_classes = [spack.repo.path.get_pkg_class(name) for name in pkg_names]
pkg_classes = [spack.repo.PATH.get_pkg_class(name) for name in pkg_names]
out.write("[\n")
@@ -201,7 +201,7 @@ def html(pkg_names, out):
"""
# Read in all packages
pkg_classes = [spack.repo.path.get_pkg_class(name) for name in pkg_names]
pkg_classes = [spack.repo.PATH.get_pkg_class(name) for name in pkg_names]
# Start at 2 because the title of the page from Sphinx is id1.
span_id = 2
@@ -313,13 +313,13 @@ def list(parser, args):
# If tags have been specified on the command line, filter by tags
if args.tags:
packages_with_tags = spack.repo.path.packages_with_tags(*args.tags)
packages_with_tags = spack.repo.PATH.packages_with_tags(*args.tags)
sorted_packages = [p for p in sorted_packages if p in packages_with_tags]
if args.update:
# change output stream if user asked for update
if os.path.exists(args.update):
if os.path.getmtime(args.update) > spack.repo.path.last_mtime():
if os.path.getmtime(args.update) > spack.repo.PATH.last_mtime():
tty.msg("File is up to date: %s" % args.update)
return


@@ -52,6 +52,13 @@ def setup_parser(subparser):
const="bat",
help="print bat commands to load the package",
)
shells.add_argument(
"--pwsh",
action="store_const",
dest="shell",
const="pwsh",
help="print pwsh commands to load the package",
)
subparser.add_argument(
"--first",


@@ -109,7 +109,7 @@ def location(parser, args):
return
if args.packages:
print(spack.repo.path.first_repo().root)
print(spack.repo.PATH.first_repo().root)
return
if args.stages:
@@ -135,7 +135,7 @@ def location(parser, args):
# Package dir just needs the spec name
if args.package_dir:
print(spack.repo.path.dirname_for_package_name(spec.name))
print(spack.repo.PATH.dirname_for_package_name(spec.name))
return
# Either concretize or filter from already concretized environment


@@ -54,11 +54,11 @@ def setup_parser(subparser):
def packages_to_maintainers(package_names=None):
if not package_names:
package_names = spack.repo.path.all_package_names()
package_names = spack.repo.PATH.all_package_names()
pkg_to_users = defaultdict(lambda: set())
for name in package_names:
cls = spack.repo.path.get_pkg_class(name)
cls = spack.repo.PATH.get_pkg_class(name)
for user in cls.maintainers:
pkg_to_users[name].add(user)
@@ -67,8 +67,8 @@ def packages_to_maintainers(package_names=None):
def maintainers_to_packages(users=None):
user_to_pkgs = defaultdict(lambda: [])
for name in spack.repo.path.all_package_names():
cls = spack.repo.path.get_pkg_class(name)
for name in spack.repo.PATH.all_package_names():
cls = spack.repo.PATH.get_pkg_class(name)
for user in cls.maintainers:
lower_users = [u.lower() for u in users]
if not users or user.lower() in lower_users:
@@ -80,8 +80,8 @@ def maintainers_to_packages(users=None):
def maintained_packages():
maintained = []
unmaintained = []
for name in spack.repo.path.all_package_names():
cls = spack.repo.path.get_pkg_class(name)
for name in spack.repo.PATH.all_package_names():
cls = spack.repo.PATH.get_pkg_class(name)
if cls.maintainers:
maintained.append(name)
else:


@@ -90,7 +90,6 @@ def setup_parser(subparser):
# used to construct scope arguments below
scopes = spack.config.scopes()
scopes_metavar = spack.config.scopes_metavar
# Add
add_parser = sp.add_parser("add", help=mirror_add.__doc__)
@@ -99,7 +98,7 @@ def setup_parser(subparser):
add_parser.add_argument(
"--scope",
choices=scopes,
metavar=scopes_metavar,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_modify_scope(),
help="configuration scope to modify",
)
@@ -119,7 +118,7 @@ def setup_parser(subparser):
remove_parser.add_argument(
"--scope",
choices=scopes,
metavar=scopes_metavar,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_modify_scope(),
help="configuration scope to modify",
)
@@ -138,7 +137,7 @@ def setup_parser(subparser):
set_url_parser.add_argument(
"--scope",
choices=scopes,
metavar=scopes_metavar,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_modify_scope(),
help="configuration scope to modify",
)
@@ -167,7 +166,7 @@ def setup_parser(subparser):
set_parser.add_argument(
"--scope",
choices=scopes,
metavar=scopes_metavar,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_modify_scope(),
help="configuration scope to modify",
)
@@ -178,7 +177,7 @@ def setup_parser(subparser):
list_parser.add_argument(
"--scope",
choices=scopes,
metavar=scopes_metavar,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_list_scope(),
help="configuration scope to read from",
)
@@ -474,7 +473,7 @@ def create_mirror_for_all_specs(path, skip_unstable_versions, selection_fn):
path, skip_unstable_versions=skip_unstable_versions
)
for candidate in mirror_specs:
pkg_cls = spack.repo.path.get_pkg_class(candidate.name)
pkg_cls = spack.repo.PATH.get_pkg_class(candidate.name)
pkg_obj = pkg_cls(spack.spec.Spec(candidate))
mirror_stats.next_spec(pkg_obj.spec)
spack.mirror.create_mirror_from_package_object(pkg_obj, mirror_cache, mirror_stats)


@@ -309,7 +309,7 @@ def refresh(module_type, specs, args):
# Skip unknown packages.
writers = [
cls(spec, args.module_set_name) for spec in specs if spack.repo.path.exists(spec.name)
cls(spec, args.module_set_name) for spec in specs if spack.repo.PATH.exists(spec.name)
]
# Filter excluded packages early
@@ -321,12 +321,13 @@ def refresh(module_type, specs, args):
file2writer[item.layout.filename].append(item)
if len(file2writer) != len(writers):
spec_fmt_str = "{name}@={version}%{compiler}/{hash:7} {variants} arch={arch}"
message = "Name clashes detected in module files:\n"
for filename, writer_list in file2writer.items():
if len(writer_list) > 1:
message += "\nfile: {0}\n".format(filename)
for x in writer_list:
message += "spec: {0}\n".format(x.spec.format())
message += "spec: {0}\n".format(x.spec.format(spec_fmt_str))
tty.error(message)
tty.error("Operation aborted")
raise SystemExit(1)
@@ -376,7 +377,7 @@ def refresh(module_type, specs, args):
def modules_cmd(parser, args, module_type, callbacks=callbacks):
# Qualifiers to be used when querying the db for specs
constraint_qualifiers = {
"refresh": {"installed": True, "known": lambda x: not spack.repo.path.exists(x)}
"refresh": {"installed": True, "known": lambda x: not spack.repo.PATH.exists(x)}
}
query_args = constraint_qualifiers.get(args.subparser_name, {})


@@ -143,7 +143,7 @@ def pkg_source(args):
tty.die("spack pkg source requires exactly one spec")
spec = specs[0]
filename = spack.repo.path.filename_for_package_name(spec.name)
filename = spack.repo.PATH.filename_for_package_name(spec.name)
# regular source dump -- just get the package and print its contents
if args.canonical:
@@ -184,7 +184,7 @@ def pkg_grep(args, unknown_args):
grouper = lambda e: e[0] // 500
# set up iterator and save the first group to ensure we don't end up with a group of size 1
groups = itertools.groupby(enumerate(spack.repo.path.all_package_paths()), grouper)
groups = itertools.groupby(enumerate(spack.repo.PATH.all_package_paths()), grouper)
if not groups:
return 0 # no packages to search


@@ -24,7 +24,7 @@ def setup_parser(subparser):
def providers(parser, args):
valid_virtuals = sorted(spack.repo.path.provider_index.providers.keys())
valid_virtuals = sorted(spack.repo.PATH.provider_index.providers.keys())
buffer = io.StringIO()
isatty = sys.stdout.isatty()
@@ -53,5 +53,5 @@ def providers(parser, args):
for spec in specs:
if sys.stdout.isatty():
print("{0}:".format(spec))
spack.cmd.display_specs(sorted(spack.repo.path.providers_for(spec)))
spack.cmd.display_specs(sorted(spack.repo.PATH.providers_for(spec)))
print("")


@@ -20,7 +20,6 @@
def setup_parser(subparser):
sp = subparser.add_subparsers(metavar="SUBCOMMAND", dest="repo_command")
scopes = spack.config.scopes()
scopes_metavar = spack.config.scopes_metavar
# Create
create_parser = sp.add_parser("create", help=repo_create.__doc__)
@@ -45,7 +44,7 @@ def setup_parser(subparser):
list_parser.add_argument(
"--scope",
choices=scopes,
metavar=scopes_metavar,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_list_scope(),
help="configuration scope to read from",
)
@@ -56,7 +55,7 @@ def setup_parser(subparser):
add_parser.add_argument(
"--scope",
choices=scopes,
metavar=scopes_metavar,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_modify_scope(),
help="configuration scope to modify",
)
@@ -69,7 +68,7 @@ def setup_parser(subparser):
remove_parser.add_argument(
"--scope",
choices=scopes,
metavar=scopes_metavar,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_modify_scope(),
help="configuration scope to modify",
)


@@ -29,7 +29,7 @@ def setup_parser(subparser):
def _show_patch(sha256):
"""Show a record from the patch index."""
patches = spack.repo.path.patch_index.index
patches = spack.repo.PATH.patch_index.index
data = patches.get(sha256)
if not data:
@@ -47,7 +47,7 @@ def _show_patch(sha256):
owner = rec["owner"]
if "relative_path" in rec:
pkg_dir = spack.repo.path.get_pkg_class(owner).package_dir
pkg_dir = spack.repo.PATH.get_pkg_class(owner).package_dir
path = os.path.join(pkg_dir, rec["relative_path"])
print(" path: %s" % path)
else:
@@ -60,7 +60,7 @@ def _show_patch(sha256):
def resource_list(args):
"""list all resources known to spack (currently just patches)"""
patches = spack.repo.path.patch_index.index
patches = spack.repo.PATH.patch_index.index
for sha256 in patches:
if args.only_hashes:
print(sha256)


@@ -42,7 +42,7 @@ def setup_parser(subparser):
)
# Below are arguments w.r.t. spec display (like spack spec)
arguments.add_common_arguments(subparser, ["long", "very_long"])
arguments.add_common_arguments(subparser, ["long", "very_long", "namespaces"])
install_status_group = subparser.add_mutually_exclusive_group()
arguments.add_common_arguments(install_status_group, ["install_status", "no_install_status"])
@@ -73,13 +73,6 @@ def setup_parser(subparser):
choices=["nodes", "edges", "paths"],
help="how extensively to traverse the DAG (default: nodes)",
)
subparser.add_argument(
"-N",
"--namespaces",
action="store_true",
default=False,
help="show fully qualified package names",
)
subparser.add_argument(
"-t", "--types", action="store_true", default=False, help="show dependency types"
)
@@ -144,7 +137,7 @@ def solve(parser, args):
# these are the same options as `spack spec`
install_status_fn = spack.spec.Spec.install_status
fmt = spack.spec.display_format
fmt = spack.spec.DISPLAY_FORMAT
if args.namespaces:
fmt = "{namespace}." + fmt


@@ -29,7 +29,7 @@ def setup_parser(subparser):
for further documentation regarding the spec syntax, see:
spack help --spec
"""
arguments.add_common_arguments(subparser, ["long", "very_long"])
arguments.add_common_arguments(subparser, ["long", "very_long", "namespaces"])
install_status_group = subparser.add_mutually_exclusive_group()
arguments.add_common_arguments(install_status_group, ["install_status", "no_install_status"])
@@ -67,13 +67,6 @@ def setup_parser(subparser):
choices=["nodes", "edges", "paths"],
help="how extensively to traverse the DAG (default: nodes)",
)
subparser.add_argument(
"-N",
"--namespaces",
action="store_true",
default=False,
help="show fully qualified package names",
)
subparser.add_argument(
"-t", "--types", action="store_true", default=False, help="show dependency types"
)
@@ -84,7 +77,7 @@ def setup_parser(subparser):
def spec(parser, args):
install_status_fn = spack.spec.Spec.install_status
fmt = spack.spec.display_format
fmt = spack.spec.DISPLAY_FORMAT
if args.namespaces:
fmt = "{namespace}." + fmt


@@ -68,7 +68,7 @@ def tags(parser, args):
return
# unique list of available tags
available_tags = sorted(spack.repo.path.tag_index.keys())
available_tags = sorted(spack.repo.PATH.tag_index.keys())
if not available_tags:
tty.msg("No tagged packages")
return


@@ -228,7 +228,7 @@ def create_reporter(args, specs_to_test, test_suite):
def test_list(args):
"""list installed packages with available tests"""
tagged = set(spack.repo.path.packages_with_tags(*args.tag)) if args.tag else set()
tagged = set(spack.repo.PATH.packages_with_tags(*args.tag)) if args.tag else set()
def has_test_and_tags(pkg_class):
tests = spack.install_test.test_functions(pkg_class)
@@ -237,7 +237,7 @@ def has_test_and_tags(pkg_class):
if args.list_all:
report_packages = [
pkg_class.name
for pkg_class in spack.repo.path.all_package_classes()
for pkg_class in spack.repo.PATH.all_package_classes()
if has_test_and_tags(pkg_class)
]


@@ -209,12 +209,11 @@ def unit_test(parser, args, unknown_args):
# mock configuration used by unit tests
# Note: skip on windows here because for the moment,
# clingo is wholly unsupported from bootstrap
if sys.platform != "win32":
with spack.bootstrap.ensure_bootstrap_configuration():
spack.bootstrap.ensure_core_dependencies()
if pytest is None:
spack.bootstrap.ensure_environment_dependencies()
import pytest
with spack.bootstrap.ensure_bootstrap_configuration():
spack.bootstrap.ensure_core_dependencies()
if pytest is None:
spack.bootstrap.ensure_environment_dependencies()
import pytest
if args.pytest_help:
# make the pytest.main help output more accurate


@@ -51,6 +51,13 @@ def setup_parser(subparser):
const="bat",
help="print bat commands to load the package",
)
shells.add_argument(
"--pwsh",
action="store_const",
dest="shell",
const="pwsh",
help="print pwsh commands to load the package",
)
subparser.add_argument(
"-a", "--all", action="store_true", help="unload all loaded Spack packages"


@@ -155,7 +155,7 @@ def url_list(args):
urls = set()
# Gather set of URLs from all packages
for pkg_cls in spack.repo.path.all_package_classes():
for pkg_cls in spack.repo.PATH.all_package_classes():
url = getattr(pkg_cls, "url", None)
urls = url_list_parsing(args, urls, url, pkg_cls)
@@ -192,7 +192,7 @@ def url_summary(args):
tty.msg("Generating a summary of URL parsing in Spack...")
# Loop through all packages
for pkg_cls in spack.repo.path.all_package_classes():
for pkg_cls in spack.repo.PATH.all_package_classes():
urls = set()
pkg = pkg_cls(spack.spec.Spec(pkg_cls.name))
@@ -336,7 +336,7 @@ def add(self, pkg_name, fetcher):
version_stats = UrlStats()
resource_stats = UrlStats()
for pkg_cls in spack.repo.path.all_package_classes():
for pkg_cls in spack.repo.PATH.all_package_classes():
npkgs += 1
for v in pkg_cls.versions:


@@ -45,7 +45,7 @@ def setup_parser(subparser):
def versions(parser, args):
spec = spack.spec.Spec(args.package)
pkg_cls = spack.repo.path.get_pkg_class(spec.name)
pkg_cls = spack.repo.PATH.get_pkg_class(spec.name)
pkg = pkg_cls(spec)
safe_versions = pkg.versions


@@ -135,7 +135,7 @@ def _init_compiler_config(*, scope):
def compiler_config_files():
config_files = list()
config = spack.config.config
config = spack.config.CONFIG
for scope in config.file_scopes:
name = scope.name
compiler_config = config.get("compilers", scope=name)
@@ -169,7 +169,7 @@ def remove_compiler_from_config(compiler_spec, scope=None):
"""
candidate_scopes = [scope]
if scope is None:
candidate_scopes = spack.config.config.scopes.keys()
candidate_scopes = spack.config.CONFIG.scopes.keys()
removal_happened = False
for current_scope in candidate_scopes:
@@ -523,7 +523,7 @@ def compiler_for_spec(compiler_spec, arch_spec):
@_auto_compiler_spec
def get_compiler_duplicates(compiler_spec, arch_spec):
config = spack.config.config
config = spack.config.CONFIG
scope_to_compilers = {}
for scope in config.scopes:


@@ -2,13 +2,9 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os.path
import re
import shutil
import llnl.util.lang
import llnl.util.tty as tty
from llnl.util.symlink import symlink
import spack.compiler
import spack.compilers.clang
@@ -119,108 +115,3 @@ def c23_flag(self):
self, "the C23 standard", "c23_flag", "< 11.0.3"
)
return "-std=c2x"
def setup_custom_environment(self, pkg, env):
"""Set the DEVELOPER_DIR environment for the Xcode toolchain.
On macOS, not all buildsystems support querying CC and CXX for the
compilers to use and instead query the Xcode toolchain for what
compiler to run. This side-steps the spack wrappers. In order to inject
spack into this setup, we need to copy (a subset of) Xcode.app and
replace the compiler executables with symlinks to the spack wrapper.
Currently, the stage is used to store the Xcode.app copies. We then set
the 'DEVELOPER_DIR' environment variable to cause the xcrun and
related tools to use this Xcode.app.
"""
super().setup_custom_environment(pkg, env)
if not pkg.use_xcode:
# if we do it for all packages, we get into big trouble with MPI:
# filter_compilers(self) will use mockup XCode compilers on macOS
# with Clang. Those point to Spack's compiler wrappers and
# consequently render MPI non-functional outside of Spack.
return
# Use special XCode versions of compiler wrappers when using XCode
# Overwrites build_environment's setting of SPACK_CC and SPACK_CXX
xcrun = spack.util.executable.Executable("xcrun")
xcode_clang = xcrun("-f", "clang", output=str).strip()
xcode_clangpp = xcrun("-f", "clang++", output=str).strip()
env.set("SPACK_CC", xcode_clang, force=True)
env.set("SPACK_CXX", xcode_clangpp, force=True)
xcode_select = spack.util.executable.Executable("xcode-select")
# Get the path of the active developer directory
real_root = xcode_select("--print-path", output=str).strip()
# The path name can be used to determine whether the full Xcode suite
# or just the command-line tools are installed
if real_root.endswith("Developer"):
# The full Xcode suite is installed
pass
else:
if real_root.endswith("CommandLineTools"):
# Only the command-line tools are installed
msg = "It appears that you have the Xcode command-line tools "
msg += "but not the full Xcode suite installed.\n"
else:
# Xcode is not installed
msg = "It appears that you do not have Xcode installed.\n"
msg += "In order to use Spack to build the requested application, "
msg += "you need the full Xcode suite. It can be installed "
msg += "through the App Store. Make sure you launch the "
msg += "application and accept the license agreement.\n"
raise OSError(msg)
real_root = os.path.dirname(os.path.dirname(real_root))
developer_root = os.path.join(
spack.stage.get_stage_root(), "xcode-select", self.name, str(self.version)
)
xcode_link = os.path.join(developer_root, "Xcode.app")
if not os.path.exists(developer_root):
tty.warn(
"Copying Xcode from %s to %s in order to add spack "
"wrappers to it. Please do not interrupt." % (real_root, developer_root)
)
# We need to make a new Xcode.app instance, but with symlinks to
# the spack wrappers for the compilers it ships. This is necessary
# because some projects insist on just asking xcrun and related
# tools where the compiler runs. These tools are very hard to trick
# as they do realpath and end up ignoring the symlinks in a
# "softer" tree of nothing but symlinks in the right places.
shutil.copytree(
real_root,
developer_root,
symlinks=True,
ignore=shutil.ignore_patterns(
"AppleTV*.platform",
"Watch*.platform",
"iPhone*.platform",
"Documentation",
"swift*",
),
)
real_dirs = ["Toolchains/XcodeDefault.xctoolchain/usr/bin", "usr/bin"]
bins = ["c++", "c89", "c99", "cc", "clang", "clang++", "cpp"]
for real_dir in real_dirs:
dev_dir = os.path.join(developer_root, "Contents", "Developer", real_dir)
for fname in os.listdir(dev_dir):
if fname in bins:
os.unlink(os.path.join(dev_dir, fname))
symlink(
os.path.join(spack.paths.build_env_path, "cc"),
os.path.join(dev_dir, fname),
)
symlink(developer_root, xcode_link)
env.set("DEVELOPER_DIR", xcode_link)


@@ -29,6 +29,90 @@
}
class CmdCall:
"""Compose a call to `cmd` for an ordered series of cmd commands/scripts"""
def __init__(self, *cmds):
if not cmds:
raise RuntimeError(
"""Attempting to run commands from CMD without specifying commands.
Please add commands to be run."""
)
self._cmds = cmds
def __call__(self):
out = subprocess.check_output(self.cmd_line, stderr=subprocess.STDOUT) # novermin
return out.decode("utf-16le", errors="replace") # novermin
@property
def cmd_line(self):
base_call = "cmd /u /c "
commands = " && ".join([x.command_str() for x in self._cmds])
# If multiple commands are being invoked by a single subshell
# they must be encapsulated by double quotes. Always double
# quote to be sure of proper handling.
# cmd will properly resolve nested double quotes as needed
#
# `set` writes out the active env to the subshell stdout,
# and in this context we are always trying to obtain env
# state, so it should always be appended
return base_call + f'"{commands} && set"'
class VarsInvocation:
def __init__(self, script):
self._script = script
def command_str(self):
return f'"{self._script}"'
@property
def script(self):
return self._script
class VCVarsInvocation(VarsInvocation):
def __init__(self, script, arch, msvc_version):
super(VCVarsInvocation, self).__init__(script)
self._arch = arch
self._msvc_version = msvc_version
@property
def sdk_ver(self):
"""Accessor for Windows SDK version property
Note: This property may not be set by
the calling context and as such this property will
return an empty string
This property will ONLY be set if the SDK package
is a dependency somewhere in the Spack DAG of the package
for which we are constructing an MSVC compiler env.
Otherwise this property should be unset to allow the VCVARS
script to use its internal heuristics to determine appropriate
SDK version
"""
if getattr(self, "_sdk_ver", None):
return self._sdk_ver + ".0"
return ""
@sdk_ver.setter
def sdk_ver(self, val):
self._sdk_ver = val
@property
def arch(self):
return self._arch
@property
def vcvars_ver(self):
return f"-vcvars_ver={self._msvc_version}"
def command_str(self):
script = super(VCVarsInvocation, self).command_str()
return f"{script} {self.arch} {self.sdk_ver} {self.vcvars_ver}"
def get_valid_fortran_pth(comp_ver):
cl_ver = str(comp_ver)
sort_fn = lambda fc_ver: StrictVersion(fc_ver)
@@ -75,22 +159,48 @@ class Msvc(Compiler):
# file based on compiler executable path.
def __init__(self, *args, **kwargs):
new_pth = [pth if pth else get_valid_fortran_pth(args[0].version) for pth in args[3]]
args[3][:] = new_pth
# This positional argument "paths" is later parsed and process by the base class
# via the call to `super` later in this method
paths = args[3]
# This positional argument "cspec" is also parsed and handled by the base class
# constructor
cspec = args[0]
new_pth = [pth if pth else get_valid_fortran_pth(cspec.version) for pth in paths]
paths[:] = new_pth
super().__init__(*args, **kwargs)
if os.getenv("ONEAPI_ROOT"):
# To use the MSVC compilers, VCVARS must be invoked.
# VCVARS is located at a fixed location, referenceable
# idiomatically by the following relative path from the
# compiler.
# Spack first finds the compilers via VSWHERE
# and stores their path, but their respective VCVARS
# file must be invoked before usage.
env_cmds = []
compiler_root = os.path.join(self.cc, "../../../../../../..")
vcvars_script_path = os.path.join(compiler_root, "Auxiliary", "Build", "vcvars64.bat")
# get current platform architecture and format for vcvars argument
arch = spack.platforms.real_host().default.lower()
arch = arch.replace("-", "_")
self.vcvars_call = VCVarsInvocation(vcvars_script_path, arch, self.msvc_version)
env_cmds.append(self.vcvars_call)
# Below is a check for a valid fortran path
# paths has c, cxx, fc, and f77 paths in that order
# paths[2] refers to the fc path and is a generic check
# for a fortran compiler
if paths[2]:
# If this is found, it sets all the vars
self.setvarsfile = os.path.join(os.getenv("ONEAPI_ROOT"), "setvars.bat")
else:
# To use the MSVC compilers, VCVARS must be invoked.
# VCVARS is located at a fixed location, referenceable
# idiomatically by the following relative path from the
# compiler.
# Spack first finds the compilers via VSWHERE
# and stores their path, but their respective VCVARS
# file must be invoked before usage.
self.setvarsfile = os.path.abspath(os.path.join(self.cc, "../../../../../../.."))
self.setvarsfile = os.path.join(self.setvarsfile, "Auxiliary", "Build", "vcvars64.bat")
oneapi_root = os.getenv("ONEAPI_ROOT")
oneapi_root_setvars = os.path.join(oneapi_root, "setvars.bat")
oneapi_version_setvars = os.path.join(
oneapi_root, "compiler", str(self.ifx_version), "env", "vars.bat"
)
# order matters here, the specific version env must be invoked first,
# otherwise it will be ignored if the root setvars sets up the oneapi
# env first
env_cmds.extend(
[VarsInvocation(oneapi_version_setvars), VarsInvocation(oneapi_root_setvars)]
)
self.msvc_compiler_environment = CmdCall(*env_cmds)
@property
def msvc_version(self):
@@ -119,16 +229,30 @@ def platform_toolset_ver(self):
"""
return self.msvc_version[:2].joined.string[:3]
@property
def cl_version(self):
"""Cl toolset version"""
def _compiler_version(self, compiler):
"""Returns version object for given compiler"""
# ignore_errors below is true here due to ifx's
# non-zero return code if it is not provided
# an input file
return Version(
re.search(
Msvc.version_regex,
spack.compiler.get_compiler_version_output(self.cc, version_arg=None),
spack.compiler.get_compiler_version_output(
compiler, version_arg=None, ignore_errors=True
),
).group(1)
)
@property
def cl_version(self):
"""Cl toolset version"""
return self._compiler_version(self.cc)
@property
def ifx_version(self):
"""Ifx compiler version associated with this version of MSVC"""
return self._compiler_version(self.fc)
@property
def vs_root(self):
# The MSVC install root is located at a fixed level above the compiler
@@ -146,27 +270,12 @@ def setup_custom_environment(self, pkg, env):
# output, sort into dictionary, use that to make the build
# environment.
# get current platform architecture and format for vcvars argument
arch = spack.platforms.real_host().default.lower()
arch = arch.replace("-", "_")
# vcvars can target specific sdk versions, force it to pick up concretized sdk
# version, if needed by spec
sdk_ver = (
""
if "win-sdk" not in pkg.spec or pkg.name == "win-sdk"
else pkg.spec["win-sdk"].version.string + ".0"
)
# provide vcvars with msvc version selected by concretization,
# not whatever it happens to pick up on the system (highest available version)
out = subprocess.check_output( # novermin
'cmd /u /c "{}" {} {} {} && set'.format(
self.setvarsfile, arch, sdk_ver, "-vcvars_ver=%s" % self.msvc_version
),
stderr=subprocess.STDOUT,
)
if sys.version_info[0] >= 3:
out = out.decode("utf-16le", errors="replace") # novermin
if pkg.name != "win-sdk" and "win-sdk" in pkg.spec:
self.vcvars_call.sdk_ver = pkg.spec["win-sdk"].version.string
out = self.msvc_compiler_environment()
int_env = dict(
(key, value)
for key, _, value in (line.partition("=") for line in out.splitlines())
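A small standalone sketch of the parsing step above, run on assumed sample output from `set`:

out = "PATH=C:\\Windows\\system32\nINCLUDE=C:\\VS\\include\nPROMPT=$P$G"
int_env = dict(
    (key, value)
    for key, _, value in (line.partition("=") for line in out.splitlines())
)
print(int_env["INCLUDE"])  # C:\VS\include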


@@ -28,6 +28,7 @@
import spack.abi
import spack.compilers
import spack.config
import spack.environment
import spack.error
import spack.platforms
@@ -37,7 +38,6 @@
import spack.tengine
import spack.util.path
import spack.variant as vt
from spack.config import config
from spack.package_prefs import PackagePrefs, is_spec_buildable, spec_externals
from spack.version import ClosedOpenRange, VersionList, ver
@@ -76,7 +76,7 @@ class Concretizer:
def __init__(self, abstract_spec=None):
if Concretizer.check_for_compiler_existence is None:
Concretizer.check_for_compiler_existence = not config.get(
Concretizer.check_for_compiler_existence = not spack.config.get(
"config:install_missing_compilers", False
)
self.abstract_spec = abstract_spec
@@ -113,7 +113,7 @@ def _valid_virtuals_and_externals(self, spec):
pref_key = lambda spec: 0 # no-op pref key
if spec.virtual:
candidates = spack.repo.path.providers_for(spec)
candidates = spack.repo.PATH.providers_for(spec)
if not candidates:
raise spack.error.UnsatisfiableProviderSpecError(candidates[0], spec)


@@ -47,6 +47,8 @@
import spack.platforms
import spack.schema
import spack.schema.bootstrap
import spack.schema.cdash
import spack.schema.ci
import spack.schema.compilers
import spack.schema.concretizer
import spack.schema.config
@@ -64,7 +66,7 @@
from spack.util.cpus import cpus_available
#: Dict from section names -> schema for that section
section_schemas = {
SECTION_SCHEMAS = {
"compilers": spack.schema.compilers.schema,
"concretizer": spack.schema.concretizer.schema,
"mirrors": spack.schema.mirrors.schema,
@@ -80,16 +82,16 @@
# Same as above, but including keys for environments
# this allows us to unify config reading between configs and environments
all_schemas = copy.deepcopy(section_schemas)
all_schemas.update({spack.schema.env.TOP_LEVEL_KEY: spack.schema.env.schema})
_ALL_SCHEMAS = copy.deepcopy(SECTION_SCHEMAS)
_ALL_SCHEMAS.update({spack.schema.env.TOP_LEVEL_KEY: spack.schema.env.schema})
#: Path to the default configuration
configuration_defaults_path = ("defaults", os.path.join(spack.paths.etc_path, "defaults"))
CONFIGURATION_DEFAULTS_PATH = ("defaults", os.path.join(spack.paths.etc_path, "defaults"))
#: Hard-coded default values for some key configuration options.
#: This ensures that Spack will still work even if config.yaml in
#: the defaults scope is removed.
config_defaults = {
CONFIG_DEFAULTS = {
"config": {
"debug": False,
"connect_timeout": 10,
@@ -105,10 +107,10 @@
#: metavar to use for commands that accept scopes
#: this is shorter and more readable than listing all choices
scopes_metavar = "{defaults,system,site,user}[/PLATFORM] or env:ENVIRONMENT"
SCOPES_METAVAR = "{defaults,system,site,user}[/PLATFORM] or env:ENVIRONMENT"
#: Base name for the (internal) overrides scope.
overrides_base_name = "overrides-"
_OVERRIDES_BASE_NAME = "overrides-"
class ConfigScope:
@@ -134,7 +136,7 @@ def get_section_filename(self, section):
def get_section(self, section):
if section not in self.sections:
path = self.get_section_filename(section)
schema = section_schemas[section]
schema = SECTION_SCHEMAS[section]
data = read_config_file(path, schema)
self.sections[section] = data
return self.sections[section]
@@ -145,7 +147,7 @@ def _write_section(self, section):
# We copy data here to avoid adding defaults at write time
validate_data = copy.deepcopy(data)
validate(validate_data, section_schemas[section])
validate(validate_data, SECTION_SCHEMAS[section])
try:
mkdirp(self.path)
@@ -317,7 +319,7 @@ def __init__(self, name, data=None):
data = InternalConfigScope._process_dict_keyname_overrides(data)
for section in data:
dsec = data[section]
validate({section: dsec}, section_schemas[section])
validate({section: dsec}, SECTION_SCHEMAS[section])
self.sections[section] = _mark_internal(syaml.syaml_dict({section: dsec}), name)
def get_section_filename(self, section):
@@ -333,7 +335,7 @@ def _write_section(self, section):
"""This only validates, as the data is already in memory."""
data = self.get_section(section)
if data is not None:
validate(data, section_schemas[section])
validate(data, SECTION_SCHEMAS[section])
self.sections[section] = _mark_internal(data, self.name)
def __repr__(self):
@@ -430,7 +432,7 @@ def file_scopes(self) -> List[ConfigScope]:
return [
s
for s in self.scopes.values()
if (type(s) == ConfigScope or type(s) == SingleFileScope)
if (type(s) is ConfigScope or type(s) is SingleFileScope)
]
def highest_precedence_scope(self) -> ConfigScope:
@@ -711,11 +713,11 @@ def override(path_or_scope, value=None):
"""
if isinstance(path_or_scope, ConfigScope):
overrides = path_or_scope
config.push_scope(path_or_scope)
CONFIG.push_scope(path_or_scope)
else:
base_name = overrides_base_name
base_name = _OVERRIDES_BASE_NAME
# Ensure the new override gets a unique scope name
current_overrides = [s.name for s in config.matching_scopes(r"^{0}".format(base_name))]
current_overrides = [s.name for s in CONFIG.matching_scopes(r"^{0}".format(base_name))]
num_overrides = len(current_overrides)
while True:
scope_name = "{0}{1}".format(base_name, num_overrides)
@@ -725,19 +727,19 @@ def override(path_or_scope, value=None):
break
overrides = InternalConfigScope(scope_name)
config.push_scope(overrides)
config.set(path_or_scope, value, scope=scope_name)
CONFIG.push_scope(overrides)
CONFIG.set(path_or_scope, value, scope=scope_name)
try:
yield config
yield CONFIG
finally:
scope = config.remove_scope(overrides.name)
scope = CONFIG.remove_scope(overrides.name)
assert scope is overrides
#: configuration scopes added on the command line
#: set by ``spack.main.main()``.
command_line_scopes: List[str] = []
COMMAND_LINE_SCOPES: List[str] = []
def _add_platform_scope(cfg, scope_type, name, path):
@@ -781,14 +783,14 @@ def create():
cfg = Configuration()
# first do the builtin, hardcoded defaults
builtin = InternalConfigScope("_builtin", config_defaults)
builtin = InternalConfigScope("_builtin", CONFIG_DEFAULTS)
cfg.push_scope(builtin)
# Builtin paths to configuration files in Spack
configuration_paths = [
# Default configuration scope is the lowest-level scope. These are
# versioned with Spack and can be overridden by systems, sites or users
configuration_defaults_path
CONFIGURATION_DEFAULTS_PATH
]
disable_local_config = "SPACK_DISABLE_LOCAL_CONFIG" in os.environ
@@ -815,7 +817,7 @@ def create():
_add_platform_scope(cfg, ConfigScope, name, path)
# add command-line scopes
_add_command_line_scopes(cfg, command_line_scopes)
_add_command_line_scopes(cfg, COMMAND_LINE_SCOPES)
# we make a special scope for spack commands so that they can
# override configuration options.
@@ -825,7 +827,7 @@ def create():
#: This is the singleton configuration instance for Spack.
config: Union[Configuration, llnl.util.lang.Singleton] = llnl.util.lang.Singleton(create)
CONFIG: Union[Configuration, llnl.util.lang.Singleton] = llnl.util.lang.Singleton(create)
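A minimal sketch of the lazy-singleton idea behind llnl.util.lang.Singleton; this is an illustrative assumption, not the library's actual implementation:

class LazySingleton:
    """Builds the wrapped object on first attribute access."""

    def __init__(self, factory):
        self.factory = factory
        self._instance = None

    @property
    def instance(self):
        if self._instance is None:
            self._instance = self.factory()
        return self._instance

    def __getattr__(self, name):
        # only reached for attributes not found on the wrapper itself
        return getattr(self.instance, name)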
def add_from_file(filename, scope=None):
@@ -838,7 +840,7 @@ def add_from_file(filename, scope=None):
# update all sections from config dict
# We have to iterate on keys to keep overrides from the file
for section in data.keys():
if section in section_schemas.keys():
if section in SECTION_SCHEMAS.keys():
# Special handling for compiler scope difference
# Has to be handled after we choose a section
if scope is None:
@@ -849,7 +851,7 @@ def add_from_file(filename, scope=None):
new = merge_yaml(existing, value)
# We cannot call config.set directly (set is a type)
config.set(section, new, scope)
CONFIG.set(section, new, scope)
def add(fullpath, scope=None):
@@ -861,6 +863,7 @@ def add(fullpath, scope=None):
has_existing_value = True
path = ""
override = False
value = syaml.load_config(components[-1])
for idx, name in enumerate(components[:-1]):
# First handle double colons in constructing path
colon = "::" if override else ":" if path else ""
@@ -881,14 +884,14 @@ def add(fullpath, scope=None):
existing = get_valid_type(path)
# construct value from this point down
value = syaml.load_config(components[-1])
for component in reversed(components[idx + 1 : -1]):
value = {component: value}
break
if override:
path += "::"
if has_existing_value:
path, _, value = fullpath.rpartition(":")
value = syaml.load_config(value)
existing = get(path, scope=scope)
# append values to lists
@@ -897,12 +900,12 @@ def add(fullpath, scope=None):
# merge value into existing
new = merge_yaml(existing, value)
config.set(path, new, scope)
CONFIG.set(path, new, scope)
def get(path, default=None, scope=None):
"""Module-level wrapper for ``Configuration.get()``."""
return config.get(path, default, scope)
return CONFIG.get(path, default, scope)
def set(path, value, scope=None):
@@ -910,26 +913,26 @@ def set(path, value, scope=None):
Accepts the path syntax described in ``get()``.
"""
return config.set(path, value, scope)
return CONFIG.set(path, value, scope)
def add_default_platform_scope(platform):
plat_name = os.path.join("defaults", platform)
plat_path = os.path.join(configuration_defaults_path[1], platform)
config.push_scope(ConfigScope(plat_name, plat_path))
plat_path = os.path.join(CONFIGURATION_DEFAULTS_PATH[1], platform)
CONFIG.push_scope(ConfigScope(plat_name, plat_path))
def scopes():
"""Convenience function to get list of configuration scopes."""
return config.scopes
return CONFIG.scopes
def _validate_section_name(section):
"""Exit if the section is not a valid section."""
if section not in section_schemas:
if section not in SECTION_SCHEMAS:
raise ConfigSectionError(
"Invalid config section: '%s'. Options are: %s"
% (section, " ".join(section_schemas.keys()))
% (section, " ".join(SECTION_SCHEMAS.keys()))
)
@@ -990,7 +993,7 @@ def read_config_file(filename, schema=None):
if data:
if not schema:
key = next(iter(data))
schema = all_schemas[key]
schema = _ALL_SCHEMAS[key]
validate(data, schema)
return data
@@ -1089,7 +1092,7 @@ def get_valid_type(path):
test_data = {component: test_data}
try:
validate(test_data, section_schemas[section])
validate(test_data, SECTION_SCHEMAS[section])
except (ConfigFormatError, AttributeError) as e:
jsonschema_error = e.validation_error
if jsonschema_error.validator == "type":
@@ -1229,11 +1232,17 @@ def they_are(t):
return copy.copy(source)
#
# Process a path argument to config.set() that may contain overrides ('::' or
# trailing ':')
#
def process_config_path(path):
"""Process a path argument to config.set() that may contain overrides ('::' or
trailing ':').
Note: quoted value path components will be processed as a single value (escaping colons);
quoted path components outside of the value are considered ill-formed and will
raise.
e.g. `this:is:a:path:'value:with:colon'` will yield:
[this, is, a, path, value:with:colon]
"""
result = []
if path.startswith(":"):
raise syaml.SpackYAMLError("Illegal leading `:' in path `{0}'".format(path), "")
@@ -1260,6 +1269,16 @@ def process_config_path(path):
front = syaml.syaml_str(front)
front.append = True
quote = "['\"]"
not_quote = "[^'\"]"
if re.match(f"^{quote}", path):
m = re.match(rf"^{quote}({not_quote}+){quote}$", path)
if not m:
raise ValueError("Quotes indicate value, but there are additional path entries")
result.append(m.group(1))
break
result.append(front)
return result
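A tiny standalone demonstration of the quote-matching logic added above, using the same regexes:

import re

quote, not_quote = "['\"]", "[^'\"]"
m = re.match(rf"^{quote}({not_quote}+){quote}$", "'value:with:colon'")
print(m.group(1))  # value:with:colon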
@@ -1278,9 +1297,9 @@ def default_modify_scope(section="config"):
If this is not 'compilers', a general (non-platform) scope is used.
"""
if section == "compilers":
return spack.config.config.highest_precedence_scope().name
return CONFIG.highest_precedence_scope().name
else:
return spack.config.config.highest_precedence_non_platform_scope().name
return CONFIG.highest_precedence_non_platform_scope().name
def default_list_scope():
@@ -1337,18 +1356,18 @@ def use_configuration(*scopes_or_paths):
Returns:
Configuration object associated with the scopes passed as arguments
"""
global config
global CONFIG
# Normalize input and construct a Configuration object
configuration = _config_from(scopes_or_paths)
config.clear_caches(), configuration.clear_caches()
CONFIG.clear_caches(), configuration.clear_caches()
saved_config, config = config, configuration
saved_config, CONFIG = CONFIG, configuration
try:
yield configuration
finally:
config = saved_config
CONFIG = saved_config
@llnl.util.lang.memoized


@@ -5,8 +5,8 @@
"""Writers for different kind of recipes and related
convenience functions.
"""
import collections
import copy
from collections import namedtuple
from typing import Optional
import spack.environment as ev
@@ -159,13 +159,13 @@ def depfile(self):
@tengine.context_property
def run(self):
"""Information related to the run image."""
Run = collections.namedtuple("Run", ["image"])
Run = namedtuple("Run", ["image"])
return Run(image=self.final_image)
@tengine.context_property
def build(self):
"""Information related to the build image."""
Build = collections.namedtuple("Build", ["image"])
Build = namedtuple("Build", ["image"])
return Build(image=self.build_image)
@tengine.context_property
@@ -176,12 +176,13 @@ def strip(self):
@tengine.context_property
def paths(self):
"""Important paths in the image"""
Paths = collections.namedtuple("Paths", ["environment", "store", "hidden_view", "view"])
Paths = namedtuple("Paths", ["environment", "store", "view_parent", "view", "former_view"])
return Paths(
environment="/opt/spack-environment",
store="/opt/software",
hidden_view="/opt/._view",
view="/opt/view",
view_parent="/opt/views",
view="/opt/views/view",
former_view="/opt/view", # /opt/view -> /opt/views/view for backward compatibility
)
@tengine.context_property
@@ -257,7 +258,7 @@ def _package_info_from(self, package_list):
update, install, clean = commands_for(os_pkg_manager)
Packages = collections.namedtuple("Packages", ["update", "install", "list", "clean"])
Packages = namedtuple("Packages", ["update", "install", "list", "clean"])
return Packages(update=update, install=install, list=package_list, clean=clean)
def _os_pkg_manager(self):
@@ -273,7 +274,7 @@ def _os_pkg_manager(self):
@tengine.context_property
def extra_instructions(self):
Extras = collections.namedtuple("Extra", ["build", "final"])
Extras = namedtuple("Extra", ["build", "final"])
extras = self.container_config.get("extra_instructions", {})
build, final = extras.get("build", None), extras.get("final", None)
return Extras(build=build, final=final)
@@ -295,7 +296,7 @@ def bootstrap(self):
context = {"bootstrap": {"image": self.bootstrap_image, "spack_checkout": command}}
bootstrap_recipe = env.get_template(template_path).render(**context)
Bootstrap = collections.namedtuple("Bootstrap", ["image", "recipe"])
Bootstrap = namedtuple("Bootstrap", ["image", "recipe"])
return Bootstrap(image=self.bootstrap_image, recipe=bootstrap_recipe)
@tengine.context_property
@@ -303,7 +304,7 @@ def render_phase(self):
render_bootstrap = bool(self.bootstrap_image)
render_build = not (self.last_phase == "bootstrap")
render_final = self.last_phase in (None, "final")
Render = collections.namedtuple("Render", ["bootstrap", "build", "final"])
Render = namedtuple("Render", ["bootstrap", "build", "final"])
return Render(bootstrap=render_bootstrap, build=render_build, final=render_final)
def __call__(self):

View File

@@ -90,7 +90,7 @@ def spec_from_entry(entry):
name=entry["name"], version=entry["version"], compiler=compiler_str, arch=arch_str
)
pkg_cls = spack.repo.path.get_pkg_class(entry["name"])
pkg_cls = spack.repo.PATH.get_pkg_class(entry["name"])
if "parameters" in entry:
variant_strs = list()

View File

@@ -21,10 +21,11 @@
import contextlib
import datetime
import os
import pathlib
import socket
import sys
import time
from typing import Dict, List, NamedTuple, Set, Type, Union
from typing import Any, Callable, Dict, Generator, List, NamedTuple, Set, Type, Union
try:
import uuid
@@ -141,22 +142,23 @@ class InstallStatuses:
def canonicalize(cls, query_arg):
if query_arg is True:
return [cls.INSTALLED]
elif query_arg is False:
if query_arg is False:
return [cls.MISSING]
elif query_arg is any:
if query_arg is any:
return [cls.INSTALLED, cls.DEPRECATED, cls.MISSING]
elif isinstance(query_arg, InstallStatus):
if isinstance(query_arg, InstallStatus):
return [query_arg]
else:
try: # Try block catches if it is not an iterable at all
if any(type(x) != InstallStatus for x in query_arg):
raise TypeError
except TypeError:
raise TypeError(
"installation query must be `any`, boolean, "
"InstallStatus, or iterable of InstallStatus"
)
return query_arg
try:
statuses = list(query_arg)
if all(isinstance(x, InstallStatus) for x in statuses):
return statuses
except TypeError:
pass
raise TypeError(
"installation query must be `any`, boolean, "
"InstallStatus, or iterable of InstallStatus"
)
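The refactor above flattens the elif chain into early returns and replaces the type-comparison loop with an isinstance check over a materialized list. A standalone sketch of the resulting control flow (the enum here is illustrative; the real InstallStatus lives in Spack):

    from enum import Enum
    from typing import List

    class InstallStatus(Enum):
        INSTALLED = "installed"
        DEPRECATED = "deprecated"
        MISSING = "missing"

    def canonicalize(query_arg) -> List[InstallStatus]:
        if query_arg is True:
            return [InstallStatus.INSTALLED]
        if query_arg is False:
            return [InstallStatus.MISSING]
        if query_arg is any:
            return [InstallStatus.INSTALLED, InstallStatus.DEPRECATED, InstallStatus.MISSING]
        if isinstance(query_arg, InstallStatus):
            return [query_arg]
        try:  # anything else must be an iterable of InstallStatus
            statuses = list(query_arg)
            if all(isinstance(x, InstallStatus) for x in statuses):
                return statuses
        except TypeError:
            pass
        raise TypeError(
            "installation query must be `any`, boolean, "
            "InstallStatus, or iterable of InstallStatus"
        )

    print(canonicalize(any))    # all three statuses
    print(canonicalize(False))  # [InstallStatus.MISSING]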
class InstallRecord:
@@ -306,15 +308,16 @@ def __reduce__(self):
"""
#: Data class to configure locks in Database objects
#:
#: Args:
#: enable (bool): whether to enable locks or not.
#: database_timeout (int or None): timeout for the database lock
#: package_timeout (int or None): timeout for the package lock
class LockConfiguration(NamedTuple):
"""Data class to configure locks in Database objects
Args:
enable: whether to enable locks or not.
database_timeout: timeout for the database lock
package_timeout: timeout for the package lock
"""
enable: bool
database_timeout: Optional[int]
package_timeout: Optional[int]
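Moving the comment-documented fields into a typed NamedTuple makes them discoverable and immutable. A minimal construction example (values are illustrative; treating None as "no timeout" is an assumption about the lock semantics):

    from spack.database import LockConfiguration

    # None is assumed to mean no timeout for that lock.
    cfg = LockConfiguration(enable=True, database_timeout=120, package_timeout=None)
    print(cfg.package_timeout)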
@@ -348,13 +351,230 @@ def lock_configuration(configuration):
)
def prefix_lock_path(root_dir: Union[str, pathlib.Path]) -> pathlib.Path:
"""Returns the path of the prefix lock file, given the root directory.
Args:
root_dir: root directory containing the database directory
"""
return pathlib.Path(root_dir) / _DB_DIRNAME / "prefix_lock"
def failures_lock_path(root_dir: Union[str, pathlib.Path]) -> pathlib.Path:
"""Returns the path of the failures lock file, given the root directory.
Args:
root_dir: root directory containing the database directory
"""
return pathlib.Path(root_dir) / _DB_DIRNAME / "prefix_failures"
class SpecLocker:
"""Manages acquiring and releasing read or write locks on concrete specs."""
def __init__(self, lock_path: Union[str, pathlib.Path], default_timeout: Optional[float]):
self.lock_path = pathlib.Path(lock_path)
self.default_timeout = default_timeout
# Maps (spec.dag_hash(), spec.name) to the corresponding lock object
self.locks: Dict[Tuple[str, str], lk.Lock] = {}
def lock(self, spec: "spack.spec.Spec", timeout: Optional[float] = None) -> lk.Lock:
"""Returns a lock on a concrete spec.
The lock is a byte range lock on the nth byte of a file.
The lock file is ``self.lock_path``.
n is the sys.maxsize-bit prefix of the DAG hash. This makes the likelihood of collision
very low AND it gives us readers-writer lock semantics with just a single lockfile, so
no cleanup is required.
"""
assert spec.concrete, "cannot lock a non-concrete spec"
timeout = timeout or self.default_timeout
key = self._lock_key(spec)
if key not in self.locks:
self.locks[key] = self.raw_lock(spec, timeout=timeout)
else:
self.locks[key].default_timeout = timeout
return self.locks[key]
def raw_lock(self, spec: "spack.spec.Spec", timeout: Optional[float] = None) -> lk.Lock:
"""Returns a raw lock for a Spec, but doesn't keep track of it."""
return lk.Lock(
str(self.lock_path),
start=spec.dag_hash_bit_prefix(bit_length(sys.maxsize)),
length=1,
default_timeout=timeout,
desc=spec.name,
)
def has_lock(self, spec: "spack.spec.Spec") -> bool:
"""Returns True if the spec is already managed by this spec locker"""
return self._lock_key(spec) in self.locks
def _lock_key(self, spec: "spack.spec.Spec") -> Tuple[str, str]:
return (spec.dag_hash(), spec.name)
@contextlib.contextmanager
def write_lock(self, spec: "spack.spec.Spec") -> Generator["SpecLocker", None, None]:
lock = self.lock(spec)
lock.acquire_write()
try:
yield self
except lk.LockError:
# This addresses the case where a nested lock attempt fails inside
# of this context manager
raise
except (Exception, KeyboardInterrupt):
lock.release_write()
raise
else:
lock.release_write()
def clear(self, spec: "spack.spec.Spec") -> Tuple[bool, Optional[lk.Lock]]:
key = self._lock_key(spec)
lock = self.locks.pop(key, None)
return bool(lock), lock
def clear_all(self, clear_fn: Optional[Callable[[lk.Lock], Any]] = None) -> None:
if clear_fn is not None:
for lock in self.locks.values():
clear_fn(lock)
self.locks.clear()
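A hedged usage sketch for SpecLocker (assumes a working Spack checkout; the lock path and package name are illustrative). write_lock provides the acquire/release-on-error semantics shown above:

    import spack.spec
    from spack.database import SpecLocker

    locker = SpecLocker("/tmp/db/prefix_lock", default_timeout=60)
    spec = spack.spec.Spec("zlib").concretized()  # locks require a concrete spec

    with locker.write_lock(spec):
        # hold the byte-range write lock on this spec's DAG-hash prefix
        ...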
class FailureTracker:
"""Tracks installation failures.
Prefix failure marking takes the form of a byte range lock on the nth
byte of a file for coordinating between concurrent parallel build
processes and a persistent file, named with the full hash and
containing the spec, in a subdirectory of the database to enable
persistence across overlapping but separate related build processes.
The failure lock file lives alongside the install DB.
``n`` is the sys.maxsize-bit prefix of the associated DAG hash to make
the likelihood of collision very low with no cleanup required.
"""
def __init__(self, root_dir: Union[str, pathlib.Path], default_timeout: Optional[float]):
#: Ensure a persistent location for dealing with parallel installation
#: failures (e.g., across near-concurrent processes).
self.dir = pathlib.Path(root_dir) / _DB_DIRNAME / "failures"
self.dir.mkdir(parents=True, exist_ok=True)
self.locker = SpecLocker(failures_lock_path(root_dir), default_timeout=default_timeout)
def clear(self, spec: "spack.spec.Spec", force: bool = False) -> None:
"""Removes any persistent and cached failure tracking for the spec.
See `mark()`.
Args:
spec: the spec whose failure indicators are being removed
force: True if the failure information should be cleared when a failure lock
exists for the file, or False if the failure should not be cleared (e.g.,
it may be associated with a concurrent build)
"""
locked = self.lock_taken(spec)
if locked and not force:
tty.msg(f"Retaining failure marking for {spec.name} due to lock")
return
if locked:
tty.warn(f"Removing failure marking despite lock for {spec.name}")
succeeded, lock = self.locker.clear(spec)
if succeeded and lock is not None:
lock.release_write()
if self.persistent_mark(spec):
path = self._path(spec)
tty.debug(f"Removing failure marking for {spec.name}")
try:
path.unlink()
except OSError as err:
tty.warn(
f"Unable to remove failure marking for {spec.name} ({str(path)}): {str(err)}"
)
def clear_all(self) -> None:
"""Force remove install failure tracking files."""
tty.debug("Releasing prefix failure locks")
self.locker.clear_all(
clear_fn=lambda x: x.release_write() if x.is_write_locked() else True
)
tty.debug("Removing prefix failure tracking files")
try:
for fail_mark in os.listdir(str(self.dir)):
try:
(self.dir / fail_mark).unlink()
except OSError as exc:
tty.warn(f"Unable to remove failure marking file {fail_mark}: {str(exc)}")
except OSError as exc:
tty.warn(f"Unable to remove failure marking files: {str(exc)}")
def mark(self, spec: "spack.spec.Spec") -> lk.Lock:
"""Marks a spec as failing to install.
Args:
spec: spec that failed to install
"""
# Dump the spec to the failure file for (manual) debugging purposes
path = self._path(spec)
path.write_text(spec.to_json())
# Also ensure a failure lock is taken to prevent cleanup removal
# of failure status information during a concurrent parallel build.
if not self.locker.has_lock(spec):
try:
mark = self.locker.lock(spec)
mark.acquire_write()
except lk.LockTimeoutError:
# Unlikely that another process failed to install at the same
# time but log it anyway.
tty.debug(f"PID {os.getpid()} failed to mark install failure for {spec.name}")
tty.warn(f"Unable to mark {spec.name} as failed.")
return self.locker.lock(spec)
def has_failed(self, spec: "spack.spec.Spec") -> bool:
"""Return True if the spec is marked as failed."""
# The failure was detected in this process.
if self.locker.has_lock(spec):
return True
# The failure was detected by a concurrent process (e.g., an srun),
# which is expected to be holding a write lock if that is the case.
if self.lock_taken(spec):
return True
# Determine if the spec may have been marked as failed by a separate
# spack build process running concurrently.
return self.persistent_mark(spec)
def lock_taken(self, spec: "spack.spec.Spec") -> bool:
"""Return True if another process has a failure lock on the spec."""
check = self.locker.raw_lock(spec)
return check.is_write_locked()
def persistent_mark(self, spec: "spack.spec.Spec") -> bool:
"""Determine if the spec has a persistent failure marking."""
return self._path(spec).exists()
def _path(self, spec: "spack.spec.Spec") -> pathlib.Path:
"""Return the path to the spec's failure file, which may not exist."""
assert spec.concrete, "concrete spec required for failure path"
return self.dir / f"{spec.name}-{spec.dag_hash()}"
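And a companion sketch for FailureTracker, showing the mark/query/clear cycle that replaces the Database methods removed further down (again assuming a Spack checkout; the root directory is illustrative):

    import spack.spec
    from spack.database import FailureTracker

    tracker = FailureTracker("/tmp/spack-store", default_timeout=10)
    spec = spack.spec.Spec("zlib").concretized()

    tracker.mark(spec)               # persist the failure and take the failure lock
    assert tracker.has_failed(spec)  # visible here and to concurrent processes
    tracker.clear(spec, force=True)  # remove the marking even while the lock is held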
class Database:
#: Per-process lock objects for each install prefix
_prefix_locks: Dict[str, lk.Lock] = {}
#: Per-process failure (lock) objects for each install prefix
_prefix_failures: Dict[str, lk.Lock] = {}
#: Fields written for each install record
record_fields: Tuple[str, ...] = DEFAULT_INSTALL_RECORD_FIELDS
@@ -392,24 +612,10 @@ def __init__(
self._verifier_path = os.path.join(self.database_directory, "index_verifier")
self._lock_path = os.path.join(self.database_directory, "lock")
# This is for other classes to use to lock prefix directories.
self.prefix_lock_path = os.path.join(self.database_directory, "prefix_lock")
# Ensure a persistent location for dealing with parallel installation
# failures (e.g., across near-concurrent processes).
self._failure_dir = os.path.join(self.database_directory, "failures")
# Support special locks for handling parallel installation failures
# of a spec.
self.prefix_fail_path = os.path.join(self.database_directory, "prefix_failures")
# Create needed directories and files
if not is_upstream and not os.path.exists(self.database_directory):
fs.mkdirp(self.database_directory)
if not is_upstream and not os.path.exists(self._failure_dir):
fs.mkdirp(self._failure_dir)
self.is_upstream = is_upstream
self.last_seen_verifier = ""
# Failed write transactions (interrupted by exceptions) will alert
@@ -423,15 +629,7 @@ def __init__(
# initialize rest of state.
self.db_lock_timeout = lock_cfg.database_timeout
self.package_lock_timeout = lock_cfg.package_timeout
tty.debug("DATABASE LOCK TIMEOUT: {0}s".format(str(self.db_lock_timeout)))
timeout_format_str = (
"{0}s".format(str(self.package_lock_timeout))
if self.package_lock_timeout
else "No timeout"
)
tty.debug("PACKAGE LOCK TIMEOUT: {0}".format(str(timeout_format_str)))
self.lock: Union[ForbiddenLock, lk.Lock]
if self.is_upstream:
@@ -471,212 +669,6 @@ def read_transaction(self):
"""Get a read lock context manager for use in a `with` block."""
return self._read_transaction_impl(self.lock, acquire=self._read)
def _failed_spec_path(self, spec):
"""Return the path to the spec's failure file, which may not exist."""
if not spec.concrete:
raise ValueError("Concrete spec required for failure path for {0}".format(spec.name))
return os.path.join(self._failure_dir, "{0}-{1}".format(spec.name, spec.dag_hash()))
def clear_all_failures(self) -> None:
"""Force remove install failure tracking files."""
tty.debug("Releasing prefix failure locks")
for pkg_id in list(self._prefix_failures.keys()):
lock = self._prefix_failures.pop(pkg_id, None)
if lock:
lock.release_write()
# Remove all failure markings (aka files)
tty.debug("Removing prefix failure tracking files")
for fail_mark in os.listdir(self._failure_dir):
try:
os.remove(os.path.join(self._failure_dir, fail_mark))
except OSError as exc:
tty.warn(
"Unable to remove failure marking file {0}: {1}".format(fail_mark, str(exc))
)
def clear_failure(self, spec: "spack.spec.Spec", force: bool = False) -> None:
"""
Remove any persistent and cached failure tracking for the spec.
see `mark_failed()`.
Args:
spec: the spec whose failure indicators are being removed
force: True if the failure information should be cleared when a prefix failure
lock exists for the file, or False if the failure should not be cleared (e.g.,
it may be associated with a concurrent build)
"""
failure_locked = self.prefix_failure_locked(spec)
if failure_locked and not force:
tty.msg("Retaining failure marking for {0} due to lock".format(spec.name))
return
if failure_locked:
tty.warn("Removing failure marking despite lock for {0}".format(spec.name))
lock = self._prefix_failures.pop(spec.prefix, None)
if lock:
lock.release_write()
if self.prefix_failure_marked(spec):
try:
path = self._failed_spec_path(spec)
tty.debug("Removing failure marking for {0}".format(spec.name))
os.remove(path)
except OSError as err:
tty.warn(
"Unable to remove failure marking for {0} ({1}): {2}".format(
spec.name, path, str(err)
)
)
def mark_failed(self, spec: "spack.spec.Spec") -> lk.Lock:
"""
Mark a spec as failing to install.
Prefix failure marking takes the form of a byte range lock on the nth
byte of a file for coordinating between concurrent parallel build
processes and a persistent file, named with the full hash and
containing the spec, in a subdirectory of the database to enable
persistence across overlapping but separate related build processes.
The failure lock file, ``spack.store.STORE.db.prefix_failures``, lives
alongside the install DB. ``n`` is the sys.maxsize-bit prefix of the
associated DAG hash to make the likelihood of collision very low with
no cleanup required.
"""
# Dump the spec to the failure file for (manual) debugging purposes
path = self._failed_spec_path(spec)
with open(path, "w") as f:
spec.to_json(f)
# Also ensure a failure lock is taken to prevent cleanup removal
# of failure status information during a concurrent parallel build.
err = "Unable to mark {0.name} as failed."
prefix = spec.prefix
if prefix not in self._prefix_failures:
mark = lk.Lock(
self.prefix_fail_path,
start=spec.dag_hash_bit_prefix(bit_length(sys.maxsize)),
length=1,
default_timeout=self.package_lock_timeout,
desc=spec.name,
)
try:
mark.acquire_write()
except lk.LockTimeoutError:
# Unlikely that another process failed to install at the same
# time but log it anyway.
tty.debug(
"PID {0} failed to mark install failure for {1}".format(os.getpid(), spec.name)
)
tty.warn(err.format(spec))
# Whether we or another process marked it as a failure, track it
# as such locally.
self._prefix_failures[prefix] = mark
return self._prefix_failures[prefix]
def prefix_failed(self, spec: "spack.spec.Spec") -> bool:
"""Return True if the prefix (installation) is marked as failed."""
# The failure was detected in this process.
if spec.prefix in self._prefix_failures:
return True
# The failure was detected by a concurrent process (e.g., an srun),
# which is expected to be holding a write lock if that is the case.
if self.prefix_failure_locked(spec):
return True
# Determine if the spec may have been marked as failed by a separate
# spack build process running concurrently.
return self.prefix_failure_marked(spec)
def prefix_failure_locked(self, spec: "spack.spec.Spec") -> bool:
"""Return True if a process has a failure lock on the spec."""
check = lk.Lock(
self.prefix_fail_path,
start=spec.dag_hash_bit_prefix(bit_length(sys.maxsize)),
length=1,
default_timeout=self.package_lock_timeout,
desc=spec.name,
)
return check.is_write_locked()
def prefix_failure_marked(self, spec: "spack.spec.Spec") -> bool:
"""Determine if the spec has a persistent failure marking."""
return os.path.exists(self._failed_spec_path(spec))
def prefix_lock(self, spec: "spack.spec.Spec", timeout: Optional[float] = None) -> lk.Lock:
"""Get a lock on a particular spec's installation directory.
NOTE: The installation directory **does not** need to exist.
Prefix lock is a byte range lock on the nth byte of a file.
The lock file is ``spack.store.STORE.db.prefix_lock`` -- the DB
tells us what to call it and it lives alongside the install DB.
n is the sys.maxsize-bit prefix of the DAG hash. This makes
the likelihood of collision very low AND it gives us
readers-writer lock semantics with just a single lockfile, so no
cleanup is required.
"""
timeout = timeout or self.package_lock_timeout
prefix = spec.prefix
if prefix not in self._prefix_locks:
self._prefix_locks[prefix] = lk.Lock(
self.prefix_lock_path,
start=spec.dag_hash_bit_prefix(bit_length(sys.maxsize)),
length=1,
default_timeout=timeout,
desc=spec.name,
)
elif timeout != self._prefix_locks[prefix].default_timeout:
self._prefix_locks[prefix].default_timeout = timeout
return self._prefix_locks[prefix]
@contextlib.contextmanager
def prefix_read_lock(self, spec):
prefix_lock = self.prefix_lock(spec)
prefix_lock.acquire_read()
try:
yield self
except lk.LockError:
# This addresses the case where a nested lock attempt fails inside
# of this context manager
raise
except (Exception, KeyboardInterrupt):
prefix_lock.release_read()
raise
else:
prefix_lock.release_read()
@contextlib.contextmanager
def prefix_write_lock(self, spec):
prefix_lock = self.prefix_lock(spec)
prefix_lock.acquire_write()
try:
yield self
except lk.LockError:
# This addresses the case where a nested lock attempt fails inside
# of this context manager
raise
except (Exception, KeyboardInterrupt):
prefix_lock.release_write()
raise
else:
prefix_lock.release_write()
def _write_to_file(self, stream):
"""Write out the database in JSON format to the stream passed
as argument.

View File

@@ -33,7 +33,7 @@ class OpenMpi(Package):
import functools
import os.path
import re
from typing import List, Optional, Set, Union
from typing import Any, Callable, List, Optional, Set, Tuple, Union
import llnl.util.lang
import llnl.util.tty.color
@@ -42,6 +42,7 @@ class OpenMpi(Package):
import spack.patch
import spack.spec
import spack.url
import spack.util.crypto
import spack.variant
from spack.dependency import Dependency, canonical_deptype, default_deptype
from spack.fetch_strategy import from_kwargs
@@ -407,10 +408,7 @@ def version(
def _execute_version(pkg, ver, **kwargs):
if (
any(
s in kwargs
for s in ("sha256", "sha384", "sha512", "md5", "sha1", "sha224", "checksum")
)
(any(s in kwargs for s in spack.util.crypto.hashes) or "checksum" in kwargs)
and hasattr(pkg, "has_code")
and not pkg.has_code
):
@@ -520,7 +518,8 @@ def _execute_conflicts(pkg):
# Save in a list the conflicts and the associated custom messages
when_spec_list = pkg.conflicts.setdefault(conflict_spec, [])
when_spec_list.append((when_spec, msg))
msg_with_name = f"{pkg.name}: {msg}" if msg is not None else msg
when_spec_list.append((when_spec, msg_with_name))
return _execute_conflicts
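The effect of the prefixing is easiest to see from a package's point of view. In the hypothetical package below, the stored conflict message becomes "example: MPI backends conflict", which disambiguates messages aggregated across packages:

    from spack.package import *

    class Example(Package):
        """Hypothetical package illustrating the prefixed conflict message."""

        variant("mpich", default=False, description="Use MPICH")
        variant("openmpi", default=False, description="Use Open MPI")

        conflicts("+mpich", when="+openmpi", msg="MPI backends conflict")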
@@ -663,39 +662,35 @@ def _execute_patch(pkg_or_dep):
@directive("variants")
def variant(
name,
default=None,
description="",
values=None,
multi=None,
validator=None,
when=None,
sticky=False,
name: str,
default: Optional[Any] = None,
description: str = "",
values: Optional[Union[collections.abc.Sequence, Callable[[Any], bool]]] = None,
multi: Optional[bool] = None,
validator: Optional[Callable[[str, str, Tuple[Any, ...]], None]] = None,
when: Optional[Union[str, bool]] = None,
sticky: bool = False,
):
"""Define a variant for the package. Packager can specify a default
value as well as a text description.
"""Define a variant for the package.
Packager can specify a default value as well as a text description.
Args:
name (str): name of the variant
default (str or bool): default value for the variant, if not
specified otherwise the default will be False for a boolean
variant and 'nothing' for a multi-valued variant
description (str): description of the purpose of the variant
values (tuple or typing.Callable): either a tuple of strings containing the
allowed values, or a callable accepting one value and returning
True if it is valid
multi (bool): if False only one value per spec is allowed for
this variant
validator (typing.Callable): optional group validator to enforce additional
logic. It receives the package name, the variant name and a tuple
of values and should raise an instance of SpackError if the group
doesn't meet the additional constraints
when (spack.spec.Spec, bool): optional condition on which the
variant applies
sticky (bool): the variant should not be changed by the concretizer to
find a valid concrete spec.
name: Name of the variant
default: Default value for the variant; if not specified otherwise, the default will be
False for a boolean variant and 'nothing' for a multi-valued variant
description: Description of the purpose of the variant
values: Either a tuple of strings containing the allowed values, or a callable accepting
one value and returning True if it is valid
multi: If False only one value per spec is allowed for this variant
validator: Optional group validator to enforce additional logic. It receives the package
name, the variant name and a tuple of values and should raise an instance of SpackError
if the group doesn't meet the additional constraints
when: Optional condition on which the variant applies
sticky: The variant should not be changed by the concretizer to find a valid concrete spec
Raises:
DirectiveError: if arguments passed to the directive are invalid
DirectiveError: If arguments passed to the directive are invalid
"""
def format_error(msg, pkg):
@@ -763,7 +758,7 @@ def _execute_variant(pkg):
when_spec = make_when_spec(when)
when_specs = [when_spec]
if not re.match(spack.spec.identifier_re, name):
if not re.match(spack.spec.IDENTIFIER_RE, name):
directive = "variant"
msg = "Invalid variant name in {0}: '{1}'"
raise DirectiveError(directive, msg.format(pkg.name, name))
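For reference, a hypothetical package exercising the now-annotated parameters; call sites are unchanged by the typing:

    from spack.package import *

    class Demo(Package):
        """Hypothetical package using the annotated variant parameters."""

        # values may be a tuple of allowed strings...
        variant("build_type", values=("Debug", "Release"), default="Release",
                multi=False, description="CMake build type")

        # ...or a callable validating a single value, matching the new annotation.
        variant("jobs", values=lambda x: str(x).isdigit(), default="4",
                description="Parallel build jobs")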
@@ -900,7 +895,8 @@ def _execute_requires(pkg):
# Save in a list the requirements and the associated custom messages
when_spec_list = pkg.requirements.setdefault(tuple(requirement_specs), [])
when_spec_list.append((when_spec, policy, msg))
msg_with_name = f"{pkg.name}: {msg}" if msg is not None else msg
when_spec_list.append((when_spec, policy, msg_with_name))
return _execute_requires

View File

@@ -11,6 +11,7 @@
import shutil
import sys
from contextlib import contextmanager
from pathlib import Path
import llnl.util.filesystem as fs
import llnl.util.tty as tty
@@ -104,7 +105,7 @@ def relative_path_for_spec(self, spec):
projection = spack.projections.get_projection(self.projections, spec)
path = spec.format(projection)
return path
return str(Path(path))
def write_spec(self, spec, path):
"""Write a spec out to a file."""

View File

@@ -16,7 +16,7 @@
import urllib.parse
import urllib.request
import warnings
from typing import Any, Dict, List, Optional, Set, Tuple, Union
from typing import Any, Dict, Iterable, List, Optional, Set, Tuple, Union
import llnl.util.filesystem as fs
import llnl.util.tty as tty
@@ -203,7 +203,7 @@ def activate(env, use_env_repo=False):
env.store_token = spack.store.reinitialize()
if use_env_repo:
spack.repo.path.put_first(env.repo)
spack.repo.PATH.put_first(env.repo)
tty.debug("Using environment '%s'" % env.name)
@@ -227,7 +227,7 @@ def deactivate():
# use _repo so we only remove if a repo was actually constructed
if _active_environment._repo:
spack.repo.path.remove(_active_environment._repo)
spack.repo.PATH.remove(_active_environment._repo)
tty.debug("Deactivated environment '%s'" % _active_environment.name)
@@ -1084,8 +1084,8 @@ def add(self, user_spec, list_name=user_speclist_name):
if list_name == user_speclist_name:
if spec.anonymous:
raise SpackEnvironmentError("cannot add anonymous specs to an environment")
elif not spack.repo.path.exists(spec.name) and not spec.abstract_hash:
virtuals = spack.repo.path.provider_index.providers.keys()
elif not spack.repo.PATH.exists(spec.name) and not spec.abstract_hash:
virtuals = spack.repo.PATH.provider_index.providers.keys()
if spec.name not in virtuals:
msg = "no such package: %s" % spec.name
raise SpackEnvironmentError(msg)
@@ -1262,7 +1262,7 @@ def develop(self, spec: Spec, path: str, clone: bool = False) -> bool:
# better if we can create the `source_path` directly into its final
# destination.
abspath = spack.util.path.canonicalize_path(path, default_wd=self.path)
pkg_cls = spack.repo.path.get_pkg_class(spec.name)
pkg_cls = spack.repo.PATH.get_pkg_class(spec.name)
# We construct a package class ourselves, rather than asking for
# Spec.package, since Spec only allows this when it is concrete
package = pkg_cls(spec)
@@ -1490,7 +1490,7 @@ def _concretize_separately(self, tests=False):
# for a write lock. We do this indirectly by retrieving the
# provider index, which should in turn trigger the update of
# all the indexes if there's any need for that.
_ = spack.repo.path.provider_index
_ = spack.repo.PATH.provider_index
# Ensure we have compilers in compilers.yaml to avoid that
# processes try to write the config file in parallel
@@ -1921,16 +1921,17 @@ def install_specs(self, specs=None, **install_args):
"Could not install log links for {0}: {1}".format(spec.name, str(e))
)
def all_specs(self):
"""Return all specs, even those a user spec would shadow."""
roots = [self.specs_by_hash[h] for h in self.concretized_order]
specs = [s for s in traverse.traverse_nodes(roots, key=traverse.by_dag_hash)]
specs.sort()
return specs
def all_specs_generator(self) -> Iterable[Spec]:
"""Returns a generator for all concrete specs"""
return traverse.traverse_nodes(self.concrete_roots(), key=traverse.by_dag_hash)
def all_specs(self) -> List[Spec]:
"""Returns a list of all concrete specs"""
return list(self.all_specs_generator())
def all_hashes(self):
"""Return hashes of all specs."""
return [s.dag_hash() for s in self.all_specs()]
return [s.dag_hash() for s in self.all_specs_generator()]
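A hedged sketch of the intended use (env is assumed to be an active spack.environment.Environment): iterate lazily where no list is needed, or materialize one with all_specs:

    for spec in env.all_specs_generator():  # lazy traversal of all concrete specs
        print(spec.dag_hash()[:7], spec.name)

    everything = env.all_specs()            # same traversal, materialized as a list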
def roots(self):
"""Specs explicitly requested by the user *in this environment*.
@@ -1993,14 +1994,10 @@ def get_one_by_hash(self, dag_hash):
def all_matching_specs(self, *specs: spack.spec.Spec) -> List[Spec]:
"""Returns all concretized specs in the environment satisfying any of the input specs"""
# Look up abstract hashes ahead of time, to avoid O(n^2) traversal.
specs = [s.lookup_hash() for s in specs]
# Avoid double lookup by directly calling _satisfies.
return [
s
for s in traverse.traverse_nodes(self.concrete_roots(), key=traverse.by_dag_hash)
if any(s._satisfies(t) for t in specs)
if any(s.satisfies(t) for t in specs)
]
@spack.repo.autospec
@@ -2061,7 +2058,7 @@ def matching_spec(self, spec):
# If multiple root specs match, it is assumed that the abstract
# spec will most-succinctly summarize the difference between them
# (and the user can enter one of these to disambiguate)
fmt_str = "{hash:7} " + spack.spec.default_format
fmt_str = "{hash:7} " + spack.spec.DEFAULT_FORMAT
color = clr.get_color_when()
match_strings = [
f"Root spec {abstract.format(color=color)}\n {concrete.format(fmt_str, color=color)}"
@@ -2279,7 +2276,7 @@ def _add_to_environment_repository(self, spec_node: Spec) -> None:
repository = spack.repo.create_or_construct(repository_dir, spec_node.namespace)
pkg_dir = repository.dirname_for_package_name(spec_node.name)
fs.mkdirp(pkg_dir)
spack.repo.path.dump_provenance(spec_node, pkg_dir)
spack.repo.PATH.dump_provenance(spec_node, pkg_dir)
def manifest_uptodate_or_warn(self):
"""Emits a warning if the manifest file is not up-to-date."""
@@ -2369,7 +2366,7 @@ def display_specs(concretized_specs):
def _tree_to_display(spec):
return spec.tree(
recurse_dependencies=True,
format=spack.spec.display_format,
format=spack.spec.DISPLAY_FORMAT,
status_fn=spack.spec.Spec.install_status,
hashlen=7,
hashes=True,
@@ -2447,13 +2444,13 @@ def make_repo_path(root):
def prepare_config_scope(env):
"""Add env's scope to the global configuration search path."""
for scope in env.config_scopes():
spack.config.config.push_scope(scope)
spack.config.CONFIG.push_scope(scope)
def deactivate_config_scope(env):
"""Remove any scopes from env from the global config path."""
for scope in env.config_scopes():
spack.config.config.remove_scope(scope.name)
spack.config.CONFIG.remove_scope(scope.name)
def manifest_file(env_name_or_dir):
@@ -2667,6 +2664,26 @@ def __init__(self, manifest_dir: Union[pathlib.Path, str]) -> None:
self.yaml_content = with_defaults_added
self.changed = False
def _all_matches(self, user_spec: str) -> List[str]:
"""Maps the input string to the first equivalent user spec in the manifest,
and returns it.
Args:
user_spec: user spec to be found
Raises:
ValueError: if no equivalent match is found
"""
result = []
for yaml_spec_str in self.pristine_configuration["specs"]:
if Spec(yaml_spec_str) == Spec(user_spec):
result.append(yaml_spec_str)
if not result:
raise ValueError(f"cannot find a spec equivalent to {user_spec}")
return result
def add_user_spec(self, user_spec: str) -> None:
"""Appends the user spec passed as input to the list of root specs.
@@ -2687,8 +2704,9 @@ def remove_user_spec(self, user_spec: str) -> None:
SpackEnvironmentError: when the user spec is not in the list
"""
try:
self.pristine_configuration["specs"].remove(user_spec)
self.configuration["specs"].remove(user_spec)
for key in self._all_matches(user_spec):
self.pristine_configuration["specs"].remove(key)
self.configuration["specs"].remove(key)
except ValueError as e:
msg = f"cannot remove {user_spec} from {self}, no such spec exists"
raise SpackEnvironmentError(msg) from e
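A hedged illustration of why equivalence matching matters: two user strings that differ textually can parse to equal specs, and remove_user_spec now catches both (assuming both strings parse under Spack's spec grammar, where variant and version order is irrelevant):

    from spack.spec import Spec

    # Equal after parsing even though the strings differ; string-based
    # removal would have missed one of them.
    assert Spec("zlib@1.2.13 +shared") == Spec("zlib +shared @1.2.13")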

View File

@@ -43,7 +43,7 @@ def activate_header(env, shell, prompt=None):
# TODO: despacktivate
# TODO: prompt
elif shell == "pwsh":
cmds += "$Env:SPACK_ENV=%s\n" % env.path
cmds += "$Env:SPACK_ENV='%s'\n" % env.path
else:
if "color" in os.getenv("TERM", "") and prompt:
prompt = colorize("@G{%s}" % prompt, color=True, enclose=True)
@@ -82,7 +82,7 @@ def deactivate_header(shell):
# TODO: despacktivate
# TODO: prompt
elif shell == "pwsh":
cmds += "Remove-Item Env:SPACK_ENV"
cmds += "Set-Item -Path Env:SPACK_ENV\n"
else:
cmds += "if [ ! -z ${SPACK_ENV+x} ]; then\n"
cmds += "unset SPACK_ENV; export SPACK_ENV;\n"

View File

@@ -45,6 +45,7 @@
import spack.util.url as url_util
import spack.util.web as web_util
import spack.version
import spack.version.git_ref_lookup
from spack.util.compression import decompressor_for, extension_from_path
from spack.util.executable import CommandNotFoundError, which
from spack.util.string import comma_and, quote
@@ -1540,7 +1541,7 @@ def for_package_version(pkg, version=None):
f"Cannot fetch git version for {pkg.name}. Package has no 'git' attribute"
)
# Populate the version with comparisons to other commits
version.attach_git_lookup_from_package(pkg.name)
version.attach_lookup(spack.version.git_ref_lookup.GitRefLookup(pkg.name))
# For GitVersion, we have no way to determine whether a ref is a branch or tag
# Fortunately, we handle branches and tags identically, except tags are

View File

@@ -590,9 +590,9 @@ def print_status(self, *specs, **kwargs):
print()
header = "%s{%s} / %s{%s}" % (
spack.spec.architecture_color,
spack.spec.ARCHITECTURE_COLOR,
architecture,
spack.spec.compiler_color,
spack.spec.COMPILER_COLOR,
compiler,
)
tty.hline(colorize(header), char="-")

View File

@@ -535,7 +535,7 @@ def edge_entry(self, edge):
def _static_edges(specs, deptype):
for spec in specs:
pkg_cls = spack.repo.path.get_pkg_class(spec.name)
pkg_cls = spack.repo.PATH.get_pkg_class(spec.name)
possible = pkg_cls.possible_dependencies(expand_virtuals=True, deptype=deptype)
for parent_name, dependencies in possible.items():

View File

@@ -49,7 +49,7 @@ def __call__(self, spec):
def _content_hash_override(spec):
pkg_cls = spack.repo.path.get_pkg_class(spec.name)
pkg_cls = spack.repo.PATH.get_pkg_class(spec.name)
pkg = pkg_cls(spec)
return pkg.content_hash()

View File

@@ -1147,12 +1147,12 @@ def write_test_result(self, spec, result):
def write_reproducibility_data(self):
for spec in self.specs:
repo_cache_path = self.stage.repo.join(spec.name)
spack.repo.path.dump_provenance(spec, repo_cache_path)
spack.repo.PATH.dump_provenance(spec, repo_cache_path)
for vspec in spec.package.virtuals_provided:
repo_cache_path = self.stage.repo.join(vspec.name)
if not os.path.exists(repo_cache_path):
try:
spack.repo.path.dump_provenance(vspec, repo_cache_path)
spack.repo.PATH.dump_provenance(vspec, repo_cache_path)
except spack.repo.UnknownPackageError:
pass # not all virtuals have package files

View File

@@ -519,13 +519,6 @@ def _try_install_from_binary_cache(
)
def clear_failures() -> None:
"""
Remove all failure tracking markers for the Spack instance.
"""
spack.store.STORE.db.clear_all_failures()
def combine_phase_logs(phase_log_files: List[str], log_path: str) -> None:
"""
Read set or list of logs and combine them into one file.
@@ -597,9 +590,11 @@ def dump_packages(spec: "spack.spec.Spec", path: str) -> None:
# Get the location of the package in the dest repo.
dest_pkg_dir = repo.dirname_for_package_name(node.name)
if node is spec:
spack.repo.path.dump_provenance(node, dest_pkg_dir)
spack.repo.PATH.dump_provenance(node, dest_pkg_dir)
elif source_pkg_dir:
fs.install_tree(source_pkg_dir, dest_pkg_dir)
fs.install_tree(
source_pkg_dir, dest_pkg_dir, allow_broken_symlinks=(sys.platform != "win32")
)
def get_dependent_ids(spec: "spack.spec.Spec") -> List[str]:
@@ -1126,15 +1121,13 @@ class PackageInstaller:
instance.
"""
def __init__(self, installs: List[Tuple["spack.package_base.PackageBase", dict]] = []):
def __init__(self, installs: List[Tuple["spack.package_base.PackageBase", dict]] = []) -> None:
"""Initialize the installer.
Args:
installs (list): list of tuples, where each
tuple consists of a package (PackageBase) and its associated
install arguments (dict)
Return:
PackageInstaller: instance
"""
# List of build requests
self.build_requests = [BuildRequest(pkg, install_args) for pkg, install_args in installs]
@@ -1287,7 +1280,7 @@ def _check_deps_status(self, request: BuildRequest) -> None:
dep_id = package_id(dep_pkg)
# Check for failure since a prefix lock is not required
if spack.store.STORE.db.prefix_failed(dep):
if spack.store.STORE.failure_tracker.has_failed(dep):
action = "'spack install' the dependency"
msg = "{0} is marked as an install failure: {1}".format(dep_id, action)
raise InstallError(err.format(request.pkg_id, msg), pkg=dep_pkg)
@@ -1325,7 +1318,6 @@ def _prepare_for_install(self, task: BuildTask) -> None:
"""
Check the database and leftover installation directories/files and
prepare for a new install attempt for an uninstalled package.
Preparation includes cleaning up installation and stage directories
and ensuring the database is up-to-date.
@@ -1502,7 +1494,7 @@ def _ensure_locked(
if lock is None:
tty.debug(msg.format("Acquiring", desc, pkg_id, pretty_seconds(timeout or 0)))
op = "acquire"
lock = spack.store.STORE.db.prefix_lock(pkg.spec, timeout)
lock = spack.store.STORE.prefix_locker.lock(pkg.spec, timeout)
if timeout != lock.default_timeout:
tty.warn(
"Expected prefix lock timeout {0}, not {1}".format(
@@ -1627,12 +1619,12 @@ def _add_tasks(self, request: BuildRequest, all_deps):
# Clear any persistent failure markings _unless_ they are
# associated with another process in this parallel build
# of the spec.
spack.store.STORE.db.clear_failure(dep, force=False)
spack.store.STORE.failure_tracker.clear(dep, force=False)
install_package = request.install_args.get("install_package")
if install_package and request.pkg_id not in self.build_tasks:
# Be sure to clear any previous failure
spack.store.STORE.db.clear_failure(request.spec, force=True)
spack.store.STORE.failure_tracker.clear(request.spec, force=True)
# If not installing dependencies, then determine their
# installation status before proceeding
@@ -1888,7 +1880,7 @@ def _update_failed(
err = "" if exc is None else ": {0}".format(str(exc))
tty.debug("Flagging {0} as failed{1}".format(pkg_id, err))
if mark:
self.failed[pkg_id] = spack.store.STORE.db.mark_failed(task.pkg.spec)
self.failed[pkg_id] = spack.store.STORE.failure_tracker.mark(task.pkg.spec)
else:
self.failed[pkg_id] = None
task.status = STATUS_FAILED
@@ -2074,7 +2066,7 @@ def install(self) -> None:
# Flag a failed spec. Do not need an (install) prefix lock since
# assume using a separate (failed) prefix lock file.
if pkg_id in self.failed or spack.store.STORE.db.prefix_failed(spec):
if pkg_id in self.failed or spack.store.STORE.failure_tracker.has_failed(spec):
term_status.clear()
tty.warn("{0} failed to install".format(pkg_id))
self._update_failed(task)
@@ -2403,7 +2395,9 @@ def _install_source(self) -> None:
src_target = os.path.join(pkg.spec.prefix, "share", pkg.name, "src")
tty.debug("{0} Copying source to {1}".format(self.pre, src_target))
fs.install_tree(pkg.stage.source_path, src_target)
fs.install_tree(
pkg.stage.source_path, src_target, allow_broken_symlinks=(sys.platform != "win32")
)
def _real_install(self) -> None:
import spack.builder

Some files were not shown because too many files have changed in this diff.