Compare commits


843 Commits

Author SHA1 Message Date
Cameron Smith
93b14e6c19 patch for config update problem
spack #26169
2022-10-18 13:27:59 -04:00
snehring
b44c83429c vsearch: add v2.22.1 (#33327) 2022-10-18 08:59:38 -06:00
snehring
8e8d6e5adb mothur: add v1.48.0 and variants (#33326) 2022-10-18 08:54:23 -06:00
dependabot[bot]
b21e54dffa build(deps): bump docker/setup-buildx-action from 2.1.0 to 2.2.0 (#33384)
Bumps [docker/setup-buildx-action](https://github.com/docker/setup-buildx-action) from 2.1.0 to 2.2.0.
- [Release notes](https://github.com/docker/setup-buildx-action/releases)
- [Commits](95cb08cb26...c74574e6c8)

---
updated-dependencies:
- dependency-name: docker/setup-buildx-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-10-18 10:22:30 +02:00
Massimiliano Culpo
fe2656f182 intel-oneapi-compilers: fix Python 2.7 compliance (#33383) 2022-10-18 07:19:25 +02:00
Adam J. Stewart
0b014ff9cd py-fiona: add v1.8.22 (#33372) 2022-10-17 22:33:50 -06:00
eugeneswalker
dd003f66a8 e4s ci stack: add trilinos +rocm (#31601) 2022-10-17 21:04:31 +00:00
Michael Kuhn
9d11b96e4b lz4: add 1.9.4 (#33334) 2022-10-17 14:42:03 -06:00
Brian Van Essen
47bfc60845 Bugfix HIP and aluminum rocm build (#33344)
* Fixed two bugs in the HIP package recipe.  The first is that the
HIP_PATH was being set to the actual spec, and not the spec prefix.

The second bug is that HIP is expected to be in /opt/rocm-x.y.z/hip
but its libraries can exist at both /opt/rocm-x.y.z/hip/lib and
/opt/rocm-x.y.z/lib.  This means that the external detection logic may
find it in either location, and it turns out that some modules only
expose one of the two.  Logic is added to ensure that the internal
HIP_PATH and associated ROCM_PATH are correctly set in both cases.

* Added support for Aluminum to use the libfabric plugin with either
RCCL or NCCL.
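A minimal shell sketch of the fallback described above (the prefix and version are illustrative; this is not the package's actual code):
```
# pick HIP_PATH based on which layout this ROCm install exposes
rocm=/opt/rocm-5.2.0                # example prefix
if [ -d "$rocm/hip/lib" ]; then
    export HIP_PATH="$rocm/hip"     # libraries under <rocm>/hip/lib
else
    export HIP_PATH="$rocm"         # libraries directly under <rocm>/lib
fi
export ROCM_PATH="$rocm"
```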
2022-10-17 13:07:27 -07:00
Mosè Giordano
9b87b4c8cd grid: reference fftw-api instead of fftw (#33374)
This makes it possible to compile with, e.g., `cray-fftw`, not just `fftw`.
2022-10-17 14:06:15 -06:00
snehring
c0361168a5 New packages: libbigwig, methyldackel (#33273)
* libbigwig: adding new package libbigwig
* methyldackel: adding new package methyldackel
* libbigwig: tighten up curl variant
2022-10-17 12:40:01 -07:00
Robert Underwood
da6aeaad44 Initial contribution of LibPressio ecosystem (#32630)
* Add libpressio and dependencies; some of these packages are
  maintained as forks of the original repositories and in those
  cases the docstring mentions this.
* Add optional dependency in adios2 on libpressio
* cub package: set CUB_DIR environment variable for dependent
  installations
* Clear R_HOME/R_ENVIRON before Spack installation (avoid sources
  outside of Spack from affecting the installation in Spack)
* Rename dlib to dorian3d-dlib and update dependents; add new dlib
  implementation. Pending an official policy on how to handle
  packages with short names, reviewer unilaterally decided that
  the rename was acceptable given that the new Spack dlib package
  is referenced more widely (by orders of magnitude) than the
  original

Co-authored-by: Samuel Li <shaomeng@users.noreply.github.com>
2022-10-17 13:30:54 -06:00
Luke Diorio-Toth
fb090a69f4 py-xopen: version bump to 1.6.0 (#33231)
* version bump to 1.6.0

* added py-isal, updated URL
2022-10-17 19:16:09 +00:00
Stephen Sachs
2e55812417 Classic Intel compilers do not support gcc-toolchain (#33281)
* Classic Intel compilers do not support gcc-toolchain

This fix removes `--gcc-toolchain=` from the `.cfg` files for the classic Intel
compilers. AFAIK this option is only supported for Clang-based compilers.

This led to an issue when installing cmake. Reproducer:
```
spack install cmake@3.24.2%intel@2021.7.0~doc+ncurses+ownlibs~qt
build_type=Release arch=linux-amzn2-skylake_avx512
```

Tagging maintainer @rscohn2

* Add `-gcc-name` for icc

.. and `-gxx-name` for icpc.

AFAIK this is used for modern C++ support, so we can ignore `ifort`.
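For illustration, the resulting config files would carry the replacement flags; the file locations and the GCC path here are assumptions, not verified output:
```
$ cat "$(dirname "$(command -v icc)")/icc.cfg"    # hypothetical location
-gcc-name=/example/prefix/gcc-12.2.0/bin/gcc
$ cat "$(dirname "$(command -v icpc)")/icpc.cfg"
-gxx-name=/example/prefix/gcc-12.2.0/bin/g++
```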

Co-authored-by: Stephen Sachs <stesachs@amazon.com>
2022-10-17 10:46:11 -06:00
Harmen Stoppels
e7b14dd491 database: don't warn adding missing build deps (#33361)
When installing an individual spec `spack install --only=package --cache-only /xyz`
from a buildcache, Spack currently issues tons of warnings about
missing build deps (and their deps) in the database.

This PR disables these warnings, since it's fine to have a spec without
its build deps in the db (they are just "missing").
2022-10-17 18:30:19 +02:00
Adam J. Stewart
839cf48352 py-horovod: add v0.26 (#33311)
* py-horovod: add v0.26

* py-petastorm: add v0.12.0
2022-10-17 09:14:55 -07:00
Harmen Stoppels
39105a3a6f installer.py: traverse_dependencies has local deptype (#33367)
Currently `traverse_dependencies` fixes deptypes to traverse once and
for all in the recursion, but this is incorrect, since deptypes depend
on the node (e.g. if it's a dependency and cache-only, don't follow
build type edges, even if the parent is built from source and needs
build deps).
2022-10-17 16:14:12 +00:00
Massimiliano Culpo
9081871966 GnuPG: add v2.3.8 and update stack (#33368) 2022-10-17 10:13:52 -06:00
iarspider
ad430a7504 Add checksum for py-ipykernel 6.15.2 (#33360) 2022-10-17 10:59:21 -05:00
iarspider
db342d8727 Add checksum for py-secretstorage 3.3.3 (#33366) 2022-10-17 10:54:31 -05:00
Carlos Bederián
32761cdb7b python: add 3.10.7, 3.9.14, 3.8.14, 3.7.14 (#32623) 2022-10-17 17:47:10 +02:00
Scott Wittenburg
29df7e9be3 Support spackbot rebuilding all specs from source (#32596)
Support spackbot rebuilding all specs from source when asked (with "rebuild everything")

- Allow overriding --prune-dag cli opt with env var
- Use job variable to optionally prevent rebuild jobs early exit behavior
- ci rebuild: Use new install argument to insist deps are always installed from
  binary, but package is only installed from source.
- gitlab: fix bug w/ untouched pruning
- ci rebuild: install from hash rather than json file
- When doing a "rebuild everything" pipeline, make sure that each install job only
  consumes binary dependencies from the mirror being populated by the current
  pipeline.  This avoids using, e.g. binaries from develop, when rebuilding
  everything on a PR.
- When running a pipeline to rebuild everything, do not die because we generated
  a hash on the broken specs list.  Instead only warn in that case.
- bugfix: Replace broken no-args tty.die() with sys.exit(1)
2022-10-17 09:45:09 -06:00
Harmen Stoppels
ea80113d0f Github Discussions can be used for Q&A (#33364) 2022-10-17 08:38:25 -07:00
Rui Peng Li
4fe53061a8 hypre 2.26.0 (#33299) 2022-10-17 08:31:28 -07:00
iarspider
bfe49222d5 Add checksum for py-prompt-toolkit 3.0.31 (#33362) 2022-10-17 09:02:19 -06:00
iarspider
42a27f3075 Add checksum for py-grpcio-tools 1.48.1 (#33358) 2022-10-17 10:01:22 -05:00
Harmen Stoppels
e882583b01 installer.py: fix/test get_deptypes (#33363)
Fixing an oversight in https://github.com/spack/spack/pull/32537

`get_deptypes` should depend on new `package/dependencies_cache_only`
props.
2022-10-17 08:57:58 -06:00
Scott Wittenburg
1be6506e29 gitlab ci: Do not force protected build jobs to run on aws runners (#33314) 2022-10-17 07:29:56 -07:00
iarspider
25e35c936b Add checksum for py-astroid 2.12.7, py-astroid 2.12.10, py-setuptools 62.6.0, py-wrapt 1.14.1, py-pylint 2.15.0 (#32976)
* Add checksum for py-astroid 2.12.7, py-setuptools 62.6.0

* Also add checksum for py-wrapt

* Update package.py

* Update package.py (#57)

* Update package.py

* Update package.py

* Update package.py
2022-10-17 08:09:52 -06:00
Harmen Stoppels
2c802c12a5 installer.py: show timers for binary install (#33305)
Print a message of the form
```
Fetch: mm:ss.  Build: mm:ss.  Total: mm:ss
```
when installing from buildcache. 

Previously this only happened for source builds.
2022-10-17 15:54:40 +02:00
iarspider
f3523d8655 py-jupyterlab-pygments: install from wheel to avoid cyclic dependency (#33278)
* py-jupyterlab-pygments: avoid cyclic dependency

* Fix style

* Update package.py

* Update package.py

* [@spackbot] updating style on behalf of iarspider

* Update package.py

* Flake-8

* fix

Co-authored-by: iarspider <iarspider@users.noreply.github.com>
2022-10-17 07:42:01 -06:00
Adam J. Stewart
84fbccd682 py-numpy: add v1.23.4 (#33260) 2022-10-17 15:23:08 +02:00
iarspider
8dc2d37447 Add checksum for py-sniffio 1.3.0 (#32975) 2022-10-17 05:19:40 -06:00
Harmen Stoppels
9933a9046a py-tensorflow-hub: zlib, again. (#33359) 2022-10-17 12:39:16 +02:00
Michael Kuhn
f3ebe237e5 mariadb-c-client: add 3.3.2, 3.2.7, 3.1.18, 3.0.10 (#33335) 2022-10-17 09:53:07 +02:00
Wouter Deconinck
7993d10e54 sdl2: add v2.0.22 and v2.24.1 (#33351) 2022-10-17 09:37:34 +02:00
Diego Alvarez
574ab3e40a nextflow: add v20.10.0 (#33354) 2022-10-17 09:35:45 +02:00
Mosè Giordano
1338dbca56 libblastrampoline: Add versions 5.1.1, 5.2.0 (#33352) 2022-10-17 09:23:51 +02:00
Adam J. Stewart
02fb32bc1e py-setuptools: add v65.5.0 (#33353) 2022-10-17 09:17:10 +02:00
Adam J. Stewart
25e4d48227 py-sphinx: add v5.3 and v5.2 (#33356) 2022-10-16 22:45:07 +02:00
Hans Fangohr
7547f1c414 octopus: upgrade to 12.1 (#33343) 2022-10-16 18:19:01 +02:00
Miroslav Stoyanov
0c505e459b tasmanian: disable openmp by default (#33345) 2022-10-16 18:17:44 +02:00
Adam J. Stewart
23fe981c41 py-meson-python: add new versions (#33294) 2022-10-16 18:00:24 +02:00
Harmen Stoppels
f7f11fc881 py-tensorflow: fix zlib (#33349)
* py-tensorflow: fix zlib

* [@spackbot] updating style on behalf of haampie

Co-authored-by: haampie <haampie@users.noreply.github.com>
2022-10-16 05:09:56 -06:00
Adam J. Stewart
496f4193a6 meson: update OneAPI compiler support patch (#33293) 2022-10-16 11:53:54 +02:00
Jonathon Anderson
10491e98a8 CI: allow multiple matches to combine tags (#32290)
Currently "spack ci generate" chooses the first matching entry in
gitlab-ci:mappings to fill attributes for a generated build-job,
requiring that the entire configuration matrix is listed out
explicitly. This unfortunately causes significant problems in
environments with large configuration spaces, for example the
environment in #31598 (spack.yaml) supports 5 operating systems,
3 architectures and 130 packages with explicit size requirements,
resulting in 1300 lines of configuration YAML.

This patch adds a configuration option to the gitlab-ci schema called
"match_behavior"; when it is set to "merge", all matching entries
are applied in order to the final build-job, allowing a few entries
to cover an entire matrix of configurations.

The default for "match_behavior" is "first", which behaves as before
this commit (only the runner attributes of the first match are used).

In addition, match entries may now include a "remove-attributes"
configuration, which allows matches to remove tags that have been
aggregated by prior matches. This only makes sense to use with
"match_behavior:merge". You can combine "runner-attributes" with
"remove-attributes" to effectively override prior tags.
2022-10-15 17:29:53 +00:00
iarspider
898c0b45fb Add checksum for py-seaborn 0.12.0 (#33145)
* Add checksum for py-seaborn 0.12.0

* Update package.py

* Update package.py

* [@spackbot] updating style on behalf of iarspider

* Update package.py

Co-authored-by: iarspider <iarspider@users.noreply.github.com>
2022-10-15 10:34:30 -05:00
Michael Kuhn
b2d7782b00 rocksdb: add 7.7.3 (#33341) 2022-10-14 22:34:17 -06:00
Harmen Stoppels
2f6a56a43b depfile: update docs (#33279) 2022-10-15 06:10:05 +02:00
Michael Kuhn
31cda96181 glib: add 2.74.0 and 2.72.4 (#33332) 2022-10-14 21:27:08 -06:00
WuK
19226ecc49 gptl: new version 8.1.1; use the correct mpi fortran compiler (#33235)
* gptl: new version 8.1.1; use the correct `mpifc`
* add `F90` and `$F77`
2022-10-14 15:54:41 -07:00
Greg Sjaardema
2a166a5cc4 seacas: update to latest release (#33330)
Add checksum for latest tag/release
2022-10-14 15:49:06 -07:00
Erik Schnetter
a661193c64 ninja: New version 1.11.1 (#33215) 2022-10-14 14:31:42 -06:00
iarspider
9546eadd98 Add checksum for py-oauthlib 3.2.1 (#33201)
* Add checksum for py-oauthlib 3.2.1

* Update package.py

* [@spackbot] updating style on behalf of iarspider

* Update package.py

* Update package.py

Co-authored-by: iarspider <iarspider@users.noreply.github.com>
2022-10-14 12:46:08 -06:00
John-Luke Navarro
89b3e6c6d0 py-libensemble: updating package for v0.9.3 (#33298)
* commit updating py-libensemble package for 0.9.3

* removed commented-out lines
2022-10-14 11:01:57 -05:00
Satish Balay
6983d520fa petsc,py-petsc4py,slepc,py-slepc4py: add version 3.18.0 (#32938)
* petsc,py-petsc4py,slepc,py-slepc4py: add version 3.18.0

* workaround for dealii build failure [with petsc version check]

* pism: add compatibility fix for petsc@3.18

* add in hipsolver dependency
2022-10-14 08:12:16 -07:00
eugeneswalker
c44934a44d hip@5.2.0 onwards: set prefix properly (#33257)
* hip-set-prefix-rocm5.2.0-onwards

* Update var/spack/repos/builtin/packages/hip/package.py

Update description

Co-authored-by: Satish Balay <balay@mcs.anl.gov>
2022-10-14 08:12:07 -07:00
Auriane R
67bc90acb7 Fix pika@0.9.0 sha (#33307) 2022-10-14 07:13:57 -06:00
Dan Bonachea
0de1f98920 UPC++/GASNet-EX 2022.9.0 update (#33277)
* gasnet: Add new release hash
* upcxx: Add new release hash
* gasnet: misc updates
* upcxx: misc updates
2022-10-13 18:50:27 -07:00
iarspider
c13381fab3 Add checksum for py-gitpython 3.1.27 (#33285)
* Add checksum for py-gitpython 3.1.27

* Update package.py

* Update var/spack/repos/builtin/packages/py-gitpython/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-10-13 20:27:39 -05:00
Adam J. Stewart
d5ebb55338 meson: remove slash in path (#33292) 2022-10-13 20:26:51 -05:00
Scott Wittenburg
7da303334e gitlab ci: Print better information about broken specs (#33124)
When a pipeline generation job is automatically failed because it
generated jobs for specs known to be broken on develop, print better
information about the broken specs that were encountered.  Include
at a minimum the hash and the url of the job whose failure caused it
to be put on the broken specs list in the first place.
2022-10-13 16:35:07 -06:00
iarspider
43dd34b651 Add checksum for py-psutil 5.9.2 (#33139) 2022-10-13 15:06:06 -06:00
Jim Edwards
00ea25061f add version 1.12.3 of parallel-netcdf (#33286) 2022-10-13 14:51:50 -06:00
Harmen Stoppels
75f71d3f81 py-execnet: 1.9.0 (#33282)
* py-execnet: 1.9.0

* bounds
2022-10-13 14:22:08 -06:00
Luke Diorio-Toth
f74742b834 new package (#33262) 2022-10-13 15:01:16 -05:00
Harmen Stoppels
599480ae9a Add missing upperbound to docs/spack.yaml (#33280) 2022-10-13 11:54:20 -07:00
Gregory Lee
feb1f3aadb libcroco does not respect gtk-doc configure flag, so removing variant (#32890)
* libcroco does not respect gtk-doc configure flag, so removing variant
2022-10-13 11:51:28 -07:00
Adam J. Stewart
8ce1574e0c py-meson: remove package (#33295) 2022-10-13 12:22:09 -06:00
Luke Diorio-Toth
7f24ab9b0a py-alive-progress, py-about-time, py-graphme: new packages (#33252)
* new package + deps

* Update var/spack/repos/builtin/packages/py-about-time/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-alive-progress/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* removed unnecessary python version dep

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-10-13 12:22:03 -06:00
Chris White
5167eed558 Lua: Add versions and minor clean up (#33037)
* add new lua releases

* split install phase and move it into a build phase, remove hardcoded standard flag

* revert back to the original hardcoded std flag, guard patch against versions above 5.4
2022-10-13 11:51:07 -06:00
Adam J. Stewart
3014caa586 py-kornia: add v0.6.8 (#33287) 2022-10-13 12:42:27 -05:00
Adam J. Stewart
6ac1f445ec py-shapely: add v1.8.5 (#33259) 2022-10-13 11:52:20 -05:00
iarspider
be93b27ffc Add checksum for py-atomicwrites 1.4.1 (#33284) 2022-10-13 11:42:58 -05:00
iarspider
9c6c296474 Add checksum for py-asn1crypto 1.5.1 (#33283) 2022-10-13 11:42:19 -05:00
Cameron Rutherford
e20d45f3bc Fix ROCm constraints for ginkgo@glu_experimental in HiOp (#32499)
* Remove ROCm constraints for ginkgo@glu_experimental.

* Fix style.

* Apply @tcojean suggestion.

* Fix hip_repair_options in camp package.

* Remove old ROCm logic.

* Remove added whitespace.

* Fix style issue.

* Revert camp changes.

* Revert camp whitespace change.

* Set Ginkgo preferred version to 1.4.0
2022-10-13 09:41:18 -07:00
Laurent Aphecetche
0c4ad440c6 py-pyopenssl: add version 22.1.0 (#33189)
* py-pyopenssl: add version 22.1.0

* Update var/spack/repos/builtin/packages/py-pyopenssl/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-10-13 11:32:11 -05:00
iarspider
93be19d0e3 Add checksum for py-bokeh 2.4.3 (#33236)
* Add checksum for py-bokeh 2.4.3

* [@spackbot] updating style on behalf of iarspider

Co-authored-by: iarspider <iarspider@users.noreply.github.com>
2022-10-13 11:31:42 -05:00
iarspider
4c151e0387 Add checksum for py-mpld3 0.5.8 (#33239) 2022-10-13 11:29:48 -05:00
iarspider
0af5838581 Add checksum for py-cloudpickle 2.2.0 (#33240) 2022-10-13 11:28:52 -05:00
iarspider
828fddacf1 Add checksum for py-bottle 0.12.23 (#33241) 2022-10-13 11:28:15 -05:00
iarspider
268a43762c Add checksum for py-pymongo 4.2.0 (#33234) 2022-10-13 11:25:43 -05:00
iarspider
bd263b71da Add checksum for py-keyring 23.9.1 (#33185)
* Add checksum for py-keyring 23.9.1

* Update package.py
2022-10-13 11:24:35 -05:00
iarspider
f76a3a1a73 Add checksum for py-setuptools-rust 1.5.1 (#33188)
* Add checksum for py-setuptools-rust 1.5.1

* Update package.py
2022-10-13 11:24:05 -05:00
iarspider
5eeb81c253 Add checksum for py-jupyter-server-mathjax 0.2.6 (#33203)
* Add checksum for py-jupyter-server-mathjax 0.2.6

* Update package.py

* Update package.py
2022-10-13 11:23:38 -05:00
iarspider
4ddf011c56 Add checksum for py-regex 2022.8.17 (#33209)
* Add checksum for py-regex 2022.8.17

* Update var/spack/repos/builtin/packages/py-regex/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-10-13 11:23:07 -05:00
iarspider
f5704fff69 Add checksum for py-async-lru 1.0.3 (#33196)
* Add checksum for py-async-lru 1.0.3

* [@spackbot] updating style on behalf of iarspider

* Update var/spack/repos/builtin/packages/py-async-lru/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update package.py

Co-authored-by: iarspider <iarspider@users.noreply.github.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-10-13 11:22:37 -05:00
dependabot[bot]
a8b1314d18 Bump docker/login-action from 2.0.0 to 2.1.0 (#33268)
Bumps [docker/login-action](https://github.com/docker/login-action) from 2.0.0 to 2.1.0.
- [Release notes](https://github.com/docker/login-action/releases)
- [Commits](49ed152c8e...f4ef78c080)

---
updated-dependencies:
- dependency-name: docker/login-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-10-13 09:49:26 +02:00
dependabot[bot]
601c727491 Bump docker/setup-buildx-action from 2.0.0 to 2.1.0 (#33267)
Bumps [docker/setup-buildx-action](https://github.com/docker/setup-buildx-action) from 2.0.0 to 2.1.0.
- [Release notes](https://github.com/docker/setup-buildx-action/releases)
- [Commits](dc7b9719a9...95cb08cb26)

---
updated-dependencies:
- dependency-name: docker/setup-buildx-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-10-13 09:48:55 +02:00
William R Tobin
40c400441a exodusii: add fortran variant (#33074)
* add fortran variant, pass cmake options to build with fortran + specified compiler/mpi compiler wrapper (following existing style in the package), activate exodus fortran wrapper library compilation
* change variant description, fix style
2022-10-12 15:28:35 -07:00
Max Zeyen
a2dee76310 gpi-space: add new version (#33184)
* gpi-space: add new version
* gpi-space: fix flake8 formatting issues
* gpi-space: fix more flake8 issues
2022-10-12 15:23:19 -07:00
Brian Van Essen
f24c135383 Added hash for version 2.7.0-6 (#33263) 2022-10-12 14:35:45 -07:00
Harmen Stoppels
5009e3d94a env depfile: allow deps only install (#33245)
* env depfile: allow deps only install

- Refactor `spack env depfile` to use a Jinja template, making it a bit
  easier to follow as a human being.
- Add a layer of indirection in the generated Makefile through an
  `<prefix>/.install-deps/<hash>` target, which allows one to specify
  different options when installing dependencies. For example, only
  verbose/debug mode on when installing some particular spec:
  ```
  $ spack -e my_env env depfile -o Makefile --make-target-prefix example
  $ make example/.install-deps/<hash> -j16
  $ make example/.install/<hash> SPACK="spack -d" SPACK_INSTALL_FLAGS=--verbose -j16
  ```

This could be used to speed up `spack ci rebuild`:
- Parallel install of dependencies from buildcache
- Better readability of logs, e.g. reducing verbosity when installing
  dependencies, and splitting logs into deps.log and current_spec.log

* Silence please!
2022-10-12 14:30:00 -07:00
Filippo Spiga
8dbdfbd1eb NVIDIA HPC SDK: add v22.9 (#33258) 2022-10-12 15:17:53 -06:00
Glenn Johnson
042fcc3575 update Bioconductor R packages (#33224)
* Add bioc attribute to r-do-db
* add version 1.38.1 to bioconductor package r-annotationforge
* add version 1.30.4 to bioconductor package r-biocparallel
* add version 2.64.1 to bioconductor package r-biostrings
* add version 4.4.4 to bioconductor package r-clusterprofiler
* add version 2.12.1 to bioconductor package r-complexheatmap
* add version 1.18.1 to bioconductor package r-delayedmatrixstats
* add version 3.22.1 to bioconductor package r-dose
* add version 3.38.4 to bioconductor package r-edger
* add version 1.16.2 to bioconductor package r-enrichplot
* add version 2.20.2 to bioconductor package r-ensembldb
* add version 1.32.4 to bioconductor package r-genomeinfodb
* add version 1.32.1 to bioconductor package r-genomicalignments
* add version 1.48.4 to bioconductor package r-genomicfeatures
* add version 1.44.1 to bioconductor package r-ggbio
* add version 3.4.4 to bioconductor package r-ggtree
* add version 1.24.2 to bioconductor package r-hdf5array
* add version 2.30.1 to bioconductor package r-iranges
* add version 1.36.3 to bioconductor package r-keggrest
* add version 3.52.4 to bioconductor package r-limma
* add version 1.8.1 to bioconductor package r-matrixgenerics
* update r-org-hs-eg-db
* add version 1.38.1 to bioconductor package r-organismdbi
* add version 1.36.1 to bioconductor package r-pathview
* add version 1.56.1 to bioconductor package r-rtracklayer
* add version 1.4.1 to bioconductor package r-scaledmatrix
* add version 1.24.1 to bioconductor package r-scran
* add version 1.6.3 to bioconductor package r-scuttle
* add version 1.18.1 to bioconductor package r-singlecellexperiment
* add version 1.20.2 to bioconductor package r-treeio
* Revert "Add bioc attribute to r-do-db"
This reverts commit 36be5c6072.
* Fix quotes on versions

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2022-10-12 14:04:20 -07:00
Massimiliano Culpo
48da17d18e py-pythran: customize headers attribute (#33242) 2022-10-12 13:09:52 -06:00
Auriane R
4765309b97 Limit whip dependencies for pika (#33244) 2022-10-12 11:10:03 -06:00
iarspider
d52eef5b16 Add checksum for py-requests-oauthlib 1.3.1 (#33199) 2022-10-12 10:22:12 -06:00
G-Ragghianti
a227fec4b9 Package slate: Added deps for +rocm smoke test (#33218)
* Added deps for slate+rocm smoke test
* Style change
2022-10-12 08:34:48 -07:00
Vicente Bolea
949151aff3 vtkm: add v1.9.0 (#33221) 2022-10-12 06:50:14 -06:00
iarspider
008b13f676 Add checksum for py-awkward 1.9.0 (#33159) 2022-10-12 07:43:14 -05:00
Harmen Stoppels
acd4787a1a oneapi: set -Wno-unused-command-line-argument (#33192)
For older versions of intel-oneapi-compilers, running the compiler in
preprocessor / compilation mode would trigger warnings that
`-Wl,-rpath,...` flags were unused.

This in turn caused certain configure scripts to fail as they did not
expect output from the compiler (it's treated as an error). Notably
cmake's bootstrap phase failed to detect c++ features of the compiler.

As a workaround, add this flag to silence the warning, since I don't
think we can scope the flags to compile+link mode.
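A sketch of the failure mode (compiler invocation and paths illustrative): configure-style probes treat any unexpected compiler output as failure, so the warning alone breaks them:
```
# older releases warn that the -Wl flag is unused when only preprocessing
$ icpx -E -Wl,-rpath,/example/lib conftest.cpp > /dev/null
# with the workaround flag the probe runs silently again
$ icpx -E -Wno-unused-command-line-argument -Wl,-rpath,/example/lib conftest.cpp > /dev/null
```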
2022-10-12 14:38:57 +02:00
Alberto Invernizzi
4ee22e7cf7 neovim: add version 0.8.0 (#33238)
* bump version for libvterm, required by neovim

* bump version for neovim and add related dep constraints

see release note:
d367ed9b23

in particular:
'deps: Bump required libvterm to v0.3'
https://github.com/neovim/neovim/pull/20222
2022-10-12 13:26:48 +02:00
iarspider
8537220b0e Add checksum for py-prettytable 3.4.1 (#33138) 2022-10-12 03:41:53 -06:00
Sergey Kosukhin
c85faaa216 netcdf packages: filter compiler wrappers in the *-config files (#33025)
* netcdf packages: filter compiler wrappers in the *-config files

* netcdf-c: provide dependent packages with unfiltered nc-config
2022-10-12 11:29:47 +02:00
Harmen Stoppels
dc39edb790 man-db: fix gnulib issue (#33149)
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2022-10-12 11:28:45 +02:00
Seth R. Johnson
f9620950cd py-sphinxcontrib-bibtex: new version 2.5.0 (#32902)
2.4 seems to have issues with sphinx-rtd and sphinx 5.1:
```
AttributeError: 'Text' object has no attribute 'rawsource'
```
2022-10-12 11:25:56 +02:00
iarspider
f10997a0df py-uproot: add v4.3.5 (#33151) 2022-10-12 11:22:43 +02:00
Michael Kuhn
2df25d8b37 meson: add 0.63.3 (#33216) 2022-10-12 03:02:44 -06:00
Jim Edwards
e31a4b6dc6 ESMF package update (#33202) 2022-10-12 10:59:02 +02:00
Jonathon Anderson
827e576c3d bear: fix RPATH handling (#33217) 2022-10-12 10:34:56 +02:00
Adam J. Stewart
549f6361ce py-poetry-core: jail git to stage directory (#33181) 2022-10-12 10:31:12 +02:00
MicK7
c3cc462a69 Add new vtk 9.2.2 release (#33001)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2022-10-12 10:22:42 +02:00
Glenn Johnson
bffc4ab826 Update CRAN packages (#33223) 2022-10-12 10:20:52 +02:00
Luke Diorio-Toth
fffa9258c5 infernal: add v1.1.4 (#33230) 2022-10-12 09:41:01 +02:00
Luke Diorio-Toth
a34b36703a trnascan-se: add v2.0.11 (#33232) 2022-10-12 09:39:36 +02:00
Massimiliano Culpo
7efbd7d8eb Fix typo in docs (#33182) 2022-10-12 08:31:48 +02:00
iarspider
bd5705b2a8 Add checksum for py-cryptography 37.0.4 (#33186) 2022-10-11 20:43:56 -05:00
iarspider
f4cc48286d Add checksum for py-frozenlist 1.3.1 (#33193) 2022-10-11 20:41:08 -05:00
iarspider
caf8b57fd4 Add checksum for py-yarl 1.8.1 (#33195) 2022-10-11 20:40:22 -05:00
iarspider
9342344f78 Add checksum for py-immutables 0.18 (#33197) 2022-10-11 20:39:20 -05:00
iarspider
552908595e Add checksum for py-google-auth-oauthlib 0.5.2 (#33198) 2022-10-11 20:38:26 -05:00
iarspider
9cd47454f3 Add checksum for py-jupyter-server 1.18.1 (#33204) 2022-10-11 20:32:42 -05:00
iarspider
cbe2178e3f Add checksum for py-websocket-client 1.4.1 (#33205) 2022-10-11 20:30:45 -05:00
iarspider
7d4fa0ea00 Add checksum for py-requests-unixsocket 0.3.0 (#33206) 2022-10-11 20:29:40 -05:00
iarspider
7f49cc2d17 Add checksum for py-typed-ast 1.5.4 (#33207) 2022-10-11 20:28:37 -05:00
iarspider
a43ad2d876 Add checksum for py-scinum 1.4.3 (#33208) 2022-10-11 20:27:49 -05:00
iarspider
401412f999 Add checksum for py-python-rapidjson@1.8 (#33210) 2022-10-11 20:25:30 -05:00
iarspider
019463be39 Add checksum for py-pysqlite3 0.4.7 (#33211) 2022-10-11 20:24:46 -05:00
Luke Diorio-Toth
cd74e091d0 version bump to 3.10.42 (#33220) 2022-10-11 19:13:56 -06:00
Adam J. Stewart
5844c24ca8 py-rtree: add v1.0.1 (#33222) 2022-10-11 17:49:41 -07:00
Adam J. Stewart
d1fb82a2c4 GCC: update Xcode 14 conflict (#33226) 2022-10-11 17:26:47 -07:00
Harmen Stoppels
926dca9e5f Specify GCC prefix in LLVM-based compilers (#33146)
* spack.compiler.Compiler: introduce prefix property

We currently don't really have something that gives the GCC install
path, which is used by many LLVM-based compilers (llvm, llvm-amdgpu,
nvhpc, ...) to fix the GCC toolchain once and for all.

This `prefix` property is dynamic in the sense that it queries the
compiler itself. This is necessary because it's not easy to deduce the
install path from the `cc` property (might be a symlink, might be a
filename like `gcc` which works by having the compiler load a module
that sets the PATH variable, might be a generic compiler wrapper based
on environment variables like on cray...).

With this property introduced, we can clean up some recipes that have
the logic repeated for GCC.

* intel-oneapi-compilers: set --gcc-sysroot to %gcc prefix
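One plausible form of such a query for GCC-compatible compilers (an illustration, not necessarily the property's actual implementation; output shown is an example):
```
$ gcc -print-search-dirs | awk -F': ' '/^install:/ {print $2}'
/usr/lib/gcc/x86_64-linux-gnu/12/
# the install prefix is then a few directory levels above this path
```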
2022-10-11 17:45:51 -06:00
kwryankrattiger
4b866e8ffc Darshan variant cleanup (#33165)
* Darshan-Runtime: Cleanup version dependent variants

* Darshan-Util: Cleanup version dependent variants.
2022-10-11 12:04:22 -07:00
kwryankrattiger
5d0f0914b8 Omega-H: Current constraint doesn't allow any cuda (#33164)
From the issue referenced, it seems later and earlier versions
of cuda work.
2022-10-11 12:03:39 -07:00
Massimiliano Culpo
de8c827983 Refactor a few classes related to package repositories (#32273)
Caches used by repositories don't reference the global spack.repo.path instance
anymore, but get the repository they refer to during initialization.
 
Spec.virtual now uses the index, and the computation done to build the index
uses Repository.is_virtual_safe.

Code to construct mock packages and mock repository has been factored into 
a unique MockRepositoryBuilder that is used throughout the codebase.

Add debug print for pushing and popping config scopes.

Changed spack.repo.use_repositories so that it can either override or extend the previous repos

spack.repo.use_repositories updates spack.config.config according to the modifications done

Removed a peculiar behavior from spack.config.Configuration where push would always 
bubble-up a scope named command_line if it existed
2022-10-11 19:28:27 +02:00
Peter Scheibel
b594c0aee0 spack diff any specs you want (#32737)
Resolves #31782

With this change, if a spec is concrete after parsing (e.g. spec.yaml
or /hash-based), then it is not disambiguated (a process which requires
(a) that the spec be installed and (b) that it be part of the
currently-active environment).

This commit allows you to:

* Diff specs from an environment regardless of whether they have
  been installed (more useful for projection/matrix-based envs)
* Diff specs read from .yaml files which may or may not be entirely
  different installations of Spack
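
For example (hashes and file names illustrative):
```
$ spack diff /abc1234 /def5678                      # concrete specs by hash
$ spack diff ./site-a/spec.yaml ./site-b/spec.yaml  # specs read from files
```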

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2022-10-11 10:03:31 -06:00
dependabot[bot]
8e3c088a7a Bump actions/setup-python from 4.2.0 to 4.3.0 (#33166)
Bumps [actions/setup-python](https://github.com/actions/setup-python) from 4.2.0 to 4.3.0.
- [Release notes](https://github.com/actions/setup-python/releases)
- [Commits](b55428b188...13ae5bb136)

---
updated-dependencies:
- dependency-name: actions/setup-python
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-10-11 09:54:17 +02:00
iarspider
a0489d8480 py-importlib-resources: add v5.9.0 (#33047) 2022-10-11 09:52:01 +02:00
eugeneswalker
a587bff119 e4s ci: add cabana +rocm (#33177) 2022-10-11 00:37:55 -06:00
Glenn Johnson
c885b591e2 new package: r-profvis (#33171) 2022-10-10 21:09:53 -06:00
Glenn Johnson
771aee30ea new package: r-urlchecker (#33174) 2022-10-10 19:42:06 -06:00
Glenn Johnson
87536ab107 new package: r-ragg (#33172) 2022-10-10 19:38:06 -06:00
Glenn Johnson
acf6acc93c Add bioc attribute to r-do-db (#33179) 2022-10-10 19:33:55 -06:00
Glenn Johnson
cecec254b0 new package: r-optimparallel (#33169) 2022-10-10 19:25:55 -06:00
Mark W. Krentel
c7472c849f hpctoolkit: add version 2022.10.01 (#33078)
* hpctoolkit: add version 2022.10.01

 1. add version 2022.10.01
 2. remove version for master branch, develop is now the main branch
 3. add CPATH and LD_LIBRARY_PATH to module run environment,
    this is for apps that want to use the start/stop interface
 4. cleanup style in variants, depends and conflicts
 5. remove all-static variant, nothing uses it
 6. deprecate more old versions

* [@spackbot] updating style on behalf of mwkrentel

* Add when(+level_zero) to the gtpin variant.

* Test commit to see if this passes E4S.

* Another test commit to see if E4S succeeds.

* Add temporary hack to ignore +mpi for version 2022.10.01 and issue a
warning instead.

Co-authored-by: mwkrentel <mwkrentel@users.noreply.github.com>
2022-10-10 17:59:13 -07:00
iarspider
be293ceb7a Add checksum for py-virtualenv 20.16.4 (#33154)
* Add checksum for py-virtualenv 20.16.4

* [@spackbot] updating style on behalf of iarspider

* Update package.py

* Update package.py

Co-authored-by: iarspider <iarspider@users.noreply.github.com>
2022-10-10 18:49:52 -06:00
iarspider
4a9790e8cd Add checksum for py-msgpack 1.0.4 (#33161)
* Add checksum for py-msgpack 1.0.4

* Update var/spack/repos/builtin/packages/py-msgpack/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-10-10 18:38:19 -06:00
Glenn Johnson
cd6ef2c3cb new package: r-textshaping (#33173) 2022-10-10 18:38:01 -06:00
Glenn Johnson
042050be11 new package: r-pkgdown (#33170) 2022-10-10 18:33:58 -06:00
Glenn Johnson
46b7d3995c new package: r-interp (#33168) 2022-10-10 17:57:52 -06:00
iarspider
c393a57a48 Update py werkzeug (#33155)
* Add checksum for py-virtualenv 20.16.4

* Add checksum for py-werkzeug 2.2.2

* Restore py-virtualenv/package.py

* Update var/spack/repos/builtin/packages/py-werkzeug/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-10-10 17:50:06 -06:00
Axel Huebl
9d89dba292 Docs: Getting Started Dependencies (#32480)
* Docs: Getting Started Dependencies

Finally document what one needs to install to use Spack on
Linux and Mac :-)

With <3 for minimal container users and my colleagues with
their fancy Macs.

* Debian Update Packages: GCC, Python

- build-essential: includes gcc, g++ (thx Cory)
- Python: add python3-venv, python3-distutils (thx Pradyun)

* Add RHEL8 Dependencies
2022-10-10 23:25:37 +00:00
Glenn Johnson
a05a34361a new package: r-downlit (#33167) 2022-10-10 16:24:13 -07:00
eugeneswalker
dc141f5ad6 e4s: add hypre +rocm (#32148) 2022-10-10 16:09:52 -06:00
iarspider
634941a1c3 Add checksum for py-mako 1.2.2 (#33141) 2022-10-10 13:30:06 -06:00
iarspider
22ea4aa210 Add checksum for py-pyrsistent 0.18.1 (#33084) 2022-10-10 13:26:03 -06:00
iarspider
83916961da Add checksum for py-parso 0.8.3 (#33050) 2022-10-10 13:16:29 -06:00
Harmen Stoppels
dcf157d3a9 julia: add latest 1.8.x and 1.6.x releases and update deps, remove deprecated versions (#32956)
* julia: add latest 1.8.x and 1.6.x releases and update deps, remove deprecated versions
* get libuv versions right
* resurrect libuv 1.44.1
2022-10-10 12:11:13 -07:00
Adam J. Stewart
831d7979ca ML CPU pipeline: test py-torch-nvidia-apex (#33158) 2022-10-10 12:18:05 -06:00
iarspider
14f6de9bf2 Add checksum for py-skl2onnx 1.12 (#33137)
* Add checksum for py-skl2onnx 1.12

* Update var/spack/repos/builtin/packages/py-skl2onnx/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-10-10 12:17:52 -06:00
iarspider
c777380569 Add checksum for py-tables 3.7.0 (#33157)
* Add checksum for py-tables 3.7.0

* Update package.py

* Update var/spack/repos/builtin/packages/py-tables/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-10-10 12:06:08 -06:00
Jean Luca Bez
0835b69771 New package: DXT Explorer tool (#31007)
* DXT Explorer tool

* Remove comments

* Fix style

* Syntax change

* Fix syntax

* remove dependencies, update version number, fix recipe

* fix syntax

* fixes

* change version order
2022-10-10 11:49:58 -06:00
iarspider
d886700de5 Add checksum for py-stevedore 4.0.0 (#33147) 2022-10-10 12:26:55 -05:00
iarspider
15dddee8f6 Add checksum for py-pycurl 7.45.1 (#33143) 2022-10-10 12:23:52 -05:00
iarspider
75cb7cefc1 Add checksum for py-pytools 2022.1.12 (#33142) 2022-10-10 12:22:59 -05:00
Jonas Thies
c61dad1c25 phist: new version 1.11 and patch to make previous versions compile w… (#33132)
* phist: new version 1.11 and patch to make previous versions compile with OpenBLAS

* phist: drop conflict on netlib-lapack and openblas
2022-10-10 19:19:09 +02:00
Jim Galarowicz
ee2ece3c91 Update versions of the survey performance tool. (#33058) 2022-10-10 09:55:49 -07:00
Jim Edwards
8829fb7c03 add ANL mpi-serial package (used by parallelio) (#33150) 2022-10-10 09:39:58 -07:00
iarspider
ef4c7474e5 Add checksum for py-dill 0.3.5.1 (#33144) 2022-10-10 11:31:51 -05:00
iarspider
5f642ff2d6 Add checksum for py-crashtest 0.4.0 (#33162) 2022-10-10 11:30:28 -05:00
Auriane R
41f992a2f8 Do not set CMAKE_HIP_ARCHITECTURES if none specified (#33156) 2022-10-10 09:21:45 -07:00
iarspider
c453d8718b Add checksum for py-vector 0.8.5 (#33152) 2022-10-10 11:13:37 -05:00
iarspider
5b3e8a46b3 Add checksum for py-cachecontrol 0.12.11 (#33160) 2022-10-10 10:05:59 -06:00
Adam J. Stewart
8d9a035d12 py-torch-nvidia-apex: fix +cuda build (#33070) 2022-10-10 16:04:35 +02:00
Sarah Osborn
cbc867a24c hypre: fix to correctly find rocsparse and rocrand when not in ROCM_PATH (#33073) 2022-10-10 15:46:26 +02:00
snehring
86aaede202 libfabric: add version 1.16.1 (#33030) 2022-10-10 15:45:41 +02:00
Adam J. Stewart
8da82ebf69 py-matplotlib: add v3.6.1 (#33126) 2022-10-10 15:45:06 +02:00
Thomas Madlener
fc23b48804 podio, edm4hep: add latest versions (#33056) 2022-10-10 15:41:33 +02:00
iarspider
f915f9db32 py-luigi: add v3.1.1 (#33090) 2022-10-10 15:37:40 +02:00
Brian Vanderwende
27cf8dddec shell prompt: enclose control sequence in brackets (#33079)
When setting `PS1` in Bash, non-printable characters must be enclosed in `\[` and `\]`, so that Bash computes the prompt width (and thus line wrapping) correctly.

See https://www.gnu.org/software/bash/manual/bash.html#Controlling-the-Prompt
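
A minimal example of the convention (color codes illustrative):
```
# wrong: bash counts the escape codes as printable, so line wrapping breaks
PS1='\e[0;32m[env]\e[0m \w $ '
# right: \[ ... \] marks the escape codes as zero-width
PS1='\[\e[0;32m\][env]\[\e[0m\] \w $ '
```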
2022-10-10 07:29:58 -06:00
Adam J. Stewart
7cb745b03a PythonPackage: fix libs/headers attributes (#32970) 2022-10-10 13:26:30 +00:00
iarspider
bfbd411091 Add checksum for py-lizard 1.17.10 (#33095) 2022-10-10 06:09:56 -06:00
Wouter Deconinck
2a43571a68 root: new variant webgui when +root7 (default True) (#33133)
ROOT has a webgui which is available with the `+root7` variant. This is a fairly large part of a ROOT install (275MB out of 732MB on my system) which is not necessarily useful in all use cases (e.g. inside containers on network-restricted HPC/HTC compute nodes). This new variant adds the option to retain the ROOT7 functionality but not necessarily include the `webgui` aspects.
2022-10-10 11:13:35 +02:00
Yang Zongze
46239ea525 fixbug-ctags-5.8: general.h: missing binary operator before token "(" (#33135)
`__unused__` defined in `general.h` conflicts with the one defined by libc headers,
so change it to `__attribute__unused__` according to s.zharkoff:
  https://bugs.gentoo.org/828550#c11

cmd:
  `grep -rl "__unused__" . | xargs -n1 sed -i -e 's/\b__unused__\b/__attribute__unused__/g' -e 's/(unused)/(__unused__)/g'`
2022-10-10 10:24:54 +02:00
Adam J. Stewart
01ede3c595 Add CI stack for ML packages (#31592)
Basic stack of ML packages we would like to test and generate binaries for in CI. 

Spack now has a large CI framework in GitLab for PR testing and public binary generation.
We should take advantage of this to test and distribute optimized binaries for popular ML
frameworks.

This is a pretty extensive initial set, including CPU, ROCm, and CUDA versions of a core
`x86_64_v4` stack.

### Core ML frameworks

These are all popular core ML frameworks already available in Spack.

- [x] PyTorch
- [x] TensorFlow
- [x] Scikit-learn
- [x] MXNet
- [x] CNTK
- [x] Caffe
- [x] Chainer
- [x] XGBoost
- [x] Theano

### ML extensions

These are domain libraries and wrappers that build on top of core ML libraries

- [x] Keras
- [x] TensorBoard
- [x] torchvision
- [x] torchtext
- [x] torchaudio
- [x] TorchGeo
- [x] PyTorch Lightning
- [x] torchmetrics
- [x] GPyTorch
- [x] Horovod

### ML-adjacent libraries

These are libraries that aren't specific to ML but are still core libraries used in ML pipelines

- [x] numpy
- [x] scipy
- [x] pandas
- [x] ONNX
- [x] bazel

Co-authored-by: Jonathon Anderson <17242663+blue42u@users.noreply.github.com>
2022-10-09 15:39:47 -07:00
iarspider
4a6aff8bd1 Add checksum for py-nest-asyncio 1.5.5 (#33080) 2022-10-09 16:25:58 -06:00
iarspider
5d8e97af2a Add checksum for py-pycparser 2.21 (#32981) 2022-10-09 16:21:54 -06:00
iarspider
48f5f8eb17 Add checksum for py-br 5.10.0 (#33093) 2022-10-09 11:11:05 -06:00
iarspider
01a54dd616 Add checksum for py-hep-ml 0.7.1 (#33091) 2022-10-09 11:10:43 -06:00
iarspider
e5414ed9cc Add checksum for py-python-daemon 2.3.1 (#33092) 2022-10-09 11:10:24 -06:00
iarspider
20453622a0 Add checksum for py-law 0.1.7 (#33089) 2022-10-09 11:10:12 -06:00
Sergey Kosukhin
4a3e3807a3 py-eccodes: fix environment variables (#32807) 2022-10-09 11:09:57 -06:00
Sergey Kosukhin
c60dffaea7 py-cdo: add version 1.5.6, deprecate 1.3.2 (#32793)
* py-cdo: add version 1.5.6

* py-cdo: make python run dependencies also the build ones

* py-cdo: restrict Python version
2022-10-09 11:38:54 -05:00
Axel Huebl
17898a61dd py-cupy: 11.2 (#33076)
* py-cupy: 11.2

Add the latest version of `cupy`:
  https://github.com/cupy/cupy/tree/v11.2.0#installation

* Update Dependencies

* Deprecate: 8.0.0
2022-10-09 10:38:04 -06:00
iarspider
2d28274387 Add checksum for py-pybind11 2.10.0 (#32971) 2022-10-09 11:17:20 -05:00
iarspider
918ed5f328 Add checksum for py-scikit-build 0.15.0 and use sources from pypi (#32954)
* Add checksum for py-scikit-build 0.15.0 and use sources from pypi

* Update var/spack/repos/builtin/packages/py-scikit-build/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-10-09 11:17:07 -05:00
iarspider
6f79dc654c Add checksum for py-terminado 0.15.0 (#33085)
* Add checksum for py-terminado 0.15.0

* Update var/spack/repos/builtin/packages/py-terminado/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-10-09 11:13:19 -05:00
Luke Diorio-Toth
6c2df00443 new packages (py-metaphlan, py-hclust2, iqtree2) + updates to others (py-dendropy, py-phylophlan, py-pkgconfig) (#32936)
* added metaphlan v4, cleaned up phylophlan

* added iqtree2

* fixed phylophlan, builds now

* changed config.yaml to default

* fixed style

* py-jsonschema: add 4.16.0 and new package py-hatch-fancy-pypi-readme (#32929)

* acfl: add v22.1 (#32915)

Co-authored-by: Annop Wongwathanarat <annop.wongwathanarat@arm.com>

* Fixup errors introduced by Clingo PR: (#32905)

* re2c depends on cmake on Windows
* Winbison properly added to bootstrap package search list

* Set CMAKE_HIP_ARCHITECTURES with the value of amdgpu_target (#32901)

* libtiff: default to +zlib+jpeg (#32945)

* octave: add version 7.2.0 (#32943)

* simgrid new releases (#32920)

* [rocksdb] Added rtti variant (#32918)

* rvs binary path updated for 5.2 rocm release (#32892)

* Add checksum for py-pytest-runner 6.0.0 (#32957)

* py-einops: add v0.5.0 (#32959)

* Replace repo with the NVIDIA one (#32951)

* Add checksum for py-tomli 2.0.1 (#32949)

* QMCPACK: add @3.15.0 (#32931)

* Tidied up configure arguments to use special spack autotools features. (#32930)

* casper: old domain fell off, adding github repo (#32928)

* unifyfs: pin mercury version; add boost variant (#32911)

Mercury has a new version (v2.2) releasing soon that UnifyFS does not build with and hasn't been tested with. This pins UnifyFS to the last version of Mercury used/tested.

Add a variant to avoid building/using boost

Append -std=gnu99 to cflags if building with gcc@4. Needed for mochi-margo to compile

* trilinos: constrain superlu-dist version (#32889)

* trilinos: constrain superlu-dist version for 13.x
* syntax

* FEniCSx: Updates for 0.5.1 (#32665)

* Updates for DOLFINx 0.5.1 and associated packages
* xtensor needed on anything less than main
* Switch back to Python 3.7 minimum.
* Might be good to point out in our README how to fix Python version?
* Fix basix, xtensor dep
* Add numba feature
* Fix checksum
* Make slepc optional

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>

* simgrid: add variant and remove flag (#32797)

* simgrid: remove std c++11 flag
* simgrid: add msg variant

* Axom: bring in changes from axom repo (#32643)

* bring in changes from axom repo

Co-authored-by: white238 <white238@users.noreply.github.com>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>

* Add checksum for py-pyparsing 3.0.9 (#32952)

* rdma-core: fix syntax for external discoverability (#32962)

* Add checksum for py-flatbuffers 2.0.7 (#32955)

* amrex: add v22.10 (#32966)

* Remove CMakePackage.define alias from most packages (#32950)

* Bug fix for `ca-certificates-mozilla/package.py` to enable `spack install --source` (#32953)

* made suggested changes to iqtree2, py-dendropy, py-metaphlan, and py-pkgconfig. Poetry install still broken

* reverted py-pkgconfig deps to poetry-core

* made iqtree2 less redundant, changes to py-dendropy and py-pkgconfig deps

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>
Co-authored-by: Annop Wongwathanarat <annop.wongwathanarat@gmail.com>
Co-authored-by: Annop Wongwathanarat <annop.wongwathanarat@arm.com>
Co-authored-by: John W. Parent <45471568+johnwparent@users.noreply.github.com>
Co-authored-by: Auriane R <48684432+aurianer@users.noreply.github.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Kai Torben Ohlhus <k.ohlhus@gmail.com>
Co-authored-by: Vinícius <viniciusvgp@gmail.com>
Co-authored-by: Matthieu Dorier <mdorier@anl.gov>
Co-authored-by: renjithravindrankannath <94420380+renjithravindrankannath@users.noreply.github.com>
Co-authored-by: iarspider <iarspider@gmail.com>
Co-authored-by: Paul R. C. Kent <kentpr@ornl.gov>
Co-authored-by: Brian Van Essen <vanessen1@llnl.gov>
Co-authored-by: snehring <7978778+snehring@users.noreply.github.com>
Co-authored-by: Cameron Stanavige <stanavige1@llnl.gov>
Co-authored-by: Cody Balos <balos1@llnl.gov>
Co-authored-by: Jack S. Hale <mail@jackhale.co.uk>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
Co-authored-by: Lucas Nesi <lucas31nesi@hotmail.com>
Co-authored-by: Chris White <white238@llnl.gov>
Co-authored-by: white238 <white238@users.noreply.github.com>
Co-authored-by: Martin Pokorny <mpokorny@caltech.edu>
Co-authored-by: Weiqun Zhang <WeiqunZhang@lbl.gov>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Co-authored-by: Dom Heinzeller <dom.heinzeller@icloud.com>
2022-10-08 16:57:45 -06:00
Jen Herting
3146f9309c New package: py-qudida (#33115)
* [py-qudida] New package

* [@spackbot] updating style on behalf of qwertos

Co-authored-by: qwertos <qwertos@users.noreply.github.com>
2022-10-08 11:37:50 -06:00
Qian Jianhua
c1440aaa17 py-blis: fix environment settings (#32913) 2022-10-08 01:20:51 -05:00
iarspider
5ca32d82e4 Add checksum for py-jupyterlab-pygments 0.2.2 (#33081) 2022-10-07 22:06:15 -06:00
eugeneswalker
dfbeaff8ae remove outdated comments (#33123) 2022-10-07 19:47:57 -07:00
iarspider
8309ae08d1 Add checksum for py-jupyter-core 4.11.1 (#33086) 2022-10-07 16:06:11 -06:00
Sergey Kosukhin
4bc8f66388 autotools: extend patching of the libtool script (#30768)
* filter_file: introduce argument 'start_at'

* autotools: extend patching of the libtool script

* autotools: refactor _patch_usr_bin_file

* autotools: improve readability of the filtering

* autotools: keep the modification time of the configure scripts

* autotools: do not try to patch directories

* autotools: explain libtool patching for posterity

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2022-10-07 22:04:44 +00:00
iarspider
8a5790514d Add checksum for py-nbclient 0.6.7 (#33083) 2022-10-07 15:58:03 -06:00
eugeneswalker
01e7b89b53 e4s ci: add variorum (#33113) 2022-10-07 14:52:30 -07:00
eugeneswalker
041c1486f8 e4s ci: add h5bench (#33114) 2022-10-07 14:52:08 -07:00
Jen Herting
cbc224852b [brotli] added version 1.0.9 (#33107) 2022-10-07 15:33:07 -06:00
Jen Herting
5560bbf97e New package: py-pathml (#32566)
* [py-pathml Farber] Created package

* [pathml Farber] Dependencies added

* [py-pathml] Corrected dependency

* [py-pathml] Added types

* [py-pathml] depends on py-h5py

* [py-pathml] py-opencv-contrib-python -> opencv+python3+contrib

* [py-pathml] added spack import and fixed setuptools type

* [py-pathml] update import

* [py-pathml] opencv no longer has +contrib

* [@spackbot] updating style on behalf of qwertos

Co-authored-by: James A Zilberman <jazrc@rit.edu>
Co-authored-by: qwertos <qwertos@users.noreply.github.com>
2022-10-07 15:32:43 -06:00
Gregory Lee
b3cfcebf94 fixed and added mrnet versions (#33120) 2022-10-07 14:25:53 -06:00
iarspider
f3027fb561 Add checksum for py-onnxmltools 1.11.0 (#33104)
* Add checksum for py-onnxmltools 1.11.0

* Add checksum for py-onnxmltools 1.11.0

* Fix patch name

* Update var/spack/repos/builtin/packages/py-onnx-runtime/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-10-07 14:05:52 -05:00
iarspider
af4134dd48 Add checksum for py-pkginfo 1.8.3 (#33094)
* Add checksum for py-pkginfo 1.8.3

* Update var/spack/repos/builtin/packages/py-pkginfo/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-10-07 13:02:13 -06:00
iarspider
3270df735f Add checksum for py-markdown 3.4.1 (#33098)
* Add checksum for py-markdown 3.4.1

* Update package.py
2022-10-07 13:01:54 -06:00
Laurent Aphecetche
a2d7776c95 pythia6: set CMAKE_MACOSX_RPATH=True to build on macos (#33082) 2022-10-07 11:53:41 -07:00
iarspider
a4651a2a02 Add checksum for py-pathlib2 2.3.7.post1 (#33105)
* Add checksum for py-pathlib2 2.3.7.post1

* [@spackbot] updating style on behalf of iarspider

Co-authored-by: iarspider <iarspider@users.noreply.github.com>
2022-10-07 13:47:26 -05:00
iarspider
910cf7fe7b Add checksum for py-python-ldap 3.4.2 (#33106) 2022-10-07 13:46:09 -05:00
iarspider
fb0d8cd151 Add checksum for py-networkx 2.8.6, py-pygraphviz 1.10 (#33102)
* Add checksum for py-networkx 2.8.6, py-pygraphviz 1.10

* [@spackbot] updating style on behalf of iarspider

Co-authored-by: iarspider <iarspider@users.noreply.github.com>
2022-10-07 13:39:56 -05:00
iarspider
c5e3ec8b1f Add checksum for py-mplhep 0.3.26 (#33101) 2022-10-07 13:37:17 -05:00
iarspider
e81ecae3b5 Add checksum for py-more-itertools 8.14.0 (#33100) 2022-10-07 13:36:11 -05:00
iarspider
ce7461a783 Add checksum for py-lz4 4.0.2 (#33097) 2022-10-07 13:32:34 -05:00
iarspider
f141f806e9 Add checksum for py-llvmlite 0.38.1 (#33096) 2022-10-07 13:31:31 -05:00
iarspider
3968263bf6 Add checksum for py-jupyter-console 6.4.4 (#33088) 2022-10-07 12:54:58 -05:00
iarspider
9de9d2f65b Add checksum for py-jsonpickle 2.2.0 (#33087) 2022-10-07 12:53:44 -05:00
Adam J. Stewart
163242bd6e py-black: add v22.10.0 (#33077) 2022-10-07 10:33:04 -07:00
eugeneswalker
35bc158387 e4s ci: add quantum-espresso (#33075) 2022-10-07 08:38:10 -07:00
Auriane R
39fe4371fa Rename p2300 to stdexec (#33099)
* Rename p2300 package after renaming PR merged in stdexec

https://github.com/NVIDIA/stdexec/pull/622

* Adapt pika to depend on stdexec
2022-10-07 14:54:05 +02:00
Paul Kuberry
2346544e6f trilinos and xyce: fix fortran library handling (#33033)
* trilinos and xyce: fix fortran library handling

xyce:
  - add pymi_static_blas variant and logic
    handles blas and hdf5 conflicts for Xyce-PyMi
  - make +isorropia and +zoltan conditional on mpi

* xyce: clean up CMake options

* xyce: change pymi_static_blas to pymi_static_tpls

* xyce: made pymi_static_tpls only when +pymi
2022-10-07 06:47:26 -04:00
iarspider
8acb4da6aa Add checksum for py-tornado 6.2, py-pyzmq 24.0.1, py-jupyter-client 7.3.5 (#33062) 2022-10-07 00:09:56 -06:00
Laurent Aphecetche
837954729f geant3: new package (#33065)
* geant3: new package
* fix copyright
2022-10-06 22:26:22 -06:00
MatthewLieber
23f6c92f33 add compatibility for rocky8 and rhel8 (#33068) 2022-10-07 04:25:58 +00:00
Hans Fangohr
a661536eb6 oommf: update oommf package to version 2.0b0 (#33072)
- Official release mentioned on https://math.nist.gov/oommf/software-20.html
- Tested at https://github.com/fangohr/oommf-in-spack/pull/39/files
2022-10-06 18:30:27 -07:00
Laurent Aphecetche
080c37046f cppgsl: add version 4.0.0 (#33060) 2022-10-06 18:24:01 -07:00
Jonas Thies
f56ff16564 phist: add patch to resolve build issue #32111 with +fortran %oneapi (#32965)
* phist: add patch to resolve build issue #32111 with +fortran %oneapi

* phist: some lines of code were patched twice, make these patches orthogonal.
2022-10-06 16:16:48 -07:00
Massimiliano Culpo
1a12ddbd2d Add a warning on Python 2.7 deprecation (#33052)
Co-authored-by: alalazo <alalazo@users.noreply.github.com>
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2022-10-06 23:42:34 +02:00
snehring
f43887dd4e ncbi-toolkit: adding version 26_0_1 (#33061) 2022-10-06 12:26:25 -07:00
Sergey Kosukhin
a520a7ef28 tcl module template: automatically unload automatically loaded modules (#32853)
Remove the `module-info mode load` condition that prevents auto-unloading when autoloading is enabled. It looks like this condition was added to work around an environment-modules issue that no longer applies.

Add quotes to make is-loaded happy
2022-10-06 20:15:28 +02:00
iarspider
c3018f95ee Add checksum for py-executing 1.0.0 (#33059) 2022-10-06 10:25:54 -06:00
iarspider
164d5fc7a4 Add checksum for py-histogrammar 1.0.31 (#33045) 2022-10-06 09:52:38 -05:00
iarspider
38b50079e1 Add checksum for py-uhi 0.3.1 (#33043) 2022-10-06 09:51:28 -05:00
iarspider
e67a19cb36 Add checksum for py-tqdm 4.64.1 (#33046) 2022-10-06 09:50:24 -05:00
iarspider
7e1cb5414c Add checksum for py-zipp@3.8.1 (#33048) 2022-10-06 09:47:56 -05:00
iarspider
5c5de3e683 Update py stack data (#33051)
* Add checksum for py-traitlets 5.3.0

* Add checksum for py-stack-data 0.5.0

* Remove extra file
2022-10-06 09:41:16 -05:00
Laurent Aphecetche
ca0e023c43 geant4-vmc: unset MACOSX_DEPLOYMENT_TARGET (#32828) 2022-10-06 10:41:08 -04:00
Sergey Kosukhin
70a3868168 ncl: enable building with recent versions of hdf5 (#32842) 2022-10-06 16:20:55 +02:00
iarspider
f168a44fcb Add checksum for py-urllib3 1.26.12 (#33019)
* Add checksum for py-urllib3 1.26.12

* Update var/spack/repos/builtin/packages/py-urllib3/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Fixes from review

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-10-06 07:38:00 -06:00
Harmen Stoppels
c959d6c905 gmake: add 4.3.90 alpha release (#33044)
GNU Make 4.3.90 supports jobserver through fifo, so it's nice to add the
alpha release to Spack already.
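
With the fifo style, the jobserver moves from inherited pipe file descriptors to a named pipe advertised via MAKEFLAGS; a quick way to observe it (recipe and path illustrative):
```
# Makefile snippet: show what sub-processes inherit under make -j8
print-jobserver:
	@echo "$(MAKEFLAGS)"   # may show: -j8 --jobserver-auth=fifo:/tmp/GMfifo1234
```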
2022-10-06 04:34:00 -06:00
Manuela Kuhn
4cfe58651a py-docutils: add 0.19 (#32814) 2022-10-06 04:29:48 -06:00
eugeneswalker
a56cd8ffb6 e4s ci: add charliecloud (#32990) 2022-10-06 12:22:23 +02:00
Sreenivasa Murthy Kolam
5cf05b6515 mlirmiopen: update the SHA for 5.2.1 release (#33049) 2022-10-06 03:54:03 -06:00
Harmen Stoppels
fc5da74998 docs: fix deprecated use of install_tree (#33004) 2022-10-06 09:45:46 +00:00
iarspider
0bc23d8bdc Add checksum for py-markupsafe 2.1.1 (#33003) 2022-10-06 03:42:01 -06:00
iarspider
5fc0bef4fc Add checksum for py-avro 1.11.1 (#32979) 2022-10-06 03:26:07 -06:00
iarspider
0184f008f1 Add checksum for py-autopep8 1.7.0, py-pycodestyle 2.9.1 (#32978) 2022-10-06 03:22:06 -06:00
Mark W. Krentel
e8dcfcd7ae hpcviewer: add v2022.10 (#33041) 2022-10-06 03:18:04 -06:00
dependabot[bot]
bfd848089f build(deps): bump actions/checkout from 3.0.2 to 3.1.0 (#32998) 2022-10-06 02:58:13 -06:00
Wouter Deconinck
887dd3fcd9 dd4hep: add v1.23 (#32968)
Release notes at https://github.com/AIDASoft/DD4hep/releases/tag/v01-23
2022-10-06 10:57:09 +02:00
iarspider
96daaa359d Add checksum for py-soupsieve 2.3.2.post1 (#32980)
* Add checksum for py-soupsieve 2.3.2.post1

* Update package.py

* Update var/spack/repos/builtin/packages/py-soupsieve/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-10-06 02:53:58 -06:00
Jen Herting
1f5729644b New package: py-scanpy (#32565)
* [py-scanpy] New package

* [py-scanpy] Added types

* [py-scanpy] reworked dependencies

* [py-scanpy] flake8

* [py-scanpy] update import

* [py-scanpy] fixed minor issues

Co-authored-by: James A Zilberman <jazrc@rit.edu>
2022-10-06 10:27:37 +02:00
iarspider
aab5bcf05a Add checksum for py-histoprint 2.4.0 (#33017) 2022-10-06 02:22:00 -06:00
iarspider
0ae46519ba Add checksum for py-hepdata-lib 0.10.1, py-hepdata-validator 0.3.3 (#33018)
* Add checksum for py-hepdata-lib 0.10.1, py-hepdata-validator 0.3.3

* Update package.py

* Update package.py

* Update package.py

* Update package.py
2022-10-06 02:14:06 -06:00
Vanessasaurus
270a19504b flux-core: add v0.44.0 (#33039)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2022-10-06 02:10:11 -06:00
snehring
6de5f58026 dock: fix compilation with gcc10+, add dep (#33034) 2022-10-06 09:57:14 +02:00
Jean Luca Bez
33b1425add h5bench: update version (#33000)
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2022-10-06 09:48:09 +02:00
Jen Herting
071c323b95 New package: py-pprintpp (#30764)
* New package: py-pprintpp

* [py-pprintpp] added dependency on setuptools

Co-authored-by: Viv Eric Hafener <vehrc@sporcbuild.rc.rit.edu>
2022-10-06 09:46:32 +02:00
Andrew W Elble
28de7da0cc cuda: add v11.8.0 (#33027)
Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>
2022-10-06 09:46:01 +02:00
Vanessasaurus
791776cece flux-sched: add v0.25.0 (#33038)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2022-10-06 09:41:15 +02:00
Chris White
9ba102c2bb add blt 0.5.2 (#33029) 2022-10-06 00:02:04 -06:00
Jen Herting
c5d62294b2 New package: py-pyassimp (#30762)
* New package: py-pyassimp

* Cleaned up dependencies

* Fixed liked issue

* Fixed linked issue

* [py-pyassimp] depends on assimp

* [py-pyassimp] added dependency on py-setuptools

* [@spackbot] updating style on behalf of qwertos

Co-authored-by: Viv Eric Hafener <vehrc@sporcbuild.rc.rit.edu>
Co-authored-by: qwertos <qwertos@users.noreply.github.com>
2022-10-05 21:43:29 -05:00
Harmen Stoppels
189baa96db python: don't accidentally link to system db libs (#33007) 2022-10-05 18:22:01 -06:00
snehring
4b17d9b92e salmon: new version 1.9.0 (#33035) 2022-10-05 17:10:00 -06:00
Charles Ferenbaugh
21ca17a157 pfunit: fix @4: ~openmp (#33024) 2022-10-05 17:03:18 -06:00
iarspider
5b45ffb353 Add checksum for py-threadpoolctl 3.1.0 (#33012)
* Add checksum for py-threadpoolctl 3.1.0

* Update package.py
2022-10-05 16:54:04 -06:00
iarspider
143faeee0e Add checksum for py-distlib 0.3.6 (#33009)
* Add checksum for py-distlib 0.3.6

* Update package.py

* Update package.py

* Update package.py
2022-10-05 16:32:56 -06:00
dlkuehn
b4df99376d Package jsonnet: change sha256sum to sha256 for version hash (#33031)
Co-authored-by: David Kuehn <las_dkuehn@iastate.edu>
2022-10-05 14:40:59 -07:00
Daryl W. Grunau
b2c2958acc eospac: support version 6.5.5 (#33023)
Co-authored-by: Daryl W. Grunau <dwg@lanl.gov>
2022-10-05 14:08:27 -07:00
snehring
881571c4ba canu: new version 2.2 (#32999) 2022-10-05 15:06:08 -06:00
Glenn Johnson
aef6d4a40a add version 4.2.1 of R (#32993) 2022-10-05 13:40:45 -07:00
Adam J. Stewart
760294b402 py-matplotlib: add v3.6.0, new backends (#32697)
* py-matplotlib: add v3.6.0, new backends
* [@spackbot] updating style on behalf of adamjstewart
* Undo conditional variant hack
* Update dependencies

Co-authored-by: adamjstewart <adamjstewart@users.noreply.github.com>
2022-10-05 13:36:37 -07:00
Charles Ferenbaugh
417760e829 Fix wonton CMake config to not break host codes (#33026) 2022-10-05 14:06:12 -06:00
MichaelLaufer
e0b418f288 py-pyfr: add v1.15.0, explicitly set env variables for dependencies (#32964)
* Update to PyFR v1.15.0

* allow unsupported compilers for cuda

* update requirement for py-gimmik version

* update requirement for cuda version

* update versions, style fix

* lib directory changed, style fixes

* PYFR_METIS_LIBRARY_PATH set
2022-10-05 14:08:37 -05:00
eugeneswalker
1fe8387d23 e4s ci: add gotcha cpu (#32989) 2022-10-05 11:43:13 -07:00
Adam J. Stewart
d956da95c3 py-torch-nvidia-apex: ~cuda does not build (#32939) 2022-10-05 12:38:21 -05:00
Axel Huebl
4ed963dda1 OpenBLAS 0.3.21: w/o Fortran (#32398)
There is a new OpenBLAS release out that can be compiled w/o
a Fortran compiler.

macOS Xcode developers, rejoice. Maybe at some point Spack
becomes a package manager that can be used without using
another package manager (to get gfortran) 🎉

phist: add conflict on reference netlib-lapack due to API change in lapack.h

Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2022-10-05 11:30:09 -06:00
eugeneswalker
cefc1dc808 e4s ci: add caliper +cuda (#32992) 2022-10-05 10:25:18 -07:00
eugeneswalker
af02dcbd2e e4s ci: enable caliper cpu (#32988) 2022-10-05 10:25:04 -07:00
iarspider
1800684ac1 Add checksum for py-flit 3.7.1 (#33013) 2022-10-05 11:41:16 -05:00
eugeneswalker
82cfa68e69 e4s: add omega-h +cuda (#32156) 2022-10-05 09:36:09 -07:00
eugeneswalker
27074d5bec e4s ci: add unifyfs (#32991) 2022-10-05 09:28:19 -07:00
Wileam Y. Phan
9fe315b953 Make hwloc both CudaPackage and ROCmPackage (#31334)
* Make hwloc both CudaPackage and ROCmPackage

* Remove redundant variants
2022-10-05 09:45:47 -06:00
iarspider
592d97137a Add checksum for py-arrow 1.2.3 (#33005) 2022-10-05 10:38:58 -05:00
iarspider
9abee2e851 Add checksum for py-pydantic 1.10.2 (#33006) 2022-10-05 10:37:50 -05:00
iarspider
e5da747d6f Add checksum for py-rich 12.5.1 (#33008) 2022-10-05 10:29:30 -05:00
Brian Van Essen
f463666f0e Add aws-ofi-nccl library (#32906)
* Added a package for the aws-ofi-nccl plug-in to enable NCCL to use the
libfabric communication library as a network provider.

Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2022-10-05 14:08:32 +02:00
Jordan Galby
6c12630e95 Optimize concurrent misc_cache provider index rebuild (#32874)
When concurrent misc_cache provider index rebuilds happen, rebuild the index
only once, so we don't exceed the misc_cache lock timeout.

For example, when using `spack env depfile` with no previous misc_cache,
running `make -f depfile -j8` could run up to 8 concurrent
`spack install` processes, each locking misc_cache to rebuild the provider
index. If one rebuild takes 30s, before this fix the "worst" lock could wait
up to 30s * 7, easily exceeding the misc_cache lock timeout. Now the "worst"
lock takes 30s * 1 + ~1s * 6 (see the sketch below).
2022-10-05 06:01:59 -06:00
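The change is essentially double-checked work under a file lock: every process still takes the lock, but only the first one pays the rebuild cost. A minimal POSIX-only Python sketch of the pattern; the file name and `ensure_index`/`rebuild` helpers are hypothetical illustrations, not Spack's actual misc_cache API.

```python
import fcntl
import os

INDEX = "provider_index.json"

def ensure_index(rebuild):
    """Rebuild the provider index at most once across concurrent processes."""
    with open(INDEX + ".lock", "w") as lock_file:
        fcntl.flock(lock_file, fcntl.LOCK_EX)  # serializes concurrent installs
        try:
            # Double-check after acquiring the lock: a process that waited
            # here finds the index already rebuilt and returns almost
            # immediately, which turns 30s * 7 of waiting into roughly ~1s * 6.
            if not os.path.exists(INDEX):
                rebuild(INDEX)
        finally:
            fcntl.flock(lock_file, fcntl.LOCK_UN)
```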
Chris MacMackin
53cea629b7 autotools: Filter libtool when building with dpcpp (#32876)
Due to a [known
issue](https://community.intel.com/t5/Intel-oneAPI-Data-Parallel-C/dpcpp-and-GNU-Autotools/m-p/1296985)
with dpcpp, autotool-based builds that try to use it will fail because
they try to link against temporary files that no longer exist. This
commit filters those files out of the libtool script so that linking
can work properly.
2022-10-05 11:57:37 +00:00
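A rough Python sketch of the filtering step; the regular expression is an assumption for illustration, since the commit does not show the exact temporary-file pattern dpcpp records in the generated libtool script.

```python
import re

def filter_libtool(path="libtool"):
    """Drop references to dpcpp's already-deleted temporary files."""
    with open(path) as f:
        text = f.read()
    # Remove tokens pointing at temporary object files under /tmp so the
    # recorded link lines no longer mention files that no longer exist.
    cleaned = re.sub(r"\S*/tmp/\S+\.o\b", "", text)
    with open(path, "w") as f:
        f.write(cleaned)
```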
iarspider
cfea21319f Add checksums for Rivet 3.1.7 and YODA 1.9.7 and fastjet 3.4.0 (#32917)
* Add checksums for Rivet 3.1.7 and YODA 1.9.7

* Add checksum for fastjet 3.4.0

* Add v3.1.7b...

... in which the version requirement for autoconf was lowered to 2.68
2022-10-05 12:54:20 +02:00
Adam J. Stewart
e2b5179060 py-torchmetrics: add v0.10.0 (#32997) 2022-10-05 04:45:59 -06:00
Auriane R
4e5ea86b20 Add pika 0.9.0 (#33010)
* Reorder dependencies in alphabetical order + fix typo
2022-10-05 03:33:54 -06:00
Annop Wongwathanarat
e69c8338a4 armpl-gcc: Add version 22.1 (#32914)
* armpl-gcc: Add version 22.1

* fixed url, installation script name, and conflicts

Co-authored-by: Annop Wongwathanarat <annop.wongwathanarat@arm.com>
2022-10-05 11:02:56 +02:00
Brian Van Essen
15e0e15c90 Added a new version (#33002) 2022-10-05 10:29:36 +02:00
Paul R. C. Kent
316b620a05 llvm: add 15.0.1, 15.0.2 (#32983)
* llvm: 15.0.1, 15.0.2

* Requested format change
2022-10-04 21:54:04 -06:00
Richard Berger
153206be00 flecsi: upcoming release will no longer need lanl-cmake-modules (#32986) 2022-10-04 17:44:58 -07:00
snehring
f677855e7d pbbam: switching to meson, adding version 2.1.0 (#32996) 2022-10-04 18:38:08 -06:00
Sam Grayson
d1fe67b0bc nix: Fix #32994 (#32995)
* Fix nix
* Check value in attribute
2022-10-04 18:37:47 -06:00
SXS Bot
fd911e7b2e spectre: add v2022.10.04 (#32987)
Co-authored-by: sxs-bot <sxs-bot@users.noreply.github.com>
2022-10-04 14:22:21 -06:00
Greg Becker
5f59821433 BuildEnvironment: accumulate module changes to poke to all relevant modules (#32340)
Currently, module changes from `setup_dependent_package` are applied only to the module of the package class, but not to any parent classes' modules between the package class module and `spack.package_base`.

In this PR, we create a custom class to accumulate module changes, and apply those changes to each class that requires it. This design allows us to code for a single module, while applying the changes to multiple modules as needed under the hood, without requiring the user to reason about package inheritance.
2022-10-04 13:06:50 -07:00
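A minimal sketch of the accumulate-and-replay idea; the class and method names here are illustrative assumptions, not necessarily those used in the PR.

```python
class ModuleChangePropagator:
    """Record module-level assignments once; replay them onto many modules."""

    def __init__(self, modules):
        # E.g. the package's own module plus each parent class's module,
        # up to (but not including) spack.package_base.
        self.modules = list(modules)
        self.changes = {}

    def set(self, name, value):
        self.changes[name] = value

    def propagate(self):
        for module in self.modules:
            for name, value in self.changes.items():
                setattr(module, name, value)
```

A `setup_dependent_package` implementation then writes to what looks like a single module, and the replay onto the parent classes' modules happens under the hood.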
eugeneswalker
b57d24a5e1 e4s: add vtk-m +rocm (#32151) 2022-10-04 12:28:24 -07:00
Adam J. Stewart
8e1de16193 py-tensorflow-probability: add v0.18.0 (#32941) 2022-10-04 13:50:25 -05:00
Adam J. Stewart
7fdfdea0c7 py-tensorflow-estimator: add v2.10 (#32940)
* py-tensorflow-estimator: add v2.10

* Typo fix
2022-10-04 13:48:11 -05:00
eugeneswalker
4bd537574b omega-h: add v9.34.13 (#32963) 2022-10-04 12:05:52 -06:00
Todd Gamblin
8c50b44bfe find/list: display package counts last (#32946)
* find/list: display package counts last

We have over 6,600 packages now, and `spack list` still displays the number of packages
before it lists them all. This is useless for large sets of results (e.g., with no args)
as the number has scrolled way off the screen before you can see it. The same is true
for `spack find` with large installations.

This PR changes `spack find` and `spack list` so that they display the package count
last.

* add some quick testing

Co-authored-by: Danny McClanahan <1305167+cosmicexplorer@users.noreply.github.com>
2022-10-04 10:56:46 -07:00
Adam J. Stewart
450a3074e2 Fix typo in documentation (#32984) 2022-10-04 19:54:40 +02:00
iarspider
7f2e204e20 Add checksum for py-pip 22.2.2 (#32877) 2022-10-04 11:26:17 -06:00
iarspider
bf7512e25a Add checksum for py-pysympy 1.11.1 (#32972) 2022-10-04 12:23:14 -05:00
iarspider
aaf6b1fd29 Add checksum for py-absl-py 1.2.0 (#32973) 2022-10-04 12:22:12 -05:00
iarspider
30c2d2765c Add checksum for py-grpcio 1.48.1 (#32974) 2022-10-04 12:21:14 -05:00
Cody Balos
c03979c74b superlu-dist: add 8.1.0 and 8.0.0 versions (#32558) 2022-10-04 07:25:53 -06:00
Dom Heinzeller
8af1802bd9 Bug fix for ca-certificates-mozilla/package.py to enable spack install --source (#32953) 2022-10-04 04:29:48 -06:00
Massimiliano Culpo
abbdf24083 Remove CMakePackage.define alias from most packages (#32950) 2022-10-04 10:58:58 +02:00
Weiqun Zhang
93cd84c922 amrex: add v22.10 (#32966) 2022-10-03 22:01:49 -06:00
iarspider
7daab390b3 Add checksum for py-flatbuffers 2.0.7 (#32955) 2022-10-03 17:53:47 -07:00
Martin Pokorny
916b21bfb4 rdma-core: fix syntax for external discoverability (#32962) 2022-10-03 17:45:46 -06:00
iarspider
977c89cee1 Add checksum for py-pyparsing 3.0.9 (#32952) 2022-10-03 14:57:56 -06:00
Chris White
9225f4f27f Axom: bring in changes from axom repo (#32643)
* bring in changes from axom repo

Co-authored-by: white238 <white238@users.noreply.github.com>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2022-10-03 14:02:46 -06:00
Lucas Nesi
ffc40a0fdb simgrid: add variant and remove flag (#32797)
* simgrid: remove std c++11 flag
* simgrid: add msg variant
2022-10-03 13:53:59 -06:00
Jack S. Hale
241b4624bc FEniCSx: Updates for 0.5.1 (#32665)
* Updates for DOLFINx 0.5.1 and associated packages
* xtensor needed on anything less than main
* Switch back to Python 3.7 minimum.
* Might be good to point out in our README how to fix Python version?
* Fix basix, xtensor dep
* Add numba feature
* Fix checksum
* Make slepc optional

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2022-10-03 12:49:04 -07:00
Cody Balos
a51a81655a trilinos: constrain superlu-dist version (#32889)
* trilinos: constrain superlu-dist version for 13.x
* syntax
2022-10-03 12:24:21 -07:00
Cameron Stanavige
db1e32623f unifyfs: pin mercury version; add boost variant (#32911)
Mercury has a new version (v2.2) releasing soon that UnifyFS does not build with and has not been tested against. This pins UnifyFS to the last version of Mercury used and tested.

Add a variant to avoid building/using Boost.

Append -std=gnu99 to cflags if building with gcc@4; needed for mochi-margo to compile (see the sketch below).
2022-10-03 11:51:05 -07:00
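In package terms the three changes might look roughly like the sketch below; the exact Mercury pin and variant wording are assumptions, while `flag_handler` is Spack's standard hook for appending compiler flags.

```python
from spack.package import *


class Unifyfs(AutotoolsPackage):
    """Sketch of the pin/variant/flag changes described above."""

    variant("boost", default=True, description="Build with Boost")

    depends_on("mercury@:2.1")  # stay below the untested 2.2 release
    depends_on("boost", when="+boost")

    def flag_handler(self, name, flags):
        # mochi-margo needs C99 extensions when compiling with gcc 4.x.
        if name == "cflags" and self.spec.satisfies("%gcc@4"):
            flags.append("-std=gnu99")
        return (flags, None, None)
```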
snehring
6e86daf470 casper: old domain fell off, adding github repo (#32928) 2022-10-03 11:42:03 -07:00
Brian Van Essen
25c1ef1e57 Tidied up configure arguments to use special spack autotools features. (#32930) 2022-10-03 11:05:21 -07:00
Paul R. C. Kent
8e60b3932c QMCPACK: add @3.15.0 (#32931) 2022-10-03 19:28:31 +02:00
iarspider
8227a221e6 Add checksum for py-tomli 2.0.1 (#32949) 2022-10-03 10:45:55 -06:00
Auriane R
941eb8d297 Replace repo with the NVIDIA one (#32951) 2022-10-03 09:33:00 -07:00
Adam J. Stewart
e89b79d074 py-einops: add v0.5.0 (#32959) 2022-10-03 10:29:50 -06:00
iarspider
0fee3095e1 Add checksum for py-pytest-runner 6.0.0 (#32957) 2022-10-03 10:21:51 -06:00
renjithravindrankannath
52e538e1d9 rvs binary path updated for 5.2 rocm release (#32892) 2022-10-03 05:13:52 -06:00
Matthieu Dorier
d4f05b0362 [rocksdb] Added rtti variant (#32918) 2022-10-03 12:55:18 +02:00
Vinícius
9728ddb0cd simgrid new releases (#32920) 2022-10-03 12:51:11 +02:00
Kai Torben Ohlhus
4924a9a28d octave: add version 7.2.0 (#32943) 2022-10-03 12:38:57 +02:00
Adam J. Stewart
4fb99912f1 libtiff: default to +zlib+jpeg (#32945) 2022-10-03 12:37:46 +02:00
Auriane R
7ee8fd5926 Set CMAKE_HIP_ARCHITECTURES with the value of amdgpu_target (#32901) 2022-10-03 10:22:25 +02:00
John W. Parent
5a0f4970df Fixup errors introduced by Clingo Pr: (#32905)
* re2c depends on cmake on Windows
* Winbison properly added to bootstrap package search list
2022-10-02 17:44:05 -07:00
Annop Wongwathanarat
fa7407093e acfl: add v22.1 (#32915)
Co-authored-by: Annop Wongwathanarat <annop.wongwathanarat@arm.com>
2022-10-01 09:23:16 +02:00
Manuela Kuhn
1596191b27 py-jsonschema: add 4.16.0 and new package py-hatch-fancy-pypi-readme (#32929) 2022-09-30 20:57:45 -06:00
Greg Becker
deae0c48e4 develop: canonicalize dev paths and base relative paths on env.path (#30075)
Allow environment variables and spack-specific path substitution variables (e.g. `$spack`) to be
used in the paths associated with develop specs, while maintaining the ability to keep those
paths relative to the environment rather than the working directory.
2022-09-30 20:56:53 +00:00
Manuela Kuhn
a5a16ed5b3 py-jinja2: add 3.1.2 (#32925) 2022-09-30 13:41:48 -06:00
Manuela Kuhn
c4429ad6ed py-ipykernel: add 6.16.0 (#32923) 2022-09-30 12:38:02 -06:00
Manuela Kuhn
918de81f5a py-joblib: add 1.2.0 (#32926) 2022-09-30 12:29:53 -06:00
Manuela Kuhn
8b2ed09832 py-json5: add 0.9.10 (#32927) 2022-09-30 12:09:51 -06:00
Manuela Kuhn
4f5be6c17e py-jeepney: add 0.8.0 (#32924) 2022-09-30 10:33:57 -06:00
Massimiliano Culpo
cf0c4523f3 qmcpack: set python3 executable explicitly (#32900) 2022-09-30 09:43:15 +02:00
Daniel De Lucca
e31f8da021 New Package: Navi (#32895)
* feat: adds navi
* Update var/spack/repos/builtin/packages/navi/package.py

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2022-09-30 00:53:53 -06:00
liuyangzhuan
2abbcaa49c add butterflypack 2.2.1 (#32909) 2022-09-29 19:38:02 -06:00
Stephen McDowell
24f9ec7dc0 Update ecp-data-vis-sdk homepage and maintainers (#32886) 2022-09-29 17:33:52 -06:00
G-Ragghianti
d417d49690 Package updates for slate, lapackpp, and blaspp (#32907)
* Added new versions
* New slate version
* Adding GPU support for lapackpp package
* Modified dependency on lapackpp
* Added rocblas and rocsolver to deps
* Testing with custom lapackpp repo
* Added chaining depends_on for +rocm
* Removing testing repo
2022-09-29 17:09:55 -06:00
Stephen Sachs
bb510c7979 [lammps] Add +intel and +user-intel options (#32898)
Going ahead and adding this option individually since https://github.com/spack/spack/pull/25015 needs some more time.

Co-authored-by: Stephen Sachs <stesachs@amazon.com>
2022-09-29 16:42:08 -06:00
Sergey Kosukhin
56a2cfd19d claw: enable building with oneapi and nag+gcc@10: (#32897)
* claw: enable building when '%oneapi'
* claw: enable building when '%nag' is mixed with 'gcc@10:'
2022-09-29 13:32:10 -07:00
Manuela Kuhn
4b7836f822 py-ipython: add 8.5.0 and py-win-unicode-console: new package (#32903)
* py-ipython: add 8.5.0 and py-win-unicode-console: new package

* Fix style

* Fix dependency version

* Deprecate version 2.3.1 and 3.1.0

* Always skip IPython.kernel

* Move skip_module definition to top
2022-09-29 13:46:18 -06:00
kwryankrattiger
a01c36da45 Install: Add use-buildcache option to install (#32537)
Install: Add use-buildcache option to install

* Allow differentiating between top level packages and dependencies when
determining whether to install from the cache or not.

* Add unit test for --use-buildcache

* Use metavar to display use-buildcache options.

* Update spack-completion
2022-09-29 13:48:06 -05:00
Brian Van Essen
7a25f416b8 Added new versions of NCCL (#32891) 2022-09-29 11:54:07 -06:00
Brian Van Essen
400a9f3df7 Add aws ofi rccl (#32773)
* Added a package for the aws-ofi-rccl plug-in from the ROCm software
stack.  It allows RCCL to use the libfabric communication library.

Added support for using libfabric in Aluminum.

* Updated the run environment so that the plugin would get loaded.

* Added support for setting up the LD_LIBRARY_PATH for dependent packages.

* Added package for RCCL tests to assess the impact of OFI libfabric RCCL plug-in.
2022-09-29 10:46:27 -07:00
iarspider
699f575976 Add tag for xgboost 1.6.2 (#32896)
* Add tag for xgboost 1.6.2

* Update py-xgboost as well

* Update package.py

* Update var/spack/repos/builtin/packages/py-xgboost/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-09-29 11:10:22 -06:00
Manuela Kuhn
2ea7703d69 py-itsdangerous: add 2.1.2 (#32904) 2022-09-29 10:54:27 -06:00
Stephen McDowell
679cd4b60d Enable adios2+cuda for ecp-data-vis-sdk +cuda (#32885) 2022-09-29 09:49:58 -05:00
Manuela Kuhn
22054403e8 py-hatchling: add 1.10.0 and py-pathspec: add 0.10.1 (#32865) 2022-09-29 08:16:11 -06:00
Mikael Simberg
a78462988b Add patch to link pthread library for llvm 15 (#32838) 2022-09-29 06:41:59 -07:00
Manuela Kuhn
a0147a1f07 py-importlib-metadata-4.12.0 (#32872) 2022-09-29 07:04:42 -06:00
Sergey Kosukhin
19faeab84d intel-oneapi-compilers-classic: extend setup_run_environment with the one from intel-oneapi-compilers (#32854)
* intel-oneapi-compilers-classic: refactor setup_run_environment

* intel-oneapi-compilers-classic: extend setup_run_environment with the one from intel-oneapi-compilers
2022-09-29 07:58:05 -04:00
Annop Wongwathanarat
42a230eef1 Rename arm to acfl, add v22.0.2 (#32518)
Co-authored-by: Annop Wongwathanarat <annop.wongwathanarat@arm.com>
2022-09-29 13:49:39 +02:00
Manuela Kuhn
8d14f16246 py-imageio: add 2.22.0 (#32870) 2022-09-29 05:29:48 -06:00
Manuela Kuhn
dfecbbeeee py-idna: add 3.4 (#32869) 2022-09-29 04:41:55 -06:00
Manuela Kuhn
1b343434c3 py-interface-meta: add 1.3.0 (incl new dependency packages) (#32873)
* py-interface-meta: add 1.3.0 (incl new dependency packages)

* Update var/spack/repos/builtin/packages/py-poetry-dynamic-versioning/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-09-29 04:22:41 -06:00
Manuela Kuhn
d9cab51fd7 py-formulaic: add 0.5.2 (#32851) 2022-09-29 04:22:18 -06:00
Manuela Kuhn
3f1ebfd4fb py-fracridge: add 2.0 (#32852) 2022-09-29 04:17:59 -06:00
Howard Pritchard
eab148288a superlu:oneapi-deal with non ISO C99 compliance (#32685)
superlu: oneapi - deal with non-ISO C99 compliance in the package.

The Intel OneAPI compilers are based on LLVM 14.  A recent enhancement to LLVM -
  https://reviews.llvm.org/D122983
results in superlu-dist not compiling because of some non-ISO C99 compliant code.

A workaround is to use an llvm compile line option noted in the above URL.

Signed-off-by: Howard Pritchard <howardp@lanl.gov>
2022-09-29 03:58:03 -06:00
Laurent Aphecetche
17d9960424 gl2ps: set CMAKE_MACOSX_RPATH=True to build (#32777) 2022-09-29 03:50:08 -06:00
Lucas Nesi
b37a1ec12b fxt: add static variant (#32794) 2022-09-29 03:46:05 -06:00
Matthieu Dorier
9515998deb [py-deephyper] Added py-deephyper package (#32848) 2022-09-29 03:38:19 -06:00
Manuela Kuhn
2d0ff51185 py-identify: add 2.5.5 (#32868) 2022-09-29 03:34:32 -06:00
iarspider
7b365f4c43 clhep: add v2.4.5.3 and cms-specific patch (#32757) 2022-09-29 11:26:16 +02:00
Manuela Kuhn
7d50fd3b8e py-imagesize-1.4.1 (#32871) 2022-09-29 03:14:18 -06:00
Rob Jones
0e95f580a0 MGCFD-OP2: add new package (#32529) 2022-09-29 11:11:53 +02:00
Manuela Kuhn
a70897c356 py-flake8: add 5.0.4 (#32867) 2022-09-29 03:02:37 -06:00
Manuela Kuhn
6fe89a4220 py-humanize: add 4.4.0 (#32866) 2022-09-29 03:02:14 -06:00
Adam J. Stewart
bc039524da py-torch: fix +rocm+nccl build (#32771) 2022-09-29 11:01:32 +02:00
Sarah Osborn
77afad229c hypre: add Umpire variant (#32884)
* hypre: Add umpire variant
* hypre: Fix to umpire / GPU target propagation
2022-09-29 02:42:00 -06:00
Hans Fangohr
2214f34380 anaconda3: add v2021.11 and v2022.05 (#32882) 2022-09-29 09:07:16 +02:00
Paul Kuberry
59415e6fc8 hwloc: update url_for_version (#32735) 2022-09-29 09:05:18 +02:00
Paul Kuberry
31d54da198 trilinos: update url (#32887) 2022-09-28 23:30:07 -06:00
Manuela Kuhn
d6d40919b2 py-greenlet: add 1.1.3 (#32864) 2022-09-28 22:57:59 -06:00
Manuela Kuhn
e7b0ef719d py-fonttools: add 4.37.3 (#32850) 2022-09-28 22:05:50 -06:00
Sergey Kosukhin
266453ce24 serialbox: enable building with NAG 7.1 (#32883) 2022-09-28 15:57:51 -06:00
Sergey Kosukhin
81f9a5b732 serialbox: patch to add missing include directives (#32880) 2022-09-28 10:33:59 -06:00
Manuela Kuhn
25a75ff9bf py-filelock: add 3.8.0 and py-setuptools-scm: add 7.0.5 (#32846) 2022-09-28 10:09:59 -06:00
John W. Parent
650a668a9d Windows: Support for Clingo and dependencies (#30690)
Make it possible to install the Clingo package on Windows; this
also provides a means to use Clingo with Spack on Windows.

This includes

* A new "winbison" package: Windows has a port of bison and flex where
  the two packages are grouped together. Clingo dependencies have been
  updated to use winbison on Windows and bison elsewhere (this avoids
  complicating the existing bison/flex packages until we can add support
  for implied virtuals).
* The CMake build system was incorrectly converting CMAKE_INSTALL_PREFIX
  to POSIX format.
* The re2c package has been modified to use CMake on Windows; for now
  this is done by overloading the configure/build/install methods to
  perform CMake-appropriate operations; the package should be refactored
  once support for multiple build systems in one Package is available.
2022-09-28 09:54:00 -06:00
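The re2c workaround, driving CMake from the generic phases until multi-build-system packages are supported, might look roughly like this sketch (the options and structure are illustrative, not the verbatim package):

```python
from spack.package import *


class Re2c(Package):
    """Sketch: run CMake from the generic install() phase on Windows."""

    def install(self, spec, prefix):
        cmake = which("cmake")
        cmake("-S", ".", "-B", "build",
              "-DCMAKE_INSTALL_PREFIX={0}".format(prefix))
        cmake("--build", "build", "--config", "Release")
        cmake("--install", "build")
```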
iarspider
db2565cb53 Add checksum for py-ipywidgets 8.0.2, py-jupyterlab-widgets 3.0.3 (#32847)
* Add checksum for py-ipywidgets 8.0.2, py-jupyterlab-widgets 3.0.3, add py-jupyterpackaging10

* Apply changes from review

* Fix
2022-09-28 03:45:53 -06:00
Massimiliano Culpo
63dca0c6cc Remove mentions of "best-effort" matrix expansion in the docs (#32755)
closes #20340
2022-09-28 09:15:36 +02:00
Adam J. Stewart
51feed16db py-sphinx: add v5.2.0 (#32804)
* py-sphinx: add v5.2.0
* py-colorama: add v0.4.5
* 3.6 still supported
2022-09-27 21:13:49 -06:00
Vicente Bolea
af1e62b0ac vtkm: add v1.9.0-rc1, keep v1.8.0 preferred (#32858) 2022-09-27 17:41:48 -06:00
eugeneswalker
87b014ed13 e4s: use ubuntu 20.04 image and %gcc@9.4.0 (#32795) 2022-09-27 16:25:56 -06:00
Tom Scogland
d0136899a2 fix cmake cache package handling for hip workaround 2022-09-27 14:21:59 -07:00
Adam J. Stewart
90b86a2266 py-torch-geometric: add v2.1.0 (#32822)
* py-torch-geometric: add v2.1.0

* black

* Update homepage

* Add missing sklearn dep
2022-09-27 15:47:45 -05:00
Alex Hornburg
67717c569e shell.py: fix undefined variable in csh deactivate (#32816)
This commit fixes #27027.

The root cause of the issue is that the `SPACK_OLD_PROMPT` variable
was evaluated in string interpolation regardless of whether the
guard condition above evaluates to true or false. This commit uses
the `eval` keyword to defer evaluation until the command is executed.

Co-authored-by: Alexander Hornburg <alexande@xilinx.com>
2022-09-27 12:32:42 -07:00
snehring
cb9f174a7f zziplib: adding some missing build deps (#32833)
* zziplib: adding some missing build deps
2022-09-27 10:46:01 -06:00
Philipp Edelmann
f0ec6a994c pgplot: install rbg.txt and change PGPLOT_DIR (#32775)
The file rgb.txt is needed by many PGPLOT applications, such as the MESA
stellar evolution code. This change installs the file to the PGPLOT_DIR,
where the library expects it.

PGPLOT_DIR was previously set to the prefix itself, which is an odd
place to install rgb.txt. This commit changes it to lib/pgplot5,
following the convention used by Debian.

Co-authored-by: Philipp Edelmann <edelmann@fs.tum.de>
2022-09-27 09:44:19 -07:00
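In package terms the change amounts to installing the data file under lib/pgplot5 and pointing PGPLOT_DIR there; a hedged sketch, with the surrounding build logic elided:

```python
from spack.package import *


class Pgplot(MakefilePackage):
    """Sketch of the rgb.txt installation and PGPLOT_DIR change."""

    def install(self, spec, prefix):
        # ... build and library installation elided ...
        mkdirp(prefix.lib.pgplot5)
        install("rgb.txt", prefix.lib.pgplot5)

    def setup_run_environment(self, env):
        # Applications such as MESA look for rgb.txt in PGPLOT_DIR, which
        # now points at lib/pgplot5 instead of the bare prefix.
        env.set("PGPLOT_DIR", self.prefix.lib.pgplot5)
```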
Laurent Aphecetche
384ff70b0d root: add arrow variant (#32837) 2022-09-27 09:26:22 -07:00
Michael Kuhn
9c1d6da111 strace: add 5.19 (#32844) 2022-09-27 08:45:28 -07:00
Manuela Kuhn
0c49ee939b py-executing: add 1.1.0 (#32840) 2022-09-27 10:42:58 -05:00
Manuela Kuhn
086dc66653 py-exifread: add 3.0.0 (#32841) 2022-09-27 10:41:59 -05:00
Manuela Kuhn
608d20446d py-fasteners: add 0.18 (#32843) 2022-09-27 10:39:08 -05:00
Manuela Kuhn
b7a43bf515 py-fastjsonschema: add 2.16.2 (#32845) 2022-09-27 10:38:02 -05:00
Stefano Bertone
ae627150b2 cgal: depends on for older versions (#32802)
* cgal: depends on for older versions
* Update package.py

Co-authored-by: steo85it <steo85it@users.noreply.github.com>
2022-09-27 08:32:16 -07:00
iarspider
9f271fa388 hwloc: replace 'shared' variant with 'libs' (#32093)
* hwloc: replace 'shared' variant with 'libs'
* PEP-8
* Fix gpi-space
2022-09-27 05:53:57 -06:00
Adam J. Stewart
43ceff4193 py-libclang: add v14 (#32824) 2022-09-26 17:21:52 -06:00
Chris White
777271da18 improve lexing error (#32832) 2022-09-26 22:47:45 +00:00
Pak Lui
de9fc038f7 add ROCmPackage to OSU Micro Benchmarks (#32806) 2022-09-26 14:26:58 -07:00
iarspider
b192e3492c Add checksum for py-nbconvert 7.0.0, py-mistune 2.0.4 (#32817) 2022-09-26 14:02:06 -06:00
Jean Luca Bez
f64ca7bc6a hdf5-vol-async: fix path needed for h5bench (#32805)
* fix path needed for h5bench
* Update var/spack/repos/builtin/packages/hdf5-vol-async/package.py

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2022-09-26 12:21:45 -06:00
Laurent Aphecetche
a077b2b0ee vmc: add versions 1-1-p1 and 2-0 (#32703)
* vmc: add versions 1-1-p1 and 2-0
* Update var/spack/repos/builtin/packages/vmc/package.py

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2022-09-26 11:02:10 -07:00
Valentin Volkl
b6be1f3520 hoppet: fix typo in hep tag (#32820) 2022-09-26 11:26:05 -06:00
Manuela Kuhn
94435778ba py-datalad-metalad: add 0.4.5 and new package py-datalad-metadata-model (#32811)
* py-datalad-metalad: add 0.4.5

* Update var/spack/repos/builtin/packages/py-datalad-metalad/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-09-26 11:25:43 -06:00
Manuela Kuhn
f8e23b4fbe py-cryptography: add 38.0.1 (#32809)
* py-cryptography: add 38.0.1

* Fix rust dependency
2022-09-26 10:38:01 -06:00
Alberto Invernizzi
b1fda5dd64 add neovim version 0.7.2 (#32815) 2022-09-26 09:10:38 -07:00
akhursev
5fbbc5fa68 2022.3 oneAPI release promotion (#32786)
In addition to the new release, made intel-oneapi-compilers-classic version number match the compiler version number, instead of the version of the package that contains it.

Co-authored-by: Robert Cohn <robert.s.cohn@intel.com>
2022-09-26 10:01:59 -06:00
Seth R. Johnson
494946d020 celeritas: new versions 0.1.2,3 (#32803) 2022-09-26 08:50:14 -07:00
Manuela Kuhn
7c5a417bb3 py-bottleneck: add 1.3.5 and py-versioneer: add 0.26 (#32780)
* py-bottleneck: add 1.3.5

* Fix sha256 for version 0.18

* Fix python dependency
2022-09-26 10:47:37 -05:00
Manuela Kuhn
beab39eb3c py-coverage: add 6.4.4 (#32789)
* py-coverage: add 6.4.4

* Fix python version in +toml variant
2022-09-26 10:46:23 -05:00
Manuela Kuhn
100eb5014a py-datalad-container: add 1.1.7 (#32810) 2022-09-26 10:42:25 -05:00
Laurent Aphecetche
a5bf7f458d root: make X11 really optional on macOS (#32661)
* root: make X11 really optional on macOS
* Update var/spack/repos/builtin/packages/root/package.py
* remove when clauses in provides

Co-authored-by: Hadrien G. <knights_of_ni@gmx.com>
2022-09-26 08:37:21 -07:00
Manuela Kuhn
b961cfa8d6 py-debugpy: add 1.6.3 (#32812) 2022-09-26 10:35:59 -05:00
Manuela Kuhn
07157306b2 py-distro: add 1.7.0 (#32813) 2022-09-26 10:34:43 -05:00
Laurent Aphecetche
3aa93ca79d arrow: add version 9 and more variants (#32701)
* utf8proc: add version 2.7.0 and shared variant
* xsimd: add version 8.1.0
* arrow: add version 9.0.0 and more variants
2022-09-26 08:30:24 -07:00
Adam J. Stewart
b7926eb6c8 py-py2cairo: only supports Python 2 (#32818) 2022-09-26 17:14:20 +02:00
Adam J. Stewart
1f6545c1c7 gcc: add Apple Silicon support for newer versions (#32702) 2022-09-26 16:21:25 +02:00
Auriane R
677a862fb9 Adapt pika to use the p2300 spack package (#32667)
* Adapt pika to use the p2300 spack package

* Fix formatting with black
2022-09-26 15:07:26 +02:00
Massimiliano Culpo
93dc500f41 py-pkgutil-resolve-name: fix issue with UTF-8 character
This modification breaks `develop` since it doesn't
pass audits with Python 2.7. It remains to be investigated
why the audits pass in CI for the PR but the issue is
revealed only when the package is pushed to develop.
2022-09-26 14:47:13 +02:00
Massimiliano Culpo
ecce657509 Remove "tut" from the "build-systems" pipeline
PR #32615 deprecated Python versions up to 3.6.X. Since
the "build-systems" pipeline requires Python 3.6.15 to
build "tut", it will fail on the first rebuild that
involves Python.

The "tut" package is meant to perform an end-to-end
test of the "Waf" build-system, which is scarcely
used. The fix therefore is just to remove it from
the pipeline.
2022-09-26 14:47:13 +02:00
John W. Parent
30f6fd8dc0 Fetching/decompressing: use magic numbers (#31589)
Spack currently depends on parsing filenames of downloaded files to
determine what type of archive they are and how to decompress them.
This commit adds a preliminary check based on magic numbers to
determine the archive type (but falls back on name parsing if the
type cannot be determined from the magic bytes).

As part of this work, this commit also enables decompression of
.tar.xz-compressed archives on Windows.
2022-09-26 00:01:42 -07:00
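A minimal sketch of magic-number sniffing with a filename fallback; the signature table covers a few common formats and is illustrative, not Spack's actual detection code.

```python
MAGIC = {
    b"\x1f\x8b": "gzip",       # .gz / .tar.gz
    b"\xfd7zXZ\x00": "xz",     # .xz / .tar.xz
    b"BZh": "bzip2",           # .bz2
    b"PK\x03\x04": "zip",      # .zip
}

def archive_type(path):
    with open(path, "rb") as f:
        head = f.read(8)
    for signature, kind in MAGIC.items():
        if head.startswith(signature):
            return kind
    # Fall back on extension parsing when the magic bytes are unrecognized.
    return path.rsplit(".", 1)[-1] if "." in path else None
```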
iarspider
a5ea566bdf Update py-poetry to 1.2.1 (including new packages) (#32776)
* Update py-poetry to 1.2.1

* Update py-xattr

* Apply style from review

* Apply suggestions from code review (part 1)

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Changes from review - 2

* Fix typo

* Fix style

* Add missing py-dulwich version

* Update var/spack/repos/builtin/packages/py-poetry-core/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-poetry/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-poetry/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Apply changes from review

* Add py-backports-cached-property and fix style

* Apply suggestions from code review

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-09-25 12:36:42 -05:00
Sergey Kosukhin
5dc440ea92 py-eccodes: add version 1.5.0 (#32792)
* py-eccodes: ensure the minimal recommended shared version of libeccodes

* py-eccodes: set less general environment variables to enable location of libeccodes

* py-eccodes: add version 1.5.0

* py-eccodes: make flake8 happy
2022-09-24 16:00:45 -05:00
Manuela Kuhn
d9b7bedaaa py-cppy: add 1.2.1 (#32790)
* py-cppy: add 1.2.1

* Update var/spack/repos/builtin/packages/py-cppy/package.py

Co-authored-by: iarspider <iarspider@gmail.com>

Co-authored-by: iarspider <iarspider@gmail.com>
2022-09-24 15:57:55 -05:00
Manuela Kuhn
00753d49da py-click: add 8.1.3 (#32788) 2022-09-24 15:53:42 -05:00
Valentin Volkl
5573ccee53 py-colorcet: add new package (#31597)
* py-colorcet: add new package

* style

* py-colorcet: depend on python 2.7:

* address pr review comments

* address pr review comments

* style
2022-09-24 15:52:14 -05:00
Manuela Kuhn
b721f8038f py-ci-info: add 0.3.0 (#32785) 2022-09-24 15:49:58 -05:00
Manuela Kuhn
08371954bb py-charset-normalizer: add 2.1.1 (#32784) 2022-09-24 15:49:16 -05:00
Manuela Kuhn
3e4bf1e400 py-cfgv: add 3.3.1 (#32782) 2022-09-24 15:48:30 -05:00
Manuela Kuhn
cd58ae23be py-chardet: add 3.0.2 (#32783) 2022-09-24 15:47:37 -05:00
Manuela Kuhn
184af723e4 py-certifi: add 2022.9.14 (#32781) 2022-09-24 15:46:32 -05:00
Manuela Kuhn
c2aab98920 py-beautifulsoup4: add 4.11.1 (#32779) 2022-09-24 15:42:47 -05:00
iarspider
6ee7b5ad91 Add checksum for py-hist 2.6.1, py-boost-histogram 1.3.1 (#32787)
* Add checksum for py-hist 2.6.1

* Swap when and type
2022-09-24 15:41:18 -05:00
iarspider
02e4d4e546 Add checksum for py-google-auth 2.11.0, py-cachetools 5.2.0 (#32791)
* Add checksum for py-google-auth 2.11.0

* Swap when and type

* Update package.py
2022-09-24 15:27:19 -05:00
Tamara Dahlgren
4751881fd1 Fix two docstring typos (#32751) 2022-09-23 11:17:01 -07:00
Manuela Kuhn
3f28ef89cc py-pytz: add 2022.2.1 (#32711) 2022-09-23 09:30:01 -06:00
Daryl W. Grunau
ca62819261 Packages/py scipy (#32767) 2022-09-23 07:41:44 -06:00
Manuela Kuhn
4bfa61c149 py-attrs: add 22.1.0 (#32762) 2022-09-23 04:09:44 -06:00
Harmen Stoppels
2fa9aff206 gitlab: cuda not compatible amazon linux 2 (#32678)
Amazon Linux 2 ships a glibc that is too old to work with the CUDA toolkit
for aarch64.

For example:

`libcurand.so.10.2.10.50` requires the symbol `logf@@GLIBC_2.27`, but
glibc is at 2.26.

So, these specs are removed.
2022-09-23 11:01:59 +02:00
Manuela Kuhn
e98e27ac3f py-asttokens: add 2.0.8 (#32760) 2022-09-23 02:49:43 -06:00
iarspider
86958669cf New packages: py-conan, py-node-semver, py-patch-ng (#32753)
* New packages: py-conan, py-node-semver, py-patch-ng

* Update var/spack/repos/builtin/packages/py-conan/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-conan/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-09-23 02:17:42 -06:00
Peter Scheibel
0a050785e9 Skip all tests using problematic fixture on python 2.7 (#32768) 2022-09-23 09:29:48 +02:00
Manuela Kuhn
8f5b847cb0 py-babel: add 2.10.3 (#32763)
* py-babel: add 2.10.3

* Update var/spack/repos/builtin/packages/py-babel/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-09-22 18:29:38 -06:00
iarspider
cbde650eed New package: py-jaraco-classes (#32759)
* New package: py-jaraco-classes

* Update var/spack/repos/builtin/packages/py-jaraco-classes/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-09-22 14:41:53 -06:00
Matthieu Dorier
b18ab062cf [py-tinydb] Added package py-tinydb (#32752)
* [py-tinydb] added package py-tinydb

* [py-tinydb] corrected version in dependency for py-tinydb

* [py-tinydb] update python dependency

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* [py-tinydb] update dependency

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-09-22 15:32:19 -05:00
Peter Scheibel
4094e59ab8 Fetching: log more exceptions (#32736)
Include exception info related to url retrieval in debug messages
which otherwise would be swallowed. This is intended to be useful
for detecting if CA configuration interferes with downloads from
HTTPS links.
2022-09-22 10:54:52 -07:00
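The pattern is simply to record the exception detail before letting it propagate; a generic sketch using the standard library (Spack's own debug-logging machinery differs):

```python
import logging
from urllib.error import URLError
from urllib.request import urlopen

log = logging.getLogger("fetch")

def fetch(url):
    try:
        return urlopen(url)
    except URLError as exc:
        # Previously this detail was swallowed; recording it makes problems
        # such as a broken CA configuration visible in debug output.
        log.debug("retrieving %s failed: %s", url, exc, exc_info=True)
        raise
```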
Adam J. Stewart
0869d22fcb py-pytorch-lightning: add v1.7.7 (#32765) 2022-09-22 10:22:45 -07:00
Manuela Kuhn
be38400ce2 py-anyio: add 3.6.1 (#32756) 2022-09-22 10:30:14 -06:00
Hans Fangohr
2ad47c8ab2 Enable use of latest version of Octopus (12.0) (#32758)
Release notes:
https://www.octopus-code.org/documentation/main/releases/octopus-12/
2022-09-22 08:50:59 -07:00
iarspider
83d55daae5 New package: cepgen (#32754) 2022-09-22 08:47:44 -07:00
Andrey Alekseenko
0ddbb92ae3 GROMACS: update CMake version, and a couple more fixes (#32748)
- GROMACS is not an acronym (https://www.gromacs.org/about.html).
- GROMACS switched to LGPL a long time ago, so let's mention it first.
- CMake version required for `main` has been bumped to 3.18
  (https://gitlab.com/gromacs/gromacs/-/merge_requests/3093)
- `-DGMX_USE_OPENCL` flag was used before 2021; for newer versions,
  `-DGMX_GPU=OpenCL` is enough.
2022-09-22 07:10:30 -05:00
Tamara Dahlgren
54d06fca79 Add hash hint to multi-spec message (#32652) 2022-09-22 09:18:43 +02:00
Sreenivasa Murthy Kolam
457daf4be6 Add new variants (OPENCL and HIP) for MIVisionX recipe and their dependencies correctly. (#32486)
* Add the two variants OPENCL and HIP and their dependencies correctly:
for OPENCL: rocm-opencl, miopengemm, and miopen-opencl;
for HIP: miopen-hip (see the sketch below).
Earlier, both miopen-hip and miopen-opencl were added
for both backends, which did not seem correct.
Also corrected the miopen-hip or miopen-opencl config.h in patch(), depending on the
backend.
Also added libjpeg-turbo, as it is required for building ROCAL.
AMDRpp is still required for ROCAL inclusion but it currently does not build;
AMDRpp will be added as a new Spack recipe and mivisionx will refer to it as a
dependency in the future.

* fix style errors
* Bump up the version for the 5.2.3 release; tested +opencl, +hip and ~hip~opencl (CPU backend)

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2022-09-21 16:54:40 -07:00
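A sketch of the backend-specific dependency split described above; the variant defaults are assumptions.

```python
from spack.package import *


class Mivisionx(CMakePackage):
    """Sketch: each backend pulls in only its own MIOpen stack."""

    variant("opencl", default=False, description="Use the OpenCL backend")
    variant("hip", default=True, description="Use the HIP backend")

    depends_on("rocm-opencl", when="+opencl")
    depends_on("miopengemm", when="+opencl")
    depends_on("miopen-opencl", when="+opencl")
    depends_on("miopen-hip", when="+hip")

    depends_on("libjpeg-turbo")  # required for building ROCAL
```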
renjithravindrankannath
c6d7557484 Update the version details for rccl, rvs, rocm-tensile rocm-5.2.3 release (#32741)
* ROCm 5.2.3 release changes
* Patch file for rccl

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2022-09-21 15:48:23 -07:00
Sreenivasa Murthy Kolam
e7f1f1af30 Update the version details for rocm-5.2.3 release (#32733)
* Bump up the version for rocm-5.2.3 release
* revert miopen-hip till the tag for mlirmiopen for 5.2.3 is available
2022-09-21 14:40:40 -07:00
Massimiliano Culpo
7e01a1252a Allow conditional variants as first values in a variant directive (#32740) 2022-09-21 19:22:58 +02:00
Robert Pavel
d07b200b67 Spackage for mlperf's OpenCatalyst (#32553)
* Initial opencatalyst proxy app spackage

Initial spackage for opencatalyst proxy app. Includes exposing of
versions in dependency spackages

* Verified Functionality and Spack Style

Verified build of mlperf-opencatalyst and fixed lingering spack style
issues

* Making requested changes to py-ocp

Making requested changed to spackage

* Further Requested Changes
2022-09-21 10:45:48 -06:00
Vanessasaurus
663f55825d flux-core: add v0.43.0 (#32744)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2022-09-21 18:26:41 +02:00
Vanessasaurus
dc214edc9c flux-sched: add v0.24.0 (#32743)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2022-09-21 18:26:09 +02:00
Manuela Kuhn
36df918e7d py-bleach: add 5.0.1 (#32727) 2022-09-21 09:48:16 -05:00
Laurent Aphecetche
874364bf4d vgm: add v5-0 (#32742) 2022-09-21 15:38:52 +02:00
Tamara Dahlgren
f12ececee5 Docs: Update pipeline ci rebuild to add --tests (plus fixed typos) (#32048) 2022-09-21 14:23:58 +02:00
Garth N. Wells
1313bc99a6 dolfinx: fix depends_on directive missing a version (#32730) 2022-09-21 10:12:22 +02:00
Adam J. Stewart
e45bc8440b pango: harfbuzz+coretext required on macOS (#32637) 2022-09-20 17:57:50 -06:00
Paulo Baraldi Mausbach
16692b802b New python package support: py-river (#32718) 2022-09-20 15:47:08 -05:00
John W. Parent
fd66f55e3c cmake: add v3.24.1 and v3.24.2 (#32654) 2022-09-20 11:24:16 -07:00
Wouter Deconinck
4b66364b06 gaudi: @:36 fails to build with fmt@9: (#32699)
Since fmt@9.0.0 and 9.1.0 were [added](6c4acfbf83) to spack a few days ago, gaudi fails to compile with default concretization. Since gaudi developers are usually paying attention to new versions of dependencies, I'm going to assume (perhaps optimistically) that the next bugfix version of gaudi will fix this (even though the issue has not been reported yet to Gaudi; I posted on the [key4hep public mirror](https://github.com/key4hep/Gaudi/issues/1)).
2022-09-20 11:20:55 -07:00
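One way to encode such an incompatibility while waiting for a fixed release is a conflict on the dependency's version range; this is a sketch of the general technique, not the actual change.

```python
from spack.package import *


class Gaudi(CMakePackage):
    # Keep affected releases away from the incompatible fmt major version.
    conflicts("^fmt@9:", when="@:36")
```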
HELICS-bot
692f1d0e75 helics: Add version 3.3.0 (#32682)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
Co-authored-by: Ryan Mast <3969255+nightlark@users.noreply.github.com>
2022-09-20 11:53:59 -06:00
Stephanie Labasan Brink
2f796e1119 variorum: new versions 0.5 and 0.6, remove log option (#32731) 2022-09-20 11:46:04 -06:00
iarspider
ac459df2b6 Add fixed version 0.7.0 to cpu-features (#32726)
* Add fixed version 0.7.0 to cpu-features
* [@spackbot] updating style on behalf of iarspider

Co-authored-by: iarspider <iarspider@users.noreply.github.com>
2022-09-20 10:40:39 -07:00
Joel Falcou
d8e566d554 Update EVE to 2022.09 (#32729) 2022-09-20 10:35:39 -07:00
Luke Diorio-Toth
cc0e7c87c1 new version (py-multiqc v1.13) + new package (py-rich-click) (#32719)
* started updating multiqc package

* working now

* added py-rich-click

* fixed style

* changed py-matplotlib versions

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* changed py-networkx versions

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* changed py-coloredlogs versions

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* changed python versions

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* changed py-markdown versions

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* changed py-pyyaml requirement

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* changed py-requests requirements

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* changed py-spectra requirements

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-09-20 11:14:01 -06:00
Manuela Kuhn
757a2db741 py-python-gitlab: add 3.9.0 (#32723) 2022-09-20 10:41:53 -06:00
Manuela Kuhn
8409ec0f21 py-annexremote: add 1.6.0 (#32725) 2022-09-20 10:06:03 -06:00
Manuela Kuhn
0d3a4d799a py-pybids: add 0.15.3 (#32722) 2022-09-20 11:03:33 -05:00
Manuela Kuhn
578d06a86a py-nibabel: add 4.0.2 (#32721) 2022-09-20 11:01:06 -05:00
dependabot[bot]
e656b4659a build(deps): bump codecov/codecov-action from 3.1.0 to 3.1.1 (#32717)
Bumps [codecov/codecov-action](https://github.com/codecov/codecov-action) from 3.1.0 to 3.1.1.
- [Release notes](https://github.com/codecov/codecov-action/releases)
- [Changelog](https://github.com/codecov/codecov-action/blob/master/CHANGELOG.md)
- [Commits](81cd2dc814...d9f34f8cd5)

---
updated-dependencies:
- dependency-name: codecov/codecov-action
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-09-20 14:03:47 +02:00
natshineman
3b273e263b osu-micro-benchmarks: add v6.1 (#32692) 2022-09-20 12:57:59 +02:00
Jim Galarowicz
24245be85a must: fix incorrect cmake argument (#32671) 2022-09-20 12:45:15 +02:00
Massimiliano Culpo
71434d8b9d intel-oneapi-itac: remove UTF-8 character in docstring
This character causes random failures in CI for Python 2.7
2022-09-20 11:02:10 +02:00
iarspider
2518a55105 Add checksum for py-nbconvert 6.5.1 (#32646) 2022-09-20 09:31:48 +02:00
Jen Herting
637a95c67f py-h5py: add v3.7.0 (#32578) 2022-09-20 09:30:36 +02:00
Manuela Kuhn
7e00852278 py-bids-validator: add 1.9.8 (#32712) 2022-09-20 01:05:45 -06:00
Gregory Lee
9b9691233a stat: add v4.2.1 (#32691) 2022-09-20 08:43:36 +02:00
Sreenivasa Murthy Kolam
0405ed6e9b Bump up the version for rocm-5.2.1 release (#32195)
* Bump up the version for rocm-5.2.1-initial commit
* Bump up the version for rocm-5.2.1 release
* Bump up the version for rocm-5.2.1 release
* correct the PROF_API_HEADER_PATH to include
* Bump up the version of rocm-openmp-extras for rocm-5.2.1 release
* bump up the version of rocwmma for 5.2.1
2022-09-19 21:13:38 -06:00
Jim Edwards
60997a7bc3 add pio2_5_9 to spack package (#32716) 2022-09-19 15:45:40 -06:00
Adam J. Stewart
5479f26aa6 py-pandas: add v1.5.0 (#32714) 2022-09-19 15:09:52 -06:00
kent-cheung-arm
8a7343f97b Added variant to ensure Arm Forge license acceptance (#32635)
* Added variant to ensure Arm Forge license acceptance
* Update var/spack/repos/builtin/packages/arm-forge/package.py

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2022-09-19 13:45:43 -07:00
Wouter Deconinck
5b26df8f64 imagemagick: new variant ghostscript (#32450)
* imagemagick: new variant ghostscript

Ghostscript adds about 75 dependencies to an installation of
imagemagick (97 without, 172 with Ghostscript). This adds a
variant (defaulting to true for backward compatibility) that
allows users to turn off Ghostscript support.

* imagemagick: be explicit when `--with-gslib`

* imagemagick: fix suggestion fail

* imagemagick: use spec.prefix.share.font

* imagemagick: default ghostscript false

* imagemagick: no need for join_path anymore

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2022-09-19 15:41:11 -05:00
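The resulting configure logic might look roughly like this sketch, built around the `--with-gslib` flag and `spec.prefix.share.font` path mentioned in the commit; the variant name and the fallback flag are assumptions.

```python
from spack.package import *


class Imagemagick(AutotoolsPackage):
    variant("ghostscript", default=False,
            description="Compile with Ghostscript support")

    depends_on("ghostscript", when="+ghostscript")
    depends_on("ghostscript-fonts", when="+ghostscript")

    def configure_args(self):
        args = []
        if "+ghostscript" in self.spec:
            args.append("--with-gslib")
            fonts = self.spec["ghostscript-fonts"].prefix.share.font
            args.append("--with-fontpath={0}".format(fonts))
        else:
            args.append("--without-gslib")
        return args
```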
Adam J. Stewart
83e66ce03e Deprecate Python 2 support (#32615)
This PR deprecates using Spack to install [EOL Python versions](https://endoflife.date/python), including Python 2.7, 3.1–3.6. It also deprecates running Spack with Python 2.7. Going forward, we expect Spack to have the following Python support timeline.

### Running Spack

* Spack 0.18 (spring 2022): Python 2.7, 3.5–3.10
* Spack 0.19 (fall 2022): Python 2.7 (deprecated), 3.6–3.11
* Spack 0.20 (spring 2023): Python 3.6–3.11

### Building with Spack

* Spack 0.18 (spring 2022): Python 2.7, 3.1–3.10
* Spack 0.19 (fall 2022): Python 2.7, 3.1–3.6 (deprecated), 3.7–3.11
* Spack 0.20 (spring 2023): Python 3.7–3.11

This is a reboot of #28003. See #31824 for a detailed discussion of the motivation for this PR.
If you have concerns about this change, please comment on #31824.
2022-09-19 13:34:38 -07:00
Manuela Kuhn
cc78d5db36 py-datalad: add 0.17.5 (#32676)
* py-datalad: add 0.17.5

* Fix description of py-types-urllib3

* Update var/spack/repos/builtin/packages/py-datalad/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-datalad/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Add colorama dependency for windows

* Fix importlib-metadata dependency

* Update var/spack/repos/builtin/packages/py-pytest/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-pytest/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-pytest/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Use conflict to avoid dependency duplication

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-09-19 15:30:34 -05:00
rsanthanam-amd
8f8205af88 NCCL should not be dependent on ROCm builds since RCCL is used instead. (#32461)
Allow the nccl flag to be specified even for ROCm builds so that NCCL kernels are included in the build.

In this case the NCCL kernels will use RCCL as the backend implementation.
2022-09-19 15:02:02 -05:00
Stephen Sachs
13d870157f Adding intel-oneapi-itac package (#32658)
* Adding intel-oneapi-itac package
* Make black happy
* add rscohn2 as maintainer
* black prefers double quotes

Co-authored-by: Stephen Sachs <stesachs@amazon.com>
Co-authored-by: Robert Cohn <rscohn2@gmail.com>
2022-09-19 12:44:32 -07:00
Terry Cojean
5d42185698 Ginkgo: improve smoke testing (#32647)
* Ginkgo: improve smoke testing
* Fix style issues
* Pass `self.compiler.{cc,cxx}` to the smoke tests
2022-09-19 12:37:34 -07:00
Ashwin Kumar
3ffc7f4281 patch etsf-io tests (#32666)
(Patches from meisam.tabriz@mpcdf.mpg.de and smoke tests from @fangohr)
2022-09-19 12:31:59 -07:00
Wouter Deconinck
7d250c4bea root: determine_variants cxxstd=20 support (#32673)
This is somewhat ahead of reality, but once/when external ROOT installs with cxxstd=20 are common, we will need this to pick up the correct variant.
2022-09-19 11:32:47 -07:00
snehring
edee67d96f fasttree: updating to 2.1.11 (#32683) 2022-09-19 11:07:51 -07:00
Massimiliano Culpo
fc1e25c5e1 Don't run bootstrap on package only PRs (#32689)
* Don't run bootstrap on package only PRs

* Run bootstrap tests when ci.yaml is modified

* Test a package only PR

* Revert "Test a package only PR"

This reverts commit af96b1af60b0c31efcc9a2875ffb1a809ef97241.
2022-09-19 10:59:29 -07:00
Adam J. Stewart
bb071d03ba py-sip: add v6.6.2 (#32695)
* py-sip: add v6.6.2

* Fix python deptype
2022-09-19 10:56:47 -07:00
Wouter Deconinck
0298823e28 edm4hep: allow variant cxxstd 20, sync with podio (#32698)
Now that `podio` can support `cxxstd` variants 17 and 20, we can allow `edm4hep` to use `cxxstd=20` as well, but must ensure that `edm4hep` uses the same `cxxstd` variant as `podio`. Solution as in `celeritas`.
2022-09-19 10:51:22 -07:00
Laurent Aphecetche
51d2c05022 geant4-vmc: add version 6-1-p1 (#32704) 2022-09-19 10:12:02 -07:00
iarspider
3762e7319d Fix benchmark recipe after branch name was changed (#32708) 2022-09-19 10:08:30 -07:00
Wouter Deconinck
2f19826470 geant4: new bugfix version 11.0.3 (#32700)
Geant4 has a new bugfix version 11.0 patch-03, [release notes](https://geant4-data.web.cern.ch/ReleaseNotes/Patch.11.0-3.txt). No build system changes.
2022-09-19 13:11:31 +01:00
Manuela Kuhn
0f9cd73f58 py-neurora: add 1.1.6.8 (#32684) 2022-09-19 04:41:41 -06:00
Manuela Kuhn
17c16ac2fc py-pydicom: add 2.3.0 (#32681) 2022-09-19 04:25:35 -06:00
John W. Parent
0376a62458 Attempt at patching phist find mpi heuristic (#32688)
for updated phist version
2022-09-18 11:18:28 -05:00
Manuela Kuhn
d4c13b0f8f Add skip_import to PythonPackage and use it in py-nilearn (#32664)
* Add skip_import to PythonPackage and use it in py-nilearn

* Fix dependencies
2022-09-17 23:02:30 +00:00
Paul Kuberry
2c7c749986 xyce: Change preferred to last numeric version (#32679)
- Also handles new constraints on Trilinos version
2022-09-17 13:57:40 -06:00
Robert Cohn
9e0755ca3a [intel-oneapi-compilers] fix CC/CXX (#32677) 2022-09-16 14:16:06 -04:00
Massimiliano Culpo
0f26931628 CI: add coverage on Windows (#32610)
Co-authored-by: Tom Scogland <scogland1@llnl.gov>
2022-09-16 05:38:29 -07:00
Todd Gamblin
a0c7209dc1 bugfix: package hash should affect process, dag, and dunder hashes (#32516)
This fixes a bug where two installations that differ only by package hash will not show
up in `spack find`.

The bug arose because `_cmp_node` on `Spec` didn't include the package hash in its
yielded fields. So, any two `Spec` objects that were only different by package hash
would appear to be equal and would overwrite each other when inserted into the same
`dict`. Note that we could still *install* specs with different package hashes, and they
would appear in the database, but we code that needed to put them into data structures
that use `__hash__` would have issues.

This PR makes `Spec.__hash__` and `Spec.__eq__` include the `process_hash()`, and it
makes `Spec._cmp_node` include the package hash. All of these *should* include all
information in a spec so that we don't end up in a situation where we are blind to
particular field differences.

Eventually, we should unify the `_cmp_*` methods with `to_node_dict` so there aren't two
sources of truth, but this needs some thought, since the `_cmp_*` methods exist for
speed. We should benchmark whether it's really worth having two types of hashing now
that we use `json` instead of `yaml` for spec hashing.

- [x] Add `package_hash` to `Spec._cmp_node`
- [x] Add `package_hash` to `spack.solve.asp.spec_clauses` so that the `package_hash`
      will show up in `spack diff`.
- [x] Add `package_hash` to the `process_hash` (which doesn't affect abstract specs
      but will make concrete specs correct)
- [x] Make `_cmp_iter` report the dag_hash so that no two specs with different
      process hashes will be considered equal.
2022-09-16 00:57:10 +00:00
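The failure mode is easy to reproduce in miniature: when `__hash__` and `__eq__` ignore a field, two distinct objects collide as dict keys. An illustrative toy, not Spack's real `Spec` class:

```python
class Spec:
    def __init__(self, name, package_hash):
        self.name = name
        self.package_hash = package_hash

    def _cmp_key(self):
        # Before the fix this was effectively (self.name,) without the
        # package hash, so specs differing only there compared equal.
        return (self.name, self.package_hash)

    def __eq__(self, other):
        return self._cmp_key() == other._cmp_key()

    def __hash__(self):
        return hash(self._cmp_key())

installed = {Spec("zlib", "aaa"): 1, Spec("zlib", "bbb"): 2}
assert len(installed) == 2  # with the old key the second entry overwrote the first
```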
liuyangzhuan
46d7ba9f78 adding butterflypack 2.2.0 (#32670) 2022-09-15 18:14:01 -06:00
iarspider
31c24cd0d5 New package: xrdcl-record (#32663)
* New package: xrdcl-record
* Flake-8
2022-09-15 10:13:37 -07:00
Matthew Archer
66b451a70d Fenicsx v0.5.0 (#32611)
* initial commit of 0.5.0 changes
* updated dependencies
* updated ffcx sha
* comment style
* llvm compilers
* introduce pugixml dependency for 0.5.0:
* update compilers to support C++20 features
* style fixes
* xtensor and xtl not needed for basix 0.5.0 and above
* Skip to Basix 0.5.1

The 0.5.1 release removes the C++ build dependency on Python that sneaked into the 0.5.0 build system.

* Improve depends on version ranges
* More dependency version improvements

Co-authored-by: Chris Richardson <chris@bpi.cam.ac.uk>
Co-authored-by: Garth N. Wells <gnw20@cam.ac.uk>
2022-09-15 08:11:25 -07:00
Robert Underwood
6f15d8ac76 added missing dependency for py-msgpack that breaks neovim (#32631)
* added a missing dependency for py-msgpack whose absence breaks neovim

* Update var/spack/repos/builtin/packages/py-pynvim/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Robert Underwood <runderwood@anl.gov>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-09-15 09:45:39 -05:00
Glenn Johnson
123354e920 FPLO: add new package (#32580) 2022-09-15 10:03:05 +02:00
Matthieu Dorier
eaf3f7c17c [py-pip] fix setup_dependent_package in py-pip (#32632) 2022-09-14 23:41:34 -06:00
Vicente Bolea
dde5867d15 libcatalyst: use git instead of fixed urls (#32642)
The issue is that we are not able to install (Fetch URL error) any
version of catalyst other than the one specified in the Spack package.py.
This very version is accessible only because it is cached by Spack. The
real URL does not exist anymore, I believe the reason is that there used
to be a tag in catalyst that does not exist anymore.
2022-09-14 17:05:02 -07:00
John W. Parent
deca34676f Bugfix: find_libraries (#32653)
53a7b49 created a reference error which broke `.libs` (and
`find_libraries`) for many packages. This fixes the reference
error and improves the testing for `find_libraries` by actually
checking the extension types of libraries that are retrieved by
the function.
2022-09-14 14:45:42 -06:00
Peter Scheibel
7971985a06 Manifest parsing: skip invalid manifest files (#32467)
* catch json schema errors and reraise as property of SpackError
* no need to catch subclass of given error
* Builtin json library for Python 2 uses more generic type
* Correct instantiation of SpackError (requires a string rather than an exception)
* Use exception chaining (where possible)
2022-09-14 13:02:51 -07:00
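A minimal sketch of the catch-and-chain pattern described, so callers can catch one error type and skip the offending file; `ManifestValidationError` is a hypothetical stand-in for the SpackError subclass actually used.

```python
import jsonschema


class ManifestValidationError(Exception):
    pass


def validate_manifest(path, data, schema):
    try:
        jsonschema.validate(data, schema)
    except jsonschema.ValidationError as exc:
        # Re-raise as a higher-level error with a string message, chaining
        # the original exception so the root cause is preserved.
        raise ManifestValidationError(
            "invalid manifest {0}: {1}".format(path, exc.message)
        ) from exc
```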
Sergey Kosukhin
ebb20eb8f8 llvm: fix detection of LLDB_PYTHON_EXE_RELATIVE_PATH (#32584) 2022-09-14 12:45:41 -06:00
psakievich
045a5e80cb Allow version to accept the '=' token without activating lexer switch (#32257) 2022-09-14 10:16:28 -07:00
Valentin Volkl
01c9780577 py-awkward: allow newer, fixed rapidjson versions (#32650)
Rapidjson was constrained due to a bug (see https://github.com/spack/spack/pull/29889), but newer (although untagged) versions of rapidjson have since been added to Spack (https://github.com/spack/spack/pull/29869). Tested the build of py-awkward with the latest; it works fine.
2022-09-14 11:57:20 -05:00
iarspider
5ffde1d51c Add checksum for yoda 1.9.6 (#32648) 2022-09-14 09:48:56 -07:00
Ishaan Desai
4453653418 [py-pyprecice] add v2.5.0.0 and v2.5.0.1 and update checksum of v2.4.0.0 (#32551)
* Update versions 2.5.0.0 and 2.5.0.1

* Applying review changes

* Updating incorrect checksum for v2.4.0.0

* Add for loop to define depends_on for preCICE versions and bindings versions

* Formatting

* Missing comma
2022-09-14 09:55:48 -05:00
Paul Kuberry
f37e9addcd compadre: add v1.5.0, v1.4.1 (#32609) 2022-09-14 06:49:52 -06:00
Houjun Tang
c1b3f0e02f Add hdf5-vol-async v1.3 (#32634) 2022-09-14 04:21:48 -06:00
Adam J. Stewart
ddd18351a1 py-pytorch-lightning: add v1.7.6 (#32644) 2022-09-14 02:09:58 -06:00
Adam J. Stewart
ce56cb23fe py-tokenize-rt: add missing setuptools dependency (#32639) 2022-09-14 10:04:03 +02:00
Jonas Thies
8d2cd4a620 phist: add v1.9.5 (#32621) 2022-09-14 09:46:09 +02:00
Erik Schnetter
9543716d81 berkeley-db: Correct TLS auto-detection (#32592) 2022-09-13 22:13:50 -06:00
Zack Galbreath
4ebdc5643e Revert "e4s ci: restore power builds (#32296)" (#32633)
This reverts commit 0d18c32bca.
2022-09-14 03:26:26 +00:00
Adam J. Stewart
93c39464e3 json-glib: add pkgconfig dependency (#32638) 2022-09-13 21:18:55 -06:00
Adam J. Stewart
cb323b1f55 graphite2: fix build on macOS (#32636) 2022-09-13 21:16:59 -06:00
João Seródio
9f72962dd1 New package: py-qiskit-nature (#32628) 2022-09-13 21:16:38 -06:00
Robert Underwood
9e5f3f96dd add ability to disable werror for rocksdb (#32618)
* add ability to disable werror for rocksdb
* Update var/spack/repos/builtin/packages/rocksdb/package.py

Co-authored-by: Robert Underwood <runderwood@anl.gov>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2022-09-13 19:45:56 -06:00
Sam Broderick
d6a3ffc301 Update module_file_support.rst (#32629)
Missing lead in "This may to inconsistent modulfiles if..."
2022-09-14 01:41:50 +00:00
Jonas Thies
4654d66905 packages/phist: new version 1.10 (#32581)
* packages/phist: new version 1.10

* fix (and narrow) conflicts, thanks @glennpj
2022-09-13 17:21:46 -05:00
Ondřej Čertík
463c5eacca Add libraqm package; enable in py-pillow (#32619)
* Add libraqm package

* py-pillow: Add optional raqm dependency/variant

* Use sha256

* Use " instead of '

* Use more explicit import

* Only add raqm from @8.4.0:

* Make the docstring shorter to satisfy flake

* Add conflict, silence warning, adjust version

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-09-13 14:13:57 -06:00
Aurelien Bouteiller
83d6aff03a Parsec: v3.0.2209 (#32617)
Signed-off-by: Aurelien Bouteiller <bouteill@icl.utk.edu>

Signed-off-by: Aurelien Bouteiller <bouteill@icl.utk.edu>
2022-09-13 13:33:56 -06:00
Carlos Bederián
3b46a0bffe gromacs: add 2022.3 (#32624) 2022-09-13 12:41:57 -06:00
Chris White
937b576b5b fix depends on versioning (#32626) 2022-09-13 10:29:21 -07:00
John W. Parent
53a7b49619 Windows rpath support (#31930)
Add a post-install step which runs (only) on Windows to modify an
install prefix, adding symlinks to all dependency libraries.

Windows does not have the same concept of RPATHs as Linux, but when
resolving symbols will check the local directory for dependency
libraries; by placing a symlink to each dependency library in the
directory with the library that needs it, the package can then
use all Spack-built dependencies.

Note:

* This collects dependency libraries based on Package.rpath, which
  includes only direct link dependencies
* There is no examination of libraries to check what dependencies
  they require, so all libraries of dependencies are symlinked
  into any directory of the package which contains libraries
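
A minimal sketch of the symlinking idea; the helper name and arguments are hypothetical, not the actual Spack implementation.

```python
import os

def symlink_dependency_libraries(pkg_lib_dir, dependency_libraries):
    """Place a symlink to each dependency library next to the libraries
    that need it, emulating RPATH-style resolution on Windows."""
    for dep_lib in dependency_libraries:
        link = os.path.join(pkg_lib_dir, os.path.basename(dep_lib))
        if not os.path.exists(link):
            os.symlink(dep_lib, link)
```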
2022-09-13 10:28:29 -07:00
Cameron Stanavige
251d86e5ab unifyfs & gotcha: new releases (#32625)
UnifyFS:
- Add 1.0 release
- Deprecate older, unsupported versions
- Set fortran variant to true by default
- Update gotcha and mochi-margo dependency versions for unifyfs@1.0
  and unifyfs@develop
- Add conflict of unifyfs with libfabric 1.13.* versions
- Update configure_args to use helper functions

GOTCHA: Hasn't been under active development for a couple years but
the develop branch has some fixes UnifyFS uses. To avoid having
UnifyFS v1.0 depend on a develop branch of a dependency, this creates
a new release in the Gotcha Spack package based on the most recent
commit of the develop branch.
2022-09-13 10:53:11 -06:00
Cory Bloor
633a4cbd46 rocsparse: add spack build test (#31529)
* rocsparse: add spack build test

* Fix Navi 1x patch for ROCm 5.2

* Remove bench variant and other cleanup

* Fix style
2022-09-13 08:42:42 -07:00
Jordan Ogas
3e331c7397 charliecloud: deprecate old versions (#32462)
* tidy, deprecate old versions

* bump python

* begrudgingly apply less readable style

* adjust comment spacing

* apply ghastly multiline function arguments
2022-09-13 09:25:55 -06:00
Tamara Dahlgren
e97915eef2 Add an issue template for reporting package stand-alone test failures (#32487) 2022-09-13 08:22:04 -06:00
Brian Van Essen
2d895a9ec3 Added 2.6.0 version to NVSHMEM package. Added new variant for the experimental GPU-initiated support feature (#32622)
2022-09-13 08:14:04 -06:00
Tamara Dahlgren
62e788fb89 Added install phase test methods for CachedCMakePackage (inherited) and WafPackage (#32627) 2022-09-13 12:12:15 +02:00
Henrik Nortamo
bc83c72ebe Hyperqueue package (#32475)
* Adds hyperqueue package

* Adds maintainers

* Adds missing docstring

* Bump year for Copyright

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>

* Fixes checksum, selects preferred version

Don't use a release candidate as the default version

* Switch maintainer to one of the developers for HQ

* Update package.py

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2022-09-13 00:57:55 -06:00
Adam J. Stewart
68115eb1d0 GDAL: add v3.5.2 (#32614) 2022-09-13 00:53:51 -06:00
Erik Schnetter
6e5dc7374c openblas: Allow building linktest.c with clang@15 (#32597) 2022-09-12 12:17:49 -06:00
kwryankrattiger
6a5d247d7c VTK-m: Update gcc 11 conflict after patch (#32546) 2022-09-12 11:29:45 -06:00
Cory Bloor
13d872592e rocalution: fix compilation for Navi 1x and 2x (#32586) 2022-09-12 09:19:59 -07:00
Jim Edwards
5dc1a9f214 parallelio: make shared lib the default (#32432) 2022-09-12 17:48:21 +02:00
Wouter Deconinck
8dad297526 cuba: new package (#32510) 2022-09-12 15:27:03 +00:00
Qian Jianhua
734ae99285 dtc: fix build error with clang or Fujitsu compiler (#32543) 2022-09-12 16:59:05 +02:00
kwryankrattiger
b4f3812077 SDK: Remove conflicting variants from sensei propagation (#32490) 2022-09-12 16:57:42 +02:00
Rémi Lacroix
f2b19c39a0 r-gridextra: fix the CRAN package name. (#32474)
It seems like it was renamed from "gridExtras" to "gridExtra".
2022-09-12 16:57:15 +02:00
Simon Pintarelli
cb3b5fb716 sirius: new version, libsci supports (#32443) 2022-09-12 16:56:14 +02:00
Vicente Bolea
e5dcc43b57 ParaView: add v5.11.0-RC1 (#32436) 2022-09-12 16:55:18 +02:00
Mikael Simberg
daf691fd07 whip: add new package (#32576) 2022-09-12 16:54:17 +02:00
Kendra Long!
3ff63b06bf draco: add v7.14.1 (#32495) 2022-09-12 08:52:41 -06:00
Adrien Cotte
acdb6321d1 wi4mpi: mpi dep removed after v3.6.0 (#32352) 2022-09-12 08:52:17 -06:00
Jack Morrison
8611aeff52 opa-psm2: add v11.2.230 (#32530) 2022-09-12 08:47:04 -06:00
Ioannis Magkanaris
6c4acfbf83 fmt: add v9.1.0 and v9.0.0 (#32527)
Co-authored-by: Ioannis Magkanaris <ioannis.magkanaris@epfl.ch>
2022-09-12 08:46:42 -06:00
aeropl3b
d8e6782f42 HIPSycl: LLVM@10: provides working clang with cuda >= 11 (#32512)
Co-authored-by: RJ <aeropl3b+dev@gmail.com>
2022-09-12 08:46:19 -06:00
Simon Pintarelli
f33b7c0a58 spfft: inherit from ROCmPackage (#32550)
Co-authored-by: Alberto Invernizzi <9337627+albestro@users.noreply.github.com>
2022-09-12 08:22:39 -06:00
Seth R. Johnson
d4065e11c6 iwyu: support external find (#32458) 2022-09-12 08:22:14 -06:00
natshineman
bb6c39ea7c osu-micro-benchmarks: add v6.0 (#32587) 2022-09-12 16:19:23 +02:00
Carson Woods
46828eff1b Add new and legacy intel-mpi-benchmarks versions (#32535)
* Add new intel-mpi-benchmarks version

* Add new versions of intel mpi benchmarks

* Fix style bugs

* Fix style bugs

* Switch to using url_for_version formatting and improve patch ranges

* p2p benchmark is not included on older versions

* Set patch to proper version

* Add url field, improve patch versioning, improve version detection

* Add url field, improve patch versioning, improve version detection

* Bug fix
Syntax fix

* Remove 2019 from valid version on reorder_benchmark_macros patch

* OpenMPI isn't supported on older versions of the benchmark. Prevents OpenMPI from being selected on those versions

* Add new requirement of gmake for older versions

* Require intel-mpi for older versions of benchmark

* Minor changes to build directory for older versions

* Remove repeated conflict

* Minor style changes

* Minor change

* Correct fix for intel-mpi-benchmarks

* Bug fix

* Bug fix

* Attempted fix for install bug

* Attempted fix for install bug

* Remove duplicate build_directory setting
2022-09-12 08:54:42 -05:00
Christoph Conrads
5be1da4491 fenics-basix: make xtensor-blas dependency explicit (#28951) 2022-09-12 12:51:37 +00:00
Wouter Deconinck
29093f13ec podio: new variant cxxstd=(17,20) (#30437) 2022-09-12 14:44:16 +02:00
SXS Bot
51ce370412 spectre: add v2022.09.02 (#32501)
Co-authored-by: sxs-bot <sxs-bot@users.noreply.github.com>
2022-09-12 14:05:40 +02:00
Jen Herting
3c822cee58 [py-bitarray] added version 2.6.0 (#32560) 2022-09-12 04:01:55 -06:00
Adam J. Stewart
7904e1d504 GDAL: fix typo (#32524) 2022-09-12 11:41:55 +02:00
Adam J. Stewart
9d9f1ac816 py-pytorch-lightning: add v1.7.5 (#32552) 2022-09-12 11:41:33 +02:00
eugeneswalker
bf84cdfcbe e4s: add py-torch +cuda (#32601) 2022-09-12 09:40:49 +00:00
Ken Raffenetti
8071cc60cb mpich: Set minimum libfabric version to build (#32557)
Starting with MPICH 3.4, we need at least libfabric 1.5 in order to
build. Fixes #24394
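In recipe terms, the constraint would look roughly like this hedged sketch; the exact `when` condition in the real mpich package may differ.

```python
# Hypothetical excerpt from an mpich-style Spack recipe, not the actual file.
from spack.package import *

class Mpich(AutotoolsPackage):
    # MPICH 3.4+ needs at least libfabric 1.5 to build.
    depends_on("libfabric@1.5:", when="@3.4:")
```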
2022-09-12 11:38:55 +02:00
Elsa Gonsiorowski, PhD
f5c0bc194f SCR: add v3.0.1 (#32555) 2022-09-12 03:38:05 -06:00
Jhon-Cleto
1ca184c1ee cuda: add v11.7.1 (#32606) 2022-09-12 03:30:00 -06:00
Sergey Kosukhin
7a93eddf1c gcc: add support for the D language (GDC) (#32330) 2022-09-12 11:27:35 +02:00
Andrew W Elble
f7fbfc54b3 routinator: add new package (#32532) 2022-09-12 11:20:16 +02:00
Larry Knox
0bc9dbe6f8 Add spack package hdf5-vol-cache. (#32449)
* Add spack package hdf5-vol-cache.

* Style updates.

* Update var/spack/repos/builtin/packages/hdf5-vol-cache/package.py

* Remove outdated hdf5-cmake package options.
2022-09-12 03:18:05 -06:00
Adam J. Stewart
3d904d8e65 py-tensorflow: add v2.10.0 (#32544) 2022-09-12 11:17:47 +02:00
Stephen Sachs
9c1e916c1c wrf: define NETCDFFPATH in case it differs from NETCDFPATH (#32548)
Co-authored-by: Stephen Sachs <stesachs@amazon.com>
2022-09-12 11:17:17 +02:00
Erik Schnetter
cd1d6a9178 mpitrampoline: add v5.0.2 (#32598) 2022-09-12 03:09:52 -06:00
Auriane R
a2396c30b2 p2300: add wg21 p2300 std_execution as a package (#32531) 2022-09-12 09:04:04 +00:00
Philipp Edelmann
dea1e12c88 pgplot: fix build failure when using +X (#32542)
The spec was using the wrong key to find the X11 library flags.
2022-09-12 10:58:43 +02:00
Mikael Simberg
17b98f6b1f pika: add v0.8.0 (#32577) 2022-09-12 10:41:29 +02:00
Stephen Sachs
40ee78f6e8 GPCNeT: add new package (#32593)
Co-authored-by: Stephen Sachs <stesachs@amazon.com>
2022-09-12 10:40:19 +02:00
Massimiliano Culpo
34a7df9ea0 Fix encoding issues with py-cylp (#32608)
fixes #32607

The package contains character that have encoding
issues with Python 2.7.
2022-09-12 10:34:54 +02:00
Adam J. Stewart
cb2cdb7875 py-tensorflow-metadata: add v1.10.0 (#32545) 2022-09-12 10:27:16 +02:00
Adam J. Stewart
50c031e6aa nettle: add v3.8.1 (#32523) 2022-09-12 09:45:35 +02:00
João Marcos
7382a3ca89 distbench: add new package (#32605) 2022-09-12 08:44:05 +02:00
Mark W. Krentel
d4ec0da49a hpctoolkit: add yaml-cpp as dependency for develop (#32538) 2022-09-12 08:35:37 +02:00
Adam J. Stewart
2caf449b8b py-numpy: add v1.23.3 (#32590) 2022-09-10 19:58:11 -05:00
Adam J. Stewart
76b4e5cc51 py-torchgeo: add v0.3.1 (#32582) 2022-09-10 19:43:45 -05:00
Adam J. Stewart
2ad9164379 py-pyproj: add v3.4.0 (#32599)
* py-pyproj: add v3.4.0

* Remove older basemap versions
2022-09-10 19:25:01 -05:00
Adam J. Stewart
11a4b5ed69 py-cartopy: add v0.21.0 (#32600) 2022-09-10 19:24:30 -05:00
Glenn Johnson
01153b3271 new packages: py-arm-pyart and dependencies (#32579)
* new packages: py-arm-pyart and dependencies

- py-arm-pyart
- py-cylp
- rsl

* Update var/spack/repos/builtin/packages/py-cylp/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Fix dependencies

- xarray is not optional
- pandas is needed
- pylab is needed
  - new package, py-pylab-sdk
- setuptools is needed at run time

* Patch for import of StringIO

* Update var/spack/repos/builtin/packages/py-arm-pyart/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Fix call to `StringIO` in patch

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-09-10 23:02:15 +00:00
Massimiliano Culpo
1427ddaa59 ci: restore coverage computation (#32585)
* ci: restore coverage computation

* Mark "test_foreground_background" as xfail

* Mark "test_foreground_background_output" as xfail

* Make number of processes explicit, remove verbosity on linux

* Run coverage on just 3 Python jobs for linux

* Run coverage on just 3 Python jobs for linux

* Run coverage on just 2 Python jobs for linux

* Add back verbose, since before we didn't encounter the xdist internal error

* Reduce the workers to 2

* Try to use command line
2022-09-10 07:25:44 -06:00
psakievich
b4a2b8d46c GitRef Versions: Bug Fixes (#32437)
* Fix a version cmp bug in asp.py

* Fix submodule bug for git refs

* Add branch in logic for submodules

* Fix git version comparisons

main does not satisfy git.foo=main
git.foo=main does satisfy main
2022-09-09 11:02:40 -07:00
Jen Herting
c51af2262e py-session-info: add new package (#32564)
Co-authored-by: qwertos <qwertos@users.noreply.github.com>
2022-09-09 05:13:59 -06:00
Jen Herting
a13cf43b65 py-anndata: add new package (#32563)
Co-authored-by: James A Zilberman <jazrc@rit.edu>
Co-authored-by: qwertos <qwertos@users.noreply.github.com>
2022-09-09 05:09:45 -06:00
Jen Herting
0fd749aa2c New package: py-python-bioformats (#32561)
* [py-python-bioformats] New package

* [py-python-bioformats] Added version 4.0.0

* [py-python-bioformats] Added types

* [py-python-bioformats] setuptools is build only

* [py-python-bioformats] fixup import

* [@spackbot] updating style on behalf of qwertos

Co-authored-by: James A Zilberman <jazrc@rit.edu>
Co-authored-by: qwertos <qwertos@users.noreply.github.com>
2022-09-09 03:09:57 -06:00
Jen Herting
59602f790e py-umap-learn: add new package (#32562)
Co-authored-by: James A Zilberman <jazrc@rit.edu>
Co-authored-by: qwertos <qwertos@users.noreply.github.com>
2022-09-09 03:05:49 -06:00
eugeneswalker
0d18c32bca e4s ci: restore power builds (#32296) 2022-09-08 21:00:58 -07:00
Brian Van Essen
f0c1c6f8cc Match protobuf to py-protobuf version (#32491)
* Fixed the py-protobuf recipe so that when cpp support is required, it
uses the same major and minor version range as the protobuf package.

* Fixed the range bound for the 3.x py-protobuf packages.

Added mappings for 4.x py-protobuf packages to 3.x protobuf packages.

Removed a hash for v21.1 protobuf and replaced it with v3.21.1 to keep a
standard versioning convention.  Note that Google has started releasing
both 3.x.y and a tag that drops the leading 3, so it is just x.y.  This
gives the appearance of a new major version, but it is really just a new
minor version.  These packages still report versions as 3.x.y, so we
switch to versions and hashes with that convention.

* Simplified constraints based on reviewer comments.

* Fixed flake8 errors

* Update var/spack/repos/builtin/packages/py-protobuf/package.py

* Fixed constraints on v2. versions and addressed Flake8 comments.

* Fixed flake8

* Fixed range dependencies for version 2.x

* Update var/spack/repos/builtin/packages/py-protobuf/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Fixed version ranges to skip unknown versions.

* Fixed the dependencies on protobuf to solve weird build issues.

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-09-08 21:05:52 -06:00
Peter Scheibel
02151ac649 CMakePackage: allow custom CMAKE_PREFIX_PATH entries (#32547)
* define `cmake_prefix_paths` property for packages

add to CMake arguments via CMakePackage
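
A hedged sketch of how a package might use the new property; the returned path is hypothetical.

```python
# Hypothetical recipe exposing extra CMAKE_PREFIX_PATH entries to dependents.
from spack.package import *

class MyLibrary(Package):
    @property
    def cmake_prefix_paths(self):
        # Directories beyond self.prefix that dependents' CMake builds
        # should search for config files.
        return [self.prefix.share.cmake]  # assumed location
```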

Co-authored-by: Robert Underwood <runderwood@anl.gov>
2022-09-08 16:23:04 -07:00
Mark W. Krentel
e3f87035b6 libmonitor: add version 2022.09.02 (#32525) 2022-09-08 13:45:57 -06:00
Massimiliano Culpo
67534516c7 ci: avoid running coverage on package only PRs (#32573)
* ci: remove !docs from "core" filters

As written, it causes package-only PRs
to run with coverage.

* Try to skip job under condition, see if the workflow proceed

* Try to cancel a running CI job

* Simplify linux unit-tests, skip windows unit-tests on package PRs

* Reduce the inputs to unit-tests workflow

* Move control logic to main workflow, remove inputs

* Revert "Move control logic to main workflow, remove inputs"

This reverts commit 0c46fece4c.

* Do not compute "with_coverage" since it's always == to "core"

* Remove workflow dispatch from unit tests

* Revert "Revert "Move control logic to main workflow, remove inputs""

This reverts commit dd4e4a4e61.

* Try to skip all from the main workflow

* Add back bootstrap to needed checks for "all"

* Restore the correct logic for conditionals
2022-09-08 10:58:53 -07:00
Michael Kuhn
dc1734f0a6 py-cython: add 0.29.32 (#32574) 2022-09-08 12:41:24 -05:00
Michael Kuhn
2cfac3389a meson: add v0.63.2 (#32575) 2022-09-08 10:33:49 -06:00
Jen Herting
63d079cce9 New package: py-pytesseract (#30765)
* Versions added for each dep, but I think I'll need to remove them

* py-tesseract now builds and will import in python

* Fixed flake style error as raised by pipeline

* changed to proper python dependency

* added pil as a dependency

* Fixed flake style errors

* [py-pytesseract] py-pillow and py-wheel are redundant

* [py-pytesseract]

- fixed spelling
- removed unneeded dependency

* [py-pytesseract] update import

Co-authored-by: Viv Eric Hafener <vehrc@sporcbuild.rc.rit.edu>
2022-09-08 15:24:42 +00:00
Howard Pritchard
a7fe137941 ucx: add 1.13.1 release (#32556)
Signed-off-by: Howard Pritchard <howardp@lanl.gov>

Signed-off-by: Howard Pritchard <howardp@lanl.gov>
2022-09-08 08:24:21 +02:00
Peter Scheibel
021ff1c7da Skip test which fails randomly on Python 2.7 (#32569) 2022-09-08 08:11:20 +02:00
Tom Scogland
762ba27036 Make GHA tests parallel by using xdist (#32361)
* Add two no-op jobs named "all-prechecks" and "all"

These are a suggestion from @tgamblin, they are stable named markers we
can use from gitlab and possibly for required checks to make CI more
resilient to refactors changing the names of specific checks.

* Enable parallel testing using xdist for unit testing in CI

* Normalize tmp paths to deal with macos

* add -u flag compatibility to spack python

As of now, it is accepted and ignored.  The entire reason for doing this
is usage with xdist, where `spack python` is invoked as
`python -u spack python` and is then passed `-u` by xdist; a sketch of the
handling follows after this list.  It should never be used without
explicitly passing -u to the executing python interpreter.

* use spack python in xdist to support python 2

When running on python2, spack has many import cycles unless started
through main.  To allow that, this uses `spack python` as the
interpreter, leveraging the `-u` support so xdist doesn't error out when
it unconditionally requests unbuffered binary IO.

* Use shutil.move to account for tmpdir being in a separate filesystem sometimes
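
A minimal sketch of the accept-and-ignore `-u` handling mentioned in the first item above; the argument wiring is illustrative, not Spack's actual CLI code.

```python
import argparse

# Accept "-u" for compatibility with tools (like xdist) that pass it to the
# interpreter; the flag is recorded but deliberately ignored.
parser = argparse.ArgumentParser(prog="spack python")
parser.add_argument("-u", dest="unbuffered", action="store_true",
                    help="accepted for compatibility and ignored")
args, remaining = parser.parse_known_args(["-u", "script.py"])
assert args.unbuffered and remaining == ["script.py"]
```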
2022-09-07 20:12:57 +02:00
Satish Balay
8e5ccddc13 llvm: add 15.0.0 (#32536) 2022-09-07 10:42:01 -07:00
Robert Blake
5bb175aede Upgrade py-kosh to 2022-08-10 version. (#32541) 2022-09-06 21:57:57 -06:00
Laura Bellentani
92e0dbde03 quantum-espresso: add cuda support (#31869) 2022-09-06 14:45:57 -06:00
Adam J. Stewart
ab82cc5257 py-tensorflow: add v2.7.4, v2.8.3, v2.9.2 (#32500)
* py-tensorflow: add v2.7.4

* py-tensorflow: add v2.8.3

* py-tensorflow: add v2.9.2
2022-09-06 15:44:51 -05:00
luker
92018261aa update libflame to work with crayCC, craycc, ... (#32533)
update libflame to work with the crayCC, craycc, crayftn compiler wrappers. These lightweight compiler drivers do not add `-L<lib_path>` like the CC/cc/ftn compiler drivers do. I've made a slight change to add the lib directories.
2022-09-06 11:14:06 -06:00
Seth R. Johnson
c7292aa4b6 Fix spack locking on some NFS systems (#32426)
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2022-09-06 09:50:59 -07:00
Massimiliano Culpo
d7d59a24d1 Mark a test xfail on Python 2.7 (#32526)
refers #32470
2022-09-06 11:56:44 +02:00
Cory Bloor
b093929f91 rocthrust: add amdgpu_target and spack build test (#31203)
This change adds support for building the rocthrust tests and adds the `amdgpu_target` 
variant to the `rocthrust` package.

- [x] rocthrust: add amdgpu_target and spack build test
- [x] Drop numactl as it is not a direct dependency
2022-09-04 19:18:33 -07:00
Tom Scogland
c25b7ea898 Apply hip workaround for raja-framework (#32469)
* add workaround for broken behavior in HIP

HIP has a longstanding CMake issue where it calculates include paths
incorrectly; this works around it for raja and adds an explicit rocprim
dependency.

* propagate openmp requirement and workaround to camp

* refactor and include umpire

* propagate openmp option to camp in umpire and use main camp for main and develop raja and umpire

* bump camp to new patch release
2022-09-02 16:55:06 -07:00
Tom Scogland
69f7a8f4d1 Reorder workflow execution in GHA (#32183)
This patchset refactors our GitHub actions into a single top-level ci workflow that
invokes a series of reusable actions.  The main goal of this is to be able to easily
control which tests run and in what order based on the success or failure of top-level
prechecks.  Our previous workflows ran in three sets:

* nix tests: style and verification first, then linux and macos tests if successful
* windows tests: style and verification first, then linux and macos tests if successful
* bootstrap tests

As a result, the bootstrap tests ran even if the style failed, and style and verification
had to run on two different platforms despite running identical checks.  I'm relatively
sure that's because of the limitation on dependencies between steps in the jobs.
Reusable workflows allow us to run the style, verification and now audit checks once,
then depending on the results, and the files changed, run the appropriate nix, windows
and bootstrap tests.  While it saves only a few minutes by itself, this makes it easier to
refactor checks to subset tests without having to replicate tests or other workflow
components in the future.
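
A hedged sketch of the reusable-workflow wiring; the file names and job ids are hypothetical, not the repository's actual workflow files.

```yaml
# Hypothetical top-level .github/workflows/ci.yaml calling reusable workflows.
jobs:
  prechecks:
    uses: ./.github/workflows/style-and-verification.yaml
  unit-tests:
    needs: [prechecks]
    uses: ./.github/workflows/unit-tests.yaml
  bootstrap:
    needs: [prechecks]
    uses: ./.github/workflows/bootstrap.yaml
```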

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2022-09-02 14:09:23 -07:00
Harmen Stoppels
80389911cc Update bootstrap buildcache to v0.3 (#32262)
This release allows bootstrapping patchelf from binaries.
2022-09-02 12:48:46 +02:00
Timothy Brown
a2e829c7b9 [CURL] New version. (#32481)
Adding a new version of curl. This addresses issue [9081](https://github.com/curl/curl/issues/9081).
2022-09-02 01:57:58 -06:00
Alex Hedges
85446f7a96 tree: add 2.0.3 (#32463)
Fix GCC compiler warnings due to not using C99 mode

CC should be overridden with Spack's value, and the other flags needed
to be copied from the Makefile.
2022-09-02 09:44:43 +02:00
Adam J. Stewart
de623f240c meson: add maintainer (#32460) 2022-09-01 20:45:46 -07:00
Adam J. Stewart
7796bf3fa0 py-pytorch-lightning: add v1.7.4 (#32464) 2022-09-01 22:15:52 +00:00
Scott Wittenburg
6239198d65 Fix cause of checksum failures in public binary mirror (#32407)
Move the copying of the buildcache to a root job that runs after all the child
pipelines have finished, so that the operation can be coordinated across all
child pipelines to remove the possibility of race conditions during potentially
simultaneous copies. This lets us ensure the .spec.json.sig and .spack files
for any spec in the root mirror always come from the same child pipeline
mirror (though which pipeline is arbitrary).  It also allows us to avoid
copying duplicates, which we now do.
2022-09-01 15:29:44 -06:00
H. Joe Lee
d9313cf561 Upgrade version from 0.7.0-beta to 0.8.0-beta. (#32442)
The version 0.8.0-beta is released.
2022-09-01 14:53:48 -06:00
Axel Huebl
ce1500fad3 Add: py-sphinx-design (#32482)
* Add: py-sphinx-design

Needed for #32480
2022-09-01 19:25:59 +00:00
kwryankrattiger
117b0af831 Backport fix for buliding vtk-m diy with GCC 11 (#32465) 2022-09-01 11:41:20 -07:00
Peter Scheibel
2968ae667f New command, spack change, to change existing env specs (#31995)
If you have an environment like

```
$ cat spack.yaml
spack:
  specs: [openmpi@4.1.0+cuda]
```

this PR provides a new command `spack change` that you can use to adjust environment specs from the command line:

```
$ spack change openmpi~cuda
$ cat spack.yaml
spack:
  specs: [openmpi@4.1.0~cuda]
```

in other words, this allows you to tweak the details of environment specs from the command line.

Notes:

* This is only allowed for environments that do not define matrices
  * This is possible but not anticipated to be needed immediately
  * If this were done, it should probably only be done for "named"/not-anonymous specs (i.e. we can change `openmpi+cuda` but not spec like `+cuda` or `@4.0.1~cuda`)
2022-09-01 11:04:01 -07:00
Graeme A Stewart
92b72f186e root: apply 6.26 COMPILE_DEFINITIONS patch unconditionally (#32472)
This bug isn't per se tied to the root7 variant, so the patch should
always be applied for these ROOT releases
2022-09-01 17:57:50 +02:00
Weiqun Zhang
3d67c58436 amrex: add v22.09 (#32477) 2022-09-01 09:45:49 -06:00
Massimiliano Culpo
01298287f6 Fix CI for package only PRs (#32473) 2022-09-01 14:41:28 +02:00
Jordan Ogas
9dca54fdc8 use bash for autoreconf (#32466) 2022-09-01 02:29:58 -06:00
Massimiliano Culpo
0df0b9a505 Port package sanity unit tests to audits (#32405) 2022-09-01 08:21:12 +02:00
Michael Kuhn
c4647a9c1f meson: add 0.63.1 (#32441) 2022-08-31 20:45:52 -06:00
marcost2
597af9210f perl-bignum: Adding perl module (#31590) 2022-08-31 21:07:41 -05:00
Adam J. Stewart
986e8fd6c5 py-black: add v22.8.0 (#32468) 2022-08-31 19:53:46 -06:00
Wouter Deconinck
c75c27c95c gdk-pixbuf: only build tests when requested (#32452)
Building the tests is optional [as of 2.42.9](801eef111d). This applies that option in the build.

The option was added upstream to deal with test build failures in sandboxed environments and with certain glibc versions (caused by glib gresources). For example, with the latest version of glibc and in the latest version of Docker these tests [cannot be built](https://github.com/moby/moby/issues/43595).
2022-08-31 16:16:22 -07:00
Adam J. Stewart
f1f831edef py-pandas: add v1.4.4 (#32459) 2022-08-31 14:27:15 -07:00
Satish Balay
60f37e8c88 llvm: fix 15.0.0rc builds on MacOS with command-line-tools (#32397)
* llvm: fix 15.0.0rc builds on MacOS with command-line-tools

Ref: https://github.com/llvm/llvm-project/issues/57037

i.e., use -DBUILTINS_CMAKE_ARGS=-DCOMPILER_RT_ENABLE_IOS=OFF. But this needs switching "compiler-rt" from "projects" to "runtimes".

Also fixing the warnings below fixes compile errors

CMake Warning at CMakeLists.txt:101 (message):
  Using LLVM_ENABLE_PROJECTS=libcxx is deprecated now, please use
  -DLLVM_ENABLE_RUNTIMES=libcxx or see the instructions at
  https://libcxx.llvm.org/BuildingLibcxx.html for building the runtimes.


CMake Warning at CMakeLists.txt:101 (message):
  Using LLVM_ENABLE_PROJECTS=libcxxabi is deprecated now, please use
  -DLLVM_ENABLE_RUNTIMES=libcxxabi or see the instructions at
  https://libcxx.llvm.org/BuildingLibcxx.html for building the runtimes.


CMake Warning at CMakeLists.txt:101 (message):
  Using LLVM_ENABLE_PROJECTS=libunwind is deprecated now, please use
  -DLLVM_ENABLE_RUNTIMES=libunwind or see the instructions at
  https://libcxx.llvm.org/BuildingLibcxx.html for building the runtimes.


/private/var/folders/nt/_m1t_x7j76q6sl3xt91tqgs00000gn/T/balay/spack-stage/spack-stage-llvm-15.0.0-rc2-h2t5bohzyy7exz2ub3m42pfycjcmbndk/spack-build-h2t5boh/include/c++/v1/cstdlib:135:9: error: no member named 'at_quick_exit' in the global namespace
using ::at_quick_exit _LIBCPP_USING_IF_EXISTS;
      ~~^

* Update var/spack/repos/builtin/packages/llvm/package.py

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2022-08-31 13:29:07 -07:00
Massimiliano Culpo
0c6e3188ac ASP-based solver: allow to reuse installed externals (#31558)
fixes #31484

Before this change, if anything matched an external
condition, it was considered "external" and thus something
to be "built".

This was happening in particular to external packages
that were re-read from the DB, which then couldn't be
reused, causing the problems shown in #31484.

This PR fixes the issue by excluding specs with a
"hash" from being considered "external"

* Test that users have a way to select a virtual

This ought to be solved by extending the "require"
attribute to virtual packages, so that one can:
```yaml
mpi:
  require: 'multi-provider-mpi'
```

* Prevent conflicts to be enforced on specs that can be reused.

* Rename the "external_only" fact to "buildable_false", to better reflect its origin
2022-08-31 20:05:55 +00:00
Seth R. Johnson
08261af4ab py-breathe: add new version and improve version constraints (#31857)
* py-breathe: add new version and improve version constraints

* py-breathe: everyone loves versions

```
py-breathe, py-breathe in the air
don't be afraid to care
```

* Update var/spack/repos/builtin/packages/py-breathe/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* add comment

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-08-31 17:23:19 +00:00
Larry Knox
62e4177e44 hdf5-vol-log: Update package versions and update HDF5 dependency to version 1.13.2 (#32448)
* Add HDF5 version 1.13.2.
Remove HDF5 versions 1.13.0 and 1.13.1.

* Correct formatting.

* Update vol-log-based versions and HDF5 dependency version.
2022-08-31 06:29:15 -07:00
Ivan Maidanski
74506a2a83 bdw-gc: add v8.2.2 (#32453) 2022-08-31 06:18:49 -07:00
Adam J. Stewart
5fcfce18fd py-gpytorch: add v1.9.0 (#32445) 2022-08-30 11:34:06 -07:00
Adam J. Stewart
c07c4629ff py-kornia: add v0.6.7 (#32447) 2022-08-30 11:33:49 -07:00
Tamara Dahlgren
3894ceebc9 Environments: Add support for include URLs (#29026)
* Preliminary support for include URLs in spack.yaml (environment) files

This commit adds support in environments for external configuration files obtained from a URL, with a preference for grabbing raw text from GitHub and GitLab for efficient downloads of the relevant files. The URL can also be a link to a directory that contains multiple configuration files.

Remote configuration files are retrieved and cached for the environment. Configuration files with the same name will not be overwritten once cached.
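
A hedged sketch of what such an environment file might look like; the URL is a placeholder.

```yaml
# Hypothetical spack.yaml that includes remote configuration by URL.
spack:
  include:
  - https://raw.githubusercontent.com/example-org/configs/main/packages.yaml
  specs: []
```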
2022-08-30 11:13:23 -07:00
Ryan Marcellino
b35a0b1b40 New package: py-mkdocstrings-python (#32421) 2022-08-30 10:27:02 -07:00
Gregory Lee
543a797d1a added STAT 4.2.0 and updated deps (#32385)
* added STAT 4.2.0 and updated deps

* launchmon package fix and style fixes

* launchmon package fix

* Update var/spack/repos/builtin/packages/launchmon/package.py

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2022-08-30 09:25:55 -06:00
Ivan Maidanski
08c67302fc libatomic_ops: add v7.6.14 (#32418)
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2022-08-30 06:00:09 -07:00
Harmen Stoppels
33cb61afb9 libxml2: fix building with nvhpc (#32440) 2022-08-30 06:13:47 -06:00
Wouter Deconinck
43ae15a887 root: new bugfix version 6.26.06 (#32230) 2022-08-30 10:31:17 +02:00
Toyohisa Kameyama
a735dc027d openmx: fix build with the Fujitsu compiler. (#32389) 2022-08-30 10:26:15 +02:00
Olivier Cessenat
5be9f4dfef coreutils: add support for external find (#32414) 2022-08-30 02:26:02 -06:00
David M. Rogers
d75234b675 hipsycl: fix building on OSX. Patch boost to be backward-compatible. (#31311)
Co-authored-by: frobnitzem <frobnitzem@users.noreply.github.com>
2022-08-30 10:22:34 +02:00
luker
3779e645b8 scorep: modify configure flags for Cray platforms (#32201) 2022-08-30 00:25:35 -07:00
Olivier Cessenat
7f4799a4a2 visit: actually set the dev env for plugins again (#32427) 2022-08-30 08:02:23 +02:00
Jim Edwards
a09d4ffb4e esmf: update package for cce (cray) compiler (#32433) 2022-08-30 07:53:45 +02:00
robgics
0c9b4bc3d2 Initial commit for freesurfer package (#32395)
* Initial commit for freesurfer package

* Add myself as maintainer

* Change URLs to https and move url function to proper place.
2022-08-29 14:35:34 -07:00
snehring
778af070f0 snphylo: update to new version (#32425) 2022-08-29 11:56:57 -07:00
Brian Van Essen
bd815fbada Switched LBANN to using an explicit variant to enable unit testing (#32429)
support, rather than relying on the spack install --test root option,
because it doesn't play nicely in environments.
2022-08-29 12:53:45 -06:00
Luca Heltai
1c22af8ef4 dealii: fixed application of stdcxx variant (#32413)
* Fixed imposition of stdcxx.

* Fixed style.
2022-08-29 08:41:17 -07:00
Seth R. Johnson
9687d91d72 celeritas: new version 0.1.1 (#32412) 2022-08-29 08:34:16 -07:00
Wouter Deconinck
a14d228e8b gdk-pixbuf: new bugfix version 2.42.9 (#32420)
No changes of note. Just bugfixes. Full diff at https://gitlab.gnome.org/GNOME/gdk-pixbuf/-/compare/2.42.6...2.42.9?from_project_id=1548
2022-08-29 07:48:48 -07:00
Ryan Marcellino
b64d2393ea New package: py-mkdocstrings (#32417) 2022-08-28 14:34:02 -07:00
Ryan Marcellino
d0173a486f New package: py-mkdocs-jupyter (#32416) 2022-08-28 14:32:50 -07:00
Sergey Kosukhin
215379cc27 libxml2: add new versions (#32394)
* libxml2: add new versions

* libxml2: apply patch preventing SIGFPE
2022-08-27 02:29:49 -06:00
Adam J. Stewart
a7cc58d42c py-radiant-mlhub: add v0.5 (#32406) 2022-08-26 18:02:02 -06:00
downloadico
430b8ca362 add sra-tools and ncbi-vdb packages (#32403)
* ncbi-vdb: add ncbi-vdb package to spack
ncbi-vdb provides the interface to the NCBI VDB

* sra-tools: add sra-tools to spack
2022-08-26 17:38:27 -06:00
Richard Berger
e6f6de406d py-sphinx-rtd-theme: add version 1.0.0 (#32379) 2022-08-26 17:38:14 -06:00
Betsy McPhail
ad95719a1d Use threading.TIMEOUT_MAX when available (#32399)
This value was introduced in Python 3.2. Specifying a timeout greater than
this value will raise an OverflowError.
2022-08-26 17:37:56 -06:00
Jen Herting
59bfa6bcd3 [py-pybedtools] added version 0.9.0 (#32401) 2022-08-26 21:44:53 +00:00
dlkuehn
c7acda0062 New packages: roary plus 5 new perl-* (#32217)
* New package: roary
2022-08-26 14:29:44 -06:00
Massimiliano Culpo
51244abee9 Configuration: Allow requirements for virtual packages (#32369)
Extend the semantics of package requirements to
allow using them also under a virtual package
attribute in packages.yaml

These requirements are enforced whenever that
virtual spec is present in the DAG.
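
A hedged sketch of the syntax; the chosen provider is a placeholder.

```yaml
# Hypothetical packages.yaml: a requirement attached to a virtual package,
# enforced whenever the virtual spec appears in the DAG.
packages:
  mpi:
    require: "openmpi"
```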
2022-08-26 13:17:40 -07:00
kwryankrattiger
eb1c9c1583 SDK: Option to disable fortran support. (#32319)
This option was requested to support MacOS systems that do not have
fortran installed.
2022-08-26 11:12:13 -07:00
Ryan Marcellino
c227e7ab13 New package: py-mkdocs-material (#32387) 2022-08-26 10:32:03 -07:00
Ryan Marcellino
987c067f68 New package: py-mkdocs-autorefs (#32386) 2022-08-26 10:30:28 -07:00
Dom Heinzeller
26a16dc5bb Add pic variant to bacio (#32388)
* Add pic variant to bacio

* Fix black style error
2022-08-26 09:24:34 -07:00
François Trahay
e41f9aad38 add eztrace 2.0 release (#32350)
* [EZTrace] Version update to 2.0

* Update var/spack/repos/builtin/packages/eztrace/package.py

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2022-08-26 08:45:05 -07:00
Massimiliano Culpo
b6ea2a46d1 Update archspec to latest commit (#32368)
Modifications:

- [x] Add graviton3
- [x] Optimize __eq__ for microarchitectures
2022-08-26 12:58:20 +02:00
Tom Scogland
a2f0588004 calculate make_jobs in ninja (#32384)
This is in case ninja's setup_dependent_package is called before its
module is initialized by build_environment
2022-08-26 03:37:53 -06:00
Chris White
b31cd189a7 Improve error message for yaml config file (#32382)
* improve error message

* Update lib/spack/spack/config.py

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2022-08-26 02:43:03 +00:00
Wouter Deconinck
3ba58d85e2 libzmq: add patch for gcc-12; conflict for older gcc-11 versions (#32381)
Per #32214, the existing patch `92b2c...` for `gcc@11:` only applies
cleanly for `libzmq@4.3.3:4.3.4`. This PR adds a conflict for earlier
versions, as they cannot be patched due to different context.

For `gcc@11`, this leaves the most recent two versions available (a
satisfactory compromise).

For `gcc@12`, however, there is another existing conflict that makes
these most recent two versions unavailable. This PR adds an upstream
patch for the single most recent version that allows compilation with
`gcc@12` for that most recent version.

Starting point:
- `gcc@11` concretizes on all versions, attempts to apply patch on
`@4.2.3:4.3.4`, and only succeeds to apply patch on `@4.3.3:4.3.4`,
- `gcc@12` concretizes on `@:4.3.1` (and `@master`), attempts to apply
patch on `@4.2.3:4.3.1`, fails to apply patch on all.

Ending point:
- `gcc@11` concretizes on `@4.3.3:4.3.4` (and `@master`), attempts and
succeeds to apply patch on `@4.3.3:4.3.4`,
- `gcc@12` concretizes on `@4.3.4` (and `@master`), attempts and
succeeds to apply patch on `@4.3.4`.

Verified with environment build:
```yaml
spack:
  specs:
  - libzmq@4.3.4%gcc@12.1.0
  - libzmq@4.3.4%gcc@11.3.0
  - libzmq@4.3.3%gcc@11.3.0
  view: false
```
which returns the following:
```console
16:14:47 wdconinc@menelaos ~/git/spack (libzmq-patch-gcc12 *+$%=) $
spack install --fresh
==> Installing environment libzmq
[+]
/opt/software/linux-ubuntu22.04-skylake/gcc-12.1.0/libmd-1.0.4-egpgd6eoaqtsl5fja2iwsl6gyc4o32p5
[+]
/opt/software/linux-ubuntu22.04-skylake/gcc-12.1.0/libsodium-1.0.18-af3rsfnvck6anxf7eeog3f2bph44tjia
[+]
/opt/software/linux-ubuntu22.04-skylake/gcc-12.1.0/pkgconf-1.8.0-z5of2hj2c6ygd3kxr4cwv7u7t42sxair
[+]
/opt/software/linux-ubuntu22.04-skylake/gcc-11.3.0/libmd-1.0.4-tec234gco2sd7n52dkwbrad43sdhaw4o
[+]
/opt/software/linux-ubuntu22.04-skylake/gcc-11.3.0/libsodium-1.0.18-uljf675u3yrn5c7fdjdpa5c7qnnkynke
[+]
/opt/software/linux-ubuntu22.04-skylake/gcc-11.3.0/pkgconf-1.8.0-l4hzc2g4pnn7dwyttphmxivt3xghvpoq
[+]
/opt/software/linux-ubuntu22.04-skylake/gcc-12.1.0/libbsd-0.11.5-fi3ri64moy45ksr4sf5pcwd6k23dsa4o
[+]
/opt/software/linux-ubuntu22.04-skylake/gcc-11.3.0/libbsd-0.11.5-2matmm7im7oygrr77k7wznttv4rbupfz
==> Installing libzmq-4.3.4-t7ad54q3atrnte4rzq7g7cfjcw5227pr
==> No binary for libzmq-4.3.4-t7ad54q3atrnte4rzq7g7cfjcw5227pr found:
installing from source
==> Fetching
c593001a89.tar.gz
==> Fetching
310b8aa57a
==> Fetching
https://github.com/zeromq/libzmq/pull/4334.patch?full_index=1
==> Applied patch
92b2c38a2c.patch?full_index=1
==> Applied patch
https://github.com/zeromq/libzmq/pull/4334.patch?full_index=1
==> libzmq: Executing phase: 'autoreconf'
==> libzmq: Executing phase: 'configure'
==> libzmq: Executing phase: 'build'
==> libzmq: Executing phase: 'install'
==> libzmq: Successfully installed
libzmq-4.3.4-t7ad54q3atrnte4rzq7g7cfjcw5227pr
  Fetch: 0.61s.  Build: 1m 31.57s.  Total: 1m 32.18s.
[+]
/opt/software/linux-ubuntu22.04-skylake/gcc-12.1.0/libzmq-4.3.4-t7ad54q3atrnte4rzq7g7cfjcw5227pr
==> Installing libzmq-4.3.3-pxrd6piprucu65bkro2ixms6d3x2eudz
==> No binary for libzmq-4.3.3-pxrd6piprucu65bkro2ixms6d3x2eudz found:
installing from source
==> Fetching
9d9285db37.tar.gz
==> Using cached archive:
/home/wdconinc/.spack/cache/_source-cache/archive/31/310b8aa57a8ea77b7ac74debb3bf928cbafdef5e7ca35beaac5d9c61c7edd239
==> Applied patch
92b2c38a2c.patch?full_index=1
==> libzmq: Executing phase: 'autoreconf'
==> libzmq: Executing phase: 'configure'
==> libzmq: Executing phase: 'build'
==> libzmq: Executing phase: 'install'
==> libzmq: Successfully installed
libzmq-4.3.3-pxrd6piprucu65bkro2ixms6d3x2eudz
  Fetch: 0.93s.  Build: 11.55s.  Total: 12.48s.
[+]
/opt/software/linux-ubuntu22.04-skylake/gcc-11.3.0/libzmq-4.3.3-pxrd6piprucu65bkro2ixms6d3x2eudz
==> Installing libzmq-4.3.4-hiil6dzy2reb4nk555zztwh44rpbyv4h
==> No binary for libzmq-4.3.4-hiil6dzy2reb4nk555zztwh44rpbyv4h found:
installing from source
==> Using cached archive:
/home/wdconinc/.spack/cache/_source-cache/archive/c5/c593001a89f5a85dd2ddf564805deb860e02471171b3f204944857336295c3e5.tar.gz
==> Using cached archive:
/home/wdconinc/.spack/cache/_source-cache/archive/31/310b8aa57a8ea77b7ac74debb3bf928cbafdef5e7ca35beaac5d9c61c7edd239
==> Applied patch
92b2c38a2c.patch?full_index=1
==> libzmq: Executing phase: 'autoreconf'
==> libzmq: Executing phase: 'configure'
==> libzmq: Executing phase: 'build'
==> libzmq: Executing phase: 'install'
==> libzmq: Successfully installed
libzmq-4.3.4-hiil6dzy2reb4nk555zztwh44rpbyv4h
  Fetch: 0.01s.  Build: 10.77s.  Total: 10.78s.
[+]
/opt/software/linux-ubuntu22.04-skylake/gcc-11.3.0/libzmq-4.3.4-hiil6dzy2reb4nk555zztwh44rpbyv4h
```
2022-08-25 19:13:41 -06:00
Adam J. Stewart
c6842605ac py-pytorch-lightning: add v1.7.3 (#32383) 2022-08-25 18:01:50 -06:00
snehring
4d10cdb7e8 phylip: adding workarounds for gcc10+ (#32376)
* phylip: adding workarounds for gcc10+

* phylip: switch to spec.satisfies
2022-08-25 16:29:40 -06:00
Massimiliano Culpo
1bffa46d4d zig: add v0.9.1 (#32380) 2022-08-25 14:38:26 -07:00
Richard Berger
6e420c3556 charliecloud: add versions 0.28 and 0.29 (#32378) 2022-08-25 13:21:27 -07:00
Ryan Marcellino
a476f0fa59 New package: py-mkdocs (#32363) 2022-08-25 12:37:20 -07:00
AMD Toolchain Support
6fb13c0fc5 fixed cp2k libcp2k.pc (#32349)
Co-authored-by: Prasanthi <pdonthir@amd.com>
2022-08-25 12:01:02 -07:00
Glenn Johnson
c2291f7eb3 new packages: r-cmdstanr and cmdstan (#32364) 2022-08-25 11:58:31 -07:00
Glenn Johnson
925a99a043 new package: r-glmgampoi (#32366)
* new package: r-glmgampoi

* Remove duplicate depends_on directive
2022-08-25 11:50:11 -07:00
Glenn Johnson
883b96aeb5 new packages: r-scdblfinder and dependencies (#32367)
- r-bluster
- r-metapod
- r-scran
2022-08-25 11:47:12 -07:00
Matthieu Dorier
9204bd6204 [ycsb] fixes build process and installation of YCSB (#32213)
* [ycsb] fixes build process and installation of YCSB

* [ycsb] fixing style

* [ycsb] removed extra newline
2022-08-25 11:26:08 -07:00
Wileam Y. Phan
b482c71b43 New package: glab (#32251) 2022-08-25 11:23:15 -07:00
robgics
ff2874333d ddt: Initial commit (#32338)
* ddt: Initial commit

* ddt: Minor change to license date to appease the CI gods

* Get rid of unattractive extra line

* Switch to sha256 instead of md5
2022-08-25 11:03:07 -07:00
Robert Pavel
7214a438dc Adding e3sm-kernels to Spack (#32341)
* Initial Draft of E3SM-Kernels Spackage

Initial draft of E3SM-Kernels. Currently no support for nested_loops due
to build system limitations

* Style Check and Fixed gfortran Check

Fixed style issues and changed gnu toolchain check to a gfortran check
due to hybrid compilers (e.g. clang+gfortran)

* Fixed Style Issues
2022-08-25 11:01:07 -07:00
Lucas Frérot
06ba4d5c28 lammps: added build dependency to py-setuptools (#32351)
LAMMPS' setup.py uses setuptools as of lammps/lammps@2ed8e5cf02
2022-08-25 10:40:07 -07:00
Mikael Simberg
1c3979dcfa Add patch for missing template instantiation in pika 0.7.0 on macOS (#32370) 2022-08-25 11:30:00 -06:00
Melven Roehrig-Zoellner
f0925e1823 tixi: new variants (fortran,shared) (#32356)
* tixi: new variants (fortran,shared)

As of some tixi 3 versions, additional CMake flags are needed to build
tixi with shared libraries and with Fortran support.

* tixi: fix style
2022-08-25 10:27:00 -07:00
snehring
ba87413eeb snap-berkeley: update to version 2.0.1 (#32358) 2022-08-25 10:21:53 -07:00
Tyler Funnell
a4cbdba388 fish: adding v3.5.1 (#32362) 2022-08-25 10:18:39 -07:00
snehring
70388da974 libfabric: adding missing dep for opx fabric (#32377) 2022-08-25 10:10:30 -07:00
Sreenivasa Murthy Kolam
f0df4b653d Changes for ROCm-5.2.0 changes and new recipe rocwmma (#31667)
* Changes for ROCm-5.2.0 changes and new recipe rocwmma

* modify the maintainers for hipify-clang

* address review comments

* update the rocwmma new recipe as per latest syntax

* fix style errors

* modify the patch file to provide the details about the patch

* fix style errors
2022-08-25 10:00:02 -07:00
Adam J. Stewart
507e06b5d2 Python: add new maintainer (#31755) 2022-08-25 09:46:09 -07:00
H. Joe Lee
c8f8b6957d libfabric: Match main version. (#32342)
* Match main version.

`develop` no longer exists.

* Change develop to main.
2022-08-25 09:25:43 -07:00
robgics
10ac24874b cplex: add package.py (#32337)
* cplex: add package.py

* Update var/spack/repos/builtin/packages/cplex/package.py

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2022-08-25 09:11:40 -07:00
Massimiliano Culpo
8654197c56 ASP-based solver: prevent the use of hashes that are not available (#32354)
fixes #32348
2022-08-25 17:55:58 +02:00
Dom Heinzeller
b332c1055c Update ESMF: bug fix for esmf@8.3.0b09 with PIO (#32360) 2022-08-25 08:52:54 -07:00
Christian Mauch
7e0d8fce89 py-pynn: add version 0.10.0 (#32307) 2022-08-25 08:02:52 -07:00
Robert Cohn
c56abe4247 oneapi e4s: use require: to force gcc build for some packages (#32309) 2022-08-24 16:22:02 -04:00
Graeme A Stewart
ee02623171 whizard: Fix passing of build options, update versions (#32326)
* whizard: Fix passing of build options, update versions

The dependency of whizard on libtirpc is now correctly passed down as
an autotools option.

Update known versions of package with 3.0.2 and 3.0.3.

* Express path to headers via spec object methods
2022-08-24 06:58:48 -07:00
otsukay
6fa6b8a903 fujitsu-fftw: add option for enabling thread support (#31426)
Co-authored-by: Yuichi Otsuka <otsukay@riken.jp>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Co-authored-by: otsukay <otsukay@users.noreply.github.com>
2022-08-24 11:36:14 +02:00
Atsushi Hori
14e99a1a5d process-in-process: overhaul package recipe (#32347) 2022-08-24 11:35:08 +02:00
Massimiliano Culpo
e2468c8928 Allow default requirements in packages.yaml (#32260)
Allow users to express default requirements in packages.yaml. 

These requirements are overridden if more specific requirements
are present for a given package.
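
A hedged sketch of a default requirement and a more specific override; the compiler choices are placeholders.

```yaml
# Hypothetical packages.yaml: a default requirement under "all" that a
# package-specific requirement overrides.
packages:
  all:
    require: "%gcc"
  cmake:
    require: "%clang"   # overrides the default for cmake only
```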
2022-08-24 09:33:55 +02:00
Ryan Marcellino
b4df535e8d New package: py-mkdocs-material-extensions (#32334) 2022-08-23 23:29:56 -06:00
Ryan Marcellino
af4788fdef New package: py-griffe and new build backend: py-pdm-pep517 (#32335)
* New package: py-griffe and new build backend: py-pdm-pep517

* add pdm to build backend docs
2022-08-23 23:23:23 -05:00
Cyrus Harrison
a2bdbfcd24 add conduit 0.8.4 release (#32343) 2022-08-23 21:33:49 -06:00
Ben Darwin
3fb7fdfc58 minc-toolkit: add runtime flex dependency (#32345)
Some programs such as `minccalc` depend on libfl.so at runtime.
2022-08-23 17:14:07 -06:00
Timothy Brown
7e87c208ba [WRF] Version update to 4.4 (#32046)
* [WRF] Update to version 4.4.

* [WRF] Patches for v4.4.

* Fixing style.

* [@spackbot] updating style on behalf of t-brown
2022-08-23 13:47:23 -07:00
Jim Galarowicz
f8ae2ef8b4 Add new package for MUST runtime correctness analysis tool (#32329)
* Add new package for MUST runtime correctness analysis tool

* Switch checksums to SHA256 to match Spack convention.
2022-08-23 20:43:55 +00:00
Ryan Marcellino
b238a9a457 New package: py-pyyaml-env-tag (#32333) 2022-08-23 12:55:53 -07:00
Harmen Stoppels
8af3743e91 Relative paths in default prefix_inspections start with ./ (#31867) 2022-08-23 12:54:12 -07:00
Harmen Stoppels
707a099b69 spack -e x config --scope=y add z add to scope (#31746)
* `spack -e x config --scope=y add z` adds to the given scope instead of to the environment file.
2022-08-23 12:52:34 -07:00
iarspider
2b680ae7b8 cpu-features: fix fetch failure (#32325)
* cpu-features: fix fetch failure

`master` branch was renamed to `main`

* Update var/spack/repos/builtin/packages/cpu-features/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-08-23 12:37:59 -07:00
Robert Cohn
6654d53611 intel-oneapi-compilers-classic: fix paths (#32305)
* Fix intel-oneapi-compilers-classic bin path

* Update package.py

* intel-oneapi-compilers-classic fix path

Co-authored-by: OfirGan <offirgan@gmail.com>
2022-08-23 12:14:24 -07:00
Ben Darwin
9b6b1a277d minc-toolkit: add MINC_TOOLKIT environment variable at runtime (#32331)
This variable is used by some programs both internal and external to the
toolkit itself to discover shared objects, data, etc.
2022-08-23 11:03:12 -07:00
Toyohisa Kameyama
a08d6e790d py-scipy: Fortran compiler specify code is change to setup.cfg for Fujitsu compiler (#32142)
* py-numpy: Change Fortran detection order for the Fujitsu compiler.

* create setup.cfg instead of command line.
2022-08-23 10:06:49 -07:00
Jen Herting
00feccde34 New package: py-loguru (#32074)
* [py-loguru] New package

* [py-loguru] Removed commented out line

* [py-loguru] Added types removed extra dependencies

* [py-loguru] missing windows dependency. listing windows as a conflict for now

* [py-loguru] depends on py-colorama when platform=windows

* [py-loguru] flake8

* [py-loguru] Import update

* [py-loguru]

- python is a runtime dependency
- setuptools is a build dependency

* [@spackbot] updating style on behalf of qwertos

Co-authored-by: James A Zilberman <jazrc@rit.edu>
Co-authored-by: qwertos <qwertos@users.noreply.github.com>
2022-08-23 10:00:25 -07:00
Jen Herting
2c567edc1d New package: py-python-javabridge (#32075)
* [py-python-javabridge] New package

* [py-python-javabridge] Added dependencies

* [py-python-javabridge] Added types

* [py-python-javabridge] py-setuptools and java are dependencies

* [py-python-javabridge] Import update

* [@spackbot] updating style on behalf of qwertos

Co-authored-by: James A Zilberman <jazrc@rit.edu>
Co-authored-by: qwertos <qwertos@users.noreply.github.com>
2022-08-23 09:59:09 -07:00
Martin Pokorny
51293f3750 libfabric: new versions (#32083)
* libfabric: add new versions

1.15.0, 1.15.1, main (previously named master)

* Add OPX fabric option, with conflict for versions before v1.15.0

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2022-08-23 10:33:49 -06:00
kwryankrattiger
b7024010d3 SDK: Quick fix to allow SDK to build with HIP (#32321) 2022-08-23 07:09:56 -07:00
Simon Pintarelli
44258a7cce meson: create unique names for build directory (#32062)
Taken from CMakePackage
2022-08-23 13:24:32 +02:00
Seth R. Johnson
0f25d3b219 Add cxxstd flag to googletest and default mock to true (#32171) 2022-08-23 13:05:03 +02:00
Seth R. Johnson
79a462c084 Geant4: update cmake defines and add support for nvhpc (#32185)
* geant4: use define function

* geant4: Change new feature from conflicts to when

* geant4: add support/conflicts for nvhpc

* fixup! geant4: add support/conflicts for nvhpc
2022-08-23 12:57:00 +02:00
Zhilin Zheng
b307d6e766 remove compiler optimize level limit of O2 for GCC (#32303)
gfortran can handle O3; it might be better to remove this limitation.
2022-08-23 12:13:47 +02:00
Wouter Deconinck
abdecd2a9b acts: add new dependency on vecmem and autodiff (#26790)
Co-authored-by: Hadrien G. <knights_of_ni@gmx.com>
2022-08-23 11:31:33 +02:00
Tamara Dahlgren
3c3b18d858 spack ci: add support for running stand-alone tests (#27877)
This support requires adding the '--tests' option to 'spack ci rebuild'.
Packages whose stand-alone tests are broken (in the CI environment) can
be configured in gitlab-ci to be skipped by adding them to
broken-tests-packages.
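
A hedged sketch of the configuration key mentioned above; the package name is a placeholder.

```yaml
# Hypothetical gitlab-ci section skipping stand-alone tests for a package
# whose tests are known to be broken in the CI environment.
gitlab-ci:
  broken-tests-packages:
  - some-flaky-package
```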

Highlights include:
- Restructured 'spack ci' help to provide better subcommand summaries;
- Ensured only one InstallError (i.e., installer's) rather than allowing
  build_environment to have its own; and
- Refactored CI and CDash reporting to keep CDash-related properties and
  behavior in a separate class.

This allows stand-alone tests from `spack ci` to run when the `--tests`
option is used.  With `--tests`, stand-alone tests are run **after** a
**successful** (re)build of the package.  Test results are collected
and report(able) using CDash.

This PR adds the following features:
- Adds `-t` and `--tests` to `spack ci rebuild` to run stand-alone tests;
- Adds `--fail-fast` to stop stand-alone tests after the first failure;
- Ensures a *single* `InstallError` across packages
    (i.e., removes second class from build environment);
- Captures skipping tests for externals and uninstalled packages
    (for CDash reporting);
- Copies test logs and outputs to the CI artifacts directory to facilitate
    debugging;
- Parses stand-alone test results to report outputs from each `run_test` as
    separate test parts (CDash reporting);
- Logs a test completion message to allow capture of timing of the last
    `run_test` part;
- Adds the runner description to the CDash site to better distinguish entries
    in CDash tables;
- Adds `gitlab-ci` `broken-tests-packages` to CI configuration to skip
    stand-alone testing for packages with known issues;
- Changes `spack ci --help` so description of each subcommand is a single line;
- Changes `spack ci <subcommand> --help` to provide the full description of
    each command (versus no description); and
- Ensures `junit` test log file ends in an `.xml` extension (versus default where
    it does not).

Tasks:

- [x] Include the equivalent of the architecture information, or at least the host target, in the CDash output
- [x] Upload stand-alone test results files as  `test` artifacts
- [x] Confirm tests are run in GitLab
- [x] Ensure CDash results are uploaded as artifacts
- [x] Resolve issues with CDash build-and-test results appearing on the same row of the table
- [x] Add unit tests  as needed
- [x] Investigate why some (dependency) packages don't have test results (e.g., related from other pipelines)
- [x] Ensure proper parsing and reporting of skipped tests (as `not run`) .. post- #28701 merge
- [x] Restore the proper CDash URL and/or mirror once out-of-band testing is completed
2022-08-23 00:52:48 -07:00
Mikael Simberg
8b49790784 Constrain __skip_rocmclang workaround in pika package (#32208) 2022-08-23 01:41:54 -06:00
robgics
abad24265e ampl: Add new version and add variants for more install control. (#32317) 2022-08-23 09:40:53 +02:00
Daryl W. Grunau
4095666013 eospac: support intel-classic@2021 (#32323)
Co-authored-by: Daryl W. Grunau <dwg@lanl.gov>
2022-08-23 09:14:22 +02:00
Ryan Marcellino
877393e019 py-jupytext: add v1.14.1 (#32313) 2022-08-23 09:12:54 +02:00
kwryankrattiger
0d29bc00ec Kokkos: ROCm and CUDA are not compatible in Kokkos (#32181) 2022-08-23 08:24:56 +02:00
Ryan Marcellino
b203418720 New package: py-mergedeep (#32322) 2022-08-23 00:45:30 -05:00
Ryan Marcellino
3e3e387a45 New package: py-pymdown-extensions (#32324) 2022-08-23 00:44:39 -05:00
Vanessasaurus
04339fbe9a Automated deployment to update package flux-core 2022-08-23 (#32320)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2022-08-22 20:33:42 -06:00
Dom Heinzeller
560b6432cc Update ESMF from JCSDA/NOAA-EMC spack fork (esmf@8.3.0 with external parallelio) (#32222)
* Update ESMF package from JCSDA/NOAA-EMC spack fork

* Update var/spack/repos/builtin/packages/esmf/package.py

Fix url_for_version

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>

* [@spackbot] updating style on behalf of climbfuji

Co-authored-by: Jim Edwards <jedwards@ucar.edu>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
Co-authored-by: climbfuji <climbfuji@users.noreply.github.com>
2022-08-22 17:36:07 -07:00
Cody Balos
bea8936e02 sundials: add v6.3.0 and logging-mpi variant (#32315) 2022-08-22 15:37:59 -06:00
Ryan Marcellino
95e8c4a615 New package: py-ghp-import (#32316) 2022-08-22 14:29:16 -07:00
Wouter Deconinck
c92db8a22a gaudi: consistent test dependencies when +examples (#32134)
* gaudi: consistent test dependencies when +examples

Gaudi requires testing to be enabled for examples to be built, so all
test dependencies are also there for `+examples`. This PR fixes a
missing pytest dependency when `+examples` is used but no testing is
enabled. The construct with the loop is to ensure the identical
dependencies are always used, even as version ranges may start to differ.

Testing with gaudi_add_tests was added in v35r0. Some nosetests and
QMtests were in the tree before, but they were seemingly not accessible. The
effective dependency since 35.0 is also applied for pytest, extending
the range that was there before disentangling `optional`, at
9d67d1e034/var/spack/repos/builtin/packages/gaudi/package.py

* gaudi: version 36.7 in other PR...
2022-08-22 14:24:36 -07:00
Valentin Volkl
afecd70bc8 dd4hep: add version 1.22 (#32209) 2022-08-22 15:21:50 -06:00
Adam J. Stewart
9296a76658 py-shapely: add v1.8.3 (#32218) 2022-08-22 14:09:38 -07:00
Ryan Marcellino
e6e569e07c New package: py-py-f90nml (#32314) 2022-08-22 19:03:42 +00:00
Eric Brugger
ca9b2fe6ad VisIt: Add patch to include jpeg library in install. (#32245) 2022-08-22 11:59:07 -07:00
Mark Abraham
70f0a725cc hwloc: fix build with +oneapi-level-zero (#31622)
Flags passed to configure were erroneously applied twice.

Removed the one that was wrong so that a configure warning is no longer issued.
2022-08-22 12:45:58 -06:00
Barry Rountree
fd80e34037 teckit: Fixes missing xmlparse.h issue. (#32295)
The "release" tarball provided by github lacks several files in
the SFconv/expat/xmlparse directory, including xmlparse.h. Using
tarballs based off of version tags solves the problem.

o Changes version() to use commits associated with version tags.
o Adds several additional versions.
o Adds myself as maintainer.
o Adds hook to execute autogen.sh.
o Adds autotools and related dependencies.
o Removes the expat dependency.
2022-08-22 18:39:28 +00:00
Paul Romano
7fc78b8b0f OpenMC: add v0.13.1 (#32263)
* openmc: add v0.13.1

* Add @paulromano as maintainer of openmc and py-openmc

* Address review comments from @adamjstewart

* Add back MPI variant in openmc package
2022-08-22 18:09:37 +00:00
Adam J. Stewart
1039caabed crunch: add new package (#31980) 2022-08-22 18:05:06 +00:00
Ryan Mast
5edb0fe96a helics: update v3 options and add missing versions (#32121)
* helics: update v3 options and add missing versions

* helics: allow using openssl 3.x for encryption support

* helics: fix style errors

* helics: Apply suggestion from code review

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2022-08-22 17:49:57 +00:00
iarspider
2860f2f70a Add checksum for py-requests 2.27.1, 2.28.0, 2.28.1 (#31896)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2022-08-22 11:46:01 -06:00
Jean-Paul Pelteret
85b8a41ccb deal.II 9.4: Fix issues due to override of CMake FIND_PACKAGE macro (#32079) 2022-08-22 11:37:52 -06:00
Harmen Stoppels
6e5d6ebf99 openmpi: fix patches for %nvhpc (#32308) 2022-08-22 10:57:51 -06:00
Adam J. Stewart
177e2e2672 xgboost: GCC 8+ required (#32282) 2022-08-22 08:48:18 -07:00
Arjun Guha
72b133a875 racket: fix spec (#32304) 2022-08-22 14:50:11 +00:00
Adam J. Stewart
dc0836804f py-rasterio: add v1.3.2 (#32277) 2022-08-22 16:15:33 +02:00
kwryankrattiger
a38beeaed2 HPX: ROCm and Cuda conflict needed (#32178)
Discovered this missing conflict when building the e4s environment with
unify:when_possible. #31940
2022-08-22 16:13:59 +02:00
Kelly (KT) Thompson
1c61f0420a Provide patches provided by consumers of draco. (#32281) 2022-08-22 16:04:58 +02:00
Robert Cohn
00b244853e enable packages that use numpy for oneapi e4s stack (#32293)
* enable geopm oneapi

* enable packages that depend on py-numpy

* disable nan warnings

* undo geopm enable

* undo geopm, it has multiple compile issues
2022-08-22 07:02:22 -07:00
Kendra Long!
1a030aa417 draco: add v7_14_0 (#32267) 2022-08-22 16:02:11 +02:00
Eric Brugger
c28f1c0b42 VisIt: add conduit and mfem variants (#32255) 2022-08-22 15:59:29 +02:00
Olivier Cessenat
a20c9b2d8e keepassxc: option autotype with pcsclite (#32288) 2022-08-22 15:50:41 +02:00
Jonas Thies
88a58abcf8 openfoam-org: add v9 and v10 (#32274) 2022-08-22 15:33:40 +02:00
Rémi Lacroix
e16f0b9f12 Molden: update URLs. (#32207) 2022-08-22 14:48:44 +02:00
Valentin Volkl
4d0612abfc baurmc: correct compile flag (#31305) 2022-08-22 14:28:20 +02:00
Valentin Volkl
408076adf0 prophecy4f: add hep tag, install to share (#31308) 2022-08-22 14:27:54 +02:00
Cory Bloor
dde6d00ab9 hip@4.5.2: fix installation (#31416)
In a fast-moving project with as many forks as LLVM, it's difficult to
accurately determine if a function exists just by checking the version
number. The existing version check fails, for example, with llvm-amdgpu
from ROCm 4.5. It is more robust to directly check if the function
exists.
2022-08-22 12:39:09 +02:00
Alan Sill
4c64a0fab2 VASP: add v6.3.2; handle changes to makefile.include naming pattern (#31701)
Added the SHA256 for version 6.3.2 and added logic to deal with the change of naming pattern for the makefile.include files, which now appears to leave out the "linux_" prefix. (Changes should be backwards compatible.)

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Co-authored-by: alansill <alansill@users.noreply.github.com>
2022-08-22 12:21:00 +02:00
haralmha
01a4844ac4 feynhiggs: update url (#31760) 2022-08-22 12:20:09 +02:00
Adam J. Stewart
2ff132a62f py-isort: add v5.10.1 (#32302) 2022-08-22 11:09:09 +02:00
Olivier Cessenat
f7424e1fea qscintilla: coherence with py-sip (#32128) 2022-08-21 11:32:10 -07:00
Wileam Y. Phan
ab31256ee7 gcc: add 12.2.0 (#32270) 2022-08-21 15:30:37 +02:00
sparkyniner
21e6679056 spack list: add --tag flag (#32016)
* modified list.py and added functionality for --tag

* Removed long and very long, shifted rest of code above return statement

* removed results variable

* added import statement at top

* added the line accidentally deleted

* added line accidentally deleted

* changed p.name to p, added line inside if statement

* line order switched

* [@spackbot] updating style on behalf of sparkyniner

* ran update completion command

* add tests

* Update lib/spack/spack/test/cmd/list.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* [@spackbot] updating style on behalf of sparkyniner

* changed argument to mock_packages and moved code under filter by tag

* removed bad rebase code and added additional test

* [@spackbot] updating style on behalf of sparkyniner

* added line removed earlier

* added line removed earlier

* replaced function

* added more recommended changes

Co-authored-by: sairaj <sairaj@sairajs-MacBook-Pro.local>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-08-20 16:09:44 -06:00
Morten Kristensen
8d02e2cc52 py-vermin: add latest version 1.4.2 (#32287) 2022-08-20 13:35:07 -05:00
Axel Huebl
39beafc99a Ascent, Conduit & VTK-h: don't assume I have Fortran (#32285)
* VTK-h: don't assume I have Fortran

Don't assume I have a working Fortran compiler in my toolchain :)

* Conduit: Do not Assume Fortran

* Ascent: Do not Assume Fortran

* fix style
2022-08-20 10:12:23 -07:00
kwryankrattiger
fff929d5ab WarpX: Add hints for FindMPI (#31842)
This seems to be needed on some cray systems and is safe on normal
desktops.
2022-08-20 15:45:03 +02:00
Olivier Cessenat
51619fdb00 gxsview: new version 2022.05.09 (#32289) 2022-08-20 14:33:10 +02:00
Axel Huebl
7db1c69945 Vis: macOS llvm-openmp (#32284)
Add some OpenMP lib provider for Apple-Clang to the vis packages.
2022-08-20 01:11:03 -07:00
Sreenivasa Murthy Kolam
11a4f5e25d Enable Tensorflow for ROCm. Add ROCm dependencies. (#32248)
* Build Tensorflow using the fork for rocm. Initial commit

* re-order the versions

* fix style errors

* address review comments

* add conflicts for rocm version

* address review comments

* remove rocm variant as its added by ROCmPackage
2022-08-20 01:50:38 +00:00
eugeneswalker
5590cad1ef py-globus-sdk: add new versions; unpin py-cryptography version constraint (#32280)
* add mothra tests

* py-globus-sdk: add new versions; unpin py-cryptography version constraint

* Update var/spack/repos/builtin/packages/py-globus-sdk/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-08-19 19:30:07 -06:00
eugeneswalker
09486c082c py-pyjwt: fix py-cryptography version constraints (#32279)
* py-pyjwt: fix py-cryptography version constraints

* Update var/spack/repos/builtin/packages/py-pyjwt/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-08-19 18:50:20 -06:00
Adam J. Stewart
dab46296b6 py-setuptools: add v64.0.0 (#32078) 2022-08-19 14:20:08 -07:00
Axel Huebl
7f314947dd Doc: WarpX SENSEI Work-Around (#32276)
Follow-up to #31542
2022-08-19 14:10:00 -06:00
Rémi Lacroix
56e4203450 netcdf-cxx: Update the download URL (#32275) 2022-08-19 18:44:14 +00:00
Adam J. Stewart
b0990aa1fa py-pygments: add v2.13.0 (#32168) 2022-08-19 18:09:51 +00:00
Seth R. Johnson
e29d9de2dd ForTrilinos: new versions 2.0.1, 2.1.0 (#32265)
* ForTrilinos: new versions 2.0.1, 2.1.0

I also had to update the checksum for the released 2.0.0: see #32000 for
the explanation and solution to keep this from happening again.

* Soooo stylish
2022-08-19 11:37:55 -06:00
Harmen Stoppels
7acc082fba ncurses 6.3 restrict patch (#32271) 2022-08-19 10:11:32 -07:00
Olivier Cessenat
362cdc5437 pcsclite: add new package (#32129) 2022-08-19 18:22:15 +02:00
eugeneswalker
e1bce8c577 e4s %oneapi: add aml +ze (level zero) (#32161) 2022-08-19 08:25:15 -07:00
Seth R. Johnson
7760625b3f py-macholib: add v1.16 (#31967)
* py-macholib: add v1.16

* Update dependencies

Even 1.11 requires `altgraph (>=0.15)`. The latest requires no
additional dependencies.
2022-08-19 08:25:51 -06:00
dunatotatos
77c2332df6 htslib, samtools: add v1.15 and v1.15.1 (#32259) 2022-08-19 13:56:14 +00:00
Olivier Cessenat
8773c183b5 ghostscript: new version 9.56.1 (#32088) 2022-08-19 15:00:14 +02:00
Harmen Stoppels
2daea9e4b4 docs: add a note about an issue being solved on develop (#32261) 2022-08-19 12:40:41 +00:00
psakievich
abf847ce82 Add messages to assertions in asp.py (#32237)
Assertions without messages produce a blank error message for users when they are hit.

This PR adds error messages to all assertions in asp.py even
if it seems unlikely they will ever be needed.
2022-08-19 14:15:25 +02:00
iarspider
8d99d33128 xpmem: fix configure with ~kernel-module (#32258) 2022-08-19 12:36:11 +02:00
eugeneswalker
a42a874faa e4s oneapi: compiler environment: prepend lib dir to LD_LIBRARY_PATH (#32227) 2022-08-19 06:30:18 -04:00
Mikael Simberg
be62635154 Add tracy variant to pika (#32090) 2022-08-19 02:38:07 -06:00
eugeneswalker
022d59f4a5 e4s: add dealii +cuda (#32159) 2022-08-18 22:41:57 -06:00
Massimiliano Culpo
605d72133b spack.util.package_hash: parametrize unit-tests (#32240)
* spack.util.package_hash: parametrize unit-tests

* Fix comment
2022-08-18 18:17:43 -07:00
eugeneswalker
b2d1d07d2e e4s %oneapi: add amrex +sycl (#32162) 2022-08-18 18:16:55 -07:00
Mark W. Krentel
908055e37c hpctoolkit: update git url from github to gitlab (#32252) 2022-08-18 17:35:34 -07:00
Wouter Deconinck
90352d7223 kassiopeia: new version 3.8.2 (#32254)
* kassiopeia: new version 3.8.2

There was no version 3.8.1, so here is the diff with 3.8.0: https://github.com/KATRIN-Experiment/Kassiopeia/compare/v3.8.0...v3.8.2

Build system changes:
- default `cxxstd` is now 17.

* kassiopeia: updated hash

* kassiopeia: use spec.satisfies

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2022-08-18 17:20:17 -07:00
kwryankrattiger
d77a8f7aa1 Sensei v4 (#30432)
* Sensei: Refactor package to work with v4.0.0

* Add missing MPI dependency
* Patch bug in libsim adapter
* Simplify conflicts with when-clauses
* Conflict variants that are incompatible (catalyst/libsim/ascent)
* Fix paraview version constraints to be more clear
* Add version constraints for VTK
* Drop unneeded visit restrictions
* Specify +vtkm dependency on ParaView's VTKm
* +hl is not needed for VTK, and is already specified in the VTK recipe
when it is needed
* Pass paths for adios2 and ascent packages

* ECP-SDK: Enable sensei

* CI: Add sensei to the data-vis-sdk pipeline

* Sensei: Change VISIT_DIR to work on linux

* Fixup: style check

* Sensei: Add patch for version detection

* CI: revert SDK pipeline in favor of new matrices

* Sensei: Formatting fixes
2022-08-18 17:20:01 -07:00
snehring
364e4e03ef pasta: missing setuptools dep (#32220)
* pasta: missing setuptools dep

* pasta: remove extraneous which

* pasta: convert to PythonPackage

* pasta: specify dendropy version
2022-08-18 22:53:09 +00:00
Adam J. Stewart
373d2ccf9f py-rasterio: add v1.3.1 (#32244) 2022-08-18 21:08:46 +00:00
Gregory Lee
0a8df434d2 added qt-creator 5.0.3 (#32241)
* added qt-creator 5.0.3

* [@spackbot] updating style on behalf of lee218llnl

Co-authored-by: lee218llnl <lee218llnl@users.noreply.github.com>
2022-08-18 19:35:44 +00:00
Sergey Kosukhin
15be91b585 autoconf-archive: fetch patch from github (#32232)
* autoconf-archive: fetch patch from github

* autoconf-archive: do not try to patch libtool
2022-08-18 10:55:04 -07:00
Thomas Gruber
bfa67bb275 Add patch for MEM_* group for Intel Icelake SP for 5.2.0 and newer (#32235) 2022-08-18 10:30:56 -07:00
1372 changed files with 23764 additions and 5867 deletions

View File

@@ -29,6 +29,7 @@ max-line-length = 99
#
per-file-ignores =
var/spack/repos/*/package.py:F403,F405,F821
*-ci-package.py:F403,F405,F821
# exclude things we usually do not want linting for.
# These still get linted when passed explicitly, as when spack flake8 passes

62
.github/ISSUE_TEMPLATE/test_error.yml vendored Normal file
View File

@@ -0,0 +1,62 @@
name: "\U0001F4A5 Tests error"
description: Some package in Spack had stand-alone tests that didn't pass
title: "Testing issue: "
labels: [test-error]
body:
- type: textarea
id: reproduce
attributes:
label: Steps to reproduce the failure(s) or link(s) to test output(s)
description: |
Fill in the test output from the exact spec that is having stand-alone test failures. Links to test outputs (e.g., CDash) can also be provided.
value: |
```console
$ spack spec -I <spec>
...
```
- type: textarea
id: error
attributes:
label: Error message
description: |
Please post the error message from spack inside the `<details>` tag below:
value: |
<details><summary>Error message</summary><pre>
...
</pre></details>
validations:
required: true
- type: textarea
id: information
attributes:
label: Information on your system or the test runner
description: Please include the output of `spack debug report` for your system.
validations:
required: true
- type: markdown
attributes:
value: |
If you have any relevant configuration detail (custom `packages.yaml` or `modules.yaml`, etc.) you can add that here as well.
- type: textarea
id: additional_information
attributes:
label: Additional information
description: |
Please upload test logs or any additional information about the problem.
- type: markdown
attributes:
value: |
Some packages have maintainers who have volunteered to debug build failures. Run `spack maintainers <name-of-the-package>` and **@mention** them here if they exist.
- type: checkboxes
id: checks
attributes:
label: General information
options:
- label: I have reported the version of Spack/Python/Platform/Runner
required: true
- label: I have run `spack maintainers <name-of-the-package>` and **@mentioned** any maintainers
required: true
- label: I have uploaded any available logs
required: true
- label: I have searched the issues of this repo and believe this is not a duplicate
required: true

44
.github/workflows/audit.yaml vendored Normal file
View File

@@ -0,0 +1,44 @@
name: audit
on:
workflow_call:
inputs:
with_coverage:
required: true
type: string
python_version:
required: true
type: string
concurrency:
group: audit-${{inputs.python_version}}-${{github.ref}}-${{github.event.pull_request.number || github.run_number}}
cancel-in-progress: true
jobs:
# Run audits on all the packages in the built-in repository
package-audits:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8 # @v2
- uses: actions/setup-python@13ae5bb136fac2878aff31522b9efb785519f984 # @v2
with:
python-version: ${{inputs.python_version}}
- name: Install Python packages
run: |
pip install --upgrade pip six setuptools pytest codecov 'coverage[toml]<=6.2'
- name: Package audits (with coverage)
if: ${{ inputs.with_coverage == 'true' }}
run: |
. share/spack/setup-env.sh
coverage run $(which spack) audit packages
coverage combine
coverage xml
- name: Package audits (without coverage)
if: ${{ inputs.with_coverage == 'false' }}
run: |
. share/spack/setup-env.sh
$(which spack) audit packages
- uses: codecov/codecov-action@d9f34f8cd5cb3b3eb79b3e4b5dae3a16df499a70 # @v2.1.0
if: ${{ inputs.with_coverage == 'true' }}
with:
flags: unittests,linux,audits

7
.github/workflows/bootstrap-test.sh vendored Executable file
View File

@@ -0,0 +1,7 @@
#!/bin/bash
set -ex
source share/spack/setup-env.sh
$PYTHON bin/spack bootstrap untrust spack-install
$PYTHON bin/spack -d solve zlib
tree $BOOTSTRAP/store
exit 0

View File

@@ -3,33 +3,19 @@ name: Bootstrapping
on:
# This Workflow can be triggered manually
workflow_dispatch:
pull_request:
branches:
- develop
- releases/**
paths-ignore:
# Don't run if we only modified packages in the
# built-in repository or documentation
- 'var/spack/repos/builtin/**'
- '!var/spack/repos/builtin/packages/clingo-bootstrap/**'
- '!var/spack/repos/builtin/packages/clingo/**'
- '!var/spack/repos/builtin/packages/python/**'
- '!var/spack/repos/builtin/packages/re2c/**'
- 'lib/spack/docs/**'
workflow_call:
schedule:
# nightly at 2:16 AM
- cron: '16 2 * * *'
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.run_number }}
group: bootstrap-${{github.ref}}-${{github.event.pull_request.number || github.run_number}}
cancel-in-progress: true
jobs:
fedora-clingo-sources:
runs-on: ubuntu-latest
container: "fedora:latest"
if: github.repository == 'spack/spack'
steps:
- name: Install dependencies
run: |
@@ -38,7 +24,9 @@ jobs:
make patch unzip which xz python3 python3-devel tree \
cmake bison bison-devel libstdc++-static
- name: Checkout
uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b
uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8
with:
fetch-depth: 0
- name: Setup non-root user
run: |
# See [1] below
@@ -49,7 +37,6 @@ jobs:
shell: runuser -u spack-test -- bash {0}
run: |
git --version
git fetch --unshallow
. .github/workflows/setup_git.sh
- name: Bootstrap clingo
shell: runuser -u spack-test -- bash {0}
@@ -63,7 +50,6 @@ jobs:
ubuntu-clingo-sources:
runs-on: ubuntu-latest
container: "ubuntu:latest"
if: github.repository == 'spack/spack'
steps:
- name: Install dependencies
env:
@@ -75,7 +61,9 @@ jobs:
make patch unzip xz-utils python3 python3-dev tree \
cmake bison
- name: Checkout
uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b
uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8
with:
fetch-depth: 0
- name: Setup non-root user
run: |
# See [1] below
@@ -86,7 +74,6 @@ jobs:
shell: runuser -u spack-test -- bash {0}
run: |
git --version
git fetch --unshallow
. .github/workflows/setup_git.sh
- name: Bootstrap clingo
shell: runuser -u spack-test -- bash {0}
@@ -100,7 +87,6 @@ jobs:
ubuntu-clingo-binaries-and-patchelf:
runs-on: ubuntu-latest
container: "ubuntu:latest"
if: github.repository == 'spack/spack'
steps:
- name: Install dependencies
env:
@@ -111,7 +97,9 @@ jobs:
bzip2 curl file g++ gcc gfortran git gnupg2 gzip \
make patch unzip xz-utils python3 python3-dev tree
- name: Checkout
uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b
uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8
with:
fetch-depth: 0
- name: Setup non-root user
run: |
# See [1] below
@@ -122,7 +110,6 @@ jobs:
shell: runuser -u spack-test -- bash {0}
run: |
git --version
git fetch --unshallow
. .github/workflows/setup_git.sh
- name: Bootstrap clingo
shell: runuser -u spack-test -- bash {0}
@@ -134,7 +121,6 @@ jobs:
opensuse-clingo-sources:
runs-on: ubuntu-latest
container: "opensuse/leap:latest"
if: github.repository == 'spack/spack'
steps:
- name: Install dependencies
run: |
@@ -145,13 +131,14 @@ jobs:
make patch unzip which xz python3 python3-devel tree \
cmake bison
- name: Checkout
uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b
uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8
with:
fetch-depth: 0
- name: Setup repo
run: |
# See [1] below
git config --global --add safe.directory /__w/spack/spack
git --version
git fetch --unshallow
. .github/workflows/setup_git.sh
- name: Bootstrap clingo
run: |
@@ -163,13 +150,12 @@ jobs:
macos-clingo-sources:
runs-on: macos-latest
if: github.repository == 'spack/spack'
steps:
- name: Install dependencies
run: |
brew install cmake bison@2.7 tree
- name: Checkout
uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b
uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8
- name: Bootstrap clingo
run: |
source share/spack/setup-env.sh
@@ -183,53 +169,70 @@ jobs:
runs-on: ${{ matrix.macos-version }}
strategy:
matrix:
python-version: ['3.6', '3.7', '3.8', '3.9', '3.10']
macos-version: ['macos-11', 'macos-12']
if: github.repository == 'spack/spack'
steps:
- name: Install dependencies
run: |
brew install tree
- name: Checkout
uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b
- uses: actions/setup-python@b55428b1882923874294fa556849718a1d7f2ca5
with:
python-version: ${{ matrix.python-version }}
uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8
- name: Bootstrap clingo
run: |
source share/spack/setup-env.sh
spack bootstrap untrust spack-install
spack -d solve zlib
tree ~/.spack/bootstrap/store/
set -ex
for ver in '3.6' '3.7' '3.8' '3.9' '3.10' ; do
not_found=1
ver_dir="$(find $RUNNER_TOOL_CACHE/Python -wholename "*/${ver}.*/*/bin" | grep . || true)"
echo "Testing $ver_dir"
if [[ -d "$ver_dir" ]] ; then
if $ver_dir/python --version ; then
export PYTHON="$ver_dir/python"
not_found=0
old_path="$PATH"
export PATH="$ver_dir:$PATH"
./bin/spack-tmpconfig -b ./.github/workflows/bootstrap-test.sh
export PATH="$old_path"
fi
fi
# NOTE: test all pythons that exist, not all do on 12
done
ubuntu-clingo-binaries:
runs-on: ubuntu-latest
strategy:
matrix:
python-version: ['2.7', '3.6', '3.7', '3.8', '3.9', '3.10']
if: github.repository == 'spack/spack'
runs-on: ubuntu-20.04
steps:
- name: Checkout
uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b
- uses: actions/setup-python@b55428b1882923874294fa556849718a1d7f2ca5
uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8
with:
python-version: ${{ matrix.python-version }}
fetch-depth: 0
- name: Setup repo
run: |
git --version
git fetch --unshallow
. .github/workflows/setup_git.sh
- name: Bootstrap clingo
run: |
source share/spack/setup-env.sh
spack bootstrap untrust spack-install
spack -d solve zlib
tree ~/.spack/bootstrap/store/
set -ex
for ver in '2.7' '3.6' '3.7' '3.8' '3.9' '3.10' ; do
not_found=1
ver_dir="$(find $RUNNER_TOOL_CACHE/Python -wholename "*/${ver}.*/*/bin" | grep . || true)"
echo "Testing $ver_dir"
if [[ -d "$ver_dir" ]] ; then
if $ver_dir/python --version ; then
export PYTHON="$ver_dir/python"
not_found=0
old_path="$PATH"
export PATH="$ver_dir:$PATH"
./bin/spack-tmpconfig -b ./.github/workflows/bootstrap-test.sh
export PATH="$old_path"
fi
fi
if (($not_found)) ; then
echo Required python version $ver not found in runner!
exit 1
fi
done
ubuntu-gnupg-binaries:
runs-on: ubuntu-latest
container: "ubuntu:latest"
if: github.repository == 'spack/spack'
steps:
- name: Install dependencies
env:
@@ -240,7 +243,9 @@ jobs:
bzip2 curl file g++ gcc patchelf gfortran git gzip \
make patch unzip xz-utils python3 python3-dev tree
- name: Checkout
uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b
uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8
with:
fetch-depth: 0
- name: Setup non-root user
run: |
# See [1] below
@@ -251,7 +256,6 @@ jobs:
shell: runuser -u spack-test -- bash {0}
run: |
git --version
git fetch --unshallow
. .github/workflows/setup_git.sh
- name: Bootstrap GnuPG
shell: runuser -u spack-test -- bash {0}
@@ -264,7 +268,6 @@ jobs:
ubuntu-gnupg-sources:
runs-on: ubuntu-latest
container: "ubuntu:latest"
if: github.repository == 'spack/spack'
steps:
- name: Install dependencies
env:
@@ -276,7 +279,9 @@ jobs:
make patch unzip xz-utils python3 python3-dev tree \
gawk
- name: Checkout
uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b
uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8
with:
fetch-depth: 0
- name: Setup non-root user
run: |
# See [1] below
@@ -287,7 +292,6 @@ jobs:
shell: runuser -u spack-test -- bash {0}
run: |
git --version
git fetch --unshallow
. .github/workflows/setup_git.sh
- name: Bootstrap GnuPG
shell: runuser -u spack-test -- bash {0}
@@ -300,7 +304,6 @@ jobs:
macos-gnupg-binaries:
runs-on: macos-latest
if: github.repository == 'spack/spack'
steps:
- name: Install dependencies
run: |
@@ -308,7 +311,7 @@ jobs:
# Remove GnuPG since we want to bootstrap it
sudo rm -rf /usr/local/bin/gpg
- name: Checkout
uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b
uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8
- name: Bootstrap GnuPG
run: |
source share/spack/setup-env.sh
@@ -318,7 +321,6 @@ jobs:
macos-gnupg-sources:
runs-on: macos-latest
if: github.repository == 'spack/spack'
steps:
- name: Install dependencies
run: |
@@ -326,7 +328,7 @@ jobs:
# Remove GnuPG since we want to bootstrap it
sudo rm -rf /usr/local/bin/gpg
- name: Checkout
uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b
uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8
- name: Bootstrap GnuPG
run: |
source share/spack/setup-env.sh

View File

@@ -20,7 +20,7 @@ on:
types: [published]
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.run_number }}
group: build_containers-${{github.ref}}-${{github.event.pull_request.number || github.run_number}}
cancel-in-progress: true
jobs:
@@ -50,7 +50,7 @@ jobs:
if: github.repository == 'spack/spack'
steps:
- name: Checkout
uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b # @v2
uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8 # @v2
- name: Set Container Tag Normal (Nightly)
run: |
@@ -89,10 +89,10 @@ jobs:
uses: docker/setup-qemu-action@8b122486cedac8393e77aa9734c3528886e4a1a8 # @v1
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@dc7b9719a96d48369863986a06765841d7ea23f6 # @v1
uses: docker/setup-buildx-action@c74574e6c82eeedc46366be1b0d287eff9085eb6 # @v1
- name: Log in to GitHub Container Registry
uses: docker/login-action@49ed152c8eca782a232dede0303416e8f356c37b # @v1
uses: docker/login-action@f4ef78c080cd8ba55a85445d5b36e214a81df20a # @v1
with:
registry: ghcr.io
username: ${{ github.actor }}
@@ -100,7 +100,7 @@ jobs:
- name: Log in to DockerHub
if: github.event_name != 'pull_request'
uses: docker/login-action@49ed152c8eca782a232dede0303416e8f356c37b # @v1
uses: docker/login-action@f4ef78c080cd8ba55a85445d5b36e214a81df20a # @v1
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}

92
.github/workflows/ci.yaml vendored Normal file
View File

@@ -0,0 +1,92 @@
name: ci
on:
push:
branches:
- develop
- releases/**
pull_request:
branches:
- develop
- releases/**
concurrency:
group: ci-${{github.ref}}-${{github.event.pull_request.number || github.run_number}}
cancel-in-progress: true
jobs:
prechecks:
needs: [ changes ]
uses: ./.github/workflows/valid-style.yml
with:
with_coverage: ${{ needs.changes.outputs.core }}
audit-ancient-python:
uses: ./.github/workflows/audit.yaml
needs: [ changes ]
with:
with_coverage: ${{ needs.changes.outputs.core }}
python_version: 2.7
all-prechecks:
needs: [ prechecks ]
runs-on: ubuntu-latest
steps:
- name: Success
run: "true"
# Check which files have been updated by the PR
changes:
runs-on: ubuntu-latest
# Set job outputs to values from filter step
outputs:
bootstrap: ${{ steps.filter.outputs.bootstrap }}
core: ${{ steps.filter.outputs.core }}
packages: ${{ steps.filter.outputs.packages }}
steps:
- uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8 # @v2
if: ${{ github.event_name == 'push' }}
with:
fetch-depth: 0
# For pull requests it's not necessary to checkout the code
- uses: dorny/paths-filter@b2feaf19c27470162a626bd6fa8438ae5b263721
id: filter
with:
# See https://github.com/dorny/paths-filter/issues/56 for the syntax used below
# Don't run if we only modified packages in the
# built-in repository or documentation
filters: |
bootstrap:
- 'var/spack/repos/builtin/packages/clingo-bootstrap/**'
- 'var/spack/repos/builtin/packages/clingo/**'
- 'var/spack/repos/builtin/packages/python/**'
- 'var/spack/repos/builtin/packages/re2c/**'
- 'lib/spack/**'
- 'share/spack/**'
- '.github/workflows/bootstrap.yml'
- '.github/workflows/ci.yaml'
core:
- './!(var/**)/**'
packages:
- 'var/**'
# Some links for easier reference:
#
# "github" context: https://docs.github.com/en/actions/reference/context-and-expression-syntax-for-github-actions#github-context
# job outputs: https://docs.github.com/en/actions/reference/workflow-syntax-for-github-actions#jobsjob_idoutputs
# setting environment variables from earlier steps: https://docs.github.com/en/actions/reference/workflow-commands-for-github-actions#setting-an-environment-variable
#
bootstrap:
if: ${{ github.repository == 'spack/spack' && needs.changes.outputs.bootstrap == 'true' }}
needs: [ prechecks, changes ]
uses: ./.github/workflows/bootstrap.yml
unit-tests:
if: ${{ github.repository == 'spack/spack' && needs.changes.outputs.core == 'true' }}
needs: [ prechecks, changes ]
uses: ./.github/workflows/unit_tests.yaml
windows:
if: ${{ github.repository == 'spack/spack' && needs.changes.outputs.core == 'true' }}
needs: [ prechecks ]
uses: ./.github/workflows/windows_python.yml
all:
needs: [ windows, unit-tests, bootstrap, audit-ancient-python ]
runs-on: ubuntu-latest
steps:
- name: Success
run: "true"

View File

@@ -1,118 +1,46 @@
name: linux tests
name: unit tests
on:
push:
branches:
- develop
- releases/**
pull_request:
branches:
- develop
- releases/**
workflow_dispatch:
workflow_call:
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.run_number }}
group: unit_tests-${{github.ref}}-${{github.event.pull_request.number || github.run_number}}
cancel-in-progress: true
jobs:
# Validate that the code can be run on all the Python versions
# supported by Spack
validate:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b # @v2
- uses: actions/setup-python@b55428b1882923874294fa556849718a1d7f2ca5 # @v2
with:
python-version: '3.10'
- name: Install Python Packages
run: |
pip install --upgrade pip
pip install --upgrade vermin
- name: vermin (Spack's Core)
run: vermin --backport argparse --violations --backport typing -t=2.7- -t=3.6- -vvv lib/spack/spack/ lib/spack/llnl/ bin/
- name: vermin (Repositories)
run: vermin --backport argparse --violations --backport typing -t=2.7- -t=3.6- -vvv var/spack/repos
# Run style checks on the files that have been changed
style:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@b55428b1882923874294fa556849718a1d7f2ca5 # @v2
with:
python-version: '3.10'
- name: Install Python packages
run: |
pip install --upgrade pip six setuptools types-six
- name: Setup git configuration
run: |
# Need this for the git tests to succeed.
git --version
. .github/workflows/setup_git.sh
- name: Run style tests
run: |
share/spack/qa/run-style-tests
# Check which files have been updated by the PR
changes:
runs-on: ubuntu-latest
# Set job outputs to values from filter step
outputs:
core: ${{ steps.filter.outputs.core }}
packages: ${{ steps.filter.outputs.packages }}
with_coverage: ${{ steps.coverage.outputs.with_coverage }}
steps:
- uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b # @v2
if: ${{ github.event_name == 'push' }}
with:
fetch-depth: 0
# For pull requests it's not necessary to checkout the code
- uses: dorny/paths-filter@b2feaf19c27470162a626bd6fa8438ae5b263721
id: filter
with:
# See https://github.com/dorny/paths-filter/issues/56 for the syntax used below
filters: |
core:
- './!(var/**)/**'
packages:
- 'var/**'
# Some links for easier reference:
#
# "github" context: https://docs.github.com/en/actions/reference/context-and-expression-syntax-for-github-actions#github-context
# job outputs: https://docs.github.com/en/actions/reference/workflow-syntax-for-github-actions#jobsjob_idoutputs
# setting environment variables from earlier steps: https://docs.github.com/en/actions/reference/workflow-commands-for-github-actions#setting-an-environment-variable
#
- id: coverage
# Run the subsequent jobs with coverage if core has been modified,
# regardless of whether this is a pull request or a push to a branch
run: |
echo Core changes: ${{ steps.filter.outputs.core }}
echo Event name: ${{ github.event_name }}
if [ "${{ steps.filter.outputs.core }}" == "true" ]
then
echo "::set-output name=with_coverage::true"
else
echo "::set-output name=with_coverage::false"
fi
# Run unit tests with different configurations on linux
unittests:
needs: [ validate, style, changes ]
ubuntu:
runs-on: ubuntu-latest
strategy:
matrix:
python-version: ['2.7', '3.6', '3.7', '3.8', '3.9', '3.10']
concretizer: ['clingo']
on_develop:
- ${{ github.ref == 'refs/heads/develop' }}
include:
- python-version: 2.7
concretizer: original
- python-version: 3.9
on_develop: ${{ github.ref == 'refs/heads/develop' }}
- python-version: '3.10'
concretizer: original
on_develop: ${{ github.ref == 'refs/heads/develop' }}
exclude:
- python-version: '3.7'
concretizer: 'clingo'
on_develop: false
- python-version: '3.8'
concretizer: 'clingo'
on_develop: false
- python-version: '3.9'
concretizer: 'clingo'
on_develop: false
steps:
- uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b # @v2
- uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@b55428b1882923874294fa556849718a1d7f2ca5 # @v2
- uses: actions/setup-python@13ae5bb136fac2878aff31522b9efb785519f984 # @v2
with:
python-version: ${{ matrix.python-version }}
- name: Install System packages
@@ -124,7 +52,7 @@ jobs:
patchelf cmake bison libbison-dev kcov
- name: Install Python packages
run: |
pip install --upgrade pip six setuptools pytest codecov "coverage[toml]<=6.2"
pip install --upgrade pip six setuptools pytest codecov[toml] pytest-cov pytest-xdist
# ensure style checks are not skipped in unit tests for python >= 3.6
# note that true/false (i.e., 1/0) are opposite in conditions in python and bash
if python -c 'import sys; sys.exit(not sys.version_info >= (3, 6))'; then
@@ -147,37 +75,28 @@ jobs:
. share/spack/setup-env.sh
spack bootstrap untrust spack-install
spack -v solve zlib
- name: Run unit tests (full suite with coverage)
if: ${{ needs.changes.outputs.with_coverage == 'true' }}
- name: Run unit tests
env:
SPACK_PYTHON: python
SPACK_TEST_SOLVER: ${{ matrix.concretizer }}
SPACK_TEST_PARALLEL: 2
COVERAGE: true
SPACK_TEST_SOLVER: ${{ matrix.concretizer }}
UNIT_TEST_COVERAGE: ${{ (matrix.concretizer == 'original' && matrix.python-version == '2.7') || (matrix.python-version == '3.10') }}
run: |
share/spack/qa/run-unit-tests
coverage combine
coverage combine -a
coverage xml
- name: Run unit tests (reduced suite without coverage)
if: ${{ needs.changes.outputs.with_coverage == 'false' }}
env:
SPACK_PYTHON: python
ONLY_PACKAGES: true
SPACK_TEST_SOLVER: ${{ matrix.concretizer }}
run: |
share/spack/qa/run-unit-tests
- uses: codecov/codecov-action@81cd2dc8148241f03f5839d295e000b8f761e378 # @v2.1.0
if: ${{ needs.changes.outputs.with_coverage == 'true' }}
- uses: codecov/codecov-action@d9f34f8cd5cb3b3eb79b3e4b5dae3a16df499a70
with:
flags: unittests,linux,${{ matrix.concretizer }}
# Test shell integration
shell:
needs: [ validate, style, changes ]
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b # @v2
- uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@b55428b1882923874294fa556849718a1d7f2ca5 # @v2
- uses: actions/setup-python@13ae5bb136fac2878aff31522b9efb785519f984 # @v2
with:
python-version: '3.10'
- name: Install System packages
@@ -187,33 +106,25 @@ jobs:
sudo apt-get install -y coreutils kcov csh zsh tcsh fish dash bash
- name: Install Python packages
run: |
pip install --upgrade pip six setuptools pytest codecov coverage[toml]==6.2
pip install --upgrade pip six setuptools pytest codecov coverage[toml]==6.2 pytest-xdist
- name: Setup git configuration
run: |
# Need this for the git tests to succeed.
git --version
. .github/workflows/setup_git.sh
- name: Run shell tests (without coverage)
if: ${{ needs.changes.outputs.with_coverage == 'false' }}
run: |
share/spack/qa/run-shell-tests
- name: Run shell tests (with coverage)
if: ${{ needs.changes.outputs.with_coverage == 'true' }}
- name: Run shell tests
env:
COVERAGE: true
run: |
share/spack/qa/run-shell-tests
- uses: codecov/codecov-action@81cd2dc8148241f03f5839d295e000b8f761e378 # @v2.1.0
if: ${{ needs.changes.outputs.with_coverage == 'true' }}
- uses: codecov/codecov-action@d9f34f8cd5cb3b3eb79b3e4b5dae3a16df499a70
with:
flags: shelltests,linux
# Test RHEL8 UBI with platform Python. This job is run
# only on PRs modifying core Spack
rhel8-platform-python:
needs: [ validate, style, changes ]
runs-on: ubuntu-latest
if: ${{ needs.changes.outputs.with_coverage == 'true' }}
container: registry.access.redhat.com/ubi8/ubi
steps:
- name: Install dependencies
@@ -221,7 +132,7 @@ jobs:
dnf install -y \
bzip2 curl file gcc-c++ gcc gcc-gfortran git gnupg2 gzip \
make patch tcl unzip which xz
- uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b # @v2
- uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8 # @v2
- name: Setup repo and non-root user
run: |
git --version
@@ -237,13 +148,12 @@ jobs:
spack unit-test -k 'not cvs and not svn and not hg' -x --verbose
# Test for the clingo based solver (using clingo-cffi)
clingo-cffi:
needs: [ validate, style, changes ]
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b # @v2
- uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@b55428b1882923874294fa556849718a1d7f2ca5 # @v2
- uses: actions/setup-python@13ae5bb136fac2878aff31522b9efb785519f984 # @v2
with:
python-version: '3.10'
- name: Install System packages
@@ -255,105 +165,60 @@ jobs:
patchelf kcov
- name: Install Python packages
run: |
pip install --upgrade pip six setuptools pytest codecov coverage[toml]==6.2 clingo
pip install --upgrade pip six setuptools pytest codecov coverage[toml] pytest-cov clingo pytest-xdist
- name: Setup git configuration
run: |
# Need this for the git tests to succeed.
git --version
. .github/workflows/setup_git.sh
- name: Run unit tests (full suite with coverage)
if: ${{ needs.changes.outputs.with_coverage == 'true' }}
env:
COVERAGE: true
SPACK_TEST_SOLVER: clingo
run: |
share/spack/qa/run-unit-tests
coverage combine
coverage combine -a
coverage xml
- name: Run unit tests (reduced suite without coverage)
if: ${{ needs.changes.outputs.with_coverage == 'false' }}
env:
ONLY_PACKAGES: true
SPACK_TEST_SOLVER: clingo
run: |
share/spack/qa/run-unit-tests
- uses: codecov/codecov-action@81cd2dc8148241f03f5839d295e000b8f761e378 # @v2.1.0
if: ${{ needs.changes.outputs.with_coverage == 'true' }}
- uses: codecov/codecov-action@d9f34f8cd5cb3b3eb79b3e4b5dae3a16df499a70 # @v2.1.0
with:
flags: unittests,linux,clingo
# Run unit tests on MacOS
build:
needs: [ validate, style, changes ]
macos:
runs-on: macos-latest
strategy:
matrix:
python-version: [3.8]
steps:
- uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b # @v2
- uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@b55428b1882923874294fa556849718a1d7f2ca5 # @v2
- uses: actions/setup-python@13ae5bb136fac2878aff31522b9efb785519f984 # @v2
with:
python-version: ${{ matrix.python-version }}
- name: Install Python packages
run: |
pip install --upgrade pip six setuptools
pip install --upgrade pytest codecov coverage[toml]==6.2
pip install --upgrade pytest codecov coverage[toml] pytest-xdist pytest-cov
- name: Setup Homebrew packages
run: |
brew install dash fish gcc gnupg2 kcov
- name: Run unit tests
env:
SPACK_TEST_SOLVER: clingo
SPACK_TEST_PARALLEL: 4
run: |
git --version
. .github/workflows/setup_git.sh
. share/spack/setup-env.sh
$(which spack) bootstrap untrust spack-install
$(which spack) solve zlib
if [ "${{ needs.changes.outputs.with_coverage }}" == "true" ]
then
coverage run $(which spack) unit-test -x
coverage combine
coverage xml
# Delete the symlink going from ./lib/spack/docs/_spack_root back to
# the initial directory, since it causes ELOOP errors with codecov/actions@2
rm lib/spack/docs/_spack_root
else
echo "ONLY PACKAGE RECIPES CHANGED [skipping coverage]"
$(which spack) unit-test -x -m "not maybeslow" -k "package_sanity"
fi
- uses: codecov/codecov-action@81cd2dc8148241f03f5839d295e000b8f761e378 # @v2.1.0
if: ${{ needs.changes.outputs.with_coverage == 'true' }}
common_args=(--dist loadfile --tx '4*popen//python=./bin/spack-tmpconfig python -u ./bin/spack python' -x)
$(which spack) unit-test --cov --cov-config=pyproject.toml "${common_args[@]}"
coverage combine -a
coverage xml
# Delete the symlink going from ./lib/spack/docs/_spack_root back to
# the initial directory, since it causes ELOOP errors with codecov/actions@2
rm lib/spack/docs/_spack_root
- uses: codecov/codecov-action@d9f34f8cd5cb3b3eb79b3e4b5dae3a16df499a70
with:
files: ./coverage.xml
flags: unittests,macos
# Run audits on all the packages in the built-in repository
package-audits:
needs: [ validate, style, changes ]
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b # @v2
- uses: actions/setup-python@b55428b1882923874294fa556849718a1d7f2ca5 # @v2
with:
python-version: '3.10'
- name: Install Python packages
run: |
pip install --upgrade pip six setuptools pytest codecov coverage[toml]==6.2
- name: Package audits (with coverage)
if: ${{ needs.changes.outputs.with_coverage == 'true' }}
run: |
. share/spack/setup-env.sh
coverage run $(which spack) audit packages
coverage combine
coverage xml
- name: Package audits (without coverage)
if: ${{ needs.changes.outputs.with_coverage == 'false' }}
run: |
. share/spack/setup-env.sh
$(which spack) audit packages
- uses: codecov/codecov-action@81cd2dc8148241f03f5839d295e000b8f761e378 # @v2.1.0
if: ${{ needs.changes.outputs.with_coverage == 'true' }}
with:
flags: unittests,linux,audits

60
.github/workflows/valid-style.yml vendored Normal file
View File

@@ -0,0 +1,60 @@
name: style
on:
workflow_call:
inputs:
with_coverage:
required: true
type: string
concurrency:
group: style-${{github.ref}}-${{github.event.pull_request.number || github.run_number}}
cancel-in-progress: true
jobs:
# Validate that the code can be run on all the Python versions
# supported by Spack
validate:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8 # @v2
- uses: actions/setup-python@13ae5bb136fac2878aff31522b9efb785519f984 # @v2
with:
python-version: '3.10'
cache: 'pip'
- name: Install Python Packages
run: |
pip install --upgrade pip
pip install --upgrade vermin
- name: vermin (Spack's Core)
run: vermin --backport argparse --violations --backport typing -t=2.7- -t=3.6- -vvv lib/spack/spack/ lib/spack/llnl/ bin/
- name: vermin (Repositories)
run: vermin --backport argparse --violations --backport typing -t=2.7- -t=3.6- -vvv var/spack/repos
# Run style checks on the files that have been changed
style:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@13ae5bb136fac2878aff31522b9efb785519f984 # @v2
with:
python-version: '3.10'
cache: 'pip'
- name: Install Python packages
run: |
python3 -m pip install --upgrade pip six setuptools types-six click==8.0.2 'black==21.12b0' mypy isort clingo flake8
- name: Setup git configuration
run: |
# Need this for the git tests to succeed.
git --version
. .github/workflows/setup_git.sh
- name: Run style tests
run: |
share/spack/qa/run-style-tests
audit:
uses: ./.github/workflows/audit.yaml
with:
with_coverage: ${{ inputs.with_coverage }}
python_version: '3.10'

View File

@@ -1,17 +1,10 @@
name: windows tests
name: windows
on:
push:
branches:
- develop
- releases/**
pull_request:
branches:
- develop
- releases/**
workflow_call:
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.run_number }}
group: windows-${{github.ref}}-${{github.event.pull_request.number || github.run_number}}
cancel-in-progress: true
defaults:
@@ -19,91 +12,66 @@ defaults:
shell:
powershell Invoke-Expression -Command ".\share\spack\qa\windows_test_setup.ps1"; {0}
jobs:
validate:
unit-tests:
runs-on: windows-latest
steps:
- uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b
- uses: actions/setup-python@b55428b1882923874294fa556849718a1d7f2ca5
with:
python-version: 3.9
- name: Install Python Packages
run: |
python -m pip install --upgrade pip
python -m pip install --upgrade vermin
- name: vermin (Spack's Core)
run: vermin --backport argparse --backport typing -t='2.7-' -t='3.6-' -v spack/lib/spack/spack/ spack/lib/spack/llnl/ spack/bin/
- name: vermin (Repositories)
run: vermin --backport argparse --backport typing -t='2.7-' -t='3.6-' -v spack/var/spack/repos
# Run style checks on the files that have been changed
style:
runs-on: windows-latest
steps:
- uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b
- uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8
with:
fetch-depth: 0
- uses: actions/setup-python@b55428b1882923874294fa556849718a1d7f2ca5
- uses: actions/setup-python@13ae5bb136fac2878aff31522b9efb785519f984
with:
python-version: 3.9
- name: Install Python packages
run: |
python -m pip install --upgrade pip six setuptools flake8 "isort>=4.3.5" "mypy>=0.800" "click==8.0.4" "black<=21.12b0" pywin32 types-python-dateutil
- name: Create local develop
run: |
.\spack\.github\workflows\setup_git.ps1
- name: Run style tests
run: |
spack style
- name: Verify license headers
run: |
python spack\bin\spack license verify
unittest:
needs: [ validate, style ]
runs-on: windows-latest
steps:
- uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b
with:
fetch-depth: 0
- uses: actions/setup-python@b55428b1882923874294fa556849718a1d7f2ca5
with:
python-version: 3.9
- name: Install Python packages
run: |
python -m pip install --upgrade pip six pywin32 setuptools codecov coverage
python -m pip install --upgrade pip six pywin32 setuptools codecov pytest-cov
- name: Create local develop
run: |
.\spack\.github\workflows\setup_git.ps1
- name: Unit Test
run: |
echo F|xcopy .\spack\share\spack\qa\configuration\windows_config.yaml $env:USERPROFILE\.spack\windows\config.yaml
spack unit-test --verbose --ignore=lib/spack/spack/test/cmd
unittest-cmd:
needs: [ validate, style ]
cd spack
dir
(Get-Item '.\lib\spack\docs\_spack_root').Delete()
spack unit-test --verbose --cov --cov-config=pyproject.toml --ignore=lib/spack/spack/test/cmd
coverage combine -a
coverage xml
- uses: codecov/codecov-action@d9f34f8cd5cb3b3eb79b3e4b5dae3a16df499a70
with:
flags: unittests,windows
unit-tests-cmd:
runs-on: windows-latest
steps:
- uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b
- uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8
with:
fetch-depth: 0
- uses: actions/setup-python@b55428b1882923874294fa556849718a1d7f2ca5
- uses: actions/setup-python@13ae5bb136fac2878aff31522b9efb785519f984
with:
python-version: 3.9
- name: Install Python packages
run: |
python -m pip install --upgrade pip six pywin32 setuptools codecov coverage
python -m pip install --upgrade pip six pywin32 setuptools codecov coverage pytest-cov
- name: Create local develop
run: |
.\spack\.github\workflows\setup_git.ps1
- name: Command Unit Test
run: |
echo F|xcopy .\spack\share\spack\qa\configuration\windows_config.yaml $env:USERPROFILE\.spack\windows\config.yaml
spack unit-test lib/spack/spack/test/cmd --verbose
buildtest:
needs: [ validate, style ]
cd spack
(Get-Item '.\lib\spack\docs\_spack_root').Delete()
spack unit-test --verbose --cov --cov-config=pyproject.toml lib/spack/spack/test/cmd
coverage combine -a
coverage xml
- uses: codecov/codecov-action@d9f34f8cd5cb3b3eb79b3e4b5dae3a16df499a70
with:
flags: unittests,windows
build-abseil:
runs-on: windows-latest
steps:
- uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b
- uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8
with:
fetch-depth: 0
- uses: actions/setup-python@b55428b1882923874294fa556849718a1d7f2ca5
- uses: actions/setup-python@13ae5bb136fac2878aff31522b9efb785519f984
with:
python-version: 3.9
- name: Install Python packages
@@ -116,8 +84,7 @@ jobs:
spack external find cmake
spack external find ninja
spack install abseil-cpp
generate-installer-test:
needs: [ validate, style ]
make-installer:
runs-on: windows-latest
steps:
- name: Disable Windows Symlinks
@@ -125,15 +92,15 @@ jobs:
git config --global core.symlinks false
shell:
powershell
- uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b
- uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8
with:
fetch-depth: 0
- uses: actions/setup-python@b55428b1882923874294fa556849718a1d7f2ca5
- uses: actions/setup-python@13ae5bb136fac2878aff31522b9efb785519f984
with:
python-version: 3.9
- name: Install Python packages
run: |
python -m pip install --upgrade pip six pywin32 setuptools codecov coverage
python -m pip install --upgrade pip six pywin32 setuptools
- name: Add Light and Candle to Path
run: |
$env:WIX >> $GITHUB_PATH
@@ -153,18 +120,18 @@ jobs:
name: Windows Spack Installer
path: ${{ env.installer_root}}\pkg\Spack.msi
execute-installer:
needs: generate-installer-test
needs: make-installer
runs-on: windows-latest
defaults:
run:
shell: pwsh
steps:
- uses: actions/setup-python@b55428b1882923874294fa556849718a1d7f2ca5
- uses: actions/setup-python@13ae5bb136fac2878aff31522b9efb785519f984
with:
python-version: 3.9
- name: Install Python packages
run: |
python -m pip install --upgrade pip six pywin32 setuptools codecov coverage
python -m pip install --upgrade pip six pywin32 setuptools
- name: Setup installer directory
run: |
mkdir -p spack_installer

View File

@@ -62,6 +62,7 @@ Resources:
* **Slack workspace**: [spackpm.slack.com](https://spackpm.slack.com).
To get an invitation, visit [slack.spack.io](https://slack.spack.io).
* [**Github Discussions**](https://github.com/spack/spack/discussions): not just for discussions, but also Q&A.
* **Mailing list**: [groups.google.com/d/forum/spack](https://groups.google.com/d/forum/spack)
* **Twitter**: [@spackpm](https://twitter.com/spackpm). Be sure to
`@mention` us!

95
bin/spack-tmpconfig Executable file
View File

@@ -0,0 +1,95 @@
#!/bin/bash
set -euo pipefail
[[ -n "${TMPCONFIG_DEBUG:=}" ]] && set -x
DIR="$(cd -P "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
mkdir -p "${XDG_RUNTIME_DIR:=/tmp}/spack-tests"
export TMPDIR="${XDG_RUNTIME_DIR}"
export TMP_DIR="$(mktemp -d -t spack-test-XXXXX)"
clean_up() {
[[ -n "$TMPCONFIG_DEBUG" ]] && printf "cleaning up: $TMP_DIR\n"
rm -rf "$TMP_DIR"
}
trap clean_up EXIT
trap clean_up ERR
[[ -n "$TMPCONFIG_DEBUG" ]] && printf "Redirecting TMP_DIR and spack directories to $TMP_DIR\n"
export BOOTSTRAP="${SPACK_USER_CACHE_PATH:=$HOME/.spack}/bootstrap"
export SPACK_USER_CACHE_PATH="$TMP_DIR/user_cache"
mkdir -p "$SPACK_USER_CACHE_PATH"
private_bootstrap="$SPACK_USER_CACHE_PATH/bootstrap"
use_spack=''
use_bwrap=''
# argument handling
while (($# >= 1)) ; do
case "$1" in
-b) # privatize bootstrap too, useful for CI but not always cheap
shift
export BOOTSTRAP="$private_bootstrap"
;;
-B) # use specified bootstrap dir
export BOOTSTRAP="$2"
shift 2
;;
-s) # run spack directly with remaining args
shift
use_spack=1
;;
--contain=bwrap)
if bwrap --help 2>&1 > /dev/null ; then
use_bwrap=1
else
echo Bubblewrap containment requested, but no bwrap command found
exit 1
fi
shift
;;
--)
shift
break
;;
*)
break
;;
esac
done
typeset -a CMD
if [[ -n "$use_spack" ]] ; then
CMD=("$DIR/spack" "$@")
else
CMD=("$@")
fi
mkdir -p "$BOOTSTRAP"
export SPACK_SYSTEM_CONFIG_PATH="$TMP_DIR/sys_conf"
export SPACK_USER_CONFIG_PATH="$TMP_DIR/user_conf"
mkdir -p "$SPACK_USER_CONFIG_PATH"
cat >"$SPACK_USER_CONFIG_PATH/config.yaml" <<EOF
config:
install_tree:
root: $TMP_DIR/install
misc_cache: $$user_cache_path/cache
source_cache: $$user_cache_path/source
EOF
cat >"$SPACK_USER_CONFIG_PATH/bootstrap.yaml" <<EOF
bootstrap:
root: $BOOTSTRAP
EOF
if [[ -n "$use_bwrap" ]] ; then
CMD=(
bwrap
--dev-bind / /
--ro-bind "$DIR/.." "$DIR/.." # do not touch spack root
--ro-bind $HOME/.spack $HOME/.spack # do not touch user config/cache dir
--bind "$TMP_DIR" "$TMP_DIR"
--bind "$BOOTSTRAP" "$BOOTSTRAP"
--die-with-parent
"${CMD[@]}"
)
fi
(( ${TMPCONFIG_DEBUG:=0} > 1)) && echo "Running: ${CMD[@]}"
"${CMD[@]}"

View File

@@ -9,6 +9,8 @@ bootstrap:
# may not be able to bootstrap all the software that Spack needs,
# depending on its type.
sources:
- name: 'github-actions-v0.3'
metadata: $spack/share/spack/bootstrap/github-actions-v0.3
- name: 'github-actions-v0.2'
metadata: $spack/share/spack/bootstrap/github-actions-v0.2
- name: 'github-actions-v0.1'
@@ -18,5 +20,5 @@ bootstrap:
trusted:
# By default we trust bootstrapping from sources and from binaries
# produced on Github via the workflow
github-actions-v0.2: true
github-actions-v0.3: true
spack-install: true

View File

@@ -15,7 +15,7 @@
# -------------------------------------------------------------------------
modules:
prefix_inspections:
lib:
./lib:
- DYLD_FALLBACK_LIBRARY_PATH
lib64:
./lib64:
- DYLD_FALLBACK_LIBRARY_PATH

View File

@@ -14,23 +14,24 @@
# ~/.spack/modules.yaml
# -------------------------------------------------------------------------
modules:
# Paths to check when creating modules for all module sets
# This maps paths in the package install prefix to environment variables
# they should be added to. For example, <prefix>/bin should be in PATH.
prefix_inspections:
bin:
./bin:
- PATH
man:
./man:
- MANPATH
share/man:
./share/man:
- MANPATH
share/aclocal:
./share/aclocal:
- ACLOCAL_PATH
lib/pkgconfig:
./lib/pkgconfig:
- PKG_CONFIG_PATH
lib64/pkgconfig:
./lib64/pkgconfig:
- PKG_CONFIG_PATH
share/pkgconfig:
./share/pkgconfig:
- PKG_CONFIG_PATH
'':
./:
- CMAKE_PREFIX_PATH
# These are configurations for the module set named "default"

View File

@@ -49,9 +49,8 @@ packages rather than building its own packages. This may be desirable
if machines ship with system packages, such as a customized MPI
that should be used instead of Spack building its own MPI.
External packages are configured through the ``packages.yaml`` file found
in a Spack installation's ``etc/spack/`` or a user's ``~/.spack/``
directory. Here's an example of an external configuration:
External packages are configured through the ``packages.yaml`` file.
Here's an example of an external configuration:
.. code-block:: yaml
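(The unchanged example body is elided by the diff; a minimal sketch, reconstructed from the external entries shown later on this page:)

.. code-block:: yaml

   packages:
     openmpi:
       externals:
       - spec: "openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64"
         prefix: /opt/openmpi-1.4.3
       - spec: "openmpi@1.6.5%intel@10.1 arch=linux-debian7-x86_64"
         prefix: /opt/openmpi-1.6.5-intel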
@@ -97,11 +96,14 @@ Each package version and compiler listed in an external should
have entries in Spack's packages and compiler configuration, even
though the package and compiler may not ever be built.
The packages configuration can tell Spack to use an external location
for certain package versions, but it does not restrict Spack to using
external packages. In the above example, since newer versions of OpenMPI
are available, Spack will choose to start building and linking with the
latest version rather than continue using the pre-installed OpenMPI versions.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Prevent packages from being built from sources
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Adding an external spec in ``packages.yaml`` allows Spack to use an external location,
but it does not prevent Spack from building packages from sources. In the above example,
Spack might choose for many valid reasons to start building and linking with the
latest version of OpenMPI rather than continue using the pre-installed OpenMPI versions.
To prevent this, the ``packages.yaml`` configuration also allows packages
to be flagged as non-buildable. The previous example could be modified to
@@ -121,9 +123,15 @@ be:
buildable: False
The addition of the ``buildable`` flag tells Spack that it should never build
its own version of OpenMPI, and it will instead always rely on a pre-built
OpenMPI. Similar to ``paths``, ``buildable`` is specified as a property under
a package name.
its own version of OpenMPI from sources, and it will instead always rely on a pre-built
OpenMPI.
.. note::
If ``concretizer:reuse`` is on (see :ref:`concretizer-options` for more information on that flag)
pre-built specs include specs already available from a local store, an upstream store, a registered
buildcache or specs marked as externals in ``packages.yaml``. If ``concretizer:reuse`` is off, only
external specs in ``packages.yaml`` are included in the list of pre-built specs.
If an external module is specified as not buildable, then Spack will load the
external module into the build environment which can be used for linking.
@@ -132,6 +140,10 @@ The ``buildable`` does not need to be paired with external packages.
It could also be used alone to forbid packages that may be
buggy or otherwise undesirable.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Non-buildable virtual packages
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Virtual packages in Spack can also be specified as not buildable, and
external implementations can be provided. In the example above,
OpenMPI is configured as not buildable, but Spack will often prefer
@@ -153,21 +165,37 @@ but more conveniently:
- spec: "openmpi@1.6.5%intel@10.1 arch=linux-debian7-x86_64"
prefix: /opt/openmpi-1.6.5-intel
Implementations can also be listed immediately under the virtual they provide:
Spack can then use any of the listed external implementations of MPI
to satisfy a dependency, and will choose depending on the compiler and
architecture.
In cases where the concretizer is configured to reuse specs, and other ``mpi`` providers
(available via stores or buildcaches) are not wanted, Spack can be configured to require
specs matching only the available externals:
.. code-block:: yaml
packages:
mpi:
buildable: False
openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64: /opt/openmpi-1.4.3
openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64+debug: /opt/openmpi-1.4.3-debug
openmpi@1.6.5%intel@10.1 arch=linux-debian7-x86_64: /opt/openmpi-1.6.5-intel
mpich@3.3 %clang@9.0.0 arch=linux-debian7-x86_64: /opt/mpich-3.3-intel
require:
- one_of: [
"openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64",
"openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64+debug",
"openmpi@1.6.5%intel@10.1 arch=linux-debian7-x86_64"
]
openmpi:
externals:
- spec: "openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64"
prefix: /opt/openmpi-1.4.3
- spec: "openmpi@1.4.3%gcc@4.4.7 arch=linux-debian7-x86_64+debug"
prefix: /opt/openmpi-1.4.3-debug
- spec: "openmpi@1.6.5%intel@10.1 arch=linux-debian7-x86_64"
prefix: /opt/openmpi-1.6.5-intel
Spack can then use any of the listed external implementations of MPI
to satisfy a dependency, and will choose depending on the compiler and
architecture.
This configuration prevents any spec using MPI and originating from stores or buildcaches to be reused,
unless it matches the requirements under ``packages:mpi:require``. For more information on requirements see
:ref:`package-requirements`.
.. _cmd-spack-external-find:
@@ -194,11 +222,6 @@ Specific limitations include:
* Packages are not discoverable by default: For a package to be
discoverable with ``spack external find``, it needs to add special
logic. See :ref:`here <make-package-findable>` for more details.
* The current implementation only collects and examines executable files,
so it is typically only useful for build/run dependencies (in some cases
if a library package also provides an executable, it may be possible to
extract a meaningful Spec by running the executable - for example the
compiler wrappers in MPI implementations).
* The logic does not search through module files, it can only detect
packages with executables defined in ``PATH``; you can help Spack locate
externals which use module files by loading any associated modules for
@@ -369,7 +392,7 @@ The following is an example of how to enforce package properties in
require: "@1.13.2"
openmpi:
require:
- any_of: ["~cuda", "gcc"]
- any_of: ["~cuda", "%gcc"]
mpich:
require:
- one_of: ["+cuda", "+rocm"]
@@ -396,15 +419,69 @@ choose between a set of options using ``any_of`` or ``one_of``:
``mpich`` already includes a conflict, so this is redundant but
still demonstrates the concept).
Other notes about ``requires``:
.. note::
* You can only specify requirements for specific packages: you cannot
add ``requires`` under ``all``.
* You cannot specify requirements for virtual packages (e.g. you can
specify requirements for ``openmpi`` but not ``mpi``).
* For ``any_of`` and ``one_of``, the order of specs indicates a
preference: items that appear earlier in the list are preferred
(note that these preferences can be ignored in favor of others).
For ``any_of`` and ``one_of``, the order of specs indicates a
preference: items that appear earlier in the list are preferred
(note that these preferences can be ignored in favor of others).
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Setting default requirements
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
You can also set default requirements for all packages under ``all``
like this:
.. code-block:: yaml
packages:
all:
require: '%clang'
which means every spec will be required to use ``clang`` as a compiler.
Note that in this case ``all`` represents a *default set of requirements* -
if there are specific package requirements, then the default requirements
under ``all`` are disregarded. For example, with a configuration like this:
.. code-block:: yaml
packages:
all:
require: '%clang'
cmake:
require: '%gcc'
Spack requires ``cmake`` to use ``gcc`` and all other nodes (including cmake dependencies)
to use ``clang``.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Setting requirements on virtual specs
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
A requirement on a virtual spec applies whenever that virtual is present in the DAG. This
can be useful for fixing which virtual provider you want to use:
.. code-block:: yaml
packages:
mpi:
require: 'mvapich2 %gcc'
With the configuration above the only allowed ``mpi`` provider is ``mvapich2 %gcc``.
Requirements on the virtual spec and on the specific provider are both applied, if present. For
instance with a configuration like:
.. code-block:: yaml
packages:
mpi:
require: 'mvapich2 %gcc'
mvapich2:
require: '~cuda'
you will use ``mvapich2~cuda %gcc`` as an ``mpi`` provider.
.. _package_permissions:

View File

@@ -50,8 +50,9 @@ important to understand.
include `setuptools <https://setuptools.pypa.io/>`__,
`flit <https://flit.pypa.io/>`_,
`poetry <https://python-poetry.org/>`_,
`hatchling <https://hatch.pypa.io/latest/>`_, and
`meson <https://meson-python.readthedocs.io/>`_.
`hatchling <https://hatch.pypa.io/latest/>`_,
`meson <https://meson-python.readthedocs.io/>`_, and
`pdm <https://pdm.fming.dev/latest/>`_.
^^^^^^^^^^^
Downloading
@@ -368,6 +369,16 @@ it uses the meson build system. Meson uses the default
See https://meson-python.readthedocs.io/en/latest/usage/start.html
for more information.
"""
pdm
"""
If the ``pyproject.toml`` lists ``pdm.pep517.api`` as the ``build-backend``,
it uses the PDM build system. PDM uses the default ``pyproject.toml``
keys to list dependencies.
See https://pdm.fming.dev/latest/ for more information.
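A minimal ``pyproject.toml`` sketch; the ``build-backend`` value is the one
named above, while the ``requires`` entry names PDM's PEP 517 backend package
and is an assumption, not taken from this diff:

.. code-block:: toml

   [build-system]
   requires = ["pdm-pep517"]
   build-backend = "pdm.pep517.api"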
""""""
wheels
""""""
@@ -571,6 +582,19 @@ libraries. Make sure not to add modules/packages containing the word
"test", as these likely won't end up in the installation directory,
or may require test dependencies like pytest to be installed.
Instead of defining ``import_modules`` explicitly, you can list only the
modules to be skipped by using ``skip_modules``. If a skipped module has
submodules, they are skipped as well. For example, to exclude the
``plotting`` modules from the automatically detected ``import_modules``
set ``['nilearn', 'nilearn.surface', 'nilearn.plotting',
'nilearn.plotting.data']``:
.. code-block:: python
skip_modules = ['nilearn.plotting']
This will set ``import_modules`` to ``['nilearn', 'nilearn.surface']``.
Import tests can be run during the installation using ``spack install
--test=root`` or at any time after the installation using
``spack test run``.
@@ -758,3 +782,4 @@ For more information on build backend tools, see:
* poetry: https://python-poetry.org/
* hatchling: https://hatch.pypa.io/latest/
* meson: https://meson-python.readthedocs.io/
* pdm: https://pdm.fming.dev/latest/

View File

@@ -127,6 +127,7 @@ def setup(sphinx):
"sphinx.ext.napoleon",
"sphinx.ext.todo",
"sphinx.ext.viewcode",
"sphinx_design",
"sphinxcontrib.programoutput",
]
@@ -201,6 +202,7 @@ def setup(sphinx):
("py:class", "unittest.case.TestCase"),
("py:class", "_frozen_importlib_external.SourceFileLoader"),
("py:class", "clingo.Control"),
("py:class", "six.moves.urllib.parse.ParseResult"),
# Spack classes that are private and we don't want to expose
("py:class", "spack.provider_index._IndexBase"),
("py:class", "spack.repo._PrependFileLoader"),

View File

@@ -19,9 +19,9 @@ see the default settings by looking at
These settings can be overridden in ``etc/spack/config.yaml`` or
``~/.spack/config.yaml``. See :ref:`configuration-scopes` for details.
--------------------
``install_tree``
--------------------
---------------------
``install_tree:root``
---------------------
The location where Spack will install packages and their dependencies.
Default is ``$spack/opt/spack``.
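For example, to relocate all installations to another filesystem (the path
shown is illustrative):

.. code-block:: yaml

   config:
     install_tree:
       root: /scratch/spack/installs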

View File

@@ -478,14 +478,21 @@ them to the Environment.
spack:
include:
- relative/path/to/config.yaml
- https://github.com/path/to/raw/config/compilers.yaml
- /absolute/path/to/packages.yaml
Environments can include files with either relative or absolute
paths. Inline configurations take precedence over included
configurations, so you don't have to change shared configuration files
to make small changes to an individual Environment. Included configs
listed earlier will have higher precedence, as the included configs are
applied in reverse order.
Environments can include files or URLs. File paths can be relative or
absolute. URLs include the path to the text for individual files or
can be the path to a directory containing configuration files.
^^^^^^^^^^^^^^^^^^^^^^^^
Configuration precedence
^^^^^^^^^^^^^^^^^^^^^^^^
Inline configurations take precedence over included configurations, so
you don't have to change shared configuration files to make small changes
to an individual environment. Included configurations listed earlier will
have higher precedence, as the included configs are applied in reverse order.
-------------------------------
Manually Editing the Specs List
@@ -616,31 +623,6 @@ The following two Environment manifests are identical:
Spec matrices can be used to install swaths of software across various
toolchains.
The concretization logic for spec matrices differs slightly from the
rest of Spack. If a variant or dependency constraint from a matrix is
invalid, Spack will reject the constraint and try again without
it. For example, the following two Environment manifests will produce
the same specs:
.. code-block:: yaml
spack:
specs:
- matrix:
- [zlib, libelf, hdf5+mpi]
- [^mvapich2@2.2, ^openmpi@3.1.0]
spack:
specs:
- zlib
- libelf
- hdf5+mpi ^mvapich2@2.2
- hdf5+mpi ^openmpi@3.1.0
This allows one to create toolchains out of combinations of
constraints and apply them somewhat indiscriminately to packages,
without regard for the applicability of the constraint.
^^^^^^^^^^^^^^^^^^^^
Spec List References
^^^^^^^^^^^^^^^^^^^^
@@ -1004,7 +986,7 @@ A typical workflow is as follows:
spack env create -d .
spack -e . add perl
spack -e . concretize
spack -e . env depfile > Makefile
spack -e . env depfile -o Makefile
make -j64
This generates a ``Makefile`` from a concretized environment in the
@@ -1017,7 +999,6 @@ load, even when packages are built in parallel.
By default the following phony convenience targets are available:
- ``make all``: installs the environment (default target);
- ``make fetch-all``: only fetch sources of all packages;
- ``make clean``: cleans files used by make, but does not uninstall packages.
.. tip::
@@ -1027,8 +1008,17 @@ By default the following phony convenience targets are available:
printed orderly per package install. To get synchronized output with colors,
use ``make -j<N> SPACK_COLOR=always --output-sync=recurse``.
The following advanced example shows how generated targets can be used in a
``Makefile``:
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Specifying dependencies on generated ``make`` targets
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
An interesting question is how to include generated ``Makefile``\s in your own
``Makefile``\s. This comes up when you want to install an environment that provides
executables required in a command for a make target of your own.
The example below shows how to accomplish this: the ``env`` target specifies
the generated ``spack/env`` target as a prerequisite, meaning that the environment
gets installed and is available for use in the ``env`` target.
.. code:: Makefile
@@ -1054,11 +1044,10 @@ The following advanced example shows how generated targets can be used in a
include env.mk
endif
When ``make`` is invoked, it first "remakes" the missing include ``env.mk``
from its rule, which triggers concretization. When done, the generated target
``spack/env`` is available. In the above example, the ``env`` target uses this generated
target as a prerequisite, meaning that it can make use of the installed packages in
its commands.
This works as follows: when ``make`` is invoked, it first "remakes" the missing
include ``env.mk`` as there is a target for it. This triggers concretization of
the environment and makes spack output ``env.mk``. At that point the
generated target ``spack/env`` becomes available through ``include env.mk``.
As it is typically undesirable to remake ``env.mk`` as part of ``make clean``,
the include is conditional.
@@ -1069,3 +1058,24 @@ the include is conditional.
the ``--make-target-prefix`` flag and use the non-phony target
``<target-prefix>/env`` as prerequisite, instead of the phony target
``<target-prefix>/all``.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Building a subset of the environment
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
The generated ``Makefile``\s contain install targets for each spec. Given the hash
of a particular spec, you can use the ``.install/<hash>`` target to install the
spec with its dependencies. There is also ``.install-deps/<hash>`` to *only* install
its dependencies. This can be useful when certain flags should only apply to
dependencies. Below we show a use case where a spec is installed with verbose
output (``spack install --verbose``) while its dependencies are installed silently:
.. code:: console
$ spack env depfile -o Makefile --make-target-prefix my_env
# Install dependencies in parallel, only show a log on error.
$ make -j16 my_env/.install-deps/<hash> SPACK_INSTALL_FLAGS=--show-log-on-error
# Install the root spec with verbose output.
$ make -j16 my_env/.install/<hash> SPACK_INSTALL_FLAGS=--verbose

View File

@@ -23,8 +23,36 @@ be present on the machine where Spack is run:
These requirements can be easily installed on most modern Linux systems;
on macOS, XCode is required. Spack is designed to run on HPC
platforms like Cray. Not all packages should be expected
to work on all platforms. A build matrix showing which packages are
working on which systems is planned but not yet available.
to work on all platforms.
A build matrix showing which packages are working on which systems is shown below.
.. tab-set::
.. tab-item:: Debian/Ubuntu
.. code-block:: console
apt update
apt install build-essential ca-certificates coreutils curl environment-modules gfortran git gpg lsb-release python3 python3-distutils python3-venv unzip zip
.. tab-item:: RHEL
.. code-block:: console
yum update -y
yum install -y epel-release
yum update -y
yum --enablerepo epel groupinstall -y "Development Tools"
yum --enablerepo epel install -y curl findutils gcc-c++ gcc gcc-gfortran git gnupg2 hostname iproute make patch python3 python3-pip python3-setuptools unzip
python3 -m pip install boto3
.. tab-item:: macOS Brew
.. code-block:: console
brew update
brew install curl gcc git gnupg zip
------------
Installation

View File

@@ -14,7 +14,13 @@ problems if you encounter them.
Spack does not seem to respect ``packages.yaml``
------------------------------------------------
A common problem in Spack v0.18 and above is that package, compiler and target
.. note::
This issue is **resolved** as of v0.19.0.dev0 commit
`8281a0c5feabfc4fe180846d6fe95cfe53420bc5`, through the introduction of package
requirements. See :ref:`package-requirements`.
A common problem in Spack v0.18.0 up to v0.19.0.dev0 is that package, compiler and target
preferences specified in ``packages.yaml`` do not seem to be respected. Spack picks the
"wrong" compilers and their versions, package versions and variants, and
micro-architectures.

View File

@@ -77,7 +77,7 @@ installation of a package.
Spack only generates modulefiles when a package is installed. If
you attempt to install a package and it is already installed, Spack
will not regenerate modulefiles for the package. This may to
will not regenerate modulefiles for the package. This may lead to
inconsistent modulefiles if the Spack module configuration has
changed since the package was installed, either by editing a file
or changing scopes or environments.

View File

@@ -4561,6 +4561,9 @@ other checks.
* - :ref:`AutotoolsPackage <autotoolspackage>`
- ``check`` (``make test``, ``make check``)
- ``installcheck`` (``make installcheck``)
* - :ref:`CachedCMakePackage <cachedcmakepackage>`
- ``check`` (``make check``, ``make test``)
- Not applicable
* - :ref:`CMakePackage <cmakepackage>`
- ``check`` (``make check``, ``make test``)
- Not applicable
@@ -4585,6 +4588,9 @@ other checks.
* - :ref:`SIPPackage <sippackage>`
- Not applicable
- ``test`` (module imports)
* - :ref:`WafPackage <wafpackage>`
- ``build_test`` (must be overridden)
- ``install_test`` (must be overridden)
For example, the ``Libelf`` package inherits from ``AutotoolsPackage``
and its ``Makefile`` has a standard ``check`` target. So Spack will

View File

@@ -5,9 +5,9 @@
.. _pipelines:
=========
Pipelines
=========
============
CI Pipelines
============
Spack provides commands that support generating and running automated build
pipelines designed for Gitlab CI. At the highest level it works like this:
@@ -168,7 +168,7 @@ which specs are up to date and which need to be rebuilt (it's a good idea for ot
reasons as well, but those are out of scope for this discussion). In this case we
have disabled it (using ``rebuild-index: False``) because the index would only be
generated in the artifacts mirror anyway, and consequently would not be available
during subesequent pipeline runs.
during subsequent pipeline runs.
.. note::
With the addition of reproducible builds (#22887) a previously working
@@ -267,24 +267,64 @@ generated by jobs in the pipeline.
``spack ci rebuild``
^^^^^^^^^^^^^^^^^^^^^
The purpose of the ``spack ci rebuild`` is straightforward: take its assigned
spec job, check whether the target mirror already has a binary for that spec,
and if not, build the spec from source and push the binary to the mirror. To
accomplish this in a reproducible way, the sub-command prepares a ``spack install``
command line to build a single spec in the DAG, saves that command in a
shell script, ``install.sh``, in the current working directory, and then runs
it to install the spec. The shell script is also exported as an artifact to
aid in reproducing the build outside of the CI environment.
The purpose of ``spack ci rebuild`` is straightforward: take its assigned
spec and ensure a binary of a successful build exists on the target mirror.
If the binary does not already exist, it is built from source and pushed
to the mirror. The associated stand-alone tests are optionally run against
the new build. Additionally, files for reproducing the build outside of the
CI environment are created to facilitate debugging.
If it was necessary to install the spec from source, ``spack ci rebuild`` will
also subsequently create a binary package for the spec and try to push it to the
mirror.
If a binary for the spec does not exist on the target mirror, an install
shell script, ``install.sh``, is created and saved in the current working
directory. The script is run in a job to install the spec from source. The
resulting binary package is pushed to the mirror. If ``cdash`` is configured
for the environment, then the build results will be uploaded to the site.
The ``spack ci rebuild`` sub-command mainly expects its "input" to come either
from environment variables or from the ``gitlab-ci`` section of the ``spack.yaml``
environment file. There are two main sources of the environment variables, some
are written into ``.gitlab-ci.yml`` by ``spack ci generate``, and some are
provided by the GitLab CI runtime.
Environment variables and values in the ``gitlab-ci`` section of the
``spack.yaml`` environment file provide inputs to this process. The
two main sources of environment variables are variables written into
``.gitlab-ci.yml`` by ``spack ci generate`` and the GitLab CI runtime.
Several key CI pipeline variables are described in
:ref:`ci_environment_variables`.
If the ``--tests`` option is provided, stand-alone tests are performed but
only if the build was successful *and* the package does not appear in the
list of ``broken-tests-packages``. A shell script, ``test.sh``, is created
and run to perform the tests. On completion, test logs are exported as job
artifacts for review and to facilitate debugging. If `cdash` is configured,
test results are also uploaded to the site.
A snippet from an example ``spack.yaml`` file illustrating use of this
option *and* specification of a package with broken tests is given below.
The inclusion of a spec for building ``gptune`` is not shown here. Note
that ``--tests`` is passed to ``spack ci rebuild`` as part of the
``gitlab-ci`` script.
.. code-block:: yaml
gitlab-ci:
script:
- . "./share/spack/setup-env.sh"
- spack --version
- cd ${SPACK_CONCRETE_ENV_DIR}
- spack env activate --without-view .
- spack config add "config:install_tree:projections:${SPACK_JOB_SPEC_PKG_NAME}:'morepadding/{architecture}/{compiler.name}-{compiler.version}/{name}-{version}-{hash}'"
- mkdir -p ${SPACK_ARTIFACTS_ROOT}/user_data
- if [[ -r /mnt/key/intermediate_ci_signing_key.gpg ]]; then spack gpg trust /mnt/key/intermediate_ci_signing_key.gpg; fi
- if [[ -r /mnt/key/spack_public_key.gpg ]]; then spack gpg trust /mnt/key/spack_public_key.gpg; fi
- spack -d ci rebuild --tests > >(tee ${SPACK_ARTIFACTS_ROOT}/user_data/pipeline_out.txt) 2> >(tee ${SPACK_ARTIFACTS_ROOT}/user_data/pipeline_err.txt >&2)
broken-tests-packages:
- gptune
In this case, even if ``gptune`` is successfully built from source, the
pipeline will *not* run its stand-alone tests since the package is listed
under ``broken-tests-packages``.
Spack's cloud pipelines provide actual, up-to-date examples of the CI/CD
configuration and environment files used by Spack. You can find them
under Spack's `stacks
<https://github.com/spack/spack/tree/develop/share/spack/gitlab/cloud_pipelines/stacks>`_ repository directory.
.. _cmd-spack-ci-rebuild-index:
@@ -447,7 +487,7 @@ Note about "no-op" jobs
^^^^^^^^^^^^^^^^^^^^^^^
If no specs in an environment need to be rebuilt during a given pipeline run
(meaning all are already up to date on the mirror), a single succesful job
(meaning all are already up to date on the mirror), a single successful job
(a NO-OP) is still generated to avoid an empty pipeline (which GitLab
considers to be an error). An optional ``service-job-attributes`` section
can be added to your ``spack.yaml`` where you can provide ``tags`` and
@@ -725,7 +765,7 @@ above with ``git checkout ${SPACK_CHECKOUT_VERSION}``.
On the other hand, if you're pointing to a spack repository and branch under your
control, there may be no benefit in using the captured ``SPACK_CHECKOUT_VERSION``,
and you can instead just clone using the variables you define (``SPACK_REPO``
and ``SPACK_REF`` in the example aboves).
and ``SPACK_REF`` in the example above).
.. _custom_workflow:

View File

@@ -3,6 +3,7 @@
sphinx>=3.4,!=4.1.2,!=5.1.0
sphinxcontrib-programoutput
sphinx-design
sphinx-rtd-theme
python-levenshtein
# Restrict to docutils <0.17 to workaround a list rendering issue in sphinx.

View File

@@ -18,7 +18,10 @@ spack:
- "py-sphinx@3.4:4.1.1,4.1.3:"
- py-sphinxcontrib-programoutput
- py-docutils@:0.16
- py-sphinx-design
- py-sphinx-rtd-theme
- py-pygments@:2.12
# VCS
- git
- mercurial

View File

@@ -18,7 +18,7 @@
* Homepage: https://pypi.python.org/pypi/archspec
* Usage: Labeling, comparison and detection of microarchitectures
* Version: 0.1.4 (commit b8eea9df2b4204ff27d204452cd46f5199a0b423)
* Version: 0.1.4 (commit e2cfdc266174488dee78b8c9058e36d60dc1b548)
argparse
--------

View File

@@ -106,7 +106,7 @@ def __eq__(self, other):
self.name == other.name
and self.vendor == other.vendor
and self.features == other.features
and self.ancestors == other.ancestors
and self.parents == other.parents # avoid ancestors here
and self.compilers == other.compilers
and self.generation == other.generation
)

View File

@@ -1099,8 +1099,7 @@
"avx512cd",
"avx512vbmi",
"avx512ifma",
"sha",
"umip"
"sha"
],
"compilers": {
"gcc": [
@@ -1263,7 +1262,6 @@
"avx512vbmi",
"avx512ifma",
"sha_ni",
"umip",
"clwb",
"rdpid",
"gfni",
@@ -2249,7 +2247,7 @@
}
},
"graviton2": {
"from": ["aarch64"],
"from": ["graviton"],
"vendor": "ARM",
"features": [
"fp",
@@ -2319,6 +2317,107 @@
]
}
},
"graviton3": {
"from": ["graviton2"],
"vendor": "ARM",
"features": [
"fp",
"asimd",
"evtstrm",
"aes",
"pmull",
"sha1",
"sha2",
"crc32",
"atomics",
"fphp",
"asimdhp",
"cpuid",
"asimdrdm",
"jscvt",
"fcma",
"lrcpc",
"dcpop",
"sha3",
"sm3",
"sm4",
"asimddp",
"sha512",
"sve",
"asimdfhm",
"dit",
"uscat",
"ilrcpc",
"flagm",
"ssbs",
"paca",
"pacg",
"dcpodp",
"svei8mm",
"svebf16",
"i8mm",
"bf16",
"dgh",
"rng"
],
"compilers" : {
"gcc": [
{
"versions": "4.8:4.8.9",
"flags": "-march=armv8-a"
},
{
"versions": "4.9:5.9",
"flags": "-march=armv8-a+crc+crypto"
},
{
"versions": "6:6.9",
"flags" : "-march=armv8.1-a"
},
{
"versions": "7:7.9",
"flags" : "-march=armv8.2-a+crypto+fp16 -mtune=cortex-a72"
},
{
"versions": "8.0:8.9",
"flags" : "-march=armv8.2-a+fp16+dotprod+crypto -mtune=cortex-a72"
},
{
"versions": "9.0:9.9",
"flags" : "-march=armv8.4-a+crypto+rcpc+sha3+sm4+sve+rng+nodotprod -mtune=neoverse-v1"
},
{
"versions": "10.0:",
"flags" : "-march=armv8.4-a+crypto+rcpc+sha3+sm4+sve+rng+ssbs+i8mm+bf16+nodotprod -mtune=neoverse-v1"
}
],
"clang" : [
{
"versions": "3.9:4.9",
"flags" : "-march=armv8.2-a+fp16+crc+crypto"
},
{
"versions": "5:10",
"flags" : "-march=armv8.2-a+fp16+rcpc+dotprod+crypto"
},
{
"versions": "11:",
"flags" : "-march=armv8.4-a+sve+ssbs+fp16+bf16+crypto+i8mm+rng"
}
],
"arm" : [
{
"versions": "20:21.9",
"flags" : "-march=armv8.2-a+sve+fp16+rcpc+dotprod+crypto"
},
{
"versions": "22:",
"flags" : "-march=armv8.4-a+sve+ssbs+fp16+bf16+crypto+i8mm+rng"
}
]
}
},
"m1": {
"from": ["aarch64"],
"vendor": "Apple",

View File

@@ -71,6 +71,8 @@
import re
import math
import multiprocessing
import sys
import threading
import time
from contextlib import contextmanager
@@ -409,7 +411,12 @@ def parse(self, stream, context=6, jobs=None):
pool = multiprocessing.Pool(jobs)
try:
# this is a workaround for a Python bug in Pool with ctrl-C
results = pool.map_async(_parse_unpack, args, 1).get(9999999)
if sys.version_info >= (3, 2):
max_timeout = threading.TIMEOUT_MAX
else:
max_timeout = 9999999
results = pool.map_async(_parse_unpack, args, 1).get(max_timeout)
errors, warnings, timings = zip(*results)
finally:
pool.terminate()
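The shape of this workaround in isolation, as a runnable sketch (names here are illustrative, not from Spack):
import multiprocessing
import threading

def square(x):
    return x * x

if __name__ == "__main__":
    pool = multiprocessing.Pool(2)
    try:
        # map_async(...).get(timeout) stays interruptible by Ctrl-C, whereas a
        # bare pool.map(...) can block uninterruptibly on some Python versions.
        # threading.TIMEOUT_MAX (Python >= 3.2) is the largest accepted timeout.
        results = pool.map_async(square, range(8), 1).get(threading.TIMEOUT_MAX)
        print(sum(results))
    finally:
        pool.terminate()
        pool.join()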

View File

@@ -22,9 +22,9 @@
from llnl.util import tty
from llnl.util.compat import Sequence
from llnl.util.lang import dedupe, memoized
from llnl.util.symlink import symlink
from llnl.util.symlink import islink, symlink
from spack.util.executable import Executable
from spack.util.executable import CommandNotFoundError, Executable, which
from spack.util.path import path_to_os_path, system_path_filter
is_windows = _platform == "win32"
@@ -113,6 +113,69 @@ def path_contains_subdirectory(path, root):
return norm_path.startswith(norm_root)
@memoized
def file_command(*args):
"""Creates entry point to `file` system command with provided arguments"""
try:
file_cmd = which("file", required=True)
except CommandNotFoundError as e:
if is_windows:
raise CommandNotFoundError("`file` utility is not available on Windows")
else:
raise e
for arg in args:
file_cmd.add_default_arg(arg)
return file_cmd
@memoized
def _get_mime_type():
"""Generate method to call `file` system command to aquire mime type
for a specified path
"""
return file_command("-b", "-h", "--mime-type")
@memoized
def _get_mime_type_compressed():
"""Same as _get_mime_type but attempts to check for
compression first
"""
mime_uncompressed = _get_mime_type()
mime_uncompressed.add_default_arg("-Z")
return mime_uncompressed
def mime_type(filename):
"""Returns the mime type and subtype of a file.
Args:
filename: file to be analyzed
Returns:
Tuple containing the MIME type and subtype
"""
output = _get_mime_type()(filename, output=str, error=str).strip()
tty.debug("==> " + output)
type, _, subtype = output.partition("/")
return type, subtype
def compressed_mime_type(filename):
"""Same as mime_type but checks for type that has been compressed
Args:
filename (str): file to be analyzed
Returns:
Tuple containing the MIME type and subtype
"""
output = _get_mime_type_compressed()(filename, output=str, error=str).strip()
tty.debug("==> " + output)
type, _, subtype = output.partition("/")
return type, subtype
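A usage sketch for the two helpers above (paths and example outputs are hypothetical):
from llnl.util.filesystem import compressed_mime_type, mime_type

kind, subtype = mime_type("/bin/sh")  # e.g. ("application", "x-executable")
# With compression detection, `file -Z` reports the type of the decompressed
# content, e.g. ("text", "plain") for a gzipped text file:
kind, subtype = compressed_mime_type("/tmp/notes.txt.gz")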
#: This generates the library filenames that may appear on any OS.
library_extensions = ["a", "la", "so", "tbd", "dylib"]
@@ -170,9 +233,14 @@ def filter_file(regex, repl, *filenames, **kwargs):
Keyword Arguments:
string (bool): Treat regex as a plain string. Default it False
backup (bool): Make backup file(s) suffixed with ``~``. Default is True
backup (bool): Make backup file(s) suffixed with ``~``. Default is False
ignore_absent (bool): Ignore any files that don't exist.
Default is False
start_at (str): Marker used to start applying the replacements. If a
text line matches this marker filtering is started at the next line.
All contents before the marker and the marker itself are copied
verbatim. Default is to start filtering from the first line of the
file.
stop_at (str): Marker used to stop scanning the file further. If a text
line matches this marker filtering is stopped and the rest of the
file is copied verbatim. Default is to filter until the end of the
@@ -181,6 +249,7 @@ def filter_file(regex, repl, *filenames, **kwargs):
string = kwargs.get("string", False)
backup = kwargs.get("backup", False)
ignore_absent = kwargs.get("ignore_absent", False)
start_at = kwargs.get("start_at", None)
stop_at = kwargs.get("stop_at", None)
# Allow strings to use \1, \2, etc. for replacement, like sed
@@ -229,6 +298,7 @@ def groupid_to_group(x):
# reached or we found a marker in the line if it was specified
with open(tmp_filename, mode="r", **extra_kwargs) as input_file:
with open(filename, mode="w", **extra_kwargs) as output_file:
do_filtering = start_at is None
# Using iter and readline is a workaround needed not to
# disable input_file.tell(), which will happen if we call
# input_file.next() implicitly via the for loop
@@ -238,8 +308,12 @@ def groupid_to_group(x):
if stop_at == line.strip():
output_file.write(line)
break
filtered_line = re.sub(regex, repl, line)
output_file.write(filtered_line)
if do_filtering:
filtered_line = re.sub(regex, repl, line)
output_file.write(filtered_line)
else:
do_filtering = start_at == line.strip()
output_file.write(line)
else:
current_position = None
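A sketch of the new markers in use, mirroring the libtool patching elsewhere in this diff: only lines between the FC tag's BEGIN/END markers are rewritten, everything else is copied verbatim (the file name is illustrative):
from llnl.util.filesystem import filter_file

filter_file(
    r'^wl=""$',
    'wl="-Wl,"',
    "libtool",
    start_at="# ### BEGIN LIBTOOL TAG CONFIG: FC",
    stop_at="# ### END LIBTOOL TAG CONFIG: FC",
)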
@@ -637,7 +711,11 @@ def copy_tree(src, dest, symlinks=True, ignore=None, _permissions=False):
if symlinks:
target = os.readlink(s)
if os.path.isabs(target):
new_target = re.sub(abs_src, abs_dest, target)
def escaped_path(path):
return path.replace("\\", r"\\")
new_target = re.sub(escaped_path(abs_src), escaped_path(abs_dest), target)
if new_target != target:
tty.debug("Redirecting link {0} to {1}".format(target, new_target))
target = new_target
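Why the escaping matters, as a standalone sketch: unescaped backslashes in a Windows-style path would be treated as regex and replacement escapes by re.sub (the paths here are made up):
import re

def escaped_path(path):
    return path.replace("\\", r"\\")

abs_src, abs_dest = "C:\\src", "C:\\dest"
target = "C:\\src\\lib\\foo.dll"
assert re.sub(escaped_path(abs_src), escaped_path(abs_dest), target) == "C:\\dest\\lib\\foo.dll"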
@@ -1903,7 +1981,11 @@ def names(self):
name = x[3:]
# Valid extensions include: ['.dylib', '.so', '.a']
for ext in [".dylib", ".so", ".a"]:
# on non Windows platform
# Windows valid library extensions are:
# ['.dll', '.lib']
valid_exts = [".dll", ".lib"] if is_windows else [".dylib", ".so", ".a"]
for ext in valid_exts:
i = name.rfind(ext)
if i != -1:
names.append(name[:i])
@@ -2046,15 +2128,23 @@ def find_libraries(libraries, root, shared=True, recursive=False):
message = message.format(find_libraries.__name__, type(libraries))
raise TypeError(message)
if is_windows:
static_ext = "lib"
shared_ext = "dll"
else:
# Used on both Linux and macOS
static_ext = "a"
shared_ext = "so"
# Construct the right suffix for the library
if shared:
# Used on both Linux and macOS
suffixes = ["so"]
suffixes = [shared_ext]
if sys.platform == "darwin":
# Only used on macOS
suffixes.append("dylib")
else:
suffixes = ["a"]
suffixes = [static_ext]
# List of libraries we are searching with suffixes
libraries = ["{0}.{1}".format(lib, suffix) for lib in libraries for suffix in suffixes]
@@ -2067,7 +2157,11 @@ def find_libraries(libraries, root, shared=True, recursive=False):
# perform first non-recursive search in root/lib then in root/lib64 and
# finally search all of root recursively. The search stops when the first
# match is found.
for subdir in ("lib", "lib64"):
common_lib_dirs = ["lib", "lib64"]
if is_windows:
common_lib_dirs.extend(["bin", "Lib"])
for subdir in common_lib_dirs:
dirname = join_path(root, subdir)
if not os.path.isdir(dirname):
continue
@@ -2080,6 +2174,155 @@ def find_libraries(libraries, root, shared=True, recursive=False):
return LibraryList(found_libs)
def find_all_shared_libraries(root, recursive=False):
"""Convenience function that returns the list of all shared libraries found
in the directory passed as argument.
See documentation for `llnl.util.filesystem.find_libraries` for more information
"""
return find_libraries("*", root=root, shared=True, recursive=recursive)
def find_all_static_libraries(root, recursive=False):
"""Convenience function that returns the list of all static libraries found
in the directory passed as argument.
See documentation for `llnl.util.filesystem.find_libraries` for more information
"""
return find_libraries("*", root=root, shared=False, recursive=recursive)
def find_all_libraries(root, recursive=False):
"""Convenience function that returns the list of all libraries found
in the directory passed as argument.
See documentation for `llnl.util.filesystem.find_libraries` for more information
"""
return find_all_shared_libraries(root, recursive=recursive) + find_all_static_libraries(
root, recursive=recursive
)
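A usage sketch for the search helpers (the prefix is hypothetical):
from llnl.util.filesystem import find_all_libraries, find_libraries

# With recursive=True, lib/ and lib64/ (plus bin/ and Lib/ on Windows) are
# checked first, then the whole prefix is walked until a match is found.
zlibs = find_libraries(["libz"], root="/opt/zlib", shared=True, recursive=True)
everything = find_all_libraries("/opt/zlib", recursive=True)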
class WindowsSimulatedRPath(object):
"""Class representing Windows filesystem rpath analog
One instance of this class is associated with a package (only on Windows)
For each lib/binary directory in an associated package, this class introduces
a symlink to any/all dependent libraries/binaries. This includes the package's
own bin/lib directories, meaning the libraries are linked to the binary directory
and vice versa.
"""
def __init__(self, package, link_install_prefix=True):
"""
Args:
package (spack.package_base.PackageBase): Package requiring links
link_install_prefix (bool): Link against package's own install or stage root.
Packages that run their own executables during build and require rpaths to
the build directory during build time require this option. Default: install
root
"""
self.pkg = package
self._addl_rpaths = set()
self.link_install_prefix = link_install_prefix
self._internal_links = set()
@property
def link_dest(self):
"""
Set of directories where package binaries/libraries are located.
"""
if hasattr(self.pkg, "libs") and self.pkg.libs:
pkg_libs = set(self.pkg.libs.directories)
else:
pkg_libs = set((self.pkg.prefix.lib, self.pkg.prefix.lib64))
return pkg_libs | set([self.pkg.prefix.bin]) | self.internal_links
@property
def internal_links(self):
"""
Linking that needs to be established within the package itself. Useful for links
against extension modules, build-time executables, and internal linkage
"""
return self._internal_links
def add_internal_links(self, *dest):
"""
Incorporate additional paths into the rpath (sym)linking scheme.
Paths provided to this method are linked against by a package's libraries
and libraries found at these paths are linked against a package's binaries.
(i.e. /site-packages -> /bin and /bin -> /site-packages)
Specified paths should be outside of a package's lib, lib64, and bin
directories.
"""
self._internal_links = self._internal_links | set(*dest)
@property
def link_targets(self):
"""
Set of libraries this package needs to link against during runtime
These packages will each be symlinked into the package's lib and binary directories
"""
dependent_libs = []
for path in self.pkg.rpath:
dependent_libs.extend(list(find_all_shared_libraries(path, recursive=True)))
for extra_path in self._addl_rpaths:
dependent_libs.extend(list(find_all_shared_libraries(extra_path, recursive=True)))
return set(dependent_libs)
def include_additional_link_paths(self, *paths):
"""
Add libraries found at the root of provided paths to runtime linking
These are libraries found outside of the typical scope of rpath linking
that require manual inclusion in a runtime linking scheme
Args:
*paths (str): arbitrary number of paths to be added to runtime linking
"""
self._addl_rpaths = self._addl_rpaths | set(paths)
def establish_link(self):
"""
(sym)link packages to runtime dependencies based on RPath configuration for
Windows heuristics
"""
# from build_environment.py:463
# The top-level package is always RPATHed. It hasn't been installed yet
# so the RPATHs are added unconditionally
# for each binary install dir in self.pkg (i.e. pkg.prefix.bin, pkg.prefix.lib)
# install a symlink to each dependent library
for library, lib_dir in itertools.product(self.link_targets, self.link_dest):
if not path_contains_subdirectory(library, lib_dir):
file_name = os.path.basename(library)
dest_file = os.path.join(lib_dir, file_name)
if os.path.exists(lib_dir):
try:
symlink(library, dest_file)
# For py2 compatibility, we have to catch the specific Windows error code
# associated with trying to create a file that already exists (winerror 183)
except OSError as e:
if e.winerror == 183:
# We have either already symlinked or we are encountering a naming clash
# either way, we don't want to overwrite existing libraries
already_linked = islink(dest_file)
tty.debug(
"Linking library %s to %s failed, " % (library, dest_file)
+ "already linked."
if already_linked
else "library with name %s already exists." % file_name
)
pass
else:
raise e
@system_path_filter
@memoized
def can_access_dir(path):

View File

@@ -386,8 +386,12 @@ def _ensure_parent_directory(self):
try:
os.makedirs(parent)
except OSError as e:
# makedirs can fail when diretory already exists.
if not (e.errno == errno.EEXIST and os.path.isdir(parent) or e.errno == errno.EISDIR):
# os.makedirs can fail in a number of ways when the directory already exists.
# With EISDIR, we know it exists, and others like EEXIST, EACCES, and EROFS
# are fine if we ensure that the directory exists.
# Python 3 allows an exist_ok parameter and ignores any OSError as long as
# the directory exists.
if not (e.errno == errno.EISDIR or os.path.isdir(parent)):
raise
return parent
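On Python 3 the same "create if missing, ignore if present" guarantee is available directly, as a one-line sketch with a made-up path:
import os

os.makedirs("/tmp/example/parent", exist_ok=True)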

View File

@@ -150,13 +150,17 @@ def color_when(value):
class match_to_ansi(object):
def __init__(self, color=True):
def __init__(self, color=True, enclose=False):
self.color = _color_when_value(color)
self.enclose = enclose
def escape(self, s):
"""Returns a TTY escape sequence for a color"""
if self.color:
return "\033[%sm" % s
if self.enclose:
return r"\[\033[%sm\]" % s
else:
return "\033[%sm" % s
else:
return ""
@@ -201,9 +205,11 @@ def colorize(string, **kwargs):
Keyword Arguments:
color (bool): If False, output will be plain text without control
codes, for output to non-console devices.
enclose (bool): If True, enclose ansi color sequences with
square brackets to prevent misestimation of terminal width.
"""
color = _color_when_value(kwargs.get("color", get_color_when()))
string = re.sub(color_re, match_to_ansi(color), string)
string = re.sub(color_re, match_to_ansi(color, kwargs.get("enclose")), string)
string = string.replace("}}", "}")
return string
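A sketch of the new flag: with enclose=True each escape sequence is wrapped in \[ \] so that, for example, bash does not count the invisible characters when computing prompt width (the markup string is illustrative):
from llnl.util.tty.color import colorize

# '@g{...}' is Spack's color markup for green text.
ps1_fragment = colorize("@g{(myenv)}", color=True, enclose=True)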

View File

@@ -228,8 +228,8 @@ def __init__(self, controller_function, minion_function):
self.minion_function = minion_function
# these can be optionally set to change defaults
self.controller_timeout = 1
self.sleep_time = 0
self.controller_timeout = 3
self.sleep_time = 0.1
def start(self, **kwargs):
"""Start the controller and minion processes.

View File

@@ -35,9 +35,11 @@ def _search_duplicate_compilers(error_cls):
the decorator object, that will forward the keyword arguments passed
as input.
"""
import ast
import collections
import inspect
import itertools
import pickle
import re
from six.moves.urllib.request import urlopen
@@ -49,6 +51,7 @@ def _search_duplicate_compilers(error_cls):
import spack.patch
import spack.repo
import spack.spec
import spack.util.crypto
import spack.variant
#: Map an audit tag to a list of callables implementing checks
@@ -261,6 +264,14 @@ def _search_duplicate_specs_in_externals(error_cls):
)
package_properties = AuditClass(
group="packages",
tag="PKG-PROPERTIES",
description="Sanity checks on properties a package should maintain",
kwargs=("pkgs",),
)
#: Sanity checks on linting
# This can take some time, so it's run separately from packages
package_https_directives = AuditClass(
@@ -353,6 +364,145 @@ def _search_for_reserved_attributes_names_in_packages(pkgs, error_cls):
return errors
@package_properties
def _ensure_all_package_names_are_lowercase(pkgs, error_cls):
"""Ensure package names are lowercase and consistent"""
badname_regex, errors = re.compile(r"[_A-Z]"), []
for pkg_name in pkgs:
if badname_regex.search(pkg_name):
error_msg = "Package name '{}' is either lowercase or conatine '_'".format(pkg_name)
errors.append(error_cls(error_msg, []))
return errors
@package_properties
def _ensure_packages_are_pickeleable(pkgs, error_cls):
"""Ensure that package objects are pickleable"""
errors = []
for pkg_name in pkgs:
pkg_cls = spack.repo.path.get_pkg_class(pkg_name)
pkg = pkg_cls(spack.spec.Spec(pkg_name))
try:
pickle.dumps(pkg)
except Exception as e:
error_msg = "Package '{}' failed to pickle".format(pkg_name)
details = ["{}".format(str(e))]
errors.append(error_cls(error_msg, details))
return errors
@package_properties
def _ensure_packages_are_unparseable(pkgs, error_cls):
"""Ensure that all packages can unparse and that unparsed code is valid Python"""
import spack.util.package_hash as ph
errors = []
for pkg_name in pkgs:
try:
source = ph.canonical_source(pkg_name, filter_multimethods=False)
except Exception as e:
error_msg = "Package '{}' failed to unparse".format(pkg_name)
details = ["{}".format(str(e))]
errors.append(error_cls(error_msg, details))
continue
try:
compile(source, "internal", "exec", ast.PyCF_ONLY_AST)
except Exception as e:
error_msg = "The unparsed package '{}' failed to compile".format(pkg_name)
details = ["{}".format(str(e))]
errors.append(error_cls(error_msg, details))
return errors
@package_properties
def _ensure_all_versions_can_produce_a_fetcher(pkgs, error_cls):
"""Ensure all versions in a package can produce a fetcher"""
errors = []
for pkg_name in pkgs:
pkg_cls = spack.repo.path.get_pkg_class(pkg_name)
pkg = pkg_cls(spack.spec.Spec(pkg_name))
try:
spack.fetch_strategy.check_pkg_attributes(pkg)
for version in pkg.versions:
assert spack.fetch_strategy.for_package_version(pkg, version)
except Exception as e:
error_msg = "The package '{}' cannot produce a fetcher for some of its versions"
details = ["{}".format(str(e))]
errors.append(error_cls(error_msg.format(pkg_name), details))
return errors
@package_properties
def _ensure_docstring_and_no_fixme(pkgs, error_cls):
"""Ensure the package has a docstring and no fixmes"""
errors = []
fixme_regexes = [
re.compile(r"remove this boilerplate"),
re.compile(r"FIXME: Put"),
re.compile(r"FIXME: Add"),
re.compile(r"example.com"),
]
for pkg_name in pkgs:
details = []
filename = spack.repo.path.filename_for_package_name(pkg_name)
with open(filename, "r") as package_file:
for i, line in enumerate(package_file):
pattern = next((r for r in fixme_regexes if r.search(line)), None)
if pattern:
details.append(
"%s:%d: boilerplate needs to be removed: %s" % (filename, i, line.strip())
)
if details:
error_msg = "Package '{}' contains boilerplate that need to be removed"
errors.append(error_cls(error_msg.format(pkg_name), details))
pkg_cls = spack.repo.path.get_pkg_class(pkg_name)
if not pkg_cls.__doc__:
error_msg = "Package '{}' miss a docstring"
errors.append(error_cls(error_msg.format(pkg_name), []))
return errors
@package_properties
def _ensure_all_packages_use_sha256_checksums(pkgs, error_cls):
"""Ensure no packages use md5 checksums"""
errors = []
for pkg_name in pkgs:
pkg_cls = spack.repo.path.get_pkg_class(pkg_name)
if pkg_cls.manual_download:
continue
pkg = pkg_cls(spack.spec.Spec(pkg_name))
def invalid_sha256_digest(fetcher):
if getattr(fetcher, "digest", None):
h = spack.util.crypto.hash_algo_for_digest(fetcher.digest)
if h != "sha256":
return h, True
return None, False
error_msg = "Package '{}' does not use sha256 checksum".format(pkg_name)
details = []
for v, args in pkg.versions.items():
fetcher = spack.fetch_strategy.for_package_version(pkg, v)
digest, is_bad = invalid_sha256_digest(fetcher)
if is_bad:
details.append("{}@{} uses {}".format(pkg_name, v, digest))
for _, resources in pkg.resources.items():
for resource in resources:
digest, is_bad = invalid_sha256_digest(resource.fetcher)
if is_bad:
details.append("Resource in '{}' uses {}".format(pkg_name, digest))
if details:
errors.append(error_cls(error_msg, details))
return errors
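These property checks all run under the ``packages`` audit group, so, assuming the usual CLI wiring, they can be exercised locally with something like:
$ spack audit packages zlib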
@package_https_directives
def _linting_package_file(pkgs, error_cls):
"""Check for correctness of links"""
@@ -490,6 +640,36 @@ def _unknown_variants_in_dependencies(pkgs, error_cls):
return errors
@package_directives
def _ensure_variant_defaults_are_parsable(pkgs, error_cls):
"""Ensures that variant defaults are present and parsable from cli"""
errors = []
for pkg_name in pkgs:
pkg_cls = spack.repo.path.get_pkg_class(pkg_name)
for variant_name, entry in pkg_cls.variants.items():
variant, _ = entry
default_is_parsable = (
# Permitting a default that is an instance of 'int' allows
# having foo=false or foo=0. Other falsy values are
# not allowed, since they can't be parsed from the CLI ('foo=')
isinstance(variant.default, int)
or variant.default
)
if not default_is_parsable:
error_msg = "Variant '{}' of package '{}' has a bad default value"
errors.append(error_cls(error_msg.format(variant_name, pkg_name), []))
continue
vspec = variant.make_default()
try:
variant.validate_or_raise(vspec, pkg_cls=pkg_cls)
except spack.variant.InvalidVariantValueError:
error_msg = "The variant '{}' default value in package '{}' cannot be validated"
errors.append(error_cls(error_msg.format(variant_name, pkg_name), []))
return errors
@package_directives
def _version_constraints_are_satisfiable_by_some_version_in_repo(pkgs, error_cls):
"""Report if version constraints used in directives are not satisfiable"""

View File

@@ -19,6 +19,7 @@
import ruamel.yaml as yaml
from six.moves.urllib.error import HTTPError, URLError
import llnl.util.filesystem as fsys
import llnl.util.lang
import llnl.util.tty as tty
from llnl.util.filesystem import mkdirp
@@ -26,7 +27,6 @@
import spack.cmd
import spack.config as config
import spack.database as spack_db
import spack.fetch_strategy as fs
import spack.hooks
import spack.hooks.sbang
import spack.mirror
@@ -654,7 +654,7 @@ def get_buildfile_manifest(spec):
for filename in files:
path_name = os.path.join(root, filename)
m_type, m_subtype = relocate.mime_type(path_name)
m_type, m_subtype = fsys.mime_type(path_name)
rel_path_name = os.path.relpath(path_name, spec.prefix)
added = False
@@ -1231,7 +1231,7 @@ def try_fetch(url_to_fetch):
try:
stage.fetch()
except fs.FetchError:
except web_util.FetchError:
stage.destroy()
return None
@@ -1954,7 +1954,7 @@ def get_keys(install=False, trust=False, force=False, mirrors=None):
if not os.path.exists(stage.save_filename):
try:
stage.fetch()
except fs.FetchError:
except web_util.FetchError:
continue
tty.debug("Found key {0}".format(fingerprint))
@@ -2106,7 +2106,7 @@ def _download_buildcache_entry(mirror_root, descriptions):
try:
stage.fetch()
break
except fs.FetchError as e:
except web_util.FetchError as e:
tty.debug(e)
else:
if fail_if_missing:

View File

@@ -15,6 +15,7 @@
import re
import sys
import sysconfig
import uuid
import six
@@ -40,10 +41,13 @@
import spack.util.path
import spack.util.spack_yaml
import spack.util.url
import spack.version
#: Name of the file containing metadata about the bootstrapping source
METADATA_YAML_FILENAME = "metadata.yaml"
is_windows = sys.platform == "win32"
#: Map a bootstrapper type to the corresponding class
_bootstrap_methods = {}
@@ -260,12 +264,11 @@ def mirror_scope(self):
class _BuildcacheBootstrapper(_BootstrapperBase):
"""Install the software needed during bootstrapping from a buildcache."""
config_scope_name = "bootstrap_buildcache"
def __init__(self, conf):
super(_BuildcacheBootstrapper, self).__init__(conf)
self.metadata_dir = spack.util.path.canonicalize_path(conf["metadata"])
self.last_search = None
self.config_scope_name = "bootstrap_buildcache-{}".format(uuid.uuid4())
@staticmethod
def _spec_and_platform(abstract_spec_str):
@@ -378,13 +381,12 @@ def try_search_path(self, executables, abstract_spec_str):
class _SourceBootstrapper(_BootstrapperBase):
"""Install the software needed during bootstrapping from sources."""
config_scope_name = "bootstrap_source"
def __init__(self, conf):
super(_SourceBootstrapper, self).__init__(conf)
self.metadata_dir = spack.util.path.canonicalize_path(conf["metadata"])
self.conf = conf
self.last_search = None
self.config_scope_name = "bootstrap_source-{}".format(uuid.uuid4())
def try_import(self, module, abstract_spec_str):
info = {}
@@ -655,6 +657,8 @@ def _add_externals_if_missing():
# GnuPG
spack.repo.path.get_pkg_class("gawk"),
]
if is_windows:
search_list.append(spack.repo.path.get_pkg_class("winbison"))
detected_packages = spack.detection.by_executable(search_list)
spack.detection.update_configuration(detected_packages, scope="bootstrap")
@@ -788,17 +792,46 @@ def ensure_gpg_in_path_or_raise():
def patchelf_root_spec():
"""Return the root spec used to bootstrap patchelf"""
# TODO: patchelf is restricted to v0.13 since earlier versions have
# TODO: bugs that we don't to deal with, while v0.14 requires a C++17
# TODO: which may not be available on all platforms.
return _root_spec("patchelf@0.13.1:0.13.99")
# 0.13.1 is the last version not to require C++17.
return _root_spec("patchelf@0.13.1:")
def verify_patchelf(patchelf):
"""Older patchelf versions can produce broken binaries, so we
verify the version here.
Arguments:
patchelf (spack.util.executable.Executable): patchelf executable
"""
out = patchelf("--version", output=str, error=os.devnull, fail_on_error=False).strip()
if patchelf.returncode != 0:
return False
parts = out.split(" ")
if len(parts) < 2:
return False
try:
version = spack.version.Version(parts[1])
except ValueError:
return False
return version >= spack.version.Version("0.13.1")
def ensure_patchelf_in_path_or_raise():
"""Ensure patchelf is in the PATH or raise."""
return ensure_executables_in_path_or_raise(
executables=["patchelf"], abstract_spec=patchelf_root_spec()
)
# The old concretizer is not smart and we're doing its job: if the latest patchelf
# does not concretize because the compiler doesn't support C++17, we try to
# concretize again with an upper bound @:0.13.
try:
return ensure_executables_in_path_or_raise(
executables=["patchelf"], abstract_spec=patchelf_root_spec(), cmd_check=verify_patchelf
)
except RuntimeError:
return ensure_executables_in_path_or_raise(
executables=["patchelf"],
abstract_spec=_root_spec("patchelf@0.13.1:0.13"),
cmd_check=verify_patchelf,
)
###

View File

@@ -64,7 +64,9 @@
import spack.subprocess_context
import spack.user_environment
import spack.util.path
import spack.util.pattern
from spack.error import NoHeadersError, NoLibrariesError
from spack.installer import InstallError
from spack.util.cpus import cpus_available
from spack.util.environment import (
EnvironmentModifications,
@@ -108,7 +110,14 @@
# Platform-specific library suffix.
dso_suffix = "dylib" if sys.platform == "darwin" else "so"
if sys.platform == "darwin":
dso_suffix = "dylib"
elif sys.platform == "win32":
dso_suffix = "dll"
else:
dso_suffix = "so"
stat_suffix = "lib" if sys.platform == "win32" else "a"
def should_set_parallel_jobs(jobserver_support=False):
@@ -192,6 +201,8 @@ def clean_environment():
env.unset("CMAKE_PREFIX_PATH")
env.unset("PYTHONPATH")
env.unset("R_HOME")
env.unset("R_ENVIRON")
# Affects GNU make, can e.g. indirectly inhibit enabling parallel build
# env.unset('MAKEFLAGS')
@@ -985,8 +996,24 @@ def add_modifications_for_dep(dep):
dpkg = dep.package
if set_package_py_globals:
set_module_variables_for_package(dpkg)
# Allow dependencies to modify the module
dpkg.setup_dependent_package(spec.package.module, spec)
# Get list of modules that may need updating
modules = []
for cls in inspect.getmro(type(spec.package)):
module = cls.module
if module == spack.package_base:
break
modules.append(module)
# Execute changes as if on a single module
# copy dict to ensure prior changes are available
changes = spack.util.pattern.Bunch()
dpkg.setup_dependent_package(changes, spec)
for module in modules:
module.__dict__.update(changes.__dict__)
if context == "build":
dpkg.setup_dependent_build_environment(env, spec)
else:
@@ -1029,8 +1056,11 @@ def get_cmake_prefix_path(pkg):
spack_built.insert(0, dspec)
ordered_build_link_deps = spack_built + externals
build_link_prefixes = filter_system_paths(x.prefix for x in ordered_build_link_deps)
return build_link_prefixes
cmake_prefix_path_entries = []
for spec in ordered_build_link_deps:
cmake_prefix_path_entries.extend(spec.package.cmake_prefix_paths)
return filter_system_paths(cmake_prefix_path_entries)
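The loop above relies on each package exposing a ``cmake_prefix_paths`` property; a hypothetical package-side sketch (the extra path is made up, and the default is assumed to be just the install prefix):
class MyPkg(object):
    """Minimal stand-in for a package contributing extra CMake prefix paths."""

    prefix = "/opt/mypkg"

    @property
    def cmake_prefix_paths(self):
        # the default would return [self.prefix]; packages can extend it
        return [self.prefix, "/opt/mypkg/extras/cmake"]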
def _setup_pkg_and_run(
@@ -1279,15 +1309,6 @@ def make_stack(tb, stack=None):
return lines
class InstallError(spack.error.SpackError):
"""Raised by packages when a package fails to install.
Any subclass of InstallError will be annotated by Spack with a
``pkg`` attribute on failure, which the caller can use to get the
package for which the exception was raised.
"""
class ChildError(InstallError):
"""Special exception class for wrapping exceptions from child processes
in Spack's build environment.

View File

@@ -244,8 +244,11 @@ def _patch_usr_bin_file(self):
scripts to use file from path."""
if self.spec.os.startswith("nixos"):
for configure_file in fs.find(".", files=["configure"], recursive=True):
fs.filter_file("/usr/bin/file", "file", configure_file, string=True)
x = fs.FileFilter(
*filter(fs.is_exe, fs.find(self.build_directory, "configure", recursive=True))
)
with fs.keep_modification_time(*x.filenames):
x.filter(regex="/usr/bin/file", repl="file", string=True)
@run_before("configure")
def _set_autotools_environment_variables(self):
@@ -262,34 +265,97 @@ def _set_autotools_environment_variables(self):
"""
os.environ["FORCE_UNSAFE_CONFIGURE"] = "1"
@run_before("configure")
def _do_patch_libtool_configure(self):
"""Patch bugs that propagate from libtool macros into "configure" and
further into "libtool". Note that patches that can be fixed by patching
"libtool" directly should be implemented in the _do_patch_libtool method
below."""
# Exit early if we are required not to patch libtool-related problems:
if not self.patch_libtool:
return
x = fs.FileFilter(
*filter(fs.is_exe, fs.find(self.build_directory, "configure", recursive=True))
)
# There are distributed automatically generated files that depend on the configure script
# and require additional tools for rebuilding.
# See https://github.com/spack/spack/pull/30768#issuecomment-1219329860
with fs.keep_modification_time(*x.filenames):
# Fix parsing of compiler output when collecting predeps and postdeps
# https://lists.gnu.org/archive/html/bug-libtool/2016-03/msg00003.html
x.filter(regex=r'^(\s*if test x-L = )("\$p" \|\|\s*)$', repl=r"\1x\2")
x.filter(
regex=r'^(\s*test x-R = )("\$p")(; then\s*)$', repl=r'\1x\2 || test x-l = x"$p"\3'
)
# Support Libtool 2.4.2 and older:
x.filter(regex=r'^(\s*test \$p = "-R")(; then\s*)$', repl=r'\1 || test x-l = x"$p"\2')
@run_after("configure")
def _do_patch_libtool(self):
"""If configure generates a "libtool" script that does not correctly
detect the compiler (and patch_libtool is set), patch in the correct
flags for the Arm, Clang/Flang, Fujitsu and NVHPC compilers."""
values for libtool variables.
# Exit early if we are required not to patch libtool
The generated libtool script supports mixed compilers through tags:
``libtool --tag=CC/CXX/FC/...``. For each tag, there is a block with variables,
which defines what flags to pass to the compiler. The default variables (which
are used by the default tag CC) are set in a block enclosed by
``# ### {BEGIN,END} LIBTOOL CONFIG``. For non-default tags, there are
corresponding blocks ``# ### {BEGIN,END} LIBTOOL TAG CONFIG: {CXX,FC,F77}`` at
the end of the file (after the exit command). libtool evals these blocks.
Whenever we need to update variables that the configure script got wrong
(for example, because it did not recognize the compiler), we should properly scope
those changes to these tags/blocks so they only apply to the compiler we care
about. Below, ``start_at`` and ``stop_at`` are used for that."""
# Exit early if we are required not to patch libtool:
if not self.patch_libtool:
return
for libtool_path in fs.find(self.build_directory, "libtool", recursive=True):
self._patch_libtool(libtool_path)
x = fs.FileFilter(
*filter(fs.is_exe, fs.find(self.build_directory, "libtool", recursive=True))
)
def _patch_libtool(self, libtool_path):
if (
self.spec.satisfies("%arm")
or self.spec.satisfies("%clang")
or self.spec.satisfies("%fj")
or self.spec.satisfies("%nvhpc")
):
fs.filter_file('wl=""\n', 'wl="-Wl,"\n', libtool_path)
fs.filter_file(
'pic_flag=""\n', 'pic_flag="{0}"\n'.format(self.compiler.cc_pic_flag), libtool_path
# Exit early if there is nothing to patch:
if not x.filenames:
return
markers = {"cc": "LIBTOOL CONFIG"}
for tag in ["cxx", "fc", "f77"]:
markers[tag] = "LIBTOOL TAG CONFIG: {0}".format(tag.upper())
# Replace empty linker flag prefixes:
if self.compiler.name == "nag":
# Nag is mixed with gcc and g++, which are recognized correctly.
# Therefore, we change only Fortran values:
for tag in ["fc", "f77"]:
marker = markers[tag]
x.filter(
regex='^wl=""$',
repl='wl="{0}"'.format(self.compiler.linker_arg),
start_at="# ### BEGIN {0}".format(marker),
stop_at="# ### END {0}".format(marker),
)
else:
x.filter(regex='^wl=""$', repl='wl="{0}"'.format(self.compiler.linker_arg))
# Replace empty PIC flag values:
for cc, marker in markers.items():
x.filter(
regex='^pic_flag=""$',
repl='pic_flag="{0}"'.format(getattr(self.compiler, "{0}_pic_flag".format(cc))),
start_at="# ### BEGIN {0}".format(marker),
stop_at="# ### END {0}".format(marker),
)
if self.spec.satisfies("%fj"):
fs.filter_file("-nostdlib", "", libtool_path)
# Other compiler-specific patches:
if self.compiler.name == "fj":
x.filter(regex="-nostdlib", repl="", string=True)
rehead = r"/\S*/"
objfile = [
for o in [
"fjhpctag.o",
"fjcrt0.o",
"fjlang08.o",
@@ -297,9 +363,86 @@ def _patch_libtool(self, libtool_path):
"crti.o",
"crtbeginS.o",
"crtendS.o",
]
for o in objfile:
fs.filter_file(rehead + o, "", libtool_path)
]:
x.filter(regex=(rehead + o), repl="", string=True)
elif self.compiler.name == "dpcpp":
# Hack to filter out spurious predep_objects when building with Intel dpcpp
# (see https://github.com/spack/spack/issues/32863):
x.filter(regex=r"^(predep_objects=.*)/tmp/conftest-[0-9A-Fa-f]+\.o", repl=r"\1")
x.filter(regex=r"^(predep_objects=.*)/tmp/a-[0-9A-Fa-f]+\.o", repl=r"\1")
elif self.compiler.name == "nag":
for tag in ["fc", "f77"]:
marker = markers[tag]
start_at = "# ### BEGIN {0}".format(marker)
stop_at = "# ### END {0}".format(marker)
# Libtool 2.4.2 does not know the shared flag:
x.filter(
regex=r"\$CC -shared",
repl=r"\$CC -Wl,-shared",
string=True,
start_at=start_at,
stop_at=stop_at,
)
# Libtool does not know how to inject whole archives
# (e.g. https://github.com/pmodels/mpich/issues/4358):
x.filter(
regex=r'^whole_archive_flag_spec="\\\$({?wl}?)--whole-archive'
r'\\\$convenience \\\$\1--no-whole-archive"$',
repl=r'whole_archive_flag_spec="\$\1--whole-archive'
r"\`for conv in \$convenience\\\\\"\\\\\"; do test -n \\\\\"\$conv\\\\\" && "
r"new_convenience=\\\\\"\$new_convenience,\$conv\\\\\"; done; "
r'func_echo_all \\\\\"\$new_convenience\\\\\"\` \$\1--no-whole-archive"',
start_at=start_at,
stop_at=stop_at,
)
# The compiler requires special treatment in certain cases:
x.filter(
regex=r"^(with_gcc=.*)$",
repl="\\1\n\n# Is the compiler the NAG compiler?\nwith_nag=yes",
start_at=start_at,
stop_at=stop_at,
)
# Disable the special treatment for gcc and g++:
for tag in ["cc", "cxx"]:
marker = markers[tag]
x.filter(
regex=r"^(with_gcc=.*)$",
repl="\\1\n\n# Is the compiler the NAG compiler?\nwith_nag=no",
start_at="# ### BEGIN {0}".format(marker),
stop_at="# ### END {0}".format(marker),
)
# The compiler does not support -pthread flag, which might come
# from the inherited linker flags. We prepend the flag with -Wl,
# before using it:
x.filter(
regex=r"^(\s*)(for tmp_inherited_linker_flag in \$tmp_inherited_linker_flags; "
r"do\s*)$",
repl='\\1if test "x$with_nag" = xyes; then\n'
"\\1 revert_nag_pthread=$tmp_inherited_linker_flags\n"
"\\1 tmp_inherited_linker_flags="
"`$ECHO \"$tmp_inherited_linker_flags\" | $SED 's% -pthread% -Wl,-pthread%g'`\n"
'\\1 test x"$revert_nag_pthread" = x"$tmp_inherited_linker_flags" && '
"revert_nag_pthread=no || revert_nag_pthread=yes\n"
"\\1fi\n\\1\\2",
start_at='if test -n "$inherited_linker_flags"; then',
stop_at='case " $new_inherited_linker_flags " in',
)
# And revert the modification to produce '*.la' files that can be
# used with gcc (normally, we do not install the files but they can
# still be used during the building):
start_at = '# Time to change all our "foo.ltframework" stuff back to "-framework foo"'
stop_at = "# installed libraries to the beginning of the library search list"
x.filter(
regex=r"(\s*)(# move library search paths that coincide with paths to not "
r"yet\s*)$",
repl='\\1test x"$with_nag$revert_nag_pthread" = xyesyes &&\n'
'\\1 new_inherited_linker_flags=`$ECHO " $new_inherited_linker_flags" | '
"$SED 's% -Wl,-pthread% -pthread%g'`\n\\1\\2",
start_at=start_at,
stop_at=stop_at,
)
@property
def configure_directory(self):
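
Editorial sketch: the tag scoping described in the docstring above, shown in isolation. It reuses the FileFilter start_at/stop_at arguments and the marker strings from the hunk; the function name and flag value are illustrative.

    import llnl.util.filesystem as fs


    def patch_fc_pic_flag(libtool_path, fc_pic_flag):
        marker = "LIBTOOL TAG CONFIG: FC"
        x = fs.FileFilter(libtool_path)
        # Confine the replacement to the FC tag block so the default CC block,
        # which may describe a different compiler, is left untouched.
        x.filter(
            regex='^pic_flag=""$',
            repl='pic_flag="{0}"'.format(fc_pic_flag),
            start_at="# ### BEGIN {0}".format(marker),
            stop_at="# ### END {0}".format(marker),
        )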

View File

@@ -19,7 +19,6 @@
import spack.build_environment
from spack.directives import conflicts, depends_on, variant
from spack.package_base import InstallError, PackageBase, run_after
from spack.util.path import convert_to_posix_path
# Regex to extract the primary generator from the CMake generator
# string.
@@ -176,7 +175,7 @@ def _std_args(pkg):
args = [
"-G",
generator,
define("CMAKE_INSTALL_PREFIX", convert_to_posix_path(pkg.prefix)),
define("CMAKE_INSTALL_PREFIX", pkg.prefix),
define("CMAKE_BUILD_TYPE", build_type),
define("BUILD_TESTING", pkg.run_tests),
]

View File

@@ -41,6 +41,9 @@ class CudaPackage(PackageBase):
"75",
"80",
"86",
"87",
"89",
"90",
)
# FIXME: keep cuda and cuda_arch separate to make usage easier until
@@ -100,6 +103,11 @@ def cuda_flags(arch_list):
depends_on("cuda@11.0:", when="cuda_arch=80")
depends_on("cuda@11.1:", when="cuda_arch=86")
depends_on("cuda@11.4:", when="cuda_arch=87")
depends_on("cuda@11.8:", when="cuda_arch=89")
depends_on("cuda@11.8:", when="cuda_arch=90")
# From the NVIDIA install guide we know of conflicts for particular
# platforms (linux, darwin), architectures (x86, powerpc) and compilers
# (gcc, clang). We don't restrict %gcc and %clang conflicts to
@@ -128,10 +136,11 @@ def cuda_flags(arch_list):
conflicts("%gcc@10:", when="+cuda ^cuda@:11.0")
conflicts("%gcc@11:", when="+cuda ^cuda@:11.4.0")
conflicts("%gcc@11.2:", when="+cuda ^cuda@:11.5")
conflicts("%gcc@12:", when="+cuda ^cuda@:11.7")
conflicts("%gcc@12:", when="+cuda ^cuda@:11.8")
conflicts("%clang@12:", when="+cuda ^cuda@:11.4.0")
conflicts("%clang@13:", when="+cuda ^cuda@:11.5")
conflicts("%clang@14:", when="+cuda ^cuda@:11.7")
conflicts("%clang@15:", when="+cuda ^cuda@:11.8")
# https://gist.github.com/ax3l/9489132#gistcomment-3860114
conflicts("%gcc@10", when="+cuda ^cuda@:11.4.0")

View File

@@ -75,7 +75,7 @@ class MesonPackage(PackageBase):
@property
def archive_files(self):
"""Files to archive for packages based on Meson"""
return [os.path.join(self.build_directory, "meson-logs/meson-log.txt")]
return [os.path.join(self.build_directory, "meson-logs", "meson-log.txt")]
@property
def root_mesonlists_dir(self):
@@ -138,13 +138,21 @@ def flags_to_build_system_args(self, flags):
# Has to be dynamic attribute due to caching
setattr(self, "meson_flag_args", [])
@property
def build_dirname(self):
"""Returns the directory name to use when building the package
:return: name of the subdirectory for building the package
"""
return "spack-build-%s" % self.spec.dag_hash(7)
@property
def build_directory(self):
"""Returns the directory to use when building the package
:return: directory where to build the package
"""
return os.path.join(self.stage.source_path, "spack-build")
return os.path.join(self.stage.path, self.build_dirname)
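
An illustrative before/after of the layout change (paths made up):

    # Before: the build tree was nested inside the unpacked sources,
    #     <stage>/spack-src/spack-build
    # After: it is a hash-suffixed sibling of the source tree, keyed on
    # self.spec.dag_hash(7), so it no longer lives under the source dir:
    #     <stage>/spack-build-abc1234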
def meson_args(self):
"""Produces a list containing all the arguments that must be passed to

View File

@@ -33,7 +33,7 @@ class PythonPackage(PackageBase):
#: Package name, version, and extension on PyPI
pypi = None # type: Optional[str]
maintainers = ["adamjstewart"]
maintainers = ["adamjstewart", "pradyunsg"]
# Default phases
phases = ["install"]
@@ -138,12 +138,28 @@ def import_modules(self):
path.replace(root + os.sep, "", 1).replace(".py", "").replace("/", ".")
)
modules = [mod for mod in modules if re.match("[a-zA-Z0-9._]+$", mod)]
modules = [
mod
for mod in modules
if re.match("[a-zA-Z0-9._]+$", mod) and not any(map(mod.startswith, self.skip_modules))
]
tty.debug("Detected the following modules: {0}".format(modules))
return modules
@property
def skip_modules(self):
"""Names of modules that should be skipped when running tests.
These are a subset of import_modules. If a module has submodules,
they are skipped as well (meaning a.b is skipped if a is contained).
Returns:
list: list of strings of module names
"""
return []
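
A small, self-contained illustration of the filter above; module names are made up:

    skip_modules = ["tensorflow.python"]
    modules = ["tensorflow", "tensorflow.python", "tensorflow.python.ops"]

    kept = [
        mod
        for mod in modules
        if not any(map(mod.startswith, skip_modules))
    ]
    # kept == ["tensorflow"]: skipping a module drops its submodules too,
    # because "a.b" starts with "a" whenever "a" is listed.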
@property
def build_directory(self):
"""The root directory of the Python package.
@@ -227,8 +243,8 @@ def headers(self):
"""Discover header files in platlib."""
# Headers may be in either location
include = inspect.getmodule(self).include
platlib = inspect.getmodule(self).platlib
include = self.prefix.join(self.spec["python"].package.include)
platlib = self.prefix.join(self.spec["python"].package.platlib)
headers = find_all_headers(include) + find_all_headers(platlib)
if headers:
@@ -243,7 +259,7 @@ def libs(self):
# Remove py- prefix in package name
library = "lib" + self.spec.name[3:].replace("-", "?")
root = inspect.getmodule(self).platlib
root = self.prefix.join(self.spec["python"].package.platlib)
for shared in [True, False]:
libs = find_libraries(library, root, shared=shared, recursive=True)

View File

@@ -4,13 +4,17 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import base64
import codecs
import copy
import json
import os
import re
import shutil
import stat
import subprocess
import sys
import tempfile
import time
import zipfile
from six import iteritems
@@ -20,6 +24,7 @@
import llnl.util.filesystem as fs
import llnl.util.tty as tty
from llnl.util.lang import memoized
import spack
import spack.binary_distribution as bindist
@@ -33,9 +38,13 @@
import spack.util.executable as exe
import spack.util.gpg as gpg_util
import spack.util.spack_yaml as syaml
import spack.util.url as url_util
import spack.util.web as web_util
from spack.error import SpackError
from spack.reporters.cdash import CDash
from spack.reporters.cdash import build_stamp as cdash_build_stamp
from spack.spec import Spec
from spack.util.pattern import Bunch
JOB_RETRY_CONDITIONS = [
"always",
@@ -60,69 +69,6 @@ def __exit__(self, exc_type, exc_value, exc_traceback):
return False
def _create_buildgroup(opener, headers, url, project, group_name, group_type):
data = {"newbuildgroup": group_name, "project": project, "type": group_type}
enc_data = json.dumps(data).encode("utf-8")
request = Request(url, data=enc_data, headers=headers)
response = opener.open(request)
response_code = response.getcode()
if response_code != 200 and response_code != 201:
msg = "Creating buildgroup failed (response code = {0}".format(response_code)
tty.warn(msg)
return None
response_text = response.read()
response_json = json.loads(response_text)
build_group_id = response_json["id"]
return build_group_id
def _populate_buildgroup(job_names, group_name, project, site, credentials, cdash_url):
url = "{0}/api/v1/buildgroup.php".format(cdash_url)
headers = {
"Authorization": "Bearer {0}".format(credentials),
"Content-Type": "application/json",
}
opener = build_opener(HTTPHandler)
parent_group_id = _create_buildgroup(opener, headers, url, project, group_name, "Daily")
group_id = _create_buildgroup(
opener, headers, url, project, "Latest {0}".format(group_name), "Latest"
)
if not parent_group_id or not group_id:
msg = "Failed to create or retrieve buildgroups for {0}".format(group_name)
tty.warn(msg)
return
data = {
"project": project,
"buildgroupid": group_id,
"dynamiclist": [
{"match": name, "parentgroupid": parent_group_id, "site": site} for name in job_names
],
}
enc_data = json.dumps(data).encode("utf-8")
request = Request(url, data=enc_data, headers=headers)
request.get_method = lambda: "PUT"
response = opener.open(request)
response_code = response.getcode()
if response_code != 200:
msg = "Error response code ({0}) in _populate_buildgroup".format(response_code)
tty.warn(msg)
def _is_main_phase(phase_name):
return True if phase_name == "specs" else False
@@ -180,12 +126,6 @@ def get_job_name(phase, strip_compiler, spec, osarch, build_group):
return format_str.format(*format_args)
def _get_cdash_build_name(spec, build_group):
return "{0}@{1}%{2} arch={3} ({4})".format(
spec.name, spec.version, spec.compiler, spec.architecture, build_group
)
def _remove_reserved_tags(tags):
"""Convenience function to strip reserved tags from jobs"""
return [tag for tag in tags if tag not in SPACK_RESERVED_TAGS]
@@ -458,6 +398,14 @@ def _spec_matches(spec, match_string):
return spec.satisfies(match_string)
def _remove_attributes(src_dict, dest_dict):
if "tags" in src_dict and "tags" in dest_dict:
# For 'tags', we remove any tags that are listed for removal
for tag in src_dict["tags"]:
while tag in dest_dict["tags"]:
dest_dict["tags"].remove(tag)
def _copy_attributes(attrs_list, src_dict, dest_dict):
for runner_attr in attrs_list:
if runner_attr in src_dict:
@@ -491,19 +439,23 @@ def _find_matching_config(spec, gitlab_ci):
_copy_attributes(overridable_attrs, gitlab_ci, runner_attributes)
ci_mappings = gitlab_ci["mappings"]
for ci_mapping in ci_mappings:
matched = False
only_first = gitlab_ci.get("match_behavior", "first") == "first"
for ci_mapping in gitlab_ci["mappings"]:
for match_string in ci_mapping["match"]:
if _spec_matches(spec, match_string):
matched = True
if "remove-attributes" in ci_mapping:
_remove_attributes(ci_mapping["remove-attributes"], runner_attributes)
if "runner-attributes" in ci_mapping:
_copy_attributes(
overridable_attrs, ci_mapping["runner-attributes"], runner_attributes
)
return runner_attributes
else:
return None
break
if matched and only_first:
break
return runner_attributes
return runner_attributes if matched else None
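
Editorial sketch of the gitlab-ci data this loop consumes, rendered as a Python dict with made-up values:

    gitlab_ci = {
        # "first" (the default) stops after the first mapping that matches;
        # anything else (e.g. "merge") lets every matching mapping apply.
        "match_behavior": "merge",
        "mappings": [
            {
                "match": ["%gcc"],
                "runner-attributes": {"tags": ["spack", "x86_64"]},
            },
            {
                "match": ["openmpi"],
                "remove-attributes": {"tags": ["x86_64"]},
                "runner-attributes": {"tags": ["large"]},
            },
        ],
    }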
def _pkg_name_from_spec_label(spec_label):
@@ -672,27 +624,14 @@ def generate_gitlab_ci_yaml(
gitlab_ci = yaml_root["gitlab-ci"]
build_group = None
enable_cdash_reporting = False
cdash_auth_token = None
cdash_handler = CDashHandler(yaml_root.get("cdash")) if "cdash" in yaml_root else None
build_group = cdash_handler.build_group if cdash_handler else None
if "cdash" in yaml_root:
enable_cdash_reporting = True
ci_cdash = yaml_root["cdash"]
build_group = ci_cdash["build-group"]
cdash_url = ci_cdash["url"]
cdash_project = ci_cdash["project"]
cdash_site = ci_cdash["site"]
if "SPACK_CDASH_AUTH_TOKEN" in os.environ:
tty.verbose("Using CDash auth token from environment")
cdash_auth_token = os.environ.get("SPACK_CDASH_AUTH_TOKEN")
prune_untouched_packages = os.environ.get("SPACK_PRUNE_UNTOUCHED", None)
if prune_untouched_packages:
prune_untouched_packages = False
spack_prune_untouched = os.environ.get("SPACK_PRUNE_UNTOUCHED", None)
if spack_prune_untouched is not None and spack_prune_untouched.lower() == "true":
# Requested to prune untouched packages, but assume we won't do that
# unless we're actually in a git repo.
prune_untouched_packages = False
rev1, rev2 = get_change_revisions()
tty.debug("Got following revisions: rev1={0}, rev2={1}".format(rev1, rev2))
if rev1 and rev2:
@@ -708,6 +647,14 @@ def generate_gitlab_ci_yaml(
for s in affected_specs:
tty.debug(" {0}".format(s.name))
# Allow overriding --prune-dag cli opt with environment variable
prune_dag_override = os.environ.get("SPACK_PRUNE_UP_TO_DATE", None)
if prune_dag_override is not None:
prune_dag = True if prune_dag_override.lower() == "true" else False
# If we are not doing any kind of pruning, we are rebuilding everything
rebuild_everything = not prune_dag and not prune_untouched_packages
# Downstream jobs will "need" (depend on, for both scheduling and
# artifacts, which include spack.lock file) this pipeline generation
# job by both name and pipeline id. If those environment variables
@@ -720,8 +667,6 @@ def generate_gitlab_ci_yaml(
# Values: "spack_pull_request", "spack_protected_branch", or not set
spack_pipeline_type = os.environ.get("SPACK_PIPELINE_TYPE", None)
spack_buildcache_copy = os.environ.get("SPACK_COPY_BUILDCACHE", None)
if "mirrors" not in yaml_root or len(yaml_root["mirrors"].values()) < 1:
tty.die("spack ci generate requires an env containing a mirror")
@@ -729,6 +674,12 @@ def generate_gitlab_ci_yaml(
mirror_urls = [url for url in ci_mirrors.values()]
remote_mirror_url = mirror_urls[0]
spack_buildcache_copy = os.environ.get("SPACK_COPY_BUILDCACHE", None)
if spack_buildcache_copy:
buildcache_copies = {}
buildcache_copy_src_prefix = remote_mirror_override or remote_mirror_url
buildcache_copy_dest_prefix = spack_buildcache_copy
# Check for a list of "known broken" specs that we should not bother
# trying to build.
broken_specs_url = ""
@@ -820,6 +771,7 @@ def generate_gitlab_ci_yaml(
job_log_dir = os.path.join(pipeline_artifacts_dir, "logs")
job_repro_dir = os.path.join(pipeline_artifacts_dir, "reproduction")
job_test_dir = os.path.join(pipeline_artifacts_dir, "tests")
local_mirror_dir = os.path.join(pipeline_artifacts_dir, "mirror")
user_artifacts_dir = os.path.join(pipeline_artifacts_dir, "user_data")
@@ -833,7 +785,8 @@ def generate_gitlab_ci_yaml(
rel_concrete_env_dir = os.path.relpath(concrete_env_dir, ci_project_dir)
rel_job_log_dir = os.path.relpath(job_log_dir, ci_project_dir)
rel_job_repro_dir = os.path.relpath(job_repro_dir, ci_project_dir)
rel_local_mirror_dir = os.path.relpath(local_mirror_dir, ci_project_dir)
rel_job_test_dir = os.path.relpath(job_test_dir, ci_project_dir)
rel_local_mirror_dir = os.path.join(local_mirror_dir, ci_project_dir)
rel_user_artifacts_dir = os.path.relpath(user_artifacts_dir, ci_project_dir)
# Speed up staging by first fetching binary indices from all mirrors
@@ -934,7 +887,7 @@ def generate_gitlab_ci_yaml(
# For spack pipelines "public" and "protected" are reserved tags
tags = _remove_reserved_tags(tags)
if spack_pipeline_type == "spack_protected_branch":
tags.extend(["aws", "protected"])
tags.extend(["protected"])
elif spack_pipeline_type == "spack_pull_request":
tags.extend(["public"])
@@ -1090,9 +1043,37 @@ def generate_gitlab_ci_yaml(
continue
if broken_spec_urls is not None and release_spec_dag_hash in broken_spec_urls:
known_broken_specs_encountered.append(
"{0} ({1})".format(release_spec, release_spec_dag_hash)
)
known_broken_specs_encountered.append(release_spec_dag_hash)
# Only keep track of these if we are copying rebuilt cache entries
if spack_buildcache_copy:
# TODO: This assumes signed version of the spec
buildcache_copies[release_spec_dag_hash] = [
{
"src": url_util.join(
buildcache_copy_src_prefix,
bindist.build_cache_relative_path(),
bindist.tarball_name(release_spec, ".spec.json.sig"),
),
"dest": url_util.join(
buildcache_copy_dest_prefix,
bindist.build_cache_relative_path(),
bindist.tarball_name(release_spec, ".spec.json.sig"),
),
},
{
"src": url_util.join(
buildcache_copy_src_prefix,
bindist.build_cache_relative_path(),
bindist.tarball_path_name(release_spec, ".spack"),
),
"dest": url_util.join(
buildcache_copy_dest_prefix,
bindist.build_cache_relative_path(),
bindist.tarball_path_name(release_spec, ".spack"),
),
},
]
if artifacts_root:
job_dependencies.append(
@@ -1101,14 +1082,23 @@ def generate_gitlab_ci_yaml(
job_vars["SPACK_SPEC_NEEDS_REBUILD"] = str(rebuild_spec)
if enable_cdash_reporting:
cdash_build_name = _get_cdash_build_name(release_spec, build_group)
all_job_names.append(cdash_build_name)
job_vars["SPACK_CDASH_BUILD_NAME"] = cdash_build_name
if cdash_handler:
cdash_handler.current_spec = release_spec
build_name = cdash_handler.build_name
all_job_names.append(build_name)
job_vars["SPACK_CDASH_BUILD_NAME"] = build_name
build_stamp = cdash_handler.build_stamp
job_vars["SPACK_CDASH_BUILD_STAMP"] = build_stamp
variables.update(job_vars)
artifact_paths = [rel_job_log_dir, rel_job_repro_dir, rel_user_artifacts_dir]
artifact_paths = [
rel_job_log_dir,
rel_job_repro_dir,
rel_job_test_dir,
rel_user_artifacts_dir,
]
if enable_artifacts_buildcache:
bc_root = os.path.join(local_mirror_dir, "build_cache")
@@ -1176,11 +1166,9 @@ def generate_gitlab_ci_yaml(
)
# Use "all_job_names" to populate the build group for this set
if enable_cdash_reporting and cdash_auth_token:
if cdash_handler and cdash_handler.auth_token:
try:
_populate_buildgroup(
all_job_names, build_group, cdash_project, cdash_site, cdash_auth_token, cdash_url
)
cdash_handler.populate_buildgroup(all_job_names)
except (SpackError, HTTPError, URLError) as err:
tty.warn("Problem populating buildgroup: {0}".format(err))
else:
@@ -1264,32 +1252,6 @@ def generate_gitlab_ci_yaml(
output_object["sign-pkgs"] = signing_job
if spack_buildcache_copy:
# Generate a job to copy the contents from wherever the builds are getting
# pushed to the url specified in the "SPACK_BUILDCACHE_COPY" environment
# variable.
src_url = remote_mirror_override or remote_mirror_url
dest_url = spack_buildcache_copy
stage_names.append("stage-copy-buildcache")
copy_job = {
"stage": "stage-copy-buildcache",
"tags": ["spack", "public", "medium", "aws", "x86_64"],
"image": "ghcr.io/spack/python-aws-bash:0.0.1",
"when": "on_success",
"interruptible": True,
"retry": service_job_retries,
"script": [
". ./share/spack/setup-env.sh",
"spack --version",
"aws s3 sync --exclude *index.json* --exclude *pgp* {0} {1}".format(
src_url, dest_url
),
],
}
output_object["copy-mirror"] = copy_job
if rebuild_index_enabled:
# Add a final job to regenerate the index
stage_names.append("stage-rebuild-index")
@@ -1341,8 +1303,12 @@ def generate_gitlab_ci_yaml(
"SPACK_REMOTE_MIRROR_URL": remote_mirror_url,
"SPACK_JOB_LOG_DIR": rel_job_log_dir,
"SPACK_JOB_REPRO_DIR": rel_job_repro_dir,
"SPACK_JOB_TEST_DIR": rel_job_test_dir,
"SPACK_LOCAL_MIRROR_DIR": rel_local_mirror_dir,
"SPACK_PIPELINE_TYPE": str(spack_pipeline_type),
"SPACK_CI_STACK_NAME": os.environ.get("SPACK_CI_STACK_NAME", "None"),
"SPACK_REBUILD_CHECK_UP_TO_DATE": str(prune_dag),
"SPACK_REBUILD_EVERYTHING": str(rebuild_everything),
}
if remote_mirror_override:
@@ -1352,6 +1318,21 @@ def generate_gitlab_ci_yaml(
if spack_stack_name:
output_object["variables"]["SPACK_CI_STACK_NAME"] = spack_stack_name
if spack_buildcache_copy:
# Write out the file describing specs that should be copied
copy_specs_dir = os.path.join(pipeline_artifacts_dir, "specs_to_copy")
if not os.path.exists(copy_specs_dir):
os.makedirs(copy_specs_dir)
copy_specs_file = os.path.join(
copy_specs_dir,
"copy_{}_specs.json".format(spack_stack_name if spack_stack_name else "rebuilt"),
)
with open(copy_specs_file, "w") as fd:
fd.write(json.dumps(buildcache_copies))
sorted_output = {}
for output_key, output_value in sorted(output_object.items()):
sorted_output[output_key] = output_value
@@ -1385,13 +1366,11 @@ def generate_gitlab_ci_yaml(
sorted_output = {"no-specs-to-rebuild": noop_job}
if known_broken_specs_encountered:
error_msg = (
"Pipeline generation failed due to the presence of the "
"following specs that are known to be broken in develop:\n"
)
for broken_spec in known_broken_specs_encountered:
error_msg += "* {0}\n".format(broken_spec)
tty.die(error_msg)
tty.error("This pipeline generated hashes known to be broken on develop:")
display_broken_spec_messages(broken_specs_url, known_broken_specs_encountered)
if not rebuild_everything:
sys.exit(1)
with open(output_file, "w") as outf:
outf.write(syaml.dump_config(sorted_output, default_flow_style=True))
@@ -1609,33 +1588,83 @@ def push_mirror_contents(env, specfile_path, mirror_url, sign_binaries):
raise inst
def copy_stage_logs_to_artifacts(job_spec, job_log_dir):
"""Looks for spack-build-out.txt in the stage directory of the given
job_spec, and attempts to copy the file into the directory given
by job_log_dir.
Arguments:
job_spec (spack.spec.Spec): Spec associated with spack install log
job_log_dir (str): Path into which build log should be copied
def remove_other_mirrors(mirrors_to_keep, scope=None):
"""Remove all mirrors from the given config scope, the exceptions being
any listed in mirrors_to_keep, which is a list of mirror urls.
"""
mirrors_to_remove = []
for name, mirror_url in spack.config.get("mirrors", scope=scope).items():
if mirror_url not in mirrors_to_keep:
mirrors_to_remove.append(name)
for mirror_name in mirrors_to_remove:
spack.mirror.remove(mirror_name, scope)
def copy_files_to_artifacts(src, artifacts_dir):
"""
Copy file(s) to the given artifacts directory
Parameters:
src (str): the glob-friendly path expression for the file(s) to copy
artifacts_dir (str): the destination directory
"""
try:
fs.copy(src, artifacts_dir)
except Exception as err:
msg = ("Unable to copy files ({0}) to artifacts {1} due to " "exception: {2}").format(
src, artifacts_dir, str(err)
)
tty.error(msg)
def copy_stage_logs_to_artifacts(job_spec, job_log_dir):
"""Copy selected build stage file(s) to the given artifacts directory
Looks for spack-build-out.txt in the stage directory of the given
job_spec, and attempts to copy the file into the directory given
by job_log_dir.
Parameters:
job_spec (spack.spec.Spec): spec associated with spack install log
job_log_dir (str): path into which build log should be copied
"""
tty.debug("job spec: {0}".format(job_spec))
if not job_spec:
msg = "Cannot copy stage logs: job spec ({0}) is required"
tty.error(msg.format(job_spec))
return
try:
pkg_cls = spack.repo.path.get_pkg_class(job_spec.name)
job_pkg = pkg_cls(job_spec)
tty.debug("job package: {0.fullname}".format(job_pkg))
stage_dir = job_pkg.stage.path
tty.debug("stage dir: {0}".format(stage_dir))
build_out_src = os.path.join(stage_dir, "spack-build-out.txt")
build_out_dst = os.path.join(job_log_dir, "spack-build-out.txt")
tty.debug(
"Copying build log ({0}) to artifacts ({1})".format(build_out_src, build_out_dst)
)
shutil.copyfile(build_out_src, build_out_dst)
except Exception as inst:
msg = (
"Unable to copy build logs from stage to artifacts " "due to exception: {0}"
).format(inst)
tty.error(msg)
tty.debug("job package: {0}".format(job_pkg))
except AssertionError:
msg = "Cannot copy stage logs: job spec ({0}) must be concrete"
tty.error(msg.format(job_spec))
return
stage_dir = job_pkg.stage.path
tty.debug("stage dir: {0}".format(stage_dir))
build_out_src = os.path.join(stage_dir, "spack-build-out.txt")
copy_files_to_artifacts(build_out_src, job_log_dir)
def copy_test_logs_to_artifacts(test_stage, job_test_dir):
"""
Copy test log file(s) to the given artifacts directory
Parameters:
test_stage (str): test stage path
job_test_dir (str): the destination artifacts test directory
"""
tty.debug("test stage: {0}".format(test_stage))
if not os.path.exists(test_stage):
msg = "Cannot copy test logs: job test stage ({0}) does not exist"
tty.error(msg.format(test_stage))
return
copy_files_to_artifacts(os.path.join(test_stage, "*", "*.txt"), job_test_dir)
def download_and_extract_artifacts(url, work_dir):
@@ -1985,3 +2014,392 @@ def reproduce_ci_job(url, work_dir):
)
print("".join(inst_list))
def process_command(cmd, cmd_args, repro_dir):
"""
Create a script for the command and run it. Copy the script to the
reproducibility directory.
Arguments:
cmd (str): name of the command being processed
cmd_args (list): string arguments to pass to the command
repro_dir (str): Job reproducibility directory
Returns: the exit code from processing the command
"""
tty.debug("spack {0} arguments: {1}".format(cmd, cmd_args))
# Write the command to a shell script
script = "{0}.sh".format(cmd)
with open(script, "w") as fd:
fd.write("#!/bin/bash\n\n")
fd.write("\n# spack {0} command\n".format(cmd))
fd.write(" ".join(['"{0}"'.format(i) for i in cmd_args]))
fd.write("\n")
st = os.stat(script)
os.chmod(script, st.st_mode | stat.S_IEXEC)
copy_path = os.path.join(repro_dir, script)
shutil.copyfile(script, copy_path)
# Run the generated shell script as if it were being run in
# a login shell.
try:
cmd_process = subprocess.Popen(["bash", "./{0}".format(script)])
cmd_process.wait()
exit_code = cmd_process.returncode
except (ValueError, subprocess.CalledProcessError, OSError) as err:
tty.error("Encountered error running {0} script".format(cmd))
tty.error(err)
exit_code = 1
tty.debug("spack {0} exited {1}".format(cmd, exit_code))
return exit_code
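
For concreteness, a call such as process_command("install", ["spack", "-d", "install", "zlib"], repro_dir) writes an install.sh whose contents, per the writes above, are:

    #!/bin/bash


    # spack install command
    "spack" "-d" "install" "zlib"

After copying the script into the reproduction directory, it is executed with bash ./install.sh.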
def create_buildcache(**kwargs):
"""Create the buildcache at the provided mirror(s).
Arguments:
kwargs (dict): dictionary of arguments used to create the buildcache
List of recognized keys:
* "env" (spack.environment.Environment): the active environment
* "buildcache_mirror_url" (str or None): URL for the buildcache mirror
* "pipeline_mirror_url" (str or None): URL for the pipeline mirror
* "pr_pipeline" (bool): True if the CI job is for a PR
* "json_path" (str): path the the spec's JSON file
"""
env = kwargs.get("env")
buildcache_mirror_url = kwargs.get("buildcache_mirror_url")
pipeline_mirror_url = kwargs.get("pipeline_mirror_url")
pr_pipeline = kwargs.get("pr_pipeline")
json_path = kwargs.get("json_path")
sign_binaries = pr_pipeline is False and can_sign_binaries()
# Create buildcache in either the main remote mirror, or in the
# per-PR mirror, if this is a PR pipeline
if buildcache_mirror_url:
push_mirror_contents(env, json_path, buildcache_mirror_url, sign_binaries)
# Create another copy of that buildcache in the per-pipeline
# temporary storage mirror (this is only done if either
# artifacts buildcache is enabled or a temporary storage url
# prefix is set)
if pipeline_mirror_url:
push_mirror_contents(env, json_path, pipeline_mirror_url, sign_binaries)
def write_broken_spec(url, pkg_name, stack_name, job_url, pipeline_url, spec_dict):
"""Given a url to write to and the details of the failed job, write an entry
in the broken specs list.
"""
tmpdir = tempfile.mkdtemp()
file_path = os.path.join(tmpdir, "broken.txt")
broken_spec_details = {
"broken-spec": {
"job-name": pkg_name,
"job-stack": stack_name,
"job-url": job_url,
"pipeline-url": pipeline_url,
"concrete-spec-dict": spec_dict,
}
}
try:
with open(file_path, "w") as fd:
fd.write(syaml.dump(broken_spec_details))
web_util.push_to_url(
file_path,
url,
keep_original=False,
extra_args={"ContentType": "text/plain"},
)
except Exception as err:
# If there is an S3 error (e.g., access denied or connection
# error), the first non boto-specific class in the exception
# hierarchy is Exception. Just print a warning and return
msg = "Error writing to broken specs list {0}: {1}".format(url, err)
tty.warn(msg)
finally:
shutil.rmtree(tmpdir)
def read_broken_spec(broken_spec_url):
"""Read data from broken specs file located at the url, return as a yaml
object.
"""
try:
_, _, fs = web_util.read_from_url(broken_spec_url)
except (URLError, web_util.SpackWebError, HTTPError):
tty.warn("Unable to read broken spec from {0}".format(broken_spec_url))
return None
broken_spec_contents = codecs.getreader("utf-8")(fs).read()
return syaml.load(broken_spec_contents)
def display_broken_spec_messages(base_url, hashes):
"""Fetch the broken spec file for each of the hashes under the base_url and
print a message with some details about each one.
"""
broken_specs = [(h, read_broken_spec(url_util.join(base_url, h))) for h in hashes]
for spec_hash, broken_spec in [tup for tup in broken_specs if tup[1]]:
details = broken_spec["broken-spec"]
if "job-name" in details:
item_name = "{0}/{1}".format(details["job-name"], spec_hash[:7])
else:
item_name = spec_hash
if "job-stack" in details:
item_name = "{0} (in stack {1})".format(item_name, details["job-stack"])
msg = " {0} was reported broken here: {1}".format(item_name, details["job-url"])
tty.msg(msg)
def run_standalone_tests(**kwargs):
"""Run stand-alone tests on the current spec.
Arguments:
kwargs (dict): dictionary of arguments used to run the tests
List of recognized keys:
* "cdash" (CDashHandler): (optional) cdash handler instance
* "fail_fast" (bool): (optional) terminate tests after the first failure
* "log_file" (str): (optional) test log file name if NOT CDash reporting
* "job_spec" (Spec): spec that was built
* "repro_dir" (str): reproduction directory
"""
cdash = kwargs.get("cdash")
fail_fast = kwargs.get("fail_fast")
log_file = kwargs.get("log_file")
if cdash and log_file:
tty.msg("The test log file {0} option is ignored with CDash reporting".format(log_file))
log_file = None
# Error out but do NOT terminate if there are missing required arguments.
job_spec = kwargs.get("job_spec")
if not job_spec:
tty.error("Job spec is required to run stand-alone tests")
return
repro_dir = kwargs.get("repro_dir")
if not repro_dir:
tty.error("Reproduction directory is required for stand-alone tests")
return
test_args = [
"spack",
"-d",
"-v",
"test",
"run",
]
if fail_fast:
test_args.append("--fail-fast")
if cdash:
test_args.extend(cdash.args())
else:
test_args.extend(["--log-format", "junit"])
if log_file:
test_args.extend(["--log-file", log_file])
test_args.append(job_spec.name)
tty.debug("Running {0} stand-alone tests".format(job_spec.name))
exit_code = process_command("test", test_args, repro_dir)
tty.debug("spack test exited {0}".format(exit_code))
class CDashHandler(object):
"""
Class for managing CDash data and processing.
"""
def __init__(self, ci_cdash):
# start with the gitlab ci configuration
self.url = ci_cdash.get("url")
self.build_group = ci_cdash.get("build-group")
self.project = ci_cdash.get("project")
self.site = ci_cdash.get("site")
# grab the authorization token when available
self.auth_token = os.environ.get("SPACK_CDASH_AUTH_TOKEN")
if self.auth_token:
tty.verbose("Using CDash auth token from environment")
# append runner description to the site if available
runner = os.environ.get("CI_RUNNER_DESCRIPTION")
if runner:
self.site += " ({0})".format(runner)
# track current spec, if any
self.current_spec = None
def args(self):
return [
"--cdash-upload-url",
self.upload_url,
"--cdash-build",
self.build_name,
"--cdash-site",
self.site,
"--cdash-buildstamp",
self.build_stamp,
]
@property # type: ignore
def build_name(self):
"""Returns the CDash build name.
A name will be generated if the `current_spec` property is set;
otherwise, the value will be retrieved from the environment
through the `SPACK_CDASH_BUILD_NAME` variable.
Returns: (str) current spec's CDash build name."""
spec = self.current_spec
if spec:
build_name = "{0}@{1}%{2} hash={3} arch={4} ({5})".format(
spec.name,
spec.version,
spec.compiler,
spec.dag_hash(),
spec.architecture,
self.build_group,
)
tty.verbose(
"Generated CDash build name ({0}) from the {1}".format(build_name, spec.name)
)
return build_name
build_name = os.environ.get("SPACK_CDASH_BUILD_NAME")
tty.verbose("Using CDash build name ({0}) from the environment".format(build_name))
return build_name
@property # type: ignore
def build_stamp(self):
"""Returns the CDash build stamp.
The one defined by SPACK_CDASH_BUILD_STAMP environment variable
is preferred due to the representation of timestamps; otherwise,
one will be built.
Returns: (str) current CDash build stamp"""
build_stamp = os.environ.get("SPACK_CDASH_BUILD_STAMP")
if build_stamp:
tty.verbose("Using build stamp ({0}) from the environment".format(build_stamp))
return build_stamp
build_stamp = cdash_build_stamp(self.build_group, time.time())
tty.verbose("Generated new build stamp ({0})".format(build_stamp))
return build_stamp
@property # type: ignore
@memoized
def project_enc(self):
tty.debug("Encoding project ({0}): {1})".format(type(self.project), self.project))
encode = urlencode({"project": self.project})
index = encode.find("=") + 1
return encode[index:]
@property
def upload_url(self):
url_format = "{0}/submit.php?project={1}"
return url_format.format(self.url, self.project_enc)
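
A self-contained sketch of the project_enc/upload_url pair above, assuming a project name that needs URL escaping:

    from six.moves.urllib.parse import urlencode

    project = "Spack Testing"
    encoded = urlencode({"project": project})       # 'project=Spack+Testing'
    project_enc = encoded[encoded.find("=") + 1:]   # keep only the encoded value
    upload_url = "https://cdash.example.org/submit.php?project=" + project_enc
    # -> 'https://cdash.example.org/submit.php?project=Spack+Testing'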
def copy_test_results(self, source, dest):
"""Copy test results to artifacts directory."""
reports = fs.join_path(source, "*_Test*.xml")
copy_files_to_artifacts(reports, dest)
def create_buildgroup(self, opener, headers, url, group_name, group_type):
data = {"newbuildgroup": group_name, "project": self.project, "type": group_type}
enc_data = json.dumps(data).encode("utf-8")
request = Request(url, data=enc_data, headers=headers)
response = opener.open(request)
response_code = response.getcode()
if response_code not in [200, 201]:
msg = "Creating buildgroup failed (response code = {0})".format(response_code)
tty.warn(msg)
return None
response_text = response.read()
response_json = json.loads(response_text)
build_group_id = response_json["id"]
return build_group_id
def populate_buildgroup(self, job_names):
url = "{0}/api/v1/buildgroup.php".format(self.url)
headers = {
"Authorization": "Bearer {0}".format(self.auth_token),
"Content-Type": "application/json",
}
opener = build_opener(HTTPHandler)
parent_group_id = self.create_buildgroup(
opener,
headers,
url,
self.build_group,
"Daily",
)
group_id = self.create_buildgroup(
opener,
headers,
url,
"Latest {0}".format(self.build_group),
"Latest",
)
if not parent_group_id or not group_id:
msg = "Failed to create or retrieve buildgroups for {0}".format(self.build_group)
tty.warn(msg)
return
data = {
"dynamiclist": [
{
"match": name,
"parentgroupid": parent_group_id,
"site": self.site,
}
for name in job_names
],
}
enc_data = json.dumps(data).encode("utf-8")
request = Request(url, data=enc_data, headers=headers)
request.get_method = lambda: "PUT"
response = opener.open(request)
response_code = response.getcode()
if response_code != 200:
msg = "Error response code ({0}) in populate_buildgroup".format(response_code)
tty.warn(msg)
def report_skipped(self, spec, directory_name, reason):
cli_args = self.args()
cli_args.extend(["package", [spec.name]])
it = iter(cli_args)
kv = {x.replace("--", "").replace("-", "_"): next(it) for x in it}
reporter = CDash(Bunch(**kv))
reporter.test_skipped_report(directory_name, spec, reason)

View File

@@ -291,19 +291,24 @@ def disambiguate_spec_from_hashes(spec, hashes, local=False, installed=True, fir
elif first:
return matching_specs[0]
elif len(matching_specs) > 1:
format_string = "{name}{@version}{%compiler}{arch=architecture}"
args = ["%s matches multiple packages." % spec, "Matching packages:"]
args += [
colorize(" @K{%s} " % s.dag_hash(7)) + s.cformat(format_string)
for s in matching_specs
]
args += ["Use a more specific spec."]
tty.die(*args)
ensure_single_spec_or_die(spec, matching_specs)
return matching_specs[0]
def ensure_single_spec_or_die(spec, matching_specs):
if len(matching_specs) <= 1:
return
format_string = "{name}{@version}{%compiler}{arch=architecture}"
args = ["%s matches multiple packages." % spec, "Matching packages:"]
args += [
colorize(" @K{%s} " % s.dag_hash(7)) + s.cformat(format_string) for s in matching_specs
]
args += ["Use a more specific spec (e.g., prepend '/' to the hash)."]
tty.die(*args)
def gray_hash(spec, length):
if not length:
# default to maximum hash length
@@ -640,3 +645,8 @@ def find_environment(args):
return ev.Environment(env)
raise ev.SpackEnvironmentError("no environment in %s" % env)
def first_line(docstring):
"""Return the first line of the docstring."""
return docstring.split("\n")[0]
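
This helper underpins the docstring convention adopted by the ci subcommands later in this changeset: the first line serves as the short argparse help, the full docstring as the description. A tiny illustration:

    def example_command(args):
        """do something useful.

        Longer text that argparse shows as the description
        for "spack example-command --help"."""


    first_line(example_command.__doc__)  # -> "do something useful."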

View File

@@ -2,6 +2,8 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import glob
import json
import os
import shutil
import sys
@@ -15,7 +17,6 @@
import spack.cmd.common.arguments as arguments
import spack.config
import spack.environment as ev
import spack.fetch_strategy as fs
import spack.hash_types as ht
import spack.mirror
import spack.relocate
@@ -272,7 +273,12 @@ def setup_parser(subparser):
# Sync buildcache entries from one mirror to another
sync = subparsers.add_parser("sync", help=sync_fn.__doc__)
source = sync.add_mutually_exclusive_group(required=True)
sync.add_argument(
"--manifest-glob",
default=None,
help="A quoted glob pattern identifying copy manifest files",
)
source = sync.add_mutually_exclusive_group(required=False)
source.add_argument(
"--src-directory", metavar="DIRECTORY", type=str, help="Source mirror as a local file path"
)
@@ -282,7 +288,7 @@ def setup_parser(subparser):
source.add_argument(
"--src-mirror-url", metavar="MIRROR_URL", type=str, help="URL of the source mirror"
)
dest = sync.add_mutually_exclusive_group(required=True)
dest = sync.add_mutually_exclusive_group(required=False)
dest.add_argument(
"--dest-directory",
metavar="DIRECTORY",
@@ -615,6 +621,31 @@ def copy_fn(args):
shutil.copyfile(specfile_src_path_yaml, specfile_dest_path_yaml)
def copy_buildcache_file(src_url, dest_url, local_path=None):
"""Copy from source url to destination url"""
tmpdir = None
if not local_path:
tmpdir = tempfile.mkdtemp()
local_path = os.path.join(tmpdir, os.path.basename(src_url))
try:
temp_stage = Stage(src_url, path=os.path.dirname(local_path))
try:
temp_stage.create()
temp_stage.fetch()
web_util.push_to_url(local_path, dest_url, keep_original=True)
except web_util.FetchError as e:
# Expected, since we have to try all the possible extensions
tty.debug("no such file: {0}".format(src_url))
tty.debug(e)
finally:
temp_stage.destroy()
finally:
if tmpdir and os.path.exists(tmpdir):
shutil.rmtree(tmpdir)
def sync_fn(args):
"""Syncs binaries (and associated metadata) from one mirror to another.
Requires an active environment in order to know which specs to sync.
@@ -623,6 +654,10 @@ def sync_fn(args):
src (str): Source mirror URL
dest (str): Destination mirror URL
"""
if args.manifest_glob:
manifest_copy(glob.glob(args.manifest_glob))
return 0
# Figure out the source mirror
source_location = None
if args.src_directory:
@@ -688,8 +723,9 @@ def sync_fn(args):
buildcache_rel_paths.extend(
[
os.path.join(build_cache_dir, bindist.tarball_path_name(s, ".spack")),
os.path.join(build_cache_dir, bindist.tarball_name(s, ".spec.yaml")),
os.path.join(build_cache_dir, bindist.tarball_name(s, ".spec.json.sig")),
os.path.join(build_cache_dir, bindist.tarball_name(s, ".spec.json")),
os.path.join(build_cache_dir, bindist.tarball_name(s, ".spec.yaml")),
]
)
@@ -702,24 +738,31 @@ def sync_fn(args):
dest_url = url_util.join(dest_mirror_url, rel_path)
tty.debug("Copying {0} to {1} via {2}".format(src_url, dest_url, local_path))
stage = Stage(
src_url, name="temporary_file", path=os.path.dirname(local_path), keep=True
)
try:
stage.create()
stage.fetch()
web_util.push_to_url(local_path, dest_url, keep_original=True)
except fs.FetchError as e:
tty.debug("spack buildcache unable to sync {0}".format(rel_path))
tty.debug(e)
finally:
stage.destroy()
copy_buildcache_file(src_url, dest_url, local_path=local_path)
finally:
shutil.rmtree(tmpdir)
def manifest_copy(manifest_file_list):
"""Read manifest files containing information about specific specs to copy
from source to destination, remove duplicates since any binary package for
a given hash should be the same as any other, and copy all files specified
in the manifest files."""
deduped_manifest = {}
for manifest_path in manifest_file_list:
with open(manifest_path) as fd:
manifest = json.loads(fd.read())
for spec_hash, copy_list in manifest.items():
# Last duplicate hash wins
deduped_manifest[spec_hash] = copy_list
for spec_hash, copy_list in deduped_manifest.items():
for copy_file in copy_list:
tty.debug("copying {0} to {1}".format(copy_file["src"], copy_file["dest"]))
copy_buildcache_file(copy_file["src"], copy_file["dest"])
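
For reference, a copy manifest as written by ci generate (the copy_<stack>_specs.json files earlier in this changeset) and consumed here looks roughly like this; the hash and URLs are placeholders:

    {
      "abcdef1234567890abcdef1234567890abcdef12": [
        {"src": "s3://src-mirror/build_cache/<name>.spec.json.sig",
         "dest": "s3://dest-mirror/build_cache/<name>.spec.json.sig"},
        {"src": "s3://src-mirror/build_cache/<name>.spack",
         "dest": "s3://dest-mirror/build_cache/<name>.spack"}
      ]
    }

It would then be consumed with something like: spack buildcache sync --manifest-glob 'artifacts/specs_to_copy/copy_*_specs.json'.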
def update_index(mirror_url, update_keys=False):
mirror = spack.mirror.MirrorCollection().lookup(mirror_url)
outdir = url_util.format(mirror.push_url)

View File

@@ -0,0 +1,51 @@
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import spack.cmd
import spack.cmd.common.arguments as arguments
description = "change an existing spec in an environment"
section = "environments"
level = "long"
def setup_parser(subparser):
subparser.add_argument(
"-l",
"--list-name",
dest="list_name",
default="specs",
help="name of the list to remove specs from",
)
subparser.add_argument(
"--match-spec",
dest="match_spec",
help="if name is ambiguous, supply a spec to match",
)
subparser.add_argument(
"-a",
"--all",
action="store_true",
help="change all matching specs (allow changing more than one spec)",
)
arguments.add_common_arguments(subparser, ["specs"])
def change(parser, args):
env = spack.cmd.require_active_env(cmd_name="change")
with env.write_transaction():
if args.match_spec:
match_spec = spack.spec.Spec(args.match_spec)
else:
match_spec = None
for spec in spack.cmd.parse_specs(args.specs):
env.change_existing_spec(
spec,
list_name=args.list_name,
match_spec=match_spec,
allow_changing_multiple_specs=args.all,
)
env.write()
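
Illustrative invocations for the new command (specs hypothetical):

    # Change the existing openmpi entry in the active environment's specs list:
    spack change openmpi@4.1.4

    # Disambiguate which entry to change when the name alone matches several:
    spack change --match-spec "hdf5+mpi" hdf5~mpi

    # Allow changing more than one matching spec at once:
    spack change --all openmpi fabrics=ucx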

View File

@@ -6,13 +6,9 @@
import json
import os
import shutil
import stat
import subprocess
import sys
import tempfile
from six.moves.urllib.parse import urlencode
import llnl.util.filesystem as fs
import llnl.util.tty as tty
import spack.binary_distribution as bindist
@@ -22,7 +18,6 @@
import spack.environment as ev
import spack.hash_types as ht
import spack.mirror
import spack.util.spack_yaml as syaml
import spack.util.url as url_util
import spack.util.web as web_util
@@ -34,6 +29,10 @@
INSTALL_FAIL_CODE = 1
def deindent(desc):
    return desc.replace("    ", "")
def get_env_var(variable_name):
if variable_name in os.environ:
return os.environ.get(variable_name)
@@ -45,27 +44,35 @@ def setup_parser(subparser):
subparsers = subparser.add_subparsers(help="CI sub-commands")
# Dynamic generation of the jobs yaml from a spack environment
generate = subparsers.add_parser("generate", help=ci_generate.__doc__)
generate = subparsers.add_parser(
"generate",
description=deindent(ci_generate.__doc__),
help=spack.cmd.first_line(ci_generate.__doc__),
)
generate.add_argument(
"--output-file",
default=None,
help="Path to file where generated jobs file should be "
+ "written. The default is .gitlab-ci.yml in the root of the "
+ "repository.",
help="""pathname for the generated gitlab ci yaml file
Path to the file where the generated jobs file should
be written. Default is .gitlab-ci.yml in the root of
the repository.""",
)
generate.add_argument(
"--copy-to",
default=None,
help="Absolute path of additional location where generated jobs "
+ "yaml file should be copied. Default is not to copy.",
help="""path to additional directory for job files
This option provides an absolute path to a directory
where the generated jobs yaml file should be copied.
Default is not to copy.""",
)
generate.add_argument(
"--optimize",
action="store_true",
default=False,
help="(Experimental) run the generated document through a series of "
"optimization passes designed to reduce the size of the "
"generated file.",
help="""(Experimental) optimize the gitlab yaml file for size
Run the generated document through a series of
optimization passes designed to reduce the size
of the generated file.""",
)
generate.add_argument(
"--dependencies",
@@ -86,53 +93,84 @@ def setup_parser(subparser):
action="store_true",
dest="prune_dag",
default=True,
help="""Do not generate jobs for specs already up to
date on the mirror""",
help="""skip up-to-date specs
Do not generate jobs for specs that are up-to-date
on the mirror.""",
)
prune_group.add_argument(
"--no-prune-dag",
action="store_false",
dest="prune_dag",
default=True,
help="""Generate jobs for specs already up to date
on the mirror""",
help="""process up-to-date specs
Generate jobs for specs even when they are up-to-date
on the mirror.""",
)
generate.add_argument(
"--check-index-only",
action="store_true",
dest="index_only",
default=False,
help="""Spack always check specs against configured
binary mirrors when generating the pipeline, regardless of whether or not
DAG pruning is enabled. This flag controls whether it might attempt to
fetch remote spec files directly (ensuring no spec is rebuilt if it
is present on the mirror), or whether it should reduce pipeline generation time
by assuming all remote buildcache indices are up to date and only use those
to determine whether a given spec is up to date on mirrors. In the latter
case, specs might be needlessly rebuilt if remote buildcache indices are out
of date.""",
help="""only check spec state from buildcache indices
Spack always checks specs against configured binary
mirrors, regardless of the DAG pruning option.
If enabled, Spack will assume all remote buildcache
indices are up-to-date when assessing whether the spec
on the mirror, if present, is up-to-date. This has the
benefit of reducing pipeline generation time but at the
potential cost of needlessly rebuilding specs when the
indices are outdated.
If not enabled, Spack will fetch remote spec files
directly to assess whether the spec on the mirror is
up-to-date.""",
)
generate.add_argument(
"--artifacts-root",
default=None,
help="""Path to root of artifacts directory. If provided, concrete
environment files (spack.yaml, spack.lock) will be generated under this
path and their location sent to generated child jobs via the custom job
variable SPACK_CONCRETE_ENVIRONMENT_PATH.""",
help="""path to the root of the artifacts directory
If provided, concrete environment files (spack.yaml,
spack.lock) will be generated under this directory.
Their location will be passed to generated child jobs
through the SPACK_CONCRETE_ENVIRONMENT_PATH variable.""",
)
generate.set_defaults(func=ci_generate)
# Rebuild the buildcache index associated with the mirror in the
# active, gitlab-enabled environment.
index = subparsers.add_parser("rebuild-index", help=ci_reindex.__doc__)
index = subparsers.add_parser(
"rebuild-index",
description=deindent(ci_reindex.__doc__),
help=spack.cmd.first_line(ci_reindex.__doc__),
)
index.set_defaults(func=ci_reindex)
# Handle steps of a ci build/rebuild
rebuild = subparsers.add_parser("rebuild", help=ci_rebuild.__doc__)
rebuild = subparsers.add_parser(
"rebuild",
description=deindent(ci_rebuild.__doc__),
help=spack.cmd.first_line(ci_rebuild.__doc__),
)
rebuild.add_argument(
"-t",
"--tests",
action="store_true",
default=False,
help="""run stand-alone tests after the build""",
)
rebuild.add_argument(
"--fail-fast",
action="store_true",
default=False,
help="""stop stand-alone tests after the first failure""",
)
rebuild.set_defaults(func=ci_rebuild)
# Facilitate reproduction of a failed CI build job
reproduce = subparsers.add_parser("reproduce-build", help=ci_reproduce.__doc__)
reproduce = subparsers.add_parser(
"reproduce-build",
description=deindent(ci_reproduce.__doc__),
help=spack.cmd.first_line(ci_reproduce.__doc__),
)
reproduce.add_argument("job_url", help="Url of job artifacts bundle")
reproduce.add_argument(
"--working-dir",
@@ -144,12 +182,12 @@ def setup_parser(subparser):
def ci_generate(args):
"""Generate jobs file from a spack environment file containing CI info.
Before invoking this command, you can set the environment variable
SPACK_CDASH_AUTH_TOKEN to contain the CDash authorization token
for creating a build group for the generated workload and registering
all generated jobs under that build group. If this environment
variable is not set, no build group will be created on CDash."""
"""Generate jobs file from a CI-aware spack file.
If you want to report the results on CDash, you will need to set
the SPACK_CDASH_AUTH_TOKEN before invoking this command. The
value must be the CDash authorization token needed to create a
build group and register all generated jobs under it."""
env = spack.cmd.require_active_env(cmd_name="ci generate")
output_file = args.output_file
@@ -190,8 +228,10 @@ def ci_generate(args):
def ci_reindex(args):
"""Rebuild the buildcache index associated with the mirror in the
active, gitlab-enabled environment."""
"""Rebuild the buildcache index for the remote mirror.
Use the active, gitlab-enabled environment to rebuild the buildcache
index for the associated mirror."""
env = spack.cmd.require_active_env(cmd_name="ci rebuild-index")
yaml_root = ev.config_dict(env.yaml)
@@ -206,17 +246,16 @@ def ci_reindex(args):
def ci_rebuild(args):
"""Check a single spec against the remote mirror, and rebuild it from
"""Rebuild a spec if it is not on the remote mirror.
Check a single spec against the remote mirror, and rebuild it from
source if the mirror does not contain the hash."""
env = spack.cmd.require_active_env(cmd_name="ci rebuild")
# Make sure the environment is "gitlab-enabled", or else there's nothing
# to do.
yaml_root = ev.config_dict(env.yaml)
gitlab_ci = None
if "gitlab-ci" in yaml_root:
gitlab_ci = yaml_root["gitlab-ci"]
gitlab_ci = yaml_root["gitlab-ci"] if "gitlab-ci" in yaml_root else None
if not gitlab_ci:
tty.die("spack ci rebuild requires an env containing gitlab-ci cfg")
@@ -231,6 +270,7 @@ def ci_rebuild(args):
# out as variables, or else provided by GitLab itself.
pipeline_artifacts_dir = get_env_var("SPACK_ARTIFACTS_ROOT")
job_log_dir = get_env_var("SPACK_JOB_LOG_DIR")
job_test_dir = get_env_var("SPACK_JOB_TEST_DIR")
repro_dir = get_env_var("SPACK_JOB_REPRO_DIR")
local_mirror_dir = get_env_var("SPACK_LOCAL_MIRROR_DIR")
concrete_env_dir = get_env_var("SPACK_CONCRETE_ENV_DIR")
@@ -240,15 +280,17 @@ def ci_rebuild(args):
root_spec = get_env_var("SPACK_ROOT_SPEC")
job_spec_pkg_name = get_env_var("SPACK_JOB_SPEC_PKG_NAME")
compiler_action = get_env_var("SPACK_COMPILER_ACTION")
cdash_build_name = get_env_var("SPACK_CDASH_BUILD_NAME")
spack_pipeline_type = get_env_var("SPACK_PIPELINE_TYPE")
remote_mirror_override = get_env_var("SPACK_REMOTE_MIRROR_OVERRIDE")
remote_mirror_url = get_env_var("SPACK_REMOTE_MIRROR_URL")
spack_ci_stack_name = get_env_var("SPACK_CI_STACK_NAME")
rebuild_everything = get_env_var("SPACK_REBUILD_EVERYTHING")
# Construct absolute paths relative to current $CI_PROJECT_DIR
ci_project_dir = get_env_var("CI_PROJECT_DIR")
pipeline_artifacts_dir = os.path.join(ci_project_dir, pipeline_artifacts_dir)
job_log_dir = os.path.join(ci_project_dir, job_log_dir)
job_test_dir = os.path.join(ci_project_dir, job_test_dir)
repro_dir = os.path.join(ci_project_dir, repro_dir)
local_mirror_dir = os.path.join(ci_project_dir, local_mirror_dir)
concrete_env_dir = os.path.join(ci_project_dir, concrete_env_dir)
@@ -263,23 +305,15 @@ def ci_rebuild(args):
# Query the environment manifest to find out whether we're reporting to a
# CDash instance, and if so, gather some information from the manifest to
# support that task.
cdash_handler = spack_ci.CDashHandler(yaml_root.get("cdash")) if "cdash" in yaml_root else None
if cdash_handler:
tty.debug("cdash url = {0}".format(cdash_handler.url))
tty.debug("cdash project = {0}".format(cdash_handler.project))
tty.debug("cdash project_enc = {0}".format(cdash_handler.project_enc))
tty.debug("cdash build_name = {0}".format(cdash_handler.build_name))
tty.debug("cdash build_stamp = {0}".format(cdash_handler.build_stamp))
tty.debug("cdash site = {0}".format(cdash_handler.site))
tty.debug("cdash build_group = {0}".format(cdash_handler.build_group))
# Is this a pipeline run on a spack PR or a merge to develop? It might
# be neither, e.g. a pipeline run on some environment repository.
@@ -292,6 +326,8 @@ def ci_rebuild(args):
)
)
full_rebuild = True if rebuild_everything and rebuild_everything.lower() == "true" else False
# If no override url exists, then just push binary package to the
# normal remote mirror url.
buildcache_mirror_url = remote_mirror_override or remote_mirror_url
@@ -344,6 +380,9 @@ def ci_rebuild(args):
if os.path.exists(job_log_dir):
shutil.rmtree(job_log_dir)
if os.path.exists(job_test_dir):
shutil.rmtree(job_test_dir)
if os.path.exists(repro_dir):
shutil.rmtree(repro_dir)
@@ -351,6 +390,7 @@ def ci_rebuild(args):
# need for storing artifacts. The cdash_report directory will be
# created internally if needed.
os.makedirs(job_log_dir)
os.makedirs(job_test_dir)
os.makedirs(repro_dir)
# Copy the concrete environment files to the repro directory so we can
@@ -411,6 +451,8 @@ def ci_rebuild(args):
fd.write(spack_info.encode("utf8"))
fd.write(b"\n")
pipeline_mirrors = []
# If we decided there should be a temporary storage mechanism, add that
# mirror now so it's used when we check for a hash match already
# built for this spec.
@@ -418,22 +460,29 @@ def ci_rebuild(args):
spack.mirror.add(
spack_ci.TEMP_STORAGE_MIRROR_NAME, pipeline_mirror_url, cfg.default_modify_scope()
)
pipeline_mirrors.append(pipeline_mirror_url)
# Check configured mirrors for a built spec with a matching hash
mirrors_to_check = None
if remote_mirror_override:
if spack_pipeline_type == "spack_protected_branch":
# Passing "mirrors_to_check" below means we *only* look in the override
# mirror to see if we should skip building, which is what we want.
mirrors_to_check = {"override": remote_mirror_override}
# Adding this mirror to the list of configured mirrors means dependencies
# could be installed from either the override mirror or any other configured
# mirror (e.g. remote_mirror_url which is defined in the environment or
# pipeline_mirror_url), which is also what we want.
spack.mirror.add("mirror_override", remote_mirror_override, cfg.default_modify_scope())
pipeline_mirrors.append(remote_mirror_override)
matches = (
None
if full_rebuild
else bindist.get_mirrors_for_spec(
job_spec, mirrors_to_check=mirrors_to_check, index_only=False
)
)
if matches:
@@ -456,6 +505,13 @@ def ci_rebuild(args):
# Now we are done and successful
sys.exit(0)
# Before beginning the install, if this is a "rebuild everything" pipeline, we
# only want to keep the mirror being used by the current pipeline as its binary
# package destination. This ensures that when we rebuild everything, we only
# consume binary dependencies built in this pipeline.
if full_rebuild:
spack_ci.remove_other_mirrors(pipeline_mirrors, cfg.default_modify_scope())
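remove_other_mirrors is not shown in this hunk; a plausible sketch of what it does under the contract described in the comment above, using Spack's config and mirror APIs (the body is an assumption):

def remove_other_mirrors(mirrors_to_keep, scope=None):
    # Collect names first, then remove, so the mirrors config section
    # is not mutated while we iterate over it.
    to_remove = [
        name
        for name, url in spack.config.get("mirrors", scope=scope).items()
        if url not in mirrors_to_keep
    ]
    for name in to_remove:
        spack.mirror.remove(name, scope)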
# No hash match anywhere means we need to rebuild spec
# Start with spack arguments
@@ -468,7 +524,10 @@ def ci_rebuild(args):
install_args.extend(
[
"install",
"--show-log-on-error", # Print full log on fails
"--keep-stage",
"--use-buildcache",
"dependencies:only,package:never",
]
)
@@ -477,22 +536,9 @@ def ci_rebuild(args):
if not verify_binaries:
install_args.append("--no-check-signature")
if cdash_handler:
# Add additional arguments to `spack install` for CDash reporting.
install_args.extend(cdash_handler.args())
# A compiler action of 'FIND_ANY' means we are building a bootstrap
# compiler or one of its deps.
@@ -500,35 +546,11 @@ def ci_rebuild(args):
if compiler_action != "FIND_ANY":
install_args.append("--no-add")
# Identify spec to install by hash
install_args.append("/{0}".format(job_spec.dag_hash()))
tty.debug("Installing {0} from source".format(job_spec.name))
tty.debug("spack install arguments: {0}".format(install_args))
install_exit_code = spack_ci.process_command("install", install_args, repro_dir)
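process_command plausibly encapsulates the write-script/copy-to-artifacts/run pattern this code used to spell out inline; a rough sketch under that assumption:

def process_command(name, commands, repro_dir):
    # Write the command line to an executable shell script
    script = "{0}.sh".format(name)
    with open(script, "w") as fd:
        fd.write("#!/bin/bash\n\n")
        fd.write(" ".join('"{0}"'.format(arg) for arg in commands))
        fd.write("\n")
    st = os.stat(script)
    os.chmod(script, st.st_mode | stat.S_IEXEC)

    # Keep a copy with the reproduction artifacts
    shutil.copyfile(script, os.path.join(repro_dir, script))

    # Run the script and hand back its exit code
    try:
        proc = subprocess.Popen(["bash", "./" + script])
        proc.wait()
        return proc.returncode
    except (ValueError, subprocess.CalledProcessError, OSError) as err:
        tty.error("Encountered error running {0} script".format(name))
        tty.error(err)
        return 1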
# Now do the post-install tasks
tty.debug("spack install exited {0}".format(install_exit_code))
@@ -543,61 +565,92 @@ def ci_rebuild(args):
dev_fail_hash = job_spec.dag_hash()
broken_spec_path = url_util.join(broken_specs_url, dev_fail_hash)
tty.msg("Reporting broken develop build as: {0}".format(broken_spec_path))
spack_ci.write_broken_spec(
broken_spec_path,
job_spec_pkg_name,
spack_ci_stack_name,
get_env_var("CI_JOB_URL"),
get_env_var("CI_PIPELINE_URL"),
job_spec.to_dict(hash=ht.dag_hash),
)
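write_broken_spec likely serializes these fields to YAML and pushes the result to the broken-specs URL; a sketch under that assumption (parameter names are guesses):

def write_broken_spec(url, pkg_name, stack_name, job_url, pipeline_url, spec_dict):
    tmpdir = tempfile.mkdtemp()
    file_path = os.path.join(tmpdir, "broken.txt")
    details = {
        "broken-spec": {
            "job-name": pkg_name,
            "job-stack": stack_name,
            "job-url": job_url,
            "pipeline-url": pipeline_url,
            "concrete-spec-dict": spec_dict,
        }
    }
    try:
        with open(file_path, "w") as fd:
            fd.write(syaml.dump(details))
        web_util.push_to_url(
            file_path, url, keep_original=False, extra_args={"ContentType": "text/plain"}
        )
    except Exception as err:
        # S3 access or connection problems surface as a plain Exception
        # here; warn rather than fail the job.
        tty.warn("Error writing to broken specs list {0}: {1}".format(url, err))
    finally:
        shutil.rmtree(tmpdir)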
# We generated the "spack install ..." command to "--keep-stage", copy
# any logs from the staging directory to artifacts now
spack_ci.copy_stage_logs_to_artifacts(job_spec, job_log_dir)
# If the installation succeeded and we're running stand-alone tests for
# the package, run them and copy the output. Failures of any kind should
# *not* terminate the build process or preclude creating the build cache.
broken_tests = (
"broken-tests-packages" in gitlab_ci
and job_spec.name in gitlab_ci["broken-tests-packages"]
)
reports_dir = fs.join_path(os.getcwd(), "cdash_report")
if args.tests and broken_tests:
tty.warn(
"Unable to run stand-alone tests since listed in "
"gitlab-ci's 'broken-tests-packages'"
)
if cdash_handler:
msg = "Package is listed in gitlab-ci's broken-tests-packages"
cdash_handler.report_skipped(job_spec, reports_dir, reason=msg)
cdash_handler.copy_test_results(reports_dir, job_test_dir)
elif args.tests:
if install_exit_code == 0:
try:
# First ensure we will use a reasonable test stage directory
stage_root = os.path.dirname(str(job_spec.package.stage.path))
test_stage = fs.join_path(stage_root, "spack-standalone-tests")
tty.debug("Configuring test_stage to {0}".format(test_stage))
config_test_path = "config:test_stage:{0}".format(test_stage)
cfg.add(config_test_path, scope=cfg.default_modify_scope())
# Run the tests, resorting to junit results if not using cdash
log_file = (
None if cdash_handler else fs.join_path(test_stage, "ci-test-results.xml")
)
spack_ci.run_standalone_tests(
cdash=cdash_handler,
job_spec=job_spec,
fail_fast=args.fail_fast,
log_file=log_file,
repro_dir=repro_dir,
)
except Exception as err:
# If there is any error, just print a warning.
msg = "Error processing stand-alone tests: {0}".format(str(err))
tty.warn(msg)
finally:
# Copy the test log/results files
spack_ci.copy_test_logs_to_artifacts(test_stage, job_test_dir)
if cdash_handler:
cdash_handler.copy_test_results(reports_dir, job_test_dir)
elif log_file:
spack_ci.copy_files_to_artifacts(log_file, job_test_dir)
else:
tty.warn("No recognized test results reporting option")
else:
tty.warn("Unable to run stand-alone tests due to unsuccessful " "installation")
if cdash_handler:
msg = "Failed to install the package"
cdash_handler.report_skipped(job_spec, reports_dir, reason=msg)
cdash_handler.copy_test_results(reports_dir, job_test_dir)
# If the install succeeded, create a buildcache entry for this job spec
# and push it to one or more mirrors. If the install did not succeed,
# print out some instructions on how to reproduce this build failure
# outside of the pipeline environment.
if install_exit_code == 0:
can_sign = spack_ci.can_sign_binaries()
sign_binaries = can_sign and spack_is_pr_pipeline is False
if buildcache_mirror_url or pipeline_mirror_url:
spack_ci.create_buildcache(
env=env,
buildcache_mirror_url=buildcache_mirror_url,
pipeline_mirror_url=pipeline_mirror_url,
pr_pipeline=spack_is_pr_pipeline,
json_path=job_spec_json_path,
)
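create_buildcache presumably wraps one spack_ci.push_mirror_contents call per destination; a sketch of that reading (the signing logic is assumed from the surrounding code):

def create_buildcache(env, buildcache_mirror_url, pipeline_mirror_url, pr_pipeline, json_path):
    sign_binaries = pr_pipeline is False and can_sign_binaries()

    # Create buildcache in either the main remote mirror, or in the
    # per-PR mirror, if this is a PR pipeline.
    if buildcache_mirror_url:
        push_mirror_contents(env, json_path, buildcache_mirror_url, sign_binaries)

    # Create another copy in the per-pipeline temporary storage mirror.
    if pipeline_mirror_url:
        push_mirror_contents(env, json_path, pipeline_mirror_url, sign_binaries)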
# If this is a develop pipeline, check if the spec that we just built is
@@ -611,13 +664,11 @@ def ci_rebuild(args):
try:
web_util.remove_url(broken_spec_path)
except Exception as err:
# If there is an S3 error (e.g., access denied or connection
# error), the first non boto-specific class in the exception
# hierarchy is Exception. Just print a warning and return.
msg = "Error removing {0} from broken specs list: {1}"
tty.warn(msg.format(broken_spec_path, err))
else:
tty.debug("spack install exited non-zero, will not create buildcache")
@@ -654,6 +705,10 @@ def ci_rebuild(args):
def ci_reproduce(args):
"""Generate instructions for reproducing the spec rebuild job.
Artifacts of the provided gitlab pipeline rebuild job's URL will be
used to derive instructions for reproducing the build locally."""
job_url = args.job_url
work_dir = args.working_dir


@@ -120,7 +120,7 @@ def _get_scope_and_section(args):
path = getattr(args, "path", None)
# w/no args and an active environment, point to env manifest
if not section and not scope:
env = ev.active_environment()
if env:
scope = env.env_file_config_scope_name()
@@ -258,7 +258,7 @@ def config_update(args):
cannot_overwrite, skip_system_scope = [], False
for scope in updates:
cfg_file = spack.config.config.get_config_filename(scope.name, args.section)
scope_dir = os.path.dirname(scope.path)
can_be_updated = _can_update_config_file(scope_dir, cfg_file)
if not can_be_updated:
if scope.name == "system":


@@ -9,6 +9,7 @@
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.util.path
from spack.error import SpackError
description = "add a spec to an environment's dev-build information"
@@ -52,7 +53,7 @@ def develop(parser, args):
# download all dev specs
for name, entry in env.dev_specs.items():
path = entry.get("path", name)
abspath = spack.util.path.canonicalize_path(path, default_wd=env.path)
if os.path.exists(abspath):
msg = "Skipping developer download of %s" % entry["spec"]
@@ -79,11 +80,7 @@ def develop(parser, args):
# default path is relative path to spec.name
path = args.path or spec.name
# get absolute path to check
abspath = spack.util.path.canonicalize_path(path, default_wd=env.path)
# clone default: only if the path doesn't exist
clone = args.clone
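canonicalize_path expands environment variables and ~ and, unlike the plain os.path.join it replaces here, resolves relative paths against default_wd instead of the process working directory. Roughly, as a behavioral sketch of the assumed semantics:

# canonicalize_path("subdir/pkg", default_wd="/envs/dev")  -> "/envs/dev/subdir/pkg"
# canonicalize_path("~/src/pkg", default_wd="/envs/dev")   -> "/home/<user>/src/pkg"
# canonicalize_path("/abs/path", default_wd="/envs/dev")   -> "/abs/path"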


@@ -198,10 +198,12 @@ def diff(parser, args):
if len(args.specs) != 2:
tty.die("You must provide two specs to diff.")
specs = []
for spec in spack.cmd.parse_specs(args.specs):
if spec.concrete:
specs.append(spec)
else:
specs.append(spack.cmd.disambiguate_spec(spec, env, first=args.load_first))
# Calculate the comparison (c)
color = False if args.dump_json else get_color_when()


@@ -24,6 +24,7 @@
import spack.environment as ev
import spack.environment.shell
import spack.schema.env
import spack.tengine
import spack.util.string as string
from spack.util.environment import EnvironmentModifications
@@ -641,6 +642,9 @@ def get_target(name):
def get_install_target(name):
return os.path.join(target_prefix, ".install", name)
def get_install_deps_target(name):
return os.path.join(target_prefix, ".install-deps", name)
for _, spec in env.concretized_specs():
for s in spec.traverse(root=True):
hash_to_spec[s.dag_hash()] = s
@@ -655,76 +659,38 @@ def get_install_target(name):
# All package install targets, not just roots.
all_install_targets = [get_install_target(h) for h in hash_to_spec.keys()]
all_install_deps_targets = [get_install_deps_target(h) for h, _ in hash_to_prereqs.items()]
buf = six.StringIO()
template = spack.tengine.make_environment().get_template(os.path.join("depfile", "Makefile"))
# Targets are of the form <prefix>/<name>: [<prefix>/<depname>]...,
# The prefix can be an empty string, in that case we don't add the `/`.
# The name is currently the dag hash of the spec. In principle it
# could be the package name in case of `concretization: together` so
# it can be more easily referred to, but for now we don't special case
# this.
fmt = "{name}{@version}{%compiler}{variants}{arch=architecture}"
hash_with_name = [(h, hash_to_spec[h].format(fmt)) for h in hash_to_prereqs.keys()]
targets_to_prereqs = [
(get_install_deps_target(h), " ".join(prereqs)) for h, prereqs in hash_to_prereqs.items()
]
rendered = template.render(
{
"all_target": get_target("all"),
"env_target": get_target("env"),
"clean_target": get_target("clean"),
"all_install_targets": " ".join(all_install_targets),
"all_install_deps_targets": " ".join(all_install_deps_targets),
"root_install_targets": " ".join(root_install_targets),
"dirs_target": get_target("dirs"),
"environment": env.path,
"install_target": get_target(".install"),
"install_deps_target": get_target(".install-deps"),
"any_hash_target": get_target("%"),
"hash_with_name": hash_with_name,
"jobserver_support": "+" if args.jobserver else "",
"targets_to_prereqs": targets_to_prereqs,
}
)
buf.write(rendered)
makefile = buf.getvalue()
# Finally write to stdout/file.


@@ -219,7 +219,7 @@ def _collect_and_consume_cray_manifest_files(
tty.debug("Reading manifest file: " + path)
try:
cray_manifest.read(path, not dry_run)
except spack.error.SpackError as e:
if fail_on_error:
raise
else:


@@ -203,7 +203,16 @@ def decorator(spec, fmt):
return decorator, added, roots, removed
def display_env(env, args, decorator, results):
"""Display extra find output when running in an environment.
Find in an environment outputs 2 or 3 sections:
1. Root specs
2. Concretized roots (if asked for with -c)
3. Installed specs
"""
tty.msg("In environment %s" % env.name)
if not env.user_specs:
@@ -234,6 +243,12 @@ def display_env(env, args, decorator):
cmd.display_specs(env.specs_by_hash.values(), args, decorator=decorator)
print()
# Display a header for the installed packages section IF there are installed
# packages. If there aren't any, we'll just end up printing "0 installed packages"
# later.
if results:
tty.msg("Installed packages")
def find(parser, args):
if args.bootstrap:
@@ -286,10 +301,11 @@ def _find(parser, args):
else:
if not args.format:
if env:
display_env(env, args, decorator, results)
cmd.display_specs(results, args, decorator=decorator, all_headers=True)
# print number of installed packages last (as the list may be long)
if sys.stdout.isatty() and args.groups:
pkg_type = "loaded" if args.loaded else "installed"
spack.cmd.print_how_many_pkgs(results, pkg_type)


@@ -5,6 +5,7 @@
import argparse
import os
import re
import shutil
import sys
import textwrap
@@ -31,10 +32,50 @@
level = "short"
# Pass in the value string passed to use-buildcache and get back
# the package and dependencies values.
def parse_use_buildcache(opt):
bc_keys = ["package:", "dependencies:", ""]
bc_values = ["only", "never", "auto"]
kv_list = re.findall("([a-z]+:)?([a-z]+)", opt)
# Verify keys and values
bc_map = {k: v for k, v in kv_list if k in bc_keys and v in bc_values}
if not len(kv_list) == len(bc_map):
tty.error("Unrecognized arguments passed to use-buildcache")
tty.error(
"Expected: --use-buildcache "
"[[auto|only|never],[package:[auto|only|never]],[dependencies:[auto|only|never]]]"
)
exit(1)
for _group in ["package:", "dependencies:"]:
if _group not in bc_map:
if "" in bc_map:
bc_map[_group] = bc_map[""]
else:
bc_map[_group] = "auto"
return bc_map["package:"], bc_map["dependencies:"]
# Determine value of cache flag
def cache_opt(default_opt, use_buildcache):
if use_buildcache == "auto":
return default_opt
elif use_buildcache == "only":
return True
elif use_buildcache == "never":
return False
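A few values traced by hand through parse_use_buildcache and cache_opt above (worth double-checking against the code):

# parse_use_buildcache("only")                            -> ("only", "only")
# parse_use_buildcache("never,dependencies:auto")         -> ("never", "auto")
# parse_use_buildcache("package:never,dependencies:only") -> ("never", "only")
# cache_opt(args.use_cache, "auto")  -> args.use_cache (defer to the legacy flag)
# cache_opt(args.use_cache, "only")  -> True
# cache_opt(args.use_cache, "never") -> False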
def install_kwargs_from_args(args):
"""Translate command line arguments into a dictionary that will be passed
to the package installer.
"""
pkg_use_bc, dep_use_bc = parse_use_buildcache(args.use_buildcache)
return {
"fail_fast": args.fail_fast,
"keep_prefix": args.keep_prefix,
@@ -44,8 +85,10 @@ def install_kwargs_from_args(args):
"verbose": args.verbose or args.install_verbose,
"fake": args.fake,
"dirty": args.dirty,
"use_cache": args.use_cache,
"cache_only": args.cache_only,
"package_use_cache": cache_opt(args.use_cache, pkg_use_bc),
"package_cache_only": cache_opt(args.cache_only, pkg_use_bc),
"dependencies_use_cache": cache_opt(args.use_cache, dep_use_bc),
"dependencies_cache_only": cache_opt(args.cache_only, dep_use_bc),
"include_build_deps": args.include_build_deps,
"explicit": True, # Use true as a default for install command
"stop_at": args.until,
@@ -123,6 +166,18 @@ def setup_parser(subparser):
default=False,
help="only install package from binary mirrors",
)
cache_group.add_argument(
"--use-buildcache",
dest="use_buildcache",
default="package:auto,dependencies:auto",
metavar="[{auto,only,never},][package:{auto,only,never},][dependencies:{auto,only,never}]",
help="""select the mode of buildcache for the 'package' and 'dependencies'.
Default: package:auto,dependencies:auto
- `auto` behaves like --use-cache
- `only` behaves like --cache-only
- `never` behaves like --no-cache
""",
)
subparser.add_argument(
"--include-build-deps",
@@ -236,6 +291,7 @@ def install_specs(specs, install_kwargs, cli_args):
except spack.build_environment.InstallError as e:
if cli_args.show_log_on_error:
e.print_context()
assert e.pkg, "Expected InstallError to include the associated package"
if not os.path.exists(e.pkg.build_log_path):
tty.error("'spack install' created no log.")
else:


@@ -54,7 +54,6 @@
r"^share/spack/.*\.fish$",
r"^share/spack/qa/run-[^/]*$",
r"^share/spack/bash/spack-completion.in$",
r"^share/spack/templates/misc/coconcretization.pyt$",
# action workflows
r"^.github/actions/.*\.py$",
# all packages


@@ -16,6 +16,7 @@
import llnl.util.tty as tty
from llnl.util.tty.colify import colify
import spack.cmd.common.arguments as arguments
import spack.dependency
import spack.repo
from spack.version import VersionList
@@ -72,6 +73,7 @@ def setup_parser(subparser):
default=False,
help="include virtual packages in list",
)
arguments.add_common_arguments(subparser, ["tags"])
def filter_by_name(pkgs, args):
@@ -120,9 +122,9 @@ def match(p, f):
@formatter
def name_only(pkgs, out):
indent = 0
colify(pkgs, indent=indent, output=out)
if out.isatty():
tty.msg("%d packages" % len(pkgs))
def github_url(pkg):
@@ -306,6 +308,11 @@ def list(parser, args):
# Filter the set appropriately
sorted_packages = filter_by_name(pkgs, args)
# If tags have been specified on the command line, filter by tags
if args.tags:
packages_with_tags = spack.repo.path.packages_with_tags(*args.tags)
sorted_packages = [p for p in sorted_packages if p in packages_with_tags]
if args.update:
# change output stream if user asked for update
if os.path.exists(args.update):


@@ -30,6 +30,12 @@ def setup_parser(subparser):
help="print the Python version number and exit",
)
subparser.add_argument("-c", dest="python_command", help="command to execute")
subparser.add_argument(
"-u",
dest="unbuffered",
action="store_true",
help="for compatibility with xdist, do not use without adding -u to the interpreter",
)
subparser.add_argument(
"-i",
dest="python_interpreter",


@@ -29,17 +29,14 @@
level = "long"
def setup_parser(subparser):
sp = subparser.add_subparsers(metavar="SUBCOMMAND", dest="test_command")
# Run
run_parser = sp.add_parser(
"run", description=test_run.__doc__, help=first_line(test_run.__doc__)
"run",
description=test_run.__doc__,
help=spack.cmd.first_line(test_run.__doc__),
)
alias_help_msg = "Provide an alias for this test-suite"
@@ -83,7 +80,9 @@ def setup_parser(subparser):
# List
list_parser = sp.add_parser(
"list", description=test_list.__doc__, help=first_line(test_list.__doc__)
"list",
description=test_list.__doc__,
help=spack.cmd.first_line(test_list.__doc__),
)
list_parser.add_argument(
"-a",
@@ -97,7 +96,9 @@ def setup_parser(subparser):
# Find
find_parser = sp.add_parser(
"find", description=test_find.__doc__, help=first_line(test_find.__doc__)
"find",
description=test_find.__doc__,
help=spack.cmd.first_line(test_find.__doc__),
)
find_parser.add_argument(
"filter",
@@ -107,7 +108,9 @@ def setup_parser(subparser):
# Status
status_parser = sp.add_parser(
"status", description=test_status.__doc__, help=first_line(test_status.__doc__)
"status",
description=test_status.__doc__,
help=spack.cmd.first_line(test_status.__doc__),
)
status_parser.add_argument(
"names", nargs=argparse.REMAINDER, help="Test suites for which to print status"
@@ -115,7 +118,9 @@ def setup_parser(subparser):
# Results
results_parser = sp.add_parser(
"results", description=test_results.__doc__, help=first_line(test_results.__doc__)
"results",
description=test_results.__doc__,
help=spack.cmd.first_line(test_results.__doc__),
)
results_parser.add_argument(
"-l", "--logs", action="store_true", help="print the test log for each matching package"
@@ -142,7 +147,9 @@ def setup_parser(subparser):
# Remove
remove_parser = sp.add_parser(
"remove", description=test_remove.__doc__, help=first_line(test_remove.__doc__)
"remove",
description=test_remove.__doc__,
help=spack.cmd.first_line(test_remove.__doc__),
)
arguments.add_common_arguments(remove_parser, ["yes_to_all"])
remove_parser.add_argument(
@@ -191,6 +198,16 @@ def test_run(args):
matching = spack.store.db.query_local(spec, hashes=hashes)
if spec and not matching:
tty.warn("No installed packages match spec %s" % spec)
"""
TODO: Need to write out a log message and/or CDASH Testing
output that package not installed IF continue to process
these issues here.
if args.log_format:
# Proceed with the spec assuming the test process
# to ensure report package as skipped (e.g., for CI)
specs_to_test.append(spec)
"""
specs_to_test.extend(matching)
# test_stage_dir


@@ -35,7 +35,6 @@
"""
import llnl.util.tty as tty
from llnl.util.link_tree import MergeConflictError
import spack.cmd
import spack.environment as ev
@@ -66,16 +65,7 @@ def squash(matching_specs):
tty.die("Spec matches no installed packages.")
matching_in_view = [ms for ms in matching_specs if ms in view_specs]
if len(matching_in_view) > 1:
spec_format = "{name}{@version}{%compiler}{arch=architecture}"
args = ["Spec matches multiple packages.", "Matching packages:"]
args += [
colorize(" @K{%s} " % s.dag_hash(7)) + s.cformat(spec_format)
for s in matching_in_view
]
args += ["Use a more specific spec."]
tty.die(*args)
spack.cmd.ensure_single_spec_or_die("Spec", matching_in_view)
return matching_in_view[0] if matching_in_view else matching_specs[0]


@@ -537,6 +537,14 @@ def get_real_version(self):
)
return self.extract_version_from_output(output)
@property
def prefix(self):
"""Query the compiler for its install prefix. This is the install
path as reported by the compiler. Note that paths for cc, cxx, etc
are not enough to find the install prefix of the compiler, since
they can be symlinks, wrappers, or filenames instead of absolute paths."""
raise NotImplementedError("prefix is not implemented for this compiler")
#
# Compiler classes have methods for querying the version of
# specific compiler executables. This is used when discovering compilers.


@@ -6,8 +6,11 @@
import os
import re
from llnl.util.filesystem import ancestor
import spack.compiler
import spack.compilers.apple_clang as apple_clang
import spack.util.executable
from spack.version import ver
@@ -196,3 +199,21 @@ def f77_version(cls, f77):
@property
def stdcxx_libs(self):
return ("-lstdc++",)
@property
def prefix(self):
# GCC reports its install prefix when running ``-print-search-dirs``
# on the first line ``install: <prefix>``.
cc = spack.util.executable.Executable(self.cc)
with self.compiler_environment():
gcc_output = cc("-print-search-dirs", output=str, error=str)
for line in gcc_output.splitlines():
if line.startswith("install:"):
gcc_prefix = line.split(":")[1].strip()
# Go from <prefix>/lib/gcc/<triplet>/<version>/ to <prefix>
return ancestor(gcc_prefix, 4)
raise RuntimeError(
"could not find install prefix of GCC from output:\n\t{}".format(gcc_output)
)
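For reference, the kind of output being parsed here (an assumed example; real paths vary by installation):

# $ gcc -print-search-dirs | head -1
# install: /usr/lib/gcc/x86_64-linux-gnu/11/
# ancestor("/usr/lib/gcc/x86_64-linux-gnu/11", 4) -> "/usr"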


@@ -17,7 +17,6 @@
from __future__ import print_function
import functools
import os.path
import platform
import tempfile
from contextlib import contextmanager
@@ -25,7 +24,6 @@
import archspec.cpu
import llnl.util.filesystem as fs
import llnl.util.lang
import llnl.util.tty as tty
@@ -38,6 +36,7 @@
import spack.spec
import spack.target
import spack.tengine
import spack.util.path
import spack.variant as vt
from spack.config import config
from spack.package_prefs import PackagePrefs, is_spec_buildable, spec_externals
@@ -91,7 +90,7 @@ def concretize_develop(self, spec):
if not dev_info:
return False
path = spack.util.path.canonicalize_path(dev_info["path"], default_wd=env.path)
if "dev_path" in spec.variants:
assert spec.variants["dev_path"].value == path
@@ -752,37 +751,20 @@ def _concretize_specs_together_new(*abstract_specs, **kwargs):
def _concretize_specs_together_original(*abstract_specs, **kwargs):
abstract_specs = [spack.spec.Spec(s) for s in abstract_specs]
tmpdir = tempfile.mkdtemp()
builder = spack.repo.MockRepositoryBuilder(tmpdir)
# Split recursive specs, as it seems the concretizer has issue
# respecting conditions on dependents expressed like
# depends_on('foo ^bar@1.0'), see issue #11160
split_specs = [
dep.copy(deps=False) for spec1 in abstract_specs for dep in spec1.traverse(root=True)
]
builder.add_package(
"concretizationroot", dependencies=[(str(x), None, None) for x in split_specs]
)
with spack.repo.use_repositories(builder.root, override=False):
# Spec from a helper package that depends on all the abstract_specs
concretization_root = spack.spec.Spec("concretizationroot")
concretization_root.concretize(tests=kwargs.get("tests", False))


@@ -64,6 +64,7 @@
# Hacked yaml for configuration files preserves line numbers.
import spack.util.spack_yaml as syaml
import spack.util.web as web_util
from spack.error import SpackError
from spack.util.cpus import cpus_available
@@ -408,28 +409,22 @@ def __init__(self, *scopes):
@_config_mutator
def push_scope(self, scope):
"""Add a higher precedence scope to the Configuration."""
tty.debug("[CONFIGURATION: PUSH SCOPE]: {}".format(str(scope)), level=2)
self.scopes[scope.name] = scope
@_config_mutator
def pop_scope(self):
"""Remove the highest precedence scope and return it."""
name, scope = self.scopes.popitem(last=True)
tty.debug("[CONFIGURATION: POP SCOPE]: {}".format(str(scope)), level=2)
return scope
@_config_mutator
def remove_scope(self, scope_name):
"""Remove scope by name; has no effect when ``scope_name`` does not exist"""
scope = self.scopes.pop(scope_name, None)
tty.debug("[CONFIGURATION: POP SCOPE]: {}".format(str(scope)), level=2)
return scope
@property
def file_scopes(self):
@@ -994,18 +989,19 @@ def read_config_file(filename, schema=None):
# schema when it's not necessary) while allowing us to validate against a
# known schema when the top-level key could be incorrect.
if not os.path.exists(filename):
# Ignore nonexistent files.
tty.debug("Skipping nonexistent config path {0}".format(filename))
return None
elif not os.path.isfile(filename):
raise ConfigFileError("Invalid configuration. %s exists but is not a file." % filename)
elif not os.access(filename, os.R_OK):
raise ConfigFileError("Config file is not readable: %s" % filename)
raise ConfigFileError("Config file is not readable: {0}".format(filename))
try:
tty.debug("Reading config file %s" % filename)
tty.debug("Reading config from file {0}".format(filename))
with open(filename) as f:
data = syaml.load_config(f)
@@ -1020,7 +1016,15 @@ def read_config_file(filename, schema=None):
raise ConfigFileError("Config file is empty or is not a valid YAML dict: %s" % filename)
except MarkedYAMLError as e:
raise ConfigFileError("Error parsing yaml%s: %s" % (str(e.context_mark), e.problem))
msg = "Error parsing yaml"
mark = e.context_mark if e.context_mark else e.problem_mark
if mark:
line, column = mark.line, mark.column
msg += ": near %s, %s, %s" % (mark.name, str(line), str(column))
else:
msg += ": %s" % (filename)
msg += ": %s" % (e.problem)
raise ConfigFileError(msg)
except IOError as e:
raise ConfigFileError("Error reading configuration file %s: %s" % (filename, str(e)))
@@ -1296,6 +1300,95 @@ def _config_from(scopes_or_paths):
return configuration
def raw_github_gitlab_url(url):
"""Transform a github URL to the raw form to avoid undesirable html.
Args:
url: url to be converted to raw form
Returns: (str) raw github/gitlab url or the original url
"""
# Note we rely on GitHub to redirect the 'raw' URL returned here to the
# actual URL under https://raw.githubusercontent.com/ with '/blob'
# removed (and '/blame', if needed).
if "github" in url or "gitlab" in url:
return url.replace("/blob/", "/raw/")
return url
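For instance (hypothetical URL):

# raw_github_gitlab_url("https://github.com/org/repo/blob/develop/packages.yaml")
#   -> "https://github.com/org/repo/raw/develop/packages.yaml"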
def collect_urls(base_url):
"""Return a list of configuration URLs.
Arguments:
base_url (str): URL for a configuration (yaml) file or a directory
containing yaml file(s)
Returns: (list) list of configuration file(s) or empty list if none
"""
if not base_url:
return []
extension = ".yaml"
if base_url.endswith(extension):
return [base_url]
# Collect configuration URLs if the base_url is a "directory".
_, links = web_util.spider(base_url, 0)
return [link for link in links if link.endswith(extension)]
def fetch_remote_configs(url, dest_dir, skip_existing=True):
"""Retrieve configuration file(s) at the specified URL.
Arguments:
url (str): URL for a configuration (yaml) file or a directory containing
yaml file(s)
dest_dir (str): destination directory
skip_existing (bool): Skip files that already exist in dest_dir if
``True``; otherwise, replace those files
Returns: (str) path to the corresponding file if the URL is or contains a
single file and it is the only file in the destination directory, or
the root (dest_dir) directory if multiple configuration files exist
or are retrieved.
"""
def _fetch_file(url):
raw = raw_github_gitlab_url(url)
tty.debug("Reading config from url {0}".format(raw))
return web_util.fetch_url_text(raw, dest_dir=dest_dir)
if not url:
raise ConfigFileError("Cannot retrieve configuration without a URL")
# Return the local path to the cached configuration file OR to the
# directory containing the cached configuration files.
config_links = collect_urls(url)
existing_files = os.listdir(dest_dir) if os.path.isdir(dest_dir) else []
paths = []
for config_url in config_links:
basename = os.path.basename(config_url)
if skip_existing and basename in existing_files:
tty.warn(
"Will not fetch configuration from {0} since a version already"
"exists in {1}".format(config_url, dest_dir)
)
path = os.path.join(dest_dir, basename)
else:
path = _fetch_file(config_url)
if path:
paths.append(path)
if paths:
return dest_dir if len(paths) > 1 else paths[0]
raise ConfigFileError("Cannot retrieve configuration (yaml) from {0}".format(url))
class ConfigError(SpackError):
"""Superclass for all Spack config related errors."""


@@ -4,8 +4,10 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import json
import sys
import jsonschema
import jsonschema.exceptions
import six
import llnl.util.tty as tty
@@ -161,10 +163,21 @@ def entries_to_specs(entries):
def read(path, apply_updates):
with open(path, "r") as json_file:
json_data = json.load(json_file)
if sys.version_info >= (3, 0):
decode_exception_type = json.decoder.JSONDecodeError
else:
decode_exception_type = ValueError
try:
with open(path, "r") as json_file:
json_data = json.load(json_file)
jsonschema.validate(json_data, manifest_schema)
except (jsonschema.exceptions.ValidationError, decode_exception_type) as e:
raise six.raise_from(
ManifestValidationError("error parsing manifest JSON:", str(e)),
e,
)
specs = entries_to_specs(json_data["specs"])
tty.debug("{0}: {1} specs read from manifest".format(path, str(len(specs))))
@@ -179,3 +192,8 @@ def read(path, apply_updates):
if apply_updates:
for spec in specs.values():
spack.store.db.add(spec, directory_layout=None)
class ManifestValidationError(spack.error.SpackError):
def __init__(self, msg, long_msg=None):
super(ManifestValidationError, self).__init__(msg, long_msg)


@@ -48,7 +48,10 @@
import spack.store
import spack.util.lock as lk
import spack.util.spack_json as sjson
from spack.directory_layout import (
DirectoryLayoutError,
InconsistentInstallDirectoryError,
)
from spack.error import SpackError
from spack.filesystem_view import YamlFilesystemView
from spack.util.crypto import bit_length
@@ -1063,7 +1066,14 @@ def _read(self):
elif self.is_upstream:
tty.warn("upstream not found: {0}".format(self._index_path))
def _add(
self,
spec,
directory_layout=None,
explicit=False,
installation_time=None,
allow_missing=False,
):
"""Add an install record for this spec to the database.
Assumes spec is installed in ``layout.path_for_spec(spec)``.
@@ -1074,19 +1084,18 @@ def _add(self, spec, directory_layout=None, explicit=False, installation_time=No
Args:
spec: spec to be added
directory_layout: layout of the spec installation
explicit:
Possible values: True, False, any
A spec that was installed following a specific user
request is marked as explicit. If instead it was
pulled-in as a dependency of a user requested spec
it's considered implicit.
installation_time:
Date and time of installation
allow_missing: if True, don't warn when installation is not found on disk.
This is useful when installing specs without build deps.
"""
if not spec.concrete:
raise NonConcreteSpecAddError("Specs added to DB must be concrete.")
@@ -1100,11 +1109,22 @@ def _add(self, spec, directory_layout=None, explicit=False, installation_time=No
# Retrieve optional arguments
installation_time = installation_time or _now()
for edge in spec.edges_to_dependencies(deptype=_tracked_deps):
if edge.spec.dag_hash() in self._data:
continue
# allow missing build-only deps. This prevents excessive
# warnings when a spec is installed, and its build dep
# is missing a build dep; there's no need to install the
# build dep's build dep first, and there's no need to warn
# about it missing.
dep_allow_missing = allow_missing or edge.deptypes == ("build",)
self._add(
edge.spec,
directory_layout,
explicit=False,
installation_time=installation_time,
allow_missing=dep_allow_missing,
)
# Make sure the directory layout agrees whether the spec is installed
if not spec.external and directory_layout:
@@ -1115,13 +1135,14 @@ def _add(self, spec, directory_layout=None, explicit=False, installation_time=No
installed = True
self._installed_prefixes.add(path)
except DirectoryLayoutError as e:
if not (allow_missing and isinstance(e, InconsistentInstallDirectoryError)):
msg = (
"{0} is being {1} in the database with prefix {2}, "
"but this directory does not contain an installation of "
"the spec, due to: {3}"
)
action = "updated" if key in self._data else "registered"
tty.warn(msg.format(spec.short_spec, action, path, str(e)))
elif spec.external_path:
path = spec.external_path
installed = True


@@ -17,7 +17,6 @@
import llnl.util.filesystem as fs
import llnl.util.tty as tty
from llnl.util.lang import dedupe
from llnl.util.symlink import symlink
@@ -593,7 +592,7 @@ def regenerate(self, concretized_root_specs):
symlink(new_root, tmp_symlink_name)
# mv symlink atomically over root symlink to old_root
fs.rename(tmp_symlink_name, self.root)
except Exception as e:
# Clean up new view and temporary symlink on any failure.
try:
@@ -897,6 +896,11 @@ def repos_path(self):
def log_path(self):
return os.path.join(self.path, env_subdir_name, "logs")
@property
def config_stage_dir(self):
"""Directory for any staged configuration file(s)."""
return os.path.join(self.env_subdir_path, "config")
@property
def view_path_default(self):
# default path for environment views
@@ -928,6 +932,47 @@ def included_config_scopes(self):
# allow paths to contain spack config/environment variables, etc.
config_path = substitute_path_variables(config_path)
# strip file URL prefix, if needed, to avoid unnecessary remote
# config processing for local files
config_path = config_path.replace("file://", "")
if not os.path.exists(config_path):
# Stage any remote configuration file(s)
if spack.util.url.is_url_format(config_path):
staged_configs = (
os.listdir(self.config_stage_dir)
if os.path.exists(self.config_stage_dir)
else []
)
basename = os.path.basename(config_path)
if basename in staged_configs:
# Do NOT re-stage configuration files over existing
# ones with the same name since there is a risk of
# losing changes (e.g., from 'spack config update').
tty.warn(
"Will not re-stage configuration from {0} to avoid "
"losing changes to the already staged file of the "
"same name.".format(config_path)
)
# Recognize the configuration stage directory
# is flattened to ensure a single copy of each
# configuration file.
config_path = self.config_stage_dir
if basename.endswith(".yaml"):
config_path = os.path.join(config_path, basename)
else:
staged_path = spack.config.fetch_remote_configs(
config_path,
self.config_stage_dir,
skip_existing=True,
)
if not staged_path:
raise SpackEnvironmentError(
"Unable to fetch remote configuration {0}".format(config_path)
)
config_path = staged_path
# treat relative paths as relative to the environment
if not os.path.isabs(config_path):
config_path = os.path.join(self.path, config_path)
@@ -936,10 +981,14 @@ def included_config_scopes(self):
if os.path.isdir(config_path):
# directories are treated as regular ConfigScopes
config_name = "env:%s:%s" % (self.name, os.path.basename(config_path))
tty.debug("Creating ConfigScope {0} for '{1}'".format(config_name, config_path))
scope = spack.config.ConfigScope(config_name, config_path)
elif os.path.exists(config_path):
# files are assumed to be SingleFileScopes
config_name = "env:%s:%s" % (self.name, config_path)
tty.debug(
"Creating SingleFileScope {0} for '{1}'".format(config_name, config_path)
)
scope = spack.config.SingleFileScope(
config_name, config_path, spack.schema.merged.schema
)
@@ -1024,6 +1073,58 @@ def add(self, user_spec, list_name=user_speclist_name):
return bool(not existing)
def change_existing_spec(
self,
change_spec,
list_name=user_speclist_name,
match_spec=None,
allow_changing_multiple_specs=False,
):
"""
Find the spec identified by `match_spec` and change it to `change_spec`.
Arguments:
change_spec (spack.spec.Spec): defines the spec properties that
need to be changed. This will not change attributes of the
matched spec unless they conflict with `change_spec`.
list_name (str): identifies the spec list in the environment that
should be modified
match_spec (spack.spec.Spec): if set, this identifies the spec
that should be changed. If not set, it is assumed we are
looking for a spec with the same name as `change_spec`.
"""
if not (change_spec.name or (match_spec and match_spec.name)):
raise ValueError(
"Must specify a spec name to identify a single spec"
" in the environment that will be changed"
)
match_spec = match_spec or Spec(change_spec.name)
list_to_change = self.spec_lists[list_name]
if list_to_change.is_matrix:
raise SpackEnvironmentError(
"Cannot directly change specs in matrices:"
" specify a named list that is not a matrix"
)
matches = list(x for x in list_to_change if x.satisfies(match_spec))
if len(matches) == 0:
raise ValueError(
"There are no specs named {0} in {1}".format(match_spec.name, list_name)
)
elif len(matches) > 1 and not allow_changing_multiple_specs:
raise ValueError("{0} matches multiple specs".format(str(match_spec)))
new_speclist = SpecList(list_name)
for i, spec in enumerate(list_to_change):
if spec.satisfies(match_spec):
new_speclist.add(Spec.override(spec, change_spec))
else:
new_speclist.add(spec)
self.spec_lists[list_name] = new_speclist
self.update_stale_references()
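A hypothetical call against an active environment env:

# Replace every root matching 'mpich' in the user spec list with 'mpich@4.0',
# keeping attributes of the matched spec that don't conflict:
# env.change_existing_spec(Spec("mpich@4.0"), match_spec=Spec("mpich"))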
def remove(self, query_spec, list_name=user_speclist_name, force=False):
"""Remove specs from an environment that match a query_spec"""
query_spec = Spec(query_spec)
@@ -1116,7 +1217,7 @@ def develop(self, spec, path, clone=False):
if clone:
# "steal" the source code via staging API
abspath = spack.util.path.canonicalize_path(path, default_wd=self.path)
# Stage, at the moment, requires a concrete Spec, since it needs the
# dag_hash for the stage dir name. Below though we ask for a stage
@@ -1694,7 +1795,7 @@ def added_specs(self):
spec for already concretized but not yet installed specs.
"""
# use a transaction to avoid overhead of repeated calls
# to `package.spec.installed`
with spack.store.db.read_transaction():
concretized = dict(self.concretized_specs())
for spec in self.user_specs:


@@ -44,7 +44,7 @@ def activate_header(env, shell, prompt=None):
# TODO: prompt
else:
if "color" in os.getenv("TERM", "") and prompt:
prompt = colorize("@G{%s}" % prompt, color=True)
prompt = colorize("@G{%s}" % prompt, color=True, enclose=True)
cmds += "export SPACK_ENV=%s;\n" % env.path
cmds += "alias despacktivate='spack env deactivate';\n"
@@ -65,8 +65,8 @@ def deactivate_header(shell):
if shell == "csh":
cmds += "unsetenv SPACK_ENV;\n"
cmds += "if ( $?SPACK_OLD_PROMPT ) "
cmds += ' eval \'set prompt="$SPACK_OLD_PROMPT" &&'
cmds += " unsetenv SPACK_OLD_PROMPT';\n"
cmds += "unalias despacktivate;\n"
elif shell == "fish":
cmds += "set -e SPACK_ENV;\n"


@@ -52,9 +52,9 @@
import spack.util.crypto as crypto
import spack.util.pattern as pattern
import spack.util.url as url_util
import spack.util.web as web_util
import spack.version
from spack.util.compression import decompressor_for, extension_from_path
from spack.util.executable import CommandNotFoundError, which
from spack.util.string import comma_and, quote
@@ -337,7 +337,8 @@ def fetch(self):
url = None
errors = []
for url in self.candidate_urls:
if not web_util.url_exists(url, self.curl):
tty.debug("URL does not exist: " + url)
continue
try:
@@ -352,30 +353,6 @@ def fetch(self):
if not self.archive_file:
raise FailedDownloadError(url)
def _fetch_from_url(self, url):
if spack.config.get("config:url_fetch_method") == "curl":
return self._fetch_curl(url)
@@ -397,8 +374,8 @@ def _fetch_urllib(self, url):
# Run urllib but grab the mime type from the http headers
try:
url, headers, response = web_util.read_from_url(url)
except web_util.SpackWebError as e:
# clean up archive on failure.
if self.archive_file:
os.remove(self.archive_file)
@@ -433,38 +410,19 @@ def _fetch_curl(self, url):
else:
save_args = ["-O"]
timeout = 0
cookie_args = []
if self.extra_options:
cookie = self.extra_options.get("cookie")
if cookie:
curl_args.append("-j") # junk cookies
curl_args.append("-b") # specify cookie
curl_args.append(cookie)
cookie_args.append("-j") # junk cookies
cookie_args.append("-b") # specify cookie
cookie_args.append(cookie)
timeout = self.extra_options.get("timeout")
base_args = web_util.base_curl_fetch_args(url, timeout)
curl_args = save_args + base_args + cookie_args
# Run curl but grab the mime type from the http headers
curl = self.curl
@@ -479,26 +437,10 @@ def _fetch_curl(self, url):
if partial_file and os.path.lexists(partial_file):
os.remove(partial_file)
try:
web_util.check_curl_code(curl.returncode)
except web_util.FetchError as err:
raise spack.fetch_strategy.FailedDownloadError(url, str(err))
self._check_headers(headers)
@@ -556,7 +498,7 @@ def archive(self, destination):
if not self.archive_file:
raise NoArchiveFileError("Cannot call archive() before fetching.")
web_util.push_to_url(self.archive_file, destination, keep_original=True)
@_needs_stage
def check(self):
@@ -671,7 +613,7 @@ def expand(self):
@_needs_stage
def archive(self, destination, **kwargs):
assert extension(destination) == "tar.gz"
assert extension_from_path(destination) == "tar.gz"
assert self.stage.source_path.startswith(self.stage.path)
tar = which("tar", required=True)
@@ -1385,19 +1327,19 @@ def fetch(self):
parsed_url = url_util.parse(self.url)
if parsed_url.scheme != "s3":
raise FetchError("S3FetchStrategy can only fetch from s3:// urls.")
raise web_util.FetchError("S3FetchStrategy can only fetch from s3:// urls.")
tty.debug("Fetching {0}".format(self.url))
basename = os.path.basename(parsed_url.path)
with working_dir(self.stage.path):
_, headers, stream = web_util.read_from_url(self.url)
with open(basename, "wb") as f:
shutil.copyfileobj(stream, f)
content_type = web_util.get_header(headers, "Content-type")
if content_type == "text/html":
warn_content_type_mismatch(self.archive_file or "the archive")
@@ -1426,15 +1368,13 @@ def __init__(self, *args, **kwargs):
@_needs_stage
def fetch(self):
if self.archive_file:
tty.debug("Already downloaded {0}".format(self.archive_file))
return
parsed_url = url_util.parse(self.url)
if parsed_url.scheme != "gs":
raise FetchError("GCSFetchStrategy can only fetch from gs:// urls.")
raise web_util.FetchError("GCSFetchStrategy can only fetch from gs:// urls.")
tty.debug("Fetching {0}".format(self.url))
@@ -1489,7 +1429,7 @@ def from_kwargs(**kwargs):
on attribute names (e.g., ``git``, ``hg``, etc.)
Raises:
spack.util.web.FetchError: If no ``fetch_strategy`` matches the args.
"""
for fetcher in all_strategies:
if fetcher.matches(kwargs):
@@ -1586,7 +1526,7 @@ def for_package_version(pkg, version):
# if it's a commit, we must use a GitFetchStrategy
if isinstance(version, spack.version.GitVersion):
if not hasattr(pkg, "git"):
raise web_util.FetchError(
"Cannot fetch git version for %s. Package has no 'git' attribute" % pkg.name
)
# Populate the version with comparisons to other commits
@@ -1604,7 +1544,19 @@ def for_package_version(pkg, version):
ref_type: version.ref,
"no_cache": True,
}
kwargs["submodules"] = getattr(pkg, "submodules", False)
# if we have a ref_version already, and it is a version from the package
# we can use that version's submodule specifications
if pkg.version.ref_version:
ref_version = spack.version.Version(pkg.version.ref_version[0])
ref_version_attributes = pkg.versions.get(ref_version)
if ref_version_attributes:
kwargs["submodules"] = ref_version_attributes.get(
"submodules", kwargs["submodules"]
)
fetcher = GitFetchStrategy(**kwargs)
return fetcher
@@ -1731,15 +1683,11 @@ def destroy(self):
shutil.rmtree(self.root, ignore_errors=True)
class FetchError(spack.error.SpackError):
"""Superclass for fetcher errors."""
class NoCacheError(FetchError):
class NoCacheError(web_util.FetchError):
"""Raised when there is no cached archive for a package."""
class FailedDownloadError(FetchError):
class FailedDownloadError(web_util.FetchError):
"""Raised when a download fails."""
def __init__(self, url, msg=""):
@@ -1747,23 +1695,23 @@ def __init__(self, url, msg=""):
self.url = url
class NoArchiveFileError(FetchError):
""" "Raised when an archive file is expected but none exists."""
class NoArchiveFileError(web_util.FetchError):
"""Raised when an archive file is expected but none exists."""
class NoDigestError(FetchError):
class NoDigestError(web_util.FetchError):
"""Raised after attempt to checksum when URL has no digest."""
class ExtrapolationError(FetchError):
class ExtrapolationError(web_util.FetchError):
"""Raised when we can't extrapolate a version for a package."""
class FetcherConflict(FetchError):
class FetcherConflict(web_util.FetchError):
"""Raised for packages with invalid fetch attributes."""
class InvalidArgsError(FetchError):
class InvalidArgsError(web_util.FetchError):
"""Raised when a version can't be deduced from a set of arguments."""
def __init__(self, pkg=None, version=None, **args):
@@ -1776,11 +1724,11 @@ def __init__(self, pkg=None, version=None, **args):
super(InvalidArgsError, self).__init__(msg, long_msg)
class ChecksumError(FetchError):
class ChecksumError(web_util.FetchError):
"""Raised when archive fails to checksum."""
class NoStageError(FetchError):
class NoStageError(web_util.FetchError):
"""Raised when fetch operations are called before set_stage()."""
def __init__(self, method):

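The hunks above reroot the fetcher error hierarchy: the module-local FetchError base is dropped and every fetcher error now derives from web_util.FetchError (spack.util.web.FetchError), so callers can catch web and fetch failures with a single except clause. A minimal, self-contained sketch of the pattern — class names shortened, not Spack's actual module layout:

    class FetchError(Exception):
        """Shared base, analogous to spack.util.web.FetchError."""

    class FailedDownloadError(FetchError):
        def __init__(self, url, msg=""):
            super().__init__("Failed to fetch %s: %s" % (url, msg))
            self.url = url

    class NoCacheError(FetchError):
        """Raised when there is no cached archive for a package."""

    try:
        raise FailedDownloadError("https://example.com/pkg.tar.gz", "timeout")
    except FetchError as err:   # one handler covers the whole hierarchy
        print(err)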
View File

@@ -44,7 +44,7 @@ def __call__(self, spec):
#: Hash descriptor used only to transfer a DAG, as is, across processes
process_hash = SpecHashDescriptor(
deptype=("build", "link", "run", "test"), package_hash=False, name="process_hash"
deptype=("build", "link", "run", "test"), package_hash=True, name="process_hash"
)

View File

@@ -12,6 +12,7 @@
import six
import llnl.util.filesystem as fs
import llnl.util.tty as tty
import spack.error
import spack.paths
@@ -180,6 +181,9 @@ def __call__(self, *args, **kwargs):
if spec.external and not externals:
status = "SKIPPED"
skipped += 1
elif not spec.installed:
status = "SKIPPED"
skipped += 1
else:
status = "NO-TESTS"
untested += 1
@@ -187,6 +191,7 @@ def __call__(self, *args, **kwargs):
self.write_test_result(spec, status)
except BaseException as exc:
self.fails += 1
tty.debug("Test failure: {0}".format(str(exc)))
if isinstance(exc, (SyntaxError, TestSuiteSpecError)):
# Create the test log file and report the error.
self.ensure_stage()

View File

@@ -84,6 +84,9 @@
#: queue invariants).
STATUS_REMOVED = "removed"
is_windows = sys.platform == "win32"
is_osx = sys.platform == "darwin"
class InstallAction(object):
#: Don't perform an install
@@ -165,7 +168,9 @@ def _do_fake_install(pkg):
if not pkg.name.startswith("lib"):
library = "lib" + library
dso_suffix = ".dylib" if sys.platform == "darwin" else ".so"
plat_shared = ".dll" if is_windows else ".so"
plat_static = ".lib" if is_windows else ".a"
dso_suffix = ".dylib" if is_osx else plat_shared
# Install fake command
fs.mkdirp(pkg.prefix.bin)
@@ -180,7 +185,7 @@ def _do_fake_install(pkg):
# Install fake shared and static libraries
fs.mkdirp(pkg.prefix.lib)
for suffix in [dso_suffix, ".a"]:
for suffix in [dso_suffix, plat_static]:
fs.touch(os.path.join(pkg.prefix.lib, library + suffix))
# Install fake man page
@@ -257,6 +262,30 @@ def _hms(seconds):
return " ".join(parts)
def _log_prefix(pkg_name):
"""Prefix of the form "[pid]: [pkg name]: ..." when printing a status update during
the build."""
pid = "{0}: ".format(os.getpid()) if tty.show_pid() else ""
return "{0}{1}:".format(pid, pkg_name)
def _print_installed_pkg(message):
"""
Output a message with a package icon.
Args:
message (str): message to be output
"""
print(colorize("@*g{[+]} ") + spack.util.path.debug_padded_filter(message))
def _print_timer(pre, pkg_id, fetch, build, total):
tty.msg(
"{0} Successfully installed {1}".format(pre, pkg_id),
"Fetch: {0}. Build: {1}. Total: {2}.".format(_hms(fetch), _hms(build), _hms(total)),
)
def _install_from_cache(pkg, cache_only, explicit, unsigned=False):
"""
Extract the package from binary cache
@@ -273,7 +302,10 @@ def _install_from_cache(pkg, cache_only, explicit, unsigned=False):
bool: ``True`` if the package was extracted from binary cache,
``False`` otherwise
"""
installed_from_cache = _try_install_from_binary_cache(pkg, explicit, unsigned=unsigned)
timer = Timer()
installed_from_cache = _try_install_from_binary_cache(
pkg, explicit, unsigned=unsigned, timer=timer
)
pkg_id = package_id(pkg)
if not installed_from_cache:
pre = "No binary for {0} found".format(pkg_id)
@@ -282,23 +314,20 @@ def _install_from_cache(pkg, cache_only, explicit, unsigned=False):
tty.msg("{0}: installing from source".format(pre))
return False
timer.stop()
tty.debug("Successfully extracted {0} from binary cache".format(pkg_id))
_print_timer(
pre=_log_prefix(pkg.name),
pkg_id=pkg_id,
fetch=timer.phases.get("search", 0) + timer.phases.get("fetch", 0),
build=timer.phases.get("install", 0),
total=timer.total,
)
_print_installed_pkg(pkg.spec.prefix)
spack.hooks.post_install(pkg.spec)
return True
def _print_installed_pkg(message):
"""
Output a message with a package icon.
Args:
message (str): message to be output
"""
print(colorize("@*g{[+]} ") + spack.util.path.debug_padded_filter(message))
def _process_external_package(pkg, explicit):
"""
Helper function to run post install hooks and register external packages.
@@ -340,7 +369,9 @@ def _process_external_package(pkg, explicit):
spack.store.db.add(spec, None, explicit=explicit)
def _process_binary_cache_tarball(pkg, binary_spec, explicit, unsigned, mirrors_for_spec=None):
def _process_binary_cache_tarball(
pkg, binary_spec, explicit, unsigned, mirrors_for_spec=None, timer=None
):
"""
Process the binary cache tarball.
@@ -352,6 +383,7 @@ def _process_binary_cache_tarball(pkg, binary_spec, explicit, unsigned, mirrors_
otherwise, ``False``
mirrors_for_spec (list): Optional list of concrete specs and mirrors
obtained by calling binary_distribution.get_mirrors_for_spec().
timer (Timer): timer to keep track of binary install phases.
Return:
bool: ``True`` if the package was extracted from binary cache,
@@ -360,6 +392,8 @@ def _process_binary_cache_tarball(pkg, binary_spec, explicit, unsigned, mirrors_
download_result = binary_distribution.download_tarball(
binary_spec, unsigned, mirrors_for_spec=mirrors_for_spec
)
if timer:
timer.phase("fetch")
# see #10063 : install from source if tarball doesn't exist
if download_result is None:
tty.msg("{0} exists in binary cache but with different hash".format(pkg.name))
@@ -376,10 +410,12 @@ def _process_binary_cache_tarball(pkg, binary_spec, explicit, unsigned, mirrors_
pkg.installed_from_binary_cache = True
spack.store.db.add(pkg.spec, spack.store.layout, explicit=explicit)
if timer:
timer.phase("install")
return True
def _try_install_from_binary_cache(pkg, explicit, unsigned=False):
def _try_install_from_binary_cache(pkg, explicit, unsigned=False, timer=None):
"""
Try to extract the package from binary cache.
@@ -388,16 +424,20 @@ def _try_install_from_binary_cache(pkg, explicit, unsigned=False):
explicit (bool): the package was explicitly requested by the user
unsigned (bool): ``True`` if binary package signatures to be checked,
otherwise, ``False``
timer (Timer): timer to keep track of binary install phases.
"""
pkg_id = package_id(pkg)
tty.debug("Searching for binary cache of {0}".format(pkg_id))
matches = binary_distribution.get_mirrors_for_spec(pkg.spec)
if timer:
timer.phase("search")
if not matches:
return False
return _process_binary_cache_tarball(
pkg, pkg.spec, explicit, unsigned, mirrors_for_spec=matches
pkg, pkg.spec, explicit, unsigned, mirrors_for_spec=matches, timer=timer
)
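The timer threaded through _try_install_from_binary_cache and _process_binary_cache_tarball above accumulates named phases (search, fetch, install), which _print_timer later folds into the fetch/build split. A rough, runnable stand-in for that bookkeeping, assuming only the phase()/stop()/phases/total interface used in the hunks (the real Timer lives elsewhere in Spack):

    import time

    class Timer:
        # minimal stand-in matching the phase()/stop()/phases/total usage above
        def __init__(self):
            self._start = time.time()
            self._last = self._start
            self.phases = {}
            self.total = 0.0

        def phase(self, name):
            now = time.time()
            self.phases[name] = self.phases.get(name, 0.0) + (now - self._last)
            self._last = now

        def stop(self):
            self.total = time.time() - self._start

    timer = Timer()
    time.sleep(0.01)            # stands in for querying mirrors
    timer.phase("search")
    time.sleep(0.02)            # stands in for downloading the tarball
    timer.phase("fetch")
    time.sleep(0.01)            # stands in for extracting into the prefix
    timer.phase("install")
    timer.stop()

    fetch = timer.phases.get("search", 0) + timer.phases.get("fetch", 0)
    build = timer.phases.get("install", 0)
    print("Fetch: %.2fs. Build: %.2fs. Total: %.2fs." % (fetch, build, timer.total))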
@@ -820,7 +860,7 @@ def _check_deps_status(self, request):
if spack.store.db.prefix_failed(dep):
action = "'spack install' the dependency"
msg = "{0} is marked as an install failure: {1}".format(dep_id, action)
raise InstallError(err.format(request.pkg_id, msg))
raise InstallError(err.format(request.pkg_id, msg), pkg=dep_pkg)
# Attempt to get a read lock to ensure another process does not
# uninstall the dependency while the requested spec is being
@@ -828,7 +868,7 @@ def _check_deps_status(self, request):
ltype, lock = self._ensure_locked("read", dep_pkg)
if lock is None:
msg = "{0} is write locked by another process".format(dep_id)
raise InstallError(err.format(request.pkg_id, msg))
raise InstallError(err.format(request.pkg_id, msg), pkg=request.pkg)
# Flag external and upstream packages as being installed
if dep_pkg.spec.external or dep_pkg.spec.installed_upstream:
@@ -883,6 +923,7 @@ def _prepare_for_install(self, task):
"Install prefix collision for {0}".format(task.pkg_id),
long_msg="Prefix directory {0} already used by another "
"installed spec.".format(task.pkg.spec.prefix),
pkg=task.pkg,
)
# Make sure the installation directory is in the desired state
@@ -1176,12 +1217,12 @@ def _install_task(self, task):
Args:
task (BuildTask): the installation build task for a package"""
install_args = task.request.install_args
cache_only = install_args.get("cache_only")
explicit = task.explicit
install_args = task.request.install_args
cache_only = task.cache_only
use_cache = task.use_cache
tests = install_args.get("tests")
unsigned = install_args.get("unsigned")
use_cache = install_args.get("use_cache")
pkg, pkg_id = task.pkg, task.pkg_id
@@ -1213,7 +1254,10 @@ def _install_task(self, task):
spack.package_base.PackageBase._verbose = spack.build_environment.start_build_process(
pkg, build_process, install_args
)
# Currently this is how RPATH-like behavior is achieved on Windows, after install
# establish runtime linkage via Windows Runtime link object
# Note: this is a no-op on non Windows platforms
pkg.windows_establish_runtime_linkage()
# Note: PARENT of the build process adds the new package to
# the database, so that we don't need to re-read from file.
spack.store.db.add(pkg.spec, spack.store.layout, explicit=explicit)
@@ -1571,7 +1615,8 @@ def install(self):
raise InstallError(
"Cannot proceed with {0}: {1} uninstalled {2}: {3}".format(
pkg_id, task.priority, dep_str, ",".join(task.uninstalled_deps)
)
),
pkg=pkg,
)
# Skip the installation if the spec is not being installed locally
@@ -1596,7 +1641,7 @@ def install(self):
spack.hooks.on_install_failure(task.request.pkg.spec)
if self.fail_fast:
raise InstallError(fail_fast_err)
raise InstallError(fail_fast_err, pkg=pkg)
continue
@@ -1718,7 +1763,7 @@ def install(self):
)
# Terminate if requested to do so on the first failure.
if self.fail_fast:
raise InstallError("{0}: {1}".format(fail_fast_err, str(exc)))
raise InstallError("{0}: {1}".format(fail_fast_err, str(exc)), pkg=pkg)
# Terminate at this point if the single explicit spec has
# failed to install.
@@ -1727,7 +1772,7 @@ def install(self):
# Track explicit spec id and error to summarize when done
if task.explicit:
failed_explicits.append((pkg_id, str(exc)))
failed_explicits.append((pkg, pkg_id, str(exc)))
finally:
# Remove the install prefix if anything went wrong during
@@ -1750,19 +1795,38 @@ def install(self):
# Ensure we properly report if one or more explicit specs failed
# or were not installed when they should have been.
missing = [
request.pkg_id
(request.pkg, request.pkg_id)
for request in self.build_requests
if request.install_args.get("install_package") and request.pkg_id not in self.installed
]
if failed_explicits or missing:
for pkg_id, err in failed_explicits:
for _, pkg_id, err in failed_explicits:
tty.error("{0}: {1}".format(pkg_id, err))
for pkg_id in missing:
for _, pkg_id in missing:
tty.error("{0}: Package was not installed".format(pkg_id))
pkg = None
if len(failed_explicits) > 0:
pkg = failed_explicits[0][0]
ids = [pkg_id for _, pkg_id, _ in failed_explicits]
tty.debug(
"Associating installation failure with first failed "
"explicit package ({0}) from {1}".format(ids[0], ", ".join(ids))
)
if not pkg and len(missing) > 0:
pkg = missing[0][0]
ids = [pkg_id for _, pkg_id in missing]
tty.debug(
"Associating installation failure with first "
"missing package ({0}) from {1}".format(ids[0], ", ".join(ids))
)
raise InstallError(
"Installation request failed. Refer to " "reported errors for failing package(s)."
"Installation request failed. Refer to reported errors for failing package(s).",
pkg=pkg,
)
@@ -1812,8 +1876,7 @@ def __init__(self, pkg, install_args):
self.filter_fn = spack.util.path.padding_filter if padding else None
# info/debug information
pid = "{0}: ".format(os.getpid()) if tty.show_pid() else ""
self.pre = "{0}{1}:".format(pid, pkg.name)
self.pre = _log_prefix(pkg.name)
self.pkg_id = package_id(pkg)
def run(self):
@@ -1856,12 +1919,12 @@ def run(self):
# Run post install hooks before build stage is removed.
spack.hooks.post_install(self.pkg.spec)
build_time = self.timer.total - self.pkg._fetch_time
tty.msg(
"{0} Successfully installed {1}".format(self.pre, self.pkg_id),
"Fetch: {0}. Build: {1}. Total: {2}.".format(
_hms(self.pkg._fetch_time), _hms(build_time), _hms(self.timer.total)
),
_print_timer(
pre=self.pre,
pkg_id=self.pkg_id,
fetch=self.pkg._fetch_time,
build=self.timer.total - self.pkg._fetch_time,
total=self.timer.total,
)
_print_installed_pkg(self.pkg.prefix)
@@ -2060,7 +2123,7 @@ def __init__(self, pkg, request, compiler, start, attempts, status, installed):
# queue.
if status == STATUS_REMOVED:
msg = "Cannot create a build task for {0} with status '{1}'"
raise InstallError(msg.format(self.pkg_id, status))
raise InstallError(msg.format(self.pkg_id, status), pkg=pkg)
self.status = status
@@ -2191,7 +2254,29 @@ def flag_installed(self, installed):
@property
def explicit(self):
"""The package was explicitly requested by the user."""
return self.pkg == self.request.pkg and self.request.install_args.get("explicit", True)
return self.is_root and self.request.install_args.get("explicit", True)
@property
def is_root(self):
"""The package was requested directly, but may or may not be explicit
in an environment."""
return self.pkg == self.request.pkg
@property
def use_cache(self):
_use_cache = True
if self.is_root:
return self.request.install_args.get("package_use_cache", _use_cache)
else:
return self.request.install_args.get("dependencies_use_cache", _use_cache)
@property
def cache_only(self):
_cache_only = False
if self.is_root:
return self.request.install_args.get("package_cache_only", _cache_only)
else:
return self.request.install_args.get("dependencies_cache_only", _cache_only)
@property
def key(self):
@@ -2273,21 +2358,23 @@ def __str__(self):
def _add_default_args(self):
"""Ensure standard install options are set to at least the default."""
for arg, default in [
("cache_only", False),
("context", "build"), # installs *always* build
("dependencies_cache_only", False),
("dependencies_use_cache", True),
("dirty", False),
("fail_fast", False),
("fake", False),
("install_deps", True),
("install_package", True),
("install_source", False),
("package_cache_only", False),
("package_use_cache", True),
("keep_prefix", False),
("keep_stage", False),
("restage", False),
("skip_patch", False),
("tests", False),
("unsigned", False),
("use_cache", True),
("verbose", False),
]:
_ = self.install_args.setdefault(arg, default)
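With the defaults above, the single cache_only/use_cache pair is split into package_* and dependencies_* variants, and the new BuildTask.use_cache/cache_only properties select between them based on is_root. A small sketch of that selection logic, with a plain dict standing in for install_args:

    def effective_cache_args(install_args, is_root):
        # mirrors BuildTask.use_cache / BuildTask.cache_only above
        prefix = "package" if is_root else "dependencies"
        return (
            install_args.get(prefix + "_use_cache", True),
            install_args.get(prefix + "_cache_only", False),
        )

    args = {"package_cache_only": True, "dependencies_use_cache": False}
    print(effective_cache_args(args, is_root=True))   # (True, True)
    print(effective_cache_args(args, is_root=False))  # (False, False)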
@@ -2304,7 +2391,13 @@ def get_deptypes(self, pkg):
"""
deptypes = ["link", "run"]
include_build_deps = self.install_args.get("include_build_deps")
if not self.install_args.get("cache_only") or include_build_deps:
if self.pkg_id == package_id(pkg):
cache_only = self.install_args.get("package_cache_only")
else:
cache_only = self.install_args.get("dependencies_cache_only")
if not cache_only or include_build_deps:
deptypes.append("build")
if self.run_tests(pkg):
deptypes.append("test")
@@ -2333,28 +2426,43 @@ def spec(self):
"""The specification associated with the package."""
return self.pkg.spec
def traverse_dependencies(self):
def traverse_dependencies(self, spec=None, visited=None):
"""
Yield any dependencies of the appropriate type(s)
Yields:
(Spec) The next child spec in the DAG
"""
get_spec = lambda s: s.spec
# notice: deptype is not constant across nodes, so we cannot use
# spec.traverse_edges(deptype=...).
deptypes = self.get_deptypes(self.pkg)
tty.debug("Processing dependencies for {0}: {1}".format(self.pkg_id, deptypes))
for dspec in self.spec.traverse_edges(
deptype=deptypes, order="post", root=False, direction="children"
):
yield get_spec(dspec)
if spec is None:
spec = self.spec
if visited is None:
visited = set()
deptype = self.get_deptypes(spec.package)
for dep in spec.dependencies(deptype=deptype):
hash = dep.dag_hash()
if hash in visited:
continue
visited.add(hash)
# In Python 3: yield from self.traverse_dependencies(dep, visited)
for s in self.traverse_dependencies(dep, visited):
yield s
yield dep
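The rewritten traverse_dependencies walks the DAG itself — the deptypes can now differ per node, so a single spec.traverse_edges(deptype=...) call no longer fits — deduplicating by DAG hash and yielding children before their parents. The same shape on a plain dict-based graph, as a runnable illustration:

    graph = {                       # name -> direct dependencies
        "app": ["liba", "libb"],
        "liba": ["libc"],
        "libb": ["libc"],
        "libc": [],
    }

    def traverse_dependencies(name, visited=None):
        if visited is None:
            visited = set()
        for dep in graph[name]:
            if dep in visited:      # the real code keys on dag_hash()
                continue
            visited.add(dep)
            for d in traverse_dependencies(dep, visited):  # children first
                yield d
            yield dep

    print(list(traverse_dependencies("app")))  # ['libc', 'liba', 'libb']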
class InstallError(spack.error.SpackError):
"""Raised when something goes wrong during install or uninstall."""
"""Raised when something goes wrong during install or uninstall.
def __init__(self, message, long_msg=None):
The error can be annotated with a ``pkg`` attribute to allow the
caller to get the package for which the exception was raised.
"""
def __init__(self, message, long_msg=None, pkg=None):
super(InstallError, self).__init__(message, long_msg)
self.pkg = pkg
class BadInstallPhase(InstallError):

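InstallError now carries an optional pkg attribute, and the raise sites updated above thread the failing package through so callers can attribute the failure. A usage sketch under the same constructor signature (standalone stand-ins; the real classes live in spack.error and spack.installer):

    class SpackError(Exception):
        def __init__(self, message, long_msg=None):
            super().__init__(message)
            self.long_msg = long_msg

    class InstallError(SpackError):
        # matches the constructor introduced above
        def __init__(self, message, long_msg=None, pkg=None):
            super().__init__(message, long_msg)
            self.pkg = pkg

    try:
        raise InstallError("fail-fast: dependency failed", pkg="zlib")  # pkg is normally a Package
    except InstallError as err:
        print(err, "->", err.pkg)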
View File

@@ -18,6 +18,7 @@
import pstats
import re
import signal
import subprocess as sp
import sys
import traceback
import warnings
@@ -546,6 +547,12 @@ def setup_main_options(args):
# Assign a custom function to show warnings
warnings.showwarning = send_warning_to_tty
if sys.version_info[:2] == (2, 7):
warnings.warn(
"Python 2.7 support is deprecated and will be removed in Spack v0.20.\n"
" Please move to Python 3.6 or higher."
)
# Set up environment based on args.
tty.set_verbose(args.verbose)
tty.set_debug(args.debug)
@@ -570,7 +577,14 @@ def setup_main_options(args):
spack.config.set("config:locks", args.locks, scope="command_line")
if args.mock:
spack.repo.path = spack.repo.RepoPath(spack.paths.mock_packages_path)
import spack.util.spack_yaml as syaml
key = syaml.syaml_str("repos")
key.override = True
spack.config.config.scopes["command_line"].sections["repos"] = syaml.syaml_dict(
[(key, [spack.paths.mock_packages_path])]
)
spack.repo.path = spack.repo.create(spack.config.config)
# If the user asked for it, don't check ssl certs.
if args.insecure:
@@ -623,15 +637,19 @@ class SpackCommand(object):
their output.
"""
def __init__(self, command_name):
def __init__(self, command_name, subprocess=False):
"""Create a new SpackCommand that invokes ``command_name`` when called.
Args:
command_name (str): name of the command to invoke
subprocess (bool): whether to fork a subprocess or not. Currently not supported on
Windows, where it is always False.
"""
self.parser = make_argument_parser()
self.command = self.parser.add_command(command_name)
self.command_name = command_name
# TODO: figure out how to support this on windows
self.subprocess = subprocess if sys.platform != "win32" else False
def __call__(self, *argv, **kwargs):
"""Invoke this SpackCommand.
@@ -656,25 +674,36 @@ def __call__(self, *argv, **kwargs):
self.error = None
prepend = kwargs["global_args"] if "global_args" in kwargs else []
args, unknown = self.parser.parse_known_args(prepend + [self.command_name] + list(argv))
fail_on_error = kwargs.get("fail_on_error", True)
out = StringIO()
try:
with log_output(out):
self.returncode = _invoke_command(self.command, self.parser, args, unknown)
if self.subprocess:
p = sp.Popen(
[spack.paths.spack_script, self.command_name] + prepend + list(argv),
stdout=sp.PIPE,
stderr=sp.STDOUT,
)
out, self.returncode = p.communicate()
out = out.decode()
else:
args, unknown = self.parser.parse_known_args(
prepend + [self.command_name] + list(argv)
)
except SystemExit as e:
self.returncode = e.code
out = StringIO()
try:
with log_output(out):
self.returncode = _invoke_command(self.command, self.parser, args, unknown)
except BaseException as e:
tty.debug(e)
self.error = e
if fail_on_error:
self._log_command_output(out)
raise
except SystemExit as e:
self.returncode = e.code
except BaseException as e:
tty.debug(e)
self.error = e
if fail_on_error:
self._log_command_output(out)
raise
out = out.getvalue()
if fail_on_error and self.returncode not in (None, 0):
self._log_command_output(out)
@@ -683,7 +712,7 @@ def __call__(self, *argv, **kwargs):
% (self.returncode, self.command_name, ", ".join("'%s'" % a for a in argv))
)
return out.getvalue()
return out
def _log_command_output(self, out):
if tty.is_verbose():
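Tests can now ask SpackCommand to fork a real spack process instead of invoking the command in-process; per the constructor above, the flag is silently ignored on Windows. A hypothetical test snippet against that interface, assuming a working Spack checkout on sys.path:

    from spack.main import SpackCommand

    # in-process, as before
    install = SpackCommand("install")

    # forked: runs the command via spack.paths.spack_script in a subprocess
    install_sub = SpackCommand("install", subprocess=True)
    out = install_sub("--fake", "zlib", fail_on_error=False)
    print(install_sub.returncode)
    print(out.splitlines()[:3])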

View File

@@ -63,6 +63,7 @@
from spack.util.executable import ProcessError, which
from spack.util.package_hash import package_hash
from spack.util.prefix import Prefix
from spack.util.web import FetchError
from spack.version import GitVersion, Version, VersionBase
if sys.version_info[0] >= 3:
@@ -96,6 +97,9 @@
_spack_configure_argsfile = "spack-configure-args.txt"
is_windows = sys.platform == "win32"
def preferred_version(pkg):
"""
Returns a sorted list of the preferred versions of the package.
@@ -181,6 +185,30 @@ def copy(self):
return other
class WindowsRPathMeta(object):
"""Collection of functionality surrounding Windows RPATH specific features
This is essentially meaningless for all other platforms
due to their use of RPATH. All methods within this class are no-ops on
non Windows. Packages can customize and manipulate this class as
they would a genuine RPATH, i.e. adding directories that contain
runtime library dependencies"""
def add_search_paths(self, *path):
"""Add additional rpaths that are not implicitly included in the search
scheme
"""
self.win_rpath.include_additional_link_paths(*path)
def windows_establish_runtime_linkage(self):
"""Establish RPATH on Windows
Performs symlinking to incorporate rpath dependencies to Windows runtime search paths
"""
if is_windows:
self.win_rpath.establish_link()
#: Registers which are the detectable packages, by repo and package name
#: Need a pass of package repositories to be filled.
detectable_packages = collections.defaultdict(list)
@@ -220,7 +248,7 @@ def to_windows_exe(exe):
plat_exe = []
if hasattr(cls, "executables"):
for exe in cls.executables:
if sys.platform == "win32":
if is_windows:
exe = to_windows_exe(exe)
plat_exe.append(exe)
return plat_exe
@@ -512,7 +540,7 @@ def test_log_pathname(test_stage, spec):
return os.path.join(test_stage, "test-{0}-out.txt".format(TestSuite.test_pkg_id(spec)))
class PackageBase(six.with_metaclass(PackageMeta, PackageViewMixin, object)):
class PackageBase(six.with_metaclass(PackageMeta, WindowsRPathMeta, PackageViewMixin, object)):
"""This is the superclass for all spack packages.
***The Package class***
@@ -752,6 +780,8 @@ def __init__(self, spec):
# Set up timing variables
self._fetch_time = 0.0
self.win_rpath = fsys.WindowsSimulatedRPath(self)
if self.is_extension:
pkg_cls = spack.repo.path.get_pkg_class(self.extendee_spec.name)
pkg_cls(self.extendee_spec)._check_extendable()
@@ -1739,6 +1769,10 @@ def content_hash(self, content=None):
return b32_hash
@property
def cmake_prefix_paths(self):
return [self.prefix]
def _has_make_target(self, target):
"""Checks to see if 'target' is a valid target in a Makefile.
@@ -2749,6 +2783,8 @@ def rpath(self):
deps = self.spec.dependencies(deptype="link")
rpaths.extend(d.prefix.lib for d in deps if os.path.isdir(d.prefix.lib))
rpaths.extend(d.prefix.lib64 for d in deps if os.path.isdir(d.prefix.lib64))
if is_windows:
rpaths.extend(d.prefix.bin for d in deps if os.path.isdir(d.prefix.bin))
return rpaths
@property
@@ -2840,6 +2876,10 @@ def test_process(pkg, kwargs):
print_test_message(logger, "Skipped tests for external package", verbose)
return
if not pkg.spec.installed:
print_test_message(logger, "Skipped not installed package", verbose)
return
# run test methods from the package and all virtuals it
# provides. Virtuals have to be deduped by name
v_names = list(set([vspec.name for vspec in pkg.virtuals_provided]))
@@ -2910,6 +2950,9 @@ def test_process(pkg, kwargs):
# non-pass-only methods
if ran_actual_test_function:
fsys.touch(pkg.tested_file)
# log one more test message to provide a completion timestamp
# for CDash reporting
tty.msg("Completed testing")
else:
print_test_message(logger, "No tests to run", verbose)
@@ -3015,13 +3058,6 @@ def possible_dependencies(*pkg_or_spec, **kwargs):
return visited
class FetchError(spack.error.SpackError):
"""Raised when something goes wrong during fetch."""
def __init__(self, message, long_msg=None):
super(FetchError, self).__init__(message, long_msg)
class PackageStillNeededError(InstallError):
"""Raised when package is still needed by another on uninstall."""

View File

@@ -78,7 +78,9 @@ def lex_word(self, word):
break
if remainder and not remainder_used:
raise LexError("Invalid character", word, word.index(remainder))
msg = "Invalid character, '{0}',".format(remainder[0])
msg += " in '{0}' at index {1}".format(word, word.index(remainder))
raise LexError(msg, word, word.index(remainder))
return tokens

View File

@@ -271,12 +271,13 @@ def to_dict(self):
return data
def from_dict(dictionary):
def from_dict(dictionary, repository=None):
"""Create a patch from json dictionary."""
repository = repository or spack.repo.path
owner = dictionary.get("owner")
if "owner" not in dictionary:
raise ValueError("Invalid patch dictionary: %s" % dictionary)
pkg_cls = spack.repo.path.get_pkg_class(owner)
pkg_cls = repository.get_pkg_class(owner)
if "url" in dictionary:
return UrlPatch(
@@ -329,7 +330,7 @@ class PatchCache(object):
"""
def __init__(self, data=None):
def __init__(self, repository, data=None):
if data is None:
self.index = {}
else:
@@ -337,9 +338,11 @@ def __init__(self, data=None):
raise IndexError("invalid patch index; try `spack clean -m`")
self.index = data["patches"]
self.repository = repository
@classmethod
def from_json(cls, stream):
return PatchCache(sjson.load(stream))
def from_json(cls, stream, repository):
return PatchCache(repository=repository, data=sjson.load(stream))
def to_json(self, stream):
sjson.dump({"patches": self.index}, stream)
@@ -375,7 +378,7 @@ def patch_for_package(self, sha256, pkg):
# because it's the index key)
patch_dict = dict(patch_dict)
patch_dict["sha256"] = sha256
return from_dict(patch_dict)
return from_dict(patch_dict, repository=self.repository)
def update_package(self, pkg_fullname):
# remove this package from any patch entries that reference it.
@@ -397,8 +400,8 @@ def update_package(self, pkg_fullname):
del self.index[sha256]
# update the index with per-package patch indexes
pkg_cls = spack.repo.path.get_pkg_class(pkg_fullname)
partial_index = self._index_patches(pkg_cls)
pkg_cls = self.repository.get_pkg_class(pkg_fullname)
partial_index = self._index_patches(pkg_cls, self.repository)
for sha256, package_to_patch in partial_index.items():
p2p = self.index.setdefault(sha256, {})
p2p.update(package_to_patch)
@@ -410,7 +413,7 @@ def update(self, other):
p2p.update(package_to_patch)
@staticmethod
def _index_patches(pkg_class):
def _index_patches(pkg_class, repository):
index = {}
# Add patches from the class
@@ -425,7 +428,7 @@ def _index_patches(pkg_class):
for cond, dependency in conditions.items():
for pcond, patch_list in dependency.patches.items():
for patch in patch_list:
dspec_cls = spack.repo.path.get_pkg_class(dependency.spec.name)
dspec_cls = repository.get_pkg_class(dependency.spec.name)
patch_dict = patch.to_dict()
patch_dict.pop("sha256") # save some space
index[patch.sha256] = {dspec_cls.fullname: patch_dict}

View File

@@ -129,7 +129,7 @@ def __repr__(self):
class ProviderIndex(_IndexBase):
def __init__(self, specs=None, restrict=False):
def __init__(self, repository, specs=None, restrict=False):
"""Provider index based on a single mapping of providers.
Args:
@@ -143,17 +143,16 @@ def __init__(self, specs=None, restrict=False):
TODO: as possible without overly restricting results, so it is
TODO: not the best name.
"""
if specs is None:
specs = []
self.repository = repository
self.restrict = restrict
self.providers = {}
specs = specs or []
for spec in specs:
if not isinstance(spec, spack.spec.Spec):
spec = spack.spec.Spec(spec)
if spec.virtual:
if self.repository.is_virtual_safe(spec.name):
continue
self.update(spec)
@@ -171,9 +170,10 @@ def update(self, spec):
# Empty specs do not have a package
return
assert not spec.virtual, "cannot update an index using a virtual spec"
msg = "cannot update an index passing the virtual spec '{}'".format(spec.name)
assert not self.repository.is_virtual_safe(spec.name), msg
pkg_provided = spec.package_class.provided
pkg_provided = self.repository.get_pkg_class(spec.name).provided
for provided_spec, provider_specs in six.iteritems(pkg_provided):
for provider_spec in provider_specs:
# TODO: fix this comment.
@@ -262,12 +262,12 @@ def remove_provider(self, pkg_name):
def copy(self):
"""Return a deep copy of this index."""
clone = ProviderIndex()
clone = ProviderIndex(repository=self.repository)
clone.providers = self._transform(lambda vpkg, pset: (vpkg, set((p.copy() for p in pset))))
return clone
@staticmethod
def from_json(stream):
def from_json(stream, repository):
"""Construct a provider index from its JSON representation.
Args:
@@ -281,7 +281,7 @@ def from_json(stream):
if "provider_index" not in data:
raise ProviderIndexError("YAML ProviderIndex does not start with 'provider_index'")
index = ProviderIndex()
index = ProviderIndex(repository=repository)
providers = data["provider_index"]["providers"]
index.providers = _transform(
providers,

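ProviderIndex — like TagIndex and PatchCache in the neighboring hunks — now takes the repository it indexes explicitly instead of reaching for the global spack.repo.path. A sketch of constructing and deserializing one against an explicit repo, assuming a working Spack environment; the file name here is hypothetical:

    import spack.repo
    import spack.provider_index as pi

    index = pi.ProviderIndex(repository=spack.repo.path)  # explicit, no hidden global
    with open("providers-index.json") as f:               # hypothetical cache file
        cached = pi.ProviderIndex.from_json(f, spack.repo.path)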
View File

@@ -11,8 +11,10 @@
import macholib.mach_o
import macholib.MachO
import llnl.util.filesystem as fs
import llnl.util.lang
import llnl.util.tty as tty
from llnl.util.lang import memoized
from llnl.util.symlink import symlink
import spack.bootstrap
@@ -76,15 +78,14 @@ def __init__(self, old_path, new_path):
super(BinaryTextReplaceError, self).__init__(msg, err_msg)
@memoized
def _patchelf():
"""Return the full path to the patchelf binary, if available, else None."""
if is_macos:
return None
patchelf = executable.which("patchelf")
if patchelf is None:
with spack.bootstrap.ensure_bootstrap_configuration():
patchelf = spack.bootstrap.ensure_patchelf_in_path_or_raise()
with spack.bootstrap.ensure_bootstrap_configuration():
patchelf = spack.bootstrap.ensure_patchelf_in_path_or_raise()
return patchelf.path
@@ -887,7 +888,7 @@ def file_is_relocatable(filename, paths_to_relocate=None):
# Remove the RPATHS from the strings in the executable
set_of_strings = set(strings(filename, output=str).split())
m_type, m_subtype = mime_type(filename)
m_type, m_subtype = fs.mime_type(filename)
if m_type == "application":
tty.debug("{0},{1}".format(m_type, m_subtype), level=2)
@@ -923,7 +924,7 @@ def is_binary(filename):
Returns:
True or False
"""
m_type, _ = mime_type(filename)
m_type, _ = fs.mime_type(filename)
msg = "[{0}] -> ".format(filename)
if m_type == "application":
@@ -934,30 +935,6 @@ def is_binary(filename):
return False
@llnl.util.lang.memoized
def _get_mime_type():
file_cmd = executable.which("file")
for arg in ["-b", "-h", "--mime-type"]:
file_cmd.add_default_arg(arg)
return file_cmd
@llnl.util.lang.memoized
def mime_type(filename):
"""Returns the mime type and subtype of a file.
Args:
filename: file to be analyzed
Returns:
Tuple containing the MIME type and subtype
"""
output = _get_mime_type()(filename, output=str, error=str).strip()
tty.debug("==> " + output, level=2)
type, _, subtype = output.partition("/")
return type, subtype
# Memoize this due to repeated calls to libraries in the same directory.
@llnl.util.lang.memoized
def _exists_dir(dirname):
@@ -975,7 +952,7 @@ def fixup_macos_rpath(root, filename):
True if fixups were applied, else False
"""
abspath = os.path.join(root, filename)
if mime_type(abspath) != ("application", "x-mach-binary"):
if fs.mime_type(abspath) != ("application", "x-mach-binary"):
return False
# Get Mach-O header commands

View File

@@ -12,13 +12,16 @@
import itertools
import os
import os.path
import random
import re
import shutil
import stat
import string
import sys
import tempfile
import traceback
import types
import uuid
from typing import Dict # novm
import ruamel.yaml as yaml
@@ -37,6 +40,7 @@
import spack.provider_index
import spack.spec
import spack.tag
import spack.util.file_cache
import spack.util.naming as nm
import spack.util.path
from spack.util.executable import which
@@ -559,6 +563,9 @@ def _create_new_cache(self): # type: () -> Dict[str, os.stat_result]
def last_mtime(self):
return max(sinfo.st_mtime for sinfo in self._packages_to_stats.values())
def modified_since(self, since):
return [name for name, sinfo in self._packages_to_stats.items() if sinfo.st_mtime > since]
def __getitem__(self, item):
return self._packages_to_stats[item]
@@ -573,6 +580,10 @@ def __len__(self):
class Indexer(object):
"""Adaptor for indexes that need to be generated when repos are updated."""
def __init__(self, repository):
self.repository = repository
self.index = None
def create(self):
self.index = self._create()
@@ -613,10 +624,10 @@ class TagIndexer(Indexer):
"""Lifecycle methods for a TagIndex on a Repo."""
def _create(self):
return spack.tag.TagIndex()
return spack.tag.TagIndex(self.repository)
def read(self, stream):
self.index = spack.tag.TagIndex.from_json(stream)
self.index = spack.tag.TagIndex.from_json(stream, self.repository)
def update(self, pkg_fullname):
self.index.update_package(pkg_fullname)
@@ -629,14 +640,17 @@ class ProviderIndexer(Indexer):
"""Lifecycle methods for virtual package providers."""
def _create(self):
return spack.provider_index.ProviderIndex()
return spack.provider_index.ProviderIndex(repository=self.repository)
def read(self, stream):
self.index = spack.provider_index.ProviderIndex.from_json(stream)
self.index = spack.provider_index.ProviderIndex.from_json(stream, self.repository)
def update(self, pkg_fullname):
name = pkg_fullname.split(".")[-1]
if spack.repo.path.is_virtual(name, use_index=False):
is_virtual = (
not self.repository.exists(name) or self.repository.get_pkg_class(name).virtual
)
if is_virtual:
return
self.index.remove_provider(pkg_fullname)
self.index.update(pkg_fullname)
@@ -649,7 +663,7 @@ class PatchIndexer(Indexer):
"""Lifecycle methods for patch cache."""
def _create(self):
return spack.patch.PatchCache()
return spack.patch.PatchCache(repository=self.repository)
def needs_update(self):
# TODO: patches can change under a package and we should handle
@@ -659,7 +673,7 @@ def needs_update(self):
return False
def read(self, stream):
self.index = spack.patch.PatchCache.from_json(stream)
self.index = spack.patch.PatchCache.from_json(stream, repository=self.repository)
def write(self, stream):
self.index.to_json(stream)
@@ -684,7 +698,7 @@ class RepoIndex(object):
"""
def __init__(self, package_checker, namespace):
def __init__(self, package_checker, namespace, cache):
self.checker = package_checker
self.packages_path = self.checker.packages_path
if sys.platform == "win32":
@@ -693,6 +707,7 @@ def __init__(self, package_checker, namespace):
self.indexers = {}
self.indexes = {}
self.cache = cache
def add_indexer(self, name, indexer):
"""Add an indexer to the repo index.
@@ -737,22 +752,26 @@ def _build_index(self, name, indexer):
cache_filename = "{0}/{1}-index.json".format(name, self.namespace)
# Compute which packages needs to be updated in the cache
misc_cache = spack.caches.misc_cache
index_mtime = misc_cache.mtime(cache_filename)
index_mtime = self.cache.mtime(cache_filename)
needs_update = self.checker.modified_since(index_mtime)
needs_update = [x for x, sinfo in self.checker.items() if sinfo.st_mtime > index_mtime]
index_existed = misc_cache.init_entry(cache_filename)
index_existed = self.cache.init_entry(cache_filename)
if index_existed and not needs_update:
# If the index exists and doesn't need an update, read it
with misc_cache.read_transaction(cache_filename) as f:
with self.cache.read_transaction(cache_filename) as f:
indexer.read(f)
else:
# Otherwise update it and rewrite the cache file
with misc_cache.write_transaction(cache_filename) as (old, new):
with self.cache.write_transaction(cache_filename) as (old, new):
indexer.read(old) if old else indexer.create()
# Compute which packages needs to be updated **again** in case someone updated them
# while we waited for the lock
new_index_mtime = self.cache.mtime(cache_filename)
if new_index_mtime != index_mtime:
needs_update = self.checker.modified_since(new_index_mtime)
for pkg_name in needs_update:
namespaced_name = "%s.%s" % (self.namespace, pkg_name)
indexer.update(namespaced_name)
@@ -773,7 +792,8 @@ class RepoPath(object):
repos (list): list Repo objects or paths to put in this RepoPath
"""
def __init__(self, *repos):
def __init__(self, *repos, **kwargs):
cache = kwargs.get("cache", spack.caches.misc_cache)
self.repos = []
self.by_namespace = nm.NamespaceTrie()
@@ -785,7 +805,7 @@ def __init__(self, *repos):
for repo in repos:
try:
if isinstance(repo, six.string_types):
repo = Repo(repo)
repo = Repo(repo, cache=cache)
self.put_last(repo)
except RepoError as e:
tty.warn(
@@ -876,7 +896,7 @@ def all_package_classes(self):
def provider_index(self):
"""Merged ProviderIndex from all Repos in the RepoPath."""
if self._provider_index is None:
self._provider_index = spack.provider_index.ProviderIndex()
self._provider_index = spack.provider_index.ProviderIndex(repository=self)
for repo in reversed(self.repos):
self._provider_index.merge(repo.provider_index)
@@ -886,7 +906,7 @@ def provider_index(self):
def tag_index(self):
"""Merged TagIndex from all Repos in the RepoPath."""
if self._tag_index is None:
self._tag_index = spack.tag.TagIndex()
self._tag_index = spack.tag.TagIndex(repository=self)
for repo in reversed(self.repos):
self._tag_index.merge(repo.tag_index)
@@ -896,7 +916,7 @@ def tag_index(self):
def patch_index(self):
"""Merged PatchIndex from all Repos in the RepoPath."""
if self._patch_index is None:
self._patch_index = spack.patch.PatchCache()
self._patch_index = spack.patch.PatchCache(repository=self)
for repo in reversed(self.repos):
self._patch_index.update(repo.patch_index)
@@ -925,7 +945,6 @@ def repo_for_pkg(self, spec):
"""Given a spec, get the repository for its package."""
# We don't @_autospec this function b/c it's called very frequently
# and we want to avoid parsing str's into Specs unnecessarily.
namespace = None
if isinstance(spec, spack.spec.Spec):
namespace = spec.namespace
name = spec.name
@@ -938,7 +957,7 @@ def repo_for_pkg(self, spec):
if namespace:
fullspace = python_package_for_repo(namespace)
if fullspace not in self.by_namespace:
raise UnknownNamespaceError(namespace)
raise UnknownNamespaceError(namespace, name=name)
return self.by_namespace[fullspace]
# If there's no namespace, search in the RepoPath.
@@ -983,20 +1002,34 @@ def exists(self, pkg_name):
"""
return any(repo.exists(pkg_name) for repo in self.repos)
def is_virtual(self, pkg_name, use_index=True):
"""True if the package with this name is virtual, False otherwise.
Set `use_index` False when calling from a code block that could
be run during the computation of the provider index."""
def _have_name(self, pkg_name):
have_name = pkg_name is not None
if have_name and not isinstance(pkg_name, str):
raise ValueError("is_virtual(): expected package name, got %s" % type(pkg_name))
if use_index:
return have_name and pkg_name in self.provider_index
else:
return have_name and (
not self.exists(pkg_name) or self.get_pkg_class(pkg_name).virtual
)
return have_name
def is_virtual(self, pkg_name):
"""Return True if the package with this name is virtual, False otherwise.
This function uses the provider index. If calling from a code block that
is used to construct the provider index, use the ``is_virtual_safe`` function.
Args:
pkg_name (str): name of the package we want to check
"""
have_name = self._have_name(pkg_name)
return have_name and pkg_name in self.provider_index
def is_virtual_safe(self, pkg_name):
"""Return True if the package with this name is virtual, False otherwise.
This function doesn't use the provider index.
Args:
pkg_name (str): name of the package we want to check
"""
have_name = self._have_name(pkg_name)
return have_name and (not self.exists(pkg_name) or self.get_pkg_class(pkg_name).virtual)
def __contains__(self, pkg_name):
return self.exists(pkg_name)
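RepoPath.is_virtual loses its use_index flag; the index-free variant becomes the separate is_virtual_safe, which code that runs while the provider index is being built should call instead. A sketch, assuming a working Spack environment:

    import spack.repo

    print(spack.repo.path.is_virtual("mpi"))       # consults the provider index
    print(spack.repo.path.is_virtual_safe("mpi"))  # no index; safe during index construction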
@@ -1015,7 +1048,7 @@ class Repo(object):
"""
def __init__(self, root):
def __init__(self, root, cache=None):
"""Instantiate a package repository from a filesystem path.
Args:
@@ -1070,6 +1103,7 @@ def check(condition, msg):
# Indexes for this repository, computed lazily
self._repo_index = None
self._cache = cache or spack.caches.misc_cache
def real_name(self, import_name):
"""Allow users to import Spack packages using Python identifiers.
@@ -1181,10 +1215,10 @@ def purge(self):
def index(self):
"""Construct the index for this repo lazily."""
if self._repo_index is None:
self._repo_index = RepoIndex(self._pkg_checker, self.namespace)
self._repo_index.add_indexer("providers", ProviderIndexer())
self._repo_index.add_indexer("tags", TagIndexer())
self._repo_index.add_indexer("patches", PatchIndexer())
self._repo_index = RepoIndex(self._pkg_checker, self.namespace, cache=self._cache)
self._repo_index.add_indexer("providers", ProviderIndexer(self))
self._repo_index.add_indexer("tags", TagIndexer(self))
self._repo_index.add_indexer("patches", PatchIndexer(self))
return self._repo_index
@property
@@ -1283,9 +1317,26 @@ def last_mtime(self):
return self._pkg_checker.last_mtime()
def is_virtual(self, pkg_name):
"""True if the package with this name is virtual, False otherwise."""
"""Return True if the package with this name is virtual, False otherwise.
This function uses the provider index. If calling from a code block that
is used to construct the provider index, use the ``is_virtual_safe`` function.
Args:
pkg_name (str): name of the package we want to check
"""
return pkg_name in self.provider_index
def is_virtual_safe(self, pkg_name):
"""Return True if the package with this name is virtual, False otherwise.
This function doesn't use the provider index.
Args:
pkg_name (str): name of the package we want to check
"""
return not self.exists(pkg_name) or self.get_pkg_class(pkg_name).virtual
def get_pkg_class(self, pkg_name):
"""Get the class for the package out of its module.
@@ -1384,9 +1435,19 @@ def create_or_construct(path, namespace=None):
return Repo(path)
def _path(repo_dirs=None):
def _path(configuration=None):
"""Get the singleton RepoPath instance for Spack."""
repo_dirs = repo_dirs or spack.config.get("repos")
configuration = configuration or spack.config.config
return create(configuration=configuration)
def create(configuration):
"""Create a RepoPath from a configuration object.
Args:
configuration (spack.config.Configuration): configuration object
"""
repo_dirs = configuration.get("repos")
if not repo_dirs:
raise NoRepoConfiguredError("Spack configuration contains no package repositories.")
return RepoPath(*repo_dirs)
@@ -1396,7 +1457,8 @@ def _path(repo_dirs=None):
path = llnl.util.lang.Singleton(_path)
# Add the finder to sys.meta_path
sys.meta_path.append(ReposFinder())
REPOS_FINDER = ReposFinder()
sys.meta_path.append(REPOS_FINDER)
def all_package_names(include_virtuals=False):
@@ -1405,36 +1467,67 @@ def all_package_names(include_virtuals=False):
@contextlib.contextmanager
def additional_repository(repository):
"""Adds temporarily a repository to the default one.
Args:
repository: repository to be added
"""
path.put_first(repository)
yield
path.remove(repository)
@contextlib.contextmanager
def use_repositories(*paths_and_repos):
def use_repositories(*paths_and_repos, **kwargs):
"""Use the repositories passed as arguments within the context manager.
Args:
*paths_and_repos: paths to the repositories to be used, or
already constructed Repo objects
override (bool): if True use only the repositories passed as input,
if False add them to the top of the list of current repositories.
Returns:
Corresponding RepoPath object
"""
global path
path, saved = RepoPath(*paths_and_repos), path
# TODO (Python 2.7): remove this kwargs on deprecation of Python 2.7 support
override = kwargs.get("override", True)
paths = [getattr(x, "root", x) for x in paths_and_repos]
scope_name = "use-repo-{}".format(uuid.uuid4())
repos_key = "repos:" if override else "repos"
spack.config.config.push_scope(
spack.config.InternalConfigScope(name=scope_name, data={repos_key: paths})
)
path, saved = create(configuration=spack.config.config), path
try:
yield path
finally:
spack.config.config.remove_scope(scope_name=scope_name)
path = saved
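use_repositories now goes through the configuration — pushing a throwaway repos: scope — rather than swapping the global RepoPath directly, and grows an override flag: True replaces the repository list, False prepends to it. A usage sketch; the path is made up:

    import spack.repo

    with spack.repo.use_repositories("/tmp/my-extra-repo", override=False) as repo_path:
        # my-extra-repo is searched first; the existing repos still apply
        print(repo_path.exists("zlib"))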
class MockRepositoryBuilder(object):
"""Build a mock repository in a directory"""
def __init__(self, root_directory, namespace=None):
namespace = namespace or "".join(random.choice(string.ascii_uppercase) for _ in range(10))
self.root, self.namespace = create_repo(str(root_directory), namespace)
def add_package(self, name, dependencies=None):
"""Create a mock package in the repository, using a Jinja2 template.
Args:
name (str): name of the new package
dependencies (list): list of ("dep_spec", "dep_type", "condition") tuples.
Both "dep_type" and "condition" can default to ``None`` in which case
``spack.dependency.default_deptype`` and ``spack.spec.Spec()`` are used.
"""
dependencies = dependencies or []
context = {"cls_name": spack.util.naming.mod_to_class(name), "dependencies": dependencies}
template = spack.tengine.make_environment().get_template("mock-repository/package.pyt")
text = template.render(context)
package_py = self.recipe_filename(name)
fs.mkdirp(os.path.dirname(package_py))
with open(package_py, "w") as f:
f.write(text)
def remove(self, name):
package_py = self.recipe_filename(name)
shutil.rmtree(os.path.dirname(package_py))
def recipe_filename(self, name):
return os.path.join(self.root, "packages", name, "package.py")
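MockRepositoryBuilder gives tests a throwaway repository whose packages are rendered from a Jinja2 template. A sketch of the intended use — the directory and package names are made up:

    import spack.repo

    builder = spack.repo.MockRepositoryBuilder("/tmp/mock-repo")
    builder.add_package("pkg-a")
    builder.add_package("pkg-b", dependencies=[("pkg-a", None, None)])
    print(builder.recipe_filename("pkg-b"))   # .../packages/pkg-b/package.py
    builder.remove("pkg-a")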
class RepoError(spack.error.SpackError):
"""Superclass for repository-related errors."""
@@ -1463,7 +1556,7 @@ class UnknownPackageError(UnknownEntityError):
"""Raised when we encounter a package spack doesn't have."""
def __init__(self, name, repo=None):
msg = None
msg = "Attempting to retrieve anonymous package."
long_msg = None
if name:
if repo:
@@ -1480,8 +1573,6 @@ def __init__(self, name, repo=None):
long_msg = long_msg.format(name)
else:
long_msg = "You may need to run 'spack clean -m'."
else:
msg = "Attempting to retrieve anonymous package."
super(UnknownPackageError, self).__init__(msg, long_msg)
self.name = name
@@ -1490,8 +1581,12 @@ def __init__(self, name, repo=None):
class UnknownNamespaceError(UnknownEntityError):
"""Raised when we encounter an unknown namespace"""
def __init__(self, namespace):
super(UnknownNamespaceError, self).__init__("Unknown namespace: %s" % namespace)
def __init__(self, namespace, name=None):
msg, long_msg = "Unknown namespace: {}".format(namespace), None
if name == "yaml":
long_msg = "Did you mean to specify a filename with './{}.{}'?"
long_msg = long_msg.format(namespace, name)
super(UnknownNamespaceError, self).__init__(msg, long_msg)
class FailedConstructorError(RepoError):

View File

@@ -245,6 +245,7 @@ def __init__(self, cls, function, format_name, args):
self.cls = cls
self.function = function
self.filename = None
self.ctest_parsing = getattr(args, "ctest_parsing", False)
if args.cdash_upload_url:
self.format_name = "cdash"
self.filename = "cdash_report"
@@ -271,10 +272,10 @@ def __enter__(self):
def __exit__(self, exc_type, exc_val, exc_tb):
if self.format_name:
# Close the collector and restore the
# original PackageInstaller._install_task
# Close the collector and restore the original function
self.collector.__exit__(exc_type, exc_val, exc_tb)
report_data = {"specs": self.collector.specs}
report_data["ctest-parsing"] = self.ctest_parsing
report_fn = getattr(self.report_writer, "%s_report" % self.type)
report_fn(self.filename, report_data)

View File

@@ -23,8 +23,10 @@
import spack.build_environment
import spack.fetch_strategy
import spack.package_base
import spack.platforms
from spack.error import SpackError
from spack.reporter import Reporter
from spack.reporters.extract import extract_test_parts
from spack.util.crypto import checksum
from spack.util.executable import which
from spack.util.log_parse import parse_log_events
@@ -46,6 +48,11 @@
cdash_phases.add("update")
def build_stamp(track, timestamp):
buildstamp_format = "%Y%m%d-%H%M-{0}".format(track)
return time.strftime(buildstamp_format, time.localtime(timestamp))
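build_stamp factors the CDash buildstamp formatting out of the reporter constructor; the stamp is the local time followed by the track name. Using the helper above:

    import time
    print(build_stamp("Experimental", time.time()))
    # -> something like 20221018-1327-Experimental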
class CDash(Reporter):
"""Generate reports of spec installations for CDash.
@@ -80,6 +87,9 @@ def __init__(self, args):
packages = args.spec
elif getattr(args, "specs", ""):
packages = args.specs
elif getattr(args, "package", ""):
# Ensure CI 'spack test run' can output CDash results
packages = args.package
else:
packages = []
for file in args.specfiles:
@@ -90,29 +100,36 @@ def __init__(self, args):
self.base_buildname = args.cdash_build or self.install_command
self.site = args.cdash_site or socket.gethostname()
self.osname = platform.system()
self.osrelease = platform.release()
self.target = spack.platforms.host().target("default_target")
self.endtime = int(time.time())
if args.cdash_buildstamp:
self.buildstamp = args.cdash_buildstamp
else:
buildstamp_format = "%Y%m%d-%H%M-{0}".format(args.cdash_track)
self.buildstamp = time.strftime(buildstamp_format, time.localtime(self.endtime))
self.buildstamp = (
args.cdash_buildstamp
if args.cdash_buildstamp
else build_stamp(args.cdash_track, self.endtime)
)
self.buildIds = collections.OrderedDict()
self.revision = ""
git = which("git")
with working_dir(spack.paths.spack_root):
self.revision = git("rev-parse", "HEAD", output=str).strip()
self.generator = "spack-{0}".format(spack.main.get_version())
self.multiple_packages = False
def report_build_name(self, pkg_name):
return (
"{0} - {1}".format(self.base_buildname, pkg_name)
if self.multiple_packages
else self.base_buildname
)
def build_report_for_package(self, directory_name, package, duration):
if "stdout" not in package:
# Skip reporting on packages that did not generate any output.
return
self.current_package_name = package["name"]
if self.multiple_packages:
self.buildname = "{0} - {1}".format(self.base_buildname, package["name"])
else:
self.buildname = self.base_buildname
self.buildname = self.report_build_name(self.current_package_name)
report_data = self.initialize_report(directory_name)
for phase in cdash_phases:
report_data[phase] = {}
@@ -228,6 +245,7 @@ def build_report(self, directory_name, input_data):
# Do an initial scan to determine if we are generating reports for more
# than one package. When we're only reporting on a single package we
# do not explicitly include the package's name in the CDash build name.
self.multiple_packages = False
num_packages = 0
for spec in input_data["specs"]:
# Do not generate reports for packages that were installed
@@ -255,27 +273,19 @@ def build_report(self, directory_name, input_data):
self.build_report_for_package(directory_name, package, duration)
self.finalize_report()
def test_report_for_package(self, directory_name, package, duration):
if "stdout" not in package:
# Skip reporting on packages that did not generate any output.
return
def extract_ctest_test_data(self, package, phases, report_data):
"""Extract ctest test data for the package."""
# Track the phases we perform so we know what reports to create.
# We always report the update step because this is how we tell CDash
# what revision of Spack we are using.
assert "update" in phases
self.current_package_name = package["name"]
self.buildname = "{0} - {1}".format(self.base_buildname, package["name"])
report_data = self.initialize_report(directory_name)
for phase in ("test", "update"):
for phase in phases:
report_data[phase] = {}
report_data[phase]["loglines"] = []
report_data[phase]["status"] = 0
report_data[phase]["endtime"] = self.endtime
# Track the phases we perform so we know what reports to create.
# We always report the update step because this is how we tell CDash
# what revision of Spack we are using.
phases_encountered = ["test", "update"]
# Generate a report for this package.
# The first line just says "Testing package name-hash"
report_data["test"]["loglines"].append(
@@ -284,8 +294,7 @@ def test_report_for_package(self, directory_name, package, duration):
for line in package["stdout"].splitlines()[1:]:
report_data["test"]["loglines"].append(xml.sax.saxutils.escape(line))
self.starttime = self.endtime - duration
for phase in phases_encountered:
for phase in phases:
report_data[phase]["starttime"] = self.starttime
report_data[phase]["log"] = "\n".join(report_data[phase]["loglines"])
errors, warnings = parse_log_events(report_data[phase]["loglines"])
@@ -326,6 +335,19 @@ def clean_log_event(event):
if phase == "update":
report_data[phase]["revision"] = self.revision
def extract_standalone_test_data(self, package, phases, report_data):
"""Extract stand-alone test outputs for the package."""
testing = {}
report_data["testing"] = testing
testing["starttime"] = self.starttime
testing["endtime"] = self.starttime
testing["generator"] = self.generator
testing["parts"] = extract_test_parts(package["name"], package["stdout"].splitlines())
def report_test_data(self, directory_name, package, phases, report_data):
"""Generate and upload the test report(s) for the package."""
for phase in phases:
# Write the report.
report_name = phase.capitalize() + ".xml"
report_file_name = package["name"] + "_" + report_name
@@ -333,7 +355,7 @@ def clean_log_event(event):
with codecs.open(phase_report, "w", "utf-8") as f:
env = spack.tengine.make_environment()
if phase != "update":
if phase not in ["update", "testing"]:
# Update.xml stores site information differently
# than the rest of the CTest XML files.
site_template = posixpath.join(self.template_dir, "Site.xml")
@@ -343,18 +365,65 @@ def clean_log_event(event):
phase_template = posixpath.join(self.template_dir, report_name)
t = env.get_template(phase_template)
f.write(t.render(report_data))
tty.debug("Preparing to upload {0}".format(phase_report))
self.upload(phase_report)
def test_report_for_package(self, directory_name, package, duration, ctest_parsing=False):
if "stdout" not in package:
# Skip reporting on packages that did not generate any output.
tty.debug("Skipping report for {0}: No generated output".format(package["name"]))
return
self.current_package_name = package["name"]
if self.base_buildname == self.install_command:
# The package list is NOT all that helpful in this case
self.buildname = "{0}-{1}".format(self.current_package_name, package["id"])
else:
self.buildname = self.report_build_name(self.current_package_name)
self.starttime = self.endtime - duration
report_data = self.initialize_report(directory_name)
report_data["hostname"] = socket.gethostname()
if ctest_parsing:
phases = ["test", "update"]
self.extract_ctest_test_data(package, phases, report_data)
else:
phases = ["testing"]
self.extract_standalone_test_data(package, phases, report_data)
self.report_test_data(directory_name, package, phases, report_data)
def test_report(self, directory_name, input_data):
# Generate reports for each package in each spec.
"""Generate reports for each package in each spec."""
tty.debug("Processing test report")
for spec in input_data["specs"]:
duration = 0
if "time" in spec:
duration = int(spec["time"])
for package in spec["packages"]:
self.test_report_for_package(directory_name, package, duration)
self.test_report_for_package(
directory_name,
package,
duration,
input_data["ctest-parsing"],
)
self.finalize_report()
def test_skipped_report(self, directory_name, spec, reason=None):
output = "Skipped {0} package".format(spec.name)
if reason:
output += "\n{0}".format(reason)
package = {
"name": spec.name,
"id": spec.dag_hash(),
"result": "skipped",
"stdout": output,
}
self.test_report_for_package(directory_name, package, duration=0.0, ctest_parsing=False)
def concretization_report(self, directory_name, msg):
self.buildname = self.base_buildname
report_data = self.initialize_report(directory_name)
@@ -384,12 +453,16 @@ def initialize_report(self, directory_name):
report_data["buildname"] = self.buildname
report_data["buildstamp"] = self.buildstamp
report_data["install_command"] = self.install_command
report_data["generator"] = self.generator
report_data["osname"] = self.osname
report_data["osrelease"] = self.osrelease
report_data["site"] = self.site
report_data["target"] = self.target
return report_data
def upload(self, filename):
if not self.cdash_upload_url:
print("Cannot upload {0} due to missing upload url".format(filename))
return
# Compute md5 checksum for the contents of this file.
@@ -412,7 +485,7 @@ def upload(self, filename):
request.add_header("Authorization", "Bearer {0}".format(self.authtoken))
try:
# By default, urllib2 only supports GET and POST.
# CDash needs expects this file to be uploaded via PUT.
# CDash expects this file to be uploaded via PUT.
request.get_method = lambda: "PUT"
response = opener.open(request)
if self.current_package_name not in self.buildIds:
@@ -428,13 +501,13 @@ def upload(self, filename):
def finalize_report(self):
if self.buildIds:
print("View your build results here:")
tty.msg("View your build results here:")
for package_name, buildid in iteritems(self.buildIds):
# Construct and display a helpful link if CDash responded with
# a buildId.
build_url = self.cdash_upload_url
build_url = build_url[0 : build_url.find("submit.php")]
build_url += "buildSummary.php?buildid={0}".format(buildid)
print("{0}: {1}".format(package_name, build_url))
tty.msg("{0}: {1}".format(package_name, build_url))
if not self.success:
raise SpackError("Errors encountered, see above for more details")
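For reference, a standalone sketch (hypothetical URL and buildid, not part of this module) of how the buildSummary link above is derived from the configured upload URL:
upload_url = "https://cdash.example.com/submit.php?project=Spack"  # assumed value
buildid = 42  # assumed CDash response
build_url = upload_url[0 : upload_url.find("submit.php")]
build_url += "buildSummary.php?buildid={0}".format(buildid)
# -> "https://cdash.example.com/buildSummary.php?buildid=42"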


@@ -0,0 +1,212 @@
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import re
import xml.sax.saxutils
from datetime import datetime
import llnl.util.tty as tty
# The keys here represent the only recognized (ctest/cdash) status values
completed = {
"failed": "Completed",
"passed": "Completed",
"notrun": "No tests to run",
}
log_regexp = re.compile(r"^==> \[([0-9:.\-]*)(?:, [0-9]*)?\] (.*)")
returns_regexp = re.compile(r"\[([0-9 ,]*)\]")
skip_msgs = ["Testing package", "Results for", "Detected the following"]
skip_regexps = [re.compile(r"{0}".format(msg)) for msg in skip_msgs]
status_values = ["FAILED", "PASSED", "NO-TESTS"]
status_regexps = [re.compile(r"^({0})".format(stat)) for stat in status_values]
def add_part_output(part, line):
if part:
part["loglines"].append(xml.sax.saxutils.escape(line))
def elapsed(current, previous):
if not (current and previous):
return 0
diff = current - previous
tty.debug("elapsed = %s - %s = %s" % (current, previous, diff))
return diff.total_seconds()
def expected_failure(line):
if not line:
return False
match = returns_regexp.search(line)
xfail = "0" not in match.group(0) if match else False
return xfail
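# Worked example for the check above (hypothetical description strings):
#   expected_failure("Expected to return [1, 3]") -> True  ("0" not among returns)
#   expected_failure("Expected to return [0]")    -> False ("0" signals success)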
def new_part():
return {
"command": None,
"completed": "Unknown",
"desc": None,
"elapsed": None,
"name": None,
"loglines": [],
"output": None,
"status": "passed",
}
def part_name(source):
# TODO: Should be passed the package prefix and only remove it
elements = []
for e in source.replace("'", "").split(" "):
elements.append(os.path.basename(e) if os.sep in e else e)
return "_".join(elements)
def process_part_end(part, curr_time, last_time):
if part:
if not part["elapsed"]:
part["elapsed"] = elapsed(curr_time, last_time)
stat = part["status"]
if stat in completed:
if stat == "passed" and expected_failure(part["desc"]):
part["completed"] = "Expected to fail"
elif part["completed"] == "Unknown":
part["completed"] = completed[stat]
part["output"] = "\n".join(part["loglines"])
def timestamp(time_string):
return datetime.strptime(time_string, "%Y-%m-%d-%H:%M:%S.%f")
def skip(line):
for regex in skip_regexps:
match = regex.search(line)
if match:
return match
def status(line):
for regex in status_regexps:
match = regex.search(line)
if match:
stat = match.group(0)
stat = "notrun" if stat == "NO-TESTS" else stat
return stat.lower()
def extract_test_parts(default_name, outputs):
parts = []
part = {}
testdesc = ""
last_time = None
curr_time = None
for line in outputs:
line = line.strip()
if not line:
add_part_output(part, line)
continue
if skip(line):
continue
# Skipped tests start with "Skipped" and end with "package"
if line.startswith("Skipped") and line.endswith("package"):
part = new_part()
part["command"] = "Not Applicable"
part["completed"] = line
part["elapsed"] = 0.0
part["name"] = default_name
part["status"] = "notrun"
parts.append(part)
continue
# Process Spack log messages
if line.find("==>") != -1:
match = log_regexp.search(line)
if match:
curr_time = timestamp(match.group(1))
msg = match.group(2)
# Skip logged message for caching build-time data
if msg.startswith("Installing"):
continue
# New command means the start of a new test part
if msg.startswith("'") and msg.endswith("'"):
# Update the last part processed
process_part_end(part, curr_time, last_time)
part = new_part()
part["command"] = msg
part["name"] = part_name(msg)
parts.append(part)
# Save off the optional test description if it was
# logged *prior to* the command, then reset it
if testdesc:
part["desc"] = testdesc
testdesc = ""
else:
# Update the last part processed since a new log message
# means a non-test action
process_part_end(part, curr_time, last_time)
if testdesc:
# We had a test description but no command so treat
# as a new part (e.g., some import tests)
part = new_part()
part["name"] = "_".join(testdesc.split())
part["command"] = "unknown"
part["desc"] = testdesc
parts.append(part)
process_part_end(part, curr_time, curr_time)
# Assuming this is a description for the next test part
testdesc = msg
else:
tty.debug("Did not recognize test output '{0}'".format(line))
# Each log message potentially represents a new test part so
# save off the last timestamp
last_time = curr_time
continue
# Check for status values
stat = status(line)
if stat:
if part:
part["status"] = stat
add_part_output(part, line)
else:
tty.warn("No part to add status from '{0}'".format(line))
continue
add_part_output(part, line)
# Process the last lingering part IF it didn't generate status
process_part_end(part, curr_time, last_time)
# If no parts, create a skeleton to flag that the tests are not run
if not parts:
part = new_part()
stat = "notrun"
part["command"] = "Not Applicable"
part["completed"] = completed[stat]
part["elapsed"] = 0.0
part["name"] = default_name
part["status"] = stat
parts.append(part)
return parts
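As a rough illustration of the parser above, a minimal sketch with hypothetical log lines (the timestamp and command values are made up) that yields a single passed part:
sample = [
    "==> [2022-10-18-13:27:59.000000] 'mpirun' '-n' '1' './hello'",
    "Hello, world!",
    "PASSED",
]
for part in extract_test_parts("demo-package", sample):
    print(part["name"], part["status"], part["completed"])
# -> mpirun_-n_1_hello passed Completed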


@@ -3,11 +3,10 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os.path
import posixpath
import spack.build_environment
import spack.fetch_strategy
import spack.package_base
import spack.tengine
from spack.reporter import Reporter
__all__ = ["JUnit"]
@@ -23,6 +22,11 @@ def __init__(self, args):
self.template_file = posixpath.join("reports", "junit.xml")
def build_report(self, filename, report_data):
if not (os.path.splitext(filename))[1]:
# Ensure the report name will end with the proper extension;
# otherwise, it currently defaults to the "directory" name.
filename = filename + ".xml"
# Write the report
with open(filename, "w") as f:
env = spack.tengine.make_environment()
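# For reference, the stdlib behavior relied on above:
#   os.path.splitext("report")     -> ("report", "")      # no extension, so ".xml" is appended
#   os.path.splitext("report.xml") -> ("report", ".xml")  # left as-is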


@@ -52,6 +52,15 @@
"properties": runner_attributes_schema_items,
}
remove_attributes_schema = {
"type": "object",
"additionalProperties": False,
"required": ["tags"],
"properties": {
"tags": {"type": "array", "items": {"type": "string"}},
},
}
core_shared_properties = union_dicts(
runner_attributes_schema_items,
@@ -80,6 +89,7 @@
],
},
},
"match_behavior": {"type": "string", "enum": ["first", "merge"], "default": "first"},
"mappings": {
"type": "array",
"items": {
@@ -93,6 +103,7 @@
"type": "string",
},
},
"remove-attributes": remove_attributes_schema,
"runner-attributes": runner_selector_schema,
},
},
@@ -101,6 +112,12 @@
"signing-job-attributes": runner_selector_schema,
"rebuild-index": {"type": "boolean"},
"broken-specs-url": {"type": "string"},
"broken-tests-packages": {
"type": "array",
"items": {
"type": "string",
},
},
},
)
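A hypothetical mapping entry (sketch only; key names follow the schema fragments above) that the new remove-attributes and match_behavior additions would accept:
example_mapping = {
    "match": ["%gcc@9"],  # hypothetical selector
    "remove-attributes": {"tags": ["avx512"]},  # schema allows only a "tags" list here
    "runner-attributes": {"tags": ["docker", "x86_64"]},
}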


@@ -47,6 +47,7 @@
import spack.repo
import spack.spec
import spack.store
import spack.util.path
import spack.util.timer
import spack.variant
import spack.version
@@ -307,7 +308,10 @@ def check_same_flags(flag_dict_1, flag_dict_2):
for t in types:
values1 = set(flag_dict_1.get(t, []))
values2 = set(flag_dict_2.get(t, []))
assert values1 == values2
error_msg = "Internal Error: A mismatch in flags has occurred:"
error_msg += "\n\tvalues1: {v1}\n\tvalues2: {v2}".format(v1=values1, v2=values2)
error_msg += "\n Please report this as an issue to the spack maintainers"
assert values1 == values2, error_msg
def check_packages_exist(specs):
@@ -363,7 +367,11 @@ def format_core(self, core):
Modeled after traceback.format_stack.
"""
assert self.control
error_msg = (
"Internal Error: ASP Result.control not populated. Please report to the spack"
" maintainers"
)
assert self.control, error_msg
symbols = dict((a.literal, a.symbol) for a in self.control.symbolic_atoms)
@@ -382,7 +390,11 @@ def minimize_core(self, core):
ensure unsatisfiability. This algorithm reduces the core to only those
essential facts.
"""
assert self.control
error_msg = (
"Internal Error: ASP Result.control not populated. Please report to the spack"
" maintainers"
)
assert self.control, error_msg
min_core = core[:]
for fact in core:
@@ -821,7 +833,8 @@ def key_fn(version):
def spec_versions(self, spec):
"""Return list of clauses expressing spec's version constraints."""
spec = specify(spec)
assert spec.name
msg = "Internal Error: spec with no name occured. Please report to the spack maintainers."
assert spec.name, msg
if spec.concrete:
return [fn.version(spec.name, spec.version)]
@@ -930,7 +943,16 @@ def package_compiler_defaults(self, pkg):
def package_requirement_rules(self, pkg):
pkg_name = pkg.name
config = spack.config.get("packages")
requirements = config.get(pkg_name, {}).get("require", [])
requirements = config.get(pkg_name, {}).get("require", []) or config.get("all", {}).get(
"require", []
)
rules = self._rules_from_requirements(pkg_name, requirements)
self.emit_facts_from_requirement_rules(rules, virtual=False)
def _rules_from_requirements(self, pkg_name, requirements):
"""Manipulate requirements from packages.yaml, and return a list of tuples
with a uniform structure (name, policy, requirements).
"""
if isinstance(requirements, string_types):
rules = [(pkg_name, "one_of", [requirements])]
else:
@@ -939,17 +961,7 @@ def package_requirement_rules(self, pkg):
for policy in ("one_of", "any_of"):
if policy in requirement:
rules.append((pkg_name, policy, requirement[policy]))
for requirement_grp_id, (pkg_name, policy, requirement_grp) in enumerate(rules):
self.gen.fact(fn.requirement_group(pkg_name, requirement_grp_id))
self.gen.fact(fn.requirement_policy(pkg_name, requirement_grp_id, policy))
for requirement_weight, spec_str in enumerate(requirement_grp):
spec = spack.spec.Spec(spec_str)
if not spec.name:
spec.name = pkg_name
member_id = self.condition(spec, imposed_spec=spec, name=pkg_name)
self.gen.fact(fn.requirement_group_member(member_id, pkg_name, requirement_grp_id))
self.gen.fact(fn.requirement_has_weight(member_id, requirement_weight))
return rules
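A standalone sketch (plain Python with hypothetical inputs, not the solver class itself) of the normalization this method performs:
def normalize_requirements(pkg_name, requirements):
    # A bare string becomes a single "one_of" rule; a list of dicts yields
    # one (name, policy, group) tuple per "one_of"/"any_of" entry.
    if isinstance(requirements, str):
        return [(pkg_name, "one_of", [requirements])]
    rules = []
    for requirement in requirements:
        for policy in ("one_of", "any_of"):
            if policy in requirement:
                rules.append((pkg_name, policy, requirement[policy]))
    return rules

print(normalize_requirements("mpi", "mpich"))  # [('mpi', 'one_of', ['mpich'])]
print(normalize_requirements("foo", [{"any_of": ["gcc", "clang"]}]))  # [('foo', 'any_of', ['gcc', 'clang'])]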
def pkg_rules(self, pkg, tests):
pkg = packagize(pkg)
@@ -1043,7 +1055,7 @@ def pkg_rules(self, pkg, tests):
self.package_requirement_rules(pkg)
def condition(self, required_spec, imposed_spec=None, name=None, msg=None):
def condition(self, required_spec, imposed_spec=None, name=None, msg=None, node=False):
"""Generate facts for a dependency or virtual provider condition.
Arguments:
@@ -1053,6 +1065,8 @@ def condition(self, required_spec, imposed_spec=None, name=None, msg=None):
name (str or None): name for `required_spec` (required if
required_spec is anonymous, ignored if not)
msg (str or None): description of the condition
node (bool): if False, does not emit "node" or "virtual_node" requirements
from the imposed spec
Returns:
int: id of the condition created by this function
"""
@@ -1069,7 +1083,7 @@ def condition(self, required_spec, imposed_spec=None, name=None, msg=None):
self.gen.fact(fn.condition_requirement(condition_id, pred.name, *pred.args))
if imposed_spec:
self.impose(condition_id, imposed_spec, node=False, name=name)
self.impose(condition_id, imposed_spec, node=node, name=name)
return condition_id
@@ -1137,12 +1151,47 @@ def virtual_preferences(self, pkg_name, func):
def provider_defaults(self):
self.gen.h2("Default virtual providers")
assert self.possible_virtuals is not None
msg = (
"Internal Error: possible_virtuals is not populated. Please report to the spack"
" maintainers"
)
assert self.possible_virtuals is not None, msg
self.virtual_preferences(
"all",
lambda v, p, i: self.gen.fact(fn.default_provider_preference(v, p, i)),
)
def provider_requirements(self):
self.gen.h2("Requirements on virtual providers")
msg = (
"Internal Error: possible_virtuals is not populated. Please report to the spack"
" maintainers"
)
packages_yaml = spack.config.config.get("packages")
assert self.possible_virtuals is not None, msg
for virtual_str in sorted(self.possible_virtuals):
requirements = packages_yaml.get(virtual_str, {}).get("require", [])
rules = self._rules_from_requirements(virtual_str, requirements)
self.emit_facts_from_requirement_rules(rules, virtual=True)
def emit_facts_from_requirement_rules(self, rules, virtual=False):
"""Generate facts to enforce requirements from packages.yaml."""
for requirement_grp_id, (pkg_name, policy, requirement_grp) in enumerate(rules):
self.gen.fact(fn.requirement_group(pkg_name, requirement_grp_id))
self.gen.fact(fn.requirement_policy(pkg_name, requirement_grp_id, policy))
for requirement_weight, spec_str in enumerate(requirement_grp):
spec = spack.spec.Spec(spec_str)
if not spec.name:
spec.name = pkg_name
when_spec = spec
if virtual:
when_spec = spack.spec.Spec(pkg_name)
member_id = self.condition(
required_spec=when_spec, imposed_spec=spec, name=pkg_name, node=virtual
)
self.gen.fact(fn.requirement_group_member(member_id, pkg_name, requirement_grp_id))
self.gen.fact(fn.requirement_has_weight(member_id, requirement_weight))
def external_packages(self):
"""Facts on external packages, as read from packages.yaml"""
# Read packages.yaml and normalize it, so that it
@@ -1162,10 +1211,11 @@ def external_packages(self):
self.gen.h2("External package: {0}".format(pkg_name))
# Check if the external package is buildable. If it is
# not then "external(<pkg>)" is a fact.
# not then "external(<pkg>)" is a fact, unless we can
# reuse an already installed spec.
external_buildable = data.get("buildable", True)
if not external_buildable:
self.gen.fact(fn.external_only(pkg_name))
self.gen.fact(fn.buildable_false(pkg_name))
# Read a list of all the specs for this package
externals = data.get("externals", [])
@@ -1396,6 +1446,9 @@ class Body(object):
# dependencies
if spec.concrete:
# older specs do not have package hashes, so we have to do this carefully
if getattr(spec, "_package_hash", None):
clauses.append(fn.package_hash(spec.name, spec._package_hash))
clauses.append(fn.hash(spec.name, spec.dag_hash()))
# add all clauses from dependencies
@@ -1467,8 +1520,10 @@ def key_fn(item):
# specs will be computed later
version_preferences = packages_yaml.get(pkg_name, {}).get("version", [])
for idx, v in enumerate(version_preferences):
# v can be a string so force it into an actual version for comparisons
ver = spack.version.Version(v)
self.declared_versions[pkg_name].append(
DeclaredVersion(version=v, idx=idx, origin=version_provenance.packages_yaml)
DeclaredVersion(version=ver, idx=idx, origin=version_provenance.packages_yaml)
)
for spec in specs:
@@ -1656,7 +1711,11 @@ def target_defaults(self, specs):
def virtual_providers(self):
self.gen.h2("Virtual providers")
assert self.possible_virtuals is not None
msg = (
"Internal Error: possible_virtuals is not populated. Please report to the spack"
" maintainers"
)
assert self.possible_virtuals is not None, msg
# what provides what
for vspec in sorted(self.possible_virtuals):
@@ -1908,6 +1967,7 @@ def setup(self, driver, specs, reuse=None):
self.virtual_providers()
self.provider_defaults()
self.provider_requirements()
self.external_packages()
self.flag_defaults()
@@ -2066,7 +2126,7 @@ def depends_on(self, pkg, dep, type):
dependencies = self._specs[pkg].edges_to_dependencies(name=dep)
# TODO: assertion to be removed when cross-compilation is handled correctly
msg = "Current solver does not handle multiple dependency edges " "of the same name"
msg = "Current solver does not handle multiple dependency edges of the same name"
assert len(dependencies) < 2, msg
if not dependencies:
@@ -2156,7 +2216,11 @@ def build_specs(self, function_tuples):
tty.debug(msg)
continue
assert action and callable(action)
msg = (
"Internal Error: Uncallable action found in asp.py. Please report to the spack"
" maintainers."
)
assert action and callable(action), msg
# ignore predicates on virtual packages, as they're used for
# solving but don't construct anything. Do not ignore error
@@ -2223,10 +2287,17 @@ def _develop_specs_from_env(spec, env):
if not dev_info:
return
path = os.path.normpath(os.path.join(env.path, dev_info["path"]))
path = spack.util.path.canonicalize_path(dev_info["path"], default_wd=env.path)
if "dev_path" in spec.variants:
assert spec.variants["dev_path"].value == path
error_msg = (
"Internal Error: The dev_path for spec {name} is not connected to a valid"
" environment path. Please note that develop specs can only be used inside"
" an environment. These paths should be the same:"
"\n\tdev_path:{dev_path}\n\tenv_based_path:{env_path}"
).format(name=spec.name, dev_path=spec.variants["dev_path"], env_path=path)
assert spec.variants["dev_path"].value == path, error_msg
else:
spec.variants.setdefault("dev_path", spack.variant.SingleValuedVariant("dev_path", path))
spec.constrain(dev_info["spec"])


@@ -276,7 +276,8 @@ error(0, Msg) :- node(Package),
conflict(Package, TriggerID, ConstraintID, Msg),
condition_holds(TriggerID),
condition_holds(ConstraintID),
not external(Package). % ignore conflicts for externals
not external(Package), % ignore conflicts for externals
not hash(Package, _). % ignore conflicts for installed packages
#defined conflict/4.
@@ -436,7 +437,7 @@ attr("node_compiler_version_satisfies", Package, Compiler, Version)
#defined external/1.
#defined external_spec/2.
#defined external_version_declared/4.
#defined external_only/1.
#defined buildable_false/1.
#defined pkg_provider_preference/4.
#defined default_provider_preference/3.
#defined node_version_satisfies/2.
@@ -463,8 +464,10 @@ error(2, "Attempted to use external for '{0}' which does not satisfy any configu
version_weight(Package, Weight) :- external_version(Package, Version, Weight).
version(Package, Version) :- external_version(Package, Version, Weight).
% if a package is not buildable (external_only), only externals are allowed
external(Package) :- external_only(Package), node(Package).
% if a package is not buildable, only externals or hashed specs are allowed
external(Package) :- buildable_false(Package),
node(Package),
not hash(Package, _).
% a package is a real_node if it is not external
real_node(Package) :- node(Package), not external(Package).
@@ -483,7 +486,8 @@ external(Package) :- external_spec_selected(Package, _).
% determine if an external spec has been selected
external_spec_selected(Package, LocalIndex) :-
external_conditions_hold(Package, LocalIndex),
node(Package).
node(Package),
not hash(Package, _).
external_conditions_hold(Package, LocalIndex) :-
possible_external(ID, Package, LocalIndex), condition_holds(ID).
@@ -504,9 +508,12 @@ error(2, "Attempted to use external for '{0}' which does not satisfy any configu
% Config required semantics
%-----------------------------------------------------------------------------
activate_requirement_rules(Package) :- node(Package).
activate_requirement_rules(Package) :- virtual_node(Package).
requirement_group_satisfied(Package, X) :-
1 { condition_holds(Y) : requirement_group_member(Y, Package, X) } 1,
node(Package),
activate_requirement_rules(Package),
requirement_policy(Package, X, "one_of"),
requirement_group(Package, X).
@@ -519,7 +526,7 @@ requirement_weight(Package, W) :-
requirement_group_satisfied(Package, X) :-
1 { condition_holds(Y) : requirement_group_member(Y, Package, X) } ,
node(Package),
activate_requirement_rules(Package),
requirement_policy(Package, X, "any_of"),
requirement_group(Package, X).
@@ -535,7 +542,7 @@ requirement_weight(Package, W) :-
requirement_group_satisfied(Package, X).
error(2, "Cannot satisfy requirement group for package '{0}'", Package) :-
node(Package),
activate_requirement_rules(Package),
requirement_group(Package, X),
not requirement_group_satisfied(Package, X).
@@ -1059,6 +1066,11 @@ no_flags(Package, FlagType)
% you can't choose an installed hash for a dev spec
:- hash(Package, Hash), variant_value(Package, "dev_path", _).
% You can't install a hash, if it is not installed
:- hash(Package, Hash), not installed_hash(Package, Hash).
% This should be redundant given the constraint above
:- hash(Package, Hash1), hash(Package, Hash2), Hash1 != Hash2.
% if a hash is selected, we impose all the constraints that implies
impose(Hash) :- hash(Package, Hash).


@@ -20,3 +20,6 @@ os_compatible("ubuntu20.04", "ubuntu19.10").
os_compatible("ubuntu19.10", "ubuntu19.04").
os_compatible("ubuntu19.04", "ubuntu18.10").
os_compatible("ubuntu18.10", "ubuntu18.04").
%EL8
os_compatible("rhel8", "rocky8").


@@ -284,6 +284,22 @@ def _string_or_none(s):
self.platform, self.os, self.target = platform_tuple
@staticmethod
def override(init_spec, change_spec):
if init_spec:
new_spec = init_spec.copy()
else:
new_spec = ArchSpec()
if change_spec.platform:
new_spec.platform = change_spec.platform
# TODO: if the platform is changed to something that is incompatible
# with the current os, we should implicitly remove it
if change_spec.os:
new_spec.os = change_spec.os
if change_spec.target:
new_spec.target = change_spec.target
return new_spec
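# Usage sketch (hypothetical values): with init platform=linux, os=ubuntu20.04,
# target=x86_64 and change carrying only target=zen2, the result keeps the
# platform and os from init and takes target=zen2 from change; a None
# init_spec starts from a fresh ArchSpec instead.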
def _autospec(self, spec_like):
if isinstance(spec_like, ArchSpec):
return spec_like
@@ -1532,16 +1548,7 @@ def package_class(self):
@property
def virtual(self):
"""Right now, a spec is virtual if no package exists with its name.
TODO: revisit this -- might need to use a separate namespace and
be more explicit about this.
Possible idea: just use convention and make virtual deps all
caps, e.g., MPI vs mpi.
"""
# This method can be called while regenerating the provider index
# So we turn off using the index to detect virtuals
return spack.repo.path.is_virtual(self.name, use_index=False)
return spack.repo.path.is_virtual(self.name)
@property
def concrete(self):
@@ -2242,6 +2249,33 @@ def read_yaml_dep_specs(deps, hash_type=ht.dag_hash.name):
raise spack.error.SpecError("Couldn't parse dependency types in spec.")
yield dep_name, dep_hash, list(deptypes), hash_type
@staticmethod
def override(init_spec, change_spec):
# TODO: this doesn't account for the case where the changed spec
# (and the user spec) have dependencies
new_spec = init_spec.copy()
package_cls = spack.repo.path.get_pkg_class(new_spec.name)
if change_spec.versions and not change_spec.versions == spack.version.ver(":"):
new_spec.versions = change_spec.versions
for variant, value in change_spec.variants.items():
if variant in package_cls.variants:
if variant in new_spec.variants:
new_spec.variants.substitute(value)
else:
new_spec.variants[variant] = value
else:
raise ValueError("{0} is not a variant of {1}".format(variant, new_spec.name))
if change_spec.compiler:
new_spec.compiler = change_spec.compiler
if change_spec.compiler_flags:
for flagname, flagvals in change_spec.compiler_flags.items():
new_spec.compiler_flags[flagname] = flagvals
if change_spec.architecture:
new_spec.architecture = ArchSpec.override(
new_spec.architecture, change_spec.architecture
)
return new_spec
@staticmethod
def from_literal(spec_dict, normal=True):
"""Builds a Spec from a dictionary containing the spec literal.
@@ -2584,7 +2618,9 @@ def _expand_virtual_packages(self, concretizer):
a problem.
"""
# Make an index of stuff this spec already provides
self_index = spack.provider_index.ProviderIndex(self.traverse(), restrict=True)
self_index = spack.provider_index.ProviderIndex(
repository=spack.repo.path, specs=self.traverse(), restrict=True
)
changed = False
done = False
@@ -3108,7 +3144,7 @@ def _find_provider(self, vdep, provider_index):
Raise an exception if there is a conflicting virtual
dependency already in this spec.
"""
assert vdep.virtual
assert spack.repo.path.is_virtual_safe(vdep.name), vdep
# note that this defensively copies.
providers = provider_index.providers_for(vdep)
@@ -3173,16 +3209,18 @@ def _merge_dependency(self, dependency, visited, spec_deps, provider_index, test
# If it's a virtual dependency, try to find an existing
# provider in the spec, and merge that.
if dep.virtual:
if spack.repo.path.is_virtual_safe(dep.name):
visited.add(dep.name)
provider = self._find_provider(dep, provider_index)
if provider:
dep = provider
else:
index = spack.provider_index.ProviderIndex([dep], restrict=True)
index = spack.provider_index.ProviderIndex(
repository=spack.repo.path, specs=[dep], restrict=True
)
items = list(spec_deps.items())
for name, vspec in items:
if not vspec.virtual:
if not spack.repo.path.is_virtual_safe(vspec.name):
continue
if index.providers_for(vspec):
@@ -3332,7 +3370,7 @@ def normalize(self, force=False, tests=False, user_spec_deps=None):
# Initialize index of virtual dependency providers if
# concretize didn't pass us one already
provider_index = spack.provider_index.ProviderIndex(
[s for s in all_spec_deps.values()], restrict=True
repository=spack.repo.path, specs=[s for s in all_spec_deps.values()], restrict=True
)
# traverse the package DAG and fill out dependencies according
@@ -3710,8 +3748,12 @@ def satisfies_dependencies(self, other, strict=False):
return False
# For virtual dependencies, we need to dig a little deeper.
self_index = spack.provider_index.ProviderIndex(self.traverse(), restrict=True)
other_index = spack.provider_index.ProviderIndex(other.traverse(), restrict=True)
self_index = spack.provider_index.ProviderIndex(
repository=spack.repo.path, specs=self.traverse(), restrict=True
)
other_index = spack.provider_index.ProviderIndex(
repository=spack.repo.path, specs=other.traverse(), restrict=True
)
# This handles cases where there are already providers for both vpkgs
if not self_index.satisfies(other_index):
@@ -4013,6 +4055,9 @@ def _cmp_node(self):
yield self.compiler_flags
yield self.architecture
# this is not present on older specs
yield getattr(self, "_package_hash", None)
def eq_node(self, other):
"""Equality with another spec, not including dependencies."""
return (other is not None) and lang.lazy_eq(self._cmp_node, other._cmp_node)
@@ -4022,6 +4067,16 @@ def _cmp_iter(self):
for item in self._cmp_node():
yield item
# This needs to be in _cmp_iter so that no specs with different process hashes
# are considered the same by `__hash__` or `__eq__`.
#
# TODO: We should eventually unify the `_cmp_*` methods with `to_node_dict` so
# TODO: there aren't two sources of truth, but this needs some thought, since
# TODO: they exist for speed. We should benchmark whether it's really worth
# TODO: having two types of hashing now that we use `json` instead of `yaml` for
# TODO: spec hashing.
yield self.process_hash() if self.concrete else None
def deps():
for dep in sorted(itertools.chain.from_iterable(self._dependencies.values())):
yield dep.spec.name
@@ -4938,7 +4993,7 @@ def __missing__(self, key):
#: These are possible token types in the spec grammar.
HASH, DEP, AT, COLON, COMMA, ON, OFF, PCT, EQ, ID, VAL, FILE = range(12)
HASH, DEP, VER, COLON, COMMA, ON, OFF, PCT, EQ, ID, VAL, FILE = range(12)
#: Regex for fully qualified spec names. (e.g., builtin.hdf5)
spec_id_re = r"\w[\w.-]*"
@@ -4958,10 +5013,13 @@ def __init__(self):
)
super(SpecLexer, self).__init__(
[
(r"\^", lambda scanner, val: self.token(DEP, val)),
(r"\@", lambda scanner, val: self.token(AT, val)),
(
r"\@([\w.\-]*\s*)*(\s*\=\s*\w[\w.\-]*)?",
lambda scanner, val: self.token(VER, val),
),
(r"\:", lambda scanner, val: self.token(COLON, val)),
(r"\,", lambda scanner, val: self.token(COMMA, val)),
(r"\^", lambda scanner, val: self.token(DEP, val)),
(r"\+", lambda scanner, val: self.token(ON, val)),
(r"\-", lambda scanner, val: self.token(OFF, val)),
(r"\~", lambda scanner, val: self.token(OFF, val)),
@@ -5099,7 +5157,7 @@ def do_parse(self):
else:
# If the next token can be part of a valid anonymous spec,
# create the anonymous spec
if self.next.type in (AT, ON, OFF, PCT):
if self.next.type in (VER, ON, OFF, PCT):
# Raise an error if the previous spec is already concrete
if specs and specs[-1].concrete:
raise RedundantSpecError(specs[-1], "compiler, version, " "or variant")
@@ -5207,7 +5265,7 @@ def spec(self, name):
spec.name = spec_name
while self.next:
if self.accept(AT):
if self.accept(VER):
vlist = self.version_list()
spec._add_versions(vlist)
@@ -5225,7 +5283,6 @@ def spec(self, name):
elif self.accept(ID):
self.previous = self.token
if self.accept(EQ):
# We're adding a key-value pair to the spec
self.expect(VAL)
spec._add_flag(self.previous.value, self.token.value)
self.previous = None
@@ -5261,16 +5318,24 @@ def variant(self, name=None):
return self.token.value
def version(self):
start = None
end = None
if self.accept(ID):
start = self.token.value
if self.accept(EQ):
# This is for versions that are associated with a hash
# i.e. @[40 char hash]=version
start += self.token.value
self.expect(VAL)
start += self.token.value
def str_translate(value):
# return None for empty strings since we can end up with `'@'.strip('@')`
if not (value and value.strip()):
return None
else:
return value
if self.token.type is COMMA:
# need to increment commas, could be ID or COLON
self.accept(ID)
if self.token.type in (VER, ID):
version_spec = self.token.value.lstrip("@")
start = str_translate(version_spec)
if self.accept(COLON):
if self.accept(ID):
@@ -5280,10 +5345,10 @@ def version(self):
else:
end = self.token.value
elif start:
# No colon, but there was a version.
# No colon, but there was a version
return vn.Version(start)
else:
# No colon and no id: invalid version.
# No colon and no id: invalid version
self.next_token_error("Invalid version specifier")
if start:
@@ -5306,7 +5371,7 @@ def compiler(self):
compiler = CompilerSpec.__new__(CompilerSpec)
compiler.name = self.token.value
compiler.versions = vn.VersionList()
if self.accept(AT):
if self.accept(VER):
vlist = self.version_list()
compiler._add_versions(vlist)
else:


@@ -34,6 +34,13 @@ def __init__(self, name="specs", yaml_list=None, reference=None):
self._constraints = None
self._specs = None
@property
def is_matrix(self):
for item in self.specs_as_yaml_list:
if isinstance(item, dict):
return True
return False
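# Illustrative data (hypothetical): a plain list such as ["hdf5", "zlib"]
# gives False, while [{"matrix": [["hdf5"], ["%gcc", "%clang"]]}] gives True,
# since any dict entry in the YAML list denotes a matrix.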
@property
def specs_as_yaml_list(self):
if self._expanded_list is None:


@@ -42,6 +42,7 @@
import spack.util.pattern as pattern
import spack.util.url as url_util
from spack.util.crypto import bit_length, prefix_bits
from spack.util.web import FetchError
# The well-known stage source subdirectory name.
_source_path_subdir = "spack-src"
@@ -529,7 +530,7 @@ def print_errors(errors):
self.fetcher = self.default_fetcher
default_msg = "All fetchers failed for {0}".format(self.name)
raise fs.FetchError(err_msg or default_msg, None)
raise FetchError(err_msg or default_msg, None)
print_errors(errors)


@@ -102,8 +102,8 @@ def __init__(self):
def restore(self):
if _serialize:
spack.repo.path = spack.repo._path(self.repo_dirs)
spack.config.config = self.config
spack.repo.path = spack.repo._path(self.config)
spack.platforms.host = self.platform
new_store = spack.store.Store.deserialize(self.store_token)


@@ -50,8 +50,9 @@ def packages_with_tags(tags, installed, skip_empty):
class TagIndex(Mapping):
"""Maps tags to list of packages."""
def __init__(self):
def __init__(self, repository):
self._tag_dict = collections.defaultdict(list)
self.repository = repository
@property
def tags(self):
@@ -61,7 +62,7 @@ def to_json(self, stream):
sjson.dump({"tags": self._tag_dict}, stream)
@staticmethod
def from_json(stream):
def from_json(stream, repository):
d = sjson.load(stream)
if not isinstance(d, dict):
@@ -70,7 +71,7 @@ def from_json(stream):
if "tags" not in d:
raise TagIndexError("TagIndex data does not start with 'tags'")
r = TagIndex()
r = TagIndex(repository=repository)
for tag, packages in d["tags"].items():
r[tag].extend(packages)
@@ -88,7 +89,7 @@ def __len__(self):
def copy(self):
"""Return a deep copy of this index."""
clone = TagIndex()
clone = TagIndex(repository=self.repository)
clone._tag_dict = copy.deepcopy(self._tag_dict)
return clone
@@ -117,9 +118,8 @@ def update_package(self, pkg_name):
Args:
pkg_name (str): name of the package to be removed from the index
"""
pkg_cls = spack.repo.path.get_pkg_class(pkg_name)
pkg_cls = self.repository.get_pkg_class(pkg_name)
# Remove the package from the list of packages, if present
for pkg_list in self._tag_dict.values():


@@ -11,6 +11,7 @@
import llnl.util.lang
import spack.config
import spack.extensions
from spack.util.path import canonicalize_path


@@ -9,18 +9,20 @@
@pytest.mark.parametrize(
"packages,expected_error",
# PKG-PROPERTIES are ubiquitous in mock packages, since they don't use sha256
# and they don't change the example.com URL very often.
"packages,expected_errors",
[
# A non-existing variant is used in a conflict directive
(["wrong-variant-in-conflicts"], "PKG-DIRECTIVES"),
(["wrong-variant-in-conflicts"], ["PKG-DIRECTIVES", "PKG-PROPERTIES"]),
# The package declares a non-existing dependency
(["missing-dependency"], "PKG-DIRECTIVES"),
(["missing-dependency"], ["PKG-DIRECTIVES", "PKG-PROPERTIES"]),
# The package uses a non-existing variant in a depends_on directive
(["wrong-variant-in-depends-on"], "PKG-DIRECTIVES"),
(["wrong-variant-in-depends-on"], ["PKG-DIRECTIVES", "PKG-PROPERTIES"]),
# This package has a GitHub patch URL without full_index=1
(["invalid-github-patch-url"], "PKG-DIRECTIVES"),
(["invalid-github-patch-url"], ["PKG-DIRECTIVES", "PKG-PROPERTIES"]),
# This package has a stand-alone 'test' method in build-time callbacks
(["test-build-callbacks"], "PKG-DIRECTIVES"),
(["test-build-callbacks"], ["PKG-DIRECTIVES", "PKG-PROPERTIES"]),
# This package has no issues
(["mpileaks"], None),
# This package has a conflict with a trigger which cannot constrain the constraint
@@ -28,15 +30,16 @@
(["unconstrainable-conflict"], None),
],
)
def test_package_audits(packages, expected_error, mock_packages):
def test_package_audits(packages, expected_errors, mock_packages):
reports = spack.audit.run_group("packages", pkgs=packages)
# Check that errors were reported only for the expected failure
actual_errors = [check for check, errors in reports if errors]
if expected_error:
assert [expected_error] == actual_errors
msg = [str(e) for _, errors in reports for e in errors]
if expected_errors:
assert expected_errors == actual_errors, msg
else:
assert not actual_errors
assert not actual_errors, msg
# Data used in the test below to audit the double definition of a compiler


@@ -97,12 +97,12 @@ def config_directory(tmpdir_factory):
@pytest.fixture(scope="function")
def default_config(tmpdir_factory, config_directory, monkeypatch, install_mockery_mutable_config):
def default_config(tmpdir, config_directory, monkeypatch, install_mockery_mutable_config):
# This fixture depends on install_mockery_mutable_config to ensure
# there is a clear order of initialization. The substitution of the
# config scopes here is done on top of the substitution that comes with
# install_mockery_mutable_config
mutable_dir = tmpdir_factory.mktemp("mutable_config").join("tmp")
mutable_dir = tmpdir.mkdir("mutable_config").join("tmp")
config_directory.copy(mutable_dir)
cfg = spack.config.Configuration(
@@ -113,7 +113,7 @@ def default_config(tmpdir_factory, config_directory, monkeypatch, install_mocker
)
spack.config.config, old_config = cfg, spack.config.config
spack.config.config.set("repos", [spack.paths.mock_packages_path])
# This is essential, otherwise the cache will create weird side effects
# that will compromise subsequent tests if compilers.yaml is modified
monkeypatch.setattr(spack.compilers, "_cache_config_file", [])


@@ -134,7 +134,7 @@ def test_config_yaml_is_preserved_during_bootstrap(mutable_config):
@pytest.mark.regression("26548")
def test_custom_store_in_environment(mutable_config, tmpdir):
def test_bootstrap_custom_store_in_environment(mutable_config, tmpdir):
# Test that the custom store in an environment is taken into account
# during bootstrapping
spack_yaml = tmpdir.join("spack.yaml")


@@ -177,6 +177,14 @@ def _set_wrong_cc(x):
assert os.environ["ANOTHER_VAR"] == "THIS_IS_SET"
def test_setup_dependent_package_inherited_modules(
config, working_env, mock_packages, install_mockery, mock_fetch
):
# This will raise on regression
s = spack.spec.Spec("cmake-client-inheritor").concretized()
s.package.do_install()
@pytest.mark.parametrize(
"initial,modifications,expected",
[


@@ -62,3 +62,36 @@ def test_build_request_strings(install_mockery):
istr = str(request)
assert "package=dependent-install" in istr
assert "install_args=" in istr
@pytest.mark.parametrize(
"package_cache_only,dependencies_cache_only,package_deptypes,dependencies_deptypes",
[
(False, False, ["build", "link", "run"], ["build", "link", "run"]),
(True, False, ["link", "run"], ["build", "link", "run"]),
(False, True, ["build", "link", "run"], ["link", "run"]),
(True, True, ["link", "run"], ["link", "run"]),
],
)
def test_build_request_deptypes(
install_mockery,
package_cache_only,
dependencies_cache_only,
package_deptypes,
dependencies_deptypes,
):
s = spack.spec.Spec("dependent-install").concretized()
build_request = inst.BuildRequest(
s.package,
{
"package_cache_only": package_cache_only,
"dependencies_cache_only": dependencies_cache_only,
},
)
actual_package_deptypes = build_request.get_deptypes(s.package)
actual_dependency_deptypes = build_request.get_deptypes(s["dependency-install"].package)
assert sorted(actual_package_deptypes) == package_deptypes
assert sorted(actual_dependency_deptypes) == dependencies_deptypes

View File

@@ -176,6 +176,33 @@ def test_download_and_extract_artifacts(tmpdir, monkeypatch, working_env):
ci.download_and_extract_artifacts(url, working_dir)
def test_ci_copy_stage_logs_to_artifacts_fail(tmpdir, config, mock_packages, monkeypatch, capfd):
"""The copy will fail because the spec is not concrete so does not have
a package."""
log_dir = tmpdir.join("log_dir")
s = spec.Spec("printing-package").concretized()
ci.copy_stage_logs_to_artifacts(s, log_dir)
_, err = capfd.readouterr()
assert "Unable to copy files" in err
assert "No such file or directory" in err
def test_ci_copy_test_logs_to_artifacts_fail(tmpdir, capfd):
log_dir = tmpdir.join("log_dir")
ci.copy_test_logs_to_artifacts("no-such-dir", log_dir)
_, err = capfd.readouterr()
assert "Cannot copy test logs" in err
stage_dir = tmpdir.join("stage_dir").strpath
os.makedirs(stage_dir)
ci.copy_test_logs_to_artifacts(stage_dir, log_dir)
_, err = capfd.readouterr()
assert "Unable to copy files" in err
assert "No such file or directory" in err
def test_setup_spack_repro_version(tmpdir, capfd, last_two_git_commits, monkeypatch):
c1, c2 = last_two_git_commits
repro_dir = os.path.join(tmpdir.strpath, "repro")
@@ -457,6 +484,7 @@ def test_get_spec_filter_list(mutable_mock_env_path, config, mutable_mock_repo):
assert affected_pkg_names == expected_affected_pkg_names
@pytest.mark.maybeslow
@pytest.mark.regression("29947")
def test_affected_specs_on_first_concretization(mutable_mock_env_path, config):
e = ev.create("first_concretization")
@@ -467,3 +495,154 @@ def test_affected_specs_on_first_concretization(mutable_mock_env_path, config):
affected_specs = spack.ci.get_spec_filter_list(e, ["zlib"])
hdf5_specs = [s for s in affected_specs if s.name == "hdf5"]
assert len(hdf5_specs) == 2
@pytest.mark.skipif(
sys.platform == "win32", reason="Reliance on bash script ot supported on Windows"
)
def test_ci_process_command(tmpdir):
repro_dir = tmpdir.join("repro_dir").strpath
os.makedirs(repro_dir)
result = ci.process_command("help", [], repro_dir)
assert os.path.exists(fs.join_path(repro_dir, "help.sh"))
assert not result
@pytest.mark.skipif(
sys.platform == "win32", reason="Reliance on bash script ot supported on Windows"
)
def test_ci_process_command_fail(tmpdir, monkeypatch):
import subprocess
err = "subprocess wait exception"
def _fail(self, args):
raise RuntimeError(err)
monkeypatch.setattr(subprocess.Popen, "__init__", _fail)
repro_dir = tmpdir.join("repro_dir").strpath
os.makedirs(repro_dir)
with pytest.raises(RuntimeError, match=err):
ci.process_command("help", [], repro_dir)
def test_ci_create_buildcache(tmpdir, working_env, config, mock_packages, monkeypatch):
# Monkeypatching ci method tested elsewhere to reduce number of methods
# that would need to be patched here.
monkeypatch.setattr(spack.ci, "push_mirror_contents", lambda a, b, c, d: None)
args = {
"env": None,
"buildcache_mirror_url": "file://fake-url",
"pipeline_mirror_url": "file://fake-url",
}
ci.create_buildcache(**args)
def test_ci_run_standalone_tests_missing_requirements(
tmpdir, working_env, config, mock_packages, capfd
):
"""This test case checks for failing prerequisite checks."""
ci.run_standalone_tests()
err = capfd.readouterr()[1]
assert "Job spec is required" in err
args = {"job_spec": spec.Spec("printing-package").concretized()}
ci.run_standalone_tests(**args)
err = capfd.readouterr()[1]
assert "Reproduction directory is required" in err
@pytest.mark.skipif(
sys.platform == "win32", reason="Reliance on bash script ot supported on Windows"
)
def test_ci_run_standalone_tests_not_installed_junit(
tmpdir, working_env, config, mock_packages, mock_test_stage, capfd
):
log_file = tmpdir.join("junit.xml").strpath
args = {
"log_file": log_file,
"job_spec": spec.Spec("printing-package").concretized(),
"repro_dir": tmpdir.join("repro_dir").strpath,
"fail_fast": True,
}
os.makedirs(args["repro_dir"])
ci.run_standalone_tests(**args)
err = capfd.readouterr()[1]
assert "No installed packages" in err
assert os.path.getsize(log_file) > 0
@pytest.mark.skipif(
sys.platform == "win32", reason="Reliance on bash script ot supported on Windows"
)
def test_ci_run_standalone_tests_not_installed_cdash(
tmpdir, working_env, config, mock_packages, mock_test_stage, capfd
):
"""Test run_standalone_tests with cdash and related options."""
log_file = tmpdir.join("junit.xml").strpath
args = {
"log_file": log_file,
"job_spec": spec.Spec("printing-package").concretized(),
"repro_dir": tmpdir.join("repro_dir").strpath,
}
os.makedirs(args["repro_dir"])
# Cover when CDash handler provided (with the log file as well)
ci_cdash = {
"url": "file://fake",
"build-group": "fake-group",
"project": "ci-unit-testing",
"site": "fake-site",
}
os.environ["SPACK_CDASH_BUILD_NAME"] = "ci-test-build"
os.environ["SPACK_CDASH_BUILD_STAMP"] = "ci-test-build-stamp"
os.environ["CI_RUNNER_DESCRIPTION"] = "test-runner"
handler = ci.CDashHandler(ci_cdash)
args["cdash"] = handler
ci.run_standalone_tests(**args)
out = capfd.readouterr()[0]
# CDash *and* log file output means log file ignored
assert "xml option is ignored" in out
assert "0 passed of 0" in out
# copy test results (though none)
artifacts_dir = tmpdir.join("artifacts")
fs.mkdirp(artifacts_dir.strpath)
handler.copy_test_results(tmpdir.strpath, artifacts_dir.strpath)
err = capfd.readouterr()[1]
assert "Unable to copy files" in err
assert "No such file or directory" in err
def test_ci_skipped_report(tmpdir, mock_packages, config):
"""Test explicit skipping of report as well as CI's 'package' arg."""
pkg = "trivial-smoke-test"
spec = spack.spec.Spec(pkg).concretized()
ci_cdash = {
"url": "file://fake",
"build-group": "fake-group",
"project": "ci-unit-testing",
"site": "fake-site",
}
os.environ["SPACK_CDASH_BUILD_NAME"] = "fake-test-build"
os.environ["SPACK_CDASH_BUILD_STAMP"] = "ci-test-build-stamp"
os.environ["CI_RUNNER_DESCRIPTION"] = "test-runner"
handler = ci.CDashHandler(ci_cdash)
reason = "Testing skip"
handler.report_skipped(spec, tmpdir.strpath, reason=reason)
report = fs.join_path(tmpdir, "{0}_Testing.xml".format(pkg))
expected = "Skipped {0} package".format(pkg)
with open(report, "r") as f:
have = [0, 0]
for line in f:
if expected in line:
have[0] += 1
elif reason in line:
have[1] += 1
assert all(count == 1 for count in have)
