Compare commits

1805 Commits

Author SHA1 Message Date
Peter Josef Scheibel
70708d7420 Perform load/list commands in the same subshell 2021-11-04 12:16:41 -07:00
Todd Gamblin
b624c5b9a3 commands: spack load --list alias for spack find --loaded
See #25249 and https://github.com/spack/spack/pull/27159#issuecomment-958163679.
This adds `spack load --list` as an alias for `spack find --loaded`.  The new command is
not as powerful as `spack find --loaded`, as you can't combine it with all the queries or
formats that `spack find` provides.  However, it is more intuitively located in the command
structure in that it appears in the output of `spack load --help`.

The idea here is that people can use `spack load --list`  for simple stuff but fall back to
`spack find --loaded` if they need more.

- [x] add help to `spack load --list` that references `spack find`
- [x] factor some parts of `spack find` out to be called from `spack load`
- [x] add shell integration
- [x] add shell test
- [x] update completion
- [x] update docs
2021-11-02 22:49:24 -07:00
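A minimal illustration of the two equivalent commands described above (the `zlib` spec is only a placeholder; any loaded spec works):

```console
$ spack load zlib
$ spack load --list      # simple listing of currently loaded specs
$ spack find --loaded    # same information, with spack find's full query options
```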
Richarda Butler
1a3747b2b3 Update docs how to display loaded modules (#27159)
* Update spack load docs
2021-11-02 22:12:08 -07:00
dependabot[bot]
a382a6c0e2 build(deps): bump actions/checkout from 2.3.5 to 2.4.0 (#27179)
Bumps [actions/checkout](https://github.com/actions/checkout) from 2.3.5 to 2.4.0.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](1e204e9a92...ec3a7ce113)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2021-11-03 02:52:28 +00:00
Manuela Kuhn
61ded65888 r-dtplyr: add new package (#27112) 2021-11-03 00:24:21 +00:00
Dan Lipsa
f1fb816d93 Add build editions for catalyst builds. (#22676)
* Add build editions for catalyst builds.

* Fix style.

* Build edition works only for 5.8:
2021-11-02 17:36:53 -04:00
kwryankrattiger
f1afd5ff27 Add and propagate CUDA variants for DAV SDK (#26476) 2021-11-02 17:31:50 -04:00
Satish Balay
c9f8dd93f3 trilinos: new version 13.2.0 (#27106)
* trilinos: add @13.2.0, and switch default to cxxstd=14

* trilinos: fix python dependency when using +ifpack or +ifpack2

* trilinos: add conflict for ~epetra +ml when @13.2.0:

* trilinos: keep 13.0.1 as the preferred version

* Update var/spack/repos/builtin/packages/trilinos/package.py

Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>

* update

Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>
2021-11-02 13:07:42 -06:00
Greg Becker
b3711c0d9d Improved error messages from clingo (#26719)
This PR adds error message sentinels to the clingo solve, attached to each of the rules that could fail a solve. The unsat core is then restricted to these messages, which makes the minimization problem tractable. Errors that can only be generated by a bug in the logic program or generating code are prefaced with "Internal error" to make clear to users that something has gone wrong on the Spack side of things.

* minimize unsat cores manually

* only errors messages are choices/assumptions for performance

* pre-check for unreachable nodes

* update tests for new error message

* make clingo concretization errors show up in cdash reports fully

* clingo: make import of clingo.ast parsing routines robust to clingo version

Older `clingo` has `parse_string`; newer `clingo` has `parse_files`.  Make the
code work with both.

* make AST access functions backward-compatible with clingo 5.4.0

Clingo AST API has changed since 5.4.0; make some functions to help us
handle both versions of the AST.

Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2021-11-02 10:55:50 -07:00
Manuela Kuhn
3187689862 r-googlesheets4: add new package (#27114) 2021-11-02 11:46:48 -06:00
Manuela Kuhn
00b6927c65 r-dbplyr: add 2.1.1 (#27111) 2021-11-02 17:35:56 +00:00
Manuela Kuhn
9dc790bbe2 r-haven: add 2.4.3 (#27115) 2021-11-02 17:07:55 +00:00
Manuela Kuhn
ae76692c2e r-lubridate: add 1.8.0 (#27117) 2021-11-02 16:55:27 +00:00
iarspider
fa63bebf36 New versions of py-jupyter-server; fix tests (#27153)
* New versions of py-jupyter-server; fix tests

* Update package.py
2021-11-02 10:37:53 -06:00
Tom Payerle
eee4522103 trilinos: Additional fix for linking C code when built with PyTrilinos (#19834)
This removes `-lpytrilinos` from Makefile.export.Trilinos so that C code
trying to link against a Trilinos built with PyTrilinos does not fail
due to undefined references to python routines (libpytrilinos is only
used when importing PyTrilinos in python, in which case those references
are already defined by Python).

There was already a bit of code to do something similar for C codes
importing Trilinos via a CMake mechanism; this extends that to a basic
Makefile mechanism as well.  This patch also updates the comments to
remove a stale link discussing this issue, replacing it with links to
some Trilinos issue reports related to the matter.
2021-11-02 12:31:10 -04:00
Manuela Kuhn
b8a44870a4 r-vroom: add new package (#27118) 2021-11-02 11:28:43 -05:00
Manuela Kuhn
426033bc5b r-brms: add 2.16.1 (#27148) 2021-11-02 11:21:54 -05:00
Manuela Kuhn
c2a7f8c608 r-rvest: add 1.0.2 (#27147) 2021-11-02 11:21:09 -05:00
Manuela Kuhn
429a60c703 r-callr: add 3.7.0 (#27146) 2021-11-02 11:19:09 -05:00
Manuela Kuhn
23f8a7331e r-forcats: add 0.5.1 (#27113) 2021-11-02 11:17:22 -05:00
Manuela Kuhn
6e2de9e41f r-processx-3.5.2 (#27119) 2021-11-02 11:13:13 -05:00
Seth R. Johnson
9cfecec002 relocate: do not change library id to use rpaths on package install (#27139)
After #26608 I got a report about missing rpaths when building a
downstream package independently using a spack-installed toolchain
(@tmdelellis). This occurred because the spack-installed libraries were
being linked into the downstream app, but the rpaths were not being
manually added. Prior to #26608 autotools-installed libs would retain
their hard-coded path and would thus propagate their link information
into the downstream library on mac.

We could solve this problem *if* the mac linker (ld) respected
`LD_RUN_PATH` like it does on GNU systems, i.e. adding `rpath` entries
to each item in the environment variable. However on mac we would have
to manually add rpaths either using spack's compiler wrapper scripts or
manually (e.g. using `CMAKE_BUILD_RPATH` and pointing to the libraries of
all the autotools-installed spack libraries).

The easier and safer thing to do for now is to simply stop changing the
dylib IDs.
2021-11-02 17:04:29 +01:00
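A rough sketch of the manual workaround mentioned above for a downstream CMake build on macOS; the `hdf5` package name and the project layout are placeholders, not part of this commit:

```console
$ # embed the Spack library directory as an rpath by hand
$ cmake -S . -B build \
    -DCMAKE_BUILD_RPATH="$(spack location -i hdf5)/lib"
$ cmake --build build
```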
Manuela Kuhn
34b2742817 r-hms: add 1.1.1 (#27116) 2021-11-02 11:03:52 -05:00
Manuela Kuhn
9d124bdac6 r-crayon: add 1.4.1 (#27110) 2021-11-02 10:49:56 -05:00
Manuela Kuhn
507f7a94d9 r-broom: add 0.7.9 and 0.7.10 (#27109) 2021-11-02 10:48:32 -05:00
Sinan
5fdf6e5f68 package/qgis_revert_incorrect_constraint (#27140)
* package/qgis_revert_incorrect_constraint

* fix bug

* also update dependency constraints

* also update python version constraints

Co-authored-by: sbulut <sbulut@3vgeomatics.com>
2021-11-02 10:43:29 -05:00
iarspider
669954a71d New version: py-isort 5.9.3 (#27144) 2021-11-02 10:42:34 -05:00
iarspider
6f44bf01e0 New version: py-jupyter-console 6.4.0; download sources from pip (#27150) 2021-11-02 10:37:37 -05:00
Joseph Wang
372fc78e98 yoda: add zlib as a dependency (#26454)
Co-authored-by: Bernhard Kaindl <43588962+bernhardkaindl@users.noreply.github.com>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2021-11-02 12:37:23 +01:00
Cameron Smith
ca30940868 pumi: support build/install time and stand-alone testing (#26832) 2021-11-02 12:18:46 +01:00
Bernhard Kaindl
5a4d03060b gtk packages: fix dependencies (#26960)
gconf depends on gettext and libintl (dep: intltool)
glibmm, gtkmm, libcanberra and cups need pkgconfig
glibmm needs libsigc++ < 2.9x (which are 3.x pre-releases)
libsigc++@:2.9 depends on m4 for the build

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2021-11-02 10:39:25 +00:00
Chien-Chang Feng
bf411c7c55 pgplot: add intel compiler support and add more driver (#26761) 2021-11-02 11:30:28 +01:00
Massimiliano Culpo
3f3048e157 Fix GitHub Action's container build (#27143)
#26538 introduced a typo that causes the Docker image
build to fail.
2021-11-02 04:02:04 -06:00
Jean Luca Bez
85bdf4f74e h5bench: update maintainers and versions (#27083) 2021-11-02 10:52:36 +01:00
Glenn Johnson
cf50905ee9 modifications to docbook-xml (#27131)
- added more versions in case packages request a specific version of
  docbook-xml
- added a "current" alias to handle when packages use that
2021-11-02 10:50:07 +01:00
Glenn Johnson
f972863712 fsl: new version, updated constraints and patches (#27129)
- add version 6.0.5
- add patch to allow fsl to use newer gcc versions
- add patch to allow fsl to use newer cuda versions
- remove constraints on gcc and cuda versions
- add filters to prevent using system headers and libraries
- clean up the installed tree
2021-11-02 10:49:22 +01:00
Michael Kuhn
1e26e25bc8 spack arch: add --generic argument (#27061)
The `--generic` argument allows printing the best generic target for the
current machine. This can be quite handy when wanting to find the
generic architecture to use when building a shared software stack for
multiple machines.
2021-11-02 10:19:23 +01:00
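For example (output here is illustrative only and depends on the machine):

```console
$ spack arch               # microarchitecture-specific triplet
linux-ubuntu20.04-zen2
$ spack arch --generic     # best generic target for the same machine
linux-ubuntu20.04-x86_64
```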
Tamara Dahlgren
9d3d7c68fb Add tag filters to spack test list (#26842) 2021-11-02 10:00:21 +01:00
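A sketch of how the new filter might be used; the `--tag` flag spelling and the `smoke` tag are assumptions, not confirmed by this log:

```console
$ spack test list --tag smoke   # only packages whose stand-alone tests carry the "smoke" tag
```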
dependabot[bot]
94e0bf0112 build(deps): bump actions/checkout from 2.3.4 to 2.3.5 (#27135)
Bumps [actions/checkout](https://github.com/actions/checkout) from 2.3.4 to 2.3.5.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](5a4ac9002d...1e204e9a92)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2021-11-02 08:55:28 +01:00
iarspider
80807e6dd4 Add missing dependency to py-setuptools-scm (#27132) 2021-11-01 20:04:43 -06:00
Weiqun Zhang
840c9b7b4a amrex: add new release 21.11 (#27126) 2021-11-02 02:49:03 +01:00
Vicente Bolea
6a5ea3db99 vtkm: add v1.7.0-rc1 (#26882) 2021-11-02 02:42:10 +01:00
Manuela Kuhn
78a073218f r-tzdb: add new package (#27105) 2021-11-01 17:10:13 -05:00
Manuela Kuhn
881b2dcf05 r-rcpp: add 1.0.7 (#27088) 2021-11-01 17:09:18 -05:00
eugeneswalker
8bc01ff63c py-pylint needs pip for build (#27123)
* py-pylint: needs py-pip for build

* alphabetize py- dependencies

* add comment pointing to issue

* fix style
2021-11-01 21:50:36 +00:00
Andre Merzky
e0a929ba60 New package: exaworks (#27054) 2021-11-01 14:15:41 -07:00
Tamara Dahlgren
d4cecd9ab2 feature: add "spack tags" command (#26136)
This PR adds a "spack tags" command to output package tags or 
(available) packages with those tags. It also ensures each package
is listed in the tag cache ONLY ONCE per tag.
2021-11-01 20:40:29 +00:00
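A sketch of the new command, assuming tags are passed as positional arguments (`build-tools` is just an example tag):

```console
$ spack tags                 # list all package tags
$ spack tags build-tools     # list (available) packages carrying this tag
```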
Chuck Atkins
b56f464c29 GCC 11 fixes (#27122)
* adios2: Fix compile errors for gcc 11

* unifyfs: Suppress bogus warnings for gcc 11

* conduit: Fix compile errors for gcc 11
2021-11-01 14:31:39 -06:00
Harmen Stoppels
6845307384 Pin actions to a commit sha instead of tag (#26538) 2021-11-01 10:44:03 -07:00
Bernhard Kaindl
02aa1d5d48 intel-gpu-tools: add v1.20 (#26588)
Add version 1.20, fix including the glib-2.0 header files
and add missing dependencies: libunwind and kmod.
2021-11-01 17:35:14 +01:00
Bernhard Kaindl
8ca411c749 glib: skip tests which we cannot make pass (#26693)
glib has a few tests which have external dependencies or
try to access the X server. We cannot run those.

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2021-11-01 17:33:35 +01:00
Jon Rood
888de27210 kokkos-nvcc-wrapper: add HPE MPT to MPI environment variables (#27069) 2021-11-01 17:31:33 +01:00
Chuck Atkins
15d407c674 ci: Enable more packages in the DVSDK CI pipeline (#27025)
* ci: Enable more packages in the DVSDK CI pipeline

* doxygen: Add conflicts for gcc bugs

* dray: Add version constraints for api breakage with newer deps
2021-11-01 08:54:50 -07:00
Satish Balay
a1eb5596ec kokkos-kernels: add variant 'shared' (#27097)
* kokkos-kernels: add variant 'shared'

* Update var/spack/repos/builtin/packages/kokkos-kernels/package.py
2021-11-01 11:45:26 -04:00
iarspider
6c4f891b8f New version: py-typing-extensions 3.10.0.2 (#27120) 2021-11-01 10:04:18 -05:00
iarspider
99ee9a826a New version: py-beautifulsoup4 4.10.0 (#27121) 2021-11-01 10:02:57 -05:00
iarspider
8dcbd2ee6f alpgen: fix cms recipe (#26812) 2021-11-01 15:13:25 +01:00
Mahendra Paipuri
be0df5c47a ucx:add rocm variant (#26992)
Co-authored-by: mahendrapaipuri <mahendra.paipuri@inria.fr>
2021-11-01 15:05:56 +01:00
Ben Boeckel
80ba7040f8 python: detect Python from custom Xcode installations (#27023) 2021-11-01 07:55:15 -05:00
Hans Pabst
9094d6bb68 Updated LIBXSMM. (#27108) 2021-11-01 04:34:52 -06:00
Kenny Shen
3253faf01e cpp-argparse: new package (#27107) 2021-11-01 10:54:25 +01:00
Massimiliano Culpo
d73b1b9742 Fix caching of spack.repo.all_package_names() (#26991)
fixes #24522
2021-11-01 02:16:30 -06:00
Manuela Kuhn
b87678c2dd r-tidyr: add 1.1.4 (#27099) 2021-10-31 21:44:33 -05:00
Manuela Kuhn
8b8e7bd8e9 r-cpp11: add 0.4.0 (#27103) 2021-10-31 21:42:52 -05:00
Manuela Kuhn
200a1079d3 r-curl: add 4.3.2 (#27102) 2021-10-31 21:42:25 -05:00
Manuela Kuhn
3cd8381eb2 r-tidyselect: add 1.1.1 (#27101) 2021-10-31 21:41:16 -05:00
Manuela Kuhn
6b76b76d26 r-data-table: add 1.14.2 (#27100) 2021-10-31 21:40:36 -05:00
Manuela Kuhn
3406e8e632 r-dplyr: add 1.0.7 (#27098) 2021-10-31 21:40:07 -05:00
Manuela Kuhn
f363864f7f r-posterior: add new package (#27093) 2021-10-31 21:39:04 -05:00
Manuela Kuhn
543e5480a1 r-googledrive: add new package (#27095) 2021-10-31 21:37:26 -05:00
Manuela Kuhn
f8be56c941 r-withr: add 2.4.2 (#27094) 2021-10-31 21:35:41 -05:00
Manuela Kuhn
b6f6bbe371 r-nlme: add 3.1-153 (#27092) 2021-10-31 21:34:11 -05:00
Manuela Kuhn
ec8dc1830b r-bayesplot-1.8.1 (#27091) 2021-10-31 21:31:29 -05:00
Manuela Kuhn
86d2ab5240 r-mgcv: add 1.8-38 (#27090) 2021-10-31 21:30:34 -05:00
Manuela Kuhn
885e4d50e8 r-matrix: add 1.3-4 (#27089) 2021-10-31 21:29:54 -05:00
Manuela Kuhn
9397ca0a9f r-tibble: add 3.1.5 (#27087) 2021-10-31 21:25:19 -05:00
Manuela Kuhn
2b3207073d r-matrixstats: add 0.61.0 (#27086) 2021-10-31 21:23:10 -05:00
Manuela Kuhn
75a6d50fdc r-ids: add new package (#27104) 2021-10-31 21:21:54 -05:00
Bernhard Kaindl
6344d163b3 qt: @5.8:5.14.2 don't build with gcc@11, fix build of 5.6.3 (#27072)
5.14.2 fails with %gcc@11 with "Error: 'numeric_limits' is not a class template".
5.8.0 has multiple compile failures as well: extend the conflict to those too.
- Also fix the configure of @5.6.3 (tested with %gcc@11)
2021-10-31 21:22:15 -04:00
Sinan
6c1f952bda package/qgis: fix runtime issue, improve package file, add new versions (#27084)
* package/qgis: fix runtime issue, improve package file, add new versions

* replace conflict with depends_on

* tidy up

Co-authored-by: sbulut <sbulut@3vgeomatics.com>
2021-10-31 18:24:44 -05:00
iarspider
f45ef21e37 New versions of py-gitpython: 3.1.13 - 3.1.24 (#27052)
* New versions of py-gitpython

* Update package.py
2021-10-30 11:19:51 -05:00
iarspider
30573f5e5c New versions: py-gitdb 4.0.7, 4.0.8, 4.0.9 (#27051)
* New versions: py-gitdb 4.0.7, 4.0.8, 4.0.9

* Update var/spack/repos/builtin/packages/py-gitdb/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-10-30 11:19:22 -05:00
iarspider
4262a66f5e New versions of py-google-auth and py-google-auth-oauthlib (#27056)
* New versions of py-google-auth and py-google-auth-oauthlib

* Update var/spack/repos/builtin/packages/py-google-auth/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-10-30 11:18:48 -05:00
iarspider
bdfb281a35 New version: py-deprecation 2.1.0 (#27001)
* New version: py-deprecation 2.1.0

* Update package.py
2021-10-30 07:25:31 -06:00
iarspider
0ef0a27ea0 New version: py-html5lib 1.1 (#27059)
* New version: py-html5lib 1.1

* Update var/spack/repos/builtin/packages/py-html5lib/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-html5lib/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update package.py

* Update package.py

* Update package.py

* Update package.py

* Update var/spack/repos/builtin/packages/py-html5lib/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-html5lib/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-10-30 06:10:32 -06:00
Glenn Johnson
15e5508fcf r-vctrs: Fix checksums after setting url to CRAN (#27080)
This package had its url set to a github url and was then changed to a
CRAN url. The checksums need to change as a result.
2021-10-30 09:51:20 +02:00
Vasileios Karakasis
b1c4c1b6ca Add more ReFrame versions (#27082) 2021-10-30 09:17:30 +02:00
Jon Rood
c19a4e12c7 trilinos: Avoid Intel compiler segfaults in Trilinos with STK (#27073) 2021-10-29 22:03:46 -04:00
Kyle Gerheiser
c19514e540 w3emc package: add versions 2.9.2 and 2.7.3 (#27031) 2021-10-29 16:39:45 -07:00
Manuela Kuhn
657b9204ca New package: r-distributional (#27048) 2021-10-29 16:37:59 -07:00
Adam J. Stewart
1aaa7bd089 GDAL package: add version 3.3.3 (#27071) 2021-10-29 16:21:28 -07:00
iarspider
9448864c0e New versions: ipython 7.27.0 and 7.28.0 (#27066)
* New versions: ipython 7.27.0 and 7.28.0

* Changes from MR (1/2)

* Fix dep name (2/2)
2021-10-29 17:13:44 -06:00
Scott Wittenburg
f2a36bdf14 pipelines: llvm kills the xlarge, use huge (#27079) 2021-10-29 22:01:35 +00:00
Peter Scheibel
7eddf3ae9b For Spack commands that fail but don't throw exceptions, we were discarding the return code (#27077) 2021-10-29 14:14:41 -07:00
iarspider
b9e63c9f42 New version: py-importlib-resources 5.3.0 (#27064)
* New version: py-importlib-resources 5.3.0

* Update var/spack/repos/builtin/packages/py-importlib-resources/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-10-29 15:07:59 -06:00
Jen Herting
c38d3044fa New package: py-pypulse (#27026)
* python package py-pypulse created - per Andy

* URL fixed

* [py-pypulse]

- removed duplicate dependency
- updated copyright

* [py-pypulse] added version 0.1.1

* [py-pypulse] depends on setuptools

* [py-pypulse] url -> pypi

* [py-pypulse] simplified python dependency

Co-authored-by: Alex C Leute <aclrc@sporcsubmit.rc.rit.edu>
2021-10-29 14:58:58 -06:00
iarspider
ec23ce4176 New version: py-immutables 0.16 (#27060)
* New version: py-immutables 0.16

* Changes from review
2021-10-29 14:53:12 -06:00
Bernhard Kaindl
9a1626e54a py-slurm-pipeline: Add 4.0.4, Fix base.py: import six for @:3 (#26968)
* py-slurm-pipeline: Add 4.0.4, Fix base.py: import six for @:3

Before 4.0.0, slurm_pipeline/base.py has: `from six import string_types`

* Added depends_on('py-pytest@6.2.2:', type='build') as requested by Adam

* remove comment requested to be removed
2021-10-29 14:52:57 -06:00
Jon Rood
91408ef6ad trilinos: add MPICH and HPE MPT MPI environment variables (#27070) 2021-10-29 14:05:10 -06:00
genric
749640faf9 py-ipyparallel: add 7.1.0 (#26984) 2021-10-29 14:10:36 -05:00
Simon Pintarelli
494ba67704 sirius: add mpi datatypes patch for recent cray mpich (#27065) 2021-10-29 12:31:56 -06:00
Baptiste Jonglez
b8bc030a3c py-torch: Add a breakpad variant, disable it for ppc64 and ppc64le (#26990) 2021-10-29 13:00:48 -05:00
Massimiliano Culpo
3eb52b48b8 config add: infer type based on JSON schema validation errors (#27035)
- [x] Allow adding enumerated types and types whose default value is forbidden by the schema
- [x] Add a test for using enumerated types in the tests for `spack config add`
- [x] Make `config add` tests use the `mutable_config` fixture so they do not
      affect other tests

Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2021-10-29 18:44:49 +02:00
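Two hedged examples of the kind of values this change is about — an integer and a schema-enumerated value added from the command line (the option names are illustrative, not taken from this commit):

```console
$ spack config add config:build_jobs:8         # value inferred as an integer
$ spack config add config:concretizer:clingo   # enumerated value validated against the schema
```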
Manuela Kuhn
20b7485cd4 r-cli: add 3.0.1 (#27047) 2021-10-29 10:57:14 -05:00
Manuela Kuhn
d36b045e2d r-afex: add 1.0-1 (#27032) 2021-10-29 10:55:46 -05:00
Thomas Madlener
874f06e29c curl: fix mbedtls versions and certs config (#26877)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2021-10-29 08:55:48 -06:00
Jon Rood
962d06441e m4: set times of m4.texi to fix build on Summit GPFS (#26994)
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2021-10-29 16:05:41 +02:00
Bernhard Kaindl
fd10e54409 libssh2: skip the testsuite when docker is not installed (#26962)
The build-time test suite, which would be run when building
with tests, needs docker. Check that it exists before
attempting to execute the tests.

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2021-10-29 07:04:40 -06:00
Harmen Stoppels
49034abd76 Fix exit codes posix shell wrapper (#27012)
* Correct exit code in sh wrapper

* Fix tests

* SC2069
2021-10-29 08:10:22 +00:00
Harmen Stoppels
f610b506cd elfio: add minimum version requirements to cmake (#27033) 2021-10-29 09:28:19 +02:00
Axel Huebl
5a7496eb82 WarpX & HiPACE++: Constrain FFTW for No-MPI (#27043)
Constrain FFTW for no-MPI builds to simplify the build and the logic needed to handle it.
2021-10-29 09:19:23 +02:00
Valentin Churavy
7a6a232730 add Julia 1.7.0-rc2 (#27016) 2021-10-29 08:24:38 +02:00
Manuela Kuhn
4031ea5e03 r-gargle: add new package (#27046) 2021-10-29 05:19:50 +00:00
Manuela Kuhn
852e14a655 r-openssl: add 1.4.5 (#27045) 2021-10-29 04:17:35 +00:00
Manuela Kuhn
5b2d9ae0b0 r-car: add 3.0-11 (#27044) 2021-10-29 04:14:06 +00:00
Manuela Kuhn
dc3b818192 r-future: add 1.22.1 (#27041) 2021-10-29 03:58:00 +00:00
Manuela Kuhn
5c1874c387 r-pillar: add 1.6.4 (#27040) 2021-10-29 03:47:42 +00:00
Manuela Kuhn
fcf6a46aeb r-digest: add 0.6.28 (#27039) 2021-10-29 03:00:10 +00:00
Manuela Kuhn
180bb5003c r-farver: add 2.1.0 (#27038) 2021-10-29 02:46:55 +00:00
Manuela Kuhn
2e880c5513 r-ggplot2: add 3.3.5 (#27037) 2021-10-29 02:36:49 +00:00
Manuela Kuhn
71412f547d r-generics: add 0.1.1 (#27034) 2021-10-29 01:44:13 +00:00
Harmen Stoppels
574395af93 Fix exit codes in fish (#27028) 2021-10-29 01:10:31 +00:00
Manuela Kuhn
c04b2fa26a r-rappdirs: add 0.3.3 (#26989) 2021-10-28 17:55:38 -05:00
Manuela Kuhn
19c77f11b6 r-emmeans: add 1.7.0 (#26987) 2021-10-28 17:54:24 -05:00
Manuela Kuhn
1f4ada6b22 r-fansi: add 0.5.0 (#26986) 2021-10-28 17:51:45 -05:00
Manuela Kuhn
4ff87ea04f r-lifecycle: add 1.0.1 (#26975) 2021-10-28 17:42:56 -05:00
Manuela Kuhn
d5520a264b r-parallelly: add 1.28.1 (#26974) 2021-10-28 17:40:58 -05:00
Manuela Kuhn
999044033b r-vctrs: add 0.3.8 (#26966) 2021-10-28 17:39:09 -05:00
Todd Gamblin
233dabbd4f bugfix: config edit should work with a malformed spack.yaml
If `spack.yaml` is malformed, `spack config edit` fails too, so you are forced to
fix `spack.yaml` by hand without it.

- [x] Add some code to `_main()` to defer `ConfigFormatError` when loading the
  environment, until we know what command is being run.

- [x] Make `spack config edit` use `SPACK_ENV` instead of the config scope
  object to find `spack.yaml`, so it can work even if the environment is bad.

Co-authored-by: scheibelp <scheibel1@llnl.gov>
2021-10-28 15:37:44 -07:00
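A short sketch of the behavior after this fix; `broken-env` is a hypothetical environment whose `spack.yaml` contains a syntax error:

```console
$ spack env activate broken-env
$ spack config edit              # opens $SPACK_ENV/spack.yaml so the file can be repaired
```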
Todd Gamblin
374e3465c5 bugfix: spack config get <section> in environments
`spack config get <section>` was erroneously returning just the `spack.yaml`
for the environment.

It should return the combined configuration for that section (including
anything from `spack.yaml`), even in an environment.

- [x] reorder conditions in `cmd/config.py` to fix
2021-10-28 15:37:44 -07:00
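For example (the environment name is a placeholder):

```console
$ spack -e myenv config get config   # prints the merged 'config' section, not just the env's spack.yaml
```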
Todd Gamblin
2bd513d659 config: ensure that options like --debug are set first
`spack --debug config edit` was not working properly -- it would not show a
stack trace for configuration errors.

- [x] Rework `_main()` and add some notes for maintainers on where things need
      to go for configuration to work properly.
- [x] Move config setup to *after* command-line parsing is done.

Co-authored-by: scheibelp <scheibel1@llnl.gov>
2021-10-28 15:37:44 -07:00
Todd Gamblin
56ad721eb5 errors: Rework error handling in main()
`main()` has grown, and in some cases code that can generate errors has gotten
outside the top-level try/catch in there. This means that simple errors like
config issues give you large stack traces, which shouldn't happen without
`--debug`.

- [x] Split `main()` into `main()` for the top-level error handling and
      `_main()` with all logic.
2021-10-28 15:37:44 -07:00
Manuela Kuhn
732be7dec6 r-ellipsis: add 0.3.2 (#26965) 2021-10-28 17:36:10 -05:00
Manuela Kuhn
895bd75762 r-rlang: add 0.4.12 (#26963) 2021-10-28 17:34:06 -05:00
Manuela Kuhn
dd1eb7ea18 r-mvtnorm: add 1.1-3 (#26957) 2021-10-28 17:31:43 -05:00
Manuela Kuhn
494dc0bd13 r-lme4: add 1.1-27.1 (#26955)
* r-lme4: add 1.1-27.1

* Use cran instead of explicit url
2021-10-28 17:30:21 -05:00
Manuela Kuhn
4b2564a2d6 llvm: fix gcc11 build for @11 (#27013) 2021-10-28 16:07:56 -06:00
iarspider
eb2d44a57b New versions of py-flake8 and py-pyflakes (#27008)
* New versions of py-flake8 and py-pyflakes

* Changes from review
2021-10-28 21:56:28 +00:00
Todd Gamblin
a1216138f6 config: fix SPACK_DISABLE_LOCAL_CONFIG, remove $user_config_path (#27022)
There were some loose ends left in #26735 that cause errors when
using `SPACK_DISABLE_LOCAL_CONFIG`.

- [x] Fix hard-coded `~/.spack` references in `install_test.py` and `monitor.py`

Also, if `SPACK_DISABLE_LOCAL_CONFIG` is used, there is the issue that
`$user_config_path`, when used in configuration files, makes no sense,
because there is no user config scope.

Since we already have `$user_cache_path` in configuration files, and since there
really shouldn't be *any* data stored in a configuration scope (which is what
you'd configure in `config.yaml`/`bootstrap.yaml`/etc.), this just removes
`$user_config_path`.

There will *always* be a `$user_cache_path`, as Spack needs to write files, but
we shouldn't rely on the existence of a particular configuration scope in the
Spack code, as scopes are configurable, both in number and location.

- [x] Remove `$user_config_path` substitution.
- [x] Fix reference to `$user_config_path` in `etc/spack/defaults/bootstrap.yaml`
      to refer to `$user_cache_path`, which is where it was intended to be.
2021-10-28 21:33:44 +00:00
Daryl W. Grunau
d0e177e711 depend on libevent when +pmix (#27020)
Co-authored-by: Daryl W. Grunau <dwg@lanl.gov>
2021-10-28 23:19:19 +02:00
Phil Carns
7fd1d2b03f mochi-margo: add version 0.9.6 (#26951) 2021-10-28 23:11:30 +02:00
Valentin Churavy
7f8b0d4820 add MPItrampoline 2.0.0 (#27019) 2021-10-28 22:20:38 +02:00
iarspider
612639534a New version: py-distro 1.6.0 (#27003) 2021-10-28 14:38:36 -05:00
iarspider
f512bb1dc1 New versions: docutils 0.17, 0.17.1, 0.18 (#27005) 2021-10-28 13:53:02 -05:00
Harmen Stoppels
6d030ba137 Deactivate previous env before activating new one (#25409)
* Deactivate previous env before activating new one

Currently on develop you can run `spack env activate` multiple times to switch
between environments, but they leave traces, even though Spack only supports
one active environment at a time.

Currently:

```console
$ spack env create a
$ spack env create b
$ spack env activate -p a
[a] $ spack env activate -p b
[b] [a] $ spack env activate -p b
[a] [b] [a] $ spack env activate -p a
[a] [b] [a] $ echo $MANPATH | tr ":" "\n"
/path/to/environments/a/.spack-env/view/share/man
/path/to/environments/a/.spack-env/view/man
/path/to/environments/b/.spack-env/view/share/man
/path/to/environments/b/.spack-env/view/man
```

This PR fixes that:

```console
$ spack env activate -p a
[a] $ spack env activate -p b
[b] $ spack env activate -p a
[a] $ echo $MANPATH | tr ":" "\n"
/path/to/environments/a/.spack-env/view/share/man
/path/to/environments/a/.spack-env/view/man
```
2021-10-28 11:39:25 -07:00
Tom Scogland
87e456d59c spack setup-env.sh: make zsh loading async compatible, and ~10x faster (in some cases) (#26120)
Currently spack is a bit of a bad actor as a zsh plugin, and it was my
fault.  The autoload and compinit should really be handled by the user,
as was made abundantly clear when I found spack was doing completion
initialization for *all* of my plugins due to a deferred setup that was
getting messed up by it.

Making this conditional took spack load time from 1.5 seconds (with
module loading disabled) to 0.029 seconds. I can actually afford to load
spack by default with this change in.

Hopefully someday we'll do proper zsh completion support, but for now
this helps a lot.

* use zsh hist expansion in place of dirname
* only run (bash)compinit if compdef/complete missing
* add zsh compiled files to .gitignore
* move changes to .in file, because spack
2021-10-28 11:32:59 -07:00
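A minimal sketch of the intended zsh setup after this change, assuming the user initializes completion themselves and that `$SPACK_ROOT` points at the Spack clone:

```console
$ autoload -U compinit && compinit
$ . $SPACK_ROOT/share/spack/setup-env.sh   # now loads quickly and leaves compinit to the user
```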
iarspider
5faa457a35 New version: py-fasteners 0.16.3 (#27006) 2021-10-28 13:15:50 -05:00
Harmen Stoppels
8ca3a0fdf8 Remove failing macOS test (#27009) 2021-10-28 09:30:51 -07:00
Brent Huisman
63fcb0331b Add Pybind11 v2.8 (#26867)
* Add Pybind11 v2.8

* Add Python dependency

* Update package.py

* Added Pybind v2.8.1
2021-10-28 16:14:49 +00:00
Robert Blackwell
8fd94e3114 YamlFilesystemView: improve file removal performance via batching (#24355)
* Drastically improve YamlFilesystemView file removal via batching

The `remove_file` routine has to check if the file is owned by multiple packages, so it doesn't
remove necessary files. This is done by the `get_all_specs` routine, which walks the entire
package tree. With large numbers of packages on shared file systems, this can take seconds
per file tree traversal, which adds up extremely quickly. For example, a single deactivate
of a largish python package in our software stack on GPFS took approximately 40 minutes.

This patch simply replaces `remove_file` with a batch `remove_files` routine. This routine
removes a list of files rather than a single file, requiring only one traversal per batch. In
practice this means a package can be removed in seconds, rather than potentially hours,
essentially a ~100x speedup (ignoring initial deactivation logic, which takes about 3 minutes
in our test setup).
2021-10-28 07:39:16 -07:00
Harmen Stoppels
c13f915735 cmake: add v3.21.4, v3.20.6 (#27004) 2021-10-28 13:59:46 +00:00
Cameron Stanavige
afbc67fdd8 dtcmp & lwgrp: add shared variant (#26999) 2021-10-28 15:00:54 +02:00
Michael Kuhn
e9f3ef785d Fix sbang hook for non-writable files (#27007)
* Fix sbang hook for non-writable files

PR #26793 seems to have broken the sbang hook for files with missing
write permissions. Installing perl now breaks with the following error:
```
==> [2021-10-28-12:09:26.832759] Error: PermissionError: [Errno 13] Permission denied: '$SPACK/opt/spack/linux-fedora34-zen2/gcc-11.2.1/perl-5.34.0-afuweplnhphcojcowsc2mb5ngncmczk4/bin/cpanm'
```

Temporarily add write permissions to the original file so it can be
overwritten with the patched one.

And test that file permissions are preserved in sbang even for non-writable files

Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2021-10-28 14:49:23 +02:00
Erik Schnetter
7c886bcac1 nsimd: add v3.0.1, determine SIMD variant automatically by default (#26850) 2021-10-28 14:38:48 +02:00
Paul Ferrell
4ee37c37de buildcaches: fix directory link relocation (#26948)
When relocating a binary distribution, Spack only checks files to see
if they are a link that needs to be relocated. Directories can be
such links as well, however, and need to undergo the same checks
and potential relocation.
2021-10-28 14:34:31 +02:00
Seth R. Johnson
890095e876 llvm: use cmake helper functions (#26988)
* llvm: use cmake helper functions

* llvm: review feedback
2021-10-27 20:26:22 +00:00
iarspider
7416df692a New versions: py-cffi 1.15.0, 1.14.6 (#26979)
* New versions: py-cffi 1.15.0, 1.14.6

* Changes from review
2021-10-27 15:01:15 -05:00
iarspider
dd0770fd64 New versions of py-cachetools (#26976)
* New versions of py-cachetools

* Changes from review
2021-10-27 15:00:48 -05:00
iarspider
704c94429b New version: py-bottle@0.12.19 (#26973)
* New version: py-bottle@0.12.19

* Changes from review
2021-10-27 15:00:19 -05:00
Mark W. Krentel
21d909784c hpcviewer: add support for macosx, add version 2021.10 (#26823) 2021-10-27 19:51:14 +00:00
Kyle Gerheiser
e35eacf87b Add w3emc version 2.9.1 (#26880) 2021-10-27 12:05:06 -06:00
iarspider
dc40405fd6 New version: py-contextlib2 21.6.0 (#26985) 2021-10-27 11:08:25 -06:00
Massimiliano Culpo
3d5444fdd8 Remove documentation tests from GitHub Actions (#26981)
We moved documentation tests to readthedocs a while ago,
so remove the job on GitHub Actions.
2021-10-27 19:02:52 +02:00
iarspider
80d4a83636 New versions: py-bokeh@2.3.3, 2.4.0, 2.4.1 (#26972) 2021-10-27 11:59:42 -05:00
iarspider
39f46b1c3b New version: py-certifi 2021.10.8 (#26978) 2021-10-27 11:54:37 -05:00
H. Joe Lee
1fcc9c6552 hdf5-vol-log: add new package (#26956) 2021-10-27 16:48:44 +00:00
iarspider
1842785eae New version: py-commonmark 0.9.1 (#26983) 2021-10-27 11:47:28 -05:00
Mosè Giordano
acb8ab338d fftw: add v3.3.10 (#26982) 2021-10-27 15:29:53 +00:00
Bernhard Kaindl
d1803af957 fontconfig: add v2.13.94 and fix test with dash (#26961)
Fix `install --test=root` when /bin/sh is dash: a test uses the names
SIGINT SIGTERM SIGABRT EXIT in `trap`, so use signal numbers instead.
2021-10-27 16:20:57 +02:00
Bernhard Kaindl
f4431851ab perl-extutils-installpaths: depend on perl-extutils-config (#26969) 2021-10-27 16:02:14 +02:00
Pieter Ghysels
1f728ab4ce strumpack: add v6.1.0, remove unused variants (#26971) 2021-10-27 15:58:54 +02:00
Valentin Volkl
9fa20b8a39 recola: fix compilation (#26634)
* recola: fix compilation

* Update var/spack/repos/builtin/packages/recola-sm/package.py

Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>

* flake8

* fixes

* fix typo

* fix typo

Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>
2021-10-27 07:19:35 -06:00
Todd Gamblin
4f124bc9e7 tests: speed up spack list tests (#26958)
`spack list` tests are not using mock packages for some reason, and many
are marked as potentially slow. This isn't really necessary; we don't need
6,000 packages to test the command.

- [x] update tests to use `mock_packages` fixture
- [x] remove `maybeslow` annotations
2021-10-27 05:10:39 -06:00
Harmen Stoppels
e04b172eb0 Allow non-UTF-8 encoding in sbang hook (#26793)
Currently Spack reads full files containing shebangs to memory as
strings, meaning Spack would have to guess their encoding. Currently
Spack has a fixed guess of UTF-8.

This is unnecessary, since e.g. the Linux kernel does not assume an
encoding on paths at all, it's just bytes and some delimiters on the
byte level.

This commit does the following:

1. Shebangs are treated as bytes, so that e.g. latin1 encoded files do
not throw Unicode decoding errors; a test for this is added.
2. No more bytes than necessary are read into memory: we only have to read
until the first newline, and from there on we can copy the file byte by
byte instead of decoding and re-encoding text.
3. We cap the number of bytes read to 4096, if no newline is found
before that, we don't attempt to patch it.
4. Add support for luajit too.

This should make Spack both more efficient and usable for non-UTF8
files.
2021-10-27 02:59:10 -07:00
Harmen Stoppels
2fd87046cd Fix assumption v.concrete => isinstance(v, Version) (#26537)
* Add test
* Only extend with Git version when using Version
* xfail v.concrete test
2021-10-27 02:58:04 -07:00
Harmen Stoppels
ae6e83b1d5 config: overrides for caches and system and user scopes (#26735)
Spack's `system` and `user` scopes provide ways for administrators and
users to set global defaults for all Spack instances, but for use cases
where one wants a clean Spack installation, these scopes can be undesirable.
For example, users may want to opt out of global system configuration, or
they may want to ignore their own home directory settings when running in
a continuous integration environment.

Spack also, by default, keeps various caches and user data in `~/.spack`,
but users may want to override these locations.

Spack provides three environment variables that allow you to override or
opt out of configuration locations:

 * `SPACK_USER_CONFIG_PATH`: Override the path to use for the
   `user` (`~/.spack`) scope.

 * `SPACK_SYSTEM_CONFIG_PATH`: Override the path to use for the
   `system` (`/etc/spack`) scope.

 * `SPACK_DISABLE_LOCAL_CONFIG`: set this environment variable to completely
   disable *both* the system and user configuration directories. Spack will
   only consider its own defaults and `site` configuration locations.

And one that allows you to move the default cache location:

 * `SPACK_USER_CACHE_PATH`: Override the default path to use for user data
   (misc_cache, tests, reports, etc.)

With these settings, if you want to isolate Spack in a CI environment, you can do this:

   export SPACK_DISABLE_LOCAL_CONFIG=true
   export SPACK_USER_CACHE_PATH=/tmp/spack

This is a stop-gap approach until we have figured out how to deal with
the system and user config scopes more generally, as there are plans to
potentially / eventually get rid of them.

**User config**

Spack is a bit of a pain when you have:

- a shared $HOME folder across different systems.
- multiple Spack versions on the same system.

**System config**

- On shared systems with a versioned programming environment / toolkit,
  system administrators want to provide config for each version (e.g.
  21.09, 21.10) of the programming environment, and the user Spack
  instance should be able to pick this up without a steep learning
  curve.
- On shared systems the user should be able to opt out of the
  hard-coded config scope in /etc/spack, since it may be incompatible
  with their particular instance. Currently Spack can only opt out of all
  config scopes through overrides with `"config:":`, `"packages:":`, but that
  also drops the defaults config, which would have to be repeated, which
  is undesirable, especially the lengthy packages.yaml.

An example use case is: having config in this folder:

```
/path/to/programming/environment/{version}/{compilers,packages}.yaml
```

and have `module load spack-system-config` set the variable

```
SPACK_SYSTEM_CONFIG_PATH=/path/to/programming/environment/{version}
```

where the user no longer has to worry about what `{version}` they are
on.

**Continuous integration**

Finally, there is the use case of continuous integration, which may
clone an arbitrary Spack version, which optimally should not pick up
system or user config from the previous run (like may happen in
classical bare metal non-containerized filesystem side effect ridden
jenkins pipelines). In fact this is very similar to how spack itself
tries to avoid picking up system dependencies during builds...

**But environments solve this?**

- You could do `include`s in environment files to get similar behavior
  to the spack_system_config_path example, but environments require you
  to:
  1) require paths to individual config files, not directories.
  2) fail if the listed config file does not exist
- They allow you to override config scopes, but this is generally too
  rigorous, as it requires you to repeat the default config, in
  particular packages.yaml, and just defies the point of layered config.

Co-authored-by: Tom Scogland <tscogland@llnl.gov>
Co-authored-by: Tim Fuller <tjfulle@sandia.gov>
Co-authored-by: Steve Leak <sleak@lbl.gov>
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2021-10-26 18:08:25 -07:00
Ye Luo
7dc94b6683 Build OpenMP in LLVM via LLVM_ENABLE_RUNTIMES. (#26870) 2021-10-26 17:43:28 -06:00
Hao Lyu
02bea6d2d2 fix bug when there is a version id in the compiler path (#26916) 2021-10-26 14:12:58 -07:00
Greg Becker
9a637bbd09 modules: allow user to remove arch dir (#24156)
* allow no arch-dir modules

* add tests for modules with no arch

* document arch-specific module roots
2021-10-26 13:26:09 -07:00
Mark W. Krentel
444e156685 hpctoolkit: add version 2021.10.15 (#26881) 2021-10-26 13:15:09 -07:00
Richarda Butler
bc616a60b7 Py-Libensemble: Add E4S testsuite stand alone test (#26270) 2021-10-26 13:11:53 -07:00
Miroslav Stoyanov
ad03981468 fix the spack test dir (#26816) 2021-10-26 13:07:04 -07:00
Sreenivasa Murthy Kolam
5ee2ab314c ROCm packages: add RelWithDebInfo build_type (#26888)
Also set default build_type to Release for many ROCm packages.
2021-10-26 12:18:38 -07:00
Greg Becker
a8a08f66ad modules: configurable module defaults (#24367)
Any spec satisfying a default will be symlinked to `default`

If multiple specs have modulefiles in the same directory and satisfy
configured module defaults, then whichever was written last will be
default.
2021-10-26 19:34:06 +02:00
Kyle Gerheiser
dee75a4945 upp: Add version 10.0.10 (#26946) 2021-10-26 19:26:34 +02:00
Seth R. Johnson
aacdd5614e htop: add new URL and versions (#26928) 2021-10-26 19:14:06 +02:00
Ben Morgan
a2e5a28892 virtest: prevent out-of-order build/test (#26944)
Use of the `-R` flag in the CTest command causes the "empty-14" test to run
(by matching "empty") before the empty-14 target is built.

Patch the CTest command in the build script to match the name exactly.
2021-10-26 18:55:54 +02:00
Asher Mancinelli
94a9733822 hiop: update constraints and add new version (#26905)
* Update hiop package dependencies

* Use single quotes to shrink diff

* add hiop 0.5.1

* apply flake8

* Apply formatting suggestions
2021-10-26 11:43:03 -04:00
iarspider
fb1f3b1a1c Add checksum for py-argon2-cffi 21.1.0 and update python dependency (#26894)
* Add checksum for py-argon2-cffi 21.1.0 and update python dependency

* Update package.py
2021-10-26 09:32:11 -06:00
iarspider
d673e634d0 New versions: py-bleach 4.0.0 and 4.1.0 (#26947) 2021-10-26 09:23:10 -06:00
iarspider
1f1f121e8f New version: py-beniget 0.4.1 (#26945) 2021-10-26 09:02:03 -06:00
Wouter Deconinck
ba2a03e1da geant4: depends_on vecgeom@1.1.8:1.1 range (#26917)
* [geant4] depends_on vecgeom@1.1.8:1.1 range

While previous versions were unclear, the [geant4.10.7 release notes](https://geant4-data.web.cern.ch/ReleaseNotes/ReleaseNotes4.10.7.html) indicate that vecgeom@1.1.8 is a minimum required version, not an exact required version ("Set VecGeom-1.1.8 as minimum required version for optional build with VecGeom."). This will allow some more freedom on the concretizer solutions while allowing geant4 to take advantage of bugfixes and improvements in vecgeom.

* [vecgeom] new version 1.1.17
2021-10-26 10:31:37 -04:00
Seth R. Johnson
dad68e41e0 freetype: explicitly specify dependencies (#26942)
Freetype picked up 'brotli' from homebrew, causing issues downstream.
2021-10-26 04:07:48 -06:00
eugeneswalker
1ae38037ef tau: add version 2.30.2 (#26941) 2021-10-26 05:36:29 -04:00
Chuck Atkins
80d8c93452 unifyfs: Remove the hdf5 variant (examples only) (#26932)
UnifyFS doesn't actually use HDF5 for anything.  Enabling it only allows
a few examples to be built, so it's not really a dependency of the package.
2021-10-26 09:42:52 +02:00
Seth R. Johnson
bdcbc4cefe qt package: versions @:5.13 don't build with gcc@11: (#26922) 2021-10-25 18:20:54 -07:00
Ronak Buch
1d53810d77 charmpp: add version 7.0.0 (#26940) 2021-10-25 18:16:37 -07:00
Adam J. Stewart
4a8c53472d py-kornia: add version 0.6.1 (#26939) 2021-10-25 18:15:57 -07:00
Adam J. Stewart
b05df2cdc7 py-shapely: add version 1.8.0 (#26937) 2021-10-25 18:15:22 -07:00
Adam J. Stewart
92cef8d7ad py-scikit-learn: add version 1.0.1 (#26934) 2021-10-25 18:14:17 -07:00
Adam J. Stewart
912c6ff6e8 PyTorch/torchvision: add version 1.10.0/0.11.1 (#26889)
* For py-torch: Also update dependencies: many version constraints
  with an upper bound of @1.9 are now open (e.g. `@1.8.0:1.9` is
  converted to `@1.8.0:`).
* For py-torchvision: Also add 0.11.0 and update ^pil constraint
  to avoid building with 8.3.0
2021-10-25 15:26:12 -07:00
iarspider
cdcdd71b41 Add new versions of py-autopep8 and py-pycodestyle (#26924)
* Add new versions of py-autopep8 (1.5.7, 1.6.0) and py-pycodestyle (2.7.0, 2.8.0)

* Update package.py

* Restore old versions
2021-10-25 22:08:52 +00:00
Daniel Arndt
e221617386 deal.II package: Update CMake variable for >=9.3.0 (#26909) 2021-10-25 15:00:40 -07:00
Massimiliano Culpo
6063600a7b containerize: pin the Spack version used in a container (#21910)
This PR makes it possible to specify the `url` and `ref` of the Spack instance used in a container recipe, simply by expanding the YAML schema as outlined in #20442:
```yaml
container:
  images:
    os: amazonlinux:2
    spack:
      ref: develop
      resolve_sha: true
```
The `resolve_sha` option, if true, verifies the `ref` by cloning the Spack repository in a temporary directory and transforming any tag or branch name to a commit sha. When this new ability is leveraged, an additional "bootstrap" stage is added, which builds an image with Spack set up and ready to install software. The Spack repository to be used can be customized with the `url` keyword under `spack`.

Modifications:
- [x] Allow pinning the version of Spack, either by branch, tag or sha
- [x] Added a few new OSes (centos:8, amazonlinux:2, ubuntu:20.04, alpine:3, cuda:11.2.1)
- [x] Allow printing the bootstrap image as a standalone image
- [x] Add documentation on the new part of the schema
- [x] Add unit tests for different use cases
2021-10-25 13:09:27 -07:00
iarspider
ff65e6352f py-avro: new version 1.10.2 (#26927)
* Add checksum for py-avro@1.10.2

* Update package.py
2021-10-25 11:45:33 -05:00
Olli Lupton
06c983d38f cuda: add 11.4.1, 11.4.2, 11.5.0. (#26892)
* cuda: add 11.4.1, 11.4.2, 11.5.0.

Note that the curses dependency from cuda-gdb was dropped in 11.4.0.

* Update clang/gcc constraints.

* Address review, assume clang 12 is OK from 11.4.1 onwards.

* superlu-dist@7.1.0 conflicts with cuda@11.5.0.

* Update var/spack/repos/builtin/packages/superlu-dist/package.py

Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>

Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2021-10-25 09:05:21 -07:00
iarspider
b0752bf1b3 Add checksum for py-atomicwrites@1.4.0 (#26923) 2021-10-25 10:54:04 -05:00
Kendra Long!
c9847766e4 draco: new version 7_12_0 (#26907)
* Add draco-7_12_0 to package file

* Update hash to zip version
2021-10-25 11:35:38 -04:00
Kelly (KT) Thompson
86a9c703db eospac: new default version 6.5.0 (#26723)
* [pkg][new version] Provide eospac@6.5.0 and mark it as default.

* Merge in changes found in #21629

* Mark all alpha/beta versions as deprecated.

- Addresses @sethrj's recommendation
- Also add a note indicating why these versions are marked this way.
2021-10-25 11:10:46 -04:00
Harmen Stoppels
276e637522 Reduce verbosity of module files warning
1. Currently it prints not just the spec name, but the dependencies +
their variants + their compilers + their architectures + ...
2. It's clear from the context what spec the message applies to, so
let's not print the spec at all.
2021-10-25 17:07:56 +02:00
Olivier Cessenat
b51e0b363e silo: new release 4.11 (#26876) 2021-10-25 08:13:54 -06:00
Manuela Kuhn
e9740a9978 py-nipype: add 1.7.0 (#26883) 2021-10-25 06:55:50 -06:00
Harmen Stoppels
cc8d8cc9cb Return early in do_fetch when there is no code or a package is external (#26926)
Co-authored-by: Ryan Krattiger <ryan.krattiger@kitware.com>
2021-10-25 13:51:23 +02:00
Todd Gamblin
de8e795563 virtuals: simplify virtual handling
These three rules in `concretize.lp` are overly complex:

```prolog
:- not provider(Package, Virtual),
   provides_virtual(Package, Virtual),
   virtual_node(Virtual).
```

```prolog
  :- provides_virtual(Package, V1), provides_virtual(Package, V2), V1 != V2,
     provider(Package, V1), not provider(Package, V2),
     virtual_node(V1), virtual_node(V2).
```

```prolog
provider(Package, Virtual) :- root(Package), provides_virtual(Package, Virtual).
```

and they can be simplified to just:

```prolog
provider(Package, Virtual) :- node(Package), provides_virtual(Package, Virtual).
```

- [x] simplify virtual rules to just one implication
- [x] rename `provides_virtual` to `virtual_condition_holds`
2021-10-25 09:11:04 +02:00
Massimiliano Culpo
6d69d23aa5 Add a unit test to prevent regression 2021-10-25 09:11:04 +02:00
Massimiliano Culpo
dd4d7bae1d ASP-based solver: a package eligible to provide a virtual must provide it
fixes #26866

This semantics fits with the way Spack currently treats providers of
virtual dependencies. It needs to be revisited when #15569 is reworked
with a new syntax.
2021-10-25 09:11:04 +02:00
Miguel Dias Costa
1e90160d68 berkeleygw: force openmp propagation on some providers of blas / fftw-api (#26918) 2021-10-25 07:42:05 +02:00
Michael Kuhn
dfcd5d4c81 hyperfine: new package (#26901) 2021-10-24 20:47:56 -04:00
Satish Balay
e3b7eb418f Add new math solver versions (#26903)
* tasmanian: add @7.7
* butterflypack: add @2.0.0
* pflotran: add @3.0.2
* alquimia: add @1.0.9
* superlu-dist: add @7.1.1
2021-10-24 20:45:44 -04:00
Michael Kuhn
fb5d4d9397 meson: add 0.60.0 (#26921) 2021-10-24 20:35:11 -04:00
Scott Wittenburg
2003fa1b35 Mark flaky test_ci_rebuild as xfail (#26911) 2021-10-24 22:46:26 +02:00
Michael Kuhn
b481b5114a Fix pkg-config dependencies (#26912)
pkg-config and pkgconf are implementations of the pkgconfig provider.
2021-10-24 20:39:50 +02:00
Michael Kuhn
d7148a74a0 glib: add 2.70.0 (#26915) 2021-10-24 20:39:30 +02:00
Seth R. Johnson
c255e91bba gcc: support alternate mechanism for providing GCC rpaths (#26590)
* gcc: support runtime ability to not install spack rpaths

Fixes #26582 .

* gcc: Fix malformed specs file and add docs

The updated docs point out that the spack-modified GCC does *not*
follow the usual behavior of LD_RUN_PATH!

* gcc: fix bad rpath on macOS

This bug has been around since the beginning of the GCC package file:
the rpath command it generates for macOS writes a single (invalid)
rpath entry.

* gcc: only write rpaths for directories with shared libraries

The original lib64+lib was just a hack for "in case either has one" but
it's easy to tell whether either has libraries that can be directly
referenced.
2021-10-24 13:46:52 -04:00
Michael Kuhn
5a38165674 python: add 3.9.7, 3.8.12, 3.7.12 and 3.6.15 (#26914) 2021-10-24 11:43:39 -06:00
Morten Kristensen
9e11e62bca py-vermin: add latest version 1.3.1 (#26920)
* py-vermin: add latest version 1.3.1

* Exclude line from Vermin since version is already being checked for

Vermin 1.3.1 finds that `encoding` kwarg of builtin `open()` requires Python 3+.
2021-10-24 16:49:05 +00:00
Mark Grondona
7fee7a7ce1 flux-core, -sched: update to latest versions, fix czmq build error (#26840)
* flux-core: fix compile error with czmq_containers
* Fix build with clang and gcc-11
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2021-10-23 17:37:44 +02:00
Jen Herting
7380161ea6 [py-emcee] added versions 3.0.2 and 3.1.1 (#26910)
* [py-emcee] added version 3.0.2

* [py-emcee] added version 3.1.1
2021-10-22 17:52:45 -06:00
eugeneswalker
0baec34dd7 E4S amd64 CI: add parsec (#26906) 2021-10-22 15:02:47 -06:00
Wouter Deconinck
d04b6e0fb5 acts: fix misspelled variant in depends_on (#26904)
The correct variant is the plural `unit_tests`.
2021-10-22 12:19:51 -06:00
Harmen Stoppels
336c60c618 Document backport in py (#26897) 2021-10-22 19:14:35 +02:00
iarspider
0beb2e1ab2 Update py-aiohttp to 3.7.4 and py-chardet to 4.0 (#26893)
* Update py-aiohttp to 3.7.4 and py-chardet to 4.0

* Changes from review

* Update package.py

* Update var/spack/repos/builtin/packages/py-chardet/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-10-22 16:36:56 +00:00
iarspider
aa2bcbdc12 Use pypi for astroid; add versions 2.7.3 and 2.8.3; update dependencies (#26900)
* Use pypi for astroid; add versions 2.7.3 and 2.8.3; update dependencies

* py-six is not required for recent versions
2021-10-22 16:21:22 +00:00
Wouter Deconinck
d5880f3e7e acts: add v14.0.0, v14.1.0 added "python" and "analysis" variants (#26657) 2021-10-22 17:43:35 +02:00
Robert Cimrman
2d41d6fc9e py-sfepy: fix version 2021.3 tar.gz hash (#26890) 2021-10-22 10:23:58 -05:00
iarspider
8349a8ac60 Add py-asn1crypto@1.4.0 (#26895) 2021-10-22 10:22:30 -05:00
Manuela Kuhn
edc7341625 py-jupyterlab: add 3.1.18 and 3.2.1 (#26896) 2021-10-22 10:21:00 -05:00
Harmen Stoppels
609a42d63b Shorten long shebangs only if the execute permission is set (#26899)
The OS only interprets shebangs if a file is executable.

Thus, there should be no need to modify files where no execute bit is set.

This solves issues that are encountered e.g. while packaging software such as
COVISE (https://github.com/hlrs-vis/covise), which includes example data
This solves issues that are e.g. encountered while packaging software as 
COVISE (https://github.com/hlrs-vis/covise), which includes example data
in Tecplot format. The sbang post-install hook is applied to every installed
file that starts with the two characters #!, but this fails on the binary Tecplot
files, as they happen to start with #!TDV. Decoding them with UTF-8 fails 
and an exception is thrown during post_install.

Co-authored-by: Martin Aumüller <aumuell@reserv.at>
2021-10-22 16:55:19 +02:00
Harmen Stoppels
d274769761 Backport #186 from py-py to fix macOS failures (#26653)
Backports the relevant bits of 0f77b6e66f
2021-10-22 13:52:46 +02:00
Harmen Stoppels
7cec6476c7 Remove hard-coded repository in centos6 unit test (#26885)
This makes it easier to run tests in a fork.
2021-10-22 10:24:03 +02:00
Doug Jacobsen
d1d0021647 Add GCS Bucket Mirrors (#26382)
This commit contains changes to support Google Cloud Storage 
buckets as mirrors, meant for hosting Spack build-caches. This 
feature is beneficial for folks that are running infrastructure on 
Google Cloud Platform. On public cloud systems, resources are
ephemeral and in many cases, installing compilers, MPI flavors,
and user packages from scratch takes up considerable time.

Giving users the ability to host a Spack mirror that can store build
caches in GCS buckets offers a clean solution for reducing
application rebuilds for Google Cloud infrastructure.

Co-authored-by: Joe Schoonover <joe@fluidnumerics.com>
2021-10-22 06:22:38 +02:00
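A hedged sketch of pointing Spack at such a mirror; the bucket name is a placeholder and the `gs://` URL scheme is assumed from this change:

```console
$ spack mirror add gcs-cache gs://my-spack-build-cache
$ spack buildcache list          # query build caches on configured mirrors
```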
Miroslav Stoyanov
d024faf044 heffte: new version with rocm (#26878) 2021-10-22 01:12:46 +00:00
Cameron Stanavige
8b8bec2179 scr/veloc: component releases (#26716)
* scr/veloc: component releases

Update the ECP-VeloC component packages in preparation for an
upcoming scr@3.0rc2 release.

All
- Add new release versions
- Add new `shared` variant for all components
- Add zlib link dependency to packages that were missing it
- Add maintainers
- Use self.define and self.define_from_variant to clean up cmake_args

axl
- Add independent vendor async support variants

rankstr
- Update older version sha that fails checksum on install

* Fix scr build error

Lock dependencies for scr@3.0rc1 to the versions released at the same
time.
2021-10-21 18:05:59 -07:00
Frank Willmore
a707bf345d Add vacuumms package (#26838) 2021-10-21 17:12:08 -07:00
Robert Cimrman
9b28f99dd2 py-sfepy: add v2021.3 (#26865) 2021-10-21 23:50:38 +00:00
Manuela Kuhn
dbe1060420 py-neurokit2: add 0.1.4.1 (#26871)
* py-neurokit2: add 0.1.4.1

* Update var/spack/repos/builtin/packages/py-neurokit2/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-10-21 21:03:04 +00:00
eugeneswalker
3720d06e26 update E4S CI environments in preparation for 21.11 release (#26826)
* update E4S CI environments in preparation for 21.11 release

* e4s ci env: use clingo
2021-10-21 07:06:02 -07:00
Alexander Jaust
3176c6d860 [py-pyprecice] add v2.3.0.1 and fix hash of v2.2.1.1 (#26846) 2021-10-21 08:36:24 -05:00
Harmen Stoppels
3fa0654d0b Make 'spack location -e' print the current env, and 'spack cd -e' go to the current env (#26654) 2021-10-21 13:19:48 +02:00
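For example, inside an active environment (the environment name and path are placeholders):

```console
$ spack env activate myenv
$ spack location -e      # prints the directory of the currently active environment
/path/to/environments/myenv
$ spack cd -e            # changes into that directory
```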
Eric Brugger
7f4bf2b2e1 vtk: make use of system GLEW dependent on osmesa being disabled. (#26764) 2021-10-21 09:53:13 +02:00
Vanessasaurus
3fe1785d33 libabigail: support source install (#26807)
Signed-off-by: vsoch <vsoch@users.noreply.github.com>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2021-10-21 09:22:22 +02:00
Wouter Deconinck
b65937e193 kassiopeia: add v3.8.0; new variants log4cxx, boost (#26841) 2021-10-21 07:44:57 +02:00
Alexander Jaust
97d244eb95 Make superlu-dist@7.1.0 request CMake 3.18.1 or newer (#26849) 2021-10-21 07:43:30 +02:00
Edward Hartnett
88591536c8 bacio: add v2.5.0 (#26851) 2021-10-21 07:41:47 +02:00
RichardABunt
f78e5383cc arm-forge: add v21.1 (#26852) 2021-10-21 07:41:06 +02:00
downloadico
34d4091dbc tramonto: add new package (#26858) 2021-10-21 07:18:33 +02:00
Stephen Hudson
afd4a72937 libEnsemble: add v0.8.0 (#26864) 2021-10-21 07:14:31 +02:00
Adam J. Stewart
342f89e277 py-numpy: add v1.21.3 (#26860) 2021-10-21 07:13:43 +02:00
Luc Berger
fc7178da17 Kokkos Kernels: updating maintainers and releases (#26855)
Adding the latests releases of Kokkos Kernels and the official maintainers for the package.
2021-10-20 12:19:56 -07:00
iarspider
594325ae09 Add py-sniffio version 1.2.0 and fix dependencies (#26847)
* Add py-sniffio version 1.2.0 and fix dependencies

* Changes from review
2021-10-20 16:30:26 +00:00
iarspider
fcff01aaaf Update py-anyio to 3.3.4 (#26821)
* Update py-anyio to 3.3.4

* Changes from PR
2021-10-20 10:36:45 -05:00
Manuela Kuhn
b9c633d086 py-bids-validator: add 1.8.4 (#26844) 2021-10-20 10:33:27 -05:00
kwryankrattiger
45bea7cef7 Update ECP dav helper for propagating variants (#26175) 2021-10-20 11:22:07 -04:00
Martin Aumüller
029b47ad72 embree: add checksums for 3.12.2 & 3.13.1 (#26675)
Includes a fix for dependency ispc: fix build if cc -m32 is not possible
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2021-10-20 16:41:18 +02:00
Tamara Dahlgren
cc8b6ca69f Add --preferred and --latest to spack checksum (#25830) 2021-10-20 13:38:55 +00:00
Adam J. Stewart
f184900b1a damask: fix maintainer GitHub handle (#26829) 2021-10-20 14:57:13 +02:00
Wouter Deconinck
abc8cfe8fa cppgsl: branch master is now main (#26834) 2021-10-20 14:56:01 +02:00
Massimiliano Culpo
56209cb114 Reduce verbosity of error messages when concretizing environments (#26843)
With this commit stacktraces of subprocesses are shown only if debug mode is active
2021-10-20 11:30:07 +00:00
Alexander Jaust
26b58701bc Fix typo in repositories.rst (#26845) 2021-10-20 11:11:17 +00:00
Wouter Deconinck
42a6e1fcee root: prepend dependent_spec.prefix.include to ROOT_INCLUDE_PATH (#26379)
Spack is not populating CPATH anymore (e3f97b37e6), and downstream packages (e.g. [gaudi](var/spack/repos/builtin/packages/gaudi/package.py)) have already started to include this in their package.py files. Instead of propagating this to all downstream packages, it tries to address the issue at the source.
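A sketch of the resulting hook in the root package (the real recipe guards this further; the fragment only shows the mechanism, and the base class is assumed):

```python
from spack import *  # Spack package API (provides CMakePackage and the env object)


class Root(CMakePackage):
    """Fragment for illustration only."""

    def setup_dependent_build_environment(self, env, dependent_spec):
        # Dependents such as gaudi need their own headers visible to ROOT's
        # dictionary generation; Spack no longer populates CPATH, so expose
        # them via ROOT_INCLUDE_PATH instead.
        env.prepend_path("ROOT_INCLUDE_PATH", dependent_spec.prefix.include)
```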
2021-10-20 09:15:14 +02:00
Massimiliano Culpo
1983d6d9d0 Pipelines: pin importlib-metadata and setuptools-scm versions (#26831)
This will hopefully be a hotfix for the issue with pipelines
in #26813 and #26820
2021-10-19 16:09:40 -07:00
Chris White
b7d94ebc4f allow camp to use blt package (#26765) 2021-10-19 16:09:15 -07:00
Harmen Stoppels
1b634f05e0 A single process pool is not something to boast about (#26837) 2021-10-19 23:04:52 +00:00
Greg Becker
7dc0ca4ee6 cray architecture detection for zen3/milan (#26827)
* Update cray architecture detection for milan

Update the cray architecture module table with x86-milan -> zen3
Make cray architecture detection more robust by backing off from the frontend
architecture to a recent ancestor if necessary. This should make
future cray updates less painful for users.

Co-authored-by: Gregory Becker <becker33.llnl.gov>
Co-authored-by: Todd Gamblin <gamblin2@llnl.gov>
2021-10-19 21:39:50 +00:00
Kai Germaschewski
501fa6767b adios2: fix unresolved symbols in 2.6.0 when built with gcc10 (#23871) 2021-10-19 21:09:07 +00:00
Harmen Stoppels
e7c7f44bb6 Reduce verbosity of threaded concretization (#26822)
1. Don't use 16 digits of precision for the seconds, round to 2 digits after the comma
2. Don't print if we don't concretize (i.e. `spack concretize` without `-f` doesn't have to tell me it did nothing in `0.00` seconds)
2021-10-19 18:33:17 +00:00
iarspider
bc99d8a2fd Update py-zipp (#26819)
* Update py-zipp

* Fix typo
2021-10-19 13:29:32 -05:00
Garth N. Wells
2bfff5338f py-fenics-ffcx: dependency updates (#26783)
* Update py-fenics-ffcx dependencies

* Relax some version numbering

* Remove stray colon
2021-10-19 13:28:20 -05:00
iarspider
a678a66683 py-setuptools-scm: make py-tomli dependency an open range (#26820) 2021-10-19 16:08:56 +00:00
Massimiliano Culpo
4c082a5357 Relax os constraints in e4s pipelines (#26547) 2021-10-19 10:51:37 -05:00
Massimiliano Culpo
2d45a9d617 Speed-up environment concretization on linux with a process pool (#26264)
* Speed-up environment concretization with a process pool

We can exploit the fact that the environment is concretized
separately and use a pool of processes to concretize it.

* Add module spack.util.parallel

Module includes `pool` and `parallel_map` abstractions,
along with implementation details for both.

* Add a new hash type to pass specs across processes

* Add tty msg with concretization time
2021-10-19 10:09:34 -05:00
Ryan Mast
64a323b22d spdlog: add v1.9.0, v1.9.1 and v1.9.2 (#26777) 2021-10-19 16:55:36 +02:00
Ryan Mast
f7666f0074 asio: Add versions up to 1.20.0 (#26778) 2021-10-19 16:54:55 +02:00
Ryan Mast
b72723d5d9 cli11: added v2.1.1, v2.1.0 (#26780) 2021-10-19 16:54:33 +02:00
Ryan Mast
65d344b2f8 cli11: disable building the examples (#26781) 2021-10-19 16:53:45 +02:00
Adam J. Stewart
9afc5ba198 py-pandas: add v1.3.4 (#26788) 2021-10-19 16:50:42 +02:00
Christopher Kotfila
ad35251860 Fix trigger and child links in pipeline docs (#26814) 2021-10-19 14:44:36 +00:00
Paul
9eaccf3bbf Add Go 1.17.2 and 1.16.9 (#26794) 2021-10-19 16:30:59 +02:00
Mark Grondona
bb1833bc8c flux: update maintainer of flux-core, flux-sched (#26800) 2021-10-19 16:30:33 +02:00
Robert Pavel
c66aa858c2 Made Legion Dependency on Gasnet Tarball Explicit (#26805)
Made legion dependency on a gasnet tarball explicit so as to take
advantage of spack mirrors for the purpose of deploying on machines with
firewalls
2021-10-19 16:28:43 +02:00
Erik Schnetter
4fc8ed10cb nsimd: add v3.0 (#26806) 2021-10-19 16:27:29 +02:00
Adam J. Stewart
0de8c65a2d Libtiff: improve compression support (#26809) 2021-10-19 16:17:37 +02:00
genric
42fb34e903 py-xarray: add v0.18.2 (#26811) 2021-10-19 16:15:56 +02:00
iarspider
d6fc914a9b Update py-importlib-metadata and py-setuptools-scm (#26813) 2021-10-19 16:08:53 +02:00
Massimiliano Culpo
79c92062a8 Gitlab pipelines: use images from the Spack organization (#26796) 2021-10-19 14:38:39 +02:00
Scott Wittenburg
95538de731 Speed up pipeline generation (#26622)
- [x] Stage already concretized specs instead of abstract ones
- [x] Reduce number of network calls by reading naughty list up front
2021-10-18 20:58:02 -07:00
Miguel Dias Costa
d41ddb8a9c new package: berkeleygw (#21455) 2021-10-18 18:17:58 -07:00
Harmen Stoppels
2b54132b9a cosma: add new versions and improve package (#24136)
* cosma: add new versions and improve package

* Move method below depends_on's
2021-10-18 18:06:14 -07:00
Todd Gamblin
c5ca0db27f patches: make re-applied patches idempotent (#26784)
We use POSIX `patch` to apply patches to files when building, but
`patch` by default prompts the user when it looks like a patch
has already been applied. This means that:

1. If a patch lands in upstream and we don't disable it
   in a package, the build will start failing.
2. `spack develop` builds (which keep the stage around) will
   fail the second time you try to use them.

To avoid that, we can run `patch` with `-N` (also called
`--forward`, but the long option is not in POSIX). `-N` causes
`patch` to just ignore patches that have already been applied.
This *almost* makes `patch` idempotent, except that it returns 1
when it detects already applied patches with `-N`, so we have to
look at the output of the command to see if it's safe to ignore
the error.

- [x] Remove non-POSIX `-s` option from `patch` call
- [x] Add `-N` option to `patch`
- [x] Ignore error status when `patch` returns 1 due to `-N`
- [x] Add tests for applying a patch twice and applying a bad patch
- [x] Tweak `spack.util.executable` so that it saves the error that
      *would have been* raised with `fail_on_error=True`. This lets
      us easily re-raise it.
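A standalone sketch of the resulting behavior (using the standard library rather than Spack's `spack.util.executable` helper; the "Skipping patch" diagnostic it greps for is an assumption about GNU patch's wording, not Spack's implementation):

```python
import subprocess


def forward_patch(patch_file, workdir):
    """Apply a patch with -N so already-applied hunks are skipped.

    patch still exits with status 1 when it skips hunks, so the status alone
    cannot distinguish "already applied" from a real failure; the output has
    to be inspected as well.
    """
    proc = subprocess.run(
        ["patch", "-N", "-p1", "-i", patch_file],
        cwd=workdir, capture_output=True, text=True,
    )
    if proc.returncode == 0:
        return "applied"
    if proc.returncode == 1 and "Skipping patch" in proc.stdout:
        return "already applied"  # safe to ignore the non-zero status
    raise RuntimeError("patch failed:\n%s%s" % (proc.stdout, proc.stderr))
```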

Co-authored-by: Greg Becker <becker33@llnl.gov>
2021-10-18 23:11:42 +00:00
Mickaël Schoentgen
a118e799ae httpie: add v2.6.0 (#26791)
Signed-off-by: Mickaël Schoentgen <contact@tiger-222.fr>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-10-18 21:21:08 +00:00
Seth R. Johnson
bf42d3d49b file and python dependents: add missing dependencies (#26785)
* py-magic: delete redundant package

This package is actually named py-python-magic (since the project itself
is "python-magic").

* New package: libmagic

* Py-python-magic: add required runtime dependency on libmagic and new version

* Py-filemagic: add required runtime dependency

* py-magic: restore and mark as redundant

This reverts commit 4cab7fb69e.

* file: add implicit dependencies and static variant

Replaces redundant libmagic that I added. Compression headers were previously
being picked up from the system.

* Fix py-python-magic dependency

* Update python version requirements
2021-10-18 21:11:16 +00:00
Miroslav Stoyanov
e0ff44a056 tasmanian: add smoke test (#26763) 2021-10-18 16:09:46 -04:00
Seth R. Johnson
c48b733773 Make macOS installed libraries more relocatable (#26608)
* relocate: call install_name_tool less

* zstd: fix race condition

Multiple times on my mac, trying to install in parallel led to failures
from multiple tasks trying to simultaneously create `$PREFIX/lib`.

* PackageMeta: simplify callback flush

* Relocate: use spack.platforms instead of platform

* Relocate: code improvements

* fix zstd

* Automatically fix rpaths for packages on macOS

* Only change library IDs when the path is already in the rpath

This restores the hardcoded library path for GCC.

* Delete nonexistent rpaths and add more testing

* Relocate: Allow @executable_path and @loader_path
2021-10-18 13:34:16 -04:00
Shahzeb Siddiqui
3c013b5be6 docutils > 0.17 issue with rendering list items in sphinx (#26355)
* downgrade_docutils_version

* invalid version

* Update requirements.txt

* Improve spelling and shorten the reference link

* Update spack.yaml

* update version requirement

* update version to maximum of 0.16

Co-authored-by: bernhardkaindl <43588962+bernhardkaindl@users.noreply.github.com>
2021-10-18 16:55:46 +00:00
Harmen Stoppels
30e8dd95b5 Remove unused exist_errors in installer.py (#26650) 2021-10-18 15:53:51 +02:00
Harmen Stoppels
1e5f7b3542 Don't print error output in the test whether gpgconf works (#26682) 2021-10-18 15:52:53 +02:00
Sreenivasa Murthy Kolam
1156c7d0a9 allow multiple values for tensile_architecture and expand the gpu list for rocm-4.3.1 (#26745) 2021-10-18 08:48:07 +02:00
Xavier Delaruelle
cab17c4ac3 environment-modules: add version 5.0.1 (#26786) 2021-10-18 08:46:57 +02:00
Harmen Stoppels
33ef7d57c1 Revert 19736 because conflicts are avoided by clingo by default (#26721) 2021-10-18 08:41:35 +02:00
Valentin Volkl
9c7e71df34 py-jupytext: add new package (#26732)
* py-jupytext: add new package

* Apply suggestions from code review

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* update jupytext dependencies

* Apply suggestions from code review

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-jupytext: remove py-jupyerlab dependency

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-10-17 18:51:42 -05:00
Morten Kristensen
03f84fb440 py-vermin: add latest version 1.3.0 (#26787) 2021-10-17 11:46:53 -05:00
Valentin Volkl
d496568ff9 py-gevent: add version 1.5 (#26731)
* py-gevent: add version 1.5

* py-gevent: update dependencies for v1.5.0

* Apply suggestions from code review

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-gevent/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-10-17 11:07:11 -05:00
Sam Reeve
ba62b691a6 Add ECP tags for CoPA and related packages (#26739)
* Add ECP tags for CoPA (and related) packages

* Update CoPA maintainers
2021-10-17 10:00:28 -04:00
Ryan Mast
b794b5bd4d nlohmann-json: update to version 3.10.4 (#26779) 2021-10-17 09:56:20 -04:00
Bernhard Kaindl
64143f970e [Fix for the GitLab CI] phist: prefer @1.9.5 (1.9.6 is not compatible w/ mpich%gcc:9) (#26773)
* phist: Prefer 1.9.5 (1.9.6 uses mpi_f08, but not available in CI)

* phist: remove dupe of 1.9.5, missing preferred=True

Also, for 1.9.6, patch the (most, one does not work) tests to use
2021-10-16 11:18:07 -07:00
Scot Halverson
b2f059e547 Update GASNet package.py to include version 2021.9.0 (#26736) 2021-10-15 21:15:32 +02:00
Mosè Giordano
2cf7d43e62 libblastrampoline: add v3.1.0 (#26769)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2021-10-15 18:17:26 +00:00
Brice Videau
0bc1bffe50 Fix ruby dependent extensions. (#26729)
* Fix ruby dependent extensions.

* Added Kerilk as maintainer.
2021-10-15 16:59:32 +00:00
Axel Huebl
33da53e325 GCC: Conflict for <12 for M1 (#26318)
aarch64/M1 is only a supported build combination for GCC in
the planned GCC 12+ release.
2021-10-15 12:30:09 -04:00
Mickaël Schoentgen
6686db42ad py-charset-normalizer: add v2.0.7 (#26756)
Signed-off-by: Mickaël Schoentgen <contact@tiger-222.fr>
2021-10-15 15:58:09 +00:00
Manuela Kuhn
ec67bec22a py-rdflib: add 6.0.2 (#26757) 2021-10-15 10:35:59 -05:00
Manuela Kuhn
2a5a576667 py-ipykernel: add 6.4.1 and fix deps (#26758) 2021-10-15 10:34:51 -05:00
Manuela Kuhn
5dc8094f84 py-setuptools: add 58.2.0 (#26759) 2021-10-15 10:31:52 -05:00
Manuela Kuhn
012b4f479f py-jupyter-client: add 6.1.12 (#26760) 2021-10-15 10:30:50 -05:00
Harmen Stoppels
f8e4aa7d70 Revert "Don't run lsb_release on linux (#26707)" (#26754)
This reverts commit fcac95b065.
2021-10-15 09:34:04 +00:00
Harmen Stoppels
e0fbf09239 EnvironmentModifications: allow disabling stack tracing (#26706)
Currently Spack keeps track of the code origin of any modification
to environment variables. This is very slow and is enabled
unconditionally, even in code paths where the origin of the
modification is never queried.

The only place where we inspect the origins of environment
modifications is before we start a build: an override of the type
`e.set(...)` after incremental changes like `e.append_path(..)` is
flagged as a "suspicious" change.

This is very rare though.

If an override like this ever happens, it might mean a package is
broken. If that leads to build errors, we can just ask the user to run
`spack -d install ...` and check the warnings issued by Spack to find
the origins of the problem.
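For context, the kind of sequence the origin tracking is meant to flag looks roughly like this (a minimal sketch against `spack.util.environment`; the tracing toggle added by this PR itself is not shown):

```python
from spack.util.environment import EnvironmentModifications

env = EnvironmentModifications()
env.append_path("PATH", "/opt/view/bin")  # incremental change
env.set("PATH", "/usr/bin")               # later override: "suspicious", since it
                                          # silently discards the append above
env.apply_modifications()                 # applies the result to os.environ
```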
2021-10-15 10:00:44 +02:00
Vanessasaurus
842e56efb8 libabigail: add v2.0 (#26753)
Signed-off-by: vsoch <vsoch@users.noreply.github.com>

Co-authored-by: vsoch <vsoch@users.noreply.github.com>
2021-10-15 07:53:29 +02:00
Joseph Wang
0749d94ad3 Disable parallel builds in groff and gosam-contrib (#26730)
Work around to #26726 and #26714
2021-10-15 07:20:18 +02:00
Cameron Rutherford
2bc0ea70ed HiOp: add v0.5.0 + small changes in dependencies (#26744) 2021-10-15 07:09:09 +02:00
kwryankrattiger
bf5ef3b6b9 paraview: add adios2 variant (#26728) 2021-10-15 07:07:04 +02:00
Timothy Brown
7ccdae5233 Removing NCEP Post (ncep-post). (#26749)
UPP and ncep-post are the same package, so this PR 
removes the duplication. 

ncep-post was originally named after the upstream repo
that now changed its name to UPP.
2021-10-15 07:04:14 +02:00
Eric Brugger
ea453db674 vtk: modify conflict between osmesa and qt (#26752) 2021-10-15 06:58:43 +02:00
Tamara Dahlgren
41d375f6a4 Stand-alone tests: disallow re-using an alias (#25881)
It can be frustrating to successfully run `spack test run --alias <name>` only to find you cannot get the results because you already use `<name>` in some previous stand-alone test execution.  This PR prevents that from happening.
2021-10-14 15:08:00 -07:00
Tamara Dahlgren
beb8a36792 Remove extra tag assignments (#26692) 2021-10-14 14:50:09 -07:00
Manuela Kuhn
28ebe2f495 py-datalad: add 0.15.2 (#26750) 2021-10-14 21:44:52 +00:00
Harmen Stoppels
4acda0839b cp2k: use variant propagation trick for virtuals (#26737) 2021-10-14 23:11:22 +02:00
Massimiliano Culpo
eded8f48dc ASP-based solver: add a rule for version uniqueness in virtual packages (#26740)
fixes #26718

A virtual package may or may not have a version, but it
never has more than one. Previously we were missing a rule
for that.
2021-10-14 23:06:41 +02:00
Christoph Junghans
d9d0ceb726 add py-pyh5md and update py-espressopp (#26746)
* add  py-pyh5md and update py-espressopp

* Update package.py
2021-10-14 19:17:02 +00:00
Bernhard Kaindl
6ca4d554cd libfive: Add all variants, +qt needs qt@5.15.2:+opengl (#26629)
Refresh of deps to fix the build and add variants from CMakeLists.txt
2021-10-14 11:42:01 -07:00
Mosè Giordano
74c6b7d8c1 sombrero: add version 2021-08-16 (#26741) 2021-10-14 19:41:50 +02:00
Bernhard Kaindl
95030a97b6 vc: Enable the testsuite, excluding tests failing on Zen2 (#26699)
This fixes running the testsuite, it adds the package virtest for it.
2021-10-14 09:02:31 -07:00
Miroslav Stoyanov
b64a7fe1ff switch to the smoke testing included in heffte (#26720) 2021-10-14 08:58:29 -07:00
iarspider
160371f879 alpgen: new package (#26713)
Co-authored-by: Bernhard Kaindl <43588962+bernhardkaindl@users.noreply.github.com>
2021-10-14 16:58:09 +02:00
Thomas Madlener
49354140be edm4hep: new version, fix tests (depends on catch2) (#26679) 2021-10-14 14:04:45 +00:00
Thomas Madlener
25704ae8e6 podio: new version and fix python unittest env (#26649) 2021-10-14 13:54:02 +00:00
Bernhard Kaindl
a61853816f phist: Fix build of 1.9.6, fix build- and install-tests (#26727)
Primary fix:

Due to a typo in a version range, overlapping PR merges resulted
in a build failure of the latest version:
Don't attempt to remove a non-existing file for version 1.9.6.

Secondary fixes:

update_tpetra_gotypes.patch was mentioned twice, and the version
range has to exclude @1.4.2, to which it cannot be applied.

Add depends_on() for py-pytest, py-numpy and pkgconfig with type='test'.

@:1.9.0 fails with 'Rank mismatch' with gfortran@10:, so add a conflicts().

raise InstallError('~mpi not possible with kernel_lib=builtin!')
when applicable.

Fixes for spack install --test=root phist:

mpiexec -n12 puts a lot of stress on a pod and gets stuck in a loop
very often: Reduce the mpiexec procs and the number of threads.

Remove @run_after('build') @on_package_attributes(run_tests=True):
from 'def check()': prevents it from getting called twice

The build script of 'make test_install' for the installcheck expects
the examples to be copied to self.stage.path: Provide them.
2021-10-14 08:14:25 -04:00
AMD Toolchain Support
8c1399ff7c LAMMPS: update recipe for %aocc (#26710)
* updating the recipe for betterment

* addressing the suggestions received from reviewers

* adding package helper macros

Co-authored-by: mohan002 <mohbabul@amd.com>
2021-10-14 12:03:15 +00:00
Eric Brugger
2bc97f62fd Qt: Qt fixes for a Cray AMD system. (#26722)
* Qt fixes for a Cray AMD system.

* Update to latest changes.
2021-10-14 07:41:26 -04:00
Massimiliano Culpo
949094544e Constrain abstract specs rather than concatenating strings in the "when" context manager (#26700)
Using the Spec.constrain method doesn't work since it might
trigger a repository lookup which could break our directives
and trigger a circular import error.

To fix that we introduce a function to merge abstract anonymous
specs, based only on package names, which does not perform any
lookup in the repository.
2021-10-14 12:33:10 +02:00
Bernhard Kaindl
2c3ea68dd1 openslide: Fix missing dependencies: gdk-pixbuf and perl-alien-libxml2 (#26620)
Add missing pkgconfig to openslide and its dep perl-alien-libxml2.

Fix shared-mime-info to be a runtime dependency of gdk-pixbuf.
Otherwise, configure cannot detect and use gdk-pixbuf without errors.
2021-10-13 19:07:27 -05:00
Bernhard Kaindl
99200f91b7 r-imager: add depends_on('r+X') and bump version (#26697) 2021-10-13 19:02:54 -05:00
Satish Balay
36bd69021d dealii: @9.3.1 has build errors with boost@1.77.0 - so add dependency on boost@1.76.0 [or lower] (#26709) 2021-10-14 01:30:53 +02:00
Harmen Stoppels
fcac95b065 Don't run lsb_release on linux (#26707)
Running `lsb_release` on Linux takes about 50ms because it is written in
Python. We do not use the output, so this change stops calling it.
2021-10-14 01:27:24 +02:00
Manuela Kuhn
0227fc3ddc py-mne: add full variant (#26702) 2021-10-13 16:59:46 -05:00
Keita Iwabuchi
7c8c29a231 metall: add version 0.17 (#26694)
Co-authored-by: Bernhard Kaindl <43588962+bernhardkaindl@users.noreply.github.com>
2021-10-13 22:10:16 +02:00
Bernhard Kaindl
5782cb6083 magics: Add v4.9.3 to fix build with gcc@11, skip broken testcase (#26695)
To build with gcc-11, v4.9.3 is needed, conflict added for older revs.
2021-10-13 12:14:37 -07:00
Bernhard Kaindl
f4a132256a qgis: fix build of LTS release with proj>7 (#26696)
Co-authored-by: Sinan <sbulutw@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-10-13 20:42:18 +02:00
David Beckingsale
2864212ae3 Add camp 0.3.0 and 0.2.3 (#26717) 2021-10-13 18:41:35 +00:00
Bernhard Kaindl
87663c6796 vapor: Fix the build and update: Use correct deps and find numpy incdir (#26630)
vapor needs proj@:7 and gives a list of tested dependency versions.
Make it find the numpy include path and add version 3.5.0 as well
2021-10-13 13:25:18 -04:00
Bernhard Kaindl
14a929a651 sfcgal: build fails with cgal@:4.6, works with cgal@4.7: (#26642)
Use depends_on('cgal@4.7: +core') to fix the build
2021-10-13 11:47:09 -05:00
Bernhard Kaindl
bcd1272253 wireshark: Fix install race and skip network capture tests (#26698)
The network capture tests can't pass when built as a normal user.
2021-10-13 12:39:19 -04:00
Todd Kordenbrock
6125117b5d SEACAS: add a Faodel variant (#26583)
* SEACAS: add a Faodel variant

* Use safer CMake and variant packages instead of directly adding parameters
Add a "+faodel ~mpi" dependency to balance "+faodel +mpi"
2021-10-13 12:39:11 -04:00
Satish Balay
37278c9fa0 superlu-dist add version 7.1.0 (#26708) 2021-10-13 11:34:43 -05:00
Patrick Gartung
047c95aa8d buildcache: do one less tar file extraction
The buildcache is now extracted in a temporary folder within the current store,
moved to its final place and relocated. 

"spack clean -s" has been extended to also clean the temporary extraction directory.

Add hardlinks with absolute paths for libraries in the corge, garply and quux packages
to detect incorrect handling of hardlinks in tests.
2021-10-13 17:38:29 +02:00
haralmha
e6b76578d2 Add version 4.12.0 (#26532) 2021-10-13 14:07:48 +00:00
iarspider
81c272fcb7 photos-f: new package (Fortran version) (#26703) 2021-10-13 15:56:11 +02:00
Jose E. Roman
c47eb7217e slepc: set up SLEPC_DIR for dependent packages (#26701) 2021-10-13 15:51:08 +02:00
Jen Herting
8ad608a756 py-convokit: new package (#26236)
Co-authored-by: Sid Pendelberry <sid@rit.edu>
2021-10-13 15:50:07 +02:00
Joseph Wang
252784ccf5 cppgsl: disable tests on gcc11 (#26593) 2021-10-13 09:36:45 -04:00
Bernhard Kaindl
b771c4ea01 phist: Fix build: ppc64_sse.patch only applies to 1.9.4 (#26704)
ppc64_sse.patch can only be applied to 1.9.4:
* Older releases don't have the patched file
* All newer releases carry the change of the patch already.
2021-10-13 09:20:50 -04:00
Francis Kloss
53461b7b04 salome-medcoupling: new package (with dependencies) (#25785)
Adds new packages for using MEDCoupling from SALOME platform
2021-10-13 13:50:37 +02:00
Joe Schoonover
4d58661d08 feq-parse: add version 1.1.0 and update maintainer (#26060) 2021-10-13 00:20:21 +00:00
Valentin Volkl
81d157b05e garfieldpp: update dependencies, add variant (#25816) 2021-10-13 02:13:50 +02:00
Jen Herting
0f9080738f [py-spacy] added version 2.3.7 (#25999) 2021-10-12 23:57:04 +00:00
Scott McMillan
b937aa205b Fix Amber patch target specification (#26687)
Co-authored-by: Scott McMillan <smcmillan@nvidia.com>
2021-10-13 00:15:05 +02:00
Jose E. Roman
31b0fe21a2 py-slepc4py: add missing depends_on() (#26688) 2021-10-13 00:09:19 +02:00
Bernhard Kaindl
e7bfaeea52 libical: Add missing deps: pkgconfig, glib and libxml2 (#26618)
Libical needs pkgconfig, glib and libxml2 to build.
2021-10-12 23:22:49 +02:00
Harmen Stoppels
1ed695dca7 Improve error messages for bootstrap download failures (#26599) 2021-10-12 22:11:07 +02:00
Bernhard Kaindl
b6ad9848d2 babelflow, parallelmergetree: fix build with gcc11 (#26681)
gcc-11 does not include <limits> and <algorithm> as a side effect
of including other headers, at least not as often as earlier gcc did.
2021-10-12 21:45:00 +02:00
Stephen Herbein
8d04c8d23c flux-core, flux-sched: add 0.29.0, 0.18.0 and cleanup env vars (#26391)
Problem: Flux expects the `FLUX_PMI_LIBRARY_PATH` to point directly at
the `libpmi.so` installed by Flux.  When the env var is unset,
prepending to it results in this behavior.  In the rare case that the
env var is already set, then the spack `libpmi.so` gets prepended with a
`:`, which Flux then attempts to interpret as a single path.

Solution: don't prepend to the path, instead set the path to point to
the `libpmi.so` (which will be undone when Flux is unloaded).
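Roughly, the fix looks like the following package fragment (the base class and the library path are illustrative placeholders, not the exact flux-core recipe):

```python
import os

from spack import *  # Spack package API (provides AutotoolsPackage)


class FluxCore(AutotoolsPackage):
    """Fragment for illustration only."""

    def setup_run_environment(self, env):
        # Flux treats FLUX_PMI_LIBRARY_PATH as a single file path, so set it
        # outright; prepending would produce "spack_path:old_path" whenever the
        # variable is already defined, which Flux then tries to open as one path.
        env.set("FLUX_PMI_LIBRARY_PATH",
                os.path.join(self.prefix.lib, "flux", "libpmi.so"))
```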

* flux-core: remove deprecated environment variables

The earliest checksummed version in this package is 0.15.0. As of
0.12.0, wreck (and its associated paths) no longer exist in Flux. As of
0.13.0, the `FLUX_RCX_PATH` variables are no longer used. So clean up
these env vars from the `setup_run_environment`.
2021-10-12 21:39:43 +02:00
Adam J. Stewart
47554e1e2f GMT: add conflict for GCC 11 (#26684) 2021-10-12 17:56:17 +00:00
Bernhard Kaindl
862ce517ce gromacs: @2018:2020: add #include <limits> for newer %gcc builds (#26678)
gromacs@2018:2020.6 is fixed to build with gcc@11.2.0
by adding #include <limits> to a few header files.

Thanks to Maciej Wójcik <w8jcik@gmail.com> for testing versions.
2021-10-12 11:39:09 -06:00
Alexander Jaust
50a2316a15 Add missing spack command in basic usage tutorial (#26646)
The `find` command was missing for the examples forcing colorized output. Without this (or another suitable) command, spack produces output that is not using any color. Thus, without the `find` command one does not see any difference between forced colorized and non-colorized output.
2021-10-12 19:23:53 +02:00
Mark W. Krentel
2edfccf61d binutils: fix parallel make for version 2.36 (#26611)
There was a bug in 2.36.* of missing Makefile dependencies.  The
previous workaround was to require 2.36 to be built serially.  This is
now fixed upstream in 2.37 and this PR adds the patch to restore
parallel make to 2.36.
2021-10-12 19:01:46 +02:00
Veselin Dobrev
e2a64bb483 mfem: patch @4.3.0 to support hypre up to v2.23.0 (#26640) 2021-10-12 18:29:56 +02:00
Manuela Kuhn
ff66c237c0 py-niworkflows: add new package (#26639)
* py-niworkflows: add new package

* Update var/spack/repos/builtin/packages/py-niworkflows/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* remove unnecessary comment

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-10-12 11:22:44 -05:00
Manuela Kuhn
f8910c7859 py-nistats: add new package (#26662)
* py-nistats: add new package

* Update var/spack/repos/builtin/packages/py-nistats/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* remove `conflicts`

* remove test dependencies

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-10-12 11:22:00 -05:00
Harmen Stoppels
e168320bb1 spack: Add package (#25979)
* Make python 2 use 'from __future__ import absolute_import' to allow import spack.pkgkit

* Add Spack

* Improve ranges
2021-10-12 11:39:39 -04:00
Sergei Shudler
f58f5f8a0d babelflow, parallelmergetree: add the current versions (#26660) 2021-10-12 17:36:13 +02:00
Vanessasaurus
ce7eebfc1f allowing spack monitor to handle redirect (#26666)
When deployed on kubernetes, the server sends back permanent redirect responses.
This is elegantly handled by the requests library, but not by urllib, which we
have to use here, so it is handled manually by parsing the exception to
get the Location header and then retrying the request there.

Signed-off-by: vsoch <vsoch@users.noreply.github.com>
2021-10-12 17:29:22 +02:00
Jonas Thies
f66ae104bf phist: force MPI compiler wrappers (#26312)
* packages/phist, re #26002: force phist to use MPI compiler wrappers (copied from trilinos package)

* packages/phist re #26002, use cmake-provded FindMPI module only

* packages/phist source code formatting

* packages/phist: set MPI_HOME rather than MPI_BASE_DIR, thanks @sethri.

* phist: delete own FindMPI.cmake for older versions (rather than patching it away)

* packages/phist: remove blank line

* phist: adjust sorting of imports

* phist: change order of imports
2021-10-12 11:11:00 -04:00
Joseph Wang
1e382d6d20 madgraph5amc: Add changes fixing bugs shown by gcc10 (#26643) 2021-10-12 15:55:48 +02:00
Martin Aumüller
6f8f37ffb1 ispc: update development branch name and limit to llvm@12 (#26676)
1.16 and 1.16.1 are not compatible with LLVM 13
2021-10-12 14:31:24 +02:00
Massimiliano Culpo
551120ee0b ASP-based solver: decrease the priority of multi-valued variant optimization for root (#26677)
The ASP-based solver maximizes the number of values in multi-valued
variants (if other higher order constraints are met), to avoid cases
where only a subset of the values that have been specified on the
command line or imposed by another constraint are selected.

Here we swap the priority of this optimization target with the
selection of the default providers, to avoid unexpected results
like the one in #26598
2021-10-12 14:15:48 +02:00
Harmen Stoppels
c2bf585d17 Fix potentially broken shutil.rmtree in tests (#26665)
Seems like https://bugs.python.org/issue29699 is relevant. Better to
just ignore errors when removing the tmpdir. The OS will remove it
anyway.

Errors are happening randomly from tests that are using this fixture.
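A minimal sketch of the pattern (a hypothetical pytest fixture, not the actual Spack fixture):

```python
import shutil
import tempfile

import pytest


@pytest.fixture
def scratch_dir():
    """Yield a temporary directory and clean it up afterwards, ignoring
    removal races such as bpo-29699; the OS reclaims /tmp anyway."""
    path = tempfile.mkdtemp()
    yield path
    shutil.rmtree(path, ignore_errors=True)
```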
2021-10-12 14:01:52 +02:00
Martin Diehl
66b32b337f damask{,-grid,-mesh}: add @3.0.0-alpha5 (#26570) 2021-10-12 13:22:09 +02:00
Maciej Wójcik
4b2cbd3aea gromacs: Add Gromacs 2020.6 and Plumed 2.7.2 (#26663) 2021-10-12 13:01:10 +02:00
Bernhard Kaindl
cb06f91df7 boost: Fix build of 1.53:1.54 with glibc>=2.17 (#26659)
Fix missing declaration of uintptr_t with glibc>=2.17 in 1.53:1.54
See: https://bugs.gentoo.org/482372
2021-10-12 10:59:36 +02:00
Harmen Stoppels
0c0831861c Avoid quadratic complexity in log parser (#26568)
TL;DR: there are matching groups trying to match 1 or more occurrences of
something. We don't use the matching group. Therefore it's sufficient to test
for 1 occurrence. This reduce quadratic complexity to linear time.

---

When parsing logs of an mpich build, I'm getting a 4 minute (!!) wait
with 16 threads for regexes to run:

```
In [1]: %time p.parse("mpich.log")
Wall time: 4min 14s
```

That's really unacceptably slow... 

After some digging, it seems a few regexes tend to have `O(n^2)` scaling
where `n` is the string / log line length. I don't think they *necessarily*
should scale like that, but it seems that way. The common pattern is this

```
([^:]+): error
```

which matches `: error` literally, and then one or more non-colons before that. So
for a log line like this:

```
abcdefghijklmnopqrstuvwxyz: error etc etc
```

Any of these are potential group matches when using `search` in Python:

```
abcdefghijklmnopqrstuvwxyz
 bcdefghijklmnopqrstuvwxyz
  cdefghijklmnopqrstuvwxyz
                         ⋮
                        yz
                         z
```

but clearly the capture group should return the longest match.

My hypothesis is that Python has a very bad implementation of `search`
that somehow considers all of these, even though it can be implemented
in linear time by scanning for `: error` first, and then greedily expanding
the longest possible `[^:]+` match to the left. If Python indeed considers
all possible matches, then with `n` matches of length `1 .. n` you
see the `O(n^2)` slowness (I verified this by replacing `+` with `{1,k}`
and doubling `k`; the execution time indeed doubles).

This PR fixes this by removing the `+`, so effectively changing the 
O(n^2) into a O(n) worst case.

The reason we are fine with dropping `+` is that we don't use the
capture group anywhere, so, we just ensure `:: error` is not a match
but `x: error` is.

After going from O(n^2) to O(n), the 15MB mpich build log is parsed
in `1.288s`, so about 200x faster.

Just to be sure I've also updated `^CMake Error.*:` to `^CMake Error`,
so that it does not match with all the possible `:`'s in the line.
Another option is to use `.*?` there to make it quit scanning as soon as
possible, but a line that starts with `CMake Error` and has no colon is
hardly ever a real false positive anyway...
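A standalone reproduction of the difference (sizes picked only to make the effect visible; this is not Spack's actual parser code):

```python
import re
import time

line = "a" * 20000  # one long log line with no colon and no match

quadratic = re.compile(r"([^:]+): error")  # backtracks at every start position
linear = re.compile(r"[^:]: error")        # one non-colon char is enough: same
                                           # verdict, the capture group is unused

for pattern in (quadratic, linear):
    start = time.perf_counter()
    pattern.search(line)
    print(pattern.pattern, "took", round(time.perf_counter() - start, 4), "s")
```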
2021-10-12 00:05:11 -07:00
Manuela Kuhn
580a20c243 py-templateflow: add 0.4.2 (#26471)
* py-templateflow: add 0.4.2

* py-templateflow: fix python dependency for 0.4.2

* py-templateflow: remove wheel dependency for older versions
2021-10-11 21:37:50 -05:00
Manuela Kuhn
eb486371b2 py-pysurfer: add new package (#26638) 2021-10-11 21:36:43 -05:00
Manuela Kuhn
0da892f2dd py-pandas: fix installation and tests for versions @:0.25 (#26668) 2021-10-12 00:44:32 +00:00
Daniele Cesarini
a36a5ab624 fleur: new package (#26631) 2021-10-12 02:31:23 +02:00
Todd Kordenbrock
af2ecf87d4 Faodel: Update for the v1.2108.1 release (#26516) 2021-10-11 23:17:42 +00:00
Valentin Volkl
a0b9face0d frontier-client: add missing openssl dependency (#26636) 2021-10-12 01:02:12 +02:00
Kevin Pedretti
0b62974160 openblas: fix build on riscv64 (#26565)
OpenBLAS now has support for the riscv64 architecture. This commit
extends the spack openblas package.py to handle building on riscv64.
2021-10-12 00:55:45 +02:00
Alan Sill
833f1de24a bcl2fastq2: add boost@1.55 as default boost dependency (#26655)
When PR #26659 is merged, boost@1.54.0 will be available to build too.

Co-authored-by: Bernhard Kaindl <43588962+bernhardkaindl@users.noreply.github.com>
2021-10-11 22:21:17 +00:00
Bernhard Kaindl
4eb176b070 motif package: fix linking the simple_app demo program (#26574) 2021-10-11 15:00:42 -07:00
Desmond Orton
95e63e0c29 New Package Qualimap:2.2.1 (#26615)
Co-authored-by: Bernhard Kaindl <43588962+bernhardkaindl@users.noreply.github.com>
2021-10-11 19:28:34 +00:00
Derek Ryan Strong
2644a15ff9 GNU parallel package: add version 20210922 (#26591) 2021-10-11 12:08:06 -07:00
Marie Houillon
c7785b74be openCARP packages: add version 8.1 (#26602) 2021-10-11 12:07:05 -07:00
Bernhard Kaindl
260f4ca190 mapserver: Add missing deps: giflib and postgresql (#26619) 2021-10-11 12:04:18 -07:00
Michael Kuhn
d1f3279607 installer: Support showing status information in terminal title (#16259)
When installing packages with a lot of dependencies there is no easy way
of judging the current progress (apart from running `spack spec -I pkg`
in another terminal). This change allows Spack to update the terminal's
title with status information, including its current progress as well as
information about the current and total number of packages.
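The underlying mechanism is the xterm title escape sequence; a minimal sketch (Spack's installer additionally guards this behind a config setting, which is not shown here):

```python
import sys


def set_terminal_title(text):
    # OSC 0 ; <text> BEL sets the window/icon title in xterm-compatible terminals.
    if sys.stdout.isatty():
        sys.stdout.write("\033]0;{0}\007".format(text))
        sys.stdout.flush()


set_terminal_title("spack install: 12/48 packages done (currently: zlib)")
```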
2021-10-11 17:54:59 +02:00
Seth R. Johnson
8f62039d45 llvm: add conflict for newer apple-clang versions (#26633) 2021-10-11 17:49:23 +02:00
Bernhard Kaindl
2e9512fe8b mesa: gallium fails with llvm@13: use 'llvm@6:12', add mesa@21.2.3 (#26627)
The software rasterizer of Mesa's Gallium3D (even @21.2.3) fails to
build with llvm@13.0.0, use: depends_on('llvm@6:12', when='+llvm')
2021-10-11 17:34:33 +02:00
Matthew Archer
e034930775 kahip: update build system to cmake for v3.11, retain scons for older versions (#25645)
* kahip: update to cmake for v3.11, retain scons for older versions

* kahip: update build system to cmake for v3.11, retain SCons for older versions

* address PR comments and add maintainer

* address PR comments - correct version to 2.10, add deprecated and url, and remove scons version
2021-10-11 10:20:43 -05:00
Daniele Cesarini
79aba5942a Update of siesta libs (#26489) 2021-10-11 09:09:39 -05:00
iarspider
78200b6b41 Add new versions of kiwisolver (#26597) 2021-10-11 09:09:01 -05:00
Harmen Stoppels
89220bc0e1 Only install env modifications in <prefix>/.spack (#24081)
- Do not store the full list of environment variables in
  <prefix>/.spack/spack-build-env.txt because it may contain user secrets.

- Only store environment variable modifications upon installation.

- Variables like PATH may still contain user and system paths to make
  spack-build-env.txt sourceable. Variables containing paths are
  modified through prepending/appending, and if we don't apply these
  to the current environment variable, we end up with statements like
  `export PATH=/path/to/spack/bin` with system paths missing, meaning
  no system binaries are in the path, which is a bad user experience.

- Do write the full environment to spack-build-env.txt in the staging dir,
  but ensure it is readonly for the current user, to make it a bit safer
  on shared systems.
2021-10-11 09:07:45 -05:00
Harmen Stoppels
c0c9ab113e Add spack env activate --temp (#25388)
Creates an environment in a temporary directory and activates it, which
is useful for a quick ephemeral environment:

```
$ spack env activate -p --temp
[spack-1a203lyg] $ spack add zlib
==> Adding zlib to environment /tmp/spack-1a203lyg
==> Updating view at /tmp/spack-1a203lyg/.spack-env/view
```
2021-10-11 06:56:03 -04:00
Harmen Stoppels
f28b08bf02 Remove unused --dependencies flag (#25731) 2021-10-11 10:16:11 +02:00
iarspider
ad75f74334 New package: py-cppy (required for py-kiwisolver) (#26645) 2021-10-10 17:14:20 -05:00
Manuela Kuhn
4e1fdfeb18 py-nilearn: add 0.4.2 and 0.6.2 (#26635) 2021-10-09 19:14:13 -05:00
Manuela Kuhn
7f46b9a669 py-pyvistaqt: add new package (#26637) 2021-10-09 21:00:00 +00:00
Michael Kuhn
5deb8a05c4 environment-modules: fix build (#26632)
PR #25904 moved the `--with-tcl` option to only older versions. However,
without this option, the build breaks:
```
checking for Tcl configuration... configure: error: Can't find Tcl configuration definitions. Use --with-tcl to specify a directory containing tclConfig.sh
```
2021-10-09 12:43:54 -05:00
Bernhard Kaindl
9194ca1e2f idna: Add versions 2.9 for sierrapy #23768 and 3.2(current) (#26616)
py-sierrapy was merged accidentally and pins its versions to exact
numbers. Add 2.9 as the version for py-sierrapy and the current 3.2.
2021-10-09 11:31:11 -05:00
Joseph Wang
e6cc92554d py-awkward: add py-yaml as depends (#26626) 2021-10-09 17:35:22 +02:00
Joseph Wang
03ab5dee31 evtgen: fix pythia typo (#26625) 2021-10-09 16:11:58 +02:00
Bernhard Kaindl
93df47f4d5 librsvg: fix build when -pthread is not used for linking (#26592)
librsvg uses pthread_atfork() but does not use -pthread on Ubuntu 18.04 %gcc@8
2021-10-09 13:10:13 +02:00
Derek Ryan Strong
4b5cc8e3bd Add R 4.1.1 (#26589) 2021-10-08 20:02:03 -05:00
Derek Ryan Strong
e89d4b730a Add Julia 1.6.3 (#26624) 2021-10-09 00:58:19 +00:00
Massimiliano Culpo
2386630e10 Remove DB reindex during a read operation (#26601)
The DB should be what is trusted for certain operations.
If it is not present when read we should assume the
corresponding store is empty, rather than trying a
write operation during a read.

* Add a unit test
* Document what needs to be there in tests
2021-10-08 22:35:23 +00:00
Bernhard Kaindl
21ce23816e py-twisted,py-storm: dep on zope.interface, bump storm version (#26610)
* py-twisted,py-storm: dep on zope.interface, bump storm version

py-twisted and py-storm's import tests need zope.interface.
py-storm: use pypi and add version 0.25. It didn't change the requirements.
zope.interface@4.5.0 imports `Feature`, which newer setuptools removed: use setuptools@:45.

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-storm: all deps updated with type=('build', 'run')

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-10-08 21:20:39 +00:00
psakievich
8ebb705871 Trilinos: update for CUDA and Nalu-Wind (#26614) 2021-10-08 21:08:54 +00:00
Bernhard Kaindl
7e8f2e0c11 py-hypothesis: Add variants django, dumpy, pandas and fix import tests (#26604) 2021-10-08 21:00:47 +00:00
Manuela Kuhn
654268b065 py-bcrypt, py-bleach, py-decorator, py-pygdal: fix python dependency (#26596)
* py-bcrypt, py-bleach, py-decorator, py-pygdal: fix python dependency

* Update var/spack/repos/builtin/packages/py-bleach/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-10-08 15:56:22 -05:00
Bernhard Kaindl
6d0d8d021a py-pymatgen: fix build of old versions, bump version to 2021.3.9 (#26249)
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-10-08 19:54:40 +00:00
Bernhard Kaindl
f1f614ee9f py-anytree: Add dep py-six@1.9.0 as required by setup.py (#26603)
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-10-08 19:36:12 +00:00
Manuela Kuhn
88c33686fd py-matplotlib: fix 3.4.3 (#26586)
* py-matplotlib: fix 3.4.3

* Update var/spack/repos/builtin/packages/py-matplotlib/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-10-08 13:52:10 -05:00
Bernhard Kaindl
de86726483 py-traceback2: Fix depends_on: add six and py-linecache2 (#26607) 2021-10-08 13:35:36 -05:00
Bernhard Kaindl
a95c1dd615 py-keras-preprocessing: Add missing deps: six@1.9.0: and numpy@1.9.1: (#26605)
* py-keras-preprocessing: Add missing deps: six@1.9.0: and numpy@1.9.1:

Add deps: pip download --no-binary :all: keras-preprocessing==1.1.2
Collecting numpy>=1.9.1
  Installing build dependencies: started
Collecting six>=1.9.0

* Update var/spack/repos/builtin/packages/py-keras-preprocessing/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-10-08 13:34:25 -05:00
Harmen Stoppels
7d89a95028 Fix leaky spack.binary_distribution.binary_index in tests (#26609)
* Fix issues with leaky binary index across tests

* More rigorous binary_index reset as now other tests are failing :(
2021-10-08 13:41:47 -04:00
Manuela Kuhn
32b35e1a78 py-neurora: add new package (#26479) 2021-10-08 13:39:45 +00:00
Tamara Dahlgren
7f2611a960 Allow Version('') and map it to the empty tuple (#25953) 2021-10-08 10:36:54 +02:00
Rodrigo Ceccato de Freitas
169d0a5649 cling: add missing CMake dependency (#26577) 2021-10-08 09:28:37 +02:00
Adam J. Stewart
f390289629 More strict ReadTheDocs tests (#26580) 2021-10-08 09:27:17 +02:00
Daniel G Travieso
10de12c7d0 add hash field to spec on find --json and assert in test its there (#26443)
Co-authored-by: Daniel Travieso <daniel@dgtravieso.com>
2021-10-07 23:50:05 -07:00
Bernhard Kaindl
449a5832c8 llvm: llvm@13+libcxx needs a very recent C++ compiler (#26584)
libc++-13 does not support %gcc@:10, see:
https://bugs.llvm.org/show_bug.cgi?id=51359#c1

https://libcxx.llvm.org/#platform-and-compiler-support says:
- GCC    11     - latest stable release per GCC’s release page
- Clang: 11, 12 - latest two stable releases per LLVM’s release page
- AppleClang 12 - latest stable release per Xcode’s release page
2021-10-08 00:25:51 +00:00
Pedro Demarchi Gomes
28529f9eaf re2 pic support (#26513) 2021-10-08 01:01:40 +02:00
Oliver Perks
da31c7e894 Updatepackage/minigmg (#26467)
* MiniGMG, add support for optimised flags + SIMDe implementation of AVX intrinsics

* Add .gitlab-ci.yml

* NVHPC fast

* remove CI

* Syntax fix
2021-10-07 14:49:14 -05:00
Paul Ferrell
0b9914e2f5 Fix for license symlinking issue. (#26576)
When a symlink to a license file exists but is broken, the license symlink post-install hook fails
because os.path.exists() checks the existence of the target, not the symlink itself.
os.path.lexists() is the proper function to use.
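The distinction in a standalone snippet:

```python
import os

os.symlink("/does/not/exist", "license.lnk")  # create a broken symlink

print(os.path.exists("license.lnk"))   # False: follows the link to the missing target
print(os.path.lexists("license.lnk"))  # True: the symlink itself is present
```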
2021-10-07 19:18:35 +00:00
Seth R. Johnson
9853fd50e2 itk: use CMakePackage helpers (#26572) 2021-10-07 10:49:51 -05:00
Scott Wittenburg
0561af1975 Pipelines: retry service job on system errors (#26508)
Retry rebuild-index, cleanup, and no-op jobs automatically if they fail
due to infrastructure-related problems.
2021-10-07 08:59:51 -06:00
Kevin Huck
b33a0923e1 apex: support profiling/tracing HIP applications (#26569)
libz is added for compressing google trace events output.
2021-10-07 16:22:40 +02:00
Harmen Stoppels
05834e7c9d Memoize the result of spack.platforms.host() (#26573) 2021-10-07 14:04:05 +02:00
Olivier Cessenat
20ee786d09 visit: add an external find function (determine_version) (#25817)
* visit: add an external find function (determine_version)

* visit: correct too long comment line

* visit: forgot to set executables

* visit: external find uses single-dash version

* visit: found as external asking visit version
2021-10-07 10:19:58 +00:00
Manuela Kuhn
653d5e1590 py-mayavi: add 4.7.3 (#26566) 2021-10-06 19:15:34 -05:00
Tyler Funnell
60909f7250 fish: adding version 3.3.1 (#26488)
* fish: adding version 3.3.1

* adding maintainer

* Update var/spack/repos/builtin/packages/fish/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-10-06 20:42:45 +00:00
Manuela Kuhn
b787065bdb py-scikit-image: add 0.18.3 and fix dependencies (#26406) 2021-10-06 15:30:27 -05:00
Jen Herting
3945e2ad93 New package: py-clean-text (#26511)
* [py-clean-text] created template

* [py-clean-text]

- added description
- added dependencies
- removed fixmes
2021-10-06 19:44:33 +00:00
Tamara Dahlgren
affd2236e6 Provide more info in SbangPathError to aid CI debugging (#26316) 2021-10-06 21:03:33 +02:00
Kevin Pedretti
cb5b28392e Patch from upstream needed to build numactl on riscv64. (#26541)
The most recent release of numactl (2.0.14) fails to build on riscv64
because of a missing "-latomic". This patch from upstream resolves this
issue. It can be dropped once the next version of numactl is released.
2021-10-06 11:50:21 -07:00
Rodrigo Ceccato de Freitas
b1f7b39a06 ucx: fix typo in config description (#26564) 2021-10-06 18:44:22 +00:00
Manuela Kuhn
770c4fd0b2 py-nipype: add 1.4.2 (#26472) 2021-10-06 20:35:42 +02:00
Massimiliano Culpo
98ee00b977 Restore the correct computation of stores in environments (#26560)
Environments push/pop scopes upon activation. If some lazily
evaluated value depending on the current configuration was
computed and cached before the scopes are pushed / popped
there will be an inconsistency in the current state.

This PR fixes the issue for stores, but it would be better
to move away from global state.
2021-10-06 11:32:26 -07:00
Jose E. Roman
fdf76ecb51 New release slepc4py 3.16.0 (#26468) 2021-10-06 11:18:53 -07:00
Manuela Kuhn
9a75f44846 py-svgutils: add 0.3.1 (#26470) 2021-10-06 11:15:36 -07:00
haralmha
6ce4d26c25 Add 1.4.2 (#26475) 2021-10-06 11:13:55 -07:00
haralmha
8fc04984ba Add 1.9.3 (#26483) 2021-10-06 11:10:58 -07:00
haralmha
cfde432b19 Add 1.1.1 (#26484) 2021-10-06 11:10:05 -07:00
haralmha
ae9428df84 Add 14.16.1 (#26485) 2021-10-06 11:09:06 -07:00
haralmha
533705f4e3 Add 1.0.2 (#26486) 2021-10-06 10:50:28 -07:00
haralmha
2b6d7cc74c Add 4.2.0 (#26498) 2021-10-06 10:35:04 -07:00
g-mathias
9978353057 New package: intel oneapi advisor (#26535)
Co-authored-by: lu64bag3 <gerald.mathias@lrz.de>
2021-10-06 10:28:55 -07:00
Massimiliano Culpo
319ae9254e Remove the spack.architecture module (#25986)
The `spack.architecture` module contains an `Arch` class that is very similar to `spack.spec.ArchSpec` but points to platform, operating system and target objects rather than "names". There's a TODO in the class since 2016:

abb0f6e27c/lib/spack/spack/architecture.py (L70-L75)

and this PR basically addresses that. Since there are just a few places where the `Arch` class was used, here we query the relevant platform objects where they are needed directly from `spack.platforms`. This permits cleaning the code of vestigial logic.

Modifications:
- [x] Remove the `spack.architecture` module and replace its use by `spack.platforms`
- [x] Remove unneeded tests
2021-10-06 10:28:12 -07:00
g-mathias
232086c6af new package: intel oneapi inspector (#26549)
Co-authored-by: lu64bag3 <gerald.mathias@lrz.de>
2021-10-06 10:25:18 -07:00
Danny McClanahan
e2e8e5c1e2 use conflicts() instead of referring to SpecError in mesa (#26562)
* mesa: use conflicts() instead of referring to SpecError
2021-10-06 10:07:42 -07:00
Paul R. C. Kent
fefe624141 llvm: new version 13.0.0 (#26563) 2021-10-06 16:53:49 +00:00
Alec Scott
4732c722e7 Add v1.56.2 to Rclone (#26556) 2021-10-06 09:53:34 -07:00
Alec Scott
97407795f6 Add v5.3.0 to Superlu (#26558) 2021-10-06 09:50:12 -07:00
haralmha
fe723b2ad8 py-jupyter-client: add 6.1.12 (#26503) 2021-10-06 09:21:10 -04:00
haralmha
d27bb7984a py-ptyprocess: add 0.7.0 (#26504) 2021-10-06 09:20:30 -04:00
haralmha
a59d6be99b py-setuptools: add 57.1.0 (#26505) 2021-10-06 09:19:49 -04:00
haralmha
ceb962fe17 py-six: add 1.16.0 (#26506) 2021-10-06 09:18:55 -04:00
Manuela Kuhn
11bdd18b1f py-matplotlib: fix qhull dependency (#26553) 2021-10-06 09:18:11 -04:00
Jen Herting
b92fd08028 py-ftfy: added version 6.0.3 (#26509) 2021-10-06 08:51:15 -04:00
Jen Herting
bc152fc6c4 New package: py-emoji (#26510)
* [py-emoji] created template

* [py-emoji]

- removed fixmes
- added homepage
- added description
- added dependencies
2021-10-06 08:50:24 -04:00
Frédéric Simonis
51f77f8af1 precice: add version 2.3.0 (#26551) 2021-10-06 08:44:59 -04:00
haralmha
8366f87cbf gnuplot: add version 5.4.2 (#26529) 2021-10-06 08:41:49 -04:00
haralmha
ab5962e997 hadoop: add version 2.7.5 (#26530) 2021-10-06 08:41:28 -04:00
haralmha
943f1e7a3a Add 7.6.3 (#26502) 2021-10-06 14:04:48 +02:00
Ivan Maidanski
1d258151f3 bdw-gc: add v8.0.6 (#26545) 2021-10-06 07:24:51 -04:00
haralmha
2cdfa33677 Add 6.0.2 (#26501) 2021-10-06 13:19:29 +02:00
haralmha
199bd9f8f8 py-decorator: Add version 5.0.9 and 5.10. (#26500)
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2021-10-06 02:24:47 +00:00
haralmha
0c02f6c36d py-hepdata-validator: Add version 0.2.3 and 0.3.0 (#26496)
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2021-10-06 02:15:06 +00:00
haralmha
c97c135f1d py-heapdict: Add version 1.0.0 (#26495)
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2021-10-06 01:16:18 +00:00
haralmha
d07c366408 Add 1.28.1 (#26494)
py-grpcio: Add version 1.28.1
2021-10-06 02:27:55 +02:00
haralmha
df93becc99 Add 2021.4.1 (#26493)
py-distributed: Add version 2021.4.1
2021-10-06 02:26:05 +02:00
haralmha
5e9497e05a Add 0.7.1 (#26492)
py-defusedxml: add version 0.7.1
2021-10-06 02:25:04 +02:00
haralmha
2e28281f1a py-bleach: Add version 3.3.1 (#26490)
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2021-10-06 00:18:09 +00:00
haralmha
e8b9074300 py-dask: add version 2021.4.1 (#26491) 2021-10-06 02:17:52 +02:00
haralmha
eafa0ff085 py-pygdal: Add versions 3.3.0.10 and 3.3.2.10 (#26528)
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2021-10-05 23:57:59 +00:00
haralmha
b20e7317ab py-backcall: Add version 0.2.0 (#26487) 2021-10-06 01:55:27 +02:00
haralmha
2773178897 Add versions 3.1.6, 3.1.7 and 3.2.0 (#26527) 2021-10-06 01:48:18 +02:00
Bernhard Kaindl
e39866f443 h5utils: Fix build and add new version (#26133)
@1.12.1+png depends_on libpng@:1.5.0 because it needs the old API

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2021-10-05 23:13:02 +00:00
Massimiliano Culpo
d7fc17d5db Set explicitly write permission for packages (#26539) 2021-10-05 23:12:25 +00:00
Olivier Cessenat
cc6508ec09 vtk: gui support for vtk 9 added (#25636) 2021-10-05 22:54:39 +00:00
haralmha
3157a97743 Add version 3.16.1 (#26534) 2021-10-06 00:27:17 +02:00
haralmha
aac7e28ea0 dd4hep: Add version 01-18 (#26525)
* Add 01-18

* Keep master on top and change version name format
2021-10-05 17:37:31 -04:00
Kevin Pedretti
47607dcac5 Use gnuconfig package for config file replacement for RISC-V. (#26364)
* Use gnuconfig package for config file replacement for RISC-V.

This extends the changes in #26035 to handle RISC-V. Before this change,
many packages fail to configure on riscv64 due to config.guess being too
old to know about RISC-V. This is seen out of the box when clingo fails
to build from source due to pkgconfig failing to configure, throwing
error: "configure: error: cannot guess build type; you must specify one".

* Add riscv64 architecture

* Update vendored archspec from upstream project.
These archspec updates include changes needed to support riscv64.

* Update archspec's __init__.py to reflect the commit hash of archspec being used.
2021-10-05 19:22:55 +00:00
Harmen Stoppels
d998ea1bd4 Move shell aware env into spack.environment.shell (#25608)
Cherry-picked from #25564 so this is standalone.

With this PR we can activate an environment in Spack itself, without computing changes to environment variables only necessary for "shell aware" env activation.

1. Activating an environment:
    
    ```python
    spack.environment.activate(Environment(xyz)) -> None
    ```
    this basically just sets `_active_environment` and modifies some config scopes.

2. Activating an environment **and** getting environment variable modifications for the shell:

    ```python
    spack.environment.shell.activate(Environment(xyz)) -> EnvironmentModifications
    ```

This should make it easier/faster to do unit tests and scripting with spack, without the cli interface.
2021-10-05 18:25:43 +00:00
haralmha
713bbdbe7c postgresql: Add versions 14.0 and 12.2 (#26499) 2021-10-05 13:34:46 +02:00
haralmha
789beaffb7 doxygen: Add versions 1.9.2 and 1.8.18 (#26497) 2021-10-05 12:49:48 +02:00
Ricardo Jesus
4ee74c01e3 meme: Fix compilation with arm and nvhpc compilers (#24883)
* Fix compilation with `arm` and `nvhpc` compilers

* Update package.py
2021-10-05 05:55:12 -04:00
Jonas Thies
7f2fd50d20 phist: add a patch for the case +host arch=ppc64le (#26216) 2021-10-05 08:33:41 +00:00
Guilherme Perrotta
7ea9c75494 py-flatbuffers: add new package (#26444)
Python port of the "flatbuffers" package
2021-10-05 10:10:43 +02:00
Ivan Maidanski
1a651f1dca libatomic_ops: add v7.6.12 (#26512) 2021-10-05 10:08:57 +02:00
Manuela Kuhn
4e6f2ede0a py-rsatoolbox: add new package (#26514) 2021-10-05 10:07:03 +02:00
Axel Huebl
e1fb77496c WarpX: 21.10 (#26507) 2021-10-05 10:01:26 +02:00
David Gardner
85de527668 sundials: Add 5.8.0 and sycl variant (#26524) 2021-10-05 10:00:11 +02:00
Heiko Bauke
f6a4f6bda7 mpl: add new package (#26522) 2021-10-05 09:59:14 +02:00
Mihael Hategan
95846ad443 py-parsl: new package (see https://parsl-project.org/) (#26360) 2021-10-05 09:53:02 +02:00
Massimiliano Culpo
337b54fab0 Isolate bootstrap configuration from user configuration (#26071)
* Isolate bootstrap configuration from user configuration

* Search for build dependencies automatically if bootstrapping from sources

The bootstrapping logic will search for build dependencies
automatically if bootstrapping anything from sources. Any
external spec, if found, is written in a scope that is specific
to bootstrapping.

* Don't clean the bootstrap store with "spack clean -a"

* Copy bootstrap.yaml and config.yaml in the bootstrap area
2021-10-05 09:16:09 +02:00
Seth R. Johnson
3cf426df99 zstd: fix install name on macOS (#26518)
Reverting from CMake to Make install caused
`-install_path=/usr/local/lib/libzstd.1.dylib` to be hardcoded into the
zstd library. Now we explicitly pass the PREFIX into the build command so that
the correct spack install path is saved.

Fixes #26438 and also the ROOT install issue I had :)
2021-10-05 02:03:48 +00:00
Nisarg Patel
b5673d70de molden: fix build with gcc@10: (#25803) 2021-10-05 01:36:20 +00:00
Todd Gamblin
84c878b66a cc: make error messages more clear
- [x] Our wrapper error messages are sometimes hard to differentiate from other build
      output, so prefix all errors from `die()` with '[spack cc] ERROR:'

- [x] The error we raise when running, say, `fc` without a Fortran compiler was not
      clear enough. Clarify the message and the comment.
2021-10-04 18:30:19 -07:00
Todd Gamblin
052b2e1b08 cc: convert compiler wrapper to posix shell
This converts everything in cc to POSIX sh, except for the parts currently
handled with bash arrays. Tests are still passing.

This version tries to be as straightforward as possible. Specifically, most conversions
are kept simple -- convert ifs to ifs, handle indirect expansion the way we do in
`setup-env.sh`, only mess with the logic in `cc`, and don't mess with the python code at
all.

The big refactor is for arrays. We can't rely on bash's nice arrays and be ignorant of
separators anymore. So:

1. To avoid complicated separator logic, there are three types of lists. They are:

    * `$lsep`-separated lists, which end with `_list`. `lsep` is customizable, but we
      picked `^G` (alarm bell) for `$lsep` because it's ASCII and it's unlikely that it
      would actually appear in any arguments. If we need to get fancier (and I will lose
      faith in the world if we do) then we could consider XON or XOFF.
    * `:`-separated directory lists, which end with `_dirs`, `_DIRS`, `PATH`, or `PATHS`
    * Whitespace-separated lists (like flags), which can have any other name.

    Whitespace and colon-separated lists come with the territory with PATHs from env
    vars and lists of flags. `^G` separated lists are what we use for most internal
    variables, b/c it's more likely to work.

2. To avoid subshells, use a bunch of functions that do dirty `eval` stuff instead. This
   adds 3 functions to deal with lists:

    * `append LISTNAME ELEMENT [SEP]` will put `ELEMENT` at the end of the list called
      `LISTNAME`. You can optionally say what separator you expect to use. Note that we
      are taking advantage of everything being global and passing lists by name.

    * `prepend LISTNAME ELEMENT [SEP]` like append, but puts `ELEMENT` at the start of
      `LISTNAME`

    * `extend LISTNAME1 LISTNAME2 [PREFIX]` appends everything in LISTNAME2 to
       LISTNAME1, and optionally prepends `PREFIX` to every element (this is useful for
       things like `-I`, `-isystem `, etc.)

    * `preextend LISTNAME1 LISTNAME2 [PREFIX]` prepends everything in LISTNAME2 to
       LISTNAME1 in order, and optionally prepends `PREFIX` to every element.

The routines determine the separator for each argument by its name, so we don't have to
pass around separators everywhere. Amazingly, as long as you do not expand variables'
values within an `eval` environment, you can do all this and still preserve quoting.
When iterating over lists, the user of this API still has to set and unset `IFS`
properly.

We ended up having to ignore shellcheck SC2034 (unused variable), because using evals
all over the place means that shellcheck doesn't notice that our list variables are
actually used.

So far this is looking pretty good. I took the most complex unit test I could find
(which runs a sample link line) and ran the same command line 200 times in a shell
script.  Times are roughly as follows:

For this invocation:

```console
$ bash -c 'time (for i in `seq 1 200`; do ~/test_cc.sh > /dev/null; done)'
```

I get the following performance numbers (the listed shells are what I put in `cc`'s
shebang):

**Original**
* Old version of `cc` with arrays and `bash v3.2.57` (macOS builtin): `4.462s` (`.022s` / call)
* Old version of `cc` with arrays and `bash v5.1.8` (Homebrew): `3.267s` (`.016s` / call)

**Using many subshells (#26408)**
*  with `bash v3.2.57`: `25.302s` (`.127s` / call)
*  with `bash v5.1.8`: `27.801s` (`.139s` / call)
*  with `dash`: `15.302s` (`.077s` / call)

This version didn't seem to work with zsh.

**This PR (no subshells)**
*  with `bash v3.2.57`: `4.973s` (`.025s` / call)
*  with `bash v5.1.8`: `4.984s` (`.025s` / call)
*  with `zsh`: `2.995s` (`.015s` / call)
*  with `dash`: `1.890s` (`.0095s` / call)

Dash, with the new posix design, is easily the winner.

So there are several interesting things to note here:

1. Running the posix version in `bash` is slower than using `bash` arrays. That is to be
   expected because it's doing a bunch of string processing where it likely did not have
   to before, at least in `bash`.

2. `zsh`, at least on macOS, is significantly faster than the ancient `bash` they ship
   with the system. Using `zsh` with the new version also makes the posix wrappers
   faster than `develop`. So it's worth preferring `zsh` if we have it. I suppose we
   should also try this with newer `bash` on Linux.

3. `bash v5.1.8` seems to be significantly faster than the old system `bash v3.2.57` for
   arrays. For straight POSIX stuff, it's a little slower. It did not seem to matter
   whether `--posix` was used.

4. `dash` is way faster than `bash` or `zsh`, so the real payoff just comes from being
   able to use it. I am not sure if that is mostly startup time, but it's significant.
   `dash` is ~2.4x faster than the original `bash` with arrays.

So, doing a lot of string stuff is slower than arrays, but converting to posix seems
worth it to be able to exploit `dash`.

- [x] Convert all but array-related portions to sh
- [x] Fix basic shellcheck issues.
- [x] Convert arrays to use a few convenience functions: `append` and `extend`
- [x] Get `cc` tests passing.
- [x] Add `cc` tests where needed.
- [x] Benchmarking.

Co-authored-by: Tom Scogland <scogland1@llnl.gov>
Co-authored-by: Danny McClanahan <1305167+cosmicexplorer@users.noreply.github.com>
2021-10-04 18:30:19 -07:00
Todd Gamblin
472638f025 .gitignore needs to be below env and ENV for case-insensitive FS 2021-10-04 18:30:19 -07:00
haralmha
8d0025b8af Add 5.2.0 (#26481) 2021-10-04 19:37:34 -05:00
Martin Pokorny
32f8dad0e2 log4cxx: new version and fix for c++11 (#26480)
* Add version 0.12.1

* Add variant to build with C++11 standard

Building with the C++11 standard requires boost threads and needs an explicit setting of
CMAKE_CXX_STANDARD
2021-10-05 00:34:03 +00:00
KoyamaSohei
c426386f46 intel-tbb: install pkgconfig file (#23977)
* intel-tbb: install pkgconfig file

* intel-tbb: install pkgconfig file when @:2021.2.0

* intel-tbb: add blank line

* intel-tbb: fix library name to refer

* intel-tbb: fix library name to refer again

* intel-tbb: use self.prefix.lib.pkgconfig
2021-10-04 21:38:31 +00:00
Tamara Dahlgren
5a9e5ddb3d Stand-alone tests: distinguish NO-TESTS from PASSED (#25880)
Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>
2021-10-04 19:57:08 +00:00
snehring
675bdada49 esys-particle: add new package esys-particle (#25338)
* esys-particle: add new package esys-particle

* esys-particle: requested changes

* esys-particle: placating isort
2021-10-04 19:41:31 +00:00
Daniel Ahlin
c59223effe gcc: apply nvcc patch to more versions (#24295)
Apply to all known affected versions (10.1.0 to 10.3.0 and 11.1.0)
2021-10-04 21:02:57 +02:00
haralmha
6a6bc3ecbb git: new version 2.29.2 (#26478) 2021-10-04 18:23:38 +00:00
iarspider
b88c80bb1b Add new pythia8 variants and fix url_for_version (#26280)
* Add new pythia8 variants
Add proper description to the existing ones

* Flake-8

* Flake-8

* Style

* Fix Pythia8 recipe

* Fix typo in variant description

* Update package.py
2021-10-04 12:52:51 -05:00
haralmha
b9e8d2d648 cppgsl: new version 3.1.0 (#26477) 2021-10-04 13:25:44 -04:00
Manuela Kuhn
377c781d7e py-scikit-learn: add 0.19.2 and 0.22.2.post1 (#26473) 2021-10-04 18:48:26 +02:00
snehring
9a8712f4cb masurca: Fix build with glibc-2.32+ (#26173)
remove sysctl.h which is not used by the build
2021-10-04 17:59:01 +02:00
Pedro Demarchi Gomes
90fa50d9df Avoid replacing symlinked spack.yaml when concretizing an environment (#26428) 2021-10-04 14:59:03 +00:00
Pedro Demarchi Gomes
4ae71b0297 use umpire version 5.0.1:5 (#26451) 2021-10-04 16:32:47 +02:00
Vicente Bolea
572791006d paraview: add v5.10.0-RC1 release (#26291) 2021-10-04 09:32:03 -04:00
mcuma
fa528c96e6 octave: add support for MKL (#25952) 2021-10-04 14:57:57 +02:00
iarspider
4f1c195bf9 grpc: update recipe (#25261)
1) forward `+shared` to re2
2) add `cxxstd` variant
3) use "more idiomatic" way of setting CMake options
2021-10-04 14:53:17 +02:00
Andreas Baumbach
35dd474473 Improve an error message in stage.py (#23737) 2021-10-04 14:47:14 +02:00
Tiziano Müller
2ca58e91d6 cp2k: bugfix for intel oneapi (#26143) 2021-10-04 14:41:49 +02:00
Oliver Perks
ad98e88173 arm: added download and better detection (#25460) 2021-10-04 14:35:20 +02:00
Harmen Stoppels
e557730c96 gnupg: Add version 1 (#23798)
From the gnupg.org website:

> GnuPG 1.4 is the old, single binary version which still support the
> unsafe PGP-2 keys. However, it lacks many modern features and will
> receive only important updates.

I'm starting to appreciate gpg1 more, because it is relocatable (gpg2
has hard-coded paths to gpg-agent and other tools) and it does not
require gpg-agent at all.
2021-10-04 08:29:41 -04:00
iarspider
50b1b654c9 evtgen: fix mac build and version 2.0.0 with pythia >= 8.304. (#25933) 2021-10-04 14:06:49 +02:00
Manuela Kuhn
e9accaaca1 py-duecredit: add v0.6.5 (#26465) 2021-10-04 14:01:12 +02:00
Manuela Kuhn
68648b3ff5 py-nibabel: add v2.4.1 (#26462) 2021-10-04 14:00:49 +02:00
Valentin Volkl
e4d5bcf916 podio: remove build_type variant (provided by base package class already) (#26463) 2021-10-04 14:00:20 +02:00
Bernhard Kaindl
3fff338844 freebayes: Fix running the testsuite and add pkgconfig (#26440)
freebayes needs the tools of the vcflib it includes to run tests
2021-10-04 14:00:08 +02:00
Nikolay Simakov
225927c1c3 graph500: added option -fcommon for gcc@10.2 (#25367)
* graph500: added option -fcommon for gcc@10.2:, otherwise failed to build with "multiple definition of `column'"

* graph500: moved setting cflag to flag_handler
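A minimal sketch of the flag_handler approach on the package class, assuming Spack's standard hook; the flag and version range mirror the description above but are not necessarily the exact package code:

```py
def flag_handler(self, name, flags):
    # Inject -fcommon only into the C flags and only for the affected GCC versions.
    if name == 'cflags' and self.spec.satisfies('%gcc@10.2:'):
        flags.append('-fcommon')
    # (compiler-wrapper flags, environment flags, build-system flags)
    return (flags, None, None)
```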
2021-10-04 13:42:33 +02:00
iarspider
bf9cf87d9b libunwind: add variants (#26099) 2021-10-04 13:04:39 +02:00
Manuela Kuhn
f1839c6aae py-pybids: add v0.9.5 (#26461) 2021-10-04 10:35:06 +00:00
Harmen Stoppels
d84d7f0599 Disable tests on hip for cmake 3.21 (#26460) 2021-10-04 10:20:07 +00:00
Manuela Kuhn
e2ee3066cf py-mne: add new package (#26446) 2021-10-04 09:56:58 +00:00
Manuela Kuhn
1f451924b1 py-sphinxcontrib-napoleon: add new package (#26457) 2021-10-04 09:52:56 +00:00
Manuela Kuhn
8bcbead42b py-templateflow: add new package (#26458) 2021-10-04 09:47:37 +00:00
Dylan Simon
9926799aeb Fix NonVirtualInHierarchyError message format (#26008) 2021-10-04 10:38:09 +02:00
Seth R. Johnson
205b414162 hdf5: allow implicit functions on apple-clang (#26409)
Work around issues in older hdf5 build and overzealous build flags:
```
 >> 1420    /var/folders/j4/fznvdyhx4875h6fhkqjn2kdr4jvyqd/T/9te/spack-stage/spack-stage-hdf5-1.10.4-feyl6tz6hpx5kl7m33avpuacwje2ubul/spack-src/src/H5Odeprec.c:141:8: error: implicit decl
             aration of function 'H5CX_set_apl' is invalid in C99 [-Werror,-Wimplicit-function-declaration]
```
2021-10-04 10:19:56 +02:00
Seth R. Johnson
5d431087ab python: correctly disable ~tkinter when @3.8 (#26365)
The older patch does not apply so the build ends up failing:
```
     1539    In file included from /private/var/folders/fy/x2xtwh1n7fn0_0q2kk29xkv9vvmbqb/T/s3j/spack-stage/spack-stage-python-3.8.11
             -6jyb6sxztfs6fw26xdbc3ktmbtut3ypr/spack-src/Modules/_tkinter.c:48:
  >> 1540    /Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX.sdk/usr/include/tk.h:86:11: f
             atal error: 'X11/Xlib.h' file not found
     1541    #       include <X11/Xlib.h>
     1542                    ^~~~~~~~~~~~
     1543    1 error generated.
```
2021-10-04 10:18:23 +02:00
iarspider
7104b8599e Update byacc (#25837)
Co-authored-by: Bernhard Kaindl <43588962+bernhardkaindl@users.noreply.github.com>
2021-10-04 09:55:20 +02:00
G-Ragghianti
7009724f0d papi: new variants rocm and rocm_smi (#26214) 2021-10-04 09:53:17 +02:00
Manuela Kuhn
81abbb41bc git-annex: add 8.20210804 for amd64 (#26449) 2021-10-04 09:50:05 +02:00
Manuela Kuhn
d0da7c8440 py-pockets: add new package (#26452) 2021-10-04 09:43:10 +02:00
Massimiliano Culpo
69abc4d860 Build ppc64le docker images (#26442)
* Update archspec
* Add ppc64le to docker images
2021-10-04 09:34:53 +02:00
Massimiliano Culpo
e91815de7c Update Spec.full_hash docstring (#26456)
The docstring is outdated since #21735
when the build hash has been included
in the full hash.
2021-10-04 07:33:22 +00:00
Garth N. Wells
baa50c6679 FEniCSx: fix CMake root directory and dependency versions (#26445) 2021-10-04 09:20:53 +02:00
Manuela Kuhn
44d7218038 py-seaborn: add v0.11.2 (#26447) 2021-10-04 09:17:38 +02:00
Manuela Kuhn
9e0aabdef3 py-transforms3d: add new package (#26448) 2021-10-04 09:17:04 +02:00
Joseph Wang
91598912f7 ocaml: add v4.13.1 (#26453)
This is needed in order to make it work with glibc-2.34
2021-10-04 08:47:30 +02:00
Daniel Arndt
862bcc5059 ArborX: Explicitly set path to Kokkos (#26347)
* Explicitly set path to Kokkos for ArborX testing

* Improve formatting

* Update var/spack/repos/builtin/packages/arborx/package.py

Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>

* Remove blank line

Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>
2021-10-03 23:04:52 -04:00
Hector
2f6590ddd1 source-highlight: add gcc11 patch (#26412) 2021-10-03 16:07:39 +00:00
Lorién López Villellas
c0b383b3e1 PICSAR: added support for GCC >10.0 and arm compiler (#24927)
* -fallow-argument-mismatch flag added when compiling with GCC to avoid a compilation error when using a GCC version > 10.0.

Co-authored-by: Haz99 <jsalamerosanz@gmail.com>

* Filtered every occurrence of "!$OMP SIMD SAFELEN(LVEC2)" when compiling with nvhpc to avoid a compilation error.

Co-authored-by: Haz99 <jsalamerosanz@gmail.com>

* Line with more than 80 characters split into multiple lines.

Co-authored-by: Haz99 <jsalamerosanz@gmail.com>
2021-10-03 10:11:04 -04:00
Jordan Galby
0d6a2381b2 Fix JSONDecodeError when using compiler modules (#25624)
When using modules for compiler (and/or external package), if a
package's `setup_[dependent_]build_environment` sets `PYTHONHOME`, it
can influence the python subprocess executed to gather module
information.

The error seen was:

```
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
```

But the actual hidden error happened in the `python -c 'import
json...'` subprocess, which made it return an empty string as json:

```
ModuleNotFoundError: No module named 'encodings'
```

This fix uses `python -E` to ignore `PYTHONHOME` and
`PYTHONPATH`. This should be safe here because the python subprocess code
only uses modules built into Python.

The python subprocess in `environment.py` was also patched to be safe
and consistent.
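A rough sketch of the pattern as a hypothetical helper around the subprocess call (the real code lives in Spack's module-parsing and `environment.py` paths):

```py
import json
import subprocess

def query_python(python_exe, code):
    # -E makes the interpreter ignore PYTHONHOME and PYTHONPATH, so values
    # injected by compiler/external-package modules cannot break the child
    # interpreter and corrupt the JSON it prints.
    out = subprocess.check_output([python_exe, '-E', '-c', code])
    return json.loads(out)
```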
2021-10-03 16:10:33 +02:00
Harmen Stoppels
b9e72557e8 Remove .99 from version ranges (#26422)
In most cases, .99 can be omitted thanks to #26402.
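For example (hypothetical package name), assuming Spack's inclusive range semantics after #26402, the two spellings below select the same versions:

```py
depends_on('example@1.2.0:1.2.99')  # old style: .99 as an artificial upper bound
depends_on('example@1.2.0:1.2')     # new style: the shorter, non-empty range is enough
```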
2021-10-03 09:09:02 -04:00
Tim Moon
2de116d285 DiHydrogen: Specify required NVSHMEM variants (#26384) 2021-10-03 08:30:27 -04:00
Bernhard Kaindl
f6ce95bfcb chombo: several build fixes: csh, space regex and install (#26424)
See the description of the PR for details on the changes.
2021-10-03 14:28:28 +02:00
Harmen Stoppels
e53bd0eba8 bison: new versions 3.8:3.8.2 (#26439) 2021-10-03 07:57:03 -04:00
Takahiro Ueda
d603f4519d form: use version tarballs, fix and enable gmp and zlib variants (#26425)
Co-authored-by: Bernhard Kaindl <43588962+bernhardkaindl@users.noreply.github.com>
Co-authored-by: iarspider <iarspider@gmail.com>
2021-10-03 08:45:33 +00:00
Michael Kuhn
d4b3724557 meson: add 0.59.2 (#26430) 2021-10-03 10:41:08 +02:00
Michael Kuhn
abce326ea3 openblas: add 0.3.18 (#26429) 2021-10-03 10:38:48 +02:00
Manuela Kuhn
4c1285b1c2 py-svgutils: new package (#26437) 2021-10-03 02:38:36 +02:00
Marc Fehling
34a1e4d8b6 p4est: bump/update to 2.8 (#26081) 2021-10-03 01:56:03 +02:00
Manuela Kuhn
ee568cb6a4 py-packaging: bump version to 21.0 (#26436) 2021-10-02 23:31:24 +00:00
Manuela Kuhn
0db072dc4a py-attrs: bump version to 21.2.0 (#26435) 2021-10-02 23:23:02 +00:00
Manuela Kuhn
6a188b2b13 py-nitransforms: new package (#26434) 2021-10-03 01:06:13 +02:00
Wouter Deconinck
846428a661 podio: add supported CMAKE_BUILD_TYPE values (#26432) 2021-10-03 00:38:36 +02:00
Manuela Kuhn
aee1d44edf py-pybids: bump version to 0.13.2 (#26433) 2021-10-03 00:36:02 +02:00
Sreenivasa Murthy Kolam
f1c83b00d4 atmi: replace /usr/bin/rsync with depends_on rsync (#26353) 2021-10-03 00:32:46 +02:00
Manuela Kuhn
8dcd568769 py-ipyvtk-simple: new package (#26431) 2021-10-02 23:18:08 +02:00
Josh Bowden
aaf35fd520 damaris: bump version to 1.5.0, add variant examples (#26220) 2021-10-02 23:06:17 +02:00
Manuela Kuhn
bc6edc7baf py-ipycanvas: new package (#26426) 2021-10-02 22:30:09 +02:00
Olivier Cessenat
2ce3be3a27 fltk: new version 1.3.7, new variant xft (#26423) 2021-10-02 16:27:38 -04:00
Manuela Kuhn
b65360f721 py-ipyevents: add new package (#26427) 2021-10-02 22:23:09 +02:00
Robert Romero
57803887e8 cddlib: use github URLs and bump version (#26315)
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2021-10-02 19:50:32 +00:00
Weiqun Zhang
43e08af7b2 amrex: bump version to 21.10 (#26416) 2021-10-02 18:25:50 +02:00
Massimiliano Culpo
0e469fc01e clingo-bootstrap: add a variant for static libstdc++ (#26421) 2021-10-02 14:53:24 +02:00
Mihael Hategan
b3d3ce1c37 py-pytest-random-order: new package - plugin for py-pytest. (#26413) 2021-10-02 12:38:09 +02:00
Rob Falgout
d9f25f223f Add hypre release 2.23.0 (#26418) 2021-10-02 09:52:50 +02:00
Howard Pritchard
3575c14a6a ucx: add version 1.11.2 (#26417)
Signed-off-by: Howard Pritchard <howardp@lanl.gov>
2021-10-02 09:49:48 +02:00
Wouter Deconinck
314f5fdb97 New versions: geant4@10.7.2, geant4-data@10.7.2 (#25449)
* [geant4] new version 10.7.2

* [geant4-data] new version 10.7.2
2021-10-01 17:08:15 -07:00
Mihael Hategan
d8f23c0f6d Added py-typeguard package. (#26411)
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2021-10-01 23:58:39 +00:00
Harmen Stoppels
d0e49ae4bb Simplify setup_package in build environment (#26070)
* Remove redundant preserve environment code in build environment

* Remove fix for a bug in a module

See https://github.com/spack/spack/issues/3153#issuecomment-280460041,
this shouldn't be part of core spack.

* Don't module unload cray-libsci on all platforms
2021-10-01 19:41:30 -04:00
Cory Bloor
b6169c213d Fix error message when test throws AttributeError (#25895)
Narrow the scope of the try/except block, to avoid a misleading
error message if fn() throws an AttributeError.
2021-10-01 19:40:24 -04:00
Mihael Hategan
d19105f761 Added py-sqlalchemy-stubs package. (#26414)
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2021-10-01 21:50:55 +00:00
bernhardkaindl
9893718f78 axel(download manager): fix build, bump version (#26140) 2021-10-01 23:48:33 +02:00
Seth R. Johnson
2f630226e7 trilinos: add conflict and enable gtest for developers (#26150) 2021-10-01 15:53:16 -04:00
Harmen Stoppels
50feaae81c Allow non-empty ranges 1.1.0:1.1 (#26402) 2021-10-01 12:23:26 -07:00
Chris Richardson
bce269df13 Fenicsx update dependency and build for main branch (#25864)
* Updates to make it work again for main
* Use property

Co-authored-by: Chris <cnr12@cam.ac.uk>
2021-10-01 12:05:19 -07:00
Harmen Stoppels
18760de972 Spack install: handle failed restore of backup (#25647)
Spack has logic to preserve an installation prefix when it is being
overwritten: if the new install fails, the old files are restored.
This PR adds error handling for when this backup restoration fails
(i.e. the new install fails, and then some unexpected error prevents
restoration from the backup).
2021-10-01 11:40:48 -07:00
Olivier Cessenat
df590bb6ee hypre: add version 2.22.1; add fortran variant; becomes AutotoolsPackage (#25781) 2021-10-01 11:31:24 -07:00
bernhardkaindl
da171bd561 uftrace(dynamic function graph tracer): bump version (#26224) 2021-10-01 18:28:25 +00:00
Luciano Zago
f878f769a2 gh: add new package (#26405) 2021-10-01 17:58:47 +00:00
bernhardkaindl
ffb6597808 xrootd: Add current version 5.3.1 and restrict to openssl@1 (#26303) 2021-10-01 17:50:35 +00:00
Piotr Luszczek
f75ad02ba2 Add new package: ffte (#26340) 2021-10-01 17:27:38 +00:00
Scott Wittenburg
9f09156923 Retry pipeline generation jobs in certain cases 2021-10-01 10:12:37 -07:00
Scott Wittenburg
d11156f361 Add DAG scheduling to child pipelines 2021-10-01 10:12:37 -07:00
Scott Wittenburg
4637c51c7f Service jobs do not need an active environment 2021-10-01 10:12:37 -07:00
Scott Wittenburg
d233c20725 Activate concrete env dir in all service jobs 2021-10-01 10:12:37 -07:00
Scott Wittenburg
5f527075bf Use same image to build e4s as to generate it 2021-10-01 10:12:37 -07:00
Scott Wittenburg
f6fcfd2acc Use same image to build dvsdk as to generate it 2021-10-01 10:12:37 -07:00
Scott Wittenburg
ae092915ac Use default runner image for radiuss 2021-10-01 10:12:37 -07:00
Victor
98466f9b12 Add pmix variant to openmpi package (#26342)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2021-10-01 18:43:55 +02:00
Manuela Kuhn
0735419592 py-traitsui: add v7.2.1 (#26392) 2021-10-01 15:57:21 +00:00
Daniel Arndt
7293913e1c DataTransferKit: add v3.1-rc3 (#26269) 2021-10-01 17:33:09 +02:00
Seth R. Johnson
f2ef34638d graphviz: macOS patch not needed on newer version (#26292)
The autoconf file was updated in 2.49.0, but the patch is still needed for older
versions.
2021-10-01 17:30:23 +02:00
bernhardkaindl
6aff4831e3 cmor: bump version and depend on the specific libuuid packages needed by version. (#26149) 2021-10-01 17:29:39 +02:00
Georgiana Mania
7d77f36f83 eigen: add v3.4.0 (#26296) 2021-10-01 17:24:58 +02:00
haralmha
9c8041b8e4 hoppet: add new package (#26297)
Co-authored-by: Harald Minde Hansen <hahansen@office.dyndns.cern.ch>
2021-10-01 17:24:17 +02:00
Manuela Kuhn
eff5bd4005 py-coverage: add v5.5 (#26407) 2021-10-01 15:14:41 +00:00
Axel Huebl
177fe731b1 llvm-openmp: 12.0.1 (#26320)
Add the latest LLVM OpenMP release.
This one compiles for aarch64/M1 on mac.
2021-10-01 17:06:12 +02:00
bernhardkaindl
6b96486d42 py-gsutil: add v5.2 (#26337) 2021-10-01 17:03:54 +02:00
Daniel Arndt
f168d264d1 ArborX: add v1.1 (#26344) 2021-10-01 17:02:05 +02:00
Michael Kuhn
12dc01318e otf: Explicitly disable MPI and ZOIDFS (#26346)
Otherwise, global installations of MPI could be picked up by OTF.
2021-10-01 17:01:31 +02:00
bernhardkaindl
ad8b8b3377 damask-grid, damask-mesh: fix spack install --test=root (#26350) 2021-10-01 16:59:31 +02:00
Edgar Leon
4078c5e537 mpibind: add v0.7.0 and new flux variant (#26359) 2021-10-01 16:56:22 +02:00
Adam Moody
60aa97b25f LWGRP: add v1.0.4 DTCMP: add v1.1.2 and v1.1.3 (#26362) 2021-10-01 16:42:10 +02:00
Satish Balay
6362b3014a petsc,py-petsc4py add versions 3.15.5, 3.16.0 (#26375) 2021-10-01 16:37:28 +02:00
Christoph Conrads
07e91e8afa med: work around "could not find TARGET hdf5" error (#25312)
Fixes #24671, fixes #26051
2021-10-01 10:27:22 -04:00
Ryan Mast
c3c28d510f ns-3-dev: add v3.31, v3.32, v3.33, and v3.34 (#26383) 2021-10-01 16:15:39 +02:00
Pieter Ghysels
d765b4b53d STRUMPACK: add v6.0.0 (#26386)
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2021-10-01 16:12:41 +02:00
Manuela Kuhn
66866e45f2 py-pyface: add v7.3.0 (#26389) 2021-10-01 16:07:27 +02:00
Manuela Kuhn
ef202cbd37 py-ipywidgets: add 7.6.5 and get sources from pypi (#26393) 2021-10-01 16:06:25 +02:00
Mihael Hategan
e9609b62c6 Python Globus SDK (globus.org): add new package (#26395) 2021-10-01 16:05:49 +02:00
bernhardkaindl
da7666a967 py-pyside: restrict version of dependency on v1.2.2 (#26396)
Restrict @:1.2.2 to older sphinx versions to not fail the build
2021-10-01 16:04:36 +02:00
Manuela Kuhn
627c7007ef py-pyvista: add new package (#26398) 2021-10-01 16:00:30 +02:00
Anna Masalskaya
8fc770608d Add oneAPI packages from 2021.4 release (#26401) 2021-10-01 15:58:32 +02:00
Jose E. Roman
91f668a695 SLEPc: add v3.16 (#26403) 2021-10-01 15:55:42 +02:00
Manuela Kuhn
f775e81fca py-fury: add new package (#26404) 2021-10-01 15:53:50 +02:00
Harmen Stoppels
5b9f60f9bb bootstrapping: improve error messages (#26399) 2021-10-01 13:00:14 +02:00
bernhardkaindl
6ca42f0199 boost: @1.77.0: need updated python_jam.patch for +python (#26363)
One hunk changed and the new patch is refreshed using quilt.
2021-10-01 11:09:32 +02:00
Tamara Dahlgren
2d1ebbe0a2 Add info command tests to increase coverage (#26127) 2021-09-30 22:13:45 -04:00
Tamara Dahlgren
3feab42203 py-setuptools-rust: remove checksum url workaround (#25843) 2021-09-30 22:13:26 -04:00
Tamara Dahlgren
7aedeca764 Replace spec-related install_test asserts with exceptions; added unit tests (#25982) 2021-09-30 22:05:06 -04:00
Manuela Kuhn
c58cd83e38 py-shiboken2: add new package (#26390) 2021-10-01 01:19:41 +00:00
Manuela Kuhn
032ce20171 vtk: fix install path to lib (#26387) 2021-10-01 02:52:09 +02:00
Paul Spencer
046ed47b1f New package: slurm-drmaa (#25424)
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2021-10-01 00:07:03 +00:00
Manuela Kuhn
c076aeb192 py-jupyterlab: add new package (#26357) 2021-10-01 01:33:28 +02:00
Massimiliano Culpo
74d36076c2 ArchSpec: minor cleanup of a few methods (#26376)
* Remove vestigial code to be compatible with Spack v0.9.X
* ArchSpec: reworked __repr__ to be more adherent to common Python idioms
* ArchSpec: simplified __init__.py and copy()
2021-09-30 16:03:39 -07:00
Robert Underwood
a75c86f178 GDB: Fix for GMP and Python (#26366)
closes  #26354 and #26358

Previously we did not pass paths for GDB or GMP and ./configure would
get confused about which one to pull from.  Be more specific.
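A hedged sketch of the idea as a configure_args method (the flag names assume GDB's usual --with-gmp/--with-python configure options; the actual package change may differ):

```py
def configure_args(self):
    # Point configure at Spack's GMP and Python instead of letting it guess.
    args = ['--with-gmp={0}'.format(self.spec['gmp'].prefix)]
    if '+python' in self.spec:
        args.append('--with-python={0}'.format(self.spec['python'].command.path))
    return args
```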

Built with all variants enabled and fixed the fixable versions and variants:
@:8.1 were fixable by limiting python versions for these to @:3.6.
7.10.1 and 7.11(.1) were fixable to build with glibc-2.25 and newer
using two long patches.
gdb 7.8 and 7.9 weren't fixable as there is no backport of the fix
to build these with glibc-2.25 and newer:
http://lists.busybox.net/pipermail/buildroot/2017-March/188055.html

Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2021-09-30 22:40:35 +00:00
Massimiliano Culpo
8ade8a77dd Build container images on Github Actions and push to multiple registries (#26247)
Modifications:
- Modify the workflow to build container images without pushing when the workflow file itself is modified
- Strip the leading ghcr.io/spack/ from env.container env.versioned to prepare pushing to multiple registries
- Fixed CentOS 7 and Amazon Linux builds
- Login and push to Docker Hub as well as Github Action
- Add a badge to README.md with the status of docker images
2021-09-30 23:34:47 +02:00
Jen Herting
7bda430de0 [py-tabulate] added version 0.8.9 (#25974) 2021-09-30 23:02:14 +02:00
Cory Bloor
4238318de2 rocsolver: Tighten rocsolver package dependencies (#25917)
- Specify CMake minimum version more precisely
- Ensure rocBLAS is available at build time
- Limit workaround for missing rocblas include path
  to the only affected version (4.1.0)
- Make hip a build and link dependency
- Remove hip's link dependencies

Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2021-09-30 22:57:28 +02:00
Robert Underwood
7694c58736 sz: fix for external python sitelib (#26067)
fixes #25973
2021-09-30 11:52:20 -07:00
bernhardkaindl
af54f7b15b gromacs-chain-coordinate: Fix calling the test suite (#26367) 2021-09-30 18:18:03 +02:00
Harmen Stoppels
1aa7758dbb match .spack literal, not as a regex (#26374) 2021-09-30 14:54:57 +00:00
Harmen Stoppels
727dcef2f4 Disable __skip_rocmclang again (#26187)
CMake 3.21.3 disables the broken hipcc-as-rocmclang detection again.

From the release notes:

> The AMD ROCm Platform hipcc compiler was identified by CMake 3.21.0
> through 3.21.2 as a distinct compiler with id ROCMClang. This has been
> removed because it caused regressions. Instead:
> * hipcc may no longer be used as a HIP compiler because it interferes
>   with flags CMake needs to pass to Clang. Use Clang directly.
> * hipcc may once again be used as a CXX compiler, and is treated as
>   whatever compiler it selects underneath, as CMake 3.20 and below
>   did.
2021-09-30 16:01:17 +02:00
Harmen Stoppels
4dee7d2b22 Add a conflict for patchelf to catch errors earlier in bootstrapping (#26373) 2021-09-30 15:18:56 +02:00
Harmen Stoppels
525aa11827 Bump version from v0.16.2 to v0.16.3 (#26372) 2021-09-30 11:54:48 +00:00
Olivier Cessenat
9e70fc630e dos2unix: new version 7.4.2 and enforce link with gettext (#26371) 2021-09-30 12:04:32 +02:00
Harmen Stoppels
b96d836a72 Move gnuconfig under the spack org (#26370) 2021-09-30 11:30:29 +02:00
Diego Alvarez
56eddf2e3c Added OpenJDK 17, 11 for Darwin, and set preferred to 11 (#26368) 2021-09-30 11:29:36 +02:00
Maciej Wójcik
7c3b2c1aa8 gromacs-chain-coordinate: new package (#25426) 2021-09-30 04:12:37 +02:00
Shahzeb Siddiqui
ef9967c7f4 Add e4s tags (#23382)
Add 'e4s' tags for all packages according to https://github.com/E4S-Project/e4s/blob/master/E4S_Products.md
2021-09-29 17:57:52 -07:00
acastanedam
02f356159b linux-pam: include more official (recent) releases (#26065) 2021-09-29 20:51:39 -04:00
David Beckingsale
d31830eaf5 Move new CUDA conflicts inside when('~allow-unsupported-compilers') block (#26132)
* Move new CUDA conflicts inside when('~allow-unsupported-compilers') block
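Roughly, the pattern looks like the sketch below (the specific conflict is hypothetical; the real cuda package lists many such entries):

```py
with when('~allow-unsupported-compilers'):
    # Conflicts declared here apply only when unsupported compilers are not allowed.
    conflicts('%gcc@11:', when='@:11.4 target=x86_64:')
```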

Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>
2021-09-30 00:22:26 +00:00
Valentin Volkl
2ea1bf84ec py-snappy: add patch to fix dependencies (#26352)
* py-snappy: add patch to fix dependencies

* Update var/spack/repos/builtin/packages/py-snappy/req.patch

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-09-29 15:50:42 -05:00
Manuela Kuhn
0a7804f52e py-jupyterlab: add 3.0.14, 3.0.18 and 3.1.14 (#26339) 2021-09-29 14:34:18 -05:00
Valentin Volkl
805d59d47f geos: fix patch versions (#26351) 2021-09-29 18:32:37 +00:00
Manuela Kuhn
10638ea449 py-jupyter-packaging: add 0.7.12 and 0.10.6 (#26338)
* py-jupyter-packaging: add 0.7.12 and 0.10.6

* Update var/spack/repos/builtin/packages/py-jupyter-packaging/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-jupyter-packaging/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-09-29 18:31:32 +00:00
Howard Pritchard
6d613bded0 cray-mvapich2: add a package to support (#26273)
The format of the HPE/Cray supplied module for cray-mvapich2 on HPE apollo systems is
very different from the cray-mpich module supplied on Cray EX and XE
systems.

Recent changes to the cray-mpich package -

https://github.com/spack/spack/pull/23470

broke support for cray-mvapich2 and relies now on the structure of the
cray-mpich module to work properly.

Rather than try to support two very different vendor mpich modules
using the same spack package, just add another one specialized for
the cray-mvapich2 module.

Signed-off-by: Howard Pritchard <hppritcha@gmail.com>
2021-09-29 20:12:13 +02:00
Martin Diehl
f32feeee45 new package(s): DAMASK (damask, damask-grid, damask-mesh) (#25692) 2021-09-29 19:57:01 +02:00
Manuela Kuhn
c75e84e9f3 py-qtpy: add 1.11.2 (#26343) 2021-09-29 11:19:23 -05:00
Harmen Stoppels
7fdb879308 ca-certificates-mozilla for openssl & curl (#26263)
1. Changes the variant of openssl to `certs=mozilla/system/none` so that
   users can pick whether they want Spack or system certs, or if they
   don't want certs at all.
2. Keeps openssl's default behavior of using certs=system.
3. Changes the curl configuration to not guess the ca path during
   config, but rather fall back to whatever the tls provider is
   configured with. If we don't do this, curl will still pick up system
   certs if it finds them.

As a minor fix, it also adds the build dep `pkgconfig` to curl, since
that's being used during the configure phase to get openssl compilation
flags.
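For illustration, the shape of such a multi-valued variant in a package (a sketch with assumed dependency types, not the literal openssl/package.py):

```py
variant('certs', default='system', values=('mozilla', 'system', 'none'),
        multi=False, description='Which CA certificate bundle to use')
# Assumed dependency type; only needed when the Mozilla bundle is selected.
depends_on('ca-certificates-mozilla', type=('build', 'run'), when='certs=mozilla')
```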
2021-09-29 09:05:58 -07:00
iarspider
24263c9e92 New package: py-async-lru (#26328) 2021-09-29 18:05:36 +02:00
Manuela Kuhn
7cd3af5d52 py-mffpy: add new package (#26345) 2021-09-29 17:59:50 +02:00
bernhardkaindl
4a31f0fb0f py-mock: fix depends of @:2.0.0 and bump version (#26336)
* py-mock: fix depends of `@:2.0.0` and bump version

fixes the build of `py-gsutil`, which depends on 'py-mock@:2.0.0'.

* Update var/spack/repos/builtin/packages/py-mock/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-mock/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Apply the other requested changes

* Add requested change: Add the python@3.6 for newer versions

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-09-29 15:39:57 +00:00
Greg Sjaardema
5132cfcc63 seacas: new release and fixes for metis/parmetis (#26310)
* seacas: new release and fixes for metis/parmetis

* Update to add sha256 checksum for latest seacas release

* Updated the documentation strings with new applications

* Fixed the metis/parmetis variants and logic depending on whether mpi
  is enabled/disabled. (There is still a zoltan issue I need to fix,
  but this will at least allow seacas to be built without
  metis/parmetis or with +mpi+parmetis.  The ~mpi+metis still needs
  work elsewhere.)

* Enable cpup, slice, zellij in +applications
2021-09-29 16:53:49 +02:00
Harmen Stoppels
9d39a1bf42 Un-unset TERM and DISPLAY (#26326)
For interactive `spack build-env` sessions this is undesired behavior
2021-09-29 16:28:32 +02:00
bernhardkaindl
11278dd55a py-monotonic: bump version to 1.6 (#26335)
Fixes `spack solve py-gsutil`: It depends on 'py-monotonic@1.4:'
2021-09-29 09:19:47 -05:00
Pat Riehecky
e45979d39a Added OpenJDK 16.0.2 (#26322) 2021-09-29 14:19:38 +00:00
MichaelLaufer
ed54d6c926 py-netcdf4: adds parallel IO support (#26178) 2021-09-29 08:21:16 -05:00
iarspider
634b0b25c4 Add new package: py-backports-entry-points-selectable (#26329) 2021-09-29 15:09:27 +02:00
Manuela Kuhn
ac29b70d18 py-dipy: add new package (#26331) 2021-09-29 15:05:34 +02:00
Manuela Kuhn
e01947effc py-imageio-ffmpeg: add 0.4.5, fix installation and import (#26330) 2021-09-29 14:56:16 +02:00
Manuela Kuhn
78dbac69e9 py-pyqt5-sip: add new package (#26325) 2021-09-29 14:52:29 +02:00
bernhardkaindl
80c8978691 krb5: bump version and limit depends_on('openssl') to @1 (#26332)
krb5 uses an openssl RSA API which is deprecated in OpenSSL@3.0.0
with a const/non-const mismatch.
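In package terms this amounts to a constraint like the following sketch (the real directive may also carry a `when=` condition):

```py
# Keep krb5 on the OpenSSL 1.x line until the deprecated RSA API usage is fixed.
depends_on('openssl@:1')
```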
2021-09-29 11:24:08 +00:00
Manuela Kuhn
ab6b531e10 py-argcomplete: add 1.12.3 (#26228) 2021-09-29 13:10:03 +02:00
Olli Lupton
cffa1e7f6c Add ccache 4.4.2. (#26327) 2021-09-29 09:41:28 +00:00
Axel Huebl
c5410bd333 FFTW: NEON SIMD is float-only (#26321)
Fix on Apple Mac M1 (aarch64):
```
configure: error: NEON requires single precision
```
2021-09-29 02:40:45 +00:00
Ben Darwin
c578882bb2 itk: add version 5.2.1 (#26307) 2021-09-28 18:15:42 -07:00
Greg Sjaardema
a550416a1f netcdf-c: Update for 4.8.1 and new download site (#26309)
NetCDF-4.8.1 has been released.

    As discussed in https://github.com/Unidata/netcdf-c/issues/2110
    (netcdf-c-4.8.1.tar.gz not on ftp site... #2110), the canonical
    download site for netCDF releases has been changed and the previous
    ftp site is no longer available.

    This PR updates the `url` to point to the new recommended download
    site and updates the sha256 checksums for the new tar files.
2021-09-28 17:21:28 -07:00
Gregory Lee
82d857dced qhull deprecated libqhull.so in 2020.2 (#26304) 2021-09-28 16:54:30 -07:00
Manuela Kuhn
32b5669e8d py-pytz: add 2021.1 (#26289) 2021-09-28 18:07:28 +00:00
iarspider
a2c16356a1 dire: Fix depends for new pythia version scheme (#26294)
Pythia version numbering was changed in #26255 from xyyy to x.yyy
2021-09-28 18:05:52 +02:00
Edward Hartnett
4f4984e313 wrf-io: added NOAA software maintainers, added openmp variant (#26113) 2021-09-28 08:55:09 -07:00
Tamara Dahlgren
5ace5ca7a4 Post 26211 rebase updates (#26246) 2021-09-28 08:53:59 -07:00
Manuela Kuhn
519f75b865 py-scooby: add new package (#26284) 2021-09-28 17:51:37 +02:00
Diego Alvarez
023d9e3b4c Added OpenJDK 11.0.12_7 (#26299) 2021-09-28 17:47:45 +02:00
iarspider
9810a80c6d Update nlohmann-json to 3.10.2 (#26260)
* Update nlohmann-json

* fix testsuite

Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2021-09-28 08:45:34 -07:00
Harmen Stoppels
6c33047514 Squashfs is a makefilepackage (#26300) 2021-09-28 17:45:16 +02:00
bernhardkaindl
3c5a4157b0 py-anuga: add git main version to support build with python@3.5: (#26248)
* py-anuga: add git main version to support build with python@3.5:

py-anuga's main branch has been converted to Python-3 recently.

* py-triangle: use pypi, py-anuga: Fixed depends and test suite works now
2021-09-28 10:37:09 -05:00
bernhardkaindl
d00c9b47f2 py-tomopy: version bump and master, add new deps, make runtest pass (#26252)
Current py-tomopy has many features and dependencies, add them.
2021-09-28 10:33:29 -05:00
Manuela Kuhn
79808f92ae py-numba: add 0.54.0 and restrict old dependencies (#26281)
* py-numba: add 0.54.0 and restrict old dependencies

* Update var/spack/repos/builtin/packages/py-numba/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-09-28 10:32:44 -05:00
Manuela Kuhn
0f5c63b0c3 py-xlrd: add 2.0.1 (#26282) 2021-09-28 10:31:45 -05:00
Manuela Kuhn
b8dd8b07f9 py-appdirs: add 1.4.4 (#26298) 2021-09-28 10:26:23 -05:00
Diego Alvarez
f65a50d9f9 Update Nextflow (#26042) 2021-09-28 07:15:17 -04:00
Harmen Stoppels
4ce9c839ac Add OpenSSL 3 (#26097)
* OpenSSL 3.0.0

* Remove openssl constraint in e4s to test 3.0.0

* Restrict openssl

* Restrict openssl to @:1 in unifyfs

* Revert "Remove openssl constraint in e4s to test 3.0.0"

This reverts commit 0f0355609771764280ab1b6a523c80843a4f85d6.

* Prefer 1.x
2021-09-28 10:36:55 +02:00
Massimiliano Culpo
aa8727f6f9 Move detection logic in its own package (#26119)
The logic to perform detection of already installed
packages has been extracted from cmd/external.py
and put into the spack.detection package.

In this way it can be reused programmatically for
other purposes, like bootstrapping.

The new implementation accounts for cases where the
executables are placed in a subdirectory within <prefix>/bin
2021-09-28 09:05:49 +02:00
bernhardkaindl
499d39f211 libp11: Fix build and add new version (#26135)
The build needs `pkgconfig` and `openssl`, `m4` is already added by `autoconf`.
Also add the current version of `libp11` to the list of versions.

Co-authored-by: Bernhard Kaindl <bernhard.kaindl@ait.ac.at>
2021-09-28 04:01:08 +02:00
Manuela Kuhn
cd8e85d802 py-dataclasses: add 0.8 (#26283) 2021-09-27 20:18:07 -05:00
Manuela Kuhn
c1cfbc5536 py-meshio: add 5.0.1 and 4.4.6 (#26285) 2021-09-27 20:17:22 -05:00
Manuela Kuhn
87b23cc9cd py-deprecated: add 1.2.13 and get sources from pypi (#26286) 2021-09-27 20:15:35 -05:00
Harmen Stoppels
87450f3688 Use gnuconfig package for config file replacement (#26035)
* Use gnuconfig package for config file replacement

Currently the autotools build system tries to pick up config.sub and
config.guess files from the system (in /usr/share) on arm and power.
This is introduces an implicit system dependency which we can avoid by
distributing config.guess and config.sub files in a separate package,
such as the new `gnuconfig` package which is very lightweight/text only
(unlike automake where we previously pulled these files from as a
backup). This PR adds `gnuconfig` as an unconditional build dependency
for arm and power archs.

In case the user needs a system version of config.sub and config.guess,
they are free to mark `gnuconfig` as an external package with the prefix
pointing to the directory containing the config files:

```yaml
    gnuconfig:
      externals:
      - spec: gnuconfig@master
        prefix: /tmp/tmp.ooBlkyAKdw/lol
      buildable: false
```

Apart from that, this PR gives some better instructions for users when
replacing config files goes wrong.

* Mock needs this package too now, because autotools adds a depends_on

* Add documentation

* Make patch_config_files a prop, fix the docs, add integrations tests

* Make macOS happy
2021-09-27 18:38:14 -04:00
Tamara Dahlgren
c0da0d83ff qhull: use secure github URLS instead of www.qhull.org downloads (#26211) 2021-09-27 14:50:43 -07:00
Manuela Kuhn
9b7a8f69fb py-python-picard: add new package (#26268) 2021-09-27 23:39:30 +02:00
Brian Van Essen
acc8673e1a Updated versions for cuDNN and NVSHMEM (#26272) 2021-09-27 13:54:15 -07:00
Adam J. Stewart
5c1e133d23 py-torchvision: add v0.10.1 (#26267) 2021-09-27 13:35:53 -07:00
Manuela Kuhn
1ad8c0d67d py-llvmlite: add 0.37.0 (#26278) 2021-09-27 13:33:28 -07:00
Manuela Kuhn
044e2def4f py-datalad: move datalad wtf test over from py-datalad-metalad (#26274)
* py-datalad: move datalad wtf test over from py-datalad-metalad

* Update var/spack/repos/builtin/packages/py-datalad/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-09-27 15:24:48 -05:00
Manuela Kuhn
44355a4f18 py-numexpr: add 2.7.3 (#26279) 2021-09-27 20:13:59 +00:00
Manuela Kuhn
6b94d7e6a3 py-tqdm: add 4.62.3 (#26277) 2021-09-27 19:55:22 +00:00
Manuela Kuhn
441c602d2d py-nilearn: add 0.8.1 (#26276) 2021-09-27 14:50:13 -05:00
Thomas Madlener
285df128a1 [openloops][vbfnlo] Fix recipes and updated dependencies (#24461)
* [vbfnlo] Add doc variant to toggle building of docs

* [openloops] Add scons to dependencies

Make sure that the build process does not accidentally pick up a
non-suitable scons version from the underlying system

* [openloops] Set OLPYTHON to make sure the right scons is picked

* [openloops] Fix Flake8 style complaints
2021-09-27 11:40:33 -07:00
Richarda Butler
c68536e9d3 MPICH: Add E4S testsuite stand alone test (#25634) 2021-09-27 11:36:53 -07:00
Ben Darwin
c07af15946 mrtrix3: add version 3.0.3 (#25943) 2021-09-27 11:33:42 -07:00
iarspider
e256d02963 Update highfive (#26262)
* Add new versions of highfive
* Fix CMake option name
2021-09-27 11:24:37 -07:00
Marie Houillon
87c7a7e103 New packages for openCARP (#25909)
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2021-09-27 11:21:43 -07:00
Melven Roehrig-Zoellner
6b2192ef96 source-highlight: allow to build the git master branch (#25947)
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2021-09-27 11:16:55 -07:00
Miroslav Stoyanov
5130f43eb3 added workaround for rocm/cmake bug in tasmanian (#25949) 2021-09-27 11:14:21 -07:00
Edward Hartnett
c592e17d49 added package g2c from NOAAs NCEPLIBS project (#26190)
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2021-09-27 11:03:54 -07:00
Olivier Cessenat
27c0b4a0f7 sundials package: @:5.7.0 requires hypre@:2.22.0 (#25903)
sundials@:5.7.0 fails to build with hypre@2.22.1
2021-09-27 10:58:54 -07:00
Steven Smith
ee63cfe0fd New Package: EcoSLIM (#25975)
Co-authored-by: Steven Smith <Steven Smith>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2021-09-27 10:38:53 -07:00
psakievich
df9d1dd1cb Correct path comparisons for fs view (#25891)
* Fix path comparisons in copy views

* Get correct permissions

* Set group id through os

Co-authored-by: Philip Sakievich <psakiev@sanida.gov>
2021-09-27 09:51:38 -07:00
jacorvar
33957f2745 add mantainer to bedops (#26141) 2021-09-27 16:23:50 +00:00
bernhardkaindl
0cc80f5118 spindle: It seems it needs mpi.h to compile, adding depends on mpi. (#26151)
Work around this compile error for gcc by adding -Wno-narrowing:
spindle_logd.cc:65:76: error: narrowing conversion of '255' from 'int' to 'char'
spindle_logd.cc:65:76: error: narrowing conversion of '223' from 'int' to 'char'
spindle_logd.cc:65:76: error: narrowing conversion of '191' from 'int' to 'char'

spindle 0.8.1 wants to compile tests with mpi.h, newer versions need mpicc,
thus add: depends_on("mpi"). Spindle supports the --no-mpi to disable MPI.
2021-09-27 08:58:47 -07:00
Edward Hartnett
ffedc3637c added NOAAs UPP package (#26198) 2021-09-27 08:56:05 -07:00
Adam J. Stewart
4d3becdfd9 py-torch: add v1.9.1 (#26159) 2021-09-27 09:30:47 -05:00
bernhardkaindl
4a130b9185 perl: Fix test case bug perl#15544: Don't fail with PATH >1000 chars (#26245)
Fix the perl test case bug Perl/perl5#15544
A PATH variable longer than 1000 characters (as is usual with spack) fails a perl test case.
The fix: don't test PATH in the perlbug.t test case.
Fixes `spack install --test=all` for specs triggering a build and test of perl!
2021-09-27 15:18:36 +02:00
Manuela Kuhn
2acfe55b74 exempi: fix expat dependency (#26221)
* exempi: fix expat dependency and fix test with --test=root

Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2021-09-27 14:52:23 +02:00
iarspider
8710353336 pythia8: update to v8.306, change url, set environment variables for dependents (#26255) 2021-09-27 09:05:51 +00:00
Jen Herting
514d04c8f1 New package: r-klar (#25390) 2021-09-27 02:53:03 +00:00
Jen Herting
87e91178a6 New package: r-questionr (#25395) 2021-09-26 21:35:24 -05:00
Manuela Kuhn
e90534d45e py-python-gitlab: add 2.10.1 (#26229) 2021-09-27 01:24:05 +02:00
Wouter Deconinck
caa33b7b05 [acts] eigen is not only a build dependency (#26257)
The ActsConfig.cmake includes a `find_dependency(Eigen3)` line. Requiring depends_on type='build' does not expose eigen to dependents during their build.

Ref: https://github.com/acts-project/acts/blob/v13.0.0/cmake/ActsConfig.cmake.in#L59
2021-09-27 01:23:44 +02:00
lorddavidiii
8fa6597d43 cfitsio: add version 4.0.0 (#26218) 2021-09-27 01:18:09 +02:00
Seth R. Johnson
7e2fc45354 trilinos: default gotype=long_long (#26253)
"Long long" is the default type when building trilinos on its own, and
many downstream packages (both in and out of spack) rely on it. E4S
already sets this explicitly to long_long.
2021-09-26 16:51:23 +02:00
Desmond Orton
3e7cee27d7 masurca: add versions up to v4.0.5 (#26171)
Co-authored-by: las_djorton <las_djorton@babybuild.las.iastate.edu>
2021-09-26 15:53:59 +02:00
Wouter Deconinck
798ef2bdb0 vtk: depends_on freetype @:2.10.2 through @:9.0.1 only (#26180)
The issues in https://gitlab.kitware.com/vtk/vtk/-/issues/18033 were fixed in:
- dae1718d50
- 31e8e4ebeb

These fixes are in vtk@9.0.2 https://github.com/Kitware/VTK/releases/tag/v9.0.2 and beyond.
2021-09-26 15:52:51 +02:00
Brian Van Essen
cc658c5f1e lbann: add support for building with the ONNX C++ library (#26130) 2021-09-26 15:50:09 +02:00
Valentin Volkl
1f972f90c8 podio, edm4hep: fix tests (#26116)
Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>
2021-09-26 15:49:10 +02:00
Brian Van Essen
123c105771 onnx: fix python version in environments (#26131)
When using the ONNX package inside of an environment that specifies a
python3 executable, it will attempt to use a system installed
version.  This can lead to a failure where the system python and the
environment python don't agree and the system python ends up with an
invalid environment.  Forces ONNX to use the same version of python as
the rest of the spec.
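One plausible way to pin the interpreter, shown as a hypothetical cmake_args sketch (the actual fix may use a different mechanism):

```py
def cmake_args(self):
    # Force CMake to use the Python from the spec, not whatever is on PATH.
    return [self.define('PYTHON_EXECUTABLE', self.spec['python'].command.path)]
```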

Co-authored-by: Greg Becker <becker33@llnl.gov>
2021-09-26 15:48:45 +02:00
Brian Van Essen
b909560ed5 Added CMake command to export compiler directories to support emacs LSP mode. (#26118) 2021-09-26 15:47:46 +02:00
Paul Spencer
0e6e9c23a1 libsignal-protocol-c: new package (#26121)
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2021-09-26 15:47:15 +02:00
Hadrien G
a99d8463ac diffutils: Add v3.8, fixes an incompatibility with newer glibc (#26237) 2021-09-26 12:30:16 +02:00
Harmen Stoppels
e76462da38 Compiler conflict in umpire as a range (#26161) 2021-09-26 12:28:30 +02:00
Harmen Stoppels
fbda52d1b0 Curl 7.79.0 (#26075) 2021-09-26 12:28:05 +02:00
Manuela Kuhn
ab85961e73 py-datalad-hirni: add new package (#26227) 2021-09-26 09:20:36 +00:00
Nikolay Simakov
72cabcf454 enzo: build extra tools, add optimization variant (#25401)
* added inits and rings tools to be built
* added opt variant
2021-09-26 10:59:06 +02:00
iarspider
344b37730b jemalloc: add more variants (#26144) 2021-09-26 10:56:27 +02:00
bernhardkaindl
5967a10432 candle-benchmarks: depend_on()s: Fix obsoleted and renamed variants (#26250)
Fix solving/concretizing candle-benchmarks:
py-theano: The requested variant +gpu is now named +cuda
opencv: The requested variants +python and +zlib are now fixed deps
2021-09-26 10:37:01 +02:00
bernhardkaindl
6ccda81368 log_parser.py: Find failed test case messages in error logs (#25694)
- Match failed autotest tests: they show the word "FAILED" near the end
- Match "FAIL: ", "FATAL: ", "failed ", "Failed test" of other suites
- autotest "   ok"$ means the test passed, independend of text before.
- autoconf messages showing missing tools are fatal later, show them.
2021-09-26 10:35:04 +02:00
bernhardkaindl
34415f6878 swiftsim: Fix build with version bump and depends_on('gsl') (#26166)
swiftsim-0.7.0 didn't build because of a configure script issue and
more. Fix the build by a version bump and adding `depends_on('gsl')`.
2021-09-26 10:32:44 +02:00
bernhardkaindl
a947680e37 dropwatch: make check starts a daemon which does not stop: Skip it (#26164)
dropwatch is a network packet drop checker and its make check starts
a daemon which does not terminate.

- Skip this test to not block builds.
- Add depends_on('pkgconfig', type='build')
  It is needed in case the host does not have pkg-config installed.
- Remove the depends_on('m4', type='build'):
  The depends_on('autoconf', type='build') pulls m4 as it needs it.
2021-09-26 10:32:04 +02:00
bernhardkaindl
2154786b61 dnstop: needs ncurses and it forgot to add -ltinfo (#26163)
dnstop misses a depends_on('ncurses') and it needs
to link with libtinfo from ncurses as well.
2021-09-26 10:30:42 +02:00
bernhardkaindl
9a64a229a0 xfwp: Fix build using: env.append_flags('CPPFLAGS', '-D_GNU_SOURCE') (#26157)
Without -D_GNU_SOURCE, the build of xfwp has many compilation errors.
Example: io.c:1039:7: error: implicit declaration of function 'swab'
2021-09-26 10:29:43 +02:00
bernhardkaindl
7189a69c70 rivet: Fix of build and tests on Ubuntu 18.04 w/ version bump (#26152)
When using Ubuntu's gcc-8.4.0 on Ubuntu 18.04 to compile rivet-3.1.3,
compilation errors related to UnstableParticles(), "UFS" show up.

Compilation with this compiler is fixed in rivet-3.1.4, adding it.

Adding type='link' to the depends_on for 'hepmc' and 'hepmc3' fixes
the tests to find libHepMC.so.4 in `spack install --tests=all`

Co-authored-by: Valentin Volkl <valentin.volkl@cern.ch>
2021-09-26 10:29:08 +02:00
bernhardkaindl
fee0831b63 acct: skip installcheck: fails checking if all progs have --help (#26093)
Checks if all programs support --help and --version but not all do.
2021-09-26 10:26:21 +02:00
bernhardkaindl
919a893c08 mountpointattr: Fix checksum of 1.1.1 and fix build(patch is only for 1.1) (#26087) 2021-09-26 10:25:50 +02:00
bernhardkaindl
91ce8354a2 cairo: pass tests: devs wrote that other people shouldn't attempt them (#26086)
The cairo test suite is huge, has many backends, and the README states
that running and attempting to pass it is not a goal for normal users.
It has so many dependencies on the system, including fonts, that
passing it is not realistically within reach soon:
Skip it; it takes far too long to be practical.
2021-09-26 10:25:26 +02:00
bernhardkaindl
c411779c51 libfuse: Fix build when meson doesn't get host's udev pkg (#26147)
Despite the patch disabling installation of rules, meson's setup
stage looks up the udev package to get `/lib/udev/rules.d`, but as
spack has no `systemd/udev` package, it would fail to build.

Fix such builds by passing `-Dudevrulesdir` and bump version to 3.10.5
2021-09-26 10:24:30 +02:00
Manuela Kuhn
17857083b0 py-datalad-container: add new package (#26222) 2021-09-26 10:18:17 +02:00
Manuela Kuhn
dbbc2879e5 py-datalad-neuroimaging: add new package (#26223) 2021-09-26 10:17:22 +02:00
Manuela Kuhn
d627026910 py-datalad-webapp: add new package (#26226) 2021-09-26 09:50:01 +02:00
Manuela Kuhn
3b323d6c90 py-datalad-metalad: add new package (#26225)
Added py-nose for import and a simple installtest from appveyor.yml

Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2021-09-26 07:44:18 +00:00
bernhardkaindl
7ca657aef6 libidl builds using flex and bison: Add them as build-depends (#26183) 2021-09-26 09:40:18 +02:00
Jen Herting
27e6032f73 New package: py-gpyopt (#26232)
Co-authored-by: Sid Pendelberry <sid@rit.edu>
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2021-09-26 07:34:18 +00:00
Adam J. Stewart
76ca57bb2f py-scikit-learn: add v1.0 (#26243) 2021-09-26 08:33:58 +02:00
Jen Herting
56858eb34e first build of py-doe2 a fork of py-doe (#26235)
Co-authored-by: Sid Pendelberry <sid@rit.edu>
2021-09-26 08:12:40 +02:00
Carlos Bederián
b4b3b256bc ucx: use bfd instead of lld with %aocc (#26254) 2021-09-26 04:47:07 +02:00
Jen Herting
3c14d130ca New package: r-styler (#25399) 2021-09-25 20:50:22 +00:00
Sreenivasa Murthy Kolam
b56e1d1f6d limit the amd_target_sram_ecc to rocm-3.9.0-4.0.0 (#26240) 2021-09-25 19:18:44 +00:00
Sam Reeve
b2376db632 cabana: add 0.4 and new dependency variants (#25847)
Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>
Co-authored-by: Christoph Junghans <junghans@votca.org>
2021-09-25 12:04:24 -06:00
Glenn Johnson
d8dc1f2c80 new package: dicom3tools (#25700) 2021-09-25 08:43:17 -06:00
Jen Herting
348cebfb8f py-sobol-seq: new package (#26234)
Co-authored-by: Sid Pendelberry <sid@rit.edu>
2021-09-25 14:39:36 +02:00
Jen Herting
3f7d6763e0 New package: py-pydoe (#26233)
Co-authored-by: Sid Pendelberry <sid@rit.edu>
2021-09-25 14:34:35 +02:00
bernhardkaindl
ff511e090a autotools doc PR: No depends_on('m4') with depends_on('autoconf') (#26101)
* autotoolspackage.rst: No depends_on('m4') with depends_on('autoconf')
  - Remove `m4` from the example depends_on() lines for the autoreconf phase.
  - Change the branch used as example from develop to master as it is
    far more common in the packages of spack's builtin repo.
- Fix the wrong info that libtoolize and aclocal are run explicitly
  in the autoreconf phase by default. autoreconf calls these internally
  as needed, thus autotools.py also does not call them directly.
- Add that autoreconf() also adds -I<aclocal-prefix>/share/aclocal.
- Add an example of how to set autoreconf_extra_args.
- Add an example of a custom autoreconf phase for running autogen.sh.
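For instance, a custom autoreconf phase for packages that ship an autogen.sh could look like this sketch (in the spirit of the documented pattern; names assumed):

```py
def autoreconf(self, spec, prefix):
    # Run the project's own bootstrap script instead of plain autoreconf.
    which('bash')('autogen.sh')
```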

Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2021-09-25 10:15:03 +02:00
Alec Scott
ef7cf83ea5 Add Go v1.17.1 (#26128) 2021-09-25 01:22:34 -06:00
acastanedam
b67cfc0fd1 gpi-2: improve GPI-2 installation (#25808)
Based on the original script of R. Mijakovic, further improvements to the
GPI-2 installation, in particular different official versions,
configuration setups and even testing. Importantly, the non-autotools
way of installing older versions is also considered, which is
relevant for some packages using GPI-2.

Co-authored-by: Arcesio Castaneda Medina <arcesio.castaneda.medina@itwm.fraunhofer.de>
2021-09-25 00:58:46 -06:00
bernhardkaindl
e3dc586064 xload: fix build by disabling gettext and add v1.1.3 (#26142)
xload failed with unresolved references to libintl functions:
Disabled its use of gettext calls and added the last "new" version.

Co-authored-by: Bernhard Kaindl <bernhard.kaindl@ait.ac.at>
2021-09-25 00:58:32 -06:00
AMD Toolchain Support
ed13addf3b Cloverleaf: adding AOCC support (#26148)
Co-authored-by: Mohan Babu <mohbabul@amd.com>
2021-09-25 00:49:43 -06:00
Adam J. Stewart
b8e21f2b0b GDAL: fix build of Java bindings (#26244) 2021-09-25 00:01:45 -06:00
Manuela Kuhn
7e67652983 py-datalad: add 0.15.1 (#26230) 2021-09-24 23:40:48 -06:00
kwryankrattiger
a9cb31ab49 Enable cinema variant in ecp-data-vis-sdk (#26176) 2021-09-25 07:18:34 +02:00
Filippo Spiga
c725ca83ce nvhpc: add v21.9 (#26217) 2021-09-25 07:15:22 +02:00
bernhardkaindl
2eddc60baa tauola: Fix build: The current build wants lhapdf headers by default (#26154)
The hand-written `configure` script of this package does not handle
--without-<feature> at all. The source wants to use `lhapdf` headers
even if support of lhapdf is not indicated using `--with-lhapdf`.

Enable the variant `lhapdf` by default: It fixes the build of the
current package and provides the TauSpinner feature as well.
2021-09-24 20:16:59 -06:00
bernhardkaindl
13d313d3ad scallop: Fix build by bumping the version to 0.10.5 (#26162)
build of 0.10.3 fails with CoinPragma.hpp: No such file or directory
2021-09-24 18:11:06 -06:00
Olivier Cessenat
8ed7f65bc8 pigz: new version 2.6 (#26066) 2021-09-24 17:17:08 -06:00
bernhardkaindl
742bd1f149 nix: bump version, add new depends and make installcheck pass (#26179)
The current version of `nix` has some more features and has more
dependencies. The installcheck is quite involved but passes now.
2021-09-24 17:08:10 -06:00
bernhardkaindl
8d70f94aae cgdcbxd: Fix build: Use DESTDIR=self.prefix to not write to /etc (#26145)
Fix the build for normal non-root/non-system-user builds, as we cannot
know that we'd have to uninstall these files even if installed as root.
Also add `pkgconfig` and remove `depends_on('m4')`, which is not explicitly needed.
2021-09-24 16:32:09 -06:00
luker
ab90cd8fc4 Tau rocm fix (#26134)
* update the Tau package to use the correct ROCm dependencies and prefixes

    1st:
    When the rocm variant is selected, tau defaults to looking for rocm in /opt/rocm,
    which is not guaranteed to be the correct location -- this has been fixed
    to provide the prefix for hsa-rocr-dev (which is now a dependency when +rocm is
    selected).

    2nd:
    the rocprofiler dependency package was not specified correctly, it should
    be called rocprofiler-dev, also rocprofiler-dev is a dependency when
    +rocprofiler is selected.

added roctracer support
2021-09-24 16:29:04 -06:00
bernhardkaindl
cdcecda9d0 sst-dumpi: Correct the wrong checksum of the current version (#26181)
Correct the sha256 of the current version as shown by `spack checksum`
2021-09-24 14:17:16 -06:00
bernhardkaindl
adc7fee12e w3m: Fix build by disabling RAND_egd and japanese messages (#26168)
w3m's build fails with `undefined reference to 'RAND_egd'`, which
is a deprecated, insecure feature, and also fails when building Japanese messages.

Disabling both makes the build of `w3m` work.
2021-09-24 13:26:14 -06:00
bernhardkaindl
12252f1ca4 iproute2: add v5.10.0 and v5.11.0, add missing deps (#26182)
iproute2 needs libmnl and builds using flex and bison: Add them.
2021-09-24 12:37:52 -06:00
bernhardkaindl
7efb18b502 nginx: Bump version for OpenSSL@3 and restrict @:1.21.2 to openssl@:1 (#26156) 2021-09-24 12:16:51 -06:00
bernhardkaindl
45c84a2b04 xdriinfo: uses glXGetProcAddressARB, add: depends_on('gl') (#26184) 2021-09-24 12:08:07 -06:00
Edward Hartnett
2c9dfcabf8 added NOAA package prod_util (#26197) 2021-09-24 11:44:09 -06:00
Edward Hartnett
1a0e1597c8 nemsio: Add v2.5.3, which includes dependency change, and maintainers (#26188) 2021-09-24 11:41:15 -06:00
Adam J. Stewart
9a17dc0001 py-torchvision: rename master -> main (#26160) 2021-09-24 11:32:23 -06:00
Robert Cohn
3a48f5f931 intel-oneapi-mpi/mkl packages: add ilp64 support (#26045) 2021-09-24 10:32:06 -07:00
Edward Hartnett
73913a5d51 added GFDL FMS package (#26191) 2021-09-24 11:14:16 -06:00
Scott Wittenburg
45b70d9798 Pipelines: Disable ppc builds until we have resources or make it smaller (#26238) 2021-09-24 08:24:36 -07:00
bernhardkaindl
cdbb586a93 autotools.py/autoreconf: Show the depends_on()s to add to the package (#26115)
This commit shows a template to cut and paste into the package to fix it:

```py
==> fast-global-file-status: Executing phase: 'autoreconf'
==> Error: RuntimeError: Cannot generate configure: missing dependencies autoconf, automake, libtool.

Please add the following lines to the package:

    depends_on('autoconf', type='build', when='@master')
    depends_on('automake', type='build', when='@master')
    depends_on('libtool', type='build', when='@master')

Update the version (when='@master') as needed.
```
    
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2021-09-24 09:17:07 -06:00
bernhardkaindl
bcf708098d kmod: Add the missing depends on m4 and pkgconfig (#26137)
The build needs pkgconfig and lzma; m4 is already added by autoconf.

Disable generation of kmod manpages as spack does not have xsltproc yet.

Co-authored-by: Bernhard Kaindl <bernhard.kaindl@ait.ac.at>
2021-09-24 08:02:09 -06:00
Edward Hartnett
71378bcf39 landsfcutil: new package (#26193) 2021-09-24 13:48:58 +02:00
Wouter Deconinck
f68b2ea73a acts: add v13.0.0 (#26196) 2021-09-24 13:47:25 +02:00
Edward Hartnett
2d8f06bfa0 ufs-utils: added NOAA maintainers (#26201) 2021-09-24 13:07:18 +02:00
Harmen Stoppels
87e1ed0ef1 Bump cmake 3.21.3 (#26186) 2021-09-24 04:13:52 -06:00
Troy Redfearn
ef6013da9c Added new versions and deprecated vulnerable versions (#26206) 2021-09-24 03:37:44 -06:00
Tamara Dahlgren
c348daf4dc Add 'radiuss' tags to RADIUSS packages (#26212) 2021-09-24 03:20:14 -06:00
Edward Hartnett
21eaf31291 g2tmpl: added NOAA maintainers (#26192) 2021-09-24 02:35:02 -06:00
Edward Hartnett
6d2ce0535f added NOAA package grib-util (#26200)
* added NOAA package grib-util
2021-09-24 02:17:07 -06:00
Ben Bergen
2e4528c55c Added mising '+' for kokkos (#26202) 2021-09-24 02:14:25 -06:00
Edward Hartnett
6f8c9edefc added NOAA ncio package (#26194) 2021-09-24 02:14:11 -06:00
Edward Hartnett
d532f5b40c added NOAA package nemsiogfs (#26195) 2021-09-24 02:08:06 -06:00
Wouter Deconinck
fb2e764f50 assimp: depends_on zlib (#26199)
Assimp searches for zlib (or builds its own version). When it searches, it can find a system install that is not provided by spack. Ref: d286aadbdf/CMakeLists.txt (L451)
2021-09-24 02:05:06 -06:00
Andrew-Dunning-NNL
7bb5dd7a14 gradle: add new versions through 7.2 (#26203) 2021-09-24 02:02:17 -06:00
iarspider
1299c714c4 gosamcontrib: add libs variants (#26030) 2021-09-24 10:00:02 +02:00
Ryan Mast
2bcdb33666 helics: Add versions 2.8.0, 3.0.0, and 3.0.1 (#26046) 2021-09-24 01:52:52 -06:00
Harmen Stoppels
a6bb3a66ea Remove centos:6 image references (#26095)
This was EOL November 30th, 2020. I believe the "builds" are failing on
develop because of it.
2021-09-24 09:47:49 +02:00
Wouter Deconinck
117ea5a239 opencascade: add v7.5.3; added VTK INCLUDE cmake flag (#26209) 2021-09-24 09:27:07 +02:00
Massimiliano Culpo
03c203effa Use Leap instead of Tumbleweed for e2e bootstrapping test (#26205)
Tumbleweed has been broken for a couple of days. The attempt
to fix it in #26170 didn't really work. Let's try to move to
a more stable release series for OpenSuse.
2021-09-24 01:22:40 -06:00
Mikhail Titov
ba452f7645 radical cybertools: add v1.8.0 (#26215) 2021-09-24 09:22:02 +02:00
Edward Hartnett
54ea7caf3f g2: add maintainers and version 3.4.5 (#26105)
* added new release, added noaa software maintainers to maintainer list

* updated comment

* removed trailing whitespace
2021-09-24 00:17:07 -06:00
kjrstory
bdab4c9f1f fontconfig: add dependency python (#25960) 2021-09-23 23:26:05 -06:00
iarspider
72c22696ac gperftools package: add variants (#26032)
* Make libunwind optional
* Add support for sized_delete and debugalloc

Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2021-09-23 22:40:28 -06:00
Thomas Kluyver
fdc9cb0adb h5py: new version 3.4 (#25935)
No changes to dependencies or supported Python versions.

https://docs.h5py.org/en/stable/whatsnew/3.4.html
2021-09-23 12:12:33 -06:00
bernhardkaindl
ea459993db python: Fix regression in python-2.7.17+-distutils-C++.patch (#25821) 2021-09-23 07:38:58 -05:00
Christoph Conrads
45c6fe2b94 SQLite: make variants discoverable (#25885)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2021-09-23 09:54:02 +02:00
Harmen Stoppels
56a81ec81c Pin opensuse image in bootstrap tests (#26170)
Currently zypper in openSUSE containers throws 'not permitted'.

Temporarily pin the digest until they fix their upstream package manager issues.
2021-09-23 08:03:43 +02:00
Gregory Becker
1195ac02e5 Merge tag 'v0.16.3' into develop 2021-09-22 16:25:40 -07:00
Brian Van Essen
fa92482675 dihydrogen package: add missing dependency (#25896) 2021-09-22 12:03:15 -07:00
Edward Hartnett
76ad001c56 added NOAA software maintainers to maintainer list (#26112) 2021-09-22 07:24:16 -04:00
bernhardkaindl
4ccd2e8ff0 pango: Fix build: restore autotools-based versions (#26084)
Fix the build of pango and its 20 dependents: only provide the versions which
support building with autotools (the conversion to MesonPackage didn't progress).
This only restores the list of versions from August 10, before the build broke.
2021-09-22 05:13:49 -06:00
Alec Scott
2b6a8ccf29 Add Rclone v1.56.1 (#26124) 2021-09-22 04:07:48 -06:00
albestro
7613511718 add conflict (#26028) 2021-09-22 12:07:19 +02:00
Alec Scott
ed3aa9633d Add Picard v2.26.2 (#26125) 2021-09-22 04:04:44 -06:00
kjrstory
24ca007448 su2: add version 7.0.4-7.2.0 (#25956) 2021-09-22 01:07:44 -06:00
Alec Scott
86ddcd0e2d Add Slepc v3.15.2 (#26123) 2021-09-21 23:13:51 -06:00
Edward Hartnett
5097a41675 gfsio: add NOAA software maintainers (#26106)
* added NOAA software maintainers to maintainer list

* added comment about NCEPLIBS
2021-09-21 19:20:47 -06:00
Harmen Stoppels
3b55c2e715 Bump version and update changelog 2021-09-21 16:58:41 -07:00
Harmen Stoppels
7caa844d49 Fix style tests 2021-09-21 16:58:41 -07:00
Harmen Stoppels
4cd6381ed5 Remove centos:6 image references
This was EOL November 30th, 2020. I believe the "builds" are failing on
develop because of it.
2021-09-21 16:58:41 -07:00
Massimiliano Culpo
aebc8f6ce5 docker: remove boto3 from CentOS 6 since it requires an updated pip (#24813) 2021-09-21 16:58:41 -07:00
Massimiliano Culpo
073c92d526 docker: Fix CentOS 6 build on Docker Hub (#24804)
This change makes yum usable again on CentOS 6.
2021-09-21 16:58:41 -07:00
Greg Becker
0e85d7011e Ensure all roots of an installed environment are marked explicit in db (#24277) 2021-09-21 16:58:41 -07:00
Todd Gamblin
77e633efa1 locks: only open lockfiles once instead of for every lock held (#24794)
This adds lockfile tracking to Spack's lock mechanism, so that we ensure that there
is only one open file descriptor per inode.

The `fcntl` locks that Spack uses are associated with an inode and a process.
This is convenient, because if a process exits, it releases its locks.
Unfortunately, this also means that if you close a file, *all* locks associated
with that file's inode are released, regardless of whether the process has any
other open file descriptors on it.

Because of this, we need to track open lock files so that we only close them when
a process no longer needs them.  We do this by tracking each lockfile by its
inode and process id.  This has several nice properties:

1. Tracking by pid ensures that, if we fork, we don't inadvertently track the parent
   process's lockfiles. `fcntl` locks are not inherited across forks, so we'll
   just track new lockfiles in the child.
2. Tracking by inode ensures that references are counted per inode, and that we don't
   inadvertently close a file whose inode still has open locks.
3. Tracking by both pid and inode ensures that we only open lockfiles the minimum
   number of times necessary for the locks we have.

Note: as mentioned elsewhere, these locks aren't thread safe -- they're designed to
work in Python and assume the GIL.

Tasks:
- [x] Introduce an `OpenFileTracker` class to track open file descriptors by inode.
- [x] Reference-count open file descriptors and only close them if they're no longer
      needed (this avoids inadvertently releasing locks that should not be released).
2021-09-21 16:58:41 -07:00
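The reference-counting scheme described above can be sketched roughly as follows; this is a minimal illustration under assumed names, not Spack's actual `OpenFileTracker` implementation:

```python
import os


class OpenFileTracker(object):
    """Minimal sketch: reference-count open lockfile handles by (pid, inode)."""

    def __init__(self):
        self._files = {}  # (pid, inode) -> [file handle, reference count]

    def get_fh(self, path):
        # Key by pid so forked children don't reuse the parent's entries, and by
        # inode so every lock on the same file shares one descriptor.  Stat the
        # path first: opening (and later closing) a second handle to an inode
        # would drop all fcntl locks this process holds on it.
        pid = os.getpid()
        try:
            key = (pid, os.stat(path).st_ino)
        except OSError:
            key = None                      # file does not exist yet

        entry = self._files.get(key)
        if entry is None:
            fh = open(path, 'a+')           # creates the file if needed
            key = (pid, os.fstat(fh.fileno()).st_ino)
            entry = self._files[key] = [fh, 0]
        entry[1] += 1
        return entry[0]

    def release_fh(self, path):
        key = (os.getpid(), os.stat(path).st_ino)
        entry = self._files.get(key)
        if entry is not None:
            entry[1] -= 1
            if entry[1] == 0:               # last user: closing now releases the locks
                entry[0].close()
                del self._files[key]
```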
Todd Gamblin
2ae92ebdbb Use AWS CloudFront for source mirror (#23978)
Spack's source mirror was previously in a plain old S3 bucket. That will still
work, but we can do better. This switches to AWS's CloudFront CDN for hosting
the mirror.

CloudFront is 16x faster (or more) than the old bucket.

- [x] change mirror to https://mirror.spack.io
2021-09-21 16:58:41 -07:00
Harmen Stoppels
7f29dd238f Cray: fix extracting paths from module files (#23472)
Co-authored-by: Tiziano Müller <tm@dev-zero.ch>
2021-09-21 16:58:41 -07:00
Adam J. Stewart
0f486080b3 Fix use of quotes in Python build system (#22279) 2021-09-21 16:58:41 -07:00
Michael Kuhn
6b0c775448 clang/llvm: fix version detection (#19978)
This PR fixes two problems with clang/llvm's version detection. clang's
version output looks like this:

```
clang version 11.0.0
Target: x86_64-unknown-linux-gnu
```

This caused clang's version to be misdetected as:

```
clang@11.0.0
Target:
```

This resulted in errors when trying to actually use it as a compiler.

When using `spack external find`, we couldn't determine the compiler
version, resulting in errors like this:

```
==> Warning: "llvm@11.0.0+clang+lld+lldb" has been detected on the system but will not be added to packages.yaml [reason=c compiler not found for llvm@11.0.0+clang+lld+lldb]
```

Changing the regex to only match until the end of the line fixes these
problems.

Fixes: #19473
2021-09-21 16:58:41 -07:00
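The symptom and the fix can be illustrated with a small regex comparison; the patterns below are only illustrative, and the exact expression used in Spack's compiler detection differs:

```python
import re

output = "clang version 11.0.0\nTarget: x86_64-unknown-linux-gnu\n"

# A character class that does not exclude newlines swallows the next line:
broken = re.search(r'clang version ([^ )]+)', output)
print(repr(broken.group(1)))  # '11.0.0\nTarget:'

# Restricting the match to the current line yields the real version:
fixed = re.search(r'clang version ([^ \n)]+)', output)
print(repr(fixed.group(1)))   # '11.0.0'
```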
Adam J. Stewart
9120856c01 Fix fetching for Python 3.9.6 (#24686)
When using Python 3.9.6, Spack is no longer able to fetch anything. Commands like `spack fetch` and `spack install` all break.

Python 3.9.6 includes a [new change](https://github.com/python/cpython/pull/25853/files#diff-b3712475a413ec972134c0260c8f1eb1deefb66184f740ef00c37b4487ef873eR462) that means that `scheme` must be a string, it cannot be None. The solution is to use an empty string like the method default.

Fixes #24644. Also see https://github.com/Homebrew/homebrew-core/pull/80175 where this issue was discovered by CI. Thanks @branchvincent for reporting such a serious issue before any actual users encountered it!

Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2021-09-21 16:58:41 -07:00
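As an illustration of the underlying change (the actual call site in Spack is not shown in this log, so `urlsplit` here is only an example of the pattern):

```python
from urllib.parse import urlsplit

url = "https://mirror.spack.io/example.tar.gz"

# Works on all versions: an empty string is the documented default for `scheme`.
print(urlsplit(url, scheme=""))

# On Python 3.9.6+ the scheme argument is treated strictly as a string, so
# passing None (which older versions tolerated) raises an exception.
try:
    urlsplit(url, scheme=None)
except Exception as exc:
    print("scheme=None is no longer accepted:", type(exc).__name__, exc)
```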
Edward Hartnett
90d953d3f6 w3emc: add NOAA software maintainers (#26110) 2021-09-21 17:54:19 -06:00
Tiziano Müller
d34b2638c7 boost: fix for @1.77.0%intel (#25965)
Add patch for build script from boost repo.
2021-09-21 16:07:25 -07:00
bernhardkaindl
979c355c99 spack/build_environment.py: Clean MAKEFLAGS, DISPLAY and TERM (#26092)
clean_environment(): Unset three more environment variables:

MAKEFLAGS: Affects make and can, e.g., indirectly inhibit enabling a parallel build
DISPLAY: Tests of GUI widget libraries might try to connect to an X server
TERM: Could make testsuites attempt to color their output
2021-09-22 00:23:10 +02:00
Edward Hartnett
92f199d57b added NOAA software maintainers to the maintainers list (#26102) 2021-09-21 16:16:49 -06:00
Edward Hartnett
fd716ce0c5 sfcio: add NOAA software maintainers (#26108)
* added NOAA software maintainers to maintainer list

* added comment about NCEPLIBS project
2021-09-21 15:28:37 -06:00
Edward Hartnett
d865caa7f7 ip2: add NOAA software maintainers and deprecation note (#26107)
* added NOAA software maintainers to maintainer list, added comment about library being deprecated

* deleted trailing whitespace
2021-09-21 14:49:13 -06:00
bernhardkaindl
981551b47f libtool: fix running the unit-tests with spack install --test root (#25707)
Besides adding autoconf and automake as needed for tests of 2.4.6,
skip Fortran test cases when Fortran compilers are not provided.
2021-09-21 14:18:29 -06:00
Tamara Dahlgren
c3cb863b82 Feature: Add deprecated versions section to spack info output (#25972) 2021-09-21 11:49:36 -07:00
AMD Toolchain Support
8de77bb304 Fix for - Installation issue: amdlibflame #25878 (#25987)
* Fix for - Installation issue: amdlibflame #25878

* Updated with python+pythoncmd - Symlink 'python3' executable to 'python'
2021-09-21 11:15:23 -07:00
Edward Hartnett
a8977f828e w3nco: add NOAA software maintainers (#26111) 2021-09-21 12:08:43 -06:00
Jonas Thies
6c487f4c6d packages/phist new version 1.9.5 (#26114) 2021-09-21 11:57:22 -06:00
Edward Hartnett
24500d4115 added noaa software maintainers to maintainer list (#26103) 2021-09-21 09:58:53 -07:00
iarspider
2741037c69 Fix FORM recipe (#26104) 2021-09-21 09:57:40 -07:00
Edward Hartnett
d278837316 sigio: add NOAA software maintainers (#26109) 2021-09-21 10:39:05 -06:00
bernhardkaindl
22cfc19bcb ddd,debuild,flux-sched: add missing dependencies (#26090) 2021-09-21 18:08:11 +02:00
Melven Roehrig-Zoellner
c24413a530 petsc: fix for enabling openmp (#25942)
* petsc: fix for enabling openmp

* petsc: shorten comment (style guidelines)

* petsc: move flag to make code more clear
2021-09-21 07:13:35 -06:00
natshineman
544b3f447d mvapich2-gdr: add v2.3.6 (#26076)
Co-authored-by: Nick Contini <contini.26@buckeyemail.osu.edu>
2021-09-21 06:19:56 -06:00
Gregory Lee
05ca19d9d7 fgfs: fix missing autotools depends_on (#26005)
* added build deps for fgfs

* added build deps when building master branch
2021-09-21 07:40:59 -04:00
Harmen Stoppels
a326713826 py-abcpy: new package (#25713) 2021-09-21 13:10:16 +02:00
bernhardkaindl
1b95c97bb3 ccfits: add v2.6 (#26089)
ccfits@2.5 doesn't compile on Ubuntu 18.04: Update to version 2.6.
2021-09-21 04:43:48 -06:00
bernhardkaindl
c6a33c28de Fix deps of diffmark,faiss,fgsl,fipcheck,nbdkit,ncbi-magicblast,xcb-util-cursor (#26091)
Fix missing type='build' deps for cpio, m4, pkg-config and python in
diffmark, faiss, fgsl, fipcheck, nbdkit, ncbi-magicblast and xcb-util-cursor
2021-09-21 04:16:43 -06:00
Harmen Stoppels
58663692a4 Rename 'variant_name' to 'variant' and document it in autotools build system (#26064) 2021-09-21 11:27:41 +02:00
Christoph Junghans
45ba4d94bd votca-* packages: add version 2021.2 (#26058) 2021-09-20 16:14:19 -07:00
Adam J. Stewart
3e3b7b1394 New package: py-torchgeo (#26059) 2021-09-20 16:12:55 -07:00
Jordan Ogas
d0fd9e6d5f charliecloud package: add version 0.25 (#26073) 2021-09-20 16:08:41 -07:00
adityakavalur
88880c5369 Include newer versions of UCX (#26072) 2021-09-20 23:41:26 +02:00
Danny McClanahan
aa7e354a3a patch serf and scons to use #!/usr/bin/env python3 (#26062) 2021-09-20 22:12:30 +02:00
iarspider
f3d2f62468 Add variants to FORM recipe (#25963) 2021-09-20 13:53:54 -06:00
Edward Hartnett
65f285831e added NOAA maintainers to ip package (#26055) 2021-09-20 11:36:03 -06:00
Olivier Cessenat
0197b364a2 mfem and petsc: alter hypre dependencies (#25902)
Hypre latest version 2.22.1 breaks MFEM and PETSc.
2021-09-20 09:51:29 -07:00
Edward Hartnett
96eefea9a3 added NOAA software team to maintainers list (#26056) 2021-09-20 09:01:53 -06:00
Olivier Cessenat
1ea58e0cb9 p7zip: resolve gcc 10 conflict (#25676)
Fix credit: Eric Brugger
2021-09-20 07:43:07 -04:00
Tamara Dahlgren
c6f9e9baf6 strumpack: Update stand-alone test to use stage directory (#25751) 2021-09-20 07:36:58 -04:00
kjrstory
430caaf5f6 paraview: patch for version 5.9.1 with gcc 11.1.0 (#25887)
* paraview: patch for version 5.9.1 with gcc 11.1.0

* Fix : blank line contains whitespace
2021-09-20 07:31:54 -04:00
Robert Underwood
6476e0cf02 gdb: new version v11.1 (#25920) 2021-09-20 07:30:37 -04:00
Jen Herting
acd4186a88 New packages: py-imgaug (#25894)
* updates from sid

* [py-imgaug] url -> pypi

* [py-imgaug] added version 0.4.0 and fixed up dependencies

* [py-imgaug] updated copyright

Co-authored-by: Andrew Elble <aweits@rit.edu>
2021-09-20 07:29:43 -04:00
iarspider
ca8d16c9d1 Allow setting variant name in AutotoolsPackage._activate_or_not (#26054) 2021-09-20 10:54:24 +02:00
jacorvar
d20fa421ca Fix R version (#26061)
R version should be `3.0.0` instead of `3.00`.
2021-09-20 10:47:40 +02:00
Edward Hartnett
8fceaf4a33 added NOAA software team to maintainers list (#26057) 2021-09-19 19:02:02 -06:00
Vanessasaurus
534bad0ff7 updating OmegaH to 9.34.1 (#26015) 2021-09-19 08:55:54 -06:00
Todd Gamblin
4f8b643404 Create SECURITY.md 2021-09-19 06:43:14 -07:00
Vanessasaurus
c2f42a6f09 updating kokkos to 3.4.01 (#26013) 2021-09-19 06:10:55 -06:00
Vanessasaurus
148c7aeb06 updating singularity to 3.8.3 (#26020)
Signed-off-by: vsoch <vsoch@users.noreply.github.com>

Co-authored-by: vsoch <vsoch@users.noreply.github.com>
2021-09-19 04:20:15 -06:00
jacorvar
4f79a83dd1 Fix R version (#26047)
R version should be `3.0.0`, in contrast to `3.00`.
2021-09-19 12:14:14 +02:00
Alec Scott
2d4d51eb5e Add v1.6.0 to Benchmark (#25996) 2021-09-18 20:59:02 -06:00
Cameron Smith
b14d6d217e omegah: add version 9.34.1 (#25828) 2021-09-18 19:58:44 -06:00
Alec Scott
5b400a69b7 Add v0.935 to Angsd (#25995) 2021-09-18 18:46:55 -06:00
Adam J. Stewart
c0839b14d5 Python: use platform-specific site packages dir (#25998) 2021-09-19 00:37:50 +00:00
Seth R. Johnson
6ff5795342 cmake: allow gcc on macOS for newer versions (#25994) 2021-09-18 17:34:43 -06:00
Michele Martone
60f0d0ac20 librsb: add v1.2.0.10 (#26044) 2021-09-18 17:43:41 -04:00
Axel Huebl
8f3482b2ce FFTW: Fix OpenMP Build on macOS (#26039) 2021-09-18 15:13:47 -06:00
Vanessasaurus
fc79a5da17 updating siesta to 4.0.2 (#26021) 2021-09-18 08:05:02 -06:00
Vanessasaurus
1880db8457 updating universal ctags to 5.2.0.xxxx (#26024) 2021-09-18 07:13:55 -06:00
Vanessasaurus
6538727fb3 updating sparsehash to 2.0.4 (#26023) 2021-09-18 06:52:59 -06:00
Vanessasaurus
f5275fecc5 updating htslib to 1.13 (#26012) 2021-09-18 06:26:00 -06:00
Vanessasaurus
d499309433 updating nco to 5.0.1 (#26014) 2021-09-18 06:23:05 -06:00
Vanessasaurus
c5e2662dea updating graphviz to 2.49.0 (#26011) 2021-09-18 06:19:56 -06:00
Vanessasaurus
7b0a505795 updating veloc to 1.5 and cleaning up spacing (#26025) 2021-09-18 06:10:56 -06:00
Vanessasaurus
e8b86b7ec6 updating cloc to 1.9.0 (#26009) 2021-09-18 05:44:04 -06:00
Vanessasaurus
608a0b1f8f updating poppler (#26016) 2021-09-18 05:10:56 -06:00
Vanessasaurus
abd11cf042 updating raxml to 8.2.12 (#26018) 2021-09-18 04:58:50 -06:00
Vanessasaurus
8ad910d8f1 updating protobuf (Gooooogle!) to 3.18.0 (#26017) 2021-09-18 04:40:49 -06:00
Vanessasaurus
d5c985b888 updating samtools to 1.13 (#26019) 2021-09-18 04:16:51 -06:00
Vanessasaurus
f0443d0327 updating spades to 3.15.3 (#26022) 2021-09-18 03:46:50 -06:00
Vanessasaurus
9af1858bff updating gatk to 4.2.2.0 (#26010) 2021-09-18 10:07:28 +02:00
David Beckingsale
fa3265ea51 Convert RAJA, CHAI and Umpire to CachedCMakePackages (#25788)
* Switch Umpire to CachedCMakePackage
* Fix missing import
* Correct tests option in Umpire
* Switch RAJA to CachedCMakePackage
* Convert CHAI to CachedCMakePackage
* Corrections in RAJA
* Patches in Umpire & RAJA for BLT target export
* Fixup style
* Fixup incorrect use of cmake_cache_string
2021-09-17 21:29:41 -07:00
Chris White
79ac572f21 update tutorial version of hdf5 (#25368) 2021-09-17 17:34:56 -07:00
Massimiliano Culpo
b847bb72f0 Bootstrap should search for compilers after switching config scopes (#26029)
fixes #25992

Currently the bootstrapping process may need a compiler.

When bootstrapping from sources the need is obvious, while
when bootstrapping from binaries it's currently needed in
case patchelf is not on the system (since it will then be
bootstrapped from sources).

Before this PR we were searching for compilers as the
first operation, in case they were not declared in
the configuration. This fails in case we start
bootstrapping from within an environment.

The fix is to defer the search until we have swapped
configuration.
2021-09-17 18:28:48 -06:00
Harmen Stoppels
4d36c40cfb Bump reframe (#25970) 2021-09-17 14:23:05 -06:00
Olli Lupton
ae9adba900 Add ccache v4.4.1. (#25957) 2021-09-17 13:37:58 -06:00
Cyrus Harrison
bd415ec841 improve ascent package to use stages and cmake base (#25720)
* improve ascent package to use stages and cmake base

* style

* more style
2021-09-17 10:37:16 -07:00
iarspider
7e7de25aba fmt: add variant for shared library (#25969) 2021-09-17 08:04:55 -06:00
eugeneswalker
730720d50a variant build: openmp_ref should be openmp (#26006) 2021-09-17 06:38:00 -06:00
iarspider
8e486c1e57 gosam: new version 2.1.1 (#25985) 2021-09-17 06:10:41 -06:00
Kurt Sansom
be8e52fbbe GCC: patch for gcc 10.3.0 ICE when using nvcc (#25980)
* fix: patch for gcc 10.3.0 ICE when using nvcc

* fix: use URL reference instead

* fix: add missing sha256sum
2021-09-16 18:43:56 -06:00
Edward Hartnett
f8fae997d3 added package.py for GPTL (#25993) 2021-09-16 17:11:05 -06:00
Massimiliano Culpo
71a3173a32 Filter UserWarning out of test output (#26001) 2021-09-16 14:56:00 -06:00
Miroslav Stoyanov
e027cecff2 workaround a cmake/rocm bug in heffte (#25948) 2021-09-16 13:26:07 -06:00
Alec Scott
bb29c5d674 Add v7.0.2 to Admixtools (#25997) 2021-09-16 12:08:15 -06:00
G-Ragghianti
c4e26ac7c8 Fix for problem with cmake@3.21 (#25989) 2021-09-16 11:52:59 -06:00
AMD Toolchain Support
cf81046bb1 New package: ROMS (#25990)
Co-authored-by: Mohan Babu <mohbabul@amd.com>
2021-09-16 10:34:45 -07:00
Michael Kuhn
2d34acf29e cc: Use parameter expansion instead of basename (#24509)
While debugging #24508, I noticed that we call `basename` in `cc`. The
same can be achieved by using Bash's parameter expansion, saving one
external process per call.

Parameter expansion cannot replace basename for directories in some
cases, but is guaranteed to work for executables.
2021-09-16 16:25:49 +00:00
Michael Kuhn
d73fe19d93 Recommend Git's manyFiles feature (#25977)
Git 2.24 introduced a feature flag for repositories with many files, see:
https://github.blog/2019-11-03-highlights-from-git-2-24/#feature-macros

Since Spack's Git repository contains roughly 8,500 files, it can be
worthwhile to enable this, especially on slow file systems such as NFS:
```
$ hyperfine --warmup 3 'cd spack-default; git status' 'cd spack-manyfiles; git status'
Benchmark #1: cd spack-default; git status
  Time (mean ± σ):      3.388 s ±  0.095 s    [User: 256.2 ms, System: 625.8 ms]
  Range (min … max):    3.168 s …  3.535 s    10 runs

Benchmark #2: cd spack-manyfiles; git status
  Time (mean ± σ):     168.7 ms ±  10.9 ms    [User: 98.6 ms, System: 126.1 ms]
  Range (min … max):   144.8 ms … 188.0 ms    19 runs

Summary
  'cd spack-manyfiles; git status' ran
   20.09 ± 1.42 times faster than 'cd spack-default; git status'
```
2021-09-16 09:41:10 -06:00
Harmen Stoppels
5b211c90f5 Bump sirius 7.2.x (#25939) 2021-09-16 15:00:58 +02:00
Massimiliano Culpo
fa6366a7df Add a deprecation warning when using the old concretizer (#25966) 2021-09-16 13:25:24 +02:00
Mikael Simberg
b09ad2cc8c Update HPX package (#25775)
* Add support for C++20 to HPX package

* Enable unity builds in HPX package when available

* Add support for HIP/ROCm to HPX package

* Rearrange and update required versions for HPX package

* Add C++20 option to asio package
2021-09-16 13:24:17 +02:00
Harmen Stoppels
ccfdac8402 Improve bootstrapping docs a hair (#25962) 2021-09-16 07:02:31 -04:00
Harmen Stoppels
abb0f6e27c Fix NameError in foreground/background test (#25967) 2021-09-16 10:39:07 +02:00
Harmen Stoppels
3fe9b34362 Inform the user about bootstrapping (#25964) 2021-09-16 09:28:24 +02:00
jacorvar
c0122242ee bedops: Fix checksum for 2.4.40 (#25958)
Fixes #25951
2021-09-15 16:17:40 -07:00
Adam J. Stewart
0015f700b7 py-pybind11: use PythonPackage install method (#25650) 2021-09-15 11:57:33 -07:00
dependabot[bot]
5d23638fdc build(deps): bump codecov/codecov-action from 2.0.3 to 2.1.0 (#25925)
Bumps [codecov/codecov-action](https://github.com/codecov/codecov-action) from 2.0.3 to 2.1.0.
- [Release notes](https://github.com/codecov/codecov-action/releases)
- [Changelog](https://github.com/codecov/codecov-action/blob/master/CHANGELOG.md)
- [Commits](https://github.com/codecov/codecov-action/compare/v2.0.3...v2.1.0)

---
updated-dependencies:
- dependency-name: codecov/codecov-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2021-09-15 16:02:05 +02:00
Tamara Dahlgren
f0b4afe7db Raise exception when 1+ stand-alone tests fail (#25857) 2021-09-15 08:04:59 -04:00
Ben Corbett
75497537a4 Added LvArray 0.2.2 (#25950) 2021-09-15 01:44:07 -06:00
Massimiliano Culpo
c52426ea7a Make clingo the default solver (#25502)
Modifications:
- [x] Change `defaults/config.yaml`
- [x] Add a fix for bootstrapping patchelf from sources if `compilers.yaml` is empty
- [x] Make `SPACK_TEST_SOLVER=clingo` the default for unit-tests
- [x] Fix package failures in the e4s pipeline

Caveats:
1. CentOS 6 still uses the original concretizer as it can't connect to the buildcache due to issues with `ssl` (bootstrapping from sources requires a C++14 capable compiler)
1. I had to update the image tag for GitlabCI in e699f14.  
1. libtool v2.4.2 has been deprecated and other packages received some updates
2021-09-14 22:44:16 -07:00
Adam J. Stewart
0d0d438c11 Add a __reduce__ method to Environment (#25678)
* Add a __reduce__ method to Environment
* Add unit test
* Convert Path to str
2021-09-14 22:37:36 -07:00
Vanessasaurus
ef5ad4eb34 Adding ability to compare git references to spack install (#24639)
This will allow a user to (from anywhere a Spec is parsed including both name and version) refer to a git commit in lieu of 
a package version, and be able to make comparisons with releases in the history based on commits (or with other commits). We do this by way of:

 - Adding a property, is_commit, to a version, meaning I can always check if a version is a commit and then change some action.
 - Adding an attribute to the Version object which can lookup commits from a git repo and find the last known version before that commit, and the distance
 - Construct new Version comparators, which are tuples. For normal versions, they are unchanged. For commits with a previous version x.y.z, d commits away, the comparator is (x, y, z, '', d). For commits with no previous version, the comparator is ('', d) where d is the distance from the first commit in the repo.
 - Metadata on git commits is cached in the misc_cache, for quick lookup later.
 - Git repos are cached as bare repos in `~/.spack/git_repos`
 - In both caches, git repo urls are turned into file paths within the cache

If a commit cannot be found in the cached git repo, we fetch from the repo. If a commit is found in the cached metadata, we do not recompare to newly downloaded tags (assuming repo structure does not change). The cached metadata may be thrown out by using the `spack clean -m` option if you know the repo structure has changed in a way that invalidates existing entries. Future work will include automatic updates.

# Finding previous versions
Spack will search the repo for any tags that match the string of a version given by the `version` directive. Spack will also search for any tags that match `v + string` for any version string. Beyond that, Spack will search for tags that match a SEMVER regex (i.e., tags of the form x.y.z) and interpret those tags as valid versions as well. Future work will increase the breadth of tags understood by Spack

For each tag, Spack queries git to determine whether the tag is an ancestor of the commit in question or not. Spack then sorts the tags that are ancestors of the commit by commit-distance in the repo, and takes the nearest ancestor. The version represented by that tag is listed as the previous version for the commit.

Not all commits will find a previous version, depending on the package workflow. Future work may enable more tangential relationships between commits and versions to be discovered, but many commits in real world git repos require human knowledge to associate with a most recent previous version. Future work will also allow packages to specify commit/tag/version relationships manually for such situations.

# Version comparisons.
The empty string is a valid component of a Spack version tuple, and is in fact the lowest-valued component. It cannot be generated as part of any valid version. These two characteristics make it perfect for delineating previous versions from distances. For any version x.y.z, (x, y, z, '', _) will be less than any "real" version beginning x.y.z. This ensures that no distance from a release will cause the commit to be interpreted as "greater than" a version which is not an ancestor of it.

Signed-off-by: vsoch <vsoch@users.noreply.github.com>

Co-authored-by: vsoch <vsoch@users.noreply.github.com>
Co-authored-by: Gregory Becker <becker33@llnl.gov>
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2021-09-14 22:12:34 -07:00
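The ordering trick can be illustrated with plain Python tuples; this is illustrative only, and the components are shown as strings just to keep tuple comparison homogeneous -- Spack's real comparators carry more structure:

```python
# A commit 5 commits after the v1.2.0 tag gets a comparator like (1, 2, 0, '', 5).
release = ('1', '2', '0')
commit_after_release = ('1', '2', '0', '', '5')
next_release = ('1', '2', '1')

# The empty-string component sorts below any real component, so the commit sits
# strictly between the release it descends from and the next release:
print(release < commit_after_release < next_release)   # True

# A commit with no previous version uses ('', d) and sorts below every release:
rootless_commit = ('', '17')
print(rootless_commit < release)                        # True
```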
David Beckingsale
c3bc3e61aa gcc: apply backported fixes to v4.9.3 (#25945) 2021-09-15 07:10:04 +02:00
Satish Balay
148071ac8a dealii: add version 9.3.1 (#25915) 2021-09-14 15:26:08 -06:00
Scott Wittenburg
bf7c12b4df Pipelines: (Re)enable E4S on Power stack (#25921)
Pipelines: (Re)enable E4S on Power stack
2021-09-14 14:55:50 -06:00
Weston Ortiz
56c375743a Add missing mumps TPL commands (#25940) 2021-09-14 20:26:37 +01:00
Vanessasaurus
c6edfa3f31 Update spack monitor to support new spec (#25928)
This PR coincides with tiny changes to spack to support spack monitor using the new spec;
the corresponding spack monitor PR is at https://github.com/spack/spack-monitor/pull/31.
Since there are no changes to the database we can actually update the current server
fairly easily, so either someone can test locally or we can just update and then
test from that (and update as needed).

Signed-off-by: vsoch <vsoch@users.noreply.github.com>

Co-authored-by: vsoch <vsoch@users.noreply.github.com>
2021-09-14 08:02:04 -06:00
Harmen Stoppels
fc96c49b0b Add py-pot with patch (#25712) 2021-09-14 15:55:02 +02:00
Harmen Stoppels
ce790a89f2 reframe: set PYTHONPATH at runtime (#25842) 2021-09-14 15:54:14 +02:00
Valentin Volkl
1b633e1ca4 ocaml: add patch for clang@11: (#25886) 2021-09-14 09:34:57 +02:00
Ben Darwin
2ac9dc76c4 dcmtk: add v3.6.4, v3.6.5, v3.6.6 (#25923) 2021-09-14 09:22:48 +02:00
Weston Ortiz
d7c5aa46fe trilinos: variant for libx11 (#25823) 2021-09-14 08:53:53 +02:00
Edward Hartnett
f6eb16982a added new version of parallelio library (#25916) 2021-09-13 18:22:54 -06:00
David Beckingsale
be1c4bc563 Rename camp 'main' version (#25918) 2021-09-13 16:55:55 -06:00
Timothy Brown
998de97091 ESMF, NEMSIO and UFS-UTILS changes. (#25846)
* ESMF and NEMSIO changes.

- Updating ESMF to set the COMM correctly when using Intel oneapi.
- Explicitly setting the CMake MPI Fortran compiler for NEMSIO.

* Update UFS utils CMake to use MPI_<lang>_COMPILER.
2021-09-13 16:13:43 -06:00
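For the NEMSIO change, explicitly pointing CMake at the MPI Fortran wrapper might look roughly like the sketch below inside the package's `cmake_args`; the exact arguments in the real package may differ:

```python
def cmake_args(self):
    # Sketch: pass the MPI Fortran wrapper explicitly instead of relying on
    # CMake's own MPI detection.
    return [
        self.define('CMAKE_Fortran_COMPILER', self.spec['mpi'].mpifc),
    ]
```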
Greg Becker
dad69e7d7c Fix environment reading from lockfile to trust written hashes (#25879)
#22845 revealed a long-standing bug that had never been triggered before, because the
hashing algorithm had been stable for multiple years while the bug was in production. The
bug was that when reading a concretized environment, Spack did not properly read in the
build hashes associated with the specs in the environment. Those hashes were recomputed
(and as long as we didn't change the algorithm, were recomputed identically). Spack's
policy, though, is never to recompute a hash. Once something is installed, we respect its
metadata hash forever -- even if internally Spack changes the hashing method. Put
differently, once something is concretized, it has a concrete hash, and that's it -- forever.

When we changed the hashing algorithm for performance in #22845 we exposed the bug.
This PR fixes the bug at its source by properly reading in the cached build hash attributes
associated with the specs. I've also renamed some variables in the Environment class
methods to make a mistake of this sort more difficult to make in the future.

* ensure environment build hashes are never recomputed
* add comment clarifying reattachment of env build hashes
* bump lockfile version and include specfile version in env meta
* Fix unit-test for v1 to v2 conversion

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2021-09-13 15:25:48 -06:00
Satish Balay
9956841331 petsc, petsc4py: add version 3.15.4 (#25912) 2021-09-13 11:49:36 -07:00
Massimiliano Culpo
e9f1cfdaaf Avoid hidden circular dependencies in spack.architecture (#25873)
* Refactor platform etc. to avoid circular dependencies

All the base classes in spack.architecture have been
moved to the corresponding specialized subpackages,
e.g. Platform is now defined within spack.platforms.

This resolves a circular dependency where spack.architecture
was both:
- Defining the base classes for spack.platforms, etc.
- Collecting derived classes from spack.platforms, etc.
Now it does only the latter.

* Move a few platform related functions to "spack.platforms"

* Removed spack.architecture.sys_type()

* Fixup for docs

* Rename Python modules according to review
2021-09-13 11:04:42 -07:00
Chuck Atkins
060582a21d ci: Add ecp-data-vis-sdk CI pipeline (#22179)
* ci: Add a minimal subset of the ECP Data & Vis SDK CI pipeline

* ci: Expand the ECP Data & Vis SDK pipeline with more variants
2021-09-13 11:34:13 -06:00
Chuck Atkins
c392454125 Disable dvsdk variants (#25889)
* dvsdk: Turn off variants by default

This allows an install to more easily be explicit about which pieces to
turn on as more variants are added

* dvsdk: effectively disable the broken variants
2021-09-13 11:33:55 -06:00
Seth R. Johnson
fca81c2ac8 kokkos: fail gracefully on missing microarch (#25910)
Fall back on known parent microarches (as determined by spack's built-in
archspec knowledge). Closes spack/spack#25907.
2021-09-13 11:07:14 -06:00
Tamara Dahlgren
bafd84e191 Switch http to https where latter exists (#25672)
* Switch http to https where latter exists
* Hopefully restore original permissions
* Add URL updates after including the -L curl option
* Manual corrections to select URL format strings
2021-09-13 09:21:35 -07:00
Desmond Orton
1e08f31e16 New Version: mothur@1.46.1 (#25850) 2021-09-13 10:20:22 -06:00
Seth R. Johnson
1da7839898 trilinos: add conflicts for fortran (#25911) 2021-09-13 16:51:38 +02:00
Harmen Stoppels
a4a22a6926 Add a master branch for gnuconfig (#25866) 2021-09-13 14:17:32 +02:00
Nic McDonald
ceb94bd6ae Log4cxx add v0.12.0, convert to CMakePackage (#25875) 2021-09-13 05:17:08 -06:00
Tim Gymnich
b745e208a3 ravel: fix missing header file (#25906)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2021-09-13 04:49:42 -06:00
Harmen Stoppels
6a8383b2a7 Make sure perl doesn't run into recompile with fPIC linker errors (#25834) 2021-09-13 11:29:15 +02:00
Kelly (KT) Thompson
6210e694e1 draco: add v7.11.0 (#24631) 2021-09-13 11:16:57 +02:00
Adam J. Stewart
2f889b045c py-kornia: add new package (#25844) 2021-09-13 11:10:19 +02:00
Daniel Arndt
bc3b90f6ac Add SYCL build option to ArborX (#25736) 2021-09-13 11:09:51 +02:00
Harmen Stoppels
1f323d296d py-glmnet: new package (#25711) 2021-09-13 11:07:59 +02:00
Mikael Simberg
819cd41ee4 hpx, kokkos: add consistent variants for C++ std (#25535)
* Add cuda_constexpr variant to Kokkos package
* Don't require nvcc_wrapper in Kokkos package when using Cray compiler
2021-09-13 10:43:20 +02:00
Axel Huebl
c0069210e2 HiPACE: Update openPMD dep, add v21.09 (#25698) 2021-09-13 10:28:08 +02:00
Tamara Dahlgren
a1d792af4c Bugfix: Correct checksum's sha256 when retrieve from remote (#25831) 2021-09-13 08:08:00 +00:00
Glenn Johnson
b7e61a4b75 Tell gtk-doc where the XML catalog is (#25569)
* Tell gtk-doc where the XML catalog is

The gtk-doc configure script has an option for specifying the path to
the XML catalog. If this is not set the configure script will search
a defined set of directories for a catalog file and will set
`with_xml_catalog` based on that. Only if no system catalog is found will
the XML_CATALOG_FILES be looked at. In order to make sure that the spack
provided catalog is used, pass the `--with-xml-catalog` option.

* Use the property from docbook-xml
2021-09-13 09:57:04 +02:00
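Passing the configure option from a package recipe might look roughly like this, assuming docbook-xml exposes a `catalog` property as the second bullet suggests; the property name is an assumption here:

```python
def configure_args(self):
    # Sketch: point gtk-doc's configure at Spack's XML catalog instead of
    # letting it pick up a system catalog.
    return [
        '--with-xml-catalog={0}'.format(self.spec['docbook-xml'].package.catalog),
    ]
```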
Mickaël Schoentgen
59832fb0ac httpie: add v2.5.0 (#25888) 2021-09-13 09:41:02 +02:00
Tamara Dahlgren
5fa075f5b4 Bugfix: spack test debug requires Spack tty (#25897) 2021-09-13 08:57:16 +02:00
gpotter2
4573741baa nmap: overhaul of the package recipe (#25656) 2021-09-13 08:24:02 +02:00
Xavier Delaruelle
e0d8f67f34 environment-modules: add version 5.0.0 (#25904)
Adapt configure arguments to only keep useful ones for this new major
release version.
2021-09-13 07:58:40 +02:00
Adam J. Stewart
0545f7d5cc py-pandas: add v1.3.3 (#25905) 2021-09-13 07:57:56 +02:00
Stephen McDowell
3a9028427c [docs] document official gfortran macOS precompiled binaries (#25818)
* document official gfortran macOS precompiled binaries

* compile without -vvv ;) {squash this}
2021-09-10 14:11:26 -05:00
Harmen Stoppels
729726d157 Remove dead code in installer (#24035)
Currently as part of installing a package, we lock a prefix, check if
it exists, and create it if not; the logic for creating the prefix
included a check for the existence of that prefix (and raised an
exception if it did), which was redundant.

This also includes removal of tests which were not verifying
anything (they pass with or without the modifications in this PR).
2021-09-10 11:32:46 -07:00
Scott Wittenburg
81962f100c Pipelines: build kokkos-kernels on bigger instance (#25845) 2021-09-10 08:27:42 -07:00
bernhardkaindl
d54a692e09 openssh: Fix parallel install issue, add test suite (#25790)
- Parallel install was failing to generate a config file.
- OpenSSH has an extensive test suite; run it if requested.
- 'executables' wrongly had 'rsh'; replaced it with the openssh tools.
2021-09-10 05:58:37 -06:00
Christoph Conrads
b12f38383c SQLite: fix rtree, add version, make discoverable (#25554)
There are two ways to build SQLite: With the Autotools setup or the
so-called "amalgamation" which is a single large C file containing the
SQLite implementation. The amalgamation build is controlled by
pre-processor flags and the Spack setup was using an amalgamation
pre-processor flag for a feature that is controlled by an option of the
configure script. As a consequence, until now Spack has always built
SQLite with the rtree feature enabled.
2021-09-09 18:13:49 -06:00
Mark Abraham
9084ad69b4 Have GROMACS log files indicate spack was used (#25869)
Knowing that spack has patched the code and organized the build is potentially valuable information for GROMACS users and developers troubleshooting their builds.

PLUMED does further patches to GROMACS, so that is expressed directly also.
2021-09-09 16:38:31 -06:00
albestro
59d8031076 CUDA official GCC conflicts (#25054)
* update CUDA 11 / GCC compatibility range

* additional unofficial conflict

* minor changes to comments
2021-09-09 15:03:16 -07:00
Massimiliano Culpo
5fddd48f80 Refactor unit-tests in test/architecture.py (#25848)
Modifications:
- Export platforms from spack.platforms directly, so that client modules don't have to import submodules
- Use only plain imports in test/architecture.py
- Parametrized test in test/architecture.py and put most of the setup/teardown in fixtures
2021-09-09 09:34:47 -06:00
acastanedam
d8b95a496c openssl: link system openssl.conf after installation (#25807)
Co-authored-by: Arcesio Castaneda Medina <arcesio.castaneda.medina@itwm.fraunhofer.de>
2021-09-09 15:15:53 +02:00
Raghu Raja
161f0d5045 libfabric: Adds v1.13.1 (#25827)
This commit adds the 1.13.1 release. There was an update to the release
tarball on 9/7/21, so using the latest sha256 sum here.
2021-09-09 04:10:57 -06:00
Nathan Hanford
d83f7110d5 specs: move to new spec.json format with build provenance (#22845)
This is a major rework of Spack's core core `spec.yaml` metadata format.  It moves from `spec.yaml` to `spec.json` for speed, and it changes the format in several ways. Specifically:

1. The spec format now has a `_meta` section with a version (now set to version `2`).  This will simplify major changes like this one in the future.
2. The node list in spec dictionaries is no longer keyed by name. Instead, it is a list of records with no required key. The name, hash, etc. are fields in the dictionary records like any other.
3. Dependencies can be keyed by any hash (`hash`, `full_hash`, `build_hash`).
4. `build_spec` provenance from #20262 is included in the spec format. This means that, for spliced specs, we preserve the *full* provenance of how to build, and we can reproduce a spliced spec from the original builds that produced it.

**NOTE**: Because we have switched the spec format, this PR changes Spack's hashing algorithm.  This means that after this commit, Spack will think a lot of things need rebuilds.

There are two major benefits this PR provides:
* The switch to JSON format speeds up Spack significantly, as Python's builtin JSON implementation is orders of magnitude faster than YAML. 
* The new Spec format will soon allow us to represent DAGs with potentially multiple versions of the same dependency -- e.g., for build dependencies or for compilers-as-dependencies.  This PR lays the necessary groundwork for those features.

The old `spec.yaml` format continues to be supported, but is now considered a legacy format, and Spack will opportunistically convert these to the new `spec.json` format.
2021-09-09 01:48:30 -07:00
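A rough illustration of the new layout (the exact field names and contents are defined by Spack's specfile schema and are not reproduced here):

```python
import json

# Illustrative shape only: a versioned _meta section and a list of node
# records, each carrying its own name and hashes (values are placeholders).
spec = {
    '_meta': {'version': 2},
    'nodes': [
        {
            'name': 'zlib',
            'version': '1.2.11',
            'hash': 'abc123',
            'full_hash': 'def456',
            'build_hash': 'ghi789',
            'dependencies': [],
        },
    ],
}
print(json.dumps(spec, indent=2))
```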
Harmen Stoppels
edb1d75b1b Add new package gnuconfig (#25849) 2021-09-09 02:13:43 -06:00
bernhardkaindl
ed9b38c8e3 Fix python/packages.py's config_vars for python2 packages (#25839)
Analysis mostly by me; fix updated after a suggestion by Adam J. Stewart.

Co-authored-by: Bernhard Kaindl <bernhard.kaindl@ait.ac.at>
2021-09-09 01:25:55 -06:00
Adam J. Stewart
1a5891754a py-torchmetrics: add v0.5.1 (#25855) 2021-09-09 01:19:45 -06:00
Esteban Pauli
d916d801f2 sina: new package (#25448)
* Added spackage to build Sina (https://github.com/LLNL/Sina).

* Improvements to sina/package.py

Made numerous simplifications and improvements to sina/package.py
based on PR feedback.

* Added licence info

* Added maintainers

* Changed maintainers to be Github IDs.
2021-09-08 20:56:37 -07:00
Dr. Christian Tacke
46d770b416 root: Add Version 6.24.06 (#25822) 2021-09-08 16:29:10 -07:00
Cyrus Harrison
6979a63396 conduit: changes related to hdf5 builds that use cmake (#25719) 2021-09-08 15:53:47 -07:00
lukebroskop
f9314d38b0 mpip: fix package to depends on libunwind when +libunwind (#24007)
Added a dependency for mpip@3.5: when the libunwind variant is set to true (which is the default)
and setjmp is set to false (which is also the default), to avoid a configure-time
error from not finding libunwind.

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2021-09-08 13:34:39 -06:00
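Inside the package recipe, the conditional dependency described above would look roughly like this sketch; the actual constraint in the package may be phrased differently:

```python
# Sketch: libunwind is only required for the variant combination that uses it,
# avoiding the configure-time error described above.
depends_on('libunwind', when='@3.5: +libunwind ~setjmp')
```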
Massimiliano Culpo
e47f0d486c Account for bootstrapping from sources niche case
This modification accounts for:
1. Bootstrapping from sources using system, non-standard Python
2. Later using an ABI-compatible standard Python interpreter
2021-09-08 11:13:37 -07:00
Massimiliano Culpo
fd0884c273 Fix clingo bootstrapping on rhel + ppc64le
The system Python interpreter on rhel is patched to have
slightly different names for some architectures. This
makes it incompatible with manylinux generated extensions
for ppc64le.

To fix this issue when bootstrapping Spack we generate
on-the-fly symbolic links to the name expected by the
current interpreter if it differs from the default.

Links:
  https://github.com/pypa/manylinux/issues/687
  https://src.fedoraproject.org/fork/churchyard/rpms/python3/blame/00274-fix-arch-names.patch?identifier=test_email-mktime
2021-09-08 11:13:37 -07:00
Massimiliano Culpo
4033cc0250 Disable module generation during bootstrapping 2021-09-08 11:13:16 -07:00
Todd Gamblin
c309adb4b3 url stats: add --show-issues option (#25792)
* tests: make `spack url [stats|summary]` work on mock packages

Mock packages have historically had mock hashes, but this means they're also invalid
as far as Spack's hash detection is concerned.

- [x] convert all hashes in mock package to md5 or sha256
- [x] ensure that all mock packages have a URL
- [x] ignore some special cases with multiple VCS fetchers

* url stats: add `--show-issues` option

`spack url stats` tells us how many URLs are using what protocol, type of checksum,
etc., but it previously did not tell us which packages and URLs had the issues. This
adds a `--show-issues` option to show URLs with insecure (`http`) URLs or `md5` hashes
(which are now deprecated by NIST).
2021-09-08 07:59:06 -07:00
Paul Kuberry
c2a6ccbea8 trilinos: Gather teko requirements in one place and add conflict for muelu (#25703) 2021-09-08 15:18:01 +01:00
acastanedam
ca94240dd4 gcc: add patch for sanitizer in gcc<11.1.0 (#25804)
This fixes the compilation of gcc versions older than 11.1.0, which
fails due to the removal of cyclades from libsanitizer, as described in
the patch:

The Linux kernel has removed the interface to cyclades from the latest
kernel headers due to them being orphaned for the past 13
years. libsanitizer uses this header when compiling against glibc, but
glibc itself doesn't seem to have any references to cyclades. Furthermore,
it seems that the driver is broken in the kernel and the firmware
doesn't seem to be available anymore. As such since this is breaking
the build of libsanitizer (and so the GCC bootstrap) it is proposed to
remove this.

Co-authored-by: Arcesio Castaneda Medina <arcesio.castaneda.medina@itwm.fraunhofer.de>
2021-09-08 15:57:42 +02:00
bernhardkaindl
0ac751b27b perl: Bind us to @gdbm:1.19 due to API change in gdbm@1.20: (#25819)
By changing return values from C #defines to enums, gdbm-1.20 breaks a kludge:

  #ifndef GDBM_ITEM_NOT_FOUND
  # define GDBM_ITEM_NOT_FOUND GDBM_NO_ERROR
  #endif

The absence of the #define causes perl to #define GDBM_ITEM_NOT_FOUND
as GDBM_NO_ERROR, which is incorrect for gdbm@1.20.
2021-09-08 07:28:52 -06:00
Nic McDonald
9ef1dbd0ef systemc: new package (#25761) 2021-09-08 07:08:03 -06:00
Ethan Stam
c3d5232d5b ParaView: disable externals for fmt and exprtk (#25462) 2021-09-08 14:05:52 +02:00
Seth R. Johnson
f4e66b306e trilinos: yak shaving (#25549)
* trilinos: yak shaving

- use flags instead of manually adding cxxflags
- rearrange defines more sensibly
- use conflicts instead of inline package disables
- fix some inapplicable definitions such as OpenMP instantiation
- disable "broken" leaf packages(FEI, Panzer) by default
- rely on upstream libraries 'libs' rather than manual names

* flake8

* Fix executable call

* Address reviewer feedback
2021-09-08 14:04:35 +02:00
Harmen Stoppels
c33382b607 Cmake: improve ncurses detection (#25776)
* Optionally enable ccmake in cmake

Renames ncurses variant to `ccmake` since that's how users know it, and
explicitly enable/disable `BUILD_CursesDialog`.

* Make cmake locate its dependencies with CMAKE_PREFIX_PATH, and set rpath flags too

* Undo variant name & defaults change
2021-09-08 05:56:00 -06:00
iarspider
47b16b39a3 Add new version of Alpaka, set minimal CMake version (#25835) 2021-09-08 13:55:38 +02:00
Harmen Stoppels
7018a42211 Make sure we only pull in openssl w/ +ownlibs (#25559) 2021-09-08 05:16:50 -06:00
iarspider
28f71c4d12 Update catch2 (#25838) 2021-09-08 04:49:50 -06:00
Harmen Stoppels
26455a4ac2 mbedtls: turn into a MakefilePackage (#25558)
This is a step towards building cmake with curl using mbedtls instead of
openssl; building mbedtls shouldn't require cmake then.
2021-09-08 12:39:25 +02:00
bernhardkaindl
4e4b199f16 lib/spack/env/cc: tolerate trailing / in elements of $PATH (#25733)
Fixes removal of SPACK_ENV_PATH from PATH in the presence of trailing
slashes in the elements of PATH:

The compiler wrapper has to ensure that it is not called nested like
it would happen when gcc's collect2 uses PATH to call the linker ld,
or else the compilation fails.

To prevent nested calls, the compiler wrapper removes the elements
of SPACK_ENV_PATH from PATH.

Sadly, the autotest framework appends a slash to each element
of PATH when adding AUTOTEST_PATH to the PATH for the tests,
and some tests like those of GNU bison run cc inside the test.

Thus, ensure that PATH cleanup works even with trailing slashes.

This fixes the autotest suite of bison, compiling hundreds of
bison-generated test cases in a autotest-generated testsuite.

Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2021-09-08 12:09:07 +02:00
Sebastian Ehlert
0fb5a39c17 Add new versions of nlopt (#25836) 2021-09-08 12:06:45 +02:00
QuellynSnead
a86279cc52 netlib-lapack: Fixes for IBM XL builds (#25793)
netlib-lapack: Version 3.9.0 and above no longer builds with the IBM XL
compiler (#25447). Ported some fixes from the old ibm-xl.patch and added
logic for detection of XL's -qrecur flag.
2021-09-08 04:09:12 -04:00
bernhardkaindl
fd111a3395 autoconf package: Fix 2.69 and 2.62 to pass the testsuite (#25701)
Apply stable-release fixes from 2017 to older autoconf releases:
- Fix the scripts autoheader and autoscan to pass the test suite
- Fix test case to passing when libtool 2.4.3+ is in use

autoconf-2.13 dates back to 1999. Its build hasn't been possible for
4 years: since 2017, we patch autom4te, which didn't exist in 2.13,
so building it fails. Four years of not being able to build 2.13
is a crystal-clear indication that we can remove it safely.
2021-09-07 18:25:44 -07:00
Cody Balos
32b6da8d57 amrex: support sundials variant in newer amrex versions (#25745)
* amrex: support sundials variant in newer amrex versions

* propagate cuda_arch to sundials

* change to old string formatting

* require sundials+rocm when amrex+rocm
2021-09-07 18:05:32 -07:00
Adam J. Stewart
c6e538583f GDAL package: add version 3.3.2 (#25820) 2021-09-07 13:55:51 -07:00
Stephen McDowell
83298160cc docs: minor grammar fix (#25814) 2021-09-07 09:51:14 +00:00
Adam J. Stewart
54fbe555cd gdbm: fix build issue on macOS (#25811) 2021-09-07 08:51:19 +01:00
Tao Lin
c4e85faa2d garfieldpp: add new package (#25800)
Co-authored-by: Valentin Volkl <valentin.volkl@cern.ch>
2021-09-07 08:16:15 +02:00
Ignacio Laguna
a59edb2826 fpchecker: add new package (#25550) 2021-09-07 08:14:25 +02:00
Timothy Brown
2e2fbc6408 mpas-model: add v 7.1 (#25809) 2021-09-07 08:08:04 +02:00
Adam J. Stewart
5abbd094c7 py-einops: add new package (#25812) 2021-09-07 07:40:18 +02:00
bernhardkaindl
ca58cb701c gtk-doc: Fix the testsuite (hangs if gtkdocize was not installed) (#25717)
Ensure that the test suite has py-anytree and py-parameterized
and finds gtk-doc's gtkdocize.

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-09-06 20:10:14 -05:00
Scott Wittenburg
a358358aa7 Update pinned OpenSSL version to 1.1.1l (#25787)
Update to the latest version of openssl, as the previous one (1.1.1k) is
now deprecated, so spack can no longer rebuild it from source.
2021-09-06 16:46:41 +00:00
bernhardkaindl
84d525dbdf perl: fix regressions in the end-of-life revisions (#25801)
- perl@:5.24.1 needs zlib@:1.2.8 - shown by more than a dozen tests
  https://rt.cpan.org/Public/Bug/Display.html?id=120134

- perl@:5.26.2 needs gdbm@:1.14.1 - shown by the test suite
  https://rt-archive.perl.org/perl5/Ticket/Display.html?id=133295

- Fix the test case cpan/Time-Local/t/Local.t to use 4-digit years
  http://blogs.perl.org/users/tom_wyant/2020/01/my-y2020-bug.html
2021-09-06 09:07:54 -06:00
bernhardkaindl
b5fa64fb10 libunistring: apply upstream fix for test suite (#25691)
Simple case of pragma weak not working with --as-needed
https://bugs.gentoo.org/688464
https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=925746

Added the possibility to build from git master (new version: master).
2021-09-06 08:55:38 -06:00
Olivier Cessenat
c424b86a64 gxsview: new package, an MCNP viewer (#25637)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2021-09-06 16:17:22 +02:00
Billae
ca50c91469 mesa: add v21.2.1 (#25715) 2021-09-06 16:02:11 +02:00
bernhardkaindl
13c0b0dcb3 uftrace: add new package (#25710) 2021-09-06 15:57:11 +02:00
Harmen Stoppels
834155fdb8 ccache 4.4 errors for old gcc/clang (#25783) 2021-09-06 15:33:08 +02:00
Axel Huebl
1badb47b80 WarpX/HiPACE/openPMD-api: Use when(...) (#25789)
Use the new with `when()` syntax for variant dependencies.
2021-09-06 15:26:04 +02:00
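The `when()` context manager groups variant-conditional directives inside a package definition; a generic sketch with made-up dependency names:

```python
# Before: each directive repeats the condition.
depends_on('mpi', when='+mpi')
depends_on('hdf5 +mpi', when='+mpi')

# After: the condition is stated once for the whole block.
with when('+mpi'):
    depends_on('mpi')
    depends_on('hdf5 +mpi')
```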
QuellynSnead
1a48c0f51c elfutils: address external linkage failures in 0.185 (#25769) 2021-09-06 15:07:42 +02:00
lukebroskop
92e4db4681 Do not allow cray build system patch for later version of otf2 (#25283)
Co-authored-by: Luke Roskop <lroskop@cedar.head.cm.us.cray.com>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2021-09-06 14:38:51 +02:00
Terry Cojean
dd8dc08a90 Ginkgo: add v1.4.0 (#25606)
* Add a patch to skip unavailable smoke tests.
2021-09-06 14:11:35 +02:00
Jean-Paul Pelteret
3e4b576f83 deal.II: Bump minimum required version for Ginkgo package (#25795) 2021-09-06 14:07:11 +02:00
Michael Kuhn
526315410a gzip: add v1.11 (#25797) 2021-09-06 14:00:25 +02:00
Michael Kuhn
66526cb57a gdbm: add v1.21 (#25798) 2021-09-06 13:13:18 +02:00
Sebastian Ehlert
beff29176c fpm: new package (#25799) 2021-09-06 13:12:53 +02:00
Jordan Galby
2af6c57afa py-pyqt5: don't install files into qt5 install prefix (#25657)
By default, py-pyqt5 installs qt designer and qml plugins into qt5's
install `<prefix>/plugins`. Don't do that.
2021-09-06 12:18:13 +02:00
Peter Scheibel
accd6dd228 boost @1.76: set toolset for intel b2 invocation (#25722)
and simplify constraint and add explanatory comment
2021-09-05 13:34:10 +02:00
bernhardkaindl
0ad54e0679 New package: py-anytree (#25718)
It is required for running all tests of gtk-doc.
2021-09-03 14:14:38 -07:00
Asher Mancinelli
bbc9d7d965 HiOp package: update magma dependency (#25721) 2021-09-03 14:05:45 -07:00
bernhardkaindl
6f5ec73087 m4: fix glitch in the test suite (#25702) 2021-09-03 15:04:59 -06:00
Desmond Orton
0aedafda19 New package: getOrganelle@1.7.5.0 (#25725) 2021-09-03 14:04:24 -07:00
Jose E. Roman
ac3ccad1e2 SLEPc package: add gpu variants (#25760) 2021-09-03 14:01:42 -07:00
Cyrus Harrison
5180b0b454 apcomp package: add versions 0.0.3 and 0.0.2 (#25767) 2021-09-03 14:00:17 -07:00
pmargara
432f577a0c RELION package: add versions 3.1.3 and 3.1.2 (#25772) 2021-09-03 13:58:39 -07:00
Paul Spencer
f6060c9894 Mathematica package: add version 12.2.0 (#25791) 2021-09-03 13:41:39 -07:00
Harmen Stoppels
c6c9213766 rocblas: use AMDGPU_TARGETS instead of Tensile_ARCHITECTURE (#25778) 2021-09-03 18:19:57 +02:00
Harmen Stoppels
64407e253c Always disable leftover active environment after tests 2021-09-03 07:27:19 -07:00
Harmen Stoppels
de492e73d5 Don't error when removing scope that does not exist 2021-09-03 07:27:19 -07:00
Sreenivasa Murthy Kolam
29d344e4c7 update version for rocm-4.3.1 release (#25766) 2021-09-03 15:25:31 +02:00
bernhardkaindl
95586335f7 openssh: add v8.7 (#25774) 2021-09-03 06:22:40 -06:00
Weiqun Zhang
f56f4677cf amrex: 21.09 (#25739) 2021-09-03 03:34:56 -06:00
Desmond Orton
043f0cd014 py-ansi2html: new package (#25741) 2021-09-03 11:25:52 +02:00
Desmond Orton
9841d1f571 py-cryolobm: add new package (#25748) 2021-09-03 11:25:06 +02:00
Desmond Orton
d61439c26a py-lineenhancer: add new package (#25743) 2021-09-03 10:59:06 +02:00
Desmond Orton
0b9baf9ae3 py-terminaltables: add new package (#25738) 2021-09-03 10:58:18 +02:00
Axel Huebl
2331148f4b WarpX: 21.09 (#25771)
Add the latest WarpX release.
2021-09-03 10:53:29 +02:00
Stephen Hudson
ffec74c359 libensemble: add tasmanian as optional dependency (#25762) 2021-09-03 10:30:08 +02:00
Desmond Orton
f162dd4f5b py-gooey: new package (#25763) 2021-09-03 10:24:31 +02:00
Desmond Orton
8149048a78 py-colorclass: add new package (#25747) 2021-09-03 10:17:33 +02:00
Desmond Orton
2b7a2f66b7 py-mrcfile: add new package (#25742) 2021-09-03 10:16:38 +02:00
Desmond Orton
551766e3c6 py-pathtools: add new package (#25749) 2021-09-03 09:23:01 +02:00
Olivier Cessenat
35fe188d22 silo: add build dependency on m4 (#25770) 2021-09-03 08:54:31 +02:00
jacorvar
5f4fcea79c Correct typo in r-assertive-data-uk (#25754)
`r-assertive-base` version contains a dot instead of a hyphen.
2021-09-02 18:10:56 -05:00
Tamara Dahlgren
487edcc416 mfem: Update stand-alone test to use test stage directory (#25744) 2021-09-02 14:52:56 -07:00
Gianluca Ficarelli
f71d93fc55 py-submitit: add new package (#25699) 2021-09-02 07:58:02 -05:00
Gilles Gouaillardet
2d78045cdd gromacs: add v2021.3 (#25750) 2021-09-02 12:18:05 +02:00
Vanessasaurus
8e61f54260 start of work to add spack audit packages-https checker (#25670)
This PR will add a new audit, specifically for spack package homepage urls (and eventually
other kinds I suspect) to see if there is an http address that can be changed to https.

Usage is as follows:

```bash
$ spack audit packages-https <package>
```
And in list view:

```bash
$ spack audit list
generic:
  Generic checks relying on global variables

configs:
  Sanity checks on compilers.yaml
  Sanity checks on packages.yaml

packages:
  Sanity checks on specs used in directives

packages-https:
  Sanity checks on https checks of package urls, etc.
```

I think it would be unwise to include this with the packages audit, because when run for all packages it takes a long time (we issue network requests). I also like the idea of more narrowly scoped checks - likely there will be other http/https addresses within a package that we eventually check. For now, there are two error cases - one is when an https url is tried but there is some SSL error (or other error that means we cannot update to https):

```bash
$ spack audit packages-https zoltan
PKG-HTTPS-DIRECTIVES: 1 issue found
1. Error with attempting https for "zoltan": 
    <urlopen error [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: Hostname mismatch, certificate is not valid for 'www.cs.sandia.gov'. (_ssl.c:1125)>
```
This is either not fixable, or could be fixed by changing the url or (better) by contacting the site owners to ask about the certificate or similar.

The second case is when there is an http that needs to be https, which is a huge issue now, but hopefully not after this spack PR.

```bash
$ spack audit packages-https xman
Package "xman" uses http but has a valid https endpoint.
```

And then when a package is fixed:

```bash
$ spack audit packages-https zlib
PKG-HTTPS-DIRECTIVES: 0 issues found.
```
And that's mostly it. :)

Signed-off-by: vsoch <vsoch@users.noreply.github.com>

Co-authored-by: vsoch <vsoch@users.noreply.github.com>
2021-09-02 08:46:27 +02:00
Cody Balos
8a7af82a82 py-adios: add new version (#25746) 2021-09-01 23:19:45 -07:00
Harmen Stoppels
0c61b31922 Bump nlohman-json and fix cmake args (#25504)
* Bump nlohman-json and fix cmake args

* Rename variant
2021-09-01 18:28:11 -07:00
David Beckingsale
156edffec2 Update versions for RAJA, CHAI, Umpire and camp (#25528) 2021-09-01 17:58:47 -07:00
eugeneswalker
6d484a055a py-jupyterhub: add version: 1.4.1 (#24890)
* py-jupyterhub: add version: 1.4.1

* don't need mako for latest release

* sort dependencies

* notebook isn't used for 1.4.1+

* add dependency on py-jupyter-telemetry; create new package py-jupyter-telemetry

* py-jupyter-telemetry: declare missing dependencies

* py-jupyterhub: need more specific depends_on before less specific

* add py-json-logger; py-jupyter-telemetry: add depends_on for py-json-logger

* Update var/spack/repos/builtin/packages/py-jupyter-telemetry/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* remove py-json-logger, which was erroneously and duplicatively added

* Update var/spack/repos/builtin/packages/py-jupyterhub/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* need py-alembic@1.4: for newest py-jupyterhub

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-09-01 22:58:24 +00:00
Harmen Stoppels
03331de0f2 bugfix: R packages can be concretized together with clingo again (#25730) 2021-09-01 14:12:55 -07:00
David Beckingsale
aabece46ba Add variant to allow unsupported compiler & CUDA combinations (#19736)
Sometimes users need to be able to override the conflicts in `CudaPackage`. This introduces a variant to enable/disable them.
2021-09-01 11:34:20 -07:00
Harmen Stoppels
4c23059017 Speed-up two unit tests by using builtin.mock instead of builtin (#25544) 2021-09-01 11:58:29 +02:00
Nisarg Patel
ab37ac95bf Adding new versions of redis (#25714) 2021-09-01 11:39:39 +02:00
Harmen Stoppels
2411a9599e nghttp2: add v1.44.0, fix build (#25605) 2021-09-01 11:33:31 +02:00
Peter Scheibel
b0ee7deaa7 allow building silo+mpi ^hdf5~mpi (#25724) 2021-09-01 09:43:57 +02:00
Richarda Butler
7adacf967d Legion: Add E4S testsuite stand alone test (#25285)
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2021-08-31 16:57:55 -07:00
Richarda Butler
2633cf7da6 Papyrus: Add E4S testsuite stand alone test (#25324) 2021-08-31 15:56:20 -07:00
Hadrien G
3c6050d3a2 acts: add v12.0.0, v12.0.1 (#25697) 2021-08-31 16:44:01 +02:00
Scott Wittenburg
b34f289796 Pipelines: disable power builds (#25704) 2021-08-30 17:19:42 -07:00
Jen Herting
beb3524392 [py-plotly] added version 5.2.2 (#25668)
* [py-plotly] added version 5.2.2

* [py-plotly] flake8

* [py-plotly] requests not required for new version
2021-08-30 16:15:49 -05:00
Adam J. Stewart
8ee5bf6d03 py-pythran: update checksum for patch (#25693)
* Replace URL patch with file patch
* Add comment explaining patch origin and purpose
2021-08-30 10:02:39 -07:00
psakievich
a018f48df9 Updates to Exawind packages (#25568) 2021-08-30 09:22:55 -06:00
lorddavidiii
e51463f587 asciidoc-py3: add v9.1.0, v9.0.5 and v9.0.4 (#25562) 2021-08-30 08:32:03 -06:00
Piotr Luszczek
378543b554 plasma: add version 21.8.29, migrate url (#25688) 2021-08-30 07:46:46 -06:00
Valentin Volkl
3cd224afbf gaudi: update py-xenv dependency (#25457) 2021-08-30 07:13:41 -06:00
Mikael Simberg
44c0089be4 Add patch to fix Boost with CCE and CUDA (#25534) 2021-08-30 05:19:54 -06:00
Bryan Herman
65584a3b92 universal-ctags: add version p5.9.20210829.0 (#25361) 2021-08-30 11:51:47 +02:00
Phil Carns
ab657d7b53 mochi-margo: add v0.9.5 (#25365) 2021-08-30 11:42:36 +02:00
Ryan Marcellino
d3d0ee7328 miniconda3: add v4.10.3 (#25442)
Co-authored-by: Cloud User <marcryan@ryanmarcelli001.hzterscemazurawp3xenxzahla.bx.internal.cloudapp.net>
2021-08-30 11:39:27 +02:00
Ryan Marcellino
ed17c3638b anaconda3: add v2021.05 (#25443)
Co-authored-by: Cloud User <marcryan@ryanmarcelli001.hzterscemazurawp3xenxzahla.bx.internal.cloudapp.net>
2021-08-30 11:38:23 +02:00
Valentin Volkl
2f777d08a2 rivet: fixes for gcc@10: (#25454) 2021-08-30 11:37:39 +02:00
kwryankrattiger
9be81ac4d9 Add documentation on compiler environment (#25508) 2021-08-30 11:17:03 +02:00
lorddavidiii
e60e41d9ca ocl-icd: add v2.3.1 (#25561) 2021-08-30 11:11:16 +02:00
dependabot[bot]
a2293e6ee1 build(deps): bump codecov/codecov-action from 2.0.2 to 2.0.3 (#25594)
Bumps [codecov/codecov-action](https://github.com/codecov/codecov-action) from 2.0.2 to 2.0.3.
- [Release notes](https://github.com/codecov/codecov-action/releases)
- [Changelog](https://github.com/codecov/codecov-action/blob/master/CHANGELOG.md)
- [Commits](https://github.com/codecov/codecov-action/compare/v2.0.2...v2.0.3)

---
updated-dependencies:
- dependency-name: codecov/codecov-action
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2021-08-30 10:58:12 +02:00
Mikael Simberg
d381ab77b2 boost: add v1.77.0 (#25536) 2021-08-30 10:19:49 +02:00
Adam J. Stewart
1bf051e229 py-pbr: fix import tests (#25679) 2021-08-30 09:11:44 +02:00
Adam J. Stewart
506f62ddfe py-jupyterlab: fix import tests (#25680) 2021-08-30 09:11:30 +02:00
Adam J. Stewart
a6a448b16c py-metpy: fix import tests (#25681) 2021-08-30 09:11:14 +02:00
Adam J. Stewart
0dfa49af8e py-flit-core: build from source (#25682) 2021-08-30 09:09:08 +02:00
Barry Rountree
b3128af901 tmux: added additional versions. (#25684)
Added sha256 checksums for 3.21, 3.2, 3.1c, 3.1a, 3.1, 3.0 and 2.9a.

Co-authored-by: Barry <rountree4@llnl.gov>
2021-08-30 09:07:43 +02:00
Joe Schoonover
5029b8ca55 FLAP: add new package (#25685)
Co-authored-by: Joe Schoonover <joe@fluidnumerics.com>
2021-08-30 09:04:05 +02:00
Hadrien G
0d226aa710 acts: add v11.0.0, v10.0.0 (#25444) 2021-08-30 00:58:39 -06:00
Sebastian Schmitt
07a9cb87ef Bump py-salib (#25403) 2021-08-29 08:19:12 -05:00
Kai Germaschewski
7cafe7dd66 add 'develop' branch to cmake package (#25623)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2021-08-28 18:42:55 +02:00
Massimiliano Culpo
40788cf49a Add a __reduce__ method to Spec (#25658)
* Add a __reduce__ method to Spec

fixes #23892

The recursion limit seems to be due to the default
way in which a Spec is serialized, following all
the attributes. It's still not clear to me why this
is related to being in an environment, but in any
case we already have methods to serialize Specs to
disk in JSON and YAML format. Here we use them to
pickle a Spec instance too.

* Downgrade to build-hash

Hopefully nothing will change the package in
between serializing the spec and sending it
to the child process.

* Add support for Python 2
2021-08-28 07:31:16 -07:00
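The idea in the commit above - route pickling through an existing dict serialization instead of the default attribute walk - can be sketched roughly as follows. This is a minimal illustration with hypothetical names (`Node`, `to_dict`, `node_from_dict`), not Spack's actual Spec API:

```python
# Minimal sketch: pickle an object via its own dict form to avoid the deep
# attribute recursion of the default protocol on self-referential graphs.
import pickle


def node_from_dict(data):
    return Node(data["name"], [node_from_dict(d) for d in data["deps"]])


class Node(object):
    def __init__(self, name, deps=None):
        self.name = name
        self.deps = deps or []

    def to_dict(self):
        return {"name": self.name, "deps": [d.to_dict() for d in self.deps]}

    def __reduce__(self):
        # Rebuild from the dict form instead of letting pickle walk attributes.
        return (node_from_dict, (self.to_dict(),))


root = Node("root", [Node("dep")])
assert pickle.loads(pickle.dumps(root)).name == "root"
```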
Chen Wang
025dbb2162 recorder: add v2.2.1, v2.2.0, "master" and "pilgrim" (#25674) 2021-08-28 10:17:35 +02:00
Harmen Stoppels
e2b9ba3001 Add zstd support to libarchive (#25659) 2021-08-27 17:28:46 -07:00
Adam J. Stewart
3a4073cfff py-flit: add maintainer (#25667) 2021-08-27 17:26:41 -07:00
Harmen Stoppels
9577d890c4 add py-flameprof (#25539) 2021-08-27 17:22:35 -06:00
Harmen Stoppels
f5ab3ad82a Fix: --overwrite backs up old install dir, but it gets destroyed anyways (#25583)
* Make sure PackageInstaller does not remove the just-restored
  install dir after failure in spack install --overwrite
* Remove cryptic error message and rethrow actual error
2021-08-27 12:41:24 -07:00
Tamara Dahlgren
b5d3c48824 Load package environment prior to stand-alone/smoke test execution (#25619) 2021-08-27 18:46:26 +00:00
Harmen Stoppels
9d17d474ff Add missing link dep for py-uwsgi (#25654) 2021-08-27 13:44:21 -05:00
Adam J. Stewart
50411f8394 py-psycopg2: add version 2.9.1 (#25646) 2021-08-27 11:00:47 -07:00
Adam J. Stewart
32210b0658 py-pathlib: prevent conflicts with standard library (#25631) 2021-08-27 10:58:20 -07:00
Adam J. Stewart
98e6e4a3a5 New package: py-flit (#25630) 2021-08-27 10:57:07 -07:00
Chris Richardson
a7c6224b3a FEniCSx packages: add version 0.3.0 (#25627) 2021-08-27 10:55:01 -07:00
Adam J. Stewart
9d95125d6a py-pyinstrument: add version 4.0.3 (#25632)
* Swap github download link for pypi
* Versions >= 4 don't need npm
2021-08-27 10:51:48 -07:00
Dylan Simon
ed07fa4c37 r-irkernel: add version 1.2; update version 0.7 (#25644)
Use a commit hash instead of a tag for 0.7
2021-08-27 10:44:10 -07:00
Jordan Galby
2d97d877e4 figlet: Fix figlet font dir (#25662)
By default, figlet looks for fonts in `/usr/local/share/figlet`, and if
it doesn't exist you get `figlet: standard: Unable to open font file`.

This fix changes the default font dir to the one installed in the
install prefix.
2021-08-27 10:42:12 -07:00
Timothy Brown
7fd4dee962 ESMF package: add version 8.1.1 (#25590)
Also build with internal lapack library by default.
2021-08-27 10:26:06 -07:00
Michael Kuhn
4f3a538519 meson: add 0.59.1 and 0.58.2 (#25661) 2021-08-27 06:35:00 -06:00
Michael Kuhn
5c1710f7dc glib: add 2.68.4 (#25660) 2021-08-27 06:22:42 -06:00
Jordan Galby
97ea57e59f Add ld.gold and ld.lld compiler wrapper (#25626)
The gcc compiler can be configured to use `ld.gold` by default. It will
then call `ld.gold` explicitly when linking. In that case, Spack needs to have
an `ld.gold` wrapper in PATH to inject rpath link flags, etc.

Also, I wouldn't be surprised to see some package calling `ld.gold`
directly.

As with `ld.gold`, the argument can be made that we want to support any
package that might call `ld.lld`.
2021-08-27 13:16:26 +02:00
Massimiliano Culpo
c152e558e9 Make SpecBuildInterface pickleable (#25628)
* Add a __reduce__ method to SpecBuildInterface

This class was confusing pickle when being serialized,
due to its nature of being an object that disguises itself
as another type.

* Add more MacOS tests, switch them to clingo

* Fix condition syntax

* Remove Python v3.6 and v3.9 with macOS
2021-08-27 09:10:03 +00:00
Harmen Stoppels
12e87ebf14 Fix fish test "framework" (#25242)
Remove broken test, see #21699
2021-08-27 10:52:00 +02:00
Vanessasaurus
1113705080 adding remainder of issues from repology problems (#25653)
Some of these are not resolvable: either only an http page is
available, a page reported as broken is actually ok, or a page has
an SSL error that does not prevent visiting it (and there is no good replacement)

Signed-off-by: vsoch <vsoch@users.noreply.github.com>

Co-authored-by: vsoch <vsoch@users.noreply.github.com>
2021-08-27 01:16:46 -06:00
Vasileios Karakasis
c3dabf05f4 Add ReFrame 3.8.0 (#25648) 2021-08-27 00:54:59 +00:00
Patrick Gartung
d0d6b29c9e Incorporate intel-tbb-oneapi package into intel-tbb package (#25613)
* Add intel-tbb-oneapi package that does the cmake configure and build.
Compare to the intel-oneapi-tbb package, which only downloads a script that contains prebuilt binaries.

* Rename package intel-tbb-cmake

* Incorporate intel-tbb-cmake into intel-tbb package
2021-08-26 17:06:27 -07:00
Harmen Stoppels
74389472ab Make env (de)activate error with -e, -E, -D flags (#25625)
* Make sure that spack -e. env activate b and spack -e. env deactivate error
* Add a test
2021-08-26 16:54:58 -07:00
Adam J. Stewart
f5d4f5bdac py-black: add new version, missing dep (#25629) 2021-08-26 16:53:29 -07:00
Tamara Dahlgren
1d4e00a9ff activeharmony: Switch to https (#25641) 2021-08-26 16:51:57 -07:00
Tamara Dahlgren
8530ea88a3 acl: Switch to https (#25640) 2021-08-26 16:51:26 -07:00
Tamara Dahlgren
e57780d7f0 ACE: Switch to https (#25638) 2021-08-26 16:50:51 -07:00
Tamara Dahlgren
e39c9a7656 adlbx: switch url to https (#25642) 2021-08-26 16:50:17 -07:00
Morten Kristensen
bdb02ed535 py-vermin: add latest version 1.2.2 (#25643) 2021-08-26 22:30:09 +00:00
Erik Schnetter
b5f812cd32 New package: reprimand (#25364) 2021-08-26 15:03:49 -07:00
Paul Kuberry
abfd8fa70b Conditionally remove 'context' from kwargs in _urlopen (#25316)
* Conditionally remove 'context' from kwargs in _urlopen

Previously, 'context' was purged from kwargs in _urlopen to
conform to varying support for 'context' in different versions
of urllib. This fix tries to use 'context' first, and if an
exception is thrown, removes it and tries again.

* Specify error type in try statement in _urlopen

Specify TypeError when checking if 'context' is in kwargs
for _urlopen. Also, if try fails, check that 'context' is
in the error message before removing from kwargs.
2021-08-26 13:51:08 -07:00
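Roughly, the retry pattern described above looks like this. It is a hedged sketch only; Spack's real `_urlopen` wrapper differs in detail:

```python
# Hedged sketch of "try with an SSL context, retry without it": useful when
# the underlying urlopen may or may not accept the 'context' keyword.
import ssl
import urllib.request


def open_url(url, **kwargs):
    kwargs.setdefault("context", ssl.create_default_context())
    try:
        return urllib.request.urlopen(url, **kwargs)
    except TypeError as err:
        # Only retry if the failure really is about the 'context' kwarg.
        if "context" in str(err):
            kwargs.pop("context", None)
            return urllib.request.urlopen(url, **kwargs)
        raise
```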
Adam J. Stewart
6eb942cf45 Speedup environment activation, part 2 (#25633)
This is a direct followup to #13557 which caches additional attributes that were added in #24095 that are expensive to compute. I had to reopen #25556 in another PR to invalidate the GitLab CI cache, but see #25556 for prior discussion.

### Before

```console
$ time spack env activate .

real	2m13.037s
user	1m25.584s
sys	0m43.654s
$ time spack env view regenerate
==> Updating view at /Users/Adam/.spack/.spack-env/view

real	16m3.541s
user	10m28.892s
sys	4m57.816s
$ time spack env deactivate

real	2m30.974s
user	1m38.090s
sys	0m49.781s
```

### After
```console
$ time spack env activate .

real	0m8.937s
user	0m7.323s
sys	0m1.074s
$ time spack env view regenerate
==> Updating view at /Users/Adam/.spack/.spack-env/view

real	2m22.024s
user	1m44.739s
sys	0m30.717s
$ time spack env deactivate

real	0m10.398s
user	0m8.414s
sys	0m1.630s
```

Fixes #25555
Fixes #25541 

* Speedup environment activation, part 2
* Only query distutils a single time
* Fix KeyError bug
* Make vermin happy
* Manual memoize
* Add comment on cross-compiling
* Use platform-specific include directory
* Fix multiple bugs
* Fix python_inc discrepancy
* Fix import tests
2021-08-26 20:44:31 +00:00
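The "manual memoize" mentioned above boils down to caching an expensive, rarely-changing value the first time it is computed. A rough illustration with hypothetical names, not the actual Spack code:

```python
# Hedged sketch: cache an expensive lookup (e.g. querying an external Python
# interpreter) on the instance so repeated attribute access is cheap.
class ExternalInterpreter(object):
    def __init__(self, exe):
        self.exe = exe
        self._include_dir = None

    @property
    def include_dir(self):
        if self._include_dir is None:      # computed at most once per instance
            self._include_dir = self._query_include_dir()
        return self._include_dir

    def _query_include_dir(self):
        # Stand-in for the expensive subprocess call to `self.exe`.
        return "/usr/include/python3.9"
```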
Enrico Usai
9dab298f0d aws-parallelcluster: add v2.11.2 (#25635)
Signed-off-by: Rex <shuningc@amazon.com>
2021-08-26 15:35:01 -05:00
Vanessasaurus
6a26322eb3 fixing "problems in speck" as identified by repology (#25491)
Most of these are perl packages that need to point to the meta docs site,
plus a fair number of http addresses that need to be https, and then
the rest are usually documentation sites that no longer exist or were
otherwise changed.

Signed-off-by: vsoch <vsoch@users.noreply.github.com>

Co-authored-by: vsoch <vsoch@users.noreply.github.com>
2021-08-26 15:05:24 -05:00
Harmen Stoppels
23106ac0f5 Set pubkey trust to ultimate during gpg trust (#24976)
* Set pubkey trust to ultimate during `gpg trust`

Tries to solve the same problem as #24760 without suppressing stderr
from gpg commands.

This PR makes every imported key trusted in the gpg database.

Note: I've outlined
[here](https://github.com/spack/spack/pull/24760#issuecomment-883183175)
that gpg's trust model makes sense, since how can we trust a random
public key we download from a binary cache?

* Fix test
2021-08-26 12:59:44 -07:00
Massimiliano Culpo
29d1bc6546 Ensure environments are deactivated when bootstrapping (#25607)
Fixes #25603

This commit adds a new context manager to temporarily
deactivate active environments. This context manager
is used when setting up bootstrapping configuration to
make sure that the current environment is not affected
by operations on the bootstrap store.

* Preserve exit code 1 if nothing is found
* Use context manager for the environment
2021-08-26 12:20:05 -07:00
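A context manager of this kind can be sketched as follows. The names are illustrative only and do not match Spack's internals:

```python
# Hedged sketch: temporarily clear the active environment, restoring it on
# exit even if the wrapped block raises.
import contextlib

_active_environment = None  # stand-in for the module-level active environment


@contextlib.contextmanager
def no_active_environment():
    global _active_environment
    saved, _active_environment = _active_environment, None
    try:
        yield
    finally:
        _active_environment = saved
```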
lorddavidiii
c963bdee8b opencl-c-headers and opencl-clhpp: add new versions (#25576) 2021-08-26 10:57:14 -07:00
Adam J. Stewart
6b3518d6fd py-omegaconf: add missing dependency (#25589) 2021-08-26 10:55:42 -07:00
Glenn Johnson
6a31ca7386 opium package: add version 4.1 and update blas/lapack dependencies (#25591)
- remove unneeded dependency on blas
- create external-lapack variant
- patch makefile to not build lapack if `+external-lapack`

Also: 

- fix homepage link
- set parallel = False
- make references to `spec` consistent
- remove unneeded `build` method
2021-08-26 10:35:58 -07:00
Tamara Dahlgren
8664abc178 Remove references to self.install_test_root from packaging guide (#25238) 2021-08-26 19:22:40 +02:00
Massimiliano Culpo
a3d8e95e76 Avoid double loop in subprocess_context.store_patches (#25621)
fixes #21643

As far as I can see the double loop is not needed, since
"patch" is never used and the items in the list are tuples
of three values.
2021-08-26 09:46:01 -07:00
Massimiliano Culpo
1ab6f30fdd Remove fork_context from llnl.util.lang (#25620)
This object was introduced in #18124, and was later superseded by
#18205, which removed any use of the object.
2021-08-26 09:39:59 -07:00
Massimiliano Culpo
7dd3592eab Regression test for version selection with preferences (#25602)
This commit adds a regression test for version selection
with preferences in `packages.yaml`. Before PR 25585 we
used negative weights in a minimization to select the
optimal version. This may lead to situations where a
dependency may make the version score of dependents
"better" if it is preferred in packages.yaml.
2021-08-26 09:28:47 -07:00
Harmen Stoppels
e602c40d09 zstd package: use make instead of CMake (#25610)
zstd doesn't use -pthread for the static lib anymore, so gcc should be fine
2021-08-26 08:30:30 -07:00
Rémi Lacroix
270cbf08e3 VTK: add version 9.0.3 (#25609) 2021-08-26 07:29:41 -07:00
Vanessasaurus
4ddc0ff218 fixing bugs with new package updates for samtools and abi-dum (#25615)
samtools needed more constraints for htslib, and abi-dumper was missing pkg-config
for its universal-ctags dependency

Signed-off-by: vsoch <vsoch@users.noreply.github.com>

Co-authored-by: vsoch <vsoch@users.noreply.github.com>
2021-08-26 07:39:46 +02:00
Harmen Stoppels
73005166ef Bugfix: reinstalling updated develop specs (#25579)
PackageInstaller and Package.installed disagree over what it means
for a package to be installed: PackageInstaller believes it should be
enough for a database entry to exist, whereas Package.installed
requires a database entry & a prefix directory.

This leads to the following niche issue:

* a develop spec in an environment is successfully installed
* then somehow its install prefix is removed (e.g. through a bug fixed
  in #25583)
* you modify the sources and reinstall the environment
  1. spack checks pkg.installed and realizes the develop spec is NOT
     installed, therefore it doesn't need to have 'overwrite: true'
  2. the installer gets the build task and checks the database and
      realizes the spec IS installed, hence it doesn't have to install it.
  3. the develop spec is not rebuilt.

The solution is to make PackageInstaller and pkg.installed agree over
what it means to be installed, and this PR does that by dropping the
prefix directory check from pkg.installed, so that it only checks the
database.

As a result, spack will create a build task with overwrite: true for
the develop spec, and the installer in fact handles overwrite requests
fine even if the install prefix doesn't exist (it just does a normal
install).
2021-08-25 18:14:11 -07:00
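In rough terms, the agreement described above amounts to letting the database alone decide whether a spec is installed. A hedged sketch, with `db.get` as an illustrative name rather than Spack's exact Database API:

```python
# Hedged sketch: "installed" is decided by the database record alone, so a
# missing prefix surfaces as an overwrite install instead of a skipped rebuild.
def installed(db, spec, prefix):
    # Before the fix: a database record AND an existing prefix directory,
    #   e.g. db.get(spec) is not None and os.path.isdir(prefix)
    # After the fix: the database alone is authoritative.
    return db.get(spec) is not None
```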
Xiao-Yong
204b49fc1f py-grpcio: set parallel build jobs (#25616)
By default the number of parallel compiler processes launched by
py-grpcio equals the number of threads. This commit limits it to
Spack's configured build_jobs.
2021-08-26 01:08:05 +00:00
Olivier Cessenat
de3c0e62d0 New Package: perl-fth (#21879) 2021-08-25 19:16:13 -05:00
Kelly (KT) Thompson
e4e4bf75ca [pkg][new version] Provide eospac@6.5.0beta (#25614)
* Provide new version of eospac.

+ Provide version 6.5.0beta.
+ Make version 6.4.2 the default

+ Also increment

* volunteer to be the maintainer (for now).
2021-08-25 16:35:49 -06:00
Massimiliano Culpo
af2f07852c environment: match concrete specs only if they have the same build hash (#25575)
see #25563

When we have a concrete environment and we ask to install a
concrete spec from a file, currently Spack returns a list of
specs that are all the one that match the argument DAG hash.

Instead we want to compare build hashes, which also account
for build-only dependencies.
2021-08-25 10:56:16 -07:00
Matthieu Dorier
e2e7b0788f [py-meshio] Added py-meshio package (#25578)
* Added py-meshio package

* Added setuptools dependency to py-meshio package

* Update var/spack/repos/builtin/packages/py-meshio/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-meshio/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-meshio/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-meshio/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-meshio/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* added missing py-importlib-metadata dependency in py-meshio

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-08-25 11:19:40 -06:00
Todd Gamblin
d27e0bff5a installation: filter padding from binary installs, too (#25600)
#25303 filtered padding from build output, but it's still there in binary install/relocate output,
so our CI logs are still quite long and frequently hit the limit.

- [x] add context handler from #25303 to buildcache installation as well
2021-08-25 17:39:00 +02:00
Todd Gamblin
fafe1cb7e8 Make spack graph -i environment-aware (#25599)
This allows you to run `spack graph --installed` from within an environment and get a dot graph of
its concrete specs.

- [x] make `spack graph -i` environment-aware

- [x] add code to the generated dot graph to ensure roots have min rank (i.e., they're all at the
      top or left of the DAG)
2021-08-25 07:41:04 -07:00
arjun-raj-kuppala
3e2f890467 Bump up version for rocm-debug-agent - ROCm 4.3.0 (#25588) 2021-08-25 15:03:07 +02:00
Harmen Stoppels
6fab0e1b9c libarchive: add v3.5.2, add maintainer, reworked recipe (#25604)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2021-08-25 05:16:45 -06:00
Harmen Stoppels
f4c9161f84 ca-certificates-mozilla: add v2021-07-05 (#25565) 2021-08-25 12:58:08 +02:00
Harmen Stoppels
fd095a3660 libuv: add v1.42.0, v1.41.1 (#25560) 2021-08-25 12:55:05 +02:00
lukebroskop
f6a9ef5ef5 Update the cray-mpich package to use the new cray-mpich MPI wrappers (#25597)
As of cray-mpich version 8.1.7, conventional MPI compiler wrappers are included in cray-mpich.

Co-authored-by: Luke Roskop <lroskop@cedar.head.cm.us.cray.com>
2021-08-24 23:50:02 -07:00
Todd Gamblin
df10e88e97 bootstrap: use sys.exec_prefix to set up external python correctly (#25593)
Bootstrapping clingo on macOS on `develop` gives errors like this:

```
==> Error: RuntimeError: Unable to locate python command in /System/Library/Frameworks/Python.framework/Versions/2.7/Resources/Python.app/Contents/bin

/Users/gamblin2/Workspace/spack/var/spack/repos/builtin/packages/python/package.py:662, in command:
        659                return Executable(path)
        660        else:
        661            msg = 'Unable to locate {0} command in {1}'
  >>    662            raise RuntimeError(msg.format(self.name, self.prefix.bin))
```

On macOS, `python` is laid out differently. In particular, `sys.executable` is here:

```console
Python 2.7.16 (default, May  8 2021, 11:48:02)
[GCC Apple LLVM 12.0.5 (clang-1205.0.19.59.6) [+internal-os, ptrauth-isa=deploy on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> import sys
>>> sys.executable
'/System/Library/Frameworks/Python.framework/Versions/2.7/Resources/Python.app/Contents/MacOS/Python'
```

Based on that, you'd think that
`/System/Library/Frameworks/Python.framework/Versions/2.7/Resources/Python.app/Contents` would be
where you'd look for a `bin` directory, but you (and Spack) would be wrong:

```console
$ ls /System/Library/Frameworks/Python.framework/Versions/2.7/Resources/Python.app/Contents/
Info.plist      MacOS/          PkgInfo         Resources/      _CodeSignature/ version.plist
```

You need to look in `sys.exec_prefix`

```
>>> sys.exec_prefix
'/System/Library/Frameworks/Python.framework/Versions/2.7'
```

Which looks much more like a standard prefix, with understandable `bin`, `lib`, and `include`
directories:

```console
$ ls /System/Library/Frameworks/Python.framework/Versions/2.7
Extras/         Mac/            Resources/      bin/            lib/
Headers@        Python*         _CodeSignature/ include/
$ ls -l /System/Library/Frameworks/Python.framework/Versions/2.7/bin/python
lrwxr-xr-x  1 root  wheel     7B Jan  1  2020 /System/Library/Frameworks/Python.framework/Versions/2.7/bin/python@ -> python2
```

- [x] change `bootstrap.py` to use the `sys.exec_prefix` as the external prefix, instead of just
      getting the parent directory of the executable.
2021-08-24 21:44:26 -07:00
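The fix amounts to deriving the external prefix from `sys.exec_prefix` rather than from the executable's location. Roughly, as a sketch and not the exact `bootstrap.py` code:

```python
# Hedged sketch: prefer sys.exec_prefix over "two directories up from the
# executable", which breaks on macOS framework builds of Python.
import os
import sys

# Old guess: /.../Python.app/Contents on macOS, which has no bin/ or lib/.
guessed_prefix = os.path.dirname(os.path.dirname(sys.executable))

# New behavior: a conventional prefix with bin/, lib/ and include/ underneath.
external_prefix = sys.exec_prefix
```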
Tamara Dahlgren
99076660d4 bugfix: Correct source of PID for -ddd installation outputs (#25596) 2021-08-25 00:21:48 +00:00
Todd Gamblin
80713e234c bootstrap: fix printing for python 2 (#25592) 2021-08-24 21:31:30 +00:00
Todd Gamblin
1374fea5d9 locks: only open lockfiles once instead of for every lock held (#24794)
This adds lockfile tracking to Spack's lock mechanism, so that we ensure that there
is only one open file descriptor per inode.

The `fcntl` locks that Spack uses are associated with an inode and a process.
This is convenient, because if a process exits, it releases its locks.
Unfortunately, this also means that if you close a file, *all* locks associated
with that file's inode are released, regardless of whether the process has any
other open file descriptors on it.

Because of this, we need to track open lock files so that we only close them when
a process no longer needs them.  We do this by tracking each lockfile by its
inode and process id.  This has several nice properties:

1. Tracking by pid ensures that, if we fork, we don't inadvertently track the parent
   process's lockfiles. `fcntl` locks are not inherited across forks, so we'll
   just track new lockfiles in the child.
2. Tracking by inode ensures that references are counted per inode, and that we don't
   inadvertently close a file whose inode still has open locks.
3. Tracking by both pid and inode ensures that we only open lockfiles the minimum
   number of times necessary for the locks we have.

Note: as mentioned elsewhere, these locks aren't thread safe -- they're designed to
work in Python and assume the GIL.

Tasks:
- [x] Introduce an `OpenFileTracker` class to track open file descriptors by inode.
- [x] Reference-count open file descriptors and only close them if they're no longer
      needed (this avoids inadvertently releasing locks that should not be released).
2021-08-24 14:08:34 -07:00
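The reference-counting idea can be sketched roughly like this. It is illustrative only; the real `OpenFileTracker` does more bookkeeping:

```python
# Hedged sketch: keep one open file object per (pid, inode) and only close it
# when the last lock holder releases it, so closing never drops fcntl locks
# that are still needed.
import os


class OpenFileTracker(object):
    def __init__(self):
        self._files = {}  # (pid, inode) -> [file object, reference count]

    def get_fh(self, path):
        key = (os.getpid(), os.stat(path).st_ino)
        entry = self._files.get(key)
        if entry is None:
            entry = [open(path, "r+"), 0]
            self._files[key] = entry
        entry[1] += 1
        return entry[0]

    def release_fh(self, path):
        key = (os.getpid(), os.stat(path).st_ino)
        entry = self._files[key]
        entry[1] -= 1
        if entry[1] == 0:           # last reference: safe to close the inode
            entry[0].close()
            del self._files[key]
```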
Andrew W Elble
7274d8bca2 openssl: new version 1.1.1l (#25586)
security update
2021-08-24 11:31:48 -06:00
Harmen Stoppels
73208f5835 Fix bindist network issues (#25587)
* Fix bindist network issues

* Another one using the network
2021-08-24 12:09:23 -05:00
Scott McMillan
107693fbd1 m4: fixes for the NVIDIA HPC SDK (#25546)
Co-authored-by: Scott McMillan <smcmillan@nvidia.com>
2021-08-24 17:07:03 +00:00
Massimiliano Culpo
31dcdf7262 ASP-based solver: rework version facts (#25585)
This commit rework version facts so that:
1. All the information on versions is collected
   before emitting the facts
2. The same kind of atom is emitted for versions
   stemming from different origins (package.py
   vs. packages.yaml)

In the end all the possible versions for a given
package are totally ordered and they are given
different and increasing weights starting from zero.

This refactor allow us to avoid using negative
weights, which in some configurations may make
parent node score "better" and lead to unexpected
"optimal" results.
2021-08-24 09:24:18 -07:00
Christoph Conrads
b2968c817f Melissa: add v0.7.1, deprecate v0.7.0 (#25584) 2021-08-24 09:46:45 -06:00
corentin-dev
213ec6df5f petsc: added variants and dips (#24725)
Add HPDDM, MMG, ParMMG and Tetgen to PETSc.

Add mmg version 5.5.2 (compatible with PETSc).
Add parmmg, depending on mmg.
Add pic variant to tetgen for PETSc.
2021-08-24 07:43:35 -06:00
Vanessasaurus
5823a9b302 fixing small bug that a line of spack monitor commands are still produced (#25366)
Signed-off-by: vsoch <vsoch@users.noreply.github.com>

Co-authored-by: vsoch <vsoch@users.noreply.github.com>
2021-08-24 05:48:16 +00:00
Paul Spencer
01cbf3b81c Spelling fixes (#25570) 2021-08-23 21:29:20 +02:00
Timothy Brown
6e68792ded Adding a heap of NOAA packages for UFS. (#25542)
* Adding a heap of NOAA packages for UFS.

Adding the Unified Forecast System (UFS) and all of the packages
it depends on.

* Fixing style tests.

* Removing the package CMAKE_BUILD_TYPE override.

* Removing compiler specs from `cmake_args()`.
2021-08-23 10:55:36 -06:00
Harmen Stoppels
2971a630b8 re2c: add versions up to v2.2 (#25500) 2021-08-23 02:28:31 -06:00
Adam J. Stewart
bf7ce7e4e9 curl: add tls multi-valued variant, fix macOS build (#25553) 2021-08-23 09:38:06 +02:00
Adam J. Stewart
c5c809ee3e py-numpy: add v1.21.2 (#25436) 2021-08-23 09:26:51 +02:00
Adam J. Stewart
9a8d7ea3cb py-ipykernel: add v6.2.0 and v5.5.5 (#25520) 2021-08-23 09:26:32 +02:00
Adam J. Stewart
a68701c636 py-pythran: add OpenMP dependency (#25137) 2021-08-22 09:02:21 +02:00
Adam J. Stewart
1212847eee Document how to handle changing build systems (#25174) 2021-08-21 11:05:42 -07:00
Cyrus Harrison
768ea7e8f7 ascent: a few small changes to the package (#25551)
- provides the site packages fix
- excludes the hdf5 linking changes (which are fixed in conduit@develop's build system)
- relaxes constraints to allow building static ascent against shared python
2021-08-21 18:01:27 +02:00
Adam J. Stewart
9b66138054 py-ipython: add v7.26.0 (#25521) 2021-08-21 04:19:38 -06:00
Sergey Kosukhin
cf7e40f03c claw: add v2.0.3 (#25459) 2021-08-21 04:10:41 -06:00
Olli Lupton
8c25b17d8e ccache: add v4.4. (#25540) 2021-08-21 11:02:49 +02:00
Asher Mancinelli
a7a37e4de6 hiop: add v0.4.6, v0.4.5, added maintainers (#25548) 2021-08-21 11:00:14 +02:00
Satish Balay
37a1885deb py-libensemble: add maintainer (#25543) 2021-08-20 11:49:56 -06:00
Gilles Grospellier
81e4155eaf mono: add v6.12.0.122, add maintainer (#25538) 2021-08-20 11:25:41 -06:00
Oliver Perks
7d666fc220 rsbench: Version bump and added compiler support (#25464)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2021-08-20 09:19:56 -06:00
Harmen Stoppels
c4e50c9efb pkgconf: add v1.8.0 (#25480) 2021-08-20 08:22:44 -06:00
Paul Spencer
caed90fcf2 Fix spelling mistake in the word "monitor" (#25486) 2021-08-20 13:52:02 +00:00
Scott Wittenburg
84100afc91 Pipelines: use shared pr mirror for pipeline generation and builds (#25529)
Once PR binary graduation is deployed, the shared PR mirror will
contain binaries just built by a merged PR, before the subsequent
develop pipeline has had time to finish.  Using the shared PR mirror
as a source of binaries will reduce the number of times we have to
rebuild the same full hash.
2021-08-20 07:45:23 -06:00
Adam J. Stewart
37e4d32d53 py-protobuf: fix checksums (#25469) 2021-08-20 15:03:33 +02:00
Sergey Kosukhin
c0bb2b9943 hdf5: support for Cray Fortran Compiler (#25466) 2021-08-20 15:00:14 +02:00
Adam J. Stewart
b1755c4fb3 py-torch: add v1.8.2 (#25471) 2021-08-20 14:56:58 +02:00
Christoph Conrads
2474b91078 file: make package discoverable (#25479) 2021-08-20 14:54:43 +02:00
Valentin Volkl
dcd19e7982 cupla: add new package (#25505) 2021-08-20 14:43:43 +02:00
Christoph Conrads
8be614729c sed: make package discoverable (#25481) 2021-08-20 14:31:31 +02:00
Tamara Dahlgren
201f5bdfe8 WarpX: Actually skip tests when cannot be run (#25494) 2021-08-20 14:27:12 +02:00
Keita Iwabuchi
ef32ff0e4c metall: add v0.16 and v0.14 (#25493) 2021-08-20 14:26:34 +02:00
Wouter Deconinck
2754f2f506 podio:add v0.13.1 (#25492)
This allows for fixed width integers in the fields, and other changes as in changelog https://github.com/AIDASoft/podio/releases/tag/v00-13-01.
2021-08-20 14:26:03 +02:00
Dylan Simon
89f442392e gtkplus: needs libepoxy+glx (#25488)
build fails with ~glx
2021-08-20 14:25:32 +02:00
Harmen Stoppels
9ead83caa7 curl: add v7.78 (#25496) 2021-08-20 14:23:06 +02:00
Harmen Stoppels
737f09f2b0 libxml2: add v2.9.12 (#25497) 2021-08-20 14:22:21 +02:00
Harmen Stoppels
0c25015fdc mbedtls: add v3.0.0 (#25498) 2021-08-20 14:19:55 +02:00
Harmen Stoppels
e6d1485c28 openssh: add v8.6p1 (#25499) 2021-08-20 14:19:32 +02:00
Harmen Stoppels
e9e0cd0728 rsync: add v3.2.3 (#25501) 2021-08-20 14:14:29 +02:00
Harmen Stoppels
09b52c4f04 util-linux: add v2.37.2, v2.37.1 and v2.37 (#25503) 2021-08-20 14:12:30 +02:00
Christoph Conrads
eb5061d54c mercurial: complete spec of Python dependency (#25506)
The list of required Python standard library components can be found in
the Mercurial wiki, "Supported Python Versions":
  https://www.mercurial-scm.org/wiki/SupportedPythonVersions
2021-08-20 14:07:44 +02:00
Christoph Conrads
b124fbb0c8 melissa: new package (#25511) 2021-08-20 14:05:21 +02:00
kwryankrattiger
9b239392b1 Add MPI variant and fix python3 variant for cinema (#25513) 2021-08-20 14:03:37 +02:00
Adam J. Stewart
05e933d7af py-nbsphinx: add v0.8.7 (#25515) 2021-08-20 13:50:32 +02:00
Erik Schnetter
65a7ceb3ce mpitrampoline: new package (#25516) 2021-08-20 13:50:03 +02:00
Tobias Ribizel
bdf7754552 ginkgo: update smoke test location (#25517)
The `test_install` folder was moved to `test` with https://github.com/ginkgo-project/ginkgo/pull/733.
2021-08-20 13:49:00 +02:00
Adam J. Stewart
b6f7fa6eb5 py-nbmake: add new package (#25525) 2021-08-20 05:46:40 -06:00
Garth N. Wells
2f85d3cdb9 pybind11: add v2.7.1 (#25519) 2021-08-20 13:44:39 +02:00
miheer vaidya
a6d26598ef p7zip build fails with gcc10+ (#25450) 2021-08-20 05:43:56 -06:00
Adam J. Stewart
d699478ab8 py-nbformat: add v5.1.3 (#25522) 2021-08-20 13:36:40 +02:00
Adam J. Stewart
68d488546f py-debugpy: add new package (#25523) 2021-08-20 13:36:14 +02:00
Adam J. Stewart
1d2798dd77 py-matplotlib-inline: add new package (#25524) 2021-08-20 13:35:55 +02:00
Adam J. Stewart
6ce0d934cf py-pydantic: add new package (#25526) 2021-08-20 13:31:58 +02:00
Pramod Kumbhar
ec720dd148 neuron: add v8.0.0 (#25533) 2021-08-20 13:24:18 +02:00
Harmen Stoppels
d52a1b8279 Fix broken develop as CI didn't run on latest merge commit (#25531)
* CI for #25439 was not run on the latest merge commit, and fails after #25470
* Make it consistent
2021-08-20 08:06:47 +00:00
Seth R. Johnson
e8bcb43695 trilinos: restore develop branch for exawind (#25487)
Exawind needs to build against trilinos@develop to sniff out errors
before they are merged to master.
2021-08-20 08:11:28 +01:00
Harmen Stoppels
220a87812c New spack.environment.active_environment api, and make spack.environment not depend on spack.cmd. (#25439)
* Refactor active environment getters

- Make `spack.environment.active_environment` a trivial getter for the active
environment, replacing `spack.environment.get_env` when the arguments are
not needed
- New method `spack.cmd.require_active_environment(cmd_name)` for 
commands that require an environment (rather than abusing 
get_env/active_environment)
- Clean up calling code to call spack.environment.active_environment or
spack.cmd.require_active_environment as appropriate
- Remove the `-e` parsing from `active_environment`, because `main.py` is
responsible for processing `-e` and already activates the environment.
- Move `spack.environment.find_environment` to
`spack.cmd.find_environment`, to avoid having spack.environment aware
of argparse.
- Refactor `spack install` command so argument parsing is all handled in the
command, no argparse in spack.environment or spack.installer
- Update documentation

* Python 2: toplevel import errors only with 'as ev'

In two files, `import spack.environment as ev` leads to errors.
These errors are not well understood ("'module' object has no attribute
'environment'"). All other files standardize on the above syntax.
2021-08-19 19:01:37 -07:00
Massimiliano Culpo
10695f1ed3 Use kcov from official Ubuntu 20.04 repository (#25385)
* Ubuntu 20.04 provides kcov, so don't build from source

* Use two undocumented options for kcov v3.8
2021-08-19 14:03:10 -07:00
Scott Wittenburg
350372e3bf buildcache: Add environment-aware buildcache sync command (#25470) 2021-08-19 12:15:40 -06:00
Axel Huebl
cd91abcf88 WarpX: Check & Smoke Tests (#25352)
Run an example at build time with:
```
spack install --test=root warpx@<version>
```
Ref.: https://spack.readthedocs.io/en/latest/packaging_guide.html#stand-alone-or-smoke-tests

Run smoke-tests after install and loading of the package via
```
spack load -r /<spec>
spack test run /<spec>
```
2021-08-18 18:01:14 -07:00
tehyayi11
c865aaaa0f seacas: add lib directory to pythonpath in run environment (#25453) 2021-08-19 00:11:46 +00:00
Massimiliano Culpo
4318ceb2b3 Bootstrap clingo from binaries (#22720)
* Bootstrap clingo from binaries

* Move information on clingo binaries to a JSON file

* Add support to bootstrap on Cray

Bootstrapping on Cray requires, at the moment, swapping
the platform when looking for binaries, due
to #22800.

* Add SHA256 verification for bootstrapped software

Use sha256 verification for binaries necessary to bootstrap
the concretizer and gpg for signature verification

* patchelf: use Spec._old_concretize() to bootstrap

As noted in #24450 we may happen to need the
concretizer when bootstrapping clingo. In that case
only the old concretizer is available.

* Add a schema for bootstrapping methods

Two fields have been added to bootstrap.yaml:
  "sources" which lists the methods available for
       bootstrapping software
  "trusted" which records if a source is trusted or not

A subcommand has been added to "spack bootstrap" to list
the sources currently available.

* Methods used for bootstrapping are configurable from bootstrap:sources

The function that tries to ensure a given Python module
is importable now tries bootstrapping methods in the same
order as they are defined in `bootstrap.yaml`

* Permit to trust/untrust bootstrapping methods

* Add binary tests for MacOS, Ubuntu

* Add documentation

* Add a note on bash
2021-08-18 11:14:02 -07:00
Axel Huebl
8a32f72829 openPMD-api: add v0.14.2 (#25473) 2021-08-18 19:08:44 +02:00
Christoph Conrads
06c8fdafd4 Serf: add Python dependency, sort dependencies (#25478) 2021-08-18 09:01:40 -07:00
Harmen Stoppels
b22728d55c Support older py-pygments (#25456)
`markdown` is only supported since py-pygments@2.8.0:, see
9647d2ae50

Let's allow old versions too again.
2021-08-18 09:49:51 +02:00
Erik Schnetter
c869f3639d cmake.py: Improve documentation (#25467)
Add missing `self.` prefixes when calling `define_from_variant`
2021-08-17 23:30:55 +02:00
psakievich
d00fc55e41 Add link_type documentation (#25451) 2021-08-17 09:47:26 -07:00
Massimiliano Culpo
09378f56c0 Use a patched argparse only in Python 2.X (#25376)
Spack is internally using a patched version of `argparse` mainly to backport Python 3 functionality
into Python 2. This PR makes it such that for the supported Python 3 versions we use `argparse`
from the standard Python library. This PR has been extracted from #25371 where it was needed
to be able to use recent versions of `pytest`.

* Fixed formatting issues when using a pristine argparse.py
* Fix error message for Python 3.X when missing positional arguments
* Account for the change of API in Python 3.7
* Layout multi-valued args into columns in error messages
* Seamless transition in develop if argparse.pyc is in external
* Be more defensive in case we can't remove the file.
2021-08-17 08:52:51 -07:00
Harmen Stoppels
f444303ce5 Add new pygments (#25455) 2021-08-17 07:19:31 -07:00
Vicente Bolea
4c0f1bf4e4 Paraview: uses canonical cuda_arch variant (#23257)
cuda_arch in ParaView will no longer accept CUDA architecture names
2021-08-17 04:55:16 -07:00
psakievich
a81ec88c6c Allow environment views to be sym/hard link and copy types (#24832)
Add link type to spack.yaml format

Add tests to verify link behavior is correct for installed files
for all three view types

Co-authored-by: vsoch <vsoch@users.noreply.github.com>
2021-08-16 19:21:57 -07:00
robgics
657a5c85cc Improve license err msg (#24117)
* Add to the error message to help determine failure source.

* Break up long line to keep under 80 chars.

Co-authored-by: Rob Groner <rug262@psu.edu>
2021-08-16 14:25:41 -07:00
robgics
437a272854 ampl: Add missing ampl_lic install and improve look of resources (#25205)
* ampl: Add missing ampl_lic install and improve look of resources

* ampl: Add myself as maintainer

* ampl: Remove unused variable and delete extra lines

Co-authored-by: Rob Groner <rug262@psu.edu>
2021-08-16 14:25:19 -07:00
Jen Herting
bfb811b7d3 New package: r-reams (#25396) 2021-08-16 14:51:40 -05:00
Howard Pritchard
79c2d55830 coreutils: patch for 8.32 for aarch64 (#25320)
Signed-off-by: Howard Pritchard <howardp@lanl.gov>
2021-08-16 20:12:18 +02:00
Jen Herting
e64659b008 New package: r-nfactors (#25393) 2021-08-16 12:52:47 -05:00
Jen Herting
51ad841f1e New package: r-neuralnet (#25392) 2021-08-16 12:51:27 -05:00
Jen Herting
9d05c7ba76 New package: r-labelled (#25391) 2021-08-16 12:50:07 -05:00
Jen Herting
1fd15703cd New package: r-islr (#25389) 2021-08-16 12:42:48 -05:00
eugeneswalker
538744c9ac e4s ci: further expand power stack (#25405) 2021-08-16 11:21:20 -06:00
Valentin Volkl
39cf1b2736 freeglut: add patch for clang@11: (#25438) 2021-08-16 09:07:38 -07:00
Jean-Paul Pelteret
3773185639 gmsh: remove gl2ps dependency (#25425) 2021-08-16 06:58:55 -07:00
Patrick Gartung
f22f857ece pythia6: update url (#25414) 2021-08-16 15:19:55 +02:00
Michael Will
3555446f73 FEniCS: Added byte order patch (#25417) 2021-08-16 05:46:36 -07:00
Seth R. Johnson
42c230dfbe trilinos: simplify variants (#25359) 2021-08-16 07:34:03 -04:00
Michele Mesiti
6110aa374c SOMBRERO - tests now in $PATH (#25421) 2021-08-16 13:22:30 +02:00
Sreenivasa Murthy Kolam
2d047d1f51 enable the variant AMDGPU_TARGETS in rocsolver (#25423) 2021-08-16 13:06:09 +02:00
Jen Herting
5f50f3329f r-viennacl: new package (#25398)
Co-authored-by: Alex C Leute <aclrc@rit.edu>
2021-08-16 01:13:28 -07:00
Jen Herting
1fa5642858 r-pvclust: new package (#25394)
Co-authored-by: Alex C Leute <aclrc@rit.edu>
2021-08-16 09:59:39 +02:00
lukebroskop
09cc439572 cray-libsci package update: add and addition function to access libraries. (#25386)
Co-authored-by: Luke Roskop <lukebr@login1.spock.olcf.ornl.gov>
2021-08-16 09:49:10 +02:00
Brian Van Essen
09fa9cdaae C++17 support for LBANN and HIP (#25406)
* Added logic to explicitly pass the c++17 language flags to the
HIP/ROCm software stack to ensure that HIP complies with the C++17
requirements.
2021-08-16 09:47:27 +02:00
Harmen Stoppels
de0d618730 libiconv: add libs variant to allow share and static builds (#25357) 2021-08-16 09:34:59 +02:00
Tamara Dahlgren
2ccbc00fd9 Second pass at increasing RADIUSS cloud CI packages (#25321) 2021-08-16 09:28:44 +02:00
Ethan Stam
51a22d5db7 ParaView: paraview should build its own IOSS (#25284) 2021-08-16 09:27:20 +02:00
Mikael Simberg
7f77ca4efb Update package.py (#25412) 2021-08-16 09:26:03 +02:00
Adam J. Stewart
83110cfd4c Python: update dependencies for 3.10 (#25430) 2021-08-16 09:19:32 +02:00
Valentin Volkl
c02539bd29 root: external find can now determine variants (#25427) 2021-08-16 09:18:48 +02:00
Adam J. Stewart
0b7aff2ad7 py-pandas: add v1.3.2 (#25434) 2021-08-16 09:10:55 +02:00
Adam J. Stewart
cd25599eba py-cython: add v0.29.24 (#25435) 2021-08-16 09:08:32 +02:00
Valentin Volkl
77a9004c31 vtk: add patch for missing includes (#25437) 2021-08-16 09:08:04 +02:00
AMD Toolchain Support
e42af64e24 WRF: v3.9.1.1 and v4.2 can be built with aocc@3.1 (#25384)
Co-authored-by: mohan002 <mohbabul@amd.com>
2021-08-16 08:26:56 +02:00
Geoffrey Gunter
fbed679dd0 Add date v3.0.1 (#25432) 2021-08-15 14:42:30 -05:00
wspear
44e251d974 PDT accepts oneapi as provider of icpc compiler (#25157)
* PDT accepts oneapi as provider of icpc compiler

* Add maintainers section (same as for tau)

* Whitespace formatting fix
2021-08-15 08:40:24 -07:00
Seth R. Johnson
2965c501a5 openpbs: add provider, new version, new name (#25429) 2021-08-15 07:59:19 -05:00
Adam J. Stewart
8d881cb7ee py-matplotlib: add v3.4.3 (#25413) 2021-08-14 21:09:17 -04:00
Satish Balay
c7ba2e9663 xsdk: deprecate versions 0.4.0, 0.5.0 (#25428) 2021-08-14 21:06:18 -04:00
Mikhail Titov
e911f8ab3c New packages: RADICAL-Cybertools (#25415)
* rct: new packages (core packages and some dependencies)

* rct: new packages (core packages and some dependencies)

* radical-entk: updated dependencies (according to comments)

* radical-gtod: updated version name

* radical-pilot: updated dependencies (according to comments)

* radical-saga: updated dependencies (according to comments)

* radical-utils: updated dependencies and set old versions deprecated

* saga-python: removed due to absence of packages (in PyPI, GitHub); this project was replaced by `radical-saga`, and the corresponding package `py-radical-saga` should be used

* saga-python: rolled back, but with deprecation status

* ntplib: removed maintainer

* pika: removed maintainer
2021-08-13 21:59:03 +00:00
Sreenivasa Murthy Kolam
1c7aa12615 cleanup opt-rocm references when using spack (#25264)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2021-08-13 19:08:25 +00:00
Massimiliano Culpo
f0875b2fef Remove handling for deprecated module commands (#25411)
The commands have been deprecated in #7098, and have
been failing with an error message since then.

Cleaning up the code since it is unlikely that anybody
is still using them.
2021-08-13 07:51:56 -07:00
Harmen Stoppels
920f695d1d Mention bash in prerequisites (#25379)
Isn't installed on Alpine.
2021-08-13 07:28:29 -07:00
Harmen Stoppels
607f2a0c1c libtree: add v2.0.0 (#25360) 2021-08-13 14:54:37 +02:00
Valentin Volkl
6cd8583165 recola: add new package (including dependencies) (#25383) 2021-08-13 11:50:28 +02:00
Sreenivasa Murthy Kolam
06eda406cc append CMAKE_MODULE_PATH to include hip for rocalution (#25407) 2021-08-13 10:23:09 +02:00
arjun-raj-kuppala
4784ec67b2 AMD ROCm 4.3: Bump up rvs version to 4.3 and fix hip patch (#25382) 2021-08-13 10:20:25 +02:00
Pieter Ghysels
264b00bff4 strumpack: Patch for building shared lib when enabling ROCm. (#25252)
* Fix for building shared lib when enabling ROCm, for STRUMPACK 5.1.1.

* Update patch for shared lib with STRUMPACK 5.1.1 and ROCm, also update FindHIP.cmake

* update patch for shared libs with ROCm
2021-08-12 10:51:43 -07:00
messense
b008d2b1fe py-py-spy: upgrade to 0.3.8 and build with cargo (#25375) 2021-08-12 14:37:56 +00:00
Massimiliano Culpo
78850f38eb Fix typos in fixture use. Mention fixtures in pytest.ini (#25381) 2021-08-12 12:08:53 +00:00
Sreenivasa Murthy Kolam
cc8bb38aab update version for rocm-4.3.0 release for hip recipe (#25342)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2021-08-12 13:09:36 +02:00
Sreenivasa Murthy Kolam
524d4e5071 update version for rocm-4.3.0 release (#25374) 2021-08-12 10:17:03 +02:00
Aurelien Bouteiller
66fc6940a0 parsec: add missing build deps on flex/bison (#25373)
Signed-off-by: Aurelien Bouteiller <bouteill@icl.utk.edu>
2021-08-12 08:22:46 +02:00
Vasileios Karakasis
aef4696593 ReFrame: add v3.7.x (#25372) 2021-08-12 08:16:32 +02:00
Tamara Dahlgren
229bcd9f03 raja: develop version requires camp at master (#25346) 2021-08-11 16:05:37 +02:00
Tamara Dahlgren
345617ecb5 umpire: develop version requires camp at develop (#25347) 2021-08-11 16:05:13 +02:00
Massimiliano Culpo
6fb8122187 binutils: add v2.37 (#25356) 2021-08-11 02:52:42 -07:00
Sinan
a5fbf8fbae pango: add versions up to v1.48 (#25355)
Co-authored-by: sbulut <sbulut@3vgeomatics.com>
2021-08-11 09:40:41 +00:00
Harmen Stoppels
3b23b42519 libtree: add "master" version, variants and dependencies (#25337) 2021-08-11 08:14:01 +00:00
Sreenivasa Murthy Kolam
84e2469e41 update the version for rocm-4.3.0 release (#25343) 2021-08-11 09:55:20 +02:00
Sreenivasa Murthy Kolam
2ca44a6f6d update version for rocm-4.3.0 release (#25349) 2021-08-11 09:52:22 +02:00
Tamara Dahlgren
fb8c954e2e Switch to settings option used in latest *and* oldest listed version (#25340) 2021-08-11 07:30:22 +00:00
Alexander Jaust
e89f2c0e91 fenics: updates to allow newer versions of Boost (#25329) 2021-08-11 09:29:48 +02:00
Valentin Volkl
0d4f69f28c py-particle: add v0.15.1 (#25333) 2021-08-11 09:28:30 +02:00
Edgar Leon
c3370897cf mpibind: add new package (#25322) 2021-08-11 09:22:37 +02:00
Seth R. Johnson
b686b73d6f moab: add 5.3.0 and use https (#25345) 2021-08-11 09:15:16 +02:00
Sinan
ffbc00de21 libaec: add v1.0.3, v1.0.4, v1.0.5 (#25348)
Co-authored-by: sbulut <sbulut@3vgeomatics.com>
2021-08-11 09:14:48 +02:00
Axel Huebl
41ae83463e doc: def llnl.util.filesystem.find fix rst (#25350)
The star was not rendered in the docs.
2021-08-11 09:14:04 +02:00
Brian Van Essen
128d788363 Changed the LBANN software stack to not explicitly set the Host (#25351)
Transfer protocol in the Aluminum library. If required, the Host
Transfer variant +ht should be explicitly set.
2021-08-11 09:12:59 +02:00
Harmen Stoppels
26ed2776e7 Add __skip_rocmclang to more rocm packages for cmake 3.21 (#25328) 2021-08-11 08:21:17 +02:00
arjun-raj-kuppala
d88d887ed0 AMD ROCm 4.3.0 - bump up version for llvmamdgpu, hsakmt-roct, rocm-cmake, rocm-smi-lib (#25228)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2021-08-11 00:42:31 +02:00
Massimiliano Culpo
371bc37dd4 Rework rules for provider weights (#25331)
Preferred providers had a non-zero weight because in an earlier formulation of the logic program that was needed to prefer external providers over default providers. With the current formulation for externals this is not needed anymore, so we can give a weight of zero to both default choices and providers that are externals. _Using zero ensures that we don't introduce any drift towards having fewer providers, which was happening when minimizing positive weights_.

Modifications:

- [x] Default weight for providers starts at 0 (instead of 10, needed before to prefer externals)
- [x] Rules to compute the `provider_weight` have been refactored. There are multiple possible weights for a given `Virtual`. Only one gets selected by the solver (the one that minimizes the objective function).
- [x] `provider_weight` are now accounting for each different `Virtual`. Before there was a single weight per provider, even if the package was providing multiple virtuals.

* Give preferred providers a weight of zero

Preferred providers had a non-zero weight because in an earlier
formulation of the logic program that was needed to prefer
external providers over default providers.

With the current formulation for externals this is not needed anymore,
so we can give a weight of zero to default choices. Using zero
ensures that we don't introduce any drift towards having
fewer providers, which was happening when minimizing positive weights.

* Simplify how we compute weights for providers

Rewrite rules so that specific events (i.e. being
an external) unlock the possibility to use certain
weights. The weight being considered is then selected
by the minimization process to be the one that gives
the best score.

* Allow providers to have different weights for different virtuals

Before this change we didn't differentiate providers based on
the virtual they provide, which meant that packages providing
more than one virtual had nonetheless a single weight.

With this change there will be a weight per virtual.
2021-08-10 14:15:45 -07:00
Harmen Stoppels
3fafbafab0 cxxopts: add v2.2.1 and clean package.py (#25334) 2021-08-10 18:42:18 +02:00
Harmen Stoppels
4ec70ca2ff elfio: new package (#25335) 2021-08-10 18:41:33 +02:00
Harmen Stoppels
afc1134fe6 cpp-termcolor: new package (#25336) 2021-08-10 18:39:55 +02:00
permeakra
8dff43b8ea Elk update (#24432)
* elk package updated to handle the 3 latest versions; support for older
versions is dropped

* fixed typos

* openmp dependency handling added

* and for blis too

* Retain support for elk 3, deprecate

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-08-10 09:51:45 -05:00
Howard Pritchard
d3f67a4855 openmpi:add pandoc dependency when building master (#25319)
related to #25304

Signed-off-by: Howard Pritchard <howardp@lanl.gov>
2021-08-10 09:02:19 +02:00
Paul Adamson
e44de970f8 uqtk: add pyqutk variant to uqtk (#25298)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2021-08-10 08:05:26 +02:00
Massimiliano Culpo
5916afec84 Add a badge for the Boostrapping workflow (#25318)
This workflow is run daily on `develop`.
2021-08-09 21:45:01 +02:00
Harmen Stoppels
81be31aee0 Make spack env activate x idempotent (#25222)
* Make spack env activate x idempotent

* Update lib/spack/spack/cmd/env.py
2021-08-09 07:07:39 -07:00
Robert Cohn
fc46db2269 sos: shr-atomics is a disable/enable (#25315) 2021-08-09 16:05:06 +02:00
iarspider
8c8b934fd8 giflib: define prefix and libversion also when building (#25263) 2021-08-09 13:32:38 +00:00
Robert Cohn
2738bc17a1 sos: add xpmem variant (#25260)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2021-08-09 13:16:19 +00:00
Vasily Danilin
ce199e1c67 intel-oneapi-vtune: new package + remove danvev from maintainers (#25311) 2021-08-09 15:04:06 +02:00
Tom Stitt
a3f76740e7 xwidgets: add new package (#25237) 2021-08-09 05:16:43 -07:00
downloadico
be59f2bff0 libxc: add std flag for Intel (#25281) 2021-08-09 14:05:42 +02:00
Valentin Volkl
d4df3b31fb dd4hep: fixes to run tests (#25204) 2021-08-09 14:04:33 +02:00
Martin Aumüller
597358e735 ispc: add v1.16 and v1.16.1 (#25206)
Also adapt dependencies for older versions
2021-08-09 14:02:58 +02:00
Tiziano Müller
fa715c9892 cp2k: update elpa and sirius dependencies, fix build with mpich and gcc@10 (#24376)
* cp2k: fix build with GCC-10+ and MPICH

* cp2k: update SIRIUS and ELPA dependencies

* elpa: add version 2021.05.001, add ROCm support, include SVE flags
2021-08-09 13:46:15 +02:00
Tom Stitt
80473283f3 xeus: add v1.0.4, improved package recipe (#24984)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-08-09 13:42:46 +02:00
Adam J. Stewart
21e2ba13dc py-pydeprecate: add v0.3.1 (#25231) 2021-08-09 13:24:12 +02:00
Adam J. Stewart
df0d86d795 py-pytorch-lightning: add v1.4.1 (#25232) 2021-08-09 13:23:38 +02:00
Tom Stitt
0f514a390a xproperty: add new package (#25236) 2021-08-09 13:20:57 +02:00
Adam J. Stewart
c513ba25f1 py-pytest: add v6.2.4 (#25240) 2021-08-09 12:41:12 +02:00
Adam J. Stewart
250a08ab7a py-requests-unixsocket: add missing py-pbr dependency (#25241) 2021-08-09 12:40:10 +02:00
Erik Schnetter
534b5d28e8 ssht: add v1.5.0 (#25247) 2021-08-09 12:36:12 +02:00
Adam J. Stewart
20698f8f7e py-setuptools: add v57.4.0 (#25239) 2021-08-09 12:35:38 +02:00
Ben Darwin
bd8ae72146 dcm2niix: new package (at version 1.0.20210317) (#25250) 2021-08-09 11:15:38 +02:00
Severin Strobl
f67d3b127c apex: add v2.4.1 (#25248)
Also updated the URLs for APEX according to the recent move.
2021-08-09 11:14:55 +02:00
Martin Pokorny
22bc189e0e casacore: add variant for ADIOS2 support (#25251) 2021-08-09 11:13:26 +02:00
Todd Gamblin
7ddd6ad461 installation: filter padding from all tty output
This is both a bugfix and a generalization of #25168. In #25168, we attempted to filter padding
*just* from the debug output of `spack.util.executable.Executable` objects. It turns out we got it
wrong -- filtering the command line string instead of the arg list resulted in output like this:

```
==> [2021-08-05-21:34:19.918576] ["'", '/', 'b', 'i', 'n', '/', 't', 'a', 'r', "'", ' ', "'", '-', 'o', 'x', 'f', "'", ' ', "'", '/', 't', 'm', 'p', '/', 'r', 'o', 'o', 't', '/', 's', 'p', 'a', 'c', 'k', '-', 's', 't', 'a', 'g', 'e', '/', 's', 'p', 'a', 'c', 'k', '-', 's', 't', 'a', 'g', 'e', '-', 'p', 'a', 't', 'c', 'h', 'e', 'l', 'f', '-', '0', '.', '1', '3', '-', 'w', 'p', 'h', 'p', 't', 'l', 'h', 'w', 'u', 's', 'e', 'i', 'a', '4', 'k', 'p', 'g', 'y', 'd', 'q', 'l', 'l', 'i', '2', '4', 'q', 'b', '5', '5', 'q', 'u', '4', '/', 'p', 'a', 't', 'c', 'h', 'e', 'l', 'f', '-', '0', '.', '1', '3', '.', 't', 'a', 'r', '.', 'b', 'z', '2', "'"]
```

Additionally, plenty of builds output padded paths in other places -- e.g., not just in command
arguments, but in other `tty` messages via `llnl.util.filesystem` and elsewhere. `Executable`
isn't really the right place for this.

This PR reverts the changes to `Executable` and moves the filtering into `llnl.util.tty`. There is
now a context manager there that you can use to install a filter for all output.
`spack.installer.build_process()` now uses this context manager to make `tty` do path filtering
when padding is enabled.

- [x] revert filtering in `Executable`
- [x] add ability for `tty` to filter output
- [x] install output filter in `build_process()`
- [x] tests
2021-08-09 01:42:07 -07:00
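A minimal sketch of what such an output-filtering context manager can look like; the class and function names here are illustrative, not the actual `llnl.util.tty` API.

```
import contextlib
import sys


class _FilteredStream:
    """Wrap a stream and rewrite substrings in everything written to it."""

    def __init__(self, wrapped, replacements):
        self._wrapped = wrapped
        self._replacements = replacements

    def write(self, text):
        for old, new in self._replacements.items():
            text = text.replace(old, new)
        return self._wrapped.write(text)

    def flush(self):
        return self._wrapped.flush()


@contextlib.contextmanager
def filter_stdout(replacements):
    """Install an output filter for the duration of the block (toy version)."""
    saved = sys.stdout
    sys.stdout = _FilteredStream(saved, replacements)
    try:
        yield
    finally:
        sys.stdout = saved


# Example: hide a long padded install prefix in anything printed inside the block.
padded = "/opt/spack/padding/" + "_" * 40
with filter_stdout({padded: "[prefix]"}):
    print(padded + "/lib/libz.so")  # prints: [prefix]/lib/libz.so
```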
Jen Herting
29098a1e07 py-speech-recognition: new package (#25273) 2021-08-09 10:31:21 +02:00
Sergey Kosukhin
053fa2a47b netcdf-fortran: fix building with AOCC (#25186) 2021-08-09 10:30:31 +02:00
Tom Stitt
25a4c5ad18 mfem: add v4.3 (#25158)
Co-authored-by: Veselin Dobrev <dobrev@llnl.gov>
2021-08-09 10:08:33 +02:00
Tamara Dahlgren
4cd2cfc7e7 ci pipelines: expand the list of RADIUSS packages (#25282) 2021-08-09 10:06:56 +02:00
Satish Balay
874a35e30d trilinos: fix +hypre build (#25291) 2021-08-09 10:05:52 +02:00
Satish Balay
f6cb076229 petsc: fix build with trilinos (#25289) 2021-08-09 10:02:47 +02:00
Sreenivasa Murthy Kolam
ea71eb35d3 Update rocm ecosystem to v4.3.0 (#25299) 2021-08-09 09:48:05 +02:00
Satish Balay
24ea784dc4 petsc, petsc4py: add version 3.15.3 (#25307) 2021-08-09 00:43:35 -07:00
Glenn Johnson
ff9c7380e8 r-cairo: add an explicit dependency on libxt (#25309)
Since r-cairo will always look for X libraries, and likely find them on
the system, make it always depend on r+X.
2021-08-09 09:27:57 +02:00
Harmen Stoppels
c37aee4620 hip: fix broken tests with ^cmake@3.21: (#25246) 2021-08-09 07:19:29 +00:00
Harmen Stoppels
4384ff8e41 Use __skip_rocmclang for cmake 3.21 in rocblas/rocfft to avoid incomplete compiler support (#25253) 2021-08-09 07:03:03 +00:00
Harmen Stoppels
420113d5ab libtree: add v1.2.3 (#25270) 2021-08-09 08:51:39 +02:00
Seth R Johnson
66a8993092 Remove xsdk@0.2.0 and associated versions and xsdktrilinos
These versions can cause weird concretizations, and it looks like the
old version of xsdk may not even work because of xsdktrilinos being
disabled. The hypre version tagged for xsdk@0.2 no longer exists at the
described location.
2021-08-08 10:55:39 -07:00
Seth R Johnson
38803e3597 trilinos: remove develop version, change xsdk name
With the previous naming scheme, `trilinos@:10` concretizes to
`trilinos@xsdk-0.2.0`. Now, it's clear what the xsdk version is closest
to. Changed from tag to the corresponding commit SHA for safety.
2021-08-08 10:55:39 -07:00
Satish Balay
6ed4cf4016 petsc: cleanup test code to support 'spack test run' and add a 'cuda' test (#25099) 2021-08-08 12:16:42 -05:00
Alec Scott
0dd6b1e134 Clean Up PR from Container Builder 2021-08-08 07:20:25 -07:00
Edgar A. Leon
1c90b25933 hwloc: Adding opencl and rocm (AMD GPUs) variants to hwloc. 2021-08-08 07:18:26 -07:00
lukebroskop
1c204bef8a trilinos: flag_handler logic fix (#25290)
* Do not allow cray build system patch for later version of otf2

* Modify flag_handler logic in the trilinos package

Modify flag_handler logic in the trilinos package to work better with compilers
other than CCE
2021-08-08 10:16:57 -04:00
Wouter Deconinck
9b66053d99 vecgeom: new version 1.1.16 (#25266)
Diff 1.1.15 to 1.1.16 at https://gitlab.cern.ch/VecGeom/VecGeom/-/compare/v1.1.15...v1.1.16?from_project_id=981, no changes to build dependencies.
2021-08-08 10:05:19 -04:00
Alec Scott
cfbefee0fa Fix GHCR Username in Container Builder (#25301) 2021-08-07 00:25:36 +00:00
Axel Huebl
98d4a7af24 openPMD-api: CTest & Install Tests (#25300)
Run CTest at build time with:
```
spack install --test=root openpmd-api@<version>
```

and run smoke-tests after install and loading of the package via
```
spack load -r /<spec>
spack test run /<spec>
```
2021-08-06 16:55:32 -07:00
Axel Huebl
91b3dcca26 openPMD-api: 0.14.1 (#25265)
* openPMD-api: 0.14.1

Add the latest bugfix release.

* Keep 0.13.4 still preferred

More regressions to mitigate...
2021-08-06 16:43:46 -07:00
Axel Huebl
d69a22f160 WarpX: 21.08 (#25234)
Add the latest WarpX release.
2021-08-06 16:42:58 -07:00
Alec Scott
b92fa6bbf9 Add New Build Containers Workflow (#24257)
This pull request adds a new workflow to build and deploy Spack Docker containers
from GitHub Actions. In comparison with our current system where we use Dockerhub's
CI to build our Docker containers, this workflow will allow us to now build for multiple
architectures and deploy to multiple registries. (At the moment x86_64 and Arm64 because
ppc64le is throwing an error within archspec.)

As currently set up, the PR will build all of the current containers (minus Centos6 because 
those yum repositories are no longer available?) as both x86_64 and Arm64 variants. The
workflow is currently setup to build and deploy containers nightly from develop as well as
on tagged releases. The workflow will also build, but NOT deploy containers on a pull request
for the purposes of testing this PR. At the moment it is setup to deploy the built containers to
GitHub's Container Registry although, support for also uploading to Dockerhub/Quay can be
included easily if we decide to keep releasing on Dockerhub/want to begin releasing on Quay.
2021-08-06 15:53:46 -07:00
Jen Herting
97993ac38a [py-sentencepiece] added version 0.1.91 (#25275) 2021-08-06 15:41:27 +00:00
Todd Gamblin
ad66b758e4 codecov: allow coverage offsets for more base commit flexibility (#25293)
This is an attempt to fix "Missing base commit" messages in the codecov UI. Because we do not run
full tests on package PRs, package PRs' merge commits on `develop` don't have coverage info. It
appears that codecov will give you an error if the pseudo-base's coverage data doesn't all apply
properly to the real PR base, unless the `allow_coverage_offsets` option is set.

* See here for docs:
  https://docs.codecov.com/docs/comparing-commits#pseudo-comparison

* See here for another potential solution:
  https://community.codecov.com/t/2480/15
2021-08-06 01:33:12 -07:00
Todd Gamblin
0a6e98cdb5 refactor: rename colorful kwarg to color (#25292)
`compare_specs()` had a `colorful` keyword argument, but everything else in
spack uses `color` for this.

- [x] rename the argument
- [x] make the default follow spack's `--color=always/never/auto` setting
2021-08-06 06:29:49 +00:00
Jen Herting
2aea624dca New package: r-elemstatlearn (#25276) 2021-08-05 17:12:15 -05:00
Jen Herting
0053117ac8 New package: r-gparotation (#25277) 2021-08-05 17:05:21 -05:00
Jen Herting
c178000d18 [sentencepiece] added version 0.1.91 (#25274) 2021-08-05 16:49:23 +00:00
gpotter2
4af6d6bb1b Add libvpx to ffmpeg (#25155) 2021-08-05 10:55:03 -05:00
Harmen Stoppels
8f238c03ad Add py-fastcov (#25268)
* Add py-fastcov

* Update var/spack/repos/builtin/packages/py-fastcov/package.py
2021-08-05 13:36:41 +00:00
Harmen Stoppels
15653868c8 patchelf: add v0.13 (#25271) 2021-08-05 13:30:17 +00:00
iarspider
159ac3e3fb Add checksum for py-wheel 0.33.6 (#25257) 2021-08-05 08:01:22 +00:00
iarspider
b11f8aa4ea Update py-typing-extensions (#25262) 2021-08-04 21:56:52 +00:00
iarspider
4ac246760e Add checksum for py-gast 0.5.1 and 0.5.2 (#25258) 2021-08-04 14:55:48 -07:00
iarspider
63f950768f Add checksums for new protobuf versions (#25259) 2021-08-04 21:47:23 +00:00
Adam J. Stewart
e2fe415ae6 Spack version: 0.16.1 -> 0.16.2 (#25255)
17473a08ff merged `v0.16.1` back into `develop` but somehow lost the version bump.  Fix it here.
2021-08-04 14:37:40 -07:00
iarspider
988c67fff2 Add checksum for py-astor@0.8.1 (#25256) 2021-08-04 21:31:07 +00:00
Richarda Butler
b5c82aa986 Caliper: Add E4S testsuite stand alone test (#25094) 2021-08-04 10:22:50 -07:00
lukebroskop
5f3c25f6e9 libcircle fix for CCE (#25224) 2021-08-04 12:43:18 +00:00
Michele Mesiti
978191aff5 Added new versions for Sombrero (#25243) 2021-08-04 10:02:30 +00:00
Harmen Stoppels
4b870196c0 Bump py-python-swiftclient and add keystone support (#25221) 2021-08-04 08:09:01 +00:00
Harmen Stoppels
575e321cc5 Add py-python-keystoneclient (#25220)
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-08-04 07:55:53 +00:00
Harmen Stoppels
20394a97da Add py-oslo-serialization (#25218)
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-08-04 07:36:11 +00:00
Harmen Stoppels
63ac1b6620 Add py-oslo-config (#25216) 2021-08-04 07:31:44 +00:00
Harmen Stoppels
574ade6f76 Add py-keystoneauth1 (#25213)
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-08-04 07:28:54 +00:00
Harmen Stoppels
56bb98c542 Add py-oslo-utils (#25219) 2021-08-03 15:10:32 -07:00
Tomoyasu Nojiri
929eb311c7 poplddecay: fix checksum for v3.41 (#20896) 2021-08-03 20:28:26 +00:00
Jen Herting
b21649e6b8 [py-efficientnet-pytorch] added version 0.7.1 (#25230) 2021-08-03 20:28:05 +00:00
Thomas Madlener
70b32b53fc py-sympy: Add new versions and python versions they work with (#25226) 2021-08-03 19:35:15 +00:00
Harmen Stoppels
269b6ced99 py-python-dateutil: add v2.8.2 (#25209) 2021-08-03 18:12:26 +00:00
Harmen Stoppels
196a0a91a5 Bump py-boto3, add python constraints, bump deps (#25211)
* Bump py-boto3, add python constraints, bump deps

* Update var/spack/repos/builtin/packages/py-boto3/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-boto3/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-boto3/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-08-03 18:04:11 +00:00
Harmen Stoppels
468823d1b9 Fix typo (#25223) 2021-08-03 18:00:29 +00:00
Harmen Stoppels
ccdf418e52 Add py-os-service-types (#25215)
* Add py-os-service-types

* Update var/spack/repos/builtin/packages/py-os-service-types/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-08-03 17:57:46 +00:00
Harmen Stoppels
b172b43fa9 Add py-oslo-i18n (#25217)
* Add py-oslo-i18n

* Update var/spack/repos/builtin/packages/py-oslo-i18n/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-08-03 17:12:51 +00:00
Harmen Stoppels
bdc3bde74b Bump py-botocore and add python constraints (#25210)
* Bump py-botocore and add python constraints

* Update var/spack/repos/builtin/packages/py-botocore/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-08-03 16:57:56 +00:00
eugeneswalker
4e9bccf2ef e4s ci stack: update package preferences (#25163) 2021-08-03 09:30:14 -07:00
Harmen Stoppels
d92eff7e89 Add py-debtcollector (#25212)
* Add py-debtcollector

* Fix missing dependencies
2021-08-03 15:40:02 +00:00
Harmen Stoppels
1f191ef37c Add py-netaddr (#25214) 2021-08-03 10:29:04 -05:00
Harmen Stoppels
d734bfda18 py-urllib3: add v1.26.6 (#25207) 2021-08-03 08:07:51 -07:00
Massimiliano Culpo
0026d60b60 Test bootstrapping in a workflow (#25138)
Add a workflow to test bootstrapping clingo on 
different platforms so that we can detect changes 
that break it.

Compute `site_packages_dir` in `bootstrap.py` as it was
before #24095, until we figure a better way to override
that attribute.
2021-08-03 16:53:40 +02:00
Harmen Stoppels
15bc4faf2d py-s3transfer: add v0.5.0, v0.4.2 (#25208) 2021-08-03 16:49:24 +02:00
Manuela Kuhn
7d0878f5c4 py-datalad: add new package (#25181) 2021-08-03 09:20:06 -05:00
Brent Huisman
a76365c72b Bump Arbor package to v0.5.2 (#24519) 2021-08-03 13:13:19 +00:00
Sebastian Schmitt
73923f1e93 fmt: add v8.0.1 (#25201) 2021-08-03 06:04:39 -07:00
Michael Kuhn
fa729858ac h5bench: new package (#25190) 2021-08-03 05:46:38 -07:00
Harmen Stoppels
62d59f0fb7 scrot: use tarball with configure script (#25176) 2021-08-03 14:13:08 +02:00
Rémi Lacroix
a62210efb9 midnight-commander: add v4.8.26 (#25178) 2021-08-03 14:11:41 +02:00
AMD Toolchain Support
d1ee325ecd AOCC support for CloverLeaf (#25106)
* AOCC support for CloverLeaf

* removing patch as it is upstreamed to source

Co-authored-by: mohan002 <mohbabul@amd.com>
2021-08-03 07:06:38 -05:00
Adam J. Stewart
88d24150e6 py-scipy: add v1.7.1 (#25187) 2021-08-03 14:00:14 +02:00
Axel Huebl
420a8c2eb2 openPMD-api: make v0.13.4 preferred (#25188)
Keep the previous patch release as preferred as we investigate
a few regressions.
2021-08-03 13:52:35 +02:00
AMD Toolchain Support
5698850dc4 aocc 3.1.0: fix version detection for v3.1.0 (#25084) 2021-08-03 13:48:19 +02:00
Tom Payerle
c7e8bdf9cf intel-tbb: allow compilation with nvhpc (#25044)
These are the versions tested (and successfully patched) against
intel-tbb.nvhpc-remove-flags.2017.patch: @2017, @2017.8, @2018, @2018.6
intel-tbb.nvhpc-remove-flags.2019.patch: @2019
intel-tbb.nvhpc-remove-flags.2019.1.patch: @2019.[1-6]
intel-tbb.nvhpc-remove-flags.2019.7.patch: @2019.[7-8]
intel-tbb.nvhpc-remove-flags.2019.9.patch: @2019.9, 2020.[0-3]

The intel-tbb.nvhpc-version-script-fix.2017.patch was tested and
applied successfully against all of the versions above.
2021-08-03 13:35:31 +02:00
John Vandenberg
d33d9d1f03 sbp: add new package (#25194) 2021-08-03 13:25:21 +02:00
Carlos Bederián
a4698f6122 aocc: add v3.1.0 (#25193) 2021-08-03 03:10:31 -07:00
Todd Gamblin
fc840c904b executable: filter long paths from debug output (#25168)
Padded install paths can get to be very long in the verbose install
output. This has to be filtered out by the Executable class, as it
generates these debug messages.

- [x] add ability to filter paths from Executable output.
- [x] add a context manager that can enable path filtering
- [x] make `build_process` in `installer.py`

This should hopefully allow us to see most of the build output in
Gitlab pipeline builds again.
2021-08-03 10:00:33 +00:00
Tim Haines
e477101345 PAPI: add version 'master' (#25192)
This is needed for testing with the ECP Dev Tools SDK.

Co-authored-by: Tim Haines <thaines@cs.wisc.edu>
2021-08-03 02:34:32 -07:00
Adam J. Stewart
171001ca84 py-fiona: add v1.8.20 (#25196) 2021-08-03 11:20:43 +02:00
Rémi Lacroix
6fa803a38d DIAMOND: add v2.0.11 (#25198) 2021-08-03 10:38:54 +02:00
Todd Gamblin
cf8d1b0387 refactor: convert build_process to use BuildProcessInstaller (#25167)
`build_process` has been around a long time but it's become a very large,
unwieldy method. It's hard to work with because it has a lot of local
variables that need to persist across all of the code.

- [x] To address this, convert it to its own `BuildProcessInstaller` class.
- [x] Start breaking the method apart by factoring out the main
      installation logic into its own function.
2021-08-03 10:24:24 +02:00
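As a rough sketch of the refactoring pattern (names and phases below are illustrative, not Spack's real installer code): the local variables become instance state, and the monolithic function becomes a set of small methods.

```
class BuildProcessInstaller:
    """Illustrative only: hold what used to be local variables as instance state."""

    def __init__(self, pkg, install_args):
        self.pkg = pkg
        self.verbose = install_args.get("verbose", False)
        self.fake = install_args.get("fake", False)

    def run(self):
        # Entry point: each phase is now a small method instead of a block of
        # code sharing a dozen local variables.
        self._stage_sources()
        self._build_and_install()
        return True

    def _stage_sources(self):
        print("staging sources for", self.pkg)

    def _build_and_install(self):
        mode = "fake" if self.fake else "real"
        print("installing", self.pkg, "({0} build)".format(mode))


def build_process(pkg, install_args):
    # Thin wrapper that preserves the old entry point.
    return BuildProcessInstaller(pkg, install_args).run()


build_process("patchelf", {"verbose": True})
```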
Todd Gamblin
0a0338ddfa bugfix: ensure all bootstrap context managers are exception-safe
When context managers are used to save and restore values, we need to remember
to use try/finally around the yield in case an exception is thrown.  Otherwise,
the cleanup will be skipped.
2021-08-03 10:07:11 +02:00
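A minimal example of the pattern this fix enforces, using a made-up context manager rather than Spack's bootstrap code: without the try/finally, an exception raised inside the `with` block would skip the restore step.

```
import contextlib


@contextlib.contextmanager
def preserve(config, key, temporary_value):
    """Temporarily override config[key], restoring it even if the body raises."""
    saved = config.get(key)
    config[key] = temporary_value
    try:
        yield
    finally:
        # Without try/finally, an exception in the 'with' body would skip this.
        config[key] = saved


config = {"install_tree": "/opt/spack"}
try:
    with preserve(config, "install_tree", "/tmp/bootstrap-store"):
        raise RuntimeError("simulated bootstrap failure")
except RuntimeError:
    pass
assert config["install_tree"] == "/opt/spack"  # restored despite the error
```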
Todd Gamblin
693c4d8f3a spack style: improve tests for failure cases
This fixes the bad bootstrap test for spack style, and it refines the
assertions on other failure cases.
2021-08-03 10:07:11 +02:00
Dylan Simon
507d3c841c don't spin writer daemon when < /dev/null (#25170) 2021-08-02 21:39:38 -07:00
Erik Schnetter
2dd2a5b167 rnpletal: New package (#25154)
* rnpletal: New package

RNPL is an old package that is still used today by my collaborators, but doesn't see any development any more. I'm creating a Spack package merely to make it easier to install it on various systems. The code is not modern (C without prototypes – yes, that used to be a thing), and a large diff modernizes the code to make it palatable to modern C and Fortran compilers.

RNPL contains several sub-packages. The current Spack package builds only the main one.

* rnpletal: Remove unused import

* Convert into AutotoolsPackage

* Don't check for "shared" variant

* rnpletal: Change "version" to `develop`

* rnpletal: Use existing `configure` function
2021-08-02 14:01:45 -07:00
Ali Ahmed
a60e3f80f6 [curl] Fix brotli option flag (#25166)
Co-authored-by: Ali Ahmed <alia@splunk.com>
2021-08-02 12:46:56 -07:00
Adam J. Stewart
6d810cb2e7 Docs: add link to source code (#25088) 2021-08-02 12:36:40 -07:00
Satish Balay
413919be1f petsc: add variants strumpack, scalapack (#25058)
strumpack: switch default to +shared
2021-08-02 13:48:33 -05:00
Sreenivasa Murthy Kolam
73a65dc370 modify hip-rocclr references for 4.1.0 and 4.2.0 releases (#24868) 2021-08-02 11:16:47 -07:00
Frank Willmore
0df067e64f adjust for erroneous detection of nvc as gcc (#24915)
* adjust for erroneous detection of nvc as gcc

adjust for erroneous detection of nvc as gcc when it is built with gcc

* add missing parenthesis :/

* fix trailing whitespace

* re-work hdf5 patch for nvc to make it more general

* flake8 fixes

* Render as comment

Render intended note as a comment rather than logical constraint

Co-authored-by: Frank Willmore <willmore@anl.gov>
2021-08-02 11:13:32 -07:00
Tamara Dahlgren
413ea10e78 ci: Add RADIUSS stack to cloud CI (#23922)
Add RADIUSS software stack to gitlab PR testing pipelines
2021-08-02 10:19:35 -06:00
Fabian Brandt
71cd303362 Bump version 9.0 (#25039) 2021-08-02 15:56:55 +00:00
Harmen Stoppels
db08ce6105 Bump cmake (#25183) 2021-08-02 08:22:08 -07:00
Harmen Stoppels
ac8521e9b3 Do not issue a warning for a missing source id when installing from local sources (#24960) 2021-08-02 14:17:41 +02:00
Mikael Simberg
d628e3ba5c Add maintainers for gperftools (#25156) 2021-08-02 13:46:38 +02:00
loulawrence
be3e6a0e9b document config option "url_fetch_method" (#24638)
- Change config from the undocumented `use_curl: true/false` to `url_fetch_method: urllib/curl`.
- Documentation of `url_fetch_method` in `defaults/config.yaml`
- Default fetch option explicitly set to `urllib` for users who may not have curl on their system

To upgrade from `use_curl` to `url_fetch_method`, run `spack config update config`
2021-08-02 10:30:25 +02:00
Wouter Deconinck
cd8f7d844d [imlib2] depends_on pkg-config, type = build; and new versions (#25030) 2021-08-02 09:53:19 +02:00
s1913388
2a20943f9b Optimised Cloverleaf3D (#24920) 2021-08-02 09:40:43 +02:00
Sreenivasa Murthy Kolam
437c1e438e add MIOPEN_AMDGCN_ASSEMBLER to cmake args (#25159) 2021-08-02 09:38:34 +02:00
Weiqun Zhang
55e218649a amrex: 21.08 (#25175) 2021-08-01 15:47:48 -07:00
Erik Schnetter
f9703c1c9f kadath: New package (#25162)
* kadath: New package

* Update var/spack/repos/builtin/packages/kadath/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/kadath/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/kadath/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/kadath/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* kadath: Add description to MPI variant

* kadath: Add empty line

* kadath: Add variant "codes=none" to avoid empty default

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-08-01 19:22:07 +00:00
Iman Hosseini
861abb512e laghos: add variant with compiler optimization (#24910)
* add variant with compiler optimization

Update package.py to include a variant with compiler optimization, benchmarked at the A-HUG hackathon to improve major kernel time by roughly 3%.

* fix style

* Update var/spack/repos/builtin/packages/laghos/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-08-01 18:58:24 +00:00
Jen Herting
8867827a89 [py-asteval] added version 0.9.25 (#25107)
* [py-asteval] added version 0.9.25

* [py-asteval] 0.9.25 requires py-setuptools-scm
2021-08-01 11:43:58 -05:00
iarspider
a9bc118031 [giflib] Set LIBVER and LIBMAJOR when installing (#25173)
* Set LIBVER and LIBMAJOR

* Typo fix

* Fix 2

* Fix #3
2021-08-01 11:37:31 -05:00
Todd Gamblin
ab5954520f spack diff: make output order deterministic (#25169)
The output order for `spack diff` is nondeterministic for larger diffs -- if you
run it several times, it will not put the fields in the spec in the same order on
successive invocations.

This makes a few fixes to `spack diff`:

- [x] Implement the change discussed in https://github.com/spack/spack/pull/22283#discussion_r598337448
      to make `AspFunction` comparable in and of itself and to eliminate the need for `to_tuple()`

- [x] Sort the lists of diff properties so that the output is always in the same order.

- [x] Make the output for different fields the same as what we use in the solver. Previously, we
      would use `Type(value)` for non-string values and `value` for strings.  Now we just use
      the value.  So the output looks a little cleaner:

      ```
      == Old ==========================        == New ====================
      @@ node_target @@                        @@ node_target @@
      -  gdbm Target(x86_64)                   -  gdbm x86_64
      +  zlib Target(skylake)                  +  zlib skylake
      @@ variant_value @@                      @@ variant_value @@
      -  ncurses symlinks bool(False)          -  ncurses symlinks False
      +  zlib optimize bool(True)              +  zlib optimize True
      @@ version @@                            @@ version @@
      -  gdbm Version(1.18.1)                  -  gdbm 1.18.1
      +  zlib Version(1.2.11)                  +  zlib 1.2.11
      @@ node_os @@                            @@ node_os @@
      -  gdbm catalina                         -  gdbm catalina
      +  zlib catalina                         +  zlib catalina
      ```

I suppose if we want to use `repr()` in the output we could do that and could be
consistent but we don't do that elsewhere -- the types of things in Specs are
all stringifiable so the string and the name of the attribute (`version`, `node_os`,
etc.) are sufficient to know what they are.
2021-08-01 05:15:33 +00:00
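A rough sketch, with invented names, of the two ingredients above: make fact-like objects comparable so lists of them can be sorted, and stringify values so the ordering (and the printed output) stays stable across runs.

```
import functools


@functools.total_ordering
class FactLike:
    """Invented stand-in for an ASP fact: comparable, so lists of facts sort stably."""

    def __init__(self, name, *args):
        self.name = name
        self.args = args

    def _key(self):
        # Stringify every field so mixed value types (versions, bools, targets)
        # compare consistently from run to run.
        return (self.name,) + tuple(str(a) for a in self.args)

    def __eq__(self, other):
        return self._key() == other._key()

    def __lt__(self, other):
        return self._key() < other._key()

    def __repr__(self):
        return "{0}({1})".format(self.name, " ".join(str(a) for a in self.args))


facts = [
    FactLike("version", "zlib", "1.2.11"),
    FactLike("node_os", "gdbm", "catalina"),
    FactLike("node_target", "zlib", "skylake"),
]
print(sorted(facts))  # identical order on every invocation
```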
Erik Schnetter
1e708bdb45 lorene: Install only executables, not unrelated files (#25148)
* lorene: Install only executables, not unrelated files in the same directory

* lorene: Don't determine compile dependencies

The current way doesn't work (cpp misses C++ include paths), and we don't need dependencies anyway.

* lorene: Correct BLAS library names

* lorene: Remove comment
2021-07-31 21:11:34 -05:00
Garth N. Wells
60eef9c0de Add missing xtl dependency for fenics-basix and py-fenics-basix (#25151) 2021-07-31 21:10:24 -05:00
Erik Schnetter
b5f587bbe0 libjpeg-turbo: New version 2.1.0 (#25153) 2021-07-31 20:59:05 -05:00
gpotter2
b1abfd3ff6 Use the new cool Github templates (#25118)
* Use the new cool github templates

* Add the "mention maintainers" clause

* Fix broken HTML tag

* Minor improvements, missing filenames
2021-07-31 20:44:20 -05:00
Jen Herting
b4c6c11e68 [py-pyarrow] added version 3.0.0 and 4.0.1 (#25161)
* [py-pyarrow] added version 4.0.1

* [py-pyarrow] added version 3.0.0

* [py-pyarrow] updated dependencies for newer versions
2021-07-30 21:16:49 +00:00
Jen Herting
886e94d0ee [arrow] added versions 3.0.0 and 4.0.1 (#25160)
* [arrow] added version 4.0.1

* [arrow] added version 3.0.0
2021-07-30 20:38:23 +00:00
Scott Wittenburg
de88d2c7cc CI: capture stdout/stderr output to artifact files (#24401)
Gitlab truncates job trace output (even the complete raw output) at 4MB,
so this change captures it to a file under "user_data" artifacts as well,
to make sure we can debug output from the end of the rebuild job.
2021-07-30 13:24:03 -06:00
Scott Wittenburg
f591e9788d pipelines: Store details about specs broken on develop (#24637)
When a spec fails to build on `develop`, instead of storing an empty file as the entry in the broken specs list, this change stores the full spec yaml as well as links to the failing pipeline and job.
2021-07-30 09:11:00 -06:00
Ryan Marcellino
3df1d9062e add new version of py-dvc (#25152)
Co-authored-by: Cloud User <marcryan@ryanmarcelli001.hzterscemazurawp3xenxzahla.bx.internal.cloudapp.net>
2021-07-30 15:01:56 +00:00
Sebastian Ehlert
771e73dfa4 dftd4: add v3.2.0, v3.1.0 and v3.0.0 (#25145) 2021-07-30 14:56:40 +00:00
Ryan Marcellino
d02d683126 py-fsspec: add v0.9.0 (#25133)
Co-authored-by: Cloud User <marcryan@ryanmarcelli001.hzterscemazurawp3xenxzahla.bx.internal.cloudapp.net>
2021-07-30 12:16:22 +02:00
Ryan Marcellino
3408440991 py-dulwich: add v0.20.21 (#25132)
Co-authored-by: Cloud User <marcryan@ryanmarcelli001.hzterscemazurawp3xenxzahla.bx.internal.cloudapp.net>
2021-07-30 12:15:25 +02:00
Ryan Marcellino
3baac2faac py-diskcache: add v5.2.1 (#25131)
Co-authored-by: Cloud User <marcryan@ryanmarcelli001.hzterscemazurawp3xenxzahla.bx.internal.cloudapp.net>
2021-07-30 12:14:58 +02:00
Cyrus Harrison
69ce54b86a fides: add new package (#25128) 2021-07-30 12:14:16 +02:00
Olli Lupton
84613da90a Add C-Reduce and dependencies. (#25109) 2021-07-30 09:59:06 +00:00
Ryan Marcellino
4b89f6a90b py-rich: add v10.0.0 (#25134)
Co-authored-by: Cloud User <marcryan@ryanmarcelli001.hzterscemazurawp3xenxzahla.bx.internal.cloudapp.net>
2021-07-30 11:37:07 +02:00
Ryan Marcellino
c26f328e1a py-psutil: add v5.8.0 (#25135)
Co-authored-by: Cloud User <marcryan@ryanmarcelli001.hzterscemazurawp3xenxzahla.bx.internal.cloudapp.net>
2021-07-30 11:36:37 +02:00
Ryan Marcellino
d969320ba1 py-pygit2: add v1.6.0 (#25136)
Co-authored-by: Cloud User <marcryan@ryanmarcelli001.hzterscemazurawp3xenxzahla.bx.internal.cloudapp.net>
2021-07-30 11:35:46 +02:00
downloadico
72acc54d84 octopus: add cuda variant (#25126) 2021-07-30 11:33:42 +02:00
Garth N. Wells
711ed17606 FEniCSx: updated dependencies (#25110)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2021-07-30 11:32:28 +02:00
Swann Perarnau
c1de2e926d aml: update website, repository, and maintainer (#25140)
AML moved its repository and website during spring 2021.

Signed-off-by: Swann Perarnau <swann@anl.gov>
2021-07-30 11:22:20 +02:00
Axel Huebl
189ff91f25 openPMD-api: add v0.14.0 (#25142)
Add the latest release.
2021-07-30 11:14:19 +02:00
AMD Toolchain Support
bbfaf4e816 quantum-espresso: update patch for AOCC support (#25144)
Co-authored-by: mohan002 <mohbabul@amd.com>
2021-07-30 11:07:57 +02:00
Vanessasaurus
54e8e19a60 adding spack diff command (#22283)
A `spack diff` takes two specs, uses spack.solver.asp.SpackSolverSetup to generate
lists of facts about each (e.g., nodes, variants, etc.), and then takes a set difference between the
two to show the user the differences.

Example output:

    $ spack diff python@2.7.8 python@3.8.11
     ==> Warning: This interface is subject to change.

     --- python@2.7.8/tsxdi6gl4lihp25qrm4d6nys3nypufbf
     +++ python@3.8.11/yjtseru4nbpllbaxb46q7wfkyxbuvzxx
     @@ variant_value @@
     -  python patches a8c52415a8b03c0e5f28b5d52ae498f7a7e602007db2b9554df28cd5685839b8
     +  python patches 0d98e93189bc278fbc37a50ed7f183bd8aaf249a8e1670a465f0db6bb4f8cf87
     @@ version @@
     -  openssl Version(1.0.2u)
     +  openssl Version(1.1.1k)
     -  python Version(2.7.8)
     +  python Version(3.8.11)

Currently this uses diff-like output but we will attempt to improve on this in the future.

One use case for `spack diff` is whenever a user is in a disambiguation situation and cannot
remember how two different installs differ. The command can also output `--json` for
a more analysis-oriented use case where we want to save complete data with all
diffs and the intersection. However, the command is really intended for command-line
use, and we will likely have an analyzer better suited to saving data.

Signed-off-by: vsoch <vsoch@users.noreply.github.com>

Co-authored-by: vsoch <vsoch@users.noreply.github.com>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2021-07-30 00:08:38 -07:00
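A highly simplified sketch of that approach (the fact tuples and helper below are invented, not SpackSolverSetup's real output): flatten each spec into a set of hashable facts, then print the two set differences.

```
def facts(spec_summary):
    """Flatten a {package: {attribute: value}} summary into hashable fact tuples."""
    return {
        (pkg, attr, str(value))
        for pkg, attrs in spec_summary.items()
        for attr, value in attrs.items()
    }


a = facts({"python": {"version": "2.7.8"}, "openssl": {"version": "1.0.2u"}})
b = facts({"python": {"version": "3.8.11"}, "openssl": {"version": "1.1.1k"}})

for pkg, attr, value in sorted(a - b):
    print("-  {0} {1} {2}".format(pkg, attr, value))  # only in the first spec
for pkg, attr, value in sorted(b - a):
    print("+  {0} {1} {2}".format(pkg, attr, value))  # only in the second spec
```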
Zack Galbreath
e8f284bf52 ci: automatically prune the broken-specs list (#24809)
When a develop pipeline successfully finishes a `spack install`, check if
the spec that was just built is on the broken-specs list. If so, remove it.
2021-07-29 13:47:10 -06:00
Zack Galbreath
d7771f190f Catch ConnectionError from CDash reporter (#24818)
* Catch ConnectionError from CDash reporter

Catch ConnectionError when attempting to upload the results of `spack install`
to CDash. This follows in the spirit of #24299. We do not want `spack install`
to exit with a non-zero status when something goes wrong while attempting to
report results to CDash.

* Catch HTTP Error 400 (Bad Request) in relate_cdash_builds()
2021-07-29 13:46:17 -06:00
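A stripped-down illustration of the behavior described above, using plain urllib rather than Spack's actual reporter code: reporting failures are logged and swallowed instead of failing the install.

```
import urllib.error
import urllib.request


def report_to_cdash(url, payload):
    """Upload results, but never let a reporting failure break `spack install`."""
    try:
        urllib.request.urlopen(url, data=payload, timeout=30)
    except (ConnectionError, urllib.error.URLError) as e:
        # URLError also covers HTTPError, e.g. an HTTP 400 (Bad Request).
        print("warning: could not report results to CDash: {0}".format(e))
```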
sknigh
095f327f32 Update sst-elements and dependency packages (#25041)
* sst-elements: add optional support for flashdimmsim, dramsim3 and
  add new packages for each
* sst-dumpi: add version 7.1.0
* sst-core: autotools dependencies are required for all versions
* new package: dtc
* add error message redirect for +dumpi, otf, and otf2: these are not
  currently supported
2021-07-29 11:48:41 -07:00
Itaru Kitayama
a904418270 nest: add v3.0 and v2.20.1, removed previous versions (#24328)
Co-authored-by: Itaru Kitayama <itaru.kitayama@riken.jp>
2021-07-29 10:34:50 +02:00
Massimiliano Culpo
b42b0cd45a Move build tests from GA to Gitlab (#25120)
Modifications:

- Remove the "build tests" workflow from GitHub Actions
- Setup a similar e2e test on Gitlab

In this way we'll reduce load on GitHub Actions workflows and for e2e tests will
benefit from the buildcache reuse granted by pipelines.
2021-07-29 09:08:32 +02:00
Brian Van Essen
adb507bdd9 Added support for using the Cray LibSci BLAS/LAPACK/ScaLAPACK library. (#25124) 2021-07-28 15:14:58 -07:00
Cyrus Harrison
db00cf24c0 add parmetis variant to conduit (#25127) 2021-07-28 15:09:19 -07:00
Satish Balay
c114cf019d petsc: update config option logic for locating dependencies (#25074)
Primarily use --with-package-include, --with-package-lib options
(vs. --with-package-dir)
2021-07-28 14:09:54 -07:00
Chris White
7ea94d1103 add more directories to implicit link exclusion 2021-07-28 14:01:16 -07:00
iarspider
ace3753076 re2: add v2021-06-01 and 'shared' variant (#25121)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2021-07-28 19:07:12 +02:00
Michael Kuhn
d6cbaee16b gcc: add v11.2.0 (#25125) 2021-07-28 17:02:40 +00:00
Rémi Lacroix
0d5e1c7db9 libxc: add v5.1.5 (#25123) 2021-07-28 18:42:05 +02:00
Houjun Tang
69464bf36f pdc: add new package (#24762)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2021-07-28 18:31:25 +02:00
Dylan Simon
e3096e94b1 poppler: drop splash patch for 21.07.0 (#25032)
ENABLE_SPLASH configuration has been removed entirely after 21.06 so
patch is no longer necessary after #24931.  (Versions between 0.90.1 and
21.06 will likely still need a patch, and while it's not clear if this
patch is the right one, it seems better to leave something in.)
2021-07-28 18:23:56 +02:00
Tim Moon
c9327649c0 nvshmem: set env variables instead of appending (#25095) 2021-07-28 16:07:24 +00:00
Tom Stitt
68f696af64 glvis: add v4.0 (#25045) 2021-07-28 17:51:53 +02:00
downloadico
737936d02c octopus: add v10.5 (#25068) 2021-07-28 17:47:42 +02:00
Ethan Stam
a7ed20cb92 ParaView: disable VTK_MODULE_USE_EXTERNAL_ParaView_vtkcatalyst to avoid looking for external Catalyst_DIR (#25061) 2021-07-28 17:30:21 +02:00
Glenn Johnson
e29168ad02 gurobi: add v9.1.2 and extend python (#25064)
- add version 9.1.2
- set a license file
- set the license environment variable
- remove the download and license information out of the description so
  it does not show up in environment modules
- extend python and set python version constraints
- build gurobipy to be used in any compatible python, used for more
  extensive computations than the gurobi shell
- remove preexisting PYTHONPATH from gurobi.sh as the shell uses a
  built-in python, which will likely be different from "system" python
- add maintainer
2021-07-28 17:29:32 +02:00
gpotter2
be2e224e75 vtk-m: add conflict with gcc version < 5.0 (#25117) 2021-07-28 16:24:27 +02:00
iarspider
4d96f2668b py-absl-py: add v0.11.0, v0.12.0, v0.13.0 (#25122) 2021-07-28 15:36:39 +02:00
Sreenivasa Murthy Kolam
ac2444f1d5 fix roctracer references in hip and cleanup code in rocprofiler-dev recipe (#24994)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2021-07-28 10:36:35 +02:00
Todd Gamblin
6104c31556 bugfix: be careful about GITHUB_BASE_REF in spack style
`spack style` previously used a Travis CI variable to figure out
what the base branch of a PR was, and this was apparently also set
on `develop`.  We switched to `GITHUB_BASE_REF` to support GitHub
Actions, but it looks like this is set to `""` in pushes to develop,
so `spack style` breaks there.

This PR does two things:

- [x] Remove `GITHUB_BASE_REF` knowledge from `spack style` entirely

- [x] Handle `GITHUB_BASE_REF` in style scripts instead, and explicitly
      pass the base ref if it is present, but don't otherwise.

This makes `spack style` *not* dependent on the environment and fixes
handling of the base branch in the right place.
2021-07-27 17:57:17 -07:00
Richarda Butler
af468235e2 AML: Add E4S testsuite stand alone test (#23874) 2021-07-27 17:30:34 -07:00
Todd Gamblin
c8efec0295 spack style: add --root option (#25085)
This adds a `--root` option so that `spack style` can check style for
a spack instance other than its own.

We also change the inner workings of `spack style` so that `--config FILE`
(and similar options for the various tools) options are used. This ensures
that when `spack style` runs, it always uses the config from the running spack,
and does *not* pick up configuration from the external root.

- [x] add `--root` option to `spack style`
- [x] add `--config` (or similar) option when invoking style tools
- [x] add a test that verifies we can check an external instance
2021-07-27 15:09:17 -06:00
Jen Herting
e5bbb6e5b4 [py-lmfit] added version 1.0.2 (#25108)
* [py-lmfit] fixed py-asteval dependency requirements

* [py-lmfit] added version 1.0.2

* [py-lmfit] flake8

* [py-lmfit] 1.0.2 reqires python 3.6

* [py-lmfit] removed newer dependency requirements to be in line with setup.py not requirements.txt
2021-07-27 16:59:45 +00:00
Seth R. Johnson
0a41d4ebb8 pbs: new virtual package (#24568)
* pbs: new virtual package

Some of our clusters have an older installation of
libtorque and tm.h that are *not* from OpenPBS. Using the current
openpbs dependency for openmpi causes concretization errors due to
restrictions on older python and hwloc requirements that don't apply,
even with an external non-buildable installation.
The new 'torque' bundle package allows users to point to that external
installation without problems.

Detailed description of torque by Sergey Kosukhin <skosukhin@gmail.com>
2021-07-27 16:37:00 +00:00
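A sketch of what such a bundle package can look like (the version and URL below are placeholders, and the real recipe may differ): it builds nothing itself and exists mainly so an external installation can satisfy the `pbs` virtual.

```
from spack import *


class Torque(BundlePackage):
    """Placeholder recipe: provides the 'pbs' virtual without building anything,
    so sites can point it at an existing libtorque/tm.h installation."""

    homepage = "https://example.com/torque"  # placeholder

    version("3.0.0")

    provides("pbs")
```

An external, non-buildable entry for it in packages.yaml then lets the concretizer use the system installation for anything that needs `pbs`.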
Timo Heister
ca260a3d63 aspect: add v2.3.0 (#25100) 2021-07-27 14:55:29 +02:00
Adam J. Stewart
4eaa5a2635 py-sphinx: add versions up to v4.1.2 (#25092) 2021-07-27 14:54:46 +02:00
Garth N. Wells
aeacc2ff92 pybind11: add v2.7.0 (#25103) 2021-07-27 14:53:02 +02:00
Garth N. Wells
b69c4a66e7 Add xtensor 0.23.10 and xsimd 07.5.0 version (#25105) 2021-07-27 14:52:36 +02:00
Adam J. Stewart
de3fa5556a serf: add missing libuuid dependency (#25098) 2021-07-27 14:50:59 +02:00
Satish Balay
06a292290e openmpi: fix cuda dependency (#25101) 2021-07-27 06:41:15 -04:00
Michael Kuhn
1454935edc libfuse: fix typo (#25104)
This caused both static and shared libraries to be disabled.
2021-07-27 10:28:21 +02:00
Adam J. Stewart
520a465190 Documentation does not build with Sphinx 4.1.2 2021-07-26 13:46:27 -07:00
Adam J. Stewart
ab39f548dc Confirm that the docstring is the issue 2021-07-26 13:46:27 -07:00
Adam J. Stewart
26c3df20f1 Docs: attempt to fix doc tests for sphinx 4.1.2 2021-07-26 13:46:27 -07:00
Adam J. Stewart
6472ee8c76 py-sphinxcontrib-serializinghtml: add new version (#25091) 2021-07-26 15:03:52 -04:00
Adam J. Stewart
8a8aa16f1b py-sphinxcontrib-htmlhelp: add new version (#25090) 2021-07-26 15:03:38 -04:00
Rémi Lacroix
43c135e3ce n2p2 package: Add version 2.1.4 (#25031) 2021-07-26 11:16:13 -07:00
Dylan Simon
1c350854f8 intel-oneapi: fix parallel installer errors (#24911)
Intel oneAPI installs maintain a lock file in XDG_RUNTIME_DIR,
which by default exists in /tmp (and is shared by all component
installs). This prevented multiple oneAPI components from being
installed in parallel. This commit sets XDG_RUNTIME_DIR to exist
within Spack's installation Stage, which allows multiple components
to be installed at the same time.
2021-07-26 10:39:35 -07:00
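Roughly, the fix amounts to something like the following hypothetical sketch (plain Python rather than the actual package code): point XDG_RUNTIME_DIR at a directory private to the current build stage before launching the installer.

```
import os
import tempfile

# Stand-in for this package's private build stage directory.
stage_path = tempfile.mkdtemp(prefix="oneapi-component-stage-")

# Each component install now gets its own runtime dir, and therefore its own
# installer lock file, instead of the shared /tmp default.
runtime_dir = os.path.join(stage_path, "xdg-runtime")
os.makedirs(runtime_dir, exist_ok=True)
os.environ["XDG_RUNTIME_DIR"] = runtime_dir
```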
tilne
98549ddbe5 aws-parallelcluster: add v2.11.1 (#25089)
* aws-parallelcluster: update maintainers list

Signed-off-by: Tim Lane <tilne@amazon.com>

* aws-parallelcluster: add v2.11.1

Signed-off-by: Tim Lane <tilne@amazon.com>
2021-07-26 10:37:15 -05:00
Thomas Madlener
4a19741a36 yoda: only depend on root if explicitly desired (#25087) 2021-07-26 13:50:00 +02:00
Erik Schnetter
cef3a2a6ee asdf-cxx: require a particular version of yaml-cpp (#24988) 2021-07-26 10:28:02 +02:00
Harmen Stoppels
c912911d0e roc-tracer: remove py-setuptools since it is not used (#25010) 2021-07-26 10:27:33 +02:00
Tamara Dahlgren
6d30299d80 eigenexa: update stand-alone tests to use test stage work directory (#24129) 2021-07-26 10:25:36 +02:00
Tamara Dahlgren
cb87271a01 genesis: update stand-alone tests to use test stage work directory (#24193) 2021-07-26 10:24:29 +02:00
Rémi Lacroix
9edd281044 genesis: add v1.6.0. (#25055) 2021-07-26 10:20:25 +02:00
Kevin Harms
e666a3b366 Smoke test for darshan-runtime, builds a test code, runs it and check… (#25049) 2021-07-26 10:04:39 +02:00
Adam J. Stewart
eef514a1dc py-pandas: add v1.3.1 (#25076) 2021-07-26 10:03:13 +02:00
Adam J. Stewart
9e01fcf0de py-antlr4-python3-runtime: add v4.8 (#25078) 2021-07-26 10:02:19 +02:00
Adam J. Stewart
696f9458c2 py-torchmetrics: add new version (#25081) 2021-07-26 09:59:51 +02:00
Manuela Kuhn
de6d3ef1ee rstudio: add new package (#24647)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2021-07-26 09:57:56 +02:00
Adam J. Stewart
24048b3545 py-pyprecice: simplify package (#25077) 2021-07-26 09:55:51 +02:00
Adam J. Stewart
5f44b0ad48 py-omegaconf: add missing java dependency (#25080) 2021-07-26 09:55:13 +02:00
Silvio Traversaro
f68b91defe libdc1394: add new package (#25079) 2021-07-26 09:54:19 +02:00
romerojosh
7339f2d476 Fix LBANN and related packages CMake CUDA arch flag (#25062) 2021-07-26 09:16:38 +02:00
Glenn Johnson
63e04ce220 Fix tesseract package (#24304)
This PR fixes the tesseract package
- add missing dependencies
- build documentation
- build and install java component
- build and install training component
2021-07-26 07:49:58 +02:00
Alec Scott
d6432e9718 bedops: add v2.4.40 (#25065) 2021-07-26 07:48:11 +02:00
Alec Scott
1cf43cd2fc picard: add versions up to v2.25.7 (#25063) 2021-07-26 07:47:41 +02:00
Matthieu Dorier
ed0c3233db lua-sol2: added new package (#25067) 2021-07-26 07:47:04 +02:00
Alec Scott
794931f0d7 Rclone: add v1.56.0 (#25066) 2021-07-25 14:09:07 +02:00
Michael Kuhn
0d54bd68c1 freetype: add v2.11.0 (#25075) 2021-07-25 14:01:38 +02:00
vsoch
4208cf66be spack style: automatically bootstrap dependencies
This uses our bootstrapping logic to automatically install dependencies for
`spack style`. Users should no longer have to pre-install all of the tools
(`isort`, `mypy`, `black`, `flake8`). The command will do it for them.

- [x] add logic to bootstrap specs with specific version requirements in `spack style`
- [x] remove style tools from CI requirements (to ensure we test bootstrapping)
- [x] rework dependencies for `mypy` and `py-typed-ast`
      - `py-typed-ast` needs to be a link dependency
      - it needs to be at 1.4.1 or higher to work with python 3.9

Signed-off-by: vsoch <vsoch@users.noreply.github.com>
2021-07-24 07:07:35 -07:00
Massimiliano Culpo
a30e6c6158 Bump codecov/action to v2.0.2 (#24983)
* build(deps): bump codecov/codecov-action from 1 to 2.0.1

Bumps [codecov/codecov-action](https://github.com/codecov/codecov-action) from 1 to 2.0.1.
- [Release notes](https://github.com/codecov/codecov-action/releases)
- [Changelog](https://github.com/codecov/codecov-action/blob/master/CHANGELOG.md)
- [Commits](https://github.com/codecov/codecov-action/compare/v1...v2.0.1)

---
updated-dependencies:
- dependency-name: codecov/codecov-action
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>

* Update arguments to codecov action

* Try to delete the symbolic link to root folder

Hopefully this should get rid of the ELOOP error

* Delete also for shell tests and MacOS

* Bump to v2.0.2

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2021-07-24 07:23:56 -04:00
Paul Kuberry
bb20cadd91 Only add hwloc to tpl/dep list for certain versions of Trilinos (#25071) 2021-07-24 07:20:51 -04:00
iarspider
0beb35e426 Added versions of py-grpcio upto 1.39.0 (#25056)
Tested only `py-grpcio@1.39.0 ^grpcio@1.39.0`
2021-07-23 21:42:32 -05:00
Filippo Spiga
25c09e7a56 nvhpc: add v21.7 (#25040) 2021-07-23 09:10:50 +00:00
iarspider
e083d32f10 Add grpc upto 1.39.0 and update dependencies (#25037) 2021-07-23 08:58:56 +00:00
Dylan Simon
7b0d869c3c pgplot: fix with gcc<10 (#25042) 2021-07-23 01:34:26 -07:00
Tom Stitt
dc3e1d9a40 sdl2: add v2.0.14 (#25046) 2021-07-23 10:24:38 +02:00
Caleb Robinson
0c65433dd3 py-omegaconf: add new package (#25052)
* Adding package for omegaconf

* Update var/spack/repos/builtin/packages/py-omegaconf/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Changing py-omegaconf to use github source URL instead of pypi

* Style fix

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-07-22 21:37:36 -07:00
Satish Balay
b89286d0de amrex: add variant plotfile_tools (#25038) 2021-07-22 22:29:58 -04:00
Filippo Spiga
accdf4445d Adding CUDA SDK 11.4.0 (#24609)
* Adding CUDA SDK 11.4.0

* Fixing import order (finger crossed)

* Fixing import order (thanks alalazo for the tips)

* Restored URLs for 11.4.0
2021-07-22 16:57:09 -07:00
Kevin Harms
0825463904 Add smoke test for darshan-util (#25016) 2021-07-22 16:46:09 -07:00
wspear
91b4d974f1 Add test operation to tau package (#25036)
Copy over the general-purpose mm matmult test case. On test run, build and execute it (with MPI as needed) and process the profile output.
2021-07-22 10:09:18 -07:00
Seth R. Johnson
8735d0a281 Trilinos: enable x11 when +exodus (#25033)
* trilinos: rearrange dependencies

* trilinos: refactor tpl enables and add libx11 for +exodus

Fixes #25028
2021-07-22 12:27:29 -04:00
Richarda Butler
68dbca64e7 Archer: Add E4S testsuite stand alone test (#24729) 2021-07-22 09:26:14 -07:00
Adam J. Stewart
0b6a0dd7fa py-torch: fix build on blue waters (#25026) 2021-07-22 07:13:36 -07:00
Adam J. Stewart
4e0f97bee3 PROJ: set PROJ_LIB env var (#25029) 2021-07-22 14:44:59 +02:00
natshineman
c0b6d42b23 Update MVAPICH2 Maintainers and GDR Dependencies (#25027)
Co-authored-by: Nick Contini <contini.26@buckeyemail.osu.edu>
Co-authored-by: Nat Shineman <shineman.5@buckeyemail.osu.edu>
2021-07-22 05:37:51 -07:00
Adam J. Stewart
537d316311 py-tensorboard-plugin-wit: fix build on Ubuntu (#25025) 2021-07-22 05:31:40 -07:00
albestro
7ad72de0d3 boost: conflict with GCC on macOS (#24917)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2021-07-22 09:49:20 +02:00
Iman Hosseini
8cfb0a0d52 nut: reflect conflict with nvhpc (#25023)
'random123' is a template library and cannot be compiled with nvhpc

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2021-07-22 08:34:06 +02:00
Ethan Stam
7011608be0 ParaView: Add cli11 dependency (#24732) 2021-07-21 16:51:49 -07:00
Harmen Stoppels
8e59c847dd fix rocprofiler includes (#25009) 2021-07-21 16:21:36 -07:00
Robert Pavel
f12fccce65 flecsi: package updates (#24986)
Worked with flecsi developers to tighten, relax, and clarify
constraints and better understand how the flecsi project uses
legion. In the process, discovered that flecsi@1.4 cannot be
built with legion without heavy changes/reverts to the legion
and gasnet spackages.

Also, most importantly, fixed branding as to how flecsi is spelled
2021-07-21 22:43:51 +02:00
Adam J. Stewart
497dfee320 py-segmentation-models-pytorch: add new package (#25022) 2021-07-21 20:14:41 +00:00
Richarda Butler
dfc0aa86ee Bolt: Add E4S testsuite stand alone test (#24846) 2021-07-21 19:40:14 +00:00
Matthieu Dorier
2db3459e6b libpmemobj-cpp: fixing package not building because of valgrind flag (#24967) 2021-07-21 21:38:32 +02:00
Matthieu Dorier
feb0556664 pmdk: added versions up to 1.11.0 (#24972) 2021-07-21 21:36:32 +02:00
Adam J. Stewart
1f59163e8c py-efficientnet-pytorch: add new package (#25021) 2021-07-21 19:33:45 +00:00
Adam J. Stewart
e1b2ab0105 py-pretrainedmodels: add new package (#25020) 2021-07-21 19:30:59 +00:00
Ryan Mast
147c4fb96d helics: added "main" version (#24987) 2021-07-21 21:28:32 +02:00
Adam J. Stewart
df8e51d6e5 py-timm: add new package (#25019) 2021-07-21 19:25:04 +00:00
Tom Payerle
5a49264e19 metis: suppress warnings causing issues for %nvhpc builds (#25014)
We add compilation flags when using %nvhpc to suppress warnings
(which, due to the global -Werror flag in the build, get promoted to
errors) for the following:
Diagnostic 111: statement is unreachable
Diagnostic 177: variable "foo" was declared but never referenced
Diagnostic 188: enumerated type mixed with another type
Diagnostic 550: variable "foo" was set but never used
2021-07-21 21:24:51 +02:00
Martin Pokorny
4a4d1759f5 Kokkos: allow c++17 starting with CUDA v11.0.0 (#25018) 2021-07-21 21:19:49 +02:00
Jonathan R. Madsen
80592613ad timemory package: add versions including 3.2.3; update options (#24825)
* add variants: python hatchet/line-profiler support and likwid
  nvmon support
* removed ompt_standalone/ompt_llvm variants
2021-07-21 11:54:04 -07:00
Massimiliano Culpo
a68abc15c5 Fix bootstrap from sources
#24095 introduced a couple of bugs, which are fixed here:

1. The module path is computed incorrectly for bootstrapped clingo
2. We remove too many paths from `sys.path` in case of failures
2021-07-21 07:22:07 -07:00
Manuela Kuhn
9237a9f244 py-python-xmp-toolkit: add new package (#25008) 2021-07-21 06:52:38 -07:00
Matthieu Dorier
7a458be3eb tkrzw: added more versions and compression variant (#24953) 2021-07-21 13:32:52 +00:00
Manuela Kuhn
9c96cc578e py-soupsieve: add 1.9.6 and 2.2.1 (#24973) 2021-07-21 08:31:14 -05:00
Manuela Kuhn
1cf52de47a py-requests-ftp: add new package (#24975) 2021-07-21 08:30:50 -05:00
Manuela Kuhn
8a038ef64c py-duecredit: add new package (#25006) 2021-07-21 13:08:12 +00:00
Harmen Stoppels
bb985e40dd z3: disable python binding by default (#25007)
z3 is a dependency of llvm and llvm-amdgpu, and when z3 python bindings
are enabled it depends on py-setuptools as a run dependency. That's
fine, except that py-setuptools now influences the hash of
llvm/llvm-amdgpu, which can be very annoying when another package
restricts the py-setuptools version -- you'll end up recompiling llvm
for no good reason :(.
2021-07-21 14:44:18 +02:00
Erik Schnetter
1c06ec0c11 yaml-cpp: add v0.7.0 (#24996) 2021-07-21 14:21:27 +02:00
Brian Van Essen
4e885b4358 lbann: update darwin build (#24998)
* Updated the lbann package to not enable OpenMP in the BLAS package when
working on Darwin systems.

* Add the Sphinx RTD theme as an explicit dependency when building documentation
2021-07-21 14:20:53 +02:00
Erik Schnetter
9b48827a10 pgplot: add libs method (#24999) 2021-07-21 14:19:02 +02:00
Erik Schnetter
7c5f48c99b lorene: Use correct library names of dependencies (#25000)
Query `spec[...].libs` to find out library flags and names of dependencies.

Also define `libs` property.
2021-07-21 14:18:36 +02:00
Erik Schnetter
1ae760ef31 ssht: add v1.4.0 (#25001) 2021-07-21 14:17:48 +02:00
Robert Underwood
04c5582eb2 sz: add v2.1.12 (#25004) 2021-07-21 14:17:25 +02:00
Manuela Kuhn
78a5e98721 py-citeproc-py: add new package (#25005) 2021-07-21 14:12:47 +02:00
Manuela Kuhn
669769c090 exempi: add new package (#24982) 2021-07-21 14:11:21 +02:00
Erik Schnetter
cb53a9cc14 gperftools: New version 2.9.1 (#24997) 2021-07-21 11:02:48 +02:00
Tamara Dahlgren
ad16eeb9af Ascent: removed redundant cuda variant (#24576) 2021-07-21 10:58:38 +02:00
Chris White
f6d9a1876a add new blt version (#25003) 2021-07-21 00:46:34 -07:00
Manuela Kuhn
af806a8c1e py-rnc2rng: add new package (#24990) 2021-07-21 01:43:18 +00:00
Manuela Kuhn
5ec52fc3f8 py-flask-restful: add new package (#24875) 2021-07-20 20:38:17 -05:00
Manuela Kuhn
c543b86e81 py-httpretty: add new package (#24977) 2021-07-20 20:37:09 -05:00
Manuela Kuhn
3be54d4aab py-vcrpy: add new package (#24978) 2021-07-20 20:34:25 -05:00
Manuela Kuhn
93b694f973 py-mutagen: add new package (#24979) 2021-07-20 20:33:13 -05:00
Manuela Kuhn
30e559592a py-exifread: add new package (#24980) 2021-07-20 20:25:50 -05:00
Manuela Kuhn
d4f498db7c py-rply: add new package (#24989) 2021-07-20 20:24:18 -05:00
Manuela Kuhn
72fb3f768f py-pyperclip: add 1.8.2 and missing xclip dependency (#24970) 2021-07-20 20:22:14 -05:00
Manuela Kuhn
b46061aa42 py-beautifulsoup4: add 4.9.3 (#24974) 2021-07-20 20:18:59 -05:00
Satish Balay
3724c78a25 petsc4py,slepc4py: update maintainers 2021-07-20 13:50:55 -07:00
Seth R. Johnson
2fd24f8542 trilinos: fix boost variant/dependency error for minitensor (#24985) 2021-07-20 15:13:25 -04:00
Wouter Deconinck
c1567463b0 kassiopeia: new versions, updated cmake_args (#24964)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2021-07-20 16:51:05 +02:00
Manuela Kuhn
889ece85ed py-annexremote: add new package (#24958) 2021-07-20 07:58:54 -05:00
Manuela Kuhn
5d0c7d4ba2 py-pybids: add new package (#24956) 2021-07-20 07:58:26 -05:00
Marcus Boden
3408d22df8 New package: Mash (#24833)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2021-07-20 05:16:47 -07:00
Sreenivasa Murthy Kolam
aa9f560128 update hip-rocclr recipe and fix dangling pointer in rocm-smi-lib (#24603) 2021-07-20 05:13:36 -07:00
Harmen Stoppels
a2ebeb8e76 boost: Add version ranges for macOS GCC patch (#24969) 2021-07-20 07:30:32 -04:00
Olivier Cessenat
df10ffe20d New Package: visit-unv (#22904) 2021-07-20 13:17:40 +02:00
Mikael Simberg
08629d8fb4 Add HPX 1.7.0 (#24880) 2021-07-20 13:05:40 +02:00
Tim Gymnich
b3b01a47d2 Add Enzyme (#24830)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2021-07-20 12:45:26 +02:00
Erik Schnetter
d580c5506c boost: Allow building with GCC on macOS (#24895) 2021-07-20 11:50:51 +02:00
Gabriel Rockefeller
0bf5156caf global: add v6.6.7 (#24930) 2021-07-20 01:58:41 -07:00
Alec Scott
9abd77c517 poppler: add v21.07.0 (#24931) 2021-07-20 01:55:34 -07:00
Martin Köhler
887820ecb5 fenics: add HDF5_NO_FIND_PACKAGE_CONFIG_FILE to cmake opts in FEniCS (#24922)
In some cases FindHDF5.cmake returned a wrong value for the HDF5 library names and path. For example, it returns hdf5-shared as the library name without a search path and without checking whether this is really an existing shared library. By adding HDF5_NO_FIND_PACKAGE_CONFIG_FILE=True/ON to the CMake options, the FindHDF5 module does not rely on a properly installed hdf5-config.cmake and instead searches for the library and its paths itself. This results in a usable return value and FEniCS works afterwards.
2021-07-20 10:16:00 +02:00
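In a Spack CMake-based recipe, passing that option looks roughly like this hypothetical excerpt (not the full fenics package):

```
from spack import *


class Fenics(CMakePackage):
    # ...rest of the recipe omitted; this excerpt only shows the HDF5 workaround.

    def cmake_args(self):
        # Make CMake's FindHDF5 module probe for the real libraries and include
        # paths instead of trusting a possibly broken hdf5-config.cmake.
        return [self.define("HDF5_NO_FIND_PACKAGE_CONFIG_FILE", True)]
```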
Adam J. Stewart
55e247b407 py-pydocstyle: add new package (#24929) 2021-07-20 10:12:42 +02:00
Alec Scott
8ec0ef4c6d metall: add v0.15 (#24932) 2021-07-20 10:06:21 +02:00
Adam J. Stewart
f7be6f94ea rust: add spack external find support (#24939) 2021-07-20 10:03:46 +02:00
Chris Richardson
967743adc7 Fenicsx ecosystem: various updates (#24940)
* Updates for dependencies in main branch

* Add more depends

* Make CMake available at runtime for fenics-dolfinx

* Add maintainer

Co-authored-by: Garth N. Wells <gnw20@cam.ac.uk>
2021-07-20 10:03:02 +02:00
Sheng Di
55fe16991c zchecker: add v0.7 (#24942) 2021-07-20 10:01:54 +02:00
Adam J. Stewart
db909353e5 py-fsspec: add v2021.7.0 (#24943) 2021-07-20 10:01:23 +02:00
Adam J. Stewart
7ad7fa4da9 py-pytorch-lightning: add v1.3.8 (#24944) 2021-07-20 10:00:54 +02:00
Christoph Junghans
899e08a180 votca: add v2021.1 (#24949)
apply patch only to @2021
2021-07-20 09:59:01 +02:00
Adam J. Stewart
d511364a43 py-numpy: add v1.21.1 (#24950) 2021-07-20 09:57:37 +02:00
Bryan Herman
4d5f7b361e pvm: add fpic variant (#24954) 2021-07-20 09:54:32 +02:00
Satish Balay
19677c5ad1 petsc, petsc4py: add version 3.15.2 (#24961) 2021-07-20 09:50:05 +02:00
Satish Balay
2c87992506 petsc: pass in 'cuda_arch' to configure via --with-cuda-gencodearch [or CUDAFLAGS for older releases] (#24962) 2021-07-20 09:48:48 +02:00
Filip Sedlák
330507f329 ncbi-rmblastn: add build dependency on cpio (#24963)
Although `cpio` is present in many environments, it may not always be
available.

The failure to build this package can be reproduced in a fresh Docker
image `debian:10`.
2021-07-20 09:44:32 +02:00
Alec Scott
8080a5e5b2 Add Julia v1.6.2 (#24935) 2021-07-19 23:08:37 -05:00
Tiziano Müller
3a698112cc abinit: make libxml2 really optional, add optimization-flavor variant, fix build with ifort for atompaw (#24876) 2021-07-19 22:57:54 -05:00
Manuela Kuhn
b8b8450400 py-pygithub: add new package (#24957) 2021-07-19 19:49:34 -07:00
Tiziano Müller
d74b296752 libint: add debug/fma variants, fix tests for v2.6 (#24665) 2021-07-19 10:28:31 -06:00
Sreenivasa Murthy Kolam
7845939722 fix compile error with the correct python path (#24936) 2021-07-19 15:14:05 +02:00
Michael Kuhn
fdcd7f96e5 meson: add 0.59.0 and 0.58.1 (#24955) 2021-07-19 06:49:42 -06:00
Kai Torben Ohlhus
339c2290e7 openblas: add version 0.3.17 (#24941) 2021-07-19 13:46:14 +02:00
Harmen Stoppels
fc50e04b59 Fix S3 urlparse scheme error and add test (#24952) 2021-07-19 13:39:17 +02:00
Timothy Brown
1bf9c10f0c mpas-model: support Intel compiler (#24905) 2021-07-19 13:15:16 +02:00
shanedsnyder
feb229a5f9 darshan runtime,darshan-util: convert to autotool packages (#24906) 2021-07-19 13:14:32 +02:00
Ben Darwin
efd9884e83 minc-toolkit: allow building shared libs and enable by default (#24909) 2021-07-19 13:08:36 +02:00
Adam J. Stewart
846ab65cc0 py-pythran: add new version (#24900) 2021-07-19 12:57:21 +02:00
Adam J. Stewart
29c7542c48 py-pyproj: add new versions (#24893) 2021-07-19 12:52:02 +02:00
Adam J. Stewart
3f9a5eda16 PROJ: add v8.x (#24892) 2021-07-19 12:51:17 +02:00
Ricardo Jesus
fefedbe653 Remove -Wmissing-format-attribute if compiling with nvhpc (#24873) 2021-07-19 12:25:13 +02:00
Adam J. Stewart
be90bdc355 py-rtree: add new version, fix runtime env (#24862) 2021-07-19 12:22:03 +02:00
Adam J. Stewart
b074dc17b1 py-bandit: add new package (#24857) 2021-07-19 12:18:47 +02:00
Satish Balay
fa503ef0e2 strumpack@develop: update change in examples/data PATH (#24814) 2021-07-19 12:01:05 +02:00
Lizzie Lundgren
99eb98d029 gchp: add versions 13.1.0, 13.1.1, 13.1.2 (#24755) 2021-07-19 10:53:02 +02:00
Alec Scott
90da25e24e go: add v1.16.6 (#24934) 2021-07-19 09:04:34 +02:00
Seth R. Johnson
624c72afae trilinos: simplify some variants (#24820)
* trilinos: rename basker variant

The Basker solver is part of amesos2 but is clearer without the extra
scoping.

* trilinos: automatically enable teuchos and remove variant

Basically everything in trilinos needs teuchos

* trilinos: group top-level dependencies

* trilinos: update dependencies, removing unused

- GLM, X11 are unused (x11 lacks dependency specs too)
- Python variant is more like a TPL so rearrange that
- Gtest internal package shouldn't be compiled or exported
- Add MPI4PY requirement for pytrilinos

* trilinos: remove package meta-options

- XSDK settings and "all opt packages" are not used anywhere
- all optional packages are dangerous

* trilinos: Use hwloc iff kokkos

See #19119, also the HWLOC tpl name was misspelled so this was being ignored before.

* Flake

* Fix trilinos +netcdf~mpi

* trilinos: default to disabling external dependencies

* Remove teuchos from downstream dependencies

* fixup! trilinos: Use hwloc iff kokkos

* Add netcdf requirements to packages with ^trilinos+exodus

* trilinos: disable exodus by default

* fixup! Add netcdf requirements to packages with ^trilinos+exodus

* trilinos: only enable hwloc when @13: +kokkos

* xyce: propagate trilinos dependencies more simply

* dtk: fix missing boost dependency

* trilinos: remove explicit metis dependency

* trilinos: require metis/parmetis for zoltan

Disable zoltan by default to minimize default dependencies

* trilinos: mark mesquite disabled and fix kokkos arch

* xsdk: fix trilinos to also list zoltan [with zoltan2]

* ci: remove nonexistent variant from trilinos

* trilinos: add missing boost dependency

Co-authored-by: Satish Balay <balay@mcs.anl.gov>
2021-07-16 11:36:06 -07:00
Adam J. Stewart
c56f2a935d Sphinx 3.4+ required for correct reference target linking 2021-07-16 08:30:56 -07:00
Adam J. Stewart
b8afc0fd29 API Docs: fix broken reference targets 2021-07-16 08:30:56 -07:00
Adam J. Stewart
c37df94932 Python: query distutils to find site-packages directory (#24095)
Third-party Python libraries may be installed in one of several directories:

1. `lib/pythonX.Y/site-packages` for Spack-installed Python
2. `lib64/pythonX.Y/site-packages` for system Python on RHEL/CentOS/Fedora
3. `lib/pythonX/dist-packages` for system Python on Debian/Ubuntu

Previously, Spack packages were hard-coded to use (1). Now, we query the Python installation itself and ask it which one to use. Ever since #21446 this is how we've been determining where to install Python libraries anyway.

Note: there are still many packages that are hard-coded to use (1). I can change them in this PR, but I don't have the bandwidth to test all of them.

* Python: handle dist-packages and site-packages
* Query Python to find site-packages directory
* Add try-except statements for when distutils isn't installed
* Catch more errors
* Fix root directory used in import tests
* Rely on site_packages_dir property
2021-07-16 08:28:00 -07:00
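A minimal sketch of the query described above, using only the standard library; Spack's real implementation lives in the python package recipe and handles more corner cases (external interpreters, missing distutils, Debian's dist-packages layout).

```python
# Ask the interpreter where third-party packages belong instead of
# hard-coding lib/pythonX.Y/site-packages.
try:
    from distutils.sysconfig import get_python_lib
    site_packages = get_python_lib()  # e.g. .../lib/python3.9/site-packages
except ImportError:
    # Stripped-down system Pythons may ship without distutils.
    site_packages = None
```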
Maciej Wójcik
64f31c4579 Added missing Plumed 2.5-2.7 versions to Gromacs package (#24912)
* Added missing Plumed 2.5-2.7 releases

* Added missing Plumed 2.5-2.7 dependencies

* Merged version ranges

* Simplified version ranges

* Deduplicated comment
2021-07-16 05:47:27 -06:00
eugeneswalker
e96ba16555 ci: build trilinos with all variants on (#24908) 2021-07-15 18:44:42 +00:00
Massimiliano Culpo
fd55d627a7 archspec: added support for arm compiler on graviton2 (#24904) 2021-07-15 13:27:13 +00:00
Adam J. Stewart
753fa4ed08 py-gast: add v0.5.0 (#24898) 2021-07-15 14:56:09 +02:00
Adam J. Stewart
f66571ffe1 py-beniget: add v0.4.0, v0.2.3 (#24899) 2021-07-15 14:54:56 +02:00
Valentin Volkl
fa4b9a6abc simsipm: add new package (#24903)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2021-07-15 14:06:41 +02:00
Erik Schnetter
0a3f875b95 boost: Run b2 headers after a git clone (#24889) 2021-07-14 22:00:44 -05:00
vucoda
29f10624bd Change url+checksums for libpng to official sourceforge archives (#23767)
* Change url and checksums for libpng to official sourceforge archives
* Update url scheme from http to https
* switch to .xz archives

Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2021-07-14 21:30:32 +00:00
Tamara Dahlgren
1c07dd1adb Update stand-alone tests to use test stage work directory (#24112) 2021-07-14 13:33:50 -07:00
Axel Huebl
8126a13211 Dask: 2021.06.2 (#24606)
* Dask: 2021.06.2

Add the latest DASK release.

* Apply suggestions from code review

Co-authored-by: Sergey Kosukhin <skosukhin@gmail.com>

* Update py-distributed relation

Co-authored-by: Sergey Kosukhin <skosukhin@gmail.com>
2021-07-14 14:58:53 -05:00
robgics
b24ba28774 Add ampl package (#24105) 2021-07-14 19:55:28 +00:00
Luhan Cheng
369ccb953f Fix inconsistent arch arguments expected in cudnn package (#24882)
* change aarch64sbsa to aarch64

* fixing arch in url

* making ci pipeline happy

* removing comments

Co-authored-by: Luhan Cheng <luhan.cheng@monash.edu>
2021-07-14 19:40:02 +00:00
Greg Becker
3004f33c58 spec may be a string, use precomputed namespace (#24867) 2021-07-14 11:58:31 -07:00
Tamara Dahlgren
a9e7f3a4e7 scotch: use https (#24891) 2021-07-14 14:35:27 -04:00
Zack Galbreath
56c8f533cd py-setuptools-scm: change default to +toml (#24884) 2021-07-14 15:45:41 +00:00
Valentin Volkl
e6e21b16d8 py-particle: add version 0.15.1 (#24834) 2021-07-14 07:53:01 -05:00
G-Ragghianti
555c054984 Added new version 2.6.1 (#24871) 2021-07-14 07:51:23 -05:00
Xavier Delaruelle
b37bf93aa2 environment-modules: add v4.8.0 (#24874) 2021-07-14 14:17:21 +02:00
Thomas Kluyver
c33ec328fb Add py-h5py version 3.3.0 (#24781)
* Add py-h5py version 3.3.0

The mpi4py dependency was bumped to 3.0.2 in setup.py. I'm not sure if that's actually required or not, but nothing lower is still tested.

* Use environment variable to stop h5py using setuptools setup_requires feature

* Add myself as a maintainer for py-h5py
2021-07-14 07:33:57 -04:00
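A sketch of the environment-variable workaround from the second bullet; the variable name is taken from h5py's setup.py, and the exact spelling used in the Spack recipe should be treated as an assumption here.

```python
def setup_build_environment(self, env):
    # Keep h5py's setup.py from fetching build requirements on its own via
    # setuptools' setup_requires; Spack already provides them.
    env.set("H5PY_SETUP_REQUIRES", "0")
```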
Kai Torben Ohlhus
652f35a39f openblas: add version 0.3.16 (#24872) 2021-07-14 04:07:16 -07:00
Jen Herting
e3fdbb976e [py-transformers] added version 4.6.1 (#24571)
* [py-transformers] can now use newer versions of tokenizers

* [py-transformers] Added version 4.6.1

* [py-transformers] removing old patch

* [py-transformers] boto3 no longer needed
2021-07-14 02:58:10 +00:00
Jen Herting
f095383caf New package: py-torchmeta (#24596)
* first build of py-torchmeta

* updated versions for torchvision and torch

* [py-torchmeta] using pil provider

Co-authored-by: Sid Pendelberry <sid@rit.edu>
2021-07-13 21:38:12 -05:00
Manuela Kuhn
94767ea573 py-setuptools-rust: add 0.12.1 (#24863)
* py-setuptools-rust: add 0.12.1

* mark 0.10.6 as deprecated and fix style
2021-07-13 21:36:14 -05:00
Alec Scott
28872955d5 Limit Gdal Version for Grass and Add Version 7.8.5 (#24737) 2021-07-13 20:50:56 -05:00
Mauricio Ferrato
e1d7275f92 flecsph package: use cxxstd=17 and external cinch (#24856) 2021-07-13 15:22:55 -07:00
Manuela Kuhn
04520ebdea py-cryptography: add 3.4.7 (#24866) 2021-07-13 22:01:54 +00:00
G-Ragghianti
819f288587 MAGMA package: fix smoke test method (#24848)
The Makefile for the MAGMA smoke tests uses pkg-config to find
the MAGMA compile flags, but the test() routine in the spack
package was not configured to provide the location of the
pkg-config file. This modification sets PKG_CONFIG_PATH correctly
to allow the smoke tests to compile successfully. It also removes
the *_dir variables which were unused by the magma
examples/Makefile.
2021-07-13 14:50:15 -07:00
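A hedged sketch of that fix: expose the installed magma.pc to pkg-config before the smoke-test Makefile runs. The method body and paths below are illustrative, not the exact package code.

```python
import os

def test(self):
    # Let pkg-config find <prefix>/lib/pkgconfig/magma.pc when the
    # examples/Makefile queries the MAGMA compile flags.
    os.environ["PKG_CONFIG_PATH"] = os.path.join(self.prefix.lib, "pkgconfig")
    # ... build and run the smoke tests as before ...
```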
Martin Köhler
8ccdcf2e84 Octave: add version 6.3.0 (#24851) 2021-07-13 14:27:49 -07:00
Michael Kuhn
df77922d22 py-jupyterlab, py-jupyter-server: fix version range (#24864)
Using the original concretizer, trying to concretize py-jupyterlab fails
with
```
==> Error: Invalid Version range: 6.1.0:6.1
```
because py-tornado does not have a 6.1.0 version but only a 6.1 one.
2021-07-13 21:05:14 +00:00
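The shape of the fix is simply a range spelled with versions that exist; a guess at the corrected dependency line:

```python
# "6.1" exists, "6.1.0" does not, so an open lower bound avoids the
# "Invalid Version range" error under the original concretizer.
depends_on("py-tornado@6.1:", type=("build", "run"))
```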
Tom Payerle
9a0febab89 libtirpc: Fix for #24806 (remove -pipe flag when using %nvhpc) (#24807)
The Makefiles for libtirpc hardcode the -pipe compiler flag.
nvhpc compilers do not recognize that flag.
This PR provides a patch to remove the -pipe flag from the Makefile.
The patch should work with libtirpc@1.2.6 and @1.1.4.
2021-07-13 12:55:20 -07:00
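The change ships as a patch file, but the same idea can be sketched with Spack's `filter_file`; the Makefile path below is an assumption used only for illustration.

```python
def patch(self):
    if self.spec.satisfies("%nvhpc"):
        # nvhpc does not understand -pipe, so strip it from the Makefile.
        filter_file("-pipe", "", "src/Makefile.in", string=True)
```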
Michael Schlottke-Lakemper
231a36c5fd HOHQMesh: add version v1.0.1 (#24823) 2021-07-13 12:31:34 -07:00
Frank Willmore
d79022f842 openmpi: add direct cuda dependency (#24859)
makes cuda a direct dependency, so it still shows up when using external hwloc+cuda
2021-07-13 19:14:29 +00:00
Manuela Kuhn
3c5287c458 py-jupyterlab: fix startup and add 3.0.16 (#24780)
jupyterlab was looking for its application directory inside the python
prefix instead of its own. This was fixed by setting the corresponding
environment variable.
2021-07-13 19:07:45 +00:00
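A sketch of that fix, assuming the variable is JUPYTERLAB_DIR (JupyterLab's documented override for its application directory) and that the directory lives under share/jupyter/lab in the package prefix:

```python
def setup_run_environment(self, env):
    # Point JupyterLab at the application directory inside its own prefix
    # rather than the Python prefix.
    env.set("JUPYTERLAB_DIR", self.prefix.share.jupyter.lab)
```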
Robert Mijakovic
88be996d45 scotch: new versions, 6.1.0 and 6.1.1 (#24855)
Co-authored-by: Robert Mijakovic <robert.mijakovic@lxp.lu>
2021-07-13 11:03:20 -07:00
Adam J. Stewart
9828df7335 py-torch: CUDA 9.2+ required for 1.6+ (#24808) 2021-07-13 10:46:07 -07:00
Jianwen
657b3ec052 blast-plus: add version 2.12.0 (#24828) 2021-07-13 10:44:07 -07:00
Seth R. Johnson
80813b61ff hdf5: new version 1.12.1 (#24841) 2021-07-13 10:42:23 -07:00
Hadrien G
e4fa31230c ACTS package: add version 9.02.0 (#24844) 2021-07-13 10:41:45 -07:00
iarspider
a6f839b880 Add new Cython version (#24853) 2021-07-13 10:22:34 -05:00
eugeneswalker
09540d411e binary_distribution: relocate x-pie-executable files (#24854) 2021-07-13 07:53:35 -07:00
Manuela Kuhn
aaad65fbd8 graphviz: add python dependency to fix installation (#24852)
The bootstrap script in the autoreconf procedure calls the gen_version.py
script, which requires Python 3.6 or newer to process f-strings.
2021-07-13 13:12:10 +00:00
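The added dependency is presumably along these lines; any extra `when=` constraint is not stated in the commit message, so none is shown.

```python
# gen_version.py uses f-strings, so Python 3.6+ is needed at build time
# whenever the autoreconf/bootstrap step runs.
depends_on("python@3.6:", type="build")
```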
Valentin Volkl
e3e50b3af9 py-awkward: add version 1.4.0 (#24838) 2021-07-13 08:11:59 -05:00
Valentin Volkl
047c9704df whizard: add version 3.0.1 (#24836) 2021-07-13 08:10:55 -05:00
Valentin Volkl
1ee8947677 gaudi: add version 36.0 (#24840) 2021-07-13 08:31:11 -04:00
Valentin Volkl
7f24feb5a4 cppunit: disable doxygen (#24850) 2021-07-13 08:25:26 -04:00
Todd Gamblin
326fe433b3 Add spack help --spec to README.md (#24849)
We don't really advertise `spack help --spec` enough. I think the README is a good place to start
doing that.
2021-07-13 14:04:41 +02:00
Adam J. Stewart
60765d38d0 Fix KeyboardInterrupt signal for Python 2 2021-07-13 01:17:51 -07:00
Erik Schnetter
667ab50199 c-blosc2: New version 2.0.2 (#24843) 2021-07-12 23:02:36 +00:00
Massimiliano Culpo
3228c35df6 Enable/disable bootstrapping and customize store location (#23677)
* Allow enabling/disabling bootstrapping and customizing the store location

This PR adds configuration handles to allow enabling
and disabling bootstrapping, and to customize the store
location.

* Move bootstrap related configuration into its own YAML file

* Add a bootstrap command to manage configuration
2021-07-12 19:00:37 -04:00
Valentin Volkl
9fb1c3e143 py-uproot: add version 4.0.11 (#24835) 2021-07-12 17:39:22 -05:00
Valentin Volkl
e05be70bcb py-hepunits: add version 2.1.1 (#24837) 2021-07-12 17:37:37 -05:00
jkelling
81bad21d3a Update caffe package for cuda9 (#24831)
Add base CudaPackage, cuda_arch
2021-07-12 17:31:38 -05:00
Todd Gamblin
f58b2e03ca build output: filter padding out of console output when padded_length is used (#24514)
Spack allows users to set `padded_length` to pad out the installation path in
build farms so that any binaries created are more easily relocatable. The issue
with this is that the padding dominates installation output and makes it
difficult to see what is going on. The padding also causes logs to easily
exceed size limits for things like GitLab artifacts.

This PR fixes this by adding a filter in the logger daemon. If you use a
setting like this:

config:
    install_tree:
        padded_length: 512

Then lines like this in the output:

==> [2021-06-23-15:59:05.020387] './configure' '--prefix=/Users/gamblin2/padding-log-test/opt/__spack_path_placeholder__/__spack_path_placeholder__/__spack_path_placeholder__/__spack_path_placeholder__/__spack_path_placeholder__/__spack_path_placeholder__/__spack_path_placeholder__/__spack_path_placeholder__/__spack_path_placeholder__/__spack_path_placeholder__/__spack_path_placeholder__/__spack_path_placeholder__/__spack_path_placeholder__/__spack_path_placeholder__/__spack_path_placeholder__/__spack_path_placeholder__/__spack_path_placeholder__/__spack_path_pla/darwin-bigsur-skylake/apple-clang-12.0.5/zlib-1.2.11-74mwnxgn6nujehpyyalhwizwojwn5zga

will be replaced with the much more readable:

==> [2021-06-23-15:59:05.020387] './configure' '--prefix=/Users/gamblin2/padding-log-test/opt/[padded-to-512-chars]/darwin-bigsur-skylake/apple-clang-12.0.5/zlib-1.2.11-74mwnxgn6nujehpyyalhwizwojwn5zga

You can see that the padding has been replaced with `[padded-to-512-chars]` to
indicate the total number of characters in the padded prefix. Over a long log
file, this should save a lot of space and allow us to see error messages in
GitHub/GitLab log output.

The *actual* build logs still have full paths in them. Also, lines that are
output by Spack itself and not by a package build are not filtered and will
still display the fully padded path. There aren't that many of these, so the
change should still reduce file size and improve readability quite a bit.
2021-07-12 21:48:52 +00:00
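A toy illustration of the filtering idea, not Spack's actual logger code: collapse runs of the `__spack_path_placeholder__` padding (including a possibly truncated final component) into the short marker shown above.

```python
import re

PLACEHOLDER = "__spack_path_placeholder__"
# One or more full placeholder components, optionally followed by a
# truncated final one (the padding is cut to hit padded_length exactly).
_pad_re = re.compile(r"(/%s)+(/__spack[_a-z]*)?" % re.escape(PLACEHOLDER))

def filter_padding(line, padded_length=512):
    return _pad_re.sub("/[padded-to-%d-chars]" % padded_length, line)
```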
Valentin Volkl
34b763f792 hepmc3: add version 3.2.4 (#24839)
* hepmc3: add version 3.2.4

* hepmc3: clean up legacy arguments from hepmc2 and fix tests
2021-07-12 17:48:09 -04:00
Ryan O'Malley
9d6d2f0f9b folly: added latest version and switched to CMakePackage (#23938)
* added latest version and switched to CMakePackage

* Added optional dependencies and cxxstd variant

* Added cxxstd variant and optional dependencies

* Added lib. that Boost doesn't install by default

* BUG: Removed previous broken versions of Folly

* BUG: refactored comments

* BUG: Fixed styling errors
2021-07-12 14:13:31 -07:00
Sebastian Schmitt
6d22e9cd7b Update pathos (#24636)
* Update pathos

* Add build and run
2021-07-12 10:01:43 -05:00
Sebastian Schmitt
1784b05eaf Update multiprocess (#24634)
* Update multiprocess

* Add build and run
2021-07-12 10:01:13 -05:00
Adam J. Stewart
872785db16 py-sphobjinv: add new package (#24790) 2021-07-12 13:40:32 +02:00
Erik Schnetter
4c7aed5d57 lorene, pgplot: new packages (#24549) 2021-07-12 13:35:43 +02:00
Christoph Conrads
e957c58a1e Expat: add version 2.4.0, 2.4.1; fix CVE-2013-0340 (#24669)
* Expat: add version 2.4.0, 2.4.1; fix CVE-2013-0340

fixes #24628

* E4S pipeline: update pinned Expat version
2021-07-11 19:43:37 +00:00
Matthew Fernandez
112c1751d7 docs: fix reference to count of system packages (#24821)
Commit 015e29efe1, which introduced this section to the
documentation, said “two” here instead of the actual count, three.
Commit 9f54cea5c5 then added a fourth, BLAS/LAPACK.
Rather than trying to keep this leading count in sync, this change just
replaces the wording with something more generic/stable.
2021-07-11 18:49:19 +00:00
Scott Wittenburg
e0017a6666 Fix out of date remote spec.yamls while updating the index (#24811) 2021-07-10 06:49:09 +00:00
Todd Gamblin
084bafe18c coverage: move config from .coveragerc to pyproject.toml
Getting rid of another top-level file.

`coverage.py` has supported `pyproject.toml` since version 5.0, and
all versions of coverage so far work with python 2.7. We just need to
ensure that a version of coverage with the `toml` extra is installed
in the test environment.

I tested this with `coverage run`, `coverage report`, and `coverage html`.
2021-07-09 22:49:47 -07:00
Vanessasaurus
775c8223c3 debug: initial global debug flag support (#24285)
The developer can export environment variables that
are seen by the compiler wrapper, which then adds the flags.
2021-07-09 21:22:26 -04:00
Greg Becker
ebf2076755 spec.splice: properly handle cached hash invalidations (#24795)
* spec.splice: properly handle cached hash invalidations

* make package_hash a cached hash on the spec
2021-07-10 01:16:48 +00:00
Christoph Junghans
556975564c votca-tools: fix build with newer gcc/glibc (#24815) 2021-07-10 01:05:30 +00:00
Massimiliano Culpo
d14520d6d8 docker: remove boto3 from CentOS 6 since it requires and updated pip (#24813) 2021-07-10 00:45:30 +00:00
Axel Huebl
78f65e7ce0 openPMD-api: rename develop (#24810)
* openPMD-api: rename develop

Rename to match known Spack version comparison schemes:
```
develop>main>master>head>trunk>9999>0>z>a
```

Currently, the hdf5 patch meant for pre-0.14.0 versions is also applied to
`dev`, which naturally fails (already applied).

* fix dev in warpx
2021-07-09 22:43:28 +00:00
Asher Mancinelli
c9fe3af92d toml11: New package (#24620) 2021-07-09 14:13:22 -04:00
Manuela Kuhn
c6ffec1d78 py-markupsafe: add 2.0.1 (#24766)
* py-markupsafe: add 2.0.1

* Update var/spack/repos/builtin/packages/py-markupsafe/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-07-09 12:04:36 -06:00
Zack Galbreath
0e177cb95f ci: tolerate 'InvalidAccessKeyId' (#24741)
Add 'InvalidAccessKeyId' to the list of error messages we expect
and tolerate in push_mirror_contents()
2021-07-09 09:39:16 -07:00
QuellynSnead
72585afcef hypre: Add new versions (#24625)
* hypre: Add releases 2.21.0 and 2.22.0
2021-07-09 12:25:36 -04:00
Chris White
4037a94d1e Remove unnecessary compiler id override for XL (#24799) 2021-07-09 09:20:54 -07:00
Adam J. Stewart
b4e757dc35 py-fuzzywuzzy: add new package (#24789) 2021-07-09 17:51:52 +02:00
darmac
d734df705a libnids: add new package (#22153) 2021-07-09 17:49:10 +02:00
Massimiliano Culpo
2a858f8f3d docker: Fix CentOS 6 build on Docker Hub (#24804)
This change makes yum usable again on CentOS 6
2021-07-09 15:45:46 +00:00
Seth R. Johnson
0edc55adc2 scotch: default to not installing vendored metis/parmetis (#24785) 2021-07-09 11:30:00 -04:00
Hadrien G
7b7f758db3 acts: add v9.01.00 (#24570) 2021-07-09 17:13:35 +02:00
Robert Mijakovic
8cc54036b5 c-blosc2: adds v2.0.1 (#24581)
Co-authored-by: Robert Mijakovic <robert.mijakovic@lxp.lu>
Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>
2021-07-09 17:12:00 +02:00
Miroslav Stoyanov
b3bdc2ef38 heffte: add v2.1.0 (#24599) 2021-07-09 17:11:13 +02:00
Robert Mijakovic
a53f4c36c6 util-macros: new versions, 1.19.2/3 (#24612)
Co-authored-by: Robert Mijakovic <robert.mijakovic@lxp.lu>
2021-07-09 17:02:45 +02:00
Gabriel Rockefeller
4682ff0cc4 global: add v6.6.6 (#24642) 2021-07-09 16:59:23 +02:00
Manuela Kuhn
fabe86be96 py-bids-validator: add new package (#24677) 2021-07-09 16:58:44 +02:00
Manuela Kuhn
9797c8f060 py-sqlalchemy: add 1.4.20 (#24676)
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-07-09 16:58:26 +02:00
Dylan Simon
02c5c76f0b pthreadpool: enable shared libraries (#24657)
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-07-09 16:58:02 +02:00
Simon Frasch
07fab46262 spla: add v1.5.1 (#24661) 2021-07-09 16:54:49 +02:00
Simon Frasch
0edb7937e7 spfft: add v1.0.4 (#24662) 2021-07-09 16:54:37 +02:00
Paul Kuberry
46fa8481d9 trilinos: add a new maintainer (#24670) 2021-07-09 16:54:06 +02:00
Manuela Kuhn
a2a2d6ab7e py-aniso8601: add new package (#24672) 2021-07-09 16:53:01 +02:00
Manuela Kuhn
aaeaa0516d py-jeepney: add v0.6.0 (#24674) 2021-07-09 16:52:49 +02:00
Manuela Kuhn
463c704265 py-jupyterlab-server: add v2.6.0 (#24779) 2021-07-09 14:35:26 +00:00
Manuela Kuhn
cd118341e9 py-nbclassic: add new package (#24778) 2021-07-09 14:21:56 +00:00
Gregor Daiß
2d1631c9fd sgpp: add v3.4.0 (#24678) 2021-07-09 16:09:35 +02:00
Manuela Kuhn
cdd6c71f66 py-greenlet: add v1.1.0 (#24682) 2021-07-09 16:09:13 +02:00
figroc
7d334471d3 protobuf: add versions up to 3.17.3 (#24691) 2021-07-09 16:05:16 +02:00
Sebastian Schmitt
8cb3253a04 Update ppft (#24632) 2021-07-09 08:16:39 -05:00
Sebastian Schmitt
9add3182c7 Add py-salib (#24218) 2021-07-09 08:15:21 -05:00
Enrico Usai
f830585994 aws-parallelcluster: add v2.11.0 (#24648) 2021-07-09 08:13:57 -05:00
Vasileios Karakasis
2e80b60a04 Add ReFrame 3.6.3 (#24664) 2021-07-09 14:29:04 +02:00
Adam J. Stewart
ace28e2ef5 pinentry: add gui multi-valued variant (#24717) 2021-07-09 14:26:02 +02:00
Manuela Kuhn
4a44f023e8 py-jupyter-server: add new package (#24777) 2021-07-09 11:37:08 +00:00
Olivier Cessenat
1b26c47cb8 latex2html: adding the famous LaTeX to HTML converter (#24750) 2021-07-09 13:17:43 +02:00
Manuela Kuhn
aeab3b2872 py-jupyter-packaging: add new package (#24751) 2021-07-09 13:16:43 +02:00
Cameron Smith
727f43f69f simmetrix-simmodsuite: add v16.0-210623 and maintainer (#24763) 2021-07-09 13:08:04 +02:00
Robert Mijakovic
ba2e186f31 llvm: add v12.0.1 (#24803)
Co-authored-by: Robert Mijakovic <robert.mijakovic@lxp.lu>
2021-07-09 13:06:04 +02:00
Michele Mesiti
de8d4e9d9a sombrero: add v2021-07-08, deprecate v1.0 (#24782) 2021-07-09 13:02:26 +02:00
Manuela Kuhn
01ca429c9a py-tornado: add v6.1 (#24753) 2021-07-09 12:57:45 +02:00
Manuela Kuhn
83d0e20ae8 py-requests-unixsocket: add new package (#24764) 2021-07-09 12:57:32 +02:00
Manuela Kuhn
85b49f115f py-anyio: add new package (#24765) 2021-07-09 12:57:20 +02:00
holrock
451f484c9a ruby: add v3.0.2 (#24771) 2021-07-09 12:53:37 +02:00
Manuela Kuhn
bbd80e5cf3 py-jupyter-core: add v4.7.1 (#24768) 2021-07-09 12:53:13 +02:00
Manuela Kuhn
d8f655159a py-jinja2: add 3.0.1 and +i18n variant (#24767) 2021-07-09 12:52:59 +02:00
Adam J. Stewart
94e5c1d078 py-pyrsistent: need link dep on python (#24788) 2021-07-09 11:58:51 +02:00
Robert Cohn
0b9b3f6f79 intel-oneapi-dpl: new package (#24793) 2021-07-09 11:56:02 +02:00
Axel Huebl
5b5f99bbd4 WarpX: add v21.07 (#24800) 2021-07-09 11:53:04 +02:00
Todd Gamblin
eff7f20118 mypy: move configuration to pyproject.toml (#24802)
This moves our `mypy` configuration from `.mypy.ini` to `.pyproject.toml`
and increases the minimum `mypy` version in the tests.

- [x] move `mypy` configuration to `pyproject.toml`
- [x] remove `.mypy.ini`
- [x] ensure that `mypy` version .900 or higher is used in tests
2021-07-09 11:52:23 +02:00
Seth R. Johnson
3fb5c13983 googletest: add v1.11 and "live at head", keep v1.10 the default (#24290)
Ideally a test-only dependency won't be in the build, but until then
cap the gtest requirement at 1.10.

See e4s job failure at https://gitlab.spack.io/spack/spack/-/jobs/349959 .

Looks like 1.11 introduces some breaking incompatibilities, so perhaps
we should transition later.
2021-07-09 10:57:37 +02:00
Stephen Herbein
57a9fb7610 flux: add latest tagged releases (#24687)
flux-core v0.21 requires jansson 2.10+

For more details, see:
a6086e021e
2021-07-09 10:55:06 +02:00
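That requirement maps directly onto a conditional dependency; a guess at the recipe line:

```python
depends_on("jansson@2.10:", when="@0.21:")
```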
Nic McDonald
4c3005673e abseil-cpp: add cxxstd variant (#24577) 2021-07-09 10:54:14 +02:00
Sebastian Schmitt
89b57929f2 Update pox (#24635)
* Update pox

* Add build and run
2021-07-08 21:42:26 -05:00
Massimiliano Culpo
c699e907fc Update command to setup tutorial (#24488) 2021-06-23 19:38:36 -05:00
3122 changed files with 53712 additions and 16095 deletions

View File

@@ -19,3 +19,16 @@ comment: off
# annotations in files that seemingly have nothing to do with the PR.
github_checks:
annotations: false
# Attempt to fix "Missing base commit" messages in the codecov UI.
# Because we do not run full tests on package PRs, package PRs' merge
# commits on `develop` don't have coverage info. It appears that
# codecov will give you an error if the pseudo-base's coverage data
# doesn't all apply properly to the real PR base.
#
# See here for docs:
# https://docs.codecov.com/docs/comparing-commits#pseudo-comparison
# See here for another potential solution:
# https://community.codecov.com/t/2480/15
codecov:
allow_coverage_offsets: true

View File

@@ -1,38 +0,0 @@
# -*- conf -*-
# .coveragerc to control coverage.py
[run]
parallel = True
concurrency = multiprocessing
branch = True
source =
bin
lib
omit =
lib/spack/spack/test/*
lib/spack/docs/*
lib/spack/external/*
share/spack/qa/*
[report]
# Regexes for lines to exclude from consideration
exclude_lines =
# Have to re-enable the standard pragma
pragma: no cover
# Don't complain about missing debug-only code:
def __repr__
if self\.debug
# Don't complain if tests don't hit defensive assertion code:
raise AssertionError
raise NotImplementedError
# Don't complain if non-runnable code isn't run:
if 0:
if False:
if __name__ == .__main__.:
ignore_errors = True
[html]
directory = htmlcov

View File

@@ -1,42 +0,0 @@
---
name: "\U0001F41E Bug report"
about: Report a bug in the core of Spack (command not working as expected, etc.)
labels: "bug,triage"
---
<!-- Explain, in a clear and concise way, the command you ran and the result you were trying to achieve.
Example: "I ran `spack find` to list all the installed packages and ..." -->
### Steps to reproduce the issue
```console
$ spack <command1> <spec>
$ spack <command2> <spec>
...
```
### Error Message
<!-- If Spack reported an error, provide the error message. If it did not report an error but the output appears incorrect, provide the incorrect output. If there was no error message and no output but the result is incorrect, describe how it does not match what you expect. -->
```console
$ spack --debug --stacktrace <command>
```
### Information on your system
<!-- Please include the output of `spack debug report` -->
<!-- If you have any relevant configuration detail (custom `packages.yaml` or `modules.yaml`, etc.) you can add that here as well. -->
### Additional information
<!-- These boxes can be checked by replacing [ ] with [x] or by clicking them after submitting the issue. -->
- [ ] I have run `spack debug report` and reported the version of Spack/Python/Platform
- [ ] I have searched the issues of this repo and believe this is not a duplicate
- [ ] I have run the failing commands in debug mode and reported the output
<!-- We encourage you to try, as much as possible, to reduce your problem to the minimal example that still reproduces the issue. That would help us a lot in fixing it quickly and effectively!
If you want to ask a question about the tool (how to use it, what it can currently do, etc.), try the `#general` channel on our Slack first. We have a welcoming community and chances are you'll get your reply faster and without opening an issue.
Other than that, thanks for taking the time to contribute to Spack! -->

58
.github/ISSUE_TEMPLATE/bug_report.yml vendored Normal file
View File

@@ -0,0 +1,58 @@
name: "\U0001F41E Bug report"
description: Report a bug in the core of Spack (command not working as expected, etc.)
labels: [bug, triage]
body:
- type: textarea
id: reproduce
attributes:
label: Steps to reproduce
description: |
Explain, in a clear and concise way, the command you ran and the result you were trying to achieve.
Example: "I ran `spack find` to list all the installed packages and ..."
placeholder: |
```console
$ spack <command1> <spec>
$ spack <command2> <spec>
...
```
validations:
required: true
- type: textarea
id: error
attributes:
label: Error message
description: |
If Spack reported an error, provide the error message. If it did not report an error but the output appears incorrect, provide the incorrect output. If there was no error message and no output but the result is incorrect, describe how it does not match what you expect.
placeholder: |
```console
$ spack --debug --stacktrace <command>
```
- type: textarea
id: information
attributes:
label: Information on your system
description: Please include the output of `spack debug report`
validations:
required: true
- type: markdown
attributes:
value: |
If you have any relevant configuration detail (custom `packages.yaml` or `modules.yaml`, etc.) you can add that here as well.
- type: checkboxes
id: checks
attributes:
label: General information
options:
- label: I have run `spack debug report` and reported the version of Spack/Python/Platform
required: true
- label: I have searched the issues of this repo and believe this is not a duplicate
required: true
- label: I have run the failing commands in debug mode and reported the output
required: true
- type: markdown
attributes:
value: |
We encourage you to try, as much as possible, to reduce your problem to the minimal example that still reproduces the issue. That would help us a lot in fixing it quickly and effectively!
If you want to ask a question about the tool (how to use it, what it can currently do, etc.), try the `#general` channel on [our Slack](https://slack.spack.io/) first. We have a welcoming community and chances are you'll get your reply faster and without opening an issue.
Other than that, thanks for taking the time to contribute to Spack!

View File

@@ -1,43 +0,0 @@
---
name: "\U0001F4A5 Build error"
about: Some package in Spack didn't build correctly
title: "Installation issue: "
labels: "build-error"
---
<!-- Thanks for taking the time to report this build failure. To proceed with the report please:
1. Title the issue "Installation issue: <name-of-the-package>".
2. Provide the information required below.
We encourage you to try, as much as possible, to reduce your problem to the minimal example that still reproduces the issue. That would help us a lot in fixing it quickly and effectively! -->
### Steps to reproduce the issue
<!-- Fill in the exact spec you are trying to build and the relevant part of the error message -->
```console
$ spack install <spec>
...
```
### Information on your system
<!-- Please include the output of `spack debug report` -->
<!-- If you have any relevant configuration detail (custom `packages.yaml` or `modules.yaml`, etc.) you can add that here as well. -->
### Additional information
<!-- Please upload the following files. They should be present in the stage directory of the failing build. Also upload any config.log or similar file if one exists. -->
* [spack-build-out.txt]()
* [spack-build-env.txt]()
<!-- Some packages have maintainers who have volunteered to debug build failures. Run `spack maintainers <name-of-the-package>` and @mention them here if they exist. -->
### General information
<!-- These boxes can be checked by replacing [ ] with [x] or by clicking them after submitting the issue. -->
- [ ] I have run `spack debug report` and reported the version of Spack/Python/Platform
- [ ] I have run `spack maintainers <name-of-the-package>` and @mentioned any maintainers
- [ ] I have uploaded the build log and environment files
- [ ] I have searched the issues of this repo and believe this is not a duplicate

64
.github/ISSUE_TEMPLATE/build_error.yml vendored Normal file
View File

@@ -0,0 +1,64 @@
name: "\U0001F4A5 Build error"
description: Some package in Spack didn't build correctly
title: "Installation issue: "
labels: [build-error]
body:
- type: markdown
attributes:
value: |
Thanks for taking the time to report this build failure. To proceed with the report please:
1. Title the issue `Installation issue: <name-of-the-package>`.
2. Provide the information required below.
We encourage you to try, as much as possible, to reduce your problem to the minimal example that still reproduces the issue. That would help us a lot in fixing it quickly and effectively!
- type: textarea
id: reproduce
attributes:
label: Steps to reproduce the issue
description: |
Fill in the exact spec you are trying to build and the relevant part of the error message
placeholder: |
```console
$ spack install <spec>
...
```
validations:
required: true
- type: textarea
id: information
attributes:
label: Information on your system
description: Please include the output of `spack debug report`
validations:
required: true
- type: markdown
attributes:
value: |
If you have any relevant configuration detail (custom `packages.yaml` or `modules.yaml`, etc.) you can add that here as well.
- type: textarea
id: additional_information
attributes:
label: Additional information
description: |
Please upload the following files:
* **`spack-build-out.txt`**
* **`spack-build-env.txt`**
They should be present in the stage directory of the failing build. Also upload any `config.log` or similar file if one exists.
- type: markdown
attributes:
value: |
Some packages have maintainers who have volunteered to debug build failures. Run `spack maintainers <name-of-the-package>` and **@mention** them here if they exist.
- type: checkboxes
id: checks
attributes:
label: General information
options:
- label: I have run `spack debug report` and reported the version of Spack/Python/Platform
required: true
- label: I have run `spack maintainers <name-of-the-package>` and **@mentioned** any maintainers
required: true
- label: I have uploaded the build log and environment files
required: true
- label: I have searched the issues of this repo and believe this is not a duplicate
required: true

1
.github/ISSUE_TEMPLATE/config.yml vendored Normal file
View File

@@ -0,0 +1 @@
blank_issues_enabled: true

View File

@@ -1,33 +0,0 @@
---
name: "\U0001F38A Feature request"
about: Suggest adding a feature that is not yet in Spack
labels: feature
---
<!--*Please add a concise summary of your suggestion here.*-->
### Rationale
<!--*Is your feature request related to a problem? Please describe it!*-->
### Description
<!--*Describe the solution you'd like and the alternatives you have considered.*-->
### Additional information
<!--*Add any other context about the feature request here.*-->
### General information
- [ ] I have run `spack --version` and reported the version of Spack
- [ ] I have searched the issues of this repo and believe this is not a duplicate
<!--If you want to ask a question about the tool (how to use it, what it can currently do, etc.), try the `#general` channel on our Slack first. We have a welcoming community and chances are you'll get your reply faster and without opening an issue.
Other than that, thanks for taking the time to contribute to Spack!
-->

View File

@@ -0,0 +1,41 @@
name: "\U0001F38A Feature request"
description: Suggest adding a feature that is not yet in Spack
labels: [feature]
body:
- type: textarea
id: summary
attributes:
label: Summary
description: Please add a concise summary of your suggestion here.
validations:
required: true
- type: textarea
id: rationale
attributes:
label: Rationale
description: Is your feature request related to a problem? Please describe it!
- type: textarea
id: description
attributes:
label: Description
description: Describe the solution you'd like and the alternatives you have considered.
- type: textarea
id: additional_information
attributes:
label: Additional information
description: Add any other context about the feature request here.
- type: checkboxes
id: checks
attributes:
label: General information
options:
- label: I have run `spack --version` and reported the version of Spack
required: true
- label: I have searched the issues of this repo and believe this is not a duplicate
required: true
- type: markdown
attributes:
value: |
If you want to ask a question about the tool (how to use it, what it can currently do, etc.), try the `#general` channel on [our Slack](https://slack.spack.io/) first. We have a welcoming community and chances are you'll get your reply faster and without opening an issue.
Other than that, thanks for taking the time to contribute to Spack!

161
.github/workflows/bootstrap.yml vendored Normal file
View File

@@ -0,0 +1,161 @@
name: Bootstrapping
on:
pull_request:
branches:
- develop
- releases/**
paths-ignore:
# Don't run if we only modified packages in the
# built-in repository or documentation
- 'var/spack/repos/builtin/**'
- '!var/spack/repos/builtin/packages/clingo-bootstrap/**'
- '!var/spack/repos/builtin/packages/python/**'
- '!var/spack/repos/builtin/packages/re2c/**'
- 'lib/spack/docs/**'
schedule:
# nightly at 2:16 AM
- cron: '16 2 * * *'
jobs:
fedora-sources:
runs-on: ubuntu-latest
container: "fedora:latest"
steps:
- name: Install dependencies
run: |
dnf install -y \
bzip2 curl file gcc-c++ gcc gcc-gfortran git gnupg2 gzip \
make patch unzip which xz python3 python3-devel tree \
cmake bison bison-devel libstdc++-static
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- name: Setup repo and non-root user
run: |
git --version
git fetch --unshallow
. .github/workflows/setup_git.sh
useradd spack-test
chown -R spack-test .
- name: Bootstrap clingo
shell: runuser -u spack-test -- bash {0}
run: |
source share/spack/setup-env.sh
spack bootstrap untrust github-actions
spack external find cmake bison
spack -d solve zlib
tree ~/.spack/bootstrap/store/
ubuntu-sources:
runs-on: ubuntu-latest
container: "ubuntu:latest"
steps:
- name: Install dependencies
env:
DEBIAN_FRONTEND: noninteractive
run: |
apt-get update -y && apt-get upgrade -y
apt-get install -y \
bzip2 curl file g++ gcc gfortran git gnupg2 gzip \
make patch unzip xz-utils python3 python3-dev tree \
cmake bison
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- name: Setup repo and non-root user
run: |
git --version
git fetch --unshallow
. .github/workflows/setup_git.sh
useradd -m spack-test
chown -R spack-test .
- name: Bootstrap clingo
shell: runuser -u spack-test -- bash {0}
run: |
source share/spack/setup-env.sh
spack bootstrap untrust github-actions
spack external find cmake bison
spack -d solve zlib
tree ~/.spack/bootstrap/store/
opensuse-sources:
runs-on: ubuntu-latest
container: "opensuse/leap:latest"
steps:
- name: Install dependencies
run: |
zypper update -y
zypper install -y \
bzip2 curl file gcc-c++ gcc gcc-fortran tar git gpg2 gzip \
make patch unzip which xz python3 python3-devel tree \
cmake bison
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- name: Setup repo and non-root user
run: |
git --version
git fetch --unshallow
. .github/workflows/setup_git.sh
- name: Bootstrap clingo
run: |
source share/spack/setup-env.sh
spack bootstrap untrust github-actions
spack external find cmake bison
spack -d solve zlib
tree ~/.spack/bootstrap/store/
macos-sources:
runs-on: macos-latest
steps:
- name: Install dependencies
run: |
brew install cmake bison@2.7 tree
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- name: Bootstrap clingo
run: |
source share/spack/setup-env.sh
export PATH=/usr/local/opt/bison@2.7/bin:$PATH
spack bootstrap untrust github-actions
spack external find --not-buildable cmake bison
spack -d solve zlib
tree ~/.spack/bootstrap/store/
macos-clingo-binaries:
runs-on: macos-latest
strategy:
matrix:
python-version: ['3.5', '3.6', '3.7', '3.8', '3.9']
steps:
- name: Install dependencies
run: |
brew install tree
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- uses: actions/setup-python@dc73133d4da04e56a135ae2246682783cc7c7cb6 # @v2
with:
python-version: ${{ matrix.python-version }}
- name: Bootstrap clingo
run: |
source share/spack/setup-env.sh
spack bootstrap untrust spack-install
spack -d solve zlib
tree ~/.spack/bootstrap/store/
ubuntu-clingo-binaries:
runs-on: ubuntu-latest
strategy:
matrix:
python-version: ['2.7', '3.5', '3.6', '3.7', '3.8', '3.9']
steps:
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- uses: actions/setup-python@dc73133d4da04e56a135ae2246682783cc7c7cb6 # @v2
with:
python-version: ${{ matrix.python-version }}
- name: Setup repo and non-root user
run: |
git --version
git fetch --unshallow
. .github/workflows/setup_git.sh
- name: Bootstrap clingo
run: |
source share/spack/setup-env.sh
spack bootstrap untrust spack-install
spack -d solve zlib
tree ~/.spack/bootstrap/store/

91
.github/workflows/build-containers.yml vendored Normal file
View File

@@ -0,0 +1,91 @@
name: Containers
on:
# This Workflow can be triggered manually
workflow_dispatch:
# Build new Spack develop containers nightly.
schedule:
- cron: '34 0 * * *'
# Run on pull requests that modify this file
pull_request:
branches:
- develop
paths:
- '.github/workflows/build-containers.yml'
# Let's also build & tag Spack containers on releases.
release:
types: [published]
jobs:
deploy-images:
runs-on: ubuntu-latest
permissions:
packages: write
strategy:
# Even if one container fails to build we still want the others
# to continue their builds.
fail-fast: false
# A matrix of Dockerfile paths, associated tags, and which architectures
# they support.
matrix:
dockerfile: [[amazon-linux, amazonlinux-2.dockerfile, 'linux/amd64,linux/arm64'],
[centos7, centos-7.dockerfile, 'linux/amd64,linux/arm64,linux/ppc64le'],
[leap15, leap-15.dockerfile, 'linux/amd64,linux/arm64,linux/ppc64le'],
[ubuntu-xenial, ubuntu-1604.dockerfile, 'linux/amd64,linux/arm64,linux/ppc64le'],
[ubuntu-bionic, ubuntu-1804.dockerfile, 'linux/amd64,linux/arm64,linux/ppc64le']]
name: Build ${{ matrix.dockerfile[0] }}
steps:
- name: Checkout
uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- name: Set Container Tag Normal (Nightly)
run: |
container="${{ matrix.dockerfile[0] }}:latest"
echo "container=${container}" >> $GITHUB_ENV
echo "versioned=${container}" >> $GITHUB_ENV
# On a new release create a container with the same tag as the release.
- name: Set Container Tag on Release
if: github.event_name == 'release'
run: |
versioned="${{matrix.dockerfile[0]}}:${GITHUB_REF##*/}"
echo "versioned=${versioned}" >> $GITHUB_ENV
- name: Check ${{ matrix.dockerfile[1] }} Exists
run: |
printf "Preparing to build ${{ env.container }} from ${{ matrix.dockerfile[1] }}"
if [ ! -f "share/spack/docker/${{ matrix.dockerfile[1]}}" ]; then
printf "Dockerfile ${{ matrix.dockerfile[0]}} does not exist"
exit 1;
fi
- name: Set up QEMU
uses: docker/setup-qemu-action@27d0a4f181a40b142cce983c5393082c365d1480 # @v1
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@94ab11c41e45d028884a99163086648e898eed25 # @v1
- name: Log in to GitHub Container Registry
uses: docker/login-action@f054a8b539a109f9f41c372932f1ae047eff08c9 # @v1
with:
registry: ghcr.io
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Log in to DockerHub
uses: docker/login-action@f054a8b539a109f9f41c372932f1ae047eff08c9 # @v1
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Build & Deploy ${{ matrix.dockerfile[1] }}
uses: docker/build-push-action@a66e35b9cbcf4ad0ea91ffcaf7bbad63ad9e0229 # @v2
with:
file: share/spack/docker/${{matrix.dockerfile[1]}}
platforms: ${{ matrix.dockerfile[2] }}
push: ${{ github.event_name != 'pull_request' }}
tags: |
spack/${{ env.container }}
spack/${{ env.versioned }}
ghcr.io/spack/${{ env.container }}
ghcr.io/spack/${{ env.versioned }}

View File

@@ -1,77 +0,0 @@
name: linux builds
on:
push:
branches:
- develop
- releases/**
paths-ignore:
# Don't run if we only modified packages in the built-in repository
- 'var/spack/repos/builtin/**'
- '!var/spack/repos/builtin/packages/lz4/**'
- '!var/spack/repos/builtin/packages/mpich/**'
- '!var/spack/repos/builtin/packages/tut/**'
- '!var/spack/repos/builtin/packages/py-setuptools/**'
- '!var/spack/repos/builtin/packages/openjpeg/**'
- '!var/spack/repos/builtin/packages/r-rcpp/**'
- '!var/spack/repos/builtin/packages/ruby-rake/**'
# Don't run if we only modified documentation
- 'lib/spack/docs/**'
pull_request:
branches:
- develop
- releases/**
paths-ignore:
# Don't run if we only modified packages in the built-in repository
- 'var/spack/repos/builtin/**'
- '!var/spack/repos/builtin/packages/lz4/**'
- '!var/spack/repos/builtin/packages/mpich/**'
- '!var/spack/repos/builtin/packages/tut/**'
- '!var/spack/repos/builtin/packages/py-setuptools/**'
- '!var/spack/repos/builtin/packages/openjpeg/**'
- '!var/spack/repos/builtin/packages/r-rcpp/**'
- '!var/spack/repos/builtin/packages/ruby-rake/**'
# Don't run if we only modified documentation
- 'lib/spack/docs/**'
jobs:
build:
runs-on: ubuntu-latest
strategy:
matrix:
package:
- lz4 # MakefilePackage
- mpich~fortran # AutotoolsPackage
- 'tut%gcc@:10.99.99' # WafPackage
- py-setuptools # PythonPackage
- openjpeg # CMakePackage
- r-rcpp # RPackage
- ruby-rake # RubyPackage
steps:
- uses: actions/checkout@v2
- uses: actions/cache@v2.1.6
with:
path: ~/.ccache
key: ccache-build-${{ matrix.package }}
restore-keys: |
ccache-build-${{ matrix.package }}
- uses: actions/setup-python@v2
with:
python-version: 3.9
- name: Install System Packages
run: |
sudo apt-get update
sudo apt-get -yqq install ccache gfortran perl perl-base r-base r-base-core r-base-dev ruby findutils openssl libssl-dev libpciaccess-dev
R --version
perl --version
ruby --version
- name: Copy Configuration
run: |
ccache -M 300M && ccache -z
# Set up external deps for build tests, b/c they take too long to compile
cp share/spack/qa/configuration/*.yaml etc/spack/
- name: Run the build test
run: |
. share/spack/setup-env.sh
SPEC=${{ matrix.package }} share/spack/qa/run-build-tests
ccache -s

View File

@@ -24,8 +24,8 @@ jobs:
name: gcc with clang
runs-on: macos-latest
steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- uses: actions/setup-python@dc73133d4da04e56a135ae2246682783cc7c7cb6 # @v2
with:
python-version: 3.9
- name: spack install
@@ -39,8 +39,8 @@ jobs:
runs-on: macos-latest
timeout-minutes: 700
steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- uses: actions/setup-python@dc73133d4da04e56a135ae2246682783cc7c7cb6 # @v2
with:
python-version: 3.9
- name: spack install
@@ -52,8 +52,8 @@ jobs:
name: scipy, mpl, pd
runs-on: macos-latest
steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- uses: actions/setup-python@dc73133d4da04e56a135ae2246682783cc7c7cb6 # @v2
with:
python-version: 3.9
- name: spack install
@@ -62,17 +62,3 @@ jobs:
spack install -v --fail-fast py-scipy %apple-clang
spack install -v --fail-fast py-matplotlib %apple-clang
spack install -v --fail-fast py-pandas %apple-clang
install_mpi4py_clang:
name: mpi4py, petsc4py
runs-on: macos-latest
steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
with:
python-version: 3.9
- name: spack install
run: |
. .github/workflows/install_spack.sh
spack install -v --fail-fast py-mpi4py %apple-clang
spack install -v --fail-fast py-petsc4py %apple-clang

View File

@@ -15,8 +15,8 @@ jobs:
validate:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- uses: actions/setup-python@dc73133d4da04e56a135ae2246682783cc7c7cb6 # @v2
with:
python-version: 3.9
- name: Install Python Packages
@@ -31,15 +31,15 @@ jobs:
style:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@v2
- uses: actions/setup-python@dc73133d4da04e56a135ae2246682783cc7c7cb6 # @v2
with:
python-version: 3.9
- name: Install Python packages
run: |
pip install --upgrade pip six setuptools flake8 isort>=4.3.5 mypy>=0.800 black types-six
pip install --upgrade pip six setuptools types-six
- name: Setup git configuration
run: |
# Need this for the git tests to succeed.
@@ -48,26 +48,6 @@ jobs:
- name: Run style tests
run: |
share/spack/qa/run-style-tests
# Build the documentation
documentation:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
with:
python-version: 3.9
- name: Install System packages
run: |
sudo apt-get -y update
sudo apt-get install -y coreutils ninja-build graphviz
- name: Install Python packages
run: |
pip install --upgrade pip six setuptools
pip install --upgrade -r lib/spack/docs/requirements.txt
- name: Build documentation
run: |
share/spack/qa/run-doc-tests
# Check which files have been updated by the PR
changes:
runs-on: ubuntu-latest
@@ -77,12 +57,12 @@ jobs:
packages: ${{ steps.filter.outputs.packages }}
with_coverage: ${{ steps.coverage.outputs.with_coverage }}
steps:
- uses: actions/checkout@v2
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
if: ${{ github.event_name == 'push' }}
with:
fetch-depth: 0
# For pull requests it's not necessary to checkout the code
- uses: dorny/paths-filter@v2
- uses: dorny/paths-filter@b2feaf19c27470162a626bd6fa8438ae5b263721
id: filter
with:
# See https://github.com/dorny/paths-filter/issues/56 for the syntax used below
@@ -112,17 +92,17 @@ jobs:
# Run unit tests with different configurations on linux
unittests:
needs: [ validate, style, documentation, changes ]
needs: [ validate, style, changes ]
runs-on: ubuntu-latest
strategy:
matrix:
python-version: [2.7, 3.5, 3.6, 3.7, 3.8, 3.9]
concretizer: ['original', 'clingo']
steps:
- uses: actions/checkout@v2
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@v2
- uses: actions/setup-python@dc73133d4da04e56a135ae2246682783cc7c7cb6 # @v2
with:
python-version: ${{ matrix.python-version }}
- name: Install System packages
@@ -130,37 +110,28 @@ jobs:
sudo apt-get -y update
# Needed for unit tests
sudo apt-get -y install \
coreutils cvs gfortran graphviz gnupg2 mercurial ninja-build \
patchelf
# Needed for kcov
sudo apt-get -y install cmake binutils-dev libcurl4-openssl-dev
sudo apt-get -y install zlib1g-dev libdw-dev libiberty-dev
coreutils cvs gfortran graphviz gnupg2 mercurial ninja-build \
patchelf cmake bison libbison-dev kcov
- name: Install Python packages
run: |
pip install --upgrade pip six setuptools codecov coverage
pip install --upgrade pip six setuptools codecov coverage[toml]
# ensure style checks are not skipped in unit tests for python >= 3.6
# note that true/false (i.e., 1/0) are opposite in conditions in python and bash
if python -c 'import sys; sys.exit(not sys.version_info >= (3, 6))'; then
pip install --upgrade flake8 isort>=4.3.5 mypy>=0.900 black
fi
- name: Setup git configuration
run: |
# Need this for the git tests to succeed.
git --version
. .github/workflows/setup_git.sh
- name: Install kcov for bash script coverage
if: ${{ needs.changes.outputs.with_coverage == 'true' }}
env:
KCOV_VERSION: 34
run: |
KCOV_ROOT=$(mktemp -d)
wget --output-document=${KCOV_ROOT}/${KCOV_VERSION}.tar.gz https://github.com/SimonKagstrom/kcov/archive/v${KCOV_VERSION}.tar.gz
tar -C ${KCOV_ROOT} -xzvf ${KCOV_ROOT}/${KCOV_VERSION}.tar.gz
mkdir -p ${KCOV_ROOT}/build
cd ${KCOV_ROOT}/build && cmake -Wno-dev ${KCOV_ROOT}/kcov-${KCOV_VERSION} && cd -
make -C ${KCOV_ROOT}/build && sudo make -C ${KCOV_ROOT}/build install
- name: Bootstrap clingo from sources
- name: Bootstrap clingo
if: ${{ matrix.concretizer == 'clingo' }}
env:
SPACK_PYTHON: python
run: |
. share/spack/setup-env.sh
spack external find --not-buildable cmake bison
spack bootstrap untrust spack-install
spack -v solve zlib
- name: Run unit tests (full suite with coverage)
if: ${{ needs.changes.outputs.with_coverage == 'true' }}
@@ -180,48 +151,34 @@ jobs:
SPACK_TEST_SOLVER: ${{ matrix.concretizer }}
run: |
share/spack/qa/run-unit-tests
- uses: codecov/codecov-action@v1
- uses: codecov/codecov-action@f32b3a3741e1053eb607407145bc9619351dc93b # @v2.1.0
if: ${{ needs.changes.outputs.with_coverage == 'true' }}
with:
flags: unittests,linux,${{ matrix.concretizer }}
# Test shell integration
shell:
needs: [ validate, style, documentation, changes ]
needs: [ validate, style, changes ]
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@v2
- uses: actions/setup-python@dc73133d4da04e56a135ae2246682783cc7c7cb6 # @v2
with:
python-version: 3.9
- name: Install System packages
run: |
sudo apt-get -y update
# Needed for shell tests
sudo apt-get install -y coreutils csh zsh tcsh fish dash bash
# Needed for kcov
sudo apt-get -y install cmake binutils-dev libcurl4-openssl-dev
sudo apt-get -y install zlib1g-dev libdw-dev libiberty-dev
sudo apt-get install -y coreutils kcov csh zsh tcsh fish dash bash
- name: Install Python packages
run: |
pip install --upgrade pip six setuptools codecov coverage
pip install --upgrade pip six setuptools codecov coverage[toml]
- name: Setup git configuration
run: |
# Need this for the git tests to succeed.
git --version
. .github/workflows/setup_git.sh
- name: Install kcov for bash script coverage
if: ${{ needs.changes.outputs.with_coverage == 'true' }}
env:
KCOV_VERSION: 38
run: |
KCOV_ROOT=$(mktemp -d)
wget --output-document=${KCOV_ROOT}/${KCOV_VERSION}.tar.gz https://github.com/SimonKagstrom/kcov/archive/v${KCOV_VERSION}.tar.gz
tar -C ${KCOV_ROOT} -xzvf ${KCOV_ROOT}/${KCOV_VERSION}.tar.gz
mkdir -p ${KCOV_ROOT}/build
cd ${KCOV_ROOT}/build && cmake -Wno-dev ${KCOV_ROOT}/kcov-${KCOV_VERSION} && cd -
make -C ${KCOV_ROOT}/build && sudo make -C ${KCOV_ROOT}/build install
- name: Run shell tests (without coverage)
if: ${{ needs.changes.outputs.with_coverage == 'false' }}
run: |
@@ -232,13 +189,13 @@ jobs:
COVERAGE: true
run: |
share/spack/qa/run-shell-tests
- uses: codecov/codecov-action@v1
- uses: codecov/codecov-action@f32b3a3741e1053eb607407145bc9619351dc93b # @v2.1.0
if: ${{ needs.changes.outputs.with_coverage == 'true' }}
with:
flags: shelltests,linux
# Test for Python2.6 run on Centos 6
centos6:
needs: [ validate, style, documentation, changes ]
needs: [ validate, style, changes ]
runs-on: ubuntu-latest
container: spack/github-actions:centos6
steps:
@@ -248,28 +205,30 @@ jobs:
if: ${{ needs.changes.outputs.with_coverage == 'true' }}
env:
HOME: /home/spack-test
SPACK_TEST_SOLVER: original
run: |
whoami && echo $HOME && cd $HOME
git clone https://github.com/spack/spack.git && cd spack
git fetch origin ${{ github.ref }}:test-branch
git clone "${{ github.server_url }}/${{ github.repository }}.git" && cd spack
git fetch origin "${{ github.ref }}:test-branch"
git checkout test-branch
share/spack/qa/run-unit-tests
bin/spack unit-test -x
- name: Run unit tests (only package tests)
if: ${{ needs.changes.outputs.with_coverage == 'false' }}
env:
HOME: /home/spack-test
ONLY_PACKAGES: true
SPACK_TEST_SOLVER: original
run: |
whoami && echo $HOME && cd $HOME
git clone https://github.com/spack/spack.git && cd spack
git fetch origin ${{ github.ref }}:test-branch
git clone "${{ github.server_url }}/${{ github.repository }}.git" && cd spack
git fetch origin "${{ github.ref }}:test-branch"
git checkout test-branch
share/spack/qa/run-unit-tests
bin/spack unit-test -x -m "not maybeslow" -k "package_sanity"
# Test RHEL8 UBI with platform Python. This job is run
# only on PRs modifying core Spack
rhel8-platform-python:
needs: [ validate, style, documentation, changes ]
needs: [ validate, style, changes ]
runs-on: ubuntu-latest
if: ${{ needs.changes.outputs.with_coverage == 'true' }}
container: registry.access.redhat.com/ubi8/ubi
@@ -279,7 +238,7 @@ jobs:
dnf install -y \
bzip2 curl file gcc-c++ gcc gcc-gfortran git gnupg2 gzip \
make patch tcl unzip which xz
- uses: actions/checkout@v2
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
- name: Setup repo and non-root user
run: |
git --version
@@ -291,16 +250,17 @@ jobs:
shell: runuser -u spack-test -- bash {0}
run: |
source share/spack/setup-env.sh
spack -d solve zlib
spack unit-test -k 'not cvs and not svn and not hg' -x --verbose
# Test for the clingo based solver (using clingo-cffi)
clingo-cffi:
needs: [ validate, style, documentation, changes ]
needs: [ validate, style, changes ]
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@v2
- uses: actions/setup-python@dc73133d4da04e56a135ae2246682783cc7c7cb6 # @v2
with:
python-version: 3.9
- name: Install System packages
@@ -309,24 +269,10 @@ jobs:
# Needed for unit tests
sudo apt-get -y install \
coreutils cvs gfortran graphviz gnupg2 mercurial ninja-build \
patchelf
# Needed for kcov
sudo apt-get -y install cmake binutils-dev libcurl4-openssl-dev
sudo apt-get -y install zlib1g-dev libdw-dev libiberty-dev
- name: Install kcov for bash script coverage
if: ${{ needs.changes.outputs.with_coverage == 'true' }}
env:
KCOV_VERSION: 34
run: |
KCOV_ROOT=$(mktemp -d)
wget --output-document=${KCOV_ROOT}/${KCOV_VERSION}.tar.gz https://github.com/SimonKagstrom/kcov/archive/v${KCOV_VERSION}.tar.gz
tar -C ${KCOV_ROOT} -xzvf ${KCOV_ROOT}/${KCOV_VERSION}.tar.gz
mkdir -p ${KCOV_ROOT}/build
cd ${KCOV_ROOT}/build && cmake -Wno-dev ${KCOV_ROOT}/kcov-${KCOV_VERSION} && cd -
make -C ${KCOV_ROOT}/build && sudo make -C ${KCOV_ROOT}/build install
patchelf kcov
- name: Install Python packages
run: |
pip install --upgrade pip six setuptools codecov coverage clingo
pip install --upgrade pip six setuptools codecov coverage[toml] clingo
- name: Setup git configuration
run: |
# Need this for the git tests to succeed.
@@ -348,48 +294,54 @@ jobs:
SPACK_TEST_SOLVER: clingo
run: |
share/spack/qa/run-unit-tests
- uses: codecov/codecov-action@v1
- uses: codecov/codecov-action@f32b3a3741e1053eb607407145bc9619351dc93b # @v2.1.0
if: ${{ needs.changes.outputs.with_coverage == 'true' }}
with:
flags: unittests,linux,clingo
# Run unit tests on MacOS
build:
needs: [ validate, style, documentation, changes ]
needs: [ validate, style, changes ]
runs-on: macos-latest
strategy:
matrix:
python-version: [3.8]
steps:
- uses: actions/checkout@v2
- uses: actions/checkout@ec3a7ce113134d7a93b817d10a8272cb61118579 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@v2
- uses: actions/setup-python@dc73133d4da04e56a135ae2246682783cc7c7cb6 # @v2
with:
python-version: ${{ matrix.python-version }}
- name: Install Python packages
run: |
pip install --upgrade pip six setuptools
pip install --upgrade codecov coverage
pip install --upgrade flake8 isort>=4.3.5 mypy>=0.800
pip install --upgrade codecov coverage[toml]
- name: Setup Homebrew packages
run: |
brew install dash fish gcc gnupg2 kcov
- name: Run unit tests
env:
SPACK_TEST_SOLVER: clingo
run: |
git --version
. .github/workflows/setup_git.sh
. share/spack/setup-env.sh
$(which spack) bootstrap untrust spack-install
$(which spack) solve zlib
if [ "${{ needs.changes.outputs.with_coverage }}" == "true" ]
then
coverage run $(which spack) unit-test -x
coverage combine
coverage xml
# Delete the symlink going from ./lib/spack/docs/_spack_root back to
# the initial directory, since it causes ELOOP errors with codecov/actions@2
rm lib/spack/docs/_spack_root
else
echo "ONLY PACKAGE RECIPES CHANGED [skipping coverage]"
$(which spack) unit-test -x -m "not maybeslow" -k "package_sanity"
fi
- uses: codecov/codecov-action@v1
- uses: codecov/codecov-action@f32b3a3741e1053eb607407145bc9619351dc93b # @v2.1.0
if: ${{ needs.changes.outputs.with_coverage == 'true' }}
with:
file: ./coverage.xml
files: ./coverage.xml
flags: unittests,macos

.gitignore

@@ -136,6 +136,7 @@ venv/
ENV/
env.bak/
venv.bak/
!/lib/spack/env
# Spyder project settings
.spyderproject
@@ -209,6 +210,9 @@ tramp
/eshell/history
/eshell/lastdir
# zsh byte-compiled files
*.zwc
# elpa packages
/elpa/


@@ -1,35 +0,0 @@
[mypy]
python_version = 3.7
files=lib/spack/llnl/**/*.py,lib/spack/spack/**/*.py
mypy_path=bin,lib/spack,lib/spack/external,var/spack/repos/builtin
# This and a generated import file allows supporting packages
namespace_packages=True
# To avoid re-factoring all the externals, ignore errors and missing imports
# globally, then turn back on in spack and spack submodules
ignore_errors=True
ignore_missing_imports=True
[mypy-spack.*]
ignore_errors=False
ignore_missing_imports=False
[mypy-packages.*]
ignore_errors=False
ignore_missing_imports=False
[mypy-llnl.*]
ignore_errors=False
ignore_missing_imports=False
[mypy-spack.test.packages]
ignore_errors=True
# ignore errors in fake import path for packages
[mypy-spack.pkg.*]
ignore_errors=True
ignore_missing_imports=True
# jinja has syntax in it that requires python3 and causes a parse error
# skip importing it
[mypy-jinja2]
follow_imports=skip


@@ -2,6 +2,7 @@ version: 2
sphinx:
configuration: lib/spack/docs/conf.py
fail_on_warning: true
python:
version: 3.7


@@ -1,9 +1,10 @@
# <img src="https://cdn.rawgit.com/spack/spack/develop/share/spack/logo/spack-logo.svg" width="64" valign="middle" alt="Spack"/> Spack
[![Unit Tests](https://github.com/spack/spack/workflows/linux%20tests/badge.svg)](https://github.com/spack/spack/actions)
[![Linux Builds](https://github.com/spack/spack/workflows/linux%20builds/badge.svg)](https://github.com/spack/spack/actions)
[![Bootstrapping](https://github.com/spack/spack/actions/workflows/bootstrap.yml/badge.svg)](https://github.com/spack/spack/actions/workflows/bootstrap.yml)
[![macOS Builds (nightly)](https://github.com/spack/spack/workflows/macOS%20builds%20nightly/badge.svg?branch=develop)](https://github.com/spack/spack/actions?query=workflow%3A%22macOS+builds+nightly%22)
[![codecov](https://codecov.io/gh/spack/spack/branch/develop/graph/badge.svg)](https://codecov.io/gh/spack/spack)
[![Containers](https://github.com/spack/spack/actions/workflows/build-containers.yml/badge.svg)](https://github.com/spack/spack/actions/workflows/build-containers.yml)
[![Read the Docs](https://readthedocs.org/projects/spack/badge/?version=latest)](https://spack.readthedocs.io)
[![Slack](https://slack.spack.io/badge.svg)](https://slack.spack.io)
@@ -26,7 +27,7 @@ for examples and highlights.
To install spack and your first package, make sure you have Python.
Then:
$ git clone https://github.com/spack/spack.git
$ git clone -c feature.manyFiles=true https://github.com/spack/spack.git
$ cd spack/bin
$ ./spack install zlib
@@ -36,6 +37,8 @@ Documentation
[**Full documentation**](https://spack.readthedocs.io/) is available, or
run `spack help` or `spack help --all`.
For a cheat sheet on Spack syntax, run `spack help --spec`.
Tutorial
----------------

SECURITY.md

@@ -0,0 +1,24 @@
# Security Policy
## Supported Versions
We provide security updates for the following releases.
For more on Spack's release structure, see
[`README.md`](https://github.com/spack/spack#releases).
| Version | Supported |
| ------- | ------------------ |
| develop | :white_check_mark: |
| 0.16.x | :white_check_mark: |
## Reporting a Vulnerability
To report a vulnerability or other security
issue, email maintainers@spack.io.
You can expect to hear back within two days.
If your security issue is accepted, we will do
our best to release a fix within a week. If
fixing the issue will take longer than this,
we will discuss timeline options with you.


@@ -28,6 +28,7 @@ exit 1
from __future__ import print_function
import os
import os.path
import sys
min_python3 = (3, 5)
@@ -70,6 +71,28 @@ if "ruamel.yaml" in sys.modules:
if "ruamel" in sys.modules:
del sys.modules["ruamel"]
# The following code is here to avoid failures when updating
# the develop version, due to spurious argparse.pyc files remaining
# in the libs/spack/external directory, see:
# https://github.com/spack/spack/pull/25376
# TODO: Remove in v0.18.0 or later
try:
import argparse
except ImportError:
argparse_pyc = os.path.join(spack_external_libs, 'argparse.pyc')
if not os.path.exists(argparse_pyc):
raise
try:
os.remove(argparse_pyc)
import argparse # noqa
except Exception:
msg = ('The file\n\n\t{0}\n\nis corrupted and cannot be deleted by Spack. '
'Either delete it manually or ask some administrator to '
'delete it for you.')
print(msg.format(argparse_pyc))
sys.exit(1)
import spack.main # noqa
# Once we've set up the system path, run the spack main method


@@ -0,0 +1,32 @@
bootstrap:
# If set to false Spack will not bootstrap missing software,
# but will instead raise an error.
enable: true
# Root directory for bootstrapping work. The software bootstrapped
# by Spack is installed in a "store" subfolder of this root directory
root: $user_cache_path/bootstrap
# Methods that can be used to bootstrap software. Each method may or
# may not be able to bootstrap all of the software that Spack needs,
# depending on its type.
sources:
- name: 'github-actions'
type: buildcache
description: |
Buildcache generated from a public workflow using Github Actions.
The sha256 checksum of binaries is checked before installation.
info:
url: https://mirror.spack.io/bootstrap/github-actions/v0.1
homepage: https://github.com/alalazo/spack-bootstrap-mirrors
releases: https://github.com/alalazo/spack-bootstrap-mirrors/releases
# This method is just Spack bootstrapping the software it needs from sources.
# It has been added here so that users can selectively disable bootstrapping
# from sources by "untrusting" it.
- name: spack-install
type: install
description: |
Specs built from sources by Spack. May take a long time.
trusted:
# By default we trust bootstrapping from sources and from binaries
# produced on Github via the workflow
github-actions: true
spack-install: true


@@ -42,8 +42,8 @@ config:
# (i.e., ``$TMP`` or ``$TMPDIR``).
#
# Another option that prevents conflicts and potential permission issues is
# to specify `~/.spack/stage`, which ensures each user builds in their home
# directory.
# to specify `$user_cache_path/stage`, which ensures each user builds in their
# home directory.
#
# A more traditional path uses the value of `$spack/var/spack/stage`, which
# builds directly inside Spack's instance without staging them in a
@@ -60,13 +60,13 @@ config:
# identifies Spack staging to avoid accidentally wiping out non-Spack work.
build_stage:
- $tempdir/$user/spack-stage
- ~/.spack/stage
- $user_cache_path/stage
# - $spack/var/spack/stage
# Directory in which to run tests and store test results.
# Tests will be stored in directories named by date/time and package
# name/hash.
test_stage: ~/.spack/test
test_stage: $user_cache_path/test
# Cache directory for already downloaded source tarballs and archived
# repositories. This can be purged with `spack clean --downloads`.
@@ -75,7 +75,7 @@ config:
# Cache directory for miscellaneous files, like the package index.
# This can be purged with `spack clean --misc-cache`
misc_cache: ~/.spack/cache
misc_cache: $user_cache_path/cache
# Timeout in seconds used for downloading sources etc. This only applies
@@ -134,6 +134,10 @@ config:
# enabling locks.
locks: true
# The default url fetch method to use.
# If set to 'curl', Spack will require curl on the user's system
# If set to 'urllib', Spack will use python built-in libs to fetch
url_fetch_method: urllib
# The maximum number of jobs to use for the build system (e.g. `make`), when
# the -j flag is not given on the command line. Defaults to 16 when not set.
@@ -156,11 +160,10 @@ config:
# sufficiently for many specs.
#
# 'clingo': Uses a logic solver under the hood to solve DAGs with full
# backtracking and optimization for user preferences.
# backtracking and optimization for user preferences. Spack will
# try to bootstrap the logic solver, if not already available.
#
# 'clingo' currently requires the clingo ASP solver to be installed and
# built with python bindings. 'original' is built in.
concretizer: original
concretizer: clingo
# How long to wait to lock the Spack installation database. This lock is used
@@ -187,3 +190,8 @@ config:
# Set to 'false' to allow installation on filesystems that don't allow setgid bit
# manipulation by unprivileged users (e.g. AFS)
allow_sgid: true
# Whether to set the terminal title to display status information during
# building and installing packages. This gives information about Spack's
# current progress as well as the current and total number of packages.
terminal_title: false


@@ -43,6 +43,7 @@ packages:
opencl: [pocl]
onedal: [intel-oneapi-dal]
osmesa: [mesa+osmesa, mesa18+osmesa]
pbs: [openpbs, torque]
pil: [py-pillow]
pkgconfig: [pkgconf, pkg-config]
rpc: [libtirpc]


@@ -2,7 +2,7 @@
#
# You can set these variables from the command line.
SPHINXOPTS = -W
SPHINXOPTS = -W --keep-going
SPHINXBUILD = sphinx-build
PAPER =
BUILDDIR = _build


@@ -31,13 +31,13 @@ colorized output with a flag
.. code-block:: console
$ spack --color always | less -R
$ spack --color always find | less -R
or an environment variable
.. code-block:: console
$ SPACK_COLOR=always spack | less -R
$ SPACK_COLOR=always spack find | less -R
--------------------------
Listing available packages
@@ -695,6 +695,136 @@ structured the way you want:
}
^^^^^^^^^^^^^^
``spack diff``
^^^^^^^^^^^^^^
It's often the case that you have two versions of a spec that you need to
disambiguate. Let's say that we've installed two variants of zlib, one with
and one without the optimize variant:
.. code-block:: console
$ spack install zlib
$ spack install zlib -optimize
When we do ``spack find`` we see the two versions.
.. code-block:: console
$ spack find zlib
==> 2 installed packages
-- linux-ubuntu20.04-skylake / gcc@9.3.0 ------------------------
zlib@1.2.11 zlib@1.2.11
Let's now say that we want to uninstall zlib. We run the command, and hit a problem
real quickly since we have two!
.. code-block:: console
$ spack uninstall zlib
==> Error: zlib matches multiple packages:
-- linux-ubuntu20.04-skylake / gcc@9.3.0 ------------------------
efzjziy zlib@1.2.11 sl7m27m zlib@1.2.11
==> Error: You can either:
a) use a more specific spec, or
b) specify the spec by its hash (e.g. `spack uninstall /hash`), or
c) use `spack uninstall --all` to uninstall ALL matching specs.
Oh no! We can see from the above that we have two different installations of zlib,
and the only difference between the two is the hash. This is a good use case for
``spack diff``, which can easily show us the "diff" or set difference
between properties for two packages. Let's try it out.
Since the only difference we see in the ``spack find`` view is the hash, let's use
``spack diff`` to look for more detail. We will provide the two hashes:
.. code-block:: console
$ spack diff /efzjziy /sl7m27m
==> Warning: This interface is subject to change.
--- zlib@1.2.11efzjziyc3dmb5h5u5azsthgbgog5mj7g
+++ zlib@1.2.11sl7m27mzkbejtkrajigj3a3m37ygv4u2
@@ variant_value @@
- zlib optimize False
+ zlib optimize True
The output is colored, and written in the style of a git diff. This means that you
can copy and paste it into a GitHub markdown as a code block with language "diff"
and it will render nicely! Here is an example:
.. code-block:: md
```diff
--- zlib@1.2.11/efzjziyc3dmb5h5u5azsthgbgog5mj7g
+++ zlib@1.2.11/sl7m27mzkbejtkrajigj3a3m37ygv4u2
@@ variant_value @@
- zlib optimize False
+ zlib optimize True
```
Awesome! Now let's read the diff. It tells us that our first zlib was built with ``~optimize``
(``False``) and the second was built with ``+optimize`` (``True``). You can't see it in the docs
here, but the output above is also colored based on the content being an addition (+) or
subtraction (-).
This is a small example, but you will be able to see differences for any attributes on the
installation spec. Running ``spack diff A B`` means we'll see which spec attributes are on
``B`` but not on ``A`` (green) and which are on ``A`` but not on ``B`` (red). Here is another
example with an additional difference type, ``version``:
.. code-block:: console
$ spack diff python@2.7.8 python@3.8.11
==> Warning: This interface is subject to change.
--- python@2.7.8/tsxdi6gl4lihp25qrm4d6nys3nypufbf
+++ python@3.8.11/yjtseru4nbpllbaxb46q7wfkyxbuvzxx
@@ variant_value @@
- python patches a8c52415a8b03c0e5f28b5d52ae498f7a7e602007db2b9554df28cd5685839b8
+ python patches 0d98e93189bc278fbc37a50ed7f183bd8aaf249a8e1670a465f0db6bb4f8cf87
@@ version @@
- openssl 1.0.2u
+ openssl 1.1.1k
- python 2.7.8
+ python 3.8.11
Let's say that we were only interested in one kind of attribute above, ``version``.
We can ask the command to only output this attribute. To do this, you'd add
the ``--attribute`` for attribute parameter, which defaults to all. Here is how you
would filter to show just versions:
.. code-block:: console
$ spack diff --attribute version python@2.7.8 python@3.8.11
==> Warning: This interface is subject to change.
--- python@2.7.8/tsxdi6gl4lihp25qrm4d6nys3nypufbf
+++ python@3.8.11/yjtseru4nbpllbaxb46q7wfkyxbuvzxx
@@ version @@
- openssl 1.0.2u
+ openssl 1.1.1k
- python 2.7.8
+ python 3.8.11
And you can add as many attributes as you'd like with multiple ``--attribute`` arguments
(for lots of attributes, you can use ``-a`` for short). Finally, if you want to view the
data as json (and possibly pipe into an output file) just add ``--json``:
.. code-block:: console
$ spack diff --json python@2.7.8 python@3.8.11
This data will be much longer because along with the differences for ``A`` vs. ``B`` and
``B`` vs. ``A``, the JSON output also shows the intersection.
------------------------
Using installed packages
------------------------
@@ -738,8 +868,9 @@ your path:
These commands will add appropriate directories to your ``PATH``,
``MANPATH``, ``CPATH``, and ``LD_LIBRARY_PATH`` according to the
:ref:`prefix inspections <customize-env-modifications>` defined in your
modules configuration. When you no longer want to use a package, you
can type unload or unuse similarly:
modules configuration.
When you no longer want to use a package, you can type unload or
unuse similarly:
.. code-block:: console
@@ -780,6 +911,22 @@ first ``libelf`` above, you would run:
$ spack load /qmm4kso
To see which packages you have loaded into your environment, you would
use ``spack find --loaded``.
.. code-block:: console
$ spack find --loaded
==> 2 installed packages
-- linux-debian7 / gcc@4.4.7 ------------------------------------
libelf@0.8.13
-- linux-debian7 / intel@15.0.0 ---------------------------------
libelf@0.8.13
You can also use ``spack load --list`` to get the same output, but it
does not have the full set of query options that ``spack find`` offers.
We'll learn more about Spack's spec syntax in the next section.
@@ -1519,6 +1666,7 @@ and it will be added to the ``PYTHONPATH`` in your current shell:
Now ``import numpy`` will succeed for as long as you keep your current
session open.
The loaded packages can be checked using ``spack find --loaded``.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Loading Extensions via Modules


@@ -63,6 +63,7 @@ on these ideas for each distinct build system that Spack supports:
build_systems/intelpackage
build_systems/rocmpackage
build_systems/custompackage
build_systems/multiplepackage
For reference, the :py:mod:`Build System API docs <spack.build_systems>`
provide a list of build systems and methods/attributes that can be


@@ -112,20 +112,44 @@ phase runs:
.. code-block:: console
$ libtoolize
$ aclocal
$ autoreconf --install --verbose --force
$ autoreconf --install --verbose --force -I <aclocal-prefix>/share/aclocal
All you need to do is add a few Autotools dependencies to the package.
Most stable releases will come with a ``configure`` script, but if you
check out a commit from the ``develop`` branch, you would want to add:
In case you need to add more arguments, override ``autoreconf_extra_args``
in your ``package.py`` on class scope like this:
.. code-block:: python
depends_on('autoconf', type='build', when='@develop')
depends_on('automake', type='build', when='@develop')
depends_on('libtool', type='build', when='@develop')
depends_on('m4', type='build', when='@develop')
autoreconf_extra_args = ["-Im4"]
All you need to do is add a few Autotools dependencies to the package.
Most stable releases will come with a ``configure`` script, but if you
check out a commit from the ``master`` branch, you would want to add:
.. code-block:: python
depends_on('autoconf', type='build', when='@master')
depends_on('automake', type='build', when='@master')
depends_on('libtool', type='build', when='@master')
It is typically redundant to list the ``m4`` macro processor package as a
dependency, since ``autoconf`` already depends on it.
"""""""""""""""""""""""""""""""
Using a custom autoreconf phase
"""""""""""""""""""""""""""""""
In some cases, you may need to replace the default implementation
of the autoreconf phase with one that runs a script interpreter. In this
example, the ``bash`` shell is used to run the ``autogen.sh`` script.
.. code-block:: python
def autoreconf(self, spec, prefix):
which('bash')('autogen.sh')
"""""""""""""""""""""""""""""""""""""""
patching configure or Makefile.in files
"""""""""""""""""""""""""""""""""""""""
In some cases, developers might need to distribute a patch that modifies
one of the files used to generate ``configure`` or ``Makefile.in``.
@@ -135,6 +159,57 @@ create a new patch that directly modifies ``configure``. That way,
Spack can use the secondary patch and additional build system
dependencies aren't necessary.
""""""""""""""""""""""""""""
Old Autotools helper scripts
""""""""""""""""""""""""""""
Autotools-based tarballs come with helper scripts such as ``config.sub`` and
``config.guess``. It is the responsibility of the developers to keep these files
up to date so that they run on every platform, but for very old software
releases this is impossible. In these cases Spack can help to replace these
files with newer ones, without having to add the heavy dependency on
``automake``.
Automatic helper script replacement is currently enabled by default on
``ppc64le`` and ``aarch64``, as these are the known cases where old scripts fail.
On these targets, ``AutotoolsPackage`` adds a build dependency on ``gnuconfig``,
which is a very light-weight package with newer versions of the helper files.
Spack then tries to run all the helper scripts it can find in the release, and
replaces them on failure with the helper scripts from ``gnuconfig``.
To opt out of this feature, use the following setting:
.. code-block:: python
patch_config_files = False
To enable it conditionally on different architectures, define a property and
make the package depend on ``gnuconfig`` as a build dependency:
.. code-block:: python
depends_on('gnuconfig', when='@1.0:')
@property
def patch_config_files(self):
return self.spec.satisfies("@1.0:")
.. note::
On some exotic architectures it is necessary to use system provided
``config.sub`` and ``config.guess`` files. In this case, the most
transparent solution is to mark the ``gnuconfig`` package as external and
non-buildable, with a prefix set to the directory containing the files:
.. code-block:: yaml
gnuconfig:
buildable: false
externals:
- spec: gnuconfig@master
prefix: /usr/share/configure_files/
""""""""""""""""
force_autoreconf
""""""""""""""""
@@ -324,8 +399,29 @@ options:
--with-libfabric=</path/to/libfabric>
"""""""""""""""""""""""
The ``variant`` keyword
"""""""""""""""""""""""
When Spack variants and configure flags do not correspond one-to-one, the
``variant`` keyword can be passed to ``with_or_without`` and
``enable_or_disable``. For example:
.. code-block:: python
variant('debug_tools', default=False)
config_args += self.enable_or_disable('debug-tools', variant='debug_tools')
Or when one variant controls multiple flags:
.. code-block:: python
variant('debug_tools', default=False)
config_args += self.with_or_without('memchecker', variant='debug_tools')
config_args += self.with_or_without('profiler', variant='debug_tools')
""""""""""""""""""""
activation overrides
Activation overrides
""""""""""""""""""""
Finally, the behavior of either ``with_or_without`` or


@@ -130,8 +130,8 @@ Adding flags to cmake
To add additional flags to the ``cmake`` call, simply override the
``cmake_args`` function. The following example defines values for the flags
``WHATEVER``, ``ENABLE_BROKEN_FEATURE``, ``DETECT_HDF5``, and ``THREADS`` with
and without the :py:meth:`~.CMakePackage.define` and
:py:meth:`~.CMakePackage.define_from_variant` helper functions:
and without the :meth:`~spack.build_systems.cmake.CMakePackage.define` and
:meth:`~spack.build_systems.cmake.CMakePackage.define_from_variant` helper functions:
.. code-block:: python


@@ -0,0 +1,350 @@
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)
.. _multiplepackage:
----------------------
Multiple Build Systems
----------------------
Quite frequently, a package will change build systems from one version to the
next. For example, a small project that once used a single Makefile to build
may now require Autotools to handle the increased number of files that need to
be compiled. Or, a package that once used Autotools may switch to CMake for
Windows support. In this case, it becomes a bit more challenging to write a
single build recipe for this package in Spack.
There are several ways that this can be handled in Spack:
#. Subclass the new build system, and override phases as needed (preferred)
#. Subclass ``Package`` and implement ``install`` as needed
#. Create separate ``*-cmake``, ``*-autotools``, etc. packages for each build system
#. Rename the old package to ``*-legacy`` and create a new package
#. Move the old package to a ``legacy`` repository and create a new package
#. Drop older versions that only support the older build system
Of these options, 1 is preferred, and will be demonstrated in this
documentation. Options 3-5 have issues with concretization, so shouldn't be
used. Options 4-5 also don't support more than two build systems. Option 6 only
works if the old versions are no longer needed. Option 1 is preferred over 2
because it makes it easier to drop the old build system entirely.
The exact syntax of the package depends on which build systems you need to
support. Below are a couple of common examples.
^^^^^^^^^^^^^^^^^^^^^
Makefile -> Autotools
^^^^^^^^^^^^^^^^^^^^^
Let's say we have the following package:
.. code-block:: python
class Foo(MakefilePackage):
version("1.2.0", sha256="...")
def edit(self, spec, prefix):
filter_file("CC=", "CC=" + spack_cc, "Makefile")
def install(self, spec, prefix):
install_tree(".", prefix)
The package subclasses from :ref:`makefilepackage`, which has three phases:
#. ``edit`` (does nothing by default)
#. ``build`` (runs ``make`` by default)
#. ``install`` (runs ``make install`` by default)
In this case, the ``install`` phase needed to be overridden because the
Makefile did not have an install target. We also modify the Makefile to use
Spack's compiler wrappers. The default ``build`` phase is not changed.
Starting with version 1.3.0, we want to use Autotools to build instead.
:ref:`autotoolspackage` has four phases:
#. ``autoreconf`` (does nothing if a configure script already exists)
#. ``configure`` (runs ``./configure --prefix=...`` by default)
#. ``build`` (runs ``make`` by default)
#. ``install`` (runs ``make install`` by default)
If the only version we need to support is 1.3.0, the package would look as
simple as:
.. code-block:: python
class Foo(AutotoolsPackage):
version("1.3.0", sha256="...")
def configure_args(self):
return ["--enable-shared"]
In this case, we use the default methods for each phase and only override
``configure_args`` to specify additional flags to pass to ``./configure``.
If we wanted to write a single package that supports both versions 1.2.0 and
1.3.0, it would look something like:
.. code-block:: python
class Foo(AutotoolsPackage):
version("1.3.0", sha256="...")
version("1.2.0", sha256="...", deprecated=True)
def configure_args(self):
return ["--enable-shared"]
# Remove the following once version 1.2.0 is dropped
@when("@:1.2")
def patch(self):
filter_file("CC=", "CC=" + spack_cc, "Makefile")
@when("@:1.2")
def autoreconf(self, spec, prefix):
pass
@when("@:1.2")
def configure(self, spec, prefix):
pass
@when("@:1.2")
def install(self, spec, prefix):
install_tree(".", prefix)
There are a few interesting things to note here:
* We added ``deprecated=True`` to version 1.2.0. This signifies that version
1.2.0 is deprecated and shouldn't be used. However, if a user still relies
on version 1.2.0, it's still there and builds just fine.
* We moved the contents of the ``edit`` phase to the ``patch`` function. Since
``AutotoolsPackage`` doesn't have an ``edit`` phase, the only way for this
step to be executed is to move it to the ``patch`` function, which always
gets run.
* The ``autoreconf`` and ``configure`` phases become no-ops. Since the old
Makefile-based build system doesn't use these, we ignore these phases when
building ``foo@1.2.0``.
* The ``@when`` decorator is used to override these phases only for older
versions. The default methods are used for ``foo@1.3:``.
Once a new Spack release comes out, version 1.2.0 and everything below the
comment can be safely deleted. The result is the same as if we had written a
package for version 1.3.0 from scratch.
^^^^^^^^^^^^^^^^^^
Autotools -> CMake
^^^^^^^^^^^^^^^^^^
Let's say we have the following package:
.. code-block:: python
class Bar(AutotoolsPackage):
version("1.2.0", sha256="...")
def configure_args(self):
return ["--enable-shared"]
The package subclasses from :ref:`autotoolspackage`, which has four phases:
#. ``autoreconf`` (does nothing if a configure script already exists)
#. ``configure`` (runs ``./configure --prefix=...`` by default)
#. ``build`` (runs ``make`` by default)
#. ``install`` (runs ``make install`` by default)
In this case, we use the default methods for each phase and only override
``configure_args`` to specify additional flags to pass to ``./configure``.
Starting with version 1.3.0, we want to use CMake to build instead.
:ref:`cmakepackage` has three phases:
#. ``cmake`` (runs ``cmake ...`` by default)
#. ``build`` (runs ``make`` by default)
#. ``install`` (runs ``make install`` by default)
If the only version we need to support is 1.3.0, the package would look as
simple as:
.. code-block:: python
class Bar(CMakePackage):
version("1.3.0", sha256="...")
def cmake_args(self):
return [self.define("BUILD_SHARED_LIBS", True)]
In this case, we use the default methods for each phase and only override
``cmake_args`` to specify additional flags to pass to ``cmake``.
If we wanted to write a single package that supports both versions 1.2.0 and
1.3.0, it would look something like:
.. code-block:: python
class Bar(CMakePackage):
version("1.3.0", sha256="...")
version("1.2.0", sha256="...", deprecated=True)
def cmake_args(self):
return [self.define("BUILD_SHARED_LIBS", True)]
# Remove the following once version 1.2.0 is dropped
def configure_args(self):
return ["--enable-shared"]
@when("@:1.2")
def cmake(self, spec, prefix):
configure("--prefix=" + prefix, *self.configure_args())
There are a few interesting things to note here:
* We added ``deprecated=True`` to version 1.2.0. This signifies that version
1.2.0 is deprecated and shouldn't be used. However, if a user still relies
on version 1.2.0, it's still there and builds just fine.
* Since CMake and Autotools are so similar, we only need to override the
``cmake`` phase; we can use the default ``build`` and ``install`` phases.
* We override ``cmake`` to run ``./configure`` for older versions.
``configure_args`` remains the same.
* The ``@when`` decorator is used to override these phases only for older
versions. The default methods are used for ``bar@1.3:``.
Once a new Spack release comes out, version 1.2.0 and everything below the
comment can be safely deleted. The result is the same as if we had written a
package for version 1.3.0 from scratch.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Multiple build systems for the same version
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
During the transition from one build system to another, developers often
support multiple build systems at the same time. Spack can only use a single
build system for a single version. To decide which build system to use for a
particular version, take the following things into account:
1. If the developers explicitly state that one build system is preferred over
another, use that one.
2. If one build system is considered "experimental" while another is considered
"stable", use the stable build system.
3. Otherwise, use the newer build system.
The developer preference for which build system to use can change over time as
a newer build system becomes stable/recommended.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dropping support for old build systems
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
When older versions of a package don't support a newer build system, it can be
tempting to simply delete them from a package. This significantly reduces
package complexity and makes the build recipe much easier to maintain. However,
other packages or Spack users may rely on these older versions. The recommended
approach is to first support both build systems (as demonstrated above),
:ref:`deprecate <deprecate>` versions that rely on the old build system, and
remove those versions and any phases that needed to be overridden in the next
Spack release.
^^^^^^^^^^^^^^^^^^^^^^^^^^^
Three or more build systems
^^^^^^^^^^^^^^^^^^^^^^^^^^^
In rare cases, a package may change build systems multiple times. For example,
a package may start with Makefiles, then switch to Autotools, then switch to
CMake. The same logic used above can be extended to any number of build systems.
For example:
.. code-block:: python
class Baz(CMakePackage):
version("1.4.0", sha256="...") # CMake
version("1.3.0", sha256="...") # Autotools
version("1.2.0", sha256="...") # Makefile
def cmake_args(self):
return [self.define("BUILD_SHARED_LIBS", True)]
# Remove the following once version 1.3.0 is dropped
def configure_args(self):
return ["--enable-shared"]
@when("@1.3")
def cmake(self, spec, prefix):
configure("--prefix=" + prefix, *self.configure_args())
# Remove the following once version 1.2.0 is dropped
@when("@:1.2")
def patch(self):
filter_file("CC=", "CC=" + spack_cc, "Makefile")
@when("@:1.2")
def cmake(self, spec, prefix):
pass
@when("@:1.2")
def install(self, spec, prefix):
install_tree(".", prefix)
^^^^^^^^^^^^^^^^^^^
Additional examples
^^^^^^^^^^^^^^^^^^^
When writing new packages, it often helps to see examples of existing packages.
Here is an incomplete list of existing Spack packages that have changed build
systems before:
================ ===================== ================
Package Previous Build System New Build System
================ ===================== ================
amber custom CMake
arpack-ng Autotools CMake
atk Autotools Meson
blast None Autotools
dyninst Autotools CMake
evtgen Autotools CMake
fish Autotools CMake
gdk-pixbuf Autotools Meson
glib Autotools Meson
glog Autotools CMake
gmt Autotools CMake
gtkplus Autotools Meson
hpl Makefile Autotools
interproscan Perl Maven
jasper Autotools CMake
kahip SCons CMake
kokkos Makefile CMake
kokkos-kernels Makefile CMake
leveldb Makefile CMake
libdrm Autotools Meson
libjpeg-turbo Autotools CMake
mesa Autotools Meson
metis None CMake
mpifileutils Autotools CMake
muparser Autotools CMake
mxnet Makefile CMake
nest Autotools CMake
neuron Autotools CMake
nsimd CMake nsconfig
opennurbs Makefile CMake
optional-lite None CMake
plasma Makefile CMake
preseq Makefile Autotools
protobuf Autotools CMake
py-pygobject Autotools Python
singularity Autotools Makefile
span-lite None CMake
ssht Makefile CMake
string-view-lite None CMake
superlu Makefile CMake
superlu-dist Makefile CMake
uncrustify Autotools CMake
================ ===================== ================
Packages that support multiple build systems can be a bit confusing to write.
Don't hesitate to open an issue or draft pull request and ask for advice from
other Spack developers!


@@ -336,7 +336,7 @@ This would be translated to:
.. code-block:: python
extends('python')
depends_on('python@3.5:3.999', type=('build', 'run'))
depends_on('python@3.5:3', type=('build', 'run'))
Many ``setup.py`` or ``setup.cfg`` files also contain information like::
@@ -568,7 +568,7 @@ check the ``METADATA`` file for lines like::
Lines that use ``Requires-Dist`` are similar to ``install_requires``.
Lines that use ``Provides-Extra`` are similar to ``extra_requires``,
and you can add a variant for those dependencies. The ``~=1.11.0``
syntax is equivalent to ``1.11.0:1.11.999``.
syntax is equivalent to ``1.11.0:1.11``.
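For instance, a hypothetical ``METADATA`` line such as ``Requires-Dist: django (~=1.11.0)``
could be translated to a dependency along these lines (the package name and range are
illustrative only):
.. code-block:: python
   # Sketch: translate "Requires-Dist: django (~=1.11.0)" into a Spack dependency
   depends_on('py-django@1.11.0:1.11', type=('build', 'run'))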
""""""""""
setuptools


@@ -97,15 +97,19 @@ def setup(sphinx):
# -- General configuration -----------------------------------------------------
# If your documentation needs a minimal Sphinx version, state it here.
needs_sphinx = '1.8'
needs_sphinx = '3.4'
# Add any Sphinx extension module names here, as strings. They can be extensions
# coming with Sphinx (named 'sphinx.ext.*') or your custom ones.
extensions = ['sphinx.ext.autodoc',
'sphinx.ext.graphviz',
'sphinx.ext.napoleon',
'sphinx.ext.todo',
'sphinxcontrib.programoutput']
extensions = [
'sphinx.ext.autodoc',
'sphinx.ext.graphviz',
'sphinx.ext.intersphinx',
'sphinx.ext.napoleon',
'sphinx.ext.todo',
'sphinx.ext.viewcode',
'sphinxcontrib.programoutput',
]
# Set default graphviz options
graphviz_dot_args = [
@@ -164,6 +168,19 @@ def setup(sphinx):
# directories to ignore when looking for source files.
exclude_patterns = ['_build', '_spack_root', '.spack-env']
nitpicky = True
nitpick_ignore = [
# Python classes that intersphinx is unable to resolve
('py:class', 'argparse.HelpFormatter'),
('py:class', 'contextlib.contextmanager'),
('py:class', 'module'),
('py:class', '_io.BufferedReader'),
('py:class', 'unittest.case.TestCase'),
('py:class', '_frozen_importlib_external.SourceFileLoader'),
# Spack classes that are private and we don't want to expose
('py:class', 'spack.provider_index._IndexBase'),
]
# The reST default role (used for this markup: `text`) to use for all documents.
#default_role = None
@@ -358,3 +375,11 @@ class SpackStyle(DefaultStyle):
# How to display URL addresses: 'footnote', 'no', or 'inline'.
#texinfo_show_urls = 'footnote'
# -- Extension configuration -------------------------------------------------
# sphinx.ext.intersphinx
intersphinx_mapping = {
"python": ("https://docs.python.org/3", None),
}


@@ -259,3 +259,16 @@ and ld.so will ONLY search for dependencies in the ``RUNPATH`` of
the loading object.
DO NOT MIX the two options within the same install tree.
----------------------
``terminal_title``
----------------------
When this option is set to ``true``, Spack updates the terminal's title to
provide information about its current progress, as well as the current and
total package numbers.
To work properly, this requires your terminal to reset its title after
Spack has finished its work; otherwise Spack's status information will
remain in the terminal's title indefinitely. Most terminals should already
be set up this way and clear Spack's status information.


@@ -402,12 +402,15 @@ Spack-specific variables
Spack understands several special variables. These are:
* ``$env``: name of the currently active :ref:`environment <environments>`
* ``$spack``: path to the prefix of this Spack installation
* ``$tempdir``: default system temporary directory (as specified in
Python's `tempfile.tempdir
<https://docs.python.org/2/library/tempfile.html#tempfile.tempdir>`_
variable).
* ``$user``: name of the current user
* ``$user_cache_path``: user cache directory (``~/.spack`` unless
:ref:`overridden <local-config-overrides>`)
Note that, as with shell variables, you can write these as ``$varname``
or with braces to distinguish the variable from surrounding characters:
@@ -562,3 +565,39 @@ built in and are not overridden by a configuration file. The
command line. ``dirty`` and ``install_tree`` come from the custom
scopes ``./my-scope`` and ``./my-scope-2``, and all other configuration
options come from the default configuration files that ship with Spack.
.. _local-config-overrides:
------------------------------
Overriding Local Configuration
------------------------------
Spack's ``system`` and ``user`` scopes provide ways for administrators and users to set
global defaults for all Spack instances, but for use cases where one wants a clean Spack
installation, these scopes can be undesirable. For example, users may want to opt out of
global system configuration, or they may want to ignore their own home directory
settings when running in a continuous integration environment.
Spack also, by default, keeps various caches and user data in ``~/.spack``, but
users may want to override these locations.
Spack provides three environment variables that allow you to override or opt out of
configuration locations:
* ``SPACK_USER_CONFIG_PATH``: Override the path to use for the
``user`` scope (``~/.spack`` by default).
* ``SPACK_SYSTEM_CONFIG_PATH``: Override the path to use for the
``system`` scope (``/etc/spack`` by default).
* ``SPACK_DISABLE_LOCAL_CONFIG``: set this environment variable to completely disable
**both** the system and user configuration directories. Spack will only consider its
own defaults and ``site`` configuration locations.
And one that allows you to move the default cache location:
* ``SPACK_USER_CACHE_PATH``: Override the default path to use for user data
(misc_cache, tests, reports, etc.)
With these settings, if you want to isolate Spack in a CI environment, you can do this::
export SPACK_DISABLE_LOCAL_CONFIG=true
export SPACK_USER_CACHE_PATH=/tmp/spack


@@ -126,9 +126,6 @@ are currently supported are summarized in the table below:
* - Ubuntu 18.04
- ``ubuntu:18.04``
- ``spack/ubuntu-bionic``
* - CentOS 6
- ``centos:6``
- ``spack/centos6``
* - CentOS 7
- ``centos:7``
- ``spack/centos7``
@@ -200,7 +197,7 @@ Setting Base Images
The ``images`` subsection is used to select both the image where
Spack builds the software and the image where the built software
is installed. This attribute can be set in two different ways and
is installed. This attribute can be set in different ways and
which one to use depends on the use case at hand.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -260,10 +257,54 @@ software is respectively built and installed:
ENTRYPOINT ["/bin/bash", "--rcfile", "/etc/profile", "-l"]
This method of selecting base images is the simplest of the two, and we advise
This is the simplest available method of selecting base images, and we advise
to use it whenever possible. There are cases though where using Spack official
images is not enough to fit production needs. In these situations users can manually
select which base image to start from in the recipe, as we'll see next.
images is not enough to fit production needs. In these situations users can
extend the recipe to start with the bootstrapping of Spack at a certain pinned
version or manually select which base image to start from in the recipe,
as we'll see next.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Use a Bootstrap Stage for Spack
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
In some cases users may want to pin the commit sha that is used for Spack, to ensure later
reproducibility, or start from a fork of the official Spack repository to try a bugfix or
a feature in the early stage of development. This is possible by being just a little more
verbose when specifying information about Spack in the ``spack.yaml`` file:
.. code-block:: yaml
images:
os: amazonlinux:2
spack:
# URL of the Spack repository to be used in the container image
url: <to-use-a-fork>
# Either a commit sha, a branch name or a tag
ref: <sha/tag/branch>
# If true turn a branch name or a tag into the corresponding commit
# sha at the time of recipe generation
resolve_sha: <true/false>
``url`` specifies the URL from which to clone Spack and defaults to https://github.com/spack/spack.
The ``ref`` attribute can be either a commit sha, a branch name or a tag. The default value in
this case is to use the ``develop`` branch, but it may change in the future to point to the latest stable
release. Finally, ``resolve_sha`` transforms branch names or tags into the corresponding commit
shas at the time of recipe generation, to allow for greater reproducibility of the results
at a later time.
The list of operating systems that can be used to bootstrap Spack can be
obtained with:
.. command-output:: spack containerize --list-os
.. note::
The ``resolve_sha`` option uses ``git rev-parse`` under the hood and thus requires
checking out the corresponding Spack repository in a temporary folder before generating
the recipe. Recipe generation may take longer when this option is set to true because
of this additional step.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Use Custom Images Provided by Users
@@ -415,6 +456,18 @@ to customize the generation of container recipes:
- Version of Spack used in the ``build`` stage
- Valid tags for ``base:image``
- Yes, if using constrained selection of base images
* - ``images:spack:url``
- Repository from which Spack is cloned
- Any fork of Spack
- No
* - ``images:spack:ref``
- Reference for the checkout of Spack
- Either a commit sha, a branch name or a tag
- No
* - ``images:spack:resolve_sha``
- Resolve branches and tags in ``spack.yaml`` to commits in the generated recipe
- True or False (default: False)
- No
* - ``images:build``
- Image to be used in the ``build`` stage
- Any valid container image


@@ -338,15 +338,6 @@ Once all of the dependencies are installed, you can try building the documentati
If you see any warning or error messages, you will have to correct those before
your PR is accepted.
.. note::
There is also a ``run-doc-tests`` script in ``share/spack/qa``. The only
difference between running this script and running ``make`` by hand is that
the script will exit immediately if it encounters an error or warning. This
is necessary for CI. If you made a lot of documentation changes, it is
much quicker to run ``make`` by hand so that you can see all of the warnings
at once.
If you are editing the documentation, you should obviously be running the
documentation tests. But even if you are simply adding a new package, your
changes could cause the documentation tests to fail:


@@ -108,9 +108,9 @@ with a high level view of Spack's directory structure:
spack/ <- spack module; contains Python code
analyzers/ <- modules to run analysis on installed packages
build_systems/ <- modules for different build systems
build_systems/ <- modules for different build systems
cmd/ <- each file in here is a spack subcommand
compilers/ <- compiler description files
compilers/ <- compiler description files
container/ <- module for spack containerize
hooks/ <- hook modules to run at different points
modules/ <- modules for lmod, tcl, etc.
@@ -151,24 +151,22 @@ Package-related modules
^^^^^^^^^^^^^^^^^^^^^^^
:mod:`spack.package`
Contains the :class:`Package <spack.package.Package>` class, which
Contains the :class:`~spack.package.Package` class, which
is the superclass for all packages in Spack. Methods on ``Package``
implement all phases of the :ref:`package lifecycle
<package-lifecycle>` and manage the build process.
:mod:`spack.packages`
Contains all of the packages in Spack and methods for managing them.
Functions like :func:`packages.get <spack.packages.get>` and
:func:`class_name_for_package_name
<packages.class_name_for_package_name>` handle mapping package module
names to class names and dynamically instantiating packages by name
from module files.
:mod:`spack.util.naming`
Contains functions for mapping between Spack package names,
Python module names, and Python class names. Functions like
:func:`~spack.util.naming.mod_to_class` handle mapping package
module names to class names.
:mod:`spack.relations`
*Relations* are relationships between packages, like
:func:`depends_on <spack.relations.depends_on>` and :func:`provides
<spack.relations.provides>`. See :ref:`dependencies` and
:ref:`virtual-dependencies`.
:mod:`spack.directives`
*Directives* are functions that can be called inside a package definition
to modify the package, like :func:`~spack.directives.depends_on`
and :func:`~spack.directives.provides`. See :ref:`dependencies`
and :ref:`virtual-dependencies`.
:mod:`spack.multimethod`
Implementation of the :func:`@when <spack.multimethod.when>`
@@ -180,31 +178,27 @@ Spec-related modules
^^^^^^^^^^^^^^^^^^^^
:mod:`spack.spec`
Contains :class:`Spec <spack.spec.Spec>` and :class:`SpecParser
<spack.spec.SpecParser>`. Also implements most of the logic for
normalization and concretization of specs.
Contains :class:`~spack.spec.Spec` and :class:`~spack.spec.SpecParser`.
Also implements most of the logic for normalization and concretization
of specs.
:mod:`spack.parse`
Contains some base classes for implementing simple recursive descent
parsers: :class:`Parser <spack.parse.Parser>` and :class:`Lexer
<spack.parse.Lexer>`. Used by :class:`SpecParser
<spack.spec.SpecParser>`.
parsers: :class:`~spack.parse.Parser` and :class:`~spack.parse.Lexer`.
Used by :class:`~spack.spec.SpecParser`.
:mod:`spack.concretize`
Contains :class:`DefaultConcretizer
<spack.concretize.DefaultConcretizer>` implementation, which allows
site administrators to change Spack's :ref:`concretization-policies`.
Contains :class:`~spack.concretize.Concretizer` implementation,
which allows site administrators to change Spack's :ref:`concretization-policies`.
:mod:`spack.version`
Implements a simple :class:`Version <spack.version.Version>` class
with simple comparison semantics. Also implements
:class:`VersionRange <spack.version.VersionRange>` and
:class:`VersionList <spack.version.VersionList>`. All three are
comparable with each other and offer union and intersection
operations. Spack uses these classes to compare versions and to
manage version constraints on specs. Comparison semantics are
similar to the ``LooseVersion`` class in ``distutils`` and to the
way RPM compares version strings.
Implements a simple :class:`~spack.version.Version` class with simple
comparison semantics. Also implements :class:`~spack.version.VersionRange`
and :class:`~spack.version.VersionList`. All three are comparable with each
other and offer union and intersection operations. Spack uses these classes
to compare versions and to manage version constraints on specs. Comparison
semantics are similar to the ``LooseVersion`` class in ``distutils`` and to
the way RPM compares version strings.
:mod:`spack.compilers`
Submodules contains descriptors for all valid compilers in Spack.
@@ -216,15 +210,6 @@ Spec-related modules
but compilers aren't fully integrated with the build process
yet.
:mod:`spack.architecture`
:func:`architecture.sys_type <spack.architecture.sys_type>` is used
to determine the host architecture while building.
.. warning::
Not yet implemented. Should eventually have architecture
descriptions for cross-compiling.
^^^^^^^^^^^^^^^^^
Build environment
^^^^^^^^^^^^^^^^^
@@ -232,7 +217,7 @@ Build environment
:mod:`spack.stage`
Handles creating temporary directories for builds.
:mod:`spack.compilation`
:mod:`spack.build_environment`
This contains utility functions used by the compiler wrapper script,
``cc``.
@@ -257,22 +242,19 @@ Unit tests
Implements Spack's test suite. Add a module and put its name in
the test suite in ``__init__.py`` to add more unit tests.
:mod:`spack.test.mock_packages`
This is a fake package hierarchy used to mock up packages for
Spack's test suite.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Research and Monitoring Modules
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
:mod:`spack.monitor`
Contains :class:`SpackMonitor <spack.monitor.SpackMonitor>`. This is accessed
from the ``spack install`` and ``spack analyze`` commands to send build
and package metadada up to a `Spack Monitor <https://github.com/spack/spack-monitor>`_ server.
Contains :class:`~spack.monitor.SpackMonitorClient`. This is accessed from
the ``spack install`` and ``spack analyze`` commands to send build and
package metadata up to a `Spack Monitor
<https://github.com/spack/spack-monitor>`_ server.
:mod:`spack.analyzers`
A module folder with a :class:`AnalyzerBase <spack.analyzers.analyzer_base.AnalyzerBase>`
A module folder with a :class:`~spack.analyzers.analyzer_base.AnalyzerBase`
that provides base functions to run, save, and (optionally) upload analysis
results to a `Spack Monitor <https://github.com/spack/spack-monitor>`_ server.
@@ -286,7 +268,7 @@ Other Modules
tarball URLs.
:mod:`spack.error`
:class:`SpackError <spack.error.SpackError>`, the base class for
:class:`~spack.error.SpackError`, the base class for
Spack's exception hierarchy.
:mod:`llnl.util.tty`
@@ -335,8 +317,8 @@ Writing analyzers
To write an analyzer, you should add a new python file to the
analyzers module directory at ``lib/spack/spack/analyzers`` .
Your analyzer should be a subclass of the :class:`AnalyzerBase <spack.analyzers.analyzer_base.AnalyzerBase>`. For example, if you want
to add an analyzer class ``Myanalyzer`` you woul write to
``spack/analyzers/myanalyzer.py`` and import and
to add an analyzer class ``Myanalyzer`` you would write to
``spack/analyzers/myanalyzer.py`` and import and
use the base as follows:
.. code-block:: python
@@ -347,7 +329,7 @@ use the base as follows:
Note that the class name is your module file name, all lowercase
except for the first capital letter. You can look at other analyzers in
except for the first capital letter. You can look at other analyzers in
that analyzer directory for examples. The guide here will tell you about the basic functions needed.
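A minimal sketch of such a subclass might look like the following; the module path and the
empty result are illustrative only, so check the existing analyzers for the full required
interface:
.. code-block:: python
   # Hypothetical file: lib/spack/spack/analyzers/myanalyzer.py
   from spack.analyzers.analyzer_base import AnalyzerBase


   class Myanalyzer(AnalyzerBase):
       """Example analyzer; the class name matches the module file name."""

       def run(self):
           # Results are returned keyed by the analyzer name (see below).
           return {"myanalyzer": []}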
^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -356,13 +338,13 @@ Analyzer Output Directory
By default, when you run ``spack analyze run`` an analyzer output directory will
be created in your spack user directory in your ``$HOME``. The reason we output here
is because the install directory might not always be writable.
is because the install directory might not always be writable.
.. code-block:: console
~/.spack/
analyzers
Result files will be written here, organized in subfolders in the same structure
as the package, with each analyzer owning its own subfolder. For example:
@@ -380,11 +362,11 @@ as the package, with each analyzer owning it's own subfolder. for example:
│   └── spack-analyzer-install-files.json
└── libabigail
└── lib
└── spack-analyzer-libabigail-libz.so.1.2.11.xml
└── spack-analyzer-libabigail-libz.so.1.2.11.xml
Notice that for the libabigail analyzer, since results are generated per object,
we honor the object's folder in case there are equivalently named files in
we honor the object's folder in case there are equivalently named files in
different folders. The result files are typically written as json so they can be easily read and uploaded in a future interaction with a monitor.
@@ -426,7 +408,7 @@ and then return the object with a key as the analyzer name. The result data
should be a list of objects, each with a name, ``analyzer_name``, ``install_file``,
and one of ``value`` or ``binary_value``. The install file should be for a relative
path, and not the absolute path. For example, let's say we extract a metric called
``metric`` for ``bin/wget`` using our analyzer ``thebest-analyzer``.
``metric`` for ``bin/wget`` using our analyzer ``thebest-analyzer``.
We might have data that looks like this:
.. code-block:: python
@@ -482,7 +464,7 @@ Saving Analyzer Results
The analyzer will have ``save_result`` called, with the result object generated
to save it to the filesystem, and if the user has added the ``--monitor`` flag
to upload it to a monitor server. If your result follows an accepted result
format and you don't need to parse it further, you don't need to add this
format and you don't need to parse it further, you don't need to add this
function to your class. However, if your result data is large or otherwise
needs additional parsing, you can define it. If you define the function, it
is useful to know about the ``output_dir`` property, which you can join
@@ -548,7 +530,7 @@ each one (separately) to the monitor:
Notice that this function, if you define it, requires a result object (generated by
``run()``, a monitor (if you want to send), and a boolean ``overwrite`` to be used
to check if a result exists first, and not write to it if the result exists and
to check if a result exists first, and not write to it if the result exists and
overwrite is False. Also notice that since we already saved these files to the analyzer metadata folder, we return early if a monitor isn't defined, because this function serves to send results to the monitor. If you haven't saved anything to the analyzer metadata folder
yet, you might want to do that here. You should also use ``tty.info`` to give
the user a message of "Writing result to $DIRNAME."
@@ -616,7 +598,7 @@ types of hooks in the ``__init__.py``, and then python files in that folder
can use hook functions. The files are automatically parsed, so if you write
a new file for some integration (e.g., ``lib/spack/spack/hooks/myintegration.py``
you can then write hook functions in that file that will be automatically detected,
and run whenever your hook is called. This section will cover the basic kind
and run whenever your hook is called. This section will cover the basic kind
of hooks, and how to write them.
^^^^^^^^^^^^^^
@@ -624,7 +606,7 @@ Types of Hooks
^^^^^^^^^^^^^^
The following hooks are currently implemented to make it easy for you,
the developer, to add hooks at different stages of a spack install or similar.
the developer, to add hooks at different stages of a spack install or similar.
If there is a hook that you would like and is missing, you can propose to add a new one.
"""""""""""""""""""""
@@ -632,9 +614,9 @@ If there is a hook that you would like and is missing, you can propose to add a
"""""""""""""""""""""
A ``pre_install`` hook is run within an install subprocess, directly before
the install starts. It expects a single argument of a spec, and is run in
a multiprocessing subprocess. Note that if you see ``pre_install`` functions associated with packages these are not hooks
as we have defined them here, but rather callback functions associated with
a package install.
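For instance, a minimal ``pre_install`` hook, placed in a file under
``lib/spack/spack/hooks/`` (the file name and message below are hypothetical),
might look like:

.. code-block:: python

   # lib/spack/spack/hooks/myintegration.py  (hypothetical integration file)
   import llnl.util.tty as tty


   def pre_install(spec):
       """Runs in the install subprocess, directly before the install starts."""
       tty.info("pre_install: preparing to install %s" % spec.name)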
@@ -657,7 +639,7 @@ here.
This hook is run at the beginning of ``lib/spack/spack/installer.py``,
in the install function of a ``PackageInstaller``,
and importantly is not part of a build process, but before it. This is when
we have just newly grabbed the task, and are preparing to install. If you
write a hook of this type, you should provide the spec to it.
.. code-block:: python
@@ -666,7 +648,7 @@ write a hook of this type, you should provide the spec to it.
"""On start of an install, we want to...
"""
print('on_install_start')
""""""""""""""""""""""""""""
``on_install_success(spec)``
@@ -744,8 +726,8 @@ to trigger after anything is written to a logger. You would add it as follows:
post_install = HookRunner('post_install')
# hooks related to logging
post_log_write = HookRunner('post_log_write') # <- here is my new hook!
You then need to decide what arguments your hook would expect. Since this is
related to logging, let's say that you want a message and level. That means
@@ -775,7 +757,7 @@ In this example, we use it outside of a logger that is already defined:
This is not to say that this would be the best way to implement an integration
with the logger (you'd probably want to write a custom logger, or you could
have the hook defined within the logger) but serves as an example of writing a hook.
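As a hedged sketch of both halves, assuming the ``message`` and ``level``
arguments discussed above (the threshold and the call site are illustrative):

.. code-block:: python

   # In a hooks file, e.g. lib/spack/spack/hooks/my_logging.py (hypothetical):
   def post_log_write(message, level):
       """Runs whenever the hook is triggered after a log write."""
       if level <= 2:
           print("Logged: %s" % message)

   # Wherever the logging happens, trigger the new hook after writing:
   import spack.hooks
   spack.hooks.post_log_write("a line of build output", 1)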
----------
Unit tests
@@ -785,6 +767,38 @@ Unit tests
Unit testing
------------
---------------------
Developer environment
---------------------
.. warning::
This is an experimental feature. It is expected to change and you should
not use it in a production environment.
When installing a package, we currently have support to export environment
variables to specify adding debug flags to the build. By default, a package
install will build without any debug flag. However, if you want to add them,
you can export:
.. code-block:: console
export SPACK_ADD_DEBUG_FLAGS=true
spack install zlib
If you want to add custom flags, you should export an additional variable:
.. code-block:: console
export SPACK_ADD_DEBUG_FLAGS=true
export SPACK_DEBUG_FLAGS="-g"
spack install zlib
These environment variables will eventually be integrated into spack so
they are set from the command line.
------------------
Developer commands
------------------
@@ -795,6 +809,29 @@ Developer commands
``spack doc``
^^^^^^^^^^^^^
.. _cmd-spack-style:
^^^^^^^^^^^^^^^
``spack style``
^^^^^^^^^^^^^^^
``spack style`` exists to help developers check imports and style with
mypy, flake8, isort, and (soon) black. To run all style checks, simply do:
.. code-block:: console
$ spack style
To run automatic fixes for isort you can do:
.. code-block:: console
$ spack style --fix
You do not need any of these Python packages installed on your system for
the checks to work! Spack will bootstrap install them from packages for
your use.
^^^^^^^^^^^^^^^^^^^
``spack unit-test``
^^^^^^^^^^^^^^^^^^^
@@ -873,7 +910,7 @@ just like you would with the normal ``python`` command.
^^^^^^^^^^^^^^^
Spack blame is a way to quickly see contributors to packages or files
in the spack repository. You should provide a target package name or
file name to the command. Here is an example asking to see contributions
for the package "python":
@@ -883,8 +920,8 @@ for the package "python":
LAST_COMMIT LINES % AUTHOR EMAIL
2 weeks ago 3 0.3 Mickey Mouse <cheddar@gmouse.org>
a month ago 927 99.7 Minnie Mouse <swiss@mouse.org>
2 weeks ago 930 100.0
By default, you will get a table view (shown above) sorted by date of contribution,
@@ -1255,7 +1292,7 @@ Publishing a release on GitHub
#. Create the release in GitHub.
* Go to
`github.com/spack/spack/releases <https://github.com/spack/spack/releases>`_
and click ``Draft a new release``.

View File

@@ -732,13 +732,17 @@ Configuring environment views
The Spack Environment manifest file has a top-level keyword
``view``. Each entry under that heading is a view descriptor, headed
by a name. The view descriptor contains the root of the view, and
optionally the projections for the view, and ``select`` and
``exclude`` lists for the view. For example, in the following manifest
optionally the projections for the view, ``select`` and
``exclude`` lists for the view and link information via ``link`` and
``link_type``. For example, in the following manifest
file snippet we define a view named ``mpis``, rooted at
``/path/to/view`` in which all projections use the package name,
version, and compiler name to determine the path for a given
package. This view selects all packages that depend on MPI, and
excludes those built with the PGI compiler at version 18.5.
All the dependencies of each root spec in the environment will be linked
in the view due to the command ``link: all`` and the files in the view will
be symlinks to the spack install directories.
.. code-block:: yaml
@@ -751,11 +755,16 @@ excludes those built with the PGI compiler at version 18.5.
exclude: ['%pgi@18.5']
projections:
all: {name}/{version}-{compiler.name}
link: all
link_type: symlink
For more information on using view projections, see the section on
:ref:`adding_projections_to_views`. The default for the ``select`` and
``exclude`` values is to select everything and exclude nothing. The
default projection is the default view projection (``{}``).
default projection is the default view projection (``{}``). The ``link``
defaults to ``all`` but can also be ``roots`` when only the root specs
in the environment are desired in the view. The ``link_type`` defaults
to ``symlink`` but can also take the value of ``hardlink`` or ``copy``.
Any number of views may be defined under the ``view`` heading in a
Spack Environment.

View File

@@ -9,21 +9,16 @@
Getting Started
===============
-------------
Prerequisites
-------------
--------------------
System Prerequisites
--------------------
Spack has the following minimum requirements, which must be installed
before Spack is run:
Spack has the following minimum system requirements, which are assumed to
be present on the machine where Spack is run:
#. Python 2 (2.6 or 2.7) or 3 (3.5 - 3.9) to run Spack
#. A C/C++ compiler for building
#. The ``make`` executable for building
#. The ``tar``, ``gzip``, ``unzip``, ``bzip2``, ``xz`` and optionally ``zstd``
executables for extracting source code
#. The ``patch`` command to apply patches
#. The ``git`` and ``curl`` commands for fetching
#. If using the ``gpg`` subcommand, ``gnupg2`` is required
.. csv-table:: System prerequisites for Spack
:file: tables/system_prerequisites.csv
:header-rows: 1
These requirements can be easily installed on most modern Linux systems;
on macOS, XCode is required. Spack is designed to run on HPC
@@ -40,7 +35,7 @@ Getting Spack is easy. You can clone it from the `github repository
.. code-block:: console
$ git clone https://github.com/spack/spack.git
$ git clone -c feature.manyFiles=true https://github.com/spack/spack.git
This will create a directory called ``spack``.
@@ -89,6 +84,140 @@ sourcing time, ensuring future invocations of the ``spack`` command will
continue to use the same consistent python version regardless of changes in
the environment.
^^^^^^^^^^^^^^^^^^^^
Bootstrapping clingo
^^^^^^^^^^^^^^^^^^^^
Spack uses ``clingo`` under the hood to resolve optimal versions and variants of
dependencies when installing a package. Since ``clingo`` itself is a binary,
Spack has to install it on initial use, which is called bootstrapping.
Spack provides two ways of bootstrapping ``clingo``: from pre-built binaries
(default), or from sources. The fastest way to get started is to bootstrap from
pre-built binaries.
.. note::
When bootstrapping from pre-built binaries, Spack currently requires
``patchelf`` on Linux and ``otool`` on macOS. If ``patchelf`` is not in the
``PATH``, Spack will build it from sources, and a C++ compiler is required.
The first time you concretize a spec, Spack will bootstrap in the background:
.. code-block:: console
$ time spack spec zlib
Input spec
--------------------------------
zlib
Concretized
--------------------------------
zlib@1.2.11%gcc@7.5.0+optimize+pic+shared arch=linux-ubuntu18.04-zen
real 0m20.023s
user 0m18.351s
sys 0m0.784s
After this command you'll see that ``clingo`` has been installed for Spack's own use:
.. code-block:: console
$ spack find -b
==> Showing internal bootstrap store at "/root/.spack/bootstrap/store"
==> 3 installed packages
-- linux-rhel5-x86_64 / gcc@9.3.0 -------------------------------
clingo-bootstrap@spack python@3.6
-- linux-ubuntu18.04-zen / gcc@7.5.0 ----------------------------
patchelf@0.13
Subsequent calls to the concretizer will then be much faster:
.. code-block:: console
$ time spack spec zlib
[ ... ]
real 0m0.490s
user 0m0.431s
sys 0m0.041s
If for security concerns you cannot bootstrap ``clingo`` from pre-built
binaries, you have to mark this bootstrapping method as untrusted. This makes
Spack fall back to bootstrapping from sources:
.. code-block:: console
$ spack bootstrap untrust github-actions
==> "github-actions" is now untrusted and will not be used for bootstrapping
You can verify that the new settings are effective with:
.. code-block:: console
$ spack bootstrap list
Name: github-actions UNTRUSTED
Type: buildcache
Info:
url: https://mirror.spack.io/bootstrap/github-actions/v0.1
homepage: https://github.com/alalazo/spack-bootstrap-mirrors
releases: https://github.com/alalazo/spack-bootstrap-mirrors/releases
Description:
Buildcache generated from a public workflow using Github Actions.
The sha256 checksum of binaries is checked before installation.
Name: spack-install TRUSTED
Type: install
Description:
Specs built from sources by Spack. May take a long time.
.. note::
When bootstrapping from sources, Spack requires a full install of Python
including header files (e.g. ``python3-dev`` on Debian), and a compiler
with support for C++14 (GCC on Linux, Apple Clang on macOS) and static C++
standard libraries on Linux.
Spack will build the required software on the first request to concretize a spec:
.. code-block:: console
$ spack spec zlib
[+] /usr (external bison-3.0.4-wu5pgjchxzemk5ya2l3ddqug2d7jv6eb)
[+] /usr (external cmake-3.19.4-a4kmcfzxxy45mzku4ipmj5kdiiz5a57b)
[+] /usr (external python-3.6.9-x4fou4iqqlh5ydwddx3pvfcwznfrqztv)
==> Installing re2c-1.2.1-e3x6nxtk3ahgd63ykgy44mpuva6jhtdt
[ ... ]
zlib@1.2.11%gcc@10.1.0+optimize+pic+shared arch=linux-ubuntu18.04-broadwell
"""""""""""""""""""
The Bootstrap Store
"""""""""""""""""""
All the tools Spack needs for its own functioning are installed in a separate store, which lives
under the ``${HOME}/.spack`` directory. The software installed there can be queried with:
.. code-block:: console
$ spack find --bootstrap
==> Showing internal bootstrap store at "/home/spack/.spack/bootstrap/store"
==> 3 installed packages
-- linux-ubuntu18.04-x86_64 / gcc@10.1.0 ------------------------
clingo-bootstrap@spack python@3.6.9 re2c@1.2.1
In case it's needed, the bootstrap store can also be cleaned with:
.. code-block:: console
$ spack clean -b
==> Removing software in "/home/spack/.spack/bootstrap/store"
^^^^^^^^^^^^^^^^^^
Check Installation
@@ -117,53 +246,6 @@ environment*, especially for ``PATH``. Only software that comes with
the system, or that you know you wish to use with Spack, should be
included. This procedure will avoid many strange build errors.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Optional: Bootstrapping clingo
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Spack supports using clingo as an external solver to compute which software
needs to be installed. If you have a default compiler supporting C++14 Spack
can automatically bootstrap this tool from sources the first time it is
needed:
.. code-block:: console
$ spack solve zlib
[+] /usr (external bison-3.0.4-wu5pgjchxzemk5ya2l3ddqug2d7jv6eb)
[+] /usr (external cmake-3.19.4-a4kmcfzxxy45mzku4ipmj5kdiiz5a57b)
[+] /usr (external python-3.6.9-x4fou4iqqlh5ydwddx3pvfcwznfrqztv)
==> Installing re2c-1.2.1-e3x6nxtk3ahgd63ykgy44mpuva6jhtdt
[ ... ]
==> Optimization: [0, 0, 0, 0, 0, 1, 0, 0, 0]
zlib@1.2.11%gcc@10.1.0+optimize+pic+shared arch=linux-ubuntu18.04-broadwell
If you want to speed up bootstrapping, you may try to search for ``cmake`` and ``bison``
on your system:
.. code-block:: console
$ spack external find cmake bison
==> The following specs have been detected on this system and added to /home/spack/.spack/packages.yaml
bison@3.0.4 cmake@3.19.4
All the tools Spack needs for its own functioning are installed in a separate store, which lives
under the ``${HOME}/.spack`` directory. The software installed there can be queried with:
.. code-block:: console
$ spack find --bootstrap
==> Showing internal bootstrap store at "/home/spack/.spack/bootstrap/store"
==> 3 installed packages
-- linux-ubuntu18.04-x86_64 / gcc@10.1.0 ------------------------
clingo-bootstrap@spack python@3.6.9 re2c@1.2.1
In case it's needed, the bootstrap store can also be cleaned with:
.. code-block:: console
$ spack clean -b
==> Removing software in "/home/spack/.spack/bootstrap/store"
^^^^^^^^^^^^^^^^^^^^^^^^^^
Optional: Alternate Prefix
^^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -367,6 +449,34 @@ then inject those flags into the compiler command. Compiler flags
entered from the command line will be discussed in more detail in the
following section.
Some compilers also require additional environment configuration.
Examples include Intel's oneAPI and AMD's AOCC compiler suites,
which have custom scripts for loading environment variables and setting paths.
These variables should be specified in the ``environment`` section of the compiler
specification. The operations available to modify the environment are ``set``, ``unset``,
``prepend_path``, ``append_path``, and ``remove_path``. For example:
.. code-block:: yaml
compilers:
- compiler:
modules: []
operating_system: centos6
paths:
cc: /opt/intel/oneapi/compiler/latest/linux/bin/icx
cxx: /opt/intel/oneapi/compiler/latest/linux/bin/icpx
f77: /opt/intel/oneapi/compiler/latest/linux/bin/ifx
fc: /opt/intel/oneapi/compiler/latest/linux/bin/ifx
spec: oneapi@latest
environment:
set:
MKL_ROOT: "/path/to/mkl/root"
unset: # A list of environment variables to unset
- CC
prepend_path: # Similar for append|remove_path
LD_LIBRARY_PATH: /ld/paths/added/by/setvars/sh
^^^^^^^^^^^^^^^^^^^^^^^
Build Your Own Compiler
^^^^^^^^^^^^^^^^^^^^^^^
@@ -521,8 +631,9 @@ Fortran.
#. Run ``spack compiler find`` to locate Clang.
#. There are different ways to get ``gfortran`` on macOS. For example, you can
install GCC with Spack (``spack install gcc``) or with Homebrew
(``brew install gcc``).
install GCC with Spack (``spack install gcc``), with Homebrew (``brew install
gcc``), or from a `DMG installer
<https://github.com/fxcoudert/gfortran-for-macOS/releases>`_.
#. The only thing left to do is to edit ``~/.spack/darwin/compilers.yaml`` to provide
the path to ``gfortran``:
@@ -543,7 +654,8 @@ Fortran.
If you used Spack to install GCC, you can get the installation prefix by
``spack location -i gcc`` (this will only work if you have a single version
of GCC installed). Whereas for Homebrew, GCC is installed in
``/usr/local/Cellar/gcc/x.y.z``.
``/usr/local/Cellar/gcc/x.y.z``. With the DMG installer, the correct path
will be ``/usr/local/gfortran``.
^^^^^^^^^^^^^^^^^^^^^
Compiler Verification
@@ -777,7 +889,7 @@ an OpenMPI installed in /opt/local, one would use:
buildable: False
In general, Spack is easier to use and more reliable if it builds all of
its own dependencies. However, there are two packages for which one
its own dependencies. However, there are several packages for which one
commonly needs to use system versions:
^^^

View File

@@ -39,7 +39,7 @@ package:
.. code-block:: console
$ git clone https://github.com/spack/spack.git
$ git clone -c feature.manyFiles=true https://github.com/spack/spack.git
$ cd spack/bin
$ ./spack install libelf

View File

@@ -213,6 +213,18 @@ location). The set ``my_custom_lmod_modules`` will install its lmod
modules to ``/path/to/install/custom/lmod/modules`` (and still install
its tcl modules, if any, to the default location).
By default, an architecture-specific directory is added to the root
directory. A module set may override that behavior by setting the
``arch_folder`` config value to ``False``.
.. code-block:: yaml
modules:
default:
roots:
tcl: /path/to/install/tcl/modules
arch_folder: false
Obviously, having multiple module sets install modules to the default
location could be confusing to users of your modules. In the next
section, we will discuss enabling and disabling module types (module
@@ -449,6 +461,36 @@ that are already in the LMod hierarchy.
For hierarchies that are deeper than three layers ``lmod spider`` may have some issues.
See `this discussion on the LMod project <https://github.com/TACC/Lmod/issues/114>`_.
""""""""""""""""""""""
Select default modules
""""""""""""""""""""""
By default, when multiple modules of the same name share a directory,
the highest version number will be the default module. This behavior
of the ``module`` command can be overridden with a symlink named
``default`` to the desired default module. If you wish to configure
default modules with Spack, add a ``defaults`` key to your modules
configuration:
.. code-block:: yaml
modules:
my-module-set:
tcl:
defaults:
- gcc@10.2.1
- hdf5@1.2.10+mpi+hl%gcc
These defaults may be arbitrarily specific. For any package that
satisfies a default, Spack will generate the module file in the
appropriate path, and will generate a default symlink to the module
file as well.
.. warning::
If Spack is configured to generate multiple default packages in the
same directory, the last modulefile to be generated will be the
default module.
.. _customize-env-modifications:
"""""""""""""""""""""""""""""""""""

View File

@@ -612,6 +612,7 @@ it executable, then runs it with some arguments.
installer = Executable(self.stage.archive_file)
installer('--prefix=%s' % prefix, 'arg1', 'arg2', 'etc.')
.. _deprecate:
^^^^^^^^^^^^^^^^^^^^^^^^
Deprecating old versions
@@ -2102,7 +2103,7 @@ correct way to specify this would be:
.. code-block:: python
depends_on('python@2.6.0:2.6.999')
depends_on('python@2.6.0:2.6')
A spec can contain multiple version ranges separated by commas.
For example, if you need Boost 1.59.0 or newer, but there are known
@@ -2823,7 +2824,7 @@ is equivalent to:
depends_on('elpa+openmp', when='+openmp+elpa')
Constraints from nested context managers are also added together, but they are rarely
Constraints from nested context managers are also combined together, but they are rarely
needed or recommended.
.. _install-method:
@@ -2884,52 +2885,52 @@ The package base class, usually specialized for a given build system, determines
actual set of entities available for overriding.
The classes that are currently provided by Spack are:
+-------------------------------+----------------------------------+
| **Base Class** | **Purpose** |
+===============================+==================================+
| :py:class:`.Package` | General base class not |
| | specialized for any build system |
+-------------------------------+----------------------------------+
| :py:class:`.MakefilePackage` | Specialized class for packages |
| | built invoking |
| | hand-written Makefiles |
+-------------------------------+----------------------------------+
| :py:class:`.AutotoolsPackage` | Specialized class for packages |
| | built using GNU Autotools |
+-------------------------------+----------------------------------+
| :py:class:`.CMakePackage` | Specialized class for packages |
| | built using CMake |
+-------------------------------+----------------------------------+
| :py:class:`.CudaPackage` | A helper class for packages that |
| | use CUDA |
+-------------------------------+----------------------------------+
| :py:class:`.QMakePackage` | Specialized class for packages |
| | build using QMake |
+-------------------------------+----------------------------------+
| :py:class:`.ROCmPackage` | A helper class for packages that |
| | use ROCm |
+-------------------------------+----------------------------------+
| :py:class:`.SConsPackage` | Specialized class for packages |
| | built using SCons |
+-------------------------------+----------------------------------+
| :py:class:`.WafPackage` | Specialized class for packages |
| | built using Waf |
+-------------------------------+----------------------------------+
| :py:class:`.RPackage` | Specialized class for |
| | :py:class:`.R` extensions |
+-------------------------------+----------------------------------+
| :py:class:`.OctavePackage` | Specialized class for |
| | :py:class:`.Octave` packages |
+-------------------------------+----------------------------------+
| :py:class:`.PythonPackage` | Specialized class for |
| | :py:class:`.Python` extensions |
+-------------------------------+----------------------------------+
| :py:class:`.PerlPackage` | Specialized class for |
| | :py:class:`.Perl` extensions |
+-------------------------------+----------------------------------+
| :py:class:`.IntelPackage` | Specialized class for licensed |
| | Intel software |
+-------------------------------+----------------------------------+
+----------------------------------------------------------+----------------------------------+
| **Base Class** | **Purpose** |
+==========================================================+==================================+
| :class:`~spack.package.Package` | General base class not |
| | specialized for any build system |
+----------------------------------------------------------+----------------------------------+
| :class:`~spack.build_systems.makefile.MakefilePackage` | Specialized class for packages |
| | built invoking |
| | hand-written Makefiles |
+----------------------------------------------------------+----------------------------------+
| :class:`~spack.build_systems.autotools.AutotoolsPackage` | Specialized class for packages |
| | built using GNU Autotools |
+----------------------------------------------------------+----------------------------------+
| :class:`~spack.build_systems.cmake.CMakePackage` | Specialized class for packages |
| | built using CMake |
+----------------------------------------------------------+----------------------------------+
| :class:`~spack.build_systems.cuda.CudaPackage` | A helper class for packages that |
| | use CUDA |
+----------------------------------------------------------+----------------------------------+
| :class:`~spack.build_systems.qmake.QMakePackage` | Specialized class for packages |
| | built using QMake |
+----------------------------------------------------------+----------------------------------+
| :class:`~spack.build_systems.rocm.ROCmPackage` | A helper class for packages that |
| | use ROCm |
+----------------------------------------------------------+----------------------------------+
| :class:`~spack.build_systems.scons.SConsPackage` | Specialized class for packages |
| | built using SCons |
+----------------------------------------------------------+----------------------------------+
| :class:`~spack.build_systems.waf.WafPackage` | Specialized class for packages |
| | built using Waf |
+----------------------------------------------------------+----------------------------------+
| :class:`~spack.build_systems.r.RPackage` | Specialized class for |
| | R extensions |
+----------------------------------------------------------+----------------------------------+
| :class:`~spack.build_systems.octave.OctavePackage` | Specialized class for |
| | Octave packages |
+----------------------------------------------------------+----------------------------------+
| :class:`~spack.build_systems.python.PythonPackage` | Specialized class for |
| | Python extensions |
+----------------------------------------------------------+----------------------------------+
| :class:`~spack.build_systems.perl.PerlPackage` | Specialized class for |
| | Perl extensions |
+----------------------------------------------------------+----------------------------------+
| :class:`~spack.build_systems.intel.IntelPackage` | Specialized class for licensed |
| | Intel software |
+----------------------------------------------------------+----------------------------------+
.. note::
@@ -2939,7 +2940,7 @@ The classes that are currently provided by Spack are:
rare cases where manual intervention is needed we need to stress that a
package base class depends on the *build system* being used, not the language of the package.
For example, a Python extension installed with CMake would ``extends('python')`` and
subclass from :py:class:`.CMakePackage`.
subclass from :class:`~spack.build_systems.cmake.CMakePackage`.
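As a hedged sketch of that combination (the package below is entirely made up):

.. code-block:: python

   class PyFoo(CMakePackage):
       """Hypothetical Python extension whose build system is CMake."""

       extends('python')  # installs modules into python's module hierarchy

       # versions, dependencies, and cmake_args() follow as for any other
       # CMakePackage; only the base class and ``extends`` differ here.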
^^^^^^^^^^^^^^^^^^^^^
Installation pipeline
@@ -4079,7 +4080,7 @@ prefix **before** ``make install``. Builds like this can falsely report
success when an error occurs before the installation is complete. Simple
sanity checks can be used to identify files and or directories that are
required of a successful installation. Spack checks for the presence of
the files and directories after ``install()`` runs.
If any of the listed files or directories are missing, then the build will
fail and the install prefix will be removed. If they all exist, then Spack
@@ -4193,7 +4194,7 @@ need to use two decorators for each phase test method:
The first decorator tells Spack when in the installation process to
run your test method; namely *after* the provided
installation phase. The second decorator tells Spack to only run the
checks when the ``--test`` option is provided on the command line.
.. note::
@@ -4267,17 +4268,17 @@ tests can be performed days, even weeks, after the software is installed.
Stand-alone tests are checks that should run relatively quickly -- as
in on the order of at most a few minutes -- and ideally execute all
aspects of the installed software, or at least key functionality.
.. note::
Execution speed is important because these tests are intended
to quickly assess whether the installed software works on the
system.
Failing stand-alone tests indicate that there is no reason to
proceed with more resource-intensive tests.
Passing stand-alone (or smoke) tests can lead to more thorough
testing, such as extensive unit or regression tests, or tests
that run at scale. Spack support for more thorough testing is
@@ -4307,7 +4308,7 @@ file such that:
test_stage: /path/to/stage
The package can access this path **during test processing** using
`self.test_suite.stage`.
.. note::
@@ -4367,9 +4368,9 @@ The signature for ``cache_extra_test_sources`` is:
where ``srcs`` is a string or a list of strings corresponding to
the paths for the files and/or subdirectories, relative to the staged
source, that are to be copied to the corresponding path relative to
``self.install_test_root``. All of the contents within each subdirectory
will be also be copied.
source, that are to be copied to the corresponding relative test path
under the prefix. All of the contents within each subdirectory will
also be copied.
For example, a package method for copying everything in the ``tests``
subdirectory plus the ``foo.c`` and ``bar.c`` files from ``examples``
@@ -4377,8 +4378,13 @@ can be implemented as shown below.
.. note::
The ``run_after`` directive ensures associated files are copied
**after** the package is installed by the build process.
The method name ``copy_test_sources`` here is for illustration
purposes. You are free to use a name that is more suited to your
package.
The key to copying the files at build time for stand-alone testing
is use of the ``run_after`` directive, which ensures the associated
files are copied **after** the provided build stage.
.. code-block:: python
@@ -4388,25 +4394,20 @@ can be implemented as shown below.
@run_after('install')
def copy_test_sources(self):
srcs = ['tests',
join_path('examples', 'foo.c'),
join_path('examples', 'bar.c')]
self.cache_extra_test_sources(srcs)
In this case, the method copies the associated files from the build
stage **after** the software is installed to the package's metadata
directory. The result is the directory and files will be cached in
paths under ``self.install_test_root`` as follows:
* ``join_path(self.install_test_root, 'tests')`` along with its files
and subdirectories
* ``join_path(self.install_test_root, 'examples', 'foo.c')``
* ``join_path(self.install_test_root, 'examples', 'bar.c')``
a special test subdirectory under the installation prefix.
These paths are **automatically copied** to the test stage directory
where they are available to the package's ``test`` method through the
``self.test_suite.current_test_cache_dir`` property. In our example,
the method can access the directory and files using the following
paths:
during stand-alone testing. The package's ``test`` method can access
them using the ``self.test_suite.current_test_cache_dir`` property.
In our example, the method would use the following paths to reference
the copy of each entry listed in ``srcs``, respectively:
* ``join_path(self.test_suite.current_test_cache_dir, 'tests')``
* ``join_path(self.test_suite.current_test_cache_dir, 'examples', 'foo.c')``
@@ -4414,9 +4415,8 @@ paths:
.. note::
Library developers will want to build the associated tests under
the ``self.test_suite.current_test_cache_dir`` and against their
**installed** libraries before running them.
Library developers will want to build the associated tests
against their **installed** libraries before running them.
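A hedged sketch of what that can look like, assuming the usual helpers
available in ``package.py`` (``which``, ``working_dir``, ``join_path``) and an
entirely made-up library name and example file:

.. code-block:: python

   def test(self):
       # Cached build-time files are staged under current_test_cache_dir.
       examples = join_path(self.test_suite.current_test_cache_dir, 'examples')

       with working_dir(examples):
           # Build foo.c against the *installed* library, then run it.
           cc = which('cc')
           cc('-I%s' % self.prefix.include,
              '-L%s' % self.prefix.lib, '-lfoo',
              'foo.c', '-o', 'foo')
           self.run_test('foo', purpose='run the compiled example')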
.. note::
@@ -4426,11 +4426,6 @@ paths:
would be appropriate for ensuring the installed software continues
to work as the underlying system evolves.
.. note::
You are free to use a method name that is more suitable for
your package.
.. _cache_custom_files:
"""""""""""""""""""
@@ -4446,7 +4441,7 @@ Examples include:
- expected test output
These extra files should be added to the ``test`` subdirectory of the
package in the Spack repository.
Spack will **automatically copy** the contents of that directory to the
test staging directory for stand-alone testing. The ``test`` method can
@@ -4471,7 +4466,7 @@ The signature for ``get_escaped_text_output`` is:
where ``filename`` is the path to the file containing the expected output.
The ``filename`` for a :ref:`custom file <cache_custom_files>` can be
accessed and used as illustrated by a simplified version of an ``sqlite``
package check:
@@ -4509,7 +4504,8 @@ can retrieve the expected output from ``examples/foo.out`` using:
def test(self):
..
filename = join_path(self.install_test_root, 'examples', 'foo.out')
filename = join_path(self.test_suite.current_test_cache_dir,
'examples', 'foo.out')
expected = get_escaped_text_output(filename)
..
@@ -4591,10 +4587,10 @@ where each argument has the following meaning:
Options are a list of strings to be passed to the executable when
it runs.
The default is ``[]``, which means no options are provided to the
executable.
* ``expected`` is an optional list of expected output strings.
Spack requires every string in ``expected`` to be a regex matching
@@ -4605,31 +4601,31 @@ where each argument has the following meaning:
The expected output can be :ref:`read from a file
<expected_test_output_from_file>`.
The default is ``expected=[]``, so Spack will not check the output.
* ``status`` is the optional expected return code(s).
A list of return codes corresponding to successful execution can
be provided (e.g., ``status=[0,3,7]``). Support for non-zero return
codes allows for basic **expected failure** tests as well as different
return codes across versions of the software.
The default is ``status=[0]``, which corresponds to **successful**
execution in the sense that the executable does not exit with a
failure code or raise an exception.
* ``installed`` is used to require ``exe`` to be within the package
prefix.
If ``True``, then the path for ``exe`` is required to be within the
package prefix; otherwise, the path is not constrained.
The default is ``False``, so the fully qualified path for ``exe``
does **not** need to be within the installation directory.
* ``purpose`` is an optional heading describing the test part.
Output from the test is written to a test log file so this argument
serves as a searchable heading in text logs to highlight the start
of the test part. Having a description can be helpful when debugging
@@ -4644,10 +4640,10 @@ where each argument has the following meaning:
The default is ``False``, which means the test executable must be
present for any installable version of the software.
* ``work_dir`` is the path to the directory from which the executable
will run.
The default of ``None`` corresponds to the current directory (``'.'``).
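Putting these arguments together, a hedged example of a single check using the
standard ``run_test`` helper (the executable, regex, and purpose string are
illustrative):

.. code-block:: python

   def test(self):
       self.run_test(
           'mpirun',                        # executable to run
           options=['--version'],           # arguments passed to it
           expected=[r'mpirun .*'],         # regexes the output must match
           status=[0],                      # acceptable return codes
           installed=True,                  # require mpirun from this prefix
           purpose='checking mpirun output',
           skip_missing=False,
           work_dir='.')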
"""""""""""""""""""""""""""""""""""""""""
@@ -4677,9 +4673,6 @@ directory paths are provided in the table below.
* - Test Suite Stage Files
- ``self.test_suite.stage``
- ``join_path(self.test_suite.stage, 'results.txt')``
* - Cached Build-time Files
- ``self.install_test_root``
- ``join_path(self.install_test_root, 'examples', 'foo.c')``
* - Staged Cached Build-time Files
- ``self.test_suite.current_test_cache_dir``
- ``join_path(self.test_suite.current_test_cache_dir, 'examples', 'foo.c')``
@@ -4754,7 +4747,7 @@ where only the outputs for the first of each set are shown:
Copyright (C) 2018 Free Software Foundation, Inc.
This is free software; see the source for copying conditions. There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
PASSED
...
==> [2021-04-26-17:35:20.493921] test: checking mpirun output
@@ -4915,7 +4908,7 @@ This is already part of the boilerplate for packages created with
Filtering functions
^^^^^^^^^^^^^^^^^^^
:py:func:`filter_file(regex, repl, *filenames, **kwargs) <spack.filter_file>`
:py:func:`filter_file(regex, repl, *filenames, **kwargs) <llnl.util.filesystem.filter_file>`
Works like ``sed`` but with Python regular expression syntax. Takes
a regular expression, a replacement, and a set of files. ``repl``
can be a raw string or a callable function. If it is a raw string,
@@ -4953,7 +4946,7 @@ Filtering functions
filter_file('CXX="c++"', 'CXX="%s"' % self.compiler.cxx,
prefix.bin.mpicxx)
:py:func:`change_sed_delimiter(old_delim, new_delim, *filenames) <spack.change_sed_delim>`
:py:func:`change_sed_delimiter(old_delim, new_delim, *filenames) <llnl.util.filesystem.change_sed_delimiter>`
Some packages, like TAU, have a build system that can't install
into directories with, e.g. '@' in the name, because they use
hard-coded ``sed`` commands in their build.
@@ -4975,14 +4968,14 @@ Filtering functions
File functions
^^^^^^^^^^^^^^
:py:func:`ancestor(dir, n=1) <spack.ancestor>`
:py:func:`ancestor(dir, n=1) <llnl.util.filesystem.ancestor>`
Get the n\ :sup:`th` ancestor of the directory ``dir``.
:py:func:`can_access(path) <spack.can_access>`
:py:func:`can_access(path) <llnl.util.filesystem.can_access>`
True if we can read and write to the file at ``path``. Same as
native python ``os.access(file_name, os.R_OK|os.W_OK)``.
:py:func:`install(src, dest) <spack.install>`
:py:func:`install(src, dest) <llnl.util.filesystem.install>`
Install a file to a particular location. For example, install a
header into the ``include`` directory under the install ``prefix``:
@@ -4990,14 +4983,14 @@ File functions
install('my-header.h', prefix.include)
:py:func:`join_path(*paths) <spack.join_path>`
:py:func:`join_path(*paths) <llnl.util.filesystem.join_path>`
An alias for ``os.path.join``. This joins paths using the OS path separator.
:py:func:`mkdirp(*paths) <spack.mkdirp>`
:py:func:`mkdirp(*paths) <llnl.util.filesystem.mkdirp>`
Create each of the directories in ``paths``, creating any parent
directories if they do not exist.
:py:func:`working_dir(dirname, kwargs) <spack.working_dir>`
:py:func:`working_dir(dirname, kwargs) <llnl.util.filesystem.working_dir>`
This is a Python `Context Manager
<https://docs.python.org/2/library/contextlib.html>`_ that makes it
easier to work with subdirectories in builds. You use this with the
@@ -5039,7 +5032,7 @@ File functions
The ``create=True`` keyword argument causes the command to create
the directory if it does not exist.
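A hedged illustration of the pattern (the subdirectory name is arbitrary):

.. code-block:: python

   def install(self, spec, prefix):
       # Configure and build in a separate subdirectory, creating it if needed.
       with working_dir('spack-build', create=True):
           configure('--prefix=%s' % prefix)
           make()
           make('install')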
:py:func:`touch(path) <spack.touch>`
:py:func:`touch(path) <llnl.util.filesystem.touch>`
Create an empty file at ``path``.
.. _make-package-findable:

View File

@@ -48,9 +48,9 @@ or Amazon Elastic Kubernetes Service (`EKS <https://aws.amazon.com/eks>`_), thou
topics are outside the scope of this document.
Spack's pipelines are now making use of the
`trigger <https://docs.gitlab.com/12.9/ee/ci/yaml/README.html#trigger>`_ syntax to run
`trigger <https://docs.gitlab.com/ee/ci/yaml/#trigger>`_ syntax to run
dynamically generated
`child pipelines <https://docs.gitlab.com/12.9/ee/ci/parent_child_pipelines.html>`_.
`child pipelines <https://docs.gitlab.com/ee/ci/pipelines/parent_child_pipelines.html>`_.
Note that the use of dynamic child pipelines requires running Gitlab version
``>= 12.9``.

View File

@@ -335,7 +335,7 @@ merged YAML from all configuration files, use ``spack config get repos``:
- ~/myrepo
- $spack/var/spack/repos/builtin
mNote that, unlike ``spack repo list``, this does not include the
Note that, unlike ``spack repo list``, this does not include the
namespace, which is read from each repo's ``repo.yaml``.
^^^^^^^^^^^^^^^^^^^^^

View File

@@ -1,7 +1,10 @@
# These dependencies should be installed using pip in order
# to build the documentation.
sphinx
sphinx>=3.4,!=4.1.2
sphinxcontrib-programoutput
sphinx-rtd-theme
python-levenshtein
# Restrict to docutils <0.17 to workaround a list rendering issue in sphinx.
# https://stackoverflow.com/questions/67542699
docutils <0.17

View File

@@ -8,12 +8,21 @@
# these commands in this directory to install Sphinx and its plugins,
# then build the docs:
#
# spack install
# spack env activate .
# spack install
# make
#
spack:
specs:
- py-sphinx
# Sphinx
- "py-sphinx@3.4:4.1.1,4.1.3:"
- py-sphinxcontrib-programoutput
- py-docutils@:0.16
- py-sphinx-rtd-theme
# VCS
- git
- mercurial
- subversion
# Plotting
- graphviz
concretization: together

View File

@@ -0,0 +1,18 @@
Name, Supported Versions, Notes, Requirement Reason
Python, 2.6/2.7/3.5-3.9, , Interpreter for Spack
C/C++ Compilers, , , Building software
make, , , Build software
patch, , , Build software
bash, , , Compiler wrappers
tar, , , Extract/create archives
gzip, , , Compress/Decompress archives
unzip, , , Compress/Decompress archives
bzip, , , Compress/Decompress archives
xz, , , Compress/Decompress archives
zstd, , Optional, Compress/Decompress archives
file, , , Create/Use Buildcaches
gnupg2, , , Sign/Verify Buildcaches
git, , , Manage Software Repositories
svn, , Optional, Manage Software Repositories
hg, , Optional, Manage Software Repositories
Python header files, , Optional (e.g. ``python3-dev`` on Debian), Bootstrapping from sources

View File

@@ -387,7 +387,7 @@ some nice features:
Spack-built compiler can be given to an IDE without requiring the
IDE to load that compiler's module.
Unfortunately, Spack's RPATH support does not work in all case. For example:
Unfortunately, Spack's RPATH support does not work in every case. For example:
#. Software comes in many forms --- not just compiled ELF binaries,
but also as interpreted code in Python, R, JVM bytecode, etc.

597
lib/spack/env/cc vendored
View File

@@ -1,4 +1,5 @@
#!/bin/bash
#!/bin/sh
# shellcheck disable=SC2034 # evals in this script fool shellcheck
#
# Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
@@ -20,25 +21,41 @@
# -Wl,-rpath arguments for dependency /lib directories.
#
# Reset IFS to the default: whitespace-separated lists. When we use
# other separators, we set and reset it.
unset IFS
# Separator for lists whose names end with `_list`.
# We pick the alarm bell character, which is highly unlikely to
# conflict with anything. This is a literal bell character (which
# we have to use since POSIX sh does not convert escape sequences
# like '\a' outside of the format argument of `printf`).
# NOTE: Depending on your editor this may look empty, but it is not.
readonly lsep=''
# This is an array of environment variables that need to be set before
# the script runs. They are set by routines in spack.build_environment
# as part of the package installation process.
parameters=(
SPACK_ENV_PATH
SPACK_DEBUG_LOG_DIR
SPACK_DEBUG_LOG_ID
SPACK_COMPILER_SPEC
SPACK_CC_RPATH_ARG
SPACK_CXX_RPATH_ARG
SPACK_F77_RPATH_ARG
SPACK_FC_RPATH_ARG
SPACK_TARGET_ARGS
SPACK_DTAGS_TO_ADD
SPACK_DTAGS_TO_STRIP
SPACK_LINKER_ARG
SPACK_SHORT_SPEC
SPACK_SYSTEM_DIRS
)
readonly params="\
SPACK_ENV_PATH
SPACK_DEBUG_LOG_DIR
SPACK_DEBUG_LOG_ID
SPACK_COMPILER_SPEC
SPACK_CC_RPATH_ARG
SPACK_CXX_RPATH_ARG
SPACK_F77_RPATH_ARG
SPACK_FC_RPATH_ARG
SPACK_LINKER_ARG
SPACK_SHORT_SPEC
SPACK_SYSTEM_DIRS"
# Optional parameters that aren't required to be set
# Boolean (true/false/custom) if we want to add debug flags
# SPACK_ADD_DEBUG_FLAGS
# If a custom flag is requested, it will be defined
# SPACK_DEBUG_FLAGS
# The compiler input variables are checked for sanity later:
# SPACK_CC, SPACK_CXX, SPACK_F77, SPACK_FC
@@ -50,43 +67,159 @@ parameters=(
# Test command is used to unit test the compiler script.
# SPACK_TEST_COMMAND
# die()
# Prints a message and exits with error 1.
function die {
echo "$@"
# die MESSAGE
# Print a message and exit with error code 1.
die() {
echo "[spack cc] ERROR: $*"
exit 1
}
# read input parameters into proper bash arrays.
# SYSTEM_DIRS is delimited by :
IFS=':' read -ra SPACK_SYSTEM_DIRS <<< "${SPACK_SYSTEM_DIRS}"
# empty VARNAME
# Return whether the variable VARNAME is unset or set to the empty string.
empty() {
eval "test -z \"\${$1}\""
}
# SPACK_<LANG>FLAGS and SPACK_LDLIBS are split by ' '
IFS=' ' read -ra SPACK_FFLAGS <<< "$SPACK_FFLAGS"
IFS=' ' read -ra SPACK_CPPFLAGS <<< "$SPACK_CPPFLAGS"
IFS=' ' read -ra SPACK_CFLAGS <<< "$SPACK_CFLAGS"
IFS=' ' read -ra SPACK_CXXFLAGS <<< "$SPACK_CXXFLAGS"
IFS=' ' read -ra SPACK_LDFLAGS <<< "$SPACK_LDFLAGS"
IFS=' ' read -ra SPACK_LDLIBS <<< "$SPACK_LDLIBS"
# setsep LISTNAME
# Set the global variable 'sep' to the separator for a list with name LISTNAME.
# There are three types of lists:
# 1. regular lists end with _list and are separated by $lsep
# 2. directory lists end with _dirs/_DIRS/PATH(S) and are separated by ':'
# 3. any other list is assumed to be separated by spaces: " "
setsep() {
case "$1" in
*_dirs|*_DIRS|*PATH|*PATHS)
sep=':'
;;
*_list)
sep="$lsep"
;;
*)
sep=" "
;;
esac
}
# prepend LISTNAME ELEMENT [SEP]
#
# Prepend ELEMENT to the list stored in the variable LISTNAME,
# assuming the list is separated by SEP.
# Handles empty lists and single-element lists.
prepend() {
varname="$1"
elt="$2"
if empty "$varname"; then
eval "$varname=\"\${elt}\""
else
# Get the appropriate separator for the list we're appending to.
setsep "$varname"
eval "$varname=\"\${elt}${sep}\${$varname}\""
fi
}
# append LISTNAME ELEMENT [SEP]
#
# Append ELEMENT to the list stored in the variable LISTNAME,
# assuming the list is separated by SEP.
# Handles empty lists and single-element lists.
append() {
varname="$1"
elt="$2"
if empty "$varname"; then
eval "$varname=\"\${elt}\""
else
# Get the appropriate separator for the list we're appending to.
setsep "$varname"
eval "$varname=\"\${$varname}${sep}\${elt}\""
fi
}
# extend LISTNAME1 LISTNAME2 [PREFIX]
#
# Append the elements stored in the variable LISTNAME2
# to the list stored in LISTNAME1.
# If PREFIX is provided, prepend it to each element.
extend() {
# Figure out the appropriate IFS for the list we're reading.
setsep "$2"
if [ "$sep" != " " ]; then
IFS="$sep"
fi
eval "for elt in \${$2}; do append $1 \"$3\${elt}\"; done"
unset IFS
}
# preextend LISTNAME1 LISTNAME2 [PREFIX]
#
# Prepend the elements stored in the list at LISTNAME2
# to the list at LISTNAME1, preserving order.
# If PREFIX is provided, prepend it to each element.
preextend() {
# Figure out the appropriate IFS for the list we're reading.
setsep "$2"
if [ "$sep" != " " ]; then
IFS="$sep"
fi
# first, reverse the list to prepend
_reversed_list=""
eval "for elt in \${$2}; do prepend _reversed_list \"$3\${elt}\"; done"
# prepend reversed list to preextend in order
IFS="${lsep}"
for elt in $_reversed_list; do prepend "$1" "$3${elt}"; done
unset IFS
}
# system_dir PATH
# test whether a path is a system directory
function system_dir {
system_dir() {
IFS=':' # SPACK_SYSTEM_DIRS is colon-separated
path="$1"
for sd in "${SPACK_SYSTEM_DIRS[@]}"; do
if [ "${path}" == "${sd}" ] || [ "${path}" == "${sd}/" ]; then
for sd in $SPACK_SYSTEM_DIRS; do
if [ "${path}" = "${sd}" ] || [ "${path}" = "${sd}/" ]; then
# success if path starts with a system prefix
unset IFS
return 0
fi
done
unset IFS
return 1 # fail if path starts no system prefix
}
for param in "${parameters[@]}"; do
if [[ -z ${!param+x} ]]; then
# Fail with a clear message if the input contains any bell characters.
if eval "[ \"\${*#*${lsep}}\" != \"\$*\" ]"; then
die "Compiler command line contains our separator ('${lsep}'). Cannot parse."
fi
# ensure required variables are set
for param in $params; do
if eval "test -z \"\${${param}:-}\""; then
die "Spack compiler must be run from Spack! Input '$param' is missing."
fi
done
# Check if optional parameters are defined
# If we aren't asking for debug flags, don't add them
if [ -z "${SPACK_ADD_DEBUG_FLAGS:-}" ]; then
SPACK_ADD_DEBUG_FLAGS="false"
fi
# SPACK_ADD_DEBUG_FLAGS must be true/false/custom
is_valid="false"
for param in "true" "false" "custom"; do
if [ "$param" = "$SPACK_ADD_DEBUG_FLAGS" ]; then
is_valid="true"
fi
done
# Exit with error if we are given an incorrect value
if [ "$is_valid" = "false" ]; then
die "SPACK_ADD_DEBUG_FLAGS, if defined, must be one of 'true', 'false', or 'custom'."
fi
# Figure out the type of compiler, the language, and the mode so that
# the compiler script knows what to do.
#
@@ -101,37 +234,42 @@ done
# ld link
# ccld compile & link
command=$(basename "$0")
command="${0##*/}"
comp="CC"
case "$command" in
cpp)
mode=cpp
debug_flags="-g"
;;
cc|c89|c99|gcc|clang|armclang|icc|icx|pgcc|nvc|xlc|xlc_r|fcc)
command="$SPACK_CC"
language="C"
comp="CC"
lang_flags=C
debug_flags="-g"
;;
c++|CC|g++|clang++|armclang++|icpc|icpx|pgc++|nvc++|xlc++|xlc++_r|FCC)
command="$SPACK_CXX"
language="C++"
comp="CXX"
lang_flags=CXX
debug_flags="-g"
;;
ftn|f90|fc|f95|gfortran|flang|armflang|ifort|ifx|pgfortran|nvfortran|xlf90|xlf90_r|nagfor|frt)
command="$SPACK_FC"
language="Fortran 90"
comp="FC"
lang_flags=F
debug_flags="-g"
;;
f77|xlf|xlf_r|pgf77)
command="$SPACK_F77"
language="Fortran 77"
comp="F77"
lang_flags=F
debug_flags="-g"
;;
ld)
ld|ld.gold|ld.lld)
mode=ld
;;
*)
@@ -142,7 +280,7 @@ esac
# If any of the arguments below are present, then the mode is vcheck.
# In vcheck mode, nothing is added in terms of extra search paths or
# libraries.
if [[ -z $mode ]] || [[ $mode == ld ]]; then
if [ -z "$mode" ] || [ "$mode" = ld ]; then
for arg in "$@"; do
case $arg in
-v|-V|--version|-dumpversion)
@@ -154,16 +292,16 @@ if [[ -z $mode ]] || [[ $mode == ld ]]; then
fi
# Finish setting up the mode.
if [[ -z $mode ]]; then
if [ -z "$mode" ]; then
mode=ccld
for arg in "$@"; do
if [[ $arg == -E ]]; then
if [ "$arg" = "-E" ]; then
mode=cpp
break
elif [[ $arg == -S ]]; then
elif [ "$arg" = "-S" ]; then
mode=as
break
elif [[ $arg == -c ]]; then
elif [ "$arg" = "-c" ]; then
mode=cc
break
fi
@@ -190,42 +328,46 @@ dtags_to_strip="${SPACK_DTAGS_TO_STRIP}"
linker_arg="${SPACK_LINKER_ARG}"
# Set up rpath variable according to language.
eval rpath=\$SPACK_${comp}_RPATH_ARG
rpath="ERROR: RPATH ARG WAS NOT SET"
eval "rpath=\${SPACK_${comp}_RPATH_ARG:?${rpath}}"
# Dump the mode and exit if the command is dump-mode.
if [[ $SPACK_TEST_COMMAND == dump-mode ]]; then
if [ "$SPACK_TEST_COMMAND" = "dump-mode" ]; then
echo "$mode"
exit
fi
# Check that at least one of the real commands was actually selected,
# otherwise we don't know what to execute.
if [[ -z $command ]]; then
die "ERROR: Compiler '$SPACK_COMPILER_SPEC' does not support compiling $language programs."
# If, say, SPACK_CC is set but SPACK_FC is not, we want to know. Compilers do not
# *have* to set up Fortran executables, so we need to tell the user when a build is
# about to attempt to use them unsuccessfully.
if [ -z "$command" ]; then
die "Compiler '$SPACK_COMPILER_SPEC' does not have a $language compiler configured."
fi
#
# Filter '.' and Spack environment directories out of PATH so that
# this script doesn't just call itself
#
IFS=':' read -ra env_path <<< "$PATH"
IFS=':' read -ra spack_env_dirs <<< "$SPACK_ENV_PATH"
spack_env_dirs+=("" ".")
export PATH=""
for dir in "${env_path[@]}"; do
new_dirs=""
IFS=':'
for dir in $PATH; do
addpath=true
for env_dir in "${spack_env_dirs[@]}"; do
if [[ "$dir" == "$env_dir" ]]; then
addpath=false
break
fi
for spack_env_dir in $SPACK_ENV_PATH; do
case "${dir%%/}" in
"$spack_env_dir"|'.'|'')
addpath=false
break
;;
esac
done
if $addpath; then
export PATH="${PATH:+$PATH:}$dir"
if [ $addpath = true ]; then
append new_dirs "$dir"
fi
done
unset IFS
export PATH="$new_dirs"
if [[ $mode == vcheck ]]; then
if [ "$mode" = vcheck ]; then
exec "${command}" "$@"
fi
@@ -233,16 +375,20 @@ fi
# It doesn't work with -rpath.
# This variable controls whether they are added.
add_rpaths=true
if [[ ($mode == ld || $mode == ccld) && "$SPACK_SHORT_SPEC" =~ "darwin" ]];
then
for arg in "$@"; do
if [[ ($arg == -r && $mode == ld) ||
($arg == -r && $mode == ccld) ||
($arg == -Wl,-r && $mode == ccld) ]]; then
add_rpaths=false
break
fi
done
if [ "$mode" = ld ] || [ "$mode" = ccld ]; then
if [ "${SPACK_SHORT_SPEC#*darwin}" != "${SPACK_SHORT_SPEC}" ]; then
for arg in "$@"; do
if [ "$arg" = "-r" ]; then
if [ "$mode" = ld ] || [ "$mode" = ccld ]; then
add_rpaths=false
break
fi
elif [ "$arg" = "-Wl,-r" ] && [ "$mode" = ccld ]; then
add_rpaths=false
break
fi
done
fi
fi
# Save original command for debug logging
@@ -265,17 +411,22 @@ input_command="$*"
# The libs variable is initialized here for completeness, and it is also
# used later to inject flags supplied via `ldlibs` on the command
# line. These come into the wrappers via SPACK_LDLIBS.
#
includes=()
libdirs=()
rpaths=()
system_includes=()
system_libdirs=()
system_rpaths=()
libs=()
other_args=()
isystem_system_includes=()
isystem_includes=()
# The loop below breaks up the command line into these lists of components.
# The lists are all bell-separated to be as flexible as possible, as their
# contents may come from the command line, from ' '-separated lists,
# ':'-separated lists, etc.
include_dirs_list=""
lib_dirs_list=""
rpath_dirs_list=""
system_include_dirs_list=""
system_lib_dirs_list=""
system_rpath_dirs_list=""
isystem_system_include_dirs_list=""
isystem_include_dirs_list=""
libs_list=""
other_args_list=""
while [ $# -ne 0 ]; do
@@ -295,32 +446,32 @@ while [ $# -ne 0 ]; do
isystem_was_used=true
if [ -z "$arg" ]; then shift; arg="$1"; fi
if system_dir "$arg"; then
isystem_system_includes+=("$arg")
append isystem_system_include_dirs_list "$arg"
else
isystem_includes+=("$arg")
append isystem_include_dirs_list "$arg"
fi
;;
-I*)
arg="${1#-I}"
if [ -z "$arg" ]; then shift; arg="$1"; fi
if system_dir "$arg"; then
system_includes+=("$arg")
append system_include_dirs_list "$arg"
else
includes+=("$arg")
append include_dirs_list "$arg"
fi
;;
-L*)
arg="${1#-L}"
if [ -z "$arg" ]; then shift; arg="$1"; fi
if system_dir "$arg"; then
system_libdirs+=("$arg")
append system_lib_dirs_list "$arg"
else
libdirs+=("$arg")
append lib_dirs_list "$arg"
fi
;;
-l*)
# -loopopt=0 is generated erroneously in autoconf <= 2.69,
# and passed by ifx to the linker, which confuses it with a
# library. Filter it out.
# TODO: generalize filtering of args with an env var, so that
# TODO: we do not have to special case this here.
@@ -331,66 +482,76 @@ while [ $# -ne 0 ]; do
fi
arg="${1#-l}"
if [ -z "$arg" ]; then shift; arg="$1"; fi
other_args+=("-l$arg")
append other_args_list "-l$arg"
;;
-Wl,*)
arg="${1#-Wl,}"
if [ -z "$arg" ]; then shift; arg="$1"; fi
if [[ "$arg" = -rpath=* ]]; then
rp="${arg#-rpath=}"
elif [[ "$arg" = --rpath=* ]]; then
rp="${arg#--rpath=}"
elif [[ "$arg" = -rpath,* ]]; then
rp="${arg#-rpath,}"
elif [[ "$arg" = --rpath,* ]]; then
rp="${arg#--rpath,}"
elif [[ "$arg" =~ ^-?-rpath$ ]]; then
shift; arg="$1"
if [[ "$arg" != -Wl,* ]]; then
die "-Wl,-rpath was not followed by -Wl,*"
fi
rp="${arg#-Wl,}"
elif [[ "$arg" = "$dtags_to_strip" ]] ; then
: # We want to remove explicitly this flag
else
other_args+=("-Wl,$arg")
fi
case "$arg" in
-rpath=*) rp="${arg#-rpath=}" ;;
--rpath=*) rp="${arg#--rpath=}" ;;
-rpath,*) rp="${arg#-rpath,}" ;;
--rpath,*) rp="${arg#--rpath,}" ;;
-rpath|--rpath)
shift; arg="$1"
case "$arg" in
-Wl,*)
rp="${arg#-Wl,}"
;;
*)
die "-Wl,-rpath was not followed by -Wl,*"
;;
esac
;;
"$dtags_to_strip")
: # We want to remove explicitly this flag
;;
*)
append other_args_list "-Wl,$arg"
;;
esac
;;
-Xlinker,*)
arg="${1#-Xlinker,}"
if [ -z "$arg" ]; then shift; arg="$1"; fi
if [[ "$arg" = -rpath=* ]]; then
rp="${arg#-rpath=}"
elif [[ "$arg" = --rpath=* ]]; then
rp="${arg#--rpath=}"
elif [[ "$arg" = -rpath ]] || [[ "$arg" = --rpath ]]; then
shift; arg="$1"
if [[ "$arg" != -Xlinker,* ]]; then
die "-Xlinker,-rpath was not followed by -Xlinker,*"
fi
rp="${arg#-Xlinker,}"
else
other_args+=("-Xlinker,$arg")
fi
case "$arg" in
-rpath=*) rp="${arg#-rpath=}" ;;
--rpath=*) rp="${arg#--rpath=}" ;;
-rpath|--rpath)
shift; arg="$1"
case "$arg" in
-Xlinker,*)
rp="${arg#-Xlinker,}"
;;
*)
die "-Xlinker,-rpath was not followed by -Xlinker,*"
;;
esac
;;
*)
append other_args_list "-Xlinker,$arg"
;;
esac
;;
-Xlinker)
if [[ "$2" == "-rpath" ]]; then
if [[ "$3" != "-Xlinker" ]]; then
if [ "$2" = "-rpath" ]; then
if [ "$3" != "-Xlinker" ]; then
die "-Xlinker,-rpath was not followed by -Xlinker,*"
fi
shift 3;
rp="$1"
elif [[ "$2" = "$dtags_to_strip" ]] ; then
elif [ "$2" = "$dtags_to_strip" ]; then
shift # We want to remove explicitly this flag
else
other_args+=("$1")
append other_args_list "$1"
fi
;;
*)
if [[ "$1" = "$dtags_to_strip" ]] ; then
if [ "$1" = "$dtags_to_strip" ]; then
: # We want to remove explicitly this flag
else
other_args+=("$1")
append other_args_list "$1"
fi
;;
esac
@@ -398,9 +559,9 @@ while [ $# -ne 0 ]; do
# test rpaths against system directories in one place.
if [ -n "$rp" ]; then
if system_dir "$rp"; then
system_rpaths+=("$rp")
append system_rpath_dirs_list "$rp"
else
rpaths+=("$rp")
append rpath_dirs_list "$rp"
fi
fi
shift
@@ -413,14 +574,24 @@ done
# See the gmake manual on implicit rules for details:
# https://www.gnu.org/software/make/manual/html_node/Implicit-Variables.html
#
flags=()
flags_list=""
# Add debug flags
if [ "${SPACK_ADD_DEBUG_FLAGS}" = "true" ]; then
extend flags_list debug_flags
# If a custom flag is requested, derive from environment
elif [ "$SPACK_ADD_DEBUG_FLAGS" = "custom" ]; then
extend flags_list SPACK_DEBUG_FLAGS
fi
# Fortran flags come before CPPFLAGS
case "$mode" in
cc|ccld)
case $lang_flags in
F)
flags=("${flags[@]}" "${SPACK_FFLAGS[@]}") ;;
extend flags_list SPACK_FFLAGS
;;
esac
;;
esac
@@ -428,7 +599,8 @@ esac
# C preprocessor flags come before any C/CXX flags
case "$mode" in
cpp|as|cc|ccld)
flags=("${flags[@]}" "${SPACK_CPPFLAGS[@]}") ;;
extend flags_list SPACK_CPPFLAGS
;;
esac
@@ -437,67 +609,67 @@ case "$mode" in
cc|ccld)
case $lang_flags in
C)
flags=("${flags[@]}" "${SPACK_CFLAGS[@]}") ;;
extend flags_list SPACK_CFLAGS
;;
CXX)
flags=("${flags[@]}" "${SPACK_CXXFLAGS[@]}") ;;
extend flags_list SPACK_CXXFLAGS
;;
esac
flags=(${SPACK_TARGET_ARGS[@]} "${flags[@]}")
# prepend target args
preextend flags_list SPACK_TARGET_ARGS
;;
esac
# Linker flags
case "$mode" in
ld|ccld)
flags=("${flags[@]}" "${SPACK_LDFLAGS[@]}") ;;
extend flags_list SPACK_LDFLAGS
;;
esac
# On macOS insert headerpad_max_install_names linker flag
if [[ ($mode == ld || $mode == ccld) && "$SPACK_SHORT_SPEC" =~ "darwin" ]];
then
case "$mode" in
ld)
flags=("${flags[@]}" -headerpad_max_install_names) ;;
ccld)
flags=("${flags[@]}" "-Wl,-headerpad_max_install_names") ;;
esac
if [ "$mode" = ld ] || [ "$mode" = ccld ]; then
if [ "${SPACK_SHORT_SPEC#*darwin}" != "${SPACK_SHORT_SPEC}" ]; then
case "$mode" in
ld)
append flags_list "-headerpad_max_install_names" ;;
ccld)
append flags_list "-Wl,-headerpad_max_install_names" ;;
esac
fi
fi
IFS=':' read -ra rpath_dirs <<< "$SPACK_RPATH_DIRS"
if [[ $mode == ccld || $mode == ld ]]; then
if [[ "$add_rpaths" != "false" ]] ; then
if [ "$mode" = ccld ] || [ "$mode" = ld ]; then
if [ "$add_rpaths" != "false" ]; then
# Append RPATH directories. Note that in the case of the
# top-level package these directories may not exist yet. For dependencies
# it is assumed that paths have already been confirmed.
rpaths=("${rpaths[@]}" "${rpath_dirs[@]}")
extend rpath_dirs_list SPACK_RPATH_DIRS
fi
fi
IFS=':' read -ra link_dirs <<< "$SPACK_LINK_DIRS"
if [[ $mode == ccld || $mode == ld ]]; then
libdirs=("${libdirs[@]}" "${link_dirs[@]}")
if [ "$mode" = ccld ] || [ "$mode" = ld ]; then
extend lib_dirs_list SPACK_LINK_DIRS
fi
# add RPATHs if we're in any linking mode
case "$mode" in
ld|ccld)
# Set extra RPATHs
IFS=':' read -ra extra_rpaths <<< "$SPACK_COMPILER_EXTRA_RPATHS"
libdirs+=("${extra_rpaths[@]}")
if [[ "$add_rpaths" != "false" ]] ; then
rpaths+=("${extra_rpaths[@]}")
extend lib_dirs_list SPACK_COMPILER_EXTRA_RPATHS
if [ "$add_rpaths" != "false" ]; then
extend rpath_dirs_list SPACK_COMPILER_EXTRA_RPATHS
fi
# Set implicit RPATHs
IFS=':' read -ra implicit_rpaths <<< "$SPACK_COMPILER_IMPLICIT_RPATHS"
if [[ "$add_rpaths" != "false" ]] ; then
rpaths+=("${implicit_rpaths[@]}")
if [ "$add_rpaths" != "false" ]; then
extend rpath_dirs_list SPACK_COMPILER_IMPLICIT_RPATHS
fi
# Add SPACK_LDLIBS to args
for lib in "${SPACK_LDLIBS[@]}"; do
libs+=("${lib#-l}")
for lib in $SPACK_LDLIBS; do
append libs_list "${lib#-l}"
done
;;
esac
@@ -505,63 +677,62 @@ esac
#
# Finally, reassemble the command line.
#
# Includes and system includes first
args=()
# flags assembled earlier
args+=("${flags[@]}")
args_list="$flags_list"
# Insert include directories just prior to any system include directories
# NOTE: adding ${lsep} to the prefix here turns every added element into two arguments
extend args_list include_dirs_list "-I"
extend args_list isystem_include_dirs_list "-isystem${lsep}"
for dir in "${includes[@]}"; do args+=("-I$dir"); done
for dir in "${isystem_includes[@]}"; do args+=("-isystem" "$dir"); done
case "$mode" in
cpp|cc|as|ccld)
if [ "$isystem_was_used" = "true" ]; then
extend args_list SPACK_INCLUDE_DIRS "-isystem${lsep}"
else
extend args_list SPACK_INCLUDE_DIRS "-I"
fi
;;
esac
IFS=':' read -ra spack_include_dirs <<< "$SPACK_INCLUDE_DIRS"
if [[ $mode == cpp || $mode == cc || $mode == as || $mode == ccld ]]; then
if [[ "$isystem_was_used" == "true" ]] ; then
for dir in "${spack_include_dirs[@]}"; do args+=("-isystem" "$dir"); done
else
for dir in "${spack_include_dirs[@]}"; do args+=("-I$dir"); done
fi
fi
for dir in "${system_includes[@]}"; do args+=("-I$dir"); done
for dir in "${isystem_system_includes[@]}"; do args+=("-isystem" "$dir"); done
extend args_list system_include_dirs_list -I
extend args_list isystem_system_include_dirs_list "-isystem${lsep}"
# Library search paths
for dir in "${libdirs[@]}"; do args+=("-L$dir"); done
for dir in "${system_libdirs[@]}"; do args+=("-L$dir"); done
extend args_list lib_dirs_list "-L"
extend args_list system_lib_dirs_list "-L"
# RPATHs arguments
case "$mode" in
ccld)
if [ -n "$dtags_to_add" ] ; then args+=("$linker_arg$dtags_to_add") ; fi
for dir in "${rpaths[@]}"; do args+=("$rpath$dir"); done
for dir in "${system_rpaths[@]}"; do args+=("$rpath$dir"); done
if [ -n "$dtags_to_add" ] ; then
append args_list "$linker_arg$dtags_to_add"
fi
extend args_list rpath_dirs_list "$rpath"
extend args_list system_rpath_dirs_list "$rpath"
;;
ld)
if [ -n "$dtags_to_add" ] ; then args+=("$dtags_to_add") ; fi
for dir in "${rpaths[@]}"; do args+=("-rpath" "$dir"); done
for dir in "${system_rpaths[@]}"; do args+=("-rpath" "$dir"); done
if [ -n "$dtags_to_add" ] ; then
append args_list "$dtags_to_add"
fi
extend args_list rpath_dirs_list "-rpath${lsep}"
extend args_list system_rpath_dirs_list "-rpath${lsep}"
;;
esac
# Other arguments from the input command
args+=("${other_args[@]}")
extend args_list other_args_list
# Inject SPACK_LDLIBS, if supplied
for lib in "${libs[@]}"; do
args+=("-l$lib");
done
extend args_list libs_list "-l"
full_command=("$command" "${args[@]}")
full_command_list="$command"
extend full_command_list args_list
# prepend the ccache binary if we're using ccache
if [ -n "$SPACK_CCACHE_BINARY" ]; then
case "$lang_flags" in
C|CXX) # ccache only supports C languages
full_command=("${SPACK_CCACHE_BINARY}" "${full_command[@]}")
prepend full_command_list "${SPACK_CCACHE_BINARY}"
# workaround for stage being a temp folder
# see #3761#issuecomment-294352232
export CCACHE_NOHASHDIR=yes
@@ -570,22 +741,36 @@ if [ -n "$SPACK_CCACHE_BINARY" ]; then
fi
# dump the full command if the caller supplies SPACK_TEST_COMMAND=dump-args
if [[ $SPACK_TEST_COMMAND == dump-args ]]; then
IFS="
" && echo "${full_command[*]}"
exit
elif [[ -n $SPACK_TEST_COMMAND ]]; then
die "ERROR: Unknown test command"
if [ -n "${SPACK_TEST_COMMAND=}" ]; then
case "$SPACK_TEST_COMMAND" in
dump-args)
IFS="$lsep"
for arg in $full_command_list; do
echo "$arg"
done
unset IFS
exit
;;
dump-env-*)
var=${SPACK_TEST_COMMAND#dump-env-}
eval "printf '%s\n' \"\$0: \$var: \$$var\""
;;
*)
die "Unknown test command: '$SPACK_TEST_COMMAND'"
;;
esac
fi
#
# Write the input and output commands to debug logs if it's asked for.
#
if [[ $SPACK_DEBUG == TRUE ]]; then
if [ "$SPACK_DEBUG" = TRUE ]; then
input_log="$SPACK_DEBUG_LOG_DIR/spack-cc-$SPACK_DEBUG_LOG_ID.in.log"
output_log="$SPACK_DEBUG_LOG_DIR/spack-cc-$SPACK_DEBUG_LOG_ID.out.log"
echo "[$mode] $command $input_command" >> "$input_log"
echo "[$mode] ${full_command[*]}" >> "$output_log"
echo "[$mode] ${full_command_list}" >> "$output_log"
fi
exec "${full_command[@]}"
# Execute the full command, preserving spaces with IFS set
# to the alarm bell separator.
IFS="$lsep"; exec $full_command_list

lib/spack/env/ld.gold vendored Symbolic link

@@ -0,0 +1 @@
cc

lib/spack/env/ld.lld vendored Symbolic link

@@ -0,0 +1 @@
cc


@@ -11,7 +11,7 @@
* Homepage: https://pypi.python.org/pypi/archspec
* Usage: Labeling, comparison and detection of microarchitectures
* Version: 0.1.2 (commit 26dec9d47e509daf8c970de4c89da200da52ad20)
* Version: 0.1.2 (commit 85757b6666422fca86aa882a769bf78b0f992f54)
argparse
--------
@@ -88,6 +88,8 @@
* Usage: Needed by pytest. Library with cross-python path,
ini-parsing, io, code, and log facilities.
* Version: 1.4.34 (last version supporting Python 2.6)
* Note: This package has been modified:
* https://github.com/pytest-dev/py/pull/186 was backported
pytest
------


@@ -49,6 +49,19 @@ $ tox
congratulations :)
```
## Citing Archspec
If you are referencing `archspec` in a publication, please cite the following
paper:
* Massimiliano Culpo, Gregory Becker, Carlos Eduardo Arango Gutierrez, Kenneth
Hoste, and Todd Gamblin.
[**`archspec`: A library for detecting, labeling, and reasoning about
microarchitectures**](https://tgamblin.github.io/pubs/archspec-canopie-hpc-2020.pdf).
In *2nd International Workshop on Containers and New Orchestration Paradigms
for Isolated Environments in HPC (CANOPIE-HPC'20)*, Online Event, November
12, 2020.
## License
Archspec is distributed under the terms of both the MIT license and the


@@ -206,11 +206,26 @@ def host():
# Get a list of possible candidates for this micro-architecture
candidates = compatible_microarchitectures(info)
# Sorting criteria for candidates
def sorting_fn(item):
return len(item.ancestors), len(item.features)
# Get the best generic micro-architecture
generic_candidates = [c for c in candidates if c.vendor == "generic"]
best_generic = max(generic_candidates, key=sorting_fn)
# Filter the candidates to be descendant of the best generic candidate.
# This is to avoid that the lack of a niche feature that can be disabled
# from e.g. BIOS prevents detection of a reasonably performant architecture
candidates = [c for c in candidates if c > best_generic]
# If we don't have candidates, return the best generic micro-architecture
if not candidates:
return best_generic
# Reverse sort of the depth for the inheritance tree among only targets we
# can use. This gets the newest target we satisfy.
return sorted(
candidates, key=lambda t: (len(t.ancestors), len(t.features)), reverse=True
)[0]
return max(candidates, key=sorting_fn)
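As a side note, here is a minimal sketch of the selection criteria used above, with made-up stand-in objects (not real archspec targets) to show that deeper, more featureful candidates win over the generic fallback:
class FakeUarch(object):
    # Stand-in with only the attributes sorting_fn needs (hypothetical data).
    def __init__(self, name, vendor, ancestors, features):
        self.name, self.vendor = name, vendor
        self.ancestors, self.features = ancestors, features
def sorting_fn(item):
    # Same criteria as host(): inheritance-tree depth first, then feature count.
    return len(item.ancestors), len(item.features)
generic = FakeUarch("x86_64", "generic", ancestors=[], features=set())
haswell = FakeUarch("haswell", "GenuineIntel", ancestors=[generic], features={"avx2"})
best_generic = max([c for c in (generic, haswell) if c.vendor == "generic"], key=sorting_fn)
print(best_generic.name)                              # x86_64
print(max((generic, haswell), key=sorting_fn).name)   # haswell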
def compatibility_check(architecture_family):
@@ -245,7 +260,13 @@ def compatibility_check_for_power(info, target):
"""Compatibility check for PPC64 and PPC64LE architectures."""
basename = platform.machine()
generation_match = re.search(r"POWER(\d+)", info.get("cpu", ""))
generation = int(generation_match.group(1))
try:
generation = int(generation_match.group(1))
except AttributeError:
# There might be no match under emulated environments. For instance
# emulating a ppc64le with QEMU and Docker still reports the host
# /proc/cpuinfo and not a Power
generation = 0
# We can use a target if it descends from our machine type and our
# generation (9 for POWER9, etc) is at least its generation.
@@ -285,3 +306,22 @@ def compatibility_check_for_aarch64(info, target):
and (target.vendor == vendor or target.vendor == "generic")
and target.features.issubset(features)
)
@compatibility_check(architecture_family="riscv64")
def compatibility_check_for_riscv64(info, target):
"""Compatibility check for riscv64 architectures."""
basename = "riscv64"
uarch = info.get("uarch")
# sifive unmatched board
if uarch == "sifive,u74-mc":
uarch = "u74mc"
# catch-all for unknown uarchs
else:
uarch = "riscv64"
arch_root = TARGETS[basename]
return (target == arch_root or arch_root in target.ancestors) and (
target == uarch or target.vendor == "generic"
)


@@ -173,6 +173,12 @@ def family(self):
return roots.pop()
@property
def generic(self):
"""Returns the best generic architecture that is compatible with self"""
generics = [x for x in [self] + self.ancestors if x.vendor == "generic"]
return max(generics, key=lambda x: len(x.ancestors))
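A quick usage sketch for the new property, assuming the microarchitecture database bundled at this archspec version, where the generic ancestor of Intel Skylake is plain x86_64:
import archspec.cpu
# generic walks self plus its ancestors and keeps the deepest vendor == "generic"
# entry, so a niche target degrades to its architecture family.
skylake = archspec.cpu.TARGETS["skylake"]
print(skylake.generic.name)  # expected: x86_64 with this database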
def to_dict(self, return_list_of_items=False):
"""Returns a dictionary representation of this object.


@@ -1942,6 +1942,12 @@
"versions": "5:",
"flags" : "-march=armv8.2-a+fp16+rcpc+dotprod+crypto"
}
],
"arm" : [
{
"versions": "20:",
"flags" : "-march=armv8.2-a+fp16+rcpc+dotprod+crypto"
}
]
}
},
@@ -2011,6 +2017,44 @@
"features": [],
"compilers": {
}
},
"riscv64": {
"from": [],
"vendor": "generic",
"features": [],
"compilers": {
"gcc": [
{
"versions": "7.1:",
"flags" : "-march=rv64gc"
}
],
"clang": [
{
"versions": "9.0:",
"flags" : "-march=rv64gc"
}
]
}
},
"u74mc": {
"from": ["riscv64"],
"vendor": "SiFive",
"features": [],
"compilers": {
"gcc": [
{
"versions": "10.2:",
"flags" : "-march=rv64gc -mtune=sifive-7-series"
}
],
"clang" : [
{
"versions": "12.0:",
"flags" : "-march=rv64gc -mtune=sifive-7-series"
}
]
}
}
},
"feature_aliases": {


@@ -77,52 +77,18 @@
from six import StringIO
from six import string_types
class prefilter(object):
"""Make regular expressions faster with a simple prefiltering predicate.
Some regular expressions seem to be much more costly than others. In
most cases, we can evaluate a simple precondition, e.g.::
lambda x: "error" in x
to avoid evaluating expensive regexes on all lines in a file. This
can reduce parse time for large files by orders of magnitude when
evaluating lots of expressions.
A ``prefilter`` object is designed to act like a regex, but
``search`` and ``match`` check the precondition before bothering to
evaluate the regular expression.
Note that ``match`` and ``search`` just return ``True`` and ``False``
at the moment. Make them return a ``MatchObject`` or ``None`` if it
becomes necessary.
"""
def __init__(self, precondition, *patterns):
self.patterns = [re.compile(p) for p in patterns]
self.pre = precondition
self.pattern = "\n ".join(
('MERGED:',) + patterns)
def search(self, text):
return self.pre(text) and any(p.search(text) for p in self.patterns)
def match(self, text):
return self.pre(text) and any(p.match(text) for p in self.patterns)
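For context, a small sketch of how the wrapper above was used before this change removed it, assuming the prefilter class shown above is in scope; the sample line and pattern are illustrative only:
# The guard lambda runs first, so the costly regex is only tried on lines
# that could plausibly contain an error.
err = prefilter(lambda x: "error" in x,
                "([^:]+): error[ \\t]*[0-9]+[ \\t]*:")
print(bool(err.search("foo.c: error 42: something broke")))  # True
print(bool(err.search("all quiet on this line")))            # False, regex never runs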
_error_matches = [
prefilter(
lambda x: any(s in x for s in (
'Error:', 'error', 'undefined reference', 'multiply defined')),
"([^:]+): error[ \\t]*[0-9]+[ \\t]*:",
"([^:]+): (Error:|error|undefined reference|multiply defined)",
"([^ :]+) ?: (error|fatal error|catastrophic error)",
"([^:]+)\\(([^\\)]+)\\) ?: (error|fatal error|catastrophic error)"),
"^FAILED",
"^FAIL: ",
"^FATAL: ",
"^failed ",
"FAILED",
"Failed test",
"^[Bb]us [Ee]rror",
"^[Ss]egmentation [Vv]iolation",
"^[Ss]egmentation [Ff]ault",
":.*[Pp]ermission [Dd]enied",
"[^ :]:[0-9]+: [^ \\t]",
"[^:]: error[ \\t]*[0-9]+[ \\t]*:",
"^Error ([0-9]+):",
"^Fatal",
"^[Ee]rror: ",
@@ -132,6 +98,9 @@ def match(self, text):
"^cc[^C]*CC: ERROR File = ([^,]+), Line = ([0-9]+)",
"^ld([^:])*:([ \\t])*ERROR([^:])*:",
"^ild:([ \\t])*\\(undefined symbol\\)",
"[^ :] : (error|fatal error|catastrophic error)",
"[^:]: (Error:|error|undefined reference|multiply defined)",
"[^:]\\([^\\)]+\\) ?: (error|fatal error|catastrophic error)",
"^fatal error C[0-9]+:",
": syntax error ",
"^collect2: ld returned 1 exit status",
@@ -140,7 +109,7 @@ def match(self, text):
"^Unresolved:",
"Undefined symbol",
"^Undefined[ \\t]+first referenced",
"^CMake Error.*:",
"^CMake Error",
":[ \\t]cannot find",
":[ \\t]can't find",
": \\*\\*\\* No rule to make target [`'].*\\'. Stop",
@@ -154,6 +123,7 @@ def match(self, text):
"ld: 0706-006 Cannot find or open library file: -l ",
"ild: \\(argument error\\) can't find library argument ::",
"^could not be found and will not be loaded.",
"^WARNING: '.*' is missing on your system",
"s:616 string too big",
"make: Fatal error: ",
"ld: 0711-993 Error occurred while writing to the output file:",
@@ -175,44 +145,40 @@ def match(self, text):
"instantiated from ",
"candidates are:",
": warning",
": WARNING",
": \\(Warning\\)",
": note",
" ok",
"Note:",
"makefile:",
"Makefile:",
":[ \\t]+Where:",
"([^ :]+):([0-9]+): Warning",
"[^ :]:[0-9]+: Warning",
"------ Build started: .* ------",
]
#: Regexes to match file/line numbers in error/warning messages
_warning_matches = [
prefilter(
lambda x: 'warning' in x,
"([^ :]+):([0-9]+): warning:",
"([^:]+): warning ([0-9]+):",
"([^:]+): warning[ \\t]*[0-9]+[ \\t]*:",
"([^ :]+) : warning",
"([^:]+): warning"),
prefilter(
lambda x: 'note:' in x,
"^([^ :]+):([0-9]+): note:"),
prefilter(
lambda x: any(s in x for s in ('Warning', 'Warnung')),
"^(Warning|Warnung) ([0-9]+):",
"^(Warning|Warnung)[ :]",
"^cxx: Warning:",
"([^ :]+):([0-9]+): (Warning|Warnung)",
"^CMake Warning.*:"),
"file: .* has no symbols",
"[^ :]:[0-9]+: warning:",
"[^ :]:[0-9]+: note:",
"^cc[^C]*CC: WARNING File = ([^,]+), Line = ([0-9]+)",
"^ld([^:])*:([ \\t])*WARNING([^:])*:",
"[^:]: warning [0-9]+:",
"^\"[^\"]+\", line [0-9]+: [Ww](arning|arnung)",
"[^:]: warning[ \\t]*[0-9]+[ \\t]*:",
"^(Warning|Warnung) ([0-9]+):",
"^(Warning|Warnung)[ :]",
"WARNING: ",
"[^ :] : warning",
"[^:]: warning",
"\", line [0-9]+\\.[0-9]+: [0-9]+-[0-9]+ \\([WI]\\)",
"^cxx: Warning:",
"file: .* has no symbols",
"[^ :]:[0-9]+: (Warning|Warnung)",
"\\([0-9]*\\): remark #[0-9]*",
"\".*\", line [0-9]+: remark\\([0-9]*\\):",
"cc-[0-9]* CC: REMARK File = .*, Line = [0-9]*",
"^CMake Warning",
"^\\[WARNING\\]",
]
@@ -343,8 +309,7 @@ def _profile_match(matches, exceptions, line, match_times, exc_times):
def _parse(lines, offset, profile):
def compile(regex_array):
return [regex if isinstance(regex, prefilter) else re.compile(regex)
for regex in regex_array]
return [re.compile(regex) for regex in regex_array]
error_matches = compile(_error_matches)
error_exceptions = compile(_error_exceptions)


@@ -10,7 +10,7 @@
from py._path.common import iswin32, fspath
from stat import S_ISLNK, S_ISDIR, S_ISREG
from os.path import abspath, normcase, normpath, isabs, exists, isdir, isfile, islink, dirname
from os.path import abspath, normpath, isabs, exists, isdir, isfile, islink, dirname
if sys.version_info > (3,0):
def map_as_list(func, iter):
@@ -801,10 +801,10 @@ def make_numbered_dir(cls, prefix='session-', rootdir=None, keep=3,
if rootdir is None:
rootdir = cls.get_temproot()
nprefix = normcase(prefix)
nprefix = prefix.lower()
def parse_num(path):
""" parse the number out of a path (if it matches the prefix) """
nbasename = normcase(path.basename)
nbasename = path.basename.lower()
if nbasename.startswith(nprefix):
try:
return int(nbasename[len(nprefix):])


@@ -326,7 +326,7 @@ def end_function(self, prog=None):
"""Returns the syntax needed to end a function definition.
Parameters:
prog (str, optional): the command name
prog (str or None): the command name
Returns:
str: the function definition ending


@@ -444,7 +444,7 @@ def copy_tree(src, dest, symlinks=True, ignore=None, _permissions=False):
src (str): the directory to copy
dest (str): the destination directory
symlinks (bool): whether or not to preserve symlinks
ignore (function): function indicating which files to ignore
ignore (typing.Callable): function indicating which files to ignore
_permissions (bool): for internal use only
Raises:
@@ -518,7 +518,7 @@ def install_tree(src, dest, symlinks=True, ignore=None):
src (str): the directory to install
dest (str): the destination directory
symlinks (bool): whether or not to preserve symlinks
ignore (function): function indicating which files to ignore
ignore (typing.Callable): function indicating which files to ignore
Raises:
IOError: if *src* does not match any files or directories
@@ -557,12 +557,12 @@ def mkdirp(*paths, **kwargs):
paths (str): paths to create with mkdirp
Keyword Arguments:
mode (permission bits or None, optional): optional permissions to set
mode (permission bits or None): optional permissions to set
on the created directory -- use OS default if not provided
group (group name or None, optional): optional group for permissions of
group (group name or None): optional group for permissions of
final created directory -- use OS default if not provided. Only
used if world write permissions are not set
default_perms ('parents' or 'args', optional): The default permissions
default_perms (str or None): one of 'parents' or 'args'. The default permissions
that are set for directories that are not themselves an argument
for mkdirp. 'parents' means intermediate directories get the
permissions of their direct parent directory, 'args' means
@@ -656,6 +656,12 @@ def working_dir(dirname, **kwargs):
os.chdir(orig_dir)
class CouldNotRestoreDirectoryBackup(RuntimeError):
def __init__(self, inner_exception, outer_exception):
self.inner_exception = inner_exception
self.outer_exception = outer_exception
@contextmanager
def replace_directory_transaction(directory_name, tmp_root=None):
"""Moves a directory to a temporary space. If the operations executed
@@ -683,32 +689,33 @@ def replace_directory_transaction(directory_name, tmp_root=None):
assert os.path.isabs(tmp_root)
tmp_dir = tempfile.mkdtemp(dir=tmp_root)
tty.debug('TEMPORARY DIRECTORY CREATED [{0}]'.format(tmp_dir))
tty.debug('Temporary directory created [{0}]'.format(tmp_dir))
shutil.move(src=directory_name, dst=tmp_dir)
tty.debug('DIRECTORY MOVED [src={0}, dest={1}]'.format(
directory_name, tmp_dir
))
tty.debug('Directory moved [src={0}, dest={1}]'.format(directory_name, tmp_dir))
try:
yield tmp_dir
except (Exception, KeyboardInterrupt, SystemExit) as e:
# Delete what was there, before copying back the original content
if os.path.exists(directory_name):
shutil.rmtree(directory_name)
shutil.move(
src=os.path.join(tmp_dir, directory_basename),
dst=os.path.dirname(directory_name)
)
tty.debug('DIRECTORY RECOVERED [{0}]'.format(directory_name))
except (Exception, KeyboardInterrupt, SystemExit) as inner_exception:
# Try to recover the original directory, if this fails, raise a
# composite exception.
try:
# Delete what was there, before copying back the original content
if os.path.exists(directory_name):
shutil.rmtree(directory_name)
shutil.move(
src=os.path.join(tmp_dir, directory_basename),
dst=os.path.dirname(directory_name)
)
except Exception as outer_exception:
raise CouldNotRestoreDirectoryBackup(inner_exception, outer_exception)
msg = 'the transactional move of "{0}" failed.'
msg += '\n ' + str(e)
raise RuntimeError(msg.format(directory_name))
tty.debug('Directory recovered [{0}]'.format(directory_name))
raise
else:
# Otherwise delete the temporary directory
shutil.rmtree(tmp_dir)
tty.debug('TEMPORARY DIRECTORY DELETED [{0}]'.format(tmp_dir))
shutil.rmtree(tmp_dir, ignore_errors=True)
tty.debug('Temporary directory deleted [{0}]'.format(tmp_dir))
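A usage sketch of the transaction above, with a hypothetical prefix path; the helper is the one defined in llnl.util.filesystem in this tree:
import os
import llnl.util.filesystem as fs
# The original directory is moved to a temporary backup. If the body raises,
# the backup is moved back (or CouldNotRestoreDirectoryBackup is raised if
# even that fails); on success the backup is deleted.
with fs.replace_directory_transaction('/path/to/prefix') as backup:
    os.makedirs('/path/to/prefix')   # rebuild the contents from scratch
    print('old copy parked in', backup)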
def hash_directory(directory, ignore=[]):
@@ -866,7 +873,7 @@ def traverse_tree(source_root, dest_root, rel_path='', **kwargs):
Keyword Arguments:
order (str): Whether to do pre- or post-order traversal. Accepted
values are 'pre' and 'post'
ignore (function): function indicating which files to ignore
ignore (typing.Callable): function indicating which files to ignore
follow_nonexisting (bool): Whether to descend into directories in
``src`` that do not exist in ``dest``. Default is True
follow_links (bool): Whether to descend into symlinks in ``src``
@@ -1102,23 +1109,23 @@ def find(root, files, recursive=True):
Accepts any glob characters accepted by fnmatch:
======= ====================================
Pattern Meaning
======= ====================================
* matches everything
? matches any single character
[seq] matches any character in ``seq``
[!seq] matches any character not in ``seq``
======= ====================================
========== ====================================
Pattern Meaning
========== ====================================
``*`` matches everything
``?`` matches any single character
``[seq]`` matches any character in ``seq``
``[!seq]`` matches any character not in ``seq``
========== ====================================
Parameters:
root (str): The root directory to start searching from
files (str or Sequence): Library name(s) to search for
recurse (bool, optional): if False search only root folder,
recursive (bool): if False search only root folder,
if True descends top-down from the root. Defaults to True.
Returns:
list of strings: The files that have been found
list: The files that have been found
"""
if isinstance(files, six.string_types):
files = [files]
@@ -1200,7 +1207,7 @@ def directories(self):
['/dir1', '/dir2']
Returns:
list of strings: A list of directories
list: A list of directories
"""
return list(dedupe(
os.path.dirname(x) for x in self.files if os.path.dirname(x)
@@ -1218,7 +1225,7 @@ def basenames(self):
['a.h', 'b.h']
Returns:
list of strings: A list of base-names
list: A list of base-names
"""
return list(dedupe(os.path.basename(x) for x in self.files))
@@ -1305,7 +1312,7 @@ def headers(self):
"""Stable de-duplication of the headers.
Returns:
list of strings: A list of header files
list: A list of header files
"""
return self.files
@@ -1318,7 +1325,7 @@ def names(self):
['a', 'b']
Returns:
list of strings: A list of files without extensions
list: A list of files without extensions
"""
names = []
@@ -1409,9 +1416,9 @@ def find_headers(headers, root, recursive=False):
======= ====================================
Parameters:
headers (str or list of str): Header name(s) to search for
headers (str or list): Header name(s) to search for
root (str): The root directory to start searching from
recursive (bool, optional): if False search only root folder,
recursive (bool): if False search only root folder,
if True descends top-down from the root. Defaults to False.
Returns:
@@ -1447,7 +1454,7 @@ def find_all_headers(root):
in the directory passed as argument.
Args:
root (path): directory where to look recursively for header files
root (str): directory where to look recursively for header files
Returns:
List of all headers found in ``root`` and subdirectories.
@@ -1467,7 +1474,7 @@ def libraries(self):
"""Stable de-duplication of library files.
Returns:
list of strings: A list of library files
list: A list of library files
"""
return self.files
@@ -1480,7 +1487,7 @@ def names(self):
['a', 'b']
Returns:
list of strings: A list of library names
list: A list of library names
"""
names = []
@@ -1565,8 +1572,8 @@ def find_system_libraries(libraries, shared=True):
======= ====================================
Parameters:
libraries (str or list of str): Library name(s) to search for
shared (bool, optional): if True searches for shared libraries,
libraries (str or list): Library name(s) to search for
shared (bool): if True searches for shared libraries,
otherwise for static. Defaults to True.
Returns:
@@ -1616,11 +1623,11 @@ def find_libraries(libraries, root, shared=True, recursive=False):
======= ====================================
Parameters:
libraries (str or list of str): Library name(s) to search for
libraries (str or list): Library name(s) to search for
root (str): The root directory to start searching from
shared (bool, optional): if True searches for shared libraries,
shared (bool): if True searches for shared libraries,
otherwise for static. Defaults to True.
recursive (bool, optional): if False search only root folder,
recursive (bool): if False search only root folder,
if True descends top-down from the root. Defaults to False.
Returns:
@@ -1848,3 +1855,18 @@ def keep_modification_time(*filenames):
for f, mtime in mtimes.items():
if os.path.exists(f):
os.utime(f, (os.path.getatime(f), mtime))
@contextmanager
def temporary_dir(*args, **kwargs):
"""Create a temporary directory and cd's into it. Delete the directory
on exit.
Takes the same arguments as tempfile.mkdtemp()
"""
tmp_dir = tempfile.mkdtemp(*args, **kwargs)
try:
with working_dir(tmp_dir):
yield tmp_dir
finally:
remove_directory_contents(tmp_dir)
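A usage sketch for the new helper (imports assumed from llnl.util.filesystem, where it is added above):
import os
from llnl.util.filesystem import temporary_dir
# Creates a scratch directory, chdirs into it for the duration of the block,
# and clears it out again on exit.
with temporary_dir() as tmp:
    with open('scratch.txt', 'w') as f:
        f.write('only lives inside the with block\n')
    print(os.path.realpath(os.getcwd()) == os.path.realpath(tmp))  # True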


@@ -7,7 +7,6 @@
import functools
import inspect
import multiprocessing
import os
import re
import sys
@@ -31,23 +30,6 @@
ignore_modules = [r'^\.#', '~$']
# On macOS, Python 3.8 multiprocessing now defaults to the 'spawn' start
# method. Spack cannot currently handle this, so force the process to start
# using the 'fork' start method.
#
# TODO: This solution is not ideal, as the 'fork' start method can lead to
# crashes of the subprocess. Figure out how to make 'spawn' work.
#
# See:
# * https://github.com/spack/spack/pull/18124
# * https://docs.python.org/3/library/multiprocessing.html#contexts-and-start-methods # noqa: E501
# * https://bugs.python.org/issue33725
if sys.version_info >= (3,): # novm
fork_context = multiprocessing.get_context('fork')
else:
fork_context = multiprocessing
def index_by(objects, *funcs):
"""Create a hierarchy of dictionaries by splitting the supplied
set of objects on unique values of the supplied functions.
@@ -258,6 +240,47 @@ def new_dec(*args, **kwargs):
return new_dec
def key_ordering(cls):
"""Decorates a class with extra methods that implement rich comparison
operations and ``__hash__``. The decorator assumes that the class
implements a function called ``_cmp_key()``. The rich comparison
operations will compare objects using this key, and the ``__hash__``
function will return the hash of this key.
If a class already has ``__eq__``, ``__ne__``, ``__lt__``, ``__le__``,
``__gt__``, or ``__ge__`` defined, this decorator will overwrite them.
Raises:
TypeError: If the class does not have a ``_cmp_key`` method
"""
def setter(name, value):
value.__name__ = name
setattr(cls, name, value)
if not has_method(cls, '_cmp_key'):
raise TypeError("'%s' doesn't define _cmp_key()." % cls.__name__)
setter('__eq__',
lambda s, o:
(s is o) or (o is not None and s._cmp_key() == o._cmp_key()))
setter('__lt__',
lambda s, o: o is not None and s._cmp_key() < o._cmp_key())
setter('__le__',
lambda s, o: o is not None and s._cmp_key() <= o._cmp_key())
setter('__ne__',
lambda s, o:
(s is not o) and (o is None or s._cmp_key() != o._cmp_key()))
setter('__gt__',
lambda s, o: o is None or s._cmp_key() > o._cmp_key())
setter('__ge__',
lambda s, o: o is None or s._cmp_key() >= o._cmp_key())
setter('__hash__', lambda self: hash(self._cmp_key()))
return cls
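A minimal usage sketch for the decorator above; SimpleVersion is a made-up class, not a Spack type:
from llnl.util.lang import key_ordering
@key_ordering
class SimpleVersion(object):
    def __init__(self, major, minor):
        self.major, self.minor = major, minor
    def _cmp_key(self):
        # All rich comparisons and __hash__ are derived from this tuple.
        return (self.major, self.minor)
print(SimpleVersion(1, 4) < SimpleVersion(2, 0))        # True
print(len({SimpleVersion(1, 4), SimpleVersion(1, 4)}))  # 1: hashing uses the key too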
#: sentinel for testing that iterators are done in lazy_lexicographic_ordering
done = object()
@@ -573,8 +596,8 @@ def pretty_date(time, now=None):
"""Convert a datetime or timestamp to a pretty, relative date.
Args:
time (datetime or int): date to print prettily
now (datetime): dateimte for 'now', i.e. the date the pretty date
time (datetime.datetime or int): date to print prettily
now (datetime.datetime): datetime for 'now', i.e. the date the pretty date
is relative to (default is datetime.now())
Returns:
@@ -648,7 +671,7 @@ def pretty_string_to_date(date_str, now=None):
or be a *pretty date* (like ``yesterday`` or ``two months ago``)
Returns:
(datetime): datetime object corresponding to ``date_str``
(datetime.datetime): datetime object corresponding to ``date_str``
"""
pattern = {}
@@ -892,3 +915,19 @@ class Devnull(object):
"""
def write(self, *_):
pass
def elide_list(line_list, max_num=10):
"""Takes a long list and limits it to a smaller number of elements,
replacing intervening elements with '...'. For example::
elide_list([1,2,3,4,5,6], 4)
gives::
[1, 2, 3, '...', 6]
"""
if len(line_list) > max_num:
return line_list[:max_num - 1] + ['...'] + line_list[-1:]
else:
return line_list


@@ -9,14 +9,25 @@
import socket
import time
from datetime import datetime
from typing import Dict, Tuple # novm
import llnl.util.tty as tty
import spack.util.string
__all__ = ['Lock', 'LockTransaction', 'WriteTransaction', 'ReadTransaction',
'LockError', 'LockTimeoutError',
'LockPermissionError', 'LockROFileError', 'CantCreateLockError']
__all__ = [
'Lock',
'LockDowngradeError',
'LockUpgradeError',
'LockTransaction',
'WriteTransaction',
'ReadTransaction',
'LockError',
'LockTimeoutError',
'LockPermissionError',
'LockROFileError',
'CantCreateLockError'
]
#: Mapping of supported locks to description
lock_type = {fcntl.LOCK_SH: 'read', fcntl.LOCK_EX: 'write'}
@@ -26,6 +37,126 @@
true_fn = lambda: True
class OpenFile(object):
"""Record for keeping track of open lockfiles (with reference counting).
There's really only one ``OpenFile`` per inode, per process, but we record the
filehandle here as it's the thing we end up using in python code. You can get
the file descriptor from the file handle if needed -- or we could make this track
file descriptors as well in the future.
"""
def __init__(self, fh):
self.fh = fh
self.refs = 0
class OpenFileTracker(object):
"""Track open lockfiles, to minimize number of open file descriptors.
The ``fcntl`` locks that Spack uses are associated with an inode and a process.
This is convenient, because if a process exits, it releases its locks.
Unfortunately, this also means that if you close a file, *all* locks associated
with that file's inode are released, regardless of whether the process has any
other open file descriptors on it.
Because of this, we need to track open lock files so that we only close them when
a process no longer needs them. We do this by tracking each lockfile by its
inode and process id. This has several nice properties:
1. Tracking by pid ensures that, if we fork, we don't inadvertently track the parent
process's lockfiles. ``fcntl`` locks are not inherited across forks, so we'll
just track new lockfiles in the child.
2. Tracking by inode ensures that references are counted per inode, and that we don't
inadvertently close a file whose inode still has open locks.
3. Tracking by both pid and inode ensures that we only open lockfiles the minimum
number of times necessary for the locks we have.
Note: as mentioned elsewhere, these locks aren't thread safe -- they're designed to
work in Python and assume the GIL.
"""
def __init__(self):
"""Create a new ``OpenFileTracker``."""
self._descriptors = {} # type: Dict[Tuple[int, int], OpenFile]
def get_fh(self, path):
"""Get a filehandle for a lockfile.
This routine will open writable files for read/write even if you're asking
for a shared (read-only) lock. This is so that we can upgrade to an exclusive
(write) lock later if requested.
Arguments:
path (str): path to lock file we want a filehandle for
"""
# Open writable files as 'r+' so we can upgrade to write later
os_mode, fh_mode = (os.O_RDWR | os.O_CREAT), 'r+'
pid = os.getpid()
open_file = None # OpenFile object, if there is one
stat = None # stat result for the lockfile, if it exists
try:
# see whether we've seen this inode/pid before
stat = os.stat(path)
key = (stat.st_ino, pid)
open_file = self._descriptors.get(key)
except OSError as e:
if e.errno != errno.ENOENT: # only handle file not found
raise
# path does not exist -- fail if we won't be able to create it
parent = os.path.dirname(path) or '.'
if not os.access(parent, os.W_OK):
raise CantCreateLockError(path)
# if there was no already open file, we'll need to open one
if not open_file:
if stat and not os.access(path, os.W_OK):
# we know path exists but not if it's writable. If it's read-only,
# only open the file for reading (and fail if we're trying to get
# an exclusive (write) lock on it)
os_mode, fh_mode = os.O_RDONLY, 'r'
fd = os.open(path, os_mode)
fh = os.fdopen(fd, fh_mode)
open_file = OpenFile(fh)
# if we just created the file, we'll need to get its inode here
if not stat:
inode = os.fstat(fd).st_ino
key = (inode, pid)
self._descriptors[key] = open_file
open_file.refs += 1
return open_file.fh
def release_fh(self, path):
"""Release a filehandle, only closing it if there are no more references."""
try:
inode = os.stat(path).st_ino
except OSError as e:
if e.errno != errno.ENOENT: # only handle file not found
raise
inode = None # this will not be in self._descriptors
key = (inode, os.getpid())
open_file = self._descriptors.get(key)
assert open_file, "Attempted to close non-existing lock path: %s" % path
open_file.refs -= 1
if not open_file.refs:
del self._descriptors[key]
open_file.fh.close()
#: Open file descriptors for locks in this process. Used to prevent one process
#: from opening the same file many times for different byte range locks
file_tracker = OpenFileTracker()
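A small sketch of the intended use of the tracker above; this mirrors what Lock does internally rather than being a public API, and the lockfile path is hypothetical:
from llnl.util.lock import file_tracker
# Two byte-range locks on the same lockfile share one open handle per
# (inode, pid); the file is only closed when the last reference goes away.
fh1 = file_tracker.get_fh('/tmp/example.lock')
fh2 = file_tracker.get_fh('/tmp/example.lock')
print(fh1 is fh2)  # True
file_tracker.release_fh('/tmp/example.lock')
file_tracker.release_fh('/tmp/example.lock')  # handle actually closed here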
def _attempts_str(wait_time, nattempts):
# Don't print anything if we succeeded on the first try
if nattempts <= 1:
@@ -46,7 +177,8 @@ class Lock(object):
Note that this is for managing contention over resources *between*
processes and not for managing contention between threads in a process: the
functions of this object are not thread-safe. A process also must not
maintain multiple locks on the same file.
maintain multiple locks on the same file (or, more specifically, on
overlapping byte ranges in the same file).
"""
def __init__(self, path, start=0, length=0, default_timeout=None,
@@ -151,25 +283,10 @@ def _lock(self, op, timeout=None):
# Create file and parent directories if they don't exist.
if self._file is None:
parent = self._ensure_parent_directory()
self._ensure_parent_directory()
self._file = file_tracker.get_fh(self.path)
# Open writable files as 'r+' so we can upgrade to write later
os_mode, fd_mode = (os.O_RDWR | os.O_CREAT), 'r+'
if os.path.exists(self.path):
if not os.access(self.path, os.W_OK):
if op == fcntl.LOCK_SH:
# can still lock read-only files if we open 'r'
os_mode, fd_mode = os.O_RDONLY, 'r'
else:
raise LockROFileError(self.path)
elif not os.access(parent, os.W_OK):
raise CantCreateLockError(self.path)
fd = os.open(self.path, os_mode)
self._file = os.fdopen(fd, fd_mode)
elif op == fcntl.LOCK_EX and self._file.mode == 'r':
if op == fcntl.LOCK_EX and self._file.mode == 'r':
# Attempt to upgrade to write lock w/a read-only file.
# If the file were writable, we'd have opened it 'r+'
raise LockROFileError(self.path)
@@ -282,7 +399,8 @@ def _unlock(self):
"""
fcntl.lockf(self._file, fcntl.LOCK_UN,
self._length, self._start, os.SEEK_SET)
self._file.close()
file_tracker.release_fh(self.path)
self._file = None
self._reads = 0
self._writes = 0
@@ -401,7 +519,7 @@ def release_read(self, release_fn=None):
"""Releases a read lock.
Arguments:
release_fn (callable): function to call *before* the last recursive
release_fn (typing.Callable): function to call *before* the last recursive
lock (read or write) is released.
If the last recursive lock will be released, then this will call
@@ -437,7 +555,7 @@ def release_write(self, release_fn=None):
"""Releases a write lock.
Arguments:
release_fn (callable): function to call before the last recursive
release_fn (typing.Callable): function to call before the last recursive
write is released.
If the last recursive *write* lock will be released, then this
@@ -533,10 +651,10 @@ class LockTransaction(object):
Arguments:
lock (Lock): underlying lock for this transaction to be acquired on
enter and released on exit
acquire (callable or contextmanager): function to be called after lock
is acquired, or contextmanager to enter after acquire and leave
acquire (typing.Callable or contextlib.contextmanager): function to be called
after lock is acquired, or contextmanager to enter after acquire and leave
before release.
release (callable): function to be called before release. If
release (typing.Callable): function to be called before release. If
``acquire`` is a contextmanager, this will be called *after*
exiting the nested context and before the lock is released.
timeout (float): number of seconds to set for the timeout when


@@ -5,6 +5,7 @@
from __future__ import unicode_literals
import contextlib
import fcntl
import os
import struct
@@ -28,6 +29,7 @@
_msg_enabled = True
_warn_enabled = True
_error_enabled = True
_output_filter = lambda s: s
indent = " "
@@ -90,6 +92,18 @@ def error_enabled():
return _error_enabled
@contextlib.contextmanager
def output_filter(filter_fn):
"""Context manager that applies a filter to all output."""
global _output_filter
saved_filter = _output_filter
try:
_output_filter = filter_fn
yield
finally:
_output_filter = saved_filter
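A usage sketch for the new hook; the scrubbing function here is hypothetical, any str -> str callable works:
import llnl.util.tty as tty
def hide_token(line):
    # Hypothetical filter that redacts a secret before it reaches the terminal.
    return line.replace('s3kr1t', '[redacted]')
with tty.output_filter(hide_token):
    tty.msg('fetching https://user:s3kr1t@example.com/archive.tar.gz')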
class SuppressOutput:
"""Class for disabling output in a scope using 'with' keyword"""
@@ -166,13 +180,23 @@ def msg(message, *args, **kwargs):
if _stacktrace:
st_text = process_stacktrace(2)
if newline:
cprint("@*b{%s==>} %s%s" % (
st_text, get_timestamp(), cescape(message)))
cprint(
"@*b{%s==>} %s%s" % (
st_text,
get_timestamp(),
cescape(_output_filter(message))
)
)
else:
cwrite("@*b{%s==>} %s%s" % (
st_text, get_timestamp(), cescape(message)))
cwrite(
"@*b{%s==>} %s%s" % (
st_text,
get_timestamp(),
cescape(_output_filter(message))
)
)
for arg in args:
print(indent + six.text_type(arg))
print(indent + _output_filter(six.text_type(arg)))
def info(message, *args, **kwargs):
@@ -188,18 +212,29 @@ def info(message, *args, **kwargs):
st_text = ""
if _stacktrace:
st_text = process_stacktrace(st_countback)
cprint("@%s{%s==>} %s%s" % (
format, st_text, get_timestamp(), cescape(six.text_type(message))
), stream=stream)
cprint(
"@%s{%s==>} %s%s" % (
format,
st_text,
get_timestamp(),
cescape(_output_filter(six.text_type(message)))
),
stream=stream
)
for arg in args:
if wrap:
lines = textwrap.wrap(
six.text_type(arg), initial_indent=indent,
subsequent_indent=indent, break_long_words=break_long_words)
_output_filter(six.text_type(arg)),
initial_indent=indent,
subsequent_indent=indent,
break_long_words=break_long_words
)
for line in lines:
stream.write(line + '\n')
else:
stream.write(indent + six.text_type(arg) + '\n')
stream.write(
indent + _output_filter(six.text_type(arg)) + '\n'
)
def verbose(message, *args, **kwargs):


@@ -109,19 +109,17 @@ def colify(elts, **options):
using ``str()``.
Keyword Arguments:
output (stream): A file object to write to. Default is ``sys.stdout``
indent (int): Optionally indent all columns by some number of spaces
padding (int): Spaces between columns. Default is 2
width (int): Width of the output. Default is 80 if tty not detected
cols (int): Force number of columns. Default is to size to
terminal, or single-column if no tty
tty (bool): Whether to attempt to write to a tty. Default is to
autodetect a tty. Set to False to force single-column
output
method (str): Method to use to fit columns. Options are variable or
uniform. Variable-width columns are tighter, uniform
columns are all the same width and fit less data on
the screen
output (typing.IO): A file object to write to. Default is ``sys.stdout``
indent (int): Optionally indent all columns by some number of spaces
padding (int): Spaces between columns. Default is 2
width (int): Width of the output. Default is 80 if tty not detected
cols (int): Force number of columns. Default is to size to terminal, or
single-column if no tty
tty (bool): Whether to attempt to write to a tty. Default is to autodetect a
tty. Set to False to force single-column output
method (str): Method to use to fit columns. Options are variable or uniform.
Variable-width columns are tighter, uniform columns are all the same width
and fit less data on the screen
"""
# Get keyword arguments or set defaults
cols = options.pop("cols", 0)


@@ -33,7 +33,7 @@
# Use this to strip escape sequences
_escape = re.compile(r'\x1b[^m]*m|\x1b\[?1034h')
_escape = re.compile(r'\x1b[^m]*m|\x1b\[?1034h|\x1b\][0-9]+;[^\x07]*\x07')
# control characters for enabling/disabling echo
#
@@ -323,7 +323,7 @@ def unwrap(self):
if sys.version_info < (3,):
self.file = open(self.file_like, 'w')
else:
self.file = open(self.file_like, 'w', encoding='utf-8')
self.file = open(self.file_like, 'w', encoding='utf-8') # novm
else:
self.file = StringIO()
return self.file
@@ -436,7 +436,7 @@ class log_output(object):
"""
def __init__(self, file_like=None, echo=False, debug=0, buffer=False,
env=None):
env=None, filter_fn=None):
"""Create a new output log context manager.
Args:
@@ -446,6 +446,8 @@ def __init__(self, file_like=None, echo=False, debug=0, buffer=False,
debug (int): positive to enable tty debug mode during logging
buffer (bool): pass buffer=True to skip unbuffering output; note
this doesn't set up any *new* buffering
filter_fn (callable, optional): Callable[str] -> str to filter each
line of output
log_output can take either a file object or a filename. If a
filename is passed, the file will be opened and closed entirely
@@ -465,6 +467,7 @@ def __init__(self, file_like=None, echo=False, debug=0, buffer=False,
self.debug = debug
self.buffer = buffer
self.env = env # the environment to use for _writer_daemon
self.filter_fn = filter_fn
self._active = False # used to prevent re-entry
@@ -530,20 +533,22 @@ def __enter__(self):
# Sets a daemon that writes to file what it reads from a pipe
try:
# need to pass this b/c multiprocessing closes stdin in child.
input_multiprocess_fd = None
try:
input_multiprocess_fd = MultiProcessFd(
os.dup(sys.stdin.fileno())
)
if sys.stdin.isatty():
input_multiprocess_fd = MultiProcessFd(
os.dup(sys.stdin.fileno())
)
except BaseException:
# just don't forward input if this fails
input_multiprocess_fd = None
pass
with replace_environment(self.env):
self.process = multiprocessing.Process(
target=_writer_daemon,
args=(
input_multiprocess_fd, read_multiprocess_fd, write_fd,
self.echo, self.log_file, child_pipe
self.echo, self.log_file, child_pipe, self.filter_fn
)
)
self.process.daemon = True # must set before start()
@@ -667,7 +672,7 @@ def force_echo(self):
def _writer_daemon(stdin_multiprocess_fd, read_multiprocess_fd, write_fd, echo,
log_file_wrapper, control_pipe):
log_file_wrapper, control_pipe, filter_fn):
"""Daemon used by ``log_output`` to write to a log file and to ``stdout``.
The daemon receives output from the parent process and writes it both
@@ -712,6 +717,7 @@ def _writer_daemon(stdin_multiprocess_fd, read_multiprocess_fd, write_fd, echo,
log_file_wrapper (FileWrapper): file to log all output
control_pipe (Pipe): multiprocessing pipe on which to send control
information to the parent
filter_fn (callable, optional): function to filter each line of output
"""
# If this process was forked, then it will inherit file descriptors from
@@ -784,7 +790,10 @@ def _writer_daemon(stdin_multiprocess_fd, read_multiprocess_fd, write_fd, echo,
# Echo to stdout if requested or forced.
if echo or force_echo:
sys.stdout.write(clean_line)
output_line = clean_line
if filter_fn:
output_line = filter_fn(clean_line)
sys.stdout.write(output_line)
# Stripped output to log file.
log_file.write(_strip(clean_line))
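The same filtering, seen from the log_output side; a sketch assuming a caller-chosen log file name:
from llnl.util.tty.log import log_output
def shorten(line):
    # Hypothetical per-line filter the writer daemon applies before echoing.
    return line.replace('/very/long/stage/prefix/', '')
with log_output('build.out', echo=True, filter_fn=shorten):
    print('compiling /very/long/stage/prefix/foo.c')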


@@ -3,9 +3,8 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
#: major, minor, patch version for Spack, in a tuple
spack_version_info = (0, 16, 1)
spack_version_info = (0, 16, 3)
#: String containing Spack version joined with .'s
spack_version = '.'.join(str(v) for v in spack_version_info)


@@ -10,6 +10,8 @@
import os
import llnl.util.tty as tty
from spack.util.environment import EnvironmentModifications
from .analyzer_base import AnalyzerBase
@@ -43,6 +45,7 @@ def _read_environment_file(self, filename):
to remove path prefixes specific to user systems.
"""
if not os.path.exists(filename):
tty.warn("No environment file available")
return
mods = EnvironmentModifications.from_sourcing_file(filename)


@@ -1,609 +0,0 @@
# Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""
This module contains all the elements that are required to create an
architecture object. These include, the target processor, the operating system,
and the architecture platform (i.e. cray, darwin, linux, etc) classes.
On a multiple architecture machine, the architecture spec field can be set to
build a package against any target and operating system that is present on the
platform. On Cray platforms or any other architecture that has different front
and back end environments, the operating system will determine the method of
compiler
detection.
There are two different types of compiler detection:
1. Through the $PATH env variable (front-end detection)
2. Through the tcl module system. (back-end detection)
Depending on which operating system is specified, the compiler will be detected
using one of those methods.
For platforms such as linux and darwin, the operating system is autodetected
and the target is set to be x86_64.
The command line syntax for specifying an architecture is as follows:
target=<Target name> os=<OperatingSystem name>
If the user wishes to use the defaults, either target or os can be left out of
the command line and Spack will concretize using the default. These defaults
are set in the 'platforms/' directory which contains the different subclasses
for platforms. If the machine has multiple architectures, the user can
also enter front-end, or fe or back-end or be. These settings will concretize
to their respective front-end and back-end targets and operating systems.
Additional platforms can be added by creating a subclass of Platform
and adding it inside the platform directory.
Platforms are an abstract class that are extended by subclasses. If the user
wants to add a new type of platform (such as cray_xe), they can create a
subclass and set all the class attributes such as priority, front_target,
back_target, front_os, back_os. Platforms also contain a priority class
attribute. A lower number signifies higher priority. These numbers are
arbitrarily set and can be changed though often there isn't much need unless a
new platform is added and the user wants that to be detected first.
Targets are created inside the platform subclasses. Most architecture
(like linux, and darwin) will have only one target (x86_64) but in the case of
Cray machines, there is both a frontend and backend processor. The user can
specify which targets are present on front-end and back-end architecture
Depending on the platform, operating systems are either auto-detected or are
set. The user can set the front-end and back-end operating setting by the class
attributes front_os and back_os. The operating system as described earlier,
will be responsible for compiler detection.
"""
import contextlib
import functools
import warnings
import six
import archspec.cpu
import llnl.util.lang as lang
import llnl.util.tty as tty
import spack.compiler
import spack.compilers
import spack.config
import spack.error as serr
import spack.paths
import spack.util.classes
import spack.util.executable
import spack.version
from spack.util.spack_yaml import syaml_dict
class NoPlatformError(serr.SpackError):
def __init__(self):
super(NoPlatformError, self).__init__(
"Could not determine a platform for this machine.")
def _ensure_other_is_target(method):
"""Decorator to be used in dunder methods taking a single argument to
ensure that the argument is an instance of ``Target`` too.
"""
@functools.wraps(method)
def _impl(self, other):
if isinstance(other, six.string_types):
other = Target(other)
if not isinstance(other, Target):
return NotImplemented
return method(self, other)
return _impl
class Target(object):
def __init__(self, name, module_name=None):
"""Target models microarchitectures and their compatibility.
Args:
name (str or Microarchitecture):micro-architecture of the
target
module_name (str): optional module name to get access to the
current target. This is typically used on machines
like Cray (e.g. craype-compiler)
"""
if not isinstance(name, archspec.cpu.Microarchitecture):
name = archspec.cpu.TARGETS.get(
name, archspec.cpu.generic_microarchitecture(name)
)
self.microarchitecture = name
self.module_name = module_name
@property
def name(self):
return self.microarchitecture.name
@_ensure_other_is_target
def __eq__(self, other):
return self.microarchitecture == other.microarchitecture and \
self.module_name == other.module_name
def __ne__(self, other):
# This method is necessary as long as we support Python 2. In Python 3
# __ne__ defaults to the implementation below
return not self == other
@_ensure_other_is_target
def __lt__(self, other):
# TODO: In the future it would be convenient to say
# TODO: `spec.architecture.target < other.architecture.target`
# TODO: and change the semantic of the comparison operators
# This is needed to sort deterministically specs in a list.
# It doesn't implement a total ordering semantic.
return self.microarchitecture.name < other.microarchitecture.name
def __hash__(self):
return hash((self.name, self.module_name))
@staticmethod
def from_dict_or_value(dict_or_value):
# A string here represents a generic target (like x86_64 or ppc64) or
# a custom micro-architecture
if isinstance(dict_or_value, six.string_types):
return Target(dict_or_value)
# TODO: From a dict we actually retrieve much more information than
# TODO: just the name. We can use that information to reconstruct an
# TODO: "old" micro-architecture or check the current definition.
target_info = dict_or_value
return Target(target_info['name'])
def to_dict_or_value(self):
"""Returns a dict or a value representing the current target.
String values are used to keep backward compatibility with generic
targets, like e.g. x86_64 or ppc64. More specific micro-architectures
will return a dictionary which contains information on the name,
features, vendor, generation and parents of the current target.
"""
# Generic targets represent either an architecture
# family (like x86_64) or a custom micro-architecture
if self.microarchitecture.vendor == 'generic':
return str(self)
return syaml_dict(
self.microarchitecture.to_dict(return_list_of_items=True)
)
def __repr__(self):
cls_name = self.__class__.__name__
fmt = cls_name + '({0}, {1})'
return fmt.format(repr(self.microarchitecture),
repr(self.module_name))
def __str__(self):
return str(self.microarchitecture)
def __contains__(self, cpu_flag):
return cpu_flag in self.microarchitecture
def optimization_flags(self, compiler):
"""Returns the flags needed to optimize for this target using
the compiler passed as argument.
Args:
compiler (CompilerSpec or Compiler): object that contains both the
name and the version of the compiler we want to use
"""
# Mixed toolchains are not supported yet
import spack.compilers
if isinstance(compiler, spack.compiler.Compiler):
if spack.compilers.is_mixed_toolchain(compiler):
msg = ('microarchitecture specific optimizations are not '
'supported yet on mixed compiler toolchains [check'
' {0.name}@{0.version} for further details]')
warnings.warn(msg.format(compiler))
return ''
# Try to check if the current compiler comes with a version number or
# has an unexpected suffix. If so, treat it as a compiler with a
# custom spec.
compiler_version = compiler.version
version_number, suffix = archspec.cpu.version_components(
compiler.version
)
if not version_number or suffix not in ('', 'apple'):
# Try to deduce the underlying version of the compiler, regardless
# of its name in compilers.yaml. Depending on where this function
# is called we might get either a CompilerSpec or a fully fledged
# compiler object.
import spack.spec
if isinstance(compiler, spack.spec.CompilerSpec):
compiler = spack.compilers.compilers_for_spec(compiler).pop()
try:
compiler_version = compiler.real_version
except spack.util.executable.ProcessError as e:
# log this and just return compiler.version instead
tty.debug(str(e))
return self.microarchitecture.optimization_flags(
compiler.name, str(compiler_version)
)
@lang.lazy_lexicographic_ordering
class Platform(object):
""" Abstract class that each type of Platform will subclass.
Will return an instance of it once it is returned.
"""
# Subclass sets number. Controls detection order
priority = None # type: int
#: binary formats used on this platform; used by relocation logic
binary_formats = ['elf']
front_end = None # type: str
back_end = None # type: str
default = None # type: str # The default back end target.
front_os = None # type: str
back_os = None # type: str
default_os = None # type: str
reserved_targets = ['default_target', 'frontend', 'fe', 'backend', 'be']
reserved_oss = ['default_os', 'frontend', 'fe', 'backend', 'be']
def __init__(self, name):
self.targets = {}
self.operating_sys = {}
self.name = name
def add_target(self, name, target):
"""Used by the platform specific subclass to list available targets.
Raises an error if the platform specifies a name
that is reserved by spack as an alias.
"""
if name in Platform.reserved_targets:
raise ValueError(
"%s is a spack reserved alias "
"and cannot be the name of a target" % name)
self.targets[name] = target
def target(self, name):
"""This is a getter method for the target dictionary
that handles defaulting based on the values provided by default,
front-end, and back-end. This can be overwritten
by a subclass for which we want to provide further aliasing options.
"""
# TODO: Check if we can avoid using strings here
name = str(name)
if name == 'default_target':
name = self.default
elif name == 'frontend' or name == 'fe':
name = self.front_end
elif name == 'backend' or name == 'be':
name = self.back_end
return self.targets.get(name, None)
def add_operating_system(self, name, os_class):
""" Add the operating_system class object into the
platform.operating_sys dictionary
"""
if name in Platform.reserved_oss:
raise ValueError(
"%s is a spack reserved alias "
"and cannot be the name of an OS" % name)
self.operating_sys[name] = os_class
def operating_system(self, name):
if name == 'default_os':
name = self.default_os
if name == 'frontend' or name == "fe":
name = self.front_os
if name == 'backend' or name == 'be':
name = self.back_os
return self.operating_sys.get(name, None)
@classmethod
def setup_platform_environment(cls, pkg, env):
""" Subclass can override this method if it requires any
platform-specific build environment modifications.
"""
@classmethod
def detect(cls):
""" Subclass is responsible for implementing this method.
Returns True if the Platform class detects that
it is the current platform
and False if it's not.
"""
raise NotImplementedError()
def __repr__(self):
return self.__str__()
def __str__(self):
return self.name
def _cmp_iter(self):
yield self.name
yield self.default
yield self.front_end
yield self.back_end
yield self.default_os
yield self.front_os
yield self.back_os
def targets():
for t in sorted(self.targets.values()):
yield t._cmp_iter
yield targets
def oses():
for o in sorted(self.operating_sys.values()):
yield o._cmp_iter
yield oses
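# A hedged illustration (not part of the original file) of how a Platform
# subclass is typically wired up; the class name and target names below are
# hypothetical.
#
#   class MyCluster(Platform):
#       priority = 50
#
#       def __init__(self):
#           super(MyCluster, self).__init__('mycluster')
#           self.add_target('haswell', Target('haswell'))
#           self.add_target('skylake', Target('skylake'))
#           self.front_end = 'haswell'
#           self.back_end = 'skylake'
#           self.default = 'skylake'
#
#   # 'backend'/'be' and 'default_target' resolve through target():
#   MyCluster().target('be')   # -> the 'skylake' Target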
@lang.lazy_lexicographic_ordering
class OperatingSystem(object):
""" Operating System will be like a class similar to platform extended
by subclasses for the specifics. Operating System will contain the
compiler finding logic. Instead of calling two separate methods to
find compilers we call find_compilers method for each operating system
"""
def __init__(self, name, version):
self.name = name.replace('-', '_')
self.version = str(version).replace('-', '_')
def __str__(self):
return "%s%s" % (self.name, self.version)
def __repr__(self):
return self.__str__()
def _cmp_iter(self):
yield self.name
yield self.version
def to_dict(self):
return syaml_dict([
('name', self.name),
('version', self.version)
])
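# Illustrative examples (not part of the original file) of the normalization
# done above:
#
#   str(OperatingSystem('ubuntu', '20.04'))   # -> 'ubuntu20.04'
#   OperatingSystem('rhel-like', '8').name    # -> 'rhel_like' (dashes become underscores)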
@lang.lazy_lexicographic_ordering
class Arch(object):
"""Architecture is now a class to help with setting attributes.
TODO: refactor so that we don't need this class.
"""
def __init__(self, plat=None, os=None, target=None):
self.platform = plat
if plat and os:
os = self.platform.operating_system(os)
self.os = os
if plat and target:
target = self.platform.target(target)
self.target = target
# Hooks for parser to use when platform is set after target or os
self.target_string = None
self.os_string = None
@property
def concrete(self):
return all((self.platform is not None,
isinstance(self.platform, Platform),
self.os is not None,
isinstance(self.os, OperatingSystem),
self.target is not None, isinstance(self.target, Target)))
def __str__(self):
if self.platform or self.os or self.target:
if self.platform.name == 'darwin':
os_name = self.os.name if self.os else "None"
else:
os_name = str(self.os)
return (str(self.platform) + "-" +
os_name + "-" + str(self.target))
else:
return ''
def __contains__(self, string):
return string in str(self)
# TODO: make this unnecessary: don't include an empty arch on *every* spec.
def __nonzero__(self):
return (self.platform is not None or
self.os is not None or
self.target is not None)
__bool__ = __nonzero__
def _cmp_iter(self):
if isinstance(self.platform, Platform):
yield self.platform.name
else:
yield self.platform
if isinstance(self.os, OperatingSystem):
yield self.os.name
else:
yield self.os
if isinstance(self.target, Target):
yield self.target.microarchitecture
else:
yield self.target
def to_dict(self):
str_or_none = lambda v: str(v) if v else None
d = syaml_dict([
('platform', str_or_none(self.platform)),
('platform_os', str_or_none(self.os)),
('target', self.target.to_dict_or_value())])
return syaml_dict([('arch', d)])
def to_spec(self):
"""Convert this Arch to an anonymous Spec with architecture defined."""
spec = spack.spec.Spec()
spec.architecture = spack.spec.ArchSpec(str(self))
return spec
@staticmethod
def from_dict(d):
spec = spack.spec.ArchSpec.from_dict(d)
return arch_for_spec(spec)
@lang.memoized
def get_platform(platform_name):
"""Returns a platform object that corresponds to the given name."""
platform_list = all_platforms()
for p in platform_list:
if platform_name.replace("_", "").lower() == p.__name__.lower():
return p()
def verify_platform(platform_name):
""" Determines whether or not the platform with the given name is supported
in Spack. For more information, see the 'spack.platforms' submodule.
"""
platform_name = platform_name.replace("_", "").lower()
platform_names = [p.__name__.lower() for p in all_platforms()]
if platform_name not in platform_names:
tty.die("%s is not a supported platform; supported platforms are %s" %
(platform_name, platform_names))
def arch_for_spec(arch_spec):
"""Transforms the given architecture spec into an architecture object."""
arch_spec = spack.spec.ArchSpec(arch_spec)
assert arch_spec.concrete
arch_plat = get_platform(arch_spec.platform)
if not (arch_plat.operating_system(arch_spec.os) and
arch_plat.target(arch_spec.target)):
raise ValueError(
"Can't recreate arch for spec %s on current arch %s; "
"spec architecture is too different" % (arch_spec, sys_type()))
return Arch(arch_plat, arch_spec.os, arch_spec.target)
@lang.memoized
def _all_platforms():
mod_path = spack.paths.platform_path
return spack.util.classes.list_classes("spack.platforms", mod_path)
@lang.memoized
def _platform():
"""Detects the platform for this machine.
Gather a list of all available subclasses of platforms.
Sorts the list according to their priority looking. Priority is
an arbitrarily set number. Detects platform either using uname or
a file path (/opt/cray...)
"""
# Try to create a Platform object using the config file FIRST
platform_list = _all_platforms()
platform_list.sort(key=lambda a: a.priority)
for platform_cls in platform_list:
if platform_cls.detect():
return platform_cls()
#: The "real" platform of the host running Spack. This should not be changed
#: by any method and is here as a convenient way to refer to the host platform.
real_platform = _platform
#: The current platform used by Spack. May be swapped by the use_platform
#: context manager.
platform = _platform
#: The list of all platform classes. May be swapped by the use_platform
#: context manager.
all_platforms = _all_platforms
@lang.memoized
def default_arch():
"""Default ``Arch`` object for this machine.
See ``sys_type()``.
"""
return Arch(platform(), 'default_os', 'default_target')
def sys_type():
"""Print out the "default" platform-os-target tuple for this machine.
On machines with only one target OS/target, prints out the
platform-os-target for the frontend. For machines with a frontend
and a backend, prints the default backend.
TODO: replace with use of more explicit methods to get *all* the
backends, as client code should really be aware of cross-compiled
architectures.
"""
return str(default_arch())
@lang.memoized
def compatible_sys_types():
"""Returns a list of all the systypes compatible with the current host."""
compatible_archs = []
current_host = archspec.cpu.host()
compatible_targets = [current_host] + current_host.ancestors
for target in compatible_targets:
arch = Arch(platform(), 'default_os', target)
compatible_archs.append(str(arch))
return compatible_archs
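# A hedged example (not part of the original file) of the returned list on an
# x86_64 Linux host; the exact OS and target names depend on the machine:
#
#   ['linux-ubuntu20.04-skylake', ..., 'linux-ubuntu20.04-x86_64']
#
# i.e. the host's own target first, then each of its compatible ancestors,
# down to the generic x86_64 family.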
class _PickleableCallable(object):
"""Class used to pickle a callable that may substitute either
_platform or _all_platforms. Lambda or nested functions are
not pickleable.
"""
def __init__(self, return_value):
self.return_value = return_value
def __call__(self):
return self.return_value
@contextlib.contextmanager
def use_platform(new_platform):
global platform, all_platforms
msg = '"{0}" must be an instance of Platform'
assert isinstance(new_platform, Platform), msg.format(new_platform)
original_platform_fn, original_all_platforms_fn = platform, all_platforms
platform = _PickleableCallable(new_platform)
all_platforms = _PickleableCallable([type(new_platform)])
# Clear configuration and compiler caches
spack.config.config.clear_caches()
spack.compilers._cache_config_files = []
yield new_platform
platform, all_platforms = original_platform_fn, original_all_platforms_fn
# Clear configuration and compiler caches
spack.config.config.clear_caches()
spack.compilers._cache_config_files = []
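# A hedged usage sketch (not part of the original file); `fake_platform`
# stands for any Platform instance, e.g. one used in tests.
#
#   with use_platform(fake_platform):
#       ...  # platform() and all_platforms() now return the substitute
#   # on exit the original callables are restored and the caches cleared again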


@@ -37,12 +37,16 @@ def _search_duplicate_compilers(error_cls):
"""
import collections
import itertools
import re
from six.moves.urllib.request import urlopen
try:
from collections.abc import Sequence # novm
except ImportError:
from collections import Sequence
#: Map an audit tag to a list of callables implementing checks
CALLBACKS = {}
@@ -261,6 +265,45 @@ def _search_duplicate_specs_in_externals(error_cls):
kwargs=('pkgs',)
)
#: Sanity checks on linting
# This can take some time, so it's run separately from packages
package_https_directives = AuditClass(
group='packages-https',
tag='PKG-HTTPS-DIRECTIVES',
description='Sanity checks on https checks of package urls, etc.',
kwargs=('pkgs',)
)
@package_https_directives
def _linting_package_file(pkgs, error_cls):
"""Check for correctness of links
"""
import llnl.util.lang
import spack.repo
import spack.spec
errors = []
for pkg_name in pkgs:
pkg = spack.repo.get(pkg_name)
# Does the homepage have http, and if so, does https work?
if pkg.homepage.startswith('http://'):
https = re.sub("http", "https", pkg.homepage, 1)
try:
response = urlopen(https)
except Exception as e:
msg = 'Error with attempting https for "{0}": '
errors.append(error_cls(msg.format(pkg.name), [str(e)]))
continue
if response.getcode() == 200:
msg = 'Package "{0}" uses http but has a valid https endpoint.'
errors.append(msg.format(pkg.name))
return llnl.util.lang.dedupe(errors)
@package_directives
def _unknown_variants_in_directives(pkgs, error_cls):

View File

@@ -4,15 +4,14 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import codecs
import glob
import hashlib
import json
import os
import re
import shutil
import sys
import tarfile
import tempfile
import traceback
from contextlib import closing
import ruamel.yaml as yaml
@@ -27,7 +26,10 @@
import spack.config as config
import spack.database as spack_db
import spack.fetch_strategy as fs
import spack.hash_types as ht
import spack.hooks.sbang
import spack.mirror
import spack.platforms
import spack.relocate as relocate
import spack.util.file_cache as file_cache
import spack.util.gpg
@@ -43,6 +45,25 @@
_build_cache_keys_relative_path = '_pgp'
class FetchCacheError(Exception):
"""Error thrown when fetching the cache failed, usually a composite error list."""
def __init__(self, errors):
if not isinstance(errors, list):
raise TypeError("Expected a list of errors")
self.errors = errors
if len(errors) > 1:
msg = " Error {0}: {1}: {2}"
self.message = "Multiple errors during fetching:\n"
self.message += "\n".join((
msg.format(i + 1, err.__class__.__name__, str(err))
for (i, err) in enumerate(errors)
))
else:
err = errors[0]
self.message = "{0}: {1}".format(err.__class__.__name__, str(err))
super(FetchCacheError, self).__init__(self.message)
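# A hedged sketch (not part of the original file) of how the composite error
# reads; the wrapped exceptions are illustrative.
#
#   try:
#       raise FetchCacheError([RuntimeError('index unreachable'),
#                              RuntimeError('hash mismatch')])
#   except FetchCacheError as e:
#       print(e.message)   # "Multiple errors during fetching:" followed by
#                          # one numbered line per wrapped error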
class BinaryCacheIndex(object):
"""
The BinaryCacheIndex tracks what specs are available on (usually remote)
@@ -204,7 +225,7 @@ def find_built_spec(self, spec):
The cache can be updated by calling ``update()`` on the cache.
Args:
spec (Spec): Concrete spec to find
spec (spack.spec.Spec): Concrete spec to find
Returns:
A list of objects containing the found specs and mirror url where
@@ -289,14 +310,22 @@ def update(self):
# Otherwise the concrete spec cache should not need to be updated at
# all.
fetch_errors = []
all_methods_failed = True
for cached_mirror_url in self._local_index_cache:
cache_entry = self._local_index_cache[cached_mirror_url]
cached_index_hash = cache_entry['index_hash']
cached_index_path = cache_entry['index_path']
if cached_mirror_url in configured_mirror_urls:
# May need to fetch the index and update the local caches
needs_regen = self._fetch_and_cache_index(
cached_mirror_url, expect_hash=cached_index_hash)
try:
needs_regen = self._fetch_and_cache_index(
cached_mirror_url, expect_hash=cached_index_hash)
all_methods_failed = False
except FetchCacheError as fetch_error:
needs_regen = False
fetch_errors.extend(fetch_error.errors)
# The need to regenerate implies a need to clear as well.
spec_cache_clear_needed |= needs_regen
spec_cache_regenerate_needed |= needs_regen
@@ -323,7 +352,12 @@ def update(self):
for mirror_url in configured_mirror_urls:
if mirror_url not in self._local_index_cache:
# Need to fetch the index and update the local caches
needs_regen = self._fetch_and_cache_index(mirror_url)
try:
needs_regen = self._fetch_and_cache_index(mirror_url)
all_methods_failed = False
except FetchCacheError as fetch_error:
fetch_errors.extend(fetch_error.errors)
needs_regen = False
# Generally speaking, a new mirror wouldn't imply the need to
# clear the spec cache, so leave it as is.
if needs_regen:
@@ -331,7 +365,9 @@ def update(self):
self._write_local_index_cache()
if spec_cache_regenerate_needed:
if all_methods_failed:
raise FetchCacheError(fetch_errors)
elif spec_cache_regenerate_needed:
self.regenerate_spec_cache(clear_existing=spec_cache_clear_needed)
def _fetch_and_cache_index(self, mirror_url, expect_hash=None):
@@ -350,6 +386,8 @@ def _fetch_and_cache_index(self, mirror_url, expect_hash=None):
True if this function thinks the concrete spec cache,
``_mirrors_for_spec``, should be regenerated. Returns False
otherwise.
Throws:
FetchCacheError: a composite exception.
"""
index_fetch_url = url_util.join(
mirror_url, _build_cache_relative_path, 'index.json')
@@ -359,14 +397,19 @@ def _fetch_and_cache_index(self, mirror_url, expect_hash=None):
old_cache_key = None
fetched_hash = None
errors = []
# Fetch the hash first so we can check if we actually need to fetch
# the index itself.
try:
_, _, fs = web_util.read_from_url(hash_fetch_url)
fetched_hash = codecs.getreader('utf-8')(fs).read()
except (URLError, web_util.SpackWebError) as url_err:
tty.debug('Unable to read index hash {0}'.format(
hash_fetch_url), url_err, 1)
errors.append(
RuntimeError("Unable to read index hash {0} due to {1}: {2}".format(
hash_fetch_url, url_err.__class__.__name__, str(url_err)
))
)
# The only case where we'll skip attempting to fetch the buildcache
# index from the mirror is when we already have a hash for this
@@ -393,24 +436,23 @@ def _fetch_and_cache_index(self, mirror_url, expect_hash=None):
_, _, fs = web_util.read_from_url(index_fetch_url)
index_object_str = codecs.getreader('utf-8')(fs).read()
except (URLError, web_util.SpackWebError) as url_err:
tty.debug('Unable to read index {0}'.format(index_fetch_url),
url_err, 1)
# We failed to fetch the index, even though we decided it was
# necessary. However, regenerating the spec cache won't produce
# anything different than what it has already, so return False.
return False
errors.append(
RuntimeError("Unable to read index {0} due to {1}: {2}".format(
index_fetch_url, url_err.__class__.__name__, str(url_err)
))
)
raise FetchCacheError(errors)
locally_computed_hash = compute_hash(index_object_str)
if fetched_hash is not None and locally_computed_hash != fetched_hash:
msg_tmpl = ('Computed hash ({0}) did not match remote ({1}), '
'indicating error in index transmission')
tty.error(msg_tmpl.format(locally_computed_hash, expect_hash))
msg = ('Computed hash ({0}) did not match remote ({1}), '
'indicating error in index transmission').format(
locally_computed_hash, expect_hash)
errors.append(RuntimeError(msg))
# We somehow got an index that doesn't match the remote one, maybe
# the next time we try we'll be successful. Regardless, we're not
# updating our index cache with this, so don't regenerate the spec
# cache either.
return False
# the next time we try we'll be successful.
raise FetchCacheError(errors)
url_hash = compute_hash(mirror_url)
@@ -566,6 +608,16 @@ def get_buildfile_manifest(spec):
# Used by make_package_relative to determine binaries to change.
for root, dirs, files in os.walk(spec.prefix, topdown=True):
dirs[:] = [d for d in dirs if d not in blacklist]
# Directories may need to be relocated too.
for directory in dirs:
dir_path_name = os.path.join(root, directory)
rel_path_name = os.path.relpath(dir_path_name, spec.prefix)
if os.path.islink(dir_path_name):
link = os.readlink(dir_path_name)
if os.path.isabs(link) and link.startswith(spack.store.layout.root):
data['link_to_relocate'].append(rel_path_name)
for filename in files:
path_name = os.path.join(root, filename)
m_type, m_subtype = relocate.mime_type(path_name)
@@ -581,7 +633,7 @@ def get_buildfile_manifest(spec):
added = True
if relocate.needs_binary_relocation(m_type, m_subtype):
if ((m_subtype in ('x-executable', 'x-sharedlib')
if ((m_subtype in ('x-executable', 'x-sharedlib', 'x-pie-executable')
and sys.platform != 'darwin') or
(m_subtype in ('x-mach-binary')
and sys.platform == 'darwin') or
@@ -613,9 +665,8 @@ def write_buildinfo_file(spec, workdir, rel=False):
prefix_to_hash[str(d.prefix)] = d.dag_hash()
# Create buildinfo data and write it to disk
import spack.hooks.sbang as sbang
buildinfo = {}
buildinfo['sbang_install_path'] = sbang.sbang_install_path()
buildinfo['sbang_install_path'] = spack.hooks.sbang.sbang_install_path()
buildinfo['relative_rpaths'] = rel
buildinfo['buildpath'] = spack.store.layout.root
buildinfo['spackprefix'] = spack.paths.prefix
@@ -706,20 +757,14 @@ def generate_package_index(cache_prefix):
"""Create the build cache index page.
Creates (or replaces) the "index.json" page at the location given in
cache_prefix. This page contains a link for each binary package (.yaml)
under cache_prefix.
cache_prefix. This page contains a link for each binary package (.yaml or
.json) under cache_prefix.
"""
tmpdir = tempfile.mkdtemp()
db_root_dir = os.path.join(tmpdir, 'db_root')
db = spack_db.Database(None, db_dir=db_root_dir,
enable_transaction_locking=False,
record_fields=['spec', 'ref_count', 'in_buildcache'])
try:
file_list = (
entry
for entry in web_util.list_url(cache_prefix)
if entry.endswith('.yaml'))
if entry.endswith('.yaml') or entry.endswith('spec.json'))
except KeyError as inst:
msg = 'No packages at {0}: {1}'.format(cache_prefix, inst)
tty.warn(msg)
@@ -733,24 +778,97 @@ def generate_package_index(cache_prefix):
tty.warn(msg)
return
tty.debug('Retrieving spec.yaml files from {0} to build index'.format(
tty.debug('Retrieving spec descriptor files from {0} to build index'.format(
cache_prefix))
all_mirror_specs = {}
for file_path in file_list:
try:
yaml_url = url_util.join(cache_prefix, file_path)
tty.debug('fetching {0}'.format(yaml_url))
_, _, yaml_file = web_util.read_from_url(yaml_url)
yaml_contents = codecs.getreader('utf-8')(yaml_file).read()
# yaml_obj = syaml.load(yaml_contents)
# s = Spec.from_yaml(yaml_obj)
s = Spec.from_yaml(yaml_contents)
db.add(s, None)
db.mark(s, 'in_buildcache', True)
spec_url = url_util.join(cache_prefix, file_path)
tty.debug('fetching {0}'.format(spec_url))
_, _, spec_file = web_util.read_from_url(spec_url)
spec_file_contents = codecs.getreader('utf-8')(spec_file).read()
# Need full spec.json name or this gets confused with index.json.
if spec_url.endswith('.json'):
spec_dict = sjson.load(spec_file_contents)
s = Spec.from_json(spec_file_contents)
elif spec_url.endswith('.yaml'):
spec_dict = syaml.load(spec_file_contents)
s = Spec.from_yaml(spec_file_contents)
all_mirror_specs[s.dag_hash()] = {
'spec_url': spec_url,
'spec': s,
'num_deps': len(list(s.traverse(root=False))),
'binary_cache_checksum': spec_dict['binary_cache_checksum'],
'buildinfo': spec_dict['buildinfo'],
}
except (URLError, web_util.SpackWebError) as url_err:
tty.error('Error reading spec.yaml: {0}'.format(file_path))
tty.error('Error reading specfile: {0}'.format(file_path))
tty.error(url_err)
sorted_specs = sorted(all_mirror_specs.keys(),
key=lambda k: all_mirror_specs[k]['num_deps'])
tmpdir = tempfile.mkdtemp()
db_root_dir = os.path.join(tmpdir, 'db_root')
db = spack_db.Database(None, db_dir=db_root_dir,
enable_transaction_locking=False,
record_fields=['spec', 'ref_count', 'in_buildcache'])
try:
tty.debug('Specs sorted by number of dependencies:')
for dag_hash in sorted_specs:
spec_record = all_mirror_specs[dag_hash]
s = spec_record['spec']
num_deps = spec_record['num_deps']
tty.debug(' {0}/{1} -> {2}'.format(
s.name, dag_hash[:7], num_deps))
if num_deps > 0:
# Check each of this spec's dependencies (which we have already
# processed), as they are the source of truth for their own
# full hash. If the full hash we have for any deps does not
# match what those deps have themselves, then we need to splice
# this spec with those deps, and push this spliced spec
# (spec.json file) back to the mirror, as well as update the
# all_mirror_specs dictionary with this spliced spec.
to_splice = []
for dep in s.dependencies():
dep_dag_hash = dep.dag_hash()
if dep_dag_hash in all_mirror_specs:
true_dep = all_mirror_specs[dep_dag_hash]['spec']
if true_dep.full_hash() != dep.full_hash():
to_splice.append(true_dep)
if to_splice:
tty.debug(' needs the following deps spliced:')
for true_dep in to_splice:
tty.debug(' {0}/{1}'.format(
true_dep.name, true_dep.dag_hash()[:7]))
s = s.splice(true_dep, True)
# Push this spliced spec back to the mirror
spliced_spec_dict = s.to_dict(hash=ht.full_hash)
for key in ['binary_cache_checksum', 'buildinfo']:
spliced_spec_dict[key] = spec_record[key]
temp_json_path = os.path.join(tmpdir, 'spliced.spec.json')
with open(temp_json_path, 'w') as fd:
fd.write(sjson.dump(spliced_spec_dict))
spliced_spec_url = spec_record['spec_url']
web_util.push_to_url(
temp_json_path, spliced_spec_url, keep_original=False)
tty.debug(' spliced and wrote {0}'.format(
spliced_spec_url))
spec_record['spec'] = s
db.add(s, None)
db.mark(s, 'in_buildcache', True)
# Now that we have fixed any old specfiles that might have had the wrong
# full hash for their dependencies, we can generate the index, compute
# the hash, and push those files to the mirror.
index_json_path = os.path.join(db_root_dir, 'index.json')
with open(index_json_path, 'w') as f:
db._write_to_file(f)
@@ -782,6 +900,7 @@ def generate_package_index(cache_prefix):
msg = 'Encountered problem pushing package index to {0}: {1}'.format(
cache_prefix, err)
tty.warn(msg)
tty.debug('\n' + traceback.format_exc())
finally:
shutil.rmtree(tmpdir)
@@ -883,19 +1002,27 @@ def build_tarball(spec, outdir, force=False, rel=False, unsigned=False,
# need to copy the spec file so the build cache can be downloaded
# without concretizing with the current spack packages
# and preferences
spec_file = os.path.join(spec.prefix, ".spack", "spec.yaml")
specfile_name = tarball_name(spec, '.spec.yaml')
specfile_path = os.path.realpath(
os.path.join(cache_prefix, specfile_name))
spec_file = spack.store.layout.spec_file_path(spec)
specfile_name = tarball_name(spec, '.spec.json')
specfile_path = os.path.realpath(os.path.join(cache_prefix, specfile_name))
deprecated_specfile_path = specfile_path.replace('.spec.json', '.spec.yaml')
remote_specfile_path = url_util.join(
outdir, os.path.relpath(specfile_path, os.path.realpath(tmpdir)))
remote_specfile_path_deprecated = url_util.join(
outdir, os.path.relpath(deprecated_specfile_path,
os.path.realpath(tmpdir)))
if web_util.url_exists(remote_specfile_path):
if force:
# If force and exists, overwrite. Otherwise raise exception on collision.
if force:
if web_util.url_exists(remote_specfile_path):
web_util.remove_url(remote_specfile_path)
else:
raise NoOverwriteException(url_util.format(remote_specfile_path))
if web_util.url_exists(remote_specfile_path_deprecated):
web_util.remove_url(remote_specfile_path_deprecated)
elif (web_util.url_exists(remote_specfile_path) or
web_util.url_exists(remote_specfile_path_deprecated)):
raise NoOverwriteException(url_util.format(remote_specfile_path))
# make a copy of the install directory to work with
workdir = os.path.join(tmpdir, os.path.basename(spec.prefix))
@@ -943,15 +1070,23 @@ def build_tarball(spec, outdir, force=False, rel=False, unsigned=False,
# get the sha256 checksum of the tarball
checksum = checksum_tarball(tarfile_path)
# add sha256 checksum to spec.yaml
# add sha256 checksum to spec.json
with open(spec_file, 'r') as inputfile:
content = inputfile.read()
spec_dict = yaml.load(content)
if spec_file.endswith('.yaml'):
spec_dict = yaml.load(content)
elif spec_file.endswith('.json'):
spec_dict = sjson.load(content)
else:
raise ValueError(
'{0} not a valid spec file type (json or yaml)'.format(
spec_file))
bchecksum = {}
bchecksum['hash_algorithm'] = 'sha256'
bchecksum['hash'] = checksum
spec_dict['binary_cache_checksum'] = bchecksum
# Add original install prefix relative to layout root to spec.yaml.
# Add original install prefix relative to layout root to spec.json.
# This will be used to determine if the directory layout has changed.
buildinfo = {}
buildinfo['relative_prefix'] = os.path.relpath(
@@ -960,7 +1095,7 @@ def build_tarball(spec, outdir, force=False, rel=False, unsigned=False,
spec_dict['buildinfo'] = buildinfo
with open(specfile_path, 'w') as outfile:
outfile.write(syaml.dump(spec_dict))
outfile.write(sjson.dump(spec_dict))
# sign the tarball and spec file with gpg
if not unsigned:
@@ -1014,14 +1149,14 @@ def download_tarball(spec, preferred_mirrors=None):
path to downloaded tarball if successful, None otherwise.
Args:
spec (Spec): Concrete spec
spec (spack.spec.Spec): Concrete spec
preferred_mirrors (list): If provided, this is a list of preferred
mirror urls. Other configured mirrors will only be used if the
tarball can't be retrieved from one of these.
mirror urls. Other configured mirrors will only be used if the
tarball can't be retrieved from one of these.
Returns:
Path to the downloaded tarball, or ``None`` if the tarball could not
be downloaded from any configured mirrors.
be downloaded from any configured mirrors.
"""
if not spack.mirror.MirrorCollection():
tty.die("Please add a spack mirror to allow " +
@@ -1070,7 +1205,7 @@ def make_package_relative(workdir, spec, allow_root):
orig_path_names.append(os.path.join(prefix, filename))
cur_path_names.append(os.path.join(workdir, filename))
platform = spack.architecture.get_platform(spec.platform)
platform = spack.platforms.by_name(spec.platform)
if 'macho' in platform.binary_formats:
relocate.make_macho_binaries_relative(
cur_path_names, orig_path_names, old_layout_root)
@@ -1104,8 +1239,6 @@ def relocate_package(spec, allow_root):
"""
Relocate the given package
"""
import spack.hooks.sbang as sbang
workdir = str(spec.prefix)
buildinfo = read_buildinfo_file(workdir)
new_layout_root = str(spack.store.layout.root)
@@ -1144,7 +1277,8 @@ def relocate_package(spec, allow_root):
prefix_to_prefix_bin = OrderedDict({})
if old_sbang_install_path:
prefix_to_prefix_text[old_sbang_install_path] = sbang.sbang_install_path()
install_path = spack.hooks.sbang.sbang_install_path()
prefix_to_prefix_text[old_sbang_install_path] = install_path
prefix_to_prefix_text[old_prefix] = new_prefix
prefix_to_prefix_bin[old_prefix] = new_prefix
@@ -1158,7 +1292,7 @@ def relocate_package(spec, allow_root):
# now a POSIX script that lives in the install prefix. Old packages
# will have the old sbang location in their shebangs.
orig_sbang = '#!/bin/bash {0}/bin/sbang'.format(old_spack_prefix)
new_sbang = sbang.sbang_shebang_line()
new_sbang = spack.hooks.sbang.sbang_shebang_line()
prefix_to_prefix_text[orig_sbang] = new_sbang
tty.debug("Relocating package from",
@@ -1182,7 +1316,7 @@ def is_backup_file(file):
]
# If the buildcache was not created with relativized rpaths
# do the relocation of path in binaries
platform = spack.architecture.get_platform(spec.platform)
platform = spack.platforms.by_name(spec.platform)
if 'macho' in platform.binary_formats:
relocate.relocate_macho_binaries(files_to_relocate,
old_layout_root,
@@ -1241,15 +1375,26 @@ def extract_tarball(spec, filename, allow_root=False, unsigned=False,
spackfile_path = os.path.join(stagepath, spackfile_name)
tarfile_name = tarball_name(spec, '.tar.gz')
tarfile_path = os.path.join(tmpdir, tarfile_name)
specfile_name = tarball_name(spec, '.spec.yaml')
specfile_path = os.path.join(tmpdir, specfile_name)
specfile_is_json = True
deprecated_yaml_name = tarball_name(spec, '.spec.yaml')
deprecated_yaml_path = os.path.join(tmpdir, deprecated_yaml_name)
json_name = tarball_name(spec, '.spec.json')
json_path = os.path.join(tmpdir, json_name)
with closing(tarfile.open(spackfile_path, 'r')) as tar:
tar.extractall(tmpdir)
# some buildcache tarfiles use bzip2 compression
if not os.path.exists(tarfile_path):
tarfile_name = tarball_name(spec, '.tar.bz2')
tarfile_path = os.path.join(tmpdir, tarfile_name)
if os.path.exists(json_path):
specfile_path = json_path
elif os.path.exists(deprecated_yaml_path):
specfile_is_json = False
specfile_path = deprecated_yaml_path
else:
raise ValueError('Cannot find spec file for {0}.'.format(tmpdir))
if not unsigned:
if os.path.exists('%s.asc' % specfile_path):
try:
@@ -1272,7 +1417,10 @@ def extract_tarball(spec, filename, allow_root=False, unsigned=False,
spec_dict = {}
with open(specfile_path, 'r') as inputfile:
content = inputfile.read()
spec_dict = syaml.load(content)
if specfile_is_json:
spec_dict = sjson.load(content)
else:
spec_dict = syaml.load(content)
bchecksum = spec_dict['binary_cache_checksum']
# if the checksums don't match don't install
@@ -1288,42 +1436,30 @@ def extract_tarball(spec, filename, allow_root=False, unsigned=False,
buildinfo = spec_dict.get('buildinfo', {})
old_relative_prefix = buildinfo.get('relative_prefix', new_relative_prefix)
rel = buildinfo.get('relative_rpaths')
# if the original relative prefix and new relative prefix differ the
# directory layout has changed and the buildcache cannot be installed
# if it was created with relative rpaths
info = 'old relative prefix %s\nnew relative prefix %s\nrelative rpaths %s'
tty.debug(info %
(old_relative_prefix, new_relative_prefix, rel))
# if (old_relative_prefix != new_relative_prefix and (rel)):
# shutil.rmtree(tmpdir)
# msg = "Package tarball was created from an install "
# msg += "prefix with a different directory layout. "
# msg += "It cannot be relocated because it "
# msg += "uses relative rpaths."
# raise NewLayoutException(msg)
# extract the tarball in a temp directory
# Extract the tarball into the store root, presumably on the same filesystem.
# The directory created is the base directory name of the old prefix.
# Moving the old prefix name to the new prefix location should preserve
# hard links and symbolic links.
extract_tmp = os.path.join(spack.store.layout.root, '.tmp')
mkdirp(extract_tmp)
extracted_dir = os.path.join(extract_tmp,
old_relative_prefix.split(os.path.sep)[-1])
with closing(tarfile.open(tarfile_path, 'r')) as tar:
tar.extractall(path=tmpdir)
# get the parent directory of the file .spack/binary_distribution
# this should the directory unpacked from the tarball whose
# name is unknown because the prefix naming is unknown
bindist_file = glob.glob('%s/*/.spack/binary_distribution' % tmpdir)[0]
workdir = re.sub('/.spack/binary_distribution$', '', bindist_file)
tty.debug('workdir %s' % workdir)
# install_tree copies hardlinks
# create a temporary tarfile from prefix and exract it to workdir
# tarfile preserves hardlinks
temp_tarfile_name = tarball_name(spec, '.tar')
temp_tarfile_path = os.path.join(tmpdir, temp_tarfile_name)
with closing(tarfile.open(temp_tarfile_path, 'w')) as tar:
tar.add(name='%s' % workdir,
arcname='.')
with closing(tarfile.open(temp_tarfile_path, 'r')) as tar:
tar.extractall(spec.prefix)
os.remove(temp_tarfile_path)
# cleanup
try:
tar.extractall(path=extract_tmp)
except Exception as e:
shutil.rmtree(extracted_dir)
raise e
try:
shutil.move(extracted_dir, spec.prefix)
except Exception as e:
shutil.rmtree(extracted_dir)
raise e
os.remove(tarfile_path)
os.remove(specfile_path)
@@ -1349,27 +1485,39 @@ def try_direct_fetch(spec, full_hash_match=False, mirrors=None):
"""
Try to find the spec directly on the configured mirrors
"""
specfile_name = tarball_name(spec, '.spec.yaml')
deprecated_specfile_name = tarball_name(spec, '.spec.yaml')
specfile_name = tarball_name(spec, '.spec.json')
specfile_is_json = True
lenient = not full_hash_match
found_specs = []
spec_full_hash = spec.full_hash()
for mirror in spack.mirror.MirrorCollection(mirrors=mirrors).values():
buildcache_fetch_url = url_util.join(
buildcache_fetch_url_yaml = url_util.join(
mirror.fetch_url, _build_cache_relative_path, deprecated_specfile_name)
buildcache_fetch_url_json = url_util.join(
mirror.fetch_url, _build_cache_relative_path, specfile_name)
try:
_, _, fs = web_util.read_from_url(buildcache_fetch_url)
fetched_spec_yaml = codecs.getreader('utf-8')(fs).read()
_, _, fs = web_util.read_from_url(buildcache_fetch_url_json)
except (URLError, web_util.SpackWebError, HTTPError) as url_err:
tty.debug('Did not find {0} on {1}'.format(
specfile_name, buildcache_fetch_url), url_err)
continue
try:
_, _, fs = web_util.read_from_url(buildcache_fetch_url_yaml)
specfile_is_json = False
except (URLError, web_util.SpackWebError, HTTPError) as url_err_y:
tty.debug('Did not find {0} on {1}'.format(
specfile_name, buildcache_fetch_url_json), url_err)
tty.debug('Did not find {0} on {1}'.format(
specfile_name, buildcache_fetch_url_yaml), url_err_y)
continue
specfile_contents = codecs.getreader('utf-8')(fs).read()
# read the spec from the build cache file. All specs in build caches
# are concrete (as they are built) so we need to mark this spec
# concrete on read-in.
fetched_spec = Spec.from_yaml(fetched_spec_yaml)
if specfile_is_json:
fetched_spec = Spec.from_json(specfile_contents)
else:
fetched_spec = Spec.from_yaml(specfile_contents)
fetched_spec._mark_concrete()
# Do not recompute the full hash for the fetched spec, instead just
@@ -1390,14 +1538,14 @@ def get_mirrors_for_spec(spec=None, full_hash_match=False,
indicating the mirrors on which it can be found
Args:
spec (Spec): The spec to look for in binary mirrors
spec (spack.spec.Spec): The spec to look for in binary mirrors
full_hash_match (bool): If True, only includes mirrors where the spec
full hash matches the locally computed full hash of the ``spec``
argument. If False, any mirror which has a matching DAG hash
is included in the results.
mirrors_to_check (dict): Optionally override the configured mirrors
with the mirrors in this dictionary.
index_only (bool): Do not attempt direct fetching of ``spec.yaml``
index_only (bool): Do not attempt direct fetching of ``spec.json``
files from remote mirrors, only consider the indices.
Return:
@@ -1447,6 +1595,9 @@ def update_cache_and_get_specs():
possible, so this method will also attempt to initialize and update the
local index cache (essentially a no-op if it has been done already and
nothing has changed on the configured mirrors.)
Throws:
FetchCacheError
"""
binary_index.update()
return binary_index.get_all_built_specs()
@@ -1594,57 +1745,91 @@ def needs_rebuild(spec, mirror_url, rebuild_on_errors=False):
pkg_name, pkg_version, pkg_hash, pkg_full_hash))
tty.debug(spec.tree())
# Try to retrieve the .spec.yaml directly, based on the known
# Try to retrieve the specfile directly, based on the known
# format of the name, in order to determine if the package
# needs to be rebuilt.
cache_prefix = build_cache_prefix(mirror_url)
spec_yaml_file_name = tarball_name(spec, '.spec.yaml')
file_path = os.path.join(cache_prefix, spec_yaml_file_name)
specfile_is_json = True
specfile_name = tarball_name(spec, '.spec.json')
deprecated_specfile_name = tarball_name(spec, '.spec.yaml')
specfile_path = os.path.join(cache_prefix, specfile_name)
deprecated_specfile_path = os.path.join(cache_prefix,
deprecated_specfile_name)
result_of_error = 'Package ({0}) will {1}be rebuilt'.format(
spec.short_spec, '' if rebuild_on_errors else 'not ')
try:
_, _, yaml_file = web_util.read_from_url(file_path)
yaml_contents = codecs.getreader('utf-8')(yaml_file).read()
_, _, spec_file = web_util.read_from_url(specfile_path)
except (URLError, web_util.SpackWebError) as url_err:
err_msg = [
'Unable to determine whether {0} needs rebuilding,',
' caught exception attempting to read from {1}.',
]
tty.error(''.join(err_msg).format(spec.short_spec, file_path))
tty.debug(url_err)
try:
_, _, spec_file = web_util.read_from_url(deprecated_specfile_path)
specfile_is_json = False
except (URLError, web_util.SpackWebError) as url_err_y:
err_msg = [
'Unable to determine whether {0} needs rebuilding,',
' caught exception attempting to read from {1} or {2}.',
]
tty.error(''.join(err_msg).format(
spec.short_spec,
specfile_path,
deprecated_specfile_path))
tty.debug(url_err)
tty.debug(url_err_y)
tty.warn(result_of_error)
return rebuild_on_errors
spec_file_contents = codecs.getreader('utf-8')(spec_file).read()
if not spec_file_contents:
tty.error('Reading {0} returned nothing'.format(
specfile_path if specfile_is_json else deprecated_specfile_path))
tty.warn(result_of_error)
return rebuild_on_errors
if not yaml_contents:
tty.error('Reading {0} returned nothing'.format(file_path))
tty.warn(result_of_error)
return rebuild_on_errors
spec_dict = (sjson.load(spec_file_contents)
if specfile_is_json else syaml.load(spec_file_contents))
spec_yaml = syaml.load(yaml_contents)
yaml_spec = spec_yaml['spec']
try:
nodes = spec_dict['spec']['nodes']
except KeyError:
# Prior node dict format omitted 'nodes' key
nodes = spec_dict['spec']
name = spec.name
# The "spec" key in the yaml is a list of objects, each with a single
# In the old format:
# The "spec" key represents a list of objects, each with a single
# key that is the package name. While the list usually just contains
# a single object, we iterate over the list looking for the object
# with the name of this concrete spec as a key, out of an abundance
# of caution.
cached_pkg_specs = [item[name] for item in yaml_spec if name in item]
# In format version 2:
# ['spec']['nodes'] is still a list of objects, but with a
# multitude of keys. The list will commonly contain many objects, and in the
# case of build specs, it is highly likely that the same name will occur
# once as the actual package, and then again as the build provenance of that
# same package. Hence format version 2 matches on the dag hash, not name.
if nodes and 'name' not in nodes[0]:
# old style
cached_pkg_specs = [item[name] for item in nodes if name in item]
elif nodes and spec_dict['spec']['_meta']['version'] == 2:
cached_pkg_specs = [item for item in nodes
if item[ht.dag_hash.name] == spec.dag_hash()]
cached_target = cached_pkg_specs[0] if cached_pkg_specs else None
# If either the full_hash didn't exist in the .spec.yaml file, or it
# If either the full_hash didn't exist in the specfile, or it
# did, but didn't match the one we computed locally, then we should
# just rebuild. This can be simplified once the dag_hash and the
# full_hash become the same thing.
rebuild = False
if not cached_target or 'full_hash' not in cached_target:
reason = 'full_hash was missing from remote spec.yaml'
if not cached_target:
reason = 'did not find spec in specfile contents'
rebuild = True
elif ht.full_hash.name not in cached_target:
reason = 'full_hash was missing from remote specfile'
rebuild = True
else:
full_hash = cached_target['full_hash']
full_hash = cached_target[ht.full_hash.name]
if full_hash != pkg_full_hash:
reason = 'hash mismatch, remote = {0}, local = {1}'.format(
full_hash, pkg_full_hash)
@@ -1667,11 +1852,11 @@ def check_specs_against_mirrors(mirrors, specs, output_file=None,
Arguments:
mirrors (dict): Mirrors to check against
specs (iterable): Specs to check against mirrors
output_file (string): Path to output file to be written. If provided,
specs (typing.Iterable): Specs to check against mirrors
output_file (str): Path to output file to be written. If provided,
mirrors with missing or out-of-date specs will be formatted as a
JSON object and written to this file.
rebuild_on_errors (boolean): Treat any errors encountered while
rebuild_on_errors (bool): Treat any errors encountered while
checking specs as a signal to rebuild package.
Returns: 1 if any spec was out-of-date on any mirror, 0 otherwise.
@@ -1706,24 +1891,23 @@ def check_specs_against_mirrors(mirrors, specs, output_file=None,
def _download_buildcache_entry(mirror_root, descriptions):
for description in descriptions:
description_url = os.path.join(mirror_root, description['url'])
path = description['path']
fail_if_missing = description['required']
mkdirp(path)
stage = Stage(
description_url, name="build_cache", path=path, keep=True)
try:
stage.fetch()
except fs.FetchError as e:
tty.debug(e)
fail_if_missing = description['required']
for url in description['url']:
description_url = os.path.join(mirror_root, url)
stage = Stage(
description_url, name="build_cache", path=path, keep=True)
try:
stage.fetch()
break
except fs.FetchError as e:
tty.debug(e)
else:
if fail_if_missing:
tty.error('Failed to download required url {0}'.format(
description_url))
return False
return True


@@ -2,8 +2,14 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from __future__ import print_function
import contextlib
import fnmatch
import json
import os
import os.path
import re
import sys
try:
@@ -17,16 +23,305 @@
import llnl.util.filesystem as fs
import llnl.util.tty as tty
import spack.architecture
import spack.binary_distribution
import spack.config
import spack.detection
import spack.environment
import spack.main
import spack.modules
import spack.paths
import spack.platforms
import spack.repo
import spack.spec
import spack.store
import spack.user_environment as uenv
import spack.util.executable
import spack.util.path
from spack.util.environment import EnvironmentModifications
#: Map a bootstrapper type to the corresponding class
_bootstrap_methods = {}
def _bootstrapper(type):
"""Decorator to register classes implementing bootstrapping
methods.
Args:
type (str): string identifying the class
"""
def _register(cls):
_bootstrap_methods[type] = cls
return cls
return _register
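# A hedged sketch (not part of the original file) of how a bootstrapping
# method registers itself; the 'local-mirror' type and class are hypothetical.
#
#   @_bootstrapper(type='local-mirror')
#   class _LocalMirrorBootstrapper(object):
#       def __init__(self, conf):
#           self.conf = conf
#
#       def try_import(self, module, abstract_spec_str):
#           ...  # return True on success, False otherwise
#
# Entries in bootstrap.yaml then select a registered method by its 'type' key.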
def _try_import_from_store(module, abstract_spec_str):
"""Return True if the module can be imported from an already
installed spec, False otherwise.
Args:
module: Python module to be imported
abstract_spec_str: abstract spec that may provide the module
"""
bincache_platform = spack.platforms.real_host()
if str(bincache_platform) == 'cray':
bincache_platform = spack.platforms.linux.Linux()
with spack.platforms.use_platform(bincache_platform):
abstract_spec_str = str(spack.spec.Spec(abstract_spec_str))
# We have to run as part of this python interpreter
abstract_spec_str += ' ^' + spec_for_current_python()
installed_specs = spack.store.db.query(abstract_spec_str, installed=True)
for candidate_spec in installed_specs:
lib_spd = candidate_spec['python'].package.default_site_packages_dir
lib64_spd = lib_spd.replace('lib/', 'lib64/')
module_paths = [
os.path.join(candidate_spec.prefix, lib_spd),
os.path.join(candidate_spec.prefix, lib64_spd)
]
sys.path.extend(module_paths)
try:
_fix_ext_suffix(candidate_spec)
if _python_import(module):
msg = ('[BOOTSTRAP MODULE {0}] The installed spec "{1}/{2}" '
'provides the "{0}" Python module').format(
module, abstract_spec_str, candidate_spec.dag_hash()
)
tty.debug(msg)
return True
except Exception as e:
msg = ('unexpected error while trying to import module '
'"{0}" from spec "{1}" [error="{2}"]')
tty.warn(msg.format(module, candidate_spec, str(e)))
else:
msg = "Spec {0} did not provide module {1}"
tty.warn(msg.format(candidate_spec, module))
sys.path = sys.path[:-2]
return False
def _fix_ext_suffix(candidate_spec):
"""Fix the external suffixes of Python extensions on the fly for
platforms that may need it
Args:
candidate_spec (Spec): installed spec with a Python module
to be checked.
"""
# Here we map target families to the patterns expected
# by pristine CPython. Only architectures with known issues
# are included. Known issues:
#
# [RHEL + ppc64le]: https://github.com/spack/spack/issues/25734
#
_suffix_to_be_checked = {
'ppc64le': {
'glob': '*.cpython-*-powerpc64le-linux-gnu.so',
're': r'.cpython-[\w]*-powerpc64le-linux-gnu.so',
'fmt': r'{module}.cpython-{major}{minor}m-powerpc64le-linux-gnu.so'
}
}
# If the current architecture is not problematic return
generic_target = archspec.cpu.host().family
if str(generic_target) not in _suffix_to_be_checked:
return
# If there's no EXT_SUFFIX (Python < 3.5) or the suffix matches
# the expectations, return since the package is surely good
ext_suffix = sysconfig.get_config_var('EXT_SUFFIX')
if ext_suffix is None:
return
expected = _suffix_to_be_checked[str(generic_target)]
if fnmatch.fnmatch(ext_suffix, expected['glob']):
return
# If we are here it means the current interpreter expects different names
# than pristine CPython. So:
# 1. Find what we have installed
# 2. Create symbolic links for the other names, if they're not there already
# Check if standard names are installed and if we have to create
# link for this interpreter
standard_extensions = fs.find(candidate_spec.prefix, expected['glob'])
link_names = [re.sub(expected['re'], ext_suffix, s) for s in standard_extensions]
for file_name, link_name in zip(standard_extensions, link_names):
if os.path.exists(link_name):
continue
os.symlink(file_name, link_name)
# Check if this interpreter installed something and we have to create
# links for a standard CPython interpreter
non_standard_extensions = fs.find(candidate_spec.prefix, '*' + ext_suffix)
for abs_path in non_standard_extensions:
directory, filename = os.path.split(abs_path)
module = filename.split('.')[0]
link_name = os.path.join(directory, expected['fmt'].format(
module=module, major=sys.version_info[0], minor=sys.version_info[1])
)
if os.path.exists(link_name):
continue
os.symlink(abs_path, link_name)
@_bootstrapper(type='buildcache')
class _BuildcacheBootstrapper(object):
"""Install the software needed during bootstrapping from a buildcache."""
def __init__(self, conf):
self.name = conf['name']
self.url = conf['info']['url']
def try_import(self, module, abstract_spec_str):
if _try_import_from_store(module, abstract_spec_str):
return True
tty.info("Bootstrapping {0} from pre-built binaries".format(module))
# Try to install from an unsigned binary cache
abstract_spec = spack.spec.Spec(
abstract_spec_str + ' ^' + spec_for_current_python()
)
# On Cray we want to use Linux binaries if available from mirrors
bincache_platform = spack.platforms.real_host()
if str(bincache_platform) == 'cray':
bincache_platform = spack.platforms.Linux()
with spack.platforms.use_platform(bincache_platform):
abstract_spec = spack.spec.Spec(
abstract_spec_str + ' ^' + spec_for_current_python()
)
# Read information on verified clingo binaries
json_filename = '{0}.json'.format(module)
json_path = os.path.join(
spack.paths.share_path, 'bootstrap', self.name, json_filename
)
with open(json_path) as f:
data = json.load(f)
buildcache = spack.main.SpackCommand('buildcache')
# Ensure we see only the buildcache being used to bootstrap
mirror_scope = spack.config.InternalConfigScope(
'bootstrap_buildcache', {'mirrors:': {self.name: self.url}}
)
with spack.config.override(mirror_scope):
# This index is currently needed to get the compiler used to build some
# specs that we know by dag hash.
spack.binary_distribution.binary_index.regenerate_spec_cache()
index = spack.binary_distribution.update_cache_and_get_specs()
if not index:
raise RuntimeError("The binary index is empty")
for item in data['verified']:
candidate_spec = item['spec']
python_spec = item['python']
# Skip specs which are not compatible
if not abstract_spec.satisfies(candidate_spec):
continue
if python_spec not in abstract_spec:
continue
for pkg_name, pkg_hash, pkg_sha256 in item['binaries']:
msg = ('[BOOTSTRAP MODULE {0}] Try installing "{1}" from binary '
'cache at "{2}"')
tty.debug(msg.format(module, pkg_name, self.url))
index_spec = next(x for x in index if x.dag_hash() == pkg_hash)
# Reconstruct the compiler that we need to use for bootstrapping
compiler_entry = {
"modules": [],
"operating_system": str(index_spec.os),
"paths": {
"cc": "/dev/null",
"cxx": "/dev/null",
"f77": "/dev/null",
"fc": "/dev/null"
},
"spec": str(index_spec.compiler),
"target": str(index_spec.target.family)
}
with spack.platforms.use_platform(bincache_platform):
with spack.config.override(
'compilers', [{'compiler': compiler_entry}]
):
spec_str = '/' + pkg_hash
install_args = [
'install',
'--sha256', pkg_sha256,
'-a', '-u', '-o', '-f', spec_str
]
buildcache(*install_args, fail_on_error=False)
# TODO: undo installations that didn't complete?
if _try_import_from_store(module, abstract_spec_str):
return True
return False
@_bootstrapper(type='install')
class _SourceBootstrapper(object):
"""Install the software needed during bootstrapping from sources."""
def __init__(self, conf):
self.conf = conf
@staticmethod
def try_import(module, abstract_spec_str):
if _try_import_from_store(module, abstract_spec_str):
return True
tty.info("Bootstrapping {0} from sources".format(module))
# If we compile code from sources, detecting a few build tools
# might reduce compilation time by a fair amount
_add_externals_if_missing()
# Try to build and install from sources
with spack_python_interpreter():
# Add hint to use frontend operating system on Cray
if str(spack.platforms.host()) == 'cray':
abstract_spec_str += ' os=fe'
concrete_spec = spack.spec.Spec(
abstract_spec_str + ' ^' + spec_for_current_python()
)
if module == 'clingo':
# TODO: remove when the old concretizer is deprecated
concrete_spec._old_concretize(deprecation_warning=False)
else:
concrete_spec.concretize()
msg = "[BOOTSTRAP MODULE {0}] Try installing '{1}' from sources"
tty.debug(msg.format(module, abstract_spec_str))
# Install the spec that should make the module importable
concrete_spec.package.do_install(fail_fast=True)
return _try_import_from_store(module, abstract_spec_str=abstract_spec_str)
def _make_bootstrapper(conf):
"""Return a bootstrap object built according to the
configuration argument
"""
btype = conf['type']
return _bootstrap_methods[btype](conf)
def _source_is_trusted(conf):
trusted, name = spack.config.get('bootstrap:trusted'), conf['name']
if name not in trusted:
return False
return trusted[name]
def spec_for_current_python():
"""For bootstrapping purposes we are just interested in the Python
@@ -53,7 +348,7 @@ def spack_python_interpreter():
which Spack is currently running as the only Python external spec
available.
"""
python_prefix = os.path.dirname(os.path.dirname(sys.executable))
python_prefix = sys.exec_prefix
external_python = spec_for_current_python()
entry = {
@@ -67,69 +362,68 @@ def spack_python_interpreter():
yield
def make_module_available(module, spec=None, install=False):
"""Ensure module is importable"""
# If we already can import it, that's great
try:
__import__(module)
def ensure_module_importable_or_raise(module, abstract_spec=None):
"""Make the requested module available for import, or raise.
This function tries to import a Python module in the current interpreter
using, in order, the methods configured in bootstrap.yaml.
If none of the methods succeed, an exception is raised. The function exits
on first success.
Args:
module (str): module to be imported in the current interpreter
abstract_spec (str): abstract spec that might provide the module. If not
given it defaults to "module"
Raises:
ImportError: if the module couldn't be imported
"""
# If we can import it already, that's great
tty.debug("[BOOTSTRAP MODULE {0}] Try importing from Python".format(module))
if _python_import(module):
return
except ImportError:
pass
# If it's already installed, use it
# Search by spec
spec = spack.spec.Spec(spec or module)
abstract_spec = abstract_spec or module
source_configs = spack.config.get('bootstrap:sources', [])
# We have to run as part of this python
# We can constrain by a shortened version in place of a version range
# because this spec is only used for querying or as a placeholder to be
# replaced by an external that already has a concrete version. This syntax
# is not sufficient when concretizing without an external, as it will
# concretize to python@X.Y instead of python@X.Y.Z
python_requirement = '^' + spec_for_current_python()
spec.constrain(python_requirement)
installed_specs = spack.store.db.query(spec, installed=True)
errors = {}
for ispec in installed_specs:
# TODO: make sure run-environment is appropriate
module_path = os.path.join(ispec.prefix,
ispec['python'].package.site_packages_dir)
module_path_64 = module_path.replace('/lib/', '/lib64/')
for current_config in source_configs:
if not _source_is_trusted(current_config):
msg = ('[BOOTSTRAP MODULE {0}] Skipping source "{1}" since it is '
'not trusted').format(module, current_config['name'])
tty.debug(msg)
continue
b = _make_bootstrapper(current_config)
try:
sys.path.append(module_path)
sys.path.append(module_path_64)
__import__(module)
return
except ImportError:
tty.warn("Spec %s did not provide module %s" % (ispec, module))
sys.path = sys.path[:-2]
if b.try_import(module, abstract_spec):
return
except Exception as e:
msg = '[BOOTSTRAP MODULE {0}] Unexpected error "{1}"'
tty.debug(msg.format(module, str(e)))
errors[current_config['name']] = e
def _raise_error(module_name, module_spec):
error_msg = 'cannot import module "{0}"'.format(module_name)
if module_spec:
error_msg += ' from spec "{0}'.format(module_spec)
raise ImportError(error_msg)
# We couldn't import in any way, so raise an import error
msg = 'cannot bootstrap the "{0}" Python module'.format(module)
if abstract_spec:
msg += ' from spec "{0}"'.format(abstract_spec)
msg += ' due to the following failures:\n'
for method in errors:
err = errors[method]
msg += " '{0}' raised {1}: {2}\n".format(
method, err.__class__.__name__, str(err))
msg += ' Please run `spack -d spec zlib` for more verbose error messages'
raise ImportError(msg)
if not install:
_raise_error(module, spec)
with spack_python_interpreter():
# We will install for ourselves, using this python if needed
# Concretize the spec
spec.concretize()
spec.package.do_install()
module_path = os.path.join(spec.prefix,
spec['python'].package.site_packages_dir)
module_path_64 = module_path.replace('/lib/', '/lib64/')
def _python_import(module):
try:
sys.path.append(module_path)
sys.path.append(module_path_64)
__import__(module)
return
except ImportError:
sys.path = sys.path[:-2]
_raise_error(module, spec)
return False
return True
def get_executable(exe, spec=None, install=False):
@@ -137,13 +431,14 @@ def get_executable(exe, spec=None, install=False):
Args:
exe (str): needed executable name
spec (Spec or str): spec to search for exe in (default exe)
spec (spack.spec.Spec or str): spec to search for exe in (default exe)
install (bool): install spec if not available
When ``install`` is True, Spack will use the python used to run Spack as an
external. The ``install`` option should only be used with packages that
install quickly (when using external python) or are guaranteed by Spack
organization to be in a binary mirror (clingo)."""
organization to be in a binary mirror (clingo).
"""
# Search the system first
runner = spack.util.executable.which(exe)
if runner:
@@ -201,8 +496,12 @@ def _bootstrap_config_scopes():
config_scopes = [
spack.config.InternalConfigScope('_builtin', spack.config.config_defaults)
]
for name, path in spack.config.configuration_paths:
platform = spack.architecture.platform().name
configuration_paths = (
spack.config.configuration_defaults_path,
('bootstrap', _config_path())
)
for name, path in configuration_paths:
platform = spack.platforms.host().name
platform_scope = spack.config.ConfigScope(
'/'.join([name, platform]), os.path.join(path, platform)
)
@@ -214,17 +513,91 @@ def _bootstrap_config_scopes():
return config_scopes
def _add_compilers_if_missing():
arch = spack.spec.ArchSpec.frontend_arch()
if not spack.compilers.compilers_for_arch(arch):
new_compilers = spack.compilers.find_new_compilers()
if new_compilers:
spack.compilers.add_compilers_to_config(new_compilers, init_config=False)
def _add_externals_if_missing():
search_list = [
spack.repo.path.get('cmake'),
spack.repo.path.get('bison')
]
detected_packages = spack.detection.by_executable(search_list)
spack.detection.update_configuration(detected_packages, scope='bootstrap')
@contextlib.contextmanager
def ensure_bootstrap_configuration():
with spack.architecture.use_platform(spack.architecture.real_platform()):
with spack.repo.use_repositories(spack.paths.packages_path):
with spack.store.use_store(spack.paths.user_bootstrap_store):
# Default configuration scopes excluding command line
# and builtin but accounting for platform specific scopes
config_scopes = _bootstrap_config_scopes()
with spack.config.use_configuration(*config_scopes):
with spack_python_interpreter():
yield
bootstrap_store_path = store_path()
user_configuration = _read_and_sanitize_configuration()
with spack.environment.no_active_environment():
with spack.platforms.use_platform(spack.platforms.real_host()):
with spack.repo.use_repositories(spack.paths.packages_path):
with spack.store.use_store(bootstrap_store_path):
# Default configuration scopes excluding command line
# and builtin but accounting for platform specific scopes
config_scopes = _bootstrap_config_scopes()
with spack.config.use_configuration(*config_scopes):
# We may need to compile code from sources, so ensure we have
# compilers for the current platform before switching parts.
_add_compilers_if_missing()
spack.config.set('bootstrap', user_configuration['bootstrap'])
spack.config.set('config', user_configuration['config'])
with spack.modules.disable_modules():
with spack_python_interpreter():
yield
def _read_and_sanitize_configuration():
"""Read the user configuration that needs to be reused for bootstrapping
and remove the entries that should not be copied over.
"""
# Read the "config" section but pop the install tree (the entry will not be
# considered due to the use_store context manager, so it will be confusing
# to have it in the configuration).
config_yaml = spack.config.get('config')
config_yaml.pop('install_tree', None)
user_configuration = {
'bootstrap': spack.config.get('bootstrap'),
'config': config_yaml
}
return user_configuration
def store_path():
"""Path to the store used for bootstrapped software"""
enabled = spack.config.get('bootstrap:enable', True)
if not enabled:
msg = ('bootstrapping is currently disabled. '
'Use "spack bootstrap enable" to enable it')
raise RuntimeError(msg)
return _store_path()
def _root_path():
"""Root of all the bootstrap related folders"""
return spack.config.get(
'bootstrap:root', spack.paths.default_user_bootstrap_path
)
def _store_path():
bootstrap_root_path = _root_path()
return spack.util.path.canonicalize_path(
os.path.join(bootstrap_root_path, 'store')
)
def _config_path():
bootstrap_root_path = _root_path()
return spack.util.path.canonicalize_path(
os.path.join(bootstrap_root_path, 'config')
)
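For orientation, a sketch of how the three path helpers above compose; the default root comes from spack.paths and is assumed here, for illustration only, to resolve to ~/.spack/bootstrap:
# Illustrative only; actual values depend on spack.paths and user config.
# _root_path()   -> ~/.spack/bootstrap
# _store_path()  -> ~/.spack/bootstrap/store
# _config_path() -> ~/.spack/bootstrap/config
# Setting `bootstrap: root:` in any config scope relocates all three together.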
def clingo_root_spec():
@@ -233,19 +606,22 @@ def clingo_root_spec():
# Add a proper compiler hint to the root spec. We use GCC for
# everything but MacOS.
if str(spack.architecture.platform()) == 'darwin':
if str(spack.platforms.host()) == 'darwin':
spec_str += ' %apple-clang'
else:
spec_str += ' %gcc'
# Add hint to use frontend operating system on Cray
if str(spack.architecture.platform()) == 'cray':
spec_str += ' os=fe'
# Add the generic target
generic_target = archspec.cpu.host().family
spec_str += ' target={0}'.format(str(generic_target))
tty.debug('[BOOTSTRAP ROOT SPEC] clingo: {0}'.format(spec_str))
return spack.spec.Spec(spec_str)
return spec_str
def ensure_clingo_importable_or_raise():
"""Ensure that the clingo module is available for import."""
ensure_module_importable_or_raise(
module='clingo', abstract_spec=clingo_root_spec()
)
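A minimal sketch of the intended calling pattern, assuming the functions above live in a spack.bootstrap module (the file path is not shown in this hunk):
import spack.bootstrap

with spack.bootstrap.ensure_bootstrap_configuration():
    # Swap in the bootstrap store, config scopes, repo and platform, then make
    # sure clingo can be imported, installing it from a mirror or from sources
    # if necessary.
    spack.bootstrap.ensure_clingo_importable_or_raise()
    import clingo  # noqa: F401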


@@ -49,7 +49,6 @@
from llnl.util.tty.color import cescape, colorize
from llnl.util.tty.log import MultiProcessFd
import spack.architecture as arch
import spack.build_systems.cmake
import spack.build_systems.meson
import spack.config
@@ -57,10 +56,12 @@
import spack.main
import spack.package
import spack.paths
import spack.platforms
import spack.repo
import spack.schema.environment
import spack.store
import spack.subprocess_context
import spack.user_environment
import spack.util.path
from spack.error import NoHeadersError, NoLibrariesError
from spack.util.cpus import cpus_available
@@ -69,8 +70,8 @@
env_flag,
filter_system_paths,
get_path,
inspect_path,
is_system_path,
preserve_environment,
system_dirs,
validate,
)
@@ -146,6 +147,14 @@ def __call__(self, *args, **kwargs):
return super(MakeExecutable, self).__call__(*args, **kwargs)
def _on_cray():
host_platform = spack.platforms.host()
host_os = host_platform.operating_system('default_os')
on_cray = str(host_platform) == 'cray'
using_cnl = re.match(r'cnl\d+', str(host_os))
return on_cray, using_cnl
def clean_environment():
# Stuff in here sanitizes the build environment to eliminate
# anything the user has set that may interfere. We apply it immediately
@@ -169,6 +178,9 @@ def clean_environment():
env.unset('CMAKE_PREFIX_PATH')
# Affects GNU make, can e.g. indirectly inhibit enabling parallel build
env.unset('MAKEFLAGS')
# Avoid that libraries of build dependencies get hijacked.
env.unset('LD_PRELOAD')
env.unset('DYLD_INSERT_LIBRARIES')
@@ -177,9 +189,7 @@ def clean_environment():
# interference with Spack dependencies.
# CNL requires these variables to be set (or at least some of them,
# depending on the CNL version).
hostarch = arch.Arch(arch.platform(), 'default_os', 'default_target')
on_cray = str(hostarch.platform) == 'cray'
using_cnl = re.match(r'cnl\d+', str(hostarch.os))
on_cray, using_cnl = _on_cray()
if on_cray and not using_cnl:
env.unset('CRAY_LD_LIBRARY_PATH')
for varname in os.environ.keys():
@@ -222,7 +232,7 @@ def clean_environment():
if '/macports/' in p:
env.remove_path('PATH', p)
env.apply_modifications()
return env
def set_compiler_environment_variables(pkg, env):
@@ -455,11 +465,11 @@ def determine_number_of_jobs(
cap to the number of CPUs available to avoid oversubscription.
Parameters:
parallel (bool): true when package supports parallel builds
command_line (int/None): command line override
config_default (int/None): config default number of jobs
max_cpus (int/None): maximum number of CPUs available. When None, this
value is automatically determined.
parallel (bool or None): true when package supports parallel builds
command_line (int or None): command line override
config_default (int or None): config default number of jobs
max_cpus (int or None): maximum number of CPUs available. When None, this
value is automatically determined.
"""
if not parallel:
return 1
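Hedged examples of the precedence the docstring above describes (values are illustrative):
determine_number_of_jobs(parallel=False)                                 # -> 1
determine_number_of_jobs(parallel=True, command_line=4, max_cpus=16)     # -> 4
determine_number_of_jobs(parallel=True, config_default=64, max_cpus=8)   # -> 8, capped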
@@ -685,14 +695,14 @@ def get_std_cmake_args(pkg):
"""List of standard arguments used if a package is a CMakePackage.
Returns:
list of str: standard arguments that would be used if this
list: standard arguments that would be used if this
package were a CMakePackage instance.
Args:
pkg (PackageBase): package under consideration
pkg (spack.package.PackageBase): package under consideration
Returns:
list of str: arguments for cmake
list: arguments for cmake
"""
return spack.build_systems.cmake.CMakePackage._std_args(pkg)
@@ -701,14 +711,14 @@ def get_std_meson_args(pkg):
"""List of standard arguments used if a package is a MesonPackage.
Returns:
list of str: standard arguments that would be used if this
list: standard arguments that would be used if this
package were a MesonPackage instance.
Args:
pkg (PackageBase): package under consideration
pkg (spack.package.PackageBase): package under consideration
Returns:
list of str: arguments for meson
list: arguments for meson
"""
return spack.build_systems.meson.MesonPackage._std_args(pkg)
@@ -738,7 +748,7 @@ def load_external_modules(pkg):
associated with them.
Args:
pkg (PackageBase): package to load deps for
pkg (spack.package.PackageBase): package to load deps for
"""
for dep in list(pkg.spec.traverse()):
external_modules = dep.external_modules or []
@@ -755,72 +765,77 @@ def setup_package(pkg, dirty, context='build'):
set_module_variables_for_package(pkg)
env = EnvironmentModifications()
if not dirty:
clean_environment()
# Keep track of env changes from packages separately, since we want to
# issue warnings when packages make "suspicious" modifications.
env_base = EnvironmentModifications() if dirty else clean_environment()
env_mods = EnvironmentModifications()
# setup compilers for build contexts
need_compiler = context == 'build' or (context == 'test' and
pkg.test_requires_compiler)
if need_compiler:
set_compiler_environment_variables(pkg, env)
set_wrapper_variables(pkg, env)
set_compiler_environment_variables(pkg, env_mods)
set_wrapper_variables(pkg, env_mods)
env.extend(modifications_from_dependencies(
env_mods.extend(modifications_from_dependencies(
pkg.spec, context, custom_mods_only=False))
# architecture specific setup
pkg.architecture.platform.setup_platform_environment(pkg, env)
platform = spack.platforms.by_name(pkg.spec.architecture.platform)
target = platform.target(pkg.spec.architecture.target)
platform.setup_platform_environment(pkg, env_mods)
if context == 'build':
pkg.setup_build_environment(env)
pkg.setup_build_environment(env_mods)
if (not dirty) and (not env.is_unset('CPATH')):
if (not dirty) and (not env_mods.is_unset('CPATH')):
tty.debug("A dependency has updated CPATH, this may lead pkg-"
"config to assume that the package is part of the system"
" includes and omit it when invoked with '--cflags'.")
elif context == 'test':
pkg.setup_run_environment(env)
env.prepend_path('PATH', '.')
env_mods.extend(
inspect_path(
pkg.spec.prefix,
spack.user_environment.prefix_inspections(pkg.spec.platform),
exclude=is_system_path
)
)
pkg.setup_run_environment(env_mods)
env_mods.prepend_path('PATH', '.')
# Loading modules, in particular if they are meant to be used outside
# of Spack, can change environment variables that are relevant to the
# build of packages. To avoid a polluted environment, preserve the
# value of a few, selected, environment variables
# With the current ordering of environment modifications, this is strictly
# unnecessary. Modules affecting these variables will be overwritten anyway
with preserve_environment('CC', 'CXX', 'FC', 'F77'):
# All module loads that otherwise would belong in previous
# functions have to occur after the env object has its
# modifications applied. Otherwise the environment modifications
# could undo module changes, such as unsetting LD_LIBRARY_PATH
# after a module changes it.
if need_compiler:
for mod in pkg.compiler.modules:
# Fixes issue https://github.com/spack/spack/issues/3153
if os.environ.get("CRAY_CPU_TARGET") == "mic-knl":
load_module("cce")
load_module(mod)
# First apply the clean environment changes
env_base.apply_modifications()
# kludge to handle cray libsci being automatically loaded by PrgEnv
# modules on cray platform. Module unload does no damage when
# unnecessary
# Load modules on an already clean environment, just before applying Spack's
# own environment modifications. This ensures Spack controls CC/CXX/... variables.
if need_compiler:
for mod in pkg.compiler.modules:
load_module(mod)
# kludge to handle cray libsci being automatically loaded by PrgEnv
# modules on cray platform. Module unload does no damage when
# unnecessary
on_cray, _ = _on_cray()
if on_cray:
module('unload', 'cray-libsci')
if pkg.architecture.target.module_name:
load_module(pkg.architecture.target.module_name)
if target.module_name:
load_module(target.module_name)
load_external_modules(pkg)
load_external_modules(pkg)
implicit_rpaths = pkg.compiler.implicit_rpaths()
if implicit_rpaths:
env.set('SPACK_COMPILER_IMPLICIT_RPATHS',
':'.join(implicit_rpaths))
env_mods.set('SPACK_COMPILER_IMPLICIT_RPATHS',
':'.join(implicit_rpaths))
# Make sure nothing's strange about the Spack environment.
validate(env, tty.warn)
env.apply_modifications()
validate(env_mods, tty.warn)
env_mods.apply_modifications()
# Return all env modifications we controlled (excluding module related ones)
env_base.extend(env_mods)
return env_base
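A minimal sketch of the EnvironmentModifications bookkeeping used above, assuming the class from spack.util.environment (values illustrative):
from spack.util.environment import EnvironmentModifications

env_base = EnvironmentModifications()   # e.g. what clean_environment() returns
env_mods = EnvironmentModifications()   # compiler/wrapper/package changes
env_mods.prepend_path('PATH', '.')
env_mods.set('SPACK_COMPILER_IMPLICIT_RPATHS', '/opt/lib64')

env_base.apply_modifications()          # apply the clean environment first
env_mods.apply_modifications()          # then Spack's own modifications win
env_base.extend(env_mods)               # single record handed back to the caller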
def _make_runnable(pkg, env):
@@ -864,7 +879,7 @@ def modifications_from_dependencies(spec, context, custom_mods_only=True):
CMAKE_PREFIX_PATH, or PKG_CONFIG_PATH).
Args:
spec (Spec): spec for which we want the modifications
spec (spack.spec.Spec): spec for which we want the modifications
context (str): either 'build' for build-time modifications or 'run'
for run-time modifications
"""
@@ -1008,8 +1023,8 @@ def _setup_pkg_and_run(serialized_pkg, function, kwargs, child_pipe,
if not kwargs.get('fake', False):
kwargs['unmodified_env'] = os.environ.copy()
setup_package(pkg, dirty=kwargs.get('dirty', False),
context=context)
kwargs['env_modifications'] = setup_package(
pkg, dirty=kwargs.get('dirty', False), context=context)
return_value = function(pkg, kwargs)
child_pipe.send(return_value)
@@ -1062,9 +1077,9 @@ def start_build_process(pkg, function, kwargs):
Args:
pkg (PackageBase): package whose environment we should set up the
pkg (spack.package.PackageBase): package whose environment we should set up the
child process for.
function (callable): argless function to run in the child
function (typing.Callable): argless function to run in the child
process.
Usage::
@@ -1149,7 +1164,7 @@ def get_package_context(traceback, context=3):
"""Return some context for an error message when the build fails.
Args:
traceback (traceback): A traceback from some exception raised during
traceback: A traceback from some exception raised during
install
context (int): Lines of context to show before and after the line


@@ -3,7 +3,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import inspect
import itertools
import os
import os.path
import stat
@@ -14,6 +13,8 @@
import llnl.util.tty as tty
from llnl.util.filesystem import force_remove, working_dir
from spack.build_environment import InstallError
from spack.directives import depends_on
from spack.package import PackageBase, run_after, run_before
from spack.util.executable import Executable
@@ -30,7 +31,7 @@ class AutotoolsPackage(PackageBase):
They all have sensible defaults and for many packages the only thing
necessary will be to override the helper method
:py:meth:`~.AutotoolsPackage.configure_args`.
:meth:`~spack.build_systems.autotools.AutotoolsPackage.configure_args`.
For a finer tuning you may also override:
+-----------------------------------------------+--------------------+
@@ -54,9 +55,24 @@ class AutotoolsPackage(PackageBase):
#: This attribute is used in UI queries that need to know the build
#: system base class
build_system_class = 'AutotoolsPackage'
#: Whether or not to update ``config.guess`` and ``config.sub`` on old
#: architectures
patch_config_files = True
@property
def patch_config_files(self):
"""
Whether or not to update old ``config.guess`` and ``config.sub`` files
distributed with the tarball. This currently only applies to
``ppc64le:``, ``aarch64:``, and ``riscv64`` target architectures. The
substitutes are taken from the ``gnuconfig`` package, which is
automatically added as a build dependency for these architectures. In
case system versions of these config files are required, the
``gnuconfig`` package can be marked external with a prefix pointing to
the directory containing the system ``config.guess`` and ``config.sub``
files.
"""
return (self.spec.satisfies('target=ppc64le:')
or self.spec.satisfies('target=aarch64:')
or self.spec.satisfies('target=riscv64:'))
#: Whether or not to update ``libtool``
#: (currently only for Arm/Clang/Fujitsu compilers)
patch_libtool = True
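A hedged sketch of how a package could opt out of the new behavior, following the overridable property above (class name illustrative; assumes the usual `from spack import *` package preamble):
class Libfoo(AutotoolsPackage):
    # keep the config.guess/config.sub shipped in the tarball
    patch_config_files = False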
@@ -83,6 +99,10 @@ class AutotoolsPackage(PackageBase):
#: after the installation. If True instead it installs them.
install_libtool_archives = False
depends_on('gnuconfig', type='build', when='target=ppc64le:')
depends_on('gnuconfig', type='build', when='target=aarch64:')
depends_on('gnuconfig', type='build', when='target=riscv64:')
@property
def _removed_la_files_log(self):
"""File containing the list of remove libtool archives"""
@@ -104,12 +124,10 @@ def _do_patch_config_files(self):
"""Some packages ship with older config.guess/config.sub files and
need to have these updated when installed on a newer architecture.
In particular, config.guess fails for PPC64LE for versions prior
to a 2013-06-10 build date (automake 1.13.4) and for ARM (aarch64).
to a 2013-06-10 build date (automake 1.13.4) and for ARM (aarch64) and
RISC-V (riscv64).
"""
if not self.patch_config_files or (
not self.spec.satisfies('target=ppc64le:') and
not self.spec.satisfies('target=aarch64:')
):
if not self.patch_config_files:
return
# TODO: Expand this to select the 'config.sub'-compatible architecture
@@ -119,6 +137,8 @@ def _do_patch_config_files(self):
config_arch = 'ppc64le'
elif self.spec.satisfies('target=aarch64:'):
config_arch = 'aarch64'
elif self.spec.satisfies('target=riscv64:'):
config_arch = 'riscv64'
else:
config_arch = 'local'
@@ -138,39 +158,69 @@ def runs_ok(script_abs_path):
return True
# Compute the list of files that needs to be patched
search_dir = self.stage.path
to_be_patched = fs.find(
search_dir, files=['config.sub', 'config.guess'], recursive=True
)
# Get the list of files that need to be patched
to_be_patched = fs.find(self.stage.path, files=['config.sub', 'config.guess'])
to_be_patched = [f for f in to_be_patched if not runs_ok(f)]
# If there are no files to be patched, return early
if not to_be_patched:
return
# Directories where to search for files to be copied
# over the failing ones
good_file_dirs = ['/usr/share']
if 'automake' in self.spec:
good_file_dirs.insert(0, self.spec['automake'].prefix)
# Otherwise, require `gnuconfig` to be a build dependency
self._require_build_deps(
pkgs=['gnuconfig'],
spec=self.spec,
err="Cannot patch config files")
# List of files to be found in the directories above
# Get the config files we need to patch (config.sub / config.guess).
to_be_found = list(set(os.path.basename(f) for f in to_be_patched))
gnuconfig = self.spec['gnuconfig']
gnuconfig_dir = gnuconfig.prefix
# An external gnuconfig may not have a prefix.
if gnuconfig_dir is None:
raise InstallError("Spack could not find substitutes for GNU config "
"files because no prefix is available for the "
"`gnuconfig` package. Make sure you set a prefix "
"path instead of modules for external `gnuconfig`.")
candidates = fs.find(gnuconfig_dir, files=to_be_found, recursive=False)
# For external packages the user may have specified an incorrect prefix.
# otherwise the installation is just corrupt.
if not candidates:
msg = ("Spack could not find `config.guess` and `config.sub` "
"files in the `gnuconfig` prefix `{0}`. This means the "
"`gnuconfig` package is broken").format(gnuconfig_dir)
if gnuconfig.external:
msg += (" or the `gnuconfig` package prefix is misconfigured as"
" an external package")
raise InstallError(msg)
# Filter working substitutes
candidates = [f for f in candidates if runs_ok(f)]
substitutes = {}
for directory in good_file_dirs:
candidates = fs.find(directory, files=to_be_found, recursive=True)
candidates = [f for f in candidates if runs_ok(f)]
for name, good_files in itertools.groupby(
candidates, key=os.path.basename
):
substitutes[name] = next(good_files)
to_be_found.remove(name)
for candidate in candidates:
config_file = os.path.basename(candidate)
substitutes[config_file] = candidate
to_be_found.remove(config_file)
# Check that we found everything we needed
if to_be_found:
msg = 'Failed to find suitable substitutes for {0}'
raise RuntimeError(msg.format(', '.join(to_be_found)))
msg = """\
Spack could not find working replacements for the following autotools config
files: {0}.
To resolve this problem, please try the following:
1. Try to rebuild with `patch_config_files = False` in the package `{1}`, to
rule out that Spack tries to replace config files not used by the build.
2. Verify that the `gnuconfig` package is up-to-date.
3. On some systems you need to use system-provided `config.guess` and `config.sub`
files. In this case, mark `gnuconfig` as a non-buildable external package,
and set the prefix to the directory containing the `config.guess` and
`config.sub` files.
"""
raise InstallError(msg.format(', '.join(to_be_found), self.name))
# Copy the good files over the bad ones
for abs_path in to_be_patched:
@@ -252,17 +302,41 @@ def delete_configure_to_force_update(self):
if self.force_autoreconf:
force_remove(self.configure_abs_path)
def _require_build_deps(self, pkgs, spec, err):
"""Require `pkgs` to be direct build dependencies of `spec`. Raises a
RuntimeError with a helpful error messages when any dep is missing."""
build_deps = [d.name for d in spec.dependencies(deptype='build')]
missing_deps = [x for x in pkgs if x not in build_deps]
if not missing_deps:
return
# Raise an exception on missing deps.
msg = ("{0}: missing dependencies: {1}.\n\nPlease add "
"the following lines to the package:\n\n"
.format(err, ", ".join(missing_deps)))
for dep in missing_deps:
msg += (" depends_on('{0}', type='build', when='@{1}')\n"
.format(dep, spec.version))
msg += "\nUpdate the version (when='@{0}') as needed.".format(spec.version)
raise RuntimeError(msg)
def autoreconf(self, spec, prefix):
"""Not needed usually, configure should be already there"""
# If configure exists nothing needs to be done
if os.path.exists(self.configure_abs_path):
return
# Else try to regenerate it
autotools = ['m4', 'autoconf', 'automake', 'libtool']
missing = [x for x in autotools if x not in spec]
if missing:
msg = 'Cannot generate configure: missing dependencies {0}'
raise RuntimeError(msg.format(missing))
# Else try to regenerate it, which requires a few build dependencies
self._require_build_deps(
pkgs=['autoconf', 'automake', 'libtool'],
spec=spec,
err="Cannot generate configure")
tty.msg('Configure script not found: trying to generate it')
tty.warn('*********************************************************')
tty.warn('* If the default procedure fails, consider implementing *')
@@ -331,7 +405,7 @@ def flags_to_build_system_args(self, flags):
def configure(self, spec, prefix):
"""Runs configure with the arguments specified in
:py:meth:`~.AutotoolsPackage.configure_args`
:meth:`~spack.build_systems.autotools.AutotoolsPackage.configure_args`
and an appropriately set prefix.
"""
options = getattr(self, 'configure_flag_args', [])
@@ -373,25 +447,28 @@ def _activate_or_not(
name,
activation_word,
deactivation_word,
activation_value=None
activation_value=None,
variant=None
):
"""This function contains the current implementation details of
:py:meth:`~.AutotoolsPackage.with_or_without` and
:py:meth:`~.AutotoolsPackage.enable_or_disable`.
:meth:`~spack.build_systems.autotools.AutotoolsPackage.with_or_without` and
:meth:`~spack.build_systems.autotools.AutotoolsPackage.enable_or_disable`.
Args:
name (str): name of the variant that is being processed
name (str): name of the option that is being activated or not
activation_word (str): the default activation word ('with' in the
case of ``with_or_without``)
deactivation_word (str): the default deactivation word ('without'
in the case of ``with_or_without``)
activation_value (callable): callable that accepts a single
activation_value (typing.Callable): callable that accepts a single
value. This value is either one of the allowed values for a
multi-valued variant or the name of a bool-valued variant.
Returns the parameter to be used when the value is activated.
The special value 'prefix' can also be assigned and will return
``spec[name].prefix`` as activation parameter.
variant (str): name of the variant that is being processed
(if different from option name)
Examples:
@@ -401,6 +478,7 @@ def _activate_or_not(
variant('foo', values=('x', 'y'), description='')
variant('bar', default=True, description='')
variant('ba_z', default=True, description='')
calling this function like:
@@ -410,17 +488,18 @@ def _activate_or_not(
'foo', 'with', 'without', activation_value='prefix'
)
_activate_or_not('bar', 'with', 'without')
_activate_or_not('ba-z', 'with', 'without', variant='ba_z')
will generate the following configuration options:
.. code-block:: console
--with-x=<prefix-to-x> --without-y --with-bar
--with-x=<prefix-to-x> --without-y --with-bar --with-ba-z
for ``<spec-name> foo=x +bar``
Returns:
list of strings that corresponds to the activation/deactivation
list: list of strings that corresponds to the activation/deactivation
of the variant that has been processed
Raises:
@@ -432,32 +511,36 @@ def _activate_or_not(
if activation_value == 'prefix':
activation_value = lambda x: spec[x].prefix
variant = variant or name
# Defensively look that the name passed as argument is among
# variants
if name not in self.variants:
if variant not in self.variants:
msg = '"{0}" is not a variant of "{1}"'
raise KeyError(msg.format(name, self.name))
raise KeyError(msg.format(variant, self.name))
# Create a list of pairs. Each pair includes a configuration
# option and whether or not that option is activated
if set(self.variants[name].values) == set((True, False)):
if set(self.variants[variant].values) == set((True, False)):
# BoolValuedVariant carry information about a single option.
# Nonetheless, for uniformity of treatment we'll package them
# in an iterable of one element.
condition = '+{name}'.format(name=name)
condition = '+{name}'.format(name=variant)
options = [(name, condition in spec)]
else:
condition = '{name}={value}'
condition = '{variant}={value}'
# "feature_values" is used to track values which correspond to
# features which can be enabled or disabled as understood by the
# package's build system. It excludes values which have special
# meanings and do not correspond to features (e.g. "none")
feature_values = getattr(
self.variants[name].values, 'feature_values', None
) or self.variants[name].values
self.variants[variant].values, 'feature_values', None
) or self.variants[variant].values
options = [
(value, condition.format(name=name, value=value) in spec)
(value,
condition.format(variant=variant,
value=value) in spec)
for value in feature_values
]
@@ -485,7 +568,7 @@ def _default_generator(is_activated):
args.append(line_generator(activated))
return args
def with_or_without(self, name, activation_value=None):
def with_or_without(self, name, activation_value=None, variant=None):
"""Inspects a variant and returns the arguments that activate
or deactivate the selected feature(s) for the configure options.
@@ -501,7 +584,7 @@ def with_or_without(self, name, activation_value=None):
Args:
name (str): name of a valid multi-valued variant
activation_value (callable): callable that accepts a single
activation_value (typing.Callable): callable that accepts a single
value and returns the parameter to be used leading to an entry
of the type ``--with-{name}={parameter}``.
@@ -511,15 +594,17 @@ def with_or_without(self, name, activation_value=None):
Returns:
list of arguments to configure
"""
return self._activate_or_not(name, 'with', 'without', activation_value)
return self._activate_or_not(name, 'with', 'without', activation_value,
variant)
def enable_or_disable(self, name, activation_value=None):
"""Same as :py:meth:`~.AutotoolsPackage.with_or_without` but substitute
``with`` with ``enable`` and ``without`` with ``disable``.
def enable_or_disable(self, name, activation_value=None, variant=None):
"""Same as
:meth:`~spack.build_systems.autotools.AutotoolsPackage.with_or_without`
but substitute ``with`` with ``enable`` and ``without`` with ``disable``.
Args:
name (str): name of a valid multi-valued variant
activation_value (callable): if present accepts a single value
activation_value (typing.Callable): if present accepts a single value
and returns the parameter to be used leading to an entry of the
type ``--enable-{name}={parameter}``
@@ -530,7 +615,7 @@ def enable_or_disable(self, name, activation_value=None):
list of arguments to configure
"""
return self._activate_or_not(
name, 'enable', 'disable', activation_value
name, 'enable', 'disable', activation_value, variant
)
run_after('install')(PackageBase._run_default_install_time_test_callbacks)
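A hedged usage sketch of the new `variant=` keyword on the public helpers, mirroring the `ba-z`/`ba_z` example in the docstring above (package name illustrative; assumes the usual `from spack import *` package preamble):
class Libbar(AutotoolsPackage):
    variant('ba_z', default=True, description='enable the ba-z feature')

    def configure_args(self):
        # the configure option is spelled 'ba-z' but the variant is 'ba_z'
        return self.with_or_without('ba-z', variant='ba_z')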
@@ -560,3 +645,6 @@ def remove_libtool_archives(self):
fs.mkdirp(os.path.dirname(self._removed_la_files_log))
with open(self._removed_la_files_log, mode='w') as f:
f.write('\n'.join(libtool_files))
# On macOS, force rpaths for shared library IDs and remove duplicate rpaths
run_after('install')(PackageBase.apply_macos_rpath_fixups)


@@ -108,21 +108,6 @@ def initconfig_compiler_entries(self):
if fflags:
entries.append(cmake_cache_string("CMAKE_Fortran_FLAGS", fflags))
# Override XL compiler family
familymsg = ("Override to proper compiler family for XL")
if "xlf" in (self.compiler.fc or ''): # noqa: F821
entries.append(cmake_cache_string(
"CMAKE_Fortran_COMPILER_ID", "XL",
familymsg))
if "xlc" in self.compiler.cc: # noqa: F821
entries.append(cmake_cache_string(
"CMAKE_C_COMPILER_ID", "XL",
familymsg))
if "xlC" in self.compiler.cxx: # noqa: F821
entries.append(cmake_cache_string(
"CMAKE_CXX_COMPILER_ID", "XL",
familymsg))
return entries
def initconfig_mpi_entries(self):


@@ -236,7 +236,7 @@ def define_from_variant(self, cmake_var, variant=None):
of ``cmake_var``.
This utility function is similar to
:py:meth:`~.AutotoolsPackage.with_or_without`.
:meth:`~spack.build_systems.autotools.AutotoolsPackage.with_or_without`.
Examples:
@@ -254,9 +254,9 @@ def define_from_variant(self, cmake_var, variant=None):
.. code-block:: python
[define_from_variant('BUILD_SHARED_LIBS', 'shared'),
define_from_variant('CMAKE_CXX_STANDARD', 'cxxstd'),
define_from_variant('SWR')]
[self.define_from_variant('BUILD_SHARED_LIBS', 'shared'),
self.define_from_variant('CMAKE_CXX_STANDARD', 'cxxstd'),
self.define_from_variant('SWR')]
will generate the following configuration options:


@@ -12,7 +12,7 @@ class CudaPackage(PackageBase):
"""Auxiliary class which contains CUDA variant, dependencies and conflicts
and is meant to unify and facilitate its usage.
Maintainers: ax3l, Rombur
Maintainers: ax3l, Rombur, davidbeckingsale
"""
# https://docs.nvidia.com/cuda/cuda-compiler-driver-nvcc/index.html#gpu-feature-list
@@ -87,28 +87,45 @@ def cuda_flags(arch_list):
# Linux x86_64 compiler conflicts from here:
# https://gist.github.com/ax3l/9489132
# GCC
# According to
# https://github.com/spack/spack/pull/25054#issuecomment-886531664
# these conflicts are valid independently from the architecture
# minimum supported versions
conflicts('%gcc@:4', when='+cuda ^cuda@11.0:')
conflicts('%gcc@:5', when='+cuda ^cuda@11.4:')
# maximum supported version
# NOTE:
# in order to not constrain future cuda version to old gcc versions,
# it has been decided to use an upper bound for the latest version.
# This implies that the last one in the list has to be updated at
# each release of a new cuda minor version.
conflicts('%gcc@10:', when='+cuda ^cuda@:11.0')
conflicts('%gcc@11:', when='+cuda ^cuda@:11.4.0')
conflicts('%gcc@12:', when='+cuda ^cuda@:11.5.0')
conflicts('%clang@12:', when='+cuda ^cuda@:11.4.0')
conflicts('%clang@13:', when='+cuda ^cuda@:11.5.0')
# https://gist.github.com/ax3l/9489132#gistcomment-3860114
conflicts('%gcc@10', when='+cuda ^cuda@:11.4.0')
conflicts('%gcc@5:', when='+cuda ^cuda@:7.5 target=x86_64:')
conflicts('%gcc@6:', when='+cuda ^cuda@:8 target=x86_64:')
conflicts('%gcc@7:', when='+cuda ^cuda@:9.1 target=x86_64:')
conflicts('%gcc@8:', when='+cuda ^cuda@:10.0.130 target=x86_64:')
conflicts('%gcc@9:', when='+cuda ^cuda@:10.2.89 target=x86_64:')
conflicts('%gcc@:4', when='+cuda ^cuda@11.0.2: target=x86_64:')
conflicts('%gcc@10:', when='+cuda ^cuda@:11.0.3 target=x86_64:')
conflicts('%gcc@11:', when='+cuda ^cuda@:11.1.0 target=x86_64:')
conflicts('%pgi@:14.8', when='+cuda ^cuda@:7.0.27 target=x86_64:')
conflicts('%pgi@:15.3,15.5:', when='+cuda ^cuda@7.5 target=x86_64:')
conflicts('%pgi@:16.2,16.0:16.3', when='+cuda ^cuda@8 target=x86_64:')
conflicts('%pgi@:15,18:', when='+cuda ^cuda@9.0:9.1 target=x86_64:')
conflicts('%pgi@:16,19:', when='+cuda ^cuda@9.2.88:10 target=x86_64:')
conflicts('%pgi@:17,20:',
when='+cuda ^cuda@10.1.105:10.2.89 target=x86_64:')
conflicts('%pgi@:17,21:',
when='+cuda ^cuda@11.0.2:11.1.0 target=x86_64:')
conflicts('%pgi@:17,20:', when='+cuda ^cuda@10.1.105:10.2.89 target=x86_64:')
conflicts('%pgi@:17,21:', when='+cuda ^cuda@11.0.2:11.1.0 target=x86_64:')
conflicts('%clang@:3.4', when='+cuda ^cuda@:7.5 target=x86_64:')
conflicts('%clang@:3.7,4:',
when='+cuda ^cuda@8.0:9.0 target=x86_64:')
conflicts('%clang@:3.7,4.1:',
when='+cuda ^cuda@9.1 target=x86_64:')
conflicts('%clang@:3.7,4:', when='+cuda ^cuda@8.0:9.0 target=x86_64:')
conflicts('%clang@:3.7,4.1:', when='+cuda ^cuda@9.1 target=x86_64:')
conflicts('%clang@:3.7,5.1:', when='+cuda ^cuda@9.2 target=x86_64:')
conflicts('%clang@:3.7,6.1:', when='+cuda ^cuda@10.0.130 target=x86_64:')
conflicts('%clang@:3.7,7.1:', when='+cuda ^cuda@10.1.105 target=x86_64:')
@@ -132,9 +149,6 @@ def cuda_flags(arch_list):
conflicts('%gcc@8:', when='+cuda ^cuda@:10.0.130 target=ppc64le:')
conflicts('%gcc@9:', when='+cuda ^cuda@:10.1.243 target=ppc64le:')
# officially, CUDA 11.0.2 only supports the system GCC 8.3 on ppc64le
conflicts('%gcc@:4', when='+cuda ^cuda@11.0.2: target=ppc64le:')
conflicts('%gcc@10:', when='+cuda ^cuda@:11.0.3 target=ppc64le:')
conflicts('%gcc@11:', when='+cuda ^cuda@:11.1.0 target=ppc64le:')
conflicts('%pgi', when='+cuda ^cuda@:8 target=ppc64le:')
conflicts('%pgi@:16', when='+cuda ^cuda@:9.1.185 target=ppc64le:')
conflicts('%pgi@:17', when='+cuda ^cuda@:10 target=ppc64le:')
@@ -145,7 +159,7 @@ def cuda_flags(arch_list):
conflicts('%clang@7.1:', when='+cuda ^cuda@:10.1.105 target=ppc64le:')
conflicts('%clang@8.1:', when='+cuda ^cuda@:10.2.89 target=ppc64le:')
conflicts('%clang@:5', when='+cuda ^cuda@11.0.2: target=ppc64le:')
conflicts('%clang@10:', when='+cuda ^cuda@:11.0.3 target=ppc64le:')
conflicts('%clang@10:', when='+cuda ^cuda@:11.0.2 target=ppc64le:')
conflicts('%clang@11:', when='+cuda ^cuda@:11.1.0 target=ppc64le:')
# Intel is mostly relevant for x86_64 Linux, even though it also
@@ -170,7 +184,7 @@ def cuda_flags(arch_list):
# Darwin.
# TODO: add missing conflicts for %apple-clang cuda@:10
conflicts('platform=darwin', when='+cuda ^cuda@11.0.2:')
conflicts('platform=darwin', when='+cuda ^cuda@11.0.2: ')
# Make sure cuda_arch can not be used without +cuda
for value in cuda_arch_values:


@@ -116,9 +116,9 @@ class IntelPackage(PackageBase):
# that satisfies self.spec will be used.
version_years = {
# intel-daal is versioned 2016 and later, no divining is needed
'intel-ipp@9.0:9.99': 2016,
'intel-mkl@11.3.0:11.3.999': 2016,
'intel-mpi@5.1:5.99': 2016,
'intel-ipp@9.0:9': 2016,
'intel-mkl@11.3.0:11.3': 2016,
'intel-mpi@5.1:5': 2016,
}
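The shortened ranges rely on Spack's inclusive version-range semantics; roughly (illustrative):
# 'intel-mkl@11.3.0:11.3' matches 11.3.0, 11.3.1, ... but not 11.4.0,
# so the old ':11.3.999' upper-bound workaround is no longer needed.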
# Below is the list of possible values for setting auto dispatch functions
@@ -368,7 +368,7 @@ def normalize_suite_dir(self, suite_dir_name, version_globs=['*.*.*']):
toplevel psxevars.sh or equivalent file to source (and thus by
the modulefiles that Spack produces).
version_globs (list of str): Suffix glob patterns (most specific
version_globs (list): Suffix glob patterns (most specific
first) expected to qualify suite_dir_name to its fully
version-specific install directory (as opposed to a
compatibility directory or symlink).


@@ -110,3 +110,6 @@ def installcheck(self):
# Check that self.prefix is there after installation
run_after('install')(PackageBase.sanity_check_prefix)
# On macOS, force rpaths for shared library IDs and remove duplicate rpaths
run_after('install')(PackageBase.apply_macos_rpath_fixups)


@@ -69,6 +69,9 @@ def install(self, spec, prefix, installer_path=None):
# Installer writes files in ~/intel set HOME so it goes to prefix
bash.add_default_env('HOME', prefix)
# Installer checks $XDG_RUNTIME_DIR/.bootstrapper_lock_file as well
bash.add_default_env('XDG_RUNTIME_DIR',
join_path(self.stage.path, 'runtime'))
bash(installer_path,
'-s', '-a', '-s', '--action', 'install',


@@ -127,24 +127,25 @@ def import_modules(self):
list: list of strings of module names
"""
modules = []
root = os.path.join(
self.prefix,
self.spec['python'].package.config_vars['python_lib']['true']['false'],
)
# Python libraries may be installed in lib or lib64
# See issues #18520 and #17126
for lib in ['lib', 'lib64']:
root = os.path.join(self.prefix, lib, 'python{0}'.format(
self.spec['python'].version.up_to(2)), 'site-packages')
# Some Python libraries are packages: collections of modules
# distributed in directories containing __init__.py files
for path in find(root, '__init__.py', recursive=True):
modules.append(path.replace(root + os.sep, '', 1).replace(
os.sep + '__init__.py', '').replace('/', '.'))
# Some Python libraries are modules: individual *.py files
# found in the site-packages directory
for path in find(root, '*.py', recursive=False):
modules.append(path.replace(root + os.sep, '', 1).replace(
'.py', '').replace('/', '.'))
# Some Python libraries are packages: collections of modules
# distributed in directories containing __init__.py files
for path in find(root, '__init__.py', recursive=True):
modules.append(path.replace(root + os.sep, '', 1).replace(
os.sep + '__init__.py', '').replace('/', '.'))
# Some Python libraries are modules: individual *.py files
# found in the site-packages directory
for path in find(root, '*.py', recursive=False):
modules.append(path.replace(root + os.sep, '', 1).replace(
'.py', '').replace('/', '.'))
tty.debug('Detected the following modules: {0}'.format(modules))
return modules
def setup_file(self):
@@ -254,15 +255,11 @@ def install_args(self, spec, prefix):
# Get all relative paths since we set the root to `prefix`
# We query the python with which these will be used for the lib and inc
# directories. This ensures we use `lib`/`lib64` as expected by python.
python = spec['python'].package.command
command_start = 'print(distutils.sysconfig.'
commands = ';'.join([
'import distutils.sysconfig',
command_start + 'get_python_lib(plat_specific=False, prefix=""))',
command_start + 'get_python_lib(plat_specific=True, prefix=""))',
command_start + 'get_python_inc(plat_specific=True, prefix=""))'])
pure_site_packages_dir, plat_site_packages_dir, inc_dir = python(
'-c', commands, output=str, error=str).strip().split('\n')
pure_site_packages_dir = spec['python'].package.config_vars[
'python_lib']['false']['false']
plat_site_packages_dir = spec['python'].package.config_vars[
'python_lib']['true']['false']
inc_dir = spec['python'].package.config_vars['python_inc']['true']
args += ['--root=%s' % prefix,
'--install-purelib=%s' % pure_site_packages_dir,
@@ -396,11 +393,15 @@ def remove_files_from_view(self, view, merge_map):
self.spec
)
)
to_remove = []
for src, dst in merge_map.items():
if ignore_namespace and namespace_init(dst):
continue
if global_view or not path_contains_subdirectory(src, bin_dir):
view.remove_file(src, dst)
to_remove.append(dst)
else:
os.remove(dst)
view.remove_files(to_remove)


@@ -18,6 +18,9 @@ class RubyPackage(PackageBase):
#. :py:meth:`~.RubyPackage.build`
#. :py:meth:`~.RubyPackage.install`
"""
maintainers = ['Kerilk']
#: Phases of a Ruby package
phases = ['build', 'install']
@@ -50,8 +53,12 @@ def install(self, spec, prefix):
gems = glob.glob('*.gem')
if gems:
# if --install-dir is not used, GEM_PATH is deleted from the
# environment, and Gems required to build native extensions will
# not be found. Those extensions are built during `gem install`.
inspect.getmodule(self).gem(
'install', '--norc', '--ignore-dependencies', gems[0])
'install', '--norc', '--ignore-dependencies',
'--install-dir', prefix, gems[0])
# Check that self.prefix is there after installation
run_after('install')(PackageBase.sanity_check_prefix)
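Roughly the command the install phase now runs (gem file name and prefix are illustrative):
gem install --norc --ignore-dependencies --install-dir <prefix> foo-0.1.0.gem
# Passing --install-dir keeps GEM_PATH set, so gems required to build native
# extensions remain visible while `gem install` runs.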


@@ -64,24 +64,25 @@ def import_modules(self):
list: list of strings of module names
"""
modules = []
root = os.path.join(
self.prefix,
self.spec['python'].package.config_vars['python_lib']['true']['false'],
)
# Python libraries may be installed in lib or lib64
# See issues #18520 and #17126
for lib in ['lib', 'lib64']:
root = os.path.join(self.prefix, lib, 'python{0}'.format(
self.spec['python'].version.up_to(2)), 'site-packages')
# Some Python libraries are packages: collections of modules
# distributed in directories containing __init__.py files
for path in find(root, '__init__.py', recursive=True):
modules.append(path.replace(root + os.sep, '', 1).replace(
os.sep + '__init__.py', '').replace('/', '.'))
# Some Python libraries are modules: individual *.py files
# found in the site-packages directory
for path in find(root, '*.py', recursive=False):
modules.append(path.replace(root + os.sep, '', 1).replace(
'.py', '').replace('/', '.'))
# Some Python libraries are packages: collections of modules
# distributed in directories containing __init__.py files
for path in find(root, '__init__.py', recursive=True):
modules.append(path.replace(root + os.sep, '', 1).replace(
os.sep + '__init__.py', '').replace('/', '.'))
# Some Python libraries are modules: individual *.py files
# found in the site-packages directory
for path in find(root, '*.py', recursive=False):
modules.append(path.replace(root + os.sep, '', 1).replace(
'.py', '').replace('/', '.'))
tty.debug('Detected the following modules: {0}'.format(modules))
return modules
def python(self, *args, **kwargs):


@@ -23,11 +23,8 @@ def misc_cache_location():
Currently the ``misc_cache`` stores indexes for virtual dependency
providers and for which packages provide which tags.
"""
path = spack.config.get('config:misc_cache')
if not path:
path = os.path.join(spack.paths.user_config_path, 'cache')
path = spack.util.path.canonicalize_path(path)
return path
path = spack.config.get('config:misc_cache', spack.paths.default_misc_cache_path)
return spack.util.path.canonicalize_path(path)
def _misc_cache():
@@ -47,7 +44,7 @@ def fetch_cache_location():
"""
path = spack.config.get('config:source_cache')
if not path:
path = os.path.join(spack.paths.var_path, "cache")
path = spack.paths.default_fetch_cache_path
path = spack.util.path.canonicalize_path(path)
return path


@@ -45,6 +45,8 @@
]
SPACK_PR_MIRRORS_ROOT_URL = 's3://spack-binaries-prs'
SPACK_SHARED_PR_MIRROR_URL = url_util.join(SPACK_PR_MIRRORS_ROOT_URL,
'shared_pr_mirror')
TEMP_STORAGE_MIRROR_NAME = 'ci_temporary_mirror'
spack_gpg = spack.main.SpackCommand('gpg')
@@ -394,9 +396,6 @@ def append_dep(s, d):
})
for spec in spec_list:
spec.concretize()
# root_spec = get_spec_string(spec)
root_spec = spec
for s in spec.traverse(deptype=all):
@@ -612,11 +611,14 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
'strip-compilers': False,
})
# Add this mirror if it's enabled, as some specs might be up to date
# here and thus not need to be rebuilt.
# Add per-PR mirror (and shared PR mirror) if enabled, as some specs might
# be up to date in one of those and thus not need to be rebuilt.
if pr_mirror_url:
spack.mirror.add(
'ci_pr_mirror', pr_mirror_url, cfg.default_modify_scope())
spack.mirror.add('ci_shared_pr_mirror',
SPACK_SHARED_PR_MIRROR_URL,
cfg.default_modify_scope())
pipeline_artifacts_dir = artifacts_root
if not pipeline_artifacts_dir:
@@ -663,16 +665,35 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
# Speed up staging by first fetching binary indices from all mirrors
# (including the per-PR mirror we may have just added above).
bindist.binary_index.update()
try:
bindist.binary_index.update()
except bindist.FetchCacheError as e:
tty.error(e)
staged_phases = {}
try:
for phase in phases:
phase_name = phase['name']
with spack.concretize.disable_compiler_existence_check():
staged_phases[phase_name] = stage_spec_jobs(
env.spec_lists[phase_name],
check_index_only=check_index_only)
if phase_name == 'specs':
# Anything in the "specs" of the environment are already
# concretized by the block at the top of this method, so we
# only need to find the concrete versions, and then avoid
# re-concretizing them needlessly later on.
concrete_phase_specs = [
concrete for abstract, concrete in env.concretized_specs()
if abstract in env.spec_lists[phase_name]
]
else:
# Any specs lists in other definitions (but not in the
# "specs") of the environment are not yet concretized so we
# have to concretize them explicitly here.
concrete_phase_specs = env.spec_lists[phase_name]
with spack.concretize.disable_compiler_existence_check():
for phase_spec in concrete_phase_specs:
phase_spec.concretize()
staged_phases[phase_name] = stage_spec_jobs(
concrete_phase_specs,
check_index_only=check_index_only)
finally:
# Clean up PR mirror if enabled
if pr_mirror_url:
@@ -688,6 +709,17 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
max_length_needs = 0
max_needs_job = ''
# If this is configured, spack will fail "spack ci generate" if it
# generates any full hash which exists under the broken specs url.
broken_spec_urls = None
if broken_specs_url:
if broken_specs_url.startswith('http'):
# To make checking each spec against the list faster, we require
# a url protocol that allows us to iterate the url in advance.
tty.msg('Cannot use an http(s) url for broken specs, ignoring')
else:
broken_spec_urls = web_util.list_url(broken_specs_url)
before_script, after_script = None, None
for phase in phases:
phase_name = phase['name']
@@ -871,16 +903,13 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
tty.debug(debug_msg)
if prune_dag and not rebuild_spec:
tty.debug('Pruning spec that does not need to be rebuilt.')
continue
# Check if this spec is in our list of known failures, now that
# we know this spec needs a rebuild
if broken_specs_url:
broken_spec_path = url_util.join(
broken_specs_url, release_spec_full_hash)
if web_util.url_exists(broken_spec_path):
known_broken_specs_encountered.append('{0} ({1})'.format(
release_spec, release_spec_full_hash))
if (broken_spec_urls is not None and
release_spec_full_hash in broken_spec_urls):
known_broken_specs_encountered.append('{0} ({1})'.format(
release_spec, release_spec_full_hash))
if artifacts_root:
job_dependencies.append({
@@ -917,7 +946,7 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
bc_root = os.path.join(
local_mirror_dir, 'build_cache')
artifact_paths.extend([os.path.join(bc_root, p) for p in [
bindist.tarball_name(release_spec, '.spec.yaml'),
bindist.tarball_name(release_spec, '.spec.json'),
bindist.tarball_name(release_spec, '.cdashid'),
bindist.tarball_directory_name(release_spec),
]])
@@ -998,6 +1027,14 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
'after_script',
]
service_job_retries = {
'max': 2,
'when': [
'runner_system_failure',
'stuck_or_timeout_failure'
]
}
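Roughly how that retry policy renders for the service jobs (cleanup, rebuild-index, no-specs-to-rebuild) in the generated .gitlab-ci.yml (illustrative):
retry:
  max: 2
  when:
    - runner_system_failure
    - stuck_or_timeout_failure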
if job_id > 0:
if temp_storage_url_prefix:
# There were some rebuild jobs scheduled, so we will need to
@@ -1017,6 +1054,7 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
temp_storage_url_prefix)
]
cleanup_job['when'] = 'always'
cleanup_job['retry'] = service_job_retries
output_object['cleanup'] = cleanup_job
@@ -1040,11 +1078,7 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
index_target_mirror)
]
final_job['when'] = 'always'
if artifacts_root:
final_job['variables'] = {
'SPACK_CONCRETE_ENV_DIR': concrete_env_dir
}
final_job['retry'] = service_job_retries
output_object['rebuild-index'] = final_job
@@ -1108,6 +1142,8 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
'echo "All specs already up to date, nothing to rebuild."',
]
noop_job['retry'] = service_job_retries
sorted_output = {'no-specs-to-rebuild': noop_job}
if known_broken_specs_encountered:
@@ -1316,17 +1352,20 @@ def relate_cdash_builds(spec_map, cdash_base_url, job_build_id, cdash_project,
request = Request(cdash_api_url, data=enc_data, headers=headers)
response = opener.open(request)
response_code = response.getcode()
try:
response = opener.open(request)
response_code = response.getcode()
if response_code != 200 and response_code != 201:
msg = 'Relate builds ({0} -> {1}) failed (resp code = {2})'.format(
job_build_id, dep_build_id, response_code)
tty.warn(msg)
return
if response_code != 200 and response_code != 201:
msg = 'Relate builds ({0} -> {1}) failed (resp code = {2})'.format(
job_build_id, dep_build_id, response_code)
tty.warn(msg)
return
response_text = response.read()
tty.debug('Relate builds response: {0}'.format(response_text))
response_text = response.read()
tty.debug('Relate builds response: {0}'.format(response_text))
except Exception as e:
print("Relating builds in CDash failed: {0}".format(e))
def write_cdashid_to_mirror(cdashid, spec, mirror_url):
@@ -1373,13 +1412,13 @@ def read_cdashid_from_mirror(spec, mirror_url):
return int(contents)
def push_mirror_contents(env, spec, yaml_path, mirror_url, sign_binaries):
def push_mirror_contents(env, spec, specfile_path, mirror_url, sign_binaries):
try:
unsigned = not sign_binaries
tty.debug('Creating buildcache ({0})'.format(
'unsigned' if unsigned else 'signed'))
spack.cmd.buildcache._createtarball(
env, spec_yaml=yaml_path, add_deps=False,
env, spec_file=specfile_path, add_deps=False,
output_location=mirror_url, force=True, allow_root=True,
unsigned=unsigned)
except Exception as inst:
@@ -1395,7 +1434,7 @@ def push_mirror_contents(env, spec, yaml_path, mirror_url, sign_binaries):
# BaseException
# object
err_msg = 'Error msg: {0}'.format(inst)
if 'Access Denied' in err_msg:
if any(x in err_msg for x in ['Access Denied', 'InvalidAccessKeyId']):
tty.msg('Permission problem writing to {0}'.format(
mirror_url))
tty.msg(err_msg)


@@ -21,6 +21,7 @@
from llnl.util.tty.color import colorize
import spack.config
import spack.environment as ev
import spack.error
import spack.extensions
import spack.paths
@@ -186,29 +187,13 @@ def matching_spec_from_env(spec):
If no matching spec is found in the environment (or if no environment is
active), this will return the given spec but concretized.
"""
env = spack.environment.get_env({}, cmd_name)
env = ev.active_environment()
if env:
return env.matching_spec(spec) or spec.concretized()
else:
return spec.concretized()
def elide_list(line_list, max_num=10):
"""Takes a long list and limits it to a smaller number of elements,
replacing intervening elements with '...'. For example::
elide_list([1,2,3,4,5,6], 4)
gives::
[1, 2, 3, '...', 6]
"""
if len(line_list) > max_num:
return line_list[:max_num - 1] + ['...'] + line_list[-1:]
else:
return line_list
def disambiguate_spec(spec, env, local=False, installed=True, first=False):
"""Given a spec, figure out which installed package it refers to.
@@ -216,10 +201,10 @@ def disambiguate_spec(spec, env, local=False, installed=True, first=False):
spec (spack.spec.Spec): a spec to disambiguate
env (spack.environment.Environment): a spack environment,
if one is active, or None if no environment is active
local (boolean, default False): do not search chained spack instances
installed (boolean or any, or spack.database.InstallStatus or iterable
of spack.database.InstallStatus): install status argument passed to
database query. See ``spack.database.Database._query`` for details.
local (bool): do not search chained spack instances
installed (bool or spack.database.InstallStatus or typing.Iterable):
install status argument passed to database query.
See ``spack.database.Database._query`` for details.
"""
hashes = env.all_hashes() if env else None
return disambiguate_spec_from_hashes(spec, hashes, local, installed, first)
@@ -231,11 +216,11 @@ def disambiguate_spec_from_hashes(spec, hashes, local=False,
Arguments:
spec (spack.spec.Spec): a spec to disambiguate
hashes (iterable): a set of hashes of specs among which to disambiguate
local (boolean, default False): do not search chained spack instances
installed (boolean or any, or spack.database.InstallStatus or iterable
of spack.database.InstallStatus): install status argument passed to
database query. See ``spack.database.Database._query`` for details.
hashes (typing.Iterable): a set of hashes of specs among which to disambiguate
local (bool): do not search chained spack instances
installed (bool or spack.database.InstallStatus or typing.Iterable):
install status argument passed to database query.
See ``spack.database.Database._query`` for details.
"""
if local:
matching_specs = spack.store.db.query_local(spec, hashes=hashes,
@@ -274,17 +259,19 @@ def display_specs_as_json(specs, deps=False):
seen = set()
records = []
for spec in specs:
if spec.dag_hash() in seen:
dag_hash = spec.dag_hash()
if dag_hash in seen:
continue
seen.add(spec.dag_hash())
records.append(spec.to_record_dict())
records.append(spec.node_dict_with_hashes())
seen.add(dag_hash)
if deps:
for dep in spec.traverse():
if dep.dag_hash() in seen:
dep_dag_hash = dep.dag_hash()
if dep_dag_hash in seen:
continue
seen.add(dep.dag_hash())
records.append(dep.to_record_dict())
records.append(dep.node_dict_with_hashes())
seen.add(dep_dag_hash)
sjson.dump(records, sys.stdout)
@@ -333,9 +320,8 @@ def display_specs(specs, args=None, **kwargs):
namespace.
Args:
specs (list of spack.spec.Spec): the specs to display
args (optional argparse.Namespace): namespace containing
formatting arguments
specs (list): the specs to display
args (argparse.Namespace or None): namespace containing formatting arguments
Keyword Args:
paths (bool): Show paths with each displayed spec
@@ -348,9 +334,9 @@ def display_specs(specs, args=None, **kwargs):
indent (int): indent each line this much
groups (bool): display specs grouped by arch/compiler (default True)
decorators (dict): dictionary mapping specs to decorators
header_callback (function): called at start of arch/compiler groups
header_callback (typing.Callable): called at start of arch/compiler groups
all_headers (bool): show headers even when arch/compiler aren't defined
output (stream): A file object to write to. Default is ``sys.stdout``
output (typing.IO): A file object to write to. Default is ``sys.stdout``
"""
def get_arg(name, default=None):
@@ -502,3 +488,71 @@ def extant_file(f):
if not os.path.isfile(f):
raise argparse.ArgumentTypeError('%s does not exist' % f)
return f
def require_active_env(cmd_name):
"""Used by commands to get the active environment
If an environment is not found, print an error message that says the calling
command *needs* an active environment.
Arguments:
cmd_name (str): name of calling command
Returns:
(spack.environment.Environment): the active environment
"""
env = ev.active_environment()
if env:
return env
else:
tty.die(
'`spack %s` requires an environment' % cmd_name,
'activate an environment first:',
' spack env activate ENV',
'or use:',
' spack -e ENV %s ...' % cmd_name)
def find_environment(args):
"""Find active environment from args or environment variable.
Check for an environment in this order:
1. via ``spack -e ENV`` or ``spack -D DIR`` (arguments)
2. via a path in the spack.environment.spack_env_var environment variable.
If an environment is found, read it in. If not, return None.
Arguments:
args (argparse.Namespace): argparse namespace with command arguments
Returns:
(spack.environment.Environment): a found environment, or ``None``
"""
# treat env as a name
env = args.env
if env:
if ev.exists(env):
return ev.read(env)
else:
# if env was specified, see if it is a directory otherwise, look
# at env_dir (env and env_dir are mutually exclusive)
env = args.env_dir
# if no argument, look for the environment variable
if not env:
env = os.environ.get(ev.spack_env_var)
# nothing was set; there's no active environment
if not env:
return None
# if we get here, env isn't the name of a spack environment; it has
# to be a path to an environment, or there is something wrong.
if ev.is_env_dir(env):
return ev.Environment(env)
raise ev.SpackEnvironmentError('no environment in %s' % env)


@@ -30,8 +30,7 @@ def activate(parser, args):
if len(specs) != 1:
tty.die("activate requires one spec. %d given." % len(specs))
env = ev.get_env(args, 'activate')
spec = spack.cmd.disambiguate_spec(specs[0], env)
spec = spack.cmd.disambiguate_spec(specs[0], ev.active_environment())
if not spec.package.is_extension:
tty.die("%s is not an extension." % spec.name)


@@ -7,7 +7,6 @@
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.environment as ev
description = 'add a spec to an environment'
section = "environments"
@@ -22,7 +21,7 @@ def setup_parser(subparser):
def add(parser, args):
env = ev.get_env(args, 'add', required=True)
env = spack.cmd.require_active_env(cmd_name='add')
with env.write_transaction():
for spec in spack.cmd.parse_specs(args.specs):


@@ -58,9 +58,9 @@ def analyze_spec(spec, analyzers=None, outdir=None, monitor=None, overwrite=Fals
analyze_spec(spec, args.analyzers, args.outdir, monitor)
Args:
spec (Spec): spec object of installed package
spec (spack.spec.Spec): spec object of installed package
analyzers (list): list of analyzer (keys) to run
monitor (monitor.SpackMonitorClient): a monitor client
monitor (spack.monitor.SpackMonitorClient): a monitor client
overwrite (bool): overwrite result if already exists
"""
analyzers = analyzers or list(spack.analyzers.analyzer_types.keys())
@@ -95,7 +95,7 @@ def analyze(parser, args, **kwargs):
sys.exit(0)
# handle active environment, if any
env = ev.get_env(args, 'analyze')
env = ev.active_environment()
# Get an disambiguate spec (we should only have one)
specs = spack.cmd.parse_specs(args.spec)


@@ -12,7 +12,7 @@
import llnl.util.tty.colify as colify
import llnl.util.tty.color as color
import spack.architecture as architecture
import spack.platforms
description = "print architecture information about this machine"
section = "system"
@@ -20,6 +20,10 @@
def setup_parser(subparser):
subparser.add_argument(
'-g', '--generic-target', action='store_true',
help='show the best generic target'
)
subparser.add_argument(
'--known-targets', action='store_true',
help='show a list of all known targets and exit'
@@ -74,25 +78,32 @@ def display_target_group(header, target_group):
def arch(parser, args):
if args.generic_target:
print(archspec.cpu.host().generic)
return
if args.known_targets:
display_targets(archspec.cpu.TARGETS)
return
os_args, target_args = 'default_os', 'default_target'
if args.frontend:
arch = architecture.Arch(architecture.platform(),
'frontend', 'frontend')
os_args, target_args = 'frontend', 'frontend'
elif args.backend:
arch = architecture.Arch(architecture.platform(),
'backend', 'backend')
else:
arch = architecture.Arch(architecture.platform(),
'default_os', 'default_target')
os_args, target_args = 'backend', 'backend'
host_platform = spack.platforms.host()
host_os = host_platform.operating_system(os_args)
host_target = host_platform.target(target_args)
architecture = spack.spec.ArchSpec(
(str(host_platform), str(host_os), str(host_target))
)
if args.platform:
print(arch.platform)
print(architecture.platform)
elif args.operating_system:
print(arch.os)
print(architecture.os)
elif args.target:
print(arch.target)
print(architecture.target)
else:
print(arch)
print(architecture)

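Read end to end, the arch refactor above boils down to the following standalone sketch (run it under `spack python` so the spack modules resolve); the printed triplet is only an example of what a host might report.

import spack.platforms
import spack.spec

# Build an ArchSpec for the default OS and target, as the command now does
host_platform = spack.platforms.host()
host_os = host_platform.operating_system('default_os')
host_target = host_platform.target('default_target')
architecture = spack.spec.ArchSpec(
    (str(host_platform), str(host_os), str(host_target)))
print(architecture)   # e.g. linux-ubuntu20.04-skylake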

@@ -2,6 +2,7 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import llnl.util.tty as tty
import llnl.util.tty.color as cl
import spack.audit
@@ -19,12 +20,24 @@ def setup_parser(subparser):
# Audit configuration files
sp.add_parser('configs', help='audit configuration files')
# Https and other linting
https_parser = sp.add_parser('packages-https', help='check https in packages')
https_parser.add_argument(
'--all',
action='store_true',
default=False,
dest='check_all',
help="audit all packages"
)
# Audit package recipes
pkg_parser = sp.add_parser('packages', help='audit package recipes')
pkg_parser.add_argument(
'name', metavar='PKG', nargs='*',
help='package to be analyzed (if none all packages will be processed)',
)
for group in [pkg_parser, https_parser]:
group.add_argument(
'name', metavar='PKG', nargs='*',
help='package to be analyzed (if none all packages will be processed)',
)
# List all checks
sp.add_parser('list', help='list available checks and exits')
@@ -41,6 +54,17 @@ def packages(parser, args):
_process_reports(reports)
def packages_https(parser, args):
# Since packages takes a long time, --all is required without name
if not args.check_all and not args.name:
tty.die("Please specify one or more packages to audit, or --all.")
pkgs = args.name or spack.repo.path.all_package_names()
reports = spack.audit.run_group(args.subcommand, pkgs=pkgs)
_process_reports(reports)
def list(parser, args):
for subcommand, check_tags in spack.audit.GROUPS.items():
print(cl.colorize('@*b{' + subcommand + '}:'))
@@ -58,6 +82,7 @@ def audit(parser, args):
subcommands = {
'configs': configs,
'packages': packages,
'packages-https': packages_https,
'list': list
}
subcommands[args.subcommand](parser, args)

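A quick usage sketch of the new packages-https check; the package names are only examples, and SpackCommand is used merely as an in-process way to invoke the CLI.

from spack.main import SpackCommand

audit = SpackCommand('audit')
audit('packages-https', '--all')            # lint https usage across every package
audit('packages-https', 'zlib', 'openssl')  # or restrict the check to named recipes
audit('list')                               # show the available check groups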

@@ -0,0 +1,220 @@
# Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from __future__ import print_function
import os.path
import shutil
import llnl.util.tty
import llnl.util.tty.color
import spack.cmd.common.arguments
import spack.config
import spack.main
import spack.util.path
description = "manage bootstrap configuration"
section = "system"
level = "long"
def _add_scope_option(parser):
scopes = spack.config.scopes()
scopes_metavar = spack.config.scopes_metavar
parser.add_argument(
'--scope', choices=scopes, metavar=scopes_metavar,
help="configuration scope to read/modify"
)
def setup_parser(subparser):
sp = subparser.add_subparsers(dest='subcommand')
enable = sp.add_parser('enable', help='enable bootstrapping')
_add_scope_option(enable)
disable = sp.add_parser('disable', help='disable bootstrapping')
_add_scope_option(disable)
reset = sp.add_parser(
'reset', help='reset bootstrapping configuration to Spack defaults'
)
spack.cmd.common.arguments.add_common_arguments(
reset, ['yes_to_all']
)
root = sp.add_parser(
'root', help='get/set the root bootstrap directory'
)
_add_scope_option(root)
root.add_argument(
'path', nargs='?', default=None,
help='set the bootstrap directory to this value'
)
list = sp.add_parser(
'list', help='list the methods available for bootstrapping'
)
_add_scope_option(list)
trust = sp.add_parser(
'trust', help='trust a bootstrapping method'
)
_add_scope_option(trust)
trust.add_argument(
'name', help='name of the method to be trusted'
)
untrust = sp.add_parser(
'untrust', help='untrust a bootstrapping method'
)
_add_scope_option(untrust)
untrust.add_argument(
'name', help='name of the method to be untrusted'
)
def _enable_or_disable(args):
# Set to True if we called "enable", otherwise set to false
value = args.subcommand == 'enable'
spack.config.set('bootstrap:enable', value, scope=args.scope)
def _reset(args):
if not args.yes_to_all:
msg = [
"Bootstrapping configuration is being reset to Spack's defaults. "
"Current configuration will be lost.\n",
"Do you want to continue?"
]
ok_to_continue = llnl.util.tty.get_yes_or_no(
''.join(msg), default=True
)
if not ok_to_continue:
raise RuntimeError('Aborting')
for scope in spack.config.config.file_scopes:
# The default scope should stay untouched
if scope.name == 'defaults':
continue
# If we are in an env scope we can't delete a file, but the best we
# can do is nullify the corresponding configuration
if (scope.name.startswith('env') and
spack.config.get('bootstrap', scope=scope.name)):
spack.config.set('bootstrap', {}, scope=scope.name)
continue
# If we are outside of an env scope delete the bootstrap.yaml file
bootstrap_yaml = os.path.join(scope.path, 'bootstrap.yaml')
backup_file = bootstrap_yaml + '.bkp'
if os.path.exists(bootstrap_yaml):
shutil.move(bootstrap_yaml, backup_file)
def _root(args):
if args.path:
spack.config.set('bootstrap:root', args.path, scope=args.scope)
root = spack.config.get('bootstrap:root', default=None, scope=args.scope)
if root:
root = spack.util.path.canonicalize_path(root)
print(root)
def _list(args):
sources = spack.config.get(
'bootstrap:sources', default=None, scope=args.scope
)
if not sources:
llnl.util.tty.msg(
"No method available for bootstrapping Spack's dependencies"
)
return
def _print_method(source, trusted):
color = llnl.util.tty.color
def fmt(header, content):
header_fmt = "@*b{{{0}:}} {1}"
color.cprint(header_fmt.format(header, content))
trust_str = "@*y{UNKNOWN}"
if trusted is True:
trust_str = "@*g{TRUSTED}"
elif trusted is False:
trust_str = "@*r{UNTRUSTED}"
fmt("Name", source['name'] + ' ' + trust_str)
print()
fmt(" Type", source['type'])
print()
info_lines = ['\n']
for key, value in source.get('info', {}).items():
info_lines.append(' ' * 4 + '@*{{{0}}}: {1}\n'.format(key, value))
if len(info_lines) > 1:
fmt(" Info", ''.join(info_lines))
description_lines = ['\n']
for line in source['description'].split('\n'):
description_lines.append(' ' * 4 + line + '\n')
fmt(" Description", ''.join(description_lines))
trusted = spack.config.get('bootstrap:trusted', {})
for s in sources:
_print_method(s, trusted.get(s['name'], None))
def _write_trust_state(args, value):
name = args.name
sources = spack.config.get('bootstrap:sources')
matches = [s for s in sources if s['name'] == name]
if not matches:
names = [s['name'] for s in sources]
msg = ('there is no bootstrapping method named "{0}". Valid '
'method names are: {1}'.format(name, ', '.join(names)))
raise RuntimeError(msg)
if len(matches) > 1:
msg = ('there is more than one bootstrapping method named "{0}". '
'Please delete all methods but one from bootstrap.yaml '
'before proceeding').format(name)
raise RuntimeError(msg)
# Setting the scope explicitly is needed to not copy over to a new scope
# the entire default configuration for bootstrap.yaml
scope = args.scope or spack.config.default_modify_scope('bootstrap')
spack.config.add(
'bootstrap:trusted:{0}:{1}'.format(name, str(value)), scope=scope
)
def _trust(args):
_write_trust_state(args, value=True)
msg = '"{0}" is now trusted for bootstrapping'
llnl.util.tty.msg(msg.format(args.name))
def _untrust(args):
_write_trust_state(args, value=False)
msg = '"{0}" is now untrusted and will not be used for bootstrapping'
llnl.util.tty.msg(msg.format(args.name))
def bootstrap(parser, args):
callbacks = {
'enable': _enable_or_disable,
'disable': _enable_or_disable,
'reset': _reset,
'root': _root,
'list': _list,
'trust': _trust,
'untrust': _untrust
}
callbacks[args.subcommand](args)

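As a hedged usage sketch of the new command, with 'github-actions' standing in for whatever method name your bootstrap.yaml sources actually list:

from spack.main import SpackCommand

bootstrap = SpackCommand('bootstrap')
bootstrap('list')                       # each method is shown TRUSTED/UNTRUSTED/UNKNOWN
bootstrap('trust', 'github-actions')    # records bootstrap:trusted:<name>: true
bootstrap('untrust', 'github-actions')  # flips the same entry to false
bootstrap('root', '/tmp/bootstrap')     # set, then print, the bootstrap store root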

@@ -2,29 +2,33 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import argparse
import os
import shutil
import sys
import tempfile
import llnl.util.tty as tty
import spack.architecture
import spack.binary_distribution as bindist
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.config
import spack.environment as ev
import spack.fetch_strategy as fs
import spack.hash_types as ht
import spack.mirror
import spack.relocate
import spack.repo
import spack.spec
import spack.store
import spack.util.crypto
import spack.util.url as url_util
import spack.util.web as web_util
from spack.cmd import display_specs
from spack.error import SpecError
from spack.spec import Spec, save_dependency_spec_yamls
from spack.spec import Spec, save_dependency_specfiles
from spack.stage import Stage
from spack.util.string import plural
description = "create, download and install binary packages"
@@ -70,8 +74,9 @@ def setup_parser(subparser):
create.add_argument('--rebuild-index', action='store_true',
default=False, help="Regenerate buildcache index " +
"after building package(s)")
create.add_argument('-y', '--spec-yaml', default=None,
help='Create buildcache entry for spec from yaml file')
create.add_argument('--spec-file', default=None,
help=('Create buildcache entry for spec from json or ' +
'yaml file'))
create.add_argument('--only', default='package,dependencies',
dest='things_to_install',
choices=['package', 'dependencies'],
@@ -97,6 +102,8 @@ def setup_parser(subparser):
install.add_argument('-o', '--otherarch', action='store_true',
help="install specs from other architectures" +
" instead of default platform and OS")
# This argument is needed by the bootstrapping logic to verify checksums
install.add_argument('--sha256', help=argparse.SUPPRESS)
arguments.add_common_arguments(install, ['specs'])
install.set_defaults(func=installtarball)
@@ -156,8 +163,9 @@ def setup_parser(subparser):
help='Check single spec instead of release specs file')
check.add_argument(
'-y', '--spec-yaml', default=None,
help='Check single spec from yaml file instead of release specs file')
'--spec-file', default=None,
help=('Check single spec from json or yaml file instead of release ' +
'specs file'))
check.add_argument(
'--rebuild-on-error', default=False, action='store_true',
@@ -166,14 +174,15 @@ def setup_parser(subparser):
check.set_defaults(func=check_binaries)
# Download tarball and spec.yaml
# Download tarball and specfile
dltarball = subparsers.add_parser('download', help=get_tarball.__doc__)
dltarball.add_argument(
'-s', '--spec', default=None,
help="Download built tarball for spec from mirror")
dltarball.add_argument(
'-y', '--spec-yaml', default=None,
help="Download built tarball for spec (from yaml file) from mirror")
'--spec-file', default=None,
help=("Download built tarball for spec (from json or yaml file) " +
"from mirror"))
dltarball.add_argument(
'-p', '--path', default=None,
help="Path to directory where tarball should be downloaded")
@@ -189,26 +198,27 @@ def setup_parser(subparser):
'-s', '--spec', default=None,
help='Spec string for which buildcache name is desired')
getbuildcachename.add_argument(
'-y', '--spec-yaml', default=None,
help='Path to spec yaml file for which buildcache name is desired')
'--spec-file', default=None,
help=('Path to spec json or yaml file for which buildcache name is ' +
'desired'))
getbuildcachename.set_defaults(func=get_buildcache_name)
# Given the root spec, save the yaml of the dependent spec to a file
saveyaml = subparsers.add_parser('save-yaml',
help=save_spec_yamls.__doc__)
saveyaml.add_argument(
savespecfile = subparsers.add_parser('save-specfile',
help=save_specfiles.__doc__)
savespecfile.add_argument(
'--root-spec', default=None,
help='Root spec of dependent spec')
saveyaml.add_argument(
'--root-spec-yaml', default=None,
help='Path to yaml file containing root spec of dependent spec')
saveyaml.add_argument(
savespecfile.add_argument(
'--root-specfile', default=None,
help='Path to json or yaml file containing root spec of dependent spec')
savespecfile.add_argument(
'-s', '--specs', default=None,
help='List of dependent specs for which saved yaml is desired')
saveyaml.add_argument(
'-y', '--yaml-dir', default=None,
savespecfile.add_argument(
'--specfile-dir', default=None,
help='Path to directory where spec yamls should be saved')
saveyaml.set_defaults(func=save_spec_yamls)
savespecfile.set_defaults(func=save_specfiles)
# Copy buildcache from some directory to another mirror url
copy = subparsers.add_parser('copy', help=buildcache_copy.__doc__)
@@ -216,13 +226,44 @@ def setup_parser(subparser):
'--base-dir', default=None,
help='Path to mirror directory (root of existing buildcache)')
copy.add_argument(
'--spec-yaml', default=None,
help='Path to spec yaml file representing buildcache entry to copy')
'--spec-file', default=None,
help=('Path to spec json or yaml file representing buildcache entry to' +
' copy'))
copy.add_argument(
'--destination-url', default=None,
help='Destination mirror url')
copy.set_defaults(func=buildcache_copy)
# Sync buildcache entries from one mirror to another
sync = subparsers.add_parser('sync', help=buildcache_sync.__doc__)
source = sync.add_mutually_exclusive_group(required=True)
source.add_argument('--src-directory',
metavar='DIRECTORY',
type=str,
help="Source mirror as a local file path")
source.add_argument('--src-mirror-name',
metavar='MIRROR_NAME',
type=str,
help="Name of the source mirror")
source.add_argument('--src-mirror-url',
metavar='MIRROR_URL',
type=str,
help="URL of the source mirror")
dest = sync.add_mutually_exclusive_group(required=True)
dest.add_argument('--dest-directory',
metavar='DIRECTORY',
type=str,
help="Destination mirror as a local file path")
dest.add_argument('--dest-mirror-name',
metavar='MIRROR_NAME',
type=str,
help="Name of the destination mirror")
dest.add_argument('--dest-mirror-url',
metavar='MIRROR_URL',
type=str,
help="URL of the destination mirror")
sync.set_defaults(func=buildcache_sync)
# Update buildcache index without copying any additional packages
update_index = subparsers.add_parser(
'update-index', help=buildcache_update_index.__doc__)
@@ -239,12 +280,13 @@ def find_matching_specs(pkgs, allow_multiple_matches=False, env=None):
concretized specs given from cli
Args:
pkgs (string): spec to be matched against installed packages
pkgs (str): spec to be matched against installed packages
allow_multiple_matches (bool): if True multiple matches are admitted
env (Environment): active environment, or ``None`` if there is not one
env (spack.environment.Environment or None): active environment, or ``None``
if there is not one
Return:
list of specs
list: list of specs
"""
hashes = env.all_hashes() if env else None
@@ -292,9 +334,13 @@ def match_downloaded_specs(pkgs, allow_multiple_matches=False, force=False,
specs_from_cli = []
has_errors = False
specs = bindist.update_cache_and_get_specs()
try:
specs = bindist.update_cache_and_get_specs()
except bindist.FetchCacheError as e:
tty.error(e)
if not other_arch:
arch = spack.architecture.default_arch().to_spec()
arch = spack.spec.Spec.default_arch()
specs = [s for s in specs if s.satisfies(arch)]
for pkg in pkgs:
@@ -328,16 +374,19 @@ def match_downloaded_specs(pkgs, allow_multiple_matches=False, force=False,
return specs_from_cli
def _createtarball(env, spec_yaml=None, packages=None, add_spec=True,
def _createtarball(env, spec_file=None, packages=None, add_spec=True,
add_deps=True, output_location=os.getcwd(),
signing_key=None, force=False, make_relative=False,
unsigned=False, allow_root=False, rebuild_index=False):
if spec_yaml:
with open(spec_yaml, 'r') as fd:
yaml_text = fd.read()
tty.debug('createtarball read spec yaml:')
tty.debug(yaml_text)
s = Spec.from_yaml(yaml_text)
if spec_file:
with open(spec_file, 'r') as fd:
specfile_contents = fd.read()
tty.debug('createtarball read specfile contents:')
tty.debug(specfile_contents)
if spec_file.endswith('.json'):
s = Spec.from_json(specfile_contents)
else:
s = Spec.from_yaml(specfile_contents)
package = '/{0}'.format(s.dag_hash())
matches = find_matching_specs(package, env=env)
@@ -350,7 +399,7 @@ def _createtarball(env, spec_yaml=None, packages=None, add_spec=True,
else:
tty.die("build cache file creation requires at least one" +
" installed package spec, an active environment," +
" or else a path to a yaml file containing a spec" +
" or else a path to a json or yaml file containing a spec" +
" to install")
specs = set()
@@ -419,7 +468,7 @@ def createtarball(args):
"""create a binary package from an existing install"""
# restrict matching to current environment if one is active
env = ev.get_env(args, 'buildcache create')
env = ev.active_environment()
output_location = None
if args.directory:
@@ -459,7 +508,7 @@ def createtarball(args):
add_spec = ('package' in args.things_to_install)
add_deps = ('dependencies' in args.things_to_install)
_createtarball(env, spec_yaml=args.spec_yaml, packages=args.specs,
_createtarball(env, spec_file=args.spec_file, packages=args.specs,
add_spec=add_spec, add_deps=add_deps,
output_location=output_location, signing_key=args.key,
force=args.force, make_relative=args.rel,
@@ -494,6 +543,15 @@ def install_tarball(spec, args):
else:
tarball = bindist.download_tarball(spec)
if tarball:
if args.sha256:
checker = spack.util.crypto.Checker(args.sha256)
msg = ('cannot verify checksum for "{0}"'
' [expected={1}]')
msg = msg.format(tarball, args.sha256)
if not checker.check(tarball):
raise spack.binary_distribution.NoChecksumException(msg)
tty.debug('Verified SHA256 checksum of the build cache')
tty.msg('Installing buildcache for spec %s' % spec.format())
bindist.extract_tarball(spec, tarball, args.allow_root,
args.unsigned, args.force)
@@ -506,9 +564,13 @@ def install_tarball(spec, args):
def listspecs(args):
"""list binary packages available from mirrors"""
specs = bindist.update_cache_and_get_specs()
try:
specs = bindist.update_cache_and_get_specs()
except bindist.FetchCacheError as e:
tty.error(e)
if not args.allarch:
arch = spack.architecture.default_arch().to_spec()
arch = spack.spec.Spec.default_arch()
specs = [s for s in specs if s.satisfies(arch)]
if args.specs:
@@ -551,10 +613,10 @@ def check_binaries(args):
its result, specifically, if the exit code is non-zero, then at least
one of the indicated specs needs to be rebuilt.
"""
if args.spec or args.spec_yaml:
if args.spec or args.spec_file:
specs = [get_concrete_spec(args)]
else:
env = ev.get_env(args, 'buildcache', required=True)
env = spack.cmd.require_active_env(cmd_name='buildcache')
env.concretize()
specs = env.all_specs()
@@ -588,15 +650,16 @@ def download_buildcache_files(concrete_spec, local_dest, require_cdashid,
files_to_fetch = [
{
'url': tarball_path_name,
'url': [tarball_path_name],
'path': local_tarball_path,
'required': True,
}, {
'url': bindist.tarball_name(concrete_spec, '.spec.yaml'),
'url': [bindist.tarball_name(concrete_spec, '.spec.json'),
bindist.tarball_name(concrete_spec, '.spec.yaml')],
'path': local_dest,
'required': True,
}, {
'url': bindist.tarball_name(concrete_spec, '.cdashid'),
'url': [bindist.tarball_name(concrete_spec, '.cdashid')],
'path': local_dest,
'required': require_cdashid,
},
@@ -610,9 +673,9 @@ def get_tarball(args):
command uses the process exit code to indicate its result, specifically,
a non-zero exit code indicates that the command failed to download at
least one of the required buildcache components. Normally, just the
tarball and .spec.yaml files are required, but if the --require-cdashid
tarball and .spec.json files are required, but if the --require-cdashid
argument was provided, then a .cdashid file is also required."""
if not args.spec and not args.spec_yaml:
if not args.spec and not args.spec_file:
tty.msg('No specs provided, exiting.')
sys.exit(0)
@@ -629,7 +692,7 @@ def get_tarball(args):
def get_concrete_spec(args):
spec_str = args.spec
spec_yaml_path = args.spec_yaml
spec_yaml_path = args.spec_file
if not spec_str and not spec_yaml_path:
tty.msg('Must provide either spec string or path to ' +
@@ -661,14 +724,14 @@ def get_buildcache_name(args):
sys.exit(0)
def save_spec_yamls(args):
def save_specfiles(args):
"""Get full spec for dependencies, relative to root spec, and write them
to files in the specified output directory. Uses exit code to signal
success or failure. An exit code of zero means the command was likely
successful. If any errors or exceptions are encountered, or if expected
command-line arguments are not provided, then the exit code will be
non-zero."""
if not args.root_spec and not args.root_spec_yaml:
if not args.root_spec and not args.root_specfile:
tty.msg('No root spec provided, exiting.')
sys.exit(1)
@@ -676,20 +739,20 @@ def save_spec_yamls(args):
tty.msg('No dependent specs provided, exiting.')
sys.exit(1)
if not args.yaml_dir:
if not args.specfile_dir:
tty.msg('No yaml directory provided, exiting.')
sys.exit(1)
if args.root_spec_yaml:
with open(args.root_spec_yaml) as fd:
root_spec_as_yaml = fd.read()
if args.root_specfile:
with open(args.root_specfile) as fd:
root_spec_as_json = fd.read()
else:
root_spec = Spec(args.root_spec)
root_spec.concretize()
root_spec_as_yaml = root_spec.to_yaml(hash=ht.build_hash)
save_dependency_spec_yamls(
root_spec_as_yaml, args.yaml_dir, args.specs.split())
root_spec_as_json = root_spec.to_json(hash=ht.build_hash)
spec_format = 'yaml' if args.root_specfile.endswith('yaml') else 'json'
save_dependency_specfiles(
root_spec_as_json, args.specfile_dir, args.specs.split(), spec_format)
sys.exit(0)
@@ -698,10 +761,10 @@ def buildcache_copy(args):
"""Copy a buildcache entry and all its files from one mirror, given as
'--base-dir', to some other mirror, specified as '--destination-url'.
The specific buildcache entry to be copied from one location to the
other is identified using the '--spec-yaml' argument."""
other is identified using the '--spec-file' argument."""
# TODO: This sub-command should go away once #11117 is merged
if not args.spec_yaml:
if not args.spec_file:
tty.msg('No spec yaml provided, exiting.')
sys.exit(1)
@@ -721,12 +784,12 @@ def buildcache_copy(args):
sys.exit(1)
try:
with open(args.spec_yaml, 'r') as fd:
with open(args.spec_file, 'r') as fd:
spec = Spec.from_yaml(fd.read())
except Exception as e:
tty.debug(e)
tty.error('Unable to concrectize spec from yaml {0}'.format(
args.spec_yaml))
args.spec_file))
sys.exit(1)
dest_root_path = dest_url
@@ -741,10 +804,15 @@ def buildcache_copy(args):
tarball_dest_path = os.path.join(dest_root_path, tarball_rel_path)
specfile_rel_path = os.path.join(
build_cache_dir, bindist.tarball_name(spec, '.spec.yaml'))
build_cache_dir, bindist.tarball_name(spec, '.spec.json'))
specfile_src_path = os.path.join(args.base_dir, specfile_rel_path)
specfile_dest_path = os.path.join(dest_root_path, specfile_rel_path)
specfile_rel_path_yaml = os.path.join(
build_cache_dir, bindist.tarball_name(spec, '.spec.yaml'))
specfile_src_path_yaml = os.path.join(args.base_dir, specfile_rel_path)
specfile_dest_path_yaml = os.path.join(dest_root_path, specfile_rel_path)
cdashidfile_rel_path = os.path.join(
build_cache_dir, bindist.tarball_name(spec, '.cdashid'))
cdashid_src_path = os.path.join(args.base_dir, cdashidfile_rel_path)
@@ -760,12 +828,134 @@ def buildcache_copy(args):
tty.msg('Copying {0}'.format(specfile_rel_path))
shutil.copyfile(specfile_src_path, specfile_dest_path)
tty.msg('Copying {0}'.format(specfile_rel_path_yaml))
shutil.copyfile(specfile_src_path_yaml, specfile_dest_path_yaml)
# Copy the cdashid file (if exists) to the destination mirror
if os.path.exists(cdashid_src_path):
tty.msg('Copying {0}'.format(cdashidfile_rel_path))
shutil.copyfile(cdashid_src_path, cdashid_dest_path)
def buildcache_sync(args):
""" Syncs binaries (and associated metadata) from one mirror to another.
Requires an active environment in order to know which specs to sync.
Args:
src (str): Source mirror URL
dest (str): Destination mirror URL
"""
# Figure out the source mirror
source_location = None
if args.src_directory:
source_location = args.src_directory
scheme = url_util.parse(source_location, scheme='<missing>').scheme
if scheme != '<missing>':
raise ValueError(
'"--src-directory" expected a local path; got a URL, instead')
# Ensure that the mirror lookup does not mistake this for named mirror
source_location = 'file://' + source_location
elif args.src_mirror_name:
source_location = args.src_mirror_name
result = spack.mirror.MirrorCollection().lookup(source_location)
if result.name == "<unnamed>":
raise ValueError(
'no configured mirror named "{name}"'.format(
name=source_location))
elif args.src_mirror_url:
source_location = args.src_mirror_url
scheme = url_util.parse(source_location, scheme='<missing>').scheme
if scheme == '<missing>':
raise ValueError(
'"{url}" is not a valid URL'.format(url=source_location))
src_mirror = spack.mirror.MirrorCollection().lookup(source_location)
src_mirror_url = url_util.format(src_mirror.fetch_url)
# Figure out the destination mirror
dest_location = None
if args.dest_directory:
dest_location = args.dest_directory
scheme = url_util.parse(dest_location, scheme='<missing>').scheme
if scheme != '<missing>':
raise ValueError(
'"--dest-directory" expected a local path; got a URL, instead')
# Ensure that the mirror lookup does not mistake this for named mirror
dest_location = 'file://' + dest_location
elif args.dest_mirror_name:
dest_location = args.dest_mirror_name
result = spack.mirror.MirrorCollection().lookup(dest_location)
if result.name == "<unnamed>":
raise ValueError(
'no configured mirror named "{name}"'.format(
name=dest_location))
elif args.dest_mirror_url:
dest_location = args.dest_mirror_url
scheme = url_util.parse(dest_location, scheme='<missing>').scheme
if scheme == '<missing>':
raise ValueError(
'"{url}" is not a valid URL'.format(url=dest_location))
dest_mirror = spack.mirror.MirrorCollection().lookup(dest_location)
dest_mirror_url = url_util.format(dest_mirror.fetch_url)
# Get the active environment
env = spack.cmd.require_active_env(cmd_name='buildcache sync')
tty.msg('Syncing environment buildcache files from {0} to {1}'.format(
src_mirror_url, dest_mirror_url))
build_cache_dir = bindist.build_cache_relative_path()
buildcache_rel_paths = []
tty.debug('Syncing the following specs:')
for s in env.all_specs():
tty.debug(' {0}{1}: {2}'.format(
'* ' if s in env.roots() else ' ', s.name, s.dag_hash()))
buildcache_rel_paths.extend([
os.path.join(
build_cache_dir, bindist.tarball_path_name(s, '.spack')),
os.path.join(
build_cache_dir, bindist.tarball_name(s, '.spec.yaml')),
os.path.join(
build_cache_dir, bindist.tarball_name(s, '.spec.json')),
os.path.join(
build_cache_dir, bindist.tarball_name(s, '.cdashid'))
])
tmpdir = tempfile.mkdtemp()
try:
for rel_path in buildcache_rel_paths:
src_url = url_util.join(src_mirror_url, rel_path)
local_path = os.path.join(tmpdir, rel_path)
dest_url = url_util.join(dest_mirror_url, rel_path)
tty.debug('Copying {0} to {1} via {2}'.format(
src_url, dest_url, local_path))
stage = Stage(src_url,
name="temporary_file",
path=os.path.dirname(local_path),
keep=True)
try:
stage.create()
stage.fetch()
web_util.push_to_url(
local_path,
dest_url,
keep_original=True)
except fs.FetchError as e:
tty.debug('spack buildcache unable to sync {0}'.format(rel_path))
tty.debug(e)
finally:
stage.destroy()
finally:
shutil.rmtree(tmpdir)
def update_index(mirror_url, update_keys=False):
mirror = spack.mirror.MirrorCollection().lookup(mirror_url)
outdir = url_util.format(mirror.push_url)

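The new sync subcommand requires an active environment; the mirror URL and directory below are placeholders. A minimal sketch of an invocation:

from spack.main import SpackCommand

buildcache = SpackCommand('buildcache')
# For every spec in the active environment, copy the .spack tarball, the
# .spec.json/.spec.yaml metadata and the .cdashid file (when present) across mirrors.
buildcache('sync',
           '--src-mirror-url', 'https://cache.example.com/source-mirror',
           '--dest-directory', '/path/to/local-mirror')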

@@ -14,6 +14,7 @@
import spack.repo
import spack.stage
import spack.util.crypto
from spack.package import preferred_version
from spack.util.naming import valid_fully_qualified_module_name
from spack.version import Version, ver
@@ -26,9 +27,16 @@ def setup_parser(subparser):
subparser.add_argument(
'--keep-stage', action='store_true',
help="don't clean up staging area when command completes")
subparser.add_argument(
sp = subparser.add_mutually_exclusive_group()
sp.add_argument(
'-b', '--batch', action='store_true',
help="don't ask which versions to checksum")
sp.add_argument(
'-l', '--latest', action='store_true',
help="checksum the latest available version only")
sp.add_argument(
'-p', '--preferred', action='store_true',
help="checksum the preferred version only")
arguments.add_common_arguments(subparser, ['package'])
subparser.add_argument(
'versions', nargs=argparse.REMAINDER,
@@ -48,25 +56,38 @@ def checksum(parser, args):
# Get the package we're going to generate checksums for
pkg = spack.repo.get(args.package)
url_dict = {}
if args.versions:
# If the user asked for specific versions, use those
url_dict = {}
for version in args.versions:
version = ver(version)
if not isinstance(version, Version):
tty.die("Cannot generate checksums for version lists or "
"version ranges. Use unambiguous versions.")
url_dict[version] = pkg.url_for_version(version)
elif args.preferred:
version = preferred_version(pkg)
url_dict = dict([(version, pkg.url_for_version(version))])
else:
# Otherwise, see what versions we can find online
url_dict = pkg.fetch_remote_versions()
if not url_dict:
tty.die("Could not find any versions for {0}".format(pkg.name))
# And ensure the specified version URLs take precedence, if available
try:
explicit_dict = {}
for v in pkg.versions:
if not v.isdevelop():
explicit_dict[v] = pkg.url_for_version(v)
url_dict.update(explicit_dict)
except spack.package.NoURLError:
pass
version_lines = spack.stage.get_checksums_for_versions(
url_dict, pkg.name, keep_stage=args.keep_stage,
batch=(args.batch or len(args.versions) > 0 or len(url_dict) == 1),
fetch_options=pkg.fetch_options)
latest=args.latest, fetch_options=pkg.fetch_options)
print()
print(version_lines)

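The version-selection precedence added above can be summarized in a small placeholder function; none of these names are Spack API, they simply restate the branches of the checksum command.

def pick_checksum_urls(requested, preferred, remote, recipe_urls):
    """Placeholder mirror of the selection order in the checksum command above."""
    if requested:                 # versions given on the command line win outright
        return {v: recipe_urls.get(v) for v in requested}
    if preferred is not None:     # --preferred narrows the set to one version
        return {preferred: recipe_urls.get(preferred)}
    urls = dict(remote)           # otherwise start from versions found online...
    urls.update(recipe_urls)      # ...and let URLs declared in the recipe override
    return urls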

@@ -22,6 +22,7 @@
import spack.environment as ev
import spack.hash_types as ht
import spack.mirror
import spack.util.spack_yaml as syaml
import spack.util.url as url_util
import spack.util.web as web_util
@@ -30,6 +31,7 @@
level = "long"
CI_REBUILD_INSTALL_BASE_ARGS = ['spack', '-d', '-v']
INSTALL_FAIL_CODE = 1
def get_env_var(variable_name):
@@ -76,8 +78,8 @@ def setup_parser(subparser):
default=False, help="""Spack always check specs against configured
binary mirrors when generating the pipeline, regardless of whether or not
DAG pruning is enabled. This flag controls whether it might attempt to
fetch remote spec.yaml files directly (ensuring no spec is rebuilt if it is
present on the mirror), or whether it should reduce pipeline generation time
fetch remote spec files directly (ensuring no spec is rebuilt if it
is present on the mirror), or whether it should reduce pipeline generation time
by assuming all remote buildcache indices are up to date and only use those
to determine whether a given spec is up to date on mirrors. In the latter
case, specs might be needlessly rebuilt if remote buildcache indices are out
@@ -116,7 +118,7 @@ def ci_generate(args):
for creating a build group for the generated workload and registering
all generated jobs under that build group. If this environment
variable is not set, no build group will be created on CDash."""
env = ev.get_env(args, 'ci generate', required=True)
env = spack.cmd.require_active_env(cmd_name='ci generate')
output_file = args.output_file
copy_yaml_to = args.copy_to
@@ -150,7 +152,7 @@ def ci_generate(args):
def ci_reindex(args):
"""Rebuild the buildcache index associated with the mirror in the
active, gitlab-enabled environment. """
env = ev.get_env(args, 'ci rebuild-index', required=True)
env = spack.cmd.require_active_env(cmd_name='ci rebuild-index')
yaml_root = ev.config_dict(env.yaml)
if 'mirrors' not in yaml_root or len(yaml_root['mirrors'].values()) < 1:
@@ -167,7 +169,7 @@ def ci_rebuild(args):
"""Check a single spec against the remote mirror, and rebuild it from
source if the mirror does not contain the full hash match of the spec
as computed locally. """
env = ev.get_env(args, 'ci rebuild', required=True)
env = spack.cmd.require_active_env(cmd_name='ci rebuild')
# Make sure the environment is "gitlab-enabled", or else there's nothing
# to do.
@@ -491,7 +493,7 @@ def ci_rebuild(args):
# If a spec fails to build in a spack develop pipeline, we add it to a
# list of known broken full hashes. This allows spack PR pipelines to
# avoid wasting compute cycles attempting to build those hashes.
if install_exit_code == 1 and spack_is_develop_pipeline:
if install_exit_code == INSTALL_FAIL_CODE and spack_is_develop_pipeline:
tty.debug('Install failed on develop')
if 'broken-specs-url' in gitlab_ci:
broken_specs_url = gitlab_ci['broken-specs-url']
@@ -502,9 +504,17 @@ def ci_rebuild(args):
tmpdir = tempfile.mkdtemp()
empty_file_path = os.path.join(tmpdir, 'empty.txt')
broken_spec_details = {
'broken-spec': {
'job-url': get_env_var('CI_JOB_URL'),
'pipeline-url': get_env_var('CI_PIPELINE_URL'),
'concrete-spec-yaml': job_spec.to_dict(hash=ht.full_hash)
}
}
try:
with open(empty_file_path, 'w') as efd:
efd.write('')
efd.write(syaml.dump(broken_spec_details))
web_util.push_to_url(
empty_file_path,
broken_spec_path,
@@ -566,6 +576,26 @@ def ci_rebuild(args):
cdash_build_id, pipeline_mirror_url))
spack_ci.write_cdashid_to_mirror(
cdash_build_id, job_spec, pipeline_mirror_url)
# If this is a develop pipeline, check if the spec that we just built is
# on the broken-specs list. If so, remove it.
if spack_is_develop_pipeline and 'broken-specs-url' in gitlab_ci:
broken_specs_url = gitlab_ci['broken-specs-url']
just_built_hash = job_spec.full_hash()
broken_spec_path = url_util.join(broken_specs_url, just_built_hash)
if web_util.url_exists(broken_spec_path):
tty.msg('Removing {0} from the list of broken specs'.format(
broken_spec_path))
try:
web_util.remove_url(broken_spec_path)
except Exception as err:
# If we got some kind of S3 (access denied or other connection
# error), the first non boto-specific class in the exception
# hierarchy is Exception. Just print a warning and return
msg = 'Error removing {0} from broken specs list: {1}'.format(
broken_spec_path, err)
tty.warn(msg)
else:
tty.debug('spack install exited non-zero, will not create buildcache')


@@ -7,15 +7,17 @@
import os
import shutil
import llnl.util.filesystem
import llnl.util.tty as tty
import spack.bootstrap
import spack.caches
import spack.cmd.common.arguments as arguments
import spack.cmd.test
import spack.config
import spack.main
import spack.repo
import spack.stage
import spack.util.path
from spack.paths import lib_path, var_path
description = "remove temporary build files and/or downloaded archives"
@@ -26,7 +28,7 @@
class AllClean(argparse.Action):
"""Activates flags -s -d -f -m and -p simultaneously"""
def __call__(self, parser, namespace, values, option_string=None):
parser.parse_args(['-sdfmpb'], namespace=namespace)
parser.parse_args(['-sdfmp'], namespace=namespace)
def setup_parser(subparser):
@@ -47,9 +49,11 @@ def setup_parser(subparser):
help="remove .pyc, .pyo files and __pycache__ folders")
subparser.add_argument(
'-b', '--bootstrap', action='store_true',
help="remove software needed to bootstrap Spack")
help="remove software and configuration needed to bootstrap Spack")
subparser.add_argument(
'-a', '--all', action=AllClean, help="equivalent to -sdfmpb", nargs=0
'-a', '--all', action=AllClean,
help="equivalent to -sdfmp (does not include --bootstrap)",
nargs=0
)
arguments.add_common_arguments(subparser, ['specs'])
@@ -72,7 +76,11 @@ def clean(parser, args):
if args.stage:
tty.msg('Removing all temporary build stages')
spack.stage.purge()
# Temp directory where buildcaches are extracted
extract_tmp = os.path.join(spack.store.layout.root, '.tmp')
if os.path.exists(extract_tmp):
tty.debug('Removing {0}'.format(extract_tmp))
shutil.rmtree(extract_tmp)
if args.downloads:
tty.msg('Removing cached downloads')
spack.caches.fetch_cache.destroy()
@@ -101,8 +109,9 @@ def clean(parser, args):
shutil.rmtree(dname)
if args.bootstrap:
msg = 'Removing software in "{0}"'
tty.msg(msg.format(spack.paths.user_bootstrap_store))
with spack.store.use_store(spack.paths.user_bootstrap_store):
uninstall = spack.main.SpackCommand('uninstall')
uninstall('-a', '-y')
bootstrap_prefix = spack.util.path.canonicalize_path(
spack.config.get('bootstrap:root')
)
msg = 'Removing bootstrapped software and configuration in "{0}"'
tty.msg(msg.format(bootstrap_prefix))
llnl.util.filesystem.remove_directory_contents(bootstrap_prefix)

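A short usage sketch of the changed flag semantics (again driving the CLI in-process with SpackCommand):

from spack.main import SpackCommand

clean = SpackCommand('clean')
clean('-a')   # stages, downloads, failure marks, misc and python caches (-sdfmp)
clean('-b')   # wipe the bootstrap store and its configuration; no longer implied by -a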

@@ -69,7 +69,7 @@ def _specs(self, **kwargs):
# If an environment is provided, we'll restrict the search to
# only its installed packages.
env = ev._active_environment
env = ev.active_environment()
if env:
kwargs['hashes'] = set(env.all_hashes())


@@ -18,7 +18,6 @@
import spack.compilers
import spack.config
import spack.spec
from spack.spec import ArchSpec, CompilerSpec
description = "manage compilers"
section = "system"
@@ -78,24 +77,13 @@ def compiler_find(args):
# None signals spack.compiler.find_compilers to use its default logic
paths = args.add_paths or None
# Don't initialize compilers config via compilers.get_compiler_config.
# Just let compiler_find do the
# entire process and return an empty config from all_compilers
# Default for any other process is init_config=True
compilers = [c for c in spack.compilers.find_compilers(paths)]
new_compilers = []
for c in compilers:
arch_spec = ArchSpec((None, c.operating_system, c.target))
same_specs = spack.compilers.compilers_for_spec(
c.spec, arch_spec, init_config=False)
if not same_specs:
new_compilers.append(c)
# Below scope=None because we want new compilers that don't appear
# in any other configuration.
new_compilers = spack.compilers.find_new_compilers(paths, scope=None)
if new_compilers:
spack.compilers.add_compilers_to_config(new_compilers,
scope=args.scope,
init_config=False)
spack.compilers.add_compilers_to_config(
new_compilers, scope=args.scope, init_config=False
)
n = len(new_compilers)
s = 's' if n > 1 else ''
@@ -110,7 +98,7 @@ def compiler_find(args):
def compiler_remove(args):
cspec = CompilerSpec(args.compiler_spec)
cspec = spack.spec.CompilerSpec(args.compiler_spec)
compilers = spack.compilers.compilers_for_spec(cspec, scope=args.scope)
if not compilers:
tty.die("No compilers match spec %s" % cspec)
@@ -128,7 +116,7 @@ def compiler_remove(args):
def compiler_info(args):
"""Print info about all compilers matching a spec."""
cspec = CompilerSpec(args.compiler_spec)
cspec = spack.spec.CompilerSpec(args.compiler_spec)
compilers = spack.compilers.compilers_for_spec(cspec, scope=args.scope)
if not compilers:


@@ -3,6 +3,7 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import spack.cmd
import spack.environment as ev
description = 'concretize an environment and write a lockfile'
@@ -23,7 +24,7 @@ def setup_parser(subparser):
def concretize(parser, args):
env = ev.get_env(args, 'concretize', required=True)
env = spack.cmd.require_active_env(cmd_name='concretize')
if args.test == 'all':
tests = True


@@ -118,7 +118,7 @@ def _get_scope_and_section(args):
# w/no args and an active environment, point to env manifest
if not section:
env = ev.get_env(args, 'config edit')
env = ev.active_environment()
if env:
scope = env.env_file_config_scope_name()
@@ -143,7 +143,10 @@ def config_get(args):
"""
scope, section = _get_scope_and_section(args)
if scope and scope.startswith('env:'):
if section is not None:
spack.config.config.print_section(section)
elif scope and scope.startswith('env:'):
config_file = spack.config.config.get_config_filename(scope, section)
if os.path.exists(config_file):
with open(config_file) as f:
@@ -151,9 +154,6 @@ def config_get(args):
else:
tty.die('environment has no %s file' % ev.manifest_name)
elif section is not None:
spack.config.config.print_section(section)
else:
tty.die('`spack config get` requires a section argument '
'or an active environment.')
@@ -170,12 +170,19 @@ def config_edit(args):
With no arguments and an active environment, edit the spack.yaml for
the active environment.
"""
scope, section = _get_scope_and_section(args)
if not scope and not section:
tty.die('`spack config edit` requires a section argument '
'or an active environment.')
spack_env = os.environ.get(ev.spack_env_var)
if spack_env and not args.scope:
# Don't use the scope object for envs, as `config edit` can be called
# for a malformed environment. Use SPACK_ENV to find spack.yaml.
config_file = ev.manifest_file(spack_env)
else:
# If we aren't editing a spack.yaml file, get config path from scope.
scope, section = _get_scope_and_section(args)
if not scope and not section:
tty.die('`spack config edit` requires a section argument '
'or an active environment.')
config_file = spack.config.config.get_config_filename(scope, section)
config_file = spack.config.config.get_config_filename(scope, section)
if args.print_file:
print(config_file)
else:

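A minimal sketch of the new SPACK_ENV shortcut, assuming /path/to/my-env is a hypothetical environment directory containing a (possibly malformed) spack.yaml:

import os
from spack.main import SpackCommand

config = SpackCommand('config')
os.environ['SPACK_ENV'] = '/path/to/my-env'  # placeholder path, not a real environment
config('edit', '--print-file')               # prints /path/to/my-env/spack.yaml even if
                                             # the manifest cannot be loaded as a scope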

@@ -5,7 +5,10 @@
import os
import os.path
import llnl.util.tty
import spack.container
import spack.container.images
import spack.monitor
description = ("creates recipes to build images for different"
@@ -16,9 +19,26 @@
def setup_parser(subparser):
monitor_group = spack.monitor.get_monitor_group(subparser) # noqa
subparser.add_argument(
'--list-os', action='store_true', default=False,
help='list all the OS that can be used in the bootstrap phase and exit'
)
subparser.add_argument(
'--last-stage',
choices=('bootstrap', 'build', 'final'),
default='final',
help='last stage in the container recipe'
)
def containerize(parser, args):
if args.list_os:
possible_os = spack.container.images.all_bootstrap_os()
msg = 'The following operating systems can be used to bootstrap Spack:'
msg += '\n{0}'.format(' '.join(possible_os))
llnl.util.tty.msg(msg)
return
config_dir = args.env_dir or os.getcwd()
config_file = os.path.abspath(os.path.join(config_dir, 'spack.yaml'))
if not os.path.exists(config_file):
@@ -29,10 +49,12 @@ def containerize(parser, args):
# If we have a monitor request, add monitor metadata to config
if args.use_monitor:
config['spack']['monitor'] = {"disable_auth": args.monitor_disable_auth,
"host": args.monitor_host,
"keep_going": args.monitor_keep_going,
"prefix": args.monitor_prefix,
"tags": args.monitor_tags}
recipe = spack.container.recipe(config)
config['spack']['monitor'] = {
"disable_auth": args.monitor_disable_auth,
"host": args.monitor_host,
"keep_going": args.monitor_keep_going,
"prefix": args.monitor_prefix,
"tags": args.monitor_tags
}
recipe = spack.container.recipe(config, last_phase=args.last_stage)
print(recipe)
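Usage sketch for the two new containerize flags; the second call assumes a spack.yaml sits in the current working directory:

from spack.main import SpackCommand

containerize = SpackCommand('containerize')
containerize('--list-os')                  # list the OSes usable in the bootstrap phase
containerize('--last-stage', 'bootstrap')  # emit a recipe that stops after bootstrapping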

Some files were not shown because too many files have changed in this diff.