Compare commits

...

456 Commits

Author SHA1 Message Date
Gregory Becker
d82b537339 more thread safety 2022-07-11 14:00:20 -04:00
Gregory Becker
3d8d1fe924 update binary index before threading 2022-07-11 13:19:45 -04:00
Luke Diorio-Toth
f09f834a8f py-gtdbtk and fastani: new packages (#31268) 2022-06-24 07:49:07 -07:00
Cody Balos
147f39d7aa Fix typo in documentation note about concretizer:unify (#31246) 2022-06-24 11:31:43 +02:00
Adam J. Stewart
9fad3ed5f1 giflib: fix macOS build (#31234) 2022-06-24 11:28:44 +02:00
Adam J. Stewart
1375b1975b librttopo: add missing Autotools deps (#31252) 2022-06-24 11:28:12 +02:00
Adam J. Stewart
c533612ab6 sfcgal: fix boost build issues (#31254) 2022-06-24 10:22:56 +02:00
Adam J. Stewart
e5cc3c51d1 py-numpy: add v1.23.0 (#31250) 2022-06-24 10:20:04 +02:00
Massimiliano Culpo
70650cacbd ASP-based solver: rescale target weights so that 0 is always the best score (#31226)
fixes #30997

Instead of giving a penalty of 30 to all nodes when preferences
are not package-specific, give a penalty of 100 to all targets
of a node with package-specific preferences, if the target
is not explicitly preferred.
2022-06-23 23:33:36 -07:00
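The re-normed penalty scheme above can be illustrated with a small sketch (plain Python rather than the actual ASP rules in `concretize.lp`; the function and its arguments are made up for illustration):

```python
def target_penalty(target, preferred_targets, has_package_prefs, generic_weight=30):
    """Illustrative only: weights re-normed so that 0 is always the best score."""
    if target in preferred_targets:
        return 0  # an explicitly preferred target is the best choice
    # A node with package-specific preferences penalizes every other
    # target heavily, instead of penalizing all nodes across the DAG.
    return 100 if has_package_prefs else generic_weight
```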
Brian Van Essen
d8a5638d9f Fixed the range on Catch2 to avoid v3.x (#31242) 2022-06-23 16:10:27 -07:00
Luke Diorio-Toth
158f6ddb14 py-instrain: adding new package (#31244)
* py-instrain: adding new package

* py-instrain: fixed style

* removed unnecessary setuptools and numba version restrictions

* fixed pandas version
2022-06-23 22:58:45 +00:00
haralmha
86d1a6f2fc hadoop-xrootd: Add clang conflict (#31215)
* hadoop-xrootd: Add clang conflict

* Flake8

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2022-06-23 16:45:39 -06:00
Mikael Simberg
ca10ff1649 Allow using mimalloc 2 in pika and HPX (#31260) 2022-06-23 15:33:03 -07:00
Adam J. Stewart
19175b9bab keyutils: add macOS conflict (#31236) 2022-06-23 16:13:37 -06:00
Richard Berger
7d64a8e81f add charliecloud 0.27 and fix master install (#31247)
* charliecloud: custom autoreconf and require py-pip and py-wheel for master

* charliecloud: add 0.27

* charliecloud: add minimum py-pip version
2022-06-23 15:03:03 -07:00
Adam J. Stewart
a0b68a0baa rpcsvc-proto: add v1.4.3 (#31253) 2022-06-23 14:59:55 -07:00
Harmen Stoppels
b7e0692dfa libfuse: 3.11.0 (#31259) 2022-06-23 14:55:47 -07:00
Maciej Wójcik
8cc62ef916 Add some GROMACS versions (#31264)
* gromacs: add recent version

* gromacs-swaxs: add recent versions

* gromacs-chain-coordinate: add recent version
2022-06-23 14:53:32 -07:00
Stephen Sachs
2363aba710 [lammps] Rename master -> develop branch (#31263)
https://github.com/lammps/lammps/pull/3311

Co-authored-by: Stephen Sachs <stesachs@amazon.com>
2022-06-23 14:40:55 -07:00
Weston Ortiz
5a0c56229c goma: update package with new version and options (#31267) 2022-06-23 14:10:14 -07:00
AMD Toolchain Support
96bb72e412 Updated sha256 value for amdlibflame (#31269) 2022-06-23 14:07:09 -07:00
Adam J. Stewart
7c977733f1 py-pandas: add v1.4.3 (#31270) 2022-06-23 14:05:49 -07:00
Brian Van Essen
d48a760837 Add the rocm tag to packages from the Radeon Open Compute Platform. (#31187)
* Add the rocm tag to packages from the Radeon Open Compute Platform.

* Added more packages from the ROCm software distribution based on
reviewer feedback.

* Updated tags based on reviewer feedback
2022-06-23 09:10:01 -07:00
Brian Van Essen
c64298a183 Enable MVAPICH2 to use the mpichversion executable to detect externals (#31245)
This is an alternative to mpiname.
2022-06-23 07:41:25 +02:00
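For context, a minimal sketch of how a Spack package advertises an executable for `spack external find`; the regex and version-parsing details here are assumptions, not the exact MVAPICH2 code:

```python
import re

from spack.package import *
from spack.util.executable import Executable


class Mvapich2(AutotoolsPackage):
    # Look for mpichversion as well as mpiname when detecting externals
    executables = ["^mpichversion$", "^mpiname$"]

    @classmethod
    def determine_version(cls, exe):
        # Run the detected binary and parse the version from its output
        output = Executable(exe)(output=str, error=str)
        match = re.search(r"MVAPICH2 Version:\s*(\S+)", output)
        return match.group(1) if match else None
```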
Ben Boeckel
31d89e80bc protobuf: add 21.1 (#31249)
Older versions have compile errors under newer compilers (e.g., gcc-12).
2022-06-22 19:33:54 -06:00
Luke Diorio-Toth
2844f1fc45 added racon conflict with new versions of gcc (#31243) 2022-06-22 18:00:27 -07:00
Vicente Bolea
8b85f8ec58 vtk-m: change smoke test (#29113) 2022-06-22 18:58:03 -06:00
eugeneswalker
7ef52acfce relocation: x-pie-executable needs relocation (#31248) 2022-06-22 23:55:28 +00:00
Nicholas Sly
c3fecfb103 Add --disable-wrapper-runpath to openmpi when ~wrapper-rpath to avoid build time error. (#31221)
Co-authored-by: Nicholas Cameron Sly <sly1@rzalastor2.llnl.gov>
2022-06-22 16:57:55 -06:00
Adam J. Stewart
ef278c601c spack create: fix for no URL (#31239) 2022-06-22 18:45:03 +00:00
JeromeDuboisPro
5e9a8d27f7 add first version of py-pyrr package (#31211)
* add first version of py-pyrr package

* added license

* update license HEADER

* Remove preferred as there is only one version available

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Better specify dependency type

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* remove useless variable license

Co-authored-by: Jérôme Dubois <jerome.dubois@cea.fr>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-06-22 10:36:49 -07:00
Manuela Kuhn
ffc90f944c py-bids-validator: add 1.9.4 (#31230) 2022-06-22 10:35:57 -07:00
Harmen Stoppels
699c0cf9a0 squashfs-mount: new package (#31225) 2022-06-22 14:31:20 +02:00
Massimiliano Culpo
55cac3b098 Remove update functions used to ease the transition in v0.16 (#31216) 2022-06-22 11:15:57 +02:00
Tamara Dahlgren
35d07a6f18 Add missing '?full_index=1' to meson 9850.patch URL (#31222)
* Add missing '?full_index=1' to meson 9850.patch URL

* Correct checksum

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-06-22 08:56:49 +02:00
Adam J. Stewart
1f42a5e598 py-geopandas: add v0.11.0 (#31218) 2022-06-21 18:14:04 -07:00
Adam J. Stewart
967463432b py-shapely: env vars always needed (#31217) 2022-06-21 18:11:36 -07:00
Howard Pritchard
bc03b5caaf meson: fix to recognize intel oneapi compiler (#30605)
Signed-off-by: Howard Pritchard <howardp@lanl.gov>
2022-06-21 10:47:46 -07:00
Massimiliano Culpo
649760dc1a Canonicalize positional argument to spack bootstrap mirror (#31180)
fixes #31139
2022-06-21 19:25:01 +02:00
Adam J. Stewart
77118427b1 grep: add +pcre variant (#31204) 2022-06-21 10:06:42 -07:00
haralmha
1a024963a2 frontier-client: Add openssl and zlib directories to search path (#31212) 2022-06-21 09:38:24 -07:00
Massimiliano Culpo
aebb601b70 Stricter compatibility rules for OS and compiler when reusing specs (#31170)
* Stricter compatibility rules for OS and compiler when reusing specs
* Add unit test
2022-06-20 17:32:13 -07:00
Robert Cohn
d95f53e1ac New (external-only) package: oneapi-igc (#31045) 2022-06-20 13:38:41 -07:00
Christian Goll
0fe57962b3 OpenSUSE Tumbleweed: use GLIBC version as distro version (#19895)
Tumbleweed is a rolling release and would otherwise have used a date
as its version.
2022-06-20 22:20:31 +02:00
Massimiliano Culpo
8ec451aa7d setuptools: add v62.4.0 (#31112) 2022-06-20 08:59:58 -07:00
Stephen Sachs
14349cb882 lammps: add lammps_sizes variant (#31182)
This is translated to CMake's LAMMPS_SIZES variable, which is used to select integer sizes.

Co-authored-by: Stephen Sachs <stesachs@amazon.com>
2022-06-20 09:39:03 +00:00
Pariksheet Nanda
bc438ed4e9 ome-files-cpp: add v0.6.0 (#31156) 2022-06-20 10:16:21 +02:00
Sam Broderick
760a12c440 Fix request for bzip2, since bzip was pulled due to patent issues (#31198) 2022-06-20 08:13:20 +00:00
Adam J. Stewart
9325c8e53f py-tensorboard-data-server: fix build on Apple M1 (#30908) 2022-06-20 10:12:32 +02:00
HELICS-bot
fd5b7d6cce helics: add v3.2.1 (#31190)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2022-06-20 09:54:57 +02:00
Cory Bloor
b19a0744f1 rocm: add master and develop to BLAS and SOLVER (#30273) 2022-06-20 09:52:47 +02:00
Sebastian Ehlert
75d0c0dff7 fpm: add v0.6.0 (#31193) 2022-06-20 09:49:21 +02:00
Adam J. Stewart
aedc41c3a0 texlive: fix and deprecate live version (#31195) 2022-06-20 09:44:36 +02:00
Valentin Churavy
1f39d6d916 Julia: add v1.7.3 (#31196) 2022-06-20 09:41:50 +02:00
Ye Luo
1bc3b0a926 quantum-espresso: Add 7.1 (#31192) 2022-06-18 23:32:28 +02:00
Ryan Marcellino
62596fb796 py-f90wrap: new version (#31174) 2022-06-17 17:09:34 -06:00
Brian Van Essen
15c35a3cff Bugfix external find --all for libraries (#31186)
* Fixed a bug in the 'external find --all' command where the call failed
to find packages by both executable and library. The bug was that the
call `path.all_packages()` incorrectly turned the variable
`packages_to_check` into a generator rather than keeping it a list.
Thus the second call to `detection.by_library` had no work to do.

* Fixed the help message for the find external and compiler commands as
well as others that used the `scopes_metavar` field to define where
the results should be stored in configuration space.  Specifically,
the fact that configuration could be added to the environment was not
mentioned in the help message.
2022-06-17 21:52:06 +00:00
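The generator pitfall described in the first bullet is easy to reproduce (hypothetical names, not the actual Spack code):

```python
all_packages = ["openmpi", "hdf5", "zlib"]

# Buggy: a generator can only be consumed once.
packages_to_check = (pkg for pkg in all_packages)
by_executable = [pkg for pkg in packages_to_check]  # drains the generator
by_library = [pkg for pkg in packages_to_check]     # always empty!

# Fix: keep a list so both detection passes see every package.
packages_to_check = list(all_packages)
```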
Brian Van Essen
249c90f909 Added the ecp and radiuss tags to the LBANN software stack packages. (#31188) 2022-06-17 21:25:50 +00:00
Kyle Gerheiser
043b2cbb7c Fix external compiler detection for MPICH and OpenMPI (#30875)
os.path.dirname was being used to compare compilers. If two compilers
are in the same directory then it will pick up the first one it encounters.

Compare the full compiler path instead.
2022-06-17 14:01:49 -06:00
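The comparison bug in a nutshell (illustrative paths):

```python
import os

mpicc = "/usr/bin/mpicc"
gcc = "/usr/bin/gcc"

# Buggy: any two compilers in the same directory compare equal.
assert os.path.dirname(mpicc) == os.path.dirname(gcc)

# Fixed: compare the full compiler paths instead.
assert mpicc != gcc
```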
Bernhard Kaindl
aedf215b90 openssh: don't suid-install ssh-keysign (not used by default) (#31083) 2022-06-17 12:09:52 -06:00
Mikhail Titov
eed4a63be7 rct: update packages (RE, RG, RP, RS, RU) with new versions (#31172) 2022-06-17 10:44:22 -07:00
Massimiliano Culpo
13b609b4b6 docs: quote string to show valid YAML (#31178)
fixes #31167
2022-06-17 18:25:52 +02:00
Chuck Atkins
85dc20cb55 Spec: Add a new virtual-customizable home attribute (#30917)
* Spec: Add a new virtual-customizable home attribute

* java: Use the new builtin home attribute

* python: Use the new builtin home attribute
2022-06-17 10:29:08 -04:00
Massimiliano Culpo
667c39987c py-setuptools: install setuptools from wheels directly (#31131)
When installing setuptools from sources in Spack, we might
get into weird failures due to the way we use pip.

In particular, for Spack it's necessary to install in a
non-isolated pip environment to allow using PYTHONPATH as a
selection method for all the build requirements of a
Python package.

This can fail when installing setuptools since there might
be a setuptools version already installed for the Python
interpreter being used, with different entry points than
the one we want to install.

Installing both pip and setuptools from wheels should
harden our installation procedure in the context of:
- Bootstrapping Python dependencies of Spack
- Using external Python packages
2022-06-17 10:42:28 +02:00
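A minimal sketch of the wheel-based installation; the wheel file name and install prefix are assumptions, not the values used by the PR:

```python
import subprocess
import sys

# Installing from a prebuilt wheel avoids importing whatever setuptools
# is already present for this interpreter during a from-source build.
subprocess.check_call([
    sys.executable, "-m", "pip", "install",
    "--no-deps", "--no-build-isolation",
    "--prefix", "/opt/spack/setuptools",       # hypothetical prefix
    "setuptools-62.4.0-py3-none-any.whl",      # hypothetical wheel
])
```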
Cyrus Harrison
f35d8c4102 vtk-h: use cmake base pkg, overhaul recipe (#28898)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2022-06-17 09:59:57 +02:00
Sreenivasa Murthy Kolam
6ac3186132 Support 'spack external find' for rocm packages (#30792) 2022-06-17 09:34:49 +02:00
Robert Cohn
466572dc14 Update Intel package descriptions (#31150) 2022-06-17 09:23:40 +02:00
Harmen Stoppels
1c0bf12e5b openssl package: default to mozilla certs (#31164)
On Cray systems that use Cray Data Virtualization Service (DVS),
symlinks across filesystems are not allowed, either due to a bug, or
because they're simply not POSIX compliant [1].

Spack's OpenSSL package defaults to `certs=system` which comes down to
symlinking `/etc/ssl` in the Spack install prefix, triggering this
problem, resulting in mysterious installation failures.

Instead of relying on system certs, we can just use
`ca-certificates-mozilla`, and if users really need system certs, then
they're probably better off marking OpenSSL entirely as external.

[1] https://github.com/glennklockwood/cray-dvs
2022-06-16 23:25:32 -06:00
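The new default, roughly; a sketch of the package directives following the description above, not a verbatim copy of the openssl recipe:

```python
from spack.package import *


class Openssl(Package):
    # Sketch: default to Mozilla's CA bundle instead of symlinking
    # /etc/ssl, which breaks on Cray DVS filesystems.
    variant(
        "certs",
        default="mozilla",
        values=("mozilla", "system", "none"),
        multi=False,
        description="Use Mozilla, system, or no certificates",
    )
    depends_on("ca-certificates-mozilla", type="build", when="certs=mozilla")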
Adam J. Stewart
bf990bc8ec py-sphinx: add v5, maintainer (#31171) 2022-06-16 20:41:31 -06:00
Massimiliano Culpo
267358a799 ASP-based solver: fix rules on version weights selection (#31153)
* ASP: sort and deduplicate version weights from installed specs

* Pick version weights according to provenance

* Add unit test
2022-06-16 14:17:40 -07:00
Peter Scheibel
392b548312 Manifest parsing: avoid irrelevant files (#31144)
* Manifest directory may not contain manifest files: exclude non-manifest files
* Manifest files use different name for rocmcc: add translation for it
2022-06-16 15:16:48 -05:00
Elizabeth Sherrock
8c3b82c140 New packages: py-ford and dependencies (#31107)
* Created package and added description

* Add py-markdown-include

* Created package

* Finished creating package

* Added py-md-environ

* Added build dependencies

* Added other deps

* Add python-markdown-math (#4)

* Created package and started to add info

* Removed unneeded global/install options

* Figured out version spec for markdown-math

* Removed type=build from unnecessary dependencies

* Removed unneeded install/global options, added version spec to dependency

* Added wscullin as interim maintainer for packages

* Fixed style issues

* Took care of trailing whitespace

* Removed comment line before imports

* Changed file charset to match other packages

* Update var/spack/repos/builtin/packages/py-ford/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-ford/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Removed test dependency after review feedback

* Added new 6.1.12 version to py-ford

Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-06-16 12:14:40 -07:00
Qian Jianhua
2e4b533420 rpcsvc-proto: add 'cpp' path for rpcgen (#31157) 2022-06-16 18:24:28 +00:00
Mark Abraham
3b393fe0eb Add LevelZero variant to hwloc package (#31161)
* Add LevelZero variant to hwloc package

This permits the hwloc package to build with with support for the
Intel Level Zero low-level layer, analogous to CUDA, ROCm, and OpenCL.

* Fix typo
2022-06-16 12:21:36 -06:00
Massimiliano Culpo
895ceeda38 concretize.lp: impose a lower bound on the number of version facts if a solution exists (#31142)
* concretize.lp: impose a lower bound on the number of version facts if a valid version exists

fixes #30864

* Add a unit test
2022-06-16 09:25:56 -07:00
Satish Balay
6f1e3a4ad3 xsdk-examples: use 'parallel=False' with tests - as they may fail with a large '-j'[when used with +cuda] (#31138) 2022-06-16 07:50:45 -07:00
iarspider
ce41b7457b Xerces-C: Add option to disable transcoder (#31162)
* Xerces-C: Add option to disable transcoder

* Update package.py

* Apply suggestion from review

* Flake-8
2022-06-16 10:40:03 -04:00
G-Ragghianti
4f9f56630b slate, blaspp, lapackpp: new versions (#31147)
* Adding new release for slate, blaspp, and lapackpp

* Setting release tar file hashes

* New slate release
2022-06-16 08:09:42 -04:00
Rob Falgout
e216ba1520 hypre: new version 2.25.0 (#31146) 2022-06-16 08:07:30 -04:00
Massimiliano Culpo
4fd5ee1d9d CI: test binary bootstrapping of clingo on all macOS versions (#31160) 2022-06-16 07:26:20 -04:00
Mosè Giordano
89f32d51f3 grid: add new package (#31133) 2022-06-16 09:00:09 +02:00
Manuela Kuhn
3b3be70226 r-httpuv: fix zlib dependency (#31121) 2022-06-15 20:22:58 -05:00
Chuck Atkins
95888602f2 mesa: Match rtti settings with the libllvm dependency (#31145)
* llvm-amdgpu: Update the libllvm version provided

* mesa: Match configure rtti settings to the libllvm dependency
2022-06-15 17:29:33 -06:00
Wileam Y. Phan
0205fefe0c googletest: use URLs with .tar.gz extension (#31129) 2022-06-15 18:21:44 +00:00
Tom Scogland
b261b2a5ff Allow bootstrapping to work when partial or incompatible modules exist in the module path (#31035) 2022-06-15 19:09:07 +02:00
Ryan Mast
8c58c14c3d helics: add v2.8.1 (#31134) 2022-06-15 09:27:50 +02:00
SXS Bot
f7b4d488c3 spectre: add v2022.06.14 (#31135)
Co-authored-by: sxs-bot <sxs-bot@users.noreply.github.com>
2022-06-15 09:27:10 +02:00
Sreenivasa Murthy Kolam
38a8d4d2fe ROCm support to build py-torch (#31115)
* rocm support to build py-torch

* rocm support to build py-torch
2022-06-14 15:56:11 -07:00
Robert Underwood
1ceee714db sz: smoke, unit, version updates, and more oh my (#30873)
Co-authored-by: Robert Underwood <runderwood@anl.gov>
2022-06-14 16:53:47 -06:00
Brian Van Essen
5b3e4ba3f8 Fixed bad hashes for recent versions and added v8.4.0 (#31038) 2022-06-14 22:05:44 +02:00
snehring
37426e41cb wget: fixing compilation issue with gcc11+ on rhel7 (#30858) 2022-06-14 16:03:02 +00:00
Sergey Kosukhin
a05e729e1b mpich: remove libtool patches (#30769) 2022-06-14 16:29:22 +02:00
Mikael Simberg
f8a6799e67 apex: fix compilation with binutils (#30638) 2022-06-14 16:04:53 +02:00
Maciej Wójcik
633ebd149c dbcsr: add versions up to v2.2.0 (#30891)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2022-06-14 15:08:13 +02:00
G-Ragghianti
bf6220821b Make the IntelPackage fail successfully (#30773)
Co-authored-by: Robert Cohn <rscohn2@gmail.com>
2022-06-14 15:06:06 +02:00
Sebastian Ehlert
3db96aa892 simple-dftd3: add package and dependencies (#31046)
- add simple-dftd3 (version 0.6.0, 0.5.1)
- add mctc-lib (version 0.3.0)
- add toml-f (version 0.2.3)
- update dftd4 (version 3.4.0, use mctc-lib as actual dependency)
- update json-fortran (version 8.*, make static library PIC)
2022-06-14 15:05:01 +02:00
Jim Galarowicz
c7628d768a survey: add v1.0.5 (#31053) 2022-06-14 14:56:51 +02:00
haralmha
e1ad926189 vectorclass-version2: move files to include folder (#31105) 2022-06-14 14:54:16 +02:00
dlkuehn
8705735c74 gemma: add new package. (#31078)
Co-authored-by: David Kuehn <las_dkuehn@iastate.edu>
2022-06-14 12:53:14 +00:00
Hadrien G
d82fa045d5 acts: add v19.2 (#31108) 2022-06-14 12:45:03 +00:00
Richard Berger
c72d51cb3a legion: add HIP support (#31080) 2022-06-14 13:49:33 +02:00
Qian Jianhua
d76f184430 hybrid-lambda: fix build tests (#30214) 2022-06-14 13:46:23 +02:00
Seth R. Johnson
19ea24d2bd hdf5: add conflict for broken @1.8.22+fortran+mpi (#31111) 2022-06-14 13:42:48 +02:00
Bernhard Kaindl
83efea32f4 openssh: enable authentication via Kerberos through GSSAPI (#31086)
Add variant +gssapi to enable authentication via Kerberos through GSSAPI

When openssh is installed at sites using Kerberos, openssh needs to auth
via Kerberos through GSSAPI in order to work in such environments.
2022-06-14 13:41:32 +02:00
Mark Olesen
bc577f2dee openfoam: update v2112 to latest patch (#31113)
Co-authored-by: Mark Olesen <Mark.Olesen@esi-group.com>
2022-06-13 21:21:42 -06:00
Jean Luca Bez
04529fbe80 Update HDF5 VOL async (#31011)
* Update h5bench maintainers and versions

* Include version 1.1 for h5bench

* Correct release hash and set default version

* Update .tar.gz version

* Update HDF5 VOL async version and environment variable syntax
2022-06-13 21:58:16 -05:00
Wouter Deconinck
cdfbe2c25d root: new version 6.26.04 (#31116)
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2022-06-13 17:45:37 -07:00
Manuela Kuhn
8305742d75 exempi: add 2.6.1 and fix boost dependency (#31120) 2022-06-13 15:20:31 -07:00
Greg Sjaardema
e0137b1566 SEACAS: Clean up, add new versions (#31119) 2022-06-13 16:22:43 -04:00
iarspider
be95699a55 Add checksum for pacparser 1.4.0 (#31110) 2022-06-13 14:17:33 -06:00
Massimiliano Culpo
7d96a0aa5a Pin setuptools version in Github Action Workflows (#31118)
fixes #31109
2022-06-13 21:09:16 +02:00
Qian Jianhua
874b713edf netpbm: run custom test command (#31072)
* netpbm: run custom test command

* fix styles

* change method name
2022-06-13 11:26:54 -07:00
iarspider
3c0a98c5ab sloccount: add build-time dependency on flex (#31106) 2022-06-13 09:39:44 -07:00
Elsa Gonsiorowski, PhD
998bf90b35 SCR release v3.0, including components (#29924) 2022-06-13 18:17:17 +02:00
liuyangzhuan
2ca32fbc8c Remove filter operation for superlu-dist to avoid permission issues on read-only systems (#30970)
* added package gptune with all its dependencies: adding py-autotune, pygmo, py-pyaml, py-autotune, py-gpy, py-lhsmdu, py-hpbandster, pagmo2, py-opentuner; modifying superlu-dist, py-scikit-optimize

* adding gptune package

* minor fix for macos spack test

* update patch for py-scikit-optimize; update test files for gptune

* fixing gptune package style error

* fixing unit tests

* a few changes reviewed in the PR

* improved gptune package.py with a few newly added/improved dependencies

* fixed a few style errors

* minor fix on package name py-pyro4

* fixing more style errors

* Update var/spack/repos/builtin/packages/py-scikit-optimize/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* resolved a few issues in the PR

* fixing file permissions

* a few minor changes

* style correction

* minor correction to jq package file

* Update var/spack/repos/builtin/packages/py-pyro4/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* fixing a few issues in the PR

* adding py-selectors34 required by py-pyro4

* improved the superlu-dist package

* improved the superlu-dist package

* moree changes to gptune and py-selectors34 based on the PR

* Update var/spack/repos/builtin/packages/py-selectors34/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* improved gptune package: 1. addressing comments of tldahlgren in PR 26936; 2. adding variant openmpi

* fixing style issue of gptune

* changing file mode

* improved gptune package: add variant mpispawn which depends on openmpi; add variant superlu and hypre for installing the drivers; modified hypre package file to add a gptune variant

* fixing style error

* corrected pddrive_spawn path in gptune test; enforcing gcc>7

* fixing style error

* setting environment variables when loading gptune

* removing debug print in hypre/package.py

* adding superlu-dist v7.2.0; fixing an issue with CMAKE_INSTALL_LIBDIR

* changing site_packages_dir to python_platlib

* not using python3.9 for py-gpy, which causes failures due to dropped support of tp_print

* more replacement of site_packages_dir

* fixing a few dependencies in gptune; added a gptune version

* adding url for gptune

* minor correction of gptune

* updating versions in butterflypack

* added a version for openturns

* added a url for openturns

* minor fix for openturns

* Update var/spack/repos/builtin/packages/openturns/package.py

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>

* added version2.1.1 for butterflypack

* fixing a tag in butterflypack

* minor fix for superlu-dist

* removed file filter for superlu-dist

* Update var/spack/repos/builtin/packages/superlu-dist/package.py

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>

* superlu-dist: add version detection for the testing; removing unused lines

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2022-06-13 08:48:55 -07:00
Jordan Galby
e50d08ce48 Fix spack style arbitrary git rev as base (#31019)
Allow `spack style -b @` or `spack style -b origin/develop` to work as
expected.

Regression since spack 0.17 #25085
2022-06-13 15:22:35 +00:00
Sergey Kosukhin
696d81513d nvhpc: conflict with all compilers except for gcc (#31073) 2022-06-13 16:47:28 +02:00
Mark Grondona
b38afa7528 Add flux-core v0.39.0 and v0.40.0 and flux-sched v0.23.0 (#31061) 2022-06-13 16:15:25 +02:00
Mikael Simberg
0f2786c9d3 Fix tracy-client package (#31076) 2022-06-13 16:11:00 +02:00
Carson Woods
f72c2ab583 Fix download URL for pcre package (#31077) 2022-06-13 16:05:25 +02:00
Adam J. Stewart
f6e6403fd1 Fix packages that will break with PIL 10 (#31071) 2022-06-13 16:04:13 +02:00
Simon Pintarelli
5550271ead atompaw: add v4.2.0.1 (#31087)
apply the ifort patch up to and including v4.2.0.0, but no longer for v4.2.0.1
2022-06-13 15:15:59 +02:00
Jack Morrison
a1b19345b1 opa-psm2: add v11.2.228 (#31089) 2022-06-13 14:58:25 +02:00
Erik Schnetter
e2aeb06c91 adios2: add v2.8.1 (#31090) 2022-06-13 14:55:09 +02:00
Brice Videau
14ae0e0d94 py-py-spy: fix dependency on libunwind (#31013)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2022-06-13 14:16:26 +02:00
Seth R. Johnson
516587a1da py-breathe: new version 4.33.1 and fix dependencies (#31098)
* py-breathe: new version 4.33.1 and fix dependencies

* remove unused dependencies
2022-06-12 18:06:39 -05:00
Paul R. C. Kent
dc36fd87bb Addllvmv1405 (#31099) 2022-06-12 11:24:31 -07:00
marcosmazz
06b5141d01 Limit Jasper version for WPS (#31088)
* Limit Jasper version for WPS

Jasper@3: changed its API and does not support the interfaces used by WPS.
This prevented the ungrib.exe build and failed silently.
2022-06-12 19:02:07 +02:00
Seth R. Johnson
27610838dd py-sphinxcontrib-bibtex: new version 2.4.2 (#31097) 2022-06-12 09:24:46 -07:00
Danny McClanahan
0c7fd9bd8c fix doubly shell quoting args to spack spec (#29282)
* add test to verify fix works
* fix spec cflags/variants parsing test (breaking change)
* fix `spack spec` arg quoting issue
* add error report for deprecated cflags coalescing
* use .group(n) vs subscript regex group extraction for 3.5 compat
* add random test for untested functionality to pass codecov
* fix new test failure since rebase
2022-06-11 12:35:46 -07:00
Todd Gamblin
bf2b30a5f5 bugfix: preserve dict order for Spec.dag_hash() in Python 2 (#31092)
Fix a bug introduced in #21720. `spack_json.dump()` calls `_strify()` on dictionaries to
convert `unicode` to `str`, but it constructs `dict` objects instead of
`collections.OrderedDict` objects, so in Python 2 (or earlier versions of 3) it can
scramble dictionary order.

This can cause hashes to differ between Python 2 and Python 3, or between Python 3.7
and earlier Python 3's.

- [x] use `OrderedDict` in `_strify`
- [x] add a regression test
2022-06-10 20:32:35 -07:00
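The gist of the fix (a sketch, not the exact `_strify` implementation):

```python
from collections import OrderedDict


def _strify(data):
    """Convert unicode to str recursively, preserving dict order."""
    if isinstance(data, dict):
        # OrderedDict keeps insertion order on Python 2 and Python < 3.7,
        # so hashes of the dumped JSON stay consistent across versions.
        return OrderedDict((_strify(k), _strify(v)) for k, v in data.items())
    if isinstance(data, (list, tuple)):
        return [_strify(item) for item in data]
    # Only coerce text; leave numbers and other scalars alone.
    return str(data) if isinstance(data, bytes) else data
```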
Asher Mancinelli
f42680b785 ripgrep: remove need for llvm (#31062) 2022-06-10 11:45:53 -07:00
Sergey Kosukhin
a2afd5b82f clingo: fix string formatting in error messages (#31084) 2022-06-10 16:24:27 +00:00
Cyrus Harrison
11f1b371f7 babelflow: fix conditional constraint in "when" decorator (#31057) 2022-06-10 18:07:22 +02:00
Valentin Churavy
cfc46504ac enzyme: new version 0.0.32 (#31081)
Add LLDEnzyme to libs list
2022-06-10 11:48:35 +02:00
dependabot[bot]
163251aa65 build(deps): bump actions/setup-python from 3.1.2 to 4 (#31059)
Bumps [actions/setup-python](https://github.com/actions/setup-python) from 3.1.2 to 4.
- [Release notes](https://github.com/actions/setup-python/releases)
- [Commits](98f2ad02fd...d09bd5e600)

---
updated-dependencies:
- dependency-name: actions/setup-python
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-06-10 11:03:36 +02:00
eugeneswalker
6ab9f3a290 exago@1.4.1: : needs py-pytest (#31060) 2022-06-09 16:55:04 -07:00
Gregory Lee
45043bcdf5 py-xdot requires explicit dependency on harfbuzz (#30968) 2022-06-09 13:53:32 -06:00
Adam J. Stewart
7642fa3d99 py-kornia: add v0.6.5 (#31067) 2022-06-09 12:13:51 -07:00
Adam J. Stewart
1689f7fcbe py-pillow: add v9.1.1 (#31068) 2022-06-09 12:12:02 -07:00
Adam J. Stewart
4f67afeb5f py-pytorch-lightning: add v1.6.4 (#31069) 2022-06-09 12:09:05 -07:00
Adam J. Stewart
9d8ecffed0 py-timm: add v0.5.4 (#31070) 2022-06-09 12:06:58 -07:00
Cyrus Harrison
1ada7ea809 add fix for implicit includes with newer vers of gcc (#31056) 2022-06-09 09:40:08 -07:00
Martin Lang
4a8db00691 nfft, pnfft: fix detection of fftw variants (precision) (#30935)
* nfft, pnfft: fix detection of fftw variants (precision)

* nfft, pnfft: use fftw's selected_precisions; avoid repetitive calls to spec

Co-authored-by: Martin Lang <martin.lang@mpsd.mpg.de>
2022-06-09 08:47:12 -07:00
Tim Fuller
01f8236bf5 Allow more fine-grained control over what submodules are updated (#27293)
The "submodules" argument of the "version" directive can now accept
a callable that returns a list of submodules, in addition to the usual
Boolean values
2022-06-09 07:45:49 +02:00
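Sketch of the new capability (package, tag, URL, and submodule names are made up):

```python
from spack.package import *


def submodules(package):
    """Return only the submodules this build actually needs."""
    paths = ["ext/core"]
    if "+extras" in package.spec:
        paths.append("ext/extras")
    return paths


class MyPackage(Package):
    git = "https://example.com/mypackage.git"  # hypothetical URL

    # A callable is now accepted in addition to True/False
    version("1.2.0", tag="v1.2.0", submodules=submodules)
```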
Adam J. Stewart
57822d3014 clingo: fix ~python build (#30824) 2022-06-09 07:33:56 +02:00
Massimiliano Culpo
3fdb3f832a Cancel running workflows automatically on PR update (#31044)
* Cancel running workflows automatically on PR update
* Add the last update later to check cancellation is working
* Use github.run_number instead of github.sha
2022-06-08 15:46:46 -07:00
Asher Mancinelli
bedad508a9 lua-mpack: build with -fPIC always (#31054) 2022-06-08 21:37:55 +00:00
Adam J. Stewart
31f6dedecd py-memray: add new package (#31033)
* py-memray: add new package

* Link to Python

* Add missing non-Python deps
2022-06-08 13:52:38 -07:00
Adam J. Stewart
dc0c4959db cuDNN: add v8.4.0.27-11.6 (#31034) 2022-06-08 13:48:12 -07:00
Mark W. Krentel
8550b6cf49 hpctoolkit: adjust dependencies for develop (#31050)
The develop branch now uses libiberty, but not binutils or libdwarf.
2022-06-08 13:42:32 -07:00
Jen Herting
0e5c4c9cbf Draft: Pending #29806 [py-sentry-sdk] added version 1.5.5 (#29143)
* [py-sentry-sdk] audited dependencies and relisted extras as variants

* [py-sentry-sdk] added version 1.5.5

* [py-sentry-sdk] flake8

* [py-sentry-sdk]

- added "descriptions" to variants
- removed conflicts
  - replaced with when statements in variants
2022-06-08 20:41:44 +00:00
Jen Herting
9bac72e818 New package: py-colossalai (#30807)
* colossalai is a dependency for deepoffense

* [py-colossalai] added homepage

* [py-colossalai] added dependencies and removed older versions

* [py-colossalai] updated pypi url to match listed version

* [py-colossalai] spack -> spack.package

Co-authored-by: Sid Pendelberry <sid@rit.edu>
2022-06-08 20:34:34 +00:00
Brice Videau
a65a217c54 py-ytopt: add v0.0.4 (#31029) 2022-06-08 19:43:20 +00:00
Ryan Marcellino
26e6aae8d4 new package: py-pytest-asyncio (#31040) 2022-06-08 11:31:31 -07:00
Manuela Kuhn
d6fabd1533 py-glmsingle: add new package (#31043) 2022-06-08 10:03:18 -07:00
Massimiliano Culpo
d245c46487 bootstrap: account for disabled sources (#31042)
* bootstrap: account for disabled sources

Fix a bug introduced in #30192, which effectively skips
any prescription on disabled bootstrapping sources.

* Add unit test to avoid regression
2022-06-08 07:17:37 -06:00
Adam J. Stewart
3eba28e383 OpenCV: add v4.6.0 (#31022) 2022-06-08 14:31:55 +02:00
Adam J. Stewart
0d91cb58dc py-pympler: add v1.0.1 (#31031) 2022-06-08 14:27:56 +02:00
Adam J. Stewart
5f55abeecb Python: add v3.10.5 (#31039) 2022-06-08 14:22:51 +02:00
psakievich
c3c2416672 Nalu-Wind: allow for lowest order infinity version (#31036)
This will allow us to pin trilinos in our repo
2022-06-08 14:22:10 +02:00
Adam J. Stewart
9e2bc41e45 libtiff: add v4.4.0 (#31008) 2022-06-07 15:49:56 -06:00
Axel Huebl
8f80c5e6f7 openPMD-api: 0.14.5 (#31023)
Add the latest release.
2022-06-07 14:32:43 -07:00
Erik Schnetter
a5e1367882 meson: add version 0.62.2 (#31005) 2022-06-07 15:13:40 -06:00
Peter Scheibel
c724e26ba9 Staging: determine extensions from nonstandard URLs (#31027)
Fixes #31021

With #25185, we no longer default to using tar when we can't
determine the extension type, opting to fail instead.

This broke fetching for the pcre package, where we couldn't determine
the extension. To determine the extension, we were attempting to
extract it from the destination filename; however, this file name
may omit details of the origin URL that are required to determine the
extension, so instead we examine the URL directly.

This also updates the decompressor_for method not to set ext=None
by default: it must now always be set by the caller.
2022-06-07 14:12:08 -07:00
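Illustratively (a simplified helper, not Spack's actual implementation):

```python
import re


def extension_from_url(url):
    """Take the archive extension from the origin URL, since the staged
    file name may have dropped it."""
    match = re.search(r"\.(tar\.(gz|bz2|xz)|tgz|tbz2|zip)$", url)
    return match.group(1) if match else None  # caller must handle None
```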
Glenn Johnson
1b9a1992fb Update Spack Bioconductor packages to Bioconductor 3.15 (#30984) 2022-06-07 20:10:01 +02:00
Graeme A Stewart
815764bdef Update prmon recipe, v3.0.2 (#30979)
* Update prmon recipe, v3.0.2

Update recipe to v3.0.2, using external dependency
option for the build (as these are satisfied so easily with Spack)

* Remove "broken" 3.0.1 version

This one could not build properly with Spack, due to
missing submodule sources
2022-06-07 17:17:25 +00:00
Bitllion
adf073b53c Add New Package:PSCMC (#30944)
* add some new versions of abacus and fix a bug on mkl

* Add Package:PSCMC

* update maintainer

* rebase

* update maintainers

* Update var/spack/repos/builtin/packages/pscmc/package.py

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2022-06-07 10:11:14 -07:00
Axel Huebl
449e885d4c WarpX: 22.06 (#31012)
* WarpX: 22.06

Update `warpx` & `py-warpx` to the latest release, `22.06`.

* Patch: Fix 1D CUDA Builds
2022-06-07 09:49:35 -06:00
Brice Videau
560472ce3a Fix py-deap py-setuptools dependency (#31014) 2022-06-07 08:57:44 -06:00
Brice Videau
0c66446437 Add py-ytopt-autotune version v1.0.0 (#31015) 2022-06-07 08:49:44 -06:00
Massimiliano Culpo
6b4b1dacd9 docs: update the list of Docker images with Spack preinstalled (#31003)
Also, update the image in the docs and use ghcr.io
2022-06-07 16:43:04 +02:00
Gregory Lee
3e90134e14 json-glib depends on gobject-introspection (#30963) 2022-06-07 08:29:38 -06:00
Asher Mancinelli
4ad0594c7b fd: add new package (#31010) 2022-06-07 08:21:59 -06:00
kwryankrattiger
f13b760f10 Visit: Allow building without rendering (#31006)
This is important for HPC systems.
2022-06-07 08:21:36 -06:00
Carlos Bederián
0f6fb1a706 gmsh: +petsc requires ~int64 variant (#30969) 2022-06-07 16:21:07 +02:00
Jordan Galby
e74d85a524 Fix empty install prefix post-install sanity check (#30983) 2022-06-07 16:17:33 +02:00
Glenn Johnson
ffd63c5de1 R ecosystem: cran updates (#31004) 2022-06-07 16:12:51 +02:00
Axel Huebl
ed263615d7 AMReX: update the CMake requirements for v22.06+ (#31018) 2022-06-07 16:01:13 +02:00
Morten Kristensen
b4e775a11a py-vermin: add latest version 1.4.0 (#31009) 2022-06-07 06:09:39 -06:00
Mosè Giordano
6a2844fdee libint: add Python as build dependency (#30870)
The installation process of libint runs a Python script:
<69cc7b9bc6/export/cmake/CMakeLists.txt.export (L410)>.
If Python isn't explicitly listed as a build dependency, the system
Python will be picked up, which can cause trouble.
2022-06-07 06:01:53 -06:00
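The fix amounts to one directive in the libint package (sketch):

```python
from spack.package import *


class Libint(AutotoolsPackage):
    # Make sure Spack's Python, not the system one, runs the exported
    # CMake script at build time.
    depends_on("python", type="build")
```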
Manuela Kuhn
44d670a8ce git-annex: fix source urls (#30789) 2022-06-07 05:45:40 -06:00
Massimiliano Culpo
4a0ac87d07 archspec: bump to v0.1.4 (#30856)
Fixes compiler flags for oneapi and dpcpp
2022-06-07 08:51:34 +00:00
Michael Kuhn
2c6898c717 glib: add v2.72.2 (#31001) 2022-06-07 02:45:51 -06:00
Brice Videau
0a8083c604 Fix py-ray dependencies and build system (#31016)
* Fix py-ray dependencies and build system

* Update var/spack/repos/builtin/packages/py-ray/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Added documentation for unusual dependencies.

* Fixed npm preinstall script per @adamjstewart suggestion.

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2022-06-07 01:36:24 +00:00
John W. Parent
5b45df5269 Update decompression support on Windows (#25185)
Most package installations include compressed source files. This
adds support for common archive types on Windows:

* Add support for using system 7zip functionality to decompress .Z
  files when available (and on Windows, use 7zip for .xz archives)
* Default to using built-in Python support for tar/bz2 decompression
  (note that Python tar documentation mentions preservation of file
  permissions)
* Add tests for decompression support
* Extract logic for handling exploding archives (i.e. compressed
  archives that expand to more than one base file) into an
  exploding_archive_catch context manager in the filesystem module
2022-06-06 18:14:43 -07:00
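A rough sketch of the "exploding archive" idea (names and behavior simplified; not the actual filesystem-module code):

```python
import contextlib
import os
import shutil


@contextlib.contextmanager
def exploding_archive_catch(stage_path):
    """Extract into a scratch dir, then normalize the layout."""
    scratch = os.path.join(stage_path, ".extracting")
    os.makedirs(scratch)
    yield scratch  # caller extracts the archive into `scratch`
    entries = os.listdir(scratch)
    if len(entries) == 1:
        # Normal archive: hoist the single top-level directory.
        shutil.move(os.path.join(scratch, entries[0]),
                    os.path.join(stage_path, entries[0]))
        os.rmdir(scratch)
    else:
        # Exploded archive: wrap everything in one base directory.
        shutil.move(scratch, os.path.join(stage_path, "spack-src"))
```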
Chuck Atkins
9d7cc43673 Package: Don't warn for missing source on bundle packages without code (#30913) 2022-06-06 15:33:03 -07:00
miheer vaidya
932065beca Update package rtags (#30890)
* Update package bear

Newer versions of rtags aren't available via the older URL

* Fix sha256 as download link has changed

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2022-06-06 15:30:35 -07:00
Massimiliano Culpo
360192cbfe atompow: remove the misplaced duplicate package again (#30992)
Regression re-introduced in #30404
2022-06-06 14:59:22 -07:00
Jen Herting
7bc349c041 New package: py-numpy-quaternion (#29452)
* first build of numpy-quaternion and spherical-functions

* [py-numpy-quaternion] removed python restriction

* [py-numpy-quaternion] added scipy and numba variants

* [py-numpy-quaternion] fixing unicode

Co-authored-by: Sid Pendelberry <sid@rit.edu>
2022-06-06 14:42:33 -07:00
Jen Herting
603ec40ab1 [py-httpx] added version 0.22.0 (#29806)
* [py-httpx] python dependencies are type=('build', 'run')

* [py-httpx] py-wheel is now implied by PythonPackage

* [py-httpx] fixed older version dependencies

* [py-httpx] added version 0.22.0
2022-06-06 14:34:11 -07:00
renjithravindrankannath
ed6695b9c9 Correcting library path of hsakmt-roct for Centos7 and cleaning up the patch file. (#30885)
* Correcting library path of hsakmt-roct and cleaning up the patch file.

* Delete 007-library-path.patch
2022-06-06 13:50:41 -07:00
Adam J. Stewart
fb173f80b2 py-torchmetrics: add v0.9.0 (#31000) 2022-06-06 16:28:11 +02:00
Ye Luo
15d4262b9b quantum-espresso: no need of patch and hdf5 is optional for pw2qmcpack (#30884) 2022-06-06 15:47:10 +02:00
Daniel Arndt
7116fb1b70 Kokkos SYCL AOT flags (#30723)
* Kokkos SYCL AOT flags

* Improve selecting the Intel GPU arch
2022-06-06 15:41:13 +02:00
Adam J. Stewart
9b713fa6a6 py-black: new version/variants/maintainer (#30999) 2022-06-06 07:09:39 -06:00
Adam J. Stewart
8d4a5cb247 GCC: add Apple M1 support (#30825)
Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>
2022-06-06 14:46:43 +02:00
Filippo Spiga
bf2d44c87e nvhpc: add v22.5 (#30845) 2022-06-06 14:44:39 +02:00
Brian Van Essen
926e311f3c dihydrogen: propagate cuda_arch to nvshmem (#30886) 2022-06-06 14:38:58 +02:00
Simon Pintarelli
8ef937032c quantum-espresso: support libxc also for autotools build (#30895) 2022-06-06 14:26:07 +02:00
Erik
73ce789390 amrvis: add configuration file and default color palette (#30942) 2022-06-06 14:01:57 +02:00
Jianshen Liu
e4f3cfcc3a cppcheck: add v2.8 (#30993) 2022-06-06 13:42:42 +02:00
Erik Schnetter
7ac05485c6 cmake: New version 3.23.2 (#30994) 2022-06-05 14:01:18 +02:00
John W. Parent
13b0e73a4e Sanitize filepath from URL (#30625)
Spack's staging logic constructs a file path based on a URL. The URL
may contain characters which are not allowed in valid file paths on
the system (e.g. Windows prohibits ':' and '?' among others). This
commit adds a function to remove such offending characters (but
otherwise preserves the URL string when constructing a file path).
2022-06-04 15:06:43 -07:00
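In essence (illustrative character set and replacement, not the exact function added):

```python
import re


def sanitize_filename(name):
    """Drop characters that are invalid in Windows file names while
    leaving the rest of the URL-derived name intact."""
    return re.sub(r'[<>:"|?*]', "", name)
```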
Tom Scogland
03c54aebdd luajit: use GC64 on darwin (#30971) 2022-06-04 14:44:45 +02:00
G-Ragghianti
f4a4b3fa87 Intel oneAPI DPCPP compatibility toolkit: add new package (#30986) 2022-06-04 14:31:41 +02:00
Adam J. Stewart
6b3607287a py-botorch: add new package (#30989) 2022-06-04 14:29:33 +02:00
Carlos Bederián
b4b2585d67 protobuf: patch build error when @3.20 %gcc@12.1.0 (#30973) 2022-06-03 21:32:13 +02:00
Satish Balay
29855ae31e petsc, py-petsc4py: add 3.17.2 (#30982) 2022-06-03 09:39:52 -07:00
Michael Kuhn
44b9efa132 rocksdb: add 7.2.2 (#30977) 2022-06-03 08:56:28 -07:00
Michael Kuhn
f21e26b904 doxygen: add 1.9.4 (#30975) 2022-06-03 08:54:34 -07:00
Jacob Merson
73865c38f9 Include new versions of model-traits package (#30972) 2022-06-03 08:51:43 -07:00
Martin Diehl
38ccefbe84 mpicxx does not work with icpx (#30098) 2022-06-03 08:40:02 -07:00
Adam J. Stewart
1bd33d88bd py-protobuf: pin dependents to @:3 (#30880) 2022-06-03 00:32:02 -04:00
Seth R. Johnson
67ad23cc11 geant4: enable building against spack-built geant outside of spack (#30936) 2022-06-02 21:09:41 -06:00
Adam J. Stewart
8640b50258 Rust: Apple M1 support (#30823) 2022-06-02 20:57:40 -04:00
Adam J. Stewart
043cc688ef py-torch: fix Apple M1 support (#30833) 2022-06-02 20:51:18 -04:00
Adam J. Stewart
e52527029a OpenJDK: fix Apple M1 support (#30826) 2022-06-02 20:48:55 -04:00
Alex Hedges
a7d2f76ac5 git: add v2.36.1 (#30966) 2022-06-02 13:21:37 -07:00
Derek Ryan Strong
fbb134b1af Add lftp 4.9.2 (#30967) 2022-06-02 19:34:46 +00:00
Bitllion
548e9ae88c Abacus (#30933)
* add pacakge: abacus

* rename

* fix some style bugs

* update package:abacus

* fix some style bugs

* format code

* Delete a line of useless comments

* update abacus

* Update package.py

add new version info and fix a bug on mkl

* update version info and fix a bug on mkl

* Update package.py

* fix style bugs

* trailing whitespace
2022-06-02 12:09:53 -07:00
Adam J. Stewart
5728ba0122 Use stable URLs for patch-diff GitHub patches (#30953) 2022-06-02 12:52:05 -04:00
Adam J. Stewart
2bda10edb5 Fix typo in GitHub Actions step name (#30924) 2022-06-02 16:33:39 +00:00
miheer vaidya
bd15ca4f16 Update bear to v3.0.0 (#30835) 2022-06-02 09:31:10 -07:00
Jakub Krajniak
f9dfd5fcb8 Add flag to enable WRF-Chem module (#30950)
* Add flag to enable WRF-Chem module

* Update var/spack/repos/builtin/packages/wrf/package.py

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>

* Set chem variant only for v4+

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2022-06-02 09:11:45 -07:00
Glenn Johnson
f3c4e1adbb texlive: add 20220321 distribution (#30955) 2022-06-02 09:05:24 -07:00
Seth R. Johnson
b2d3ed9096 pmix: add explicit hwloc dependency (#30897)
pmix@:2 uses hwloc if it's available (e.g. in homebrew), which can break
the installation.
2022-06-02 14:47:53 +02:00
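The fix, in essence (sketch):

```python
from spack.package import *


class Pmix(AutotoolsPackage):
    # Make the optional dependency explicit so pmix@:2 never picks up
    # a system hwloc (e.g. from homebrew) by accident.
    depends_on("hwloc", when="@:2")
```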
Mikael Simberg
dac5fec255 Add pika 0.5.0 (#30960) 2022-06-02 13:56:05 +02:00
Axel Huebl
9784b8f926 Prepare: openPMD-api 0.15.0-dev (#29484)
Anticipate openPMD-api changes in the next major release that are
already in `dev` (aka Spack `develop`):
- C++17 requirement
- drop: `mpark-variant` public dependency
- add: `toml11` private dependency

Also add @franzpoeschel as co-maintainer for the Spack package.
2022-06-01 19:13:33 -06:00
Axel Huebl
adef8f6ca7 WarpX: Patch no-MPI & Lib Install (#30866)
Fixes WarpX issues:
- https://github.com/ECP-WarpX/WarpX/pull/3134
- https://github.com/ECP-WarpX/WarpX/pull/3141

and uses GitHub patch URLs directly instead of storing
patch copies.
2022-06-01 18:49:25 -06:00
Larry Knox
c36f15e29e hdf5: new version version 1.10.9 (#30903) 2022-06-01 16:37:37 -06:00
Carlos Bederián
03531ed904 openmpi: add 4.1.4 (#30947) 2022-06-01 16:21:38 -06:00
Carlos Bederián
54332b2d83 python: add 3.9.13 (#30948) 2022-06-01 15:37:31 -06:00
Xavier Delaruelle
165f42b7ce environment-modules: add version 5.1.1 (#30946) 2022-06-01 12:17:48 -06:00
Sreenivasa Murthy Kolam
1190d03b0f Bump up the version for ROCm-5.1.3 release (#30819)
* Bump up the version for ROCm-5.1.3 release

* remove extra comma from hashes for device-libs of rocm-openmp-extras
2022-06-01 10:57:10 -07:00
kwryankrattiger
5faa927fe6 Ascent: Patch find conduit python (#30949)
Some systems have trouble using the Python on the login node, so this
provides an option to build without running Python.
2022-06-01 10:36:33 -07:00
Adam J. Stewart
c60d220f81 py-jupyterlab: add v3.4.2 (#30867) 2022-06-01 10:18:47 -07:00
Hartmut Kaiser
61d3d60414 Update HPX recipe for HPX V1.8.0 (#30896) 2022-06-01 11:06:03 -06:00
rashawnLK
174258c09a Updated intel-gtpin package.py for most recent version, GTPin 3.0. (#30877)
* Updated intel-gtpin package.py for the most recent version, GTPin 3.0.

* Fixed style issues in package.py -- removed trailing whitespace on two
lines.
2022-06-01 11:05:41 -06:00
Erik
73c6a8f73d Version updates for SUNDIALS and CUDA (#30874) 2022-06-01 11:01:32 -06:00
Olivier Cessenat
86dc904080 ngspice: adding version 37 (#30925) 2022-06-01 10:50:02 -06:00
Weiqun Zhang
9e1c87409d amrex: add v22.06 (#30951) 2022-06-01 10:49:46 -06:00
Sergey Kosukhin
2b30dc2e30 nag: add new version (#30927)
* nag: add new version

* nag: update maintainers
2022-06-01 10:49:32 -06:00
Ben Darwin
b1ce756d69 minc-toolkit: add version 1.9.18.2 (#30926) 2022-06-01 10:45:34 -06:00
Ida Mjelde
1194ac6985 Adding a libunwind variant to libzmq (#30932)
* Adding a libunwind variant to libzmq

* Remove whitespace line 46
2022-06-01 10:29:56 -06:00
Asher Mancinelli
954f961208 tmux: support building from master and utf8 opts (#30928)
* tmux: support building from master and utf8 opts

* Fix style errors
2022-06-01 10:29:35 -06:00
Zack Galbreath
47ac710796 CPU & memory requests for jobs that generate GitLab CI pipelines (#30940)
gitlab ci: make sure pipeline generation isn't resource starved
2022-06-01 09:43:23 -06:00
Derek Ryan Strong
d7fb5a6db4 rclone: add 1.58 (#30887)
* Add rclone 1.58

* Update rclone git repo path
2022-05-31 19:35:12 -07:00
Maciej Wójcik
e0624b9278 gromacs: Add recent releases (#30892)
* gromacs: Add recent releases

* gromacs: Update branch name

* gromacs: Update links
2022-05-31 19:27:23 -07:00
Olivier Cessenat
e86614f7b8 gmsh: adding version 4.10.3 (#30923) 2022-05-31 18:53:03 -07:00
Garth N. Wells
d166b948ce fenics-dolfinx: dependency updates (#30919)
* Add pugixml dependency

* Dependency updates

* Fix Spack NumPy version

* Test more generous NumPy constraint

* Fix NumPy requirement
2022-05-31 18:49:49 -07:00
lorddavidiii
9cc3a2942d cfitsio: add 4.1.0 (#30920) 2022-05-31 18:46:07 -07:00
Marie Houillon
5d685f9ff6 New version for openCARP packages (#30931)
Co-authored-by: openCARP consortium <info@opencarp.org>
2022-05-31 18:21:55 -07:00
Paul Kuberry
9ddf45964d xyce: add sha for version 7.5.0 (#30941) 2022-05-31 17:59:13 -07:00
iarspider
b88cc77f16 xpmem package: add patches for building on FC 35 with kernel 5.16.18-200 (#29945) 2022-05-31 15:55:51 -07:00
Robert Cohn
f3af38ba9b Fix module support for oneapi compilers (#28901)
Updates to improve Spack-generated modules for Intel oneAPI compilers:

* intel-oneapi-compilers set CC etc.
* Add a new package intel-oneapi-compilers-classic which can be used to
  generate a module which sets CC etc. to older compilers (e.g. icc)
* lmod module logic now updated to treat the intel-oneapi-compilers*
  packages as compilers
2022-05-31 15:02:25 -07:00
Wouter Deconinck
adc9f887ea acts-dd4hep: new package; acts: new version (#30850)
* acts-dd4hep: new package, separated from new acts@19.1.0

* acts-dd4hep: improved versioning

* acts-dd4hep: don't use curl | sha256sum

* acts: new variant `odd` for Open Data Detector

* acts-dd4hep: style changes
2022-05-31 11:30:36 -07:00
Wouter Deconinck
9461f482d9 assimp: new version 5.2.4 (#30929) 2022-05-31 12:25:24 -06:00
Paul Kuberry
e014b889c6 xyce: remove python packages as +pymi dependencies and hdf5 from trilinos dependency (#30938) 2022-05-31 14:04:57 -04:00
snehring
181ac574bb sentieon-genomics: adding version 202112.04 (#30876) 2022-05-31 10:05:27 -06:00
Adam J. Stewart
055c9d125d CUDA: make cuda_arch sticky (#30910) 2022-05-30 12:53:15 -07:00
Evan Bollig
a94438b1f5 Added AWS-AHUG alinux2 pipeline (#24601)
Add spack stacks targeted at Spack + AWS + ARM HPC User Group hackathon.  Includes
a list of miniapps and full-apps that are ready to run on both x86_64 and aarch64.

Co-authored-by: Scott Wittenburg <scott.wittenburg@kitware.com>
2022-05-30 10:26:39 -06:00
Joseph Wang
f583e471b8 pass CC variable to make (#30912)
Set CC to cc
2022-05-30 08:44:31 +02:00
Brian Van Essen
f67f3b1796 Add new versions of protobuf and py-protobuf (#30503)
* Add new versions

* Updated the hashes to match the published pypi.org hashes.  Added version constraints for Python.
2022-05-30 01:23:26 -05:00
Jean Luca Bez
77c86c759c HDF5 VOL-ASYNC update versions (#30900) 2022-05-29 17:49:52 -04:00
Adam J. Stewart
8084259bd3 protobuf: fix spack versions (#30879) 2022-05-28 14:53:24 -06:00
Evan Bollig
98860c6a5f Alinux isc buildcache (#30462)
Add two new stacks targeted at x86_64 and arm, representing an initial list of packages 
used by current and planned AWS Workshops, and built in conjunction with the ISC22
announcement of the spack public binary cache.

Co-authored-by: Scott Wittenburg <scott.wittenburg@kitware.com>
2022-05-28 11:32:53 -06:00
Todd Gamblin
e6929b9ff9 0.18.0.dev0 -> 0.19.0.dev0 (#30907) 2022-05-28 17:23:01 +00:00
Tom Scogland
18c2f1a57a refactor: packages import spack.package explicitly (#30404)
Explicitly import package utilities in all packages, and corresponding fallout.

This includes:

* rename `spack.package` to `spack.package_base`
* rename `spack.pkgkit` to `spack.package`
* update all packages in builtin, builtin_mock and tutorials to include `from spack.package import *`
* update spack style
  * ensure packages include the import
  * automatically add the new import and remove any/all imports of `spack` and `spack.pkgkit`
    from packages when using `--fix`
  * add support for type-checking packages with mypy when SPACK_MYPY_CHECK_PACKAGES
    is set in the environment
* fix all type checking errors in packages in spack upstream
* update spack create to include the new imports
* update spack repo to inject the new import, injection persists to allow for a deprecation period

Original message below:
 
As requested @adamjstewart, update all packages to use pkgkit.  I ended up using isort to do this,
so repro is easy:

```console
$ isort -a 'from spack.pkgkit import *' --rm 'spack' ./var/spack/repos/builtin/packages/*/package.py
$ spack style --fix
```

There were several line spacing fixups caused either by space manipulation in isort or by packages
that haven't been touched since we added requirements, but there are no functional changes in here.

* [x] add config to isort to make sure this is maintained going forward
2022-05-28 12:55:44 -04:00
Todd Gamblin
3054cd0eff update changelog for v0.18.0 (#30905) 2022-05-28 17:33:20 +02:00
JDBetteridge
9016b79270 Additional BLAS/LAPACK library configuration for Numpy (#30817)
* Add amdblis and amdlibflame as BLAS/LAPACK options

* Add Cray-libsci as BLAS/LAPACK option

* Use Netlib config for Cray-libsci
2022-05-28 03:33:31 -06:00
Erik Schnetter
9f5c6fb398 hpx: New version 1.8.0 (#30848) 2022-05-28 09:40:20 +02:00
Greg Becker
19087c9d35 target optimization: re-norm optimization scale so that 0 is best. (#29926)
Preferred targets are currently the only minimization criteria for Spack for which we allow
negative values. That means Spack may be incentivized to add nodes to the DAG if they
match the preferred target.

This PR re-norms the minimization criteria so that preferred targets are weighted from 0,
and default target weights are offset by the number of preferred targets per-package to
calculate node_target_weight.

Also fixes a bug in the test for preferred targets that was making the test easier to pass
than it should be.
2022-05-27 22:49:41 -07:00
Greg Becker
4116b04368 update tutorial command for v0.18.0 and new gpg key (#30904) 2022-05-28 02:36:20 +00:00
JDBetteridge
1485931695 Ensure same BLAS/LAPACK config from Numpy used in Scipy (#30818)
* Call Numpy package's set_blas_lapack() and setup_build_environment() in Scipy package

* Remove broken link from comment

* Use .package attribute of spec to avoid import
2022-05-27 10:46:21 -07:00
Derek Ryan Strong
78cac4d840 Add R 4.2.0 (#30859) 2022-05-27 08:24:28 -05:00
Michael Kuhn
2f628c3a97 gcc: add 9.5.0 (#30893) 2022-05-27 12:18:57 +02:00
Adam J. Stewart
a3a8710cbe Python: fix clingo bootstrapping on Apple M1 (#30834)
This PR fixes several issues I noticed while trying to get Spack working on Apple M1.

- [x] `build_environment.py` attempts to add `spec['foo'].libs` and `spec['foo'].headers` to our compiler wrappers for all dependencies using a try-except that ignores `NoLibrariesError` and `NoHeadersError` respectively. However, The `libs` and `headers` attributes of the Python package were erroneously using `RuntimeError` instead.
- [x] `spack external find python` (used during bootstrapping) currently has no way to determine whether or not an installation is `+shared`, so previously we would only search for static Python libs. However, most distributions including XCode/Conda/Intel ship shared Python libs. I updated `libs` to search for both shared and static (order based on variant) as a fallback.
- [x] The `headers` attribute was recursively searching in `prefix.include` for `pyconfig.h`, but this could lead to non-deterministic behavior if multiple versions of Python are installed and `pyconfig.h` files exist in multiple `<prefix>/include/pythonX.Y` locations. It's safer to search in `sysconfig.get_path('include')` instead.
- [x] The Python installation that comes with XCode is broken, and `sysconfig.get_paths` is hard-coded to return specific directories. This meant that our logic for `platlib`/`purelib`/`include` where we replace `platbase`/`base`/`installed_base` with `prefix` wasn't working and the `mkdirp` in `setup_dependent_package` was trying to create a directory in root, giving permissions issues. Even if you commented out those `mkdirp` calls, Spack would add the wrong directories to `PYTHONPATH`. Added a fallback hard-coded to `lib/pythonX.Y/site-packages` if sysconfig is broken (this is what distutils always did).
2022-05-27 03:18:20 -07:00
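The shared/static fallback in the second bullet might look roughly like this (a sketch under the description above, not the package's exact `libs` property):

```python
from llnl.util.filesystem import find_libraries
from spack.error import NoLibrariesError


def python_libs(prefix, want_shared=True):
    # Try the library type matching the +shared variant first, then fall
    # back to the other, so shared-only externals (XCode, Conda, Intel)
    # are still detected during bootstrapping.
    for shared in (want_shared, not want_shared):
        libs = find_libraries("libpython*", root=prefix,
                              shared=shared, recursive=True)
        if libs:
            return libs
    raise NoLibrariesError("Unable to locate libpython in " + prefix)
```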
Paul R. C. Kent
0bf3a9c2af llvm: 14.0.3 and 14.0.4 (#30888) 2022-05-27 09:51:54 +02:00
Severin Strobl
cff955f7bd otf2/scorep: add versions 3.0/7.1 (#28631) 2022-05-27 00:43:58 +02:00
Scott Wittenburg
3d43ebec72 Revert "strip -Werror: all specific or none (#30284)" (#30878)
This reverts commit 330832c22c.
2022-05-26 14:17:01 -07:00
Robert Pavel
6fd07479e3 Updated mfem constraints in laghos spackage (#30851)
Updated mfem constraints in the laghos spackage to better match comments and
support legacy builds of `laghos@1.0:2.0`
2022-05-26 10:20:42 -07:00
Simon Pintarelli
03bc36f8b0 q-e-sirius: remove ~apps constraint (#30857) 2022-05-26 10:18:37 -07:00
Brian Van Essen
93e1b283b7 Added hash for new versions (#30860) 2022-05-26 10:15:59 -07:00
Derek Ryan Strong
df2c0fbfbd Add new versions of GNU parallel (#30862) 2022-05-26 10:09:43 -07:00
Derek Ryan Strong
54a69587c3 Add newer nano versions (#30865) 2022-05-26 10:04:50 -07:00
Hans Johansen
294312f02b Adding new package bricks for x86, cuda (#30863)
* Adding new package bricks for x86, cuda

* Fixed complaints from "spack style" that CI found

* add license comment at top

Co-authored-by: drhansj <drhansj@berkeley.edu>
Co-authored-by: eugeneswalker <38933153+eugeneswalker@users.noreply.github.com>
2022-05-26 07:40:11 -07:00
Massimiliano Culpo
0636fdbfef Remove the warning that Spack prints at each spec (#30872)
Add instead a warning box in the documentation
2022-05-26 14:35:20 +00:00
Scott Wittenburg
85e13260cf ci: Support secure binary signing on protected pipelines (#30753)
This PR supports the creation of securely signed binaries built from spack
develop as well as release branches and tags. Specifically:

- remove internal pr mirror url generation logic in favor of buildcache destination
on command line
    - with a single mirror url specified in the spack.yaml, this makes it clearer where 
    binaries from various pipelines are pushed
- designate some tags as reserved: ['public', 'protected', 'notary']
    - these tags are stripped from all jobs by default and provisioned internally
    based on pipeline type
- update gitlab ci yaml to include pipelines on more protected branches than just
develop (so include releases and tags)
    - binaries from all protected pipelines are pushed into mirrors including the
    branch name so releases, tags, and develop binaries are kept separate
- update rebuild jobs running on protected pipelines to run on special runners
provisioned with an intermediate signing key
    - protected rebuild jobs no longer use "SPACK_SIGNING_KEY" env var to
    obtain signing key (in fact, final signing key is nowhere available to rebuild jobs)
    - these intermediate signatures are verified at the end of each pipeline by a new
    signing job to ensure binaries were produced by a protected pipeline
- optionallly schedule a signing/notary job at the end of the pipeline to sign all
packges in the mirror
    - add signing-job-attributes to gitlab-ci section of spack environment to allow
    configuration
    - signing job runs on special runner (separate from protected rebuild runners)
    provisioned with public intermediate key and secret signing key
2022-05-26 08:31:22 -06:00
Adam J. Stewart
b5a519fa51 py-tensorboard: add v2.9.0 (#30832) 2022-05-26 07:43:40 -04:00
Adam J. Stewart
2e2d0b3211 libtiff: remove extra dependencies/patch (#30854) 2022-05-25 23:37:45 -06:00
Todd Gamblin
d51f949768 bugfix: do not compute package_hash for old concrete specs (#30861)
Old concrete specs were slipping through in `_assign_hash`, and `package_hash` was
attempting to recompute a package hash when we could not know the package at the time
of concretization.

Part of this was that the logic for `_assign_hash` was hard to understand -- it was
called twice from `_finalize_concretization` and had special cases for both args it
was called with. It's much easier to understand the logic here if we just inline it.

- [x] Get rid of `_assign_hash` and just integrate it with `_finalize_concretization`
- [x] Don't call `_package_hash` at all for already-concrete specs.
- [x] Add regression test.
2022-05-26 03:12:24 +00:00
Adam J. Stewart
1b955e66c1 py-numpy: add v1.22.4 (#30827) 2022-05-25 18:30:02 -06:00
Adam J. Stewart
5f8a3527e7 py-pythran: add v0.11.0 (#30829) 2022-05-25 18:29:42 -06:00
Seth R. Johnson
ec02369dba openmpi: fixes for slurm and #29449 (#30299) 2022-05-25 18:09:38 -06:00
Diego Alvarez
8ceac2ba9b Add nextflow 22.04.3 (#30855) 2022-05-25 15:46:00 -06:00
snehring
85eeed650e eagle: updating to version 1.1.3 (#30852) 2022-05-25 20:20:54 +00:00
Seth R. Johnson
14d4203722 sed: fix recursive symlink (#30849)
Use `spack-build` as the build dir to avoid a recursive link error.

```
config.status: linking /var/folders/fy/x2xtwh1n7fn0_0q2kk29xkv9vvmbqb/T/s3j/spack-stage/spack-stage-sed-4.8-wraqsot6ofzvr3vrgusx4mj4mya5xfux/spack-src/GNUmakefile to GNUmakefile
config.status: executing depfiles commands
config.status: executing po-directories commands
config.status: creating po/POTFILES
config.status: creating po/Makefile
==> sed: Executing phase: 'build'
==> [2022-05-25-14:15:51.310333] 'make' '-j8' 'V=1'
make: GNUmakefile: Too many levels of symbolic links
make: stat: GNUmakefile: Too many levels of symbolic links
make: *** No rule to make target `GNUmakefile'.  Stop.
```
2022-05-25 12:48:05 -07:00
Adam J. Stewart
1bc742c13e py-scikit-learn: add v1.1.1 (#30830) 2022-05-25 13:13:48 -06:00
fpruvost
2712ea6299 Pastix: new package (#30533) 2022-05-25 10:15:53 -07:00
Matthieu Dorier
a9c064cd7e [mochi-margo] added version 0.9.10 (#30844) 2022-05-25 10:33:45 -06:00
Ben Morgan
17fc244cba geant4: new version v11.0.2 (#30847) 2022-05-25 11:34:40 -04:00
Harmen Stoppels
334c786b52 ccache: add missing pkgconfig dep (#30846) 2022-05-25 17:28:46 +02:00
Adam J. Stewart
492541b9cb py-scipy: add v1.8.1 (#30831) 2022-05-25 08:22:10 -07:00
Erik Schnetter
369f825523 reprimand: update homepage (#30840) 2022-05-25 14:53:46 +00:00
Harmen Stoppels
aba9149b71 p7zip: fix %clang (#30843) 2022-05-25 06:24:38 -07:00
Harmen Stoppels
b29f27aec7 dsfmt: set CC=cc (#30842) 2022-05-25 15:17:58 +02:00
eugeneswalker
0176d9830d tau: add v2.31.1 (#30820) 2022-05-25 07:13:31 -06:00
Harmen Stoppels
0c9370ce72 julia: support clang, set llvm NDEBUG correctly (#30800) 2022-05-25 14:42:14 +02:00
Harmen Stoppels
3620204db6 mbedtls: add conflicts over inline asm trouble with clang@12: (#30801) 2022-05-25 14:41:49 +02:00
Jen Herting
13984a4e8d [lcms] Added version 2.13.1 and URL version (#30811)
Co-authored-by: James A Zilberman <jazrc@rit.edu>
2022-05-25 03:53:39 -06:00
Adam J. Stewart
d5c68fdc0d py-pillow-simd: mark conflicts with aarch64 (#30828) 2022-05-25 10:11:58 +02:00
Chuck Atkins
93649f6b68 silo: Fix HDF5 1.13 API breakage (#30786) 2022-05-24 22:49:37 -06:00
Derek Ryan Strong
d367f1e787 Add aria2 1.36.0 (#30822) 2022-05-24 19:37:37 -06:00
Derek Ryan Strong
1c44999192 Add rsync 3.2.4 (#30821) 2022-05-25 01:17:55 +00:00
Matthieu Dorier
ad506ac2a8 [leveldb] add patch to fix check for -Wthread-safety (#30810) 2022-05-24 16:58:35 -07:00
Jen Herting
806521b4a0 [libwebp] Added version 1.2.2 (#30814)
Co-authored-by: James A Zilberman <jazrc@rit.edu>
2022-05-24 21:41:28 +00:00
Scott Wittenburg
70824e4a5e buildcache: Update layout and signing (#30750)
This PR introduces a new build cache layout and package format, with improvements for
both efficiency and security.

## Old Format
Currently a binary package consists of a `spec.json` file at the root and a `.spack` file,
which is a `tar` archive containing a copy of the `spec.json` format, possibly a detached
signature (`.asc`) file, and a tar-gzip compressed archive containing the install tree.

```
build_cache/
  # metadata (for indexing)
  <arch>-<compiler>-<name>-<ver>-24zvipcqgg2wyjpvdq2ajy5jnm564hen.spec.json
  <arch>/
    <compiler>/
      <name>-<ver>/
        # tar archive
        <arch>-<compiler>-<name>-<ver>-24zvipcqgg2wyjpvdq2ajy5jnm564hen.spack
          # tar archive contents:
          # metadata (contains sha256 of internal .tar.gz)
          <arch>-<compiler>-<name>-<ver>-24zvipcqgg2wyjpvdq2ajy5jnm564hen.spec.json
          # signature
          <arch>-<compiler>-<name>-<ver>-24zvipcqgg2wyjpvdq2ajy5jnm564hen.spec.json.asc
          # tar.gz-compressed prefix
          <arch>-<compiler>-<name>-<ver>-24zvipcqgg2wyjpvdq2ajy5jnm564hen.tar.gz
```

After this change, the nesting has been removed so that the `.spack` file is the
compressed archive of the install tree. Now signed binary packages will take the
form of a clearsigned `spec.json` file (a `spec.json.sig`) at the root, while unsigned
binary packages will contain a `spec.json` at the root.

## New Format

```
build_cache/
  # metadata (for indexing, contains sha256 of .spack file)
  <arch>-<compiler>-<name>-<ver>-24zvipcqgg2wyjpvdq2ajy5jnm564hen.spec.json
  # clearsigned spec.json metadata
  <arch>-<compiler>-<name>-<ver>-24zvipcqgg2wyjpvdq2ajy5jnm564hen.spec.json.sig
  <arch>/
    <compiler>/
      <name>-<ver>/
        # tar.gz-compressed prefix (may support more compression formats later)
        <arch>-<compiler>-<name>-<ver>-24zvipcqgg2wyjpvdq2ajy5jnm564hen.spack
```

## Benefits
The major benefit of this change is that the signatures on binary packages can be
verified without:

1. Having to download the tarball, or
2. having to extract an unknown tarball.

(1) is an improvement in efficiency; (2) is a security fix: we now ensure that we trust the
binary before we try to run it through `tar`, which avoids potential attacks.

## Backward compatibility
Also after this change, spack should still be able to handle the previous buildcache
structure and binary mirrors with mixed layouts.
2022-05-24 17:39:20 -04:00
Jen Herting
0fe5e72744 [libdeflate] Added version 1.10 (#30813)
Co-authored-by: James A Zilberman <jazrc@rit.edu>
2022-05-24 21:35:43 +00:00
Massimiliano Culpo
ba907defca Add a command to generate a local mirror for bootstrapping (#28556)
This PR builds on #28392 by adding a convenience command to create a local mirror that can be used to bootstrap Spack. This overcomes the inconvenience of setting up the mirror manually, which has been reported when trying to set up Spack on air-gapped systems.

Using this PR the user can create a bootstrapping mirror, on a machine with internet access, by:

% spack bootstrap mirror --binary-packages /opt/bootstrap
==> Adding "clingo-bootstrap@spack+python %apple-clang target=x86_64" and dependencies to the mirror at /opt/bootstrap/local-mirror
==> Adding "gnupg@2.3: %apple-clang target=x86_64" and dependencies to the mirror at /opt/bootstrap/local-mirror
==> Adding "patchelf@0.13.1:0.13.99 %apple-clang target=x86_64" and dependencies to the mirror at /opt/bootstrap/local-mirror
==> Adding binary packages from "https://github.com/alalazo/spack-bootstrap-mirrors/releases/download/v0.1-rc.2/bootstrap-buildcache.tar.gz" to the mirror at /opt/bootstrap/local-mirror

To register the mirror on the platform where it's supposed to be used, run the following command(s):
  % spack bootstrap add --trust local-sources /opt/bootstrap/metadata/sources
  % spack bootstrap add --trust local-binaries /opt/bootstrap/metadata/binaries
The mirror has to be moved over to the air-gapped system and registered using the commands shown at the prompt. The command has options to:

1. Add pre-built binaries downloaded from Github (default is not to add them)
2. Add development dependencies for Spack (currently the Python packages needed to use spack style)

* bootstrap: refactor bootstrap.yaml to move sources metadata out

* bootstrap: allow adding/removing custom bootstrapping sources

This operation can be performed from the command line since
new subcommands have been added to `spack bootstrap`

* Add --trust argument to spack bootstrap add

* Add a command to generate a local mirror for bootstrapping

* Add a unit test for mirror creation
2022-05-24 21:33:52 +00:00
Jen Herting
87b078d1f3 [libaec] Added version 1.0.6 (#30812)
Co-authored-by: James A Zilberman <jazrc@rit.edu>
2022-05-24 21:27:53 +00:00
Daniel Arndt
54ea1f4bf6 Allow Kokkos with OpenMPTarget backend (#30724)
* Allow Kokkos with OpenMPTarget backend

* Restrict SYCL and OpenMPTarget to C++17 or higher

* Improve C++ standard check for SYCL and OpenMPTarget

* Fix indentation
2022-05-24 14:23:47 -07:00
Sergey Kosukhin
067800bc31 mpich: re-enable building of the older versions (#30766)
* mpich: enable building @3.4:3.4.3 ~cuda

* mpich: add dependency on mxm
2022-05-24 12:57:21 -07:00
snehring
0d2eae8da0 tbl2asn: adding currently available version (#30774) 2022-05-24 12:54:42 -07:00
Massimiliano Culpo
f2a81af70e Best effort co-concretization (iterative algorithm) (#28941)
Currently, environments can either be concretized fully together or fully separately. This works well for users who create environments for interoperable software and can use `concretizer:unify:true`. It does not allow environments with conflicting software to be concretized for maximal interoperability.

The primary use-case for this is facilities providing system software. Facilities provide multiple MPI implementations, but all software built against a given MPI ought to be interoperable.

This PR adds a concretization option `concretizer:unify:when_possible`. When this option is used, Spack will concretize specs in the environment separately, but will optimize for minimal differences in overlapping packages.

* Add a level of indirection to root specs

This commit introduces the "literal" atom, which comes with
a few different "arities". The unary "literal" contains an
integer that is the ID of a spec literal. Other "literals"
contain information on the requests made by that literal ID. For
instance zlib@1.2.11 generates the following facts:

literal(0,"root","zlib").
literal(0,"node","zlib").
literal(0,"node_version_satisfies","zlib","1.2.11").

This should help with solving large environments "together
where possible", since later literals can now be solved
together in batches.

* Add a mechanism to relax the number of literals being solved

* Modify spack solve to display the new criteria

Since the new criteria is above all the build criteria,
we need to modify the way we display the output.

Originally done by Greg in #27964 and cherry-picked
to this branch by the co-author of the commit.

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>

* Inject reusable specs into the solve

Instead of coupling the PyclingoDriver() object with
spack.config, inject the concrete specs that can be
reused.

A method level function takes care of reading from
the store and the buildcache.

* spack solve: show output of multi-rounds

* add tests for best-effort coconcretization

* Enforce having at least a literal being solved

Co-authored-by: Greg Becker <becker33@llnl.gov>
2022-05-24 12:13:28 -07:00
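For illustration, a minimal spack.yaml sketch opting into the new mode (the root specs are hypothetical examples of packages that cannot fully unify):

```yaml
spack:
  specs:
  - hdf5 ^mpich     # hypothetical conflicting roots
  - hdf5 ^openmpi
  concretizer:
    unify: when_possible
```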
Jen Herting
494e567fe5 New package: py-x21 (#30225)
* Py-x21 now works, needs dependencies

Conflicts:
	var/spack/repos/rit-rc/packages/py-x21/package.py

* Added dependencies to py-x21

* Making flake style check happy

* [py-x21] flake8

* [py-x21]

- added homepage
- added placeholder description
- added comment about checksums

* [py-x21] added darwin support and fixed issue with python 3.7 wheel name

* [py-x21] adding checksum hash

* [py-x21] removed duplicate py-pynacl

* [py-x21]

- updated description
- updated the version listing to have a different version for each version
  of python; versions also depend on sys.platform
- updated url_for_version to not require post-concretization information so
  that spack checksum works

* [py-x21] isort

Co-authored-by: vehrc <vehrc@rit.edu>
2022-05-24 18:12:48 +00:00
Seth R. Johnson
6a57aede57 environments: fail gracefully on missing keys (#26378) 2022-05-24 08:52:40 -07:00
edwardsp
ba701a7cf8 Update regex to correctly identify quoted args (#23494)
Previously the regex was only checking for presence of quotes as a beginning
or end character and not a matching set.  This erroneously identified the
following *single* argument as being quoted:

    source bashenvfile &> /dev/null && python3 -c "import os, json; print(json.dumps(dict(os.environ)))"
2022-05-24 08:26:07 -07:00
Matthias Wolf
557845cccc apptainer: new package (#30745) 2022-05-24 16:01:46 +02:00
iarspider
c5297523af vdt: add preload variant (#30030) 2022-05-24 12:19:09 +02:00
Evan Bollig
6883868896 libfabric has needed rdma-core for efa since 1.10.0 (#30798) 2022-05-24 04:17:47 -06:00
Satish Balay
1c5587f72d petsc: update rocrand location wrt rocm@5.1.0 (#30790)
rocm-5.1.0 removed librocrand.so from ROCM_DIR/rocrand/lib location (but  includes are still at this location)

/opt/rocm-5.0.2/lib/librocrand.so
/opt/rocm-5.0.2/rocrand/lib/librocrand.so
/opt/rocm-5.1.0/lib/librocrand.so

drwxr-xr-x 2 root root 617 Mar  8 08:20 /opt/rocm-5.0.2/rocrand/include
drwxr-xr-x 2 root root 617 Mar 31 09:48 /opt/rocm-5.1.0/rocrand/include
2022-05-24 11:54:29 +02:00
Mr-Timn
6e7eb49888 su2: add v7.3.1 (#30794) 2022-05-24 10:50:16 +02:00
Paul Wolfenbarger
3df4a32c4f trilinos: add adelus, aprepro and teuchos variants (#28935) 2022-05-24 09:49:33 +01:00
Adam J. Stewart
95b03e7bc9 gplates: add v2.3.0 (#30676) 2022-05-24 08:02:39 +02:00
Greg Becker
817ee81eaa compiler flags: imposed hashes impose the lack of additional compiler flags (#30797) 2022-05-24 01:22:29 -04:00
Tom Scogland
330832c22c strip -Werror: all specific or none (#30284)
Add a config option to strip `-Werror*` or `-Werror=*` from compile lines everywhere.

```yaml
config:
    keep_werror: false
```

By default, we strip all `-Werror` arguments out of compile lines, to avoid unwanted
failures when upgrading compilers.  You can re-enable `-Werror` in your builds if
you really want to, with either:

```yaml
config:
    keep_werror: all
```

or to keep *just* specific `-Werror=XXX` args:

```yaml
config:
    keep_werror: specific
```

This should make swapping in newer versions of compilers much smoother when
maintainers have decided to enable `-Werror` by default.
2022-05-24 00:57:09 -04:00
Todd Gamblin
306bed48d7 specs: emit better parsing errors for specs. (#24860)
Parse error information is kept for specs, but it doesn't seem like we propagate it
to the user when we encounter an error.  This fixes that.

e.g., for this error in a package:

```python
    depends_on("python@:3.8", when="0.900:")
```

Before, with no context and no clue that it's even from a particular spec:

```
==> Error: Unexpected token: ':'
```

With this PR:

```
==> Error: Unexpected token: ':'
  Encountered when parsing spec:
    0.900:
         ^
```
2022-05-24 03:33:43 +00:00
Scott Wittenburg
63402c512b Revert "Added cloud_pipline for E4S on Amazon Linux (#29522)" (#30796)
This reverts commit 07e9c0695a.
2022-05-23 21:12:48 -06:00
Brian Van Essen
736fddc079 Bugfix hwloc find cuda (#30788)
* Added autotools configure flags to ensure that hwloc finds the correct
version of CUDA that it was concretized against, rather than the first
one that package config finds.

* Added support for finding the correct version of ROCm libraries.  Fixed Flake8.

* Fixed guard on finding ROCm library
2022-05-23 20:17:46 -06:00
Jen Herting
036048c26f [py-h2] added version 3.2.0 and 4.1.0 (#29804)
* [py-h2] py-wheel is implied by PythonPackage

* [py-h2] python dependencies should be type=('build', 'run')

* [py-h2] fixed dependencies for py-h2@4.0.0

* [py-h2] added version 3.2.0

* [py-h2] added version 4.1.0

* [py-h2] Older versions require py-enum34 for older versions of python
2022-05-23 17:56:45 -07:00
Greg Becker
8616ba04db Documentation and new method for CachedCMakePackage build system (#22706)
Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>
2022-05-23 22:48:12 +00:00
Evan Bollig
07e9c0695a Added cloud_pipline for E4S on Amazon Linux (#29522)
Add two new cloud pipelines for E4S on Amazon Linux, including arm and x86 (v3 + v4) stacks.

Notes:
- Updated mpark-variant to remove a conflict that no longer exists in Amazon Linux
- The `which` command on Amazon Linux prefixes all results when padded_length is too high. In this case, padded_length<=503 works as expected, so a conservative length of 384 was chosen.
2022-05-23 15:33:38 -06:00
snehring
f24886acb5 usearch: adding in new version, updating checksums (#30776) 2022-05-23 13:50:50 -07:00
snehring
5031578c39 genemark-et: updating to 4.69 (#30793) 2022-05-23 13:36:26 -07:00
Massimiliano Culpo
7c4cc1c71c archspec: add oneapi and dpcpp flag support (#30783) 2022-05-23 13:28:54 -07:00
Harmen Stoppels
f7258e246f Deprecate spack:concretization over concretizer:unify (#30038)
* Introduce concretizer:unify option to replace spack:concretization

* Deprecate concretization

* Make spack:concretization overrule concretize:unify for now

* Add environment update logic to move from spack:concretization to spack:concretizer:unify

* Migrate spack:concretization to spack:concretizer:unify in all locations

* For new environments make concretizer:unify explicit, so that defaults can be changed in 0.19
2022-05-23 13:20:34 -07:00
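For reference, the migration described above amounts to the following rename (a minimal sketch of both forms):

```yaml
# Deprecated (spack.yaml):
spack:
  concretization: together

# Replacement (spack.yaml; for now the old key still overrules it):
spack:
  concretizer:
    unify: true
```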
Maciej Wójcik
ff980a1452 plumed: add versions up to v2.8.0 (#30787) 2022-05-23 20:07:47 +00:00
Chuck Atkins
51130abf86 ci: Map visit to huge instance for the data-vis-sdk pipeline (#30779) 2022-05-23 14:16:24 -04:00
marcus-elia
383356452b abseil-cpp: add v20211102 (#30785) 2022-05-23 18:52:39 +02:00
Cody Balos
5fc1547886 xsdk-examples: add v0.3.0 (#30770)
Co-authored-by: Satish Balay <balay@mcs.anl.gov>
Co-authored-by: Veselin Dobrev <dobrev@llnl.gov>
2022-05-23 18:45:11 +02:00
Manuela Kuhn
68cd6c72c7 py-rnc2rng: fix 2.6.5 and add 2.6.6 (#30784) 2022-05-23 08:02:13 -07:00
Jean Luca Bez
3d2ff57e7b hdf5-vol-async: update new version, tests, and runtime envs (#30713)
* Update h5bench maintainers and versions

* Include version 1.1 for h5bench

* Correct release hash and set default version

* Update .tar.gz version

* Include new version and update runtime

* Update year

* Update package.py

* Update package.py
2022-05-23 09:28:26 -04:00
Massimiliano Culpo
3bc656808c llvm: make "omp_as_runtime" variant conditional (#30782)
fixes #30700

To avoid clingo adding penalties for not using the
default value for a variant, it's better to model
the variant as conditional where possible.
2022-05-23 14:21:11 +02:00
Ben Bergen
7ded692a76 gdb: add v11.2 (#30780)
- This resolves bug https://sourceware.org/PR28302
2022-05-23 12:18:26 +02:00
MichaelLaufer
aa3c7a138a py-pyfr: Add v1.14.0, add LIBXSMM variant (#30612)
* Update PyFR, Gimmik versions. Add PyFR LIBXSMM variant

* Fixes

* Changes to variants

* Update py-gimmik version requirement in py-pyfr

* Revert "Update py-gimmik version requirement in py-pyfr"

This reverts commit 3b3fde3042.

* Update libxsmm conflicts
2022-05-22 12:01:21 -06:00
Paul Kuberry
42441cddcc Add Teuchos patch that fixes implicit casting of complex numbers (#30777) 2022-05-22 09:26:33 -04:00
Abhik Sarkar
b78025345b Feature/composed boost pkg deps p3 (#28961)
* This commit moves packages off Boost.with_default_variants and onto the variants that they precisely depend upon. This is the third batch of 16 packages with modified boost dependencies.

* style fix

* Update var/spack/repos/builtin/packages/sympol/package.py

Co-authored-by: Tim Haines <thaines.astro@gmail.com>

* fix style

* Apply suggestions from code review

Co-authored-by: Tim Haines <thaines.astro@gmail.com>

* Fix Trilinos boost deps

* Fix style

Co-authored-by: Tim Haines <thaines.astro@gmail.com>
Co-authored-by: Tom Scogland <tom.scogland@gmail.com>
2022-05-21 15:57:59 -07:00
Harmen Stoppels
2113b625d1 gcc: add build_type and profiled variants (#30660)
Add a `build_type` variant, which allows building optimized compilers,
as well as target libraries (libstdc++ and friends).

The default is `build_type=RelWithDebInfo`, which corresponds to GCC's
default of -O2 -g.

When building with `+bootstrap %gcc`, also add Spack's arch-specific
flags using the common denominator between the host and the new GCC.

This is done by creating a `config/spack.mk` file in `def patch` that looks
as follows:
```
BOOT_CFLAGS := $(filter-out -O% -g%, $(BOOT_CFLAGS)) -O2 -g -march=znver2 -mtune=znver2
CFLAGS_FOR_TARGET := $(filter-out -O% -g%, $(CFLAGS_FOR_TARGET)) -O2 -g -march=znver2 -mtune=znver2
CXXFLAGS_FOR_TARGET := $(filter-out -O% -g%, $(CXXFLAGS_FOR_TARGET)) -O2 -g -march=znver2 -mtune=znver2
```
2022-05-21 23:44:51 +02:00
Jen Herting
c6c3d243e1 New package: py-motor (#30763)
* New package: py-motor

* Fixed dependency errors

* Fixed flake style errors

* Fixed linked issue

Co-authored-by: Viv Eric Hafener <vehrc@sporcbuild.rc.rit.edu>
2022-05-21 11:36:37 -05:00
Manuela Kuhn
870b997cb6 py-ww: add new package (#30767)
* py-ww: add new package

* Add missing py-pytest-runner dependency
2022-05-21 11:28:53 -05:00
snehring
c9492f1cd4 paml: add v4.10.3 (#30772) 2022-05-21 09:47:38 +02:00
dependabot[bot]
24f370491e build(deps): bump actions/upload-artifact from 3 to 3.1.0 (#30778)
Bumps [actions/upload-artifact](https://github.com/actions/upload-artifact) from 3 to 3.1.0.
- [Release notes](https://github.com/actions/upload-artifact/releases)
- [Commits](https://github.com/actions/upload-artifact/compare/v3...3cea5372237819ed00197afe530f5a7ea3e805c8)

---
updated-dependencies:
- dependency-name: actions/upload-artifact
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-05-21 08:57:19 +02:00
Glenn Johnson
d688a699fa Fix toolchain detection for oneapi/dpcpp compilers (#30775)
The oneapi and dpcpp compilers are essentially the same except for which
binary is used for CXX. Spack will detect them as "mixed toolchain" and
not inject compiler optimization flags. This will be needed once
archspec has entries for the oneapi and dpcpp compilers. This PR detects
when dpcpp and oneapi are in the toolchains list and explicitly sets
`is_mixed_toolchain` to `False`.
2022-05-21 08:38:01 +02:00
Chuck Atkins
4fbb822072 visit: Overhaul to build in the DAV SDK (#30594)
* mesa-glu and mesa-demos: Fix conflicts with glu and osmesa

* visit: Update visit dependencies

* ecp-data-vis-sdk: Enable +visit

* ci[data-vis-sdk]: Enable +visit
2022-05-20 19:17:30 -04:00
Sreenivasa Murthy Kolam
f86c481280 update checksum and url for mlirmiopen recipe (#30771) 2022-05-20 21:29:58 +00:00
Jen Herting
91a99882b3 [py-openslide-python] added version 1.1.2 (#30722)
* [py-openslide-python] added version 1.1.2 and set max py-setuptools version for 1.1.1

* [py-openslide-python]

- setuptools required for all possible newer versions
- python is type build run

* [py-openslide-python] use pil provider
2022-05-20 19:25:20 +00:00
Manuela Kuhn
74bef2105a py-formatizer: add new package (#30755) 2022-05-20 10:37:55 -07:00
Manuela Kuhn
630ebb9d8b py-pybids: add 0.8.0 and 0.15.1 (#30756) 2022-05-20 10:37:01 -07:00
iarspider
183465321e DWZ: use virtual "elf" package (#30761) 2022-05-20 10:29:11 -07:00
Nate deVelder
580f9ec86e Add newer openfast versions and preliminary OpenMP compile support (#30752)
* Add versions 3.0 and 3.1 and preliminary OpenMP support

* Fix flag handler missing spec variable

* Use self.compiler.openmp_flag instead of -fopenmp

* Fix whitespace
2022-05-20 10:27:47 -07:00
Jordan Galby
0b0920bc90 qt: Qt 5.15.0 requires OpenSSL 1.1.1 (#30754)
Fixes qt configure errors with external openssl on older systems (rhel7)

See
efc02f9cc3/dist/changes-5.15.0 (L346)

This means that from now on, `qt ^openssl@1.0` gets you `qt@5.15.4 ~ssl`:
clingo chooses latest qt version **but disables ssl support**.
2022-05-20 18:24:30 +02:00
Greg Becker
ee04a1ab0b errors: model error messages as an optimization problem (#30669)
Error messages for the clingo concretizer have proven challenging. The current messages are incredibly vague and often don't help users at all. Unsat cores in clingo are not guaranteed to be minimal, and lead to cores that are either not useful or need to be post-processed for hours to reach a minimal core.

Following up on an idea from a Slack conversation with kwryankrattiger, this PR takes a new approach. We eliminate most integrity constraints and minima/maxima on choice rules in clingo, and instead force invalid states to imply an error predicate. The error predicate can include context on the cause of the error (Package, Version, etc). These error predicates are then heavily optimized against, to ensure that we do not include error facts in the solution when a solution with no error facts could be generated. When post-processing the clingo solution to construct specs, any error facts cause the program to raise an exception. This leads to much more legible error messages. Each error predicate includes a priority and an error message; the message is formatted using the remaining arguments. The priority is used to ensure that when clingo has a choice of which rules to violate, it chooses the one which will be most informative to the user.

Performance:

"fresh" concretizations appear to suffer a ~20% performance penalty under this branch, while "reuse" concretizations see a speedup of around 33%. 

Possible optimizations if users still see unhelpful messages:

There are currently 3 levels of priority of the error messages. Additional priorities are possible, and can allow us finer granularity to ensure more informative error messages are provided in lieu of less informative ones.

Future work:

Improve tests to ensure that every possible rule implying an error message is exercised
2022-05-20 08:27:07 -07:00
Peter Scheibel
55f4950ed4 Petsc: fix enable-x for virtuals (#30749)
* If direct dependencies provide virtuals, add those virtual names as well
* Also refer to package by virtual name for jpeg
2022-05-20 03:15:44 +00:00
Manuela Kuhn
23960ed623 py-fracridge: add new package (#30739) 2022-05-19 17:28:39 -07:00
iarspider
fb2730d87f Update py-onnx-runtime to 1.10; update CMS patch (#30725)
* Update py-onnx-runtime to 1.10; update CMS patch

* Update package.py
2022-05-19 17:24:18 -07:00
Wileam Y. Phan
30f2394782 Run scheduled CI workflows only in the main repo (#30729)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2022-05-19 21:35:52 +02:00
Jordan Galby
262c3f07bf Non-existent upstream is not fatal (#30746)
A non-existent upstream should not be fatal: it could simply mean it is
not deployed yet. In the meantime, it should not block the user from
rebuilding anything they need.

A warning is still emitted, to let the user decide if this is ok or not.
2022-05-19 18:27:43 +00:00
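For context, a minimal `upstreams.yaml` sketch (the upstream name and path are hypothetical); with this change a missing `install_tree` path produces a warning instead of an error:

```yaml
upstreams:
  shared-instance:                          # hypothetical upstream name
    install_tree: /shared/spack/opt/spack   # may not be deployed yet
```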
Harmen Stoppels
b018eb041f python: more +optimizations (#30742) 2022-05-19 12:25:36 -06:00
Matthew Archer
3f4398dd67 fenicsx: update to v0.4 (#30661)
* Fix for xtensor-xsimd

* Add sha256 for all new releases

* renamed ufcx package

* Update sha for ffcx

* fixed hashes and modified fenics-dolfinx to depend on ufcx

* cleaned and fixed dependency types

* use spec.satisfies in cmake_args

* bumped to ufcx@0.4.1

* address PR comments

* fix hashes

* update parmetis in cmake_args to reflect default setting

* update versions

* Add dependency fix

* bump basix to 0.4.2 and address PR comments

* Versioning fixes

* Use xtensor-0.24: and loosen pybind11

* Add conflicts for partitioners

* Updates on partitioners

* use define_from_variant

* Tidy up some dependencies

* Work on multi-variants for graph partitioners

* Fix KaHIP issue.

KaHIP changed the name of its library from 'interface' to 'kahip'. Pin earlier versions of DOLFINx to earlier verisons of KaHIP for proper detection.

Co-authored-by: Chris Richardson <chris@bpi.cam.ac.uk>
Co-authored-by: Garth N. Wells <gnw20@cam.ac.uk>
2022-05-19 18:22:09 +00:00
kwryankrattiger
a225a5b276 Ascent: Add variant to disable blt_find_mpi (#30735)
This is needed to find MPI correctly on Cray systems and similar.
2022-05-19 10:06:31 -07:00
iarspider
c9cfc548da Update millepede recipe (#30737) 2022-05-19 10:03:31 -07:00
Rémi Lacroix
3b30886a3a libtheora: fix the hash of a patch. (#30740)
Cosmetic changes only, probably because gitlab.xiph.org got updated.
2022-05-19 09:59:12 -07:00
Jordan Galby
c2fd98ccd2 Fix spack install chgrp on symlinks (#30743)
Fixes missing chgrp on symlinks in package installations, and errors on
symlinks referencing non-existent or non-writable locations.

Note: `os.chown(.., follow_symlinks=False)` is python3 only, but
`os.lchown` exists in both versions.
2022-05-19 08:50:24 -07:00
Valentin Volkl
a0fe6ab2ed cuda: use stage dir instead of /tmp during install (#29584) 2022-05-19 17:18:18 +02:00
Harmen Stoppels
c3be777ea8 tar: add compress deps (#30641)
* remove spack dependency from package

* tar: fix compression programs, use pigz by default instead of gzip on -z
2022-05-19 11:15:13 -04:00
Jordan Galby
8fe39be3df Don't try to mkdir upstream directory when nonexistent (#30744)
When an upstream is specified but the directory does not exist, don't
create the directory for it; it might not be yours.
2022-05-19 14:45:18 +00:00
Tiziano Müller
f5250da611 cp2k: fix unbound var use without cuda (#30736)
fixes #30631
2022-05-19 11:38:00 +02:00
David Beckingsale
c2af154cd2 RAJA and associated packages: add v2022.03.0 (#30047)
* Add raja@2022.03.0
* Add camp@2022.03.0
* Add chai@2022.03.0
* Add umpire@2022.03.1
* Latest chai, raja, umpire versions don't need submodules
* Latest chai, raja, umpire versions update CMake option names
* New umpire +device_alloc option (for latest version)
* All versions of dray are now required to build with raja@:0.14

Co-authored-by: Marty McFadden <mcfadden8@users.noreply.github.com>
2022-05-18 22:48:22 -07:00
robgics
1f6b880fff Add license dir to config (#30135)
* Change license dir from hard-coded to a configurable item

* Change the config item to be a string, not an array

Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2022-05-18 18:26:42 -07:00
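A minimal sketch of the resulting setting, assuming the config key is named `license_dir` (the path is a hypothetical placeholder):

```yaml
config:
  license_dir: /opt/spack/etc/licenses   # previously hard-coded
```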
Teodor Nikolov
2c211d95ee Catch2: update to 3.0.1 (#30732)
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2022-05-19 00:47:20 +00:00
Asher Mancinelli
c46f673c16 ExaSGD bugfixes (#30731)
* Small bug fixes for ExaSGD packages

* Add Slaven as maint for both exago and hiop
2022-05-18 15:25:24 -07:00
Todd Gamblin
8ff2b4b747 bugfix: handle new dag_hash() on old concrete specs gracefully. (#30678)
Trying to compute `dag_hash()` or `package_hash()` on a concrete spec that doesn't have
a `_package_hash` attribute would attempt to recompute the package hash.

This most commonly manifests as a failed lookup of a namespace if you attempt to uninstall
or compute the hashes of packages in external repositories that aren't registered, e.g.:

```console
> spack spec --json c/htno
==> Error: Unknown namespace: myrepo
```

While it wouldn't change the already-assigned `dag_hash` value, this behavior is
incorrect, since the package file for a previously concrete spec:
  1. might have changed since concretization,
  2. might not exist anymore, or
  3. might just not be findable by Spack.

This PR ensures that the package hash can't be computed on older concrete specs. Instead
of calling `package_hash()` from within `to_node_dict()`, we now check for the `_package_hash`
attribute and only add the package_hash to the spec record if it's there.

This PR also handles the tricky semantics of computing `package_hash()` at concretization
time. We have to compute it *before* marking the spec concrete so that `to_node_dict` can
use it. But this means that the logic for `package_hash()` can't rely on `spec.concrete`,
as it is called *during* concretization. Instead of checking for concreteness, `package_hash()`
now checks `_patches_assigned()` to determine whether it should add them to the package
hash.

- [x] Add an assert to `package_hash()` so it can't be called on specs for which it
      would be wrong.
- [x] Add an `_assign_hash()` method to handle tricky semantics of `package_hash`
      and `dag_hash`.
- [x] Rework concretization to call `_assign_hash()` before and after marking specs
      concrete.
- [x] Rework content hash part of package hash to check for `_patches_assigned()`
      instead of `spec.concrete`.
- [x] regression test
2022-05-18 22:21:22 +00:00
Michael Kuhn
9e05dde28c harfbuzz: add gobject-introspection dependency (#30715)
Fixes #30706
2022-05-18 15:13:36 -06:00
Jen Herting
b1ef5a75f0 [py-tensorflow-hub] applied patch for newer version of zlib (#30664)
* [py-tensorflow-hub] applied patch for newer version of zlib

* [py-tensorflow-hub] patch also applies to 0.11.0

* [py-tensorflow-hub] Audit fix

1. patch URL in package py-tensorflow-hub must end with ?full_index=1
2022-05-18 13:21:21 -07:00
Timothy Brown
f9aa7c611c [WGRIB2] Pinning Jasper to v2. (#30726)
If we use v3 of Jasper, WGRIB2 fails to build due to not
finding `jpc_encode` and `jpc_decode`.
2022-05-18 09:34:22 -07:00
Timothy Brown
9a2e01e22d [G2] Pinning Jasper to version 2. (#30728) 2022-05-18 09:33:05 -07:00
Alberto Invernizzi
2090351d7f Add hip dependency for roc-obj-ls + add perl-uri-encode (#30721)
* add perl-uri-encode package

* add dependencies in hip for roc-obj-ls
2022-05-18 16:24:35 +02:00
Massimiliano Culpo
c775c322ec vendored externals: update archspec (#30683)
- Better support for a64fx
- Better support for Apple M1(pro)
2022-05-18 11:31:20 +02:00
Harmen Stoppels
1185eb9199 Compiler wrapper: fix globbing and debug out.log bell chars (#30699)
* Disable globbing

* Split on bell char when dumping cmd to out.log
2022-05-18 09:06:54 +02:00
Ryan Marcellino
51fa8e7b5e py-pytecplot: new package (#30708)
* py-pytecplot: new package

* fix copyright year

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>

* use one variant for all extras

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2022-05-18 01:47:17 +00:00
Mikael Simberg
f505c50770 Add tracy (#30588)
Co-authored-by: Mikael Simberg <mikael.simberg@iki.if>
2022-05-17 18:10:45 -07:00
Mark W. Krentel
2fdc817f03 hpctoolkit: add version 2022.05.15 (#30710) 2022-05-17 18:05:12 -07:00
Michael Kuhn
e1d0b35d5b gobject-introspection: add 1.72.0 (#30714)
Newer versions of gobject-introspection require Meson to build. Convert
the package into a hybrid one that still supports older versions using
Autotools.
2022-05-17 17:59:54 -07:00
Sreenivasa Murthy Kolam
ace5a7c4bf deprecate rocm releases-ROCm-4.2.0,ROCm-4.3.0,ROCm-4.3.1 (#30709) 2022-05-17 14:11:32 -07:00
kwryankrattiger
a91ae8cafe ecp-data-vis-sdk: Drop fortran from ascent spec. (#30707) 2022-05-17 16:53:52 -04:00
Chuck Atkins
b9e3ee6dd0 glew: Fix glu and glx dependencies (#30705) 2022-05-17 10:19:19 -07:00
iarspider
10ea0a2a3e Update rivet to 3.1.6, yoda to 1.9.5; fix recipes (#30702)
* Update rivet to 3.1.6, yoda to 1.9.5; fix recipes

* Use filter_compiler_wrappers instead of custom post-install step
2022-05-17 10:17:24 -07:00
Harmen Stoppels
d6f8ffc6bc add julia 1.6.6 (#30703) 2022-05-17 09:52:46 -07:00
haralmha
02be2f27d1 flatbuffers: Add version 2.0.6 (#30704) 2022-05-17 09:51:02 -07:00
Adam J. Stewart
dfd0702aec GDAL: deprecate 2.X (#30668) 2022-05-17 08:45:55 -07:00
Massimiliano Culpo
f454a683b5 Mark test_repo_last_mtime xfail on Python < 3.5 (#30696) 2022-05-17 12:45:52 +02:00
Chuck Atkins
d7d0c892d8 silo: Make HDF5 version deps more robust (#30693) 2022-05-17 04:26:35 -04:00
snehring
d566330a33 pindel: fixing compilation issues for gcc5+ (#28387) 2022-05-17 09:45:08 +02:00
Alex Hedges
446cbf4b5a sed: add v4.8.0, set gnu_mirror_path per version (#30666) 2022-05-17 09:43:01 +02:00
Tim Haines
5153c9e98c boost: constrain context-impl variant (#30654) 2022-05-17 09:42:19 +02:00
h-murai
d74f2d0be5 petsc: fix an error about handling a provider of Scalapack. (#30682) 2022-05-17 09:33:49 +02:00
Jianshen Liu
021b65d76f cppcheck: add v2.7 (#30698) 2022-05-17 07:29:46 +00:00
Adam J. Stewart
45312d49be Bazel: remove maintainer (#30697) 2022-05-17 09:29:22 +02:00
Chuck Atkins
3fcd85efe9 autoconf-archive: Patch for nvhpc support and propagate search dirs (#30692) 2022-05-17 03:29:42 +00:00
Alberto Madonna
6f3a082c3e runc: symlink sbin to bin to find it in $PATH (#30691) 2022-05-17 02:48:04 +02:00
stepanvanecek
23e2820547 sys-sage: new spack package (#30570)
* sys-sage - adding new package

* sys-sage: updated release version

* sys-sage: remove FIXMEs from the package

* add libllvm dependency

* sys-sage: remove unnecessary libllvm dependency

Co-authored-by: Stepan Vanecek <stepan.vanecek@tum.de>
2022-05-16 17:01:34 -07:00
Sreenivasa Murthy Kolam
22b999fcd4 Add miopentensile: new recipe for ROCm-5.0.0, ROCm-5.1.0 release (#29313)
* miopentensile-new recipe for rocm-5.0.0 release

* fix style checks

* update the version for 5.1.0 release, avoid git download
2022-05-16 16:19:35 -07:00
Michael Kuhn
1df7de62ca py-cffconvert: new package (#30694) 2022-05-16 23:16:32 +00:00
andymwood
97ec8f1d19 Avoid calling a method on a NoneType object (#30637) 2022-05-16 21:59:08 +00:00
iarspider
63b6e484fc sigcpp: protect from missing prefix.share folder (#30686) 2022-05-16 13:34:57 -07:00
Nick Forrington
2b12d19314 arm-forge: Versions up to 22.0.1 + minor updates (#30689)
* arm-forge: Download via HTTPS

Update download URL to use HTTPS (rather than HTTP)

* arm-forge: Allow +probe to depend on python3

Allow python dependency required for arm-forge+probe to be python3 as
well as 2.7.x

* arm-forge: Add versions up to 22.0.1
2022-05-16 14:30:19 -06:00
Francesco Giordano
c37fcccd7c aws-parallelcluster: add v2.11.7 (#30685) 2022-05-16 14:29:53 -06:00
Michael Kuhn
6034b5afc2 fix pkgconfig dependencies (#30688)
pkgconfig is the correct dependency; pkg-config is a provider of it.
2022-05-16 14:01:41 -06:00
Michael Kuhn
17bc937083 libfuse: add utils variant (#30675)
By default, libfuse installs helper programs like `fusermount3`, which
are mostly useless if not installed with setuid (that is, `+useroot`).

However, their presence makes it complicated to use globally installed
versions, which can be combined with a Spack-installed FUSE library.

In particular, on systems that have a setuid fusermount3 binary, but no
libfuse-dev installed, it is nice to be able to build libfuse with Spack, and
have it call the system setuid executable.
2022-05-16 13:01:44 -06:00
renjithravindrankannath
ad8db0680d Correcting include and library paths using patch file for RVS (#30294)
* Correct include and library paths using a patch file for RVS, to build the
following library files in Spack:
libperf.so.0.0
libpebb.so.0.0
libiet.so.0.0
libgst.so.0.0
libpqt.so.0.0
libmem.so.0.0
libbabel.so.0.0

* Replace ROCM_PATH with RPATH in deviceid.sh before installing in the Spack build.

* Reduce multiple environment variables for the HIP and HSA paths
2022-05-16 09:44:59 -07:00
estewart08
4f033b155b [AMD][rocm-openmp-extras] - Update versions 5.0.0 through 5.1.0. (#30501)
- Removed gl dependency.
- Specify clang as the cmake compiler, as gcc was being
  picked up improperly. As a result, the ffi include
  path was needed in C/CXX flags.

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2022-05-16 09:22:48 -07:00
Sreenivasa Murthy Kolam
ad829ccee1 Apply cyclades removal patch for the llvm-amdgpu package (#29376)
* apply cyclades removal patch for the llvm-amdgpu spack package

* update the changes with develop branch
2022-05-16 09:15:26 -07:00
Tom Vander Aa
4b60a17174 Extrae: add support for Intel OneAPI (#30684) 2022-05-16 09:02:00 -06:00
Ethan Stam
edb91f4077 ParaView: -no-ipo for intel builds (#18193) 2022-05-16 08:01:34 -06:00
Todd Gamblin
0fdc3bf420 bugfix: use deterministic edge order for spack graph (#30681)
Previously we sorted by hash values for `spack graph`, but changing hashes can make the
test brittle and the node order seem nondeterministic to users.

- [x] Sort nodes in `spack graph` by the default edge order, which takes into account
      parent and child names as well as dependency types.
- [x] Update ASCII test output for new order.
2022-05-16 11:36:41 +02:00
Francine Lapid
8b34cabb16 subversion: added apxs support (#30381) 2022-05-16 10:42:15 +02:00
haralmha
77fb651e01 frontier-client: adapt pacparser_setmyip function to new pacparser release (#29936) 2022-05-16 10:41:21 +02:00
Harmen Stoppels
35ed7973e2 eccodes, fix jasper again (#30635)
* eccodes: jasper@:2

* Revert "jasper: avoid --gc-sections / hidden symbols"

This reverts commit d1bc0f39c516a7dc1e941aa4a804b7468a200b75.

* bump ecbuild, drop cmake constraint for newer versions

* add ecbuild dep to eccodes@develop
2022-05-16 10:40:35 +02:00
snehring
e73b19024f bpp-suite and deps: urls to GitHub (#30665)
* bpp-core: moving url to github. Fixing compilation issue.

* bpp-phyl: moving url to github.

* bpp-seq: moving url to github

* bpp-popgen: new package

* bpp-suite: moving url to github, new version.

* bpp-popgen: removing unused cmake_args.
2022-05-16 10:23:39 +02:00
Alex Hedges
7803bc9e5f screen: add v4.9.0, add required build deps (#30667) 2022-05-16 10:11:02 +02:00
dlkuehn
55c400297c treesub: change jdk dependency to java, add build to java dep. type (#30672)
Co-authored-by: David Kuehn <las_dkuehn@iastate.edu>
2022-05-16 10:06:09 +02:00
Umar Arshad
8686e18494 clblast: add new package (#30677) 2022-05-16 10:03:15 +02:00
Diego Alvarez
d28967bbf3 nextflow: add v22.04.1 (#30679) 2022-05-16 09:01:15 +02:00
Diego Alvarez
5f928f71c0 openjdk: add 11.0.15+10, 17.0.3+7 (#30680) 2022-05-16 08:58:51 +02:00
Ken Raffenetti
dc7bdf5f24 mpich: add support for Mellanox HCOLL (#30662)
Co-authored-by: Federico Ficarelli <federico.ficarelli@pm.me>
2022-05-15 14:14:47 +02:00
Danny McClanahan
a681fd7b42 Introduce GroupedExceptionHandler and use it to simplify bootstrap error handling (#30192) 2022-05-15 10:59:11 +00:00
Alberto Invernizzi
f40f1b5c7c Fix for spack stage command not extracting packages in custom paths (#30448) 2022-05-15 12:13:42 +02:00
Michael Kuhn
edd3cf0b17 qt: add 5.15.4 (#30656) 2022-05-14 23:34:06 -06:00
Michael Kuhn
ff03e2ef4c uninstall: fix dependency check (#30674)
The dependency check currently checks whether there are only build
dependencies left for a particular package. However, the database also
contains uninstalled packages, which can cause the check to fail.

For instance, with `bison` and `flex` having already been uninstalled,
`m4` will have the following dependents:
```
bison ('build', 'run')--> m4
flex ('build',)--> m4
libnl ('build',)--> m4
```
`bison` and `flex` should be ignored in this case because they are not
installed anymore.

Fixes #30673
2022-05-14 18:01:29 -07:00
Zack Galbreath
bee311edf3 Update GitLab environment variable name (#30671)
Use the IAM credentials that correspond to our new binary mirror
(s3://spack-binaries vs. s3://spack-binaries-develop)
2022-05-14 16:33:32 -06:00
Jen Herting
73b69cfeec [py-pyworld] Limiting numpy version. See: https://zenn.dev/ymd_h/articles/934a90e1468a05 (#30670) 2022-05-14 09:22:31 -05:00
Cameron Smith
4a1041dbc3 CEED v5.0 release (#29710)
* ceed50: add ceed 5.0.0 and pumi 2.2.7

* libceed-0.10

* ceed50: add omegah

* omega-h: mpi and cuda builds work

* omega-h: fix style

* New package: libfms

* New version: gslib@1.0.7

CEED: add some TODO items for the 5.0 release

* ceed: variant name consistent with package name

* LAGHOS: allow newer versions of MFEM to be used with v3.1

* LIBCEED: add missing 'install' target in 'install_targets'

* CEED: address some TODO items + some tweaks

* MFEM: add new variant for FMS (libfms)

* CEED: v5.0.0 depends on 'libfms' and 'mfem+fms'

* RATEL: add missing 'install' target in 'install_targets'

* CEED: add dependency for v5.0.0 on Ratel v0.1.2

* CEED: add Nek-related dependencies for ceed@5.0.0

* CEED: v5.0.0 depends on MAGMA v2.6.2

* libCEED: set the `CUDA_ARCH` makefile parameter

* libCEED: set the `HIP_ARCH` makefile parameter

Co-authored-by: Jed Brown <jed@jedbrown.org>
Co-authored-by: Veselin Dobrev <dobrev@llnl.gov>
Co-authored-by: Veselin Dobrev <v-dobrev@users.noreply.github.com>
2022-05-13 18:29:02 -07:00
Jen Herting
ccab7bf4fd New package: py-pyworld (#28641)
* espnet first build with depends

* added cython>=0.24.0' and type='build'

* [py-pyworld] updated copyright

Co-authored-by: Sid Pendelberry <sid@rit.edu>
2022-05-13 17:19:58 -05:00
John W. Parent
e24e71be6a Preserve Permissions on .zip extraction (#30407)
#24556 merged in support for Python's .zip files via ZipFile.
However, as per #30200, ZipFile does not preserve file permissions of
the extracted contents. This PR returns to using the `unzip`
executable on non-Windows systems (as was the case before #24556)
and now uses `tar` on Windows to extract .zip files.
2022-05-13 13:38:05 -07:00
kwryankrattiger
72d83a6f94 Ascent: Patch 0.8.0 for finding ADIOS2. (#30609) 2022-05-13 13:26:53 -07:00
6969 changed files with 26689 additions and 13573 deletions

@@ -12,6 +12,7 @@ on:
# built-in repository or documentation
- 'var/spack/repos/builtin/**'
- '!var/spack/repos/builtin/packages/clingo-bootstrap/**'
- '!var/spack/repos/builtin/packages/clingo/**'
- '!var/spack/repos/builtin/packages/python/**'
- '!var/spack/repos/builtin/packages/re2c/**'
- 'lib/spack/docs/**'
@@ -19,11 +20,16 @@ on:
# nightly at 2:16 AM
- cron: '16 2 * * *'
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.run_number }}
cancel-in-progress: true
jobs:
fedora-clingo-sources:
runs-on: ubuntu-latest
container: "fedora:latest"
if: github.repository == 'spack/spack'
steps:
- name: Install dependencies
run: |
@@ -57,6 +63,7 @@ jobs:
ubuntu-clingo-sources:
runs-on: ubuntu-latest
container: "ubuntu:latest"
if: github.repository == 'spack/spack'
steps:
- name: Install dependencies
env:
@@ -93,6 +100,7 @@ jobs:
ubuntu-clingo-binaries-and-patchelf:
runs-on: ubuntu-latest
container: "ubuntu:latest"
if: github.repository == 'spack/spack'
steps:
- name: Install dependencies
env:
@@ -126,6 +134,7 @@ jobs:
opensuse-clingo-sources:
runs-on: ubuntu-latest
container: "opensuse/leap:latest"
if: github.repository == 'spack/spack'
steps:
- name: Install dependencies
run: |
@@ -154,6 +163,7 @@ jobs:
macos-clingo-sources:
runs-on: macos-latest
if: github.repository == 'spack/spack'
steps:
- name: Install dependencies
run: |
@@ -170,17 +180,19 @@ jobs:
tree ~/.spack/bootstrap/store/
macos-clingo-binaries:
runs-on: macos-latest
runs-on: ${{ matrix.macos-version }}
strategy:
matrix:
python-version: ['3.5', '3.6', '3.7', '3.8', '3.9', '3.10']
macos-version: ['macos-10.15', 'macos-11', 'macos-12']
if: github.repository == 'spack/spack'
steps:
- name: Install dependencies
run: |
brew install tree
- name: Checkout
uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b
- uses: actions/setup-python@98f2ad02fd48d057ee3b4d4f66525b231c3e52b6
- uses: actions/setup-python@d09bd5e6005b175076f227b13d9730d56e9dcfcb
with:
python-version: ${{ matrix.python-version }}
- name: Bootstrap clingo
@@ -195,10 +207,11 @@ jobs:
strategy:
matrix:
python-version: ['2.7', '3.5', '3.6', '3.7', '3.8', '3.9', '3.10']
if: github.repository == 'spack/spack'
steps:
- name: Checkout
uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b
- uses: actions/setup-python@98f2ad02fd48d057ee3b4d4f66525b231c3e52b6
- uses: actions/setup-python@d09bd5e6005b175076f227b13d9730d56e9dcfcb
with:
python-version: ${{ matrix.python-version }}
- name: Setup repo
@@ -216,6 +229,7 @@ jobs:
ubuntu-gnupg-binaries:
runs-on: ubuntu-latest
container: "ubuntu:latest"
if: github.repository == 'spack/spack'
steps:
- name: Install dependencies
env:
@@ -250,6 +264,7 @@ jobs:
ubuntu-gnupg-sources:
runs-on: ubuntu-latest
container: "ubuntu:latest"
if: github.repository == 'spack/spack'
steps:
- name: Install dependencies
env:
@@ -285,6 +300,7 @@ jobs:
macos-gnupg-binaries:
runs-on: macos-latest
if: github.repository == 'spack/spack'
steps:
- name: Install dependencies
run: |
@@ -302,6 +318,7 @@ jobs:
macos-gnupg-sources:
runs-on: macos-latest
if: github.repository == 'spack/spack'
steps:
- name: Install dependencies
run: |

@@ -19,6 +19,10 @@ on:
release:
types: [published]
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.run_number }}
cancel-in-progress: true
jobs:
deploy-images:
runs-on: ubuntu-latest
@@ -43,6 +47,7 @@ jobs:
[ubuntu-focal, 'linux/amd64,linux/arm64,linux/ppc64le', 'ubuntu:20.04'],
[ubuntu-jammy, 'linux/amd64,linux/arm64,linux/ppc64le', 'ubuntu:22.04']]
name: Build ${{ matrix.dockerfile[0] }}
if: github.repository == 'spack/spack'
steps:
- name: Checkout
uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b # @v2
@@ -75,7 +80,7 @@ jobs:
fi
- name: Upload Dockerfile
uses: actions/upload-artifact@6673cd052c4cd6fcf4b4e6e60ea986c889389535
uses: actions/upload-artifact@3cea5372237819ed00197afe530f5a7ea3e805c8
with:
name: dockerfiles
path: dockerfiles
@@ -94,7 +99,7 @@ jobs:
password: ${{ secrets.GITHUB_TOKEN }}
- name: Log in to DockerHub
if: ${{ github.event_name != 'pull_request' }}
if: github.event_name != 'pull_request'
uses: docker/login-action@49ed152c8eca782a232dede0303416e8f356c37b # @v1
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}

@@ -16,16 +16,21 @@ on:
- '.github/workflows/macos_python.yml'
# TODO: run if we touch any of the recipes involved in this
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.run_number }}
cancel-in-progress: true
# GitHub Action Limits
# https://help.github.com/en/actions/reference/workflow-syntax-for-github-actions
jobs:
install_gcc:
name: gcc with clang
if: github.repository == 'spack/spack'
runs-on: macos-latest
steps:
- uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b # @v2
- uses: actions/setup-python@98f2ad02fd48d057ee3b4d4f66525b231c3e52b6 # @v2
- uses: actions/setup-python@d09bd5e6005b175076f227b13d9730d56e9dcfcb # @v2
with:
python-version: 3.9
- name: spack install
@@ -36,11 +41,12 @@ jobs:
install_jupyter_clang:
name: jupyter
if: github.repository == 'spack/spack'
runs-on: macos-latest
timeout-minutes: 700
steps:
- uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b # @v2
- uses: actions/setup-python@98f2ad02fd48d057ee3b4d4f66525b231c3e52b6 # @v2
- uses: actions/setup-python@d09bd5e6005b175076f227b13d9730d56e9dcfcb # @v2
with:
python-version: 3.9
- name: spack install
@@ -50,10 +56,11 @@ jobs:
install_scipy_clang:
name: scipy, mpl, pd
if: github.repository == 'spack/spack'
runs-on: macos-latest
steps:
- uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b # @v2
- uses: actions/setup-python@98f2ad02fd48d057ee3b4d4f66525b231c3e52b6 # @v2
- uses: actions/setup-python@d09bd5e6005b175076f227b13d9730d56e9dcfcb # @v2
with:
python-version: 3.9
- name: spack install

@@ -4,6 +4,7 @@ Set-Location spack
git config --global user.email "spack@example.com"
git config --global user.name "Test User"
git config --global core.longpaths true
if ($(git branch --show-current) -ne "develop")
{

@@ -9,6 +9,11 @@ on:
branches:
- develop
- releases/**
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.run_number }}
cancel-in-progress: true
jobs:
# Validate that the code can be run on all the Python versions
# supported by Spack
@@ -16,7 +21,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b # @v2
- uses: actions/setup-python@98f2ad02fd48d057ee3b4d4f66525b231c3e52b6 # @v2
- uses: actions/setup-python@d09bd5e6005b175076f227b13d9730d56e9dcfcb # @v2
with:
python-version: '3.10'
- name: Install Python Packages
@@ -34,7 +39,7 @@ jobs:
- uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@98f2ad02fd48d057ee3b4d4f66525b231c3e52b6 # @v2
- uses: actions/setup-python@d09bd5e6005b175076f227b13d9730d56e9dcfcb # @v2
with:
python-version: '3.10'
- name: Install Python packages
@@ -109,7 +114,7 @@ jobs:
- uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@98f2ad02fd48d057ee3b4d4f66525b231c3e52b6 # @v2
- uses: actions/setup-python@d09bd5e6005b175076f227b13d9730d56e9dcfcb # @v2
with:
python-version: ${{ matrix.python-version }}
- name: Install System packages
@@ -174,7 +179,7 @@ jobs:
- uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@98f2ad02fd48d057ee3b4d4f66525b231c3e52b6 # @v2
- uses: actions/setup-python@d09bd5e6005b175076f227b13d9730d56e9dcfcb # @v2
with:
python-version: '3.10'
- name: Install System packages
@@ -240,7 +245,7 @@ jobs:
- uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@98f2ad02fd48d057ee3b4d4f66525b231c3e52b6 # @v2
- uses: actions/setup-python@d09bd5e6005b175076f227b13d9730d56e9dcfcb # @v2
with:
python-version: '3.10'
- name: Install System packages
@@ -289,7 +294,7 @@ jobs:
- uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@98f2ad02fd48d057ee3b4d4f66525b231c3e52b6 # @v2
- uses: actions/setup-python@d09bd5e6005b175076f227b13d9730d56e9dcfcb # @v2
with:
python-version: ${{ matrix.python-version }}
- name: Install Python packages
@@ -332,7 +337,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b # @v2
- uses: actions/setup-python@98f2ad02fd48d057ee3b4d4f66525b231c3e52b6 # @v2
- uses: actions/setup-python@d09bd5e6005b175076f227b13d9730d56e9dcfcb # @v2
with:
python-version: '3.10'
- name: Install Python packages
@@ -345,7 +350,7 @@ jobs:
coverage run $(which spack) audit packages
coverage combine
coverage xml
- name: Package audits (wwithout coverage)
- name: Package audits (without coverage)
if: ${{ needs.changes.outputs.with_coverage == 'false' }}
run: |
. share/spack/setup-env.sh

@@ -9,6 +9,11 @@ on:
branches:
- develop
- releases/**
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.run_number }}
cancel-in-progress: true
defaults:
run:
shell:
@@ -18,7 +23,7 @@ jobs:
runs-on: windows-latest
steps:
- uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b
- uses: actions/setup-python@98f2ad02fd48d057ee3b4d4f66525b231c3e52b6
- uses: actions/setup-python@d09bd5e6005b175076f227b13d9730d56e9dcfcb
with:
python-version: 3.9
- name: Install Python Packages
@@ -36,7 +41,7 @@ jobs:
- uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b
with:
fetch-depth: 0
- uses: actions/setup-python@98f2ad02fd48d057ee3b4d4f66525b231c3e52b6
- uses: actions/setup-python@d09bd5e6005b175076f227b13d9730d56e9dcfcb
with:
python-version: 3.9
- name: Install Python packages
@@ -58,7 +63,7 @@ jobs:
- uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b
with:
fetch-depth: 0
- uses: actions/setup-python@98f2ad02fd48d057ee3b4d4f66525b231c3e52b6
- uses: actions/setup-python@d09bd5e6005b175076f227b13d9730d56e9dcfcb
with:
python-version: 3.9
- name: Install Python packages
@@ -78,7 +83,7 @@ jobs:
- uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b
with:
fetch-depth: 0
- uses: actions/setup-python@98f2ad02fd48d057ee3b4d4f66525b231c3e52b6
- uses: actions/setup-python@d09bd5e6005b175076f227b13d9730d56e9dcfcb
with:
python-version: 3.9
- name: Install Python packages
@@ -98,7 +103,7 @@ jobs:
- uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b
with:
fetch-depth: 0
- uses: actions/setup-python@98f2ad02fd48d057ee3b4d4f66525b231c3e52b6
- uses: actions/setup-python@d09bd5e6005b175076f227b13d9730d56e9dcfcb
with:
python-version: 3.9
- name: Install Python packages
@@ -123,7 +128,7 @@ jobs:
- uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b
with:
fetch-depth: 0
- uses: actions/setup-python@98f2ad02fd48d057ee3b4d4f66525b231c3e52b6
- uses: actions/setup-python@d09bd5e6005b175076f227b13d9730d56e9dcfcb
with:
python-version: 3.9
- name: Install Python packages
@@ -139,11 +144,11 @@ jobs:
echo "installer_root=$((pwd).Path)" | Out-File -FilePath $Env:GITHUB_ENV -Encoding utf8 -Append
env:
ProgressPreference: SilentlyContinue
- uses: actions/upload-artifact@v3
- uses: actions/upload-artifact@3cea5372237819ed00197afe530f5a7ea3e805c8
with:
name: Windows Spack Installer Bundle
path: ${{ env.installer_root }}\pkg\Spack.exe
- uses: actions/upload-artifact@v3
- uses: actions/upload-artifact@3cea5372237819ed00197afe530f5a7ea3e805c8
with:
name: Windows Spack Installer
path: ${{ env.installer_root}}\pkg\Spack.msi
@@ -154,7 +159,7 @@ jobs:
run:
shell: pwsh
steps:
- uses: actions/setup-python@98f2ad02fd48d057ee3b4d4f66525b231c3e52b6
- uses: actions/setup-python@d09bd5e6005b175076f227b13d9730d56e9dcfcb
with:
python-version: 3.9
- name: Install Python packages

View File

@@ -1,3 +1,205 @@
# v0.18.0 (2022-05-28)
`v0.18.0` is a major feature release.
## Major features in this release
1. **Concretizer now reuses by default**
`spack install --reuse` was introduced in `v0.17.0`, and `--reuse`
is now the default concretization mode. Spack will try hard to
resolve dependencies using installed packages or binaries (#30396).
To avoid reuse and use the latest package configurations (the old
default), you can use `spack install --fresh`, or add
configuration like this to your environment or `concretizer.yaml`:
```yaml
concretizer:
reuse: false
```
2. **Finer-grained hashes**
Spack hashes now include `link`, `run`, *and* `build` dependencies,
as well as a canonical hash of package recipes. Previously, hashes
only included `link` and `run` dependencies (though `build`
dependencies were stored by environments). We coarsened the hash to
reduce churn in user installations, but the new default concretizer
behavior mitigates this concern and gets us reuse *and* provenance.
You will be able to see the build dependencies of new installations
with `spack find`. Old installations will not change and their
hashes will not be affected. (#28156, #28504, #30717, #30861)
3. **Improved error messages**
Error handling with the new concretizer is now done with
optimization criteria rather than with unsatisfiable cores, and
Spack reports many more details about conflicting constraints.
(#30669)
4. **Unify environments when possible**
Environments have thus far supported `concretization: together` or
`concretization: separately`. These have been replaced by a new
preference in `concretizer.yaml`:
```yaml
concretizer:
unify: [true|false|when_possible]
```
`concretizer:unify:when_possible` will *try* to resolve a fully
unified environment, but if it cannot, it will create multiple
configurations of some packages where it has to. For large
environments that previously had to be concretized separately, this
can result in a huge speedup (40-50x). (#28941)
5. **Automatically find externals on Cray machines**
Spack can now automatically discover installed packages in the Cray
Programming Environment by running `spack external find` (or `spack
external read-cray-manifest` to *only* query the PE). Packages from
the PE (e.g., `cray-mpich`) are added to the database with full
dependency information, and compilers from the PE are added to
`compilers.yaml`. Available with the June 2022 release of the Cray
Programming Environment. (#24894, #30428)
6. **New binary format and hardened signing**
Spack now has an updated binary format, with improvements for
security. The new format has a detached signature file, and Spack
verifies the signature before untarring or decompressing the binary
package. The previous format embedded the signature in a `tar`
file, which required the client to run `tar` *before* verifying
(#30750). Spack can still install from build caches using the old
format, but we encourage users to switch to the new format going
forward.
Production GitLab pipelines have been hardened to securely sign
binaries. There is now a separate signing stage so that signing
keys are never exposed to build system code, and signing keys are
ephemeral and only live as long as the signing pipeline stage.
(#30753)
7. **Bootstrap mirror generation**
The `spack bootstrap mirror` command can automatically create a
mirror for bootstrapping the concretizer and other needed
dependencies in an air-gapped environment. (#28556)
8. **Nascent Windows support**
Spack now has initial support for Windows. Spack core has been
refactored to run in the Windows environment, and a small number of
packages can now build for Windows. More details are
[in the documentation](https://spack.rtfd.io/en/latest/getting_started.html#spack-on-windows)
(#27021, #28385, many more)
9. **Makefile generation**
`spack env depfile` can be used to generate a `Makefile` from an
environment, which can be used to build the environment's packages
in parallel on a single node, e.g.:
```console
spack -e myenv env depfile > Makefile
make
```
Spack propagates `gmake` jobserver information to builds so that
their jobs can share cores. (#30039, #30254, #30302, #30526)
10. **New variant features**
In addition to being conditional themselves, variants can now have
[conditional *values*](https://spack.readthedocs.io/en/latest/packaging_guide.html#conditional-possible-values)
that are only possible for certain configurations of a package. (#29530)
Variants can be
[declared "sticky"](https://spack.readthedocs.io/en/latest/packaging_guide.html#sticky-variants),
which prevents them from being enabled or disabled by the
concretizer. Sticky variants must be set explicitly by users
on the command line or in `packages.yaml`. (#28630)
* Allow conditional possible values in variants
* Add a "sticky" property to variants
## Other new features of note
* Environment views can optionally link only `run` dependencies
with `link:run` (#29336)
* `spack external find --all` finds library-only packages in
addition to build dependencies (#28005)
* Customizable `config:license_dir` option (#30135)
* `spack external find --path PATH` takes a custom search path (#30479)
* `spack spec` has a new `--format` argument like `spack find` (#27908)
* `spack concretize --quiet` skips printing concretized specs (#30272)
* `spack info` now has cleaner output and displays test info (#22097)
* Package-level submodule option for git commit versions (#30085, #30037)
* Using `/hash` syntax to refer to concrete specs in an environment
now works even if `/hash` is not installed. (#30276)
## Major internal refactors
* full hash (see above)
* new develop versioning scheme `0.19.0-dev0`
* Allow for multiple dependencies/dependents from the same package (#28673)
* Splice differing virtual packages (#27919)
## Performance Improvements
* Concretization of large environments with `unify: when_possible` is
much faster than concretizing separately (#28941, see above)
* Single-pass view generation algorithm is 2.6x faster (#29443)
## Archspec improvements
* `oneapi` and `dpcpp` flag support (#30783)
* better support for `M1` and `a64fx` (#30683)
## Removals and Deprecations
* Spack no longer supports Python `2.6` (#27256)
* Removed deprecated `--run-tests` option of `spack install`;
use `spack test` (#30461)
* Removed deprecated `spack flake8`; use `spack style` (#27290)
* Deprecate `spack:concretization` config option; use
`concretizer:unify` (#30038)
* Deprecate top-level module configuration; use module sets (#28659)
* `spack activate` and `spack deactivate` are deprecated in favor of
environments; will be removed in `0.19.0` (#29430; see also `link:run`
in #29336 above)
## Notable Bugfixes
* Fix bug that broke locks with many parallel builds (#27846)
* Many bugfixes and consistency improvements for the new concretizer
and `--reuse` (#30357, #30092, #29835, #29933, #28605, #29694, #28848)
## Packages
* `CMakePackage` uses `CMAKE_INSTALL_RPATH_USE_LINK_PATH` (#29703)
* Refactored `lua` support: `lua-lang` virtual supports both
`lua` and `luajit` via the new `LuaPackage` build system (#28854)
* PythonPackage: now installs packages with `pip` (#27798)
* Python: improve site_packages_dir handling (#28346)
* Extends: support spec, not just package name (#27754)
* `find_libraries`: search for both .so and .dylib on macOS (#28924)
* Use stable URLs and `?full_index=1` for all github patches (#29239)
## Spack community stats
* 6,416 total packages, 458 new since `v0.17.0`
* 219 new Python packages
* 60 new R packages
* 377 people contributed to this release
* 337 committers to packages
* 85 committers to core
# v0.17.2 (2022-04-13)
### Spack bugfixes
@@ -11,7 +213,7 @@
* Fixed a few bugs affecting the spack ci command (#29518, #29419)
* Fix handling of Intel compiler environment (#29439)
* Fix a few edge cases when reindexing the DB (#28764)
* Remove "Known issues" from documentation (#29664)
* Remove "Known issues" from documentation (#29664)
* Other miscellaneous bugfixes (0b72e070583fc5bcd016f5adc8a84c99f2b7805f, #28403, #29261)
# v0.17.1 (2021-12-23)

View File

@@ -6,34 +6,15 @@ bootstrap:
# by Spack is installed in a "store" subfolder of this root directory
root: $user_cache_path/bootstrap
# Methods that can be used to bootstrap software. Each method may or
# may not be able to bootstrap all of the software that Spack needs,
# may not be able to bootstrap all the software that Spack needs,
# depending on its type.
sources:
- name: 'github-actions-v0.2'
type: buildcache
description: |
Buildcache generated from a public workflow using Github Actions.
The sha256 checksum of binaries is checked before installation.
info:
url: https://mirror.spack.io/bootstrap/github-actions/v0.2
homepage: https://github.com/spack/spack-bootstrap-mirrors
releases: https://github.com/spack/spack-bootstrap-mirrors/releases
metadata: $spack/share/spack/bootstrap/github-actions-v0.2
- name: 'github-actions-v0.1'
type: buildcache
description: |
Buildcache generated from a public workflow using Github Actions.
The sha256 checksum of binaries is checked before installation.
info:
url: https://mirror.spack.io/bootstrap/github-actions/v0.1
homepage: https://github.com/spack/spack-bootstrap-mirrors
releases: https://github.com/spack/spack-bootstrap-mirrors/releases
# This method is just Spack bootstrapping the software it needs from sources.
# It has been added here so that users can selectively disable bootstrapping
# from sources by "untrusting" it.
- name: spack-install
type: install
description: |
Specs built from sources by Spack. May take a long time.
metadata: $spack/share/spack/bootstrap/github-actions-v0.1
- name: 'spack-install'
metadata: $spack/share/spack/bootstrap/spack-install
trusted:
# By default we trust bootstrapping from sources and from binaries
# produced on Github via the workflow

View File

@@ -28,3 +28,9 @@ concretizer:
# instance concretize with target "icelake" while running on "haswell").
# If "true" only allow targets that are compatible with the host.
host_compatible: true
# When "true" concretize root specs of environments together, so that each unique
# package in an environment corresponds to one concrete spec. This ensures
# environments can always be activated. When "false" perform concretization separately
# on each root spec, allowing different versions and variants of the same package in
# an environment.
unify: false

View File

@@ -33,6 +33,9 @@ config:
template_dirs:
- $spack/share/spack/templates
# Directory where licenses should be located
license_dir: $spack/etc/spack/licenses
# Temporary locations Spack can try to use for builds.
#
# Recommended options are given below.

View File

@@ -50,6 +50,13 @@ build cache files for the "ninja" spec:
Note that the targeted spec must already be installed. Once you have a build cache,
you can add it as a mirror, discussed next.
.. warning::
Spack improved the format used for binary caches in v0.18. The entire v0.18 series
will be able to verify and install binary caches both in the new and in the old format.
Support for using the old format is expected to end in v0.19, so we advise users to
recreate relevant buildcaches using Spack v0.18 or higher.
---------------------------------------
Finding or installing build cache files
---------------------------------------

View File

@@ -0,0 +1,160 @@
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)
.. _bootstrapping:
=============
Bootstrapping
=============
In the :ref:`Getting started <getting_started>` Section we already mentioned that
Spack can bootstrap some of its dependencies, including ``clingo``. In fact, there
is an entire command dedicated to the management of every aspect of bootstrapping:
.. command-output:: spack bootstrap --help
The first thing to know to understand bootstrapping in Spack is that each of
Spack's dependencies is bootstrapped lazily; i.e. the first time it is needed and
can't be found. You can readily check if any prerequisite for using Spack
is missing by running:
.. code-block:: console
% spack bootstrap status
Spack v0.17.1 - python@3.8
[FAIL] Core Functionalities
[B] MISSING "clingo": required to concretize specs
[FAIL] Binary packages
[B] MISSING "gpg2": required to sign/verify buildcaches
Spack will take care of bootstrapping any missing dependency marked as [B]. Dependencies marked as [-] are instead required to be found on the system.
In the case of the output shown above, Spack detected that both ``clingo`` and ``gnupg``
are missing, and it gives detailed information on why they are needed and whether
they can be bootstrapped. Running a command that concretizes a spec, like:
.. code-block:: console
% spack solve zlib
==> Bootstrapping clingo from pre-built binaries
==> Fetching https://mirror.spack.io/bootstrap/github-actions/v0.1/build_cache/darwin-catalina-x86_64/apple-clang-12.0.0/clingo-bootstrap-spack/darwin-catalina-x86_64-apple-clang-12.0.0-clingo-bootstrap-spack-p5on7i4hejl775ezndzfdkhvwra3hatn.spack
==> Installing "clingo-bootstrap@spack%apple-clang@12.0.0~docs~ipo+python build_type=Release arch=darwin-catalina-x86_64" from a buildcache
[ ... ]
triggers the bootstrapping of clingo from pre-built binaries as expected.
-----------------------
The Bootstrapping store
-----------------------
The software installed for bootstrapping purposes is deployed in a separate store.
Its location can be checked with the following command:
.. code-block:: console
% spack bootstrap root
It can also be changed with the same command by specifying the desired new path:
.. code-block:: console
% spack bootstrap root /opt/spack/bootstrap
You can check what is installed in the bootstrapping store at any time using:
.. code-block:: console
% spack find -b
==> Showing internal bootstrap store at "/Users/spack/.spack/bootstrap/store"
==> 11 installed packages
-- darwin-catalina-x86_64 / apple-clang@12.0.0 ------------------
clingo-bootstrap@spack libassuan@2.5.5 libgpg-error@1.42 libksba@1.5.1 pinentry@1.1.1 zlib@1.2.11
gnupg@2.3.1 libgcrypt@1.9.3 libiconv@1.16 npth@1.6 python@3.8
If needed, you can remove all the software in the current bootstrapping store with:
.. code-block:: console
% spack clean -b
==> Removing bootstrapped software and configuration in "/Users/spack/.spack/bootstrap"
% spack find -b
==> Showing internal bootstrap store at "/Users/spack/.spack/bootstrap/store"
==> 0 installed packages
--------------------------------------------
Enabling and disabling bootstrapping methods
--------------------------------------------
Bootstrapping is always performed by trying the methods listed by:
.. command-output:: spack bootstrap list
in the order they appear, from top to bottom. By default Spack is
configured to try bootstrapping from pre-built binaries first, and to
fall back to bootstrapping from sources if that fails.
If need be, you can disable bootstrapping altogether by running:
.. code-block:: console
% spack bootstrap disable
in which case it's your responsibility to ensure Spack runs in an
environment where all its prerequisites are installed. You can
also configure Spack to skip certain bootstrapping methods by *untrusting*
them. For instance:
.. code-block:: console
% spack bootstrap untrust github-actions
==> "github-actions" is now untrusted and will not be used for bootstrapping
tells Spack to skip trying to bootstrap from binaries. To add the "github-actions" method back you can:
.. code-block:: console
% spack bootstrap trust github-actions
There is also an option to reset the bootstrapping configuration to Spack's defaults:
.. code-block:: console
% spack bootstrap reset
==> Bootstrapping configuration is being reset to Spack's defaults. Current configuration will be lost.
Do you want to continue? [Y/n]
%
----------------------------------------
Creating a mirror for air-gapped systems
----------------------------------------
Spack's default configuration for bootstrapping relies on the user having
access to the internet, either to fetch pre-compiled binaries or source tarballs.
Sometimes though Spack is deployed on air-gapped systems where such access is denied.
To help with similar situations Spack has a command that recreates, in a local folder
of your choice, a mirror containing the source tarballs and/or binary packages needed for
bootstrapping.
.. code-block:: console
% spack bootstrap mirror --binary-packages /opt/bootstrap
==> Adding "clingo-bootstrap@spack+python %apple-clang target=x86_64" and dependencies to the mirror at /opt/bootstrap/local-mirror
==> Adding "gnupg@2.3: %apple-clang target=x86_64" and dependencies to the mirror at /opt/bootstrap/local-mirror
==> Adding "patchelf@0.13.1:0.13.99 %apple-clang target=x86_64" and dependencies to the mirror at /opt/bootstrap/local-mirror
==> Adding binary packages from "https://github.com/alalazo/spack-bootstrap-mirrors/releases/download/v0.1-rc.2/bootstrap-buildcache.tar.gz" to the mirror at /opt/bootstrap/local-mirror
To register the mirror on the platform where it's supposed to be used run the following command(s):
% spack bootstrap add --trust local-sources /opt/bootstrap/metadata/sources
% spack bootstrap add --trust local-binaries /opt/bootstrap/metadata/binaries
This command needs to be run on a machine with internet access and the resulting folder
has to be moved over to the air-gapped system. Once the local sources are added using the
commands suggested at the prompt, they can be used to bootstrap Spack.

View File

@@ -39,6 +39,7 @@ on these ideas for each distinct build system that Spack supports:
build_systems/autotoolspackage
build_systems/cmakepackage
build_systems/cachedcmakepackage
build_systems/mesonpackage
build_systems/qmakepackage
build_systems/sippackage

View File

@@ -0,0 +1,123 @@
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)
.. _cachedcmakepackage:
------------------
CachedCMakePackage
------------------
The CachedCMakePackage base class is used for CMake-based workflows
that create a CMake cache file prior to running ``cmake``. This is
useful for packages whose ``cmake`` command lines would otherwise exceed
the system's argument-length limit, and for reproducibility.
The documentation for this class assumes that the user is familiar with
the ``CMakePackage`` class from which it inherits. See the documentation
for :ref:`CMakePackage <cmakepackage>`.
^^^^^^
Phases
^^^^^^
The ``CachedCMakePackage`` base class comes with the following phases:
#. ``initconfig`` - generate the CMake cache file
#. ``cmake`` - generate the Makefile
#. ``build`` - build the package
#. ``install`` - install the package
By default, these phases run:
.. code-block:: console
$ mkdir spack-build
$ cd spack-build
$ cat << EOF > name-arch-compiler@version.cmake
# Write information on compilers and dependencies
# includes information on mpi and cuda if applicable
EOF
$ cmake .. -DCMAKE_INSTALL_PREFIX=/path/to/installation/prefix -C name-arch-compiler@version.cmake
$ make
$ make test # optional
$ make install
The ``CachedCMakePackage`` class inherits from the ``CMakePackage``
class, and accepts all of the same options and adds all of the same
flags to the ``cmake`` command. Similar to the ``CMakePackage`` class,
you may need to add a few arguments yourself, and the
``CachedCMakePackage`` provides the same interface to add those
flags.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Adding entries to the CMake cache
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
In addition to adding flags to the ``cmake`` command, you may need to
add entries to the CMake cache in the ``initconfig`` phase. This can
be done by overriding one of four methods:
#. ``CachedCMakePackage.initconfig_compiler_entries``
#. ``CachedCMakePackage.initconfig_mpi_entries``
#. ``CachedCMakePackage.initconfig_hardware_entries``
#. ``CachedCMakePackage.initconfig_package_entries``
Each of these methods returns a list of CMake cache strings. The
distinction between these methods is merely to provide a
well-structured and legible cmake cache file -- otherwise, entries
from each of these methods are handled identically.
Spack also provides convenience methods for generating CMake cache
entries. These methods are available at module scope in every Spack
package. Because CMake parses boolean options, strings, and paths
differently, there are three such methods:
#. ``cmake_cache_option``
#. ``cmake_cache_string``
#. ``cmake_cache_path``
These methods each accept three parameters -- the name of the CMake
variable associated with the entry, the value of the entry, and an
optional comment -- and return strings in the appropriate format to be
returned from any of the ``initconfig*`` methods. Additionally, these
methods may return comments beginning with the ``#`` character.
A typical usage of these methods may look something like this:
.. code-block:: python
def initconfig_mpi_entries(self):
    # Get existing MPI configurations
    entries = super(Foo, self).initconfig_mpi_entries()

    # The existing MPI configurations key on whether ``mpi`` is in the spec
    # This spec has an MPI variant, and we need to enable MPI when it is on.
    # This hypothetical package controls MPI with the ``FOO_MPI`` option to
    # cmake.
    if '+mpi' in self.spec:
        entries.append(cmake_cache_option('FOO_MPI', True, "enable mpi"))
    else:
        entries.append(cmake_cache_option('FOO_MPI', False, "disable mpi"))
    return entries

def initconfig_package_entries(self):
    # Package specific options
    entries = []

    entries.append('#Entries for build options')

    bar_on = '+bar' in self.spec
    entries.append(cmake_cache_option('FOO_BAR', bar_on, 'toggle bar'))

    entries.append('#Entries for dependencies')

    if self.spec['blas'].name == 'baz':  # baz is our blas provider
        entries.append(cmake_cache_string('FOO_BLAS', 'baz', 'Use baz'))
        entries.append(cmake_cache_path('BAZ_PREFIX', self.spec['baz'].prefix))
    return entries
^^^^^^^^^^^^^^^^^^^^^^
External documentation
^^^^^^^^^^^^^^^^^^^^^^
For more information on CMake cache files, see:
https://cmake.org/cmake/help/latest/manual/cmake.1.html

View File

@@ -84,8 +84,8 @@ build ``hdf5`` with Intel oneAPI MPI do::
spack install hdf5 +mpi ^intel-oneapi-mpi
Using an Externally Installed oneAPI
====================================
Using Externally Installed oneAPI Tools
=======================================
Spack can also use oneAPI tools that are manually installed with
`Intel Installers`_. The procedures for configuring Spack to use
@@ -110,7 +110,7 @@ Another option is to manually add the configuration to
Libraries
---------
If you want Spack to use MKL that you have installed without Spack in
If you want Spack to use oneMKL that you have installed without Spack in
the default location, then add the following to
``~/.spack/packages.yaml``, adjusting the version as appropriate::
@@ -139,7 +139,7 @@ You can also use Spack-installed libraries. For example::
spack load intel-oneapi-mkl
Will update your environment CPATH, LIBRARY_PATH, and other
environment variables for building an application with MKL.
environment variables for building an application with oneMKL.
More information
================

View File

@@ -15,6 +15,9 @@ IntelPackage
Intel packages in Spack
^^^^^^^^^^^^^^^^^^^^^^^^
This is an earlier version of Intel software development tools and has
now been replaced by Intel oneAPI Toolkits.
Spack can install and use several software development products offered by Intel.
Some of these are available under no-cost terms, others require a paid license.
All share the same basic steps for configuration, installation, and, where

View File

@@ -59,7 +59,8 @@ other techniques to minimize the size of the final image:
&& echo " specs:" \
&& echo " - gromacs+mpi" \
&& echo " - mpich" \
&& echo " concretization: together" \
&& echo " concretizer: together" \
&& echo " unify: true" \
&& echo " config:" \
&& echo " install_tree: /opt/software" \
&& echo " view: /opt/view") > /opt/spack-environment/spack.yaml
@@ -108,9 +109,10 @@ Spack Images on Docker Hub
--------------------------
Docker images with Spack preinstalled and ready to be used are
built on `Docker Hub <https://hub.docker.com/u/spack>`_
at every push to ``develop`` or to a release branch. The OS that
are currently supported are summarized in the table below:
built when a release is tagged, or nightly on ``develop``. The images
are then pushed both to `Docker Hub <https://hub.docker.com/u/spack>`_
and to `GitHub Container Registry <https://github.com/orgs/spack/packages?repo_name=spack>`_.
The operating systems currently supported are summarized in the table below:
.. _containers-supported-os:
@@ -120,22 +122,31 @@ are currently supported are summarized in the table below:
* - Operating System
- Base Image
- Spack Image
* - Ubuntu 16.04
- ``ubuntu:16.04``
- ``spack/ubuntu-xenial``
* - Ubuntu 18.04
- ``ubuntu:18.04``
- ``spack/ubuntu-bionic``
* - Ubuntu 20.04
- ``ubuntu:20.04``
- ``spack/ubuntu-focal``
* - Ubuntu 22.04
- ``ubuntu:22.04``
- ``spack/ubuntu-jammy``
* - CentOS 7
- ``centos:7``
- ``spack/centos7``
* - CentOS Stream
- ``quay.io/centos/centos:stream``
- ``spack/centos-stream``
* - openSUSE Leap
- ``opensuse/leap``
- ``spack/leap15``
* - Amazon Linux 2
- ``amazonlinux:2``
- ``spack/amazon-linux``
All the images are tagged with the corresponding release of Spack:
.. image:: dockerhub_spack.png
.. image:: images/ghcr_spack.png
with the exception of the ``latest`` tag that points to the HEAD
of the ``develop`` branch. These images are available for anyone
@@ -245,7 +256,8 @@ software is respectively built and installed:
&& echo " specs:" \
&& echo " - gromacs+mpi" \
&& echo " - mpich" \
&& echo " concretization: together" \
&& echo " concretizer:" \
&& echo " unify: true" \
&& echo " config:" \
&& echo " install_tree: /opt/software" \
&& echo " view: /opt/view") > /opt/spack-environment/spack.yaml
@@ -366,7 +378,8 @@ produces, for instance, the following ``Dockerfile``:
&& echo " externals:" \
&& echo " - spec: cuda%gcc" \
&& echo " prefix: /usr/local/cuda" \
&& echo " concretization: together" \
&& echo " concretizer:" \
&& echo " unify: true" \
&& echo " config:" \
&& echo " install_tree: /opt/software" \
&& echo " view: /opt/view") > /opt/spack-environment/spack.yaml

View File

@@ -151,7 +151,7 @@ Package-related modules
^^^^^^^^^^^^^^^^^^^^^^^
:mod:`spack.package`
Contains the :class:`~spack.package.Package` class, which
Contains the :class:`~spack.package_base.Package` class, which
is the superclass for all packages in Spack. Methods on ``Package``
implement all phases of the :ref:`package lifecycle
<package-lifecycle>` and manage the build process.

Binary image not shown (before: 88 KiB).

View File

@@ -273,19 +273,9 @@ or
Concretizing
^^^^^^^^^^^^
Once some user specs have been added to an environment, they can be
concretized. *By default specs are concretized separately*, one after
the other. This mode of operation permits to deploy a full
software stack where multiple configurations of the same package
need to be installed alongside each other. Central installations done
at HPC centers by system administrators or user support groups
are a common case that fits in this behavior.
Environments *can also be configured to concretize all
the root specs in a self-consistent way* to ensure that
each package in the environment comes with a single configuration. This
mode of operation is usually what is required by software developers that
want to deploy their development environment.
Once some user specs have been added to an environment, they can be concretized.
There are at the moment three different modes of operation to concretize an environment,
which are explained in detail in :ref:`environments_concretization_config`.
Regardless of which mode of operation has been chosen, the following
command will ensure all the root specs are concretized according to the
constraints that are prescribed in the configuration:
@@ -493,32 +483,76 @@ Appending to this list in the yaml is identical to using the ``spack
add`` command from the command line. However, there is more power
available from the yaml file.
.. _environments_concretization_config:
^^^^^^^^^^^^^^^^^^^
Spec concretization
^^^^^^^^^^^^^^^^^^^
Specs can be concretized separately or together, as already
explained in :ref:`environments_concretization`. The behavior active
under any environment is determined by the ``concretization`` property:
An environment can be concretized in three different modes; the mode active in any environment
is determined by the ``concretizer:unify`` property. By default specs are concretized *separately*, one after the other:
.. code-block:: yaml
spack:
specs:
- ncview
- netcdf
- nco
- py-sphinx
concretization: together
- hdf5~mpi
- hdf5+mpi
- zlib@1.2.8
concretizer:
unify: false
which can currently take either one of the two allowed values ``together`` or ``separately``
(the default).
This mode of operation makes it possible to deploy a full software stack where multiple configurations of the same package
need to be installed alongside each other, using the best possible selection of transitive dependencies. The downside
is that redundancy of installations is disregarded completely, and thus environments might be more bloated than
strictly needed. In the example above, for instance, if a version of ``zlib`` newer than ``1.2.8`` is known to Spack,
then it will be used for both ``hdf5`` installations.
If redundancy of the environment is a concern, Spack provides a way to install it *together where possible*,
i.e. trying to maximize reuse of dependencies across different specs:
.. code-block:: yaml
spack:
specs:
- hdf5~mpi
- hdf5+mpi
- zlib@1.2.8
concretizer:
unify: when_possible
In this case too, Spack allows multiple configurations of the same package, but prioritizes the reuse of
specs over other factors. Going back to our example, this means that both ``hdf5`` installations will use
``zlib@1.2.8`` as a dependency even if newer versions of that library are available.
Central installations done at HPC centers by system administrators or user support groups are a common case
that fits either of these two modes.
Environments can also be configured to concretize all the root specs *together*, in a self-consistent way, to
ensure that each package in the environment comes with a single configuration:
.. code-block:: yaml
spack:
specs:
- hdf5+mpi
- zlib@1.2.8
concretizer:
unify: true
This mode of operation is usually what is required by software developers who want to deploy their development
environment and have a single view of it in the filesystem.
.. note::
The ``concretizer:unify`` config option was introduced in Spack 0.18 to
replace the ``concretization`` property. For reference,
``concretization: together`` is replaced by ``concretizer:unify:true``,
and ``concretization: separately`` is replaced by ``concretizer:unify:false``.
.. admonition:: Re-concretization of user specs
When concretizing specs together the entire set of specs will be
When concretizing specs *together* or *together where possible* the entire set of specs will be
re-concretized after any addition of new user specs, to ensure that
the environment remains consistent. When instead the specs are concretized
the environment remains consistent / minimal. When instead the specs are concretized
separately only the new specs will be re-concretized after any addition.
^^^^^^^^^^^^^
@@ -765,7 +799,7 @@ directories.
select: [^mpi]
exclude: ['%pgi@18.5']
projections:
all: {name}/{version}-{compiler.name}
all: '{name}/{version}-{compiler.name}'
link: all
link_type: symlink
@@ -1017,4 +1051,4 @@ the include is conditional.
the ``--make-target-prefix`` flag and use the non-phony targets
``<target-prefix>/env`` and ``<target-prefix>/fetch`` as
prerequisites, instead of the phony targets ``<target-prefix>/all``
and ``<target-prefix>/fetch-all`` respectively.
and ``<target-prefix>/fetch-all`` respectively.

Binary image not shown (after: 70 KiB).

View File

@@ -63,6 +63,7 @@ or refer to the full manual below.
configuration
config_yaml
bootstrapping
build_settings
environments
containers

View File

@@ -1070,13 +1070,32 @@ Commits
Submodules
You can supply ``submodules=True`` to cause Spack to fetch submodules
recursively along with the repository at fetch time. For more information
about git submodules see the manpage of git: ``man git-submodule``.
recursively along with the repository at fetch time.
.. code-block:: python
version('1.0.1', tag='v1.0.1', submodules=True)
If a package needs more fine-grained control over submodules, define
``submodules`` to be a callable function that takes the package instance as
its only argument. The function should return a list of submodules to be fetched.
.. code-block:: python
def submodules(package):
    submodules = []
    if "+variant-1" in package.spec:
        submodules.append("submodule_for_variant_1")
    if "+variant-2" in package.spec:
        submodules.append("submodule_for_variant_2")
    return submodules


class MyPackage(Package):
    version("0.1.0", submodules=submodules)
For more information about git submodules see the manpage of git: ``man
git-submodule``.
.. _github-fetch:
@@ -2393,9 +2412,9 @@ Influence how dependents are built or run
Spack provides a mechanism for dependencies to influence the
environment of their dependents by overriding the
:meth:`setup_dependent_run_environment <spack.package.PackageBase.setup_dependent_run_environment>`
:meth:`setup_dependent_run_environment <spack.package_base.PackageBase.setup_dependent_run_environment>`
or the
:meth:`setup_dependent_build_environment <spack.package.PackageBase.setup_dependent_build_environment>`
:meth:`setup_dependent_build_environment <spack.package_base.PackageBase.setup_dependent_build_environment>`
methods.
The Qt package, for instance, uses this call:
@@ -2417,7 +2436,7 @@ will have the ``PYTHONPATH``, ``PYTHONHOME`` and ``PATH`` environment
variables set appropriately before starting the installation. To make things
even simpler the ``python setup.py`` command is also inserted into the module
scope of dependents by overriding a third method called
:meth:`setup_dependent_package <spack.package.PackageBase.setup_dependent_package>`
:meth:`setup_dependent_package <spack.package_base.PackageBase.setup_dependent_package>`
:
.. literalinclude:: _spack_root/var/spack/repos/builtin/packages/python/package.py
@@ -2775,6 +2794,256 @@ Suppose a user invokes ``spack install`` like this:
Spack will fail with a constraint violation, because the version of
MPICH requested is too low for the ``mpi`` requirement in ``foo``.
.. _custom-attributes:
------------------
Custom attributes
------------------
Often a package will need to provide attributes for dependents to query
various details about what it provides. While a package can implement any
number of custom attributes, the four attributes described below are always
available on every package, with default implementations, and they can be
customized with alternate implementations for each virtual package provided:
=========== =========================================== ===============================================
Attribute   Purpose                                     Default
=========== =========================================== ===============================================
``home``    The installation path for the package       ``spec.prefix``
``command`` An executable command for the package       ``spec.name`` found in ``.home.bin``
``headers`` A list of headers provided by the package   All headers searched recursively in
                                                        ``.home.include``
``libs``    A list of libraries provided by the package ``lib{spec.name}`` searched recursively in
                                                        ``.home``, starting with ``lib`` and
                                                        ``lib64``, then the rest of ``.home``
=========== =========================================== ===============================================
Each of these can be customized by implementing the relevant attribute
as a ``@property`` in the package's class:
.. code-block:: python
:linenos:
class Foo(Package):
    ...
    @property
    def libs(self):
        # The library provided by Foo is libMyFoo.so
        return find_libraries('libMyFoo', root=self.home, recursive=True)
A package may also provide a custom implementation of each attribute
for the virtual packages it provides by implementing the
``virtualpackagename_attributename`` property in the package's class.
The implementation used is the first one found from:
#. Specialized virtual: ``Package.virtualpackagename_attributename``
#. Generic package: ``Package.attributename``
#. Default
The use of customized attributes is demonstrated in the next example.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Example: Customized attributes for virtual packages
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Consider a package ``foo`` that can optionally provide two virtual
packages ``bar`` and ``baz``. When both are enabled the installation tree
appears as follows:
.. code-block:: console
include/foo.h
include/bar/bar.h
lib64/libFoo.so
lib64/libFooBar.so
baz/include/baz/baz.h
baz/lib/libFooBaz.so
The install tree shows that ``foo`` is providing the header ``include/foo.h``
and library ``lib64/libFoo.so`` in its install prefix. The virtual
package ``bar`` is providing ``include/bar/bar.h`` and library
``lib64/libFooBar.so``, also in ``foo``'s install prefix. The ``baz``
package, however, is provided in the ``baz`` subdirectory of ``foo``'s
prefix with the ``include/baz/baz.h`` header and ``lib/libFooBaz.so``
library. Such a package could implement the optional attributes as
follows:
.. code-block:: python
:linenos:
class Foo(Package):
    ...
    variant('bar', default=False, description='Enable the Foo implementation of bar')
    variant('baz', default=False, description='Enable the Foo implementation of baz')
    ...
    provides('bar', when='+bar')
    provides('baz', when='+baz')
    ...

    # Just the foo headers
    @property
    def headers(self):
        return find_headers('foo', root=self.home.include, recursive=False)

    # Just the foo libraries
    @property
    def libs(self):
        return find_libraries('libFoo', root=self.home, recursive=True)

    # The header provided by the bar virtual package
    @property
    def bar_headers(self):
        return find_headers('bar/bar.h', root=self.home.include, recursive=False)

    # The library provided by the bar virtual package
    @property
    def bar_libs(self):
        return find_libraries('libFooBar', root=self.home, recursive=True)

    # The baz virtual package home
    @property
    def baz_home(self):
        return self.prefix.baz

    # The header provided by the baz virtual package
    @property
    def baz_headers(self):
        return find_headers('baz/baz', root=self.baz_home.include, recursive=False)

    # The library provided by the baz virtual package
    @property
    def baz_libs(self):
        return find_libraries('libFooBaz', root=self.baz_home, recursive=True)
Now consider another package, ``foo-app``, depending on all three:
.. code-block:: python
:linenos:
class FooApp(CMakePackage):
    ...
    depends_on('foo')
    depends_on('bar')
    depends_on('baz')
The resulting spec objects for its dependencies show the result of
the above attribute implementations:
.. code-block:: python
# The core headers and libraries of the foo package
>>> spec['foo']
foo@1.0%gcc@11.3.1+bar+baz arch=linux-fedora35-haswell
>>> spec['foo'].prefix
'/opt/spack/linux-fedora35-haswell/gcc-11.3.1/foo-1.0-ca3rczp5omy7dfzoqw4p7oc2yh3u7lt6'
# home defaults to the package install prefix without an explicit implementation
>>> spec['foo'].home
'/opt/spack/linux-fedora35-haswell/gcc-11.3.1/foo-1.0-ca3rczp5omy7dfzoqw4p7oc2yh3u7lt6'
# foo headers from the foo prefix
>>> spec['foo'].headers
HeaderList([
'/opt/spack/linux-fedora35-haswell/gcc-11.3.1/foo-1.0-ca3rczp5omy7dfzoqw4p7oc2yh3u7lt6/include/foo.h',
])
# foo include directories from the foo prefix
>>> spec['foo'].headers.directories
['/opt/spack/linux-fedora35-haswell/gcc-11.3.1/foo-1.0-ca3rczp5omy7dfzoqw4p7oc2yh3u7lt6/include']
# foo libraries from the foo prefix
>>> spec['foo'].libs
LibraryList([
'/opt/spack/linux-fedora35-haswell/gcc-11.3.1/foo-1.0-ca3rczp5omy7dfzoqw4p7oc2yh3u7lt6/lib64/libFoo.so',
])
# foo library directories from the foo prefix
>>> spec['foo'].libs.directories
['/opt/spack/linux-fedora35-haswell/gcc-11.3.1/foo-1.0-ca3rczp5omy7dfzoqw4p7oc2yh3u7lt6/lib64']
.. code-block:: python
# The virtual bar package in the same prefix as foo
# bar resolves to the foo package
>>> spec['bar']
foo@1.0%gcc@11.3.1+bar+baz arch=linux-fedora35-haswell
>>> spec['bar'].prefix
'/opt/spack/linux-fedora35-haswell/gcc-11.3.1/foo-1.0-ca3rczp5omy7dfzoqw4p7oc2yh3u7lt6'
# home defaults to the foo prefix without either a Foo.bar_home
# or Foo.home implementation
>>> spec['bar'].home
'/opt/spack/linux-fedora35-haswell/gcc-11.3.1/foo-1.0-ca3rczp5omy7dfzoqw4p7oc2yh3u7lt6'
# bar header in the foo prefix
>>> spec['bar'].headers
HeaderList([
'/opt/spack/linux-fedora35-haswell/gcc-11.3.1/foo-1.0-ca3rczp5omy7dfzoqw4p7oc2yh3u7lt6/include/bar/bar.h'
])
# bar include dirs from the foo prefix
>>> spec['bar'].headers.directories
['/opt/spack/linux-fedora35-haswell/gcc-11.3.1/foo-1.0-ca3rczp5omy7dfzoqw4p7oc2yh3u7lt6/include']
# bar library from the foo prefix
>>> spec['bar'].libs
LibraryList([
'/opt/spack/linux-fedora35-haswell/gcc-11.3.1/foo-1.0-ca3rczp5omy7dfzoqw4p7oc2yh3u7lt6/lib64/libFooBar.so'
])
# bar library directories from the foo prefix
>>> spec['bar'].libs.directories
['/opt/spack/linux-fedora35-haswell/gcc-11.3.1/foo-1.0-ca3rczp5omy7dfzoqw4p7oc2yh3u7lt6/lib64']
.. code-block:: python
# The virtual baz package in a subdirectory of foo's prefix
# baz resolves to the foo package
>>> spec['baz']
foo@1.0%gcc@11.3.1+bar+baz arch=linux-fedora35-haswell
>>> spec['baz'].prefix
'/opt/spack/linux-fedora35-haswell/gcc-11.3.1/foo-1.0-ca3rczp5omy7dfzoqw4p7oc2yh3u7lt6'
# baz_home implementation provides the subdirectory inside the foo prefix
>>> spec['baz'].home
'/opt/spack/linux-fedora35-haswell/gcc-11.3.1/foo-1.0-ca3rczp5omy7dfzoqw4p7oc2yh3u7lt6/baz'
# baz headers in the baz subdirectory of the foo prefix
>>> spec['baz'].headers
HeaderList([
'/opt/spack/linux-fedora35-haswell/gcc-11.3.1/foo-1.0-ca3rczp5omy7dfzoqw4p7oc2yh3u7lt6/baz/include/baz/baz.h'
])
# baz include directories in the baz subdirectory of the foo prefix
>>> spec['baz'].headers.directories
[
'/opt/spack/linux-fedora35-haswell/gcc-11.3.1/foo-1.0-ca3rczp5omy7dfzoqw4p7oc2yh3u7lt6/baz/include'
]
# baz libraries in the baz subdirectory of the foo prefix
>>> spec['baz'].libs
LibraryList([
'/opt/spack/linux-fedora35-haswell/gcc-11.3.1/foo-1.0-ca3rczp5omy7dfzoqw4p7oc2yh3u7lt6/baz/lib/libFooBaz.so'
])
# baz library directories in the baz subdirectory of the foo prefix
>>> spec['baz'].libs.directories
[
'/opt/spack/linux-fedora35-haswell/gcc-11.3.1/foo-1.0-ca3rczp5omy7dfzoqw4p7oc2yh3u7lt6/baz/lib'
]
.. _abstract-and-concrete:
-------------------------
@@ -3022,7 +3291,7 @@ The classes that are currently provided by Spack are:
+----------------------------------------------------------+----------------------------------+
| **Base Class** | **Purpose** |
+==========================================================+==================================+
| :class:`~spack.package.Package` | General base class not |
| :class:`~spack.package_base.Package` | General base class not |
| | specialized for any build system |
+----------------------------------------------------------+----------------------------------+
| :class:`~spack.build_systems.makefile.MakefilePackage` | Specialized class for packages |
@@ -3153,7 +3422,7 @@ for the install phase is:
For those not used to Python instance methods, this is the
package itself. In this case it's an instance of ``Foo``, which
extends ``Package``. For API docs on Package objects, see
:py:class:`Package <spack.package.Package>`.
:py:class:`Package <spack.package_base.Package>`.
``spec``
This is the concrete spec object created by Spack from an
@@ -5476,6 +5745,24 @@ Version Lists
Spack packages should list supported versions with the newest first.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Using ``home`` vs ``prefix``
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
``home`` and ``prefix`` are both attributes that can be queried on a
package's dependencies, often when passing configure arguments pointing to the
location of a dependency. The difference is that while ``prefix`` is the
location on disk where a concrete package resides, ``home`` is the `logical`
location in which a package resides, which may differ from ``prefix`` in
the case of virtual packages or other special circumstances. For most use
cases inside a package, its dependency locations can be accessed via either
``self.spec['foo'].home`` or ``self.spec['foo'].prefix``. Specific packages
that should be consumed by dependents via ``.home`` instead of ``.prefix``
should be noted in their respective documentation.
See :ref:`custom-attributes` for more details and an example implementing
a custom ``home`` attribute.
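
For illustration only, a dependent package might forward a dependency's
``home`` to its build system, as in the following sketch (the ``foo``
dependency and its ``--with-foo`` configure flag are hypothetical):

.. code-block:: python

   class MyApp(AutotoolsPackage):
       depends_on('foo')

       def configure_args(self):
           # ``home`` is the logical location; for non-virtual packages
           # it usually coincides with ``prefix``
           return ['--with-foo={0}'.format(self.spec['foo'].home)]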
---------------------------
Packaging workflow commands
---------------------------

View File

@@ -115,7 +115,8 @@ And here's the spack environment built by the pipeline represented as a
spack:
view: false
concretization: separately
concretizer:
unify: false
definitions:
- pkgs:

View File

@@ -61,7 +61,7 @@ You can see the packages we added earlier in the ``specs:`` section. If you
ever want to add more packages, you can either use ``spack add`` or manually
edit this file.
We also need to change the ``concretization:`` option. By default, Spack
We also need to change the ``concretizer:unify`` option. By default, Spack
concretizes each spec *separately*, allowing multiple versions of the same
package to coexist. Since we want a single consistent environment, we want to
concretize all of the specs *together*.
@@ -78,7 +78,8 @@ Here is what your ``spack.yaml`` looks like with this new setting:
# add package specs to the `specs` list
specs: [bash@5, python, py-numpy, py-scipy, py-matplotlib]
view: true
concretization: together
concretizer:
unify: true
^^^^^^^^^^^^^^^^
Symlink location

View File

@@ -25,4 +25,5 @@ spack:
- subversion
# Plotting
- graphviz
concretization: together
concretizer:
unify: true

View File

@@ -7,7 +7,7 @@ bash, , , Compiler wrappers
tar, , , Extract/create archives
gzip, , , Compress/Decompress archives
unzip, , , Compress/Decompress archives
bzip, , , Compress/Decompress archives
bzip2, , , Compress/Decompress archives
xz, , , Compress/Decompress archives
zstd, , Optional, Compress/Decompress archives
file, , , Create/Use Buildcaches
@@ -15,4 +15,4 @@ gnupg2, , , Sign/Verify Buildcaches
git, , , Manage Software Repositories
svn, , Optional, Manage Software Repositories
hg, , Optional, Manage Software Repositories
Python header files, , Optional (e.g. ``python3-dev`` on Debian), Bootstrapping from sources
Python header files, , Optional (e.g. ``python3-dev`` on Debian), Bootstrapping from sources

6
lib/spack/env/cc vendored
View File

@@ -1,4 +1,4 @@
#!/bin/sh
#!/bin/sh -f
# shellcheck disable=SC2034 # evals in this script fool shellcheck
#
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
@@ -768,7 +768,9 @@ if [ "$SPACK_DEBUG" = TRUE ]; then
input_log="$SPACK_DEBUG_LOG_DIR/spack-cc-$SPACK_DEBUG_LOG_ID.in.log"
output_log="$SPACK_DEBUG_LOG_DIR/spack-cc-$SPACK_DEBUG_LOG_ID.out.log"
echo "[$mode] $command $input_command" >> "$input_log"
echo "[$mode] ${full_command_list}" >> "$output_log"
IFS="$lsep"
echo "[$mode] "$full_command_list >> "$output_log"
unset IFS
fi
# Execute the full command, preserving spaces with IFS set

View File

@@ -18,7 +18,7 @@
* Homepage: https://pypi.python.org/pypi/archspec
* Usage: Labeling, comparison and detection of microarchitectures
* Version: 0.1.2 (commit 85757b6666422fca86aa882a769bf78b0f992f54)
* Version: 0.1.4 (commit b8eea9df2b4204ff27d204452cd46f5199a0b423)
argparse
--------

View File

@@ -61,7 +61,7 @@ def proc_cpuinfo():
``/proc/cpuinfo``
"""
info = {}
with open("/proc/cpuinfo") as file:
with open("/proc/cpuinfo") as file: # pylint: disable=unspecified-encoding
for line in file:
key, separator, value = line.partition(":")
@@ -80,26 +80,46 @@ def proc_cpuinfo():
def _check_output(args, env):
output = subprocess.Popen(args, stdout=subprocess.PIPE, env=env).communicate()[0]
output = subprocess.Popen( # pylint: disable=consider-using-with
args, stdout=subprocess.PIPE, env=env
).communicate()[0]
return six.text_type(output.decode("utf-8"))
def _machine():
""" "Return the machine architecture we are on"""
operating_system = platform.system()
# If we are not on Darwin, trust what Python tells us
if operating_system != "Darwin":
return platform.machine()
# On Darwin it might happen that we are on M1, but using an interpreter
# built for x86_64. In that case "platform.machine() == 'x86_64'", so we
# need to fix that.
#
# See: https://bugs.python.org/issue42704
output = _check_output(
["sysctl", "-n", "machdep.cpu.brand_string"], env=_ensure_bin_usrbin_in_path()
).strip()
if "Apple" in output:
# Note that a native Python interpreter on Apple M1 would return
# "arm64" instead of "aarch64". Here we normalize to the latter.
return "aarch64"
return "x86_64"
@info_dict(operating_system="Darwin")
def sysctl_info_dict():
"""Returns a raw info dictionary parsing the output of sysctl."""
# Make sure that /sbin and /usr/sbin are in PATH as sysctl is
# usually found there
child_environment = dict(os.environ.items())
search_paths = child_environment.get("PATH", "").split(os.pathsep)
for additional_path in ("/sbin", "/usr/sbin"):
if additional_path not in search_paths:
search_paths.append(additional_path)
child_environment["PATH"] = os.pathsep.join(search_paths)
child_environment = _ensure_bin_usrbin_in_path()
def sysctl(*args):
return _check_output(["sysctl"] + list(args), env=child_environment).strip()
if platform.machine() == "x86_64":
if _machine() == "x86_64":
flags = (
sysctl("-n", "machdep.cpu.features").lower()
+ " "
@@ -125,6 +145,18 @@ def sysctl(*args):
return info
def _ensure_bin_usrbin_in_path():
# Make sure that /sbin and /usr/sbin are in PATH as sysctl is
# usually found there
child_environment = dict(os.environ.items())
search_paths = child_environment.get("PATH", "").split(os.pathsep)
for additional_path in ("/sbin", "/usr/sbin"):
if additional_path not in search_paths:
search_paths.append(additional_path)
child_environment["PATH"] = os.pathsep.join(search_paths)
return child_environment
def adjust_raw_flags(info):
"""Adjust the flags detected on the system to homogenize
slightly different representations.
@@ -184,12 +216,7 @@ def compatible_microarchitectures(info):
Args:
info (dict): dictionary containing information on the host cpu
"""
architecture_family = platform.machine()
# On Apple M1 platform.machine() returns "arm64" instead of "aarch64"
# so we should normalize the name here
if architecture_family == "arm64":
architecture_family = "aarch64"
architecture_family = _machine()
# If a tester is not registered, be conservative and assume no known
# target is compatible with the host
tester = COMPATIBILITY_CHECKS.get(architecture_family, lambda x, y: False)
@@ -244,12 +271,7 @@ def compatibility_check(architecture_family):
architecture_family = (architecture_family,)
def decorator(func):
# pylint: disable=fixme
# TODO: on removal of Python 2.6 support this can be re-written as
# TODO: an update + a dict comprehension
for arch_family in architecture_family:
COMPATIBILITY_CHECKS[arch_family] = func
COMPATIBILITY_CHECKS.update({family: func for family in architecture_family})
return func
return decorator
@@ -288,7 +310,7 @@ def compatibility_check_for_x86_64(info, target):
arch_root = TARGETS[basename]
return (
(target == arch_root or arch_root in target.ancestors)
and (target.vendor == vendor or target.vendor == "generic")
and target.vendor in (vendor, "generic")
and target.features.issubset(features)
)
@@ -303,8 +325,9 @@ def compatibility_check_for_aarch64(info, target):
arch_root = TARGETS[basename]
return (
(target == arch_root or arch_root in target.ancestors)
and (target.vendor == vendor or target.vendor == "generic")
and target.features.issubset(features)
and target.vendor in (vendor, "generic")
# On macOS it seems impossible to get all the CPU features with sysctl info
and (target.features.issubset(features) or platform.system() == "Darwin")
)
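For context, a hedged sketch of how the ``compatibility_check`` decorator
above registers a check for a new family (the ``riscv64`` family here is
purely illustrative and is not part of this change):
```python
# Illustrative only: register a compatibility check for a hypothetical
# architecture family, following the decorator pattern shown above.
@compatibility_check(architecture_family="riscv64")
def compatibility_check_for_riscv64(info, target):
    basename = "riscv64"
    arch_root = TARGETS[basename]
    # Compatible when the target is the family root or descends from it
    return target == arch_root or arch_root in target.ancestors
```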

View File

@@ -11,7 +11,7 @@
try:
from collections.abc import MutableMapping # novm
except ImportError:
from collections import MutableMapping
from collections import MutableMapping # pylint: disable=deprecated-class
class LazyDictionary(MutableMapping):
@@ -56,7 +56,7 @@ def _load_json_file(json_file):
def _factory():
filename = os.path.join(json_dir, json_file)
with open(filename, "r") as file:
with open(filename, "r") as file: # pylint: disable=unspecified-encoding
return json.load(file)
return _factory

View File

@@ -85,7 +85,21 @@
"intel": [
{
"versions": ":",
"name": "pentium4",
"name": "x86-64",
"flags": "-march={name} -mtune=generic"
}
],
"oneapi": [
{
"versions": ":",
"name": "x86-64",
"flags": "-march={name} -mtune=generic"
}
],
"dpcpp": [
{
"versions": ":",
"name": "x86-64",
"flags": "-march={name} -mtune=generic"
}
]
@@ -129,6 +143,20 @@
"name": "x86-64",
"flags": "-march={name} -mtune=generic -mcx16 -msahf -mpopcnt -msse3 -msse4.1 -msse4.2 -mssse3"
}
],
"oneapi": [
{
"versions": "2021.2.0:",
"name": "x86-64-v2",
"flags": "-march={name} -mtune=generic"
}
],
"dpcpp": [
{
"versions": "2021.2.0:",
"name": "x86-64-v2",
"flags": "-march={name} -mtune=generic"
}
]
}
},
@@ -186,6 +214,20 @@
"name": "x86-64",
"flags": "-march={name} -mtune=generic -mcx16 -msahf -mpopcnt -msse3 -msse4.1 -msse4.2 -mssse3 -mavx -mavx2 -mbmi -mbmi2 -mf16c -mfma -mlzcnt -mmovbe -mxsave"
}
],
"oneapi": [
{
"versions": "2021.2.0:",
"name": "x86-64-v3",
"flags": "-march={name} -mtune=generic"
}
],
"dpcpp": [
{
"versions": "2021.2.0:",
"name": "x86-64-v3",
"flags": "-march={name} -mtune=generic"
}
]
}
},
@@ -248,6 +290,20 @@
"name": "x86-64",
"flags": "-march={name} -mtune=generic -mcx16 -msahf -mpopcnt -msse3 -msse4.1 -msse4.2 -mssse3 -mavx -mavx2 -mbmi -mbmi2 -mf16c -mfma -mlzcnt -mmovbe -mxsave -mavx512f -mavx512bw -mavx512cd -mavx512dq -mavx512vl"
}
],
"oneapi": [
{
"versions": "2021.2.0:",
"name": "x86-64-v4",
"flags": "-march={name} -mtune=generic"
}
],
"dpcpp": [
{
"versions": "2021.2.0:",
"name": "x86-64-v4",
"flags": "-march={name} -mtune=generic"
}
]
}
},
@@ -288,8 +344,19 @@
"intel": [
{
"versions": "16.0:",
"name": "pentium4",
"flags": "-march={name} -mtune=generic"
"flags": "-march={name} -mtune={name}"
}
],
"oneapi": [
{
"versions": ":",
"flags": "-march={name} -mtune={name}"
}
],
"dpcpp": [
{
"versions": ":",
"flags": "-march={name} -mtune={name}"
}
]
}
@@ -333,6 +400,18 @@
"versions": "16.0:",
"flags": "-march={name} -mtune={name}"
}
],
"oneapi": [
{
"versions": ":",
"flags": "-march={name} -mtune={name}"
}
],
"dpcpp": [
{
"versions": ":",
"flags": "-march={name} -mtune={name}"
}
]
}
},
@@ -384,6 +463,20 @@
"name": "corei7",
"flags": "-march={name} -mtune={name}"
}
],
"oneapi": [
{
"versions": ":",
"name": "corei7",
"flags": "-march={name} -mtune={name}"
}
],
"dpcpp": [
{
"versions": ":",
"name": "corei7",
"flags": "-march={name} -mtune={name}"
}
]
}
},
@@ -432,6 +525,20 @@
"name": "corei7",
"flags": "-march={name} -mtune={name}"
}
],
"oneapi": [
{
"versions": ":",
"name": "corei7",
"flags": "-march={name} -mtune={name}"
}
],
"dpcpp": [
{
"versions": ":",
"name": "corei7",
"flags": "-march={name} -mtune={name}"
}
]
}
},
@@ -490,6 +597,18 @@
"versions": "18.0:",
"flags": "-march={name} -mtune={name}"
}
],
"oneapi": [
{
"versions": ":",
"flags": "-march={name} -mtune={name}"
}
],
"dpcpp": [
{
"versions": ":",
"flags": "-march={name} -mtune={name}"
}
]
}
},
@@ -550,6 +669,18 @@
"versions": "18.0:",
"flags": "-march={name} -mtune={name}"
}
],
"oneapi": [
{
"versions": ":",
"flags": "-march={name} -mtune={name}"
}
],
"dpcpp": [
{
"versions": ":",
"flags": "-march={name} -mtune={name}"
}
]
}
},
@@ -615,6 +746,18 @@
"versions": "18.0:",
"flags": "-march={name} -mtune={name}"
}
],
"oneapi": [
{
"versions": ":",
"flags": "-march={name} -mtune={name}"
}
],
"dpcpp": [
{
"versions": ":",
"flags": "-march={name} -mtune={name}"
}
]
}
},
@@ -672,6 +815,18 @@
"versions": "18.0:",
"flags": "-march={name} -mtune={name}"
}
],
"oneapi": [
{
"versions": ":",
"flags": "-march={name} -mtune={name}"
}
],
"dpcpp": [
{
"versions": ":",
"flags": "-march={name} -mtune={name}"
}
]
}
},
@@ -732,6 +887,18 @@
"versions": "18.0:",
"flags": "-march={name} -mtune={name}"
}
],
"oneapi": [
{
"versions": ":",
"flags": "-march={name} -mtune={name}"
}
],
"dpcpp": [
{
"versions": ":",
"flags": "-march={name} -mtune={name}"
}
]
}
},
@@ -798,6 +965,20 @@
"name": "knl",
"flags": "-march={name} -mtune={name}"
}
],
"oneapi": [
{
"versions": ":",
"name": "knl",
"flags": "-march={name} -mtune={name}"
}
],
"dpcpp": [
{
"versions": ":",
"name": "knl",
"flags": "-march={name} -mtune={name}"
}
]
}
},
@@ -868,6 +1049,20 @@
"name": "skylake-avx512",
"flags": "-march={name} -mtune={name}"
}
],
"oneapi": [
{
"versions": ":",
"name": "skylake-avx512",
"flags": "-march={name} -mtune={name}"
}
],
"dpcpp": [
{
"versions": ":",
"name": "skylake-avx512",
"flags": "-march={name} -mtune={name}"
}
]
}
},
@@ -937,6 +1132,18 @@
"versions": "18.0:",
"flags": "-march={name} -mtune={name}"
}
],
"oneapi": [
{
"versions": ":",
"flags": "-march={name} -mtune={name}"
}
],
"dpcpp": [
{
"versions": ":",
"flags": "-march={name} -mtune={name}"
}
]
}
},
@@ -1004,6 +1211,18 @@
"versions": "19.0.1:",
"flags": "-march={name} -mtune={name}"
}
],
"oneapi": [
{
"versions": ":",
"flags": "-march={name} -mtune={name}"
}
],
"dpcpp": [
{
"versions": ":",
"flags": "-march={name} -mtune={name}"
}
]
}
},
@@ -1098,6 +1317,20 @@
"name": "icelake-client",
"flags": "-march={name} -mtune={name}"
}
],
"oneapi": [
{
"versions": ":",
"name": "icelake-client",
"flags": "-march={name} -mtune={name}"
}
],
"dpcpp": [
{
"versions": ":",
"name": "icelake-client",
"flags": "-march={name} -mtune={name}"
}
]
}
},
@@ -1142,6 +1375,20 @@
"warnings": "Intel's compilers may or may not optimize to the same degree for non-Intel microprocessors for optimizations that are not unique to Intel microprocessors",
"flags": "-msse2"
}
],
"oneapi": [
{
"versions": ":",
"warnings": "Intel's compilers may or may not optimize to the same degree for non-Intel microprocessors for optimizations that are not unique to Intel microprocessors",
"flags": "-msse2"
}
],
"dpcpp": [
{
"versions": ":",
"warnings": "Intel's compilers may or may not optimize to the same degree for non-Intel microprocessors for optimizations that are not unique to Intel microprocessors",
"flags": "-msse2"
}
]
}
},
@@ -1192,6 +1439,20 @@
"warnings": "Intel's compilers may or may not optimize to the same degree for non-Intel microprocessors for optimizations that are not unique to Intel microprocessors",
"flags": "-msse3"
}
],
"oneapi": [
{
"versions": ":",
"warnings": "Intel's compilers may or may not optimize to the same degree for non-Intel microprocessors for optimizations that are not unique to Intel microprocessors",
"flags": "-msse3"
}
],
"dpcpp": [
{
"versions": ":",
"warnings": "Intel's compilers may or may not optimize to the same degree for non-Intel microprocessors for optimizations that are not unique to Intel microprocessors",
"flags": "-msse3"
}
]
}
},
@@ -1246,6 +1507,20 @@
"warnings": "Intel's compilers may or may not optimize to the same degree for non-Intel microprocessors for optimizations that are not unique to Intel microprocessors",
"flags": "-msse3"
}
],
"oneapi": [
{
"versions": ":",
"warnings": "Intel's compilers may or may not optimize to the same degree for non-Intel microprocessors for optimizations that are not unique to Intel microprocessors",
"flags": "-msse3"
}
],
"dpcpp": [
{
"versions": ":",
"warnings": "Intel's compilers may or may not optimize to the same degree for non-Intel microprocessors for optimizations that are not unique to Intel microprocessors",
"flags": "-msse3"
}
]
}
},
@@ -1301,6 +1576,20 @@
"warnings": "Intel's compilers may or may not optimize to the same degree for non-Intel microprocessors for optimizations that are not unique to Intel microprocessors",
"flags": "-msse4.2"
}
],
"oneapi": [
{
"versions": ":",
"warnings": "Intel's compilers may or may not optimize to the same degree for non-Intel microprocessors for optimizations that are not unique to Intel microprocessors",
"flags": "-msse4.2"
}
],
"dpcpp": [
{
"versions": ":",
"warnings": "Intel's compilers may or may not optimize to the same degree for non-Intel microprocessors for optimizations that are not unique to Intel microprocessors",
"flags": "-msse4.2"
}
]
}
},
@@ -1360,6 +1649,22 @@
"name": "core-avx2",
"flags": "-march={name} -mtune={name}"
}
],
"oneapi": [
{
"versions": ":",
"warnings": "Intel's compilers may or may not optimize to the same degree for non-Intel microprocessors for optimizations that are not unique to Intel microprocessors",
"name": "core-avx2",
"flags": "-march={name} -mtune={name}"
}
],
"dpcpp": [
{
"versions": ":",
"warnings": "Intel's compilers may or may not optimize to the same degree for non-Intel microprocessors for optimizations that are not unique to Intel microprocessors",
"name": "core-avx2",
"flags": "-march={name} -mtune={name}"
}
]
}
},
@@ -1422,6 +1727,22 @@
"name": "core-avx2",
"flags": "-march={name} -mtune={name}"
}
],
"oneapi": [
{
"versions": ":",
"warnings": "Intel's compilers may or may not optimize to the same degree for non-Intel microprocessors for optimizations that are not unique to Intel microprocessors",
"name": "core-avx2",
"flags": "-march={name} -mtune={name}"
}
],
"dpcpp": [
{
"versions": ":",
"warnings": "Intel's compilers may or may not optimize to the same degree for non-Intel microprocessors for optimizations that are not unique to Intel microprocessors",
"name": "core-avx2",
"flags": "-march={name} -mtune={name}"
}
]
}
},
@@ -1485,6 +1806,22 @@
"name": "core-avx2",
"flags": "-march={name} -mtune={name}"
}
],
"oneapi": [
{
"versions": ":",
"warnings": "Intel's compilers may or may not optimize to the same degree for non-Intel microprocessors for optimizations that are not unique to Intel microprocessors",
"name": "core-avx2",
"flags": "-march={name} -mtune={name}"
}
],
"dpcpp": [
{
"versions": ":",
"warnings": "Intel's compilers may or may not optimize to the same degree for non-Intel microprocessors for optimizations that are not unique to Intel microprocessors",
"name": "core-avx2",
"flags": "-march={name} -mtune={name}"
}
]
}
},
@@ -1543,6 +1880,30 @@
"name": "znver3",
"flags": "-march={name} -mtune={name}"
}
],
"intel": [
{
"versions": "16.0:",
"warnings": "Intel's compilers may or may not optimize to the same degree for non-Intel microprocessors for optimizations that are not unique to Intel microprocessors",
"name": "core-avx2",
"flags": "-march={name} -mtune={name}"
}
],
"oneapi": [
{
"versions": ":",
"warnings": "Intel's compilers may or may not optimize to the same degree for non-Intel microprocessors for optimizations that are not unique to Intel microprocessors",
"name": "core-avx2",
"flags": "-march={name} -mtune={name}"
}
],
"dpcpp": [
{
"versions": ":",
"warnings": "Intel's compilers may or may not optimize to the same degree for non-Intel microprocessors for optimizations that are not unique to Intel microprocessors",
"name": "core-avx2",
"flags": "-march={name} -mtune={name}"
}
]
}
},
@@ -1788,7 +2149,6 @@
"fp",
"asimd",
"evtstrm",
"aes",
"pmull",
"sha1",
"sha2",
@@ -1821,18 +2181,26 @@
"flags": "-march=armv8.2-a+crc+crypto+fp16"
},
{
"versions": "8:",
"flags": "-march=armv8.2-a+crc+aes+sha2+fp16+sve -msve-vector-bits=512"
"versions": "8:10.2",
"flags": "-march=armv8.2-a+crc+sha2+fp16+sve -msve-vector-bits=512"
},
{
"versions": "10.3:",
"flags": "-mcpu=a64fx -msve-vector-bits=512"
}
],
"clang": [
{
"versions": "3.9:4.9",
"flags": "-march=armv8.2-a+crc+crypto+fp16"
"flags": "-march=armv8.2-a+crc+sha2+fp16"
},
{
"versions": "5:",
"flags": "-march=armv8.2-a+crc+crypto+fp16+sve"
"versions": "5:10",
"flags": "-march=armv8.2-a+crc+sha2+fp16+sve"
},
{
"versions": "11:",
"flags": "-mcpu=a64fx"
}
],
"arm": [
@@ -1954,7 +2322,40 @@
"m1": {
"from": ["aarch64"],
"vendor": "Apple",
"features": [],
"features": [
"fp",
"asimd",
"evtstrm",
"aes",
"pmull",
"sha1",
"sha2",
"crc32",
"atomics",
"fphp",
"asimdhp",
"cpuid",
"asimdrdm",
"jscvt",
"fcma",
"lrcpc",
"dcpop",
"sha3",
"asimddp",
"sha512",
"asimdfhm",
"dit",
"uscat",
"ilrcpc",
"flagm",
"ssbs",
"sb",
"paca",
"pacg",
"dcpodp",
"flagm2",
"frint"
],
"compilers": {
"gcc": [
{
@@ -1964,14 +2365,22 @@
],
"clang" : [
{
"versions": "9.0:",
"versions": "9.0:12.0",
"flags" : "-march=armv8.4-a"
},
{
"versions": "13.0:",
"flags" : "-mcpu=apple-m1"
}
],
"apple-clang": [
{
"versions": "11.0:",
"versions": "11.0:12.5",
"flags" : "-march=armv8.4-a"
},
{
"versions": "13.0:",
"flags" : "-mcpu=apple-m1"
}
]
}
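
The oneapi and dpcpp entries added throughout this file follow the same schema as the existing compiler entries: a version range, an optional name override, and a flags template. Below is a minimal sketch of how such an entry can be resolved into a flag string; it mirrors the JSON layout above but is not the archspec code, and all helper names are made up.

def version_in_range(version, version_range):
    """Check a 'lo:hi' range where either bound may be empty ('2021.2.0:')."""
    def as_tuple(v):
        return tuple(int(p) for p in v.split("."))
    lo, _, hi = version_range.partition(":")
    if lo and as_tuple(version) < as_tuple(lo):
        return False
    if hi and as_tuple(version) > as_tuple(hi):
        return False
    return True


def optimization_flags(entries, uarch_name, compiler_version):
    for entry in entries:
        if version_in_range(compiler_version, entry["versions"]):
            # an absent "name" means the target's own name is substituted
            name = entry.get("name", uarch_name)
            return entry["flags"].format(name=name)
    raise ValueError("compiler version not supported for this target")


oneapi_x86_64_v3 = [
    {"versions": "2021.2.0:", "name": "x86-64-v3",
     "flags": "-march={name} -mtune=generic"},
]
print(optimization_flags(oneapi_x86_64_v3, "x86_64_v3", "2022.1.0"))
# -march=x86-64-v3 -mtune=generic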

View File

@@ -308,6 +308,68 @@ def change_sed_delimiter(old_delim, new_delim, *filenames):
filter_file(double_quoted, '"%s"' % repl, f)
@contextmanager
def exploding_archive_catch(stage):
# Check for an exploding tarball, i.e. one that doesn't expand to
# a single directory. If the tarball *didn't* explode, move its
# contents to the staging source directory & remove the container
# directory. If the tarball did explode, just rename the tarball
# directory to the staging source directory.
#
# NOTE: The tar program on Mac OS X will encode HFS metadata in
# hidden files, which can end up *alongside* a single top-level
# directory. We initially ignore the presence of hidden files to
# accommodate these "semi-exploding" tarballs, but ensure the files
# are copied to the source directory.
# Expand each tarball in its own directory so that exploding
# tarballs are contained.
tarball_container = os.path.join(stage.path,
"spack-expanded-archive")
mkdirp(tarball_container)
orig_dir = os.getcwd()
os.chdir(tarball_container)
try:
yield
# catch an exploding archive on successful extraction
os.chdir(orig_dir)
exploding_archive_handler(tarball_container, stage)
except Exception as e:
# return current directory context to previous on failure
os.chdir(orig_dir)
raise e
@system_path_filter
def exploding_archive_handler(tarball_container, stage):
"""
Args:
tarball_container: where the archive was expanded to
stage: Stage object referencing filesystem location
where archive is being expanded
"""
files = os.listdir(tarball_container)
non_hidden = [f for f in files if not f.startswith('.')]
if len(non_hidden) == 1:
src = os.path.join(tarball_container, non_hidden[0])
if os.path.isdir(src):
stage.srcdir = non_hidden[0]
shutil.move(src, stage.source_path)
if len(files) > 1:
files.remove(non_hidden[0])
for f in files:
src = os.path.join(tarball_container, f)
dest = os.path.join(stage.path, f)
shutil.move(src, dest)
os.rmdir(tarball_container)
else:
# This is a non-directory entry (e.g., a patch file) so simply
# rename the tarball container to be the source path.
shutil.move(tarball_container, stage.source_path)
else:
shutil.move(tarball_container, stage.source_path)
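
A hypothetical use of the pair of helpers above: extraction runs inside the spack-expanded-archive container, and on success exploding_archive_handler collapses a single top-level directory (plus any macOS hidden metadata files) into the stage's source path. stage and archive_path are assumed names here.

import tarfile

with exploding_archive_catch(stage):
    with tarfile.open(archive_path) as tar:
        tar.extractall()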
@system_path_filter(arg_slice=slice(1))
def get_owner_uid(path, err_msg=None):
if not os.path.exists(path):
@@ -367,7 +429,7 @@ def group_ids(uid=None):
@system_path_filter(arg_slice=slice(1))
def chgrp(path, group):
def chgrp(path, group, follow_symlinks=True):
"""Implement the bash chgrp function on a single path"""
if is_windows:
raise OSError("Function 'chgrp' is not supported on Windows")
@@ -376,7 +438,10 @@ def chgrp(path, group):
gid = grp.getgrnam(group).gr_gid
else:
gid = group
os.chown(path, -1, gid)
if follow_symlinks:
os.chown(path, -1, gid)
else:
os.lchown(path, -1, gid)
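
With follow_symlinks=False the group change is applied to the link itself via os.lchown rather than to its target. A hypothetical call (path and group invented for illustration):

chgrp('/opt/spack/opt/some-link', 'spack', follow_symlinks=False)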
@system_path_filter(arg_slice=slice(1))

View File

@@ -11,7 +11,9 @@
import os
import re
import sys
import traceback
from datetime import datetime, timedelta
from typing import List, Tuple
import six
from six import string_types
@@ -1009,3 +1011,64 @@ def __repr__(self):
def __str__(self):
return str(self.data)
class GroupedExceptionHandler(object):
"""A generic mechanism to coalesce multiple exceptions and preserve tracebacks."""
def __init__(self):
self.exceptions = [] # type: List[Tuple[str, Exception, List[str]]]
def __bool__(self):
"""Whether any exceptions were handled."""
return bool(self.exceptions)
def forward(self, context):
# type: (str) -> GroupedExceptionForwarder
"""Return a contextmanager which extracts tracebacks and prefixes a message."""
return GroupedExceptionForwarder(context, self)
def _receive_forwarded(self, context, exc, tb):
# type: (str, Exception, List[str]) -> None
self.exceptions.append((context, exc, tb))
def grouped_message(self, with_tracebacks=True):
# type: (bool) -> str
"""Print out an error message coalescing all the forwarded errors."""
each_exception_message = [
'{0} raised {1}: {2}{3}'.format(
context,
exc.__class__.__name__,
exc,
'\n{0}'.format(''.join(tb)) if with_tracebacks else '',
)
for context, exc, tb in self.exceptions
]
return 'due to the following failures:\n{0}'.format(
'\n'.join(each_exception_message)
)
class GroupedExceptionForwarder(object):
"""A contextmanager to capture exceptions and forward them to a
GroupedExceptionHandler."""
def __init__(self, context, handler):
# type: (str, GroupedExceptionHandler) -> None
self._context = context
self._handler = handler
def __enter__(self):
return None
def __exit__(self, exc_type, exc_value, tb):
if exc_value is not None:
self._handler._receive_forwarded(
self._context,
exc_value,
traceback.format_tb(tb),
)
# Suppress any exception from being re-raised:
# https://docs.python.org/3/reference/datamodel.html#object.__exit__.
return True
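
A small, self-contained sketch of the pattern these two classes enable (using only the classes defined above): each failing step is recorded together with its traceback instead of aborting, and one coalesced message is produced at the end.

handler = GroupedExceptionHandler()

for source in ('buildcache', 'sources'):
    with handler.forward(source):
        raise RuntimeError('bootstrap from {0} failed'.format(source))

if handler:  # truthy when any exception was forwarded
    print('could not bootstrap ' + handler.grouped_message(with_tracebacks=False))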

View File

@@ -4,7 +4,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
#: (major, minor, micro, dev release) tuple
spack_version_info = (0, 18, 0, 'dev0')
spack_version_info = (0, 19, 0, 'dev0')
#: PEP440 canonical <major>.<minor>.<micro>.<devN> string
spack_version = '.'.join(str(s) for s in spack_version_info)

View File

@@ -12,7 +12,7 @@
import spack.error
import spack.hooks
import spack.monitor
import spack.package
import spack.package_base
import spack.repo
import spack.util.executable

View File

@@ -298,7 +298,8 @@ def _check_build_test_callbacks(pkgs, error_cls):
def _check_patch_urls(pkgs, error_cls):
"""Ensure that patches fetched from GitHub have stable sha256 hashes."""
github_patch_url_re = (
r"^https?://github\.com/.+/.+/(?:commit|pull)/[a-fA-F0-9]*.(?:patch|diff)"
r"^https?://(?:patch-diff\.)?github(?:usercontent)?\.com/"
".+/.+/(?:commit|pull)/[a-fA-F0-9]*.(?:patch|diff)"
)
errors = []

View File

@@ -210,7 +210,7 @@ def get_all_built_specs(self):
return spec_list
def find_built_spec(self, spec):
def find_built_spec(self, spec, mirrors_to_check=None):
"""Look in our cache for the built spec corresponding to ``spec``.
If the spec can be found among the configured binary mirrors, a
@@ -225,6 +225,8 @@ def find_built_spec(self, spec):
Args:
spec (spack.spec.Spec): Concrete spec to find
mirrors_to_check: Optional mapping containing mirrors to check. If
None, all configured mirrors are checked.
Returns:
A list of objects containing the found specs and mirror url where
@@ -240,17 +242,23 @@ def find_built_spec(self, spec):
]
"""
self.regenerate_spec_cache()
return self.find_by_hash(spec.dag_hash())
return self.find_by_hash(spec.dag_hash(), mirrors_to_check=mirrors_to_check)
def find_by_hash(self, find_hash):
def find_by_hash(self, find_hash, mirrors_to_check=None):
"""Same as find_built_spec but uses the hash of a spec.
Args:
find_hash (str): hash of the spec to search
mirrors_to_check: Optional mapping containing mirrors to check. If
None, all configured mirrors are checked.
"""
if find_hash not in self._mirrors_for_spec:
return None
return self._mirrors_for_spec[find_hash]
results = self._mirrors_for_spec[find_hash]
if not mirrors_to_check:
return results
mirror_urls = mirrors_to_check.values()
return [r for r in results if r['mirror_url'] in mirror_urls]
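
A hedged example of the new filtering: mirrors_to_check maps mirror names to URLs, and only results whose 'mirror_url' appears among those values are returned. The mirror name and URL below are invented.

results = binary_index.find_built_spec(
    concrete_spec, mirrors_to_check={'local': 'file:///path/to/mirror'})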
def update_spec(self, spec, found_list):
"""
@@ -563,6 +571,13 @@ def __init__(self, msg):
super(NewLayoutException, self).__init__(msg)
class UnsignedPackageException(spack.error.SpackError):
"""
Raised if installation of unsigned package is attempted without
the use of ``--no-check-signature``.
"""
def compute_hash(data):
return hashlib.sha256(data.encode('utf-8')).hexdigest()
@@ -751,15 +766,16 @@ def select_signing_key(key=None):
return key
def sign_tarball(key, force, specfile_path):
if os.path.exists('%s.asc' % specfile_path):
def sign_specfile(key, force, specfile_path):
signed_specfile_path = '%s.sig' % specfile_path
if os.path.exists(signed_specfile_path):
if force:
os.remove('%s.asc' % specfile_path)
os.remove(signed_specfile_path)
else:
raise NoOverwriteException('%s.asc' % specfile_path)
raise NoOverwriteException(signed_specfile_path)
key = select_signing_key(key)
spack.util.gpg.sign(key, specfile_path, '%s.asc' % specfile_path)
spack.util.gpg.sign(key, specfile_path, signed_specfile_path, clearsign=True)
def _fetch_spec_from_mirror(spec_url):
@@ -768,7 +784,10 @@ def _fetch_spec_from_mirror(spec_url):
_, _, spec_file = web_util.read_from_url(spec_url)
spec_file_contents = codecs.getreader('utf-8')(spec_file).read()
# Need full spec.json name or this gets confused with index.json.
if spec_url.endswith('.json'):
if spec_url.endswith('.json.sig'):
specfile_json = Spec.extract_json_from_clearsig(spec_file_contents)
s = Spec.from_dict(specfile_json)
elif spec_url.endswith('.json'):
s = Spec.from_json(spec_file_contents)
elif spec_url.endswith('.yaml'):
s = Spec.from_yaml(spec_file_contents)
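
For context, a clearsigned spec file wraps the JSON document in PGP armor, which is what the new '.json.sig' branch relies on. The sketch below shows the layout and how the payload can be recovered by dropping the armor header and signature block; it only mimics what Spec.extract_json_from_clearsig must do and is not Spack's implementation.

import json

clearsigned = """-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA256

{"spec": {"name": "zlib", "version": "1.2.12"}}
-----BEGIN PGP SIGNATURE-----
...base64 signature data...
-----END PGP SIGNATURE-----
"""

lines = clearsigned.splitlines()
start = lines.index("") + 1  # first blank line ends the armor header
end = lines.index("-----BEGIN PGP SIGNATURE-----")
payload = "\n".join(lines[start:end])
print(json.loads(payload)["spec"]["name"])  # zlib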
@@ -829,7 +848,9 @@ def generate_package_index(cache_prefix):
file_list = (
entry
for entry in web_util.list_url(cache_prefix)
if entry.endswith('.yaml') or entry.endswith('spec.json'))
if entry.endswith('.yaml') or
entry.endswith('spec.json') or
entry.endswith('spec.json.sig'))
except KeyError as inst:
msg = 'No packages at {0}: {1}'.format(cache_prefix, inst)
tty.warn(msg)
@@ -944,7 +965,7 @@ def _build_tarball(
tmpdir = tempfile.mkdtemp()
cache_prefix = build_cache_prefix(tmpdir)
tarfile_name = tarball_name(spec, '.tar.gz')
tarfile_name = tarball_name(spec, '.spack')
tarfile_dir = os.path.join(cache_prefix, tarball_directory_name(spec))
tarfile_path = os.path.join(tarfile_dir, tarfile_name)
spackfile_path = os.path.join(
@@ -967,10 +988,12 @@ def _build_tarball(
spec_file = spack.store.layout.spec_file_path(spec)
specfile_name = tarball_name(spec, '.spec.json')
specfile_path = os.path.realpath(os.path.join(cache_prefix, specfile_name))
signed_specfile_path = '{0}.sig'.format(specfile_path)
deprecated_specfile_path = specfile_path.replace('.spec.json', '.spec.yaml')
remote_specfile_path = url_util.join(
outdir, os.path.relpath(specfile_path, os.path.realpath(tmpdir)))
remote_signed_specfile_path = '{0}.sig'.format(remote_specfile_path)
remote_specfile_path_deprecated = url_util.join(
outdir, os.path.relpath(deprecated_specfile_path,
os.path.realpath(tmpdir)))
@@ -979,9 +1002,12 @@ def _build_tarball(
if force:
if web_util.url_exists(remote_specfile_path):
web_util.remove_url(remote_specfile_path)
if web_util.url_exists(remote_signed_specfile_path):
web_util.remove_url(remote_signed_specfile_path)
if web_util.url_exists(remote_specfile_path_deprecated):
web_util.remove_url(remote_specfile_path_deprecated)
elif (web_util.url_exists(remote_specfile_path) or
web_util.url_exists(remote_signed_specfile_path) or
web_util.url_exists(remote_specfile_path_deprecated)):
raise NoOverwriteException(url_util.format(remote_specfile_path))
@@ -1043,6 +1069,7 @@ def _build_tarball(
raise ValueError(
'{0} not a valid spec file type (json or yaml)'.format(
spec_file))
spec_dict['buildcache_layout_version'] = 1
bchecksum = {}
bchecksum['hash_algorithm'] = 'sha256'
bchecksum['hash'] = checksum
@@ -1061,25 +1088,15 @@ def _build_tarball(
# sign the tarball and spec file with gpg
if not unsigned:
key = select_signing_key(key)
sign_tarball(key, force, specfile_path)
# put tarball, spec and signature files in .spack archive
with closing(tarfile.open(spackfile_path, 'w')) as tar:
tar.add(name=tarfile_path, arcname='%s' % tarfile_name)
tar.add(name=specfile_path, arcname='%s' % specfile_name)
if not unsigned:
tar.add(name='%s.asc' % specfile_path,
arcname='%s.asc' % specfile_name)
# cleanup file moved to archive
os.remove(tarfile_path)
if not unsigned:
os.remove('%s.asc' % specfile_path)
sign_specfile(key, force, specfile_path)
# push tarball and signed spec json to remote mirror
web_util.push_to_url(
spackfile_path, remote_spackfile_path, keep_original=False)
web_util.push_to_url(
specfile_path, remote_specfile_path, keep_original=False)
signed_specfile_path if not unsigned else specfile_path,
remote_signed_specfile_path if not unsigned else remote_specfile_path,
keep_original=False)
tty.debug('Buildcache for "{0}" written to \n {1}'
.format(spec, remote_spackfile_path))
@@ -1162,48 +1179,174 @@ def push(specs, push_url, specs_kwargs=None, **kwargs):
warnings.warn(str(e))
def download_tarball(spec, preferred_mirrors=None):
def try_verify(specfile_path):
"""Utility function to attempt to verify a local file. Assumes the
file is a clearsigned signature file.
Args:
specfile_path (str): Path to file to be verified.
Returns:
``True`` if the signature could be verified, ``False`` otherwise.
"""
suppress = config.get('config:suppress_gpg_warnings', False)
try:
spack.util.gpg.verify(specfile_path, suppress_warnings=suppress)
except Exception:
return False
return True
def try_fetch(url_to_fetch):
"""Utility function to try and fetch a file from a url, stage it
locally, and return the path to the staged file.
Args:
url_to_fetch (str): Url pointing to remote resource to fetch
Returns:
Path to locally staged resource or ``None`` if it could not be fetched.
"""
stage = Stage(url_to_fetch, keep=True)
stage.create()
try:
stage.fetch()
except fs.FetchError:
stage.destroy()
return None
return stage
def _delete_staged_downloads(download_result):
"""Clean up stages used to download tarball and specfile"""
download_result['tarball_stage'].destroy()
download_result['specfile_stage'].destroy()
def download_tarball(spec, unsigned=False, mirrors_for_spec=None):
"""
Download binary tarball for given package into stage area, returning
a dictionary describing the result if successful, None otherwise.
Args:
spec (spack.spec.Spec): Concrete spec
preferred_mirrors (list): If provided, this is a list of preferred
mirror urls. Other configured mirrors will only be used if the
tarball can't be retrieved from one of these.
unsigned (bool): Whether or not to require signed binaries
mirrors_for_spec (list): Optional list of concrete specs and mirrors
obtained by calling binary_distribution.get_mirrors_for_spec().
These will be checked in order first before looking in other
configured mirrors.
Returns:
Path to the downloaded tarball, or ``None`` if the tarball could not
be downloaded from any configured mirrors.
``None`` if the tarball could not be downloaded (maybe also verified,
depending on whether new-style signed binary packages were found).
Otherwise, return an object indicating the path to the downloaded
tarball, the path to the downloaded specfile (in the case of new-style
buildcache), and whether or not the tarball is already verified.
.. code-block:: JSON
{
"tarball_path": "path-to-locally-saved-tarfile",
"specfile_path": "none-or-path-to-locally-saved-specfile",
"signature_verified": "true-if-binary-pkg-was-already-verified"
}
"""
if not spack.mirror.MirrorCollection():
tty.die("Please add a spack mirror to allow " +
"download of pre-compiled packages.")
tarball = tarball_path_name(spec, '.spack')
specfile_prefix = tarball_name(spec, '.spec')
urls_to_try = []
mirrors_to_try = []
if preferred_mirrors:
for preferred_url in preferred_mirrors:
urls_to_try.append(url_util.join(
preferred_url, _build_cache_relative_path, tarball))
# Note on try_first and try_next:
# mirrors_for_spec most likely came from spack caching remote
# mirror indices locally and adding their specs to a local data
# structure supporting quick lookup of concrete specs. Those
# mirrors are likely a subset of all configured mirrors, and
# we'll probably find what we need in one of them. But we'll
# look in all configured mirrors if needed, as maybe the spec
# we need was in an un-indexed mirror. No need to check any
# mirror for the spec twice though.
try_first = [i['mirror_url'] for i in mirrors_for_spec] if mirrors_for_spec else []
try_next = [
i.fetch_url for i in spack.mirror.MirrorCollection().values()
if i.fetch_url not in try_first
]
for mirror in spack.mirror.MirrorCollection().values():
if not preferred_mirrors or mirror.fetch_url not in preferred_mirrors:
urls_to_try.append(url_util.join(
mirror.fetch_url, _build_cache_relative_path, tarball))
for url in try_first + try_next:
mirrors_to_try.append({
'specfile': url_util.join(url,
_build_cache_relative_path, specfile_prefix),
'spackfile': url_util.join(url,
_build_cache_relative_path, tarball)
})
for try_url in urls_to_try:
# stage the tarball into standard place
stage = Stage(try_url, name="build_cache", keep=True)
stage.create()
try:
stage.fetch()
return stage.save_filename
except fs.FetchError:
continue
tried_to_verify_sigs = []
# Assumes we care more about finding a spec file by preferred extension
# than by mirror priority. This can be made less complicated as
# we remove support for deprecated spec formats and buildcache layouts.
for ext in ['json.sig', 'json', 'yaml']:
for mirror_to_try in mirrors_to_try:
specfile_url = '{0}.{1}'.format(mirror_to_try['specfile'], ext)
spackfile_url = mirror_to_try['spackfile']
local_specfile_stage = try_fetch(specfile_url)
if local_specfile_stage:
local_specfile_path = local_specfile_stage.save_filename
signature_verified = False
if ext.endswith('.sig') and not unsigned:
# If we found a signed specfile at the root, try to verify
# the signature immediately. We will not download the
# tarball if we cannot verify the signature.
tried_to_verify_sigs.append(specfile_url)
signature_verified = try_verify(local_specfile_path)
if not signature_verified:
tty.warn("Failed to verify: {0}".format(specfile_url))
if unsigned or signature_verified or not ext.endswith('.sig'):
# We will download the tarball in one of three cases:
# 1. user asked for --no-check-signature
# 2. user didn't ask for --no-check-signature, but we
# found a spec.json.sig and verified the signature already
# 3. neither of the first two cases are true, but this file
# is *not* a signed json (not a spec.json.sig file). That
# means we already looked at all the mirrors and either didn't
# find any .sig files or couldn't verify any of them. But it
# is still possible to find an old style binary package where
# the signature is a detached .asc file in the outer archive
# of the tarball, and in that case, the only way to know is to
# download the tarball. This is a deprecated use case, so if
# something goes wrong during the extraction process (can't
# verify signature, checksum doesn't match) we will fail at
# that point instead of trying to download more tarballs from
# the remaining mirrors, looking for one we can use.
tarball_stage = try_fetch(spackfile_url)
if tarball_stage:
return {
'tarball_stage': tarball_stage,
'specfile_stage': local_specfile_stage,
'signature_verified': signature_verified,
}
local_specfile_stage.destroy()
# Falling through the nested loops means we exhaustively searched
# for all known kinds of spec files on all mirrors and did not find
# an acceptable one for which we could download a tarball.
if tried_to_verify_sigs:
raise NoVerifyException(("Spack found new style signed binary packages, "
"but was unable to verify any of them. Please "
"obtain and trust the correct public key. If "
"these are public spack binaries, please see the "
"spack docs for locations where keys can be found."))
tty.warn("download_tarball() was unable to download " +
"{0} from any configured mirrors".format(spec))
@@ -1377,7 +1520,55 @@ def is_backup_file(file):
relocate.relocate_text(text_names, prefix_to_prefix_text)
def extract_tarball(spec, filename, allow_root=False, unsigned=False,
def _extract_inner_tarball(spec, filename, extract_to, unsigned, remote_checksum):
stagepath = os.path.dirname(filename)
spackfile_name = tarball_name(spec, '.spack')
spackfile_path = os.path.join(stagepath, spackfile_name)
tarfile_name = tarball_name(spec, '.tar.gz')
tarfile_path = os.path.join(extract_to, tarfile_name)
deprecated_yaml_name = tarball_name(spec, '.spec.yaml')
deprecated_yaml_path = os.path.join(extract_to, deprecated_yaml_name)
json_name = tarball_name(spec, '.spec.json')
json_path = os.path.join(extract_to, json_name)
with closing(tarfile.open(spackfile_path, 'r')) as tar:
tar.extractall(extract_to)
# some buildcache tarfiles use bzip2 compression
if not os.path.exists(tarfile_path):
tarfile_name = tarball_name(spec, '.tar.bz2')
tarfile_path = os.path.join(extract_to, tarfile_name)
if os.path.exists(json_path):
specfile_path = json_path
elif os.path.exists(deprecated_yaml_path):
specfile_path = deprecated_yaml_path
else:
raise ValueError('Cannot find spec file for {0}.'.format(extract_to))
if not unsigned:
if os.path.exists('%s.asc' % specfile_path):
suppress = config.get('config:suppress_gpg_warnings', False)
try:
spack.util.gpg.verify('%s.asc' % specfile_path, specfile_path, suppress)
except Exception:
raise NoVerifyException("Spack was unable to verify package "
"signature, please obtain and trust the "
"correct public key.")
else:
raise UnsignedPackageException(
"To install unsigned packages, use the --no-check-signature option.")
# get the sha256 checksum of the tarball
local_checksum = checksum_tarball(tarfile_path)
# if the checksums don't match don't install
if local_checksum != remote_checksum['hash']:
raise NoChecksumException(
"Package tarball failed checksum verification.\n"
"It cannot be installed.")
return tarfile_path
def extract_tarball(spec, download_result, allow_root=False, unsigned=False,
force=False):
"""
extract binary tarball for given package into install area
@@ -1388,66 +1579,56 @@ def extract_tarball(spec, filename, allow_root=False, unsigned=False,
else:
raise NoOverwriteException(str(spec.prefix))
tmpdir = tempfile.mkdtemp()
stagepath = os.path.dirname(filename)
spackfile_name = tarball_name(spec, '.spack')
spackfile_path = os.path.join(stagepath, spackfile_name)
tarfile_name = tarball_name(spec, '.tar.gz')
tarfile_path = os.path.join(tmpdir, tarfile_name)
specfile_is_json = True
deprecated_yaml_name = tarball_name(spec, '.spec.yaml')
deprecated_yaml_path = os.path.join(tmpdir, deprecated_yaml_name)
json_name = tarball_name(spec, '.spec.json')
json_path = os.path.join(tmpdir, json_name)
with closing(tarfile.open(spackfile_path, 'r')) as tar:
tar.extractall(tmpdir)
# some buildcache tarfiles use bzip2 compression
if not os.path.exists(tarfile_path):
tarfile_name = tarball_name(spec, '.tar.bz2')
tarfile_path = os.path.join(tmpdir, tarfile_name)
specfile_path = download_result['specfile_stage'].save_filename
if os.path.exists(json_path):
specfile_path = json_path
elif os.path.exists(deprecated_yaml_path):
specfile_is_json = False
specfile_path = deprecated_yaml_path
else:
raise ValueError('Cannot find spec file for {0}.'.format(tmpdir))
if not unsigned:
if os.path.exists('%s.asc' % specfile_path):
try:
suppress = config.get('config:suppress_gpg_warnings', False)
spack.util.gpg.verify(
'%s.asc' % specfile_path, specfile_path, suppress)
except Exception as e:
shutil.rmtree(tmpdir)
raise e
else:
shutil.rmtree(tmpdir)
raise NoVerifyException(
"Package spec file failed signature verification.\n"
"Use spack buildcache keys to download "
"and install a key for verification from the mirror.")
# get the sha256 checksum of the tarball
checksum = checksum_tarball(tarfile_path)
# get the sha256 checksum recorded at creation
spec_dict = {}
with open(specfile_path, 'r') as inputfile:
content = inputfile.read()
if specfile_is_json:
if specfile_path.endswith('.json.sig'):
spec_dict = Spec.extract_json_from_clearsig(content)
elif specfile_path.endswith('.json'):
spec_dict = sjson.load(content)
else:
spec_dict = syaml.load(content)
bchecksum = spec_dict['binary_cache_checksum']
# if the checksums don't match don't install
if bchecksum['hash'] != checksum:
shutil.rmtree(tmpdir)
raise NoChecksumException(
"Package tarball failed checksum verification.\n"
"It cannot be installed.")
bchecksum = spec_dict['binary_cache_checksum']
filename = download_result['tarball_stage'].save_filename
signature_verified = download_result['signature_verified']
tmpdir = None
if ('buildcache_layout_version' not in spec_dict or
int(spec_dict['buildcache_layout_version']) < 1):
# Handle the older buildcache layout where the .spack file
# contains a spec json/yaml, maybe an .asc file (signature),
# and another tarball containing the actual install tree.
tmpdir = tempfile.mkdtemp()
try:
tarfile_path = _extract_inner_tarball(
spec, filename, tmpdir, unsigned, bchecksum)
except Exception as e:
_delete_staged_downloads(download_result)
shutil.rmtree(tmpdir)
raise e
else:
# Newer buildcache layout: the .spack file contains just the
# install tree; the signature, if it exists, is wrapped around
# the spec.json at the root. If signature verification was
# required, it was already done before downloading the tarball.
tarfile_path = filename
if not unsigned and not signature_verified:
raise UnsignedPackageException(
"To install unsigned packages, use the --no-check-signature option.")
# compute the sha256 checksum of the tarball
local_checksum = checksum_tarball(tarfile_path)
# if the checksums don't match don't install
if local_checksum != bchecksum['hash']:
_delete_staged_downloads(download_result)
raise NoChecksumException(
"Package tarball failed checksum verification.\n"
"It cannot be installed.")
new_relative_prefix = str(os.path.relpath(spec.prefix,
spack.store.layout.root))
@@ -1472,11 +1653,13 @@ def extract_tarball(spec, filename, allow_root=False, unsigned=False,
try:
tar.extractall(path=extract_tmp)
except Exception as e:
_delete_staged_downloads(download_result)
shutil.rmtree(extracted_dir)
raise e
try:
shutil.move(extracted_dir, spec.prefix)
except Exception as e:
_delete_staged_downloads(download_result)
shutil.rmtree(extracted_dir)
raise e
os.remove(tarfile_path)
@@ -1495,9 +1678,11 @@ def extract_tarball(spec, filename, allow_root=False, unsigned=False,
spec_id = spec.format('{name}/{hash:7}')
tty.warn('No manifest file in tarball for spec %s' % spec_id)
finally:
shutil.rmtree(tmpdir)
if tmpdir:
shutil.rmtree(tmpdir)
if os.path.exists(filename):
os.remove(filename)
_delete_staged_downloads(download_result)
def install_root_node(spec, allow_root, unsigned=False, force=False, sha256=None):
@@ -1525,21 +1710,23 @@ def install_root_node(spec, allow_root, unsigned=False, force=False, sha256=None
warnings.warn("Package for spec {0} already installed.".format(spec.format()))
return
tarball = download_tarball(spec)
if not tarball:
download_result = download_tarball(spec, unsigned)
if not download_result:
msg = 'download of binary cache file for spec "{0}" failed'
raise RuntimeError(msg.format(spec.format()))
if sha256:
checker = spack.util.crypto.Checker(sha256)
msg = 'cannot verify checksum for "{0}" [expected={1}]'
msg = msg.format(tarball, sha256)
if not checker.check(tarball):
tarball_path = download_result['tarball_stage'].save_filename
msg = msg.format(tarball_path, sha256)
if not checker.check(tarball_path):
_delete_staged_downloads(download_result)
raise spack.binary_distribution.NoChecksumException(msg)
tty.debug('Verified SHA256 checksum of the build cache')
tty.msg('Installing "{0}" from a buildcache'.format(spec.format()))
extract_tarball(spec, tarball, allow_root, unsigned, force)
extract_tarball(spec, download_result, allow_root, unsigned, force)
spack.hooks.post_install(spec)
spack.store.db.add(spec, spack.store.layout)
@@ -1565,6 +1752,8 @@ def try_direct_fetch(spec, mirrors=None):
"""
deprecated_specfile_name = tarball_name(spec, '.spec.yaml')
specfile_name = tarball_name(spec, '.spec.json')
signed_specfile_name = tarball_name(spec, '.spec.json.sig')
specfile_is_signed = False
specfile_is_json = True
found_specs = []
@@ -1573,24 +1762,35 @@ def try_direct_fetch(spec, mirrors=None):
mirror.fetch_url, _build_cache_relative_path, deprecated_specfile_name)
buildcache_fetch_url_json = url_util.join(
mirror.fetch_url, _build_cache_relative_path, specfile_name)
buildcache_fetch_url_signed_json = url_util.join(
mirror.fetch_url, _build_cache_relative_path, signed_specfile_name)
try:
_, _, fs = web_util.read_from_url(buildcache_fetch_url_json)
_, _, fs = web_util.read_from_url(buildcache_fetch_url_signed_json)
specfile_is_signed = True
except (URLError, web_util.SpackWebError, HTTPError) as url_err:
try:
_, _, fs = web_util.read_from_url(buildcache_fetch_url_yaml)
specfile_is_json = False
except (URLError, web_util.SpackWebError, HTTPError) as url_err_y:
tty.debug('Did not find {0} on {1}'.format(
specfile_name, buildcache_fetch_url_json), url_err)
tty.debug('Did not find {0} on {1}'.format(
specfile_name, buildcache_fetch_url_yaml), url_err_y)
continue
_, _, fs = web_util.read_from_url(buildcache_fetch_url_json)
except (URLError, web_util.SpackWebError, HTTPError) as url_err_x:
try:
_, _, fs = web_util.read_from_url(buildcache_fetch_url_yaml)
specfile_is_json = False
except (URLError, web_util.SpackWebError, HTTPError) as url_err_y:
tty.debug('Did not find {0} on {1}'.format(
specfile_name, buildcache_fetch_url_signed_json), url_err)
tty.debug('Did not find {0} on {1}'.format(
specfile_name, buildcache_fetch_url_json), url_err_x)
tty.debug('Did not find {0} on {1}'.format(
specfile_name, buildcache_fetch_url_yaml), url_err_y)
continue
specfile_contents = codecs.getreader('utf-8')(fs).read()
# read the spec from the build cache file. All specs in build caches
# are concrete (as they are built) so we need to mark this spec
# concrete on read-in.
if specfile_is_json:
if specfile_is_signed:
specfile_json = Spec.extract_json_from_clearsig(specfile_contents)
fetched_spec = Spec.from_dict(specfile_json)
elif specfile_is_json:
fetched_spec = Spec.from_json(specfile_contents)
else:
fetched_spec = Spec.from_yaml(specfile_contents)
@@ -1627,7 +1827,7 @@ def get_mirrors_for_spec(spec=None, mirrors_to_check=None, index_only=False):
tty.debug("No Spack mirrors are currently configured")
return {}
results = binary_index.find_built_spec(spec)
results = binary_index.find_built_spec(spec, mirrors_to_check=mirrors_to_check)
# Maybe we just didn't have the latest information from the mirror, so
# try to fetch directly, unless we are only considering the indices.
@@ -1917,7 +2117,8 @@ def download_single_spec(
'path': local_tarball_path,
'required': True,
}, {
'url': [tarball_name(concrete_spec, '.spec.json'),
'url': [tarball_name(concrete_spec, '.spec.json.sig'),
tarball_name(concrete_spec, '.spec.json'),
tarball_name(concrete_spec, '.spec.yaml')],
'path': destination,
'required': True,

View File

@@ -5,6 +5,7 @@
from __future__ import print_function
import contextlib
import copy
import fnmatch
import functools
import json
@@ -21,6 +22,7 @@
import llnl.util.filesystem as fs
import llnl.util.tty as tty
from llnl.util.lang import GroupedExceptionHandler
import spack.binary_distribution
import spack.config
@@ -36,6 +38,11 @@
import spack.util.environment
import spack.util.executable
import spack.util.path
import spack.util.spack_yaml
import spack.util.url
#: Name of the file containing metadata about the bootstrapping source
METADATA_YAML_FILENAME = 'metadata.yaml'
#: Map a bootstrapper type to the corresponding class
_bootstrap_methods = {}
@@ -73,32 +80,41 @@ def _try_import_from_store(module, query_spec, query_info=None):
for candidate_spec in installed_specs:
pkg = candidate_spec['python'].package
module_paths = {
module_paths = [
os.path.join(candidate_spec.prefix, pkg.purelib),
os.path.join(candidate_spec.prefix, pkg.platlib),
}
sys.path.extend(module_paths)
] # type: list[str]
path_before = list(sys.path)
# NOTE: try module_paths first and last: trying them last allows an existing
# version already in sys.path to be picked up and used (possibly depending on
# something in the store), while trying them first lets the bootstrap
# version work even when an incompatible version is in sys.path
orders = [
module_paths + sys.path,
sys.path + module_paths,
]
for path in orders:
sys.path = path
try:
_fix_ext_suffix(candidate_spec)
if _python_import(module):
msg = ('[BOOTSTRAP MODULE {0}] The installed spec "{1}/{2}" '
'provides the "{0}" Python module').format(
module, query_spec, candidate_spec.dag_hash()
)
tty.debug(msg)
if query_info is not None:
query_info['spec'] = candidate_spec
return True
except Exception as e:
msg = ('unexpected error while trying to import module '
'"{0}" from spec "{1}" [error="{2}"]')
tty.warn(msg.format(module, candidate_spec, str(e)))
else:
msg = "Spec {0} did not provide module {1}"
tty.warn(msg.format(candidate_spec, module))
try:
_fix_ext_suffix(candidate_spec)
if _python_import(module):
msg = ('[BOOTSTRAP MODULE {0}] The installed spec "{1}/{2}" '
'provides the "{0}" Python module').format(
module, query_spec, candidate_spec.dag_hash()
)
tty.debug(msg)
if query_info is not None:
query_info['spec'] = candidate_spec
return True
except Exception as e:
msg = ('unexpected error while trying to import module '
'"{0}" from spec "{1}" [error="{2}"]')
tty.warn(msg.format(module, candidate_spec, str(e)))
else:
msg = "Spec {0} did not provide module {1}"
tty.warn(msg.format(candidate_spec, module))
sys.path = sys.path[:-3]
sys.path = path_before
return False
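
A standalone sketch of the import-order dance above: the same module is attempted with the store paths prepended and then appended, and sys.path is restored in every case. store_paths and the function name are hypothetical.

import importlib
import sys


def probe_import(module_name, store_paths):
    path_before = list(sys.path)
    for order in (store_paths + sys.path, sys.path + store_paths):
        sys.path = order
        try:
            importlib.import_module(module_name)
            return True
        except ImportError:
            continue
        finally:
            sys.path = path_before
    return False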
@@ -203,12 +219,43 @@ def _executables_in_store(executables, query_spec, query_info=None):
return False
@_bootstrapper(type='buildcache')
class _BuildcacheBootstrapper(object):
"""Install the software needed during bootstrapping from a buildcache."""
class _BootstrapperBase(object):
"""Base class to derive types that can bootstrap software for Spack"""
config_scope_name = ''
def __init__(self, conf):
self.name = conf['name']
self.url = conf['info']['url']
@property
def mirror_url(self):
# Absolute paths
if os.path.isabs(self.url):
return spack.util.url.format(self.url)
# Check for :// and assume it's an url if we find it
if '://' in self.url:
return self.url
# Otherwise, it's a relative path
return spack.util.url.format(os.path.join(self.metadata_dir, self.url))
@property
def mirror_scope(self):
return spack.config.InternalConfigScope(
self.config_scope_name, {'mirrors:': {self.name: self.mirror_url}}
)
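
The three URL shapes handled by mirror_url above, restated as a runnable sketch with a local stand-in for spack.util.url.format (which, for a filesystem path, produces a file:// URL). All paths are hypothetical.

import os


def as_file_url(path):
    return "file://" + os.path.abspath(path)


def resolve_mirror_url(url, metadata_dir):
    if os.path.isabs(url):
        return as_file_url(url)
    if "://" in url:
        return url  # already a URL, returned unchanged
    return as_file_url(os.path.join(metadata_dir, url))


print(resolve_mirror_url("https://mirror.spack.io/bootstrap", "/opt/metadata"))
print(resolve_mirror_url("../buildcache", "/opt/bootstrap/metadata"))
# file:///opt/bootstrap/buildcache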
@_bootstrapper(type='buildcache')
class _BuildcacheBootstrapper(_BootstrapperBase):
"""Install the software needed during bootstrapping from a buildcache."""
config_scope_name = 'bootstrap_buildcache'
def __init__(self, conf):
super(_BuildcacheBootstrapper, self).__init__(conf)
self.metadata_dir = spack.util.path.canonicalize_path(conf['metadata'])
self.last_search = None
@staticmethod
@@ -231,9 +278,8 @@ def _spec_and_platform(abstract_spec_str):
def _read_metadata(self, package_name):
"""Return metadata about the given package."""
json_filename = '{0}.json'.format(package_name)
json_path = os.path.join(
spack.paths.share_path, 'bootstrap', self.name, json_filename
)
json_dir = self.metadata_dir
json_path = os.path.join(json_dir, json_filename)
with open(json_path) as f:
data = json.load(f)
return data
@@ -307,12 +353,6 @@ def _install_and_test(
return True
return False
@property
def mirror_scope(self):
return spack.config.InternalConfigScope(
'bootstrap_buildcache', {'mirrors:': {self.name: self.url}}
)
def try_import(self, module, abstract_spec_str):
test_fn, info = functools.partial(_try_import_from_store, module), {}
if test_fn(query_spec=abstract_spec_str, query_info=info):
@@ -342,9 +382,13 @@ def try_search_path(self, executables, abstract_spec_str):
@_bootstrapper(type='install')
class _SourceBootstrapper(object):
class _SourceBootstrapper(_BootstrapperBase):
"""Install the software needed during bootstrapping from sources."""
config_scope_name = 'bootstrap_source'
def __init__(self, conf):
super(_SourceBootstrapper, self).__init__(conf)
self.metadata_dir = spack.util.path.canonicalize_path(conf['metadata'])
self.conf = conf
self.last_search = None
@@ -377,7 +421,8 @@ def try_import(self, module, abstract_spec_str):
tty.debug(msg.format(module, abstract_spec_str))
# Install the spec that should make the module importable
concrete_spec.package.do_install(fail_fast=True)
with spack.config.override(self.mirror_scope):
concrete_spec.package.do_install(fail_fast=True)
if _try_import_from_store(module, query_spec=concrete_spec, query_info=info):
self.last_search = info
@@ -390,6 +435,8 @@ def try_search_path(self, executables, abstract_spec_str):
self.last_search = info
return True
tty.info("Bootstrapping {0} from sources".format(abstract_spec_str))
# If we compile code from sources, detecting a few build tools
# may reduce compilation time by a fair amount
_add_externals_if_missing()
@@ -402,7 +449,8 @@ def try_search_path(self, executables, abstract_spec_str):
msg = "[BOOTSTRAP] Try installing '{0}' from sources"
tty.debug(msg.format(abstract_spec_str))
concrete_spec.package.do_install()
with spack.config.override(self.mirror_scope):
concrete_spec.package.do_install()
if _executables_in_store(executables, concrete_spec, query_info=info):
self.last_search = info
return True
@@ -417,11 +465,11 @@ def _make_bootstrapper(conf):
return _bootstrap_methods[btype](conf)
def _source_is_trusted(conf):
def source_is_enabled_or_raise(conf):
"""Raise ValueError if the source is not enabled for bootstrapping"""
trusted, name = spack.config.get('bootstrap:trusted'), conf['name']
if name not in trusted:
return False
return trusted[name]
if not trusted.get(name, False):
raise ValueError('source is not trusted')
def spec_for_current_python():
@@ -486,36 +534,26 @@ def ensure_module_importable_or_raise(module, abstract_spec=None):
return
abstract_spec = abstract_spec or module
source_configs = spack.config.get('bootstrap:sources', [])
errors = {}
h = GroupedExceptionHandler()
for current_config in source_configs:
if not _source_is_trusted(current_config):
msg = ('[BOOTSTRAP MODULE {0}] Skipping source "{1}" since it is '
'not trusted').format(module, current_config['name'])
tty.debug(msg)
continue
for current_config in bootstrapping_sources():
with h.forward(current_config['name']):
source_is_enabled_or_raise(current_config)
b = _make_bootstrapper(current_config)
try:
b = _make_bootstrapper(current_config)
if b.try_import(module, abstract_spec):
return
except Exception as e:
msg = '[BOOTSTRAP MODULE {0}] Unexpected error "{1}"'
tty.debug(msg.format(module, str(e)))
errors[current_config['name']] = e
# We couldn't import in any way, so raise an import error
msg = 'cannot bootstrap the "{0}" Python module'.format(module)
assert h, 'expected at least one exception to have been raised at this point: while bootstrapping {0}'.format(module) # noqa: E501
msg = 'cannot bootstrap the "{0}" Python module '.format(module)
if abstract_spec:
msg += ' from spec "{0}"'.format(abstract_spec)
msg += ' due to the following failures:\n'
for method in errors:
err = errors[method]
msg += " '{0}' raised {1}: {2}\n".format(
method, err.__class__.__name__, str(err))
msg += ' Please run `spack -d spec zlib` for more verbose error messages'
msg += 'from spec "{0}" '.format(abstract_spec)
if tty.is_debug():
msg += h.grouped_message(with_tracebacks=True)
else:
msg += h.grouped_message(with_tracebacks=False)
msg += '\nRun `spack --debug ...` for more detailed errors'
raise ImportError(msg)
@@ -538,16 +576,14 @@ def ensure_executables_in_path_or_raise(executables, abstract_spec):
return cmd
executables_str = ', '.join(executables)
source_configs = spack.config.get('bootstrap:sources', [])
for current_config in source_configs:
if not _source_is_trusted(current_config):
msg = ('[BOOTSTRAP EXECUTABLES {0}] Skipping source "{1}" since it is '
'not trusted').format(executables_str, current_config['name'])
tty.debug(msg)
continue
b = _make_bootstrapper(current_config)
try:
h = GroupedExceptionHandler()
for current_config in bootstrapping_sources():
with h.forward(current_config['name']):
source_is_enabled_or_raise(current_config)
b = _make_bootstrapper(current_config)
if b.try_search_path(executables, abstract_spec):
# Additional environment variables needed
concrete_spec, cmd = b.last_search['spec'], b.last_search['command']
@@ -562,14 +598,16 @@ def ensure_executables_in_path_or_raise(executables, abstract_spec):
)
cmd.add_default_envmod(env_mods)
return cmd
except Exception as e:
msg = '[BOOTSTRAP EXECUTABLES {0}] Unexpected error "{1}"'
tty.debug(msg.format(executables_str, str(e)))
# We couldn't import in any way, so raise an import error
msg = 'cannot bootstrap any of the {0} executables'.format(executables_str)
assert h, 'expected at least one exception to have been raised at this point: while bootstrapping {0}'.format(executables_str) # noqa: E501
msg = 'cannot bootstrap any of the {0} executables '.format(executables_str)
if abstract_spec:
msg += ' from spec "{0}"'.format(abstract_spec)
msg += 'from spec "{0}" '.format(abstract_spec)
if tty.is_debug():
msg += h.grouped_message(with_tracebacks=True)
else:
msg += h.grouped_message(with_tracebacks=False)
msg += '\nRun `spack --debug ...` for more detailed errors'
raise RuntimeError(msg)
@@ -826,6 +864,19 @@ def ensure_flake8_in_path_or_raise():
return ensure_executables_in_path_or_raise([executable], abstract_spec=root_spec)
def all_root_specs(development=False):
"""Return a list of all the root specs that may be used to bootstrap Spack.
Args:
development (bool): if True include dev dependencies
"""
specs = [clingo_root_spec(), gnupg_root_spec(), patchelf_root_spec()]
if development:
specs += [isort_root_spec(), mypy_root_spec(),
black_root_spec(), flake8_root_spec()]
return specs
def _missing(name, purpose, system_only=True):
"""Message to be printed if an executable is not found"""
msg = '[{2}] MISSING "{0}": {1}'
@@ -963,3 +1014,23 @@ def status_message(section):
msg += '\n'
msg = msg.format(pass_token if not missing_software else fail_token)
return msg, missing_software
def bootstrapping_sources(scope=None):
"""Return the list of configured sources of software for bootstrapping Spack
Args:
scope (str or None): if a valid configuration scope is given, return the
list only from that scope
"""
source_configs = spack.config.get('bootstrap:sources', default=None, scope=scope)
source_configs = source_configs or []
list_of_sources = []
for entry in source_configs:
current = copy.copy(entry)
metadata_dir = spack.util.path.canonicalize_path(entry['metadata'])
metadata_yaml = os.path.join(metadata_dir, METADATA_YAML_FILENAME)
with open(metadata_yaml) as f:
current.update(spack.util.spack_yaml.load(f))
list_of_sources.append(current)
return list_of_sources

View File

@@ -55,7 +55,7 @@
import spack.config
import spack.install_test
import spack.main
import spack.package
import spack.package_base
import spack.paths
import spack.platforms
import spack.repo
@@ -722,7 +722,7 @@ def get_std_cmake_args(pkg):
package were a CMakePackage instance.
Args:
pkg (spack.package.PackageBase): package under consideration
pkg (spack.package_base.PackageBase): package under consideration
Returns:
list: arguments for cmake
@@ -738,7 +738,7 @@ def get_std_meson_args(pkg):
package were a MesonPackage instance.
Args:
pkg (spack.package.PackageBase): package under consideration
pkg (spack.package_base.PackageBase): package under consideration
Returns:
list: arguments for meson
@@ -748,12 +748,12 @@ def get_std_meson_args(pkg):
def parent_class_modules(cls):
"""
Get list of superclass modules that descend from spack.package.PackageBase
Get list of superclass modules that descend from spack.package_base.PackageBase
Includes cls.__module__
"""
if (not issubclass(cls, spack.package.PackageBase) or
issubclass(spack.package.PackageBase, cls)):
if (not issubclass(cls, spack.package_base.PackageBase) or
issubclass(spack.package_base.PackageBase, cls)):
return []
result = []
module = sys.modules.get(cls.__module__)
@@ -771,7 +771,7 @@ def load_external_modules(pkg):
associated with them.
Args:
pkg (spack.package.PackageBase): package to load deps for
pkg (spack.package_base.PackageBase): package to load deps for
"""
for dep in list(pkg.spec.traverse()):
external_modules = dep.external_modules or []
@@ -1109,7 +1109,7 @@ def start_build_process(pkg, function, kwargs):
Args:
pkg (spack.package.PackageBase): package whose environment we should set up the
pkg (spack.package_base.PackageBase): package whose environment we should set up the
child process for.
function (typing.Callable): argless function to run in the child
process.
@@ -1234,7 +1234,7 @@ def make_stack(tb, stack=None):
if 'self' in frame.f_locals:
# Find the first proper subclass of PackageBase.
obj = frame.f_locals['self']
if isinstance(obj, spack.package.PackageBase):
if isinstance(obj, spack.package_base.PackageBase):
break
# We found obj, the Package implementation we care about.

View File

@@ -9,7 +9,7 @@
from spack.build_systems.autotools import AutotoolsPackage
from spack.directives import extends
from spack.package import ExtensionError
from spack.package_base import ExtensionError
from spack.util.executable import which

View File

@@ -16,7 +16,7 @@
from spack.build_environment import InstallError
from spack.directives import conflicts, depends_on
from spack.operating_systems.mac_os import macos_version
from spack.package import PackageBase, run_after, run_before
from spack.package_base import PackageBase, run_after, run_before
from spack.util.executable import Executable
from spack.version import Version

View File

@@ -8,7 +8,7 @@
from llnl.util.filesystem import install, mkdirp
from spack.build_systems.cmake import CMakePackage
from spack.package import run_after
from spack.package_base import run_after
def cmake_cache_path(name, value, comment=""):
@@ -210,6 +210,10 @@ def std_initconfig_entries(self):
"#------------------{0}\n".format("-" * 60),
]
def initconfig_package_entries(self):
"""This method is to be overwritten by the package"""
return []
def initconfig(self, spec, prefix):
cache_entries = (self.std_initconfig_entries() +
self.initconfig_compiler_entries() +

View File

@@ -18,7 +18,7 @@
import spack.build_environment
from spack.directives import conflicts, depends_on, variant
from spack.package import InstallError, PackageBase, run_after
from spack.package_base import InstallError, PackageBase, run_after
from spack.util.path import convert_to_posix_path
# Regex to extract the primary generator from the CMake generator

View File

@@ -6,7 +6,7 @@
import spack.variant
from spack.directives import conflicts, depends_on, variant
from spack.multimethod import when
from spack.package import PackageBase
from spack.package_base import PackageBase
class CudaPackage(PackageBase):
@@ -37,6 +37,7 @@ class CudaPackage(PackageBase):
variant('cuda_arch',
description='CUDA architecture',
values=spack.variant.any_combination_of(*cuda_arch_values),
sticky=True,
when='+cuda')
# https://docs.nvidia.com/cuda/cuda-compiler-driver-nvcc/index.html#nvcc-examples

View File

@@ -3,14 +3,16 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import spack.package
from typing import Optional
import spack.package_base
import spack.util.url
class GNUMirrorPackage(spack.package.PackageBase):
class GNUMirrorPackage(spack.package_base.PackageBase):
"""Mixin that takes care of setting url and mirrors for GNU packages."""
#: Path of the package in a GNU mirror
gnu_mirror_path = None
gnu_mirror_path = None # type: Optional[str]
#: List of GNU mirrors used by Spack
base_mirrors = [

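A hypothetical package built on this mixin: a subclass only needs to set gnu_mirror_path, which the base class expands against its mirror list to derive the download URL and mirror URLs. Version and checksum directives are omitted for brevity.

class Hello(GNUMirrorPackage):
    """GNU Hello (illustration only)."""
    gnu_mirror_path = 'hello/hello-2.10.tar.gz'
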
View File

@@ -26,7 +26,7 @@
import spack.error
from spack.build_environment import dso_suffix
from spack.package import InstallError, PackageBase, run_after
from spack.package_base import InstallError, PackageBase, run_after
from spack.util.environment import EnvironmentModifications
from spack.util.executable import Executable
from spack.util.prefix import Prefix
@@ -1115,7 +1115,7 @@ def _setup_dependent_env_callback(
raise InstallError('compilers_of_client arg required for MPI')
def setup_dependent_package(self, module, dep_spec):
# https://spack.readthedocs.io/en/latest/spack.html#spack.package.PackageBase.setup_dependent_package
# https://spack.readthedocs.io/en/latest/spack.html#spack.package_base.PackageBase.setup_dependent_package
# Reminder: "module" refers to Python module.
# Called before the install() method of dependents.
@@ -1259,6 +1259,14 @@ def install(self, spec, prefix):
for f in glob.glob('%s/intel*log' % tmpdir):
install(f, dst)
@run_after('install')
def validate_install(self):
# Sometimes the installer exits with an error but doesn't pass a
# non-zero exit code to spack. Check for the existence of a 'bin'
# directory to catch this error condition.
if not os.path.exists(self.prefix.bin):
raise InstallError('The installer has failed to install anything.')
@run_after('install')
def configure_rpath(self):
if '+rpath' not in self.spec:


@@ -10,7 +10,7 @@
from spack.directives import depends_on, extends
from spack.multimethod import when
from spack.package import PackageBase
from spack.package_base import PackageBase
from spack.util.executable import Executable


@@ -11,7 +11,7 @@
from llnl.util.filesystem import working_dir
from spack.directives import conflicts
from spack.package import PackageBase, run_after
from spack.package_base import PackageBase, run_after
class MakefilePackage(PackageBase):


@@ -7,7 +7,7 @@
from llnl.util.filesystem import install_tree, working_dir
from spack.directives import depends_on
from spack.package import PackageBase, run_after
from spack.package_base import PackageBase, run_after
from spack.util.executable import which


@@ -11,7 +11,7 @@
from llnl.util.filesystem import working_dir
from spack.directives import depends_on, variant
from spack.package import PackageBase, run_after
from spack.package_base import PackageBase, run_after
class MesonPackage(PackageBase):


@@ -6,7 +6,7 @@
import inspect
from spack.directives import extends
from spack.package import PackageBase, run_after
from spack.package_base import PackageBase, run_after
class OctavePackage(PackageBase):


@@ -14,7 +14,7 @@
from llnl.util.filesystem import find_headers, find_libraries, join_path
from spack.package import Package
from spack.package_base import Package
from spack.util.environment import EnvironmentModifications
from spack.util.executable import Executable


@@ -10,7 +10,7 @@
from llnl.util.filesystem import filter_file
from spack.directives import extends
from spack.package import PackageBase, run_after
from spack.package_base import PackageBase, run_after
from spack.util.executable import Executable


@@ -6,6 +6,7 @@
import os
import re
import shutil
from typing import Optional
import llnl.util.tty as tty
from llnl.util.filesystem import (
@@ -19,13 +20,13 @@
from llnl.util.lang import match_predicate
from spack.directives import depends_on, extends
from spack.package import PackageBase, run_after
from spack.package_base import PackageBase, run_after
class PythonPackage(PackageBase):
"""Specialized class for packages that are built using pip."""
#: Package name, version, and extension on PyPI
pypi = None
pypi = None # type: Optional[str]
maintainers = ['adamjstewart']
@@ -46,7 +47,7 @@ class PythonPackage(PackageBase):
# package manually
depends_on('py-wheel', type='build')
py_namespace = None
py_namespace = None # type: Optional[str]
@staticmethod
def _std_args(cls):

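The `# type: Optional[str]` comments added to class attributes throughout these build-system files are PEP 484 comment-style annotations, the only annotation form that still parses on Python 2, which Spack supported at the time. A minimal illustration (names are made up):

from typing import Optional

class Example(object):
    #: attribute documented for Sphinx; the trailing comment types it for mypy
    pypi = None  # type: Optional[str]

    def set_pypi(self, value):
        # type: (str) -> None
        self.pypi = value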

@@ -9,7 +9,7 @@
from llnl.util.filesystem import working_dir
from spack.directives import depends_on
from spack.package import PackageBase, run_after
from spack.package_base import PackageBase, run_after
class QMakePackage(PackageBase):


@@ -5,9 +5,10 @@
import inspect
from typing import Optional
from spack.directives import extends
from spack.package import PackageBase, run_after
from spack.package_base import PackageBase, run_after
class RPackage(PackageBase):
@@ -28,10 +29,10 @@ class RPackage(PackageBase):
# package attributes that can be expanded to set the homepage, url,
# list_url, and git values
# For CRAN packages
cran = None
cran = None # type: Optional[str]
# For Bioconductor packages
bioc = None
bioc = None # type: Optional[str]
maintainers = ['glennpj']


@@ -3,13 +3,14 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
from typing import Optional
import llnl.util.tty as tty
from llnl.util.filesystem import working_dir
from spack.build_environment import SPACK_NO_PARALLEL_MAKE, determine_number_of_jobs
from spack.directives import extends
from spack.package import PackageBase
from spack.package_base import PackageBase
from spack.util.environment import env_flag
from spack.util.executable import Executable, ProcessError
@@ -36,8 +37,8 @@ class RacketPackage(PackageBase):
extends('racket')
pkgs = False
subdirectory = None
name = None
subdirectory = None # type: Optional[str]
name = None # type: Optional[str]
parallel = True
@property


@@ -77,7 +77,7 @@
import spack.variant
from spack.directives import conflicts, depends_on, variant
from spack.package import PackageBase
from spack.package_base import PackageBase
class ROCmPackage(PackageBase):


@@ -7,7 +7,7 @@
import inspect
from spack.directives import extends
from spack.package import PackageBase, run_after
from spack.package_base import PackageBase, run_after
class RubyPackage(PackageBase):


@@ -7,7 +7,7 @@
import inspect
from spack.directives import depends_on
from spack.package import PackageBase, run_after
from spack.package_base import PackageBase, run_after
class SConsPackage(PackageBase):


@@ -11,7 +11,7 @@
from llnl.util.filesystem import find, join_path, working_dir
from spack.directives import depends_on, extends
from spack.package import PackageBase, run_after
from spack.package_base import PackageBase, run_after
class SIPPackage(PackageBase):


@@ -3,15 +3,17 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import spack.package
from typing import Optional
import spack.package_base
import spack.util.url
class SourceforgePackage(spack.package.PackageBase):
class SourceforgePackage(spack.package_base.PackageBase):
"""Mixin that takes care of setting url and mirrors for Sourceforge
packages."""
#: Path of the package in a Sourceforge mirror
sourceforge_mirror_path = None
sourceforge_mirror_path = None # type: Optional[str]
#: List of Sourceforge mirrors used by Spack
base_mirrors = [


@@ -2,16 +2,17 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from typing import Optional
import spack.package
import spack.package_base
import spack.util.url
class SourcewarePackage(spack.package.PackageBase):
class SourcewarePackage(spack.package_base.PackageBase):
"""Mixin that takes care of setting url and mirrors for Sourceware.org
packages."""
#: Path of the package in a Sourceware mirror
sourceware_mirror_path = None
sourceware_mirror_path = None # type: Optional[str]
#: List of Sourceware mirrors used by Spack
base_mirrors = [


@@ -9,7 +9,7 @@
from llnl.util.filesystem import working_dir
from spack.directives import depends_on
from spack.package import PackageBase, run_after
from spack.package_base import PackageBase, run_after
class WafPackage(PackageBase):


@@ -3,15 +3,17 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import spack.package
from typing import Optional
import spack.package_base
import spack.util.url
class XorgPackage(spack.package.PackageBase):
class XorgPackage(spack.package_base.PackageBase):
"""Mixin that takes care of setting url and mirrors for x.org
packages."""
#: Path of the package in a x.org mirror
xorg_mirror_path = None
xorg_mirror_path = None # type: Optional[str]
#: List of x.org mirrors used by Spack
# Note: x.org mirrors are a bit tricky, since many are out-of-sync or off.


@@ -33,7 +33,6 @@
import spack.util.executable as exe
import spack.util.gpg as gpg_util
import spack.util.spack_yaml as syaml
import spack.util.url as url_util
import spack.util.web as web_util
from spack.error import SpackError
from spack.spec import Spec
@@ -42,10 +41,8 @@
'always',
]
SPACK_PR_MIRRORS_ROOT_URL = 's3://spack-binaries-prs'
SPACK_SHARED_PR_MIRROR_URL = url_util.join(SPACK_PR_MIRRORS_ROOT_URL,
'shared_pr_mirror')
TEMP_STORAGE_MIRROR_NAME = 'ci_temporary_mirror'
SPACK_RESERVED_TAGS = ["public", "protected", "notary"]
spack_gpg = spack.main.SpackCommand('gpg')
spack_compiler = spack.main.SpackCommand('compiler')
@@ -199,6 +196,11 @@ def _get_cdash_build_name(spec, build_group):
spec.name, spec.version, spec.compiler, spec.architecture, build_group)
def _remove_reserved_tags(tags):
"""Convenience function to strip reserved tags from jobs"""
return [tag for tag in tags if tag not in SPACK_RESERVED_TAGS]
def _get_spec_string(spec):
format_elements = [
'{name}{@version}',
@@ -231,8 +233,10 @@ def _add_dependency(spec_label, dep_label, deps):
deps[spec_label].add(dep_label)
def _get_spec_dependencies(specs, deps, spec_labels, check_index_only=False):
spec_deps_obj = _compute_spec_deps(specs, check_index_only=check_index_only)
def _get_spec_dependencies(specs, deps, spec_labels, check_index_only=False,
mirrors_to_check=None):
spec_deps_obj = _compute_spec_deps(specs, check_index_only=check_index_only,
mirrors_to_check=mirrors_to_check)
if spec_deps_obj:
dependencies = spec_deps_obj['dependencies']
@@ -249,7 +253,7 @@ def _get_spec_dependencies(specs, deps, spec_labels, check_index_only=False):
_add_dependency(entry['spec'], entry['depends'], deps)
def stage_spec_jobs(specs, check_index_only=False):
def stage_spec_jobs(specs, check_index_only=False, mirrors_to_check=None):
"""Take a set of release specs and generate a list of "stages", where the
jobs in any stage are dependent only on jobs in previous stages. This
allows us to maximize build parallelism within the gitlab-ci framework.
@@ -261,6 +265,8 @@ def stage_spec_jobs(specs, check_index_only=False):
are up to date on those mirrors. This flag limits that search to
the binary cache indices on those mirrors to speed the process up,
even though there is no guarantee the index is up to date.
mirrors_to_check: Optional mapping giving mirrors to check instead of
any configured mirrors.
Returns: A tuple of information objects describing the specs, dependencies
and stages:
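The staging described in this docstring is a topological peel: each stage consists of the specs whose dependencies are all satisfied by earlier stages. A self-contained sketch of the idea, independent of the helpers in this file:

def stage(deps):
    """deps maps a spec label to the set of labels it depends on."""
    stages, done = [], set()
    remaining = {label: set(d) for label, d in deps.items()}
    while remaining:
        ready = {label for label, d in remaining.items() if d <= done}
        if not ready:
            raise ValueError('dependency cycle detected')
        stages.append(sorted(ready))
        done |= ready
        for label in ready:
            del remaining[label]
    return stages

# [['a'], ['b', 'c'], ['d']]: jobs within one stage can build in parallel
print(stage({'a': [], 'b': ['a'], 'c': ['a'], 'd': ['b', 'c']}))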
@@ -297,8 +303,8 @@ def _remove_satisfied_deps(deps, satisfied_list):
deps = {}
spec_labels = {}
_get_spec_dependencies(
specs, deps, spec_labels, check_index_only=check_index_only)
_get_spec_dependencies(specs, deps, spec_labels, check_index_only=check_index_only,
mirrors_to_check=mirrors_to_check)
# Save the original deps, as we need to return them at the end of the
# function. In the while loop below, the "dependencies" variable is
@@ -340,7 +346,7 @@ def _print_staging_summary(spec_labels, dependencies, stages):
_get_spec_string(s)))
def _compute_spec_deps(spec_list, check_index_only=False):
def _compute_spec_deps(spec_list, check_index_only=False, mirrors_to_check=None):
"""
Computes all the dependencies for the spec(s) and generates a JSON
object which provides both a list of unique spec names as well as a
@@ -413,7 +419,7 @@ def append_dep(s, d):
continue
up_to_date_mirrors = bindist.get_mirrors_for_spec(
spec=s, index_only=check_index_only)
spec=s, mirrors_to_check=mirrors_to_check, index_only=check_index_only)
skey = _spec_deps_key(s)
spec_labels[skey] = {
@@ -602,8 +608,8 @@ def get_spec_filter_list(env, affected_pkgs, dependencies=True, dependents=True)
def generate_gitlab_ci_yaml(env, print_summary, output_file,
prune_dag=False, check_index_only=False,
run_optimizer=False, use_dependencies=False,
artifacts_root=None):
""" Generate a gitlab yaml file to run a dynamic chile pipeline from
artifacts_root=None, remote_mirror_override=None):
""" Generate a gitlab yaml file to run a dynamic child pipeline from
the spec matrix in the active environment.
Arguments:
@@ -629,6 +635,10 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
artifacts_root (str): Path where artifacts like logs, environment
files (spack.yaml, spack.lock), etc should be written. GitLab
requires this to be within the project directory.
remote_mirror_override (str): Typically only needed when one spack.yaml
is used to populate several mirrors with binaries, based on some
criteria. Spack protected pipelines populate different mirrors based
on branch name, facilitated by this option.
"""
with spack.concretize.disable_compiler_existence_check():
with env.write_transaction():
@@ -678,17 +688,19 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
for s in affected_specs:
tty.debug(' {0}'.format(s.name))
generate_job_name = os.environ.get('CI_JOB_NAME', None)
parent_pipeline_id = os.environ.get('CI_PIPELINE_ID', None)
# Downstream jobs will "need" (depend on, for both scheduling and
# artifacts, which include spack.lock file) this pipeline generation
# job by both name and pipeline id. If those environment variables
# do not exist, then maybe this is just running in a shell, in which
# case, there is no expectation gitlab will ever run the generated
# pipeline and those environment variables do not matter.
generate_job_name = os.environ.get('CI_JOB_NAME', 'job-does-not-exist')
parent_pipeline_id = os.environ.get('CI_PIPELINE_ID', 'pipeline-does-not-exist')
# Values: "spack_pull_request", "spack_protected_branch", or not set
spack_pipeline_type = os.environ.get('SPACK_PIPELINE_TYPE', None)
is_pr_pipeline = spack_pipeline_type == 'spack_pull_request'
spack_pr_branch = os.environ.get('SPACK_PR_BRANCH', None)
pr_mirror_url = None
if spack_pr_branch:
pr_mirror_url = url_util.join(SPACK_PR_MIRRORS_ROOT_URL,
spack_pr_branch)
spack_buildcache_copy = os.environ.get('SPACK_COPY_BUILDCACHE', None)
if 'mirrors' not in yaml_root or len(yaml_root['mirrors'].values()) < 1:
tty.die('spack ci generate requires an env containing a mirror')
@@ -743,14 +755,25 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
'strip-compilers': False,
})
# Add per-PR mirror (and shared PR mirror) if enabled, as some specs might
# be up to date in one of those and thus not need to be rebuilt.
if pr_mirror_url:
spack.mirror.add(
'ci_pr_mirror', pr_mirror_url, cfg.default_modify_scope())
spack.mirror.add('ci_shared_pr_mirror',
SPACK_SHARED_PR_MIRROR_URL,
cfg.default_modify_scope())
# If a remote mirror override (alternate buildcache destination) was
# specified, add it here in case it has already built hashes we might
# generate.
mirrors_to_check = None
if remote_mirror_override:
if spack_pipeline_type == 'spack_protected_branch':
# Overriding the main mirror in this case might result
# in skipping jobs on a release pipeline because specs are
# up to date in develop. Eventually we want to notice and take
# advantage of this by scheduling a job to copy the spec from
# develop to the release, but until we have that, this makes
# sure we schedule a rebuild job if the spec isn't already in
# override mirror.
mirrors_to_check = {
'override': remote_mirror_override
}
else:
spack.mirror.add(
'ci_pr_mirror', remote_mirror_override, cfg.default_modify_scope())
pipeline_artifacts_dir = artifacts_root
if not pipeline_artifacts_dir:
@@ -825,11 +848,13 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
phase_spec.concretize()
staged_phases[phase_name] = stage_spec_jobs(
concrete_phase_specs,
check_index_only=check_index_only)
check_index_only=check_index_only,
mirrors_to_check=mirrors_to_check)
finally:
# Clean up PR mirror if enabled
if pr_mirror_url:
spack.mirror.remove('ci_pr_mirror', cfg.default_modify_scope())
# Clean up remote mirror override if enabled
if remote_mirror_override:
if spack_pipeline_type != 'spack_protected_branch':
spack.mirror.remove('ci_pr_mirror', cfg.default_modify_scope())
all_job_names = []
output_object = {}
@@ -889,6 +914,14 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
tags = [tag for tag in runner_attribs['tags']]
if spack_pipeline_type is not None:
# For spack pipelines "public" and "protected" are reserved tags
tags = _remove_reserved_tags(tags)
if spack_pipeline_type == 'spack_protected_branch':
tags.extend(['aws', 'protected'])
elif spack_pipeline_type == 'spack_pull_request':
tags.extend(['public'])
variables = {}
if 'variables' in runner_attribs:
variables.update(runner_attribs['variables'])
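Taken together, the effective tag handling is: strip any reserved tags a job may carry, then append the tags implied by the pipeline type. A condensed sketch of just that logic (with SPACK_RESERVED_TAGS as defined earlier in this file):

SPACK_RESERVED_TAGS = ["public", "protected", "notary"]

def rewrite_tags(tags, spack_pipeline_type):
    tags = [tag for tag in tags if tag not in SPACK_RESERVED_TAGS]
    if spack_pipeline_type == 'spack_protected_branch':
        tags.extend(['aws', 'protected'])
    elif spack_pipeline_type == 'spack_pull_request':
        tags.extend(['public'])
    return tags

# ['large', 'public']: a user-supplied reserved tag is stripped, then re-added by type
print(rewrite_tags(['public', 'large'], 'spack_pull_request'))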
@@ -1174,6 +1207,10 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
service_job_config,
cleanup_job)
if 'tags' in cleanup_job:
service_tags = _remove_reserved_tags(cleanup_job['tags'])
cleanup_job['tags'] = service_tags
cleanup_job['stage'] = 'cleanup-temp-storage'
cleanup_job['script'] = [
'spack -d mirror destroy --mirror-url {0}/$CI_PIPELINE_ID'.format(
@@ -1181,9 +1218,74 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
]
cleanup_job['when'] = 'always'
cleanup_job['retry'] = service_job_retries
cleanup_job['interruptible'] = True
output_object['cleanup'] = cleanup_job
if ('signing-job-attributes' in gitlab_ci and
spack_pipeline_type == 'spack_protected_branch'):
# External signing: generate a job to check and sign binary pkgs
stage_names.append('stage-sign-pkgs')
signing_job_config = gitlab_ci['signing-job-attributes']
signing_job = {}
signing_job_attrs_to_copy = [
'image',
'tags',
'variables',
'before_script',
'script',
'after_script',
]
_copy_attributes(signing_job_attrs_to_copy,
signing_job_config,
signing_job)
signing_job_tags = []
if 'tags' in signing_job:
signing_job_tags = _remove_reserved_tags(signing_job['tags'])
for tag in ['aws', 'protected', 'notary']:
if tag not in signing_job_tags:
signing_job_tags.append(tag)
signing_job['tags'] = signing_job_tags
signing_job['stage'] = 'stage-sign-pkgs'
signing_job['when'] = 'always'
signing_job['retry'] = {
'max': 2,
'when': ['always']
}
signing_job['interruptible'] = True
output_object['sign-pkgs'] = signing_job
if spack_buildcache_copy:
# Generate a job to copy the contents from wherever the builds are getting
# pushed to the url specified in the "SPACK_BUILDCACHE_COPY" environment
# variable.
src_url = remote_mirror_override or remote_mirror_url
dest_url = spack_buildcache_copy
stage_names.append('stage-copy-buildcache')
copy_job = {
'stage': 'stage-copy-buildcache',
'tags': ['spack', 'public', 'medium', 'aws', 'x86_64'],
'image': 'ghcr.io/spack/python-aws-bash:0.0.1',
'when': 'on_success',
'interruptible': True,
'retry': service_job_retries,
'script': [
'. ./share/spack/setup-env.sh',
'spack --version',
'aws s3 sync --exclude *index.json* --exclude *pgp* {0} {1}'.format(
src_url, dest_url)
]
}
output_object['copy-mirror'] = copy_job
if rebuild_index_enabled:
# Add a final job to regenerate the index
stage_names.append('stage-rebuild-index')
@@ -1194,9 +1296,13 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
service_job_config,
final_job)
if 'tags' in final_job:
service_tags = _remove_reserved_tags(final_job['tags'])
final_job['tags'] = service_tags
index_target_mirror = mirror_urls[0]
if is_pr_pipeline:
index_target_mirror = pr_mirror_url
if remote_mirror_override:
index_target_mirror = remote_mirror_override
final_job['stage'] = 'stage-rebuild-index'
final_job['script'] = [
@@ -1205,6 +1311,7 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
]
final_job['when'] = 'always'
final_job['retry'] = service_job_retries
final_job['interruptible'] = True
output_object['rebuild-index'] = final_job
@@ -1237,8 +1344,9 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
'SPACK_PIPELINE_TYPE': str(spack_pipeline_type)
}
if pr_mirror_url:
output_object['variables']['SPACK_PR_MIRROR_URL'] = pr_mirror_url
if remote_mirror_override:
(output_object['variables']
['SPACK_REMOTE_MIRROR_OVERRIDE']) = remote_mirror_override
spack_stack_name = os.environ.get('SPACK_CI_STACK_NAME', None)
if spack_stack_name:


@@ -8,7 +8,10 @@
import argparse
import os
import re
import shlex
import sys
from textwrap import dedent
from typing import List, Tuple
import ruamel.yaml as yaml
import six
@@ -147,6 +150,58 @@ def get_command(cmd_name):
return getattr(get_module(cmd_name), pname)
class _UnquotedFlags(object):
"""Use a heuristic in `.extract()` to detect whether the user is trying to set
multiple flags like the docker ENV attribute allows (e.g. 'cflags=-Os -pipe').
If the heuristic finds a match (which can be checked with `__bool__()`), a warning
message explaining how to quote multiple flags correctly can be generated with
`.report()`.
"""
flags_arg_pattern = re.compile(
r'^({0})=([^\'"].*)$'.format(
'|'.join(spack.spec.FlagMap.valid_compiler_flags()),
))
def __init__(self, all_unquoted_flag_pairs):
# type: (List[Tuple[re.Match, str]]) -> None
self._flag_pairs = all_unquoted_flag_pairs
def __bool__(self):
# type: () -> bool
return bool(self._flag_pairs)
@classmethod
def extract(cls, sargs):
# type: (str) -> _UnquotedFlags
all_unquoted_flag_pairs = [] # type: List[Tuple[re.Match, str]]
prev_flags_arg = None
for arg in shlex.split(sargs):
if prev_flags_arg is not None:
all_unquoted_flag_pairs.append((prev_flags_arg, arg))
prev_flags_arg = cls.flags_arg_pattern.match(arg)
return cls(all_unquoted_flag_pairs)
def report(self):
# type: () -> str
single_errors = [
'({0}) {1} {2} => {3}'.format(
i + 1, match.group(0), next_arg,
'{0}="{1} {2}"'.format(match.group(1), match.group(2), next_arg),
)
for i, (match, next_arg) in enumerate(self._flag_pairs)
]
return dedent("""\
Some compiler or linker flags were provided without quoting their arguments,
which now causes spack to try to parse the *next* argument as a spec component
such as a variant instead of an additional compiler or linker flag. If the
intent was to set multiple flags, try quoting them together as described below.
Possible flag quotation errors (with the correctly-quoted version after the =>):
{0}""").format('\n'.join(single_errors))
def parse_specs(args, **kwargs):
"""Convenience function for parsing arguments from specs. Handles common
exceptions and dies if there are errors.
@@ -155,29 +210,28 @@ def parse_specs(args, **kwargs):
normalize = kwargs.get('normalize', False)
tests = kwargs.get('tests', False)
sargs = args
if not isinstance(args, six.string_types):
sargs = ' '.join(args)
unquoted_flags = _UnquotedFlags.extract(sargs)
try:
sargs = args
if not isinstance(args, six.string_types):
sargs = ' '.join(spack.util.string.quote(args))
specs = spack.spec.parse(sargs)
for spec in specs:
if concretize:
spec.concretize(tests=tests) # implies normalize
elif normalize:
spec.normalize(tests=tests)
return specs
except spack.spec.SpecParseError as e:
msg = e.message + "\n" + str(e.string) + "\n"
msg += (e.pos + 2) * " " + "^"
raise spack.error.SpackError(msg)
except spack.error.SpecError as e:
msg = e.message
if e.long_message:
msg += e.long_message
if unquoted_flags:
msg += '\n\n'
msg += unquoted_flags.report()
raise spack.error.SpackError(msg)

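The heuristic in _UnquotedFlags.extract amounts to: remember whether the previous shell word looked like flags=<unquoted value>, and if so, report the word that follows it. A standalone sketch (flag names are hardcoded here, whereas Spack derives them from spack.spec.FlagMap.valid_compiler_flags()):

import re
import shlex

FLAGS_ARG = re.compile(r'^(cflags|cxxflags|fflags|cppflags|ldflags|ldlibs)=([^\'"].*)$')

def find_unquoted_flag_pairs(sargs):
    pairs, prev_match = [], None
    for arg in shlex.split(sargs):
        if prev_match is not None:
            pairs.append((prev_match.group(0), arg))
        prev_match = FLAGS_ARG.match(arg)
    return pairs

# [('cflags=-Os', '-pipe')]: the user probably meant cflags="-Os -pipe"
print(find_unquoted_flag_pairs('zlib cflags=-Os -pipe'))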

@@ -6,7 +6,9 @@
import os.path
import shutil
import tempfile
import llnl.util.filesystem
import llnl.util.tty
import llnl.util.tty.color
@@ -15,6 +17,9 @@
import spack.cmd.common.arguments
import spack.config
import spack.main
import spack.mirror
import spack.spec
import spack.stage
import spack.util.path
description = "manage bootstrap configuration"
@@ -22,6 +27,38 @@
level = "long"
# Tarball to be downloaded if binary packages are requested in a local mirror
BINARY_TARBALL = 'https://github.com/spack/spack-bootstrap-mirrors/releases/download/v0.2/bootstrap-buildcache.tar.gz'
#: Subdirectory where to create the mirror
LOCAL_MIRROR_DIR = 'bootstrap_cache'
# Metadata for a generated binary mirror
BINARY_METADATA = {
'type': 'buildcache',
'description': ('Buildcache copied from a public tarball available on Github. '
'The sha256 checksum of binaries is checked before installation.'),
'info': {
'url': os.path.join('..', '..', LOCAL_MIRROR_DIR),
'homepage': 'https://github.com/spack/spack-bootstrap-mirrors',
'releases': 'https://github.com/spack/spack-bootstrap-mirrors/releases',
'tarball': BINARY_TARBALL
}
}
CLINGO_JSON = '$spack/share/spack/bootstrap/github-actions-v0.2/clingo.json'
GNUPG_JSON = '$spack/share/spack/bootstrap/github-actions-v0.2/gnupg.json'
# Metadata for a generated source mirror
SOURCE_METADATA = {
'type': 'install',
'description': 'Mirror with software needed to bootstrap Spack',
'info': {
'url': os.path.join('..', '..', LOCAL_MIRROR_DIR)
}
}
def _add_scope_option(parser):
scopes = spack.config.scopes()
scopes_metavar = spack.config.scopes_metavar
@@ -67,24 +104,61 @@ def setup_parser(subparser):
)
list = sp.add_parser(
'list', help='list the methods available for bootstrapping'
'list', help='list all the sources of software to bootstrap Spack'
)
_add_scope_option(list)
trust = sp.add_parser(
'trust', help='trust a bootstrapping method'
'trust', help='trust a bootstrapping source'
)
_add_scope_option(trust)
trust.add_argument(
'name', help='name of the method to be trusted'
'name', help='name of the source to be trusted'
)
untrust = sp.add_parser(
'untrust', help='untrust a bootstrapping method'
'untrust', help='untrust a bootstrapping source'
)
_add_scope_option(untrust)
untrust.add_argument(
'name', help='name of the method to be untrusted'
'name', help='name of the source to be untrusted'
)
add = sp.add_parser(
'add', help='add a new source for bootstrapping'
)
_add_scope_option(add)
add.add_argument(
'--trust', action='store_true',
help='trust the source immediately upon addition')
add.add_argument(
'name', help='name of the new source of software'
)
add.add_argument(
'metadata_dir', help='directory where to find metadata files'
)
remove = sp.add_parser(
'remove', help='remove a bootstrapping source'
)
remove.add_argument(
'name', help='name of the source to be removed'
)
mirror = sp.add_parser(
'mirror', help='create a local mirror to bootstrap Spack'
)
mirror.add_argument(
'--binary-packages', action='store_true',
help='download public binaries in the mirror'
)
mirror.add_argument(
'--dev', action='store_true',
help='download dev dependencies too'
)
mirror.add_argument(
metavar='DIRECTORY', dest='root_dir',
help='root directory in which to create the mirror and metadata'
)
@@ -137,10 +211,7 @@ def _root(args):
def _list(args):
sources = spack.config.get(
'bootstrap:sources', default=None, scope=args.scope
)
sources = spack.bootstrap.bootstrapping_sources(scope=args.scope)
if not sources:
llnl.util.tty.msg(
"No method available for bootstrapping Spack's dependencies"
@@ -249,6 +320,121 @@ def _status(args):
print()
def _add(args):
initial_sources = spack.bootstrap.bootstrapping_sources()
names = [s['name'] for s in initial_sources]
# If the name is already used error out
if args.name in names:
msg = 'a source named "{0}" already exists. Please choose a different name'
raise RuntimeError(msg.format(args.name))
# Check that the metadata file exists
metadata_dir = spack.util.path.canonicalize_path(args.metadata_dir)
if not os.path.exists(metadata_dir) or not os.path.isdir(metadata_dir):
raise RuntimeError(
'the directory "{0}" does not exist'.format(args.metadata_dir)
)
file = os.path.join(metadata_dir, 'metadata.yaml')
if not os.path.exists(file):
raise RuntimeError('the file "{0}" does not exist'.format(file))
# Insert the new source as the highest priority one
write_scope = args.scope or spack.config.default_modify_scope(section='bootstrap')
sources = spack.config.get('bootstrap:sources', scope=write_scope) or []
sources = [
{'name': args.name, 'metadata': args.metadata_dir}
] + sources
spack.config.set('bootstrap:sources', sources, scope=write_scope)
msg = 'New bootstrapping source "{0}" added in the "{1}" configuration scope'
llnl.util.tty.msg(msg.format(args.name, write_scope))
if args.trust:
_trust(args)
def _remove(args):
initial_sources = spack.bootstrap.bootstrapping_sources()
names = [s['name'] for s in initial_sources]
if args.name not in names:
msg = ('cannot find any bootstrapping source named "{0}". '
'Run `spack bootstrap list` to see available sources.')
raise RuntimeError(msg.format(args.name))
for current_scope in spack.config.scopes():
sources = spack.config.get('bootstrap:sources', scope=current_scope) or []
if args.name in [s['name'] for s in sources]:
sources = [s for s in sources if s['name'] != args.name]
spack.config.set('bootstrap:sources', sources, scope=current_scope)
msg = ('Removed the bootstrapping source named "{0}" from the '
'"{1}" configuration scope.')
llnl.util.tty.msg(msg.format(args.name, current_scope))
trusted = spack.config.get('bootstrap:trusted', scope=current_scope) or []
if args.name in trusted:
trusted.pop(args.name)
spack.config.set('bootstrap:trusted', trusted, scope=current_scope)
msg = 'Deleting information on "{0}" from list of trusted sources'
llnl.util.tty.msg(msg.format(args.name))
def _mirror(args):
mirror_dir = spack.util.path.canonicalize_path(
os.path.join(args.root_dir, LOCAL_MIRROR_DIR)
)
# TODO: Here we are adding gnuconfig manually, but this can be fixed
# TODO: as soon as we have an option to add to a mirror all the possible
# TODO: dependencies of a spec
root_specs = spack.bootstrap.all_root_specs(development=args.dev) + ['gnuconfig']
for spec_str in root_specs:
msg = 'Adding "{0}" and dependencies to the mirror at {1}'
llnl.util.tty.msg(msg.format(spec_str, mirror_dir))
# Suppress tty from the call below for terser messages
llnl.util.tty.set_msg_enabled(False)
spec = spack.spec.Spec(spec_str).concretized()
for node in spec.traverse():
spack.mirror.create(mirror_dir, [node])
llnl.util.tty.set_msg_enabled(True)
if args.binary_packages:
msg = 'Adding binary packages from "{0}" to the mirror at {1}'
llnl.util.tty.msg(msg.format(BINARY_TARBALL, mirror_dir))
llnl.util.tty.set_msg_enabled(False)
stage = spack.stage.Stage(BINARY_TARBALL, path=tempfile.mkdtemp())
stage.create()
stage.fetch()
stage.expand_archive()
build_cache_dir = os.path.join(stage.source_path, 'build_cache')
shutil.move(build_cache_dir, mirror_dir)
llnl.util.tty.set_msg_enabled(True)
def write_metadata(subdir, metadata):
metadata_rel_dir = os.path.join('metadata', subdir)
metadata_yaml = os.path.join(
args.root_dir, metadata_rel_dir, 'metadata.yaml'
)
llnl.util.filesystem.mkdirp(os.path.dirname(metadata_yaml))
with open(metadata_yaml, mode='w') as f:
spack.util.spack_yaml.dump(metadata, stream=f)
return os.path.dirname(metadata_yaml), metadata_rel_dir
instructions = ('\nTo register the mirror on the platform where it\'s supposed '
'to be used, move "{0}" to its final location and run the '
'following command(s):\n\n').format(args.root_dir)
cmd = ' % spack bootstrap add --trust {0} <final-path>/{1}\n'
_, rel_directory = write_metadata(subdir='sources', metadata=SOURCE_METADATA)
instructions += cmd.format('local-sources', rel_directory)
if args.binary_packages:
abs_directory, rel_directory = write_metadata(
subdir='binaries', metadata=BINARY_METADATA
)
shutil.copy(spack.util.path.canonicalize_path(CLINGO_JSON), abs_directory)
shutil.copy(spack.util.path.canonicalize_path(GNUPG_JSON), abs_directory)
instructions += cmd.format('local-binaries', rel_directory)
print(instructions)
def bootstrap(parser, args):
callbacks = {
'status': _status,
@@ -258,6 +444,9 @@ def bootstrap(parser, args):
'root': _root,
'list': _list,
'trust': _trust,
'untrust': _untrust
'untrust': _untrust,
'add': _add,
'remove': _remove,
'mirror': _mirror
}
callbacks[args.subcommand](args)


@@ -478,11 +478,12 @@ def save_specfile_fn(args):
if args.root_specfile:
with open(args.root_specfile) as fd:
root_spec_as_json = fd.read()
spec_format = 'yaml' if args.root_specfile.endswith('yaml') else 'json'
else:
root_spec = Spec(args.root_spec)
root_spec.concretize()
root_spec_as_json = root_spec.to_json(hash=ht.dag_hash)
spec_format = 'yaml' if args.root_specfile.endswith('yaml') else 'json'
spec_format = 'json'
save_dependency_specfiles(
root_spec_as_json, args.specfile_dir, args.specs.split(), spec_format)


@@ -14,7 +14,7 @@
import spack.repo
import spack.stage
import spack.util.crypto
from spack.package import preferred_version
from spack.package_base import preferred_version
from spack.util.naming import valid_fully_qualified_module_name
from spack.version import Version, ver


@@ -64,6 +64,11 @@ def setup_parser(subparser):
'--dependencies', action='store_true', default=False,
help="(Experimental) disable DAG scheduling; use "
' "plain" dependencies.')
generate.add_argument(
'--buildcache-destination', default=None,
help="Override the mirror configured in the environment (spack.yaml) " +
"in order to push binaries from the generated pipeline to a " +
"different location.")
prune_group = generate.add_mutually_exclusive_group()
prune_group.add_argument(
'--prune-dag', action='store_true', dest='prune_dag',
@@ -127,6 +132,7 @@ def ci_generate(args):
prune_dag = args.prune_dag
index_only = args.index_only
artifacts_root = args.artifacts_root
buildcache_destination = args.buildcache_destination
if not output_file:
output_file = os.path.abspath(".gitlab-ci.yml")
@@ -140,7 +146,8 @@ def ci_generate(args):
spack_ci.generate_gitlab_ci_yaml(
env, True, output_file, prune_dag=prune_dag,
check_index_only=index_only, run_optimizer=run_optimizer,
use_dependencies=use_dependencies, artifacts_root=artifacts_root)
use_dependencies=use_dependencies, artifacts_root=artifacts_root,
remote_mirror_override=buildcache_destination)
if copy_yaml_to:
copy_to_dir = os.path.dirname(copy_yaml_to)
@@ -180,6 +187,9 @@ def ci_rebuild(args):
if not gitlab_ci:
tty.die('spack ci rebuild requires an env containing gitlab-ci cfg')
tty.msg('SPACK_BUILDCACHE_DESTINATION={0}'.format(
os.environ.get('SPACK_BUILDCACHE_DESTINATION', None)))
# Grab the environment variables we need. These either come from the
# pipeline generation step ("spack ci generate"), where they were written
# out as variables, or else provided by GitLab itself.
@@ -196,7 +206,7 @@ def ci_rebuild(args):
compiler_action = get_env_var('SPACK_COMPILER_ACTION')
cdash_build_name = get_env_var('SPACK_CDASH_BUILD_NAME')
spack_pipeline_type = get_env_var('SPACK_PIPELINE_TYPE')
pr_mirror_url = get_env_var('SPACK_PR_MIRROR_URL')
remote_mirror_override = get_env_var('SPACK_REMOTE_MIRROR_OVERRIDE')
remote_mirror_url = get_env_var('SPACK_REMOTE_MIRROR_URL')
# Construct absolute paths relative to current $CI_PROJECT_DIR
@@ -244,6 +254,10 @@ def ci_rebuild(args):
tty.debug('Pipeline type - PR: {0}, develop: {1}'.format(
spack_is_pr_pipeline, spack_is_develop_pipeline))
# If no override url exists, then just push binary package to the
# normal remote mirror url.
buildcache_mirror_url = remote_mirror_override or remote_mirror_url
# Figure out what is our temporary storage mirror: Is it artifacts
# buildcache? Or temporary-storage-url-prefix? In some cases we need to
# force something or pipelines might not have a way to propagate build
@@ -373,7 +387,24 @@ def ci_rebuild(args):
cfg.default_modify_scope())
# Check configured mirrors for a built spec with a matching hash
matches = bindist.get_mirrors_for_spec(job_spec, index_only=False)
mirrors_to_check = None
if remote_mirror_override and spack_pipeline_type == 'spack_protected_branch':
# Passing "mirrors_to_check" below means we *only* look in the override
# mirror to see if we should skip building, which is what we want.
mirrors_to_check = {
'override': remote_mirror_override
}
# Adding this mirror to the list of configured mirrors means dependencies
# could be installed from either the override mirror or any other configured
# mirror (e.g. remote_mirror_url which is defined in the environment or
# pipeline_mirror_url), which is also what we want.
spack.mirror.add('mirror_override',
remote_mirror_override,
cfg.default_modify_scope())
matches = bindist.get_mirrors_for_spec(
job_spec, mirrors_to_check=mirrors_to_check, index_only=False)
if matches:
# Got a hash match on at least one configured mirror. All
@@ -517,13 +548,6 @@ def ci_rebuild(args):
# any logs from the staging directory to artifacts now
spack_ci.copy_stage_logs_to_artifacts(job_spec, job_log_dir)
# Create buildcache on remote mirror, either on pr-specific mirror or
# on the main mirror defined in the gitlab-enabled spack environment
if spack_is_pr_pipeline:
buildcache_mirror_url = pr_mirror_url
else:
buildcache_mirror_url = remote_mirror_url
# If the install succeeded, create a buildcache entry for this job spec
# and push it to one or more mirrors. If the install did not succeed,
# print out some instructions on how to reproduce this build failure


@@ -57,7 +57,7 @@
# See the Spack documentation for more information on packaging.
# ----------------------------------------------------------------------------
from spack import *
from spack.package import *
class {class_name}({base_class_name}):
@@ -826,7 +826,7 @@ def get_versions(args, name):
spack.util.url.require_url_format(args.url)
if args.url.startswith('file://'):
valid_url = False # No point in spidering these
except ValueError:
except (ValueError, TypeError):
valid_url = False
if args.url is not None and args.template != 'bundle' and valid_url:


@@ -11,7 +11,7 @@
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.environment as ev
import spack.package
import spack.package_base
import spack.repo
import spack.store
@@ -57,7 +57,7 @@ def dependencies(parser, args):
else:
spec = specs[0]
dependencies = spack.package.possible_dependencies(
dependencies = spack.package_base.possible_dependencies(
spec,
transitive=args.transitive,
expand_virtuals=args.expand_virtuals,


@@ -125,7 +125,7 @@ def external_find(args):
# If the list of packages is empty, search for every possible package
if not args.tags and not packages_to_check:
packages_to_check = spack.repo.path.all_packages()
packages_to_check = list(spack.repo.path.all_packages())
detected_packages = spack.detection.by_executable(
packages_to_check, path_hints=args.path)
@@ -177,7 +177,10 @@ def _collect_and_consume_cray_manifest_files(
for directory in manifest_dirs:
for fname in os.listdir(directory):
manifest_files.append(os.path.join(directory, fname))
if fname.endswith('.json'):
fpath = os.path.join(directory, fname)
tty.debug("Adding manifest file: {0}".format(fpath))
manifest_files.append(fpath)
if not manifest_files:
raise NoManifestFileError(
@@ -185,6 +188,7 @@ def _collect_and_consume_cray_manifest_files(
.format(cray_manifest.default_path))
for path in manifest_files:
tty.debug("Reading manifest file: " + path)
try:
cray_manifest.read(path, not dry_run)
except (spack.compilers.UnknownCompilerError, spack.error.SpackError) as e:
@@ -200,7 +204,7 @@ def external_list(args):
list(spack.repo.path.all_packages())
# Print all the detectable packages
tty.msg("Detectable packages per repository")
for namespace, pkgs in sorted(spack.package.detectable_packages.items()):
for namespace, pkgs in sorted(spack.package_base.detectable_packages.items()):
print("Repository:", namespace)
colify.colify(pkgs, indent=4, output=sys.stdout)


@@ -18,7 +18,7 @@
import spack.fetch_strategy as fs
import spack.repo
import spack.spec
from spack.package import has_test_method, preferred_version
from spack.package_base import has_test_method, preferred_version
description = 'get detailed information on a particular package'
section = 'basic'
@@ -269,14 +269,14 @@ def print_tests(pkg):
names = []
pkg_cls = pkg if inspect.isclass(pkg) else pkg.__class__
if has_test_method(pkg_cls):
pkg_base = spack.package.PackageBase
pkg_base = spack.package_base.PackageBase
test_pkgs = [str(cls.test) for cls in inspect.getmro(pkg_cls) if
issubclass(cls, pkg_base) and cls.test != pkg_base.test]
test_pkgs = list(set(test_pkgs))
names.extend([(test.split()[1]).lower() for test in test_pkgs])
# TODO Refactor START
# Use code from package.py's test_process IF this functionality is
# Use code from package_base.py's test_process IF this functionality is
# accepted.
v_names = list(set([vspec.name for vspec in pkg.virtuals_provided]))


@@ -302,7 +302,7 @@ def install(parser, args, **kwargs):
)
reporter = spack.report.collect_info(
spack.package.PackageInstaller, '_install_task', args.log_format, args)
spack.package_base.PackageInstaller, '_install_task', args.log_format, args)
if args.log_file:
reporter.filename = args.log_file


@@ -12,7 +12,7 @@
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.error
import spack.package
import spack.package_base
import spack.repo
import spack.store
from spack.database import InstallStatuses


@@ -15,8 +15,10 @@
import spack
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.config
import spack.environment
import spack.hash_types as ht
import spack.package
import spack.package_base
import spack.solver.asp as asp
description = "concretize a specs using an ASP solver"
@@ -74,6 +76,51 @@ def setup_parser(subparser):
spack.cmd.common.arguments.add_concretizer_args(subparser)
def _process_result(result, show, required_format, kwargs):
result.raise_if_unsat()
opt, _, _ = min(result.answers)
if ("opt" in show) and (not required_format):
tty.msg("Best of %d considered solutions." % result.nmodels)
tty.msg("Optimization Criteria:")
maxlen = max(len(s[2]) for s in result.criteria)
color.cprint(
"@*{ Priority Criterion %sInstalled ToBuild}" % ((maxlen - 10) * " ")
)
fmt = " @K{%%-8d} %%-%ds%%9s %%7s" % maxlen
for i, (installed_cost, build_cost, name) in enumerate(result.criteria, 1):
color.cprint(
fmt % (
i,
name,
"-" if build_cost is None else installed_cost,
installed_cost if build_cost is None else build_cost,
)
)
print()
# dump the solutions as concretized specs
if 'solutions' in show:
for spec in result.specs:
# With -y, just print YAML to output.
if required_format == 'yaml':
# use write because to_yaml already has a newline.
sys.stdout.write(spec.to_yaml(hash=ht.dag_hash))
elif required_format == 'json':
sys.stdout.write(spec.to_json(hash=ht.dag_hash))
else:
sys.stdout.write(
spec.tree(color=sys.stdout.isatty(), **kwargs))
print()
if result.unsolved_specs and "solutions" in show:
tty.msg("Unsolved specs")
for spec in result.unsolved_specs:
print(spec)
print()
def solve(parser, args):
# these are the same options as `spack spec`
name_fmt = '{namespace}.{name}' if args.namespaces else '{name}'
@@ -102,58 +149,42 @@ def solve(parser, args):
if models < 0:
tty.die("model count must be non-negative: %d")
specs = spack.cmd.parse_specs(args.specs)
# Format required for the output (JSON, YAML or None)
required_format = args.format
# If we have an active environment, pick the specs from there
env = spack.environment.active_environment()
if env and args.specs:
msg = "cannot give explicit specs when an environment is active"
raise RuntimeError(msg)
specs = list(env.user_specs) if env else spack.cmd.parse_specs(args.specs)
# set up solver parameters
# Note: reuse and other concretizer prefs are passed as configuration
solver = asp.Solver()
output = sys.stdout if "asp" in show else None
result = solver.solve(
specs,
out=output,
models=models,
timers=args.timers,
stats=args.stats,
setup_only=(set(show) == {'asp'})
)
if 'solutions' not in show:
return
# die if no solution was found
result.raise_if_unsat()
# show the solutions as concretized specs
if 'solutions' in show:
opt, _, _ = min(result.answers)
if ("opt" in show) and (not args.format):
tty.msg("Best of %d considered solutions." % result.nmodels)
tty.msg("Optimization Criteria:")
maxlen = max(len(s[2]) for s in result.criteria)
color.cprint(
"@*{ Priority Criterion %sInstalled ToBuild}" % ((maxlen - 10) * " ")
)
fmt = " @K{%%-8d} %%-%ds%%9s %%7s" % maxlen
for i, (idx, build_idx, name) in enumerate(result.criteria, 1):
color.cprint(
fmt % (
i,
name,
"-" if build_idx is None else opt[idx],
opt[idx] if build_idx is None else opt[build_idx],
)
)
print()
for spec in result.specs:
# With -y, just print YAML to output.
if args.format == 'yaml':
# use write because to_yaml already has a newline.
sys.stdout.write(spec.to_yaml(hash=ht.dag_hash))
elif args.format == 'json':
sys.stdout.write(spec.to_json(hash=ht.dag_hash))
setup_only = set(show) == {'asp'}
unify = spack.config.get('concretizer:unify')
if unify != 'when_possible':
# set up solver parameters
# Note: reuse and other concretizer prefs are passed as configuration
result = solver.solve(
specs,
out=output,
models=models,
timers=args.timers,
stats=args.stats,
setup_only=setup_only
)
if not setup_only:
_process_result(result, show, required_format, kwargs)
else:
for idx, result in enumerate(solver.solve_in_rounds(
specs, out=output, models=models, timers=args.timers, stats=args.stats
)):
if "solutions" in show:
tty.msg("ROUND {0}".format(idx))
tty.msg("")
else:
sys.stdout.write(
spec.tree(color=sys.stdout.isatty(), **kwargs))
print("% END ROUND {0}\n".format(idx))
if not setup_only:
_process_result(result, show, required_format, kwargs)


@@ -80,7 +80,8 @@ def spec(parser, args):
# Use command line specified specs, otherwise try to use environment specs.
if args.specs:
input_specs = spack.cmd.parse_specs(args.specs)
specs = [(s, s.concretized()) for s in input_specs]
concretized_specs = spack.cmd.parse_specs(args.specs, concretize=True)
specs = list(zip(input_specs, concretized_specs))
else:
env = ev.active_environment()
if env:


@@ -27,12 +27,6 @@ def setup_parser(subparser):
def stage(parser, args):
# We temporarily modify the working directory when setting up a stage, so we need to
# convert this to an absolute path here in order for it to remain valid later.
custom_path = os.path.abspath(args.path) if args.path else None
if custom_path:
spack.stage.create_stage_root(custom_path)
if not args.specs:
env = ev.active_environment()
if env:
@@ -54,6 +48,10 @@ def stage(parser, args):
specs = spack.cmd.parse_specs(args.specs, concretize=False)
# We temporarily modify the working directory when setting up a stage, so we need to
# convert this to an absolute path here in order for it to remain valid later.
custom_path = os.path.abspath(args.path) if args.path else None
# prevent multiple specs from extracting in the same folder
if len(specs) > 1 and custom_path:
tty.die("`--path` requires a single spec, but multiple were provided")


@@ -65,7 +65,7 @@ def is_package(f):
packages, since we allow `from spack import *` and poking globals
into packages.
"""
return f.startswith("var/spack/repos/")
return f.startswith("var/spack/repos/") and f.endswith('package.py')
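The tightened predicate now requires both the repository prefix and the package.py file name, so auxiliary files that live next to a package no longer get the permissive package-file treatment. A quick illustration (paths are hypothetical):

def is_package(f):
    return f.startswith("var/spack/repos/") and f.endswith('package.py')

assert is_package("var/spack/repos/builtin/packages/zlib/package.py")
assert not is_package("var/spack/repos/builtin/packages/zlib/zlib.patch")  # matched before this change
assert not is_package("lib/spack/spack/spec.py")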
#: decorator for adding tools to the list
@@ -94,16 +94,16 @@ def changed_files(base="develop", untracked=True, all_files=False, root=None):
git = which("git", required=True)
# ensure base is in the repo
git("show-ref", "--verify", "--quiet", "refs/heads/%s" % base,
fail_on_error=False)
base_sha = git("rev-parse", "--quiet", "--verify", "--revs-only", base,
fail_on_error=False, output=str)
if git.returncode != 0:
tty.die(
"This repository does not have a '%s' branch." % base,
"This repository does not have a '%s' revision." % base,
"spack style needs this branch to determine which files changed.",
"Ensure that '%s' exists, or specify files to check explicitly." % base
)
range = "{0}...".format(base)
range = "{0}...".format(base_sha.strip())
git_args = [
# Add changed files committed since branching off of develop
@@ -236,7 +236,7 @@ def translate(match):
continue
if not args.root_relative and re_obj:
line = re_obj.sub(translate, line)
print(" " + line)
print(line)
def print_style_header(file_list, args):
@@ -290,20 +290,26 @@ def run_flake8(flake8_cmd, file_list, args):
@tool("mypy")
def run_mypy(mypy_cmd, file_list, args):
# always run with config from running spack prefix
mypy_args = [
common_mypy_args = [
"--config-file", os.path.join(spack.paths.prefix, "pyproject.toml"),
"--package", "spack",
"--package", "llnl",
"--show-error-codes",
]
# not yet, need other updates to enable this
# if any([is_package(f) for f in file_list]):
# mypy_args.extend(["--package", "packages"])
mypy_arg_sets = [common_mypy_args + [
"--package", "spack",
"--package", "llnl",
]]
if 'SPACK_MYPY_CHECK_PACKAGES' in os.environ:
mypy_arg_sets.append(common_mypy_args + [
'--package', 'packages',
'--disable-error-code', 'no-redef',
])
output = mypy_cmd(*mypy_args, fail_on_error=False, output=str)
returncode = mypy_cmd.returncode
returncode = 0
for mypy_args in mypy_arg_sets:
output = mypy_cmd(*mypy_args, fail_on_error=False, output=str)
returncode |= mypy_cmd.returncode
rewrite_and_print_output(output, args)
rewrite_and_print_output(output, args)
print_tool_result("mypy", returncode)
return returncode
@@ -318,16 +324,29 @@ def run_isort(isort_cmd, file_list, args):
pat = re.compile("ERROR: (.*) Imports are incorrectly sorted")
replacement = "ERROR: {0} Imports are incorrectly sorted"
returncode = 0
for chunk in grouper(file_list, 100):
packed_args = isort_args + tuple(chunk)
output = isort_cmd(*packed_args, fail_on_error=False, output=str, error=str)
returncode |= isort_cmd.returncode
returncode = [0]
rewrite_and_print_output(output, args, pat, replacement)
def process_files(file_list, is_args):
for chunk in grouper(file_list, 100):
packed_args = is_args + tuple(chunk)
output = isort_cmd(*packed_args, fail_on_error=False, output=str, error=str)
returncode[0] |= isort_cmd.returncode
print_tool_result("isort", returncode)
return returncode
rewrite_and_print_output(output, args, pat, replacement)
packages_isort_args = ('--rm', 'spack', '--rm', 'spack.pkgkit', '--rm',
'spack.package_defs', '-a', 'from spack.package import *')
packages_isort_args = packages_isort_args + isort_args
# packages
process_files(filter(is_package, file_list),
packages_isort_args)
# non-packages
process_files(filter(lambda f: not is_package(f), file_list),
isort_args)
print_tool_result("isort", returncode[0])
return returncode[0]
@tool("black")


@@ -20,7 +20,7 @@
import spack.cmd.common.arguments as arguments
import spack.environment as ev
import spack.install_test
import spack.package
import spack.package_base
import spack.repo
import spack.report
@@ -189,7 +189,7 @@ def test_run(args):
# Set up reporter
setattr(args, 'package', [s.format() for s in test_suite.specs])
reporter = spack.report.collect_info(
spack.package.PackageBase, 'do_test', args.log_format, args)
spack.package_base.PackageBase, 'do_test', args.log_format, args)
if not reporter.filename:
if args.log_file:
if os.path.isabs(args.log_file):
@@ -217,7 +217,7 @@ def test_list(args):
else set()
def has_test_and_tags(pkg_class):
return spack.package.has_test_method(pkg_class) and \
return spack.package_base.has_test_method(pkg_class) and \
(not args.tag or pkg_class.name in tagged)
if args.list_all:


@@ -24,7 +24,7 @@
# tutorial configuration parameters
tutorial_branch = "releases/v0.17"
tutorial_branch = "releases/v0.18"
tutorial_mirror = "file:///mirror"
tutorial_key = os.path.join(spack.paths.share_path, "keys", "tutorial.pub")


@@ -15,7 +15,7 @@
import spack.cmd.common.arguments as arguments
import spack.environment as ev
import spack.error
import spack.package
import spack.package_base
import spack.repo
import spack.store
from spack.database import InstallStatuses
@@ -221,7 +221,7 @@ def do_uninstall(env, specs, force):
except spack.repo.UnknownEntityError:
# The package.py file has gone away -- but still
# want to uninstall.
spack.package.Package.uninstall_by_spec(item, force=True)
spack.package_base.Package.uninstall_by_spec(item, force=True)
# A package is ready to be uninstalled when nothing else references it,
# unless we are requested to force uninstall it.
@@ -235,7 +235,7 @@ def is_ready(dag_hash):
# If this spec is only used as a build dependency, we can uninstall
return all(
dspec.deptypes == ("build",)
dspec.deptypes == ("build",) or not dspec.parent.installed
for dspec in record.spec.edges_from_dependents()
)
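With this change a dependent edge blocks an uninstall only if it is a non-build edge from a parent that is itself still installed. A self-contained sketch of the predicate, with edges reduced to (deptypes, parent_installed) pairs:

def is_ready(edges):
    return all(deptypes == ("build",) or not parent_installed
               for deptypes, parent_installed in edges)

assert is_ready([(("build",), True)])        # build-only dependents never block
assert is_ready([(("link", "run"), False)])  # new: an uninstalled parent no longer blocks
assert not is_ready([(("link",), True)])     # a live link dependent still blocks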


@@ -422,7 +422,7 @@ def url_list_parsing(args, urls, url, pkg):
urls (set): List of URLs that have already been added
url (str or None): A URL to potentially add to ``urls`` depending on
``args``
pkg (spack.package.PackageBase): The Spack package
pkg (spack.package_base.PackageBase): The Spack package
Returns:
set: The updated set of ``urls``
@@ -470,7 +470,7 @@ def name_parsed_correctly(pkg, name):
"""Determine if the name of a package was correctly parsed.
Args:
pkg (spack.package.PackageBase): The Spack package
pkg (spack.package_base.PackageBase): The Spack package
name (str): The name that was extracted from the URL
Returns:
@@ -487,7 +487,7 @@ def version_parsed_correctly(pkg, version):
"""Determine if the version of a package was correctly parsed.
Args:
pkg (spack.package.PackageBase): The Spack package
pkg (spack.package_base.PackageBase): The Spack package
version (str): The version that was extracted from the URL
Returns:


@@ -766,7 +766,8 @@ def name_matches(name, name_list):
toolchains.add(compiler_cls.__name__)
if len(toolchains) > 1:
if toolchains == set(['Clang', 'AppleClang', 'Aocc']):
if toolchains == set(['Clang', 'AppleClang', 'Aocc']) or \
toolchains == set(['Dpcpp', 'Oneapi']):
return False
tty.debug("[TOOLCHAINS] {0}".format(toolchains))
return True
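Concretely, a detected Dpcpp/Oneapi pair is now treated like the Clang/AppleClang/Aocc family: one toolchain rather than a suspicious mix. A condensed sketch of the rule (compiler class names as they appear above):

COMPATIBLE_FAMILIES = [
    {'Clang', 'AppleClang', 'Aocc'},
    {'Dpcpp', 'Oneapi'},
]

def mixes_toolchains(toolchains):
    if len(toolchains) <= 1:
        return False
    return not any(toolchains == family for family in COMPATIBLE_FAMILIES)

assert not mixes_toolchains({'Dpcpp', 'Oneapi'})  # treated as one family
assert mixes_toolchains({'Gcc', 'Clang'})         # a real mix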


@@ -88,7 +88,7 @@
#: Path to the default configuration
configuration_defaults_path = (
'defaults', os.path.join(spack.paths.etc_path, 'spack', 'defaults')
'defaults', os.path.join(spack.paths.etc_path, 'defaults')
)
#: Hard-coded default values for some key configuration options.
@@ -104,12 +104,13 @@
'build_jobs': min(16, cpus_available()),
'build_stage': '$tempdir/spack-stage',
'concretizer': 'clingo',
'license_dir': spack.paths.default_license_dir,
}
}
#: metavar to use for commands that accept scopes
#: this is shorter and more readable than listing all choices
scopes_metavar = '{defaults,system,site,user}[/PLATFORM]'
scopes_metavar = '{defaults,system,site,user}[/PLATFORM] or env:ENVIRONMENT'
#: Base name for the (internal) overrides scope.
overrides_base_name = 'overrides-'
@@ -815,7 +816,7 @@ def _config():
# Site configuration is per spack instance, for sites or projects
# No site-level configs should be checked into spack by default.
configuration_paths.append(
('site', os.path.join(spack.paths.etc_path, 'spack')),
('site', os.path.join(spack.paths.etc_path)),
)
# User configuration can override both spack defaults and site config


@@ -20,6 +20,7 @@
compiler_name_translation = {
'nvidia': 'nvhpc',
'rocm': 'rocmcc',
}


@@ -356,10 +356,10 @@ def __init__(self, root, db_dir=None, upstream_dbs=None,
self.prefix_fail_path = os.path.join(self._db_dir, 'prefix_failures')
# Create needed directories and files
if not os.path.exists(self._db_dir):
if not is_upstream and not os.path.exists(self._db_dir):
fs.mkdirp(self._db_dir)
if not os.path.exists(self._failure_dir) and not is_upstream:
if not is_upstream and not os.path.exists(self._failure_dir):
fs.mkdirp(self._failure_dir)
self.is_upstream = is_upstream
@@ -1064,9 +1064,7 @@ def _read(self):
self._state_is_inconsistent = False
return
elif self.is_upstream:
raise UpstreamDatabaseLockingError(
"No database index file is present, and upstream"
" databases cannot generate an index file")
tty.warn('upstream not found: {0}'.format(self._index_path))
def _add(
self,


@@ -240,7 +240,7 @@ def compute_windows_program_path_for_package(pkg):
program files location, return list of best guesses
Args:
pkg (spack.package.Package): package for which
pkg (spack.package_base.Package): package for which
Program Files location is to be computed
"""
if not is_windows:


@@ -21,6 +21,7 @@
from llnl.util.lang import dedupe
from llnl.util.symlink import symlink
import spack.binary_distribution
import spack.bootstrap
import spack.compilers
import spack.concretize
@@ -79,8 +80,9 @@
env_subdir_name = '.spack-env'
#: default spack.yaml file to put in new environments
default_manifest_yaml = """\
def default_manifest_yaml():
"""default spack.yaml file to put in new environments"""
return """\
# This is a Spack Environment file.
#
# It describes a set of packages to be installed, along with
@@ -89,7 +91,11 @@
# add package specs to the `specs` list
specs: []
view: true
"""
concretizer:
unify: {}
""".format('true' if spack.config.get('concretizer:unify') else 'false')
#: regex for validating environment names
valid_environment_name_re = r'^\w[\w-]*$'
@@ -623,7 +629,7 @@ def __init__(self, path, init_file=None, with_view=None, keep_relative=False):
# This attribute will be set properly from configuration
# during concretization
self.concretization = None
self.unify = None
self.clear()
if init_file:
@@ -632,11 +638,11 @@ def __init__(self, path, init_file=None, with_view=None, keep_relative=False):
# the init file.
with fs.open_if_filename(init_file) as f:
if hasattr(f, 'name') and f.name.endswith('.lock'):
self._read_manifest(default_manifest_yaml)
self._read_manifest(default_manifest_yaml())
self._read_lockfile(f)
self._set_user_specs_from_lockfile()
else:
self._read_manifest(f, raw_yaml=default_manifest_yaml)
self._read_manifest(f, raw_yaml=default_manifest_yaml())
# Rewrite relative develop paths when initializing a new
# environment in a different location from the spack.yaml file.
@@ -700,7 +706,7 @@ def _read(self):
default_manifest = not os.path.exists(self.manifest_path)
if default_manifest:
# No manifest, use default yaml
self._read_manifest(default_manifest_yaml)
self._read_manifest(default_manifest_yaml())
else:
with open(self.manifest_path) as f:
self._read_manifest(f)
@@ -766,8 +772,15 @@ def _read_manifest(self, f, raw_yaml=None):
self.views = {}
# Retrieve the current concretization strategy
configuration = config_dict(self.yaml)
# default concretization to separately
self.concretization = configuration.get('concretization', 'separately')
# Let `concretization` overrule `concretize:unify` config for now,
# but use a translation table to have internally a representation
# as if we were using the new configuration
translation = {'separately': False, 'together': True}
try:
self.unify = translation[configuration['concretization']]
except KeyError:
self.unify = spack.config.get('concretizer:unify', False)
# Retrieve dev-build packages:
self.dev_specs = configuration.get('develop', {})
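The try/except above lets a legacy per-environment `concretization:` key overrule the newer `concretizer:unify` config while storing the result in the new representation. A standalone sketch of that precedence, where `config_default` stands in for `spack.config.get('concretizer:unify', False)`:

def resolve_unify(configuration, config_default=False):
    translation = {'separately': False, 'together': True}
    try:
        return translation[configuration['concretization']]
    except KeyError:
        return config_default

assert resolve_unify({'concretization': 'together'}) is True
assert resolve_unify({}) is False
assert resolve_unify({}, config_default='when_possible') == 'when_possible'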
@@ -1148,14 +1161,44 @@ def concretize(self, force=False, tests=False):
self.specs_by_hash = {}
# Pick the right concretization strategy
if self.concretization == 'together':
if self.unify == 'when_possible':
return self._concretize_together_where_possible(tests=tests)
if self.unify is True:
return self._concretize_together(tests=tests)
if self.concretization == 'separately':
if self.unify is False:
return self._concretize_separately(tests=tests)
msg = 'concretization strategy not implemented [{0}]'
raise SpackEnvironmentError(msg.format(self.concretization))
raise SpackEnvironmentError(msg.format(self.unify))
def _concretize_together_where_possible(self, tests=False):
# Avoid cyclic dependency
import spack.solver.asp
# Exit early if the set of concretized specs is the set of user specs
user_specs_did_not_change = not bool(
set(self.user_specs) - set(self.concretized_user_specs)
)
if user_specs_did_not_change:
return []
# Proceed with concretization
self.concretized_user_specs = []
self.concretized_order = []
self.specs_by_hash = {}
result_by_user_spec = {}
solver = spack.solver.asp.Solver()
for result in solver.solve_in_rounds(self.user_specs, tests=tests):
result_by_user_spec.update(result.specs_by_input)
result = []
for abstract, concrete in sorted(result_by_user_spec.items()):
self._add_concrete_spec(abstract, concrete)
result.append((abstract, concrete))
return result
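`solve_in_rounds` yields one result per round; the loop above merges them into `result_by_user_spec`, so a user spec solved again in a later round replaces its earlier entry. A toy model of that accumulation:

def merge_rounds(rounds):
    merged = {}
    for specs_by_input in rounds:
        merged.update(specs_by_input)  # later rounds win
    return sorted(merged.items())

assert merge_rounds([{'a': 1}, {'a': 2, 'b': 3}]) == [('a', 2), ('b', 3)]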
def _concretize_together(self, tests=False):
"""Concretization strategy that concretizes all the specs
@@ -1240,6 +1283,10 @@ def _concretize_separately(self, tests=False):
# processes try to write the config file in parallel
_ = spack.compilers.get_compiler_config()
# Ensure that buildcache index is updated if reuse is on
if spack.config.get('config:reuse', False):
spack.binary_distribution.binary_index.update()
# Early return if there is nothing to do
if len(arguments) == 0:
return []
@@ -1308,7 +1355,7 @@ def concretize_and_add(self, user_spec, concrete_spec=None, tests=False):
concrete_spec: if provided, then it is assumed that it is the
result of concretizing the provided ``user_spec``
"""
if self.concretization == 'together':
if self.unify is True:
msg = 'cannot install a single spec in an environment that is ' \
'configured to be concretized together. Run instead:\n\n' \
' $ spack add <spec>\n' \
@@ -1611,7 +1658,14 @@ def all_specs(self):
"""Return all specs, even those a user spec would shadow."""
all_specs = set()
for h in self.concretized_order:
all_specs.update(self.specs_by_hash[h].traverse())
try:
spec = self.specs_by_hash[h]
except KeyError:
tty.warn(
'Environment %s appears to be corrupt: missing spec '
'"%s"' % (self.name, h))
continue
all_specs.update(spec.traverse())
return sorted(all_specs)
@@ -1869,17 +1923,15 @@ def write(self, regenerate=True):
regenerate (bool): regenerate views and run post-write hooks as
well as writing if True.
"""
# Intercept environments not using the latest schema format and prevent
# them from being modified
manifest_exists = os.path.exists(self.manifest_path)
if manifest_exists and not is_latest_format(self.manifest_path):
msg = ('The environment "{0}" needs to be written to disk, but '
'is currently using a deprecated format. Please update it '
'using:\n\n'
'\tspack env update {0}\n\n'
'Note that previous versions of Spack will not be able to '
# Warn that environments are not in the latest format.
if not is_latest_format(self.manifest_path):
ver = '.'.join(str(s) for s in spack.spack_version_info[:2])
msg = ('The environment "{}" is written to disk in a deprecated format. '
'Please update it using:\n\n'
'\tspack env update {}\n\n'
'Note that versions of Spack older than {} may not be able to '
'use the updated configuration.')
raise RuntimeError(msg.format(self.name))
tty.warn(msg.format(self.name, self.name, ver))
# ensure path in var/spack/environments
fs.mkdirp(self.path)
@@ -2231,14 +2283,16 @@ def _top_level_key(data):
def is_latest_format(manifest):
"""Return True if the manifest file is at the latest schema format,
False otherwise.
"""Return False if the manifest file exists and is not in the latest schema format.
Args:
manifest (str): manifest file to be analyzed
"""
with open(manifest) as f:
data = syaml.load(f)
try:
with open(manifest) as f:
data = syaml.load(f)
except (OSError, IOError):
return True
top_level_key = _top_level_key(data)
changed = spack.schema.env.update(data[top_level_key])
return not changed


@@ -10,9 +10,9 @@
import llnl.util.tty as tty
#: whether we should write stack traces or short error messages
#: at what level we should write stack traces or short error messages
#: this is module-scoped because it needs to be set very early
debug = False
debug = 0
class SpackError(Exception):
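With `debug` now an integer level instead of a boolean (it is assigned the counted `-d` flags in the `main.py` hunk further down), callers can gate increasingly verbose output on a threshold. A hedged usage sketch:

import spack.error

if spack.error.debug >= 2:  # reached with `spack -dd ...`
    print('extra diagnostic detail')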


@@ -35,6 +35,7 @@
import six.moves.urllib.parse as urllib_parse
import llnl.util
import llnl.util.filesystem as fs
import llnl.util.tty as tty
from llnl.util.filesystem import (
get_single_file,
@@ -119,6 +120,11 @@ def __init__(self, **kwargs):
# 'no_cache' option from version directive.
self.cache_enabled = not kwargs.pop('no_cache', False)
self.package = None
def set_package(self, package):
self.package = package
# Subclasses need to implement these methods
def fetch(self):
"""Fetch source code archive or repo.
@@ -242,6 +248,10 @@ def source_id(self):
if all(component_ids):
return component_ids
def set_package(self, package):
for item in self:
item.package = package
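The first `set_package` above is the base-class hook; the second, on the composite fetcher, forwards the package to every member strategy. A minimal standalone sketch of the pattern (class names illustrative):

class Strategy(object):
    def __init__(self):
        self.package = None

class CompositeStrategy(list):
    def set_package(self, package):
        for item in self:
            item.package = package

fetchers = CompositeStrategy([Strategy(), Strategy()])
fetchers.set_package('my-package')
assert all(s.package == 'my-package' for s in fetchers)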
@fetcher
class URLFetchStrategy(FetchStrategy):
@@ -520,7 +530,7 @@ def expand(self):
"Failed on expand() for URL %s" % self.url)
if not self.extension:
self.extension = extension(self.archive_file)
self.extension = extension(self.url)
if self.stage.expanded:
tty.debug('Source already staged to %s' % self.stage.source_path)
@@ -528,50 +538,11 @@ def expand(self):
decompress = decompressor_for(self.archive_file, self.extension)
# Expand all tarballs in their own directory to contain
# exploding tarballs.
tarball_container = os.path.join(self.stage.path,
"spack-expanded-archive")
# Below we assume that the command to decompress expands the
# archive in the current working directory
mkdirp(tarball_container)
with working_dir(tarball_container):
with fs.exploding_archive_catch(self.stage):
decompress(self.archive_file)
# Check for an exploding tarball, i.e. one that doesn't expand to
# a single directory. If the tarball *didn't* explode, move its
# contents to the staging source directory & remove the container
# directory. If the tarball did explode, just rename the tarball
# directory to the staging source directory.
#
# NOTE: The tar program on Mac OS X will encode HFS metadata in
# hidden files, which can end up *alongside* a single top-level
# directory. We initially ignore presence of hidden files to
# accommodate these "semi-exploding" tarballs but ensure the files
# are copied to the source directory.
files = os.listdir(tarball_container)
non_hidden = [f for f in files if not f.startswith('.')]
if len(non_hidden) == 1:
src = os.path.join(tarball_container, non_hidden[0])
if os.path.isdir(src):
self.stage.srcdir = non_hidden[0]
shutil.move(src, self.stage.source_path)
if len(files) > 1:
files.remove(non_hidden[0])
for f in files:
src = os.path.join(tarball_container, f)
dest = os.path.join(self.stage.path, f)
shutil.move(src, dest)
os.rmdir(tarball_container)
else:
# This is a non-directory entry (e.g., a patch file) so simply
# rename the tarball container to be the source path.
shutil.move(tarball_container, self.stage.source_path)
else:
shutil.move(tarball_container, self.stage.source_path)
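The deleted block above moves into the `fs.exploding_archive_catch` context manager in `llnl.util.filesystem`, which now owns the scratch directory and the decision about whether the tarball "exploded". A hedged sketch of the general shape of such a context manager (not the real implementation):

import os
from contextlib import contextmanager

@contextmanager
def exploding_archive_catch_sketch(stage_path, place_files):
    # Expand into a scratch directory so stray top-level files are contained.
    scratch = os.path.join(stage_path, 'spack-expanded-archive')
    os.makedirs(scratch, exist_ok=True)
    cwd = os.getcwd()
    os.chdir(scratch)
    try:
        yield
    finally:
        os.chdir(cwd)
        # Afterwards, inspect the scratch directory and move its contents
        # into place; place_files stands in for the exploded/non-exploded
        # handling that used to be inlined here.
        place_files(scratch)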
def archive(self, destination):
"""Just moves this archive to the destination."""
if not self.archive_file:
@@ -1014,9 +985,20 @@ def clone(self, dest=None, commit=None, branch=None, tag=None, bare=False):
git(*args)
# Init submodules if the user asked for them.
if self.submodules:
with working_dir(dest):
args = ['submodule', 'update', '--init', '--recursive']
git_commands = []
submodules = self.submodules
if callable(submodules):
submodules = list(submodules(self.package))
git_commands.append(["submodule", "init", "--"] + submodules)
git_commands.append(['submodule', 'update', '--recursive'])
elif submodules:
git_commands.append(["submodule", "update", "--init", "--recursive"])
if not git_commands:
return
with working_dir(dest):
for args in git_commands:
if not spack.config.get('config:debug'):
args.insert(1, '--quiet')
git(*args)
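The callable form of `submodules` accepted above lets a package compute which submodule paths to fetch from its own spec; the fetcher then runs `git submodule init -- <paths>` followed by `git submodule update --recursive`. A hedged example of such a callable in a package recipe (paths and variant are illustrative):

def submodules(package):
    # Fetch the GPU submodule only when the variant is enabled.
    paths = ['ext/core']
    if package.spec.satisfies('+cuda'):
        paths.append('ext/gpu')
    return paths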
@@ -1556,7 +1538,7 @@ def _extrapolate(pkg, version):
try:
return URLFetchStrategy(pkg.url_for_version(version),
fetch_options=pkg.fetch_options)
except spack.package.NoURLError:
except spack.package_base.NoURLError:
msg = ("Can't extrapolate a URL for version %s "
"because package %s defines no URLs")
raise ExtrapolationError(msg % (version, pkg.name))


@@ -493,9 +493,11 @@ def write(self, spec, color=None, out=None):
# Replace node with its dependencies
self._frontier.pop(i)
deps = node.dependencies(deptype=self.deptype)
if deps:
deps = sorted((d.dag_hash() for d in deps), reverse=True)
edges = sorted(
node.edges_to_dependencies(deptype=self.deptype), reverse=True
)
if edges:
deps = [e.spec.dag_hash() for e in edges]
self._connect_deps(i, deps, "new-deps") # anywhere.
elif self._frontier:


@@ -50,7 +50,7 @@
import spack.error
import spack.hooks
import spack.monitor
import spack.package
import spack.package_base
import spack.package_prefs as prefs
import spack.repo
import spack.store
@@ -103,7 +103,7 @@ def _check_last_phase(pkg):
package already.
Args:
pkg (spack.package.PackageBase): the package being installed
pkg (spack.package_base.PackageBase): the package being installed
Raises:
``BadInstallPhase`` if stop_before or last phase is invalid
@@ -125,7 +125,7 @@ def _handle_external_and_upstream(pkg, explicit):
database if it is external package.
Args:
pkg (spack.package.Package): the package whose installation is under
pkg (spack.package_base.Package): the package whose installation is under
consideration
explicit (bool): the package was explicitly requested by the user
Return:
@@ -265,7 +265,7 @@ def _install_from_cache(pkg, cache_only, explicit, unsigned=False):
Extract the package from binary cache
Args:
pkg (spack.package.PackageBase): the package to install from the binary cache
pkg (spack.package_base.PackageBase): package to install from the binary cache
cache_only (bool): only extract from binary cache
explicit (bool): ``True`` if installing the package was explicitly
requested by the user, otherwise, ``False``
@@ -350,27 +350,27 @@ def _process_external_package(pkg, explicit):
def _process_binary_cache_tarball(pkg, binary_spec, explicit, unsigned,
preferred_mirrors=None):
mirrors_for_spec=None):
"""
Process the binary cache tarball.
Args:
pkg (spack.package.PackageBase): the package being installed
pkg (spack.package_base.PackageBase): the package being installed
binary_spec (spack.spec.Spec): the spec whose cache has been confirmed
explicit (bool): the package was explicitly requested by the user
unsigned (bool): ``True`` if binary package signatures are to be checked,
otherwise, ``False``
preferred_mirrors (list): Optional list of urls to prefer when
attempting to download the tarball
mirrors_for_spec (list): Optional list of concrete specs and mirrors
obtained by calling binary_distribution.get_mirrors_for_spec().
Return:
bool: ``True`` if the package was extracted from binary cache,
else ``False``
"""
tarball = binary_distribution.download_tarball(
binary_spec, preferred_mirrors=preferred_mirrors)
download_result = binary_distribution.download_tarball(
binary_spec, unsigned, mirrors_for_spec=mirrors_for_spec)
# see #10063 : install from source if tarball doesn't exist
if tarball is None:
if download_result is None:
tty.msg('{0} exists in binary cache but with different hash'
.format(pkg.name))
return False
@@ -380,9 +380,9 @@ def _process_binary_cache_tarball(pkg, binary_spec, explicit, unsigned,
# don't print long padded paths while extracting/relocating binaries
with spack.util.path.filter_padding():
binary_distribution.extract_tarball(
binary_spec, tarball, allow_root=False, unsigned=unsigned, force=False
)
binary_distribution.extract_tarball(binary_spec, download_result,
allow_root=False, unsigned=unsigned,
force=False)
pkg.installed_from_binary_cache = True
spack.store.db.add(pkg.spec, spack.store.layout, explicit=explicit)
@@ -394,7 +394,7 @@ def _try_install_from_binary_cache(pkg, explicit, unsigned=False):
Try to extract the package from binary cache.
Args:
pkg (spack.package.PackageBase): the package to be extracted from binary cache
pkg (spack.package_base.PackageBase): package to be extracted from binary cache
explicit (bool): the package was explicitly requested by the user
unsigned (bool): ``True`` if binary package signatures are to be checked,
otherwise, ``False``
@@ -406,12 +406,8 @@ def _try_install_from_binary_cache(pkg, explicit, unsigned=False):
if not matches:
return False
# In the absence of guidance from user or some other reason to prefer one
# mirror over another, any match will suffice, so just pick the first one.
preferred_mirrors = [match['mirror_url'] for match in matches]
binary_spec = matches[0]['spec']
return _process_binary_cache_tarball(pkg, binary_spec, explicit, unsigned,
preferred_mirrors=preferred_mirrors)
return _process_binary_cache_tarball(pkg, pkg.spec, explicit, unsigned,
mirrors_for_spec=matches)
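Each entry of `matches` (from `binary_distribution.get_mirrors_for_spec()`) pairs a mirror URL with the concrete spec found there, which is why the whole list can now be forwarded instead of a bare URL list. An illustrative shape (URL hypothetical):

concrete_spec = object()  # stands in for a concrete spack.spec.Spec
matches = [
    {'mirror_url': 'https://cache.example.com/build_cache',  # hypothetical
     'spec': concrete_spec},
]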
def clear_failures():
@@ -534,7 +530,7 @@ def log(pkg):
Copy provenance into the install directory on success
Args:
pkg (spack.package.Package): the package that was built and installed
pkg (spack.package_base.Package): the package that was built and installed
"""
packages_dir = spack.store.layout.build_packages_path(pkg.spec)
@@ -620,7 +616,7 @@ def package_id(pkg):
and packages for combinatorial environments.
Args:
pkg (spack.package.PackageBase): the package from which the identifier is
pkg (spack.package_base.PackageBase): the package from which the identifier is
derived
"""
if not pkg.spec.concrete:
@@ -773,7 +769,7 @@ def _add_bootstrap_compilers(
Args:
compiler: the compiler to bootstrap
architecture: the architecture for which to bootstrap the compiler
pkgs (spack.package.PackageBase): the package with possible compiler
pkgs (spack.package_base.PackageBase): the package with possible compiler
dependencies
request (BuildRequest): the associated install request
all_deps (defaultdict(set)): dictionary of all dependencies and
@@ -790,7 +786,7 @@ def _add_init_task(self, pkg, request, is_compiler, all_deps):
Creates and queues the initial build task for the package.
Args:
pkg (spack.package.Package): the package to be built and installed
pkg (spack.package_base.Package): the package to be built and installed
request (BuildRequest or None): the associated install request
where ``None`` can be used to indicate the package was
explicitly requested by the user
@@ -972,7 +968,7 @@ def _cleanup_task(self, pkg):
Cleanup the build task for the spec
Args:
pkg (spack.package.PackageBase): the package being installed
pkg (spack.package_base.PackageBase): the package being installed
"""
self._remove_task(package_id(pkg))
@@ -986,7 +982,7 @@ def _ensure_install_ready(self, pkg):
already locked.
Args:
pkg (spack.package.PackageBase): the package being locally installed
pkg (spack.package_base.PackageBase): the package being locally installed
"""
pkg_id = package_id(pkg)
pre = "{0} cannot be installed locally:".format(pkg_id)
@@ -1018,7 +1014,8 @@ def _ensure_locked(self, lock_type, pkg):
Args:
lock_type (str): 'read' for a read lock, 'write' for a write lock
pkg (spack.package.PackageBase): the package whose spec is being installed
pkg (spack.package_base.PackageBase): the package whose spec is being
installed
Return:
(lock_type, lock) tuple where lock will be None if it could not
@@ -1232,7 +1229,7 @@ def _install_task(self, task):
# Create a child process to do the actual installation.
# Preserve verbosity settings across installs.
spack.package.PackageBase._verbose = (
spack.package_base.PackageBase._verbose = (
spack.build_environment.start_build_process(
pkg, build_process, install_args)
)
@@ -1377,7 +1374,7 @@ def _setup_install_dir(self, pkg):
Write a small metadata file with the current spack environment.
Args:
pkg (spack.package.Package): the package to be built and installed
pkg (spack.package_base.Package): the package to be built and installed
"""
if not os.path.exists(pkg.spec.prefix):
tty.debug('Creating the installation directory {0}'.format(pkg.spec.prefix))
@@ -1451,7 +1448,7 @@ def _flag_installed(self, pkg, dependent_ids=None):
known dependents.
Args:
pkg (spack.package.Package): Package that has been installed locally,
pkg (spack.package_base.Package): Package that has been installed locally,
externally or upstream
dependent_ids (list or None): list of the package's
dependent ids, or None if the dependent ids are limited to
@@ -1540,7 +1537,7 @@ def install(self):
Install the requested package(s) and or associated dependencies.
Args:
pkg (spack.package.Package): the package to be built and installed"""
pkg (spack.package_base.Package): the package to be built and installed"""
self._init_queue()
fail_fast_err = 'Terminating after first install failure'
@@ -1792,7 +1789,7 @@ def __init__(self, pkg, install_args):
process in the build.
Arguments:
pkg (spack.package.PackageBase) the package being installed.
pkg (spack.package_base.PackageBase) the package being installed.
install_args (dict) arguments to do_install() from parent process.
"""
@@ -1852,8 +1849,8 @@ def run(self):
# get verbosity from do_install() parameter or saved value
self.echo = self.verbose
if spack.package.PackageBase._verbose is not None:
self.echo = spack.package.PackageBase._verbose
if spack.package_base.PackageBase._verbose is not None:
self.echo = spack.package_base.PackageBase._verbose
self.pkg.stage.keep = self.keep_stage
@@ -2005,7 +2002,7 @@ def build_process(pkg, install_args):
This function's return value is returned to the parent process.
Arguments:
pkg (spack.package.PackageBase): the package being installed.
pkg (spack.package_base.PackageBase): the package being installed.
install_args (dict): arguments to do_install() from parent process.
"""
@@ -2053,7 +2050,7 @@ def __init__(self, pkg, request, compiler, start, attempts, status,
Instantiate a build task for a package.
Args:
pkg (spack.package.Package): the package to be built and installed
pkg (spack.package_base.Package): the package to be built and installed
request (BuildRequest or None): the associated install request
where ``None`` can be used to indicate the package was
explicitly requested by the user
@@ -2066,7 +2063,7 @@ def __init__(self, pkg, request, compiler, start, attempts, status,
"""
# Ensure dealing with a package that has a concrete spec
if not isinstance(pkg, spack.package.PackageBase):
if not isinstance(pkg, spack.package_base.PackageBase):
raise ValueError("{0} must be a package".format(str(pkg)))
self.pkg = pkg
@@ -2244,11 +2241,11 @@ def __init__(self, pkg, install_args):
Instantiate a build request for a package.
Args:
pkg (spack.package.Package): the package to be built and installed
pkg (spack.package_base.Package): the package to be built and installed
install_args (dict): the install arguments associated with ``pkg``
"""
# Ensure dealing with a package that has a concrete spec
if not isinstance(pkg, spack.package.PackageBase):
if not isinstance(pkg, spack.package_base.PackageBase):
raise ValueError("{0} must be a package".format(str(pkg)))
self.pkg = pkg
@@ -2318,7 +2315,7 @@ def get_deptypes(self, pkg):
"""Determine the required dependency types for the associated package.
Args:
pkg (spack.package.PackageBase): explicit or implicit package being
pkg (spack.package_base.PackageBase): explicit or implicit package being
installed
Returns:
@@ -2341,7 +2338,7 @@ def run_tests(self, pkg):
"""Determine if the tests should be run for the provided packages
Args:
pkg (spack.package.PackageBase): explicit or implicit package being
pkg (spack.package_base.PackageBase): explicit or implicit package being
installed
Returns:


@@ -375,13 +375,6 @@ def make_argument_parser(**kwargs):
# stat names in groups of 7, for nice wrapping.
stat_lines = list(zip(*(iter(stat_names),) * 7))
# help message for --show-cores
show_cores_help = 'provide additional information on concretization failures\n'
show_cores_help += 'off (default): show only the violated rule\n'
show_cores_help += 'full: show raw unsat cores from clingo\n'
show_cores_help += 'minimized: show subset-minimal unsat cores '
show_cores_help += '(Warning: this may take hours for some specs)'
parser.add_argument(
'-h', '--help',
dest='help', action='store_const', const='short', default=None,
@@ -405,9 +398,6 @@ def make_argument_parser(**kwargs):
'-d', '--debug', action='count', default=0,
help="write out debug messages "
"(more d's for more verbosity: -d, -dd, -ddd, etc.)")
parser.add_argument(
'--show-cores', choices=["off", "full", "minimized"], default="off",
help=show_cores_help)
parser.add_argument(
'--timestamp', action='store_true',
help="Add a timestamp to tty output")
@@ -490,18 +480,11 @@ def setup_main_options(args):
# errors raised by spack.config.
if args.debug:
spack.error.debug = True
spack.error.debug = args.debug
spack.util.debug.register_interrupt_handler()
spack.config.set('config:debug', True, scope='command_line')
spack.util.environment.tracing_enabled = True
if args.show_cores != "off":
# minimize_cores defaults to true, turn it off if we're showing full core
# but don't want to wait to minimize it.
spack.solver.asp.full_cores = True
if args.show_cores == 'full':
spack.solver.asp.minimize_cores = False
if args.timestamp:
tty.set_timestamp(True)
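With `--show-cores` removed, debug verbosity comes solely from the counted `-d` flag that `setup_main_options` forwards to `spack.error.debug`. A minimal, runnable illustration of the `action='count'` behavior:

import argparse

parser = argparse.ArgumentParser()
parser.add_argument('-d', '--debug', action='count', default=0)
assert parser.parse_args([]).debug == 0
assert parser.parse_args(['-ddd']).debug == 3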


@@ -196,6 +196,14 @@ def provides(self):
if self.spec.name == 'llvm-amdgpu':
provides['compiler'] = spack.spec.CompilerSpec(str(self.spec))
provides['compiler'].name = 'rocmcc'
# Special case for oneapi
if self.spec.name == 'intel-oneapi-compilers':
provides['compiler'] = spack.spec.CompilerSpec(str(self.spec))
provides['compiler'].name = 'oneapi'
# Special case for oneapi classic
if self.spec.name == 'intel-oneapi-compilers-classic':
provides['compiler'] = spack.spec.CompilerSpec(str(self.spec))
provides['compiler'].name = 'intel'
# All the other tokens in the hierarchy must be virtual dependencies
for x in self.hierarchy_tokens:
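The three special cases above share a single pattern: a package name implies the compiler name it provides. A hedged refactoring sketch of the same data as a table (illustrative, not the actual Spack code):

_COMPILER_PROVIDERS = {
    'llvm-amdgpu': 'rocmcc',
    'intel-oneapi-compilers': 'oneapi',
    'intel-oneapi-compilers-classic': 'intel',
}

def provided_compiler_name(package_name):
    return _COMPILER_PROVIDERS.get(package_name)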


@@ -4,6 +4,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import platform as py_platform
import re
from subprocess import check_output
from spack.version import Version
@@ -51,6 +52,17 @@ def __init__(self):
if 'ubuntu' in distname:
version = '.'.join(version[0:2])
# openSUSE Tumbleweed is a rolling release which can change
# more than once in a week, so set version to tumbleweed$GLIBVERS
elif 'opensuse-tumbleweed' in distname or 'opensusetumbleweed' in distname:
distname = 'opensuse'
output = check_output(["ldd", "--version"]).decode()
libcvers = re.findall(r'ldd \(GNU libc\) (.*)', output)
if len(libcvers) == 1:
version = 'tumbleweed' + libcvers[0]
else:
version = 'tumbleweed' + version[0]
else:
version = version[0]
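On Tumbleweed, `ldd --version` output typically begins with a line like `ldd (GNU libc) 2.35`, so the regex above captures `2.35` and the OS version becomes e.g. `tumbleweed2.35`. A runnable check of that parsing against a sample line (output assumed):

import re

sample = 'ldd (GNU libc) 2.35\nCopyright (C) ...'
libcvers = re.findall(r'ldd \(GNU libc\) (.*)', sample)
assert libcvers == ['2.35']
assert 'tumbleweed' + libcvers[0] == 'tumbleweed2.35'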

