Compare commits


441 Commits

Author SHA1 Message Date
Gregory Becker
702774edea add test, minor refactor to make testing easier 2021-07-08 17:49:11 -07:00
Gregory Becker
7713dd4063 keep 5 views around by default 2021-07-08 17:48:29 -07:00
Todd Gamblin
24c01d57cf imports: sort imports everywhere in Spack (#24695)
* fix remaining flake8 errors

* imports: sort imports everywhere in Spack

We enabled import order checking in #23947, but fixing things manually drives
people crazy. This used `spack style --fix --all` from #24071 to automatically
sort everything in Spack so PR submitters won't have to deal with it.

This should go in after #24071, as it assumes we're using `isort`, not
`flake8-import-order` to order things. `isort` seems to be more flexible and
allows `llnl` imports to be in their own group before `spack` ones, so this
seems like a good switch.
2021-07-08 22:12:30 +00:00
Paul Henning
620836a809 hdf5: Fix compiler identification for warning flags (#24627)
* Fix compiler test

Use `self.spec.satisfies` on compiler to determine if a flag should be
applied or not.  This approach avoids issues with the strings `gcc`
or `clang` appearing in the full path to the compiler executables, as
happens with spack-installed compilers (e.g. `nvhpc%gcc`).

* Limit compiler name search to last path component

@skosukhin pointed out that the cflag modification should happen for any
clang or gcc compiler, regardless of what compiler spec provides them.
This commit reverts to searching for a compiler name containing "gcc"
or "clang", but limits the search to the last path component, which
avoids matching spack-installed compilers built with gcc (e.g.
`nvhpc%gcc`), which will have "gcc" in the compiler path.

* Use `os.path` rather than `pathlib`

Co-authored-by: Paul Henning <phenning@lanl.gov>
2021-07-08 15:17:44 -04:00
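The last-path-component check described in this commit can be sketched roughly as follows. This is a hypothetical helper (function name and paths are illustrative, not the actual hdf5 `package.py` code):

```python
import os


def compiler_looks_like_gcc_or_clang(compiler_path):
    # Inspect only the last path component, so spack-installed compilers
    # whose *prefix* happens to contain "gcc" (e.g. nvhpc built %gcc)
    # do not match by accident.
    name = os.path.basename(compiler_path)
    return "gcc" in name or "clang" in name
```

A path like `/spack/opt/nvhpc-21.2-gcc-9.3.0/bin/nvc` would no longer match, while `/usr/bin/gcc-11` still would.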
Adam J. Stewart
0c5402ea5c setup-env: allow users to skip slow parts (#24545) 2021-07-08 17:07:26 +02:00
Todd Gamblin
a22686279c cvs tests: don't use dateutil at all
`dateutil.parser` was an optional dependency for CVS tests. It was failing on macOS
because the dateutil types were not being installed, and mypy was failing *even when the
CVS tests were skipped*. This seems like it was an oversight on macOS --
`types-dateutil-parser` was not installed there, though it was on Linux unit tests.

It takes 6 lines of YAML and some weird test-skipping logic to get `python-dateutil` and
`types-python-dateutil` installed in all the tests where we need them, but it only takes
4 lines of code to write the date parser we need for CVS, so I just did that instead.

Note that CVS date format can vary from system to system, but it seems like it's always
pretty similar for the parts we care about.

- [x] Replace dateutil.parser with a simpler date regex
- [x] Lose the dependency on `dateutil.parser`
2021-07-07 17:27:31 -07:00
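A few-line date parser of the kind this commit describes might look like the sketch below (an illustration of the approach, not Spack's exact regex):

```python
import re
from datetime import datetime

# Matches the stable leading part of CVS dates, e.g. "2021/06/30 12:00:01"
# or "2021-06-30 12:00:01"; trailing parts vary by system and are ignored.
_CVS_DATE = re.compile(
    r"(\d{4})[-/](\d{2})[-/](\d{2})\s+(\d{2}):(\d{2}):(\d{2})"
)


def parse_cvs_date(line):
    m = _CVS_DATE.search(line)
    return datetime(*(int(g) for g in m.groups())) if m else None
```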
Todd Gamblin
0dd04ffbfb style: get close to full coverage of spack style
Previous tests of `spack style` didn't really run the tools --
they just ensure that the commands worked enough to get coverage.

This adds several real tests and ensures that we hit the corner
cases in `spack style`.  This also tests success as well as failure
cases.
2021-07-07 17:27:31 -07:00
Todd Gamblin
24a4d81097 style: clean up and restructure spack style command
This consolidates code across tools in `spack style` so that each
`run_<tool>` function can be called indirectly through a dictionary
of handlers, so that checks like finding the executable for the
tool can be shared across commands.

- [x] rework `spack style` to use decorators to register tools
- [x] define tool order in one place in `spack style`
- [x] fix python 2/3 issues to get `isort` checks working
- [x] make isort error regex more robust across versions
- [x] remove unused output option
- [x] change vestigial `TRAVIS_BRANCH` to `GITHUB_BASE_REF`
- [x] update completion
2021-07-07 17:27:31 -07:00
Todd Gamblin
b5d2c30d26 style: Move isort configuration to pyproject.toml
- [x] Remove flake8-import-order checks, as we only need isort for this
- [x] Clean up configuration and requirements
2021-07-07 17:27:31 -07:00
Danny McClanahan
7a9fe189e1 style: add support for isort and --fix 2021-07-07 17:27:31 -07:00
Paul Kuberry
f5c1ae32d1 xyce: Prefer master branch (#24733)
Adds 7.3.0 release of Xyce and makes a tag
'github.master' pointing at the master branch
of the Xyce repository on github.com.
2021-07-07 19:56:57 -04:00
Manuela Kuhn
e0b901153b New package: git-annex (#24721) 2021-07-07 11:54:40 -07:00
Glenn Johnson
4fd8640586 Configure docbook packages (#24300)
This PR configures the spack docbook packages
- docbook-xsl
- docbook-xml

The public entities are now mapped to the locally installed files of the
respective packages. The example catalogs are left in place and
XML_CATALOG_FILES points to the newly created catalogs.
2021-07-07 20:41:24 +02:00
Olivier Cessenat
970bf4318c zziplib: add v0.13.72 (#24462) 2021-07-07 20:33:54 +02:00
Gregory Lee
218ae0c5d1 m4 package: only apply nvhpc.patch for version 1.4.18 (#24730)
The patch as-is does not apply to 1.4.19
2021-07-07 11:28:42 -07:00
Adam J. Stewart
ad7984c5c0 magma package: fix bugs in cuda_arch variant (#24735) 2021-07-07 11:18:52 -07:00
Adam J. Stewart
b1f4f91f41 py-mypy: add version 0.910 (#24738) 2021-07-07 11:13:31 -07:00
Massimiliano Culpo
2914f9076e perl: bzip2 and zlib may be installed under <prefix>/lib64 (#24752) 2021-07-07 11:22:51 -06:00
Adam J. Stewart
663c37cac4 py-torch: +magma requires +cuda (#24736) 2021-07-07 19:12:52 +02:00
eugeneswalker
36ba640cbd add e4s-on-power stack (#24734) 2021-07-07 10:06:30 -06:00
Manuela Kuhn
5e33b20230 py-seaborn: add 0.11.1 (#24748) 2021-07-07 10:49:59 -05:00
Jordan Ogas
d3c04ed345 charliecloud: add v0.24 (#24728) 2021-07-07 16:42:41 +02:00
Manuela Kuhn
20a191ad93 py-neurokit2: add new package (#24749) 2021-07-07 16:37:34 +02:00
Harmen Stoppels
d06537f75c libtree: add v1.2.2 (#24747) 2021-07-07 16:36:16 +02:00
Olivier Cessenat
09d89ef265 netpbm: new package (#24063) 2021-07-07 15:36:11 +02:00
Olivier Cessenat
0e3f7ce0ed New Package: visit-ffp (#22903) 2021-07-07 15:07:00 +02:00
Adam J. Stewart
e914e561ec sleef: disable optional dependencies (#24742) 2021-07-07 11:34:07 +02:00
Mark W. Krentel
3c9a58bd0b perl: add dependencies for bzip2 and zlib (#24743)
Perl keeps copies of the bzip2 and zlib source code in its own source
tree and by default uses them in favor of outside libraries.  Instead,
put these dependencies under control of spack and tell perl to use the
spack-built versions.
2021-07-07 11:25:57 +02:00
Seth R. Johnson
59eea2859a trilinos: enable +teko gotype=long (#24722) 2021-07-06 17:01:41 -04:00
G-Ragghianti
c12dc1a5de Magma: add ROCm support and v2.6.0 (#24663) 2021-07-06 13:04:37 -06:00
Vasily Danilin
ea2d4b05bc oneAPI packages: add 2021.3 release (#24617) 2021-07-06 10:11:24 -07:00
Massimiliano Culpo
f9ecc4966d qt: rework to use the when context manager (#24723) 2021-07-06 08:07:36 -06:00
Harmen Stoppels
545f971bec fix buffered download (#24623)
* Use shutil to do a buffered copy from http response to file

* Fix flake8...

* Somehow flake8 still complains about unrelated files
2021-07-06 06:12:35 -06:00
Manuela Kuhn
9d36f7f518 qt+webkit: fix missing dependencies and gcc11 compatibility (#24366) 2021-07-06 13:18:02 +02:00
Jianwen
e65ab166b9 json-fortran: add version v6.0.11 (#24720) 2021-07-06 07:03:09 -04:00
Mark Olesen
f8743d0cbf openfoam: add v2106 (#24579)
Co-authored-by: Mark Olesen <Mark.Olesen@esi-group.com>
2021-07-06 08:54:14 +02:00
Adam J. Stewart
9055deea16 py-torch: fix bug in libs/headers attributes (#24624) 2021-07-06 08:53:05 +02:00
Seth R. Johnson
1e3c012fea xyce: clean and fix trilinos dependencies (#24673) 2021-07-06 08:52:10 +02:00
figroc
61242db8f9 abseil-cpp: add versions up to 20210324.2 (#24692) 2021-07-06 08:42:28 +02:00
figroc
9550703132 tensorflow-serving-client: add v2.3.0 (#24694) 2021-07-06 08:41:16 +02:00
eugeneswalker
e450612188 installer: fix double print of exception (#24697) 2021-07-06 08:40:24 +02:00
Adam J. Stewart
115c39e762 py-black: add v21.6b0 (#24715) 2021-07-06 08:40:12 +02:00
Mark W. Krentel
b8f1bd407e hpcx-mpi: new package (#24194)
This is a virtual package for Nvidia's HPC-X MPI implementation for
external specs only.
2021-07-06 08:20:47 +02:00
Adam J. Stewart
cea11f3714 OpenCV: various package updates (#24553) 2021-07-06 08:18:58 +02:00
Adam J. Stewart
b35d6d13a7 py-sphinx: add v4.0.2 (#24602) 2021-07-05 13:25:25 -06:00
Manuela Kuhn
6a1a4d4bb6 py-secretstorage: add 3.3.1 (#24705) 2021-07-05 12:24:32 -05:00
Manuela Kuhn
713fd67b4a py-keyring: fix installation on linux (#24706)
* py-keyring: fix installation on linux

* Update var/spack/repos/builtin/packages/py-keyring/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-keyring/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-07-05 12:24:21 -05:00
Adam J. Stewart
2ded87d40e GDAL: add v3.3.1 (#24713) 2021-07-05 18:45:57 +02:00
Manuela Kuhn
9963642c1c py-importlib-metadata: add 4.6.1 and 3.10.1 (#24701) 2021-07-05 10:28:01 -05:00
Manuela Kuhn
8e37c30e2f py-whoosh: add new package (#24707) 2021-07-05 16:08:57 +02:00
Manuela Kuhn
04289b2009 py-keyrings-alt: add new package (#24704) 2021-07-05 12:45:29 +02:00
Manuela Kuhn
d764b776d7 py-patool: add new package (#24703) 2021-07-05 12:44:59 +02:00
Manuela Kuhn
1cb2855054 py-iso8601: add new package (#24702) 2021-07-05 12:44:23 +02:00
Manuela Kuhn
95b0eb9fdd py-num2words: add new package (#24681) 2021-07-05 11:16:01 +02:00
Sebastian Pipping
a2a273832f uriparser: add v0.9.5 (#24688) 2021-07-05 11:14:30 +02:00
figroc
cf6aa8f012 grpc: add versions up to 1.33.1 (#24693) 2021-07-05 10:53:35 +02:00
Adam J. Stewart
fd11c6f5f7 py-pandas: add v1.3.0 (#24696) 2021-07-05 10:51:14 +02:00
Adam J. Stewart
eaa918c8f3 py-isort: add v5.9.1, +colors variant (#24699) 2021-07-05 10:50:23 +02:00
Adam J. Stewart
eacba1ffac py-colorama: add v0.4.4 (#24698) 2021-07-05 10:49:26 +02:00
Adam J. Stewart
a3f6df33ef Remove add-maintainers-as-reviewers action (#24700) 2021-07-04 18:55:24 -07:00
Adam J. Stewart
3b94e22ad4 Fix fetching for Python 3.9.6 (#24686)
When using Python 3.9.6, Spack is no longer able to fetch anything. Commands like `spack fetch` and `spack install` all break.

Python 3.9.6 includes a [new change](https://github.com/python/cpython/pull/25853/files#diff-b3712475a413ec972134c0260c8f1eb1deefb66184f740ef00c37b4487ef873eR462) that means that `scheme` must be a string, it cannot be None. The solution is to use an empty string like the method default.

Fixes #24644. Also see https://github.com/Homebrew/homebrew-core/pull/80175 where this issue was discovered by CI. Thanks @branchvincent for reporting such a serious issue before any actual users encountered it!

Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2021-07-02 22:20:09 -07:00
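The nature of the fix can be illustrated with `urllib.parse`: on the affected Python version the `scheme` argument must be a string, so callers should pass the empty-string default rather than `None` (an illustrative call, not the exact code path Spack patched):

```python
from urllib.parse import urlsplit

# Pass "" (the documented default) for scheme, never None.
parts = urlsplit("//example.com/path", scheme="")
```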
Robert Underwood
e568564e2f Update Z-checker and SZ (#24446) 2021-07-02 23:40:14 +02:00
Joe Heaton
6547f41096 Rename cray compiler to cce (#24653)
cp2k was using the outdated and incorrect compiler name `cray`.
2021-07-02 12:15:43 -07:00
Scott Wittenburg
c895332284 Pipelines: Improve broken specs check (#24643)
We should not fail the generate stage simply due to the presence of
a broken-spec somewhere in the DAG.  Only fail if the known broken
spec needs to be rebuilt.
2021-07-02 10:49:49 -07:00
Massimiliano Culpo
3d11716e54 Add when context manager to group common constraints in packages (#24650)
This PR adds a context manager that permits grouping the common part of a `when=` argument and adding it to the context:
```python
class Gcc(AutotoolsPackage):
    with when('+nvptx'):
        depends_on('cuda')
        conflicts('@:6', msg='NVPTX only supported in gcc 7 and above')
        conflicts('languages=ada')
        conflicts('languages=brig')
        conflicts('languages=go')
```
The above snippet is equivalent to:
```python
class Gcc(AutotoolsPackage):
    depends_on('cuda', when='+nvptx')
    conflicts('@:6', when='+nvptx', msg='NVPTX only supported in gcc 7 and above')
    conflicts('languages=ada', when='+nvptx')
    conflicts('languages=brig', when='+nvptx')
    conflicts('languages=go', when='+nvptx')
```
which needs the `when='+nvptx'` argument repeated on every directive. The context manager might help improve readability and permits grouping together directives related to the same semantic aspect (e.g. all the directives needed to model the behavior of `gcc` when `+nvptx` is active).

Modifications:

- [x] Added a `when` context manager to be used with package directives
- [x] Add unit tests and documentation for the new feature
- [x] Modified `cp2k` and `gcc` to show the use of the context manager
2021-07-02 08:43:15 -07:00
Olivier Cessenat
f88d90e432 mfem: adjusted dependencies on hypre (4.2 compiles with hypre up to 2.20) (#24611) 2021-07-02 09:32:31 +02:00
Seth R. Johnson
8089b86dc2 curl: explicitly disable unused dependencies (#24613)
I installed curl on my mac and it picked up a homebrew (I think?)
installation of gsasl. A later system update broke git because of the
implicitly added dependency. Explicitly disabling libraries that *might*
exist on the system is the safe approach here.

```
dyld: Library not loaded: /usr/local/opt/gsasl/lib/libgsasl.7.dylib
  Referenced from: /rnsdhpc/code/spack/opt/spack/apple-clang/curl/gag5v3c/lib/libcurl.4.dylib
  Reason: image not found
error: git-remote-https died of signal 6
```
2021-07-02 09:30:47 +02:00
Chuck Atkins
f1842f363d dataspaces: move compiler vars to setup_build_environment (#24626) 2021-07-02 09:29:17 +02:00
Adam J. Stewart
7ced07a141 GEOS: add v3.9.1, switch to CMake (#24629) 2021-07-02 08:48:32 +02:00
Weiqun Zhang
189968e207 amrex: add v21.07 (#24655)
Also add conflict with rocm-4.2.
2021-07-02 08:47:08 +02:00
Tiziano Müller
f54fad40ba amdlibflame: fix build with gcc from CrayPE (#24358)
fixes #24210
2021-07-01 18:37:14 -06:00
Mosè Giordano
38a010b580 sombrero: add new package (#24567) 2021-07-01 18:28:28 -06:00
Matthieu Dorier
e1694afdde xsd: added patch to fix missing #include <iostream> (#24496) 2021-07-01 18:25:22 -06:00
Axel Huebl
d842c08a9b WarpX: FFTW+OpenMP (#24604)
FFTW: prefer with OpenMP acceleration for OpenMP compute backend
2021-07-01 18:22:22 -06:00
Sebastian Schmitt
54219852d9 Update dill (#24633) 2021-07-01 16:16:30 -06:00
Harmen Stoppels
4a8a6b4c9d meson: fix typo (libs instead of default_library) (#24649) 2021-07-01 14:37:36 -06:00
Maciej Wójcik
c5a27980df Added Perl workaround for CUDA <= 8 (#24291)
* Added Perl workaround for CUDA <= 8
* Re-wrapped comment
* Proofreading corrections
* Added a reference
* Do not override Perl include path
* Retrieve shell once

Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>
2021-07-01 12:04:04 -07:00
Seth R. Johnson
406117148d trilinos: improve behavior of gotype (#24565)
* trilinos: add teko conflict
* trilinos: improve gotype variant

Instead of 'none' and 'long' typically being the same (but not for older
trilinos versions), add an explicit 'all' variant that only works for
older trilinos which supports multiple simultaneous tpetra
instantiations.

* trilinos: add self as maintainer
* trilinos: disable vendored gtest by default
2021-07-01 03:13:21 -06:00
Valentin Volkl
e6700d47aa yoda: add v1.9.0 and compiler conflict for earlier versions (#23814) 2021-07-01 09:27:33 +02:00
Tamara Dahlgren
ca550cd819 hdf: replacing use of install test root with new cached tests dir (#24368) 2021-07-01 09:17:38 +02:00
kwryankrattiger
8c46e82862 sensei: repo update (#24487) 2021-07-01 09:15:16 +02:00
Seth R. Johnson
d0bbe18c79 vim: use value variant and update config script (#24554)
This changes several conflicting variants to a single
multi-value variant, and uses conflicts instead of raising InstallError.
(With clingo, requesting +gui automatically selects features=huge!)
I have also rearranged the dependencies for clarity and simplified the
configure args.
2021-07-01 09:12:05 +02:00
Harmen Stoppels
ca538e18a4 sirius: add v7.2.5 (#24587) 2021-06-30 23:16:23 -06:00
Desmond Orton
59028aa0a5 py-wxpython: Version update to 4.1.1 (#24569) 2021-06-30 16:55:14 -06:00
Zack Galbreath
c8868f1922 ci: only write to broken-specs list on SpackError (#24618)
ci: only write to broken-specs list on SpackError

Only write to the broken-specs list when `spack install` raises a SpackError,
instead of writing to this list unnecessarily when infrastructure-related problems
prevent a develop job from completing successfully.
2021-06-30 12:16:15 -06:00
Dr. Christian Tacke
89bed5773e root: Add Version 6.24.02 (#24619)
Also fixes some style issues
2021-06-30 12:10:34 -06:00
Robert Mijakovic
1639ac8e86 python: new versions; 3.9.6, 3.8.11, 3.7.11, 3.6.14 (#24593) 2021-06-30 12:01:29 -06:00
Adam J. Stewart
7a794f8b0a py-torchvision: add v0.10.0 (#24340) 2021-06-30 12:42:25 -04:00
Adam J. Stewart
4f9b539644 py-torch: overhaul package (#24294)
* py-torch: patch no longer needed on master

* Overhaul PyTorch package

* py-torch: add v1.9.0

* Change defaults on macOS

* Submodules still needed...

* Add ONNX dependency

* System libs don't work for many submodules

* Silence CMake warning

* Add conflict for +cuda+rocm

* Add more deps

* Add more BLAS options

* Disable some broken system libs options

* Add patches to build older versions

* +mkldnn requires mkl

* Fix BLAS settings
2021-06-30 12:38:53 -04:00
Adam Moody
7753c816f0 scr and other packages: rename default branches to main (#24578) 2021-06-30 10:31:41 -06:00
Zack Galbreath
0bbd71d561 ci: Don't raise an exception if we fail to relate CDash builds (#24299)
ci: Do not allow cdash-related errors to cause pipeline failure
2021-06-30 10:26:26 -06:00
Morten Kristensen
b8512983d9 py-vermin: add v1.2.1 (#24607) 2021-06-30 09:13:21 -06:00
Adam J. Stewart
ea261e3530 py-pytorch-sphinx-theme: add master version (#24594) 2021-06-30 07:52:20 -06:00
Robert Mijakovic
acd1b04ea2 cmake: add v3.20.4, v3.20.5 (#24582)
Co-authored-by: Robert Mijakovic <robert.mijakovic@lxp.lu>
2021-06-30 09:31:00 +02:00
Manuela Kuhn
f2d60261c9 r-stringi: add v1.6.2 (#24585) 2021-06-30 09:30:15 +02:00
shanedsnyder
8ad1dd6036 darshan-runtime,darshan-util: add v3.3.1 + updated git repository (#24574) 2021-06-30 09:11:16 +02:00
Thomas Madlener
c832ac28ae genfit package: add googletest dependency (#24467) 2021-06-29 22:37:25 -07:00
Axel Huebl
a647ae2aeb HiPACE++: FFTW+OpenMP (#24575)
- change the default compute backend: we are just starting to add OpenMP support
- FFTW: prefer with OpenMP acceleration for OpenMP compute backend
2021-06-29 20:55:22 -06:00
Jen Herting
d00082e70b New package: py-datasets (#24597)
* [py-datasets] created template

* [py-datasets] added dependencies

* [py-datasets] requires py-pyarrow with +parquet

* [py-datasets] Final cleanup

- added homepage
- added description
- removed fixmes
2021-06-29 20:28:23 -06:00
Jen Herting
501f87fb22 dependency for py-torchmeta (#24595)
Co-authored-by: Sid Pendelberry <sid@rit.edu>
2021-06-29 18:58:21 -06:00
Sergey Kosukhin
55dd306790 hdf5: a follow-up to #18937 (#23820) 2021-06-29 18:34:12 -06:00
Adam J. Stewart
a2b7f9997d spack style: warn if flake8-import-order is missing (#24590) 2021-06-29 23:49:18 +00:00
Manuela Kuhn
1c6504d2f5 py-nipype: add 1.6.1 (#24584) 2021-06-29 17:04:23 -06:00
Manuela Kuhn
cc1285c1e1 py-nilearn: add 0.8.0 (#24583) 2021-06-29 16:04:28 -06:00
Harmen Stoppels
304249604a Fix prefix-collision detection for projections (#24049)
If two Specs have the same hash (and prefix) but are not equal, Spack
originally had logic to detect this and raise an error (since both
cannot be installed in the same place). Recently this has eroded and
the check no-longer works; moreover, when defining projections (which
may truncate the hash or other distinguishing properties from the
prefix) Spack was also failing to detect collisions (in both of these
cases, Spack would overwrite the old prefix with the new Spec).

This PR maintains a list of all "taken" prefixes: if a hash is not
registered (i.e. recorded as installed in the database) but the prefix
is occupied, that is a collision. This can detect collisions created
by defining projections (specifically when they omit the hash).

The PR does not detect collisions where specs have the same hash
(and prefix) but are not equal.
2021-06-29 14:44:56 -07:00
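The taken-prefix bookkeeping could be sketched like this (hypothetical class and method names; the real logic lives alongside Spack's install database):

```python
class PrefixRegistry:
    """Track occupied install prefixes so that a prefix claimed by a
    different spec (e.g. because a projection omitted the hash) is
    detected as a collision instead of being silently overwritten."""

    def __init__(self):
        self.taken = {}  # prefix -> hash that claimed it

    def claim(self, spec_hash, prefix):
        owner = self.taken.get(prefix)
        if owner is not None and owner != spec_hash:
            raise RuntimeError("prefix collision at %s" % prefix)
        self.taken[prefix] = spec_hash
```

As the commit notes, this catches same-prefix/different-hash collisions but not the case where two unequal specs share the same hash.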
Manuela Kuhn
1eb2798c43 py-numpy: conflict with gcc11 and switch master to main (#24573)
Fix the syntax of the conflict between numpy 1.21.0 and gcc11 so that the clingo
concretizer recognizes it.
In addition the upstream master branch was renamed to main.
2021-06-29 15:40:32 -06:00
Jen Herting
e284cd136a New package: py-pyautogui (#24572)
* [py-pyautogui] created template

* [py-pyautogui] added some unconditional dependencies

* [py-pyautogui] Final cleanup

- added homepage
- added description
- removed fixmes

* [py-pyautogui] added missing dependencies
2021-06-29 13:04:20 -06:00
Christoph Conrads
f5474a2b8b CGNS: add CMake dependency (#24564) 2021-06-29 12:49:26 -06:00
holrock
c824cad2ea ruby: add v3.0.1 (#24560) 2021-06-29 12:22:38 -06:00
Vinícius
3fe1ecd807 simgrid: add v3.27, update package (#24513) 2021-06-29 12:07:22 -06:00
Massimiliano Culpo
2b65c53d2b vermin: show line numbers of violations (#24580)
This commit runs vermin with the --violations option
that shows details of the violations to target requirements.
2021-06-29 19:41:22 +02:00
Jen Herting
327cca7e2e New package: py-huggingface-hub (#24588)
* [py-huggingface-hub] created template

* [py-huggingface-hub] added dependencies

* [py-huggingface-hub] added version 0.0.8

* [py-huggingface-hub] Final cleanup

- added description
- added homepage
- removed fixmes
2021-06-29 10:49:10 -06:00
Scott Wittenburg
09fa155333 pipelines: build warpx on instance with more memory (#24592) 2021-06-29 18:06:30 +02:00
Chuck Atkins
a7b6149cc0 libzmq: Fix gcc11 build failure (#24563) 2021-06-29 06:33:35 +02:00
Manuela Kuhn
b87d9c29c1 py-scipy: fix missing py-cython dependency (#24548) 2021-06-28 14:07:24 -06:00
Larry Knox
cc20dbf645 Hdf5 cmake (#18937)
* Switch hdf5 package from autotools to cmake.

* Add variant for building with zlib, default to ON.

* Update for format requirements.

* Format change.

* Fix breakage from last merge from develop.
Switch szip to use libaec (unrestricted encryption).
Remove 'static' variant:  static libs will only be installed when
~shared.

* Improve args based on suggestions from pull request.

* Update code URL to github.com
Add/modify 4 depends_on lines to fix running "spack graph --deptype=link hdf5".

* Remove trailing whitespace.

* Remove dependencies added solely to make "spack graph --deptype=link" work.

* Add new version HDF5 1.8.22.

* Remove unnecessary java_check.

* Fix whitespace for style checks.

* Reverted zlib version dependency to 1.1.2:.
zlib variant removed.
api version default renamed "default".

* Remove blank line.

* Whitespace corrections.

* Removed unnecessary 'debug' variant.

* Fix typo in version number in conflict for '+szip'.

* Set default for tools variant to True.
Remove patch functions dependent on 'libtool' file that cmake doesn't
produce.

* Remove line to set ONLY_SHARED_LIBS to true.
Add post_install code to install only one version of tools with shared
linkage and original tool names.

* Remove trailing white space and import of glob package not used.

* Leave BUILD_TESTING set to default which is ON.

* Remove  post_install code to install only one version of tools because
some dependent packages running tests in e4s testing are using
h5diff-shared.  Keep both tools versions for now.

* No longer need to import os.
2021-06-28 13:42:54 -06:00
Michael Kuhn
9a9b5dee2e glib: add v2.68.3 (#24558) 2021-06-28 12:01:24 -06:00
Robert Underwood
69d69cbc79 GDB: resolve warnings about imp being deprecated (#24448)
This patch has already been accepted into gdb's trunk; we are just adopting
it earlier here since it is small and gives a better user experience.
2021-06-28 07:16:28 -06:00
Adam J. Stewart
2284007db9 Use flake8-import-order to enforce PEP-8 compliance (#23947) 2021-06-28 08:03:20 -05:00
Mikael Simberg
744cedc7e9 Add Asio package (#24485) 2021-06-28 12:04:19 +02:00
Olivier Cessenat
e631ccc6f7 New Package: visit-mfem (#22906) 2021-06-28 12:02:41 +02:00
Kai Germaschewski
3cfc1dbc14 pfunit: fix gcc10 +mpi (#23878)
Instead of refusing to build +mpi with gcc10, add what I guess is now
the standard workaround, i.e., `-fallow-argument-mismatch`.

Getting this into pfunit's cmake-based but kinda non-standard build is
a bit ugly, but you gotta do what you gotta do...
2021-06-28 11:33:23 +02:00
Jianwen
2970c02639 Fix kokkos version number in lammps. (#24436) 2021-06-28 11:23:34 +02:00
Manuela Kuhn
77a98cabfa llvm: add patch for gcc11 (#24363)
llvm10 was not compiling with gcc due to missing header (see #24270)
2021-06-28 11:18:53 +02:00
Asher Mancinelli
b6aea0d6bf Update ipopt versions, fix blas/lapack flags (#24447)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2021-06-28 11:14:35 +02:00
Axel Huebl
c06db97970 CUDA: 11.0.3 (11.0 "Update 1") (#24481)
Add CUDA 11.0.3.

This release adds new features such as NVCC flags
`--forward-unknown-to-host-compiler` and
`--forward-unknown-to-host-linker`
2021-06-28 10:53:47 +02:00
miheer vaidya
90dc90e8d1 barvinok: add new package (#24477) 2021-06-28 10:52:45 +02:00
Erik Schnetter
f5ef532bdc silo: force autoreconf when building share libs (#24388) 2021-06-28 10:44:38 +02:00
lorddavidiii
2bd2ef27a2 ocl-icd, opencl-c-headers and opencl-clhpp: add new versions (#24499) 2021-06-28 02:13:24 -06:00
Scott McMillan
a4a393d097 Update Boost package to support building the latest with the NV compilers (#24541)
Co-authored-by: Scott McMillan <smcmillan@nvidia.com>
2021-06-28 01:19:36 -06:00
Chris White
a6ce000e09 RAJA + Umpire: CUDA Arch fixes (#24531) 2021-06-28 09:09:01 +02:00
Manuela Kuhn
3e65828a7e pandoc: add 2.11.4 and 2.14.0.3 (#24542) 2021-06-28 08:50:39 +02:00
albestro
6f950bc8ee cppzmq: add v4.7.1 and "drafts" variant (#24555) 2021-06-28 08:31:04 +02:00
Dylan Simon
963b931309 docs: link projections docs to spec format (#24478) 2021-06-27 08:38:28 +00:00
Anton Kozhevnikov
9bd9cc2c7b switch profiler on/off (#24547) 2021-06-26 19:25:15 +02:00
Alec Scott
4c149ade7f accumulo: add v2.0.1 (#24409) 2021-06-26 05:07:13 -06:00
Hadrien G
ecc950d10c dd4hep: fix hash for version 01-17 (#24425)
Version 1.17 of DD4hep was renamed from "01-17-00" to "01-17", in line 
with the naming conventions of previous releases. Since release archives
contain a subdirectory with the version string in it, this changes the contents
of the tarball ever so slightly, so the SHA-256 checksum must change as well.
2021-06-26 12:34:31 +02:00
Stephen Herbein
9b2e7e6140 flux: add latest versions (core v0.27.0, sched v0.16.0) (#24546) 2021-06-26 04:13:18 -06:00
Cameron Smith
19a973eca0 pumi: add v2.2.6 (#24525) 2021-06-26 11:27:52 +02:00
Manuela Kuhn
ce16503bd3 r-rmarkdown: add 2.9 (#24539) 2021-06-26 11:27:17 +02:00
Manuela Kuhn
ef67ecde60 r-tinytex: add 0.32 (#24538) 2021-06-26 11:27:04 +02:00
Manuela Kuhn
2b1916d845 r-knitr: add 1.33 (#24537) 2021-06-26 11:26:46 +02:00
Manuela Kuhn
469e580034 r-mime: add 0.11 (#24536) 2021-06-26 11:26:25 +02:00
Manuela Kuhn
80585562c9 r-htmltools: add 0.5.1.1 (#24535) 2021-06-26 11:26:06 +02:00
Manuela Kuhn
2aa9e337ee soci: add 4.0.2 and multiple variants (#24543)
Fix url to find newer versions, add newest version 4.0.2 and add
variants for
- cxxstd: To use a specific c++ standard
- static: Enable or disable build of static libraries
- boost: Boost support
- sqlite: SQLite support
- postgresql: PostgreSQL support
2021-06-26 03:25:47 -06:00
Manuela Kuhn
06a1cf2449 r-highr: add 0.9 (#24534) 2021-06-26 11:25:31 +02:00
Manuela Kuhn
2a9b9c9046 r-xfun: add 0.24 (#24533) 2021-06-26 11:25:09 +02:00
Massimiliano Culpo
b12cee32de Update archspec to support arm compiler on a64fx (#24524) 2021-06-26 09:18:48 +02:00
Massimiliano Culpo
17f9ddb2b5 flecsi: fixed reported issues in package (#24398)
Prevent the use of "legion network=none" when
flecsi has "backend=legion"
2021-06-26 09:17:53 +02:00
Satish Balay
dbf030f27a p4est: autoreconf required only for @2.0 (#24544) 2021-06-25 19:02:47 -07:00
Satish Balay
8937102006 p4est: use autoreconf only for @:2.2 (#24528)
This fixes @2.3.2 build breakage with #23824 changes.
2021-06-25 14:04:32 -07:00
snehring
cf0b3632ff bwa: fixing build errors with gcc10+ (#24475) 2021-06-25 22:12:01 +02:00
Adrien Bernede
6b852bc170 Doc: Note on required changes after merge of reproducible builds (#24347)
* Suggestion of a note for conversion of existing pipelines.

* Wording

* Fix format in .rst note

* Wording
2021-06-25 11:02:26 -06:00
Even Rouault
843c38e69e GDAL: only jasper will be removed in GDAL 3.5, not openjpeg (#24483) 2021-06-25 10:19:23 -06:00
Adam J. Stewart
2bc0c0ea59 Add support for .tbz file extensions (#24479) 2021-06-25 09:37:23 -06:00
Satish Balay
3087d74ca7 sundials: remove sundials_nvecopenmp target from ARKODE SuperLU_DIST example (#24516) 2021-06-25 08:19:26 -06:00
Jen Herting
03f54ea4bb py-seqeval: new package (#24486) 2021-06-25 14:35:00 +02:00
Joseph Schoonover
25522b5c9c HOHQMesh: add new package (#24501)
Co-authored-by: Joe Schoonover <joe@fluidnumerics.com>
2021-06-25 14:15:04 +02:00
Michael Kuhn
f4b96a21c8 go-bootstrap: Increase environment variable size (#24508)
When having a few packages loaded, installing go-bootstrap will fail
because the `PATH` variable is truncated at 4096 bytes. Increase the
limit to 128 KiB to make longer paths fit.
2021-06-25 11:00:06 +02:00
Satish Balay
291703f146 xsdk: fix dealii@9.2.0 build (#24515)
1. "+simplex" conflicts with "dealii@:9.2" [The interface to simplex is supported from version 9.3.0 onwards. Please explicitly disable this variant via ~simplex]
2. "+arborx" conflicts with "dealii@:9.2" [The interface to arborx is supported from version 9.3.0 onwards. Please explicitly disable this variant via ~arborx]
2021-06-25 10:54:06 +02:00
Alec Scott
6bf1f69b4c lmod: add v8.5.6 (#24511) 2021-06-25 10:49:01 +02:00
Adam J. Stewart
b9eeef8c38 ONNX: add new versions (#24518) 2021-06-25 09:31:20 +02:00
Olivier Cessenat
ec2d4c07b3 texlive: add support for external find (#24460) 2021-06-25 09:25:18 +02:00
Gregory Lee
7dafc827a7 only apply oneapi patch to m4 for v1.4.18 (#24490) 2021-06-24 17:35:26 -07:00
Adam J. Stewart
26f740b25a Sleef: add new versions (#24443)
* Sleef: add new versions
* Mix release versions and dates
2021-06-24 17:58:19 -06:00
Hadrien G
0c996671b8 [acts] Add versions 9.0.0 and 9.0.1 (#24428) 2021-06-24 17:28:27 -06:00
Alec Scott
4eb4994472 Update Jasper to 2.0.32 (#24510) 2021-06-24 23:15:17 +00:00
Peter Scheibel
916cdfbb56 Environment modifications: de-prioritize external packages (#23824)
Prior to any Spack build, Spack modifies PATH etc. to help the build
find the dependencies it needs. It also allows any package to define
custom environment modifications (and furthermore a package can
specify environment modifications to apply when it is used as a
dependency). If an external package defines custom environment
modifications that alter PATH, and the external package is in a merged
or system prefix, then that prefix could "override" the Spack-built
packages.

This commit reorders environment modifications so that PrependPath
actions which expose Spack-built packages override PrependPath actions
for custom environment modifications of external packages.

In more detail, the original order of environment modifications is:

* Modules
* Compiler flag variables
* PATH, CMAKE_PREFIX_PATH, and PKG_CONFIG_PATH for dependencies
* Custom package.py modifications in the following order:
  * dependencies
  * root

This commit changes the order:

* Modules
* Compiler flag variables
* For each external dependency
  * PATH, CMAKE_PREFIX_PATH, and PKG_CONFIG_PATH modifications
  * Custom modifications
* For each Spack-built dependency
  * PATH, CMAKE_PREFIX_PATH, and PKG_CONFIG_PATH modifications
  * Custom modifications
2021-06-24 16:13:46 -07:00
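The reordering described in the commit above can be sketched in plain Python. This is a minimal illustration with hypothetical dependency records, not Spack's actual implementation; the point is that modifications applied later prepend last, so Spack-built prefixes end up earlier in `PATH` and win.

```python
def ordered_env_modifications(modules, compiler_flags, deps):
    """Emit environment modifications in the order described above.

    `deps` is a list of hypothetical records like:
      {"name": ..., "external": bool, "path_mods": [...], "custom_mods": [...]}
    Externals go first so that prepends from Spack-built packages,
    applied afterwards, take precedence on PATH-like variables.
    """
    mods = []
    mods.extend(modules)
    mods.extend(compiler_flags)
    # External dependencies first: their PATH-like and custom mods ...
    for dep in (d for d in deps if d["external"]):
        mods.extend(dep["path_mods"])
        mods.extend(dep["custom_mods"])
    # ... then Spack-built dependencies, whose prepends land later
    # and therefore override the external ones.
    for dep in (d for d in deps if not d["external"]):
        mods.extend(dep["path_mods"])
        mods.extend(dep["custom_mods"])
    return mods
```

With a prepend semantics, the last-applied entry sits first in the resulting variable, which is why the Spack-built group is emitted last.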
Scott Wittenburg
d7405ddd39 Pipelines: Set a pipeline type variable (#24505)
Spack pipelines need to take specific actions internally that depend
on whether the pipeline is being run on a PR to spack or a merge to
the develop branch.  Pipelines can also run in other repositories,
which represents other possible use cases than just the two mentioned
above.  This PR creates a "SPACK_PIPELINE_TYPE" gitlab variable which
is propagated to rebuild jobs, and is also used internally to determine
which pipeline-specific tasks to run.

One goal of the PR is fix an issue where rebuild jobs which failed on
develop pipelines did not properly report the broken full hash to the
"broken-specs-url".
2021-06-24 16:15:19 -06:00
Asher Mancinelli
010b431692 Add Externally Findable section to info command (#24503)
* Add Externally Findable section to info command

* Use comma delimited detection attributes in addition to boolean value

* Unit test externally detectable part of spack info
2021-06-24 22:12:45 +00:00
Alec Scott
e45800126a abyss: add v2.3.1 (#24408) 2021-06-24 15:01:46 -06:00
Howard Pritchard
7456a0348f OPENMPI: fixes to enable building of ompi master (#24391)
yes I know this name isn't popular but that's the way it is right now.

master and the upcoming v5.0.x release branch use git submodules.

Signed-off-by: Howard Pritchard <howardp@lanl.gov>
2021-06-24 13:55:01 -07:00
Alec Scott
1367cc97c2 bedops: add v2.4.39 (#24411) 2021-06-24 14:46:20 -06:00
Manuela Kuhn
9d0b8b575b postgresql: fix typo and missing import (#24491) 2021-06-24 11:11:44 -05:00
Jen Herting
8f2f5639c8 [xxhash] added version 0.8.0 (#24492) 2021-06-24 11:11:01 -05:00
Jen Herting
cf38a96b14 New package: py-xxhash (#24493)
* [py-xxhash] created template

* [py-xxhash] working on dependencies

* [py-xxhash] set version for xxhash

* [py-xxhash] Final cleanup

- added homepage
- added description
- removed fixmes
2021-06-24 11:10:24 -05:00
Anton Kozhevnikov
b1009b48b9 sirius: add single precision switch (#24500) 2021-06-24 08:46:28 -06:00
Joseph Schoonover
d3a1da8496 FTObjectLibrary: new package (#24423)
Co-authored-by: Joe Schoonover <joe@fluidnumerics.com>
2021-06-23 22:55:28 -06:00
Massimiliano Culpo
4985215072 Update command to setup tutorial (#24488) 2021-06-23 17:19:20 -07:00
Scott Wittenburg
db403391c8 spack ci: use return codes to signal exit status (#24090) 2021-06-23 17:09:19 -07:00
Jen Herting
3d631377c0 New package: py-mouseinfo (#24245)
* [py-mouseinfo] created template

* [py-mouseinfo] added some dependencies

* [py-mouseinfo] added platform dependent dependency information

* [py-mouseinfo] flake8

* [py-mouseinfo] added python2 dependency and conflict with darwin for missing dependency

* [py-mouseinfo] Final cleanup

- added homepage
- added description
- removed fixmes

* [py-mouseinfo] using pil provider
2021-06-23 15:46:20 -06:00
Olivier Cessenat
387ee5a0b7 gsl: add v2.7 (#24474) 2021-06-23 18:46:16 +02:00
Massimiliano Culpo
1bccd866ae Fix broken CI for package only PRs, make dateutil not strictly required (#24484)
* Force the Python interpreter with an env variable

This commit forces the Python interpreter with an
environment variable, to ensure that the Python set
by the "setup-python" action is the one being used.

Due to the policy adopted by Spack to prefer python3
over python we may end up picking a Python 3.X
interpreter where Python 2.7 was meant to be used.

* Revert "Update conftest.py (#24473)"

This reverts commit 477c8ce820.

* Make python-dateutil a soft dependency for unit tests

Before #23212 people could clone spack and run
```
spack unit-tests
```
while now this is not possible, since python-dateutil is
a required but not vendored dependency. This change makes
it not a hard requirement, i.e. it will be used if found
in the current interpreter.

* Workaround mypy complaint
2021-06-23 07:56:07 -04:00
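The "soft dependency" behavior described above is the standard optional-import pattern: use python-dateutil if the current interpreter has it, otherwise degrade gracefully. A minimal sketch (not the actual Spack code; `parse_date` is a hypothetical helper):

```python
# Optional import: succeed if python-dateutil is installed in the
# current interpreter, otherwise record its absence instead of failing.
try:
    import dateutil.parser
    HAVE_DATEUTIL = True
except ImportError:
    dateutil = None
    HAVE_DATEUTIL = False


def parse_date(text):
    """Parse a date string, raising a clear error when the optional
    dependency is missing rather than an ImportError at startup."""
    if not HAVE_DATEUTIL:
        raise RuntimeError("python-dateutil is required for date parsing")
    return dateutil.parser.parse(text)
```

Code paths that do not need date parsing keep working either way, which is what lets `spack unit-tests` run from a bare clone again.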
Uwe Sauter
97f0c3ccd9 Add dependency on rocm-cmake to various ROCm related packages (#24427) 2021-06-23 11:35:49 +02:00
eugeneswalker
2db858e9c4 filter_compiler_wrappers: include realpath of compiler wrappers (#24456) 2021-06-22 23:37:07 +00:00
loulawrence
4da0561496 Add config option to use urllib to fetch if curl missing (#21398)
* Use Python module urllib to fetch in the case that curl is missing
2021-06-22 13:38:37 -07:00
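The fallback described in the commit above can be sketched as follows. This is a simplified stand-in (hypothetical `fetch` helper, not Spack's fetcher code): prefer curl when it is on `PATH`, otherwise fall back to the pure-Python `urllib`.

```python
import shutil
import subprocess
import urllib.request


def fetch(url, dest, have_curl=None):
    """Download `url` to `dest`, preferring curl but falling back to
    urllib when curl is missing. Returns the method used."""
    if have_curl is None:
        have_curl = shutil.which("curl") is not None
    if have_curl:
        # Shell out to curl: fail on HTTP errors, follow redirects.
        subprocess.run(["curl", "-fsSL", "-o", dest, url], check=True)
        return "curl"
    # No curl available: use the standard library instead.
    urllib.request.urlretrieve(url, dest)
    return "urllib"
```

`urlretrieve` also handles `file://` URLs, so the fallback works for local mirrors as well as remote ones.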
Peter Scheibel
477c8ce820 Update conftest.py (#24473) 2021-06-22 12:57:35 -07:00
Peter Scheibel
323b47a94e add version 35.0 of luaposix (#24458) 2021-06-22 12:49:34 -06:00
Massimiliano Culpo
acc11f676d ASP-based solver: fix provider logic (#24351)
This commit fixes a subtle bug that may occur when
a package is a "possible_provider" of a virtual but
no "provides_virtual" can be deduced. In that case
the cardinality constraint on "provides_virtual"
may arbitrarily assign a package the role of provider
even if the constraints for it to be one are not fulfilled.

The fix reworks the logic around three concepts:
- "possible_provider": a package may provide a virtual if some constraints are met
- "provides_virtual": a package meet the constraints to provide a virtual
- "provider": a package selected to provide a virtual
2021-06-22 11:37:24 -07:00
Massimiliano Culpo
02b92dbf10 ASP-based solver: fix facts for default providers (#24380)
Facts used to compute weights for providers only need
the package name, since the other attributes are computed
as part of the solve.
2021-06-22 11:33:44 -07:00
Ethan Stam
09a6f3533b z3: set CMAKE_INSTALL_PYTHON_PKG_DIR for +python build (#24470) 2021-06-22 11:20:09 -07:00
Adam J. Stewart
e4c38ba14c py-cartopy: mark incompatibility with PROJ 8 (#24454) 2021-06-22 11:12:25 -07:00
OliverPerks
d292541edb sw4lite: fixed to include build targets (#24466) 2021-06-22 13:01:34 -05:00
Thomas Madlener
07fe558509 dd4hep: Updated version checksum due to updated tag (#24469) 2021-06-22 10:49:50 -07:00
snehring
377f031461 subread: updating subread to 2.0.2 (#24468) 2021-06-22 10:33:33 -07:00
Harmen Stoppels
d63566915d Bump libfuse (#24444) 2021-06-22 10:06:45 -07:00
Adam J. Stewart
11ad6e1a8a py-pandas: add v1.2.5 (#24464) 2021-06-22 10:05:38 -07:00
Adam J. Stewart
ccece0e197 py-numpy: add v1.21.0 (#24463) 2021-06-22 10:05:22 -07:00
eugeneswalker
65e7e1f969 tau: use filter_compiler_wrappers to take advantage of builtin functionality (#24457) 2021-06-22 12:00:27 -05:00
Adam J. Stewart
b0a915a3b6 py-scipy: add v1.7.0 (#24438) 2021-06-22 11:56:16 -05:00
Erik Schnetter
e3b220f699 Implement CVS fetcher (#23212)
Spack packages can now fetch versions from CVS repositories. Note
this fetch mechanism is unsafe unless using :extssh:. Most public
CVS repositories use an insecure protocol implemented as part of CVS.
2021-06-22 09:51:31 -07:00
Adam J. Stewart
512edfcceb py-pythran: add new package (#24440) 2021-06-22 07:07:15 -06:00
Vanessasaurus
8e249c03de adding save of build times on install (#24350)
Here we are adding an install_times.json into the spack install metadata folder.
We record a total, global time, along with the times for each phase. The type
of phase or install start / end is included (e.g., build or fail)

Signed-off-by: vsoch <vsoch@users.noreply.github.com>

Co-authored-by: vsoch <vsoch@users.noreply.github.com>
2021-06-22 03:01:15 -06:00
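The per-phase timing described above can be sketched like this. The schema here is hypothetical (the real `install_times.json` layout may differ); it only illustrates recording a duration and a build/fail outcome per phase plus a global total.

```python
import json
import time


def record_phase(times, name, func):
    """Run one install phase, recording its duration and outcome.
    (Hypothetical schema -- the real install_times.json may differ.)"""
    start = time.time()
    status = "build"
    try:
        func()
    except Exception:
        status = "fail"
    times["phases"].append(
        {"name": name, "seconds": round(time.time() - start, 4),
         "status": status}
    )


def failing_phase():
    raise RuntimeError("simulated build failure")


times = {"phases": []}
record_phase(times, "configure", lambda: time.sleep(0.01))
record_phase(times, "install", failing_phase)
times["total"] = sum(p["seconds"] for p in times["phases"])
print(json.dumps(times, indent=2))
```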
Adam J. Stewart
726537e01b py-beniget: add new package (#24439) 2021-06-22 08:50:10 +02:00
Adam J. Stewart
d71a0590b7 py-gast: add v0.4.0 (#24437) 2021-06-22 08:49:38 +02:00
Paul Henning
3039237a0e hdf5: fix compiler detection in flag_handler (#24451)
The original implementation of `flag_handler` searched the
`self.compiler.cc` string for `clang` or `gcc` in order to add a flag
for those compilers.  This approach fails when using a spack-installed
compiler that was itself built with gcc or clang, as those strings will
appear in the fully-qualified compiler executable paths.  This commit
switches to searching for `%gcc` or `%clang` in `self.spec`.

Co-authored-by: Paul Henning <phenning@lanl.gov>
2021-06-21 23:36:28 -07:00
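The broken versus fixed checks can be sketched in plain Python (hypothetical helper names; the real logic lives in the hdf5 package's `flag_handler`):

```python
def wants_gcc_flags_broken(cc_path):
    """Broken check: substring search on the full cc path misfires for
    spack-installed compilers, e.g. nvc living under a gcc-built prefix."""
    return "gcc" in cc_path or "clang" in cc_path


def wants_gcc_flags_fixed(compiler_name):
    """Fixed check: consult the compiler name from the spec
    (the '%gcc' / '%clang' part) instead of the executable path."""
    return compiler_name in ("gcc", "clang")
```

The false positive disappears because `nvhpc%gcc` has `gcc` only in its install path, not in its compiler name.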
Howard Pritchard
5e48d2c16f open mpi: remove preferred for 4.0.5 release (#24433)
The 4.1.1 release has fixes for the problems that kept 4.1.0 from
being the default Open MPI version to build with Spack.

related to #24396

Signed-off-by: Howard Pritchard <hppritcha@gmail.com>
2021-06-21 23:36:09 -07:00
Adam J. Stewart
31e6967c49 MAGMA: add patch to build with CUDA sm_37 (#24442) 2021-06-21 20:18:26 -07:00
Chris White
c9932b2d1e Axom: Remove blueos check on cuda variant (#24349)
* remove blueos check on cuda variant, fix typo

* restore necessary compiler guard

* remove axom+cuda from testing because it only partially works outside ppc systems
2021-06-22 01:29:18 +00:00
Peter Scheibel
c83f4b01aa Fetching: git on Mac OS (#24247)
Extend the changes in #24163 to unit tests.
2021-06-21 17:53:12 -07:00
Adam J. Stewart
7b6ca59038 psimd: add new package (#24406) 2021-06-19 08:25:21 -05:00
Adam J. Stewart
62653b9c36 cpuinfo: add new versions (#24402) 2021-06-19 11:05:41 +02:00
Adam J. Stewart
ebcc222181 FP16: add new versions, prevent downloads (#24403) 2021-06-19 11:05:24 +02:00
Adam J. Stewart
9984e61347 pthreadpool: add new versions (#24404) 2021-06-19 11:04:52 +02:00
Adam J. Stewart
76632d6710 FXdiv: add new package (#24405) 2021-06-19 11:04:25 +02:00
snehring
d394e9978e singularity: add v3.8.0 (#24407) 2021-06-19 10:57:37 +02:00
Alec Scott
5f415c9782 Beast2: add v2.6.4 (#24410) 2021-06-19 10:53:59 +02:00
Alec Scott
c432076280 bedtools2: add v2.30.0 (#24412) 2021-06-19 10:53:21 +02:00
Alec Scott
73d7444ca7 benchmark: add v1.5.5 (#24413) 2021-06-19 10:52:51 +02:00
Alec Scott
93c75fe3f7 Bismarck: add v0.23.0 (#24414) 2021-06-19 10:51:59 +02:00
Alec Scott
aa65293709 cantera: add v2.5.1 (#24415) 2021-06-19 10:51:32 +02:00
Alec Scott
767f03f82f coreset: add v1.09 (#24419) 2021-06-19 10:47:27 +02:00
Alec Scott
4ba6a850d9 diamond: add v2.0.9 (#24421) 2021-06-19 10:46:41 +02:00
Glenn Johnson
91ef60eb0e reditools: update and add features (#24370)
This PR does the following:
- adds version corresponding to commit at 08/03/2020
- adds missing get_DE_events.py script
- adds dependencies needed by get_DE_events.py
- removes REDItoolDenovo.py.patch and python2to3.patch in favor of
  running 2to3 and reindent pre-build
- add batch_sort.patch to handle differences in string/char handling
  between python2 and python3
- adds a variant for the Nature Protocol
- adds dependencies for the nature_protocol variant
- added myself as maintainer
This PR adds a new version of reditools from git.
2021-06-18 21:34:50 -05:00
Thomas Gruber
383d4cc84c Add LIKWID 5.2.0 and a patch for LIKWID 5.1.0 (#24399) 2021-06-18 12:52:51 -06:00
Alec Scott
ca9ff82ad0 abi-dumper: add v1.2 (#24392) 2021-06-18 12:52:23 -06:00
Sergei Shudler
4690fdc081 SLATE: Add e4s testsuite-inspired smoke test (#23376) 2021-06-18 10:50:18 -07:00
Sergei Shudler
1b368e433c Heffte: Add e4s testsuite-inspired smoke test (#23652) 2021-06-18 10:47:44 -07:00
G-Ragghianti
94d6d3951a Removed unofficial MAGMA release and enabled MAGMA in e4s (#24400) 2021-06-18 17:28:35 +00:00
Simon Frasch
58272c9d57 spla: add version 1.5.0 and fix compilation with amdblis (#24374) 2021-06-18 09:55:29 -06:00
Themos Tsikas
1b51f09bf0 Checksum update for NAGCompiler download , Version 7.0 (Build 7048) (#24360) 2021-06-18 09:43:15 -06:00
Satish Balay
e3f4036212 petsc, petsc4py: add version 3.15.1 (#24397) 2021-06-18 08:49:37 -06:00
Erik Schnetter
eeacda3dce double-conversion: New versions 3.1.5, 2.0.2 (#24385)
A version 2.0.3 is also advertised, but doesn't download.
2021-06-18 08:34:38 -06:00
Glenn Johnson
c6961ba4d3 Fixes for opencv (#24361)
This PR fixes a couple of issues with the opencv package, mostly in
relation to cuda. This is only focused on cuda, not any of the other
variants.
- Added versions to the contrib_vers list. Added for all that can be
  retrieved from github. The one for the latest version was missing.
- Added a cmake patch for v3.2.0.
- Deprecated versions 3.1.0 and 3.2.0 as neither of those could be
  built, with or without cuda.
- Adjusted constraints on applying initial cmake patch.
- Added cudnn dependency when +cuda.
- Set constraints for cudnn and cuda for older versions of opencv.
2021-06-18 16:15:58 +02:00
Massimiliano Culpo
32f1aa607c Add an audit system to Spack (#23053)
Add a new "spack audit" command. This command can check for issues
with configuration or with packages and is intended to help a
user debug a failed Spack build. 

In some cases the reported issues are always errors but are too
costly to check for (e.g. packages that specify missing variants on
dependencies). In other cases the issues may be legitimate but
uncommon usage of Spack and we want to be sure the user intended the
behavior (e.g. duplicate compiler definitions).

Audits are grouped by theme, and for now the two themes are packages
and configuration. For example you can run all available audits
on packages with "spack audit packages". It is intended that in
the future users will be able to define their own audits.

The package audits are good candidates for running in package_sanity
(i.e. they could catch bugs in user-submitted packages before they
are merged) but that is left for a later PR.
2021-06-18 07:52:08 -06:00
Adam J. Stewart
8ad05d6a74 FBGEMM: GCC 5+ and AVX2 required (#24356) 2021-06-18 08:49:18 -05:00
Massimiliano Culpo
57467d05e1 Disable magma in the E4S pipeline (#24395)
Building magma has been failing consistently and is currently
blocking PRs from being merged. Disable that spec while we
investigate the failure and work on a fix.
2021-06-18 12:55:31 +00:00
Glenn Johnson
9750459e05 gsl package: update patch for later version (#22968)
Old patch is still provided for older versions.
2021-06-17 17:37:57 -07:00
Frank Willmore
2c1e9cc7b7 oneAPI compiler: update openmp flag (#23771) 2021-06-17 17:26:46 -07:00
Vanessasaurus
e7ac422982 Adding support for spack monitor with containerize (#23777)
This should get us most of the way there to support using monitor during a spack container build, for both Singularity and Docker. Some quick notes:

### Docker
Docker works by way of BuildKit, which allows specifying --secret. What this means is that you can prefix a line with a mount of type secret as follows:

```bash
# Install the software, remove unnecessary deps
RUN --mount=type=secret,id=su --mount=type=secret,id=st cd /opt/spack-environment && spack env activate . && export SPACKMON_USER=$(cat /run/secrets/su) && export SPACKMON_TOKEN=$(cat /run/secrets/st) && spack install --monitor --fail-fast && spack gc -y
```
Where the id for one or more secrets corresponds to the file mounted at `/run/secrets/<name>`. So, for example, to build this container with su (spackmon user) and st (spackmon token) defined I would export them on my host and do:

```bash
$ DOCKER_BUILDKIT=1 docker build --network="host" --secret id=st,env=SPACKMON_TOKEN --secret id=su,env=SPACKMON_USER -t spack/container . 
```
And when we add `env` to the secret definition that tells the build to look for the secret with id "st" in the environment variable `SPACKMON_TOKEN` for example.

If the user is building locally with a local spack monitor, we also need to set `--network` to host, otherwise you can't connect to it (because of network isolation, of course).

## Singularity

Singularity doesn't have as nice an ability to clearly specify secrets, so (hoping this eventually gets implemented) what I'm doing now is providing the user instructions to write the credentials to a file, add it to the container to source, and remove when done.

## Tags

Note that the tags PR https://github.com/spack/spack/pull/23712 will need to be merged before `--monitor-tags` will actually work because I'm checking for the attribute (that doesn't exist yet):

```bash
"tags": getattr(args, "monitor_tags", None)
```
So when that PR is merged to update the argument group, it will work here, and I can either update the PR here to not check if the attribute is there (it will be) or open another one in the case this PR is already merged. 

Finally, I added a bunch of documentation for how to use monitor with containerize. I say "mostly working" because I can't do a full test run with this new version until the container base is built with the updated spack (the request to the monitor server for an env install was missing, so I had to add it here).

Signed-off-by: vsoch <vsoch@users.noreply.github.com>

Co-authored-by: vsoch <vsoch@users.noreply.github.com>
2021-06-17 17:15:22 -07:00
eugeneswalker
e916b699ee e4s ci env: package preferences: use newer versions (#24371) 2021-06-17 15:17:49 -06:00
Erik Schnetter
fa89ca2eb0 vtk: Limit freetype versions (#24389)
freetype 2.0.3 introduces an incompatible change

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2021-06-17 15:13:33 -06:00
Adam J. Stewart
1c22742eed gloo: add +cuda variant (#24390) 2021-06-17 20:32:38 +00:00
Erik Schnetter
d4b5911671 hwloc: New version 2.5.0 (#24387) 2021-06-17 13:16:28 -06:00
Erik Schnetter
47e9b62b43 freetype: Add version 2.0.2 (#24386) 2021-06-17 12:37:31 -06:00
Todd Gamblin
100078ec3a codecov: disable inline annotations on PRs (#24362)
Inline codecov annotations make the code hard to read, and they add annotations
in files that seemingly have nothing to do with the PR. Sadly, they add a whole
lot of noise and not a lot of benefit over looking at the PR on codecov. We
should just have people look at the coverage on codecov itself.
2021-06-17 12:22:23 -06:00
Adam J. Stewart
54d8fea9fc MAGMA: add v2.6.0, sm_37 support (#24383) 2021-06-17 11:10:39 -06:00
Adam J. Stewart
3eee93ee76 ONNX: add new package (#24384) 2021-06-17 11:07:35 -06:00
Nick Forrington
b5cb75e5ec arm-forge: add v21.0.2 and variant to detect PMU counters (#24298) 2021-06-17 05:55:32 -06:00
Harmen Stoppels
75675de02a Fix an issue where cray module files may not have CRAY_MPICH_DIR set (#24267) 2021-06-17 04:28:25 -06:00
Tom Payerle
b8ad621907 vtk: patch to replace use of FT_CALLBACK_DEF (#24238) 2021-06-17 04:19:23 -06:00
Andreas Baumbach
eac757da8c New package: py-pyusb (#23733)
* New package: py-pyusb

Change-Id: I606127858b961b5841c60befc5a8353df0f9f38c

* fixup dependencies

Change-Id: I0c9b0ccee693d2c4e847717950d4ce64cb319794

* fixup 2

Change-Id: Ibaccbdafd865e363564f491054e4e4ceb778727b

* Update var/spack/repos/builtin/packages/py-pyusb/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-06-17 03:58:21 -06:00
Harmen Stoppels
011a940f44 Break llvm-amdgpu circular dependency with rocm-device-libs (#23859) 2021-06-17 03:52:33 -06:00
Anton Kozhevnikov
85c5589620 add -fallow-argument-mismatch flag for gcc10 (#24354) 2021-06-17 03:40:27 -06:00
Scott Wittenburg
ee9b1a6ea5 ci: add all locally computed hashes as job variables (#24359) 2021-06-17 03:37:31 -06:00
Jen Herting
986776c937 New package: py-pyscreeze (#24251)
* [py-pyscreeze] created template

* [py-pyscreeze] added dependencies

* [py-pyscreeze] depends on scrot

* [py-pyscreeze] Final cleanup

- added homepage
- added description
- removed fixmes

* [py-pyscreeze] using pil provider
2021-06-17 03:31:33 -06:00
Howard Pritchard
2739edd42c open mpi: add v4.0.6 and fix a bug (#24344)
A patch no longer applies cleanly as its fixed in v4.0.6 - fix it here

==> Installing openmpi-4.0.6-in47f6rxspbnyibkdx6x4ekg6piujobd
==> No binary for openmpi-4.0.6-in47f6rxspbnyibkdx6x4ekg6piujobd found: installing from source
==> Fetching https://download.open-mpi.org/release/open-mpi/v4.0/openmpi-4.0.6.tar.bz2
Reversed (or previously applied) patch detected!  Assume -R? [n]
Apply anyway? [n]
2 out of 2 hunks ignored -- saving rejects to file opal/include/opal/sys/gcc_builtin/atomic.h.rej

Signed-off-by: Howard Pritchard <howardp@lanl.gov>
2021-06-17 03:28:37 -06:00
Harmen Stoppels
e4d80c997a Unset LD_PRELOAD and DYLD_INSERT_LIBRARIES (#24177)
When running executables from build dependencies, we want to avoid
`LD_PRELOAD` and `DYLD_INSERT_LIBRARIES` mixing any of their shared
libraries built by Spack with system libraries.
2021-06-17 09:25:19 +00:00
Manuela Kuhn
3571c1b812 nss: add nssutil3 library to pkg-config (#24364)
This is needed for qt+webkit to build correctly.
As reference the debian package was taken:
https://salsa.debian.org/mozilla-team/nss/-/blob/master/debian/nss.pc.in
2021-06-17 03:19:31 -06:00
snehring
7831d6be75 bowtie2: add constraints for the simde dependency (#24226) 2021-06-17 01:52:27 -06:00
Asher Mancinelli
c8f58c5f1d hiop: add v0.4.4, use commits for tags (#24365) 2021-06-17 08:53:31 +02:00
Adam J. Stewart
56f1904538 NNPACK: add new package (#24333) 2021-06-17 00:01:24 -06:00
Adam J. Stewart
10608edd24 py-peachpy: add new package (#24373) 2021-06-16 23:34:18 -06:00
Adam J. Stewart
3adee93d14 py-opcodes: add new package (#24372) 2021-06-17 07:04:38 +02:00
John Jolly
d31d339bf6 z3: update package to use CMake build system (#24337)
The Z3 solver provides a Z3Config.cmake file when built using the CMake build
system. This submission changes the package build system to inherit the
CMakePackage type. In addition to changing the build system, this submission:

- Adds the GMP variant
- Removes v4.4.0 and v4.4.1 as CMake was implemented starting with v4.5.0
2021-06-17 07:03:37 +02:00
Adam J. Stewart
5692c15e3a TensorPipe: add new package (#24335)
* TensorPipe: add new package

* Add libuv dependency

* Add min supported version of libuv
2021-06-16 16:36:48 -05:00
Adam J. Stewart
f0a85059c2 XNNPACK: add new package (#24334)
* XNNPACK: add new package

* XNNPACK: add resources
2021-06-16 16:35:56 -05:00
Adam J. Stewart
551ae264fe gloo: add py-torch submodule commits (#24330)
* gloo: add py-torch submodule commits

* gloo: add new version

* gloo: add master branch

* gloo: use Ninja generator
2021-06-16 10:06:40 -05:00
OliverPerks
a92bed0dc5 openssl: architecture check is now based on spec target (#24228) 2021-06-16 08:10:43 -06:00
Nicolas Cornu
3f9f2c2abe eigen: fix build with nvhpc (#24253) 2021-06-16 07:58:31 -06:00
Adam J. Stewart
72c6fc2fda kineto: add new package (#24319) 2021-06-16 07:43:47 -06:00
Adam J. Stewart
38088dd898 FBGEMM: add new package (#24318) 2021-06-16 07:37:44 -06:00
Tim Haines
4f40454800 Dyninst: add v11.0.1 (#24322) 2021-06-16 07:34:39 -06:00
Glenn Johnson
46214b0caa Set r-chipseq to bioconductor format (#24315)
- added description
- converted to git from url
- set commit rather than sha256
2021-06-16 15:18:13 +02:00
Adam J. Stewart
ce0eb4862f QNNPACK: add py-torch submodule commits (#24329) 2021-06-16 14:57:37 +02:00
Steven Smith
058ae3f0fd ParFlow: add new package (#24331) 2021-06-16 14:56:36 +02:00
Glenn Johnson
2439b8d59c r-effects: new package (#24342) 2021-06-16 03:11:16 -06:00
archxlith
891207f20e kaldi: fix building with mkl (#24338) 2021-06-16 03:10:39 -06:00
archxlith
5ec708cb48 openfst: add v1.7.3 (#24339)
It's the highest version allowed by the kaldi package
2021-06-16 03:07:17 -06:00
snehring
822d6a93fb openmolcas: add v21.02, add mpi variant (#24343) 2021-06-16 10:37:36 +02:00
Adam J. Stewart
64f3e37479 cpuinfo: prevent downloads during build (#24345) 2021-06-16 08:28:25 +02:00
Adam J. Stewart
8a938978a4 pthreadpool: more specific resource destination (#24346) 2021-06-16 08:27:20 +02:00
Marc Fehling
a067b48112 p4est: add v2.3.2 (#24311) 2021-06-15 23:46:19 -06:00
Axel Huebl
ca1d1c427c openPMD-api: Build with Legacy API (#24341)
Allow to build with `^hdf5@1.12.0 api=v110` and `v18`.
2021-06-15 18:37:52 -07:00
eugeneswalker
b330474a13 e4s ci: specs: add datatransferkit (#24325) 2021-06-15 18:37:37 -07:00
Richarda Butler
1c44912f9b add irep and lua-lang virtual dependency (#22492)
This adds a package for `irep`, a tool for reading `lua` input decks from 
Fortran, C, and C++.

`irep` can be built with either `lua` or `luajit`.  To address this, we also add
a virtual package for lua called `lua-lang`.  `luajit` isn't, by default, a drop-in
replacement for `lua`, but we add a `+lualinks` variant to it that adds symlinks
that make it behave like `lua@5.1`.  With this variant enabled, it provides the
`lua-lang` virtual.  `lua` always provides `lua-lang`.

- [x] add `irep` package
- [x] add `+lualinks` variant to `lua-luajit`
- [x] create `lua-lang` virtual, provided by `lua` and `luajit+lualinks`

Co-authored-by: Kayla Richarda Butler <butler59@quartz1148.llnl.gov>
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2021-06-15 17:50:04 -07:00
Seth R. Johnson
5971372be7 cairo: fix gtkdocize patch (#24332)
Patch in #23971 was not correct
2021-06-15 16:08:56 -07:00
Vanessasaurus
53dae0040a adding spack upload command (#24321)
this will first support uploads for spack monitor, and eventually could be
used for other kinds of spack uploads

Signed-off-by: vsoch <vsoch@users.noreply.github.com>

Co-authored-by: vsoch <vsoch@users.noreply.github.com>
2021-06-15 14:36:02 -07:00
Adam J. Stewart
cdc28a9623 pthreadpool: add new package (#24324) 2021-06-15 14:58:37 -06:00
Thomas Madlener
d4e04f9410 dd4hep: add v1.17 and a patch for cmake issues (#24274) 2021-06-15 13:52:34 -06:00
Seth R. Johnson
1bf84d170f libdrm: fix one error, mark another conflict (#24309)
* libdrm: fix one configure error and require libpciaccess

Failure with `LIBS`: the linker can't find `-lrt` so configure fails on
darwin-bigsur %apple-clang@12.0.5
```
  >> 22    configure: error: in `/private/var/folders/gy/mrg1ffts2h945qj9k29s1l1dvvmbqb/T/s3j/spack-s
           tage/spack-stage-libdrm-2.4.100-ofhk6m25n2pi427ihnxmvjkfmgyzlrqc/spack-src':
  >> 23    configure: error: C compiler cannot create executables
     24    See `config.log' for more details

See build log for details:
  /var/folders/gy/mrg1ffts2h945qj9k29s1l1dvvmbqb/T/s3j/spack-stage/spack-stage-libdrm-2.4.100-ofhk6m25n2pi427ihnxmvjkfmgyzlrqc/spack-build-out.txt
```

* libpciaccess: Mark conflict with darwin

```
make[2]: *** [common_init.lo] Error 1
make[2]: *** Waiting for unfinished jobs....
common_interface.c:75:10: fatal error: 'sys/endian.h' file not found
         ^~~~~~~~~~~~~~
```
and
```
common_init.c:73:3: error: "Unsupported OS"
```
and others
2021-06-15 12:13:28 -06:00
Glenn Johnson
d7263b5da0 r-insight: new package (#24313) 2021-06-15 11:16:34 -06:00
Paul Romano
ba65cc73ef openmc: add v0.12.2, v0.12.1 (#24320) 2021-06-15 09:19:39 -06:00
eugeneswalker
c302887f9b openpmd-api: conflicts w hdf5 api=v110, v16, v18 (#24323)
* openpmd-api: conflicts w hdf5 api=v110, v16, v18
* Update var/spack/repos/builtin/packages/openpmd-api/package.py
* Add reference

Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>
2021-06-15 07:01:21 -06:00
Vanessasaurus
5521aae4f7 extending example for buildcaches (#22504)
* extending example for buildcaches

I was attempting to create a local build cache from a directory, and I found the
docs for both buildcaches and mirrors, but they did not make clear that the
url variable could be a local filesystem path. I am extending the buildcache docs
with an example of creating and interacting with one on the filesystem,
because I suspect other users will run into this need and possibly not find what
they are looking for.

Signed-off-by: vsoch <vsoch@users.noreply.github.com>

* adding as follows to spack mirror list

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>

Co-authored-by: vsoch <vsoch@users.noreply.github.com>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2021-06-14 21:46:27 -07:00
eugeneswalker
229247c899 e4s ci environment: packages: update to newer versions (#24308) 2021-06-14 19:26:30 -07:00
Hervé Yviquel
b92abd79ab paraver: rename package to wxparaver, add new versions and fix installation (#24307)
* update url, add all new versions and fix installation

* add wxparaver package and set the old paraver package as deprecated

* remove update of deprecated package

* remove old version from new wxparaver

* Update url

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-06-14 17:46:12 -06:00
David M. Rogers
26c645650d Made mxnet's cuda dependency conditional on +cuda. (#24305) 2021-06-14 15:23:09 -05:00
Vanessasaurus
39cdd085c9 adding more description to binary caches (#23934)
It is currently kind of confusing for the reader to distinguish spack buildcache install
from spack install, and it is not clear how to use a build cache once a mirror is added.
Hopefully this little bit of description can help (and I hope I got it right!)

Signed-off-by: vsoch <vsoch@users.noreply.github.com>

Co-authored-by: vsoch <vsoch@users.noreply.github.com>
2021-06-14 13:17:35 -07:00
Emil Briggs
8793d93e8c New package: rmgdft. (#23313) 2021-06-14 13:40:22 -06:00
QuellynSnead
34c9c89b55 Caliper: add v2.6.0 (#24303) 2021-06-14 13:04:14 -06:00
Robert Cohn
d993ee7972 oneAPI packages: fix install for python2 (#24296)
Fix platform detection logic to work for Python 2 and 3
2021-06-14 10:47:44 -07:00
Axel Huebl
22fe56ad24 HiPACE: new package (#24070)
Co-authored-by: Severin Diederichs <65728274+SeverinDiederichs@users.noreply.github.com>
2021-06-14 19:34:24 +02:00
Glenn Johnson
9cfcc16084 leptonica: add v1.81.0 and missing dependencies (#24302)
- add version 1.81.0
- add dependencies
  - giflib
  - jpeg
  - libpng
  - libtiff
  - libwebp
  - openjpeg
- build shared libs
2021-06-14 11:25:33 -06:00
Glenn Johnson
dcabbca1c5 libwebp: add v1.2.0 and new variants (#24301)
- add version 1.2.0
- add variants
    - giflib
    - jpeg
    - libpng
    - libtiff
2021-06-14 19:04:21 +02:00
Sergio
25bca688ce Fix the branch for the develop version of IOR (#24079) 2021-06-14 10:46:13 -06:00
eugeneswalker
0b769855a1 fast-global-file-status: depends_on libtool (#24293) 2021-06-14 10:01:29 -06:00
Jen Herting
0d73fd2b11 gnutls: added unconditional dependency on libidn2 (#21471) 2021-06-12 13:44:51 +02:00
Dominik Gresch
dc8626b801 IntelPackage: use 'version_yearlike' in check for libfabrics RPATH. (#16700)
Use the 'version_yearlike' attribute instead of 'version' to
check if the SPACK_COMPILER_EXTRA_RPATHS should be set to include
the built-in 'libfabrics'.
When using the bare 'version', the comparison is wrong when
building with 'intel-parallel-studio', which has the version
format '<edition>.YYYY.Nupdate', due to the leading '<edition>'.
2021-06-12 10:02:56 +00:00
Michael Kuhn
1b71d22194 nss: new package (#24288) 2021-06-12 03:19:13 -06:00
Greg Becker
95c9a031ee Ensure all roots of an installed environment are marked explicit in db (#24277) 2021-06-12 10:23:13 +02:00
Olivier Cessenat
d6cbf72b19 graphviz: add v2.47.2 (#24273)
Updated dependencies
2021-06-12 09:37:00 +02:00
Vanessasaurus
ae91d49f21 libinih: add new package (#24289)
xfsprogs currently fails to install with the error message:
FATAL ERROR: could not find a valid ini.h header.
Adding this package, libinih, and including it as
a dependency for xfsprogs seems to fix the issue. It could be
that we only need to add it for newer versions (if it worked before),
and maybe a maintainer can comment on that.

Signed-off-by: vsoch <vsoch@users.noreply.github.com>

Co-authored-by: vsoch <vsoch@users.noreply.github.com>
2021-06-12 00:31:29 -06:00
Adam J. Stewart
8a0a60c575 py-pytorch-sphinx-theme: add new package (#24283) 2021-06-12 08:15:55 +02:00
Michael Kuhn
163fe86bda nspr: add v4.31 (#24284) 2021-06-12 08:08:01 +02:00
Seth R. Johnson
8b75e81666 llvm: add conflicts for newer gcc (#24281)
Closes #24270
2021-06-11 17:07:14 -06:00
Robert Mijakovic
177750b215 cp2k: add v8.2 (#24265)
Co-authored-by: Robert Mijakovic <robert.mijakovic@lxp.lu>
2021-06-11 15:43:14 -06:00
Adrien Bernede
b0f348315c ipopt: add versions up to v3.13.0 (#16706)
Pagination on GitHub prevents Spack from easily parsing all available
versions. Also, due to the recent migration to GitHub, tarballs for
versions up to 3.12.13 have been regenerated, changing the hashes.

The current URL will apparently be supported, so we keep it, and give
the alternative one as a comment.
2021-06-11 14:36:50 -06:00
Adam J. Stewart
11f370e7be setup-env: allow users to skip module function setup (#24236)
* setup-env: allow users to skip module function setup

* Add documentation on SPACK_SKIP_MODULES
2021-06-11 19:19:24 +00:00
Tom Payerle
4a8785d371 [py-ipython]: Start py-backcall dependency at py-ipython@7.3.0: (#24279)
This should fix #24278
$INSTALLDIR/lib/python3.7/site-packages/IPython/core/events.py contains an
import from backcall even in @7.3.0, so dependency on py-backcall needs
to start earlier.
2021-06-11 19:11:44 +00:00
Manuela Kuhn
adc4699c3a qt+webkit: fix python2 dependency and add opengl conflict (#24276) 2021-06-11 17:24:28 +00:00
Andreas Baumbach
44a8e17549 py-distributed: restrict py-contextvars dep to newer versions (#23841)
* py-distributed: restrict py-contextvars dep to newer versions

Change-Id: I8a6d55b840309aa6966ab310d42252f45b0ef9c7

* fixup dependency on py-contextvars

Change-Id: I56b729394af0f5a6fc283344d8af87990c97426b
2021-06-11 10:15:55 -05:00
Tom Payerle
b4bf0c3476 texlive: restrict poppler version (#24222)
Restrict poppler version for texlive to poppler@:0.84
Should fix #19946

See also https://github.com/NixOS/nixpkgs/issues/79170
Looks like poppler@0.84 upgraded their header files to use the C++ cstdio
instead of the C stdio.h. Since TeX is using C, not C++, this causes problems.
2021-06-11 09:01:24 -06:00
William Downs
a588d5dc58 gchp: add v13.0.2 and dev branch (#24206) 2021-06-11 07:22:21 -06:00
ajaust
31c4cdf59c py-pyprecice: fix checksums of 2.2.0.1 and 2.2.1.1 (#24264) 2021-06-11 04:34:43 -06:00
Olivier Cessenat
98ee702b37 perl-cgi: add v4.53 (#24266) 2021-06-11 03:58:39 -06:00
Jen Herting
dbdf8f2ce7 scrot: new package (#24250) 2021-06-11 09:03:44 +00:00
iarspider
ea08e93f2f Display proper message when patch checksum doesn't match (#24229) 2021-06-11 10:31:33 +02:00
Olivier Cessenat
dcb3fbf98e visit-cgns: better sets VISIT_PLUGIN_DIR to fit VisIt dir struct (#23834) 2021-06-11 10:28:57 +02:00
Olivier Cessenat
202510869d visit-silo: better sets VISIT_PLUGIN_DIR to fit VisIt dir struct (#23833) 2021-06-11 10:28:31 +02:00
Rémi Lacroix
ed695f3267 iq-tree: add v2.1.3 (#24235) 2021-06-11 02:19:27 -06:00
Hervé Yviquel
a83b75b878 extrae: fix import and issue with pthread (#24220) 2021-06-11 10:12:18 +02:00
Mark W. Krentel
eefcd3d00d hpcviewer: add v2021.05 (#24225) 2021-06-11 10:11:06 +02:00
Rémi Lacroix
722376c201 mafft: add v7.481 (#24233) 2021-06-11 09:58:43 +02:00
Jen Herting
269615b9ca tophat: set the C++ standard (#21737) 2021-06-11 09:54:18 +02:00
Nikolay Petrov
ec2d8a1571 Fixing provides directive for intel-oneapi-onedal package (#24108) 2021-06-11 09:47:27 +02:00
Jen Herting
c630594092 giblib: new package (#24249) 2021-06-11 09:43:39 +02:00
Marc Fehling
e291fa1b1a p4est: use the official tarball release. (#24219) 2021-06-11 06:46:21 +00:00
Adam J. Stewart
8c7f94db1c py-fiscalyear: add v0.3.2 (#24262) 2021-06-11 08:27:16 +02:00
Michael Kuhn
f7391c1970 iwyu: add 0.16 (#24258) 2021-06-10 19:28:26 -04:00
Carson Woods
5926056f3a reframe: add v3.6.1, v3.6.2 (#24239) 2021-06-10 13:01:44 -06:00
Jen Herting
1c81438343 New package: py-python-xlib (#24150)
* [py-python-xlib] created template

* [py-python-xlib] added dependencies

* [py-python-xlib] Final cleanup

- added homepage
- added description
- removed fixmes

* [py-python-xlib] allowed newer versions of python
2021-06-10 12:58:21 -06:00
Peter Lindstrom
31bca57e89 zfp package: ensure openmp variant is processed (#24221)
* zfp: several package improvements

- add variants for build targets, language bindings, backends
- ensure selected variants are compatible with zfp version
- point to GitHub (not LLNL) tar balls
- add dependencies
- update link to homepage
- add maintainers

* zfp: address suggestions by Spack team

- use conflicts() instead of raising exceptions
- use define() and define_from_variant() where applicable

* Apply suggestions from code review

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>

* Fix ZFP OpenMP build.

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2021-06-10 10:25:46 -07:00
Jen Herting
9da1cb615f New package: py-keyboard (#23636)
* [py-keyboard] created template

* [py-keyboard]

- updated homepage
- added dependency for OSX
- added description
- removed fixmes

* [py-keyboard] Until py-pyobjc can be created, specifying conflict with platform=darwin

* [py-keyboard] is verb
2021-06-10 16:26:44 +00:00
Olivier Cessenat
e4a79dab47 groff: add v1.22.4, update recipe (#24213)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2021-06-10 16:58:22 +02:00
Olivier Cessenat
fd5b13b7a4 uchardet: new release + lib image was not found on Mac OSX (#24212)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2021-06-10 16:50:03 +02:00
Sebastian Schmitt
31c0bcf346 Update py-setuptools-scm (#24217) 2021-06-10 07:31:17 -05:00
Seth R. Johnson
849943c63d cairo: circumvent missing gtkdocize for autoconf 2.70+ (#23971) 2021-06-09 17:10:13 -06:00
Ryan May
47ef59c885 Bump MetPy to 1.0.1 (#23944) 2021-06-09 15:49:21 -06:00
mcuma
9c0fb86b48 openkim-models: fix typo in the latest version (#24209) 2021-06-09 20:21:12 +00:00
Robert Pavel
29c4d5901a Update of Flecsi Spackage (#24106)
* Update of Flecsi Spackage

Update of flecsi spackage to reconcile differences between flecsi@1:1.9
and flecsi@2: for future support purposes

* Removing Unnecessary Conditional

Removing unused conditional. Initially the plan was to switch based on
version in `cmake_args` but this was not necessary as build system
variable names remained mostly the same and conflicts prevent the rest.

For the most part, if a variant is there it does not need to check
against what version of the code is being built.

* Updated CI To Reconcile Flecsi Changes

Updated CI to target flecsi@1.4.2 which best matches the previous
release version and reconciled change in variant name
2021-06-09 11:03:50 -07:00
Olivier Cessenat
0dce021f94 jbigkit: adding the library spec for dependents (#24103) 2021-06-09 10:14:28 +02:00
Olivier Cessenat
8f34a66502 texlive: setup dependent build environment (#24102) 2021-06-09 09:46:21 +02:00
Cameron Smith
7499212bc1 pumi: support building tests (#24202)
fix sub directory path to meshes git submodule
2021-06-08 18:28:34 -06:00
Ryan Mast
7e9ed7e56d helics: Add version 2.7.1 (#24188) 2021-06-08 17:05:32 -07:00
snehring
75db07e674 texlive: update live version to 2021 (#24211) 2021-06-08 16:55:49 -07:00
Mark W. Krentel
968d393f6b intel-tbb: explicitly set OS var and pass to TBB (#23852)
The common.inc script in TBB uses the environ var 'OS' to determine
the platform it's on.  On Linux, this is normally empty and TBB falls
back to uname.  But some systems set this to 'CentOS Linux 8' which is
descriptive, but not exactly what common.inc is looking for.

Instead, take the value from python and explicitly set OS to what TBB
expects to avoid this problem.
2021-06-08 17:06:16 -05:00
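A minimal sketch of the idea described above (the helper name and dict-based interface are assumptions for illustration, not the package's actual code):

```python
import platform


def tbb_build_env(env):
    """Return a copy of env with 'OS' forced to the value TBB expects.

    Some systems export OS='CentOS Linux 8', which common.inc does not
    recognize; platform.system() yields the plain kernel name (e.g.
    'Linux' or 'Darwin') that common.inc's uname fallback would produce.
    """
    env = dict(env)
    env['OS'] = platform.system()
    return env
```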
eugeneswalker
ac3b46fc95 e4s ci: re-enable veloc builds after recent fixes (#24190) 2021-06-08 14:10:44 -07:00
Massimiliano Culpo
fd03d539cc mypy: add types-six to the list of installed packages (#24207)
See https://mypy-lang.blogspot.com/2021/05/the-upcoming-switch-to-modular-typeshed.html
for a broader explanation.
2021-06-08 20:46:25 +00:00
Scott Wittenburg
92bef1da6f Pipelines: Fix default generated rebuild job script (#24185) 2021-06-08 14:33:45 -06:00
Vanessasaurus
3291be6cb1 singularity: add singularityce fork (#24201)
Since the two packages share a common history, the installation
procedure has been factored into a common base class.

Signed-off-by: vsoch <vsoch@users.noreply.github.com>
2021-06-08 21:31:18 +02:00
Teodor Nikolov
d0fdbc1ab2 [umpire] Fix missing header (#24198) 2021-06-08 21:17:51 +02:00
Adam J. Stewart
92be358582 Python: fix +tkinter+tix support (#23980)
* Tcl: fix TCLLIBPATH

* Fix TCL|TK|TIX_LIBRARY paths

* Fix TCL_LIBRARY, no tcl8.6 subdir

* Don't rely on os.listdirs sorting

For tcl and tk, we also install the source directory, so there are
two init.tcl and tk.tcl locations. We want the one in lib/lib64,
which should come before the one in share.

* Add more patches

* Fix dylib on macOS

* Tk: add smoke tests

* Tix: add smoke test
2021-06-08 12:45:04 -05:00
Massimiliano Culpo
f33c4e7280 ASP-based solver: permit to use virtual specs in environments (#24199)
Extracting specs for the result of a solve has been factored
as a method into the asp.Result class. The method account for
virtual specs being passed as initial requests.
2021-06-08 19:04:49 +02:00
Massimiliano Culpo
e321578bbe ASP-based solver: reordered low priority optimization criteria (#24184)
Minimizing compiler mismatches in the DAG and preferring newer 
versions of packages are now higher priority than trying to use as 
many default values as possible in multi-valued variants.
2021-06-08 16:10:49 +02:00
Robert Mijakovic
a2e9a1b642 osu-micro-benchmarks: new version 5.7.1 (#23590) 2021-06-08 08:07:20 -06:00
Todd Gamblin
beed6047e8 macOS: add monterey as macOS version 12. (#24192) 2021-06-08 10:07:34 +00:00
Tamara Dahlgren
11fd88ee3c Update stand-alone tests to use test stage work directory; also added expected ctest output (#24191) 2021-06-08 03:31:40 -06:00
Glenn Johnson
418db4e910 gatk: make r a variant (#24189)
According to the docs, r is needed for plotting, but plotting is
untested. In addition, the specific version requirement of java for gatk
could lead to multiple installations of r being triggered in an
environment. That might cause people to have to be deliberate about
java in a deployment. All in all, it seems that r is better as a
variant for gatk.
2021-06-08 03:08:10 -06:00
Chris Richardson
f231ae97f4 Changes for ufl and basix (#24187) 2021-06-08 03:07:38 -06:00
Tamara Dahlgren
e1bd3ae4db biobambam2: update stand-alone tests to use test stage work directory (#24111) 2021-06-08 09:30:21 +02:00
Massimiliano Culpo
729d66a3f8 Fix brittle unit-tests on providers (#24186)
These tests were broken by #24176
2021-06-07 22:16:14 +02:00
Michael Kuhn
4d55203ce5 build_systems: Make autotools builds verbose (#24161)
This is also what our other build systems are doing.
2021-06-07 14:05:35 -04:00
Valentin Volkl
2bdeaa1b48 pkgconf: disable check due to missing dependencies (#24168) 2021-06-07 12:01:24 -06:00
Adam J. Stewart
3d0bad465b Docs: fix missing backtick in Environments docs (#24109) 2021-06-07 19:05:09 +02:00
Valentin Volkl
506d5744aa delphes: add v3.5.0 (#24101) 2021-06-07 19:04:34 +02:00
Tamara Dahlgren
026cf7aa30 Update stand-alone tests to use test stage work directory (#24110) 2021-06-07 18:11:10 +02:00
Harmen Stoppels
e12b030def mpich: fix constraints in provides directives (#24176) 2021-06-07 09:16:56 -06:00
Michael Kuhn
057bf434ce netdata: add v1.31.0 (#24158)
This also adds some missing dependencies for core functionality.
2021-06-07 17:13:49 +02:00
Glenn Johnson
004f86aab7 r-rpostgres: add new package (#22442) 2021-06-07 17:10:46 +02:00
Glenn Johnson
c01730e33b r-rmariadb: remove no longer needed patch (#24171)
Since PR #22873 was merged, the configure patch for r-rmariadb is no
longer needed. Also added comment regarding configure arguments.
2021-06-07 17:08:43 +02:00
psakievich
1fed008410 Add int64 back for hypre in nalu-wind (#24140)
To concretize the entire exawind environment together we need `hypre` to be built with the same flags for `nalu-wind` and `amr-wind`.  
@jrood-nrel
2021-06-07 07:35:26 -07:00
Chuck Atkins
b0590bf4e8 axl: Don't deprecate v0.3.0 as it's still actively required by veloc (#24179) 2021-06-07 07:35:08 -07:00
arjun-raj-kuppala
c2901ea14a AMD ROCm release 4.2.0: bump up mivisionx version and add gtk option to opencv (#23881)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2021-06-07 15:31:17 +02:00
Harmen Stoppels
b17046723d Revert "Bootstrap with -O3 in cmake (#23147)" (#24174)
This reverts commit fef05621a7.
2021-06-07 03:37:22 -06:00
Luca Heltai
f07be01fa8 arborx: enable trilinos 13+ (#24153) 2021-06-06 21:07:31 -06:00
Adam J. Stewart
559db31511 py-radiant-mlhub: add new package (#24167) 2021-06-06 21:04:30 -06:00
Adam J. Stewart
e1d194b9a3 NCCL: add new versions (#24155) 2021-06-06 22:22:21 -04:00
Peter Scheibel
6ed7d40be7 axom: don't require hdf5~shared (#24144) 2021-06-06 22:17:39 -04:00
Luca Heltai
1533c2fade dealii: add support for arborx (#24154) 2021-06-06 22:16:43 -04:00
Glenn Johnson
986bcef160 darshan-runtime: add SGE scheduler (#24160)
* Set job_id for SGE in darshan-runtime package
* Use a multi value variant for scheduler

Only one scheduler can be selected, so make it a multi-value variant
(with an enumerated value set) but set multi=False.
2021-06-06 19:15:40 -04:00
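The single-choice behavior can be modeled in plain Python (an illustrative sketch; the value set shown is assumed, and in the actual recipe this would be a Spack `variant(..., multi=False)` directive):

```python
def select_scheduler(value, allowed=('pbs', 'sge', 'slurm')):
    """Enumerated value set, but exactly one selection is permitted,
    mirroring a multi-value variant declared with multi=False."""
    if value not in allowed:
        raise ValueError('scheduler must be one of %s, got %r'
                         % (', '.join(allowed), value))
    return value
```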
Adam J. Stewart
09d317c293 py-pystac: add new package (#24166) 2021-06-06 19:11:08 -04:00
Adam J. Stewart
7093fb214f py-tqdm: add version 4.56.2 (#24165) 2021-06-06 14:46:23 -06:00
Paul
534df5cd68 Add Go 1.16.5 and 1.15.13 (#24170) 2021-06-06 17:43:59 +02:00
Chuck Atkins
6c21d64c50 veloc: The axl dependency had an API change in 0.4.0 that breaks veloc (#24157) 2021-06-05 18:25:20 -06:00
Adam J. Stewart
3db5029a4b GMT: add v6.2.0 (#24164) 2021-06-05 23:39:02 +00:00
Adam J. Stewart
7449d6950a Fix git_version on macOS (#24163) 2021-06-05 23:30:28 +00:00
Tom Payerle
9f8e40e95c hdf-eos5: Fix issue when linking against hdf5+szip (#23411) (#23412)
* hdf-eos5: Fix issue when linking against hdf5+szip (#23411)

Should fix issue #23411 when linking against hdf5+szip

Also fix bug if hdf5 does not depend on zlib

Reluctantly added payerle as a maintainer
2021-06-05 07:44:49 -04:00
Greg Becker
af3ebeeea1 apply default linux prefix inspections to all module sets (#24151) 2021-06-04 21:37:20 -06:00
Axel Huebl
42df61d631 py-warpx: 21.06 (#24152) 2021-06-04 21:50:48 -05:00
Tom Payerle
e741211c09 [py-cvxpy]: Add new version, fix depends_on versions (#24147)
Added version 1.1.13

Fixed versions for dependencies based on README.md for package
In particular:
   * versions 1.1.x require python@3, at least 3.4 and for 1.1.13 at least 3.6
   * py-osqp had been pinned to version 0.4.1, but README.md either shows
        no version restriction, or 0.4.1 and higher
   * @1.1.13 requires at least 1.1.6 of py-scs
   * I am assuming that, since 1.1.x is python@3 only, py-six is no longer required
        (it was not explicitly showing up in README.md for these versions)
2021-06-04 17:55:26 -06:00
Manuela Kuhn
4cc27f58db tk: fix url for patch (#24146) 2021-06-04 18:18:07 -05:00
Michael Kuhn
7d3a3af621 main, modules: fix module roots not being found (#24135)
Since the module roots were removed from the config file,
`--print-shell-vars` cannot find the module roots anymore. Fix it by
using the new `root_path` function. Moreover, the roots for lmod and
modules seem to have been flipped by accident.
2021-06-04 15:33:18 -07:00
Jen Herting
ff73ac6e9a New package: py-python3-xlib (#24139)
* [py-python3-xlib] created template

* [py-python3-xlib] requires python3

* [py-python3-xlib] Final cleanup

- added homepage
- added description
- removed fixmes

* [py-python3-xlib] added homepage

* [py-python3-xlib] removing homepage entirely
2021-06-04 20:28:14 +00:00
Tamara Dahlgren
5c37db5db3 libceed: Update the hip-related variant (#24142) 2021-06-04 12:45:10 -07:00
Joe Heaton
13978d68ea enable std c++14 (#24127) 2021-06-04 18:47:32 +00:00
arjun-raj-kuppala
54b9fe219b opencv: Adding gtk variant (#23937) 2021-06-04 12:51:07 -05:00
ajaust
1fd1f1c93f py-pyprecice: Add version 2.2.0.2 and 2.2.1.1 (#24138)
* add versions 2.2.0.2 and 2.2.1.1

* Add maintainer

Added Ishaan as additional maintainer as he is also maintainer of the Python bindings

* add new major precice version as dependency
2021-06-04 10:06:40 -05:00
Tamara Dahlgren
a0259cc4f4 Update stand-alone tests to use test stage work directory (#24130) 2021-06-04 07:28:03 -04:00
Michael Kuhn
d5d1d9548f cmd/stage: print stage path (#24019)
This is a small quality of life improvement so that users can easily
copy and paste the stage path after executing `spack stage spec`.
2021-06-04 11:18:55 +00:00
Desmond Orton
e28e6d2618 gengeo: new package (#24126) 2021-06-04 10:15:28 +02:00
Greg Becker
d8fc38a467 bugfix: modules relative to view use top-level view root, not implementation root (#24124) 2021-06-04 10:13:14 +02:00
Michael Kuhn
c4c14e0c69 libbson, mongo-c-driver: add v1.17.6 (#24134) 2021-06-04 08:03:26 +00:00
Adam J. Stewart
c09eea5947 Don't warn about missing source id for external packages (#24125) 2021-06-04 09:55:09 +02:00
Adam J. Stewart
b03049e938 m4: secure_snprintf.patch is no longer needed for 1.4.19 (#24113) 2021-06-04 09:51:43 +02:00
Tom Scogland
b63a8b3e27 Speed-up version parsing and comparison (#22973)
The VALID_VERSION regex didn't check that the version string was
completely valid, only that a prefix of it was. This version ensures
the entire string represents a valid version.

This makes a few related changes.

1. Make the SEGMENT_REGEX identify *which* arm it matches by what groups
   are populated, including whether it's a string or int component or a
   separator all at once.

2. Use the updated regex to parse the input once with a findall rather
   than twice, once with findall and once with split, since the version
   components and separators can be distinguished by their group status.

3. Rather than "convert to int, on exception stay string," if the int
   group is set then convert to int, if not then construct an instance
   of the VersionStrComponent class

4. VersionStrComponent now implements all of the special string
   comparison logic as part of its __lt__ and __eq__ methods to deal
   with infinity versions and also overloads comparison with integers.

5. Version now uses direct tuple comparison since it has no per-element
   special logic outside the VersionStrComponent class.

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2021-06-04 09:23:37 +02:00
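The steps above can be sketched roughly as follows (the pattern, class, and comparison rules are simplified for illustration; this is not Spack's actual implementation):

```python
import re

# One finditer pass; the populated group tells us whether each match is
# an integer component or a string component (separators are consumed
# by the same pattern rather than requiring a second split pass).
SEGMENT = re.compile(r'(?:(?P<int>[0-9]+)|(?P<str>[a-zA-Z]+))[_.-]*')


class StrComponent(object):
    """Carries the special string-comparison rules (simplified here:
    a string component sorts below any integer, so '1.2a' < '1.2.0')."""

    def __init__(self, s):
        self.s = s

    def __eq__(self, other):
        return isinstance(other, StrComponent) and self.s == other.s

    def __lt__(self, other):
        if isinstance(other, int):
            return True
        return self.s < other.s

    def __gt__(self, other):
        if isinstance(other, int):
            return False
        return self.s > other.s


def parse_version(v):
    """Tokenize once: int groups become ints, the rest StrComponents,
    so whole versions compare by plain tuple comparison."""
    parts = []
    for m in SEGMENT.finditer(v):
        if m.group('int') is not None:
            parts.append(int(m.group('int')))
        else:
            parts.append(StrComponent(m.group('str')))
    return tuple(parts)
```

With all per-element logic inside the component class, version ordering reduces to Python's built-in tuple comparison, e.g. `parse_version('1.2') < parse_version('1.10')`.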
Jen Herting
ea390198f4 New package: py-ibm-watson (#24115)
* [py-ibm-watson] created template

* [py-ibm-watson] added dependencies

* [py-ibm-watson] Final cleanup

- added homepage
- added description
- removed fixmes
2021-06-03 20:58:17 -06:00
Tamara Dahlgren
fb05d9830a formetis: Update stand-alone tests to use test stage work directory (#24133) 2021-06-04 02:31:51 +00:00
Tamara Dahlgren
4ad779c4c4 formetis: correct the sha256 (#24131) 2021-06-04 02:27:53 +00:00
Jen Herting
1775383f5f New package: py-ibm-cloud-sdk-core (#24114)
* [py-ibm-cloud-sdk-core] created template

* [py-ibm-cloud-sdk-core] added dependencies

* [py-ibm-cloud-sdk-core] set minimum python version

* [py-ibm-cloud-sdk-core] added version 3.10.0

* [py-ibm-cloud-sdk-core] Final Cleanup

- added homepage
- added description
- removed fixmes
2021-06-03 21:15:46 -05:00
Desmond Orton
a85bc4eee1 New package: py-haphpipe (#23769)
* New Package:py-haphpipe@1.0.3

* removed llvm restrict. & changed freebayes

* Style fix

* Removed pip, wheel, added url for deps list

* used proper gsutil naming

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* url src for deps, samtools fix

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2021-06-03 23:58:08 +00:00
Scott Wittenburg
9903d05be9 Pipelines: Fix generation when dep and pkg arch differ (#24089) 2021-06-03 19:26:43 -04:00
Tom Payerle
70c81069ab Julia zen patch (#24119)
Add LLVM-style CPU name for 'zen'
Map zen => znver1
Fix checksum for @1.5.4
2021-06-03 23:56:21 +02:00
Veselin Dobrev
e4a559a571 CEED v4.0 release (#22735)
* petsc: add hip variant

* libceed: add 0.8, disable occa by default, and let autodetect AVX

Disabling OCCA because backend updates did not make this release and
there are some known bugs so most users won't have reason to use OCCA.

https://github.com/CEED/libCEED/pull/688

* WIP: ceed: 4.0 release

* MFEM package updates (#19748)

* MFEM package updates

* mfem: flake8

* [mfem] Various fixes and tweaks.

[arpack-ng] Add a patch to fix building with IBM XL Fortran.

[libceed] Fix building with IBM XL C/C++.

[pumi] Add C++11 flag for version 2.2.3.

* [mfem] Fix the shared CUDA build.

Reported by: @MPhysXDev

* [mfem] Fix a TODO item

* [mfem] Tweak the AmgX dependencies

* [suite-sparse] Fix the version of the mpfr dependency

* MFEM: add initial HIP support using the ROCmPackage.

* MFEM: add 'slepc' variant.

* MFEM: update the patch for v4.2 for SLEPc.

* mfem: apply 'mfem-4.2-slepc.patch' just to v4.2.

* ceed: apply 'spack style'

* [mfem] Add a patch for mfem v4.2 to work with petsc v3.15.0.

[laghos] Add laghos version 3.1 based on the latest commit in
         the repository; this version works with mfem v4.2.

[ceed] For ceed v4.0 use laghos v3.1.

* [libceed] Explicitly set 'CC_VENDOR=icc' when using 'intel'
          compiler.

* [mfem] Allow pumi >= 2.2.3 with mfem >= 4.2.0.

[ceed] Use pumi v2.2.5 with ceed v4.0.0.

* [ceed] Explicitly use occa v1.1.0 with ceed v4.0.0.
       Use mfem@4.2.0+rocm with ceed@4.0.0+mfem+hip.

* [ceed] Add NekRS v21 as a dependency for ceed v4.0.0.

* [ceed] Fix NekRS version: 21 --> 21.0

* [ceed] Propagate +cuda variant to petsc for ceed v4.0.

* [mfem] Propagate '+rocm' variant to some other packages.

* [ceed] Use +rocm variant of nekrs instead of +hip.

* [ceed] Do not enable magma with ceed@4.0.0+hip.

* [libceed] Fix hip build with libceed@0.8.

* [laghos] For v3.1, use the release .tar.gz file instead of commit.

* Remove cuda & hip variants as they are inherited

* [ceed] Remove comments and FIXMEs about 'magma+hip'.

* [ceed] [libceed] Remove TODOs about occa + hip.

* libceed: use ROCmPackage and +rocm

* petsc: use ROCmPackage for HIP

* libceed, petsc: use CudaPackage

* ceed: forward cuda_arch and amdgpu_target

* [mfem] Use Spack's CudaPackage as a base class; as a result,
       'cuda_arch' values should not include the 'sm_' prefix.
       Also, propagate 'cuda_arch' and 'amdgpu_target' variants
       to enabled dependencies.

* petsc: variant is +rocm, package name is hip

Co-authored-by: Jed Brown <jed@jedbrown.org>
Co-authored-by: Thilina Rathnayake <thilinarmtb@gmail.com>
2021-06-03 11:32:31 -07:00
Manuela Kuhn
8aae76eee0 r-emmeans: add new package (#23991) 2021-06-03 10:37:22 -06:00
Weiqun Zhang
b83f06df0c amrex: add v21.06 and update maintainers (#24086)
Also add hip build dependency on rocprim.
2021-06-03 09:52:34 -06:00
Matthieu Dorier
7845da58a7 tkrzw: add new package (#24100) 2021-06-03 17:45:05 +02:00
Robert Mijakovic
54bce50a17 m4: add v1.4.19 (#24099)
Co-authored-by: Robert Mijakovic <robert.mijakovic@lxp.lu>
2021-06-03 15:01:29 +02:00
Olivier Cessenat
c3898ca3bf texlive: add v20210305 (#24068) 2021-06-03 04:38:06 -06:00
Tom Vander Aa
1efeb933ec intel-oneapi-mpi: fix mpicc and related scripts (#23955)
Replace I_MPI_SUBSTITUTE_INSTALLDIR with actual installation prefix
2021-06-03 04:37:35 -06:00
Harmen Stoppels
473e9aa08e Extend cuda conflicts to cray platform (#24057)
The CUDA compiler conflicts are valid on Cray too, and likely 
on Darwin x86_64 with %gcc and %clang too, so drop platform=linux
2021-06-03 03:46:27 -06:00
Desmond Orton
7e168b8535 py-gsutil: new package (#23970) 2021-06-03 03:40:45 -06:00
Manuela Kuhn
a8c7d9a2ed r-afex: add new package (#24004) 2021-06-03 03:22:37 -06:00
Seth R. Johnson
a478a8cf9a flibcpp: add v1.0.1 and smoke test (#24050) 2021-06-03 11:22:06 +02:00
Chuck Atkins
f7c9e497f1 ecp-data-vis-sdk: Disable +fortran for unifyfs (#24096) 2021-06-03 01:55:36 -06:00
eugeneswalker
ef9d3a464f hdf5: filter compiler wrapper: h5pcc, h5pfc (#24092) 2021-06-03 09:38:15 +02:00
Kai Torben Ohlhus
08a4212ec3 suite-sparse: add v5.10.0 and v5.10.1 (#24097)
Update homepage URL and see release notes:

- https://github.com/DrTimothyAldenDavis/SuiteSparse/releases/tag/v5.10.1
- https://github.com/DrTimothyAldenDavis/SuiteSparse/releases/tag/v5.10.0
2021-06-03 00:34:25 -06:00
Desmond Orton
038bd61e14 New Package:py-responses (#23972)
* New Package:py-responses

* fixed deps
2021-06-02 21:43:20 -06:00
Desmond Orton
b4e347d2ef New Package:py-cookies (#24084) 2021-06-02 22:20:56 -05:00
Manuela Kuhn
e1d578299e r-estimability: add new package (#23990) 2021-06-02 20:28:21 -06:00
Manuela Kuhn
3b148f1192 r-lmertest: add new package (#24000) 2021-06-02 21:03:56 -05:00
1170 changed files with 14093 additions and 5634 deletions


@@ -14,3 +14,8 @@ ignore:
- share/spack/qa/.*
comment: off
# Inline codecov annotations make the code hard to read, and they add
# annotations in files that seemingly have nothing to do with the PR.
github_checks:
annotations: false


@@ -1,6 +0,0 @@
FROM python:3.7-alpine
RUN pip install pygithub
ADD entrypoint.py /entrypoint.py
ENTRYPOINT ["/entrypoint.py"]


@@ -1,85 +0,0 @@
#!/usr/bin/env python
#
# Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Maintainer review action.
This action checks which packages have changed in a PR, and adds their
maintainers to the pull request for review.
"""
import json
import os
import re
import subprocess
from github import Github
def spack(*args):
    """Run the spack executable with arguments, and return the output split.

    This does just enough to run `spack pkg` and `spack maintainers`, the
    two commands used by this action.
    """
    github_workspace = os.environ['GITHUB_WORKSPACE']
    spack = os.path.join(github_workspace, 'bin', 'spack')
    output = subprocess.check_output([spack] + list(args))
    split = re.split(r'\s*', output.decode('utf-8').strip())
    return [s for s in split if s]


def main():
    # get these first so that we'll fail early
    token = os.environ['GITHUB_TOKEN']
    event_path = os.environ['GITHUB_EVENT_PATH']

    with open(event_path) as file:
        data = json.load(file)

    # make sure it's a pull_request event
    assert 'pull_request' in data

    # only request reviews on open, edit, or reopen
    action = data['action']
    if action not in ('opened', 'edited', 'reopened'):
        return

    # get data from the event payload
    pr_data = data['pull_request']
    base_branch_name = pr_data['base']['ref']
    full_repo_name = pr_data['base']['repo']['full_name']
    pr_number = pr_data['number']
    requested_reviewers = pr_data['requested_reviewers']
    author = pr_data['user']['login']

    # get a list of packages that this PR modified
    changed_pkgs = spack(
        'pkg', 'changed', '--type', 'ac', '%s...' % base_branch_name)

    # get maintainers for all modified packages
    maintainers = set()
    for pkg in changed_pkgs:
        pkg_maintainers = set(spack('maintainers', pkg))
        maintainers |= pkg_maintainers

    # remove any maintainers who are already on the PR, and the author,
    # as you can't review your own PR
    maintainers -= set(requested_reviewers)
    maintainers -= set([author])

    if not maintainers:
        return

    # request reviews from each maintainer
    gh = Github(token)
    repo = gh.get_repo(full_repo_name)
    pr = repo.get_pull(pr_number)
    pr.create_review_request(list(maintainers))


if __name__ == "__main__":
    main()


@@ -24,9 +24,9 @@ jobs:
pip install --upgrade pip
pip install --upgrade vermin
- name: vermin (Spack's Core)
run: vermin --backport argparse --backport typing -t=2.6- -t=3.5- -v lib/spack/spack/ lib/spack/llnl/ bin/
run: vermin --backport argparse --violations --backport typing -t=2.6- -t=3.5- -vvv lib/spack/spack/ lib/spack/llnl/ bin/
- name: vermin (Repositories)
run: vermin --backport argparse --backport typing -t=2.6- -t=3.5- -v var/spack/repos
run: vermin --backport argparse --violations --backport typing -t=2.6- -t=3.5- -vvv var/spack/repos
# Run style checks on the files that have been changed
style:
runs-on: ubuntu-latest
@@ -39,7 +39,7 @@ jobs:
python-version: 3.9
- name: Install Python packages
run: |
pip install --upgrade pip six setuptools flake8 mypy>=0.800 black
pip install --upgrade pip six setuptools flake8 isort>=4.3.5 mypy>=0.800 black types-six
- name: Setup git configuration
run: |
# Need this for the git tests to succeed.
@@ -129,8 +129,9 @@ jobs:
run: |
sudo apt-get -y update
# Needed for unit tests
sudo apt-get install -y coreutils gfortran graphviz gnupg2 mercurial
sudo apt-get install -y ninja-build patchelf
sudo apt-get -y install \
coreutils cvs gfortran graphviz gnupg2 mercurial ninja-build \
patchelf
# Needed for kcov
sudo apt-get -y install cmake binutils-dev libcurl4-openssl-dev
sudo apt-get -y install zlib1g-dev libdw-dev libiberty-dev
@@ -155,6 +156,8 @@ jobs:
make -C ${KCOV_ROOT}/build && sudo make -C ${KCOV_ROOT}/build install
- name: Bootstrap clingo from sources
if: ${{ matrix.concretizer == 'clingo' }}
env:
SPACK_PYTHON: python
run: |
. share/spack/setup-env.sh
spack external find --not-buildable cmake bison
@@ -162,6 +165,7 @@ jobs:
- name: Run unit tests (full suite with coverage)
if: ${{ needs.changes.outputs.with_coverage == 'true' }}
env:
SPACK_PYTHON: python
COVERAGE: true
SPACK_TEST_SOLVER: ${{ matrix.concretizer }}
run: |
@@ -171,6 +175,7 @@ jobs:
- name: Run unit tests (reduced suite without coverage)
if: ${{ needs.changes.outputs.with_coverage == 'false' }}
env:
SPACK_PYTHON: python
ONLY_PACKAGES: true
SPACK_TEST_SOLVER: ${{ matrix.concretizer }}
run: |
@@ -286,7 +291,7 @@ jobs:
shell: runuser -u spack-test -- bash {0}
run: |
source share/spack/setup-env.sh
spack unit-test -k 'not svn and not hg' -x --verbose
spack unit-test -k 'not cvs and not svn and not hg' -x --verbose
# Test for the clingo based solver (using clingo-cffi)
clingo-cffi:
needs: [ validate, style, documentation, changes ]
@@ -302,8 +307,9 @@ jobs:
run: |
sudo apt-get -y update
# Needed for unit tests
sudo apt-get install -y coreutils gfortran graphviz gnupg2 mercurial
sudo apt-get install -y ninja-build patchelf
sudo apt-get -y install \
coreutils cvs gfortran graphviz gnupg2 mercurial ninja-build \
patchelf
# Needed for kcov
sudo apt-get -y install cmake binutils-dev libcurl4-openssl-dev
sudo apt-get -y install zlib1g-dev libdw-dev libiberty-dev
@@ -364,7 +370,7 @@ jobs:
run: |
pip install --upgrade pip six setuptools
pip install --upgrade codecov coverage
pip install --upgrade flake8 pep8-naming mypy
pip install --upgrade flake8 isort>=4.3.5 mypy>=0.800
- name: Setup Homebrew packages
run: |
brew install dash fish gcc gnupg2 kcov

.gitignore vendored

@@ -508,4 +508,4 @@ $RECYCLE.BIN/
*.msp
# Windows shortcuts
*.lnk
*.lnk


@@ -14,9 +14,8 @@
# ~/.spack/modules.yaml
# -------------------------------------------------------------------------
modules:
default:
prefix_inspections:
lib:
- LD_LIBRARY_PATH
lib64:
- LD_LIBRARY_PATH
prefix_inspections:
lib:
- LD_LIBRARY_PATH
lib64:
- LD_LIBRARY_PATH


@@ -34,19 +34,21 @@ packages:
java: [openjdk, jdk, ibm-java]
jpeg: [libjpeg-turbo, libjpeg]
lapack: [openblas, amdlibflame]
lua-lang: [lua, lua-luajit]
mariadb-client: [mariadb-c-client, mariadb]
mkl: [intel-mkl]
mpe: [mpe2]
mpi: [openmpi, mpich]
mysql-client: [mysql, mariadb-c-client]
opencl: [pocl]
onedal: [intel-oneapi-dal]
osmesa: [mesa+osmesa, mesa18+osmesa]
pil: [py-pillow]
pkgconfig: [pkgconf, pkg-config]
rpc: [libtirpc]
scalapack: [netlib-scalapack, amdscalapack]
sycl: [hipsycl]
szip: [libszip, libaec]
szip: [libaec, libszip]
tbb: [intel-tbb]
unwind: [libunwind]
uuid: [util-linux-uuid, libuuid]


@@ -1730,6 +1730,39 @@ This issue typically manifests with the error below:
A nicer error message is TBD in future versions of Spack.
---------------
Troubleshooting
---------------
The ``spack audit`` command:
.. command-output:: spack audit -h
can be used to detect a number of configuration issues. This command detects
configuration settings which might not be strictly wrong but are not likely
to be useful outside of special cases.
It can also be used to detect dependency issues with packages - for example
cases where a package constrains a dependency with a variant that doesn't
exist (in this case Spack could report the problem ahead of time but
automatically performing the check would slow down most runs of Spack).
A detailed list of the checks currently implemented for each subcommand can be
printed with:
.. command-output:: spack -v audit list
Depending on the use case, users might run the appropriate subcommands to obtain
diagnostics. Issues, if found, are reported to stdout:
.. code-block:: console
% spack audit packages lammps
PKG-DIRECTIVES: 1 issue found
1. lammps: wrong variant in "conflicts" directive
the variant 'adios' does not exist
in /home/spack/spack/var/spack/repos/builtin/packages/lammps/package.py
------------
Getting Help

View File

@@ -31,9 +31,25 @@ Build caches are created via:
.. code-block:: console
$ spack buildcache create spec
$ spack buildcache create <spec>
If you wanted to create a build cache in a local directory, you would provide
the ``-d`` argument to target that directory, also specifying the spec.
Here is an example creating a local directory, "spack-cache" and creating
build cache files for the "ninja" spec:
.. code-block:: console
$ mkdir -p ./spack-cache
$ spack buildcache create -d ./spack-cache ninja
==> Buildcache files will be output to file:///home/spackuser/spack/spack-cache/build_cache
gpgconf: socketdir is '/run/user/1000/gnupg'
gpg: using "E6DF6A8BD43208E4D6F392F23777740B7DBD643D" as default secret key for signing
Note that the targeted spec must already be installed. Once you have a build cache,
you can add it as a mirror, discussed next.
---------------------------------------
Finding or installing build cache files
---------------------------------------
@@ -43,19 +59,98 @@ with:
.. code-block:: console
$ spack mirror add <name> <url>
Note that the url can be a web URL *or* a local filesystem location. In the previous
example, you might add the directory "spack-cache" and call it ``mymirror``:
Build caches are found via:
.. code-block:: console
$ spack buildcache list
$ spack mirror add mymirror ./spack-cache
Build caches are installed via:
You can see that the mirror is added with ``spack mirror list`` as follows:
.. code-block:: console
$ spack buildcache install
$ spack mirror list
mymirror file:///home/spackuser/spack/spack-cache
spack-public https://spack-llnl-mirror.s3-us-west-2.amazonaws.com/
At this point, you've created a buildcache, but Spack hasn't indexed it, so if
you run ``spack buildcache list`` you won't see any results. You need to index
this new build cache as follows:
.. code-block:: console
$ spack buildcache update-index -d spack-cache/
Now you can use list:
.. code-block:: console
$ spack buildcache list
==> 1 cached build.
-- linux-ubuntu20.04-skylake / gcc@9.3.0 ------------------------
ninja@1.10.2
Great! So now let's say you have a different spack installation, or perhaps just
a different environment for the same one, and you want to install a package from
that build cache. Let's first uninstall the actual library "ninja" to see if we can
re-install it from the cache.
.. code-block:: console
$ spack uninstall ninja
And now reinstall from the buildcache
.. code-block:: console
$ spack buildcache install ninja
==> buildcache spec(s) matching ninja
==> Fetching file:///home/spackuser/spack/spack-cache/build_cache/linux-ubuntu20.04-skylake/gcc-9.3.0/ninja-1.10.2/linux-ubuntu20.04-skylake-gcc-9.3.0-ninja-1.10.2-i4e5luour7jxdpc3bkiykd4imke3mkym.spack
####################################################################################################################################### 100.0%
==> Installing buildcache for spec ninja@1.10.2%gcc@9.3.0 arch=linux-ubuntu20.04-skylake
gpgconf: socketdir is '/run/user/1000/gnupg'
gpg: Signature made Tue 23 Mar 2021 10:16:29 PM MDT
gpg: using RSA key E6DF6A8BD43208E4D6F392F23777740B7DBD643D
gpg: Good signature from "spackuser (GPG created for Spack) <spackuser@noreply.users.github.com>" [ultimate]
It worked! You've just completed a full example of creating a build cache with
a spec of interest, adding it as a mirror, updating its index, listing the contents,
and finally, installing from it.
Note that the above command is intended to install a particular package to a
build cache you have created, and not to install a package from a build cache.
For the latter, once a mirror is added, by default when you do ``spack install`` the ``--use-cache``
flag is set, and you will install a package from a build cache if it is available.
If you want to always use the cache, you can do:
.. code-block:: console
$ spack install --cache-only <package>
For example, to combine all of the commands above to add the E4S build cache
and then install from it exclusively, you would do:
.. code-block:: console
$ spack mirror add E4S https://cache.e4s.io
$ spack buildcache keys --install --trust
$ spack install --cache-only <package>
We use ``--install`` and ``--trust`` to say that we are installing keys to our
keyring, and trusting all downloaded keys.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
List of popular build caches

View File

@@ -17,10 +17,10 @@
# All configuration values have a default; values that are commented out
# serve to show the default.
import sys
import os
import re
import subprocess
import sys
from glob import glob
from sphinx.ext.apidoc import main as sphinx_apidoc
@@ -82,6 +82,8 @@
# Disable duplicate cross-reference warnings.
#
from sphinx.domains.python import PythonDomain
class PatchedPythonDomain(PythonDomain):
def resolve_xref(self, env, fromdocname, builder, typ, target, node, contnode):
if 'refspecific' in node:
@@ -136,6 +138,7 @@ def setup(sphinx):
#
# The short X.Y version.
import spack
version = '.'.join(str(s) for s in spack.spack_version_info[:2])
# The full version, including alpha/beta/rc tags.
release = spack.spack_version
@@ -179,7 +182,8 @@ def setup(sphinx):
# We use our own extension of the default style with a few modifications
from pygments.style import Style
from pygments.styles.default import DefaultStyle
from pygments.token import Generic, Comment, Text
from pygments.token import Comment, Generic, Text
class SpackStyle(DefaultStyle):
styles = DefaultStyle.styles.copy()
@@ -188,6 +192,7 @@ class SpackStyle(DefaultStyle):
styles[Generic.Prompt] = "bold #346ec9"
import pkg_resources
dist = pkg_resources.Distribution(__file__)
sys.path.append('.') # make 'conf' module findable
ep = pkg_resources.EntryPoint.parse('spack = conf:SpackStyle', dist=dist)

View File

@@ -363,7 +363,7 @@ to ``spack install`` on the command line, ``--no-add`` is the default,
while for dependency specs on the other hand, it is optional. In other
words, if there is an unambiguous match in the active concrete environment
for a root spec provided to ``spack install`` on the command line, spack
does not require you to specify the ``--no-add` option to prevent the spec
does not require you to specify the ``--no-add`` option to prevent the spec
from being added again. At the same time, a spec that already exists in the
environment, but only as a dependency, will be added to the environment as a
root spec without the ``--no-add`` option.

View File

@@ -70,7 +70,13 @@ Sourcing these files will put the ``spack`` command in your ``PATH``, set
up your ``MODULEPATH`` to use Spack's packages, and add other useful
shell integration for :ref:`certain commands <packaging-shell-support>`,
:ref:`environments <environments>`, and :ref:`modules <modules>`. For
``bash``, it also sets up tab completion.
``bash`` and ``zsh``, it also sets up tab completion.
In order to know which directory to add to your ``MODULEPATH``, these scripts
query the ``spack`` command. On shared filesystems, this can be a bit slow,
especially if you log in frequently. If you don't use modules, or want to set
``MODULEPATH`` manually instead, you can set the ``SPACK_SKIP_MODULES``
environment variable to skip this step and speed up sourcing the file.
If you do not want to use Spack's shell support, you can always just run
the ``spack`` command directly from ``spack/bin/spack``.
@@ -1166,7 +1172,7 @@ the key that we just created:
60D2685DAB647AD4DB54125961E09BB6F2A0ADCB
uid [ultimate] dinosaur (GPG created for Spack) <dinosaur@thedinosaurthings.com>
Note that the name "dinosaur" can be seen under the uid, which is the unique
id. We might need this reference if we want to export or otherwise reference the key.
@@ -1205,7 +1211,7 @@ If you want to include the private key, then just add `--secret`:
$ spack gpg export --secret dinosaur.priv dinosaur
This will write the private key to the file `dinosaur.priv`.
.. warning::

View File

@@ -103,6 +103,140 @@ more tags to your build, you can do:
$ spack install --monitor --monitor-tags pizza,pasta hdf5
----------------------------
Monitoring with Containerize
----------------------------
The same argument group is available to add to a containerize command.
^^^^^^
Docker
^^^^^^
To add monitoring to a Docker container recipe generation using the defaults,
and assuming a monitor server running on localhost, you would
start with a spack.yaml in your present working directory:
.. code-block:: yaml
spack:
specs:
- samtools
And then do:
.. code-block:: console
# preview first
spack containerize --monitor
# and then write to a Dockerfile
spack containerize --monitor > Dockerfile
The install command will be edited to include commands for enabling monitoring.
However, getting secrets into the container for your monitor server is something
that should be done carefully. Specifically you should:
- Never try to define secrets as ENV, ARG, or using ``--build-arg``
- Do not try to get the secret into the container via a "temporary" file that you remove (it in fact will still exist in a layer)
Instead, it's recommended to use buildkit `as explained here <https://pythonspeed.com/articles/docker-build-secrets/>`_.
You'll need to again export environment variables for your spack monitor server:
.. code-block:: console
$ export SPACKMON_TOKEN=50445263afd8f67e59bd79bff597836ee6c05438
$ export SPACKMON_USER=spacky
And then use buildkit along with your build and identifying the name of the secret:
.. code-block:: console
$ DOCKER_BUILDKIT=1 docker build --secret id=st,env=SPACKMON_TOKEN --secret id=su,env=SPACKMON_USER -t spack/container .
The secrets are expected to come from your environment, and then will be temporarily mounted and available
at ``/run/secrets/<name>``. If you forget to supply them (and authentication is required) the build
will fail. If you need to build on your host (and interact with a spack monitor at localhost) you'll
need to tell Docker to use the host network:
.. code-block:: console
$ DOCKER_BUILDKIT=1 docker build --network="host" --secret id=st,env=SPACKMON_TOKEN --secret id=su,env=SPACKMON_USER -t spack/container .
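Inside the build, each ``--secret id=<name>`` ends up as a plain file under ``/run/secrets/<name>``. As a minimal sketch (a hypothetical helper, not part of Spack or of the generated recipe), reading such a mounted secret is just a file read:

```python
import os


def read_secret(secret_id, root="/run/secrets"):
    """Return the contents of a buildkit-mounted secret file.

    Hypothetical helper: buildkit mounts ``--secret id=st,...`` at
    ``/run/secrets/st`` for the duration of the RUN step that requests it.
    """
    with open(os.path.join(root, secret_id)) as f:
        # Secrets written from environment variables often carry a
        # trailing newline; strip surrounding whitespace.
        return f.read().strip()
```

The ``root`` parameter only exists so the helper can be exercised against a temporary directory; in a real build the default path is what buildkit provides.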
^^^^^^^^^^^
Singularity
^^^^^^^^^^^
To add monitoring to a Singularity container build, the spack.yaml needs to
be modified slightly to specify a different format:
.. code-block:: yaml
spack:
specs:
- samtools
container:
format: singularity
Again, generate the recipe:
.. code-block:: console
# preview first
$ spack containerize --monitor
# then write to a Singularity recipe
$ spack containerize --monitor > Singularity
Singularity doesn't have a direct way to define secrets at build time, so we
have to add a file manually, source the secrets in it, and then remove it.
Since Singularity doesn't have layers like Docker, deleting a file will truly
remove it from the container and history. So let's say we have this file,
``secrets.sh``:
.. code-block:: console
# secrets.sh
export SPACKMON_USER=spack
export SPACKMON_TOKEN=50445263afd8f67e59bd79bff597836ee6c05438
We would then generate the Singularity recipe, add a ``%files`` section,
source that file at the start of ``%post``, and **importantly**
remove the file at the end of that same section.
.. code-block::
Bootstrap: docker
From: spack/ubuntu-bionic:latest
Stage: build
%files
secrets.sh /opt/secrets.sh
%post
. /opt/secrets.sh
# spack install commands are here
...
# Don't forget to remove here!
rm /opt/secrets.sh
You can then build the container as you normally would.
.. code-block:: console
$ sudo singularity build container.sif Singularity
------------------
Monitoring Offline
------------------
@@ -117,4 +251,15 @@ flag.
$ spack install --monitor --monitor-save-local hdf5
This will save results in a subfolder, "monitor" in your designated spack
reports folder, which defaults to ``$HOME/.spack/reports/monitor``.
reports folder, which defaults to ``$HOME/.spack/reports/monitor``. When
you are ready to upload them to a spack monitor server:
.. code-block:: console
$ spack monitor upload ~/.spack/reports/monitor
You can choose the root directory of results as shown above, or a specific
subdirectory. The command accepts other arguments to specify configuration
for the monitor.

View File

@@ -920,12 +920,13 @@ For some packages, source code is provided in a Version Control System
(VCS) repository rather than in a tarball. Spack can fetch packages
from VCS repositories. Currently, Spack supports fetching with `Git
<git-fetch_>`_, `Mercurial (hg) <hg-fetch_>`_, `Subversion (svn)
<svn-fetch_>`_, and `Go <go-fetch_>`_. In all cases, the destination
<svn-fetch_>`_, `CVS (cvs) <cvs-fetch_>`_, and `Go <go-fetch_>`_.
In all cases, the destination
is the standard stage source path.
To fetch a package from a source repository, Spack needs to know which
VCS to use and where to download from. Much like with ``url``, package
authors can specify a class-level ``git``, ``hg``, ``svn``, or ``go``
authors can specify a class-level ``git``, ``hg``, ``svn``, ``cvs``, or ``go``
attribute containing the correct download location.
Many packages developed with Git have both a Git repository as well as
@@ -1173,6 +1174,55 @@ you can check out a branch or tag by changing the URL. If you want to
package multiple branches, simply add a ``svn`` argument to each
version directive.
.. _cvs-fetch:
^^^
CVS
^^^
CVS (Concurrent Versions System) is an old centralized version control
system. It is a predecessor of Subversion.
To fetch with CVS, use the ``cvs``, ``branch``, and ``date`` parameters.
The destination directory will be the standard stage source path.
Fetching the head
Simply add a ``cvs`` parameter to the package:
.. code-block:: python
class Example(Package):
cvs = ":pserver:outreach.scidac.gov/cvsroot%module=modulename"
version('1.1.2.4')
CVS repository locations are described using an older syntax that
is different from today's ubiquitous URL syntax. ``:pserver:``
denotes the transport method. CVS servers can host multiple
repositories (called "modules") at the same location, and one needs
to specify both the server location and the module name to access.
Spack combines both into one string using the ``%module=modulename``
suffix shown above.
This download method is untrusted.
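The ``%module=`` convention above can be sketched with a small helper (a hypothetical illustration, not Spack's actual fetcher code) that splits the combined string back into the CVSROOT and the module name:

```python
def split_cvs_location(location):
    """Split a Spack-style CVS location string into (cvsroot, module).

    Hypothetical helper: everything before '%module=' is the CVSROOT
    (e.g. ':pserver:host/path'), and everything after it is the module
    name hosted at that location.
    """
    root, _, module = location.partition("%module=")
    return root, module


root, module = split_cvs_location(
    ":pserver:outreach.scidac.gov/cvsroot%module=modulename")
print(root)    # :pserver:outreach.scidac.gov/cvsroot
print(module)  # modulename
```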
Fetching a date
Versions in CVS are commonly specified by date. To fetch a
particular branch or date, add a ``branch`` and/or ``date`` argument
to the version directive:
.. code-block:: python
version('2021.4.22', branch='branchname', date='2021-04-22')
Unfortunately, CVS does not identify repository-wide commits via a
revision or hash like Subversion, Git, or Mercurial do. This makes
it impossible to specify an exact commit to check out.
CVS has more features, but since CVS is rarely used these days, Spack
does not support all of them.
.. _go-fetch:
^^
@@ -1207,7 +1257,7 @@ Variants
Many software packages can be configured to enable optional
features, which often come at the expense of additional dependencies or
longer build times. To be flexible enough and support a wide variety of
use cases, Spack permits to expose to the end-user the ability to choose
use cases, Spack allows you to expose to the end-user the ability to choose
which features should be activated in a package at the time it is installed.
The mechanism to be employed is the :py:func:`spack.directives.variant` directive.
@@ -2725,6 +2775,57 @@ packages be built with MVAPICH and GCC.
See the :ref:`concretization-preferences` section for more details.
.. _group_when_spec:
----------------------------
Common ``when=`` constraints
----------------------------
In case a package needs many directives to share the whole ``when=``
argument, or just part of it, Spack allows you to group the common part
under a context manager:
.. code-block:: python
class Gcc(AutotoolsPackage):
with when('+nvptx'):
depends_on('cuda')
conflicts('@:6', msg='NVPTX only supported in gcc 7 and above')
conflicts('languages=ada')
conflicts('languages=brig')
conflicts('languages=go')
The snippet above is equivalent to the more verbose:
.. code-block:: python
class Gcc(AutotoolsPackage):
depends_on('cuda', when='+nvptx')
conflicts('@:6', when='+nvptx', msg='NVPTX only supported in gcc 7 and above')
conflicts('languages=ada', when='+nvptx')
conflicts('languages=brig', when='+nvptx')
conflicts('languages=go', when='+nvptx')
Constraints stemming from the context are added to what is explicitly present in the
``when=`` argument of a directive, so:
.. code-block:: python
with when('+elpa'):
depends_on('elpa+openmp', when='+openmp')
is equivalent to:
.. code-block:: python
depends_on('elpa+openmp', when='+openmp+elpa')
Constraints from nested context managers are also added together, but they are rarely
needed or recommended.
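The combination rule above can be illustrated with a minimal sketch (an assumption for illustration only, not Spack's actual implementation) of a ``when`` context manager that prepends its constraint to each directive's own ``when=`` argument:

```python
# Stack of constraints contributed by enclosing when() contexts.
_context_stack = []


class when(object):
    """Toy context manager: pushes its constraint while the block runs."""

    def __init__(self, spec):
        self.spec = spec

    def __enter__(self):
        _context_stack.append(self.spec)

    def __exit__(self, *exc_info):
        _context_stack.pop()


def effective_when(directive_when=""):
    # A directive's own when= constraint is combined with every
    # enclosing context, innermost first.
    return directive_when + "".join(reversed(_context_stack))


with when("+elpa"):
    combined = effective_when("+openmp")

print(combined)  # +openmp+elpa
```

Nested contexts simply push additional constraints onto the stack, which is why their constraints are "added together" as described above.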
.. _install-method:
------------------

View File

@@ -169,11 +169,28 @@ have disabled it (using ``rebuild-index: False``) because the index would only b
generated in the artifacts mirror anyway, and consequently would not be available
during subsequent pipeline runs.
.. note::
With the addition of reproducible builds (#22887) a previously working
pipeline will require some changes:
* In the build jobs (``runner-attributes``), the environment location changed.
This will typically show as a ``KeyError`` in the failing job. Be sure to
point to ``${SPACK_CONCRETE_ENV_DIR}``.
* When using ``include`` in your environment, be sure to make the included
files available in the build jobs. This means adding those files to the
artifact directory. Those files will also be missing in the reproducibility
artifact.
* Because the location of the environment changed, including files with
relative paths may have to be adapted to work both in the project context
(generation job) and in the concrete env dir context (build job).
-----------------------------------
Spack commands supporting pipelines
-----------------------------------
Spack provides a command ``ci`` command with a few sub-commands supporting spack
Spack provides a ``ci`` command with a few sub-commands supporting spack
ci pipelines. These commands are covered in more detail in this section.
.. _cmd-spack-ci:

View File

@@ -543,7 +543,8 @@ specified from the command line using the ``--projection-file`` option
to the ``spack view`` command.
The projections configuration file is a mapping of partial specs to
spec format strings, as shown in the example below.
spec format strings, defined by the :meth:`~spack.spec.Spec.format`
function, as shown in the example below.
.. code-block:: yaml

View File

@@ -11,7 +11,7 @@
* Homepage: https://pypi.python.org/pypi/archspec
* Usage: Labeling, comparison and detection of microarchitectures
* Version: 0.1.2 (commit 130607c373fd88cd3c43da94c0d3afd3a44084b0)
* Version: 0.1.2 (commit 26dec9d47e509daf8c970de4c89da200da52ad20)
argparse
--------

View File

@@ -1725,6 +1725,12 @@
"versions": ":",
"flags": "-march=armv8-a -mtune=generic"
}
],
"arm": [
{
"versions": ":",
"flags": "-march=armv8-a -mtune=generic"
}
]
}
},
@@ -1828,6 +1834,12 @@
"versions": "5:",
"flags": "-march=armv8.2-a+crc+crypto+fp16+sve"
}
],
"arm": [
{
"versions": "20:",
"flags": "-march=armv8.2-a+crc+crypto+fp16+sve"
}
]
}
},

View File

@@ -5,9 +5,9 @@
from __future__ import print_function
import re
import argparse
import errno
import re
import sys
from six import StringIO

View File

@@ -4,9 +4,9 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import collections
import errno
import hashlib
import glob
import grp
import hashlib
import itertools
import numbers
import os
@@ -19,10 +19,11 @@
from contextlib import contextmanager
import six
from llnl.util import tty
from llnl.util.lang import dedupe, memoized
from spack.util.executable import Executable
if sys.version_info >= (3, 3):
from collections.abc import Sequence # novm

View File

@@ -5,14 +5,15 @@
from __future__ import division
import functools
import inspect
import multiprocessing
import os
import re
import functools
import inspect
from datetime import datetime, timedelta
from six import string_types
import sys
from datetime import datetime, timedelta
from six import string_types
if sys.version_info < (3, 0):
from itertools import izip_longest # novm

View File

@@ -7,12 +7,12 @@
from __future__ import print_function
import filecmp
import os
import shutil
import filecmp
from llnl.util.filesystem import traverse_tree, mkdirp, touch
import llnl.util.tty as tty
from llnl.util.filesystem import mkdirp, touch, traverse_tree
__all__ = ['LinkTree']

View File

@@ -3,16 +3,16 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import fcntl
import errno
import time
import fcntl
import os
import socket
import time
from datetime import datetime
import llnl.util.tty as tty
import spack.util.string
__all__ = ['Lock', 'LockTransaction', 'WriteTransaction', 'ReadTransaction',
'LockError', 'LockTimeoutError',

View File

@@ -12,12 +12,13 @@
import termios
import textwrap
import traceback
import six
from datetime import datetime
import six
from six import StringIO
from six.moves import input
from llnl.util.tty.color import cprint, cwrite, cescape, clen
from llnl.util.tty.color import cescape, clen, cprint, cwrite
# Globals
_debug = 0

View File

@@ -10,10 +10,11 @@
import os
import sys
from six import StringIO, text_type
from llnl.util.tty import terminal_size
from llnl.util.tty.color import clen, cextra
from llnl.util.tty.color import cextra, clen
class ColumnConfig:

View File

@@ -60,9 +60,9 @@
To output an @, use '@@'. To output a } inside braces, use '}}'.
"""
from __future__ import unicode_literals
import re
import sys
from contextlib import contextmanager
import six

View File

@@ -13,15 +13,14 @@
import os
import re
import select
import signal
import sys
import traceback
import signal
from contextlib import contextmanager
from six import string_types
from six import StringIO
from typing import Optional # novm
from types import ModuleType # novm
from typing import Optional # novm
from six import StringIO, string_types
import llnl.util.tty as tty

View File

@@ -14,10 +14,10 @@
"""
from __future__ import print_function
import os
import signal
import multiprocessing
import os
import re
import signal
import sys
import termios
import time

View File

@@ -8,9 +8,9 @@
from llnl.util.lang import memoized
import spack.spec
from spack.compilers.clang import Clang
from spack.spec import CompilerSpec
from spack.util.executable import Executable, ProcessError
from spack.compilers.clang import Clang
class ABI(object):

View File

@@ -10,11 +10,10 @@
from __future__ import absolute_import
import spack.util.classes
import spack.paths
import llnl.util.tty as tty
import spack.paths
import spack.util.classes
mod_path = spack.paths.analyzers_path
analyzers = spack.util.classes.list_classes("spack.analyzers", mod_path)

View File

@@ -7,14 +7,15 @@
and (optionally) interact with a Spack Monitor
"""
import spack.monitor
import spack.hooks
import llnl.util.tty as tty
import spack.util.path
import spack.config
import os
import llnl.util.tty as tty
import spack.config
import spack.hooks
import spack.monitor
import spack.util.path
def get_analyzer_dir(spec, analyzer_dir=None):
"""

View File

@@ -8,11 +8,12 @@
directory."""
import spack.monitor
from .analyzer_base import AnalyzerBase
import os
import spack.monitor
from .analyzer_base import AnalyzerBase
class ConfigArgs(AnalyzerBase):

View File

@@ -8,11 +8,11 @@
an index of key, value pairs for environment variables."""
from .analyzer_base import AnalyzerBase
import os
from spack.util.environment import EnvironmentModifications
import os
from .analyzer_base import AnalyzerBase
class EnvironmentVariables(AnalyzerBase):

View File

@@ -8,11 +8,12 @@
analyzer folder for further processing."""
import spack.monitor
from .analyzer_base import AnalyzerBase
import os
import spack.monitor
from .analyzer_base import AnalyzerBase
class InstallFiles(AnalyzerBase):

View File

@@ -4,20 +4,20 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import spack
import spack.error
import spack.bootstrap
import spack.hooks
import spack.monitor
import spack.binary_distribution
import spack.package
import spack.repo
import os
import llnl.util.tty as tty
from .analyzer_base import AnalyzerBase
import spack
import spack.binary_distribution
import spack.bootstrap
import spack.error
import spack.hooks
import spack.monitor
import spack.package
import spack.repo
import os
from .analyzer_base import AnalyzerBase
class Libabigail(AnalyzerBase):

View File

@@ -60,20 +60,21 @@
import functools
import warnings
import archspec.cpu
import six
import llnl.util.tty as tty
import archspec.cpu
import llnl.util.lang as lang
import llnl.util.tty as tty
import spack.compiler
import spack.compilers
import spack.config
import spack.paths
import spack.error as serr
import spack.paths
import spack.util.classes
import spack.util.executable
import spack.version
import spack.util.classes
from spack.util.spack_yaml import syaml_dict

lib/spack/spack/audit.py Normal file
View File

@@ -0,0 +1,395 @@
# Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Classes and functions to register audit checks for various parts of
Spack and run them on-demand.
To register a new class of sanity checks (e.g. sanity checks for
compilers.yaml), the first action required is to create a new AuditClass
object:
.. code-block:: python
audit_cfgcmp = AuditClass(
tag='CFG-COMPILER',
description='Sanity checks on compilers.yaml',
kwargs=()
)
This object is to be used as a decorator to register functions
that will each perform a single check:
.. code-block:: python
@audit_cfgcmp
def _search_duplicate_compilers(error_cls):
pass
These functions need to take as argument the keywords declared when
creating the decorator object plus an ``error_cls`` argument at the
end, acting as a factory to create Error objects. It should return a
(possibly empty) list of errors.
Calls to each of these functions are triggered by the ``run`` method of
the decorator object, which will forward the keyword arguments passed
as input.
"""
import collections
import itertools
try:
from collections.abc import Sequence # novm
except ImportError:
from collections import Sequence
#: Map an audit tag to a list of callables implementing checks
CALLBACKS = {}
#: Map a group of checks to the list of related audit tags
GROUPS = collections.defaultdict(list)
class Error(object):
"""Information on an error reported in a test."""
def __init__(self, summary, details):
self.summary = summary
self.details = tuple(details)
def __str__(self):
return self.summary + '\n' + '\n'.join([
' ' + detail for detail in self.details
])
def __eq__(self, other):
if self.summary != other.summary or self.details != other.details:
return False
return True
def __hash__(self):
value = (self.summary, self.details)
return hash(value)
class AuditClass(Sequence):
def __init__(self, group, tag, description, kwargs):
"""Return an object that acts as a decorator to register functions
associated with a specific class of sanity checks.
Args:
group (str): group in which this check is to be inserted
tag (str): tag uniquely identifying the class of sanity checks
description (str): description of the sanity checks performed
by this tag
kwargs (tuple of str): keyword arguments that each registered
function needs to accept
"""
if tag in CALLBACKS:
msg = 'audit class "{0}" already registered'
raise ValueError(msg.format(tag))
self.group = group
self.tag = tag
self.description = description
self.kwargs = kwargs
self.callbacks = []
# Init the list of hooks
CALLBACKS[self.tag] = self
# Update the list of tags in the group
GROUPS[self.group].append(self.tag)
def __call__(self, func):
self.callbacks.append(func)
def __getitem__(self, item):
return self.callbacks[item]
def __len__(self):
return len(self.callbacks)
def run(self, **kwargs):
msg = 'please pass "{0}" as keyword arguments'
msg = msg.format(', '.join(self.kwargs))
assert set(self.kwargs) == set(kwargs), msg
errors = []
kwargs['error_cls'] = Error
for fn in self.callbacks:
errors.extend(fn(**kwargs))
return errors
def run_group(group, **kwargs):
"""Run the checks that are part of the group passed as argument.
Args:
group (str): group of checks to be run
**kwargs: keyword arguments forwarded to the checks
Returns:
List of (tag, errors) that failed.
"""
reports = []
for check in GROUPS[group]:
errors = run_check(check, **kwargs)
reports.append((check, errors))
return reports
def run_check(tag, **kwargs):
"""Run the checks associated with a single tag.
Args:
tag (str): tag of the check
**kwargs: keyword arguments forwarded to the checks
Returns:
Errors occurred during the checks
"""
return CALLBACKS[tag].run(**kwargs)
# TODO: For the generic check to be useful for end users,
# TODO: we need to implement hooks like described in
# TODO: https://github.com/spack/spack/pull/23053/files#r630265011
#: Generic checks relying on global state
generic = AuditClass(
group='generic',
tag='GENERIC',
description='Generic checks relying on global variables',
kwargs=()
)
#: Sanity checks on compilers.yaml
config_compiler = AuditClass(
group='configs',
tag='CFG-COMPILER',
description='Sanity checks on compilers.yaml',
kwargs=()
)
@config_compiler
def _search_duplicate_compilers(error_cls):
"""Report compilers with the same spec and two different definitions"""
import spack.config
errors = []
compilers = list(sorted(
spack.config.get('compilers'), key=lambda x: x['compiler']['spec']
))
for spec, group in itertools.groupby(
compilers, key=lambda x: x['compiler']['spec']
):
group = list(group)
if len(group) == 1:
continue
error_msg = 'Compiler defined multiple times: {0}'
try:
details = [str(x._start_mark).strip() for x in group]
except Exception:
details = []
errors.append(error_cls(
summary=error_msg.format(spec), details=details
))
return errors
#: Sanity checks on packages.yaml
config_packages = AuditClass(
group='configs',
tag='CFG-PACKAGES',
description='Sanity checks on packages.yaml',
kwargs=()
)
@config_packages
def _search_duplicate_specs_in_externals(error_cls):
"""Search for duplicate specs declared as externals"""
import spack.config
errors, externals = [], collections.defaultdict(list)
packages_yaml = spack.config.get('packages')
for name, pkg_config in packages_yaml.items():
# No externals can be declared under all
if name == 'all' or 'externals' not in pkg_config:
continue
current_externals = pkg_config['externals']
for entry in current_externals:
# Ask for the string representation of the spec to normalize
# aspects of the spec that may be represented in multiple ways
# e.g. +foo or foo=true
key = str(spack.spec.Spec(entry['spec']))
externals[key].append(entry)
for spec, entries in sorted(externals.items()):
# If there's a single external for a spec we are fine
if len(entries) < 2:
continue
# Otherwise we need to report an error
error_msg = 'Multiple externals share the same spec: {0}'.format(spec)
try:
lines = [str(x._start_mark).strip() for x in entries]
details = [
'Please remove all but one of the following entries:'
] + lines + [
'as they might result in non-deterministic hashes'
]
except TypeError:
details = []
errors.append(error_cls(summary=error_msg, details=details))
return errors
#: Sanity checks on package directives
package_directives = AuditClass(
group='packages',
tag='PKG-DIRECTIVES',
description='Sanity checks on specs used in directives',
kwargs=('pkgs',)
)
@package_directives
def _unknown_variants_in_directives(pkgs, error_cls):
"""Report unknown or wrong variants in directives for this package"""
import llnl.util.lang
import spack.repo
import spack.spec
errors = []
for pkg_name in pkgs:
pkg = spack.repo.get(pkg_name)
# Check "conflicts" directive
for conflict, triggers in pkg.conflicts.items():
for trigger, _ in triggers:
vrn = spack.spec.Spec(conflict)
try:
vrn.constrain(trigger)
except Exception as e:
msg = 'Generic error in conflict for package "{0}": '
errors.append(error_cls(msg.format(pkg.name), [str(e)]))
continue
errors.extend(_analyze_variants_in_directive(
pkg, vrn, directive='conflicts', error_cls=error_cls
))
# Check "depends_on" directive
for _, triggers in pkg.dependencies.items():
triggers = list(triggers)
for trigger in list(triggers):
vrn = spack.spec.Spec(trigger)
errors.extend(_analyze_variants_in_directive(
pkg, vrn, directive='depends_on', error_cls=error_cls
))
# Check "patch" directive
for _, triggers in pkg.provided.items():
triggers = [spack.spec.Spec(x) for x in triggers]
for vrn in triggers:
errors.extend(_analyze_variants_in_directive(
pkg, vrn, directive='patch', error_cls=error_cls
))
# Check "resource" directive
for vrn in pkg.resources:
errors.extend(_analyze_variants_in_directive(
pkg, vrn, directive='resource', error_cls=error_cls
))
return llnl.util.lang.dedupe(errors)
@package_directives
def _unknown_variants_in_dependencies(pkgs, error_cls):
"""Report unknown dependencies and wrong variants for dependencies"""
import spack.repo
import spack.spec
errors = []
for pkg_name in pkgs:
pkg = spack.repo.get(pkg_name)
filename = spack.repo.path.filename_for_package_name(pkg_name)
for dependency_name, dependency_data in pkg.dependencies.items():
# No need to analyze virtual packages
if spack.repo.path.is_virtual(dependency_name):
continue
try:
dependency_pkg = spack.repo.get(dependency_name)
except spack.repo.UnknownPackageError:
# This dependency is completely missing, so report
# and continue the analysis
summary = (pkg_name + ": unknown package '{0}' in "
"'depends_on' directive".format(dependency_name))
details = [
" in " + filename
]
errors.append(error_cls(summary=summary, details=details))
continue
for _, dependency_edge in dependency_data.items():
dependency_variants = dependency_edge.spec.variants
for name, value in dependency_variants.items():
try:
dependency_pkg.variants[name].validate_or_raise(
value, pkg=dependency_pkg
)
except Exception as e:
summary = (pkg_name + ": wrong variant used for a "
"dependency in a 'depends_on' directive")
error_msg = str(e).strip()
if isinstance(e, KeyError):
error_msg = ('the variant {0} does not '
'exist'.format(error_msg))
error_msg += " in package '" + dependency_name + "'"
errors.append(error_cls(
summary=summary, details=[error_msg, 'in ' + filename]
))
return errors
def _analyze_variants_in_directive(pkg, constraint, directive, error_cls):
import spack.variant
variant_exceptions = (
spack.variant.InconsistentValidationError,
spack.variant.MultipleValuesInExclusiveVariantError,
spack.variant.InvalidVariantValueError,
KeyError
)
errors = []
for name, v in constraint.variants.items():
try:
pkg.variants[name].validate_or_raise(v, pkg=pkg)
except variant_exceptions as e:
summary = pkg.name + ': wrong variant in "{0}" directive'
summary = summary.format(directive)
filename = spack.repo.path.filename_for_package_name(pkg.name)
error_msg = str(e).strip()
if isinstance(e, KeyError):
error_msg = 'the variant {0} does not exist'.format(error_msg)
err = error_cls(summary=summary, details=[
error_msg, 'in ' + filename
])
errors.append(err)
return errors
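The `@config_packages` / `@package_directives` decorators above follow a small registry pattern: each `AuditClass` instance collects callbacks, which are later all run with a shared `error_cls`. A minimal sketch of that pattern (the class shape and `run` signature here are assumptions for illustration, not Spack's actual API):

```python
class AuditClass:
    # Minimal decorator-registry sketch: instances are callable and
    # record each decorated check function for a later batch run.
    def __init__(self, tag):
        self.tag = tag
        self.callbacks = []

    def __call__(self, func):
        self.callbacks.append(func)
        return func

    def run(self, **kwargs):
        # Run every registered check with a shared error factory and
        # collect all reported errors.
        errors = []
        for fn in self.callbacks:
            errors.extend(fn(error_cls=str, **kwargs))
        return errors


audit = AuditClass(tag='DEMO')


@audit
def _check_positive(values, error_cls):
    return [error_cls('negative: {0}'.format(v)) for v in values if v < 0]


report = audit.run(values=[1, -2, 3])
```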

View File

@@ -4,22 +4,20 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import codecs
import glob
import hashlib
import json
import os
import re
import shutil
import sys
import tarfile
import tempfile
from contextlib import closing

import ruamel.yaml as yaml
from ordereddict_backport import OrderedDict
from six.moves.urllib.error import HTTPError, URLError
import llnl.util.lang
import llnl.util.tty as tty
@@ -29,19 +27,18 @@
import spack.config as config
import spack.database as spack_db
import spack.fetch_strategy as fs
import spack.mirror
import spack.relocate as relocate
import spack.util.file_cache as file_cache
import spack.util.gpg
import spack.util.spack_json as sjson
import spack.util.spack_yaml as syaml
import spack.util.url as url_util
import spack.util.web as web_util
from spack.caches import misc_cache_location
from spack.spec import Spec
from spack.stage import Stage
_build_cache_relative_path = 'build_cache'
_build_cache_keys_relative_path = '_pgp'

View File

@@ -5,6 +5,7 @@
import contextlib
import os
import sys
try:
import sysconfig # novm
except ImportError:

View File

@@ -33,44 +33,52 @@
calls you can make from within the install() function.
"""
import inspect
import multiprocessing
import os
import re
import shutil
import sys
import traceback
import types

from six import StringIO

import llnl.util.tty as tty
from llnl.util.filesystem import install, install_tree, mkdirp
from llnl.util.lang import dedupe
from llnl.util.tty.color import cescape, colorize
from llnl.util.tty.log import MultiProcessFd

import spack.architecture as arch
import spack.build_systems.cmake
import spack.build_systems.meson
import spack.config
import spack.install_test
import spack.main
import spack.package
import spack.paths
import spack.repo
import spack.schema.environment
import spack.store
import spack.subprocess_context
import spack.util.path
from spack.error import NoHeadersError, NoLibrariesError
from spack.util.cpus import cpus_available
from spack.util.environment import (
    EnvironmentModifications,
    env_flag,
    filter_system_paths,
    get_path,
    is_system_path,
    preserve_environment,
    system_dirs,
    validate,
)
from spack.util.executable import Executable
from spack.util.log_parse import make_log_context, parse_log_events
from spack.util.module_cmd import load_module, module, path_from_modules
from spack.util.string import plural
#
# This can be set by the user to globally disable parallel builds.
#
@@ -78,7 +86,7 @@
#
# These environment variables are set by
# set_wrapper_variables and used to pass parameters to
# Spack's compiler wrappers.
#
SPACK_ENV_PATH = 'SPACK_ENV_PATH'
@@ -159,6 +167,12 @@ def clean_environment():
env.unset('CPLUS_INCLUDE_PATH')
env.unset('OBJC_INCLUDE_PATH')
env.unset('CMAKE_PREFIX_PATH')
# Avoid that libraries of build dependencies get hijacked.
env.unset('LD_PRELOAD')
env.unset('DYLD_INSERT_LIBRARIES')
# On Cray "cluster" systems, unset CRAY_LD_LIBRARY_PATH to avoid
# interference with Spack dependencies.
# CNL requires these variables to be set (or at least some of them,
@@ -306,111 +320,20 @@ def set_compiler_environment_variables(pkg, env):
return env
def set_wrapper_variables(pkg, env):
"""Set environment variables used by the Spack compiler wrapper
(which have the prefix `SPACK_`) and also add the compiler wrappers
to PATH.
This determines the injected -L/-I/-rpath options; each
of these specifies a search order and this function computes these
options in a manner that is intended to match the DAG traversal order
in `modifications_from_dependencies`: that method uses a post-order
traversal so that `PrependPath` actions from dependencies take lower
precedence; we use a post-order traversal here to match the visitation
order of `modifications_from_dependencies` (so we are visiting the
lowest priority packages first).
"""
# Add dependencies to CMAKE_PREFIX_PATH
env.set_path('CMAKE_PREFIX_PATH', get_cmake_prefix_path(pkg))
# Set environment variables if specified for
# the given compiler
compiler = pkg.compiler
@@ -420,16 +343,6 @@ def set_build_environment_variables(pkg, env, dirty):
extra_rpaths = ':'.join(compiler.extra_rpaths)
env.set('SPACK_COMPILER_EXTRA_RPATHS', extra_rpaths)
# Add spack build environment path with compiler wrappers first in
# the path. We add the compiler wrapper path, which includes default
# wrappers (cc, c++, f77, f90), AND a subdirectory containing
@@ -449,6 +362,7 @@ def set_build_environment_variables(pkg, env, dirty):
if os.path.isdir(ci):
env_paths.append(ci)
tty.debug("Adding compiler bin/ paths: " + " ".join(env_paths))
for item in env_paths:
env.prepend_path('PATH', item)
env.set_path(SPACK_ENV_PATH, env_paths)
@@ -467,14 +381,69 @@ def set_build_environment_variables(pkg, env, dirty):
raise RuntimeError("No ccache binary found in PATH")
env.set(SPACK_CCACHE_BINARY, ccache)
# Gather information about various types of dependencies
link_deps = set(pkg.spec.traverse(root=False, deptype=('link')))
rpath_deps = get_rpath_deps(pkg)
link_dirs = []
include_dirs = []
rpath_dirs = []
def _prepend_all(list_to_modify, items_to_add):
# Update the original list (creating a new list would be faster but
# may not be convenient)
for item in reversed(list(items_to_add)):
list_to_modify.insert(0, item)
def update_compiler_args_for_dep(dep):
if dep in link_deps and (not is_system_path(dep.prefix)):
query = pkg.spec[dep.name]
dep_link_dirs = list()
try:
dep_link_dirs.extend(query.libs.directories)
except NoLibrariesError:
tty.debug("No libraries found for {0}".format(dep.name))
for default_lib_dir in ['lib', 'lib64']:
default_lib_prefix = os.path.join(
dep.prefix, default_lib_dir)
if os.path.isdir(default_lib_prefix):
dep_link_dirs.append(default_lib_prefix)
_prepend_all(link_dirs, dep_link_dirs)
if dep in rpath_deps:
_prepend_all(rpath_dirs, dep_link_dirs)
try:
_prepend_all(include_dirs, query.headers.directories)
except NoHeadersError:
tty.debug("No headers found for {0}".format(dep.name))
for dspec in pkg.spec.traverse(root=False, order='post'):
if dspec.external:
update_compiler_args_for_dep(dspec)
# Just above, we prepended entries for -L/-rpath for externals. We
# now do this for non-external packages so that Spack-built packages
# are searched first for libraries etc.
for dspec in pkg.spec.traverse(root=False, order='post'):
if not dspec.external:
update_compiler_args_for_dep(dspec)
# The top-level package is always RPATHed. It hasn't been installed yet
# so the RPATHs are added unconditionally (e.g. even though lib64/ may
# not be created for the install).
for libdir in ['lib64', 'lib']:
lib_path = os.path.join(pkg.prefix, libdir)
rpath_dirs.insert(0, lib_path)
link_dirs = list(dedupe(filter_system_paths(link_dirs)))
include_dirs = list(dedupe(filter_system_paths(include_dirs)))
rpath_dirs = list(dedupe(filter_system_paths(rpath_dirs)))
env.set(SPACK_LINK_DIRS, ':'.join(link_dirs))
env.set(SPACK_INCLUDE_DIRS, ':'.join(include_dirs))
env.set(SPACK_RPATH_DIRS, ':'.join(rpath_dirs))
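The two traversal passes above rely on `_prepend_all` keeping each batch's internal order while pushing the whole batch in front of everything collected so far, so directories added in the second (non-external) pass end up first in the search order. A self-contained illustration using the same helper (the paths are invented):

```python
def _prepend_all(list_to_modify, items_to_add):
    # Prepend items_to_add to the front of list_to_modify while
    # preserving the relative order within items_to_add.
    for item in reversed(list(items_to_add)):
        list_to_modify.insert(0, item)


link_dirs = []
# First pass: directories from external packages.
_prepend_all(link_dirs, ['/usr/ext/lib'])
# Second pass: Spack-built packages, prepended later so they land
# earlier in the final search order.
_prepend_all(link_dirs, ['/spack/foo/lib', '/spack/bar/lib'])
```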
def determine_number_of_jobs(
@@ -712,15 +681,6 @@ def get_rpaths(pkg):
return list(dedupe(filter_system_paths(rpaths)))
def get_cmake_prefix_path(pkg):
build_deps = set(pkg.spec.dependencies(deptype=('build', 'test')))
link_deps = set(pkg.spec.traverse(root=False, deptype=('link')))
build_link_deps = build_deps | link_deps
build_link_deps = _place_externals_last(build_link_deps)
build_link_prefixes = filter_system_paths(x.prefix for x in build_link_deps)
return build_link_prefixes
def get_std_cmake_args(pkg):
"""List of standard arguments used if a package is a CMakePackage.
@@ -788,42 +748,40 @@ def load_external_modules(pkg):
def setup_package(pkg, dirty, context='build'):
"""Execute all environment setup routines."""
if context not in ['build', 'test']:
raise ValueError(
"'context' must be one of ['build', 'test'] - got: {0}"
.format(context))
set_module_variables_for_package(pkg)
env = EnvironmentModifications()
if not dirty:
clean_environment()
# setup compilers for build contexts
need_compiler = context == 'build' or (context == 'test' and
pkg.test_requires_compiler)
if need_compiler:
set_compiler_environment_variables(pkg, env)
set_wrapper_variables(pkg, env)
env.extend(modifications_from_dependencies(
pkg.spec, context, custom_mods_only=False))
# architecture specific setup
pkg.architecture.platform.setup_platform_environment(pkg, env)
if context == 'build':
pkg.setup_build_environment(env)
if (not dirty) and (not env.is_unset('CPATH')):
tty.debug("A dependency has updated CPATH, this may lead pkg-"
"config to assume that the package is part of the system"
" includes and omit it when invoked with '--cflags'.")
elif context == 'test':
import spack.user_environment as uenv  # avoid circular import
env.extend(uenv.environment_modifications_for_spec(pkg.spec))
pkg.setup_run_environment(env)
env.prepend_path('PATH', '.')
# Loading modules, in particular if they are meant to be used outside
@@ -865,39 +823,173 @@ def setup_package(pkg, dirty, context='build'):
env.apply_modifications()
def _make_runnable(pkg, env):
# Helper method which prepends a Package's bin/ prefix to the PATH
# environment variable
prefix = pkg.prefix
for dirname in ['bin', 'bin64']:
bin_dir = os.path.join(prefix, dirname)
if os.path.isdir(bin_dir):
env.prepend_path('PATH', bin_dir)
def modifications_from_dependencies(spec, context, custom_mods_only=True):
"""Returns the environment modifications that are required by
the dependencies of a spec and also applies modifications
to this spec's package at module scope, if need be.
Environment modifications include:
- Updating PATH so that executables can be found
- Updating CMAKE_PREFIX_PATH and PKG_CONFIG_PATH so that their respective
tools can find Spack-built dependencies
- Running custom package environment modifications
Custom package modifications can conflict with the default PATH changes
we make (specifically for the PATH, CMAKE_PREFIX_PATH, and PKG_CONFIG_PATH
environment variables), so this applies changes in a fixed order:
- All modifications (custom and default) from external deps first
- All modifications from non-external deps afterwards
With that order, `PrependPath` actions from non-external default
environment modifications will take precedence over custom modifications
from external packages.
A secondary constraint is that custom and default modifications are
grouped on a per-package basis: combined with the post-order traversal this
means that default modifications of dependents can override custom
modifications of dependencies (again, this would only occur for PATH,
CMAKE_PREFIX_PATH, or PKG_CONFIG_PATH).
Args:
spec (Spec): spec for which we want the modifications
context (str): one of 'build', 'run' or 'test', selecting build-time
or run-time modifications
"""
if context not in ['build', 'run', 'test']:
raise ValueError(
"Expecting context to be one of ['build', 'run', 'test'], "
"got: {0}".format(context))
env = EnvironmentModifications()
# Note: see computation of 'custom_mod_deps' and 'exe_deps' later in this
# function; these sets form the building blocks of those collections.
build_deps = set(spec.dependencies(deptype=('build', 'test')))
link_deps = set(spec.traverse(root=False, deptype='link'))
build_link_deps = build_deps | link_deps
build_and_supporting_deps = set()
for build_dep in build_deps:
build_and_supporting_deps.update(build_dep.traverse(deptype='run'))
run_and_supporting_deps = set(
spec.traverse(root=False, deptype=('run', 'link')))
test_and_supporting_deps = set()
for test_dep in set(spec.dependencies(deptype='test')):
test_and_supporting_deps.update(test_dep.traverse(deptype='run'))
# All dependencies that might have environment modifications to apply
custom_mod_deps = set()
if context == 'build':
custom_mod_deps.update(build_and_supporting_deps)
# Tests may be performed after build
custom_mod_deps.update(test_and_supporting_deps)
else:
# test/run context
custom_mod_deps.update(run_and_supporting_deps)
if context == 'test':
custom_mod_deps.update(test_and_supporting_deps)
custom_mod_deps.update(link_deps)
# Determine 'exe_deps': the set of packages with binaries we want to use
if context == 'build':
exe_deps = build_and_supporting_deps | test_and_supporting_deps
elif context == 'run':
exe_deps = set(spec.traverse(deptype='run'))
elif context == 'test':
exe_deps = test_and_supporting_deps
def default_modifications_for_dep(dep):
if (dep in build_link_deps and
not is_system_path(dep.prefix) and
context == 'build'):
prefix = dep.prefix
env.prepend_path('CMAKE_PREFIX_PATH', prefix)
for directory in ('lib', 'lib64', 'share'):
pcdir = os.path.join(prefix, directory, 'pkgconfig')
if os.path.isdir(pcdir):
env.prepend_path('PKG_CONFIG_PATH', pcdir)
if dep in exe_deps and not is_system_path(dep.prefix):
_make_runnable(dep, env)
def add_modifications_for_dep(dep):
# Some callers of this function only want the custom modifications.
# For callers that want both custom and default modifications, we want
# to perform the default modifications here (this groups custom
# and default modifications together on a per-package basis).
if not custom_mods_only:
default_modifications_for_dep(dep)
# Perform custom modifications here (PrependPath actions performed in
# the custom method override the default environment modifications
# we do to help the build, namely for PATH, CMAKE_PREFIX_PATH, and
# PKG_CONFIG_PATH)
if dep in custom_mod_deps:
dpkg = dep.package
set_module_variables_for_package(dpkg)
# Allow dependencies to modify the module
dpkg.setup_dependent_package(spec.package.module, spec)
if context == 'build':
dpkg.setup_dependent_build_environment(env, spec)
else:
dpkg.setup_dependent_run_environment(env, spec)
# Note that we want to perform environment modifications in a fixed order.
# The Spec.traverse method provides this: i.e. in addition to
# the post-order semantics, it also guarantees a fixed traversal order
# among dependencies which are not constrained by post-order semantics.
for dspec in spec.traverse(root=False, order='post'):
if dspec.external:
add_modifications_for_dep(dspec)
for dspec in spec.traverse(root=False, order='post'):
# Default env modifications for non-external packages can override
# custom modifications of external packages (this can only occur
# for modifications to PATH, CMAKE_PREFIX_PATH, and PKG_CONFIG_PATH)
if not dspec.external:
add_modifications_for_dep(dspec)
return env
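The fixed ordering described above works because with `PrependPath`-style modifications the last prepend wins: processing externals first and non-externals second leaves Spack-built entries at the front of each variable. A minimal stand-in for `EnvironmentModifications.prepend_path` (not the real class) makes this concrete:

```python
def prepend_path(env, var, path, sep=':'):
    # Hypothetical stand-in for EnvironmentModifications.prepend_path:
    # each new entry goes in front of the existing value.
    env[var] = path + (sep + env[var] if env.get(var) else '')


env = {}
prepend_path(env, 'PATH', '/opt/external/bin')   # externals first
prepend_path(env, 'PATH', '/spack/pkg/bin')      # non-externals after
```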
def get_cmake_prefix_path(pkg):
# Note that unlike modifications_from_dependencies, this does not include
# any edits to CMAKE_PREFIX_PATH defined in custom
# setup_dependent_build_environment implementations of dependency packages
build_deps = set(pkg.spec.dependencies(deptype=('build', 'test')))
link_deps = set(pkg.spec.traverse(root=False, deptype=('link')))
build_link_deps = build_deps | link_deps
spack_built = []
externals = []
# modifications_from_dependencies updates CMAKE_PREFIX_PATH by first
# prepending all externals and then all non-externals
for dspec in pkg.spec.traverse(root=False, order='post'):
if dspec in build_link_deps:
if dspec.external:
externals.insert(0, dspec)
else:
spack_built.insert(0, dspec)
ordered_build_link_deps = spack_built + externals
build_link_prefixes = filter_system_paths(
x.prefix for x in ordered_build_link_deps)
return build_link_prefixes
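`get_cmake_prefix_path` above turns a post-order traversal into a priority list by `insert(0, ...)`-ing into two buckets and concatenating them: Spack-built prefixes first, externals last. A standalone sketch of that ordering (the dependency names are invented):

```python
def ordered_prefixes(deps):
    # deps: (name, is_external) pairs in post-order traversal order.
    # Mirrors the bucketing above: Spack-built entries first, externals
    # last, each bucket in reverse traversal order.
    spack_built, externals = [], []
    for name, is_external in deps:
        (externals if is_external else spack_built).insert(0, name)
    return spack_built + externals


order = ordered_prefixes([('zlib', False), ('cmake', True), ('hdf5', False)])
```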
def _setup_pkg_and_run(serialized_pkg, function, kwargs, child_pipe,
input_multiprocess_fd):

View File

@@ -6,6 +6,7 @@
# Why doesn't this work for me?
# from spack import *
from llnl.util.filesystem import filter_file
from spack.build_systems.autotools import AutotoolsPackage
from spack.directives import extends
from spack.package import ExtensionError

View File

@@ -7,13 +7,13 @@
import os
import os.path
import stat
from subprocess import PIPE, check_call
from typing import List # novm
import llnl.util.filesystem as fs
import llnl.util.tty as tty
from llnl.util.filesystem import force_remove, working_dir
from spack.package import PackageBase, run_after, run_before
from spack.util.executable import Executable
@@ -345,8 +345,11 @@ def build(self, spec, prefix):
"""Makes the build targets specified by
:py:attr:``~.AutotoolsPackage.build_targets``
"""
# See https://autotools.io/automake/silent.html
params = ['V=1']
params += self.build_targets
with working_dir(self.build_directory):
inspect.getmodule(self).make(*params)
def install(self, spec, prefix):
"""Makes the install targets specified by

View File

@@ -4,8 +4,8 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import llnl.util.tty as tty
from llnl.util.filesystem import install, mkdirp
from spack.build_systems.cmake import CMakePackage
from spack.package import run_after

View File

@@ -10,10 +10,11 @@
import re
from typing import List # novm
from llnl.util.filesystem import working_dir

import spack.build_environment
from spack.directives import conflicts, depends_on, variant
from spack.package import InstallError, PackageBase, run_after
# Regex to extract the primary generator from the CMake generator
# string.

View File

@@ -3,10 +3,9 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import spack.variant
from spack.directives import conflicts, depends_on, variant
from spack.package import PackageBase
class CudaPackage(PackageBase):
@@ -79,47 +78,46 @@ def cuda_flags(arch_list):
depends_on('cuda@11.0:', when='cuda_arch=80')
depends_on('cuda@11.1:', when='cuda_arch=86')
# From the NVIDIA install guide we know of conflicts for particular
# platforms (linux, darwin), architectures (x86, powerpc) and compilers
# (gcc, clang). We don't restrict %gcc and %clang conflicts to
# platform=linux, since they should also apply to platform=cray, and may
# apply to platform=darwin. We currently do not provide conflicts for
# platform=darwin with %apple-clang.
# Linux x86_64 compiler conflicts from here:
# https://gist.github.com/ax3l/9489132
conflicts('%gcc@5:', when='+cuda ^cuda@:7.5 target=x86_64:')
conflicts('%gcc@6:', when='+cuda ^cuda@:8 target=x86_64:')
conflicts('%gcc@7:', when='+cuda ^cuda@:9.1 target=x86_64:')
conflicts('%gcc@8:', when='+cuda ^cuda@:10.0.130 target=x86_64:')
conflicts('%gcc@9:', when='+cuda ^cuda@:10.2.89 target=x86_64:')
conflicts('%gcc@:4', when='+cuda ^cuda@11.0.2: target=x86_64:')
conflicts('%gcc@10:', when='+cuda ^cuda@:11.0.3 target=x86_64:')
conflicts('%gcc@11:', when='+cuda ^cuda@:11.1.0 target=x86_64:')
conflicts('%pgi@:14.8', when='+cuda ^cuda@:7.0.27 target=x86_64:')
conflicts('%pgi@:15.3,15.5:', when='+cuda ^cuda@7.5 target=x86_64:')
conflicts('%pgi@:16.2,16.0:16.3', when='+cuda ^cuda@8 target=x86_64:')
conflicts('%pgi@:15,18:', when='+cuda ^cuda@9.0:9.1 target=x86_64:')
conflicts('%pgi@:16,19:', when='+cuda ^cuda@9.2.88:10 target=x86_64:')
conflicts('%pgi@:17,20:',
          when='+cuda ^cuda@10.1.105:10.2.89 target=x86_64:')
conflicts('%pgi@:17,21:',
          when='+cuda ^cuda@11.0.2:11.1.0 target=x86_64:')
conflicts('%clang@:3.4', when='+cuda ^cuda@:7.5 target=x86_64:')
conflicts('%clang@:3.7,4:',
          when='+cuda ^cuda@8.0:9.0 target=x86_64:')
conflicts('%clang@:3.7,4.1:',
          when='+cuda ^cuda@9.1 target=x86_64:')
conflicts('%clang@:3.7,5.1:', when='+cuda ^cuda@9.2 target=x86_64:')
conflicts('%clang@:3.7,6.1:', when='+cuda ^cuda@10.0.130 target=x86_64:')
conflicts('%clang@:3.7,7.1:', when='+cuda ^cuda@10.1.105 target=x86_64:')
conflicts('%clang@:3.7,8.1:',
          when='+cuda ^cuda@10.1.105:10.1.243 target=x86_64:')
conflicts('%clang@:3.2,9:', when='+cuda ^cuda@10.2.89 target=x86_64:')
conflicts('%clang@:5', when='+cuda ^cuda@11.0.2: target=x86_64:')
conflicts('%clang@10:', when='+cuda ^cuda@:11.0.3 target=x86_64:')
conflicts('%clang@11:', when='+cuda ^cuda@:11.1.0 target=x86_64:')
# x86_64 vs. ppc64le differ according to NVidia docs
# Linux ppc64le compiler conflicts from Table from the docs below:
@@ -129,27 +127,26 @@ def cuda_flags(arch_list):
# https://docs.nvidia.com/cuda/archive/9.0/cuda-installation-guide-linux/index.html
# https://docs.nvidia.com/cuda/archive/8.0/cuda-installation-guide-linux/index.html
-arch_platform = ' target=ppc64le: platform=linux'
# information prior to CUDA 9 difficult to find
-conflicts('%gcc@6:', when='+cuda ^cuda@:9' + arch_platform)
-conflicts('%gcc@8:', when='+cuda ^cuda@:10.0.130' + arch_platform)
-conflicts('%gcc@9:', when='+cuda ^cuda@:10.1.243' + arch_platform)
+conflicts('%gcc@6:', when='+cuda ^cuda@:9 target=ppc64le:')
+conflicts('%gcc@8:', when='+cuda ^cuda@:10.0.130 target=ppc64le:')
+conflicts('%gcc@9:', when='+cuda ^cuda@:10.1.243 target=ppc64le:')
# officially, CUDA 11.0.2 only supports the system GCC 8.3 on ppc64le
-conflicts('%gcc@:4', when='+cuda ^cuda@11.0.2:' + arch_platform)
-conflicts('%gcc@10:', when='+cuda ^cuda@:11.0.2' + arch_platform)
-conflicts('%gcc@11:', when='+cuda ^cuda@:11.1.0' + arch_platform)
-conflicts('%pgi', when='+cuda ^cuda@:8' + arch_platform)
-conflicts('%pgi@:16', when='+cuda ^cuda@:9.1.185' + arch_platform)
-conflicts('%pgi@:17', when='+cuda ^cuda@:10' + arch_platform)
-conflicts('%clang@4:', when='+cuda ^cuda@:9.0.176' + arch_platform)
-conflicts('%clang@5:', when='+cuda ^cuda@:9.1' + arch_platform)
-conflicts('%clang@6:', when='+cuda ^cuda@:9.2' + arch_platform)
-conflicts('%clang@7:', when='+cuda ^cuda@10.0.130' + arch_platform)
-conflicts('%clang@7.1:', when='+cuda ^cuda@:10.1.105' + arch_platform)
-conflicts('%clang@8.1:', when='+cuda ^cuda@:10.2.89' + arch_platform)
-conflicts('%clang@:5', when='+cuda ^cuda@11.0.2:' + arch_platform)
-conflicts('%clang@10:', when='+cuda ^cuda@:11.0.2' + arch_platform)
-conflicts('%clang@11:', when='+cuda ^cuda@:11.1.0' + arch_platform)
+conflicts('%gcc@:4', when='+cuda ^cuda@11.0.2: target=ppc64le:')
+conflicts('%gcc@10:', when='+cuda ^cuda@:11.0.3 target=ppc64le:')
+conflicts('%gcc@11:', when='+cuda ^cuda@:11.1.0 target=ppc64le:')
+conflicts('%pgi', when='+cuda ^cuda@:8 target=ppc64le:')
+conflicts('%pgi@:16', when='+cuda ^cuda@:9.1.185 target=ppc64le:')
+conflicts('%pgi@:17', when='+cuda ^cuda@:10 target=ppc64le:')
+conflicts('%clang@4:', when='+cuda ^cuda@:9.0.176 target=ppc64le:')
+conflicts('%clang@5:', when='+cuda ^cuda@:9.1 target=ppc64le:')
+conflicts('%clang@6:', when='+cuda ^cuda@:9.2 target=ppc64le:')
+conflicts('%clang@7:', when='+cuda ^cuda@10.0.130 target=ppc64le:')
+conflicts('%clang@7.1:', when='+cuda ^cuda@:10.1.105 target=ppc64le:')
+conflicts('%clang@8.1:', when='+cuda ^cuda@:10.2.89 target=ppc64le:')
+conflicts('%clang@:5', when='+cuda ^cuda@11.0.2: target=ppc64le:')
+conflicts('%clang@10:', when='+cuda ^cuda@:11.0.3 target=ppc64le:')
+conflicts('%clang@11:', when='+cuda ^cuda@:11.1.0 target=ppc64le:')
# Intel is mostly relevant for x86_64 Linux, even though it also
# exists for Mac OS X. No information prior to CUDA 3.2 or Intel 11.1
@@ -171,15 +168,8 @@ def cuda_flags(arch_list):
conflicts('%xl@:12,14:15,17:', when='+cuda ^cuda@9.2')
conflicts('%xl@:12,17:', when='+cuda ^cuda@:11.1.0')
# Mac OS X
-# platform = ' platform=darwin'
-# Apple XCode clang vs. LLVM clang are difficult to specify
-# with spack syntax. Xcode clang name is `clang@x.y.z-apple`
-# which precludes ranges being specified. We have proposed
-# rename XCode clang to `clang@apple-x.y.z` or even
-# `clang-apple@x.y.z as a possible fix.
-# Compiler conflicts will be eventual taken from here:
-# https://docs.nvidia.com/cuda/cuda-installation-guide-mac-os-x/index.html#abstract
+# Darwin.
+# TODO: add missing conflicts for %apple-clang cuda@:10
+conflicts('platform=darwin', when='+cuda ^cuda@11.0.2:')
# Make sure cuda_arch can not be used without +cuda
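The hunks above all revolve around Spack's `conflicts()` directive, which records compiler/dependency combinations that the concretizer must reject. As a rough, stand-alone sketch of the idea only (the `ConflictTable` class and its `check` method are invented here for illustration and are not Spack's API, which matches specs structurally rather than by substring):

```python
# Hypothetical sketch of a conflicts() registry; not Spack's real implementation.
class ConflictTable:
    def __init__(self):
        self.rules = []  # list of (compiler_spec, when_spec) pairs

    def conflicts(self, compiler, when):
        # Record that `compiler` may not be used when `when` matches.
        self.rules.append((compiler, when))

    def check(self, compiler, context):
        # Report every recorded rule that matches the given build context.
        return [(c, w) for (c, w) in self.rules
                if c == compiler and w in context]


table = ConflictTable()
table.conflicts('%gcc@6:', when='+cuda ^cuda@:9 target=ppc64le:')
hits = table.check('%gcc@6:', '+cuda ^cuda@:9 target=ppc64le:')
```

The change in this compare view simply inlines the target string (`target=x86_64:` / `target=ppc64le:`) into each `when=` spec instead of concatenating a shared `arch_platform` suffix.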


@@ -3,8 +3,8 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
-import spack.util.url
import spack.package
+import spack.util.url
class GNUMirrorPackage(spack.package.PackageBase):


@@ -4,26 +4,32 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
-import os
-import sys
import glob
-import tempfile
-import re
import inspect
+import os
+import re
+import sys
+import tempfile
import xml.etree.ElementTree as ElementTree
import llnl.util.tty as tty
-from llnl.util.filesystem import \
-    install, ancestor, filter_file, \
-    HeaderList, find_headers, \
-    LibraryList, find_libraries, find_system_libraries
+from llnl.util.filesystem import (
+    HeaderList,
+    LibraryList,
+    ancestor,
+    filter_file,
+    find_headers,
+    find_libraries,
+    find_system_libraries,
+    install,
+)
-from spack.version import Version, ver
-from spack.package import PackageBase, run_after, InstallError
+from spack.build_environment import dso_suffix
+from spack.package import InstallError, PackageBase, run_after
from spack.util.environment import EnvironmentModifications
from spack.util.executable import Executable
from spack.util.prefix import Prefix
-from spack.build_environment import dso_suffix
+from spack.version import Version, ver
# A couple of utility functions that might be useful in general. If so, they
# should really be defined elsewhere, unless deemed heretical.
@@ -1089,7 +1095,7 @@ def _setup_dependent_env_callback(
# Intel MPI since 2019 depends on libfabric which is not in the
# lib directory but in a directory of its own which should be
# included in the rpath
-if self.version >= ver('2019'):
+if self.version_yearlike >= ver('2019'):
d = ancestor(self.component_lib_dir('mpi'))
libfabrics_path = os.path.join(d, 'libfabric', 'lib')
env.append_path('SPACK_COMPILER_EXTRA_RPATHS',
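The `version_yearlike` fix above feeds the libfabric directory into an `env.append_path(...)` call. That append-to-a-path-list behavior can be sketched on its own; the `EnvMods` class below is a made-up stand-in for Spack's `EnvironmentModifications`, not its real interface:

```python
import os


class EnvMods:
    # Made-up stand-in for Spack's EnvironmentModifications.
    def __init__(self):
        self.vars = {}

    def append_path(self, name, path, sep=os.pathsep):
        # Append `path` to the separator-joined list stored under `name`.
        current = self.vars.get(name)
        self.vars[name] = path if not current else current + sep + path


env = EnvMods()
# Mirror the hunk: libfabric's lib dir sits next to, not inside, the MPI libdir.
libfabric_path = os.path.join('intel-mpi', 'libfabric', 'lib')
env.append_path('SPACK_COMPILER_EXTRA_RPATHS', libfabric_path)
```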


@@ -9,6 +9,7 @@
import llnl.util.tty as tty
from llnl.util.filesystem import working_dir
from spack.package import PackageBase, run_after


@@ -5,6 +5,7 @@
from llnl.util.filesystem import install_tree, working_dir
from spack.directives import depends_on
from spack.package import PackageBase, run_after
from spack.util.executable import which


@@ -9,6 +9,7 @@
from typing import List # novm
from llnl.util.filesystem import working_dir
from spack.directives import depends_on, variant
from spack.package import PackageBase, run_after
@@ -101,9 +102,9 @@ def _std_args(pkg):
strip = 'true' if '+strip' in pkg.spec else 'false'
-if 'libs=static,shared' in pkg.spec:
+if 'default_library=static,shared' in pkg.spec:
default_library = 'both'
-elif 'libs=static' in pkg.spec:
+elif 'default_library=static' in pkg.spec:
default_library = 'static'
else:
default_library = 'shared'


@@ -8,16 +8,16 @@
"""
import getpass
+import platform
import shutil
-from sys import platform
from os.path import basename, dirname, isdir
+from llnl.util.filesystem import find_headers, find_libraries, join_path
from spack.package import Package
from spack.util.environment import EnvironmentModifications
from spack.util.executable import Executable
-from llnl.util.filesystem import find_headers, find_libraries, join_path
class IntelOneApiPackage(Package):
"""Base class for Intel oneAPI packages."""
@@ -48,7 +48,7 @@ def install(self, spec, prefix, installer_path=None):
if installer_path is None:
installer_path = basename(self.url_for_version(spec.version))
-if platform == 'linux':
+if platform.system() == 'Linux':
# Intel installer assumes and enforces that all components
# are installed into a single prefix. Spack wants to
# install each component in a separate prefix. The


@@ -7,10 +7,11 @@
import inspect
import os
+from llnl.util.filesystem import filter_file
from spack.directives import extends
from spack.package import PackageBase, run_after
from spack.util.executable import Executable
-from llnl.util.filesystem import filter_file
class PerlPackage(PackageBase):


@@ -6,14 +6,20 @@
import os
import shutil
+import llnl.util.tty as tty
+from llnl.util.filesystem import (
+    filter_file,
+    find,
+    get_filetype,
+    path_contains_subdirectory,
+    same_path,
+    working_dir,
+)
+from llnl.util.lang import match_predicate
from spack.directives import extends
from spack.package import PackageBase, run_after
-from llnl.util.filesystem import (working_dir, get_filetype, filter_file,
-                                  path_contains_subdirectory, same_path, find)
-from llnl.util.lang import match_predicate
-import llnl.util.tty as tty
class PythonPackage(PackageBase):
"""Specialized class for packages that are built using Python


@@ -7,6 +7,7 @@
import inspect
from llnl.util.filesystem import working_dir
from spack.directives import depends_on
from spack.package import PackageBase, run_after


@@ -75,10 +75,9 @@
# does not like its directory structure.
#
-from spack.package import PackageBase
-from spack.directives import depends_on, variant, conflicts
import spack.variant
+from spack.directives import conflicts, depends_on, variant
+from spack.package import PackageBase
class ROCmPackage(PackageBase):


@@ -6,10 +6,11 @@
import inspect
import os
-from llnl.util.filesystem import find, working_dir, join_path
+import llnl.util.tty as tty
+from llnl.util.filesystem import find, join_path, working_dir
from spack.directives import depends_on, extends
from spack.package import PackageBase, run_after
-import llnl.util.tty as tty
class SIPPackage(PackageBase):


@@ -3,8 +3,8 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
-import spack.util.url
import spack.package
+import spack.util.url
class SourceforgePackage(spack.package.PackageBase):


@@ -3,8 +3,8 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
-import spack.util.url
import spack.package
+import spack.util.url
class SourcewarePackage(spack.package.PackageBase):


@@ -6,11 +6,11 @@
import inspect
+from llnl.util.filesystem import working_dir
from spack.directives import depends_on
from spack.package import PackageBase, run_after
-from llnl.util.filesystem import working_dir
class WafPackage(PackageBase):
"""Specialized class for packages that are built using the


@@ -3,8 +3,8 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import spack.util.url
import spack.package
import spack.util.url
class XorgPackage(spack.package.PackageBase):


@@ -9,10 +9,10 @@
import llnl.util.lang
from llnl.util.filesystem import mkdirp
-import spack.error
-import spack.paths
import spack.config
+import spack.error
import spack.fetch_strategy
+import spack.paths
import spack.util.file_cache
import spack.util.path


@@ -17,10 +17,10 @@
from six import iteritems
from six.moves.urllib.error import HTTPError, URLError
from six.moves.urllib.parse import urlencode
-from six.moves.urllib.request import build_opener, HTTPHandler, Request
+from six.moves.urllib.request import HTTPHandler, Request, build_opener
-import llnl.util.tty as tty
import llnl.util.filesystem as fs
+import llnl.util.tty as tty
import spack
import spack.binary_distribution as bindist
@@ -28,18 +28,17 @@
import spack.compilers as compilers
import spack.config as cfg
import spack.environment as ev
-from spack.error import SpackError
import spack.main
import spack.mirror
import spack.paths
import spack.repo
-from spack.spec import Spec
import spack.util.executable as exe
-import spack.util.spack_yaml as syaml
-import spack.util.web as web_util
import spack.util.gpg as gpg_util
+import spack.util.spack_yaml as syaml
import spack.util.url as url_util
+import spack.util.web as web_util
+from spack.error import SpackError
+from spack.spec import Spec
JOB_RETRY_CONDITIONS = [
'always',
@@ -81,7 +80,8 @@ def _create_buildgroup(opener, headers, url, project, group_name, group_type):
if response_code != 200 and response_code != 201:
msg = 'Creating buildgroup failed (response code = {0}'.format(
response_code)
-raise SpackError(msg)
+tty.warn(msg)
+return None
response_text = response.read()
response_json = json.loads(response_text)
@@ -110,7 +110,8 @@ def populate_buildgroup(job_names, group_name, project, site,
if not parent_group_id or not group_id:
msg = 'Failed to create or retrieve buildgroups for {0}'.format(
group_name)
-raise SpackError(msg)
+tty.warn(msg)
+return
data = {
'project': project,
@@ -133,7 +134,7 @@ def populate_buildgroup(job_names, group_name, project, site,
if response_code != 200:
msg = 'Error response code ({0}) in populate_buildgroup'.format(
response_code)
-raise SpackError(msg)
+tty.warn(msg)
def is_main_phase(phase_name):
@@ -507,7 +508,7 @@ def format_job_needs(phase_name, strip_compilers, dep_jobs,
'job': get_job_name(phase_name,
strip_compilers,
dep_job,
-osname,
+dep_job.architecture,
build_group),
'artifacts': enable_artifacts_buildcache,
})
@@ -549,9 +550,8 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
generate_job_name = os.environ.get('CI_JOB_NAME', None)
parent_pipeline_id = os.environ.get('CI_PIPELINE_ID', None)
-is_pr_pipeline = (
-    os.environ.get('SPACK_IS_PR_PIPELINE', '').lower() == 'true'
-)
+spack_pipeline_type = os.environ.get('SPACK_PIPELINE_TYPE', None)
+is_pr_pipeline = spack_pipeline_type == 'spack_pull_request'
spack_pr_branch = os.environ.get('SPACK_PR_BRANCH', None)
pr_mirror_url = None
@@ -706,14 +706,9 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
root_spec = spec_record['rootSpec']
pkg_name = pkg_name_from_spec_label(spec_label)
release_spec = root_spec[pkg_name]
-# Check if this spec is in our list of known failures.
-if broken_specs_url:
-    full_hash = release_spec.full_hash()
-    broken_spec_path = url_util.join(broken_specs_url, full_hash)
-    if web_util.url_exists(broken_spec_path):
-        known_broken_specs_encountered.append('{0} ({1})'.format(
-            release_spec, full_hash))
+release_spec_full_hash = release_spec.full_hash()
+release_spec_dag_hash = release_spec.dag_hash()
+release_spec_build_hash = release_spec.build_hash()
runner_attribs = find_matching_config(
release_spec, gitlab_ci)
@@ -746,8 +741,7 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
job_script.insert(0, 'cd {0}'.format(concrete_env_dir))
job_script.extend([
-'spack ci rebuild --prepare',
-'./install.sh'
+'spack ci rebuild'
])
if 'script' in runner_attribs:
@@ -776,7 +770,9 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
job_vars = {
'SPACK_ROOT_SPEC': format_root_spec(
root_spec, main_phase, strip_compilers),
-'SPACK_JOB_SPEC_DAG_HASH': release_spec.dag_hash(),
+'SPACK_JOB_SPEC_DAG_HASH': release_spec_dag_hash,
+'SPACK_JOB_SPEC_BUILD_HASH': release_spec_build_hash,
+'SPACK_JOB_SPEC_FULL_HASH': release_spec_full_hash,
'SPACK_JOB_SPEC_PKG_NAME': release_spec.name,
'SPACK_COMPILER_ACTION': compiler_action
}
@@ -877,6 +873,15 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
if prune_dag and not rebuild_spec:
continue
+# Check if this spec is in our list of known failures, now that
+# we know this spec needs a rebuild
+if broken_specs_url:
+    broken_spec_path = url_util.join(
+        broken_specs_url, release_spec_full_hash)
+    if web_util.url_exists(broken_spec_path):
+        known_broken_specs_encountered.append('{0} ({1})'.format(
+            release_spec, release_spec_full_hash))
if artifacts_root:
job_dependencies.append({
'job': generate_job_name,
@@ -1069,7 +1074,7 @@ def generate_gitlab_ci_yaml(env, print_summary, output_file,
'SPACK_JOB_LOG_DIR': rel_job_log_dir,
'SPACK_JOB_REPRO_DIR': rel_job_repro_dir,
'SPACK_LOCAL_MIRROR_DIR': rel_local_mirror_dir,
-'SPACK_IS_PR_PIPELINE': str(is_pr_pipeline)
+'SPACK_PIPELINE_TYPE': str(spack_pipeline_type)
}
if pr_mirror_url:
@@ -1256,7 +1261,8 @@ def register_cdash_build(build_name, base_url, project, site, track):
if response_code != 200 and response_code != 201:
msg = 'Adding build failed (response code = {0}'.format(response_code)
-raise SpackError(msg)
+tty.warn(msg)
+return (None, None)
response_text = response.read()
response_json = json.loads(response_text)
@@ -1293,8 +1299,9 @@ def relate_cdash_builds(spec_map, cdash_base_url, job_build_id, cdash_project,
tty.debug('Did not find cdashid for {0} on {1}'.format(
dep_pkg_name, url))
else:
-raise SpackError('Did not find cdashid for {0} anywhere'.format(
-    dep_pkg_name))
+tty.warn('Did not find cdashid for {0} anywhere'.format(
+    dep_pkg_name))
+return
payload = {
"project": cdash_project,
@@ -1315,7 +1322,8 @@ def relate_cdash_builds(spec_map, cdash_base_url, job_build_id, cdash_project,
if response_code != 200 and response_code != 201:
msg = 'Relate builds ({0} -> {1}) failed (resp code = {2})'.format(
job_build_id, dep_build_id, response_code)
-raise SpackError(msg)
+tty.warn(msg)
+return
response_text = response.read()
tty.debug('Relate builds response: {0}'.format(response_text))
@@ -1338,7 +1346,16 @@ def write_cdashid_to_mirror(cdashid, spec, mirror_url):
tty.debug('pushing cdashid to url')
tty.debug(' local file path: {0}'.format(local_cdash_path))
tty.debug(' remote url: {0}'.format(remote_url))
-web_util.push_to_url(local_cdash_path, remote_url)
+try:
+    web_util.push_to_url(local_cdash_path, remote_url)
+except Exception as inst:
+    # No matter what went wrong here, don't allow the pipeline to fail
+    # just because there was an issue storing the cdashid on the mirror
+    msg = 'Failed to write cdashid {0} to mirror {1}'.format(
+        cdashid, mirror_url)
+    tty.warn(inst)
+    tty.warn(msg)
def read_cdashid_from_mirror(spec, mirror_url):
@@ -1356,40 +1373,34 @@ def read_cdashid_from_mirror(spec, mirror_url):
return int(contents)
-def push_mirror_contents(env, spec, yaml_path, mirror_url, build_id,
-                         sign_binaries):
-    if mirror_url:
-        try:
-            unsigned = not sign_binaries
-            tty.debug('Creating buildcache ({0})'.format(
-                'unsigned' if unsigned else 'signed'))
-            spack.cmd.buildcache._createtarball(
-                env, spec_yaml=yaml_path, add_deps=False,
-                output_location=mirror_url, force=True, allow_root=True,
-                unsigned=unsigned)
-            if build_id:
-                tty.debug('Writing cdashid ({0}) to remote mirror: {1}'.format(
-                    build_id, mirror_url))
-                write_cdashid_to_mirror(build_id, spec, mirror_url)
-        except Exception as inst:
-            # If the mirror we're pushing to is on S3 and there's some
-            # permissions problem, for example, we can't just target
-            # that exception type here, since users of the
-            # `spack ci rebuild' may not need or want any dependency
-            # on boto3. So we use the first non-boto exception type
-            # in the heirarchy:
-            #     boto3.exceptions.S3UploadFailedError
-            #     boto3.exceptions.Boto3Error
-            #     Exception
-            #     BaseException
-            #     object
-            err_msg = 'Error msg: {0}'.format(inst)
-            if 'Access Denied' in err_msg:
-                tty.msg('Permission problem writing to {0}'.format(
-                    mirror_url))
-                tty.msg(err_msg)
-            else:
-                raise inst
+def push_mirror_contents(env, spec, yaml_path, mirror_url, sign_binaries):
+    try:
+        unsigned = not sign_binaries
+        tty.debug('Creating buildcache ({0})'.format(
+            'unsigned' if unsigned else 'signed'))
+        spack.cmd.buildcache._createtarball(
+            env, spec_yaml=yaml_path, add_deps=False,
+            output_location=mirror_url, force=True, allow_root=True,
+            unsigned=unsigned)
+    except Exception as inst:
+        # If the mirror we're pushing to is on S3 and there's some
+        # permissions problem, for example, we can't just target
+        # that exception type here, since users of the
+        # `spack ci rebuild' may not need or want any dependency
+        # on boto3. So we use the first non-boto exception type
+        # in the heirarchy:
+        #     boto3.exceptions.S3UploadFailedError
+        #     boto3.exceptions.Boto3Error
+        #     Exception
+        #     BaseException
+        #     object
+        err_msg = 'Error msg: {0}'.format(inst)
+        if 'Access Denied' in err_msg:
+            tty.msg('Permission problem writing to {0}'.format(
+                mirror_url))
+            tty.msg(err_msg)
+        else:
+            raise inst
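The comment block above explains why a bare `except Exception` is used instead of a boto3-specific type: the caller may not have boto3 installed at all, so the code matches on the error message text instead. That pattern can be sketched independently of boto3; the `push` and `denied` helpers below are invented for illustration:

```python
def push(upload, mirror_url):
    # Hypothetical wrapper showing the catch-generic pattern from the hunk:
    # match on the message text instead of importing boto3 exception types.
    try:
        upload(mirror_url)
        return 'ok'
    except Exception as inst:
        if 'Access Denied' in str(inst):
            return 'permission problem writing to {0}'.format(mirror_url)
        raise


def denied(url):
    # Stand-in for an upload that fails the way S3 reports missing permissions.
    raise RuntimeError('Access Denied')


result = push(denied, 's3://example-mirror')
```

The trade-off, noted implicitly in the hunk, is that any other exception is re-raised unchanged, so only the recognized permission failure is swallowed.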
def copy_stage_logs_to_artifacts(job_spec, job_log_dir):


@@ -5,19 +5,20 @@
from __future__ import print_function
+import argparse
import os
import re
import sys
-import argparse
-import ruamel.yaml as yaml
+import ruamel.yaml as yaml
import six
+from ruamel.yaml.error import MarkedYAMLError
import llnl.util.tty as tty
+from llnl.util.filesystem import join_path
from llnl.util.lang import attr_setdefault, index_by
from llnl.util.tty.colify import colify
from llnl.util.tty.color import colorize
-from llnl.util.filesystem import join_path
import spack.config
import spack.error
@@ -27,8 +28,6 @@
import spack.store
import spack.util.spack_json as sjson
import spack.util.string
-from ruamel.yaml.error import MarkedYAMLError
# cmd has a submodule called "list" so preserve the python list module
python_list = list


@@ -9,7 +9,6 @@
import spack.cmd.common.arguments as arguments
import spack.environment as ev
description = 'add a spec to an environment'
section = "environments"
level = "long"


@@ -17,7 +17,6 @@
import spack.paths
import spack.report
description = "run analyzers on installed packages"
section = "analysis"
level = "long"


@@ -8,8 +8,10 @@
import collections
import archspec.cpu
import llnl.util.tty.colify as colify
import llnl.util.tty.color as color
import spack.architecture as architecture
description = "print architecture information about this machine"


@@ -0,0 +1,80 @@
# Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import llnl.util.tty.color as cl
import spack.audit
import spack.repo
description = "audit configuration files, packages, etc."
section = "system"
level = "short"
def setup_parser(subparser):
# Top level flags, valid for every audit class
sp = subparser.add_subparsers(metavar='SUBCOMMAND', dest='subcommand')
# Audit configuration files
sp.add_parser('configs', help='audit configuration files')
# Audit package recipes
pkg_parser = sp.add_parser('packages', help='audit package recipes')
pkg_parser.add_argument(
'name', metavar='PKG', nargs='*',
help='package to be analyzed (if none all packages will be processed)',
)
# List all checks
sp.add_parser('list', help='list available checks and exits')
def configs(parser, args):
reports = spack.audit.run_group(args.subcommand)
_process_reports(reports)
def packages(parser, args):
pkgs = args.name or spack.repo.path.all_package_names()
reports = spack.audit.run_group(args.subcommand, pkgs=pkgs)
_process_reports(reports)
def list(parser, args):
for subcommand, check_tags in spack.audit.GROUPS.items():
print(cl.colorize('@*b{' + subcommand + '}:'))
for tag in check_tags:
audit_obj = spack.audit.CALLBACKS[tag]
print(' ' + audit_obj.description)
if args.verbose:
for idx, fn in enumerate(audit_obj.callbacks):
print(' {0}. '.format(idx + 1) + fn.__doc__)
print()
print()
def audit(parser, args):
subcommands = {
'configs': configs,
'packages': packages,
'list': list
}
subcommands[args.subcommand](parser, args)
def _process_reports(reports):
for check, errors in reports:
if errors:
msg = '{0}: {1} issue{2} found'.format(
check, len(errors), '' if len(errors) == 1 else 's'
)
header = '@*b{' + msg + '}'
print(cl.colorize(header))
for idx, error in enumerate(errors):
print(str(idx + 1) + '. ' + str(error))
raise SystemExit(1)
else:
msg = '{0}: 0 issues found.'.format(check)
header = '@*b{' + msg + '}'
print(cl.colorize(header))
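The new `spack audit` command above dispatches through `spack.audit.GROUPS` and `spack.audit.CALLBACKS` tables. A toy version of that registry pattern, with names chosen here for illustration rather than copied from Spack's internals, looks like this:

```python
# Minimal sketch of a group/callback audit registry, not Spack's real module.
CALLBACKS = {}
GROUPS = {}


def audit_checker(group, tag):
    # Register a callback under `tag` and file the tag under `group`.
    def decorator(fn):
        CALLBACKS[tag] = fn
        GROUPS.setdefault(group, []).append(tag)
        return fn
    return decorator


@audit_checker('configs', 'CFG-DIRECTIVES')
def check_configs():
    # A real check would inspect configuration and return error strings.
    return []


def run_group(group):
    # Run every check registered for `group`, yielding (tag, errors) pairs,
    # the same shape _process_reports() above iterates over.
    return [(tag, CALLBACKS[tag]()) for tag in GROUPS[group]]


reports = run_group('configs')
```

The decorator-based registration is what lets `spack audit list` enumerate available checks without running them.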


@@ -8,16 +8,15 @@
import sys
import llnl.util.tty as tty
-from llnl.util.lang import pretty_date
from llnl.util.filesystem import working_dir
+from llnl.util.lang import pretty_date
from llnl.util.tty.colify import colify_table
-import spack.util.spack_json as sjson
import spack.paths
import spack.repo
-from spack.util.executable import which
+import spack.util.spack_json as sjson
from spack.cmd import spack_is_git_repo
+from spack.util.executable import which
description = "show contributors to packages"
section = "developer"


@@ -8,10 +8,12 @@
import sys
import llnl.util.tty as tty
import spack.architecture
import spack.binary_distribution as bindist
import spack.cmd
import spack.cmd.common.arguments as arguments
+import spack.config
import spack.environment as ev
import spack.hash_types as ht
import spack.mirror
@@ -19,17 +21,12 @@
import spack.repo
import spack.spec
import spack.store
-import spack.config
-import spack.repo
-import spack.store
import spack.util.url as url_util
+from spack.cmd import display_specs
from spack.error import SpecError
from spack.spec import Spec, save_dependency_spec_yamls
from spack.util.string import plural
-from spack.cmd import display_specs
description = "create, download and install binary packages"
section = "packaging"
level = "long"


@@ -15,7 +15,7 @@
import spack.stage
import spack.util.crypto
from spack.util.naming import valid_fully_qualified_module_name
-from spack.version import ver, Version
+from spack.version import Version, ver
description = "checksum available versions of a package"
section = "packaging"


@@ -17,15 +17,14 @@
import spack.binary_distribution as bindist
import spack.ci as spack_ci
-import spack.config as cfg
import spack.cmd.buildcache as buildcache
+import spack.config as cfg
import spack.environment as ev
import spack.hash_types as ht
import spack.mirror
import spack.util.url as url_util
import spack.util.web as web_util
description = "manage continuous integration pipelines"
section = "build"
level = "long"
@@ -196,8 +195,7 @@ def ci_rebuild(args):
compiler_action = get_env_var('SPACK_COMPILER_ACTION')
cdash_build_name = get_env_var('SPACK_CDASH_BUILD_NAME')
related_builds = get_env_var('SPACK_RELATED_BUILDS_CDASH')
-pr_env_var = get_env_var('SPACK_IS_PR_PIPELINE')
-dev_env_var = get_env_var('SPACK_IS_DEVELOP_PIPELINE')
+spack_pipeline_type = get_env_var('SPACK_PIPELINE_TYPE')
pr_mirror_url = get_env_var('SPACK_PR_MIRROR_URL')
remote_mirror_url = get_env_var('SPACK_REMOTE_MIRROR_URL')
@@ -231,7 +229,6 @@ def ci_rebuild(args):
eq_idx = proj_enc.find('=') + 1
cdash_project_enc = proj_enc[eq_idx:]
cdash_site = ci_cdash['site']
-cdash_id_path = os.path.join(repro_dir, 'cdash_id.txt')
tty.debug('cdash_base_url = {0}'.format(cdash_base_url))
tty.debug('cdash_project = {0}'.format(cdash_project))
tty.debug('cdash_project_enc = {0}'.format(cdash_project_enc))
@@ -242,8 +239,11 @@ def ci_rebuild(args):
# Is this a pipeline run on a spack PR or a merge to develop? It might
# be neither, e.g. a pipeline run on some environment repository.
-spack_is_pr_pipeline = True if pr_env_var == 'True' else False
-spack_is_develop_pipeline = True if dev_env_var == 'True' else False
+spack_is_pr_pipeline = spack_pipeline_type == 'spack_pull_request'
+spack_is_develop_pipeline = spack_pipeline_type == 'spack_protected_branch'
tty.debug('Pipeline type - PR: {0}, develop: {1}'.format(
spack_is_pr_pipeline, spack_is_develop_pipeline))
# Figure out what is our temporary storage mirror: Is it artifacts
# buildcache? Or temporary-storage-url-prefix? In some cases we need to
@@ -396,7 +396,7 @@ def ci_rebuild(args):
job_spec_pkg_name, matching_mirror))
tty.debug('Downloading to {0}'.format(build_cache_dir))
buildcache.download_buildcache_files(
-job_spec, build_cache_dir, True, matching_mirror)
+job_spec, build_cache_dir, False, matching_mirror)
# Now we are done and successful
sys.exit(0)
@@ -431,24 +431,21 @@ def ci_rebuild(args):
cdash_build_name, cdash_base_url, cdash_project,
cdash_site, job_spec_buildgroup)
-cdash_upload_url = '{0}/submit.php?project={1}'.format(
-    cdash_base_url, cdash_project_enc)
+if cdash_build_id is not None:
+    cdash_upload_url = '{0}/submit.php?project={1}'.format(
+        cdash_base_url, cdash_project_enc)
+    install_args.extend([
+        '--cdash-upload-url', cdash_upload_url,
+        '--cdash-build', cdash_build_name,
+        '--cdash-site', cdash_site,
+        '--cdash-buildstamp', cdash_build_stamp,
+    ])
-install_args.extend([
-    '--cdash-upload-url', cdash_upload_url,
-    '--cdash-build', cdash_build_name,
-    '--cdash-site', cdash_site,
-    '--cdash-buildstamp', cdash_build_stamp,
-])
-tty.debug('CDash: Relating build with dependency builds')
-spack_ci.relate_cdash_builds(
-    spec_map, cdash_base_url, cdash_build_id, cdash_project,
-    [pipeline_mirror_url, pr_mirror_url, remote_mirror_url])
-# store the cdash build id on disk for later
-with open(cdash_id_path, 'w') as fd:
-    fd.write(cdash_build_id)
+    tty.debug('CDash: Relating build with dependency builds')
+    spack_ci.relate_cdash_builds(
+        spec_map, cdash_base_url, cdash_build_id, cdash_project,
+        [pipeline_mirror_url, pr_mirror_url, remote_mirror_url])
# A compiler action of 'FIND_ANY' means we are building a bootstrap
# compiler or one of its deps.
@@ -494,11 +491,14 @@ def ci_rebuild(args):
# If a spec fails to build in a spack develop pipeline, we add it to a
# list of known broken full hashes. This allows spack PR pipelines to
# avoid wasting compute cycles attempting to build those hashes.
-if install_exit_code != 0 and spack_is_develop_pipeline:
+if install_exit_code == 1 and spack_is_develop_pipeline:
tty.debug('Install failed on develop')
if 'broken-specs-url' in gitlab_ci:
broken_specs_url = gitlab_ci['broken-specs-url']
dev_fail_hash = job_spec.full_hash()
broken_spec_path = url_util.join(broken_specs_url, dev_fail_hash)
+tty.msg('Reporting broken develop build as: {0}'.format(
+    broken_spec_path))
tmpdir = tempfile.mkdtemp()
empty_file_path = os.path.join(tmpdir, 'empty.txt')
@@ -541,17 +541,31 @@ def ci_rebuild(args):
# Create buildcache in either the main remote mirror, or in the
# per-PR mirror, if this is a PR pipeline
-spack_ci.push_mirror_contents(
-    env, job_spec, job_spec_yaml_path, buildcache_mirror_url,
-    cdash_build_id, sign_binaries)
+if buildcache_mirror_url:
+    spack_ci.push_mirror_contents(
+        env, job_spec, job_spec_yaml_path, buildcache_mirror_url,
+        sign_binaries)
+    if cdash_build_id:
+        tty.debug('Writing cdashid ({0}) to remote mirror: {1}'.format(
+            cdash_build_id, buildcache_mirror_url))
+        spack_ci.write_cdashid_to_mirror(
+            cdash_build_id, job_spec, buildcache_mirror_url)
 # Create another copy of that buildcache in the per-pipeline
 # temporary storage mirror (this is only done if either
 # artifacts buildcache is enabled or a temporary storage url
 # prefix is set)
-spack_ci.push_mirror_contents(
-    env, job_spec, job_spec_yaml_path, pipeline_mirror_url,
-    cdash_build_id, sign_binaries)
+if pipeline_mirror_url:
+    spack_ci.push_mirror_contents(
+        env, job_spec, job_spec_yaml_path, pipeline_mirror_url,
+        sign_binaries)
+    if cdash_build_id:
+        tty.debug('Writing cdashid ({0}) to remote mirror: {1}'.format(
+            cdash_build_id, pipeline_mirror_url))
+        spack_ci.write_cdashid_to_mirror(
+            cdash_build_id, job_spec, pipeline_mirror_url)
else:
tty.debug('spack install exited non-zero, will not create buildcache')
@@ -580,16 +594,16 @@ def ci_rebuild(args):
print(reproduce_msg)
# Tie job success/failure to the success/failure of building the spec
-sys.exit(install_exit_code)
+return install_exit_code
def ci_reproduce(args):
job_url = args.job_url
work_dir = args.working_dir
-spack_ci.reproduce_ci_job(job_url, work_dir)
+return spack_ci.reproduce_ci_job(job_url, work_dir)
def ci(parser, args):
if args.func:
-args.func(args)
+return args.func(args)


@@ -10,15 +10,14 @@
import llnl.util.tty as tty
import spack.caches
-import spack.config
-import spack.cmd.test
import spack.cmd.common.arguments as arguments
+import spack.cmd.test
+import spack.config
import spack.main
import spack.repo
import spack.stage
from spack.paths import lib_path, var_path
description = "remove temporary build files and/or downloaded archives"
section = "build"
level = "long"


@@ -14,7 +14,9 @@
import llnl.util.filesystem as fs
import llnl.util.tty as tty
from llnl.util.argparsewriter import (
-    ArgparseWriter, ArgparseRstWriter, ArgparseCompletionWriter
+    ArgparseCompletionWriter,
+    ArgparseRstWriter,
+    ArgparseWriter,
)
from llnl.util.tty.colify import colify
@@ -23,7 +25,6 @@
import spack.paths
from spack.main import section_descriptions
description = "list available spack commands"
section = "developer"
level = "long"


@@ -8,10 +8,11 @@
import os
import llnl.util.tty as tty
import spack.build_environment as build_environment
-import spack.paths
import spack.cmd
import spack.cmd.common.arguments as arguments
+import spack.paths
from spack.util.environment import dump_environment, pickle_environment


@@ -7,16 +7,18 @@
import argparse
import sys
from six import iteritems
import llnl.util.tty as tty
-import spack.compilers
-import spack.config
-import spack.spec
from llnl.util.lang import index_by
from llnl.util.tty.colify import colify
from llnl.util.tty.color import colorize
-from spack.spec import CompilerSpec, ArchSpec
+import spack.compilers
+import spack.config
+import spack.spec
+from spack.spec import ArchSpec, CompilerSpec
description = "manage compilers"
section = "system"


@@ -10,15 +10,16 @@
import llnl.util.filesystem as fs
import llnl.util.tty as tty
import spack.config
import spack.cmd.common.arguments
import spack.schema.env
import spack.config
import spack.environment as ev
import spack.repo
import spack.schema.env
import spack.schema.packages
import spack.store
import spack.util.spack_yaml as syaml
from spack.util.editor import editor
import spack.store
import spack.repo
description = "get and set configuration options"
section = "config"


@@ -4,7 +4,9 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import os.path
import spack.container
import spack.monitor
description = ("creates recipes to build images for different"
" container runtimes")
@@ -12,6 +14,10 @@
level = "long"
def setup_parser(subparser):
monitor_group = spack.monitor.get_monitor_group(subparser) # noqa
def containerize(parser, args):
config_dir = args.env_dir or os.getcwd()
config_file = os.path.abspath(os.path.join(config_dir, 'spack.yaml'))
@@ -21,5 +27,12 @@ def containerize(parser, args):
config = spack.container.validate(config_file)
# If we have a monitor request, add monitor metadata to config
if args.use_monitor:
config['spack']['monitor'] = {"disable_auth": args.monitor_disable_auth,
"host": args.monitor_host,
"keep_going": args.monitor_keep_going,
"prefix": args.monitor_prefix,
"tags": args.monitor_tags}
recipe = spack.container.recipe(config)
print(recipe)
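The containerize hunk merges monitor metadata into the validated `spack.yaml` data. A self-contained sketch of that step, with the flag values below chosen purely for illustration:

```python
import argparse


def add_monitor_metadata(config, args):
    # Mirror the CLI monitor flags into the config dict, as the hunk does.
    config['spack']['monitor'] = {
        "disable_auth": args.monitor_disable_auth,
        "host": args.monitor_host,
        "keep_going": args.monitor_keep_going,
        "prefix": args.monitor_prefix,
        "tags": args.monitor_tags,
    }
    return config


# Stand-in for parsed CLI arguments (values are hypothetical).
args = argparse.Namespace(
    monitor_disable_auth=False, monitor_host="localhost",
    monitor_keep_going=True, monitor_prefix="ms1", monitor_tags=None)
config = add_monitor_metadata({'spack': {}}, args)
```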


@@ -11,16 +11,23 @@
import llnl.util.tty as tty
from llnl.util.filesystem import mkdirp
import spack.util.web
import spack.repo
import spack.stage
import spack.util.web
from spack.spec import Spec
from spack.url import (
UndetectableNameError,
UndetectableVersionError,
parse_name,
parse_version,
)
from spack.util.editor import editor
from spack.util.executable import which, ProcessError
from spack.util.naming import mod_to_class
from spack.util.naming import simplify_name, valid_fully_qualified_module_name
from spack.url import UndetectableNameError, UndetectableVersionError
from spack.url import parse_name, parse_version
from spack.util.executable import ProcessError, which
from spack.util.naming import (
mod_to_class,
simplify_name,
valid_fully_qualified_module_name,
)
description = "create a new package file"
section = "packaging"


@@ -14,18 +14,18 @@
installation and its deprecator.
'''
from __future__ import print_function
import argparse
import os
import llnl.util.tty as tty
import spack.cmd
import spack.store
import spack.cmd.common.arguments as arguments
import spack.environment as ev
from spack.error import SpackError
import spack.store
from spack.database import InstallStatuses
from spack.error import SpackError
description = "Replace one package with another via symlinks"
section = "admin"


@@ -3,14 +3,14 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import sys
import os
import sys
import llnl.util.tty as tty
import spack.config
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.config
import spack.repo
description = "developer build: build from code in current working directory"


@@ -10,7 +10,6 @@
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.environment as ev
from spack.error import SpackError
description = "add a spec to an environment's dev-build information"


@@ -3,8 +3,8 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import glob
import os
import llnl.util.tty as tty


@@ -8,22 +8,21 @@
import sys
from collections import namedtuple
import llnl.util.tty as tty
import llnl.util.filesystem as fs
import llnl.util.tty as tty
from llnl.util.tty.colify import colify
from llnl.util.tty.color import colorize
import spack.config
import spack.schema.env
import spack.cmd.common.arguments
import spack.cmd.install
import spack.cmd.uninstall
import spack.cmd.modules
import spack.cmd.common.arguments as arguments
import spack.cmd.install
import spack.cmd.modules
import spack.cmd.uninstall
import spack.config
import spack.environment as ev
import spack.schema.env
import spack.util.string as string
description = "manage virtual environments"
section = "environments"
level = "short"


@@ -9,9 +9,9 @@
import llnl.util.tty as tty
from llnl.util.tty.colify import colify
import spack.environment as ev
import spack.cmd as cmd
import spack.cmd.common.arguments as arguments
import spack.environment as ev
import spack.repo
import spack.store
from spack.filesystem_view import YamlFilesystemView


@@ -10,10 +10,12 @@
import sys
from collections import defaultdict, namedtuple
import six
import llnl.util.filesystem
import llnl.util.tty as tty
import llnl.util.tty.colify as colify
import six
import spack
import spack.cmd
import spack.cmd.common.arguments


@@ -9,17 +9,17 @@
import os
import sys
import llnl.util.lang
import llnl.util.tty as tty
import llnl.util.tty.color as color
import llnl.util.lang
import spack.environment as ev
import spack.repo
import spack.cmd as cmd
import spack.cmd.common.arguments as arguments
import spack.environment as ev
import spack.repo
import spack.user_environment as uenv
from spack.util.string import plural
from spack.database import InstallStatuses
from spack.util.string import plural
description = "list and search installed packages"
section = "basic"


@@ -9,7 +9,6 @@
import spack.cmd.style
description = "alias for spack style (deprecated)"
section = spack.cmd.style.section
level = spack.cmd.style.level


@@ -3,8 +3,8 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import argparse
import os
import spack.binary_distribution
import spack.cmd.common.arguments as arguments


@@ -11,7 +11,7 @@
import spack.cmd.common.arguments as arguments
import spack.config
import spack.store
from spack.graph import graph_dot, graph_ascii
from spack.graph import graph_ascii, graph_dot
description = "generate graphs of package dependency relationships"
section = "basic"


@@ -4,6 +4,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import sys
from llnl.util.tty.color import colorize
description = "get help on spack and its commands"


@@ -6,6 +6,7 @@
from __future__ import print_function
import textwrap
from six.moves import zip_longest
import llnl.util.tty as tty
@@ -13,10 +14,9 @@
from llnl.util.tty.colify import colify
import spack.cmd.common.arguments as arguments
import spack.fetch_strategy as fs
import spack.repo
import spack.spec
import spack.fetch_strategy as fs
description = 'get detailed information on a particular package'
section = 'basic'
@@ -155,6 +155,26 @@ def print_text_info(pkg):
color.cprint('')
color.cprint(section_title('Maintainers: ') + mnt)
color.cprint('')
color.cprint(section_title('Externally Detectable: '))
# If the package has an 'executables' field, it can detect an installation
if hasattr(pkg, 'executables'):
find_attributes = []
if hasattr(pkg, 'determine_version'):
find_attributes.append('version')
if hasattr(pkg, 'determine_variants'):
find_attributes.append('variants')
# If the package does not define 'determine_version' nor
# 'determine_variants', then it must use some custom detection
# mechanism. In this case, just inform the user it's detectable somehow.
color.cprint(' True{0}'.format(
' (' + ', '.join(find_attributes) + ')' if find_attributes else ''))
else:
color.cprint(' False')
color.cprint('')
color.cprint(section_title("Tags: "))
if hasattr(pkg, 'tags'):


@@ -23,7 +23,6 @@
from spack.error import SpackError
from spack.installer import PackageInstaller
description = "build and install packages"
section = "build"
level = "short"
@@ -347,6 +346,10 @@ def get_tests(specs):
reporter.filename = default_log_file(specs[0])
reporter.specs = specs
# Tell the monitor about the specs
if args.use_monitor and specs:
monitor.new_configuration(specs)
tty.msg("Installing environment {0}".format(env.name))
with reporter('build'):
env.install_all(args, **kwargs)


@@ -3,23 +3,22 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from __future__ import print_function
from __future__ import division
from __future__ import division, print_function
import argparse
import fnmatch
import json
import math
import os
import re
import sys
import math
import json
import llnl.util.tty as tty
from llnl.util.tty.colify import colify
import spack.cmd.common.arguments as arguments
import spack.dependency
import spack.repo
import spack.cmd.common.arguments as arguments
from spack.version import VersionList
if sys.version_info > (3, 1):


@@ -8,9 +8,9 @@
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.environment as ev
import spack.util.environment
import spack.user_environment as uenv
import spack.store
import spack.user_environment as uenv
import spack.util.environment
description = "add package to the user environment"
section = "user environment"


@@ -6,12 +6,13 @@
from __future__ import print_function
import os
import llnl.util.tty as tty
import spack.environment as ev
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.environment
import spack.environment as ev
import spack.paths
import spack.repo
import spack.stage


@@ -6,7 +6,8 @@
import sys
import llnl.util.tty as tty
from spack.util.log_parse import parse_log_events, make_log_context
from spack.util.log_parse import make_log_context, parse_log_events
description = "filter errors and warnings from build logs"
section = "build"


@@ -12,7 +12,6 @@
import llnl.util.tty.color as color
from llnl.util.tty.colify import colify
import spack.repo
description = "get information about package maintainers"


@@ -7,16 +7,16 @@
import sys
from llnl.util import tty
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.error
import spack.package
import spack.cmd.common.arguments as arguments
import spack.repo
import spack.store
from spack.database import InstallStatuses
from llnl.util import tty
description = "mark packages as explicitly or implicitly installed"
section = "admin"
level = "long"


@@ -17,9 +17,8 @@
import spack.repo
import spack.util.url as url_util
import spack.util.web as web_util
from spack.spec import Spec
from spack.error import SpackError
from spack.spec import Spec
from spack.util.spack_yaml import syaml_dict
description = "manage mirrors (source and binary)"


@@ -4,7 +4,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import argparse
from typing import Dict, Callable # novm
from typing import Callable, Dict # novm
import llnl.util.tty as tty

Some files were not shown because too many files have changed in this diff.