Compare commits


518 Commits

Author SHA1 Message Date
Todd Gamblin
37afca39ae Revert "Update various Jupyter packages (#36332)"
This reverts commit d20fee0c42.
2023-03-23 13:08:59 -07:00
Massimiliano Culpo
b0e54bc0ac Fix regression on compiler constraint (#36342)
fixes #36339

We were missing a rule that enforced a match between
the `node_compiler` and the compiler used to satisfy
a requirement.

Also fix compilers with custom, made-up versions
2023-03-23 20:43:13 +01:00
Adam J. Stewart
d20fee0c42 Update various Jupyter packages (#36332)
* Update various Jupyter packages
* Fix missing versions
2023-03-23 10:55:06 -07:00
Dr. Christian Tacke
fdd94d1ee9 fairlogger: 1.9 and older are incompatible with fmt 9+ (#36336)
Co-authored-by: Dennis Klein <d.klein@gsi.de>
2023-03-23 10:50:18 -07:00
Sergey Kosukhin
fa37ff51e7 libzip: add version 1.3.2 (#36337)
* libzip: add property 'headers'
* libzip: add version 1.3.2
2023-03-23 10:48:14 -07:00
Matthieu Dorier
2853051e48 [mochi-margo] margo version 0.13.1 added (#36344) 2023-03-23 10:08:55 -07:00
Matthew Thompson
862e9a59c4 gcc: fix for apple-clang conflicts (#36165)
* gcc: fix for apple-clang conflict

* Update var/spack/repos/builtin/packages/gcc/package.py

Use variant by @adamjstewart

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-03-23 13:03:09 -04:00
Harmen Stoppels
4dc9d9f60e Revert "Bugfix: package requirements with git commits (#35057)" (#36341)
This reverts commit 3d597e29be.
2023-03-23 12:10:46 +01:00
Peter Scheibel
3d597e29be Bugfix: package requirements with git commits (#35057)
* Specs that define 'new' versions in the require: section need to generate
  associated facts to indicate that those versions are valid.

* add test to verify success with unknown versions.
2023-03-23 01:58:20 -07:00
Ted Stern
739a67eda8 Revert "wrf: fix patches for aarch64 config (#35984)" (#36333)
This reverts commit 99893a6475.
2023-03-23 07:50:56 +01:00
Xavier Delaruelle
47d710dc4d modules tcl: switch default all:autoload from none to direct (#36269)
environment-modules has supported autoloading since 4.2,
and Spack builds of it enable it by default, so use the same autoload
default for tcl as for lmod.
2023-03-23 07:49:17 +01:00
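A minimal sketch of the modules.yaml setting whose default this changes (scope and layout assumed, not taken from the commit):

```
modules:
  default:
    tcl:
      all:
        autoload: direct   # previously defaulted to 'none' for tcl
```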
Stan Tomov
101c5b51bb magma: add v2.7.1 (#35610) 2023-03-22 19:04:56 -04:00
Adam J. Stewart
f4e4d83a02 LLVM OpenMP: add v16.0.0 (#36330) 2023-03-22 16:25:02 -05:00
Adam J. Stewart
68979f8740 py-papermill: add new package (#36328) 2023-03-22 14:00:40 -07:00
Mathew Cleveland
37fbfcf7fe Add opppy-0_1_6 and opppy-0_1_7 releases to the spack recipes (#36326)
* add opppy-0_1_6 and opppy-0_1_7 releases to the spack recipes
* update urls
* remove sphinx from the dependency list
* clean up OPPPY versions to capture the OPPPY-0_1_1 tag discrepancy
* one more attempt at fixing the url for opppy-0_1_1 (simpler fix)

---------

Co-authored-by: Cleveland <cleveland@lanl.gov>
Co-authored-by: clevelam <clevelam@users.noreply.github.com>
2023-03-22 16:57:49 -04:00
Leopold Talirz
311d3be18e docs: mention cuda multi-arch capability (#36321) 2023-03-22 16:52:53 -04:00
MatthewLieber
2393e456ee Osu/mv2 hwloc2 (#36325)
* Revert "Remove legacy yaml from buildcache fetch (#34347)"
  This reverts commit b58ec9e2b9.
* Revert "Revert "Remove legacy yaml from buildcache fetch (#34347)""
  This reverts commit f91ec2e8da.
* add variant for hwloc v2
* running black

---------

Co-authored-by: Matt Lieber <lieber.31@osu.edu>
2023-03-22 11:49:12 -07:00
Vincent Michaud-Rioux
e09caf2ab8 Add py-pennylane-lightning-kokkos package. (#36257)
* update python package

* change package inheritance

* small update

* enable cpp tests

* small update

* Add flaky package

* Restructure PennyLane deps and order

* Change Lightning defaults and add libomp support for MacOS

* Replace explicit git url with PyPI

* Add Flaky support

* Update PennyLane and PennyLane Lightning support

* fix format

* update packages versioning

* Add patching and default updates for lightning package

* Format

* fix patch version

* update py-flaky package

* update py-pennylane-lightning package

* update py-pennylane package

* remove explicit python dependence

* Remove redundant lines from patch-file

* Update SHA for new patch

* Initial commit for PLLKokkos.

* Comment verbose variant.

* Update develop commit version and restore verbose option.

* Add backends.

* Add mesa package dep (libxml2). Fix rocm install for py-pennylane-lightning-kokkos.

* Restore sycl backend.

* Revert mesa package.

* Make py-pe-li-kokkos into CudaPackage, ROCmPackage.

* Do not force kokkos+wrapper when +cuda

* Few mods following comments on py-pll.

* Update versions of py-pennylane*.

* Remove py-pennylane-lightning patch.

* Remove redundant preferred=True.

* Fix lint in py-pennylane-lightning-kokkos.

* Update var/spack/repos/builtin/packages/py-pennylane-lightning-kokkos/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Ninja and pip not required at runtime. Set lower bound on PL/PLL versions.

* Remove v0.29.0 from pennylane.

* Add AmintorDusko as maintainer.

---------

Co-authored-by: AmintorDusko <amintor_dusko@hotmail.com>
Co-authored-by: Lee J. O'Riordan <lee@xanadu.ai>
Co-authored-by: Amintor Dusko <87949283+AmintorDusko@users.noreply.github.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-03-22 10:06:05 -05:00
Adam J. Stewart
f15efd27bd py-lightning: fix dependencies (#36213) 2023-03-22 14:08:19 +01:00
Adam J. Stewart
668fb7f5dd grep: fix +pcre in 3.9 (#36169) 2023-03-22 09:49:41 +01:00
Alec Scott
e1a5228a16 perl-inline: add v0.86 (#36299) 2023-03-22 03:47:17 -05:00
Wouter Deconinck
8a48f9a479 xcb-util-*: new versions, migration to freedesktop.org (#36241)
The xcb-utils have been migrated to gitlab.freedesktop.org from their
previous separate location, which means a URL change is needed to
pick up newer versions
([ref](https://lists.freedesktop.org/archives/xcb/2022-October/011422.html)).

This replaces the `homepage` and `url` with the latest (to an `xz`
file), adds a `url_for_version` function to resolve past versions, and
adds the latest versions. Because of the `url_for_version` I don't think
we can use the `xorg_mirror_path` approach here.

Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2023-03-22 09:45:16 +01:00
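A hedged sketch of the `url_for_version` shape such a migration typically needs (the cutoff version and URLs below are placeholders, not the ones actually committed):

```
def url_for_version(self, version):
    # placeholder split: new releases from the gitlab location,
    # older tarballs from the historical one
    if version >= Version("0.4.1"):
        url = "https://gitlab.example.org/xorg/{0}/-/archive/{0}-{1}.tar.xz"
    else:
        url = "https://xcb.example.org/dist/{0}-{1}.tar.gz"
    return url.format(self.name, version)
```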
Alec Scott
6551ad8711 perl-file-slurper: add v0.014 (#36305) 2023-03-22 03:38:17 -05:00
Alec Scott
a2479c13a6 perl-exception-class: add v1.45 (#36315) 2023-03-22 03:35:42 -05:00
snehring
59fecb353c Add missing deps for braker and bcftools (#36279) 2023-03-22 09:25:13 +01:00
Alec Scott
d0098876e0 perl-io-tty: add v1.17 (#36295) 2023-03-22 03:24:54 -05:00
snehring
8d2f08ae85 shapemapper: add new package (#36282) 2023-03-22 09:20:59 +01:00
Alec Scott
d71ee98bad perl-io-html: add v1.004 (#36296) 2023-03-22 03:20:32 -05:00
Alec Scott
ed989be8eb perl-file-which: add v1.27 (#36304) 2023-03-22 04:17:33 -04:00
Alec Scott
00d45d052d perl-error: add v0.17029 (#36316) 2023-03-22 03:14:33 -05:00
Alec Scott
f86f30ad71 perl-list-moreutils and moreutils-xs: add v0.430 (#36291) 2023-03-22 03:10:28 -05:00
Alec Scott
4f4c9f440e perl-graph-readwrite: add v2.10 (#36302) 2023-03-22 03:09:51 -05:00
Adam J. Stewart
848ab435a5 py-torch: OpenMP support doesn't work on Apple Silicon (#36287) 2023-03-22 09:06:42 +01:00
Adam J. Stewart
2418bf446d py-jupyter-client: add v8.1.0 (#36288) 2023-03-22 09:06:10 +01:00
Alec Scott
893bb8d7c7 perl-libwww-perl: add v6.68 (#36292) 2023-03-22 03:04:16 -05:00
Alec Scott
9e6d048af2 perl-mce: add v1.884 (#36289) 2023-03-22 08:57:52 +01:00
Alec Scott
d17321ffc0 perl-log-log4perl: add v1.49 (#36290) 2023-03-22 08:57:36 +01:00
Alec Scott
38912d17f7 perl-json: add v4.10 (#36293) 2023-03-22 08:54:21 +01:00
Alec Scott
3185bd81b1 perl-ipc-run: add v20220807.0 (#36294) 2023-03-22 02:53:24 -05:00
Alec Scott
25035a302e perl-io-compress: add v2.204 (#36297) 2023-03-22 08:51:14 +01:00
Alec Scott
85c1b16213 perl-inline-c: add v0.81 (#36298) 2023-03-22 08:50:43 +01:00
Alec Scott
e2bc51fcad perl-exporter-tiny: add v1.006000 (#36313) 2023-03-22 02:47:38 -05:00
Alec Scott
c3b56f789c perl-http-message: add v6.44 (#36300) 2023-03-22 08:41:37 +01:00
Alec Scott
492ec0e783 perl-http-cookies: add v6.10 (#36301) 2023-03-22 08:41:16 +01:00
Alec Scott
653057e93a perl-graph: add v0.20105 (#36303) 2023-03-22 08:40:21 +01:00
Alec Scott
87c1cfaf03 perl-file-sharedir-install: add v0.14 (#36306) 2023-03-22 08:38:20 +01:00
Alec Scott
647bb5124e perl-file-pushd: add v1.016 (#36307) 2023-03-22 08:38:00 +01:00
Alec Scott
7c646a5dbd perl-file-homedir: add v1.006 (#36308) 2023-03-22 08:37:41 +01:00
Alec Scott
692d624f45 perl-file-copy-recursive: add v0.45 (#36309) 2023-03-22 08:37:22 +01:00
Alec Scott
98adc0b3f9 gh: add v2.25.0 (#36319) 2023-03-22 02:37:03 -05:00
Alec Scott
628dbce6f6 perl-ffi-checklib: add v0.31 (#36310) 2023-03-22 08:30:40 +01:00
Alec Scott
49079d6f88 perl-extutils-makemaker: add v7.68 (#36311) 2023-03-22 08:29:52 +01:00
Alec Scott
ff23a2a2ee perl-extutils-depends: add v0.8001 (#36312) 2023-03-22 08:29:35 +01:00
Alec Scott
725389ff32 perl-exporter-lite: add v0.09 (#36314) 2023-03-22 08:28:05 +01:00
Alec Scott
781959603d perl-devel-stacktrace: add v2.04 (#36317) 2023-03-22 08:26:27 +01:00
Alec Scott
787fe3283f perl-devel-overloadinfo: add v0.007 (#36318) 2023-03-22 08:25:18 +01:00
Alec Scott
c9c2b5e6bb coreutils: add v9.2 (#36320) 2023-03-22 08:23:55 +01:00
John W. Parent
97bdf28b29 libxml2: enable build on Windows (#36261)
Add Nmake-based builder for Windows
2023-03-21 23:33:22 -04:00
Christian F. A. Negre
f49e9591b7 lcc: new package (#36100)
* lcc: new package
* update CMake version per `CMakeLists.txt`

---------

Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2023-03-21 11:32:32 -07:00
Thomas Bouvier
a0bc32c319 nccl-tests: add version v2.13.6 (#36160) 2023-03-21 11:17:29 -07:00
Adam J. Stewart
52bcd0eda1 py-fiona: add v1.9.2 (#36278) 2023-03-21 12:39:55 -05:00
John W. Parent
2e9d0e146e netcdf-c[xx]: CMake/Windows build (#34935)
netcdf-cxx and netcdf-c now build with CMake rather than Autotools.
netcdf-c can still optionally build with Autotools (but defaults to
CMake). With some additional patches to the CMake files, netcdf-c
can use CMake to build on Windows.
2023-03-21 10:15:50 -07:00
downloadico
84ab72557a Update abinit version (#36264)
* abinit: add version 9.8.3
* require hdf5 up to 1.8 and libxc up to version 5
* abinit: constrained versions of libxc and hdf5
* fixed bad syntax for format
* fixed error looking for fftw in spec.
* Changed to look for fftw-api in spec.
* Update var/spack/repos/builtin/packages/abinit/package.py

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2023-03-21 10:09:38 -07:00
Wouter Deconinck
1d62d9460d xrootd: new version 5.5.2, 5.5.3 (#36271)
Only bugfixes, no build system changes, https://github.com/xrootd/xrootd/compare/v5.5.1...v5.5.3
2023-03-21 11:32:58 -04:00
Mosè Giordano
b9f32b1e7a curl: Add version 8.0.1 (#36256)
r: restrict compatibility with curl
2023-03-21 12:25:39 +01:00
Adam J. Stewart
2b539129f0 py-pillow: add v9.3.0 and v9.4.0 (#36259) 2023-03-21 06:08:21 -04:00
Michael Kuhn
9288ece826 environment-modules: add main branch (#36268) 2023-03-21 03:13:27 -04:00
Matthieu Dorier
9b09d8bc49 valijson: add new package (#36250) 2023-03-21 03:13:03 -04:00
Ryan Marcellino
4d90f464e1 py-parsl: add v1.2.0 (#36266) 2023-03-21 02:38:03 -04:00
Thomas Madlener
6edc480736 podio: Add version 0.16.3 (#36253) 2023-03-21 02:27:58 -04:00
Ryan Marcellino
649e9ae0ad py-cryptography: add v3.3.2 (#36267) 2023-03-21 02:23:25 -04:00
Adam J. Stewart
98ece85e63 py-timm: does not yet support Python 3.11 (#36260) 2023-03-21 02:23:03 -04:00
Mosè Giordano
fa57e62744 julia: Relax compatibility with curl (#36262)
Curl version 8 has a compatible ABI/API with version 7.
2023-03-21 02:18:02 -04:00
Ken Raffenetti
880c819d97 mpich: add 4.1.1 release (#35901) 2023-03-21 00:03:57 -04:00
Eric Martin
5fedb10370 py-reportseff: add new package (#36113)
* py-reportseff: add new package

* Update var/spack/repos/builtin/packages/py-reportseff/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-reportseff/package.py

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>

* Update var/spack/repos/builtin/packages/py-reportseff/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Add py-importlib-metadata prereq

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2023-03-20 22:17:01 -05:00
Jen Herting
9787253842 [srcml-identifier-getter-tool] New package (#35763)
* [srcml-identifier-getter-tool] New package
* [srcml-identifier-getter-tool] formatting
2023-03-20 18:08:35 -07:00
Alec Scott
3984a1e159 babl: add v0.1.102 (#35837)
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2023-03-20 17:33:10 -07:00
Carsten Uphoff
bfca1729fa Add double batched FFT library package (#36086)
* Add double batched FFT library package
* Fix style
* Add error when an unsupported compiler is used

---------

Signed-off-by: Carsten Uphoff <carsten.uphoff@intel.com>
2023-03-20 16:25:37 -07:00
Sangu Mbekelu
8d8a008ef2 new mosesdecoder package (#36252)
* new mosesdecoder package
* [@spackbot] updating style on behalf of Sangu-Mbekelu

---------

Co-authored-by: Sangu Mbekelu <s.mbekelu9@gmail.com>
2023-03-20 16:04:04 -07:00
Erik Schnetter
b7505aa726 universal: New package (#36168)
* universal: New package
* universal: Update to version 3.68
2023-03-20 15:36:38 -07:00
Wouter Deconinck
2cecb4b00c Xorg apps: updated versions to current latest (#36242)
* Xorg apps: updated versions to current latest

This updates all xorg apps to the latest versions, adding updated
requirements where needed.

No major version increases in any packages.

Minor version increases in some packages (build changes, if any, are
indicated below):
- rgb
- xauth
- xcalc
- xclock
- xeyes: xi >= 1.7, x11-xcb xcb-present >= 1.9 xcb-xfixes xcb-damage
- xfontsel
- xfs: xfont2 >= 2.0.1
- xinit
- xpr
- xrdb

Bugfix version increases in many packages, with no expected impact on
dependencies or interfaces.

Summary of dependency changes:
- xeyes:
  - depends_on("libxi@1.7:", when="@1.2:")
  - depends_on("libxcb@1.9:", when="@1.2:")
- xfs:
  - depends_on("libxfont@1.4.5:", when="@:1.1")
  - depends_on("libxfont2@2.0.1:", when="@1.2:")

* setxkbmap: depends_on libxrandr when @1.3.3:

* constype: new version
2023-03-20 15:26:28 -07:00
John W. Parent
8695d96bd1 NASM package: fix build on Windows (#35100) 2023-03-20 14:45:00 -07:00
John W. Parent
fa0749bfb8 lz4: switch to CMake build (#35101)
Add support for building with CMake and make it the default build
system on all platforms. By doing this, lz4 can now be built on
Windows. The makefile-based build remains as an option.
2023-03-20 14:39:19 -07:00
Wouter Deconinck
b431c4dc06 wayland: new versions, new build system (meson) (#36217)
* wayland: new versions, new build system (meson)

* wayland-protocols: new version, new build system (meson)

* [@spackbot] updating style on behalf of wdconinc

* wayland-protocols: added maintainer

* wayland: added maintainer

* wayland-protocols: no need to import build systems, per flake8

* wayland: no need to import build system, per flake8

* Update var/spack/repos/builtin/packages/wayland/package.py

---------

Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2023-03-20 12:31:45 -07:00
Peter Scheibel
c3e41153ac Package requirements: allow single specs in requirement lists (#36258)
If you have a "require:" section in your packages config, and you
use it to specify a list of requirements, the list elements can
now include strings (before this, each element in the list had to
be a `one_of` or `any_of` specification, which is awkward if you
wanted to apply just one spec with no alternatives).
2023-03-20 12:30:33 -07:00
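A minimal packages.yaml sketch of what this permits (package and specs illustrative):

```
packages:
  openmpi:
    require:
    - "+internal-hwloc"          # plain spec string, allowed by this change
    - any_of: ["~cuda", "%gcc"]  # grouped form, as before
```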
Andrew-Dunning-NNL
e1752ca382 new package py-oracledb (#36191)
* new package py-oracledb

* py-oracledb use python3.6:
2023-03-20 14:14:05 -04:00
Erik Heeren
2bcd4e0ecd py-pint-xarray: new package (#36106)
* py-pint-xarray: new package

* py-pint-xarray: review remarks
2023-03-20 10:34:33 -05:00
Erik Heeren
550bda3096 py-pdf2image: new package (#36088)
* py-pdf2image: new package

* py-pdf2image: 1.16.3 source now available on pypi

* Update var/spack/repos/builtin/packages/py-pdf2image/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-03-20 10:33:59 -05:00
Adam J. Stewart
334bc69a64 Python: add several new versions (#36249) 2023-03-20 10:18:13 -05:00
Harmen Stoppels
88d78025a6 spack install: simplify behavior when inside environments (#35206)
Example one:

```
spack install --add x y z
```

is equivalent to

```
spack add x y z
spack concretize
spack install --only-concrete
```

where `--only-concrete` installs without modifying spack.yaml/spack.lock

Example two:

```
spack install
```

concretizes current spack.yaml if outdated and installs all specs.

Example three:

```
spack install x y z
```

concretizes current spack.yaml if outdated and installs *only* concrete
specs in the environment that match abstract specs `x`, `y`, or `z`.
2023-03-20 13:51:30 +01:00
Miroslav Stoyanov
7e981d83fd heffte: update versions and arch flags (#36095)
* update versions and arch flags
* style update
* more style issues
* fix hashes and testing problem
* return the old versions, but they are really bad
* fix style

---------

Co-authored-by: Gerald Ragghianti <gerald@ragghianti.com>
2023-03-20 08:13:13 -04:00
Adam J. Stewart
b28e9e651d libpng: add v1.6.39 (#36247) 2023-03-20 07:23:23 -04:00
Adam J. Stewart
5dc8ed2694 Remove unused ignore parameter of extends() directive (#35588)
The `ignore` parameter was only used for `spack activate/deactivate`, and it isn't used
by Spack Environments, which have their own handling of file conflicts. We should remove it.

Everything that handles `ignore=` was removed in #29317 and included in 0.19, when we
removed `spack activate` and `spack deactivate` in favor of environments.  So all of these
usages removed here were already being ignored by Spack.
2023-03-20 07:22:59 -04:00
Jean-Baptiste Besnard
199f71ea48 LULESH: fix space in rpath for +visual (#36094) 2023-03-20 11:16:58 +01:00
Harmen Stoppels
b8e5fc061d ci.py: remove redundant wrapper around get (#36188) 2023-03-20 10:56:19 +01:00
Adam J. Stewart
b77a4331bc GEOS: add v3.11.2 (#36189) 2023-03-20 10:54:41 +01:00
Rob Falgout
adcdf4a7e2 hypre: add v2.28.0 (#36187) 2023-03-20 10:41:56 +01:00
Erik Schnetter
dfd63ccd73 lrzip: New version 0.651 (#36196) 2023-03-20 10:35:57 +01:00
Alec Scott
2bfcfd1f72 perl: add v5.37.9 (#36205) 2023-03-20 10:34:44 +01:00
Alec Scott
7518362706 perl-b-hooks-endofscope: add v0.26 (#36208) 2023-03-20 10:32:37 +01:00
Alec Scott
13d8bc47c8 perl-capture-tiny: add v0.48 (#36209) 2023-03-20 10:32:16 +01:00
Alec Scott
d5c0d1ce58 perl-class-inspector: add v1.36 (#36211) 2023-03-20 10:31:59 +01:00
Alec Scott
46bd481124 perl-alien-build: add v2.78 (#36206) 2023-03-20 10:31:10 +01:00
Alec Scott
e92b996db9 perl-app-cmd: add v0.335 (#36207) 2023-03-20 10:14:37 +01:00
Alec Scott
eb1723332e perl-cgi: add v4.56 (#36210) 2023-03-20 10:13:20 +01:00
Erik Schnetter
3afef0635f rclone: New version 1.62.2 (#36197) 2023-03-20 10:12:59 +01:00
Alec Scott
032385ae51 perl-dbi: add v1.643 (#36218) 2023-03-20 10:07:15 +01:00
Alec Scott
7a9578ce7d perl-dbd-sqlite: add v1.72 (#36219) 2023-03-20 10:06:55 +01:00
Alec Scott
e155df5ada perl-db-file: add v1.858 (#36221) 2023-03-20 10:06:11 +01:00
Alec Scott
005af3e755 perl-date-manip: add v6.91 (#36222) 2023-03-20 10:05:52 +01:00
Alec Scott
85e721c16c perl-data-optlist: add v0.113 (#36223) 2023-03-20 10:05:14 +01:00
Alec Scott
e61ae290a2 perl-cpan-meta-check: add v0.017 (#36224) 2023-03-20 10:04:58 +01:00
Alec Scott
782d3b889a perl-config-general: add v2.65 (#36225) 2023-03-20 10:04:11 +01:00
Alec Scott
3a7e5372d0 perl-compress-raw-zlib: add v2.204 (#36226) 2023-03-20 10:03:26 +01:00
Alec Scott
114e9b528f perl-compress-raw-bzip2: add v2.204 (#36227) 2023-03-20 10:02:59 +01:00
Alec Scott
8e3021cdb1 perl-clone: add v0.46 (#36228) 2023-03-20 10:02:19 +01:00
Alec Scott
9542d46395 perl-class-method-modifiers: add v2.15 (#36229) 2023-03-20 10:01:59 +01:00
Alec Scott
0825e9a95e perl-class-load: add v0.25 (#36230) 2023-03-20 10:01:45 +01:00
Alec Scott
b586c8cf1d lis: add v2.1.0 (#36231) 2023-03-20 10:01:11 +01:00
Alec Scott
eba3f5503b bazel: add v6.1.1 (#36234) 2023-03-20 10:00:48 +01:00
Harmen Stoppels
e30a89fb7c llvm: add v16 (#36239) 2023-03-20 09:54:31 +01:00
Alec Scott
0646c953e5 homer: add v4.11.1 (#36232) 2023-03-20 09:52:58 +01:00
Alec Scott
d6d68b892a perl-dbd-pg: add v3.16.1 (#36220) 2023-03-20 09:52:13 +01:00
Mark W. Krentel
8b1c5d910d intel-xed: add version 2022.10.11 (#36244) 2023-03-20 09:29:51 +01:00
Alec Scott
a800361344 pax-utils: add v1.3.3 (#36204) 2023-03-20 09:28:12 +01:00
Alec Scott
631a3d849f openldap: add v2.6.4 (#36202) 2023-03-20 09:27:40 +01:00
Alec Scott
af09297a76 gpgme: add v1.19.0 (#36201) 2023-03-20 09:27:20 +01:00
Alec Scott
1b27a2dda5 code-server: add v4.11.0 (#36200) 2023-03-20 09:26:58 +01:00
Alec Scott
3ebe5939e3 autodiff: add v1.0.1 (#36199) 2023-03-20 09:26:24 +01:00
Alec Scott
c9a4bf8d3f elfutils: add v0.189 (#35859) 2023-03-20 09:26:01 +01:00
Adam J. Stewart
973e37823c py-tensorboard-data-server: add v0.7.0 (#36248) 2023-03-20 09:25:33 +01:00
Xavier Delaruelle
41d7fe0a50 modules tcl: fix autoload mechanism in template (#36237)
Adapt the tcl modulefile template to call "module load" on autoloaded
dependencies without first testing whether each dependency is already loaded.

The is-loaded test is not necessary, as module commands know how to cope
with an already loaded module. With environment-modules 4.2+ (released
in 2018) it is also important to issue this "module load" command even if
the dependency is already loaded, in order to record that the modulefile
declares such a dependency. This matters if you want to keep a
consistent environment when a dependent module is unloaded.

The "Autoloading" verbose message is also removed as recent module
commands will report such information to the user (depending on the
verbosity configured for the module command).

This change has been tested successfully with Modules 3.2 (EL7), 4.5 (EL8)
and 5.2 (latest) and also with Lmod 7 and 8 (as the Spack docs mention
that Lmod can be used along with tcl modules). Dependencies
are correctly loaded or unloaded, whether or not they were already loaded.

This change fixes a Tcl quoting issue introduced in #32853.

Fixes #19155.
2023-03-20 09:23:40 +01:00
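Illustratively, the autoload section of a generated tcl modulefile changes roughly like this (simplified sketch, not the exact template output):

```
# before: load guarded by an is-loaded test, with a verbose message
if { ![ is-loaded foo/1.0 ] } {
    puts stderr "Autoloading foo/1.0"
    module load foo/1.0
}

# after: unconditional load; the module command copes with already-loaded
# modules and records the declared dependency
module load foo/1.0
```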
Adam J. Stewart
1af863a1e3 bash: add v5.2.15 (#36245) 2023-03-20 09:14:41 +01:00
Adam J. Stewart
75714d30f5 gawk: fix build on Apple Silicon (#36246) 2023-03-20 09:14:00 +01:00
Alec Scott
d5e30ac5f1 diffutils: add v3.9 (#35852) 2023-03-20 08:37:36 +01:00
Alec Scott
8c4265f033 harminv: add v1.4.2 and update URL to maintained git repository (#36062) 2023-03-20 08:36:25 +01:00
Angus Gibson
b8b6ae42a0 py-setuptools-git-versioning: new package (#36123)
* py-setuptools-git-versioning: new package

* Apply suggestions from code review

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-03-19 20:02:14 -04:00
Wouter Deconinck
5532350d4b qt-* (Qt6 pkgs): new version 6.4.3 (#36235) 2023-03-19 09:07:43 -05:00
Alec Scott
620effec1b bumpversion: add v0.6.0 and bump2version dependency (#36021)
* bumpversion: add v0.6.0

* Add bump2version dependency package
2023-03-18 21:51:06 -06:00
Xavier Delaruelle
df97827a7b Fix case spelling for Lmod and Tcl (#36215) 2023-03-19 01:42:50 +00:00
Wouter Deconinck
4ffdde94ef py-hepunits: new versions 2.2.0, 2.2.1, 2.3.0, 2.3.1 (#35545)
* py-hepunits: new versions 2.2.0, 2.2.1, 2.3.0, 2.3.1

Python 2 support dropped in 2.2 series.

Ref: https://github.com/scikit-hep/hepunits/compare/v2.1.1...v2.3.1

* py-hepunits: py-hatchling as of version 2.3

* [@spackbot] updating style on behalf of wdconinc

* py-hepunits: only depends_on toml through 2.1.1

---------

Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2023-03-18 17:18:04 -05:00
Benjamin Meyers
5b04146f8a New package py-pyhull (#36107)
* New package py-pyhull

* [@spackbot] updating style on behalf of meyersbs
2023-03-18 17:02:31 -05:00
Benjamin Meyers
eddbbb867d New package py-seekpath (#36108)
* New package py-seekpath

* [@spackbot] updating style on behalf of meyersbs
2023-03-18 17:01:43 -05:00
Benjamin Meyers
32154e6fc7 New package py-pyisemail (#36112) 2023-03-18 17:00:46 -05:00
Adam J. Stewart
6618b0c830 py-scikit-image: add v0.20.0 (#36167)
* py-scikit-image: add v0.20.0

* [@spackbot] updating style on behalf of adamjstewart

---------

Co-authored-by: adamjstewart <adamjstewart@users.noreply.github.com>
2023-03-18 16:49:39 -05:00
Massimiliano Culpo
d84c6ad29e cmake build system: make "generator" a variant (#35552) 2023-03-18 16:39:04 +01:00
Massimiliano Culpo
2f07c64f2d Fix wrong computation of concrete specs due to a bug in intersects (#36194)
fixes #36190
2023-03-18 12:50:52 +01:00
Alec Scott
ca5cab8498 patchelf: add v0.17.2 (#36203) 2023-03-18 11:09:58 +01:00
Alec Scott
5f8ee20c7c ffmpeg: add v6.0 (#35857)
* ffmpeg: add v6.0

* Add limit to py-torchvision to prevent ffmpeg v6.0
2023-03-18 02:47:44 -04:00
Harmen Stoppels
fd70a2cc07 cython: force through env variable (#35995) 2023-03-17 19:54:24 -04:00
Alec Scott
31201f91bc libsigsegv: add v2.14 (#36070) 2023-03-17 18:39:26 -04:00
Ben Morgan
da0b76047d geant4: new version 11.0.4 (#36185) 2023-03-17 17:58:42 -04:00
Massimiliano Culpo
0478e5f684 Improve wording of audit message (#36180) 2023-03-17 17:43:35 -04:00
Alec Scott
4f7c147d50 libpciaccess: add v0.17 (#36076) 2023-03-17 17:33:44 -04:00
Amintor Dusko
73a887ee7c Update PennyLane and PennyLane Lightning (#35406) 2023-03-17 17:28:26 -04:00
John W. Parent
8195f27a66 Windows: properly handle symlink failures (#36003)
In the Windows filesystem logic for creating a symlink, we intend to
fall back to a copy when the symlink cannot be created (for some
configuration settings on Windows it is not possible for the user
to create a symlink). It turns out we were overly-broad in which
exceptions lead to this fallback, and the subsequent copy would
also fail: at least one case where this occurred is when we
attempted to create a symlink that already existed.

The updated logic expressly avoids falling back to a copy when the
file/symlink already exists.
2023-03-17 10:19:32 -07:00
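A minimal Python sketch of the narrowed fallback described above (names are illustrative, not Spack's actual implementation):

```
import os
import shutil

def symlink_or_copy(src, dst):
    if os.path.lexists(dst):
        # an existing file or symlink is an error, not a reason to copy
        raise FileExistsError(dst)
    try:
        os.symlink(src, dst)
    except OSError:
        # symlink creation may be disallowed by Windows settings; fall back
        shutil.copyfile(src, dst)
```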
Alec Scott
a60fa7ff7d libxdmcp: add v1.1.4 (#36074) 2023-03-17 13:11:10 -04:00
Adam J. Stewart
6272853030 Bazel: limit parallelism (#36002)
* Bazel: limit parallelism

* Patch packages that don't directly invoke bazel

* Style fixes

* flag comes after build, not bazel

* flag comes after build, not bazel

* command is only attribute if specific package
2023-03-17 11:13:27 -05:00
Seth R. Johnson
507b42c54f veccore: new version 0.8.1 (#36184) 2023-03-17 09:19:04 -04:00
Szilárd Páll
3897c1308e Switch GROMACS build type to Release (#36181)
The current default RelWithDebInfo gives significantly slower builds,
so it should not be the default.
2023-03-17 07:17:06 -06:00
Valentin Volkl
b54d208aea boost: add patch for 1.81.0 (#35964) 2023-03-17 11:42:43 +01:00
Edoardo Aprà
612aa744f6 nwchem: add v7.2.0 (#36061) 2023-03-17 11:41:33 +01:00
Massimiliano Culpo
97193a25ce Mitigation for GitVersion bug when no =reference is given (#36159)
* ASP-based solver: use satisfies instead of intersects

They are semantically equivalent for concrete versions,
but the GitVersion.intersects implementation is buggy

* Mitigation for git version bug

fixes #36134

This commit works around the issue in #36134, by using
GitVersion.satisfies instead of GitVersion.intersects

There are still underlying issues when trying to infer the
"reference version" when no explicit one is given, but:

1. They are not reproducible with our synthetic repo
2. They occur only when the `git.<xxx>` form of Git version
   is used

Here we just work around the user facing issue and ensure
the tests are correct with our synthetic repository.
2023-03-17 11:36:29 +01:00
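For context, the two forms of a Git version look like this on the command line (package and refs illustrative):

```
spack install pkg@git.mybranch        # reference version inferred (the buggy path)
spack install pkg@git.mybranch=1.2.3  # explicit =reference version
```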
Vicente Bolea
5bf96561ee vtk-m: update to latest release (#35590)
* vtk-m: add v2.0.0
* Update var/spack/repos/builtin/packages/vtk-m/package.py

---------

Co-authored-by: Kenneth Moreland <morelandkd@ornl.gov>
2023-03-17 11:19:13 +01:00
Cameron Book
f2ba1d276b nccmp: add more constrain to dependencies, add configure args (#35539) 2023-03-17 11:10:36 +01:00
dependabot[bot]
4e060ba933 build(deps): bump actions/checkout from 3.3.0 to 3.4.0 (#36140)
Bumps [actions/checkout](https://github.com/actions/checkout) from 3.3.0 to 3.4.0.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](ac59398561...24cb908017)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-03-17 10:58:57 +01:00
afzpatel
dd15c37021 hipcub and rocprim: enable testing (#35660) 2023-03-17 10:56:54 +01:00
Harmen Stoppels
141c154948 openssh: 9.2, 9.3 (#36162) 2023-03-17 10:31:39 +01:00
Alec Scott
e51447c2c0 nano: add v7.2 (#36148) 2023-03-17 10:16:27 +01:00
Adam J. Stewart
a84fb716a0 Update the PyTorch ecosystem (#36132)
* py-pytorch-lightning: add v2.0.0

* py-lightning-utilities: add v0.8.0

* Update all PyTorch packages

* Open-CE does not yet have patches for PyTorch 2 on ppc64le
2023-03-17 10:13:44 +01:00
M. Eric Irrgang
a11f06885f Fix --test behavior for gromacs package. (#35674)
For `spack install --test=all gromacs`
* remove the `test` target from the `check()` call and just use
  the `check` target, in accordance with usual GROMACS test protocol
* build the test binaries explicitly during the build phase

Additional minor updates are necessary. This change
updates the package structure to the newer format with a
separate Builder class so we can override `check()`.
However, note that additional modernization should be
undertaken with care.
2023-03-17 10:11:22 +01:00
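A rough sketch of the builder shape this refers to (simplified, assuming the usual GROMACS `tests` and `check` make targets; not the actual recipe):

```
class CMakeBuilder(spack.build_systems.cmake.CMakeBuilder):
    def build(self, pkg, spec, prefix):
        super().build(pkg, spec, prefix)
        # build the test binaries explicitly during the build phase
        with working_dir(self.build_directory):
            make("tests")

    def check(self):
        # use only the 'check' target, per usual GROMACS test protocol
        with working_dir(self.build_directory):
            make("check")
```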
Massimiliano Culpo
8517a74f37 ASP-based solver: tweak heuristic, modify compiler encoding (#35989)
This PR does 2 unrelated things:
1. It changes the encoding of the compilers
2. It tweaks the heuristic for the solves in a0d8817907

Both were initially motivated by trying to get a performance gain but, while 2 showed significant speed-ups[^1], 1 instead didn't. I kept it anyhow, since I think the code related to compilers is more consolidated with the new encoding and we might get some performance improvement out of it if we can base our errors on the `node_compiler(Package, CompilerID)` atoms instead of `attrs`.

[^1]: In general the changes in the heuristic brought a ~10% speed-up on the tests I did. I'll post detailed results below.

Add a warning about compilers.yaml that is triggered if there are multiple compilers with the same spec, os and
target (since they can't be selected by users with the spec syntax only).
2023-03-17 00:39:41 -07:00
Alec Scott
34ef01a5c8 libx11: add v1.8.4 (#36075) 2023-03-17 00:04:07 -04:00
Michael Kuhn
86e49a63ce lmod: add 8.7.20 (#36177) 2023-03-17 02:07:31 +01:00
John W. Parent
d76845e875 libpng package: build with CMake (#35105) 2023-03-16 16:44:53 -07:00
Matthew Thompson
97d6c741b0 Fix for ESMF post_install on macOS (#36087) 2023-03-16 16:32:54 -07:00
Bill Williams
09fd3e8e61 Add explicit configure args to fix instrumentation-time paths (#36089) 2023-03-16 16:30:48 -07:00
Stephen Sachs
e341dac014 [pmix] master branch uses git submodule config/oac (#36104)
* [pmix] master branch uses git submodule config/oac
* Add comment for future versions
2023-03-16 16:04:56 -07:00
Stephen Sachs
38383743e7 pmix, openmpi, and prrte need to use the same configure to find the same deps (#36105)
* [openmpi] 5.0.0.rc10 onwards needs munge

This is the error you will see when munge is missing from `PKG_CONFIG_PATH`:

```
configure:63942: checking for pmix pkg-config cflags
configure:63956: check_package_pkgconfig_run_results=Package munge was not found in the pkg-config search path.
Perhaps you should add the directory containing `munge.pc'
to the PKG_CONFIG_PATH environment variable
Package 'munge', required by 'pmix', not found
configure:63959: $? = 1
configure:63966: pkg-config output: Package munge was not found in the pkg-config search path.
Perhaps you should add the directory containing `munge.pc'
to the PKG_CONFIG_PATH environment variable
Package 'munge', required by 'pmix', not found
configure:63972: result: error
configure:63974: error: An error occurred retrieving pmix cppflags from pkg-config
```

* Use same PKG_CONFIG_PATH defaults for ompi+pmix+prrte

The issue I tried to fix in https://github.com/spack/spack/pull/36105 comes from
different default search paths in different `pkg-config` executables used in
the `openmpi` and `pmix` packages. As these tools (`openmpi`, `pmix`, and `prrte`)
all use the same mechanisms to detect dependencies, the `pkg-config` environment
they use should also be equal.
2023-03-16 16:02:02 -07:00
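A hedged illustration of the underlying requirement: munge's pkg-config file must be on the search path seen by every one of those configure runs (path is a placeholder):

```
export PKG_CONFIG_PATH=/path/to/munge/lib/pkgconfig:$PKG_CONFIG_PATH
pkg-config --cflags pmix   # should now resolve munge instead of erroring
```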
Rémi Lacroix
8b94cc4ec2 libaio: Add version 0.3.113 (#36101) 2023-03-16 14:27:50 -07:00
Wouter Deconinck
0a55b44092 autodiff: new version 1.0.0 (#36121)
No more https://0ver.org. No changes to build system since 0.6.12.
2023-03-16 14:20:43 -07:00
Erik Heeren
d97bb895e8 Ospray (#36128)
* ospray: denoiser and GLM variants
* ospray: denoiser defaults to True to preserve previous behaviour
2023-03-16 14:11:03 -07:00
Erik Schnetter
e8482d9e79 openssl: New version 3.1.0 (#36166) 2023-03-16 16:03:06 -04:00
Richard Berger
3f3565e890 LAMMPS: add new versions (#35592)
* LAMMPS: add new stable version 20220623.3
* LAMMPS: add new patch version 20230208
2023-03-16 12:52:48 -07:00
Michael Kuhn
3bb35fbaf6 meson: add 1.0.1 (#35987) 2023-03-16 15:28:09 -04:00
Dom Heinzeller
4572052c63 Modify info print of ESMF_CPP due to permission denied errors in spack on MSU Hercules (#35969)
* Skip info print of ESMF_CPP due to permission denied errors in spack on MSU Hercules
* Better version of patch
2023-03-16 12:22:57 -07:00
Harmen Stoppels
ba00da61e4 reduce spec.json.sig file size (#36157)
Since GPG clear-sign cannot deal with lines longer than 19995 characters
and doesn't even error but simply truncates those lines (don't ask me
why...), we have to be careful not to hit that line limit when reducing
the file size.

So, instead this PR sets the indent level to 0 and drops the whitespace
after `: `, which still reduces file size by 50% or so.
2023-03-16 19:46:13 +01:00
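A minimal Python sketch of the two knobs described (illustrative, not Spack's actual signing code):

```
import json

spec = {"name": "zlib", "version": "1.2.13"}  # hypothetical spec payload
with open("spec.json", "w") as f:
    # indent=0 keeps one item per line, so no line approaches the
    # ~20k-character GPG limit; separators drops the space after ':'
    json.dump(spec, f, indent=0, separators=(",", ":"))
```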
John W. Parent
825599a510 Windows: target arch based on spec target arch (#35797)
Update packages to check Spec's target rather than the host platform.
2023-03-16 11:31:19 -07:00
renjithravindrankannath
4f6f1b620f Include rocm-openmp-extras header and omp library (#36142) 2023-03-16 11:21:55 -07:00
Rocco Meli
08dc2d4020 add rdkit for gninavis and remove mpi (#36117) 2023-03-16 11:03:26 -07:00
Pierre Jolivet
6af84c4574 slepc: add HPDDM wrappers (#36118) 2023-03-16 12:32:07 -05:00
Harmen Stoppels
50cc1d12f9 Revert "minify spec.json in buildcache (#36138)" (#36156)
This reverts commit 1a8eefe09b.
2023-03-16 10:30:52 +01:00
Alec Scott
c29168eff1 openfst: add v1.8.2 (#36143) 2023-03-16 09:23:21 +01:00
Alec Scott
5ed1efab40 openal-soft: add v1.23.0 (#36144) 2023-03-16 09:23:05 +01:00
Alec Scott
887d70410d octave: add v8.1.0 (#36145) 2023-03-16 09:22:50 +01:00
Alec Scott
a9936141ee nginx: add v1.23.3 (#36146) 2023-03-16 09:22:31 +01:00
Alec Scott
5744fc3637 netdata: add v1.38.1 (#36147) 2023-03-16 09:21:54 +01:00
Alec Scott
b13c201f46 mpdecimal: add v2.5.1 (#36149) 2023-03-16 09:20:15 +01:00
Alec Scott
e2ab46251b mpc: add v1.3.1 (#36150) 2023-03-16 09:19:56 +01:00
Alec Scott
193c927bd2 mosh: add v1.4.0 (#36151) 2023-03-16 09:18:20 +01:00
Alec Scott
9d195da8ee mkfontscale: add v1.2.2 (#36152) 2023-03-16 09:18:04 +01:00
Alec Scott
132b89178e erlang: add v25.3 (#36153) 2023-03-16 09:17:47 +01:00
Adam J. Stewart
6491e08f5d qt-base: add qmake attribute (#36114) 2023-03-15 20:53:59 -05:00
kwryankrattiger
bb73dfc02e Hotfix: CI: Add CI target for gpu-test stack (#36136) 2023-03-15 21:13:02 -04:00
Howard Pritchard
64fa902ba6 UCX: make version support level more realistic (#36127)
Per feedback from the UCX community, we rarely do update
releases to anything but the current and one previous main
release stream.

Update comments in the UCX spack file to reflect this.

Signed-off-by: Howard Pritchard <howardp@lanl.gov>
2023-03-15 17:09:41 -04:00
Harmen Stoppels
1a8eefe09b minify spec.json in buildcache (#36138)
saves about 50% of data, which is significant
for hundreds of thousands of spec.json files
in our buildcaches.
2023-03-15 16:54:03 -04:00
Alec Scott
85d51bfd9a extrae: add v4.0.3 (#36059) 2023-03-15 15:24:38 -04:00
Alec Scott
e5d78e3780 libpipeline: add v1.5.7 (#36068) 2023-03-15 15:19:54 -04:00
Annop Wongwathanarat
99893a6475 wrf: fix patches for aarch64 config (#35984) 2023-03-15 12:40:00 +01:00
Annop Wongwathanarat
5f8f89b9c9 py-numpy: enable linking with armpl-gcc and acfl for BLAS and LAPACK (#35417)
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-03-15 12:38:14 +01:00
Mosè Giordano
028535030c julia: Some improvements to the package (#36054) 2023-03-15 11:10:50 +01:00
Robert Blake
0e295afb1c cardioid: fix homepage (#36099) 2023-03-15 09:19:34 +01:00
Mark W. Krentel
e58c84e63e hpctoolkit: add branch 2023.03.stable (#36096) 2023-03-15 09:18:18 +01:00
Howard Pritchard
37904c3342 UCX: add 1.14.0 (#36098)
Signed-off-by: Howard Pritchard <howardp@lanl.gov>
2023-03-15 09:17:55 +01:00
Adam J. Stewart
9f116c7bb1 GDAL: add v3.6.3 (#36097) 2023-03-15 09:16:48 +01:00
Shahzeb Siddiqui
b5f3b5bf78 Remove leftover command from documentation (#36116)
The command refers to dotkit files, which have not been supported for a long time.
2023-03-14 20:48:28 -04:00
Erik Heeren
93887edba8 py-pyshacl: patch dependency typo (#36084)
* py-pyshacl: patch dependency typo

* py-pyshacl: satisfy flake8

* Update var/spack/repos/builtin/packages/py-pyshacl/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-03-14 20:18:27 -04:00
John W. Parent
cd42fc5cc8 Libogg and libtheora: build on windows (#35099)
Adds builders appropriate for building these packages on Windows.
It is intended that builds on other platforms are unaffected (e.g.
they build with Autotools as before on Linux).
2023-03-14 16:46:49 -07:00
Sajid Ali
9a1254063a Fix HDF5+mpi~fortran (#35400)
* HDF5+mpi~fortran
* fix style
2023-03-14 19:04:34 -04:00
Alec Scott
32f8ee6d58 libxfont: add v1.5.4 (#36072) 2023-03-14 18:59:34 -04:00
Harmen Stoppels
25239924fa postgresql: fix typo (#36115) 2023-03-14 18:00:27 -04:00
Erik Heeren
7b27cd2f94 py-pint: new versions (#36102)
* py-pint: new versions

* Update var/spack/repos/builtin/packages/py-pint/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-pint/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-pint/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-03-14 18:00:01 -04:00
Gregory Lee
02e579d23d gobject-introspection 1.7.2 also requires libffi@:3.3 (#35606) 2023-03-14 14:05:40 -07:00
Matthias Wolf
6add885bb2 py-antspyx: new package (#30964)
* py-antspyx: new package

Also adds required dependencies.

Requires options to ITK to enable the right support libraries, and
patches to tune the setup and provide resources rather than
downloading libraries/"submodules" on the fly.

* Fix patch URL

* Style fixes.

* bump version and re-include `git clone ...` as resource
2023-03-14 16:51:38 -04:00
Harmen Stoppels
96b205ce6c environment.matching_spec: linear time traversal (#35534)
... and use colors in disambiguate message for clarity.

This commit avoids the loop:

```
for root in roots:
  for dep in deps(root):
    ...
```

instead it ensures each node is visited once and only once.

Also adds a small optimization when searching for concrete specs, since
we can assume uniqueness of dag hash, so it's fine to early exit.
2023-03-14 11:18:10 -07:00
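A sketch of the single-visit pattern this describes (names illustrative, not the exact Spack code):

```
matches = []
visited = set()
for root in roots:
    for node in root.traverse():
        key = node.dag_hash()
        if key in visited:
            continue          # each node is handled once across all roots
        visited.add(key)
        if node.satisfies(query_spec):
            matches.append(node)
```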
Alec Scott
1711e186fe go: add v1.20.2 and v1.19.7 (#36065) 2023-03-14 18:28:57 +01:00
Rocco Meli
16f70ca78d pexsi: add v1.2 and v2.0 (#36049) 2023-03-14 16:02:11 +01:00
Alec Scott
2437a1d554 makedepend: add v1.0.8 (#36078) 2023-03-14 09:43:28 -04:00
Alec Scott
a6432bc770 armadillo: add v12.0.1 (#36051) 2023-03-14 09:43:04 -04:00
Alec Scott
ae6902b7ab looptools: add v2.16 (#36077) 2023-03-14 09:38:13 -04:00
Harmen Stoppels
40019dacd9 Use bfs in get_spec_filter_list (#36093) 2023-03-14 14:34:56 +01:00
Sangu Mbekelu
5c48304d07 new py-ultralytics package (#35890)
* new py-ultralytics package

* [@spackbot] updating style on behalf of Sangu-Mbekelu

* Update package.py

modified dependencies

---------

Co-authored-by: Sangu Mbekelu <s.mbekelu9@gmail.com>
2023-03-14 09:33:55 -04:00
Alec Scott
bab2f0a1b0 cbc: add v2.10.8 (#36055) 2023-03-14 09:23:43 -04:00
Alec Scott
ecc781fb3c libpcap: add v1.10.3 (#36067) 2023-03-14 09:19:49 -04:00
Harmen Stoppels
1691b7caac Fix typo affecting Gitlab CI (#36103)
Introduced in #35944
2023-03-14 14:18:05 +01:00
Seth R. Johnson
4f848f9200 vecgeom: new version 1.2.2 (#36085) 2023-03-14 08:48:18 -04:00
Alec Scott
08298b6766 mariadb-c-client: add v3.3.4 (#36079) 2023-03-14 08:37:58 -04:00
Alec Scott
11a509a40e man-db: add v2.11.2 (#36080) 2023-03-14 08:32:55 -04:00
Alec Scott
e3a7ad8112 libssh: add v0.8.9 (#36069) 2023-03-14 08:23:35 -04:00
Alec Scott
6efec2b2bd libxfont2: add v2.0.6 (#36073) 2023-03-14 08:16:00 -04:00
Alec Scott
ecd6fc00fd libtasn1: add v4.19.0 (#36071) 2023-03-14 08:07:12 -04:00
Alec Scott
87dc28a2f7 kmergenie: add v1.7051 (#36066) 2023-03-14 08:03:08 -04:00
Alec Scott
b2633e9057 fjcontrib: add v1.051 (#36060) 2023-03-14 08:02:46 -04:00
Alec Scott
116bc396c2 ccache: add v4.8 (#36056) 2023-03-14 08:02:23 -04:00
Alec Scott
39049e2bde hugo: add v0.111.3 (#36063) 2023-03-14 07:54:08 -04:00
Alec Scott
309969053e coinutils: add v2.11.6 (#36058) 2023-03-14 07:53:26 -04:00
Alec Scott
3fbd06023c cgl: add v0.60.6 (#36057) 2023-03-14 07:53:11 -04:00
Benjamin Meyers
853b964947 New packages: py-robocrys, py-matminer, py-pubchempy (#35941)
Co-authored-by: Bernhard Kaindl <43588962+bernhardkaindl@users.noreply.github.com>
2023-03-14 07:27:51 -04:00
Harmen Stoppels
f7da7db9b2 use stage dir for buildcache create (#36091) 2023-03-14 09:35:47 +01:00
Michael Kuhn
5bae742826 concretizer: add mode to reuse dependencies only (#30990)
This adds a new mode for `concretizer:reuse` called `dependencies`,
which only reuses dependencies. Currently, `spack install foo` will
reuse older versions of `foo`, which might be surprising to users.
2023-03-14 09:22:20 +01:00
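The new mode, as a minimal concretizer.yaml example:

```
concretizer:
  reuse: dependencies   # other values: true (reuse everything), false (fresh)
```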
Rocco Meli
03636cd6ac Update MDAnalysis and addition of MDAnalysisTests (#36052)
* update mda dependencies

* apply black

* mdanalysis draft

* update

* small fixes

* Update var/spack/repos/builtin/packages/py-mdanalysis/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-mdanalysis/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-mdanalysis/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-mdanalysis/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-mdanalysis/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-03-13 15:01:24 -05:00
nkgh77
ee1ea1f430 octave: better specification of MKL and AMDFFTW libraries (#35935) 2023-03-13 10:06:42 +01:00
Alec Scott
ff019f868b libiberty: add v2.40 (#36042) 2023-03-13 09:30:44 +01:00
Adam J. Stewart
2bbc6390dc py-earthengine-api: add v0.1.344 (#36053) 2023-03-13 09:27:15 +01:00
Harmen Stoppels
2107b6bf00 Set build_jobs dynamically in CI to avoid oversubscription (#35996)
Co-authored-by: Zack Galbreath <zack.galbreath@kitware.com>
Co-authored-by: Ryan Krattiger <ryan.krattiger@kitware.com>
2023-03-13 08:29:58 +01:00
Alec Scott
31de7ea56c font-util: add v1.4.0 (#35860) 2023-03-12 21:24:01 +01:00
Jen Herting
55870efbcc New package: py-inflect (#35942)
Co-authored-by: Alex C Leute <aclrc@sporcsubmit.rc.rit.edu>
Co-authored-by: qwertos <qwertos@users.noreply.github.com>
2023-03-12 20:44:51 +01:00
Sangu Mbekelu
c38b463954 added a new verison of py-certifi (#35940)
Co-authored-by: Sangu Mbekelu <s.mbekelu9@gmail.com>
2023-03-12 10:52:17 -04:00
Michael Kuhn
5a4bc51bc0 cube: add 4.8 and 4.7.1 (#35959) 2023-03-12 11:38:32 +01:00
Massimiliano Culpo
528aca7c88 Revert "banner: add v3.5 (#36019)" (#36046)
This reverts commit 61af6b8f37.
2023-03-12 11:31:47 +01:00
Michael Kuhn
9fcfdf7a97 zstd: add v1.5.4 (#35438) 2023-03-12 05:42:20 -04:00
Greg Becker
66bf9bc7a6 cce compiler: bugfix for version regex to avoid conflation with apple-clang (#35974)
Currently apple-clang is detected as cce, and it should not be.
---------

Co-authored-by: becker33 <becker33@users.noreply.github.com>
2023-03-12 08:17:09 +00:00
Jonathon Anderson
dee5cb1aeb gloo: fix build on Linux >=6.0.3 (#35992) 2023-03-12 08:53:31 +01:00
Cameron Book
25666f9254 gsi-ncdiag: add new package (#35999) 2023-03-12 08:42:55 +01:00
Brian Van Essen
d78d112f18 aluminum, lbann: add new versions and deprecate old ones (#35954) 2023-03-12 08:40:13 +01:00
Alec Scott
1a97fddf5a at-spi2-core: add v2.47.90 (#36014) 2023-03-12 08:34:48 +01:00
Heiko Bauke
29d989a048 mpl: add v0.3.0 (#36015) 2023-03-12 08:33:27 +01:00
dependabot[bot]
79bba432df build(deps): bump docker/setup-buildx-action from 2.4.1 to 2.5.0 (#36008)
Bumps [docker/setup-buildx-action](https://github.com/docker/setup-buildx-action) from 2.4.1 to 2.5.0.
- [Release notes](https://github.com/docker/setup-buildx-action/releases)
- [Commits](f03ac48505...4b4e9c3e2d)

---
updated-dependencies:
- dependency-name: docker/setup-buildx-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-03-12 08:32:17 +01:00
Alec Scott
ef4971d2e1 actsvg: add v0.4.30 (#36012) 2023-03-12 08:30:08 +01:00
Adam J. Stewart
1cc7ea651a py-torchmetrics: add v0.11.4 (#36016) 2023-03-12 08:29:41 +01:00
Alec Scott
16fd615fad autoconf-archive: add v2023.02.20 (#36017) 2023-03-12 08:29:02 +01:00
Alec Scott
61af6b8f37 banner: add v3.5 (#36019) 2023-03-12 08:28:17 +01:00
Alec Scott
7c3c6011de cni-plugins: add v1.2.0 (#36024) 2023-03-12 08:27:49 +01:00
Alec Scott
36d6660739 commons-lang3: add v3.12.0 (#36025) 2023-03-12 08:26:28 +01:00
Alec Scott
9199dabc0b cryptopp: add v8.7.0 (#36026) 2023-03-12 08:25:51 +01:00
Houjun Tang
9f6b2f8e96 HDF5-vol-async: add "memcpy" variant (#36013) 2023-03-12 08:25:11 +01:00
Alec Scott
ba1fd789e0 datamash: add v1.8 (#36027) 2023-03-12 08:24:16 +01:00
Alec Scott
013b2dec1e dbus: add v1.13.6 (#36028) 2023-03-12 08:23:42 +01:00
Alec Scott
d9cf959010 dbus-glib: add v0.112 (#36029) 2023-03-12 08:22:51 +01:00
Alec Scott
a76066ec42 gh: add v2.24.3 (#36032) 2023-03-12 08:21:58 +01:00
Alec Scott
8ce6a5355e ghostscript: add v10.0.0 (#36033) 2023-03-12 08:21:25 +01:00
Alec Scott
e61a1a6e74 hugo: add v0.111.2 (#36035) 2023-03-12 08:19:07 +01:00
Alec Scott
16d7270700 hydra: add v4.1.1 (#36036) 2023-03-12 08:18:34 +01:00
Alec Scott
e77e93b66a glab: add v1.26.0 (#36034) 2023-03-12 08:17:32 +01:00
Alec Scott
0c2a801ff2 libbson: add v1.23.2 (#36037) 2023-03-12 08:17:00 +01:00
Alec Scott
c84ce77969 libcap: add v2.67 (#36038) 2023-03-12 08:16:44 +01:00
Alec Scott
3464570b55 libdmx: add v1.1.4 (#36039) 2023-03-12 08:16:26 +01:00
Alec Scott
2a1428e5d4 libfontenc: add v1.1.7 (#36040) 2023-03-12 08:16:10 +01:00
Alec Scott
6f15cef281 libfs: add v1.0.9 (#36041) 2023-03-12 08:15:55 +01:00
Alec Scott
6fbda46c12 bazel: add v6.1.0 (#36020) 2023-03-12 08:14:25 +01:00
Adam J. Stewart
6c9d079cfb py-torch: NNPACK requires AVX2 (#35994) 2023-03-12 08:03:27 +01:00
Michael Kuhn
a741350e69 glib: add 2.74.6 (#35708) 2023-03-11 19:17:26 -05:00
Karen C. Tsai
fe5865da0d Add spackage for py-sphinx-rtd-dark-mode (#35946) 2023-03-11 14:27:23 -05:00
Mosè Giordano
1e9a654f17 curl: Allow compiling recent versions with MbedTLS 2 (#35947)
Curl 7.79 started supporting MbedTLS 3, but it did not drop support for v2.
2023-03-11 14:17:13 -05:00
Harmen Stoppels
844701b974 get --dev and drop set -x (#36010) 2023-03-10 22:59:57 -08:00
Alec Scott
1d081565db gmake: add v4.4.1 (#35872)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2023-03-11 01:14:27 -05:00
eugeneswalker
39abe69c97 py-exarl: new package (#35828)
* py-exarl: new package

* fix style

* extend copyright to 2023

* add maintainer
2023-03-10 18:26:00 -08:00
Alec Scott
f5228cf59c go: refactor bootstrapping process (#35823)
* Refactor go bootstrapping to include binary or gcc bootstrap
2023-03-10 16:27:49 -08:00
Harmen Stoppels
08d7f47278 curl flag is not universally supported (#36009) 2023-03-10 16:20:32 -08:00
Alec Scott
92c6112991 gdk-pixbuf: add v2.42.10 (#35867) 2023-03-10 18:02:56 -05:00
Alec Scott
3605105cf1 editres: add v1.0.8 (#35854) 2023-03-10 18:02:36 -05:00
Harmen Stoppels
26fd1ac5b0 hotfix: fix double double quotes (#36005) 2023-03-10 13:41:01 -08:00
Zack Galbreath
e1301df60c ci: version bump for ghcr.io/spack/e4s-amazonlinux-2 (#35976)
* ci: version bump for ghcr.io/spack/e4s-amazonlinux-2

This new image comes with GnuPG v2.4.0

* py-cython: upperbounds for Python versions

* fix py-gevent nonsense

---------

Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2023-03-10 13:32:11 -08:00
kwryankrattiger
181bb54372 Hotfix: Fix CI unit test after CI refactor (#36004)
* Hotfix: Fix CI unit test after CI refactor
2023-03-10 13:31:40 -08:00
kwryankrattiger
f3595da600 CI boilerplate reduction (#34272)
* CI configuration boilerplate reduction and refactor

Configuration:
- New notation for list concatenation (prepend/append)
- New notation for string concatenation (prepend/append)
- Break out configuration files for: ci.yaml, cdash.yaml, view.yaml
- Spack CI section refactored to improve self-consistency and
composability
  - Scripts are now lists of lists and/or lists of strings
  - Job attributes are now listed under a precedence-ordered list and are
  composed/merged using Spack config merge rules.
  - "service-jobs" are identified explicitly rather than as a batch

CI:
- Consolidate common, platform, and architecture configurations for all CI stacks into composable configuration files
- Make padding consistent across all stacks (256)
- Merge all package -> runner mappings to be consistent across all
stacks

Unit Test:
- Refactor CI module unit-tests for refactor configuration

Docs:
- Add docs for new notations in configuration.rst
- Rewrite docs on CI pipelines to be consistent with refactored CI
workflow

* Script verbose environ, dev bootstrap

* Port #35409
2023-03-10 12:25:35 -07:00
Scott Wittenburg
16c67ff9b4 ci: Increase the amount of pruning possible for PR pipelines (#35944)
By setting the traversal depth to 1, only specs matching the changed
package and direct dependents of those (and of course all dependencies
of that set) are removed from pruning candidacy.
2023-03-10 11:19:52 -08:00
Alec Scott
ce7409bbf7 feh: add v3.9.1 (#35858) 2023-03-10 11:08:50 -08:00
kwryankrattiger
369914c3e1 Add packages OSPRay, rkcommon, Open VKL, and Open Image Denoise (#35530) 2023-03-10 19:35:46 +01:00
Matthew Thompson
64e0ca5a89 Update yaFyaml, pFlogger, and gFTL versions, add list_url (#35968)
* Update yaFyaml, pFlogger, and gFTL versions
* Add list_url
2023-03-10 09:57:15 -08:00
Erik Heeren
51a5377ceb py-deap: newer version can use newer setuptools (#35986)
* py-deap: newer version can use newer setuptools

* Update var/spack/repos/builtin/packages/py-deap/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-03-10 10:57:36 -05:00
H. Joe Lee
243627104e scons: add version 4.5.1. (#35990) 2023-03-10 10:47:52 -05:00
Adam J. Stewart
e817b0b9d0 py-scikit-learn: add v1.2.2 (#35982) 2023-03-10 06:38:49 -05:00
Valentin Volkl
eb59097576 rivet: remove tag (deleted by upstream developers) (#35971) 2023-03-10 04:58:16 -05:00
SXS Bot
73c1f3f893 spectre: add v2023.03.09 (#35972)
Co-authored-by: sxs-bot <sxs-bot@users.noreply.github.com>
2023-03-10 04:53:09 -05:00
nicolas le goff
566fb51d71 cgns: enable tools (#35713) 2023-03-10 08:59:23 +01:00
Paul R. C. Kent
617f44f9ed QMCPACK v3.16.0 (#35967) 2023-03-09 19:53:58 -05:00
Greg Becker
a51f4b77d9 reorder_flags: properly handle flags from concrete reused specs (#35951) 2023-03-09 16:46:47 -08:00
Michael Kuhn
9e6afc7dec scalasca: add 2.6.1 (#35970) 2023-03-09 15:13:53 -05:00
Matthieu Dorier
68874a72fb [liburing] Adds liburing package (#35762)
* [liburing] Adds liburing package
* Update var/spack/repos/builtin/packages/liburing/package.py
* [liburing] Added conflicts for darwin and windows platforms

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2023-03-09 11:03:32 -08:00
Harshula Jayasuriya
e560beed19 concretizer.yaml: document valid values for granularity (#35961) 2023-03-09 19:51:11 +01:00
Vanessasaurus
de586bb66c Automated deployment to update package flux-core 2023-03-09 (#35956)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2023-03-09 10:25:57 -08:00
Michael Kuhn
846cd05c7e scorep: fix dependencies (#35966)
The overlapping dependency version ranges caused the concretizer to pick
version 7.1 even though version 8.0 is available:
```
==> Error: No version for 'cubelib' satisfies '@4.7.1' and '@4.8'
```

Moreover, Score-P 8.0 requires libbfd:
```
configure: error: bfd.h required
```
2023-03-09 10:12:37 -08:00
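A hedged sketch of the shape of such a fix: non-overlapping cubelib ranges per Score-P release, plus the libbfd requirement (constraints are illustrative, not the exact ones committed):

```
depends_on("cubelib@4.7.1", when="@7.1")
depends_on("cubelib@4.8:", when="@8.0:")
depends_on("binutils", when="@8.0:")  # Score-P 8.0 needs bfd.h (libbfd)
```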
Alec Scott
1ef313b604 bdftopcf: add v1.1 (#35845) 2023-03-09 11:40:25 -05:00
Rocco Meli
73026f4f4b Deprecate elpa rc2 (#35953) 2023-03-09 09:28:14 -05:00
Alec Scott
08dd6d1a21 fontconfig: add v2.14.2 (#35861) 2023-03-09 09:13:07 -05:00
Mark W. Krentel
e9173a59fd hpctoolkit: adjust dependency and conflict for xz (#35950)
Hpctoolkit doesn't build cleanly with xz 5.2.7 and 5.2.8 due to a
misuse of the symver attribute.  This is now fixed in 5.2.9 and later.
2023-03-09 08:37:21 -05:00
Benjamin Meyers
7401c97037 New packages: py-fireworks, py-flask-paginate (#35939)
* New packages: py-fireworks, py-flask-paginate

* [@spackbot] updating style on behalf of meyersbs
2023-03-09 04:54:26 -05:00
Benjamin Meyers
15433cfaf1 New package py-custodian (#35938)
* New package py-custodian

* [@spackbot] updating style on behalf of meyersbs
2023-03-09 04:50:31 -05:00
Benjamin Meyers
99aa0ef0cd Update py-boltons (#35937) 2023-03-09 04:46:25 -05:00
Alec Scott
28934e5f77 gawk: add v5.2.1 (#35863) 2023-03-08 21:36:43 -05:00
Michael Kuhn
f08598427d git: add 2.39.2 (#35911) 2023-03-08 19:40:16 -05:00
Michael Kuhn
cc4f7c224a libuv-julia: fix mtime again (#35945)
On some systems touch runs out of order,
so set equal mtimes on the relevant files.
2023-03-08 19:02:38 -05:00
Maciej Wójcik
ee5b2936e4 gcc: Patch building of GCC 5.1-12.1 with glibc >= 2.36 (#35798) 2023-03-08 22:29:57 +01:00
Teo
f7a6446d3f Halide: Add 15.0.0 (#35924) 2023-03-08 16:22:16 -05:00
Tamara Dahlgren
624e28ee03 nek5000/nekcem: test_install -> check_install (#35925) 2023-03-08 16:05:29 -05:00
Auriane R
784e5f5789 Add pika 0.13.0 and pika-algorithms 0.1.2 (#35933)
* Add last release of pika-algorithms + version constraint

* Add pika release 0.13.0
2023-03-08 16:05:11 -05:00
Michael Kuhn
180618b25a p7zip: update checksum for 17.05 (#35923)
See https://github.com/p7zip-project/p7zip/issues/220
2023-03-08 16:04:57 -05:00
Emil Briggs
aefcce51fc rmgdft: add version 5.0.5 (#35922)
* Updated for version 5.0.5.
2023-03-08 15:59:09 -05:00
Weiqun Zhang
884a356b1e amrex: add v23.03 (#35765) 2023-03-08 15:45:16 -05:00
Jean-Baptiste Besnard
ee69f2d516 intel-mpi-benchmarks: variant and conflicts fixes (#35670) 2023-03-08 15:33:28 -05:00
renjithravindrankannath
bc5bb06f1f Provide openmp from rocm-openmp-extras when tensile uses openmp (#35767)
* Provide openmp from rocm-openmp-extras when tensile uses openmp
* Correct audit check failure in rocm-openmp-extras dependency
* Fix style check error
* Require rocm-openmp-extras instead of llvm-amdgpu for both variants
2023-03-08 09:57:35 -08:00
Peter Scheibel
1b8561f752 add logging to help users debug where external file searches are taking a long time (#35900) 2023-03-08 09:46:13 -08:00
nicolas le goff
7d54c24939 qwt: lift restrictions on qt version and added an opengl variant and VisIt use (#35734) 2023-03-08 18:38:33 +01:00
Robert Underwood
960923287d gdb: version 13.1 and debuginfod support (#35769)
* gdb: version 13.1 and debuginfod
* gdb: update to autotools helpers

---------

Co-authored-by: Robert Underwood <runderwood@anl.gov>
2023-03-08 08:54:47 -08:00
Scott Wittenburg
4a9ffdcfa2 gitlab ci: Provide a knob to control untouched spec pruning (#35274)
When untouched spec pruning is enabled, specs possibly affected
by a change cannot be pruned from a pipeline.

Previously spack looked at all specs matching changed package
names, and traversed dependents of each, all the way to the
environment root, to compute the set of environment specs
possibly affected by a change (and thus, not candidates for
pruning).

With this PR, when untouched spec pruning is enabled, a new
environment variable can control how far towards the root spack
traverses to compute the set of specs possibly affected by a
change.  SPACK_UNTOUCHED_PRUNING_DEPENDENT_DEPTH can be set
to any numeric value before the "spack ci generate" command
is called to control this traversal depth parameter.  Setting
it to "0" traverses only touched specs, setting it to "1"
traverses only touched specs and their direct dependents, and
so on.  Omitting the variable results in the previous behavior
of traversing all the way to the root.  Setting it to a negative
value means no traversal is done, and always yields an empty
set of possibly affected specs (which would result in the max
pruning possible).
2023-03-08 09:38:07 -07:00
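A hedged usage sketch: export the variable before generating the pipeline. The variable name comes from this PR; the surrounding invocation is illustrative.

```python
import os
import subprocess

# "0": only touched specs; "1": touched specs plus direct dependents;
# unset: traverse all the way to the environment root (previous behavior).
os.environ["SPACK_UNTOUCHED_PRUNING_DEPENDENT_DEPTH"] = "1"

subprocess.run(
    ["spack", "ci", "generate", "--output-file", ".gitlab-ci.yml"],
    check=True,
)
```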
Harmen Stoppels
22d4e79037 buildcache create: reproducible tarballs (#35623)
Currently `spack buildcache create` creates compressed tarballs that
differ between each invocation, thanks to:

1. The gzip header containing mtime set to time.time()
2. The generated buildinfo file which has a different mtime every time.

To avoid this, you have to explicitly construct GZipFile yourself, since
the Python API doesn't expose the mtime arg, and we have to manually
create the tarinfo object for the buildinfo metadata file.

Normalize modes: regular files and hardlinks that are executable by the user, directories, and symlinks get 0o755 permissions in the tarfile; all other files get 0o644
2023-03-08 15:51:55 +00:00
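A self-contained sketch of the approach described above (fixed gzip header mtime, hand-built TarInfo, normalized modes); file names and contents are placeholders:

```python
import gzip
import io
import tarfile

with open("archive.tar.gz", "wb") as raw:
    # Constructing GzipFile directly lets us pin the header mtime, which
    # tarfile.open("w:gz") does not expose.
    with gzip.GzipFile(filename="", mode="wb", fileobj=raw, mtime=0) as gz:
        with tarfile.TarFile(fileobj=gz, mode="w") as tar:
            data = b"buildinfo: example\n"
            info = tarfile.TarInfo(name="buildinfo.yaml")
            info.size = len(data)
            info.mtime = 0              # constant mtime for the metadata file
            info.uid = info.gid = 0
            info.uname = info.gname = ""
            info.mode = 0o644           # normalized permissions
            tar.addfile(info, io.BytesIO(data))
```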
Sangu Mbekelu
2777ca83eb new py-thop package (#35889)
* "new py-thop package"

* [@spackbot] updating style on behalf of Sangu-Mbekelu

* Update package.py

modified the url and dependencies

---------

Co-authored-by: Sangu Mbekelu <s.mbekelu9@gmail.com>
2023-03-08 09:12:27 -06:00
Erik Heeren
a2423f5736 py-openmesh: new package (#35907)
* py-openmesh: new package

* Update var/spack/repos/builtin/packages/py-openmesh/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-03-08 09:11:34 -06:00
Aaron Black
81765e0278 mfem: add missing cublas for cuda support (#35608) 2023-03-08 09:14:46 -05:00
Erik Heeren
0d4f9b26b8 py-parse-type: new package (#35909) 2023-03-08 07:13:30 -05:00
Gerhard Theurich
87c21a58d1 parallelio: new version (#35553) 2023-03-08 07:13:13 -05:00
Alberto Invernizzi
5900378cff newly released 0.8.3 (#35910) 2023-03-08 07:05:13 -05:00
Massimiliano Culpo
d54611af2c Split satisfies(..., strict=True/False) into two functions (#35681)
This commit formalizes `satisfies(lhs, rhs, strict=True/False)`
and splits it into two functions: `satisfies(lhs, rhs)` and
`intersects(lhs, rhs)`.

- `satisfies(lhs, rhs)` means: all concrete specs matching the
   left hand side also match the right hand side
- `intersects(lhs, rhs)` means: there exist concrete specs
   matching both lhs and rhs.

`intersects` now has the property that it's commutative,
which previously was not guaranteed.

For abstract specs, `intersects(lhs, rhs)` implies that
`constrain(lhs, rhs)` works.

What's *not* done in this PR is ensuring that
`intersects(concrete, abstract)` returns false when the
abstract spec has additional properties not present in the
concrete spec, but `constrain(concrete, abstract)` will
raise an error.

To accomplish this, some semantics have changed, as well
as bugfixes to ArchSpec:
- GitVersion is now interpreted as a more constrained
  version
- Compiler flags are interpreted as strings since their
  order is important
- Abstract specs respect variant type (bool / multivalued)
2023-03-08 13:00:53 +01:00
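A hedged illustration of the two relations using version ranges; the method names follow this PR's description, and exact results depend on the Spack version in use:

```python
from spack.spec import Spec

# satisfies: every concrete spec matching the left side also matches
# the right side.
assert Spec("zlib@1.2.13").satisfies("zlib@1.2:1.3")

# intersects: some concrete spec matches both sides; unlike satisfies,
# it is commutative.
assert Spec("zlib@1.2:").intersects("zlib@:1.3")
assert not Spec("zlib@2:").intersects("zlib@:1")
```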
Benjamin Meyers
39adb65dc7 New package: py-imbalanced-learn (#35895)
* New package: py-imbalanced-learn

* Fix typo

* [@spackbot] updating style on behalf of meyersbs

* Update var/spack/repos/builtin/packages/py-imbalanced-learn/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-03-08 07:00:09 -05:00
Rocco Meli
db15e1895f bump elpa (#35908) 2023-03-08 06:59:52 -05:00
Benjamin Meyers
7610926e5e Update and fix py-meldmd (#35783)
* Update/fix py-meldmd; update openmm

* Restrict filter_file based on openmm version

* Updates based on Adam's feedback

* [@spackbot] updating style on behalf of meyersbs

* Break up long filter_file

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-03-08 06:59:35 -05:00
Benjamin Meyers
703f687ca0 Update py-seaborn to @0.12.2 (#35896) 2023-03-08 06:54:17 -05:00
Annop Wongwathanarat
983a56e729 gromacs: add sve variant on aarch64 (#35614) 2023-03-08 10:25:36 +01:00
Brian Vanderwende
cbd0770497 ESMF should use Spack wrappers directly (#35749) 2023-03-08 10:21:51 +01:00
Tamara Dahlgren
b06648eb64 docs: added platform conflicts example, fix quotes (#35771) 2023-03-08 10:10:01 +01:00
QuellynSnead
80d784c401 singularity-eos: (#35625)
The Cray fortran compiler names fortran modules in uppercase by
default. Compile with the "-ef" flag to produce the lowercase
name that singularity-eos is expecting.
2023-03-08 09:58:34 +01:00
downloadico
5b3ad0adaa pgplot: made dependent packages set environment variables from pgplot (#35803) 2023-03-08 09:48:12 +01:00
Richard Berger
3feadc0a36 lammps: GPU/Kokkos package updates (#35885) 2023-03-08 09:46:05 +01:00
Alec Scott
8ec86e05c4 ico: add v1.0.6 (#35881) 2023-03-08 09:33:22 +01:00
Alec Scott
b34fd98915 ftxui: add v4.0.0 (#35868) 2023-03-08 09:33:08 +01:00
Tim Haines
a93d143f17 boost: add v1.81.0 (#34613) 2023-03-08 09:28:17 +01:00
Alec Scott
d0ced9da94 lucene: add v9.5.0 (#35917) 2023-03-08 09:12:51 +01:00
Harmen Stoppels
c37d6f97dc compiler wrapper: parse Wl and Xlinker properly (#35912)
Two fixes:

1. `-Wl,a,b,c,d` is a comma separated list of linker arguments, we
   incorrectly assume key/value pairs, which runs into issues with for
   example `-Wl,--enable-new-dtags,-rpath,/x`
2. `-Xlinker,xxx` is not a thing, so it shouldn't be parsed.
2023-03-08 09:03:31 +01:00
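A minimal sketch of the first fix, with a hypothetical helper name: everything after `-Wl,` is a plain comma-separated list of linker arguments, not key/value pairs.

```python
def expand_linker_flag(arg: str) -> list:
    """Split -Wl,<a>,<b>,... into raw linker arguments."""
    prefix = "-Wl,"
    if not arg.startswith(prefix):
        return [arg]
    return arg[len(prefix):].split(",")

# Four separate linker args, not two key/value pairs:
print(expand_linker_flag("-Wl,--enable-new-dtags,-rpath,/x"))
# ['--enable-new-dtags', '-rpath', '/x']
```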
Adam J. Stewart
ec73157a34 py-mypy: add v1.1.1 (#35926) 2023-03-08 08:16:01 +01:00
Alec Scott
e447c365ee help2man: add v1.49.3 (#35877) 2023-03-08 02:01:58 -05:00
Alec Scott
c5c67145d3 iso-codes: add v4.13.0 (#35915) 2023-03-08 01:18:10 -05:00
Alec Scott
a5bc83d635 httpie: add v3.2.1 (#35879)
* httpie: add v3.2.1

* Add additional 3.2.1 dependencies to httpie

* Add version condition to dependency

* Reorder dependencies for efficiency

* Update var/spack/repos/builtin/packages/httpie/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-03-08 01:05:44 -05:00
Alec Scott
1760553b70 lua-luaposix: add v36.1 (#35918) 2023-03-08 00:30:28 -05:00
Alec Scott
62d9bf5fef listres: add v1.0.5 (#35921) 2023-03-07 22:12:35 -05:00
Alec Scott
cb49da1b6f lndir: add v1.0.4 (#35920) 2023-03-07 21:54:19 -05:00
Alec Scott
c79d9ac5bd erlang: add v25.2 (#35856) 2023-03-07 20:29:52 -05:00
Alec Scott
871ca3e805 jchronoss: add v1.2.1 (#35916) 2023-03-07 19:58:21 -05:00
Harmen Stoppels
89176bd3f6 libxc: use gitlab release tarballs for v6.0.0 and greater (#35894)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2023-03-07 19:57:59 -05:00
Alec Scott
b29a607ceb isl: add v0.25 (#35884) 2023-03-07 18:03:01 -05:00
Alec Scott
0c06ecc711 iceauth: add v1.0.9 (#35880) 2023-03-07 18:02:45 -05:00
Alec Scott
73d1e36da5 imake: add v1.0.9 (#35882) 2023-03-07 18:02:29 -05:00
Erik Heeren
0d57c2ab24 py-numpy-stl: new package (#35892) 2023-03-07 18:02:13 -05:00
Alec Scott
272e69b2fd htslib: add v1.17 (#35883) 2023-03-07 18:01:58 -05:00
Alec Scott
8efde89c0e hivex: add v1.3.23 (#35878) 2023-03-07 18:01:43 -05:00
Alec Scott
c7ec47c658 graphviz: add v7.1.0 (#35876) 2023-03-07 17:56:08 -05:00
Alec Scott
013e82f74f grep: add v3.9 (#35875) 2023-03-07 17:55:46 -05:00
Alec Scott
fff7e6d626 gradle: add v8.0.2 (#35873) 2023-03-07 17:50:06 -05:00
Alec Scott
ac1fe8765a gprolog: add v1.5.0 (#35874) 2023-03-07 17:23:21 -05:00
Alec Scott
acbf46d786 glpk: add v5.0 (#35871) 2023-03-07 17:22:56 -05:00
Alec Scott
a753fa12fb gegl: add v0.4.42 (#35866) 2023-03-07 17:22:35 -05:00
Alec Scott
27b2dc1608 fonttosfnt: add v1.2.2 (#35862) 2023-03-07 17:16:43 -05:00
Alec Scott
d8f8b42bcb fslsfonts: add v1.0.6 (#35869) 2023-03-07 16:58:59 -05:00
Alec Scott
3995428ad2 gatk: add v4.3.0.0 (#35864) 2023-03-07 16:53:15 -05:00
Alec Scott
76d41b7f9f fstobdf: add v1.0.7 (#35870) 2023-03-07 16:36:08 -05:00
Samuel Li
a7501105b1 update SPERR package (#35810)
* update SPERR package
* remove blank line
* update SPERR package
* remove blank line
---------

Co-authored-by: Samuel Li <Sam@Navada>
2023-03-07 12:26:43 -08:00
Adam J. Stewart
da33334488 py-pytorch-lightning: add v1.9.4 (#35791) 2023-03-07 12:08:47 -08:00
Adam J. Stewart
68dfcd10e6 py-pytorch-lightning: torch~distributed not supported in 1.9.0 (#35809) 2023-03-07 12:05:39 -08:00
Alec Scott
4e6a8a40b7 cloog: add v0.18.4 (#35849) 2023-03-07 11:55:00 -08:00
Alec Scott
20f663d089 dnsmasq: add v2.89 (#35851) 2023-03-07 11:54:37 -08:00
Cody Balos
acf61daf99 sundials: add new version (#35796) 2023-03-07 11:52:05 -08:00
Alec Scott
a0233d2560 cmocka: add v1.1.7 (#35848) 2023-03-07 11:49:42 -08:00
Alec Scott
8c989e0aee apr: add v1.7.2, apr-util 1.6.3 (#35832)
* apr: add v1.7.2
* apr-util: add v1.6.3
2023-03-07 11:48:39 -08:00
Alec Scott
2f5e7fb38c awscli: add v1.27.84 (#35836)
* awscli: add v1.27.84

* Add botocore dependency to awscli

* Update var/spack/repos/builtin/packages/awscli/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Add py-botocore@1.29.84 dependency

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-03-07 13:38:17 -06:00
Alec Scott
b9bc911921 cget: add v0.2.0 (#35846) 2023-03-07 13:10:56 -05:00
Alec Scott
462df718ff bc: add v1.07.1 (#35844) 2023-03-07 13:10:33 -05:00
Alec Scott
b20271a8e8 code-server: add v4.10.1 (#35855) 2023-03-07 12:06:01 -05:00
Alec Scott
a62992b4b1 easybuild: add v4.7.0 (#35853)
* easybuild: add v4.7.0

* Add v0.4.7 to all easybuild dependencies

* Reorder versions newest to oldest

* Fix styling on easybuild dependency loop
2023-03-07 12:05:39 -05:00
Alec Scott
818459e6fc dash: add v0.5.12 (#35850) 2023-03-07 11:48:05 -05:00
Alec Scott
38bd499c09 colordiff: add v1.0.21 (#35847) 2023-03-07 11:47:43 -05:00
Alec Scott
f9de4c2da8 aspell: add v0.60.8 (#35834) 2023-03-07 11:47:21 -05:00
Rocco Meli
335ae31a59 Update biopython, gsd, and griddataformats (#35827)
* update mda dependencies

* apply black

* Update var/spack/repos/builtin/packages/py-gsd/package.py

* Update var/spack/repos/builtin/packages/py-griddataformats/package.py

* Update var/spack/repos/builtin/packages/py-griddataformats/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* remove numpy upper bound

* Update var/spack/repos/builtin/packages/py-griddataformats/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-gsd/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-03-07 11:47:04 -05:00
Alec Scott
c4e5ee8831 beforelight: add v1.0.6 (#35843) 2023-03-07 11:32:17 -05:00
Alec Scott
73c358819b busybox: add v1.36.0 (#35840) 2023-03-07 11:31:52 -05:00
Alec Scott
6f396aff99 bitmap: add v1.1.0 (#35839) 2023-03-07 11:20:26 -05:00
Alec Scott
047b99fab9 blktrace: add v1.3.0 (#35842) 2023-03-07 10:56:31 -05:00
Alec Scott
a12a290ee1 bmake: add v20230303 (#35841) 2023-03-07 10:56:05 -05:00
Alec Scott
bd6c9085f0 atk: add v2.38.0 (#35833) 2023-03-07 10:55:48 -05:00
Alec Scott
c0a0d60378 appres: add v1.0.6 (#35831) 2023-03-07 10:55:29 -05:00
Alec Scott
bc84ca126e ace: add v7.1.0 (#35829) 2023-03-07 10:55:08 -05:00
Alec Scott
3bb7570e02 alglib: add v3.20.0 (#35830) 2023-03-07 10:44:09 -05:00
Adam J. Stewart
a5b80662ae py-matplotlib: add v3.7.1 (#35822) 2023-03-06 19:02:20 -05:00
John W. Parent
0c5360e3fd Proj: to CMake (#35108)
* update proj

* re-add autotools support

* style

* Setup env in builders

* Drop direct windows conflict for older versions

* Default to CMake

Add new style class definition

* Proj: setup_run_environment in package not builder

* Proj: move run env changes to pkg, rm cmake arg

* Set PROJ_LIB during build

* Style

* Rm redundant configure arg
2023-03-06 17:23:34 -05:00
Greg Becker
2ff337a2a5 compiler flags: fix multiple compilers with different flags (#35721)
Currently, if two compilers with the same spec differ on the flags, the concretizer will:

1. mix both sets of flags for the spec in the ASP program
2. error noting that the set of flags from the compiler (both of them) doesn't match the set from the lower priority compiler

This PR fixes both -- only flags from the highest priority compiler with a given spec are considered.
2023-03-06 10:29:48 -08:00
Erik Heeren
f3841774f7 py-elasticsearch: new versions (#35764)
* py-elasticsearch: new versions

Also add py-elastic-transport as a new dependency

* py-elasticsearch: py-urllib3 is no longer a dependency

* Update var/spack/repos/builtin/packages/py-elasticsearch/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-03-06 10:30:31 -06:00
Alec Scott
97c2dd3a5a Add Hugo v0.111.1 (#35824) 2023-03-05 18:33:23 +01:00
Alec Scott
3aae80ca07 curl: add v7.88.1 and deprecate previous versions due to CVEs (#35825)
* curl: add v7.88.1 and deprecate previous versions due to CVEs

* Add self as a maintainer to curl
2023-03-05 18:32:56 +01:00
Alec Scott
973bc92813 libarchive: add v3.6.2 and deprecate previous versions due to CVE-2022-36227 (#35826) 2023-03-05 18:22:41 +01:00
Todd Gamblin
42a02411b4 windows: use sys.platform == "win32" instead of is_windows (#35640)
`mypy` only understands `sys.platform == "win32"`, not indirect assignments of that
value to things like `is_windows`. If we don't use the accepted platform checks, `mypy`
registers many Windows-only symbols as not present on Linux, when it should skip the
checks for platform-specific code.
2023-03-05 07:58:05 -08:00
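A short sketch of the difference mypy sees; `winreg` stands in for any Windows-only module:

```python
import sys

if sys.platform == "win32":
    import winreg  # mypy skips this branch when checking for Linux

is_windows = sys.platform == "win32"
if is_windows:
    import winreg  # mypy cannot narrow through the indirect flag
```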
Erik Heeren
4561536403 py-simpervisor: correct pypi (#35785) 2023-03-04 10:44:03 -06:00
MicK7
6b694749d3 Update python linting packages (#35811)
* add 2.14.2 py-astroid version

* add py-pylint 2.26.2

* fix black

* fix py-dill depends_on

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* fix py-astroid minor versioning

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* modify typing_extensions depends_on

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-03-03 16:06:56 -06:00
MatthewLieber
c0f48b30cf Start using paths found in extra_rpaths in compilers.yaml when building (#35376)
* Start using paths found in extra_rpaths in compilers.yaml when building

* running black and changing maintainer list

* changing import order to pass isort

---------

Co-authored-by: Matthew Lieber <lieber.31@osu.edu>
2023-03-03 13:29:59 -08:00
Massimiliano Culpo
046416479a Polish spack.util.environment (#35812)
Update `spack.util.environment` to remove legacy idioms.
* Remove kwargs from method signature and use a class for traces
* Uppercase a few global variables
* spack.util.environment: add type-hints
* Improve docstrings
* Fixed most style issues reported by pylint
2023-03-03 16:17:27 -05:00
Lucas Frérot
b17113b63d snakemake: added versions 7.19.0-7.22.0 (#35535)
* snakemake: added versions 7.19.0-7.22.0

* snakemake: corrected +reports dependencies
2023-03-03 12:46:29 -06:00
Loïc Pottier
479f5a74a3 py-ipdb: updating versions (#35654)
* py-ipdb: updating versions

Signed-off-by: Loïc Pottier <pottier1@llnl.gov>

* py-ipdb: fixing versions problem and deleting 10.1 which is too old for Python > 3.6

Signed-off-by: Loïc Pottier <pottier1@llnl.gov>

* Update var/spack/repos/builtin/packages/py-ipdb/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-ipdb: removed useless dependencies

Signed-off-by: Loïc Pottier <pottier1@llnl.gov>

* Update var/spack/repos/builtin/packages/py-ipdb/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-ipdb/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-ipdb: missing @

Signed-off-by: Loïc Pottier <pottier1@llnl.gov>

* Update var/spack/repos/builtin/packages/py-ipdb/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Signed-off-by: Loïc Pottier <pottier1@llnl.gov>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-03-03 05:28:40 -05:00
kwryankrattiger
895886959f CI: Fix timeout for VisIt and ParaView (#35787)
ref. #35400
2023-03-03 09:13:14 +01:00
Jean Luca Bez
0bdb0b07fd update github URL (#35789) 2023-03-03 00:23:23 -05:00
Ishaan Desai
817b59900a Update pyprecice v2.5.0.2 (#35788)
* Update versions 2.5.0.0 and 2.5.0.1

* Applying review changes

* Updating incorrect checksum for v2.4.0.0

* Add for loop to define depends_on for preCICE versions and bindings versions

* Formatting

* Missing comma

* Add pyprecice v2.5.0.2
2023-03-03 00:19:17 -05:00
Erik Heeren
2ddd66ca48 py-jupyter-server-proxy: fix dependency condition (#35784)
Too much copypasta
2023-03-03 00:18:31 -05:00
H. Joe Lee
7ddd796f89 scons: add a new version (#35652)
* scons: add a new version

* scons: address @adamjstewart review
2023-03-02 15:14:59 -06:00
downloadico
1e2ef16b39 Add e3sm scorpio (#35794)
* e3sm-scorpio: add e3sm-scorpio package
   This is the Scorpio package from the e3sm.org site.
* fixed style errors
* removed unneeded dependency on cmake
2023-03-02 15:48:55 -05:00
Adam J. Stewart
77355fd348 py-nbstripout: add new package (#35786) 2023-03-02 10:47:48 -07:00
Adam J. Stewart
1facf99f51 py-rtoml: add new package (#35780) 2023-03-02 09:43:36 -08:00
Robert Underwood
07c8939679 julia: fix for libuv and Julia (#35776)
Use correct `shlib_symbol_version` for Julia 1.8, and work around an issue where libuv-julia's git checkout has arbitrary mtimes, sometimes causing make to regenerate configure scripts.
2023-03-02 16:36:55 +01:00
Erik Heeren
28f4b5729a py-chart-studio: new package (#35759)
* py-chart-studio: new package

* py-chart-studio: add missing six dependency
2023-03-02 08:40:37 -06:00
M. Eric Irrgang
dd7af323ed Add a py-gmxapi package. (#35738)
* Add a `py-gmxapi` package.

This package provides the Python package for the GROMACS
public API. The Python package is not strongly coupled to
a specific GROMACS _version_, but its compiled extension module
is strongly coupled to a specific GROMACS _installation_.

* Update conflict info.

In order to allow `^gromacs@2022.1` while rejecting `^gromacs@2022`,
we need to compare to `gromacs@2022.0`.

* Apply suggestions from code review

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Apply suggestions from code review.

* Simplify build system structure.
* Update dependencies for completeness.

* Update var/spack/repos/builtin/packages/py-gmxapi/package.py

Per code review, pretend gmxapi <0.4 doesn't exist, for simplicity.

* Update var/spack/repos/builtin/packages/py-gmxapi/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-gmxapi/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-03-02 08:37:17 -06:00
Axel Huebl
2ba1f700e6 WarpX: add v23.03 (#35775)
Update `warpx` & `py-warpx` to the latest release.
2023-03-02 04:19:08 -08:00
Adam J. Stewart
739ab89b65 py-kornia: add v0.6.10 (#35554) 2023-03-02 01:08:39 -08:00
Adam J. Stewart
7e5099627f py-pytorch-lightning: add v1.9.3 (#35629) 2023-03-02 00:23:34 -08:00
Brian Van Essen
eb29889f6e Detection of Cray's Slingshot has relied on the presence of (#35779)
a shared library /lib64/libcxi.so, which seems to also appear on other
non-Slingshot systems. This patch also checks that there is a Cray
programming environment in /opt/cray/pe in addition to the shared
library.
2023-03-01 23:19:20 -08:00
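A hedged sketch of the combined check as described; the function name is illustrative:

```python
import os

def looks_like_slingshot() -> bool:
    """Require both the libcxi shared library and a Cray PE tree, since
    /lib64/libcxi.so alone also shows up on non-Slingshot systems."""
    return os.path.exists("/lib64/libcxi.so") and os.path.isdir("/opt/cray/pe")
```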
Dmitriy
ae27df4113 Add py-mpi4py as a dependency for henson (#35743)
* Add py-mpi4py as a dependency

* Add maintainers per spackbot's request

* Add type=(build,run) per adamjstewart's suggestion
2023-03-01 22:43:31 -08:00
Adam J. Stewart
ae2c8c02a2 py-torchgeo: kornia backwards-incompatible change (#35570) 2023-03-01 22:13:23 -08:00
Adam J. Stewart
caff9e4292 py-scipy: add v1.10.1 (#35581) 2023-03-01 16:04:23 -08:00
Harmen Stoppels
beb3414b4d py-pygments 2.12; fix py-docutils, again (#35394)
* py-pygments 2.12; fix py-docutils, again

`2.12` is the latest for which our style hack works, beyond that we need
our own package to make a plugin.

Old docutils needs old setuptools

* py-setuptools is always a dep

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update the range

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-03-01 15:53:41 -08:00
Alex Richert
ed07cee852 Add capitalized symlinks for libesmf.{a,so} (#35774)
* Add capitalized symlinks for libesmf.{a,so}
* Add import of lib suffixes
2023-03-01 15:28:46 -08:00
Rémi Lacroix
7c1a164219 GitHub CLI: Add version 2.23.0 (#35761) 2023-03-01 13:59:28 -08:00
Howard Pritchard
18e0b893f2 papi: more fixes for Intel OneAPI compiler (#34048)
Intel OneAPI's extreme pickiness continues to bring out
buggy/noncompliant code.

This patch fixes an error in the configure.in embedded 'c' test code
and also in a file with an initialized, but unused, variable.

Signed-off-by: Howard Pritchard <howardp@lanl.gov>
2023-03-01 13:33:55 -08:00
Garth N. Wells
df5b25764b Update FEniCSx libraries to v0.6 (#35600)
* Updates to release 0.6.

* Dep updates

* Dep version fix

* Another version fix

* Fix typo

* UFL version fix

* Update var/spack/repos/builtin/packages/py-fenics-dolfinx/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-fenics-ffcx/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Some updates following review

* Update var/spack/repos/builtin/packages/py-fenics-dolfinx/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* More updates

* More updates

* build/run updates

* Small fix

* Fix version number.

* specify lower bounds for python dependencies

* address style issues

* address style issues

* address PR comments

* amend setuptools dependency to be of type build only

* amend setuptools dependency to be of type run and build for ffcx and ufl

* add build dependency to ensure import tests pass

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Matthew Archer <ma595@cam.ac.uk>
2023-03-01 13:18:23 -08:00
Tiziano Müller
44705b0a6e cp2k: fix builds on macOS, workaround reported issue with __contains__ (#35584)
Co-authored-by: Robert Cohn <rscohn2@gmail.com>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2023-03-01 12:01:00 -08:00
Erik Heeren
3bb03ea7d1 py-dbf: new versions and dependency (#35760) 2023-03-01 10:49:27 -08:00
luker
3c411bf135 Fix superlu-dist package for cray fortran (#35728)
The superlu-dist developers modified the code such that the patch
is no longer needed for `@7.2.0:` (the patch will actually fail).
2023-03-01 10:49:06 -08:00
Erik Heeren
f2363c1cb5 Py dask mpi (#35679)
* py-dask-mpi: new package with dependencies

* py-hatch-jupyter-builder is not needed after all

* skip_modules seems cleaner

* Update var/spack/repos/builtin/packages/py-jupyter-server-proxy/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update var/spack/repos/builtin/packages/py-simpervisor/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-03-01 11:35:55 -06:00
Adam J. Stewart
7188eeb604 py-fiona: add upper bounds to Python versions (#35658)
* py-fiona: add upper bounds to Python versions

* Add error msg
2023-03-01 09:53:20 -06:00
Adam J. Stewart
583d89e95a Simplify spack help --spec output (#35626) 2023-03-01 16:26:59 +01:00
Mosè Giordano
4a24401ed0 wrf: Fix compilation with GCC 10+ (#35177)
Flags `-fallow-argument-mismatch -fallow-invalid-boz` set in `FFLAGS`/`FCFLAGS`
environment variables don't really have effect in older versions of WRF, we need
to force them in the compiler wrappers.

Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2023-03-01 07:14:42 -08:00
Wouter Deconinck
2796794b19 py-particle: new versions 0.16.*, 0.20.*, 0.21.* (#35547)
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-03-01 14:49:37 +01:00
Wouter Deconinck
dd854e86d9 sartre: new package (#32713) 2023-03-01 14:36:16 +01:00
Valentin Volkl
1500246ab7 vc: improvements for testing (#28887) 2023-03-01 14:29:43 +01:00
MatthewLieber
206cd94c0b mvapich2-gdr/mvapich2x: add v2.3.7, update package config (#33066)
Co-authored-by: Nick Contini <contini.26@osu.edu>
2023-03-01 14:24:34 +01:00
Houjun Tang
ff1a425c8d hdf5-vol-async: add v1.5 (#35636) 2023-03-01 14:16:56 +01:00
Wouter Deconinck
ab999d5af9 dd4hep: depends_on root +webgui when +ddeve ^root @6.28: (#35624) 2023-03-01 14:13:50 +01:00
H. Joe Lee
513ff34e66 dpdk: add a new build system and version (#35647) 2023-03-01 14:08:41 +01:00
H. Joe Lee
b59c8e9a43 isa-l_crypto: add a new package (#35651) 2023-03-01 14:06:37 +01:00
H. Joe Lee
9483e34d15 isa-l: add a new package (#35650) 2023-03-01 14:06:22 +01:00
Alex Richert
7e02139cad upp: add v10.0.8 (#35667) 2023-03-01 14:00:27 +01:00
Alex Richert
a08d86c201 grib-util: fix dependency constraints (#35668) 2023-03-01 13:59:45 +01:00
Hans Fangohr
4e70532cd1 OOMMF: set preferred version (#35675)
With the last merge request for OOMMF [1], the intention was to have version
20b0_20220930 as the preferred version, and provide 20b0_20220930-vanilla as an
additional version for the unlikely case anybody needed that.

I made the (wrong) assumption that the `version` listed first in the `package.py` file
would be the preferred version. This merge request is to correct that by
explicitly tagging the preferred version with `preferred=True`.

[1] https://github.com/spack/spack/pull/33072/files
2023-03-01 13:53:59 +01:00
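A hedged sketch of the directive (checksums elided): the order of `version()` calls does not pick the default, `preferred=True` does.

```python
from spack.package import *

class Oommf(Package):
    """Illustrative fragment, not the full recipe."""

    # Listed first, but that alone would not make it the default:
    version("20b0_20220930", preferred=True)
    version("20b0_20220930-vanilla")
```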
Glenn Johnson
2d2a1c82d4 docbook: resolve conflict in spack env view (#35682)
If the docbook packages
- docbook-xml
- docbook-xsl

are installed in a spack environment view the catalog files will be in
conflict in the view directory. This PR resolves that by adding an
appropriate prefix to each catalog name so that they are unique in the
view. The resulting XML_CATALOG_FILES environment variable will then be
able to point to both of them.
2023-03-01 04:17:46 -08:00
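A sketch of the resulting environment, with hypothetical view paths; the point is that uniquely prefixed catalog names let both entries coexist in one list:

```python
import os

# XML catalog resolvers accept a space-separated list of catalog files.
os.environ["XML_CATALOG_FILES"] = " ".join([
    "/path/to/view/docbook-xml-catalog.xml",  # hypothetical prefixed names
    "/path/to/view/docbook-xsl-catalog.xml",
])
```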
Wouter Deconinck
a47ebe5784 dd4hep: new versions 1.25, 1.25.1 (#35665) 2023-03-01 12:56:03 +01:00
acastanedam
4f97bb118f elk: add v8.3.22 and fix a few issues (#35678) 2023-03-01 12:55:02 +01:00
Harmen Stoppels
d71eca74a3 mbedtls: new versions and deprecations [CVE-2022-46392] (#35715)
* Add new versions and deprecations [CVE-2022-46392]
* remove maintainer, order versions by major no
2023-03-01 03:14:00 -08:00
Tamara Dahlgren
f88dec43df cbench: renamed test_blas_linkage to check_blas_linkage (#35690) 2023-03-01 12:03:23 +01:00
Wouter Deconinck
2e8306d324 py-uproot: new versions 4.3.*, 5.0.* (#35548) 2023-03-01 11:53:38 +01:00
Adam J. Stewart
d64d77e99e py-torchmetrics: add v0.11.2-3 (#35755) 2023-03-01 02:48:35 -08:00
Cyrus Harrison
d7b11af731 ascent: add v0.9.0 (#35211) 2023-03-01 11:48:15 +01:00
Annop Wongwathanarat
68372a4dfe hpcg: add arm compiler (#35710) 2023-03-01 11:46:12 +01:00
Bill Williams
54500a5fca scorep: more precise dependencies for v7/v8 (#35712) 2023-03-01 11:45:07 +01:00
Adam J. Stewart
146464e063 Docs: fix link to PythonPackage docs (#35725) 2023-03-01 11:14:05 +01:00
Massimiliano Culpo
07e251c887 Remove handling of deprecated target names (graviton) (#35537)
* Update target names for Gitlab pipelines

* Remove handling of deprecated names for graviton
2023-03-01 11:03:12 +01:00
Mosè Giordano
f9f51cb930 Add new versions of Julia dependencies (#35622) 2023-03-01 01:52:28 -08:00
nicolas le goff
ad3c22fae9 gts: add missing pkgconfig dependency (#35657) 2023-03-01 10:49:09 +01:00
AMD Toolchain Support
bedcc5449a quantum-espresso: fix building with aocc (#35612)
Co-authored-by: Tooley, Phil <phil.tooley@amd.com>
2023-03-01 10:47:42 +01:00
Henning Glawe
a1a54fa8b7 nfft: add v3.2.4 (#35757) 2023-03-01 01:42:23 -08:00
Edward Hartnett
2a97bcbd5a bacio: add v2.6.0, including handling of shared library builds (#35490) 2023-03-01 10:29:12 +01:00
Mark W. Krentel
99fb4c4a47 hpctoolkit: add version 2023.03.01, add python variant (#35662)
1. add version 2023.03.01
 2. add variant 'python' that supports unwinding python source
 3. clean up some things with the cray variant
 4. require the latest libmonitor
 5. fix sha256 checksum for url patch
 6. delete rocm 5.3 from older versions
2023-03-01 09:44:32 +01:00
Brian Vanderwende
5b52685216 NCO: add v5.1.4 and v5.1.0 (#35753) 2023-03-01 09:27:25 +01:00
Alex Richert
53a924c20f Fix bufr package typo (libufr->libbufr) (#35740) 2023-02-28 15:56:30 -08:00
Annop Wongwathanarat
09ad541e98 armcomputelibrary: add version 23.02 (#35723) 2023-02-28 23:51:05 +01:00
M. Eric Irrgang
b109e16fba Back-port a patch for filesystem logic in gmx executable. (#35672)
* Backport a patch for relocatable `gmx` executable.

* spack style fixes
2023-02-28 14:18:00 -07:00
Mark W. Krentel
043a80ff9e hpcviewer: add version 2023.02 (#35648) 2023-02-28 11:13:30 -08:00
Matthieu Dorier
b7f7af1713 c-raft: add new package (#35735)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2023-02-28 11:16:02 -07:00
Rémi Lacroix
e5bd319c19 git-lfs: Add version 3.3.0 (#35671) 2023-02-27 23:58:56 -05:00
Tim Haines
90ad65a6e7 Dyninst: add version 12.3.0 (#35630) 2023-02-27 16:13:47 -08:00
Luke Diorio-Toth
779e80b7c1 Package/py pysam macos (#33851)
* cleaned up style, linked to external htslib

* removed htslib/bcftools/samtools deps, use bundled libs instead

the pysam package includes the necessary libs to link to, so it wasn't even using linked libs when building

* fixed style

* revert to using external htslib

currently uses bundled samtools and bcftools, and there is no way to use external versions for those dependencies

* added libs property to htslib package

added support for lib64

* added htslib name
2023-02-27 16:25:21 -06:00
Annop Wongwathanarat
502e216ee2 onednn: add variant to use Arm Compute Library on aarch64 (#35643)
* onednn: add variant to use Arm Compute Library on aarch64

* Update cmake version

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Shorten macro definition

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* Update cpu/gpu_runtime variants

* Update acl variant when 1.7+

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* [@spackbot] updating style on behalf of annop-w

* Add dependencies for new runtimes

* Fix dependency package name to oneapi-level-zero

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-02-27 13:58:03 -06:00
Wouter Deconinck
d6ff426d48 py-awkward: new version 1.10.*, 2.0.* (#35549)
* py-awkward: new version 1.10.*, 2.0.*

Lots of changes in 2.0.*, see https://github.com/scikit-hep/awkward/releases. This will need some extra testing.

* py-awkward: hatchling

* Update var/spack/repos/builtin/packages/py-awkward/package.py

* Update var/spack/repos/builtin/packages/py-awkward/package.py

* py-scikit-build-core: new and improved py-scikit-build

* py-awkward-cpp: new package

* py-awkward: add depends_on py-awkward-cpp

* py-awkward: depends_on py-packaging

* py-awkward-cpp: new versions pinned by py-awkward

* py-scikit-build-core: additional depends_on

* py-awkward: branch master deprecated

* py-pytest-subprocess: new package

* py-pytest: new version 7.2.1

* py-scikit-build-core: add tests dependencies

* [@spackbot] updating style on behalf of wdconinc

* py-scikit-build-core: two more test dependencies

* py-pytest: depends_on py-exceptiongroup

* py-awkward: add pytest support

* py-pytest: suggestions from review

* py-scikit-build-core: suggestions from review

* Update var/spack/repos/builtin/packages/py-awkward-cpp/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-awkward: depends_on pyyaml when @:1, order old deps last

* [@spackbot] updating style on behalf of wdconinc

* py-awkward: move some opt deps to test, order test deps

* py-awkward: remove test dependencies

---------

Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-02-27 13:51:13 -06:00
Michael Kuhn
c4311a250a pango: add 1.50.13 (#35709) 2023-02-27 09:38:17 -08:00
dlkuehn
ceaa304f5f osi: add version 0.108.7 (#35689)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2023-02-27 09:10:01 -07:00
dlkuehn
038efa4173 clp: add version 1.17.7 (#35688)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2023-02-27 07:23:00 -08:00
Harmen Stoppels
773fd5ad84 hpctoolkit: fix broken patches (#35711)
The patches don't have a stable checksum.
2023-02-27 10:50:48 +01:00
Seth R. Johnson
9b46e92e13 Celeritas: new versions 0.2.1 and 0.1.5 (#35704)
* celeritas: new versions 0.1.5 and 0.2.1

* celeritas: deprecate old versions
2023-02-27 09:36:28 +00:00
Howard Pritchard
f004311611 OpenMPI: add the 4.1.5 release (#35677)
Signed-off-by: Howard Pritchard <howardp@lanl.gov>
2023-02-27 00:57:36 -08:00
Glenn Johnson
a4b949492b r-twosamplemr: add new package and dependencies (#35683) 2023-02-27 07:38:27 +01:00
Larry Knox
6ab792fb03 hdf5-vol-cache: add v1.1 (#35685) 2023-02-27 07:35:30 +01:00
Alex Richert
313c7386c4 go: set GOMAXPROCS to limit number of build processes (#35703) 2023-02-27 07:26:50 +01:00
Adam J. Stewart
b0b4a05d44 py-nbqa: add new package (#35707) 2023-02-27 07:22:48 +01:00
Alberto Invernizzi
4e13b5374f fix dump problem (#35673)
If the dump file already existed, it was not truncated: the result was a
file of unaltered size with the new content at the beginning, "padded"
with the tail of the old content whenever the new content was too short
to overwrite all of it.
2023-02-24 21:32:33 -08:00
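A self-contained sketch of the bug and the fix; the file name is a placeholder:

```python
# Reproduce: overwrite a long file with shorter content without truncating.
with open("dump.txt", "w") as f:
    f.write("a rather long line of old content\n")

with open("dump.txt", "r+") as f:
    f.write("new\n")        # bug: the old tail survives past these bytes

with open("dump.txt", "r+") as f:
    f.write("new\n")
    f.truncate()            # fix: drop whatever remains of the old content

print(open("dump.txt").read())  # -> "new\n"
```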
Vinícius
07897900eb ompss-2 dependencies (#35642) 2023-02-24 21:22:17 -08:00
Axel Huebl
d286146c64 WarpX 23.02 (#35633)
Update `warpx` & `py-warpx` to the latest release.
2023-02-23 17:09:28 -08:00
851 changed files with 12270 additions and 7123 deletions

View File

@@ -19,7 +19,7 @@ jobs:
package-audits:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@ac593985615ec2ede58e132d2e21d2b1cbd6127c # @v2
- uses: actions/checkout@24cb9080177205b6e8c946b17badbe402adc938f # @v2
- uses: actions/setup-python@d27e3f3d7c64b4bbf8e4abfb9b63b83e846e0435 # @v2
with:
python-version: ${{inputs.python_version}}

View File

@@ -24,7 +24,7 @@ jobs:
make patch unzip which xz python3 python3-devel tree \
cmake bison bison-devel libstdc++-static
- name: Checkout
uses: actions/checkout@ac593985615ec2ede58e132d2e21d2b1cbd6127c
uses: actions/checkout@24cb9080177205b6e8c946b17badbe402adc938f
with:
fetch-depth: 0
- name: Setup non-root user
@@ -62,7 +62,7 @@ jobs:
make patch unzip xz-utils python3 python3-dev tree \
cmake bison
- name: Checkout
uses: actions/checkout@ac593985615ec2ede58e132d2e21d2b1cbd6127c
uses: actions/checkout@24cb9080177205b6e8c946b17badbe402adc938f
with:
fetch-depth: 0
- name: Setup non-root user
@@ -99,7 +99,7 @@ jobs:
bzip2 curl file g++ gcc gfortran git gnupg2 gzip \
make patch unzip xz-utils python3 python3-dev tree
- name: Checkout
uses: actions/checkout@ac593985615ec2ede58e132d2e21d2b1cbd6127c
uses: actions/checkout@24cb9080177205b6e8c946b17badbe402adc938f
with:
fetch-depth: 0
- name: Setup non-root user
@@ -133,7 +133,7 @@ jobs:
make patch unzip which xz python3 python3-devel tree \
cmake bison
- name: Checkout
uses: actions/checkout@ac593985615ec2ede58e132d2e21d2b1cbd6127c
uses: actions/checkout@24cb9080177205b6e8c946b17badbe402adc938f
with:
fetch-depth: 0
- name: Setup repo
@@ -158,7 +158,7 @@ jobs:
run: |
brew install cmake bison@2.7 tree
- name: Checkout
uses: actions/checkout@ac593985615ec2ede58e132d2e21d2b1cbd6127c
uses: actions/checkout@24cb9080177205b6e8c946b17badbe402adc938f
- name: Bootstrap clingo
run: |
source share/spack/setup-env.sh
@@ -179,7 +179,7 @@ jobs:
run: |
brew install tree
- name: Checkout
uses: actions/checkout@ac593985615ec2ede58e132d2e21d2b1cbd6127c
uses: actions/checkout@24cb9080177205b6e8c946b17badbe402adc938f
- name: Bootstrap clingo
run: |
set -ex
@@ -204,7 +204,7 @@ jobs:
runs-on: ubuntu-20.04
steps:
- name: Checkout
uses: actions/checkout@ac593985615ec2ede58e132d2e21d2b1cbd6127c
uses: actions/checkout@24cb9080177205b6e8c946b17badbe402adc938f
with:
fetch-depth: 0
- name: Setup repo
@@ -247,7 +247,7 @@ jobs:
bzip2 curl file g++ gcc patchelf gfortran git gzip \
make patch unzip xz-utils python3 python3-dev tree
- name: Checkout
uses: actions/checkout@ac593985615ec2ede58e132d2e21d2b1cbd6127c
uses: actions/checkout@24cb9080177205b6e8c946b17badbe402adc938f
with:
fetch-depth: 0
- name: Setup non-root user
@@ -283,7 +283,7 @@ jobs:
make patch unzip xz-utils python3 python3-dev tree \
gawk
- name: Checkout
uses: actions/checkout@ac593985615ec2ede58e132d2e21d2b1cbd6127c
uses: actions/checkout@24cb9080177205b6e8c946b17badbe402adc938f
with:
fetch-depth: 0
- name: Setup non-root user
@@ -316,7 +316,7 @@ jobs:
# Remove GnuPG since we want to bootstrap it
sudo rm -rf /usr/local/bin/gpg
- name: Checkout
uses: actions/checkout@ac593985615ec2ede58e132d2e21d2b1cbd6127c
uses: actions/checkout@24cb9080177205b6e8c946b17badbe402adc938f
- name: Bootstrap GnuPG
run: |
source share/spack/setup-env.sh
@@ -333,7 +333,7 @@ jobs:
# Remove GnuPG since we want to bootstrap it
sudo rm -rf /usr/local/bin/gpg
- name: Checkout
uses: actions/checkout@ac593985615ec2ede58e132d2e21d2b1cbd6127c
uses: actions/checkout@24cb9080177205b6e8c946b17badbe402adc938f
- name: Bootstrap GnuPG
run: |
source share/spack/setup-env.sh

View File

@@ -50,7 +50,7 @@ jobs:
if: github.repository == 'spack/spack'
steps:
- name: Checkout
uses: actions/checkout@ac593985615ec2ede58e132d2e21d2b1cbd6127c # @v2
uses: actions/checkout@24cb9080177205b6e8c946b17badbe402adc938f # @v2
- name: Set Container Tag Normal (Nightly)
run: |
@@ -89,7 +89,7 @@ jobs:
uses: docker/setup-qemu-action@e81a89b1732b9c48d79cd809d8d81d79c4647a18 # @v1
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@f03ac48505955848960e80bbb68046aa35c7b9e7 # @v1
uses: docker/setup-buildx-action@4b4e9c3e2d4531116a6f8ba8e71fc6e2cb6e6c8c # @v1
- name: Log in to GitHub Container Registry
uses: docker/login-action@f4ef78c080cd8ba55a85445d5b36e214a81df20a # @v1

View File

@@ -35,7 +35,7 @@ jobs:
core: ${{ steps.filter.outputs.core }}
packages: ${{ steps.filter.outputs.packages }}
steps:
- uses: actions/checkout@ac593985615ec2ede58e132d2e21d2b1cbd6127c # @v2
- uses: actions/checkout@24cb9080177205b6e8c946b17badbe402adc938f # @v2
if: ${{ github.event_name == 'push' }}
with:
fetch-depth: 0

View File

@@ -47,7 +47,7 @@ jobs:
on_develop: false
steps:
- uses: actions/checkout@ac593985615ec2ede58e132d2e21d2b1cbd6127c # @v2
- uses: actions/checkout@24cb9080177205b6e8c946b17badbe402adc938f # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@d27e3f3d7c64b4bbf8e4abfb9b63b83e846e0435 # @v2
@@ -94,7 +94,7 @@ jobs:
shell:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@ac593985615ec2ede58e132d2e21d2b1cbd6127c # @v2
- uses: actions/checkout@24cb9080177205b6e8c946b17badbe402adc938f # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@d27e3f3d7c64b4bbf8e4abfb9b63b83e846e0435 # @v2
@@ -133,7 +133,7 @@ jobs:
dnf install -y \
bzip2 curl file gcc-c++ gcc gcc-gfortran git gnupg2 gzip \
make patch tcl unzip which xz
- uses: actions/checkout@ac593985615ec2ede58e132d2e21d2b1cbd6127c # @v2
- uses: actions/checkout@24cb9080177205b6e8c946b17badbe402adc938f # @v2
- name: Setup repo and non-root user
run: |
git --version
@@ -151,7 +151,7 @@ jobs:
clingo-cffi:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@ac593985615ec2ede58e132d2e21d2b1cbd6127c # @v2
- uses: actions/checkout@24cb9080177205b6e8c946b17badbe402adc938f # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@d27e3f3d7c64b4bbf8e4abfb9b63b83e846e0435 # @v2
@@ -185,7 +185,7 @@ jobs:
matrix:
python-version: ["3.10"]
steps:
- uses: actions/checkout@ac593985615ec2ede58e132d2e21d2b1cbd6127c # @v2
- uses: actions/checkout@24cb9080177205b6e8c946b17badbe402adc938f # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@d27e3f3d7c64b4bbf8e4abfb9b63b83e846e0435 # @v2

View File

@@ -18,7 +18,7 @@ jobs:
validate:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@ac593985615ec2ede58e132d2e21d2b1cbd6127c # @v2
- uses: actions/checkout@24cb9080177205b6e8c946b17badbe402adc938f # @v2
- uses: actions/setup-python@d27e3f3d7c64b4bbf8e4abfb9b63b83e846e0435 # @v2
with:
python-version: '3.11'
@@ -35,7 +35,7 @@ jobs:
style:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@ac593985615ec2ede58e132d2e21d2b1cbd6127c # @v2
- uses: actions/checkout@24cb9080177205b6e8c946b17badbe402adc938f # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@d27e3f3d7c64b4bbf8e4abfb9b63b83e846e0435 # @v2

View File

@@ -15,7 +15,7 @@ jobs:
unit-tests:
runs-on: windows-latest
steps:
- uses: actions/checkout@ac593985615ec2ede58e132d2e21d2b1cbd6127c
- uses: actions/checkout@24cb9080177205b6e8c946b17badbe402adc938f
with:
fetch-depth: 0
- uses: actions/setup-python@d27e3f3d7c64b4bbf8e4abfb9b63b83e846e0435
@@ -39,7 +39,7 @@ jobs:
unit-tests-cmd:
runs-on: windows-latest
steps:
- uses: actions/checkout@ac593985615ec2ede58e132d2e21d2b1cbd6127c
- uses: actions/checkout@24cb9080177205b6e8c946b17badbe402adc938f
with:
fetch-depth: 0
- uses: actions/setup-python@d27e3f3d7c64b4bbf8e4abfb9b63b83e846e0435
@@ -63,7 +63,7 @@ jobs:
build-abseil:
runs-on: windows-latest
steps:
- uses: actions/checkout@ac593985615ec2ede58e132d2e21d2b1cbd6127c
- uses: actions/checkout@24cb9080177205b6e8c946b17badbe402adc938f
with:
fetch-depth: 0
- uses: actions/setup-python@d27e3f3d7c64b4bbf8e4abfb9b63b83e846e0435
@@ -87,7 +87,7 @@ jobs:
# git config --global core.symlinks false
# shell:
# powershell
# - uses: actions/checkout@ac593985615ec2ede58e132d2e21d2b1cbd6127c
# - uses: actions/checkout@24cb9080177205b6e8c946b17badbe402adc938f
# with:
# fetch-depth: 0
# - uses: actions/setup-python@d27e3f3d7c64b4bbf8e4abfb9b63b83e846e0435

View File

@@ -13,16 +13,18 @@ concretizer:
# Whether to consider installed packages or packages from buildcaches when
# concretizing specs. If `true`, we'll try to use as many installs/binaries
# as possible, rather than building. If `false`, we'll always give you a fresh
# concretization.
reuse: true
# concretization. If `dependencies`, we'll only reuse dependencies but
# give you a fresh concretization for your root specs.
reuse: dependencies
# Options that tune which targets are considered for concretization. The
# concretization process is very sensitive to the number targets, and the time
# needed to reach a solution increases noticeably with the number of targets
# considered.
targets:
# Determine whether we want to target specific or generic microarchitectures.
# An example of the first kind might be for instance "skylake" or "bulldozer",
# while generic microarchitectures are for instance "aarch64" or "x86_64_v4".
# Determine whether we want to target specific or generic
# microarchitectures. Valid values are: "microarchitectures" or "generic".
# An example of "microarchitectures" would be "skylake" or "bulldozer",
# while an example of "generic" would be "aarch64" or "x86_64_v4".
granularity: microarchitectures
# If "false" allow targets that are incompatible with the current host (for
# instance concretize with target "icelake" while running on "haswell").
@@ -33,4 +35,4 @@ concretizer:
# environments can always be activated. When "false" perform concretization separately
# on each root spec, allowing different versions and variants of the same package in
# an environment.
unify: true
unify: true

View File

@@ -46,7 +46,7 @@ modules:
tcl:
all:
autoload: none
autoload: direct
# Default configurations if lmod is enabled
lmod:

View File

@@ -28,7 +28,7 @@ packages:
gl: [glx, osmesa]
glu: [mesa-glu, openglu]
golang: [go, gcc]
go-external-or-gccgo-bootstrap: [go-bootstrap, gcc]
go-or-gccgo-bootstrap: [go-bootstrap, gcc]
iconv: [libiconv]
ipp: [intel-ipp]
java: [openjdk, jdk, ibm-java]

View File

@@ -942,7 +942,7 @@ first ``libelf`` above, you would run:
$ spack load /qmm4kso
To see which packages that you have loaded to your enviornment you would
To see which packages that you have loaded to your environment you would
use ``spack find --loaded``.
.. code-block:: console

View File

@@ -18,7 +18,7 @@ your Spack mirror and then downloaded and installed by others.
Whenever a mirror provides prebuilt packages, Spack will take these packages
into account during concretization and installation, making ``spack install``
signficantly faster.
significantly faster.
.. note::

View File

@@ -28,11 +28,14 @@ This package provides the following variants:
* **cuda_arch**
This variant supports the optional specification of the architecture.
This variant supports the optional specification of one or multiple architectures.
Valid values are maintained in the ``cuda_arch_values`` property and
are the numeric character equivalent of the compute capability version
(e.g., '10' for version 1.0). Each provided value affects associated
``CUDA`` dependencies and compiler conflicts.
The variant builds both PTX code for the _virtual_ architecture
(e.g. ``compute_10``) and binary code for the _real_ architecture (e.g. ``sm_10``).
GPUs and their compute capability versions are listed at
https://developer.nvidia.com/cuda-gpus .

View File

@@ -124,7 +124,7 @@ Using oneAPI Tools Installed by Spack
=====================================
Spack can be a convenient way to install and configure compilers and
libaries, even if you do not intend to build a Spack package. If you
libraries, even if you do not intend to build a Spack package. If you
want to build a Makefile project using Spack-installed oneAPI compilers,
then use spack to configure your environment::

View File

@@ -397,7 +397,7 @@ for specifics and examples for ``packages.yaml`` files.
.. If your system administrator did not provide modules for pre-installed Intel
tools, you could do well to ask for them, because installing multiple copies
of the Intel tools, as is wont to happen once Spack is in the picture, is
of the Intel tools, as is won't to happen once Spack is in the picture, is
bound to stretch disk space and patience thin. If you *are* the system
administrator and are still new to modules, then perhaps it's best to follow
the `next section <Installing Intel tools within Spack_>`_ and install the tools
@@ -653,7 +653,7 @@ follow `the next section <intel-install-libs_>`_ instead.
* If you specified a custom variant (for example ``+vtune``) you may want to add this as your
preferred variant in the packages configuration for the ``intel-parallel-studio`` package
as described in :ref:`package-preferences`. Otherwise you will have to specify
the variant everytime ``intel-parallel-studio`` is being used as ``mkl``, ``fftw`` or ``mpi``
the variant every time ``intel-parallel-studio`` is being used as ``mkl``, ``fftw`` or ``mpi``
implementation to avoid pulling in a different variant.
* To set the Intel compilers for default use in Spack, instead of the usual ``%gcc``,

View File

@@ -582,7 +582,7 @@ libraries. Make sure not to add modules/packages containing the word
"test", as these likely won't end up in the installation directory,
or may require test dependencies like pytest to be installed.
Instead of defining the ``import_modules`` explicity, only the subset
Instead of defining the ``import_modules`` explicitly, only the subset
of module names to be skipped can be defined by using ``skip_modules``.
If a defined module has submodules, they are skipped as well, e.g.,
in case the ``plotting`` modules should be excluded from the

View File

@@ -227,6 +227,9 @@ You can get the name to use for ``<platform>`` by running ``spack arch
--platform``. The system config scope has a ``<platform>`` section for
sites at which ``/etc`` is mounted on multiple heterogeneous machines.
.. _config-scope-precedence:
----------------
Scope Precedence
----------------
@@ -239,6 +242,11 @@ lower-precedence settings. Completely ignoring higher-level configuration
options is supported with the ``::`` notation for keys (see
:ref:`config-overrides` below).
There are also special notations for string concatenation and precendense override.
Using the ``+:`` notation can be used to force *prepending* strings or lists. For lists, this is identical
to the default behavior. Using the ``-:`` works similarly, but for *appending* values.
:ref:`config-prepend-append`
^^^^^^^^^^^
Simple keys
^^^^^^^^^^^
@@ -279,6 +287,47 @@ command:
- ~/.spack/stage
.. _config-prepend-append:
^^^^^^^^^^^^^^^^^^^^
String Concatenation
^^^^^^^^^^^^^^^^^^^^
Above, the user ``config.yaml`` *completely* overrides specific settings in the
default ``config.yaml``. Sometimes, it is useful to add a suffix/prefix
to a path or name. To do this, you can use the ``-:`` notation for *append*
string concatenation at the end of a key in a configuration file. For example:
.. code-block:: yaml
:emphasize-lines: 1
:caption: ~/.spack/config.yaml
config:
install_tree-: /my/custom/suffix/
Spack will then append to the lower-precedence configuration under the
``install_tree-:`` section:
.. code-block:: console
$ spack config get config
config:
install_tree: /some/other/directory/my/custom/suffix
build_stage:
- $tempdir/$user/spack-stage
- ~/.spack/stage
Similarly, ``+:`` can be used to *prepend* to a path or name:
.. code-block:: yaml
:emphasize-lines: 1
:caption: ~/.spack/config.yaml
config:
install_tree+: /my/custom/suffix/
.. _config-overrides:
^^^^^^^^^^^^^^^^^^^^^^^^^^

View File

@@ -472,7 +472,7 @@ use my new hook as follows:
.. code-block:: python
def post_log_write(message, level):
"""Do something custom with the messsage and level every time we write
"""Do something custom with the message and level every time we write
to the log
"""
print('running post_log_write!')

View File

@@ -1597,8 +1597,8 @@ in a Windows CMD prompt.
.. note::
If you chose to install Spack into a directory on Windows that is set up to require Administrative
Privleges, Spack will require elevated privleges to run.
Administrative Privleges can be denoted either by default such as
Privileges, Spack will require elevated privileges to run.
Administrative Privileges can be denoted either by default such as
``C:\Program Files``, or aministrator applied administrative restrictions
on a directory that spack installs files to such as ``C:\Users``
@@ -1694,7 +1694,7 @@ Spack console via:
spack install cpuinfo
If in the previous step, you did not have CMake or Ninja installed, running the command above should boostrap both packages
If in the previous step, you did not have CMake or Ninja installed, running the command above should bootstrap both packages
"""""""""""""""""""""""""""
Windows Compatible Packages

View File

@@ -13,7 +13,7 @@ The use of module systems to manage user environment in a controlled way
is a common practice at HPC centers that is often embraced also by
individual programmers on their development machines. To support this
common practice Spack integrates with `Environment Modules
<http://modules.sourceforge.net/>`_ and `LMod
<http://modules.sourceforge.net/>`_ and `Lmod
<http://lmod.readthedocs.io/en/latest/>`_ by providing post-install hooks
that generate module files and commands to manipulate them.
@@ -26,8 +26,8 @@ Using module files via Spack
----------------------------
If you have installed a supported module system you should be able to
run either ``module avail`` or ``use -l spack`` to see what module
files have been installed. Here is sample output of those programs,
run ``module avail`` to see what module
files have been installed. Here is sample output of those programs,
showing lots of installed packages:
.. code-block:: console
@@ -51,12 +51,7 @@ showing lots of installed packages:
help2man-1.47.4-gcc-4.8-kcnqmau lua-luaposix-33.4.0-gcc-4.8-mdod2ry netlib-scalapack-2.0.2-gcc-6.3.0-rgqfr6d py-scipy-0.19.0-gcc-6.3.0-kr7nat4 zlib-1.2.11-gcc-6.3.0-7cqp6cj
The names should look familiar, as they resemble the output from ``spack find``.
You *can* use the modules here directly. For example, you could type either of these commands
to load the ``cmake`` module:
.. code-block:: console
$ use cmake-3.7.2-gcc-6.3.0-fowuuby
For example, you could type the following command to load the ``cmake`` module:
.. code-block:: console
@@ -93,9 +88,9 @@ the different file formats that can be generated by Spack:
+-----------------------------+--------------------+-------------------------------+----------------------------------------------+----------------------+
| | **Hook name** | **Default root directory** | **Default template file** | **Compatible tools** |
+=============================+====================+===============================+==============================================+======================+
| **TCL - Non-Hierarchical** | ``tcl`` | share/spack/modules | share/spack/templates/modules/modulefile.tcl | Env. Modules/LMod |
| **Tcl - Non-Hierarchical** | ``tcl`` | share/spack/modules | share/spack/templates/modules/modulefile.tcl | Env. Modules/Lmod |
+-----------------------------+--------------------+-------------------------------+----------------------------------------------+----------------------+
| **Lua - Hierarchical** | ``lmod`` | share/spack/lmod | share/spack/templates/modules/modulefile.lua | LMod |
| **Lua - Hierarchical** | ``lmod`` | share/spack/lmod | share/spack/templates/modules/modulefile.lua | Lmod |
+-----------------------------+--------------------+-------------------------------+----------------------------------------------+----------------------+
@@ -396,13 +391,13 @@ name and version for all packages that depend on mpi.
When specifying module names by projection for Lmod modules, we
recommend NOT including names of dependencies (e.g., MPI, compilers)
that are already in the LMod hierarchy.
that are already in the Lmod hierarchy.
.. note::
TCL modules
TCL modules also allow for explicit conflicts between modulefiles.
Tcl modules
Tcl modules also allow for explicit conflicts between modulefiles.
.. code-block:: yaml
@@ -426,9 +421,9 @@ that are already in the LMod hierarchy.
.. note::
LMod hierarchical module files
Lmod hierarchical module files
When ``lmod`` is activated Spack will generate a set of hierarchical lua module
files that are understood by LMod. The hierarchy will always contain the
files that are understood by Lmod. The hierarchy will always contain the
two layers ``Core`` / ``Compiler`` but can be further extended to
any of the virtual dependencies present in Spack. A case that could be useful in
practice is for instance:
@@ -450,7 +445,7 @@ that are already in the LMod hierarchy.
that will generate a hierarchy in which the ``lapack`` and ``mpi`` layer can be switched
independently. This allows a site to build the same libraries or applications against different
implementations of ``mpi`` and ``lapack``, and let LMod switch safely from one to the
implementations of ``mpi`` and ``lapack``, and let Lmod switch safely from one to the
other.
All packages built with a compiler in ``core_compilers`` and all
@@ -460,12 +455,12 @@ that are already in the LMod hierarchy.
.. warning::
Consistency of Core packages
The user is responsible for maintaining consistency among core packages, as ``core_specs``
bypasses the hierarchy that allows LMod to safely switch between coherent software stacks.
bypasses the hierarchy that allows Lmod to safely switch between coherent software stacks.
.. warning::
Deep hierarchies and ``lmod spider``
For hierarchies that are deeper than three layers ``lmod spider`` may have some issues.
See `this discussion on the LMod project <https://github.com/TACC/Lmod/issues/114>`_.
See `this discussion on the Lmod project <https://github.com/TACC/Lmod/issues/114>`_.
""""""""""""""""""""""
Select default modules
@@ -534,7 +529,7 @@ installed to ``/spack/prefix/foo``, if ``foo`` installs executables to
update ``MANPATH``.
The default list of environment variables in this config section
inludes ``PATH``, ``MANPATH``, ``ACLOCAL_PATH``, ``PKG_CONFIG_PATH``
includes ``PATH``, ``MANPATH``, ``ACLOCAL_PATH``, ``PKG_CONFIG_PATH``
and ``CMAKE_PREFIX_PATH``, as well as ``DYLD_FALLBACK_LIBRARY_PATH``
on macOS. On Linux however, the corresponding ``LD_LIBRARY_PATH``
variable is *not* set, because it affects the behavior of
@@ -634,8 +629,9 @@ by its dependency; when the dependency is autoloaded, the executable will be in
PATH. Similarly for scripting languages such as Python, packages and their dependencies
have to be loaded together.
Autoloading is enabled by default for LMod, as it has great builtin support for through
the ``depends_on`` function. For Environment Modules it is disabled by default.
Autoloading is enabled by default for Lmod and Environment Modules. The former
has builtin support for this through the ``depends_on`` function. The latter uses
``module load`` statements to load and track dependencies.
Autoloading can also be enabled conditionally:
@@ -655,12 +651,14 @@ The allowed values for the ``autoload`` statement are either ``none``,
``direct`` or ``all``.
.. note::
TCL prerequisites
Tcl prerequisites
In the ``tcl`` section of the configuration file it is possible to use
the ``prerequisites`` directive that accepts the same values as
``autoload``. It will produce module files that have a ``prereq``
statement, which can be used to autoload dependencies in some versions
of Environment Modules.
statement, which autoloads dependencies on Environment Modules when its
``auto_handling`` configuration option is enabled. If Environment Modules
is installed with Spack, ``auto_handling`` is enabled by default starting with
version 4.2. Otherwise it is enabled by default as of version 5.0.
------------------------
Maintaining Module Files


@@ -9,27 +9,32 @@
CI Pipelines
============
Spack provides commands that support generating and running automated build
pipelines designed for Gitlab CI. At the highest level it works like this:
provide a spack environment describing the set of packages you care about,
and include within that environment file a description of how those packages
should be mapped to Gitlab runners. Spack can then generate a ``.gitlab-ci.yml``
file containing job descriptions for all your packages that can be run by a
properly configured Gitlab CI instance. When run, the generated pipeline will
build and deploy binaries, and it can optionally report to a CDash instance
Spack provides commands that support generating and running automated build pipelines in CI instances. At the highest
level it works like this: provide a spack environment describing the set of packages you care about, and include a
description of how those packages should be mapped to Gitlab runners. Spack can then generate a ``.gitlab-ci.yml``
file containing job descriptions for all your packages that can be run by a properly configured CI instance. When
run, the generated pipeline will build and deploy binaries, and it can optionally report to a CDash instance
regarding the health of the builds as they evolve over time.
------------------------------
Getting started with pipelines
------------------------------
It is fairly straightforward to get started with automated build pipelines. At
a minimum, you'll need to set up a Gitlab instance (more about Gitlab CI
`here <https://about.gitlab.com/product/continuous-integration/>`_) and configure
at least one `runner <https://docs.gitlab.com/runner/>`_. Then the basic steps
for setting up a build pipeline are as follows:
To get started with automated build pipelines, you need a Gitlab instance with version ``>= 12.9``
(more about Gitlab CI `here <https://about.gitlab.com/product/continuous-integration/>`_)
and at least one `runner <https://docs.gitlab.com/runner/>`_ configured. This
can be done quickly by setting up a local Gitlab instance.
#. Create a repository on your gitlab instance
It is possible to set up pipelines on gitlab.com, but the builds there are limited to
60 minutes and generic hardware. It is, however, possible to
`hook up <https://about.gitlab.com/blog/2018/04/24/getting-started-gitlab-ci-gcp>`_
Gitlab to Google Kubernetes Engine (`GKE <https://cloud.google.com/kubernetes-engine/>`_)
or Amazon Elastic Kubernetes Service (`EKS <https://aws.amazon.com/eks>`_), though those
topics are outside the scope of this document.
After setting up a Gitlab instance for running CI, the basic steps for setting up a build pipeline are as follows:
#. Create a repository in the Gitlab instance with CI and a runner enabled.
#. Add a ``spack.yaml`` at the root containing your pipeline environment
#. Add a ``.gitlab-ci.yml`` at the root containing two jobs (one to generate
the pipeline dynamically, and one to run the generated jobs).
@@ -40,13 +45,6 @@ See the :ref:`functional_example` section for a minimal working example. See al
the :ref:`custom_Workflow` section for a link to an example of a custom workflow
based on spack pipelines.
While it is possible to set up pipelines on gitlab.com, as illustrated above, the
builds there are limited to 60 minutes and generic hardware. It is also possible to
`hook up <https://about.gitlab.com/blog/2018/04/24/getting-started-gitlab-ci-gcp>`_
Gitlab to Google Kubernetes Engine (`GKE <https://cloud.google.com/kubernetes-engine/>`_)
or Amazon Elastic Kubernetes Service (`EKS <https://aws.amazon.com/eks>`_), though those
topics are outside the scope of this document.
Spack's pipelines are now making use of the
`trigger <https://docs.gitlab.com/ee/ci/yaml/#trigger>`_ syntax to run
dynamically generated
@@ -132,29 +130,35 @@ And here's the spack environment built by the pipeline represented as a
mirrors: { "mirror": "s3://spack-public/mirror" }
gitlab-ci:
before_script:
- git clone ${SPACK_REPO}
- pushd spack && git checkout ${SPACK_CHECKOUT_VERSION} && popd
- . "./spack/share/spack/setup-env.sh"
script:
- pushd ${SPACK_CONCRETE_ENV_DIR} && spack env activate --without-view . && popd
- spack -d ci rebuild
mappings:
- match: ["os=ubuntu18.04"]
runner-attributes:
image:
name: ghcr.io/scottwittenburg/ecpe4s-ubuntu18.04-runner-x86_64:2020-09-01
entrypoint: [""]
tags:
- docker
ci:
enable-artifacts-buildcache: True
rebuild-index: False
pipeline-gen:
- any-job:
before_script:
- git clone ${SPACK_REPO}
- pushd spack && git checkout ${SPACK_CHECKOUT_VERSION} && popd
- . "./spack/share/spack/setup-env.sh"
- build-job:
tags: [docker]
image:
name: ghcr.io/scottwittenburg/ecpe4s-ubuntu18.04-runner-x86_64:2020-09-01
entrypoint: [""]
The elements of this file important to spack ci pipelines are described in more
detail below, but there are a couple of things to note about the above working
example:
.. note::
There is no ``script`` attribute specified here. The reason for this is that
Spack CI will automatically generate reasonable default scripts. More
detail on what is in these scripts can be found below.
Also notice the ``before_script`` section. It is required when using any of the
default scripts to source the ``setup-env.sh`` script in order to inform
the default scripts where to find the ``spack`` executable.
Normally ``enable-artifacts-buildcache`` is not recommended in production as it
results in large binary artifacts getting transferred back and forth between
gitlab and the runners. But in this example on gitlab.com where there is no
@@ -174,7 +178,7 @@ during subsequent pipeline runs.
With the addition of reproducible builds (#22887) a previously working
pipeline will require some changes:
* In the build jobs (``runner-attributes``), the environment location changed.
* In the build-jobs, the environment location changed.
This will typically show as a ``KeyError`` in the failing job. Be sure to
point to ``${SPACK_CONCRETE_ENV_DIR}``.
@@ -196,9 +200,9 @@ ci pipelines. These commands are covered in more detail in this section.
.. _cmd-spack-ci:
^^^^^^^^^^^^^^^^^^
^^^^^^^^^^^^
``spack ci``
^^^^^^^^^^^^^^^^^^
^^^^^^^^^^^^
Super-command for functionality related to generating pipelines and executing
pipeline jobs.
@@ -227,7 +231,7 @@ Using ``--prune-dag`` or ``--no-prune-dag`` configures whether or not jobs are
generated for specs that are already up to date on the mirror. If enabling
DAG pruning using ``--prune-dag``, more information may be required in your
``spack.yaml`` file, see the :ref:`noop_jobs` section below regarding
``service-job-attributes``.
``noop-job``.
The optional ``--check-index-only`` argument can be used to speed up pipeline
generation by telling spack to consider only remote buildcache indices when
@@ -263,11 +267,11 @@ generated by jobs in the pipeline.
.. _cmd-spack-ci-rebuild:
^^^^^^^^^^^^^^^^^^^^^
^^^^^^^^^^^^^^^^^^^^
``spack ci rebuild``
^^^^^^^^^^^^^^^^^^^^^
^^^^^^^^^^^^^^^^^^^^
The purpose of ``spack ci rebuild`` is straightforward: take its assigned
The purpose of ``spack ci rebuild`` is to take an assigned
spec and ensure a binary of a successful build exists on the target mirror.
If the binary does not already exist, it is built from source and pushed
to the mirror. The associated stand-alone tests are optionally run against
@@ -280,7 +284,7 @@ directory. The script is run in a job to install the spec from source. The
resulting binary package is pushed to the mirror. If ``cdash`` is configured
for the environment, then the build results will be uploaded to the site.
Environment variables and values in the ``gitlab-ci`` section of the
Environment variables and values in the ``ci::pipeline-gen`` section of the
``spack.yaml`` environment file provide inputs to this process. The
two main sources of environment variables are variables written into
``.gitlab-ci.yml`` by ``spack ci generate`` and the GitLab CI runtime.
@@ -298,21 +302,23 @@ A snippet from an example ``spack.yaml`` file illustrating use of this
option *and* specification of a package with broken tests is given below.
The inclusion of a spec for building ``gptune`` is not shown here. Note
that ``--tests`` is passed to ``spack ci rebuild`` as part of the
``gitlab-ci`` script.
``build-job`` script.
.. code-block:: yaml
gitlab-ci:
script:
- . "./share/spack/setup-env.sh"
- spack --version
- cd ${SPACK_CONCRETE_ENV_DIR}
- spack env activate --without-view .
- spack config add "config:install_tree:projections:${SPACK_JOB_SPEC_PKG_NAME}:'morepadding/{architecture}/{compiler.name}-{compiler.version}/{name}-{version}-{hash}'"
- mkdir -p ${SPACK_ARTIFACTS_ROOT}/user_data
- if [[ -r /mnt/key/intermediate_ci_signing_key.gpg ]]; then spack gpg trust /mnt/key/intermediate_ci_signing_key.gpg; fi
- if [[ -r /mnt/key/spack_public_key.gpg ]]; then spack gpg trust /mnt/key/spack_public_key.gpg; fi
- spack -d ci rebuild --tests > >(tee ${SPACK_ARTIFACTS_ROOT}/user_data/pipeline_out.txt) 2> >(tee ${SPACK_ARTIFACTS_ROOT}/user_data/pipeline_err.txt >&2)
ci:
pipeline-gen:
- build-job:
script:
- . "./share/spack/setup-env.sh"
- spack --version
- cd ${SPACK_CONCRETE_ENV_DIR}
- spack env activate --without-view .
- spack config add "config:install_tree:projections:${SPACK_JOB_SPEC_PKG_NAME}:'morepadding/{architecture}/{compiler.name}-{compiler.version}/{name}-{version}-{hash}'"
- mkdir -p ${SPACK_ARTIFACTS_ROOT}/user_data
- if [[ -r /mnt/key/intermediate_ci_signing_key.gpg ]]; then spack gpg trust /mnt/key/intermediate_ci_signing_key.gpg; fi
- if [[ -r /mnt/key/spack_public_key.gpg ]]; then spack gpg trust /mnt/key/spack_public_key.gpg; fi
- spack -d ci rebuild --tests > >(tee ${SPACK_ARTIFACTS_ROOT}/user_data/pipeline_out.txt) 2> >(tee ${SPACK_ARTIFACTS_ROOT}/user_data/pipeline_err.txt >&2)
broken-tests-packages:
- gptune
@@ -354,113 +360,31 @@ arguments you can pass to ``spack ci reproduce-build`` in order to reproduce
a particular build locally.
------------------------------------
A pipeline-enabled spack environment
Job Types
------------------------------------
Here's an example of a spack environment file that has been enhanced with
sections describing a build pipeline:
^^^^^^^^^^^^^^^
Rebuild (build)
^^^^^^^^^^^^^^^
.. code-block:: yaml
Rebuild jobs, denoted as ``build-job``'s in the ``pipeline-gen`` list, are jobs
associated with concrete specs that have been marked for rebuild. By default a simple
rebuild script is generated, but it may be modified as needed.
spack:
definitions:
- pkgs:
- readline@7.0
- compilers:
- '%gcc@5.5.0'
- oses:
- os=ubuntu18.04
- os=centos7
specs:
- matrix:
- [$pkgs]
- [$compilers]
- [$oses]
mirrors:
cloud_gitlab: https://mirror.spack.io
gitlab-ci:
mappings:
- match:
- os=ubuntu18.04
runner-attributes:
tags:
- spack-kube
image: spack/ubuntu-bionic
- match:
- os=centos7
runner-attributes:
tags:
- spack-kube
image: spack/centos7
cdash:
build-group: Release Testing
url: https://cdash.spack.io
project: Spack
site: Spack AWS Gitlab Instance
The default script performs three main steps: change directories to the pipeline's concrete
environment, activate the concrete environment, and run the ``spack ci rebuild`` command:
Hopefully, the ``definitions``, ``specs``, ``mirrors``, etc. sections are already
familiar, as they are part of spack :ref:`environments`. So let's take a more
in-depth look some of the pipeline-related sections in that environment file
that might not be as familiar.
.. code-block:: bash
The ``gitlab-ci`` section is used to configure how the pipeline workload should be
generated, mainly how the jobs for building specs should be assigned to the
configured runners on your instance. Each entry within the list of ``mappings``
corresponds to a known gitlab runner, where the ``match`` section is used
in assigning a release spec to one of the runners, and the ``runner-attributes``
section is used to configure the spec/job for that particular runner.
Both the top-level ``gitlab-ci`` section as well as each ``runner-attributes``
section can also contain the following keys: ``image``, ``tags``, ``variables``,
``before_script``, ``script``, and ``after_script``. If any of these keys are
provided at the ``gitlab-ci`` level, they will be used as the defaults for any
``runner-attributes``, unless they are overridden in those sections. Specifying
any of these keys at the ``runner-attributes`` level generally overrides the
keys specified at the higher level, with a couple exceptions. Any ``variables``
specified at both levels result in those dictionaries getting merged in the
resulting generated job, and any duplicate variable names get assigned the value
provided in the specific ``runner-attributes``. If ``tags`` are specified both
at the ``gitlab-ci`` level as well as the ``runner-attributes`` level, then the
lists of tags are combined, and any duplicates are removed.
See the section below on using a custom spack for an example of how these keys
could be used.
There are other pipeline options you can configure within the ``gitlab-ci`` section
as well.
The ``bootstrap`` section allows you to specify lists of specs from
your ``definitions`` that should be staged ahead of the environment's ``specs`` (this
section is described in more detail below). The ``enable-artifacts-buildcache`` key
takes a boolean and determines whether the pipeline uses artifacts to store and
pass along the buildcaches from one stage to the next (the default if you don't
provide this option is ``False``).
The optional ``broken-specs-url`` key tells Spack to check against a list of
specs that are known to be currently broken in ``develop``. If any such specs
are found, the ``spack ci generate`` command will fail with an error message
informing the user what broken specs were encountered. This allows the pipeline
to fail early and avoid wasting compute resources attempting to build packages
that will not succeed.
The optional ``cdash`` section provides information that will be used by the
``spack ci generate`` command (invoked by ``spack ci start``) for reporting
to CDash. All the jobs generated from this environment will belong to a
"build group" within CDash that can be tracked over time. As the release
progresses, this build group may have jobs added or removed. The url, project,
and site are used to specify the CDash instance to which build results should
be reported.
Take a look at the
`schema <https://github.com/spack/spack/blob/develop/lib/spack/spack/schema/gitlab_ci.py>`_
for the gitlab-ci section of the spack environment file, to see precisely what
syntax is allowed there.
cd ${concrete_environment_dir}
spack env activate --without-view .
spack ci rebuild
.. _rebuild_index:
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Note about rebuilding buildcache index
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
^^^^^^^^^^^^^^^^^^^^^^
Update Index (reindex)
^^^^^^^^^^^^^^^^^^^^^^
By default, while a pipeline job may rebuild a package, create a buildcache
entry, and push it to the mirror, it does not automatically re-generate the
@@ -475,21 +399,44 @@ not correctly reflect the mirror's contents at the end of a pipeline.
To make sure the buildcache index is up to date at the end of your pipeline,
spack generates a job to update the buildcache index of the target mirror
at the end of each pipeline by default. You can disable this behavior by
adding ``rebuild-index: False`` inside the ``gitlab-ci`` section of your
spack environment. Spack will assign the job any runner attributes found
on the ``service-job-attributes``, if you have provided that in your
``spack.yaml``.
adding ``rebuild-index: False`` inside the ``ci`` section of your
spack environment.
Reindex jobs do not allow modifying the ``script`` attribute since it is automatically
generated using the target mirror listed in the ``mirrors::mirror`` configuration.
^^^^^^^^^^^^^^^^^
Signing (signing)
^^^^^^^^^^^^^^^^^
This job is run after all of the rebuild jobs are completed and is intended to be used
to sign the package binaries built by a protected CI run. Signing jobs are generated
only if a signing job ``script`` is specified and the spack CI job type is protected.
Note that if an ``any-job`` section contains a script, this will not implicitly create a
``signing`` job; a signing job may only exist if it is explicitly specified in the
configuration with a ``script`` attribute. Specifying a signing job without a script
does not create a signing job, and its job configuration attributes will be ignored.
Signing jobs are always assigned the runner tags ``aws``, ``protected``, and ``notary``.
^^^^^^^^^^^^^^^^^
Cleanup (cleanup)
^^^^^^^^^^^^^^^^^
When using ``temporary-storage-url-prefix`` the cleanup job will destroy the mirror
created for the associated Gitlab pipeline. Cleanup jobs do not allow modifying the
script, but they do expect the ``spack`` command to be in the path, and they require a
``before_script`` to be specified that sources the ``setup-env.sh`` script.
.. _noop_jobs:
^^^^^^^^^^^^^^^^^^^^^^^
Note about "no-op" jobs
^^^^^^^^^^^^^^^^^^^^^^^
^^^^^^^^^^^^
No Op (noop)
^^^^^^^^^^^^
If no specs in an environment need to be rebuilt during a given pipeline run
(meaning all are already up to date on the mirror), a single successful job
(a NO-OP) is still generated to avoid an empty pipeline (which GitLab
considers to be an error). An optional ``service-job-attributes`` section
considers to be an error). The ``noop-job*`` sections
can be added to your ``spack.yaml`` where you can provide ``tags`` and
``image`` or ``variables`` for the generated NO-OP job. This section also
supports providing ``before_script``, ``script``, and ``after_script``, in
@@ -499,51 +446,100 @@ Following is an example of this section added to a ``spack.yaml``:
.. code-block:: yaml
spack:
specs:
- openmpi
mirrors:
cloud_gitlab: https://mirror.spack.io
gitlab-ci:
mappings:
- match:
- os=centos8
runner-attributes:
tags:
- custom
- tag
image: spack/centos7
service-job-attributes:
tags: ['custom', 'tag']
image:
name: 'some.image.registry/custom-image:latest'
entrypoint: ['/bin/bash']
script:
- echo "Custom message in a custom script"
spack:
ci:
pipeline-gen:
- noop-job:
tags: ['custom', 'tag']
image:
name: 'some.image.registry/custom-image:latest'
entrypoint: ['/bin/bash']
script::
- echo "Custom message in a custom script"
The example above illustrates how you can provide the attributes used to run
the NO-OP job in the case of an empty pipeline. The only field for the NO-OP
job that might be generated for you is ``script``, but that will only happen
if you do not provide one yourself.
if you do not provide one yourself. Notice in this example the ``script``
uses the ``::`` notation to prescribe override behavior. Without this, the
``echo`` command would have been prepended to the automatically generated script
rather than replacing it.
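As a toy illustration (a hypothetical helper; Spack's real config merging lives in
``spack.config`` and handles more cases), the difference between the two spellings
looks roughly like this:

.. code-block:: python

# Toy model of the merge: a plain key prepends to the generated list,
# while a trailing-colon key ('script::') replaces it wholesale.
def merge_section(generated, user):
    merged = dict(generated)
    for key, value in user.items():
        if key.endswith(":"):  # override syntax
            merged[key.rstrip(":")] = value
        elif isinstance(value, list):  # default: prepend
            merged[key] = value + merged.get(key, [])
        else:
            merged[key] = value
    return merged

generated = {"script": ["spack ci rebuild"]}
print(merge_section(generated, {"script": ["echo custom"]}))
# {'script': ['echo custom', 'spack ci rebuild']}   (prepended)
print(merge_section(generated, {"script:": ["echo custom"]}))
# {'script': ['echo custom']}                       (replaced)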
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Assignment of specs to runners
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
------------------------------------
ci.yaml
------------------------------------
The ``mappings`` section corresponds to a list of runners, and during assignment
of specs to runners, the list is traversed in order looking for matches, the
first runner that matches a release spec is assigned to build that spec. The
``match`` section within each runner mapping section is a list of specs, and
if any of those specs match the release spec (the ``spec.satisfies()`` method
is used), then that runner is considered a match.
Here's an example of a spack configuration file describing a build pipeline:
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Configuration of specs/jobs for a runner
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.. code-block:: yaml
Once a runner has been chosen to build a release spec, the ``runner-attributes``
section provides information determining details of the job in the context of
the runner. The ``runner-attributes`` section must have a ``tags`` key, which
ci:
target: gitlab
rebuild_index: True
broken-specs-url: https://broken.specs.url
broken-tests-packages:
- gptune
pipeline-gen:
- submapping:
- match:
- os=ubuntu18.04
build-job:
tags:
- spack-kube
image: spack/ubuntu-bionic
- match:
- os=centos7
build-job:
tags:
- spack-kube
image: spack/centos7
cdash:
build-group: Release Testing
url: https://cdash.spack.io
project: Spack
site: Spack AWS Gitlab Instance
The ``ci`` config section is used to configure how the pipeline workload should be
generated, mainly how the jobs for building specs should be assigned to the
configured runners on your instance. The main section for configuring pipelines
is ``pipeline-gen``, which is a list of job attribute sections that are merged,
using the same rules as Spack configs (:ref:`config-scope-precedence`), from the bottom up.
Sections are applied in an order consistent with how Spack orders scope precedence when merging lists.
There are two main section types, ``<type>-job`` sections and ``submapping``
sections.
^^^^^^^^^^^^^^^^^^^^^^
Job Attribute Sections
^^^^^^^^^^^^^^^^^^^^^^
Each type of job may have attributes added or removed via sections in the ``pipeline-gen``
list. Job type specific attributes may be specified using the keys ``<type>-job`` to
add attributes to all jobs of type ``<type>`` or ``<type>-job-remove`` to remove attributes
of type ``<type>``. Each section may only contain one type of job attribute specification, ie. ,
``build-job`` and ``noop-job`` may not coexist but ``build-job`` and ``build-job-remove`` may.
.. note::
The ``*-remove`` specifications are applied before the additive attribute specifications.
For example, if both ``build-job`` and ``build-job-remove`` are listed in
the same ``pipeline-gen`` section, a value removed by ``build-job-remove`` but supplied
by ``build-job`` will still exist in the merged build-job after applying the section.
All of the attributes specified are forwarded to the generated CI jobs, however special
treatment is applied to the attributes ``tags``, ``image``, ``variables``, ``script``,
``before_script``, and ``after_script`` as they are components recognized explicitly by the
Spack CI generator. For the ``tags`` attribute, Spack will remove reserved tags
(:ref:`reserved_tags`) from all jobs specified in the config. In some cases, such as for
``signing`` jobs, reserved tags will be added back based on the type of CI that is being run.
Once a runner has been chosen to build a release spec, the ``build-job*``
sections provide information determining details of the job in the context of
the runner. At least one of the ``build-job*`` sections must contain a ``tags`` key, which
is a list containing at least one tag used to select the runner from among the
runners known to the gitlab instance. For Docker executor type runners, the
``image`` key is used to specify the Docker image used to build the release spec
@@ -554,7 +550,7 @@ information on to the runner that it needs to do its work (e.g. scheduler
parameters, etc.). Any ``variables`` provided here will be added, verbatim, to
each job.
The ``runner-attributes`` section also allows users to supply custom ``script``,
The ``build-job`` section also allows users to supply custom ``script``,
``before_script``, and ``after_script`` sections to be applied to every job
scheduled on that runner. This allows users to do any custom preparation or
cleanup tasks that fit their particular workflow, as well as completely
@@ -565,46 +561,45 @@ environment directory is located within your ``--artifacts_root`` (or if not
provided, within your ``$CI_PROJECT_DIR``), activates that environment for
you, and invokes ``spack ci rebuild``.
.. _staging_algorithm:
Sections that specify scripts (``script``, ``before_script``, ``after_script``) are all
read as lists of commands or lists of lists of commands. It is recommended to write scripts
as lists of lists if scripts will be composed via merging. The default behavior of merging
plain lists removes duplicate commands and may apply unwanted reordering, whereas merging
lists of lists preserves the local ordering and never removes duplicate commands. When
writing commands to the CI target script, all lists are expanded and flattened into a
single list, as sketched below.
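A toy sketch (not Spack's implementation) of why the list-of-lists form composes better:

.. code-block:: python

def merge_plain(upper, lower):
    # deduplicating merge: a command present in both lists is kept once
    return list(dict.fromkeys(upper + lower))

def merge_nested(upper, lower):
    # lists of lists are concatenated whole; each fragment keeps its order
    return upper + lower

def flatten(script):
    # expansion step when the CI target script is written
    return [c for item in script for c in (item if isinstance(item, list) else [item])]

a, b = ["setup", "run"], ["setup", "teardown"]
print(merge_plain(a, b))                # ['setup', 'run', 'teardown'] -- second 'setup' deduped
print(flatten(merge_nested([a], [b])))  # ['setup', 'run', 'setup', 'teardown']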
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Summary of ``.gitlab-ci.yml`` generation algorithm
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
^^^^^^^^^^^^^^^^^^^
Submapping Sections
^^^^^^^^^^^^^^^^^^^
All specs yielded by the matrix (or all the specs in the environment) have their
dependencies computed, and the entire resulting set of specs are staged together
before being run through the ``gitlab-ci/mappings`` entries, where each staged
spec is assigned a runner. "Staging" is the name given to the process of
figuring out in what order the specs should be built, taking into consideration
Gitlab CI rules about jobs/stages. In the staging process the goal is to maximize
the number of jobs in any stage of the pipeline, while ensuring that the jobs in
any stage only depend on jobs in previous stages (since those jobs are guaranteed
to have completed already). As a runner is determined for a job, the information
in the ``runner-attributes`` is used to populate various parts of the job
description that will be used by Gitlab CI. Once all the jobs have been assigned
a runner, the ``.gitlab-ci.yml`` is written to disk.
A special case of attribute specification is the ``submapping`` section which may be used
to apply job attributes to build jobs based on the package spec associated with the rebuild
job. Submapping is specified as a list of spec ``match`` lists associated with
``build-job``/``build-job-remove`` sections. There are two options for ``match_behavior``,
either ``first`` or ``merge`` may be specified. In either case, the ``submapping`` list is
processed from the bottom up, and then each ``match`` list is searched for a string that
satisfies the check ``spec.satisfies({match_item})`` for each concrete spec.
The short example provided above would result in the ``readline``, ``ncurses``,
and ``pkgconf`` packages getting staged and built on the runner chosen by the
``spack-k8s`` tag. In this example, spack assumes the runner is a Docker executor
type runner, and thus certain jobs will be run in the ``centos7`` container,
and others in the ``ubuntu-18.04`` container. The resulting ``.gitlab-ci.yml``
will contain 6 jobs in three stages. Once the jobs have been generated, the
presence of a ``SPACK_CDASH_AUTH_TOKEN`` environment variable during the
``spack ci generate`` command would result in all of the jobs being put in a
build group on CDash called "Release Testing" (that group will be created if
it didn't already exist).
In the case of ``match_behavior: first``, the first ``match`` section in the list of
``submappings`` that contains a string satisfying the spec will apply its
``build-job*`` attributes to the rebuild job associated with that spec. This is the
default behavior, used when no ``match_behavior`` is specified.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Optional compiler bootstrapping
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
In the case of ``match_behavior: merge``, all of the ``match`` sections in the list of
``submappings`` that contain a string satisfying the spec will have their associated
``build-job*`` attributes applied to the rebuild job for that spec. Again,
the attributes are merged starting from the bottom match going up to the top match.
Spack pipelines also have support for bootstrapping compilers on systems that
may not already have the desired compilers installed. The idea here is that
you can specify a list of things to bootstrap in your ``definitions``, and
spack will guarantee those will be installed in a phase of the pipeline before
your release specs, so that you can rely on those packages being available in
the binary mirror when you need them later on in the pipeline. At the moment
In the case that no match is found in a submapping section, no additional attributes will be applied.
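A rough sketch (a hypothetical function, not Spack's code) of how the two behaviors
resolve ``build-job`` attributes for a concrete spec:

.. code-block:: python

def submapping_attributes(spec, submappings, match_behavior="first"):
    attributes = {}
    # the submapping list is processed from the bottom up
    for section in reversed(submappings):
        if any(spec.satisfies(m) for m in section["match"]):
            attributes.update(section.get("build-job", {}))
            if match_behavior == "first":
                break  # stop at the first (bottom-most) matching section
    return attributes

Under ``merge``, iteration continues, so sections nearer the top of the list override
keys already set by lower ones.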
^^^^^^^^^^^^^
Bootstrapping
^^^^^^^^^^^^^
The ``bootstrap`` section allows you to specify lists of specs from
your ``definitions`` that should be staged ahead of the environment's ``specs``. At the moment
the only viable use-case for bootstrapping is to install compilers.
Here's an example of what bootstrapping some compilers might look like:
@@ -680,6 +675,86 @@ environment/stack file, and in that case no bootstrapping will be done (only the
specs will be staged for building) and the runners will be expected to already
have all needed compilers installed and configured for spack to use.
^^^^^^^^^^^^^^^^^^^
Pipeline Buildcache
^^^^^^^^^^^^^^^^^^^
The ``enable-artifacts-buildcache`` key
takes a boolean and determines whether the pipeline uses artifacts to store and
pass along the buildcaches from one stage to the next (the default if you don't
provide this option is ``False``).
^^^^^^^^^^^^^^^^
Broken Specs URL
^^^^^^^^^^^^^^^^
The optional ``broken-specs-url`` key tells Spack to check against a list of
specs that are known to be currently broken in ``develop``. If any such specs
are found, the ``spack ci generate`` command will fail with an error message
informing the user what broken specs were encountered. This allows the pipeline
to fail early and avoid wasting compute resources attempting to build packages
that will not succeed.
^^^^^
CDash
^^^^^
The optional ``cdash`` section provides information that will be used by the
``spack ci generate`` command (invoked by ``spack ci start``) for reporting
to CDash. All the jobs generated from this environment will belong to a
"build group" within CDash that can be tracked over time. As the release
progresses, this build group may have jobs added or removed. The url, project,
and site are used to specify the CDash instance to which build results should
be reported.
Take a look at the
`schema <https://github.com/spack/spack/blob/develop/lib/spack/spack/schema/ci.py>`_
for the ``ci`` section of the spack environment file, to see precisely what
syntax is allowed there.
.. _reserved_tags:
^^^^^^^^^^^^^
Reserved Tags
^^^^^^^^^^^^^
Spack has a subset of tags (``public``, ``protected``, and ``notary``) that it reserves
for classifying runners that may require special permissions or access. The tags
``public`` and ``protected`` are used to distinguish between runners that use public
permissions and runners with protected permissions. The ``notary`` tag is a special tag
that is used to indicate runners that have access to the highly protected information
used for signing binaries using the ``signing`` job.
.. _staging_algorithm:
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Summary of ``.gitlab-ci.yml`` generation algorithm
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
All specs yielded by the matrix (or all the specs in the environment) have their
dependencies computed, and the entire resulting set of specs are staged together
before being run through the ``ci/pipeline-gen`` entries, where each staged
spec is assigned a runner. "Staging" is the name given to the process of
figuring out in what order the specs should be built, taking into consideration
Gitlab CI rules about jobs/stages. In the staging process the goal is to maximize
the number of jobs in any stage of the pipeline, while ensuring that the jobs in
any stage only depend on jobs in previous stages (since those jobs are guaranteed
to have completed already). As a runner is determined for a job, the information
in the merged ``any-job*`` and ``build-job*`` sections is used to populate various parts of the job
description that will be used by the target CI pipelines. Once all the jobs have been assigned
a runner, the ``.gitlab-ci.yml`` is written to disk.
The short example provided above would result in the ``readline``, ``ncurses``,
and ``pkgconf`` packages getting staged and built on the runner chosen by the
``spack-k8s`` tag. In this example, spack assumes the runner is a Docker executor
type runner, and thus certain jobs will be run in the ``centos7`` container,
and others in the ``ubuntu-18.04`` container. The resulting ``.gitlab-ci.yml``
will contain 6 jobs in three stages. Once the jobs have been generated, the
presence of a ``SPACK_CDASH_AUTH_TOKEN`` environment variable during the
``spack ci generate`` command would result in all of the jobs being put in a
build group on CDash called "Release Testing" (that group will be created if
it didn't already exist).
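Staging is essentially a greedy topological layering of the dependency DAG. A toy
sketch (not Spack's code) of the idea:

.. code-block:: python

def stage(specs):
    # specs maps a package name to the set of names it depends on
    stages, done = [], set()
    while len(done) < len(specs):
        ready = {s for s, deps in specs.items() if s not in done and deps <= done}
        if not ready:
            raise ValueError("dependency cycle")
        stages.append(sorted(ready))  # maximize jobs per stage
        done |= ready
    return stages

print(stage({"pkgconf": set(), "ncurses": {"pkgconf"}, "readline": {"ncurses"}}))
# [['pkgconf'], ['ncurses'], ['readline']]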
-------------------------------------
Using a custom spack in your pipeline
-------------------------------------
@@ -726,23 +801,21 @@ generated by ``spack ci generate``. You also want your generated rebuild jobs
spack:
...
gitlab-ci:
mappings:
- match:
- os=ubuntu18.04
runner-attributes:
tags:
- spack-kube
image: spack/ubuntu-bionic
before_script:
- git clone ${SPACK_REPO}
- pushd spack && git checkout ${SPACK_REF} && popd
- . "./spack/share/spack/setup-env.sh"
script:
- spack env activate --without-view ${SPACK_CONCRETE_ENV_DIR}
- spack -d ci rebuild
after_script:
- rm -rf ./spack
ci:
pipeline-gen:
- build-job:
tags:
- spack-kube
image: spack/ubuntu-bionic
before_script:
- git clone ${SPACK_REPO}
- pushd spack && git checkout ${SPACK_REF} && popd
- . "./spack/share/spack/setup-env.sh"
script:
- spack env activate --without-view ${SPACK_CONCRETE_ENV_DIR}
- spack -d ci rebuild
after_script:
- rm -rf ./spack
Now all of the generated rebuild jobs will use the same shell script to clone
spack before running their actual workload.
@@ -831,3 +904,4 @@ verify binary packages (when installing or creating buildcaches). You could
also have already trusted a key spack knows about, or if no key is present anywhere,
spack will install specs using ``--no-check-signature`` and create buildcaches
using ``-u`` (for unsigned binaries).

lib/spack/env/cc vendored

@@ -427,6 +427,48 @@ isystem_include_dirs_list=""
libs_list=""
other_args_list=""
# Global state for keeping track of -Wl,-rpath -Wl,/path
wl_expect_rpath=no
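# parse_Wl (below) consumes the comma-separated pieces of a single -Wl,...
# argument: -rpath=DIR and --rpath=DIR yield DIR directly, while a bare
# -rpath/--rpath sets wl_expect_rpath so that the *next* piece is taken as
# the directory. rpaths are routed to system_rpath_dirs_list or
# rpath_dirs_list depending on system_dir; anything else is passed through
# to other_args_list with its -Wl, prefix restored.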
parse_Wl() {
# drop -Wl
shift
while [ $# -ne 0 ]; do
if [ "$wl_expect_rpath" = yes ]; then
rp="$1"
wl_expect_rpath=no
else
rp=""
case "$1" in
-rpath=*)
rp="${1#-rpath=}"
;;
--rpath=*)
rp="${1#--rpath=}"
;;
-rpath|--rpath)
wl_expect_rpath=yes
;;
"$dtags_to_strip")
;;
*)
append other_args_list "-Wl,$1"
;;
esac
fi
if [ -n "$rp" ]; then
if system_dir "$rp"; then
append system_rpath_dirs_list "$rp"
else
append rpath_dirs_list "$rp"
fi
fi
shift
done
# For lack of local variables in POSIX sh, always reset this to the empty string.
rp=""
}
while [ $# -ne 0 ]; do
@@ -526,54 +568,9 @@ while [ $# -ne 0 ]; do
append other_args_list "-l$arg"
;;
-Wl,*)
arg="${1#-Wl,}"
if [ -z "$arg" ]; then shift; arg="$1"; fi
case "$arg" in
-rpath=*) rp="${arg#-rpath=}" ;;
--rpath=*) rp="${arg#--rpath=}" ;;
-rpath,*) rp="${arg#-rpath,}" ;;
--rpath,*) rp="${arg#--rpath,}" ;;
-rpath|--rpath)
shift; arg="$1"
case "$arg" in
-Wl,*)
rp="${arg#-Wl,}"
;;
*)
die "-Wl,-rpath was not followed by -Wl,*"
;;
esac
;;
"$dtags_to_strip")
: # We want to remove explicitly this flag
;;
*)
append other_args_list "-Wl,$arg"
;;
esac
;;
-Xlinker,*)
arg="${1#-Xlinker,}"
if [ -z "$arg" ]; then shift; arg="$1"; fi
case "$arg" in
-rpath=*) rp="${arg#-rpath=}" ;;
--rpath=*) rp="${arg#--rpath=}" ;;
-rpath|--rpath)
shift; arg="$1"
case "$arg" in
-Xlinker,*)
rp="${arg#-Xlinker,}"
;;
*)
die "-Xlinker,-rpath was not followed by -Xlinker,*"
;;
esac
;;
*)
append other_args_list "-Xlinker,$arg"
;;
esac
IFS=,
parse_Wl $1
unset IFS
;;
-Xlinker)
if [ "$2" = "-rpath" ]; then


@@ -16,7 +16,6 @@
import sys
import tempfile
from contextlib import contextmanager
from sys import platform as _platform
from typing import Callable, List, Match, Optional, Tuple, Union
from llnl.util import tty
@@ -26,9 +25,7 @@
from spack.util.executable import Executable, which
from spack.util.path import path_to_os_path, system_path_filter
is_windows = _platform == "win32"
if not is_windows:
if sys.platform != "win32":
import grp
import pwd
else:
@@ -154,7 +151,7 @@ def lookup(name):
def getuid():
if is_windows:
if sys.platform == "win32":
import ctypes
if ctypes.windll.shell32.IsUserAnAdmin() == 0:
@@ -167,7 +164,7 @@ def getuid():
@system_path_filter
def rename(src, dst):
# On Windows, os.rename will fail if the destination file already exists
if is_windows:
if sys.platform == "win32":
# Windows path existence checks will sometimes fail on junctions/links/symlinks
# so check for that case
if os.path.exists(dst) or os.path.islink(dst):
@@ -196,7 +193,7 @@ def _get_mime_type():
"""Generate method to call `file` system command to aquire mime type
for a specified path
"""
if is_windows:
if sys.platform == "win32":
# -h option (no-dereference) does not exist in Windows
return file_command("-b", "--mime-type")
else:
@@ -551,7 +548,7 @@ def get_owner_uid(path, err_msg=None):
else:
p_stat = os.stat(path)
if _platform != "win32":
if sys.platform != "win32":
owner_uid = p_stat.st_uid
else:
sid = win32security.GetFileSecurity(
@@ -584,7 +581,7 @@ def group_ids(uid=None):
Returns:
(list of int): gids of groups the user is a member of
"""
if is_windows:
if sys.platform == "win32":
tty.warn("Function is not supported on Windows")
return []
@@ -604,7 +601,7 @@ def group_ids(uid=None):
@system_path_filter(arg_slice=slice(1))
def chgrp(path, group, follow_symlinks=True):
"""Implement the bash chgrp function on a single path"""
if is_windows:
if sys.platform == "win32":
raise OSError("Function 'chgrp' is not supported on Windows")
if isinstance(group, str):
@@ -1131,7 +1128,7 @@ def open_if_filename(str_or_file, mode="r"):
@system_path_filter
def touch(path):
"""Creates an empty file at the specified path."""
if is_windows:
if sys.platform == "win32":
perms = os.O_WRONLY | os.O_CREAT
else:
perms = os.O_WRONLY | os.O_CREAT | os.O_NONBLOCK | os.O_NOCTTY
@@ -1193,7 +1190,7 @@ def temp_cwd():
yield tmp_dir
finally:
kwargs = {}
if is_windows:
if sys.platform == "win32":
kwargs["ignore_errors"] = False
kwargs["onerror"] = readonly_file_handler(ignore_errors=True)
shutil.rmtree(tmp_dir, **kwargs)
@@ -1438,7 +1435,7 @@ def visit_directory_tree(root, visitor, rel_path="", depth=0):
try:
isdir = f.is_dir()
except OSError as e:
if is_windows and hasattr(e, "winerror") and e.winerror == 5 and islink:
if sys.platform == "win32" and hasattr(e, "winerror") and e.winerror == 5 and islink:
# if path is a symlink, determine destination and
# evaluate file vs directory
link_target = resolve_link_target_relative_to_the_link(f)
@@ -1547,11 +1544,11 @@ def readonly_file_handler(ignore_errors=False):
"""
def error_remove_readonly(func, path, exc):
if not is_windows:
if sys.platform != "win32":
raise RuntimeError("This method should only be invoked on Windows")
excvalue = exc[1]
if (
is_windows
sys.platform == "win32"
and func in (os.rmdir, os.remove, os.unlink)
and excvalue.errno == errno.EACCES
):
@@ -1581,7 +1578,7 @@ def remove_linked_tree(path):
# Windows readonly files cannot be removed by Python
# directly.
if is_windows:
if sys.platform == "win32":
kwargs["ignore_errors"] = False
kwargs["onerror"] = readonly_file_handler(ignore_errors=True)
@@ -2095,7 +2092,7 @@ def names(self):
# on non Windows platform
# Windows valid library extensions are:
# ['.dll', '.lib']
valid_exts = [".dll", ".lib"] if is_windows else [".dylib", ".so", ".a"]
valid_exts = [".dll", ".lib"] if sys.platform == "win32" else [".dylib", ".so", ".a"]
for ext in valid_exts:
i = name.rfind(ext)
if i != -1:
@@ -2243,7 +2240,7 @@ def find_libraries(libraries, root, shared=True, recursive=False, runtime=True):
message = message.format(find_libraries.__name__, type(libraries))
raise TypeError(message)
if is_windows:
if sys.platform == "win32":
static_ext = "lib"
# For linking (runtime=False) you need the .lib files regardless of
# whether you are doing a shared or static link
@@ -2275,7 +2272,7 @@ def find_libraries(libraries, root, shared=True, recursive=False, runtime=True):
# finally search all of root recursively. The search stops when the first
# match is found.
common_lib_dirs = ["lib", "lib64"]
if is_windows:
if sys.platform == "win32":
common_lib_dirs.extend(["bin", "Lib"])
for subdir in common_lib_dirs:
@@ -2410,7 +2407,7 @@ def _link(self, path, dest_dir):
# For py2 compatibility, we have to catch the specific Windows error code
# associate with trying to create a file that already exists (winerror 183)
except OSError as e:
if e.winerror == 183:
if sys.platform == "win32" and (e.winerror == 183 or e.errno == errno.EEXIST):
# We have either already symlinked or we are encountering a naming clash
# either way, we don't want to overwrite existing libraries
already_linked = islink(dest_file)


@@ -5,15 +5,13 @@
import errno
import os
import shutil
import sys
import tempfile
from os.path import exists, join
from sys import platform as _platform
from llnl.util import lang
is_windows = _platform == "win32"
if is_windows:
if sys.platform == "win32":
from win32file import CreateHardLink
@@ -23,7 +21,7 @@ def symlink(real_path, link_path):
On Windows, use junctions if os.symlink fails.
"""
if not is_windows:
if sys.platform != "win32":
os.symlink(real_path, link_path)
elif _win32_can_symlink():
# Windows requires target_is_directory=True when the target is a dir.
@@ -32,9 +30,15 @@ def symlink(real_path, link_path):
try:
# Try to use junctions
_win32_junction(real_path, link_path)
except OSError:
# If all else fails, fall back to copying files
shutil.copyfile(real_path, link_path)
except OSError as e:
if e.errno == errno.EEXIST:
# EEXIST error indicates that file we're trying to "link"
# is already present, don't bother trying to copy which will also fail
# just raise
raise
else:
# If all else fails, fall back to copying files
shutil.copyfile(real_path, link_path)
def islink(path):
@@ -99,7 +103,7 @@ def _win32_is_junction(path):
if os.path.islink(path):
return False
if is_windows:
if sys.platform == "win32":
import ctypes.wintypes
GetFileAttributes = ctypes.windll.kernel32.GetFileAttributesW


@@ -25,7 +25,7 @@ def architecture_compatible(self, target, constraint):
return (
not target.architecture
or not constraint.architecture
or target.architecture.satisfies(constraint.architecture)
or target.architecture.intersects(constraint.architecture)
)
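The distinction: under the newer Spack semantics, ``satisfies()`` asks whether the
left spec is entirely contained in the right one, while ``intersects()`` asks only
whether the two constraints can hold at the same time. A small illustrative example
(assuming Spack >= 0.20 semantics):

from spack.spec import Spec

# the ranges overlap at 1.3:1.4, so the constraints can coexist...
assert Spec("zlib@1.2:1.4").intersects(Spec("zlib@1.3:2.0"))
# ...but 1.2:1.4 is not fully contained in 1.3:2.0
assert not Spec("zlib@1.2:1.4").satisfies(Spec("zlib@1.3:2.0"))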
@memoized
@@ -104,7 +104,7 @@ def compiler_compatible(self, parent, child, **kwargs):
for cversion in child.compiler.versions:
# For a few compilers use specialized comparisons.
# Otherwise match on version match.
if pversion.satisfies(cversion):
if pversion.intersects(cversion):
return True
elif parent.compiler.name == "gcc" and self._gcc_compiler_compare(
pversion, cversion


@@ -695,8 +695,11 @@ def _ensure_variant_defaults_are_parsable(pkgs, error_cls):
try:
variant.validate_or_raise(vspec, pkg_cls=pkg_cls)
except spack.variant.InvalidVariantValueError:
error_msg = "The variant '{}' default value in package '{}' cannot be validated"
errors.append(error_cls(error_msg.format(variant_name, pkg_name), []))
error_msg = (
"The default value of the variant '{}' in package '{}' failed validation"
)
question = "Is it among the allowed values?"
errors.append(error_cls(error_msg.format(variant_name, pkg_name), [question]))
return errors
@@ -721,7 +724,7 @@ def _version_constraints_are_satisfiable_by_some_version_in_repo(pkgs, error_cls
dependency_pkg_cls = None
try:
dependency_pkg_cls = spack.repo.path.get_pkg_class(s.name)
assert any(v.satisfies(s.versions) for v in list(dependency_pkg_cls.versions))
assert any(v.intersects(s.versions) for v in list(dependency_pkg_cls.versions))
except Exception:
summary = (
"{0}: dependency on {1} cannot be satisfied " "by known versions of {1.name}"


@@ -6,6 +6,8 @@
import codecs
import collections
import hashlib
import io
import itertools
import json
import multiprocessing.pool
import os
@@ -20,7 +22,8 @@
import urllib.parse
import urllib.request
import warnings
from contextlib import closing
from contextlib import closing, contextmanager
from gzip import GzipFile
from urllib.error import HTTPError, URLError
import ruamel.yaml as yaml
@@ -39,6 +42,7 @@
import spack.platforms
import spack.relocate as relocate
import spack.repo
import spack.stage
import spack.store
import spack.traverse as traverse
import spack.util.crypto
@@ -739,34 +743,31 @@ def get_buildfile_manifest(spec):
return data
def write_buildinfo_file(spec, workdir, rel=False):
"""
Create a cache file containing information
required for the relocation
"""
def prefixes_to_hashes(spec):
return {
str(s.prefix): s.dag_hash()
for s in itertools.chain(
spec.traverse(root=True, deptype="link"), spec.dependencies(deptype="run")
)
}
def get_buildinfo_dict(spec, rel=False):
"""Create metadata for a tarball"""
manifest = get_buildfile_manifest(spec)
prefix_to_hash = dict()
prefix_to_hash[str(spec.package.prefix)] = spec.dag_hash()
deps = spack.build_environment.get_rpath_deps(spec.package)
for d in deps + spec.dependencies(deptype="run"):
prefix_to_hash[str(d.prefix)] = d.dag_hash()
# Create buildinfo data and write it to disk
buildinfo = {}
buildinfo["sbang_install_path"] = spack.hooks.sbang.sbang_install_path()
buildinfo["relative_rpaths"] = rel
buildinfo["buildpath"] = spack.store.layout.root
buildinfo["spackprefix"] = spack.paths.prefix
buildinfo["relative_prefix"] = os.path.relpath(spec.prefix, spack.store.layout.root)
buildinfo["relocate_textfiles"] = manifest["text_to_relocate"]
buildinfo["relocate_binaries"] = manifest["binary_to_relocate"]
buildinfo["relocate_links"] = manifest["link_to_relocate"]
buildinfo["hardlinks_deduped"] = manifest["hardlinks_deduped"]
buildinfo["prefix_to_hash"] = prefix_to_hash
filename = buildinfo_file_name(workdir)
with open(filename, "w") as outfile:
outfile.write(syaml.dump(buildinfo, default_flow_style=True))
return {
"sbang_install_path": spack.hooks.sbang.sbang_install_path(),
"relative_rpaths": rel,
"buildpath": spack.store.layout.root,
"spackprefix": spack.paths.prefix,
"relative_prefix": os.path.relpath(spec.prefix, spack.store.layout.root),
"relocate_textfiles": manifest["text_to_relocate"],
"relocate_binaries": manifest["binary_to_relocate"],
"relocate_links": manifest["link_to_relocate"],
"hardlinks_deduped": manifest["hardlinks_deduped"],
"prefix_to_hash": prefixes_to_hashes(spec),
}
def tarball_directory_name(spec):
@@ -1139,6 +1140,68 @@ def generate_key_index(key_prefix, tmpdir=None):
shutil.rmtree(tmpdir)
@contextmanager
def gzip_compressed_tarfile(path):
"""Create a reproducible, compressed tarfile"""
# Create gzip compressed tarball of the install prefix
# 1) Use explicit empty filename and mtime 0 for gzip header reproducibility.
# If the filename="" is dropped, Python will use fileobj.name instead.
# This should effectively mimic `gzip --no-name`.
# 2) On AMD Ryzen 3700X and an SSD disk, we have the following on compression speed:
# compresslevel=6 gzip default: llvm takes 4mins, roughly 2.1GB
# compresslevel=9 python default: llvm takes 12mins, roughly 2.1GB
# So we follow gzip.
with open(path, "wb") as fileobj, closing(
GzipFile(filename="", mode="wb", compresslevel=6, mtime=0, fileobj=fileobj)
) as gzip_file, tarfile.TarFile(name="", mode="w", fileobj=gzip_file) as tar:
yield tar
def deterministic_tarinfo(tarinfo: tarfile.TarInfo):
# We only add files, symlinks, hardlinks, and directories
# No character devices, block devices and FIFOs should ever enter a tarball.
if tarinfo.isdev():
return None
# For distribution, it makes no sense to keep user/group data, since (a) they don't exist
# on other machines, and (b) they lead to surprises as `tar x` run as root will change
# ownership if it can. We want to extract as the current user. By setting owner to root,
# root will extract as root, and non-privileged user will extract as themselves.
tarinfo.uid = 0
tarinfo.gid = 0
tarinfo.uname = ""
tarinfo.gname = ""
# Reset mtime to epoch time, our prefixes are not truly immutable, so files may get
# touched; as long as the content does not change, this ensures we get stable tarballs.
tarinfo.mtime = 0
# Normalize mode
if tarinfo.isfile() or tarinfo.islnk():
# If user can execute, use 0o755; else 0o644
# This is to avoid potentially unsafe world-writable & executable files that may get
# extracted when Python or tar is run with privileges
tarinfo.mode = 0o644 if tarinfo.mode & 0o100 == 0 else 0o755
else: # symbolic link and directories
tarinfo.mode = 0o755
return tarinfo
def tar_add_metadata(tar: tarfile.TarFile, path: str, data: dict):
# Serialize buildinfo for the tarball
bstring = syaml.dump(data, default_flow_style=True).encode("utf-8")
tarinfo = tarfile.TarInfo(name=path)
tarinfo.size = len(bstring)
tar.addfile(deterministic_tarinfo(tarinfo), io.BytesIO(bstring))
def _do_create_tarball(tarfile_path, binaries_dir, pkg_dir, buildinfo):
with gzip_compressed_tarfile(tarfile_path) as tar:
tar.add(name=binaries_dir, arcname=pkg_dir, filter=deterministic_tarinfo)
tar_add_metadata(tar, buildinfo_file_name(pkg_dir), buildinfo)
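# As a sanity check (hypothetical paths, not part of this diff): with gzip headers,
# mtimes, ownership, and modes normalized as above, building the same prefix twice
# should yield byte-identical archives.
import hashlib

def _sha256(path):
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

_do_create_tarball("first.tar.gz", "/opt/spack/zlib", "zlib-1.2.13", buildinfo={})
_do_create_tarball("second.tar.gz", "/opt/spack/zlib", "zlib-1.2.13", buildinfo={})
assert _sha256("first.tar.gz") == _sha256("second.tar.gz")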
def _build_tarball(
spec,
out_url,
@@ -1156,15 +1219,37 @@ def _build_tarball(
if not spec.concrete:
raise ValueError("spec must be concrete to build tarball")
# set up some paths
tmpdir = tempfile.mkdtemp()
cache_prefix = build_cache_prefix(tmpdir)
with tempfile.TemporaryDirectory(dir=spack.stage.get_stage_root()) as tmpdir:
_build_tarball_in_stage_dir(
spec,
out_url,
stage_dir=tmpdir,
force=force,
relative=relative,
unsigned=unsigned,
allow_root=allow_root,
key=key,
regenerate_index=regenerate_index,
)
def _build_tarball_in_stage_dir(
spec,
out_url,
stage_dir,
force=False,
relative=False,
unsigned=False,
allow_root=False,
key=None,
regenerate_index=False,
):
cache_prefix = build_cache_prefix(stage_dir)
tarfile_name = tarball_name(spec, ".spack")
tarfile_dir = os.path.join(cache_prefix, tarball_directory_name(spec))
tarfile_path = os.path.join(tarfile_dir, tarfile_name)
spackfile_path = os.path.join(cache_prefix, tarball_path_name(spec, ".spack"))
remote_spackfile_path = url_util.join(out_url, os.path.relpath(spackfile_path, tmpdir))
remote_spackfile_path = url_util.join(out_url, os.path.relpath(spackfile_path, stage_dir))
mkdirp(tarfile_dir)
if web_util.url_exists(remote_spackfile_path):
@@ -1183,7 +1268,7 @@ def _build_tarball(
signed_specfile_path = "{0}.sig".format(specfile_path)
remote_specfile_path = url_util.join(
out_url, os.path.relpath(specfile_path, os.path.realpath(tmpdir))
out_url, os.path.relpath(specfile_path, os.path.realpath(stage_dir))
)
remote_signed_specfile_path = "{0}.sig".format(remote_specfile_path)
@@ -1199,7 +1284,7 @@ def _build_tarball(
raise NoOverwriteException(url_util.format(remote_specfile_path))
pkg_dir = os.path.basename(spec.prefix.rstrip(os.path.sep))
workdir = os.path.join(tmpdir, pkg_dir)
workdir = os.path.join(stage_dir, pkg_dir)
# TODO: We generally don't want to mutate any files, but when using relative
# mode, Spack unfortunately *does* mutate rpaths and links ahead of time.
@@ -1217,39 +1302,22 @@ def _build_tarball(
os.remove(temp_tarfile_path)
else:
binaries_dir = spec.prefix
mkdirp(os.path.join(workdir, ".spack"))
# create info for later relocation and create tar
write_buildinfo_file(spec, workdir, relative)
buildinfo = get_buildinfo_dict(spec, relative)
# optionally make the paths in the binaries relative to each other
# in the spack install tree before creating tarball
try:
if relative:
make_package_relative(workdir, spec, allow_root)
elif not allow_root:
ensure_package_relocatable(workdir, binaries_dir)
except Exception as e:
shutil.rmtree(workdir)
shutil.rmtree(tarfile_dir)
shutil.rmtree(tmpdir)
tty.die(e)
if relative:
make_package_relative(workdir, spec, buildinfo, allow_root)
elif not allow_root:
ensure_package_relocatable(buildinfo, binaries_dir)
# create gzip compressed tarball of the install prefix
# On AMD Ryzen 3700X and an SSD disk, we have the following on compression speed:
# compresslevel=6 gzip default: llvm takes 4mins, roughly 2.1GB
# compresslevel=9 python default: llvm takes 12mins, roughly 2.1GB
# So we follow gzip.
with closing(tarfile.open(tarfile_path, "w:gz", compresslevel=6)) as tar:
tar.add(name=binaries_dir, arcname=pkg_dir)
if not relative:
# Add buildinfo file
buildinfo_path = buildinfo_file_name(workdir)
buildinfo_arcname = buildinfo_file_name(pkg_dir)
tar.add(name=buildinfo_path, arcname=buildinfo_arcname)
_do_create_tarball(tarfile_path, binaries_dir, pkg_dir, buildinfo)
# remove copy of install directory
shutil.rmtree(workdir)
if relative:
shutil.rmtree(workdir)
# get the sha256 checksum of the tarball
checksum = checksum_tarball(tarfile_path)
@@ -1275,7 +1343,11 @@ def _build_tarball(
spec_dict["buildinfo"] = buildinfo
with open(specfile_path, "w") as outfile:
outfile.write(sjson.dump(spec_dict))
# Note: when using gpg clear sign, we need to avoid long lines (19995 chars).
# If lines are longer, they are truncated without error. Thanks GPG!
# So, here we still add newlines, but no indent, to save on file size and
# line length.
json.dump(spec_dict, outfile, indent=0, separators=(",", ":"))
# sign the tarball and spec file with gpg
if not unsigned:
@@ -1292,18 +1364,15 @@ def _build_tarball(
tty.debug('Buildcache for "{0}" written to \n {1}'.format(spec, remote_spackfile_path))
try:
# push the key to the build cache's _pgp directory so it can be
# imported
if not unsigned:
push_keys(out_url, keys=[key], regenerate_index=regenerate_index, tmpdir=tmpdir)
# push the key to the build cache's _pgp directory so it can be
# imported
if not unsigned:
push_keys(out_url, keys=[key], regenerate_index=regenerate_index, tmpdir=stage_dir)
# create an index.json for the build_cache directory so specs can be
# found
if regenerate_index:
generate_package_index(url_util.join(out_url, os.path.relpath(cache_prefix, tmpdir)))
finally:
shutil.rmtree(tmpdir)
# create an index.json for the build_cache directory so specs can be
# found
if regenerate_index:
generate_package_index(url_util.join(out_url, os.path.relpath(cache_prefix, stage_dir)))
return None
@@ -1536,13 +1605,12 @@ def download_tarball(spec, unsigned=False, mirrors_for_spec=None):
return None
def make_package_relative(workdir, spec, allow_root):
def make_package_relative(workdir, spec, buildinfo, allow_root):
"""
Change paths in binaries to relative paths. Change absolute symlinks
to relative symlinks.
"""
prefix = spec.prefix
buildinfo = read_buildinfo_file(workdir)
old_layout_root = buildinfo["buildpath"]
orig_path_names = list()
cur_path_names = list()
@@ -1566,9 +1634,8 @@ def make_package_relative(workdir, spec, allow_root):
relocate.make_link_relative(cur_path_names, orig_path_names)
def ensure_package_relocatable(workdir, binaries_dir):
def ensure_package_relocatable(buildinfo, binaries_dir):
"""Check if package binaries are relocatable."""
buildinfo = read_buildinfo_file(workdir)
binaries = [os.path.join(binaries_dir, f) for f in buildinfo["relocate_binaries"]]
relocate.ensure_binaries_are_relocatable(binaries)

View File

@@ -208,7 +208,7 @@ def _install_and_test(self, abstract_spec, bincache_platform, bincache_data, tes
# This will be None for things that don't depend on python
python_spec = item.get("python", None)
# Skip specs which are not compatible
if not abstract_spec.satisfies(candidate_spec):
if not abstract_spec.intersects(candidate_spec):
continue
if python_spec is not None and python_spec not in abstract_spec:
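The ``satisfies`` to ``intersects`` rename recurs throughout this diff; a hedged sketch of the semantic split (assumes a Spack checkout on ``sys.path``; the constraints are made up):

    import spack.spec

    a = spack.spec.Spec("zlib@1.2:")  # 1.2 or newer
    b = spack.spec.Spec("zlib@:1.3")  # 1.3 or older

    print(a.intersects(b))  # True: some version (1.2..1.3) can satisfy both
    print(a.satisfies(b))   # False: a is not a subset of b (1.4 breaks it)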

View File

@@ -69,13 +69,13 @@
from spack.installer import InstallError
from spack.util.cpus import cpus_available
from spack.util.environment import (
SYSTEM_DIRS,
EnvironmentModifications,
env_flag,
filter_system_paths,
get_path,
inspect_path,
is_system_path,
system_dirs,
validate,
)
from spack.util.executable import Executable
@@ -397,7 +397,7 @@ def set_compiler_environment_variables(pkg, env):
env.set("SPACK_COMPILER_SPEC", str(spec.compiler))
env.set("SPACK_SYSTEM_DIRS", ":".join(system_dirs))
env.set("SPACK_SYSTEM_DIRS", ":".join(SYSTEM_DIRS))
compiler.setup_custom_environment(pkg, env)
@@ -485,7 +485,13 @@ def update_compiler_args_for_dep(dep):
query = pkg.spec[dep.name]
dep_link_dirs = list()
try:
# In some circumstances (particularly for externals) finding
# libraries for packages can be time-consuming, so indicate that

# we are performing this operation (and also report when it
# finishes).
tty.debug("Collecting libraries for {0}".format(dep.name))
dep_link_dirs.extend(query.libs.directories)
tty.debug("Libraries for {0} have been collected.".format(dep.name))
except NoLibrariesError:
tty.debug("No libraries found for {0}".format(dep.name))
@@ -772,7 +778,9 @@ def setup_package(pkg, dirty, context="build"):
set_compiler_environment_variables(pkg, env_mods)
set_wrapper_variables(pkg, env_mods)
tty.debug("setup_package: grabbing modifications from dependencies")
env_mods.extend(modifications_from_dependencies(pkg.spec, context, custom_mods_only=False))
tty.debug("setup_package: collected all modifications from dependencies")
# architecture specific setup
platform = spack.platforms.by_name(pkg.spec.architecture.platform)
@@ -780,6 +788,7 @@ def setup_package(pkg, dirty, context="build"):
platform.setup_platform_environment(pkg, env_mods)
if context == "build":
tty.debug("setup_package: setup build environment for root")
builder = spack.builder.create(pkg)
builder.setup_build_environment(env_mods)
@@ -790,6 +799,7 @@ def setup_package(pkg, dirty, context="build"):
" includes and omit it when invoked with '--cflags'."
)
elif context == "test":
tty.debug("setup_package: setup test environment for root")
env_mods.extend(
inspect_path(
pkg.spec.prefix,
@@ -806,6 +816,7 @@ def setup_package(pkg, dirty, context="build"):
# Load modules on an already clean environment, just before applying Spack's
# own environment modifications. This ensures Spack controls CC/CXX/... variables.
if need_compiler:
tty.debug("setup_package: loading compiler modules")
for mod in pkg.compiler.modules:
load_module(mod)
@@ -943,6 +954,7 @@ def default_modifications_for_dep(dep):
_make_runnable(dep, env)
def add_modifications_for_dep(dep):
tty.debug("Adding env modifications for {0}".format(dep.name))
# Some callers of this function only want the custom modifications.
# For callers that want both custom and default modifications, we want
# to perform the default modifications here (this groups custom
@@ -968,6 +980,7 @@ def add_modifications_for_dep(dep):
builder.setup_dependent_build_environment(env, spec)
else:
dpkg.setup_dependent_run_environment(env, spec)
tty.debug("Added env modifications for {0}".format(dep.name))
# Note that we want to perform environment modifications in a fixed order.
# The Spec.traverse method provides this: i.e. in addition to

View File

@@ -8,7 +8,7 @@
import platform
import re
import sys
from typing import List, Tuple
from typing import List, Optional, Tuple
import llnl.util.filesystem as fs
@@ -16,7 +16,7 @@
import spack.builder
import spack.package_base
import spack.util.path
from spack.directives import build_system, depends_on, variant
from spack.directives import build_system, conflicts, depends_on, variant
from spack.multimethod import when
from ._checks import BaseBuilder, execute_build_time_tests
@@ -35,6 +35,43 @@ def _extract_primary_generator(generator):
return primary_generator
def generator(*names: str, default: Optional[str] = None):
"""The build system generator to use.
See ``cmake --help`` for a list of valid generators.
Currently, "Unix Makefiles" and "Ninja" are the only generators
that Spack supports; if no explicit ``default`` is given, the first
name passed to the directive is used.
See https://cmake.org/cmake/help/latest/manual/cmake-generators.7.html
for more information.
Args:
names: allowed generators for this package
default: default generator
"""
allowed_values = ("make", "ninja")
if any(x not in allowed_values for x in names):
msg = "only 'make' and 'ninja' are allowed for CMake's 'generator' directive"
raise ValueError(msg)
default = default or names[0]
not_used = [x for x in allowed_values if x not in names]
def _values(x):
return x in allowed_values
_values.__doc__ = f"{','.join(names)}"
variant(
"generator",
default=default,
values=_values,
description="the build system generator to use",
)
for x in not_used:
conflicts(f"generator={x}")
class CMakePackage(spack.package_base.PackageBase):
"""Specialized class for packages built using CMake
@@ -67,8 +104,15 @@ class CMakePackage(spack.package_base.PackageBase):
when="^cmake@3.9:",
description="CMake interprocedural optimization",
)
if sys.platform == "win32":
generator("ninja")
else:
generator("ninja", "make", default="make")
depends_on("cmake", type="build")
depends_on("ninja", type="build", when="platform=windows")
depends_on("gmake", type="build", when="generator=make")
depends_on("ninja", type="build", when="generator=ninja")
def flags_to_build_system_args(self, flags):
"""Return a list of all command line arguments to pass the specified
@@ -138,18 +182,6 @@ class CMakeBuilder(BaseBuilder):
| :py:meth:`~.CMakeBuilder.build_directory` | Directory where to |
| | build the package |
+-----------------------------------------------+--------------------+
The generator used by CMake can be specified by providing the ``generator``
attribute. Per
https://cmake.org/cmake/help/git-master/manual/cmake-generators.7.html,
the format is: [<secondary-generator> - ]<primary_generator>.
The full list of primary and secondary generators supported by CMake may be found
in the documentation for the version of CMake used; however, at this time Spack
supports only the primary generators "Unix Makefiles" and "Ninja." Spack's CMake
support is agnostic with respect to primary generators. Spack will generate a
runtime error if the generator string does not follow the prescribed format, or if
the primary generator is not supported.
"""
#: Phases of a CMake package
@@ -160,7 +192,6 @@ class CMakeBuilder(BaseBuilder):
#: Names associated with package attributes in the old build-system format
legacy_attributes: Tuple[str, ...] = (
"generator",
"build_targets",
"install_targets",
"build_time_test_callbacks",
@@ -171,16 +202,6 @@ class CMakeBuilder(BaseBuilder):
"build_directory",
)
#: The build system generator to use.
#:
#: See ``cmake --help`` for a list of valid generators.
#: Currently, "Unix Makefiles" and "Ninja" are the only generators
#: that Spack supports. Defaults to "Unix Makefiles".
#:
#: See https://cmake.org/cmake/help/latest/manual/cmake-generators.7.html
#: for more information.
generator = "Ninja" if sys.platform == "win32" else "Unix Makefiles"
#: Targets to be used during the build phase
build_targets: List[str] = []
#: Targets to be used during the install phase
@@ -202,12 +223,20 @@ def root_cmakelists_dir(self):
"""
return self.pkg.stage.source_path
@property
def generator(self):
if self.spec.satisfies("generator=make"):
return "Unix Makefiles"
if self.spec.satisfies("generator=ninja"):
return "Ninja"
msg = f'{self.spec.format()} has an unsupported value for the "generator" variant'
raise ValueError(msg)
@property
def std_cmake_args(self):
"""Standard cmake arguments provided as a property for
convenience of package writers
"""
# standard CMake arguments
std_cmake_args = CMakeBuilder.std_args(self.pkg, generator=self.generator)
std_cmake_args += getattr(self.pkg, "cmake_flag_args", [])
return std_cmake_args

View File

@@ -38,6 +38,7 @@
import spack.util.spack_yaml as syaml
import spack.util.url as url_util
import spack.util.web as web_util
from spack import traverse
from spack.error import SpackError
from spack.reporters import CDash, CDashConfiguration
from spack.reporters.cdash import build_stamp as cdash_build_stamp
@@ -361,60 +362,7 @@ def append_dep(s, d):
def _spec_matches(spec, match_string):
return spec.satisfies(match_string)
def _remove_attributes(src_dict, dest_dict):
if "tags" in src_dict and "tags" in dest_dict:
# For 'tags', we remove any tags that are listed for removal
for tag in src_dict["tags"]:
while tag in dest_dict["tags"]:
dest_dict["tags"].remove(tag)
def _copy_attributes(attrs_list, src_dict, dest_dict):
for runner_attr in attrs_list:
if runner_attr in src_dict:
if runner_attr in dest_dict and runner_attr == "tags":
# For 'tags', we combine the lists of tags, while
# avoiding duplicates
for tag in src_dict[runner_attr]:
if tag not in dest_dict[runner_attr]:
dest_dict[runner_attr].append(tag)
elif runner_attr in dest_dict and runner_attr == "variables":
# For 'variables', we merge the dictionaries. Any conflicts
# (i.e. 'runner-attributes' has same variable key as the
# higher level) we resolve by keeping the more specific
# 'runner-attributes' version.
for src_key, src_val in src_dict[runner_attr].items():
dest_dict[runner_attr][src_key] = copy.deepcopy(src_dict[runner_attr][src_key])
else:
dest_dict[runner_attr] = copy.deepcopy(src_dict[runner_attr])
def _find_matching_config(spec, gitlab_ci):
runner_attributes = {}
overridable_attrs = ["image", "tags", "variables", "before_script", "script", "after_script"]
_copy_attributes(overridable_attrs, gitlab_ci, runner_attributes)
matched = False
only_first = gitlab_ci.get("match_behavior", "first") == "first"
for ci_mapping in gitlab_ci["mappings"]:
for match_string in ci_mapping["match"]:
if _spec_matches(spec, match_string):
matched = True
if "remove-attributes" in ci_mapping:
_remove_attributes(ci_mapping["remove-attributes"], runner_attributes)
if "runner-attributes" in ci_mapping:
_copy_attributes(
overridable_attrs, ci_mapping["runner-attributes"], runner_attributes
)
break
if matched and only_first:
break
return runner_attributes if matched else None
return spec.intersects(match_string)
def _format_job_needs(
@@ -490,16 +438,28 @@ def compute_affected_packages(rev1="HEAD^", rev2="HEAD"):
return spack.repo.get_all_package_diffs("ARC", rev1=rev1, rev2=rev2)
def get_spec_filter_list(env, affected_pkgs):
def get_spec_filter_list(env, affected_pkgs, dependent_traverse_depth=None):
"""Given a list of package names and an active/concretized
environment, return the set of all concrete specs from the
environment that could have been affected by changing the
list of packages.
If a ``dependent_traverse_depth`` is given, it is used to limit
upward (in the parent direction) traversal of specs of touched
packages. E.g. if 1 is provided, then only direct dependents
of touched package specs are traversed to produce specs that
could have been affected by changing the package, while if 0 is
provided, only the changed specs themselves are traversed. If ``None``
is given, upward traversal of touched package specs is done all
the way to the environment roots. Providing a negative number
results in no traversals at all, yielding an empty set.
Arguments:
env (spack.environment.Environment): Active concrete environment
affected_pkgs (List[str]): Affected package names
dependent_traverse_depth: Optional integer to limit dependent
traversal, or None to disable the limit.
Returns:
@@ -512,17 +472,237 @@ def get_spec_filter_list(env, affected_pkgs):
tty.debug("All concrete environment specs:")
for s in all_concrete_specs:
tty.debug(" {0}/{1}".format(s.name, s.dag_hash()[:7]))
env_matches = [s for s in all_concrete_specs if s.name in frozenset(affected_pkgs)]
affected_pkgs = frozenset(affected_pkgs)
env_matches = [s for s in all_concrete_specs if s.name in affected_pkgs]
visited = set()
dag_hash = lambda s: s.dag_hash()
for match in env_matches:
for parent in match.traverse(direction="parents", key=dag_hash):
affected_specs.update(
parent.traverse(direction="children", visited=visited, key=dag_hash)
)
for depth, parent in traverse.traverse_nodes(
env_matches, direction="parents", key=dag_hash, depth=True, order="breadth"
):
if dependent_traverse_depth is not None and depth > dependent_traverse_depth:
break
affected_specs.update(parent.traverse(direction="children", visited=visited, key=dag_hash))
return affected_specs
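A self-contained sketch of this depth-limited, breadth-first dependent walk, using plain dicts in place of ``Spec`` objects (package names made up); note how breadth-first ordering lets the loop stop at the first node past the limit:

    from collections import deque

    # dependent (parent) edges of a toy DAG
    parents = {"zlib": ["cmake"], "cmake": ["root"], "root": []}

    def dependents_within(start, depth_limit=None):
        seen = {start}
        frontier = deque([(0, start)])
        result = []
        while frontier:
            depth, node = frontier.popleft()
            if depth_limit is not None and depth > depth_limit:
                break  # everything left in the queue is at least this deep
            result.append(node)
            for parent in parents[node]:
                if parent not in seen:
                    seen.add(parent)
                    frontier.append((depth + 1, parent))
        return result

    print(dependents_within("zlib", depth_limit=1))   # ['zlib', 'cmake']
    print(dependents_within("zlib"))                  # ['zlib', 'cmake', 'root']
    print(dependents_within("zlib", depth_limit=-1))  # [] (negative -> empty)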
def _build_jobs(phases, staged_phases):
for phase in phases:
phase_name = phase["name"]
spec_labels, dependencies, stages = staged_phases[phase_name]
for stage_jobs in stages:
for spec_label in stage_jobs:
spec_record = spec_labels[spec_label]
release_spec = spec_record["spec"]
release_spec_dag_hash = release_spec.dag_hash()
yield release_spec, release_spec_dag_hash
def _noop(x):
return x
def _unpack_script(script_section, op=_noop):
script = []
for cmd in script_section:
if isinstance(cmd, list):
for subcmd in cmd:
script.append(op(subcmd))
else:
script.append(op(cmd))
return script
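For example, with the helper defined above (the commands are made up):

    # nested lists are flattened, and op rewrites each command
    script = _unpack_script(
        ["cd {env_dir}", ["spack env activate --without-view .", "spack ci rebuild"]],
        op=lambda cmd: cmd.replace("{env_dir}", "/builds/env"),
    )
    # -> ['cd /builds/env', 'spack env activate --without-view .', 'spack ci rebuild']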
class SpackCI:
"""Spack CI object used to generate intermediate representation
used by the CI generator(s).
"""
def __init__(self, ci_config, phases, staged_phases):
"""Given the information from the ci section of the config
and the job phases setup meta data needed for generating Spack
CI IR.
"""
self.ci_config = ci_config
self.named_jobs = ["any", "build", "cleanup", "noop", "reindex", "signing"]
self.ir = {
"jobs": {},
"temporary-storage-url-prefix": self.ci_config.get(
"temporary-storage-url-prefix", None
),
"enable-artifacts-buildcache": self.ci_config.get(
"enable-artifacts-buildcache", False
),
"bootstrap": self.ci_config.get(
"bootstrap", []
), # This is deprecated and should be removed
"rebuild-index": self.ci_config.get("rebuild-index", True),
"broken-specs-url": self.ci_config.get("broken-specs-url", None),
"broken-tests-packages": self.ci_config.get("broken-tests-packages", []),
"target": self.ci_config.get("target", "gitlab"),
}
jobs = self.ir["jobs"]
for spec, dag_hash in _build_jobs(phases, staged_phases):
jobs[dag_hash] = self.__init_job(spec)
for name in self.named_jobs:
# Skip the special named jobs
if name not in ["any", "build"]:
jobs[name] = self.__init_job("")
def __init_job(self, spec):
"""Initialize job object"""
return {"spec": spec, "attributes": {}}
def __is_named(self, section):
"""Check if a pipeline-gen configuration section is for a named job,
and if so return the name, otherwise return None.
"""
for _name in self.named_jobs:
keys = ["{0}-job".format(_name), "{0}-job-remove".format(_name)]
if any(key in section for key in keys):
return _name
return None
@staticmethod
def __job_name(name, suffix=""):
"""Compute the name of a named job with appropriate suffix.
Valid suffixes are '-remove', the empty string, or None.
"""
assert type(name) == str
jname = name
if suffix:
jname = "{0}-job{1}".format(name, suffix)
else:
jname = "{0}-job".format(name)
return jname
def __apply_submapping(self, dest, spec, section):
"""Apply submapping setion to the IR dict"""
matched = False
only_first = section.get("match_behavior", "first") == "first"
for match_attrs in reversed(section["submapping"]):
attrs = cfg.InternalConfigScope._process_dict_keyname_overrides(match_attrs)
for match_string in match_attrs["match"]:
if _spec_matches(spec, match_string):
matched = True
if "build-job-remove" in match_attrs:
spack.config.remove_yaml(dest, attrs["build-job-remove"])
if "build-job" in match_attrs:
spack.config.merge_yaml(dest, attrs["build-job"])
break
if matched and only_first:
break
return dest
# Generate IR from the configs
def generate_ir(self):
"""Generate the IR from the Spack CI configurations."""
jobs = self.ir["jobs"]
# Implicit job defaults
defaults = [
{
"build-job": {
"script": [
"cd {env_dir}",
"spack env activate --without-view .",
"spack ci rebuild",
]
}
},
{"noop-job": {"script": ['echo "All specs already up to date, nothing to rebuild."']}},
]
# Job overrides
overrides = [
# Reindex script
{
"reindex-job": {
"script:": [
"spack buildcache update-index --keys --mirror-url {index_target_mirror}"
]
}
},
# Cleanup script
{
"cleanup-job": {
"script:": [
"spack -d mirror destroy --mirror-url {mirror_prefix}/$CI_PIPELINE_ID"
]
}
},
# Add signing job tags
{"signing-job": {"tags": ["aws", "protected", "notary"]}},
# Remove reserved tags
{"any-job-remove": {"tags": SPACK_RESERVED_TAGS}},
]
pipeline_gen = overrides + self.ci_config.get("pipeline-gen", []) + defaults
for section in reversed(pipeline_gen):
name = self.__is_named(section)
has_submapping = "submapping" in section
section = cfg.InternalConfigScope._process_dict_keyname_overrides(section)
if name:
remove_job_name = self.__job_name(name, suffix="-remove")
merge_job_name = self.__job_name(name)
do_remove = remove_job_name in section
do_merge = merge_job_name in section
def _apply_section(dest, src):
if do_remove:
dest = spack.config.remove_yaml(dest, src[remove_job_name])
if do_merge:
dest = copy.copy(spack.config.merge_yaml(dest, src[merge_job_name]))
if name == "build":
# Apply attributes to all build jobs
for _, job in jobs.items():
if job["spec"]:
_apply_section(job["attributes"], section)
elif name == "any":
# Apply section attributes to all jobs
for _, job in jobs.items():
_apply_section(job["attributes"], section)
else:
# Create a signing job if there is script and the job hasn't
# been initialized yet
if name == "signing" and name not in jobs:
if "signing-job" in section:
if "script" not in section["signing-job"]:
continue
else:
jobs[name] = self.__init_job("")
# Apply attributes to named job
_apply_section(jobs[name]["attributes"], section)
elif has_submapping:
# Apply section jobs with specs to match
for _, job in jobs.items():
if job["spec"]:
job["attributes"] = self.__apply_submapping(
job["attributes"], job["spec"], section
)
for _, job in jobs.items():
if job["spec"]:
job["spec"] = job["spec"].name
return self.ir
def generate_gitlab_ci_yaml(
env,
print_summary,
@@ -572,14 +752,32 @@ def generate_gitlab_ci_yaml(
yaml_root = ev.config_dict(env.yaml)
if "gitlab-ci" not in yaml_root:
tty.die('Environment yaml does not have "gitlab-ci" section')
# Get the joined "ci" config with all of the current scopes resolved
ci_config = cfg.get("ci")
gitlab_ci = yaml_root["gitlab-ci"]
if not ci_config:
tty.die('Environment yaml does not have "ci" section')
cdash_handler = CDashHandler(yaml_root.get("cdash")) if "cdash" in yaml_root else None
# Default target is gitlab... and the only target is gitlab
if "target" in ci_config and ci_config["target"] != "gitlab":
tty.die('Spack CI module only generates target "gitlab"')
cdash_config = cfg.get("cdash")
cdash_handler = CDashHandler(cdash_config) if "build-group" in cdash_config else None
build_group = cdash_handler.build_group if cdash_handler else None
dependent_depth = os.environ.get("SPACK_PRUNE_UNTOUCHED_DEPENDENT_DEPTH", None)
if dependent_depth is not None:
try:
dependent_depth = int(dependent_depth)
except (TypeError, ValueError):
tty.warn(
f"Unrecognized value ({dependent_depth}) "
"provided for SPACK_PRUNE_UNTOUCHED_DEPENDENT_DEPTH, "
"ignoring it."
)
dependent_depth = None
prune_untouched_packages = False
spack_prune_untouched = os.environ.get("SPACK_PRUNE_UNTOUCHED", None)
if spack_prune_untouched is not None and spack_prune_untouched.lower() == "true":
@@ -595,7 +793,9 @@ def generate_gitlab_ci_yaml(
tty.debug("affected pkgs:")
for p in affected_pkgs:
tty.debug(" {0}".format(p))
affected_specs = get_spec_filter_list(env, affected_pkgs)
affected_specs = get_spec_filter_list(
env, affected_pkgs, dependent_traverse_depth=dependent_depth
)
tty.debug("all affected specs:")
for s in affected_specs:
tty.debug(" {0}/{1}".format(s.name, s.dag_hash()[:7]))
@@ -637,25 +837,25 @@ def generate_gitlab_ci_yaml(
# trying to build.
broken_specs_url = ""
known_broken_specs_encountered = []
if "broken-specs-url" in gitlab_ci:
broken_specs_url = gitlab_ci["broken-specs-url"]
if "broken-specs-url" in ci_config:
broken_specs_url = ci_config["broken-specs-url"]
enable_artifacts_buildcache = False
if "enable-artifacts-buildcache" in gitlab_ci:
enable_artifacts_buildcache = gitlab_ci["enable-artifacts-buildcache"]
if "enable-artifacts-buildcache" in ci_config:
enable_artifacts_buildcache = ci_config["enable-artifacts-buildcache"]
rebuild_index_enabled = True
if "rebuild-index" in gitlab_ci and gitlab_ci["rebuild-index"] is False:
if "rebuild-index" in ci_config and ci_config["rebuild-index"] is False:
rebuild_index_enabled = False
temp_storage_url_prefix = None
if "temporary-storage-url-prefix" in gitlab_ci:
temp_storage_url_prefix = gitlab_ci["temporary-storage-url-prefix"]
if "temporary-storage-url-prefix" in ci_config:
temp_storage_url_prefix = ci_config["temporary-storage-url-prefix"]
bootstrap_specs = []
phases = []
if "bootstrap" in gitlab_ci:
for phase in gitlab_ci["bootstrap"]:
if "bootstrap" in ci_config:
for phase in ci_config["bootstrap"]:
try:
phase_name = phase.get("name")
strip_compilers = phase.get("compiler-agnostic")
@@ -720,6 +920,27 @@ def generate_gitlab_ci_yaml(
shutil.copyfile(env.manifest_path, os.path.join(concrete_env_dir, "spack.yaml"))
shutil.copyfile(env.lock_path, os.path.join(concrete_env_dir, "spack.lock"))
with open(env.manifest_path, "r") as env_fd:
env_yaml_root = syaml.load(env_fd)
# Add config scopes to environment
env_includes = env_yaml_root["spack"].get("include", [])
cli_scopes = [
os.path.abspath(s.path)
for s in cfg.scopes().values()
if type(s) == cfg.ImmutableConfigScope
and s.path not in env_includes
and os.path.exists(s.path)
]
include_scopes = []
for scope in cli_scopes:
if scope not in include_scopes and scope not in env_includes:
include_scopes.insert(0, scope)
env_includes.extend(include_scopes)
env_yaml_root["spack"]["include"] = env_includes
with open(os.path.join(concrete_env_dir, "spack.yaml"), "w") as fd:
fd.write(syaml.dump_config(env_yaml_root, default_flow_style=False))
job_log_dir = os.path.join(pipeline_artifacts_dir, "logs")
job_repro_dir = os.path.join(pipeline_artifacts_dir, "reproduction")
job_test_dir = os.path.join(pipeline_artifacts_dir, "tests")
@@ -731,7 +952,7 @@ def generate_gitlab_ci_yaml(
# generation job and the rebuild jobs. This can happen when gitlab
# checks out the project into a runner-specific directory, for example,
# and different runners are picked for generate and rebuild jobs.
ci_project_dir = os.environ.get("CI_PROJECT_DIR")
ci_project_dir = os.environ.get("CI_PROJECT_DIR", os.getcwd())
rel_artifacts_root = os.path.relpath(pipeline_artifacts_dir, ci_project_dir)
rel_concrete_env_dir = os.path.relpath(concrete_env_dir, ci_project_dir)
rel_job_log_dir = os.path.relpath(job_log_dir, ci_project_dir)
@@ -745,7 +966,7 @@ def generate_gitlab_ci_yaml(
try:
bindist.binary_index.update()
except bindist.FetchCacheError as e:
tty.error(e)
tty.warn(e)
staged_phases = {}
try:
@@ -802,6 +1023,9 @@ def generate_gitlab_ci_yaml(
else:
broken_spec_urls = web_util.list_url(broken_specs_url)
spack_ci = SpackCI(ci_config, phases, staged_phases)
spack_ci_ir = spack_ci.generate_ir()
before_script, after_script = None, None
for phase in phases:
phase_name = phase["name"]
@@ -829,7 +1053,7 @@ def generate_gitlab_ci_yaml(
spec_record["needs_rebuild"] = False
continue
runner_attribs = _find_matching_config(release_spec, gitlab_ci)
runner_attribs = spack_ci_ir["jobs"][release_spec_dag_hash]["attributes"]
if not runner_attribs:
tty.warn("No match found for {0}, skipping it".format(release_spec))
@@ -860,23 +1084,21 @@ def generate_gitlab_ci_yaml(
except AttributeError:
image_name = build_image
job_script = ["spack env activate --without-view ."]
if "script" not in runner_attribs:
raise AttributeError
if artifacts_root:
job_script.insert(0, "cd {0}".format(concrete_env_dir))
def main_script_replacements(cmd):
return cmd.replace("{env_dir}", concrete_env_dir)
job_script.extend(["spack ci rebuild"])
if "script" in runner_attribs:
job_script = [s for s in runner_attribs["script"]]
job_script = _unpack_script(runner_attribs["script"], op=main_script_replacements)
before_script = None
if "before_script" in runner_attribs:
before_script = [s for s in runner_attribs["before_script"]]
before_script = _unpack_script(runner_attribs["before_script"])
after_script = None
if "after_script" in runner_attribs:
after_script = [s for s in runner_attribs["after_script"]]
after_script = _unpack_script(runner_attribs["after_script"])
osname = str(release_spec.architecture)
job_name = get_job_name(
@@ -938,7 +1160,7 @@ def generate_gitlab_ci_yaml(
bs_arch = c_spec.architecture
bs_arch_family = bs_arch.target.microarchitecture.family
if (
c_spec.satisfies(compiler_pkg_spec)
c_spec.intersects(compiler_pkg_spec)
and bs_arch_family == spec_arch_family
):
# We found the bootstrap compiler this release spec
@@ -1120,19 +1342,6 @@ def generate_gitlab_ci_yaml(
else:
tty.warn("Unable to populate buildgroup without CDash credentials")
service_job_config = None
if "service-job-attributes" in gitlab_ci:
service_job_config = gitlab_ci["service-job-attributes"]
default_attrs = [
"image",
"tags",
"variables",
"before_script",
# 'script',
"after_script",
]
service_job_retries = {
"max": 2,
"when": ["runner_system_failure", "stuck_or_timeout_failure", "script_failure"],
@@ -1144,55 +1353,29 @@ def generate_gitlab_ci_yaml(
# schedule a job to clean up the temporary storage location
# associated with this pipeline.
stage_names.append("cleanup-temp-storage")
cleanup_job = {}
if service_job_config:
_copy_attributes(default_attrs, service_job_config, cleanup_job)
if "tags" in cleanup_job:
service_tags = _remove_reserved_tags(cleanup_job["tags"])
cleanup_job["tags"] = service_tags
cleanup_job = copy.deepcopy(spack_ci_ir["jobs"]["cleanup"]["attributes"])
cleanup_job["stage"] = "cleanup-temp-storage"
cleanup_job["script"] = [
"spack -d mirror destroy --mirror-url {0}/$CI_PIPELINE_ID".format(
temp_storage_url_prefix
)
]
cleanup_job["when"] = "always"
cleanup_job["retry"] = service_job_retries
cleanup_job["interruptible"] = True
cleanup_job["script"] = _unpack_script(
cleanup_job["script"],
op=lambda cmd: cmd.replace("mirror_prefix", temp_storage_url_prefix),
)
output_object["cleanup"] = cleanup_job
if (
"signing-job-attributes" in gitlab_ci
"script" in spack_ci_ir["jobs"]["signing"]["attributes"]
and spack_pipeline_type == "spack_protected_branch"
):
# External signing: generate a job to check and sign binary pkgs
stage_names.append("stage-sign-pkgs")
signing_job_config = gitlab_ci["signing-job-attributes"]
signing_job = {}
signing_job = spack_ci_ir["jobs"]["signing"]["attributes"]
signing_job_attrs_to_copy = [
"image",
"tags",
"variables",
"before_script",
"script",
"after_script",
]
_copy_attributes(signing_job_attrs_to_copy, signing_job_config, signing_job)
signing_job_tags = []
if "tags" in signing_job:
signing_job_tags = _remove_reserved_tags(signing_job["tags"])
for tag in ["aws", "protected", "notary"]:
if tag not in signing_job_tags:
signing_job_tags.append(tag)
signing_job["tags"] = signing_job_tags
signing_job["script"] = _unpack_script(signing_job["script"])
signing_job["stage"] = "stage-sign-pkgs"
signing_job["when"] = "always"
@@ -1204,23 +1387,17 @@ def generate_gitlab_ci_yaml(
if rebuild_index_enabled:
# Add a final job to regenerate the index
stage_names.append("stage-rebuild-index")
final_job = {}
if service_job_config:
_copy_attributes(default_attrs, service_job_config, final_job)
if "tags" in final_job:
service_tags = _remove_reserved_tags(final_job["tags"])
final_job["tags"] = service_tags
final_job = spack_ci_ir["jobs"]["reindex"]["attributes"]
index_target_mirror = mirror_urls[0]
if remote_mirror_override:
index_target_mirror = remote_mirror_override
final_job["stage"] = "stage-rebuild-index"
final_job["script"] = [
"spack buildcache update-index --keys --mirror-url {0}".format(index_target_mirror)
]
final_job["script"] = _unpack_script(
final_job["script"],
op=lambda cmd: cmd.replace("{index_target_mirror}", index_target_mirror),
)
final_job["when"] = "always"
final_job["retry"] = service_job_retries
final_job["interruptible"] = True
@@ -1301,13 +1478,7 @@ def generate_gitlab_ci_yaml(
else:
# No jobs were generated
tty.debug("No specs to rebuild, generating no-op job")
noop_job = {}
if service_job_config:
_copy_attributes(default_attrs, service_job_config, noop_job)
if "script" not in noop_job:
noop_job["script"] = ['echo "All specs already up to date, nothing to rebuild."']
noop_job = spack_ci_ir["jobs"]["noop"]["attributes"]
noop_job["retry"] = service_job_retries
@@ -1321,7 +1492,7 @@ def generate_gitlab_ci_yaml(
sys.exit(1)
with open(output_file, "w") as outf:
outf.write(syaml.dump_config(sorted_output, default_flow_style=True))
outf.write(syaml.dump(sorted_output, default_flow_style=True))
def _url_encode_string(input_string):
@@ -1501,7 +1672,10 @@ def copy_files_to_artifacts(src, artifacts_dir):
try:
fs.copy(src, artifacts_dir)
except Exception as err:
tty.warn(f"Unable to copy files ({src}) to artifacts {artifacts_dir} due to: {err}")
msg = ("Unable to copy files ({0}) to artifacts {1} due to " "exception: {2}").format(
src, artifacts_dir, str(err)
)
tty.warn(msg)
def copy_stage_logs_to_artifacts(job_spec, job_log_dir):
@@ -1721,6 +1895,7 @@ def reproduce_ci_job(url, work_dir):
function is a set of printed instructions for running docker and then
commands to run to reproduce the build once inside the container.
"""
work_dir = os.path.realpath(work_dir)
download_and_extract_artifacts(url, work_dir)
lock_file = fs.find(work_dir, "spack.lock")[0]
@@ -1885,7 +2060,9 @@ def reproduce_ci_job(url, work_dir):
if job_image:
inst_list.append("\nRun the following command:\n\n")
inst_list.append(
" $ docker run --rm -v {0}:{1} -ti {2}\n".format(work_dir, mount_as_dir, job_image)
" $ docker run --rm --name spack_reproducer -v {0}:{1}:Z -ti {2}\n".format(
work_dir, mount_as_dir, job_image
)
)
inst_list.append("\nOnce inside the container:\n\n")
else:
@@ -1936,13 +2113,16 @@ def process_command(name, commands, repro_dir):
# Create a string [command 1] && [command 2] && ... && [command n] with commands
# quoted using double quotes.
args_to_string = lambda args: " ".join('"{}"'.format(arg) for arg in args)
full_command = " && ".join(map(args_to_string, commands))
full_command = " \n ".join(map(args_to_string, commands))
# Write the command to a shell script
script = "{0}.sh".format(name)
with open(script, "w") as fd:
fd.write("#!/bin/sh\n\n")
fd.write("\n# spack {0} command\n".format(name))
fd.write("set -e\n")
if os.environ.get("SPACK_VERBOSE_SCRIPT"):
fd.write("set -x\n")
fd.write(full_command)
fd.write("\n")

View File

@@ -498,11 +498,11 @@ def list_fn(args):
if not args.allarch:
arch = spack.spec.Spec.default_arch()
specs = [s for s in specs if s.satisfies(arch)]
specs = [s for s in specs if s.intersects(arch)]
if args.specs:
constraints = set(args.specs)
specs = [s for s in specs if any(s.satisfies(c) for c in constraints)]
specs = [s for s in specs if any(s.intersects(c) for c in constraints)]
if sys.stdout.isatty():
builds = len(specs)
tty.msg("%s." % plural(builds, "cached build"))

View File

@@ -33,12 +33,6 @@ def deindent(desc):
return desc.replace(" ", "")
def get_env_var(variable_name):
if variable_name in os.environ:
return os.environ.get(variable_name)
return None
def setup_parser(subparser):
setup_parser.parser = subparser
subparsers = subparser.add_subparsers(help="CI sub-commands")
@@ -255,10 +249,9 @@ def ci_rebuild(args):
# Make sure the environment is "gitlab-enabled", or else there's nothing
# to do.
yaml_root = ev.config_dict(env.yaml)
gitlab_ci = yaml_root["gitlab-ci"] if "gitlab-ci" in yaml_root else None
if not gitlab_ci:
tty.die("spack ci rebuild requires an env containing gitlab-ci cfg")
ci_config = cfg.get("ci")
if not ci_config:
tty.die("spack ci rebuild requires an env containing ci cfg")
tty.msg(
"SPACK_BUILDCACHE_DESTINATION={0}".format(
@@ -269,27 +262,27 @@ def ci_rebuild(args):
# Grab the environment variables we need. These either come from the
# pipeline generation step ("spack ci generate"), where they were written
# out as variables, or else provided by GitLab itself.
pipeline_artifacts_dir = get_env_var("SPACK_ARTIFACTS_ROOT")
job_log_dir = get_env_var("SPACK_JOB_LOG_DIR")
job_test_dir = get_env_var("SPACK_JOB_TEST_DIR")
repro_dir = get_env_var("SPACK_JOB_REPRO_DIR")
local_mirror_dir = get_env_var("SPACK_LOCAL_MIRROR_DIR")
concrete_env_dir = get_env_var("SPACK_CONCRETE_ENV_DIR")
ci_pipeline_id = get_env_var("CI_PIPELINE_ID")
ci_job_name = get_env_var("CI_JOB_NAME")
signing_key = get_env_var("SPACK_SIGNING_KEY")
job_spec_pkg_name = get_env_var("SPACK_JOB_SPEC_PKG_NAME")
job_spec_dag_hash = get_env_var("SPACK_JOB_SPEC_DAG_HASH")
compiler_action = get_env_var("SPACK_COMPILER_ACTION")
spack_pipeline_type = get_env_var("SPACK_PIPELINE_TYPE")
remote_mirror_override = get_env_var("SPACK_REMOTE_MIRROR_OVERRIDE")
remote_mirror_url = get_env_var("SPACK_REMOTE_MIRROR_URL")
spack_ci_stack_name = get_env_var("SPACK_CI_STACK_NAME")
shared_pr_mirror_url = get_env_var("SPACK_CI_SHARED_PR_MIRROR_URL")
rebuild_everything = get_env_var("SPACK_REBUILD_EVERYTHING")
pipeline_artifacts_dir = os.environ.get("SPACK_ARTIFACTS_ROOT")
job_log_dir = os.environ.get("SPACK_JOB_LOG_DIR")
job_test_dir = os.environ.get("SPACK_JOB_TEST_DIR")
repro_dir = os.environ.get("SPACK_JOB_REPRO_DIR")
local_mirror_dir = os.environ.get("SPACK_LOCAL_MIRROR_DIR")
concrete_env_dir = os.environ.get("SPACK_CONCRETE_ENV_DIR")
ci_pipeline_id = os.environ.get("CI_PIPELINE_ID")
ci_job_name = os.environ.get("CI_JOB_NAME")
signing_key = os.environ.get("SPACK_SIGNING_KEY")
job_spec_pkg_name = os.environ.get("SPACK_JOB_SPEC_PKG_NAME")
job_spec_dag_hash = os.environ.get("SPACK_JOB_SPEC_DAG_HASH")
compiler_action = os.environ.get("SPACK_COMPILER_ACTION")
spack_pipeline_type = os.environ.get("SPACK_PIPELINE_TYPE")
remote_mirror_override = os.environ.get("SPACK_REMOTE_MIRROR_OVERRIDE")
remote_mirror_url = os.environ.get("SPACK_REMOTE_MIRROR_URL")
spack_ci_stack_name = os.environ.get("SPACK_CI_STACK_NAME")
shared_pr_mirror_url = os.environ.get("SPACK_CI_SHARED_PR_MIRROR_URL")
rebuild_everything = os.environ.get("SPACK_REBUILD_EVERYTHING")
# Construct absolute paths relative to current $CI_PROJECT_DIR
ci_project_dir = get_env_var("CI_PROJECT_DIR")
ci_project_dir = os.environ.get("CI_PROJECT_DIR")
pipeline_artifacts_dir = os.path.join(ci_project_dir, pipeline_artifacts_dir)
job_log_dir = os.path.join(ci_project_dir, job_log_dir)
job_test_dir = os.path.join(ci_project_dir, job_test_dir)
@@ -306,8 +299,10 @@ def ci_rebuild(args):
# Query the environment manifest to find out whether we're reporting to a
# CDash instance, and if so, gather some information from the manifest to
# support that task.
cdash_handler = spack_ci.CDashHandler(yaml_root.get("cdash")) if "cdash" in yaml_root else None
if cdash_handler:
cdash_config = cfg.get("cdash")
cdash_handler = None
if "build-group" in cdash_config:
cdash_handler = spack_ci.CDashHandler(cdash_config)
tty.debug("cdash url = {0}".format(cdash_handler.url))
tty.debug("cdash project = {0}".format(cdash_handler.project))
tty.debug("cdash project_enc = {0}".format(cdash_handler.project_enc))
@@ -340,13 +335,13 @@ def ci_rebuild(args):
pipeline_mirror_url = None
temp_storage_url_prefix = None
if "temporary-storage-url-prefix" in gitlab_ci:
temp_storage_url_prefix = gitlab_ci["temporary-storage-url-prefix"]
if "temporary-storage-url-prefix" in ci_config:
temp_storage_url_prefix = ci_config["temporary-storage-url-prefix"]
pipeline_mirror_url = url_util.join(temp_storage_url_prefix, ci_pipeline_id)
enable_artifacts_mirror = False
if "enable-artifacts-buildcache" in gitlab_ci:
enable_artifacts_mirror = gitlab_ci["enable-artifacts-buildcache"]
if "enable-artifacts-buildcache" in ci_config:
enable_artifacts_mirror = ci_config["enable-artifacts-buildcache"]
if enable_artifacts_mirror or (
spack_is_pr_pipeline and not enable_artifacts_mirror and not temp_storage_url_prefix
):
@@ -593,8 +588,8 @@ def ci_rebuild(args):
# avoid wasting compute cycles attempting to build those hashes.
if install_exit_code == INSTALL_FAIL_CODE and spack_is_develop_pipeline:
tty.debug("Install failed on develop")
if "broken-specs-url" in gitlab_ci:
broken_specs_url = gitlab_ci["broken-specs-url"]
if "broken-specs-url" in ci_config:
broken_specs_url = ci_config["broken-specs-url"]
dev_fail_hash = job_spec.dag_hash()
broken_spec_path = url_util.join(broken_specs_url, dev_fail_hash)
tty.msg("Reporting broken develop build as: {0}".format(broken_spec_path))
@@ -602,8 +597,8 @@ def ci_rebuild(args):
broken_spec_path,
job_spec_pkg_name,
spack_ci_stack_name,
get_env_var("CI_JOB_URL"),
get_env_var("CI_PIPELINE_URL"),
os.environ.get("CI_JOB_URL"),
os.environ.get("CI_PIPELINE_URL"),
job_spec.to_dict(hash=ht.dag_hash),
)
@@ -615,17 +610,14 @@ def ci_rebuild(args):
# the package, run them and copy the output. Failures of any kind should
# *not* terminate the build process or preclude creating the build cache.
broken_tests = (
"broken-tests-packages" in gitlab_ci
and job_spec.name in gitlab_ci["broken-tests-packages"]
"broken-tests-packages" in ci_config
and job_spec.name in ci_config["broken-tests-packages"]
)
reports_dir = fs.join_path(os.getcwd(), "cdash_report")
if args.tests and broken_tests:
tty.warn(
"Unable to run stand-alone tests since listed in "
"gitlab-ci's 'broken-tests-packages'"
)
tty.warn("Unable to run stand-alone tests since listed in " "ci's 'broken-tests-packages'")
if cdash_handler:
msg = "Package is listed in gitlab-ci's broken-tests-packages"
msg = "Package is listed in ci's broken-tests-packages"
cdash_handler.report_skipped(job_spec, reports_dir, reason=msg)
cdash_handler.copy_test_results(reports_dir, job_test_dir)
elif args.tests:
@@ -688,8 +680,8 @@ def ci_rebuild(args):
# If this is a develop pipeline, check if the spec that we just built is
# on the broken-specs list. If so, remove it.
if spack_is_develop_pipeline and "broken-specs-url" in gitlab_ci:
broken_specs_url = gitlab_ci["broken-specs-url"]
if spack_is_develop_pipeline and "broken-specs-url" in ci_config:
broken_specs_url = ci_config["broken-specs-url"]
just_built_hash = job_spec.dag_hash()
broken_spec_path = url_util.join(broken_specs_url, just_built_hash)
if web_util.url_exists(broken_spec_path):
@@ -706,9 +698,9 @@ def ci_rebuild(args):
else:
tty.debug("spack install exited non-zero, will not create buildcache")
api_root_url = get_env_var("CI_API_V4_URL")
ci_project_id = get_env_var("CI_PROJECT_ID")
ci_job_id = get_env_var("CI_JOB_ID")
api_root_url = os.environ.get("CI_API_V4_URL")
ci_project_id = os.environ.get("CI_PROJECT_ID")
ci_job_id = os.environ.get("CI_JOB_ID")
repro_job_url = "{0}/projects/{1}/jobs/{2}/artifacts".format(
api_root_url, ci_project_id, ci_job_id

View File

@@ -514,7 +514,15 @@ def add_concretizer_args(subparser):
dest="concretizer:reuse",
const=True,
default=None,
help="reuse installed dependencies/buildcaches when possible",
help="reuse installed packages/buildcaches when possible",
)
subgroup.add_argument(
"--reuse-deps",
action=ConfigSetAction,
dest="concretizer:reuse",
const="dependencies",
default=None,
help="reuse installed dependencies only",
)
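As an illustrative (hypothetical) invocation: ``spack install --reuse-deps hdf5`` would prefer already-installed dependencies while still concretizing a fresh ``hdf5`` root, whereas ``--reuse`` may also reuse an installed ``hdf5`` itself.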

View File

@@ -39,19 +39,14 @@
compiler flags:
@g{cflags="flags"} cppflags, cflags, cxxflags,
fflags, ldflags, ldlibs
@g{cflags=="flags"} propagate flags to package dependencies
cppflags, cflags, cxxflags, fflags,
ldflags, ldlibs
@g{==} propagate flags to package dependencies
variants:
@B{+variant} enable <variant>
@B{++variant} propagate enable <variant>
@r{-variant} or @r{~variant} disable <variant>
@r{--variant} or @r{~~variant} propagate disable <variant>
@B{variant=value} set non-boolean <variant> to <value>
@B{variant==value} propagate non-boolean <variant> to <value>
@B{variant=value1,value2,value3} set multi-value <variant> values
@B{variant==value1,value2,value3} propagate multi-value <variant> values
@B{++}, @r{--}, @r{~~}, @B{==} propagate variants to package dependencies
architecture variants:
@m{platform=platform} linux, darwin, cray, etc.
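For instance, the (illustrative) spec ``mpileaks cflags=="-O3" ++debug`` sets the flag and the variant on ``mpileaks`` and propagates both to its dependencies, while plain ``cflags="-O3" +debug`` applies them to ``mpileaks`` alone.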

View File

@@ -283,7 +283,7 @@ def print_tests(pkg):
c_names = ("gcc", "intel", "intel-parallel-studio", "pgi")
if pkg.name in c_names:
v_names.extend(["c", "cxx", "fortran"])
if pkg.spec.satisfies("llvm+clang"):
if pkg.spec.intersects("llvm+clang"):
v_names.extend(["c", "cxx"])
# TODO Refactor END

View File

@@ -263,146 +263,6 @@ def report_filename(args: argparse.Namespace, specs: List[spack.spec.Spec]) -> s
return result
def install_specs(specs, install_kwargs, cli_args):
try:
if ev.active_environment():
install_specs_inside_environment(specs, install_kwargs, cli_args)
else:
install_specs_outside_environment(specs, install_kwargs)
except spack.build_environment.InstallError as e:
if cli_args.show_log_on_error:
e.print_context()
assert e.pkg, "Expected InstallError to include the associated package"
if not os.path.exists(e.pkg.build_log_path):
tty.error("'spack install' created no log.")
else:
sys.stderr.write("Full build log:\n")
with open(e.pkg.build_log_path) as log:
shutil.copyfileobj(log, sys.stderr)
raise
def install_specs_inside_environment(specs, install_kwargs, cli_args):
specs_to_install, specs_to_add = [], []
env = ev.active_environment()
for abstract, concrete in specs:
# This won't find specs added to the env since last
# concretize, therefore should we consider enforcing
# concretization of the env before allowing to install
# specs?
m_spec = env.matching_spec(abstract)
# If there is any ambiguity in the above call to matching_spec
# (i.e. if more than one spec in the environment matches), then
# SpackEnvironmentError is raised, with a message listing the
# the matches. Getting to this point means there were either
# no matches or exactly one match.
if not m_spec and not cli_args.add:
msg = (
"Cannot install '{0}' because it is not in the current environment."
" You can add it to the environment with 'spack add {0}', or as part"
" of the install command with 'spack install --add {0}'"
).format(str(abstract))
tty.die(msg)
if not m_spec:
tty.debug("adding {0} as a root".format(abstract.name))
specs_to_add.append((abstract, concrete))
continue
tty.debug("exactly one match for {0} in env -> {1}".format(m_spec.name, m_spec.dag_hash()))
if m_spec in env.roots() or not cli_args.add:
# either the single match is a root spec (in which case
# the spec is not added to the env again), or the user did
# not specify --add (in which case it is assumed we are
# installing already-concretized specs in the env)
tty.debug("just install {0}".format(m_spec.name))
specs_to_install.append(m_spec)
else:
# the single match is not a root (i.e. it's a dependency),
# and --add was specified, so we'll add it as a
# root before installing
tty.debug("add {0} then install it".format(m_spec.name))
specs_to_add.append((abstract, concrete))
if specs_to_add:
tty.debug("Adding the following specs as roots:")
for abstract, concrete in specs_to_add:
tty.debug(" {0}".format(abstract.name))
with env.write_transaction():
specs_to_install.append(env.concretize_and_add(abstract, concrete))
env.write(regenerate=False)
# Install the validated list of cli specs
if specs_to_install:
tty.debug("Installing the following cli specs:")
for s in specs_to_install:
tty.debug(" {0}".format(s.name))
env.install_specs(specs_to_install, **install_kwargs)
def install_specs_outside_environment(specs, install_kwargs):
installs = [(concrete.package, install_kwargs) for _, concrete in specs]
builder = PackageInstaller(installs)
builder.install()
def install_all_specs_from_active_environment(
install_kwargs, only_concrete, cli_test_arg, reporter_factory
):
"""Install all specs from the active environment
Args:
install_kwargs (dict): dictionary of options to be passed to the installer
only_concrete (bool): if true don't concretize the environment, but install
only the specs that are already concrete
cli_test_arg (bool or str): command line argument to select which test to run
reporter: reporter object for the installations
"""
env = ev.active_environment()
if not env:
msg = "install requires a package argument or active environment"
if "spack.yaml" in os.listdir(os.getcwd()):
# There's a spack.yaml file in the working dir, the user may
# have intended to use that
msg += "\n\n"
msg += "Did you mean to install using the `spack.yaml`"
msg += " in this directory? Try: \n"
msg += " spack env activate .\n"
msg += " spack install\n"
msg += " OR\n"
msg += " spack --env . install"
tty.die(msg)
install_kwargs["tests"] = compute_tests_install_kwargs(env.user_specs, cli_test_arg)
if not only_concrete:
with env.write_transaction():
concretized_specs = env.concretize(tests=install_kwargs["tests"])
ev.display_specs(concretized_specs)
# save view regeneration for later, so that we only do it
# once, as it can be slow.
env.write(regenerate=False)
specs = env.all_specs()
if not specs:
msg = "{0} environment has no specs to install".format(env.name)
tty.msg(msg)
return
reporter = reporter_factory(specs) or lang.nullcontext()
tty.msg("Installing environment {0}".format(env.name))
with reporter:
env.install_all(**install_kwargs)
tty.debug("Regenerating environment views for {0}".format(env.name))
with env.write_transaction():
# write env to trigger view generation and modulefile
# generation
env.write()
def compute_tests_install_kwargs(specs, cli_test_arg):
"""Translate the test cli argument into the proper install argument"""
if cli_test_arg == "all":
@@ -412,43 +272,6 @@ def compute_tests_install_kwargs(specs, cli_test_arg):
return False
def specs_from_cli(args, install_kwargs):
"""Return abstract and concrete spec parsed from the command line."""
abstract_specs = spack.cmd.parse_specs(args.spec)
install_kwargs["tests"] = compute_tests_install_kwargs(abstract_specs, args.test)
try:
concrete_specs = spack.cmd.parse_specs(
args.spec, concretize=True, tests=install_kwargs["tests"]
)
except SpackError as e:
tty.debug(e)
if args.log_format is not None:
reporter = args.reporter()
reporter.concretization_report(report_filename(args, abstract_specs), e.message)
raise
return abstract_specs, concrete_specs
def concrete_specs_from_file(args):
"""Return the list of concrete specs read from files."""
result = []
for file in args.specfiles:
with open(file, "r") as f:
if file.endswith("yaml") or file.endswith("yml"):
s = spack.spec.Spec.from_yaml(f)
else:
s = spack.spec.Spec.from_json(f)
concretized = s.concretized()
if concretized.dag_hash() != s.dag_hash():
msg = 'skipped invalid file "{0}". '
msg += "The file does not contain a concrete spec."
tty.warn(msg.format(file))
continue
result.append(concretized)
return result
def require_user_confirmation_for_overwrite(concrete_specs, args):
if args.yes_to_all:
return
@@ -475,12 +298,40 @@ def require_user_confirmation_for_overwrite(concrete_specs, args):
tty.die("Reinstallation aborted.")
def _dump_log_on_error(e: spack.build_environment.InstallError):
e.print_context()
assert e.pkg, "Expected InstallError to include the associated package"
if not os.path.exists(e.pkg.build_log_path):
tty.error("'spack install' created no log.")
else:
sys.stderr.write("Full build log:\n")
with open(e.pkg.build_log_path, errors="replace") as log:
shutil.copyfileobj(log, sys.stderr)
def _die_require_env():
msg = "install requires a package argument or active environment"
if "spack.yaml" in os.listdir(os.getcwd()):
# There's a spack.yaml file in the working dir, the user may
# have intended to use that
msg += (
"\n\n"
"Did you mean to install using the `spack.yaml`"
" in this directory? Try: \n"
" spack env activate .\n"
" spack install\n"
" OR\n"
" spack --env . install"
)
tty.die(msg)
def install(parser, args):
# TODO: unify args.verbose?
tty.set_verbose(args.verbose or args.install_verbose)
if args.help_cdash:
spack.cmd.common.arguments.print_cdash_help()
arguments.print_cdash_help()
return
if args.no_checksum:
@@ -489,43 +340,150 @@ def install(parser, args):
if args.deprecated:
spack.config.set("config:deprecated", True, scope="command_line")
spack.cmd.common.arguments.sanitize_reporter_options(args)
arguments.sanitize_reporter_options(args)
def reporter_factory(specs):
if args.log_format is None:
return None
return lang.nullcontext()
context_manager = spack.report.build_context_manager(
return spack.report.build_context_manager(
reporter=args.reporter(), filename=report_filename(args, specs=specs), specs=specs
)
return context_manager
install_kwargs = install_kwargs_from_args(args)
if not args.spec and not args.specfiles:
# If there are no args but an active environment then install the packages from it.
install_all_specs_from_active_environment(
install_kwargs=install_kwargs,
only_concrete=args.only_concrete,
cli_test_arg=args.test,
reporter_factory=reporter_factory,
)
env = ev.active_environment()
if not env and not args.spec and not args.specfiles:
_die_require_env()
try:
if env:
install_with_active_env(env, args, install_kwargs, reporter_factory)
else:
install_without_active_env(args, install_kwargs, reporter_factory)
except spack.build_environment.InstallError as e:
if args.show_log_on_error:
_dump_log_on_error(e)
raise
def _maybe_add_and_concretize(args, env, specs):
"""Handle the overloaded spack install behavior of adding
and automatically concretizing specs"""
# Users can opt out of accidental concretizations with --only-concrete
if args.only_concrete:
return
# Specs from CLI
abstract_specs, concrete_specs = specs_from_cli(args, install_kwargs)
# Otherwise, we will modify the environment.
with env.write_transaction():
# `spack add` adds these specs.
if args.add:
for spec in specs:
env.add(spec)
# Concrete specs from YAML or JSON files
specs_from_file = concrete_specs_from_file(args)
abstract_specs.extend(specs_from_file)
concrete_specs.extend(specs_from_file)
# `spack concretize`
tests = compute_tests_install_kwargs(env.user_specs, args.test)
concretized_specs = env.concretize(tests=tests)
ev.display_specs(concretized_specs)
# save view regeneration for later, so that we only do it
# once, as it can be slow.
env.write(regenerate=False)
def install_with_active_env(env: ev.Environment, args, install_kwargs, reporter_factory):
specs = spack.cmd.parse_specs(args.spec)
# The following two commands are equivalent:
# 1. `spack install --add x y z`
# 2. `spack add x y z && spack concretize && spack install --only-concrete`
# here we do the `add` and `concretize` part.
_maybe_add_and_concretize(args, env, specs)
# Now we're doing `spack install --only-concrete`.
if args.add or not specs:
specs_to_install = env.concrete_roots()
if not specs_to_install:
tty.msg(f"{env.name} environment has no specs to install")
return
# `spack install x y z` without --add is installing matching specs in the env.
else:
specs_to_install = env.all_matching_specs(*specs)
if not specs_to_install:
msg = (
"Cannot install '{0}' because no matching specs are in the current environment."
" You can add specs to the environment with 'spack add {0}', or as part"
" of the install command with 'spack install --add {0}'"
).format(" ".join(args.spec))
tty.die(msg)
install_kwargs["tests"] = compute_tests_install_kwargs(specs_to_install, args.test)
if args.overwrite:
require_user_confirmation_for_overwrite(specs_to_install, args)
install_kwargs["overwrite"] = [spec.dag_hash() for spec in specs_to_install]
try:
with reporter_factory(specs_to_install):
env.install_specs(specs_to_install, **install_kwargs)
finally:
# TODO: this is doing way too much to trigger
# views and modules to be generated.
with env.write_transaction():
env.write(regenerate=True)
def concrete_specs_from_cli(args, install_kwargs):
"""Return abstract and concrete spec parsed from the command line."""
abstract_specs = spack.cmd.parse_specs(args.spec)
install_kwargs["tests"] = compute_tests_install_kwargs(abstract_specs, args.test)
try:
concrete_specs = spack.cmd.parse_specs(
args.spec, concretize=True, tests=install_kwargs["tests"]
)
except SpackError as e:
tty.debug(e)
if args.log_format is not None:
reporter = args.reporter()
reporter.concretization_report(report_filename(args, abstract_specs), e.message)
raise
return concrete_specs
def concrete_specs_from_file(args):
"""Return the list of concrete specs read from files."""
result = []
for file in args.specfiles:
with open(file, "r") as f:
if file.endswith("yaml") or file.endswith("yml"):
s = spack.spec.Spec.from_yaml(f)
else:
s = spack.spec.Spec.from_json(f)
concretized = s.concretized()
if concretized.dag_hash() != s.dag_hash():
msg = 'skipped invalid file "{0}". '
msg += "The file does not contain a concrete spec."
tty.warn(msg.format(file))
continue
result.append(concretized)
return result
def install_without_active_env(args, install_kwargs, reporter_factory):
concrete_specs = concrete_specs_from_cli(args, install_kwargs) + concrete_specs_from_file(args)
if len(concrete_specs) == 0:
tty.die("The `spack install` command requires a spec to install.")
reporter = reporter_factory(concrete_specs) or lang.nullcontext()
with reporter:
with reporter_factory(concrete_specs):
if args.overwrite:
require_user_confirmation_for_overwrite(concrete_specs, args)
install_kwargs["overwrite"] = [spec.dag_hash() for spec in concrete_specs]
install_specs(zip(abstract_specs, concrete_specs), install_kwargs, args)
installs = [(s.package, install_kwargs) for s in concrete_specs]
builder = PackageInstaller(installs)
builder.install()

View File

@@ -335,7 +335,7 @@ def not_excluded_fn(args):
exclude_specs.extend(spack.cmd.parse_specs(str(args.exclude_specs).split()))
def not_excluded(x):
return not any(x.satisfies(y, strict=True) for y in exclude_specs)
return not any(x.satisfies(y) for y in exclude_specs)
return not_excluded

View File

@@ -26,7 +26,6 @@
description = "run spack's unit tests (wrapper around pytest)"
section = "developer"
level = "long"
is_windows = sys.platform == "win32"
def setup_parser(subparser):
@@ -212,7 +211,7 @@ def unit_test(parser, args, unknown_args):
# mock configuration used by unit tests
# Note: skip on windows here because for the moment,
# clingo is wholly unsupported from bootstrap
if not is_windows:
if sys.platform != "win32":
with spack.bootstrap.ensure_bootstrap_configuration():
spack.bootstrap.ensure_core_dependencies()
if pytest is None:

View File

@@ -28,8 +28,6 @@
__all__ = ["Compiler"]
is_windows = sys.platform == "win32"
@llnl.util.lang.memoized
def _get_compiler_version_output(compiler_path, version_arg, ignore_errors=()):
@@ -598,7 +596,7 @@ def search_regexps(cls, language):
suffixes = [""]
# Windows compilers generally have an extension of some sort
# as do most files on Windows, handle that case here
if is_windows:
if sys.platform == "win32":
ext = r"\.(?:exe|bat)"
cls_suf = [suf + ext for suf in cls.suffixes]
ext_suf = [ext]

View File

@@ -84,7 +84,7 @@ def _to_dict(compiler):
d = {}
d["spec"] = str(compiler.spec)
d["paths"] = dict((attr, getattr(compiler, attr, None)) for attr in _path_instance_vars)
d["flags"] = dict((fname, fvals) for fname, fvals in compiler.flags)
d["flags"] = dict((fname, " ".join(fvals)) for fname, fvals in compiler.flags.items())
d["flags"].update(
dict(
(attr, getattr(compiler, attr, None))

View File

@@ -61,7 +61,7 @@ def is_clang_based(self):
return version >= ver("9.0") and "classic" not in str(version)
version_argument = "--version"
version_regex = r"[Vv]ersion.*?(\d+(\.\d+)+)"
version_regex = r"[Cc]ray (?:clang|C :|C\+\+ :|Fortran :) [Vv]ersion.*?(\d+(\.\d+)+)"
@property
def verbose_flag(self):
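A quick hedged check of the tightened pattern; the banner is a plausible CCE output invented for the test, not taken from this diff:

    import re

    version_regex = r"[Cc]ray (?:clang|C :|C\+\+ :|Fortran :) [Vv]ersion.*?(\d+(\.\d+)+)"
    banner = "Cray clang version 14.0.0 (abcdef)"
    print(re.search(version_regex, banner).group(1))  # 14.0.0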

View File

@@ -122,7 +122,19 @@ def platform_toolset_ver(self):
@property
def cl_version(self):
"""Cl toolset version"""
return spack.compiler.get_compiler_version_output(self.cc)
return Version(
re.search(
Msvc.version_regex,
spack.compiler.get_compiler_version_output(self.cc, version_arg=None),
).group(1)
)
@property
def vs_root(self):
# The MSVC install root is located at a fixed level above the compiler
# and is referenceable idiomatically via the pattern below;
# this should be consistent across versions
return os.path.abspath(os.path.join(self.cc, "../../../../../../../.."))
def setup_custom_environment(self, pkg, env):
"""Set environment variables for MSVC using the

View File

@@ -134,7 +134,7 @@ def _valid_virtuals_and_externals(self, spec):
externals = spec_externals(cspec)
for ext in externals:
if ext.satisfies(spec):
if ext.intersects(spec):
usable.append(ext)
# If nothing is in the usable list now, it's because we aren't
@@ -200,7 +200,7 @@ def concretize_version(self, spec):
# List of versions we could consider, in sorted order
pkg_versions = spec.package_class.versions
usable = [v for v in pkg_versions if any(v.satisfies(sv) for sv in spec.versions)]
usable = [v for v in pkg_versions if any(v.intersects(sv) for sv in spec.versions)]
yaml_prefs = PackagePrefs(spec.name, "version")
@@ -344,7 +344,7 @@ def concretize_architecture(self, spec):
new_target_arch = spack.spec.ArchSpec((None, None, str(new_target)))
curr_target_arch = spack.spec.ArchSpec((None, None, str(curr_target)))
if not new_target_arch.satisfies(curr_target_arch):
if not new_target_arch.intersects(curr_target_arch):
# new_target is an incorrect guess based on preferences
# and/or default
valid_target_ranges = str(curr_target).split(",")
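A minimal sketch (my own specs, not from the diff) of the satisfies/intersects distinction these hunks rely on: after this change, satisfies(rhs) asks whether the left spec is entirely contained in rhs, while intersects(rhs) asks whether some concrete spec could satisfy both.

import spack.spec

lhs = spack.spec.Spec("zlib@1.2:")
rhs = spack.spec.Spec("zlib@:1.3")

# The version ranges overlap (1.2:1.3), so the two specs intersect...
assert lhs.intersects(rhs)
# ...but lhs is not contained in rhs (zlib@1.4 satisfies lhs, not rhs).
assert not lhs.satisfies(rhs)
# A fully constrained spec does satisfy a range that contains it.
assert spack.spec.Spec("zlib@1.2.13").satisfies(lhs)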

View File

@@ -77,6 +77,8 @@
"config": spack.schema.config.schema,
"upstreams": spack.schema.upstreams.schema,
"bootstrap": spack.schema.bootstrap.schema,
"ci": spack.schema.ci.schema,
"cdash": spack.schema.cdash.schema,
}
# Same as above, but including keys for environments
@@ -360,6 +362,12 @@ def _process_dict_keyname_overrides(data):
if sk.endswith(":"):
key = syaml.syaml_str(sk[:-1])
key.override = True
elif sk.endswith("+"):
key = syaml.syaml_str(sk[:-1])
key.prepend = True
elif sk.endswith("-"):
key = syaml.syaml_str(sk[:-1])
key.append = True
else:
key = sk
@@ -1040,6 +1048,33 @@ def _override(string):
return hasattr(string, "override") and string.override
def _append(string):
"""Test if a spack YAML string is an append.
See ``spack_yaml`` for details. Keys in Spack YAML can end in `-:`,
and if they do, their values append to lower-precedence
configs.
str, str : concatenate strings.
[obj], [obj] : append lists.
"""
return getattr(string, "append", False)
def _prepend(string):
"""Test if a spack YAML string is a prepend.
See ``spack_yaml`` for details. Keys in Spack YAML can end in `+:`,
and if they do, their values prepend to lower-precedence
configs.
str, str : concatenate strings.
[obj], [obj] : prepend lists. (default behavior)
"""
return getattr(string, "prepend", False)
def _mark_internal(data, name):
"""Add a simple name mark to raw YAML/JSON data.
@@ -1102,7 +1137,57 @@ def get_valid_type(path):
raise ConfigError("Cannot determine valid type for path '%s'." % path)
def merge_yaml(dest, source):
def remove_yaml(dest, source):
"""Unmerges source from dest; entries in source take precedence over dest.
This routine may modify dest and should be assigned to dest, in
case dest was None to begin with, e.g.:
dest = remove_yaml(dest, source)
In the result, elements from lists from ``source`` will not appear
as elements of lists from ``dest``. Likewise, when iterating over keys
or items in merged ``OrderedDict`` objects, keys from ``source`` will not
appear as keys in ``dest``.
Config file authors can optionally end any attribute in a dict
with `::` instead of `:`, and the key will remove the entire section
from ``dest``
"""
def they_are(t):
return isinstance(dest, t) and isinstance(source, t)
# If source is None, there is nothing to remove; return dest unchanged.
if source is None:
return dest
# Elements appearing in the source list are removed from dest
if they_are(list):
# Make sure to copy ruamel comments
dest[:] = [x for x in dest if x not in source]
return dest
# Source dict is unmerged from dest.
elif they_are(dict):
for sk, sv in source.items():
# Pop the dest item first. If the key is an override (ends with ::),
# it stays removed; otherwise the recursively-unmerged value is
# re-inserted below, refreshing mark provenance (i.e., file/line info).
unmerge = sk in dest
old_dest_value = dest.pop(sk, None)
if unmerge and not spack.config._override(sk):
dest[sk] = remove_yaml(old_dest_value, sv)
return dest
# If we reach here source and dest are either different types or are
# not both lists or dicts: there is nothing to remove, keep dest.
return dest
def merge_yaml(dest, source, prepend=False, append=False):
"""Merges source into dest; entries in source take precedence over dest.
This routine may modify dest and should be assigned to dest, in
@@ -1118,6 +1203,9 @@ def merge_yaml(dest, source):
Config file authors can optionally end any attribute in a dict
with `::` instead of `:`, and the key will override that of the
parent instead of merging.
`+:` will extend the default prepend merge strategy to include string concatenation
`-:` will change the merge strategy to append; it also includes string concatenation
"""
def they_are(t):
@@ -1129,8 +1217,12 @@ def they_are(t):
# Source list is prepended (for precedence)
if they_are(list):
# Make sure to copy ruamel comments
dest[:] = source + [x for x in dest if x not in source]
if append:
# Make sure to copy ruamel comments
dest[:] = [x for x in dest if x not in source] + source
else:
# Make sure to copy ruamel comments
dest[:] = source + [x for x in dest if x not in source]
return dest
# Source dict is merged into dest.
@@ -1147,7 +1239,7 @@ def they_are(t):
old_dest_value = dest.pop(sk, None)
if merge and not _override(sk):
dest[sk] = merge_yaml(old_dest_value, sv)
dest[sk] = merge_yaml(old_dest_value, sv, _prepend(sk), _append(sk))
else:
# if sk ended with ::, or if it's new, completely override
dest[sk] = copy.deepcopy(sv)
@@ -1158,6 +1250,13 @@ def they_are(t):
return dest
elif they_are(str):
# Concatenate strings in prepend or append mode
if prepend:
return source + dest
elif append:
return dest + source
# If we reach here source and dest are either different types or are
# not both lists or dicts: replace with source.
return copy.copy(source)
@@ -1183,6 +1282,17 @@ def process_config_path(path):
front = syaml.syaml_str(front)
front.override = True
seen_override_in_path = True
elif front.endswith("+"):
front = front.rstrip("+")
front = syaml.syaml_str(front)
front.prepend = True
elif front.endswith("-"):
front = front.rstrip("-")
front = syaml.syaml_str(front)
front.append = True
result.append(front)
return result
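A self-contained sketch of the three list/string merge strategies implemented above. This is simplified: real Spack derives prepend/append from the `+`/`-` key suffixes and preserves ruamel comments, whereas here they are plain arguments.

import copy

def merge(dest, source, prepend=False, append=False):
    if isinstance(dest, list) and isinstance(source, list):
        if append:
            # `-` keys: dest elements first, source last (source still wins on duplicates)
            return [x for x in dest if x not in source] + source
        # default: source is prepended for precedence
        return source + [x for x in dest if x not in source]
    if isinstance(dest, str) and isinstance(source, str):
        if prepend:
            return source + dest
        if append:
            return dest + source
    # different types: higher-precedence value replaces the lower one
    return copy.copy(source)

print(merge(["b", "c"], ["a", "b"]))               # ['a', 'b', 'c']
print(merge(["b", "c"], ["a", "b"], append=True))  # ['c', 'a', 'b']
print(merge("world", "hello ", prepend=True))      # 'hello world'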

View File

@@ -1525,7 +1525,7 @@ def _query(
if not (start_date < inst_date < end_date):
continue
if query_spec is any or rec.spec.satisfies(query_spec, strict=True):
if query_spec is any or rec.spec.satisfies(query_spec):
results.append(rec.spec)
return results

View File

@@ -29,7 +29,6 @@
import spack.util.spack_yaml
import spack.util.windows_registry
is_windows = sys.platform == "win32"
#: Information on a package that has been detected
DetectedPackage = collections.namedtuple("DetectedPackage", ["spec", "prefix"])
@@ -184,7 +183,7 @@ def library_prefix(library_dir):
elif "lib" in lowered_components:
idx = lowered_components.index("lib")
return os.sep.join(components[:idx])
elif is_windows and "bin" in lowered_components:
elif sys.platform == "win32" and "bin" in lowered_components:
idx = lowered_components.index("bin")
return os.sep.join(components[:idx])
else:
@@ -260,13 +259,13 @@ def find_windows_compiler_bundled_packages():
class WindowsKitExternalPaths(object):
if is_windows:
if sys.platform == "win32":
plat_major_ver = str(winOs.windows_version()[0])
@staticmethod
def find_windows_kit_roots():
"""Return Windows kit root, typically %programfiles%\\Windows Kits\\10|11\\"""
if not is_windows:
if sys.platform != "win32":
return []
program_files = os.environ["PROGRAMFILES(x86)"]
kit_base = os.path.join(
@@ -359,7 +358,7 @@ def compute_windows_program_path_for_package(pkg):
pkg (spack.package_base.PackageBase): package for which
Program Files location is to be computed
"""
if not is_windows:
if sys.platform != "win32":
return []
# note windows paths are fine here as this method should only ever be invoked
# to interact with Windows
@@ -379,7 +378,7 @@ def compute_windows_user_path_for_package(pkg):
installs see:
https://learn.microsoft.com/en-us/dotnet/api/system.environment.specialfolder?view=netframework-4.8
"""
if not is_windows:
if sys.platform != "win32":
return []
# Current user directory

View File

@@ -31,8 +31,6 @@
path_to_dict,
)
is_windows = sys.platform == "win32"
def common_windows_package_paths():
paths = WindowsCompilerExternalPaths.find_windows_compiler_bundled_packages()
@@ -57,7 +55,7 @@ def executables_in_path(path_hints):
path_hints (list): list of paths to be searched. If None the list will be
constructed based on the PATH environment variable.
"""
if is_windows:
if sys.platform == "win32":
path_hints.extend(common_windows_package_paths())
search_paths = llnl.util.filesystem.search_paths_for_executables(*path_hints)
return path_to_dict(search_paths)
@@ -149,7 +147,7 @@ def by_library(packages_to_check, path_hints=None):
path_to_lib_name = (
libraries_in_ld_and_system_library_path(path_hints=path_hints)
if not is_windows
if sys.platform != "win32"
else libraries_in_windows_paths(path_hints)
)

View File

@@ -21,7 +21,6 @@
import spack.util.spack_json as sjson
from spack.error import SpackError
is_windows = sys.platform == "win32"
# Note: Posixpath is used here as opposed to
# os.path.join due to spack.spec.Spec.format
# requiring forward slash path separators at this stage
@@ -346,7 +345,7 @@ def remove_install_directory(self, spec, deprecated=False):
# Windows readonly files cannot be removed by Python
# directly, change permissions before attempting to remove
if is_windows:
if sys.platform == "win32":
kwargs = {
"ignore_errors": False,
"onerror": fs.readonly_file_handler(ignore_errors=False),

View File

@@ -13,6 +13,7 @@
import time
import urllib.parse
import urllib.request
from typing import List, Optional
import ruamel.yaml as yaml
@@ -59,7 +60,7 @@
#: currently activated environment
_active_environment = None
_active_environment: Optional["Environment"] = None
#: default path where environments are stored in the spack tree
@@ -349,7 +350,8 @@ def _is_dev_spec_and_has_changed(spec):
def _spec_needs_overwrite(spec, changed_dev_specs):
"""Check whether the current spec needs to be overwritten because either it has
changed itself or one of its dependencies have changed"""
changed itself or one of its dependencies has changed
"""
# if it's not installed, we don't need to overwrite it
if not spec.installed:
return False
@@ -1551,12 +1553,11 @@ def update_default_view(self, viewpath):
def regenerate_views(self):
if not self.views:
tty.debug("Skip view update, this environment does not" " maintain a view")
tty.debug("Skip view update, this environment does not maintain a view")
return
concretized_root_specs = [s for _, s in self.concretized_specs()]
for view in self.views.values():
view.regenerate(concretized_root_specs)
view.regenerate(self.concrete_roots())
def check_views(self):
"""Checks if the environments default view can be activated."""
@@ -1564,7 +1565,7 @@ def check_views(self):
# This is effectively a no-op, but it touches all packages in the
# default view if they are installed.
for view_name, view in self.views.items():
for _, spec in self.concretized_specs():
for spec in self.concrete_roots():
if spec in view and spec.package and spec.installed:
msg = '{0} in view "{1}"'
tty.debug(msg.format(spec.name, view_name))
@@ -1582,7 +1583,7 @@ def _env_modifications_for_default_view(self, reverse=False):
visited = set()
errors = []
for _, root_spec in self.concretized_specs():
for root_spec in self.concrete_roots():
if root_spec in self.default_view and root_spec.installed and root_spec.package:
for spec in root_spec.traverse(deptype="run", root=True):
if spec.name in visited:
@@ -1799,9 +1800,6 @@ def install_specs(self, specs=None, **install_args):
"Could not install log links for {0}: {1}".format(spec.name, str(e))
)
with self.write_transaction():
self.regenerate_views()
def all_specs(self):
"""Return all specs, even those a user spec would shadow."""
roots = [self.specs_by_hash[h] for h in self.concretized_order]
@@ -1846,6 +1844,11 @@ def concretized_specs(self):
for s, h in zip(self.concretized_user_specs, self.concretized_order):
yield (s, self.specs_by_hash[h])
def concrete_roots(self):
"""Same as concretized_specs, except it returns the list of concrete
roots *without* associated user spec"""
return [root for _, root in self.concretized_specs()]
def get_by_hash(self, dag_hash):
matches = {}
roots = [self.specs_by_hash[h] for h in self.concretized_order]
@@ -1862,6 +1865,16 @@ def get_one_by_hash(self, dag_hash):
assert len(hash_matches) == 1
return hash_matches[0]
def all_matching_specs(self, *specs: spack.spec.Spec) -> List[Spec]:
"""Returns all concretized specs in the environment satisfying any of the input specs"""
key = lambda s: s.dag_hash()
return [
s
for s in spack.traverse.traverse_nodes(self.concrete_roots(), key=key)
if any(s.satisfies(t) for t in specs)
]
@spack.repo.autospec
def matching_spec(self, spec):
"""
Given a spec (likely not concretized), find a matching concretized
@@ -1879,59 +1892,60 @@ def matching_spec(self, spec):
and multiple dependency specs match, then this raises an error
and reports all matching specs.
"""
# Root specs will be keyed by concrete spec, value abstract
# Dependency-only specs will have value None
matches = {}
env_root_to_user = {root.dag_hash(): user for user, root in self.concretized_specs()}
root_matches, dep_matches = [], []
if not isinstance(spec, spack.spec.Spec):
spec = spack.spec.Spec(spec)
for user_spec, concretized_user_spec in self.concretized_specs():
# Deal with concrete specs differently
if spec.concrete:
if spec in concretized_user_spec:
matches[spec] = spec
for env_spec in spack.traverse.traverse_nodes(
specs=[root for _, root in self.concretized_specs()],
key=lambda s: s.dag_hash(),
order="breadth",
):
if not env_spec.satisfies(spec):
continue
if concretized_user_spec.satisfies(spec):
matches[concretized_user_spec] = user_spec
for dep_spec in concretized_user_spec.traverse(root=False):
if dep_spec.satisfies(spec):
# Don't overwrite the abstract spec if present
# If not present already, set to None
matches[dep_spec] = matches.get(dep_spec, None)
# If the spec is concrete, then there is no possibility of multiple matches,
# and we immediately return the single match
if spec.concrete:
return env_spec
if not matches:
# Distinguish between environment roots and deps. Specs that are both
# are classified as environment roots.
user_spec = env_root_to_user.get(env_spec.dag_hash())
if user_spec:
root_matches.append((env_spec, user_spec))
else:
dep_matches.append(env_spec)
# No matching spec
if not root_matches and not dep_matches:
return None
elif len(matches) == 1:
return list(matches.keys())[0]
root_matches = dict(
(concrete, abstract) for concrete, abstract in matches.items() if abstract
)
# Single root spec, any number of dep specs => return root spec.
if len(root_matches) == 1:
return list(root_matches.items())[0][0]
return root_matches[0][0]
if not root_matches and len(dep_matches) == 1:
return dep_matches[0]
# More than one spec matched, and either multiple roots matched or
# none of the matches were roots
# If multiple root specs match, it is assumed that the abstract
# spec will most-succinctly summarize the difference between them
# (and the user can enter one of these to disambiguate)
match_strings = []
fmt_str = "{hash:7} " + spack.spec.default_format
for concrete, abstract in matches.items():
if abstract:
s = "Root spec %s\n %s" % (abstract, concrete.format(fmt_str))
else:
s = "Dependency spec\n %s" % concrete.format(fmt_str)
match_strings.append(s)
color = clr.get_color_when()
match_strings = [
f"Root spec {abstract.format(color=color)}\n {concrete.format(fmt_str, color=color)}"
for concrete, abstract in root_matches
]
match_strings.extend(
f"Dependency spec\n {s.format(fmt_str, color=color)}" for s in dep_matches
)
matches_str = "\n".join(match_strings)
msg = "{0} matches multiple specs in the environment {1}: \n" "{2}".format(
str(spec), self.name, matches_str
raise SpackEnvironmentError(
f"{spec} matches multiple specs in the environment {self.name}: \n{matches_str}"
)
raise SpackEnvironmentError(msg)
def removed_specs(self):
"""Tuples of (user spec, concrete spec) for all specs that will be
@@ -2192,6 +2206,7 @@ def _update_and_write_manifest(self, raw_yaml_dict, yaml_dict):
view = dict((name, view.to_dict()) for name, view in self.views.items())
else:
view = False
yaml_dict["view"] = view
if self.dev_specs:
@@ -2313,7 +2328,7 @@ def _concretize_from_constraints(spec_constraints, tests=False):
invalid_deps = [
c
for c in spec_constraints
if any(c.satisfies(invd, strict=True) for invd in invalid_deps_string)
if any(c.satisfies(invd) for invd in invalid_deps_string)
]
if len(invalid_deps) != len(invalid_deps_string):
raise e
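A hedged usage sketch of the disambiguation rules in the rewritten matching_spec above (package names are invented; assumes an activated environment): a unique root match wins over dependency matches, a unique dependency-only match is returned, and anything else raises SpackEnvironmentError listing the candidates.

import spack.environment as ev

env = ev.active_environment()

root = env.matching_spec("hdf5")   # unique root in the env -> that concrete root
dep = env.matching_spec("zlib")    # present only as a dependency -> the dep spec

try:
    env.matching_spec("cmake")     # matched by several roots or deps -> error
except ev.SpackEnvironmentError as err:
    print(err)                     # lists each matching root and dependency spec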

View File

@@ -28,7 +28,6 @@
import os.path
import re
import shutil
import sys
import urllib.parse
from typing import List, Optional
@@ -53,7 +52,6 @@
#: List of all fetch strategies, created by FetchStrategy metaclass.
all_strategies = []
is_windows = sys.platform == "win32"
CONTENT_TYPE_MISMATCH_WARNING_TEMPLATE = (
"The contents of {subject} look like {content_type}. Either the URL"
@@ -1503,7 +1501,7 @@ def _from_merged_attrs(fetcher, pkg, version):
return fetcher(**attrs)
def for_package_version(pkg, version):
def for_package_version(pkg, version=None):
"""Determine a fetch strategy based on the arguments supplied to
version() in the package description."""
@@ -1514,8 +1512,18 @@ def for_package_version(pkg, version):
check_pkg_attributes(pkg)
if not isinstance(version, spack.version.VersionBase):
version = spack.version.Version(version)
if version is not None:
assert not pkg.spec.concrete, "concrete specs should not pass the 'version=' argument"
# Specs are initialized with the universe range, if no version information is given,
# so here we make sure we always match the version passed as argument
if not isinstance(version, spack.version.VersionBase):
version = spack.version.Version(version)
version_list = spack.version.VersionList()
version_list.add(version)
pkg.spec.versions = version_list
else:
version = pkg.version
# if it's a commit, we must use a GitFetchStrategy
if isinstance(version, spack.version.GitVersion):

View File

@@ -30,8 +30,7 @@
#: Groupdb does not exist on Windows, prevent imports
#: on unsupported systems
is_windows = sys.platform == "win32"
if not is_windows:
if sys.platform != "win32":
import grp
#: Spack itself also limits the shebang line to at most 4KB, which should be plenty.

View File

@@ -84,9 +84,6 @@
#: queue invariants).
STATUS_REMOVED = "removed"
is_windows = sys.platform == "win32"
is_osx = sys.platform == "darwin"
class InstallAction(object):
#: Don't perform an install
@@ -169,9 +166,9 @@ def _do_fake_install(pkg):
if not pkg.name.startswith("lib"):
library = "lib" + library
plat_shared = ".dll" if is_windows else ".so"
plat_static = ".lib" if is_windows else ".a"
dso_suffix = ".dylib" if is_osx else plat_shared
plat_shared = ".dll" if sys.platform == "win32" else ".so"
plat_static = ".lib" if sys.platform == "win32" else ".a"
dso_suffix = ".dylib" if sys.platform == "darwin" else plat_shared
# Install fake command
fs.mkdirp(pkg.prefix.bin)

View File

@@ -575,7 +575,7 @@ def setup_main_options(args):
if args.debug:
spack.util.debug.register_interrupt_handler()
spack.config.set("config:debug", True, scope="command_line")
spack.util.environment.tracing_enabled = True
spack.util.environment.TRACING_ENABLED = True
if args.timestamp:
tty.set_timestamp(True)

View File

@@ -492,7 +492,7 @@ def get_matching_versions(specs, num_versions=1):
break
# Generate only versions that satisfy the spec.
if spec.concrete or v.satisfies(spec.versions):
if spec.concrete or v.intersects(spec.versions):
s = spack.spec.Spec(pkg.name)
s.versions = VersionList([v])
s.variants = spec.variants.copy()

View File

@@ -4,7 +4,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""This package contains code for creating environment modules, which can
include TCL non-hierarchical modules, LUA hierarchical modules, and others.
include Tcl non-hierarchical modules, Lua hierarchical modules, and others.
"""
from __future__ import absolute_import

View File

@@ -207,7 +207,7 @@ def merge_config_rules(configuration, spec):
# evaluated in order of appearance in the module file
spec_configuration = module_specific_configuration.pop("all", {})
for constraint, action in module_specific_configuration.items():
if spec.satisfies(constraint, strict=True):
if spec.satisfies(constraint):
if hasattr(constraint, "override") and constraint.override:
spec_configuration = {}
update_dictionary_extending_lists(spec_configuration, action)

View File

@@ -71,7 +71,7 @@ def guess_core_compilers(name, store=False):
# A compiler is considered to be a core compiler if any of the
# C, C++ or Fortran compilers reside in a system directory
is_system_compiler = any(
os.path.dirname(x) in spack.util.environment.system_dirs
os.path.dirname(x) in spack.util.environment.SYSTEM_DIRS
for x in compiler["paths"].values()
if x is not None
)

View File

@@ -3,7 +3,7 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""This module implements the classes necessary to generate TCL
"""This module implements the classes necessary to generate Tcl
non-hierarchical modules.
"""
import posixpath
@@ -19,7 +19,7 @@
from .common import BaseConfiguration, BaseContext, BaseFileLayout, BaseModuleFileWriter
#: TCL specific part of the configuration
#: Tcl specific part of the configuration
def configuration(module_set_name):
config_path = "modules:%s:tcl" % module_set_name
config = spack.config.get(config_path, {})

View File

@@ -36,7 +36,7 @@
cmake_cache_path,
cmake_cache_string,
)
from spack.build_systems.cmake import CMakePackage
from spack.build_systems.cmake import CMakePackage, generator
from spack.build_systems.cuda import CudaPackage
from spack.build_systems.generic import Package
from spack.build_systems.gnu import GNUMirrorPackage

View File

@@ -92,9 +92,6 @@
_spack_configure_argsfile = "spack-configure-args.txt"
is_windows = sys.platform == "win32"
def deprecated_version(pkg, version):
"""Return True if the version is deprecated, False otherwise.
@@ -165,7 +162,7 @@ def windows_establish_runtime_linkage(self):
Performs symlinking to incorporate rpath dependencies to Windows runtime search paths
"""
if is_windows:
if sys.platform == "win32":
self.win_rpath.add_library_dependent(*self.win_add_library_dependent())
self.win_rpath.add_rpath(*self.win_add_rpath())
self.win_rpath.establish_link()
@@ -210,7 +207,7 @@ def to_windows_exe(exe):
plat_exe = []
if hasattr(cls, "executables"):
for exe in cls.executables:
if is_windows:
if sys.platform == "win32":
exe = to_windows_exe(exe)
plat_exe.append(exe)
return plat_exe
@@ -1200,7 +1197,7 @@ def _make_fetcher(self):
# one element (the root package). In case there are resources
# associated with the package, append their fetcher to the
# composite.
root_fetcher = fs.for_package_version(self, self.version)
root_fetcher = fs.for_package_version(self)
fetcher = fs.FetchStrategyComposite() # Composite fetcher
fetcher.append(root_fetcher) # Root fetcher is always present
resources = self._get_needed_resources()
@@ -1311,7 +1308,7 @@ def provides(self, vpkg_name):
True if this package provides a virtual package with the specified name
"""
return any(
any(self.spec.satisfies(c) for c in constraints)
any(self.spec.intersects(c) for c in constraints)
for s, constraints in self.provided.items()
if s.name == vpkg_name
)
@@ -1617,7 +1614,7 @@ def content_hash(self, content=None):
# TODO: resources
if self.spec.versions.concrete:
try:
source_id = fs.for_package_version(self, self.version).source_id()
source_id = fs.for_package_version(self).source_id()
except (fs.ExtrapolationError, fs.InvalidArgsError):
# ExtrapolationError happens if the package has no fetchers defined.
# InvalidArgsError happens when there are version directives with args,
@@ -1780,7 +1777,7 @@ def _get_needed_resources(self):
# conflict with the spec, so we need to invoke
# when_spec.satisfies(self.spec) vs.
# self.spec.satisfies(when_spec)
if when_spec.satisfies(self.spec, strict=False):
if when_spec.intersects(self.spec):
resources.extend(resource_list)
# Sorts the resources by the length of the string representing their
# destination. Since any nested resource must contain another
@@ -2401,7 +2398,7 @@ def rpath(self):
# on Windows, libraries of runtime interest are typically
# stored in the bin directory
if is_windows:
if sys.platform == "win32":
rpaths = [self.prefix.bin]
rpaths.extend(d.prefix.bin for d in deps if os.path.isdir(d.prefix.bin))
else:

View File

@@ -73,7 +73,7 @@ def __call__(self, spec):
# integer is the index of the first spec in order that satisfies
# spec, or it's a number larger than any position in the order.
match_index = next(
(i for i, s in enumerate(spec_order) if spec.satisfies(s)), len(spec_order)
(i for i, s in enumerate(spec_order) if spec.intersects(s)), len(spec_order)
)
if match_index < len(spec_order) and spec_order[match_index] == spec:
# If this is called with multiple specs that all satisfy the same
@@ -185,7 +185,7 @@ def _package(maybe_abstract_spec):
),
extra_attributes=entry.get("extra_attributes", {}),
)
if external_spec.satisfies(spec):
if external_spec.intersects(spec):
external_specs.append(external_spec)
# Defensively copy returned specs

View File

@@ -37,7 +37,7 @@
def slingshot_network():
return os.path.exists("/lib64/libcxi.so")
return os.path.exists("/opt/cray/pe") and os.path.exists("/lib64/libcxi.so")
def _target_name_from_craype_target_name(name):

View File

@@ -10,7 +10,7 @@ def get_projection(projections, spec):
"""
all_projection = None
for spec_like, projection in projections.items():
if spec.satisfies(spec_like, strict=True):
if spec.satisfies(spec_like):
return projection
elif spec_like == "all":
all_projection = projection

View File

@@ -72,7 +72,7 @@ def providers_for(self, virtual_spec):
# Add all the providers that satisfy the vpkg spec.
if virtual_spec.name in self.providers:
for p_spec, spec_set in self.providers[virtual_spec.name].items():
if p_spec.satisfies(virtual_spec, deps=False):
if p_spec.intersects(virtual_spec, deps=False):
result.update(spec_set)
# Return providers in order. Defensively copy.
@@ -186,7 +186,7 @@ def update(self, spec):
provider_spec = provider_spec_readonly.copy()
provider_spec.compiler_flags = spec.compiler_flags.copy()
if spec.satisfies(provider_spec, deps=False):
if spec.intersects(provider_spec, deps=False):
provided_name = provided_spec.name
provider_map = self.providers.setdefault(provided_name, {})

View File

@@ -15,7 +15,8 @@
"cdash": {
"type": "object",
"additionalProperties": False,
"required": ["build-group", "url", "project", "site"],
# "required": ["build-group", "url", "project", "site"],
"required": ["build-group"],
"patternProperties": {
r"build-group": {"type": "string"},
r"url": {"type": "string"},

View File

@@ -0,0 +1,181 @@
# Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Schema for gitlab-ci.yaml configuration file.
.. literalinclude:: ../spack/schema/ci.py
:lines: 13-
"""
from llnl.util.lang import union_dicts
# Schema for script fields
# List of lists and/or strings
# This is similar to what is allowed in
# the gitlab schema
script_schema = {
"type": "array",
"items": {"anyOf": [{"type": "string"}, {"type": "array", "items": {"type": "string"}}]},
}
# Additional attributes are allowed
# and will be forwarded directly to the
# CI target YAML for each job.
attributes_schema = {
"type": "object",
"properties": {
"image": {
"oneOf": [
{"type": "string"},
{
"type": "object",
"properties": {
"name": {"type": "string"},
"entrypoint": {"type": "array", "items": {"type": "string"}},
},
},
]
},
"tags": {"type": "array", "items": {"type": "string"}},
"variables": {
"type": "object",
"patternProperties": {r"[\w\d\-_\.]+": {"type": "string"}},
},
"before_script": script_schema,
"script": script_schema,
"after_script": script_schema,
},
}
submapping_schema = {
"type": "object",
"additionalProperties": False,
"required": ["submapping"],
"properties": {
"match_behavior": {"type": "string", "enum": ["first", "merge"], "default": "first"},
"submapping": {
"type": "array",
"items": {
"type": "object",
"additionalProperties": False,
"required": ["match"],
"properties": {
"match": {"type": "array", "items": {"type": "string"}},
"build-job": attributes_schema,
"build-job-remove": attributes_schema,
},
},
},
},
}
named_attributes_schema = {
"oneOf": [
{
"type": "object",
"additionalProperties": False,
"properties": {"noop-job": attributes_schema, "noop-job-remove": attributes_schema},
},
{
"type": "object",
"additionalProperties": False,
"properties": {"build-job": attributes_schema, "build-job-remove": attributes_schema},
},
{
"type": "object",
"additionalProperties": False,
"properties": {
"reindex-job": attributes_schema,
"reindex-job-remove": attributes_schema,
},
},
{
"type": "object",
"additionalProperties": False,
"properties": {
"signing-job": attributes_schema,
"signing-job-remove": attributes_schema,
},
},
{
"type": "object",
"additionalProperties": False,
"properties": {
"cleanup-job": attributes_schema,
"cleanup-job-remove": attributes_schema,
},
},
{
"type": "object",
"additionalProperties": False,
"properties": {"any-job": attributes_schema, "any-job-remove": attributes_schema},
},
]
}
pipeline_gen_schema = {
"type": "array",
"items": {"oneOf": [submapping_schema, named_attributes_schema]},
}
core_shared_properties = union_dicts(
{
"pipeline-gen": pipeline_gen_schema,
"bootstrap": {
"type": "array",
"items": {
"anyOf": [
{"type": "string"},
{
"type": "object",
"additionalProperties": False,
"required": ["name"],
"properties": {
"name": {"type": "string"},
"compiler-agnostic": {"type": "boolean", "default": False},
},
},
]
},
},
"rebuild-index": {"type": "boolean"},
"broken-specs-url": {"type": "string"},
"broken-tests-packages": {"type": "array", "items": {"type": "string"}},
"target": {"type": "string", "enum": ["gitlab"], "default": "gitlab"},
}
)
ci_properties = {
"anyOf": [
{
"type": "object",
"additionalProperties": False,
# "required": ["mappings"],
"properties": union_dicts(
core_shared_properties, {"enable-artifacts-buildcache": {"type": "boolean"}}
),
},
{
"type": "object",
"additionalProperties": False,
# "required": ["mappings"],
"properties": union_dicts(
core_shared_properties, {"temporary-storage-url-prefix": {"type": "string"}}
),
},
]
}
#: Properties for inclusion in other schemas
properties = {"ci": ci_properties}
#: Full schema with metadata
schema = {
"$schema": "http://json-schema.org/draft-07/schema#",
"title": "Spack CI configuration file schema",
"type": "object",
"additionalProperties": False,
"properties": properties,
}
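A minimal sketch of a configuration this schema accepts, written as the parsed YAML dict and checked with jsonschema (the image name and tag are made up; assumes the jsonschema package is importable, as it is inside Spack).

import jsonschema

import spack.schema.ci

ci_config = {
    "ci": {
        "target": "gitlab",
        "rebuild-index": True,
        "pipeline-gen": [
            {
                "submapping": [
                    {
                        "match": ["os=ubuntu22.04"],
                        "build-job": {
                            "image": "ghcr.io/example/builder:latest",
                            "tags": ["spack-builder"],
                        },
                    }
                ]
            },
            {"any-job": {"before_script": ["spack --version"]}},
        ],
    }
}

# Raises jsonschema.ValidationError on a malformed document.
jsonschema.validate(ci_config, spack.schema.ci.schema)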

View File

@@ -14,7 +14,9 @@
"type": "object",
"additionalProperties": False,
"properties": {
"reuse": {"type": "boolean"},
"reuse": {
"oneOf": [{"type": "boolean"}, {"type": "string", "enum": ["dependencies"]}]
},
"enable_node_namespace": {"type": "boolean"},
"targets": {
"type": "object",

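For reference, a sketch (not from the diff) of the two accepted forms of the extended option, as the parsed concretizer.yaml would look in Python:

reuse_everything = {"concretizer": {"reuse": True}}           # reuse roots and dependencies
reuse_deps_only = {"concretizer": {"reuse": "dependencies"}}  # always re-concretize roots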
View File

@@ -12,11 +12,11 @@
import spack.schema.bootstrap
import spack.schema.cdash
import spack.schema.ci
import spack.schema.compilers
import spack.schema.concretizer
import spack.schema.config
import spack.schema.container
import spack.schema.gitlab_ci
import spack.schema.mirrors
import spack.schema.modules
import spack.schema.packages
@@ -31,7 +31,7 @@
spack.schema.concretizer.properties,
spack.schema.config.properties,
spack.schema.container.properties,
spack.schema.gitlab_ci.properties,
spack.schema.ci.properties,
spack.schema.mirrors.properties,
spack.schema.modules.properties,
spack.schema.packages.properties,

View File

@@ -32,11 +32,16 @@
{
"type": "array",
"items": {
"type": "object",
"properties": {
"one_of": {"type": "array"},
"any_of": {"type": "array"},
},
"oneOf": [
{
"type": "object",
"properties": {
"one_of": {"type": "array"},
"any_of": {"type": "array"},
},
},
{"type": "string"},
]
},
},
# Shorthand for a single requirement group with
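A sketch of the shorthand this enables (package name and constraints invented): a bare string inside require: now behaves like a one_of group with a single member, matching what _rules_from_requirements encodes in the solver changes further below.

shorthand = {"packages": {"mpich": {"require": ["@3.4", "%gcc"]}}}

# Equivalent long form under the schema above:
long_form = {
    "packages": {
        "mpich": {
            "require": [{"one_of": ["@3.4"]}, {"one_of": ["%gcc"]}]
        }
    }
}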

View File

@@ -818,6 +818,9 @@ def __init__(self, tests=False):
self.compiler_version_constraints = set()
self.post_facts = []
# (ID, CompilerSpec) -> dictionary of attributes
self.compiler_info = collections.defaultdict(dict)
# hashes we've already added facts for
self.seen_hashes = set()
self.reusable_and_possible = {}
@@ -942,54 +945,38 @@ def conflict_rules(self, pkg):
self.gen.fact(fn.conflict(pkg.name, trigger_id, constraint_id, conflict_msg))
self.gen.newline()
def available_compilers(self):
def compiler_facts(self):
"""Facts about available compilers."""
self.gen.h2("Available compilers")
compilers = self.possible_compilers
indexed_possible_compilers = list(enumerate(self.possible_compilers))
for compiler_id, compiler in indexed_possible_compilers:
self.gen.fact(fn.compiler_id(compiler_id))
self.gen.fact(fn.compiler_name(compiler_id, compiler.spec.name))
self.gen.fact(fn.compiler_version(compiler_id, compiler.spec.version))
compiler_versions = collections.defaultdict(lambda: set())
for compiler in compilers:
compiler_versions[compiler.name].add(compiler.version)
if compiler.operating_system:
self.gen.fact(fn.compiler_os(compiler_id, compiler.operating_system))
for compiler in sorted(compiler_versions):
for v in sorted(compiler_versions[compiler]):
self.gen.fact(fn.compiler_version(compiler, v))
if compiler.target is not None:
self.gen.fact(fn.compiler_target(compiler_id, compiler.target))
for flag_type, flags in compiler.flags.items():
for flag in flags:
self.gen.fact(fn.compiler_flag(compiler_id, flag_type, flag))
self.gen.newline()
def compiler_defaults(self):
"""Set compiler defaults, given a list of possible compilers."""
self.gen.h2("Default compiler preferences")
# Set compiler defaults, given a list of possible compilers
self.gen.h2("Default compiler preferences (CompilerID, Weight)")
compiler_list = self.possible_compilers.copy()
compiler_list = sorted(compiler_list, key=lambda x: (x.name, x.version), reverse=True)
ppk = spack.package_prefs.PackagePrefs("all", "compiler", all=False)
matches = sorted(compiler_list, key=ppk)
matches = sorted(indexed_possible_compilers, key=lambda x: ppk(x[1].spec))
for i, cspec in enumerate(matches):
f = fn.default_compiler_preference(cspec.name, cspec.version, i)
for weight, (compiler_id, cspec) in enumerate(matches):
f = fn.default_compiler_preference(compiler_id, weight)
self.gen.fact(f)
# Enumerate target families. This may be redundant, but compilers with
# custom versions will be able to concretize properly.
for entry in spack.compilers.all_compilers_config():
compiler_entry = entry["compiler"]
cspec = spack.spec.CompilerSpec(compiler_entry["spec"])
if not compiler_entry.get("target", None):
continue
self.gen.fact(
fn.compiler_supports_target(cspec.name, cspec.version, compiler_entry["target"])
)
def compiler_supports_os(self):
compilers_yaml = spack.compilers.all_compilers_config()
for entry in compilers_yaml:
c = spack.spec.CompilerSpec(entry["compiler"]["spec"])
operating_system = entry["compiler"]["operating_system"]
self.gen.fact(fn.compiler_supports_os(c.name, c.version, operating_system))
def package_compiler_defaults(self, pkg):
"""Facts about packages' compiler prefs."""
@@ -998,14 +985,16 @@ def package_compiler_defaults(self, pkg):
if not pkg_prefs or "compiler" not in pkg_prefs:
return
compiler_list = self.possible_compilers.copy()
compiler_list = self.possible_compilers
compiler_list = sorted(compiler_list, key=lambda x: (x.name, x.version), reverse=True)
ppk = spack.package_prefs.PackagePrefs(pkg.name, "compiler", all=False)
matches = sorted(compiler_list, key=ppk)
matches = sorted(compiler_list, key=lambda x: ppk(x.spec))
for i, cspec in enumerate(reversed(matches)):
for i, compiler in enumerate(reversed(matches)):
self.gen.fact(
fn.node_compiler_preference(pkg.name, cspec.name, cspec.version, -i * 100)
fn.node_compiler_preference(
pkg.name, compiler.spec.name, compiler.spec.version, -i * 100
)
)
def package_requirement_rules(self, pkg):
@@ -1028,9 +1017,14 @@ def _rules_from_requirements(self, pkg_name, requirements):
else:
rules = []
for requirement in requirements:
for policy in ("one_of", "any_of"):
if policy in requirement:
rules.append((pkg_name, policy, requirement[policy]))
if isinstance(requirement, str):
# A string represents a spec that must be satisfied. It is
# equivalent to a one_of group with a single element
rules.append((pkg_name, "one_of", [requirement]))
else:
for policy in ("one_of", "any_of"):
if policy in requirement:
rules.append((pkg_name, policy, requirement[policy]))
return rules
def pkg_rules(self, pkg, tests):
@@ -1392,23 +1386,6 @@ def target_preferences(self, pkg_name):
fn.target_weight(pkg_name, str(preferred.architecture.target), i + offset)
)
def flag_defaults(self):
self.gen.h2("Compiler flag defaults")
# types of flags that can be on specs
for flag in spack.spec.FlagMap.valid_compiler_flags():
self.gen.fact(fn.flag_type(flag))
self.gen.newline()
# flags from compilers.yaml
compilers = all_compilers_in_config()
for compiler in compilers:
for name, flags in compiler.flags.items():
for flag in flags:
self.gen.fact(
fn.compiler_version_flag(compiler.name, compiler.version, name, flag)
)
def spec_clauses(self, *args, **kwargs):
"""Wrap a call to `_spec_clauses()` into a try/except block that
raises a comprehensible error message in case of failure.
@@ -1458,6 +1435,7 @@ class Head(object):
node_compiler = fn.attr("node_compiler_set")
node_compiler_version = fn.attr("node_compiler_version_set")
node_flag = fn.attr("node_flag_set")
node_flag_source = fn.attr("node_flag_source")
node_flag_propagate = fn.attr("node_flag_propagate")
variant_propagate = fn.attr("variant_propagate")
@@ -1471,6 +1449,7 @@ class Body(object):
node_compiler = fn.attr("node_compiler")
node_compiler_version = fn.attr("node_compiler_version")
node_flag = fn.attr("node_flag")
node_flag_source = fn.attr("node_flag_source")
node_flag_propagate = fn.attr("node_flag_propagate")
variant_propagate = fn.attr("variant_propagate")
@@ -1552,6 +1531,7 @@ class Body(object):
for flag_type, flags in spec.compiler_flags.items():
for flag in flags:
clauses.append(f.node_flag(spec.name, flag_type, flag))
clauses.append(f.node_flag_source(spec.name, flag_type, spec.name))
if not spec.concrete and flag.propagate is True:
clauses.append(f.node_flag_propagate(spec.name, flag_type))
@@ -1762,8 +1742,6 @@ def target_defaults(self, specs):
if granularity == "generic":
candidate_targets = [t for t in candidate_targets if t.vendor == "generic"]
compilers = self.possible_compilers
# Add targets explicitly requested from specs
for spec in specs:
if not spec.architecture or not spec.architecture.target:
@@ -1780,8 +1758,14 @@ def target_defaults(self, specs):
if ancestor not in candidate_targets:
candidate_targets.append(ancestor)
best_targets = set([uarch.family.name])
for compiler in sorted(compilers):
best_targets = {uarch.family.name}
for compiler_id, compiler in enumerate(self.possible_compilers):
# Stub support for cross-compilation, to be expanded later
if compiler.target is not None and compiler.target != str(uarch.family):
self.gen.fact(fn.compiler_supports_target(compiler_id, compiler.target))
self.gen.newline()
continue
supported = self._supported_targets(compiler.name, compiler.version, candidate_targets)
# If we can't find supported targets it may be due to custom
@@ -1789,10 +1773,8 @@ def target_defaults(self, specs):
# real_version from the compiler object to get more accurate
# results.
if not supported:
compiler_obj = spack.compilers.compilers_for_spec(compiler)
compiler_obj = compiler_obj[0]
supported = self._supported_targets(
compiler.name, compiler_obj.real_version, candidate_targets
compiler.name, compiler.real_version, candidate_targets
)
if not supported:
@@ -1800,20 +1782,19 @@ def target_defaults(self, specs):
for target in supported:
best_targets.add(target.name)
self.gen.fact(
fn.compiler_supports_target(compiler.name, compiler.version, target.name)
)
self.gen.fact(fn.compiler_supports_target(compiler_id, target.name))
self.gen.fact(
fn.compiler_supports_target(compiler.name, compiler.version, uarch.family.name)
)
self.gen.fact(fn.compiler_supports_target(compiler_id, uarch.family.name))
self.gen.newline()
i = 0 # TODO compute per-target offset?
for target in candidate_targets:
self.gen.fact(fn.target(target.name))
self.gen.fact(fn.target_family(target.name, target.family.name))
for parent in sorted(target.parents):
self.gen.fact(fn.target_parent(target.name, parent.name))
self.gen.fact(fn.target_compatible(target.name, target.name))
# Code for ancestor can run on target
for ancestor in target.ancestors:
self.gen.fact(fn.target_compatible(target.name, ancestor.name))
# prefer best possible targets; weight others poorly so
# they're not used unless set explicitly
@@ -1824,10 +1805,10 @@ def target_defaults(self, specs):
i += 1
else:
self.default_targets.append((100, target.name))
self.default_targets = list(sorted(set(self.default_targets)))
self.gen.newline()
self.default_targets = list(sorted(set(self.default_targets)))
def virtual_providers(self):
self.gen.h2("Virtual providers")
msg = (
@@ -1843,6 +1824,22 @@ def virtual_providers(self):
def generate_possible_compilers(self, specs):
compilers = all_compilers_in_config()
# Search for compilers which differs only by aspects that are
# not selectable by users using the spec syntax
seen, sanitized_list = set(), []
for compiler in compilers:
key = compiler.spec, compiler.operating_system, compiler.target
if key in seen:
warnings.warn(
f"duplicate found for {compiler.spec} on "
f"{compiler.operating_system}/{compiler.target}. "
f"Edit your compilers.yaml configuration to remove it."
)
continue
sanitized_list.append(compiler)
seen.add(key)
cspecs = set([c.spec for c in compilers])
# add compiler specs from the input line to possibilities if we
@@ -1863,15 +1860,27 @@ def generate_possible_compilers(self, specs):
# Allow unknown compilers to exist if the associated spec
# is already built
else:
cspecs.add(s.compiler)
compiler_cls = spack.compilers.class_for_compiler_name(s.compiler.name)
compilers.append(
compiler_cls(
s.compiler, operating_system=None, target=None, paths=[None] * 4
)
)
self.gen.fact(fn.allow_compiler(s.compiler.name, s.compiler.version))
return cspecs
return list(
sorted(
compilers,
key=lambda compiler: (compiler.spec.name, compiler.spec.version),
reverse=True,
)
)
def define_version_constraints(self):
"""Define what version_satisfies(...) means in ASP logic."""
for pkg_name, versions in sorted(self.version_constraints):
# version must be *one* of the ones the spec allows.
# Also, "possible versions" contain only concrete versions, so satisfies is appropriate
allowed_versions = [
v for v in sorted(self.possible_versions[pkg_name]) if v.satisfies(versions)
]
@@ -1923,14 +1932,12 @@ def versions_for(v):
self.possible_versions[pkg_name].add(version)
def define_compiler_version_constraints(self):
compiler_list = spack.compilers.all_compiler_specs()
compiler_list = list(sorted(set(compiler_list)))
for constraint in sorted(self.compiler_version_constraints):
for compiler in compiler_list:
if compiler.satisfies(constraint):
for compiler_id, compiler in enumerate(self.possible_compilers):
if compiler.spec.satisfies(constraint):
self.gen.fact(
fn.compiler_version_satisfies(
constraint.name, constraint.versions, compiler.version
constraint.name, constraint.versions, compiler_id
)
)
self.gen.newline()
@@ -2091,10 +2098,13 @@ def setup(self, driver, specs, reuse=None):
for reusable_spec in reuse:
self._facts_from_concrete_spec(reusable_spec, possible)
self.gen.h1("Possible flags on nodes")
for flag in spack.spec.FlagMap.valid_compiler_flags():
self.gen.fact(fn.flag_type(flag))
self.gen.newline()
self.gen.h1("General Constraints")
self.available_compilers()
self.compiler_defaults()
self.compiler_supports_os()
self.compiler_facts()
# architecture defaults
self.platform_defaults()
@@ -2105,7 +2115,6 @@ def setup(self, driver, specs, reuse=None):
self.provider_defaults()
self.provider_requirements()
self.external_packages()
self.flag_defaults()
self.gen.h1("Package Constraints")
for pkg in sorted(self.pkgs):
@@ -2176,6 +2185,7 @@ def __init__(self, specs, hash_lookup=None):
self._specs = {}
self._result = None
self._command_line_specs = specs
self._hash_specs = []
self._flag_sources = collections.defaultdict(lambda: set())
self._flag_compiler_defaults = set()
@@ -2186,6 +2196,7 @@ def __init__(self, specs, hash_lookup=None):
def hash(self, pkg, h):
if pkg not in self._specs:
self._specs[pkg] = self._hash_lookup[h]
self._hash_specs.append(pkg)
def node(self, pkg):
if pkg not in self._specs:
@@ -2284,10 +2295,11 @@ def reorder_flags(self):
flags will appear last on the compile line, in the order they
were specified.
The solver determines wihch flags are on nodes; this routine
The solver determines which flags are on nodes; this routine
imposes order afterwards.
"""
compilers = dict((c.spec, c) for c in all_compilers_in_config())
# reverse compilers so we get highest priority compilers that share a spec
compilers = dict((c.spec, c) for c in reversed(all_compilers_in_config()))
cmd_specs = dict((s.name, s) for spec in self._command_line_specs for s in spec.traverse())
for spec in self._specs.values():
@@ -2310,8 +2322,8 @@ def reorder_flags(self):
)
# add flags from each source, lowest to highest precedence
for source_name in sorted_sources:
source = cmd_specs[source_name]
for name in sorted_sources:
source = self._specs[name] if name in self._hash_specs else cmd_specs[name]
extend_flag_list(from_sources, source.compiler_flags.get(flag_type, []))
# compiler flags from compilers config are lowest precedence
@@ -2386,10 +2398,12 @@ def build_specs(self, function_tuples):
continue
# if we've already gotten a concrete spec for this pkg,
# do not bother calling actions on it.
# do not bother calling actions on it except for node_flag_source,
# since node_flag_source is tracking information not in the spec itself
spec = self._specs.get(pkg)
if spec and spec.concrete:
continue
if name != "node_flag_source":
continue
action(*args)
@@ -2489,7 +2503,7 @@ def _check_input_and_extract_concrete_specs(specs):
spack.spec.Spec.ensure_valid_variants(s)
return reusable
def _reusable_specs(self):
def _reusable_specs(self, specs):
reusable_specs = []
if self.reuse:
# Specs from the local Database
@@ -2511,6 +2525,13 @@ def _reusable_specs(self):
# TODO: update mirror configuration so it can indicate that the
# TODO: source cache (or any mirror really) doesn't have binaries.
pass
# If we only want to reuse dependencies, remove the root specs
if self.reuse == "dependencies":
reusable_specs = [
spec for spec in reusable_specs if not any(root in spec for root in specs)
]
return reusable_specs
def solve(self, specs, out=None, timers=False, stats=False, tests=False, setup_only=False):
@@ -2527,7 +2548,7 @@ def solve(self, specs, out=None, timers=False, stats=False, tests=False, setup_o
"""
# Check upfront that the variants are admissible
reusable_specs = self._check_input_and_extract_concrete_specs(specs)
reusable_specs.extend(self._reusable_specs())
reusable_specs.extend(self._reusable_specs(specs))
setup = SpackSolverSetup(tests=tests)
output = OutputConfiguration(timers=timers, stats=stats, out=out, setup_only=setup_only)
result, _, _ = self.driver.solve(setup, specs, reuse=reusable_specs, output=output)
@@ -2550,7 +2571,7 @@ def solve_in_rounds(self, specs, out=None, timers=False, stats=False, tests=Fals
tests (bool): add test dependencies to the solve
"""
reusable_specs = self._check_input_and_extract_concrete_specs(specs)
reusable_specs.extend(self._reusable_specs())
reusable_specs.extend(self._reusable_specs(specs))
setup = SpackSolverSetup(tests=tests)
# Tell clingo that we don't have to solve all the inputs at once
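A minimal illustration (mine, not PR code) of the compiler-ID indexing that compiler_facts and generate_possible_compilers above introduce: possible compilers are sorted newest-first by (name, version) and every fact then refers to the integer position in that list instead of repeating (name, version) pairs.

import collections

Compiler = collections.namedtuple("Compiler", ["name", "version"])

possible_compilers = [
    Compiler("gcc", "12.2.0"),
    Compiler("clang", "15.0.1"),
    Compiler("gcc", "11.1.0"),
]

# Same ordering as generate_possible_compilers: (name, version) descending.
ordered = sorted(possible_compilers, key=lambda c: (c.name, c.version), reverse=True)

for compiler_id, c in enumerate(ordered):
    # Facts analogous to those emitted by compiler_facts above.
    print(f"compiler_id({compiler_id}).")
    print(f'compiler_name({compiler_id}, "{c.name}").')
    print(f'compiler_version({compiler_id}, "{c.version}").')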

View File

@@ -522,7 +522,7 @@ error(2, "{0} and {1} cannot both propagate variant '{2}' to package {3} with va
attr("variant_propagate", Package, Variant, Value1, Source1),
attr("variant_propagate", Package, Variant, Value2, Source2),
variant(Package, Variant),
Value1 != Value2.
Value1 < Value2.
% a variant cannot be set if it is not a variant on the package
error(2, "Cannot set variant '{0}' for package '{1}' because the variant condition cannot be satisfied for the given spec", Variant, Package)
@@ -816,25 +816,15 @@ node_target_compatible(Package, Target)
:- attr("node_target", Package, MyTarget),
target_compatible(Target, MyTarget).
% target_compatible(T1, T2) means code for T2 can run on T1
% This order is dependent -> dependency in the node DAG, which
% is contravariant with the target DAG.
target_compatible(Target, Target) :- target(Target).
target_compatible(Child, Parent) :- target_parent(Child, Parent).
target_compatible(Descendent, Ancestor)
:- target_parent(Target, Ancestor),
target_compatible(Descendent, Target),
target(Target).
#defined target_satisfies/2.
#defined target_parent/2.
% can't use targets on node if the compiler for the node doesn't support them
error(2, "{0} compiler '{2}@{3}' incompatible with 'target={1}'", Package, Target, Compiler, Version)
:- attr("node_target", Package, Target),
not compiler_supports_target(Compiler, Version, Target),
attr("node_compiler", Package, Compiler),
attr("node_compiler_version", Package, Compiler, Version),
node_compiler(Package, CompilerID),
not compiler_supports_target(CompilerID, Target),
compiler_name(CompilerID, Compiler),
compiler_version(CompilerID, Version),
build(Package).
% if a target is set explicitly, respect it
@@ -868,32 +858,44 @@ error(2, "'{0} target={1}' is not compatible with this machine", Package, Target
%-----------------------------------------------------------------------------
% Compiler semantics
%-----------------------------------------------------------------------------
compiler(Compiler) :- compiler_version(Compiler, _).
% There must be only one compiler set per built node. The compiler
% is chosen among available versions.
{ attr("node_compiler_version", Package, Compiler, Version) : compiler_version(Compiler, Version) } :-
% There must be only one compiler set per built node.
{ node_compiler(Package, CompilerID) : compiler_id(CompilerID) } :-
attr("node", Package),
build(Package).
% Infer the compiler that matches a reused node
node_compiler(Package, CompilerID)
:- attr("node_compiler_version", Package, CompilerName, CompilerVersion),
attr("node", Package),
compiler_name(CompilerID, CompilerName),
compiler_version(CompilerID, CompilerVersion),
concrete(Package).
% Expand the internal attribute into "attr("node_compiler_version")
attr("node_compiler_version", Package, CompilerName, CompilerVersion)
:- node_compiler(Package, CompilerID),
compiler_name(CompilerID, CompilerName),
compiler_version(CompilerID, CompilerVersion),
build(Package).
attr("node_compiler", Package, CompilerName)
:- attr("node_compiler_version", Package, CompilerName, CompilerVersion).
error(2, "No valid compiler version found for '{0}'", Package)
:- attr("node", Package),
C = #count{ Version : attr("node_compiler_version", Package, _, Version)},
C < 1.
error(2, "'{0}' compiler constraints '%{1}@{2}' and '%{3}@{4}' are incompatible", Package, Compiler1, Version1, Compiler2, Version2)
:- attr("node", Package),
attr("node_compiler_version", Package, Compiler1, Version1),
attr("node_compiler_version", Package, Compiler2, Version2),
(Compiler1, Version1) < (Compiler2, Version2). % see[1]
not node_compiler(Package, _).
% Sometimes we just need to know the compiler and not the version
attr("node_compiler", Package, Compiler) :- attr("node_compiler_version", Package, Compiler, _).
error(2, "Cannot concretize {0} with two compilers {1}@{2} and {3}@{4}", Package, C1, V1, C2, V2)
:- attr("node", Package),
attr("node_compiler_version", Package, C1, V1),
attr("node_compiler_version", Package, C2, V2),
(C1, V1) < (C2, V2). % see[1]
% We can't have a compiler be enforced and select the version from another compiler
error(2, "Cannot concretize {0} with two compilers {1}@{2} and {3}@{4}", Package, C1, V1, C2, V2)
:- attr("node_compiler_version", Package, C1, V1),
attr("node_compiler_version", Package, C2, V2),
(C1, V1) != (C2, V2).
(C1, V1) < (C2, V2).
error(2, "Cannot concretize {0} with two compilers {1} and {2}@{3}", Package, Compiler1, Compiler2, Version)
:- attr("node_compiler", Package, Compiler1),
@@ -904,37 +906,47 @@ error(2, "Cannot concretize {0} with two compilers {1} and {2}@{3}", Package, Co
error(1, "No valid compiler for {0} satisfies '%{1}'", Package, Compiler)
:- attr("node", Package),
attr("node_compiler_version_satisfies", Package, Compiler, ":"),
C = #count{ Version : attr("node_compiler_version", Package, Compiler, Version), compiler_version_satisfies(Compiler, ":", Version) },
C < 1.
not compiler_version_satisfies(Compiler, ":", _).
% If the compiler of a node must satisfy a constraint, then its version
% must be chosen among the ones that satisfy said constraint
error(2, "No valid version for '{0}' compiler '{1}' satisfies '@{2}'", Package, Compiler, Constraint)
:- attr("node", Package),
attr("node_compiler_version_satisfies", Package, Compiler, Constraint),
C = #count{ Version : attr("node_compiler_version", Package, Compiler, Version), compiler_version_satisfies(Compiler, Constraint, Version) },
C < 1.
not compiler_version_satisfies(Compiler, Constraint, _).
error(2, "No valid version for '{0}' compiler '{1}' satisfies '@{2}'", Package, Compiler, Constraint)
:- attr("node", Package),
attr("node_compiler_version_satisfies", Package, Compiler, Constraint),
not compiler_version_satisfies(Compiler, Constraint, ID),
node_compiler(Package, ID).
% If the node is associated with a compiler and the compiler satisfies a constraint, then
% the compiler associated with the node satisfies the same constraint
attr("node_compiler_version_satisfies", Package, Compiler, Constraint)
:- attr("node_compiler_version", Package, Compiler, Version),
compiler_version_satisfies(Compiler, Constraint, Version).
:- node_compiler(Package, CompilerID),
compiler_name(CompilerID, Compiler),
compiler_version_satisfies(Compiler, Constraint, CompilerID).
#defined compiler_version_satisfies/3.
% If the compiler version was set from the command line,
% respect it verbatim
attr("node_compiler_version", Package, Compiler, Version) :-
attr("node_compiler_version_set", Package, Compiler, Version).
:- attr("node_compiler_version_set", Package, Compiler, Version),
not attr("node_compiler_version", Package, Compiler, Version).
:- attr("node_compiler_set", Package, Compiler),
not attr("node_compiler_version", Package, Compiler, _).
% Cannot select a compiler if it is not supported on the OS
% Compilers that are explicitly marked as allowed
% are excluded from this check
error(2, "{0} compiler '%{1}@{2}' incompatible with 'os={3}'", Package, Compiler, Version, OS)
:- attr("node_compiler_version", Package, Compiler, Version),
attr("node_os", Package, OS),
not compiler_supports_os(Compiler, Version, OS),
:- attr("node_os", Package, OS),
node_compiler(Package, CompilerID),
compiler_name(CompilerID, Compiler),
compiler_version(CompilerID, Version),
not compiler_os(CompilerID, OS),
not allow_compiler(Compiler, Version),
build(Package).
@@ -942,8 +954,8 @@ error(2, "{0} compiler '%{1}@{2}' incompatible with 'os={3}'", Package, Compiler
% same compiler there's a mismatch.
compiler_match(Package, Dependency)
:- depends_on(Package, Dependency),
attr("node_compiler_version", Package, Compiler, Version),
attr("node_compiler_version", Dependency, Compiler, Version).
node_compiler(Package, CompilerID),
node_compiler(Dependency, CompilerID).
compiler_mismatch(Package, Dependency)
:- depends_on(Package, Dependency),
@@ -955,25 +967,32 @@ compiler_mismatch_required(Package, Dependency)
attr("node_compiler_set", Dependency, _),
not compiler_match(Package, Dependency).
#defined compiler_supports_os/3.
#defined compiler_os/3.
#defined allow_compiler/2.
% compilers weighted by preference according to packages.yaml
compiler_weight(Package, Weight)
:- attr("node_compiler_version", Package, Compiler, V),
:- node_compiler(Package, CompilerID),
compiler_name(CompilerID, Compiler),
compiler_version(CompilerID, V),
node_compiler_preference(Package, Compiler, V, Weight).
compiler_weight(Package, Weight)
:- attr("node_compiler_version", Package, Compiler, V),
:- node_compiler(Package, CompilerID),
compiler_name(CompilerID, Compiler),
compiler_version(CompilerID, V),
not node_compiler_preference(Package, Compiler, V, _),
default_compiler_preference(Compiler, V, Weight).
default_compiler_preference(CompilerID, Weight).
compiler_weight(Package, 100)
:- attr("node_compiler_version", Package, Compiler, Version),
not node_compiler_preference(Package, Compiler, Version, _),
not default_compiler_preference(Compiler, Version, _).
:- node_compiler(Package, CompilerID),
compiler_name(CompilerID, Compiler),
compiler_version(CompilerID, V),
not node_compiler_preference(Package, Compiler, V, _),
not default_compiler_preference(CompilerID, _).
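
The three compiler_weight rules form a fallback chain: a package-level preference from packages.yaml wins, otherwise the global default preference applies, otherwise the weight is 100. A plain-Python analogue of that chain (illustrative only; the dictionaries stand in for the preference facts):

def compiler_weight(pkg_prefs, default_prefs, package, compiler_id):
    # 1) package-specific preference, 2) global default, 3) fallback of 100
    if (package, compiler_id) in pkg_prefs:
        return pkg_prefs[(package, compiler_id)]
    return default_prefs.get(compiler_id, 100)

assert compiler_weight({("zlib", 0): 5}, {0: 20}, "zlib", 0) == 5
assert compiler_weight({}, {0: 20}, "zlib", 0) == 20
assert compiler_weight({}, {}, "zlib", 1) == 100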
% For the time being, be strict and reuse only if the compiler match one we have on the system
error(2, "Compiler {1}@{2} requested for {0} cannot be found. Set install_missing_compilers:true if intended.", Package, Compiler, Version)
:- attr("node_compiler_version", Package, Compiler, Version), not compiler_version(Compiler, Version).
:- attr("node_compiler_version", Package, Compiler, Version),
not node_compiler(Package, _).
#defined node_compiler_preference/4.
#defined default_compiler_preference/3.
@@ -985,10 +1004,11 @@ error(2, "Compiler {1}@{2} requested for {0} cannot be found. Set install_missin
% propagate flags when compiler match
can_inherit_flags(Package, Dependency, FlagType)
:- depends_on(Package, Dependency),
attr("node_compiler", Package, Compiler),
attr("node_compiler", Dependency, Compiler),
node_compiler(Package, CompilerID),
node_compiler(Dependency, CompilerID),
not attr("node_flag_set", Dependency, FlagType, _),
compiler(Compiler), flag_type(FlagType).
compiler_id(CompilerID),
flag_type(FlagType).
node_flag_inherited(Dependency, FlagType, Flag)
:- attr("node_flag_set", Package, FlagType, Flag), can_inherit_flags(Package, Dependency, FlagType),
@@ -1005,7 +1025,7 @@ error(2, "{0} and {1} cannot both propagate compiler flags '{2}' to {3}", Source
attr("node_flag_propagate", Source2, FlagType),
can_inherit_flags(Source1, Package, FlagType),
can_inherit_flags(Source2, Package, FlagType),
Source1 != Source2.
Source1 < Source2.
% remember where flags came from
attr("node_flag_source", Package, FlagType, Package) :- attr("node_flag_set", Package, FlagType, _).
@@ -1015,19 +1035,21 @@ attr("node_flag_source", Dependency, FlagType, Q)
% compiler flags from compilers.yaml are put on nodes if compiler matches
attr("node_flag", Package, FlagType, Flag)
:- compiler_version_flag(Compiler, Version, FlagType, Flag),
attr("node_compiler_version", Package, Compiler, Version),
:- compiler_flag(CompilerID, FlagType, Flag),
node_compiler(Package, CompilerID),
flag_type(FlagType),
compiler(Compiler),
compiler_version(Compiler, Version).
compiler_id(CompilerID),
compiler_name(CompilerID, CompilerName),
compiler_version(CompilerID, Version).
attr("node_flag_compiler_default", Package)
:- not attr("node_flag_set", Package, FlagType, _),
compiler_version_flag(Compiler, Version, FlagType, Flag),
attr("node_compiler_version", Package, Compiler, Version),
compiler_flag(CompilerID, FlagType, Flag),
node_compiler(Package, CompilerID),
flag_type(FlagType),
compiler(Compiler),
compiler_version(Compiler, Version).
compiler_id(CompilerID),
compiler_name(CompilerID, CompilerName),
compiler_version(CompilerID, Version).
% if a flag is set to something or inherited, it's included
attr("node_flag", Package, FlagType, Flag) :- attr("node_flag_set", Package, FlagType, Flag).
@@ -1038,7 +1060,7 @@ attr("node_flag", Package, FlagType, Flag)
attr("no_flags", Package, FlagType)
:- not attr("node_flag", Package, FlagType, _), attr("node", Package), flag_type(FlagType).
#defined compiler_version_flag/4.
#defined compiler_flag/3.
%-----------------------------------------------------------------------------
@@ -1054,7 +1076,7 @@ attr("no_flags", Package, FlagType)
% You can't install a hash, if it is not installed
:- attr("hash", Package, Hash), not installed_hash(Package, Hash).
% This should be redundant given the constraint above
:- attr("hash", Package, Hash1), attr("hash", Package, Hash2), Hash1 != Hash2.
:- attr("hash", Package, Hash1), attr("hash", Package, Hash2), Hash1 < Hash2.
% if a hash is selected, we impose all the constraints that implies
impose(Hash) :- attr("hash", Package, Hash).
@@ -1311,13 +1333,29 @@ opt_criterion(5, "non-preferred targets").
%-----------------
% Domain heuristic
%-----------------
#heuristic attr("version", Package, Version) : version_declared(Package, Version, 0), attr("node", Package). [10, true]
#heuristic version_weight(Package, 0) : version_declared(Package, Version, 0), attr("node", Package). [10, true]
#heuristic attr("node_target", Package, Target) : package_target_weight(Target, Package, 0), attr("node", Package). [10, true]
#heuristic node_target_weight(Package, 0) : attr("node", Package). [10, true]
#heuristic literal_solved(ID) : literal(ID). [1, sign]
#heuristic literal_solved(ID) : literal(ID). [50, init]
#heuristic attr("hash", Package, Hash) : attr("root", Package). [45, init]
#heuristic attr("version", Package, Version) : version_declared(Package, Version, 0), attr("root", Package). [40, true]
#heuristic version_weight(Package, 0) : version_declared(Package, Version, 0), attr("root", Package). [40, true]
#heuristic attr("variant_value", Package, Variant, Value) : variant_default_value(Package, Variant, Value), attr("root", Package). [40, true]
#heuristic attr("node_target", Package, Target) : package_target_weight(Target, Package, 0), attr("root", Package). [40, true]
#heuristic node_target_weight(Package, 0) : attr("root", Package). [40, true]
#heuristic node_compiler(Package, CompilerID) : default_compiler_preference(ID, 0), compiler_id(ID), attr("root", Package). [40, true]
#heuristic provider(Package, Virtual) : possible_provider_weight(Package, Virtual, 0, _), attr("virtual_node", Virtual). [30, true]
#heuristic provider_weight(Package, Virtual, 0, R) : possible_provider_weight(Package, Virtual, 0, R), attr("virtual_node", Virtual). [30, true]
#heuristic attr("node", Package) : possible_provider_weight(Package, Virtual, 0, _), attr("virtual_node", Virtual). [30, true]
#heuristic attr("version", Package, Version) : version_declared(Package, Version, 0), attr("node", Package). [20, true]
#heuristic version_weight(Package, 0) : version_declared(Package, Version, 0), attr("node", Package). [20, true]
#heuristic attr("node_target", Package, Target) : package_target_weight(Target, Package, 0), attr("node", Package). [20, true]
#heuristic node_target_weight(Package, 0) : attr("node", Package). [20, true]
#heuristic node_compiler(Package, CompilerID) : default_compiler_preference(ID, 0), compiler_id(ID), attr("node", Package). [15, true]
#heuristic attr("variant_value", Package, Variant, Value) : variant_default_value(Package, Variant, Value), attr("node", Package). [10, true]
#heuristic provider(Package, Virtual) : possible_provider_weight(Package, Virtual, 0, _), attr("virtual_node", Virtual). [10, true]
#heuristic attr("node", Package) : possible_provider_weight(Package, Virtual, 0, _), attr("virtual_node", Virtual). [10, true]
#heuristic attr("node_os", Package, OS) : buildable_os(OS). [10, true]
%-----------


@@ -54,7 +54,6 @@
import itertools
import os
import re
import sys
import warnings
from typing import Tuple
@@ -118,7 +117,6 @@
"SpecDeprecatedError",
]
is_windows = sys.platform == "win32"
#: Valid pattern for an identifier in Spack
identifier_re = r"\w[\w-]*"
@@ -193,9 +191,7 @@ def __call__(self, match):
@lang.lazy_lexicographic_ordering
class ArchSpec(object):
"""Aggregate the target platform, the operating system and the target
microarchitecture into an architecture spec..
"""
"""Aggregate the target platform, the operating system and the target microarchitecture."""
@staticmethod
def _return_arch(os_tag, target_tag):
@@ -364,17 +360,11 @@ def target_or_none(t):
self._target = value
def satisfies(self, other, strict=False):
    """Predicate to check if this spec satisfies a constraint.

    Args:
        other (ArchSpec or str): constraint on the current instance
        strict (bool): if ``False`` the function checks if the current
            instance *might* eventually satisfy the constraint. If
            ``True`` it checks if the constraint is satisfied right now.

    Returns:
        True if the constraint is satisfied, False otherwise.
    """

def satisfies(self, other: "ArchSpec") -> bool:
    """Return True if all concrete specs matching self also match other, otherwise False.

    Args:
        other: spec to be satisfied
    """
other = self._autospec(other)
@@ -382,47 +372,69 @@ def satisfies(self, other, strict=False):
for attribute in ("platform", "os"):
other_attribute = getattr(other, attribute)
self_attribute = getattr(self, attribute)
if strict or self.concrete:
if other_attribute and self_attribute != other_attribute:
return False
else:
if other_attribute and self_attribute and self_attribute != other_attribute:
return False
if other_attribute and self_attribute != other_attribute:
return False
# Check target
return self.target_satisfies(other, strict=strict)
return self._target_satisfies(other, strict=True)
def target_satisfies(self, other, strict):
need_to_check = (
bool(other.target) if strict or self.concrete else bool(other.target and self.target)
)
def intersects(self, other: "ArchSpec") -> bool:
"""Return True if there exists at least one concrete spec that matches both
self and other, otherwise False.
This operation is commutative, and if two specs intersect it means that one
can constrain the other.
Args:
other: spec to be checked for compatibility
"""
other = self._autospec(other)
# Check platform and os
for attribute in ("platform", "os"):
other_attribute = getattr(other, attribute)
self_attribute = getattr(self, attribute)
if other_attribute and self_attribute and self_attribute != other_attribute:
return False
return self._target_satisfies(other, strict=False)
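
A usage sketch of the split API, assuming spack is importable: satisfies() is now the one-way subset test, while intersects() is the commutative overlap test.

import spack.spec

concrete = spack.spec.ArchSpec(("linux", "ubuntu20.04", "icelake"))
loose = spack.spec.ArchSpec(("linux", None, None))

assert concrete.satisfies(loose)      # a fully specified arch matches the looser constraint
assert not loose.satisfies(concrete)  # the looser spec is not a subset of the stricter one
assert concrete.intersects(loose) and loose.intersects(concrete)  # commutative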
def _target_satisfies(self, other: "ArchSpec", strict: bool) -> bool:
if strict is True:
need_to_check = bool(other.target)
else:
need_to_check = bool(other.target and self.target)
# If there's no need to check we are fine
if not need_to_check:
return True
# self is not concrete, but other_target is there and strict=True
# other_target is there and strict=True
if self.target is None:
return False
return bool(self.target_intersection(other))
return bool(self._target_intersection(other))
def target_constrain(self, other):
if not other.target_satisfies(self, strict=False):
def _target_constrain(self, other: "ArchSpec") -> bool:
if not other._target_satisfies(self, strict=False):
raise UnsatisfiableArchitectureSpecError(self, other)
if self.target_concrete:
return False
elif other.target_concrete:
self.target = other.target
return True
# Compute the intersection of every combination of ranges in the lists
results = self.target_intersection(other)
# Do we need to dedupe here?
self.target = ",".join(results)

results = self._target_intersection(other)
attribute_str = ",".join(results)
if self.target == attribute_str:
    return False

self.target = attribute_str
return True

def target_intersection(self, other):
def _target_intersection(self, other):
results = []
if not self.target or not other.target:
@@ -466,7 +478,7 @@ def target_intersection(self, other):
results.append("%s:%s" % (n_min, n_max))
return results
def constrain(self, other):
def constrain(self, other: "ArchSpec") -> bool:
"""Projects all architecture fields that are specified in the given
spec onto the instance spec if they're missing from the instance
spec.
@@ -481,7 +493,7 @@ def constrain(self, other):
"""
other = self._autospec(other)
if not other.satisfies(self):
if not other.intersects(self):
raise UnsatisfiableArchitectureSpecError(other, self)
constrained = False
@@ -491,7 +503,7 @@ def constrain(self, other):
setattr(self, attr, ovalue)
constrained = True
self.target_constrain(other)
constrained |= self._target_constrain(other)
return constrained
@@ -507,7 +519,9 @@ def concrete(self):
@property
def target_concrete(self):
"""True if the target is not a range or list."""
return ":" not in str(self.target) and "," not in str(self.target)
return (
self.target is not None and ":" not in str(self.target) and "," not in str(self.target)
)
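
With the added ``is not None`` guard, an ArchSpec without a target no longer reports a concrete target. A small sketch, assuming spack is importable and that target ranges use the ``a:b`` syntax seen elsewhere in this diff:

import spack.spec

assert spack.spec.ArchSpec(("linux", "ubuntu20.04", "icelake")).target_concrete
assert not spack.spec.ArchSpec(("linux", "ubuntu20.04", "icelake:")).target_concrete  # range
assert not spack.spec.ArchSpec(("linux", "ubuntu20.04", None)).target_concrete        # unset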
def to_dict(self):
d = syaml.syaml_dict(
@@ -593,11 +607,31 @@ def _autospec(self, compiler_spec_like):
return compiler_spec_like
return CompilerSpec(compiler_spec_like)
def satisfies(self, other, strict=False):
other = self._autospec(other)
return self.name == other.name and self.versions.satisfies(other.versions, strict=strict)
def intersects(self, other: "CompilerSpec") -> bool:
    """Return True if there exists at least one concrete spec that matches both
    self and other, otherwise False.

    For compiler specs this means that the name of the compiler must be the same for
    self and other, and that the version ranges should intersect.

    Args:
        other: spec to be checked for compatibility
    """
    other = self._autospec(other)
    return self.name == other.name and self.versions.intersects(other.versions)
def satisfies(self, other: "CompilerSpec") -> bool:
"""Return True if all concrete specs matching self also match other, otherwise False.
For compiler specs this means that the name of the compiler must be the same for
self and other, and that the version range of self is a subset of that of other.
Args:
other: spec to be satisfied
"""
other = self._autospec(other)
return self.name == other.name and self.versions.satisfies(other.versions)
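
A usage sketch of the two predicates on compiler specs, assuming spack is importable: names must match exactly, while versions are compared as a subset for satisfies() and as an overlap for intersects().

import spack.spec

gcc94 = spack.spec.CompilerSpec("gcc@9.4.0")
gcc9plus = spack.spec.CompilerSpec("gcc@9:")

assert gcc94.satisfies(gcc9plus)      # 9.4.0 lies inside the range 9:
assert not gcc9plus.satisfies(gcc94)  # the range is not a subset of one version
assert gcc94.intersects(gcc9plus) and gcc9plus.intersects(gcc94)
assert not gcc94.intersects(spack.spec.CompilerSpec("clang@14"))  # names differ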
def constrain(self, other):
def constrain(self, other: "CompilerSpec") -> bool:
"""Intersect self's versions with other.
Return whether the CompilerSpec changed.
@@ -605,7 +639,7 @@ def constrain(self, other):
other = self._autospec(other)
# ensure that other will actually constrain this spec.
if not other.satisfies(self):
if not other.intersects(self):
raise UnsatisfiableCompilerSpecError(other, self)
return self.versions.intersect(other.versions)
@@ -738,24 +772,25 @@ def __init__(self, spec):
super(FlagMap, self).__init__()
self.spec = spec
def satisfies(self, other, strict=False):
    if strict or (self.spec and self.spec._concrete):
        return all(f in self and set(self[f]) == set(other[f]) for f in other)
    else:
        if not all(
            set(self[f]) == set(other[f]) for f in other if (other[f] != [] and f in self)
        ):
            return False

        # Check that the propagation values match
        for flag_type in other:
            if not all(
                other[flag_type][i].propagate == self[flag_type][i].propagate
                for i in range(len(other[flag_type]))
                if flag_type in self
            ):
                return False
        return True

def satisfies(self, other):
    return all(f in self and self[f] == other[f] for f in other)

def intersects(self, other):
    common_types = set(self) & set(other)
    for flag_type in common_types:
        if not self[flag_type] or not other[flag_type]:
            # At least one of the two is empty
            continue
        if self[flag_type] != other[flag_type]:
            return False
        if not all(
            f1.propagate == f2.propagate for f1, f2 in zip(self[flag_type], other[flag_type])
        ):
            # At least one propagation flag didn't match
            return False
    return True
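
A simplified pure-Python analogue of the two new predicates over plain dicts of flag lists (it ignores flag propagation, which the real FlagMap also compares): satisfies() demands other's flags verbatim, while intersects() only rejects flag types that both sides constrain differently.

def flags_satisfy(mine, other):
    return all(f in mine and mine[f] == other[f] for f in other)

def flags_intersect(mine, other):
    return all(
        mine[f] == other[f]
        for f in set(mine) & set(other)
        if mine[f] and other[f]  # a side with no constraint can't conflict
    )

assert flags_satisfy({"cflags": ["-O2"]}, {"cflags": ["-O2"]})
assert not flags_satisfy({}, {"cflags": ["-O2"]})
assert flags_intersect({}, {"cflags": ["-O2"]})
assert not flags_intersect({"cflags": ["-O3"]}, {"cflags": ["-O2"]})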
def constrain(self, other):
"""Add all flags in other that aren't in self to self.
@@ -2613,9 +2648,9 @@ def _old_concretize(self, tests=False, deprecation_warning=True):
# it's possible to build that configuration with Spack
continue
for conflict_spec, when_list in x.package_class.conflicts.items():
if x.satisfies(conflict_spec, strict=True):
if x.satisfies(conflict_spec):
for when_spec, msg in when_list:
if x.satisfies(when_spec, strict=True):
if x.satisfies(when_spec):
when = when_spec.copy()
when.name = x.name
matches.append((x, conflict_spec, when, msg))
@@ -2667,7 +2702,7 @@ def inject_patches_variant(root):
# Add any patches from the package to the spec.
patches = []
for cond, patch_list in s.package_class.patches.items():
if s.satisfies(cond, strict=True):
if s.satisfies(cond):
for patch in patch_list:
patches.append(patch)
if patches:
@@ -2685,7 +2720,7 @@ def inject_patches_variant(root):
patches = []
for cond, dependency in pkg_deps[dspec.spec.name].items():
for pcond, patch_list in dependency.patches.items():
if dspec.parent.satisfies(cond, strict=True) and dspec.spec.satisfies(pcond):
if dspec.parent.satisfies(cond) and dspec.spec.satisfies(pcond):
patches.extend(patch_list)
if patches:
all_patches = spec_to_patches.setdefault(id(dspec.spec), [])
@@ -2943,7 +2978,7 @@ def _evaluate_dependency_conditions(self, name):
# evaluate when specs to figure out constraints on the dependency.
dep = None
for when_spec, dependency in conditions.items():
if self.satisfies(when_spec, strict=True):
if self.satisfies(when_spec):
if dep is None:
dep = dp.Dependency(self.name, Spec(name), type=())
try:
@@ -2978,7 +3013,7 @@ def _find_provider(self, vdep, provider_index):
# result.
for provider in providers:
for spec in providers:
if spec is not provider and provider.satisfies(spec):
if spec is not provider and provider.intersects(spec):
providers.remove(spec)
# Can't have multiple providers for the same thing in one spec.
if len(providers) > 1:
@@ -3295,9 +3330,15 @@ def update_variant_validate(self, variant_name, values):
pkg_variant.validate_or_raise(self.variants[variant_name], pkg_cls)
def constrain(self, other, deps=True):
"""Merge the constraints of other with self.
"""Intersect self with other in-place. Return True if self changed, False otherwise.
Returns True if the spec changed as a result, False if not.
Args:
other: constraint to be added to self
deps: if False, constrain only the root node, otherwise constrain dependencies
as well.
Raises:
spack.error.UnsatisfiableSpecError: when self cannot be constrained
"""
# If we are trying to constrain a concrete spec, either the spec
# already satisfies the constraint (and the method returns False)
@@ -3377,6 +3418,9 @@ def constrain(self, other, deps=True):
if deps:
changed |= self._constrain_dependencies(other)
if other.concrete and not self.concrete and other.satisfies(self):
self._finalize_concretization()
return changed
def _constrain_dependencies(self, other):
@@ -3389,7 +3433,7 @@ def _constrain_dependencies(self, other):
# TODO: might want more detail than this, e.g. specific deps
# in violation. if this becomes a priority get rid of this
# check and be more specific about what's wrong.
if not other.satisfies_dependencies(self):
if not other._intersects_dependencies(self):
raise UnsatisfiableDependencySpecError(other, self)
if any(not d.name for d in other.traverse(root=False)):
@@ -3451,58 +3495,49 @@ def _autospec(self, spec_like):
return spec_like
return Spec(spec_like)
def satisfies(self, other, deps=True, strict=False):
    """Determine if this spec satisfies all constraints of another.

    There are two senses for satisfies, depending on the ``strict``
    argument.

    * ``strict=False``: the left-hand side and right-hand side have
      non-empty intersection. For example ``zlib`` satisfies
      ``zlib@1.2.3`` and ``zlib@1.2.3`` satisfies ``zlib``. In this
      sense satisfies is a commutative operation: ``x.satisfies(y)``
      if and only if ``y.satisfies(x)``.
    * ``strict=True``: the left-hand side is a subset of the right-hand
      side. For example ``zlib@1.2.3`` satisfies ``zlib``, but ``zlib``
      does not satisfy ``zlib@1.2.3``. In this sense satisfies is not
      commutative: the left-hand side should be at least as constrained
      as the right-hand side.
    """

def intersects(self, other: "Spec", deps: bool = True) -> bool:
    """Return True if there exists at least one concrete spec that matches both
    self and other, otherwise False.

    This operation is commutative, and if two specs intersect it means that one
    can constrain the other.

    Args:
        other: spec to be checked for compatibility
        deps: if True check compatibility of dependency nodes too, if False only check root
    """
other = self._autospec(other)
# Optimizations for right-hand side concrete:
# 1. For subset (strict=True) tests this means the left-hand side must
# be the same singleton with identical hash. Notice that package hashes
# can be different for otherwise indistinguishable concrete Spec objects.
# 2. For non-empty intersection (strict=False) we only have a fast path
# when the left-hand side is also concrete.
if other.concrete:
if strict:
return self.concrete and self.dag_hash() == other.dag_hash()
elif self.concrete:
return self.dag_hash() == other.dag_hash()
if other.concrete and self.concrete:
return self.dag_hash() == other.dag_hash()
# If the names are different, we need to consider virtuals
if self.name != other.name and self.name and other.name:
# A concrete provider can satisfy a virtual dependency.
if not self.virtual and other.virtual:
if self.virtual and other.virtual:
# Two virtual specs intersect only if there are providers for both
lhs = spack.repo.path.providers_for(str(self))
rhs = spack.repo.path.providers_for(str(other))
intersection = [s for s in lhs if any(s.intersects(z) for z in rhs)]
return bool(intersection)
# A provider can satisfy a virtual dependency.
elif self.virtual or other.virtual:
virtual_spec, non_virtual_spec = (self, other) if self.virtual else (other, self)
try:
# Here we might get an abstract spec
pkg_cls = spack.repo.path.get_pkg_class(self.fullname)
pkg = pkg_cls(self)
pkg_cls = spack.repo.path.get_pkg_class(non_virtual_spec.fullname)
pkg = pkg_cls(non_virtual_spec)
except spack.repo.UnknownEntityError:
# If we can't get package info on this spec, don't treat
# it as a provider of this vdep.
return False
if pkg.provides(other.name):
if pkg.provides(virtual_spec.name):
for provided, when_specs in pkg.provided.items():
if any(
self.satisfies(when, deps=False, strict=strict) for when in when_specs
non_virtual_spec.intersects(when, deps=False) for when in when_specs
):
if provided.satisfies(other):
if provided.intersects(virtual_spec):
return True
return False
@@ -3513,75 +3548,41 @@ def satisfies(self, other, deps=True, strict=False):
and self.namespace != other.namespace
):
return False
if self.versions and other.versions:
if not self.versions.satisfies(other.versions, strict=strict):
if not self.versions.intersects(other.versions):
return False
elif strict and (self.versions or other.versions):
return False
# None indicates no constraints when not strict.
if self.compiler and other.compiler:
if not self.compiler.satisfies(other.compiler, strict=strict):
if not self.compiler.intersects(other.compiler):
return False
elif strict and (other.compiler and not self.compiler):
if not self.variants.intersects(other.variants):
return False
var_strict = strict
if (not self.name) or (not other.name):
var_strict = True
if not self.variants.satisfies(other.variants, strict=var_strict):
return False
# Architecture satisfaction is currently just string equality.
# If not strict, None means unconstrained.
if self.architecture and other.architecture:
if not self.architecture.satisfies(other.architecture, strict):
if not self.architecture.intersects(other.architecture):
return False
elif strict and (other.architecture and not self.architecture):
return False
if not self.compiler_flags.satisfies(other.compiler_flags, strict=strict):
if not self.compiler_flags.intersects(other.compiler_flags):
return False
# If we need to descend into dependencies, do it, otherwise we're done.
if deps:
deps_strict = strict
if self._concrete and not other.name:
# We're dealing with existing specs
deps_strict = True
return self.satisfies_dependencies(other, strict=deps_strict)
return self._intersects_dependencies(other)
else:
return True
def satisfies_dependencies(self, other, strict=False):
"""
This checks constraints on common dependencies against each other.
"""
def _intersects_dependencies(self, other):
other = self._autospec(other)
# If there are no constraints to satisfy, we're done.
if not other._dependencies:
return True
if strict:
# if we have no dependencies, we can't satisfy any constraints.
if not self._dependencies:
return False
# use list to prevent double-iteration
selfdeps = list(self.traverse(root=False))
otherdeps = list(other.traverse(root=False))
if not all(any(d.satisfies(dep, strict=True) for d in selfdeps) for dep in otherdeps):
return False
elif not self._dependencies:
# if not strict, this spec *could* eventually satisfy the
# constraints on other.
if not other._dependencies or not self._dependencies:
# one spec *could* eventually satisfy the other
return True
# Handle first-order constraints directly
for name in self.common_dependencies(other):
if not self[name].satisfies(other[name], deps=False):
if not self[name].intersects(other[name], deps=False):
return False
# For virtual dependencies, we need to dig a little deeper.
@@ -3609,6 +3610,89 @@ def satisfies_dependencies(self, other, strict=False):
return True
def satisfies(self, other: "Spec", deps: bool = True) -> bool:
"""Return True if all concrete specs matching self also match other, otherwise False.
Args:
other: spec to be satisfied
deps: if True descend to dependencies, otherwise only check root node
"""
other = self._autospec(other)
if other.concrete:
# The left-hand side must be the same singleton with identical hash. Notice that
# package hashes can be different for otherwise indistinguishable concrete Spec
# objects.
return self.concrete and self.dag_hash() == other.dag_hash()
# If the names are different, we need to consider virtuals
if self.name != other.name and self.name and other.name:
# A concrete provider can satisfy a virtual dependency.
if not self.virtual and other.virtual:
try:
# Here we might get an abstract spec
pkg_cls = spack.repo.path.get_pkg_class(self.fullname)
pkg = pkg_cls(self)
except spack.repo.UnknownEntityError:
# If we can't get package info on this spec, don't treat
# it as a provider of this vdep.
return False
if pkg.provides(other.name):
for provided, when_specs in pkg.provided.items():
if any(self.satisfies(when, deps=False) for when in when_specs):
if provided.intersects(other):
return True
return False
# namespaces either match, or other doesn't require one.
if (
other.namespace is not None
and self.namespace is not None
and self.namespace != other.namespace
):
return False
if not self.versions.satisfies(other.versions):
return False
if self.compiler and other.compiler:
if not self.compiler.satisfies(other.compiler):
return False
elif other.compiler and not self.compiler:
return False
if not self.variants.satisfies(other.variants):
return False
if self.architecture and other.architecture:
if not self.architecture.satisfies(other.architecture):
return False
elif other.architecture and not self.architecture:
return False
if not self.compiler_flags.satisfies(other.compiler_flags):
return False
# If we need to descend into dependencies, do it, otherwise we're done.
if not deps:
return True
# If there are no constraints to satisfy, we're done.
if not other._dependencies:
return True
# If we have no dependencies, we can't satisfy any constraints.
if not self._dependencies:
return False
# If we arrived here, then rhs is abstract. At the moment we don't care about the edge
# structure of an abstract DAG - hence the deps=False parameter.
return all(
any(lhs.satisfies(rhs, deps=False) for lhs in self.traverse(root=False))
for rhs in other.traverse(root=False)
)
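
Taken together with intersects() above, this splits the old strict/non-strict behavior into two named methods. A usage sketch, assuming spack is importable:

import spack.spec

constrained = spack.spec.Spec("zlib@1.2.13")
anything = spack.spec.Spec("zlib")

assert constrained.satisfies(anything)      # more constrained matches less constrained
assert not anything.satisfies(constrained)  # satisfies is now a one-way subset test
assert constrained.intersects(anything) and anything.intersects(constrained)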
def virtual_dependencies(self):
"""Return list of any virtual deps in this spec."""
return [spec for spec in self.traverse() if spec.virtual]


@@ -3,7 +3,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import functools
import warnings
import archspec.cpu
@@ -33,14 +32,6 @@ def _impl(self, other):
return _impl
#: Translation table from archspec deprecated names
_DEPRECATED_ARCHSPEC_NAMES = {
"graviton": "cortex_a72",
"graviton2": "neoverse_n1",
"graviton3": "neoverse_v1",
}
class Target(object):
def __init__(self, name, module_name=None):
"""Target models microarchitectures and their compatibility.
@@ -52,10 +43,6 @@ def __init__(self, name, module_name=None):
like Cray (e.g. craype-compiler)
"""
if not isinstance(name, archspec.cpu.Microarchitecture):
if name in _DEPRECATED_ARCHSPEC_NAMES:
msg = "'target={}' is deprecated, use 'target={}' instead"
name, old_name = _DEPRECATED_ARCHSPEC_NAMES[name], name
warnings.warn(msg.format(old_name, name))
name = archspec.cpu.TARGETS.get(name, archspec.cpu.generic_microarchitecture(name))
self.microarchitecture = name
self.module_name = module_name


@@ -183,7 +183,7 @@ def test_optimization_flags_with_custom_versions(
def test_satisfy_strict_constraint_when_not_concrete(architecture_tuple, constraint_tuple):
architecture = spack.spec.ArchSpec(architecture_tuple)
constraint = spack.spec.ArchSpec(constraint_tuple)
assert not architecture.satisfies(constraint, strict=True)
assert not architecture.satisfies(constraint)
@pytest.mark.parametrize(


@@ -2,11 +2,13 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import filecmp
import glob
import io
import os
import platform
import sys
import tarfile
import urllib.error
import urllib.request
import urllib.response
@@ -952,3 +954,81 @@ def fake_build_tarball(node, push_url, **kwargs):
bindist.push([spec], push_url, include_root=root, include_dependencies=deps)
assert packages_to_push == expected
def test_reproducible_tarball_is_reproducible(tmpdir):
p = tmpdir.mkdir("prefix")
p.mkdir("bin")
p.mkdir(".spack")
app = p.join("bin", "app")
tarball_1 = str(tmpdir.join("prefix-1.tar.gz"))
tarball_2 = str(tmpdir.join("prefix-2.tar.gz"))
with open(app, "w") as f:
f.write("hello world")
buildinfo = {"metadata": "yes please"}
# Create a tarball with a certain mtime of bin/app
os.utime(app, times=(0, 0))
bindist._do_create_tarball(tarball_1, binaries_dir=p, pkg_dir="pkg", buildinfo=buildinfo)
# Do it another time with different mtime of bin/app
os.utime(app, times=(10, 10))
bindist._do_create_tarball(tarball_2, binaries_dir=p, pkg_dir="pkg", buildinfo=buildinfo)
# They should be bitwise identical:
assert filecmp.cmp(tarball_1, tarball_2, shallow=False)
# Sanity check for contents:
with tarfile.open(tarball_1, mode="r") as f:
for m in f.getmembers():
assert m.uid == m.gid == m.mtime == 0
assert m.uname == m.gname == ""
assert set(f.getnames()) == {
"pkg",
"pkg/bin",
"pkg/bin/app",
"pkg/.spack",
"pkg/.spack/binary_distribution",
}
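
The test above pins down what makes the tarball reproducible: per-entry ownership and mtimes are zeroed, and the entry set is deterministic. A minimal standard-library sketch of the same idea (not Spack's _do_create_tarball); note the gzip mtime=0 also matters, since gzip otherwise embeds a timestamp of its own:

import gzip
import tarfile

def reproducible_tarball(path, src_dir, arcname="pkg"):
    def reset(info):
        info.uid = info.gid = info.mtime = 0
        info.uname = info.gname = ""
        return info

    with open(path, "wb") as f, gzip.GzipFile(fileobj=f, mode="wb", mtime=0) as gz:
        with tarfile.open(fileobj=gz, mode="w") as tar:
            # tar.add recurses in directory order; a fully reproducible
            # implementation would also sort entries explicitly.
            tar.add(src_dir, arcname=arcname, filter=reset)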
def test_tarball_normalized_permissions(tmpdir):
p = tmpdir.mkdir("prefix")
p.mkdir("bin")
p.mkdir("share")
p.mkdir(".spack")
app = p.join("bin", "app")
data = p.join("share", "file")
tarball = str(tmpdir.join("prefix.tar.gz"))
# Everyone can write & execute. This should turn into 0o755 when the tarball is
# extracted (on a different system).
with open(app, "w", opener=lambda path, flags: os.open(path, flags, 0o777)) as f:
f.write("hello world")
# User doesn't have execute permissions, but group/world have; this should also
# turn into 0o644 (user read/write, group&world only read).
with open(data, "w", opener=lambda path, flags: os.open(path, flags, 0o477)) as f:
f.write("hello world")
bindist._do_create_tarball(tarball, binaries_dir=p, pkg_dir="pkg", buildinfo={})
with tarfile.open(tarball) as tar:
path_to_member = {member.name: member for member in tar.getmembers()}
# directories should have 0o755
assert path_to_member["pkg"].mode == 0o755
assert path_to_member["pkg/bin"].mode == 0o755
assert path_to_member["pkg/.spack"].mode == 0o755
# executable-by-user files should be 0o755
assert path_to_member["pkg/bin/app"].mode == 0o755
# not-executable-by-user files should be 0o644
assert path_to_member["pkg/share/file"].mode == 0o644
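
The expected modes encode a simple normalization rule: directories and anything the user can execute become 0o755, everything else 0o644, so stray group/world write bits from the build host never leak into the tarball. A one-function sketch of that rule:

def normalized_mode(mode, is_dir):
    # 0o100 is the user-execute bit
    return 0o755 if is_dir or (mode & 0o100) else 0o644

assert normalized_mode(0o777, is_dir=False) == 0o755  # matches pkg/bin/app above
assert normalized_mode(0o477, is_dir=False) == 0o644  # matches pkg/share/file above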


@@ -127,13 +127,13 @@ def test_static_to_shared_library(build_environment):
"linux": (
"/bin/mycc -shared"
" -Wl,--disable-new-dtags"
" -Wl,-soname,{2} -Wl,--whole-archive {0}"
" -Wl,-soname -Wl,{2} -Wl,--whole-archive {0}"
" -Wl,--no-whole-archive -o {1}"
),
"darwin": (
"/bin/mycc -dynamiclib"
" -Wl,--disable-new-dtags"
" -install_name {1} -Wl,-force_load,{0} -o {1}"
" -install_name {1} -Wl,-force_load -Wl,{0} -o {1}"
),
}


@@ -268,16 +268,18 @@ def test_cmake_std_args(self, default_mock_concretization):
s = default_mock_concretization("mpich")
assert spack.build_systems.cmake.CMakeBuilder.std_args(s.package)
def test_cmake_bad_generator(self, monkeypatch, default_mock_concretization):
def test_cmake_bad_generator(self, default_mock_concretization):
s = default_mock_concretization("cmake-client")
monkeypatch.setattr(type(s.package), "generator", "Yellow Sticky Notes", raising=False)
with pytest.raises(spack.package_base.InstallError):
s.package.builder.std_cmake_args
spack.build_systems.cmake.CMakeBuilder.std_args(
s.package, generator="Yellow Sticky Notes"
)
def test_cmake_secondary_generator(self, default_mock_concretization):
s = default_mock_concretization("cmake-client")
s.package.generator = "CodeBlocks - Unix Makefiles"
assert s.package.builder.std_cmake_args
assert spack.build_systems.cmake.CMakeBuilder.std_args(
s.package, generator="CodeBlocks - Unix Makefiles"
)
def test_define(self, default_mock_concretization):
s = default_mock_concretization("cmake-client")


@@ -16,7 +16,7 @@
import spack.config
import spack.spec
from spack.paths import build_env_path
from spack.util.environment import set_env, system_dirs
from spack.util.environment import SYSTEM_DIRS, set_env
from spack.util.executable import Executable, ProcessError
#
@@ -160,7 +160,7 @@ def wrapper_environment(working_env):
SPACK_DEBUG_LOG_ID="foo-hashabc",
SPACK_COMPILER_SPEC="gcc@4.4.7",
SPACK_SHORT_SPEC="foo@1.2 arch=linux-rhel6-x86_64 /hashabc",
SPACK_SYSTEM_DIRS=":".join(system_dirs),
SPACK_SYSTEM_DIRS=":".join(SYSTEM_DIRS),
SPACK_CC_RPATH_ARG="-Wl,-rpath,",
SPACK_CXX_RPATH_ARG="-Wl,-rpath,",
SPACK_F77_RPATH_ARG="-Wl,-rpath,",
@@ -342,6 +342,16 @@ def test_fc_flags(wrapper_environment, wrapper_flags):
)
def test_Wl_parsing(wrapper_environment):
check_args(
cc,
["-Wl,-rpath,/a,--enable-new-dtags,-rpath=/b,--rpath", "-Wl,/c"],
[real_cc]
+ target_args
+ ["-Wl,--disable-new-dtags", "-Wl,-rpath,/a", "-Wl,-rpath,/b", "-Wl,-rpath,/c"],
)
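
The expected output encodes how the wrapper normalizes comma-packed -Wl arguments: both the ``-rpath,DIR`` and ``-rpath=DIR`` spellings become separate ``-Wl,-rpath,DIR`` flags, and a trailing ``--rpath`` takes its directory from the next argument. An illustrative Python parser for the single-argument case (the real wrapper is shell code, so this is only a sketch of the logic):

def parse_wl(arg):
    """Collect rpath directories from one '-Wl,...' compiler argument."""
    rpaths, expect_dir = [], False
    for tok in arg[len("-Wl,"):].split(","):
        if expect_dir:
            rpaths.append(tok)
            expect_dir = False
        elif tok in ("-rpath", "--rpath"):
            expect_dir = True  # the directory arrives as the next token (or next argument)
        elif tok.startswith(("-rpath=", "--rpath=")):
            rpaths.append(tok.split("=", 1)[1])
    return rpaths

assert parse_wl("-Wl,-rpath,/a,--enable-new-dtags,-rpath=/b") == ["/a", "/b"]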
def test_dep_rpath(wrapper_environment):
"""Ensure RPATHs for root package are added."""
check_args(cc, test_args, [real_cc] + target_args + common_compile_args)


@@ -408,19 +408,36 @@ def test_get_spec_filter_list(mutable_mock_env_path, config, mutable_mock_repo):
touched = ["libdwarf"]
# traversing both directions from libdwarf in the graphs depicted
# above (and additionally including dependencies of dependents of
# libdwarf) results in the following possibly affected env specs:
# mpileaks, callpath, dyninst, libdwarf, libelf, and mpich.
# Unaffected specs are hypre and its dependencies.
# Make sure we return the correct set of possibly affected specs,
# given a dependent traversal depth and the fact that the touched
# package is libdwarf. Passing a traversal depth of None, or anything
# equal to or larger than the greatest depth in the graph, is
# equivalent and traverses all specs from the touched package to the
# root. Passing a negative traversal depth results in no spec
# traversals. Passing any other number yields differing numbers of
# possibly affected specs.
affected_specs = ci.get_spec_filter_list(e1, touched)
affected_pkg_names = set([s.name for s in affected_specs])
expected_affected_pkg_names = set(
["mpileaks", "mpich", "callpath", "dyninst", "libdwarf", "libelf"]
)
full_set = set(["mpileaks", "mpich", "callpath", "dyninst", "libdwarf", "libelf"])
empty_set = set([])
depth_2_set = set(["mpich", "callpath", "dyninst", "libdwarf", "libelf"])
depth_1_set = set(["dyninst", "libdwarf", "libelf"])
depth_0_set = set(["libdwarf", "libelf"])
assert affected_pkg_names == expected_affected_pkg_names
expectations = {
None: full_set,
3: full_set,
100: full_set,
-1: empty_set,
0: depth_0_set,
1: depth_1_set,
2: depth_2_set,
}
for key, val in expectations.items():
affected_specs = ci.get_spec_filter_list(e1, touched, dependent_traverse_depth=key)
affected_pkg_names = set([s.name for s in affected_specs])
print(f"{key}: {affected_pkg_names}")
assert affected_pkg_names == val
@pytest.mark.regression("29947")


@@ -28,8 +28,8 @@
import spack.util.spack_yaml as syaml
import spack.util.url as url_util
from spack.schema.buildcache_spec import schema as specfile_schema
from spack.schema.ci import schema as ci_schema
from spack.schema.database_index import schema as db_idx_schema
from spack.schema.gitlab_ci import schema as gitlab_ci_schema
from spack.spec import CompilerSpec, Spec
from spack.util.pattern import Bunch
@@ -177,26 +177,29 @@ def test_ci_generate_with_env(
- [$old-gcc-pkgs]
mirrors:
some-mirror: {0}
gitlab-ci:
ci:
bootstrap:
- name: bootstrap
compiler-agnostic: true
mappings:
pipeline-gen:
- submapping:
- match:
- arch=test-debian6-core2
runner-attributes:
build-job:
tags:
- donotcare
image: donotcare
- match:
- arch=test-debian6-m1
runner-attributes:
build-job:
tags:
- donotcare
image: donotcare
service-job-attributes:
image: donotcare
tags: [donotcare]
- cleanup-job:
image: donotcare
tags: [donotcare]
- reindex-job:
script:: [hello, world]
cdash:
build-group: Not important
url: https://my.fake.cdash
@@ -239,6 +242,10 @@ def test_ci_generate_with_env(
def _validate_needs_graph(yaml_contents, needs_graph, artifacts):
"""Validate the needs graph in the generate CI"""
# TODO: Fix the logic to catch errors where expected packages/needs are not
# found.
for job_name, job_def in yaml_contents.items():
for needs_def_name, needs_list in needs_graph.items():
if job_name.startswith(needs_def_name):
@@ -269,27 +276,30 @@ def test_ci_generate_bootstrap_gcc(
spack:
definitions:
- bootstrap:
- gcc@9.5
- gcc@9.0
- gcc@3.0
specs:
- dyninst%gcc@9.5
- dyninst%gcc@3.0
mirrors:
some-mirror: https://my.fake.mirror
gitlab-ci:
ci:
bootstrap:
- name: bootstrap
compiler-agnostic: true
mappings:
pipeline-gen:
- submapping:
- match:
- arch=test-debian6-x86_64
runner-attributes:
build-job:
tags:
- donotcare
- match:
- arch=test-debian6-aarch64
runner-attributes:
build-job:
tags:
- donotcare
- any-job:
tags:
- donotcare
"""
)
@@ -326,26 +336,30 @@ def test_ci_generate_bootstrap_artifacts_buildcache(
spack:
definitions:
- bootstrap:
- gcc@9.5
- gcc@3.0
specs:
- dyninst%gcc@9.5
- dyninst%gcc@3.0
mirrors:
some-mirror: https://my.fake.mirror
gitlab-ci:
ci:
bootstrap:
- name: bootstrap
compiler-agnostic: true
mappings:
pipeline-gen:
- submapping:
- match:
- arch=test-debian6-x86_64
runner-attributes:
build-job:
tags:
- donotcare
- match:
- arch=test-debian6-aarch64
runner-attributes:
build-job:
tags:
- donotcare
- any-job:
tags:
- donotcare
enable-artifacts-buildcache: True
"""
)
@@ -398,7 +412,7 @@ def test_ci_generate_with_env_missing_section(
"""
)
expect_out = 'Error: Environment yaml does not have "gitlab-ci" section'
expect_out = 'Error: Environment yaml does not have "ci" section'
with tmpdir.as_cwd():
env_cmd("create", "test", "./spack.yaml")
@@ -427,12 +441,13 @@ def test_ci_generate_with_cdash_token(
- archive-files
mirrors:
some-mirror: https://my.fake.mirror
gitlab-ci:
ci:
enable-artifacts-buildcache: True
mappings:
pipeline-gen:
- submapping:
- match:
- archive-files
runner-attributes:
build-job:
tags:
- donotcare
image: donotcare
@@ -485,11 +500,12 @@ def test_ci_generate_with_custom_scripts(
- archive-files
mirrors:
some-mirror: https://my.fake.mirror
gitlab-ci:
mappings:
ci:
pipeline-gen:
- submapping:
- match:
- archive-files
runner-attributes:
build-job:
tags:
- donotcare
variables:
@@ -576,17 +592,18 @@ def test_ci_generate_pkg_with_deps(
- flatten-deps
mirrors:
some-mirror: https://my.fake.mirror
gitlab-ci:
ci:
enable-artifacts-buildcache: True
mappings:
pipeline-gen:
- submapping:
- match:
- flatten-deps
runner-attributes:
build-job:
tags:
- donotcare
- match:
- dependency-install
runner-attributes:
build-job:
tags:
- donotcare
"""
@@ -642,22 +659,23 @@ def test_ci_generate_for_pr_pipeline(
- flatten-deps
mirrors:
some-mirror: https://my.fake.mirror
gitlab-ci:
ci:
enable-artifacts-buildcache: True
mappings:
pipeline-gen:
- submapping:
- match:
- flatten-deps
runner-attributes:
build-job:
tags:
- donotcare
- match:
- dependency-install
runner-attributes:
build-job:
tags:
- donotcare
service-job-attributes:
image: donotcare
tags: [donotcare]
- cleanup-job:
image: donotcare
tags: [donotcare]
rebuild-index: False
"""
)
@@ -703,12 +721,13 @@ def test_ci_generate_with_external_pkg(
- externaltest
mirrors:
some-mirror: https://my.fake.mirror
gitlab-ci:
mappings:
ci:
pipeline-gen:
- submapping:
- match:
- archive-files
- externaltest
runner-attributes:
build-job:
tags:
- donotcare
image: donotcare
@@ -744,7 +763,7 @@ def test_ci_rebuild_missing_config(tmpdir, working_env, mutable_mock_env_path):
env_cmd("create", "test", "./spack.yaml")
env_cmd("activate", "--without-view", "--sh", "test")
out = ci_cmd("rebuild", fail_on_error=False)
assert "env containing gitlab-ci" in out
assert "env containing ci" in out
env_cmd("deactivate")
@@ -785,17 +804,18 @@ def create_rebuild_env(tmpdir, pkg_name, broken_tests=False):
- $packages
mirrors:
test-mirror: {1}
gitlab-ci:
ci:
broken-specs-url: {2}
broken-tests-packages: {3}
temporary-storage-url-prefix: {4}
mappings:
- match:
- {0}
runner-attributes:
tags:
- donotcare
image: donotcare
pipeline-gen:
- submapping:
- match:
- {0}
build-job:
tags:
- donotcare
image: donotcare
cdash:
build-group: Not important
url: https://my.fake.cdash
@@ -875,10 +895,9 @@ def activate_rebuild_env(tmpdir, pkg_name, rebuild_env):
@pytest.mark.parametrize("broken_tests", [True, False])
def test_ci_rebuild_mock_success(
tmpdir,
config,
working_env,
mutable_mock_env_path,
install_mockery,
install_mockery_mutable_config,
mock_gnupghome,
mock_stage,
mock_fetch,
@@ -914,7 +933,7 @@ def test_ci_rebuild(
tmpdir,
working_env,
mutable_mock_env_path,
install_mockery,
install_mockery_mutable_config,
mock_packages,
monkeypatch,
mock_gnupghome,
@@ -1014,12 +1033,13 @@ def test_ci_nothing_to_rebuild(
- $packages
mirrors:
test-mirror: {0}
gitlab-ci:
ci:
enable-artifacts-buildcache: True
mappings:
pipeline-gen:
- submapping:
- match:
- archive-files
runner-attributes:
build-job:
tags:
- donotcare
image: donotcare
@@ -1101,18 +1121,19 @@ def test_ci_generate_mirror_override(
- $packages
mirrors:
test-mirror: {0}
gitlab-ci:
mappings:
ci:
pipeline-gen:
- submapping:
- match:
- patchelf
runner-attributes:
build-job:
tags:
- donotcare
image: donotcare
service-job-attributes:
tags:
- nonbuildtag
image: basicimage
- cleanup-job:
tags:
- nonbuildtag
image: basicimage
""".format(
mirror_url
)
@@ -1183,19 +1204,24 @@ def test_push_mirror_contents(
- $packages
mirrors:
test-mirror: {0}
gitlab-ci:
ci:
enable-artifacts-buildcache: True
mappings:
pipeline-gen:
- submapping:
- match:
- patchelf
runner-attributes:
build-job:
tags:
- donotcare
image: donotcare
service-job-attributes:
tags:
- nonbuildtag
image: basicimage
- cleanup-job:
tags:
- nonbuildtag
image: basicimage
- any-job:
tags:
- nonbuildtag
image: basicimage
""".format(
mirror_url
)
@@ -1345,56 +1371,58 @@ def test_ci_generate_override_runner_attrs(
- a
mirrors:
some-mirror: https://my.fake.mirror
gitlab-ci:
tags:
- toplevel
- toplevel2
variables:
ONE: toplevelvarone
TWO: toplevelvartwo
before_script:
- pre step one
- pre step two
script:
- main step
after_script:
- post step one
match_behavior: {0}
mappings:
- match:
- flatten-deps
runner-attributes:
tags:
- specific-one
variables:
THREE: specificvarthree
- match:
- dependency-install
- match:
- a
remove-attributes:
tags:
- toplevel2
runner-attributes:
tags:
- specific-a
variables:
ONE: specificvarone
TWO: specificvartwo
before_script:
- custom pre step one
script:
- custom main step
after_script:
- custom post step one
- match:
- a
runner-attributes:
tags:
- specific-a-2
service-job-attributes:
image: donotcare
tags: [donotcare]
ci:
pipeline-gen:
- match_behavior: {0}
submapping:
- match:
- flatten-deps
build-job:
tags:
- specific-one
variables:
THREE: specificvarthree
- match:
- dependency-install
- match:
- a
build-job:
tags:
- specific-a-2
- match:
- a
build-job-remove:
tags:
- toplevel2
build-job:
tags:
- specific-a
variables:
ONE: specificvarone
TWO: specificvartwo
before_script::
- - custom pre step one
script::
- - custom main step
after_script::
- custom post step one
- build-job:
tags:
- toplevel
- toplevel2
variables:
ONE: toplevelvarone
TWO: toplevelvartwo
before_script:
- - pre step one
- pre step two
script::
- - main step
after_script:
- - post step one
- cleanup-job:
image: donotcare
tags: [donotcare]
""".format(
match_behavior
)
@@ -1420,8 +1448,6 @@ def test_ci_generate_override_runner_attrs(
assert global_vars["SPACK_CHECKOUT_VERSION"] == "12ad69eb1"
for ci_key in yaml_contents.keys():
if "(specs) b" in ci_key:
assert False
if "(specs) a" in ci_key:
# Make sure a's attributes override variables, and all the
# scripts. Also, make sure the 'toplevel' tag doesn't
@@ -1495,10 +1521,11 @@ def test_ci_generate_with_workarounds(
- callpath%gcc@9.5
mirrors:
some-mirror: https://my.fake.mirror
gitlab-ci:
mappings:
ci:
pipeline-gen:
- submapping:
- match: ['%gcc@9.5']
runner-attributes:
build-job:
tags:
- donotcare
image: donotcare
@@ -1550,11 +1577,12 @@ def test_ci_rebuild_index(
- callpath
mirrors:
test-mirror: {0}
gitlab-ci:
mappings:
ci:
pipeline-gen:
- submapping:
- match:
- patchelf
runner-attributes:
build-job:
tags:
- donotcare
image: donotcare
@@ -1642,29 +1670,30 @@ def test_ci_generate_bootstrap_prune_dag(
- b%gcc@12.2.0
mirrors:
atestm: {0}
gitlab-ci:
ci:
bootstrap:
- name: bootstrap
compiler-agnostic: true
mappings:
pipeline-gen:
- submapping:
- match:
- arch=test-debian6-x86_64
runner-attributes:
build-job:
tags:
- donotcare
- match:
- arch=test-debian6-core2
runner-attributes:
build-job:
tags:
- meh
- match:
- arch=test-debian6-aarch64
runner-attributes:
build-job:
tags:
- donotcare
- match:
- arch=test-debian6-m1
runner-attributes:
build-job:
tags:
- meh
""".format(
@@ -1743,18 +1772,22 @@ def test_ci_generate_prune_untouched(
- callpath
mirrors:
some-mirror: {0}
gitlab-ci:
mappings:
- match:
- arch=test-debian6-core2
runner-attributes:
tags:
- donotcare
image: donotcare
ci:
pipeline-gen:
- build-job:
tags:
- donotcare
image: donotcare
""".format(
mirror_url
)
)
# Dependency graph rooted at callpath
# callpath -> dyninst -> libelf
# -> libdwarf -> libelf
# -> mpich
with tmpdir.as_cwd():
env_cmd("create", "test", "./spack.yaml")
outputfile = str(tmpdir.join(".gitlab-ci.yml"))
@@ -1765,19 +1798,93 @@ def fake_compute_affected(r1=None, r2=None):
def fake_stack_changed(env_path, rev1="HEAD^", rev2="HEAD"):
return False
with ev.read("test"):
env_hashes = {}
with ev.read("test") as active_env:
monkeypatch.setattr(ci, "compute_affected_packages", fake_compute_affected)
monkeypatch.setattr(ci, "get_stack_changed", fake_stack_changed)
active_env.concretize()
for s in active_env.all_specs():
env_hashes[s.name] = s.dag_hash()
ci_cmd("generate", "--output-file", outputfile)
with open(outputfile) as f:
contents = f.read()
print(contents)
yaml_contents = syaml.load(contents)
generated_hashes = []
for ci_key in yaml_contents.keys():
if "archive-files" in ci_key:
print("Error: archive-files should have been pruned")
assert False
if ci_key.startswith("(specs)"):
generated_hashes.append(
yaml_contents[ci_key]["variables"]["SPACK_JOB_SPEC_DAG_HASH"]
)
assert env_hashes["archive-files"] not in generated_hashes
for spec_name in ["callpath", "dyninst", "mpich", "libdwarf", "libelf"]:
assert env_hashes[spec_name] in generated_hashes
def test_ci_generate_prune_env_vars(
tmpdir, mutable_mock_env_path, install_mockery, mock_packages, ci_base_environment, monkeypatch
):
"""Make sure environment variables controlling untouched spec
pruning behave as expected."""
os.environ.update({"SPACK_PRUNE_UNTOUCHED": "TRUE"}) # enables pruning of untouched specs
filename = str(tmpdir.join("spack.yaml"))
with open(filename, "w") as f:
f.write(
"""\
spack:
specs:
- libelf
ci:
pipeline-gen:
- submapping:
- match:
- arch=test-debian6-core2
build-job:
tags:
- donotcare
image: donotcare
"""
)
with tmpdir.as_cwd():
env_cmd("create", "test", "./spack.yaml")
def fake_compute_affected(r1=None, r2=None):
return ["libdwarf"]
def fake_stack_changed(env_path, rev1="HEAD^", rev2="HEAD"):
return False
expected_depth_param = None
def check_get_spec_filter_list(env, affected_pkgs, dependent_traverse_depth=None):
assert dependent_traverse_depth == expected_depth_param
return set()
monkeypatch.setattr(ci, "compute_affected_packages", fake_compute_affected)
monkeypatch.setattr(ci, "get_stack_changed", fake_stack_changed)
monkeypatch.setattr(ci, "get_spec_filter_list", check_get_spec_filter_list)
expectations = {"-1": -1, "0": 0, "True": None}
for key, val in expectations.items():
with ev.read("test"):
os.environ.update({"SPACK_PRUNE_UNTOUCHED_DEPENDENT_DEPTH": key})
expected_depth_param = val
# Leaving out the mirror in the spack.yaml above means the
# pipeline generation command will fail, pretty much immediately.
# But for this test, we only care how the environment variables
# for pruning are handled, the faster the better. So allow the
# spack command to fail.
ci_cmd("generate", fail_on_error=False)
def test_ci_subcommands_without_mirror(
@@ -1796,11 +1903,12 @@ def test_ci_subcommands_without_mirror(
spack:
specs:
- archive-files
gitlab-ci:
mappings:
ci:
pipeline-gen:
- submapping:
- match:
- archive-files
runner-attributes:
build-job:
tags:
- donotcare
image: donotcare
@@ -1829,12 +1937,13 @@ def test_ensure_only_one_temporary_storage():
"""Make sure 'gitlab-ci' section of env does not allow specification of
both 'enable-artifacts-buildcache' and 'temporary-storage-url-prefix'."""
gitlab_ci_template = """
gitlab-ci:
ci:
{0}
mappings:
pipeline-gen:
- submapping:
- match:
- notcheckedhere
runner-attributes:
build-job:
tags:
- donotcare
"""
@@ -1850,21 +1959,21 @@ def test_ensure_only_one_temporary_storage():
# User can specify "enable-artifacts-buildcache" (boolean)
yaml_obj = syaml.load(gitlab_ci_template.format(enable_artifacts))
jsonschema.validate(yaml_obj, gitlab_ci_schema)
jsonschema.validate(yaml_obj, ci_schema)
# User can also specify "temporary-storage-url-prefix" (string)
yaml_obj = syaml.load(gitlab_ci_template.format(temp_storage))
jsonschema.validate(yaml_obj, gitlab_ci_schema)
jsonschema.validate(yaml_obj, ci_schema)
# However, specifying both should fail to validate
yaml_obj = syaml.load(gitlab_ci_template.format(specify_both))
with pytest.raises(jsonschema.ValidationError):
jsonschema.validate(yaml_obj, gitlab_ci_schema)
jsonschema.validate(yaml_obj, ci_schema)
# Specifying neither should be fine too, as neither of these properties
# should be required
yaml_obj = syaml.load(gitlab_ci_template.format(specify_neither))
jsonschema.validate(yaml_obj, gitlab_ci_schema)
jsonschema.validate(yaml_obj, ci_schema)
def test_ci_generate_temp_storage_url(
@@ -1886,12 +1995,13 @@ def test_ci_generate_temp_storage_url(
- archive-files
mirrors:
some-mirror: https://my.fake.mirror
gitlab-ci:
ci:
temporary-storage-url-prefix: file:///work/temp/mirror
mappings:
pipeline-gen:
- submapping:
- match:
- archive-files
runner-attributes:
build-job:
tags:
- donotcare
image: donotcare
@@ -1957,15 +2067,16 @@ def test_ci_generate_read_broken_specs_url(
- a
mirrors:
some-mirror: https://my.fake.mirror
gitlab-ci:
ci:
broken-specs-url: "{0}"
mappings:
pipeline-gen:
- submapping:
- match:
- a
- flatten-deps
- b
- dependency-install
runner-attributes:
build-job:
tags:
- donotcare
image: donotcare
@@ -2006,26 +2117,27 @@ def test_ci_generate_external_signing_job(
- archive-files
mirrors:
some-mirror: https://my.fake.mirror
gitlab-ci:
ci:
temporary-storage-url-prefix: file:///work/temp/mirror
mappings:
pipeline-gen:
- submapping:
- match:
- archive-files
runner-attributes:
build-job:
tags:
- donotcare
image: donotcare
signing-job-attributes:
tags:
- nonbuildtag
- secretrunner
image:
name: customdockerimage
entrypoint: []
variables:
IMPORTANT_INFO: avalue
script:
- echo hello
- signing-job:
tags:
- nonbuildtag
- secretrunner
image:
name: customdockerimage
entrypoint: []
variables:
IMPORTANT_INFO: avalue
script::
- echo hello
"""
)
@@ -2068,11 +2180,12 @@ def test_ci_reproduce(
- $packages
mirrors:
test-mirror: file:///some/fake/mirror
gitlab-ci:
mappings:
ci:
pipeline-gen:
- submapping:
- match:
- archive-files
runner-attributes:
build-job:
tags:
- donotcare
image: {0}
@@ -2149,7 +2262,9 @@ def fake_download_and_extract_artifacts(url, work_dir):
working_dir.strpath,
output=str,
)
expect_out = "docker run --rm -v {0}:{0} -ti {1}".format(working_dir.strpath, image_name)
expect_out = "docker run --rm --name spack_reproducer -v {0}:{0}:Z -ti {1}".format(
os.path.realpath(working_dir.strpath), image_name
)
assert expect_out in rep_out


@@ -122,19 +122,18 @@ def test_root_and_dep_match_returns_root(mock_packages, mutable_mock_env_path):
assert env_spec2
def test_concretizer_arguments(mutable_config, mock_packages):
@pytest.mark.parametrize(
"arg,config", [("--reuse", True), ("--fresh", False), ("--reuse-deps", "dependencies")]
)
def test_concretizer_arguments(mutable_config, mock_packages, arg, config):
"""Ensure that ConfigSetAction is doing the right thing."""
spec = spack.main.SpackCommand("spec")
assert spack.config.get("concretizer:reuse", None) is None
spec("--reuse", "zlib")
spec(arg, "zlib")
assert spack.config.get("concretizer:reuse", None) is True
spec("--fresh", "zlib")
assert spack.config.get("concretizer:reuse", None) is False
assert spack.config.get("concretizer:reuse", None) == config
def test_use_buildcache_type():


@@ -82,8 +82,8 @@ def test_change_match_spec():
change("--match-spec", "mpileaks@2.2", "mpileaks@2.3")
assert not any(x.satisfies("mpileaks@2.2") for x in e.user_specs)
assert any(x.satisfies("mpileaks@2.3") for x in e.user_specs)
assert not any(x.intersects("mpileaks@2.2") for x in e.user_specs)
assert any(x.intersects("mpileaks@2.3") for x in e.user_specs)
def test_change_multiple_matches():
@@ -97,8 +97,8 @@ def test_change_multiple_matches():
change("--match-spec", "mpileaks", "-a", "mpileaks%gcc")
assert all(x.satisfies("%gcc") for x in e.user_specs if x.name == "mpileaks")
assert any(x.satisfies("%clang") for x in e.user_specs if x.name == "libelf")
assert all(x.intersects("%gcc") for x in e.user_specs if x.name == "mpileaks")
assert any(x.intersects("%clang") for x in e.user_specs if x.name == "libelf")
def test_env_add_virtual():
@@ -111,7 +111,7 @@ def test_env_add_virtual():
hashes = e.concretized_order
assert len(hashes) == 1
spec = e.specs_by_hash[hashes[0]]
assert spec.satisfies("mpi")
assert spec.intersects("mpi")
def test_env_add_nonexistant_fails():
@@ -687,7 +687,7 @@ def test_env_with_config():
with e:
e.concretize()
assert any(x.satisfies("mpileaks@2.2") for x in e._get_environment_specs())
assert any(x.intersects("mpileaks@2.2") for x in e._get_environment_specs())
def test_with_config_bad_include():
@@ -1630,9 +1630,9 @@ def test_stack_concretize_extraneous_deps(tmpdir, config, mock_packages):
assert concrete.concrete
assert not user.concrete
if user.name == "libelf":
assert not concrete.satisfies("^mpi", strict=True)
assert not concrete.satisfies("^mpi")
elif user.name == "mpileaks":
assert concrete.satisfies("^mpi", strict=True)
assert concrete.satisfies("^mpi")
def test_stack_concretize_extraneous_variants(tmpdir, config, mock_packages):


@@ -16,8 +16,6 @@
from spack.main import SpackCommand
from spack.spec import Spec
is_windows = sys.platform == "win32"
@pytest.fixture
def executables_found(monkeypatch):
@@ -39,7 +37,7 @@ def _win_exe_ext():
def define_plat_exe(exe):
if is_windows:
if sys.platform == "win32":
exe += ".bat"
return exe


@@ -276,7 +276,7 @@ def test_install_commit(mock_git_version_info, install_mockery, mock_packages, m
assert filename in installed
with open(spec.prefix.bin.join(filename), "r") as f:
content = f.read().strip()
assert content == "[]" # contents are weird for another test
assert content == "[0]" # contents are weird for another test
def test_install_overwrite_multiple(
@@ -793,7 +793,7 @@ def test_install_no_add_in_env(tmpdir, mock_fetch, install_mockery, mutable_mock
# ^b
# a
# ^b
e = ev.create("test")
e = ev.create("test", with_view=False)
e.add("mpileaks")
e.add("libelf@0.8.10") # so env has both root and dep libelf specs
e.add("a")
@@ -829,14 +829,11 @@ def test_install_no_add_in_env(tmpdir, mock_fetch, install_mockery, mutable_mock
# Assert using --no-add with a spec not in the env fails
inst_out = install("--no-add", "boost", fail_on_error=False, output=str)
assert "You can add it to the environment with 'spack add " in inst_out
assert "You can add specs to the environment with 'spack add " in inst_out
# Without --add, ensure that install fails if the spec matches more
# than one root
with pytest.raises(ev.SpackEnvironmentError) as err:
inst_out = install("a", output=str)
assert "a matches multiple specs in the env" in str(err)
# Without --add, ensure that two packages "a" get installed
inst_out = install("a", output=str)
assert len([x for x in e.all_specs() if x.installed and x.name == "a"]) == 2
# Install an unambiguous dependency spec (that already exists as a dep
# in the environment) and make sure it gets installed (w/ deps),
@@ -1177,6 +1174,6 @@ def test_report_filename_for_cdash(install_mockery_mutable_config, mock_fetch):
args = parser.parse_args(
["--cdash-upload-url", "https://blahblah/submit.php?project=debugging", "a"]
)
_, specs = spack.cmd.install.specs_from_cli(args, {})
specs = spack.cmd.install.concrete_specs_from_cli(args, {})
filename = spack.cmd.install.report_filename(args, specs)
assert filename != "https://blahblah/submit.php?project=debugging"
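The rewritten `--no-add` assertions encode a behavior change: where `spack install a` in an environment with two roots matching `a` used to raise `SpackEnvironmentError`, it now installs every matching root. A hedged sketch of the new flow (mock package names; the second `add` constraint is illustrative):

```python
import spack.environment as ev
from spack.main import SpackCommand

install = SpackCommand("install")

e = ev.create("demo", with_view=False)
e.add("a")        # two roots that both match the name "a"
e.add("a@1.0")    # hypothetical second root, for illustration
e.concretize()
e.write()
with e:
    install("a")  # no longer ambiguous: installs both roots
```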

View File

@@ -7,7 +7,6 @@
from spack.main import SpackCommand
is_windows = sys.platform == "win32"
resource = SpackCommand("resource")
#: these are hashes used in mock packages
@@ -23,7 +22,7 @@
"bf07a7fbb825fc0aae7bf4a1177b2b31fcf8a3feeaf7092761e18c859ee52a9c",
"7d865e959b2466918c9863afca942d0fb89d7c9ac0c99bafc3749504ded97730",
]
if not is_windows
if sys.platform != "win32"
else [
"abcd1234abcd1234abcd1234abcd1234abcd1234abcd1234abcd1234abcd1234",
"1234abcd1234abcd1234abcd1234abcd1234abcd1234abcd1234abcd1234abcd",
@@ -68,7 +67,7 @@ def test_resource_list_only_hashes(mock_packages, capfd):
def test_resource_show(mock_packages, capfd):
test_hash = (
"c45c1564f70def3fc1a6e22139f62cb21cd190cc3a7dbe6f4120fa59ce33dcb8"
if not is_windows
if sys.platform != "win32"
else "3c5b65abcd6a3b2c714dbf7c31ff65fe3748a1adc371f030c283007ca5534f11"
)
with capfd.disabled():

View File

@@ -14,8 +14,6 @@
import spack.extensions
import spack.main
is_windows = sys.platform == "win32"
class Extension:
"""Helper class to simplify the creation of simple command extension
@@ -274,7 +272,7 @@ def test_variable_in_extension_path(config, working_env):
os.environ["_MY_VAR"] = os.path.join("my", "var")
ext_paths = [os.path.join("~", "${_MY_VAR}", "spack-extension-1")]
# Home env variable is USERPROFILE on Windows
home_env = "USERPROFILE" if is_windows else "HOME"
home_env = "USERPROFILE" if sys.platform == "win32" else "HOME"
expected_ext_paths = [
os.path.join(os.environ[home_env], os.environ["_MY_VAR"], "spack-extension-1")
]

View File

@@ -58,6 +58,7 @@ def test_arm_version_detection(version_str, expected_version):
[
("Cray C : Version 8.4.6 Mon Apr 15, 2019 12:13:39\n", "8.4.6"),
("Cray C++ : Version 8.4.6 Mon Apr 15, 2019 12:13:45\n", "8.4.6"),
("Cray clang Version 8.4.6 Mon Apr 15, 2019 12:13:45\n", "8.4.6"),
("Cray Fortran : Version 8.4.6 Mon Apr 15, 2019 12:13:55\n", "8.4.6"),
],
)
@@ -487,3 +488,27 @@ def _module(cmd, *args):
def test_aocc_version_detection(version_str, expected_version):
version = spack.compilers.aocc.Aocc.extract_version_from_output(version_str)
assert version == expected_version
@pytest.mark.regression("33901")
@pytest.mark.parametrize(
"version_str",
[
(
"Apple clang version 11.0.0 (clang-1100.0.33.8)\n"
"Target: x86_64-apple-darwin18.7.0\n"
"Thread model: posix\n"
"InstalledDir: "
"/Applications/Xcode.app/Contents/Developer/Toolchains/"
"XcodeDefault.xctoolchain/usr/bin\n"
),
(
"Apple LLVM version 7.0.2 (clang-700.1.81)\n"
"Target: x86_64-apple-darwin15.2.0\n"
"Thread model: posix\n"
),
],
)
def test_apple_clang_not_detected_as_cce(version_str):
version = spack.compilers.cce.Cce.extract_version_from_output(version_str)
assert version == "unknown"

View File

@@ -2,6 +2,7 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import copy
import os
import sys
@@ -25,8 +26,6 @@
from spack.spec import Spec
from spack.version import ver
is_windows = sys.platform == "win32"
def check_spec(abstract, concrete):
if abstract.versions.concrete:
@@ -294,11 +293,11 @@ def test_concretize_with_provides_when(self):
we ask for some advanced version.
"""
repo = spack.repo.path
assert not any(s.satisfies("mpich2@:1.0") for s in repo.providers_for("mpi@2.1"))
assert not any(s.satisfies("mpich2@:1.1") for s in repo.providers_for("mpi@2.2"))
assert not any(s.satisfies("mpich@:1") for s in repo.providers_for("mpi@2"))
assert not any(s.satisfies("mpich@:1") for s in repo.providers_for("mpi@3"))
assert not any(s.satisfies("mpich2") for s in repo.providers_for("mpi@3"))
assert not any(s.intersects("mpich2@:1.0") for s in repo.providers_for("mpi@2.1"))
assert not any(s.intersects("mpich2@:1.1") for s in repo.providers_for("mpi@2.2"))
assert not any(s.intersects("mpich@:1") for s in repo.providers_for("mpi@2"))
assert not any(s.intersects("mpich@:1") for s in repo.providers_for("mpi@3"))
assert not any(s.intersects("mpich2") for s in repo.providers_for("mpi@3"))
def test_provides_handles_multiple_providers_of_same_version(self):
""" """
@@ -332,6 +331,24 @@ def test_compiler_flags_from_compiler_and_dependent(self):
for spec in [client, cmake]:
assert spec.compiler_flags["cflags"] == ["-O3", "-g"]
def test_compiler_flags_differ_identical_compilers(self):
# Correct arch to use test compiler that has flags
spec = Spec("a %clang@12.2.0 platform=test os=fe target=fe")
# Get the compiler that matches the spec
compiler = spack.compilers.compiler_for_spec("clang@12.2.0", spec.architecture)
# Clear cache for compiler config since it has its own cache mechanism outside of config
spack.compilers._cache_config_file = []
# Configure spack to have two identical compilers with different flags
default_dict = spack.compilers._to_dict(compiler)
different_dict = copy.deepcopy(default_dict)
different_dict["compiler"]["flags"] = {"cflags": "-O2"}
with spack.config.override("compilers", [different_dict]):
spec.concretize()
assert spec.satisfies("cflags=-O2")
def test_concretize_compiler_flag_propagate(self):
spec = Spec("hypre cflags=='-g' ^openblas")
spec.concretize()
@@ -1138,7 +1155,7 @@ def test_custom_compiler_version(self):
def test_all_patches_applied(self):
uuidpatch = (
"a60a42b73e03f207433c5579de207c6ed61d58e4d12dd3b5142eb525728d89ea"
if not is_windows
if sys.platform != "win32"
else "d0df7988457ec999c148a4a2af25ce831bfaad13954ba18a4446374cb0aef55e"
)
localpatch = "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
@@ -1257,6 +1274,18 @@ def test_reuse_installed_packages_when_package_def_changes(
# Structure and package hash will be different without reuse
assert root.dag_hash() != new_root_without_reuse.dag_hash()
def test_reuse_with_flags(self, mutable_database, mutable_config):
if spack.config.get("config:concretizer") == "original":
pytest.xfail("Original concretizer does not reuse")
spack.config.set("concretizer:reuse", True)
spec = Spec("a cflags=-g cxxflags=-g").concretized()
spack.store.db.add(spec, None)
testspec = Spec("a cflags=-g")
testspec.concretize()
assert testspec == spec
@pytest.mark.regression("20784")
def test_concretization_of_test_dependencies(self):
# With clingo we emit dependency_conditions regardless of the type
@@ -1445,7 +1474,7 @@ def test_concrete_specs_are_not_modified_on_reuse(
with spack.config.override("concretizer:reuse", True):
s = spack.spec.Spec(spec_str).concretized()
assert s.installed is expect_installed
assert s.satisfies(spec_str, strict=True)
assert s.satisfies(spec_str)
@pytest.mark.regression("26721,19736")
def test_sticky_variant_in_package(self):
@@ -2047,3 +2076,85 @@ def test_external_python_extension_find_unified_python(self):
abstract_specs = [spack.spec.Spec(s) for s in ["py-extension1", "python"]]
specs = spack.concretize.concretize_specs_together(*abstract_specs)
assert specs[0]["python"] == specs[1]["python"]
@pytest.mark.regression("36190")
@pytest.mark.parametrize(
"specs",
[
["mpileaks^ callpath ^dyninst@8.1.1:8 ^mpich2@1.3:1"],
["multivalue-variant ^a@2:2"],
["v1-consumer ^conditional-provider@1:1 +disable-v1"],
],
)
def test_result_specs_is_not_empty(self, specs):
"""Check that the implementation of "result.specs" is correct in cases where we
know a concretization exists.
"""
specs = [spack.spec.Spec(s) for s in specs]
solver = spack.solver.asp.Solver()
setup = spack.solver.asp.SpackSolverSetup()
result, _, _ = solver.driver.solve(setup, specs, reuse=[])
assert result.specs
assert not result.unsolved_specs
@pytest.mark.regression("36339")
def test_compiler_match_constraints_when_selected(self):
"""Test that, when multiple compilers with the same name are in the configuration
we ensure that the selected one matches all the required constraints.
"""
compiler_configuration = [
{
"compiler": {
"spec": "gcc@11.1.0",
"paths": {
"cc": "/usr/bin/gcc",
"cxx": "/usr/bin/g++",
"f77": "/usr/bin/gfortran",
"fc": "/usr/bin/gfortran",
},
"operating_system": "debian6",
"target": "x86_64",
"modules": [],
}
},
{
"compiler": {
"spec": "gcc@12.1.0",
"paths": {
"cc": "/usr/bin/gcc",
"cxx": "/usr/bin/g++",
"f77": "/usr/bin/gfortran",
"fc": "/usr/bin/gfortran",
},
"operating_system": "debian6",
"target": "x86_64",
"modules": [],
}
},
]
spack.config.set("compilers", compiler_configuration)
s = spack.spec.Spec("a %gcc@:11").concretized()
assert s.compiler.version == ver("11.1.0"), s
@pytest.mark.regression("36339")
@pytest.mark.skipif(sys.platform == "win32", reason="Not supported on Windows")
def test_compiler_with_custom_non_numeric_version(self, mock_executable):
"""Test that, when a compiler has a completely made up version, we can use its
'real version' to detect targets and do not raise an error during concretization.
"""
gcc_path = mock_executable("gcc", output="echo 9")
compiler_configuration = [
{
"compiler": {
"spec": "gcc@foo",
"paths": {"cc": gcc_path, "cxx": gcc_path, "f77": None, "fc": None},
"operating_system": "debian6",
"target": "x86_64",
"modules": [],
}
}
]
spack.config.set("compilers", compiler_configuration)
s = spack.spec.Spec("a %gcc@foo").concretized()
assert s.compiler.version == ver("foo")
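The made-up-version test above works because Spack versions need not be numeric: `ver("foo")` is a legal, comparable version object, and the compiler's "real version" (the mocked `gcc` printing `9`) is probed separately for target detection. A tiny sketch:

```python
from spack.version import ver

assert ver("foo") == ver("foo")        # non-numeric versions are legal
assert ver("11.1.0") < ver("12.1.0")   # numeric ones compare as expected
```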

View File

@@ -107,12 +107,28 @@ def fake_installs(monkeypatch, tmpdir):
)
def test_one_package_multiple_reqs(concretize_scope, test_repo):
if spack.config.get("config:concretizer") == "original":
pytest.skip("Original concretizer does not support configuration requirements")
conf_str = """\
packages:
y:
require:
- "@2.4"
- "~shared"
"""
update_packages_config(conf_str)
y_spec = Spec("y").concretized()
assert y_spec.satisfies("@2.4~shared")
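Each entry in a `require:` list must hold, so the concretized spec carries both constraints. Assuming the requirements schema also accepts a single spec string (not shown in this diff), the same test could be written with the two entries collapsed:

```python
conf_str = """\
packages:
  y:
    require: "@2.4 ~shared"
"""
update_packages_config(conf_str)

y_spec = Spec("y").concretized()
assert y_spec.satisfies("@2.4 ~shared")
```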
def test_requirement_isnt_optional(concretize_scope, test_repo):
"""If a user spec requests something that directly conflicts
with a requirement, make sure we get an error.
"""
if spack.config.get("config:concretizer") == "original":
pytest.skip("Original concretizer does not support configuration" " requirements")
pytest.skip("Original concretizer does not support configuration requirements")
conf_str = """\
packages:

View File

@@ -54,8 +54,6 @@
from spack.util.pattern import Bunch
from spack.util.web import FetchError
is_windows = sys.platform == "win32"
def ensure_configuration_fixture_run_before(request):
"""Ensure that fixture mutating the configuration run before the one where
@@ -159,7 +157,9 @@ def latest_commit():
return git("rev-list", "-n1", "HEAD", output=str, error=str).strip()
# Add two commits on main branch
write_file(filename, "[]")
# A commit without a previous version counts as "0"
write_file(filename, "[0]")
git("add", filename)
commit("first commit")
commits.append(latest_commit())
@@ -621,7 +621,7 @@ def ensure_debug(monkeypatch):
tty.set_debug(current_debug_level)
@pytest.fixture(autouse=is_windows, scope="session")
@pytest.fixture(autouse=sys.platform == "win32", scope="session")
def platform_config():
spack.config.add_default_platform_scope(spack.platforms.real_host().name)
@@ -633,7 +633,7 @@ def default_config():
This ensures we can test the real default configuration without having
tests fail when the user overrides the defaults that we test against."""
defaults_path = os.path.join(spack.paths.etc_path, "defaults")
if is_windows:
if sys.platform == "win32":
defaults_path = os.path.join(defaults_path, "windows")
with spack.config.use_configuration(defaults_path) as defaults_config:
yield defaults_config
@@ -690,7 +690,7 @@ def configuration_dir(tmpdir_factory, linux_os):
tmpdir.ensure("user", dir=True)
# Slightly modify config.yaml and compilers.yaml
if is_windows:
if sys.platform == "win32":
locks = False
else:
locks = True
@@ -1675,11 +1675,11 @@ def mock_executable(tmpdir):
"""
import jinja2
shebang = "#!/bin/sh\n" if not is_windows else "@ECHO OFF"
shebang = "#!/bin/sh\n" if sys.platform != "win32" else "@ECHO OFF"
def _factory(name, output, subdir=("bin",)):
f = tmpdir.ensure(*subdir, dir=True).join(name)
if is_windows:
if sys.platform == "win32":
f += ".bat"
t = jinja2.Template("{{ shebang }}{{ output }}\n")
f.write(t.render(shebang=shebang, output=output))
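The factory above renders a trivial jinja2 template; on POSIX, the call `mock_executable("gcc", output="echo 9")` from the compiler test earlier in this diff yields a two-line shell script. A standalone sketch of just the rendering step:

```python
import sys

import jinja2

shebang = "#!/bin/sh\n" if sys.platform != "win32" else "@ECHO OFF"
t = jinja2.Template("{{ shebang }}{{ output }}\n")
script = t.render(shebang=shebang, output="echo 9")
# On POSIX, script == "#!/bin/sh\necho 9\n"
```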

View File

@@ -33,8 +33,6 @@
from spack.schema.database_index import schema
from spack.util.executable import Executable
is_windows = sys.platform == "win32"
pytestmark = pytest.mark.db
@@ -451,7 +449,7 @@ def test_005_db_exists(database):
lock_file = os.path.join(database.root, ".spack-db", "lock")
assert os.path.exists(str(index_file))
# Lockfiles not currently supported on Windows
if not is_windows:
if sys.platform != "win32":
assert os.path.exists(str(lock_file))
with open(index_file) as fd:

View File

@@ -86,13 +86,13 @@ def test_env_change_spec_in_definition(tmpdir, mock_packages, config, mutable_mo
e.concretize()
e.write()
assert any(x.satisfies("mpileaks@2.1%gcc") for x in e.user_specs)
assert any(x.intersects("mpileaks@2.1%gcc") for x in e.user_specs)
e.change_existing_spec(spack.spec.Spec("mpileaks@2.2"), list_name="desired_specs")
e.write()
assert any(x.satisfies("mpileaks@2.2%gcc") for x in e.user_specs)
assert not any(x.satisfies("mpileaks@2.1%gcc") for x in e.user_specs)
assert any(x.intersects("mpileaks@2.2%gcc") for x in e.user_specs)
assert not any(x.intersects("mpileaks@2.1%gcc") for x in e.user_specs)
def test_env_change_spec_in_matrix_raises_error(

View File

@@ -230,16 +230,6 @@ def test_path_manipulation(env):
assert os.environ["PATH_LIST_WITH_DUPLICATES"].count("/duplicate") == 1
def test_extra_arguments(env):
"""Tests that we can attach extra arguments to any command."""
env.set("A", "dummy value", who="Pkg1")
for x in env:
assert "who" in x.args
env.apply_modifications()
assert "dummy value" == os.environ["A"]
def test_extend(env):
"""Tests that we can construct a list of environment modifications
starting from another list.
@@ -489,7 +479,7 @@ def test_from_environment_diff(before, after, search_list):
assert item in mod
@pytest.mark.skipif(sys.platform == "win32", reason="LMod not supported on Windows")
@pytest.mark.skipif(sys.platform == "win32", reason="Lmod not supported on Windows")
@pytest.mark.regression("15775")
def test_exclude_lmod_variables():
# Construct the list of environment modifications
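The deleted `test_extra_arguments` covered attaching arbitrary keyword metadata (`who="Pkg1"`) to each modification record, support this diff apparently drops. The surviving core flow, sketched with `spack.util.environment` (the module name is an assumption based on the fixture's usage):

```python
import os

from spack.util.environment import EnvironmentModifications

env = EnvironmentModifications()
env.set("A", "dummy value")   # record a modification
env.apply_modifications()     # replay it onto os.environ
assert os.environ["A"] == "dummy value"
```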

View File

@@ -24,8 +24,6 @@
import spack.store
import spack.util.lock as lk
is_windows = sys.platform == "win32"
def _mock_repo(root, namespace):
"""Create an empty repository at the specified root
@@ -528,7 +526,7 @@ def _repoerr(repo, name):
# The call to install_tree will raise the exception since not mocking
# creation of dependency package files within *install* directories.
with pytest.raises(IOError, match=path if not is_windows else ""):
with pytest.raises(IOError, match=path if sys.platform != "win32" else ""):
inst.dump_packages(spec, path)
# Now try the error path, which requires the mock directory structure
@@ -879,7 +877,7 @@ def _chgrp(path, group, follow_symlinks=True):
metadatadir = spack.store.layout.metadata_path(spec)
# Regex matching with Windows style paths typically fails
# so we skip the match check here
if is_windows:
if sys.platform == "win32":
metadatadir = None
# Should fail with a "not a directory" error
with pytest.raises(OSError, match=metadatadir):

View File

@@ -11,9 +11,8 @@
import spack.paths
from spack.compiler import _parse_non_system_link_dirs
is_windows = sys.platform == "win32"
drive = ""
if is_windows:
if sys.platform == "win32":
match = re.search(r"[A-Za-z]:", spack.paths.test_path)
if match:
drive = match.group()
@@ -210,7 +209,7 @@ def test_obscure_parsing_rules():
]
# TODO: add a comment explaining why this happens
if is_windows:
if sys.platform == "win32":
paths.remove(os.path.join(root, "second", "path"))
check_link_paths("obscure-parsing-rules.txt", paths)

View File

@@ -13,8 +13,6 @@
import spack.paths
is_windows = sys.platform == "win32"
@pytest.fixture()
def library_list():
@@ -28,7 +26,7 @@ def library_list():
"/dir3/libz.so",
"libmpi.so.20.10.1", # shared object libraries may be versioned
]
if not is_windows
if sys.platform != "win32"
else [
"/dir1/liblapack.lib",
"/dir2/libpython3.6.dll",
@@ -59,10 +57,10 @@ def header_list():
# TODO: Remove below when llnl.util.filesystem.find_libraries becomes spec aware
plat_static_ext = "lib" if is_windows else "a"
plat_static_ext = "lib" if sys.platform == "win32" else "a"
plat_shared_ext = "dll" if is_windows else "so"
plat_shared_ext = "dll" if sys.platform == "win32" else "so"
plat_apple_shared_ext = "dylib"
@@ -78,7 +76,8 @@ def test_joined_and_str(self, library_list):
expected = " ".join(
[
"/dir1/liblapack.%s" % plat_static_ext,
"/dir2/libpython3.6.%s" % (plat_apple_shared_ext if not is_windows else "dll"),
"/dir2/libpython3.6.%s"
% (plat_apple_shared_ext if sys.platform != "win32" else "dll"),
"/dir1/libblas.%s" % plat_static_ext,
"/dir3/libz.%s" % plat_shared_ext,
"libmpi.%s.20.10.1" % plat_shared_ext,
@@ -93,7 +92,8 @@ def test_joined_and_str(self, library_list):
expected = ";".join(
[
"/dir1/liblapack.%s" % plat_static_ext,
"/dir2/libpython3.6.%s" % (plat_apple_shared_ext if not is_windows else "dll"),
"/dir2/libpython3.6.%s"
% (plat_apple_shared_ext if sys.platform != "win32" else "dll"),
"/dir1/libblas.%s" % plat_static_ext,
"/dir3/libz.%s" % plat_shared_ext,
"libmpi.%s.20.10.1" % plat_shared_ext,

View File

@@ -62,8 +62,7 @@
import llnl.util.multiproc as mp
from llnl.util.filesystem import getuid, touch
is_windows = sys.platform == "win32"
if not is_windows:
if sys.platform != "win32":
import fcntl
pytestmark = pytest.mark.skipif(sys.platform == "win32", reason="does not run on windows")
@@ -127,7 +126,7 @@ def make_readable(*paths):
# stat.S_IREAD constants or a corresponding integer value). All other
# bits are ignored."
for path in paths:
if not is_windows:
if sys.platform != "win32":
mode = 0o555 if os.path.isdir(path) else 0o444
else:
mode = stat.S_IREAD
@@ -136,7 +135,7 @@ def make_readable(*paths):
def make_writable(*paths):
for path in paths:
if not is_windows:
if sys.platform != "win32":
mode = 0o755 if os.path.isdir(path) else 0o744
else:
mode = stat.S_IWRITE
@@ -616,7 +615,7 @@ def test_read_lock_read_only_dir_writable_lockfile(lock_dir, lock_path):
pass
@pytest.mark.skipif(False if is_windows else getuid() == 0, reason="user is root")
@pytest.mark.skipif(False if sys.platform == "win32" else getuid() == 0, reason="user is root")
def test_read_lock_no_lockfile(lock_dir, lock_path):
"""read-only directory, no lockfile (so can't create)."""
with read_only(lock_dir):
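The POSIX/Windows split in `make_readable`/`make_writable` follows Python's documented `os.chmod` behavior on Windows, where only the read-only flag (`stat.S_IREAD`/`stat.S_IWRITE`) is honored. A self-contained sketch:

```python
import os
import stat
import sys
import tempfile

fd, path = tempfile.mkstemp()
os.close(fd)

# POSIX honors full permission octals; Windows only the read-only bit.
os.chmod(path, 0o444 if sys.platform != "win32" else stat.S_IREAD)

# Restore writability before cleanup.
os.chmod(path, 0o644 if sys.platform != "win32" else stat.S_IWRITE)
os.remove(path)
```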

View File

@@ -55,7 +55,7 @@ def test_modules_written_with_proper_permissions(
spec = spack.spec.Spec("mpileaks").concretized()
# The code tested is common to all module types, but has to be tested from
# one. TCL picked at random
# one. Tcl picked at random
generator = spack.modules.tcl.TclModulefileWriter(spec, "default")
generator.write()

Some files were not shown because too many files have changed in this diff.