Compare commits

...

121 Commits

Author SHA1 Message Date
Todd Gamblin
3985b307a1 concretizer: add --minimal configuration option
The reusing concretizer minimizes builds, but it still preserves
defaults from packages and preferences while doing that. We can be
more aggressive by making minimization the top priority, at the
expense of "weird" concretizations.

This can be advantageous: if you write your packages as explicitly
as possible, then you can use that with `--minimal` to get the
smallest possible package configuration (at least in terms of the
number of packages in the build).  Conversely, you can use minimal
concretization as kind of a worst case to ensure that you have the
"right" constraints on your dependencies.

Example for intuition: `cmake` can optionally build without openssl, but
it's enabled by default because many builds use that functionality. Using
`minimal: true` will build `cmake~openssl` unless the user asks for
`cmake+openssl` explicitly.
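
As a sketch (assuming defaults elsewhere), the new key sits next to `reuse` in
`concretizer.yaml`:

```yaml
# concretizer.yaml -- a minimal sketch; `minimal` is the option added here
concretizer:
  reuse: true      # still prefer reusing already-installed packages
  minimal: true    # make minimizing new builds the top priority
```

With this set, `spack spec cmake` should concretize to `cmake~openssl` unless
`+openssl` is requested explicitly.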

- [x] add `minimal` option to `concretizer.yaml`
- [x] add `--minimal` CLI option to concretizer arguments
- [x] wire everything up
- [x] add some tests
2022-05-24 14:52:17 -04:00
Seth R. Johnson
6a57aede57 environments: fail gracefully on missing keys (#26378) 2022-05-24 08:52:40 -07:00
edwardsp
ba701a7cf8 Update regex to correctly identify quoted args (#23494)
Previously the regex was only checking for the presence of quotes as a beginning
or end character, not for a matching set. This erroneously identified the
following *single* argument as being quoted:

    source bashenvfile &> /dev/null && python3 -c "import os, json; print(json.dumps(dict(os.environ)))"
2022-05-24 08:26:07 -07:00
Matthias Wolf
557845cccc apptainer: new package (#30745) 2022-05-24 16:01:46 +02:00
iarspider
c5297523af vdt: add preload variant (#30030) 2022-05-24 12:19:09 +02:00
Evan Bollig
6883868896 libfabric has needed rdma-core for efa since 1.10.0 (#30798) 2022-05-24 04:17:47 -06:00
Satish Balay
1c5587f72d petsc: update rocrand location wrt rocm@5.1.0 (#30790)
rocm-5.1.0 removed librocrand.so from the ROCM_DIR/rocrand/lib location (but the includes are still at this location):

/opt/rocm-5.0.2/lib/librocrand.so
/opt/rocm-5.0.2/rocrand/lib/librocrand.so
/opt/rocm-5.1.0/lib/librocrand.so

drwxr-xr-x 2 root root 617 Mar  8 08:20 /opt/rocm-5.0.2/rocrand/include
drwxr-xr-x 2 root root 617 Mar 31 09:48 /opt/rocm-5.1.0/rocrand/include
2022-05-24 11:54:29 +02:00
Mr-Timn
6e7eb49888 su2: add v7.3.1 (#30794) 2022-05-24 10:50:16 +02:00
Paul Wolfenbarger
3df4a32c4f trilinos: add adelus, aprepro and teuchos variants (#28935) 2022-05-24 09:49:33 +01:00
Adam J. Stewart
95b03e7bc9 gplates: add v2.3.0 (#30676) 2022-05-24 08:02:39 +02:00
Greg Becker
817ee81eaa compiler flags: imposed hashes impose the lack of additional compiler flags (#30797) 2022-05-24 01:22:29 -04:00
Tom Scogland
330832c22c strip -Werror: all specific or none (#30284)
Add a config option to strip `-Werror*` or `-Werror=*` from compile lines everywhere.

```yaml
config:
    keep_werror: false
```

By default, we strip all `-Werror` arguments out of compile lines, to avoid unwanted
failures when upgrading compilers.  You can re-enable `-Werror` in your builds if
you really want to, with either:

```yaml
config:
    keep_werror: all
```

or to keep *just* specific `-Werror=XXX` args:

```yaml
config:
    keep_werror: specific
```

This should make swapping in newer versions of compilers much smoother when
maintainers have decided to enable `-Werror` by default.
2022-05-24 00:57:09 -04:00
Todd Gamblin
306bed48d7 specs: emit better parsing errors for specs. (#24860)
Parse error information is kept for specs, but it doesn't seem like we propagate it
to the user when we encounter an error.  This fixes that.

e.g., for this error in a package:

```python
    depends_on("python@:3.8", when="0.900:")
```

Before, with no context and no clue that it's even from a particular spec:

```
==> Error: Unexpected token: ':'
```

With this PR:

```
==> Error: Unexpected token: ':'
  Encountered when parsing spec:
    0.900:
         ^
```
2022-05-24 03:33:43 +00:00
Scott Wittenburg
63402c512b Revert "Added cloud_pipline for E4S on Amazon Linux (#29522)" (#30796)
This reverts commit 07e9c0695a.
2022-05-23 21:12:48 -06:00
Brian Van Essen
736fddc079 Bugfix hwloc find cuda (#30788)
* Added autotools configure flags to ensure that hwloc finds the correct
version of CUDA that it was concretized against, rather than the first
one that pkg-config finds.

* Added support for finding the correct version of ROCm libraries.  Fixed Flake8.

* Fixed guard on finding ROCm library
2022-05-23 20:17:46 -06:00
Jen Herting
036048c26f [py-h2] added version 3.2.0 and 4.1.0 (#29804)
* [py-h2] py-wheel is implied by PythonPackage

* [py-h2] python dependencies should be type=('build', 'run')

* [py-h2] fixed dependencies for py-h2@4.0.0

* [py-h2] added version 3.2.0

* [py-h2] added version 4.1.0

* [py-h2] Older version requires py-enum34 for older versions of python
2022-05-23 17:56:45 -07:00
Greg Becker
8616ba04db Documentation and new method for CachedCMakePackage build system (#22706)
Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>
2022-05-23 22:48:12 +00:00
Evan Bollig
07e9c0695a Added cloud_pipline for E4S on Amazon Linux (#29522)
Add two new cloud pipelines for E4S on Amazon Linux, including arm and x86 (v3 + v4) stacks.

Notes:
- Updated mpark-variant to remove a conflict that no longer exists in Amazon Linux
- The `which` command on Amazon Linux prefixes all results when padded_length is too high. In this case, padded_length<=503 works as expected, so a conservative length of 384 was chosen.
2022-05-23 15:33:38 -06:00
snehring
f24886acb5 usearch: adding in new version, updating checksums (#30776) 2022-05-23 13:50:50 -07:00
snehring
5031578c39 genemark-et: updating to 4.69 (#30793) 2022-05-23 13:36:26 -07:00
Massimiliano Culpo
7c4cc1c71c archspec: add oneapi and dpcpp flag support (#30783) 2022-05-23 13:28:54 -07:00
Harmen Stoppels
f7258e246f Deprecate spack:concretization over concretizer:unify (#30038)
* Introduce concretizer:unify option to replace spack:concretization

* Deprecate concretization

* Make spack:concretization overrule concretize:unify for now

* Add environment update logic to move from spack:concretization to spack:concretizer:reuse

* Migrate spack:concretization to spack:concretize:unify in all locations

* For new environments make concretizer:unify explicit, so that defaults can be changed in 0.19
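
For reference, a before/after sketch of the migration in a `spack.yaml`
(the specs are hypothetical):

```yaml
# deprecated form
spack:
  specs: [hdf5, zlib]
  concretization: together

# replacement form
spack:
  specs: [hdf5, zlib]
  concretizer:
    unify: true
```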
2022-05-23 13:20:34 -07:00
Maciej Wójcik
ff980a1452 plumed: add versions up to v2.8.0 (#30787) 2022-05-23 20:07:47 +00:00
Chuck Atkins
51130abf86 ci: Map visit to huge instance for the data-vis-sdk pipeline (#30779) 2022-05-23 14:16:24 -04:00
marcus-elia
383356452b abseil-cpp: add v20211102 (#30785) 2022-05-23 18:52:39 +02:00
Cody Balos
5fc1547886 xsdk-examples: add v0.3.0 (#30770)
Co-authored-by: Satish Balay <balay@mcs.anl.gov>
Co-authored-by: Veselin Dobrev <dobrev@llnl.gov>
2022-05-23 18:45:11 +02:00
Manuela Kuhn
68cd6c72c7 py-rnc2rng: fix 2.6.5 and add 2.6.6 (#30784) 2022-05-23 08:02:13 -07:00
Jean Luca Bez
3d2ff57e7b hdf5-vol-async: update new version, tests, and runtime envs (#30713)
* Update h5bench maintainers and versions

* Include version 1.1 for h5bench

* Correct release hash and set default version

* Update .tar.gz version

* Include new version and update runtime

* Update year

* Update package.py

* Update package.py
2022-05-23 09:28:26 -04:00
Massimiliano Culpo
3bc656808c llvm: make "omp_as_runtime" variant conditional (#30782)
fixes #30700

To avoid clingo adding penalties for not using the
default value for a variant, it's better to model
the variant as conditional where possible.
2022-05-23 14:21:11 +02:00
Ben Bergen
7ded692a76 gdb: add v11.2 (#30780)
- This resolves bug https://sourceware.org/PR28302
2022-05-23 12:18:26 +02:00
MichaelLaufer
aa3c7a138a py-pyfr: Add v1.14.0, add LIBXSMM variant (#30612)
* Update PyFR, Gimmik versions. Add PyFR LIBXSMM variant

* Fixes

* Changes to variants

* Update py-gimmik version requirement in py-pyfr

* Revert "Update py-gimmik version requirement in py-pyfr"

This reverts commit 3b3fde3042.

* Update libxsmm conflicts
2022-05-22 12:01:21 -06:00
Paul Kuberry
42441cddcc Add Teuchos patch that fixes implicit casting of complex numbers (#30777) 2022-05-22 09:26:33 -04:00
Abhik Sarkar
b78025345b Feature/composed boost pkg deps p3 (#28961)
* This commit replaces Boost.with_default_variants with the precise variants that packages depend upon. This is the third batch of 16 packages with modified boost dependencies.

* style fix

* Update var/spack/repos/builtin/packages/sympol/package.py

Co-authored-by: Tim Haines <thaines.astro@gmail.com>

* fix style

* Apply suggestions from code review

Co-authored-by: Tim Haines <thaines.astro@gmail.com>

* Fix Trilinos boost deps

* Fix style

Co-authored-by: Tim Haines <thaines.astro@gmail.com>
Co-authored-by: Tom Scogland <tom.scogland@gmail.com>
2022-05-21 15:57:59 -07:00
Harmen Stoppels
2113b625d1 gcc: add build_type and profiled variants (#30660)
Add a `build_type` variant, which allows building optimized compilers,
as well as target libraries (libstdc++ and friends).

The default is `build_type=RelWithDebInfo`, which corresponds to GCC's
default of -O2 -g.

When building with `+bootstrap %gcc`, also add Spack's arch specific
flags using the common denominator between host and new GCC.

This is done by creating a config/spack.mk file in `def patch`, which looks
as follows:
```
BOOT_CFLAGS := $(filter-out -O% -g%, $(BOOT_CFLAGS)) -O2 -g -march=znver2 -mtune=znver2
CFLAGS_FOR_TARGET := $(filter-out -O% -g%, $(CFLAGS_FOR_TARGET)) -O2 -g -march=znver2 -mtune=znver2
CXXFLAGS_FOR_TARGET := $(filter-out -O% -g%, $(CXXFLAGS_FOR_TARGET)) -O2 -g -march=znver2 -mtune=znver2
```
2022-05-21 23:44:51 +02:00
Jen Herting
c6c3d243e1 New package: py-motor (#30763)
* New package: py-motor

* Fixed dependency errors

* Fixed flake style errors

* Fixed linked issue

Co-authored-by: Viv Eric Hafener <vehrc@sporcbuild.rc.rit.edu>
2022-05-21 11:36:37 -05:00
Manuela Kuhn
870b997cb6 py-ww: add new package (#30767)
* py-ww: add new package

* Add missing py-pytest-runner dependency
2022-05-21 11:28:53 -05:00
snehring
c9492f1cd4 paml: add v4.10.3 (#30772) 2022-05-21 09:47:38 +02:00
dependabot[bot]
24f370491e build(deps): bump actions/upload-artifact from 3 to 3.1.0 (#30778)
Bumps [actions/upload-artifact](https://github.com/actions/upload-artifact) from 3 to 3.1.0.
- [Release notes](https://github.com/actions/upload-artifact/releases)
- [Commits](https://github.com/actions/upload-artifact/compare/v3...3cea5372237819ed00197afe530f5a7ea3e805c8)

---
updated-dependencies:
- dependency-name: actions/upload-artifact
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-05-21 08:57:19 +02:00
Glenn Johnson
d688a699fa Fix toolchain detection for oneapi/dpcpp compilers (#30775)
The oneapi and dpcpp compilers are essentially the same except for which
binary is used for CXX. Spack will detect them as "mixed toolchain" and
not inject compiler optimization flags. This will be needed once
archspec has entries for the oneapi and dpcpp compilers. This PR detects
when dpcpp and oneapi are in the toolchains list and explicitly sets
`is_mixed_toolchain` to `False`.
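
For intuition, a sketch of the two toolchains as abbreviated `compilers.yaml`
entries (paths are hypothetical) -- only the CXX binary differs:

```yaml
compilers:
- compiler:
    spec: oneapi@2022.1.0
    paths:
      cc:  /opt/intel/oneapi/compiler/latest/linux/bin/icx
      cxx: /opt/intel/oneapi/compiler/latest/linux/bin/icpx
      f77: /opt/intel/oneapi/compiler/latest/linux/bin/ifx
      fc:  /opt/intel/oneapi/compiler/latest/linux/bin/ifx
- compiler:
    spec: dpcpp@2022.1.0
    paths:
      cc:  /opt/intel/oneapi/compiler/latest/linux/bin/icx
      cxx: /opt/intel/oneapi/compiler/latest/linux/bin/dpcpp  # the one difference
      f77: /opt/intel/oneapi/compiler/latest/linux/bin/ifx
      fc:  /opt/intel/oneapi/compiler/latest/linux/bin/ifx
```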
2022-05-21 08:38:01 +02:00
Chuck Atkins
4fbb822072 visit: Overhaul to build in the DAV SDK (#30594)
* mesa-glu and mesa-demos: Fix conflicts with glu and osmesa

* visit: Update visit dependencies

* ecp-data-vis-sdk: Enable +visit

* ci[data-vis-sdk]: Enable +visit
2022-05-20 19:17:30 -04:00
Sreenivasa Murthy Kolam
f86c481280 update checksum and url for mlirmiopen recipe (#30771) 2022-05-20 21:29:58 +00:00
Jen Herting
91a99882b3 [py-openslide-python] added version 1.1.2 (#30722)
* [py-openslide-python] added version 1.1.2 and set max py-setuptools version for 1.1.1

* [py-openslide-python]

- setuptools required for all possible newer versions
- python is type build run

* [py-openslide-python] use pil provider
2022-05-20 19:25:20 +00:00
Manuela Kuhn
74bef2105a py-formatizer: add new package (#30755) 2022-05-20 10:37:55 -07:00
Manuela Kuhn
630ebb9d8b py-pybids: add 0.8.0 and 0.15.1 (#30756) 2022-05-20 10:37:01 -07:00
iarspider
183465321e DWZ: use virtual "elf" package (#30761) 2022-05-20 10:29:11 -07:00
Nate deVelder
580f9ec86e Add newer openfast versions and preliminary OpenMP compile support (#30752)
* Add version 3.0 and 3.1 and prelim OpenMP support

* Fix flag handler missing spec variable

* Use self.compiler.openmp_flag instead of -fopenmp

* Fix whitespace
2022-05-20 10:27:47 -07:00
Jordan Galby
0b0920bc90 qt: Qt 5.15.0 requires OpenSSL 1.1.1 (#30754)
Fixes qt configure errors with external openssl on older systems (rhel7)

See
efc02f9cc3/dist/changes-5.15.0 (L346)

This means that from now on, `qt ^openssl@1.0` gets you `qt@5.15.4 ~ssl`:
clingo chooses the latest qt version **but disables ssl support**.
2022-05-20 18:24:30 +02:00
Greg Becker
ee04a1ab0b errors: model error messages as an optimization problem (#30669)
Error messages for the clingo concretizer have proven challenging. The current messages are incredibly vague and often don't help users at all. Unsat cores in clingo are not guaranteed to be minimal, and lead to cores that are either not useful or need to be post-processed for hours to reach a minimal core.

Following up on an idea from a Slack conversation with kwryankrattiger, this PR takes a new approach. We eliminate most integrity constraints and minima/maxima on choice rules in clingo, and instead force invalid states to imply an error predicate. The error predicate can include context on the cause of the error (Package, Version, etc). These error predicates are then heavily optimized against, to ensure that we do not include error facts in the solution when a solution with no error facts could be generated. When post-processing the clingo solution to construct specs, any error facts cause the program to raise an exception. This leads to much more legible error messages. Each error predicate includes a priority and an error message, which is formatted with the predicate's remaining arguments. The priority is used to ensure that when clingo has a choice of which rules to violate, it chooses the one which will be most informative to the user.

Performance:

"fresh" concretizations appear to suffer a ~20% performance penalty under this branch, while "reuse" concretizations see a speedup of around 33%. 

Possible optimizations if users still see unhelpful messages:

There are currently 3 levels of priority of the error messages. Additional priorities are possible, and can allow us finer granularity to ensure more informative error messages are provided in lieu of less informative ones.

Future work:

Improve tests to ensure that every possible rule implying an error message is exercised
2022-05-20 08:27:07 -07:00
Peter Scheibel
55f4950ed4 Petsc: fix enable-x for virtuals (#30749)
* If direct dependencies provide virtuals, add those virtual names as well
* Also refer to package by virtual name for jpeg
2022-05-20 03:15:44 +00:00
Manuela Kuhn
23960ed623 py-fracridge: add new package (#30739) 2022-05-19 17:28:39 -07:00
iarspider
fb2730d87f Update py-onnx-runtime to 1.10; update CMS patch (#30725)
* Update py-onnx-runtime to 1.10; update CMS patch

* Update package.py
2022-05-19 17:24:18 -07:00
Wileam Y. Phan
30f2394782 Run scheduled CI workflows only in the main repo (#30729)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2022-05-19 21:35:52 +02:00
Jordan Galby
262c3f07bf Non-existent upstream is not fatal (#30746)
A non-existent upstream should not be fatal: it could simply mean it is
not deployed yet. In the meantime, it should not block the user from
rebuilding anything they need.

A warning is still emitted, to let the user decide if this is ok or not.
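
For context, a sketch of the kind of `upstreams.yaml` entry this affects (the
path is hypothetical); if the `install_tree` below does not exist yet, Spack
now warns instead of aborting:

```yaml
upstreams:
  shared-instance:
    install_tree: /shared/site/spack/opt/spack
```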
2022-05-19 18:27:43 +00:00
Harmen Stoppels
b018eb041f python: more +optimizations (#30742) 2022-05-19 12:25:36 -06:00
Matthew Archer
3f4398dd67 fenicsx: update to v0.4 (#30661)
* Fix for xtensor-xsimd

* Add sha256 for all new releases

* renamed ufcx package

* Update sha for ffcx

* fixed hashes and modified fenics-dolfinx to depend on ufcx

* cleaned and fixed dependency types

* use spec.satisfies in cmake_args

* bumped to ufcx@0.4.1

* address PR comments

* fix hashes

* update parmetis in cmake_args to reflect default setting

* update versions

* Add dependency fix

* bump basix to 0.4.2 and address PR comments

* Versioning fixes

* Use xtensor-0.24: and loosen pybind11

* Add conflicts for partitioners

* Updates on partitioners

* use define_from_variant

* Tidy up some dependencies

* Work on multi-variants for graph partitioners

* Fix KaHIP issue.

KaHIP changed the name of its library from 'interface' to 'kahip'. Pin earlier versions of DOLFINx to earlier versions of KaHIP for proper detection.

Co-authored-by: Chris Richardson <chris@bpi.cam.ac.uk>
Co-authored-by: Garth N. Wells <gnw20@cam.ac.uk>
2022-05-19 18:22:09 +00:00
kwryankrattiger
a225a5b276 Ascent: Add variant to disable blt_find_mpi (#30735)
This is needed to find MPI correctly on Cray systems and similar.
2022-05-19 10:06:31 -07:00
iarspider
c9cfc548da Update millepede recipe (#30737) 2022-05-19 10:03:31 -07:00
Rémi Lacroix
3b30886a3a libtheora: fix the hash of a patch. (#30740)
Cosmetic changes only, probably because gitlab.xiph.org got updated.
2022-05-19 09:59:12 -07:00
Jordan Galby
c2fd98ccd2 Fix spack install chgrp on symlinks (#30743)
Fixes missing chgrp on symlinks in package installations, and errors on
symlinks referencing non-existent or non-writable locations.

Note: `os.chown(.., follow_symlinks=False)` is python3 only, but
`os.lchown` exists in both versions.
2022-05-19 08:50:24 -07:00
Valentin Volkl
a0fe6ab2ed cuda: use stage dir instead of /tmp during install (#29584) 2022-05-19 17:18:18 +02:00
Harmen Stoppels
c3be777ea8 tar: add compress deps (#30641)
* remove spack dependency from package

* tar: fix compression programs, use pigz by default instead of gzip on -z
2022-05-19 11:15:13 -04:00
Jordan Galby
8fe39be3df Don't try to mkdir upstream directory when nonexistent (#30744)
When an upstream is specified but the directory does not exist, don't
create the directory for it; it might not be yours.
2022-05-19 14:45:18 +00:00
Tiziano Müller
f5250da611 cp2k: fix unbound var use without cuda (#30736)
fixes #30631
2022-05-19 11:38:00 +02:00
David Beckingsale
c2af154cd2 RAJA and associated packages: add v2022.03.0 (#30047)
* Add raja@2022.03.0
* Add camp@2022.03.0
* Add chai@2022.03.0
* Add umpire@2022.03.1
* Latest chai, raja, umpire versions don't need submodules
* Latest chai, raja, umpire versions update CMake option names
* New umpire +device_alloc option (for latest version)
* All versions of dray are now required to build with raja@:0.14

Co-authored-by: Marty McFadden <mcfadden8@users.noreply.github.com>
2022-05-18 22:48:22 -07:00
robgics
1f6b880fff Add license dir to config (#30135)
* Change license dir from hard-coded to a configurable item

* Change config item to be a string not an array

Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2022-05-18 18:26:42 -07:00
Teodor Nikolov
2c211d95ee Catch2: update to 3.0.1 (#30732)
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2022-05-19 00:47:20 +00:00
Asher Mancinelli
c46f673c16 ExaSGD bugfixes (#30731)
* Small bug fixes for ExaSGD packages

* Add Slaven as maint for both exago and hiop
2022-05-18 15:25:24 -07:00
Todd Gamblin
8ff2b4b747 bugfix: handle new dag_hash() on old concrete specs gracefully. (#30678)
Trying to compute `dag_hash()` or `package_hash()` on a concrete spec that doesn't have
a `_package_hash` attribute would attempt to recompute the package hash.

This most commonly manifests as a failed lookup of a namespace if you attempt to uninstall
or compute the hashes of packages in external repositories that aren't registered, e.g.:

```console
> spack spec --json c/htno
==> Error: Unknown namespace: myrepo
```

While it wouldn't change the already-assigned `dag_hash` value, this behavior is
incorrect, since the package file for a previously concrete spec:
  1. might have changed since concretization,
  2. might not exist anymore, or
  3. might just not be findable by Spack.

This PR ensures that the package hash can't be computed on older concrete specs. Instead
of calling `package_hash()` from within `to_node_dict()`, we now check for the `_package_hash`
attribute and only add the package_hash to the spec record if it's there.

This PR also handles the tricky semantics of computing `package_hash()` at concretization
time. We have to compute it *before* marking the spec concrete so that `to_node_dict` can
use it. But this means that the logic for `package_hash()` can't rely on `spec.concrete`,
as it is called *during* concretization. Instead of checking for concreteness, `package_hash()`
now checks `_patches_assigned()` to determine whether it should add them to the package
hash.

- [x] Add an assert to `package_hash()` so it can't be called on specs for which it
      would be wrong.
- [x] Add an `_assign_hash()` method to handle tricky semantics of `package_hash`
      and `dag_hash`.
- [x] Rework concretization to call `_assign_hash()` before and after marking specs
      concrete.
- [x] Rework content hash part of package hash to check for `_patches_assigned()`
      instead of `spec.concrete`.
- [x] regression test
2022-05-18 22:21:22 +00:00
Michael Kuhn
9e05dde28c harfbuzz: add gobject-introspection dependency (#30715)
Fixes #30706
2022-05-18 15:13:36 -06:00
Jen Herting
b1ef5a75f0 [py-tensorflow-hub] applied patch for newer version of zlib (#30664)
* [py-tensorflow-hub] applied patch for newer version of zlib

* [py-tensorflow-hub] patch also applies to 0.11.0

* [py-tensorflow-hub] Audit fix

1. patch URL in package py-tensorflow-hub must end with ?full_index=1
2022-05-18 13:21:21 -07:00
Timothy Brown
f9aa7c611c [WGRIB2] Pinning Jasper to v2. (#30726)
If we use v3 of Jasper, WGRIB2 fails to build due to not
finding `jpc_encode` and `jpc_decode`.
2022-05-18 09:34:22 -07:00
Timothy Brown
9a2e01e22d [G2] Pinning Jasper to version 2. (#30728) 2022-05-18 09:33:05 -07:00
Alberto Invernizzi
2090351d7f Add hip dependency for roc-obj-ls + add perl-uri-encode (#30721)
* add perl-uri-encode package

* add dependencies in hip for roc-obj-ls
2022-05-18 16:24:35 +02:00
Massimiliano Culpo
c775c322ec vendored externals: update archspec (#30683)
- Better support for a64fx
- Better support for Apple M1(pro)
2022-05-18 11:31:20 +02:00
Harmen Stoppels
1185eb9199 Compiler wrapper: fix globbing and debug out.log bell chars (#30699)
* Disable globbing

* Split on bell char when dumping cmd to out.log
2022-05-18 09:06:54 +02:00
Ryan Marcellino
51fa8e7b5e py-pytecplot: new package (#30708)
* py-pytecplot: new package

* fix copyright year

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>

* use one variant for all extras

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2022-05-18 01:47:17 +00:00
Mikael Simberg
f505c50770 Add tracy (#30588)
Co-authored-by: Mikael Simberg <mikael.simberg@iki.if>
2022-05-17 18:10:45 -07:00
Mark W. Krentel
2fdc817f03 hpctoolkit: add version 2022.05.15 (#30710) 2022-05-17 18:05:12 -07:00
Michael Kuhn
e1d0b35d5b gobject-introspection: add 1.72.0 (#30714)
Newer versions of gobject-introspection require Meson to build. Convert
the package into a hybrid one that still supports older versions using
Autotools.
2022-05-17 17:59:54 -07:00
Sreenivasa Murthy Kolam
ace5a7c4bf deprecate rocm releases: ROCm-4.2.0, ROCm-4.3.0, ROCm-4.3.1 (#30709) 2022-05-17 14:11:32 -07:00
kwryankrattiger
a91ae8cafe ecp-data-vis-sdk: Drop fortran from ascent spec. (#30707) 2022-05-17 16:53:52 -04:00
Chuck Atkins
b9e3ee6dd0 glew: Fix glu and glx dependencies (#30705) 2022-05-17 10:19:19 -07:00
iarspider
10ea0a2a3e Update rivet to 3.1.6, yoda to 1.9.5; fix recipes (#30702)
* Update rivet to 3.1.6, yoda to 1.9.5; fix recipes

* Use filter_compiler_wrappers instead of custom post-install step
2022-05-17 10:17:24 -07:00
Harmen Stoppels
d6f8ffc6bc add julia 1.6.6 (#30703) 2022-05-17 09:52:46 -07:00
haralmha
02be2f27d1 flatbuffers: Add version 2.0.6 (#30704) 2022-05-17 09:51:02 -07:00
Adam J. Stewart
dfd0702aec GDAL: deprecate 2.X (#30668) 2022-05-17 08:45:55 -07:00
Massimiliano Culpo
f454a683b5 Mark test_repo_last_mtime xfail on Python < 3.5 (#30696) 2022-05-17 12:45:52 +02:00
Chuck Atkins
d7d0c892d8 silo: Make HDF5 version deps more robust (#30693) 2022-05-17 04:26:35 -04:00
snehring
d566330a33 pindel: fixing compilation issues for gcc5+ (#28387) 2022-05-17 09:45:08 +02:00
Alex Hedges
446cbf4b5a sed: add v4.8.0, set gnu_mirror_path per version (#30666) 2022-05-17 09:43:01 +02:00
Tim Haines
5153c9e98c boost: constrain context-impl variant (#30654) 2022-05-17 09:42:19 +02:00
h-murai
d74f2d0be5 petsc: fix an error about handling a provider of Scalapack. (#30682) 2022-05-17 09:33:49 +02:00
Jianshen Liu
021b65d76f cppcheck: add v2.7 (#30698) 2022-05-17 07:29:46 +00:00
Adam J. Stewart
45312d49be Bazel: remove maintainer (#30697) 2022-05-17 09:29:22 +02:00
Chuck Atkins
3fcd85efe9 autoconf-archive: Patch for nvhpc support and propagate search dirs (#30692) 2022-05-17 03:29:42 +00:00
Alberto Madonna
6f3a082c3e runc: symlink sbin to bin to find it in $PATH (#30691) 2022-05-17 02:48:04 +02:00
stepanvanecek
23e2820547 sys-sage: new spack package (#30570)
* sys-sage - adding new package

* sys-sage: updated release version

* sys-sage: remove FIXMEs from the package

* add libllvm dependency

* sys-sage: remove unnecessary libllvm dependency

Co-authored-by: Stepan Vanecek <stepan.vanecek@tum.de>
2022-05-16 17:01:34 -07:00
Sreenivasa Murthy Kolam
22b999fcd4 Add miopentensile: new recipe for ROCm-5.0.0 and ROCm-5.1.0 releases (#29313)
* miopentensile-new recipe for rocm-5.0.0 release

* fix style checks

* update the version for 5.1.0 release, avoid git download
2022-05-16 16:19:35 -07:00
Michael Kuhn
1df7de62ca py-cffconvert: new package (#30694) 2022-05-16 23:16:32 +00:00
andymwood
97ec8f1d19 Avoid calling a method on a NoneType object (#30637) 2022-05-16 21:59:08 +00:00
iarspider
63b6e484fc sigcpp: protect from missing prefix.share folder (#30686) 2022-05-16 13:34:57 -07:00
Nick Forrington
2b12d19314 arm-forge: Versions up to 22.0.1 + minor updates (#30689)
* arm-forge: Download via HTTPS

Update download URL to use HTTPS (rather than HTTP)

* arm-forge: Allow +probe to depend on python3

Allow python dependency required for arm-forge+probe to be python3 as
well as 2.7.x

* arm-forge: Add versions up to 22.0.1
2022-05-16 14:30:19 -06:00
Francesco Giordano
c37fcccd7c aws-parallelcluster: add v2.11.7 (#30685) 2022-05-16 14:29:53 -06:00
Michael Kuhn
6034b5afc2 fix pkgconfig dependencies (#30688)
`pkgconfig` is the correct (virtual) dependency; `pkg-config` is one provider of it.
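
For intuition, a sketch of the distinction (provider choice shown is an
example): recipes depend on the virtual `pkgconfig`, and `packages.yaml` can
pin which provider satisfies it:

```yaml
packages:
  all:
    providers:
      pkgconfig: [pkg-config]  # pkgconf is the other provider
```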
2022-05-16 14:01:41 -06:00
Michael Kuhn
17bc937083 libfuse: add utils variant (#30675)
By default, libfuse installs helper programs like `fusermount3`, which
are mostly useless if not installed with setuid (that is, `+useroot`).

However, their presence makes it complicated to use globally installed
versions, which can be combined with a Spack-installed FUSE library.

In particular, on systems that have a setuid fusermount3 binary, but no
libfuse-dev installed, it is nice to be able to build libfuse with Spack, and
have it call the system setuid executable.
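
A sketch of pinning this in `packages.yaml` so a Spack-built libfuse omits the
helpers and defers to a system `fusermount3` (variant name from this commit):

```yaml
packages:
  libfuse:
    variants: ~utils
```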
2022-05-16 13:01:44 -06:00
renjithravindrankannath
ad8db0680d Correcting include and library paths using patch file for RVS (#30294)
* Correcting include and library paths using a patch file so that RVS builds the
following library files in spack:
libperf.so.0.0
libpebb.so.0.0
libiet.so.0.0
libgst.so.0.0
libpqt.so.0.0
libmem.so.0.0
libbabel.so.0.0

* Replacing ROCM_PATH with RPATH in the deviceid.sh before installing in Spack build.

* Reducing multiple environment variables for HIP and HSA paths
2022-05-16 09:44:59 -07:00
estewart08
4f033b155b [AMD][rocm-openmp-extras] - Update versions 5.0.0 through 5.1.0. (#30501)
- Removed gl dependency.
- Specify clang as cmake compiler as gcc was being
  improperly picked up. As a result, ffi include
  path was needed in C/CXX flags.

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2022-05-16 09:22:48 -07:00
Sreenivasa Murthy Kolam
ad829ccee1 Apply cyclades removal patch for the llvm-amdgpu package (#29376)
* apply cyclades removal patch for the llvm-amdgpu spack package

* update the changes with develop branch
2022-05-16 09:15:26 -07:00
Tom Vander Aa
4b60a17174 Extrae: add support for Intel OneAPI (#30684) 2022-05-16 09:02:00 -06:00
Ethan Stam
edb91f4077 ParaView: -no-ipo for intel builds (#18193) 2022-05-16 08:01:34 -06:00
Todd Gamblin
0fdc3bf420 bugfix: use deterministic edge order for spack graph (#30681)
Previously we sorted by hash values for `spack graph`, but changing hashes can make the
test brittle and the node order seem nondeterministic to users.

- [x] Sort nodes in `spack graph` by the default edge order, which takes into account
      parent and child names as well as dependency types.
- [x] Update ASCII test output for new order.
2022-05-16 11:36:41 +02:00
Francine Lapid
8b34cabb16 subversion: added apxs support (#30381) 2022-05-16 10:42:15 +02:00
haralmha
77fb651e01 frontier-client: adapt pacparser_setmyip function to new pacparser release (#29936) 2022-05-16 10:41:21 +02:00
Harmen Stoppels
35ed7973e2 eccodes, fix jasper again (#30635)
* eccodes: jasper@:2

* Revert "jasper: avoid --gc-sections / hidden symbols"

This reverts commit d1bc0f39c516a7dc1e941aa4a804b7468a200b75.

* bump ecbuild, drop cmake constraint for newer versions

* add ecbuild dep to eccodes@develop
2022-05-16 10:40:35 +02:00
snehring
e73b19024f bpp-suite and deps: urls to GitHub (#30665)
* bpp-core: moving url to github. Fixing compilation issue.

* bpp-phyl: moving url to github.

* bpp-seq: moving url to github

* bpp-popgen: new package

* bpp-suite: moving url to github, new version.

* bpp-popgen: removing unused cmake_args.
2022-05-16 10:23:39 +02:00
Alex Hedges
7803bc9e5f screen: add v4.9.0, add required build deps (#30667) 2022-05-16 10:11:02 +02:00
dlkuehn
55c400297c treesub: change jdk dependency to java, add build to java dep. type (#30672)
Co-authored-by: David Kuehn <las_dkuehn@iastate.edu>
2022-05-16 10:06:09 +02:00
Umar Arshad
8686e18494 clblast: add new package (#30677) 2022-05-16 10:03:15 +02:00
Diego Alvarez
d28967bbf3 nextflow: add v22.04.1 (#30679) 2022-05-16 09:01:15 +02:00
Diego Alvarez
5f928f71c0 openjdk: add 11.0.15+10, 17.0.3+7 (#30680) 2022-05-16 08:58:51 +02:00
Ken Raffenetti
dc7bdf5f24 mpich: add support for Mellanox HCOLL (#30662)
Co-authored-by: Federico Ficarelli <federico.ficarelli@pm.me>
2022-05-15 14:14:47 +02:00
257 changed files with 4283 additions and 1168 deletions

View File

@@ -24,6 +24,7 @@ jobs:
fedora-clingo-sources:
runs-on: ubuntu-latest
container: "fedora:latest"
if: github.repository == 'spack/spack'
steps:
- name: Install dependencies
run: |
@@ -57,6 +58,7 @@ jobs:
ubuntu-clingo-sources:
runs-on: ubuntu-latest
container: "ubuntu:latest"
if: github.repository == 'spack/spack'
steps:
- name: Install dependencies
env:
@@ -93,6 +95,7 @@ jobs:
ubuntu-clingo-binaries-and-patchelf:
runs-on: ubuntu-latest
container: "ubuntu:latest"
if: github.repository == 'spack/spack'
steps:
- name: Install dependencies
env:
@@ -126,6 +129,7 @@ jobs:
opensuse-clingo-sources:
runs-on: ubuntu-latest
container: "opensuse/leap:latest"
if: github.repository == 'spack/spack'
steps:
- name: Install dependencies
run: |
@@ -154,6 +158,7 @@ jobs:
macos-clingo-sources:
runs-on: macos-latest
if: github.repository == 'spack/spack'
steps:
- name: Install dependencies
run: |
@@ -174,6 +179,7 @@ jobs:
strategy:
matrix:
python-version: ['3.5', '3.6', '3.7', '3.8', '3.9', '3.10']
if: github.repository == 'spack/spack'
steps:
- name: Install dependencies
run: |
@@ -195,6 +201,7 @@ jobs:
strategy:
matrix:
python-version: ['2.7', '3.5', '3.6', '3.7', '3.8', '3.9', '3.10']
if: github.repository == 'spack/spack'
steps:
- name: Checkout
uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b
@@ -216,6 +223,7 @@ jobs:
ubuntu-gnupg-binaries:
runs-on: ubuntu-latest
container: "ubuntu:latest"
if: github.repository == 'spack/spack'
steps:
- name: Install dependencies
env:
@@ -250,6 +258,7 @@ jobs:
ubuntu-gnupg-sources:
runs-on: ubuntu-latest
container: "ubuntu:latest"
if: github.repository == 'spack/spack'
steps:
- name: Install dependencies
env:
@@ -285,6 +294,7 @@ jobs:
macos-gnupg-binaries:
runs-on: macos-latest
if: github.repository == 'spack/spack'
steps:
- name: Install dependencies
run: |
@@ -302,6 +312,7 @@ jobs:
macos-gnupg-sources:
runs-on: macos-latest
if: github.repository == 'spack/spack'
steps:
- name: Install dependencies
run: |

View File

@@ -43,6 +43,7 @@ jobs:
[ubuntu-focal, 'linux/amd64,linux/arm64,linux/ppc64le', 'ubuntu:20.04'],
[ubuntu-jammy, 'linux/amd64,linux/arm64,linux/ppc64le', 'ubuntu:22.04']]
name: Build ${{ matrix.dockerfile[0] }}
if: github.repository == 'spack/spack'
steps:
- name: Checkout
uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b # @v2
@@ -75,7 +76,7 @@ jobs:
fi
- name: Upload Dockerfile
uses: actions/upload-artifact@6673cd052c4cd6fcf4b4e6e60ea986c889389535
uses: actions/upload-artifact@3cea5372237819ed00197afe530f5a7ea3e805c8
with:
name: dockerfiles
path: dockerfiles
@@ -94,7 +95,7 @@ jobs:
password: ${{ secrets.GITHUB_TOKEN }}
- name: Log in to DockerHub
if: ${{ github.event_name != 'pull_request' }}
if: github.event_name != 'pull_request'
uses: docker/login-action@49ed152c8eca782a232dede0303416e8f356c37b # @v1
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}

View File

@@ -22,6 +22,7 @@ on:
jobs:
install_gcc:
name: gcc with clang
if: github.repository == 'spack/spack'
runs-on: macos-latest
steps:
- uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b # @v2
@@ -36,6 +37,7 @@ jobs:
install_jupyter_clang:
name: jupyter
if: github.repository == 'spack/spack'
runs-on: macos-latest
timeout-minutes: 700
steps:
@@ -50,6 +52,7 @@ jobs:
install_scipy_clang:
name: scipy, mpl, pd
if: github.repository == 'spack/spack'
runs-on: macos-latest
steps:
- uses: actions/checkout@2541b1294d2704b0964813337f33b291d3f8596b # @v2

View File

@@ -139,11 +139,11 @@ jobs:
echo "installer_root=$((pwd).Path)" | Out-File -FilePath $Env:GITHUB_ENV -Encoding utf8 -Append
env:
ProgressPreference: SilentlyContinue
- uses: actions/upload-artifact@v3
- uses: actions/upload-artifact@3cea5372237819ed00197afe530f5a7ea3e805c8
with:
name: Windows Spack Installer Bundle
path: ${{ env.installer_root }}\pkg\Spack.exe
- uses: actions/upload-artifact@v3
- uses: actions/upload-artifact@3cea5372237819ed00197afe530f5a7ea3e805c8
with:
name: Windows Spack Installer
path: ${{ env.installer_root}}\pkg\Spack.msi

View File

@@ -15,16 +15,38 @@ concretizer:
# as possible, rather than building. If `false`, we'll always give you a fresh
# concretization.
reuse: true
# If `true`, Spack will consider minimizing builds its *topmost* priority.
# Note that this can result in weird package configurations. In particular,
# Spack will disable variants and might downgrade versions to avoid building
# new packages for an install. By default, Spack respects defaults from
# packages and preferences *before* minimizing the number of builds.
#
# Example for intuition: `cmake` can optionally build without openssl, but
# it's enabled by default because many builds use that functionality. Using
# `minimal: true` will build `cmake~openssl` unless the user asks for
# `cmake+openssl` explicitly.
minimal: false
# Options that tune which targets are considered for concretization. The
# concretization process is very sensitive to the number of targets, and the time
# needed to reach a solution increases noticeably with the number of targets
# considered.
targets:
# Determine whether we want to target specific or generic microarchitectures.
# An example of the first kind might be for instance "skylake" or "bulldozer",
# while generic microarchitectures are for instance "aarch64" or "x86_64_v4".
granularity: microarchitectures
# If "false" allow targets that are incompatible with the current host (for
# instance concretize with target "icelake" while running on "haswell").
# If "true" only allow targets that are compatible with the host.
host_compatible: true
# When "true" concretize root specs of environments together, so that each unique
# package in an environment corresponds to one concrete spec. This ensures
# environments can always be activated. When "false" perform concretization separately
# on each root spec, allowing different versions and variants of the same package in
# an environment.
unify: false

View File

@@ -33,6 +33,9 @@ config:
template_dirs:
- $spack/share/spack/templates
# Directory where licenses should be located
license_dir: $spack/etc/spack/licenses
# Temporary locations Spack can try to use for builds.
#
# Recommended options are given below.

View File

@@ -39,6 +39,7 @@ on these ideas for each distinct build system that Spack supports:
build_systems/autotoolspackage
build_systems/cmakepackage
build_systems/cachedcmakepackage
build_systems/mesonpackage
build_systems/qmakepackage
build_systems/sippackage

View File

@@ -0,0 +1,123 @@
.. Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)
.. _cachedcmakepackage:
------------------
CachedCMakePackage
------------------
The CachedCMakePackage base class is used for CMake-based workflows
that create a CMake cache file prior to running ``cmake``. This is
useful for packages with arguments longer than the system limit, and
for reproducibility.
The documentation for this class assumes that the user is familiar with
the ``CMakePackage`` class from which it inherits. See the documentation
for :ref:`CMakePackage <cmakepackage>`.
^^^^^^
Phases
^^^^^^
The ``CachedCMakePackage`` base class comes with the following phases:
#. ``initconfig`` - generate the CMake cache file
#. ``cmake`` - generate the Makefile
#. ``build`` - build the package
#. ``install`` - install the package
By default, these phases run:
.. code-block:: console
$ mkdir spack-build
$ cd spack-build
$ cat << EOF > name-arch-compiler@version.cmake
# Write information on compilers and dependencies
# includes information on mpi and cuda if applicable
EOF
$ cmake .. -DCMAKE_INSTALL_PREFIX=/path/to/installation/prefix -C name-arch-compiler@version.cmake
$ make
$ make test # optional
$ make install
The ``CachedCMakePackage`` class inherits from the ``CMakePackage``
class, and accepts all of the same options and adds all of the same
flags to the ``cmake`` command. Similar to the ``CMakePackage`` class,
you may need to add a few arguments yourself, and the
``CachedCMakePackage`` provides the same interface to add those
flags.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Adding entries to the CMake cache
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
In addition to adding flags to the ``cmake`` command, you may need to
add entries to the CMake cache in the ``initconfig`` phase. This can
be done by overriding one of four methods:
#. ``CachedCMakePackage.initconfig_compiler_entries``
#. ``CachedCMakePackage.initconfig_mpi_entries``
#. ``CachedCMakePackage.initconfig_hardware_entries``
#. ``CachedCMakePackage.initconfig_package_entries``
Each of these methods returns a list of CMake cache strings. The
distinction between these methods is merely to provide a
well-structured and legible cmake cache file -- otherwise, entries
from each of these methods are handled identically.
Spack also provides convenience methods for generating CMake cache
entries. These methods are available at module scope in every Spack
package. Because CMake parses boolean options, strings, and paths
differently, there are three such methods:
#. ``cmake_cache_option``
#. ``cmake_cache_string``
#. ``cmake_cache_path``
These methods each accept three parameters -- the name of the CMake
variable associated with the entry, the value of the entry, and an
optional comment -- and return strings in the appropriate format to be
returned from any of the ``initconfig*`` methods. Additionally, these
methods may return comments beginning with the ``#`` character.
A typical usage of these methods may look something like this:
.. code-block:: python
def initconfig_mpi_entries(self):
    # Get existing MPI configurations
    entries = super(Foo, self).initconfig_mpi_entries()

    # The existing MPI configurations key on whether ``mpi`` is in the spec
    # This spec has an MPI variant, and we need to enable MPI when it is on.
    # This hypothetical package controls MPI with the ``FOO_MPI`` option to
    # cmake.
    if '+mpi' in self.spec:
        entries.append(cmake_cache_option('FOO_MPI', True, "enable mpi"))
    else:
        entries.append(cmake_cache_option('FOO_MPI', False, "disable mpi"))
    return entries

def initconfig_package_entries(self):
    # Package specific options
    entries = []

    entries.append('#Entries for build options')
    bar_on = '+bar' in self.spec
    entries.append(cmake_cache_option('FOO_BAR', bar_on, 'toggle bar'))

    entries.append('#Entries for dependencies')
    if self.spec['blas'].name == 'baz':  # baz is our blas provider
        entries.append(cmake_cache_string('FOO_BLAS', 'baz', 'Use baz'))
        entries.append(cmake_cache_path('BAZ_PREFIX', self.spec['baz'].prefix))
    return entries
^^^^^^^^^^^^^^^^^^^^^^
External documentation
^^^^^^^^^^^^^^^^^^^^^^
For more information on CMake cache files, see:
https://cmake.org/cmake/help/latest/manual/cmake.1.html

View File

@@ -59,7 +59,8 @@ other techniques to minimize the size of the final image:
&& echo " specs:" \
&& echo " - gromacs+mpi" \
&& echo " - mpich" \
&& echo " concretization: together" \
&& echo " concretizer:" \
&& echo " unify: true" \
&& echo " config:" \
&& echo " install_tree: /opt/software" \
&& echo " view: /opt/view") > /opt/spack-environment/spack.yaml
@@ -245,7 +246,8 @@ software is respectively built and installed:
&& echo " specs:" \
&& echo " - gromacs+mpi" \
&& echo " - mpich" \
&& echo " concretization: together" \
&& echo " concretizer:" \
&& echo " unify: true" \
&& echo " config:" \
&& echo " install_tree: /opt/software" \
&& echo " view: /opt/view") > /opt/spack-environment/spack.yaml
@@ -366,7 +368,8 @@ produces, for instance, the following ``Dockerfile``:
&& echo " externals:" \
&& echo " - spec: cuda%gcc" \
&& echo " prefix: /usr/local/cuda" \
&& echo " concretization: together" \
&& echo " concretizer:" \
&& echo " unify: true" \
&& echo " config:" \
&& echo " install_tree: /opt/software" \
&& echo " view: /opt/view") > /opt/spack-environment/spack.yaml

View File

@@ -281,8 +281,8 @@ need to be installed alongside each other. Central installations done
at HPC centers by system administrators or user support groups
are a common case that fits in this behavior.
Environments *can also be configured to concretize all
the root specs in a self-consistent way* to ensure that
each package in the environment comes with a single configuration. This
the root specs in a unified way* to ensure that
each package in the environment corresponds to a single concrete spec. This
mode of operation is usually what is required by software developers that
want to deploy their development environment.
@@ -499,7 +499,7 @@ Spec concretization
Specs can be concretized separately or together, as already
explained in :ref:`environments_concretization`. The behavior active
under any environment is determined by the ``concretization`` property:
under any environment is determined by the ``concretizer:unify`` property:
.. code-block:: yaml
@@ -509,10 +509,15 @@ under any environment is determined by the ``concretization`` property:
- netcdf
- nco
- py-sphinx
concretization: together
concretizer:
unify: true
which can currently take either one of the two allowed values ``together`` or ``separately``
(the default).
.. note::
The ``concretizer:unify`` config option was introduced in Spack 0.18 to
replace the ``concretization`` property. For reference,
``concretization: separately`` is replaced by ``concretizer:unify:false``,
and ``concretization: together`` is replaced by ``concretizer:unify:true``.
.. admonition:: Re-concretization of user specs

View File

@@ -115,7 +115,8 @@ And here's the spack environment built by the pipeline represented as a
spack:
view: false
concretization: separately
concretizer:
unify: false
definitions:
- pkgs:

View File

@@ -61,7 +61,7 @@ You can see the packages we added earlier in the ``specs:`` section. If you
ever want to add more packages, you can either use ``spack add`` or manually
edit this file.
We also need to change the ``concretization:`` option. By default, Spack
We also need to change the ``concretizer:unify`` option. By default, Spack
concretizes each spec *separately*, allowing multiple versions of the same
package to coexist. Since we want a single consistent environment, we want to
concretize all of the specs *together*.
@@ -78,7 +78,8 @@ Here is what your ``spack.yaml`` looks like with this new setting:
# add package specs to the `specs` list
specs: [bash@5, python, py-numpy, py-scipy, py-matplotlib]
view: true
concretization: together
concretizer:
unify: true
^^^^^^^^^^^^^^^^
Symlink location

View File

@@ -25,4 +25,5 @@ spack:
- subversion
# Plotting
- graphviz
concretization: together
concretizer:
unify: true

32
lib/spack/env/cc vendored
View File

@@ -1,4 +1,4 @@
#!/bin/sh
#!/bin/sh -f
# shellcheck disable=SC2034 # evals in this script fool shellcheck
#
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
@@ -401,7 +401,8 @@ input_command="$*"
# command line and recombine them with Spack arguments later. We
# parse these out so that we can make sure that system paths come
# last, that package arguments come first, and that Spack arguments
# are injected properly.
# are injected properly. Based on configuration, we also strip -Werror
# arguments.
#
# All other arguments, including -l arguments, are treated as
# 'other_args' and left in their original order. This ensures that
@@ -440,6 +441,29 @@ while [ $# -ne 0 ]; do
continue
fi
if [ -n "${SPACK_COMPILER_FLAGS_KEEP}" ] ; then
# NOTE: the eval is required to allow `|` alternatives inside the variable
eval "\
case '$1' in
$SPACK_COMPILER_FLAGS_KEEP)
append other_args_list "$1"
shift
continue
;;
esac
"
fi
if [ -n "${SPACK_COMPILER_FLAGS_REMOVE}" ] ; then
eval "\
case '$1' in
$SPACK_COMPILER_FLAGS_REMOVE)
shift
continue
;;
esac
"
fi
case "$1" in
-isystem*)
arg="${1#-isystem}"
@@ -768,7 +792,9 @@ if [ "$SPACK_DEBUG" = TRUE ]; then
input_log="$SPACK_DEBUG_LOG_DIR/spack-cc-$SPACK_DEBUG_LOG_ID.in.log"
output_log="$SPACK_DEBUG_LOG_DIR/spack-cc-$SPACK_DEBUG_LOG_ID.out.log"
echo "[$mode] $command $input_command" >> "$input_log"
echo "[$mode] ${full_command_list}" >> "$output_log"
IFS="$lsep"
echo "[$mode] "$full_command_list >> "$output_log"
unset IFS
fi
# Execute the full command, preserving spaces with IFS set

View File

@@ -18,7 +18,7 @@
* Homepage: https://pypi.python.org/pypi/archspec
* Usage: Labeling, comparison and detection of microarchitectures
* Version: 0.1.2 (commit 85757b6666422fca86aa882a769bf78b0f992f54)
* Version: 0.1.4 (commit 53fc4ac91e9b4c5e4079f15772503a80bece72ad)
argparse
--------

View File

@@ -61,7 +61,7 @@ def proc_cpuinfo():
``/proc/cpuinfo``
"""
info = {}
with open("/proc/cpuinfo") as file:
with open("/proc/cpuinfo") as file: # pylint: disable=unspecified-encoding
for line in file:
key, separator, value = line.partition(":")
@@ -80,26 +80,46 @@ def proc_cpuinfo():
def _check_output(args, env):
output = subprocess.Popen(args, stdout=subprocess.PIPE, env=env).communicate()[0]
output = subprocess.Popen( # pylint: disable=consider-using-with
args, stdout=subprocess.PIPE, env=env
).communicate()[0]
return six.text_type(output.decode("utf-8"))
def _machine():
""" "Return the machine architecture we are on"""
operating_system = platform.system()
# If we are not on Darwin, trust what Python tells us
if operating_system != "Darwin":
return platform.machine()
# On Darwin it might happen that we are on M1, but using an interpreter
# built for x86_64. In that case "platform.machine() == 'x86_64'", so we
# need to fix that.
#
# See: https://bugs.python.org/issue42704
output = _check_output(
["sysctl", "-n", "machdep.cpu.brand_string"], env=_ensure_bin_usrbin_in_path()
).strip()
if "Apple" in output:
# Note that a native Python interpreter on Apple M1 would return
# "arm64" instead of "aarch64". Here we normalize to the latter.
return "aarch64"
return "x86_64"
@info_dict(operating_system="Darwin")
def sysctl_info_dict():
"""Returns a raw info dictionary parsing the output of sysctl."""
# Make sure that /sbin and /usr/sbin are in PATH as sysctl is
# usually found there
child_environment = dict(os.environ.items())
search_paths = child_environment.get("PATH", "").split(os.pathsep)
for additional_path in ("/sbin", "/usr/sbin"):
if additional_path not in search_paths:
search_paths.append(additional_path)
child_environment["PATH"] = os.pathsep.join(search_paths)
child_environment = _ensure_bin_usrbin_in_path()
def sysctl(*args):
return _check_output(["sysctl"] + list(args), env=child_environment).strip()
if platform.machine() == "x86_64":
if _machine() == "x86_64":
flags = (
sysctl("-n", "machdep.cpu.features").lower()
+ " "
@@ -125,6 +145,18 @@ def sysctl(*args):
return info
def _ensure_bin_usrbin_in_path():
# Make sure that /sbin and /usr/sbin are in PATH as sysctl is
# usually found there
child_environment = dict(os.environ.items())
search_paths = child_environment.get("PATH", "").split(os.pathsep)
for additional_path in ("/sbin", "/usr/sbin"):
if additional_path not in search_paths:
search_paths.append(additional_path)
child_environment["PATH"] = os.pathsep.join(search_paths)
return child_environment
def adjust_raw_flags(info):
"""Adjust the flags detected on the system to homogenize
slightly different representations.
@@ -184,12 +216,7 @@ def compatible_microarchitectures(info):
Args:
info (dict): dictionary containing information on the host cpu
"""
architecture_family = platform.machine()
# On Apple M1 platform.machine() returns "arm64" instead of "aarch64"
# so we should normalize the name here
if architecture_family == "arm64":
architecture_family = "aarch64"
architecture_family = _machine()
# If a tester is not registered, be conservative and assume no known
# target is compatible with the host
tester = COMPATIBILITY_CHECKS.get(architecture_family, lambda x, y: False)
@@ -244,12 +271,7 @@ def compatibility_check(architecture_family):
architecture_family = (architecture_family,)
def decorator(func):
# pylint: disable=fixme
# TODO: on removal of Python 2.6 support this can be re-written as
# TODO: an update + a dict comprehension
for arch_family in architecture_family:
COMPATIBILITY_CHECKS[arch_family] = func
COMPATIBILITY_CHECKS.update({family: func for family in architecture_family})
return func
return decorator
@@ -288,7 +310,7 @@ def compatibility_check_for_x86_64(info, target):
arch_root = TARGETS[basename]
return (
(target == arch_root or arch_root in target.ancestors)
and (target.vendor == vendor or target.vendor == "generic")
and target.vendor in (vendor, "generic")
and target.features.issubset(features)
)
@@ -303,8 +325,9 @@ def compatibility_check_for_aarch64(info, target):
arch_root = TARGETS[basename]
return (
(target == arch_root or arch_root in target.ancestors)
and (target.vendor == vendor or target.vendor == "generic")
and target.features.issubset(features)
and target.vendor in (vendor, "generic")
# On macOS it seems impossible to get all the CPU features with syctl info
and (target.features.issubset(features) or platform.system() == "Darwin")
)

View File

@@ -11,7 +11,7 @@
try:
from collections.abc import MutableMapping # novm
except ImportError:
from collections import MutableMapping
from collections import MutableMapping # pylint: disable=deprecated-class
class LazyDictionary(MutableMapping):
@@ -56,7 +56,7 @@ def _load_json_file(json_file):
def _factory():
filename = os.path.join(json_dir, json_file)
with open(filename, "r") as file:
with open(filename, "r") as file: # pylint: disable=unspecified-encoding
return json.load(file)
return _factory

View File

@@ -88,6 +88,20 @@
"name": "pentium4",
"flags": "-march={name} -mtune=generic"
}
],
"oneapi": [
{
"versions": ":",
"name": "pentium4",
"flags": "-march={name} -mtune=generic"
}
],
"dpcpp": [
{
"versions": ":",
"name": "pentium4",
"flags": "-march={name} -mtune=generic"
}
]
}
},
@@ -291,6 +305,20 @@
"name": "pentium4",
"flags": "-march={name} -mtune=generic"
}
],
"oneapi": [
{
"versions": ":",
"name": "pentium4",
"flags": "-march={name} -mtune=generic"
}
],
"dpcpp": [
{
"versions": ":",
"name": "pentium4",
"flags": "-march={name} -mtune=generic"
}
]
}
},
@@ -333,6 +361,18 @@
"versions": "16.0:",
"flags": "-march={name} -mtune={name}"
}
],
"oneapi": [
{
"versions": ":",
"flags": "-march={name} -mtune={name}"
}
],
"dpcpp": [
{
"versions": ":",
"flags": "-march={name} -mtune={name}"
}
]
}
},
@@ -384,6 +424,20 @@
"name": "corei7",
"flags": "-march={name} -mtune={name}"
}
],
"oneapi": [
{
"versions": ":",
"name": "corei7",
"flags": "-march={name} -mtune={name}"
}
],
"dpcpp": [
{
"versions": ":",
"name": "corei7",
"flags": "-march={name} -mtune={name}"
}
]
}
},
@@ -432,6 +486,20 @@
"name": "corei7",
"flags": "-march={name} -mtune={name}"
}
],
"oneapi": [
{
"versions": ":",
"name": "corei7",
"flags": "-march={name} -mtune={name}"
}
],
"dpcpp": [
{
"versions": ":",
"name": "corei7",
"flags": "-march={name} -mtune={name}"
}
]
}
},
@@ -490,6 +558,18 @@
"versions": "18.0:",
"flags": "-march={name} -mtune={name}"
}
],
"oneapi": [
{
"versions": ":",
"flags": "-march={name} -mtune={name}"
}
],
"dpcpp": [
{
"versions": ":",
"flags": "-march={name} -mtune={name}"
}
]
}
},
@@ -550,6 +630,18 @@
"versions": "18.0:",
"flags": "-march={name} -mtune={name}"
}
],
"oneapi": [
{
"versions": ":",
"flags": "-march={name} -mtune={name}"
}
],
"dpcpp": [
{
"versions": ":",
"flags": "-march={name} -mtune={name}"
}
]
}
},
@@ -615,6 +707,18 @@
"versions": "18.0:",
"flags": "-march={name} -mtune={name}"
}
],
"oneapi": [
{
"versions": ":",
"flags": "-march={name} -mtune={name}"
}
],
"dpcpp": [
{
"versions": ":",
"flags": "-march={name} -mtune={name}"
}
]
}
},
@@ -672,6 +776,18 @@
"versions": "18.0:",
"flags": "-march={name} -mtune={name}"
}
],
"oneapi": [
{
"versions": ":",
"flags": "-march={name} -mtune={name}"
}
],
"dpcpp": [
{
"versions": ":",
"flags": "-march={name} -mtune={name}"
}
]
}
},
@@ -732,6 +848,18 @@
"versions": "18.0:",
"flags": "-march={name} -mtune={name}"
}
],
"oneapi": [
{
"versions": ":",
"flags": "-march={name} -mtune={name}"
}
],
"dpcpp": [
{
"versions": ":",
"flags": "-march={name} -mtune={name}"
}
]
}
},
@@ -798,6 +926,20 @@
"name": "knl",
"flags": "-march={name} -mtune={name}"
}
],
"oneapi": [
{
"versions": ":",
"name": "knl",
"flags": "-march={name} -mtune={name}"
}
],
"dpcpp": [
{
"versions": ":",
"name": "knl",
"flags": "-march={name} -mtune={name}"
}
]
}
},
@@ -868,6 +1010,20 @@
"name": "skylake-avx512",
"flags": "-march={name} -mtune={name}"
}
],
"oneapi": [
{
"versions": ":",
"name": "skylake-avx512",
"flags": "-march={name} -mtune={name}"
}
],
"dpcpp": [
{
"versions": ":",
"name": "skylake-avx512",
"flags": "-march={name} -mtune={name}"
}
]
}
},
@@ -937,6 +1093,18 @@
"versions": "18.0:",
"flags": "-march={name} -mtune={name}"
}
],
"oneapi": [
{
"versions": ":",
"flags": "-march={name} -mtune={name}"
}
],
"dpcpp": [
{
"versions": ":",
"flags": "-march={name} -mtune={name}"
}
]
}
},
@@ -1004,6 +1172,18 @@
"versions": "19.0.1:",
"flags": "-march={name} -mtune={name}"
}
],
"oneapi": [
{
"versions": ":",
"flags": "-march={name} -mtune={name}"
}
],
"dpcpp": [
{
"versions": ":",
"flags": "-march={name} -mtune={name}"
}
]
}
},
@@ -1098,6 +1278,20 @@
"name": "icelake-client",
"flags": "-march={name} -mtune={name}"
}
],
"oneapi": [
{
"versions": ":",
"name": "icelake-client",
"flags": "-march={name} -mtune={name}"
}
],
"dpcpp": [
{
"versions": ":",
"name": "icelake-client",
"flags": "-march={name} -mtune={name}"
}
]
}
},
@@ -1142,6 +1336,20 @@
"warnings": "Intel's compilers may or may not optimize to the same degree for non-Intel microprocessors for optimizations that are not unique to Intel microprocessors",
"flags": "-msse2"
}
],
"oneapi": [
{
"versions": ":",
"warnings": "Intel's compilers may or may not optimize to the same degree for non-Intel microprocessors for optimizations that are not unique to Intel microprocessors",
"flags": "-msse2"
}
],
"dpcpp": [
{
"versions": ":",
"warnings": "Intel's compilers may or may not optimize to the same degree for non-Intel microprocessors for optimizations that are not unique to Intel microprocessors",
"flags": "-msse2"
}
]
}
},
@@ -1192,6 +1400,20 @@
"warnings": "Intel's compilers may or may not optimize to the same degree for non-Intel microprocessors for optimizations that are not unique to Intel microprocessors",
"flags": "-msse3"
}
],
"oneapi": [
{
"versions": ":",
"warnings": "Intel's compilers may or may not optimize to the same degree for non-Intel microprocessors for optimizations that are not unique to Intel microprocessors",
"flags": "-msse3"
}
],
"dpcpp": [
{
"versions": ":",
"warnings": "Intel's compilers may or may not optimize to the same degree for non-Intel microprocessors for optimizations that are not unique to Intel microprocessors",
"flags": "-msse3"
}
]
}
},
@@ -1246,6 +1468,20 @@
"warnings": "Intel's compilers may or may not optimize to the same degree for non-Intel microprocessors for optimizations that are not unique to Intel microprocessors",
"flags": "-msse3"
}
],
"oneapi": [
{
"versions": ":",
"warnings": "Intel's compilers may or may not optimize to the same degree for non-Intel microprocessors for optimizations that are not unique to Intel microprocessors",
"flags": "-msse3"
}
],
"dpcpp": [
{
"versions": ":",
"warnings": "Intel's compilers may or may not optimize to the same degree for non-Intel microprocessors for optimizations that are not unique to Intel microprocessors",
"flags": "-msse3"
}
]
}
},
@@ -1301,6 +1537,20 @@
"warnings": "Intel's compilers may or may not optimize to the same degree for non-Intel microprocessors for optimizations that are not unique to Intel microprocessors",
"flags": "-msse4.2"
}
],
"oneapi": [
{
"versions": ":",
"warnings": "Intel's compilers may or may not optimize to the same degree for non-Intel microprocessors for optimizations that are not unique to Intel microprocessors",
"flags": "-msse4.2"
}
],
"dpcpp": [
{
"versions": ":",
"warnings": "Intel's compilers may or may not optimize to the same degree for non-Intel microprocessors for optimizations that are not unique to Intel microprocessors",
"flags": "-msse4.2"
}
]
}
},
@@ -1360,6 +1610,22 @@
"name": "core-avx2",
"flags": "-march={name} -mtune={name}"
}
],
"oneapi": [
{
"versions": ":",
"warnings": "Intel's compilers may or may not optimize to the same degree for non-Intel microprocessors for optimizations that are not unique to Intel microprocessors",
"name": "core-avx2",
"flags": "-march={name} -mtune={name}"
}
],
"dpcpp": [
{
"versions": ":",
"warnings": "Intel's compilers may or may not optimize to the same degree for non-Intel microprocessors for optimizations that are not unique to Intel microprocessors",
"name": "core-avx2",
"flags": "-march={name} -mtune={name}"
}
]
}
},
@@ -1422,6 +1688,22 @@
"name": "core-avx2",
"flags": "-march={name} -mtune={name}"
}
],
"oneapi": [
{
"versions": ":",
"warnings": "Intel's compilers may or may not optimize to the same degree for non-Intel microprocessors for optimizations that are not unique to Intel microprocessors",
"name": "core-avx2",
"flags": "-march={name} -mtune={name}"
}
],
"dpcpp": [
{
"versions": ":",
"warnings": "Intel's compilers may or may not optimize to the same degree for non-Intel microprocessors for optimizations that are not unique to Intel microprocessors",
"name": "core-avx2",
"flags": "-march={name} -mtune={name}"
}
]
}
},
@@ -1485,6 +1767,22 @@
"name": "core-avx2",
"flags": "-march={name} -mtune={name}"
}
],
"oneapi": [
{
"versions": ":",
"warnings": "Intel's compilers may or may not optimize to the same degree for non-Intel microprocessors for optimizations that are not unique to Intel microprocessors",
"name": "core-avx2",
"flags": "-march={name} -mtune={name}"
}
],
"dpcpp": [
{
"versions": ":",
"warnings": "Intel's compilers may or may not optimize to the same degree for non-Intel microprocessors for optimizations that are not unique to Intel microprocessors",
"name": "core-avx2",
"flags": "-march={name} -mtune={name}"
}
]
}
},
@@ -1543,6 +1841,30 @@
"name": "znver3",
"flags": "-march={name} -mtune={name}"
}
],
"intel": [
{
"versions": "16.0:",
"warnings": "Intel's compilers may or may not optimize to the same degree for non-Intel microprocessors for optimizations that are not unique to Intel microprocessors",
"name": "core-avx2",
"flags": "-march={name} -mtune={name}"
}
],
"oneapi": [
{
"versions": ":",
"warnings": "Intel's compilers may or may not optimize to the same degree for non-Intel microprocessors for optimizations that are not unique to Intel microprocessors",
"name": "core-avx2",
"flags": "-march={name} -mtune={name}"
}
],
"dpcpp": [
{
"versions": ":",
"warnings": "Intel's compilers may or may not optimize to the same degree for non-Intel microprocessors for optimizations that are not unique to Intel microprocessors",
"name": "core-avx2",
"flags": "-march={name} -mtune={name}"
}
]
}
},
@@ -1788,7 +2110,6 @@
"fp",
"asimd",
"evtstrm",
"aes",
"pmull",
"sha1",
"sha2",
@@ -1821,18 +2142,26 @@
"flags": "-march=armv8.2-a+crc+crypto+fp16"
},
{
"versions": "8:",
"flags": "-march=armv8.2-a+crc+aes+sha2+fp16+sve -msve-vector-bits=512"
"versions": "8:10.2",
"flags": "-march=armv8.2-a+crc+sha2+fp16+sve -msve-vector-bits=512"
},
{
"versions": "10.3:",
"flags": "-mcpu=a64fx -msve-vector-bits=512"
}
],
"clang": [
{
"versions": "3.9:4.9",
"flags": "-march=armv8.2-a+crc+crypto+fp16"
"flags": "-march=armv8.2-a+crc+sha2+fp16"
},
{
"versions": "5:",
"flags": "-march=armv8.2-a+crc+crypto+fp16+sve"
"versions": "5:10",
"flags": "-march=armv8.2-a+crc+sha2+fp16+sve"
},
{
"versions": "11:",
"flags": "-mcpu=a64fx"
}
],
"arm": [
@@ -1954,7 +2283,40 @@
"m1": {
"from": ["aarch64"],
"vendor": "Apple",
"features": [],
"features": [
"fp",
"asimd",
"evtstrm",
"aes",
"pmull",
"sha1",
"sha2",
"crc32",
"atomics",
"fphp",
"asimdhp",
"cpuid",
"asimdrdm",
"jscvt",
"fcma",
"lrcpc",
"dcpop",
"sha3",
"asimddp",
"sha512",
"asimdfhm",
"dit",
"uscat",
"ilrcpc",
"flagm",
"ssbs",
"sb",
"paca",
"pacg",
"dcpodp",
"flagm2",
"frint"
],
"compilers": {
"gcc": [
{
@@ -1964,14 +2326,22 @@
],
"clang" : [
{
"versions": "9.0:",
"versions": "9.0:12.0",
"flags" : "-march=armv8.4-a"
},
{
"versions": "13.0:",
"flags" : "-mcpu=apple-m1"
}
],
"apple-clang": [
{
"versions": "11.0:",
"versions": "11.0:12.5",
"flags" : "-march=armv8.4-a"
},
{
"versions": "13.0:",
"flags" : "-mcpu=apple-m1"
}
]
}
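Each compiler key in this JSON maps to a list of entries; the first entry whose `versions` range matches the compiler's version supplies the flags, with `{name}` substituted (and an explicit `name` overriding the microarchitecture name, as in the `core-avx2` entries). A minimal sketch of that lookup, assuming data shaped like this file; the function names here are illustrative, not the real archspec API:

```python
# Sketch: resolve optimization flags from entries shaped like the JSON above.
# Illustrative only -- not the real archspec implementation.

def version_in_range(version, versions):
    """Match 'lo:hi' ranges like '9.0:12.0', '13.0:', or ':' (simplified:
    real code compares version objects, not integer tuples)."""
    lo, _, hi = versions.partition(":")
    key = tuple(int(p) for p in version.split("."))
    lo_ok = not lo or key >= tuple(int(p) for p in lo.split("."))
    hi_ok = not hi or key <= tuple(int(p) for p in hi.split("."))
    return lo_ok and hi_ok

def flags_for(compilers, compiler, version, uarch_name):
    for entry in compilers.get(compiler, []):
        if version_in_range(version, entry["versions"]):
            name = entry.get("name", uarch_name)  # e.g. "core-avx2" overrides
            return entry["flags"].format(name=name)
    return None

m1_clang = [
    {"versions": "9.0:12.0", "flags": "-march=armv8.4-a"},
    {"versions": "13.0:", "flags": "-mcpu=apple-m1"},
]
print(flags_for({"clang": m1_clang}, "clang", "13.0.1", "m1"))  # -mcpu=apple-m1
```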

View File

@@ -367,7 +367,7 @@ def group_ids(uid=None):
@system_path_filter(arg_slice=slice(1))
def chgrp(path, group):
def chgrp(path, group, follow_symlinks=True):
"""Implement the bash chgrp function on a single path"""
if is_windows:
raise OSError("Function 'chgrp' is not supported on Windows")
@@ -376,7 +376,10 @@ def chgrp(path, group):
gid = grp.getgrnam(group).gr_gid
else:
gid = group
os.chown(path, -1, gid)
if follow_symlinks:
os.chown(path, -1, gid)
else:
os.lchown(path, -1, gid)
@system_path_filter(arg_slice=slice(1))
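The new `follow_symlinks` parameter selects between `os.chown`, which dereferences symlinks and changes the group of the target, and `os.lchown`, which changes the link itself. A minimal standalone sketch of the same logic (POSIX only; the example path and group are hypothetical):

```python
import grp
import os

def chgrp_sketch(path, group, follow_symlinks=True):
    # Accept a group name or a numeric gid, as the function above does.
    gid = grp.getgrnam(group).gr_gid if isinstance(group, str) else group
    if follow_symlinks:
        os.chown(path, -1, gid)   # dereferences: the symlink *target* changes
    else:
        os.lchown(path, -1, gid)  # operates on the symlink itself

# e.g. chgrp_sketch("/tmp/some-symlink", "staff", follow_symlinks=False)
```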

View File

@@ -242,6 +242,17 @@ def clean_environment():
# show useful matches.
env.set('LC_ALL', build_lang)
remove_flags = set()
keep_flags = set()
if spack.config.get('config:flags:keep_werror') == 'all':
keep_flags.add('-Werror*')
else:
if spack.config.get('config:flags:keep_werror') == 'specific':
keep_flags.add('-Werror=*')
remove_flags.add('-Werror*')
env.set('SPACK_COMPILER_FLAGS_KEEP', '|'.join(keep_flags))
env.set('SPACK_COMPILER_FLAGS_REMOVE', '|'.join(remove_flags))
# Remove any macports installs from the PATH. The macports ld can
# cause conflicts with the built-in linker on el capitan. Solves
# assembler issues, e.g.:
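`SPACK_COMPILER_FLAGS_KEEP` and `SPACK_COMPILER_FLAGS_REMOVE` carry `|`-joined glob patterns for the compiler wrapper to apply to the command line. A hedged sketch of the filtering they imply, assuming shell-style matching and keep-wins-over-remove semantics (the real logic lives in Spack's compiler wrapper, not here):

```python
import fnmatch
import os

def filter_flags(args):
    keep = [p for p in os.environ.get("SPACK_COMPILER_FLAGS_KEEP", "").split("|") if p]
    remove = [p for p in os.environ.get("SPACK_COMPILER_FLAGS_REMOVE", "").split("|") if p]

    def allowed(flag):
        if any(fnmatch.fnmatch(flag, p) for p in keep):
            return True  # assumed: keep patterns override remove patterns
        return not any(fnmatch.fnmatch(flag, p) for p in remove)

    return [a for a in args if allowed(a)]

# keep_werror: specific  ->  keep -Werror=*, strip bare -Werror
os.environ["SPACK_COMPILER_FLAGS_KEEP"] = "-Werror=*"
os.environ["SPACK_COMPILER_FLAGS_REMOVE"] = "-Werror*"
print(filter_flags(["-O2", "-Werror", "-Werror=format"]))  # ['-O2', '-Werror=format']
```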

View File

@@ -210,6 +210,10 @@ def std_initconfig_entries(self):
"#------------------{0}\n".format("-" * 60),
]
def initconfig_package_entries(self):
"""This method is to be overwritten by the package"""
return []
def initconfig(self, spec, prefix):
cache_entries = (self.std_initconfig_entries() +
self.initconfig_compiler_entries() +
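`initconfig_package_entries` is a hook: the base class contributes the standard and compiler entries seen above, and a package overrides this method to append its own lines to the generated CMake initial cache. A hypothetical override (the package name and variant are made up):

```python
from spack import *  # the usual package-file import of this era


class Mylib(CachedCMakePackage):  # hypothetical package
    variant("mpi", default=True, description="Enable MPI")

    def initconfig_package_entries(self):
        # Lines returned here are appended verbatim to the initconfig file.
        on_off = "ON" if "+mpi" in self.spec else "OFF"
        return ['set(ENABLE_MPI {0} CACHE BOOL "")'.format(on_off)]
```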

View File

@@ -155,31 +155,17 @@ def parse_specs(args, **kwargs):
normalize = kwargs.get('normalize', False)
tests = kwargs.get('tests', False)
try:
sargs = args
if not isinstance(args, six.string_types):
sargs = ' '.join(spack.util.string.quote(args))
specs = spack.spec.parse(sargs)
for spec in specs:
if concretize:
spec.concretize(tests=tests) # implies normalize
elif normalize:
spec.normalize(tests=tests)
sargs = args
if not isinstance(args, six.string_types):
sargs = ' '.join(spack.util.string.quote(args))
specs = spack.spec.parse(sargs)
for spec in specs:
if concretize:
spec.concretize(tests=tests) # implies normalize
elif normalize:
spec.normalize(tests=tests)
return specs
except spack.spec.SpecParseError as e:
msg = e.message + "\n" + str(e.string) + "\n"
msg += (e.pos + 2) * " " + "^"
raise spack.error.SpackError(msg)
except spack.error.SpecError as e:
msg = e.message
if e.long_message:
msg += e.long_message
raise spack.error.SpackError(msg)
return specs
def matching_spec_from_env(spec):
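The retained `SpecParseError` branch builds a two-line context message: the input string is echoed, then `(e.pos + 2)` spaces place a caret under the offending position. A toy reproduction with a stand-in error object (values are hypothetical; the `+ 2` offset is taken as-is from the code above):

```python
class ToyParseError(Exception):
    def __init__(self, message, string, pos):
        self.message, self.string, self.pos = message, string, pos

e = ToyParseError("unexpected token", "zlib @@1.2", 4)
msg = e.message + "\n" + str(e.string) + "\n"
msg += (e.pos + 2) * " " + "^"
print(msg)
# unexpected token
# zlib @@1.2
#       ^
```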

View File

@@ -478,11 +478,12 @@ def save_specfile_fn(args):
if args.root_specfile:
with open(args.root_specfile) as fd:
root_spec_as_json = fd.read()
spec_format = 'yaml' if args.root_specfile.endswith('yaml') else 'json'
else:
root_spec = Spec(args.root_spec)
root_spec.concretize()
root_spec_as_json = root_spec.to_json(hash=ht.dag_hash)
spec_format = 'yaml' if args.root_specfile.endswith('yaml') else 'json'
spec_format = 'json'
save_dependency_specfiles(
root_spec_as_json, args.specfile_dir, args.specs.split(), spec_format)

View File

@@ -380,6 +380,11 @@ def add_concretizer_args(subparser):
const=False, default=None,
help='do not reuse installed deps; build newest configuration'
)
subgroup.add_argument(
'--minimal', action=ConfigSetAction, dest="concretizer:minimal",
const=True, default=None,
help='minimize builds (disables default variants, may choose older versions)'
)
subgroup.add_argument(
'--reuse', action=ConfigSetAction, dest="concretizer:reuse",
const=True, default=None,
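`ConfigSetAction` presumably turns each of these flags into a command-line-scoped configuration write keyed by `dest`. A rough hand-rolled equivalent of what `spack solve --minimal ...` ends up doing, using the same `spack.config.set` call that appears elsewhere in this changeset (a sketch, not the action's actual implementation):

```python
import spack.config

# dest="concretizer:minimal", const=True  ->  one config write at CLI scope
spack.config.set("concretizer:minimal", True, scope="command_line")
assert spack.config.get("concretizer:minimal") is True
```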

View File

@@ -136,13 +136,13 @@ def solve(parser, args):
)
fmt = " @K{%%-8d} %%-%ds%%9s %%7s" % maxlen
for i, (idx, build_idx, name) in enumerate(result.criteria, 1):
for i, (installed_cost, build_cost, name) in enumerate(result.criteria, 1):
color.cprint(
fmt % (
i,
name,
"-" if build_idx is None else opt[idx],
opt[idx] if build_idx is None else opt[build_idx],
"-" if build_cost is None else installed_cost,
installed_cost if build_cost is None else build_cost,
)
)
print()
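The criteria tuples are now `(installed_cost, build_cost, name)`; fixed criteria carry `build_cost=None` and render a `-` in one column. A self-contained rendering sketch with made-up costs:

```python
# Made-up criteria: one normal entry, one fixed entry (build_cost=None).
criteria = [(0, 3, "version weight"), (5, None, "deprecated versions used")]
for i, (installed_cost, build_cost, name) in enumerate(criteria, 1):
    print("%-8d %-30s %9s %7s" % (
        i,
        name,
        "-" if build_cost is None else installed_cost,
        installed_cost if build_cost is None else build_cost,
    ))
```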

View File

@@ -766,7 +766,8 @@ def name_matches(name, name_list):
toolchains.add(compiler_cls.__name__)
if len(toolchains) > 1:
if toolchains == set(['Clang', 'AppleClang', 'Aocc']):
if toolchains == set(['Clang', 'AppleClang', 'Aocc']) or \
toolchains == set(['Dpcpp', 'Oneapi']):
return False
tty.debug("[TOOLCHAINS] {0}".format(toolchains))
return True
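The mixed-toolchain check now whitelists a second family: Dpcpp/Oneapi, like Clang/AppleClang/Aocc, are variants of one underlying toolchain. The check in isolation:

```python
def is_mixed_toolchain_sketch(toolchains):
    """toolchains: set of compiler class names seen in one spec's build."""
    if len(toolchains) <= 1:
        return False
    # Families that are effectively the same toolchain are not "mixed".
    if toolchains in [{"Clang", "AppleClang", "Aocc"}, {"Dpcpp", "Oneapi"}]:
        return False
    return True

assert not is_mixed_toolchain_sketch({"Dpcpp", "Oneapi"})
assert is_mixed_toolchain_sketch({"Gcc", "Clang"})
```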

View File

@@ -88,7 +88,7 @@
#: Path to the default configuration
configuration_defaults_path = (
'defaults', os.path.join(spack.paths.etc_path, 'spack', 'defaults')
'defaults', os.path.join(spack.paths.etc_path, 'defaults')
)
#: Hard-coded default values for some key configuration options.
@@ -104,6 +104,10 @@
'build_jobs': min(16, cpus_available()),
'build_stage': '$tempdir/spack-stage',
'concretizer': 'clingo',
'license_dir': spack.paths.default_license_dir,
'flags': {
'keep_werror': 'none',
},
}
}
@@ -815,7 +819,7 @@ def _config():
# Site configuration is per spack instance, for sites or projects
# No site-level configs should be checked into spack by default.
configuration_paths.append(
('site', os.path.join(spack.paths.etc_path, 'spack')),
('site', os.path.join(spack.paths.etc_path)),
)
# User configuration can override both spack defaults and site config

View File

@@ -356,10 +356,10 @@ def __init__(self, root, db_dir=None, upstream_dbs=None,
self.prefix_fail_path = os.path.join(self._db_dir, 'prefix_failures')
# Create needed directories and files
if not os.path.exists(self._db_dir):
if not is_upstream and not os.path.exists(self._db_dir):
fs.mkdirp(self._db_dir)
if not os.path.exists(self._failure_dir) and not is_upstream:
if not is_upstream and not os.path.exists(self._failure_dir):
fs.mkdirp(self._failure_dir)
self.is_upstream = is_upstream
@@ -1064,9 +1064,7 @@ def _read(self):
self._state_is_inconsistent = False
return
elif self.is_upstream:
raise UpstreamDatabaseLockingError(
"No database index file is present, and upstream"
" databases cannot generate an index file")
tty.warn('upstream not found: {0}'.format(self._index_path))
def _add(
self,

View File

@@ -79,8 +79,9 @@
env_subdir_name = '.spack-env'
#: default spack.yaml file to put in new environments
default_manifest_yaml = """\
def default_manifest_yaml():
"""default spack.yaml file to put in new environments"""
return """\
# This is a Spack Environment file.
#
# It describes a set of packages to be installed, along with
@@ -89,7 +90,11 @@
# add package specs to the `specs` list
specs: []
view: true
"""
concretizer:
unify: {}
""".format('true' if spack.config.get('concretizer:unify') else 'false')
#: regex for validating environment names
valid_environment_name_re = r'^\w[\w-]*$'
@@ -632,11 +637,11 @@ def __init__(self, path, init_file=None, with_view=None, keep_relative=False):
# the init file.
with fs.open_if_filename(init_file) as f:
if hasattr(f, 'name') and f.name.endswith('.lock'):
self._read_manifest(default_manifest_yaml)
self._read_manifest(default_manifest_yaml())
self._read_lockfile(f)
self._set_user_specs_from_lockfile()
else:
self._read_manifest(f, raw_yaml=default_manifest_yaml)
self._read_manifest(f, raw_yaml=default_manifest_yaml())
# Rewrite relative develop paths when initializing a new
# environment in a different location from the spack.yaml file.
@@ -700,7 +705,7 @@ def _read(self):
default_manifest = not os.path.exists(self.manifest_path)
if default_manifest:
# No manifest, use default yaml
self._read_manifest(default_manifest_yaml)
self._read_manifest(default_manifest_yaml())
else:
with open(self.manifest_path) as f:
self._read_manifest(f)
@@ -766,8 +771,11 @@ def _read_manifest(self, f, raw_yaml=None):
self.views = {}
# Retrieve the current concretization strategy
configuration = config_dict(self.yaml)
# default concretization to separately
self.concretization = configuration.get('concretization', 'separately')
# Let `concretization` overrule `concretizer:unify` config for now.
unify = spack.config.get('concretizer:unify')
self.concretization = configuration.get(
'concretization', 'together' if unify else 'separately')
# Retrieve dev-build packages:
self.dev_specs = configuration.get('develop', {})
@@ -1611,7 +1619,14 @@ def all_specs(self):
"""Return all specs, even those a user spec would shadow."""
all_specs = set()
for h in self.concretized_order:
all_specs.update(self.specs_by_hash[h].traverse())
try:
spec = self.specs_by_hash[h]
except KeyError:
tty.warn(
'Environment %s appears to be corrupt: missing spec '
'"%s"' % (self.name, h))
continue
all_specs.update(spec.traverse())
return sorted(all_specs)
@@ -1869,17 +1884,15 @@ def write(self, regenerate=True):
regenerate (bool): regenerate views and run post-write hooks as
well as writing if True.
"""
# Intercept environment not using the latest schema format and prevent
# them from being modified
manifest_exists = os.path.exists(self.manifest_path)
if manifest_exists and not is_latest_format(self.manifest_path):
msg = ('The environment "{0}" needs to be written to disk, but '
'is currently using a deprecated format. Please update it '
'using:\n\n'
'\tspack env update {0}\n\n'
'Note that previous versions of Spack will not be able to '
# Warn that environments are not in the latest format.
if not is_latest_format(self.manifest_path):
ver = '.'.join(str(s) for s in spack.spack_version_info[:2])
msg = ('The environment "{}" is written to disk in a deprecated format. '
'Please update it using:\n\n'
'\tspack env update {}\n\n'
'Note that versions of Spack older than {} may not be able to '
'use the updated configuration.')
raise RuntimeError(msg.format(self.name))
tty.warn(msg.format(self.name, self.name, ver))
# ensure path in var/spack/environments
fs.mkdirp(self.path)
@@ -2231,14 +2244,16 @@ def _top_level_key(data):
def is_latest_format(manifest):
"""Return True if the manifest file is at the latest schema format,
False otherwise.
"""Return False if the manifest file exists and is not in the latest schema format.
Args:
manifest (str): manifest file to be analyzed
"""
with open(manifest) as f:
data = syaml.load(f)
try:
with open(manifest) as f:
data = syaml.load(f)
except (OSError, IOError):
return True
top_level_key = _top_level_key(data)
changed = spack.schema.env.update(data[top_level_key])
return not changed

View File

@@ -10,9 +10,9 @@
import llnl.util.tty as tty
#: whether we should write stack traces or short error messages
#: at what level we should write stack traces or short error messages
#: this is module-scoped because it needs to be set very early
debug = False
debug = 0
class SpackError(Exception):

View File

@@ -493,9 +493,11 @@ def write(self, spec, color=None, out=None):
# Replace node with its dependencies
self._frontier.pop(i)
deps = node.dependencies(deptype=self.deptype)
if deps:
deps = sorted((d.dag_hash() for d in deps), reverse=True)
edges = sorted(
node.edges_to_dependencies(deptype=self.deptype), reverse=True
)
if edges:
deps = [e.spec.dag_hash() for e in edges]
self._connect_deps(i, deps, "new-deps") # anywhere.
elif self._frontier:

View File

@@ -375,13 +375,6 @@ def make_argument_parser(**kwargs):
# stat names in groups of 7, for nice wrapping.
stat_lines = list(zip(*(iter(stat_names),) * 7))
# help message for --show-cores
show_cores_help = 'provide additional information on concretization failures\n'
show_cores_help += 'off (default): show only the violated rule\n'
show_cores_help += 'full: show raw unsat cores from clingo\n'
show_cores_help += 'minimized: show subset-minimal unsat cores '
show_cores_help += '(Warning: this may take hours for some specs)'
parser.add_argument(
'-h', '--help',
dest='help', action='store_const', const='short', default=None,
@@ -405,9 +398,6 @@ def make_argument_parser(**kwargs):
'-d', '--debug', action='count', default=0,
help="write out debug messages "
"(more d's for more verbosity: -d, -dd, -ddd, etc.)")
parser.add_argument(
'--show-cores', choices=["off", "full", "minimized"], default="off",
help=show_cores_help)
parser.add_argument(
'--timestamp', action='store_true',
help="Add a timestamp to tty output")
@@ -490,18 +480,11 @@ def setup_main_options(args):
# errors raised by spack.config.
if args.debug:
spack.error.debug = True
spack.error.debug = args.debug
spack.util.debug.register_interrupt_handler()
spack.config.set('config:debug', True, scope='command_line')
spack.util.environment.tracing_enabled = True
if args.show_cores != "off":
# minimize_cores defaults to true, turn it off if we're showing full core
# but don't want to wait to minimize it.
spack.solver.asp.full_cores = True
if args.show_cores == 'full':
spack.solver.asp.minimize_cores = False
if args.timestamp:
tty.set_timestamp(True)
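`--debug` has counted repetitions for a while; the change is that `spack.error.debug` now stores that count instead of collapsing it to a boolean, so error reporting can scale with `-d`, `-dd`, `-ddd`. The underlying argparse behavior, self-contained:

```python
import argparse

parser = argparse.ArgumentParser()
parser.add_argument("-d", "--debug", action="count", default=0)

# Each repetition increments the stored level.
for argv, level in [([], 0), (["-d"], 1), (["-ddd"], 3)]:
    assert parser.parse_args(argv).debug == level
```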

View File

@@ -53,6 +53,7 @@
import spack.store
import spack.url
import spack.util.environment
import spack.util.path
import spack.util.web
from spack.filesystem_view import YamlFilesystemView
from spack.install_test import TestFailure, TestSuite
@@ -60,7 +61,6 @@
from spack.stage import ResourceStage, Stage, StageComposite, stage_prefix
from spack.util.executable import ProcessError, which
from spack.util.package_hash import package_hash
from spack.util.path import win_exe_ext
from spack.util.prefix import Prefix
from spack.version import Version
@@ -200,9 +200,9 @@ def __init__(cls, name, bases, attr_dict):
def platform_executables(self):
def to_windows_exe(exe):
if exe.endswith('$'):
exe = exe.replace('$', '%s$' % win_exe_ext())
exe = exe.replace('$', '%s$' % spack.util.path.win_exe_ext())
else:
exe += win_exe_ext()
exe += spack.util.path.win_exe_ext()
return exe
plat_exe = []
if hasattr(self, 'executables'):
@@ -438,6 +438,11 @@ def name(self):
self._name = self._name[self._name.rindex('.') + 1:]
return self._name
@property
def global_license_dir(self):
"""Returns the directory where license files for all packages are stored."""
return spack.util.path.canonicalize_path(spack.config.get('config:license_dir'))
def run_before(*phases):
"""Registers a method of a package to be run before a given phase"""
@@ -938,9 +943,8 @@ def name(self):
@property
def global_license_dir(self):
"""Returns the directory where global license files for all
packages are stored."""
return os.path.join(spack.paths.prefix, 'etc', 'spack', 'licenses')
"""Returns the directory where global license files are stored."""
return type(self).global_license_dir
@property
def global_license_file(self):
@@ -1714,7 +1718,10 @@ def content_hash(self, content=None):
hash_content.append(source_id.encode('utf-8'))
# patch sha256's
if self.spec.concrete:
# Only include these if they've been assigned by the concretizer.
# We check spec._patches_assigned instead of spec.concrete because
# we have to call package_hash *before* marking specs concrete
if self.spec._patches_assigned():
hash_content.extend(
':'.join((p.sha256, str(p.level))).encode('utf-8')
for p in self.spec.patches

View File

@@ -123,11 +123,11 @@ def accept(self, id):
def next_token_error(self, message):
"""Raise an error about the next token in the stream."""
raise ParseError(message, self.text, self.token.end)
raise ParseError(message, self.text[0], self.token.end)
def last_token_error(self, message):
"""Raise an error about the previous token in the stream."""
raise ParseError(message, self.text, self.token.start)
raise ParseError(message, self.text[0], self.token.start)
def unexpected_token(self):
self.next_token_error("Unexpected token: '%s'" % self.next.value)

View File

@@ -43,8 +43,12 @@
hooks_path = os.path.join(module_path, "hooks")
opt_path = os.path.join(prefix, "opt")
share_path = os.path.join(prefix, "share", "spack")
etc_path = os.path.join(prefix, "etc")
etc_path = os.path.join(prefix, "etc", "spack")
#
# Things in $spack/etc/spack
#
default_license_dir = os.path.join(etc_path, "licenses")
#
# Things in $spack/var/spack

View File

@@ -4,6 +4,8 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""This module contains jsonschema files for all of Spack's YAML formats."""
import warnings
import six
import llnl.util.lang
@@ -49,10 +51,12 @@ def _deprecated_properties(validator, deprecated, instance, schema):
msg = msg_str_or_func.format(properties=deprecated_properties)
else:
msg = msg_str_or_func(instance, deprecated_properties)
if msg is None:
return
is_error = deprecated['error']
if not is_error:
llnl.util.tty.warn(msg)
warnings.warn(msg)
else:
import jsonschema
yield jsonschema.ValidationError(msg)
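With this change a `deprecatedProperties` message function may return `None` to suppress the diagnostic entirely, which is what the once-only `deprecate_concretization` warning later in this changeset relies on. A minimal illustration of such a message function (the suppression condition here is hypothetical):

```python
already_warned = False

def message(instance, deprecated_properties):
    global already_warned
    if already_warned:
        return None  # returning None now silences the warning
    already_warned = True
    return "deprecated properties: {0}".format(", ".join(deprecated_properties))
```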

View File

@@ -15,6 +15,7 @@
'additionalProperties': False,
'properties': {
'reuse': {'type': 'boolean'},
'minimal': {'type': 'boolean'},
'targets': {
'type': 'object',
'properties': {
@@ -25,6 +26,14 @@
}
}
},
'unify': {
'type': 'boolean'
# Todo: add when_possible.
# 'oneOf': [
# {'type': 'boolean'},
# {'type': 'string', 'enum': ['when_possible']}
# ]
}
}
}
}

View File

@@ -56,6 +56,7 @@
'type': 'array',
'items': {'type': 'string'}
},
'license_dir': {'type': 'string'},
'source_cache': {'type': 'string'},
'misc_cache': {'type': 'string'},
'connect_timeout': {'type': 'integer', 'minimum': 0},
@@ -90,7 +91,16 @@
'additional_external_search_paths': {
'type': 'array',
'items': {'type': 'string'}
}
},
'flags': {
'type': 'object',
'properties': {
'keep_werror': {
'type': 'string',
'enum': ['all', 'specific', 'none'],
},
},
},
},
'deprecatedProperties': {
'properties': ['module_roots'],

View File

@@ -16,6 +16,24 @@
import spack.schema.packages
import spack.schema.projections
warned_about_concretization = False
def deprecate_concretization(instance, props):
global warned_about_concretization
if warned_about_concretization:
return None
# Deprecate `spack:concretization` in favor of `spack:concretizer:unify`.
concretization_to_unify = {'together': 'true', 'separately': 'false'}
concretization = instance['concretization']
unify = concretization_to_unify[concretization]
warned_about_concretization = True
return (
'concretization:{} is deprecated and will be removed in Spack 0.19 in favor of '
'the new concretizer:unify:{} config option.'.format(concretization, unify)
)
#: legal first keys in the schema
keys = ('spack', 'env')
@@ -61,6 +79,11 @@
'type': 'object',
'default': {},
'additionalProperties': False,
'deprecatedProperties': {
'properties': ['concretization'],
'message': deprecate_concretization,
'error': False
},
'properties': union_dicts(
# merged configuration scope schemas
spack.schema.merged.properties,
@@ -169,11 +192,33 @@ def update(data):
Returns:
True if data was changed, False otherwise
"""
updated = False
if 'include' in data:
msg = ("included configuration files should be updated manually"
" [files={0}]")
warnings.warn(msg.format(', '.join(data['include'])))
if 'packages' in data:
return spack.schema.packages.update(data['packages'])
return False
updated |= spack.schema.packages.update(data['packages'])
# Spack 0.19 drops support for `spack:concretization` in favor of
# `spack:concretizer:unify`. Here we provide an upgrade path that changes the former
# into the latter, or warns when there's an ambiguity. Note that Spack 0.17 is not
# forward compatible with `spack:concretizer:unify`.
if 'concretization' in data:
has_unify = 'unify' in data.get('concretizer', {})
to_unify = {'together': True, 'separately': False}
unify = to_unify[data['concretization']]
if has_unify and data['concretizer']['unify'] != unify:
warnings.warn(
'The following configuration conflicts: '
'`spack:concretization:{}` and `spack:concretizer:unify:{}`'
'. Please update manually.'.format(
data['concretization'], data['concretizer']['unify']))
else:
data.update({'concretizer': {'unify': unify}})
data.pop('concretization')
updated = True
return updated
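A worked example of that upgrade path on a plain dict, mirroring the logic above (a sketch; the real `update` also handles `include` and `packages`):

```python
import warnings

def upgrade_concretization(data):
    """Rewrite `concretization` into `concretizer:unify`, as update() does."""
    if "concretization" not in data:
        return False
    unify = {"together": True, "separately": False}[data["concretization"]]
    existing = data.get("concretizer", {}).get("unify")
    if existing is not None and existing != unify:
        warnings.warn("conflicting concretization/unify values; update manually")
        return False  # ambiguous: leave the manifest untouched
    data.setdefault("concretizer", {})["unify"] = unify
    del data["concretization"]
    return True

data = {"concretization": "together", "specs": ["zlib"]}
assert upgrade_concretization(data)
assert data == {"specs": ["zlib"], "concretizer": {"unify": True}}
```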

View File

@@ -9,6 +9,7 @@
import itertools
import os
import pprint
import re
import types
import warnings
@@ -55,14 +56,6 @@
parse_files = None
#: whether we should write ASP unsat cores quickly in debug mode when the cores
#: may be very large or take the time (sometimes hours) to minimize them
minimize_cores = True
#: whether we should include all facts in the unsat cores or only error messages
full_cores = False
# backward compatibility functions for clingo ASTs
def ast_getter(*names):
def getter(node):
@@ -114,7 +107,7 @@ def getter(node):
def build_criteria_names(costs, tuples):
"""Construct an ordered mapping from criteria names to indices in the cost list."""
"""Construct an ordered mapping from criteria names to costs."""
# pull optimization criteria names out of the solution
priorities_names = []
@@ -141,7 +134,10 @@ def build_criteria_names(costs, tuples):
# sort the criteria by priority
priorities_names = sorted(priorities_names, reverse=True)
assert len(priorities_names) == len(costs), "Wrong number of optimization criteria!"
# We only have opt-criterion values for non-error types
# error type criteria are excluded (they come first)
error_criteria = len(costs) - len(priorities_names)
costs = costs[error_criteria:]
# split list into three parts: build criteria, fixed criteria, non-build criteria
num_criteria = len(priorities_names)
@@ -154,12 +150,12 @@ def build_criteria_names(costs, tuples):
# mapping from priority to index in cost list
indices = dict((p, i) for i, (p, n) in enumerate(priorities_names))
# make a list that has each name with its build and non-build priority
# make a list that has each name with its build and non-build costs
criteria = [
(p - fixed_priority_offset + num_build, None, name) for p, name in fixed
(costs[p - fixed_priority_offset + num_build], None, name) for p, name in fixed
]
for (i, name), (b, _) in zip(installed, build):
criteria.append((indices[i], indices[b], name))
criteria.append((costs[indices[i]], costs[indices[b]], name))
return criteria
@@ -331,9 +327,6 @@ def format_core(self, core):
core_symbols = []
for atom in core:
sym = symbols[atom]
if sym.name in ("rule", "error"):
# these are special symbols we use to get messages in the core
sym = sym.arguments[0].string
core_symbols.append(sym)
return sorted(str(symbol) for symbol in core_symbols)
@@ -392,7 +385,7 @@ def raise_if_unsat(self):
"""
Raise an appropriate error if the result is unsatisfiable.
The error is a UnsatisfiableSpecError, and includes the minimized cores
The error is an InternalConcretizerError, and includes the minimized cores
resulting from the solve, formatted to be human readable.
"""
if self.satisfiable:
@@ -402,12 +395,8 @@ def raise_if_unsat(self):
if len(constraints) == 1:
constraints = constraints[0]
if minimize_cores:
conflicts = self.format_minimal_cores()
else:
conflicts = self.format_cores()
raise UnsatisfiableSpecError(constraints, conflicts=conflicts)
conflicts = self.format_minimal_cores()
raise InternalConcretizerError(constraints, conflicts=conflicts)
@property
def specs(self):
@@ -507,13 +496,11 @@ def h2(self, name):
def newline(self):
self.out.write('\n')
def fact(self, head, assumption=False):
def fact(self, head):
"""ASP fact (a rule without a body).
Arguments:
head (AspFunction): ASP function to generate as fact
assumption (bool): If True and using cores, use this fact as a
choice point in ASP and include it in unsatisfiable cores
"""
symbol = head.symbol() if hasattr(head, 'symbol') else head
@@ -521,10 +508,9 @@ def fact(self, head, assumption=False):
atom = self.backend.add_atom(symbol)
# with `--show-cores=full or --show-cores=minimized, make all facts
# choices/assumptions, otherwise only if assumption=True
choice = self.cores and (full_cores or assumption)
# Only functions relevant for constructing bug reports for bad error messages
# are assumptions, and only when using cores.
choice = self.cores and symbol.name == 'internal_error'
self.backend.add_rule([atom], [], choice=choice)
if choice:
self.assumptions.append(atom)
@@ -582,9 +568,10 @@ def visit(node):
for term in node.body:
if ast_type(term) == ASTType.Literal:
if ast_type(term.atom) == ASTType.SymbolicAtom:
if ast_sym(term.atom).name == "error":
name = ast_sym(term.atom).name
if name == 'internal_error':
arg = ast_sym(ast_sym(term.atom).arguments[0])
self.fact(fn.error(arg.string), assumption=True)
self.fact(AspFunction(name)(arg.string))
path = os.path.join(parent_dir, 'concretize.lp')
parse_files([path], visit)
@@ -667,7 +654,7 @@ def stringify(x):
class SpackSolverSetup(object):
"""Class to set up and run a Spack concretization solve."""
def __init__(self, reuse=False, tests=False):
def __init__(self, reuse=None, minimal=None, tests=False):
self.gen = None # set by setup()
self.declared_versions = {}
@@ -692,10 +679,11 @@ def __init__(self, reuse=False, tests=False):
# Caches to optimize the setup phase of the solver
self.target_specs_cache = None
# whether to add installed/binary hashes to the solve
self.reuse = reuse
# whether to add installed/binary hashes to the solve
# Solver parameters that affect setup -- see Solver documentation
self.reuse = spack.config.get(
"concretizer:reuse", False) if reuse is None else reuse
self.minimal = spack.config.get(
"concretizer:minimal", False) if minimal is None else minimal
self.tests = tests
def pkg_version_rules(self, pkg):
@@ -737,7 +725,7 @@ def spec_versions(self, spec):
# record all version constraints for later
self.version_constraints.add((spec.name, spec.versions))
return [fn.version_satisfies(spec.name, spec.versions)]
return [fn.node_version_satisfies(spec.name, spec.versions)]
def target_ranges(self, spec, single_target_fn):
target = spec.architecture.target
@@ -750,13 +738,24 @@ def target_ranges(self, spec, single_target_fn):
return [fn.node_target_satisfies(spec.name, target)]
def conflict_rules(self, pkg):
default_msg = "{0} '{1}' conflicts with '{2}'"
no_constraint_msg = "{0} conflicts with '{1}'"
for trigger, constraints in pkg.conflicts.items():
trigger_id = self.condition(spack.spec.Spec(trigger), name=pkg.name)
self.gen.fact(fn.conflict_trigger(trigger_id))
trigger_msg = "conflict trigger %s" % str(trigger)
trigger_id = self.condition(
spack.spec.Spec(trigger), name=pkg.name, msg=trigger_msg)
for constraint, _ in constraints:
constraint_id = self.condition(constraint, name=pkg.name)
self.gen.fact(fn.conflict(pkg.name, trigger_id, constraint_id))
for constraint, conflict_msg in constraints:
if conflict_msg is None:
if constraint == spack.spec.Spec():
conflict_msg = no_constraint_msg.format(pkg.name, trigger)
else:
conflict_msg = default_msg.format(pkg.name, trigger, constraint)
constraint_msg = "conflict constraint %s" % str(constraint)
constraint_id = self.condition(
constraint, name=pkg.name, msg=constraint_msg)
self.gen.fact(
fn.conflict(pkg.name, trigger_id, constraint_id, conflict_msg))
self.gen.newline()
def available_compilers(self):
@@ -829,7 +828,7 @@ def package_compiler_defaults(self, pkg):
pkg.name, cspec.name, cspec.version, -i * 100
))
def pkg_rules(self, pkg, tests):
def pkg_rules(self, pkg):
pkg = packagize(pkg)
# versions
@@ -840,9 +839,18 @@ def pkg_rules(self, pkg, tests):
for name, entry in sorted(pkg.variants.items()):
variant, when = entry
for w in when:
cond_id = self.condition(w, name=pkg.name)
self.gen.fact(fn.variant_condition(cond_id, pkg.name, name))
if spack.spec.Spec() in when:
# unconditional variant
self.gen.fact(fn.variant(pkg.name, name))
else:
# conditional variant
for w in when:
msg = "%s has variant %s" % (pkg.name, name)
if str(w):
msg += " when %s" % w
cond_id = self.condition(w, name=pkg.name, msg=msg)
self.gen.fact(fn.variant_condition(cond_id, pkg.name, name))
single_value = not variant.multi
if single_value:
@@ -885,7 +893,9 @@ def pkg_rules(self, pkg, tests):
imposed = spack.spec.Spec(value.when)
imposed.name = pkg.name
self.condition(
required_spec=required, imposed_spec=imposed, name=pkg.name
required_spec=required, imposed_spec=imposed, name=pkg.name,
msg="%s variant %s value %s when %s" % (
pkg.name, name, value, when)
)
if variant.sticky:
@@ -913,7 +923,7 @@ def pkg_rules(self, pkg, tests):
)
)
def condition(self, required_spec, imposed_spec=None, name=None):
def condition(self, required_spec, imposed_spec=None, name=None, msg=None):
"""Generate facts for a dependency or virtual provider condition.
Arguments:
@@ -922,7 +932,7 @@ def condition(self, required_spec, imposed_spec=None, name=None):
are imposed when this condition is triggered
name (str or None): name for `required_spec` (required if
required_spec is anonymous, ignored if not)
msg (str or None): description of the condition
Returns:
int: id of the condition created by this function
"""
@@ -931,7 +941,7 @@ def condition(self, required_spec, imposed_spec=None, name=None):
assert named_cond.name, "must provide name for anonymous conditions!"
condition_id = next(self._condition_id_counter)
self.gen.fact(fn.condition(condition_id))
self.gen.fact(fn.condition(condition_id, msg))
# requirements trigger the condition
requirements = self.spec_clauses(
@@ -963,7 +973,8 @@ def package_provider_rules(self, pkg):
for provided, whens in pkg.provided.items():
for when in whens:
condition_id = self.condition(when, provided, pkg.name)
msg = '%s provides %s when %s' % (pkg.name, provided, when)
condition_id = self.condition(when, provided, pkg.name, msg)
self.gen.fact(fn.provider_condition(
condition_id, when.name, provided.name
))
@@ -987,7 +998,11 @@ def package_dependencies_rules(self, pkg):
if not deptypes:
continue
condition_id = self.condition(cond, dep.spec, pkg.name)
msg = '%s depends on %s' % (pkg.name, dep.spec.name)
if cond != spack.spec.Spec():
msg += ' when %s' % cond
condition_id = self.condition(cond, dep.spec, pkg.name, msg)
self.gen.fact(fn.dependency_condition(
condition_id, pkg.name, dep.spec.name
))
@@ -1067,7 +1082,8 @@ def external_packages(self):
# Declare external conditions with a local index into packages.yaml
for local_idx, spec in enumerate(external_specs):
condition_id = self.condition(spec)
msg = '%s available as external when satisfying %s' % (spec.name, spec)
condition_id = self.condition(spec, msg=msg)
self.gen.fact(
fn.possible_external(condition_id, pkg_name, local_idx)
)
@@ -1794,10 +1810,14 @@ def setup(self, driver, specs):
self.gen.h1("Concrete input spec definitions")
self.define_concrete_input_specs(specs, possible)
self.gen.h1("Concretizer options")
if self.reuse:
self.gen.fact(fn.optimize_for_reuse())
if self.minimal:
self.gen.fact(fn.minimal_installs())
if self.reuse:
self.gen.h1("Installed packages")
self.gen.fact(fn.optimize_for_reuse())
self.gen.newline()
self.define_installed_packages(specs, possible)
self.gen.h1('General Constraints')
@@ -1818,7 +1838,7 @@ def setup(self, driver, specs):
self.gen.h1('Package Constraints')
for pkg in sorted(pkgs):
self.gen.h2('Package rules: %s' % pkg)
self.pkg_rules(pkg, tests=self.tests)
self.pkg_rules(pkg)
self.gen.h2('Package preferences: %s' % pkg)
self.preferred_variants(pkg)
self.preferred_targets(pkg)
@@ -1920,6 +1940,17 @@ def node_os(self, pkg, os):
def node_target(self, pkg, target):
self._arch(pkg).target = target
def error(self, priority, msg, *args):
msg = msg.format(*args)
# For variant formatting, we sometimes have to construct specs
# to format values properly. Find/replace all occurrences of
# Spec(...) with the string representation of the spec mentioned
specs_to_construct = re.findall(r'Spec\(([^)]*)\)', msg)
for spec_str in specs_to_construct:
msg = msg.replace('Spec(%s)' % spec_str, str(spack.spec.Spec(spec_str)))
raise UnsatisfiableSpecError(msg)
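A quick demonstration of the `Spec(...)` substitution: error messages from the ASP program embed markers like `'Spec({1}={2})'` (visible in concretize.lp below), and this method replaces each one with the canonical string form of the parsed spec:

```python
import re

msg = "'Spec(foo=bar)' is not a valid value for 'mypkg' variant 'foo'"
for spec_str in re.findall(r"Spec\(([^)]*)\)", msg):
    # The real code substitutes str(spack.spec.Spec(spec_str)); for an
    # anonymous variant spec like this one, that is simply "foo=bar".
    msg = msg.replace("Spec(%s)" % spec_str, spec_str)
print(msg)  # 'foo=bar' is not a valid value for 'mypkg' variant 'foo'
```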
def variant_value(self, pkg, name, value):
# FIXME: is there a way not to special case 'dev_path' everywhere?
if name == 'dev_path':
@@ -2042,15 +2073,27 @@ def deprecated(self, pkg, version):
msg = 'using "{0}@{1}" which is a deprecated version'
tty.warn(msg.format(pkg, version))
@staticmethod
def sort_fn(function_tuple):
name = function_tuple[0]
if name == 'error':
priority = function_tuple[1][0]
return (-4, priority)
elif name == 'hash':
return (-3, 0)
elif name == 'node':
return (-2, 0)
elif name == 'node_compiler':
return (-1, 0)
else:
return (0, 0)
def build_specs(self, function_tuples):
# Functions don't seem to be in particular order in output. Sort
# them here so that directives that build objects (like node and
# node_compiler) are called in the right order.
function_tuples.sort(key=lambda f: {
"hash": -3,
"node": -2,
"node_compiler": -1,
}.get(f[0], 0))
self.function_tuples = function_tuples
self.function_tuples.sort(key=self.sort_fn)
self._specs = {}
for name, args in function_tuples:
@@ -2058,7 +2101,6 @@ def build_specs(self, function_tuples):
continue
action = getattr(self, name, None)
# print out unknown actions so we can display them for debugging
if not action:
msg = "%s(%s)" % (name, ", ".join(str(a) for a in args))
@@ -2068,16 +2110,18 @@ def build_specs(self, function_tuples):
assert action and callable(action)
# ignore predicates on virtual packages, as they're used for
# solving but don't construct anything
pkg = args[0]
if spack.repo.path.is_virtual(pkg):
continue
# solving but don't construct anything. Do not ignore error
# predicates on virtual packages.
if name != 'error':
pkg = args[0]
if spack.repo.path.is_virtual(pkg):
continue
# if we've already gotten a concrete spec for this pkg,
# do not bother calling actions on it.
spec = self._specs.get(pkg)
if spec and spec.concrete:
continue
# if we've already gotten a concrete spec for this pkg,
# do not bother calling actions on it.
spec = self._specs.get(pkg)
if spec and spec.concrete:
continue
action(*args)
@@ -2107,8 +2151,9 @@ def build_specs(self, function_tuples):
for s in self._specs.values():
_develop_specs_from_env(s, ev.active_environment())
for s in self._specs.values():
s._mark_concrete()
# mark concrete and assign hashes to all specs in the solve
for root in roots.values():
root._finalize_concretization()
for s in self._specs.values():
spack.spec.Spec.ensure_no_deprecated(s)
@@ -2151,6 +2196,10 @@ class Solver(object):
``reuse (bool)``
Whether to try to reuse existing installs/binaries
``minimal (bool)``
If ``True`` make minimizing nodes the top priority, even higher
than defaults from packages and preferences.
"""
def __init__(self):
self.driver = PyclingoDriver()
@@ -2158,6 +2207,7 @@ def __init__(self):
# These properties are settable via spack configuration, and overridable
# by setting them directly as properties.
self.reuse = spack.config.get("concretizer:reuse", False)
self.minimal = spack.config.get("concretizer:minimal", False)
def solve(
self,
@@ -2188,7 +2238,7 @@ def solve(
continue
spack.spec.Spec.ensure_valid_variants(s)
setup = SpackSolverSetup(reuse=self.reuse, tests=tests)
setup = SpackSolverSetup(reuse=self.reuse, minimal=self.minimal, tests=tests)
return self.driver.solve(
setup,
specs,
@@ -2201,25 +2251,27 @@ def solve(
class UnsatisfiableSpecError(spack.error.UnsatisfiableSpecError):
"""
Subclass for new constructor signature for new concretizer
"""
def __init__(self, msg):
super(spack.error.UnsatisfiableSpecError, self).__init__(msg)
self.provided = None
self.required = None
self.constraint_type = None
class InternalConcretizerError(spack.error.UnsatisfiableSpecError):
"""
Subclass for new constructor signature for new concretizer
"""
def __init__(self, provided, conflicts):
indented = [' %s\n' % conflict for conflict in conflicts]
conflict_msg = ''.join(indented)
issue = 'conflicts' if full_cores else 'errors'
msg = '%s is unsatisfiable, %s are:\n%s' % (provided, issue, conflict_msg)
newline_indent = '\n '
if not full_cores:
msg += newline_indent + 'To see full clingo unsat cores, '
msg += 're-run with `spack --show-cores=full`'
if not minimize_cores or not full_cores:
# not solver.minimalize_cores and not solver.full_cores impossible
msg += newline_indent + 'For full, subset-minimal unsat cores, '
msg += 're-run with `spack --show-cores=minimized'
msg += newline_indent
msg += 'Warning: This may take (up to) hours for some specs'
error_msg = ''.join(indented)
msg = 'Spack concretizer internal error. Please submit a bug report'
msg += '\n Please include the command, environment if applicable,'
msg += '\n and the following error message.'
msg += '\n %s is unsatisfiable, errors are:\n%s' % (provided, error_msg)
super(spack.error.UnsatisfiableSpecError, self).__init__(msg)

View File

@@ -7,22 +7,6 @@
% This logic program implements Spack's concretizer
%=============================================================================
%-----------------------------------------------------------------------------
% Generic constraints on nodes
%-----------------------------------------------------------------------------
% each node must have a single version
:- not 1 { version(Package, _) } 1, node(Package).
% each node must have a single platform, os and target
:- not 1 { node_platform(Package, _) } 1, node(Package), error("A node must have exactly one platform").
:- not 1 { node_os(Package, _) } 1, node(Package).
:- not 1 { node_target(Package, _) } 1, node(Package).
% each node has a single compiler associated with it
:- not 1 { node_compiler(Package, _) } 1, node(Package).
:- not 1 { node_compiler_version(Package, _, _) } 1, node(Package).
%-----------------------------------------------------------------------------
% Version semantics
%-----------------------------------------------------------------------------
@@ -35,7 +19,7 @@ version_declared(Package, Version, Weight) :- version_declared(Package, Version,
:- version_declared(Package, Version, Weight, Origin1),
version_declared(Package, Version, Weight, Origin2),
Origin1 < Origin2,
error("Internal error: two versions with identical weights").
internal_error("Two versions with identical weights").
% We cannot use a version declared for an installed package if we end up building it
:- version_declared(Package, Version, Weight, "installed"),
@@ -48,11 +32,27 @@ version_declared(Package, Version) :- version_declared(Package, Version, _).
% If something is a package, it has only one version and that must be a
% declared version.
1 { version(Package, Version) : version_declared(Package, Version) } 1
:- node(Package), error("Each node must have exactly one version").
% We allow clingo to choose any version(s), and infer an error if there
% is not precisely one version chosen. Error facts are heavily optimized
% against to ensure they cannot be inferred when a non-error solution is
% possible
{ version(Package, Version) : version_declared(Package, Version) }
:- node(Package).
error(2, "No version for '{0}' satisfies '@{1}' and '@{2}'", Package, Version1, Version2)
:- node(Package),
version(Package, Version1),
version(Package, Version2),
Version1 < Version2. % see[1]
% A virtual package may have or not a version, but never has more than one
:- virtual_node(Package), 2 { version(Package, _) }.
error(2, "No versions available for package '{0}'", Package)
:- node(Package), not version(Package, _).
% A virtual package may or may not have a version, but never has more than one
error(2, "No version for '{0}' satisfies '@{1}' and '@{2}'", Virtual, Version1, Version2)
:- virtual_node(Virtual),
version(Virtual, Version1),
version(Virtual, Version2),
Version1 < Version2. % see[1]
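The `Version1 < Version2` guard (flagged with `% see[1]`) is the usual symmetry-breaking trick: the rule fires once per unordered pair instead of twice per ordered pair. In Python terms:

```python
from itertools import product

versions = ["1.0", "1.2", "2.0"]  # pretend one node got three versions
# the ASP guard `Version1 < Version2`, expressed as a filter:
pairs = [(a, b) for a, b in product(versions, repeat=2) if a < b]
print(pairs)  # [('1.0', '1.2'), ('1.0', '2.0'), ('1.2', '2.0')]
```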
% If we select a deprecated version, mark the package as deprecated
deprecated(Package, Version) :- version(Package, Version), deprecated_version(Package, Version).
@@ -61,14 +61,27 @@ possible_version_weight(Package, Weight)
:- version(Package, Version),
version_declared(Package, Version, Weight).
1 { version_weight(Package, Weight) : possible_version_weight(Package, Weight) } 1 :- node(Package), error("Internal error: Package version must have a unique weight").
version_weight(Package, Weight)
:- version(Package, Version),
node(Package),
Weight = #min{W : version_declared(Package, Version, W)}.
% version_satisfies implies that exactly one of the satisfying versions
% node_version_satisfies implies that exactly one of the satisfying versions
% is the package's version, and vice versa.
1 { version(Package, Version) : version_satisfies(Package, Constraint, Version) } 1
:- version_satisfies(Package, Constraint),
error("no version satisfies the given constraints").
version_satisfies(Package, Constraint)
% While this choice rule appears redundant with the initial choice rule for
% versions, virtual nodes with version constraints require this rule to be
% able to choose versions
{ version(Package, Version) : version_satisfies(Package, Constraint, Version) }
:- node_version_satisfies(Package, Constraint).
% More specific error message if the version cannot satisfy some constraint
% Otherwise covered by `no_version_error` and `versions_conflict_error`.
error(1, "No valid version for '{0}' satisfies '@{1}'", Package, Constraint)
:- node_version_satisfies(Package, Constraint),
C = #count{ Version : version(Package, Version), version_satisfies(Package, Constraint, Version)},
C < 1.
node_version_satisfies(Package, Constraint)
:- version(Package, Version), version_satisfies(Package, Constraint, Version).
#defined version_satisfies/3.
@@ -87,7 +100,7 @@ version_satisfies(Package, Constraint)
% conditions are specified with `condition_requirement` and hold when
% corresponding spec attributes hold.
condition_holds(ID) :-
condition(ID);
condition(ID, _);
attr(Name, A1) : condition_requirement(ID, Name, A1);
attr(Name, A1, A2) : condition_requirement(ID, Name, A1, A2);
attr(Name, A1, A2, A3) : condition_requirement(ID, Name, A1, A2, A3).
@@ -106,7 +119,12 @@ attr(Name, A1, A2, A3) :- impose(ID), imposed_constraint(ID, Name, A1, A2, A3).
variant_value(Package, Variant, Value),
not imposed_constraint(Hash, "variant_value", Package, Variant, Value).
#defined condition/1.
% we cannot have additional flag values when we are working with concrete specs
:- node(Package), hash(Package, Hash),
node_flag(Package, FlagType, Flag),
not imposed_constraint(Hash, "node_flag", Package, FlagType, Flag).
#defined condition/2.
#defined condition_requirement/3.
#defined condition_requirement/4.
#defined condition_requirement/5.
@@ -133,9 +151,9 @@ depends_on(Package, Dependency) :- depends_on(Package, Dependency, _).
dependency_holds(Package, Dependency, Type) :-
dependency_condition(ID, Package, Dependency),
dependency_type(ID, Type),
condition_holds(ID),
build(Package),
not external(Package).
not external(Package),
condition_holds(ID).
% We cut off dependencies of externals (as we don't really know them).
% Don't impose constraints on dependencies that don't exist.
@@ -161,17 +179,18 @@ node(Dependency) :- node(Package), depends_on(Package, Dependency).
% dependencies) and get a two-node unconnected graph
needed(Package) :- root(Package).
needed(Dependency) :- needed(Package), depends_on(Package, Dependency).
:- node(Package), not needed(Package),
error("All dependencies must be reachable from root").
error(1, "'{0}' is not a valid dependency for any package in the DAG", Package)
:- node(Package),
not needed(Package).
% Avoid cycles in the DAG
% some combinations of conditional dependencies can result in cycles;
% this ensures that we solve around them
path(Parent, Child) :- depends_on(Parent, Child).
path(Parent, Descendant) :- path(Parent, A), depends_on(A, Descendant).
:- path(A, B), path(B, A), error("Cyclic dependencies are not allowed").
#defined error/1.
error(2, "Cyclic dependency detected between '{0}' and '{1}'\n Consider changing variants to avoid the cycle", A, B)
:- path(A, B),
path(B, A).
#defined dependency_type/2.
#defined dependency_condition/3.
@@ -179,14 +198,13 @@ path(Parent, Descendant) :- path(Parent, A), depends_on(A, Descendant).
%-----------------------------------------------------------------------------
% Conflicts
%-----------------------------------------------------------------------------
:- node(Package),
conflict(Package, TriggerID, ConstraintID),
error(0, Msg) :- node(Package),
conflict(Package, TriggerID, ConstraintID, Msg),
condition_holds(TriggerID),
condition_holds(ConstraintID),
not external(Package), % ignore conflicts for externals
error("A conflict was triggered").
not external(Package). % ignore conflicts for externals
#defined conflict/3.
#defined conflict/4.
%-----------------------------------------------------------------------------
% Virtual dependencies
@@ -206,8 +224,17 @@ virtual_node(Virtual)
% If there's a virtual node, we must select one and only one provider.
% The provider must be selected among the possible providers.
1 { provider(Package, Virtual) : possible_provider(Package, Virtual) } 1
:- virtual_node(Virtual), error("Virtual packages must be satisfied by a unique provider").
{ provider(Package, Virtual) : possible_provider(Package, Virtual) }
:- virtual_node(Virtual).
error(2, "Cannot find valid provider for virtual {0}", Virtual)
:- virtual_node(Virtual),
P = #count{ Package : provider(Package, Virtual)},
P < 1.
error(2, "Spec cannot include multiple providers for virtual '{0}'\n Requested '{1}' and '{2}'", Virtual, P1, P2)
:- virtual_node(Virtual),
provider(P1, Virtual),
provider(P2, Virtual),
P1 < P2.
% virtual roots imply virtual nodes, and that one provider is a root
virtual_node(Virtual) :- virtual_root(Virtual).
@@ -232,7 +259,7 @@ virtual_condition_holds(Provider, Virtual) :-
% A package cannot be the actual provider for a virtual if it does not
% fulfill the conditions to provide that virtual
:- provider(Package, Virtual), not virtual_condition_holds(Package, Virtual),
error("Internal error: virtual when provides not respected").
internal_error("Virtual when provides not respected").
#defined possible_provider/2.
@@ -245,7 +272,7 @@ virtual_condition_holds(Provider, Virtual) :-
% we select the weight, among the possible ones, that minimizes the overall objective function.
1 { provider_weight(Dependency, Virtual, Weight, Reason) :
possible_provider_weight(Dependency, Virtual, Weight, Reason) } 1
:- provider(Dependency, Virtual), error("Internal error: package provider weights must be unique").
:- provider(Dependency, Virtual), internal_error("Package provider weights must be unique").
% Get rid or the reason for enabling the possible weight (useful for debugging)
provider_weight(Dependency, Virtual, Weight) :- provider_weight(Dependency, Virtual, Weight, _).
@@ -291,7 +318,7 @@ node(Package) :- attr("node", Package).
virtual_node(Virtual) :- attr("virtual_node", Virtual).
hash(Package, Hash) :- attr("hash", Package, Hash).
version(Package, Version) :- attr("version", Package, Version).
version_satisfies(Package, Constraint) :- attr("version_satisfies", Package, Constraint).
node_version_satisfies(Package, Constraint) :- attr("node_version_satisfies", Package, Constraint).
node_platform(Package, Platform) :- attr("node_platform", Package, Platform).
node_os(Package, OS) :- attr("node_os", Package, OS).
node_target(Package, Target) :- attr("node_target", Package, Target).
@@ -310,7 +337,7 @@ attr("node", Package) :- node(Package).
attr("virtual_node", Virtual) :- virtual_node(Virtual).
attr("hash", Package, Hash) :- hash(Package, Hash).
attr("version", Package, Version) :- version(Package, Version).
attr("version_satisfies", Package, Constraint) :- version_satisfies(Package, Constraint).
attr("node_version_satisfies", Package, Constraint) :- node_version_satisfies(Package, Constraint).
attr("node_platform", Package, Platform) :- node_platform(Package, Platform).
attr("node_os", Package, OS) :- node_os(Package, OS).
attr("node_target", Package, Target) :- node_target(Package, Target).
@@ -338,7 +365,7 @@ attr("node_compiler_version_satisfies", Package, Compiler, Version)
#defined external_only/1.
#defined pkg_provider_preference/4.
#defined default_provider_preference/3.
#defined version_satisfies/2.
#defined node_version_satisfies/2.
#defined node_compiler_version_satisfies/3.
#defined root/1.
@@ -347,9 +374,17 @@ attr("node_compiler_version_satisfies", Package, Compiler, Version)
%-----------------------------------------------------------------------------
% if a package is external its version must be one of the external versions
1 { external_version(Package, Version, Weight):
version_declared(Package, Version, Weight, "external") } 1
:- external(Package), error("External package version does not satisfy external spec").
{ external_version(Package, Version, Weight):
version_declared(Package, Version, Weight, "external") }
:- external(Package).
error(2, "Attempted to use external for '{0}' which does not satisfy any configured external spec", Package)
:- external(Package),
not external_version(Package, _, _).
error(2, "Attempted to use external for '{0}' which does not satisfy any configured external spec", Package)
:- external(Package),
external_version(Package, Version1, Weight1),
external_version(Package, Version2, Weight2),
(Version1, Weight1) < (Version2, Weight2). % see[1]
version_weight(Package, Weight) :- external_version(Package, Version, Weight).
version(Package, Version) :- external_version(Package, Version, Weight).
@@ -369,7 +404,7 @@ external(Package) :- external_spec_selected(Package, _).
version_weight(Package, Weight),
version_declared(Package, Version, Weight, "external"),
not external(Package),
error("Internal error: external weight used for internal spec").
internal_error("External weight used for internal spec").
% determine if an external spec has been selected
external_spec_selected(Package, LocalIndex) :-
@@ -381,8 +416,9 @@ external_conditions_hold(Package, LocalIndex) :-
% it cannot happen that a spec is external, but none of the external specs
% conditions hold.
:- external(Package), not external_conditions_hold(Package, _),
error("External package does not satisfy external spec").
error(2, "Attempted to use external for '{0}' which does not satisfy any configured external spec", Package)
:- external(Package),
not external_conditions_hold(Package, _).
#defined possible_external/3.
#defined external_spec_index/3.
@@ -399,16 +435,16 @@ variant(Package, Variant) :- variant_condition(ID, Package, Variant),
condition_holds(ID).
% a variant cannot be set if it is not a variant on the package
:- variant_set(Package, Variant),
not variant(Package, Variant),
build(Package),
error("Unsatisfied conditional variants cannot be set").
error(2, "Cannot set variant '{0}' for package '{1}' because the variant condition cannot be satisfied for the given spec", Package, Variant)
:- variant_set(Package, Variant),
not variant(Package, Variant),
build(Package).
% a variant cannot take on a value if it is not a variant of the package
:- variant_value(Package, Variant, _),
not variant(Package, Variant),
build(Package),
error("Unsatisfied conditional variants cannot take on a variant value").
error(2, "Cannot set variant '{0}' for package '{1}' because the variant condition cannot be satisfied for the given spec", Package, Variant)
:- variant_value(Package, Variant, _),
not variant(Package, Variant),
build(Package).
% if a variant is sticky and not set its value is the default value
variant_value(Package, Variant, Value) :-
@@ -418,27 +454,30 @@ variant_value(Package, Variant, Value) :-
variant_default_value(Package, Variant, Value),
build(Package).
% one variant value for single-valued variants.
1 {
% at most one variant value for single-valued variants.
{
variant_value(Package, Variant, Value)
: variant_possible_value(Package, Variant, Value)
} 1
:- node(Package),
variant(Package, Variant),
variant_single_value(Package, Variant),
build(Package),
error("Single valued variants must have a single value").
% at least one variant value for multi-valued variants.
1 {
variant_value(Package, Variant, Value)
: variant_possible_value(Package, Variant, Value)
}
:- node(Package),
variant(Package, Variant),
not variant_single_value(Package, Variant),
build(Package),
error("Internal error: All variants must have a value").
build(Package).
error(2, "'{0}' required multiple values for single-valued variant '{1}'\n Requested 'Spec({1}={2})' and 'Spec({1}={3})'", Package, Variant, Value1, Value2)
:- node(Package),
variant(Package, Variant),
variant_single_value(Package, Variant),
build(Package),
variant_value(Package, Variant, Value1),
variant_value(Package, Variant, Value2),
Value1 < Value2. % see[1]
error(2, "No valid value for variant '{1}' of package '{0}'", Package, Variant)
:- node(Package),
variant(Package, Variant),
build(Package),
C = #count{ Value : variant_value(Package, Variant, Value) },
C < 1.
% if a variant is set to anything, it is considered 'set'.
variant_set(Package, Variant) :- variant_set(Package, Variant, _).
@@ -446,21 +485,21 @@ variant_set(Package, Variant) :- variant_set(Package, Variant, _).
% A variant cannot have a value that is not also a possible value
% This only applies to packages we need to build -- concrete packages may
% have been built w/different variants from older/different package versions.
:- variant_value(Package, Variant, Value),
not variant_possible_value(Package, Variant, Value),
build(Package),
error("Variant set to invalid value").
error(1, "'Spec({1}={2})' is not a valid value for '{0}' variant '{1}'", Package, Variant, Value)
:- variant_value(Package, Variant, Value),
not variant_possible_value(Package, Variant, Value),
build(Package).
% Some multi valued variants accept multiple values from disjoint sets.
% Ensure that we respect that constraint and we don't pick values from more
% than one set at once
:- variant_value(Package, Variant, Value1),
variant_value(Package, Variant, Value2),
variant_value_from_disjoint_sets(Package, Variant, Value1, Set1),
variant_value_from_disjoint_sets(Package, Variant, Value2, Set2),
Set1 < Set2,
build(Package),
error("Variant values selected from multiple disjoint sets").
error(2, "{0} variant '{1}' cannot have values '{2}' and '{3}' as they come from disjoing value sets", Package, Variant, Value1, Value2)
:- variant_value(Package, Variant, Value1),
variant_value(Package, Variant, Value2),
variant_value_from_disjoint_sets(Package, Variant, Value1, Set1),
variant_value_from_disjoint_sets(Package, Variant, Value2, Set2),
Set1 < Set2, % see[1]
build(Package).
% variant_set is an explicitly set variant value. If it's not 'set',
% we revert to the default value. If it is set, we force the set value
@@ -518,12 +557,11 @@ variant_default_value(Package, Variant, Value) :- variant_default_value_from_cli
% Treat 'none' in a special way - it cannot be combined with other
% values even if the variant is multi-valued
:- 2 {
variant_value(Package, Variant, Value) : variant_possible_value(Package, Variant, Value)
},
variant_value(Package, Variant, "none"),
build(Package),
error("Variant value 'none' cannot be combined with any other value").
error(2, "{0} variant '{1}' cannot have values '{2}' and 'none'", Package, Variant, Value)
:- variant_value(Package, Variant, Value),
variant_value(Package, Variant, "none"),
Value != "none",
build(Package).
% patches and dev_path are special variants -- they don't have to be
% declared in the package, so we just allow them to spring into existence
@@ -567,6 +605,18 @@ node_platform(Package, Platform)
% platform is set if set to anything
node_platform_set(Package) :- node_platform_set(Package, _).
% each node must have a single platform
error(2, "No valid platform found for {0}", Package)
:- node(Package),
C = #count{ Platform : node_platform(Package, Platform)},
C < 1.
error(2, "Cannot concretize {0} with multiple platforms\n Requested 'platform={1}' and 'platform={2}'", Package, Platform1, Platform2)
:- node(Package),
node_platform(Package, Platform1),
node_platform(Package, Platform2),
Platform1 < Platform2. % see[1]
#defined node_platform_set/2. % avoid warnings
%-----------------------------------------------------------------------------
@@ -576,20 +626,32 @@ node_platform_set(Package) :- node_platform_set(Package, _).
os(OS) :- os(OS, _).
% one os per node
1 { node_os(Package, OS) : os(OS) } 1 :-
node(Package), error("Each node must have exactly one OS").
{ node_os(Package, OS) : os(OS) } :- node(Package).
error(2, "Cannot find valid operating system for '{0}'", Package)
:- node(Package),
C = #count{ OS : node_os(Package, OS)},
C < 1.
error(2, "Cannot concretize {0} with multiple operating systems\n Requested 'os={1}' and 'os={2}'", Package, OS1, OS2)
:- node(Package),
node_os(Package, OS1),
node_os(Package, OS2),
OS1 < OS2. % see[1]
% can't have a non-buildable OS on a node we need to build
:- build(Package), node_os(Package, OS), not buildable_os(OS),
error("No available OS can be built for").
error(2, "Cannot concretize '{0} os={1}'. Operating system '{1}' is not buildable", Package, OS)
:- build(Package),
node_os(Package, OS),
not buildable_os(OS).
% can't have dependencies on incompatible OS's
:- depends_on(Package, Dependency),
node_os(Package, PackageOS),
node_os(Dependency, DependencyOS),
not os_compatible(PackageOS, DependencyOS),
build(Package),
error("Dependencies must have compatible OS's with their dependents").
error(2, "{0} and dependency {1} have incompatible operating systems 'os={2}' and 'os={3}'", Package, Dependency, PackageOS, DependencyOS)
:- depends_on(Package, Dependency),
node_os(Package, PackageOS),
node_os(Dependency, DependencyOS),
not os_compatible(PackageOS, DependencyOS),
build(Package).
% give OS choice weights according to os declarations
node_os_weight(Package, Weight)
@@ -621,14 +683,24 @@ node_os(Package, OS) :- node_os_set(Package, OS), node(Package).
%-----------------------------------------------------------------------------
% Each node has only one target chosen among the known targets
1 { node_target(Package, Target) : target(Target) } 1 :- node(Package), error("Each node must have exactly one target").
{ node_target(Package, Target) : target(Target) } :- node(Package).
error(2, "Cannot find valid target for '{0}'", Package)
:- node(Package),
C = #count{Target : node_target(Package, Target)},
C < 1.
error(2, "Cannot concretize '{0}' with multiple targets\n Requested 'target={1}' and 'target={2}'", Package, Target1, Target2)
:- node(Package),
node_target(Package, Target1),
node_target(Package, Target2),
Target1 < Target2. % see[1]
% If a node must satisfy a target constraint, enforce it
:- node_target(Package, Target),
node_target_satisfies(Package, Constraint),
not target_satisfies(Constraint, Target),
error("Node targets must satisfy node target constraints").
error(1, "'{0} target={1}' cannot satisfy constraint 'target={2}'", Package, Target, Constraint)
:- node_target(Package, Target),
node_target_satisfies(Package, Constraint),
not target_satisfies(Constraint, Target).
% If a node has a target and the target satisfies a constraint, then the target
% associated with the node satisfies the same constraint
@@ -636,10 +708,10 @@ node_target_satisfies(Package, Constraint)
:- node_target(Package, Target), target_satisfies(Constraint, Target).
% If a node has a target, all of its dependencies must be compatible with that target
:- depends_on(Package, Dependency),
node_target(Package, Target),
not node_target_compatible(Dependency, Target),
error("Dependency node targets must be compatible with dependent targets").
error(2, "Cannot find compatible targets for {0} and {1}", Package, Dependency)
:- depends_on(Package, Dependency),
node_target(Package, Target),
not node_target_compatible(Dependency, Target).
% Intermediate step for performance reasons
% When the integrity constraint above was formulated including this logic
@@ -680,12 +752,12 @@ target_weight(Target, Package, Weight)
:- package_target_weight(Target, Package, Weight).
% can't use targets on node if the compiler for the node doesn't support them
:- node_target(Package, Target),
not compiler_supports_target(Compiler, Version, Target),
node_compiler(Package, Compiler),
node_compiler_version(Package, Compiler, Version),
build(Package),
error("No satisfying compiler available is compatible with a satisfying target").
error(2, "{0} compiler '{2}@{3}' incompatible with 'target={1}'", Package, Target, Compiler, Version)
:- node_target(Package, Target),
not compiler_supports_target(Compiler, Version, Target),
node_compiler(Package, Compiler),
node_compiler_version(Package, Compiler, Version),
build(Package).
% if a target is set explicitly, respect it
node_target(Package, Target)
@@ -712,8 +784,10 @@ node_target_mismatch(Parent, Dependency)
not node_target_match(Parent, Dependency).
% disallow reusing concrete specs that don't have a compatible target
:- node(Package), node_target(Package, Target), not target(Target),
error("No satisfying package's target is compatible with this machine").
error(2, "'{0} target={1}' is not compatible with this machine", Package, Target)
:- node(Package),
node_target(Package, Target),
not target(Target).
#defined node_target_set/2.
#defined package_target_weight/3.
@@ -725,10 +799,19 @@ compiler(Compiler) :- compiler_version(Compiler, _).
% There must be only one compiler set per built node. The compiler
% is chosen among available versions.
1 { node_compiler_version(Package, Compiler, Version) : compiler_version(Compiler, Version) } 1 :-
{ node_compiler_version(Package, Compiler, Version) : compiler_version(Compiler, Version) } :-
node(Package),
build(Package),
error("Each node must have exactly one compiler").
build(Package).
error(2, "No valid compiler version found for '{0}'", Package)
:- node(Package),
C = #count{ Version : node_compiler_version(Package, _, Version)},
C < 1.
error(2, "'{0}' compiler constraints '%{1}@{2}' and '%{3}@{4}' are incompatible", Package, Compiler1, Version1, Compiler2, Version2)
:- node(Package),
node_compiler_version(Package, Compiler1, Version1),
node_compiler_version(Package, Compiler2, Version2),
(Compiler1, Version1) < (Compiler2, Version2). % see[1]
% Sometimes we just need to know the compiler and not the version
node_compiler(Package, Compiler) :- node_compiler_version(Package, Compiler, _).
@@ -737,14 +820,22 @@ node_compiler(Package, Compiler) :- node_compiler_version(Package, Compiler, _).
:- node_compiler(Package, Compiler1),
node_compiler_version(Package, Compiler2, _),
Compiler1 != Compiler2,
error("Internal error: mismatch between selected compiler and compiler version").
internal_error("Mismatch between selected compiler and compiler version").
% If the compiler of a node cannot be satisfied, raise
error(1, "No valid compiler for {0} satisfies '%{1}'", Package, Compiler)
:- node(Package),
node_compiler_version_satisfies(Package, Compiler, ":"),
C = #count{ Version : node_compiler_version(Package, Compiler, Version), compiler_version_satisfies(Compiler, ":", Version) },
C < 1.
% If the compiler of a node must satisfy a constraint, then its version
% must be chosen among the ones that satisfy said constraint
1 { node_compiler_version(Package, Compiler, Version)
: compiler_version_satisfies(Compiler, Constraint, Version) } 1 :-
node_compiler_version_satisfies(Package, Compiler, Constraint),
error("Internal error: node compiler version mismatch").
error(2, "No valid version for '{0}' compiler '{1}' satisfies '@{2}'", Package, Compiler, Constraint)
:- node(Package),
node_compiler_version_satisfies(Package, Compiler, Constraint),
C = #count{ Version : node_compiler_version(Package, Compiler, Version), compiler_version_satisfies(Compiler, Constraint, Version) },
C < 1.
% If the node is associated with a compiler and the compiler satisfy a constraint, then
% the compiler associated with the node satisfy the same constraint
@@ -762,11 +853,12 @@ node_compiler_version(Package, Compiler, Version) :- node_compiler_version_set(P
% Cannot select a compiler if it is not supported on the OS
% Compilers that are explicitly marked as allowed
% are excluded from this check
:- node_compiler_version(Package, Compiler, Version), node_os(Package, OS),
not compiler_supports_os(Compiler, Version, OS),
not allow_compiler(Compiler, Version),
build(Package),
error("No satisfying compiler available is compatible with a satisfying os").
error(2, "{0} compiler '%{1}@{2}' incompatible with 'os={3}'", Package, Compiler, Version, OS)
:- node_compiler_version(Package, Compiler, Version),
node_os(Package, OS),
not compiler_supports_os(Compiler, Version, OS),
not allow_compiler(Compiler, Version),
build(Package).
% If a package and one of its dependencies don't have the
% same compiler there's a mismatch.
@@ -859,7 +951,7 @@ no_flags(Package, FlagType)
%-----------------------------------------------------------------------------
% the solver is free to choose at most one installed hash for each package
{ hash(Package, Hash) : installed_hash(Package, Hash) } 1
:- node(Package), error("Internal error: package must resolve to at most one hash").
:- node(Package), internal_error("Package must resolve to at most one hash").
% you can't choose an installed hash for a dev spec
:- hash(Package, Hash), variant_value(Package, "dev_path", _).
@@ -870,26 +962,36 @@ impose(Hash) :- hash(Package, Hash).
% if we haven't selected a hash for a package, we'll be building it
build(Package) :- not hash(Package, _), node(Package).
% Minimizing builds is tricky. We want a minimizing criterion
% because we want to reuse what is available, but
% we also want things that are built to stick to *default preferences* from
% the package and from the user. We therefore treat built specs differently and apply
% a different set of optimization criteria to them. Spack's *first* priority is to
% reuse what it *can*, but if it builds something, the built specs will respect
% defaults and preferences. This is implemented by bumping the priority of optimization
% criteria for built specs -- so that they take precedence over the otherwise
% topmost-priority criterion to reuse what is installed.
% Minimizing builds is tricky. We want a minimizing criterion because we want to reuse
% what is available, but we also want things that are built to stick to *default
% preferences* from the package and from the user. We therefore treat built specs
% differently and apply a different set of optimization criteria to them. Spack's first
% priority is to reuse what it can, but if it builds something, the built specs will
% respect defaults and preferences.
%
% This is implemented by bumping the priority of optimization criteria for built specs
% -- so that they take precedence over the otherwise topmost-priority criterion to reuse
% what is installed.
%
% If the user explicitly asks for *minimal* installs, we don't differentiate between
% built and reused specs - the top priority is just minimizing builds.
%
% The priority ranges are:
% 200+ Shifted priorities for build nodes; correspond to priorities 0 - 99.
% 100 - 199 Unshifted priorities. Currently only includes minimizing #builds.
% 0 - 99 Priorities for non-built nodes.
build_priority(Package, 200) :- build(Package), node(Package), optimize_for_reuse().
build_priority(Package, 0) :- not build(Package), node(Package), optimize_for_reuse().
build_priority(Package, 200) :- node(Package), build(Package), optimize_for_reuse(),
not minimal_installs().
build_priority(Package, 0) :- node(Package), not build(Package), optimize_for_reuse().
% don't adjust build priorities if reuse is not enabled
% Don't adjust build priorities if not reusing, or if doing minimal installs
% With minimal, minimizing builds is the TOP priority
build_priority(Package, 0) :- node(Package), not optimize_for_reuse().
build_priority(Package, 0) :- node(Package), minimal_installs().
% Minimize builds with both --reuse and --minimal
minimize_builds() :- optimize_for_reuse().
minimize_builds() :- minimal_installs().
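A toy rendering of the priority bands described in the comment above (illustrative Python, not Spack code): criteria for built nodes are bumped past the reuse criterion at 100 unless minimal installs are requested, in which case nothing is shifted and minimizing builds dominates.

```python
def shifted_priority(criterion_priority, built, reuse=True, minimal=False):
    """Toy version of build_priority: shift criteria only for built nodes."""
    if built and reuse and not minimal:
        return criterion_priority + 200  # 200+ band: outranks reuse at 100
    return criterion_priority            # 0-99 band

print(shifted_priority(55, built=True))                # 255
print(shifted_priority(55, built=False))               # 55
print(shifted_priority(55, built=True, minimal=True))  # 55: builds come first
```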
% don't assign versions from installed packages unless reuse is enabled
% NOTE: that "installed" means the declared version was only included because
@@ -908,6 +1010,24 @@ build_priority(Package, 0) :- node(Package), not optimize_for_reuse().
not optimize_for_reuse().
#defined installed_hash/2.
#defined minimal_installs/0.
%-----------------------------------------------------------------
% Optimization to avoid errors
%-----------------------------------------------------------------
% Some errors are handled as rules instead of constraints because
% it allows us to explain why something failed. Here we optimize
% HEAVILY against the facts generated by those rules.
#minimize{ 0@1000: #true}.
#minimize{ 0@1001: #true}.
#minimize{ 0@1002: #true}.
#minimize{ 1000@1000+Priority,Msg: error(Priority, Msg) }.
#minimize{ 1000@1000+Priority,Msg,Arg1: error(Priority, Msg, Arg1) }.
#minimize{ 1000@1000+Priority,Msg,Arg1,Arg2: error(Priority, Msg, Arg1, Arg2) }.
#minimize{ 1000@1000+Priority,Msg,Arg1,Arg2,Arg3: error(Priority, Msg, Arg1, Arg2, Arg3) }.
#minimize{ 1000@1000+Priority,Msg,Arg1,Arg2,Arg3,Arg4: error(Priority, Msg, Arg1, Arg2, Arg3, Arg4) }.
#minimize{ 1000@1000+Priority,Msg,Arg1,Arg2,Arg3,Arg4,Arg5: error(Priority, Msg, Arg1, Arg2, Arg3, Arg4, Arg5) }.
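The zero-weight statements just register priority levels 1000-1002 so they always appear in the optimization, while each derived error fact costs 1000 at level `1000+Priority`, so higher-priority errors are optimized away first. A rough sketch of watching those costs through the clingo Python API (illustrative program, not Spack's):

```python
import clingo

PROGRAM = """
{ pick }.
error(2, "picked the bad option") :- pick.
#minimize{ 0@1002 : #true }.
#minimize{ 1000@1000+P,M : error(P, M) }.
#show error/2.
"""

ctl = clingo.Control(["--opt-mode=opt"])
ctl.add("base", [], PROGRAM)
ctl.ground([("base", [])])
with ctl.solve(yield_=True) as handle:
    for model in handle:
        # model.cost lists totals per priority level, highest priority first
        print(model.cost, [str(s) for s in model.symbols(shown=True)])
# the optimal model has cost 0 and contains no error atoms
```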
%-----------------------------------------------------------------------------
% How to optimize the spec (high to low priority)
@@ -920,7 +1040,7 @@ build_priority(Package, 0) :- node(Package), not optimize_for_reuse().
% Try hard to reuse installed packages (i.e., minimize the number built)
opt_criterion(100, "number of packages to build (vs. reuse)").
#minimize { 0@100: #true }.
#minimize { 1@100,Package : build(Package), optimize_for_reuse() }.
#minimize { 1@100,Package : build(Package), minimize_builds() }.
#defined optimize_for_reuse/0.
% Minimize the number of deprecated versions being used
@@ -1088,3 +1208,11 @@ opt_criterion(1, "non-preferred targets").
#heuristic variant_value(Package, Variant, Value) : variant_default_value(Package, Variant, Value), node(Package). [10, true]
#heuristic provider(Package, Virtual) : possible_provider_weight(Package, Virtual, 0, _), virtual_node(Virtual). [10, true]
#heuristic node(Package) : possible_provider_weight(Package, Virtual, 0, _), virtual_node(Virtual). [10, true]
%-----------
% Notes
%-----------
% [1] Clingo ensures a total ordering among all atoms. We rely on that total ordering
% to reduce symmetry in the solution by checking `<` instead of `!=` in symmetric
% cases. These choices are made without loss of generality.
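For intuition, the same trick in a few lines of Python: with a total order, checking `<` keeps one representative of each symmetric pair that `!=` would produce twice.

```python
values = ["mpich", "openmpi", "zlib"]
neq = [(a, b) for a in values for b in values if a != b]
lt = [(a, b) for a in values for b in values if a < b]
print(len(neq), len(lt))  # 6 3 -- `<` halves the symmetric groundings
print(lt)  # [('mpich', 'openmpi'), ('mpich', 'zlib'), ('openmpi', 'zlib')]
```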


@@ -34,3 +34,13 @@
% deprecated packages
#show deprecated/2.
% error types
#show error/2.
#show error/3.
#show error/4.
#show error/5.
#show error/6.
#show error/7.
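Each shown atom carries a priority, a format string, and up to five arguments, so `error/2` through `error/7` cover every arity the rules above emit. On the Python side they can be turned into messages roughly like this (a sketch, not Spack's exact extraction code):

```python
# e.g. an error/3 atom decomposed as (priority, format string, argument)
priority, fmt, args = 2, "Cannot find valid target for '{0}'", ("hdf5",)
print(fmt.format(*args))  # Cannot find valid target for 'hdf5'
```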
% debug


@@ -1776,7 +1776,7 @@ def spec_hash(self, hash):
json_text = sjson.dump(node_dict)
return spack.util.hash.b32_hash(json_text)
def _cached_hash(self, hash, length=None):
def _cached_hash(self, hash, length=None, force=False):
"""Helper function for storing a cached hash on the spec.
This will run spec_hash() with the deptype and package_hash
@@ -1785,6 +1785,8 @@ def _cached_hash(self, hash, length=None):
Arguments:
hash (spack.hash_types.SpecHashDescriptor): type of hash to generate.
length (int): length of hash prefix to return (default is full hash string)
force (bool): cache the hash even if spec is not concrete (default False)
"""
if not hash.attr:
return self.spec_hash(hash)[:length]
@@ -1794,13 +1796,20 @@ def _cached_hash(self, hash, length=None):
return hash_string[:length]
else:
hash_string = self.spec_hash(hash)
if self.concrete:
if force or self.concrete:
setattr(self, hash.attr, hash_string)
return hash_string[:length]
def package_hash(self):
"""Compute the hash of the contents of the package for this node"""
# Concrete specs with the old DAG hash did not have the package hash, so we do
# not know what the package looked like at concretization time
if self.concrete and not self._package_hash:
raise ValueError(
"Cannot call package_hash() on concrete specs with the old dag_hash()"
)
return self._cached_hash(ht.package_hash)
def dag_hash(self, length=None):
@@ -1923,8 +1932,13 @@ def to_node_dict(self, hash=ht.dag_hash):
if hasattr(variant, '_patches_in_order_of_appearance'):
d['patches'] = variant._patches_in_order_of_appearance
if self._concrete and hash.package_hash:
package_hash = self.package_hash()
if self._concrete and hash.package_hash and self._package_hash:
# We use the attribute here instead of `self.package_hash()` because this
# should *always* be assigned at concretization time. We don't want to try
# to compute a package hash for concrete spec where a) the package might not
# exist, or b) the `dag_hash` didn't include the package hash when the spec
# was concretized.
package_hash = self._package_hash
# Full hashes are in bytes
if (not isinstance(package_hash, six.text_type)
@@ -2694,8 +2708,8 @@ def _old_concretize(self, tests=False, deprecation_warning=True):
# TODO: or turn external_path into a lazy property
Spec.ensure_external_path_if_external(s)
# Mark everything in the spec as concrete, as well.
self._mark_concrete()
# assign hashes and mark concrete
self._finalize_concretization()
# If any spec in the DAG is deprecated, throw an error
Spec.ensure_no_deprecated(self)
@@ -2725,6 +2739,21 @@ def _old_concretize(self, tests=False, deprecation_warning=True):
# there are declared inconsistencies)
self.architecture.target.optimization_flags(self.compiler)
def _patches_assigned(self):
"""Whether patches have been assigned to this spec by the concretizer."""
# FIXME: _patches_in_order_of_appearance is attached after concretization
# FIXME: to store the order of patches.
# FIXME: Probably needs to be refactored in a cleaner way.
if "patches" not in self.variants:
return False
# ensure that patch state is consistent
patch_variant = self.variants["patches"]
assert hasattr(patch_variant, "_patches_in_order_of_appearance"), \
"patches should always be assigned with a patch variant."
return True
@staticmethod
def inject_patches_variant(root):
# This dictionary will store object IDs rather than Specs as keys
@@ -2859,7 +2888,6 @@ def _new_concretize(self, tests=False):
concretized = answer[name]
self._dup(concretized)
self._mark_concrete()
def concretize(self, tests=False):
"""Concretize the current spec.
@@ -2896,6 +2924,64 @@ def _mark_concrete(self, value=True):
s.clear_cached_hashes()
s._mark_root_concrete(value)
def _assign_hash(self, hash):
"""Compute and cache the provided hash type for this spec and its dependencies.
Arguments:
hash (spack.hash_types.SpecHashDescriptor): the hash to assign to nodes
in the spec.
There are special semantics to consider for `package_hash`.
This should be called:
1. for `package_hash`, immediately after concretization, but *before* marking
concrete, and
2. for `dag_hash`, immediately after marking concrete.
`package_hash` is tricky, because we can't call it on *already* concrete specs,
but we need to assign it *at concretization time* to just-concretized specs. So,
the concretizer must assign the package hash *before* marking their specs
concrete (so that the only concrete specs are the ones already marked concrete).
`dag_hash` is also tricky, since it cannot compute `package_hash()` lazily for
the same reason. `package_hash` needs to be assigned *at concretization time*,
so, `to_node_dict()` can't just assume that it can compute `package_hash` itself
-- it needs to either see or not see a `_package_hash` attribute.
Rules of thumb for `package_hash`:
1. Old-style concrete specs from *before* `dag_hash` included `package_hash`
will not have a `_package_hash` attribute at all.
2. New-style concrete specs will have a `_package_hash` assigned at
concretization time.
3. Abstract specs will not have a `_package_hash` attribute at all.
"""
for spec in self.traverse():
# Already concrete specs either already have a package hash (new dag_hash())
# or they never will b/c we can't know it (old dag_hash()). Skip them.
if hash is ht.package_hash and not spec.concrete:
spec._cached_hash(hash, force=True)
# keep this check here to ensure package hash is saved
assert getattr(spec, hash.attr)
else:
spec._cached_hash(hash)
def _finalize_concretization(self):
"""Assign hashes to this spec, and mark it concrete.
This is called at the end of concretization.
"""
# See docs in _assign_hash for why package_hash needs to happen here.
self._assign_hash(ht.package_hash)
# Mark everything in the spec as concrete
self._mark_concrete()
# Assign dag_hash (this *could* be done lazily, but it's assigned anyway in
# ensure_no_deprecated, and it's clearer to see explicitly where it happens)
self._assign_hash(ht.dag_hash)
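A toy illustration of the ordering `_finalize_concretization` enforces (an illustrative class, not Spack's real `Spec`): the package hash must be cached while the spec is still abstract, hence the `force` flag, while the DAG hash is cached normally once the spec is concrete.

```python
class ToySpec:
    """Illustrative stand-in for Spec's hash-caching rules."""

    def __init__(self):
        self.concrete = False
        self._hashes = {}

    def cached_hash(self, name, compute, force=False):
        if name in self._hashes:
            return self._hashes[name]
        value = compute()
        if force or self.concrete:  # same rule as _cached_hash() above
            self._hashes[name] = value
        return value

    def finalize_concretization(self):
        # 1. package hash: spec is still abstract, so caching must be forced
        self.cached_hash("package_hash", lambda: "pkg-xyz", force=True)
        # 2. now mark the spec concrete
        self.concrete = True
        # 3. dag hash: spec is concrete, cached without forcing
        self.cached_hash("dag_hash", lambda: "dag-abc")

s = ToySpec()
s.finalize_concretization()
print(s._hashes)  # {'package_hash': 'pkg-xyz', 'dag_hash': 'dag-abc'}
```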
def concretized(self, tests=False):
"""This is a non-destructive version of concretize().
@@ -3656,19 +3742,12 @@ def patches(self):
TODO: this only checks in the package; it doesn't resurrect old
patches from install directories, but it probably should.
"""
if not self.concrete:
raise spack.error.SpecError("Spec is not concrete: " + str(self))
if not hasattr(self, "_patches"):
self._patches = []
if 'patches' in self.variants:
# FIXME: _patches_in_order_of_appearance is attached after
# FIXME: concretization to store the order of patches somewhere.
# FIXME: Needs to be refactored in a cleaner way.
# translate patch sha256sums to patch objects by consulting the index
self._patches = []
for sha256 in self.variants['patches']._patches_in_order_of_appearance:
# translate patch sha256sums to patch objects by consulting the index
if self._patches_assigned():
for sha256 in self.variants["patches"]._patches_in_order_of_appearance:
index = spack.repo.path.patch_index
patch = index.patch_for_package(sha256, self.package)
self._patches.append(patch)
@@ -5307,6 +5386,16 @@ def __init__(self, parse_error):
self.string = parse_error.string
self.pos = parse_error.pos
@property
def long_message(self):
return "\n".join(
[
" Encountered when parsing spec:",
" %s" % self.string,
" %s^" % (" " * self.pos),
]
)
class DuplicateDependencyError(spack.error.SpecError):
"""Raised when the same dependency occurs in a spec twice."""


@@ -78,13 +78,13 @@ def config_directory(tmpdir_factory):
tmpdir = tmpdir_factory.mktemp('test_configs')
# restore some sane defaults for packages and config
config_path = py.path.local(spack.paths.etc_path)
modules_yaml = config_path.join('spack', 'defaults', 'modules.yaml')
os_modules_yaml = config_path.join('spack', 'defaults', '%s' %
modules_yaml = config_path.join('defaults', 'modules.yaml')
os_modules_yaml = config_path.join('defaults', '%s' %
platform.system().lower(),
'modules.yaml')
packages_yaml = config_path.join('spack', 'defaults', 'packages.yaml')
config_yaml = config_path.join('spack', 'defaults', 'config.yaml')
repos_yaml = config_path.join('spack', 'defaults', 'repos.yaml')
packages_yaml = config_path.join('defaults', 'packages.yaml')
config_yaml = config_path.join('defaults', 'config.yaml')
repos_yaml = config_path.join('defaults', 'repos.yaml')
tmpdir.ensure('site', dir=True)
tmpdir.ensure('user', dir=True)
tmpdir.ensure('site/%s' % platform.system().lower(), dir=True)


@@ -12,6 +12,9 @@
import pytest
import spack.build_environment
import spack.config
import spack.spec
from spack.paths import build_env_path
from spack.util.environment import set_env, system_dirs
from spack.util.executable import Executable, ProcessError
@@ -129,7 +132,9 @@ def wrapper_environment():
SPACK_TARGET_ARGS="-march=znver2 -mtune=znver2",
SPACK_LINKER_ARG='-Wl,',
SPACK_DTAGS_TO_ADD='--disable-new-dtags',
SPACK_DTAGS_TO_STRIP='--enable-new-dtags'):
SPACK_DTAGS_TO_STRIP='--enable-new-dtags',
SPACK_COMPILER_FLAGS_KEEP='',
SPACK_COMPILER_FLAGS_REMOVE='-Werror*',):
yield
@@ -157,6 +162,21 @@ def check_args(cc, args, expected):
assert expected == cc_modified_args
def check_args_contents(cc, args, must_contain, must_not_contain):
"""Check output arguments that cc produces when called with args.
This assumes that cc will print debug command output with one element
per line, so that we see whether arguments that should (or shouldn't)
contain spaces are parsed correctly.
"""
with set_env(SPACK_TEST_COMMAND='dump-args'):
cc_modified_args = cc(*args, output=str).strip().split('\n')
for a in must_contain:
assert a in cc_modified_args
for a in must_not_contain:
assert a not in cc_modified_args
def check_env_var(executable, var, expected):
"""Check environment variables updated by the passed compiler wrapper
@@ -642,6 +662,63 @@ def test_no_ccache_prepend_for_fc(wrapper_environment):
common_compile_args)
def test_keep_and_remove(wrapper_environment):
werror_specific = ['-Werror=meh']
werror = ['-Werror']
werror_all = werror_specific + werror
with set_env(
SPACK_COMPILER_FLAGS_KEEP='',
SPACK_COMPILER_FLAGS_REMOVE='-Werror*',
):
check_args_contents(cc, test_args + werror_all, ['-Wl,--end-group'], werror_all)
with set_env(
SPACK_COMPILER_FLAGS_KEEP='-Werror=*',
SPACK_COMPILER_FLAGS_REMOVE='-Werror*',
):
check_args_contents(cc, test_args + werror_all, werror_specific, werror)
with set_env(
SPACK_COMPILER_FLAGS_KEEP='-Werror=*',
SPACK_COMPILER_FLAGS_REMOVE='-Werror*|-llib1|-Wl*',
):
check_args_contents(
cc,
test_args + werror_all,
werror_specific,
werror + ["-llib1", "-Wl,--rpath"]
)
@pytest.mark.parametrize('cfg_override,initial,expected,must_be_gone', [
# Set and unset variables
('config:flags:keep_werror:all',
['-Werror', '-Werror=specific', '-bah'],
['-Werror', '-Werror=specific', '-bah'],
[],
),
('config:flags:keep_werror:specific',
['-Werror', '-Werror=specific', '-bah'],
['-Werror=specific', '-bah'],
['-Werror'],
),
('config:flags:keep_werror:none',
['-Werror', '-Werror=specific', '-bah'],
['-bah'],
['-Werror', '-Werror=specific'],
),
])
@pytest.mark.usefixtures('wrapper_environment', 'mutable_config')
def test_flag_modification(cfg_override, initial, expected, must_be_gone):
spack.config.add(cfg_override)
env = spack.build_environment.clean_environment()
env.apply_modifications()
check_args_contents(
cc,
test_args + initial,
expected,
must_be_gone
)
@pytest.mark.regression('9160')
def test_disable_new_dtags(wrapper_environment, wrapper_flags):
with set_env(SPACK_TEST_COMMAND='dump-args'):


@@ -120,11 +120,19 @@ def test_concretizer_arguments(mutable_config, mock_packages):
spec = spack.main.SpackCommand("spec")
assert spack.config.get("concretizer:reuse", None) is None
assert spack.config.get("concretizer:minimal", None) is None
spec("--reuse", "zlib")
assert spack.config.get("concretizer:reuse", None) is True
assert spack.config.get("concretizer:minimal", None) is None
spec("--fresh", "zlib")
assert spack.config.get("concretizer:reuse", None) is False
assert spack.config.get("concretizer:minimal", None) is None
spec("--minimal", "zlib")
assert spack.config.get("concretizer:reuse", None) is False
assert spack.config.get("concretizer:minimal", None) is True


@@ -486,7 +486,7 @@ def test_config_remove_from_env(mutable_empty_config, mutable_mock_env_path):
config('rm', 'config:dirty')
output = config('get')
expected = ev.default_manifest_yaml
expected = ev.default_manifest_yaml()
expected += """ config: {}
"""


@@ -2251,7 +2251,7 @@ def test_env_write_only_non_default():
with open(e.manifest_path, 'r') as f:
yaml = f.read()
assert yaml == ev.default_manifest_yaml
assert yaml == ev.default_manifest_yaml()
@pytest.mark.regression('20526')
@@ -2358,6 +2358,26 @@ def test_old_format_cant_be_updated_implicitly(packages_yaml_v015):
add('hdf5')
@pytest.mark.parametrize('concretization,unify', [
('together', 'true'),
('separately', 'false')
])
def test_update_concretization_to_concretizer_unify(concretization, unify, tmpdir):
spack_yaml = """\
spack:
concretization: {}
""".format(concretization)
tmpdir.join('spack.yaml').write(spack_yaml)
# Update the environment
env('update', '-y', str(tmpdir))
with open(str(tmpdir.join('spack.yaml'))) as f:
assert f.read() == """\
spack:
concretizer:
unify: {}
""".format(unify)
@pytest.mark.regression('18147')
def test_can_update_attributes_with_override(tmpdir):
spack_yaml = """
@@ -2391,7 +2411,8 @@ def test_newline_in_commented_sequence_is_not_an_issue(tmpdir):
modules:
- libelf/3.18.1
concretization: together
concretizer:
unify: false
"""
abspath = tmpdir.join('spack.yaml')
abspath.write(spack_yaml)


@@ -32,7 +32,7 @@ def _platform_executables(monkeypatch):
def _win_exe_ext():
return '.bat'
monkeypatch.setattr(spack.package, 'win_exe_ext', _win_exe_ext)
monkeypatch.setattr(spack.util.path, 'win_exe_ext', _win_exe_ext)
def define_plat_exe(exe):


@@ -532,7 +532,7 @@ def test_cdash_report_concretization_error(tmpdir, mock_fetch, install_mockery,
# new or the old concretizer
expected_messages = (
'Conflicts in concretized spec',
'A conflict was triggered',
'conflicts with',
)
assert any(x in content for x in expected_messages)


@@ -124,6 +124,17 @@ def test_spec_returncode():
assert spec.returncode == 1
def test_spec_parse_error():
with pytest.raises(spack.spec.SpecParseError) as e:
spec("1.15:")
# make sure the error is formatted properly
error_msg = """\
1.15:
^"""
assert error_msg in e.value.long_message
def test_env_aware_spec(mutable_mock_env_path):
env = ev.create('test')
env.add('mpileaks')


@@ -638,14 +638,11 @@ def test_conflicts_show_cores(self, conflict_spec, monkeypatch):
if spack.config.get('config:concretizer') == 'original':
pytest.skip('Testing debug statements specific to new concretizer')
monkeypatch.setattr(spack.solver.asp, 'full_cores', True)
monkeypatch.setattr(spack.solver.asp, 'minimize_cores', False)
s = Spec(conflict_spec)
with pytest.raises(spack.error.SpackError) as e:
s.concretize()
assert "conflict_trigger(" in e.value.message
assert "conflict" in e.value.message
def test_conflict_in_all_directives_true(self):
s = Spec('when-directives-true')


@@ -18,6 +18,7 @@
import spack.environment as ev
import spack.main
import spack.paths
import spack.repo
import spack.schema.compilers
import spack.schema.config
import spack.schema.env
@@ -1157,6 +1158,19 @@ def test_bad_path_double_override(config):
pass
def test_license_dir_config(mutable_config, mock_packages):
"""Ensure license directory is customizable"""
assert spack.config.get("config:license_dir") == spack.paths.default_license_dir
assert spack.package.Package.global_license_dir == spack.paths.default_license_dir
assert spack.repo.get("a").global_license_dir == spack.paths.default_license_dir
rel_path = os.path.join(os.path.sep, "foo", "bar", "baz")
spack.config.set("config:license_dir", rel_path)
assert spack.config.get("config:license_dir") == rel_path
assert spack.package.Package.global_license_dir == rel_path
assert spack.repo.get("a").global_license_dir == rel_path
@pytest.mark.regression('22547')
def test_single_file_scope_cache_clearing(env_yaml):
scope = spack.config.SingleFileScope(


@@ -576,7 +576,7 @@ def default_config():
This ensures we can test the real default configuration without having
tests fail when the user overrides the defaults that we test against."""
defaults_path = os.path.join(spack.paths.etc_path, 'spack', 'defaults')
defaults_path = os.path.join(spack.paths.etc_path, 'defaults')
if is_windows:
defaults_path = os.path.join(defaults_path, "windows")
with spack.config.use_configuration(defaults_path) as defaults_config:


@@ -104,7 +104,7 @@ def test_ascii_graph_mpileaks(config, mock_packages, monkeypatch):
/
o dyninst
|\
o | libdwarf
| o libdwarf
|/
o libelf
'''


@@ -820,7 +820,7 @@ def test_setup_install_dir_grp(install_mockery, monkeypatch, capfd):
def _get_group(spec):
return mock_group
def _chgrp(path, group):
def _chgrp(path, group, follow_symlinks=True):
tty.msg(mock_chgrp_msg.format(path, group))
monkeypatch.setattr(prefs, 'get_package_group', _get_group)


@@ -2,8 +2,8 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import sys
import pytest
@@ -59,6 +59,9 @@ def test_repo_anonymous_pkg(mutable_mock_repo):
@pytest.mark.maybeslow
@pytest.mark.skipif(
sys.version_info < (3, 5), reason="Test started failing spuriously on Python 2.7"
)
def test_repo_last_mtime():
latest_mtime = max(os.path.getmtime(p.module.__file__)
for p in spack.repo.path.all_packages())


@@ -1273,3 +1273,23 @@ def test_spec_installed(install_mockery, database):
# 'a' is not in the mock DB and is not installed
spec = Spec("a").concretized()
assert not spec.installed
@pytest.mark.regression('30678')
def test_call_dag_hash_on_old_dag_hash_spec(mock_packages, config):
# create a concrete spec
a = Spec("a").concretized()
dag_hashes = {
spec.name: spec.dag_hash() for spec in a.traverse()
}
# make it look like an old DAG hash spec with no package hash on the spec.
for spec in a.traverse():
assert spec.concrete
spec._package_hash = None
for spec in a.traverse():
assert dag_hashes[spec.name] == spec.dag_hash()
with pytest.raises(ValueError, match='Cannot call package_hash()'):
spec.package_hash()


@@ -125,7 +125,6 @@ def check_lex(self, tokens, spec):
def _check_raises(self, exc_type, items):
for item in items:
with pytest.raises(exc_type):
print("CHECKING: ", item, "=======================")
Spec(item)
# ========================================================================


@@ -178,7 +178,7 @@ def streamify(arg, mode):
istream, close_istream = streamify(input, 'r')
if not ignore_quotes:
quoted_args = [arg for arg in args if re.search(r'^"|^\'|"$|\'$', arg)]
quoted_args = [arg for arg in args if re.search(r'^".*"$|^\'.*\'$', arg)]
if quoted_args:
tty.warn(
"Quotes in command arguments can confuse scripts like"


@@ -44,7 +44,7 @@ def set_permissions(path, perms, group=None):
fs.chmod_x(path, perms)
if group:
fs.chgrp(path, group)
fs.chgrp(path, group, follow_symlinks=False)
class InvalidPermissionsError(SpackError):
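For reference, a minimal sketch of what a symlink-aware `chgrp` can look like on Unix (an assumed implementation; Spack's real `fs.chgrp` also handles group-name lookup and platform differences):

```python
import grp
import os

def chgrp(path, group, follow_symlinks=True):
    """Change the group; with follow_symlinks=False, update the link itself."""
    gid = grp.getgrnam(group).gr_gid if isinstance(group, str) else group
    os.chown(path, -1, gid, follow_symlinks=follow_symlinks)  # -1 keeps owner
```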


@@ -1,9 +1,9 @@
spack:
view: false
concretization: separately
concretizer:
reuse: false
unify: false
config:
install_tree:


@@ -1,9 +1,9 @@
spack:
view: false
concretization: separately
concretizer:
reuse: false
unify: false
config:
install_tree:
@@ -15,33 +15,32 @@ spack:
packages:
cmake:
variants: ~ownlibs
curl:
variants: tls=mbedtls
mesa:
variants: +glx ~osmesa +opengl ~opengles +llvm
paraview:
variants: +qt
visit:
variants: +gui
all:
target: [x86_64]
# The spec will be gradually expanded to enable all the SDK components.
# Currently disabled: ascent, catalyst, visit
# Currently disabled: sensei
specs:
- matrix:
- - ecp-data-vis-sdk
+adios2
+ascent
+cinema
+darshan
+faodel
+hdf5
+paraview
+pnetcdf
+sz
+unifyfs
+veloc
+vtkm
+zfp
- ecp-data-vis-sdk ~cuda ~rocm ~sensei
+adios2
+ascent
+cinema
+darshan
+faodel
+hdf5
+paraview
+pnetcdf
+sz
+unifyfs
+veloc
+vtkm
+zfp
+visit
mirrors: { "mirror": "s3://spack-binaries/data-vis-sdk" }
@@ -70,6 +69,7 @@ spack:
- mesa
- openblas
- paraview
- visit
- vtk-m
runner-attributes:
tags: [ "spack", "public", "large", "x86_64" ]


@@ -1,9 +1,9 @@
spack:
view: false
concretization: separately
concretizer:
reuse: false
unify: false
config:
concretizer: clingo


@@ -1,9 +1,9 @@
spack:
view: false
concretization: separately
concretizer:
reuse: false
unify: false
config:
concretizer: clingo


@@ -1,9 +1,9 @@
spack:
view: false
concretization: separately
concretizer:
reuse: false
unify: false
config:
concretizer: clingo


@@ -1,9 +1,9 @@
spack:
concretization: separately
view: false
concretizer:
reuse: false
unify: false
config:
concretizer: clingo


@@ -1,9 +1,9 @@
spack:
view: false
concretization: separately
concretizer:
reuse: false
unify: false
config:
install_tree:


@@ -335,7 +335,7 @@ _spacktivate() {
_spack() {
if $list_options
then
SPACK_COMPREPLY="-h --help -H --all-help --color -c --config -C --config-scope -d --debug --show-cores --timestamp --pdb -e --env -D --env-dir -E --no-env --use-env-repo -k --insecure -l --enable-locks -L --disable-locks -m --mock -b --bootstrap -p --profile --sorted-profile --lines -v --verbose --stacktrace -V --version --print-shell-vars"
SPACK_COMPREPLY="-h --help -H --all-help --color -c --config -C --config-scope -d --debug --timestamp --pdb -e --env -D --env-dir -E --no-env --use-env-repo -k --insecure -l --enable-locks -L --disable-locks -m --mock -b --bootstrap -p --profile --sorted-profile --lines -v --verbose --stacktrace -V --version --print-shell-vars"
else
SPACK_COMPREPLY="activate add analyze arch audit blame bootstrap build-env buildcache cd checksum ci clean clone commands compiler compilers concretize config containerize create deactivate debug dependencies dependents deprecate dev-build develop diff docs edit env extensions external fetch find gc gpg graph help info install license list load location log-parse maintainers make-installer mark mirror module monitor patch pkg providers pydoc python reindex remove rm repo resource restage solve spec stage style tags test test-env tutorial undevelop uninstall unit-test unload url verify versions view"
fi


@@ -10,10 +10,11 @@ class AbseilCpp(CMakePackage):
"""Abseil Common Libraries (C++) """
homepage = "https://abseil.io/"
url = "https://github.com/abseil/abseil-cpp/archive/20210324.2.tar.gz"
url = "https://github.com/abseil/abseil-cpp/archive/refs/tags/20211102.0.tar.gz"
maintainers = ['jcftang']
version('20211102.0', sha256='dcf71b9cba8dc0ca9940c4b316a0c796be8fab42b070bb6b7cab62b48f0e66c4')
version('20210324.2', sha256='59b862f50e710277f8ede96f083a5bb8d7c9595376146838b9580be90374ee1f')
version('20210324.1', sha256='441db7c09a0565376ecacf0085b2d4c2bbedde6115d7773551bc116212c2a8d6')
version('20210324.0', sha256='dd7db6815204c2a62a2160e32c55e97113b0a0178b2f090d6bab5ce36111db4b')


@@ -0,0 +1,37 @@
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack import *
from spack.pkg.builtin.singularityce import SingularityBase
# Apptainer is the new name of Singularity; we piggy-back on the original package
class Apptainer(SingularityBase):
'''Apptainer is an open source container platform designed to be simple, fast, and
secure. Many container platforms are available, but Apptainer is designed for
ease-of-use on shared systems and in high performance computing (HPC)
environments.
Needs post-install chmod/chown steps to enable full functionality.
See package definition or `spack-build-out.txt` build log for details,
e.g.::
tail -15 $(spack location -i apptainer)/.spack/spack-build-out.txt
'''
homepage = "https://apptainer.org"
url = "https://github.com/apptainer/apptainer/releases/download/v1.0.2/apptainer-1.0.2.tar.gz"
git = "https://github.com/apptainer/apptainer.git"
version('main', branch='main')
version('1.0.2', sha256='2d7a9d0a76d5574459d249c3415e21423980d9154ce85e8c34b0600782a7dfd3')
singularity_org = 'apptainer'
singularity_name = 'apptainer'
singularity_security_urls = (
"https://apptainer.org/docs/admin/main/security.html",
"https://apptainer.org/docs/admin/main/admin_quickstart.html#apptainer-security",
)


@@ -0,0 +1,11 @@
#!/bin/sh -eu
{% for cf in chown_files %}
chown root {{ prefix }}/{{ cf }}
{% endfor %}
{% for sf in setuid_files %}
chmod 4555 {{ prefix }}/{{ sf }}
{% endfor %}
# end
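The post-install script above is a plain Jinja template; a sketch of how it renders (the file lists below are made-up placeholders, not Apptainer's real paths):

```python
import jinja2

script = jinja2.Template("""\
#!/bin/sh -eu
{% for cf in chown_files %}
chown root {{ prefix }}/{{ cf }}
{% endfor %}
{% for sf in setuid_files %}
chmod 4555 {{ prefix }}/{{ sf }}
{% endfor %}
# end
""")
print(script.render(
    prefix="/opt/apptainer",
    chown_files=["etc/example.conf"],       # placeholder
    setuid_files=["libexec/example-suid"],  # placeholder
))
```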


@@ -22,22 +22,43 @@ class ArmForge(Package):
# versions (and checksums) based on the target platform shows up
if platform.machine() == "aarch64":
version("22.0.1", sha256="89237d85cdecf6481c1aa72f3a7c60145ff2f1efcf588aa382b98ee8046d1bd5")
version("22.0", sha256="30328b3f92d3284c632c196690a36bce8f3c83b17f2c810deb31797a9ac69432")
version("21.1.3", sha256="4a4ff7372aad5a31fc9e18b7b6c493691ab37d8d44a3158584e62d1ab82b0eeb")
version("21.1.2", sha256="9d62bb0b9411663693a4431c993a5a81ce84fc7d58c08d32c77e87fac7c44eb0")
version("21.1.1", sha256="9215180dc42565c2ac45db8851d3ee5fcfe11f06ccbf655c2e800f32a3479f5d")
version("21.1", sha256="d6f6444eb2d47fd884d8b125f890d6a02a9d5bcfc10950af46b11d3b1e1189fd")
version("21.0.3", sha256="371f4e3087af329bee155ceb50b9adaf006d3b8602fb1b6bbdc710ab0f74368d")
version("21.0.2", sha256="ca547d11086ddd2704468166ad01f34132fcfa8d416239ad85c87a6c5f042298")
version("21.0.1", sha256="bb76207b47079db843f189f604cffd00cffa49963c3e515092b67392047b92d2")
version("21.0", sha256="2bcc745d0049d6b25c77c97b2d7bad7b4f804180972a2306a8599ce41f6a4573")
elif platform.machine() == "ppc64le":
version("22.0.1", sha256="7499462f2f24a556b504c10e56cce61d8e29630ac99aa7a9a59f60e7ce474877")
version("22.0", sha256="dbe1248ba683b2b1374c888805c608252daade47f904fb91a4a4803bf11ea5fa")
version("21.1.3", sha256="eecbc5686d60994c5468b2d7cd37bebe5d9ac0ba37bd1f98fbfc69b071db541e")
version("21.1.2", sha256="3c1006e3a3ee0c3a1e73f984da19d9953fa48f31c89f8774954c885a070a76d8")
version("21.1.1", sha256="305ee145040db30b7c7af6a05364030fe753c40cbf5cf3b4ea04a8812c2c3b49")
version("21.1", sha256="24e6fb120fcecf854a069ce6c993d430e892a18f415603009768e43317980491")
version("21.0.3", sha256="a1dad6efdfa6bf95e5ec02f771217bf98d7d010c6bfdd5caf0796f0e75ef0fab")
version("21.0.2", sha256="302cadf6c6ddd6f41fafb0d490a92ae0919a7b24d6c212228311253cec2ff1b7")
version("21.0.1", sha256="fa9c1fbb115d34533f4dc449cb49c7eca0472205973ed1e9ab5ccd916c85a6f9")
version("21.0", sha256="60cfa7dd1cd131ec85e67cb660f2f84cf30bb700d8979cae1f5f88af658fd249")
elif platform.machine() == "x86_64":
version("22.0.1", sha256="8f8a61c159665d3de3bc5334ed97bdb4966bfbdb91b65d32d162d489eb2219ac")
version("22.0", sha256="4e63758bd474e9640700673625eb2adfe2bdf875eaacfe67862d184ae08f542f")
version("21.1.3", sha256="03dc82f1d075deb6f08d1e3e6592dc9b630d406c08a1316d89c436b5874f3407")
version("21.1.2", sha256="ebc99fa3461d2cd968e4d304c11b70cc8d9c5a2acd68681cec2067c128255cd5")
version("21.1.1", sha256="b1f4a6cffca069b10bcee66ace38bd1a515f48fbae3c44ff6faef6825a633df5")
version("21.1", sha256="933dce5980ab0f977a79d24eecf4464bd7c5ff22fa74fb2758f68d1ccb7723d2")
version("21.0.3", sha256="24708b363a8d3a82879ab4f60fe08faeb936e1e8286b324928a86a2d707071e8")
version("21.0.2", sha256="741ff2a995c8cf7ce5d346a3f7d2a552ec602b995e477e9a5a3a6319d3907980")
version("21.0.1", sha256="849c1443af3315b0b4dc2d8b337200cd92351cb11448bd3364428d8b6325ae5a")
version("21.0", sha256="71b713a05d431a3c26bd83cc4d0b65a0afd7d7f5bf57aa11edfb41da90f01774")
variant('probe', default=False, description='Detect available PMU counters via "forge-probe" during install')
# forge-probe executes with "/usr/bin/env python"
depends_on('python@2.7:2.9.9', type='build', when='+probe')
depends_on('python@2.7:', type='build', when='+probe')
# Licensing
license_required = True
@@ -52,7 +73,7 @@ class ArmForge(Package):
license_url = "https://developer.arm.com/tools-and-software/server-and-hpc/help/help-and-tutorials/system-administration/licensing/arm-licence-server"
def url_for_version(self, version):
return "http://content.allinea.com/downloads/arm-forge-%s-linux-%s.tar" % (version, platform.machine())
return "https://content.allinea.com/downloads/arm-forge-%s-linux-%s.tar" % (version, platform.machine())
def install(self, spec, prefix):
subprocess.call(["./textinstall.sh", "--accept-licence", prefix])


@@ -69,6 +69,8 @@ class Ascent(CMakePackage, CudaPackage):
variant('test', default=True, description='Enable Ascent unit tests')
variant("mpi", default=True, description="Build Ascent MPI Support")
# set to false for systems that implicitly link mpi
variant('blt_find_mpi', default=True, description='Use BLT CMake Find MPI logic')
variant("serial", default=True, description="build serial (non-mpi) libraries")
# variants for language support
@@ -486,6 +488,10 @@ def hostconfig(self):
cfg.write(cmake_cache_entry("MPIEXEC",
mpiexe_bin))
if "+blt_find_mpi" in spec:
cfg.write(cmake_cache_entry("ENABLE_FIND_MPI", "ON"))
else:
cfg.write(cmake_cache_entry("ENABLE_FIND_MPI", "OFF"))
###################################
# BABELFLOW (also depends on mpi)
###################################


@@ -24,9 +24,9 @@ class Atmi(CMakePackage):
version('5.0.0', sha256='208c1773170722b60b74357e264e698df5871e9d9d490d64011e6ea76750d9cf')
version('4.5.2', sha256='c235cfb8bdd89deafecf9123264217b8cc5577a5469e3e1f24587fa820d0792e')
version('4.5.0', sha256='64eeb0244cedae99db7dfdb365e0ad624106cc1090a531f94885ae81e254aabf')
version('4.3.1', sha256='4497fa6d33547b946e2a51619f2777ec36e9cff1b07fd534eb8a5ef0d8e30650')
version('4.3.0', sha256='1cbe0e9258ce7cce7b7ccc288335dffbac821ceb745c4f3fd48e2a258abada89')
version('4.2.0', sha256='c1c89c00d2dc3e764c63b2e51ff7fd5c06d5881ed56aed0adf639582d3389585')
version('4.3.1', sha256='4497fa6d33547b946e2a51619f2777ec36e9cff1b07fd534eb8a5ef0d8e30650', deprecated=True)
version('4.3.0', sha256='1cbe0e9258ce7cce7b7ccc288335dffbac821ceb745c4f3fd48e2a258abada89', deprecated=True)
version('4.2.0', sha256='c1c89c00d2dc3e764c63b2e51ff7fd5c06d5881ed56aed0adf639582d3389585', deprecated=True)
version('4.1.0', sha256='b31849f86c79f90466a9d67f0a28a93c1675181e38e2a5f571ffc963e4b06f5f', deprecated=True)
version('4.0.0', sha256='8a2e5789ee7165aff0f0669eecd23ac0a5c8a5bfbc1acd9380fe9a8ed5bffe3a', deprecated=True)
version('3.10.0', sha256='387e87c622ec334d3ba7a2f4f015ea9a219712722f4c56c1ef572203d0d072ea', deprecated=True)


@@ -0,0 +1,30 @@
From c92672b9c59f9011e5d68220e99cfdd70c1b98df Mon Sep 17 00:00:00 2001
From: Reuben Thomas <rrt@sc3d.org>
Date: Mon, 25 Apr 2022 12:37:10 +0100
Subject: [PATCH] m4/ax_cc_maxopt.m4: add missing ;; to end of case
---
m4/ax_cc_maxopt.m4 | 3 ++-
1 file changed, 2 insertions(+), 1 deletion(-)
diff --git a/m4/ax_cc_maxopt.m4 b/m4/ax_cc_maxopt.m4
index 05218e36..7ad53438 100644
--- a/m4/ax_cc_maxopt.m4
+++ b/m4/ax_cc_maxopt.m4
@@ -55,7 +55,7 @@
# modified version of the Autoconf Macro, you may extend this special
# exception to the GPL to apply to your modified version as well.
-#serial 22
+#serial 23
AC_DEFUN([AX_CC_MAXOPT],
[
@@ -146,6 +146,7 @@ if test "x$ac_test_CFLAGS" = "x"; then
nvhpc)
# default optimization flags for nvhpc
CFLAGS="$CFLAGS -O3"
+ ;;
gnu)
# default optimization flags for gcc on all systems


@@ -13,3 +13,10 @@ class AutoconfArchive(AutotoolsPackage, GNUMirrorPackage):
version('2022.02.11', sha256='78a61b611e2eeb55a89e0398e0ce387bcaf57fe2dd53c6fe427130f777ad1e8c')
version('2019.01.06', sha256='17195c833098da79de5778ee90948f4c5d90ed1a0cf8391b4ab348e2ec511e3f')
# https://github.com/autoconf-archive/autoconf-archive/pull/251
patch('2022.02.11-ax_cc_maxopt-nvhpc.patch', when='@2022.02.11')
def setup_dependent_build_environment(self, env, dependent_spec):
"""Adds the ACLOCAL path for autotools."""
env.append_path('ACLOCAL_PATH', self.prefix.share.aclocal)


@@ -13,13 +13,14 @@ class AwsParallelcluster(PythonPackage):
tool to deploy and manage HPC clusters in the AWS cloud."""
homepage = "https://github.com/aws/aws-parallelcluster"
pypi = "aws-parallelcluster/aws-parallelcluster-2.11.6.tar.gz"
pypi = "aws-parallelcluster/aws-parallelcluster-2.11.7.tar.gz"
maintainers = [
'charlesg3', 'chenwany', 'demartinofra', 'enrico-usai', 'francesco-giordano',
'gmarciani', 'hanwen-pcluste', 'lukeseawalker',
]
version('2.11.7', sha256='f7c51cf1c94787f56e0661e39860ecc9275efeacc88716b7c9f14053ec7fbd35')
version('2.11.6', sha256='4df4bcf966f523bcdf5b4f68ed0ef347eebae70a074cd098b15bc8a6be27217c')
version('2.11.5', sha256='7499f88387cbe2cb73f9fddeee3363117f7ef1524d6a73e77bb07900040baebb')
version('2.11.4', sha256='449537ccda57f91f4ec6ae0c94a8e2b1a789f08f80245fadb28f44a4351d5da4')


@@ -16,8 +16,6 @@ class Bazel(Package):
homepage = "https://bazel.build/"
url = "https://github.com/bazelbuild/bazel/releases/download/3.1.0/bazel-3.1.0-dist.zip"
maintainers = ['adamjstewart']
tags = ['build-tools']
version('4.0.0', sha256='d350f80e70654932db252db380d2ec0144a00e86f8d9f2b4c799ffdb48e9cdd1')


@@ -161,7 +161,7 @@ def libs(self):
values=('fcontext', 'ucontext', 'winfib'),
multi=False,
description='Use the specified backend for boost-context',
when='+context')
when='@1.65.0: +context')
variant('cxxstd',
default='98',


@@ -0,0 +1,21 @@
diff --git a/src/Bpp/Graph/GlobalGraph.cpp b/src/Bpp/Graph/GlobalGraph.cpp
index ca2d18f..d681314 100644
--- a/src/Bpp/Graph/GlobalGraph.cpp
+++ b/src/Bpp/Graph/GlobalGraph.cpp
@@ -42,6 +42,7 @@
#include <sstream>
#include <string>
#include <vector>
+#include <limits>
#include "../Exceptions.h"
#include "../Text/TextTools.h"
@@ -751,7 +752,7 @@ void GlobalGraph::orientate()
// if none, look for node wih minimum number of fathers
if (it == nextNodes.end())
{
- size_t nbF = numeric_limits<size_t>::infinity();
+ size_t nbF = std::numeric_limits<size_t>::infinity();
it = nextNodes.begin();
for ( ; it != nextNodes.end(); it++)


@@ -10,10 +10,12 @@ class BppCore(CMakePackage):
"""Bio++ core library."""
homepage = "http://biopp.univ-montp2.fr/wiki/index.php/Installation"
url = "http://biopp.univ-montp2.fr/repos/sources/bpp-core-2.2.0.tar.gz"
url = "https://github.com/BioPP/bpp-core/archive/refs/tags/v2.4.1.tar.gz"
maintainers = ['snehring']
version('2.4.1', sha256='1150b8ced22cff23dd4770d7c23fad11239070b44007740e77407f0d746c0af6')
version('2.2.0', sha256='aacd4afddd1584ab6bfa1ff6931259408f1d39958a0bdc5f78bf1f9ee4e98b79')
version('2.2.0', sha256='aacd4afddd1584ab6bfa1ff6931259408f1d39958a0bdc5f78bf1f9ee4e98b79', deprecated=True)
depends_on('cmake@2.6:', type='build')
@@ -21,5 +23,8 @@ class BppCore(CMakePackage):
# resolve the ambiguity of the 'isnan' function.
patch('clarify_isnan.patch', when='%fj')
# This is fixed in master, next release should be fine
patch('global-graph-limits.patch', when='@2.4.1')
def cmake_args(self):
return ['-DBUILD_TESTING=FALSE']


@@ -10,9 +10,12 @@ class BppPhyl(CMakePackage):
"""Bio++ phylogeny library."""
homepage = "http://biopp.univ-montp2.fr/wiki/index.php/Installation"
url = "http://biopp.univ-montp2.fr/repos/sources/bpp-phyl-2.2.0.tar.gz"
url = "https://github.com/BioPP/bpp-phyl/archive/refs/tags/v2.4.1.tar.gz"
version('2.2.0', sha256='f346d87bbc7858924f3c99d7d74eb4a1f7a1b926746c68d8c28e07396c64237b')
maintainers = ['snehring']
version('2.4.1', sha256='e7bf7d4570f756b7773904ffa600ffcd77c965553ddb5cbc252092d1da962ff2')
version('2.2.0', sha256='f346d87bbc7858924f3c99d7d74eb4a1f7a1b926746c68d8c28e07396c64237b', deprecated=True)
depends_on('cmake@2.6:', type='build')
depends_on('bpp-core')


@@ -0,0 +1,19 @@
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack import *
class BppPopgen(CMakePackage):
"""The Bio++ Population Genetics Library"""
homepage = "https://https://github.com/BioPP/bpp-popgen"
url = "https://github.com/BioPP/bpp-popgen/archive/refs/tags/v2.4.1.tar.gz"
maintainers = ['snehring']
version('2.4.1', sha256='03b57d71a63c8fa7f11c085e531d0d691fc1d40d4ea541070dabde0ab3baf413')
depends_on('bpp-seq')


@@ -10,10 +10,12 @@ class BppSeq(CMakePackage):
"""Bio++ seq library."""
homepage = "http://biopp.univ-montp2.fr/wiki/index.php/Installation"
url = "http://biopp.univ-montp2.fr/repos/sources/bpp-seq-2.2.0.tar.gz"
url = "https://github.com/BioPP/bpp-seq/archive/refs/tags/v2.4.1.tar.gz"
maintainers = ['snehring']
version('2.4.1', sha256='dbfcb04803e4b7f08f9f159da8a947c91906c3ca8b20683ac193f6dc524d4655')
version('2.2.0', sha256='0927d7fb0301c1b99a7353d5876deadb4a3040776cc74e8fe1c366fe920e7b6b')
version('2.2.0', sha256='0927d7fb0301c1b99a7353d5876deadb4a3040776cc74e8fe1c366fe920e7b6b', deprecated=True)
depends_on('cmake@2.6:', type='build')
depends_on('bpp-core')


@@ -11,15 +11,19 @@ class BppSuite(CMakePackage):
sequence analysis."""
homepage = "http://biopp.univ-montp2.fr/wiki/index.php/BppSuite"
url = "http://biopp.univ-montp2.fr/repos/sources/bppsuite/bppsuite-2.2.0.tar.gz"
url = "https://github.com/BioPP/bppsuite/archive/refs/tags/v2.4.1.tar.gz"
version('2.2.0', sha256='761fa5eec794af221d971ae70fd8c43171ad71a6bb5f20549263a1797b43f138')
maintainers = ['snehring']
version('2.4.1', sha256='0485adcc17e37439069d27e4fac144e5ae38036ba21f31e6d21f070ce4ea5199')
version('2.2.0', sha256='761fa5eec794af221d971ae70fd8c43171ad71a6bb5f20549263a1797b43f138', deprecated=True)
depends_on('cmake@2.6:', type='build')
depends_on('texinfo', type='build')
depends_on('bpp-core')
depends_on('bpp-seq')
depends_on('bpp-phyl')
+    depends_on('bpp-popgen', when='@2.4.1:')
# Clarify isinf's namespace, because Fujitsu compiler can't
# resolve ambiguous of 'isinf' function.

diff --git a/var/spack/repos/builtin/packages/camp/package.py b/var/spack/repos/builtin/packages/camp/package.py

@@ -19,6 +19,7 @@ class Camp(CMakePackage, CudaPackage, ROCmPackage):
maintainers = ['trws']
version('main', branch='main', submodules='True')
+    version('2022.03.0', sha256='e9090d5ee191ea3a8e36b47a8fe78f3ac95d51804f1d986d931e85b8f8dad721')
version('0.3.0', sha256='129431a049ca5825443038ad5a37a86ba6d09b2618d5fe65d35f83136575afdb')
version('0.2.3', sha256='58a0f3bd5eadb588d7dc83f3d050aff8c8db639fc89e8d6553f9ce34fc2421a7')
version('0.2.2', sha256='194d38b57e50e3494482a7f94940b27f37a2bee8291f2574d64db342b981d819')

diff --git a/var/spack/repos/builtin/packages/catch2/package.py b/var/spack/repos/builtin/packages/catch2/package.py

@@ -19,9 +19,10 @@ class Catch2(CMakePackage):
version('develop', branch='devel')
# Releases
+    version('3.0.1', sha256='8c4173c68ae7da1b5b505194a0c2d6f1b2aef4ec1e3e7463bde451f26bbaf4e7')
version('3.0.0-preview4', sha256='2458d47d923b65ab611656cb7669d1810bcc4faa62e4c054a7405b1914cd4aee')
version('3.0.0-preview3', sha256='06a4f903858f21c553e988f8b76c9c6915d1f95f95512d6a58c421e02a2c4975')
-    version('2.13.8', sha256='b9b592bd743c09f13ee4bf35fc30eeee2748963184f6bea836b146e6cc2a585a', preferred=True)
+    version('2.13.8', sha256='b9b592bd743c09f13ee4bf35fc30eeee2748963184f6bea836b146e6cc2a585a')
version('2.13.7', sha256='3cdb4138a072e4c0290034fe22d9f0a80d3bcfb8d7a8a5c49ad75d3a5da24fae')
version('2.13.6', sha256='48dfbb77b9193653e4e72df9633d2e0383b9b625a47060759668480fdf24fbd4')
version('2.13.5', sha256='7fee7d643599d10680bfd482799709f14ed282a8b7db82f54ec75ec9af32fa76')

diff --git a/var/spack/repos/builtin/packages/chai/package.py b/var/spack/repos/builtin/packages/chai/package.py

@@ -19,8 +19,9 @@ class Chai(CachedCMakePackage, CudaPackage, ROCmPackage):
maintainers = ['davidbeckingsale']
-    version('develop', branch='develop', submodules=True)
-    version('main', branch='main', submodules=True)
+    version('develop', branch='develop', submodules=False)
+    version('main', branch='main', submodules=False)
+    version('2022.03.0', tag='v2022.03.0', submodules=False)
version('2.4.0', tag='v2.4.0', submodules=True)
version('2.3.0', tag='v2.3.0', submodules=True)
version('2.2.2', tag='v2.2.2', submodules=True)
@@ -45,12 +46,15 @@ class Chai(CachedCMakePackage, CudaPackage, ROCmPackage):
depends_on('cmake@3.8:', type='build')
depends_on('cmake@3.9:', type='build', when="+cuda")
+    depends_on('cmake@3.14:', when='@2022.03.0:')
+    depends_on('blt@0.5.0:', type='build', when='@2022.03.0:')
depends_on('blt@0.4.1:', type='build', when='@2.4.0:')
depends_on('blt@0.4.0:', type='build', when='@2.3.0')
depends_on('blt@0.3.6:', type='build', when='@:2.2.2')
depends_on('umpire')
+    depends_on('umpire@2022.03.0:', when='@2022.03.0:')
depends_on('umpire@6.0.0', when="@2.4.0")
depends_on('umpire@4.1.2', when="@2.2.0:2.3.0")
depends_on('umpire@main', when='@main')
@@ -73,6 +77,7 @@ class Chai(CachedCMakePackage, CudaPackage, ROCmPackage):
depends_on('raja@0.14.0', when="@2.4.0")
depends_on('raja@0.13.0', when="@2.3.0")
depends_on('raja@0.12.0', when="@2.2.0:2.2.2")
+    depends_on('raja@2022.03.0:', when='@2022.03.0:')
depends_on('raja@main', when='@main')
with when('+cuda'):
@@ -147,16 +152,21 @@ def initconfig_package_entries(self):
spec = self.spec
entries = []
option_prefix = "CHAI_" if spec.satisfies("@2022.03.0:") else ""
entries.append(cmake_cache_path("BLT_SOURCE_DIR", spec['blt'].prefix))
if '+raja' in spec:
entries.append(cmake_cache_option("ENABLE_RAJA_PLUGIN", True))
entries.append(cmake_cache_option(
"{}ENABLE_RAJA_PLUGIN".format(option_prefix), True))
entries.append(cmake_cache_path("RAJA_DIR", spec['raja'].prefix))
-        entries.append(cmake_cache_option('ENABLE_PICK', '+enable_pick' in spec))
+        entries.append(cmake_cache_option(
+            "{}ENABLE_PICK".format(option_prefix), '+enable_pick' in spec))
entries.append(cmake_cache_path(
"umpire_DIR", spec['umpire'].prefix.share.umpire.cmake))
entries.append(cmake_cache_option("ENABLE_TESTS", '+tests' in spec))
entries.append(cmake_cache_option("ENABLE_BENCHMARKS", '+benchmarks' in spec))
entries.append(cmake_cache_option("ENABLE_EXAMPLES", '+examples' in spec))
entries.append(cmake_cache_option(
"{}ENABLE_EXAMPLES".format(option_prefix), '+examples' in spec))
entries.append(cmake_cache_option("BUILD_SHARED_LIBS", '+shared' in spec))
return entries

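Note: the last CHAI hunk is a tidy pattern for absorbing an upstream rename of CMake options: derive the prefix once from the spec and interpolate it into every affected cache entry. A condensed sketch, assuming the `cmake_cache_option` helper available to `CachedCMakePackage` subclasses:

```python
def initconfig_package_entries(self):
    spec = self.spec
    entries = []

    # CHAI 2022.03.0 namespaced its options (ENABLE_X -> CHAI_ENABLE_X);
    # older releases keep the bare names, so compute the prefix once.
    option_prefix = "CHAI_" if spec.satisfies("@2022.03.0:") else ""

    entries.append(cmake_cache_option(
        "{}ENABLE_PICK".format(option_prefix), '+enable_pick' in spec))
    return entries
```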
diff --git a/var/spack/repos/builtin/packages/chameleon/package.py b/var/spack/repos/builtin/packages/chameleon/package.py

@@ -28,7 +28,7 @@ class Chameleon(CMakePackage, CudaPackage):
variant('simgrid', default=False, when='runtime=starpu', description='Enable simulation mode through StarPU+SimGrid')
# dependencies
depends_on("pkg-config", type='build')
depends_on("pkgconfig", type='build')
with when("runtime=starpu"):
depends_on("starpu")

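Note: the chameleon change swaps the concrete `pkg-config` package for Spack's `pkgconfig` virtual, so either provider can satisfy the build dependency:

```python
# Any provider of the virtual (pkg-config or pkgconf) now satisfies this:
depends_on("pkgconfig", type='build')

# Users can still pin a provider on the command line, e.g.:
#   spack install chameleon ^pkgconf
```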
diff --git a/var/spack/repos/builtin/packages/clblast/package.py b/var/spack/repos/builtin/packages/clblast/package.py

@@ -0,0 +1,51 @@
+# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
+# Spack Project Developers. See the top-level COPYRIGHT file for details.
+#
+# SPDX-License-Identifier: (Apache-2.0 OR MIT)
+# ----------------------------------------------------------------------------
+
+from spack import *
+
+
+class Clblast(CMakePackage):
+    """CLBlast is a modern, lightweight, performant and tunable OpenCL BLAS
+    library written in C++11. It is designed to leverage the full performance
+    potential of a wide variety of OpenCL devices from different vendors,
+    including desktop and laptop GPUs, embedded GPUs, and other accelerators.
+    CLBlast implements BLAS routines: basic linear algebra subprograms
+    operating on vectors and matrices."""
+
+    homepage = 'https://cnugteren.github.io/clblast/clblast.html'
+    git = 'https://github.com/CNugteren/CLBlast'
+    url = 'https://github.com/CNugteren/CLBlast/archive/refs/tags/1.5.2.zip'
+
+    maintainers = ['umar456']
+
+    version('master', branch='master')
+    version('1.5.2', sha256='0e3a017c3aa352e0bf94ea65cfc9609beb2c22204d31c2ef43d0478178cfee00')
+    version('1.5.1', sha256='a0f0cb7308b59a495c23beaef1674093ed26996f66d439623808755dbf568c3f')
+    version('1.5.0', sha256='1bf8584ee4370d5006a467a1206276ab23e32ba03fd1bd0f1c6da6a6c9f03bc9')
+    version('1.4.1', sha256='c22dab892641301b24bd90f5a6916a10b5a877c5f5f90c2dc348a58d39a278a7')
+    version('1.4.0', sha256='ae00393fab7f7a85a6058ffb336670d1a529213eea288af4d657a32f2834564a')
+    version('1.3.0', sha256='cf314e976329dd2dfd467b713247c020b06c0fc17d4261d94c4aabbf34f5827f')
+    version('1.2.0', sha256='3adfd5b5ffa2725e3c172de7cde8cc7019cd295fac4e37d39839e176db0db652')
+    version('1.1.0', sha256='2896f5c8ac6580b1adf026770ef5a664b209484c47189c55466f8884ffd33052')
+    version('1.0.1', sha256='6c9415a1394c554debce85c47349ecaaebdc9d5baa187d3ecb84be00ae9c70f0')
+    version('1.0.0', sha256='230a55a868bdd21425867cbd0dcb7ec046aa5ca522fb5694e42740b5b16d0f59')
+
+    depends_on('opencl +icd')
+
+    variant('shared', description='Build shared libraries', default=True)
+    variant('tuners', description='Enable compilation of the tuners', default=False)
+    variant('netlib', description='Enable compilation of the CBLAS Netlib API', default=False)
+
+    provides('blas', when='+netlib')
+
+    def cmake_args(self):
+        args = [
+            self.define_from_variant('BUILD_SHARED_LIBS', 'shared'),
+            self.define('TESTS', self.run_tests),
+            self.define_from_variant('TUNERS'),
+            self.define_from_variant('NETLIB')
+        ]
+        return args

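Note: `cmake_args` above uses `define_from_variant` in both its explicit and implicit forms; when the variant name is omitted, the lower-cased CMake variable name is used. A sketch of the equivalence:

```python
def cmake_args(self):
    return [
        # Explicit mapping: variant 'shared' -> -DBUILD_SHARED_LIBS=ON/OFF
        self.define_from_variant('BUILD_SHARED_LIBS', 'shared'),
        # Implicit mapping: 'TUNERS' is lower-cased to the variant 'tuners'
        self.define_from_variant('TUNERS'),
        # Plain define: -DTESTS=ON only when tests were requested
        self.define('TESTS', self.run_tests),
    ]
```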
diff --git a/var/spack/repos/builtin/packages/comgr/package.py b/var/spack/repos/builtin/packages/comgr/package.py

@@ -24,9 +24,9 @@ class Comgr(CMakePackage):
version('5.0.0', sha256='da1bbc694bd930a504406eb0a0018c2e317d8b2c136fb2cab8de426870efe9a8')
version('4.5.2', sha256='e45f387fb6635fc1713714d09364204cd28fea97655b313c857beb1f8524e593')
version('4.5.0', sha256='03c5880e0922fcff31306f7da2eb9d3a3709d9b5b75b3524dcfae85f4b181678')
-    version('4.3.1', sha256='f1d99550383ed7b3a01d304eedc3d86a8e45b271aa5a80b1dd099c22fda3f745')
-    version('4.3.0', sha256='f77b505abb474078374701dfc49e651ad3eeec5349ce6edda54549943a3775ee')
-    version('4.2.0', sha256='40a1ea50d2aea0cf75c4d17cdd6a7fe44ae999bf0147d24a756ca4675ce24e36')
+    version('4.3.1', sha256='f1d99550383ed7b3a01d304eedc3d86a8e45b271aa5a80b1dd099c22fda3f745', deprecated=True)
+    version('4.3.0', sha256='f77b505abb474078374701dfc49e651ad3eeec5349ce6edda54549943a3775ee', deprecated=True)
+    version('4.2.0', sha256='40a1ea50d2aea0cf75c4d17cdd6a7fe44ae999bf0147d24a756ca4675ce24e36', deprecated=True)
version('4.1.0', sha256='ffb625978555c63582aa46857672431793261166aa31761eff4fe5c2cab661ae', deprecated=True)
version('4.0.0', sha256='f389601fb70b2d9a60d0e2798919af9ddf7b8376a2e460141507fe50073dfb31', deprecated=True)
version('3.10.0', sha256='b44ee5805a6236213d758fa4b612bb859d8f774b9b4bdc3a2699bb009dd631bc', deprecated=True)

diff --git a/var/spack/repos/builtin/packages/cp2k/package.py b/var/spack/repos/builtin/packages/cp2k/package.py

@@ -661,7 +661,8 @@ def fflags(var, lst):
mkf.write(fflags('CPPFLAGS', cppflags))
mkf.write(fflags('CFLAGS', cflags))
mkf.write(fflags('CXXFLAGS', cxxflags))
-        mkf.write(fflags(acc_flags_var, nvflags))
+        if '+cuda' in spec:
+            mkf.write(fflags(acc_flags_var, nvflags))
mkf.write(fflags('FCFLAGS', fcflags))
mkf.write(fflags('LDFLAGS', ldflags))
mkf.write(fflags('LIBS', libs))

diff --git a/var/spack/repos/builtin/packages/cppcheck/package.py b/var/spack/repos/builtin/packages/cppcheck/package.py

@@ -11,6 +11,7 @@ class Cppcheck(MakefilePackage):
homepage = "http://cppcheck.sourceforge.net/"
url = "https://downloads.sourceforge.net/project/cppcheck/cppcheck/1.78/cppcheck-1.78.tar.bz2"
+    version('2.7', sha256='ac74c0973c46a052760f4ff7ca6a84616ca5795510542d195a6f122c53079291')
version('2.1', sha256='ab26eeef039e5b58aac01efb8cb664f2cc16bf9879c61bc93cd00c95be89a5f7')
version('2.0', sha256='5f77d36a37ed9ef58ea8b499e4b1db20468114c9ca12b5fb39b95906cab25a3f')
version('1.90', sha256='43758d56613596c29440e55ea96a5a13e36f81ca377a8939648b5242faf61883')

diff --git a/var/spack/repos/builtin/packages/cuda/package.py b/var/spack/repos/builtin/packages/cuda/package.py

@@ -128,7 +128,7 @@
'6.5.14': {
'Linux-x86_64': ('f3e527f34f317314fe8fcd8c85f10560729069298c0f73105ba89225db69da48', 'https://developer.download.nvidia.com/compute/cuda/6_5/rel/installers/cuda_6.5.14_linux_64.run')},
'6.0.37': {
-        'Linux-x86_64': ('991e436c7a6c94ec67cf44204d136adfef87baa3ded270544fa211179779bc40', '//developer.download.nvidia.com/compute/cuda/6_0/rel/installers/cuda_6.0.37_linux_64.run')},
+        'Linux-x86_64': ('991e436c7a6c94ec67cf44204d136adfef87baa3ded270544fa211179779bc40', 'https://developer.download.nvidia.com/compute/cuda/6_0/rel/installers/cuda_6.0.37_linux_64.run')},
}
@@ -247,6 +247,11 @@ def install(self, spec, prefix):
'--toolkit', # install CUDA Toolkit
]
+        if spec.satisfies('@7:'):
+            # use stage dir instead of /tmp
+            mkdir(join_path(self.stage.path, 'tmp'))
+            arguments.append('--tmpdir=%s' % join_path(self.stage.path, 'tmp'))
if spec.satisfies('@10.1:'):
arguments.append('--installpath=%s' % prefix) # Where to install
else:

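Note: the corrected entry lives in a module-level table keyed first by version, then by a `<system>-<machine>` platform string, mapping to a `(sha256, url)` pair; the old value's missing `https:` scheme left a protocol-relative URL that a fetcher cannot resolve on its own. Roughly how such a table is turned into `version()` directives (an illustrative sketch, not the package's exact code):

```python
import platform

_versions = {
    '6.0.37': {
        'Linux-x86_64': ('991e436c7a6c94ec67cf44204d136adfef87baa3ded270544fa211179779bc40',
                         'https://developer.download.nvidia.com/compute/cuda/6_0/rel/installers/cuda_6.0.37_linux_64.run'),
    },
}

# Key for the running host, e.g. 'Linux-x86_64':
key = '{0}-{1}'.format(platform.system(), platform.machine())
for ver, platforms in _versions.items():
    pkg = platforms.get(key)
    if pkg:
        sha256, url = pkg
        # In the class body this would become:
        # version(ver, sha256=sha256, url=url, expand=False)
```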
diff --git a/var/spack/repos/builtin/packages/dray/package.py b/var/spack/repos/builtin/packages/dray/package.py

@@ -78,6 +78,8 @@ def propagate_cuda_arch(package, spec=None):
depends_on("apcomp~shared", when="~shared")
depends_on("apcomp+shared", when="+shared")
depends_on("raja@0.12.0:")
depends_on("raja@:0.14", when='@0.1.7:')
depends_on("raja@:0.13", when="@:0.1.6")
depends_on("raja~cuda", when="~cuda")
depends_on("raja+cuda", when="+cuda")

diff --git a/var/spack/repos/builtin/packages/dwz/package.py b/var/spack/repos/builtin/packages/dwz/package.py

@@ -15,7 +15,7 @@ class Dwz(MakefilePackage, SourcewarePackage):
maintainers = ['iarspider']
-    depends_on('libelf')
+    depends_on('elf')
version('0.14-patches', branch='dwz-0.14-branch')
version('0.14', sha256='33006eab875ff0a07f13fc885883c5bd9514d83ecea9f18bc46b5732dddf0d1f', preferred=True)

diff --git a/var/spack/repos/builtin/packages/ecbuild/package.py b/var/spack/repos/builtin/packages/ecbuild/package.py

@@ -8,19 +8,20 @@
class Ecbuild(CMakePackage):
"""ecBuild is the ECMWF build system. It is built on top of CMake and
-    consists of a set of macros as well as a wrapper around CMake,"""
+    consists of a set of macros as well as a wrapper around CMake"""
homepage = 'https://github.com/ecmwf/ecbuild'
url = 'https://github.com/ecmwf/ecbuild/archive/refs/tags/3.6.1.tar.gz'
maintainers = ['skosukhin']
+    version('3.6.5', sha256='98bff3d3c269f973f4bfbe29b4de834cd1d43f15b1c8d1941ee2bfe15e3d4f7f')
version('3.6.1', sha256='796ccceeb7af01938c2f74eab0724b228e9bf1978e32484aa3e227510f69ac59')
-    # Some of the tests (ECBUILD-415 and test_ecbuild_regex_escape) fail with
-    # cmake@2.20.0 and it is not yet clear why. For now, we simply limit the
-    # version of cmake to the latest '3.19.x':
-    depends_on('cmake@3.11:3.19', type=('build', 'run'))
+    depends_on('cmake@3.11:', type=('build', 'run'))
+    # See https://github.com/ecmwf/ecbuild/issues/35
+    depends_on('cmake@:3.19', type=('build', 'run'), when='@:3.6.1')
# Some of the installed scripts require running Perl:
depends_on('perl', type=('build', 'run'))

diff --git a/var/spack/repos/builtin/packages/eccodes/package.py b/var/spack/repos/builtin/packages/eccodes/package.py

@@ -40,10 +40,12 @@ class Eccodes(CMakePackage):
homepage = 'https://software.ecmwf.int/wiki/display/ECC/ecCodes+Home'
url = 'https://confluence.ecmwf.int/download/attachments/45757960/eccodes-2.2.0-Source.tar.gz?api=v2'
git = 'https://github.com/ecmwf/eccodes.git'
+    list_url = 'https://confluence.ecmwf.int/display/ECC/Releases'
maintainers = ['skosukhin']
version('develop', branch='develop')
+    version('2.25.0', sha256='8975131aac54d406e5457706fd4e6ba46a8cc9c7dd817a41f2aa64ce1193c04e')
version('2.24.2', sha256='c60ad0fd89e11918ace0d84c01489f21222b11d6cad3ff7495856a0add610403')
version('2.23.0', sha256='cbdc8532537e9682f1a93ddb03440416b66906a4cc25dec3cbd73940d194bf0c')
@@ -95,7 +97,12 @@ class Eccodes(CMakePackage):
depends_on('openjpeg@1.5.0:1.5,2.1.0:2.3', when='jp2k=openjpeg')
# Additional constraint for older versions.
depends_on('openjpeg@:2.1', when='@:2.16 jp2k=openjpeg')
-    depends_on('jasper', when='jp2k=jasper')
+    with when('jp2k=jasper'):
+        depends_on('jasper')
+        # jasper 3.x compat from commit 86f0b35f1a8492cb16f82fb976a0a5acd2986ac2
+        depends_on('jasper@:2', when='@:2.25.0')
depends_on('libpng', when='+png')
depends_on('libaec', when='+aec')
# Can be built with Python 2 or Python 3.
@@ -111,6 +118,8 @@ class Eccodes(CMakePackage):
depends_on('cmake@3.6:', type='build')
depends_on('cmake@3.12:', when='@2.19:', type='build')
+    depends_on('ecbuild', type='build', when='@develop')
conflicts('+openmp', when='+pthreads',
msg='Cannot enable both POSIX threads and OMP')

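Note: the `with when('jp2k=jasper'):` block groups directives that share a condition; the context manager ANDs its condition into every directive inside it, which keeps related constraints together. The grouped and flat forms are equivalent:

```python
# Grouped form, as in the diff:
with when('jp2k=jasper'):
    depends_on('jasper')
    depends_on('jasper@:2', when='@:2.25.0')

# Flat form with the shared condition spelled out per directive:
depends_on('jasper', when='jp2k=jasper')
depends_on('jasper@:2', when='@:2.25.0 jp2k=jasper')
```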
diff --git a/var/spack/repos/builtin/packages/ecp-data-vis-sdk/package.py b/var/spack/repos/builtin/packages/ecp-data-vis-sdk/package.py

@@ -31,19 +31,16 @@ class EcpDataVisSdk(BundlePackage, CudaPackage, ROCmPackage):
# Vis
variant('ascent', default=False, description="Enable Ascent")
+    variant('cinema', default=False, description="Enable Cinema")
variant('paraview', default=False, description="Enable ParaView")
variant('sz', default=False, description="Enable SZ")
+    variant('visit', default=False, description="Enable VisIt")
variant('vtkm', default=False, description="Enable VTK-m")
variant('zfp', default=False, description="Enable ZFP")
-    # Cinema
-    variant('cinema', default=False, description="Enable Cinema")
# Outstanding build issues
variant('sensei', default=False, description="Enable Sensei")
conflicts('+sensei')
-    variant('visit', default=False, description="Enable VisIt")
-    conflicts('+visit')
# Wrapper around depends_on to propagate dependency variants
def dav_sdk_depends_on(spec, when=None, propagate=None):
@@ -125,9 +122,12 @@ def exclude_variants(variants, exclude):
dav_sdk_depends_on('sensei@develop +vtkio +python ~miniapps', when='+sensei',
propagate=dict(propagate_to_sensei))
-    dav_sdk_depends_on('ascent+mpi+fortran+openmp+python+shared+vtkh+dray~test',
+    # Fortran support with ascent is problematic on some Cray platforms so the
+    # SDK is explicitly disabling it until the issues are resolved.
+    dav_sdk_depends_on('ascent+mpi~fortran+openmp+python+shared+vtkh+dray~test',
when='+ascent',
propagate=['adios2', 'cuda'] + cuda_arch_variants)
# Need to explicitly turn off conduit hdf5_compat in order to build
# hdf5@1.12 which is required for SDK
depends_on('ascent ^conduit ~hdf5_compat', when='+ascent +hdf5')
@@ -149,7 +149,9 @@ def exclude_variants(variants, exclude):
depends_on('paraview ~cuda', when='+paraview ~cuda')
conflicts('paraview@master', when='+paraview')
-    dav_sdk_depends_on('visit', when='+visit')
+    dav_sdk_depends_on('visit+mpi+python+silo',
+                       when='+visit',
+                       propagate=['hdf5', 'adios2'])
dav_sdk_depends_on('vtk-m@1.7:+shared+mpi+openmp+rendering',
when='+vtkm',

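Note: the truncated last diff leans on the SDK's `dav_sdk_depends_on` helper, whose signature is visible above but whose body is not part of this diff. A plausible sketch of what such a wrapper does, assuming `propagate` is a list of variant names to forward (the calls above show it also accepts a dict for renamed variants, which this sketch ignores); like the real helper, it would live in the class body where `depends_on` is in scope:

```python
def dav_sdk_depends_on(spec, when=None, propagate=None):
    # Plain dependency, gated on the SDK-level condition (e.g. '+visit').
    depends_on(spec, when=when)

    # Bare package name, e.g. 'visit+mpi+python+silo' -> 'visit'.
    pkg = spec.split('+')[0].split('~')[0].split('@')[0]

    # Forward each listed variant in both states: '+hdf5' on the SDK
    # implies '+hdf5' on the dependency, and likewise for '~hdf5'.
    for v in propagate or []:
        depends_on('{0}+{1}'.format(pkg, v), when='{0}+{1}'.format(when, v))
        depends_on('{0}~{1}'.format(pkg, v), when='{0}~{1}'.format(when, v))
```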
Some files were not shown because too many files have changed in this diff.