Compare commits

...

84 Commits

Author SHA1 Message Date
Wouter Deconinck
7a9ea92de0 docs: user_reports.rst 2024-11-07 11:04:59 -06:00
Wouter Deconinck
508aa228a7 docs: in_the_news 2024-11-07 10:57:01 -06:00
Wouter Deconinck
03c29ad13e docs: new toc entry testimonials 2024-11-07 10:47:07 -06:00
eugeneswalker
bf11fb037b loki%oneapi@2025: -Wno-error=missing-template-arg-list-after-template-kw (#47475) 2024-11-06 18:49:35 -07:00
Marc T. Henry de Frahan
074b845cd3 Add amr-wind versions (#47479) 2024-11-06 17:30:29 -07:00
eugeneswalker
dd26732897 legion%oneapi@2025: cxxflags add -Wno-error=missing-template-arg-list-after-template-kw (#47478) 2024-11-06 16:26:04 -08:00
eugeneswalker
3665c5c01b slate %oneapi@2025: cxxflags: add -Wno-error=missing-template-arg-list-after-template-kw (#47476) 2024-11-06 16:25:47 -08:00
Harmen Stoppels
73219e4b02 llnl.util.filesystem.find: multiple entrypoints (#47436)
You can now provide multiple roots to a single `find()` call and all of
them will be searched. The roots can overlap (e.g. can be parents of one
another).

This also adds a library function that takes a set of filename (fnmatch)
patterns and creates a single OR regular expression from them (that
function is used in `find` to improve its performance).
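
For illustration, a minimal usage sketch of the multi-root call (the roots and patterns here are hypothetical):

```python
from llnl.util.filesystem import find

# Both roots are searched in a single call; overlapping roots are fine,
# and each unique directory is visited only once.
matches = find(["/usr/include", "/usr/local/include"], ["*.h", "*.hpp"])
```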
2024-11-06 15:22:26 -08:00
Jon Rood
57a90c91a4 nalu-wind: fix hypre constraints (#47474) 2024-11-06 15:47:44 -07:00
Cameron Smith
8f4a0718bf omega-h: new version and cuda conflicts for prior versions (#47473)
* omegah: add version 10.8.6

* omegah: cuda without kokkos conflict

* omegah: test with latest version in ci
2024-11-06 22:36:59 +00:00
Wouter Deconinck
9049ffdc7a gsoap: add v2.8.135 (#47415)
* gsoap: add v2.8.135
2024-11-06 12:52:30 -07:00
eugeneswalker
d1f313342e tau: add v2.34 (#47471) 2024-11-06 10:15:52 -07:00
Massimiliano Culpo
e62cf9c45b Fix spack -c <override> when env active (#47403)
Set command line scopes last in _main, so they are higher scopes

Restore the global configuration in a spawned process by inspecting
the result of ctx.get_start_method()

Add the ability to pass a mp.context to PackageInstallContext.

Add shell-tests to check overriding the configuration:
- Using both -c and -C from command line
- With and without an environment active
2024-11-06 17:18:58 +01:00
Wouter Deconinck
ee2723dc46 rivet: add through v4.0.2 (incl yoda: add through v2.0.2) (#47383)
* yoda: add v2.0.1, v2.0.2

* rivet: add v3.1.9, v3.1.10, v4.0.0, v4.0.1, v4.0.2

* rivet: yoda@:1 when @:3; conflicts hepmc3@3.3.0 when @:4.0.0

* rivet: fix style

* rivet: hepmc=2 only when @:3; use libs.directories[0]

* hepmc3: def libs

* [@spackbot] updating style on behalf of wdconinc

---------

Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-11-06 09:09:40 -07:00
Harmen Stoppels
d09b185522 Fix various bootstrap/concretizer import issues (#47467) 2024-11-06 14:35:04 +00:00
Harmen Stoppels
a31c525778 llnl.util.filesystem.find: restore old error handling (#47463) 2024-11-06 11:49:14 +01:00
Thomas Madlener
2aa5a16433 edm4hep: Add json variant for newer versions (#47180)
* edm4hep: Add json variant for newer versions

An explicit option has been added to EDM4hep, so we now expose it via a
variant as well. We keep the old behavior, where we unconditionally
depended on nlohmann-json and implicitly built JSON support if we could
detect it at the CMake stage.

* Fix condition statement in when clause

* Use open version range to avoid fixing to single version

---------

Co-authored-by: Valentin Volkl <valentin.volkl@cern.ch>
2024-11-06 03:42:34 -07:00
Richarda Butler
0c164d2740 Feature: Allow variants to propagate if not available in source pkg (#42931)
Variants can now be propagated from a dependent package to (transitive) dependencies,
even if the source package or the transitive dependencies do not define the propagated variants.

For example, here `zlib` doesn't have a `guile` variant, but `gmake` does:
```
$ spack spec zlib++guile
 -   zlib@1.3%gcc@12.2.0+optimize+pic+shared build_system=makefile arch=linux-rhel8-broadwell
 -       ^gcc-runtime@12.2.0%gcc@12.2.0 build_system=generic arch=linux-rhel8-broadwell
 -       ^gmake@4.4.1%gcc@12.2.0+guile build_system=generic arch=linux-rhel8-broadwell
```

Adding this property has some strange ramifications for `satisfies()`. In particular:
* The abstract specs `pkg++variant` and `pkg+variant`  do not intersect, because `+variant`
  implies that `pkg` *has* the variant, but `++variant` does not.
* This means that `spec.satisfies("++foo")` is `True` if:
    * for concrete specs: `spec` and its dependencies all have `foo` set if it exists
    * for abstract specs: no dependency of `spec`  has `~foo` (i.e. no dependency contradicts `++foo`).
* This also means that `Spec("++foo").satisfies("+foo")` is `False` -- we only know after concretization.

The `satisfies()` semantics may be surprising, but this is the cost of introducing non-subset
semantics (which are more useful than proper subsets here).
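
A minimal sketch of that last point, assuming a Spack Python environment:

```python
from spack.spec import Spec

# An abstract ++foo does not guarantee the package has the variant, so
# whether +foo holds is only known after concretization.
assert not Spec("++foo").satisfies("+foo")
```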

- [x] Change checks for variants
- [x] Resolve conflicts
- [x] Add tests
- [x] Add documentation

---------

Co-authored-by: Gregory Becker <becker33@llnl.gov>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-11-06 00:53:52 -08:00
Jon Rood
801390f6be masa: add versions and update dependencies (#47447)
* masa: add versions

* masa: update dependencies
2024-11-05 13:08:21 -07:00
wspear
c601692bc7 Fix filter_compiler_wrappers with mpi (#47448) 2024-11-05 13:08:10 -07:00
Harmen Stoppels
2b9c6790f2 omega-h: fix versioning and cuda compat (#47433) 2024-11-05 15:50:48 +01:00
Martin Lang
09ae2516d5 cgal: add v6.0.1 (#47285) 2024-11-05 14:20:10 +01:00
Harmen Stoppels
eb9ff5d7a7 paraview: add forward compat bound with cuda (#47430) 2024-11-05 13:25:19 +01:00
Harmen Stoppels
dadb30f0e2 libc.py: detect glibc also in chinese locale (#47434) 2024-11-05 12:30:32 +01:00
Harmen Stoppels
d45f682573 Revert "Ci generate on change (#47318)" (#47431)
This reverts commit 1462c35761.
2024-11-05 10:51:12 +01:00
David Boehme
b7601f3042 Add Adiak v0.4.1 (#47429) 2024-11-05 01:51:19 -07:00
Mosè Giordano
6b5a479d1e extrae: fix build with gcc@14 (#47407) 2024-11-05 09:42:06 +01:00
Harmen Stoppels
1297dd7fbc py-torchaudio, py-torchtext: rpath fixup (#47404)
* py-torchaudio, py-torchtext: rpath fixup

also add missing dependency on Spack ffmpeg to torchaudio.

* py-torchaudio: add torio rpath
2024-11-05 09:36:26 +01:00
Adam J. Stewart
75c169d870 py-tensorflow: add v2.18.0 (#47211) 2024-11-05 09:04:07 +01:00
Harmen Stoppels
afe431cfb5 py-python-ptrace: missing forward compat bound (#47401) 2024-11-05 07:50:38 +01:00
Massimiliano Culpo
14bc900e9d spack.concretize: add type-hints, remove kwargs (#47382)
Also remove find_spec, which was used by the old concretizer.
Currently, it seems to be used only in tests.
2024-11-05 07:46:49 +01:00
Howard Pritchard
e42e541605 openmpi: add 4.1.7 (#47427)
Signed-off-by: Howard Pritchard <howardp@lanl.gov>
2024-11-04 22:23:16 -07:00
Wouter Deconinck
9310fcabd8 sst-dumpi: fix homepage (#47420) 2024-11-04 22:13:17 -07:00
Wouter Deconinck
6822f99cc6 quicksilver: fix homepage (#47419) 2024-11-04 22:12:57 -07:00
Wouter Deconinck
703cd6a313 py-eventlet: fix url (#47418)
* py-eventlet: fix url
* py-eventlet: fix checksum
2024-11-04 22:08:17 -07:00
Wouter Deconinck
5b59a53545 py-configspace: fix homepage (#47417) 2024-11-04 21:59:04 -07:00
Mosè Giordano
b862eec6bc extrae: add more versions (#47408) 2024-11-04 21:43:02 -07:00
Mosè Giordano
dcc199ae63 extrae: fix typo (#47406) 2024-11-04 21:34:08 -07:00
Paul Gessinger
f8da72cffe pythia8: Include patch for C++20 / Clang (#47400)
* pythia8: Include patch for C++20 / Clang

Pythia8 vendors some FJCore sources that, as of Pythia8 312, are
incompatible with C++20 on Clang. This adds a patch that makes them
compatible in these scenarios.

* Add issue link

* rename setup_cxxstd function

* Remove an accidental printout

* Apply patch to all compilers, add lower bound
2024-11-04 20:34:47 -06:00
Matthieu Dorier
8650ba3cea prometheus-cpp: added package prometheus-cpp (#47384)
* prometheus-cpp: added package prometheus-cpp
* prometheus-cpp: edited PR for style
2024-11-04 17:55:29 -08:00
Pranav Sivaraman
54aaa95a35 flux: new package (#47392) 2024-11-04 17:51:23 -08:00
Edward Hartnett
5a29c9d82b added g2c-2.0.0 (#47399) 2024-11-04 17:48:48 -08:00
Tim Haines
c8873ea35c dyninst: patch broken builds for 10.0.0:12.2.0 (#47339)
* dyninst: patch broken builds for 10.0.0:12.3.0
* Only apply before 12.3.0 (see the sketch below)
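
A minimal sketch of how such a ranged patch is typically written in a Spack `package.py` (the patch filename is hypothetical):

```python
from spack.package import *  # standard package.py preamble


class Dyninst(CMakePackage):
    # hypothetical patch file; applied only to the broken range,
    # not to 12.3.0 and later
    patch("fix-broken-builds.patch", when="@10.0.0:12.2.0")
```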
2024-11-04 15:33:41 -08:00
Paul R. C. Kent
c7659df4af libxc: add v7.0.0 (#47263) 2024-11-04 15:30:55 -08:00
Sreenivasa Murthy Kolam
0de6c17477 fix the error libroctx64.so.o not found when executing MIOpenDriver (#47196) 2024-11-04 11:54:48 -08:00
Darren Bolduc
6924c530e2 google-cloud-cpp: add v2.29.0, v2.30.0 (#47146)
* google-cloud-cpp: add v2.29.0; fix cxx-std versions
* d'oh, single value for the variant
2024-11-04 11:50:54 -08:00
Peter Scheibel
38c8069ab4 filesystem.py: add max_depth argument to find (#41945)
* `find(..., max_depth=...)` can be used to control how many directories at most to descend into below the starting point (see the sketch after this list)
* `find` now enters every unique (symlinked) directory once at the lowest depth
* `find` is now repeatable: it traverses the directory tree in a deterministic order
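
A short usage sketch of the new argument (paths and patterns are hypothetical):

```python
from llnl.util.filesystem import find

# Descend at most one directory level below the root.
cmake_files = find("/opt/pkg", "*.cmake", max_depth=1)

# recursive=False is the max_depth=0 case; combining recursive=False
# with a max_depth raises a ValueError.
top_level_only = find("/opt/pkg", "*.cmake", recursive=False)
```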
2024-11-04 20:31:57 +01:00
Todd Gamblin
5cc07522ab cc: parse RPATHs when in ld mode
In the pure `ld` case, we weren't actually parsing `RPATH` arguments separately as we
do for `ccld`. Fix this by adding *another* nested case statement for raw `RPATH`
parsing.

There are now 3 places where we deal with `-rpath` and friends, but I don't see a great
way to unify them, as `-Wl,`, `-Xlinker`, and raw `-rpath` arguments are all ever so
slightly different.

Also, this fixes the ordering of assertions to make `pytest` diffs more intelligible.
The meaning of `+` and `-` in diffs changed in `pytest` 6.0, and the "preferred" order
for assertions became `assert actual == expected` instead of the other way around.

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-11-04 19:52:08 +01:00
Todd Gamblin
575a006ca3 cc: simplify ordered list handling
`cc` divides most paths up into system paths, spack managed paths, and other paths.
This gets really repetitive and makes the code hard to read. Simplify the script
by adding some functions to do most of the redundant work for us.

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-11-04 19:52:08 +01:00
John Gouwar
23ac56edfb Time spec building and add timing to the public concretizer API (#47310)
This PR has two small contributions:
- It adds another phase to the timer for concretization, "construct_specs", to actually see the time the concretizer spends interpreting the `clingo` output to build the Python object for a concretized spec.
- It adds the method `Solver.solve_with_stats` to expose the timers that were already in the concretizer to the public solver API. `Solver.solve` just becomes a special case of `Solver.solve_with_stats` that throws away the timing output (which is what it was already doing).  

These changes will make it easier to benchmark concretizer performance and provide a more complete picture of the time spent in the concretizer by including the time spent interpreting clingo output.
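
A rough sketch of how the new entry point could be driven; the exact return shape is an assumption here, since it is not shown in this diff:

```python
import spack.solver.asp
from spack.spec import Spec

solver = spack.solver.asp.Solver()

# Assumption: solve_with_stats accepts the same arguments as solve() and
# additionally returns the timing information that solve() discards.
result_with_timing = solver.solve_with_stats([Spec("zlib")])
```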
2024-11-04 09:48:18 -08:00
Harmen Stoppels
8c3068809f papi: add forward compat bound for cuda (#47409) 2024-11-04 17:32:47 +01:00
Stephen Nicholas Swatman
2214fc855d geant4-data: symlink only specific data dirs (#47367)
Currently, the `geant4-data` spec creates symlinks to all of its
dependencies, and it does so by globbing their `share/` directories.
This works very well for the way Spack installs these, but it doesn't
work for anybody wanting to use e.g. the Geant4 data on CVMFS. See pull
request #47298. This commit changes the way the `geant4-data` spec
works. It no longer blindly globs directories and makes symlinks, but
asks its dependencies specifically for the name of their data directory.
This should allow us to use the CVMFS installations as externals through
Spack.
2024-11-04 15:38:12 +00:00
Harmen Stoppels
d44bdc40c9 boost: require +icu when +locale (#47396) 2024-11-04 16:12:16 +01:00
Stephen Nicholas Swatman
e952f6be8e acts dependencies: new versions as of 2024/11/01 (#47366)
* acts dependencies: new versions as of 2024/11/01

Includes new versions of ACTS itself, Detray, and Vecmem.

* Bind TBB
2024-11-04 06:48:08 -07:00
Wouter Deconinck
b95936f752 zabbix: add v5.0.44, v6.0.34, v7.0.4 (fix CVEs) (#47001)
* zabbix: add v5.0.44, v6.0.34, v7.0.4 (fix CVEs)

* [@spackbot] updating style on behalf of wdconinc

* zabbix: use f-string

* zabbix: fix f-string quoting

* zabbix: use mysql-client

* @wdconic, this fixes the mysql client virtual for me

---------

Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2024-11-04 07:20:09 -06:00
Harmen Stoppels
8d0856d1cc packaging_guide.rst: explain forward and backward compat before the less common cases (#47402)
The idea is to go from most to least used: backward compat -> forward compat -> pinning on major or major.minor version -> pinning specific, concrete versions.

Further, the following

```python
   # backward compatibility with Python
   depends_on("python@3.8:")
   depends_on("python@3.9:", when="@1.2:")
   depends_on("python@3.10:", when="@1.4:")

   # forward compatibility with Python
   depends_on("python@:3.12", when="@:1.10")
   depends_on("python@:3.13", when="@:1.12")
   depends_on("python@:3.14")
```

is better than disjoint `when` ranges, which cause repetition of the rules on dependencies and require frequent editing of existing lines after new releases:

```python
   depends_on("python@3.8:3.12", when="@:1.1")
   depends_on("python@3.9:3.12", when="@1.2:1.3")
   depends_on("python@3.10:3.12", when="@1.4:1.10")
   depends_on("python@3.10:3.13", when="@1.11:1.12")
   depends_on("python@3.10:3.14", when="@1.13:")
2024-11-04 13:52:05 +01:00
Teague Sterling
10f7014add vep-cache: new package (#44523)
* py-uvloop: add v3.8.14, v3.9.15, v3.10.3 and dependencies

* rollback

* vep: add v110,v111,v112

* vep-cache: add v110,v111,v112

* Cleanup

* Reorganizing

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* Update package.py

* Update package.py

* [@spackbot] updating style on behalf of teaguesterling

* Update package.py

* Update package.py

* Update package.py

* [@spackbot] updating style on behalf of teaguesterling

* Update package.py

* [@spackbot] updating style on behalf of teaguesterling

* Fix scoping and syntax issues

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* fix styles

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* fix variants

* fixing up variant issues and cleaning up resource code

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* fixing unused imports

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* Apply suggestions from code review

Co-authored-by: Arne Becker <101113822+EbiArnie@users.noreply.github.com>

* fixing vep dependencies

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* Fixing resources

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* Fixing issue where resources are not downloaded

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* vep-cache fixing downloads

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* defaulting to using VEP installer

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* Removing resource-based cache installation and simplifying the package. Resources without checksums don't work (anymore?), and calculating them would be difficult

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

---------

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
Co-authored-by: Arne Becker <101113822+EbiArnie@users.noreply.github.com>
2024-11-04 11:37:21 +00:00
Harmen Stoppels
c9ed91758d tcsh: add missing libxcrypt dependency (#47398) 2024-11-04 03:42:34 -07:00
Harmen Stoppels
2c1d74db9b krb5: disable missing keyutils dependency (#47397) 2024-11-04 03:20:33 -07:00
Harmen Stoppels
5b93466340 libssh2: fix crypto (#47393) 2024-11-04 02:48:35 -07:00
Martin Lang
1ee344c75c bigdft-futile: fix compilation for @1.9.5~mpi (#47292)
When compiled without MPI support, a fake mpi header is autogenerated during configure/build. The header is missing one symbol in version 1.9.5. The problem has since been fixed upstream.

A similar problem also occurs for 1.9.4. Unfortunately, the patch does not work for 1.9.4, and I also don't know whether further fixes would be required for that version. Therefore, only the newest version, 1.9.5, is patched.
2024-11-04 10:47:54 +01:00
afzpatel
754011643c rocal and rocm-openmp-extras: fix build failures (#47314) 2024-11-04 10:41:07 +01:00
Cédric Chevalier
2148292bdb kokkos and kokkos-kernels: use new urls for v4.4 and above (#47330) 2024-11-04 10:39:16 +01:00
Harmen Stoppels
cf3576a9bb suite-sparse: fix missing rpaths for dependencies (#47394) 2024-11-04 02:38:16 -07:00
Martin Lang
a86f164835 nlopt: new version 2.8.0 (#47289) 2024-11-04 10:35:11 +01:00
Martin Lang
2782ae6d7e libpspio: new version 0.4.1 (#47287) 2024-11-04 10:34:22 +01:00
Wouter Deconinck
b1fd6dbb6d libxml2: add v2.11.9, v2.12.9, v2.13.4 (#47297)
Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-11-04 10:32:34 +01:00
Andrey Prokopenko
18936771ff arborx: remove Trilinos dependency for @1.6: (#47305) 2024-11-04 10:29:32 +01:00
Brian Spilner
9a94ea7dfe icon: add a maintainer (#47323) 2024-11-04 10:26:28 +01:00
Larry Knox
a93bd6cee4 hdf5: add develop-2.0 (#47299)
Update the HDF5 version for the develop branch to develop-2.0 to match the
new version on the develop branch.

Remove develop-1.16, as it has been decided to make the next release HDF5 2.0.0.
2024-11-04 10:21:17 +01:00
Paul R. C. Kent
4c247e206c llvm: add v19.1.3 (#47325) 2024-11-04 10:18:24 +01:00
Weiqun Zhang
fcdaccfeb6 amrex: add v24.11 (#47371) 2024-11-04 10:06:49 +01:00
Wouter Deconinck
2fc056e27c py-flask-compress: add v1.14 (#47373) 2024-11-04 10:02:15 +01:00
Wouter Deconinck
417c48b07a py-flask-cors: add v4.0.0 (#47374) 2024-11-04 10:01:36 +01:00
Christophe Prud'homme
f05033b0d2 cpr: add +pic and +shared variants (#47281) 2024-11-04 10:00:26 +01:00
Cameron Smith
d63f06e4b7 pumi: add version 2.2.9 (#47380) 2024-11-04 09:58:40 +01:00
Wouter Deconinck
8296aaf175 minizip: add v1.3.1 (#47379) 2024-11-04 09:57:40 +01:00
Wouter Deconinck
86ebcabd46 cups: add v2.4.11 (#47390) 2024-11-04 09:55:33 +01:00
Wouter Deconinck
87329639f2 elasticsearch, kibana, logstash: add v8.15.2 (#46873) 2024-11-04 09:50:41 +01:00
Harmen Stoppels
0acd6ae7b2 lua-luaposix: add missing libxcrypt dependency (#47395) 2024-11-04 01:47:47 -07:00
Massimiliano Culpo
395c911689 Specs: propagated variants affect == equality (#47376)
This PR changes the semantics of == for specs so that:

hdf5++mpi == hdf5+mpi

no longer holds true. It also changes the constrain semantics, so that a
non-propagating variant always overrides a propagating variant. This means:

(hdf5++mpi).constrain(hdf5+mpi) -> hdf5+mpi

Before, we had a very odd semantic that was supposedly covered by unit tests:

(libelf++debug).constrain(libelf+debug+foo) -> libelf++debug++foo

This semantic has been dropped, as it was never really tested due to the == bug.
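
A small sketch of the new behavior, restating the examples above as code:

```python
from spack.spec import Spec

# Propagated (++) and non-propagated (+) variants no longer compare equal:
assert Spec("hdf5++mpi") != Spec("hdf5+mpi")

# Under constrain(), a non-propagating variant overrides a propagating one:
s = Spec("hdf5++mpi")
s.constrain(Spec("hdf5+mpi"))  # s is now hdf5+mpi
```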
2024-11-03 22:35:16 -08:00
Wouter Deconinck
2664303d7a pythia8: add v8.312 (#47389)
* pythia8: add v8.312

* pythia8: update homepage url
2024-11-03 23:48:34 +01:00
Wouter Deconinck
ff9568fa2f sherpa: add v3.0.1 (#47388)
* sherpa: add v3.0.1

* sherpa: no depends_on py-setuptools
2024-11-03 15:32:35 -07:00
eugeneswalker
632c009569 e4s ci stacks: reduce package prefs (#47381) 2024-11-03 00:18:13 +01:00
140 changed files with 2466 additions and 977 deletions

View File

@@ -1359,6 +1359,10 @@ For example, for the ``stackstart`` variant:
mpileaks stackstart==4 # variant will be propagated to dependencies
mpileaks stackstart=4 # only mpileaks will have this variant value
Spack also allows variants to be propagated from a package that does
not have that variant.
^^^^^^^^^^^^^^
Compiler Flags
^^^^^^^^^^^^^^

View File

@@ -214,6 +214,7 @@ def setup(sphinx):
# Spack classes that intersphinx is unable to resolve
("py:class", "spack.version.StandardVersion"),
("py:class", "spack.spec.DependencySpec"),
("py:class", "spack.spec.ArchSpec"),
("py:class", "spack.spec.InstallStatus"),
("py:class", "spack.spec.SpecfileReaderBase"),
("py:class", "spack.install_test.Pb"),

View File

@@ -0,0 +1,23 @@
.. Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)
.. _in-the-news:
==================
Spack In The News
==================
Since its inception, Spack has been featured in many blog posts and news stories. Here we collect a growing list of them.
.. warning::
Disclaimer:
Spack is not responsible for and does not control the content of these external pages.
Spack makes no representations about their accuracy or applicability.
----
2024
----
- May 22, 2024: `Spack on Windows: A New Era in Cross-Platform Dependency Management <https://www.kitware.com/spack-on-windows-a-new-era-in-cross-platform-dependency-management/>`_ (Kitware)

View File

@@ -97,6 +97,13 @@ or refer to the full manual below.
build_systems
developer_guide
.. toctree::
:maxdepth: 2
:caption: Testimonials
in_the_news
user_reports
.. toctree::
:maxdepth: 2
:caption: API Docs

View File

@@ -2503,15 +2503,14 @@ with. For example, suppose that in the ``libdwarf`` package you write:
depends_on("libelf@0.8")
Now ``libdwarf`` will require ``libelf`` at *exactly* version ``0.8``.
You can also specify a requirement for a particular variant or for
specific compiler flags:
Now ``libdwarf`` will require ``libelf`` in the range ``0.8``, which
includes patch versions ``0.8.1``, ``0.8.2``, etc. Apart from version
restrictions, you can also specify variants if this package requires
optional features of the dependency.
.. code-block:: python
depends_on("libelf@0.8+debug")
depends_on("libelf debug=True")
depends_on("libelf cppflags='-fPIC'")
depends_on("libelf@0.8 +parser +pic")
Both users *and* package authors can use the same spec syntax to refer
to different package configurations. Users use the spec syntax on the
@@ -2519,46 +2518,82 @@ command line to find installed packages or to install packages with
particular constraints, and package authors can use specs to describe
relationships between packages.
^^^^^^^^^^^^^^
Version ranges
^^^^^^^^^^^^^^
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Specifying backward and forward compatibility
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Although some packages require a specific version for their dependencies,
most can be built with a range of versions. For example, if you are
writing a package for a legacy Python module that only works with Python
2.4 through 2.6, this would look like:
Packages are often compatible with a range of versions of their
dependencies. This is typically referred to as backward and forward
compatibility. Spack allows you to specify this in the ``depends_on``
directive using version ranges.
**Backwards compatibility** means that the package requires at least a
certain version of its dependency:
.. code-block:: python
depends_on("python@2.4:2.6")
depends_on("python@3.10:")
Version ranges in Spack are *inclusive*, so ``2.4:2.6`` means any version
greater than or equal to ``2.4`` and up to and including any ``2.6.x``. If
you want to specify that a package works with any version of Python 3 (or
higher), this would look like:
In this case, the package requires Python 3.10 or newer.
Commonly, packages drop support for older versions of a dependency as
they release new versions. In Spack you can conveniently add every
backward compatibility rule as a separate line:
.. code-block:: python
depends_on("python@3:")
# backward compatibility with Python
depends_on("python@3.8:")
depends_on("python@3.9:", when="@1.2:")
depends_on("python@3.10:", when="@1.4:")
Here we leave out the upper bound. If you want to say that a package
requires Python 2, you can similarly leave out the lower bound:
This means that in general we need Python 3.8 or newer; from version
1.2 onwards we need Python 3.9 or newer; from version 1.4 onwards we
need Python 3.10 or newer. Notice that it's fine to have overlapping
ranges in the ``when`` clauses.
**Forward compatibility** means that the package requires at most a
certain version of its dependency. Forward compatibility rules are
necessary when there are breaking changes in the dependency that the
package cannot handle. In Spack we often add forward compatibility
bounds only at the time a new, breaking version of a dependency is
released. As with backward compatibility, it is typical to see a list
of forward compatibility bounds in a package file as separate lines:
.. code-block:: python
depends_on("python@:2")
# forward compatibility with Python
depends_on("python@:3.12", when="@:1.10")
depends_on("python@:3.13", when="@:1.12")
Notice that we didn't use ``@:3``. Version ranges are *inclusive*, so
``@:3`` means "up to and including any 3.x version".
Notice how the ``:`` now appears before the version number both in the
dependency and in the ``when`` clause. This tells Spack that in general
we need Python 3.13 or older up to version ``1.12.x``, and up to version
``1.10.x`` we need Python 3.12 or older. Said differently, forward compatibility
with Python 3.13 was added in version 1.11, while version 1.13 added forward
compatibility with Python 3.14.
You can also simply write
Notice that a version range ``@:3.12`` includes *any* patch version
number ``3.12.x``, which is often useful when specifying forward compatibility
bounds.
So far we have seen open-ended version ranges, which is by far the most
common use case. It is also possible to specify both a lower and an upper bound
on the version of a dependency, like this:
.. code-block:: python
depends_on("python@2.7")
depends_on("python@3.10:3.12")
to tell Spack that the package needs Python 2.7.x. This is equivalent to
``@2.7:2.7``.
There is a short syntax to specify that a package is compatible with, say, any
``3.x`` version:
.. code-block:: python
depends_on("python@3")
The above is equivalent to ``depends_on("python@3:3")``, which means at least
Python version 3 and at most any version ``3.x.y``.
In very rare cases, you may need to specify an exact version, for example
if you need to distinguish between ``3.2`` and ``3.2.1``:

View File

@@ -0,0 +1,28 @@
.. Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
Spack Project Developers. See the top-level COPYRIGHT file for details.
SPDX-License-Identifier: (Apache-2.0 OR MIT)
.. _user-reports:
============
User Reports
============
Spack has an active user community which deploys Spack in a multitude of contexts.
Here we collect a growing list of reports by users on how they have applied Spack
in their context.
If you have a user report that you think merits inclusion, feel free to open a pull
request and add it to this list.
.. warning::
Disclaimer:
Spack is not responsible for and does not control the content of these external pages.
Spack makes no representations about their accuracy or applicability.
----
2024
----
- September 3, 2024: `Aidan Heerdegen: RRR: Reliability, Replicability, Reproducibility for climate models <https://www.youtube.com/watch?v=BVoVliqgx1U>`_ (ACCESS-NRI)

238
lib/spack/env/cc vendored
View File

@@ -101,10 +101,9 @@ setsep() {
esac
}
# prepend LISTNAME ELEMENT [SEP]
# prepend LISTNAME ELEMENT
#
# Prepend ELEMENT to the list stored in the variable LISTNAME,
# assuming the list is separated by SEP.
# Prepend ELEMENT to the list stored in the variable LISTNAME.
# Handles empty lists and single-element lists.
prepend() {
varname="$1"
@@ -238,6 +237,36 @@ esac
}
"
# path_list functions. Path_lists have 3 parts: spack_store_<list>, <list> and system_<list>,
# which are used to prioritize paths when assembling the final command line.
# init_path_lists LISTNAME
# Set <LISTNAME>, spack_store_<LISTNAME>, and system_<LISTNAME> to "".
init_path_lists() {
eval "spack_store_$1=\"\""
eval "$1=\"\""
eval "system_$1=\"\""
}
# assign_path_lists LISTNAME1 LISTNAME2
# Copy contents of LISTNAME2 into LISTNAME1, for each path_list prefix.
assign_path_lists() {
eval "spack_store_$1=\"\${spack_store_$2}\""
eval "$1=\"\${$2}\""
eval "system_$1=\"\${system_$2}\""
}
# append_path_lists LISTNAME ELT
# Append the provided ELT to the appropriate list, based on the result of path_order().
append_path_lists() {
path_order "$2"
case $? in
0) eval "append spack_store_$1 \"\$2\"" ;;
1) eval "append $1 \"\$2\"" ;;
2) eval "append system_$1 \"\$2\"" ;;
esac
}
# Check if optional parameters are defined
# If we aren't asking for debug flags, don't add them
if [ -z "${SPACK_ADD_DEBUG_FLAGS:-}" ]; then
@@ -470,12 +499,7 @@ input_command="$*"
parse_Wl() {
while [ $# -ne 0 ]; do
if [ "$wl_expect_rpath" = yes ]; then
path_order "$1"
case $? in
0) append return_spack_store_rpath_dirs_list "$1" ;;
1) append return_rpath_dirs_list "$1" ;;
2) append return_system_rpath_dirs_list "$1" ;;
esac
append_path_lists return_rpath_dirs_list "$1"
wl_expect_rpath=no
else
case "$1" in
@@ -484,24 +508,14 @@ parse_Wl() {
if [ -z "$arg" ]; then
shift; continue
fi
path_order "$arg"
case $? in
0) append return_spack_store_rpath_dirs_list "$arg" ;;
1) append return_rpath_dirs_list "$arg" ;;
2) append return_system_rpath_dirs_list "$arg" ;;
esac
append_path_lists return_rpath_dirs_list "$arg"
;;
--rpath=*)
arg="${1#--rpath=}"
if [ -z "$arg" ]; then
shift; continue
fi
path_order "$arg"
case $? in
0) append return_spack_store_rpath_dirs_list "$arg" ;;
1) append return_rpath_dirs_list "$arg" ;;
2) append return_system_rpath_dirs_list "$arg" ;;
esac
append_path_lists return_rpath_dirs_list "$arg"
;;
-rpath|--rpath)
wl_expect_rpath=yes
@@ -509,8 +523,7 @@ parse_Wl() {
"$dtags_to_strip")
;;
-Wl)
# Nested -Wl,-Wl means we're in NAG compiler territory, we don't support
# it.
# Nested -Wl,-Wl means we're in NAG compiler territory. We don't support it.
return 1
;;
*)
@@ -529,21 +542,10 @@ categorize_arguments() {
return_other_args_list=""
return_isystem_was_used=""
return_isystem_spack_store_include_dirs_list=""
return_isystem_system_include_dirs_list=""
return_isystem_include_dirs_list=""
return_spack_store_include_dirs_list=""
return_system_include_dirs_list=""
return_include_dirs_list=""
return_spack_store_lib_dirs_list=""
return_system_lib_dirs_list=""
return_lib_dirs_list=""
return_spack_store_rpath_dirs_list=""
return_system_rpath_dirs_list=""
return_rpath_dirs_list=""
init_path_lists return_isystem_include_dirs_list
init_path_lists return_include_dirs_list
init_path_lists return_lib_dirs_list
init_path_lists return_rpath_dirs_list
# Global state for keeping track of -Wl,-rpath -Wl,/path
wl_expect_rpath=no
@@ -609,32 +611,17 @@ categorize_arguments() {
arg="${1#-isystem}"
return_isystem_was_used=true
if [ -z "$arg" ]; then shift; arg="$1"; fi
path_order "$arg"
case $? in
0) append return_isystem_spack_store_include_dirs_list "$arg" ;;
1) append return_isystem_include_dirs_list "$arg" ;;
2) append return_isystem_system_include_dirs_list "$arg" ;;
esac
append_path_lists return_isystem_include_dirs_list "$arg"
;;
-I*)
arg="${1#-I}"
if [ -z "$arg" ]; then shift; arg="$1"; fi
path_order "$arg"
case $? in
0) append return_spack_store_include_dirs_list "$arg" ;;
1) append return_include_dirs_list "$arg" ;;
2) append return_system_include_dirs_list "$arg" ;;
esac
append_path_lists return_include_dirs_list "$arg"
;;
-L*)
arg="${1#-L}"
if [ -z "$arg" ]; then shift; arg="$1"; fi
path_order "$arg"
case $? in
0) append return_spack_store_lib_dirs_list "$arg" ;;
1) append return_lib_dirs_list "$arg" ;;
2) append return_system_lib_dirs_list "$arg" ;;
esac
append_path_lists return_lib_dirs_list "$arg"
;;
-l*)
# -loopopt=0 is generated erroneously in autoconf <= 2.69,
@@ -667,32 +654,17 @@ categorize_arguments() {
break
elif [ "$xlinker_expect_rpath" = yes ]; then
# Register the path of -Xlinker -rpath <other args> -Xlinker <path>
path_order "$1"
case $? in
0) append return_spack_store_rpath_dirs_list "$1" ;;
1) append return_rpath_dirs_list "$1" ;;
2) append return_system_rpath_dirs_list "$1" ;;
esac
append_path_lists return_rpath_dirs_list "$1"
xlinker_expect_rpath=no
else
case "$1" in
-rpath=*)
arg="${1#-rpath=}"
path_order "$arg"
case $? in
0) append return_spack_store_rpath_dirs_list "$arg" ;;
1) append return_rpath_dirs_list "$arg" ;;
2) append return_system_rpath_dirs_list "$arg" ;;
esac
append_path_lists return_rpath_dirs_list "$arg"
;;
--rpath=*)
arg="${1#--rpath=}"
path_order "$arg"
case $? in
0) append return_spack_store_rpath_dirs_list "$arg" ;;
1) append return_rpath_dirs_list "$arg" ;;
2) append return_system_rpath_dirs_list "$arg" ;;
esac
append_path_lists return_rpath_dirs_list "$arg"
;;
-rpath|--rpath)
xlinker_expect_rpath=yes
@@ -709,7 +681,36 @@ categorize_arguments() {
"$dtags_to_strip")
;;
*)
append return_other_args_list "$1"
# if mode is not ld, we can just add to other args
if [ "$mode" != "ld" ]; then
append return_other_args_list "$1"
shift
continue
fi
# if we're in linker mode, we need to parse raw RPATH args
case "$1" in
-rpath=*)
arg="${1#-rpath=}"
append_path_lists return_rpath_dirs_list "$arg"
;;
--rpath=*)
arg="${1#--rpath=}"
append_path_lists return_rpath_dirs_list "$arg"
;;
-rpath|--rpath)
if [ $# -eq 1 ]; then
# -rpath without value: let the linker raise an error.
append return_other_args_list "$1"
break
fi
shift
append_path_lists return_rpath_dirs_list "$1"
;;
*)
append return_other_args_list "$1"
;;
esac
;;
esac
shift
@@ -731,21 +732,10 @@ categorize_arguments() {
categorize_arguments "$@"
spack_store_include_dirs_list="$return_spack_store_include_dirs_list"
system_include_dirs_list="$return_system_include_dirs_list"
include_dirs_list="$return_include_dirs_list"
spack_store_lib_dirs_list="$return_spack_store_lib_dirs_list"
system_lib_dirs_list="$return_system_lib_dirs_list"
lib_dirs_list="$return_lib_dirs_list"
spack_store_rpath_dirs_list="$return_spack_store_rpath_dirs_list"
system_rpath_dirs_list="$return_system_rpath_dirs_list"
rpath_dirs_list="$return_rpath_dirs_list"
isystem_spack_store_include_dirs_list="$return_isystem_spack_store_include_dirs_list"
isystem_system_include_dirs_list="$return_isystem_system_include_dirs_list"
isystem_include_dirs_list="$return_isystem_include_dirs_list"
assign_path_lists isystem_include_dirs_list return_isystem_include_dirs_list
assign_path_lists include_dirs_list return_include_dirs_list
assign_path_lists lib_dirs_list return_lib_dirs_list
assign_path_lists rpath_dirs_list return_rpath_dirs_list
isystem_was_used="$return_isystem_was_used"
other_args_list="$return_other_args_list"
@@ -821,21 +811,10 @@ IFS="$lsep"
categorize_arguments $spack_flags_list
unset IFS
spack_flags_isystem_spack_store_include_dirs_list="$return_isystem_spack_store_include_dirs_list"
spack_flags_isystem_system_include_dirs_list="$return_isystem_system_include_dirs_list"
spack_flags_isystem_include_dirs_list="$return_isystem_include_dirs_list"
spack_flags_spack_store_include_dirs_list="$return_spack_store_include_dirs_list"
spack_flags_system_include_dirs_list="$return_system_include_dirs_list"
spack_flags_include_dirs_list="$return_include_dirs_list"
spack_flags_spack_store_lib_dirs_list="$return_spack_store_lib_dirs_list"
spack_flags_system_lib_dirs_list="$return_system_lib_dirs_list"
spack_flags_lib_dirs_list="$return_lib_dirs_list"
spack_flags_spack_store_rpath_dirs_list="$return_spack_store_rpath_dirs_list"
spack_flags_system_rpath_dirs_list="$return_system_rpath_dirs_list"
spack_flags_rpath_dirs_list="$return_rpath_dirs_list"
assign_path_lists spack_flags_isystem_include_dirs_list return_isystem_include_dirs_list
assign_path_lists spack_flags_include_dirs_list return_include_dirs_list
assign_path_lists spack_flags_lib_dirs_list return_lib_dirs_list
assign_path_lists spack_flags_rpath_dirs_list return_rpath_dirs_list
spack_flags_isystem_was_used="$return_isystem_was_used"
spack_flags_other_args_list="$return_other_args_list"
@@ -894,7 +873,7 @@ esac
case "$mode" in
cpp|cc|as|ccld)
if [ "$spack_flags_isystem_was_used" = "true" ] || [ "$isystem_was_used" = "true" ]; then
extend isystem_spack_store_include_dirs_list SPACK_STORE_INCLUDE_DIRS
extend spack_store_isystem_include_dirs_list SPACK_STORE_INCLUDE_DIRS
extend isystem_include_dirs_list SPACK_INCLUDE_DIRS
else
extend spack_store_include_dirs_list SPACK_STORE_INCLUDE_DIRS
@@ -910,64 +889,63 @@ args_list="$flags_list"
# Include search paths partitioned by (in store, non-system, system)
# NOTE: adding ${lsep} to the prefix here turns every added element into two
extend args_list spack_flags_spack_store_include_dirs_list -I
extend args_list spack_store_spack_flags_include_dirs_list -I
extend args_list spack_store_include_dirs_list -I
extend args_list spack_flags_include_dirs_list -I
extend args_list include_dirs_list -I
extend args_list spack_flags_isystem_spack_store_include_dirs_list "-isystem${lsep}"
extend args_list isystem_spack_store_include_dirs_list "-isystem${lsep}"
extend args_list spack_store_spack_flags_isystem_include_dirs_list "-isystem${lsep}"
extend args_list spack_store_isystem_include_dirs_list "-isystem${lsep}"
extend args_list spack_flags_isystem_include_dirs_list "-isystem${lsep}"
extend args_list isystem_include_dirs_list "-isystem${lsep}"
extend args_list spack_flags_system_include_dirs_list -I
extend args_list system_spack_flags_include_dirs_list -I
extend args_list system_include_dirs_list -I
extend args_list spack_flags_isystem_system_include_dirs_list "-isystem${lsep}"
extend args_list isystem_system_include_dirs_list "-isystem${lsep}"
extend args_list system_spack_flags_isystem_include_dirs_list "-isystem${lsep}"
extend args_list system_isystem_include_dirs_list "-isystem${lsep}"
# Library search paths partitioned by (in store, non-system, system)
extend args_list spack_flags_spack_store_lib_dirs_list "-L"
extend args_list spack_store_spack_flags_lib_dirs_list "-L"
extend args_list spack_store_lib_dirs_list "-L"
extend args_list spack_flags_lib_dirs_list "-L"
extend args_list lib_dirs_list "-L"
extend args_list spack_flags_system_lib_dirs_list "-L"
extend args_list system_spack_flags_lib_dirs_list "-L"
extend args_list system_lib_dirs_list "-L"
# RPATHs arguments
rpath_prefix=""
case "$mode" in
ccld)
if [ -n "$dtags_to_add" ] ; then
append args_list "$linker_arg$dtags_to_add"
fi
extend args_list spack_flags_spack_store_rpath_dirs_list "$rpath"
extend args_list spack_store_rpath_dirs_list "$rpath"
extend args_list spack_flags_rpath_dirs_list "$rpath"
extend args_list rpath_dirs_list "$rpath"
extend args_list spack_flags_system_rpath_dirs_list "$rpath"
extend args_list system_rpath_dirs_list "$rpath"
rpath_prefix="$rpath"
;;
ld)
if [ -n "$dtags_to_add" ] ; then
append args_list "$dtags_to_add"
fi
extend args_list spack_flags_spack_store_rpath_dirs_list "-rpath${lsep}"
extend args_list spack_store_rpath_dirs_list "-rpath${lsep}"
extend args_list spack_flags_rpath_dirs_list "-rpath${lsep}"
extend args_list rpath_dirs_list "-rpath${lsep}"
extend args_list spack_flags_system_rpath_dirs_list "-rpath${lsep}"
extend args_list system_rpath_dirs_list "-rpath${lsep}"
rpath_prefix="-rpath${lsep}"
;;
esac
# if mode is ccld or ld, extend RPATH lists with the prefix determined above
if [ -n "$rpath_prefix" ]; then
extend args_list spack_store_spack_flags_rpath_dirs_list "$rpath_prefix"
extend args_list spack_store_rpath_dirs_list "$rpath_prefix"
extend args_list spack_flags_rpath_dirs_list "$rpath_prefix"
extend args_list rpath_dirs_list "$rpath_prefix"
extend args_list system_spack_flags_rpath_dirs_list "$rpath_prefix"
extend args_list system_rpath_dirs_list "$rpath_prefix"
fi
# Other arguments from the input command
extend args_list other_args_list
extend args_list spack_flags_other_args_list

View File

@@ -20,11 +20,11 @@
import tempfile
from contextlib import contextmanager
from itertools import accumulate
from typing import Callable, Iterable, List, Match, Optional, Tuple, Union
from typing import Callable, Deque, Dict, Iterable, List, Match, Optional, Set, Tuple, Union
import llnl.util.symlink
from llnl.util import tty
from llnl.util.lang import dedupe, memoized
from llnl.util.lang import dedupe, fnmatch_translate_multiple, memoized
from llnl.util.symlink import islink, readlink, resolve_link_target_relative_to_the_link, symlink
from ..path import path_to_os_path, system_path_filter
@@ -1673,28 +1673,40 @@ def find_first(root: str, files: Union[Iterable[str], str], bfs_depth: int = 2)
return FindFirstFile(root, *files, bfs_depth=bfs_depth).find()
def find(root, files, recursive=True):
"""Search for ``files`` starting from the ``root`` directory.
Like GNU/BSD find but written entirely in Python.
def find(
root: Union[str, List[str]],
files: Union[str, List[str]],
recursive: bool = True,
max_depth: Optional[int] = None,
) -> List[str]:
"""Finds all non-directory files matching the filename patterns from ``files`` starting from
``root``. This function returns a deterministic result for the same input and directory
structure when run multiple times. Symlinked directories are followed, and unique directories
are searched only once. Each matching file is returned only once at lowest depth in case
multiple paths exist due to symlinked directories. The function has similarities to the Unix
``find`` utility.
Examples:
.. code-block:: console
$ find /usr -name python
$ find -L /usr -name python3 -type f
is equivalent to:
is roughly equivalent to
>>> find('/usr', 'python')
>>> find("/usr", "python3")
with the notable difference that this function only lists a single path to each file in case of
symlinked directories.
.. code-block:: console
$ find /usr/local/bin -maxdepth 1 -name python
$ find -L /usr/local/bin /usr/local/sbin -maxdepth 1 '(' -name python3 -o -name getcap \\
')' -type f
is equivalent to:
is roughly equivalent to:
>>> find('/usr/local/bin', 'python', recursive=False)
>>> find(["/usr/local/bin", "/usr/local/sbin"], ["python3", "getcap"], recursive=False)
Accepts any glob characters accepted by fnmatch:
@@ -1708,70 +1720,116 @@ def find(root, files, recursive=True):
========== ====================================
Parameters:
root (str): The root directory to start searching from
files (str or collections.abc.Sequence): Library name(s) to search for
recursive (bool): if False search only root folder,
if True descends top-down from the root. Defaults to True.
root: One or more root directories to start searching from
files: One or more filename patterns to search for
recursive: if False search only root, if True descends from roots. Defaults to True.
max_depth: if set, don't search below this depth. Cannot be set if recursive is False
Returns:
list: The files that have been found
Returns a list of absolute, matching file paths.
"""
if isinstance(files, str):
if not isinstance(root, list):
root = [root]
if not isinstance(files, list):
files = [files]
if recursive:
tty.debug(f"Find (recursive): {root} {str(files)}")
result = _find_recursive(root, files)
else:
tty.debug(f"Find (not recursive): {root} {str(files)}")
result = _find_non_recursive(root, files)
# If recursive is false, max_depth can only be None or 0
if max_depth and not recursive:
raise ValueError(f"max_depth ({max_depth}) cannot be set if recursive is False")
tty.debug(f"Find complete: {root} {str(files)}")
if not recursive:
max_depth = 0
elif max_depth is None:
max_depth = sys.maxsize
tty.debug(f"Find (max depth = {max_depth}): {root} {files}")
result = _find_max_depth(root, files, max_depth)
tty.debug(f"Find complete: {root} {files}")
return result
@system_path_filter
def _find_recursive(root, search_files):
# The variable here is **on purpose** a defaultdict. The idea is that
# we want to poke the filesystem as little as possible, but still maintain
# stability in the order of the answer. Thus we are recording each library
# found in a key, and reconstructing the stable order later.
found_files = collections.defaultdict(list)
# Make the path absolute to have os.walk also return an absolute path
root = os.path.abspath(root)
for path, _, list_files in os.walk(root):
for search_file in search_files:
matches = glob.glob(os.path.join(path, search_file))
matches = [os.path.join(path, x) for x in matches]
found_files[search_file].extend(matches)
answer = []
for search_file in search_files:
answer.extend(found_files[search_file])
return answer
def _log_file_access_issue(e: OSError, path: str) -> None:
errno_name = errno.errorcode.get(e.errno, "UNKNOWN")
tty.debug(f"find must skip {path}: {errno_name} {e}")
@system_path_filter
def _find_non_recursive(root, search_files):
# The variable here is **on purpose** a defaultdict as os.list_dir
# can return files in any order (does not preserve stability)
found_files = collections.defaultdict(list)
def _dir_id(s: os.stat_result) -> Tuple[int, int]:
# Note: on windows, st_ino is the file index and st_dev is the volume serial number. See
# https://github.com/python/cpython/blob/3.9/Python/fileutils.c
return (s.st_ino, s.st_dev)
# Make the path absolute to have absolute path returned
root = os.path.abspath(root)
for search_file in search_files:
matches = glob.glob(os.path.join(root, search_file))
matches = [os.path.join(root, x) for x in matches]
found_files[search_file].extend(matches)
def _find_max_depth(roots: List[str], globs: List[str], max_depth: int = sys.maxsize) -> List[str]:
"""See ``find`` for the public API."""
# Apply normcase to file patterns and filenames to respect case insensitive filesystems
regex, groups = fnmatch_translate_multiple([os.path.normcase(x) for x in globs])
# Ordered dictionary that keeps track of the files found for each pattern
capture_group_to_paths: Dict[str, List[str]] = {group: [] for group in groups}
# Ensure returned paths are always absolute
roots = [os.path.abspath(r) for r in roots]
# Breadth-first search queue. Each element is a tuple of (depth, directory)
dir_queue: Deque[Tuple[int, str]] = collections.deque()
# Set of visited directories. Each element is a tuple of (inode, device)
visited_dirs: Set[Tuple[int, int]] = set()
answer = []
for search_file in search_files:
answer.extend(found_files[search_file])
for root in roots:
try:
stat_root = os.stat(root)
except OSError as e:
_log_file_access_issue(e, root)
continue
dir_id = _dir_id(stat_root)
if dir_id not in visited_dirs:
dir_queue.appendleft((0, root))
visited_dirs.add(dir_id)
return answer
while dir_queue:
depth, next_dir = dir_queue.pop()
try:
dir_iter = os.scandir(next_dir)
except OSError as e:
_log_file_access_issue(e, next_dir)
continue
with dir_iter:
ordered_entries = sorted(dir_iter, key=lambda x: x.name)
for dir_entry in ordered_entries:
try:
it_is_a_dir = dir_entry.is_dir(follow_symlinks=True)
except OSError as e:
# Possible permission issue, or a symlink that cannot be resolved (ELOOP).
_log_file_access_issue(e, dir_entry.path)
continue
if it_is_a_dir and depth < max_depth:
try:
# The stat should be performed in a try/except block. We repeat that here
# vs. moving to the above block because we only want to call `stat` if we
# haven't exceeded our max_depth
if sys.platform == "win32":
# Note: st_ino/st_dev on DirEntry.stat are not set on Windows, so we
# have to call os.stat
stat_info = os.stat(dir_entry.path, follow_symlinks=True)
else:
stat_info = dir_entry.stat(follow_symlinks=True)
except OSError as e:
_log_file_access_issue(e, dir_entry.path)
continue
dir_id = _dir_id(stat_info)
if dir_id not in visited_dirs:
dir_queue.appendleft((depth + 1, dir_entry.path))
visited_dirs.add(dir_id)
else:
m = regex.match(os.path.normcase(os.path.basename(dir_entry.path)))
if not m:
continue
for group in capture_group_to_paths:
if m.group(group):
capture_group_to_paths[group].append(dir_entry.path)
break
return [path for paths in capture_group_to_paths.values() for path in paths]
# Utilities for libraries and headers
@@ -2210,7 +2268,9 @@ def find_system_libraries(libraries, shared=True):
return libraries_found
def find_libraries(libraries, root, shared=True, recursive=False, runtime=True):
def find_libraries(
libraries, root, shared=True, recursive=False, runtime=True, max_depth: Optional[int] = None
):
"""Returns an iterable of full paths to libraries found in a root dir.
Accepts any glob characters accepted by fnmatch:
@@ -2231,6 +2291,8 @@ def find_libraries(libraries, root, shared=True, recursive=False, runtime=True):
otherwise for static. Defaults to True.
recursive (bool): if False search only root folder,
if True descends top-down from the root. Defaults to False.
max_depth (int): if set, don't search below this depth. Cannot be set
if recursive is False
runtime (bool): Windows only option, no-op elsewhere. If true,
search for runtime shared libs (.DLL), otherwise, search
for .Lib files. If shared is false, this has no meaning.
@@ -2239,6 +2301,7 @@ def find_libraries(libraries, root, shared=True, recursive=False, runtime=True):
Returns:
LibraryList: The libraries that have been found
"""
if isinstance(libraries, str):
libraries = [libraries]
elif not isinstance(libraries, collections.abc.Sequence):
@@ -2271,8 +2334,10 @@ def find_libraries(libraries, root, shared=True, recursive=False, runtime=True):
libraries = ["{0}.{1}".format(lib, suffix) for lib in libraries for suffix in suffixes]
if not recursive:
if max_depth:
raise ValueError(f"max_depth ({max_depth}) cannot be set if recursive is False")
# If not recursive, look for the libraries directly in root
return LibraryList(find(root, libraries, False))
return LibraryList(find(root, libraries, recursive=False))
# To speedup the search for external packages configured e.g. in /usr,
# perform first non-recursive search in root/lib then in root/lib64 and
@@ -2290,7 +2355,7 @@ def find_libraries(libraries, root, shared=True, recursive=False, runtime=True):
if found_libs:
break
else:
found_libs = find(root, libraries, True)
found_libs = find(root, libraries, recursive=True, max_depth=max_depth)
return LibraryList(found_libs)

View File

@@ -5,12 +5,14 @@
import collections.abc
import contextlib
import fnmatch
import functools
import itertools
import os
import re
import sys
import traceback
import typing
import warnings
from datetime import datetime, timedelta
from typing import Callable, Iterable, List, Tuple, TypeVar
@@ -859,6 +861,32 @@ def elide_list(line_list: List[str], max_num: int = 10) -> List[str]:
return line_list
if sys.version_info >= (3, 9):
PatternStr = re.Pattern[str]
else:
PatternStr = typing.Pattern[str]
def fnmatch_translate_multiple(patterns: List[str]) -> Tuple[PatternStr, List[str]]:
"""Same as fnmatch.translate, but creates a single regex of the form
``(?P<pattern0>...)|(?P<pattern1>...)|...`` for each pattern in the iterable, where
``patternN`` is a named capture group that matches the corresponding pattern translated by
``fnmatch.translate``. This can be used to match multiple patterns in a single pass. No case
normalization is performed on the patterns.
Args:
patterns: list of fnmatch patterns
Returns:
Tuple of the combined regex and the list of named capture groups corresponding to each
pattern in the input list.
"""
groups = [f"pattern{i}" for i in range(len(patterns))]
regexes = (fnmatch.translate(p) for p in patterns)
combined = re.compile("|".join(f"(?P<{g}>{r})" for g, r in zip(groups, regexes)))
return combined, groups
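# Usage sketch (assuming the function as defined above): the named group
# that matched tells you which input pattern fired.
#
#   regex, groups = fnmatch_translate_multiple(["*.h", "*.hpp"])
#   m = regex.match("util.hpp")
#   assert m is not None and m.group("pattern1") is not None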
@contextlib.contextmanager
def nullcontext(*args, **kwargs):
"""Empty context manager.

View File

@@ -37,7 +37,6 @@
import spack.error
import spack.main
import spack.mirror
import spack.package_base
import spack.paths
import spack.repo
import spack.spec
@@ -265,22 +264,14 @@ def _format_job_needs(dep_jobs, build_group, prune_dag, rebuild_decisions):
def get_change_revisions():
"""If this is a git repo get the revisions to use when checking
for changed packages and spack core modules."""
rev1 = None
rev2 = None
# Note: git_dir may be a file in a worktree. If it exists, attempt to use git
# to determine if there is a revision
git_dir = os.path.join(spack.paths.prefix, ".git")
if os.path.exists(git_dir):
# The default will only find changed packages from the last
# commit. When the commit is a merge commit, this is will return all of the
# changes on the topic.
# TODO: Handle the case where the clone is not shallow clone of a merge commit
# using `git merge-base`
rev1 = "HEAD^"
rev2 = "HEAD"
return rev1, rev2
if os.path.exists(git_dir) and os.path.isdir(git_dir):
# TODO: This will only find changed packages from the last
# TODO: commit. While this may work for single merge commits
# TODO: when merging the topic branch into the base, it will
# TODO: require more thought outside of that narrow case.
return "HEAD^", "HEAD"
return None, None
def get_stack_changed(env_path, rev1="HEAD^", rev2="HEAD"):
@@ -399,7 +390,7 @@ class SpackCI:
used by the CI generator(s).
"""
def __init__(self, ci_config, spec_labels=None, stages=None):
def __init__(self, ci_config, spec_labels, stages):
"""Given the information from the ci section of the config
and the staged jobs, set up meta data needed for generating Spack
CI IR.
@@ -417,9 +408,8 @@ def __init__(self, ci_config, spec_labels=None, stages=None):
}
jobs = self.ir["jobs"]
if spec_labels and stages:
for spec, dag_hash in _build_jobs(spec_labels, stages):
jobs[dag_hash] = self.__init_job(spec)
for spec, dag_hash in _build_jobs(spec_labels, stages):
jobs[dag_hash] = self.__init_job(spec)
for name in self.named_jobs:
# Skip the special named jobs
@@ -715,53 +705,14 @@ def generate_gitlab_ci_yaml(
files (spack.yaml, spack.lock), etc should be written. GitLab
requires this to be within the project directory.
"""
rev1, rev2 = get_change_revisions()
tty.debug(f"Got following revisions: rev1={rev1}, rev2={rev2}")
# Get the joined "ci" config with all of the current scopes resolved
ci_config = cfg.get("ci")
spack_prune_untouched = os.environ.get("SPACK_PRUNE_UNTOUCHED", None)
changed = rev1 and rev2
affected_pkgs = None
if spack_prune_untouched and changed:
affected_pkgs = compute_affected_packages(rev1, rev2)
tty.debug("affected pkgs:")
if affected_pkgs:
for p in affected_pkgs:
tty.debug(f" {p}")
else:
tty.debug(" no affected packages...")
possible_builds = spack.package_base.possible_dependencies(*env.user_specs)
changed = any((spec in p for p in possible_builds.values()) for spec in affected_pkgs)
if not changed:
spack_ci = SpackCI(ci_config)
spack_ci_ir = spack_ci.generate_ir()
# No jobs should be generated.
noop_job = spack_ci_ir["jobs"]["noop"]["attributes"]
# If this job fails ignore the status and carry on
noop_job["retry"] = 0
noop_job["allow_failure"] = True
tty.msg("Skipping concretization, generating no-op job")
output_object = {"no-specs-to-rebuild": noop_job}
# Ensure the child pipeline always runs
output_object["workflow"] = {"rules": [{"when": "always"}]}
with open(output_file, "w") as f:
ruamel.yaml.YAML().dump(output_object, f)
return
with spack.concretize.disable_compiler_existence_check():
with env.write_transaction():
env.concretize()
env.write()
# Get the joined "ci" config with all of the current scopes resolved
ci_config = cfg.get("ci")
if not ci_config:
raise SpackCIError("Environment does not have a `ci` configuration")
@@ -786,13 +737,20 @@ def generate_gitlab_ci_yaml(
dependent_depth = None
prune_untouched_packages = False
spack_prune_untouched = os.environ.get("SPACK_PRUNE_UNTOUCHED", None)
if spack_prune_untouched is not None and spack_prune_untouched.lower() == "true":
# Requested to prune untouched packages, but assume we won't do that
# unless we're actually in a git repo.
if changed:
rev1, rev2 = get_change_revisions()
tty.debug(f"Got following revisions: rev1={rev1}, rev2={rev2}")
if rev1 and rev2:
# If the stack file itself did not change, proceed with pruning
if not get_stack_changed(env.manifest_path, rev1, rev2):
prune_untouched_packages = True
affected_pkgs = compute_affected_packages(rev1, rev2)
tty.debug("affected pkgs:")
for p in affected_pkgs:
tty.debug(f" {p}")
affected_specs = get_spec_filter_list(
env, affected_pkgs, dependent_traverse_depth=dependent_depth
)
@@ -1140,6 +1098,11 @@ def main_script_replacements(cmd):
# warn only if there was actually a CDash configuration.
tty.warn("Unable to populate buildgroup without CDash credentials")
service_job_retries = {
"max": 2,
"when": ["runner_system_failure", "stuck_or_timeout_failure", "script_failure"],
}
if copy_only_pipeline:
stage_names.append("copy")
sync_job = copy.deepcopy(spack_ci_ir["jobs"]["copy"]["attributes"])
@@ -1199,10 +1162,7 @@ def main_script_replacements(cmd):
)
final_job["when"] = "always"
final_job["retry"] = {
"max": 2,
"when": ["runner_system_failure", "stuck_or_timeout_failure", "script_failure"],
}
final_job["retry"] = service_job_retries
final_job["interruptible"] = True
final_job["dependencies"] = []


@@ -17,6 +17,7 @@
from llnl.util.tty.colify import colify
from llnl.util.tty.color import colorize
import spack.concretize
import spack.config # breaks a cycle.
import spack.environment as ev
import spack.error
@@ -194,7 +195,7 @@ def _concretize_spec_pairs(to_concretize, tests=False):
elif unify == "when_possible":
concretize_method = spack.concretize.concretize_together_when_possible
concretized = concretize_method(to_concretize, tests=tests)
return [concrete for _, concrete in concretized]


@@ -2,20 +2,20 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""
(DEPRECATED) Used to contain the code for the original concretizer
"""
"""High-level functions to concretize list of specs"""
import sys
import time
from contextlib import contextmanager
from itertools import chain
from typing import Iterable, Optional, Sequence, Tuple, Union
import llnl.util.tty as tty
import spack.compilers
import spack.config
import spack.error
import spack.repo
import spack.util.parallel
from spack.spec import ArchSpec, CompilerSpec, Spec
CHECK_COMPILER_EXISTENCE = True
@@ -36,92 +36,62 @@ def enable_compiler_existence_check():
CHECK_COMPILER_EXISTENCE = saved
def find_spec(spec, condition, default=None):
"""Searches the dag from spec in an intelligent order and looks
for a spec that matches a condition"""
# First search parents, then search children
deptype = ("build", "link")
dagiter = chain(
spec.traverse(direction="parents", deptype=deptype, root=False),
spec.traverse(direction="children", deptype=deptype, root=False),
)
visited = set()
for relative in dagiter:
if condition(relative):
return relative
visited.add(id(relative))
# Then search all other relatives in the DAG *except* spec
for relative in spec.root.traverse(deptype="all"):
if relative is spec:
continue
if id(relative) in visited:
continue
if condition(relative):
return relative
# Finally search spec itself.
if condition(spec):
return spec
return default # Nothing matched the condition; return default.
SpecPair = Tuple[Spec, Spec]
SpecLike = Union[Spec, str]
TestsType = Union[bool, Iterable[str]]
def concretize_specs_together(
abstract_specs: Sequence[SpecLike], tests: TestsType = False
) -> Sequence[Spec]:
"""Given a number of specs as input, tries to concretize them together.
Args:
tests (bool or list or set): False to run no tests, True to test
all packages, or a list of package names to run tests for some
*abstract_specs: abstract specs to be concretized, given either
as Specs or strings
Returns:
List of concretized specs
abstract_specs: abstract specs to be concretized
tests: list of package names for which to consider tests dependencies. If True, all nodes
will have test dependencies. If False, test dependencies will be disregarded.
"""
import spack.solver.asp
allow_deprecated = spack.config.get("config:deprecated", False)
solver = spack.solver.asp.Solver()
result = solver.solve(abstract_specs, tests=tests, allow_deprecated=allow_deprecated)
return [s.copy() for s in result.specs]
def concretize_together(
spec_list: Sequence[SpecPair], tests: TestsType = False
) -> Sequence[SpecPair]:
"""Given a number of specs as input, tries to concretize them together.
Args:
tests (bool or list or set): False to run no tests, True to test
all packages, or a list of package names to run tests for some
*spec_list: list of tuples to concretize. First entry is abstract spec, second entry is
spec_list: list of tuples to concretize. First entry is abstract spec, second entry is
already concrete spec or None if not yet concretized
Returns:
List of tuples of abstract and concretized specs
tests: list of package names for which to consider tests dependencies. If True, all nodes
will have test dependencies. If False, test dependencies will be disregarded.
"""
to_concretize = [concrete if concrete else abstract for abstract, concrete in spec_list]
abstract_specs = [abstract for abstract, _ in spec_list]
concrete_specs = concretize_specs_together(to_concretize, tests=tests)
return list(zip(abstract_specs, concrete_specs))
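The call lines above capture the signature change running through this file: the concretization helpers now take the list of spec pairs as a single positional argument instead of unpacked varargs. A minimal usage sketch, not part of the diff (the spec names are illustrative):

import spack.concretize
from spack.spec import Spec

# Each entry pairs an abstract spec with its concrete form (None if not yet concretized)
pairs = [(Spec("zlib"), None), (Spec("hdf5+mpi"), None)]

# Old style: concretize_together(*pairs, tests=False); new style passes the list itself
for abstract, concrete in spack.concretize.concretize_together(pairs, tests=False):
    print(abstract, "->", concrete)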
def concretize_together_when_possible(
spec_list: Sequence[SpecPair], tests: TestsType = False
) -> Sequence[SpecPair]:
"""Given a number of specs as input, tries to concretize them together to the extent possible.
See documentation for ``unify: when_possible`` concretization for the precise definition of
"to the extent possible".
Args:
spec_list: list of tuples to concretize. First entry is abstract spec, second entry is
already concrete spec or None if not yet concretized
tests: list of package names for which to consider tests dependencies. If True, all nodes
will have test dependencies. If False, test dependencies will be disregarded.
Returns:
List of tuples of abstract and concretized specs
"""
import spack.solver.asp
to_concretize = [concrete if concrete else abstract for abstract, concrete in spec_list]
old_concrete_to_abstract = {
concrete: abstract for (abstract, concrete) in spec_list if concrete
@@ -131,7 +101,7 @@ def concretize_together_when_possible(*spec_list, **kwargs):
solver = spack.solver.asp.Solver()
allow_deprecated = spack.config.get("config:deprecated", False)
for result in solver.solve_in_rounds(
to_concretize, tests=tests, allow_deprecated=allow_deprecated
):
result_by_user_spec.update(result.specs_by_input)
@@ -143,19 +113,19 @@ def concretize_together_when_possible(*spec_list, **kwargs):
]
def concretize_separately(
spec_list: Sequence[SpecPair], tests: TestsType = False
) -> Sequence[SpecPair]:
"""Concretizes the input specs separately from each other.
Args:
spec_list: list of tuples to concretize. First entry is abstract spec, second entry is
already concrete spec or None if not yet concretized
tests: list of package names for which to consider tests dependencies. If True, all nodes
will have test dependencies. If False, test dependencies will be disregarded.
Returns:
List of tuples of abstract and concretized specs
"""
import spack.bootstrap
to_concretize = [abstract for abstract, concrete in spec_list if not concrete]
args = [
(i, str(abstract), tests)
@@ -192,11 +162,7 @@ def concretize_separately(*spec_list, **kwargs):
for j, (i, concrete, duration) in enumerate(
spack.util.parallel.imap_unordered(
_concretize_task, args, processes=num_procs, debug=tty.is_debug(), maxtaskperchild=1
)
):
ret.append((i, concrete))
@@ -215,7 +181,7 @@ def concretize_separately(*spec_list, **kwargs):
]
def _concretize_task(packed_arguments: Tuple[int, str, TestsType]) -> Tuple[int, Spec, float]:
index, spec_str, tests = packed_arguments
with tty.SuppressOutput(msg_enabled=False):
start = time.time()
@@ -227,10 +193,10 @@ class UnavailableCompilerVersionError(spack.error.SpackError):
"""Raised when there is no available compiler that satisfies a
compiler spec."""
def __init__(self, compiler_spec: CompilerSpec, arch: Optional[ArchSpec] = None) -> None:
err_msg = f"No compilers with spec {compiler_spec} found"
if arch:
err_msg += " for operating system {0} and target {1}.".format(arch.os, arch.target)
err_msg += f" for operating system {arch.os} and target {arch.target}."
super().__init__(
err_msg,


@@ -427,6 +427,10 @@ def __init__(self, *scopes: ConfigScope) -> None:
self.push_scope(scope)
self.format_updates: Dict[str, List[ConfigScope]] = collections.defaultdict(list)
def ensure_unwrapped(self) -> "Configuration":
"""Ensure we unwrap this object from any dynamic wrapper (like Singleton)"""
return self
@_config_mutator
def push_scope(self, scope: ConfigScope) -> None:
"""Add a higher precedence scope to the Configuration."""
@@ -752,10 +756,6 @@ def override(
assert scope is overrides
#: configuration scopes added on the command line set by ``spack.main.main()``
COMMAND_LINE_SCOPES: List[str] = []
def _add_platform_scope(cfg: Configuration, name: str, path: str, writable: bool = True) -> None:
"""Add a platform-specific subdirectory for the current platform."""
platform = spack.platforms.host().name
@@ -860,13 +860,6 @@ def create() -> Configuration:
# Each scope can have per-platfom overrides in subdirectories
_add_platform_scope(cfg, name, path)
# add command-line scopes
_add_command_line_scopes(cfg, COMMAND_LINE_SCOPES)
# we make a special scope for spack commands so that they can
# override configuration options.
cfg.push_scope(InternalConfigScope("command_line"))
return cfg


@@ -14,7 +14,7 @@
import urllib.parse
import urllib.request
import warnings
from typing import Any, Dict, Iterable, List, Optional, Tuple, Union
from typing import Any, Dict, Iterable, List, Optional, Sequence, Tuple, Union
import llnl.util.filesystem as fs
import llnl.util.tty as tty
@@ -24,7 +24,6 @@
import spack
import spack.caches
import spack.compilers
import spack.concretize
import spack.config
import spack.deptypes as dt
@@ -43,7 +42,6 @@
import spack.util.environment
import spack.util.hash
import spack.util.lock as lk
import spack.util.parallel
import spack.util.path
import spack.util.spack_json as sjson
import spack.util.spack_yaml as syaml
@@ -55,7 +53,7 @@
from spack.spec_list import SpecList
from spack.util.path import substitute_path_variables
SpecPair = spack.concretize.SpecPair
#: environment variable used to indicate the active environment
spack_env_var = "SPACK_ENV"
@@ -1533,9 +1531,7 @@ def _get_specs_to_concretize(
]
return new_user_specs, kept_user_specs, specs_to_concretize
def _concretize_together_where_possible(self, tests: bool = False) -> Sequence[SpecPair]:
# Avoid cyclic dependency
import spack.solver.asp
@@ -1550,7 +1546,7 @@ def _concretize_together_where_possible(
ret = []
result = spack.concretize.concretize_together_when_possible(
specs_to_concretize, tests=tests
)
for abstract, concrete in result:
# Only add to the environment if it's from this environment (not included in)
@@ -1563,7 +1559,7 @@ def _concretize_together_where_possible(
return ret
def _concretize_together(self, tests: bool = False) -> Sequence[SpecPair]:
"""Concretization strategy that concretizes all the specs
in the same DAG.
"""
@@ -1577,8 +1573,8 @@ def _concretize_together(self, tests: bool = False) -> List[SpecPair]:
self.specs_by_hash = {}
try:
concretized_specs = spack.concretize.concretize_together(
specs_to_concretize, tests=tests
)
except spack.error.UnsatisfiableSpecError as e:
# "Enhance" the error message for multiple root specs, suggest a less strict
@@ -1627,7 +1623,7 @@ def _concretize_separately(self, tests=False):
to_concretize = [
(root, None) for root in self.user_specs if root not in old_concretized_user_specs
]
concretized_specs = spack.concretize.concretize_separately(to_concretize, tests=tests)
by_hash = {}
for abstract, concrete in concretized_specs:


@@ -911,13 +911,6 @@ def _main(argv=None):
# Make spack load / env activate work on macOS
restore_macos_dyld_vars()
# make spack.config aware of any command line configuration scopes
if args.config_scopes:
spack.config.COMMAND_LINE_SCOPES = args.config_scopes
# ensure options on spack command come before everything
setup_main_options(args)
# activate an environment if one was specified on the command line
env_format_error = None
if not args.no_env:
@@ -931,6 +924,12 @@ def _main(argv=None):
e.print_context()
env_format_error = e
# Push scopes from the command line last
if args.config_scopes:
spack.config._add_command_line_scopes(spack.config.CONFIG, args.config_scopes)
spack.config.CONFIG.push_scope(spack.config.InternalConfigScope("command_line"))
setup_main_options(args)
# ------------------------------------------------------------------------
# Things that require configuration should go below here
# ------------------------------------------------------------------------


@@ -27,7 +27,6 @@
import spack
import spack.binary_distribution
import spack.bootstrap.core
import spack.compilers
import spack.concretize
import spack.config
@@ -816,7 +815,7 @@ def solve(self, setup, specs, reuse=None, output=None, control=None, allow_depre
solve, and the internal statistics from clingo.
"""
# avoid circular import
import spack.bootstrap
import spack.bootstrap.core
output = output or DEFAULT_OUTPUT_CONFIGURATION
timer = spack.util.timer.Timer()
@@ -889,6 +888,7 @@ def on_model(model):
result.satisfiable = solve_result.satisfiable
if result.satisfiable:
timer.start("construct_specs")
# get the best model
builder = SpecBuilder(specs, hash_lookup=setup.reusable_and_possible)
min_cost, best_model = min(models)
@@ -913,7 +913,8 @@ def on_model(model):
# record the possible dependencies in the solve
result.possible_dependencies = setup.pkgs
timer.stop("construct_specs")
timer.stop()
elif cores:
result.control = self.control
result.cores.extend(cores)
@@ -2030,9 +2031,12 @@ def _spec_clauses(
for variant_def in variant_defs:
self.variant_values_from_specs.add((spec.name, id(variant_def), value))
if variant.propagate:
clauses.append(f.propagate(spec.name, fn.variant_value(vname, value)))
if self.pkg_class(spec.name).has_variant(vname):
clauses.append(f.variant_value(spec.name, vname, value))
else:
clauses.append(f.variant_value(spec.name, vname, value))
# compiler and compiler version
if spec.compiler:
@@ -4191,7 +4195,7 @@ def _check_input_and_extract_concrete_specs(specs):
spack.spec.Spec.ensure_valid_variants(s)
return reusable
def solve_with_stats(
self,
specs,
out=None,
@@ -4202,6 +4206,8 @@ def solve(
allow_deprecated=False,
):
"""
Concretize a set of specs and track the timing and statistics for the solve
Arguments:
specs (list): List of ``Spec`` objects to solve for.
out: Optionally write the generate ASP program to a file-like object.
@@ -4213,15 +4219,22 @@ def solve(
setup_only (bool): if True, stop after setup and don't solve (default False).
allow_deprecated (bool): allow deprecated version in the solve
"""
# Check upfront that the variants are admissible
specs = [s.lookup_hash() for s in specs]
reusable_specs = self._check_input_and_extract_concrete_specs(specs)
reusable_specs.extend(self.selector.reusable_specs(specs))
setup = SpackSolverSetup(tests=tests)
output = OutputConfiguration(timers=timers, stats=stats, out=out, setup_only=setup_only)
return self.driver.solve(
setup, specs, reuse=reusable_specs, output=output, allow_deprecated=allow_deprecated
)
def solve(self, specs, **kwargs):
"""
Convenience function for concretizing a set of specs and ignoring timing
and statistics. Uses the same kwargs as solve_with_stats.
"""
# Check upfront that the variants are admissible
result, _, _ = self.solve_with_stats(specs, **kwargs)
return result
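`solve` is now a thin convenience wrapper over the new `solve_with_stats` entry point, which exposes the timer and the clingo statistics that used to be discarded. A sketch of both call styles, assuming a working Spack installation (the input spec is illustrative):

import spack.solver.asp
from spack.spec import Spec

solver = spack.solver.asp.Solver()
specs = [Spec("zlib")]

# New entry point: returns the result plus the timer and internal clingo statistics
result, timer, statistics = solver.solve_with_stats(specs)

# Convenience wrapper with the same kwargs; timing and statistics are discarded
result = solver.solve(specs)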
def solve_in_rounds(


@@ -57,6 +57,12 @@
internal_error("provider with no virtual node").
:- provider(PackageNode, _), not attr("node", PackageNode),
internal_error("provider with no real node").
:- node_has_variant(PackageNode, _, _), not attr("node", PackageNode),
internal_error("node has variant for a non-node").
:- attr("variant_set", PackageNode, _, _), not attr("node", PackageNode),
internal_error("variant_set for a non-node").
:- variant_is_propagated(PackageNode, _), not attr("node", PackageNode),
internal_error("variant_is_propagated for a non-node").
:- attr("root", node(ID, PackageNode)), ID > min_dupe_id,
internal_error("root with a non-minimal duplicate ID").
@@ -575,7 +581,8 @@ attr("virtual_on_edge", PackageNode, ProviderNode, Virtual)
% or used somewhere
:- attr("virtual_node", node(_, Virtual)),
not attr("virtual_on_incoming_edges", _, Virtual),
not attr("virtual_root", node(_, Virtual)).
not attr("virtual_root", node(_, Virtual)),
internal_error("virtual node does not match incoming edge").
attr("virtual_on_incoming_edges", ProviderNode, Virtual)
:- attr("virtual_on_edge", _, ProviderNode, Virtual).
@@ -629,7 +636,8 @@ do_not_impose(EffectID, node(X, Package))
virtual_condition_holds(_, PossibleProvider, Virtual),
PossibleProvider != ProviderNode,
explicitly_requested_root(PossibleProvider),
not explicitly_requested_root(ProviderNode),
internal_error("If a root can provide a virtual, it must be the provider").
% A package cannot be the actual provider for a virtual if it does not
% fulfill the conditions to provide that virtual
@@ -772,7 +780,8 @@ required_provider(Provider, Virtual)
pkg_fact(Virtual, condition_effect(ConditionID, EffectID)),
imposed_constraint(EffectID, "node", Provider).
:- provider(node(Y, Package), node(X, Virtual)), required_provider(Provider, Virtual), Package != Provider,
internal_error("If a provider is required the concretizer must use it").
% TODO: the following choice rule allows the solver to add compiler
% flags if their only source is from a requirement. This is overly-specific
@@ -852,7 +861,8 @@ variant_defined(PackageNode, Name) :- variant_definition(PackageNode, Name, _).
% for two or more variant definitions, this prefers the last one defined.
:- node_has_variant(node(NodeID, Package), Name, SelectedVariantID),
variant_definition(node(NodeID, Package), Name, VariantID),
VariantID > SelectedVariantID,
internal_error("If the solver picks a variant descriptor it must use that variant descriptor").
% B: Associating applicable package rules with nodes
@@ -969,6 +979,7 @@ error(100, "{0} variant '{1}' cannot have values '{2}' and '{3}' as they come fr
:- attr("variant_set", node(ID, Package), Variant, Value),
not attr("variant_value", node(ID, Package), Variant, Value).
internal_error("If a variant is set to a value it must have that value").
% The rules below allow us to prefer default values for variants
% whenever possible. If a variant is set in a spec, or if it is
@@ -979,7 +990,7 @@ variant_not_default(node(ID, Package), Variant, Value)
% variants set explicitly on the CLI don't count as non-default
not attr("variant_set", node(ID, Package), Variant, Value),
% variant values forced by propagation don't count as non-default
not propagate(node(ID, Package), variant_value(Variant, Value, _)),
% variants set on externals that we could use don't count as non-default
% this makes spack prefer to use an external over rebuilding with the
% default configuration
@@ -991,7 +1002,7 @@ variant_default_not_used(node(ID, Package), Variant, Value)
:- variant_default_value(node(ID, Package), Variant, Value),
node_has_variant(node(ID, Package), Variant, _),
not attr("variant_value", node(ID, Package), Variant, Value),
not propagate(node(ID, Package), variant_value(Variant, _, _)),
attr("node", node(ID, Package)).
% The variant is set in an external spec
@@ -1036,10 +1047,14 @@ variant_single_value(PackageNode, Variant)
% Propagation semantics
%-----------------------------------------------------------------------------
non_default_propagation(variant_value(Name, Value)) :- attr("propagate", RootNode, variant_value(Name, Value)).
% Propagation roots have a corresponding attr("propagate", ...)
propagate(RootNode, PropagatedAttribute) :- attr("propagate", RootNode, PropagatedAttribute), not non_default_propagation(PropagatedAttribute).
propagate(RootNode, PropagatedAttribute, EdgeTypes) :- attr("propagate", RootNode, PropagatedAttribute, EdgeTypes).
% Special case variants, to inject the source node in the propagated attribute
propagate(RootNode, variant_value(Name, Value, RootNode)) :- attr("propagate", RootNode, variant_value(Name, Value)).
% Propagate an attribute along edges to child nodes
propagate(ChildNode, PropagatedAttribute) :-
@@ -1061,21 +1076,53 @@ propagate(ChildNode, PropagatedAttribute, edge_types(DepType1, DepType2)) :-
% If a variant is propagated, and can be accepted, set its value
attr("variant_selected", PackageNode, Variant, Value, VariantType, VariantID) :-
propagate(PackageNode, variant_value(Variant, Value, _)),
node_has_variant(PackageNode, Variant, VariantID),
variant_type(VariantID, VariantType),
variant_possible_value(PackageNode, Variant, Value).
% If a variant is propagated, we cannot have extraneous values
variant_is_propagated(PackageNode, Variant) :-
attr("variant_value", PackageNode, Variant, Value),
propagate(PackageNode, variant_value(Variant, Value, _)),
not attr("variant_set", PackageNode, Variant).
:- variant_is_propagated(PackageNode, Variant),
attr("variant_selected", PackageNode, Variant, Value, _, _),
not propagate(PackageNode, variant_value(Variant, Value, _)).
error(100, "{0} and {1} cannot both propagate variant '{2}' to the shared dependency: {3}",
Package1, Package2, Variant, Dependency) :-
% The variant is a singlevalued variant
variant_single_value(node(X, Package1), Variant),
% Dependency is trying to propagate Variant with different values and is not the source package
propagate(node(Z, Dependency), variant_value(Variant, Value1, node(X, Package1))),
propagate(node(Z, Dependency), variant_value(Variant, Value2, node(Y, Package2))),
% Package1 and Package2 and their values are different
Package1 > Package2, Value1 != Value2,
not propagate(node(Z, Dependency), variant_value(Variant, _, node(Z, Dependency))).
% Cannot propagate the same variant from two different packages if one is a dependency of the other
error(100, "{0} and {1} cannot both propagate variant '{2}'", Package1, Package2, Variant) :-
% The variant is a single-valued variant
variant_single_value(node(X, Package1), Variant),
% Package1 and Package2 and their values are different
Package1 != Package2, Value1 != Value2,
% Package2 is set to propagate the value from Package1
propagate(node(Y, Package2), variant_value(Variant, Value2, node(X, Package2))),
propagate(node(Y, Package2), variant_value(Variant, Value1, node(X, Package1))),
variant_is_propagated(node(Y, Package2), Variant).
% Cannot propagate a variant if a different value was set for it in a dependency
error(100, "Cannot propagate the variant '{0}' from the package: {1} because package: {2} is set to exclude it", Variant, Source, Package) :-
% Package has a Variant and Source is propagating Variant
attr("variant_set", node(X, Package), Variant, Value1),
% The packages and values are different
Source != Package, Value1 != Value2,
% The variant is a single-valued variant
variant_single_value(node(X, Package1), Variant),
% A different value is being propagated from somewhere else
propagate(node(X, Package), variant_value(Variant, Value2, node(Y, Source))).
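The extra source-node argument in these variant_value terms is what lets the solver tell apart propagation requests coming from different packages and reject contradictory ones. A spec-level sketch of the behavior the rules encode, using the ++/~~ propagation syntax (the failing case mirrors the tests added later in this diff):

import spack.error
from spack.spec import Spec

# Single source: ~shared propagates from the root to every dependency
Spec("hypre~~shared ^openblas").concretize()

# Two sources propagating opposite values of the same variant must fail
try:
    Spec("ascent +adios2 ++shared ^adios2 ~~shared").concretize()
except spack.error.UnsatisfiableSpecError:
    pass  # expected: conflicting propagation from the root and a direct dependency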
%----
% Flags


@@ -877,8 +877,9 @@ def constrain(self, other):
# Next, if any flags in other propagate, we force them to propagate in our case
shared = list(sorted(set(other[flag_type]) - extra_other))
for x, y in _shared_subset_pair_iterate(shared, sorted(self[flag_type])):
if y.propagate is True and x.propagate is False:
changed = True
y.propagate = False
# TODO: what happens if flag groups with a partial (but not complete)
# intersection specify different behaviors for flag propagation?
@@ -933,6 +934,7 @@ def _cmp_iter(self):
def flags():
for flag in v:
yield flag
yield flag.propagate
yield flags
@@ -3018,7 +3020,12 @@ def ensure_valid_variants(spec):
pkg_variants = pkg_cls.variant_names()
# reserved names are variants that may be set on any package
# but are not necessarily recorded by the package's class
propagate_variants = [name for name, variant in spec.variants.items() if variant.propagate]
not_existing = set(spec.variants) - (
set(pkg_variants) | set(vt.reserved_names) | set(propagate_variants)
)
if not_existing:
raise vt.UnknownVariantError(
f"No such variant {not_existing} for spec: '{spec}'", list(not_existing)
@@ -3045,6 +3052,10 @@ def constrain(self, other, deps=True):
raise spack.error.UnsatisfiableSpecError(self, other, "constrain a concrete spec")
other = self._autospec(other)
if other.concrete and other.satisfies(self):
self._dup(other)
return True
if other.abstract_hash:
if not self.abstract_hash or other.abstract_hash.startswith(self.abstract_hash):
self.abstract_hash = other.abstract_hash
@@ -4523,8 +4534,69 @@ def substitute(self, vspec):
# Set the item
super().__setitem__(vspec.name, vspec)
def partition_variants(self):
non_prop, prop = lang.stable_partition(self.values(), lambda x: not x.propagate)
# Just return the names
non_prop = [x.name for x in non_prop]
prop = [x.name for x in prop]
return non_prop, prop
def satisfies(self, other: "VariantMap") -> bool:
if self.spec.concrete:
return self._satisfies_when_self_concrete(other)
return self._satisfies_when_self_abstract(other)
def _satisfies_when_self_concrete(self, other: "VariantMap") -> bool:
non_propagating, propagating = other.partition_variants()
result = all(
name in self and self[name].satisfies(other[name]) for name in non_propagating
)
if not propagating:
return result
for node in self.spec.traverse():
if not all(
node.variants[name].satisfies(other[name])
for name in propagating
if name in node.variants
):
return False
return result
def _satisfies_when_self_abstract(self, other: "VariantMap") -> bool:
other_non_propagating, other_propagating = other.partition_variants()
self_non_propagating, self_propagating = self.partition_variants()
# First check variants without propagation set
result = all(
name in self_non_propagating
and (self[name].propagate or self[name].satisfies(other[name]))
for name in other_non_propagating
)
if result is False or (not other_propagating and not self_propagating):
return result
# Check that self doesn't contradict variants propagated by other
if other_propagating:
for node in self.spec.traverse():
if not all(
node.variants[name].satisfies(other[name])
for name in other_propagating
if name in node.variants
):
return False
# Check that other doesn't contradict variants propagated by self
if self_propagating:
for node in other.spec.traverse():
if not all(
node.variants[name].satisfies(self[name])
for name in self_propagating
if name in node.variants
):
return False
return result
def intersects(self, other):
return all(self[k].intersects(other[k]) for k in other if k in self)
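The asymmetry in these checks is deliberate: an explicitly set variant satisfies a propagated request, while a bare propagation does not satisfy an explicit value. A sketch of the resulting semantics, mirroring test_abstract_specs_with_propagation later in this diff:

from spack.spec import Spec

assert Spec("hdf5++mpi").satisfies("hdf5")          # propagation alone is no contradiction
assert Spec("hdf5+mpi").satisfies("hdf5++mpi")      # a set value satisfies a propagated one
assert not Spec("hdf5++mpi").satisfies("hdf5+mpi")  # propagation does not imply +mpi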


@@ -17,7 +17,6 @@
import multiprocessing
import pickle
import pydoc
import sys
from types import ModuleType
import spack.config
@@ -27,9 +26,6 @@
import spack.repo
import spack.store
_SERIALIZE = sys.platform == "win32" or (sys.version_info >= (3, 8) and sys.platform == "darwin")
patches = None
@@ -56,7 +52,7 @@ def _restore_and_run(self, fn, test_state):
fn()
def create(self):
test_state = GlobalStateMarshaler()
return multiprocessing.Process(target=self._restore_and_run, args=(self.fn, test_state))
@@ -65,49 +61,56 @@ class PackageInstallContext:
needs to be transmitted to a child process.
"""
def __init__(self, pkg, *, ctx=None):
ctx = ctx or multiprocessing.get_context()
self.serialize = ctx.get_start_method() != "fork"
if self.serialize:
self.serialized_pkg = serialize(pkg)
self.global_state = GlobalStateMarshaler()
self.serialized_env = serialize(spack.environment.active_environment())
else:
self.pkg = pkg
self.global_state = None
self.env = spack.environment.active_environment()
self.spack_working_dir = spack.paths.spack_working_dir
def restore(self):
spack.paths.spack_working_dir = self.spack_working_dir
env = pickle.load(self.serialized_env) if self.serialize else self.env
# Activating the environment modifies the global configuration, so globals have to
# be restored afterward, in case other modifications were applied on top (e.g. from
# command line)
if env:
spack.environment.activate(env)
if self.serialize:
self.global_state.restore()
# Order of operation is important, since the package might be retrieved
# from a repo defined within the environment configuration
pkg = pickle.load(self.serialized_pkg) if self.serialize else self.pkg
return pkg
class GlobalStateMarshaler:
"""Class to serialize and restore global state for child processes.
Spack may modify state that is normally read from disk or command line in memory;
this object is responsible for properly serializing that state to be applied to a subprocess.
"""
def __init__(self):
self.config = spack.config.CONFIG.ensure_unwrapped()
self.platform = spack.platforms.host
self.test_patches = store_patches()
self.store = spack.store.STORE
def restore(self):
spack.config.CONFIG = self.config
spack.repo.PATH = spack.repo.create(self.config)
spack.platforms.host = self.platform
spack.store.STORE = self.store
self.test_patches.restore()
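Whether global state gets pickled is now decided by the multiprocessing start method rather than a platform check. A sketch of the new constructor contract, where pkg stands in for a real package object:

import multiprocessing

ctx = multiprocessing.get_context("spawn")
# With "spawn" (or "forkserver") the start method is not "fork", so the context
# pickles the package, the active environment, and the global state captured by
# GlobalStateMarshaler; with "fork" it simply keeps references.
install_ctx = PackageInstallContext(pkg, ctx=ctx)  # pkg: illustrative placeholder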
class TestPatches:


@@ -199,7 +199,7 @@ def check_args(cc, args, expected):
"""
with set_env(SPACK_TEST_COMMAND="dump-args"):
cc_modified_args = cc(*args, output=str).strip().split("\n")
assert cc_modified_args == expected
def check_args_contents(cc, args, must_contain, must_not_contain):
@@ -272,6 +272,43 @@ def test_ld_mode(wrapper_environment):
assert dump_mode(ld, ["foo.o", "bar.o", "baz.o", "-o", "foo", "-Wl,-rpath,foo"]) == "ld"
def test_ld_unterminated_rpath(wrapper_environment):
check_args(
ld,
["foo.o", "bar.o", "baz.o", "-o", "foo", "-rpath"],
["ld", "--disable-new-dtags", "foo.o", "bar.o", "baz.o", "-o", "foo", "-rpath"],
)
def test_xlinker_unterminated_rpath(wrapper_environment):
check_args(
cc,
["foo.o", "bar.o", "baz.o", "-o", "foo", "-Xlinker", "-rpath"],
[real_cc]
+ target_args
+ [
"-Wl,--disable-new-dtags",
"foo.o",
"bar.o",
"baz.o",
"-o",
"foo",
"-Xlinker",
"-rpath",
],
)
def test_wl_unterminated_rpath(wrapper_environment):
check_args(
cc,
["foo.o", "bar.o", "baz.o", "-o", "foo", "-Wl,-rpath"],
[real_cc]
+ target_args
+ ["-Wl,--disable-new-dtags", "foo.o", "bar.o", "baz.o", "-o", "foo", "-Wl,-rpath"],
)
def test_ld_flags(wrapper_environment, wrapper_flags):
check_args(
ld,


@@ -1650,45 +1650,3 @@ def fake_dyn_mapping_urlopener(*args, **kwargs):
assert job.get("variables", {}).get("MY_VAR") == "hello"
assert "ignored_field" not in job
assert "unallowed_field" not in job
def test_ci_generate_noop_no_concretize(
tmpdir,
working_env,
mutable_mock_env_path,
install_mockery,
mock_packages,
monkeypatch,
ci_base_environment,
):
# Write the environment file
filename = str(tmpdir.join("spack.yaml"))
with open(filename, "w") as f:
f.write(
"""\
spack:
specs:
- pkg-a
mirrors:
buildcache-destination: https://my.fake.mirror
ci:
type: gitlab
"""
)
def fake_compute_affected(r1=None, r2=None):
return []
monkeypatch.setattr(ci, "compute_affected_packages", fake_compute_affected)
monkeypatch.setenv("SPACK_PRUNE_UNTOUCHED", "TRUE") # enables pruning of untouched specs
with tmpdir.as_cwd():
env_cmd("create", "test", "./spack.yaml")
outputfile = str(tmpdir.join(".gitlab-ci.yml"))
with ev.read("test"):
ci_cmd("generate", "--output-file", outputfile)
with open(outputfile) as of:
pipeline_doc = syaml.load(of.read())
assert "no-specs-to-rebuild" in pipeline_doc


@@ -33,7 +33,6 @@
import spack.store
import spack.util.file_cache
import spack.variant as vt
from spack.concretize import find_spec
from spack.installer import PackageInstaller
from spack.spec import CompilerSpec, Spec
from spack.version import Version, VersionList, ver
@@ -541,21 +540,17 @@ def test_concretize_two_virtuals_with_dual_provider_and_a_conflict(self):
@pytest.mark.parametrize(
"spec_str,expected_propagation",
[
("hypre~~shared ^openblas+shared", [("hypre", "~shared"), ("openblas", "+shared")]),
# Propagates past a node that doesn't have the variant
("hypre~~shared ^openblas", [("hypre", "~shared"), ("openblas", "~shared")]),
# Propagates from root node to all nodes
(
"ascent~~shared +adios2",
[("ascent", "~shared"), ("adios2", "~shared"), ("bzip2", "~shared")],
),
# Propagate from a node that is not the root node
(
"ascent++shared +adios2 ^adios2~shared",
[("ascent", "+shared"), ("adios2", "~shared"), ("bzip2", "+shared")],
"ascent +adios2 ^adios2~~shared",
[("ascent", "+shared"), ("adios2", "~shared"), ("bzip2", "~shared")],
),
],
)
@@ -565,21 +560,109 @@ def test_concretize_propagate_disabled_variant(self, spec_str, expected_propagat
for key, expected_satisfies in expected_propagation:
spec[key].satisfies(expected_satisfies)
def test_concretize_propagate_variant_not_dependencies(self):
"""Test that when propagating a variant it is not propagated to dependencies that
do not have that variant"""
spec = Spec("quantum-espresso~~invino")
spec.concretize()
assert spec.satisfies("^adios2+shared")
assert spec.satisfies("^bzip2~shared")
for dep in spec.traverse(root=False):
assert "invino" not in dep.variants.keys()
def test_concretize_propagate_variant_exclude_dependency_fail(self):
"""Tests that a propagating variant cannot be allowed to be excluded by any of
the source package's dependencies"""
spec = Spec("hypre ~~shared ^openblas +shared")
with pytest.raises(spack.error.UnsatisfiableSpecError):
spec.concretize()
def test_concretize_propagate_same_variant_from_direct_dep_fail(self):
"""Test that when propagating a variant from the source package and a direct
dependency also propagates the same variant with a different value. Raises error"""
spec = Spec("ascent +adios2 ++shared ^adios2 ~~shared")
with pytest.raises(spack.error.UnsatisfiableSpecError):
spec.concretize()
def test_concretize_propagate_same_variant_in_dependency_fail(self):
"""Test that when propagating a variant from the source package, none of it's
dependencies can propagate that variant with a different value. Raises error."""
spec = Spec("ascent +adios2 ++shared ^bzip2 ~~shared")
with pytest.raises(spack.error.UnsatisfiableSpecError):
spec.concretize()
def test_concretize_propagate_same_variant_virtual_dependency_fail(self):
"""Test that when propagating a variant from the source package and a direct
dependency (that is a virtual pkg) also propagates the same variant with a
different value. Raises error"""
spec = Spec("hypre ++shared ^openblas ~~shared")
with pytest.raises(spack.error.UnsatisfiableSpecError):
spec.concretize()
def test_concretize_propagate_same_variant_multiple_sources_diamond_dep_fail(self):
"""Test that fails when propagating the same variant with different values from multiple
sources that share a dependency"""
spec = Spec("parent-foo-bar ^dependency-foo-bar++bar ^direct-dep-foo-bar~~bar")
with pytest.raises(spack.error.UnsatisfiableSpecError):
spec.concretize()
def test_concretize_propagate_specified_variant(self):
"""Test that only the specified variant is propagated to the dependencies"""
spec = Spec("parent-foo-bar ~~foo")
spec.concretize()
assert spec.satisfies("~foo") and spec.satisfies("^dependency-foo-bar~foo")
assert spec.satisfies("+bar") and not spec.satisfies("^dependency-foo-bar+bar")
assert spec.satisfies("^dependency-foo-bar~foo")
assert spec.satisfies("^second-dependency-foo-bar-fee~foo")
assert spec.satisfies("^direct-dep-foo-bar~foo")
assert not spec.satisfies("^dependency-foo-bar+bar")
assert not spec.satisfies("^second-dependency-foo-bar-fee+bar")
assert not spec.satisfies("^direct-dep-foo-bar+bar")
def test_concretize_propagate_one_variant(self):
"""Test that you can specify to propagate one variant and not all"""
spec = Spec("parent-foo-bar ++bar ~foo")
spec.concretize()
assert spec.satisfies("~foo") and not spec.satisfies("^dependency-foo-bar~foo")
assert spec.satisfies("+bar") and spec.satisfies("^dependency-foo-bar+bar")
def test_concretize_propagate_through_first_level_deps(self):
"""Test that boolean valued variants can be propagated past first level
dependecies even if the first level dependency does have the variant"""
spec = Spec("parent-foo-bar-fee ++fee")
spec.concretize()
assert spec.satisfies("+fee") and not spec.satisfies("dependency-foo-bar+fee")
assert spec.satisfies("^second-dependency-foo-bar-fee+fee")
def test_concretize_propagate_multiple_variants(self):
"""Test that multiple boolean valued variants can be propagated from
the same source package"""
spec = Spec("parent-foo-bar-fee ~~foo ++bar")
spec.concretize()
assert spec.satisfies("~foo") and spec.satisfies("+bar")
assert spec.satisfies("^dependency-foo-bar ~foo +bar")
assert spec.satisfies("^second-dependency-foo-bar-fee ~foo +bar")
def test_concretize_propagate_multiple_variants_mulitple_sources(self):
"""Test the propagates multiple different variants for multiple sources
in a diamond dependency"""
spec = Spec("parent-foo-bar ^dependency-foo-bar++bar ^direct-dep-foo-bar~~foo")
spec.concretize()
assert spec.satisfies("^second-dependency-foo-bar-fee+bar")
assert spec.satisfies("^second-dependency-foo-bar-fee~foo")
assert not spec.satisfies("^dependency-foo-bar~foo")
assert not spec.satisfies("^direct-dep-foo-bar+bar")
def test_concretize_propagate_single_valued_variant(self):
"""Test propagation for single valued variants"""
spec = Spec("multivalue-variant libs==static")
spec.concretize()
assert spec.satisfies("libs=static")
assert spec.satisfies("^pkg-a libs=static")
def test_concretize_propagate_multivalue_variant(self):
"""Test that multivalue variants are propagating the specified value(s)
@@ -592,6 +675,46 @@ def test_concretize_propagate_multivalue_variant(self):
assert not spec.satisfies("^pkg-a foo=bar")
assert not spec.satisfies("^pkg-b foo=bar")
def test_concretize_propagate_multiple_multivalue_variant(self):
"""Tests propagating the same mulitvalued variant from different sources allows
the dependents to accept all propagated values"""
spec = Spec("multivalue-variant foo==bar ^pkg-a foo==baz")
spec.concretize()
assert spec.satisfies("multivalue-variant foo=bar")
assert spec.satisfies("^pkg-a foo=bar,baz")
assert spec.satisfies("^pkg-b foo=bar,baz")
def test_concretize_propagate_variant_not_in_source(self):
"""Test that variant is still propagated even if the source pkg
doesn't have the variant"""
spec = Spec("callpath++debug")
spec.concretize()
assert spec.satisfies("^mpich+debug")
assert not spec.satisfies("callpath+debug")
assert not spec.satisfies("^dyninst+debug")
def test_concretize_propagate_variant_multiple_deps_not_in_source(self):
"""Test that a variant can be propagated to multiple dependencies
when the variant is not in the source package"""
spec = Spec("netlib-lapack++shared")
spec.concretize()
assert spec.satisfies("^openblas+shared")
assert spec.satisfies("^perl+shared")
assert not spec.satisfies("netlib-lapack+shared")
def test_concretize_propagate_variant_second_level_dep_not_in_source(self):
"""Test that a variant can be propagated past first level dependencies
when the variant is not in the source package or any of the first level
dependencies"""
spec = Spec("parent-foo-bar ++fee")
spec.concretize()
assert spec.satisfies("^second-dependency-foo-bar-fee +fee")
assert not spec.satisfies("parent-foo-bar +fee")
def test_no_matching_compiler_specs(self, mock_low_high_config):
# only relevant when not building compilers as needed
with spack.concretize.enable_compiler_existence_check():
@@ -674,39 +797,6 @@ def test_external_and_virtual(self, mutable_config):
assert spec["externaltool"].compiler.satisfies("gcc")
assert spec["stuff"].compiler.satisfies("gcc")
def test_find_spec_parents(self):
"""Tests the spec finding logic used by concretization."""
s = Spec.from_literal({"a +foo": {"b +foo": {"c": None, "d+foo": None}, "e +foo": None}})
assert "a" == find_spec(s["b"], lambda s: "+foo" in s).name
def test_find_spec_children(self):
s = Spec.from_literal({"a": {"b +foo": {"c": None, "d+foo": None}, "e +foo": None}})
assert "d" == find_spec(s["b"], lambda s: "+foo" in s).name
s = Spec.from_literal({"a": {"b +foo": {"c+foo": None, "d": None}, "e +foo": None}})
assert "c" == find_spec(s["b"], lambda s: "+foo" in s).name
def test_find_spec_sibling(self):
s = Spec.from_literal({"a": {"b +foo": {"c": None, "d": None}, "e +foo": None}})
assert "e" == find_spec(s["b"], lambda s: "+foo" in s).name
assert "b" == find_spec(s["e"], lambda s: "+foo" in s).name
s = Spec.from_literal({"a": {"b +foo": {"c": None, "d": None}, "e": {"f +foo": None}}})
assert "f" == find_spec(s["b"], lambda s: "+foo" in s).name
def test_find_spec_self(self):
s = Spec.from_literal({"a": {"b +foo": {"c": None, "d": None}, "e": None}})
assert "b" == find_spec(s["b"], lambda s: "+foo" in s).name
def test_find_spec_none(self):
s = Spec.from_literal({"a": {"b": {"c": None, "d": None}, "e": None}})
assert find_spec(s["b"], lambda s: "+foo" in s) is None
def test_compiler_child(self):
s = Spec("mpileaks%clang target=x86_64 ^dyninst%gcc")
s.concretize()
@@ -815,7 +905,7 @@ def test_regression_issue_7941(self):
)
def test_simultaneous_concretization_of_specs(self, abstract_specs):
abstract_specs = [Spec(x) for x in abstract_specs]
concrete_specs = spack.concretize.concretize_specs_together(abstract_specs)
# Check there's only one configuration of each package in the DAG
names = set(dep.name for spec in concrete_specs for dep in spec.traverse())
@@ -2137,7 +2227,7 @@ def test_external_python_extension_find_unified_python(self):
spack.config.set("packages", external_conf)
abstract_specs = [Spec(s) for s in ["py-extension1", "python"]]
specs = spack.concretize.concretize_specs_together(abstract_specs)
assert specs[0]["python"] == specs[1]["python"]
@pytest.mark.regression("36190")


@@ -9,7 +9,7 @@
import pytest
from llnl.util.filesystem import HeaderList, LibraryList, find_headers, find_libraries
import spack.paths
@@ -324,33 +324,3 @@ def test_searching_order(search_fn, search_list, root, kwargs):
# List should be empty here
assert len(rlist) == 0
@pytest.mark.parametrize(
"root,search_list,kwargs,expected",
[
(
search_dir,
"*/*bar.tx?",
{"recursive": False},
[
os.path.join(search_dir, os.path.join("a", "foobar.txt")),
os.path.join(search_dir, os.path.join("b", "bar.txp")),
os.path.join(search_dir, os.path.join("c", "bar.txt")),
],
),
(
search_dir,
"*/*bar.tx?",
{"recursive": True},
[
os.path.join(search_dir, os.path.join("a", "foobar.txt")),
os.path.join(search_dir, os.path.join("b", "bar.txp")),
os.path.join(search_dir, os.path.join("c", "bar.txt")),
],
),
],
)
def test_find_with_globbing(root, search_list, kwargs, expected):
matches = find(root, search_list, **kwargs)
assert sorted(matches) == sorted(expected)


@@ -6,6 +6,7 @@
"""Tests for ``llnl/util/filesystem.py``"""
import filecmp
import os
import pathlib
import shutil
import stat
import sys
@@ -14,7 +15,8 @@
import pytest
import llnl.util.filesystem as fs
import llnl.util.symlink
from llnl.util.symlink import _windows_can_symlink, islink, readlink, symlink
import spack.paths
@@ -1035,3 +1037,177 @@ def test_windows_sfn(tmpdir):
assert "d\\LONGER~1" in fs.windows_sfn(d)
assert "d\\LONGER~2" in fs.windows_sfn(e)
shutil.rmtree(tmpdir.join("d"))
@pytest.fixture
def dir_structure_with_things_to_find(tmpdir):
"""
<root>/
dir_one/
file_one
dir_two/
dir_three/
dir_four/
file_two
file_three
file_four
"""
dir_one = tmpdir.join("dir_one").ensure(dir=True)
tmpdir.join("dir_two").ensure(dir=True)
dir_three = tmpdir.join("dir_three").ensure(dir=True)
dir_four = dir_three.join("dir_four").ensure(dir=True)
locations = {}
locations["file_one"] = str(dir_one.join("file_one").ensure())
locations["file_two"] = str(dir_four.join("file_two").ensure())
locations["file_three"] = str(dir_three.join("file_three").ensure())
locations["file_four"] = str(tmpdir.join("file_four").ensure())
return str(tmpdir), locations
def test_find_max_depth(dir_structure_with_things_to_find):
root, locations = dir_structure_with_things_to_find
# Make sure the paths we use to verify are absolute
assert os.path.isabs(locations["file_one"])
assert set(fs.find(root, "file_*", max_depth=0)) == {locations["file_four"]}
assert set(fs.find(root, "file_*", max_depth=1)) == {
locations["file_one"],
locations["file_three"],
locations["file_four"],
}
assert set(fs.find(root, "file_two", max_depth=2)) == {locations["file_two"]}
assert not set(fs.find(root, "file_two", max_depth=1))
assert set(fs.find(root, "file_two")) == {locations["file_two"]}
assert set(fs.find(root, "file_*")) == set(locations.values())
def test_find_max_depth_relative(dir_structure_with_things_to_find):
"""find_max_depth should return absolute paths even if
the provided path is relative.
"""
root, locations = dir_structure_with_things_to_find
with fs.working_dir(root):
assert set(fs.find(".", "file_*", max_depth=0)) == {locations["file_four"]}
assert set(fs.find(".", "file_two", max_depth=2)) == {locations["file_two"]}
@pytest.mark.parametrize("recursive,max_depth", [(False, -1), (False, 1)])
def test_max_depth_and_recursive_errors(tmpdir, recursive, max_depth):
root = str(tmpdir)
error_str = "cannot be set if recursive is False"
with pytest.raises(ValueError, match=error_str):
fs.find(root, ["some_file"], recursive=recursive, max_depth=max_depth)
with pytest.raises(ValueError, match=error_str):
fs.find_libraries(["some_lib"], root, recursive=recursive, max_depth=max_depth)
@pytest.fixture(params=[True, False])
def complex_dir_structure(request, tmpdir):
"""
"lx-dy" means "level x, directory y"
"lx-fy" means "level x, file y"
"lx-sy" means "level x, symlink y"
<root>/
l1-d1/
l2-d1/
l3-s1 -> l1-d2 # points to directory above l2-d1
l3-d2/
l4-f1
l3-s3 -> l1-d1 # cyclic link
l3-d4/
l4-f2
l1-d2/
l2-f1
l2-d2/
l3-f3
l2-s3 -> l2-d2
l1-s3 -> l3-d4 # a link that "skips" a directory level
l1-s4 -> l2-s3 # a link to a link to a dir
"""
use_junctions = request.param
if sys.platform == "win32" and not use_junctions and not _windows_can_symlink():
pytest.skip("This Windows instance is not configured with symlink support")
elif sys.platform != "win32" and use_junctions:
pytest.skip("Junctions are a Windows-only feature")
l1_d1 = tmpdir.join("l1-d1").ensure(dir=True)
l2_d1 = l1_d1.join("l2-d1").ensure(dir=True)
l3_d2 = l2_d1.join("l3-d2").ensure(dir=True)
l3_d4 = l2_d1.join("l3-d4").ensure(dir=True)
l1_d2 = tmpdir.join("l1-d2").ensure(dir=True)
l2_d2 = l1_d2.join("l2-d2").ensure(dir=True)
if use_junctions:
link_fn = llnl.util.symlink._windows_create_junction
else:
link_fn = os.symlink
link_fn(l1_d2, pathlib.Path(l2_d1) / "l3-s1")
link_fn(l1_d1, pathlib.Path(l2_d1) / "l3-s3")
link_fn(l3_d4, pathlib.Path(tmpdir) / "l1-s3")
l2_s3 = pathlib.Path(l1_d2) / "l2-s3"
link_fn(l2_d2, l2_s3)
link_fn(l2_s3, pathlib.Path(tmpdir) / "l1-s4")
locations = {
"l4-f1": str(l3_d2.join("l4-f1").ensure()),
"l4-f2-full": str(l3_d4.join("l4-f2").ensure()),
"l4-f2-link": str(pathlib.Path(tmpdir) / "l1-s3" / "l4-f2"),
"l2-f1": str(l1_d2.join("l2-f1").ensure()),
"l2-f1-link": str(pathlib.Path(tmpdir) / "l1-d1" / "l2-d1" / "l3-s1" / "l2-f1"),
"l3-f3-full": str(l2_d2.join("l3-f3").ensure()),
"l3-f3-link-l1": str(pathlib.Path(tmpdir) / "l1-s4" / "l3-f3"),
}
return str(tmpdir), locations
def test_find_max_depth_symlinks(complex_dir_structure):
root, locations = complex_dir_structure
root = pathlib.Path(root)
assert set(fs.find(root, "l4-f1")) == {locations["l4-f1"]}
assert set(fs.find(root / "l1-s3", "l4-f2", max_depth=0)) == {locations["l4-f2-link"]}
assert set(fs.find(root / "l1-d1", "l2-f1")) == {locations["l2-f1-link"]}
# File is accessible via symlink and subdir, the link path will be
# searched first, and the directory will not be searched again when
# it is encountered a second time (via the non-link path) in the traversal
assert set(fs.find(root, "l4-f2")) == {locations["l4-f2-link"]}
# File is accessible only via the dir, so the full file path should
# be reported
assert set(fs.find(root / "l1-d1", "l4-f2")) == {locations["l4-f2-full"]}
# Check following links to links
assert set(fs.find(root, "l3-f3")) == {locations["l3-f3-link-l1"]}
def test_find_max_depth_multiple_and_repeated_entry_points(complex_dir_structure):
root, locations = complex_dir_structure
fst = str(pathlib.Path(root) / "l1-d1" / "l2-d1")
snd = str(pathlib.Path(root) / "l1-d2")
nonexistent = str(pathlib.Path(root) / "nonexistent")
assert set(fs.find([fst, snd, fst, snd, nonexistent], ["l*-f*"], max_depth=1)) == {
locations["l2-f1"],
locations["l4-f1"],
locations["l4-f2-full"],
locations["l3-f3-full"],
}
def test_multiple_patterns(complex_dir_structure):
root, _ = complex_dir_structure
paths = fs.find(root, ["l2-f1", "l3-f3", "*"])
# There shouldn't be duplicate results with multiple, overlapping patterns
assert len(set(paths)) == len(paths)
# All files should be found
filenames = [os.path.basename(p) for p in paths]
assert set(filenames) == {"l2-f1", "l3-f3", "l4-f1", "l4-f2"}
# They are ordered by first matching pattern (this is a bit of an implementation detail,
# and we could decide to change the exact order in the future)
assert filenames[0] == "l2-f1"
assert filenames[1] == "l3-f3"


@@ -373,3 +373,18 @@ class _SomeClass:
_SomeClass.deprecated.error_lvl = 2
with pytest.raises(AttributeError):
_ = s.deprecated
def test_fnmatch_multiple():
regex, groups = llnl.util.lang.fnmatch_translate_multiple(["libf*o.so", "libb*r.so"])
a = regex.match("libfoo.so")
assert a and a.group(groups[0]) == "libfoo.so"
b = regex.match("libbar.so")
assert b and b.group(groups[1]) == "libbar.so"
assert not regex.match("libfoo.so.1")
assert not regex.match("libbar.so.1")
assert not regex.match("libfoo.solibbar.so")
assert not regex.match("libbaz.so")


@@ -231,7 +231,7 @@ class TestSpecSemantics:
("mpich+foo", "mpich foo=True", "mpich+foo"),
("mpich++foo", "mpich foo=True", "mpich+foo"),
("mpich foo=true", "mpich+foo", "mpich+foo"),
("mpich foo==true", "mpich++foo", "mpich+foo"),
("mpich foo==true", "mpich++foo", "mpich++foo"),
("mpich~foo", "mpich foo=FALSE", "mpich~foo"),
("mpich~~foo", "mpich foo=FALSE", "mpich~foo"),
("mpich foo=False", "mpich~foo", "mpich~foo"),
@@ -271,17 +271,17 @@ class TestSpecSemantics:
("mpich+foo", "mpich", "mpich+foo"),
("mpich~foo", "mpich", "mpich~foo"),
("mpich foo=1", "mpich", "mpich foo=1"),
("mpich", "mpich++foo", "mpich+foo"),
("mpich", "mpich++foo", "mpich++foo"),
("libelf+debug", "libelf+foo", "libelf+debug+foo"),
("libelf+debug", "libelf+debug+foo", "libelf+debug+foo"),
("libelf debug=2", "libelf foo=1", "libelf debug=2 foo=1"),
("libelf debug=2", "libelf debug=2 foo=1", "libelf debug=2 foo=1"),
("libelf+debug", "libelf~foo", "libelf+debug~foo"),
("libelf+debug", "libelf+debug~foo", "libelf+debug~foo"),
("libelf++debug", "libelf+debug+foo", "libelf++debug++foo"),
("libelf debug==2", "libelf foo=1", "libelf debug==2 foo==1"),
("libelf debug==2", "libelf debug=2 foo=1", "libelf debug==2 foo==1"),
("libelf++debug", "libelf++debug~foo", "libelf++debug~~foo"),
("libelf++debug", "libelf+debug+foo", "libelf+debug+foo"),
("libelf debug==2", "libelf foo=1", "libelf debug==2 foo=1"),
("libelf debug==2", "libelf debug=2 foo=1", "libelf debug=2 foo=1"),
("libelf++debug", "libelf++debug~foo", "libelf++debug~foo"),
("libelf foo=bar,baz", "libelf foo=*", "libelf foo=bar,baz"),
("libelf foo=*", "libelf foo=bar,baz", "libelf foo=bar,baz"),
(
@@ -367,19 +367,24 @@ def test_abstract_specs_can_constrain_each_other(self, lhs, rhs, expected):
'mpich cflags="-O3 -g"',
'mpich cflags=="-O3"',
'mpich cflags="-O3 -g"',
'mpich cflags="-O3 -g"',
[],
[],
),
(
'mpich cflags=="-O3 -g"',
[("cflags", "-O3")],
[("cflags", "-O3")],
'mpich cflags=="-O3"',
'mpich cflags=="-O3 -g"',
'mpich cflags=="-O3 -g"',
[("cflags", "-O3"), ("cflags", "-g")],
[("cflags", "-O3"), ("cflags", "-g")],
),
],
)
def test_constrain_compiler_flags(
self, lhs, rhs, expected_lhs, expected_rhs, propagated_lhs, propagated_rhs
):
"""Constraining is asymmetric for compiler flags. Also note that
Spec equality does not account for flag propagation, so the checks
here are manual.
"""
"""Constraining is asymmetric for compiler flags."""
lhs, rhs, expected_lhs, expected_rhs = (
Spec(lhs),
Spec(rhs),
@@ -507,9 +512,6 @@ def test_constraining_abstract_specs_with_empty_intersection(self, lhs, rhs):
("mpich", "mpich +foo"),
("mpich", "mpich~foo"),
("mpich", "mpich foo=1"),
("mpich", "mpich++foo"),
("mpich", "mpich~~foo"),
("mpich", "mpich foo==1"),
("multivalue-variant foo=bar", "multivalue-variant +foo"),
("multivalue-variant foo=bar", "multivalue-variant ~foo"),
("multivalue-variant fee=bar", "multivalue-variant fee=baz"),
@@ -531,6 +533,58 @@ def test_concrete_specs_which_do_not_satisfy_abstract(
with pytest.raises(UnsatisfiableSpecError):
assert rhs.constrain(lhs)
@pytest.mark.parametrize(
"lhs,rhs", [("mpich", "mpich++foo"), ("mpich", "mpich~~foo"), ("mpich", "mpich foo==1")]
)
def test_concrete_specs_which_satisfy_abstract(self, lhs, rhs, default_mock_concretization):
lhs, rhs = default_mock_concretization(lhs), Spec(rhs)
assert lhs.intersects(rhs)
assert rhs.intersects(lhs)
assert lhs.satisfies(rhs)
s1 = lhs.copy()
s1.constrain(rhs)
assert s1 == lhs and s1.satisfies(lhs)
s2 = rhs.copy()
s2.constrain(lhs)
assert s2 == lhs and s2.satisfies(lhs)
@pytest.mark.parametrize(
"lhs,rhs,expected,constrained",
[
# hdf5++mpi satisfies hdf5, and vice versa, because of the non-contradiction semantic
("hdf5++mpi", "hdf5", True, "hdf5++mpi"),
("hdf5", "hdf5++mpi", True, "hdf5++mpi"),
# Same holds true for arbitrary propagated variants
("hdf5++mpi", "hdf5++shared", True, "hdf5++mpi++shared"),
# Here hdf5+mpi satisfies hdf5++mpi but not vice versa
("hdf5++mpi", "hdf5+mpi", False, "hdf5+mpi"),
("hdf5+mpi", "hdf5++mpi", True, "hdf5+mpi"),
# Non-contradiction is violated
("hdf5 ^foo~mpi", "hdf5++mpi", False, "hdf5++mpi ^foo~mpi"),
("hdf5++mpi", "hdf5 ^foo~mpi", False, "hdf5++mpi ^foo~mpi"),
],
)
def test_abstract_specs_with_propagation(self, lhs, rhs, expected, constrained):
"""Tests (and documents) behavior of variant propagation on abstract specs.
Propagated variants do not comply with subset semantics, making it difficult to give
precise definitions. Here we document the behavior that has been decided for the
practical cases we face.
"""
lhs, rhs, constrained = Spec(lhs), Spec(rhs), Spec(constrained)
assert lhs.satisfies(rhs) is expected
c = lhs.copy()
c.constrain(rhs)
assert c == constrained
c = rhs.copy()
c.constrain(lhs)
assert c == constrained
def test_satisfies_single_valued_variant(self):
"""Tests that the case reported in
https://github.com/spack/spack/pull/2386#issuecomment-282147639
@@ -1904,3 +1958,20 @@ def test_old_format_strings_trigger_error(default_mock_concretization):
s = Spec("pkg-a").concretized()
with pytest.raises(SpecFormatStringError):
s.format("${PACKAGE}-${VERSION}-${HASH}")
@pytest.mark.regression("47362")
@pytest.mark.parametrize(
"lhs,rhs",
[
("hdf5 +mpi", "hdf5++mpi"),
("hdf5 cflags==-g", "hdf5 cflags=-g"),
("hdf5 +mpi ++shared", "hdf5+mpi +shared"),
("hdf5 +mpi cflags==-g", "hdf5++mpi cflag=-g"),
],
)
def test_equality_discriminate_on_propagation(lhs, rhs):
"""Tests that == can discriminate abstract specs based on their 'propagation' status"""
s, t = Spec(lhs), Spec(rhs)
assert s != t
assert len({s, t}) == 2

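A note on the invariants the tests above codify, condensed into a sketch (assumes a Spack checkout on PYTHONPATH plus a package repo that defines hdf5's mpi variant; it mirrors the parametrized cases, not Spack internals):

from spack.spec import Spec

lhs, rhs = Spec("hdf5++mpi"), Spec("hdf5+mpi")

# +mpi implies the propagated ++mpi, but not the other way around
assert rhs.satisfies(lhs)
assert not lhs.satisfies(rhs)

# constraining in either order keeps the stronger, non-propagated form
c = lhs.copy()
c.constrain(rhs)
assert c == Spec("hdf5+mpi")

# == and hashing now discriminate on propagation status (#47362)
assert Spec("hdf5++mpi") != Spec("hdf5+mpi")
assert len({Spec("hdf5++mpi"), Spec("hdf5+mpi")}) == 2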
View File

@@ -9,20 +9,30 @@
import shlex
import sys
from subprocess import PIPE, run
from typing import List, Optional
from typing import Dict, List, Optional
import spack.spec
import spack.util.elf
#: Pattern to distinguish glibc from other libc implementations
GLIBC_PATTERN = r"\b(?:Free Software Foundation|Roland McGrath|Ulrich Depper)\b"
def _env() -> Dict[str, str]:
"""Currently only set LC_ALL=C without clearing further environment variables"""
return {**os.environ, "LC_ALL": "C"}
def _libc_from_ldd(ldd: str) -> Optional["spack.spec.Spec"]:
try:
result = run([ldd, "--version"], stdout=PIPE, stderr=PIPE, check=False)
result = run([ldd, "--version"], stdout=PIPE, stderr=PIPE, check=False, env=_env())
stdout = result.stdout.decode("utf-8")
except Exception:
return None
if not re.search(r"\bFree Software Foundation\b", stdout):
# The string "Free Software Foundation" is sometimes translated and not detected, but the names
# of the authors are typically present.
if not re.search(GLIBC_PATTERN, stdout):
return None
version_str = re.match(r".+\(.+\) (.+)", stdout)
@@ -38,7 +48,7 @@ def default_search_paths_from_dynamic_linker(dynamic_linker: str) -> List[str]:
"""If the dynamic linker is glibc at a certain version, we can query the hard-coded library
search paths"""
try:
result = run([dynamic_linker, "--help"], stdout=PIPE, stderr=PIPE, check=False)
result = run([dynamic_linker, "--help"], stdout=PIPE, stderr=PIPE, check=False, env=_env())
assert result.returncode == 0
out = result.stdout.decode("utf-8")
except Exception:
@@ -74,7 +84,9 @@ def libc_from_dynamic_linker(dynamic_linker: str) -> Optional["spack.spec.Spec"]
# Now try to figure out if glibc or musl, which is the only ones we support.
# In recent glibc we can simply execute the dynamic loader. In musl that's always the case.
try:
result = run([dynamic_linker, "--version"], stdout=PIPE, stderr=PIPE, check=False)
result = run(
[dynamic_linker, "--version"], stdout=PIPE, stderr=PIPE, check=False, env=_env()
)
stdout = result.stdout.decode("utf-8")
stderr = result.stderr.decode("utf-8")
except Exception:
@@ -91,7 +103,7 @@ def libc_from_dynamic_linker(dynamic_linker: str) -> Optional["spack.spec.Spec"]
return spec
except Exception:
return None
elif re.search(r"\bFree Software Foundation\b", stdout):
elif re.search(GLIBC_PATTERN, stdout):
# output is like "ld.so (...) stable release version 2.33."
match = re.search(r"version (\d+\.\d+(?:\.\d+)?)", stdout)
if not match:

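The two changes in this file work together: pinning LC_ALL=C keeps the ldd/ld.so banner in English, and the widened pattern also matches author names that survive partially localized output. A standalone sketch of the detection idea (the helper name is illustrative, not Spack's API):

import os
import re
from subprocess import PIPE, run

GLIBC_PATTERN = r"\b(?:Free Software Foundation|Roland McGrath|Ulrich Drepper)\b"

def looks_like_glibc(ldd: str = "ldd") -> bool:
    # Pin LC_ALL=C so a translated "Free Software Foundation" string
    # cannot defeat the regex; keep the rest of the environment intact.
    env = {**os.environ, "LC_ALL": "C"}
    result = run([ldd, "--version"], stdout=PIPE, stderr=PIPE, check=False, env=env)
    return re.search(GLIBC_PATTERN, result.stdout.decode("utf-8", "replace")) is not None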
View File

@@ -251,7 +251,7 @@ def implicit_variant_conversion(method):
def convert(self, other):
# We don't care if types are different as long as I can convert other to type(self)
try:
other = type(self)(other.name, other._original_value)
other = type(self)(other.name, other._original_value, propagate=other.propagate)
except (error.SpecError, ValueError):
return False
return method(self, other)
@@ -379,6 +379,7 @@ def _value_setter(self, value: ValueType) -> None:
def _cmp_iter(self) -> Iterable:
yield self.name
yield from (str(v) for v in self.value_as_tuple)
yield self.propagate
def copy(self) -> "AbstractVariant":
"""Returns an instance of a variant equivalent to self
@@ -453,6 +454,7 @@ def constrain(self, other: "AbstractVariant") -> bool:
values.remove("*")
self._value_setter(",".join(str(v) for v in values))
self.propagate = self.propagate and other.propagate
return old_value != self.value
def __contains__(self, item: Union[str, bool]) -> bool:
@@ -557,6 +559,7 @@ def constrain(self, other: "AbstractVariant") -> bool:
if self.value != other.value:
raise UnsatisfiableVariantSpecError(other.value, self.value)
self.propagate = self.propagate and other.propagate
return False
def __contains__(self, item: ValueType) -> bool:
@@ -827,7 +830,7 @@ def prevalidate_variant_value(
only if the variant is a reserved variant.
"""
# don't validate wildcards or variants with reserved names
if variant.value == ("*",) or variant.name in reserved_names:
if variant.value == ("*",) or variant.name in reserved_names or variant.propagate:
return []
# raise if there is no definition at all

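Adding the propagate flag to _cmp_iter is what lets == and hashing tell +foo and ++foo apart in the spec tests earlier in this diff. A simplified stand-in (not Spack's AbstractVariant) showing the effect of putting the flag in the comparison key:

from typing import NamedTuple, Tuple

class MiniVariant(NamedTuple):
    name: str
    values: Tuple[str, ...]
    propagate: bool  # the field the patch adds to the comparison key

plain = MiniVariant("foo", ("true",), propagate=False)   # models +foo
pushed = MiniVariant("foo", ("true",), propagate=True)   # models ++foo

assert plain != pushed            # equality now discriminates
assert len({plain, pushed}) == 2  # and so does hashing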
View File

@@ -14,14 +14,10 @@ spack:
variants: +mpi
binutils:
variants: +ld +gold +headers +libiberty ~nls
elfutils:
variants: ~nls
hdf5:
variants: +fortran +hl +shared
libfabric:
variants: fabrics=sockets,tcp,udp,rxm
libunwind:
variants: +pic +xz
openblas:
variants: threads=openmp
trilinos:
@@ -29,25 +25,14 @@ spack:
+ifpack +ifpack2 +intrepid +intrepid2 +isorropia +kokkos +ml +minitensor +muelu
+nox +piro +phalanx +rol +rythmos +sacado +stk +shards +shylu +stokhos +stratimikos
+teko +tempus +tpetra +trilinoscouplings +zoltan +zoltan2 +superlu-dist gotype=long_long
xz:
variants: +pic
mpi:
require: mpich
mpich:
require: '~wrapperrpath ~hwloc %gcc target=neoverse_v2'
tbb:
require: intel-tbb
boost:
variants: +atomic +chrono +container +date_time +exception +filesystem +graph
+iostreams +locale +log +math +mpi +multithreaded +program_options +random
+regex +serialization +shared +signals +stacktrace +system +test +thread +timer
cxxstd=17 visibility=global
libffi:
require: "@3.4.4 %gcc target=neoverse_v2"
vtk-m:
require: "+examples %gcc target=neoverse_v2"
cuda:
version: [11.8.0]
paraview:
require: "+examples %gcc target=neoverse_v2"

View File

@@ -14,14 +14,10 @@ spack:
variants: +mpi
binutils:
variants: +ld +gold +headers +libiberty ~nls
elfutils:
variants: ~nls
hdf5:
variants: +fortran +hl +shared
libfabric:
variants: fabrics=sockets,tcp,udp,rxm
libunwind:
variants: +pic +xz
openblas:
variants: threads=openmp
trilinos:
@@ -29,27 +25,16 @@ spack:
+ifpack +ifpack2 +intrepid +intrepid2 +isorropia +kokkos +ml +minitensor +muelu
+nox +piro +phalanx +rol +rythmos +sacado +stk +shards +shylu +stokhos +stratimikos
+teko +tempus +tpetra +trilinoscouplings +zoltan +zoltan2 +superlu-dist gotype=long_long
xz:
variants: +pic
mpi:
require: mpich
mpich:
require: '~wrapperrpath ~hwloc %gcc target=neoverse_v1'
tbb:
require: intel-tbb
boost:
variants: +atomic +chrono +container +date_time +exception +filesystem +graph
+iostreams +locale +log +math +mpi +multithreaded +program_options +random
+regex +serialization +shared +signals +stacktrace +system +test +thread +timer
cxxstd=17 visibility=global
libffi:
require: "@3.4.4 %gcc target=neoverse_v1"
vtk-m:
require: "+examples %gcc target=neoverse_v1"
paraview:
require: "+examples %gcc target=neoverse_v1"
cuda:
version: [11.8.0]
specs:
# CPU

View File

@@ -15,14 +15,10 @@ spack:
variants: +mpi cuda_arch=70
binutils:
variants: +ld +gold +headers +libiberty ~nls
elfutils:
variants: ~nls
hdf5:
variants: +fortran +hl +shared
libfabric:
variants: fabrics=sockets,tcp,udp,rxm
libunwind:
variants: +pic +xz
openblas:
variants: threads=openmp
trilinos:
@@ -42,8 +38,6 @@ spack:
require: "~tcmalloc %gcc@9.4.0 target=ppc64le"
tbb:
require: intel-tbb
libffi:
require: "@3.4.4 %gcc@9.4.0 target=ppc64le"
vtk-m:
require: "+examples %gcc@9.4.0 target=ppc64le"
cuda:

View File

@@ -14,8 +14,6 @@ spack:
variants: +mpi
binutils:
variants: +ld +gold +headers +libiberty ~nls
elfutils:
variants: ~nls
hdf5:
variants: +fortran +hl +shared
libfabric:
@@ -29,30 +27,19 @@ spack:
+ifpack +ifpack2 +intrepid +intrepid2 +isorropia +kokkos +ml +minitensor +muelu
+nox +piro +phalanx +rol +rythmos +sacado +stk +shards +shylu +stokhos +stratimikos
+teko +tempus +tpetra +trilinoscouplings +zoltan +zoltan2 +superlu-dist gotype=long_long
xz:
variants: +pic
mpi:
require: mpich
mpich:
require: '~wrapperrpath ~hwloc'
require: '~wrapperrpath ~hwloc target=x86_64_v3'
tbb:
require: intel-tbb
boost:
variants: +atomic +chrono +container +date_time +exception +filesystem +graph
+iostreams +locale +log +math +mpi +multithreaded +program_options +random
+regex +serialization +shared +signals +stacktrace +system +test +thread +timer
cxxstd=17 visibility=global
libffi:
require: "@3.4.4"
vtk-m:
require: "+examples"
require: "+examples target=x86_64_v3"
visit:
require: "~gui"
cuda:
version: [11.8.0]
require: "~gui target=x86_64_v3"
paraview:
# Don't build GUI support or GLX rendering for HPC/container deployments
require: "@5.11 +examples ~qt ^[virtuals=gl] osmesa"
require: "@5.11 +examples ~qt ^[virtuals=gl] osmesa target=x86_64_v3"
specs:
# CPU

View File

@@ -0,0 +1,36 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Used to test correct application of config line scopes in various cases.
The option `config:ccache` is supposed to be False, and overridden to True
from the command line.
"""
import multiprocessing as mp
import spack.config
import spack.subprocess_context
def show_config(serialized_state):
_ = serialized_state.restore()
result = spack.config.CONFIG.get("config:ccache")
if result is not True:
raise RuntimeError(f"Expected config:ccache:true, but got {result}")
if __name__ == "__main__":
print("Testing spawn")
ctx = mp.get_context("spawn")
serialized_state = spack.subprocess_context.PackageInstallContext(None, ctx=ctx)
p = ctx.Process(target=show_config, args=(serialized_state,))
p.start()
p.join()
print("Testing fork")
ctx = mp.get_context("fork")
serialized_state = spack.subprocess_context.PackageInstallContext(None, ctx=ctx)
p = ctx.Process(target=show_config, args=(serialized_state,))
p.start()
p.join()

View File

@@ -52,7 +52,7 @@ if [[ "$UNIT_TEST_COVERAGE" != "true" ]] && python -m pytest -VV 2>&1 | grep xdi
fi
# We are running pytest-cov after the addition of pytest-xdist, since it integrates
# other pugins for pytest automatically. We still need to use "coverage" explicitly
# other plugins for pytest automatically. We still need to use "coverage" explicitly
# for the commands above.
#
# There is a need to pass the configuration file explicitly due to a bug:

View File

@@ -207,3 +207,20 @@ fails spack env deactivate
echo "Correct error exit codes for unit-test when it fails"
fails spack unit-test fail
title "Testing config override from command line, outside of an environment"
contains 'True' spack -c config:ccache:true python -c "import spack.config;print(spack.config.CONFIG.get('config:ccache'))"
contains 'True' spack -C "$SHARE_DIR/qa/configuration" python -c "import spack.config;print(spack.config.CONFIG.get('config:ccache'))"
succeeds spack -c config:ccache:true python "$SHARE_DIR/qa/config_state.py"
succeeds spack -C "$SHARE_DIR/qa/configuration" python "$SHARE_DIR/qa/config_state.py"
title "Testing config override from command line, inside an environment"
spack env activate --temp
spack config add "config:ccache:false"
contains 'True' spack -c config:ccache:true python -c "import spack.config;print(spack.config.CONFIG.get('config:ccache'))"
contains 'True' spack -C "$SHARE_DIR/qa/configuration" python -c "import spack.config;print(spack.config.CONFIG.get('config:ccache'))"
succeeds spack -c config:ccache:true python "$SHARE_DIR/qa/config_state.py"
succeeds spack -C "$SHARE_DIR/qa/configuration" python "$SHARE_DIR/qa/config_state.py"
spack env deactivate

View File

@@ -44,7 +44,7 @@ def libs(self):
# Header provided by the bar virtual package
@property
def bar_headers(self):
return find_headers("bar/bar", root=self.home.include, recursive=False)
return find_headers("bar", root=self.home.include, recursive=True)
# Library provided by the bar virtual package
@property
@@ -59,7 +59,7 @@ def baz_home(self):
# Header provided by the baz virtual package
@property
def baz_headers(self):
return find_headers("baz/baz", root=self.baz_home.include, recursive=False)
return find_headers("baz", root=self.baz_home.include, recursive=True)
# Library provided by the baz virtual package
@property

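The header lookups above switch from an exact relative path to a recursive search by header name. Roughly, under a hypothetical root:

from llnl.util.filesystem import find_headers

# before: matches only <root>/bar/bar.h, nothing deeper
exact = find_headers("bar/bar", root="/opt/example/include", recursive=False)

# after: matches any bar.h anywhere under the root
anywhere = find_headers("bar", root="/opt/example/include", recursive=True)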
View File

@@ -18,3 +18,5 @@ class DependencyFooBar(Package):
variant("foo", default=True, description="")
variant("bar", default=False, description="")
depends_on("second-dependency-foo-bar-fee")

View File

@@ -0,0 +1,22 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import *
class DirectDepFooBar(Package):
"""This package has a variant "bar", which is False by default, and
variant "foo" which is True by default.
"""
homepage = "http://www.example.com"
url = "http://www.example.com/direct-dep-foo-bar-1.0.tar.gz"
version("1.0", md5="567890abcdefg12345678900987654321")
variant("foo", default=True, description="")
variant("bar", default=False, description="")
depends_on("second-dependency-foo-bar-fee")

View File

@@ -25,4 +25,6 @@ class Openblas(Package):
# To ensure test works with newer gcc versions
conflicts("%gcc@:10.1", when="@0.2.16:")
depends_on("perl")
provides("blas")

View File

@@ -0,0 +1,23 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import *
class ParentFooBarFee(Package):
"""This package has a variant "bar", which is True by default, and depends on another
package which has the same variant defaulting to False.
"""
homepage = "http://www.example.com"
url = "http://www.example.com/parent-foo-bar-fee-1.0.tar.gz"
version("1.0", md5="abcdefg01234567890123abcdefghfed")
variant("foo", default=True, description="")
variant("bar", default=True, description="")
variant("fee", default=False, description="")
depends_on("dependency-foo-bar")

View File

@@ -19,4 +19,5 @@ class ParentFooBar(Package):
variant("foo", default=True, description="")
variant("bar", default=True, description="")
depends_on("direct-dep-foo-bar")
depends_on("dependency-foo-bar")

View File

@@ -14,3 +14,5 @@ class Perl(Package):
extendable = True
version("0.0.0", md5="abcdef1234567890abcdef1234567890")
variant("shared", default=True, description="Build shared libraries")

View File

@@ -25,6 +25,14 @@ class PkgA(AutotoolsPackage):
variant("bvv", default=True, description="The good old BV variant")
variant(
"libs",
default="shared",
values=("shared", "static"),
multi=True,
description="Type of libraries to install",
)
depends_on("pkg-b", when="foobar=bar")
depends_on("test-dependency", type="test")

View File

@@ -0,0 +1,21 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import *
class SecondDependencyFooBarFee(Package):
"""This package has a variant "foo", which is True by default, a variant "bar" which
is False by default, and variant "foo" which is True by default.
"""
homepage = "http://www.example.com"
url = "http://www.example.com/second-dependency-foo-bar-fee-1.0.tar.gz"
version("1.0", md5="2101234567890abcdefg1234567890abc")
variant("foo", default=True, description="")
variant("bar", default=False, description="")
variant("fee", default=False, description="")

View File

@@ -41,6 +41,7 @@ class Acts(CMakePackage, CudaPackage):
# Supported Acts versions
version("main", branch="main")
version("master", branch="main", deprecated=True) # For compatibility
version("37.3.0", commit="b3e856d4dadcda7d1a88a9b846ce5a7acd8410c4", submodules=True)
version("37.2.0", commit="821144dc40d35b44aee0d7857a0bd1c99e4a3932", submodules=True)
version("37.1.0", commit="fa6ad4d52e0bd09cf8c78507fcbb18e9ac2c87a3", submodules=True)
version("37.0.1", commit="998b9c9dd42d5160c2540f8fa820505869bfdb79", submodules=True)
@@ -337,7 +338,7 @@ class Acts(CMakePackage, CudaPackage):
"tbb",
default=True,
description="Build the examples with Threading Building Blocks library",
when="@19.8:19,20.1: +examples",
when="@19.8:19,20.1:37.2 +examples",
)
variant("analysis", default=False, description="Build analysis applications in the examples")
@@ -382,6 +383,7 @@ class Acts(CMakePackage, CudaPackage):
depends_on("hepmc3 @3.2.1:", when="+hepmc3")
depends_on("heppdt", when="+hepmc3 @:4.0")
depends_on("intel-tbb @2020.1:", when="+examples +tbb")
depends_on("intel-tbb @2020.1:", when="+examples @37.3:")
depends_on("mlpack@3.1.1:", when="+mlpack")
depends_on("nlohmann-json @3.9.1:", when="@0.14: +json")
depends_on("nlohmann-json @3.10.5:", when="@37: +json")

View File

@@ -22,8 +22,9 @@ class Adiak(CMakePackage):
license("MIT")
version(
"0.4.0", commit="7e8b7233f8a148b402128ed46b2f0c643e3b397e", submodules=True, preferred=True
"0.4.1", commit="7ac997111785bee6d9391664b1d18ebc2b3c557b", submodules=True, preferred=True
)
version("0.4.0", commit="7e8b7233f8a148b402128ed46b2f0c643e3b397e", submodules=True)
version("0.2.2", commit="3aedd494c81c01df1183af28bc09bade2fabfcd3", submodules=True)
version(
"0.3.0-alpha",

View File

@@ -21,6 +21,8 @@ class AmrWind(CMakePackage, CudaPackage, ROCmPackage):
license("BSD-3-Clause")
version("main", branch="main", submodules=True)
version("3.2.0", tag="v3.2.0", submodules=True)
version("3.1.7", tag="v3.1.7", submodules=True)
version("3.1.6", tag="v3.1.6", submodules=True)
version("3.1.5", tag="v3.1.5", submodules=True)
version("3.1.4", tag="v3.1.4", submodules=True)

View File

@@ -26,6 +26,7 @@ class Amrex(CMakePackage, CudaPackage, ROCmPackage):
license("BSD-3-Clause")
version("develop", branch="development")
version("24.11", sha256="31cc37b39f15e02252875815f6066046fc56a479bf459362b9889b0d6a202df6")
version("24.10", sha256="a2d15e417bd7c41963749338e884d939c80c5f2fcae3279fe3f1b463e3e4208a")
version("24.09", sha256="a1435d16532d04a1facce9a9ae35d68a57f7cd21a5f22a6590bde3c265ea1449")
version("24.08", sha256="e09623e715887a19a1f86ed6fdb8335022fd6c03f19372d8f13b55cdeeadf5de")

View File

@@ -63,7 +63,7 @@ class Arborx(CMakePackage, CudaPackage, ROCmPackage):
for backend in kokkos_backends:
deflt, descr = kokkos_backends[backend]
variant(backend.lower(), default=deflt, description=descr)
variant("trilinos", default=False, description="use Kokkos from Trilinos")
variant("trilinos", default=False, when="@:1.5", description="use Kokkos from Trilinos")
depends_on("cmake@3.12:", type="build")
depends_on("cmake@3.16:", type="build", when="@1.0:")
@@ -77,8 +77,8 @@ class Arborx(CMakePackage, CudaPackage, ROCmPackage):
depends_on("kokkos@3.6.00:", when="@1.3~trilinos")
depends_on("kokkos@3.7.01:", when="@1.4:1.4.1~trilinos")
depends_on("kokkos@4.0.00:", when="@1.5~trilinos")
depends_on("kokkos@4.1.00:", when="@1.6~trilinos")
depends_on("kokkos@4.2.00:", when="@1.7:~trilinos")
depends_on("kokkos@4.1.00:", when="@1.6")
depends_on("kokkos@4.2.00:", when="@1.7:")
for backend in kokkos_backends:
depends_on("kokkos+%s" % backend.lower(), when="~trilinos+%s" % backend.lower())
@@ -96,8 +96,9 @@ class Arborx(CMakePackage, CudaPackage, ROCmPackage):
conflicts("^kokkos", when="+trilinos")
depends_on("kokkos+cuda_lambda", when="~trilinos+cuda")
# Trilinos/Kokkos
# Trilinos with internal Kokkos
# Notes:
# - starting with Trilinos 14.4, Trilinos' spack package uses external Kokkos
# - current version of Trilinos package does not allow disabling Serial
# - current version of Trilinos package does not allow enabling CUDA
depends_on("trilinos+kokkos", when="+trilinos")
@@ -106,18 +107,16 @@ class Arborx(CMakePackage, CudaPackage, ROCmPackage):
depends_on("trilinos@13.4.0:", when="@1.3+trilinos")
depends_on("trilinos@14.0.0:", when="@1.4:1.4.1+trilinos")
depends_on("trilinos@14.2.0:", when="@1.5+trilinos")
depends_on("trilinos@14.4.0:", when="@1.6+trilinos")
depends_on("trilinos@15.1.0:", when="@1.7:+trilinos")
patch("trilinos14.0-kokkos-major-version.patch", when="@1.4+trilinos ^trilinos@14.0.0")
conflicts("~serial", when="+trilinos")
def cmake_args(self):
spec = self.spec
if "~trilinos" in spec:
kokkos_spec = spec["kokkos"]
else:
if "+trilinos" in spec:
kokkos_spec = spec["trilinos"]
else:
kokkos_spec = spec["kokkos"]
options = [
f"-DKokkos_ROOT={kokkos_spec.prefix}",

View File

@@ -49,6 +49,14 @@ class BigdftFutile(AutotoolsPackage, CudaPackage):
configure_directory = "futile"
# missing MPI_BOTTOM in fake mpif.h (generated when compiling without MPI support)
# similar issue (maybe others) also in 1.9.4 but this patch does not work for 1.9.4
patch(
"https://gitlab.com/l_sim/bigdft-suite/-/commit/ec7419011fa9fd815de77bfca8642973091fb64b.diff",
sha256="66c493e37fe7f7f9800ae7f49bb0172a5b2372a2ce0ee4c3bcb7ff5c59a3925c",
when="@1.9.5~mpi",
)
def configure_args(self):
spec = self.spec
prefix = self.prefix

View File

@@ -246,6 +246,7 @@ def libs(self):
depends_on("icu4c cxxstd=14", when="+icu cxxstd=14")
depends_on("icu4c cxxstd=17", when="+icu cxxstd=17")
conflicts("cxxstd=98", when="+icu") # Requires c++11 at least
conflicts("+locale ~icu") # Boost.Locale "strongly recommends" icu, so enforce it
depends_on("python", when="+python")
# https://github.com/boostorg/python/commit/cbd2d9f033c61d29d0a1df14951f4ec91e7d05cd

View File

@@ -240,7 +240,7 @@ class Ceed(BundlePackage, CudaPackage, ROCmPackage):
# Omega_h
# ceed-5.0
depends_on("omega-h@scorec.10.1.0", when="@5.0.0+omega-h")
depends_on("omega-h@10.1.0", when="@5.0.0+omega-h")
depends_on("omega-h~trilinos", when="@5.0.0+omega-h+quickbuild")
# MFEM, Laghos, Remhos

View File

@@ -17,6 +17,7 @@ class Cgal(CMakePackage):
homepage = "https://www.cgal.org/"
url = "https://github.com/CGAL/cgal/releases/download/v5.4.1/CGAL-5.4.1.tar.xz"
version("6.0.1", sha256="0acdfbf317c556630dd526f3253780f29b6ec9713ee92903e81b5c93c0f59b7f")
version("5.6", sha256="dcab9b08a50a06a7cc2cc69a8a12200f8d8f391b9b8013ae476965c10b45161f")
version("5.5.3", sha256="0a04f662693256328b05babfabb5e3a5b7db2f5a58d52e3c520df9d0828ddd73")
version("5.5.2", sha256="b2b05d5616ecc69facdc24417cce0b04fb4321491d107db45103add520e3d8c3")

View File

@@ -20,6 +20,9 @@ class Cpr(CMakePackage):
version("1.10.4", sha256="88462d059cd3df22c4d39ae04483ed50dfd2c808b3effddb65ac3b9aa60b542d")
version("1.9.2", sha256="3bfbffb22c51f322780d10d3ca8f79424190d7ac4b5ad6ad896de08dbd06bf31")
variant("pic", default=True, description="Position independent code")
variant("shared", default=True, description="Build shared library")
depends_on("cxx", type="build")
depends_on("curl")
@@ -32,4 +35,6 @@ def cmake_args(self):
self.define("CPR_USE_SYSTEM_GTEST", True),
self.define(f"CPR{_force}_USE_SYSTEM_CURL", True),
self.define("CPR_ENABLE_SSL", True),
self.define_from_variant("BUILD_SHARED_LIBS", "shared"),
self.define_from_variant("CMAKE_POSITION_INDEPENDENT_CODE", "pic"),
]

View File

@@ -20,9 +20,12 @@ class Cups(AutotoolsPackage):
license("Apache-2.0", checked_by="wdconinc")
version("2.4.11", sha256="9a88fe1da3a29a917c3fc67ce6eb3178399d68e1a548c6d86c70d9b13651fd71")
version("2.4.10", sha256="d75757c2bc0f7a28b02ee4d52ca9e4b1aa1ba2affe16b985854f5336940e5ad7")
version("2.3.3", sha256="261fd948bce8647b6d5cb2a1784f0c24cc52b5c4e827b71d726020bcc502f3ee")
version("2.2.3", sha256="66701fe15838f2c892052c913bde1ba106bbee2e0a953c955a62ecacce76885f")
with default_args(deprecated=True):
# https://nvd.nist.gov/vuln/detail/CVE-2023-4504
version("2.3.3", sha256="261fd948bce8647b6d5cb2a1784f0c24cc52b5c4e827b71d726020bcc502f3ee")
version("2.2.3", sha256="66701fe15838f2c892052c913bde1ba106bbee2e0a953c955a62ecacce76885f")
depends_on("c", type="build")
depends_on("cxx", type="build")

View File

@@ -180,8 +180,9 @@ class Dealii(CMakePackage, CudaPackage):
depends_on("arborx+trilinos", when="@9.3:+arborx+trilinos")
depends_on("arpack-ng+mpi", when="+arpack+mpi")
depends_on("assimp", when="@9.0:+assimp")
depends_on("cgal", when="@9.4:+cgal")
depends_on("cgal@5:", when="@9.5:+cgal")
# cgal 6 not yet supported: https://github.com/spack/spack/pull/47285#issuecomment-2455403447
depends_on("cgal@:5", when="@9.4:+cgal")
depends_on("cgal@5", when="@9.5:+cgal")
depends_on("doxygen+graphviz", when="+doc")
depends_on("graphviz", when="+doc")
depends_on("ginkgo", when="@9.1:+ginkgo")

View File

@@ -20,6 +20,7 @@ class Detray(CMakePackage):
license("MPL-2.0", checked_by="stephenswat")
version("0.81.0", sha256="821313a7e3ea90fcf5c92153d28bba1f85844e03d7c6b6b98d0b3407adb86357")
version("0.80.0", sha256="a12f3e333778ddd20a568b5c8df5b2375f9a4d74caf921822c1864b07b3f8ab7")
version("0.79.0", sha256="3b9f18cb003e59795a0e4b1414069ac8558b975714626449293a71bc4398a380")
version("0.78.0", sha256="ca3a348f4e12ed690c3106197e107b9c393b6902224b2543b00382050864bcf3")

View File

@@ -0,0 +1,11 @@
diff --git a/dataflowAPI/src/AbslocInterface.C b/dataflowAPI/src/AbslocInterface.C
index 9d7ad000c..582e64004 100644
--- a/dataflowAPI/src/AbslocInterface.C
+++ b/dataflowAPI/src/AbslocInterface.C
@@ -29,6 +29,7 @@
*/
+#include <deque>
#include "Absloc.h"
#include "AbslocInterface.h"

View File

@@ -110,6 +110,11 @@ class Dyninst(CMakePackage):
patch("stackanalysis_h.patch", when="@9.2.0")
patch("v9.3.2-auto.patch", when="@9.3.2 %gcc@:4.7")
patch("tribool.patch", when="@9.3.0:10.0.0 ^boost@1.69:")
patch(
"missing_include_deque.patch",
when="@10.0.0:12.2.0",
sha256="0064d8d51bd01bd0035e1ebc49276f627ce6366d4524c92cf47d3c09b0031f96",
)
requires("%gcc", when="@:13.0.0", msg="dyninst builds only with GCC")

View File

@@ -61,12 +61,20 @@ class Edm4hep(CMakePackage):
description="Use the specified C++ standard when building.",
)
variant(
"json",
default=True,
description="Build edm4hep with JSON support and edm4hep2json",
when="@0.99.2:",
)
depends_on("cmake@3.3:", type="build")
depends_on("cmake@3.23:", type="build", when="@0.10.3:")
depends_on("python", type="build")
depends_on("root@6.08:")
depends_on("nlohmann-json@3.10.5:")
depends_on("nlohmann-json@3.10.5:", when="@0.99.2: +json")
depends_on("nlohmann-json@3.10.5:", when="@:0.99.1")
depends_on("podio@1:", when="@0.99:")
depends_on("podio@0.15:", when="@:0.10.5")
for _std in _cxxstd_values:
@@ -88,6 +96,8 @@ def cmake_args(self):
# C++ Standard
args.append(self.define("CMAKE_CXX_STANDARD", self.spec.variants["cxxstd"].value))
args.append(self.define("BUILD_TESTING", self.run_tests))
if self.spec.satisfies("@0.99.2: +json"):
args.append(self.define_from_variant("EDM4HEP_WITH_JSON", "json"))
return args
def setup_run_environment(self, env):

View File

@@ -13,14 +13,23 @@ class Elasticsearch(Package):
"""
homepage = "https://www.elastic.co/"
url = "https://artifacts.elastic.co/downloads/elasticsearch/elasticsearch-6.2.4.tar.gz"
url = "https://artifacts.elastic.co/downloads/elasticsearch/elasticsearch-8.15.2-linux-x86_64.tar.gz"
version("6.4.0", sha256="e9786efb5cecd12adee2807c7640ba9a1ab3b484d2e87497bb8d0b6df0e24f01")
version("6.3.0", sha256="0464127140820d82b24bd2830232131ea85bcd49267a8bc7365e4fa391dee2a3")
version("6.2.4", sha256="91e6f1ea1e1dd39011e7a703d2751ca46ee374665b08b0bfe17e0c0c27000e8e")
version("8.15.2", sha256="0b6905ede457be9d1d73d0b6be1c3a7c7c6220829846b532f2604ad30ba7308f")
with default_args(deprecated=True):
# https://nvd.nist.gov/vuln/detail/CVE-2018-3831
version("6.4.0", sha256="e9786efb5cecd12adee2807c7640ba9a1ab3b484d2e87497bb8d0b6df0e24f01")
version("6.3.0", sha256="0464127140820d82b24bd2830232131ea85bcd49267a8bc7365e4fa391dee2a3")
version("6.2.4", sha256="91e6f1ea1e1dd39011e7a703d2751ca46ee374665b08b0bfe17e0c0c27000e8e")
depends_on("java", type="run")
def url_for_version(self, version):
if self.spec.satisfies("@:6"):
return f"https://artifacts.elastic.co/downloads/elasticsearch/elasticsearch-{version}.tar.gz"
else:
return f"https://artifacts.elastic.co/downloads/elasticsearch/elasticsearch-{version}-linux-x86_64.tar.gz"
def install(self, spec, prefix):
dirs = ["bin", "config", "lib", "modules", "plugins"]

View File

@@ -42,6 +42,15 @@ class Extrae(AutotoolsPackage):
license("LGPL-2.1-or-later")
version("4.2.3", sha256="c132f3609b2e6f34d95ca1598eea01e5097257b6a663bb9698206ec271825ed0")
version("4.2.2", sha256="1f776f1a3401942b79685ba13489a954a731bce7cbb8549594f6da0b557c58a7")
version("4.2.1", sha256="0260a9a4952b6ac9b82ee33ee2749c22ae10d39447e42167a2626c77f664bb9a")
version("4.2.0", sha256="7b83a1ed008440bbc1bda88297d2d0e9256780db1cf8401b3c12718451f8919a")
version("4.1.7", sha256="0ed87449f74db0abc239ee8c40176e89f9ca6a69738fe751ec0df8fc46da1712")
version("4.1.6", sha256="9f146e70311b8ae9d77584f6efc7b30478885cfd095f7bd3937d5b08aec19985")
version("4.1.5", sha256="ab425f2e155e9af3332c01177df1776a6a953e721dfe8774eb23733f942b76a0")
version("4.1.4", sha256="6b5894bea046273a0d2a5c72204937ad310b2f88cd5d87d10f5ca0aaf1d637da")
version("4.1.3", sha256="889f136ddcfec2f8f9401b24ee29ebf74cf055e4e524c54821aba25513c24c03")
version("4.1.2", sha256="adbc1d3aefde7649262426d471237dc96f070b93be850a6f15280ed86fd0b952")
version("4.0.6", sha256="233be38035dd76f6877b1fd93d308e024e5d4ef5519d289f8e319cd6c58d0bc6")
version("4.0.5", sha256="8f5eefa95f2e94a3b5f9b7f7cbaaed523862f190575ee797113b1e97deff1586")
@@ -128,7 +137,7 @@ def configure_args(self):
args += (
["--with-cuda=%s" % spec["cuda"].prefix]
if spec.satisifes("+cuda")
if spec.satisfies("+cuda")
else ["--without-cuda"]
)
@@ -160,6 +169,18 @@ def flag_handler(self, name, flags):
flags.append("-lintl")
elif name == "ldflags":
flags.append("-pthread")
# This is to work around
# <https://github.com/bsc-performance-tools/extrae/issues/115>.
if self.spec.satisfies("%gcc@14:") and name == "cflags":
flags.extend(
[
"-Wno-error=incompatible-pointer-types",
"-Wno-error=implicit-function-declaration",
"-Wno-error=int-conversion",
]
)
return self.build_system_flags(name, flags)
def install(self, spec, prefix):

View File

@@ -0,0 +1,39 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import *
class Flux(CMakePackage):
"""A C++20 library for sequence-orientated programming"""
homepage = "https://tristanbrindle.com/flux/"
url = "https://github.com/tcbrindle/flux/archive/refs/tags/v0.4.0.tar.gz"
maintainers("pranav-sivaraman")
license("BSL-1.0", checked_by="pranav-sivaraman")
version("0.4.0", sha256="95e7d9d71c9ee9e89bb24b46ccba77ddfb0a1580630c2faab0b415dacc7c8d56")
variant("docs", default=False, description="Build Flux documentation")
depends_on("cxx", type="build")
depends_on("cmake@3.23:", type="build")
with default_args(when="+docs"):
depends_on("py-sphinx")
depends_on("py-sphinx-copybutton")
depends_on("py-furo")
def cmake_args(self):
args = [
self.define("FLUX_BUILD_TESTS", self.run_tests),
self.define("FLUX_BUILD_EXAMPLES", False),
self.define_from_variant("FLUX_BUILD_DOCS", "docs"),
]
return args

View File

@@ -18,6 +18,7 @@ class G2c(CMakePackage):
maintainers("AlexanderRichert-NOAA", "Hang-Lei-NOAA", "edwardhartnett")
version("develop", branch="develop")
version("2.0.0", sha256="39c23bf1219c60101548c8525e3a879c84119558f768081779d404a8caf4cec9")
version("1.9.0", sha256="5554276e18bdcddf387a08c2dd23f9da310c6598905df6a2a244516c22ded9aa")
version("1.8.0", sha256="4ce9f5a7cb0950699fe08ebc5a463ab4d09ef550c050391a319308a2494f971f")
version("1.7.0", sha256="73afba9da382fed73ed8692d77fa037bb313280879cd4012a5e5697dccf55175")

View File

@@ -24,13 +24,18 @@ class G4abla(Package):
def install(self, spec, prefix):
mkdirp(join_path(prefix.share, "data"))
install_path = join_path(prefix.share, "data", "G4ABLA{0}".format(self.version))
install_path = join_path(prefix.share, "data", self.g4datasetname)
install_tree(self.stage.source_path, install_path)
def setup_dependent_run_environment(self, env, dependent_spec):
install_path = join_path(self.prefix.share, "data", "G4ABLA{0}".format(self.version))
install_path = join_path(self.prefix.share, "data", self.g4datasetname)
env.set("G4ABLADATA", install_path)
def url_for_version(self, version):
"""Handle version string."""
return "http://geant4-data.web.cern.ch/geant4-data/datasets/G4ABLA.%s.tar.gz" % version
@property
def g4datasetname(self):
spec = self.spec
return "G4ABLA{0}".format(spec.version)

View File

@@ -35,13 +35,18 @@ class G4emlow(Package):
def install(self, spec, prefix):
mkdirp(join_path(prefix.share, "data"))
install_path = join_path(prefix.share, "data", "G4EMLOW{0}".format(self.version))
install_path = join_path(prefix.share, "data", self.g4datasetname)
install_tree(self.stage.source_path, install_path)
def setup_dependent_run_environment(self, env, dependent_spec):
install_path = join_path(self.prefix.share, "data", "G4EMLOW{0}".format(self.version))
install_path = join_path(self.prefix.share, "data", self.g4datasetname)
env.set("G4LEDATA", install_path)
def url_for_version(self, version):
"""Handle version string."""
return "https://geant4-data.web.cern.ch/geant4-data/datasets/G4EMLOW.%s.tar.gz" % version
@property
def g4datasetname(self):
spec = self.spec
return "G4EMLOW{0}".format(spec.version)

View File

@@ -25,11 +25,11 @@ class G4ensdfstate(Package):
def install(self, spec, prefix):
mkdirp(join_path(prefix.share, "data"))
install_path = join_path(prefix.share, "data", "G4ENSDFSTATE{0}".format(self.version))
install_path = join_path(prefix.share, "data", self.g4datasetname)
install_tree(self.stage.source_path, install_path)
def setup_dependent_run_environment(self, env, dependent_spec):
install_path = join_path(self.prefix.share, "data", "G4ENSDFSTATE{0}".format(self.version))
install_path = join_path(self.prefix.share, "data", self.g4datasetname)
env.set("G4ENSDFSTATEDATA", install_path)
def url_for_version(self, version):
@@ -37,3 +37,8 @@ def url_for_version(self, version):
return (
"http://geant4-data.web.cern.ch/geant4-data/datasets/G4ENSDFSTATE.%s.tar.gz" % version
)
@property
def g4datasetname(self):
spec = self.spec
return "G4ENSDFSTATE{0}".format(spec.version)

View File

@@ -25,13 +25,18 @@ class G4incl(Package):
def install(self, spec, prefix):
mkdirp(join_path(prefix.share, "data"))
install_path = join_path(prefix.share, "data", "G4INCL{0}".format(self.version))
install_path = join_path(prefix.share, "data", self.g4datasetname)
install_tree(self.stage.source_path, install_path)
def setup_dependent_run_environment(self, env, dependent_spec):
install_path = join_path(self.prefix.share, "data", "G4INCL{0}".format(self.version))
install_path = join_path(self.prefix.share, "data", self.g4datasetname)
env.set("G4INCLDATA", install_path)
def url_for_version(self, version):
"""Handle version string."""
return "http://geant4-data.web.cern.ch/geant4-data/datasets/G4INCL.%s.tar.gz" % version
@property
def g4datasetname(self):
spec = self.spec
return "G4INCL{0}".format(spec.version)

View File

@@ -25,13 +25,18 @@ class G4ndl(Package):
def install(self, spec, prefix):
mkdirp(join_path(prefix.share, "data"))
install_path = join_path(prefix.share, "data", "G4NDL{0}".format(self.version))
install_path = join_path(prefix.share, "data", self.g4datasetname)
install_tree(self.stage.source_path, install_path)
def setup_dependent_run_environment(self, env, dependent_spec):
install_path = join_path(self.prefix.share, "data", "G4NDL{0}".format(self.version))
install_path = join_path(self.prefix.share, "data", self.g4datasetname)
env.set("G4NEUTRONHPDATA", install_path)
def url_for_version(self, version):
"""Handle version string."""
return "http://geant4-data.web.cern.ch/geant4-data/datasets/G4NDL.%s.tar.gz" % version
@property
def g4datasetname(self):
spec = self.spec
return "G4NDL{0}".format(spec.version)

View File

@@ -24,11 +24,11 @@ class G4neutronxs(Package):
def install(self, spec, prefix):
mkdirp(join_path(prefix.share, "data"))
install_path = join_path(prefix.share, "data", "G4NEUTRONXS{0}".format(self.version))
install_path = join_path(prefix.share, "data", self.g4datasetname)
install_tree(self.stage.source_path, install_path)
def setup_dependent_run_environment(self, env, dependent_spec):
install_path = join_path(self.prefix.share, "data", "G4NEUTRONXS{0}".format(self.version))
install_path = join_path(self.prefix.share, "data", self.g4datasetname)
env.set("G4NEUTRONXSDATA", install_path)
def url_for_version(self, version):
@@ -36,3 +36,8 @@ def url_for_version(self, version):
return (
"http://geant4-data.web.cern.ch/geant4-data/datasets/G4NEUTRONXS.%s.tar.gz" % version
)
@property
def g4datasetname(self):
spec = self.spec
return "G4NEUTRONXS{0}".format(spec.version)

View File

@@ -23,13 +23,18 @@ class G4nudexlib(Package):
def install(self, spec, prefix):
mkdirp(join_path(prefix.share, "data"))
install_path = join_path(prefix.share, "data", "G4NUDEXLIB{0}".format(self.version))
install_path = join_path(prefix.share, "data", self.g4datasetname)
install_tree(self.stage.source_path, install_path)
def setup_dependent_run_environment(self, env, dependent_spec):
install_path = join_path(self.prefix.share, "data", "G4NUDEXLIB{0}".format(self.version))
install_path = join_path(self.prefix.share, "data", self.g4datasetname)
env.set("G4NUDEXLIBDATA", install_path)
def url_for_version(self, version):
"""Handle version string."""
return "http://geant4-data.web.cern.ch/geant4-data/datasets/G4NUDEXLIB.%s.tar.gz" % version
@property
def g4datasetname(self):
spec = self.spec
return "G4NUDEXLIB{0}".format(spec.version)

View File

@@ -28,11 +28,11 @@ class G4particlexs(Package):
def install(self, spec, prefix):
mkdirp(join_path(prefix.share, "data"))
install_path = join_path(prefix.share, "data", "G4PARTICLEXS{0}".format(self.version))
install_path = join_path(prefix.share, "data", self.g4datasetname)
install_tree(self.stage.source_path, install_path)
def setup_dependent_run_environment(self, env, dependent_spec):
install_path = join_path(self.prefix.share, "data", "G4PARTICLEXS{0}".format(self.version))
install_path = join_path(self.prefix.share, "data", self.g4datasetname)
env.set("G4PARTICLEXSDATA", install_path)
def url_for_version(self, version):
@@ -40,3 +40,8 @@ def url_for_version(self, version):
return (
"http://geant4-data.web.cern.ch/geant4-data/datasets/G4PARTICLEXS.%s.tar.gz" % version
)
@property
def g4datasetname(self):
spec = self.spec
return "G4PARTICLEXS{0}".format(spec.version)

View File

@@ -27,13 +27,11 @@ class G4photonevaporation(Package):
def install(self, spec, prefix):
mkdirp(join_path(prefix.share, "data"))
install_path = join_path(prefix.share, "data", "PhotonEvaporation{0}".format(self.version))
install_path = join_path(prefix.share, "data", self.g4datasetname)
install_tree(self.stage.source_path, install_path)
def setup_dependent_run_environment(self, env, dependent_spec):
install_path = join_path(
self.prefix.share, "data", "PhotonEvaporation{0}".format(self.version)
)
install_path = join_path(self.prefix.share, "data", self.g4datasetname)
env.set("G4LEVELGAMMADATA", install_path)
def url_for_version(self, version):
@@ -42,3 +40,8 @@ def url_for_version(self, version):
"http://geant4-data.web.cern.ch/geant4-data/datasets/G4PhotonEvaporation.%s.tar.gz"
% version
)
@property
def g4datasetname(self):
spec = self.spec
return "PhotonEvaporation{0}".format(spec.version)

View File

@@ -22,13 +22,18 @@ class G4pii(Package):
def install(self, spec, prefix):
mkdirp(join_path(prefix.share, "data"))
install_path = join_path(prefix.share, "data", "G4PII{0}".format(self.version))
install_path = join_path(prefix.share, "data", self.g4datasetname)
install_tree(self.stage.source_path, install_path)
def setup_dependent_run_environment(self, env, dependent_spec):
install_path = join_path(self.prefix.share, "data", "G4PII{0}".format(self.version))
install_path = join_path(self.prefix.share, "data", self.g4datasetname)
env.set("G4PIIDATA", install_path)
def url_for_version(self, version):
"""Handle version string."""
return "https://geant4-data.web.cern.ch/geant4-data/datasets/G4PII.1.3.tar.gz" % version
@property
def g4datasetname(self):
spec = self.spec
return "G4PII{0}".format(spec.version)

View File

@@ -27,13 +27,11 @@ class G4radioactivedecay(Package):
def install(self, spec, prefix):
mkdirp(join_path(prefix.share, "data"))
install_path = join_path(prefix.share, "data", "RadioactiveDecay{0}".format(self.version))
install_path = join_path(prefix.share, "data", self.g4datasetname)
install_tree(self.stage.source_path, install_path)
def setup_dependent_run_environment(self, env, dependent_spec):
install_path = join_path(
self.prefix.share, "data", "RadioactiveDecay{0}".format(self.version)
)
install_path = join_path(self.prefix.share, "data", self.g4datasetname)
env.set("G4RADIOACTIVEDATA", install_path)
def url_for_version(self, version):
@@ -42,3 +40,8 @@ def url_for_version(self, version):
"http://geant4-data.web.cern.ch/geant4-data/datasets/G4RadioactiveDecay.%s.tar.gz"
% version
)
@property
def g4datasetname(self):
spec = self.spec
return "RadioactiveDecay{0}".format(spec.version)

View File

@@ -25,11 +25,11 @@ class G4realsurface(Package):
def install(self, spec, prefix):
mkdirp(join_path(prefix.share, "data"))
install_path = join_path(prefix.share, "data", "RealSurface{0}".format(self.version))
install_path = join_path(prefix.share, "data", self.g4datasetname)
install_tree(self.stage.source_path, install_path)
def setup_dependent_run_environment(self, env, dependent_spec):
install_path = join_path(self.prefix.share, "data", "RealSurface{0}".format(self.version))
install_path = join_path(self.prefix.share, "data", self.g4datasetname)
env.set("G4REALSURFACEDATA", install_path)
def url_for_version(self, version):
@@ -39,3 +39,8 @@ def url_for_version(self, version):
"G4" if version > Version("1.0") else "", version
)
)
@property
def g4datasetname(self):
spec = self.spec
return "RealSurface{0}".format(spec.version)

View File

@@ -23,13 +23,18 @@ class G4saiddata(Package):
def install(self, spec, prefix):
mkdirp(join_path(prefix.share, "data"))
install_path = join_path(prefix.share, "data", "G4SAIDDATA{0}".format(self.version))
install_path = join_path(prefix.share, "data", self.g4datasetname)
install_tree(self.stage.source_path, install_path)
def setup_dependent_run_environment(self, env, dependent_spec):
install_path = join_path(self.prefix.share, "data", "G4SAIDDATA{0}".format(self.version))
install_path = join_path(self.prefix.share, "data", self.g4datasetname)
env.set("G4SAIDXSDATA", install_path)
def url_for_version(self, version):
"""Handle version string."""
return "http://geant4-data.web.cern.ch/geant4-data/datasets/G4SAIDDATA.%s.tar.gz" % version
@property
def g4datasetname(self):
spec = self.spec
return "G4SAIDDATA{0}".format(spec.version)

View File

@@ -24,13 +24,18 @@ class G4tendl(Package):
def install(self, spec, prefix):
mkdirp(join_path(prefix.share, "data"))
install_path = join_path(prefix.share, "data", "G4TENDL{0}".format(self.version))
install_path = join_path(prefix.share, "data", self.g4datasetname)
install_tree(self.stage.source_path, install_path)
def setup_dependent_run_environment(self, env, dependent_spec):
install_path = join_path(self.prefix.share, "data", "G4TENDL{0}".format(self.version))
install_path = join_path(self.prefix.share, "data", self.g4datasetname)
env.set("G4PARTICLEHPDATA", install_path)
def url_for_version(self, version):
"""Handle version string."""
return "http://geant4-data.web.cern.ch/geant4-data/datasets/G4TENDL.%s.tar.gz" % version
@property
def g4datasetname(self):
spec = self.spec
return "G4TENDL{0}".format(spec.version)

View File

@@ -23,13 +23,18 @@ class G4urrpt(Package):
def install(self, spec, prefix):
mkdirp(join_path(prefix.share, "data"))
install_path = join_path(prefix.share, "data", "G4URRPT{0}".format(self.version))
install_path = join_path(prefix.share, "data", self.g4datasetname)
install_tree(self.stage.source_path, install_path)
def setup_dependent_run_environment(self, env, dependent_spec):
install_path = join_path(self.prefix.share, "data", "G4URRPT{0}".format(self.version))
install_path = join_path(self.prefix.share, "data", self.g4datasetname)
env.set("G4URRPTDATA", install_path)
def url_for_version(self, version):
"""Handle version string."""
return "http://geant4-data.web.cern.ch/geant4-data/datasets/G4URRPT.%s.tar.gz" % version
@property
def g4datasetname(self):
spec = self.spec
return "G4URRPT{0}".format(spec.version)

View File

@@ -3,7 +3,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import glob
import os
from spack.package import *
@@ -204,5 +203,11 @@ def datadir(self):
def install(self, spec, prefix):
with working_dir(self.datadir, create=True):
for s in spec.dependencies():
for d in glob.glob("{0}/data/*".format(s.prefix.share)):
os.symlink(d, os.path.basename(d))
if not s.name.startswith("g4"):
continue
if not hasattr(s.package, "g4datasetname"):
raise InstallError(f"Dependency `{s.name}` does not expose `g4datasetname`")
d = "{0}/data/{1}".format(s.prefix.share, s.package.g4datasetname)
os.symlink(d, os.path.basename(d))

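Every Geant4 dataset package in this series gains the same g4datasetname property, and the aggregation step above now fails fast when a dependency does not expose it. The recurring shape, condensed (class, dataset, and variable names are placeholders):

from spack.package import *

class G4exampledata(Package):
    @property
    def g4datasetname(self):
        # e.g. "G4EXAMPLEDATA1.0" for version 1.0
        return "G4EXAMPLEDATA{0}".format(self.spec.version)

    def install(self, spec, prefix):
        install_path = join_path(prefix.share, "data", self.g4datasetname)
        install_tree(self.stage.source_path, install_path)

    def setup_dependent_run_environment(self, env, dependent_spec):
        env.set("G4EXAMPLEDATA", join_path(self.prefix.share, "data", self.g4datasetname))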
View File

@@ -18,6 +18,8 @@ class GoogleCloudCpp(CMakePackage):
sanity_check_is_dir = ["lib", "include"]
version("2.30.0", sha256="170650b11ece54977b42dd85be648b6bd2d614ff68ea6863a0013865e576b49c")
version("2.29.0", sha256="758e1eca8186b962516c0659b34ce1768ba1c9769cfd998c5bbffb084ad901ff")
version("2.28.0", sha256="1d51910cb4419f6100d8b9df6bccd33477d09f50e378f12b06dae0f137ed7bc6")
depends_on("abseil-cpp")
@@ -30,11 +32,17 @@ class GoogleCloudCpp(CMakePackage):
variant("shared", default=False, description="Build shared instead of static libraries")
variant(
"cxxstd",
default="11",
values=("11", "14", "17", "20"),
default="14",
values=("14", "17", "20"),
multi=False,
description="Use the specified C++ standard when building.",
)
variant(
"libraries",
default="__ga_libraries__",
multi=False,
description="Which client libraries to build/install. e.g. libraries=bigtable,storage",
)
def cmake_args(self):
args = [
@@ -43,6 +51,6 @@ def cmake_args(self):
"-DBUILD_TESTING:Bool=OFF",
"-DGOOGLE_CLOUD_CPP_WITH_MOCKS:Bool=OFF",
"-DGOOGLE_CLOUD_CPP_ENABLE_EXAMPLES:Bool=OFF",
"-DGOOGLE_CLOUD_CPP_ENABLE:String=__ga_libraries__",
self.define_from_variant("GOOGLE_CLOUD_CPP_ENABLE", "libraries"),
]
return args

View File

@@ -17,13 +17,30 @@ class Gsoap(AutotoolsPackage, SourceforgePackage):
maintainers("greenc-FNAL", "gartung", "marcmengel", "vitodb")
version("2.8.127", sha256="25ecad1bbc363494eb7ea95a68508e4c93cc20596fad9ebc196c6572bbbd3c08")
version("2.8.124", sha256="4b798780989338f665ef8e171bbcc422a271004d62d5852666d5eeca33a6a636")
version("2.8.119", sha256="8997c43b599a2bfe4a788e303a5dd24bbf5992fd06d56f606ca680ca5b0070cf")
version("2.8.114", sha256="aa70a999258100c170a3f8750c1f91318a477d440f6a28117f68bc1ded32327f")
version("2.8.113", sha256="e73782b618303cf55ea6a45751b75ba96797a7a12967ed9d02e6d5761977e73a")
version("2.8.112", sha256="05345312e0bb4d81c98ae63b97cff9eb097f38dafe09356189f9d8e235c54095")
version("2.8.111", sha256="f1670c7e3aeaa66bc5658539fbd162e5099f022666855ef2b2c2bac07fec4bd3")
version("2.8.135", sha256="b11757e405d55d4674dfbf88c4fa6d7e24155cf64ed8ed578ccad2f2b555e98d")
with default_args(deprecated=True):
# Unavailable for direct download anymore
version(
"2.8.127", sha256="25ecad1bbc363494eb7ea95a68508e4c93cc20596fad9ebc196c6572bbbd3c08"
)
version(
"2.8.124", sha256="4b798780989338f665ef8e171bbcc422a271004d62d5852666d5eeca33a6a636"
)
version(
"2.8.119", sha256="8997c43b599a2bfe4a788e303a5dd24bbf5992fd06d56f606ca680ca5b0070cf"
)
version(
"2.8.114", sha256="aa70a999258100c170a3f8750c1f91318a477d440f6a28117f68bc1ded32327f"
)
version(
"2.8.113", sha256="e73782b618303cf55ea6a45751b75ba96797a7a12967ed9d02e6d5761977e73a"
)
version(
"2.8.112", sha256="05345312e0bb4d81c98ae63b97cff9eb097f38dafe09356189f9d8e235c54095"
)
version(
"2.8.111", sha256="f1670c7e3aeaa66bc5658539fbd162e5099f022666855ef2b2c2bac07fec4bd3"
)
depends_on("openssl")
depends_on("pkgconfig", type="build")

View File

@@ -38,8 +38,7 @@ class Hdf5(CMakePackage):
# The 'develop' version is renamed so that we could uninstall (or patch) it
# without affecting other develop version.
version("develop-1.17", branch="develop")
version("develop-1.16", branch="hdf5_1_16")
version("develop-2.0", branch="develop")
version("develop-1.14", branch="hdf5_1_14")
version("develop-1.12", branch="hdf5_1_12")
version("develop-1.10", branch="hdf5_1_10")

View File

@@ -58,6 +58,10 @@ class Hepmc3(CMakePackage):
conflicts("%gcc@9.3.0", when="@:3.1.1")
patch("ba38f14d8f56c16cc4105d98f6d4540c928c6150.patch", when="@3.1.2:3.2.1 %gcc@9.3.0")
@property
def libs(self):
return find_libraries(["libHepMC3", "libHepMC3Search"], root=self.prefix, recursive=True)
def cmake_args(self):
spec = self.spec
from_variant = self.define_from_variant

View File

@@ -16,7 +16,7 @@ class Icon(AutotoolsPackage):
homepage = "https://www.icon-model.org"
url = "https://gitlab.dkrz.de/icon/icon-model/-/archive/icon-2024.01-public/icon-model-icon-2024.01-public.tar.gz"
maintainers("skosukhin")
maintainers("skosukhin", "Try2Code")
license("BSD-3-Clause", checked_by="skosukhin")

View File

@@ -13,9 +13,10 @@ class Kibana(Package):
homepage = "https://www.elastic.co/products/kibana"
url = "https://artifacts.elastic.co/downloads/kibana/kibana-6.4.0-linux-x86_64.tar.gz"
version("6.4.0", sha256="df2056105a08c206a1adf9caed09a152a53429a0f1efc1ba3ccd616092d78aee")
depends_on("cxx", type="build") # generated
version("8.15.2", sha256="b1f8082a4200867078170e92ad299e293ee514f5fdbb96b7a0d1de17a880d1eb")
with default_args(deprecated=True):
# https://nvd.nist.gov/vuln/detail/CVE-2019-7609
version("6.4.0", sha256="df2056105a08c206a1adf9caed09a152a53429a0f1efc1ba3ccd616092d78aee")
depends_on("java", type="run")

View File

@@ -11,7 +11,7 @@ class KokkosKernels(CMakePackage, CudaPackage):
homepage = "https://github.com/kokkos/kokkos-kernels"
git = "https://github.com/kokkos/kokkos-kernels.git"
url = "https://github.com/kokkos/kokkos-kernels/archive/4.0.00.tar.gz"
url = "https://github.com/kokkos/kokkos-kernels/releases/download/4.4.01/kokkos-kernels-4.4.01.tar.gz"
tags = ["e4s"]
@@ -21,32 +21,111 @@ class KokkosKernels(CMakePackage, CudaPackage):
license("Apache-2.0 WITH LLVM-exception")
# generate checksum for each release tarball with the following command
# openssl sha256 kokkos-kernels-x.y.z.tar.gz
version("develop", branch="develop")
version("master", branch="master")
version("4.4.01", sha256="9f741449f5ace5a7d8a5a81194ff2108e5525d16f08fcd9bb6c9bb4853d7720d")
version("4.4.00", sha256="6559871c091eb5bcff53bae5a0f04f2298971d1aa1b2c135bd5a2dae3f9376a2")
version("4.3.01", sha256="749553a6ea715ba1e56fa0b13b42866bb9880dba7a94e343eadf40d08c68fab8")
version("4.3.00", sha256="03c3226ee97dbca4fa56fe69bc4eefa0673e23c37f2741943d9362424a63950e")
version("4.2.01", sha256="058052b3a40f5d4e447b7ded5c480f1b0d4aa78373b0bc7e43804d0447c34ca8")
version("4.2.00", sha256="c65df9a101dbbef2d8fd43c60c9ea85f2046bb3535fa1ad16e7c661ddd60401e")
version("4.1.00", sha256="d6a4108444ea226e43bf6a9c0dfc557f223a72b1142bf81aa78dd60e16ac2d56")
version("4.0.01", sha256="3f493fcb0244b26858ceb911be64092fbf7785616ad62c81abde0ea1ce86688a")
version("4.0.00", sha256="750079d0be1282d18ecd280e130ca303044ac399f1e5864488284b92f5ce0a86")
version("3.7.01", sha256="b2060f5894bdaf7f7d4793b90444fac260460cfa80595afcbcb955518864b446")
version("3.7.00", sha256="51bc6db3995392065656848e2b152cfd1c3a95a951ab18a3934278113d59f32b")
version("3.6.01", sha256="f000b156c8c0b80e85d38587907c11d9479aaf362408b812effeda5e22b24d0d")
version("3.6.00", sha256="2753643fd643b9eed9f7d370e0ff5fa957211d08a91aa75398e31cbc9e5eb0a5")
version("3.5.00", sha256="a03a41a047d95f9f07cd1e1d30692afdb75b5c705ef524e19c1d02fe60ccf8d1")
version("3.4.01", sha256="f504aa4afbffb58fa7c4430d0fdb8fd5690a268823fa15eb0b7d58dab9d351e6")
version("3.4.00", sha256="07ba11869e686cb0d47272d1ef494ccfbcdef3f93ff1c8b64ab9e136a53a227a")
version("3.3.01", sha256="0f21fe6b5a8b6ae7738290e293aa990719aefe88b32f84617436bfd6074a8f77")
version("3.3.00", sha256="8d7f78815301afb90ddba7914dce5b718cea792ac0c7350d2f8d00bd2ef1cece")
version("3.2.01", sha256="c486e5cac19e354a517498c362838619435734d64b44f44ce909b0531c21d95c")
version("3.2.00", sha256="8ac20ee28ae7813ce1bda461918800ad57fdbac2af86ef5d1ba74e83e10956de")
version("3.1.00", sha256="27fea241ae92f41bd5b070b1a590ba3a56a06aca750207a98bea2f64a4a40c89")
version("3.0.00", sha256="e4b832aed3f8e785de24298f312af71217a26067aea2de51531e8c1e597ef0e6")
version("4.4.01", sha256="4a32bc8330e0113856bdf181df94cc4f9902e3cebb5dc7cea5948f30df03bfa1")
version("4.4.00", sha256="66d5c3f728a8c7689159c97006996164ea00fd39702476220e3dbf2a05c49e8f")
version(
"4.3.01",
sha256="749553a6ea715ba1e56fa0b13b42866bb9880dba7a94e343eadf40d08c68fab8",
url="https://github.com/kokkos/kokkos-kernels/archive/4.3.01.tar.gz",
)
version(
"4.3.00",
sha256="03c3226ee97dbca4fa56fe69bc4eefa0673e23c37f2741943d9362424a63950e",
url="https://github.com/kokkos/kokkos-kernels/archive/4.3.00.tar.gz",
)
version(
"4.2.01",
sha256="058052b3a40f5d4e447b7ded5c480f1b0d4aa78373b0bc7e43804d0447c34ca8",
url="https://github.com/kokkos/kokkos-kernels/archive/4.2.01.tar.gz",
)
version(
"4.2.00",
sha256="c65df9a101dbbef2d8fd43c60c9ea85f2046bb3535fa1ad16e7c661ddd60401e",
url="https://github.com/kokkos/kokkos-kernels/archive/4.2.00.tar.gz",
)
version(
"4.1.00",
sha256="d6a4108444ea226e43bf6a9c0dfc557f223a72b1142bf81aa78dd60e16ac2d56",
url="https://github.com/kokkos/kokkos-kernels/archive/4.1.00.tar.gz",
)
version(
"4.0.01",
sha256="3f493fcb0244b26858ceb911be64092fbf7785616ad62c81abde0ea1ce86688a",
url="https://github.com/kokkos/kokkos-kernels/archive/4.0.01.tar.gz",
)
version(
"4.0.00",
sha256="750079d0be1282d18ecd280e130ca303044ac399f1e5864488284b92f5ce0a86",
url="https://github.com/kokkos/kokkos-kernels/archive/4.0.00.tar.gz",
)
version(
"3.7.01",
sha256="b2060f5894bdaf7f7d4793b90444fac260460cfa80595afcbcb955518864b446",
url="https://github.com/kokkos/kokkos-kernels/archive/3.7.01.tar.gz",
)
version(
"3.7.00",
sha256="51bc6db3995392065656848e2b152cfd1c3a95a951ab18a3934278113d59f32b",
url="https://github.com/kokkos/kokkos-kernels/archive/3.7.00.tar.gz",
)
version(
"3.6.01",
sha256="f000b156c8c0b80e85d38587907c11d9479aaf362408b812effeda5e22b24d0d",
url="https://github.com/kokkos/kokkos-kernels/archive/3.6.01.tar.gz",
)
version(
"3.6.00",
sha256="2753643fd643b9eed9f7d370e0ff5fa957211d08a91aa75398e31cbc9e5eb0a5",
url="https://github.com/kokkos/kokkos-kernels/archive/3.6.00.tar.gz",
)
version(
"3.5.00",
sha256="a03a41a047d95f9f07cd1e1d30692afdb75b5c705ef524e19c1d02fe60ccf8d1",
url="https://github.com/kokkos/kokkos-kernels/archive/3.5.00.tar.gz",
)
version(
"3.4.01",
sha256="f504aa4afbffb58fa7c4430d0fdb8fd5690a268823fa15eb0b7d58dab9d351e6",
url="https://github.com/kokkos/kokkos-kernels/archive/3.4.01.tar.gz",
)
version(
"3.4.00",
sha256="07ba11869e686cb0d47272d1ef494ccfbcdef3f93ff1c8b64ab9e136a53a227a",
url="https://github.com/kokkos/kokkos-kernels/archive/3.4.00.tar.gz",
)
version(
"3.3.01",
sha256="0f21fe6b5a8b6ae7738290e293aa990719aefe88b32f84617436bfd6074a8f77",
url="https://github.com/kokkos/kokkos-kernels/archive/3.3.01.tar.gz",
)
version(
"3.3.00",
sha256="8d7f78815301afb90ddba7914dce5b718cea792ac0c7350d2f8d00bd2ef1cece",
url="https://github.com/kokkos/kokkos-kernels/archive/3.3.00.tar.gz",
)
version(
"3.2.01",
sha256="c486e5cac19e354a517498c362838619435734d64b44f44ce909b0531c21d95c",
url="https://github.com/kokkos/kokkos-kernels/archive/3.2.01.tar.gz",
)
version(
"3.2.00",
sha256="8ac20ee28ae7813ce1bda461918800ad57fdbac2af86ef5d1ba74e83e10956de",
url="https://github.com/kokkos/kokkos-kernels/archive/3.2.00.tar.gz",
)
version(
"3.1.00",
sha256="27fea241ae92f41bd5b070b1a590ba3a56a06aca750207a98bea2f64a4a40c89",
url="https://github.com/kokkos/kokkos-kernels/archive/3.1.00.tar.gz",
)
version(
"3.0.00",
sha256="e4b832aed3f8e785de24298f312af71217a26067aea2de51531e8c1e597ef0e6",
url="https://github.com/kokkos/kokkos-kernels/archive/3.0.00.tar.gz",
)
depends_on("cxx", type="build") # generated


@@ -15,7 +15,7 @@ class Kokkos(CMakePackage, CudaPackage, ROCmPackage):
homepage = "https://github.com/kokkos/kokkos"
git = "https://github.com/kokkos/kokkos.git"
url = "https://github.com/kokkos/kokkos/archive/3.6.00.tar.gz"
url = "https://github.com/kokkos/kokkos/releases/download/4.4.01/kokkos-4.4.01.tar.gz"
tags = ["e4s"]
@@ -27,30 +27,120 @@ class Kokkos(CMakePackage, CudaPackage, ROCmPackage):
version("master", branch="master")
version("develop", branch="develop")
version("4.4.01", sha256="3f7096d17eaaa4004c7497ac082bf1ae3ff47b5104149e54af021a89414c3682")
version("4.4.00", sha256="c638980cb62c34969b8c85b73e68327a2cb64f763dd33e5241f5fd437170205a")
version("4.3.01", sha256="5998b7c732664d6b5e219ccc445cd3077f0e3968b4be480c29cd194b4f45ec70")
version("4.3.00", sha256="53cf30d3b44dade51d48efefdaee7a6cf109a091b702a443a2eda63992e5fe0d")
version("4.2.01", sha256="cbabbabba021d00923fb357d2e1b905dda3838bd03c885a6752062fe03c67964")
version("4.2.00", sha256="ac08765848a0a6ac584a0a46cd12803f66dd2a2c2db99bb17c06ffc589bf5be8")
version("4.1.00", sha256="cf725ea34ba766fdaf29c884cfe2daacfdc6dc2d6af84042d1c78d0f16866275")
version("4.0.01", sha256="bb942de8afdd519fd6d5d3974706bfc22b6585a62dd565c12e53bdb82cd154f0")
version("4.0.00", sha256="1829a423883d4b44223c7c3a53d3c51671145aad57d7d23e6a1a4bebf710dcf6")
version("3.7.02", sha256="5024979f06bc8da2fb696252a66297f3e0e67098595a0cc7345312b3b4aa0f54")
version("3.7.01", sha256="0481b24893d1bcc808ec68af1d56ef09b82a1138a1226d6be27c3b3c3da65ceb")
version("3.7.00", sha256="62e3f9f51c798998f6493ed36463f66e49723966286ef70a9dcba329b8443040")
version("3.6.01", sha256="1b80a70c5d641da9fefbbb652e857d7c7a76a0ebad1f477c253853e209deb8db")
version("3.6.00", sha256="53b11fffb53c5d48da5418893ac7bc814ca2fde9c86074bdfeaa967598c918f4")
version("3.5.00", sha256="748f06aed63b1e77e3653cd2f896ef0d2c64cb2e2d896d9e5a57fec3ff0244ff")
version("3.4.01", sha256="146d5e233228e75ef59ca497e8f5872d9b272cb93e8e9cdfe05ad34a23f483d1")
version("3.4.00", sha256="2e4438f9e4767442d8a55e65d000cc9cde92277d415ab4913a96cd3ad901d317")
version("3.3.01", sha256="4919b00bb7b6eb80f6c335a32f98ebe262229d82e72d3bae6dd91aaf3d234c37")
version("3.3.00", sha256="170b9deaa1943185e928f8fcb812cd4593a07ed7d220607467e8f0419e147295")
version("3.2.01", sha256="9e27a3d8f81559845e190d60f277d84d6f558412a3df3301d9545e91373bcaf1")
version("3.2.00", sha256="05e1b4dd1ef383ca56fe577913e1ff31614764e65de6d6f2a163b2bddb60b3e9")
version("3.1.01", sha256="ff5024ebe8570887d00246e2793667e0d796b08c77a8227fe271127d36eec9dd")
version("3.1.00", sha256="b935c9b780e7330bcb80809992caa2b66fd387e3a1c261c955d622dae857d878")
version("3.0.00", sha256="c00613d0194a4fbd0726719bbed8b0404ed06275f310189b3493f5739042a92b")
version("4.4.01", sha256="3413f0cb39912128d91424ebd92e8832009e7eeaf6fa8da58e99b0d37860d972")
version("4.4.00", sha256="0b46372f38c48aa088411ac1b7c173a5c90f0fdb69ab40271827688fc134f58b")
version(
"4.3.01",
sha256="5998b7c732664d6b5e219ccc445cd3077f0e3968b4be480c29cd194b4f45ec70",
url="https://github.com/kokkos/kokkos/archive/4.3.01.tar.gz",
)
version(
"4.3.00",
sha256="53cf30d3b44dade51d48efefdaee7a6cf109a091b702a443a2eda63992e5fe0d",
url="https://github.com/kokkos/kokkos/archive/4.3.00.tar.gz",
)
version(
"4.2.01",
sha256="cbabbabba021d00923fb357d2e1b905dda3838bd03c885a6752062fe03c67964",
url="https://github.com/kokkos/kokkos/archive/4.2.01.tar.gz",
)
version(
"4.2.00",
sha256="ac08765848a0a6ac584a0a46cd12803f66dd2a2c2db99bb17c06ffc589bf5be8",
url="https://github.com/kokkos/kokkos/archive/4.2.00.tar.gz",
)
version(
"4.1.00",
sha256="cf725ea34ba766fdaf29c884cfe2daacfdc6dc2d6af84042d1c78d0f16866275",
url="https://github.com/kokkos/kokkos/archive/4.1.00.tar.gz",
)
version(
"4.0.01",
sha256="bb942de8afdd519fd6d5d3974706bfc22b6585a62dd565c12e53bdb82cd154f0",
url="https://github.com/kokkos/kokkos/archive/4.0.01.tar.gz",
)
version(
"4.0.00",
sha256="1829a423883d4b44223c7c3a53d3c51671145aad57d7d23e6a1a4bebf710dcf6",
url="https://github.com/kokkos/kokkos/archive/4.0.00.tar.gz",
)
version(
"3.7.02",
sha256="5024979f06bc8da2fb696252a66297f3e0e67098595a0cc7345312b3b4aa0f54",
url="https://github.com/kokkos/kokkos/archive/3.7.02.tar.gz",
)
version(
"3.7.01",
sha256="0481b24893d1bcc808ec68af1d56ef09b82a1138a1226d6be27c3b3c3da65ceb",
url="https://github.com/kokkos/kokkos/archive/3.7.01.tar.gz",
)
version(
"3.7.00",
sha256="62e3f9f51c798998f6493ed36463f66e49723966286ef70a9dcba329b8443040",
url="https://github.com/kokkos/kokkos/archive/3.7.00.tar.gz",
)
version(
"3.6.01",
sha256="1b80a70c5d641da9fefbbb652e857d7c7a76a0ebad1f477c253853e209deb8db",
url="https://github.com/kokkos/kokkos/archive/3.6.01.tar.gz",
)
version(
"3.6.00",
sha256="53b11fffb53c5d48da5418893ac7bc814ca2fde9c86074bdfeaa967598c918f4",
url="https://github.com/kokkos/kokkos/archive/3.6.00.tar.gz",
)
version(
"3.5.00",
sha256="748f06aed63b1e77e3653cd2f896ef0d2c64cb2e2d896d9e5a57fec3ff0244ff",
url="https://github.com/kokkos/kokkos/archive/3.5.00.tar.gz",
)
version(
"3.4.01",
sha256="146d5e233228e75ef59ca497e8f5872d9b272cb93e8e9cdfe05ad34a23f483d1",
url="https://github.com/kokkos/kokkos/archive/3.4.01.tar.gz",
)
version(
"3.4.00",
sha256="2e4438f9e4767442d8a55e65d000cc9cde92277d415ab4913a96cd3ad901d317",
url="https://github.com/kokkos/kokkos/archive/3.4.00.tar.gz",
)
version(
"3.3.01",
sha256="4919b00bb7b6eb80f6c335a32f98ebe262229d82e72d3bae6dd91aaf3d234c37",
url="https://github.com/kokkos/kokkos/archive/3.3.01.tar.gz",
)
version(
"3.3.00",
sha256="170b9deaa1943185e928f8fcb812cd4593a07ed7d220607467e8f0419e147295",
url="https://github.com/kokkos/kokkos/archive/3.3.00.tar.gz",
)
version(
"3.2.01",
sha256="9e27a3d8f81559845e190d60f277d84d6f558412a3df3301d9545e91373bcaf1",
url="https://github.com/kokkos/kokkos/archive/3.2.01.tar.gz",
)
version(
"3.2.00",
sha256="05e1b4dd1ef383ca56fe577913e1ff31614764e65de6d6f2a163b2bddb60b3e9",
url="https://github.com/kokkos/kokkos/archive/3.2.00.tar.gz",
)
version(
"3.1.01",
sha256="ff5024ebe8570887d00246e2793667e0d796b08c77a8227fe271127d36eec9dd",
url="https://github.com/kokkos/kokkos/archive/3.1.01.tar.gz",
)
version(
"3.1.00",
sha256="b935c9b780e7330bcb80809992caa2b66fd387e3a1c261c955d622dae857d878",
url="https://github.com/kokkos/kokkos/archive/3.1.00.tar.gz",
)
version(
"3.0.00",
sha256="c00613d0194a4fbd0726719bbed8b0404ed06275f310189b3493f5739042a92b",
url="https://github.com/kokkos/kokkos/archive/3.0.00.tar.gz",
)
depends_on("cxx", type="build") # Kokkos requires a C++ compiler


@@ -103,7 +103,7 @@ def patch(self):
def configure_args(self):
spec = self.spec
args = ["--without-system-verto"]
args = ["--without-system-verto", "--without-keyutils"]
if spec.satisfies("~shared"):
args.append("--enable-static")


@@ -310,6 +310,12 @@ def validate_gasnet_root(value):
"sysomp", default=False, description="Use system OpenMP implementation instead of Realm's"
)
def flag_handler(self, name, flags):
if name == "cxxflags":
if self.spec.satisfies("%oneapi@2025:"):
flags.append("-Wno-error=missing-template-arg-list-after-template-kw")
return (flags, None, None)
def cmake_args(self):
spec = self.spec
from_variant = self.define_from_variant
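
The (flags, None, None) return in the flag_handler above follows Spack's convention that the hook yields a triple of flag lists: the first is injected by the compiler wrappers, the second is exported through environment variables, and the third is handed to the build system. A sketch of the three placements, reusing the same oneapi workaround:

    def flag_handler(self, name, flags):
        if name == "cxxflags" and self.spec.satisfies("%oneapi@2025:"):
            flags.append("-Wno-error=missing-template-arg-list-after-template-kw")
        return (flags, None, None)   # injected by Spack's compiler wrappers (as above)
        # return (None, flags, None) # exported via environment variables (CXXFLAGS etc.)
        # return (None, None, flags) # passed as build-system arguments (e.g. CMake flags)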


@@ -16,6 +16,7 @@ class Libpspio(AutotoolsPackage):
license("MPL-2.0")
version("0.4.1", sha256="e4f87f6d8821042db3a88dad60ae07278e36ad2571e28f5d30f02d8b164b4daa")
version("0.3.0", sha256="4dc092457e481e5cd703eeecd87e6f17749941fe274043550c8a2557a649afc5")
depends_on("c", type="build") # generated


@@ -14,6 +14,7 @@ class Libssh2(AutotoolsPackage, CMakePackage):
license("BSD-3-Clause")
version("1.11.1", sha256="d9ec76cbe34db98eec3539fe2c899d26b0c837cb3eb466a56b0f109cabf658f7")
version("1.11.0", sha256="3736161e41e2693324deb38c26cfdc3efe6209d634ba4258db1cecff6a5ad461")
version("1.10.0", sha256="2d64e90f3ded394b91d3a2e774ca203a4179f69aebee03003e5a6fa621e41d51")
version("1.9.0", sha256="d5fb8bd563305fd1074dda90bd053fb2d29fc4bce048d182f96eaa466dfadafd")
@@ -23,8 +24,7 @@ class Libssh2(AutotoolsPackage, CMakePackage):
"1.4.3", sha256="eac6f85f9df9db2e6386906a6227eb2cd7b3245739561cad7d6dc1d5d021b96d"
) # CentOS7
depends_on("c", type="build") # generated
depends_on("cxx", type="build") # generated
depends_on("c", type="build")
build_system("autotools", "cmake", default="autotools")
@@ -53,7 +53,7 @@ class Libssh2(AutotoolsPackage, CMakePackage):
# and fails to prepend the -L flags, which is causing issues in libgit2, as
# it tries to locate e.g. libssl in the dirs of the pc file's -L flags, and
# cannot find the lib.
patch("pr-1114.patch", when="@1.7:")
patch("pr-1114.patch", when="@1.7:1.11.0")
class CMakeBuilder(spack.build_systems.cmake.CMakeBuilder):
@@ -77,14 +77,27 @@ def cmake_args(self):
class AutotoolsBuilder(spack.build_systems.autotools.AutotoolsBuilder):
def configure_args(self):
args = ["--disable-tests", "--disable-docker-tests", "--disable-examples-build"]
args += self.enable_or_disable("shared")
args = [
"--disable-tests",
"--disable-docker-tests",
"--disable-examples-build",
"--without-libgcrypt",
"--without-wincng",
*self.enable_or_disable("shared"),
]
crypto = self.spec.variants["crypto"].value
if crypto == "openssl":
args.append(f"--with-libssl-prefix={self.spec['openssl'].prefix}")
elif crypto == "mbedtls":
args.append(f"--with-libmbedcrypto-prefix={self.spec['mbedtls'].prefix}")
if self.spec.satisfies("@1.9:"):
# single flag for all crypto backends
args.append(f"--with-crypto={crypto}")
else:
# one flag per crypto backend
if crypto == "openssl":
args.append(f"--with-libssl-prefix={self.spec['openssl'].prefix}")
args.append("--without-mbedtls")
elif crypto == "mbedtls":
args.append(f"--with-libmbedcrypto-prefix={self.spec['mbedtls'].prefix}")
args.append("--without-openssl")
return args
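
The version split above tracks a configure-interface change: from libssh2 1.9 a single --with-crypto=<backend> flag selects the crypto library, while earlier releases need a per-backend prefix plus an explicit exclusion of the other backend. A plain-Python restatement of that mapping, for illustration only:

    # Illustration of the branch above (not Spack code); prefix stands in for the dependency prefix.
    def crypto_flags(has_single_flag: bool, crypto: str, prefix: str) -> list:
        if has_single_flag:  # libssh2@1.9: per the package code above
            return [f"--with-crypto={crypto}"]
        if crypto == "openssl":
            return [f"--with-libssl-prefix={prefix}", "--without-mbedtls"]
        return [f"--with-libmbedcrypto-prefix={prefix}", "--without-openssl"]

    assert crypto_flags(True, "openssl", "/opt/openssl") == ["--with-crypto=openssl"]
    assert crypto_flags(False, "mbedtls", "/opt/mbedtls") == [
        "--with-libmbedcrypto-prefix=/opt/mbedtls",
        "--without-openssl",
    ]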


@@ -15,6 +15,7 @@ class Libxc(AutotoolsPackage, CudaPackage):
license("MPL-2.0-no-copyleft-exception")
version("7.0.0", sha256="8d4e343041c9cd869833822f57744872076ae709a613c118d70605539fb13a77")
version("6.2.2", sha256="d1b65ef74615a1e539d87a0e6662f04baf3a2316706b4e2e686da3193b26b20f")
version("6.2.1", sha256="da96fc4f6e4221734986f49758b410ffe1d406efd3538761062a4af57a2bd272")
version("6.2.0", sha256="31edb72c69157b6c0beaff1f10cbbb6348ce7579ef81d8f286764e5ab61194d1")


@@ -30,19 +30,40 @@ def url_for_version(self, version):
license("MIT")
version("2.10.3", sha256="5d2cc3d78bec3dbe212a9d7fa629ada25a7da928af432c93060ff5c17ee28a9c")
version("2.10.2", sha256="d240abe6da9c65cb1900dd9bf3a3501ccf88b3c2a1cb98317d03f272dda5b265")
version("2.10.1", sha256="21a9e13cc7c4717a6c36268d0924f92c3f67a1ece6b7ff9d588958a6db9fb9d8")
version("2.9.14", sha256="60d74a257d1ccec0475e749cba2f21559e48139efba6ff28224357c7c798dfee")
version("2.9.13", sha256="276130602d12fe484ecc03447ee5e759d0465558fbc9d6bd144e3745306ebf0e")
version("2.9.12", sha256="c8d6681e38c56f172892c85ddc0852e1fd4b53b4209e7f4ebf17f7e2eae71d92")
version("2.9.11", sha256="886f696d5d5b45d780b2880645edf9e0c62a4fd6841b853e824ada4e02b4d331")
version("2.9.10", sha256="aafee193ffb8fe0c82d4afef6ef91972cbaf5feea100edc2f262750611b4be1f")
version("2.9.9", sha256="94fb70890143e3c6549f265cee93ec064c80a84c42ad0f23e85ee1fd6540a871")
version("2.9.8", sha256="0b74e51595654f958148759cfef0993114ddccccbb6f31aee018f3558e8e2732")
version("2.9.4", sha256="ffb911191e509b966deb55de705387f14156e1a56b21824357cdf0053233633c")
version("2.9.2", sha256="5178c30b151d044aefb1b08bf54c3003a0ac55c59c866763997529d60770d5bc")
version("2.7.8", sha256="cda23bc9ebd26474ca8f3d67e7d1c4a1f1e7106364b690d822e009fdc3c417ec")
version("2.13.4", sha256="65d042e1c8010243e617efb02afda20b85c2160acdbfbcb5b26b80cec6515650")
version("2.12.9", sha256="59912db536ab56a3996489ea0299768c7bcffe57169f0235e7f962a91f483590")
version("2.11.9", sha256="780157a1efdb57188ec474dca87acaee67a3a839c2525b2214d318228451809f")
with default_args(deprecated=True):
# https://nvd.nist.gov/vuln/detail/CVE-2024-25062
version(
"2.10.3", sha256="5d2cc3d78bec3dbe212a9d7fa629ada25a7da928af432c93060ff5c17ee28a9c"
)
version(
"2.10.2", sha256="d240abe6da9c65cb1900dd9bf3a3501ccf88b3c2a1cb98317d03f272dda5b265"
)
version(
"2.10.1", sha256="21a9e13cc7c4717a6c36268d0924f92c3f67a1ece6b7ff9d588958a6db9fb9d8"
)
version(
"2.9.14", sha256="60d74a257d1ccec0475e749cba2f21559e48139efba6ff28224357c7c798dfee"
)
version(
"2.9.13", sha256="276130602d12fe484ecc03447ee5e759d0465558fbc9d6bd144e3745306ebf0e"
)
version(
"2.9.12", sha256="c8d6681e38c56f172892c85ddc0852e1fd4b53b4209e7f4ebf17f7e2eae71d92"
)
version(
"2.9.11", sha256="886f696d5d5b45d780b2880645edf9e0c62a4fd6841b853e824ada4e02b4d331"
)
version(
"2.9.10", sha256="aafee193ffb8fe0c82d4afef6ef91972cbaf5feea100edc2f262750611b4be1f"
)
version("2.9.9", sha256="94fb70890143e3c6549f265cee93ec064c80a84c42ad0f23e85ee1fd6540a871")
version("2.9.8", sha256="0b74e51595654f958148759cfef0993114ddccccbb6f31aee018f3558e8e2732")
version("2.9.4", sha256="ffb911191e509b966deb55de705387f14156e1a56b21824357cdf0053233633c")
version("2.9.2", sha256="5178c30b151d044aefb1b08bf54c3003a0ac55c59c866763997529d60770d5bc")
version("2.7.8", sha256="cda23bc9ebd26474ca8f3d67e7d1c4a1f1e7106364b690d822e009fdc3c417ec")
depends_on("c", type="build") # generated


@@ -56,6 +56,7 @@ class Llvm(CMakePackage, CudaPackage, LlvmDetection, CompilerPackage):
license("Apache-2.0")
version("main", branch="main")
version("19.1.3", sha256="e5106e2bef341b3f5e41340e4b6c6a58259f4021ad801acf14e88f1a84567b05")
version("19.1.2", sha256="622cb6c5e95a3bb7e9876c4696a65671f235bd836cfd0c096b272f6c2ada41e7")
version("19.1.1", sha256="115dfd98a353d05bffdab3f80db22f159da48aca0124e8c416f437adcd54b77f")
version("19.1.0", sha256="0a08341036ca99a106786f50f9c5cb3fbe458b3b74cab6089fd368d0edb2edfe")


@@ -15,12 +15,24 @@ class Logstash(Package):
"""
homepage = "https://artifacts.elastic.co"
url = "https://artifacts.elastic.co/downloads/logstash/logstash-6.6.0.tar.gz"
url = "https://artifacts.elastic.co/downloads/logstash/logstash-8.15.2-linux-x86_64.tar.gz"
version("6.6.0", sha256="5a9a8b9942631e9d4c3dfb8d47075276e8c2cff343841145550cc0c1cfe7bba7")
version("8.15.2", sha256="fc75c8cad1016b07f7aeeeeb7ea23f4195ab1beee2ced282f11ff6d0e84f7e51")
with default_args(deprecated=True):
# https://nvd.nist.gov/vuln/detail/CVE-2019-7612
version("6.6.0", sha256="5a9a8b9942631e9d4c3dfb8d47075276e8c2cff343841145550cc0c1cfe7bba7")
depends_on("c", type="build") # generated
depends_on("cxx", type="build") # generated
depends_on("java@11:")
def url_for_version(self, version):
if self.spec.satisfies("@:6"):
return f"https://artifacts.elastic.co/downloads/logstash/logstash-{version}.tar.gz"
else:
return f"https://artifacts.elastic.co/downloads/logstash/logstash-{version}-linux-x86_64.tar.gz"
def install(self, spec, prefix):
install_tree(".", prefix)
def setup_run_environment(self, env):
# do not use the bundled jdk
env.set("LS_JAVA_HOME", self.spec["java"].home)


@@ -24,6 +24,8 @@ class Loki(MakefilePackage):
def flag_handler(self, name, flags):
if name == "cxxflags":
if self.spec.satisfies("%oneapi@2025:"):
flags.append("-Wno-error=missing-template-arg-list-after-template-kw")
if self.spec.satisfies("%oneapi@2023.0.0:"):
flags.append("-Wno-error=dynamic-exception-spec")
if self.spec.satisfies("@0.1.7 %gcc@11:"):


@@ -23,6 +23,7 @@ class LuaLuaposix(LuaPackage):
version("33.4.0", sha256="e66262f5b7fe1c32c65f17a5ef5ffb31c4d1877019b4870a5d373e2ab6526a21")
version("33.2.1", sha256="4fb34dfea67f4cf3194cdecc6614c9aea67edc3c4093d34137669ea869c358e1")
depends_on("c", type="build") # generated
depends_on("c", type="build")
depends_on("libxcrypt", when="platform=linux")
depends_on("lua-bit32", when="^lua-lang@5.1")


@@ -19,20 +19,27 @@ class Masa(AutotoolsPackage):
license("LGPL-2.1-or-later")
version("master", branch="master")
version("0.51.0", tag="0.51.0")
version("0.50.0", tag="0.50.0")
version("0.44.0", tag="0.44.0")
version("0.43.1", tag="0.43.1")
version("0.43.0", tag="0.43.0")
version("0.42.0", tag="0.42.0")
depends_on("c", type="build") # generated
depends_on("cxx", type="build") # generated
depends_on("fortran", type="build") # generated
depends_on("c", type="build")
depends_on("cxx", type="build")
depends_on("fortran", type="build", when="+fortran")
variant("fortran", default=True, description="Compile with Fortran interfaces")
variant("python", default=True, description="Compile with Python interfaces")
variant("fortran", default=False, description="Compile with Fortran interfaces")
variant("python", default=False, description="Compile with Python interfaces")
depends_on("gettext")
depends_on("metaphysicl")
depends_on("python")
depends_on("autoconf", type="build")
depends_on("automake", type="build")
depends_on("libtool", type="build")
depends_on("swig", type="build")
depends_on("python", when="+python")
depends_on("metaphysicl")
depends_on("swig", type="build", when="+python")
def configure_args(self):
options = []
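
With the fortran and python variants now defaulting to False and their dependencies gated by when= clauses, configure_args generally has to toggle the matching interface switches as well. A hypothetical sketch; the option names are assumed, not taken from MASA's configure script:

    # Hypothetical sketch; real option names depend on MASA's configure script.
    def configure_args(self):
        options = []
        options += self.enable_or_disable("fortran")  # assumed --enable/--disable-fortran
        options += self.enable_or_disable("python")   # assumed --enable/--disable-python
        return options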


@@ -14,10 +14,14 @@ class Minizip(AutotoolsPackage):
license("Zlib")
version("1.2.11", sha256="c3e5e9fdd5004dcb542feda5ee4f0ff0744628baf8ed2dd5d66f8ca1197cb1a1")
version("1.3.1", sha256="9a93b2b7dfdac77ceba5a558a580e74667dd6fede4585b91eefb60f03b72df23")
with default_args(deprecated=True):
# https://nvd.nist.gov/vuln/detail/CVE-2022-37434
version(
"1.2.11", sha256="c3e5e9fdd5004dcb542feda5ee4f0ff0744628baf8ed2dd5d66f8ca1197cb1a1"
)
depends_on("c", type="build") # generated
depends_on("cxx", type="build") # generated
depends_on("c", type="build")
configure_directory = "contrib/minizip"
@@ -28,8 +32,9 @@ class Minizip(AutotoolsPackage):
depends_on("zlib-api")
# error: implicit declaration of function 'mkdir' is invalid in C99
patch("implicit.patch", when="%apple-clang@12:")
patch("implicit.patch", when="%gcc@7.3.0:")
with when("@:1.2.11"):
patch("implicit.patch", when="%apple-clang@12:")
patch("implicit.patch", when="%gcc@7.3.0:")
# statically link to libz.a
# https://github.com/Homebrew/homebrew-core/blob/master/Formula/minizip.rb
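
The with when("@:1.2.11") block above conjoins its constraint with each directive's own when= clause, so the two scoped patch lines are equivalent to:

    # Equivalent unscoped form of the with-when block above:
    patch("implicit.patch", when="@:1.2.11 %apple-clang@12:")
    patch("implicit.patch", when="@:1.2.11 %gcc@7.3.0:")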

Some files were not shown because too many files have changed in this diff.