Compare commits


511 Commits

Author SHA1 Message Date
Wouter Deconinck
a5f8cfd3f3 RPackages with custom url: url -> urls=[url] 2024-08-20 13:27:57 -05:00
Wouter Deconinck
d38b4024bc RPackages with bioc: remove redundant homepage and url 2024-08-20 13:27:24 -05:00
Wouter Deconinck
4d09bc13d9 RPackage: urls=[bioc/src/contrib,data/annotation/src/contrib] for bioc 2024-08-20 13:22:50 -05:00
Wouter Deconinck
1794cd598c RPackage: urls=[src/contrib,src/contrib/Archive], list_url=src/contrib 2024-08-20 13:21:41 -05:00
eugeneswalker
f1114858f5 e4s ci: add chapel (#45659)
* e4s ci: add chapel
* e4s ci: fix gpu target typo

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-08-14 09:54:07 -07:00
Greg Becker
2b6bdc7013 OneapiPackage: do not use getpass.getuser (#45727)
* OneapiPackage: do not use getpass.getuser
2024-08-14 09:48:44 -07:00
Massimiliano Culpo
586a35be43 SpecHashDescriptor: better repr in debugger (#45739) 2024-08-14 18:02:09 +02:00
Fernando Ayats
7a8dc36760 freefem: add v4.14 (#45687) 2024-08-14 08:49:04 -07:00
afzpatel
e01151a200 enable asan in remaining rocm packages (#45192)
* initial commit to enable asan in remaining rocm packages
* remove os import
* add rocm-opencl rocm-dbgapi rocm-debug-agent
* add libclang path to LD_LIBRARY_PATH
* enable asan for rocfft
* add f-string and add +asan dependencies for hip and rocm-opencl
* add conflicts for centos7/8 and rhel 9
2024-08-14 08:46:04 -07:00
Harmen Stoppels
29b50527a6 spack buildcache push: parallel in general (#45682)
Make spack buildcache push for the non-oci case also parallel, and make --update-index more efficient
2024-08-14 17:19:45 +02:00
Massimiliano Culpo
94961ffe0a Optimize marshaling of Repo and RepoPath (#45742)
When sending Repo and RepoPath over to a child process,
we go through a marshaling procedure with pickle. The
default behavior for these classes is highly inefficient,
as it serializes a lot of specs that can just be
reconstructed on the other end of the pipe.

Here we write optimized procedures to __reduce__ both
classes.
2024-08-14 14:34:35 +02:00
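As background for the optimization above, a minimal sketch of the `__reduce__` pickle protocol it relies on; the toy class and its state are illustrative, not Spack's actual Repo/RepoPath code:

```python
import pickle

class RepoPath:
    """Toy stand-in for Spack's RepoPath; only the pickling protocol matters."""

    def __init__(self, *root_dirs):
        self.root_dirs = list(root_dirs)
        # Expensive derived state that need not cross the pipe:
        self.index = {d: f"scanned({d})" for d in self.root_dirs}

    def __reduce__(self):
        # Serialize only the cheap constructor arguments; the child
        # process rebuilds the expensive index on its own side.
        return (RepoPath, tuple(self.root_dirs))

repos = RepoPath("builtin", "myrepo")
clone = pickle.loads(pickle.dumps(repos))
assert clone.index == repos.index
```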
Massimiliano Culpo
03a7da1e44 Micro-optimize finding executables (#45740) 2024-08-14 13:52:28 +02:00
Massimiliano Culpo
97ffe2e575 Add schema for compiler options in packages.yaml (#45738) 2024-08-14 11:47:36 +02:00
Harmen Stoppels
7b10aae356 Show underlying errors on fetch failure (#45714)
- unwrap/flatten nested exceptions
- improve tests
- unify curl lookup
2024-08-14 08:15:15 +00:00
AcriusWinter
b61cd74707 raja: new test API (#45184)
* raja: new test API
* raja: tweak test method names and docstrings
* raja: restore running tests under proper directory
* raja: cleanup skiptest message and example call
* raja: Tweak expected outputs to match current
* raja: test_views -> test_stencil_offset_layout

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-08-13 21:36:14 -06:00
Adam J. Stewart
374d94edf7 py-matplotlib: add v3.9.2 (#45710) 2024-08-13 22:15:00 -05:00
dependabot[bot]
827522d825 build(deps): bump docker/build-push-action from 6.6.1 to 6.7.0 (#45730)
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 6.6.1 to 6.7.0.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](16ebe778df...5cd11c3a4c)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-13 18:03:26 -06:00
Tamara Dahlgren
8ba6e7eed2 Bugfix: allow test_* build-time and stand-alone tests (#45699) 2024-08-13 16:58:00 -07:00
AcriusWinter
e40c10509d mptensor: Changed skiptest, test name, and added docstring (#44909)
* mptensor: Changed skiptest, test name, and added docstring
* mptensor: make stand-alone test method name and docstring more specific

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-08-13 17:48:25 -06:00
Sakib Rahman
21a2c3a591 py-htgettoken: add v2.0-2 (#45688)
* Add version 2.0-2

* Newer version at the top

* [@spackbot] updating style on behalf of rahmans1

* py-htgettoken: reorder comments

* Use sha256sum instead of commit id for version 2.0-2 and above

---------

Co-authored-by: rahmans1 <rahmans1@users.noreply.github.com>
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-08-13 14:39:51 -06:00
Chris Marsh
70eb7506df Add py-dask and py-distributed 2024.7.1 (#45546)
* Add dask 2024.3 and distributed 2024.7

* [@spackbot] updating style on behalf of Chrismarsh

---------

Co-authored-by: Chrismarsh <Chrismarsh@users.noreply.github.com>
2024-08-13 14:27:47 -05:00
Massimiliano Culpo
2b95eecb83 Improve external detection tests for compilers (#45709)
Extracted from #44419

This adds / modifies some external detection tests for compilers,
to reproduce cases that are currently tested in unit tests.

The unit tests will later be removed.
2024-08-13 18:37:54 +02:00
Fernando Ayats
df8507f470 bigdft : add v1.9.5 (#45270) 2024-08-13 09:25:34 -07:00
Ganesh Vijayakumar
645c8eeaeb Update OpenFAST package.py (#45706)
Mandate building the C++ driver program whenever the "cxx" option is used. Necessitated by a recent change to OpenFAST: https://github.com/OpenFAST/openfast/blob/dev/glue-codes/openfast-cpp/CMakeLists.txt#L60
2024-08-13 08:04:04 -06:00
Harmen Stoppels
b693987f95 cuda: drop preference (#45130) 2024-08-13 14:17:54 +02:00
BOUDAOUD34
7999686856 siesta: add v4.1.5, v5.0.0 and v5.0.1, add variants and build v5 using cmake (#45518)
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2024-08-13 12:10:22 +02:00
Massimiliano Culpo
7001a2a65a Fix a bug with automatic tag detection (#45696)
Extracted from #45638

When adding the "detectable" tag to a package class that has the
"tag" attribute inherited from a base class, we need to copy it to
avoid modifying the base class.
2024-08-13 10:19:26 +02:00
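A minimal illustration of the pitfall this commit fixes (toy class names, not Spack's):

```python
class Base:
    tags = ["build-tools"]

class BuggyPkg(Base):
    pass

class FixedPkg(Base):
    pass

# Buggy: mutating the inherited list also mutates the base class,
# because BuggyPkg.tags is the very same list object as Base.tags.
BuggyPkg.tags.append("detectable")
assert "detectable" in Base.tags        # the base class was polluted

# Fixed (what the commit does): copy the attribute before modifying it.
Base.tags = ["build-tools"]             # reset for the demo
FixedPkg.tags = list(FixedPkg.tags)     # independent copy on the subclass
FixedPkg.tags.append("detectable")
assert "detectable" not in Base.tags
```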
Kaan
7c985d6432 Intel OneAPI Codeplay Plugin for NVIDIA GPU Offload (#45655)
* kickoff attempt

* resource similar to fortran

* delete unused install_component_codeplay

* Adding conflict for versions <= 2022.2.1, moving install to package.py, adding sha256 for version 2024.2.1

* [@spackbot] updating style on behalf of kaanolgu

---------

Co-authored-by: Kaan Olgu <kaan.olgu@bristol.ac.uk>
2024-08-13 08:40:35 +01:00
Harmen Stoppels
a66586d749 spack buildcache push: best effort (#45631)
"spack buildcache push" for partially installed environments pushes all it 
can by default, and only dumps errors towards the end.

If --fail-fast is provided, error out before pushing anything if any
of the packages is uninstalled

oci build caches using parallel push now use futures to ensure pushing
goes in best-effort style.
2024-08-13 08:12:48 +02:00
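A minimal sketch of the best-effort pattern described above, using `concurrent.futures`; `push_one` is a hypothetical stand-in for the real per-spec push:

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def push_all(specs, push_one):
    """Push every spec, collecting errors instead of aborting on the first."""
    errors = []
    with ThreadPoolExecutor() as executor:
        futures = {executor.submit(push_one, s): s for s in specs}
        for future in as_completed(futures):
            try:
                future.result()
            except Exception as e:  # best effort: record and keep going
                errors.append((futures[future], e))
    if errors:
        raise RuntimeError(f"{len(errors)} push(es) failed: {errors}")
```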
AcriusWinter
6b73f00310 migraphx: Old to new test API (#44988)
* migraphx: Old to new test API
* migraphx: tweak name and docstring to be more descriptive

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-08-12 19:29:25 -06:00
Harmen Stoppels
063b987ceb remove config:concretizer:clingo (#45684) 2024-08-12 18:23:14 -07:00
AcriusWinter
fe19394bf9 py-amrex: old to new test API (#45183)
* py-amrex: test name change
2024-08-12 16:56:51 -07:00
Matt Thompson
e09955d83b mepo: Add 2.0.0 (#45691) 2024-08-12 16:54:48 -06:00
Brian Spilner
d367f14d5e add cdo-2.4.1 and cdo-2.4.2 (#45686) 2024-08-12 16:24:06 -06:00
Henri Menke
6f61e382da etsf-io: use pic flag when compiling (#45646) 2024-08-12 16:23:36 -06:00
AcriusWinter
63e680e4f9 c: new test API (#45469)
* c: new test API
* gcc:  provides('c')
* c: bugfix and simplification of the new stand-alone test method

---------

Co-authored-by: Tamara Dahlgren <dahlgren1@llnl.gov>
2024-08-12 15:58:09 -06:00
Nicole C.
27557a133b debug: Update cmd and test for Windows (#45309)
* debug: Update cmd and test for Windows

* Add comment tar options not supported by Win tar
2024-08-12 15:50:51 -04:00
dslarm
78810e95ed acfl, armpl-gcc: use ubuntu-22.04 as target for Debian 12 (#45524) 2024-08-12 13:21:40 -06:00
Wouter Deconinck
553b44473f aspell: add v0.60.8.1 (#45685) 2024-08-12 11:38:00 -07:00
Harmen Stoppels
966a775a45 re2: fix cmake cxx std (#45694) 2024-08-12 18:58:02 +02:00
Mikael Simberg
327c75386a mold: add 2.33.0 (#45680) 2024-08-12 07:08:33 -06:00
Cédric Chevalier
a2cb7ee803 Kokkos: only requires a C++ compiler (#45467)
Signed-off-by: Cédric Chevalier <cedric.chevalier@cea.fr>
2024-08-12 06:38:13 -06:00
Wouter Deconinck
2a5d4b2291 cli11: add v2.4.2 (#45673) 2024-08-12 05:07:59 -06:00
Harmen Stoppels
3b59817ea7 deal with TimeoutError from ssl.py (#45683) 2024-08-12 13:06:13 +02:00
Wouter Deconinck
06eacdf9d8 xterm: add v393 (#45678) 2024-08-12 04:27:49 -06:00
Till Ehrengruber
bfdcdb4851 cutensor: add v2.0.1.2 on aarch64 (#45138) 2024-08-12 12:07:53 +02:00
Filippo Spiga
83873d06a1 jube: add v2.6.2, v2.7.0, v2.7.1 (#45599) 2024-08-12 11:59:59 +02:00
G-Ragghianti
91333919c6 SLATE package: make MPI and OpenMP a requirement (#44979)
Co-authored-by: gragghia <gragghia@BlackM3.local>
2024-08-12 11:54:41 +02:00
Melven Roehrig-Zoellner
cd6237cac4 py-pyside2: add version 5.15.14 (#44634)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-08-12 11:44:28 +02:00
Derek Ryan Strong
91412fb595 openmpi: restrict versions for launcher variants (#45624) 2024-08-12 11:40:02 +02:00
Adam J. Stewart
678c995415 PyTorch: update ecosystem (#45431) 2024-08-12 11:33:27 +02:00
Wouter Deconinck
63af548271 runc: add v1.1.13 (#45679) 2024-08-12 03:33:11 -06:00
Adam J. Stewart
200dfb0346 giflib: fix build on macOS (#45643) 2024-08-12 11:30:45 +02:00
Adam J. Stewart
e2f605f6e9 GDAL: clarify compiler version support (#45651) 2024-08-12 11:27:48 +02:00
Wouter Deconinck
3cf1914b7e py-avro: add v1.11.3, v1.12.0 (#45677) 2024-08-12 03:17:50 -06:00
Wouter Deconinck
cd7a49114c embree: add v4.3.3 (#45674) 2024-08-12 03:13:03 -06:00
Wouter Deconinck
1144487ee7 autodiff: add v1.0.2 -> v1.1.2 (#43527) 2024-08-12 11:03:31 +02:00
Wouter Deconinck
742b78d2b5 wayland-protocols: add v1.35, v1.36; support tests (#45670) 2024-08-12 10:54:10 +02:00
Wouter Deconinck
633d1d2ccb abseil-cpp: add v20240722; support tests (#45671) 2024-08-12 10:50:20 +02:00
Wouter Deconinck
9adefd587e ccache: add v4.10.2 (#45672) 2024-08-12 10:46:55 +02:00
Wouter Deconinck
102a30a5a2 {url,homepage} = http->https://*.sourceforge.net (#45676) 2024-08-12 02:24:32 -06:00
Harmen Stoppels
7ddc886d6d buildcache: fix hard-coded, outdated layout version (#45645) 2024-08-12 09:25:31 +02:00
Wouter Deconinck
9e7183fb14 catch2: add v3.5.4, v3.6.0 (#45662)
* catch2: add v3.5.4, v3.6.0

* [@spackbot] updating style on behalf of wdconinc

---------

Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-08-11 21:00:42 -07:00
Wouter Deconinck
18ab3c20ce opencascade: add v7.8.1 (#45665)
* opencascade: add v7.8.1

* opencascade: with default_args
2024-08-11 16:27:34 -06:00
Jim Phillips
b91b42dc7b namd: do not require single_node_gpu with rocm (#45650)
Removes conflict inadvertently left in #45553
2024-08-11 16:04:56 -06:00
Adam J. Stewart
7900d0b3db py-ruff: add v0.5.7 (#45660) 2024-08-11 14:31:23 -07:00
Wouter Deconinck
847d7bc87d libdrm: add v2.4.121, v2.4.122 (switch to multiple build systems) (#45663)
* libdrm: add  v2.4.121, v2.4.122 (switch to multiple build systems)

* [@spackbot] updating style on behalf of wdconinc

---------

Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-08-11 14:26:37 -07:00
Wouter Deconinck
078984dcf4 libx11: add v1.8.10 (#45664) 2024-08-11 14:20:24 -07:00
Wouter Deconinck
010324714f py-gssapi: add v1.8.3 (#45666) 2024-08-11 14:09:45 -07:00
AcriusWinter
7ce5ac1e6e fortran: new test API (#45470)
* fortran: new test API
* fortran: add provides to gcc package
* fortran: simplify stand-alone test processing

---------

Co-authored-by: Tamara Dahlgren <dahlgren1@llnl.gov>
2024-08-11 14:32:39 -06:00
Steven Hahn
565165f02d benchmark: add 1.8.5 (#45657)
Signed-off-by: Steven Hahn <hahnse@ornl.gov>
2024-08-11 13:48:05 -06:00
AcriusWinter
e4869cd558 hypre-cmake: old to new test API (#45144)
* hypre-cmake: old to new test API
* hypre-cmake: update Makefile to use installed files
* hypre-cmake: make stand-alone test method name more specific

---------

Co-authored-by: Tamara Dahlgren <dahlgren1@llnl.gov>
2024-08-11 03:03:24 -06:00
AcriusWinter
990e0dc526 hypre: old to new test API (#45066)
* hypre: old to new test API
* hypre: restore test_parts; add Makefile cleanup

---------

Co-authored-by: Tamara Dahlgren <dahlgren1@llnl.gov>
2024-08-11 02:28:25 -06:00
Piotr Luszczek
f9d8b6b5aa plasma: add version 24.8.7 (#45656) 2024-08-11 02:14:21 -06:00
Massimiliano Culpo
2079b888c8 Remove the old concretizer (#45215)
The old concretizer is still used to bootstrap clingo from source. If we switch to a DAG model
where compilers are treated as nodes, we need to either:

1. fix the old concretizer to support this (which is a lot of work and possibly research), or
2. bootstrap `clingo` without the old concretizer.

This PR takes the second approach and gets rid of the old concretizer code. To bootstrap
`clingo`, we store some concrete spec prototypes as JSON, select one according to the
coarse-grained system architecture, and tweak them according to the current host.

The old concretizer and related dead code are removed.  In particular, this removes
`Spec.normalize()` and related methods, which were used in many unit-tests to set
up the test context. The tests have been updated not to use `normalize()`.

- [x] Bootstrap clingo concretization based on a JSON file
- [x] Bootstrap clingo *before* patchelf
- [x] Remove any use of the old concretizer, including:
      * Remove only_clingo and only_original fixtures
      * Remove _old_concretize and _new_concretize
      * Remove _concretize_together_old
      * Remove _concretize_together_new
      * Remove any use of `SPACK_TEST_SOLVER`
      * Simplify CI jobs
- [x] ensure bootstrapping `clingo` works on Darwin and Windows
- [x] Raise an intelligible error when a compiler is missing
- [x] Ensure bootstrapping works on FreeBSD
- [x] remove normalize and related methods

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-08-10 16:12:27 -07:00
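A minimal sketch of the prototype-selection step described above; the JSON layout, file name, and key format are assumptions for illustration only:

```python
import json
import platform

def select_prototype(path="clingo_prototypes.json"):
    """Pick a pre-concretized spec prototype for the current host.

    Assumed layout: {"linux-x86_64": {...}, "darwin-arm64": {...}, ...};
    the real bootstrap data lives inside Spack.
    """
    with open(path) as f:
        prototypes = json.load(f)
    key = f"{platform.system().lower()}-{platform.machine()}"
    # Caller then tweaks the selected prototype for the current host.
    return prototypes[key]
```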
Teague Sterling
2dbc5213b0 pulseaudio: add pkgconfig build dep (#45653)
Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2024-08-10 12:01:16 -05:00
Greg Becker
7a83cdbcc7 Concretizer should respect namespace of reused specs (#45538)
* concretize.lp: improve coverage of internal_error facts
* concretizer: track namespaces for reused packages
* regression test
2024-08-09 13:51:34 -07:00
Dominic Hofer
da33c12ad4 Remove execution permission from setup-env.sh (#45641)
`setup-env.sh` is meant to be sourced, not executed directly.
By revoking execution permissions, users who accidentally execute
the script will receive an error instead of seeing no effect.

* Remove execution permission from `setup-env.sh` and friends
* Don't make output file executable in `spack commands --update-completion`

---------

Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-08-09 13:27:07 -07:00
Alec Scott
c30979ed66 fzf: add v0.52.1, v0.53.0, v0.54.3 (#45634)
* fzf: add v0.52.1, v0.53.0, v0.54.3

* fzf: accept suggestions for url_for_version

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-08-09 11:11:01 -07:00
Alec Scott
5d7d18d028 go: add v1.22.6, deprecate insecure versions (#45635) 2024-08-09 10:45:11 -07:00
John W. Parent
92e42bbed9 Windows: Port Libuv (#45413)
* On Windows it is built with CMake; however, CMake is built against
  libuv, so libuv must depend on cmake+ownlibs to short-circuit the
  circular dependency
* libuv currently fetches the -dist source distribution of libuv for
  certain versions because those versions contain a pre-generated
  ./configure script. However those distributions have all CMake
  files removed, so they cannot be used to build on Windows.
  Because the source distributions are different, this means the
  checksums are different, and necessitates an additional version
  declaration for each version we want to support with CMake.
2024-08-09 10:13:57 -06:00
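A sketch, in Spack's package DSL, of the dependency logic the first bullet describes; everything except the `depends_on` line is elided or invented:

```python
from spack.package import *

class Libuv(CMakePackage):
    """Sketch: only the cycle-breaking dependency is shown."""

    # CMake itself links against libuv, so on Windows require a CMake
    # built with its own bundled libraries to break the cycle.
    depends_on("cmake+ownlibs", when="platform=windows", type="build")
```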
AcriusWinter
899ac78887 cxx: new test API (#45462)
* cxx: new test API
* gcc: provide cxx
* default providers:  cxx provided by gcc
* cxx: cleanup stand-alone test
  - test_c -> test_cxx
  - simplify compilation and execution
  - corrected output checks

---------

Co-authored-by: Tamara Dahlgren <dahlgren1@llnl.gov>
2024-08-08 23:23:05 -06:00
AcriusWinter
7bec524dd5 Install test root update: old to new API (#45491)
* convert install_test_root from old to new API
2024-08-08 13:01:06 -07:00
Dom Heinzeller
546e0925b0 odc: add v1.5.2 (#45622)
* Add odc@1.5.2

* Add climbfuji as maintainer in var/spack/repos/builtin/packages/odc/package.py
2024-08-08 13:28:19 -06:00
Vanessasaurus
95b533ddcd flux-sched: add version 0.36.1 (#45619)
Signed-off-by: vsoch <vsoch@users.noreply.github.com>
Co-authored-by: vsoch <vsoch@users.noreply.github.com>
2024-08-08 13:18:35 -06:00
Eric Lingerfelt
28fe85ae66 Adding ecmwf-atlas version 0.38 (#45505) 2024-08-08 12:58:21 -06:00
Chris Marsh
6b936884f5 py-rioxarray: add v0.17.0 (#45529)
* Update available versions

* fix style

* Reduce added versions to just a single new version as per review

* fix style

* Set dependency versions in line with pyproject and setup.py

* add new interp variant

* add wheel as required

* Add variant description

* change +interp default, actually add packaging this time
2024-08-08 19:00:39 +02:00
Vicente Bolea
7b879d092d vtk-m: update vtk-m to 2.2.0 (#45544) 2024-08-08 09:35:48 -07:00
Massimiliano Culpo
007c1148c0 Remove Compiler.PrgEnv* attributes (#45615) 2024-08-08 18:15:42 +02:00
Wouter Deconinck
8b2fec61f2 hepmc3: add v3.3.0 (#45617) 2024-08-08 17:19:26 +02:00
Andrey Perestoronin
1cebb7e1c3 intel-oneapi-2024.2.1 (#45618) 2024-08-08 10:44:26 -04:00
Chris Marsh
6f8d8ba47e openblas : fix install_name on macos (#45606) 2024-08-08 07:50:30 -06:00
Teague Sterling
9464898449 py-glean-sdk: add 60.5.0 and fix 60.0.1 checksum (#45630)
Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2024-08-08 07:35:16 -06:00
Dom Heinzeller
0902910784 Compiler wrapper: add env var to pass vcheck flags (#44588)
Fixes #43494

Add a set of environment variables SPACK_ALWAYS_CFLAGS (etc.) that
are always applied by the compiler wrapper.

Unlike SPACK_CFLAGS, for example, these will also be applied to
version checks (both SPACK_CFLAGS and SPACK_ALWAYS_CFLAGS will be
applied to the other invocation modes like ccld etc.).

Using this new functionality, the classic Intel and oneAPI compilers
are updated to pass compiler flags that disable warning messages
when newer versions are invoked via their older binary names
(these warnings were also generated for version checks, hence the
need for a new wrapper variable).

---------

Co-authored-by: Peter Josef Scheibel <scheibel1@llnl.gov>
2024-08-08 06:40:36 +00:00
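A simplified sketch of the flag merge described above; the mode name and string splitting are assumptions, and only the two variable names come from the commit:

```python
import os

def wrapper_cflags(mode):
    """SPACK_ALWAYS_CFLAGS applies in every mode, including version
    checks; SPACK_CFLAGS only in the real compile/link modes."""
    always = os.environ.get("SPACK_ALWAYS_CFLAGS", "").split()
    if mode == "vcheck":
        return always
    return always + os.environ.get("SPACK_CFLAGS", "").split()
```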
dependabot[bot]
7050ace968 build(deps): bump docker/build-push-action from 6.5.0 to 6.6.1 (#45623)
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 6.5.0 to 6.6.1.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](5176d81f87...16ebe778df)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-08 07:26:50 +02:00
Federico Ficarelli
7efbad0d81 hipsycl: remove myself from maintainers (#45616) 2024-08-07 21:50:59 +02:00
teddy
2298abd7f4 Add py-gmsh package (#45409)
Co-authored-by: chantrait <teddy.chantrait@cea.fr>
2024-08-07 20:47:00 +02:00
BOUDAOUD34
46efa7e151 namd: add compile options for ROCm (#45553)
* namd: add compile options for ROCm
* Combine --rocm-prefix and its value in a single opts.extend call to ensure they remain ordered correctly and improve code robustness.
* Standardize the code and add ROCm conflicts
* add single_node_gpu conflict

---------

Co-authored-by: U-PALLAS\boudaoud <boudaoud@pc44.pallas.cines.fr>
2024-08-07 10:54:33 -07:00
Adam J. Stewart
60c589db28 PyTorch: add support for the UCC distributed backend (#45598) 2024-08-07 10:49:29 -07:00
Sreenivasa Murthy Kolam
ca9a7b2033 MIOpen-hip, RocFFT packages: fix miopendriver failure and build failure with centos-8 os (#45429)
* MIOpen-hip package: fix miopendriver failure with centos-8 os
* fix the centos-8 build error with lstdc++
2024-08-07 10:45:20 -07:00
kjrstory
470a26bbcd openfoam-org: Add new version 11 (#39771) 2024-08-07 09:55:33 -07:00
Teague Sterling
b52e4fc650 libxcvt: new package (#44640)
* Adding libxcvt package

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* [@spackbot] updating style on behalf of teaguesterling

* Update package.py

* Update package.py

* updating checksum after switch to x.org mirror

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

---------

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2024-08-07 10:00:45 -05:00
Mathieu Cloirec
a653579e56 hipsycl: add compile options for ROCm (#45497)
Co-authored-by: cloirec <cloirec@pc54.cines.fr>
2024-08-07 16:23:53 +02:00
Chris Marsh
7f89391b14 parallelio: fix install_name on macOS (#45567) 2024-08-07 15:55:10 +02:00
Wouter Deconinck
34c98101ad xorg-server and xorg pkgs: Fix the build and mark protocols as build deps (#45536) 2024-08-07 14:51:52 +02:00
Jeremy Guillette
f1ea979e2b py-gevent: @:23.9.0 conflicts with py-cython@3.0.10 (#45257) (#45295)
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-08-07 14:35:30 +02:00
Christopher Christofi
55cbdd435c py-chex: add v0.1.86 (#45476) 2024-08-07 14:16:01 +02:00
Adam J. Stewart
1cce947be6 py-torchgeo: incompatible with lightning 2.3 (#45612) 2024-08-07 13:38:33 +02:00
Adam J. Stewart
0a735c6ea6 py-lightning: add v2.4.0 (#45611) 2024-08-07 13:36:33 +02:00
Adam J. Stewart
5400b1e222 py-lightly: add v1.5.11 (#45610) 2024-08-07 13:34:58 +02:00
Bernhard Kaindl
ef461befcc linux-perf: Add capstone variant to build with capstone disassembler 2024-08-07 13:34:07 +02:00
Bernhard Kaindl
831b4a3e4a linux-perf: If clang is in PATH, pass "CLANG=" + shutil.which("clang")
When clang is installed I get `clang: Permission denied`.
Setting CLANG to the full path of clang fixes this.

Signed-off-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2024-08-07 13:34:07 +02:00
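A minimal sketch of the fix as described in the message; the helper name is invented:

```python
import shutil

def clang_make_arg():
    """Build the "CLANG=/full/path" make argument when clang is on
    PATH (per the commit message); return None otherwise."""
    clang = shutil.which("clang")
    return f"CLANG={clang}" if clang else None
```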
Jordan Galby
6007a77a33 Add linux-perf 2024-08-07 13:34:07 +02:00
Jordan Galby
a2794f04bc audit-userspace: Backport patch to fix build error
Fix build error:

```
     1711    ../../../py-compile: line 125: test: found: integer expression expected
     1712    ../../../py-compile: line 137: python: command not found
  >> 1713    make[4]: *** [Makefile:521: install-pyexecPYTHON] Error 127
```
2024-08-07 13:34:07 +02:00
Jordan GALBY
3ae3bfd997 audit-userspace: Add 3.1.2
Fixes build errors on almalinux 8 gcc 8.5.0:

```txt
     1352    audit_wrap.c: In function '_wrap_audit_rule_data_buf_set':
  >> 1353    audit_wrap.c:5010:17: error: cast specifies array type
     1354         arg1->buf = (char [])(char *)memcpy(malloc((size)*sizeof(char)), (const char *)(arg2), sizeof(char)*(size));
     1355                     ^
  >> 1356    audit_wrap.c:5010:15: error: invalid use of flexible array member
     1357         arg1->buf = (char [])(char *)memcpy(malloc((size)*sizeof(char)), (const char *)(arg2), sizeof(char)*(size));
     1358                   ^
  >> 1359    audit_wrap.c:5012:15: error: invalid use of flexible array member
     1360         arg1->buf = 0;
     1361                   ^
```
2024-08-07 13:34:07 +02:00
Jordan GALBY
5f3f968a1f xmlto: Fix missing flex build dep
Fixes error:

```txt
3 errors found in build log:
     61    ==> xmlto: Executing phase: 'build'
     62    ==> [2023-10-26-09:48:35.425903] 'make' '-j16' 'V=1'
     63    make  all-am
     64    make[1]: Entering directory '/tmp/root/spack-stage/spack-stage-xmlto-0.0.28-huuygrp4qasytrezg774yavnnaxzgp2e/spack-src'
     65    /bin/sh ./ylwrap xmlif/xmlif.l .c xmlif/xmlif.c -- /bin/sh /tmp/root/spack-stage/spack-stage-xmlto-0.0.28-huuygrp4qasytrezg774yavnnaxzgp2e/spack-src/missing flex
     66    /tmp/root/spack-stage/spack-stage-xmlto-0.0.28-huuygrp4qasytrezg774yavnnaxzgp2e/spack-src/missing: line 81: flex: command not found
  >> 67    WARNING: 'flex' is missing on your system.
     68             You should only need it if you modified a '.l' file.
     69             You may want to install the Fast Lexical Analyzer package:
     70             <http://flex.sourceforge.net/>
  >> 71    make[1]: *** [Makefile:757: xmlif/xmlif.c] Error 127
     72    make[1]: Leaving directory '/tmp/root/spack-stage/spack-stage-xmlto-0.0.28-huuygrp4qasytrezg774yavnnaxzgp2e/spack-src'
  >> 73    make: *** [Makefile:584: all] Error 2
```
2024-08-07 13:34:07 +02:00
Jordan GALBY
652de07d8c babeltrace: Fix build minimal OS: missing pkgconfig 2024-08-07 13:34:07 +02:00
Jordan GALBY
c16191d9ea babeltrace: Add 1.5.11 for linux-perf 2024-08-07 13:34:07 +02:00
Konstantinos Parasyris
1b1663acea flux-pmix: correct FLUX_PMI_CLIENT_SEARCHPATH (#45277) 2024-08-07 13:09:47 +02:00
Jordan Galby
d2f269ed7b Fix SIP build system installing files into python-venv (#45360)
For example: spack install py-pyqt5 or py-pyqt6 would install pylupdate5, pyrcc5, and
pyuic5 into python-venv's install prefix.

Fix https://github.com/spack/spack/issues/45359
2024-08-07 12:12:56 +02:00
Tamara Dahlgren
4584d85ca6 biobambam2: fix test scripts to use installed binaries (#45609) 2024-08-07 11:43:23 +02:00
Wouter Deconinck
2106a2be26 qt-base: let QtPackage base class handle module files (#45603) 2024-08-07 10:53:32 +02:00
Wouter Deconinck
228c82502d qt-5compat: new package for QtCore5Compat module (#45601) 2024-08-07 10:47:31 +02:00
tygoetsch
431f5627d9 Add charliecloud@0.38 (#45566) 2024-08-07 01:33:36 -06:00
Teague Sterling
fb315c37ba py-hail: new package (#44521) 2024-08-07 06:55:07 +02:00
dependabot[bot]
f9fa160a24 build(deps): bump actions/upload-artifact from 4.3.5 to 4.3.6 (#45608)
Bumps [actions/upload-artifact](https://github.com/actions/upload-artifact) from 4.3.5 to 4.3.6.
- [Release notes](https://github.com/actions/upload-artifact/releases)
- [Commits](89ef406dd8...834a144ee9)

---
updated-dependencies:
- dependency-name: actions/upload-artifact
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-07 06:50:18 +02:00
Massimiliano Culpo
1ee29929a7 cmake: add v3.30.2 (#45593) 2024-08-06 19:53:45 -06:00
Massimiliano Culpo
97e691cdbf binutils: add v2.43 (#45594) 2024-08-06 15:09:55 -06:00
Matthieu Dorier
51ba25fec3 py-gcovr: add version 7.2 (#45597)
* py-gcovr: add version 7.2
* py-gcovr: fix dependency on py-tomli

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-08-06 13:41:03 -06:00
Matthieu Dorier
81281646e9 py-colorlog: added new version (#45596)
* py-colorlog: added new version
* py-colorlog: corrected dependency on py-setuptools
2024-08-06 13:23:46 -06:00
Wouter Deconinck
85ec4cca92 libxshmfence: add v1.3.1, v1.3.2 (#44393)
* libxshmfence: add v1.3.1, v1.3.2
* libxshmfence: fix url_for_version
* libxshmfence: url_for_version without spec.satisfies
* libxshmfence: mv url_for_version after directives

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-08-06 10:56:35 -07:00
Pranav Sivaraman
f3c21b0177 libdwarf: remove old versions with 0.10.1 (#45306) 2024-08-06 18:50:29 +02:00
Alex Leute
51ac4686b4 py-anvio and py-rich-argparse: new packages (#45367) 2024-08-06 17:52:18 +02:00
Michael Kuhn
82752ad0b7 rocksdb: add missing build dependencies (#45252) 2024-08-06 09:23:13 -06:00
Hariharan Devarajan
b231e6e9e9 Changes to DLIOProfiler and DFTracer Package (#45180)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-08-06 13:49:01 +02:00
Dave Keeshan
90f8c20133 verible: Add versions 0.0.3671 and 0.0.3667 (#44660) 2024-08-06 11:28:36 +02:00
Dave Keeshan
9835b072e2 yosys: add v0.42 2024-08-06 11:27:36 +02:00
Massimiliano Culpo
f438a33978 changa: add v3.5 (#45591) 2024-08-06 11:12:45 +02:00
Axel Huebl
8ded2ddf5e C-Blosc2: add v2.15.1 (#45582) 2024-08-06 02:54:24 -06:00
Gavin John
e3904d4cbf Fix spack url stats (#45584) 2024-08-06 02:54:01 -06:00
Bill Williams
e1bcbcf9f3 Score-P: Add remapping for rocmcc (#45316)
* Score-P: Add remapping for rocmcc

rocmcc -> amdclang for ScoreP

* [@spackbot] updating style on behalf of wrwilliams

---------

Co-authored-by: William Williams <william.williams@tu-dresden.de>
Co-authored-by: wrwilliams <wrwilliams@users.noreply.github.com>
2024-08-06 09:51:50 +02:00
Cameron Rutherford
fa671a639a hiop: update package.py with correct cusolver_lu CMake variable (#45332) 2024-08-06 09:40:56 +02:00
snehring
28171f1b9d diamond: add blast support and eigen (#45254)
Signed-off-by: Shane Nehring <snehring@iastate.edu>
2024-08-06 09:30:22 +02:00
Christopher Christofi
8de03e2bf5 py-numba4jax: add new package (#45481) 2024-08-06 09:17:41 +02:00
Massimiliano Culpo
2fa314b6b6 Avoid duplicate dependabot bumps (#45590)
* Avoid duplicate dependabot bumps

Example of a duplicate bump:
- https://github.com/spack/spack/pull/45124
- https://github.com/spack/spack/pull/45125

* Deduplicate "pip" ecosystem
2024-08-06 07:14:55 +00:00
alvaro-sch
7780059c64 orca: add v6.0.0 (#45489) 2024-08-06 09:13:42 +02:00
Satish Balay
7e69671570 petsc, py-petsc4py: add v3.21.4 (#45506) 2024-08-06 08:43:27 +02:00
Rocco Meli
5650d4d419 dbcsr: add v2.7.0 and +g2g variant (#45501)
Co-authored-by: RMeli <RMeli@users.noreply.github.com>
2024-08-06 08:40:09 +02:00
Andrew W Elble
fa38dd9386 gdrcopy: specify CUDA envvar during build (#45415)
otherwise /usr/local/cuda is used
2024-08-06 08:32:31 +02:00
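A sketch of the usual Spack hook for this kind of fix; the variable name follows the commit message and the rest is an assumed shape, not the package's actual code:

```python
def setup_build_environment(self, env):
    # Point the build at Spack's CUDA prefix so it does not fall
    # back to /usr/local/cuda.
    env.set("CUDA", self.spec["cuda"].prefix)
```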
Weiqun Zhang
16a2a5047c amrex: add v24.08 (#45543) 2024-08-06 08:10:53 +02:00
dependabot[bot]
899e458ee5 build(deps): bump flake8 in /.github/workflows/requirements/style (#45587)
Bumps [flake8](https://github.com/pycqa/flake8) from 7.1.0 to 7.1.1.
- [Commits](https://github.com/pycqa/flake8/compare/7.1.0...7.1.1)

---
updated-dependencies:
- dependency-name: flake8
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-06 07:09:18 +02:00
dependabot[bot]
4a8d09dcc1 build(deps): bump flake8 from 7.1.0 to 7.1.1 in /lib/spack/docs (#45588)
Bumps [flake8](https://github.com/pycqa/flake8) from 7.1.0 to 7.1.1.
- [Commits](https://github.com/pycqa/flake8/compare/7.1.0...7.1.1)

---
updated-dependencies:
- dependency-name: flake8
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-06 07:08:57 +02:00
Matthieu Dorier
98e206193b py-configspace: add main, 1.0.0, 1.0.1, 1.1.1, 1.1.2, 1.1.3, 1.1.4, fix url and fix cython dependency (#45193)
* [py-configspace] fix dependency on cython

* py-cython not needed starting from 1.0.0

* added py-configspace 1.0.0 and 1.0.1

* py-configspace: fix style

* added py-configspace version 1.1.0

* added py-configspace version 1.1.1

* py-configspace: two more versions and new maintainer

* py-configspace: fixed typo

* py-configspace: added version 1.1.4
2024-08-05 23:26:27 -05:00
Christopher Christofi
6a6c295938 r: add version 4.4.1 (#45564) 2024-08-05 18:45:27 -06:00
Tamara Dahlgren
9a1002c098 tix: skip implicit patch (#45244) 2024-08-05 17:34:26 -07:00
Zach Jibben
6c903543e1 Update Truchas package for 24.06 and 24.07 (#45583) 2024-08-05 16:40:56 -07:00
Jonathon Anderson
994d995b64 intel-xed: Rewrite recipe to match upstream install layout (v2) (#45155)
* intel-xed: Rewrite to use mbuild install procedure
  Fixes https://github.com/spack/spack/issues/41268
* hpctoolkit: depend on intel-xed +deprecated-includes
* intel-xed: add missing versions from 2024
* intel-xed: Verify dependencies on C/C++ compilers

---------

Signed-off-by: Jonathon Anderson <anderson.jonathonm@gmail.com>
2024-08-05 14:53:15 -07:00
Matt Thompson
54d17ae044 Update GFE packages (#45194)
* Update GFE packages
* Approve compiler depends_on
2024-08-05 14:00:23 -07:00
jmuddnv
9ea103f94e nvhpc: add v24.7 (#45511) 2024-08-05 21:49:59 +02:00
Richard Berger
83efafa09f Add language depends_on on several packages (#45298) 2024-08-05 21:48:58 +02:00
Jaroslav Hron
5f29bb9b22 petsc: fix zlib-api handling (#45545) 2024-08-05 21:24:50 +02:00
Luc Grosheintz
441b64c3d9 highfive: add v2.10.0 (#45486) 2024-08-05 21:16:54 +02:00
Sergey Kosukhin
cee6c59684 nag: add version 7.2.7203 (#45556) 2024-08-05 20:40:59 +02:00
Wouter Deconinck
b1adfcf665 seacas: limit to fmt@10 (#45565) 2024-08-05 13:11:45 -05:00
Michael Kuhn
433abfcc80 gcc: add 14.2.0 (#45577) 2024-08-05 11:01:09 -06:00
Wouter Deconinck
02063302c5 fastor: add git attribute, verify language dependency (#45549) 2024-08-05 18:56:36 +02:00
Massimiliano Culpo
b9125ae3e7 Do not halt concretization on unknown variants in externals (#45326)
* Do not halt concretization on unknown variants in externals
2024-08-05 09:24:57 -07:00
Pranav Sivaraman
0a2b63b032 highway: add v1.2.0 (#45335)
Co-authored-by: pranav-sivaraman <pranav-sivaraman@users.noreply.github.com>
2024-08-05 18:23:43 +02:00
SXS Bot
35d84a6456 spectre: add v2024.08.03 (#45575)
Co-authored-by: sxs-bot <sxs-bot@users.noreply.github.com>
2024-08-05 18:18:25 +02:00
jgraciahlrs
0257b2db4b libxcb: Set well-known locale for build (#45502)
* libxcb: Set well-known locale for build

Builds might fail if no valid locale is set. See https://www.linuxfromscratch.org/blfs/view/git/x/libxcb.html

* libxcb: fix style

* libxcb: change locale from en_US.UTF-8 to C.UTF-8

* libxcb: fix style

* Update var/spack/repos/builtin/packages/libxcb/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* [@spackbot] updating style on behalf of jgraciahlrs

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
Co-authored-by: jgraciahlrs <jgraciahlrs@users.noreply.github.com>
2024-08-05 08:47:38 -05:00
Wouter Deconinck
d3bf1e04fc py-vector: add through v1.4.1 (switch to hatchling) (#45527) 2024-08-05 10:45:57 +02:00
Auriane R.
530639e15f Use satisfies instead of if ... in spec in b* and c* directories (#45555) 2024-08-04 11:59:13 -06:00
Wouter Deconinck
c8695f2ba6 py-mypy: add v1.9.0, v1.10.1, v1.11.1 (#45542) 2024-08-04 13:08:04 +02:00
dependabot[bot]
f3bd820374 build(deps): bump pytest from 8.3.1 to 8.3.2 in /lib/spack/docs (#45461)
Bumps [pytest](https://github.com/pytest-dev/pytest) from 8.3.1 to 8.3.2.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/main/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/8.3.1...8.3.2)

---
updated-dependencies:
- dependency-name: pytest
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-04 05:44:52 +00:00
dependabot[bot]
29b9fe1f0b build(deps): bump black in /.github/workflows/requirements/style (#45561)
Bumps [black](https://github.com/psf/black) from 24.4.2 to 24.8.0.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/24.4.2...24.8.0)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-03 21:49:33 -05:00
dependabot[bot]
1090895e72 build(deps): bump black from 24.4.2 to 24.8.0 in /lib/spack/docs (#45563)
Bumps [black](https://github.com/psf/black) from 24.4.2 to 24.8.0.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/24.4.2...24.8.0)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-03 21:48:50 -05:00
dependabot[bot]
e983f4a858 build(deps): bump sphinx-design from 0.6.0 to 0.6.1 in /lib/spack/docs (#45562)
Bumps [sphinx-design](https://github.com/executablebooks/sphinx-design) from 0.6.0 to 0.6.1.
- [Release notes](https://github.com/executablebooks/sphinx-design/releases)
- [Changelog](https://github.com/executablebooks/sphinx-design/blob/main/CHANGELOG.md)
- [Commits](https://github.com/executablebooks/sphinx-design/compare/v0.6.0...v0.6.1)

---
updated-dependencies:
- dependency-name: sphinx-design
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-03 21:46:19 -05:00
Wouter Deconinck
72e3f10d5b ffmpeg: update patch hashes for addition of the X-Git-Tag (#45574) 2024-08-03 22:43:54 +02:00
Teague Sterling
c5ae5ba4db xfce4: new packages (#44646) 2024-08-03 13:29:48 +02:00
Alex Richert
a1090029f3 g2: add variants for recent releases (#45441) 2024-08-03 13:01:18 +02:00
Alex Richert
0135c808a0 landsfcutil: add testing with pfunit (#45449) 2024-08-03 13:00:15 +02:00
Alex Richert
678084fed8 bufr: add 12.1.0 (#45459) 2024-08-03 12:56:58 +02:00
Adam J. Stewart
705d58005d py-jax / JAX: add v0.4.31 (#45519) 2024-08-03 11:16:42 +02:00
Alex Richert
cee266046b sp: remove 'generated' tag (#45455) 2024-08-03 11:13:31 +02:00
dependabot[bot]
5aa3d9c39c build(deps): bump actions/upload-artifact from 4.3.4 to 4.3.5 (#45559)
Bumps [actions/upload-artifact](https://github.com/actions/upload-artifact) from 4.3.4 to 4.3.5.
- [Release notes](https://github.com/actions/upload-artifact/releases)
- [Commits](0b2256b8c0...89ef406dd8)

---
updated-dependencies:
- dependency-name: actions/upload-artifact
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-03 00:09:47 -06:00
dependabot[bot]
3ee6507dd6 build(deps): bump mypy from 1.11.0 to 1.11.1 in /lib/spack/docs (#45530)
Bumps [mypy](https://github.com/python/mypy) from 1.11.0 to 1.11.1.
- [Changelog](https://github.com/python/mypy/blob/master/CHANGELOG.md)
- [Commits](https://github.com/python/mypy/compare/v1.11...v1.11.1)

---
updated-dependencies:
- dependency-name: mypy
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-02 17:01:25 -06:00
Todd Gamblin
425bba2f1a Allow spec queries by namespace (#45416)
* Allow spec queries by `namespace`

Spack specs have "namespaces" that indicate what package repository they come from, but
there has not been a way to use the spec syntax to match one.

You can say things like this:

```console
spack find builtin.zlib
spack find myrepo.zlib
```

But, because namespaces are written as a dot-separated prefix on the name, you can't say
"find me all specs in namespace myrepo". The syntax doesn't allow it.

This PR allows you to specify anonymous specs with namespaces on the CLI. Specifically
you can do queries like this:

```console
spack find namespace=builtin
spack find namespace=myrepo
```

You can use this anywhere else you use spec syntax, e.g. in a config file to separate
installations based on what repo they came from:

```yaml
spack:
    config:
        install_tree:
            root: $spack/opt/spack
            projections:
                namespace=myrepo: "myrepo_special_path/{name}-{hash}"
                namespace=builtin: "builtin/{name}-{hash}"
```

This PR adds a special `namespace_if_anonymous` attribute to specs, which returns the
`namespace` if the spec has no name, otherwise it returns `None`. This allows us to
print the namespace for anonymous specs but to continue hiding it for most views, as
we've done so far.

This is implemented as a special case, but it's one that already exists, along with
`platform`, `os`, `target`, etc. This also reserves existing special case names for
variants so that users cannot define them in their package files. This change is
potentially breaking, but I do not think it will be common. There are no builtin
packages with a variant called `namespace`, and defining `os`, `target`, or `platform`
as a variant would've likely caused other problems if they were already being used.

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-08-02 13:38:14 -05:00
Wouter Deconinck
a2cbc46dbc openblas: fix AttributeError when threads=openmp (#45338) 2024-08-02 12:19:06 -06:00
Wouter Deconinck
8538b0c01d xmlto: hotfix upstream patch removed by fedora (#45551) 2024-08-02 08:33:17 -06:00
Teague Sterling
ff30da7385 py-glean-sdk: new package (#45389)
Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2024-08-02 07:29:36 -06:00
Alex Richert
7c5771ed11 ncio: add check function to run unit tests (#45448) 2024-08-02 11:36:01 +02:00
Alex Richert
81fb1a9b8f bacio: remove an old maintainer and 'generated' tags (#45440) 2024-08-02 11:33:37 +02:00
Alex Richert
835bd2557e g2c: remove the 'generated' tag (which is correct) (#45442) 2024-08-02 11:25:31 +02:00
Alex Richert
e5a8d7be49 gfsio: add testing with pfunit (#45444) 2024-08-02 11:24:54 +02:00
Alex Richert
9f795de60b nemsio: add check to run the unit tests (#45450) 2024-08-02 11:22:20 +02:00
Alex Richert
7791a30bc2 nemsiogfs: add running the unit tests (#45451) 2024-08-02 11:21:35 +02:00
Alex Richert
2e85c83301 sfcio: add unit testing with pfunit (#45453) 2024-08-02 11:20:44 +02:00
Alex Richert
251190a0c4 sigio: add unit testing with pfunit (#45454) 2024-08-02 11:19:41 +02:00
Alex Richert
90b85239d5 wrf-io: remove 'generated' tags (which are correct) (#45458) 2024-08-02 11:16:45 +02:00
Alex Richert
f276a8da75 w3emc: add @2.12.0, conflict on +shared~pic (#45456) 2024-08-02 11:16:05 +02:00
Alex Richert
93799ec641 w3nco: remove 'generated' tags (which are correct) (#45457) 2024-08-02 11:13:33 +02:00
Alex Richert
dddc056a29 prod-util: remove 'generated' tags (which are correct) (#45452) 2024-08-02 11:13:03 +02:00
Alex Richert
3e6d9cdc06 g2tmpl: remove 'generated' tags (which are correct) (#45443) 2024-08-02 11:12:29 +02:00
Alex Richert
091786411b grib-util: remove 'generated' tags (which are correct) (#45445) 2024-08-02 11:11:26 +02:00
Alex Richert
4af09dd506 ip2: remove 'generated' tags (which are correct) (#45447) 2024-08-02 11:10:53 +02:00
Adam J. Stewart
2626bff96d py-numpy: "@1.23:" add conflict for "%gcc@:6.4" (#45468) 2024-08-02 10:33:37 +02:00
jgraciahlrs
9ef1d609e2 py-markupsafe: add depends_on("python@3.7:", when="@2.0:") (#45503)
As per PyPI, recent versions of py-markupsafe (>=2) require Python >=3.7.
2024-08-02 10:28:25 +02:00
Wouter Deconinck
4c60deb992 xrootd: add github as secondary url to avoid SSL issues (#45512) 2024-08-02 10:24:00 +02:00
Adam J. Stewart
53bc782278 pthreadpool: use same flags as PyTorch (#45521) 2024-08-02 10:11:24 +02:00
Wouter Deconinck
4e087349a4 py-particle: add v0.23.1, v0.24.0 (#45528)
* py-particle: add v0.23.1, v0.24.0

* [@spackbot] updating style on behalf of wdconinc

---------

Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-08-02 10:04:56 +02:00
Juan Miguel Carceller
53815b725a groff: Add missing depends_on("m4") (#45552) 2024-08-02 09:45:44 +02:00
eugeneswalker
e8c8e7b8a8 e4s oneapi ci: try enabling some disabled specs (#45355) 2024-08-01 21:25:30 -07:00
Seth R. Johnson
b781ce5b0f libspng: add maintainer, fix dependencies, args (#45410)
* libspng: add maintainer, fix dependencies, args

* Update var/spack/repos/builtin/packages/libspng/package.py

Co-authored-by: Alec Scott <hi@alecbcs.com>

* Fix syntax error

* Update var/spack/repos/builtin/packages/libspng/package.py

---------

Co-authored-by: Alec Scott <hi@alecbcs.com>
2024-08-01 21:03:27 -06:00
Juan Miguel Carceller
a3c3f4c3d1 root: Add patch to fix TUri (#45428)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-08-01 19:40:15 -07:00
Diego Alvarez S.
445b6dfcf8 Add blast+ v2.15.0, v2.16.0 (#45425) 2024-08-01 19:39:03 -07:00
Wouter Deconinck
b2ef64369f perl: add v5.40.0 (#45287)
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2024-08-01 20:33:41 -06:00
Juan Miguel Carceller
a8d2ea68f5 gaudi: add versions 38.2 and 38.3 and limit the version of fmt (#45466)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-08-01 19:20:22 -07:00
afzpatel
c7a437573b py-tensorflow: change url for 2.16.1-rocm-enhanced (#45539)
* change url for 2.16.1-rocm-enhanced

* fix typo
2024-08-01 19:51:33 -06:00
Chris Marsh
5736d1e206 py-xarray: Update and ensure dask compatibility (#45537)
* Add 2024.7 and new +viz variant as per pyproject.toml

* Ensure dask/xarray versions are compatible
2024-08-01 18:21:28 -07:00
Teague Sterling
e110e3c3af py-zstandard: new package (#45388)
Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2024-08-01 18:43:10 -06:00
Teague Sterling
e2d8b581db py-glean-parser: new package (#45390)
Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2024-08-01 18:37:35 -06:00
Thomas Madlener
10a4de8e04 edm4hep: Add v0.99 release and deprecate older versions (#45516)
* edm4hep: Add v0.99 and deprecate older versions

* edm4hep: Fix nlohmann-json dependency version

* Keep 0.10.5 undeprecated
2024-08-01 18:26:42 -06:00
Todd Gamblin
96ddbd5e17 format: allow spaces in format specifiers (#45487)
* format: allow spaces in format specifiers

Key-value pair format specifiers can now contain spaces in the key. This allows us to
add spaces to format strings that are *only* present when the attribute formatted is not
``None``. Instead of writing:

```
    {arch=architecture}
```

and special casing `arch=` like a sigil in `Spec.format()`, we can now write:

```
    { arch=architecture}
```

And the space is *only* printed when `architecture` is not `None`. This allows us to
remove the special case in `Spec.format()` for `arch=`.

Previously the only `key=` prefix allowed in format specifiers was `arch=`, but this PR
removes that requirement, and the `key=` part of a key-value specifier can be any name.
It does *not* have to correspond to the formatted attribute.

- [x] modify `SPEC_FORMAT_RE` to allow arbitrary keys in key-value pairs.
- [x] remove special case for `arch=` from `Spec.format()`.
- [x] modify format strings using `{arch=architecture}` to use `{ arch=architecture}`
- [x] add more tests for formatting

This PR saves other more complex attributes like compiler flags and their spacing for later.

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-08-01 18:20:43 -06:00
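A toy re-implementation of the rule described above, runnable on its own; it is not Spack's `Spec.format()` code, only an illustration of the space-printed-only-when-set behavior:

```python
import re

# Matches "{ key=attr}" or "{key=attr}", capturing the optional space.
KV_RE = re.compile(r"\{( ?)(\w+)=(\w+)\}")

def toy_format(template, attrs):
    def repl(m):
        space, key, attr = m.groups()
        value = attrs.get(attr)
        # The leading space is emitted only when the attribute is set.
        return f"{space}{key}={value}" if value is not None else ""
    return KV_RE.sub(repl, template)

print(toy_format("zlib{ arch=architecture}", {"architecture": "linux-x86_64"}))
# -> "zlib arch=linux-x86_64"
print(toy_format("zlib{ arch=architecture}", {"architecture": None}))
# -> "zlib"
```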
Stephen Nicholas Swatman
65b530e7ec detray: add versions 0.70.0 through 0.72.1 (#45541)
* detray: add versions 0.69.1 through 0.72.1

This commit adds four new versions of the detray package.

* Remove v0.69.0
2024-08-01 17:55:58 -06:00
Auriane R.
de98e3d6e5 Update if ... in spec with satisfies in a* dirs (#44822) 2024-08-01 18:21:37 -05:00
John W. Parent
ffcb4ee487 Windows msvc: Report accurate platform toolset version for VS2022 (#45525)
VC toolset versions 144 and 143 are both associated with the platform
toolset 143; this deviates from prior version choices made by the
MSVC devs. Add a special case to report the platform toolset version
as 143 when the VC toolset version is >= 143 (this will need revisiting
for later releases).
2024-08-01 11:27:17 -07:00
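A minimal sketch of the special case described above (invented function name):

```python
def platform_toolset_version(vc_toolset_version: int) -> str:
    """VC toolsets 143 and 144 both report platform toolset 143;
    to be revisited for later VS releases."""
    return "143" if vc_toolset_version >= 143 else str(vc_toolset_version)
```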
Teague Sterling
bfba3c9d5c py-attrs: add v17.4.0 (#45385)
Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2024-08-01 11:04:37 -07:00
Teague Sterling
37e56ea24d py-pyrsistent: add v0.14.0 (#45387)
* py-pyrsistent: add v0.14.0 & conflict

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* Update package.py

---------

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2024-08-01 10:52:20 -07:00
Alex Richert
453e8c71ac ip: add v5.1.0 (#45331)
* ip: add v5.1.0

* [@spackbot] updating style on behalf of AlexanderRichert-NOAA

* Update package.py

* Update package.py
2024-08-01 10:33:13 -07:00
James Taliaferro
e669fcafd0 kakoune: add v2024.05.18 (#45460)
* update Kakoune, explicitly make install dirs first

* blacken
2024-08-01 09:40:28 -07:00
Stephen Hudson
2dbbcf3ca5 py-libensemble: add v1.4.0, v1.4.1 (#45463)
* libEnsemble: add v1.4.0

* libEnsemble: add v1.4.1
2024-08-01 09:39:10 -07:00
Christopher Christofi
bce710eec1 py-flax: add v0.8.5 (#45480) 2024-08-01 09:09:00 -07:00
Thomas Madlener
64a69796e2 lcio: add v2.22.1 (#45517)
* lcio: add latest version 2.22.1

* lcio: update sio dependency to match actual requirements
2024-08-01 08:56:04 -07:00
rfbgo
dd460a0eb0 py-torch-nvidia-apex: add v22.03 -> v24.04.01 (#45176) 2024-08-01 11:47:45 +02:00
Manuela Kuhn
475fe9977a py-rsatoolbox: add v0.1.5 (#45484) 2024-08-01 10:53:41 +02:00
Adam J. Stewart
b86e42a5aa py-sphinx: add v8.0 (#45520) 2024-08-01 08:40:29 +02:00
Claudia Comito
9f04c45dea py-heat: add v1.4.2 (#45317) 2024-08-01 05:38:30 +02:00
Szabolcs Horvát
20e6b60fce py-igraph: add 0.11.6 (#45132) 2024-08-01 05:34:20 +02:00
Teague Sterling
5f86ee5d93 scala: add v2.12.13 -> v2.12.19, v2.13.10, 2.13.14 (#45477)
Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2024-08-01 05:18:46 +02:00
Wouter Deconinck
84ad509621 py-onnxruntime: add v1.17.3 (#44500) 2024-08-01 04:17:56 +02:00
Rémi Lacroix
ea0da49acb OpenFOAM-org: Add missing dependency on readline for @:9 (#44482)
The setSet tool (removed in version 10) has an optional dependency on readline.
The build script will use the system readline if not provided by Spack.
2024-08-01 03:59:00 +02:00
Christopher Christofi
e77fbfe8f8 py-jaxtyping: new package (#45482) 2024-08-01 03:50:48 +02:00
Teague Sterling
d485650369 py-build: new package (#45478)
Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2024-08-01 03:47:33 +02:00
Melven Roehrig-Zoellner
c1f22ca5cb tixi: add python variant (sets PYTHONPATH and LD_LIBRARY_PATH) (#44592) 2024-08-01 03:44:11 +02:00
Melven Roehrig-Zoellner
c5d1c9ae61 t8code: fix build with gcc14 2024-08-01 03:26:56 +02:00
Melven Roehrig-Zoellner
d8184b37a3 py-pylint-gitlab: new package 2024-08-01 03:20:40 +02:00
Melven Roehrig-Zoellner
bd952a552f py-anybadge: new package 2024-08-01 03:20:40 +02:00
Cameron Smith
aa171a6cc9 omegah: Update c/c++ language deps (#45303)
Signed-off-by: Cameron Smith <smithc11@rpi.edu>
2024-08-01 03:06:00 +02:00
Alex Richert
e4ee59741e grib-util: Add 1.5.0 (#45310) 2024-08-01 02:23:31 +02:00
Manuela Kuhn
b3b9f4d4b7 py-pybv: new package (#45370) 2024-08-01 02:07:11 +02:00
Manuela Kuhn
c1f2b36854 py-edfio: add v0.4.3 and py-poetry-dynamic-versioning: add v1.4.0, fix url (#45369) 2024-08-01 02:06:05 +02:00
Vincent Michaud-Rioux
87494d2941 py-pennylane: Add 0.36 and 0.37 with deps (#45134) 2024-07-31 22:55:02 +02:00
Ian Lumsden
ad26dcfbfc flux-core,flux-sched: fix environments with external flux (#44775) 2024-07-31 21:36:51 +02:00
Vlad Savelyev
5541a184d5 py-multiqc: add v1.23 (#45325) 2024-07-31 21:16:21 +02:00
Manuela Kuhn
f1140055d0 py-pymatreader: Add v0.0.32 (#45366) 2024-07-31 21:06:15 +02:00
Manuela Kuhn
88782fb05a py-eeglabio: new package (#45372) 2024-07-31 21:04:44 +02:00
Manuela Kuhn
69d216a88e py-edflib-python: new package (#45371) 2024-07-31 21:03:26 +02:00
Wouter Deconinck
04f0af0a28 acts,dd4hep: restrict to podio@0 to prevent failures with podio@1 (#44825)
* dd4hep: restrict to podio@0 to prevent failures with podio@1

* acts: restrict to podio@0 to prevent failures with podio@1

* dd4hep: close when range for podio

* acts: close when range for podio

* acts: fix when range for podio

* acts: close when range for podio

* acts,dd4hep: ensure main/master still depends on podio
2024-07-31 20:51:38 +02:00
Wouter Deconinck
c71c7735fd py-globus-sdk: add through v3.42; pypi now uses underscores (#45349) 2024-07-31 20:48:06 +02:00
Teague Sterling
89bc483c87 py-cffi: add v1.16.0 (#45386)
Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2024-07-31 20:16:49 +02:00
Felix Thaler
62d2e8d1f4 libvterm: Fix download: Use download from launchpad.net (#45094) 2024-07-31 19:14:18 +02:00
ron minnich
12abc233d0 mpigraph: new package (LLNL mpigraph) (#45121)
Signed-off-by: Ron Minnich <rminnich@google.com>
2024-07-31 19:08:48 +02:00
Andrew W Elble
c30c5df340 libxcb: xcb-proto is a build dependency (#45523) 2024-07-31 10:48:21 -04:00
Mikael Simberg
4a088f717e pika: add v0.26.1 (#45515) 2024-07-31 02:52:55 -06:00
Christopher Christofi
9a10538f6d openslide: add new version 4.0.0 (#42158)
* openslide: add new version 4.0.0

* openslide: update dependency organization
2024-07-31 09:27:34 +01:00
arezaii
c753446353 Chapel package: updates post release (#45304)
* Fix +rocm variant, to ensure correct dependencies on ROCm packages
  and use of AMD LLVM
* Add a +pshm variant for comm=gasnet to enable fast shared-memory
  comms between co-locales
* Add logic to ensure we get the native CXI libfabric network provider
  on Cray EX
* Expand dependency type for package modules to encompass runtime
  dependencies
* Factor logic for setting (LD_)LIBRARY_PATH and PKG_CONFIG_PATH of
  runtime dependencies
* Workaround issue #44746 that causes a transitive dependency on lua
  to break SLURM
* Disable nonfunctional checkChplDoc test
* Annotate some variants as conditional, to improve spack info output
  and reduce confusion

---------

Co-authored-by: Dan Bonachea <dobonachea@lbl.gov>
2024-07-30 18:24:56 -07:00
Chris Marsh
65a15c6145 Mac OS UUID virtual: platform-specific virtuals not correctly prioritized (#43002)
`apple-libuuid` includes types that aren't available in other `uuid`
providers; this causes issues in consuming packages (e.g., py-matplotlib)
that use SDKs like CarbonCore.framework when they attempt to use
`util-linux-uuid` as a `uuid` provider on Mac OS.

Tweak `util-linux-uuid` to indicate that it does not provide `uuid`
on Mac OS.
2024-07-30 11:38:07 -06:00
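One plausible shape of the tweak in Spack's package DSL; the conditional `provides` is the point and the rest of the package is elided:

```python
from spack.package import *

class UtilLinuxUuid(AutotoolsPackage):
    """Sketch: only the virtual-provider restriction is shown."""

    # Do not offer the uuid virtual on macOS, where apple-libuuid
    # supplies types that this provider lacks.
    provides("uuid", when="platform=linux")
```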
dependabot[bot]
e563f84ae2 build(deps): bump docker/setup-buildx-action from 3.5.0 to 3.6.1 (#45495)
Bumps [docker/setup-buildx-action](https://github.com/docker/setup-buildx-action) from 3.5.0 to 3.6.1.
- [Release notes](https://github.com/docker/setup-buildx-action/releases)
- [Commits](aa33708b10...988b5a0280)

---
updated-dependencies:
- dependency-name: docker/setup-buildx-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-30 07:57:15 -07:00
Teague Sterling
622ad1ddd7 perl-bio-ensembl-funcgen: new package (#44508)
* Adding the perl-bio-ensembl-funcgen package

* Update package.py

* Update package.py
2024-07-30 11:17:29 +01:00
jmlapre
1bd17876ed trilinos: add v16.0.0 (#45402) 2024-07-29 23:23:11 -06:00
Richard Berger
a789689709 py-furo: add new versions (#45439) 2024-07-29 19:58:43 -06:00
Teo
171a2e0e31 add new maintainer (#45436) 2024-07-29 19:58:14 -06:00
Seth R. Johnson
66d3fddedf Remove maintainership from packages I have no stake in (#45435) 2024-07-29 16:23:00 -06:00
RichardBuntLinaro
14a3b13900 linaro-forge: added 24.0.3 version (#45430) 2024-07-29 09:31:14 -06:00
Teague Sterling
40d41455db perl-bio-ensembl: new package (#44506)
* Adding perl-bio-ensembl package

* Fixing checksums

* Update package.py
2024-07-29 13:33:09 +01:00
Todd Gamblin
d63ead25ac add spack audit configs to ci
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-07-29 01:30:14 -07:00
Todd Gamblin
4a35dec206 wasi-sdk: add default provider
This was missed in #45394 because we don't run unit tests for package PRs, and
`test_all_virtual_packages_have_default_providers`, which would've caught it, is a unit
test, not an audit.

- [x] add a default provider for `wasi-sdk` in `etc/spack/defaults/packages.yaml` (which
      we require for all virtuals)
- [x] rework `test_all_virtual_packages_have_default_providers` as an audit called
      `_ensure_all_virtual_packages_have_default_providers`

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-07-29 01:30:14 -07:00
Matt Jolly
c294b9d3b9 meson: add v1.4.2, v1.5.1 (#45384)
Also deprecate old and superseded versions
2024-07-29 09:02:48 +02:00
AcriusWinter
057b415074 pinentry: old to new test API (#45011)
* pinentry: New API
* move code around
* added back version check
* Complete check_version refactor
* Honor original handling of guis (i.e., don't try if not there)

---------

Co-authored-by: Tamara Dahlgren <dahlgren1@llnl.gov>
2024-07-26 14:07:06 -06:00
Manuela Kuhn
3180b28d76 py-mne: add v1.7.1 (#45400)
* py-mne: add v1.7.1
* fix style
2024-07-26 12:04:58 -07:00
Walter de Jong
b47c31509d apptainer: add v1.3.2, v1.3.3 (#45398)
* upgrade apptainer: security fix
  apptainer 1.3.1 has a high severity security issue: CVE-2024-3727
  Upgrade to 1.3.2 or preferably 1.3.3
* added comment for 1.3.1

---------

Co-authored-by: Walter de Jong <walter.dejong@surf.nl>
2024-07-26 08:51:30 -07:00
Sreenivasa Murthy Kolam
f99a5ef2e7 Fix build failure when kokkos +rocm is enabled. (#44459)
* fix kokkos +rocm build failure

* address review comments

* address review comments: revert the previous changes

* address review comments. Add rocthrust for 4.3 version onwards
2024-07-26 06:44:06 -07:00
Wouter Deconinck
690bcf5d47 intel-oneapi-compilers: update description with current compilers (#45348)
* intel-oneapi-compilers: update description with current compilers

* Update var/spack/repos/builtin/packages/intel-oneapi-compilers/package.py

Co-authored-by: Robert Cohn <robert.s.cohn@intel.com>

* intel-oneapi-compilers: break docstring line

---------

Co-authored-by: Robert Cohn <robert.s.cohn@intel.com>
2024-07-26 07:20:21 -06:00
AcriusWinter
1e6bef079d kokkos: new test API (#45010)
* kokkos: new test API
* kokkos: added import llnl.util.lang as lang because earlier versions couldn't be installed without it

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-07-25 18:39:24 -07:00
AcriusWinter
564155fd1a povray: new test API (#45174)
* povray: new test API
* capture output and test name change
* povray: add v3.7.0.10, deprecate 3.7.0.8

---------

Co-authored-by: Tamara Dahlgren <dahlgren1@llnl.gov>
2024-07-25 17:51:34 -07:00
AcriusWinter
f371b6f06c umpire: old to new test API and refactor (#44957)
* umpire: old to new test format
* umpire: old to new test method and refactor
* indentation
* black reformat
* last minute syntax
* docstring and checks
* black format
* change test name
* method call correction
* Resolve problem with stand-alone tests (or examples) not loading library (Fixes #44960)

---------

Co-authored-by: Tamara Dahlgren <dahlgren1@llnl.gov>
2024-07-25 17:47:44 -07:00
Howard Pritchard
0f9434fca4 openmpi: add v5.0.4 and v5.0.5 (#45305)
* Open MPI: add release 5.0.4
* OpenMPI: add release 5.0.5
   needed quick turnaround owing to
   https://github.com/open-mpi/ompi/issues/12693

---------

Signed-off-by: Howard Pritchard <howardp@lanl.gov>
2024-07-25 15:14:37 -07:00
Martin Diehl
235831a035 damask 3.0.0 and 3.0.0-beta2 (#45405) 2024-07-25 14:03:09 -07:00
Teague Sterling
4240748cea wasi-sdk-prebuilt: new package (#45394)
* wasi-sdk-prebuilt: new package
* move url_for_version to be first method

---------

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2024-07-25 12:56:35 -07:00
John W. Parent
934e34fbd6 CMake package: add versions 3.30.0 and 3.30.1 (#45418) 2024-07-25 10:39:56 -06:00
Adam J. Stewart
ea42d18506 libgcrypt: add patch for avx512 support (#45432) 2024-07-25 15:57:18 +02:00
Andrew W Elble
2b763ff2db py-tensorflow: alter gcc conflict, fix build (#45330) 2024-07-25 08:22:48 +02:00
AcriusWinter
c6cc97953b sz: new test API (#45363)
* sz: new test API
* fix typo; check installed executable; conform to subpart naming convention
* skip tests early if not installed; remove unnecessary "_sz" from test part names

---------

Co-authored-by: Tamara Dahlgren <dahlgren1@llnl.gov>
2024-07-24 19:09:57 -06:00
Tamara Dahlgren
ff144df549 strumpack: make standalone test for +mpi more robust (#44943)
* strumpack: make standalone test for +mpi more robust
* Comment about which MPI launcher being attempted
2024-07-24 17:57:12 -07:00
Teo
f3acf201c4 halide: add v18.0.0 (#45401)
* Added Halide 18
* Fix style+other stuff
* Accept compiler deps
* 17.0.2 too
* reorder versions
2024-07-24 16:22:51 -07:00
John W. Parent
e5364ea832 Netlib-lapack package: search for correct library names on Windows (#45417)
Library names on Windows are not typically prefixed with lib; the default
`.libs` implementation accounts for this, but `netlib-lapack` has a
custom implementation of `.libs` that did not account for this.
2024-07-24 13:28:30 -07:00
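A hypothetical sketch of a `libs` property that does not hard-code the `lib` prefix; the names and options are illustrative, not the exact implementation.

```
@property
def libs(self):
    shared = self.spec.satisfies("+shared")
    # On Windows the libraries are named e.g. lapack.lib rather than
    # liblapack.*, so search for both spellings.
    return find_libraries(
        ["liblapack", "lapack"], root=self.prefix, shared=shared, recursive=True
    )
```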
dependabot[bot]
53b8f91c02 build(deps): bump docker/login-action from 3.2.0 to 3.3.0 (#45378)
Bumps [docker/login-action](https://github.com/docker/login-action) from 3.2.0 to 3.3.0.
- [Release notes](https://github.com/docker/login-action/releases)
- [Commits](0d4c9c5ea7...9780b0c442)

---
updated-dependencies:
- dependency-name: docker/login-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-24 13:40:12 -04:00
Todd Gamblin
a841ddd00c spack pkg grep: don't warn when grepping for quoted strings (#45412)
The `Executable` class emits a warning when you pass quoted arguments to it:

```
> spack pkg grep '"namespace"'
==> Warning: Quotes in command arguments can confuse scripts like configure.
  The following arguments may cause problems when executed:
      "namespace"
  Quotes aren't needed because spack doesn't use a shell. Consider removing them.
  If multiple levels of quotation are required, use `ignore_quotes=True`.
```

This is to warn new package authors who aren't used to calling build commands in python.
It's not useful for `spack pkg grep`, where we really are passing args on the command
line, and if we pass a quoted string, we probably meant to.

- [x] make `ignore_quotes` an instance variable, not just an argument to `__call__`
- [x] set `ignore_quotes` to `True` in `spack pkg grep`

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-07-24 08:11:32 -07:00
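A minimal sketch of the `ignore_quotes` change, assuming a hypothetical `warn_about_quotes` helper; Spack's real `Executable` differs in detail.

```
class Executable:
    def __init__(self, name):
        self.name = name
        self.ignore_quotes = False  # instance-level default, settable by callers

    def __call__(self, *args, **kwargs):
        # A per-call keyword still overrides the instance attribute.
        ignore_quotes = kwargs.pop("ignore_quotes", self.ignore_quotes)
        if not ignore_quotes and any('"' in a or "'" in a for a in args):
            warn_about_quotes(args)  # hypothetical helper emitting the warning
        ...
```

With this, `spack pkg grep` can construct its grep executable once with `ignore_quotes = True` instead of threading the flag through every call.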
snehring
39455768b2 hybpiper: change package type, add version 2.1.8 (#45262)
Signed-off-by: Shane Nehring <snehring@iastate.edu>
2024-07-24 09:16:54 -05:00
afzpatel
e529a454eb CI: add ML ROCm stack (#45302)
* add ML ROCm stack

* add suggested changes

* remove py-torch and py-tensorflow-estimator

* add TF_ROCM_AMDGPU_TARGETS env variable and remove packages from pipeline

* remove py-jax and py-xgboost
2024-07-24 16:16:15 +02:00
AcriusWinter
1b5dc396e3 uftrace: change to executable declaration (#45403) 2024-07-23 18:32:06 -06:00
Wouter Deconinck
15a3ac0512 py-arrow: add v1.3.0 (switch to flit-core) (#45123) 2024-07-23 17:50:21 -06:00
Thomas Applencourt
52f149266f ruby: add v3.3.4 (#45334)
* Ruby Add 3.3.4

* Update package.py

* Update package.py
2024-07-23 17:38:52 -06:00
Teague Sterling
8d33c2e7c0 generate-ninja: new package (#45391)
Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2024-07-23 14:47:04 -07:00
Karol Krizka
b3d82dc3a8 ROOT should add_include_path virtual glu for consistency. (#45057) 2024-07-23 14:12:54 -05:00
eugeneswalker
0fb44529bb e4s rocm external ci stack: upgrade to v6.1.2 (#45356)
* e4s rocm external ci stack: upgrade to v6.1.2

* magma: add rocm-core 6.1.2
2024-07-23 09:52:52 -07:00
dependabot[bot]
6ea944bf17 build(deps): bump pytest from 8.2.2 to 8.3.1 in /lib/spack/docs (#45377)
Bumps [pytest](https://github.com/pytest-dev/pytest) from 8.2.2 to 8.3.1.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/main/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/8.2.2...8.3.1)

---
updated-dependencies:
- dependency-name: pytest
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-23 11:53:55 -04:00
dependabot[bot]
8c6177c47f build(deps): bump sphinx from 7.4.6 to 7.4.7 in /lib/spack/docs (#45376)
Bumps [sphinx](https://github.com/sphinx-doc/sphinx) from 7.4.6 to 7.4.7.
- [Release notes](https://github.com/sphinx-doc/sphinx/releases)
- [Changelog](https://github.com/sphinx-doc/sphinx/blob/master/CHANGES.rst)
- [Commits](https://github.com/sphinx-doc/sphinx/compare/v7.4.6...v7.4.7)

---
updated-dependencies:
- dependency-name: sphinx
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-23 11:53:43 -04:00
dependabot[bot]
b65d9f1524 build(deps): bump docker/setup-buildx-action from 3.4.0 to 3.5.0 (#45379)
Bumps [docker/setup-buildx-action](https://github.com/docker/setup-buildx-action) from 3.4.0 to 3.5.0.
- [Release notes](https://github.com/docker/setup-buildx-action/releases)
- [Commits](4fd812986e...aa33708b10)

---
updated-dependencies:
- dependency-name: docker/setup-buildx-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-23 11:53:18 -04:00
dependabot[bot]
03e5dddf24 build(deps): bump docker/build-push-action from 6.4.1 to 6.5.0 (#45380)
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 6.4.1 to 6.5.0.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](1ca370b3a9...5176d81f87)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-23 11:53:04 -04:00
dependabot[bot]
7bb892f7b3 build(deps): bump docker/setup-qemu-action from 3.1.0 to 3.2.0 (#45381)
Bumps [docker/setup-qemu-action](https://github.com/docker/setup-qemu-action) from 3.1.0 to 3.2.0.
- [Release notes](https://github.com/docker/setup-qemu-action/releases)
- [Commits](5927c834f5...49b3bc8e6b)

---
updated-dependencies:
- dependency-name: docker/setup-qemu-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-23 11:52:42 -04:00
Wouter Deconinck
66ed8ebbd9 gh: convert to GoPackage (#45351)
* gh: convert to GoPackage

* gh: fix style
2024-07-23 11:49:05 -04:00
Wouter Deconinck
0d326f83b6 GoPackage: default -modcacherw to ensure stage cleanup (#45350) 2024-07-23 11:48:52 -04:00
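A sketch of the idea, assuming the flag is injected through the build environment: Go writes its module cache read-only by default, which prevents the stage from being deleted, and `-modcacherw` keeps it writable.

```
def setup_build_environment(self, env):
    # Keep the Go module cache writable so the stage can be cleaned up.
    env.append_flags("GOFLAGS", "-modcacherw")
```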
Dom Heinzeller
fc0955b125 Update and clean up hdf-eos2 package.py to fix build errors with Intel oneAPI (#45399) 2024-07-23 07:54:12 -07:00
Mikael Simberg
13ba1b96c3 fmt: add 11.0.2 (#45396) 2024-07-23 07:46:45 -07:00
Mikael Simberg
d66d169027 Remove # generated comments from many packages, add some missing depends_on("cxx") (#45395) 2024-07-23 10:31:22 +02:00
Manuela Kuhn
6decd6aaa1 py-mne-bids: add new package (#45374) 2024-07-22 17:11:18 -06:00
AcriusWinter
3c0ffa8652 uftrace: new test API, add 0.16 with libtraceevent, fix build (#45364)
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
2024-07-22 16:31:38 -06:00
Christoph Junghans
4917e3f664 votca: add v2024.1 (#45341) 2024-07-22 14:22:46 -06:00
Adam J. Stewart
b2a14b456e py-numpy: add v2.0.1 (#45354) 2024-07-22 14:04:09 -06:00
Teague Sterling
ab1580a37f linux-pam: add v1.5.1, v1.5.3, v1.6.0, v1.6.1 and additional variants (#45344)
* package/linux-pam: dependencies
* Adding variants to linux-pam
* Updating linux-pam variants
* Fixing variants for linux-pam after testing
* clean up flag handling
* clean up terrible tabs
* cant use default_args for compiler dependencies
* Change selinux to off by default

---------

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2024-07-22 12:57:39 -07:00
Teague Sterling
c1f979cd54 libwnck: new package (#44641) 2024-07-22 21:27:45 +02:00
Teague Sterling
d124338ecb libgudev: new package (#44639) 2024-07-22 21:24:42 +02:00
Brad Geltz
d001f14514 geopm: Add v3.1 and update the needed dependencies (#44793) 2024-07-22 21:23:11 +02:00
Jordan Galby
c43205d6de asciidoc: Fix asciidoc@10 install (#44926) 2024-07-22 21:16:08 +02:00
snehring
54f1af5a29 libint: Fix build for 2.6.0, add libs property (#45034)
Signed-off-by: Shane Nehring <snehring@iastate.edu>
2024-07-22 20:55:34 +02:00
Alex Seaton
350661f027 heyoka: add current 4.0.x and 5.0.0 releases (#45314) 2024-07-22 20:16:39 +02:00
Philipp Edelmann
3699df2651 rayleigh: add v1.2.0 (#45333) 2024-07-22 20:05:44 +02:00
Andrew W Elble
63197fea3e dedisp: fix preinstall: it only takes self (#45328)
too many arguments here; the hook only takes `self`.
2024-07-22 20:00:14 +02:00
Martin Lang
f044194b06 libvdwxc: Fix configure with gcc-14 (#45093) 2024-07-22 19:54:49 +02:00
William Moses
29302c13e9 Update Enzyme to 0.0.135 (#45346) 2024-07-22 09:35:40 -07:00
Diego Alvarez S.
c4808de2ff Add nextflow 24.04.3 (#45357) 2024-07-22 09:29:46 -07:00
Alex Leute
c390a4530e py-colored: Added new version (#45361) 2024-07-22 09:17:13 -07:00
Teague Sterling
be771d5d6f node-js: add v17.9.1, v20.15.0, v21.7.3, v22.3.0, v22.4.0 (#45007)
* Adding new versions and compilation conflict for nodejs
* Update failed version for gcc14
* Updating conflicts notes for correctness/clarity/format
* Applying spack-ized versions of fix in https://github.com/nodejs/node/issues/52223 to address CI failures
* Update fix-old-glibc-random-headers.patch
* Update package.py
* Update fix-old-glibc-random-headers.patch
* Update fix-old-glibc-random-headers.patch
* Adding conflict for older glibc
* Fixing patch for older systems, need to undef
* Removing overly strict conflicts

---------

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2024-07-22 08:32:34 -07:00
Wouter Deconinck
8b45fa089e geant4: support Qt5 and Qt6 (#45352)
* geant4: support qt5 and qt6
* geant4: update conflict msg
2024-07-22 13:38:46 +01:00
dependabot[bot]
0d04223ccd build(deps): bump mypy from 1.10.1 to 1.11.0 in /lib/spack/docs (#45337)
Bumps [mypy](https://github.com/python/mypy) from 1.10.1 to 1.11.0.
- [Changelog](https://github.com/python/mypy/blob/master/CHANGELOG.md)
- [Commits](https://github.com/python/mypy/compare/v1.10.1...v1.11)

---
updated-dependencies:
- dependency-name: mypy
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-20 11:25:31 -05:00
Peter Scheibel
5ef222d62f Testing: omit test on windows (#45340)
Re-disable a test that was enabled in #45031
2024-07-20 00:32:37 -07:00
Nicole C.
6810e9ed2e Windows Tests: enable more cmd tests on Windows (#45031)
* Several tests can be enabled on Windows with no changes to logic
  (either the test logic or logic being tested)
* Test for `spack location` requires modification of the test logic,
  but with a minor change can be enabled on Windows
2024-07-19 18:08:28 -07:00
Andrew W Elble
a6c638d0fa sqlite: fix AttributeError when +functions (#45327)
using self.compiler.cc_pic_flag here results in these errors:

==> sqlite: Executing phase: 'install'
==> Error: AttributeError: 'AutotoolsBuilder' object has no attribute 'compiler'

change it to self.pkg.compiler.cc_pic_flag instead.
2024-07-19 16:24:22 -06:00
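A sketch of the corrected pattern inside a builder method; the surrounding configure logic is illustrative.

```
def configure_args(self):
    args = []
    if self.spec.satisfies("+functions"):
        # Builder objects have no .compiler; reach it through the package.
        args.append("CFLAGS=" + self.pkg.compiler.cc_pic_flag)
    return args
```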
Rémi Lacroix
fa8a512945 NCCL: add version 2.22.3-1. (#45322) 2024-07-19 12:18:16 -06:00
Wouter Deconinck
24b73da9e6 docs: util/environment.py: use re.Pattern[str] instead of re (#45329)
* docs: util/environment.py: use `re.Pattern[str]` instead of `re`

* docs: sphinx==7.4.6
2024-07-19 20:03:07 +02:00
afzpatel
4447d3339c change 2.16-rocm-enhanced to 2.16.1-rocm-enhanced (#45320) 2024-07-19 09:45:31 -06:00
Alex Leute
d82c9e7f2a usearch: new version (#45308)
* usearch: new version
   Manual download no longer required for @12:
2024-07-19 01:42:58 -06:00
AcriusWinter
6828a7402a improved-rdock: new test API (#45239)
* improved-rdock: new test API
* Make test subpart names and/or descriptions more accurate

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-07-18 16:06:45 -06:00
dmagdavector
a0d62a40dd sqlite: add some newer releases (#45297)
Included: 3.46.0 (most current), 3.45.3, 3.45.1 (for possible compat with Ubuntu 24.04 LTS), 3.44.2.
2024-07-18 15:06:24 -06:00
eugeneswalker
712dcf6b8d e4s ci: enable some disabled specs (#44934)
* e4s ci: enable some disabled specs

* comment out cp2k +cuda due to unsupported cuda_arch

* comment out dealii+cuda due to concretize error

* work through concretize errors

* e4s: comment out failing builds

* e4s stack: disabled non-building specs

* comment out failing specs

* comment out failing specs

* cleanup comments
2024-07-18 20:58:10 +00:00
Wouter Deconinck
ad1fc34199 rust: rework external find to require both rustc and cargo (#45286)
* rust: rework external find to require both rustc and cargo

* rust: handle unable to parse version

* [@spackbot] updating style on behalf of wdconinc

* rust: not x or not y -> not (x and y)

Co-authored-by: Alec Scott <hi@alecbcs.com>

* rust: pick first rustc found

Co-authored-by: Alec Scott <hi@alecbcs.com>

* rust: list comprehensions

Co-authored-by: Alec Scott <hi@alecbcs.com>

---------

Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
Co-authored-by: Alec Scott <hi@alecbcs.com>
2024-07-18 14:19:50 -06:00
Adam J. Stewart
ab723b25d0 py-tensorflow: add v2.17.0 (#45201) 2024-07-18 12:54:52 -07:00
Wouter Deconinck
016673f419 py-cryptography: add v41.0.7, v42.0.8; py-setuptools-rust: add v1.7.0, v1.8.1, v1.9.0 (#45282) 2024-07-18 12:18:27 -07:00
Adam J. Stewart
7bc6d62e9b py-sphinx: add v7.4 (#45255) 2024-07-18 12:48:18 -06:00
Bill Williams
fca9cc3e0e Allow remapping of compiler names (#45299)
CCE in Spack appears as "Cray" on the Score-P configure line; other compiler-name remappings can be added.

Co-authored-by: William Williams <william.williams@tu-dresden.de>
2024-07-18 11:48:51 -06:00
snehring
2a178bfbb0 mafft: add version 7.525 (#45258)
Signed-off-by: Shane Nehring <snehring@iastate.edu>
2024-07-18 10:25:53 -07:00
snehring
3381879358 spades: add version 4.0.0 and new variants (#45278)
Signed-off-by: Shane Nehring <snehring@iastate.edu>
2024-07-18 10:24:05 -07:00
Matt Thompson
ed9058618a mapl: add v2.47.1 (#45280)
* mapl: add 2.47.1
* Approve compiler depends_on
2024-07-18 10:21:15 -07:00
Harmen Stoppels
a4c99bad6a git packages: add language dep (#45294) 2024-07-18 19:17:54 +02:00
Pranav Sivaraman
f31f58ff26 magic-enum: add version 0.9.6 (#45284)
* magic-enum: add version 0.9.6
* magic-enum: add maintainer

---------

Co-authored-by: pranav-sivaraman <pranav-sivaraman@users.noreply.github.com>
2024-07-18 10:15:35 -07:00
Wouter Deconinck
f84918da4b harfbuzz: add v9.0.0 (#45290)
* harfbuzz: add v9.0.0
* harfbuzz: do not patch non-existing Makefile beyond v8
2024-07-18 10:10:18 -07:00
Richard Berger
80a237e250 netlib-lapack: add pic variant (#45291) 2024-07-18 10:06:25 -07:00
Cameron Smith
f52d3b26c3 pumi: language dependencies (#45301)
Signed-off-by: Cameron Smith <smithc11@rpi.edu>
2024-07-18 11:04:57 -06:00
AcriusWinter
2029d714a0 rocm-opencl: old to new test API (#45065)
* rocm-opencl: old to new test API
* Run tests from test stage directory

---------

Co-authored-by: Tamara Dahlgren <dahlgren1@llnl.gov>
2024-07-18 09:35:10 -07:00
Alec Scott
31ef1df74f go: remove invalid deps (#45279)
* go: remove invalid deps

* go: add dependencies sed and grep
2024-07-18 09:18:05 -07:00
Richard Berger
00ae96a7cb libmesh: add v1.7.1, and fixes (#45292)
* libmesh: add missing v1.7.1 release

* libmesh: avoid pulling in petsc if only +mpi or +metis is given

* libmesh: add shared variant

Co-authored-by: rbberger <rbberger@users.noreply.github.com>
2024-07-18 13:46:39 +02:00
Seth R. Johnson
8d2a6d6744 sethrj: update maintained package language dependencies (#45289) 2024-07-18 10:55:20 +02:00
Massimiliano Culpo
9443e31b1e Do not initialize previous store state in "use_store" (#45268)
The "use_store" context manager is used to swap the value
of a global variable (spack.store.STORE), while keeping
another global variable consistent (spack.config.CONFIG).

When doing that it tries to evaluate the previous value
of the store, if that was not done already. This is wrong,
since the configuration might be in an "intermediate" state
that was never meant to trigger side effects.

Remove that operation, and add a unit test to
prevent regressions.
2024-07-18 07:18:14 +02:00
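A minimal sketch of the pattern, with `STORE` standing in for `spack.store.STORE`; the real context manager also keeps the global configuration consistent.

```
import contextlib

STORE = None  # module-level global standing in for spack.store.STORE


@contextlib.contextmanager
def use_store(new_store):
    global STORE
    previous = STORE  # keep the reference as-is; do not evaluate a lazy value
    STORE = new_store
    try:
        yield new_store
    finally:
        STORE = previous
```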
Wouter Deconinck
2d8ca8af69 qt-*: add v6.7.1, v6.7.2 (#45288) 2024-07-17 22:03:13 -06:00
dependabot[bot]
de4d4695c4 build(deps): bump docker/build-push-action from 6.4.0 to 6.4.1 (#45283)
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 6.4.0 to 6.4.1.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](a254f8ca60...1ca370b3a9)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-17 21:43:12 -06:00
MichaelLaufer
c8cf85223f py-pyfr: add v2.0.3 (#45274)
* py-pyfr: add v2.0.3
2024-07-17 18:36:40 -06:00
Wouter Deconinck
b869538544 environment: handle view root at existing directory better (#45263)
- remove an empty directory if one exists at the view root
- error out more clearly if the directory is non-empty (see the sketch
  after this entry)

Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2024-07-17 23:17:30 +02:00
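A sketch of the behavior described above; the function name is hypothetical.

```
import os

def prepare_view_root(root):
    if os.path.isdir(root):
        if os.listdir(root):
            raise RuntimeError(f"cannot create view: '{root}' is a non-empty directory")
        os.rmdir(root)  # an empty directory at the view root is removed
```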
Adam J. Stewart
4710cbb281 py-lightning: setuptools required at run-time (#45260) 2024-07-17 11:11:42 -07:00
Massimiliano Culpo
9ae1014e55 Run minimization of weights only on known targets (#45269)
This prevents excessive output from clingo of the kind:

.../spack/lib/spack/spack/solver/concretize.lp:1640:5-11: info: tuple ignored:
  #sup@2
2024-07-17 11:10:00 -07:00
afzpatel
813c0dd031 hipsparselt, composable-kernel: add netlib-lapack test dependency and enable ck test (#45273)
* add netlib-lapack dependency to hipsparselt and enable ck test
* fix cmake args
2024-07-17 10:49:28 -07:00
Lucas Frérot
91071933d0 tamaas: added version 2.8.0 and petsc variant (#45267)
* tamaas: added version 2.8.0
* tamaas: added +petsc variant for extra solvers
2024-07-17 10:30:40 -07:00
snehring
df5bac3e6c giflib: remove convert call in doc generation (#45276)
Signed-off-by: Shane Nehring <snehring@iastate.edu>
2024-07-17 09:42:20 -07:00
Harmen Stoppels
7b9f8abce5 Add depends_on([c,cxx,fortran]) (#45217)
Add language dependencies `c`, `cxx`, and `fortran`.

These `depends_on` statements are auto-generated based on file extensions found
in source tarballs/zipfiles.

The `# generated` comment can be removed by package maintainers after
validating correctness.
2024-07-17 16:07:43 +02:00
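In a package recipe the generated directives look like this; the `# generated` marker is what maintainers delete after verification.

```
class Example(AutotoolsPackage):
    depends_on("c", type="build")  # generated
    depends_on("cxx", type="build")  # generated
    depends_on("fortran", type="build")  # generated
```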
Harmen Stoppels
a2f9d4b6a1 pixman: unconditional --with-pic (#45272) 2024-07-17 16:02:50 +02:00
Harmen Stoppels
77e16d55c1 warpx: fix openpmd backward compat bound (#45271) 2024-07-17 15:31:43 +02:00
Stephen Nicholas Swatman
ecb2442566 detray: new package (#45024)
* detray: new package

This commit adds the detray package, a detector description library for
HEP experiments that is designed to be GPU-friendly.

* Update var/spack/repos/builtin/packages/detray/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* Update var/spack/repos/builtin/packages/detray/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-07-17 07:42:19 -05:00
Alec Scott
89c0b4accf libgcrypt: conflict with darwin when @1.11.0 (#45264)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-07-16 23:45:13 -06:00
downloadico
8e5b51395a abinit: add version 10.0.7 (#45250)
* abinit: add version 10.0.7
* abinit: simplified version constraint for applying rm_march_settings_v9.patch
2024-07-16 15:08:29 -07:00
AcriusWinter
c2ada0f15a parflow: Old test method to new test method (#44933)
* parflow: Old test method to new test method
* add output checker
* made req. changes
2024-07-16 12:43:31 -07:00
afzpatel
6d3541c5fd fix hipblas test (#44666)
* fix hipblas test
* add rocm-openmp-extras dependencies
2024-07-16 12:39:08 -07:00
snehring
31e4149067 vasp: add new version 6.4.3 and rework package (#44937)
* vasp: add new version 6.4.3 and rework package
* vasp: remove redundant cuda dep
* vasp: bump fftlib variant version restriction
* vasp: honor the still existing scalapack variant

---------

Signed-off-by: Shane Nehring <snehring@iastate.edu>
2024-07-16 12:37:44 -07:00
otsukay
c9fba9ec79 fujitsu.patch is no longer needed for versions>=4.5 (#45154) 2024-07-16 12:18:39 -07:00
Wouter Deconinck
282627714e gaudi: depends_on python +dbm (#45238) 2024-07-16 12:13:17 -07:00
fpruvost
714dd783f9 pastix: new version v6.4.0 (#45246) 2024-07-16 12:06:30 -07:00
Harmen Stoppels
40b390903d gmake: generic CXX, fix build.sh, deprecate 4.0, add 4.1 (#45137)
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-07-16 18:58:57 +02:00
Massimiliano Culpo
ce1b569b69 Fix order of deserialization in subprocess context (#45229)
Since the MetaPathFinder now owns a lazily constructed RepoPath object, we need to deserialize environments before the packages that need to be restored. Previously we relied on globals being inconsistent in a way that happened to let the whole process work.
2024-07-16 10:15:29 -06:00
Harmen Stoppels
b539eb5aab concretizer: show input specs on error (#45245) 2024-07-16 14:04:56 +02:00
Seth R. Johnson
e992e1efbd Celeritas: new version 0.4.4 (#45234) 2024-07-16 04:23:02 -06:00
Alec Scott
33a52dd836 pass: switch to git based versions to fix changing checksum in tarball (#45237) 2024-07-16 04:18:09 -06:00
Wouter Deconinck
b5f06fb3bc py-mpmath: add v1.3.0; depends_on py-setuptools for old versions (#45232) 2024-07-16 04:07:40 -06:00
afzpatel
494817b616 correct test binary name (#45240) 2024-07-16 04:03:02 -06:00
Wouter Deconinck
02470a5aae geant4: add v11.3.0.beta (#45087)
* geant4: add v11.3.0.beta

* geant4: vecgeom@1.2.8: when 11.3:

* geant4-data: add v11.3.0

* g4particlexs: add v4.1

* g4emlow: add v8.6

* g4nudexlib, g4urrpt: add v1.0

* [@spackbot] updating style on behalf of wdconinc

* geant4: immediately deprecate geant4-11.3.0.beta

---------

Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-07-16 10:03:49 +01:00
Massimiliano Culpo
42232a8ab6 Fix error message for test log in child process (#45233)
If we don't have a log, we'll mask the real error with
another caused by using None as an argument to os.path.join
2024-07-16 06:58:36 +02:00
Matthieu Dorier
cb64df45c8 toml11: adds new versions (#45056) 2024-07-16 06:40:45 +02:00
Wouter Deconinck
a11da7bdb9 cmd/dependents.py: remove unused loop over all packages (#45166) 2024-07-16 06:38:01 +02:00
Matthew Lesko
9a22ae11c6 openmpi: fix pmix version check in v5 (#44928)
* OpenMPI 5 w/ PRRTE 3 series PMIX version check fix

OpenMPI fails to compile otherwise when targeting external PMIX 4.2.6 and likely others.

```
  >> 3369    base/ess_base_bootstrap.c:72:14: error: static declaration of 'pmix_getline' follows non-static declaration
     3370       72 | static char *pmix_getline(FILE *fp)
     3371          |              ^
     3372    /opt/pmix/include/pmix/src/util/pmix_string_copy.h:83:19: note: previous declaration is here
     3373       83 | PMIX_EXPORT char *pmix_getline(FILE *fp);
     3374          |                   ^
     3375    1 error generated.
  >> 3376    make[4]: *** [Makefile:820: base/ess_base_bootstrap.lo] Error 1
```

Upstream PRRTE fix (not released yet): https://github.com/openpmix/prrte/pull/1957
Upstream OpenMPI issue: https://github.com/open-mpi/ompi/issues/12537 ("fixed in next release")

Co-authored-by: Shahzeb Siddiqui <shahzebmsiddiqui@gmail.com>
2024-07-16 06:33:12 +02:00
Stephen Sachs
318a7e0e30 wrf: explicit conflict oneapi + older versions (#44787)
The patch which enables icx/ifx compilers is only added for `wrf@4.4:`. This PR prints a useful message at concretization time instead of failing the installation later on.

Co-authored-by: stephenmsachs <stephenmsachs@users.noreply.github.com>
2024-07-16 06:28:54 +02:00
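A sketch of how such a conflict can be declared in the recipe; the version bound and message are illustrative.

```
conflicts(
    "%oneapi",
    when="@:4.3",
    msg="the icx/ifx enablement patch is only applied for wrf@4.4:",
)
```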
Wouter Deconinck
e976f351f8 py-ipython: depends_on python +sqlite3 when @8: (#45231) 2024-07-16 06:26:41 +02:00
Wouter Deconinck
437341d40e py-nodeenv: depends_on python +ssl (#45225) 2024-07-16 06:23:21 +02:00
dependabot[bot]
9d7ea1a28b build(deps): bump docker/build-push-action from 6.3.0 to 6.4.0 (#45243)
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 6.3.0 to 6.4.0.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](1a162644f9...a254f8ca60)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-16 06:14:27 +02:00
AcriusWinter
d85668f096 slate: changed stand-alone test from old to new API (#44953)
* slate: changed from old to new format
* make code tighter
* replace assert method
* SkipTest plus other cleanup

---------

Co-authored-by: Tamara Dahlgren <dahlgren1@llnl.gov>
2024-07-15 21:10:50 -06:00
Alex Richert
5c3a23a481 pixman: add shared, pic variants (#44889)
* Add shared/pic variants to pixman
* add +shared~pic conflict
2024-07-15 17:36:45 -07:00
AcriusWinter
8be1f26ac6 tix: old to new test API (#45223) 2024-07-15 17:31:02 -07:00
Adam J. Stewart
35bd21fc64 py-tensorflow-estimator: correct dependencies (#44185) 2024-07-15 22:12:16 +02:00
Adam J. Stewart
652170fb54 DCMTK: fix build with libtiff (#45213) 2024-07-15 22:02:19 +02:00
Harmen Stoppels
d4e6c29f25 unparser.py: remove print statements (#45235) 2024-07-15 21:55:11 +02:00
Stephen Nicholas Swatman
c12772e73f vecmem: add infrastructure for working with SYCL (#45058)
* vecmem: add infrastructure for working with SYCL

The vecmem package uses an unorthodox build system where, instead of
expecting a SYCL-capable compiler in the `CXX` environment variable, it
expects one in `SYCLCXX`. It also needs the correct SYCL flags to be
set. This commit adds a custom build environment for the vecmem package
which allows it to be built in this way. I've also added an extra CMake
flag to ensure that the build system doesn't download any unwanted
dependencies.

* Update var/spack/repos/builtin/packages/vecmem/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-07-15 14:42:19 -05:00
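A sketch of the custom build environment described above; the flag value is illustrative.

```
def setup_build_environment(self, env):
    # vecmem's build system expects the SYCL-capable compiler in SYCLCXX
    # rather than CXX, together with the matching SYCL flags.
    env.set("SYCLCXX", self.compiler.cxx)
    env.set("SYCLFLAGS", "-fsycl")
```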
pauleonix
a26ac1dbcc cuda: add v12.5.1 (#44342)
- Add explicit conflict on ppc64le for 12.5 and newer.
- Update/fix intel compiler conflict to reflect that intel@2021 is compatible
  only since 11.4.1 and not since 11.1.1.
- Add intel compiler conflicts to reflect strict support matrix since
  12.2.0.
2024-07-15 20:04:47 +02:00
Teague Sterling
2afaeba292 zip: add patch for gcc@14: (#45006) 2024-07-15 19:42:41 +02:00
Martin Lang
a14e76b98d FFTW: missing function declaration in pfft patch (#45095) 2024-07-15 18:50:35 +02:00
renjithravindrankannath
9a3a759ed3 Updating rocm-opencl to 6.1.2 (#45219) 2024-07-15 09:27:53 -07:00
Michael Kuhn
72f17d6961 rocksdb: add 9.4.0 (#45230) 2024-07-15 09:26:50 -07:00
Harmen Stoppels
1b967a9d98 iconv: remove requirement (#45206)
no longer necessary after 5c53973220
2024-07-15 17:45:32 +02:00
Harmen Stoppels
bb954390ec sqlite: fix url_for_version (#45228) 2024-07-15 17:45:15 +02:00
Wouter Deconinck
bf9b6940c9 abseil-cpp: patch to avoid googletest build dependency (#45103) 2024-07-15 17:09:21 +02:00
simonLeary42
22980b9e65 rust: update cmake dependency ranges (#45145) 2024-07-15 17:03:23 +02:00
Vanessasaurus
483426f771 flux-sched: add v0.36.0 (#45161) 2024-07-15 17:01:07 +02:00
rfbgo
1e5b976eb7 py-pytorch-lightning: add v2.0.7 (#45175) 2024-07-15 16:59:03 +02:00
Patrick Diehl
2aa6939b96 hpx: add instrumentation=thread_debug (#45199)
Co-authored-by: Hartmut Kaiser <hartmut.kaiser@gmail.com>
2024-07-15 14:47:13 +02:00
Simo Tuomisto
f7e601d352 google-cloud-cli: fix unquoted value in env variable (#45207) 2024-07-15 14:41:44 +02:00
Manuela Kuhn
c4082931e3 r-colourpicker: add 1.3.0 (#45209) 2024-07-15 14:39:28 +02:00
Julien Cortial
cee3e5436b perl-json: add optional dependency on perl-json-xs (#45050) 2024-07-15 14:23:25 +02:00
Adam J. Stewart
613fa56bfc py-shapely: add v2.0.5 (#45224) 2024-07-15 11:38:10 +02:00
Michael Kuhn
0752d94bbf libelf: fix build with GCC 14 (#45226) 2024-07-15 10:18:36 +02:00
afzpatel
3bf1a03760 py-tensorflow: change py-tensorflow@2.16-rocm-enhanced to use tarball instead of branch (#45218)
* change py-tensorflow@2.16-rocm-enhanced to use tarball instead of branch

* remove revert_fd6b0a4.patch and use github commit patch url
2024-07-13 12:19:19 +02:00
John W. Parent
e2844e2fef bootstrap ci: add exit code validation for windows (#45221) 2024-07-12 23:07:45 -06:00
AcriusWinter
2ca733bbc1 rocm-clang-ocl: old to new test API (#44938)
* rocm-clang-ocl: old to new test format
* Minor cleanup

---------

Co-authored-by: Tamara Dahlgren <dahlgren1@llnl.gov>
2024-07-12 15:56:38 -06:00
Dom Heinzeller
e2b6eca420 qt: Add support for compiling @5.15.14 with Intel oneAPI compilers (icx, icpx) (#45195) 2024-07-12 15:35:01 -06:00
Dom Heinzeller
67cb19614e Update fckit to build with Intel oneAPI compilers (icx, icpx) (#45196) 2024-07-12 13:37:58 -06:00
Manuela Kuhn
e464461c19 py-pyqt6: add v6.7.0 (#45212) 2024-07-12 11:03:28 -07:00
Manuela Kuhn
6efe88f7a1 py-multiecho: add v0.29 (#45216) 2024-07-12 10:42:04 -07:00
Harmen Stoppels
0ce35dafe1 Add c to the list of languages (#45191) 2024-07-12 15:25:41 +02:00
pauleonix
49e419b2df cuda: add maintainer (#45211) 2024-07-12 06:11:10 -07:00
Massimiliano Culpo
d9033d8dac llvm: detect short executable names (#45171)
Also, remove annotations for "ld.lld" and "lldb"
2024-07-12 14:03:00 +02:00
Harmen Stoppels
517b7fb0c9 directives: types, avoid redundant parsing (#45208) 2024-07-12 13:35:16 +02:00
Massimiliano Culpo
568e79a1e3 gcc: consider link when detecting compilers (#45169) 2024-07-12 11:30:56 +02:00
Harmen Stoppels
9c8846b37b Add pkg- prefix to builtin.mock a b c d ... (#45205) 2024-07-12 11:27:40 +02:00
Tamara Dahlgren
737b70cbbf Buildcache: remove deprecated --allow-root and preview subcommand (#45204) 2024-07-11 18:19:04 -07:00
Dom Heinzeller
03d2212881 Bug fix for mapl: configure mvapich2 (#45164)
* Bug fix for mapl: configure mvapich2
* Update var/spack/repos/builtin/packages/mapl/package.py

---------

Co-authored-by: Matt Thompson <fortran@gmail.com>
2024-07-11 15:03:37 -07:00
Lev Gorenstein
b8d10916af Remove some explicit dependencies (#45146)
As discussed in https://github.com/spack/spack/pull/44881#issuecomment-2218411735, a `spack install py-globus-cli` fails to concretize on Ubuntu 22.04 under Windows WSL2 because of overly strict explicit dependencies.

Let's remove them here (they are "just in case" and should really be handled by `py-globus-sdk` anyway).
2024-07-11 11:19:47 -07:00
Wouter Deconinck
4fe5f35c2f xrootd: add v5.7.0 (#45078)
* xrootd: add v5.7.0
* xrootd: new variant +ec, depends_on isa-l
2024-07-11 11:12:27 -07:00
Greg Sjaardema
cea1d8b935 seacas: new version to fix some portability bugs (#45179)
* Now builds with latest fmt release (11.0.1)
* Missing array include in nem_spread
* Fix timestep consistency check in file-per-rank case if one or more dbs have no timesteps.
2024-07-11 09:52:07 -07:00
dependabot[bot]
e7946a3a41 build(deps): bump actions/checkout from 4.1.6 to 4.1.7 (#44693)
Bumps [actions/checkout](https://github.com/actions/checkout) from 4.1.6 to 4.1.7.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](a5ac7e51b4...692973e3d9)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-11 15:32:04 +02:00
Harmen Stoppels
5c53973220 concretize.lp: drop 0 weight of external providers (#45025)
If an external happens to be a provider of anything, the solver would
set its weight to 0, meaning that it is most preferred, even if
packages.yaml config disagrees.

That was done so that `spack external find mpich` would be sufficient to
pick it up as an mpi provider.

That may have made sense for mpi specifically, but doesn't make sense
for other virtuals. For example, `glibc` provides iconv and is an
external by design, but it's better to use libiconv, a separate
package, as the provider.

Therefore, drop this rule, and instead let users add config:

```
mpi:
  require: [mpich]
```

or

```
mpi:
  buildable: false
```

which is well-documented.
2024-07-11 15:29:56 +02:00
Harmen Stoppels
278a38f4af external find --not-buildable: mark virtuals (#45159)
This change makes `spack external find --not-buildable` mark virtuals
provided by detected packages as non-buildable, so that it's sufficient
for users to let spack detect say mpich and have the concretizer pick it
up as mpi provider even when openmpi is "more preferred".
2024-07-11 15:19:55 +02:00
Paolo
39bbedf517 acfl: update the headers property (#44653)
Consistent with ArmPL@24:, the include directory for acfl@24: has
changed to `include`. This updates the headers property accordingly,
distinguishing the include path of releases prior to 24.04 from that
of newer releases.
2024-07-11 11:20:34 +02:00
Massimiliano Culpo
2153f6056d checksum: fix circular imports on macOS (#45187) 2024-07-11 10:49:29 +02:00
Adam J. Stewart
2be9b41362 py-tensorboard: update numpy compatibility (#45092) 2024-07-11 10:32:35 +02:00
AcriusWinter
f9fa024fc5 rocm-cmake: changed test API from old to new (#44939)
* rocm-cmake: changed test format from old to new
* Rename cmake variable
* post-conflict resolution: remove remaining version check

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-07-11 00:39:08 -06:00
Jack Morrison
7c7ac27900 MPICH: Add version 4.2.2 (#45040) 2024-07-10 21:32:09 -07:00
Derek Ryan Strong
253e8b1f2a Add libjpeg-turbo v3.0.3, v3.0.2, v3.0.1 (#44990) 2024-07-10 21:29:36 -07:00
dependabot[bot]
60e75c9234 build(deps): bump docker/login-action from 3.1.0 to 3.2.0 (#44424)
Bumps [docker/login-action](https://github.com/docker/login-action) from 3.1.0 to 3.2.0.
- [Release notes](https://github.com/docker/login-action/releases)
- [Commits](e92390c5fb...0d4c9c5ea7)

---
updated-dependencies:
- dependency-name: docker/login-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-11 05:42:04 +02:00
dependabot[bot]
fb89337b04 build(deps): bump actions/setup-python from 5.1.0 to 5.1.1 (#45182)
Bumps [actions/setup-python](https://github.com/actions/setup-python) from 5.1.0 to 5.1.1.
- [Release notes](https://github.com/actions/setup-python/releases)
- [Commits](82c7e631bb...39cd14951b)

---
updated-dependencies:
- dependency-name: actions/setup-python
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-11 05:39:08 +02:00
Kyle Knoepfel
53f71fc4a7 Use ROOT_LIBRARY_PATH and adjust other environment variables (#45109)
* Use ROOT_LIBRARY_PATH and adjust other environment variables

* Accommodate versions older than ROOT 6.26

* Use os instead of pathlib
2024-07-10 19:40:52 -06:00
AcriusWinter
12e7c1569c pumi: new test API (#45181)
* pumi: new test API

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-07-10 17:57:05 -06:00
Nicole C
2eb566b884 Spack on Windows: update dev_build tests to run on Windows (#45039) 2024-07-10 16:52:01 -07:00
Hariharan Devarajan
a7444873b9 brahma: add 0.0.4 and 0.0.5 (#45168)
* Added Release 0.0.4 and 0.0.5
* Changed requirement for gotcha
   use gotcha 1.0.5 for 0.0.2 and 0.0.3
* Combine gotcha 1.0.7 for master and develop

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-07-10 14:30:32 -06:00
Sreenivasa Murthy Kolam
397ff11d6d rpp package - fix the add_tests build failure for 6.1 rocm rel (#44738)
* rpp package - fix the add_tests build failure for 6.1 rel
* fix test build failure
2024-07-10 13:29:14 -07:00
renjithravindrankannath
285563ad01 Need to configure rsmiBindings.py.in similar to rsmiBindingsInit.py.in (#45131) 2024-07-10 13:21:28 -07:00
Teague Sterling
c3111ac0b4 py-janus: new package (#44520)
* py-janus: add v0.7.0, v1.0.0
* Incorporate changes from review, including:
    https://github.com/spack/spack/pull/44520#pullrequestreview-2095028464
2024-07-10 13:16:12 -07:00
HELICS-bot
6505e7e02a helics: Add version 3.5.3 (#45142)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2024-07-10 12:19:39 -07:00
Alex Richert
c82889058c bacio: recipe updates (#45150) 2024-07-10 12:18:23 -07:00
renjithravindrankannath
2f4c20567c Correcting sha256sum for 6.1.2 (#45152) 2024-07-10 12:13:03 -07:00
Adam J. Stewart
a7d6a1188b py-rtree: add v1.3.0 (#45157) 2024-07-10 12:09:56 -07:00
Matthieu Dorier
4f5244920f py-configspace: new versions (#45165) 2024-07-10 12:05:04 -07:00
Adam J. Stewart
feecb60b9e py-pyvista: declare numpy 2 support (#45158) 2024-07-10 12:03:38 -07:00
John W. Parent
4f18cab8d2 Cpuinfo: static build when on Windows (#44899)
* Mirror cpuinfo CI for msvc
2024-07-10 12:46:13 -05:00
Harmen Stoppels
e6f1b4e63a Avoid duplicate detectable tag (#45160)
In case of inheritance, the static `tags` property may be updated
multiple times; it turns out builder classes implicitly inherit from
traditional package classes.
2024-07-10 18:14:28 +02:00
Stephen Nicholas Swatman
c458e985af Set LD_LIBRARY_PATH for OneAPI compiler (#45059)
While trying to build packages with the OneAPI compiler version 2024.1 I
ran into the following error, indicating that the compiler is unable to
find some necessary libraries:

```
/storage/Software/oneapi/2024.1/compiler/2024.1/bin/sycl-post-link: error
  while loading shared libraries: libonnxruntime.1.12.22.721.so: cannot open
  shared object file: No such file or directory

  icpx: error: unable to execute command: No such file or directory

  icpx: error: sycl-post-link command failed due to signal (use -v to see
  invocation)
```

Indeed, `libonnxruntime.1.12.22.721.so` does come bundled with the
OneAPI compiler, but it is not available in the build environment by
default. In this commit, I update the custom environment created by
OneAPI to include the `lib/` directory in which these libraries reside
in the `LD_LIBRARY_PATH` environment variable.
2024-07-10 06:50:07 -06:00
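A sketch of the fix, assuming the compiler component prefix is exposed as `self.component_prefix` (as in Spack's OneAPI base package); the hook and path layout are illustrative.

```
def setup_run_environment(self, env):
    # Make the libraries bundled with the compiler (e.g. libonnxruntime)
    # visible to the compiler's own tools, such as sycl-post-link.
    env.prepend_path("LD_LIBRARY_PATH", join_path(self.component_prefix, "lib"))
```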
Stephen Nicholas Swatman
5f234e16d0 dfelibs: add Boost as a testing dependency (#45133)
In my enthusiasm to add dfelibs to Spack, I didn't realise that the
unit tests of dfelibs use Boost and, as such, Boost is required as a
testing dependency.
2024-07-10 06:32:45 -06:00
Massimiliano Culpo
9001e9328a Remove unnecessary copy.deepcopy calls (#45135) 2024-07-10 09:33:48 +02:00
AcriusWinter
b237ee3689 octopus: old to new test API (#45143)
* octopus: old to new test API
* Minor simplifications and cleanup

---------

Co-authored-by: Tamara Dahlgren <dahlgren1@llnl.gov>
2024-07-09 19:02:36 -07:00
Massimiliano Culpo
aa911eca40 Add compatibility of sequoia with previous macOS versions (#45127)
* Add compatibility of sequoia with previous macOS versions

* Add compatibility of sequoia with previous macOS versions
2024-07-09 17:48:24 -07:00
Wouter Deconinck
6362c615f5 git: add several new patch-level versions (#45107)
* git: add new patch-level versions

* git: deprecate older versions with broken git lfs
2024-07-09 15:43:31 -06:00
Mikael Simberg
544be90469 fmt: add 11.0.1 (#45089)
Co-authored-by: Alberto Invernizzi <9337627+albestro@users.noreply.github.com>
2024-07-09 13:21:50 -06:00
Peter Scheibel
56a1663cd9 spack find -c: search all concretized-but-not-installed specs (#44713)
Originally if you had `x -> y -> z`, and an env with `x` in its speclist that is concretized but not installed, then `spack find -c y` would not show anything. This was intended: `spack find` has up-until-now only ever listed out installed specs (and `-c` was for adding a preamble section about roots).

This changes `spack find` so:

* `-c` makes it search through all concretized specs in the env (in a sense it is anticipated that a concretized environment would serve as a "speculative" DB and users may want to query it like they query the DB outside of envs)
* Adds a `-i/--install-status` option, equivalent to `-I` from `spack spec`
* Shows install status for either `-c` or `-i`
* As a side effect to prior point, `spack find -i` can now distinguish different installation states (upstream/external)

Examples:

```
$ spack find -r
==> In environment findtest
==> 1 root specs
 -  raja

==> 6 installed packages (not shown)
==> 12 concretized packages to be installed (not shown)
```

```
$ spack find
==> In environment findtest
==> 1 root specs
 -  raja

-- darwin-ventura-m1 / apple-clang@14.0.3 -----------------------
berkeley-db@18.1.40  bzip2@1.0.8  diffutils@3.10  gmake@4.4.1  gnuconfig@2022-09-17  libiconv@1.17
==> 6 installed packages
==> 12 concretized packages to be installed (show with `spack find -c`)
```

```
$ spack find -c
==> In environment findtest
==> 1 root specs
 -  raja

-- darwin-ventura-m1 / apple-clang@14.0.3 -----------------------
[+]  berkeley-db@18.1.40  [+]  bzip2@1.0.8      -   cmake@3.29.4  [+]  diffutils@3.10  [+]  gmake@4.4.1           [+]  libiconv@1.17   -   nghttp2@1.62.0   -   pkgconf@2.2.0    -   readline@8.2
 -   blt@0.6.2             -   camp@2024.02.1   -   curl@8.7.1     -   gdbm@1.23       [+]  gnuconfig@2022-09-17   -   ncurses@6.5     -   perl@5.38.2      -   raja@2024.02.2   -   zlib-ng@2.1.6
==> 6 installed packages
==> 12 concretized packages to be installed


```

```
$ spack -E find
...
==> 82 installed packages
```
2024-07-09 11:53:20 -07:00
Rocco Meli
f9a46d61fa charmpp: add v8.0.0 (#45097)
* charmpp v8.0.0

---------

Co-authored-by: RMeli <RMeli@users.noreply.github.com>
2024-07-09 10:21:41 -06:00
Mikael Simberg
a81451ba1f pika: add v0.26.0 (#45104) 2024-07-09 10:01:57 -06:00
Rocco Meli
b11e370888 namd 3.0 (#45096) 2024-07-09 09:54:29 -06:00
Massimiliano Culpo
54ee7d4165 Remove the "install_mockery_mutable_config" fixture (#45129)
This fixture was introduced in #16429, and made
redundant in #39024
2024-07-09 11:23:49 +02:00
Massimiliano Culpo
15efcbe042 Fix conflicting use of config and mutable_config fixtures in unit tests (#45106)
and add a fixture to detect use of conflicting fixtures
2024-07-09 09:51:04 +02:00
Alec Scott
7c5fbee327 Improve organization of CI workflow scripts and pip requirements (#45037) 2024-07-09 04:46:09 +02:00
Satish Balay
b19c4cdcf6 petsc, py-petsc4py: add v3.21.3 (#44954)
* petsc, py-petsc4py: add v3.21.3
* py-petsc4py: requires cython v3 since v3.20
2024-07-08 15:40:36 -07:00
Harmen Stoppels
3212cf86f4 environments.rst: go from simple to advanced (#45004)
* environments.rst: go from simple to advanced
* improvements
* notes about activation
2024-07-08 15:36:18 -07:00
Auriane R
fbceae7773 [py-datasets] Add py-datasets version 2.20.0 (#44903)
* Add py-datasets version 2.20.0

* Add dependency requirements for version 2.20 + refactor

* Add missing tqdm and requests versions needed to install the latest py-datasets

* Add missing python requirements for 2.8.0 and 2.20.0
2024-07-08 15:21:14 -07:00
George Young
b921d1a920 gtfsort: new package (#45062)
* gtfsort: new rust package @0.2.2

---------

Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2024-07-08 15:07:22 -07:00
Robert Cohn
8128b549a5 [intel-oneapi-dpct] correct 2024.2.0 hash (#45100) 2024-07-08 13:46:55 -07:00
Alex Richert
7405d95035 ip2: deprecate package, fix sp dependency (#45064) 2024-07-08 22:18:15 +02:00
Harmen Stoppels
a04b12a3ef spec.py: print right deptype in tree (#45091)
Fix a bug where Spec.tree with cover=nodes reduces deptypes from all
in-edges, including from nodes not reachable from the root, which almost
always happens for concrete specs
2024-07-08 18:25:57 +02:00
Massimiliano Culpo
cbf8f2326d pinentry: add v1.3.1 (#45073) 2024-07-08 08:58:14 -07:00
Harmen Stoppels
297874bfed spec.py: fix __getitem__ looking outside of dag (#45090)
`Spec.__getitem__` queries dependent edges, which almost always point to
nodes outside the sub-dag considered. It should only ever look at edges
being traversed.
2024-07-08 14:53:51 +02:00
Massimiliano Culpo
74398d74ac Add type-hints to RepoPath (#45068)
* Also, fix a bug with use_repositories + import spack.pkg
2024-07-08 11:48:39 +02:00
afzpatel
cef9c36183 kripke: update version to 1.2.7 (#44791)
* initial commit to update kripke to 1.2.7
* fix style errors
2024-07-08 02:16:24 -06:00
Wouter Deconinck
5e7430975a zlib-ng: add v2.1.7, v2.2.1 (#45076) 2024-07-08 09:40:44 +02:00
Adam J. Stewart
1456f6dba1 py-scikit-learn: add v1.5.1 (#45016) 2024-07-08 09:29:11 +02:00
Hariharan Devarajan
daf74a60ca cpp-logger: add v0.0.4 (#45033) 2024-07-08 09:28:33 +02:00
Niclas Jansson
87df95c097 neko: add v0.8.0 (#45086)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-07-08 01:26:44 -06:00
Richard Berger
9b49576875 legion: bugfix for +cuda+cuda_unsupported_compiler (#45036)
When using a newer Clang for Kokkos than supported by a given CUDA version, the
CUDA compiler detection in Legion's CMake still needs to be passed
CMAKE_CUDA_FLAGS to pass the compiler check.
2024-07-08 09:25:09 +02:00
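A sketch of the workaround, assuming an nvcc-style host-compiler override; the exact flag passed by the package is illustrative.

```
def cmake_args(self):
    args = []
    if self.spec.satisfies("+cuda +cuda_unsupported_compiler"):
        # Forward flags so CMake's CUDA compiler detection accepts the
        # newer Clang as the host compiler.
        args.append(self.define("CMAKE_CUDA_FLAGS", "-allow-unsupported-compiler"))
    return args
```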
Hariharan Devarajan
065cbf79fc gotcha: add v1.0.7 (#45043) 2024-07-08 09:22:19 +02:00
Rocco Meli
09b89e87a4 DLA-Future-Fortran: add v0.2.0 (#45055) 2024-07-08 09:16:31 +02:00
Adam J. Stewart
ddab6156a6 py-numpy: add v2.0.0 (#44735) 2024-07-08 09:14:50 +02:00
Harry Sharma
10cdfff0d1 feat: add diamond@2.1.[8,9] (#45047) 2024-07-08 09:05:32 +02:00
Adam J. Stewart
3328416976 py-matplotlib: add v3.9.1 (#45060) 2024-07-08 08:59:10 +02:00
Wouter Deconinck
094a621f3c acts: add v35.1.0, v35.2.0 (#44963)
Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-07-08 08:46:57 +02:00
Wouter Deconinck
87ce5d8ccb py-webcolors: add v24.6.0 (#45075) 2024-07-08 08:45:15 +02:00
Wouter Deconinck
bcdc92e25f vc: add v1.4.5 (#45077) 2024-07-08 08:43:16 +02:00
Wouter Deconinck
a323fab135 util-linux{-uuid}: add v2.40.2 (#45079)
Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-07-08 08:40:14 +02:00
Wouter Deconinck
d42031b075 assimp: add v5.4.2 (#45081)
Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-07-08 08:33:06 +02:00
Jonathon Anderson
efbb18aa25 hpctoolkit: minor fixes for build failures (#45070) 2024-07-08 08:22:11 +02:00
Harmen Stoppels
8a430f89b3 spack -C <env>: use env config w/o activation (#45046)
Precedence:

1. Named environment
2. Anonymous environment
3. Generic directory
2024-07-06 22:02:25 -07:00
4195 changed files with 22498 additions and 8216 deletions

View File

@@ -5,13 +5,10 @@ updates:
directory: "/"
schedule:
interval: "daily"
# Requirements to build documentation
# Requirements to run style checks and build documentation
- package-ecosystem: "pip"
directory: "/lib/spack/docs"
schedule:
interval: "daily"
# Requirements to run style checks
- package-ecosystem: "pip"
directory: "/.github/workflows/style"
directories:
- "/.github/workflows/requirements/style/*"
- "/lib/spack/docs"
schedule:
interval: "daily"

View File

@@ -28,8 +28,8 @@ jobs:
run:
shell: ${{ matrix.system.shell }}
steps:
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
- uses: actions/setup-python@39cd14951b08e74b54015e9e001cdefcf80e669f
with:
python-version: ${{inputs.python_version}}
- name: Install Python packages
@@ -44,6 +44,7 @@ jobs:
run: |
. share/spack/setup-env.sh
coverage run $(which spack) audit packages
coverage run $(which spack) audit configs
coverage run $(which spack) -d audit externals
coverage combine
coverage xml
@@ -52,6 +53,7 @@ jobs:
run: |
. share/spack/setup-env.sh
spack -d audit packages
spack -d audit configs
spack -d audit externals
- name: Package audits (without coverage)
if: ${{ runner.os == 'Windows' }}
@@ -59,6 +61,8 @@ jobs:
. share/spack/setup-env.sh
spack -d audit packages
./share/spack/qa/validate_last_exit.ps1
spack -d audit configs
./share/spack/qa/validate_last_exit.ps1
spack -d audit externals
./share/spack/qa/validate_last_exit.ps1
- uses: codecov/codecov-action@e28ff129e5465c2c0dcc6f003fc735cb6ae0c673

View File

@@ -37,7 +37,7 @@ jobs:
make patch unzip which xz python3 python3-devel tree \
cmake bison
- name: Checkout
uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
with:
fetch-depth: 0
- name: Bootstrap clingo
@@ -60,10 +60,10 @@ jobs:
run: |
brew install cmake bison tree
- name: Checkout
uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
- uses: actions/setup-python@39cd14951b08e74b54015e9e001cdefcf80e669f
with:
python-version: "3.12"
- name: Bootstrap clingo
@@ -71,12 +71,14 @@ jobs:
SETUP_SCRIPT_EXT: ${{ matrix.runner == 'windows-latest' && 'ps1' || 'sh' }}
SETUP_SCRIPT_SOURCE: ${{ matrix.runner == 'windows-latest' && './' || 'source ' }}
USER_SCOPE_PARENT_DIR: ${{ matrix.runner == 'windows-latest' && '$env:userprofile' || '$HOME' }}
VALIDATE_LAST_EXIT: ${{ matrix.runner == 'windows-latest' && './share/spack/qa/validate_last_exit.ps1' || '' }}
run: |
${{ env.SETUP_SCRIPT_SOURCE }}share/spack/setup-env.${{ env.SETUP_SCRIPT_EXT }}
spack bootstrap disable github-actions-v0.5
spack bootstrap disable github-actions-v0.4
spack external find --not-buildable cmake bison
spack -d solve zlib
${{ env.VALIDATE_LAST_EXIT }}
tree ${{ env.USER_SCOPE_PARENT_DIR }}/.spack/bootstrap/store/
gnupg-sources:
@@ -94,7 +96,7 @@ jobs:
if: ${{ matrix.runner == 'ubuntu-latest' }}
run: sudo rm -rf $(command -v gpg gpg2 patchelf)
- name: Checkout
uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
with:
fetch-depth: 0
- name: Bootstrap GnuPG
@@ -123,10 +125,10 @@ jobs:
run: |
sudo rm -rf $(which gpg) $(which gpg2) $(which patchelf)
- name: Checkout
uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
- uses: actions/setup-python@39cd14951b08e74b54015e9e001cdefcf80e669f
with:
python-version: |
3.8
@@ -152,7 +154,7 @@ jobs:
not_found=0
old_path="$PATH"
export PATH="$ver_dir:$PATH"
./bin/spack-tmpconfig -b ./.github/workflows/bootstrap-test.sh
./bin/spack-tmpconfig -b ./.github/workflows/bin/bootstrap-test.sh
export PATH="$old_path"
fi
fi
@@ -166,4 +168,3 @@ jobs:
source share/spack/setup-env.sh
spack -d gpg list
tree ~/.spack/bootstrap/store/

View File

@@ -55,7 +55,7 @@ jobs:
if: github.repository == 'spack/spack'
steps:
- name: Checkout
uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
- uses: docker/metadata-action@8e5442c4ef9f78752691e2d8f8d19755c6f78e81
id: docker_meta
@@ -76,7 +76,7 @@ jobs:
env:
SPACK_YAML_OS: "${{ matrix.dockerfile[2] }}"
run: |
.github/workflows/generate_spack_yaml_containerize.sh
.github/workflows/bin/generate_spack_yaml_containerize.sh
. share/spack/setup-env.sh
mkdir -p dockerfiles/${{ matrix.dockerfile[0] }}
spack containerize --last-stage=bootstrap | tee dockerfiles/${{ matrix.dockerfile[0] }}/Dockerfile
@@ -87,19 +87,19 @@ jobs:
fi
- name: Upload Dockerfile
uses: actions/upload-artifact@0b2256b8c012f0828dc542b3febcab082c67f72b
uses: actions/upload-artifact@834a144ee995460fba8ed112a2fc961b36a5ec5a
with:
name: dockerfiles_${{ matrix.dockerfile[0] }}
path: dockerfiles
- name: Set up QEMU
uses: docker/setup-qemu-action@5927c834f5b4fdf503fca6f4c7eccda82949e1ee
uses: docker/setup-qemu-action@49b3bc8e6bdd4a60e6116a5414239cba5943d3cf
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@4fd812986e6c8c2a69e18311145f9371337f27d4
uses: docker/setup-buildx-action@988b5a0280414f521da01fcc63a27aeeb4b104db
- name: Log in to GitHub Container Registry
uses: docker/login-action@e92390c5fb421da1463c202d546fed0ec5c39f20
uses: docker/login-action@9780b0c442fbb1117ed29e0efdff1e18412f7567
with:
registry: ghcr.io
username: ${{ github.actor }}
@@ -107,13 +107,13 @@ jobs:
- name: Log in to DockerHub
if: github.event_name != 'pull_request'
uses: docker/login-action@e92390c5fb421da1463c202d546fed0ec5c39f20
uses: docker/login-action@9780b0c442fbb1117ed29e0efdff1e18412f7567
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Build & Deploy ${{ matrix.dockerfile[0] }}
uses: docker/build-push-action@1a162644f9a7e87d8f4b053101d1d9a712edc18c
uses: docker/build-push-action@5cd11c3a4ced054e52742c5fd54dca954e0edd85
with:
context: dockerfiles/${{ matrix.dockerfile[0] }}
platforms: ${{ matrix.dockerfile[1] }}
@@ -126,7 +126,7 @@ jobs:
needs: deploy-images
steps:
- name: Merge Artifacts
uses: actions/upload-artifact/merge@0b2256b8c012f0828dc542b3febcab082c67f72b
uses: actions/upload-artifact/merge@834a144ee995460fba8ed112a2fc961b36a5ec5a
with:
name: dockerfiles
pattern: dockerfiles_*

View File

@@ -36,7 +36,7 @@ jobs:
core: ${{ steps.filter.outputs.core }}
packages: ${{ steps.filter.outputs.packages }}
steps:
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
if: ${{ github.event_name == 'push' }}
with:
fetch-depth: 0

View File

@@ -1,8 +0,0 @@
#!/usr/bin/env sh
. share/spack/setup-env.sh
echo -e "config:\n build_jobs: 2" > etc/spack/config.yaml
spack config add "packages:all:target:[x86_64]"
spack compiler find
spack compiler info apple-clang
spack debug report
spack solve zlib

View File

@@ -14,10 +14,10 @@ jobs:
build-paraview-deps:
runs-on: windows-latest
steps:
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
- uses: actions/setup-python@39cd14951b08e74b54015e9e001cdefcf80e669f
with:
python-version: 3.9
- name: Install Python packages

View File

@@ -1,6 +1,6 @@
black==24.4.2
black==24.8.0
clingo==5.7.1
flake8==7.1.0
flake8==7.1.1
isort==5.13.2
mypy==1.8.0
types-six==1.16.21.20240513

View File

@@ -16,45 +16,34 @@ jobs:
matrix:
os: [ubuntu-latest]
python-version: ['3.7', '3.8', '3.9', '3.10', '3.11', '3.12']
concretizer: ['clingo']
on_develop:
- ${{ github.ref == 'refs/heads/develop' }}
include:
- python-version: '3.11'
os: ubuntu-latest
concretizer: original
on_develop: ${{ github.ref == 'refs/heads/develop' }}
- python-version: '3.6'
os: ubuntu-20.04
concretizer: clingo
on_develop: ${{ github.ref == 'refs/heads/develop' }}
exclude:
- python-version: '3.7'
os: ubuntu-latest
concretizer: 'clingo'
on_develop: false
- python-version: '3.8'
os: ubuntu-latest
concretizer: 'clingo'
on_develop: false
- python-version: '3.9'
os: ubuntu-latest
concretizer: 'clingo'
on_develop: false
- python-version: '3.10'
os: ubuntu-latest
concretizer: 'clingo'
on_develop: false
- python-version: '3.11'
os: ubuntu-latest
concretizer: 'clingo'
on_develop: false
steps:
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
- uses: actions/setup-python@39cd14951b08e74b54015e9e001cdefcf80e669f
with:
python-version: ${{ matrix.python-version }}
- name: Install System packages
@@ -72,7 +61,7 @@ jobs:
run: |
# Need this for the git tests to succeed.
git --version
. .github/workflows/setup_git.sh
. .github/workflows/bin/setup_git.sh
- name: Bootstrap clingo
if: ${{ matrix.concretizer == 'clingo' }}
env:
@@ -85,7 +74,6 @@ jobs:
- name: Run unit tests
env:
SPACK_PYTHON: python
SPACK_TEST_SOLVER: ${{ matrix.concretizer }}
SPACK_TEST_PARALLEL: 2
COVERAGE: true
UNIT_TEST_COVERAGE: ${{ matrix.python-version == '3.11' }}
@@ -100,10 +88,10 @@ jobs:
shell:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
- uses: actions/setup-python@39cd14951b08e74b54015e9e001cdefcf80e669f
with:
python-version: '3.11'
- name: Install System packages
@@ -118,7 +106,7 @@ jobs:
run: |
# Need this for the git tests to succeed.
git --version
. .github/workflows/setup_git.sh
. .github/workflows/bin/setup_git.sh
- name: Run shell tests
env:
COVERAGE: true
@@ -141,13 +129,13 @@ jobs:
dnf install -y \
bzip2 curl file gcc-c++ gcc gcc-gfortran git gnupg2 gzip \
make patch tcl unzip which xz
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
- name: Setup repo and non-root user
run: |
git --version
git config --global --add safe.directory /__w/spack/spack
git fetch --unshallow
. .github/workflows/setup_git.sh
. .github/workflows/bin/setup_git.sh
useradd spack-test
chown -R spack-test .
- name: Run unit tests
@@ -160,10 +148,10 @@ jobs:
clingo-cffi:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
- uses: actions/setup-python@39cd14951b08e74b54015e9e001cdefcf80e669f
with:
python-version: '3.11'
- name: Install System packages
@@ -178,11 +166,10 @@ jobs:
run: |
# Need this for the git tests to succeed.
git --version
. .github/workflows/setup_git.sh
. .github/workflows/bin/setup_git.sh
- name: Run unit tests (full suite with coverage)
env:
COVERAGE: true
SPACK_TEST_SOLVER: clingo
run: |
share/spack/qa/run-unit-tests
- uses: codecov/codecov-action@e28ff129e5465c2c0dcc6f003fc735cb6ae0c673
@@ -198,10 +185,10 @@ jobs:
os: [macos-13, macos-14]
python-version: ["3.11"]
steps:
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
- uses: actions/setup-python@39cd14951b08e74b54015e9e001cdefcf80e669f
with:
python-version: ${{ matrix.python-version }}
- name: Install Python packages
@@ -213,11 +200,10 @@ jobs:
brew install dash fish gcc gnupg2 kcov
- name: Run unit tests
env:
SPACK_TEST_SOLVER: clingo
SPACK_TEST_PARALLEL: 4
run: |
git --version
. .github/workflows/setup_git.sh
. .github/workflows/bin/setup_git.sh
. share/spack/setup-env.sh
$(which spack) bootstrap disable spack-install
$(which spack) solve zlib
@@ -236,10 +222,10 @@ jobs:
powershell Invoke-Expression -Command "./share/spack/qa/windows_test_setup.ps1"; {0}
runs-on: windows-latest
steps:
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
- uses: actions/setup-python@39cd14951b08e74b54015e9e001cdefcf80e669f
with:
python-version: 3.9
- name: Install Python packages
@@ -247,7 +233,7 @@ jobs:
python -m pip install --upgrade pip pywin32 setuptools pytest-cov clingo
- name: Create local develop
run: |
./.github/workflows/setup_git.ps1
./.github/workflows/bin/setup_git.ps1
- name: Unit Test
run: |
spack unit-test -x --verbose --cov --cov-config=pyproject.toml

View File

@@ -18,15 +18,15 @@ jobs:
validate:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
- uses: actions/setup-python@39cd14951b08e74b54015e9e001cdefcf80e669f
with:
python-version: '3.11'
cache: 'pip'
- name: Install Python Packages
run: |
pip install --upgrade pip setuptools
pip install -r .github/workflows/style/requirements.txt
pip install -r .github/workflows/requirements/style/requirements.txt
- name: vermin (Spack's Core)
run: vermin --backport importlib --backport argparse --violations --backport typing -t=3.6- -vvv lib/spack/spack/ lib/spack/llnl/ bin/
- name: vermin (Repositories)
@@ -35,22 +35,22 @@ jobs:
style:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
- uses: actions/setup-python@39cd14951b08e74b54015e9e001cdefcf80e669f
with:
python-version: '3.11'
cache: 'pip'
- name: Install Python packages
run: |
pip install --upgrade pip setuptools
pip install -r .github/workflows/style/requirements.txt
pip install -r .github/workflows/requirements/style/requirements.txt
- name: Setup git configuration
run: |
# Need this for the git tests to succeed.
git --version
. .github/workflows/setup_git.sh
. .github/workflows/bin/setup_git.sh
- name: Run style tests
run: |
share/spack/qa/run-style-tests
@@ -70,13 +70,13 @@ jobs:
dnf install -y \
bzip2 curl file gcc-c++ gcc gcc-gfortran git gnupg2 gzip \
make patch tcl unzip which xz
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332
- name: Setup repo and non-root user
run: |
git --version
git config --global --add safe.directory /__w/spack/spack
git fetch --unshallow
. .github/workflows/setup_git.sh
. .github/workflows/bin/setup_git.sh
useradd spack-test
chown -R spack-test .
- name: Bootstrap Spack development environment

View File

@@ -170,23 +170,6 @@ config:
# If set to true, Spack will use ccache to cache C compiles.
ccache: false
# The concretization algorithm to use in Spack. Options are:
#
# 'clingo': Uses a logic solver under the hood to solve DAGs with full
# backtracking and optimization for user preferences. Spack will
# try to bootstrap the logic solver, if not already available.
#
# 'original': Spack's original greedy, fixed-point concretizer. This
# algorithm can make decisions too early and will not backtrack
# sufficiently for many specs. This will soon be deprecated in
# favor of clingo.
#
# See `concretizer.yaml` for more settings you can fine-tune when
# using clingo.
concretizer: clingo
# How long to wait to lock the Spack installation database. This lock is used
# when Spack needs to manage its own package metadata and all operations are
# expected to complete within the default time limit. The timeout should

View File

@@ -1,3 +0,0 @@
packages:
iconv:
require: [libiconv]

View File

@@ -20,11 +20,14 @@ packages:
awk: [gawk]
armci: [armcimpi]
blas: [openblas, amdblis]
c: [gcc]
cxx: [gcc]
D: [ldc]
daal: [intel-oneapi-daal]
elf: [elfutils]
fftw-api: [fftw, amdfftw]
flame: [libflame, amdlibflame]
fortran: [gcc]
fortran-rt: [gcc-runtime, intel-oneapi-runtime]
fuse: [libfuse]
gl: [glx, osmesa]
@@ -61,6 +64,7 @@ packages:
tbb: [intel-tbb]
unwind: [libunwind]
uuid: [util-linux-uuid, libuuid]
wasi-sdk: [wasi-sdk-prebuilt]
xxd: [xxd-standalone, vim]
yacc: [bison, byacc]
ziglang: [zig]

View File

@@ -1,6 +1,5 @@
config:
locks: false
concretizer: clingo
build_stage::
- '$spack/.staging'
stage_name: '{name}-{version}-{hash:7}'

View File

@@ -206,6 +206,7 @@ def setup(sphinx):
("py:class", "six.moves.urllib.parse.ParseResult"),
("py:class", "TextIO"),
("py:class", "hashlib._Hash"),
("py:class", "concurrent.futures._base.Executor"),
# Spack classes that are private and we don't want to expose
("py:class", "spack.provider_index._IndexBase"),
("py:class", "spack.repo._PrependFileLoader"),

View File

@@ -931,32 +931,84 @@ This allows for a much-needed reduction in redundancy between packages
and constraints.
----------------
Filesystem Views
----------------
-----------------
Environment Views
-----------------
Spack Environments can define filesystem views, which provide a direct access point
for software similar to the directory hierarchy that might exist under ``/usr/local``.
Filesystem views are updated every time the environment is written out to the lock
file ``spack.lock``, so the concrete environment and the view are always compatible.
The files of the view's installed packages are brought into the view by symbolic or
hard links, referencing the original Spack installation, or by copy.
Spack Environments can have an associated filesystem view, which is a directory
with a more traditional structure ``<view>/bin``, ``<view>/lib``, ``<view>/include``
in which all files of the installed packages are linked.
By default a view is created for each environment, thanks to the ``view: true``
option in the ``spack.yaml`` manifest file:
.. code-block:: yaml
spack:
specs: [perl, python]
view: true
The view is created in a hidden directory ``.spack-env/view`` relative to the environment.
If you've used ``spack env activate``, you may have already interacted with this view. Spack
prepends its ``<view>/bin`` dir to ``PATH`` when the environment is activated, so that
you can directly run executables from all installed packages in the environment.
Views are highly customizable: you can control where they are put, modify their structure,
include and exclude specs, change how files are linked, and you can even generate multiple
views for a single environment.
.. _configuring_environment_views:
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Configuration in ``spack.yaml``
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
^^^^^^^^^^^^^^^^^^^^^^^^^^
Minimal view configuration
^^^^^^^^^^^^^^^^^^^^^^^^^^
The Spack Environment manifest file has a top-level keyword
``view``. Each entry under that heading is a **view descriptor**, headed
by a name. Any number of views may be defined under the ``view`` heading.
The view descriptor contains the root of the view, and
optionally the projections for the view, ``select`` and
``exclude`` lists for the view and link information via ``link`` and
The minimal configuration
.. code-block:: yaml
spack:
# ...
view: true
lets Spack generate a single view with default settings under the
``.spack-env/view`` directory of the environment.
Another short way to configure a view is to specify just where to put it:
.. code-block:: yaml
spack:
# ...
view: /path/to/view
Views can also be disabled by setting ``view: false``.
^^^^^^^^^^^^^^^^^^^^^^^^^^^
Advanced view configuration
^^^^^^^^^^^^^^^^^^^^^^^^^^^
One or more **view descriptors** can be defined under ``view``, keyed by a name.
The example from the previous section with ``view: /path/to/view`` is equivalent
to defining a view descriptor named ``default`` with a ``root`` attribute:
.. code-block:: yaml
spack:
# ...
view:
default: # name of the view
root: /path/to/view # view descriptor attribute
The ``default`` view descriptor name is special: when you ``spack env activate`` your
environment, this view will be used to update (among other things) your ``PATH``
variable.
View descriptors must contain the root of the view, and optionally projections,
``select`` and ``exclude`` lists and link information via ``link`` and
``link_type``.
For example, in the following manifest
As a more advanced example, in the following manifest
file snippet we define a view named ``mpis``, rooted at
``/path/to/view`` in which all projections use the package name,
version, and compiler name to determine the path for a given
@@ -1001,59 +1053,10 @@ of ``hardlink`` or ``copy``.
when the environment is not activated, and linked libraries will be located
*outside* of the view thanks to rpaths.
There are two shorthands for environments with a single view. If the
environment at ``/path/to/env`` has a single view, with a root at
``/path/to/env/.spack-env/view``, with default selection and exclusion
and the default projection, we can put ``view: True`` in the
environment manifest. Similarly, if the environment has a view with a
different root, but default selection, exclusion, and projections, the
manifest can say ``view: /path/to/view``. These views are
automatically named ``default``, so that
.. code-block:: yaml
spack:
# ...
view: True
is equivalent to
.. code-block:: yaml
spack:
# ...
view:
default:
root: .spack-env/view
and
.. code-block:: yaml
spack:
# ...
view: /path/to/view
is equivalent to
.. code-block:: yaml
spack:
# ...
view:
default:
root: /path/to/view
By default, Spack environments are configured with ``view: True`` in
the manifest. Environments can be configured without views using
``view: False``. For backwards compatibility reasons, environments
with no ``view`` key are treated the same as ``view: True``.
From the command line, the ``spack env create`` command takes an
argument ``--with-view [PATH]`` that sets the path for a single, default
view. If no path is specified, the default path is used (``view:
True``). The argument ``--without-view`` can be used to create an
true``). The argument ``--without-view`` can be used to create an
environment without any view configured.
The ``spack env view`` command can be used to manage the views
@@ -1119,11 +1122,18 @@ the projection under ``all`` before reaching those entries.
Activating environment views
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
The ``spack env activate`` command will put the default view for the
environment into the user's path, in addition to activating the
environment for Spack commands. The arguments ``-v,--with-view`` and
``-V,--without-view`` can be used to tune this behavior. The default
behavior is to activate with the environment view if there is one.
The ``spack env activate <env>`` command has two effects:
1. It activates the environment so that further Spack commands such
as ``spack install`` will run in the context of the environment.
2. It activates the view so that environment variables such as
``PATH`` are updated to include the view.
Without further arguments, the ``default`` view of the environment is
activated. If a view with a different name has to be activated,
``spack env activate --with-view <name> <env>`` can be
used instead. You can also activate the environment without modifying
further environment variables using ``--without-view``.
The environment variables affected by the ``spack env activate``
command and the paths that are used to update them are determined by
@@ -1146,8 +1156,8 @@ relevant variable if the path exists. For this reason, it is not
recommended to use non-default projections with the default view of an
environment.
The ``spack env deactivate`` command will remove the default view of
the environment from the user's path.
The ``spack env deactivate`` command will remove the active view of
the Spack environment from the user's environment variables.
.. _env-generate-depfile:
@@ -1306,7 +1316,7 @@ index once every package is pushed. Note how this target uses the generated
example/push/%: example/install/%
@mkdir -p $(dir $@)
$(info About to push $(SPEC) to a buildcache)
$(SPACK) -e . buildcache push --allow-root --only=package $(BUILDCACHE_DIR) /$(HASH)
$(SPACK) -e . buildcache push --only=package $(BUILDCACHE_DIR) /$(HASH)
@touch $@
push: $(addprefix example/push/,$(example/SPACK_PACKAGE_IDS))

View File

@@ -1,13 +1,13 @@
sphinx==7.2.6
sphinx==7.4.7
sphinxcontrib-programoutput==0.17
sphinx_design==0.6.0
sphinx_design==0.6.1
sphinx-rtd-theme==2.0.0
python-levenshtein==0.25.1
docutils==0.20.1
pygments==2.18.0
urllib3==2.2.2
pytest==8.2.2
pytest==8.3.2
isort==5.13.2
black==24.4.2
flake8==7.1.0
mypy==1.10.1
black==24.8.0
flake8==7.1.1
mypy==1.11.1

lib/spack/env/cc (vendored, 96 changes)
View File

@@ -174,6 +174,46 @@ preextend() {
unset IFS
}
execute() {
# dump the full command if the caller supplies SPACK_TEST_COMMAND=dump-args
if [ -n "${SPACK_TEST_COMMAND=}" ]; then
case "$SPACK_TEST_COMMAND" in
dump-args)
IFS="$lsep"
for arg in $full_command_list; do
echo "$arg"
done
unset IFS
exit
;;
dump-env-*)
var=${SPACK_TEST_COMMAND#dump-env-}
eval "printf '%s\n' \"\$0: \$var: \$$var\""
;;
*)
die "Unknown test command: '$SPACK_TEST_COMMAND'"
;;
esac
fi
#
# Write the input and output commands to debug logs if it's asked for.
#
if [ "$SPACK_DEBUG" = TRUE ]; then
input_log="$SPACK_DEBUG_LOG_DIR/spack-cc-$SPACK_DEBUG_LOG_ID.in.log"
output_log="$SPACK_DEBUG_LOG_DIR/spack-cc-$SPACK_DEBUG_LOG_ID.out.log"
echo "[$mode] $command $input_command" >> "$input_log"
IFS="$lsep"
echo "[$mode] "$full_command_list >> "$output_log"
unset IFS
fi
# Execute the full command, preserving spaces with IFS set
# to the alarm bell separator.
IFS="$lsep"; exec $full_command_list
exit
}
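# (hedged editor's note: execute() consolidates the dump-args / dump-env-* test
# hooks, the SPACK_DEBUG logging, and the final exec that were previously
# inlined at the bottom of this script; the duplicated block removed further
# down is replaced by a single call to execute)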
# Fail with a clear message if the input contains any bell characters.
if eval "[ \"\${*#*${lsep}}\" != \"\$*\" ]"; then
die "Compiler command line contains our separator ('${lsep}'). Cannot parse."
@@ -231,12 +271,17 @@ fi
# ld link
# ccld compile & link
# Note. SPACK_ALWAYS_XFLAGS are applied for all compiler invocations,
# including version checks (SPACK_XFLAGS variants are not applied
# for version checks).
command="${0##*/}"
comp="CC"
vcheck_flags=""
case "$command" in
cpp)
mode=cpp
debug_flags="-g"
vcheck_flags="${SPACK_ALWAYS_CPPFLAGS}"
;;
cc|c89|c99|gcc|clang|armclang|icc|icx|pgcc|nvc|xlc|xlc_r|fcc|amdclang|cl.exe|craycc)
command="$SPACK_CC"
@@ -244,6 +289,7 @@ case "$command" in
comp="CC"
lang_flags=C
debug_flags="-g"
vcheck_flags="${SPACK_ALWAYS_CFLAGS}"
;;
c++|CC|g++|clang++|armclang++|icpc|icpx|pgc++|nvc++|xlc++|xlc++_r|FCC|amdclang++|crayCC)
command="$SPACK_CXX"
@@ -251,6 +297,7 @@ case "$command" in
comp="CXX"
lang_flags=CXX
debug_flags="-g"
vcheck_flags="${SPACK_ALWAYS_CXXFLAGS}"
;;
ftn|f90|fc|f95|gfortran|flang|armflang|ifort|ifx|pgfortran|nvfortran|xlf90|xlf90_r|nagfor|frt|amdflang|crayftn)
command="$SPACK_FC"
@@ -258,6 +305,7 @@ case "$command" in
comp="FC"
lang_flags=F
debug_flags="-g"
vcheck_flags="${SPACK_ALWAYS_FFLAGS}"
;;
f77|xlf|xlf_r|pgf77)
command="$SPACK_F77"
@@ -265,6 +313,7 @@ case "$command" in
comp="F77"
lang_flags=F
debug_flags="-g"
vcheck_flags="${SPACK_ALWAYS_FFLAGS}"
;;
ld|ld.gold|ld.lld)
mode=ld
@@ -365,7 +414,11 @@ unset IFS
export PATH="$new_dirs"
if [ "$mode" = vcheck ]; then
exec "${command}" "$@"
full_command_list="$command"
args="$@"
extend full_command_list vcheck_flags
extend full_command_list args
execute
fi
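# (hedged editor's note: vcheck mode used to exec the compiler directly; it now
# builds $command $vcheck_flags "$@" and goes through execute(), so the new
# SPACK_ALWAYS_*FLAGS reach compiler version checks as well)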
# Darwin's linker has a -r argument that merges object files together.
@@ -722,6 +775,7 @@ case "$mode" in
cc|ccld)
case $lang_flags in
F)
extend spack_flags_list SPACK_ALWAYS_FFLAGS
extend spack_flags_list SPACK_FFLAGS
;;
esac
@@ -731,6 +785,7 @@ esac
# C preprocessor flags come before any C/CXX flags
case "$mode" in
cpp|as|cc|ccld)
extend spack_flags_list SPACK_ALWAYS_CPPFLAGS
extend spack_flags_list SPACK_CPPFLAGS
;;
esac
@@ -741,9 +796,11 @@ case "$mode" in
cc|ccld)
case $lang_flags in
C)
extend spack_flags_list SPACK_ALWAYS_CFLAGS
extend spack_flags_list SPACK_CFLAGS
;;
CXX)
extend spack_flags_list SPACK_ALWAYS_CXXFLAGS
extend spack_flags_list SPACK_CXXFLAGS
;;
esac
@@ -933,39 +990,4 @@ if [ -n "$SPACK_CCACHE_BINARY" ]; then
esac
fi
# dump the full command if the caller supplies SPACK_TEST_COMMAND=dump-args
if [ -n "${SPACK_TEST_COMMAND=}" ]; then
case "$SPACK_TEST_COMMAND" in
dump-args)
IFS="$lsep"
for arg in $full_command_list; do
echo "$arg"
done
unset IFS
exit
;;
dump-env-*)
var=${SPACK_TEST_COMMAND#dump-env-}
eval "printf '%s\n' \"\$0: \$var: \$$var\""
;;
*)
die "Unknown test command: '$SPACK_TEST_COMMAND'"
;;
esac
fi
#
# Write the input and output commands to debug logs if it's asked for.
#
if [ "$SPACK_DEBUG" = TRUE ]; then
input_log="$SPACK_DEBUG_LOG_DIR/spack-cc-$SPACK_DEBUG_LOG_ID.in.log"
output_log="$SPACK_DEBUG_LOG_DIR/spack-cc-$SPACK_DEBUG_LOG_ID.out.log"
echo "[$mode] $command $input_command" >> "$input_log"
IFS="$lsep"
echo "[$mode] "$full_command_list >> "$output_log"
unset IFS
fi
# Execute the full command, preserving spaces with IFS set
# to the alarm bell separator.
IFS="$lsep"; exec $full_command_list
execute

View File

@@ -351,6 +351,22 @@ def _wrongly_named_spec(error_cls):
return errors
@config_packages
def _ensure_all_virtual_packages_have_default_providers(error_cls):
"""All virtual packages must have a default provider explicitly set."""
configuration = spack.config.create()
defaults = configuration.get("packages", scope="defaults")
default_providers = defaults["all"]["providers"]
virtuals = spack.repo.PATH.provider_index.providers
default_providers_filename = configuration.scopes["defaults"].get_section_filename("packages")
return [
error_cls(f"'{virtual}' must have a default provider in {default_providers_filename}", [])
for virtual in virtuals
if virtual not in default_providers
]
def _make_config_error(config_data, summary, error_cls):
s = io.StringIO()
s.write("Occurring in the following file:\n")

File diff suppressed because it is too large

View File

@@ -0,0 +1,154 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Bootstrap concrete specs for clingo
Spack uses clingo to concretize specs. When clingo itself needs to be bootstrapped from sources,
we need to rely on another mechanism to get a concrete spec that fits the current host.
This module contains the logic to get a concrete spec for clingo, starting from a prototype
JSON file for a similar platform.
"""
import pathlib
import sys
from typing import Dict, Optional, Tuple
import archspec.cpu
import spack.compiler
import spack.compilers
import spack.platforms
import spack.spec
import spack.traverse
from .config import spec_for_current_python
class ClingoBootstrapConcretizer:
def __init__(self, configuration):
self.host_platform = spack.platforms.host()
self.host_os = self.host_platform.operating_system("frontend")
self.host_target = archspec.cpu.host().family
self.host_architecture = spack.spec.ArchSpec.frontend_arch()
self.host_architecture.target = str(self.host_target)
self.host_compiler = self._valid_compiler_or_raise()
self.host_python = self.python_external_spec()
if str(self.host_platform) == "linux":
self.host_libc = self.libc_external_spec()
self.external_cmake, self.external_bison = self._externals_from_yaml(configuration)
def _valid_compiler_or_raise(self) -> "spack.compiler.Compiler":
if str(self.host_platform) == "linux":
compiler_name = "gcc"
elif str(self.host_platform) == "darwin":
compiler_name = "apple-clang"
elif str(self.host_platform) == "windows":
compiler_name = "msvc"
elif str(self.host_platform) == "freebsd":
compiler_name = "clang"
else:
raise RuntimeError(f"Cannot bootstrap clingo from sources on {self.host_platform}")
candidates = spack.compilers.compilers_for_spec(
compiler_name, arch_spec=self.host_architecture
)
if not candidates:
raise RuntimeError(
f"Cannot find any version of {compiler_name} to bootstrap clingo from sources"
)
candidates.sort(key=lambda x: x.spec.version, reverse=True)
return candidates[0]
def _externals_from_yaml(
self, configuration: "spack.config.Configuration"
) -> Tuple[Optional["spack.spec.Spec"], Optional["spack.spec.Spec"]]:
packages_yaml = configuration.get("packages")
requirements = {"cmake": "@3.20:", "bison": "@2.5:"}
selected: Dict[str, Optional["spack.spec.Spec"]] = {"cmake": None, "bison": None}
for pkg_name in ["cmake", "bison"]:
if pkg_name not in packages_yaml:
continue
candidates = packages_yaml[pkg_name].get("externals", [])
for candidate in candidates:
s = spack.spec.Spec(candidate["spec"], external_path=candidate["prefix"])
if not s.satisfies(requirements[pkg_name]):
continue
if not s.intersects(f"%{self.host_compiler.spec}"):
continue
if not s.intersects(f"arch={self.host_architecture}"):
continue
selected[pkg_name] = self._external_spec(s)
break
return selected["cmake"], selected["bison"]
def prototype_path(self) -> pathlib.Path:
"""Path to a prototype concrete specfile for clingo"""
parent_dir = pathlib.Path(__file__).parent
result = parent_dir / "prototypes" / f"clingo-{self.host_platform}-{self.host_target}.json"
if str(self.host_platform) == "linux":
# Using aarch64 as a fallback, since it has gnuconfig (x86_64 doesn't have it)
if not result.exists():
result = parent_dir / "prototypes" / f"clingo-{self.host_platform}-aarch64.json"
elif str(self.host_platform) == "freebsd":
result = parent_dir / "prototypes" / f"clingo-{self.host_platform}-amd64.json"
elif not result.exists():
raise RuntimeError(f"Cannot bootstrap clingo from sources on {self.host_platform}")
return result
def concretize(self) -> "spack.spec.Spec":
# Read the prototype and mark it NOT concrete
s = spack.spec.Spec.from_specfile(str(self.prototype_path()))
s._mark_concrete(False)
# Tweak it to conform to the host architecture
for node in s.traverse():
node.architecture.os = str(self.host_os)
node.compiler = self.host_compiler.spec
node.architecture = self.host_architecture
if node.name == "gcc-runtime":
node.versions = self.host_compiler.spec.versions
for edge in spack.traverse.traverse_edges([s], cover="edges"):
if edge.spec.name == "python":
edge.spec = self.host_python
if edge.spec.name == "bison" and self.external_bison:
edge.spec = self.external_bison
if edge.spec.name == "cmake" and self.external_cmake:
edge.spec = self.external_cmake
if "libc" in edge.virtuals:
edge.spec = self.host_libc
s._finalize_concretization()
# Work around the fact that the installer calls Spec.dependents() and
# we modified edges inconsistently
return s.copy()
def python_external_spec(self) -> "spack.spec.Spec":
"""Python external spec corresponding to the current running interpreter"""
result = spack.spec.Spec(spec_for_current_python(), external_path=sys.exec_prefix)
return self._external_spec(result)
def libc_external_spec(self) -> "spack.spec.Spec":
result = self.host_compiler.default_libc
return self._external_spec(result)
def _external_spec(self, initial_spec) -> "spack.spec.Spec":
initial_spec.namespace = "builtin"
initial_spec.compiler = self.host_compiler.spec
initial_spec.architecture = self.host_architecture
for flag_type in spack.spec.FlagMap.valid_compiler_flags():
initial_spec.compiler_flags[flag_type] = []
return spack.spec.parse_with_version_concrete(initial_spec)
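
For orientation, a minimal usage sketch of this class (it mirrors the call site
that appears in the core bootstrap diff below):

    import spack.config

    bootstrapper = ClingoBootstrapConcretizer(configuration=spack.config.CONFIG)
    clingo_spec = bootstrapper.concretize()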

View File

@@ -54,6 +54,7 @@
import spack.version
from ._common import _executables_in_store, _python_import, _root_spec, _try_import_from_store
from .clingo import ClingoBootstrapConcretizer
from .config import spack_python_interpreter, spec_for_current_python
#: Name of the file containing metadata about the bootstrapping source
@@ -268,15 +269,13 @@ def try_import(self, module: str, abstract_spec_str: str) -> bool:
# Try to build and install from sources
with spack_python_interpreter():
# Add hint to use frontend operating system on Cray
concrete_spec = spack.spec.Spec(abstract_spec_str + " ^" + spec_for_current_python())
if module == "clingo":
# TODO: remove when the old concretizer is deprecated # pylint: disable=fixme
concrete_spec._old_concretize( # pylint: disable=protected-access
deprecation_warning=False
)
bootstrapper = ClingoBootstrapConcretizer(configuration=spack.config.CONFIG)
concrete_spec = bootstrapper.concretize()
else:
concrete_spec = spack.spec.Spec(
abstract_spec_str + " ^" + spec_for_current_python()
)
concrete_spec.concretize()
msg = "[BOOTSTRAP MODULE {0}] Try installing '{1}' from sources"
@@ -303,14 +302,7 @@ def try_search_path(self, executables: Tuple[str], abstract_spec_str: str) -> bo
# might reduce compilation time by a fair amount
_add_externals_if_missing()
concrete_spec = spack.spec.Spec(abstract_spec_str)
if concrete_spec.name == "patchelf":
concrete_spec._old_concretize( # pylint: disable=protected-access
deprecation_warning=False
)
else:
concrete_spec.concretize()
concrete_spec = spack.spec.Spec(abstract_spec_str).concretized()
msg = "[BOOTSTRAP] Try installing '{0}' from sources"
tty.debug(msg.format(abstract_spec_str))
with spack.config.override(self.mirror_scope):

File diff suppressed because one or more lines are too long (7 files)
View File

@@ -1473,7 +1473,7 @@ def long_message(self):
out.write(" {0}\n".format(self.log_name))
# Also output the test log path IF it exists
if self.context != "test":
if self.context != "test" and have_log:
test_log = join_path(os.path.dirname(self.log_name), spack_install_test_log)
if os.path.isfile(test_log):
out.write("\nSee test log for details:\n")

View File

@@ -124,6 +124,8 @@ def cuda_flags(arch_list):
# minimum supported versions
conflicts("%gcc@:4", when="+cuda ^cuda@11.0:")
conflicts("%gcc@:5", when="+cuda ^cuda@11.4:")
conflicts("%gcc@:7.2", when="+cuda ^cuda@12.4:")
conflicts("%clang@:6", when="+cuda ^cuda@12.2:")
# maximum supported version
# NOTE:
@@ -211,12 +213,16 @@ def cuda_flags(arch_list):
conflicts("%intel@19.0:", when="+cuda ^cuda@:10.0")
conflicts("%intel@19.1:", when="+cuda ^cuda@:10.1")
conflicts("%intel@19.2:", when="+cuda ^cuda@:11.1.0")
conflicts("%intel@2021:", when="+cuda ^cuda@:11.4.0")
# XL is mostly relevant for ppc64le Linux
conflicts("%xl@:12,14:", when="+cuda ^cuda@:9.1")
conflicts("%xl@:12,14:15,17:", when="+cuda ^cuda@9.2")
conflicts("%xl@:12,17:", when="+cuda ^cuda@:11.1.0")
# PowerPC.
conflicts("target=ppc64le", when="+cuda ^cuda@12.5:")
# Darwin.
# TODO: add missing conflicts for %apple-clang cuda@:10
conflicts("platform=darwin", when="+cuda ^cuda@11.0.2: ")
conflicts("platform=darwin", when="+cuda ^cuda@11.0.2:")

View File

@@ -72,7 +72,7 @@ def build_directory(self):
def build_args(self):
"""Arguments for ``go build``."""
# Pass ldflags -s (omit the symbol table) and -w (omit DWARF debug info) by default
return ["-ldflags", "-s -w", "-o", f"{self.pkg.name}"]
return ["-modcacherw", "-ldflags", "-s -w", "-o", f"{self.pkg.name}"]
@property
def check_args(self):

View File

@@ -3,7 +3,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Common utilities for managing intel oneapi packages."""
import getpass
import os
import platform
import shutil
@@ -13,6 +12,7 @@
from llnl.util.filesystem import HeaderList, LibraryList, find_libraries, join_path, mkdirp
from llnl.util.link_tree import LinkTree
import spack.util.path
from spack.build_environment import dso_suffix
from spack.directives import conflicts, license, redistribute, variant
from spack.package_base import InstallError
@@ -99,7 +99,7 @@ def install_component(self, installer_path):
# with other install depends on the userid. For root, we
# delete the installercache before and after install. For
# non root we redefine the HOME environment variable.
if getpass.getuser() == "root":
if spack.util.path.get_user() == "root":
shutil.rmtree("/var/intel/installercache", ignore_errors=True)
bash = Executable("bash")
@@ -122,7 +122,7 @@ def install_component(self, installer_path):
self.prefix,
)
if getpass.getuser() == "root":
if spack.util.path.get_user() == "root":
shutil.rmtree("/var/intel/installercache", ignore_errors=True)
# Some installers have a bug and do not return an error code when failing

View File

@@ -85,20 +85,28 @@ def homepage(cls):
return "https://bioconductor.org/packages/" + cls.bioc
@lang.classproperty
def url(cls):
def urls(cls):
if cls.cran:
return (
return [
"https://cloud.r-project.org/src/contrib/"
+ cls.cran
+ "_"
+ str(list(cls.versions)[0])
+ ".tar.gz"
)
+ f"{cls.cran}_{str(list(cls.versions)[0])}.tar.gz",
"https://cloud.r-project.org/src/contrib/Archive/{cls.cran}/"
+ f"{cls.cran}_{str(list(cls.versions)[0])}.tar.gz",
]
elif cls.bioc:
return [
"https://bioconductor.org/packages/release/bioc/src/contrib/"
+ f"{cls.bioc}_{str(list(cls.versions)[0])}.tar.gz",
"https://bioconductor.org/packages/release/data/annotation/src/contrib/"
+ f"{cls.bioc}_{str(list(cls.versions)[0])}.tar.gz",
]
else:
return [cls.url]
@lang.classproperty
def list_url(cls):
if cls.cran:
return "https://cloud.r-project.org/src/contrib/Archive/" + cls.cran + "/"
return "https://cloud.r-project.org/src/contrib/"
@property
def git(self):
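
Taken together, a CRAN package now appears to be fetched from the current
``src/contrib`` directory first, with the per-package ``Archive`` directory as
a fallback, while ``list_url`` points at the current directory. A sketch of the
resulting URL list for a hypothetical package (name and version made up):

    cran, version = "ggplot2", "3.5.1"
    urls = [
        f"https://cloud.r-project.org/src/contrib/{cran}_{version}.tar.gz",
        f"https://cloud.r-project.org/src/contrib/Archive/{cran}/{cran}_{version}.tar.gz",
    ]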

View File

@@ -139,6 +139,10 @@ def configure(self, pkg, spec, prefix):
args = ["--verbose", "--target-dir", inspect.getmodule(self.pkg).python_platlib]
args.extend(self.configure_args())
# https://github.com/Python-SIP/sip/commit/cb0be6cb6e9b756b8b0db3136efb014f6fb9b766
if spec["py-sip"].satisfies("@6.1.0:"):
args.extend(["--scripts-dir", pkg.prefix.bin])
sip_build = Executable(spec["py-sip"].prefix.bin.join("sip-build"))
sip_build(*args)

View File

@@ -38,6 +38,7 @@
import spack.paths
import spack.repo
import spack.spec
import spack.stage
import spack.util.git
import spack.util.gpg as gpg_util
import spack.util.spack_yaml as syaml
@@ -71,7 +72,7 @@
# TODO: Remove this in Spack 0.23
SHARED_PR_MIRROR_URL = "s3://spack-binaries-prs/shared_pr_mirror"
JOB_NAME_FORMAT = (
"{name}{@version} {/hash:7} {%compiler.name}{@compiler.version}{arch=architecture}"
"{name}{@version} {/hash:7} {%compiler.name}{@compiler.version}{ arch=architecture}"
)
IS_WINDOWS = sys.platform == "win32"
spack_gpg = spack.main.SpackCommand("gpg")
@@ -1107,7 +1108,7 @@ def main_script_replacements(cmd):
if cdash_handler and cdash_handler.auth_token:
try:
cdash_handler.populate_buildgroup(all_job_names)
except (SpackError, HTTPError, URLError) as err:
except (SpackError, HTTPError, URLError, TimeoutError) as err:
tty.warn(f"Problem populating buildgroup: {err}")
else:
tty.warn("Unable to populate buildgroup without CDash credentials")
@@ -1370,15 +1371,6 @@ def can_verify_binaries():
return len(gpg_util.public_keys()) >= 1
def _push_to_build_cache(spec: spack.spec.Spec, sign_binaries: bool, mirror_url: str) -> None:
"""Unchecked version of the public API, for easier mocking"""
bindist.push_or_raise(
spec,
spack.mirror.Mirror.from_url(mirror_url).push_url,
bindist.PushOptions(force=True, unsigned=not sign_binaries),
)
def push_to_build_cache(spec: spack.spec.Spec, mirror_url: str, sign_binaries: bool) -> bool:
"""Push one or more binary packages to the mirror.
@@ -1389,20 +1381,13 @@ def push_to_build_cache(spec: spack.spec.Spec, mirror_url: str, sign_binaries: b
sign_binaries: If True, spack will attempt to sign binary package before pushing.
"""
tty.debug(f"Pushing to build cache ({'signed' if sign_binaries else 'unsigned'})")
signing_key = bindist.select_signing_key() if sign_binaries else None
try:
_push_to_build_cache(spec, sign_binaries, mirror_url)
bindist.push_or_raise([spec], out_url=mirror_url, signing_key=signing_key)
return True
except bindist.PushToBuildCacheError as e:
tty.error(str(e))
tty.error(f"Problem writing to {mirror_url}: {e}")
return False
except Exception as e:
# TODO (zackgalbreath): write an adapter for boto3 exceptions so we can catch a specific
# exception instead of parsing str(e)...
msg = str(e)
if any(x in msg for x in ["Access Denied", "InvalidAccessKeyId"]):
tty.error(f"Permission problem writing to {mirror_url}: {msg}")
return False
raise
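# (hedged editor's note: the generic Exception handler above, which
# pattern-matched S3 error strings, is the removed half of this hunk;
# bindist.push_or_raise now surfaces those failures as PushToBuildCacheError,
# reported with the mirror URL via tty.error)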
def remove_other_mirrors(mirrors_to_keep, scope=None):
@@ -2083,7 +2068,7 @@ def read_broken_spec(broken_spec_url):
"""
try:
_, _, fs = web_util.read_from_url(broken_spec_url)
except (URLError, web_util.SpackWebError, HTTPError):
except web_util.SpackWebError:
tty.warn(f"Unable to read broken spec from {broken_spec_url}")
return None

View File

@@ -237,7 +237,7 @@ def ensure_single_spec_or_die(spec, matching_specs):
if len(matching_specs) <= 1:
return
format_string = "{name}{@version}{%compiler.name}{@compiler.version}{arch=architecture}"
format_string = "{name}{@version}{%compiler.name}{@compiler.version}{ arch=architecture}"
args = ["%s matches multiple packages." % spec, "Matching packages:"]
args += [
colorize(" @K{%s} " % s.dag_hash(7)) + s.cformat(format_string) for s in matching_specs
@@ -336,6 +336,7 @@ def display_specs(specs, args=None, **kwargs):
groups (bool): display specs grouped by arch/compiler (default True)
decorator (typing.Callable): function to call to decorate specs
all_headers (bool): show headers even when arch/compiler aren't defined
status_fn (typing.Callable): if provided, prepend install-status info
output (typing.IO): A file object to write to. Default is ``sys.stdout``
"""
@@ -359,6 +360,7 @@ def get_arg(name, default=None):
groups = get_arg("groups", True)
all_headers = get_arg("all_headers", False)
output = get_arg("output", sys.stdout)
status_fn = get_arg("status_fn", None)
decorator = get_arg("decorator", None)
if decorator is None:
@@ -386,6 +388,13 @@ def get_arg(name, default=None):
def fmt(s, depth=0):
"""Formatter function for all output specs"""
string = ""
if status_fn:
# This was copied from spec.tree's colorization logic
# then shortened because it seems like status_fn should
# always return an InstallStatus
string += colorize(status_fn(s).value)
if hashes:
string += gray_hash(s, hlen) + " "
string += depth * " "

View File

@@ -3,28 +3,24 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import argparse
import copy
import glob
import hashlib
import json
import multiprocessing
import multiprocessing.pool
import os
import shutil
import sys
import tempfile
from typing import Dict, List, Optional, Tuple, Union
from typing import List, Tuple
import llnl.util.tty as tty
from llnl.string import plural
from llnl.util.lang import elide_list
from llnl.util.lang import elide_list, stable_partition
import spack.binary_distribution as bindist
import spack.cmd
import spack.config
import spack.deptypes as dt
import spack.environment as ev
import spack.error
import spack.hash_types as ht
import spack.mirror
import spack.oci.oci
import spack.oci.opener
@@ -35,28 +31,13 @@
import spack.store
import spack.user_environment
import spack.util.crypto
import spack.util.parallel
import spack.util.url as url_util
import spack.util.web as web_util
from spack import traverse
from spack.build_environment import determine_number_of_jobs
from spack.cmd import display_specs
from spack.cmd.common import arguments
from spack.oci.image import (
Digest,
ImageReference,
default_config,
default_index_tag,
default_manifest,
default_tag,
tag_is_spec,
)
from spack.oci.oci import (
copy_missing_layers_with_retry,
get_manifest_and_config_with_retry,
list_tags,
upload_blob_with_retry,
upload_manifest_with_retry,
)
from spack.oci.image import ImageReference
from spack.spec import Spec, save_dependency_specfiles
description = "create, download and install binary packages"
@@ -70,12 +51,6 @@ def setup_parser(subparser: argparse.ArgumentParser):
push = subparsers.add_parser("push", aliases=["create"], help=push_fn.__doc__)
push.add_argument("-f", "--force", action="store_true", help="overwrite tarball if it exists")
push.add_argument(
"--allow-root",
"-a",
action="store_true",
help="allow install root string in binary files after RPATH substitution",
)
push_sign = push.add_mutually_exclusive_group(required=False)
push_sign.add_argument(
"--unsigned",
@@ -118,6 +93,17 @@ def setup_parser(subparser: argparse.ArgumentParser):
"Alternatively, one can decide to build a cache for only the package or only the "
"dependencies",
)
with_or_without_build_deps = push.add_mutually_exclusive_group()
with_or_without_build_deps.add_argument(
"--with-build-dependencies",
action="store_true",
help="include build dependencies in the buildcache",
)
with_or_without_build_deps.add_argument(
"--without-build-dependencies",
action="store_true",
help="exclude build dependencies from the buildcache",
)
push.add_argument(
"--fail-fast",
action="store_true",
@@ -190,10 +176,6 @@ def setup_parser(subparser: argparse.ArgumentParser):
keys.add_argument("-f", "--force", action="store_true", help="force new download of keys")
keys.set_defaults(func=keys_fn)
preview = subparsers.add_parser("preview", help=preview_fn.__doc__)
arguments.add_common_arguments(preview, ["installed_specs"])
preview.set_defaults(func=preview_fn)
# Check if binaries need to be rebuilt on remote mirror
check = subparsers.add_parser("check", help=check_fn.__doc__)
check.add_argument(
@@ -339,39 +321,6 @@ def _format_spec(spec: Spec) -> str:
return spec.cformat("{name}{@version}{/hash:7}")
def _progress(i: int, total: int):
if total > 1:
digits = len(str(total))
return f"[{i+1:{digits}}/{total}] "
return ""
class NoPool:
def map(self, func, args):
return [func(a) for a in args]
def starmap(self, func, args):
return [func(*a) for a in args]
def __enter__(self):
return self
def __exit__(self, *args):
pass
MaybePool = Union[multiprocessing.pool.Pool, NoPool]
def _make_pool() -> MaybePool:
"""Can't use threading because it's unsafe, and can't use spawned processes because of globals.
That leaves only forking"""
if multiprocessing.get_start_method() == "fork":
return multiprocessing.pool.Pool(determine_number_of_jobs(parallel=True))
else:
return NoPool()
def _skip_no_redistribute_for_public(specs):
remaining_specs = list()
removed_specs = list()
@@ -391,6 +340,45 @@ def _skip_no_redistribute_for_public(specs):
return remaining_specs
class PackagesAreNotInstalledError(spack.error.SpackError):
"""Raised when a list of specs is not installed but picked to be packaged."""
def __init__(self, specs: List[Spec]):
super().__init__(
"Cannot push non-installed packages",
", ".join(elide_list([_format_spec(s) for s in specs], 5)),
)
class PackageNotInstalledError(spack.error.SpackError):
"""Raised when a spec is not installed but picked to be packaged."""
def _specs_to_be_packaged(
requested: List[Spec], things_to_install: str, build_deps: bool
) -> List[Spec]:
"""Collect all non-external with or without roots and dependencies"""
if "dependencies" not in things_to_install:
deptype = dt.NONE
elif build_deps:
deptype = dt.ALL
else:
deptype = dt.RUN | dt.LINK | dt.TEST
specs = [
s
for s in traverse.traverse_nodes(
requested,
root="package" in things_to_install,
deptype=deptype,
order="breadth",
key=traverse.by_dag_hash,
)
if not s.external
]
specs.reverse()
return specs
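# (hedged editor's note: the selection above reduces to
#   "package"                      -> roots only (dt.NONE)
#   "package,dependencies", +build -> roots plus all dependencies (dt.ALL)
#   "package,dependencies", -build -> roots plus the run/link/test closure
# and push_fn passes build_deps=True unless --without-build-dependencies is set)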
def push_fn(args):
"""create a binary package and push it to a mirror"""
if args.spec_file:
@@ -404,11 +392,6 @@ def push_fn(args):
else:
roots = spack.cmd.require_active_env(cmd_name="buildcache push").concrete_roots()
if args.allow_root:
tty.warn(
"The flag `--allow-root` is the default in Spack 0.21, will be removed in Spack 0.22"
)
mirror: spack.mirror.Mirror = args.mirror
# Check if this is an OCI image.
@@ -434,84 +417,79 @@ def push_fn(args):
"Code signing is currently not supported for OCI images. "
"Use --unsigned to silence this warning."
)
unsigned = True
# This is a list of installed, non-external specs.
specs = bindist.specs_to_be_packaged(
# Select a signing key, or None if unsigned.
signing_key = None if unsigned else (args.key or bindist.select_signing_key())
specs = _specs_to_be_packaged(
roots,
root="package" in args.things_to_install,
dependencies="dependencies" in args.things_to_install,
things_to_install=args.things_to_install,
build_deps=args.with_build_dependencies or not args.without_build_dependencies,
)
if not args.private:
specs = _skip_no_redistribute_for_public(specs)
# When pushing multiple specs, print the url once ahead of time, as well as how
# many specs are being pushed.
if len(specs) > 1:
tty.info(f"Selected {len(specs)} specs to push to {push_url}")
failed = []
# Pushing specs that are not installed is an error. Either fail fast or populate
# the error list and push the installed packages in best-effort mode.
failed: List[Tuple[Spec, BaseException]] = []
with spack.store.STORE.db.read_transaction():
if any(not s.installed for s in specs):
specs, not_installed = stable_partition(specs, lambda s: s.installed)
if args.fail_fast:
raise PackagesAreNotInstalledError(not_installed)
else:
failed.extend(
(s, PackageNotInstalledError("package not installed")) for s in not_installed
)
# TODO: unify this logic in the future.
if target_image:
base_image = ImageReference.from_string(args.base_image) if args.base_image else None
with tempfile.TemporaryDirectory(
dir=spack.stage.get_stage_root()
) as tmpdir, _make_pool() as pool:
skipped, base_images, checksums = _push_oci(
with bindist.default_push_context() as (tmpdir, executor):
if target_image:
base_image = ImageReference.from_string(args.base_image) if args.base_image else None
skipped, base_images, checksums, upload_errors = bindist._push_oci(
target_image=target_image,
base_image=base_image,
installed_specs_with_deps=specs,
force=args.force,
tmpdir=tmpdir,
pool=pool,
executor=executor,
)
if upload_errors:
failed.extend(upload_errors)
# Apart from creating manifests for each individual spec, we allow users to create a
# separate image tag for all root specs and their runtime dependencies.
if args.tag:
elif args.tag:
tagged_image = target_image.with_tag(args.tag)
# _push_oci may not populate base_images if binaries were already in the registry
for spec in roots:
_update_base_images(
bindist._oci_update_base_images(
base_image=base_image,
target_image=target_image,
spec=spec,
base_image_cache=base_images,
)
_put_manifest(base_images, checksums, tagged_image, tmpdir, None, None, *roots)
bindist._oci_put_manifest(
base_images, checksums, tagged_image, tmpdir, None, None, *roots
)
tty.info(f"Tagged {tagged_image}")
else:
skipped = []
for i, spec in enumerate(specs):
try:
bindist.push_or_raise(
spec,
push_url,
bindist.PushOptions(
force=args.force,
unsigned=unsigned,
key=args.key,
regenerate_index=args.update_index,
),
)
msg = f"{_progress(i, len(specs))}Pushed {_format_spec(spec)}"
if len(specs) == 1:
msg += f" to {push_url}"
tty.info(msg)
except bindist.NoOverwriteException:
skipped.append(_format_spec(spec))
# Catch any other exception unless the fail fast option is set
except Exception as e:
if args.fail_fast or isinstance(
e, (bindist.PickKeyException, bindist.NoKeyException)
):
raise
failed.append((_format_spec(spec), e))
else:
skipped, upload_errors = bindist._push(
specs,
out_url=push_url,
force=args.force,
update_index=args.update_index,
signing_key=signing_key,
tmpdir=tmpdir,
executor=executor,
)
failed.extend(upload_errors)
if skipped:
if len(specs) == 1:
@@ -534,389 +512,22 @@ def push_fn(args):
raise spack.error.SpackError(
f"The following {len(failed)} errors occurred while pushing specs to the buildcache",
"\n".join(
elide_list([f" {spec}: {e.__class__.__name__}: {e}" for spec, e in failed], 5)
elide_list(
[
f" {_format_spec(spec)}: {e.__class__.__name__}: {e}"
for spec, e in failed
],
5,
)
),
)
# Update the index if requested
# TODO: remove update index logic out of bindist; should be once after all specs are pushed
# not once per spec.
# Update the OCI index if requested
if target_image and len(skipped) < len(specs) and args.update_index:
with tempfile.TemporaryDirectory(
dir=spack.stage.get_stage_root()
) as tmpdir, _make_pool() as pool:
_update_index_oci(target_image, tmpdir, pool)
def _get_spack_binary_blob(image_ref: ImageReference) -> Optional[spack.oci.oci.Blob]:
"""Get the spack tarball layer digests and size if it exists"""
try:
manifest, config = get_manifest_and_config_with_retry(image_ref)
return spack.oci.oci.Blob(
compressed_digest=Digest.from_string(manifest["layers"][-1]["digest"]),
uncompressed_digest=Digest.from_string(config["rootfs"]["diff_ids"][-1]),
size=manifest["layers"][-1]["size"],
)
except Exception:
return None
def _push_single_spack_binary_blob(image_ref: ImageReference, spec: spack.spec.Spec, tmpdir: str):
filename = os.path.join(tmpdir, f"{spec.dag_hash()}.tar.gz")
# Create an oci.image.layer aka tarball of the package
compressed_tarfile_checksum, tarfile_checksum = spack.oci.oci.create_tarball(spec, filename)
blob = spack.oci.oci.Blob(
Digest.from_sha256(compressed_tarfile_checksum),
Digest.from_sha256(tarfile_checksum),
os.path.getsize(filename),
)
# Upload the blob
upload_blob_with_retry(image_ref, file=filename, digest=blob.compressed_digest)
# delete the file
os.unlink(filename)
return blob
def _retrieve_env_dict_from_config(config: dict) -> dict:
"""Retrieve the environment variables from the image config file.
Sets a default value for PATH if it is not present.
Args:
config (dict): The image config file.
Returns:
dict: The environment variables.
"""
env = {"PATH": "/bin:/usr/bin"}
if "Env" in config.get("config", {}):
for entry in config["config"]["Env"]:
key, value = entry.split("=", 1)
env[key] = value
return env
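# (hedged example: _retrieve_env_dict_from_config(
#     {"config": {"Env": ["PATH=/opt/bin", "LANG=C"]}})
#  returns {"PATH": "/opt/bin", "LANG": "C"} -- the default PATH is overwritten)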
def _archspec_to_gooarch(spec: spack.spec.Spec) -> str:
name = spec.target.family.name
name_map = {"aarch64": "arm64", "x86_64": "amd64"}
return name_map.get(name, name)
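# (hedged example: "x86_64" -> "amd64", "aarch64" -> "arm64"; any other archspec
# family name, e.g. "ppc64le", passes through unchanged)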
def _put_manifest(
base_images: Dict[str, Tuple[dict, dict]],
checksums: Dict[str, spack.oci.oci.Blob],
image_ref: ImageReference,
tmpdir: str,
extra_config: Optional[dict],
annotations: Optional[dict],
*specs: spack.spec.Spec,
):
architecture = _archspec_to_gooarch(specs[0])
dependencies = list(
reversed(
list(
s
for s in traverse.traverse_nodes(
specs, order="topo", deptype=("link", "run"), root=True
)
if not s.external
)
)
)
base_manifest, base_config = base_images[architecture]
env = _retrieve_env_dict_from_config(base_config)
# If the base image uses `vnd.docker.distribution.manifest.v2+json`, then we use that too.
# This is because Singularity / Apptainer is very strict about not mixing them.
base_manifest_mediaType = base_manifest.get(
"mediaType", "application/vnd.oci.image.manifest.v1+json"
)
use_docker_format = (
base_manifest_mediaType == "application/vnd.docker.distribution.manifest.v2+json"
)
spack.user_environment.environment_modifications_for_specs(*specs).apply_modifications(env)
# Create an oci.image.config file
config = copy.deepcopy(base_config)
# Add the diff ids of the dependencies
for s in dependencies:
config["rootfs"]["diff_ids"].append(str(checksums[s.dag_hash()].uncompressed_digest))
# Set the environment variables
config["config"]["Env"] = [f"{k}={v}" for k, v in env.items()]
if extra_config:
# From the OCI v1.0 spec:
# > Any extra fields in the Image JSON struct are considered implementation
# > specific and MUST be ignored by any implementations which are unable to
# > interpret them.
config.update(extra_config)
config_file = os.path.join(tmpdir, f"{specs[0].dag_hash()}.config.json")
with open(config_file, "w") as f:
json.dump(config, f, separators=(",", ":"))
config_file_checksum = Digest.from_sha256(
spack.util.crypto.checksum(hashlib.sha256, config_file)
)
# Upload the config file
upload_blob_with_retry(image_ref, file=config_file, digest=config_file_checksum)
manifest = {
"mediaType": base_manifest_mediaType,
"schemaVersion": 2,
"config": {
"mediaType": base_manifest["config"]["mediaType"],
"digest": str(config_file_checksum),
"size": os.path.getsize(config_file),
},
"layers": [
*(layer for layer in base_manifest["layers"]),
*(
{
"mediaType": (
"application/vnd.docker.image.rootfs.diff.tar.gzip"
if use_docker_format
else "application/vnd.oci.image.layer.v1.tar+gzip"
),
"digest": str(checksums[s.dag_hash()].compressed_digest),
"size": checksums[s.dag_hash()].size,
}
for s in dependencies
),
],
}
if not use_docker_format and annotations:
manifest["annotations"] = annotations
# Finally upload the manifest
upload_manifest_with_retry(image_ref, manifest=manifest)
# delete the config file
os.unlink(config_file)
def _update_base_images(
*,
base_image: Optional[ImageReference],
target_image: ImageReference,
spec: spack.spec.Spec,
base_image_cache: Dict[str, Tuple[dict, dict]],
):
"""For a given spec and base image, copy the missing layers of the base image with matching
arch to the registry of the target image. If no base image is specified, create a dummy
manifest and config file."""
architecture = _archspec_to_gooarch(spec)
if architecture in base_image_cache:
return
if base_image is None:
base_image_cache[architecture] = (
default_manifest(),
default_config(architecture, "linux"),
)
else:
base_image_cache[architecture] = copy_missing_layers_with_retry(
base_image, target_image, architecture
)
def _push_oci(
*,
target_image: ImageReference,
base_image: Optional[ImageReference],
installed_specs_with_deps: List[Spec],
tmpdir: str,
pool: MaybePool,
force: bool = False,
) -> Tuple[List[str], Dict[str, Tuple[dict, dict]], Dict[str, spack.oci.oci.Blob]]:
"""Push specs to an OCI registry
Args:
target_image: The target OCI image
base_image: Optional base image, which will be copied to the target registry.
installed_specs_with_deps: The installed specs to push, excluding externals,
including deps, ordered from roots to leaves.
force: Whether to overwrite existing layers and manifests in the buildcache.
Returns:
A tuple consisting of the list of skipped specs already in the build cache,
a dictionary mapping architectures to base image manifests and configs,
and a dictionary mapping each spec's dag hash to a blob.
"""
# Reverse the order
installed_specs_with_deps = list(reversed(installed_specs_with_deps))
# Spec dag hash -> blob
checksums: Dict[str, spack.oci.oci.Blob] = {}
# arch -> (manifest, config)
base_images: Dict[str, Tuple[dict, dict]] = {}
# Specs not uploaded because they already exist
skipped = []
if not force:
tty.info("Checking for existing specs in the buildcache")
to_be_uploaded = []
tags_to_check = (target_image.with_tag(default_tag(s)) for s in installed_specs_with_deps)
available_blobs = pool.map(_get_spack_binary_blob, tags_to_check)
for spec, maybe_blob in zip(installed_specs_with_deps, available_blobs):
if maybe_blob is not None:
checksums[spec.dag_hash()] = maybe_blob
skipped.append(_format_spec(spec))
else:
to_be_uploaded.append(spec)
else:
to_be_uploaded = installed_specs_with_deps
if not to_be_uploaded:
return skipped, base_images, checksums
tty.info(
f"{len(to_be_uploaded)} specs need to be pushed to "
f"{target_image.domain}/{target_image.name}"
)
# Upload blobs
new_blobs = pool.starmap(
_push_single_spack_binary_blob, ((target_image, spec, tmpdir) for spec in to_be_uploaded)
)
# And update the spec to blob mapping
for spec, blob in zip(to_be_uploaded, new_blobs):
checksums[spec.dag_hash()] = blob
# Copy base images if necessary
for spec in to_be_uploaded:
_update_base_images(
base_image=base_image,
target_image=target_image,
spec=spec,
base_image_cache=base_images,
)
def extra_config(spec: Spec):
spec_dict = spec.to_dict(hash=ht.dag_hash)
spec_dict["buildcache_layout_version"] = 1
spec_dict["binary_cache_checksum"] = {
"hash_algorithm": "sha256",
"hash": checksums[spec.dag_hash()].compressed_digest.digest,
}
return spec_dict
# Upload manifests
tty.info("Uploading manifests")
pool.starmap(
_put_manifest,
(
(
base_images,
checksums,
target_image.with_tag(default_tag(spec)),
tmpdir,
extra_config(spec),
{"org.opencontainers.image.description": spec.format()},
spec,
)
for spec in to_be_uploaded
),
)
# Print the image names of the top-level specs
for spec in to_be_uploaded:
tty.info(f"Pushed {_format_spec(spec)} to {target_image.with_tag(default_tag(spec))}")
return skipped, base_images, checksums
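The force/skip logic above boils down to partitioning specs by whether a blob already exists in the registry; a hedged standalone sketch of that pattern with a thread pool (the `blob_exists` probe is a hypothetical stand-in for the tag lookup):

import concurrent.futures

def partition_by_existing(items, blob_exists, force=False):
    """Return (skipped, to_upload); `blob_exists` is a hypothetical probe."""
    if force:
        return [], list(items)
    with concurrent.futures.ThreadPoolExecutor() as pool:
        found = list(pool.map(blob_exists, items))
    skipped = [x for x, hit in zip(items, found) if hit]
    to_upload = [x for x, hit in zip(items, found) if not hit]
    return skipped, to_upload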
def _config_from_tag(image_ref: ImageReference, tag: str) -> Optional[dict]:
# Don't allow recursion here, since Spack itself always uploads
# vnd.oci.image.manifest.v1+json, not vnd.oci.image.index.v1+json
_, config = get_manifest_and_config_with_retry(image_ref.with_tag(tag), tag, recurse=0)
# Do very basic validation: if "spec" is a key in the config, it
# must be a Spec object too.
return config if "spec" in config else None
def _update_index_oci(image_ref: ImageReference, tmpdir: str, pool: MaybePool) -> None:
tags = list_tags(image_ref)
# Fetch all image config files in parallel
spec_dicts = pool.starmap(
_config_from_tag, ((image_ref, tag) for tag in tags if tag_is_spec(tag))
)
# Populate the database
db_root_dir = os.path.join(tmpdir, "db_root")
db = bindist.BuildCacheDatabase(db_root_dir)
for spec_dict in spec_dicts:
spec = Spec.from_dict(spec_dict)
db.add(spec, directory_layout=None)
db.mark(spec, "in_buildcache", True)
# Create the index.json file
index_json_path = os.path.join(tmpdir, "index.json")
with open(index_json_path, "w") as f:
db._write_to_file(f)
# Create an empty config.json file
empty_config_json_path = os.path.join(tmpdir, "config.json")
with open(empty_config_json_path, "wb") as f:
f.write(b"{}")
# Upload the index.json file
index_shasum = Digest.from_sha256(spack.util.crypto.checksum(hashlib.sha256, index_json_path))
upload_blob_with_retry(image_ref, file=index_json_path, digest=index_shasum)
# Upload the config.json file
empty_config_digest = Digest.from_sha256(
spack.util.crypto.checksum(hashlib.sha256, empty_config_json_path)
)
upload_blob_with_retry(image_ref, file=empty_config_json_path, digest=empty_config_digest)
# Push a manifest file that references the index.json file as a layer
# Notice that we push this as if it is an image, which it of course is not.
# When the ORAS spec becomes official, we can use that instead of a fake image.
# For now we just use the OCI image spec, so that we don't run into issues with
# automatic garbage collection of blobs that are not referenced by any image manifest.
oci_manifest = {
"mediaType": "application/vnd.oci.image.manifest.v1+json",
"schemaVersion": 2,
# Config is just an empty {} file for now, and irrelevant
"config": {
"mediaType": "application/vnd.oci.image.config.v1+json",
"digest": str(empty_config_digest),
"size": os.path.getsize(empty_config_json_path),
},
# The buildcache index is the only layer; it is not really a tarball, we lie here.
"layers": [
{
"mediaType": "application/vnd.oci.image.layer.v1.tar+gzip",
"digest": str(index_shasum),
"size": os.path.getsize(index_json_path),
}
],
}
upload_manifest_with_retry(image_ref.with_tag(default_index_tag), oci_manifest)
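Reading the index back is the mirror image of the upload above: fetch the manifest for the index tag, take the single layer's digest, and download that blob. A rough standalone sketch using the standard OCI distribution endpoints (auth is omitted, and the "index.spack" tag is an assumption for illustration):

import json
import urllib.request

def fetch_buildcache_index(registry: str, repo: str, tag: str = "index.spack") -> dict:
    # GET the manifest for the tag, then the blob it points at
    mreq = urllib.request.Request(
        f"https://{registry}/v2/{repo}/manifests/{tag}",
        headers={"Accept": "application/vnd.oci.image.manifest.v1+json"},
    )
    manifest = json.load(urllib.request.urlopen(mreq))
    digest = manifest["layers"][0]["digest"]  # the index.json blob
    blob = urllib.request.urlopen(f"https://{registry}/v2/{repo}/blobs/{digest}")
    return json.load(blob)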
) as tmpdir, spack.util.parallel.make_concurrent_executor() as executor:
bindist._oci_update_index(target_image, tmpdir, executor)
def install_fn(args):
@@ -960,14 +571,6 @@ def keys_fn(args):
bindist.get_keys(args.install, args.trust, args.force)
def preview_fn(args):
"""analyze an installed spec and reports whether executables and libraries are relocatable"""
tty.warn(
"`spack buildcache preview` is deprecated since `spack buildcache push --allow-root` is "
"now the default. This command will be removed in Spack 0.22"
)
def check_fn(args: argparse.Namespace):
"""check specs against remote binary mirror(s) to see if any need to be rebuilt
@@ -1205,14 +808,15 @@ def update_index(mirror: spack.mirror.Mirror, update_keys=False):
if image_ref:
with tempfile.TemporaryDirectory(
dir=spack.stage.get_stage_root()
) as tmpdir, _make_pool() as pool:
_update_index_oci(image_ref, tmpdir, pool)
) as tmpdir, spack.util.parallel.make_concurrent_executor() as executor:
bindist._oci_update_index(image_ref, tmpdir, executor)
return
# Otherwise, assume a normal mirror.
url = mirror.push_url
bindist.generate_package_index(url_util.join(url, bindist.build_cache_relative_path()))
with tempfile.TemporaryDirectory(dir=spack.stage.get_stage_root()) as tmpdir:
bindist.generate_package_index(url, tmpdir)
if update_keys:
keys_url = url_util.join(
@@ -1220,7 +824,8 @@ def update_index(mirror: spack.mirror.Mirror, update_keys=False):
)
try:
bindist.generate_key_index(keys_url)
with tempfile.TemporaryDirectory(dir=spack.stage.get_stage_root()) as tmpdir:
bindist.generate_key_index(keys_url, tmpdir)
except bindist.CannotListKeys as e:
# Do not error out if listing keys went wrong. This usually means that the _gpg path
# does not exist. TODO: distinguish between this and other errors.

View File

@@ -11,7 +11,6 @@
from argparse import ArgumentParser, Namespace
from typing import IO, Any, Callable, Dict, Iterable, List, Optional, Sequence, Set, Tuple, Union
import llnl.util.filesystem as fs
import llnl.util.tty as tty
from llnl.util.argparsewriter import ArgparseRstWriter, ArgparseWriter, Command
from llnl.util.tty.colify import colify
@@ -867,9 +866,6 @@ def _commands(parser: ArgumentParser, args: Namespace) -> None:
prepend_header(args, f)
formatter(args, f)
if args.update_completion:
fs.set_executable(args.update)
else:
prepend_header(args, sys.stdout)
formatter(args, sys.stdout)

View File

@@ -156,7 +156,7 @@ def print_flattened_configuration(*, blame: bool) -> None:
"""
env = ev.active_environment()
if env is not None:
pristine = env.manifest.pristine_yaml_content
pristine = env.manifest.yaml_content
flattened = pristine.copy()
flattened[spack.schema.env.TOP_LEVEL_KEY] = pristine[spack.schema.env.TOP_LEVEL_KEY].copy()
else:

View File

@@ -941,9 +941,7 @@ def get_repository(args, name):
)
else:
if spec.namespace:
repo = spack.repo.PATH.get_repo(spec.namespace, None)
if not repo:
tty.die("Unknown namespace: '{0}'".format(spec.namespace))
repo = spack.repo.PATH.get_repo(spec.namespace)
else:
repo = spack.repo.PATH.first_repo()

View File

@@ -6,6 +6,7 @@
import os
import platform
import re
import sys
from datetime import datetime
from glob import glob
@@ -62,9 +63,10 @@ def create_db_tarball(args):
base = os.path.basename(str(spack.store.STORE.root))
transform_args = []
# Currently --transform and -s are not supported by Windows native tar
if "GNU" in tar("--version", output=str):
transform_args = ["--transform", "s/^%s/%s/" % (base, tarball_name)]
else:
elif sys.platform != "win32":
transform_args = ["-s", "/^%s/%s/" % (base, tarball_name)]
wd = os.path.dirname(str(spack.store.STORE.root))
@@ -90,7 +92,6 @@ def report(args):
print("* **Spack:**", get_version())
print("* **Python:**", platform.python_version())
print("* **Platform:**", architecture)
print("* **Concretizer:**", spack.config.get("config:concretizer"))
def debug(parser, args):

View File

@@ -47,16 +47,6 @@ def inverted_dependencies():
dependents of, e.g., `mpi`, but virtuals are not included as
actual dependents.
"""
dag = {}
for pkg_cls in spack.repo.PATH.all_package_classes():
dag.setdefault(pkg_cls.name, set())
for dep in pkg_cls.dependencies_by_name():
deps = [dep]
# expand virtuals if necessary
if spack.repo.PATH.is_virtual(dep):
deps += [s.name for s in spack.repo.PATH.providers_for(dep)]
dag = collections.defaultdict(set)
for pkg_cls in spack.repo.PATH.all_package_classes():
for _, deps_by_name in pkg_cls.dependencies.items():

View File

@@ -7,7 +7,7 @@
import os
import re
import sys
from typing import List, Optional
from typing import List, Optional, Set
import llnl.util.tty as tty
import llnl.util.tty.colify as colify
@@ -19,6 +19,7 @@
import spack.detection
import spack.error
import spack.repo
import spack.spec
import spack.util.environment
from spack.cmd.common import arguments
@@ -138,14 +139,26 @@ def external_find(args):
candidate_packages, path_hints=args.path, max_workers=args.jobs
)
new_entries = spack.detection.update_configuration(
new_specs = spack.detection.update_configuration(
detected_packages, scope=args.scope, buildable=not args.not_buildable
)
if new_entries:
# If the user runs `spack external find --not-buildable mpich` we also mark `mpi` non-buildable
# to avoid that the concretizer picks a different mpi provider.
if new_specs and args.not_buildable:
virtuals: Set[str] = {
virtual.name
for new_spec in new_specs
for virtual_specs in spack.repo.PATH.get_pkg_class(new_spec.name).provided.values()
for virtual in virtual_specs
}
new_virtuals = spack.detection.set_virtuals_nonbuildable(virtuals, scope=args.scope)
new_specs.extend(spack.spec.Spec(name) for name in new_virtuals)
if new_specs:
path = spack.config.CONFIG.get_config_filename(args.scope, "packages")
msg = "The following specs have been detected on this system and added to {0}"
tty.msg(msg.format(path))
spack.cmd.display_specs(new_entries)
tty.msg(f"The following specs have been detected on this system and added to {path}")
spack.cmd.display_specs(new_specs)
else:
tty.msg("No new external packages detected")

View File

@@ -46,6 +46,10 @@ def setup_parser(subparser):
help="output specs as machine-readable json records",
)
subparser.add_argument(
"-I", "--install-status", action="store_true", help="show install status of packages"
)
subparser.add_argument(
"-d", "--deps", action="store_true", help="output dependencies along with found specs"
)
@@ -293,25 +297,24 @@ def root_decorator(spec, string):
)
print()
if args.show_concretized:
tty.msg("Concretized roots")
cmd.display_specs(env.specs_by_hash.values(), args, decorator=decorator)
print()
# Display a header for the installed packages section IF there are installed
# packages. If there aren't any, we'll just end up printing "0 installed packages"
# later.
if results and not args.only_roots:
tty.msg("Installed packages")
def find(parser, args):
q_args = query_arguments(args)
results = args.specs(**q_args)
env = ev.active_environment()
if not env and args.only_roots:
tty.die("-r / --only-roots requires an active environment")
if not env and args.show_concretized:
tty.die("-c / --show-concretized requires an active environment")
if env:
if args.constraint:
init_specs = spack.cmd.parse_specs(args.constraint)
results = env.all_matching_specs(*init_specs)
else:
results = env.all_specs()
else:
q_args = query_arguments(args)
results = args.specs(**q_args)
decorator = make_env_decorator(env) if env else lambda s, f: f
@@ -332,6 +335,11 @@ def find(parser, args):
if args.loaded:
results = spack.cmd.filter_loaded_specs(results)
if args.install_status or args.show_concretized:
status_fn = spack.spec.Spec.install_status
else:
status_fn = None
# Display the result
if args.json:
cmd.display_specs_as_json(results, deps=args.deps)
@@ -340,12 +348,34 @@ def find(parser, args):
if env:
display_env(env, args, decorator, results)
count_suffix = " (not shown)"
if not args.only_roots:
cmd.display_specs(results, args, decorator=decorator, all_headers=True)
count_suffix = ""
display_results = results
if not args.show_concretized:
display_results = list(x for x in results if x.installed)
cmd.display_specs(
display_results, args, decorator=decorator, all_headers=True, status_fn=status_fn
)
# print number of installed packages last (as the list may be long)
if sys.stdout.isatty() and args.groups:
installed_suffix = ""
concretized_suffix = " to be installed"
if args.only_roots:
installed_suffix += " (not shown)"
concretized_suffix += " (not shown)"
else:
if env and not args.show_concretized:
concretized_suffix += " (show with `spack find -c`)"
pkg_type = "loaded" if args.loaded else "installed"
spack.cmd.print_how_many_pkgs(results, pkg_type, suffix=count_suffix)
spack.cmd.print_how_many_pkgs(
list(x for x in results if x.installed), pkg_type, suffix=installed_suffix
)
if env:
spack.cmd.print_how_many_pkgs(
list(x for x in results if not x.installed),
"concretized",
suffix=concretized_suffix,
)

View File

@@ -5,10 +5,12 @@
import argparse
import os
import tempfile
import spack.binary_distribution
import spack.mirror
import spack.paths
import spack.stage
import spack.util.gpg
import spack.util.url
from spack.cmd.common import arguments
@@ -115,6 +117,7 @@ def setup_parser(subparser):
help="URL of the mirror where keys will be published",
)
publish.add_argument(
"--update-index",
"--rebuild-index",
action="store_true",
default=False,
@@ -220,9 +223,10 @@ def gpg_publish(args):
elif args.mirror_url:
mirror = spack.mirror.Mirror(args.mirror_url, args.mirror_url)
spack.binary_distribution.push_keys(
mirror, keys=args.keys, regenerate_index=args.rebuild_index
)
with tempfile.TemporaryDirectory(dir=spack.stage.get_stage_root()) as tmpdir:
spack.binary_distribution.push_keys(
mirror, keys=args.keys, tmpdir=tmpdir, update_index=args.update_index
)
def gpg(parser, args):

View File

@@ -169,7 +169,9 @@ def pkg_hash(args):
def get_grep(required=False):
"""Get a grep command to use with ``spack pkg grep``."""
return exe.which(os.environ.get("SPACK_GREP") or "grep", required=required)
grep = exe.which(os.environ.get("SPACK_GREP") or "grep", required=required)
grep.ignore_quotes = True # allow `spack pkg grep '"quoted string"'` without warning
return grep
def pkg_grep(args, unknown_args):

View File

@@ -339,7 +339,7 @@ def add(self, pkg_name, fetcher):
for pkg_cls in spack.repo.PATH.all_package_classes():
npkgs += 1
for v in pkg_cls.versions:
for v in list(pkg_cls.versions):
try:
pkg = pkg_cls(spack.spec.Spec(pkg_cls.name))
fetcher = fs.for_package_version(pkg, v)

View File

@@ -18,7 +18,6 @@
import llnl.util.tty as tty
from llnl.util.filesystem import path_contains_subdirectory, paths_containing_libs
import spack.compilers
import spack.error
import spack.schema.environment
import spack.spec
@@ -279,11 +278,6 @@ def debug_flags(self):
def opt_flags(self):
return ["-O", "-O0", "-O1", "-O2", "-O3"]
# Cray PrgEnv name that can be used to load this compiler
PrgEnv: Optional[str] = None
# Name of module used to switch versions of this compiler
PrgEnv_compiler: Optional[str] = None
def __init__(
self,
cspec,

View File

@@ -488,7 +488,7 @@ def supported_compilers_for_host_platform() -> List[str]:
return supported_compilers_for_platform(host_plat)
def supported_compilers_for_platform(platform: spack.platforms.Platform) -> List[str]:
def supported_compilers_for_platform(platform: "spack.platforms.Platform") -> List[str]:
"""Return a set of compiler class objects supported by Spack
that are also supported by the provided platform

View File

@@ -25,9 +25,6 @@ class Aocc(Compiler):
# Subclasses use possible names of Fortran 90 compiler
fc_names = ["flang"]
PrgEnv = "PrgEnv-aocc"
PrgEnv_compiler = "aocc"
version_argument = "--version"
@property

View File

@@ -34,12 +34,9 @@ def __init__(self, *args, **kwargs):
# MacPorts builds gcc versions with prefixes and -mp-X.Y suffixes.
suffixes = [r"-mp-\d\.\d"]
PrgEnv = "PrgEnv-cray"
PrgEnv_compiler = "cce"
@property
def link_paths(self):
if any(self.PrgEnv in m for m in self.modules):
if any("PrgEnv-cray" in m for m in self.modules):
# Old module-based interface to cray compilers
return {
"cc": os.path.join("cce", "cc"),

View File

@@ -40,9 +40,6 @@ class Gcc(spack.compiler.Compiler):
"fc": os.path.join("gcc", "gfortran"),
}
PrgEnv = "PrgEnv-gnu"
PrgEnv_compiler = "gcc"
@property
def verbose_flag(self):
return "-v"

View File

@@ -31,9 +31,6 @@ class Intel(Compiler):
"fc": os.path.join("intel", "ifort"),
}
PrgEnv = "PrgEnv-intel"
PrgEnv_compiler = "intel"
if sys.platform == "win32":
version_argument = "/QV"
else:
@@ -126,3 +123,14 @@ def fc_pic_flag(self):
@property
def stdcxx_libs(self):
return ("-cxxlib",)
def setup_custom_environment(self, pkg, env):
# Edge cases for Intel's oneAPI compilers when using the legacy classic compilers:
# Always pass flags to disable deprecation warnings, since these warnings can
# confuse tools that parse the output of compiler commands (e.g. version checks).
if self.cc and self.cc.endswith("icc") and self.real_version >= Version("2021"):
env.append_flags("SPACK_ALWAYS_CFLAGS", "-diag-disable=10441")
if self.cxx and self.cxx.endswith("icpc") and self.real_version >= Version("2021"):
env.append_flags("SPACK_ALWAYS_CXXFLAGS", "-diag-disable=10441")
if self.fc and self.fc.endswith("ifort") and self.real_version >= Version("2021"):
env.append_flags("SPACK_ALWAYS_FFLAGS", "-diag-disable=10448")

View File

@@ -231,24 +231,55 @@ def msvc_version(self):
@property
def short_msvc_version(self):
"""This is the shorthand VCToolset version of form
MSVC<short-ver>
"""
This is the shorthand VCToolset version of form
MSVC<short-ver> *NOT* the full version, for that see
return "MSVC" + self.vc_toolset_ver
@property
def vc_toolset_ver(self):
"""
The toolset version is the version of the combined set of cl and link
This typically relates directly to VS version i.e. VS 2022 is v143
VS 19 is v142, etc.
This value is defined by the first three digits of the major + minor
version of the VS toolset (143 for 14.3x.bbbbb). Traditionally the
minor version has remained a static two digit number for a VS release
series, however, as of VS22, this is no longer true, both
14.4x.bbbbb and 14.3x.bbbbb are considered valid VS22 VC toolset
versions due to a change in toolset minor version sentiment.
This is *NOT* the full version, for that see
Msvc.msvc_version or MSVC.platform_toolset_ver for the
raw platform toolset version
"""
ver = self.platform_toolset_ver
return "MSVC" + ver
ver = self.msvc_version[:2].joined.string[:3]
return ver
@property
def platform_toolset_ver(self):
"""
This is the platform toolset version of current MSVC compiler
i.e. 142.
i.e. 142. The platform toolset is the targeted MSVC library/compiler
versions by compilation (this is different from the VC Toolset)
This is different from the VC toolset version as established
by `short_msvc_version`
by `short_msvc_version`, but typically are represented by the same
three digit value
"""
return self.msvc_version[:2].joined.string[:3]
# Typically VS toolset version and platform toolset versions match
# VS22 introduces the first divergence of VS toolset version
# (144 for "recent" releases) and platform toolset version (143)
# so it needs additional handling until MS releases v144
# (assuming v144 is also for VS22)
# or adds better support for detection
# TODO: (johnwparent) Update this logic for the next platform toolset
# or VC toolset version update
toolset_ver = self.vc_toolset_ver
vs22_toolset = Version(toolset_ver) > Version("142")
return toolset_ver if not vs22_toolset else "143"
def _compiler_version(self, compiler):
"""Returns version object for given compiler"""

View File

@@ -29,9 +29,6 @@ class Nvhpc(Compiler):
"fc": os.path.join("nvhpc", "nvfortran"),
}
PrgEnv = "PrgEnv-nvhpc"
PrgEnv_compiler = "nvhpc"
version_argument = "--version"
version_regex = r"nv[^ ]* (?:[^ ]+ Dev-r)?([0-9.]+)(?:-[0-9]+)?"

View File

@@ -4,11 +4,12 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
from os.path import dirname
from os.path import dirname, join
from llnl.util import tty
from spack.compiler import Compiler
from spack.version import Version
class Oneapi(Compiler):
@@ -32,9 +33,6 @@ class Oneapi(Compiler):
"fc": os.path.join("oneapi", "ifx"),
}
PrgEnv = "PrgEnv-oneapi"
PrgEnv_compiler = "oneapi"
version_argument = "--version"
version_regex = r"(?:(?:oneAPI DPC\+\+(?:\/C\+\+)? Compiler)|(?:\(IFORT\))|(?:\(IFX\))) (\S+)"
@@ -135,8 +133,22 @@ def setup_custom_environment(self, pkg, env):
# It is located in the same directory as the driver. Error message:
# clang++: error: unable to execute command:
# Executable "sycl-post-link" doesn't exist!
if self.cxx:
# also ensures that shared objects and libraries required by the compiler,
# e.g. libonnx, can be found successfully
# due to a fix, this is no longer required for OneAPI versions >= 2024.2
if self.cxx and pkg.spec.satisfies("%oneapi@:2024.1"):
env.prepend_path("PATH", dirname(self.cxx))
env.prepend_path("LD_LIBRARY_PATH", join(dirname(dirname(self.cxx)), "lib"))
# Edge cases for Intel's oneAPI compilers when using the legacy classic compilers:
# Always pass flags to disable deprecation warnings, since these warnings can
# confuse tools that parse the output of compiler commands (e.g. version checks).
if self.cc and self.cc.endswith("icc") and self.real_version >= Version("2021"):
env.append_flags("SPACK_ALWAYS_CFLAGS", "-diag-disable=10441")
if self.cxx and self.cxx.endswith("icpc") and self.real_version >= Version("2021"):
env.append_flags("SPACK_ALWAYS_CXXFLAGS", "-diag-disable=10441")
if self.fc and self.fc.endswith("ifort") and self.real_version >= Version("2021"):
env.append_flags("SPACK_ALWAYS_FFLAGS", "-diag-disable=10448")
# 2024 release bumped the libsycl version because of an ABI
# change, 2024 compilers are required. You will see this

View File

@@ -30,9 +30,6 @@ class Pgi(Compiler):
"fc": os.path.join("pgi", "pgfortran"),
}
PrgEnv = "PrgEnv-pgi"
PrgEnv_compiler = "pgi"
version_argument = "-V"
ignore_version_errors = [2] # `pgcc -V` on PowerPC annoyingly returns 2
version_regex = r"pg[^ ]* ([0-9.]+)-[0-9]+ (LLVM )?[^ ]+ target on "

View File

@@ -23,9 +23,6 @@ class Rocmcc(spack.compilers.clang.Clang):
# Subclasses use possible names of Fortran 90 compiler
fc_names = ["amdflang"]
PrgEnv = "PrgEnv-amd"
PrgEnv_compiler = "amd"
@property
def link_paths(self):
link_paths = {

View File

@@ -2,29 +2,11 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""
Functions here are used to take abstract specs and make them concrete.
For example, if a spec asks for a version between 1.8 and 1.9, these
functions will take the most recent 1.9 version of the
package available. Or, if the user didn't specify a compiler for a
spec, then this will assign a compiler to the spec based on defaults
or user preferences.
TODO: make this customizable and allow users to configure
concretization policies.
(DEPRECATED) Used to contain the code for the original concretizer
"""
import functools
import platform
import tempfile
from contextlib import contextmanager
from itertools import chain
from typing import Union
import archspec.cpu
import llnl.util.lang
import llnl.util.tty as tty
import spack.abi
import spack.compilers
@@ -37,639 +19,20 @@
import spack.target
import spack.tengine
import spack.util.path
import spack.variant as vt
from spack.package_prefs import PackagePrefs, is_spec_buildable, spec_externals
from spack.version import ClosedOpenRange, VersionList, ver
#: implements rudimentary logic for ABI compatibility
_abi: Union[spack.abi.ABI, llnl.util.lang.Singleton] = llnl.util.lang.Singleton(
lambda: spack.abi.ABI()
)
@functools.total_ordering
class reverse_order:
"""Helper for creating key functions.
This is a wrapper that inverts the sense of the natural
comparisons on the object.
"""
def __init__(self, value):
self.value = value
def __eq__(self, other):
return other.value == self.value
def __lt__(self, other):
return other.value < self.value
class Concretizer:
"""You can subclass this class to override some of the default
concretization strategies, or you can override all of them.
"""
"""(DEPRECATED) Only contains logic to enable/disable compiler existence checks."""
#: Controls whether we check that compiler versions actually exist
#: during concretization. Used for testing and for mirror creation
check_for_compiler_existence = None
#: Packages that the old concretizer cannot deal with correctly, and cannot build anyway.
#: Those will not be considered as providers for virtuals.
non_buildable_packages = {"glibc", "musl"}
def __init__(self, abstract_spec=None):
def __init__(self):
if Concretizer.check_for_compiler_existence is None:
Concretizer.check_for_compiler_existence = not spack.config.get(
"config:install_missing_compilers", False
)
self.abstract_spec = abstract_spec
self._adjust_target_answer_generator = None
def concretize_develop(self, spec):
"""
Add ``dev_path=*`` variant to packages built from local source.
"""
env = spack.environment.active_environment()
dev_info = env.dev_specs.get(spec.name, {}) if env else {}
if not dev_info:
return False
path = spack.util.path.canonicalize_path(dev_info["path"], default_wd=env.path)
if "dev_path" in spec.variants:
assert spec.variants["dev_path"].value == path
changed = False
else:
spec.variants.setdefault("dev_path", vt.SingleValuedVariant("dev_path", path))
changed = True
changed |= spec.constrain(dev_info["spec"])
return changed
def _valid_virtuals_and_externals(self, spec):
"""Returns a list of candidate virtual dep providers and external
packages that could be used to concretize a spec.
Preferred specs come first in the list.
"""
# First construct a list of concrete candidates to replace spec with.
candidates = [spec]
pref_key = lambda spec: 0 # no-op pref key
if spec.virtual:
candidates = [
s
for s in spack.repo.PATH.providers_for(spec)
if s.name not in self.non_buildable_packages
]
if not candidates:
raise spack.error.UnsatisfiableProviderSpecError(candidates[0], spec)
# Find nearest spec in the DAG (up then down) that has prefs.
spec_w_prefs = find_spec(
spec, lambda p: PackagePrefs.has_preferred_providers(p.name, spec.name), spec
) # default to spec itself.
# Create a key to sort candidates by the prefs we found
pref_key = PackagePrefs(spec_w_prefs.name, "providers", spec.name)
# For each candidate package, if it has externals, add those
# to the usable list. if it's not buildable, then *only* add
# the externals.
usable = []
for cspec in candidates:
if is_spec_buildable(cspec):
usable.append(cspec)
externals = spec_externals(cspec)
for ext in externals:
if ext.intersects(spec):
usable.append(ext)
# If nothing is in the usable list now, it's because we aren't
# allowed to build anything.
if not usable:
raise NoBuildError(spec)
# Use a sort key to order the results
return sorted(
usable,
key=lambda spec: (
not spec.external, # prefer externals
pref_key(spec), # respect prefs
spec.name, # group by name
reverse_order(spec.versions), # latest version
spec, # natural order
),
)
def choose_virtual_or_external(self, spec: spack.spec.Spec):
"""Given a list of candidate virtual and external packages, try to
find one that is most ABI compatible.
"""
candidates = self._valid_virtuals_and_externals(spec)
if not candidates:
return candidates
# Find the nearest spec in the dag that has a compiler. We'll
# use that spec to calibrate compiler compatibility.
abi_exemplar = find_spec(spec, lambda x: x.compiler)
if abi_exemplar is None:
abi_exemplar = spec.root
# Sort candidates from most to least compatibility.
# We reverse because True > False.
# Sort is stable, so candidates keep their order.
return sorted(
candidates,
reverse=True,
key=lambda spec: (
_abi.compatible(spec, abi_exemplar, loose=True),
_abi.compatible(spec, abi_exemplar),
),
)
def concretize_version(self, spec):
"""If the spec is already concrete, return. Otherwise take
the preferred version from spackconfig, and default to the package's
version if there are no available versions.
TODO: In many cases we probably want to look for installed
versions of each package and use an installed version
if we can link to it. The policy implemented here will
tend to rebuild a lot of stuff because it will prefer
a compiler in the spec to any compiler that already-
installed things were built with. There is likely
some better policy that finds some middle ground
between these two extremes.
"""
# return if already concrete.
if spec.versions.concrete:
return False
# List of versions we could consider, in sorted order
pkg_versions = spec.package_class.versions
usable = [v for v in pkg_versions if any(v.intersects(sv) for sv in spec.versions)]
yaml_prefs = PackagePrefs(spec.name, "version")
# The keys below show the order of precedence of factors used
# to select a version when concretizing. The item with
# the "largest" key will be selected.
#
# NOTE: When COMPARING VERSIONS, the '@develop' version is always
# larger than other versions. BUT when CONCRETIZING,
# the largest NON-develop version is selected by default.
keyfn = lambda v: (
# ------- Special direction from the user
# Respect order listed in packages.yaml
-yaml_prefs(v),
# The preferred=True flag (packages or packages.yaml or both?)
pkg_versions.get(v).get("preferred", False),
# ------- Regular case: use latest non-develop version by default.
# Avoid @develop version, which would otherwise be the "largest"
# in straight version comparisons
not v.isdevelop(),
# Compare the version itself
# This includes the logic:
# a) develop > everything (disabled by "not v.isdevelop()" above)
# b) numeric > non-numeric
# c) Numeric or string comparison
v,
)
usable.sort(key=keyfn, reverse=True)
if usable:
spec.versions = ver([usable[0]])
else:
# We don't know of any SAFE versions that match the given
# spec. Grab the spec's versions and grab the highest
# *non-open* part of the range of versions it specifies.
# Someone else can raise an error if this happens,
# e.g. when we go to fetch it and don't know how. But it
# *might* work.
if not spec.versions or spec.versions == VersionList([":"]):
raise NoValidVersionError(spec)
else:
last = spec.versions[-1]
if isinstance(last, ClosedOpenRange):
range_as_version = VersionList([last]).concrete_range_as_version
if range_as_version:
spec.versions = ver([range_as_version])
else:
raise NoValidVersionError(spec)
else:
spec.versions = ver([last])
return True # Things changed
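The precedence encoded in `keyfn` above can be demonstrated with a small standalone stand-in for versions (the data and the tuple stand-in are made up; real Spack versions compare semantically and include the preference terms omitted here):

# Stand-in versions: (name, is_develop)
candidates = [("develop", True), ("1.8.2", False), ("1.9.0", False)]

def keyfn(v):
    name, is_develop = v
    # "not develop" outranks raw version ordering, so the largest
    # non-develop version wins even though develop sorts highest
    return (not is_develop, tuple(int(x) for x in name.split(".")) if not is_develop else ())

best = max(candidates, key=keyfn)
assert best[0] == "1.9.0"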
def concretize_architecture(self, spec):
"""If the spec is empty provide the defaults of the platform. If the
architecture is not a string type, then check if either the platform,
target or operating system are concretized. If any of the fields are
changed then return True. If everything is concretized (i.e. the
architecture attribute is a namedtuple of classes) then return False.
If the target is a string type, then convert the string into a
concretized architecture. If it has no architecture and the root of the
DAG has an architecture, then use the root otherwise use the defaults
on the platform.
"""
# ensure type safety for the architecture
if spec.architecture is None:
spec.architecture = spack.spec.ArchSpec()
if spec.architecture.concrete:
return False
# Get platform of nearest spec with a platform, including spec
# If spec has a platform, easy
if spec.architecture.platform:
new_plat = spack.platforms.by_name(spec.architecture.platform)
else:
# Else if anyone else has a platform, take the closest one
# Search up, then down, along build/link deps first
# Then any nearest. Algorithm from compilerspec search
platform_spec = find_spec(spec, lambda x: x.architecture and x.architecture.platform)
if platform_spec:
new_plat = spack.platforms.by_name(platform_spec.architecture.platform)
else:
# If no platform anywhere in this spec, grab the default
new_plat = spack.platforms.host()
# Get nearest spec with relevant platform and an os
# Generally, same algorithm as finding platform, except we only
# consider specs that have a platform
if spec.architecture.os:
new_os = spec.architecture.os
else:
new_os_spec = find_spec(
spec,
lambda x: (
x.architecture
and x.architecture.platform == str(new_plat)
and x.architecture.os
),
)
if new_os_spec:
new_os = new_os_spec.architecture.os
else:
new_os = new_plat.operating_system("default_os")
# Get the nearest spec with relevant platform and a target
# Generally, same algorithm as finding os
curr_target = None
if spec.architecture.target:
curr_target = spec.architecture.target
if spec.architecture.target and spec.architecture.target_concrete:
new_target = spec.architecture.target
else:
new_target_spec = find_spec(
spec,
lambda x: (
x.architecture
and x.architecture.platform == str(new_plat)
and x.architecture.target
and x.architecture.target != curr_target
),
)
if new_target_spec:
if curr_target:
# constrain one target by the other
new_target_arch = spack.spec.ArchSpec(
(None, None, new_target_spec.architecture.target)
)
curr_target_arch = spack.spec.ArchSpec((None, None, curr_target))
curr_target_arch.constrain(new_target_arch)
new_target = curr_target_arch.target
else:
new_target = new_target_spec.architecture.target
else:
# To get default platform, consider package prefs
if PackagePrefs.has_preferred_targets(spec.name):
new_target = self.target_from_package_preferences(spec)
else:
new_target = new_plat.target("default_target")
if curr_target:
# convert to ArchSpec to compare satisfaction
new_target_arch = spack.spec.ArchSpec((None, None, str(new_target)))
curr_target_arch = spack.spec.ArchSpec((None, None, str(curr_target)))
if not new_target_arch.intersects(curr_target_arch):
# new_target is an incorrect guess based on preferences
# and/or default
valid_target_ranges = str(curr_target).split(",")
for target_range in valid_target_ranges:
t_min, t_sep, t_max = target_range.partition(":")
if not t_sep:
new_target = t_min
break
elif t_max:
new_target = t_max
break
elif t_min:
# TODO: something better than picking first
new_target = t_min
break
# Construct new architecture, compute whether spec changed
arch_spec = (str(new_plat), str(new_os), str(new_target))
new_arch = spack.spec.ArchSpec(arch_spec)
spec_changed = new_arch != spec.architecture
spec.architecture = new_arch
return spec_changed
def target_from_package_preferences(self, spec):
"""Returns the preferred target from the package preferences if
there are any.
Args:
spec: abstract spec to be concretized
"""
target_prefs = PackagePrefs(spec.name, "target")
target_specs = [spack.spec.Spec("target=%s" % tname) for tname in archspec.cpu.TARGETS]
def tspec_filter(s):
# Filter target specs by whether the architecture
# family is the current machine type. This ensures
# we only consider x86_64 targets when on an
# x86_64 machine, etc. This may need to change to
# enable setting cross compiling as a default
target = archspec.cpu.TARGETS[str(s.architecture.target)]
arch_family_name = target.family.name
return arch_family_name == platform.machine()
# Sort filtered targets by package prefs
target_specs = list(filter(tspec_filter, target_specs))
target_specs.sort(key=target_prefs)
new_target = target_specs[0].architecture.target
return new_target
def concretize_variants(self, spec):
"""If the spec already has variants filled in, return. Otherwise, add
the user preferences from packages.yaml or the default variants from
the package specification.
"""
changed = False
preferred_variants = PackagePrefs.preferred_variants(spec.name)
pkg_cls = spec.package_class
for name, entry in pkg_cls.variants.items():
variant, when = entry
var = spec.variants.get(name, None)
if var and "*" in var:
# remove variant wildcard before concretizing
# wildcard cannot be combined with other variables in a
# multivalue variant, a concrete variant cannot have the value
# wildcard, and a wildcard does not constrain a variant
spec.variants.pop(name)
if name not in spec.variants and any(spec.satisfies(w) for w in when):
changed = True
if name in preferred_variants:
spec.variants[name] = preferred_variants.get(name)
else:
spec.variants[name] = variant.make_default()
if name in spec.variants and not any(spec.satisfies(w) for w in when):
raise vt.InvalidVariantForSpecError(name, when, spec)
return changed
def concretize_compiler(self, spec):
"""If the spec already has a compiler, we're done. If not, then take
the compiler used for the nearest ancestor with a compiler
spec and use that. If the ancestor's compiler is not
concrete, then use the preferred compiler as specified in
spackconfig.
Intuition: Use the spackconfig default if no package that depends on
this one has a strict compiler requirement. Otherwise, try to
build with the compiler that will be used by libraries that
link to this one, to maximize compatibility.
"""
# Pass on concretizing the compiler if the target or operating system
# is not yet determined
if not spec.architecture.concrete:
# We haven't changed, but other changes need to happen before we
# continue. `return True` here to force concretization to keep
# running.
return True
# Only use a matching compiler if it is of the proper style
# Takes advantage of the proper logic already existing in
# compiler_for_spec. Should think about whether this can be more
# efficient.
def _proper_compiler_style(cspec, aspec):
compilers = spack.compilers.compilers_for_spec(cspec, arch_spec=aspec)
# If the spec passed as argument is concrete we want to check
# the versions match exactly
if (
cspec.concrete
and compilers
and cspec.version not in [c.version for c in compilers]
):
return []
return compilers
if spec.compiler and spec.compiler.concrete:
if self.check_for_compiler_existence and not _proper_compiler_style(
spec.compiler, spec.architecture
):
_compiler_concretization_failure(spec.compiler, spec.architecture)
return False
# Find another spec that has a compiler, or the root if none do
other_spec = spec if spec.compiler else find_spec(spec, lambda x: x.compiler, spec.root)
other_compiler = other_spec.compiler
assert other_spec
# Check if the compiler is already fully specified
if other_compiler and other_compiler.concrete:
if self.check_for_compiler_existence and not _proper_compiler_style(
other_compiler, spec.architecture
):
_compiler_concretization_failure(other_compiler, spec.architecture)
spec.compiler = other_compiler
return True
if other_compiler: # Another node has abstract compiler information
compiler_list = spack.compilers.find_specs_by_arch(other_compiler, spec.architecture)
if not compiler_list:
# We don't have a matching compiler installed
if not self.check_for_compiler_existence:
# Concretize compiler spec versions as a package to build
cpkg_spec = spack.compilers.pkg_spec_for_compiler(other_compiler)
self.concretize_version(cpkg_spec)
spec.compiler = spack.spec.CompilerSpec(
other_compiler.name, cpkg_spec.versions
)
return True
else:
# No compiler with a satisfactory spec was found
raise UnavailableCompilerVersionError(other_compiler, spec.architecture)
else:
# We have no hints to go by, grab any compiler
compiler_list = spack.compilers.all_compiler_specs()
if not compiler_list:
# Spack has no compilers.
raise spack.compilers.NoCompilersError()
# By default, prefer later versions of compilers
compiler_list = sorted(compiler_list, key=lambda x: (x.name, x.version), reverse=True)
ppk = PackagePrefs(other_spec.name, "compiler")
matches = sorted(compiler_list, key=ppk)
# copy concrete version into other_compiler
try:
spec.compiler = next(
c for c in matches if _proper_compiler_style(c, spec.architecture)
).copy()
except StopIteration:
# No compiler with a satisfactory spec has a suitable arch
_compiler_concretization_failure(other_compiler, spec.architecture)
assert spec.compiler.concrete
return True # things changed.
def concretize_compiler_flags(self, spec):
"""
The compiler flags are updated to match those of the spec whose
compiler is used, defaulting to no compiler flags in the spec.
Default specs set at the compiler level will still be added later.
"""
# Pass on concretizing the compiler flags if the target or operating
# system is not set.
if not spec.architecture.concrete:
# We haven't changed, but other changes need to happen before we
# continue. `return True` here to force concretization to keep
# running.
return True
compiler_match = lambda other: (
spec.compiler == other.compiler and spec.architecture == other.architecture
)
ret = False
for flag in spack.spec.FlagMap.valid_compiler_flags():
if flag not in spec.compiler_flags:
spec.compiler_flags[flag] = list()
try:
nearest = next(
p
for p in spec.traverse(direction="parents")
if (compiler_match(p) and (p is not spec) and flag in p.compiler_flags)
)
nearest_flags = nearest.compiler_flags.get(flag, [])
flags = spec.compiler_flags.get(flag, [])
if set(nearest_flags) - set(flags):
spec.compiler_flags[flag] = list(llnl.util.lang.dedupe(nearest_flags + flags))
ret = True
except StopIteration:
pass
# Include the compiler flag defaults from the config files
# This ensures that spack will detect conflicts that stem from a change
# in default compiler flags.
try:
compiler = spack.compilers.compiler_for_spec(spec.compiler, spec.architecture)
except spack.compilers.NoCompilerForSpecError:
if self.check_for_compiler_existence:
raise
return ret
for flag in compiler.flags:
config_flags = compiler.flags.get(flag, [])
flags = spec.compiler_flags.get(flag, [])
spec.compiler_flags[flag] = list(llnl.util.lang.dedupe(config_flags + flags))
if set(config_flags) - set(flags):
ret = True
return ret
def adjust_target(self, spec):
"""Adjusts the target microarchitecture if the compiler is too old
to support the default one.
Args:
spec: spec to be concretized
Returns:
True if spec was modified, False otherwise
"""
# To minimize the impact on performance this function will attempt
# to adjust the target only at the very first call once necessary
# information is set. It will just return False on subsequent calls.
# The way this is achieved is by initializing a generator and making
# this function return the next answer.
if not (spec.architecture and spec.architecture.concrete):
# Not ready, but keep going because we have work to do later
return True
def _make_only_one_call(spec):
yield self._adjust_target(spec)
while True:
yield False
if self._adjust_target_answer_generator is None:
self._adjust_target_answer_generator = _make_only_one_call(spec)
return next(self._adjust_target_answer_generator)
def _adjust_target(self, spec):
"""Assumes that the architecture and the compiler have been
set already and checks if the current target microarchitecture
is the default and can be optimized by the compiler.
If not, downgrades the microarchitecture until a suitable one
is found. If none can be found raise an error.
Args:
spec: spec to be concretized
Returns:
True if any modification happened, False otherwise
"""
import archspec.cpu
# Try to adjust the target only if it is the default
# target for this platform
current_target = spec.architecture.target
current_platform = spack.platforms.by_name(spec.architecture.platform)
default_target = current_platform.target("default_target")
if PackagePrefs.has_preferred_targets(spec.name):
default_target = self.target_from_package_preferences(spec)
if current_target != default_target or (
self.abstract_spec
and self.abstract_spec.architecture
and self.abstract_spec.architecture.concrete
):
return False
try:
current_target.optimization_flags(spec.compiler)
except archspec.cpu.UnsupportedMicroarchitecture:
microarchitecture = current_target.microarchitecture
for ancestor in microarchitecture.ancestors:
candidate = None
try:
candidate = spack.target.Target(ancestor)
candidate.optimization_flags(spec.compiler)
except archspec.cpu.UnsupportedMicroarchitecture:
continue
if candidate is not None:
msg = (
"{0.name}@{0.version} cannot build optimized "
'binaries for "{1}". Using best target possible: '
'"{2}"'
)
msg = msg.format(spec.compiler, current_target, candidate)
tty.warn(msg)
spec.architecture.target = candidate
return True
else:
raise
return False
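The generator trick in `adjust_target` (do the real work exactly once, then always report "no change") is a reusable pattern; a standalone sketch:

def make_only_one_call(work):
    def gen():
        yield work()     # first call: do the expensive adjustment
        while True:
            yield False  # every later call: report "no change"
    return gen()

calls = make_only_one_call(lambda: True)
assert next(calls) is True
assert next(calls) is False
assert next(calls) is False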
@contextmanager
@@ -719,19 +82,6 @@ def find_spec(spec, condition, default=None):
return default # Nothing matched the condition; return default.
def _compiler_concretization_failure(compiler_spec, arch):
# Distinguish between the case that there are compilers for
# the arch but not with the given compiler spec and the case that
# there are no compilers for the arch at all
if not spack.compilers.compilers_for_arch(arch):
available_os_targets = set(
(c.operating_system, c.target) for c in spack.compilers.all_compilers()
)
raise NoCompilersForArchError(arch, available_os_targets)
else:
raise UnavailableCompilerVersionError(compiler_spec, arch)
def concretize_specs_together(*abstract_specs, **kwargs):
"""Given a number of specs as input, tries to concretize them together.
@@ -744,12 +94,6 @@ def concretize_specs_together(*abstract_specs, **kwargs):
Returns:
List of concretized specs
"""
if spack.config.get("config:concretizer", "clingo") == "original":
return _concretize_specs_together_original(*abstract_specs, **kwargs)
return _concretize_specs_together_new(*abstract_specs, **kwargs)
def _concretize_specs_together_new(*abstract_specs, **kwargs):
import spack.solver.asp
allow_deprecated = spack.config.get("config:deprecated", False)
@@ -760,51 +104,6 @@ def _concretize_specs_together_new(*abstract_specs, **kwargs):
return [s.copy() for s in result.specs]
def _concretize_specs_together_original(*abstract_specs, **kwargs):
abstract_specs = [spack.spec.Spec(s) for s in abstract_specs]
tmpdir = tempfile.mkdtemp()
builder = spack.repo.MockRepositoryBuilder(tmpdir)
# Split recursive specs, as it seems the concretizer has issues
# respecting conditions on dependents expressed like
# depends_on('foo ^bar@1.0'), see issue #11160
split_specs = [
dep.copy(deps=False) for spec1 in abstract_specs for dep in spec1.traverse(root=True)
]
builder.add_package(
"concretizationroot", dependencies=[(str(x), None, None) for x in split_specs]
)
with spack.repo.use_repositories(builder.root, override=False):
# Spec from a helper package that depends on all the abstract_specs
concretization_root = spack.spec.Spec("concretizationroot")
concretization_root.concretize(tests=kwargs.get("tests", False))
# Retrieve the direct dependencies
concrete_specs = [concretization_root[spec.name].copy() for spec in abstract_specs]
return concrete_specs
class NoCompilersForArchError(spack.error.SpackError):
def __init__(self, arch, available_os_targets):
err_msg = (
"No compilers found"
" for operating system %s and target %s."
"\nIf previous installations have succeeded, the"
" operating system may have been updated." % (arch.os, arch.target)
)
available_os_target_strs = list()
for operating_system, t in available_os_targets:
os_target_str = "%s-%s" % (operating_system, t) if t else operating_system
available_os_target_strs.append(os_target_str)
err_msg += (
"\nCompilers are defined for the following"
" operating systems and targets:\n\t" + "\n\t".join(available_os_target_strs)
)
super().__init__(err_msg, "Run 'spack compiler find' to add compilers.")
class UnavailableCompilerVersionError(spack.error.SpackError):
"""Raised when there is no available compiler that satisfies a
compiler spec."""
@@ -820,37 +119,3 @@ def __init__(self, compiler_spec, arch=None):
"'spack compilers' to see which compilers are already recognized"
" by spack.",
)
class NoValidVersionError(spack.error.SpackError):
"""Raised when there is no way to have a concrete version for a
particular spec."""
def __init__(self, spec):
super().__init__(
"There are no valid versions for %s that match '%s'" % (spec.name, spec.versions)
)
class InsufficientArchitectureInfoError(spack.error.SpackError):
"""Raised when details on architecture cannot be collected from the
system"""
def __init__(self, spec, archs):
super().__init__(
"Cannot determine necessary architecture information for '%s': %s"
% (spec.name, str(archs))
)
class NoBuildError(spack.error.SpecError):
"""Raised when a package is configured with the buildable option False, but
no satisfactory external versions can be found
"""
def __init__(self, spec):
msg = (
"The spec\n '%s'\n is configured as not buildable, "
"and no matching external installs were found"
)
super().__init__(msg % spec)

View File

@@ -39,7 +39,6 @@
from llnl.util import filesystem, lang, tty
import spack.compilers
import spack.paths
import spack.platforms
import spack.schema
@@ -100,7 +99,6 @@
"dirty": False,
"build_jobs": min(16, cpus_available()),
"build_stage": "$tempdir/spack-stage",
"concretizer": "clingo",
"license_dir": spack.paths.default_license_dir,
}
}
@@ -174,9 +172,7 @@ def _write_section(self, section: str) -> None:
if data is None:
return
# We copy data here to avoid adding defaults at write time
validate_data = copy.deepcopy(data)
validate(validate_data, SECTION_SCHEMAS[section])
validate(data, SECTION_SCHEMAS[section])
try:
filesystem.mkdirp(self.path)
@@ -796,22 +792,27 @@ def config_paths_from_entry_points() -> List[Tuple[str, str]]:
def _add_command_line_scopes(
cfg: Union[Configuration, lang.Singleton], command_line_scopes: List[str]
) -> None:
"""Add additional scopes from the --config-scope argument.
"""Add additional scopes from the --config-scope argument, either envs or dirs."""
import spack.environment.environment as env # circular import
Command line scopes are named after their position in the arg list.
"""
for i, path in enumerate(command_line_scopes):
# We ensure that these scopes exist and are readable, as they are
# provided on the command line by the user.
if not os.path.isdir(path):
raise ConfigError(f"config scope is not a directory: '{path}'")
elif not os.access(path, os.R_OK):
raise ConfigError(f"config scope is not readable: '{path}'")
name = f"cmd_scope_{i}"
# name based on order on the command line
name = f"cmd_scope_{i:d}"
cfg.push_scope(DirectoryConfigScope(name, path, writable=False))
_add_platform_scope(cfg, name, path, writable=False)
if env.exists(path): # managed environment
manifest = env.EnvironmentManifestFile(env.root(path))
elif env.is_env_dir(path): # anonymous environment
manifest = env.EnvironmentManifestFile(path)
elif os.path.isdir(path): # directory with config files
cfg.push_scope(DirectoryConfigScope(name, path, writable=False))
_add_platform_scope(cfg, name, path, writable=False)
continue
else:
raise ConfigError(f"Invalid configuration scope: {path}")
for scope in manifest.env_config_scopes:
scope.name = f"{name}:{scope.name}"
scope.writable = False
cfg.push_scope(scope)
def create() -> Configuration:
@@ -1075,11 +1076,8 @@ def validate(
"""
import jsonschema
# Validate a copy to avoid adding defaults
# This allows us to round-trip data without adding to it.
test_data = syaml.deepcopy(data)
try:
spack.schema.Validator(schema).validate(test_data)
spack.schema.Validator(schema).validate(data)
except jsonschema.ValidationError as e:
if hasattr(e.instance, "lc"):
line_number = e.instance.lc.line + 1
@@ -1088,7 +1086,7 @@ def validate(
raise ConfigFormatError(e, data, filename, line_number) from e
# return the validated data so that we can access the raw data
# mostly relevant for environments
return test_data
return data
def read_config_file(

View File

@@ -2,7 +2,12 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from .common import DetectedPackage, executable_prefix, update_configuration
from .common import (
DetectedPackage,
executable_prefix,
set_virtuals_nonbuildable,
update_configuration,
)
from .path import by_path, executables_in_path
from .test import detection_tests
@@ -12,5 +17,6 @@
"executables_in_path",
"executable_prefix",
"update_configuration",
"set_virtuals_nonbuildable",
"detection_tests",
]

View File

@@ -136,10 +136,10 @@ def path_to_dict(search_paths: List[str]):
# entry overrides later entries
for search_path in reversed(search_paths):
try:
for lib in os.listdir(search_path):
lib_path = os.path.join(search_path, lib)
if llnl.util.filesystem.is_readable_file(lib_path):
path_to_lib[lib_path] = lib
with os.scandir(search_path) as entries:
path_to_lib.update(
{entry.path: entry.name for entry in entries if entry.is_file()}
)
except OSError as e:
msg = f"cannot scan '{search_path}' for external software: {str(e)}"
llnl.util.tty.debug(msg)
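The switch to `os.scandir` above avoids a separate per-entry check on top of `os.listdir`, since `entry.is_file()` can reuse directory-entry type information where the OS provides it; a standalone sketch of the resulting path-to-name map:

import os

def files_in(search_path: str) -> dict:
    try:
        with os.scandir(search_path) as entries:
            return {entry.path: entry.name for entry in entries if entry.is_file()}
    except OSError:
        return {}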
@@ -252,6 +252,27 @@ def update_configuration(
return all_new_specs
def set_virtuals_nonbuildable(virtuals: Set[str], scope: Optional[str] = None) -> List[str]:
"""Update packages:virtual:buildable:False for the provided virtual packages, if the property
is not set by the user. Returns the list of virtual packages that have been updated."""
packages = spack.config.get("packages")
new_config = {}
for virtual in virtuals:
# If the user has set the buildable prop do not override it
if virtual in packages and "buildable" in packages[virtual]:
continue
new_config[virtual] = {"buildable": False}
# Update the provided scope
spack.config.set(
"packages",
spack.config.merge_yaml(spack.config.get("packages", scope=scope), new_config),
scope=scope,
)
return list(new_config.keys())
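A standalone sketch of the update computed above: only virtuals without a user-set `buildable` property receive `buildable: False` (the plain dict mimics the packages.yaml shape; the merge-and-write step is omitted):

def nonbuildable_updates(packages: dict, virtuals: set) -> dict:
    # Skip any virtual whose buildable property the user has already set
    return {
        v: {"buildable": False}
        for v in virtuals
        if not (v in packages and "buildable" in packages[v])
    }

packages = {"mpi": {"buildable": True}, "blas": {}}
assert nonbuildable_updates(packages, {"mpi", "blas", "lapack"}) == {
    "blas": {"buildable": False},
    "lapack": {"buildable": False},
}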
def _windows_drive() -> str:
"""Return Windows drive string extracted from the PROGRAMFILES environment variable,
which is guaranteed to be defined for all logins.

View File

@@ -12,7 +12,7 @@
import re
import sys
import warnings
from typing import Dict, List, Optional, Set, Tuple
from typing import Dict, Iterable, List, Optional, Set, Tuple, Type
import llnl.util.filesystem
import llnl.util.lang
@@ -187,7 +187,7 @@ def libraries_in_windows_paths(path_hints: Optional[List[str]] = None) -> Dict[s
return path_to_dict(search_paths)
def _group_by_prefix(paths: Set[str]) -> Dict[str, Set[str]]:
def _group_by_prefix(paths: List[str]) -> Dict[str, Set[str]]:
groups = collections.defaultdict(set)
for p in paths:
groups[os.path.dirname(p)].add(p)
@@ -200,7 +200,7 @@ class Finder:
def default_path_hints(self) -> List[str]:
return []
def search_patterns(self, *, pkg: "spack.package_base.PackageBase") -> List[str]:
def search_patterns(self, *, pkg: Type["spack.package_base.PackageBase"]) -> List[str]:
"""Returns the list of patterns used to match candidate files.
Args:
@@ -226,7 +226,7 @@ def prefix_from_path(self, *, path: str) -> str:
raise NotImplementedError("must be implemented by derived classes")
def detect_specs(
self, *, pkg: "spack.package_base.PackageBase", paths: List[str]
self, *, pkg: Type["spack.package_base.PackageBase"], paths: List[str]
) -> List[DetectedPackage]:
"""Given a list of files matching the search patterns, returns a list of detected specs.
@@ -243,7 +243,9 @@ def detect_specs(
return []
result = []
for candidate_path, items_in_prefix in sorted(_group_by_prefix(set(paths)).items()):
for candidate_path, items_in_prefix in _group_by_prefix(
llnl.util.lang.dedupe(paths)
).items():
# TODO: multiple instances of a package can live in the same
# prefix, and a package implementation can return multiple specs
# for one prefix, but without additional details (e.g. about the
@@ -299,19 +301,17 @@ def detect_specs(
return result
def find(
self, *, pkg_name: str, initial_guess: Optional[List[str]] = None
self, *, pkg_name: str, repository, initial_guess: Optional[List[str]] = None
) -> List[DetectedPackage]:
"""For a given package, returns a list of detected specs.
Args:
pkg_name: package being detected
initial_guess: initial list of paths to search from the caller
if None, default paths are searched. If this
is an empty list, nothing will be searched.
repository: repository to retrieve the package
initial_guess: initial list of paths to search from the caller if None, default paths
are searched. If this is an empty list, nothing will be searched.
"""
import spack.repo
pkg_cls = spack.repo.PATH.get_pkg_class(pkg_name)
pkg_cls = repository.get_pkg_class(pkg_name)
patterns = self.search_patterns(pkg=pkg_cls)
if not patterns:
return []
@@ -327,7 +327,7 @@ class ExecutablesFinder(Finder):
def default_path_hints(self) -> List[str]:
return spack.util.environment.get_path("PATH")
def search_patterns(self, *, pkg: "spack.package_base.PackageBase") -> List[str]:
def search_patterns(self, *, pkg: Type["spack.package_base.PackageBase"]) -> List[str]:
result = []
if hasattr(pkg, "executables") and hasattr(pkg, "platform_executables"):
result = pkg.platform_executables()
@@ -335,13 +335,10 @@ def search_patterns(self, *, pkg: "spack.package_base.PackageBase") -> List[str]
def candidate_files(self, *, patterns: List[str], paths: List[str]) -> List[str]:
executables_by_path = executables_in_path(path_hints=paths)
patterns = [re.compile(x) for x in patterns]
result = []
for compiled_re in patterns:
for path, exe in executables_by_path.items():
if compiled_re.search(exe):
result.append(path)
return list(sorted(set(result)))
joined_pattern = re.compile(r"|".join(patterns))
result = [path for path, exe in executables_by_path.items() if joined_pattern.search(exe)]
result.sort()
return result
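Joining the patterns into one alternation lets the regex engine scan each executable name once instead of once per pattern, which is the micro-optimization this hunk implements. A small sketch of the before/after behavior (the patterns and paths are made up):

```python
import re

patterns = [r"^gcc(-\d+)?$", r"^clang(-\d+)?$"]
executables_by_path = {"/usr/bin/gcc-12": "gcc-12", "/usr/bin/vim": "vim"}

# Before: len(patterns) scans per name.
slow = sorted(
    {p for rx in map(re.compile, patterns)
       for p, exe in executables_by_path.items() if rx.search(exe)}
)

# After: a single alternation, one scan per name.
joined = re.compile("|".join(patterns))
fast = sorted(p for p, exe in executables_by_path.items() if joined.search(exe))

assert slow == fast == ["/usr/bin/gcc-12"]
```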
def prefix_from_path(self, *, path: str) -> str:
result = executable_prefix(path)
@@ -356,7 +353,7 @@ class LibrariesFinder(Finder):
DYLD_LIBRARY_PATH, DYLD_FALLBACK_LIBRARY_PATH, and standard system library paths
"""
def search_patterns(self, *, pkg: "spack.package_base.PackageBase") -> List[str]:
def search_patterns(self, *, pkg: Type["spack.package_base.PackageBase"]) -> List[str]:
result = []
if hasattr(pkg, "libraries"):
result = pkg.libraries
@@ -385,7 +382,7 @@ def prefix_from_path(self, *, path: str) -> str:
def by_path(
packages_to_search: List[str],
packages_to_search: Iterable[str],
*,
path_hints: Optional[List[str]] = None,
max_workers: Optional[int] = None,
@@ -399,19 +396,28 @@ def by_path(
path_hints: initial list of paths to be searched
max_workers: maximum number of workers to search for packages in parallel
"""
import spack.repo
# TODO: Packages should be able to define both .libraries and .executables in the future
# TODO: determine_spec_details should get all relevant libraries and executables in one call
executables_finder, libraries_finder = ExecutablesFinder(), LibrariesFinder()
detected_specs_by_package: Dict[str, Tuple[concurrent.futures.Future, ...]] = {}
result = collections.defaultdict(list)
repository = spack.repo.PATH.ensure_unwrapped()
with concurrent.futures.ProcessPoolExecutor(max_workers=max_workers) as executor:
for pkg in packages_to_search:
executable_future = executor.submit(
executables_finder.find, pkg_name=pkg, initial_guess=path_hints
executables_finder.find,
pkg_name=pkg,
initial_guess=path_hints,
repository=repository,
)
library_future = executor.submit(
libraries_finder.find, pkg_name=pkg, initial_guess=path_hints
libraries_finder.find,
pkg_name=pkg,
initial_guess=path_hints,
repository=repository,
)
detected_specs_by_package[pkg] = executable_future, library_future
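Threading the repository through to each `find` call is what makes the work picklable for `ProcessPoolExecutor` (see the `__reduce__` methods added to `Repo`/`RepoPath` further below). A toy sketch of the submit/collect shape, with a stand-in `find` function instead of the real finders:

```python
import collections
import concurrent.futures

def find(pkg_name, repository, initial_guess=None):
    # Stand-in for Finder.find: anything picklable can cross the process
    # boundary, which is what the repository marshaling enables.
    return [f"{pkg_name} (via {repository})"]

def by_path(packages_to_search, repository, max_workers=None):
    futures = {}
    result = collections.defaultdict(list)
    with concurrent.futures.ProcessPoolExecutor(max_workers=max_workers) as executor:
        for pkg in packages_to_search:
            futures[pkg] = executor.submit(find, pkg, repository)
    for pkg, future in futures.items():
        result[pkg].extend(future.result())
    return dict(result)

if __name__ == "__main__":  # required for process pools on spawn platforms
    print(by_path(["cmake", "gcc"], repository="builtin"))
```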


@@ -81,7 +81,17 @@ class OpenMpi(Package):
]
#: These are variant names used by Spack internally; packages can't use them
reserved_names = ["patches", "dev_path"]
reserved_names = [
"arch",
"architecture",
"dev_path",
"namespace",
"operating_system",
"os",
"patches",
"platform",
"target",
]
#: Names of possible directives. This list is mostly populated using the @directive decorator.
#: Some directives leverage others and in that case are not automatically added.
@@ -90,14 +100,14 @@ class OpenMpi(Package):
_patch_order_index = 0
SpecType = Union["spack.spec.Spec", str]
SpecType = str
DepType = Union[Tuple[str, ...], str]
WhenType = Optional[Union["spack.spec.Spec", str, bool]]
Patcher = Callable[[Union["spack.package_base.PackageBase", Dependency]], None]
PatchesType = Optional[Union[Patcher, str, List[Union[Patcher, str]]]]
SUPPORTED_LANGUAGES = ("fortran", "cxx")
SUPPORTED_LANGUAGES = ("fortran", "cxx", "c")
def _make_when_spec(value: WhenType) -> Optional["spack.spec.Spec"]:
@@ -475,7 +485,7 @@ def _execute_version(pkg, ver, **kwargs):
def _depends_on(
pkg: "spack.package_base.PackageBase",
spec: SpecType,
spec: "spack.spec.Spec",
*,
when: WhenType = None,
type: DepType = dt.DEFAULT_TYPES,
@@ -485,11 +495,10 @@ def _depends_on(
if not when_spec:
return
dep_spec = spack.spec.Spec(spec)
if not dep_spec.name:
raise DependencyError("Invalid dependency specification in package '%s':" % pkg.name, spec)
if pkg.name == dep_spec.name:
raise CircularReferenceError("Package '%s' cannot depend on itself." % pkg.name)
if not spec.name:
raise DependencyError(f"Invalid dependency specification in package '{pkg.name}':", spec)
if pkg.name == spec.name:
raise CircularReferenceError(f"Package '{pkg.name}' cannot depend on itself.")
depflag = dt.canonicalize(type)
@@ -505,7 +514,7 @@ def _depends_on(
# ensure `Spec.virtual` is a valid thing to call in a directive.
# For now, we comment out the following check to allow for virtual packages
# with package files.
# if patches and dep_spec.virtual:
# if patches and spec.virtual:
# raise DependencyPatchError("Cannot patch a virtual dependency.")
# ensure patches is a list
@@ -520,13 +529,13 @@ def _depends_on(
# this is where we actually add the dependency to this package
deps_by_name = pkg.dependencies.setdefault(when_spec, {})
dependency = deps_by_name.get(dep_spec.name)
dependency = deps_by_name.get(spec.name)
if not dependency:
dependency = Dependency(pkg, dep_spec, depflag=depflag)
deps_by_name[dep_spec.name] = dependency
dependency = Dependency(pkg, spec, depflag=depflag)
deps_by_name[spec.name] = dependency
else:
dependency.spec.constrain(dep_spec, deps=False)
dependency.spec.constrain(spec, deps=False)
dependency.depflag |= depflag
# apply patches to the dependency
@@ -591,12 +600,13 @@ def depends_on(
@see The section "Dependency specs" in the Spack Packaging Guide.
"""
if spack.spec.Spec(spec).name in SUPPORTED_LANGUAGES:
dep_spec = spack.spec.Spec(spec)
if dep_spec.name in SUPPORTED_LANGUAGES:
assert type == "build", "languages must be of 'build' type"
return _language(lang_spec_str=spec, when=when)
def _execute_depends_on(pkg: "spack.package_base.PackageBase"):
_depends_on(pkg, spec, when=when, type=type, patches=patches)
_depends_on(pkg, dep_spec, when=when, type=type, patches=patches)
return _execute_depends_on
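With `"c"` joining `SUPPORTED_LANGUAGES`, a language dependency is declared like any build dependency and rerouted to `_language` before `_depends_on` ever runs. A hypothetical package sketch (the name, URL, and checksum are placeholders):

```python
# Hypothetical package file; name, URL, and checksum are placeholders.
from spack.package import *

class Foo(Package):
    homepage = "https://example.com/foo"
    url = "https://example.com/foo-1.0.tar.gz"

    version("1.0", sha256="0" * 64)

    depends_on("c", type="build")    # handled by _language(), not _depends_on()
    depends_on("cxx", type="build")  # languages must be of 'build' type
    depends_on("zlib-api")           # a normal dependency goes through _depends_on()
```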
@@ -666,25 +676,24 @@ def extends(spec, when=None, type=("build", "run"), patches=None):
keyword arguments can be passed to extends() so that extension
packages can pass parameters to the extendee's extension
mechanism.
"""
mechanism."""
def _execute_extends(pkg):
when_spec = _make_when_spec(when)
if not when_spec:
return
_depends_on(pkg, spec, when=when, type=type, patches=patches)
spec_obj = spack.spec.Spec(spec)
dep_spec = spack.spec.Spec(spec)
_depends_on(pkg, dep_spec, when=when, type=type, patches=patches)
# When extending python, also add a dependency on python-venv. This is done so that
# Spack environment views are Python virtual environments.
if spec_obj.name == "python" and not pkg.name == "python-venv":
_depends_on(pkg, "python-venv", when=when, type=("build", "run"))
if dep_spec.name == "python" and not pkg.name == "python-venv":
_depends_on(pkg, spack.spec.Spec("python-venv"), when=when, type=("build", "run"))
# TODO: the values of the extendees dictionary are not used. Remove in next refactor.
pkg.extendees[spec_obj.name] = (spec_obj, None)
pkg.extendees[dep_spec.name] = (dep_spec, None)
return _execute_extends
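The practical consequence of this hunk: any package that calls `extends("python")` now implicitly gains a `python-venv` build/run dependency, so environment views act like Python virtual environments. A hypothetical sketch:

```python
# Hypothetical extension package; name, URL, and checksum are placeholders.
from spack.package import *

class ExampleTool(Package):
    homepage = "https://example.com/example-tool"
    url = "https://example.com/example-tool-1.0.tar.gz"

    version("1.0", sha256="0" * 64)

    # After this change, the line below also implies
    # depends_on("python-venv", type=("build", "run")).
    extends("python")
```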


@@ -5,7 +5,7 @@
import collections
import collections.abc
import contextlib
import copy
import errno
import os
import pathlib
import re
@@ -269,9 +269,7 @@ def root(name):
def exists(name):
"""Whether an environment with this name exists or not."""
if not valid_env_name(name):
return False
return os.path.isdir(root(name))
return valid_env_name(name) and os.path.isdir(_root(name))
def active(name):
@@ -530,8 +528,8 @@ def _read_yaml(str_or_file):
)
filename = getattr(str_or_file, "name", None)
default_data = spack.config.validate(data, spack.schema.env.schema, filename)
return data, default_data
spack.config.validate(data, spack.schema.env.schema, filename)
return data
def _write_yaml(data, str_or_file):
@@ -791,6 +789,23 @@ def regenerate(self, concrete_roots: List[Spec]) -> None:
root_dirname = os.path.dirname(self.root)
tmp_symlink_name = os.path.join(root_dirname, "._view_link")
# Remove self.root if is it an empty dir, since we need a symlink there. Note that rmdir
# fails if self.root is a symlink.
try:
os.rmdir(self.root)
except (FileNotFoundError, NotADirectoryError):
pass
except OSError as e:
if e.errno == errno.ENOTEMPTY:
msg = "it is a non-empty directory"
elif e.errno == errno.EACCES:
msg = "of insufficient permissions"
else:
raise
raise SpackEnvironmentViewError(
f"The environment view in {self.root} cannot not be created because {msg}."
) from e
# Create a new view
try:
fs.mkdirp(new_root)
@@ -922,7 +937,7 @@ def __init__(self, manifest_dir: Union[str, pathlib.Path]) -> None:
def _load_manifest_file(self):
"""Instantiate and load the manifest file contents into memory."""
with lk.ReadTransaction(self.txlock):
self.manifest = EnvironmentManifestFile(self.path)
self.manifest = EnvironmentManifestFile(self.path, self.name)
with self.manifest.use_config():
self._read()
@@ -959,18 +974,25 @@ def write_transaction(self):
"""Get a write lock context manager for use in a `with` block."""
return lk.WriteTransaction(self.txlock, acquire=self._re_read)
def _process_definition(self, item):
def _process_definition(self, entry):
"""Process a single spec definition item."""
entry = copy.deepcopy(item)
when = _eval_conditional(entry.pop("when", "True"))
assert len(entry) == 1
when_string = entry.get("when")
if when_string is not None:
when = _eval_conditional(when_string)
assert len([x for x in entry if x != "when"]) == 1
else:
when = True
assert len(entry) == 1
if when:
name, spec_list = next(iter(entry.items()))
user_specs = SpecList(name, spec_list, self.spec_lists.copy())
if name in self.spec_lists:
self.spec_lists[name].extend(user_specs)
else:
self.spec_lists[name] = user_specs
for name, spec_list in entry.items():
if name == "when":
continue
user_specs = SpecList(name, spec_list, self.spec_lists.copy())
if name in self.spec_lists:
self.spec_lists[name].extend(user_specs)
else:
self.spec_lists[name] = user_specs
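The rewrite drops the `copy.deepcopy` and the destructive `entry.pop("when", ...)`, reading each definition entry in place; every key other than `"when"` names a spec list. A standalone sketch of the same control flow, with plain lists standing in for `SpecList` and Python's `eval` standing in for `_eval_conditional`:

```python
# Standalone sketch; plain lists stand in for SpecList, eval() for
# _eval_conditional.
def process_definition(entry, spec_lists):
    when_string = entry.get("when")
    when = eval(when_string) if when_string is not None else True
    assert len([key for key in entry if key != "when"]) == 1
    if not when:
        return
    for name, specs in entry.items():
        if name == "when":
            continue
        spec_lists.setdefault(name, []).extend(specs)

spec_lists = {}
process_definition({"when": "1 + 1 == 2", "compilers": ["gcc@12"]}, spec_lists)
process_definition({"when": "1 + 1 == 3", "mpis": ["openmpi"]}, spec_lists)
print(spec_lists)  # {'compilers': ['gcc@12']}
```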
def _process_view(self, env_view: Optional[Union[bool, str, Dict]]):
"""Process view option(s), which can be boolean, string, or None.
@@ -1622,9 +1644,8 @@ def _concretize_separately(self, tests=False):
i += 1
# Ensure we don't try to bootstrap clingo in parallel
if spack.config.get("config:concretizer", "clingo") == "clingo":
with spack.bootstrap.ensure_bootstrap_configuration():
spack.bootstrap.ensure_clingo_importable_or_raise()
with spack.bootstrap.ensure_bootstrap_configuration():
spack.bootstrap.ensure_clingo_importable_or_raise()
# Ensure all the indexes have been built or updated, since
# otherwise the processes in the pool may timeout on waiting
@@ -2753,10 +2774,11 @@ def from_lockfile(manifest_dir: Union[pathlib.Path, str]) -> "EnvironmentManifes
manifest.flush()
return manifest
def __init__(self, manifest_dir: Union[pathlib.Path, str]) -> None:
def __init__(self, manifest_dir: Union[pathlib.Path, str], name: Optional[str] = None) -> None:
self.manifest_dir = pathlib.Path(manifest_dir)
self.name = name or str(manifest_dir)
self.manifest_file = self.manifest_dir / manifest_name
self.scope_name = f"env:{environment_name(self.manifest_dir)}"
self.scope_name = f"env:{self.name}"
self.config_stage_dir = os.path.join(env_subdir_path(manifest_dir), "config")
#: Configuration scopes associated with this environment. Note that these are not
@@ -2768,12 +2790,8 @@ def __init__(self, manifest_dir: Union[pathlib.Path, str]) -> None:
raise SpackEnvironmentError(msg)
with self.manifest_file.open() as f:
raw, with_defaults_added = _read_yaml(f)
self.yaml_content = _read_yaml(f)
#: Pristine YAML content, without defaults being added
self.pristine_yaml_content = raw
#: YAML content with defaults added by Spack, if they're missing
self.yaml_content = with_defaults_added
self.changed = False
def _all_matches(self, user_spec: str) -> List[str]:
@@ -2787,7 +2805,7 @@ def _all_matches(self, user_spec: str) -> List[str]:
ValueError: if no equivalent match is found
"""
result = []
for yaml_spec_str in self.pristine_configuration["specs"]:
for yaml_spec_str in self.configuration["specs"]:
if Spec(yaml_spec_str) == Spec(user_spec):
result.append(yaml_spec_str)
@@ -2802,7 +2820,6 @@ def add_user_spec(self, user_spec: str) -> None:
Args:
user_spec: user spec to be appended
"""
self.pristine_configuration.setdefault("specs", []).append(user_spec)
self.configuration.setdefault("specs", []).append(user_spec)
self.changed = True
@@ -2817,7 +2834,6 @@ def remove_user_spec(self, user_spec: str) -> None:
"""
try:
for key in self._all_matches(user_spec):
self.pristine_configuration["specs"].remove(key)
self.configuration["specs"].remove(key)
except ValueError as e:
msg = f"cannot remove {user_spec} from {self}, no such spec exists"
@@ -2835,7 +2851,6 @@ def override_user_spec(self, user_spec: str, idx: int) -> None:
SpackEnvironmentError: when the user spec cannot be overridden
"""
try:
self.pristine_configuration["specs"][idx] = user_spec
self.configuration["specs"][idx] = user_spec
except ValueError as e:
msg = f"cannot override {user_spec} from {self}"
@@ -2848,10 +2863,10 @@ def set_include_concrete(self, include_concrete: List[str]) -> None:
Args:
include_concrete: list of already existing concrete environments to include
"""
self.pristine_configuration[included_concrete_name] = []
self.configuration[included_concrete_name] = []
for env_path in include_concrete:
self.pristine_configuration[included_concrete_name].append(env_path)
self.configuration[included_concrete_name].append(env_path)
self.changed = True
@@ -2865,14 +2880,13 @@ def add_definition(self, user_spec: str, list_name: str) -> None:
Raises:
SpackEnvironmentError: if no valid definition exists already
"""
defs = self.pristine_configuration.get("definitions", [])
defs = self.configuration.get("definitions", [])
msg = f"cannot add {user_spec} to the '{list_name}' definition, no valid list exists"
for idx, item in self._iterate_on_definitions(defs, list_name=list_name, err_msg=msg):
item[list_name].append(user_spec)
break
self.configuration["definitions"][idx][list_name].append(user_spec)
self.changed = True
def remove_definition(self, user_spec: str, list_name: str) -> None:
@@ -2886,7 +2900,7 @@ def remove_definition(self, user_spec: str, list_name: str) -> None:
SpackEnvironmentError: if the user spec cannot be removed from the list,
or the list does not exist
"""
defs = self.pristine_configuration.get("definitions", [])
defs = self.configuration.get("definitions", [])
msg = (
f"cannot remove {user_spec} from the '{list_name}' definition, "
f"no valid list exists"
@@ -2899,7 +2913,6 @@ def remove_definition(self, user_spec: str, list_name: str) -> None:
except ValueError:
pass
self.configuration["definitions"][idx][list_name].remove(user_spec)
self.changed = True
def override_definition(self, user_spec: str, *, override: str, list_name: str) -> None:
@@ -2914,7 +2927,7 @@ def override_definition(self, user_spec: str, *, override: str, list_name: str)
Raises:
SpackEnvironmentError: if the user spec cannot be overridden
"""
defs = self.pristine_configuration.get("definitions", [])
defs = self.configuration.get("definitions", [])
msg = f"cannot override {user_spec} with {override} in the '{list_name}' definition"
for idx, item in self._iterate_on_definitions(defs, list_name=list_name, err_msg=msg):
@@ -2925,7 +2938,6 @@ def override_definition(self, user_spec: str, *, override: str, list_name: str)
except ValueError:
pass
self.configuration["definitions"][idx][list_name][sub_index] = override
self.changed = True
def _iterate_on_definitions(self, definitions, *, list_name, err_msg):
@@ -2957,7 +2969,6 @@ def set_default_view(self, view: Union[bool, str, pathlib.Path, Dict[str, str]])
True the default view is used for the environment, if False there's no view.
"""
if isinstance(view, dict):
self.pristine_configuration["view"][default_view_name].update(view)
self.configuration["view"][default_view_name].update(view)
self.changed = True
return
@@ -2965,15 +2976,13 @@ def set_default_view(self, view: Union[bool, str, pathlib.Path, Dict[str, str]])
if not isinstance(view, bool):
view = str(view)
self.pristine_configuration["view"] = view
self.configuration["view"] = view
self.changed = True
def remove_default_view(self) -> None:
"""Removes the default view from the manifest file"""
view_data = self.pristine_configuration.get("view")
view_data = self.configuration.get("view")
if isinstance(view_data, collections.abc.Mapping):
self.pristine_configuration["view"].pop(default_view_name)
self.configuration["view"].pop(default_view_name)
self.changed = True
return
@@ -2986,17 +2995,12 @@ def flush(self) -> None:
return
with fs.write_tmp_and_move(os.path.realpath(self.manifest_file)) as f:
_write_yaml(self.pristine_yaml_content, f)
_write_yaml(self.yaml_content, f)
self.changed = False
@property
def pristine_configuration(self):
"""Return the dictionaries in the pristine YAML, without the top level attribute"""
return self.pristine_yaml_content[TOP_LEVEL_KEY]
@property
def configuration(self):
"""Return the dictionaries in the YAML, without the top level attribute"""
"""Return the dictionaries in the pristine YAML, without the top level attribute"""
return self.yaml_content[TOP_LEVEL_KEY]
def __len__(self):
@@ -3033,7 +3037,6 @@ def included_config_scopes(self) -> List[spack.config.ConfigScope]:
# load config scopes added via 'include:', in reverse so that
# highest-precedence scopes are last.
includes = self[TOP_LEVEL_KEY].get("include", [])
env_name = environment_name(self.manifest_dir)
missing = []
for i, config_path in enumerate(reversed(includes)):
# allow paths to contain spack config/environment variables, etc.
@@ -3096,12 +3099,12 @@ def included_config_scopes(self) -> List[spack.config.ConfigScope]:
if os.path.isdir(config_path):
# directories are treated as regular ConfigScopes
config_name = "env:%s:%s" % (env_name, os.path.basename(config_path))
config_name = f"env:{self.name}:{os.path.basename(config_path)}"
tty.debug(f"Creating DirectoryConfigScope {config_name} for '{config_path}'")
scopes.append(spack.config.DirectoryConfigScope(config_name, config_path))
elif os.path.exists(config_path):
# files are assumed to be SingleFileScopes
config_name = "env:%s:%s" % (env_name, config_path)
config_name = f"env:{self.name}:{config_path}"
tty.debug(f"Creating SingleFileScope {config_name} for '{config_path}'")
scopes.append(
spack.config.SingleFileScope(


@@ -30,6 +30,7 @@
import shutil
import urllib.error
import urllib.parse
import urllib.request
from pathlib import PurePath
from typing import List, Optional
@@ -273,10 +274,7 @@ def __init__(self, url=None, checksum=None, **kwargs):
@property
def curl(self):
if not self._curl:
try:
self._curl = which("curl", required=True)
except CommandNotFoundError as exc:
tty.error(str(exc))
self._curl = web_util.require_curl()
return self._curl
def source_id(self):
@@ -297,27 +295,23 @@ def candidate_urls(self):
@_needs_stage
def fetch(self):
if self.archive_file:
tty.debug("Already downloaded {0}".format(self.archive_file))
tty.debug(f"Already downloaded {self.archive_file}")
return
url = None
errors = []
errors: List[Exception] = []
for url in self.candidate_urls:
if not web_util.url_exists(url):
tty.debug("URL does not exist: " + url)
continue
try:
self._fetch_from_url(url)
break
except FailedDownloadError as e:
errors.append(str(e))
for msg in errors:
tty.debug(msg)
errors.extend(e.exceptions)
else:
raise FailedDownloadError(*errors)
if not self.archive_file:
raise FailedDownloadError(url)
raise FailedDownloadError(
RuntimeError(f"Missing archive {self.archive_file} after fetching")
)
def _fetch_from_url(self, url):
if spack.config.get("config:url_fetch_method") == "curl":
@@ -336,19 +330,20 @@ def _check_headers(self, headers):
@_needs_stage
def _fetch_urllib(self, url):
save_file = self.stage.save_filename
tty.msg("Fetching {0}".format(url))
# Run urllib but grab the mime type from the http headers
request = urllib.request.Request(url, headers={"User-Agent": web_util.SPACK_USER_AGENT})
try:
url, headers, response = web_util.read_from_url(url)
except web_util.SpackWebError as e:
response = web_util.urlopen(request)
except (TimeoutError, urllib.error.URLError) as e:
# clean up archive on failure.
if self.archive_file:
os.remove(self.archive_file)
if os.path.lexists(save_file):
os.remove(save_file)
msg = "urllib failed to fetch with error {0}".format(e)
raise FailedDownloadError(url, msg)
raise FailedDownloadError(e) from e
tty.msg(f"Fetching {url}")
if os.path.lexists(save_file):
os.remove(save_file)
@@ -356,7 +351,7 @@ def _fetch_urllib(self, url):
with open(save_file, "wb") as _open_file:
shutil.copyfileobj(response, _open_file)
self._check_headers(str(headers))
self._check_headers(str(response.headers))
@_needs_stage
def _fetch_curl(self, url):
@@ -365,7 +360,7 @@ def _fetch_curl(self, url):
if self.stage.save_filename:
save_file = self.stage.save_filename
partial_file = self.stage.save_filename + ".part"
tty.msg("Fetching {0}".format(url))
tty.msg(f"Fetching {url}")
if partial_file:
save_args = [
"-C",
@@ -405,8 +400,8 @@ def _fetch_curl(self, url):
try:
web_util.check_curl_code(curl.returncode)
except spack.error.FetchError as err:
raise spack.fetch_strategy.FailedDownloadError(url, str(err))
except spack.error.FetchError as e:
raise FailedDownloadError(e) from e
self._check_headers(headers)
@@ -554,13 +549,13 @@ def fetch(self):
try:
response = self._urlopen(self.url)
except urllib.error.URLError as e:
except (TimeoutError, urllib.error.URLError) as e:
# clean up archive on failure.
if self.archive_file:
os.remove(self.archive_file)
if os.path.lexists(file):
os.remove(file)
raise FailedDownloadError(self.url, f"Failed to fetch {self.url}: {e}") from e
raise FailedDownloadError(e) from e
if os.path.lexists(file):
os.remove(file)
@@ -1312,35 +1307,41 @@ def __init__(self, *args, **kwargs):
@_needs_stage
def fetch(self):
if self.archive_file:
tty.debug("Already downloaded {0}".format(self.archive_file))
tty.debug(f"Already downloaded {self.archive_file}")
return
parsed_url = urllib.parse.urlparse(self.url)
if parsed_url.scheme != "s3":
raise spack.error.FetchError("S3FetchStrategy can only fetch from s3:// urls.")
tty.debug("Fetching {0}".format(self.url))
basename = os.path.basename(parsed_url.path)
request = urllib.request.Request(
self.url, headers={"User-Agent": web_util.SPACK_USER_AGENT}
)
with working_dir(self.stage.path):
_, headers, stream = web_util.read_from_url(self.url)
try:
response = web_util.urlopen(request)
except (TimeoutError, urllib.error.URLError) as e:
raise FailedDownloadError(e) from e
tty.debug(f"Fetching {self.url}")
with open(basename, "wb") as f:
shutil.copyfileobj(stream, f)
shutil.copyfileobj(response, f)
content_type = web_util.get_header(headers, "Content-type")
content_type = web_util.get_header(response.headers, "Content-type")
if content_type == "text/html":
warn_content_type_mismatch(self.archive_file or "the archive")
if self.stage.save_filename:
llnl.util.filesystem.rename(
os.path.join(self.stage.path, basename), self.stage.save_filename
)
fs.rename(os.path.join(self.stage.path, basename), self.stage.save_filename)
if not self.archive_file:
raise FailedDownloadError(self.url)
raise FailedDownloadError(
RuntimeError(f"Missing archive {self.archive_file} after fetching")
)
@fetcher
@@ -1366,17 +1367,23 @@ def fetch(self):
if parsed_url.scheme != "gs":
raise spack.error.FetchError("GCSFetchStrategy can only fetch from gs:// urls.")
tty.debug("Fetching {0}".format(self.url))
basename = os.path.basename(parsed_url.path)
request = urllib.request.Request(
self.url, headers={"User-Agent": web_util.SPACK_USER_AGENT}
)
with working_dir(self.stage.path):
_, headers, stream = web_util.read_from_url(self.url)
try:
response = web_util.urlopen(request)
except (TimeoutError, urllib.error.URLError) as e:
raise FailedDownloadError(e) from e
tty.debug(f"Fetching {self.url}")
with open(basename, "wb") as f:
shutil.copyfileobj(stream, f)
shutil.copyfileobj(response, f)
content_type = web_util.get_header(headers, "Content-type")
content_type = web_util.get_header(response.headers, "Content-type")
if content_type == "text/html":
warn_content_type_mismatch(self.archive_file or "the archive")
@@ -1385,7 +1392,9 @@ def fetch(self):
os.rename(os.path.join(self.stage.path, basename), self.stage.save_filename)
if not self.archive_file:
raise FailedDownloadError(self.url)
raise FailedDownloadError(
RuntimeError(f"Missing archive {self.archive_file} after fetching")
)
@fetcher
@@ -1722,9 +1731,9 @@ class NoCacheError(spack.error.FetchError):
class FailedDownloadError(spack.error.FetchError):
"""Raised when a download fails."""
def __init__(self, url, msg=""):
super().__init__("Failed to fetch file from URL: %s" % url, msg)
self.url = url
def __init__(self, *exceptions: Exception):
super().__init__("Failed to download")
self.exceptions = exceptions
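With the variadic constructor, every per-URL failure rides along to the caller, which is what lets Spack show the underlying errors on fetch failure instead of a bare summary. A minimal sketch of the aggregate-and-reraise pattern (the URLs and the simulated TimeoutError are placeholders):

```python
# Minimal sketch of the aggregate-and-reraise pattern above.
class FailedDownloadError(Exception):
    def __init__(self, *exceptions):
        super().__init__("Failed to download")
        self.exceptions = exceptions

def fetch(candidate_urls):
    errors = []
    for url in candidate_urls:
        try:
            raise TimeoutError(f"timed out fetching {url}")  # simulated failure
        except TimeoutError as e:
            errors.append(e)
    raise FailedDownloadError(*errors)

try:
    fetch(["https://example.com/a.tar.gz", "https://mirror.example.com/a.tar.gz"])
except FailedDownloadError as e:
    for underlying in e.exceptions:  # unwrap each cause for the user
        print(f"  {type(underlying).__name__}: {underlying}")
```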
class NoArchiveFileError(spack.error.FetchError):


@@ -37,6 +37,12 @@ def __call__(self, spec):
"""Run this hash on the provided spec."""
return spec.spec_hash(self)
def __repr__(self):
return (
f"SpecHashDescriptor(depflag={self.depflag!r}, "
f"package_hash={self.package_hash!r}, name={self.name!r}, override={self.override!r})"
)
#: Spack's deployment hash. Includes all inputs that can affect how a package is built.
dag_hash = SpecHashDescriptor(depflag=dt.BUILD | dt.LINK | dt.RUN, package_hash=True, name="hash")


@@ -23,9 +23,6 @@ def post_install(spec, explicit):
# Push the package to all autopush mirrors
for mirror in spack.mirror.MirrorCollection(binary=True, autopush=True).values():
bindist.push_or_raise(
spec,
mirror.push_url,
bindist.PushOptions(force=True, regenerate_index=False, unsigned=not mirror.signed),
)
signing_key = bindist.select_signing_key() if mirror.signed else None
bindist.push_or_raise([spec], out_url=mirror.push_url, signing_key=signing_key, force=True)
tty.msg(f"{spec.name}: Pushed to build cache: '{mirror.name}'")


@@ -757,6 +757,10 @@ def test_process(pkg: Pb, kwargs):
pkg.tester.status(pkg.spec.name, TestStatus.SKIPPED)
return
# Make sure properly named build-time test methods actually run as
# stand-alone tests.
pkg.run_tests = True
# run test methods from the package and all virtuals it provides
v_names = virtuals(pkg)
test_specs = [pkg.spec] + [spack.spec.Spec(v_name) for v_name in sorted(v_names)]


@@ -444,8 +444,9 @@ def make_argument_parser(**kwargs):
"--config-scope",
dest="config_scopes",
action="append",
metavar="DIR",
help="add a custom configuration scope",
metavar="DIR|ENV",
help="add directory or environment as read-only configuration scope, without activating "
"the environment.",
)
parser.add_argument(
"-d",


@@ -6,7 +6,6 @@
import hashlib
import json
import os
import time
import urllib.error
import urllib.parse
import urllib.request
@@ -43,11 +42,6 @@ def create_tarball(spec: spack.spec.Spec, tarfile_path):
return spack.binary_distribution._do_create_tarball(tarfile_path, spec.prefix, buildinfo)
def _log_upload_progress(digest: Digest, size: int, elapsed: float):
elapsed = max(elapsed, 0.001) # guard against division by zero
tty.info(f"Uploaded {digest} ({elapsed:.2f}s, {size / elapsed / 1024 / 1024:.2f} MB/s)")
def with_query_param(url: str, param: str, value: str) -> str:
"""Add a query parameter to a URL
@@ -141,8 +135,6 @@ def upload_blob(
if not force and blob_exists(ref, digest, _urlopen):
return False
start = time.time()
with open(file, "rb") as f:
file_size = os.fstat(f.fileno()).st_size
@@ -167,7 +159,6 @@ def upload_blob(
# Created the blob in one go.
if response.status == 201:
_log_upload_progress(digest, file_size, time.time() - start)
return True
# Otherwise, do another PUT request.
@@ -191,8 +182,6 @@ def upload_blob(
spack.oci.opener.ensure_status(request, response, 201)
# print elapsed time and # MB/s
_log_upload_progress(digest, file_size, time.time() - start)
return True


@@ -35,6 +35,7 @@
import spack.compilers
import spack.config
import spack.dependency
import spack.deptypes as dt
import spack.directives
import spack.directory_layout
@@ -196,13 +197,12 @@ def __init__(cls, name, bases, attr_dict):
# that "foo" was a possible executable.
# If a package has the executables or libraries attribute then it's
# assumed to be detectable
# assumed to be detectable. Add a tag, so finding them is faster
if hasattr(cls, "executables") or hasattr(cls, "libraries"):
# Append a tag to each detectable package, so that finding them is faster
if hasattr(cls, "tags"):
getattr(cls, "tags").append(DetectablePackageMeta.TAG)
else:
setattr(cls, "tags", [DetectablePackageMeta.TAG])
# To add the tag, we need to copy the tags attribute, and attach it to
# the current class. We don't use append, since it might modify base classes,
# if "tags" is retrieved following the MRO.
cls.tags = getattr(cls, "tags", []) + [DetectablePackageMeta.TAG]
@classmethod
def platform_executables(cls):
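The comment in this hunk is the whole story: `getattr(cls, "tags")` can resolve to a list owned by a base class, so `append` would leak the tag into every sibling. A tiny reproduction of the pitfall and the copy-then-attach fix:

```python
class Base:
    tags = ["detectable"]

class Appending(Base):
    pass

# "tags" resolves to Base.tags through the MRO, so append mutates the base:
Appending.tags.append("oops")
print(Base.tags)  # ['detectable', 'oops']  -- leaked into the base class

class Copying(Base):
    pass

# Copy, then attach to the current class: the base stays untouched.
Copying.tags = getattr(Copying, "tags", []) + ["mine"]
print(Base.tags)     # still ['detectable', 'oops']
print(Copying.tags)  # ['detectable', 'oops', 'mine']
```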


@@ -328,19 +328,26 @@ def next_spec(
if not self.ctx.next_token:
return initial_spec
def add_dependency(dep, **edge_properties):
"""wrapper around root_spec._add_dependency"""
try:
root_spec._add_dependency(dep, **edge_properties)
except spack.error.SpecError as e:
raise SpecParsingError(str(e), self.ctx.current_token, self.literal_str) from e
initial_spec = initial_spec or spack.spec.Spec()
root_spec = SpecNodeParser(self.ctx).parse(initial_spec)
root_spec = SpecNodeParser(self.ctx, self.literal_str).parse(initial_spec)
while True:
if self.ctx.accept(TokenType.START_EDGE_PROPERTIES):
edge_properties = EdgeAttributeParser(self.ctx, self.literal_str).parse()
edge_properties.setdefault("depflag", 0)
edge_properties.setdefault("virtuals", ())
dependency = self._parse_node(root_spec)
root_spec._add_dependency(dependency, **edge_properties)
add_dependency(dependency, **edge_properties)
elif self.ctx.accept(TokenType.DEPENDENCY):
dependency = self._parse_node(root_spec)
root_spec._add_dependency(dependency, depflag=0, virtuals=())
add_dependency(dependency, depflag=0, virtuals=())
else:
break
@@ -348,7 +355,7 @@ def next_spec(
return root_spec
def _parse_node(self, root_spec):
dependency = SpecNodeParser(self.ctx).parse()
dependency = SpecNodeParser(self.ctx, self.literal_str).parse()
if dependency is None:
msg = (
"the dependency sigil and any optional edge attributes must be followed by a "
@@ -367,10 +374,11 @@ def all_specs(self) -> List["spack.spec.Spec"]:
class SpecNodeParser:
"""Parse a single spec node from a stream of tokens"""
__slots__ = "ctx", "has_compiler", "has_version"
__slots__ = "ctx", "has_compiler", "has_version", "literal_str"
def __init__(self, ctx):
def __init__(self, ctx, literal_str):
self.ctx = ctx
self.literal_str = literal_str
self.has_compiler = False
self.has_version = False
@@ -388,7 +396,8 @@ def parse(
if not self.ctx.next_token or self.ctx.expect(TokenType.DEPENDENCY):
return initial_spec
initial_spec = initial_spec or spack.spec.Spec()
if initial_spec is None:
initial_spec = spack.spec.Spec()
# If we start with a package name we have a named spec, we cannot
# accept another package name afterwards in a node
@@ -405,12 +414,21 @@ def parse(
elif self.ctx.accept(TokenType.FILENAME):
return FileParser(self.ctx).parse(initial_spec)
def raise_parsing_error(string: str, cause: Optional[Exception] = None):
"""Raise a spec parsing error with token context."""
raise SpecParsingError(string, self.ctx.current_token, self.literal_str) from cause
def add_flag(name: str, value: str, propagate: bool):
"""Wrapper around ``Spec._add_flag()`` that adds parser context to errors raised."""
try:
initial_spec._add_flag(name, value, propagate)
except Exception as e:
raise_parsing_error(str(e), e)
while True:
if self.ctx.accept(TokenType.COMPILER):
if self.has_compiler:
raise spack.spec.DuplicateCompilerSpecError(
f"{initial_spec} cannot have multiple compilers"
)
raise_parsing_error("Spec cannot have multiple compilers")
compiler_name = self.ctx.current_token.value[1:]
initial_spec.compiler = spack.spec.CompilerSpec(compiler_name.strip(), ":")
@@ -418,9 +436,7 @@ def parse(
elif self.ctx.accept(TokenType.COMPILER_AND_VERSION):
if self.has_compiler:
raise spack.spec.DuplicateCompilerSpecError(
f"{initial_spec} cannot have multiple compilers"
)
raise_parsing_error("Spec cannot have multiple compilers")
compiler_name, compiler_version = self.ctx.current_token.value[1:].split("@")
initial_spec.compiler = spack.spec.CompilerSpec(
@@ -434,9 +450,8 @@ def parse(
or self.ctx.accept(TokenType.VERSION)
):
if self.has_version:
raise spack.spec.MultipleVersionError(
f"{initial_spec} cannot have multiple versions"
)
raise_parsing_error("Spec cannot have multiple versions")
initial_spec.versions = spack.version.VersionList(
[spack.version.from_string(self.ctx.current_token.value[1:])]
)
@@ -445,29 +460,25 @@ def parse(
elif self.ctx.accept(TokenType.BOOL_VARIANT):
variant_value = self.ctx.current_token.value[0] == "+"
initial_spec._add_flag(
self.ctx.current_token.value[1:].strip(), variant_value, propagate=False
)
add_flag(self.ctx.current_token.value[1:].strip(), variant_value, propagate=False)
elif self.ctx.accept(TokenType.PROPAGATED_BOOL_VARIANT):
variant_value = self.ctx.current_token.value[0:2] == "++"
initial_spec._add_flag(
self.ctx.current_token.value[2:].strip(), variant_value, propagate=True
)
add_flag(self.ctx.current_token.value[2:].strip(), variant_value, propagate=True)
elif self.ctx.accept(TokenType.KEY_VALUE_PAIR):
match = SPLIT_KVP.match(self.ctx.current_token.value)
assert match, "SPLIT_KVP and KEY_VALUE_PAIR do not agree."
name, delim, value = match.groups()
initial_spec._add_flag(name, strip_quotes_and_unescape(value), propagate=False)
name, _, value = match.groups()
add_flag(name, strip_quotes_and_unescape(value), propagate=False)
elif self.ctx.accept(TokenType.PROPAGATED_KEY_VALUE_PAIR):
match = SPLIT_KVP.match(self.ctx.current_token.value)
assert match, "SPLIT_KVP and PROPAGATED_KEY_VALUE_PAIR do not agree."
name, delim, value = match.groups()
initial_spec._add_flag(name, strip_quotes_and_unescape(value), propagate=True)
name, _, value = match.groups()
add_flag(name, strip_quotes_and_unescape(value), propagate=True)
elif self.ctx.expect(TokenType.DAG_HASH):
if initial_spec.abstract_hash:


@@ -9,7 +9,7 @@
import os.path
import pathlib
import sys
from typing import Any, Dict, Optional, Tuple, Type
from typing import Any, Dict, Optional, Tuple, Type, Union
import llnl.util.filesystem
from llnl.url import allowed_archive
@@ -65,6 +65,9 @@ def apply_patch(
patch(*args)
PatchPackageType = Union["spack.package_base.PackageBase", Type["spack.package_base.PackageBase"]]
class Patch:
"""Base class for patches.
@@ -77,7 +80,7 @@ class Patch:
def __init__(
self,
pkg: "spack.package_base.PackageBase",
pkg: PatchPackageType,
path_or_url: str,
level: int,
working_dir: str,
@@ -159,7 +162,7 @@ class FilePatch(Patch):
def __init__(
self,
pkg: "spack.package_base.PackageBase",
pkg: PatchPackageType,
relative_path: str,
level: int,
working_dir: str,
@@ -183,7 +186,7 @@ def __init__(
abs_path: Optional[str] = None
# At different times we call FilePatch on instances and classes
pkg_cls = pkg if inspect.isclass(pkg) else pkg.__class__
for cls in inspect.getmro(pkg_cls):
for cls in inspect.getmro(pkg_cls): # type: ignore
if not hasattr(cls, "module"):
# We've gone too far up the MRO
break
@@ -242,7 +245,7 @@ class UrlPatch(Patch):
def __init__(
self,
pkg: "spack.package_base.PackageBase",
pkg: PatchPackageType,
url: str,
level: int = 1,
*,
@@ -361,8 +364,9 @@ def from_dict(
"""
repository = repository or spack.repo.PATH
owner = dictionary.get("owner")
if "owner" not in dictionary:
raise ValueError("Invalid patch dictionary: %s" % dictionary)
if owner is None:
raise ValueError(f"Invalid patch dictionary: {dictionary}")
assert isinstance(owner, str)
pkg_cls = repository.get_pkg_class(owner)
if "url" in dictionary:


@@ -149,12 +149,12 @@ def current_repository(self, value):
@contextlib.contextmanager
def switch_repo(self, substitute: "RepoType"):
"""Switch the current repository list for the duration of the context manager."""
old = self.current_repository
old = self._repo
try:
self.current_repository = substitute
self._repo = substitute
yield
finally:
self.current_repository = old
self._repo = old
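Saving and restoring the raw `_repo` attribute sidesteps whatever the `current_repository` property setter does, while keeping the same context-manager shape. A hedged usage sketch (the repository path is a placeholder, and `MISC_CACHE` is assumed to be an acceptable file cache here):

```python
# Hedged usage sketch; the repository path is a placeholder.
import spack.caches
import spack.repo

my_repo = spack.repo.Repo("/path/to/my/repo", cache=spack.caches.MISC_CACHE)

finder = spack.repo.REPOS_FINDER
with finder.switch_repo(my_repo):
    ...  # spack.pkg.* imports resolve against my_repo inside the block
# the previous repository is restored on exit, even on error
```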
def find_spec(self, fullname, python_path, target=None):
# "target" is not None only when calling importlib.reload()
@@ -590,7 +590,7 @@ def __init__(
self,
package_checker: FastPackageChecker,
namespace: str,
cache: spack.caches.FileCacheType,
cache: "spack.caches.FileCacheType",
):
self.checker = package_checker
self.packages_path = self.checker.packages_path
@@ -675,32 +675,44 @@ class RepoPath:
repository.
Args:
repos (list): list Repo objects or paths to put in this RepoPath
repos: list of Repo objects or paths to put in this RepoPath
cache: file cache associated with this repository
overrides: dict mapping package name to class attribute overrides for that package
"""
def __init__(self, *repos, cache, overrides=None):
self.repos = []
def __init__(
self,
*repos: Union[str, "Repo"],
cache: Optional["spack.caches.FileCacheType"],
overrides: Optional[Dict[str, Any]] = None,
) -> None:
self.repos: List[Repo] = []
self.by_namespace = nm.NamespaceTrie()
self._provider_index = None
self._patch_index = None
self._tag_index = None
self._provider_index: Optional[spack.provider_index.ProviderIndex] = None
self._patch_index: Optional[spack.patch.PatchCache] = None
self._tag_index: Optional[spack.tag.TagIndex] = None
# Add each repo to this path.
for repo in repos:
try:
if isinstance(repo, str):
assert cache is not None, "cache must hold a value, when repo is a string"
repo = Repo(repo, cache=cache, overrides=overrides)
repo.finder(self)
self.put_last(repo)
except RepoError as e:
tty.warn(
"Failed to initialize repository: '%s'." % repo,
f"Failed to initialize repository: '{repo}'.",
e.message,
"To remove the bad repository, run this command:",
" spack repo rm %s" % repo,
f" spack repo rm {repo}",
)
def put_first(self, repo):
def ensure_unwrapped(self) -> "RepoPath":
"""Ensure we unwrap this object from any dynamic wrapper (like Singleton)"""
return self
def put_first(self, repo: "Repo") -> None:
"""Add repo first in the search path."""
if isinstance(repo, RepoPath):
for r in reversed(repo.repos):
@@ -728,50 +740,34 @@ def remove(self, repo):
if repo in self.repos:
self.repos.remove(repo)
def get_repo(self, namespace, default=NOT_PROVIDED):
"""Get a repository by namespace.
Arguments:
namespace:
Look up this namespace in the RepoPath, and return it if found.
Optional Arguments:
default:
If default is provided, return it when the namespace
isn't found. If not, raise an UnknownNamespaceError.
"""
def get_repo(self, namespace: str) -> "Repo":
"""Get a repository by namespace."""
full_namespace = python_package_for_repo(namespace)
if full_namespace not in self.by_namespace:
if default == NOT_PROVIDED:
raise UnknownNamespaceError(namespace)
return default
raise UnknownNamespaceError(namespace)
return self.by_namespace[full_namespace]
def first_repo(self):
def first_repo(self) -> Optional["Repo"]:
"""Get the first repo in precedence order."""
return self.repos[0] if self.repos else None
@llnl.util.lang.memoized
def _all_package_names_set(self, include_virtuals):
def _all_package_names_set(self, include_virtuals) -> Set[str]:
return {name for repo in self.repos for name in repo.all_package_names(include_virtuals)}
@llnl.util.lang.memoized
def _all_package_names(self, include_virtuals):
def _all_package_names(self, include_virtuals: bool) -> List[str]:
"""Return all unique package names in all repositories."""
return sorted(self._all_package_names_set(include_virtuals), key=lambda n: n.lower())
def all_package_names(self, include_virtuals=False):
def all_package_names(self, include_virtuals: bool = False) -> List[str]:
return self._all_package_names(include_virtuals)
def package_path(self, name):
def package_path(self, name: str) -> str:
"""Get path to package.py file for this repo."""
return self.repo_for_pkg(name).package_path(name)
def all_package_paths(self):
def all_package_paths(self) -> Generator[str, None, None]:
for name in self.all_package_names():
yield self.package_path(name)
@@ -787,53 +783,52 @@ def packages_with_tags(self, *tags: str, full: bool = False) -> Set[str]:
for pkg in repo.packages_with_tags(*tags)
}
def all_package_classes(self):
def all_package_classes(self) -> Generator[Type["spack.package_base.PackageBase"], None, None]:
for name in self.all_package_names():
yield self.get_pkg_class(name)
@property
def provider_index(self):
def provider_index(self) -> spack.provider_index.ProviderIndex:
"""Merged ProviderIndex from all Repos in the RepoPath."""
if self._provider_index is None:
self._provider_index = spack.provider_index.ProviderIndex(repository=self)
for repo in reversed(self.repos):
self._provider_index.merge(repo.provider_index)
return self._provider_index
@property
def tag_index(self):
def tag_index(self) -> spack.tag.TagIndex:
"""Merged TagIndex from all Repos in the RepoPath."""
if self._tag_index is None:
self._tag_index = spack.tag.TagIndex(repository=self)
for repo in reversed(self.repos):
self._tag_index.merge(repo.tag_index)
return self._tag_index
@property
def patch_index(self):
def patch_index(self) -> spack.patch.PatchCache:
"""Merged PatchIndex from all Repos in the RepoPath."""
if self._patch_index is None:
self._patch_index = spack.patch.PatchCache(repository=self)
for repo in reversed(self.repos):
self._patch_index.update(repo.patch_index)
return self._patch_index
@autospec
def providers_for(self, vpkg_spec):
def providers_for(self, virtual_spec: "spack.spec.Spec") -> List["spack.spec.Spec"]:
providers = [
spec
for spec in self.provider_index.providers_for(vpkg_spec)
for spec in self.provider_index.providers_for(virtual_spec)
if spec.name in self._all_package_names_set(include_virtuals=False)
]
if not providers:
raise UnknownPackageError(vpkg_spec.fullname)
raise UnknownPackageError(virtual_spec.fullname)
return providers
@autospec
def extensions_for(self, extendee_spec):
def extensions_for(
self, extendee_spec: "spack.spec.Spec"
) -> List["spack.package_base.PackageBase"]:
return [
pkg_cls(spack.spec.Spec(pkg_cls.name))
for pkg_cls in self.all_package_classes()
@@ -844,7 +839,7 @@ def last_mtime(self):
"""Time a package file in this repo was last updated."""
return max(repo.last_mtime() for repo in self.repos)
def repo_for_pkg(self, spec):
def repo_for_pkg(self, spec: Union[str, "spack.spec.Spec"]) -> "Repo":
"""Given a spec, get the repository for its package."""
# We don't @_autospec this function b/c it's called very frequently
# and we want to avoid parsing str's into Specs unnecessarily.
@@ -869,17 +864,20 @@ def repo_for_pkg(self, spec):
return repo
# If the package isn't in any repo, return the one with
# highest precedence. This is for commands like `spack edit`
# highest precedence. This is for commands like `spack edit`
# that can operate on packages that don't exist yet.
return self.first_repo()
selected = self.first_repo()
if selected is None:
raise UnknownPackageError(name)
return selected
def get(self, spec):
def get(self, spec: "spack.spec.Spec") -> "spack.package_base.PackageBase":
"""Returns the package associated with the supplied spec."""
msg = "RepoPath.get can only be called on concrete specs"
assert isinstance(spec, spack.spec.Spec) and spec.concrete, msg
return self.repo_for_pkg(spec).get(spec)
def get_pkg_class(self, pkg_name):
def get_pkg_class(self, pkg_name: str) -> Type["spack.package_base.PackageBase"]:
"""Find a class for the spec's package and return the class object."""
return self.repo_for_pkg(pkg_name).get_pkg_class(pkg_name)
@@ -892,26 +890,26 @@ def dump_provenance(self, spec, path):
"""
return self.repo_for_pkg(spec).dump_provenance(spec, path)
def dirname_for_package_name(self, pkg_name):
def dirname_for_package_name(self, pkg_name: str) -> str:
return self.repo_for_pkg(pkg_name).dirname_for_package_name(pkg_name)
def filename_for_package_name(self, pkg_name):
def filename_for_package_name(self, pkg_name: str) -> str:
return self.repo_for_pkg(pkg_name).filename_for_package_name(pkg_name)
def exists(self, pkg_name):
def exists(self, pkg_name: str) -> bool:
"""Whether package with the give name exists in the path's repos.
Note that virtual packages do not "exist".
"""
return any(repo.exists(pkg_name) for repo in self.repos)
def _have_name(self, pkg_name):
def _have_name(self, pkg_name: str) -> bool:
have_name = pkg_name is not None
if have_name and not isinstance(pkg_name, str):
raise ValueError("is_virtual(): expected package name, got %s" % type(pkg_name))
raise ValueError(f"is_virtual(): expected package name, got {type(pkg_name)}")
return have_name
def is_virtual(self, pkg_name):
def is_virtual(self, pkg_name: str) -> bool:
"""Return True if the package with this name is virtual, False otherwise.
This function uses the provider index. If calling from a code block that
@@ -923,7 +921,7 @@ def is_virtual(self, pkg_name):
have_name = self._have_name(pkg_name)
return have_name and pkg_name in self.provider_index
def is_virtual_safe(self, pkg_name):
def is_virtual_safe(self, pkg_name: str) -> bool:
"""Return True if the package with this name is virtual, False otherwise.
This function doesn't use the provider index.
@@ -937,6 +935,16 @@ def is_virtual_safe(self, pkg_name):
def __contains__(self, pkg_name):
return self.exists(pkg_name)
def marshal(self):
return (self.repos,)
@staticmethod
def unmarshal(repos):
return RepoPath(*repos, cache=None)
def __reduce__(self):
return RepoPath.unmarshal, self.marshal()
class Repo:
"""Class representing a package repository in the filesystem.
@@ -957,7 +965,7 @@ def __init__(
self,
root: str,
*,
cache: spack.caches.FileCacheType,
cache: "spack.caches.FileCacheType",
overrides: Optional[Dict[str, Any]] = None,
) -> None:
"""Instantiate a package repository from a filesystem path.
@@ -1326,6 +1334,20 @@ def __repr__(self) -> str:
def __contains__(self, pkg_name: str) -> bool:
return self.exists(pkg_name)
@staticmethod
def unmarshal(root, cache, overrides):
"""Helper method to unmarshal keyword arguments"""
return Repo(root, cache=cache, overrides=overrides)
def marshal(self):
cache = self._cache
if isinstance(cache, llnl.util.lang.Singleton):
cache = cache.instance
return self.root, cache, self.overrides
def __reduce__(self):
return Repo.unmarshal, self.marshal()
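This is the optimization from the marshaling commit: `__reduce__` tells pickle to ship only a constructor plus a small argument tuple, so none of the repository's loaded state crosses the process boundary. A standalone sketch of the protocol:

```python
import pickle

# Standalone sketch of the __reduce__ protocol used above.
class Repo:
    def __init__(self, root, cache=None, overrides=None):
        self.root = root
        self._cache = cache
        self.overrides = overrides or {}
        self.loaded = {}  # potentially huge, and deliberately not pickled

    @staticmethod
    def unmarshal(root, cache, overrides):
        return Repo(root, cache=cache, overrides=overrides)

    def marshal(self):
        return self.root, self._cache, self.overrides

    def __reduce__(self):
        return Repo.unmarshal, self.marshal()

repo = Repo("/path/to/repo")
repo.loaded["zlib"] = lambda: None        # unpicklable, but never sent
clone = pickle.loads(pickle.dumps(repo))  # only marshal() travels
print(clone.root, clone.loaded)           # /path/to/repo {}
```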
RepoType = Union[Repo, RepoPath]
@@ -1418,7 +1440,9 @@ def _path(configuration=None):
return create(configuration=configuration)
def create(configuration):
def create(
configuration: Union["spack.config.Configuration", llnl.util.lang.Singleton]
) -> RepoPath:
"""Create a RepoPath from a configuration object.
Args:
@@ -1454,20 +1478,20 @@ def all_package_names(include_virtuals=False):
@contextlib.contextmanager
def use_repositories(*paths_and_repos, **kwargs):
def use_repositories(
*paths_and_repos: Union[str, Repo], override: bool = True
) -> Generator[RepoPath, None, None]:
"""Use the repositories passed as arguments within the context manager.
Args:
*paths_and_repos: paths to the repositories to be used, or
already constructed Repo objects
override (bool): if True use only the repositories passed as input,
override: if True use only the repositories passed as input,
if False add them to the top of the list of current repositories.
Returns:
Corresponding RepoPath object
"""
global PATH
# TODO (Python 2.7): remove this kwargs on deprecation of Python 2.7 support
override = kwargs.get("override", True)
paths = [getattr(x, "root", x) for x in paths_and_repos]
scope_name = "use-repo-{}".format(uuid.uuid4())
repos_key = "repos:" if override else "repos"
@@ -1476,7 +1500,8 @@ def use_repositories(*paths_and_repos, **kwargs):
)
PATH, saved = create(configuration=spack.config.CONFIG), PATH
try:
yield PATH
with REPOS_FINDER.switch_repo(PATH): # type: ignore
yield PATH
finally:
spack.config.CONFIG.remove_scope(scope_name=scope_name)
PATH = saved
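A hedged usage sketch of the updated signature (the repository path is a placeholder); note that the finder now switches along with `PATH`, so `spack.pkg` imports inside the block resolve against the substituted repositories:

```python
# Hedged usage sketch; the repository path is a placeholder.
import spack.repo

with spack.repo.use_repositories("/path/to/my/repo", override=True) as repo_path:
    # Only the given repository is active here, and REPOS_FINDER points at it.
    print(repo_path.all_package_names()[:5])
# Previous repositories (and the previous finder state) are restored here.
```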
@@ -1576,10 +1601,9 @@ class UnknownNamespaceError(UnknownEntityError):
"""Raised when we encounter an unknown namespace"""
def __init__(self, namespace, name=None):
msg, long_msg = "Unknown namespace: {}".format(namespace), None
msg, long_msg = f"Unknown namespace: {namespace}", None
if name == "yaml":
long_msg = "Did you mean to specify a filename with './{}.{}'?"
long_msg = long_msg.format(namespace, name)
long_msg = f"Did you mean to specify a filename with './{namespace}.{name}'?"
super().__init__(msg, long_msg)


@@ -11,6 +11,26 @@
import spack.schema.environment
flags: Dict[str, Any] = {
"type": "object",
"additionalProperties": False,
"properties": {
"cflags": {"anyOf": [{"type": "string"}, {"type": "null"}]},
"cxxflags": {"anyOf": [{"type": "string"}, {"type": "null"}]},
"fflags": {"anyOf": [{"type": "string"}, {"type": "null"}]},
"cppflags": {"anyOf": [{"type": "string"}, {"type": "null"}]},
"ldflags": {"anyOf": [{"type": "string"}, {"type": "null"}]},
"ldlibs": {"anyOf": [{"type": "string"}, {"type": "null"}]},
},
}
extra_rpaths: Dict[str, Any] = {"type": "array", "default": [], "items": {"type": "string"}}
implicit_rpaths: Dict[str, Any] = {
"anyOf": [{"type": "array", "items": {"type": "string"}}, {"type": "boolean"}]
}
#: Properties for inclusion in other schemas
properties: Dict[str, Any] = {
"compilers": {
@@ -35,18 +55,7 @@
"fc": {"anyOf": [{"type": "string"}, {"type": "null"}]},
},
},
"flags": {
"type": "object",
"additionalProperties": False,
"properties": {
"cflags": {"anyOf": [{"type": "string"}, {"type": "null"}]},
"cxxflags": {"anyOf": [{"type": "string"}, {"type": "null"}]},
"fflags": {"anyOf": [{"type": "string"}, {"type": "null"}]},
"cppflags": {"anyOf": [{"type": "string"}, {"type": "null"}]},
"ldflags": {"anyOf": [{"type": "string"}, {"type": "null"}]},
"ldlibs": {"anyOf": [{"type": "string"}, {"type": "null"}]},
},
},
"flags": flags,
"spec": {"type": "string"},
"operating_system": {"type": "string"},
"target": {"type": "string"},
@@ -54,18 +63,9 @@
"modules": {
"anyOf": [{"type": "string"}, {"type": "null"}, {"type": "array"}]
},
"implicit_rpaths": {
"anyOf": [
{"type": "array", "items": {"type": "string"}},
{"type": "boolean"},
]
},
"implicit_rpaths": implicit_rpaths,
"environment": spack.schema.environment.definition,
"extra_rpaths": {
"type": "array",
"default": [],
"items": {"type": "string"},
},
"extra_rpaths": extra_rpaths,
},
}
},


@@ -84,7 +84,6 @@
"build_language": {"type": "string"},
"build_jobs": {"type": "integer", "minimum": 1},
"ccache": {"type": "boolean"},
"concretizer": {"type": "string", "enum": ["original", "clingo"]},
"db_lock_timeout": {"type": "integer", "minimum": 1},
"package_lock_timeout": {
"anyOf": [{"type": "integer", "minimum": 1}, {"type": "null"}]
@@ -98,9 +97,9 @@
"aliases": {"type": "object", "patternProperties": {r"\w[\w-]*": {"type": "string"}}},
},
"deprecatedProperties": {
"properties": ["terminal_title"],
"message": "config:terminal_title has been replaced by "
"install_status and is ignored",
"properties": ["concretizer"],
"message": "Spack supports only clingo as a concretizer from v0.23. "
"The config:concretizer config option is ignored.",
"error": False,
},
}


@@ -11,6 +11,8 @@
import spack.schema.environment
from .compilers import extra_rpaths, flags, implicit_rpaths
permissions = {
"type": "object",
"additionalProperties": False,
@@ -184,7 +186,16 @@
"type": "object",
"additionalProperties": True,
"properties": {
"environment": spack.schema.environment.definition
"compilers": {
"type": "object",
"patternProperties": {
r"(^\w[\w-]*)": {"type": "string"}
},
},
"environment": spack.schema.environment.definition,
"extra_rpaths": extra_rpaths,
"implicit_rpaths": implicit_rpaths,
"flags": flags,
},
},
},
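Put together with the schema pieces imported from `.compilers`, an external entry can now carry compiler-style options. A hedged example of data the extended schema should accept, expressed as the parsed Python equivalent of a `packages.yaml` entry; the paths, versions, and flags are placeholders, and the keys of `compilers` are assumed here to be language names:

```python
# Hedged example of an external entry under the extended schema; all
# paths, versions, and flags below are placeholders.
external_entry = {
    "spec": "gcc@12.3.0",
    "prefix": "/usr",
    "compilers": {"c": "/usr/bin/gcc-12", "cxx": "/usr/bin/g++-12"},
    "flags": {"cflags": "-O2", "ldflags": None},
    "extra_rpaths": ["/usr/lib/gcc/x86_64-linux-gnu/12"],
    "implicit_rpaths": True,
    "environment": {"set": {"CC": "/usr/bin/gcc-12"}},
}
```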


@@ -23,6 +23,7 @@
import llnl.util.lang
import llnl.util.tty as tty
from llnl.util.lang import elide_list
import spack
import spack.binary_distribution
@@ -621,8 +622,9 @@ def _external_config_with_implicit_externals(configuration):
class ErrorHandler:
def __init__(self, model):
def __init__(self, model, input_specs: List[spack.spec.Spec]):
self.model = model
self.input_specs = input_specs
self.full_model = None
def multiple_values_error(self, attribute, pkg):
@@ -709,12 +711,13 @@ def handle_error(self, msg, *args):
return msg
def message(self, errors) -> str:
messages = [
f" {idx+1: 2}. {self.handle_error(msg, *args)}"
input_specs = ", ".join(elide_list([f"`{s}`" for s in self.input_specs], 5))
header = f"failed to concretize {input_specs} for the following reasons:"
messages = (
f" {idx+1:2}. {self.handle_error(msg, *args)}"
for idx, (_, msg, args) in enumerate(errors)
]
header = "concretization failed for the following reasons:\n"
return "\n".join([header] + messages)
)
return "\n".join((header, *messages))
def raise_if_errors(self):
initial_error_args = extract_args(self.model, "error")
@@ -750,7 +753,7 @@ def on_model(model):
f"unexpected error during concretization [{str(e)}]. "
f"Please report a bug at https://github.com/spack/spack/issues"
)
raise spack.error.SpackError(msg)
raise spack.error.SpackError(msg) from e
raise UnsatisfiableSpecError(msg)
@@ -894,7 +897,7 @@ def on_model(model):
min_cost, best_model = min(models)
# first check for errors
error_handler = ErrorHandler(best_model)
error_handler = ErrorHandler(best_model, specs)
error_handler.raise_if_errors()
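`elide_list` keeps the new header readable when many roots fail at once. A standalone sketch of the header construction (the `elide_list` stand-in mirrors the behavior of `llnl.util.lang`'s helper: the first `max_num - 1` items, an ellipsis, then the last item):

```python
# Stand-in for llnl.util.lang.elide_list, mirroring its behavior.
def elide_list(line_list, max_num=10):
    if len(line_list) > max_num:
        return line_list[: max_num - 1] + ["..."] + line_list[-1:]
    return line_list

input_specs = [f"`pkg{i}`" for i in range(8)]
header = (
    f"failed to concretize {', '.join(elide_list(input_specs, 5))} "
    "for the following reasons:"
)
print(header)
# failed to concretize `pkg0`, `pkg1`, `pkg2`, `pkg3`, ..., `pkg7` for the following reasons:
```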
# build specs from spec attributes in the model
@@ -1435,16 +1438,14 @@ def condition(
# caller, we won't emit partial facts.
condition_id = next(self._id_counter)
self.gen.fact(fn.pkg_fact(required_spec.name, fn.condition(condition_id)))
self.gen.fact(fn.condition_reason(condition_id, msg))
trigger_id = self._get_condition_id(
required_spec, cache=self._trigger_cache, body=True, transform=transform_required
)
self.gen.fact(fn.pkg_fact(required_spec.name, fn.condition(condition_id)))
self.gen.fact(fn.condition_reason(condition_id, msg))
self.gen.fact(
fn.pkg_fact(required_spec.name, fn.condition_trigger(condition_id, trigger_id))
)
if not imposed_spec:
return condition_id
@@ -1693,19 +1694,43 @@ def external_packages(self):
spack.spec.parse_with_version_concrete(x["spec"]) for x in externals
]
external_specs = []
selected_externals = set()
if spec_filters:
for current_filter in spec_filters:
current_filter.factory = lambda: candidate_specs
external_specs.extend(current_filter.selected_specs())
else:
external_specs.extend(candidate_specs)
selected_externals.update(current_filter.selected_specs())
# Emit facts for externals specs. Note that "local_idx" is the index of the spec
# in packages:<pkg_name>:externals. This means:
#
# packages:<pkg_name>:externals[local_idx].spec == spec
external_versions = []
for local_idx, spec in enumerate(candidate_specs):
msg = f"{spec.name} available as external when satisfying {spec}"
if spec_filters and spec not in selected_externals:
continue
if not spec.versions.concrete:
warnings.warn(f"cannot use the external spec {spec}: needs a concrete version")
continue
def external_imposition(input_spec, requirements):
return requirements + [
fn.attr("external_conditions_hold", input_spec.name, local_idx)
]
try:
self.condition(spec, spec, msg=msg, transform_imposed=external_imposition)
except (spack.error.SpecError, RuntimeError) as e:
warnings.warn(f"while setting up external spec {spec}: {e}")
continue
external_versions.append((spec.version, local_idx))
self.possible_versions[spec.name].add(spec.version)
self.gen.newline()
# Order the external versions to prefer more recent versions
# even if specs in packages.yaml are not ordered that way
external_versions = [
(x.version, external_id) for external_id, x in enumerate(external_specs)
]
external_versions = [
(v, idx, external_id)
for idx, (v, external_id) in enumerate(sorted(external_versions, reverse=True))
@@ -1715,19 +1740,6 @@ def external_packages(self):
DeclaredVersion(version=version, idx=idx, origin=Provenance.EXTERNAL)
)
-        # Declare external conditions with a local index into packages.yaml
-        for local_idx, spec in enumerate(external_specs):
-            msg = "%s available as external when satisfying %s" % (spec.name, spec)
-
-            def external_imposition(input_spec, requirements):
-                return requirements + [
-                    fn.attr("external_conditions_hold", input_spec.name, local_idx)
-                ]
-
-            self.condition(spec, spec, msg=msg, transform_imposed=external_imposition)
-            self.possible_versions[spec.name].add(spec.version)
-            self.gen.newline()
self.trigger_rules()
self.effect_rules()
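A small sketch of the version-ordering trick above, with plain tuples standing in for Spack Version objects: sorting the (version, local_idx) pairs in reverse and enumerating the result assigns the smallest weight to the newest external version, regardless of packages.yaml order:

    # Hypothetical data; real entries carry spack Version objects.
    external_versions = [("1.2", 0), ("2.0", 1), ("1.9", 2)]  # (version, local_idx)
    weighted = [
        (v, weight, local_idx)
        for weight, (v, local_idx) in enumerate(sorted(external_versions, reverse=True))
    ]
    print(weighted)  # [('2.0', 0, 1), ('1.9', 1, 2), ('1.2', 2, 0)]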
@@ -1839,6 +1851,8 @@ def _spec_clauses(
if spec.name:
clauses.append(f.node(spec.name) if not spec.virtual else f.virtual_node(spec.name))
if spec.namespace:
clauses.append(f.namespace(spec.name, spec.namespace))
clauses.extend(self.spec_versions(spec))
@@ -2736,6 +2750,7 @@ class _Head:
"""ASP functions used to express spec clauses in the HEAD of a rule"""
node = fn.attr("node")
namespace = fn.attr("namespace_set")
virtual_node = fn.attr("virtual_node")
node_platform = fn.attr("node_platform_set")
node_os = fn.attr("node_os_set")
@@ -2751,6 +2766,7 @@ class _Body:
"""ASP functions used to express spec clauses in the BODY of a rule"""
node = fn.attr("node")
namespace = fn.attr("namespace")
virtual_node = fn.attr("virtual_node")
node_platform = fn.attr("node_platform")
node_os = fn.attr("node_os")


@@ -18,38 +18,79 @@
{ attr("virtual_node", node(0..X-1, Package)) } :- max_dupes(Package, X), virtual(Package).
% Integrity constraints on DAG nodes
:- attr("root", PackageNode), not attr("node", PackageNode).
:- attr("version", PackageNode, _), not attr("node", PackageNode), not attr("virtual_node", PackageNode).
:- attr("node_version_satisfies", PackageNode, _), not attr("node", PackageNode), not attr("virtual_node", PackageNode).
:- attr("hash", PackageNode, _), not attr("node", PackageNode).
:- attr("node_platform", PackageNode, _), not attr("node", PackageNode).
:- attr("node_os", PackageNode, _), not attr("node", PackageNode).
:- attr("node_target", PackageNode, _), not attr("node", PackageNode).
:- attr("node_compiler_version", PackageNode, _, _), not attr("node", PackageNode).
:- attr("variant_value", PackageNode, _, _), not attr("node", PackageNode).
:- attr("node_flag_compiler_default", PackageNode), not attr("node", PackageNode).
:- attr("node_flag", PackageNode, _, _), not attr("node", PackageNode).
:- attr("external_spec_selected", PackageNode, _), not attr("node", PackageNode).
:- attr("depends_on", ParentNode, _, _), not attr("node", ParentNode).
:- attr("depends_on", _, ChildNode, _), not attr("node", ChildNode).
:- attr("node_flag_source", ParentNode, _, _), not attr("node", ParentNode).
:- attr("node_flag_source", _, _, ChildNode), not attr("node", ChildNode).
:- attr("virtual_node", VirtualNode), not provider(_, VirtualNode), internal_error("virtual node with no provider").
:- provider(_, VirtualNode), not attr("virtual_node", VirtualNode), internal_error("provider with no virtual node").
:- provider(PackageNode, _), not attr("node", PackageNode), internal_error("provider with no real node").
:- attr("root", PackageNode),
not attr("node", PackageNode),
internal_error("Every root must be a node").
:- attr("version", PackageNode, _),
not attr("node", PackageNode),
not attr("virtual_node", PackageNode),
internal_error("Only nodes and virtual_nodes can have versions").
:- attr("node_version_satisfies", PackageNode, _),
not attr("node", PackageNode),
not attr("virtual_node", PackageNode),
internal_error("Only nodes and virtual_nodes can have version satisfaction").
:- attr("hash", PackageNode, _),
not attr("node", PackageNode),
internal_error("Only nodes can have hashes").
:- attr("node_platform", PackageNode, _),
not attr("node", PackageNode),
internal_error("Only nodes can have platforms").
:- attr("node_os", PackageNode, _), not attr("node", PackageNode),
internal_error("Only nodes can have node_os").
:- attr("node_target", PackageNode, _), not attr("node", PackageNode),
internal_error("Only nodes can have node_target").
:- attr("node_compiler_version", PackageNode, _, _), not attr("node", PackageNode),
internal_error("Only nodes can have node_compiler_version").
:- attr("variant_value", PackageNode, _, _), not attr("node", PackageNode),
internal_error("variant_value true for a non-node").
:- attr("node_flag_compiler_default", PackageNode), not attr("node", PackageNode),
internal_error("node_flag_compiler_default true for non-node").
:- attr("node_flag", PackageNode, _, _), not attr("node", PackageNode),
internal_error("node_flag assigned for non-node").
:- attr("external_spec_selected", PackageNode, _), not attr("node", PackageNode),
internal_error("external_spec_selected for non-node").
:- attr("depends_on", ParentNode, _, _), not attr("node", ParentNode),
internal_error("non-node depends on something").
:- attr("depends_on", _, ChildNode, _), not attr("node", ChildNode),
internal_error("something depends_on a non-node").
:- attr("node_flag_source", Node, _, _), not attr("node", Node),
internal_error("node_flag_source assigned for a non-node").
:- attr("node_flag_source", _, _, SourceNode), not attr("node", SourceNode),
internal_error("node_flag_source assigned with a non-node source").
:- attr("virtual_node", VirtualNode), not provider(_, VirtualNode),
internal_error("virtual node with no provider").
:- provider(_, VirtualNode), not attr("virtual_node", VirtualNode),
internal_error("provider with no virtual node").
:- provider(PackageNode, _), not attr("node", PackageNode),
internal_error("provider with no real node").
:- attr("root", node(ID, PackageNode)), ID > min_dupe_id, internal_error("root with a non-minimal duplicate ID").
:- attr("root", node(ID, PackageNode)), ID > min_dupe_id,
internal_error("root with a non-minimal duplicate ID").
% Nodes in the "root" unification set cannot depend on non-root nodes if the dependency is "link" or "run"
:- attr("depends_on", node(min_dupe_id, Package), node(ID, _), "link"), ID != min_dupe_id, unification_set("root", node(min_dupe_id, Package)), internal_error("link dependency out of the root unification set").
:- attr("depends_on", node(min_dupe_id, Package), node(ID, _), "run"), ID != min_dupe_id, unification_set("root", node(min_dupe_id, Package)), internal_error("run dependency out of the root unification set").
-% Namespaces are statically assigned by a package fact
-attr("namespace", node(ID, Package), Namespace) :- attr("node", node(ID, Package)), pkg_fact(Package, namespace(Namespace)).
+% Namespaces are statically assigned by a package fact if not otherwise set
+error(100, "{0} does not have a namespace", Package) :- attr("node", node(ID, Package)),
+   not attr("namespace", node(ID, Package), _),
+   internal_error("A node must have a namespace").
+error(100, "{0} cannot come from both {1} and {2} namespaces", Package, NS1, NS2) :- attr("node", node(ID, Package)),
+   attr("namespace", node(ID, Package), NS1),
+   attr("namespace", node(ID, Package), NS2),
+   NS1 != NS2,
+   internal_error("A node cannot have two namespaces").
+
+attr("namespace", node(ID, Package), Namespace) :- attr("namespace_set", node(ID, Package), Namespace).
+attr("namespace", node(ID, Package), Namespace)
+  :- attr("node", node(ID, Package)),
+     not attr("namespace_set", node(ID, Package), _),
+     pkg_fact(Package, namespace(Namespace)).
% Rules on "unification sets", i.e. on sets of nodes allowing a single configuration of any given package
unify(SetID, PackageName) :- unification_set(SetID, node(_, PackageName)).
-:- 2 { unification_set(SetID, node(_, PackageName)) }, unify(SetID, PackageName).
+:- 2 { unification_set(SetID, node(_, PackageName)) }, unify(SetID, PackageName),
+   internal_error("Cannot have multiple unification sets IDs for one set").
unification_set("root", PackageNode) :- attr("root", PackageNode).
unification_set(SetID, ChildNode) :- attr("depends_on", ParentNode, ChildNode, Type), Type != "build", unification_set(SetID, ParentNode).
@@ -75,7 +116,8 @@ unification_set(SetID, VirtualNode)
% as a build dependency.
%
% We'll need to relax the rule before we get to actual cross-compilation
-:- depends_on(ParentNode, node(X, Dependency)), depends_on(ParentNode, node(Y, Dependency)), X < Y.
+:- depends_on(ParentNode, node(X, Dependency)), depends_on(ParentNode, node(Y, Dependency)), X < Y,
+   internal_error("Cannot split link/build deptypes for a single edge (yet)").
#defined multiple_unification_sets/1.
@@ -131,7 +173,8 @@ mentioned_in_literal(Root, Mentioned) :- mentioned_in_literal(TriggerID, Root, M
condition_set(node(min_dupe_id, Root), node(min_dupe_id, Root)) :- mentioned_in_literal(Root, Root).
1 { condition_set(node(min_dupe_id, Root), node(0..Y-1, Mentioned)) : max_dupes(Mentioned, Y) } 1 :-
-    mentioned_in_literal(Root, Mentioned), Mentioned != Root.
+    mentioned_in_literal(Root, Mentioned), Mentioned != Root,
+    internal_error("must have exactly one condition_set for literals").
% Discriminate between "roots" that have been explicitly requested, and roots that are deduced from "virtual roots"
explicitly_requested_root(node(min_dupe_id, Package)) :-
@@ -151,7 +194,8 @@ associated_with_root(RootNode, ChildNode) :-
:- attr("root", RootNode),
condition_set(RootNode, node(X, Package)),
not virtual(Package),
-   not associated_with_root(RootNode, node(X, Package)).
+   not associated_with_root(RootNode, node(X, Package)),
+   internal_error("nodes in root condition set must be associated with root").
#defined concretize_everything/0.
#defined literal/1.
@@ -385,8 +429,10 @@ imposed_nodes(ConditionID, PackageNode, node(X, A1))
condition_set(PackageNode, node(X, A1)),
attr("hash", PackageNode, ConditionID).
-:- imposed_packages(ID, A1), impose(ID, PackageNode), not condition_set(PackageNode, node(_, A1)).
-:- imposed_packages(ID, A1), impose(ID, PackageNode), not imposed_nodes(ID, PackageNode, node(_, A1)).
+:- imposed_packages(ID, A1), impose(ID, PackageNode), not condition_set(PackageNode, node(_, A1)),
+   internal_error("Imposing constraint outside of condition set").
+:- imposed_packages(ID, A1), impose(ID, PackageNode), not imposed_nodes(ID, PackageNode, node(_, A1)),
+   internal_error("Imposing constraint outside of imposed_nodes").
% Conditions that hold impose may impose constraints on other specs
attr(Name, node(X, A1)) :- impose(ID, PackageNode), imposed_constraint(ID, Name, A1), imposed_nodes(ID, PackageNode, node(X, A1)).
@@ -416,7 +462,8 @@ provider(ProviderNode, VirtualNode) :- attr("provider_set", ProviderNode, Virtua
% satisfy the dependency.
1 { attr("depends_on", node(X, A1), node(0..Y-1, A2), A3) : max_dupes(A2, Y) } 1
:- impose(ID, node(X, A1)),
imposed_constraint(ID, "depends_on", A1, A2, A3).
imposed_constraint(ID, "depends_on", A1, A2, A3),
internal_error("Build deps must land in exactly one duplicate").
% Reconstruct virtual dependencies for reused specs
attr("virtual_on_edge", node(X, A1), node(Y, A2), Virtual)
@@ -611,25 +658,18 @@ do_not_impose(EffectID, node(X, Package))
% Virtual dependency weights
%-----------------------------------------------------------------------------
-% A provider may have different possible weights depending on whether it's an external
-% or not, or on preferences expressed in packages.yaml etc. This rule ensures that
+% A provider has different possible weights depending on its preference. This rule ensures that
 % we select the weight, among the possible ones, that minimizes the overall objective function.
 1 { provider_weight(DependencyNode, VirtualNode, Weight) :
     possible_provider_weight(DependencyNode, VirtualNode, Weight, _) } 1
  :- provider(DependencyNode, VirtualNode), internal_error("Package provider weights must be unique").

-% A provider that is an external can use a weight of 0
-possible_provider_weight(DependencyNode, VirtualNode, 0, "external")
-  :- provider(DependencyNode, VirtualNode),
-     external(DependencyNode).
-
-% A provider mentioned in the default configuration can use a weight
-% according to its priority in the list of providers
+% Any configured provider has a weight based on index in the preference list
 possible_provider_weight(node(ProviderID, Provider), node(VirtualID, Virtual), Weight, "default")
   :- provider(node(ProviderID, Provider), node(VirtualID, Virtual)),
      default_provider_preference(Virtual, Provider, Weight).

-% Any provider can use 100 as a weight, which is very high and discourage its use
+% Any non-configured provider has a default weight of 100
 possible_provider_weight(node(ProviderID, Provider), VirtualNode, 100, "fallback")
   :- provider(node(ProviderID, Provider), VirtualNode).
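An illustrative Python sketch of the weight choice encoded above (hypothetical function, not Spack API): a provider may get a "default" weight, its index in the packages.yaml preference list, and always has the fallback weight 100; the cardinality rule picks exactly one, and minimization keeps the smallest:

    def provider_weight(preference_index=None) -> int:
        candidates = [100]  # fallback: any provider can use 100 (strongly discouraged)
        if preference_index is not None:
            candidates.append(preference_index)  # "default": position in the provider list
        return min(candidates)  # the solver keeps the weight minimizing the objective

    assert provider_weight() == 100                  # non-configured provider
    assert provider_weight(preference_index=0) == 0  # first preferred provider wins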
@@ -1162,8 +1202,11 @@ target_weight(Target, 0)
node_target_weight(PackageNode, MinWeight)
:- attr("node", PackageNode),
attr("node_target", PackageNode, Target),
target(Target),
MinWeight = #min { Weight : target_weight(Target, Weight) }.
:- attr("node_target", PackageNode, Target), not node_target_weight(PackageNode, _).
% compatibility rules for targets among nodes
node_target_match(ParentNode, DependencyNode)
:- attr("depends_on", ParentNode, DependencyNode, Type), Type != "build",


@@ -12,6 +12,7 @@
%=============================================================================
% macOS
os_compatible("sequoia", "sonoma").
os_compatible("sonoma", "ventura").
os_compatible("ventura", "monterey").
os_compatible("monterey", "bigsur").

File diff suppressed because it is too large


@@ -13,7 +13,7 @@
import stat
import sys
import tempfile
-from typing import Callable, Dict, Iterable, Optional, Set
+from typing import Callable, Dict, Iterable, List, Optional, Set
import llnl.string
import llnl.util.lang
@@ -40,6 +40,7 @@
import spack.resource
import spack.spec
import spack.stage
import spack.util.crypto
import spack.util.lock
import spack.util.path as sup
import spack.util.pattern as pattern
@@ -534,32 +535,29 @@ def generate_fetchers():
for fetcher in dynamic_fetchers:
yield fetcher
-        def print_errors(errors):
-            for msg in errors:
-                tty.debug(msg)
-
-        errors = []
+        errors: List[str] = []
         for fetcher in generate_fetchers():
             try:
                 fetcher.stage = self
                 self.fetcher = fetcher
                 self.fetcher.fetch()
                 break
-            except spack.fetch_strategy.NoCacheError:
+            except fs.NoCacheError:
                 # Don't bother reporting when something is not cached.
                 continue
+            except fs.FailedDownloadError as f:
+                errors.extend(f"{fetcher}: {e.__class__.__name__}: {e}" for e in f.exceptions)
+                continue
             except spack.error.SpackError as e:
-                errors.append("Fetching from {0} failed.".format(fetcher))
-                tty.debug(e)
+                errors.append(f"{fetcher}: {e.__class__.__name__}: {e}")
                 continue
         else:
-            print_errors(errors)
-
             self.fetcher = self.default_fetcher
-            default_msg = "All fetchers failed for {0}".format(self.name)
-            raise spack.error.FetchError(err_msg or default_msg, None)
-
-        print_errors(errors)
+            if err_msg:
+                raise spack.error.FetchError(err_msg)
+            raise spack.error.FetchError(
+                f"All fetchers failed for {self.name}", "\n".join(f"    {e}" for e in errors)
+            )
def steal_source(self, dest):
"""Copy the source_path directory in its entirety to directory dest
@@ -1188,7 +1186,7 @@ def _fetch_and_checksum(url, options, keep_stage, action_fn=None):
# Checksum the archive and add it to the list
checksum = spack.util.crypto.checksum(hashlib.sha256, stage.archive_file)
return checksum, None
-    except FailedDownloadError:
+    except fs.FailedDownloadError:
return None, f"[WORKER] Failed to fetch {url}"
except Exception as e:
return None, f"[WORKER] Something failed on {url}, skipping. ({e})"
@@ -1208,7 +1206,3 @@ class RestageError(StageError):
class VersionFetchError(StageError):
"""Raised when we can't determine a URL to fetch a package."""
-# Keep this in namespace for convenience
-FailedDownloadError = fs.FailedDownloadError


@@ -371,7 +371,6 @@ def use_store(
data.update(extra_data)
# Swap the store with the one just constructed and return it
ensure_singleton_created()
spack.config.CONFIG.push_scope(
spack.config.InternalConfigScope(name=scope_name, data={"config": {"install_tree": data}})
)
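A minimal sketch of the scope push/pop pattern use_store builds on (hypothetical Config class; the real spack.config.CONFIG has a richer API): the temporary install_tree overrides the default only while the context is active:

    import contextlib

    class Config:
        """Toy layered configuration: later scopes shadow earlier ones."""
        def __init__(self):
            self.scopes = []
        def push_scope(self, data):
            self.scopes.append(data)
        def pop_scope(self):
            return self.scopes.pop()
        def get(self, key, default=None):
            for scope in reversed(self.scopes):
                if key in scope:
                    return scope[key]
            return default

    CONFIG = Config()
    CONFIG.push_scope({"config:install_tree:root": "/default/store"})

    @contextlib.contextmanager
    def use_store(root):
        CONFIG.push_scope({"config:install_tree:root": str(root)})
        try:
            yield
        finally:
            CONFIG.pop_scope()

    with use_store("/tmp/store"):
        assert CONFIG.get("config:install_tree:root") == "/tmp/store"
    assert CONFIG.get("config:install_tree:root") == "/default/store"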


@@ -79,9 +79,11 @@ def restore(self):
self.test_state.restore()
spack.main.spack_working_dir = self.spack_working_dir
         env = pickle.load(self.serialized_env) if _SERIALIZE else self.env
-        pkg = pickle.load(self.serialized_pkg) if _SERIALIZE else self.pkg
         if env:
             spack.environment.activate(env)
+        # Order of operation is important, since the package might be retrieved
+        # from a repo defined within the environment configuration
+        pkg = pickle.load(self.serialized_pkg) if _SERIALIZE else self.pkg
         return pkg
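A sketch of why that ordering matters, with stand-in types rather than Spack's: if reconstructing the package needs a repo that only exists once the environment is active, deserializing it before activation fails:

    import pickle

    state = {"repos": ["builtin"]}

    class Env:
        def activate(self):
            state["repos"].append("env_repo")

    def load_pkg():
        # Reconstructing the package looks its class up in the active repos.
        assert "env_repo" in state["repos"], "environment repo not available yet"
        return Pkg()

    class Pkg:
        def __reduce__(self):
            return (load_pkg, ())

    env_bytes, pkg_bytes = pickle.dumps(Env()), pickle.dumps(Pkg())
    env = pickle.loads(env_bytes)
    env.activate()                 # activate the environment first...
    pkg = pickle.loads(pkg_bytes)  # ...so the package can resolve its repo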


@@ -208,12 +208,13 @@ def test_satisfy_strict_constraint_when_not_concrete(architecture_tuple, constra
],
)
@pytest.mark.usefixtures("mock_packages", "config")
@pytest.mark.only_clingo("Fixing the parser broke this test for the original concretizer.")
@pytest.mark.skipif(
str(archspec.cpu.host().family) != "x86_64", reason="tests are for x86_64 uarch ranges"
)
def test_concretize_target_ranges(root_target_range, dep_target_range, result, monkeypatch):
spec = Spec(f"a %gcc@10 foobar=bar target={root_target_range} ^b target={dep_target_range}")
spec = Spec(
f"pkg-a %gcc@10 foobar=bar target={root_target_range} ^pkg-b target={dep_target_range}"
)
with spack.concretize.disable_compiler_existence_check():
spec.concretize()
assert spec.target == spec["b"].target == result
assert spec.target == spec["pkg-b"].target == result


@@ -105,11 +105,11 @@ def config_directory(tmpdir_factory):
@pytest.fixture(scope="function")
-def default_config(tmpdir, config_directory, monkeypatch, install_mockery_mutable_config):
-    # This fixture depends on install_mockery_mutable_config to ensure
+def default_config(tmpdir, config_directory, monkeypatch, install_mockery):
+    # This fixture depends on install_mockery to ensure
     # there is a clear order of initialization. The substitution of the
     # config scopes here is done on top of the substitution that comes with
-    # install_mockery_mutable_config
+    # install_mockery
mutable_dir = tmpdir.mkdir("mutable_config").join("tmp")
config_directory.copy(mutable_dir)
@@ -337,7 +337,7 @@ def test_relative_rpaths_install_nondefault(mirror_dir):
buildcache_cmd("install", "-uf", cspec.name)
-def test_push_and_fetch_keys(mock_gnupghome):
+def test_push_and_fetch_keys(mock_gnupghome, tmp_path):
testpath = str(mock_gnupghome)
mirror = os.path.join(testpath, "mirror")
@@ -357,7 +357,7 @@ def test_push_and_fetch_keys(mock_gnupghome):
assert len(keys) == 1
fpr = keys[0]
-    bindist.push_keys(mirror, keys=[fpr], regenerate_index=True)
+    bindist.push_keys(mirror, keys=[fpr], tmpdir=str(tmp_path), update_index=True)
# dir 2: import the key from the mirror, and confirm that its fingerprint
# matches the one created above
@@ -398,9 +398,7 @@ def fake_dag_hash(spec, length=None):
return "tal4c7h4z0gqmixb1eqa92mjoybxn5l6"[:length]
-@pytest.mark.usefixtures(
-    "install_mockery_mutable_config", "mock_packages", "mock_fetch", "test_mirror"
-)
+@pytest.mark.usefixtures("install_mockery", "mock_packages", "mock_fetch", "test_mirror")
def test_spec_needs_rebuild(monkeypatch, tmpdir):
"""Make sure needs_rebuild properly compares remote hash
against locally computed one, avoiding unnecessary rebuilds"""
@@ -429,7 +427,7 @@ def test_spec_needs_rebuild(monkeypatch, tmpdir):
assert rebuild
@pytest.mark.usefixtures("install_mockery_mutable_config", "mock_packages", "mock_fetch")
@pytest.mark.usefixtures("install_mockery", "mock_packages", "mock_fetch")
def test_generate_index_missing(monkeypatch, tmpdir, mutable_config):
"""Ensure spack buildcache index only reports available packages"""
@@ -466,7 +464,7 @@ def test_generate_index_missing(monkeypatch, tmpdir, mutable_config):
assert "libelf" not in cache_list
-def test_generate_key_index_failure(monkeypatch):
+def test_generate_key_index_failure(monkeypatch, tmp_path):
def list_url(url, recursive=False):
if "fails-listing" in url:
raise Exception("Couldn't list the directory")
@@ -479,13 +477,13 @@ def push_to_url(*args, **kwargs):
monkeypatch.setattr(web_util, "push_to_url", push_to_url)
with pytest.raises(CannotListKeys, match="Encountered problem listing keys"):
bindist.generate_key_index("s3://non-existent/fails-listing")
bindist.generate_key_index("s3://non-existent/fails-listing", str(tmp_path))
with pytest.raises(GenerateIndexError, match="problem pushing .* Couldn't upload"):
bindist.generate_key_index("s3://non-existent/fails-uploading")
bindist.generate_key_index("s3://non-existent/fails-uploading", str(tmp_path))
-def test_generate_package_index_failure(monkeypatch, capfd):
+def test_generate_package_index_failure(monkeypatch, tmp_path, capfd):
def mock_list_url(url, recursive=False):
raise Exception("Some HTTP error")
@@ -494,15 +492,16 @@ def mock_list_url(url, recursive=False):
test_url = "file:///fake/keys/dir"
with pytest.raises(GenerateIndexError, match="Unable to generate package index"):
-        bindist.generate_package_index(test_url)
+        bindist.generate_package_index(test_url, str(tmp_path))
assert (
f"Warning: Encountered problem listing packages at {test_url}: Some HTTP error"
"Warning: Encountered problem listing packages at "
f"{test_url}/{bindist.BUILD_CACHE_RELATIVE_PATH}: Some HTTP error"
in capfd.readouterr().err
)
-def test_generate_indices_exception(monkeypatch, capfd):
+def test_generate_indices_exception(monkeypatch, tmp_path, capfd):
def mock_list_url(url, recursive=False):
raise Exception("Test Exception handling")
@@ -511,10 +510,10 @@ def mock_list_url(url, recursive=False):
url = "file:///fake/keys/dir"
with pytest.raises(GenerateIndexError, match=f"Encountered problem listing keys at {url}"):
-        bindist.generate_key_index(url)
+        bindist.generate_key_index(url, str(tmp_path))
with pytest.raises(GenerateIndexError, match="Unable to generate package index"):
-        bindist.generate_package_index(url)
+        bindist.generate_package_index(url, str(tmp_path))
assert f"Encountered problem listing packages at {url}" in capfd.readouterr().err
@@ -587,9 +586,7 @@ def test_update_sbang(tmpdir, test_mirror):
str(archspec.cpu.host().family) != "x86_64",
reason="test data uses gcc 4.5.0 which does not support aarch64",
)
-def test_install_legacy_buildcache_layout(
-    mutable_config, compiler_factory, install_mockery_mutable_config
-):
+def test_install_legacy_buildcache_layout(mutable_config, compiler_factory, install_mockery):
"""Legacy buildcache layout involved a nested archive structure
where the .spack file contained a repeated spec.json and another
compressed archive file containing the install tree. This test

View File

@@ -228,3 +228,25 @@ def test_source_is_disabled(mutable_config):
spack.config.add("bootstrap:trusted:{0}:{1}".format(conf["name"], False))
with pytest.raises(ValueError):
spack.bootstrap.core.source_is_enabled_or_raise(conf)
@pytest.mark.regression("45247")
def test_use_store_does_not_try_writing_outside_root(tmp_path, monkeypatch, mutable_config):
"""Tests that when we use the 'use_store' context manager, there is no attempt at creating
a Store outside the given root.
"""
initial_store = mutable_config.get("config:install_tree:root")
user_store = tmp_path / "store"
fn = spack.store.Store.__init__
def _checked_init(self, root, *args, **kwargs):
fn(self, root, *args, **kwargs)
assert self.root == str(user_store)
monkeypatch.setattr(spack.store.Store, "__init__", _checked_init)
spack.store.reinitialize()
with spack.store.use_store(user_store):
assert spack.config.CONFIG.get("config:install_tree:root") == str(user_store)
assert spack.config.CONFIG.get("config:install_tree:root") == initial_store


@@ -13,34 +13,34 @@
import spack.spec
import spack.util.url
install = spack.main.SpackCommand("install")
pytestmark = pytest.mark.not_on_windows("does not run on windows")
-def test_build_tarball_overwrite(install_mockery, mock_fetch, monkeypatch, tmpdir):
-    with tmpdir.as_cwd():
-        spec = spack.spec.Spec("trivial-install-test-package").concretized()
-        install(str(spec))
-
-        # Runs fine the first time, throws the second time
-        out_url = spack.util.url.path_to_file_url(str(tmpdir))
-        bd.push_or_raise(spec, out_url, bd.PushOptions(unsigned=True))
-        with pytest.raises(bd.NoOverwriteException):
-            bd.push_or_raise(spec, out_url, bd.PushOptions(unsigned=True))
-
-        # Should work fine with force=True
-        bd.push_or_raise(spec, out_url, bd.PushOptions(force=True, unsigned=True))
-
-        # Remove the tarball and try again.
-        # This must *also* throw, because of the existing .spec.json file
-        os.remove(
-            os.path.join(
-                bd.build_cache_prefix("."),
-                bd.tarball_directory_name(spec),
-                bd.tarball_name(spec, ".spack"),
-            )
-        )
-
-        with pytest.raises(bd.NoOverwriteException):
-            bd.push_or_raise(spec, out_url, bd.PushOptions(unsigned=True))
+def test_build_tarball_overwrite(install_mockery, mock_fetch, monkeypatch, tmp_path):
+    spec = spack.spec.Spec("trivial-install-test-package").concretized()
+    spec.package.do_install(fake=True)
+
+    specs = [spec]
+
+    # Runs fine the first time, second time it's a no-op
+    out_url = spack.util.url.path_to_file_url(str(tmp_path))
+    skipped = bd.push_or_raise(specs, out_url, signing_key=None)
+    assert not skipped
+
+    skipped = bd.push_or_raise(specs, out_url, signing_key=None)
+    assert skipped == specs
+
+    # Should work fine with force=True
+    skipped = bd.push_or_raise(specs, out_url, signing_key=None, force=True)
+    assert not skipped
+
+    # Remove the tarball, which should cause push to push.
+    os.remove(
+        tmp_path
+        / bd.BUILD_CACHE_RELATIVE_PATH
+        / bd.tarball_directory_name(spec)
+        / bd.tarball_name(spec, ".spack")
+    )
+
+    skipped = bd.push_or_raise(specs, out_url, signing_key=None)
+    assert not skipped


@@ -6,6 +6,7 @@
import os
import platform
import posixpath
import sys
import pytest
@@ -177,7 +178,7 @@ def _set_wrong_cc(x):
def test_setup_dependent_package_inherited_modules(
-    config, working_env, mock_packages, install_mockery, mock_fetch
+    working_env, mock_packages, install_mockery, mock_fetch
):
# This will raise on regression
s = spack.spec.Spec("cmake-client-inheritor").concretized()
@@ -287,6 +288,25 @@ def platform_pathsep(pathlist):
assert name not in os.environ
def test_compiler_custom_env(config, mock_packages, monkeypatch, working_env):
if sys.platform == "win32":
test_path = r"C:\test\path\element\custom-env" + "\\"
else:
test_path = r"/test/path/element/custom-env/"
def custom_env(pkg, env):
env.prepend_path("PATH", test_path)
env.append_flags("ENV_CUSTOM_CC_FLAGS", "--custom-env-flag1")
pkg = spack.spec.Spec("cmake").concretized().package
monkeypatch.setattr(pkg.compiler, "setup_custom_environment", custom_env)
spack.build_environment.setup_package(pkg, False)
# Note: trailing slash may be stripped by internal logic
assert test_path[:-1] in os.environ["PATH"]
assert "--custom-env-flag1" in os.environ["ENV_CUSTOM_CC_FLAGS"]
def test_external_config_env(mock_packages, mutable_config, working_env):
cmake_config = {
"externals": [
@@ -457,14 +477,14 @@ def test_parallel_false_is_not_propagating(default_mock_concretization):
# a foobar=bar (parallel = False)
# |
# b (parallel =True)
s = default_mock_concretization("a foobar=bar")
s = default_mock_concretization("pkg-a foobar=bar")
spack.build_environment.set_package_py_globals(s.package, context=Context.BUILD)
assert s["a"].package.module.make_jobs == 1
assert s["pkg-a"].package.module.make_jobs == 1
spack.build_environment.set_package_py_globals(s["b"].package, context=Context.BUILD)
assert s["b"].package.module.make_jobs == spack.build_environment.determine_number_of_jobs(
parallel=s["b"].package.parallel
spack.build_environment.set_package_py_globals(s["pkg-b"].package, context=Context.BUILD)
assert s["pkg-b"].package.module.make_jobs == spack.build_environment.determine_number_of_jobs(
parallel=s["pkg-b"].package.parallel
)


@@ -94,10 +94,10 @@ def test_negative_ninja_check(self, input_dir, test_dir, concretize_and_setup):
@pytest.mark.not_on_windows("autotools not available on windows")
@pytest.mark.usefixtures("config", "mock_packages")
@pytest.mark.usefixtures("mock_packages")
class TestAutotoolsPackage:
def test_with_or_without(self, default_mock_concretization):
s = default_mock_concretization("a")
s = default_mock_concretization("pkg-a")
options = s.package.with_or_without("foo")
# Ensure that values that are not representing a feature
@@ -129,7 +129,7 @@ def activate(value):
assert "--without-lorem-ipsum" in options
def test_none_is_allowed(self, default_mock_concretization):
s = default_mock_concretization("a foo=none")
s = default_mock_concretization("pkg-a foo=none")
options = s.package.with_or_without("foo")
# Ensure that values that are not representing a feature
@@ -139,11 +139,9 @@ def test_none_is_allowed(self, default_mock_concretization):
assert "--without-baz" in options
assert "--no-fee" in options
-    def test_libtool_archive_files_are_deleted_by_default(
-        self, default_mock_concretization, mutable_database
-    ):
+    def test_libtool_archive_files_are_deleted_by_default(self, mutable_database):
         # Install a package that creates a mock libtool archive
-        s = default_mock_concretization("libtool-deletion")
+        s = Spec("libtool-deletion").concretized()
s.package.do_install(explicit=True)
# Assert the libtool archive is not there and we have
@@ -154,25 +152,23 @@ def test_libtool_archive_files_are_deleted_by_default(
assert libtool_deletion_log
def test_libtool_archive_files_might_be_installed_on_demand(
-        self, mutable_database, monkeypatch, default_mock_concretization
+        self, mutable_database, monkeypatch
     ):
         # Install a package that creates a mock libtool archive,
         # patch its package to preserve the installation
-        s = default_mock_concretization("libtool-deletion")
+        s = Spec("libtool-deletion").concretized()
monkeypatch.setattr(type(s.package.builder), "install_libtool_archives", True)
s.package.do_install(explicit=True)
# Assert libtool archives are installed
assert os.path.exists(s.package.builder.libtool_archive_file)
-    def test_autotools_gnuconfig_replacement(self, default_mock_concretization, mutable_database):
+    def test_autotools_gnuconfig_replacement(self, mutable_database):
         """
         Tests whether only broken config.sub and config.guess are replaced with
         files from working alternatives from the gnuconfig package.
         """
-        s = default_mock_concretization(
-            "autotools-config-replacement +patch_config_files +gnuconfig"
-        )
+        s = Spec("autotools-config-replacement +patch_config_files +gnuconfig").concretized()
s.package.do_install()
with open(os.path.join(s.prefix.broken, "config.sub")) as f:
@@ -187,15 +183,11 @@ def test_autotools_gnuconfig_replacement(self, default_mock_concretization, muta
with open(os.path.join(s.prefix.working, "config.guess")) as f:
assert "gnuconfig version of config.guess" not in f.read()
-    def test_autotools_gnuconfig_replacement_disabled(
-        self, default_mock_concretization, mutable_database
-    ):
+    def test_autotools_gnuconfig_replacement_disabled(self, mutable_database):
         """
         Tests whether disabling patch_config_files
         """
-        s = default_mock_concretization(
-            "autotools-config-replacement ~patch_config_files +gnuconfig"
-        )
+        s = Spec("autotools-config-replacement ~patch_config_files +gnuconfig").concretized()
s.package.do_install()
with open(os.path.join(s.prefix.broken, "config.sub")) as f:


@@ -355,6 +355,15 @@ def test_fc_flags(wrapper_environment, wrapper_flags):
)
def test_always_cflags(wrapper_environment, wrapper_flags):
with set_env(SPACK_ALWAYS_CFLAGS="-always1 -always2"):
check_args(
cc,
["-v", "--cmd-line-v-opt"],
[real_cc] + ["-always1", "-always2"] + ["-v", "--cmd-line-v-opt"],
)
def test_Wl_parsing(wrapper_environment):
check_args(
cc,


@@ -199,7 +199,7 @@ def __call__(self, *args, **kwargs):
assert "Unable to merge {0}".format(c1) in err
-def test_get_spec_filter_list(mutable_mock_env_path, config, mutable_mock_repo):
+def test_get_spec_filter_list(mutable_mock_env_path, mutable_mock_repo):
"""Test that given an active environment and list of touched pkgs,
we get the right list of possibly-changed env specs"""
e1 = ev.create("test")
@@ -253,7 +253,7 @@ def test_get_spec_filter_list(mutable_mock_env_path, config, mutable_mock_repo):
@pytest.mark.regression("29947")
-def test_affected_specs_on_first_concretization(mutable_mock_env_path, mock_packages, config):
+def test_affected_specs_on_first_concretization(mutable_mock_env_path, mock_packages):
e = ev.create("first_concretization")
e.add("mpileaks~shared")
e.add("mpileaks+shared")
@@ -286,7 +286,7 @@ def _fail(self, args):
def test_ci_create_buildcache(tmpdir, working_env, config, mock_packages, monkeypatch):
"""Test that create_buildcache returns a list of objects with the correct
keys and types."""
monkeypatch.setattr(spack.ci, "_push_to_build_cache", lambda a, b, c: True)
monkeypatch.setattr(ci, "push_to_build_cache", lambda a, b, c: True)
results = ci.create_buildcache(
None, destination_mirror_urls=["file:///fake-url-one", "file:///fake-url-two"]
@@ -322,12 +322,12 @@ def test_ci_run_standalone_tests_missing_requirements(
@pytest.mark.not_on_windows("Reliance on bash script not supported on Windows")
def test_ci_run_standalone_tests_not_installed_junit(
-    tmp_path, repro_dir, working_env, default_mock_concretization, mock_test_stage, capfd
+    tmp_path, repro_dir, working_env, mock_test_stage, capfd, mock_packages
):
log_file = tmp_path / "junit.xml"
args = {
"log_file": str(log_file),
"job_spec": default_mock_concretization("printing-package"),
"job_spec": spack.spec.Spec("printing-package").concretized(),
"repro_dir": str(repro_dir),
"fail_fast": True,
}
@@ -340,13 +340,13 @@ def test_ci_run_standalone_tests_not_installed_junit(
@pytest.mark.not_on_windows("Reliance on bash script not supported on Windows")
def test_ci_run_standalone_tests_not_installed_cdash(
-    tmp_path, repro_dir, working_env, default_mock_concretization, mock_test_stage, capfd
+    tmp_path, repro_dir, working_env, mock_test_stage, capfd, mock_packages
):
"""Test run_standalone_tests with cdash and related options."""
log_file = tmp_path / "junit.xml"
args = {
"log_file": str(log_file),
"job_spec": default_mock_concretization("printing-package"),
"job_spec": spack.spec.Spec("printing-package").concretized(),
"repro_dir": str(repro_dir),
}

Some files were not shown because too many files have changed in this diff