Compare commits

...

514 Commits

Author SHA1 Message Date
Wouter Deconinck
4bc67e8e84 util-linux: provide uuid when new variant +uuid 2024-07-06 14:07:05 -05:00
dependabot[bot]
aeaa922eef build(deps): bump actions/upload-artifact from 4.3.3 to 4.3.4 (#45069)
Bumps [actions/upload-artifact](https://github.com/actions/upload-artifact) from 4.3.3 to 4.3.4.
- [Release notes](https://github.com/actions/upload-artifact/releases)
- [Commits](65462800fd...0b2256b8c0)

---
updated-dependencies:
- dependency-name: actions/upload-artifact
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-06 08:20:30 +02:00
Hadrien G
a6d5a34be3 Remove myself from maintainer lists (#45071) 2024-07-06 08:17:31 +02:00
Todd Gamblin
ba79542f3c spack gc: remove debug print statement (#45067)
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-07-05 22:36:45 +02:00
Auriane R
dc10c8a1ed [py-transformers] Add newer versions (#45022)
* Add newer versions for py-transformers

* Add dependencies needed for py-transformers latest version

* Enforce dependencies requirements for py-transformers newer versions
2024-07-05 14:52:24 +02:00
Auriane R
5ab814505e py-flash-attn: add v2.5.6 -> main (#44894)
* Add latest releases of py-flash-attn

* Add main branch for flash attention

* Add additional requirements
2024-07-05 06:19:01 -06:00
Harmen Stoppels
1d8bdcfc04 config: fix class hierarchy (#45044)
1. Avoid `self.path` being of type `Optional[str]`
2. Simplify immutable config with a property (see the sketch after this entry).
2024-07-05 12:41:13 +02:00
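
A minimal sketch of the second point, assuming a hypothetical `Configuration` class rather than Spack's actual one: `path` is typed as a plain `str`, and immutability becomes a read-only property.

```python
# Hedged sketch, not Spack's actual code.
class Configuration:
    def __init__(self, path: str, writable: bool = True) -> None:
        self.path: str = path  # always a str, never Optional[str]
        self._writable = writable

    @property
    def writable(self) -> bool:
        # immutable config scopes simply report False here
        return self._writable
```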
Massimiliano Culpo
95cf341b50 Inject dependencies in repo classes (#45053) 2024-07-05 12:00:41 +02:00
Benjamin Fovet
a134485b1b remove trailing dot in gmsh 4.13.0 shasum (#45054) 2024-07-05 10:18:11 +02:00
dependabot[bot]
d39edeb9a1 build(deps): bump docker/setup-buildx-action from 3.3.0 to 3.4.0 (#45051)
Bumps [docker/setup-buildx-action](https://github.com/docker/setup-buildx-action) from 3.3.0 to 3.4.0.
- [Release notes](https://github.com/docker/setup-buildx-action/releases)
- [Commits](d70bba72b1...4fd812986e)

---
updated-dependencies:
- dependency-name: docker/setup-buildx-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-04 19:28:35 -05:00
dependabot[bot]
453fb27be2 build(deps): bump docker/build-push-action from 6.2.0 to 6.3.0 (#45042)
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 6.2.0 to 6.3.0.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](15560696de...1a162644f9)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-04 19:28:21 -05:00
dependabot[bot]
831f04fb7d build(deps): bump docker/setup-qemu-action from 3.0.0 to 3.1.0 (#45041)
Bumps [docker/setup-qemu-action](https://github.com/docker/setup-qemu-action) from 3.0.0 to 3.1.0.
- [Release notes](https://github.com/docker/setup-qemu-action/releases)
- [Commits](68827325e0...5927c834f5)

---
updated-dependencies:
- dependency-name: docker/setup-qemu-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-04 19:28:03 -05:00
Wouter Deconinck
04044a9744 containers: rm centos7 since EOL (#45049) 2024-07-04 22:22:23 +02:00
Massimiliano Culpo
8077285a63 flux-core: remove deprecated versions (#45014) 2024-07-04 22:15:55 +02:00
Paul R. C. Kent
537926c1a7 py-pyscf: add v2.6.0, v2.6.1, v2.6.2 (#45032)
* pyscf: add 2.6.0, 2.6.1, 2.6.2

* Fix max version typo
2024-07-04 12:43:27 -05:00
Stephen Nicholas Swatman
02ff3d7b1e acts-algebra-plugins: new package (#44861)
This commit adds the `acts-algebra-plugins` package which provides the
Acts project with linear algebra functionality.
2024-07-04 09:51:18 -05:00
Jordan Galby
491cb278f3 spack audit packages: Fix message (#45045)
Fix message formatting of the "virtual dependency cannot have variants" error.
2024-07-04 14:30:30 +02:00
Harmen Stoppels
ed1ebefd8f dray: deprecate and simplify (#45015) 2024-07-04 11:43:34 +02:00
Harmen Stoppels
36d64fcbd4 iconv: require libiconv on linux (#45026)
otherwise it is still picked up from glibc as it is external
2024-07-04 08:20:27 +02:00
Massimiliano Culpo
c5cdc2c0a2 Heuristic decays to default over time (#45023)
This modifies the heuristic to decay to the clingo default
over time. The hope is that this helps with specs
that have an optimal solution with a high penalty.

Let the target and compiler heuristics decay too, and do not
guess the compiler.
2024-07-04 08:19:52 +02:00
Carsten Uphoff
0eca86f64f tiny-tensor-compiler: fix missing dependencies (#44465)
* tiny-tensor-compiler: fix missing dependencies
* tiny-tensor-compiler: Remove dependency type for opencl c headers
* Re2c@3.1 has a mandatory Python interpreter dependency
* Remove re2c@3.1

---------

Signed-off-by: Carsten Uphoff <carsten.uphoff@intel.com>
2024-07-03 19:08:45 -06:00
Stephen Nicholas Swatman
50027d76a5 dfelibs: new package (#44860)
* dfelibs: new package

This commits adds the `dfelibs` package, which is an Acts project fork
of dfelibs.

* dfelibs: docstring typo

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-07-03 15:37:25 -06:00
Robert Cohn
b4748de5a9 [openfoam]: use latest cgal (#45003)
* [openfoam]: use latest cgal
* add version checks for CGAL
2024-07-03 14:04:38 -07:00
Teague Sterling
8f2532c624 Fix #44715 by removing the librsvg dependency, which is only needed for testing. Also fixing typo. (#44989)
Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2024-07-03 13:52:16 -07:00
Jordan Galby
0726513334 gettext: Fix unvendor libxml2 (#44924)
* gettext: Fix unvendor libxml2
* gettext: Fix build with external libxml2
* Revert "gettext: Fix build with external libxml2"
   This reverts commit c209ad65cb.
2024-07-03 13:40:55 -07:00
Adam J. Stewart
f951f38883 Add support for macOS Sequoia (#45018) 2024-07-03 10:59:26 -07:00
afzpatel
5262412c13 py-tensorflow: remove patch file for 2.16-rocm-enhanced (#44783)
* remove patch file for py-tensorflow@2.16-rocm-enhanced

* add changes

* fix style errors

* remove 2.14-rocm-enhanced version and add patch file

* fix stlye error

* remove jit patch

* add 2.14-rocm-enhanced version
2024-07-03 12:19:07 +02:00
Harmen Stoppels
f022b93249 cmake: add patch to allow static linking with -DCMAKE_INSTALL_RPATH set (#44900) 2024-07-03 10:46:07 +02:00
Vanessasaurus
60b5e98182 flux-core: add v0.64.0 (#45012)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2024-07-03 01:55:20 -06:00
mvlopri
682acae9fd seacas/package.py: Update default version sha to latest (#45009)
Update to use the latest tag of SEACAS through Spack.
2024-07-03 01:50:27 -06:00
Rocco Meli
c0f80e9117 elsi: improve package and add external libOMM (#44865)
Add MatrixSwitch package
Add libOMM package
Use libOMM as external in ELSI
Add include paths for all libraries
Fortran modules need one directory up (can't use libs.directories since they are just the include/ directories)
2024-07-03 08:56:30 +02:00
Teague Sterling
dcf13af459 kentutils: add v455, v460, v464, v465 + package updates (#44413)
This addresses a few issues in the kentutils package:
 - Issue #44372
 - Cleaning up formatting and styles
 - Removing old versions that are not available anymore
 - Removing newer versions that are also not available anymore
 - The installer does not install the static libs
   that are expected by packages that depend on kentutils
 - There are conflicts and less-than-desirable dependencies
   that can be addressed via variants

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2024-07-03 08:40:02 +02:00
Auriane R
a2e4fb6b95 py-huggingface-hub: add v0.23.4 (#44905) 2024-07-03 08:35:50 +02:00
Adam J. Stewart
9dd92f493a py-tensorflow: fix numa build (#44607) 2024-07-03 08:34:34 +02:00
Robert Cohn
b23e832002 intel-oneapi-compilers: remove all patching (#45008) 2024-07-03 07:34:25 +02:00
Derek Ryan Strong
ae2f626168 Add giflib v5.2.2 (#44994) 2024-07-02 14:45:11 -07:00
Derek Ryan Strong
a73930da81 Add libwebp v1.4.0, v1.3.2, v1.3.1, v1.3.0 (#44993) 2024-07-02 14:43:45 -07:00
Derek Ryan Strong
921d446196 Add lcms v2.16, v2.15, v2.14 (#44992) 2024-07-02 14:41:02 -07:00
Derek Ryan Strong
8e2ea5de9d Add openjpeg v2.5.2, v2.5.1 (#44991) 2024-07-02 14:39:19 -07:00
Alex Richert
fd0baca222 fontconfig: add pic variant and support static deps (#44884)
* fontconfig: add pic variant; allow static dependencies
* Revise fontconfig to clean up flags
* Revise fontconfig to remove unnecessary flag
* Update fontconfig
* update fontconfig logic for static deps
2024-07-02 14:06:53 -07:00
Harmen Stoppels
d2fc0d6a35 Revert "e4s ci: reduce size due to 5mb gitlab artifact limit (#44986)" (#45001)
This reverts commit 18de6a480b.
2024-07-02 21:05:10 +00:00
renjithravindrankannath
eedc9e0eaf Bump-up the version for RoCm-6.1.2 release (#44849)
* Bumping up to ROCm 6.1.2
* Bump up rocmlir & amdsmi to 6.1.2
* Removing llvm dependency from amdsmi
* Removed redundant for loops and brought rocm-cmake into the for loop
* Removing version check of deprecated versions
2024-07-02 14:02:25 -07:00
Massimiliano Culpo
6b85f6b405 ci: deprecate the --dependencies and --optimize option (#45005) 2024-07-02 22:06:52 +02:00
Matt Thompson
5686a6b928 mepo: new package (#44879)
* mepo: add new package
* Add mepo 2.0.0rc4
* Update dependencies
2024-07-02 12:18:40 -07:00
Hector Barrios
ad665c6af1 fix static linking when using Intel OneAPI mkl (#36991) 2024-07-02 13:55:30 -04:00
Mikael Simberg
d78d6db61e fmt: add 11.0.0 (#44997) 2024-07-02 11:51:37 -06:00
Wouter Deconinck
f47c307bf4 fastor: new package (#44984)
* fastor: new package

* fastor: apply code suggestions and fix style
2024-07-02 09:33:09 -06:00
Harmen Stoppels
5b4edb9499 queue -> stack (#45002) 2024-07-02 16:41:29 +02:00
Harmen Stoppels
a6e6093922 spack_yaml.py: fix default_flow_style (#44998) 2024-07-02 14:01:13 +02:00
Harmen Stoppels
2e8b4e660e spack_yaml: add anchorify function (#44995)
This adds spack.util.spack_yaml.anchorify, which takes a non-cyclic
dict/list structure, and replaces identical values with (back)
references to the first instance, so that yaml serialization will use
anchors.

`repr` is used to identify sub-dags, which in principle is quadratic
in the depth of the graph, but in practice the depth is O(1) so
this should not matter.

Then this is used in CI to reduce the size of generated YAML files to
30% of their original size.
2024-07-02 14:00:19 +02:00
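
A minimal sketch of the idea behind `anchorify`, assuming plain PyYAML rather than Spack's vendored ruamel.yaml; once identical sub-structures become shared references, the dumper emits `&id`/`*id` anchors for them automatically:

```python
import yaml

def anchorify(data, cache=None):
    """Replace equal sub-structures with references to their first
    occurrence, so the YAML dumper emits anchors/aliases for them."""
    cache = {} if cache is None else cache
    items = data.items() if isinstance(data, dict) else enumerate(data)
    for key, value in list(items):
        if not isinstance(value, (dict, list)):
            continue  # only deduplicate containers
        first = cache.setdefault(repr(value), value)
        if first is not value:
            data[key] = first  # back-reference -> alias (*id001) on dump
        else:
            anchorify(value, cache)  # recurse into the first occurrence

doc = {"a": {"flags": [1, 2, 3]}, "b": {"flags": [1, 2, 3]}}
anchorify(doc)
print(yaml.dump(doc))  # "b" is emitted as an alias of "a"'s value
```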
Harmen Stoppels
0ca1ee8b91 Pipelines: update configuration for aws-isc (#44999)
1.18 is not a string in YAML

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-07-02 12:39:35 +02:00
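
The YAML quirk in question, illustrated with PyYAML (a bare `1.18` loads as a float, so it must be quoted to remain a string):

```python
import yaml

print(yaml.safe_load("image: 1.18"))    # {'image': 1.18}   -> float
print(yaml.safe_load('image: "1.18"'))  # {'image': '1.18'} -> str
```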
Adam J. Stewart
a322672259 py-pillow: add v10.4.0 (#44983) 2024-07-02 03:59:24 -06:00
Dom Heinzeller
6cab86d0c1 New package py-pyhdf (#44877)
* Add var/spack/repos/builtin/packages/py-pyhdf/package.py
* Address reviewer comments regarding dependencies for new package py-pyhdf
2024-07-02 01:19:04 -06:00
Robert Cohn
86e7e2e070 [intel-oneapi-mpi]: restore support for classic wrapper names (#44982)
* [intel-oneapi-mpi]: add back support for classic wrappers

* spack style fixes

* add error checking
2024-07-01 18:54:14 -06:00
Cody Balos
69fca439f4 care: add v0.13.1, v0.13.0 and v0.12.0 (#44936)
* care: add v0.13.0 and v0.12.0
* add maintainer
* add 0.13.1
* update dependencies
2024-07-01 16:45:01 -06:00
Vicente Bolea
3b90fb589f vtk: update vtk to 9.3.1 (#44966)
* vtk: add 9.3.1 version

* vtk: update maintainers
2024-07-01 16:20:31 -06:00
Harmen Stoppels
fff126204c cmd/develop.py: fix readability (#44980)
stage[0] is assumed to be for sources; stages 1 and onwards are for
patches/resources. Make that a bit more clear.
2024-07-01 12:14:00 -07:00
Szabolcs Horvát
98e626cf67 igraph: add 0.10.13 (#44975) 2024-07-01 12:10:38 -07:00
Richard Berger
d8fe628a95 LAMMPS updates (#44977)
* lammps: add new versions
* lammps: add directories for plugins
* lammps-example-plugin: new package
   Example illustrating how to install a LAMMPS plugin, based on the
   example part of the LAMMPS distribution.
2024-07-01 11:57:26 -07:00
Weiqun Zhang
4c378840e3 amrex: add v24.07 (#44985) 2024-07-01 11:50:28 -07:00
eugeneswalker
18de6a480b e4s ci: reduce size due to 5mb gitlab artifact limit (#44986) 2024-07-01 20:28:56 +02:00
Kun Wu
c6da4d586b cuda: add v12.5.0 (#44971)
Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2024-07-01 17:45:04 +00:00
Tamara Dahlgren
61d6fc70e8 Docs: include cmake spec property for the command (#44956) 2024-07-01 10:28:26 -07:00
Vicente Bolea
7c65655c7e adios2: add patch to support rocm6 (#44941)
* adios2: add patch to support rocm6

* e4s rocm ci: re-enable adios2 +rocm

---------

Co-authored-by: eugeneswalker <eugenesunsetwalker@gmail.com>
2024-07-01 09:20:48 -07:00
snehring
76ca264b72 py-tesorter: add post install hmmpress step (#44940)
Signed-off-by: Shane Nehring <snehring@iastate.edu>
2024-07-01 16:51:26 +02:00
Vicente Bolea
aa58d3c170 vtk-m: update vtk-m to 2.2.0-rc1 (#44875) 2024-07-01 16:49:22 +02:00
Harmen Stoppels
a32b898a00 curl: remove deprecated versions (#44630) 2024-07-01 16:31:54 +02:00
Adam J. Stewart
6fcd43ee64 py-tensorflow: add v2.16.2 (#44974) 2024-07-01 15:50:52 +02:00
Richard Berger
f1f9f00d43 flecsi: cleanup spackage (#44873) 2024-07-01 15:49:59 +02:00
renjithravindrankannath
8ff27f9257 libffi: enable pic (#44970)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2024-07-01 07:01:12 -06:00
Adam J. Stewart
78c62532c7 py-scipy: add v1.14.0 (#44967) 2024-07-01 14:21:41 +02:00
Harmen Stoppels
a7f327dced zlib-ng: add patch for lld 17+ (#44684)
- add new patch for recent lld
- remove workaround for lld
- drop old versions
- drop old patches
2024-07-01 12:59:53 +02:00
Harmen Stoppels
310c435396 netlib-lapack: provide blas and lapack together (#44981)
If netlib-lapack is built with ~external-blas, it internally links
liblapack.so with libblas.so, meaning that whenever netlib-lapack is
used as a lapack provider, the package must also be a blas provider.

Conversely, using netlib-lapack as a blas provider does not imply that it
also must provide lapack, but nothing is lost disallowing that...
2024-07-01 12:53:03 +02:00
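
A hedged `package.py` sketch of the provider logic described above; the directives follow Spack's package API, but the variant and conditions are illustrative rather than copied from the merged recipe:

```python
from spack.package import *


class NetlibLapack(CMakePackage):
    """Reference LAPACK (illustrative sketch)."""

    variant("external-blas", default=False, description="Link against an external BLAS")

    # With the bundled BLAS, liblapack.so links against libblas.so, so
    # this package must provide both virtuals together.
    provides("blas", "lapack", when="~external-blas")
    # With an external BLAS it only provides lapack.
    provides("lapack", when="+external-blas")
```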
Harmen Stoppels
fa3f27e8e7 archive.py: undo unrelated changes from #43851 (#44947) 2024-07-01 11:35:32 +02:00
Harmen Stoppels
6b0fefff29 Use composite stage also for develop specs (#44950) 2024-07-01 11:24:16 +02:00
Massimiliano Culpo
f613316282 Flag propagation: restrict to link/run (#44925)
In practice people don't care about having their build dependencies reinstalled with, say, cflags=-O3 if that is what is set on the input spec, so restrict propagation to link/run deps only.

Also simplify the encoding in asp.
2024-07-01 09:57:51 +02:00
Massimiliano Culpo
1b5b74390f neoverse-v1: restore py-cinemasci (#44976)
Use a different tactic for determining conflicts.

Give more priority to setting very old versions to False.
2024-07-01 09:53:24 +02:00
Adam J. Stewart
b57f88cb89 JAX: add v0.4.30 (#44964) 2024-07-01 09:02:48 +02:00
Adam J. Stewart
03afc2a1e6 py-scikit-image: add v0.24.0 (#44965) 2024-07-01 09:02:32 +02:00
Adam J. Stewart
1f6ed9324d py-kornia: add v0.7.3 (#44973) 2024-07-01 09:02:12 +02:00
Adam J. Stewart
5559772afa GDAL: add v3.9.1 (#44969) 2024-07-01 09:01:37 +02:00
Adam J. Stewart
8728631fe0 py-keras: add v3.4 (#44968) 2024-07-01 09:01:18 +02:00
Harmen Stoppels
e34d9cbe5f Remove DIYStage (#44949) 2024-07-01 08:33:56 +02:00
Wouter Deconinck
0efba09990 root: depends_on veccore@0.4.0: resp. @0.4.2: when certain versions (#36667) 2024-06-30 19:37:22 -06:00
Wouter Deconinck
a9cb80d792 xl: avoid matching "_r" in tempfile.TemporaryDirectory during audit (#44831) 2024-06-30 17:49:01 -06:00
Teague Sterling
dae6fe711c gobject-introspection: add v1.60.2 (#44643)
Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2024-06-30 20:46:02 +02:00
Massimiliano Culpo
c9a24bc6c5 neoverse-v1: comment out py-cinemasci (#44972) 2024-06-30 10:23:35 +02:00
AcriusWinter
00663f29a9 Strumpack: Changed old test method to new test method (#44874)
* added try except
* Resolve style issues

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-06-28 12:54:03 -07:00
Elliott Slaughter
15a48990b6 legion: Add 24.06.0. (#44951) 2024-06-28 12:20:11 -06:00
fpruvost
af0b898c2e composyx: fix cmake options prefix (#44948) 2024-06-28 09:49:08 -07:00
Massimiliano Culpo
ddf8384bc6 libgpg-error, libassuan: fixes for darwin (#44946) 2024-06-28 18:27:12 +02:00
Mikael Simberg
670f92f42b mold: add v2.32.1 (#44920) 2024-06-28 10:24:22 -06:00
Dominic Hofer
c81b0e3d2a icon: add 2024.01-1 (#44919) 2024-06-28 09:52:53 -06:00
joscot-linaro
f56d804d85 linaro-forge: added 24.0.2 version (#44922) 2024-06-28 09:47:47 -06:00
sarahtang-amzn
b57f08f22b WRF4.5.x: Limit the patching of ADIOS function to 4.5.0 (#44929) 2024-06-28 08:58:25 -06:00
Harmen Stoppels
34f3b8fdd0 Add missing v0.22.0 changelog (#44945) 2024-06-28 13:31:38 +02:00
Harmen Stoppels
0b3b49b4e0 installer.py: handle external roots the same (#44917)
There was logic not to enqueue build requests for externals if they occur as
roots. That's unnecessary.
2024-06-28 11:26:56 +02:00
Massimiliano Culpo
fa96422702 libassuan: add v3.0.0 (#44913)
Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2024-06-28 10:44:25 +02:00
Massimiliano Culpo
e12168ed24 libgcrypt: add v1.11.0 (#44915)
Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2024-06-28 10:43:57 +02:00
Massimiliano Culpo
c0f2df8e0a libgpg-error: add v1.50 (#44914) 2024-06-28 10:39:19 +02:00
Massimiliano Culpo
8807ade98f libksba: add v1.6.7 (#44916) 2024-06-28 10:38:43 +02:00
arezaii
13356ddbcc update chapel package for v2.1 (#44931) 2024-06-27 21:29:11 -06:00
AcriusWinter
974033be80 hiop: Corrected test name, added docstring, and changed test to new API (#44765)
* Changed test method from old method to new method
* Corrected test name and added docstring
* Split test method into stand alone-tests
* Added SkipTest
* code refactor
* removed repeated code
* Remove exe from test part purpose

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-06-27 18:02:42 -06:00
Alan Dayton
8755fc7291 Add new releases of CHAI (#44911) 2024-06-27 17:47:28 -06:00
Chris Marsh
17c02fe759 vtk: Fix proj dependency ranges (#44370)
* Improve proj version selection to avoid a too-new proj with older API usage in vtk9

* Proj constraint range around 8.2.1


---------

Co-authored-by: John Parent <john.parent@kitware.com>
2024-06-27 15:00:56 -04:00
Dan Lipsa
4c7d18a772 Spack on Windows: fix "spack load --list" and "spack unload" (#35720)
Fix the following on Windows:

* `spack load --list` (this printed 0 packages even if packages were
  loaded)
* `spack unload <package>` (this said that the package is not loaded
  even if it was)

Update unit tests for `spack load` to also run on Windows (specifically
for ".bat"). This involved refactoring a few tests to parameterize
based on whether the unit tests are being run on a Windows system
(and to account for batch syntax).
2024-06-27 11:44:36 -07:00
dependabot[bot]
b28b26c39a build(deps): bump docker/build-push-action from 5.3.0 to 6.2.0 (#44910)
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 5.3.0 to 6.2.0.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](2cdde995de...15560696de)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-27 13:06:39 -05:00
Teague Sterling
23aed605ec plink2: add v2.00a5.10, v2.00a5.11 (#44707)
* Adding additional versions to plink2 and switching to tarballs to allow for better version detection in the future

---------

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2024-06-27 10:48:20 -07:00
Massimiliano Culpo
fdb8d565aa zig: add v0.13.0 (#44912) 2024-06-27 10:38:12 -07:00
Benjamin Fovet
9b08296236 gmsh: add v4.13.1 (#44901)
Also fix gmsh 4.13.0 checksum

Co-authored-by: Benjamin Fovet <benjamin.fovet@cea.fr>
2024-06-27 11:03:34 -06:00
Julien Cortial
c82d8c63fa lcov: Add missing perl dependency (#44896)
Date::Parse, provided by TimeDate, is an undocumented dependency
of lcov 2
2024-06-27 07:08:06 -06:00
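
In `package.py` terms, declaring that runtime dependency might look like the following sketch; the `perl-timedate` package name and version condition are assumptions, not the merged change:

```python
# Date::Parse ships in the TimeDate distribution; lcov 2 needs it at runtime.
# (Hypothetical directive for illustration.)
depends_on("perl-timedate", type="run", when="@2:")
```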
Julien Cortial
7a8989bbfc mumps: add v5.7.2 (#44897) 2024-06-27 07:07:51 -06:00
Matthew Whitlock
22c86074c8 Fix bug in logfile parsing (#42706)
* Fix bug in logfile parsing

* Cite ECMA-48 standards for CSI

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-06-27 07:07:30 -06:00
Massimiliano Culpo
ef9e449322 Ensure parent runtime version >= child (#44834)
Fixes a bug where old gcc-runtime libraries would be loaded at runtime, but newer are required by dependencies, breaking the binaries.
2024-06-27 14:58:42 +02:00
Henrique BR
6b73195478 Fixed typo in hypre package (#44921)
Co-authored-by: Henrique <henrique.bergallo-rocha@ukaea.uk>
2024-06-27 12:59:11 +02:00
Emil Briggs
c7b9bf6a77 rmgdft: add v6.1.0 (#44797) 2024-06-27 12:58:35 +02:00
Thomas Madlener
a84c91b259 podio: add version 1.0.1 (#44854) 2024-06-27 12:55:40 +02:00
Michael Kuhn
e6566dfd67 gcc: add 12.4.0 (#44882) 2024-06-27 12:42:55 +02:00
dependabot[bot]
d6419f32b8 build(deps): bump mypy from 1.10.0 to 1.10.1 in /lib/spack/docs (#44885)
Bumps [mypy](https://github.com/python/mypy) from 1.10.0 to 1.10.1.
- [Changelog](https://github.com/python/mypy/blob/master/CHANGELOG.md)
- [Commits](https://github.com/python/mypy/compare/v1.10.0...v1.10.1)

---
updated-dependencies:
- dependency-name: mypy
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-27 12:40:14 +02:00
Miguel
60ed682577 Fix typo in docs (#44891)
Fixes the rst formatting of make_jobs
2024-06-27 12:32:16 +02:00
Harmen Stoppels
6c1fa8c30b mongo-c-driver: overhaul package recipe (#44866) 2024-06-27 12:30:40 +02:00
Harmen Stoppels
09167fe8ac ci: mgard build failure w/ system zlib (#44918)
The pcluster pipeline runs spack external find and autodetects the most ancient of zlib versions on the system: 1.2.7, release date May 3, 2012.

Turns out some packages require parts of the zlib api added within the last decade.
2024-06-27 12:19:17 +02:00
Dom Heinzeller
eb7951818d py-poetry, py-numexpr, and py-metpy: correct dependencies (#44651) 2024-06-27 09:40:17 +00:00
Auriane R
6959656d51 py-pyyaml: add v6.0.1 (#44906) 2024-06-27 11:13:20 +02:00
Wouter Deconinck
f916b50491 containers: centos:stream -> centos:stream9 (#44876) 2024-06-27 10:38:56 +02:00
Jiakun Yan
7160e1d3e7 lci: new package (#44041) 2024-06-27 10:31:14 +02:00
Cody Balos
16369d50a7 sundials: add v7.1.1 (#44814)
* sundials: add 7.1.0

* add patch

* remove patch, add 7.1.1 instead of 7.1.0
2024-06-26 15:44:57 -07:00
eugeneswalker
35fc371222 update e4s stacks (#44813)
* update e4s stacks

* adios2 +rocm: disable kokkos due to spack issue #44832

* comment out mgard+cuda due to spack issue #44833

* comment out cabana, legion, arborx due to kokkos spack issue #44832

* comment out slepc, petsc due to petsc spack issue #44600

* comment out adios2+rocm due to kokkos rocm spack issue #44832

* comment out kokkos due to spack issue #44832
2024-06-26 15:24:34 -07:00
Harmen Stoppels
4bbf4a5e79 otf2: fix package (#44856)
- extend python
- forward compat bound with python due to imp import
- restore -O2 flags (CFLAGS do not compose)
2024-06-26 17:28:51 +02:00
Alex Richert
e5bd79b011 py-jcb: new package (#44880)
* Add py-jcb: JEDI Configuration Builder
* Update maintainers
2024-06-26 08:18:40 -06:00
Dom Heinzeller
2267b40bda New package py-globus-cli, update py-globus-sdk (#44881)
* Add var/spack/repos/builtin/packages/py-globus-cli/package.py
* Update var/spack/repos/builtin/packages/py-globus-sdk: add version 3.25.0
* For py-globus-cli: make py-typing-extension dependency conditional on Python version, and allow Python versions newer than 3.10

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-06-26 08:13:03 -06:00
Massimiliano Culpo
b0b6016e12 ASP-based solver: add a generic rule for propagation (#44870)
This adds a generic propagate/2 rule to propagate any
fact to children in the DAG.
2024-06-26 16:02:00 +02:00
Harmen Stoppels
c24265fe7e opennurbs: multiple build systems (#44871) 2024-06-26 15:34:20 +02:00
Jon Rood
9d0102ac89 nalu-wind: fix linking when using trilinos-solvers variant (#44887) 2024-06-26 07:33:50 -06:00
Mikael Simberg
52b8b3ed8d stdexec: Don't skip build stage on main since no longer header only (#44892) 2024-06-26 14:50:44 +02:00
Harmen Stoppels
380030c59a mpifileutils: cmakepackage (#44863) 2024-06-26 14:28:29 +02:00
Harmen Stoppels
867a813328 kallisto: simplify (#44862) 2024-06-26 14:27:12 +02:00
Harmen Stoppels
ded3fa50a3 jasper: multiple build systems (#44859) 2024-06-26 14:25:23 +02:00
Harmen Stoppels
84653e8d9f cutlang: remove (#44853) 2024-06-26 14:24:37 +02:00
Harmen Stoppels
ac0fd7138f oce: deprecate (#44864) 2024-06-26 14:19:41 +02:00
Massimiliano Culpo
c0f9f47b8c Simplify and improve solver heuristic (#44893)
When we changed how to deal with errors in November,
we didn't realize that for an unconstrained choice
rule it is more important in the heuristic to guess
what is NOT in the answer set, since it will be the
majority of options.

Previously this was following automatically from what
was in the answer set, via `1 { ... } 1` cardinality
constraints.

Here we improve the heuristic and the solve time for specs.
2024-06-26 13:08:41 +02:00
Daniel Ahlin
7e9d24a145 gromacs: maintainer change (#44890)
Adding mabraham as maintainer. Mark will bring a lot of GROMACS knowledge. Removing danielahlin as I am currently not working actively on this.
2024-06-26 01:44:35 -06:00
Simon Pintarelli
99405e6a4d nlcglib: add v1.1.0 (#44580) 2024-06-26 09:42:26 +02:00
Simon Frasch
7500a4853c spla: add v1.6.1 (#44878) 2024-06-26 09:40:01 +02:00
psakievich
54d192e026 Steal source was not assigning the package class (#44886)
Fetcher was missing the package class assignment
2024-06-25 23:17:56 -07:00
Harmen Stoppels
e6a8eba72d eztrace: multiple build systems (#44855) 2024-06-26 06:12:02 +02:00
Wouter Deconinck
4d604c8c9f root: add v6.30.08, v6.32.02 (#44807) 2024-06-25 14:57:44 -07:00
fpruvost
b081e0046f Move/rename maphyspp to composyx (#44836) 2024-06-25 13:58:54 -07:00
Tamara Dahlgren
baf82c0245 Docs: Update stand-alone test information (#44755)
Update and slightly reorganize stand-alone test information to include new and improved examples and more links that can be used in PR feedback.
2024-06-25 13:33:41 -07:00
Hector Martinez-Seara
884a0a392d rdma-core: Patched CMakeLists.txt to find drm include at libdrm - archlinux (#44839)
* Patched CMakeLists.txt to find drm include at libdrm - archlinux
* Restricted the patch to versions 34 onwards
* Update var/spack/repos/builtin/packages/rdma-core/libdrm.patch

---------

Co-authored-by: Mark Abraham <Mark.J.Abraham@gmail.com>
2024-06-25 13:06:38 -07:00
Harmen Stoppels
824f2a5652 memaxes: cmakepackage (#44869) 2024-06-25 12:55:54 -07:00
Wouter Deconinck
b894acf1fc geant4: add v11.2.2, incl g4ndl v4.7.1 (#44830)
* geant4: add v11.2.2

* g4ndl: add v4.7.1

* geant4-data, g4ndl: use =4.7 for 11.2.0:11.2.1, 4.7.1 for 11.2.2
2024-06-25 14:35:31 -04:00
Miguel
536856874c add documentation for make_jobs variable (#44838)
* add documentation for make_jobs variable

* apply suggested changes

* Update packaging_guide.rst

add suggestions to the documentation

* Update packaging_guide.rst

fix missing quotes in the documentation

* suggestions to packaging_guide.rst
2024-06-25 11:24:49 -06:00
eugeneswalker
8d1aaef8b8 e4s ci: paraview: require +examples (#44847) 2024-06-25 15:12:56 +00:00
Simon Pintarelli
3216c4362e tiled-mm: add v2.3.1 (#44867) 2024-06-25 09:08:14 -06:00
Harmen Stoppels
2e822e65fd libbson: cmake & autotools package (#44852) 2024-06-25 14:54:22 +02:00
Harmen Stoppels
41fde4db8c glog: remove custom cmake stuff (#44858) 2024-06-25 14:52:28 +02:00
Wouter Deconinck
81125c3bd8 hepmc3: pass root variant cxxstd as HEPMC3_CXX_STANDARD (#44806)
* hepmc3: pass `root` variant `cxxstd` as `HEPMC3_CXX_STANDARD`

* hepmc3: when @:3.2.3 +rootio, depends_on root cxxstd=11

* [@spackbot] updating style on behalf of wdconinc

---------

Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-06-25 05:50:36 -05:00
Nicole C
d8b0df6f5b Make url_fetch tests work on Windows (#44809) 2024-06-25 07:34:11 +00:00
Martin Pokorny
99e3fdb180 Legion: add redop_half variant, and add missing build dependency (#44792)
* legion: add redop_half variant
* legion: conditionally add py-setuptools as build dependency
   Dependency exists only for '+python' variant.
2024-06-25 01:33:54 -06:00
Paul Romano
64cfdc07cb OpenMC: add v0.15.0 and v0.14.0 (#44840) 2024-06-25 01:19:23 -06:00
Harmen Stoppels
59fbbdd9ce blitz: use cmake (#44804) 2024-06-25 09:11:14 +02:00
Melven Roehrig-Zoellner
adedf58297 py-heat: add v1.4.1 (#44835) 2024-06-25 01:10:18 -06:00
Wouter Deconinck
f13ea8aa75 py-scikit-build-core: typing-extension dependency fix (#44817) 2024-06-25 09:00:03 +02:00
Wouter Deconinck
109a4d52b5 vecmem: add v0.25.0 -> v1.6.0 (#43528) 2024-06-25 08:48:53 +02:00
Davis Herring
7f2117e2cf flecsi: add v2.3.0 (#44798) 2024-06-25 08:34:02 +02:00
Rocco Meli
dfab3e8829 ELSI: improve package (#44717)
* Spglib: add version 2.4.0

* DLA-Future: fix +test option

* elsi

* elsi improvements

* [@spackbot] updating style on behalf of RMeli

* remove test variants

* fix ntpoly and and use externals as default

---------

Co-authored-by: RMeli <RMeli@users.noreply.github.com>
2024-06-25 07:38:56 +02:00
Martin Pokorny
822622f07a thrust: add v1.17.x (#44561) 2024-06-25 07:35:29 +02:00
Vicente Bolea
c4194e4f58 vtk-m: support kokkos@4: (#44850) 2024-06-25 07:21:20 +02:00
Raffaele Solcà
05c7ff4595 dla-future: add v0.6.0 (#44841) 2024-06-24 22:33:14 -06:00
Matt Thompson
a6ac78c7c6 mapl: add 2.47.0 (#44842)
* mapl: add 2.47.0
* Fix typo
2024-06-24 22:32:46 -06:00
eugeneswalker
8479122e71 ecp rocm ci: add ecp-dav back + use updated container image w rocm-llvm-dev installed (#44827) 2024-06-24 16:48:40 -06:00
Teague Sterling
3d91bfea75 libgtop: new package (#44642)
Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2024-06-24 15:43:27 -06:00
Wouter Deconinck
27e28b33ee py-fsspec: add v2024.3.1, v2024.5.0 (switch to hatchling) (#44491) 2024-06-24 18:16:37 +02:00
Teague Sterling
eaf8ac7407 py-orjson: add v3.8.14, v3.9.15, v3.10.3 and dependencies (#44514)
* py-uvloop: add v3.8.14, v3.9.15, v3.10.3 and dependencies

* rollback

* py-uvloop: add v3.8.14, v3.9.15, v3.10.3 and dependencies

* Update package.py
2024-06-24 18:15:11 +02:00
John W. Parent
1e9d550bc6 cmake: add v3.29.6 (#44616) 2024-06-24 09:19:14 -06:00
Harmen Stoppels
0282fe9efd spec_list: do not resolve abstract hashes (#44760) 2024-06-24 16:43:01 +02:00
eugeneswalker
ad3d9c83fe e4s: add glvis (#44812) 2024-06-24 05:57:54 -07:00
Adam J. Stewart
910b923c5d JAX: add v0.4.29 (#44683)
Co-authored-by: adamjstewart <adamjstewart@users.noreply.github.com>
2024-06-24 08:43:45 +02:00
snehring
c1f1e1396d libint: add v2.9.0 (#44776)
Signed-off-by: Shane Nehring <snehring@iastate.edu>
2024-06-24 08:36:13 +02:00
Neil Flood
3139dbdd39 py-rios: add v2.0.2 (#44779) 2024-06-24 08:35:00 +02:00
Teague Sterling
cc6dcdcab9 htslib: add variants (#44501)
Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2024-06-24 08:34:10 +02:00
Wouter Deconinck
57b83e5fb2 root: depends_on libc only on linux (#44823) 2024-06-22 20:39:08 +02:00
Andrey Perestoronin
55fe73586e add packages from intel-oneapi-2024.2.0 release (#44789) 2024-06-21 16:17:54 -06:00
Auriane R
3be497344a py-triton: add main (#44774)
* Add triton env variable to build with clang

* Add git url and main branch for py-triton
2024-06-21 08:57:58 -07:00
Carlos Bederián
05413689b9 python: add v3.12.4 (#44801) 2024-06-21 17:00:39 +02:00
Alberto Invernizzi
50da223888 lua-lpeg: add latest version (#44803) 2024-06-21 07:16:08 -06:00
Alberto Invernizzi
719b260cf1 tree-sitter: new versions (#44802) 2024-06-21 06:52:37 -06:00
Tamara Dahlgren
3848c41494 Bugfix: test_is_externally_detectable needs to use mockpackages (#44795) 2024-06-21 12:24:46 +02:00
Veselin Dobrev
da720cafd8 openssl: add latest OpenSSL versions, deprecate previous ones (#44620) 2024-06-21 08:32:23 +02:00
AcriusWinter
8f0b029308 openpmd-api: Changed from old to new test API (#44764)
* changed from old to new test API
* changed test name so it works
* small docstring change
* fixed skiptest check
* made tests check output
2024-06-20 15:06:48 -07:00
Pierre Blanchard
901f4b789d sleef: add v3.6.1 (#44784)
Update preferred version to SLEEF 3.6.1
2024-06-20 14:47:54 -07:00
kwryankrattiger
6e6fef1b0e ParaView: add 5.13.0-RC1 (#44785)
* ParaView: add 5.13.0-RC1
2024-06-20 14:44:26 -07:00
George Young
ae131a5c7c py-multiqc: updating to @1.21, adding new dependency py-pyyaml-env (#44768)
* py-multiqc: updating to @1.21, adding new dependency py-pyyaml-env
* Correcting dependencies
* Drop matplotlib depends_on() for versions of multiqc not included in package.py

---------

Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
2024-06-20 14:28:54 -07:00
MichaelLaufer
ed59e43e1d py-pyfr: Add v2.0.2 (#44487)
* py-gimmik: add v3.2.1
* py-pyfr: add v2.0.2
2024-06-20 14:22:29 -07:00
Loris Ercole
5e8f9ed1c7 elpa: new version and recipe tuning (#44749)
Add version 2024.03.001 and fix #43902.

---------

Co-authored-by: Rocco Meli <r.meli@bluemail.ch>
2024-06-20 14:16:10 -07:00
Brian Van Essen
9c31ff74c4 Allow zlib to find external installations. (#44694)
* Allow zlib to find external installations.
* Apply suggestions from code review
* Fixed the determine_version function to loop over all of the potential
libraries that could be installed by zlib (see the sketch after this entry).

---------

Co-authored-by: John W. Parent <45471568+johnwparent@users.noreply.github.com>
2024-06-20 13:40:23 -07:00
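
A hedged sketch of the library-based detection hook described in the zlib commit above; the pattern and version regex are illustrative, not the merged code:

```python
import re

from spack.package import *


class Zlib(Package):
    # Library name pattern that `spack external find` should search for
    # (illustrative).
    libraries = [r"libz\."]

    @classmethod
    def determine_version(cls, lib):
        # Called for each candidate library found on the system; pull the
        # version out of names like libz.so.1.2.13 or libz.1.2.13.dylib.
        match = re.search(r"libz\D*(\d+\.\d+(?:\.\d+)?)", lib)
        return match.group(1) if match else None
```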
AcriusWinter
90c8fe0182 snakemake: Changed test method name and added docstring (#44723)
* snakemake: Changed test method name and added docstring
2024-06-20 10:28:29 -07:00
John W. Parent
58db81c323 bootstrap: test building clingo from sources on windows (#44624) 2024-06-20 16:39:58 +00:00
snehring
e867008819 votca: tighten up boost dep (#44777)
* votca: tighten up boost dep

Signed-off-by: Shane Nehring <snehring@iastate.edu>
2024-06-20 10:13:08 -06:00
Paul R. C. Kent
9910e06b25 Update package.py (#44772) 2024-06-20 17:41:15 +02:00
Harmen Stoppels
a3ece7ff4d cmake: remove version deprecated in 0.22 (#44628) 2024-06-20 17:15:26 +02:00
Robert Manson-Sawko
00f0ca2060 openmpi: add with-lsf-libdir config option (#44563) 2024-06-20 08:25:25 -06:00
Derek Ryan Strong
35557ac21c pmix: add v5.0.2, v4.2.9, v4.2.8, v4.2.7 (#44576) 2024-06-20 14:49:00 +02:00
Josh Bowden
54662f7ae1 damaris: add v1.11.0 (#44610)
Co-authored-by: Joshua Bowden <joshua-chales.bowden@inria.fr>
2024-06-20 14:30:11 +02:00
Wouter Deconinck
9cdb2a8dbb dd4hep: depends_on root +root7 in some cases (#43671)
* dd4hep: depends_on root +root7 in some cases

* dd4hep: avoid self-referential whens -> conflicts

* dd4hep: avoid conflict with (narrower) depends_on root@6.27:
2024-06-20 14:22:47 +02:00
Marcel Breyer
ffdab20294 googletest: fix checking a variant condition (#44781)
Change "pthread" to "+pthread" since otherwise the spec is never satisfied and pthread is never used.
2024-06-20 05:59:53 -06:00
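
The distinction being fixed, sketched in a hypothetical package method: `spec.satisfies("+pthread")` tests the variant, whereas the bare name looks up a dependency called pthread and never matches here.

```python
def cmake_args(self):
    args = []
    # Correct: test the variant. The previous check, `if "pthread" in
    # self.spec:`, looked for a *dependency* named pthread and never matched.
    if self.spec.satisfies("+pthread"):
        args.append(self.define("gtest_disable_pthreads", False))
    return args
```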
Dominic Hofer
e91a69a756 py-tabulate: add v0.8.10, v0.9.0 (#44604) 2024-06-20 13:51:50 +02:00
Matthew Lesko
7665076339 ESMF: set ESMF_LAPACK_LIBPATH if +external-lapack (#44672)
This PR sets the `ESMF_LAPACK_LIBPATH` variable (from the `lapack` spec
attributes), removing the previous FIXME and commented example.
2024-06-20 13:48:38 +02:00
Thomas Madlener
49ba2d84a0 edm4hep: Make edm4hep extend python after upstream changes (#44681) 2024-06-20 13:47:14 +02:00
Robert Cohn
0cec923e0a intel-oneapi-mpi: update mpi wrappers to icx variants (#44688) 2024-06-20 13:46:32 +02:00
Nick Hagerty
b97b001dad hipfort: add non-system gcc support (#44612) 2024-06-20 13:45:18 +02:00
Stephen Sachs
113e231abe Remove deprecated intel-* packages (#44700)
Packages will be removed with https://github.com/spack/spack/pull/44689. This PR
makes sure that the `aws-pcluster-x86_64_v4` stack still works as expected.
2024-06-20 07:07:27 -04:00
Adam J. Stewart
f43ca7a554 py-lightning: add v2.3 (#44731)
Co-authored-by: adamjstewart <adamjstewart@users.noreply.github.com>
2024-06-20 11:32:53 +02:00
Robert Cohn
3c2c215619 [intel-*] remove deprecated packages (#44689)
* [intel-*] remove deprecated packages

* undo delete of mkl, mpi, parallel-studio
2024-06-20 11:31:38 +02:00
Frédéric Simonis
a2b3a004bf precice: add version 3.1.2 (#44583) 2024-06-20 11:17:11 +02:00
Harmen Stoppels
f650133f83 build_environment: fix ccache error handling (#44740) 2024-06-20 11:16:24 +02:00
Thomas Helfer
81f9d5baa5 tfel: add support for versions up to 4.2.1 (#44578) 2024-06-20 11:09:50 +02:00
psakievich
3938a85ff8 nalu-wind: update submodules (#44687)
Co-authored-by: psakievich <psakievich@users.noreply.github.com>
2024-06-20 10:41:44 +02:00
Thomas Madlener
84cb604b19 podio: Add version 1.0 (#44780)
Co-authored-by: Juan Miguel Carceller <22276694+jmcarcell@users.noreply.github.com>
2024-06-20 02:31:55 -06:00
Alec Scott
093504d9a0 go: drop deprecated versions prior to v0.22 release, clean up build location (#44778) 2024-06-20 10:06:54 +02:00
Alex Richert
70a93a746d py-wxflow: new package (#44754)
* Add py-wxflow, a set of tools used in weather workflows
* Update package.py
* add 0.2.0
* add unit testing
2024-06-19 16:19:31 -06:00
Adam J. Stewart
4326efddbf py-maturin: add v1.6.0 (#44734)
* py-maturin: add v1.6.0
* bzip2 only on macOS
2024-06-19 15:02:59 -07:00
Joe Schoonover
6f51b543f0 Update hohqmesh and feq-parse packages (#44737)
* Update hohqmesh and feq-parse versions
   Update feq-parse homepage to new documentation page
   Update feq-parse license to be consistent with 2.2.2 release

---------

Co-authored-by: fluidnumerics-joe <fluidnumerics-joe@users.noreply.github.com>
2024-06-19 14:54:58 -07:00
Christian Glusa
3316e49ad3 Binder: Add newer version (#44741) 2024-06-19 11:14:43 -07:00
Stephen Nicholas Swatman
0c4a91cd18 actsvg: add versions up to 0.4.44 (#44770)
This commit adds new versions of actsvg. I have also taken the liberty
of adding myself as a maintainer, as I work actively on the ACTS project and
its dependencies, and would like to start keeping the Spack specs more
up to date.
2024-06-19 11:13:17 -07:00
Dave Keeshan
07c6c3ebac verilator: add v5.026 (#44757) 2024-06-19 11:07:56 -07:00
SXS Bot
e3a4a07616 spectre: add v2024.06.18 (#44761)
Co-authored-by: sxs-bot <sxs-bot@users.noreply.github.com>
2024-06-19 11:06:12 -07:00
Stephen Nicholas Swatman
57467139e5 covfie: new package (#44771)
This commit adds the covfie package which facilitates the
high-performance storage of vector fields and other structured
multi-dimensional data.
2024-06-19 10:34:32 -07:00
Carlos Bederián
e8bc53f37b ucx: add v1.17.0 (#44767) 2024-06-19 13:24:54 +02:00
Dan Bonachea
3b78515fd4 upcxx package: Add resilience to broken libfabric (#44618)
Some systems have a libfabric install that doesn't work, so don't
drop dead if a call to `fi_info` fails (e.g. due to missing shared libraries)
2024-06-18 18:10:04 -07:00
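
A hedged sketch of that resilience: any failure to execute `fi_info` (missing binary, missing shared libraries, nonzero exit) is treated as "no libfabric information" instead of a fatal error. The helper name is illustrative:

```python
import subprocess


def probe_libfabric(fi_info="fi_info"):
    """Return `fi_info -l` output, or None if libfabric is unusable."""
    try:
        result = subprocess.run(
            [fi_info, "-l"], capture_output=True, text=True, check=True
        )
    except (OSError, subprocess.CalledProcessError):
        return None  # broken install: degrade gracefully
    return result.stdout
```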
downloadico
6b052c3af9 Abinit fix hdf5 (#44763)
* abinit: fix locating HDF5
Remove the check in the configure script to locate HDF5; replaced by using
Spack to locate the package.
2024-06-18 16:28:11 -06:00
AcriusWinter
37e2d46d7d Legion: Reformatted Old To New Test Method and skipping tests (#44733)
* Legion: reformatted old test method to match new test method
* Updated docstring and how cmake file is opened
2024-06-18 12:18:56 -07:00
eugeneswalker
a389eb5a08 e4s external rocm ci: bump rocm stack to v6.1.1 (#44449)
* e4s external rocm ci: bump rocm stack to v6.1.1

* comment out exago+rocm due to issue with raja@0.14.0 see spack issue #44593

* comment out adios2+rocm due to spack issue #44594

* comment out petsc+rocm due to spack issue #44600

* comment out sundials+rocm due to spack issue #44601

* comment out slepc+rocm due to petsc spack issue #44600

* comment out tau+rocm due to spack issue #44659

* comment out ecp-data-vis-sdk due to spack issue #44745

* packages: register rocm-core as external

* re-enable tau due to issue #44659 having been resolved

* use latest ci image: ecpe4s/ubuntu22.04-runner-amd64-gcc-11.4-rocm6.1.1:2024.06.17

* comment out paraview due to spack issue #44745

* comment out ecp-data-vis-sdk +vtkm due to issue https://gitlab.spack.io/spack/spack/-/jobs/11632511
2024-06-18 11:43:03 -07:00
Neil Flood
d57f174ca3 py-rios: add 1.4.17, v2.0.1 (#44679)
* Update for 2.0.1
* cloudpickle dependency is only 'run'
* Follow new formatting guidelines
* black wants trailing commas
* Simplified version ranges, as recommended by @tldahlgren
2024-06-18 10:35:43 -06:00
AcriusWinter
e6ae42b1eb vtk-m: Changed test method names and skipping non-applicable tests from old to new approach (#44705)
* vtk-m: Changed test method names and skipping non-applicable tests from old to new approach

-----
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-06-18 09:40:17 -06:00
George Young
d911b9c48d seacr: new package @1.4-b2 (#42677)
* seacr: new package @1.4-b2

* Update var/spack/repos/builtin/packages/seacr/package.py

Co-authored-by: Alec Scott <hi@alecbcs.com>

---------

Co-authored-by: LMS Bioinformatics <bioinformatics@lms.mrc.ac.uk>
Co-authored-by: Alec Scott <hi@alecbcs.com>
2024-06-18 07:37:58 -07:00
Tim Haines
3b35b7f4fa Boost: switch from jfrog to boost.io for downloads (#44728)
The jfrog hosting will be shut down in Dec 2024.
2024-06-18 08:40:03 +02:00
dependabot[bot]
a82fb33b31 build(deps): bump flake8 from 7.0.0 to 7.1.0 in /lib/spack/docs (#44752) 2024-06-18 08:27:15 +02:00
dependabot[bot]
5f35a90529 build(deps): bump urllib3 from 2.2.1 to 2.2.2 in /lib/spack/docs (#44751) 2024-06-18 08:27:04 +02:00
dependabot[bot]
81c620b61b build(deps): bump flake8 from 7.0.0 to 7.1.0 in /.github/workflows/style (#44750) 2024-06-18 08:26:18 +02:00
Chris White
12866eb0d6 ascent: add v0.9.3 (#44571)
* add new ascent version
* add requirement for new version of umpire/raja
* add patch for vtk-m dependency
2024-06-17 19:37:50 -06:00
arezaii
24b8d0666e Chapel package: major update (#42197)
* add cray detection taken from upcxx
* add CUDA/ROCm support
* add numerous pass-through options to the Chapel build,
  like gpu_mem_strategy, comm_substrate, etc.; all variants are
  translated to analogous CHPL_* environment variables. As a side
  effect, this defines a number of environment variables that are
  not actually used by Chapel (see the sketch after this entry).
* Define LD_LIBRARY_PATH, LIBRARY_PATH, and PKG_CONFIG_PATH to
  help programs built with Chapel properly locate needed runtime
  dependencies

---------

Co-authored-by: bonachea <dobonachea@lbl.gov>
2024-06-17 18:09:05 -06:00
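
A hedged sketch of the variant-to-environment translation flagged in the entry above; the two variant names come from the commit message, and the mapping logic is illustrative:

```python
def setup_build_environment(self, env):
    # Each pass-through variant becomes an analogous CHPL_* variable,
    # e.g. comm_substrate -> CHPL_COMM_SUBSTRATE.
    for name in ("comm_substrate", "gpu_mem_strategy"):
        value = self.spec.variants[name].value
        if value != "none":
            env.set("CHPL_" + name.upper(), value)
```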
Tamara Dahlgren
0f2c7248c8 Bugfix: omega-h stand-alone tests: ensure proper ordering (#44748) 2024-06-17 15:17:30 -06:00
Alec Scott
8cc4ad3ac5 pass: install autocompletion for all shells (#44744) 2024-06-17 12:35:02 -07:00
Dan Lipsa
9ba90e322e PROJ: add new versions, improve tiff patches (#42767)
* Projsync needs curl

This fixes a proj compilation error from paraview

* tiff does not set TIFF_INCLUDE_DIR, which is tested by proj

* TIFF patch exists for proj 9.2

* Add comments with location of proj patch

* Use TIFF patches for different cmake versions

* Use modules starting with cmake 3.28 and variables before that

* Fix black style

* Simpler patches, add newer proj

* Remove duplicated flag

* Deprecate PROJ 4.8 and older

* Remove PRIVATE

* Deprecate PROJ 7.0.0

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-06-17 15:33:22 +02:00
Taillefumier Mathieu
d9c6b40d8e [cp2k] Enforce exclusion of libxsmm main (#44739)
Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2024-06-17 07:30:15 -06:00
Alberto Invernizzi
d792e1f052 update minimum libvterm version for 0.10.0 (#44720)
7a5effb0f9
2024-06-15 21:15:50 -07:00
Todd Gamblin
2d464f8c89 spack-python: remove superfluous /usr/bin/env (#44724)
Not sure why I had this here, as `/bin/sh` will find the first spack in `PATH` just like
`env`.

- [x] remove `/usr/bin/env` and avoid an extra process launch.

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-06-15 08:11:19 -07:00
AcriusWinter
456a8e3553 omega-h: reformatted from old test method to new test method (#44712)
* omega-h: reformatted from old test method to new test method
* omega-h: added proper output checking

--------
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-06-14 20:03:39 -06:00
AcriusWinter
db94696cf0 libstdcompat: removed duplicate build-time test method (#44725)
* libstdcompat: removed stand-alone test method
2024-06-14 18:40:03 -06:00
kwryankrattiger
72bb656b9e Intel MKL express requirements as requires (#44727)
The `+cluster` variant requires there to be an MPI family known in the
spec. When using externals it is easy to misconfigure this requirement,
leading to an undesirable runtime exception. This converts the
exception to a package rule (sketched below).
2024-06-14 19:07:37 -04:00
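
Such a package rule can be expressed with Spack's `requires` directive; in this hedged sketch the MPI list and message are illustrative, not the merged change:

```python
# Illustrative: require exactly one known MPI when +cluster is enabled.
requires(
    "^intel-oneapi-mpi",
    "^mpich",
    "^openmpi",
    policy="one_of",
    when="+cluster",
    msg="+cluster needs a known MPI implementation in the spec",
)
```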
James Smillie
e092026eb8 muparser: refactor to use new multi-build-system logic (#44552) 2024-06-14 13:43:20 -07:00
psakievich
e5f5749d67 HDF5: Fix typo in symlink paths (#44711) 2024-06-14 12:48:32 -06:00
renjithravindrankannath
6e4f8ea7e4 hip: Adding hipother for cuda support starting from 6.0 (#44453)
* Adding hipother for cuda support starting from 6.0
* Updating the version check as per current spec
2024-06-14 08:26:07 -07:00
Massimiliano Culpo
5e8eff24d2 Build developer-tools pipeline only on manylinux (#43811) 2024-06-14 10:13:06 +02:00
Alec Scott
36f1801eb8 rclone: add v1.65.2 (#44657) 2024-06-13 21:28:14 -05:00
Alec Scott
e3deee57ba go: add v1.22.4 (#44656) 2024-06-13 21:28:06 -05:00
Alec Scott
404deb99f4 hugo: add v0.127.0 (#44626) 2024-06-13 21:27:56 -05:00
Alec Scott
f594bc7aea fd: add v10.1.0 (#44623) 2024-06-13 21:27:46 -05:00
Harshula Jayasuriya
9ed948523a openmpi: disable remark 10441 for intel classic 2021.7.0 or newer (#44614)
* Compilation of openmpi fails when intel classic compiler 2021.7.0
  or newer is used.
2024-06-13 17:58:50 -07:00
Wouter Deconinck
72d7c2d558 xorg pkgs: update versions and homepage cgit -> gitlab (#44420)
* xorg libs: new versions
* xorg pkgs: update homepage from cgit to gitlab
* xorgs pkgs: fix homepage since, yeah, I did edit those 122 files by hand...
* libxmu: don't need the .0 patch version here

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-06-13 17:42:17 -07:00
Matt Thompson
d4a7582955 gh: add 2.50.0 (#44670) 2024-06-13 16:46:52 -07:00
dependabot[bot]
42ac1f0cb2 build(deps): bump codecov/codecov-action from 4.4.1 to 4.5.0 (#44710)
Bumps [codecov/codecov-action](https://github.com/codecov/codecov-action) from 4.4.1 to 4.5.0.
- [Release notes](https://github.com/codecov/codecov-action/releases)
- [Changelog](https://github.com/codecov/codecov-action/blob/main/CHANGELOG.md)
- [Commits](125fc84a9a...e28ff129e5)

---
updated-dependencies:
- dependency-name: codecov/codecov-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-13 16:45:24 -07:00
Mathew Cleveland
4af61d432f add draco-7_18_0 to package.py (#44673)
Co-authored-by: Cleveland <cleveland@lanl.gov>
2024-06-13 16:43:00 -07:00
Teague Sterling
52bdaa7bf5 perl-dbd-mysql: add v4.052, v5.005 (#44503)
* Fixed some issues with installation breaking due to mysql_client

* Remove debugging details

* Adding client_only to deps

* Adding client_only to variant and deps

* Undoing client_only in variant and deps
2024-06-13 16:42:23 -07:00
Diego Alvarez S
96b42238c5 openjdk: update default version to 17.x (#44647)
* Set default version of OpenJDK to 17.x

* Fix OpenJDK newer versions in Apple Silicon

* Specify java@11 for astral and stc
2024-06-13 16:40:50 -07:00
Teague Sterling
c7bd259739 Adding pkgconfig dependency to startup-notification (#44709)
Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2024-06-13 17:37:51 -06:00
Jonathon Anderson
0dfc360b1e hpctoolkit: conflicts with elfutils @0.191 (#44696)
See https://gitlab.com/hpctoolkit/hpctoolkit/-/issues/831
2024-06-13 12:25:37 -07:00
AcriusWinter
5e578e2e4e Converted warpx from the old format to the new format for stand-alone testing (#44677)
* Converted warpx from the old format to the new format for stand-alone testing

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-06-13 11:34:17 -07:00
AcriusWinter
93111d495b Changed sz3 stand alone tests from old to new API (#44691) 2024-06-13 10:45:16 -07:00
Howard Pritchard
b1bbe240d7 py-xtb: fix problem with meson file (#44595)
Using meson 1.3.2, the py-xtb package fails
during the install process. The error signature is shown in

    https://github.com/grimme-lab/xtb-python/pull/114

Signed-off-by: Howard Pritchard <howardp@lanl.gov>
2024-06-13 10:18:42 -07:00
Jean Luca Bez
bbb58ff4c6 update PDC version (#44702) 2024-06-13 11:15:33 -06:00
Sam Gillingham
7a710bee17 update py-tuiview to 1.2.14 (#44692)
* update py-tuiview to 1.2.14
* remove whitespace
2024-06-13 10:07:36 -07:00
renjithravindrankannath
bc3903d0e0 New patch to add half include path for tests in 6.1 (#44697)
* Add half include path for tests in 6.1
* removing unwanted when="@:6.1"
* Correcting condition to apply patch
* Style check error fix and updating rpp dependency
* Audit check error fix
* Correcting rpp dependency and adding hsa-rocr-dev lib path to LD_LIBRARY_PATH
2024-06-13 09:42:56 -07:00
Simon Pintarelli
61c8326180 cleanup q-e-sirius recipe (#44698) 2024-06-13 09:39:06 -07:00
Adrien Bernede
c5caa4b838 Add latest releases to camp, raja and umpire (#44699)
* Add latest releases to camp, raja and umpire
2024-06-13 09:31:38 -07:00
Joe Schoonover
ebdff73c8c Add new versions of feq-parse (#44701) 2024-06-13 09:25:54 -07:00
eugeneswalker
f78beb71f7 tau@2.33.2 +rocm: patch to disable llvm plugin (#44690) 2024-06-13 09:13:41 -07:00
Simon Frasch
74210c7f46 sirius: fix spla+openmp requirement (#44685) 2024-06-13 06:53:28 +02:00
Harmen Stoppels
eff4451cdd Revert "linux-pam: fixes #44637" (#44682)
* Revert "linux-pam: fixes #44637 (#44638)"

This reverts commit 9151fc1653.

* linux-pam: drop redundant deps, add missing, don't build docs, nls
2024-06-12 13:13:11 -07:00
Teague Sterling
8d0fc3e639 gettext: fix condition (#44680) 2024-06-12 13:24:42 -06:00
Taillefumier Mathieu
3736da3f89 Update cp2k old recipe (#44650)
The old build system has issues with the library order. This commit is an attempt
to fix it.
2024-06-12 10:11:02 +02:00
Melven Roehrig-Zoellner
221e464df3 py-pyopengl: new package (#32357)
* py-pyopengl: new simple package

* py-pyopengl: Fix typo in comment

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-pyopengl: add variants and dependencies

* py-pyopengl: build from source and improve variants

* py-pyopengl: use corrected freeglut libs

* py-pyopengl: update copyright

* py-pyopengl: remove duplicate conflict clause

* py-pyopengl: change dependency to link

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-06-12 09:52:49 +02:00
Gavin John
bac5253169 py-biopython: Require python <= 3.9 for certain versions (#44668) 2024-06-12 08:57:56 +02:00
AMD Toolchain Support
fc2ee5cae8 WRF 4.5.2 support is added for AOCC compilers (#44584)
Co-authored-by: Raviteja K <rkuppili@amd.com>
2024-06-11 18:45:08 -07:00
psakievich
e11f83f34b Nalu-Wind Tioga Dependency (#44675)
* Nalu-Wind Tioga Dependency
   We created the tioga@1.0.0 tag for reproducing the exawind 1.0 release
* Update tioga tags
2024-06-11 18:44:37 -07:00
Veselin Dobrev
6e7fb9a308 MFEM: add new version v4.7 (#44010)
* Core change: logic for extracting RPATHs from modules may return
  `None`: filter this out of the set of RPATHs that is auto-generated
* Core change: `CachedCMakePackage` no longer adds ldflags to
  `CMAKE_STATIC_LINKER_FLAGS`: generally these flags are not appropriate
  for static linking (e.g. invocation of `ar`)
* [mfem] Add version 4.7
* [mfem] Add variant for precision (single/double). Enforce consistency
  for precision amongst mfem and hypre/petsc/mumps dependencies
* [mfem] Add cxxstd (and related constraints preventing use of
  old cxxstd values for newer versions of some dependencies)
* [hypre] In line with prior point, added support for specifying
  precision
* [petsc] Add config option to avoid error when building against
  `superlu-dist+rocm`
* [hiop] add proper `raja`/`umpire`/`camp` version constraints for
  `hiop` versions 0.3.99-0.4.x; require `+raja` for `+rocm`, and
  add dependency on `hiprand` for `+rocm`
* [butterflypack, mfem, strumpack, suite-sparse] Require
  `CRAYLIBS_{target-family}` env var to be defined
* [suite-sparse] versions `@7.4:` changed install location of headers:
  add symlink from old location to new location
* [zlib-ng] Fix error where shared libs were not successfully built for
  `%cce@17` (the build did not fail, but the finished `zlib-ng%cce@17`
  install did not have shared libs)
2024-06-11 16:39:22 -07:00
renjithravindrankannath
b156a62a44 making suite-sparse library path generic in hipsolver (#44473)
* suite-sparse library path can be lib or lib64
* fixing style check error
* Adding the missing import os
* Style check error fix
* Updating patch for 6.1.0 with correct lib path
2024-06-11 14:41:42 -07:00
Elliott Slaughter
990e77c55f legion: Requires C++17 minimum after 24.03.0. (#44596) 2024-06-11 14:10:04 -07:00
Matt Thompson
c2fb529819 fms: add llvm-openmp for apple-clang +openmp (#44667)
* fms: add llvm-openmp for apple-clang
2024-06-11 14:08:31 -07:00
psakievich
337bf3b944 Enable and constrain reuse for GitVersion installations (#43859)
* Preserve higher weight for CLI git ref versions

Currently the concretizer fails if you reuse a git ref version
that has already been installed but modify the spec at all.

See #38484 for further diagnosis.

The issue here is that, since there is no established provenance for
these versions, the highest weight they are currently assigned is
that of a prior install. Reuse checks then fail because, to the solver,
the weight of the version is identical.

Ironically, these versions are given the highest weights possible when
specified on the CLI for the first time.  They should only appear in a
DAG if they are an exact match or if the user specifies them at the CLI.
Therefore it makes sense to preserve their higher ordering.

Getting this right is critical to moving all branch-based versions to a pinned
git-ref in the future.

* [@spackbot] updating style on behalf of psakievich

* Update lib/spack/spack/solver/asp.py

Co-authored-by: Greg Becker <becker33@llnl.gov>

* Add provenance specific to git ref installs

* Sensitivity to a name that I could not track down

* Add regression test

* Adjust test

* Add prefer standard unit-test

* Style

* Add required mock

* Format and mark

* Make unit-test case reproduce CLI investigation

* Remove unnecessary mock package

* [@spackbot] updating style on behalf of psakievich

* Use already developed fixture

* Add zlib-ng to mocks again

* Remove accidental adds

* Remove maintainer

* [@spackbot] updating style on behalf of psakievich

* Rename test file

* [@spackbot] updating style on behalf of psakievich

* Remove unused imports

* Update tests

* [@spackbot] updating style on behalf of psakievich

* Style

* Update lib/spack/spack/test/concretize.py

Co-authored-by: Greg Becker <becker33@llnl.gov>

* Update solver rule

* Duplicate installed rules for installed_git_version

* Revert "Duplicate installed rules for installed_git_version"

This reverts commit 17223fc8d1.

---------

Co-authored-by: psakievich <psakievich@users.noreply.github.com>
Co-authored-by: Greg Becker <becker33@llnl.gov>
2024-06-11 14:57:09 -06:00
Nuno Nobre
a49b2f4f16 Use https, not ssh, for AMS git clones (#44669) 2024-06-11 13:53:27 -07:00
AcriusWinter
ddcf1a4b2e Renamed build-time test method (#44572)
* libdistributed: renamed test to check_test
* libpressio-rmetric: renamed test method
* libpressio-tools: renamed test method
* Renamed build-time test methods for: libpressio-opt, libpressio-tthresh, opened

---------

Co-authored-by: Jackson Peter Lawrence <lawrence31@borax5.llnl.gov>
2024-06-11 13:22:50 -06:00
Simon Frasch
82a932c078 spla: add v1.6.0 (#44609) 2024-06-11 13:06:50 +02:00
Harmen Stoppels
0781615117 openssl: remove deprecated versions (#44629) 2024-06-11 10:13:13 +02:00
Teague Sterling
9151fc1653 linux-pam: fixes #44637 (#44638)
* package/linux-pam: dependencies

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* package/linux-pam: Fixes #44637

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* Update var/spack/repos/builtin/packages/linux-pam/package.py

Co-authored-by: Alec Scott <hi@alecbcs.com>

* Update var/spack/repos/builtin/packages/linux-pam/package.py

Co-authored-by: Alec Scott <hi@alecbcs.com>

* [@spackbot] updating style on behalf of teaguesterling

---------

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
Co-authored-by: Alec Scott <hi@alecbcs.com>
2024-06-10 23:41:11 -06:00
Robert Underwood
3a83b21ce1 improvements for dlib;highway;libjxl (#44348)
* improvements for dlib;highway;libjxl

* Update var/spack/repos/builtin/packages/dlib/package.py

Co-authored-by: Alec Scott <hi@alecbcs.com>

* fix style

---------

Co-authored-by: Robert Underwood <runderwood@anl.gov>
Co-authored-by: Alec Scott <hi@alecbcs.com>
2024-06-10 16:50:36 -07:00
William Hart
cfc042d901 cppad: add v20180000.0, v20240000.4, master (#44299)
* Updating cppad package

* Disabling doc generation

* Adding whart222 as maintainer

* Style fixes

* Pushing changes based on Tamara's feedback.

* Updating sha256 hash for version 20180000.0

* Reorg of declarations

* Reworking cmake args

* Reformatting
2024-06-10 16:48:19 -07:00
Teague Sterling
211ad9e7d9 gdk-pixbuf: adding variant version constraint (#44645)
Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2024-06-10 16:40:21 -07:00
Teague Sterling
437b259829 perl-http-tiny: new package (#44502)
* Adding perl-http-tiny package

* Update package.py
2024-06-10 17:23:03 -06:00
Loïc Pottier
f524bba869 flux-sched: fixed missing modules in PATH when lib64 and lib are both used (#44658)
Signed-off-by: Loïc Pottier <pottier1@llnl.gov>
2024-06-10 17:03:00 -06:00
Kelly (KT) Thompson
2f31fb5f17 lcov: add master (#42498)
* Provide instructions for building a developmental version of lcov.

* style fix

* style fix

* Promote single version git URL to the package level.

* Formatting fix.
2024-06-10 16:34:09 -06:00
Dave Keeshan
c3567cb199 verilator: Some environment variables are no longer required (#44655)
* verilator: Some environment variables are no longer required

* Revert depends_on for flex back to the default case

* Use the `when` decorator for setup_run_environment instead of an `if` inside the function (sketched below)
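
For illustration, a hedged sketch of that decorator pattern (base class,
version range, and variable name are assumptions, not the package's exact
values):

```python
# Illustrative skeleton only: guard the whole method with @when instead of
# branching on the spec inside the function body.
from spack.package import *


class Verilator(AutotoolsPackage):
    """Skeleton for illustration."""

    @when("@5.0:")  # version boundary is an assumption
    def setup_run_environment(self, env):
        env.set("VERILATOR_ROOT", self.prefix)  # variable name is an assumption
```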
2024-06-10 16:14:26 -06:00
Melven Roehrig-Zoellner
ae4c1d11f7 julia and libcxxwrap-julia: add new versions (#44635)
* julia: new version (1.10.4)

* libcxxwrap-julia: new version (0.12.5)
2024-06-10 14:29:46 -07:00
Thomas Madlener
cbab451c1a dd4hep: Add version 1.29 (#44652) 2024-06-10 12:54:18 -07:00
Thomas Madlener
9cec17ca26 lcio: Add latest tag v02-22 (#44606)
Co-authored-by: Valentin Volkl <valentin.volkl@cern.ch>
2024-06-10 11:56:31 -06:00
Jon Rood
d9c5d91b6f hdf5: solve cmake findhdf5 issue with hl components (#44423)
* hdf5: solve cmake findhdf5 issue with hl components

* Fix mistake.

* Update var/spack/repos/builtin/packages/hdf5/package.py

Co-authored-by: psakievich <psakiev@sandia.gov>

* Style.

* Style.

---------

Co-authored-by: psakievich <psakiev@sandia.gov>
2024-06-10 10:11:49 -06:00
Chris Green
6e194c6ffe root: new (old) versions and fixes for 6.32.00 (#44611)
* Older versions and backported patches
   * Add 6.28.12 and 6.24.04.
   * Backported patches for segfault in weighted likelihood fits.
   * Minor style/signature improvements.
* ROOT 6.32.00 fixes omitted from https://github.com/spack/spack/pull/44550
2024-06-10 17:59:33 +02:00
Mikael Simberg
8f0c28037b mold: add 2.32.0 (#44648) 2024-06-10 03:57:52 -06:00
Rocco Meli
31aabcabf7 elsi: add v2.10.1, master (#44579) 2024-06-10 09:13:06 +02:00
Melven Roehrig-Zoellner
ca9531d205 qt: new minor version 5.15.14 (#44632) 2024-06-09 17:03:15 -04:00
Harmen Stoppels
794c5eb6a0 git: remove deprecated versions (#44631) 2024-06-08 06:46:04 -07:00
Massimiliano Culpo
c6cc125e22 Remove failing unit-test on Windows (#44627) 2024-06-08 09:36:22 +02:00
Bernhard Kaindl
528c1ed9ba binutils: detect the "gold" and "headers" variants (#40214)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-06-07 18:08:55 -06:00
Kyle Gerheiser
52cc603245 fabtests: add v1.19.1 -> v1.21.0 (#44358) 2024-06-08 00:07:18 +02:00
Wouter Deconinck
5e55af2dce py-scikit-build-core: add v0.7.1, v0.8.2, v0.9.5 (#44492) 2024-06-08 00:00:45 +02:00
Wouter Deconinck
24ee7c8928 py-partd: add v1.4.1, v1.4.2 (#44497) 2024-06-07 23:59:07 +02:00
Vanessasaurus
605df09ae1 flux-sched: add v0.35.0 (#44602)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2024-06-07 23:56:39 +02:00
eugeneswalker
4aebef900c sundials+rocm: patch hip platform for rocm 5.7-6 (#44617) 2024-06-07 11:37:52 -06:00
eugeneswalker
59c5bef165 magma: add rocm-core versions >=6 (#44598) 2024-06-07 08:17:18 -07:00
eugeneswalker
a18adf74bf camp@0.2.3 +rocm: patch for rocm 6 from LLNL/camp PR#143 (#44597) 2024-06-07 08:16:59 -07:00
Rocco Meli
6426ab1b7e dftbplus: add v24.1 (#44613) 2024-06-07 08:54:51 -06:00
Nils Vu
7d1de58378 blaze: add v3.8.2, v3.8.1 (#44556) 2024-06-07 08:54:14 -06:00
Mikael Simberg
82a54378d8 pika: Add valgrind variant (#44558) 2024-06-07 07:33:08 -07:00
Adam J. Stewart
e6e8fada8b GEOS: add new versions (#44586) 2024-06-07 13:47:04 +02:00
Adam J. Stewart
7b541ac322 PyTorch: update ecosystem (#44585) 2024-06-07 13:46:50 +02:00
kwryankrattiger
b0a2ea3970 Generate jobs should use x86_64_v3 runners only (#44582) 2024-06-07 13:20:02 +02:00
Sreenivasa Murthy Kolam
cb439a09dd Deprecate older rocm releases- 5.2 till 5.4 (#43181) 2024-06-07 13:19:34 +02:00
Rocco Meli
f87ee334c2 CP2K: use dla-future-fortran depencency (#44603)
* Spglib: add version 2.4.0

* DLA-Future: fix +test option

* cp2k: add dla-future-fortran dependency

* Update var/spack/repos/builtin/packages/cp2k/package.py
2024-06-07 13:16:53 +02:00
Paul R. C. Kent
e8f8cf8543 Add llvm v18.1.7 (#44581) 2024-06-06 22:28:34 -06:00
Teague Sterling
8c93fb747b py-jproperties: add v1.0.1, v2.0.0, v2.1.0, v2.1.1 (#44519)
* py-jproperties: add v1.0.1, v2.0.0, v2.1.0, v2.1.1
* Incorporating changes from review: https://github.com/spack/spack/pull/44519#discussion_r1625170596
2024-06-06 14:40:07 -07:00
Alex Leute
1701e929bc py-pyseer and deps: new package (#44543)
* py-pyseer: Added package py-pyseer and dependency py-glmnet-python
* py-glmnet-python: Updated homepage, and added comments
* Update copyrights

---------

Co-authored-by: Jen Herting <jen@herting.cc>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-06-06 14:34:11 -07:00
Melven Roehrig-Zoellner
1bb3e04263 t8code: new versions (#44570) 2024-06-06 14:30:28 -07:00
AMD Toolchain Support
e91ae19ec4 amdlibm: fix build environment (#44054)
Co-authored-by: Branden Moore <branden.moore@amd.com>
2024-06-06 14:59:48 -06:00
Melven Roehrig-Zoellner
818ae08c61 py-h5py: fix py-cython version constraints (#44569)
* py-h5py: fix py-cython version constraints
* py-mpi4py: new version and updated dependencies
2024-06-06 13:22:30 -06:00
jmuddnv
15f32f2ca1 Update package.py (#44568)
New tarfiles for HPC SDK 24.5.  Release is 24.5-2.
2024-06-06 12:04:48 -06:00
Melven Roehrig-Zoellner
59aa62ea5c gdb: add version 14.2 (#44564) 2024-06-06 11:42:03 -06:00
SXS Bot
b4c292ddd0 spectre: add v2024.06.05 (#44566)
Co-authored-by: sxs-bot <sxs-bot@users.noreply.github.com>
2024-06-06 11:01:55 -06:00
Pariksheet Nanda
a25655446a libtiff: add v4.6.0 and default disable opengl (Fixes #44545) (#44546)
* libtiff: add v4.6.0 and default disable opengl (#44545)

* libtiff: Fix typo in CMake key

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* libtiff: Broader description of OpenGL variant

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* libtiff: reformat using spack style black recommendation

* libtiff: couple opengl flag with autotools

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-06-06 18:05:29 +02:00
Carlos Bederián
cf3d59bb2e libarchive: add versions 3.7.4, 3.7.3, 3.7.2 (#44363) 2024-06-06 08:03:22 -06:00
dependabot[bot]
f80287166e build(deps): bump sphinx-design from 0.5.0 to 0.6.0 in /lib/spack/docs (#44360)
Bumps [sphinx-design](https://github.com/executablebooks/sphinx-design) from 0.5.0 to 0.6.0.
- [Release notes](https://github.com/executablebooks/sphinx-design/releases)
- [Changelog](https://github.com/executablebooks/sphinx-design/blob/main/CHANGELOG.md)
- [Commits](https://github.com/executablebooks/sphinx-design/compare/v0.5.0...v0.6.0)

---
updated-dependencies:
- dependency-name: sphinx-design
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-06 14:34:59 +02:00
Wouter Deconinck
329dc40b98 py-editables: add v0.4, v0.5 (switch to flit-core) (#44380) 2024-06-06 14:32:28 +02:00
Kyle Gerheiser
8328c34a3e libfabric: add v1.19.1 and v1.20.2 (#44357) 2024-06-06 14:25:07 +02:00
Melven Roehrig-Zoellner
6c309d3bc9 cgns: patch for gcc14 (#44562) 2024-06-06 06:16:11 -06:00
Rocco Meli
24b49eee83 py-gsd: add v3.2.1 (#44474) 2024-06-06 14:13:58 +02:00
Wouter Deconinck
715fab340f (py-)onnx: add v1.16.1 (#44485) 2024-06-06 14:11:23 +02:00
Wouter Deconinck
8231e84d15 py-ipykernel: add latest bugfixes from v6.23.3 through v6.29.4 (#44498) 2024-06-06 14:07:56 +02:00
Wouter Deconinck
8dc91a7a5c py-ipython: add latest bugfixes from v8.15.0 through v8.25.0 (#44499) 2024-06-06 14:07:16 +02:00
Teague Sterling
4c195b1a06 py-azure-mgmt-storage: add v20.0.0, v20.1.0 (#44511) 2024-06-06 14:05:08 +02:00
Teague Sterling
c6fb6bf5f8 py-nest-asyncio: v1.5.8, v1.5.9, v1.6.0 (#44512) 2024-06-06 14:03:28 +02:00
Teague Sterling
ed6161b80c py-rich: add v12.6.0,13.7.1 (#44515) 2024-06-06 14:02:50 +02:00
Derek Ryan Strong
93424e4565 hwloc: add v2.9.2, v2.9.3, v2.10.0 (#44577) 2024-06-06 04:00:46 -06:00
Teague Sterling
ca00e42f1d py-aiohttp: add 3.9.0, 3.9.4, 3.9.5 (#44510) 2024-06-06 11:22:31 +02:00
Teague Sterling
357ee1c632 py-uvloop: add v0.17.0,v0.18.0,v0.19.0 (#44513) 2024-06-06 11:15:22 +02:00
Derek Ryan Strong
cb0a3eaade neovim: add v0.10.0, v0.9.5 (#44365) 2024-06-06 11:11:45 +02:00
dependabot[bot]
a20d34b8aa build(deps): bump pytest from 8.2.1 to 8.2.2 in /lib/spack/docs (#44553)
Bumps [pytest](https://github.com/pytest-dev/pytest) from 8.2.1 to 8.2.2.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/main/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/8.2.1...8.2.2)

---
updated-dependencies:
- dependency-name: pytest
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-06 08:38:54 +02:00
Derek Ryan Strong
329910a620 ffmpeg: add v7.0, master (#44107)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-06-06 07:57:29 +02:00
Derek Ryan Strong
c374d04b0d wget: add v1.21.4 and v1.24.5 (#44574) 2024-06-06 07:56:22 +02:00
Derek Ryan Strong
b68cf16c85 Add gmake build dependency for unibilium (#44575) 2024-06-06 07:54:46 +02:00
Alex Richert
391c4cf099 netcdf-c: add logging variant (#43380)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Co-authored-by: Sergey Kosukhin <skosukhin@gmail.com>
2024-06-05 17:48:27 +02:00
Michael Kuhn
8260599e98 py-netcdf4: fix build with gcc@14 (#44134) 2024-06-05 17:38:54 +02:00
Brian Vanderwende
3433c8b8a5 ncl: consolidate patch methods (#44333) 2024-06-05 17:37:25 +02:00
Harmen Stoppels
e53bc780e4 libuuid: deprecate entirely (#44525)
There hasn't been a release in almost a decade, and build failures with
GCC 14 were reported. I don't think it makes sense to patch it, since
the project moved over to `util-linux`, and Spack's default provider is
that package. Better to just get rid of it in the next Spack release.
2024-06-05 17:36:09 +02:00
Teague Sterling
53346dbaa6 libuuid: address #44479 (#44481)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2024-06-05 08:14:20 -06:00
Chris Green
99994ea245 [root] New version 6.32.00 (#44550) 2024-06-05 08:27:13 -05:00
Harmen Stoppels
3ffe02a2fe spack edit: allow edit multiple files at once (#44416) 2024-06-05 14:19:11 +02:00
Carlos Bederián
9b77502360 openblas: add v0.3.27 (#44312) 2024-06-05 13:18:32 +02:00
Vanessasaurus
96a97328cf Automated deployment to update package flux-core 2024-06-05 (#44554)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2024-06-05 12:44:31 +02:00
Mikael Simberg
1a400383c0 valgrind: Add 3.21.0, 3.22.0, and 3.23.0 (#44557) 2024-06-05 12:33:00 +02:00
Harmen Stoppels
4f8ab19355 mold: unvendor (#44539) 2024-06-05 12:12:41 +02:00
Chris Marsh
8919677faf Work around the linker incompatibility that exists with fortran and apple-clang (#44547) 2024-06-05 02:16:52 -07:00
Chris Marsh
858b185a0e Armadillo: fix for linker error with apple-clang 15 (#44551)
* Armadillo needs to use -ld_classic with apple-clang 15
2024-06-05 01:02:43 -07:00
Tony Weaver
bc738cea32 py-your: new package (#44448)
* py-your: new package

Spack package recipe for YOUR, Your Unified Reader.  YOUR processes pulsar data in different formats.

Output below from spack install py-your
spack install py-your
[+] /usr (external glibc-2.28-oj2wjfl2ao5inhfz4qehw6hlck2hizvp)
[+] /opt/apps/spack/gcc-runtime-8.5.0-5k6kvi5
[+] /opt/apps/spack/libxcrypt-4.4.35-zigqpjo
[+] /opt/apps/spack/ncurses-6.4-xbvwv2w
[+] /opt/apps/spack/erfa-2.0.0-4qkta2n
[+] /opt/apps/spack/zlib-ng-2.1.6-ccn5qny
[+] /opt/apps/spack/pcre-8.45-33dfhul
[+] /opt/apps/spack/libpciaccess-0.17-2qdxdjo
[+] /opt/apps/spack/libmd-1.0.4-zbdiprt
[+] /opt/apps/spack/qhull-2020.2-klc7ewb
[+] /opt/apps/spack/bzip2-1.0.8-t65bq3t
[+] /opt/apps/spack/xz-5.4.6-axoznvt
[+] /opt/apps/spack/alsa-lib-1.2.3.2-a7icjdy
[+] /opt/apps/spack/zstd-1.5.6-nyk6gt6
[+] /opt/apps/spack/libffi-3.4.6-ibucrfe
[+] /opt/apps/spack/yasm-1.3.0-v4etmmm
[+] /opt/apps/spack/openblas-0.3.26-pfyk2vi
[+] /opt/apps/spack/pkgconf-1.9.5-ckjdqjm
[+] /opt/apps/spack/wcslib-7.3-zvcqq7o
[+] /opt/apps/spack/libiconv-1.17-jskazis
[+] /opt/apps/spack/unzip-6.0-mqftjtp
[+] /opt/apps/spack/libjpeg-turbo-3.0.0-vjvivme
[+] /opt/apps/spack/readline-8.2-2ys6ede
[+] /opt/apps/spack/openssl-3.2.1-4lqdgni
[+] /opt/apps/spack/pigz-2.8-rx263bp
[+] /opt/apps/spack/libpng-1.6.39-kiuku4y
[+] /opt/apps/spack/libbsd-0.12.1-njt5grs
[+] /opt/apps/spack/swig-4.0.2-fortran-z3sbnze
[+] /opt/apps/spack/binutils-2.42-vkjcvfr
[+] /opt/apps/spack/util-linux-uuid-2.38.1-w3kgjq3
[+] /opt/apps/spack/hdf5-1.14.3-sbuiw6q
[+] /opt/apps/spack/libedit-3.1-20230828-676jwbd
[+] /opt/apps/spack/nghttp2-1.57.0-u72gxms
[+] /opt/apps/spack/ffmpeg-6.1.1-2vhrbda
[+] /opt/apps/spack/libxml2-2.10.3-37klvxv
[+] /opt/apps/spack/gdbm-1.23-cylmqwx
[+] /opt/apps/spack/sqlite-3.43.2-axuxulg
[+] /opt/apps/spack/tar-1.34-wjzs4wj
[+] /opt/apps/spack/freetype-2.13.2-in4taxi
[+] /opt/apps/spack/expat-2.6.2-7kfe3hb
[+] /opt/apps/spack/curl-8.6.0-gpzsr3p
[+] /opt/apps/spack/hwloc-2.9.1-fhoz6al
[+] /opt/apps/spack/gettext-0.22.4-zjsp346
[+] /opt/apps/spack/lua-5.3.6-47356ia
[+] /opt/apps/spack/cfitsio-3.49-mmy3dbr
[+] /opt/apps/spack/python-3.10.13-fz7fymx
[+] /opt/apps/spack/elfutils-0.190-uswzaiw
[+] /opt/apps/spack/py-pytz-2023.3-ojdlhrd
[+] /opt/apps/spack/py-cycler-0.11.0-b7mjvv7
[+] /opt/apps/spack/py-pip-23.0-lxkcvby
[+] /opt/apps/spack/python-venv-1.0-2cz5c3s
[+] /opt/apps/spack/py-numpy-1.26.4-t5acjcv
[+] /opt/apps/spack/llvm-14.0.6-3nosumn
[+] /opt/apps/spack/py-packaging-23.1-wkeyuk6
[+] /opt/apps/spack/py-six-1.16.0-iv6iv3q
[+] /opt/apps/spack/py-pybind11-2.12.0-5rvupjv
[+] /opt/apps/spack/py-setuptools-69.2.0-3do76jw
[+] /opt/apps/spack/py-pyparsing-3.1.2-fq6imlt
[+] /opt/apps/spack/py-wheel-0.41.2-brm3k3h
[+] /opt/apps/spack/py-llvmlite-0.41.1-qom3l5h
[+] /opt/apps/spack/py-astropy-4.0.1.post1-xbojixg
[+] /opt/apps/spack/py-python-dateutil-2.8.2-kzdfskc
[+] /opt/apps/spack/py-numexpr-2.8.4-fc5xc5s
[+] /opt/apps/spack/py-h5py-3.11.0-y6drk3j
[+] /opt/apps/spack/py-tifffile-2023.8.30-oof4try
[+] /opt/apps/spack/py-pygments-2.16.1-stgrccl
[+] /opt/apps/spack/py-mdurl-0.1.2-nqk43ry
[+] /opt/apps/spack/py-scipy-1.13.0-vxjgfov
[+] /opt/apps/spack/py-contourpy-1.0.7-jg5lhss
[+] /opt/apps/spack/py-pillow-10.3.0-ijh2cju
[+] /opt/apps/spack/py-kiwisolver-1.4.5-vdh5sq5
[+] /opt/apps/spack/py-fonttools-4.39.4-x5ydbox
[+] /opt/apps/spack/py-bottleneck-1.3.7-ztsm4lg
[+] /opt/apps/spack/py-lazy-loader-0.4-k7hgvka
[+] /opt/apps/spack/py-numba-0.58.1-hzvrjwj
[+] /opt/apps/spack/py-markdown-it-py-3.0.0-l4p4qv5
[+] /opt/apps/spack/py-imageio-2.34.0-z5hu4yc
[+] /opt/apps/spack/py-matplotlib-3.8.4-azq2fzm
[+] /opt/apps/spack/py-pandas-1.5.3-p3gnh6t
[+] /opt/apps/spack/py-rich-13.4.2-okhgwwz
[+] /opt/apps/spack/py-networkx-3.1-b54br7r
[+] /opt/apps/spack/py-scikit-image-0.23.2-cvyzh3t
==> Installing py-your-0.6.7-djfzsn2lutp24ik6wrk6tjx5f7hil76x [83/83]
==> No binary for py-your-0.6.7-djfzsn2lutp24ik6wrk6tjx5f7hil76x found: installing from source
==> Fetching https://github.com/thepetabyteproject/your/archive/refs/tags/0.6.7.tar.gz
==> No patches needed for py-your
==> py-your: Executing phase: 'install'
==> py-your: Successfully installed py-your-0.6.7-djfzsn2lutp24ik6wrk6tjx5f7hil76x
  Stage: 1.43s.  Install: 0.99s.  Post-install: 0.12s.  Total: 3.12s

* Removed setup_run_environment

After some testing, both spack load and module load for the package will include the bin directory generated by py-your as well as the path to the version of python the package was built with, without the need for the setup_run_environment function.

I removed that function (Although, like Tamara I thought it would be necessary based on other package setups I used as a  basis for this package).

Note: I also updated the required version of py-astropy from py-astropy@4.0: to py-astropy@6.1.0:. In my test builds, the install was picking up py-astropy@4.0.1.post1 and numpy 1.26; however, when I tried to run some of the code, I got errors about py-astropy making numpy calls that have since been removed. The newer version of py-astropy corrects these. Ideally this would be handled in the py-astropy package, to make sure numpy isn't too new.
2024-06-04 21:38:38 -05:00
Teague Sterling
c2196f7d3a py-humanize: add v1.0.0, v1.1.0, v2.6.0, v3.14.0, v4.8.0, v4.9.0 (#44516)
* py-humanize: add v1.0.0, v1.1.0, v2.6.0, v3.14.0, v4.8.0, v4.9.0
2024-06-04 17:34:12 -07:00
Kyle Knoepfel
d45c27fdbd [intel-tbb] Speed up build and add versions (#44549)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2024-06-04 15:33:53 -06:00
Teague Sterling
173084de19 py-python-json-logger: add v2.0.2 (#44517)
* py-uvloop: add v3.8.14, v3.9.15, v3.10.3 and dependencies
* py-python-json-logger: add v2.0.2
2024-06-04 11:20:47 -07:00
Teague Sterling
fd2c5fa247 py-maturin: fix rust version dependencies to build (#44518)
* py-uvloop: add v3.8.14, v3.9.15, v3.10.3 and dependencies
* rollback
* py-maturin: fix rust version dependencies to build
2024-06-04 11:09:37 -07:00
kenche-linaro
73e0e9bdff linaro-forge: added 24.0.1 version (#44538) 2024-06-04 11:00:54 -07:00
snehring
4442414d74 kallisto: add version 0.50.1 (#44544)
Signed-off-by: Shane Nehring <snehring@iastate.edu>
2024-06-04 10:21:54 -07:00
Chris Marsh
8dbf9005f0 proj@7: support the new tiff interface in cmake@3.19 (#44535)
Add patch for proj@7 to support the new tiff interface shipped in cmake@3.19:. This complements the existing patch for @8 in #43780
2024-06-04 13:09:49 -04:00
Scott Wittenburg
0c3da1b498 gitlab ci: Remove protected publish job (#44304) 2024-06-04 11:54:05 -05:00
Todd Gamblin
278f5818b7 python: make every view a venv (#44382)
#40773 introduced python-venv, which improved build isolation and avoids issues with,
e.g., `ubuntu`'s system python modifying `sysconfig` to include a (very unwanted)
`local` directory within the default install layout.

This addresses a few cases where #40773 removed functionality, without harming the
default cases where we use `python-venv`.

Traditionally, *every* view with `python` in it was essentially a virtual environment,
because we would copy the `python` interpreter and `os.py` into every view when linking.
We now rely on `python-venv` to do that, but only when it's used (i.e. new builds) and
only for packages that have an `extends("python")` directive.

This again makes every view with `python` in it a virtual environment, but only
if we're not already using a package like `python-venv`. This uses a different
mechanism from before -- instead of using the `virtualenv` trick of copying `python`
into the prefix, we instead create a `pyvenv.cfg` like `venv` (the more modern way
to do it).

This fixes two things:
1. If you already had an environment before Spack `v0.22` that worked, it would
   stop working without a reconcretize and rebuild in `v0.22`, because we no longer
   copy the python interpreter on link. Adding `pyvenv.cfg` fixes this in a more
   modern way, so old views will keep working.

2. If you have an env that only includes python packages that use `depends_on("python")`
   instead of `extends("python")`, those packages will now be importable as before,
   though they won't have the same level of build isolation you'd get with `extends`
   and `python-venv`.
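
For reference, a minimal standalone sketch of the `pyvenv.cfg` mechanism this
relies on (paths and function name hypothetical):

```python
# Minimal standalone sketch: a pyvenv.cfg beside bin/ makes
# <view>/bin/python resolve site-packages inside the view -- the same
# mechanism the standard `venv` module uses, without copying the
# interpreter into the view.
import os

def mark_view_as_venv(view_root: str, base_python_prefix: str) -> None:
    with open(os.path.join(view_root, "pyvenv.cfg"), "w") as f:
        # `home` must point at the directory holding the real python binary
        f.write(f"home = {os.path.join(base_python_prefix, 'bin')}\n")
        f.write("include-system-site-packages = false\n")
```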

* views: avoid making client code deal with link functions

Users of views and ViewDescriptors shouldn't have to deal with link functions -- they
should just say what type of linking they want.

- [x] views take a link_type, not a link function
- [x] views work out the link function from the link type
- [x] view descriptors and commands now just tell the view what they want.

* python: simplify logic for avoiding pyvenv.cfg in copy views

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-06-04 09:52:21 -06:00
John W. Parent
c2e85202c7 CMake: add 3.28.6 && 3.29.4 (#44532)
* Add CMake 3.28.6

* Add 3.29.4
2024-06-04 09:08:09 -06:00
Mikael Simberg
b021b12043 gcc: add mold variant to use mold by default (#44117) 2024-06-04 16:46:01 +02:00
Todd Gamblin
89a0c9f4b3 nvhpc: Do not use -Wno-error with nvhpc (#44142)
In #30882, we made Spack ignore `-Werror` calls so that it could more easily build
projects that inject `-Werror` into their builds. We did this by translating them to
`-Wno-error` in the compiler wrapper. However, some compilers (like `nvhpc`) do not
support `-Wno-error`. We need to exclude them from this feature until they do.

- [x] make a property on `PackageBase` for `keep_werror` that knows not to use it for
      `nvhpc`.

- [x] update property so that it keeps only the specific `-Werror=...` args for newer nvhpc's,
      which support `-Wno-error` but not `-Wno-error=...`
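
A standalone sketch of the translation logic described above (function name
and the `keep_werror` values are illustrative, not the wrapper's exact code):

```python
# Standalone sketch; `keep_werror` values are assumptions:
#   "all"      -- compiler lacks -Wno-error entirely (older nvhpc)
#   "specific" -- compiler has -Wno-error but not -Wno-error=<diag> (newer nvhpc)
#   None       -- default: rewrite every -Werror* to its -Wno-error form
def filter_werror_args(args, keep_werror=None):
    out = []
    for arg in args:
        if arg == "-Werror":
            out.append(arg if keep_werror == "all" else "-Wno-error")
        elif arg.startswith("-Werror="):
            if keep_werror in ("all", "specific"):
                out.append(arg)  # keep: -Wno-error=<diag> is unsupported
            else:
                out.append("-Wno-error=" + arg.split("=", 1)[1])
        else:
            out.append(arg)
    return out

print(filter_werror_args(["-O2", "-Werror", "-Werror=unused"], "specific"))
# ['-O2', '-Wno-error', '-Werror=unused']
```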

---------

Co-authored-by: William Mou <william.mou1024@gmail.com>
Co-authored-by: Tom Scogland <scogland1@llnl.gov>
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-06-04 03:46:35 -07:00
Weiqun Zhang
259629c300 amrex: add v24.06 (#44495) 2024-06-03 17:52:18 -07:00
Teague Sterling
1ce2baf7a2 duckdb: add v1.0.0, v0.10.3 (#44531)
* duckdb: add v1.0.0, v0.10.3
* Adding issue reference
2024-06-04 02:33:36 +02:00
Garth N. Wells
4576a42a0f py-fenics-dolfinx: dependency update (#44524)
* Update nanobind dependency
* Update py-nanobind dependency
2024-06-03 17:24:40 -07:00
Nils Vu
4fba351b92 charmpp: enable darwin arm build (#44103) 2024-06-03 17:44:29 -06:00
Garth N. Wells
706737245a py-nanobind: add v2.0.0 (#44371)
* Add nanobind 2.0.0
* Add dependency
* Fix dependency name
* Change "_" -> "-"
2024-06-03 15:25:03 -07:00
吴坎
0dbdf49075 Update py-vl-convert-python (#44527)
* Update py-vl-convert-python:
  1. set version to 1.4.0
  2. mark version 1.3.0 deprecated, since its Rust dependency curve@4.1.1 fails to compile
2024-06-03 15:07:16 -07:00
Owen Solberg
641075539c hugo: add v0.126.3 (#44530)
Co-authored-by: Owen Solberg <owen.solberg@sana.com>
2024-06-03 23:37:51 +02:00
Diego Alvarez S
9428d99991 seqfu: new package (#44445)
* Add seqfu

---------

Co-authored-by: dialvarezs <dialvarezs@users.noreply.github.com>
2024-06-03 14:30:00 -07:00
eugeneswalker
f3cf2e94c4 hip@6.1: fix reference to hsa-rocr-dev so it works when externally defined (#44528) 2024-06-03 14:11:47 -07:00
Todd Gamblin
85f13442d2 Consolidate concretization output for environments (#44489)
When Spack concretizes environments, it prints every (newly concretized) root spec
individually with all of its dependencies. For most reasonably sized environments, this
is too much output. This is true for three commands:

* `spack concretize` when concretizing an environment with newly added specs
* `spack install` when installing an environment with newly added specs
* `spack spec` with no arguments in an environment

The output dates back to before we had unified environments or nicer spec traversal
routines, and we can improve it.

This PR makes environment concretization output analogous to what we do for regular
specs. Just like `spack spec` for a single spec, we show all root specs with no
indentation, so you can easily see the specs you explicitly requested. Dependencies are
shown:

1. With indentation according to their depth in a breadth-first traversal starting at
   the roots;
2. Only once if they appear on paths from multiple roots

So, the default is now consistent with `spack spec` for one spec--it's `--cover=nodes`;
i.e., if there are 100 specs in your environment, you'll get 100 lines of output.
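
For illustration, a toy standalone sketch of that traversal (not
`spack.spec.tree()` itself; package names invented):

```python
# Toy sketch of --cover=nodes output: roots at depth 0, every dependency
# printed once, indented by its depth in a BFS from the roots.
from collections import deque

def cover_nodes_lines(roots, deps):
    seen, queue = set(), deque((r, 0) for r in roots)
    while queue:
        node, depth = queue.popleft()
        if node in seen:
            continue  # already printed via another root or path
        seen.add(node)
        yield "    " * depth + node
        queue.extend((child, depth + 1) for child in deps.get(node, ()))

deps = {"hdf5": ["mpich", "zlib"], "py-numpy": ["zlib"]}
print("\n".join(cover_nodes_lines(["hdf5", "py-numpy"], deps)))
# zlib shows up once even though both roots depend on it
```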

If you want to see more details, you can do that with `spack spec` using the arguments
you're already familiar with. For example, if you wanted to see dependency types and
*all* dependencies, you could use `spack spec -l --cover=edges`. Or you could add
deptypes and namespaces with, e.g. `spack spec -ltN`.

With no arguments in an environment, `spack spec` concretizes (if necessary) and shows
the concretized environment. If you run `spack concretize` *first*, inspecting the
environment repeatedly with `spack spec` will be fast, as everything is already in the
`spack.lock` file.

- [x] factor most logic of `Spec.tree()` out of `Spec` class into `spack.spec.tree()`,
      which can take multiple specs as roots.
- [x] make `Spec.tree()` call `spack.spec.tree()`
- [x] `spack.environment.display_specs()` now uses `spack.spec.tree()`
- [x] Update `spack concretize`
- [x] Update `spack install`
- [x] Update `spack spec` to call `spack.spec.tree()` for environments.
- [x] Continue to output specs individually for `spack spec` when using
      `--yaml` or `--json`
2024-06-03 13:29:14 -07:00
James Taliaferro
f478a65635 nb: new package (#44456)
* new package: nb
* only one filter_file, install completions
* completions now implicit, merged by the view
2024-06-03 12:57:47 -07:00
Rocco Meli
eca44600c5 rdkit (#44476) 2024-06-03 12:53:08 -07:00
Matt Thompson
d7a4652554 pflogger: add version 1.15.0 (#44467) 2024-06-03 12:41:51 -07:00
G-Ragghianti
d85cdd1946 Slate version 2024 05 31 (#44529)
* updating version for slate, blaspp, and lapackpp
* verified new hashes
2024-06-03 12:39:50 -07:00
Chris Marsh
ce3aae1501 seacas: protect against known mixed-toolchain problem (#44378)
* Protected against a known problem with mixed gcc/apple-clang toolchains. Fixes #44330

---------

Co-authored-by: Chrismarsh <Chrismarsh@users.noreply.github.com>
2024-06-03 12:27:53 -06:00
Chris Marsh
90c4f9d463 Apply patch to allow vtk to compile with %gcc 13 and 14. (#44332)
* Apply patch from upstream to allow vtk compilation with %gcc 12 and 14.

* Fixes #44331

* fix spec usage

* fix compiler version range

* Finalize version range, switch to .diff file


---------

Co-authored-by: Chrismarsh <Chrismarsh@users.noreply.github.com>
2024-06-03 14:21:09 -04:00
Dan Bonachea
93ccd81d86 upcxx: update to latest gasnet and fix some bitrot (#44488)
* Now need to explicitly depend on libfabric for Cray EX
* Ensure build uses the selected CUDA and ROCm versions
* Correct spelling on `configure --with-ldflags`
* Patch a defect regarding `configure --with-ldflags`
* Default to Spack's copy of GASNet-EX, which is newer than embedded
2024-06-02 16:59:13 +02:00
Dan Bonachea
9c5a70ab6c gasnet: add v2024.5.0 (#44478)
* Add GASNet v 2024.5.0

* cosmetic fix to info output

* Add a missing dependency

* Ensure GASNet detects the provided ROCm/CUDA dependency

* [@spackbot] updating style on behalf of bonachea

---------

Co-authored-by: bonachea <bonachea@users.noreply.github.com>
2024-06-01 16:04:45 -07:00
Rocco Meli
5ef58144cb py-fortls: add v3.1.0 (#44477)
* Spglib: add version 2.4.0

* DLA-Future: fix +test option

* update

* [@spackbot] updating style on behalf of RMeli

* Apply suggestions from code review

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: RMeli <RMeli@users.noreply.github.com>
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-06-01 15:01:47 -05:00
Filippo Spiga
41ddbdfd90 gmsh: add v4.11.1, master (#41320)
* Adding gmsh 4.11.1

* Becoming a maintainer

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-06-01 11:30:25 -06:00
Rocco Meli
66924c85a3 py-griddataformats: add v1.0.2 (#44475)
* update

* Apply suggestions from code review

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-06-01 10:52:17 -05:00
snehring
a4c3fc138c apptainer: adding version 1.3.1 (#44104)
e2fsprogs: adding version 1.47.0 and fuse2fs variant
fuse-overlayfs: adding version 1.13
squashfuse: adding version 0.5.1 and 0.5.2
e2fsprogs: fixing audit errors
apptainer: expanding deps to cover more versions, fix binary_path logic.
apptainer: add go version deps, fix e2fsprogs patch sums, rework dep paths
fuse-overlayfs: tightening up the libfuse dep

Signed-off-by: Shane Nehring <snehring@iastate.edu>
2024-06-01 01:03:35 -06:00
Matt Thompson
62ed2a07a7 mapl: add 2.46.2, 2.40.5 (#44466) 2024-05-31 22:21:12 -06:00
James Taliaferro
e7aec9e872 pass: new package (#44454)
* New package: password-store

* add bash completion as variant

* also patch the cygwin platform snippet

* update description and maintainers

* make completions implicit and don't overwrite the completions package

* remove completion option

* formatting

* clean up file filter syntax

* remove reference to completion variant

* remove dependency on bash-completion

* clarify comments

* bashcompdir is already the default

* add optional dependency on xclip

* fix formatting
2024-05-31 18:05:06 -06:00
snehring
b065c3e11e ruby: adding version 3.3.2 (#44447)
Signed-off-by: Shane Nehring <snehring@iastate.edu>
2024-05-31 17:39:09 -06:00
Lucas Frérot
88b357c453 tamaas: adding new versions and python install fix (#44469)
* tamaas: install python extension with explicit pip call
* tamaas: patching compilation issues with recent compilers
* tamaas: added versions 2.7.0 and 2.7.1
2024-05-31 16:02:24 -07:00
Ryan Mulhall
bb7299c04a add in the latest versions of FMS (#44471)
Co-authored-by: rem1776 <rem1776@github.com>
2024-05-31 16:00:39 -07:00
Raffaele Solcà
7a5bddfd15 dla-future: Add 0.5.0 (#44463)
* add dla-future@0.5.0

* [@spackbot] updating style on behalf of rasolca

* fix typo

* review suggestions

---------

Co-authored-by: rasolca <rasolca@users.noreply.github.com>
2024-05-31 13:53:37 -06:00
potter-s
50fe769f82 Updated version (#44461)
Co-authored-by: Simon Potter <sp39@sanger.ac.uk>
2024-05-31 13:44:00 -06:00
Wouter Deconinck
29d39d1adf util-macros: ensure url_for_version works for older versions (#44421)
* util-macros: ensure url_for_version only used for older versions
* util-macros: use url.substitute_version after xz -> bz2
* util-macros: mv url_for_version down the file
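
For illustration, a hedged sketch of the idiom inside the package class
(the cutoff version for the xz switch is an assumption, inferred from the
related v1.20.1 commit):

```python
# Sketch (inside the util-macros package class, where Version is in scope):
# serve .tar.bz2 URLs only for releases older than the switch to xz tarballs.
def url_for_version(self, version):
    base = "https://www.x.org/archive/individual/util/util-macros-{0}.tar"
    ext = ".bz2" if version < Version("1.20.1") else ".xz"  # cutoff assumed
    return base.format(version) + ext
```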
2024-05-31 12:36:44 -07:00
Lydéric Debusschère
8dde7f3975 py-junit2html: new package, version '31.0.2' (#44399)
* py-junit2html: new package, version '31.0.2'
* py-junit2html: install from sources instead of from wheel
2024-05-31 12:36:02 -07:00
renjithravindrankannath
0cd038273e Bump-up rocm-opencl with 6.1.0 & 6.1.1 and adding hsa library path in LD_LIBRARY_PATH (#44171)
* Adding hsa library path in LD_LIBRARY_PATH
* Prepending instead of setting LD_LIBRARY_PATH to hsa-rocr-dev/lib
* adding rocm-opencl 6.1.0 & 6.1.1 updates
2024-05-31 12:30:55 -07:00
Matt Schramm
1f5bfe80ed Updates meep to latest release 1.29.0 (#44468) 2024-05-31 12:10:30 -07:00
Hartmut Kaiser
4d2611ad8a Adding HPX v1.10.0 to package (#44470) 2024-05-31 12:08:52 -07:00
jmuddnv
21a97dad31 Changes for NVIDIA HPC SDK 24.5 (#44354) 2024-05-31 10:32:39 -07:00
jdomke
338a01ca6d fujitsu-mpi: point MPI compiler wrappers to Spack compiler wrappers (#44457)
Using OMPI_ environment variables, like openmpi does.
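
A hedged sketch of the mechanism (mirroring the openmpi package's approach;
`spack_cc` and friends are the globals Spack injects into package modules at
build time):

```python
# Sketch, as it might appear in the fujitsu-mpi package: export the OMPI_*
# variables the MPI compiler wrappers honor, so dependent builds go through
# Spack's compiler wrappers.
def setup_dependent_build_environment(self, env, dependent_spec):
    env.set("OMPI_CC", spack_cc)
    env.set("OMPI_CXX", spack_cxx)
    env.set("OMPI_FC", spack_fc)
    env.set("OMPI_F77", spack_f77)
```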
2024-05-31 09:39:51 -06:00
Harmen Stoppels
392396ded4 traverse: pass key correctly (#44460)
Fixes a bug where custom keys to identify nodes were not passed
correctly.
2024-05-31 08:26:38 -07:00
Wouter Deconinck
a336e0edb7 py-numexpr: add v2.8.8, v2.9.0 (#44451) 2024-05-31 08:34:19 -06:00
Satish Balay
9426fefa00 petsc4py: fix typo with version (#44452) 2024-05-31 08:24:16 -06:00
Tom Bradford
812192eef5 protobuf: fix 3.4:3.21 patch checksum (#44443) 2024-05-31 13:59:49 +02:00
Rocco Meli
b8c8e80965 dla-future-fortran: fix +test option (#44458)
* Spglib: add version 2.4.0

* DLA-Future: fix +test option
2024-05-31 11:18:52 +02:00
Alberto Invernizzi
77fd5d8414 paraview: update cuda_arch management (#44243)
* refactor logic to switch to cmake for cuda management
2024-05-30 23:33:38 -05:00
Tom Bradford
82050ed371 corrected the vmd 1.9.3 tarball checksum (#44433) 2024-05-30 17:02:55 -07:00
John W. Parent
a7381a9413 Bootstrapping: don't use Mac OS binaries on Windows (#44193)
`BuildcacheBootstrapper` uses `Spec.intersects` to match specs needed
for bootstrapping against the binary cache. The specs were not
sufficiently detailed to prevent matching e.g. cached binaries for
Mac OS on Windows; this commit adds the platform to each requested
bootstrap spec to prevent that.
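
A toy standalone sketch of why the platform pin matters (dict stand-ins, not
Spack's `Spec` class):

```python
# Toy semantics of intersects: fields absent from the request are
# unconstrained, so an abstract spec without a platform matches a cached
# binary from any OS; pinning platform=<host> rules out foreign binaries.
import sys

def intersects(request: dict, binary: dict) -> bool:
    return all(binary.get(k) == v for k, v in request.items())

cached = {"name": "clingo", "platform": "darwin"}
print(intersects({"name": "clingo"}, cached))  # True: request too loose
host = "windows" if sys.platform == "win32" else "linux"
print(intersects({"name": "clingo", "platform": host}, cached))  # False off-darwin
```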
2024-05-30 17:10:29 -06:00
Alec Scott
b932783d4d unmaintained pkgs: bump versions 2024-05-20 (#44274)
* unmaintained pkgs: bump versions 2024-05-20

* seqkit: revert changes

* msgpack-c: add dependency on boost

* msgpack-c: revert to test neovim

* revert: parallel, openblas, and re2c due to conflicts

* gmsh: revert changing version

* trilinos: limit version of suite-sparse to fix build

* gtkorvo-atl: deprecate develop before remove

* msgpack-c: deprecate v1.4.1

* msgpack-c: fix style

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-05-30 13:27:11 -06:00
Teague Sterling
0b51f25034 Package/gettext: Old version issues (#44440)
gcc@:5 hits https://savannah.gnu.org/bugs/index.php?65811 in gettext@0.22:

also fix patch of configure script

Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2024-05-30 12:35:20 -06:00
Sreenivasa Murthy Kolam
d6a182fb5d Bump-up the version for RoCm-6.1.1 release (#44178)
* initial commit for RoCm-6.1.1 release

* fix style issues

* update the version for rocmlir and rocm-cmake

* restrict the patch for 6.0.0 release

* fix build failure for hipsolver-6.1.1 release

* update the hipblaslt for rocm-6.1.1 release

* add the patch for hipsparselt for 6.1.1 rel
2024-05-30 08:57:43 -07:00
Satish Balay
e8635adb21 petsc, py-petsc4py: add v3.21.2 (#44439) 2024-05-30 07:10:24 -07:00
Massimiliano Culpo
f242e0fd0c remove platform=cray (#43796)
Remove support for `cray` as a separate platform.

Any platform previously detected as `cray` is now detected as `linux`.

Users who still need platform=cray have to stick to Spack 0.22
2024-05-30 14:21:32 +02:00
renjithravindrankannath
67b5f6b838 rocm-validation-suite: custom patch update for lib and lib64 yaml library path (#44287) 2024-05-30 09:06:57 +02:00
Wouter Deconinck
9d16f17463 conmon: add v2.1.12 (#44389) 2024-05-30 08:57:05 +02:00
Harmen Stoppels
f44f5b0db0 tests: use fewer default paths (#44432)
Set config:install_tree:root and modules:default:roots to something
sensible.
2024-05-30 08:12:19 +02:00
Harmen Stoppels
39ace5fc45 concretizer: enforce host compat when reuse only (#44428)
Fixes a bug in the concretizer where specs depending on a host
incompatible libc would be used. This bug triggers when nothing is
built.

In the case where everything is reused, there is no libc provider from
the solver's perspective; there is only compatible_libc. This
commit ensures that we require a host-compatible libc on any reused
spec, in addition to requiring compatibility with the chosen libc provider.
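
A toy standalone sketch of the added condition (stand-in data structures and
compatibility rule, not the ASP encoding):

```python
# Toy sketch: a reused spec is acceptable only if its libc is compatible
# with the host libc -- same family, and not newer than what the host has.
def reusable(spec_libc, host_libc):
    same_family = spec_libc[0] == host_libc[0]
    return same_family and spec_libc[1] <= host_libc[1]

print(reusable(("glibc", (2, 31)), ("glibc", (2, 35))))  # True: older is fine
print(reusable(("glibc", (2, 38)), ("glibc", (2, 35))))  # False: host too old
```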

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-05-30 07:31:28 +02:00
Diego Alvarez S
0601d6a0c5 nim: add v2.0.4 (#44375)
* Add nim 2.0.4
* Use install instead of copy

---------

Co-authored-by: dialvarezs <dialvarezs@users.noreply.github.com>
2024-05-29 18:05:11 -07:00
renjithravindrankannath
11869ff872 Patch to add hsa prefix/include path (#44334)
* Patch to add hsa prefix/include path
* Style check error fix
* The patch need to be applied to 2024.01.stable. ROCm 6.0 requires hpctoolkit 2024.01.1 or later
* Fixing style check error
2024-05-29 18:02:34 -07:00
Teague Sterling
6753605807 Update py-pyspark and py-py4j (#44263)
* Adding a py4j variant that requires Java via spack to avoid situations where a system doesn't have Java and py4j expects it
* Adding new versions of py-pyspark
* Adding a new variant to require java (via py4j) and clean up dependency handling
* Adding myself as a maintainer for py-pyspark and py-py4j
* Fix overlooked version bump in py4j
* Version bump to meet py-spark expectations
* Version bump to add the latest version compatible with pyspark
* Matching py-grpcio bump
* Adding variants and dependents for pyspark
* Adding runtime deps
* Changing default java requirement. I'm not sure this is the right call
* Changing py4j with java dependency handling
* Fix style
* Update package.py: fix unnecessary f-string
* Make +java the default for both
* Fix nested deps
* Revert styles after default change
* Added new versions and GCC 14 compatibility conflicts
* Added new versions and compatibility conflicts for gcc 14
* Added new versions paired to arrow (for gcc14 compat)
* Update py-protobuf compiler conflict
* Update depends to match 
    See https://github.com/grpc/grpc/blob/master/src/python/grpcio_status/setup.py
* Updating dependencies and conflicts for py-googleapis-common-protos. Added new version to avoid future issues
* Remove upper bound version on py-protobuf and add default_args
* Adding new versions and updating dependencies back to versions 1.35.0
* Updating oldest numpy deps
* Fixing merge
* A bit more cleanliness for var/spack/repos/builtin/packages/py-googleapis-common-protos/package.py
* Adding latest matching version of py-grpcio and py-grpcio-status
* Update package.py
   https://github.com/spack/spack/pull/44263#discussion_r1612317943
* Update dependencies
* Adding additional versions for dependent packages. Deprecated two versions: 1.16 is old, built for python ~3.6, and does not build for 3.8. 1.52.0 was removed from pypi
* Revert py-grpcio-tools changes. Will include in separate PR
* Adding patches and constraints to get 1.48 to build as it's a dependency that is called out for some other packages
* Updating to account for yanked packages for dependencies
* Fix style
* Update sha256 for py-grpcio v0.16.0 to reflect change

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-05-29 18:58:46 -06:00
Tony Weaver
918db85737 py-astropy: add v6.1 and new py-astropy-iers-data dependency (#44318)
* py-astropy: Add version 6.1

Added build info for version 6.1 in py-astropy. This requires a new package, astropy-iers-data, which has been included as py-astropy-iers-data to match Spack's general naming conventions.

Below is the output of the spack install showing a successful build for version 6.1 and the new py-astropy-iers-data package

[+] /usr (external glibc-2.28-oj2wjfl2ao5inhfz4qehw6hlck2hizvp)
[+] /opt/apps/spack/gcc-runtime-8.5.0-5k6kvi5
[+] /opt/apps/spack/bzip2-1.0.8-t65bq3t
[+] /opt/apps/spack/libmd-1.0.4-zbdiprt
[+] /opt/apps/spack/libiconv-1.17-jskazis
[+] /opt/apps/spack/util-linux-uuid-2.38.1-w3kgjq3
[+] /opt/apps/spack/libxcrypt-4.4.35-zigqpjo
[+] /opt/apps/spack/xz-5.4.6-axoznvt
[+] /opt/apps/spack/zlib-ng-2.1.6-ccn5qny
[+] /opt/apps/spack/libyaml-0.2.5-fxathvq
[+] /opt/apps/spack/ncurses-6.4-xbvwv2w
[+] /opt/apps/spack/zstd-1.5.6-nyk6gt6
[+] /opt/apps/spack/pcre2-10.42-fu62kky
[+] /opt/apps/spack/libunistring-1.2-whrov3e
[+] /opt/apps/spack/nghttp2-1.57.0-u72gxms
[+] /opt/apps/spack/openblas-0.3.26-pfyk2vi
[+] /opt/apps/spack/berkeley-db-18.1.40-jftva2u
[+] /opt/apps/spack/wcslib-7.3-zvcqq7o
[+] /opt/apps/spack/libffi-3.4.6-ibucrfe
[+] /opt/apps/spack/erfa-2.0.0-4qkta2n
[+] /opt/apps/spack/pkgconf-1.9.5-ckjdqjm
[+] /opt/apps/spack/libbsd-0.12.1-njt5grs
[+] /opt/apps/spack/openssl-3.2.1-4lqdgni
[+] /opt/apps/spack/pigz-2.8-rx263bp
[+] /opt/apps/spack/readline-8.2-2ys6ede
[+] /opt/apps/spack/libidn2-2.3.7-vnie4rz
[+] /opt/apps/spack/libedit-3.1-20230828-676jwbd
[+] /opt/apps/spack/libxml2-2.10.3-37klvxv
[+] /opt/apps/spack/expat-2.6.2-7kfe3hb
[+] /opt/apps/spack/curl-8.6.0-gpzsr3p
[+] /opt/apps/spack/tar-1.34-wjzs4wj
[+] /opt/apps/spack/gdbm-1.23-cylmqwx
[+] /opt/apps/spack/sqlite-3.43.2-axuxulg
[+] /opt/apps/spack/cfitsio-3.49-mmy3dbr
[+] /opt/apps/spack/gettext-0.22.4-zjsp346
[+] /opt/apps/spack/perl-5.38.0-gzljgek
[+] /opt/apps/spack/python-3.10.13-fz7fymx
[+] /opt/apps/spack/krb5-1.20.1-tqiapsx
[+] /opt/apps/spack/py-pyyaml-6.0-rju7jls
[+] /opt/apps/spack/py-tomli-2.0.1-eanxpu2
[+] /opt/apps/spack/py-numpy-1.26.4-t5acjcv
[+] /opt/apps/spack/python-venv-1.0-2cz5c3s
[+] /opt/apps/spack/py-pip-23.0-lxkcvby
[+] /opt/apps/spack/openssh-9.7p1-jxrkzso
[+] /opt/apps/spack/py-pyerfa-2.0.0.1-kyfazhs
[+] /opt/apps/spack/py-packaging-23.1-wkeyuk6
[+] /opt/apps/spack/py-typing-extensions-4.8.0-ujwbb6g
[+] /opt/apps/spack/py-setuptools-69.2.0-3do76jw
[+] /opt/apps/spack/py-wheel-0.41.2-brm3k3h
[+] /opt/apps/spack/git-2.45.1-tuc5jnb
[+] /opt/apps/spack/py-cython-3.0.0-zx62ssd
==> Installing py-astropy-iers-data-main-ukchsfzhfcyz6e6fxar6mtykqiavporj [52/55]
==> No binary for py-astropy-iers-data-main-ukchsfzhfcyz6e6fxar6mtykqiavporj found: installing from source
==> No patches needed for py-astropy-iers-data
==> py-astropy-iers-data: Executing phase: 'install'
==> py-astropy-iers-data: Successfully installed py-astropy-iers-data-main-ukchsfzhfcyz6e6fxar6mtykqiavporj
  Stage: 1.74s.  Install: 0.93s.  Post-install: 0.52s.  Total: 3.36s
[+] /opt/apps/spack/py-astropy-iers-data-main-ukchsfz
[+] /opt/apps/spack/py-extension-helpers-0.1-a5hmr6j
[+] /opt/apps/spack/py-setuptools-scm-8.0.4-qdhxchg
==> Installing py-astropy-6.1.0-f4pffru3kmyion2kq6muomgrfs5y4gdo [55/55]
==> No binary for py-astropy-6.1.0-f4pffru3kmyion2kq6muomgrfs5y4gdo found: installing from source
==> Fetching https://files.pythonhosted.org/packages/source/a/astropy/astropy-6.1.0.tar.gz
==> Ran patch() for py-astropy
==> py-astropy: Executing phase: 'install'
==> Warning: Module file /opt/modulefiles/spack/Core/py-astropy/6.1.0.lua exists and will not be overwritten
==> py-astropy: Successfully installed py-astropy-6.1.0-f4pffru3kmyion2kq6muomgrfs5y4gdo
  Stage: 1.29s.  Install: 1m 5.77s.  Post-install: 0.60s.  Total: 1m 7.94s

* Removed extra-whitespace

A blank line had white space, removed the white space

* Additional formatting changes for black

* Additional package updates

Based on previous recommendations, I updated the py-astropy and py-astropy-iers-data packages. I also added a new version to the py-pyerfa package to better match the 6.1.0 dependencies.

Of importance in these updates: I did add pypi and version info to py-astropy-iers-data. Originally I had argued that this package updates quite frequently (on a weekly basis), so it did not make sense to include pypi/versions, and we should instead use the non-versioned git-repo structure based on the master branch. However, when I tried to build the package that way, py-setuptools-scm errored out:

/opt/apps/spack/py-setuptools-scm-8.0.4-ax2zqro/lib/python3.10/site-packages/setuptools_scm/git.py:163: UserWarning: "/tmp/root/spack-stage/spack-stage-py-astropy-iers-data-main-iw2mdzlukb37mkcbcozjjefjoefw2eyp/spack-src" is shallow and may cause errors

I believe this is due to the download file/stage directory not containing the version and instead including the branch. I changed the package to use versions instead, and it worked just fine as shown below. In addition, in some earlier preliminary testing, the package installed fine using the non-versioned master branch. When I checked, that installation used py-setuptools-scm@7.1, while this installation run used the much more recent 8.0.4, so it is possible that something changed between 7.1 and 8.0.4 that caused this error to show up. Since setuptools-scm is responsible for extracting package versions, I imagine it's some kind of version mismatch issue

Output from build:
spack install py-astropy
[+] /usr (external glibc-2.28-oj2wjfl2ao5inhfz4qehw6hlck2hizvp)
[+] /opt/apps/spack/gcc-runtime-8.5.0-5k6kvi5
[+] /opt/apps/spack/wcslib-7.3-zvcqq7o
[+] /opt/apps/spack/xz-5.4.6-axoznvt
[+] /opt/apps/spack/libffi-3.4.6-ibucrfe
[+] /opt/apps/spack/erfa-2.0.0-4qkta2n
[+] /opt/apps/spack/libmd-1.0.4-zbdiprt
[+] /opt/apps/spack/util-linux-uuid-2.38.1-w3kgjq3
[+] /opt/apps/spack/libiconv-1.17-jskazis
[+] /opt/apps/spack/berkeley-db-18.1.40-jftva2u
[+] /opt/apps/spack/zstd-1.5.6-nyk6gt6
[+] /opt/apps/spack/ncurses-6.4-xbvwv2w
[+] /opt/apps/spack/bzip2-1.0.8-t65bq3t
[+] /opt/apps/spack/libunistring-1.2-whrov3e
[+] /opt/apps/spack/pcre2-10.42-fu62kky
[+] /opt/apps/spack/pkgconf-1.9.5-ckjdqjm
[+] /opt/apps/spack/zlib-ng-2.1.6-ccn5qny
[+] /opt/apps/spack/openblas-0.3.26-pfyk2vi
[+] /opt/apps/spack/libxcrypt-4.4.35-zigqpjo
[+] /opt/apps/spack/libyaml-0.2.5-fxathvq
[+] /opt/apps/spack/libbsd-0.12.1-njt5grs
[+] /opt/apps/spack/readline-8.2-2ys6ede
[+] /opt/apps/spack/libidn2-2.3.7-vnie4rz
[+] /opt/apps/spack/nghttp2-1.57.0-u72gxms
[+] /opt/apps/spack/libedit-3.1-20230828-676jwbd
[+] /opt/apps/spack/libxml2-2.10.3-37klvxv
[+] /opt/apps/spack/openssl-3.2.1-4lqdgni
[+] /opt/apps/spack/pigz-2.8-rx263bp
[+] /opt/apps/spack/expat-2.6.2-7kfe3hb
[+] /opt/apps/spack/sqlite-3.43.2-axuxulg
[+] /opt/apps/spack/gdbm-1.23-cylmqwx
[+] /opt/apps/spack/curl-8.6.0-gpzsr3p
[+] /opt/apps/spack/tar-1.34-wjzs4wj
[+] /opt/apps/spack/perl-5.38.0-gzljgek
[+] /opt/apps/spack/cfitsio-3.49-mmy3dbr
[+] /opt/apps/spack/gettext-0.22.4-zjsp346
[+] /opt/apps/spack/krb5-1.20.1-tqiapsx
[+] /opt/apps/spack/python-3.10.13-fz7fymx
[+] /opt/apps/spack/openssh-9.7p1-jxrkzso
[+] /opt/apps/spack/py-tomli-2.0.1-eanxpu2
[+] /opt/apps/spack/py-setuptools-69.2.0-3do76jw
[+] /opt/apps/spack/py-numpy-1.26.4-t5acjcv
[+] /opt/apps/spack/python-venv-1.0-2cz5c3s
[+] /opt/apps/spack/py-packaging-23.1-wkeyuk6
[+] /opt/apps/spack/py-pip-23.0-lxkcvby
[+] /opt/apps/spack/git-2.45.1-zu6qkoc
[+] /opt/apps/spack/py-markupsafe-2.1.3-isgtki6
[+] /opt/apps/spack/py-wheel-0.41.2-brm3k3h
==> Installing py-extension-helpers-0.1-a5hmr6jtrvpcq3ibwkwhvwlydthjif5a [49/57]
==> No binary for py-extension-helpers-0.1-a5hmr6jtrvpcq3ibwkwhvwlydthjif5a found: installing from source
==> Fetching ac8a6fe91c.tar.gz
==> No patches needed for py-extension-helpers
==> py-extension-helpers: Executing phase: 'install'
==> py-extension-helpers: Successfully installed py-extension-helpers-0.1-a5hmr6jtrvpcq3ibwkwhvwlydthjif5a
  Stage: 0.37s.  Install: 0.88s.  Post-install: 0.52s.  Total: 1.89s
[+] /opt/apps/spack/py-extension-helpers-0.1-a5hmr6j
==> Installing py-cython-3.0.0-zx62ssdy4p6ddwuqbixel2vcsihjcs6m [50/57]
==> No binary for py-cython-3.0.0-zx62ssdy4p6ddwuqbixel2vcsihjcs6m found: installing from source
==> Fetching 350b18f967.tar.gz
==> No patches needed for py-cython
==> py-cython: Executing phase: 'install'
==> py-cython: Successfully installed py-cython-3.0.0-zx62ssdy4p6ddwuqbixel2vcsihjcs6m
  Stage: 1.07s.  Install: 2m 49.80s.  Post-install: 0.40s.  Total: 2m 51.37s
[+] /opt/apps/spack/py-cython-3.0.0-zx62ssd
[+] /opt/apps/spack/py-jinja2-3.1.2-wacpq7j
[+] /opt/apps/spack/py-typing-extensions-4.8.0-ujwbb6g
[+] /opt/apps/spack/py-pyyaml-6.0-rju7jls
[+] /opt/apps/spack/py-setuptools-scm-8.0.4-ax2zqro
==> Installing py-astropy-iers-data-0.2024.5.20.0.29.40-wckxjf6icqn3aqhyjidfir3byyjq5aq6 [55/57]
==> No binary for py-astropy-iers-data-0.2024.5.20.0.29.40-wckxjf6icqn3aqhyjidfir3byyjq5aq6 found: installing from source
==> Using cached archive: /opt/spack/var/spack/cache/_source-cache/archive/7f/7fff3d3404ae8560533ac0e685db7acc02c4d8984faa4ac3d607096879fba2d1.tar.gz
==> No patches needed for py-astropy-iers-data
==> py-astropy-iers-data: Executing phase: 'install'
==> py-astropy-iers-data: Successfully installed py-astropy-iers-data-0.2024.5.20.0.29.40-wckxjf6icqn3aqhyjidfir3byyjq5aq6
  Stage: 0.07s.  Install: 1.21s.  Post-install: 0.39s.  Total: 1.88s
[+] /opt/apps/spack/py-astropy-iers-data-0.2024.5.20.0.29.40-wckxjf6
==> Installing py-pyerfa-2.0.1.1-pkokp6usk7m2bjxcba3nwgqgrjufumcp [56/57]
==> No binary for py-pyerfa-2.0.1.1-pkokp6usk7m2bjxcba3nwgqgrjufumcp found: installing from source
==> Fetching https://files.pythonhosted.org/packages/source/p/pyerfa/pyerfa-2.0.1.1.tar.gz
==> No patches needed for py-pyerfa
==> py-pyerfa: Executing phase: 'install'
==> py-pyerfa: Successfully installed py-pyerfa-2.0.1.1-pkokp6usk7m2bjxcba3nwgqgrjufumcp
  Stage: 0.93s.  Install: 17.72s.  Post-install: 0.33s.  Total: 19.15s
[+] /opt/apps/spack/py-pyerfa-2.0.1.1-pkokp6u
==> Installing py-astropy-6.1.0-5brbkjnjzfg3lc6h34qku24ep5pwsxzs [57/57]
==> No binary for py-astropy-6.1.0-5brbkjnjzfg3lc6h34qku24ep5pwsxzs found: installing from source
==> Fetching https://files.pythonhosted.org/packages/source/a/astropy/astropy-6.1.0.tar.gz
==> Ran patch() for py-astropy
==> py-astropy: Executing phase: 'install'
==> py-astropy: Successfully installed py-astropy-6.1.0-5brbkjnjzfg3lc6h34qku24ep5pwsxzs
  Stage: 1.16s.  Install: 1m 3.34s.  Post-install: 1.05s.  Total: 1m 5.77s
[+] /opt/apps/spack/py-astropy-6.1.0-5brbkjn

* Updated to match with black formatting
2024-05-29 15:16:54 -07:00
Robert Cohn
1184de8352 intel-oneapi-mkl: change logic for gfortran compatibility (#44242)
* [intel-oneapi-mkl]: change logic for gfortran compatibility

* review comments

Co-authored-by: Rocco Meli <r.meli@bluemail.ch>

* warn if fortran-rt and gcc are used without gfortran

* remove warning

---------

Co-authored-by: Rocco Meli <r.meli@bluemail.ch>
2024-05-29 15:02:28 -06:00
Vanessasaurus
2470fde5d9 flux-sched: add v0.34.0 (#44205)
Signed-off-by: vsoch <vsoch@users.noreply.github.com>
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
Co-authored-by: vsoch <vsoch@users.noreply.github.com>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-05-29 12:02:58 -07:00
Harmen Stoppels
abfff43976 remove non-existent when kwarg (#44437)
Not sure why this was added; it is ignored.
2024-05-29 20:39:37 +02:00
Chris Green
230687a501 postgresql: fix thinko with thread-safety option (#44381) 2024-05-29 11:30:54 -07:00
eugeneswalker
5ff8908ff3 e4s hopper ci: minimize root specs (#44436) 2024-05-29 10:09:04 -07:00
Diego Alvarez S
882e09e50b mmseqs2: add v15-6f452 (#44362) 2024-05-29 10:05:38 -07:00
Harmen Stoppels
6753f4a7cb bootstrap: fix broken test (#44403) 2024-05-29 19:03:49 +02:00
Wouter Deconinck
1dc63dbea6 acts: add v34.1.0, v35.0.0 (identification, sycl variants changes) (#44407)
* acts: add v34.1.0, v35.0.0 (identification, sycl variants changes)

* [@spackbot] updating style on behalf of wdconinc

---------

Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-05-29 08:52:20 -07:00
Vicente Bolea
b9dfae4722 adios2: update to latest version 2.10.1 (#44426) 2024-05-29 06:46:45 -05:00
Harmen Stoppels
70412612c7 installer: improve init signature and explicits (#44374)
Change the installer to take `([pkg], args)` in the constructor instead
of `[(pkg, args)]`. The reason is that certain arguments are global
settings, and the new API ensures that those arguments cannot be
different across different "build requests".

The `explicit` install arg is now a list of hashes, and the installer is
no longer responsible for determining what package is installed
explicitly. This way environment installs can simply pass the list of
environment roots, without them necessarily being explicit build
requests. For example an env with two roots [a, b], where b depends on
a, would not always cause spack install to mark b as explicit.

Notice that `overwrite` already took a list of hashes, this makes
`explicit` consistent.

`package.do_install(explicit=True)` continues to take a boolean.
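
A standalone stand-in for the new constructor shape (class and argument names
illustrative, not the real installer):

```python
# Stand-in sketch: one shared package list plus global settings, with
# `explicit` (like `overwrite`) as a collection of dag hashes.
class Installer:
    def __init__(self, pkgs, explicit=(), overwrite=()):
        self.pkgs = list(pkgs)          # one list of build requests
        self.explicit = set(explicit)   # hashes to record as explicit
        self.overwrite = set(overwrite) # hashes to reinstall in place

# An env with roots [a, b], b depending on a, simply passes both root
# hashes; the installer no longer decides explicitness per package.
inst = Installer(["pkg-a", "pkg-b"], explicit={"hash-a", "hash-b"})
print("hash-b" in inst.explicit)  # True
```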
2024-05-29 08:25:34 +02:00
Harmen Stoppels
cd741c368c py-pythran: add 0.16, fix compat bounds (#43983) 2024-05-29 08:13:11 +02:00
Teague Sterling
16a7bef456 fix: mariadb conflict (#44418)
This is a small fix to the mariadb conflicts recently pushed. The max conflict version for gcc>=13 was off by one.
2024-05-28 22:14:09 -06:00
Wouter Deconinck
85f62728c6 libxcb, xcb-proto: add v1.17.1 (#44394)
* libxcb, xcb-proto: add v1.17.1
* libxcb, xcb-proto: inherit XorgPackage
* xcb-proto: http -> https
2024-05-28 14:36:16 -06:00
Wouter Deconinck
092dc96e6c acts: pass cuda_arch to CMAKE_CUDA_ARCHITECTURES (#44397) 2024-05-28 11:41:32 -07:00
Carsten Uphoff
2bb20caa5f New package: tiny tensor compiler (#44401)
Signed-off-by: Carsten Uphoff <carsten.uphoff@intel.com>
2024-05-28 10:27:44 -07:00
Nathalie Furmento
00bcf935e8 starpu: add v1.4.7 (#44415) 2024-05-28 10:24:14 -07:00
Martin Pokorny
3751372396 legion: add missing build dependency and new variants (#44329)
* legion: add pip dependency for build

pip is needed for the build when the Legion Python binding is enabled

* legion: add variants for gc logging and system OpenMP use

This also removes the `cmake_cxx_flags` variable from `cmake_args()`
because that variable had no effect.

* [@spackbot] updating style on behalf of mpokorny

* legion: use spec.satisfies() in cmake_args()

e.g., replace use of '"+foo" in spec' with 'spec.satisfies("+foo")' (see the sketch after this list)

* legion: avoid configuring with multiple "-DLegion_REDOP_COMPLEX=ON" arguments

In the previous version, when both '+redop_complex' and '+bindings'
was set, two instances of "-DLegion_REDOP_COMPLEX=ON" arguments were
generated for cmake configuration.

* legion: fix value of "Legion_OUTPUT_LEVEL" for configuration

the previous version had no effect on setting the configuration value
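
A hedged sketch of the resulting idiom (variant and CMake option names are
assumptions, not the package's exact values):

```python
# Illustrative excerpt only: query the spec with satisfies() rather than
# substring membership, and build CMake args from it.
def cmake_args(self):
    args = []
    if self.spec.satisfies("+openmp"):  # instead of: "+openmp" in self.spec
        args.append(self.define("Legion_USE_OpenMP", True))
    return args
```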

---------

Co-authored-by: mpokorny <mpokorny@users.noreply.github.com>
2024-05-28 10:21:36 -07:00
Wouter Deconinck
e6afeca92f xorg-docs: add v1.7.3 (#44388)
* xorg-sgml-doctools: add v1.12.1

* xorg-docs: add v1.7.3

* util-macros: add v1.20.1 (now distributed as xz)

* util-macros: prefer spec.satisfies

Co-authored-by: Alec Scott <hi@alecbcs.com>

---------

Co-authored-by: Alec Scott <hi@alecbcs.com>
2024-05-28 10:19:31 -07:00
Wouter Deconinck
35b9307af6 py-uhi: add v0.4.0 (#44411) 2024-05-28 10:05:52 -07:00
Alberto Sartori
567f728579 justbuild: add v1.3.1 (#44398) 2024-05-28 07:03:07 -07:00
Teague Sterling
404c5c29a1 mariadb: add v10.8.8, v10.9.6, v11.3.2 (#44412) 2024-05-28 06:58:23 -07:00
Matthieu Dorier
63712ba6c6 mochi-margo: add v0.16.0, v0.17.0 (#44406) 2024-05-28 06:53:04 -07:00
Wouter Deconinck
ef62d47dc7 prmon: add v3.1.0 (#44410) 2024-05-28 06:52:17 -07:00
Wouter Deconinck
a4594857fc nlohmann-json: add v3.11.3 (#44409) 2024-05-28 14:14:47 +02:00
Wouter Deconinck
e77572b753 git-lfs: add v3.4.1, v3.5.1 (#44392)
* git-lfs: add v3.4.1, v3.5.1

* git-lfs: rm trailing spaces
2024-05-28 07:42:18 -04:00
Juan Miguel Carceller
8c84c5ff66 whizard: add a dependency on ghostscript and fix +openmp (#44414) 2024-05-28 10:57:56 +02:00
Wouter Deconinck
5d8beaf0ed doxygen: add v1.11.0, v1.10.0 (#44390) 2024-05-28 10:05:12 +02:00
Wouter Deconinck
ac405f3d79 elfutils: add v0.191 (#44391) 2024-05-27 22:29:00 -07:00
Wouter Deconinck
2e30553310 madx: add v5.09.03 (#44408) 2024-05-27 18:56:53 -07:00
Diego Alvarez S
85a61772d8 seqkit: add v2.8.2 (#44356) 2024-05-27 11:41:56 -07:00
Teague Sterling
4007f8726d rust: add conflicts with gcc >= 13 (#44404) 2024-05-27 11:32:48 -07:00
Adam J. Stewart
a097f7791b py-ruff: add v0.4.5 (#44355) 2024-05-27 11:28:48 -07:00
kwryankrattiger
3d4d89b2c0 ParaView Release 5.12.1 (#44396)
Add ParaView 5.12.1
Update preferred ParaView to 5.12.1
2024-05-27 10:45:02 -07:00
Harmen Stoppels
e461234865 link: directly bind to os.* on non-windows (#44400)
The windows wrappers for basic functions like `os.symlink`,
`os.readlink` and `os.path.islink` in the `llnl.util.symlink` module
have bugs, and trigger more file system operations on non-windows than
they should.

This commit just binds `llnl.util.symlink.symlink = os.symlink` etc so
built-in functions are used on non-windows
2024-05-27 13:37:04 +02:00
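A minimal sketch of the binding pattern this commit describes (module layout assumed for illustration, not Spack's actual source):

```python
# Sketch: on non-Windows, re-export the built-ins directly so the module
# adds no extra file system operations; the Windows branch keeps the
# platform-specific wrappers.
import os
import sys

if sys.platform == "win32":
    def symlink(src, dst):
        ...  # Windows-specific handling would live here
else:
    symlink = os.symlink
    readlink = os.readlink
    islink = os.path.islink
```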
Adam J. Stewart
2c1d5f9844 Remove deprecated versions from packages (#44157) 2024-05-27 09:30:55 +02:00
Michael Kuhn
c4b682b983 rocksdb: add 9.2.1 (#44384) 2024-05-26 14:15:03 -05:00
Derek Ryan Strong
de0b784d5a nano: add v8.0 (#44366)
* Add nano v8.0

* Change to pkgconfig virtual provider

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-05-25 22:12:08 -05:00
Derek Ryan Strong
5f38afdfc7 Add vim 9.1.0437 (#44364) 2024-05-25 17:20:26 -07:00
Derek Ryan Strong
ac67c6e34b htop: add v3.3.0 (#44369) 2024-05-25 17:18:59 -07:00
Peter Scheibel
72deb53832 Make spack clean env-aware (#44227)
`spack clean <spec>` will now resolve specs based on the active environment if one is active.

If an env is active but no matching spec is found, this will fall back on fully concretizing.
2024-05-24 15:00:50 -07:00
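For illustration, a possible session (environment and spec names hypothetical):

```
$ spack env activate myenv
$ spack clean zlib    # zlib is resolved against myenv's concrete specs
```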
Massimiliano Culpo
7c87253fd8 Make strong preferences even stronger (#44373)
Before this PR, if Spack could see a possibility to reuse a spec that
doesn't match a strong preference, it would do so. After the PR, a
strong preference would take precedence.
2024-05-24 10:06:28 -07:00
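Strong preferences here are the kind expressed with `prefer:` in `packages.yaml`; a minimal sketch (package and variant choice illustrative):

```yaml
packages:
  zlib-ng:
    prefer:
    - "+shared"   # reused specs matching this now take precedence
```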
Carsten Uphoff
1136aedd08 Add Khronos official OpenCL ICD loader (#44351)
* Add Khronos official OpenCL ICD loader

Signed-off-by: Carsten Uphoff <carsten.uphoff@intel.com>

* Formatting; add missing opencl-c-headers version

Signed-off-by: Carsten Uphoff <carsten.uphoff@intel.com>

* opencl-icd-loader: use define instead of f-string

Signed-off-by: Carsten Uphoff <carsten.uphoff@intel.com>

---------

Signed-off-by: Carsten Uphoff <carsten.uphoff@intel.com>
2024-05-24 09:57:39 -07:00
AMD Toolchain Support
24e1b56268 uprof: update recipe, add missing dependency (#44293) 2024-05-24 16:19:18 +02:00
Harmen Stoppels
eef6a79b35 Prefer libiconv for iconv (#44335)
`glibc` and `musl` provide a basic implementation of `iconv` (`iconv`,
`iconv_open`, `iconv_close`), but in practice the installation may be
missing the character encoding methods to make them usable. On Fedora
for example, users need to

```yum install glibc-gconv-extra```

to get the character encodings that `gettext` requires during configure,
namely EUC-JP. Users may not have permissions to install the missing
parts of glibc.

Since Spack can install `libiconv`, it is simpler to use that by
default.
2024-05-24 13:25:59 +02:00
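The new default is visible in the `packages.yaml` diff later in this changeset; in short:

```yaml
packages:
  iconv:
    require: [libiconv]
```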
Adam J. Stewart
556a36cbd7 py-scikit-learn: add v1.5.0 (#44303)
* py-scikit-learn: add v1.5.0

* Add maintainers

* py-scikit-learn-extra: latest py-scikit-learn not supported
2024-05-24 12:41:59 +02:00
Rocco Meli
8aa490d6b7 DLA-Future-Fortran: new package (#44314)
* Spglib: add version 2.4.0

* dla-future-fortran: new package version 0.1.0

* [@spackbot] updating style on behalf of RMeli

* apply suggestion and add maintainer

---------

Co-authored-by: RMeli <RMeli@users.noreply.github.com>
2024-05-24 09:53:33 +02:00
Chris Marsh
d9d085da10 Fix patch being applied @7 which causes an error (#44367) 2024-05-24 09:45:20 +02:00
Samuel Khuvis
d88d720577 add mvapich support for intel scalapack_libs (#44246)
* add mvapich support for intel scalapack_libs

* Add mvapich support for oneapi scalapack_libs
2024-05-24 02:51:53 +00:00
Harry Sharma
1d670ae744 feat: add metacarpa@1.0.1 to spack (#44339)
* feat: add metacarpa@1.0.1 to spack
* fix: style issue
* Update copyright year
2024-05-23 20:43:13 -06:00
Veselin Dobrev
35ad6f52c1 Better Emacs build on Mac OS (#37294)
* [emacs] When installing on Mac OS, build and install the native
        Emacs.app along with the standard executables.

* [emacs] Make the GUI build on Mac optional by adding "gui" variant

* Apply reviewer suggestion

Co-authored-by: Alec Scott <hi@alecbcs.com>

* Add emacs version 29.3

---------

Co-authored-by: Alec Scott <hi@alecbcs.com>
2024-05-23 18:17:38 -07:00
Greg Becker
b61bae7640 bugfix: external detection for compilers with os but not target (#44156)
avoid calling `spec.target` when None.

When an external compiler package has an `os` set but no `target` set, Spack
currently falls into a codepath that calls `spec.target` (which itself calls
`spec.architecture.target.Microarchitecture`) when `spec.architecture.target`
is None, throwing an error.

e.g.

```
packages:
  gcc:
    externals:
    - spec: gcc@12.3.1 os=rhel7
      prefix: /usr
```

---------

Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2024-05-24 00:13:36 +00:00
Massimiliano Culpo
8b7abace8b Enforce consistency of gl providers (#44307)
* glew: rework dependency on gl

This simplifies the package and ensures a single gl implementation is
pulled in. Before we were adding direct dependencies, and those are
not unified through the virtual.

* mesa-demos: rework dependency on gl

This simplifies the package and ensures a single gl implementation is
pulled in. Before we were adding direct dependencies, and those are
not unified through the virtual.

* mesa-glu: rework dependency on gl

This simplifies the package and ensures a single gl implementation is
pulled in. Before we were adding direct dependencies, and those are
not unified through the virtual.

* paraview: fix dependency on glew

* mesa: group dependency on when("+glx")

* Add missing dependency on libxml2

* paraview: remove the "osmesa" and "egl" variant

Instead, enforce consistency using the "gl" virtual that allows
only one provider.

* visit: remove osmesa variant

* Disable paraview in the aws-isc stacks

* data-vis-sdk: rework constraints to enforce front-ends

* e4s-power: remove redundant paraview

* Pipelines: update osmesa variants

* trilinos-catalyst-ioss-adapter: make gl a run dependency
2024-05-23 20:17:51 +00:00
pauleonix
5cf98d9564 asio: Add 1.30.0:1.30.2 (#44346) 2024-05-23 13:15:02 -07:00
Diego Alvarez S
973a961cb5 Add openjdk 11.0.23+9, 17.0.11+9, 21.0.3+9 (#44340) 2024-05-23 12:48:57 -05:00
Adam J. Stewart
868d0cb957 py-scipy: add v1.13.1 (#44337) 2024-05-23 10:45:51 -07:00
Diego Alvarez S
497f3a3832 nextflow: add 24.04.1 (#44338)
* Add nextflow 24.04.1
* Install java 17 in this version
2024-05-23 10:44:36 -07:00
Wouter Deconinck
9843f41bce libxkbcommon: add v1.6.0, v1.7.0 (#44344) 2024-05-23 10:37:07 -07:00
Carsten Uphoff
e54fefc2b7 Add double-batched-fft-library@0.5.1 (#44345)
Signed-off-by: Carsten Uphoff <carsten.uphoff@intel.com>
2024-05-23 10:34:57 -07:00
pauleonix
90c0889533 benchmark: Add 1.8.4 (#44347) 2024-05-23 10:31:29 -07:00
Matt Thompson
6696e82ce7 mapl: add conflicts for intel 2021.7 (#44350) 2024-05-23 10:20:53 -07:00
Carsten Uphoff
dcc55d53db Add level zero loader versions (#44349)
Signed-off-by: Carsten Uphoff <carsten.uphoff@intel.com>
2024-05-23 10:56:31 -06:00
Harmen Stoppels
92000e81b8 absolutify_elf_sonames.py: fix _patchelf (#44343) 2024-05-23 14:05:10 +00:00
Mikael Simberg
125175ae25 gperftools: Don't build benchmarks or tests (#44336) 2024-05-23 06:55:52 -06:00
Massimiliano Culpo
f60e548a0d ASP-based solver: fix reusing externals on linux (#44316)
We need to tell clingo the libc compatibility of external nodes
in buildcaches or stores, to allow reuse.
2024-05-23 14:37:48 +02:00
John W. Parent
04dc16a6b1 cmake: add v3.28 (#43723) 2024-05-23 07:58:05 +02:00
Wouter Deconinck
27b90e38db py-cloudpickle: new version 3.0.0 (switch to flit-core) (#44324) 2024-05-23 07:40:16 +02:00
Wouter Deconinck
7e5ce3ba48 py-ordered-set: new version 4.1.0 (flit-core) (#44325) 2024-05-23 07:38:53 +02:00
Paolo
f5f7cfdc8f armpl-gcc: add v24.04 (#43553)
Starting from 24.04, armpl-gcc will only have three package files: dev, rpm, and macOS.

Only one gcc version is provided, so the code now reflects that a single tar file is
provided for gcc rather than many.
2024-05-23 06:37:23 +02:00
Alex Richert
3e1a562312 Update package.py (#44322) 2024-05-22 21:13:35 -06:00
Todd Gamblin
ce4d962faa README.md: add windows 2024-05-22 18:12:12 -07:00
mSamiolo
b9816a97fc docs: update chain.rst to improve discussion of upstreams (#43918)
* Update chain.rst

* Update lib/spack/docs/chain.rst

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* Update lib/spack/docs/chain.rst

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* docs: rm leading spaces to avoid indent

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-05-22 22:04:53 +00:00
950 changed files with 12377 additions and 13122 deletions

View File

@@ -61,7 +61,7 @@ jobs:
./share/spack/qa/validate_last_exit.ps1
spack -d audit externals
./share/spack/qa/validate_last_exit.ps1
- uses: codecov/codecov-action@125fc84a9a348dbcf27191600683ec096ec9021c
- uses: codecov/codecov-action@e28ff129e5465c2c0dcc6f003fc735cb6ae0c673
if: ${{ inputs.with_coverage == 'true' }}
with:
flags: unittests,audits

View File

@@ -53,10 +53,10 @@ jobs:
runs-on: ${{ matrix.runner }}
strategy:
matrix:
runner: ['macos-13', 'macos-14', "ubuntu-latest"]
runner: ['macos-13', 'macos-14', "ubuntu-latest", "windows-latest"]
steps:
- name: Setup macOS
if: ${{ matrix.runner != 'ubuntu-latest' }}
if: ${{ matrix.runner != 'ubuntu-latest' && matrix.runner != 'windows-latest' }}
run: |
brew install cmake bison tree
- name: Checkout
@@ -67,13 +67,17 @@ jobs:
with:
python-version: "3.12"
- name: Bootstrap clingo
env:
SETUP_SCRIPT_EXT: ${{ matrix.runner == 'windows-latest' && 'ps1' || 'sh' }}
SETUP_SCRIPT_SOURCE: ${{ matrix.runner == 'windows-latest' && './' || 'source ' }}
USER_SCOPE_PARENT_DIR: ${{ matrix.runner == 'windows-latest' && '$env:userprofile' || '$HOME' }}
run: |
source share/spack/setup-env.sh
${{ env.SETUP_SCRIPT_SOURCE }}share/spack/setup-env.${{ env.SETUP_SCRIPT_EXT }}
spack bootstrap disable github-actions-v0.5
spack bootstrap disable github-actions-v0.4
spack external find --not-buildable cmake bison
spack -d solve zlib
tree ~/.spack/bootstrap/store/
tree ${{ env.USER_SCOPE_PARENT_DIR }}/.spack/bootstrap/store/
gnupg-sources:
runs-on: ${{ matrix.runner }}
@@ -84,13 +88,11 @@ jobs:
- name: Setup macOS
if: ${{ matrix.runner != 'ubuntu-latest' }}
run: |
brew install tree
# Remove GnuPG since we want to bootstrap it
sudo rm -rf /usr/local/bin/gpg
brew install tree gawk
sudo rm -rf $(command -v gpg gpg2)
- name: Setup Ubuntu
if: ${{ matrix.runner == 'ubuntu-latest' }}
run: |
sudo rm -rf $(which gpg) $(which gpg2) $(which patchelf)
run: sudo rm -rf $(command -v gpg gpg2 patchelf)
- name: Checkout
uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:

View File

@@ -40,8 +40,7 @@ jobs:
# 1: Platforms to build for
# 2: Base image (e.g. ubuntu:22.04)
dockerfile: [[amazon-linux, 'linux/amd64,linux/arm64', 'amazonlinux:2'],
[centos7, 'linux/amd64,linux/arm64,linux/ppc64le', 'centos:7'],
[centos-stream, 'linux/amd64,linux/arm64,linux/ppc64le', 'centos:stream'],
[centos-stream9, 'linux/amd64,linux/arm64,linux/ppc64le', 'centos:stream9'],
[leap15, 'linux/amd64,linux/arm64,linux/ppc64le', 'opensuse/leap:15'],
[ubuntu-focal, 'linux/amd64,linux/arm64,linux/ppc64le', 'ubuntu:20.04'],
[ubuntu-jammy, 'linux/amd64,linux/arm64,linux/ppc64le', 'ubuntu:22.04'],
@@ -88,16 +87,16 @@ jobs:
fi
- name: Upload Dockerfile
uses: actions/upload-artifact@65462800fd760344b1a7b4382951275a0abb4808
uses: actions/upload-artifact@0b2256b8c012f0828dc542b3febcab082c67f72b
with:
name: dockerfiles_${{ matrix.dockerfile[0] }}
path: dockerfiles
- name: Set up QEMU
uses: docker/setup-qemu-action@68827325e0b33c7199eb31dd4e31fbe9023e06e3
uses: docker/setup-qemu-action@5927c834f5b4fdf503fca6f4c7eccda82949e1ee
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@d70bba72b1f3fd22344832f00baa16ece964efeb
uses: docker/setup-buildx-action@4fd812986e6c8c2a69e18311145f9371337f27d4
- name: Log in to GitHub Container Registry
uses: docker/login-action@e92390c5fb421da1463c202d546fed0ec5c39f20
@@ -114,7 +113,7 @@ jobs:
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Build & Deploy ${{ matrix.dockerfile[0] }}
uses: docker/build-push-action@2cdde995de11925a030ce8070c3d77a52ffcf1c0
uses: docker/build-push-action@1a162644f9a7e87d8f4b053101d1d9a712edc18c
with:
context: dockerfiles/${{ matrix.dockerfile[0] }}
platforms: ${{ matrix.dockerfile[1] }}
@@ -127,7 +126,7 @@ jobs:
needs: deploy-images
steps:
- name: Merge Artifacts
uses: actions/upload-artifact/merge@65462800fd760344b1a7b4382951275a0abb4808
uses: actions/upload-artifact/merge@0b2256b8c012f0828dc542b3febcab082c67f72b
with:
name: dockerfiles
pattern: dockerfiles_*

View File

@@ -53,6 +53,13 @@ jobs:
- 'var/spack/repos/builtin/packages/clingo/**'
- 'var/spack/repos/builtin/packages/python/**'
- 'var/spack/repos/builtin/packages/re2c/**'
- 'var/spack/repos/builtin/packages/gnupg/**'
- 'var/spack/repos/builtin/packages/libassuan/**'
- 'var/spack/repos/builtin/packages/libgcrypt/**'
- 'var/spack/repos/builtin/packages/libgpg-error/**'
- 'var/spack/repos/builtin/packages/libksba/**'
- 'var/spack/repos/builtin/packages/npth/**'
- 'var/spack/repos/builtin/packages/pinentry/**'
- 'lib/spack/**'
- 'share/spack/**'
- '.github/workflows/bootstrap.yml'
@@ -77,13 +84,8 @@ jobs:
needs: [ prechecks, changes ]
uses: ./.github/workflows/unit_tests.yaml
secrets: inherit
windows:
if: ${{ github.repository == 'spack/spack' && needs.changes.outputs.core == 'true' }}
needs: [ prechecks ]
uses: ./.github/workflows/windows_python.yml
secrets: inherit
all:
needs: [ windows, unit-tests, bootstrap ]
needs: [ unit-tests, bootstrap ]
runs-on: ubuntu-latest
steps:
- name: Success

View File

@@ -1,6 +1,6 @@
black==24.4.2
clingo==5.7.1
flake8==7.0.0
flake8==7.1.0
isort==5.13.2
mypy==1.8.0
types-six==1.16.21.20240513

View File

@@ -91,7 +91,7 @@ jobs:
UNIT_TEST_COVERAGE: ${{ matrix.python-version == '3.11' }}
run: |
share/spack/qa/run-unit-tests
- uses: codecov/codecov-action@125fc84a9a348dbcf27191600683ec096ec9021c
- uses: codecov/codecov-action@e28ff129e5465c2c0dcc6f003fc735cb6ae0c673
with:
flags: unittests,linux,${{ matrix.concretizer }}
token: ${{ secrets.CODECOV_TOKEN }}
@@ -124,7 +124,7 @@ jobs:
COVERAGE: true
run: |
share/spack/qa/run-shell-tests
- uses: codecov/codecov-action@125fc84a9a348dbcf27191600683ec096ec9021c
- uses: codecov/codecov-action@e28ff129e5465c2c0dcc6f003fc735cb6ae0c673
with:
flags: shelltests,linux
token: ${{ secrets.CODECOV_TOKEN }}
@@ -185,7 +185,7 @@ jobs:
SPACK_TEST_SOLVER: clingo
run: |
share/spack/qa/run-unit-tests
- uses: codecov/codecov-action@125fc84a9a348dbcf27191600683ec096ec9021c
- uses: codecov/codecov-action@e28ff129e5465c2c0dcc6f003fc735cb6ae0c673
with:
flags: unittests,linux,clingo
token: ${{ secrets.CODECOV_TOKEN }}
@@ -223,8 +223,39 @@ jobs:
$(which spack) solve zlib
common_args=(--dist loadfile --tx '4*popen//python=./bin/spack-tmpconfig python -u ./bin/spack python' -x)
$(which spack) unit-test --verbose --cov --cov-config=pyproject.toml --cov-report=xml:coverage.xml "${common_args[@]}"
- uses: codecov/codecov-action@125fc84a9a348dbcf27191600683ec096ec9021c
- uses: codecov/codecov-action@e28ff129e5465c2c0dcc6f003fc735cb6ae0c673
with:
flags: unittests,macos
token: ${{ secrets.CODECOV_TOKEN }}
verbose: true
# Run unit tests on Windows
windows:
defaults:
run:
shell:
powershell Invoke-Expression -Command "./share/spack/qa/windows_test_setup.ps1"; {0}
runs-on: windows-latest
steps:
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
with:
python-version: 3.9
- name: Install Python packages
run: |
python -m pip install --upgrade pip pywin32 setuptools pytest-cov clingo
- name: Create local develop
run: |
./.github/workflows/setup_git.ps1
- name: Unit Test
run: |
spack unit-test -x --verbose --cov --cov-config=pyproject.toml
./share/spack/qa/validate_last_exit.ps1
coverage combine -a
coverage xml
- uses: codecov/codecov-action@e28ff129e5465c2c0dcc6f003fc735cb6ae0c673
with:
flags: unittests,windows
token: ${{ secrets.CODECOV_TOKEN }}
verbose: true

View File

@@ -1,83 +0,0 @@
name: windows
on:
workflow_call:
concurrency:
group: windows-${{github.ref}}-${{github.event.pull_request.number || github.run_number}}
cancel-in-progress: true
defaults:
run:
shell:
powershell Invoke-Expression -Command "./share/spack/qa/windows_test_setup.ps1"; {0}
jobs:
unit-tests:
runs-on: windows-latest
steps:
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
with:
python-version: 3.9
- name: Install Python packages
run: |
python -m pip install --upgrade pip pywin32 setuptools pytest-cov clingo
- name: Create local develop
run: |
./.github/workflows/setup_git.ps1
- name: Unit Test
run: |
spack unit-test -x --verbose --cov --cov-config=pyproject.toml --ignore=lib/spack/spack/test/cmd
./share/spack/qa/validate_last_exit.ps1
coverage combine -a
coverage xml
- uses: codecov/codecov-action@125fc84a9a348dbcf27191600683ec096ec9021c
with:
flags: unittests,windows
token: ${{ secrets.CODECOV_TOKEN }}
verbose: true
unit-tests-cmd:
runs-on: windows-latest
steps:
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
with:
python-version: 3.9
- name: Install Python packages
run: |
python -m pip install --upgrade pip pywin32 setuptools coverage pytest-cov clingo
- name: Create local develop
run: |
./.github/workflows/setup_git.ps1
- name: Command Unit Test
run: |
spack unit-test -x --verbose --cov --cov-config=pyproject.toml lib/spack/spack/test/cmd
./share/spack/qa/validate_last_exit.ps1
coverage combine -a
coverage xml
- uses: codecov/codecov-action@125fc84a9a348dbcf27191600683ec096ec9021c
with:
flags: unittests,windows
token: ${{ secrets.CODECOV_TOKEN }}
verbose: true
build-abseil:
runs-on: windows-latest
steps:
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
with:
python-version: 3.9
- name: Install Python packages
run: |
python -m pip install --upgrade pip pywin32 setuptools coverage
- name: Build Test
run: |
spack compiler find
spack -d external find cmake ninja
spack -d install abseil-cpp

View File

@@ -1,3 +1,324 @@
# v0.22.0 (2024-05-12)
`v0.22.0` is a major feature release.
## Features in this release
1. **Compiler dependencies**
We are in the process of making compilers proper dependencies in Spack, and a number
of changes in `v0.22` support that effort. You may notice nodes in your dependency
graphs for compiler runtime libraries like `gcc-runtime` or `libgfortran`, and you
may notice that Spack graphs now include `libc`. We've also begun moving compiler
configuration from `compilers.yaml` to `packages.yaml` to make it consistent with
other externals. We are trying to do this with the least disruption possible, so
your existing `compilers.yaml` files should still work. We expect to be done with
this transition by the `v0.23` release in November.
* #41104: Packages compiled with `%gcc` on Linux, macOS and FreeBSD now depend on a
new package `gcc-runtime`, which contains a copy of the shared compiler runtime
libraries. This enables gcc runtime libraries to be installed and relocated when
using a build cache. When building minimal Spack-generated container images it is
no longer necessary to install libgfortran, libgomp etc. using the system package
manager.
* #42062: Packages compiled with `%oneapi` now depend on a new package
`intel-oneapi-runtime`. This is similar to `gcc-runtime`, and the runtimes can
provide virtuals and compilers can inject dependencies on virtuals into compiled
packages. This allows us to model library soname compatibility and allows
compilers like `%oneapi` to provide virtuals like `sycl` (which can also be
provided by standalone libraries). Note that until we have an agreement in place
with Intel, Intel packages are marked `redistribute(source=False, binary=False)`
and must be downloaded outside of Spack.
* #43272: changes to the optimization criteria of the solver improve the hit-rate of
buildcaches by a fair amount. The solver now uses more relaxed compatibility rules and will
not try to strictly match compilers or targets of reused specs. Users can still
enforce the previous strict behavior with `require:` sections in `packages.yaml`.
Note that to enforce correct linking, Spack will *not* reuse old `%gcc` and
`%oneapi` specs that do not have the runtime libraries as a dependency.
* #43539: Spack will reuse specs built with compilers that are *not* explicitly
configured in `compilers.yaml`. Because we can now keep runtime libraries in build
cache, we do not require you to also have a local configured compiler to *use* the
runtime libraries. This improves reuse in buildcaches and avoids conflicts with OS
updates that happen underneath Spack.
* #43190: binary compatibility on `linux` is now based on the `libc` version,
instead of on the `os` tag. Spack builds now detect the host `libc` (`glibc` or
`musl`) and add it as an implicit external node in the dependency graph. Binaries
with a `libc` with the same name and a version less than or equal to that of the
detected `libc` can be reused. This applies only on `linux`, not on `macOS` or `Windows`.
* #43464: each package that can provide a compiler is now detectable using `spack
external find`. External packages defining compiler paths are effectively used as
compilers, and `spack external find -t compiler` can be used as a substitute for
`spack compiler find`. More details on this transition are in
[the docs](https://spack.readthedocs.io/en/latest/getting_started.html#manual-compiler-configuration)
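For example, the transition can look roughly like this (a sketch; flags as named above, output omitted):
```
$ spack compiler find               # the older workflow
$ spack external find -t compiler   # new equivalent using external detection
```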
2. **Improved `spack find` UI for Environments**
If you're working in an environment, you likely care about:
* What are the roots
* Which ones are installed / not installed
* What's been added that still needs to be concretized
We've tweaked `spack find` in environments to show this information much more
clearly. Installation status is shown next to each root, so you can see what is
installed. Roots are also shown in bold in the list of installed packages. There is
also a new option for `spack find -r` / `--only-roots` that will only show env
roots, if you don't want to look at all the installed specs.
More details in #42334.
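For instance (environment name hypothetical):
```
$ spack -e myenv find --only-roots   # list just the environment roots
```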
3. **Improved command-line string quoting**
We are making some breaking changes to how Spack parses specs on the CLI in order to
respect shell quoting instead of trying to fight it. If you (sadly) had to write
something like this on the command line:
```
spack install zlib cflags=\"-O2 -g\"
```
That will now result in an error, but you can now write what you probably expected
to work in the first place:
```
spack install zlib cflags="-O2 -g"
```
Quoted strings can now also include special characters, so you can supply flags like:
```
spack install zlib ldflags='-Wl,-rpath=$ORIGIN/_libs'
```
To reduce ambiguity in parsing, we now require that you *not* put spaces around `=`
and `==` for flags or variants. This would not have broken before but will now
result in an error:
```
spack install zlib cflags = "-O2 -g"
```
More details and discussion in #30634.
4. **Revert default `spack install` behavior to `--reuse`**
We changed the default concretizer behavior from `--reuse` to `--reuse-deps` in
#30990 (in `v0.20`), which meant that *every* `spack install` invocation would
attempt to build a new version of the requested package / any environment roots.
While this is a common ask for *upgrading* and for *developer* workflows, we don't
think it should be the default for a package manager.
We are going to try to stick to this policy:
1. Prioritize reuse and build as little as possible by default.
2. Only upgrade or install duplicates if they are explicitly asked for, or if there
is a known security issue that necessitates an upgrade.
With the install command you now have three options:
* `--reuse` (default): reuse as many existing installations as possible.
* `--reuse-deps` / `--fresh-roots`: upgrade (freshen) roots but reuse dependencies if possible.
* `--fresh`: install fresh versions of requested packages (roots) and their dependencies.
We've also introduced `--fresh-roots` as an alias for `--reuse-deps` to make it more clear
that it may give you fresh versions. More details in #41302 and #43988.
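Side by side, the three modes look like this (package name illustrative):
```
$ spack install hdf5                 # --reuse (default): reuse wherever possible
$ spack install --fresh-roots hdf5   # fresh roots, reuse dependencies
$ spack install --fresh hdf5         # fresh roots and fresh dependencies
```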
5. **More control over reused specs**
You can now control which packages to reuse and how. There is a new
`concretizer:reuse` config option, which accepts the following properties:
- `roots`: `true` to reuse roots, `false` to reuse just dependencies
- `exclude`: list of constraints used to select which specs *not* to reuse
- `include`: list of constraints used to select which specs *to* reuse
- `from`: list of sources for reused specs (some combination of `local`,
`buildcache`, or `external`)
For example, to reuse only specs compiled with GCC, you could write:
```yaml
concretizer:
reuse:
roots: true
include:
- "%gcc"
```
Or, if `openmpi` must be used from externals, and it must be the only external used:
```yaml
concretizer:
reuse:
roots: true
from:
- type: local
exclude: ["openmpi"]
- type: buildcache
exclude: ["openmpi"]
- type: external
include: ["openmpi"]
```
6. **New `redistribute()` directive**
Some packages can't be redistributed in source or binary form. We need an explicit
way to say that in a package.
Now there is a `redistribute()` directive so that package authors can write:
```python
class MyPackage(Package):
redistribute(source=False, binary=False)
```
Like other directives, this works with `when=`:
```python
class MyPackage(Package):
# 12.0 and higher are proprietary
redistribute(source=False, binary=False, when="@12.0:")
# can't redistribute when we depend on some proprietary dependency
redistribute(source=False, binary=False, when="^proprietary-dependency")
```
More in #20185.
7. **New `conflict:` and `prefer:` syntax for package preferences**
Previously, you could express conflicts and preferences in `packages.yaml` through
some contortions with `require:`:
```yaml
packages:
zlib-ng:
require:
- one_of: ["%clang", "@:"] # conflict on %clang
- any_of: ["+shared", "@:"] # strong preference for +shared
```
You can now use `conflict:` and `prefer:` for a much more readable configuration:
```yaml
packages:
zlib-ng:
conflict:
- "%clang"
prefer:
- "+shared"
```
See [the documentation](https://spack.readthedocs.io/en/latest/packages_yaml.html#conflicts-and-strong-preferences)
and #41832 for more details.
8. **`include_concrete` in environments**
You may want to build on the *concrete* contents of another environment without
changing that environment. You can now include the concrete specs from another
environment's `spack.lock` with `include_concrete`:
```yaml
spack:
specs: []
concretizer:
unify: true
include_concrete:
- /path/to/environment1
- /path/to/environment2
```
Now, when *this* environment is concretized, it will bring in the already concrete
specs from `environment1` and `environment2`, and build on top of them without
changing them. This is useful if you have phased deployments, where old deployments
should not be modified but you want to use as many of them as possible. More details
in #33768.
9. **`python-venv` isolation**
Spack has unique requirements for Python because it:
1. installs every package in its own independent directory, and
2. allows users to register *external* python installations.
External installations may contain their own installed packages that can interfere
with Spack installations, and some distributions (Debian and Ubuntu) even change the
`sysconfig` in ways that alter the installation layout of installed Python packages
(e.g., with the addition of a `/local` prefix on Debian or Ubuntu). To isolate Spack
from these and other issues, we now insert a small `python-venv` package in between
`python` and packages that need to install Python code. This isolates Spack's build
environment, isolates Spack from any issues with an external python, and resolves a
large number of issues we've had with Python installations.
See #40773 for further details.
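Concretely, a concretized Python package should now show a `python-venv` node between itself and `python`; schematically (not literal `spack spec` output):
```
$ spack spec py-numpy
py-numpy
    ^python-venv
        ^python
    ...
```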
## New commands, options, and directives
* Allow packages to be pushed to build cache after install from source (#42423)
* `spack develop`: stage build artifacts in same root as non-dev builds #41373
* Don't delete `spack develop` build artifacts after install (#43424)
* `spack find`: add options for local/upstream only (#42999)
* `spack logs`: print log files for packages (either partially built or installed) (#42202)
* `patch`: support reversing patches (#43040)
* `develop`: Add -b/--build-directory option to set build_directory package attribute (#39606)
* `spack list`: add `--namespace` / `--repo` option (#41948)
* directives: add `checked_by` field to `license()`, add some license checks
* `spack gc`: add options for environments and build dependencies (#41731)
* Add `--create` to `spack env activate` (#40896)
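A few of the additions above in use (spec, path, and environment names illustrative):
```
$ spack env activate --create myenv    # create the env if needed, then activate
$ spack develop -b /tmp/build mypkg    # set the package's build_directory attribute
$ spack logs zlib                      # print log files for a built or installed spec
```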
## Performance improvements
* environment.py: fix excessive re-reads (#43746)
* ruamel yaml: fix quadratic complexity bug (#43745)
* Refactor to improve `spec format` speed (#43712)
* Do not acquire a write lock on the env post install if no views (#43505)
* asp.py: fewer calls to `spec.copy()` (#43715)
* spec.py: early return in `__str__`
* avoid `jinja2` import at startup unless needed (#43237)
## Other new features of note
* `archspec`: update to `v0.2.4`: support for Windows, bugfixes for `neoverse-v1` and
`neoverse-v2` detection.
* `spack config get`/`blame`: with no args, show entire config
* `spack env create <env>`: dir if dir-like (#44024)
* ASP-based solver: update os compatibility for macOS (#43862)
* Add handling of custom ssl certs in urllib ops (#42953)
* Add ability to rename environments (#43296)
* Add config option and compiler support to reuse across OS's (#42693)
* Support for prereleases (#43140)
* Only reuse externals when configured (#41707)
* Environments: Add support for including views (#42250)
## Binary caches
* Build cache: make signed/unsigned a mirror property (#41507)
* tools stack
## Removals, deprecations, and syntax changes
* remove `dpcpp` compiler and package (#43418)
* spack load: remove --only argument (#42120)
## Notable Bugfixes
* repo.py: drop deleted packages from provider cache (#43779)
* Allow `+` in module file names (#41999)
* `cmd/python`: use runpy to allow multiprocessing in scripts (#41789)
* Show extension commands with spack -h (#41726)
* Support environment variable expansion inside module projections (#42917)
* Alert user to failed concretizations (#42655)
* shell: fix zsh color formatting for PS1 in environments (#39497)
* spack mirror create --all: include patches (#41579)
## Spack community stats
* 7,994 total packages; 525 since `v0.21.0`
* 178 new Python packages, 5 new R packages
* 358 people contributed to this release
* 344 committers to packages
* 45 committers to core
# v0.21.2 (2024-03-01)
## Bugfixes

View File

@@ -32,7 +32,7 @@
Spack is a multi-platform package manager that builds and installs
multiple versions and configurations of software. It works on Linux,
macOS, and many supercomputers. Spack is non-destructive: installing a
macOS, Windows, and many supercomputers. Spack is non-destructive: installing a
new version of a package does not break existing installations, so many
configurations of the same package can coexist.

View File

@@ -22,4 +22,4 @@
#
# This is compatible across platforms.
#
exec /usr/bin/env spack python "$@"
exec spack python "$@"

View File

@@ -188,25 +188,27 @@ if NOT "%_sp_args%"=="%_sp_args:--help=%" (
goto :end_switch
:case_load
:: If args contain --sh, --csh, or -h/--help: just execute.
if defined _sp_args (
if NOT "%_sp_args%"=="%_sp_args:--help=%" (
goto :default_case
) else if NOT "%_sp_args%"=="%_sp_args:-h=%" (
goto :default_case
) else if NOT "%_sp_args%"=="%_sp_args:--bat=%" (
goto :default_case
)
if NOT defined _sp_args (
exit /B 0
)
:: If args contain --bat, or -h/--help: just execute.
if NOT "%_sp_args%"=="%_sp_args:--help=%" (
goto :default_case
) else if NOT "%_sp_args%"=="%_sp_args:-h=%" (
goto :default_case
) else if NOT "%_sp_args%"=="%_sp_args:--bat=%" (
goto :default_case
) else if NOT "%_sp_args%"=="%_sp_args:--list=%" (
goto :default_case
)
for /f "tokens=* USEBACKQ" %%I in (
`python "%spack%" %_sp_flags% %_sp_subcommand% --bat %_sp_args%`) do %%I
`python "%spack%" %_sp_flags% %_sp_subcommand% --bat %_sp_args%`
) do %%I
goto :end_switch
:case_unload
goto :case_load
:default_case
python "%spack%" %_sp_flags% %_sp_subcommand% %_sp_args%
goto :end_switch

View File

@@ -1,16 +0,0 @@
# -------------------------------------------------------------------------
# This is the default configuration for Spack's module file generation.
#
# Settings here are versioned with Spack and are intended to provide
# sensible defaults out of the box. Spack maintainers should edit this
# file to keep it current.
#
# Users can override these settings by editing the following files.
#
# Per-spack-instance settings (overrides defaults):
# $SPACK_ROOT/etc/spack/modules.yaml
#
# Per-user settings (overrides default and site settings):
# ~/.spack/modules.yaml
# -------------------------------------------------------------------------
modules: {}

View File

@@ -1,19 +0,0 @@
# -------------------------------------------------------------------------
# This file controls default concretization preferences for Spack.
#
# Settings here are versioned with Spack and are intended to provide
# sensible defaults out of the box. Spack maintainers should edit this
# file to keep it current.
#
# Users can override these settings by editing the following files.
#
# Per-spack-instance settings (overrides defaults):
# $SPACK_ROOT/etc/spack/packages.yaml
#
# Per-user settings (overrides default and site settings):
# ~/.spack/packages.yaml
# -------------------------------------------------------------------------
packages:
all:
providers:
iconv: [glibc, musl, libiconv]

View File

@@ -1,19 +1,3 @@
# -------------------------------------------------------------------------
# This file controls default concretization preferences for Spack.
#
# Settings here are versioned with Spack and are intended to provide
# sensible defaults out of the box. Spack maintainers should edit this
# file to keep it current.
#
# Users can override these settings by editing the following files.
#
# Per-spack-instance settings (overrides defaults):
# $SPACK_ROOT/etc/spack/packages.yaml
#
# Per-user settings (overrides default and site settings):
# ~/.spack/packages.yaml
# -------------------------------------------------------------------------
packages:
all:
providers:
iconv: [glibc, musl, libiconv]
iconv:
require: [libiconv]

View File

@@ -60,7 +60,7 @@ packages:
szip: [libaec, libszip]
tbb: [intel-tbb]
unwind: [libunwind]
uuid: [util-linux-uuid, libuuid]
uuid: [util-linux-uuid, util-linux+uuid, libuuid]
xxd: [xxd-standalone, vim]
yacc: [bison, byacc]
ziglang: [zig]

View File

@@ -1433,22 +1433,12 @@ the reserved keywords ``platform``, ``os`` and ``target``:
$ spack install libelf os=ubuntu18.04
$ spack install libelf target=broadwell
or together by using the reserved keyword ``arch``:
.. code-block:: console
$ spack install libelf arch=cray-CNL10-haswell
Normally users don't have to bother specifying the architecture if they
are installing software for their current host, as in that case the
values will be detected automatically. If you need fine-grained control
over which packages use which targets (or over *all* packages' default
target), see :ref:`package-preferences`.
.. admonition:: Cray machines
The situation is a little bit different for Cray machines and a detailed
explanation on how the architecture can be set on them can be found at :ref:`cray-support`
.. _support-for-microarchitectures:

View File

@@ -11,7 +11,8 @@ Chaining Spack Installations
You can point your Spack installation to another installation to use any
packages that are installed there. To register the other Spack instance,
you can add it as an entry to ``upstreams.yaml``:
you can add it as an entry to ``upstreams.yaml`` at any of the
:ref:`configuration-scopes`:
.. code-block:: yaml
@@ -22,7 +23,8 @@ you can add it as an entry to ``upstreams.yaml``:
install_tree: /path/to/another/spack/opt/spack
``install_tree`` must point to the ``opt/spack`` directory inside of the
Spack base directory.
Spack base directory, or the location of the ``install_tree`` defined
in :ref:`config.yaml <config-yaml>`.
Once the upstream Spack instance has been added, ``spack find`` will
automatically check the upstream instance when querying installed packages,

View File

@@ -203,12 +203,9 @@ The OS that are currently supported are summarized in the table below:
* - Ubuntu 24.04
- ``ubuntu:24.04``
- ``spack/ubuntu-noble``
* - CentOS 7
- ``centos:7``
- ``spack/centos7``
* - CentOS Stream
- ``quay.io/centos/centos:stream``
- ``spack/centos-stream``
* - CentOS Stream9
- ``quay.io/centos/centos:stream9``
- ``spack/centos-stream9``
* - openSUSE Leap
- ``opensuse/leap``
- ``spack/leap15``

View File

@@ -1364,187 +1364,6 @@ This will write the private key to the file `dinosaur.priv`.
or for help on an issue or the Spack slack.
.. _cray-support:
-------------
Spack on Cray
-------------
Spack differs slightly when used on a Cray system. The architecture spec
can differentiate between the front-end and back-end processor and operating system.
For example, on Edison at NERSC, the back-end target processor
is "Ivy Bridge", so you can specify to use the back-end this way:
.. code-block:: console
$ spack install zlib target=ivybridge
You can also use the operating system to build against the back-end:
.. code-block:: console
$ spack install zlib os=CNL10
Notice that the name includes both the operating system name and the major
version number concatenated together.
Alternatively, if you want to build something for the front-end,
you can specify the front-end target processor. The processor for a login node
on Edison is "Sandy bridge" so we specify on the command line like so:
.. code-block:: console
$ spack install zlib target=sandybridge
And the front-end operating system is:
.. code-block:: console
$ spack install zlib os=SuSE11
^^^^^^^^^^^^^^^^^^^^^^^
Cray compiler detection
^^^^^^^^^^^^^^^^^^^^^^^
Spack can detect compilers using two methods. For the front-end, we treat
everything the same. The difference lies in back-end compiler detection.
Back-end compiler detection is made via the Tcl module avail command.
Once it detects the compiler it writes the appropriate PrgEnv and compiler
module name to compilers.yaml and sets the paths to each compiler with Cray\'s
compiler wrapper names (i.e. cc, CC, ftn). During build time, Spack will load
the correct PrgEnv and compiler module and will call appropriate wrapper.
The compilers.yaml config file will also differ. There is a
modules section that is filled with the compiler's Programming Environment
and module name. On other systems, this field is empty []:
.. code-block:: yaml
- compiler:
modules:
- PrgEnv-intel
- intel/15.0.109
As mentioned earlier, the compiler paths will look different on a Cray system.
Since most compilers are invoked using cc, CC and ftn, the paths for each
compiler are replaced with their respective Cray compiler wrapper names:
.. code-block:: yaml
paths:
cc: cc
cxx: CC
f77: ftn
fc: ftn
As opposed to an explicit path to the compiler executable. This allows Spack
to call the Cray compiler wrappers during build time.
For more on compiler configuration, check out :ref:`compiler-config`.
Spack sets the default Cray link type to dynamic, to better match other
other platforms. Individual packages can enable static linking (which is the
default outside of Spack on cray systems) using the ``-static`` flag.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Setting defaults and using Cray modules
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
If you want to use default compilers for each PrgEnv and also be able
to load cray external modules, you will need to set up a ``packages.yaml``.
Here's an example of an external configuration for cray modules:
.. code-block:: yaml
packages:
mpich:
externals:
- spec: "mpich@7.3.1%gcc@5.2.0 arch=cray_xc-haswell-CNL10"
modules:
- cray-mpich
- spec: "mpich@7.3.1%intel@16.0.0.109 arch=cray_xc-haswell-CNL10"
modules:
- cray-mpich
all:
providers:
mpi: [mpich]
This tells Spack that for whatever package that depends on mpi, load the
cray-mpich module into the environment. You can then be able to use whatever
environment variables, libraries, etc, that are brought into the environment
via module load.
.. note::
For Cray-provided packages, it is best to use ``modules:`` instead of ``prefix:``
in ``packages.yaml``, because the Cray Programming Environment heavily relies on
modules (e.g., loading the ``cray-mpich`` module adds MPI libraries to the
compiler wrapper link line).
You can set the default compiler that Spack can use for each compiler type.
If you want to use the Cray defaults, then set them under ``all:`` in packages.yaml.
In the compiler field, set the compiler specs in your order of preference.
Whenever you build with that compiler type, Spack will concretize to that version.
Here is an example of a full packages.yaml used at NERSC
.. code-block:: yaml
packages:
mpich:
externals:
- spec: "mpich@7.3.1%gcc@5.2.0 arch=cray_xc-CNL10-ivybridge"
modules:
- cray-mpich
- spec: "mpich@7.3.1%intel@16.0.0.109 arch=cray_xc-SuSE11-ivybridge"
modules:
- cray-mpich
buildable: False
netcdf:
externals:
- spec: "netcdf@4.3.3.1%gcc@5.2.0 arch=cray_xc-CNL10-ivybridge"
modules:
- cray-netcdf
- spec: "netcdf@4.3.3.1%intel@16.0.0.109 arch=cray_xc-CNL10-ivybridge"
modules:
- cray-netcdf
buildable: False
hdf5:
externals:
- spec: "hdf5@1.8.14%gcc@5.2.0 arch=cray_xc-CNL10-ivybridge"
modules:
- cray-hdf5
- spec: "hdf5@1.8.14%intel@16.0.0.109 arch=cray_xc-CNL10-ivybridge"
modules:
- cray-hdf5
buildable: False
all:
compiler: [gcc@5.2.0, intel@16.0.0.109]
providers:
mpi: [mpich]
Here we tell spack that whenever we want to build with gcc use version 5.2.0 or
if we want to build with intel compilers, use version 16.0.0.109. We add a spec
for each compiler type for each cray modules. This ensures that for each
compiler on our system we can use that external module.
For more on external packages check out the section :ref:`sec-external-packages`.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Using Linux containers on Cray machines
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Spack uses environment variables particular to the Cray programming
environment to determine which systems are Cray platforms. These
environment variables may be propagated into containers that are not
using the Cray programming environment.
To ensure that Spack does not autodetect the Cray programming
environment, unset the environment variable ``MODULEPATH``. This
will cause Spack to treat a linux container on a Cray system as a base
linux distro.
.. _windows_support:
----------------

View File

@@ -2344,6 +2344,27 @@ you set ``parallel`` to ``False`` at the package level, then each call
to ``make()`` will be sequential by default, but packagers can call
``make(parallel=True)`` to override it.
Note that the ``--jobs`` option works out of the box for all standard
build systems. If you are using a non-standard build system instead, you
can use the variable ``make_jobs`` to extract the number of jobs specified
by the ``--jobs`` option:
.. code-block:: python
:emphasize-lines: 7, 11
:linenos:
class Xios(Package):
...
def install(self, spec, prefix):
...
options = [
...
'--jobs', str(make_jobs),
]
...
make_xios = Executable("./make_xios")
make_xios(*options)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Install-level build parallelism
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -5173,12 +5194,6 @@ installed executable. The check is implemented as follows:
reframe = Executable(self.prefix.bin.reframe)
reframe("-l")
.. warning::
The API for adding tests is not yet considered stable and may change
in future releases.
""""""""""""""""""""""""""""""""
Checking build-time test results
""""""""""""""""""""""""""""""""
@@ -5216,38 +5231,42 @@ be left in the build stage directory as illustrated below:
Stand-alone tests
^^^^^^^^^^^^^^^^^
While build-time tests are integrated with the build process, stand-alone
While build-time tests are integrated with the installation process, stand-alone
tests are expected to run days, weeks, even months after the software is
installed. The goal is to provide a mechanism for gaining confidence that
packages work as installed **and** *continue* to work as the underlying
software evolves. Packages can add and inherit stand-alone tests. The
`spack test`` command is used to manage stand-alone testing.
``spack test`` command is used for stand-alone testing.
.. note::
.. admonition:: Stand-alone test methods should complete within a few minutes.
Execution speed is important since these tests are intended to quickly
assess whether installed specs work on the system. Consequently, they
should run relatively quickly -- as in on the order of at most a few
minutes -- while ideally executing all, or at least key aspects of the
installed software.
assess whether installed specs work on the system. Spack cannot spare
resources for more extensive testing of packages included in CI stacks.
.. note::
Failing stand-alone tests indicate problems with the installation and,
therefore, there is no reason to proceed with more resource-intensive
tests until those have been investigated.
Passing stand-alone tests indicate that more thorough testing, such
as running extensive unit or regression tests, or tests that run at
scale can proceed without wasting resources on a problematic installation.
Consequently, stand-alone tests should run relatively quickly -- as in
on the order of at most a few minutes -- while testing at least key aspects
of the installed software. Save more extensive testing for other tools.
Tests are defined in the package using methods with names beginning ``test_``.
This allows Spack to support multiple independent checks, or parts. Files
needed for testing, such as source, data, and expected outputs, may be saved
from the build and or stored with the package in the repository. Regardless
of origin, these files are automatically copied to the spec's test stage
directory prior to execution of the test method(s). Spack also provides some
helper functions to facilitate processing.
directory prior to execution of the test method(s). Spack also provides helper
functions to facilitate common processing.
.. tip::
**The status of stand-alone tests can be used to guide follow-up testing efforts.**
Passing stand-alone tests justify performing more thorough testing, such
as running extensive unit or regression tests or tests that run at scale,
when available. These tests are outside of the scope of Spack packaging.
Failing stand-alone tests indicate problems with the installation and,
therefore, no reason to proceed with more resource-intensive tests until
the failures have been investigated.
.. _configure-test-stage:
@@ -5255,30 +5274,26 @@ helper functions to facilitate processing.
Configuring the test stage directory
""""""""""""""""""""""""""""""""""""
Stand-alone tests utilize a test stage directory for building, running,
and tracking results in the same way Spack uses a build stage directory.
The default test stage root directory, ``~/.spack/test``, is defined in
:ref:`etc/spack/defaults/config.yaml <config-yaml>`. This location is
customizable by adding or changing the ``test_stage`` path in the high-level
``config`` of the appropriate ``config.yaml`` file such that:
Stand-alone tests utilize a test stage directory to build, run, and track
tests in the same way Spack uses a build stage directory to install software.
The default test stage root directory, ``$HOME/.spack/test``, is defined in
:ref:`config.yaml <config-yaml>`. This location is customizable by adding or
changing the ``test_stage`` path such that:
.. code-block:: yaml
config:
test_stage: /path/to/test/stage
Packages can use the ``self.test_suite.stage`` property to access this setting.
Other package properties that provide access to spec-specific subdirectories
and files are described in :ref:`accessing staged files <accessing-files>`.
Packages can use the ``self.test_suite.stage`` property to access the path.
.. note::
.. admonition:: Each spec being tested has its own test stage directory.
The test stage path is the root directory for the **entire suite**.
In other words, it is the root directory for **all specs** being
tested by the ``spack test run`` command. Each spec gets its own
stage subdirectory. Use ``self.test_suite.test_dir_for_spec(self.spec)``
to access the spec-specific test stage directory.
The ``config:test_stage`` option is the path to the root of a
**test suite**'s stage directories.
Other package properties that provide paths to spec-specific subdirectories
and files are described in :ref:`accessing-files`.
.. _adding-standalone-tests:
@@ -5291,61 +5306,144 @@ Test recipes are defined in the package using methods with names beginning
Each method has access to the information Spack tracks on the package, such
as options, compilers, and dependencies, supporting the customization of tests
to the build. Standard python ``assert`` statements and other error reporting
mechanisms are available. Such exceptions are automatically caught and reported
mechanisms can be used. These exceptions are automatically caught and reported
as test failures.
Each test method is an implicit test part named by the method and whose
purpose is the method's docstring. Providing a purpose gives context for
aiding debugging. A test method may contain embedded test parts. Spack
outputs the test name and purpose prior to running each test method and
any embedded test parts. For example, ``MyPackage`` below provides two basic
examples of installation tests: ``test_always_fails`` and ``test_example``.
As the name indicates, the first always fails. The second simply runs the
installed example.
Each test method is an *implicit test part* named by the method. Its purpose
is the method's docstring. Providing a meaningful purpose for the test gives
context that can aid debugging. Spack outputs both the name and purpose at the
start of test execution so it's also important that the docstring/purpose be
brief.
.. tip::
We recommend naming test methods so it is clear *what* is being tested.
For example, if a test method is building and or running an executable
called ``example``, then call the method ``test_example``. This, together
with a similarly meaningful test purpose, will aid test comprehension,
debugging, and maintainability.
Stand-alone tests run in an environment that provides access to information
on the installed software, such as build options, dependencies, and compilers.
Build options and dependencies are accessed using the same spec checks used
by build recipes. Examples of checking :ref:`variant settings <variants>` and
:ref:`spec constraints <testing-specs>` can be found at the provided links.
.. admonition:: Spack automatically sets up the test stage directory and environment.
Spack automatically creates the test stage directory and copies
relevant files *prior to* running tests. It can also ensure build
dependencies are available **if** necessary.
The path to the test stage is configurable (see :ref:`configure-test-stage`).
Files that Spack knows to copy are those saved from the build (see
:ref:`cache_extra_test_sources`) and those added to the package repository
(see :ref:`cache_custom_files`).
Spack will use the value of the ``test_requires_compiler`` property to
determine whether it needs to also set up build dependencies (see
:ref:`test-build-tests`).
The ``MyPackage`` package below provides two basic test examples:
``test_example`` and ``test_example2``. The first runs the installed
``example`` and ensures its output contains an expected string. The second
runs ``example2`` without checking output so is only concerned with confirming
the executable runs successfully. If the installed spec is not expected to have
``example2``, then the check at the top of the method will raise a special
``SkipTest`` exception, which is captured to facilitate reporting skipped test
parts to tools like CDash.
.. code-block:: python
class MyPackage(Package):
...
def test_always_fails(self):
"""use assert to always fail"""
assert False
def test_example(self):
"""run installed example"""
"""ensure installed example works"""
expected = "Done."
example = which(self.prefix.bin.example)
example()
# Capture stdout and stderr from running the Executable
# and check that the expected output was produced.
out = example(output=str.split, error=str.split)
assert expected in out, f"Expected '{expected}' in the output"
def test_example2(self):
"""run installed example2"""
if self.spec.satisfies("@:1.0"):
# Raise SkipTest to ensure flagging the test as skipped for
# test reporting purposes.
raise SkipTest("Test is only available for v1.1 on")
example2 = which(self.prefix.bin.example2)
example2()
Output showing the identification of each test part after running the tests
is illustrated below.
.. code-block:: console
$ spack test run --alias mypackage mypackage@1.0
$ spack test run --alias mypackage mypackage@2.0
==> Spack test mypackage
...
$ spack test results -l mypackage
==> Results for test suite 'mypackage':
...
==> [2023-03-10-16:03:56.625204] test: test_always_fails: use assert to always fail
==> [2024-03-10-16:03:56.625439] test: test_example: ensure installed example works
...
FAILED
==> [2023-03-10-16:03:56.625439] test: test_example: run installed example
PASSED: MyPackage::test_example
==> [2024-03-10-16:03:56.625439] test: test_example2: run installed example2
...
PASSED
PASSED: MyPackage::test_example2
.. admonition:: Do NOT implement tests that must run in the installation prefix.
.. note::
Use of the package spec's installation prefix for building and running
tests is **strongly discouraged**. Doing so causes permission errors for
shared spack instances *and* facilities that install the software in
read-only file systems or directories.
If ``MyPackage`` were a recipe for a library, the tests should build
an example or test program that is then executed.
Instead, start these test methods by explicitly copying the needed files
from the installation prefix to the test stage directory. Note the test
stage directory is the current directory when the test is executed with
the ``spack test run`` command.
A test method can include test parts using the ``test_part`` context manager.
Each part is treated as an independent check to allow subsequent test parts
to execute even after a test part fails.
.. admonition:: Test methods for library packages should build test executables.
.. _test-part:
Stand-alone tests for library packages *should* build test executables
that utilize the *installed* library. Doing so ensures the tests follow
a similar build process that users of the library would follow.
For more information on how to do this, see :ref:`test-build-tests`.
.. tip::
If you want to see more examples from packages with stand-alone tests, run
``spack pkg grep "def\stest" | sed "s/\/package.py.*//g" | sort -u``
from the command line to get a list of the packages.
.. _adding-standalone-test-parts:
"""""""""""""""""""""""""""""
Adding stand-alone test parts
"""""""""""""""""""""""""""""
Sometimes dependencies between steps of a test lend themselves to being
broken into parts. Tracking the pass/fail status of each part may aid
debugging. Spack provides a ``test_part`` context manager for use within
test methods.
Each test part is independently run, tracked, and reported. Test parts are
executed in the order they appear. If one fails, subsequent test parts are
still performed even if they would also fail. This allows tools like CDash
to track and report the status of test parts across runs. The pass/fail status
of the enclosing test is derived from the statuses of the embedded test parts.
.. admonition:: Test method and test part names **must** be unique.

   Test results reporting requires that test methods and embedded test parts
   within a package have unique names.
The signature for ``test_part`` is:
@@ -5367,40 +5465,68 @@ where each argument has the following meaning:
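The signature itself falls inside the elided hunk above. For orientation
only, here is a plausible sketch consistent with the argument descriptions
that follow; the exact form is an assumption, not the verbatim source:

.. code-block:: python

   # Sketch only: a context manager wrapping one independently reported
   # check within a test method (exact signature is an assumption).
   def test_part(pkg, test_name, purpose, work_dir=None, verbose=False):
       ...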
* ``work_dir`` is the path to the directory in which the test will run.
  The default of ``None``, or ``"."``, corresponds to the spec's test
  stage (i.e., ``self.test_suite.test_dir_for_spec(self.spec)``).
.. admonition:: Start test part names with the name of the enclosing test.

   We **highly recommend** starting the names of test parts with the name
   of the enclosing test. Doing so helps with the comprehension, readability
   and debugging of test results.
Suppose ``MyPackage`` installs multiple executables that need to run in a
specific order since the outputs from one are inputs of others. Further suppose
we want to add an integration test that runs the executables in order. We can
accomplish this goal by implementing a stand-alone test method consisting of
test parts for each executable as follows:
.. code-block:: python
   class MyPackage(Package):
       ...

       def test_series(self):
           """run setup, perform, and report"""

           with test_part(self, "test_series_setup", purpose="setup operation"):
               exe = which(self.prefix.bin.setup)
               exe()

           with test_part(self, "test_series_run", purpose="perform operation"):
               exe = which(self.prefix.bin.run)
               exe()

           with test_part(self, "test_series_report", purpose="generate report"):
               exe = which(self.prefix.bin.report)
               exe()
The result is ``test_series`` runs the following executables in order: ``setup``,
``run``, and ``report``. In this case no options are passed to any of the
executables and no outputs from running them are checked. Consequently, the
implementation could be simplified with a for-loop as follows:
.. code-block:: python
   class MyPackage(Package):
       ...

       def test_series(self):
           """execute series setup, run, and report"""
           for exe, reason in [
               ("setup", "setup operation"),
               ("run", "perform operation"),
               ("report", "generate report"),
           ]:
               with test_part(self, f"test_series_{exe}", purpose=reason):
                   exe = which(self.prefix.bin.join(exe))
                   exe()
In both cases, since we're using a context manager, each test part in
``test_series`` will execute regardless of the status of the other test
parts.

Now let's look at the output from running the stand-alone tests where
the second test part, ``test_series_run``, fails.
.. code-block:: console
@@ -5410,50 +5536,68 @@ is illustrated below.
   $ spack test results -l mypackage
   ==> Results for test suite 'mypackage':
   ...
   ==> [2024-03-10-16:03:56.625204] test: test_series: execute series setup, run, and report
   ==> [2024-03-10-16:03:56.625439] test: test_series_setup: setup operation
   ...
   PASSED: MyPackage::test_series_setup
   ==> [2024-03-10-16:03:56.625555] test: test_series_run: perform operation
   ...
   FAILED: MyPackage::test_series_run
   ==> [2024-03-10-16:03:57.003456] test: test_series_report: generate report
   ...
   FAILED: MyPackage::test_series_report
   FAILED: MyPackage::test_series
   ...
Since test parts depended on the success of previous parts, we see that the
failure of one results in the failure of subsequent checks, and the overall
result of the test method, ``test_series``, is failure.
.. tip::

   Stand-alone tests run in an environment that provides access to information
   Spack has on how the software was built, such as build options, dependencies,
   and compilers. Build options and dependencies are accessed with the normal
   spec checks. Examples of checking :ref:`variant settings <variants>` and
   :ref:`spec constraints <testing-specs>` can be found at the provided links.

   Accessing compilers in stand-alone tests that are used by the build requires
   setting a package property as described :ref:`below <test-compilation>`.

   If you want to see more examples from packages using ``test_part``, run
   ``spack pkg grep "test_part(" | sed "s/\/package.py.*//g" | sort -u``
   from the command line to get a list of the packages.
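As the tip above notes, build options are available through normal spec
checks. A minimal sketch of such checks inside a test method follows; the
variant and dependency names here are hypothetical:

.. code-block:: python

   def test_build_specific(self):
       """exercise checks appropriate to how the package was built"""
       # Hypothetical variant: only meaningful if the package defines +mpi.
       if self.spec.satisfies("+mpi"):
           ...  # exercise the MPI-enabled code path

       # Hypothetical dependency constraint.
       if self.spec.satisfies("^python@3.10:"):
           ...  # rely on behavior only present with newer python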
.. _test-build-tests:
.. _test-compilation:

"""""""""""""""""""""""""""""""""""""
Building and running test executables
"""""""""""""""""""""""""""""""""""""

.. admonition:: Re-use build-time sources and (small) input data sets when possible.

   We **highly recommend** re-using build-time test sources and pared down
   input files for testing installed software. These files are easier
   to keep synchronized with software capabilities when they reside
   within the software's repository. More information on saving files from
   the installation process can be found at :ref:`cache_extra_test_sources`.

   If that is not possible, you can add test-related files to the package
   repository (see :ref:`cache_custom_files`). It will be important to
   remember to maintain them so they work across listed or supported versions
   of the package.

Packages that build libraries are good examples of cases where you'll want
to build test executables from the installed software before running them.
Doing so requires you to let Spack know it needs to load the package's
compiler configuration. This is accomplished by setting the package's
``test_requires_compiler`` property to ``True``.

.. admonition:: ``test_requires_compiler = True`` is required to build test executables.

   Setting the property to ``True`` ensures access to the compiler through
   canonical environment variables (e.g., ``CC``, ``CXX``, ``FC``, ``F77``).
   It also gives access to build dependencies like ``cmake`` through their
   ``spec objects`` (e.g., ``self.spec["cmake"].prefix.bin.cmake`` for the
   path or ``self.spec["cmake"].command`` for the ``Executable`` instance).

   Be sure to add the property at the top of the package class under other
   properties like the ``homepage``.

The example below, which ignores how ``cxx-example.cpp`` is acquired,
illustrates the basic process of compiling a test executable using the
installed library before running it.
.. code-block:: python
@@ -5477,28 +5621,22 @@ Below illustrates using this feature to compile an example.
           cxx_example = which(exe)
           cxx_example()
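Most of that example is elided by the hunk above; only its last two lines
survive. A minimal sketch of what such a method can look like, assuming
``import os`` at the top of ``package.py``, that ``cxx-example.cpp`` is
already in the current (test stage) directory, and that the library installs
its headers and libraries under ``self.prefix``:

.. code-block:: python

   class Mylibrary(Package):
       ...

       test_requires_compiler = True

       def test_cxx_example(self):
           """build and run cxx-example"""
           exe = "cxx-example"

           # CXX is set in the environment because
           # test_requires_compiler is True.
           cxx = which(os.environ["CXX"])
           cxx(
               f"-L{self.prefix.lib}",
               f"-I{self.prefix.include}",
               f"{exe}.cpp",
               "-o", exe,
           )

           cxx_example = which(exe)
           cxx_example()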
Typically the files used to build and/or run test executables are either
cached from the installation (see :ref:`cache_extra_test_sources`) or added
to the package repository (see :ref:`cache_custom_files`). There is nothing
preventing the use of both.
.. _cache_extra_test_sources:

""""""""""""""""""""""""""""""""""""
Saving build- and install-time files
""""""""""""""""""""""""""""""""""""

You can use the ``cache_extra_test_sources`` helper routine to copy
directories and/or files from the source build stage directory to the
package's installation directory. Spack will automatically copy these
files for you when it sets up the test stage directory and before it
begins running the tests.
The signature for ``cache_extra_test_sources`` is:
@@ -5513,46 +5651,69 @@ where each argument has the following meaning:
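The signature itself is elided by the hunk above. As an assumption based on
the argument description below, it takes the package followed by the paths
to cache:

.. code-block:: python

   # Sketch only: exact parameter names and annotations are assumptions.
   def cache_extra_test_sources(pkg, srcs):
       ...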
* ``srcs`` is a string *or* a list of strings corresponding to the
  paths of subdirectories and/or files needed for stand-alone testing.

.. warning::

   Paths provided in the ``srcs`` argument **must be relative** to the
   staged source directory. They will be copied to the equivalent relative
   location under the test stage directory prior to test execution.

Contents of subdirectories and files are copied to a special test cache
subdirectory of the installation prefix. They are automatically copied to
the appropriate relative paths under the test stage directory prior to
executing stand-alone tests.
.. tip::

   *Perform test-related conversions once when copying files.*

   If one or more of the copied files needs to be modified to reference
   the installed software, it is recommended that those changes be made
   to the cached files **once** in the post-``install`` copy method
   **after** the call to ``cache_extra_test_sources``. This will reduce
   the amount of unnecessary work in the test method **and** avoid problems
   running stand-alone tests in shared instances and facility deployments.

   The ``filter_file`` function can be quite useful for such changes
   (see :ref:`file-filtering`).
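A sketch of that pattern follows. The cached ``examples/Makefile``, its
``PREFIX`` line, and the ``install_test_root`` helper used to locate the
cached copy under the prefix are all assumptions for illustration:

.. code-block:: python

   class MyLibPackage(MakefilePackage):
       ...

       @run_after("install")
       def copy_test_files(self):
           cache_extra_test_sources(self, "examples")

           # Convert the cached copy once, after caching, so every later
           # `spack test run` can use the file unmodified.
           # NOTE: install_test_root() is assumed here to return the root
           # of the package's test cache under the installation prefix.
           makefile = join_path(install_test_root(self), "examples", "Makefile")
           filter_file(r"^PREFIX\s*=.*", f"PREFIX = {self.prefix}", makefile)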
Below is a basic example of a test that relies on files from the installation.
This package method re-uses the contents of the ``examples`` subdirectory,
which is assumed to have all of the files implemented to allow ``make`` to
compile and link ``foo.c`` and ``bar.c`` against the package's installed
library.
.. code-block:: python
   class MyLibPackage(MakefilePackage):
       ...

       @run_after("install")
       def copy_test_files(self):
           cache_extra_test_sources(self, "examples")

       def test_example(self):
           """build and run the examples"""
           examples_dir = self.test_suite.current_test_cache_dir.examples
           with working_dir(examples_dir):
               make = which("make")
               make()

               for program in ["foo", "bar"]:
                   with test_part(
                       self,
                       f"test_example_{program}",
                       purpose=f"ensure {program} runs"
                   ):
                       exe = Executable(program)
                       exe()
In this case, ``copy_test_files`` copies the associated files from the
build stage to the package's test cache directory under the installation
prefix. Running ``spack test run`` for the package results in Spack copying
the directory and its contents to the test stage directory. The
``working_dir`` context manager ensures the commands within it are executed
from the ``examples_dir``. The test builds the software using ``make`` before
running each executable, ``foo`` and ``bar``, as independent test parts.
.. note::
@@ -5561,43 +5722,18 @@ before running the program.
   The key to copying files for stand-alone testing at build time is use
   of the ``run_after`` directive, which ensures the associated files are
   copied **after** the provided build stage (``install``) when the
   installation prefix **and** files are available.
The test method uses the path contained in the package's
``self.test_suite.current_test_cache_dir`` property for the root directory
of the copied files. In this case, that's the ``examples`` subdirectory.

.. note::

   While source and input files are generally recommended, binaries
   **may** also be cached by the build process. Only you, as the package
   writer or maintainer, know whether these files would be appropriate
   for testing the installed software weeks to months later.
.. tip::

   If you want to see more examples from packages that cache build files, run
   ``spack pkg grep cache_extra_test_sources | sed "s/\/package.py.*//g" | sort -u``
   from the command line to get a list of the packages.
.. _cache_custom_files:
@@ -5605,8 +5741,9 @@ the copy of each entry listed in ``srcs``, respectively:
Adding custom files
"""""""""""""""""""
Sometimes it is helpful or necessary to include custom files for building
and/or checking the results of tests as part of the package. Examples of the
types of files that might be useful are:

- test source files
- test input files
@@ -5614,17 +5751,15 @@ check the results of tests. Examples include:
- expected test outputs

While obtaining such files from the software repository is preferred (see
:ref:`cache_extra_test_sources`), there are circumstances where doing so is not
feasible such as when the software is not being actively maintained. When test
files cannot be obtained from the repository or there is a need to supplement
files that can, Spack supports the inclusion of additional files under the
``test`` subdirectory of the package in the Spack repository.

The following example assumes a ``custom-example.cpp`` is saved in the
``MyLibrary`` package's ``test`` subdirectory. It also assumes the program
simply needs to be compiled and linked against the installed ``MyLibrary``
software.
.. code-block:: python
@@ -5634,17 +5769,29 @@ property as shown below.
       test_requires_compiler = True
       ...

       def test_custom_example(self):
           """build and run custom-example"""
           src_dir = self.test_suite.current_test_data_dir
           exe = "custom-example"

           with working_dir(src_dir):
               cc = which(os.environ["CC"])
               cc(
                   f"-L{self.prefix.lib}",
                   f"-I{self.prefix.include}",
                   f"{exe}.cpp",
                   "-o", exe
               )

               custom_example = Executable(exe)
               custom_example()
In this case, ``spack test run`` for the package results in Spack copying
the contents of the ``test`` subdirectory to the test stage directory path
in ``self.test_suite.current_test_data_dir`` before calling
``test_custom_example``. Use of the ``working_dir`` context manager
ensures the commands to build and run the program are performed from
within the appropriate subdirectory of the test stage.
.. _expected_test_output_from_file:
@@ -5653,9 +5800,8 @@ Reading expected output from a file
"""""""""""""""""""""""""""""""""""
The helper function ``get_escaped_text_output`` is available for packages
to retrieve properly formatted text from a file potentially containing
special characters.
The signature for ``get_escaped_text_output`` is:
@@ -5665,10 +5811,13 @@ The signature for ``get_escaped_text_output`` is:
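The signature is elided by the hunk above. A plausible sketch, inferred from
the description below (an assumption, not the verbatim source):

.. code-block:: python

   # Sketch only: returns the file's lines with regex-special characters
   # escaped so they can be used as search patterns.
   def get_escaped_text_output(filename):
       ...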
where ``filename`` is the path to the file containing the expected output.
The path provided to ``filename`` for one of the copied
:ref:`custom files <cache_custom_files>` is rooted at
``self.test_suite.current_test_data_dir``.

The example below shows how to reference both the custom database
(``packages.db``) and expected output (``dump.out``) files Spack copies
to the test stage:
.. code-block:: python
@@ -5690,8 +5839,9 @@ added to the package's ``test`` subdirectory.
       for exp in expected:
           assert re.search(exp, out), f"Expected '{exp}' in output"
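Only the tail of that example survives the hunk above. A sketch of the full
method under stated assumptions: the installed tool that dumps the table
(here a hypothetical ``exampledb`` executable in ``self.prefix.bin``) and
its arguments are invented for illustration:

.. code-block:: python

   def test_example(self):
       """check example table dump"""
       test_data_dir = self.test_suite.current_test_data_dir
       db_filename = test_data_dir.join("packages.db")

       # Hypothetical installed tool; capture its output for comparison.
       exampledb = which(self.prefix.bin.exampledb)
       out = exampledb("dump", db_filename, output=str.split, error=str.split)

       expected = get_escaped_text_output(test_data_dir.join("dump.out"))
       for exp in expected:
           assert re.search(exp, out), f"Expected '{exp}' in output"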
If the files were instead cached from installing the software, the paths to the
two files would be found under the ``self.test_suite.current_test_cache_dir``
directory as shown below:
.. code-block:: python
@@ -5699,17 +5849,24 @@ source code, the path would be obtained as shown below.
"""check example table dump"""
test_cache_dir = self.test_suite.current_test_cache_dir
db_filename = test_cache_dir.join("packages.db")
..
expected = get_escaped_text_output(test_cache_dir.join("dump.out"))
...
Alternatively, if both files had been installed by the software into the
``share/tests`` subdirectory of the installation prefix, the paths to the
two files would be referenced as follows:
.. code-block:: python
   def test_example(self):
       """check example table dump"""
       db_filename = self.prefix.share.tests.join("packages.db")
       ...
       expected = get_escaped_text_output(
           self.prefix.share.tests.join("dump.out")
       )
       ...
.. _check_outputs:
@@ -5717,9 +5874,9 @@ follows:
Comparing expected to actual outputs
""""""""""""""""""""""""""""""""""""
The ``check_outputs`` helper routine is available for packages to ensure
multiple expected outputs from running an executable are contained within
the actual outputs.
The signature for ``check_outputs`` is:
@@ -5745,11 +5902,17 @@ Invoking the method is the equivalent of:
   if errors:
       raise RuntimeError("\n ".join(errors))
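The first half of that equivalence is elided by the hunk above. A sketch of
the whole, assuming ``expected`` is a list of (escaped) patterns and
``actual`` is the captured output string:

.. code-block:: python

   # Sketch of what invoking check_outputs(expected, actual) amounts to.
   errors = [
       f"Expected '{exp}' in output '{actual}'"
       for exp in expected
       if not re.search(exp, actual)
   ]
   if errors:
       raise RuntimeError("\n ".join(errors))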
.. tip::

   If you want to see more examples from packages that use this helper, run
   ``spack pkg grep check_outputs | sed "s/\/package.py.*//g" | sort -u``
   from the command line to get a list of the packages.
.. _accessing-files:
"""""""""""""""""""""""""""""""""""""""""
Finding package- and test-related files
"""""""""""""""""""""""""""""""""""""""""
You may need to access files from one or more locations when writing
@@ -5758,8 +5921,7 @@ include test source files or includes them but has no way to build the
executables using the installed headers and libraries. In these cases
you may need to reference the files relative to one or more root directories.
The table below lists relevant path properties and provides additional
examples of their use. See :ref:`expected_test_output_from_file` for
examples of accessing files saved from the software repository, package
repository, and installation.
@@ -5788,7 +5950,6 @@ repository, and installation.
- ``self.test_suite.current_test_data_dir``
- ``join_path(self.test_suite.current_test_data_dir, "hello.f90")``
.. _inheriting-tests:
""""""""""""""""""""""""""""
@@ -5831,7 +5992,7 @@ maintainers provide additional stand-alone tests customized to the package.
.. warning::
Any package that implements a test method with the same name as an
inherited method will override the inherited method. If that is not the
goal and you are not explicitly calling and adding functionality to
the inherited method for the test, then make sure that all test methods
and embedded test parts have unique test names.
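A sketch of the pitfall, using a hypothetical base class that ships a
``test_smoke`` method:

.. code-block:: python

   class SomeBasePackage(Package):
       def test_smoke(self):
           """base smoke check"""

   class MyPackage(SomeBasePackage):
       # Same name: this definition replaces the inherited test entirely.
       def test_smoke(self):
           """package-specific smoke check"""
           # Call the inherited check explicitly if both should run.
           super().test_smoke()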
@@ -5996,6 +6157,8 @@ running:
This is already part of the boilerplate for packages created with
``spack create``.
.. _file-filtering:
^^^^^^^^^^^^^^^^^^^
Filtering functions
^^^^^^^^^^^^^^^^^^^

View File

@@ -253,17 +253,6 @@ can easily happen if it is not updated frequently, this behavior ensures that
spack has a way to know for certain about the status of any concrete spec on
the remote mirror, but can slow down pipeline generation significantly.
The ``--optimize`` argument is experimental and runs the generated pipeline
document through a series of optimization passes designed to reduce the size
of the generated file.
The ``--dependencies`` is also experimental and disables what in Gitlab is
referred to as DAG scheduling, internally using the ``dependencies`` keyword
rather than ``needs`` to list dependency jobs. The drawback of using this option
is that before any job can begin, all jobs in previous stages must first
complete. The benefit is that Gitlab allows more dependencies to be listed
when using ``dependencies`` instead of ``needs``.
The optional ``--output-file`` argument should be an absolute path (including
file name) to the generated pipeline, and if not given, the default is
``./.gitlab-ci.yml``.

View File

@@ -1,13 +1,13 @@
sphinx==7.2.6
sphinxcontrib-programoutput==0.17
sphinx_design==0.5.0
sphinx_design==0.6.0
sphinx-rtd-theme==2.0.0
python-levenshtein==0.25.1
docutils==0.20.1
pygments==2.18.0
urllib3==2.2.1
pytest==8.2.1
urllib3==2.2.2
pytest==8.2.2
isort==5.13.2
black==24.4.2
flake8==7.0.0
mypy==1.10.0
flake8==7.1.0
mypy==1.10.1

View File

@@ -766,7 +766,6 @@ def copy_tree(
src: str,
dest: str,
symlinks: bool = True,
allow_broken_symlinks: bool = sys.platform != "win32",
ignore: Optional[Callable[[str], bool]] = None,
_permissions: bool = False,
):
@@ -789,8 +788,6 @@ def copy_tree(
src (str): the directory to copy
dest (str): the destination directory
symlinks (bool): whether or not to preserve symlinks
allow_broken_symlinks (bool): whether or not to allow broken (dangling) symlinks,
On Windows, setting this to True will raise an exception. Defaults to true on unix.
ignore (typing.Callable): function indicating which files to ignore
_permissions (bool): for internal use only
@@ -798,8 +795,6 @@ def copy_tree(
IOError: if *src* does not match any files or directories
ValueError: if *src* is a parent directory of *dest*
"""
if allow_broken_symlinks and sys.platform == "win32":
raise llnl.util.symlink.SymlinkError("Cannot allow broken symlinks on Windows!")
if _permissions:
tty.debug("Installing {0} to {1}".format(src, dest))
else:
@@ -872,16 +867,14 @@ def escaped_path(path):
copy_mode(s, d)
for target, d, s in links:
symlink(target, d, allow_broken_symlinks=allow_broken_symlinks)
symlink(target, d)
if _permissions:
set_install_permissions(d)
copy_mode(s, d)
@system_path_filter
def install_tree(
src, dest, symlinks=True, ignore=None, allow_broken_symlinks=sys.platform != "win32"
):
def install_tree(src, dest, symlinks=True, ignore=None):
"""Recursively install an entire directory tree rooted at *src*.
Same as :py:func:`copy_tree` with the addition of setting proper
@@ -892,21 +885,12 @@ def install_tree(
dest (str): the destination directory
symlinks (bool): whether or not to preserve symlinks
ignore (typing.Callable): function indicating which files to ignore
allow_broken_symlinks (bool): whether or not to allow broken (dangling) symlinks,
On Windows, setting this to True will raise an exception.
Raises:
IOError: if *src* does not match any files or directories
ValueError: if *src* is a parent directory of *dest*
"""
copy_tree(
src,
dest,
symlinks=symlinks,
allow_broken_symlinks=allow_broken_symlinks,
ignore=ignore,
_permissions=True,
)
copy_tree(src, dest, symlinks=symlinks, ignore=ignore, _permissions=True)
@system_path_filter

View File

@@ -8,6 +8,7 @@
import subprocess
import sys
import tempfile
from typing import Union
from llnl.util import lang, tty
@@ -16,92 +17,66 @@
if sys.platform == "win32":
from win32file import CreateHardLink
is_windows = sys.platform == "win32"
def _windows_symlink(
src: str, dst: str, target_is_directory: bool = False, *, dir_fd: Union[int, None] = None
):
"""On Windows with System Administrator privileges this will be a normal symbolic link via
os.symlink. On Windows without privileges the link will be a junction for a directory and a
hardlink for a file. On Windows the various link types are:
def symlink(source_path: str, link_path: str, allow_broken_symlinks: bool = not is_windows):
"""
Create a link.
Symbolic Link: A link to a file or directory on the same or different volume (drive letter) or
even to a remote file or directory (using UNC in its path). Need System Administrator
privileges to make these.
On non-Windows and Windows with System Administrator
privleges this will be a normal symbolic link via
os.symlink.
Hard Link: A link to a file on the same volume (drive letter) only. Every file (file's data)
has at least 1 hard link (file's name). But when this method creates a new hard link there will
be 2. Deleting all hard links effectively deletes the file. Don't need System Administrator
privileges.
On Windows without privledges the link will be a
junction for a directory and a hardlink for a file.
On Windows the various link types are:
Symbolic Link: A link to a file or directory on the
same or different volume (drive letter) or even to
a remote file or directory (using UNC in its path).
Need System Administrator privileges to make these.
Hard Link: A link to a file on the same volume (drive
letter) only. Every file (file's data) has at least 1
hard link (file's name). But when this method creates
a new hard link there will be 2. Deleting all hard
links effectively deletes the file. Don't need System
Administrator privileges.
Junction: A link to a directory on the same or different
volume (drive letter) but not to a remote directory. Don't
need System Administrator privileges.
Parameters:
source_path (str): The real file or directory that the link points to.
Must be absolute OR relative to the link.
link_path (str): The path where the link will exist.
allow_broken_symlinks (bool): On Linux or Mac, don't raise an exception if the source_path
doesn't exist. This will still raise an exception on Windows.
"""
source_path = os.path.normpath(source_path)
Junction: A link to a directory on the same or different volume (drive letter) but not to a
remote directory. Don't need System Administrator privileges."""
source_path = os.path.normpath(src)
win_source_path = source_path
link_path = os.path.normpath(link_path)
link_path = os.path.normpath(dst)
# Never allow broken links on Windows.
if sys.platform == "win32" and allow_broken_symlinks:
raise ValueError("allow_broken_symlinks parameter cannot be True on Windows.")
# Perform basic checks to make sure symlinking will succeed
if os.path.lexists(link_path):
raise AlreadyExistsError(f"Link path ({link_path}) already exists. Cannot create link.")
if not allow_broken_symlinks:
# Perform basic checks to make sure symlinking will succeed
if os.path.lexists(link_path):
raise AlreadyExistsError(
f"Link path ({link_path}) already exists. Cannot create link."
if not os.path.exists(source_path):
if os.path.isabs(source_path):
# An absolute source path that does not exist will result in a broken link.
raise SymlinkError(
f"Source path ({source_path}) is absolute but does not exist. Resulting "
f"link would be broken so not making link."
)
if not os.path.exists(source_path):
if os.path.isabs(source_path) and not allow_broken_symlinks:
# An absolute source path that does not exist will result in a broken link.
raise SymlinkError(
f"Source path ({source_path}) is absolute but does not exist. Resulting "
f"link would be broken so not making link."
)
else:
# os.symlink can create a link when the given source path is relative to
# the link path. Emulate this behavior and check to see if the source exists
# relative to the link path ahead of link creation to prevent broken
# links from being made.
link_parent_dir = os.path.dirname(link_path)
relative_path = os.path.join(link_parent_dir, source_path)
if os.path.exists(relative_path):
# In order to work on windows, the source path needs to be modified to be
# relative because hardlink/junction dont resolve relative paths the same
# way as os.symlink. This is ignored on other operating systems.
win_source_path = relative_path
else:
# os.symlink can create a link when the given source path is relative to
# the link path. Emulate this behavior and check to see if the source exists
# relative to the link path ahead of link creation to prevent broken
# links from being made.
link_parent_dir = os.path.dirname(link_path)
relative_path = os.path.join(link_parent_dir, source_path)
if os.path.exists(relative_path):
# In order to work on windows, the source path needs to be modified to be
# relative because hardlink/junction dont resolve relative paths the same
# way as os.symlink. This is ignored on other operating systems.
win_source_path = relative_path
elif not allow_broken_symlinks:
raise SymlinkError(
f"The source path ({source_path}) is not relative to the link path "
f"({link_path}). Resulting link would be broken so not making link."
)
raise SymlinkError(
f"The source path ({source_path}) is not relative to the link path "
f"({link_path}). Resulting link would be broken so not making link."
)
# Create the symlink
if sys.platform == "win32" and not _windows_can_symlink():
if not _windows_can_symlink():
_windows_create_link(win_source_path, link_path)
else:
os.symlink(source_path, link_path, target_is_directory=os.path.isdir(source_path))
def islink(path: str) -> bool:
def _windows_islink(path: str) -> bool:
"""Override os.islink to give correct answer for spack logic.
For Non-Windows: a link can be determined with the os.path.islink method.
@@ -269,7 +244,7 @@ def _windows_create_hard_link(path: str, link: str):
CreateHardLink(link, path)
def readlink(path: str, *, dir_fd=None):
def _windows_readlink(path: str, *, dir_fd=None):
"""Spack utility to override of os.readlink method to work cross platform"""
if _windows_is_hardlink(path):
return _windows_read_hard_link(path)
@@ -338,6 +313,16 @@ def resolve_link_target_relative_to_the_link(link):
return os.path.join(link_dir, target)
if sys.platform == "win32":
symlink = _windows_symlink
readlink = _windows_readlink
islink = _windows_islink
else:
symlink = os.symlink
readlink = os.readlink
islink = os.path.islink
class SymlinkError(RuntimeError):
"""Exception class for errors raised while creating symlinks,
junctions and hard links

View File

@@ -33,8 +33,23 @@
pass
esc, bell, lbracket, bslash, newline = r"\x1b", r"\x07", r"\[", r"\\", r"\n"
# Ansi Control Sequence Introducers (CSI) are a well-defined format
# Standard ECMA-48: Control Functions for Character-Imaging I/O Devices, section 5.4
# https://www.ecma-international.org/wp-content/uploads/ECMA-48_5th_edition_june_1991.pdf
csi_pre = f"{esc}{lbracket}"
csi_param, csi_inter, csi_post = r"[0-?]", r"[ -/]", r"[@-~]"
ansi_csi = f"{csi_pre}{csi_param}*{csi_inter}*{csi_post}"
# General ansi escape sequences have well-defined prefixes,
# but content and suffixes are less reliable.
# Conservatively assume they end with either "<ESC>\" or "<BELL>",
# with no intervening "<ESC>"/"<BELL>" keys or newlines
esc_pre = f"{esc}[@-_]"
esc_content = f"[^{esc}{bell}{newline}]"
esc_post = f"(?:{esc}{bslash}|{bell})"
ansi_esc = f"{esc_pre}{esc_content}*{esc_post}"
# Use this to strip escape sequences
_escape = re.compile(r"\x1b[^m]*m|\x1b\[?1034h|\x1b\][0-9]+;[^\x07]*\x07")
_escape = re.compile(f"{ansi_csi}|{ansi_esc}")
# control characters for enabling/disabling echo
#

View File

@@ -791,7 +791,7 @@ def check_virtual_with_variants(spec, msg):
return
error = error_cls(
f"{pkg_name}: {msg}",
f"remove variants from '{spec}' in depends_on directive in {filename}",
[f"remove variants from '{spec}' in depends_on directive in {filename}"],
)
errors.append(error)

View File

@@ -213,15 +213,18 @@ def _root_spec(spec_str: str) -> str:
Args:
spec_str: spec to be bootstrapped. Must be without compiler and target.
"""
# Add a compiler requirement to the root spec.
# Add a compiler and platform requirement to the root spec.
platform = str(spack.platforms.host())
if platform == "darwin":
spec_str += " %apple-clang"
elif platform == "windows":
spec_str += " %msvc"
elif platform == "linux":
spec_str += " %gcc"
elif platform == "freebsd":
spec_str += " %clang"
spec_str += f" platform={platform}"
target = archspec.cpu.host().family
spec_str += f" target={target}"

View File

@@ -129,10 +129,10 @@ def _bootstrap_config_scopes() -> Sequence["spack.config.ConfigScope"]:
configuration_paths = (spack.config.CONFIGURATION_DEFAULTS_PATH, ("bootstrap", _config_path()))
for name, path in configuration_paths:
platform = spack.platforms.host().name
platform_scope = spack.config.ConfigScope(
"/".join([name, platform]), os.path.join(path, platform)
platform_scope = spack.config.DirectoryConfigScope(
f"{name}/{platform}", os.path.join(path, platform)
)
generic_scope = spack.config.ConfigScope(name, path)
generic_scope = spack.config.DirectoryConfigScope(name, path)
config_scopes.extend([generic_scope, platform_scope])
msg = "[BOOTSTRAP CONFIG SCOPE] name={0}, path={1}"
tty.debug(msg.format(generic_scope.name, generic_scope.path))

View File

@@ -72,6 +72,7 @@
import spack.store
import spack.subprocess_context
import spack.user_environment
import spack.util.executable
import spack.util.path
import spack.util.pattern
from spack import traverse
@@ -91,7 +92,7 @@
)
from spack.util.executable import Executable
from spack.util.log_parse import make_log_context, parse_log_events
from spack.util.module_cmd import load_module, module, path_from_modules
from spack.util.module_cmd import load_module, path_from_modules
#
# This can be set by the user to globally disable parallel builds.
@@ -190,14 +191,6 @@ def __call__(self, *args, **kwargs):
return super().__call__(*args, **kwargs)
def _on_cray():
host_platform = spack.platforms.host()
host_os = host_platform.operating_system("default_os")
on_cray = str(host_platform) == "cray"
using_cnl = re.match(r"cnl\d+", str(host_os))
return on_cray, using_cnl
def clean_environment():
# Stuff in here sanitizes the build environment to eliminate
# anything the user has set that may interfere. We apply it immediately
@@ -241,17 +234,6 @@ def clean_environment():
if varname.endswith("_ROOT") and varname != "SPACK_ROOT":
env.unset(varname)
# On Cray "cluster" systems, unset CRAY_LD_LIBRARY_PATH to avoid
# interference with Spack dependencies.
# CNL requires these variables to be set (or at least some of them,
# depending on the CNL version).
on_cray, using_cnl = _on_cray()
if on_cray and not using_cnl:
env.unset("CRAY_LD_LIBRARY_PATH")
for varname in os.environ.keys():
if "PKGCONF" in varname:
env.unset(varname)
# Unset the following variables because they can affect installation of
# Autotools and CMake packages.
build_system_vars = [
@@ -381,11 +363,7 @@ def set_compiler_environment_variables(pkg, env):
_add_werror_handling(keep_werror, env)
# Set the target parameters that the compiler will add
# Don't set on cray platform because the targeting module handles this
if spec.satisfies("platform=cray"):
isa_arg = ""
else:
isa_arg = spec.architecture.target.optimization_flags(compiler)
isa_arg = spec.architecture.target.optimization_flags(compiler)
env.set("SPACK_TARGET_ARGS", isa_arg)
# Trap spack-tracked compiler flags as appropriate.
@@ -481,10 +459,7 @@ def set_wrapper_variables(pkg, env):
# Find ccache binary and hand it to build environment
if spack.config.get("config:ccache"):
ccache = Executable("ccache")
if not ccache:
raise RuntimeError("No ccache binary found in PATH")
env.set(SPACK_CCACHE_BINARY, ccache)
env.set(SPACK_CCACHE_BINARY, spack.util.executable.which_string("ccache", required=True))
# Gather information about various types of dependencies
link_deps = set(pkg.spec.traverse(root=False, deptype=("link")))
@@ -763,7 +738,9 @@ def get_rpaths(pkg):
# Second module is our compiler mod name. We use that to get rpaths from
# module show output.
if pkg.compiler.modules and len(pkg.compiler.modules) > 1:
rpaths.append(path_from_modules([pkg.compiler.modules[1]]))
mod_rpath = path_from_modules([pkg.compiler.modules[1]])
if mod_rpath:
rpaths.append(mod_rpath)
return list(dedupe(filter_system_paths(rpaths)))
@@ -833,14 +810,6 @@ def setup_package(pkg, dirty, context: Context = Context.BUILD):
for mod in pkg.compiler.modules:
load_module(mod)
# kludge to handle cray mpich and libsci being automatically loaded by
# PrgEnv modules on cray platform. Module unload does no damage when
# unnecessary
on_cray, _ = _on_cray()
if on_cray and not dirty:
for mod in ["cray-mpich", "cray-libsci"]:
module("unload", mod)
if target and target.module_name:
load_module(target.module_name)

View File

@@ -162,7 +162,9 @@ def initconfig_compiler_entries(self):
ld_flags = " ".join(flags["ldflags"])
ld_format_string = "CMAKE_{0}_LINKER_FLAGS"
# CMake has separate linker arguments for types of builds.
for ld_type in ["EXE", "MODULE", "SHARED", "STATIC"]:
# 'ldflags' should not be used with CMAKE_STATIC_LINKER_FLAGS which
# is used by the archiver, so don't include "STATIC" in this loop:
for ld_type in ["EXE", "MODULE", "SHARED"]:
ld_string = ld_format_string.format(ld_type)
entries.append(cmake_cache_string(ld_string, ld_flags))

View File

@@ -110,9 +110,8 @@ def cuda_flags(arch_list):
# From the NVIDIA install guide we know of conflicts for particular
# platforms (linux, darwin), architectures (x86, powerpc) and compilers
# (gcc, clang). We don't restrict %gcc and %clang conflicts to
# platform=linux, since they should also apply to platform=cray, and may
# apply to platform=darwin. We currently do not provide conflicts for
# platform=darwin with %apple-clang.
# platform=linux, since they may apply to platform=darwin. We currently
# do not provide conflicts for platform=darwin with %apple-clang.
# Linux x86_64 compiler conflicts from here:
# https://gist.github.com/ax3l/9489132
@@ -137,14 +136,14 @@ def cuda_flags(arch_list):
conflicts("%gcc@11.2:", when="+cuda ^cuda@:11.5")
conflicts("%gcc@12:", when="+cuda ^cuda@:11.8")
conflicts("%gcc@13:", when="+cuda ^cuda@:12.3")
conflicts("%gcc@14:", when="+cuda ^cuda@:12.4")
conflicts("%gcc@14:", when="+cuda ^cuda@:12.5")
conflicts("%clang@12:", when="+cuda ^cuda@:11.4.0")
conflicts("%clang@13:", when="+cuda ^cuda@:11.5")
conflicts("%clang@14:", when="+cuda ^cuda@:11.7")
conflicts("%clang@15:", when="+cuda ^cuda@:12.0")
conflicts("%clang@16:", when="+cuda ^cuda@:12.1")
conflicts("%clang@17:", when="+cuda ^cuda@:12.3")
conflicts("%clang@18:", when="+cuda ^cuda@:12.4")
conflicts("%clang@18:", when="+cuda ^cuda@:12.5")
# https://gist.github.com/ax3l/9489132#gistcomment-3860114
conflicts("%gcc@10", when="+cuda ^cuda@:11.4.0")

View File

@@ -846,6 +846,7 @@ def scalapack_libs(self):
"^mpich@2:" in spec_root
or "^cray-mpich" in spec_root
or "^mvapich2" in spec_root
or "^mvapich" in spec_root
or "^intel-mpi" in spec_root
or "^intel-oneapi-mpi" in spec_root
or "^intel-parallel-studio" in spec_root
@@ -936,32 +937,15 @@ def mpi_setup_dependent_build_environment(self, env, dependent_spec, compilers_o
"I_MPI_ROOT": self.normalize_path("mpi"),
}
# CAUTION - SIMILAR code in:
# var/spack/repos/builtin/packages/mpich/package.py
# var/spack/repos/builtin/packages/openmpi/package.py
# var/spack/repos/builtin/packages/mvapich2/package.py
#
# On Cray, the regular compiler wrappers *are* the MPI wrappers.
if "platform=cray" in self.spec:
# TODO: Confirm
wrapper_vars.update(
{
"MPICC": compilers_of_client["CC"],
"MPICXX": compilers_of_client["CXX"],
"MPIF77": compilers_of_client["F77"],
"MPIF90": compilers_of_client["F90"],
}
)
else:
compiler_wrapper_commands = self.mpi_compiler_wrappers
wrapper_vars.update(
{
"MPICC": compiler_wrapper_commands["MPICC"],
"MPICXX": compiler_wrapper_commands["MPICXX"],
"MPIF77": compiler_wrapper_commands["MPIF77"],
"MPIF90": compiler_wrapper_commands["MPIF90"],
}
)
compiler_wrapper_commands = self.mpi_compiler_wrappers
wrapper_vars.update(
{
"MPICC": compiler_wrapper_commands["MPICC"],
"MPICXX": compiler_wrapper_commands["MPICXX"],
"MPIF77": compiler_wrapper_commands["MPIF77"],
"MPIF90": compiler_wrapper_commands["MPIF90"],
}
)
# Ensure that the directory containing the compiler wrappers is in the
# PATH. Spack packages add `prefix.bin` to their dependents' paths,

View File

@@ -24,7 +24,6 @@ class MSBuildPackage(spack.package_base.PackageBase):
build_system("msbuild")
conflicts("platform=linux", when="build_system=msbuild")
conflicts("platform=darwin", when="build_system=msbuild")
conflicts("platform=cray", when="build_system=msbuild")
@spack.builder.builder("msbuild")

View File

@@ -24,7 +24,6 @@ class NMakePackage(spack.package_base.PackageBase):
build_system("nmake")
conflicts("platform=linux", when="build_system=nmake")
conflicts("platform=darwin", when="build_system=nmake")
conflicts("platform=cray", when="build_system=nmake")
@spack.builder.builder("nmake")

View File

@@ -36,9 +36,8 @@ class IntelOneApiPackage(Package):
"target=ppc64:",
"target=ppc64le:",
"target=aarch64:",
"platform=darwin:",
"platform=cray:",
"platform=windows:",
"platform=darwin",
"platform=windows",
]:
conflicts(c, msg="This package is only available for x86_64 and Linux")

View File

@@ -34,6 +34,8 @@ def _misc_cache():
return spack.util.file_cache.FileCache(path)
FileCacheType = Union[spack.util.file_cache.FileCache, llnl.util.lang.Singleton]
#: Spack's cache for small data
MISC_CACHE: Union[spack.util.file_cache.FileCache, llnl.util.lang.Singleton] = (
llnl.util.lang.Singleton(_misc_cache)

View File

@@ -22,6 +22,8 @@
from urllib.parse import urlencode
from urllib.request import HTTPHandler, Request, build_opener
import ruamel.yaml
import llnl.util.filesystem as fs
import llnl.util.tty as tty
from llnl.util.lang import memoized
@@ -551,10 +553,9 @@ def generate_gitlab_ci_yaml(
env,
print_summary,
output_file,
*,
prune_dag=False,
check_index_only=False,
run_optimizer=False,
use_dependencies=False,
artifacts_root=None,
remote_mirror_override=None,
):
@@ -575,12 +576,6 @@ def generate_gitlab_ci_yaml(
this mode results in faster yaml generation time). Otherwise, also
check each spec directly by url (useful if there is no index or it
might be out of date).
run_optimizer (bool): If True, post-process the generated yaml to try
try to reduce the size (attempts to collect repeated configuration
and replace with definitions).)
use_dependencies (bool): If true, use "dependencies" rather than "needs"
("needs" allows DAG scheduling). Useful if gitlab instance cannot
be configured to handle more than a few "needs" per job.
artifacts_root (str): Path where artifacts like logs, environment
files (spack.yaml, spack.lock), etc should be written. GitLab
requires this to be within the project directory.
@@ -814,7 +809,8 @@ def ensure_expected_target_path(path):
cli_scopes = [
os.path.relpath(s.path, concrete_env_dir)
for s in cfg.scopes().values()
if isinstance(s, cfg.ImmutableConfigScope)
if not s.writable
and isinstance(s, (cfg.DirectoryConfigScope))
and s.path not in env_includes
and os.path.exists(s.path)
]
@@ -1271,17 +1267,6 @@ def main_script_replacements(cmd):
with open(copy_specs_file, "w") as fd:
fd.write(json.dumps(buildcache_copies))
# TODO(opadron): remove this or refactor
if run_optimizer:
import spack.ci_optimization as ci_opt
output_object = ci_opt.optimizer(output_object)
# TODO(opadron): remove this or refactor
if use_dependencies:
import spack.ci_needs_workaround as cinw
output_object = cinw.needs_to_dependencies(output_object)
else:
# No jobs were generated
noop_job = spack_ci_ir["jobs"]["noop"]["attributes"]
@@ -1310,8 +1295,11 @@ def main_script_replacements(cmd):
if not rebuild_everything:
sys.exit(1)
with open(output_file, "w") as outf:
outf.write(syaml.dump(sorted_output, default_flow_style=True))
# Minimize yaml output size through use of anchors
syaml.anchorify(sorted_output)
with open(output_file, "w") as f:
ruamel.yaml.YAML().dump(sorted_output, f)
def _url_encode_string(input_string):

View File

@@ -1,34 +0,0 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import collections.abc
get_job_name = lambda needs_entry: (
needs_entry.get("job")
if (isinstance(needs_entry, collections.abc.Mapping) and needs_entry.get("artifacts", True))
else needs_entry if isinstance(needs_entry, str) else None
)
def convert_job(job_entry):
if not isinstance(job_entry, collections.abc.Mapping):
return job_entry
needs = job_entry.get("needs")
if needs is None:
return job_entry
new_job = {}
new_job.update(job_entry)
del new_job["needs"]
new_job["dependencies"] = list(
filter((lambda x: x is not None), (get_job_name(needs_entry) for needs_entry in needs))
)
return new_job
def needs_to_dependencies(yaml):
return dict((k, convert_job(v)) for k, v in yaml.items())

View File

@@ -1,363 +0,0 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import collections
import collections.abc
import copy
import hashlib
import spack.util.spack_yaml as syaml
def sort_yaml_obj(obj):
if isinstance(obj, collections.abc.Mapping):
return syaml.syaml_dict(
(k, sort_yaml_obj(v)) for k, v in sorted(obj.items(), key=(lambda item: str(item[0])))
)
if isinstance(obj, collections.abc.Sequence) and not isinstance(obj, str):
return syaml.syaml_list(sort_yaml_obj(x) for x in obj)
return obj
def matches(obj, proto):
"""Returns True if the test object "obj" matches the prototype object
"proto".
If obj and proto are mappings, obj matches proto if (key in obj) and
(obj[key] matches proto[key]) for every key in proto.
If obj and proto are sequences, obj matches proto if they are of the same
length and (a matches b) for every (a,b) in zip(obj, proto).
Otherwise, obj matches proto if obj == proto.
Precondition: proto must not have any reference cycles
"""
if isinstance(obj, collections.abc.Mapping):
if not isinstance(proto, collections.abc.Mapping):
return False
return all((key in obj and matches(obj[key], val)) for key, val in proto.items())
if isinstance(obj, collections.abc.Sequence) and not isinstance(obj, str):
if not (isinstance(proto, collections.abc.Sequence) and not isinstance(proto, str)):
return False
if len(obj) != len(proto):
return False
return all(matches(obj[index], val) for index, val in enumerate(proto))
return obj == proto
def subkeys(obj, proto):
"""Returns the test mapping "obj" after factoring out the items it has in
common with the prototype mapping "proto".
Consider a recursive merge operation, merge(a, b) on mappings a and b, that
returns a mapping, m, whose keys are the union of the keys of a and b, and
for every such key, "k", its corresponding value is:
- merge(a[key], b[key]) if a[key] and b[key] are mappings, or
- b[key] if (key in b) and not matches(a[key], b[key]),
or
- a[key] otherwise
If obj and proto are mappings, the returned object is the smallest object,
"a", such that merge(a, proto) matches obj.
Otherwise, obj is returned.
"""
if not (
isinstance(obj, collections.abc.Mapping) and isinstance(proto, collections.abc.Mapping)
):
return obj
new_obj = {}
for key, value in obj.items():
if key not in proto:
new_obj[key] = value
continue
if matches(value, proto[key]) and matches(proto[key], value):
continue
if isinstance(value, collections.abc.Mapping):
new_obj[key] = subkeys(value, proto[key])
continue
new_obj[key] = value
return new_obj
def add_extends(yaml, key):
"""Modifies the given object "yaml" so that it includes an "extends" key
whose value features "key".
If "extends" is not in yaml, then yaml is modified such that
yaml["extends"] == key.
If yaml["extends"] is a str, then yaml is modified such that
yaml["extends"] == [yaml["extends"], key]
If yaml["extends"] is a list that does not include key, then key is
appended to the list.
Otherwise, yaml is left unchanged.
"""
has_key = "extends" in yaml
extends = yaml.get("extends")
if has_key and not isinstance(extends, (str, collections.abc.Sequence)):
return
if extends is None:
yaml["extends"] = key
return
if isinstance(extends, str):
if extends != key:
yaml["extends"] = [extends, key]
return
if key not in extends:
extends.append(key)
def common_subobject(yaml, sub):
"""Factor prototype object "sub" out of the values of mapping "yaml".
Consider a modified copy of yaml, "new", where for each key, "key" in yaml:
- If yaml[key] matches sub, then new[key] = subkeys(yaml[key], sub).
- Otherwise, new[key] = yaml[key].
If the above match criteria is not satisfied for any such key, then (yaml,
None) is returned. The yaml object is returned unchanged.
Otherwise, each matching value in new is modified as in
add_extends(new[key], common_key), and then new[common_key] is set to sub.
The common_key value is chosen such that it does not match any preexisting
key in new. In this case, (new, common_key) is returned.
"""
match_list = set(k for k, v in yaml.items() if matches(v, sub))
if not match_list:
return yaml, None
common_prefix = ".c"
common_index = 0
while True:
common_key = "".join((common_prefix, str(common_index)))
if common_key not in yaml:
break
common_index += 1
new_yaml = {}
for key, val in yaml.items():
new_yaml[key] = copy.deepcopy(val)
if not matches(val, sub):
continue
new_yaml[key] = subkeys(new_yaml[key], sub)
add_extends(new_yaml[key], common_key)
new_yaml[common_key] = sub
return new_yaml, common_key
def print_delta(name, old, new, applied=None):
delta = new - old
reldelta = (1000 * delta) // old
reldelta = (reldelta // 10, reldelta % 10)
if applied is None:
applied = new <= old
print(
"\n".join(
(
"{0} {1}:",
" before: {2: 10d}",
" after : {3: 10d}",
" delta : {4:+10d} ({5:=+3d}.{6}%)",
)
).format(name, ("+" if applied else "x"), old, new, delta, reldelta[0], reldelta[1])
)
def try_optimization_pass(name, yaml, optimization_pass, *args, **kwargs):
"""Try applying an optimization pass and return information about the
result
"name" is a string describing the nature of the pass. If it is a non-empty
string, summary statistics are also printed to stdout.
"yaml" is the object to apply the pass to.
"optimization_pass" is the function implementing the pass to be applied.
"args" and "kwargs" are the additional arguments to pass to optimization
pass. The pass is applied as
>>> (new_yaml, *other_results) = optimization_pass(yaml, *args, **kwargs)
The pass's results are greedily rejected if it does not modify the original
yaml document, or if it produces a yaml document that serializes to a
larger string.
Returns (new_yaml, yaml, applied, other_results) if applied, or
(yaml, new_yaml, applied, other_results) otherwise.
"""
result = optimization_pass(yaml, *args, **kwargs)
new_yaml, other_results = result[0], result[1:]
if new_yaml is yaml:
# pass was not applied
return (yaml, new_yaml, False, other_results)
pre_size = len(syaml.dump_config(sort_yaml_obj(yaml), default_flow_style=True))
post_size = len(syaml.dump_config(sort_yaml_obj(new_yaml), default_flow_style=True))
# pass makes the size worse: not applying
applied = post_size <= pre_size
if applied:
yaml, new_yaml = new_yaml, yaml
if name:
print_delta(name, pre_size, post_size, applied)
return (yaml, new_yaml, applied, other_results)
def build_histogram(iterator, key):
"""Builds a histogram of values given an iterable of mappings and a key.
For each mapping "m" with key "key" in iterator, the value m[key] is
considered.
Returns a list of tuples (hash, count, proportion, value), where
- "hash" is a sha1sum hash of the value.
- "count" is the number of occurences of values that hash to "hash".
- "proportion" is the proportion of all values considered above that
hash to "hash".
- "value" is one of the values considered above that hash to "hash".
Which value is chosen when multiple values hash to the same "hash" is
undefined.
The list is sorted in descending order by count, yielding the most
frequently occuring hashes first.
"""
buckets = collections.defaultdict(int)
values = {}
num_objects = 0
for obj in iterator:
num_objects += 1
try:
val = obj[key]
except (KeyError, TypeError):
continue
value_hash = hashlib.sha1()
value_hash.update(syaml.dump_config(sort_yaml_obj(val)).encode())
value_hash = value_hash.hexdigest()
buckets[value_hash] += 1
values[value_hash] = val
return [
(h, buckets[h], float(buckets[h]) / num_objects, values[h])
for h in sorted(buckets.keys(), key=lambda k: -buckets[k])
]
def optimizer(yaml):
original_size = len(syaml.dump_config(sort_yaml_obj(yaml), default_flow_style=True))
# try factoring out commonly repeated portions
common_job = {
"variables": {"SPACK_COMPILER_ACTION": "NONE"},
"after_script": ['rm -rf "./spack"'],
"artifacts": {"paths": ["jobs_scratch_dir", "cdash_report"], "when": "always"},
}
# look for a list of tags that appear frequently
_, count, proportion, tags = next(iter(build_histogram(yaml.values(), "tags")), (None,) * 4)
# If a list of tags is found, and more than one job uses it,
# *and* the jobs that do use it represent at least 70% of all jobs, then
# add the list to the prototype object.
if tags and count > 1 and proportion >= 0.70:
common_job["tags"] = tags
# apply common object factorization
yaml, other, applied, rest = try_optimization_pass(
"general common object factorization", yaml, common_subobject, common_job
)
# look for a common script, and try factoring that out
_, count, proportion, script = next(
iter(build_histogram(yaml.values(), "script")), (None,) * 4
)
if script and count > 1 and proportion >= 0.70:
yaml, other, applied, rest = try_optimization_pass(
"script factorization", yaml, common_subobject, {"script": script}
)
# look for a common before_script, and try factoring that out
_, count, proportion, script = next(
iter(build_histogram(yaml.values(), "before_script")), (None,) * 4
)
if script and count > 1 and proportion >= 0.70:
yaml, other, applied, rest = try_optimization_pass(
"before_script factorization", yaml, common_subobject, {"before_script": script}
)
# Look specifically for the SPACK_ROOT_SPEC environment variables.
# Try to factor them out.
h = build_histogram(
(getattr(val, "get", lambda *args: {})("variables") for val in yaml.values()),
"SPACK_ROOT_SPEC",
)
# In this case, we try to factor out *all* instances of the SPACK_ROOT_SPEC
# environment variable; not just the one that appears with the greatest
# frequency. We only require that more than 1 job uses a given instance's
# value, because we expect the value to be very large, and so expect even
# few-to-one factorizations to yield large space savings.
counter = 0
for _, count, proportion, spec in h:
if count <= 1:
continue
counter += 1
yaml, other, applied, rest = try_optimization_pass(
"SPACK_ROOT_SPEC factorization ({count})".format(count=counter),
yaml,
common_subobject,
{"variables": {"SPACK_ROOT_SPEC": spec}},
)
new_size = len(syaml.dump_config(sort_yaml_obj(yaml), default_flow_style=True))
print("\n")
print_delta("overall summary", original_size, new_size)
print("\n")
return yaml
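The net effect of these passes is to hoist repeated sub-objects into a shared prototype job that the remaining jobs reference through GitLab CI's `extends` keyword. A hand-written before/after sketch (job names and the `.common` key are invented; `common_subobject` chooses its own key):

# before: every job repeats the same tags and script
jobs = {
    "build-zlib": {"tags": ["spack"], "script": ["spack ci rebuild"]},
    "build-cmake": {"tags": ["spack"], "script": ["spack ci rebuild"]},
}

# after: one prototype, referenced via `extends`; serialized size shrinks
# roughly in proportion to how many jobs share the prototype
factored = {
    ".common": {"tags": ["spack"], "script": ["spack ci rebuild"]},
    "build-zlib": {"extends": [".common"]},
    "build-cmake": {"extends": [".common"]},
}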

View File

@@ -444,7 +444,7 @@ def format_list(specs):
def filter_loaded_specs(specs):
"""Filter a list of specs returning only those that are
currently loaded."""
hashes = os.environ.get(uenv.spack_loaded_hashes_var, "").split(":")
hashes = os.environ.get(uenv.spack_loaded_hashes_var, "").split(os.pathsep)
return [x for x in specs if x.dag_hash() in hashes]
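Note the switch from a hard-coded ":" to `os.pathsep`, which keeps the hash list round-trippable on Windows, where PATH-style lists are ";"-separated. A quick check:

import os

# os.pathsep is ":" on POSIX and ";" on Windows
loaded = os.pathsep.join(["abc123", "def456"])
print(loaded.split(os.pathsep))  # ['abc123', 'def456'] on either platform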

View File

@@ -165,7 +165,7 @@ def _reset(args):
if not ok_to_continue:
raise RuntimeError("Aborting")
for scope in spack.config.CONFIG.file_scopes:
for scope in spack.config.CONFIG.writable_scopes:
# The default scope should stay untouched
if scope.name == "defaults":
continue

View File

@@ -6,6 +6,7 @@
import json
import os
import shutil
import warnings
from urllib.parse import urlparse, urlunparse
import llnl.util.filesystem as fs
@@ -73,7 +74,7 @@ def setup_parser(subparser):
"--optimize",
action="store_true",
default=False,
help="(experimental) optimize the gitlab yaml file for size\n\n"
help="(DEPRECATED) optimize the gitlab yaml file for size\n\n"
"run the generated document through a series of optimization passes "
"designed to reduce the size of the generated file",
)
@@ -81,7 +82,7 @@ def setup_parser(subparser):
"--dependencies",
action="store_true",
default=False,
help="(experimental) disable DAG scheduling (use 'plain' dependencies)",
help="(DEPRECATED) disable DAG scheduling (use 'plain' dependencies)",
)
generate.add_argument(
"--buildcache-destination",
@@ -200,6 +201,18 @@ def ci_generate(args):
before invoking this command. the value must be the CDash authorization token needed to create
a build group and register all generated jobs under it
"""
if args.optimize:
warnings.warn(
"The --optimize option has been deprecated, and currently has no effect. "
"It will be removed in Spack v0.24."
)
if args.dependencies:
warnings.warn(
"The --dependencies option has been deprecated, and currently has no effect. "
"It will be removed in Spack v0.24."
)
env = spack.cmd.require_active_env(cmd_name="ci generate")
if args.copy_to:
@@ -212,8 +225,6 @@ def ci_generate(args):
output_file = args.output_file
copy_yaml_to = args.copy_to
run_optimizer = args.optimize
use_dependencies = args.dependencies
prune_dag = args.prune_dag
index_only = args.index_only
artifacts_root = args.artifacts_root
@@ -234,8 +245,6 @@ def ci_generate(args):
output_file,
prune_dag=prune_dag,
check_index_only=index_only,
run_optimizer=run_optimizer,
use_dependencies=use_dependencies,
artifacts_root=artifacts_root,
remote_mirror_override=buildcache_destination,
)
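Both deprecated flags are now accepted but ignored: the wiring above is removed and only the warnings remain. A minimal sketch of the same deprecation pattern in stock Python (the function and flag name are invented):

import warnings

def generate(optimize: bool = False) -> None:
    if optimize:
        # warn once, then ignore the flag entirely
        warnings.warn("--optimize is deprecated and has no effect", stacklevel=2)

generate(optimize=True)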

View File

@@ -106,7 +106,8 @@ def clean(parser, args):
# Then do the cleaning falling through the cases
if args.specs:
specs = spack.cmd.parse_specs(args.specs, concretize=True)
specs = spack.cmd.parse_specs(args.specs, concretize=False)
specs = list(spack.cmd.matching_spec_from_env(x) for x in specs)
for spec in specs:
msg = "Cleaning build stage [{0}]"
tty.msg(msg.format(spec.short_spec))

View File

@@ -3,6 +3,9 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import llnl.util.tty as tty
from llnl.string import plural
import spack.cmd
import spack.cmd.common.arguments
import spack.environment as ev
@@ -43,5 +46,9 @@ def concretize(parser, args):
with env.write_transaction():
concretized_specs = env.concretize(force=args.force, tests=tests)
if not args.quiet:
ev.display_specs(concretized_specs)
if concretized_specs:
tty.msg(f"Concretized {plural(len(concretized_specs), 'spec')}:")
ev.display_specs([concrete for _, concrete in concretized_specs])
else:
tty.msg("No new specs to concretize.")
env.write()

View File

@@ -264,7 +264,9 @@ def config_remove(args):
def _can_update_config_file(scope: spack.config.ConfigScope, cfg_file):
if isinstance(scope, spack.config.SingleFileScope):
return fs.can_access(cfg_file)
return fs.can_write_to_dir(scope.path) and fs.can_access(cfg_file)
elif isinstance(scope, spack.config.DirectoryConfigScope):
return fs.can_write_to_dir(scope.path) and fs.can_access(cfg_file)
return False
def _config_change_requires_scope(path, spec, scope, match_spec=None):
@@ -362,14 +364,11 @@ def config_change(args):
def config_update(args):
# Read the configuration files
spack.config.CONFIG.get_config(args.section, scope=args.scope)
updates: List[spack.config.ConfigScope] = list(
filter(
lambda s: not isinstance(
s, (spack.config.InternalConfigScope, spack.config.ImmutableConfigScope)
),
spack.config.CONFIG.format_updates[args.section],
)
)
updates: List[spack.config.ConfigScope] = [
x
for x in spack.config.CONFIG.format_updates[args.section]
if not isinstance(x, spack.config.InternalConfigScope) and x.writable
]
cannot_overwrite, skip_system_scope = [], False
for scope in updates:
@@ -447,7 +446,7 @@ def _can_revert_update(scope_dir, cfg_file, bkp_file):
def config_revert(args):
scopes = [args.scope] if args.scope else [x.name for x in spack.config.CONFIG.file_scopes]
scopes = [args.scope] if args.scope else [x.name for x in spack.config.CONFIG.writable_scopes]
# Search for backup files in the configuration scopes
Entry = collections.namedtuple("Entry", ["scope", "cfg", "bkp"])

View File

@@ -2,7 +2,6 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import re
import sys
@@ -934,7 +933,7 @@ def get_repository(args, name):
# Figure out where the new package should live
repo_path = args.repo
if repo_path is not None:
repo = spack.repo.Repo(repo_path)
repo = spack.repo.from_path(repo_path)
if spec.namespace and spec.namespace != repo.namespace:
tty.die(
"Can't create package with namespace {0} in repo with "

View File

@@ -9,6 +9,8 @@
import spack.cmd
import spack.config
import spack.fetch_strategy
import spack.repo
import spack.spec
import spack.util.path
import spack.version
@@ -69,13 +71,15 @@ def _retrieve_develop_source(spec, abspath):
# We construct a package class ourselves, rather than asking for
# Spec.package, since Spec only allows this when it is concrete
package = pkg_cls(spec)
if isinstance(package.stage[0].fetcher, spack.fetch_strategy.GitFetchStrategy):
package.stage[0].fetcher.get_full_repo = True
source_stage = package.stage[0]
if isinstance(source_stage.fetcher, spack.fetch_strategy.GitFetchStrategy):
source_stage.fetcher.get_full_repo = True
# If we retrieved this version before and cached it, we may have
# done so without cloning the full git repo; likewise, any
# mirror might store an instance with truncated history.
package.stage[0].disable_mirrors()
source_stage.disable_mirrors()
source_stage.fetcher.set_package(package)
package.stage.steal_source(abspath)

View File

@@ -3,6 +3,7 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import errno
import glob
import os
@@ -11,43 +12,13 @@
import spack.cmd
import spack.paths
import spack.repo
from spack.spec import Spec
from spack.util.editor import editor
import spack.util.editor
description = "open package files in $EDITOR"
section = "packaging"
level = "short"
def edit_package(name, repo_path, namespace):
"""Opens the requested package file in your favorite $EDITOR.
Args:
name (str): The name of the package
repo_path (str): The path to the repository containing this package
namespace (str): A valid namespace registered with Spack
"""
# Find the location of the package
if repo_path:
repo = spack.repo.Repo(repo_path)
elif namespace:
repo = spack.repo.PATH.get_repo(namespace)
else:
repo = spack.repo.PATH
path = repo.filename_for_package_name(name)
spec = Spec(name)
if os.path.exists(path):
if not os.path.isfile(path):
tty.die("Something is wrong. '{0}' is not a file!".format(path))
if not os.access(path, os.R_OK):
tty.die("Insufficient permissions on '%s'!" % path)
else:
raise spack.repo.UnknownPackageError(spec.name)
editor(path)
def setup_parser(subparser):
excl_args = subparser.add_mutually_exclusive_group()
@@ -98,41 +69,67 @@ def setup_parser(subparser):
excl_args.add_argument("-r", "--repo", default=None, help="path to repo to edit package in")
excl_args.add_argument("-N", "--namespace", default=None, help="namespace of package to edit")
subparser.add_argument("package", nargs="?", default=None, help="package name")
subparser.add_argument("package", nargs="*", default=None, help="package name")
def locate_package(name: str, repo: spack.repo.Repo) -> str:
path = repo.filename_for_package_name(name)
try:
with open(path, "r"):
return path
except OSError as e:
if e.errno == errno.ENOENT:
raise spack.repo.UnknownPackageError(name) from e
tty.die(f"Cannot edit package: {e}")
def locate_file(name: str, path: str) -> str:
# convert command names to python module name
if path == spack.paths.command_path:
name = spack.cmd.python_name(name)
file_path = os.path.join(path, name)
# Try to open direct match.
try:
with open(file_path, "r"):
return file_path
except OSError as e:
if e.errno != errno.ENOENT:
tty.die(f"Cannot edit file: {e}")
pass
# Otherwise try to find a file that starts with the name
candidates = glob.glob(file_path + "*")
exclude_list = [".pyc", "~"] # exclude binaries and backups
files = [f for f in candidates if not any(f.endswith(ext) for ext in exclude_list)]
if len(files) > 1:
tty.die(
f"Multiple files start with `{name}`:\n"
+ "\n".join(f" {os.path.basename(f)}" for f in files)
)
elif not files:
tty.die(f"No file for '{name}' was found in {path}")
return files[0]
def edit(parser, args):
name = args.package
# By default, edit package files
path = spack.paths.packages_path
names = args.package
# If `--command`, `--test`, or `--module` is chosen, edit those instead
if args.path:
path = args.path
if name:
# convert command names to python module name
if path == spack.paths.command_path:
name = spack.cmd.python_name(name)
path = os.path.join(path, name)
if not os.path.exists(path):
files = glob.glob(path + "*")
exclude_list = [".pyc", "~"] # exclude binaries and backups
files = list(filter(lambda x: all(s not in x for s in exclude_list), files))
if len(files) > 1:
m = "Multiple files exist with the name {0}.".format(name)
m += " Please specify a suffix. Files are:\n\n"
for f in files:
m += " " + os.path.basename(f) + "\n"
tty.die(m)
if not files:
tty.die("No file for '{0}' was found in {1}".format(name, path))
path = files[0] # already confirmed only one entry in files
editor(path)
elif name:
edit_package(name, args.repo, args.namespace)
paths = [locate_file(name, args.path) for name in names] if names else [args.path]
spack.util.editor.editor(*paths)
elif names:
if args.repo:
repo = spack.repo.from_path(args.repo)
elif args.namespace:
repo = spack.repo.PATH.get_repo(args.namespace)
else:
repo = spack.repo.PATH
paths = [locate_package(name, repo) for name in names]
spack.util.editor.editor(*paths)
else:
# By default open the directory where packages live
editor(path)
spack.util.editor.editor(spack.paths.packages_path)
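A self-contained sketch of the prefix-matching fallback used by `locate_file` above (paths are hypothetical; `.pyc` files and editor backups are filtered out before deciding whether the match is unique):

import glob
import os

def find_by_prefix(name: str, path: str) -> str:
    file_path = os.path.join(path, name)
    if os.path.isfile(file_path):
        return file_path  # exact match wins
    candidates = glob.glob(file_path + "*")
    files = [f for f in candidates if not f.endswith((".pyc", "~"))]
    if len(files) != 1:
        raise FileNotFoundError(f"expected one match for {name!r}, found {len(files)}")
    return files[0]

# e.g. find_by_prefix("conc", "/path/to/spack/cmd")  # hypothetical usage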

View File

@@ -56,7 +56,6 @@ def roots_from_environments(args, active_env):
# -e says "also preserve things needed by this particular env"
for env_name_or_dir in args.except_environment:
print("HMM", env_name_or_dir)
if ev.exists(env_name_or_dir):
env = ev.read(env_name_or_dir)
elif ev.is_env_dir(env_name_or_dir):

View File

@@ -50,7 +50,7 @@
@B{++}, @r{--}, @r{~~}, @B{==} propagate variants to package dependencies
architecture variants:
@m{platform=platform} linux, darwin, cray, etc.
@m{platform=platform} linux, darwin, freebsd, windows
@m{os=operating_system} specific <operating_system>
@m{target=target} specific <target> processor
@m{arch=platform-os-target} shortcut for all three above

View File

@@ -10,6 +10,7 @@
from typing import List
import llnl.util.filesystem as fs
from llnl.string import plural
from llnl.util import lang, tty
import spack.build_environment
@@ -61,7 +62,6 @@ def install_kwargs_from_args(args):
"dependencies_use_cache": cache_opt(args.use_cache, dep_use_bc),
"dependencies_cache_only": cache_opt(args.cache_only, dep_use_bc),
"include_build_deps": args.include_build_deps,
"explicit": True, # Use true as a default for install command
"stop_at": args.until,
"unsigned": args.unsigned,
"install_deps": ("dependencies" in args.things_to_install),
@@ -376,7 +376,9 @@ def _maybe_add_and_concretize(args, env, specs):
# `spack concretize`
tests = compute_tests_install_kwargs(env.user_specs, args.test)
concretized_specs = env.concretize(tests=tests)
ev.display_specs(concretized_specs)
if concretized_specs:
tty.msg(f"Concretized {plural(len(concretized_specs), 'spec')}")
ev.display_specs([concrete for _, concrete in concretized_specs])
# save view regeneration for later, so that we only do it
# once, as it can be slow.
@@ -473,6 +475,7 @@ def install_without_active_env(args, install_kwargs, reporter_factory):
require_user_confirmation_for_overwrite(concrete_specs, args)
install_kwargs["overwrite"] = [spec.dag_hash() for spec in concrete_specs]
installs = [(s.package, install_kwargs) for s in concrete_specs]
builder = PackageInstaller(installs)
installs = [s.package for s in concrete_specs]
install_kwargs["explicit"] = [s.dag_hash() for s in concrete_specs]
builder = PackageInstaller(installs, install_kwargs)
builder.install()

View File

@@ -91,7 +91,7 @@ def repo_add(args):
tty.die("Not a Spack repository: %s" % path)
# Make sure it's actually a spack repository by constructing it.
repo = spack.repo.Repo(canon_path)
repo = spack.repo.from_path(canon_path)
# If that succeeds, finally add it to the configuration.
repos = spack.config.get("repos", scope=args.scope)
@@ -124,7 +124,7 @@ def repo_remove(args):
# If it is a namespace, remove corresponding repo
for path in repos:
try:
repo = spack.repo.Repo(path)
repo = spack.repo.from_path(path)
if repo.namespace == namespace_or_path:
repos.remove(path)
spack.config.set("repos", repos, args.scope)
@@ -142,7 +142,7 @@ def repo_list(args):
repos = []
for r in roots:
try:
repos.append(spack.repo.Repo(r))
repos.append(spack.repo.from_path(r))
except spack.repo.RepoError:
continue

View File

@@ -114,15 +114,16 @@ def _process_result(result, show, required_format, kwargs):
# dump the solutions as concretized specs
if "solutions" in show:
for spec in result.specs:
# With -y, just print YAML to output.
if required_format == "yaml":
# use write because to_yaml already has a newline.
sys.stdout.write(spec.to_yaml(hash=ht.dag_hash))
elif required_format == "json":
sys.stdout.write(spec.to_json(hash=ht.dag_hash))
else:
sys.stdout.write(spec.tree(color=sys.stdout.isatty(), **kwargs))
if required_format:
for spec in result.specs:
# With -y, just print YAML to output.
if required_format == "yaml":
# use write because to_yaml already has a newline.
sys.stdout.write(spec.to_yaml(hash=ht.dag_hash))
elif required_format == "json":
sys.stdout.write(spec.to_json(hash=ht.dag_hash))
else:
sys.stdout.write(spack.spec.tree(result.specs, color=sys.stdout.isatty(), **kwargs))
print()
if result.unsolved_specs and "solutions" in show:

View File

@@ -105,11 +105,19 @@ def spec(parser, args):
if env:
env.concretize()
specs = env.concretized_specs()
# environments are printed together in a combined tree() invocation,
# except when using --yaml or --json, which we print spec by spec below.
if not args.format:
tree_kwargs["key"] = spack.traverse.by_dag_hash
tree_kwargs["hashes"] = args.long or args.very_long
print(spack.spec.tree([concrete for _, concrete in specs], **tree_kwargs))
return
else:
tty.die("spack spec requires at least one spec or an active environment")
for input, output in specs:
# With -y, just print YAML to output.
# With --yaml or --json, just print the raw specs to output
if args.format:
if args.format == "yaml":
# use write because to_yaml already has a newline.

View File

@@ -71,7 +71,7 @@ def unload(parser, args):
"Cannot specify specs on command line when unloading all specs with '--all'"
)
hashes = os.environ.get(uenv.spack_loaded_hashes_var, "").split(":")
hashes = os.environ.get(uenv.spack_loaded_hashes_var, "").split(os.pathsep)
if args.specs:
specs = [
spack.cmd.disambiguate_spec_from_hashes(spec, hashes)

View File

@@ -38,10 +38,10 @@
import spack.cmd
import spack.environment as ev
import spack.filesystem_view as fsv
import spack.schema.projections
import spack.store
from spack.config import validate
from spack.filesystem_view import YamlFilesystemView, view_func_parser
from spack.util import spack_yaml as s_yaml
description = "project packages to a compact naming scheme on the filesystem"
@@ -193,17 +193,13 @@ def view(parser, args):
ordered_projections = {}
# What method are we using for this view
if args.action in actions_link:
link_fn = view_func_parser(args.action)
else:
link_fn = view_func_parser("symlink")
view = YamlFilesystemView(
link_type = args.action if args.action in actions_link else "symlink"
view = fsv.YamlFilesystemView(
path,
spack.store.STORE.layout,
projections=ordered_projections,
ignore_conflicts=getattr(args, "ignore_conflicts", False),
link=link_fn,
link_type=link_type,
verbose=args.verbose,
)

View File

@@ -695,10 +695,6 @@ def compiler_environment(self):
try:
# load modules and set env variables
for module in self.modules:
# On cray, mic-knl module cannot be loaded without cce module
# See: https://github.com/spack/spack/issues/3153
if os.environ.get("CRAY_CPU_TARGET") == "mic-knl":
spack.util.module_cmd.load_module("cce")
spack.util.module_cmd.load_module(module)
# apply other compiler environment changes

View File

@@ -220,10 +220,10 @@ def _compiler_config_from_external(config):
operating_system = host_platform.operating_system("default_os")
target = host_platform.target("default_target").microarchitecture
else:
target = spec.target
target = spec.architecture.target
if not target:
host_platform = spack.platforms.host()
target = host_platform.target("default_target").microarchitecture
target = spack.platforms.host().target("default_target")
target = target.microarchitecture
operating_system = spec.os
if not operating_system:
@@ -260,7 +260,7 @@ def _init_compiler_config(
def compiler_config_files():
config_files = list()
config = spack.config.CONFIG
for scope in config.file_scopes:
for scope in config.writable_scopes:
name = scope.name
compiler_config = config.get("compilers", scope=name)
if compiler_config:

View File

@@ -35,7 +35,7 @@
import os
import re
import sys
from typing import Any, Callable, Dict, Generator, List, Optional, Tuple, Type, Union
from typing import Any, Callable, Dict, Generator, List, Optional, Tuple, Union
from llnl.util import filesystem, lang, tty
@@ -117,21 +117,39 @@
class ConfigScope:
"""This class represents a configuration scope.
def __init__(self, name: str) -> None:
self.name = name
self.writable = False
self.sections = syaml.syaml_dict()
A scope is one directory containing named configuration files.
Each file is a config "section" (e.g., mirrors, compilers, etc.).
"""
def get_section_filename(self, section: str) -> str:
raise NotImplementedError
def __init__(self, name, path) -> None:
self.name = name # scope name.
self.path = path # path to directory containing configs.
self.sections = syaml.syaml_dict() # sections read from config files.
def get_section(self, section: str) -> Optional[YamlConfigDict]:
raise NotImplementedError
def _write_section(self, section: str) -> None:
raise NotImplementedError
@property
def is_platform_dependent(self) -> bool:
"""Returns true if the scope name is platform specific"""
return os.sep in self.name
return False
def clear(self) -> None:
"""Empty cached config information."""
self.sections = syaml.syaml_dict()
def __repr__(self) -> str:
return f"<ConfigScope: {self.name}>"
class DirectoryConfigScope(ConfigScope):
"""Config scope backed by a directory containing one file per section."""
def __init__(self, name: str, path: str, *, writable: bool = True) -> None:
super().__init__(name)
self.path = path
self.writable = writable
def get_section_filename(self, section: str) -> str:
"""Returns the filename associated with a given section"""
@@ -148,6 +166,9 @@ def get_section(self, section: str) -> Optional[YamlConfigDict]:
return self.sections[section]
def _write_section(self, section: str) -> None:
if not self.writable:
raise ConfigError(f"Cannot write to immutable scope {self}")
filename = self.get_section_filename(section)
data = self.get_section(section)
if data is None:
@@ -164,19 +185,23 @@ def _write_section(self, section: str) -> None:
except (syaml.SpackYAMLError, OSError) as e:
raise ConfigFileError(f"cannot write to '{filename}'") from e
def clear(self) -> None:
"""Empty cached config information."""
self.sections = syaml.syaml_dict()
def __repr__(self) -> str:
return f"<ConfigScope: {self.name}: {self.path}>"
@property
def is_platform_dependent(self) -> bool:
"""Returns true if the scope name is platform specific"""
return "/" in self.name
class SingleFileScope(ConfigScope):
"""This class represents a configuration scope in a single YAML file."""
def __init__(
self, name: str, path: str, schema: YamlConfigDict, yaml_path: Optional[List[str]] = None
self,
name: str,
path: str,
schema: YamlConfigDict,
*,
yaml_path: Optional[List[str]] = None,
writable: bool = True,
) -> None:
"""Similar to ``ConfigScope`` but can be embedded in another schema.
@@ -195,15 +220,13 @@ def __init__(
config:
install_tree: $spack/opt/spack
"""
super().__init__(name, path)
super().__init__(name)
self._raw_data: Optional[YamlConfigDict] = None
self.schema = schema
self.path = path
self.writable = writable
self.yaml_path = yaml_path or []
@property
def is_platform_dependent(self) -> bool:
return False
def get_section_filename(self, section) -> str:
return self.path
@@ -257,6 +280,8 @@ def get_section(self, section: str) -> Optional[YamlConfigDict]:
return self.sections.get(section, None)
def _write_section(self, section: str) -> None:
if not self.writable:
raise ConfigError(f"Cannot write to immutable scope {self}")
data_to_write: Optional[YamlConfigDict] = self._raw_data
# If there is no existing data, this section SingleFileScope has never
@@ -301,19 +326,6 @@ def __repr__(self) -> str:
return f"<SingleFileScope: {self.name}: {self.path}>"
class ImmutableConfigScope(ConfigScope):
"""A configuration scope that cannot be written to.
This is used for ConfigScopes passed on the command line.
"""
def _write_section(self, section) -> None:
raise ConfigError(f"Cannot write to immutable scope {self}")
def __repr__(self) -> str:
return f"<ImmutableConfigScope: {self.name}: {self.path}>"
class InternalConfigScope(ConfigScope):
"""An internal configuration scope that is not persisted to a file.
@@ -323,7 +335,7 @@ class InternalConfigScope(ConfigScope):
"""
def __init__(self, name: str, data: Optional[YamlConfigDict] = None) -> None:
super().__init__(name, None)
super().__init__(name)
self.sections = syaml.syaml_dict()
if data is not None:
@@ -333,9 +345,6 @@ def __init__(self, name: str, data: Optional[YamlConfigDict] = None) -> None:
validate({section: dsec}, SECTION_SCHEMAS[section])
self.sections[section] = _mark_internal(syaml.syaml_dict({section: dsec}), name)
def get_section_filename(self, section: str) -> str:
raise NotImplementedError("Cannot get filename for InternalConfigScope.")
def get_section(self, section: str) -> Optional[YamlConfigDict]:
"""Just reads from an internal dictionary."""
if section not in self.sections:
@@ -440,27 +449,21 @@ def remove_scope(self, scope_name: str) -> Optional[ConfigScope]:
return scope
@property
def file_scopes(self) -> List[ConfigScope]:
"""List of writable scopes with an associated file."""
return [
s
for s in self.scopes.values()
if (type(s) is ConfigScope or type(s) is SingleFileScope)
]
def writable_scopes(self) -> Generator[ConfigScope, None, None]:
"""Generator of writable scopes with an associated file."""
return (s for s in self.scopes.values() if s.writable)
def highest_precedence_scope(self) -> ConfigScope:
"""Non-internal scope with highest precedence."""
return next(reversed(self.file_scopes))
"""Writable scope with highest precedence."""
return next(s for s in reversed(self.scopes.values()) if s.writable) # type: ignore
def highest_precedence_non_platform_scope(self) -> ConfigScope:
"""Non-internal non-platform scope with highest precedence
Platform-specific scopes are of the form scope/platform"""
generator = reversed(self.file_scopes)
highest = next(generator)
while highest and highest.is_platform_dependent:
highest = next(generator)
return highest
"""Writable non-platform scope with highest precedence"""
return next(
s
for s in reversed(self.scopes.values()) # type: ignore
if s.writable and not s.is_platform_dependent
)
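Both properties lean on dict insertion order: scopes are pushed from lowest to highest precedence, so `reversed(self.scopes.values())` (valid on dict views since Python 3.8) walks highest-precedence scopes first. A toy model:

from types import SimpleNamespace

# pushed lowest-precedence first, as Configuration does
scopes = {
    "defaults": SimpleNamespace(name="defaults", writable=False),
    "system": SimpleNamespace(name="system", writable=True),
    "user": SimpleNamespace(name="user", writable=True),
}

highest = next(s for s in reversed(scopes.values()) if s.writable)
print(highest.name)  # -> "user"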
def matching_scopes(self, reg_expr) -> List[ConfigScope]:
"""
@@ -755,13 +758,14 @@ def override(
def _add_platform_scope(
cfg: Union[Configuration, lang.Singleton], scope_type: Type[ConfigScope], name: str, path: str
cfg: Union[Configuration, lang.Singleton], name: str, path: str, writable: bool = True
) -> None:
"""Add a platform-specific subdirectory for the current platform."""
platform = spack.platforms.host().name
plat_name = os.path.join(name, platform)
plat_path = os.path.join(path, platform)
cfg.push_scope(scope_type(plat_name, plat_path))
scope = DirectoryConfigScope(
f"{name}/{platform}", os.path.join(path, platform), writable=writable
)
cfg.push_scope(scope)
def config_paths_from_entry_points() -> List[Tuple[str, str]]:
@@ -806,8 +810,8 @@ def _add_command_line_scopes(
# name based on order on the command line
name = f"cmd_scope_{i:d}"
cfg.push_scope(ImmutableConfigScope(name, path))
_add_platform_scope(cfg, ImmutableConfigScope, name, path)
cfg.push_scope(DirectoryConfigScope(name, path, writable=False))
_add_platform_scope(cfg, name, path, writable=False)
def create() -> Configuration:
@@ -851,10 +855,10 @@ def create() -> Configuration:
# add each scope and its platform-specific directory
for name, path in configuration_paths:
cfg.push_scope(ConfigScope(name, path))
cfg.push_scope(DirectoryConfigScope(name, path))
# Each scope can have per-platform overrides in subdirectories
_add_platform_scope(cfg, ConfigScope, name, path)
_add_platform_scope(cfg, name, path)
# add command-line scopes
_add_command_line_scopes(cfg, COMMAND_LINE_SCOPES)
@@ -969,7 +973,7 @@ def set(path: str, value: Any, scope: Optional[str] = None) -> None:
def add_default_platform_scope(platform: str) -> None:
plat_name = os.path.join("defaults", platform)
plat_path = os.path.join(CONFIGURATION_DEFAULTS_PATH[1], platform)
CONFIG.push_scope(ConfigScope(plat_name, plat_path))
CONFIG.push_scope(DirectoryConfigScope(plat_name, plat_path))
def scopes() -> Dict[str, ConfigScope]:
@@ -978,19 +982,10 @@ def scopes() -> Dict[str, ConfigScope]:
def writable_scopes() -> List[ConfigScope]:
"""
Return list of writable scopes. Higher-priority scopes come first in the
list.
"""
return list(
reversed(
list(
x
for x in CONFIG.scopes.values()
if not isinstance(x, (InternalConfigScope, ImmutableConfigScope))
)
)
)
"""Return list of writable scopes. Higher-priority scopes come first in the list."""
scopes = [x for x in CONFIG.scopes.values() if x.writable]
scopes.reverse()
return scopes
def writable_scope_names() -> List[str]:
@@ -1599,7 +1594,7 @@ def _config_from(scopes_or_paths: List[Union[ConfigScope, str]]) -> Configuratio
path = os.path.normpath(scope_or_path)
assert os.path.isdir(path), f'"{path}" must be a directory'
name = os.path.basename(path)
scopes.append(ConfigScope(name, path))
scopes.append(DirectoryConfigScope(name, path))
configuration = Configuration(*scopes)
return configuration

View File

@@ -78,24 +78,17 @@
"image": "quay.io/almalinuxorg/almalinux:8"
}
},
"centos:stream": {
"centos:stream9": {
"bootstrap": {
"template": "container/centos_stream.dockerfile",
"image": "quay.io/centos/centos:stream"
"template": "container/centos_stream9.dockerfile",
"image": "quay.io/centos/centos:stream9"
},
"os_package_manager": "dnf_epel",
"build": "spack/centos-stream",
"build": "spack/centos-stream9",
"final": {
"image": "quay.io/centos/centos:stream"
"image": "quay.io/centos/centos:stream9"
}
},
"centos:7": {
"bootstrap": {
"template": "container/centos_7.dockerfile"
},
"os_package_manager": "yum",
"build": "spack/centos7"
},
"opensuse/leap:15": {
"bootstrap": {
"template": "container/leap-15.dockerfile"

View File

@@ -24,12 +24,15 @@
from llnl.util.link_tree import ConflictingSpecsError
from llnl.util.symlink import readlink, symlink
import spack.caches
import spack.cmd
import spack.compilers
import spack.concretize
import spack.config
import spack.deptypes as dt
import spack.error
import spack.fetch_strategy
import spack.filesystem_view as fsv
import spack.hash_types as ht
import spack.hooks
import spack.main
@@ -52,7 +55,6 @@
import spack.util.url
import spack.version
from spack import traverse
from spack.filesystem_view import SimpleFilesystemView, inverse_view_func_parser, view_func_parser
from spack.installer import PackageInstaller
from spack.schema.env import TOP_LEVEL_KEY
from spack.spec import Spec
@@ -606,7 +608,7 @@ def __init__(
self.projections = projections
self.select = select
self.exclude = exclude
self.link_type = view_func_parser(link_type)
self.link_type = fsv.canonicalize_link_type(link_type)
self.link = link
def select_fn(self, spec):
@@ -640,7 +642,7 @@ def to_dict(self):
if self.exclude:
ret["exclude"] = self.exclude
if self.link_type:
ret["link_type"] = inverse_view_func_parser(self.link_type)
ret["link_type"] = self.link_type
if self.link != default_view_link:
ret["link"] = self.link
return ret
@@ -690,7 +692,7 @@ def get_projection_for_spec(self, spec):
to exist on the filesystem."""
return self._view(self.root).get_projection_for_spec(spec)
def view(self, new: Optional[str] = None) -> SimpleFilesystemView:
def view(self, new: Optional[str] = None) -> fsv.SimpleFilesystemView:
"""
Returns a view object for the *underlying* view directory. This means that the
self.root symlink is followed, and that the view has to exist on the filesystem
@@ -710,14 +712,14 @@ def view(self, new: Optional[str] = None) -> SimpleFilesystemView:
)
return self._view(path)
def _view(self, root: str) -> SimpleFilesystemView:
def _view(self, root: str) -> fsv.SimpleFilesystemView:
"""Returns a view object for a given root dir."""
return SimpleFilesystemView(
return fsv.SimpleFilesystemView(
root,
spack.store.STORE.layout,
ignore_conflicts=True,
projections=self.projections,
link=self.link_type,
link_type=self.link_type,
)
def __contains__(self, spec):
@@ -1948,13 +1950,19 @@ def install_specs(self, specs: Optional[List[Spec]] = None, **install_args):
specs = specs if specs is not None else roots
# Extend the set of specs to overwrite with modified dev specs and their parents
install_args["overwrite"] = (
install_args.get("overwrite", []) + self._dev_specs_that_need_overwrite()
overwrite: Set[str] = set()
overwrite.update(install_args.get("overwrite", []), self._dev_specs_that_need_overwrite())
install_args["overwrite"] = overwrite
explicit: Set[str] = set()
explicit.update(
install_args.get("explicit", []),
(s.dag_hash() for s in specs),
(s.dag_hash() for s in roots),
)
install_args["explicit"] = explicit
installs = [(spec.package, {**install_args, "explicit": spec in roots}) for spec in specs]
PackageInstaller(installs).install()
PackageInstaller([spec.package for spec in specs], install_args).install()
def all_specs_generator(self) -> Iterable[Spec]:
"""Returns a generator for all concrete specs"""
@@ -2467,27 +2475,21 @@ def _equiv_dict(first, second):
return same_values and same_keys_with_same_overrides
def display_specs(concretized_specs):
"""Displays the list of specs returned by `Environment.concretize()`.
def display_specs(specs):
"""Displays a list of specs traversed breadth-first, covering nodes, with install status.
Args:
concretized_specs (list): list of specs returned by
`Environment.concretize()`
specs (list): list of specs
"""
def _tree_to_display(spec):
return spec.tree(
recurse_dependencies=True,
format=spack.spec.DISPLAY_FORMAT,
status_fn=spack.spec.Spec.install_status,
hashlen=7,
hashes=True,
)
for user_spec, concrete_spec in concretized_specs:
tty.msg("Concretized {0}".format(user_spec))
sys.stdout.write(_tree_to_display(concrete_spec))
print("")
tree_string = spack.spec.tree(
specs,
format=spack.spec.DISPLAY_FORMAT,
hashes=True,
hashlen=7,
status_fn=spack.spec.Spec.install_status,
key=traverse.by_dag_hash,
)
print(tree_string)
def _concretize_from_constraints(spec_constraints, tests=False):
@@ -2541,7 +2543,7 @@ def _concretize_task(packed_arguments) -> Tuple[int, Spec, float]:
def make_repo_path(root):
"""Make a RepoPath from the repo subdirectories in an environment."""
path = spack.repo.RepoPath()
path = spack.repo.RepoPath(cache=spack.caches.MISC_CACHE)
if os.path.isdir(root):
for repo_root in os.listdir(root):
@@ -2550,7 +2552,7 @@ def make_repo_path(root):
if not os.path.isdir(repo_root):
continue
repo = spack.repo.Repo(repo_root)
repo = spack.repo.from_path(repo_root)
path.put_last(repo)
return path
@@ -3026,7 +3028,7 @@ def included_config_scopes(self) -> List[spack.config.ConfigScope]:
SpackEnvironmentError: if the manifest includes a remote file but
no configuration stage directory has been identified
"""
scopes = []
scopes: List[spack.config.ConfigScope] = []
# load config scopes added via 'include:', in reverse so that
# highest-precedence scopes are last.
@@ -3095,23 +3097,21 @@ def included_config_scopes(self) -> List[spack.config.ConfigScope]:
if os.path.isdir(config_path):
# directories are treated as regular ConfigScopes
config_name = "env:%s:%s" % (env_name, os.path.basename(config_path))
tty.debug("Creating ConfigScope {0} for '{1}'".format(config_name, config_path))
scope = spack.config.ConfigScope(config_name, config_path)
tty.debug(f"Creating DirectoryConfigScope {config_name} for '{config_path}'")
scopes.append(spack.config.DirectoryConfigScope(config_name, config_path))
elif os.path.exists(config_path):
# files are assumed to be SingleFileScopes
config_name = "env:%s:%s" % (env_name, config_path)
tty.debug(
"Creating SingleFileScope {0} for '{1}'".format(config_name, config_path)
)
scope = spack.config.SingleFileScope(
config_name, config_path, spack.schema.merged.schema
tty.debug(f"Creating SingleFileScope {config_name} for '{config_path}'")
scopes.append(
spack.config.SingleFileScope(
config_name, config_path, spack.schema.merged.schema
)
)
else:
missing.append(config_path)
continue
scopes.append(scope)
if missing:
msg = "Detected {0} missing include path(s):".format(len(missing))
msg += "\n {0}".format("\n ".join(missing))
@@ -3128,7 +3128,10 @@ def env_config_scopes(self) -> List[spack.config.ConfigScope]:
scopes: List[spack.config.ConfigScope] = [
*self.included_config_scopes,
spack.config.SingleFileScope(
self.scope_name, str(self.manifest_file), spack.schema.env.schema, [TOP_LEVEL_KEY]
self.scope_name,
str(self.manifest_file),
spack.schema.env.schema,
yaml_path=[TOP_LEVEL_KEY],
),
]
ensure_no_disallowed_env_config_mods(scopes)

View File

@@ -10,8 +10,9 @@
import shutil
import stat
import sys
from typing import Optional
from typing import Callable, Dict, Optional
from llnl.string import comma_or
from llnl.util import tty
from llnl.util.filesystem import (
mkdirp,
@@ -49,19 +50,20 @@
_projections_path = ".spack/projections.yaml"
def view_symlink(src, dst, **kwargs):
# keyword arguments are irrelevant
# here to fit required call signature
LinkCallbackType = Callable[[str, str, "FilesystemView", Optional["spack.spec.Spec"]], None]
def view_symlink(src: str, dst: str, *args, **kwargs) -> None:
symlink(src, dst)
def view_hardlink(src, dst, **kwargs):
# keyword arguments are irrelevant
# here to fit required call signature
def view_hardlink(src: str, dst: str, *args, **kwargs) -> None:
os.link(src, dst)
def view_copy(src: str, dst: str, view, spec: Optional[spack.spec.Spec] = None):
def view_copy(
src: str, dst: str, view: "FilesystemView", spec: Optional["spack.spec.Spec"] = None
) -> None:
"""
Copy a file from src to dst.
@@ -104,27 +106,40 @@ def view_copy(src: str, dst: str, view, spec: Optional[spack.spec.Spec] = None):
tty.debug(f"Can't change the permissions for {dst}")
def view_func_parser(parsed_name):
# What method are we using for this view
if parsed_name in ("hardlink", "hard"):
#: supported string values for `link_type` in an env, mapped to canonical values
_LINK_TYPES = {
"hardlink": "hardlink",
"hard": "hardlink",
"copy": "copy",
"relocate": "copy",
"add": "symlink",
"symlink": "symlink",
"soft": "symlink",
}
_VALID_LINK_TYPES = sorted(set(_LINK_TYPES.values()))
def canonicalize_link_type(link_type: str) -> str:
"""Return canonical"""
canonical = _LINK_TYPES.get(link_type)
if not canonical:
raise ValueError(
f"Invalid link type: '{link_type}. Must be one of {comma_or(_VALID_LINK_TYPES)}'"
)
return canonical
def function_for_link_type(link_type: str) -> LinkCallbackType:
link_type = canonicalize_link_type(link_type)
if link_type == "hardlink":
return view_hardlink
elif parsed_name in ("copy", "relocate"):
return view_copy
elif parsed_name in ("add", "symlink", "soft"):
elif link_type == "symlink":
return view_symlink
else:
raise ValueError(f"invalid link type for view: '{parsed_name}'")
elif link_type == "copy":
return view_copy
def inverse_view_func_parser(view_type):
# get string based on view type
if view_type is view_hardlink:
link_name = "hardlink"
elif view_type is view_copy:
link_name = "copy"
else:
link_name = "symlink"
return link_name
assert False, "invalid link type" # need mypy Literal values
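The mapping-based rewrite normalizes user aliases first, then dispatches on canonical values only. Roughly how it behaves (a sketch mirroring the `_LINK_TYPES` table above):

_LINK_TYPES = {"hardlink": "hardlink", "hard": "hardlink",
               "copy": "copy", "relocate": "copy",
               "add": "symlink", "symlink": "symlink", "soft": "symlink"}

print(_LINK_TYPES.get("hard"))      # -> "hardlink"
print(_LINK_TYPES.get("relocate"))  # -> "copy"
print(_LINK_TYPES.get("bogus"))     # -> None; the real code raises ValueError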
class FilesystemView:
@@ -140,7 +155,16 @@ class FilesystemView:
directory structure.
"""
def __init__(self, root, layout, **kwargs):
def __init__(
self,
root: str,
layout: "spack.directory_layout.DirectoryLayout",
*,
projections: Optional[Dict] = None,
ignore_conflicts: bool = False,
verbose: bool = False,
link_type: str = "symlink",
):
"""
Initialize a filesystem view under the given `root` directory with
corresponding directory `layout`.
@@ -149,15 +173,14 @@ def __init__(self, root, layout, **kwargs):
"""
self._root = root
self.layout = layout
self.projections = {} if projections is None else projections
self.projections = kwargs.get("projections", {})
self.ignore_conflicts = kwargs.get("ignore_conflicts", False)
self.verbose = kwargs.get("verbose", False)
self.ignore_conflicts = ignore_conflicts
self.verbose = verbose
# Setup link function to include view
link_func = kwargs.get("link", view_symlink)
self.link = ft.partial(link_func, view=self)
self.link_type = link_type
self.link = ft.partial(function_for_link_type(link_type), view=self)
def add_specs(self, *specs, **kwargs):
"""
@@ -255,8 +278,24 @@ class YamlFilesystemView(FilesystemView):
Filesystem view to work with a yaml based directory layout.
"""
def __init__(self, root, layout, **kwargs):
super().__init__(root, layout, **kwargs)
def __init__(
self,
root: str,
layout: "spack.directory_layout.DirectoryLayout",
*,
projections: Optional[Dict] = None,
ignore_conflicts: bool = False,
verbose: bool = False,
link_type: str = "symlink",
):
super().__init__(
root,
layout,
projections=projections,
ignore_conflicts=ignore_conflicts,
verbose=verbose,
link_type=link_type,
)
# Super class gets projections from the kwargs
# YAML specific to get projections from YAML file
@@ -638,9 +677,6 @@ class SimpleFilesystemView(FilesystemView):
"""A simple and partial implementation of FilesystemView focused on performance and immutable
views, where specs cannot be removed after they were added."""
def __init__(self, root, layout, **kwargs):
super().__init__(root, layout, **kwargs)
def _sanity_check_view_projection(self, specs):
"""A very common issue is that we end up with two specs of the same package, that project
to the same prefix. We want to catch that as early as possible and give a sensible error to

View File

@@ -13,7 +13,6 @@
import spack.config
import spack.relocate
from spack.util.elf import ElfParsingError, parse_elf
from spack.util.executable import Executable
def is_shared_library_elf(filepath):
@@ -141,7 +140,7 @@ def post_install(spec, explicit=None):
return
# Only enable on platforms using ELF.
if not spec.satisfies("platform=linux") and not spec.satisfies("platform=cray"):
if not spec.satisfies("platform=linux"):
return
# Disable this hook when bootstrapping, to avoid recursion.
@@ -149,10 +148,9 @@ def post_install(spec, explicit=None):
return
# Should failing to locate patchelf be a hard error?
patchelf_path = spack.relocate._patchelf()
if not patchelf_path:
patchelf = spack.relocate._patchelf()
if not patchelf:
return
patchelf = Executable(patchelf_path)
fixes = find_and_patch_sonames(spec.prefix, spec.package.non_bindable_shared_objects, patchelf)

View File

@@ -117,7 +117,7 @@ def post_install(spec, explicit=None):
return
# Only enable on platforms using ELF.
if not spec.satisfies("platform=linux") and not spec.satisfies("platform=cray"):
if not spec.satisfies("platform=linux"):
return
visit_directory_tree(spec.prefix, ElfFilesWithRPathVisitor())

View File

@@ -582,7 +582,7 @@ def dump_packages(spec: "spack.spec.Spec", path: str) -> None:
# Create a source repo and get the pkg directory out of it.
try:
source_repo = spack.repo.Repo(source_repo_root)
source_repo = spack.repo.from_path(source_repo_root)
source_pkg_dir = source_repo.dirname_for_package_name(node.name)
except spack.repo.RepoError as err:
tty.debug(f"Failed to create source repo for {node.name}: {str(err)}")
@@ -593,16 +593,14 @@ def dump_packages(spec: "spack.spec.Spec", path: str) -> None:
dest_repo_root = os.path.join(path, node.namespace)
if not os.path.exists(dest_repo_root):
spack.repo.create_repo(dest_repo_root)
repo = spack.repo.Repo(dest_repo_root)
repo = spack.repo.from_path(dest_repo_root)
# Get the location of the package in the dest repo.
dest_pkg_dir = repo.dirname_for_package_name(node.name)
if node is spec:
spack.repo.PATH.dump_provenance(node, dest_pkg_dir)
elif source_pkg_dir:
fs.install_tree(
source_pkg_dir, dest_pkg_dir, allow_broken_symlinks=(sys.platform != "win32")
)
fs.install_tree(source_pkg_dir, dest_pkg_dir)
def get_dependent_ids(spec: "spack.spec.Spec") -> List[str]:
@@ -761,12 +759,8 @@ def __init__(self, pkg: "spack.package_base.PackageBase", install_args: dict):
if not self.pkg.spec.concrete:
raise ValueError(f"{self.pkg.name} must have a concrete spec")
# Cache the package phase options with the explicit package,
# popping the options to ensure installation of associated
# dependencies is NOT affected by these options.
self.pkg.stop_before_phase = install_args.pop("stop_before", None) # type: ignore[attr-defined] # noqa: E501
self.pkg.last_phase = install_args.pop("stop_at", None) # type: ignore[attr-defined]
self.pkg.stop_before_phase = install_args.get("stop_before") # type: ignore[attr-defined] # noqa: E501
self.pkg.last_phase = install_args.get("stop_at") # type: ignore[attr-defined]
# Cache the package id for convenience
self.pkg_id = package_id(pkg.spec)
@@ -1076,19 +1070,17 @@ def flag_installed(self, installed: List[str]) -> None:
@property
def explicit(self) -> bool:
"""The package was explicitly requested by the user."""
return self.is_root and self.request.install_args.get("explicit", True)
return self.pkg.spec.dag_hash() in self.request.install_args.get("explicit", [])
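With this change, `explicit` is no longer a per-request boolean: the installer receives one collection of DAG hashes in `install_args`, and each task tests membership. A minimal sketch (the class and hash values are invented):

class Task:
    def __init__(self, dag_hash: str, install_args: dict):
        self.dag_hash = dag_hash
        self.install_args = install_args

    @property
    def explicit(self) -> bool:
        # explicit iff the user asked for this exact spec
        return self.dag_hash in self.install_args.get("explicit", [])

args = {"explicit": {"abc123"}}
print(Task("abc123", args).explicit)  # True: requested directly
print(Task("zzz999", args).explicit)  # False: installed as a dependency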
@property
def is_root(self) -> bool:
"""The package was requested directly, but may or may not be explicit
in an environment."""
def is_build_request(self) -> bool:
"""The package was requested directly"""
return self.pkg == self.request.pkg
@property
def use_cache(self) -> bool:
_use_cache = True
if self.is_root:
if self.is_build_request:
return self.request.install_args.get("package_use_cache", _use_cache)
else:
return self.request.install_args.get("dependencies_use_cache", _use_cache)
@@ -1096,7 +1088,7 @@ def use_cache(self) -> bool:
@property
def cache_only(self) -> bool:
_cache_only = False
if self.is_root:
if self.is_build_request:
return self.request.install_args.get("package_cache_only", _cache_only)
else:
return self.request.install_args.get("dependencies_cache_only", _cache_only)
@@ -1122,24 +1114,17 @@ def priority(self):
class PackageInstaller:
"""
Class for managing the install process for a Spack instance based on a
bottom-up DAG approach.
Class for managing the install process for a Spack instance based on a bottom-up DAG approach.
This installer can coordinate concurrent batch and interactive, local
and distributed (on a shared file system) builds for the same Spack
instance.
This installer can coordinate concurrent batch and interactive, local and distributed (on a
shared file system) builds for the same Spack instance.
"""
def __init__(self, installs: List[Tuple["spack.package_base.PackageBase", dict]] = []) -> None:
"""Initialize the installer.
Args:
installs (list): list of tuples, where each
tuple consists of a package (PackageBase) and its associated
install arguments (dict)
"""
def __init__(
self, packages: List["spack.package_base.PackageBase"], install_args: dict
) -> None:
# List of build requests
self.build_requests = [BuildRequest(pkg, install_args) for pkg, install_args in installs]
self.build_requests = [BuildRequest(pkg, install_args) for pkg in packages]
# Priority queue of build tasks
self.build_pq: List[Tuple[Tuple[int, int], BuildTask]] = []
@@ -1557,17 +1542,6 @@ def _add_tasks(self, request: BuildRequest, all_deps):
tty.warn(f"Installation request refused: {str(err)}")
return
# Skip out early if the spec is not being installed locally (i.e., if
# external or upstream).
#
# External and upstream packages need to get flagged as installed to
# ensure proper status tracking for environment build.
explicit = request.install_args.get("explicit", True)
not_local = _handle_external_and_upstream(request.pkg, explicit)
if not_local:
self._flag_installed(request.pkg)
return
install_compilers = spack.config.get("config:install_missing_compilers", False)
install_deps = request.install_args.get("install_deps")
@@ -1683,10 +1657,6 @@ def _install_task(self, task: BuildTask, install_status: InstallStatus) -> None:
if not pkg.unit_test_check():
return
# Injecting information to know if this installation request is the root one
# to determine in BuildProcessInstaller whether installation is explicit or not
install_args["is_root"] = task.is_root
try:
self._setup_install_dir(pkg)
@@ -1998,8 +1968,8 @@ def install(self) -> None:
self._init_queue()
fail_fast_err = "Terminating after first install failure"
single_explicit_spec = len(self.build_requests) == 1
failed_explicits = []
single_requested_spec = len(self.build_requests) == 1
failed_build_requests = []
install_status = InstallStatus(len(self.build_pq))
@@ -2048,11 +2018,10 @@ def install(self) -> None:
# Skip the installation if the spec is not being installed locally
# (i.e., if external or upstream) BUT flag it as installed since
# some package likely depends on it.
if not task.explicit:
if _handle_external_and_upstream(pkg, False):
term_status.clear()
self._flag_installed(pkg, task.dependents)
continue
if _handle_external_and_upstream(pkg, task.explicit):
term_status.clear()
self._flag_installed(pkg, task.dependents)
continue
# Flag a failed spec. Do not need an (install) prefix lock since
# assume using a separate (failed) prefix lock file.
@@ -2197,14 +2166,11 @@ def install(self) -> None:
if self.fail_fast:
raise InstallError(f"{fail_fast_err}: {str(exc)}", pkg=pkg)
# Terminate at this point if the single explicit spec has
# failed to install.
if single_explicit_spec and task.explicit:
raise
# Track explicit spec id and error to summarize when done
if task.explicit:
failed_explicits.append((pkg, pkg_id, str(exc)))
# Terminate when a single build request has failed, or summarize errors later.
if task.is_build_request:
if single_requested_spec:
raise
failed_build_requests.append((pkg, pkg_id, str(exc)))
finally:
# Remove the install prefix if anything went wrong during
@@ -2227,16 +2193,16 @@ def install(self) -> None:
if request.install_args.get("install_package") and request.pkg_id not in self.installed
]
if failed_explicits or missing:
for _, pkg_id, err in failed_explicits:
if failed_build_requests or missing:
for _, pkg_id, err in failed_build_requests:
tty.error(f"{pkg_id}: {err}")
for _, pkg_id in missing:
tty.error(f"{pkg_id}: Package was not installed")
if len(failed_explicits) > 0:
pkg = failed_explicits[0][0]
ids = [pkg_id for _, pkg_id, _ in failed_explicits]
if len(failed_build_requests) > 0:
pkg = failed_build_requests[0][0]
ids = [pkg_id for _, pkg_id, _ in failed_build_requests]
tty.debug(
"Associating installation failure with first failed "
f"explicit package ({ids[0]}) from {', '.join(ids)}"
@@ -2295,7 +2261,7 @@ def __init__(self, pkg: "spack.package_base.PackageBase", install_args: dict):
self.verbose = bool(install_args.get("verbose", False))
# whether installation was explicitly requested by the user
self.explicit = install_args.get("is_root", False) and install_args.get("explicit", True)
self.explicit = pkg.spec.dag_hash() in install_args.get("explicit", [])
# env before starting installation
self.unmodified_env = install_args.get("unmodified_env", {})
@@ -2380,9 +2346,7 @@ def _install_source(self) -> None:
src_target = os.path.join(pkg.spec.prefix, "share", pkg.name, "src")
tty.debug(f"{self.pre} Copying source to {src_target}")
fs.install_tree(
pkg.stage.source_path, src_target, allow_broken_symlinks=(sys.platform != "win32")
)
fs.install_tree(pkg.stage.source_path, src_target)
def _real_install(self) -> None:
import spack.builder

View File

@@ -3,22 +3,12 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from ._operating_system import OperatingSystem
from .cray_backend import CrayBackend
from .cray_frontend import CrayFrontend
from .freebsd import FreeBSDOs
from .linux_distro import LinuxDistro
from .mac_os import MacOs
from .windows_os import WindowsOs
__all__ = [
"OperatingSystem",
"LinuxDistro",
"MacOs",
"CrayFrontend",
"CrayBackend",
"WindowsOs",
"FreeBSDOs",
]
__all__ = ["OperatingSystem", "LinuxDistro", "MacOs", "WindowsOs", "FreeBSDOs"]
#: List of all the Operating Systems known to Spack
operating_systems = [LinuxDistro, MacOs, CrayFrontend, CrayBackend, WindowsOs, FreeBSDOs]
operating_systems = [LinuxDistro, MacOs, WindowsOs, FreeBSDOs]

View File

@@ -1,172 +0,0 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import re
import llnl.util.tty as tty
import spack.error
import spack.version
from spack.util.module_cmd import module
from .linux_distro import LinuxDistro
#: Possible locations of the Cray CLE release file,
#: which we look at to get the CNL OS version.
_cle_release_file = "/etc/opt/cray/release/cle-release"
_clerelease_file = "/etc/opt/cray/release/clerelease"
def read_cle_release_file():
"""Read the CLE release file and return a dict with its attributes.
This file is present on newer versions of Cray.
The release file looks something like this::
RELEASE=6.0.UP07
BUILD=6.0.7424
...
The dictionary we produce looks like this::
{
"RELEASE": "6.0.UP07",
"BUILD": "6.0.7424",
...
}
Returns:
dict: dictionary of release attributes
"""
with open(_cle_release_file) as release_file:
result = {}
for line in release_file:
# use partition instead of split() to ensure we only split on
# the first '=' in the line.
key, _, value = line.partition("=")
result[key] = value.strip()
return result
def read_clerelease_file():
"""Read the CLE release file and return the Cray OS version.
This file is present on older versions of Cray.
The release file looks something like this::
5.2.UP04
Returns:
str: the Cray OS version
"""
with open(_clerelease_file) as release_file:
for line in release_file:
return line.strip()
class CrayBackend(LinuxDistro):
"""Compute Node Linux (CNL) is the operating system used for the Cray XC
series super computers. It is a very stripped down version of GNU/Linux.
Any compilers found through this operating system will be used with
modules. If updated, user must make sure that version and name are
updated to indicate that OS has been upgraded (or downgraded)
"""
def __init__(self):
name = "cnl"
version = self._detect_crayos_version()
if version:
# If we found a CrayOS version, we do not want the information
# from LinuxDistro. In order to skip the logic from
# distro.linux_distribution, while still calling __init__
# methods further up the MRO, we skip LinuxDistro in the MRO and
# call the OperatingSystem superclass __init__ method
super(LinuxDistro, self).__init__(name, version)
else:
super().__init__()
self.modulecmd = module
def __str__(self):
return self.name + str(self.version)
@classmethod
def _detect_crayos_version(cls):
if os.path.isfile(_cle_release_file):
release_attrs = read_cle_release_file()
if "RELEASE" not in release_attrs:
# This Cray system uses a base OS not CLE/CNL
return None
v = spack.version.Version(release_attrs["RELEASE"])
return v[0]
elif os.path.isfile(_clerelease_file):
v = read_clerelease_file()
return spack.version.Version(v)[0]
else:
# Not all Cray systems run CNL on the backend.
# Systems running in what Cray calls "cluster" mode run other
# linux OSs under the Cray PE.
# So if we don't detect any Cray OS version on the system,
# we return None. We can't ever be sure we will get a Cray OS
# version.
# Returning None allows the calling code to test for the value
# being "True-ish" rather than requiring a try/except block.
return None
def arguments_to_detect_version_fn(self, paths):
import spack.compilers
command_arguments = []
for compiler_name in spack.compilers.supported_compilers():
cmp_cls = spack.compilers.class_for_compiler_name(compiler_name)
# If the compiler doesn't have a corresponding
# Programming Environment, skip to the next
if cmp_cls.PrgEnv is None:
continue
if cmp_cls.PrgEnv_compiler is None:
tty.die("Must supply PrgEnv_compiler with PrgEnv")
compiler_id = spack.compilers.CompilerID(self, compiler_name, None)
detect_version_args = spack.compilers.DetectVersionArgs(
id=compiler_id, variation=(None, None), language="cc", path="cc"
)
command_arguments.append(detect_version_args)
return command_arguments
def detect_version(self, detect_version_args):
import spack.compilers
modulecmd = self.modulecmd
compiler_name = detect_version_args.id.compiler_name
compiler_cls = spack.compilers.class_for_compiler_name(compiler_name)
output = modulecmd("avail", compiler_cls.PrgEnv_compiler)
version_regex = r"({0})/([\d\.]+[\d]-?[\w]*)".format(compiler_cls.PrgEnv_compiler)
matches = re.findall(version_regex, output)
version = tuple(version for _, version in matches if "classic" not in version)
compiler_id = detect_version_args.id
value = detect_version_args._replace(id=compiler_id._replace(version=version))
return value, None

    def make_compilers(self, compiler_id, paths):
        import spack.spec

        name = compiler_id.compiler_name
        cmp_cls = spack.compilers.class_for_compiler_name(name)
        compilers = []
        for v in compiler_id.version:
            comp = cmp_cls(
                spack.spec.CompilerSpec(name + "@=" + v),
                self,
                "any",
                ["cc", "CC", "ftn"],
                [cmp_cls.PrgEnv, name + "/" + v],
            )
            compilers.append(comp)
        return compilers
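
As a rough illustration of the ``version_regex`` used in ``detect_version`` above, with made-up ``module avail`` output:

import re

output = "gcc/8.1.0-classic gcc/11.2.0 gcc/12.1.0"  # hypothetical module listing
version_regex = r"({0})/([\d\.]+[\d]-?[\w]*)".format("gcc")
matches = re.findall(version_regex, output)
versions = tuple(v for _, v in matches if "classic" not in v)
assert versions == ("11.2.0", "12.1.0")  # "classic" variants are filtered out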

View File

@@ -1,105 +0,0 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

import contextlib
import os
import re

import llnl.util.filesystem as fs
import llnl.util.lang
import llnl.util.tty as tty

from spack.util.environment import get_path
from spack.util.module_cmd import module

from .linux_distro import LinuxDistro


@contextlib.contextmanager
def unload_programming_environment():
    """Context manager that unloads Cray Programming Environments."""
    env_bu = None

    # We rely on the fact that the PrgEnv-* modules set the PE_ENV
    # environment variable.
    if "PE_ENV" in os.environ:
        # Copy environment variables to restore them after the compiler
        # detection. We expect that the only thing PrgEnv-* modules do is
        # modify environment variables.
        env_bu = os.environ.copy()

        # Get the name of the module from the environment variable.
        prg_env = "PrgEnv-" + os.environ["PE_ENV"].lower()

        # Unload the PrgEnv-* module. By doing this we intentionally
        # provoke errors when the Cray compiler wrappers are executed
        # (Error: A PrgEnv-* modulefile must be loaded.) so they will not
        # be detected as valid compilers by the overridden method. We also
        # expect that the modules that add the actual compilers' binaries
        # into the PATH environment variable (i.e. the following modules:
        # 'intel', 'cce', 'gcc', etc.) will also be unloaded since they are
        # specified as prerequisites in the PrgEnv-* modulefiles.
        module("unload", prg_env)

    yield

    # Restore the environment.
    if env_bu is not None:
        os.environ.clear()
        os.environ.update(env_bu)
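
A minimal usage sketch, mirroring how ``CrayFrontend`` below applies it:

with unload_programming_environment():
    # PrgEnv-* and its prerequisite compiler modules are unloaded here, so
    # Cray compiler wrappers fail fast instead of being mis-detected.
    search_paths = get_path("PATH")
# On exit, the saved environment (if any) is restored.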


class CrayFrontend(LinuxDistro):
    """Represents the OS that runs on the login and service nodes of the Cray
    platform. It acts as a regular Linux distribution without Cray-specific
    modules and compiler wrappers."""

    @property
    def compiler_search_paths(self):
        """Calls the default function but unloads Cray's programming
        environments first.

        This prevents detection of Cray compiler wrappers and avoids
        possible false positives.
        """
        import spack.compilers

        with unload_programming_environment():
            search_paths = get_path("PATH")

        extract_path_re = re.compile(r"prepend-path[\s]*PATH[\s]*([/\w\.:-]*)")

        for compiler_cls in spack.compilers.all_compiler_types():
            # Check if the compiler class is supported on Cray
            prg_env = getattr(compiler_cls, "PrgEnv", None)
            compiler_module = getattr(compiler_cls, "PrgEnv_compiler", None)
            if not (prg_env and compiler_module):
                continue

            # It is supported, check which versions are available
            output = module("avail", compiler_cls.PrgEnv_compiler)
            version_regex = r"({0})/([\d\.]+[\d]-?[\w]*)".format(compiler_cls.PrgEnv_compiler)
            matches = re.findall(version_regex, output)
            versions = tuple(version for _, version in matches if "classic" not in version)

            # Now inspect the modules and add to paths
            msg = "[CRAY FE] Detected FE compiler [name={0}, versions={1}]"
            tty.debug(msg.format(compiler_module, versions))
            for v in versions:
                try:
                    current_module = compiler_module + "/" + v
                    out = module("show", current_module)
                    match = extract_path_re.search(out)
                    search_paths += match.group(1).split(":")
                except Exception as e:
                    msg = (
                        "[CRAY FE] An unexpected error occurred while "
                        "detecting FE compiler [compiler={0}, "
                        " version={1}, error={2}]"
                    )
                    tty.debug(msg.format(compiler_cls.name, v, str(e)))

        search_paths = list(llnl.util.lang.dedupe(search_paths))
        return fs.search_paths_for_executables(*search_paths)

View File

@@ -143,6 +143,7 @@ def __init__(self):
"12": "monterey",
"13": "ventura",
"14": "sonoma",
"15": "sequoia",
}
version = macos_version()

View File

@@ -621,10 +621,6 @@ class PackageBase(WindowsRPath, PackageViewMixin, RedistributionMixin, metaclass
#: By default do not run tests within package's install()
run_tests = False
#: Keep -Werror flags, matches config:flags:keep_werror to override config
# NOTE: should be type Optional[Literal['all', 'specific', 'none']] in 3.8+
keep_werror: Optional[str] = None
#: Most packages are NOT extendable. Set to True if you want extensions.
extendable = False
@@ -752,11 +748,6 @@ def __init__(self, spec):
self._fetch_time = 0.0
self.win_rpath = fsys.WindowsSimulatedRPath(self)
if self.is_extension:
pkg_cls = spack.repo.PATH.get_pkg_class(self.extendee_spec.name)
pkg_cls(self.extendee_spec)._check_extendable()
super().__init__()
@classmethod
@@ -930,6 +921,32 @@ def global_license_file(self):
self.global_license_dir, self.name, os.path.basename(self.license_files[0])
)

    # NOTE: return type should be Optional[Literal['all', 'specific', 'none']] in
    # Python 3.8+, but we still support 3.6.
    @property
    def keep_werror(self) -> Optional[str]:
        """Keep ``-Werror`` flags, matches ``config:flags:keep_werror`` to override config.

        Valid return values are:

        * ``"all"``: keep all ``-Werror`` flags.
        * ``"specific"``: keep only ``-Werror=specific-warning`` flags.
        * ``"none"``: filter out all ``-Werror*`` flags.
        * ``None``: respect the user's configuration (``"none"`` by default).
        """
        if self.spec.satisfies("%nvhpc@:23.3") or self.spec.satisfies("%pgi"):
            # Filtering works by replacing -Werror with -Wno-error, but older nvhpc and
            # PGI do not understand -Wno-error, so we disable filtering.
            return "all"
        elif self.spec.satisfies("%nvhpc@23.4:"):
            # newer nvhpc supports -Wno-error but can't disable specific warnings with
            # -Wno-error=warning. Skip -Werror=warning, but still filter -Werror.
            return "specific"
        else:
            # use -Werror disablement by default for other compilers
            return None
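
To make the three values concrete, a hypothetical helper (not Spack's actual flag-filtering code) could apply them like this:

from typing import List, Optional

def filter_werror_flags(flags: List[str], keep_werror: Optional[str]) -> List[str]:
    """Hypothetical sketch applying the keep_werror semantics documented above."""
    if keep_werror is None:
        keep_werror = "none"  # respect the user's configuration (default "none")
    if keep_werror == "all":
        return list(flags)
    result = []
    for flag in flags:
        if flag == "-Werror":
            result.append("-Wno-error")  # filtered under "specific" and "none"
        elif flag.startswith("-Werror="):
            if keep_werror == "specific":
                result.append(flag)  # keep -Werror=<specific-warning>
            else:
                result.append(flag.replace("-Werror=", "-Wno-error="))
        else:
            result.append(flag)
    return result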
@property
def version(self):
if not self.spec.versions.concrete:
@@ -1119,10 +1136,9 @@ def _make_stage(self):
if not link_format:
link_format = "build-{arch}-{hash:7}"
stage_link = self.spec.format_path(link_format)
return DevelopStage(compute_stage_name(self.spec), dev_path, stage_link)
# To fetch the current version
source_stage = self._make_root_stage(self.fetcher)
source_stage = DevelopStage(compute_stage_name(self.spec), dev_path, stage_link)
else:
source_stage = self._make_root_stage(self.fetcher)
# all_stages is source + resources + patches
all_stages = StageComposite()
@@ -1451,10 +1467,8 @@ def do_fetch(self, mirror_only=False):
return
checksum = spack.config.get("config:checksum")
fetch = self.stage.needs_fetching
if (
checksum
and fetch
and (self.version not in self.versions)
and (not isinstance(self.version, GitVersion))
):
@@ -1561,13 +1575,11 @@ def do_patch(self):
tty.debug("Patching failed last time. Restaging.")
self.stage.restage()
else:
# develop specs/ DIYStages may have patch failures but
# should never be restaged
msg = (
"A patch failure was detected in %s." % self.name
+ " Build errors may occur due to this."
# develop specs may have patch failures but should never be restaged
tty.warn(
f"A patch failure was detected in {self.name}."
" Build errors may occur due to this."
)
tty.warn(msg)
return
# If this file exists, then we already applied all the patches.
@@ -1881,7 +1893,10 @@ def do_install(self, **kwargs):
verbose (bool): Display verbose build output (by default,
suppresses it)
"""
PackageInstaller([(self, kwargs)]).install()
explicit = kwargs.get("explicit", True)
if isinstance(explicit, bool):
kwargs["explicit"] = {self.spec.dag_hash()} if explicit else set()
PackageInstaller([self], kwargs).install()
# TODO (post-34236): Update tests and all packages that use this as a
# TODO (post-34236): package method to the routine made available to
@@ -2368,10 +2383,6 @@ def do_deprecate(self, deprecator, link_fn):
PackageBase.uninstall_by_spec(spec, force=True, deprecator=deprecator)
link_fn(deprecator.prefix, spec.prefix)
def _check_extendable(self):
if not self.extendable:
raise ValueError("Package %s is not extendable!" % self.name)
def view(self):
"""Create a view with the prefix of this package as the root.
Extensions added to this view will modify the installation prefix of

View File

@@ -6,7 +6,6 @@
from ._functions import _host, by_name, platforms, prevent_cray_detection, reset
from ._platform import Platform
from .cray import Cray
from .darwin import Darwin
from .freebsd import FreeBSD
from .linux import Linux
@@ -15,7 +14,6 @@
__all__ = [
"Platform",
"Cray",
"Darwin",
"Linux",
"FreeBSD",

View File

@@ -8,7 +8,6 @@
import spack.util.environment
from .cray import Cray
from .darwin import Darwin
from .freebsd import FreeBSD
from .linux import Linux
@@ -16,7 +15,7 @@
from .windows import Windows
#: List of all the platform classes known to Spack
platforms = [Cray, Darwin, Linux, Windows, FreeBSD, Test]
platforms = [Darwin, Linux, Windows, FreeBSD, Test]
@llnl.util.lang.memoized

View File

@@ -2,254 +2,10 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import os.path
import platform
import re

import archspec.cpu

import llnl.util.tty as tty
from llnl.util.symlink import readlink

import spack.target
import spack.version
from spack.operating_systems.cray_backend import CrayBackend
from spack.operating_systems.cray_frontend import CrayFrontend
from spack.paths import build_env_path
from spack.util.executable import Executable
from spack.util.module_cmd import module

from ._platform import NoPlatformError, Platform

_craype_name_to_target_name = {
    "x86-cascadelake": "cascadelake",
    "x86-naples": "zen",
    "x86-rome": "zen2",
    "x86-milan": "zen3",
    "x86-skylake": "skylake_avx512",
    "mic-knl": "mic_knl",
    "interlagos": "bulldozer",
    "abudhabi": "piledriver",
}

_ex_craype_dir = "/opt/cray/pe/cpe"
_xc_craype_dir = "/opt/cray/pe/cdt"


def slingshot_network():
    return os.path.exists("/opt/cray/pe") and (
        os.path.exists("/lib64/libcxi.so") or os.path.exists("/usr/lib64/libcxi.so")
    )


def _target_name_from_craype_target_name(name):
    return _craype_name_to_target_name.get(name, name)


class Cray(Platform):
    priority = 10

    def __init__(self):
        """Create a Cray system platform.

        Target names should use craype target names but not include the
        'craype-' prefix. Uses first viable target from:

          self
          envars [SPACK_FRONT_END, SPACK_BACK_END]
          configuration file "targets.yaml" with keys 'front_end', 'back_end'
          scanning /etc/bash/bashrc.local for back_end only
        """
        super().__init__("cray")

        # Make all craype targets available.
        for target in self._avail_targets():
            name = _target_name_from_craype_target_name(target)
            self.add_target(name, spack.target.Target(name, "craype-%s" % target))

        self.back_end = os.environ.get("SPACK_BACK_END", self._default_target_from_env())
        self.default = self.back_end
        if self.back_end not in self.targets:
            # We didn't find a target module for the backend
            raise NoPlatformError()

        # Setup frontend targets
        for name in archspec.cpu.TARGETS:
            if name not in self.targets:
                self.add_target(name, spack.target.Target(name))
        self.front_end = os.environ.get("SPACK_FRONT_END", archspec.cpu.host().name)
        if self.front_end not in self.targets:
            self.add_target(self.front_end, spack.target.Target(self.front_end))

        front_distro = CrayFrontend()
        back_distro = CrayBackend()

        self.default_os = str(back_distro)
        self.back_os = self.default_os
        self.front_os = str(front_distro)

        self.add_operating_system(self.back_os, back_distro)
        if self.front_os != self.back_os:
            self.add_operating_system(self.front_os, front_distro)

    def setup_platform_environment(self, pkg, env):
        """Change the linker to default dynamic to be more
        similar to linux/standard linker behavior
        """
        # Unload these modules to prevent any silent linking or unnecessary
        # I/O profiling in the case of darshan.
        modules_to_unload = ["cray-mpich", "darshan", "cray-libsci", "altd"]
        for mod in modules_to_unload:
            module("unload", mod)

        env.set("CRAYPE_LINK_TYPE", "dynamic")

        cray_wrapper_names = os.path.join(build_env_path, "cray")

        if os.path.isdir(cray_wrapper_names):
            env.prepend_path("PATH", cray_wrapper_names)
            env.prepend_path("SPACK_ENV_PATH", cray_wrapper_names)

        # Makes spack installed pkg-config work on Crays
        env.append_path("PKG_CONFIG_PATH", "/usr/lib64/pkgconfig")
        env.append_path("PKG_CONFIG_PATH", "/usr/local/lib64/pkgconfig")

        # CRAY_LD_LIBRARY_PATH is used at build time by the cray compiler
        # wrappers to augment LD_LIBRARY_PATH. This is to avoid long load
        # times at runtime. This behavior is not always respected on cray
        # "cluster" systems, so we reproduce it here.
        if os.environ.get("CRAY_LD_LIBRARY_PATH"):
            env.prepend_path("LD_LIBRARY_PATH", os.environ["CRAY_LD_LIBRARY_PATH"])

    @classmethod
    def craype_type_and_version(cls):
        if os.path.isdir(_ex_craype_dir):
            craype_dir = _ex_craype_dir
            craype_type = "EX"
        elif os.path.isdir(_xc_craype_dir):
            craype_dir = _xc_craype_dir
            craype_type = "XC"
        else:
            return (None, None)

        # Take the default version from known symlink path
        default_path = os.path.join(craype_dir, "default")
        if os.path.islink(default_path):
            version = spack.version.Version(readlink(default_path))
            return (craype_type, version)

        # If no default version, sort available versions and return latest
        versions_available = [spack.version.Version(v) for v in os.listdir(craype_dir)]
        versions_available.sort(reverse=True)
        if not versions_available:
            return (craype_type, None)
        return (craype_type, versions_available[0])
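
Hypothetical outcomes of this lookup (the version is made up):

# EX system where /opt/cray/pe/cpe/default -> 23.02:
#     Cray.craype_type_and_version()  # ("EX", Version("23.02"))
# XC system with no "default" symlink: newest entry in /opt/cray/pe/cdt wins
# Neither directory present:
#     Cray.craype_type_and_version()  # (None, None)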

    @classmethod
    def detect(cls):
        """
        Detect whether this system requires CrayPE module support.

        Systems with newer CrayPE (21.10 for EX systems, future work for CS and
        XC systems) have compilers and MPI wrappers that can be used directly
        by path. These systems are considered ``linux`` platforms.

        For systems running an older CrayPE, we detect the Cray platform based
        on the availability through `module` of the Cray programming
        environment. If this environment is available, we can use it to find
        compilers, target modules, etc. If the Cray programming environment is
        not available via modules, then we will treat it as a standard linux
        system, as the Cray compiler wrappers and other components of the Cray
        programming environment are irrelevant without module support.
        """
        if "opt/cray" not in os.environ.get("MODULEPATH", ""):
            return False

        craype_type, craype_version = cls.craype_type_and_version()
        if craype_type == "XC":
            return True
        if craype_type == "EX" and craype_version < spack.version.Version("21.10"):
            return True
        return False

    def _default_target_from_env(self):
        """Set and return the default CrayPE target loaded in a clean login
        session.

        A bash subshell is launched with a wiped environment and the list of
        loaded modules is parsed for the first acceptable CrayPE target.
        """
        # env -i /bin/bash -lc echo $CRAY_CPU_TARGET 2> /dev/null
        if getattr(self, "default", None) is None:
            bash = Executable("/bin/bash")
            output = bash(
                "--norc",
                "--noprofile",
                "-lc",
                "echo $CRAY_CPU_TARGET",
                env={"TERM": os.environ.get("TERM", "")},
                output=str,
                error=os.devnull,
            )
            default_from_module = "".join(output.split())  # rm all whitespace
            if default_from_module:
                tty.debug("Found default module:%s" % default_from_module)
                return default_from_module
            else:
                front_end = archspec.cpu.host()
                # Look for the frontend architecture or closest ancestor
                # available in cray target modules
                avail = [_target_name_from_craype_target_name(x) for x in self._avail_targets()]
                for front_end_possibility in [front_end] + front_end.ancestors:
                    if front_end_possibility.name in avail:
                        tty.debug("using front-end architecture or available ancestor")
                        return front_end_possibility.name
                else:
                    tty.debug("using platform.machine as default")
                    return platform.machine()

    def _avail_targets(self):
        """Return a list of available CrayPE CPU targets."""

        def modules_in_output(output):
            """Returns a list of valid modules parsed from modulecmd output"""
            return [i for i in re.split(r"\s\s+|\n", output)]

        def target_names_from_modules(modules):
            # Craype- module prefixes that are not valid CPU targets.
            targets = []
            for mod in modules:
                if "craype-" in mod:
                    name = mod[7:]
                    name = name.split()[0]
                    _n = name.replace("-", "_")  # test for mic-knl/mic_knl
                    is_target_name = name in archspec.cpu.TARGETS or _n in archspec.cpu.TARGETS
                    is_cray_target_name = name in _craype_name_to_target_name
                    if is_target_name or is_cray_target_name:
                        targets.append(name)
            return targets

        def modules_from_listdir():
            craype_default_path = "/opt/cray/pe/craype/default/modulefiles"
            if os.path.isdir(craype_default_path):
                return os.listdir(craype_default_path)
            return []

        if getattr(self, "_craype_targets", None) is None:
            strategies = [
                lambda: modules_in_output(module("avail", "-t", "craype-")),
                modules_from_listdir,
            ]
            for available_craype_modules in strategies:
                craype_modules = available_craype_modules()
                craype_targets = target_names_from_modules(craype_modules)
                if craype_targets:
                    self._craype_targets = craype_targets
                    break
            else:
                # If nothing is found add platform.machine()
                # to avoid Spack erroring out
                self._craype_targets = [platform.machine()]

        return self._craype_targets

View File

@@ -25,7 +25,8 @@
import traceback
import types
import uuid
from typing import Any, Dict, List, Set, Tuple, Union
import warnings
from typing import Any, Dict, Generator, List, Optional, Set, Tuple, Type, Union
import llnl.path
import llnl.util.filesystem as fs
@@ -126,11 +127,35 @@ def exec_module(self, module):
class ReposFinder:
"""MetaPathFinder class that loads a Python module corresponding to a Spack package
"""MetaPathFinder class that loads a Python module corresponding to a Spack package.
Return a loader based on the inspection of the current global repository list.
Returns a loader based on the inspection of the current repository list.
"""

    def __init__(self):
        self._repo_init = _path
        self._repo = None

    @property
    def current_repository(self):
        if self._repo is None:
            self._repo = self._repo_init()
        return self._repo

    @current_repository.setter
    def current_repository(self, value):
        self._repo = value

    @contextlib.contextmanager
    def switch_repo(self, substitute: "RepoType"):
        """Switch the current repository list for the duration of the context manager."""
        old = self.current_repository
        try:
            self.current_repository = substitute
            yield
        finally:
            self.current_repository = old
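
A usage sketch (this mirrors the call added to ``get_pkg_class`` later in this diff):

with REPOS_FINDER.switch_repo(repo_or_repo_path):  # a Repo or a RepoPath
    module = importlib.import_module(fullname)     # resolved against the substitute
# The previous repository list is restored even if the import raises.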
def find_spec(self, fullname, python_path, target=None):
# "target" is not None only when calling importlib.reload()
if target is not None:
@@ -149,9 +174,14 @@ def compute_loader(self, fullname):
# namespaces are added to repo, and package modules are leaves.
namespace, dot, module_name = fullname.rpartition(".")
# If it's a module in some repo, or if it is the repo's
# namespace, let the repo handle it.
for repo in PATH.repos:
# If it's a module in some repo, or if it is the repo's namespace, let the repo handle it.
is_repo_path = isinstance(self.current_repository, RepoPath)
if is_repo_path:
repos = self.current_repository.repos
else:
repos = [self.current_repository]
for repo in repos:
# We are using the namespace of the repo and the repo contains the package
if namespace == repo.full_namespace:
# With 2 nested conditionals we can call "repo.real_name" only once
@@ -165,7 +195,7 @@ def compute_loader(self, fullname):
# No repo provides the namespace, but it is a valid prefix of
# something in the RepoPath.
if PATH.by_namespace.is_prefix(fullname):
if is_repo_path and self.current_repository.by_namespace.is_prefix(fullname):
return SpackNamespaceLoader()
return None
@@ -560,7 +590,7 @@ def __init__(
self,
package_checker: FastPackageChecker,
namespace: str,
cache: spack.util.file_cache.FileCache,
cache: spack.caches.FileCacheType,
):
self.checker = package_checker
self.packages_path = self.checker.packages_path
@@ -648,11 +678,9 @@ class RepoPath:
repos (list): list Repo objects or paths to put in this RepoPath
"""
def __init__(self, *repos, **kwargs):
cache = kwargs.get("cache", spack.caches.MISC_CACHE)
def __init__(self, *repos, cache, overrides=None):
self.repos = []
self.by_namespace = nm.NamespaceTrie()
self._provider_index = None
self._patch_index = None
self._tag_index = None
@@ -661,7 +689,8 @@ def __init__(self, *repos, **kwargs):
for repo in repos:
try:
if isinstance(repo, str):
repo = Repo(repo, cache=cache)
repo = Repo(repo, cache=cache, overrides=overrides)
repo.finder(self)
self.put_last(repo)
except RepoError as e:
tty.warn(
@@ -915,18 +944,28 @@ class Repo:
Each package repository must have a top-level configuration file
called `repo.yaml`.
Currently, `repo.yaml` this must define:
Currently, `repo.yaml` must define:
`namespace`:
A Python namespace where the repository's packages should live.
`subdirectory`:
An optional subdirectory name where packages are placed
"""
def __init__(self, root, cache=None):
    def __init__(
        self,
        root: str,
        *,
        cache: spack.caches.FileCacheType,
        overrides: Optional[Dict[str, Any]] = None,
    ) -> None:
        """Instantiate a package repository from a filesystem path.

        Args:
            root: the root directory of the repository
            cache: file cache associated with this repository
            overrides: dict mapping package name to class attribute overrides for that package
        """
# Root directory, containing _repo.yaml and package dirs
# Allow roots to be spack-relative by starting with '$spack'
@@ -939,20 +978,20 @@ def check(condition, msg):
# Validate repository layout.
self.config_file = os.path.join(self.root, repo_config_name)
check(os.path.isfile(self.config_file), "No %s found in '%s'" % (repo_config_name, root))
check(os.path.isfile(self.config_file), f"No {repo_config_name} found in '{root}'")
# Read configuration and validate namespace
config = self._read_config()
check(
"namespace" in config,
"%s must define a namespace." % os.path.join(root, repo_config_name),
f"{os.path.join(root, repo_config_name)} must define a namespace.",
)
self.namespace = config["namespace"]
check(
re.match(r"[a-zA-Z][a-zA-Z0-9_.]+", self.namespace),
("Invalid namespace '%s' in repo '%s'. " % (self.namespace, self.root))
+ "Namespaces must be valid python identifiers separated by '.'",
f"Invalid namespace '{self.namespace}' in repo '{self.root}'. "
"Namespaces must be valid python identifiers separated by '.'",
)
# Set up 'full_namespace' to include the super-namespace
@@ -964,23 +1003,26 @@ def check(condition, msg):
packages_dir = config.get("subdirectory", packages_dir_name)
self.packages_path = os.path.join(self.root, packages_dir)
check(
os.path.isdir(self.packages_path),
"No directory '%s' found in '%s'" % (packages_dir, root),
os.path.isdir(self.packages_path), f"No directory '{packages_dir}' found in '{root}'"
)
# These are internal cache variables.
self._modules = {}
self._classes = {}
self._instances = {}
# Class attribute overrides by package name
self.overrides = overrides or {}
# Optional reference to a RepoPath to influence module import from spack.pkg
self._finder: Optional[RepoPath] = None
# Map from package name to corresponding file stat
self._fast_package_checker = None
self._fast_package_checker: Optional[FastPackageChecker] = None
# Indexes for this repository, computed lazily
self._repo_index = None
self._cache = cache or spack.caches.MISC_CACHE
self._repo_index: Optional[RepoIndex] = None
self._cache = cache
def real_name(self, import_name):
def finder(self, value: RepoPath) -> None:
self._finder = value
def real_name(self, import_name: str) -> Optional[str]:
"""Allow users to import Spack packages using Python identifiers.
A python identifier might map to many different Spack package
@@ -999,18 +1041,21 @@ def real_name(self, import_name):
return import_name
options = nm.possible_spack_module_names(import_name)
options.remove(import_name)
try:
options.remove(import_name)
except ValueError:
pass
for name in options:
if name in self:
return name
return None
def is_prefix(self, fullname):
def is_prefix(self, fullname: str) -> bool:
"""True if fullname is a prefix of this Repo's namespace."""
parts = fullname.split(".")
return self._names[: len(parts)] == parts
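
For instance, for a hypothetical repo whose full namespace is ``spack.pkg.builtin``:

repo.is_prefix("spack")              # True
repo.is_prefix("spack.pkg.builtin")  # True
repo.is_prefix("spack.pkg.other")    # False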
def _read_config(self):
def _read_config(self) -> Dict[str, str]:
"""Check for a YAML config file in this db's root directory."""
try:
with open(self.config_file) as reponame_file:
@@ -1021,14 +1066,14 @@ def _read_config(self):
or "repo" not in yaml_data
or not isinstance(yaml_data["repo"], dict)
):
tty.die("Invalid %s in repository %s" % (repo_config_name, self.root))
tty.die(f"Invalid {repo_config_name} in repository {self.root}")
return yaml_data["repo"]
except IOError:
tty.die("Error reading %s when opening %s" % (self.config_file, self.root))
tty.die(f"Error reading {self.config_file} when opening {self.root}")
def get(self, spec):
def get(self, spec: "spack.spec.Spec") -> "spack.package_base.PackageBase":
"""Returns the package associated with the supplied spec."""
msg = "Repo.get can only be called on concrete specs"
assert isinstance(spec, spack.spec.Spec) and spec.concrete, msg
@@ -1049,16 +1094,13 @@ def get(self, spec):
# pass these through as their error messages will be fine.
raise
except Exception as e:
tty.debug(e)
# Make sure other errors in constructors hit the error
# handler by wrapping them
if spack.config.get("config:debug"):
sys.excepthook(*sys.exc_info())
raise FailedConstructorError(spec.fullname, *sys.exc_info())
tty.debug(e)
raise FailedConstructorError(spec.fullname, *sys.exc_info()) from e
@autospec
def dump_provenance(self, spec, path):
def dump_provenance(self, spec: "spack.spec.Spec", path: str) -> None:
"""Dump provenance information for a spec to a particular path.
This dumps the package file and any associated patch files.
@@ -1066,7 +1108,7 @@ def dump_provenance(self, spec, path):
"""
if spec.namespace and spec.namespace != self.namespace:
raise UnknownPackageError(
"Repository %s does not contain package %s." % (self.namespace, spec.fullname)
f"Repository {self.namespace} does not contain package {spec.fullname}."
)
package_path = self.filename_for_package_name(spec.name)
@@ -1083,17 +1125,13 @@ def dump_provenance(self, spec, path):
if os.path.exists(patch.path):
fs.install(patch.path, path)
else:
tty.warn("Patch file did not exist: %s" % patch.path)
warnings.warn(f"Patch file did not exist: {patch.path}")
# Install the package.py file itself.
fs.install(self.filename_for_package_name(spec.name), path)
def purge(self):
"""Clear entire package instance cache."""
self._instances.clear()
@property
def index(self):
def index(self) -> RepoIndex:
"""Construct the index for this repo lazily."""
if self._repo_index is None:
self._repo_index = RepoIndex(self._pkg_checker, self.namespace, cache=self._cache)
@@ -1103,42 +1141,40 @@ def index(self):
return self._repo_index
@property
def provider_index(self):
def provider_index(self) -> spack.provider_index.ProviderIndex:
"""A provider index with names *specific* to this repo."""
return self.index["providers"]
@property
def tag_index(self):
def tag_index(self) -> spack.tag.TagIndex:
"""Index of tags and which packages they're defined on."""
return self.index["tags"]
@property
def patch_index(self):
def patch_index(self) -> spack.patch.PatchCache:
"""Index of patches and packages they're defined on."""
return self.index["patches"]
@autospec
def providers_for(self, vpkg_spec):
def providers_for(self, vpkg_spec: "spack.spec.Spec") -> List["spack.spec.Spec"]:
providers = self.provider_index.providers_for(vpkg_spec)
if not providers:
raise UnknownPackageError(vpkg_spec.fullname)
return providers
@autospec
def extensions_for(self, extendee_spec):
return [
pkg_cls(spack.spec.Spec(pkg_cls.name))
for pkg_cls in self.all_package_classes()
if pkg_cls(spack.spec.Spec(pkg_cls.name)).extends(extendee_spec)
]
def extensions_for(
self, extendee_spec: "spack.spec.Spec"
) -> List["spack.package_base.PackageBase"]:
result = [pkg_cls(spack.spec.Spec(pkg_cls.name)) for pkg_cls in self.all_package_classes()]
return [x for x in result if x.extends(extendee_spec)]
def dirname_for_package_name(self, pkg_name):
"""Get the directory name for a particular package. This is the
directory that contains its package.py file."""
def dirname_for_package_name(self, pkg_name: str) -> str:
"""Given a package name, get the directory containing its package.py file."""
_, unqualified_name = self.partition_package_name(pkg_name)
return os.path.join(self.packages_path, unqualified_name)
def filename_for_package_name(self, pkg_name):
def filename_for_package_name(self, pkg_name: str) -> str:
"""Get the filename for the module we should load for a particular
package. Packages for a Repo live in
``$root/<package_name>/package.py``
@@ -1151,23 +1187,23 @@ def filename_for_package_name(self, pkg_name):
return os.path.join(pkg_dir, package_file_name)
@property
def _pkg_checker(self):
def _pkg_checker(self) -> FastPackageChecker:
if self._fast_package_checker is None:
self._fast_package_checker = FastPackageChecker(self.packages_path)
return self._fast_package_checker
def all_package_names(self, include_virtuals=False):
def all_package_names(self, include_virtuals: bool = False) -> List[str]:
"""Returns a sorted list of all package names in the Repo."""
names = sorted(self._pkg_checker.keys())
if include_virtuals:
return names
return [x for x in names if not self.is_virtual(x)]
def package_path(self, name):
def package_path(self, name: str) -> str:
"""Get path to package.py file for this repo."""
return os.path.join(self.packages_path, name, package_file_name)
def all_package_paths(self):
def all_package_paths(self) -> Generator[str, None, None]:
for name in self.all_package_names():
yield self.package_path(name)
@@ -1176,7 +1212,7 @@ def packages_with_tags(self, *tags: str) -> Set[str]:
v.intersection_update(*(self.tag_index[tag.lower()] for tag in tags))
return v
def all_package_classes(self):
def all_package_classes(self) -> Generator[Type["spack.package_base.PackageBase"], None, None]:
"""Iterator over all package *classes* in the repository.
Use this with care, because loading packages is slow.
@@ -1184,7 +1220,7 @@ def all_package_classes(self):
for name in self.all_package_names():
yield self.get_pkg_class(name)
def exists(self, pkg_name):
def exists(self, pkg_name: str) -> bool:
"""Whether a package with the supplied name exists."""
if pkg_name is None:
return False
@@ -1201,28 +1237,22 @@ def last_mtime(self):
"""Time a package file in this repo was last updated."""
return self._pkg_checker.last_mtime()
def is_virtual(self, pkg_name):
def is_virtual(self, pkg_name: str) -> bool:
"""Return True if the package with this name is virtual, False otherwise.
This function uses the provider index. If calling from a code block that
is used to construct the provider index, use the ``is_virtual_safe`` function.
Args:
pkg_name (str): name of the package we want to check
"""
return pkg_name in self.provider_index
def is_virtual_safe(self, pkg_name):
def is_virtual_safe(self, pkg_name: str) -> bool:
"""Return True if the package with this name is virtual, False otherwise.
This function doesn't use the provider index.
Args:
pkg_name (str): name of the package we want to check
"""
return not self.exists(pkg_name) or self.get_pkg_class(pkg_name).virtual
def get_pkg_class(self, pkg_name):
def get_pkg_class(self, pkg_name: str) -> Type["spack.package_base.PackageBase"]:
"""Get the class for the package out of its module.
First loads (or fetches from cache) a module for the
@@ -1234,7 +1264,8 @@ def get_pkg_class(self, pkg_name):
fullname = f"{self.full_namespace}.{pkg_name}"
try:
module = importlib.import_module(fullname)
with REPOS_FINDER.switch_repo(self._finder or self):
module = importlib.import_module(fullname)
except ImportError:
raise UnknownPackageError(fullname)
except Exception as e:
@@ -1245,26 +1276,21 @@ def get_pkg_class(self, pkg_name):
if not inspect.isclass(cls):
tty.die(f"{pkg_name}.{class_name} is not a class")
new_cfg_settings = (
spack.config.get("packages").get(pkg_name, {}).get("package_attributes", {})
)
# Clear any prior changes to class attributes in case the class was loaded from the
# same repo, but with different overrides
overridden_attrs = getattr(cls, "overridden_attrs", {})
attrs_exclusively_from_config = getattr(cls, "attrs_exclusively_from_config", [])
# Clear any prior changes to class attributes in case the config has
# since changed
for key, val in overridden_attrs.items():
setattr(cls, key, val)
for key in attrs_exclusively_from_config:
delattr(cls, key)
# Keep track of every class attribute that is overridden by the config:
# if the config changes between calls to this method, we make sure to
# restore the original config values (in case the new config no longer
# sets attributes that it used to)
# Keep track of every class attribute that is overridden: if different overrides
# dictionaries are used on the same physical repo, we make sure to restore the original
# config values
new_overridden_attrs = {}
new_attrs_exclusively_from_config = set()
for key, val in new_cfg_settings.items():
for key, val in self.overrides.get(pkg_name, {}).items():
if hasattr(cls, key):
new_overridden_attrs[key] = getattr(cls, key)
else:
@@ -1291,13 +1317,13 @@ def partition_package_name(self, pkg_name: str) -> Tuple[str, str]:
return namespace, pkg_name
def __str__(self):
return "[Repo '%s' at '%s']" % (self.namespace, self.root)
def __str__(self) -> str:
return f"Repo '{self.namespace}' at {self.root}"
def __repr__(self):
def __repr__(self) -> str:
return self.__str__()
def __contains__(self, pkg_name):
def __contains__(self, pkg_name: str) -> bool:
return self.exists(pkg_name)
@@ -1373,12 +1399,17 @@ def create_repo(root, namespace=None, subdir=packages_dir_name):
return full_path, namespace
def from_path(path: str) -> "Repo":
"""Returns a repository from the path passed as input. Injects the global misc cache."""
return Repo(path, cache=spack.caches.MISC_CACHE)
def create_or_construct(path, namespace=None):
"""Create a repository, or just return a Repo if it already exists."""
if not os.path.exists(path):
fs.mkdirp(path)
create_repo(path, namespace)
return Repo(path)
return from_path(path)
def _path(configuration=None):
@@ -1396,7 +1427,17 @@ def create(configuration):
repo_dirs = configuration.get("repos")
if not repo_dirs:
raise NoRepoConfiguredError("Spack configuration contains no package repositories.")
return RepoPath(*repo_dirs)
    overrides = {}
    for pkg_name, data in configuration.get("packages").items():
        if pkg_name == "all":
            continue
        value = data.get("package_attributes", {})
        if not value:
            continue
        overrides[pkg_name] = value

    return RepoPath(*repo_dirs, cache=spack.caches.MISC_CACHE, overrides=overrides)
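
For context, a sketch of the ``packages`` configuration shape this loop consumes (names and attributes are made up):

packages_config = {
    "all": {"compiler": ["gcc"]},                                 # skipped: "all"
    "mpich": {"package_attributes": {"maintainers": ["alice"]}},  # collected
    "zlib": {},                                                   # skipped: no attributes
}
# The loop above would produce: overrides == {"mpich": {"maintainers": ["alice"]}}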
#: Singleton repo path instance

View File

@@ -116,6 +116,8 @@ class Provenance(enum.IntEnum):
PACKAGE_PY = enum.auto()
# An installed spec
INSTALLED = enum.auto()
# lower provenance for installed git refs so concretizer prefers StandardVersion installs
INSTALLED_GIT_VERSION = enum.auto()
# A runtime injected from another package (e.g. a compiler)
RUNTIME = enum.auto()
@@ -844,8 +846,6 @@ def solve(self, setup, specs, reuse=None, output=None, control=None, allow_depre
parent_dir = os.path.dirname(__file__)
self.control.load(os.path.join(parent_dir, "concretize.lp"))
self.control.load(os.path.join(parent_dir, "heuristic.lp"))
if spack.config.CONFIG.get("concretizer:duplicates:strategy", "none") != "none":
self.control.load(os.path.join(parent_dir, "heuristic_separate.lp"))
self.control.load(os.path.join(parent_dir, "display.lp"))
if not setup.concretize_everything:
self.control.load(os.path.join(parent_dir, "when_possible.lp"))
@@ -1880,11 +1880,8 @@ def _spec_clauses(
)
clauses.append(f.variant_value(spec.name, vname, value))
if variant.propagate:
clauses.append(
f.variant_propagation_candidate(spec.name, vname, value, spec.name)
)
clauses.append(f.propagate(spec.name, fn.variant_value(vname, value)))
# Tell the concretizer that this is a possible value for the
# variant, to account for things like int/str values where we
@@ -1917,9 +1914,12 @@ def _spec_clauses(
for flag_type, flags in spec.compiler_flags.items():
for flag in flags:
clauses.append(f.node_flag(spec.name, flag_type, flag))
clauses.append(f.node_flag_source(spec.name, flag_type, spec.name))
if not spec.concrete and flag.propagate is True:
clauses.append(f.node_flag_propagate(spec.name, flag_type))
clauses.append(
f.propagate(
spec.name, fn.node_flag(flag_type, flag), fn.edge_types("link", "run")
)
)
# dependencies
if spec.concrete:
@@ -1939,6 +1939,11 @@ def _spec_clauses(
for virtual in virtuals:
clauses.append(fn.attr("virtual_on_incoming_edges", spec.name, virtual))
# If the spec is external and concrete, we allow all the libcs on the system
if spec.external and spec.concrete and using_libc_compatibility():
for libc in self.libcs:
clauses.append(fn.attr("compatible_libc", spec.name, libc.name, libc.version))
# add all clauses from dependencies
if transitive:
# TODO: Eventually distinguish 2 deps on the same pkg (build and link)
@@ -2067,7 +2072,7 @@ def define_ad_hoc_versions_from_specs(
# best possible, so they're guaranteed to be used preferentially.
version = s.versions.concrete
if version is None or any(v == version for v in self.possible_versions[s.name]):
if version is None or (any((v == version) for v in self.possible_versions[s.name])):
continue
if require_checksum and not _is_checksummed_git_version(version):
@@ -2381,9 +2386,16 @@ def concrete_specs(self):
# - Add OS to possible OS's
for dep in spec.traverse():
self.possible_versions[dep.name].add(dep.version)
self.declared_versions[dep.name].append(
DeclaredVersion(version=dep.version, idx=0, origin=Provenance.INSTALLED)
)
if isinstance(dep.version, vn.GitVersion):
self.declared_versions[dep.name].append(
DeclaredVersion(
version=dep.version, idx=0, origin=Provenance.INSTALLED_GIT_VERSION
)
)
else:
self.declared_versions[dep.name].append(
DeclaredVersion(version=dep.version, idx=0, origin=Provenance.INSTALLED)
)
self.possible_oses.add(dep.os)
def define_concrete_input_specs(self, specs, possible):
@@ -2435,7 +2447,7 @@ def setup(
if using_libc_compatibility():
for libc in self.libcs:
self.gen.fact(fn.allowed_libc(libc.name, libc.version))
self.gen.fact(fn.host_libc(libc.name, libc.version))
if not allow_deprecated:
self.gen.fact(fn.deprecated_versions_not_allowed())
@@ -2732,9 +2744,7 @@ class _Head:
node_compiler = fn.attr("node_compiler_set")
node_compiler_version = fn.attr("node_compiler_version_set")
node_flag = fn.attr("node_flag_set")
node_flag_source = fn.attr("node_flag_source")
node_flag_propagate = fn.attr("node_flag_propagate")
variant_propagation_candidate = fn.attr("variant_propagation_candidate")
propagate = fn.attr("propagate")
class _Body:
@@ -2749,9 +2759,7 @@ class _Body:
node_compiler = fn.attr("node_compiler")
node_compiler_version = fn.attr("node_compiler_version")
node_flag = fn.attr("node_flag")
node_flag_source = fn.attr("node_flag_source")
node_flag_propagate = fn.attr("node_flag_propagate")
variant_propagation_candidate = fn.attr("variant_propagation_candidate")
propagate = fn.attr("propagate")
class ProblemInstanceBuilder:
@@ -3225,6 +3233,39 @@ def requires(self, impose: str, *, when: str):
self.runtime_conditions.add((imposed_spec, when_spec))
self.reset()

    def propagate(self, constraint_str: str, *, when: str):
        msg = "the 'propagate' method can be called only with pkg('*')"
        assert self.current_package == "*", msg

        when_spec = spack.spec.Spec(when)
        assert when_spec.name is None, "only anonymous when specs are accepted"

        placeholder = "XXX"
        node_variable = "node(ID, Package)"
        when_spec.name = placeholder

        body_clauses = self._setup.spec_clauses(when_spec, body=True)
        body_str = (
            f" {f',{os.linesep} '.join(str(x) for x in body_clauses)},\n"
            f" not external({node_variable}),\n"
            f" not runtime(Package)"
        ).replace(f'"{placeholder}"', f"{node_variable}")

        constraint_spec = spack.spec.Spec(constraint_str)
        assert constraint_spec.name is None, "only anonymous constraint specs are accepted"

        constraint_spec.name = placeholder
        constraint_clauses = self._setup.spec_clauses(constraint_spec, body=False)
        for clause in constraint_clauses:
            if clause.args[0] == "node_compiler_version_satisfies":
                self._setup.compiler_version_constraints.add(constraint_spec.compiler)
                args = f'"{constraint_spec.compiler.name}", "{constraint_spec.compiler.versions}"'
                head_str = f"propagate({node_variable}, node_compiler_version_satisfies({args}))"
                rule = f"{head_str} :-\n{body_str}.\n\n"
                self.rules.append(rule)

        self.reset()
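
Roughly, an emitted rule has this shape (compiler and version range are made up; the body clauses depend on the when-spec):

# propagate(node(ID, Package), node_compiler_version_satisfies("gcc", "12:")) :-
#     ...clauses derived from the anonymous when-spec...,
#     not external(node(ID, Package)),
#     not runtime(Package).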
def consume_facts(self):
"""Consume the facts collected by this object, and emits rules and
facts for the runtimes.
@@ -3304,6 +3345,8 @@ def hash(self, node, h):
def node(self, node):
if node not in self._specs:
self._specs[node] = spack.spec.Spec(node.pkg)
for flag_type in spack.spec.FlagMap.valid_compiler_flags():
self._specs[node].compiler_flags[flag_type] = []
def _arch(self, node):
arch = self._specs[node].architecture
@@ -3356,9 +3399,6 @@ def node_flag(self, node, flag_type, flag):
def node_flag_source(self, node, flag_type, source):
self._flag_sources[(node, flag_type)].add(source)
def no_flags(self, node, flag_type):
self._specs[node].compiler_flags[flag_type] = []
def external_spec_selected(self, node, idx):
"""This means that the external spec and index idx has been selected for this package."""
packages_yaml = _external_config_with_implicit_externals(spack.config.CONFIG)
@@ -3451,7 +3491,7 @@ def reorder_flags(self):
ordered_compiler_flags = list(llnl.util.lang.dedupe(from_compiler + from_sources))
compiler_flags = spec.compiler_flags.get(flag_type, [])
msg = "%s does not equal %s" % (set(compiler_flags), set(ordered_compiler_flags))
msg = f"{set(compiler_flags)} does not equal {set(ordered_compiler_flags)}"
assert set(compiler_flags) == set(ordered_compiler_flags), msg
spec.compiler_flags.update({flag_type: ordered_compiler_flags})
@@ -3521,9 +3561,8 @@ def build_specs(self, function_tuples):
# do not bother calling actions on it except for node_flag_source,
# since node_flag_source is tracking information not in the spec itself
spec = self._specs.get(args[0])
if spec and spec.concrete:
if name != "node_flag_source":
continue
if spec and spec.concrete and name != "node_flag_source":
continue
action(*args)
@@ -3794,12 +3833,6 @@ class Solver:
def __init__(self):
self.driver = PyclingoDriver()
self.selector = ReusableSpecsSelector(configuration=spack.config.CONFIG)
if spack.platforms.host().name == "cray":
msg = (
"The Cray platform, i.e. 'platform=cray', will be removed in Spack v0.23. "
"All Cray machines will be then detected as 'platform=linux'."
)
warnings.warn(msg)
@staticmethod
def _check_input_and_extract_concrete_specs(specs):

View File

@@ -29,7 +29,6 @@
:- attr("variant_value", PackageNode, _, _), not attr("node", PackageNode).
:- attr("node_flag_compiler_default", PackageNode), not attr("node", PackageNode).
:- attr("node_flag", PackageNode, _, _), not attr("node", PackageNode).
:- attr("no_flags", PackageNode, _), not attr("node", PackageNode).
:- attr("external_spec_selected", PackageNode, _), not attr("node", PackageNode).
:- attr("depends_on", ParentNode, _, _), not attr("node", ParentNode).
:- attr("depends_on", _, ChildNode, _), not attr("node", ChildNode).
@@ -256,6 +255,7 @@ possible_version_weight(node(ID, Package), Weight)
:- attr("version", node(ID, Package), Version),
version_weight(node(ID, Package), Weight),
not pkg_fact(Package, version_declared(Version, Weight, "installed")),
not pkg_fact(Package, version_declared(Version, Weight, "installed_git_version")),
not build(node(ID, Package)),
internal_error("Build version weight used for reused package").
@@ -811,37 +811,6 @@ node_has_variant(node(ID, Package), Variant) :-
pkg_fact(Package, variant(Variant)),
attr("node", node(ID, Package)).
% Variant propagation is forwarded to dependencies
attr("variant_propagation_candidate", PackageNode, Variant, Value, Source) :-
attr("node", PackageNode),
depends_on(ParentNode, PackageNode),
attr("variant_value", node(_, Source), Variant, Value),
attr("variant_propagation_candidate", ParentNode, Variant, _, Source).
% If the node is a candidate, and it has the variant and value,
% then those variant and value should be propagated
attr("variant_propagate", node(ID, Package), Variant, Value, Source) :-
attr("variant_propagation_candidate", node(ID, Package), Variant, Value, Source),
node_has_variant(node(ID, Package), Variant),
pkg_fact(Package, variant_possible_value(Variant, Value)),
not attr("variant_set", node(ID, Package), Variant).
% Propagate the value, if there is the corresponding attribute
attr("variant_value", PackageNode, Variant, Value) :- attr("variant_propagate", PackageNode, Variant, Value, _).
% If a variant is propagated, we cannot have extraneous values (this is for multi valued variants)
variant_is_propagated(PackageNode, Variant) :- attr("variant_propagate", PackageNode, Variant, _, _).
:- variant_is_propagated(PackageNode, Variant),
attr("variant_value", PackageNode, Variant, Value),
not attr("variant_propagate", PackageNode, Variant, Value, _).
% Cannot receive different values from different sources on the same variant
error(100, "{0} and {1} cannot both propagate variant '{2}' to package {3} with values '{4}' and '{5}'", Source1, Source2, Variant, Package, Value1, Value2) :-
attr("variant_propagate", node(X, Package), Variant, Value1, Source1),
attr("variant_propagate", node(X, Package), Variant, Value2, Source2),
node_has_variant(node(X, Package), Variant),
Value1 < Value2, Source1 < Source2.
% a variant cannot be set if it is not a variant on the package
error(100, "Cannot set variant '{0}' for package '{1}' because the variant condition cannot be satisfied for the given spec", Variant, Package)
:- attr("variant_set", node(X, Package), Variant),
@@ -919,7 +888,7 @@ variant_not_default(node(ID, Package), Variant, Value)
% variants set explicitly on the CLI don't count as non-default
not attr("variant_set", node(ID, Package), Variant, Value),
% variant values forced by propagation don't count as non-default
not attr("variant_propagate", node(ID, Package), Variant, Value, _),
not propagate(node(ID, Package), variant_value(Variant, Value)),
% variants set on externals that we could use don't count as non-default
% this makes spack prefer to use an external over rebuilding with the
% default configuration
@@ -932,7 +901,7 @@ variant_default_not_used(node(ID, Package), Variant, Value)
:- variant_default_value(Package, Variant, Value),
node_has_variant(node(ID, Package), Variant),
not attr("variant_value", node(ID, Package), Variant, Value),
not attr("variant_propagate", node(ID, Package), Variant, _, _),
not propagate(node(ID, Package), variant_value(Variant, _)),
attr("node", node(ID, Package)).
% The variant is set in an external spec
@@ -989,6 +958,101 @@ pkg_fact(Package, variant_single_value("dev_path"))
#defined variant_default_value/3.
#defined variant_default_value_from_packages_yaml/3.
%-----------------------------------------------------------------------------
% Propagation semantics
%-----------------------------------------------------------------------------
% Propagation roots have a corresponding attr("propagate", ...)
propagate(RootNode, PropagatedAttribute) :- attr("propagate", RootNode, PropagatedAttribute).
propagate(RootNode, PropagatedAttribute, EdgeTypes) :- attr("propagate", RootNode, PropagatedAttribute, EdgeTypes).

% Propagate an attribute along edges to child nodes
propagate(ChildNode, PropagatedAttribute) :-
  propagate(ParentNode, PropagatedAttribute),
  depends_on(ParentNode, ChildNode).

propagate(ChildNode, PropagatedAttribute, edge_types(DepType1, DepType2)) :-
  propagate(ParentNode, PropagatedAttribute, edge_types(DepType1, DepType2)),
  depends_on(ParentNode, ChildNode),
  1 { attr("depends_on", ParentNode, ChildNode, DepType1); attr("depends_on", ParentNode, ChildNode, DepType2) }.
%-----------------------------------------------------------------------------
% Activation of propagated values
%-----------------------------------------------------------------------------
%----
% Variants
%----
% If a variant is propagated, and can be accepted, set its value
attr("variant_value", node(ID, Package), Variant, Value) :-
propagate(node(ID, Package), variant_value(Variant, Value)),
node_has_variant(node(ID, Package), Variant),
pkg_fact(Package, variant_possible_value(Variant, Value)),
not attr("variant_set", node(ID, Package), Variant).
% If a variant is propagated, we cannot have extraneous values
variant_is_propagated(PackageNode, Variant) :-
attr("variant_value", PackageNode, Variant, Value),
propagate(PackageNode, variant_value(Variant, Value)),
not attr("variant_set", PackageNode, Variant).
:- variant_is_propagated(PackageNode, Variant),
attr("variant_value", PackageNode, Variant, Value),
not propagate(PackageNode, variant_value(Variant, Value)).
%----
% Flags
%----
% A propagated flag implies:
% 1. The same flag type is not set on this node
% 2. This node has the same compiler as the propagation source
propagated_flag(node(PackageID, Package), node_flag(FlagType, Flag), SourceNode) :-
  propagate(node(PackageID, Package), node_flag(FlagType, Flag), _),
  not attr("node_flag_set", node(PackageID, Package), FlagType, _),
  % Same compiler as propagation source
  node_compiler(node(PackageID, Package), CompilerID),
  node_compiler(SourceNode, CompilerID),
  attr("propagate", SourceNode, node_flag(FlagType, Flag), _),
  node(PackageID, Package) != SourceNode,
  not runtime(Package).

attr("node_flag", PackageNode, FlagType, Flag) :- propagated_flag(PackageNode, node_flag(FlagType, Flag), _).
attr("node_flag_source", PackageNode, FlagType, SourceNode) :- propagated_flag(PackageNode, node_flag(FlagType, _), SourceNode).

% Cannot propagate the same flag from two distinct sources
error(100, "{0} and {1} cannot both propagate compiler flags '{2}' to {3}", Source1, Source2, Package, FlagType) :-
  propagated_flag(node(ID, Package), node_flag(FlagType, _), node(_, Source1)),
  propagated_flag(node(ID, Package), node_flag(FlagType, _), node(_, Source2)),
  Source1 < Source2.
%----
% Compiler constraints
%----
attr("node_compiler_version_satisfies", node(ID, Package), Compiler, Version) :-
propagate(node(ID, Package), node_compiler_version_satisfies(Compiler, Version)),
node_compiler(node(ID, Package), CompilerID),
compiler_name(CompilerID, Compiler),
not runtime(Package),
not external(Package).
%-----------------------------------------------------------------------------
% Runtimes
%-----------------------------------------------------------------------------
% Check whether the DAG has any built package
has_built_packages() :- build(X), not external(X).
% If we build packages, the runtime nodes must use an available compiler
1 { node_compiler(PackageNode, CompilerID) : build(PackageNode), not external(PackageNode) } :-
  has_built_packages(),
  runtime(RuntimePackage),
  node_compiler(node(_, RuntimePackage), CompilerID).
%-----------------------------------------------------------------------------
% Platform semantics
%-----------------------------------------------------------------------------
@@ -1090,10 +1154,15 @@ attr("node_target", PackageNode, Target)
:- attr("node", PackageNode), attr("node_target_set", PackageNode, Target).
% each node has the weight of its assigned target
node_target_weight(node(ID, Package), Weight)
:- attr("node", node(ID, Package)),
attr("node_target", node(ID, Package), Target),
target_weight(Target, Weight).
target_weight(Target, 0)
:- attr("node", PackageNode),
attr("node_target", PackageNode, Target),
attr("node_target_set", PackageNode, Target).
node_target_weight(PackageNode, MinWeight)
:- attr("node", PackageNode),
attr("node_target", PackageNode, Target),
MinWeight = #min { Weight : target_weight(Target, Weight) }.
% compatibility rules for targets among nodes
node_target_match(ParentNode, DependencyNode)
@@ -1155,12 +1224,12 @@ error(10, "No valid compiler for {0} satisfies '%{1}'", Package, Compiler)
% If the compiler of a node must satisfy a constraint, then its version
% must be chosen among the ones that satisfy said constraint
error(100, "No valid version for '{0}' compiler '{1}' satisfies '@{2}'", Package, Compiler, Constraint)
error(100, "Package {0} cannot satisfy '%{1}@{2}'", Package, Compiler, Constraint)
:- attr("node", node(X, Package)),
attr("node_compiler_version_satisfies", node(X, Package), Compiler, Constraint),
not compiler_version_satisfies(Compiler, Constraint, _).
not compiler_version_satisfies(Compiler, Constraint, _).
error(100, "No valid version for '{0}' compiler '{1}' satisfies '@{2}'", Package, Compiler, Constraint)
error(100, "Package {0} cannot satisfy '%{1}@{2}'", Package, Compiler, Constraint)
:- attr("node", node(X, Package)),
attr("node_compiler_version_satisfies", node(X, Package), Compiler, Constraint),
not compiler_version_satisfies(Compiler, Constraint, ID),
@@ -1241,45 +1310,9 @@ error(100, "Compiler {1}@{2} requested for {0} cannot be found. Set install_miss
% Compiler flags
%-----------------------------------------------------------------------------
% propagate flags when compiler match
can_inherit_flags(PackageNode, DependencyNode, FlagType)
:- same_compiler(PackageNode, DependencyNode),
not attr("node_flag_set", DependencyNode, FlagType, _),
flag_type(FlagType).
same_compiler(PackageNode, DependencyNode)
:- depends_on(PackageNode, DependencyNode),
node_compiler(PackageNode, CompilerID),
node_compiler(DependencyNode, CompilerID),
compiler_id(CompilerID).
node_flag_inherited(DependencyNode, FlagType, Flag)
:- attr("node_flag_set", PackageNode, FlagType, Flag),
can_inherit_flags(PackageNode, DependencyNode, FlagType),
attr("node_flag_propagate", PackageNode, FlagType).
% Ensure propagation
:- node_flag_inherited(PackageNode, FlagType, Flag),
can_inherit_flags(PackageNode, DependencyNode, FlagType),
attr("node_flag_propagate", PackageNode, FlagType).
error(100, "{0} and {1} cannot both propagate compiler flags '{2}' to {3}", Source1, Source2, Package, FlagType) :-
depends_on(Source1, Package),
depends_on(Source2, Package),
attr("node_flag_propagate", Source1, FlagType),
attr("node_flag_propagate", Source2, FlagType),
can_inherit_flags(Source1, Package, FlagType),
can_inherit_flags(Source2, Package, FlagType),
Source1 < Source2.
% remember where flags came from
attr("node_flag_source", PackageNode, FlagType, PackageNode)
:- attr("node_flag_set", PackageNode, FlagType, _).
attr("node_flag_source", DependencyNode, FlagType, Q)
:- attr("node_flag_source", PackageNode, FlagType, Q),
node_flag_inherited(DependencyNode, FlagType, _),
attr("node_flag_propagate", PackageNode, FlagType).
attr("node_flag_source", PackageNode, FlagType, PackageNode) :- attr("node_flag_set", PackageNode, FlagType, _).
attr("node_flag_source", PackageNode, FlagType, PackageNode) :- attr("node_flag", PackageNode, FlagType, _), attr("hash", PackageNode, _).
% compiler flags from compilers.yaml are put on nodes if compiler matches
attr("node_flag", PackageNode, FlagType, Flag)
@@ -1299,15 +1332,8 @@ attr("node_flag_compiler_default", PackageNode)
compiler_name(CompilerID, CompilerName),
compiler_version(CompilerID, Version).
% if a flag is set to something or inherited, it's included
% Flag set to something
attr("node_flag", PackageNode, FlagType, Flag) :- attr("node_flag_set", PackageNode, FlagType, Flag).
attr("node_flag", PackageNode, FlagType, Flag) :- node_flag_inherited(PackageNode, FlagType, Flag).
% if no node flags are set for a type, there are no flags.
attr("no_flags", PackageNode, FlagType)
:- not attr("node_flag", PackageNode, FlagType, _),
attr("node", PackageNode),
flag_type(FlagType).
#defined compiler_flag/3.
@@ -1345,8 +1371,10 @@ build(PackageNode) :- not attr("hash", PackageNode, _), attr("node", PackageNode
% topmost-priority criterion to reuse what is installed.
%
% The priority ranges are:
% 200+ Shifted priorities for build nodes; correspond to priorities 0 - 99.
% 100 - 199 Unshifted priorities. Currently only includes minimizing #builds.
% 1000+ Optimizations for concretization errors
% 300 - 1000 Highest priority optimizations for valid solutions
% 200 - 299 Shifted priorities for build nodes; correspond to priorities 0 - 99.
% 100 - 199 Unshifted priorities. Currently only includes minimizing #builds and minimizing dupes.
% 0 - 99 Priorities for non-built nodes.
build_priority(PackageNode, 200) :- build(PackageNode), attr("node", PackageNode).
build_priority(PackageNode, 0) :- not build(PackageNode), attr("node", PackageNode).
@@ -1394,6 +1422,16 @@ build_priority(PackageNode, 0) :- not build(PackageNode), attr("node", Package
% 2. a `#minimize{ 0@2 : #true }.` statement that ensures the criterion
% is displayed (clingo doesn't display sums over empty sets by default)
% A condition group specifies one or more specs that must be satisfied.
% Specs declared first are preferred, so we assign increasing weights and
% minimize the weights.
opt_criterion(310, "requirement weight").
#minimize{ 0@310: #true }.
#minimize {
Weight@310,PackageNode,Group
: requirement_weight(PackageNode, Group, Weight)
}.
% Try hard to reuse installed packages (i.e., minimize the number built)
opt_criterion(110, "number of packages to build (vs. reuse)").
#minimize { 0@110: #true }.
@@ -1405,18 +1443,6 @@ opt_criterion(100, "number of nodes from the same package").
#minimize { ID@100,Package : attr("virtual_node", node(ID, Package)) }.
#defined optimize_for_reuse/0.
% A condition group specifies one or more specs that must be satisfied.
% Specs declared first are preferred, so we assign increasing weights and
% minimize the weights.
opt_criterion(75, "requirement weight").
#minimize{ 0@275: #true }.
#minimize{ 0@75: #true }.
#minimize {
Weight@75+Priority,PackageNode,Group
: requirement_weight(PackageNode, Group, Weight),
build_priority(PackageNode, Priority)
}.
% Minimize the number of deprecated versions being used
opt_criterion(73, "deprecated versions used").
#minimize{ 0@273: #true }.
@@ -1496,7 +1522,7 @@ opt_criterion(45, "preferred providers (non-roots)").
}.
% Try to minimize the number of compiler mismatches in the DAG.
opt_criterion(40, "compiler mismatches that are not from CLI").
opt_criterion(40, "compiler mismatches that are not required").
#minimize{ 0@240: #true }.
#minimize{ 0@40: #true }.
#minimize{
@@ -1506,7 +1532,7 @@ opt_criterion(40, "compiler mismatches that are not from CLI").
not runtime(Dependency)
}.
opt_criterion(39, "compiler mismatches that are not from CLI").
opt_criterion(39, "compiler mismatches that are required").
#minimize{ 0@239: #true }.
#minimize{ 0@39: #true }.
#minimize{

View File
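Taken together, these rules implement the spec-syntax distinction between cflags='-g' (set on one node) and cflags=='-g' (set and propagated to dependencies). A hedged sketch of the observable behavior, mirroring the concretization tests later in this diff (assumes a Spack session in which hypre and openblas concretize):

import spack.spec

# '==' propagates the flag down the DAG, so openblas gets it too ...
root = spack.spec.Spec("hypre cflags=='-g' ^openblas").concretized()
assert root.satisfies("cflags='-g'")
assert root["openblas"].satisfies("cflags='-g'")

# ... while a single '=' sets it on the root only.
root = spack.spec.Spec("hypre cflags='-g' ^openblas").concretized()
assert not root["openblas"].satisfies("cflags='-g'")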

@@ -4,21 +4,35 @@
% SPDX-License-Identifier: (Apache-2.0 OR MIT)
%=============================================================================
% Heuristic to speed-up solves (node with ID 0)
% Heuristic to speed-up solves
%=============================================================================
% No duplicates by default (most of them will be true)
#heuristic attr("node", node(PackageID, Package)). [100, init]
#heuristic attr("node", node(PackageID, Package)). [ 2, factor]
#heuristic attr("virtual_node", node(VirtualID, Virtual)). [100, init]
#heuristic attr("node", node(1..X-1, Package)) : max_dupes(Package, X), not virtual(Package), X > 1. [-1, sign]
#heuristic attr("virtual_node", node(1..X-1, Package)) : max_dupes(Package, X), virtual(Package) , X > 1. [-1, sign]
%-----------------
% Domain heuristic
%-----------------
% Pick preferred version
#heuristic attr("version", node(PackageID, Package), Version) : pkg_fact(Package, version_declared(Version, Weight)), attr("node", node(PackageID, Package)). [40, init]
#heuristic version_weight(node(PackageID, Package), 0) : pkg_fact(Package, version_declared(Version, 0 )), attr("node", node(PackageID, Package)). [ 1, sign]
#heuristic attr("version", node(PackageID, Package), Version) : pkg_fact(Package, version_declared(Version, 0 )), attr("node", node(PackageID, Package)). [ 1, sign]
#heuristic attr("version", node(PackageID, Package), Version) : pkg_fact(Package, version_declared(Version, Weight)), attr("node", node(PackageID, Package)), Weight > 0. [-1, sign]
% Root node
#heuristic attr("version", node(0, Package), Version) : pkg_fact(Package, version_declared(Version, 0)), attr("root", node(0, Package)). [35, true]
#heuristic version_weight(node(0, Package), 0) : pkg_fact(Package, version_declared(Version, 0)), attr("root", node(0, Package)). [35, true]
#heuristic attr("variant_value", node(0, Package), Variant, Value) : variant_default_value(Package, Variant, Value), attr("root", node(0, Package)). [35, true]
#heuristic attr("node_target", node(0, Package), Target) : target_weight(Target, 0), attr("root", node(0, Package)). [35, true]
#heuristic node_target_weight(node(0, Package), 0) : attr("root", node(0, Package)). [35, true]
#heuristic node_compiler(node(0, Package), CompilerID) : compiler_weight(ID, 0), compiler_id(ID), attr("root", node(0, Package)). [35, true]
% Use default variants
#heuristic attr("variant_value", node(PackageID, Package), Variant, Value) : variant_default_value(Package, Variant, Value), attr("node", node(PackageID, Package)). [40, true]
#heuristic attr("variant_value", node(PackageID, Package), Variant, Value) : not variant_default_value(Package, Variant, Value), attr("node", node(PackageID, Package)). [40, false]
% Providers
#heuristic attr("node", node(0, Package)) : default_provider_preference(Virtual, Package, 0), possible_in_link_run(Package). [30, true]
% Use default operating system and platform
#heuristic attr("node_os", node(PackageID, Package), OS) : os(OS, 0), attr("root", node(PackageID, Package)). [40, true]
#heuristic attr("node_platform", node(PackageID, Package), Platform) : allowed_platform(Platform), attr("root", node(PackageID, Package)). [40, true]
% Use default targets
#heuristic attr("node_target", node(PackageID, Package), Target) : target_weight(Target, Weight), attr("node", node(PackageID, Package)). [30, init]
#heuristic attr("node_target", node(PackageID, Package), Target) : target_weight(Target, Weight), attr("node", node(PackageID, Package)). [ 2, factor]
#heuristic attr("node_target", node(PackageID, Package), Target) : target_weight(Target, 0), attr("node", node(PackageID, Package)). [ 1, sign]
#heuristic attr("node_target", node(PackageID, Package), Target) : target_weight(Target, Weight), attr("node", node(PackageID, Package)), Weight > 0. [-1, sign]
% Use the default compilers
#heuristic node_compiler(node(PackageID, Package), ID) : compiler_weight(ID, 0), compiler_id(ID), attr("node", node(PackageID, Package)). [30, init]

View File

@@ -1,24 +0,0 @@
% Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
% Spack Project Developers. See the top-level COPYRIGHT file for details.
%
% SPDX-License-Identifier: (Apache-2.0 OR MIT)
%=============================================================================
% Heuristic to speed-up solves (node with ID > 0)
%=============================================================================
% node(ID, _)
#heuristic attr("version", node(ID, Package), Version) : pkg_fact(Package, version_declared(Version, 0)), attr("node", node(ID, Package)), ID > 0. [25-5*ID, true]
#heuristic version_weight(node(ID, Package), 0) : pkg_fact(Package, version_declared(Version, 0)), attr("node", node(ID, Package)), ID > 0. [25-5*ID, true]
#heuristic attr("variant_value", node(ID, Package), Variant, Value) : variant_default_value(Package, Variant, Value), attr("node", node(ID, Package)), ID > 0. [25-5*ID, true]
#heuristic attr("node_target", node(ID, Package), Target) : pkg_fact(Package, target_weight(Target, 0)), attr("node", node(ID, Package)), ID > 0. [25-5*ID, true]
#heuristic node_target_weight(node(ID, Package), 0) : attr("node", node(ID, Package)), ID > 0. [25-5*ID, true]
#heuristic node_compiler(node(ID, Package), CompilerID) : compiler_weight(CompilerID, 0), compiler_id(CompilerID), attr("node", node(ID, Package)), ID > 0. [25-5*ID, true]
% node(ID, _), split build dependencies
#heuristic attr("version", node(ID, Package), Version) : pkg_fact(Package, version_declared(Version, 0)), attr("node", node(ID, Package)), multiple_unification_sets(Package), ID > 0. [25, true]
#heuristic version_weight(node(ID, Package), 0) : pkg_fact(Package, version_declared(Version, 0)), attr("node", node(ID, Package)), multiple_unification_sets(Package), ID > 0. [25, true]
#heuristic attr("variant_value", node(ID, Package), Variant, Value) : variant_default_value(Package, Variant, Value), attr("node", node(ID, Package)), multiple_unification_sets(Package), ID > 0. [25, true]
#heuristic attr("node_target", node(ID, Package), Target) : pkg_fact(Package, target_weight(Target, 0)), attr("node", node(ID, Package)), multiple_unification_sets(Package), ID > 0. [25, true]
#heuristic node_target_weight(node(ID, Package), 0) : attr("node", node(ID, Package)), multiple_unification_sets(Package), ID > 0. [25, true]
#heuristic node_compiler(node(ID, Package), CompilerID) : compiler_weight(CompilerID, 0), compiler_id(CompilerID), attr("node", node(ID, Package)), multiple_unification_sets(Package), ID > 0. [25, true]

View File

@@ -10,25 +10,31 @@
%=============================================================================
% A package cannot be reused if the libc is not compatible with it
:- provider(node(X, LibcPackage), node(0, "libc")),
attr("version", node(X, LibcPackage), LibcVersion),
attr("hash", node(R, ReusedPackage), Hash),
% Libc packages can be reused without the "compatible_libc" attribute
ReusedPackage != LibcPackage,
not attr("compatible_libc", node(R, ReusedPackage), LibcPackage, LibcVersion).
% Check whether the DAG has any built package
has_built_packages() :- build(X), not external(X).
error(100, "Cannot reuse {0} since we cannot determine libc compatibility", ReusedPackage)
:- provider(node(X, LibcPackage), node(0, "libc")),
attr("version", node(X, LibcPackage), LibcVersion),
attr("hash", node(R, ReusedPackage), Hash),
% Libc packages can be reused without the "compatible_libc" attribute
ReusedPackage != LibcPackage,
not attr("compatible_libc", node(R, ReusedPackage), LibcPackage, LibcVersion).
% A libc is needed in the DAG
:- has_built_packages(), not provider(_, node(0, "libc")).
% The libc must be chosen among available ones
% Non-libc reused specs must be host libc compatible. In case we build packages, we get a
% host compatible libc provider from other rules. If nothing is built, there is no libc provider,
% since it's pruned from reusable specs, meaning we have to explicitly impose reused specs are host
% compatible.
:- attr("hash", node(R, ReusedPackage), Hash),
not provider(node(R, ReusedPackage), node(0, "libc")),
not attr("compatible_libc", node(R, ReusedPackage), _, _).
% The libc provider must be one that a compiler can target
:- has_built_packages(),
provider(node(X, LibcPackage), node(0, "libc")),
attr("node", node(X, LibcPackage)),
attr("version", node(X, LibcPackage), LibcVersion),
not allowed_libc(LibcPackage, LibcVersion).
not host_libc(LibcPackage, LibcVersion).
% A built node must depend on libc
:- build(PackageNode),

View File
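These constraints are exercised by test_cannot_reuse_host_incompatible_libc further down in this diff; condensed, the behavior they enforce looks roughly like the following sketch, which assumes the mock package repo and a glibc-based host as in that test:

import spack.solver.asp
import spack.spec
from spack.version import VersionList

# One reusable candidate whose glibc is too new for the host, one compatible.
incompatible = spack.spec.Spec("b@1").concretized()
incompatible._mark_concrete(False)
incompatible.dependencies("glibc")[0].versions = VersionList(["=2.30"])
incompatible._mark_concrete(True)
compatible = spack.spec.Spec("b@0").concretized()

solver = spack.solver.asp.Solver()
setup = spack.solver.asp.SpackSolverSetup()
result, _, _ = solver.driver.solve(setup, [spack.spec.Spec("b")], reuse=[incompatible, compatible])
assert result.specs[0] == compatible  # the host-incompatible candidate is rejected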

@@ -1287,6 +1287,102 @@ def copy(self, *args, **kwargs):
return self.wrapped_obj.copy(*args, **kwargs)
def tree(
specs: List["spack.spec.Spec"],
*,
color: Optional[bool] = None,
depth: bool = False,
hashes: bool = False,
hashlen: Optional[int] = None,
cover: str = "nodes",
indent: int = 0,
format: str = DEFAULT_FORMAT,
deptypes: Union[Tuple[str, ...], str] = "all",
show_types: bool = False,
depth_first: bool = False,
recurse_dependencies: bool = True,
status_fn: Optional[Callable[["Spec"], InstallStatus]] = None,
prefix: Optional[Callable[["Spec"], str]] = None,
key=id,
) -> str:
"""Prints out specs and their dependencies, tree-formatted with indentation.
Status function may either output a boolean or an InstallStatus
Args:
color: if True, always colorize the tree. If False, don't colorize the tree. If None,
use the default from llnl.tty.color
depth: print the depth from the root
hashes: if True, print the hash of each node
hashlen: length of the hash to be printed
cover: either "nodes" or "edges"
indent: extra indentation for the tree being printed
format: format to be used to print each node
deptypes: dependency types to be represented in the tree
show_types: if True, show the (merged) dependency type of a node
depth_first: if True, traverse the DAG depth first when representing it as a tree
recurse_dependencies: if True, recurse on dependencies
status_fn: optional callable that takes a node as an argument and return its
installation status
prefix: optional callable that takes a node as an argument and return its
installation prefix
"""
out = ""
if color is None:
color = clr.get_color_when()
for d, dep_spec in traverse.traverse_tree(
sorted(specs), cover=cover, deptype=deptypes, depth_first=depth_first, key=key
):
node = dep_spec.spec
if prefix is not None:
out += prefix(node)
out += " " * indent
if depth:
out += "%-4d" % d
if status_fn:
status = status_fn(node)
if status in list(InstallStatus):
out += clr.colorize(status.value, color=color)
elif status:
out += clr.colorize("@g{[+]} ", color=color)
else:
out += clr.colorize("@r{[-]} ", color=color)
if hashes:
out += clr.colorize("@K{%s} ", color=color) % node.dag_hash(hashlen)
if show_types:
if cover == "nodes":
# when only covering nodes, we merge dependency types
# from all dependents before showing them.
depflag = 0
for ds in node.edges_from_dependents():
depflag |= ds.depflag
else:
# when covering edges or paths, we show dependency
# types only for the edge through which we visited
depflag = dep_spec.depflag
type_chars = dt.flag_to_chars(depflag)
out += "[%s] " % type_chars
out += " " * d
if d > 0:
out += "^"
out += node.format(format, color=color) + "\n"
# Check if we wanted just the first line
if not recurse_dependencies:
break
return out
@lang.lazy_lexicographic_ordering(set_hash=False)
class Spec:
#: Cache for spec's prefix, computed lazily in the corresponding property
@@ -2816,9 +2912,7 @@ def _old_concretize(self, tests=False, deprecation_warning=True):
# Check if we can produce an optimized binary (will throw if
# there are declared inconsistencies)
# No need on platform=cray because of the targeting modules
if not self.satisfies("platform=cray"):
self.architecture.target.optimization_flags(self.compiler)
self.architecture.target.optimization_flags(self.compiler)
def _patches_assigned(self):
"""Whether patches have been assigned to this spec by the concretizer."""
@@ -4606,13 +4700,14 @@ def tree(
recurse_dependencies: bool = True,
status_fn: Optional[Callable[["Spec"], InstallStatus]] = None,
prefix: Optional[Callable[["Spec"], str]] = None,
key=id,
) -> str:
"""Prints out this spec and its dependencies, tree-formatted
with indentation.
"""Prints out this spec and its dependencies, tree-formatted with indentation.
Status function may either output a boolean or an InstallStatus
See multi-spec ``spack.spec.tree()`` function for details.
Args:
specs: List of specs to format.
color: if True, always colorize the tree. If False, don't colorize the tree. If None,
use the default from llnl.tty.color
depth: print the depth from the root
@@ -4630,60 +4725,23 @@ def tree(
prefix: optional callable that takes a node as an argument and return its
installation prefix
"""
out = ""
if color is None:
color = clr.get_color_when()
for d, dep_spec in traverse.traverse_tree(
[self], cover=cover, deptype=deptypes, depth_first=depth_first
):
node = dep_spec.spec
if prefix is not None:
out += prefix(node)
out += " " * indent
if depth:
out += "%-4d" % d
if status_fn:
status = status_fn(node)
if status in list(InstallStatus):
out += clr.colorize(status.value, color=color)
elif status:
out += clr.colorize("@g{[+]} ", color=color)
else:
out += clr.colorize("@r{[-]} ", color=color)
if hashes:
out += clr.colorize("@K{%s} ", color=color) % node.dag_hash(hashlen)
if show_types:
if cover == "nodes":
# when only covering nodes, we merge dependency types
# from all dependents before showing them.
depflag = 0
for ds in node.edges_from_dependents():
depflag |= ds.depflag
else:
# when covering edges or paths, we show dependency
# types only for the edge through which we visited
depflag = dep_spec.depflag
type_chars = dt.flag_to_chars(depflag)
out += "[%s] " % type_chars
out += " " * d
if d > 0:
out += "^"
out += node.format(format, color=color) + "\n"
# Check if we wanted just the first line
if not recurse_dependencies:
break
return out
return tree(
[self],
color=color,
depth=depth,
hashes=hashes,
hashlen=hashlen,
cover=cover,
indent=indent,
format=format,
deptypes=deptypes,
show_types=show_types,
depth_first=depth_first,
recurse_dependencies=recurse_dependencies,
status_fn=status_fn,
prefix=prefix,
key=key,
)
def __repr__(self):
return str(self)

View File
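The refactor above moves the tree-rendering loop into a module-level tree() that takes a list of specs, with Spec.tree delegating to it. A hedged usage sketch (package name illustrative):

import spack.spec

specs = [spack.spec.Spec("zlib").concretized()]
print(spack.spec.tree(specs, depth=True, hashes=True, hashlen=7))
# Spec.tree is now a thin wrapper over the same code path:
print(specs[0].tree(hashes=True, hashlen=7))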

@@ -212,10 +212,7 @@ def _expand_matrix_constraints(matrix_config):
results = []
for combo in itertools.product(*expanded_rows):
# Construct a combined spec to test against excludes
flat_combo = [constraint for constraint_list in combo for constraint in constraint_list]
# Resolve abstract hashes so we can exclude by their concrete properties
flat_combo = [Spec(x).lookup_hash() for x in flat_combo]
flat_combo = [Spec(constraint) for constraints in combo for constraint in constraints]
test_spec = flat_combo[0].copy()
for constraint in flat_combo[1:]:
@@ -231,7 +228,9 @@ def _expand_matrix_constraints(matrix_config):
spack.variant.substitute_abstract_variants(test_spec)
except spack.variant.UnknownVariantError:
pass
if any(test_spec.satisfies(x) for x in excludes):
# Resolve abstract hashes for exclusion criteria
if any(test_spec.lookup_hash().satisfies(x) for x in excludes):
continue
if sigil:

View File
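For reference, _expand_matrix_constraints operates on entries of roughly this shape; the change above defers abstract-hash resolution until the exclusion check. A hedged sketch of an environment matrix with an exclude list:

matrix_config = {
    "matrix": [["zlib", "libelf"], ["%gcc", "%clang"]],
    "exclude": ["libelf%clang"],
}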

@@ -346,8 +346,6 @@ class Stage(LockableStagingDir):
similar, and are intended to persist for only one run of spack.
"""
#: Most staging is managed by Spack. DIYStage is one exception.
needs_fetching = True
requires_patch_success = True
def __init__(
@@ -772,8 +770,6 @@ def __init__(self):
"cache_mirror",
"steal_source",
"disable_mirrors",
"needs_fetching",
"requires_patch_success",
]
)
@@ -812,6 +808,10 @@ def path(self):
def archive_file(self):
return self[0].archive_file
@property
def requires_patch_success(self):
return self[0].requires_patch_success
@property
def keep(self):
return self[0].keep
@@ -822,64 +822,7 @@ def keep(self, value):
item.keep = value
class DIYStage:
"""
Simple class that allows any directory to be a spack stage. Consequently,
it does not expect or require that the source path adhere to the standard
directory naming convention.
"""
needs_fetching = False
requires_patch_success = False
def __init__(self, path):
if path is None:
raise ValueError("Cannot construct DIYStage without a path.")
elif not os.path.isdir(path):
raise StagePathError("The stage path directory does not exist:", path)
self.archive_file = None
self.path = path
self.source_path = path
self.created = True
# DIY stages do nothing as context managers.
def __enter__(self):
pass
def __exit__(self, exc_type, exc_val, exc_tb):
pass
def fetch(self, *args, **kwargs):
tty.debug("No need to fetch for DIY.")
def check(self):
tty.debug("No checksum needed for DIY.")
def expand_archive(self):
tty.debug("Using source directory: {0}".format(self.source_path))
@property
def expanded(self):
"""Returns True since the source_path must exist."""
return True
def restage(self):
raise RestageError("Cannot restage a DIY stage.")
def create(self):
self.created = True
def destroy(self):
# No need to destroy DIY stage.
pass
def cache_local(self):
tty.debug("Sources for DIY stages are not cached")
class DevelopStage(LockableStagingDir):
needs_fetching = False
requires_patch_success = False
def __init__(self, name, dev_path, reference_link):

View File

@@ -2,16 +2,12 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import platform
import sys
import pytest
import archspec.cpu
import llnl.util.filesystem as fs
import spack.compilers
import spack.concretize
import spack.operating_systems
@@ -25,9 +21,8 @@ def current_host_platform():
"""Return the platform of the current host as detected by the
'platform' stdlib package.
"""
if os.path.exists("/opt/cray/pe"):
current_platform = spack.platforms.Cray()
elif "Linux" in platform.system():
current_platform = None
if "Linux" in platform.system():
current_platform = spack.platforms.Linux()
elif "Darwin" in platform.system():
current_platform = spack.platforms.Darwin()
@@ -222,28 +217,3 @@ def test_concretize_target_ranges(root_target_range, dep_target_range, result, m
with spack.concretize.disable_compiler_existence_check():
spec.concretize()
assert spec.target == spec["b"].target == result
@pytest.mark.parametrize(
"versions,default,expected",
[
(["21.11", "21.9"], "21.11", False),
(["21.11", "21.9"], "21.9", True),
(["21.11", "21.9"], None, False),
],
)
@pytest.mark.skipif(sys.platform == "win32", reason="Cray does not use windows")
def test_cray_platform_detection(versions, default, expected, tmpdir, monkeypatch, working_env):
ex_path = str(tmpdir.join("fake_craype_dir"))
fs.mkdirp(ex_path)
with fs.working_dir(ex_path):
for version in versions:
fs.touch(version)
if default:
os.symlink(default, "default")
monkeypatch.setattr(spack.platforms.cray, "_ex_craype_dir", ex_path)
os.environ["MODULEPATH"] = "/opt/cray/pe"
assert spack.platforms.cray.Cray.detect() == expected

View File

@@ -115,8 +115,8 @@ def default_config(tmpdir, config_directory, monkeypatch, install_mockery_mutabl
cfg = spack.config.Configuration(
*[
spack.config.ConfigScope(name, str(mutable_dir))
for name in ["site/%s" % platform.system().lower(), "site", "user"]
spack.config.DirectoryConfigScope(name, str(mutable_dir))
for name in [f"site/{platform.system().lower()}", "site", "user"]
]
)

View File
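The fixture now builds scopes with DirectoryConfigScope, which after this change is the class for directory-backed configuration scopes. A hedged sketch (paths illustrative):

import spack.config

scopes = [
    spack.config.DirectoryConfigScope(name, f"/tmp/spack-config/{name}")
    for name in ["site", "user"]
]
cfg = spack.config.Configuration(*scopes)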

@@ -556,24 +556,6 @@ def test_build_jobs_defaults():
)
def test_dirty_disable_module_unload(config, mock_packages, working_env, mock_module_cmd):
"""Test that on CRAY platform 'module unload' is not called if the 'dirty'
option is on.
"""
s = spack.spec.Spec("a").concretized()
# If called with "dirty" we don't unload modules, so no calls to the
# `module` function on Cray
spack.build_environment.setup_package(s.package, dirty=True)
assert not mock_module_cmd.calls
# If called without "dirty" we unload modules on Cray
spack.build_environment.setup_package(s.package, dirty=False)
assert mock_module_cmd.calls
assert any(("unload", "cray-libsci") == item[0] for item in mock_module_cmd.calls)
assert any(("unload", "cray-mpich") == item[0] for item in mock_module_cmd.calls)
class TestModuleMonkeyPatcher:
def test_getting_attributes(self, default_mock_concretization):
s = default_mock_concretization("libelf")

View File

@@ -12,21 +12,21 @@
def test_build_task_errors(install_mockery):
with pytest.raises(ValueError, match="must be a package"):
inst.BuildTask("abc", None, False, 0, 0, 0, [])
inst.BuildTask("abc", None, False, 0, 0, 0, set())
spec = spack.spec.Spec("trivial-install-test-package")
pkg_cls = spack.repo.PATH.get_pkg_class(spec.name)
with pytest.raises(ValueError, match="must have a concrete spec"):
inst.BuildTask(pkg_cls(spec), None, False, 0, 0, 0, [])
inst.BuildTask(pkg_cls(spec), None, False, 0, 0, 0, set())
spec.concretize()
assert spec.concrete
with pytest.raises(ValueError, match="must have a build request"):
inst.BuildTask(spec.package, None, False, 0, 0, 0, [])
inst.BuildTask(spec.package, None, False, 0, 0, 0, set())
request = inst.BuildRequest(spec.package, {})
with pytest.raises(inst.InstallError, match="Cannot create a build task"):
inst.BuildTask(spec.package, request, False, 0, 0, inst.STATUS_REMOVED, [])
inst.BuildTask(spec.package, request, False, 0, 0, inst.STATUS_REMOVED, set())
def test_build_task_basics(install_mockery):
@@ -36,8 +36,8 @@ def test_build_task_basics(install_mockery):
# Ensure key properties match expectations
request = inst.BuildRequest(spec.package, {})
task = inst.BuildTask(spec.package, request, False, 0, 0, inst.STATUS_ADDED, [])
assert task.explicit # package was "explicitly" requested
task = inst.BuildTask(spec.package, request, False, 0, 0, inst.STATUS_ADDED, set())
assert not task.explicit
assert task.priority == len(task.uninstalled_deps)
assert task.key == (task.priority, task.sequence)
@@ -58,7 +58,7 @@ def test_build_task_strings(install_mockery):
# Ensure key properties match expectations
request = inst.BuildRequest(spec.package, {})
task = inst.BuildTask(spec.package, request, False, 0, 0, inst.STATUS_ADDED, [])
task = inst.BuildTask(spec.package, request, False, 0, 0, inst.STATUS_ADDED, set())
# Cover __repr__
irep = task.__repr__()

View File
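The updated assertions reflect that BuildTask's final argument (previously a list) is now a set. A hedged construction sketch using the names from these tests (assumes the mock install fixtures):

import spack.installer as inst
import spack.spec

spec = spack.spec.Spec("trivial-install-test-package").concretized()
request = inst.BuildRequest(spec.package, {})
task = inst.BuildTask(spec.package, request, False, 0, 0, inst.STATUS_ADDED, set())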

@@ -828,14 +828,14 @@ def test_keep_and_replace(wrapper_environment):
),
(
"config:flags:keep_werror:specific",
["-Werror", "-Werror=specific", "-bah"],
["-Werror=specific", "-bah"],
["-Werror", "-Werror=specific", "-Werror-specific2", "-bah"],
["-Wno-error", "-Werror=specific", "-Werror-specific2", "-bah"],
["-Werror"],
),
(
"config:flags:keep_werror:none",
["-Werror", "-Werror=specific", "-bah"],
["-bah", "-Wno-error", "-Wno-error=specific"],
["-Wno-error", "-Wno-error=specific", "-bah"],
["-Werror", "-Werror=specific"],
),
# check non-standard -Werror opts like -Werror-implicit-function-declaration
@@ -848,13 +848,13 @@ def test_keep_and_replace(wrapper_environment):
(
"config:flags:keep_werror:specific",
["-Werror", "-Werror-implicit-function-declaration", "-bah"],
["-Werror-implicit-function-declaration", "-bah", "-Wno-error"],
["-Wno-error", "-Werror-implicit-function-declaration", "-bah"],
["-Werror"],
),
(
"config:flags:keep_werror:none",
["-Werror", "-Werror-implicit-function-declaration", "-bah"],
["-bah", "-Wno-error=implicit-function-declaration"],
["-Wno-error", "-bah", "-Wno-error=implicit-function-declaration"],
["-Werror", "-Werror-implicit-function-declaration"],
),
],

View File
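Each tuple above pairs a keep_werror setting with the wrapper's input flags, the flags it should emit, and flags that must not survive. A hedged sketch of toggling the option in-process (the config path is taken from the test IDs above):

import spack.config

# "all" keeps every -Werror* flag, "specific" keeps only -Werror=<diag> forms,
# and "none" rewrites -Werror* into -Wno-error equivalents.
spack.config.set("config:flags:keep_werror", "specific")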

@@ -2,7 +2,6 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import itertools
import os
import subprocess
@@ -11,15 +10,12 @@
import llnl.util.filesystem as fs
import spack.ci as ci
import spack.ci_needs_workaround as cinw
import spack.ci_optimization as ci_opt
import spack.config
import spack.environment as ev
import spack.error
import spack.paths as spack_paths
import spack.util.git
import spack.util.gpg
import spack.util.spack_yaml as syaml
@pytest.fixture
@@ -203,164 +199,6 @@ def __call__(self, *args, **kwargs):
assert "Unable to merge {0}".format(c1) in err
@pytest.mark.parametrize("obj, proto", [({}, [])])
def test_ci_opt_argument_checking(obj, proto):
"""Check that matches() and subkeys() return False when `proto` is not a dict."""
assert not ci_opt.matches(obj, proto)
assert not ci_opt.subkeys(obj, proto)
@pytest.mark.parametrize("yaml", [{"extends": 1}])
def test_ci_opt_add_extends_non_sequence(yaml):
"""Check that add_extends() exits if 'extends' is not a sequence."""
yaml_copy = yaml.copy()
ci_opt.add_extends(yaml, None)
assert yaml == yaml_copy
def test_ci_workarounds():
fake_root_spec = "x" * 544
fake_spack_ref = "x" * 40
common_variables = {"SPACK_IS_PR_PIPELINE": "False"}
common_before_script = [
'git clone "https://github.com/spack/spack"',
" && ".join(("pushd ./spack", 'git checkout "{ref}"'.format(ref=fake_spack_ref), "popd")),
'. "./spack/share/spack/setup-env.sh"',
]
def make_build_job(name, deps, stage, use_artifact_buildcache, optimize, use_dependencies):
variables = common_variables.copy()
variables["SPACK_JOB_SPEC_PKG_NAME"] = name
result = {
"stage": stage,
"tags": ["tag-0", "tag-1"],
"artifacts": {
"paths": ["jobs_scratch_dir", "cdash_report", name + ".spec.json", name],
"when": "always",
},
"retry": {"max": 2, "when": ["always"]},
"after_script": ['rm -rf "./spack"'],
"script": ["spack ci rebuild"],
"image": {"name": "spack/centos7", "entrypoint": [""]},
}
if optimize:
result["extends"] = [".c0", ".c1"]
else:
variables["SPACK_ROOT_SPEC"] = fake_root_spec
result["before_script"] = common_before_script
result["variables"] = variables
if use_dependencies:
result["dependencies"] = list(deps) if use_artifact_buildcache else []
else:
result["needs"] = [{"job": dep, "artifacts": use_artifact_buildcache} for dep in deps]
return {name: result}
def make_rebuild_index_job(use_artifact_buildcache, optimize, use_dependencies):
result = {
"stage": "stage-rebuild-index",
"script": "spack buildcache update-index s3://mirror",
"tags": ["tag-0", "tag-1"],
"image": {"name": "spack/centos7", "entrypoint": [""]},
"after_script": ['rm -rf "./spack"'],
}
if optimize:
result["extends"] = ".c0"
else:
result["before_script"] = common_before_script
return {"rebuild-index": result}
def make_factored_jobs(optimize):
return (
{
".c0": {"before_script": common_before_script},
".c1": {"variables": {"SPACK_ROOT_SPEC": fake_root_spec}},
}
if optimize
else {}
)
def make_stage_list(num_build_stages):
return {
"stages": (
["-".join(("stage", str(i))) for i in range(num_build_stages)]
+ ["stage-rebuild-index"]
)
}
def make_yaml_obj(use_artifact_buildcache, optimize, use_dependencies):
result = {}
result.update(
make_build_job(
"pkg-a", [], "stage-0", use_artifact_buildcache, optimize, use_dependencies
)
)
result.update(
make_build_job(
"pkg-b", ["pkg-a"], "stage-1", use_artifact_buildcache, optimize, use_dependencies
)
)
result.update(
make_build_job(
"pkg-c",
["pkg-a", "pkg-b"],
"stage-2",
use_artifact_buildcache,
optimize,
use_dependencies,
)
)
result.update(make_rebuild_index_job(use_artifact_buildcache, optimize, use_dependencies))
result.update(make_factored_jobs(optimize))
result.update(make_stage_list(3))
return result
# test every combination of:
# use artifact buildcache: true or false
# run optimization pass: true or false
# convert needs to dependencies: true or false
for use_ab in (False, True):
original = make_yaml_obj(
use_artifact_buildcache=use_ab, optimize=False, use_dependencies=False
)
for opt, deps in itertools.product(*(((False, True),) * 2)):
# neither optimizing nor converting needs->dependencies
if not (opt or deps):
# therefore, nothing to test
continue
predicted = make_yaml_obj(
use_artifact_buildcache=use_ab, optimize=opt, use_dependencies=deps
)
actual = original.copy()
if opt:
actual = ci_opt.optimizer(actual)
if deps:
actual = cinw.needs_to_dependencies(actual)
predicted = syaml.dump_config(ci_opt.sort_yaml_obj(predicted), default_flow_style=True)
actual = syaml.dump_config(ci_opt.sort_yaml_obj(actual), default_flow_style=True)
assert predicted == actual
def test_get_spec_filter_list(mutable_mock_env_path, config, mutable_mock_repo):
"""Test that given an active environment and list of touched pkgs,
we get the right list of possibly-changed env specs"""

View File

@@ -1432,55 +1432,6 @@ def test_ci_generate_override_runner_attrs(
assert the_elt["after_script"][0] == "post step one"
def test_ci_generate_with_workarounds(
tmpdir, mutable_mock_env_path, install_mockery, mock_packages, monkeypatch, ci_base_environment
):
"""Make sure the post-processing cli workarounds do what they should"""
filename = str(tmpdir.join("spack.yaml"))
with open(filename, "w") as f:
f.write(
"""\
spack:
specs:
- callpath%gcc@=9.5
mirrors:
some-mirror: https://my.fake.mirror
ci:
pipeline-gen:
- submapping:
- match: ['%gcc@9.5']
build-job:
tags:
- donotcare
image: donotcare
enable-artifacts-buildcache: true
"""
)
with tmpdir.as_cwd():
env_cmd("create", "test", "./spack.yaml")
outputfile = str(tmpdir.join(".gitlab-ci.yml"))
with ev.read("test"):
ci_cmd("generate", "--output-file", outputfile, "--dependencies")
with open(outputfile) as f:
contents = f.read()
yaml_contents = syaml.load(contents)
found_one = False
non_rebuild_keys = ["workflow", "stages", "variables", "rebuild-index"]
for ci_key in yaml_contents.keys():
if ci_key not in non_rebuild_keys:
found_one = True
job_obj = yaml_contents[ci_key]
assert "needs" not in job_obj
assert "dependencies" in job_obj
assert found_one is True
@pytest.mark.disable_clean_stage_check
def test_ci_rebuild_index(
tmpdir,

View File

@@ -11,6 +11,7 @@
import spack.caches
import spack.cmd.clean
import spack.environment as ev
import spack.main
import spack.package_base
import spack.stage
@@ -68,6 +69,20 @@ def test_function_calls(command_line, effects, mock_calls_for_clean):
assert mock_calls_for_clean[name] == (1 if name in effects else 0)
def test_env_aware_clean(mock_stage, install_mockery, mutable_mock_env_path, monkeypatch):
e = ev.create("test", with_view=False)
e.add("mpileaks")
e.concretize()
def fail(*args, **kwargs):
raise Exception("This should not have been called")
monkeypatch.setattr(spack.spec.Spec, "concretize", fail)
with e:
clean("mpileaks")
def test_remove_python_cache(tmpdir, monkeypatch):
cache_files = ["file1.pyo", "file2.pyc"]
source_file = "file1.py"

View File

@@ -125,18 +125,8 @@ def print_spack_cc(*args):
print(os.environ.get("CC", ""))
# `module unload cray-libsci` in test environment causes failure
# It does not fail for actual installs
# build_environment.py imports module directly, so we monkeypatch it there
# rather than in module_cmd
def mock_module_noop(*args):
pass
def test_dev_build_drop_in(tmpdir, mock_packages, monkeypatch, install_mockery, working_env):
monkeypatch.setattr(os, "execvp", print_spack_cc)
monkeypatch.setattr(spack.build_environment, "module", mock_module_noop)
with tmpdir.as_cwd():
output = dev_build("-b", "edit", "--drop-in", "sh", "dev-build-test-install@0.0.0")
assert "lib/spack/env" in output

View File

@@ -0,0 +1,46 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import spack.paths
import spack.repo
import spack.util.editor
from spack.build_systems import autotools, cmake
from spack.main import SpackCommand
edit = SpackCommand("edit")
def test_edit_packages(monkeypatch, mock_packages: spack.repo.RepoPath):
"""Test spack edit a b"""
path_a = mock_packages.filename_for_package_name("a")
path_b = mock_packages.filename_for_package_name("b")
called = False
def editor(*args: str, **kwargs):
nonlocal called
called = True
assert args[0] == path_a
assert args[1] == path_b
monkeypatch.setattr(spack.util.editor, "editor", editor)
edit("a", "b")
assert called
def test_edit_files(monkeypatch):
"""Test spack edit --build-system autotools cmake"""
called = False
def editor(*args: str, **kwargs):
nonlocal called
called = True
assert os.path.samefile(args[0], autotools.__file__)
assert os.path.samefile(args[1], cmake.__file__)
monkeypatch.setattr(spack.util.editor, "editor", editor)
edit("--build-system", "autotools", "cmake")
assert called

View File

@@ -434,7 +434,7 @@ def test_find_loaded(database, working_env):
output = find("--loaded", "--group")
assert output == ""
os.environ[uenv.spack_loaded_hashes_var] = ":".join(
os.environ[uenv.spack_loaded_hashes_var] = os.pathsep.join(
[x.dag_hash() for x in spack.store.STORE.db.query()]
)
output = find("--loaded")

View File
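The one-line change above is an instance of a pattern running through these tests: hard-coded ':' separators become os.pathsep so the assertions also hold on Windows. For example:

import os

hashes = ["abc123", "def456"]
joined = os.pathsep.join(hashes)  # "abc123:def456" on POSIX, "abc123;def456" on Windows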

@@ -57,9 +57,9 @@ def test_info_noversion(mock_packages, print_buffer):
@pytest.mark.parametrize(
"pkg_query,expected", [("zlib", "False"), ("gcc", "True (version, variants)")]
"pkg_query,expected", [("zlib", "False"), ("find-externals1", "True (version)")]
)
def test_is_externally_detectable(pkg_query, expected, parser, print_buffer):
def test_is_externally_detectable(mock_packages, pkg_query, expected, parser, print_buffer):
args = parser.parse_args(["--detectable", pkg_query])
spack.cmd.info.info(parser, args)

View File

@@ -4,6 +4,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import re
import sys
import pytest
@@ -17,101 +18,125 @@
install = SpackCommand("install")
location = SpackCommand("location")
pytestmark = pytest.mark.not_on_windows("does not run on windows")
def test_manpath_trailing_colon(
install_mockery, mock_fetch, mock_archive, mock_packages, working_env
):
(shell, set_command, commandsep) = (
("--bat", 'set "%s=%s"', "\n")
if sys.platform == "win32"
else ("--sh", "export %s=%s", ";")
)
"""Test that the commands generated by load add the MANPATH prefix
inspections. Also test that Spack correctly preserves the default/existing
manpath search path via a trailing colon"""
install("mpileaks")
sh_out = load("--sh", "mpileaks")
lines = sh_out.split("\n")
assert any(re.match(r"export MANPATH=.*:;", ln) for ln in lines)
sh_out = load(shell, "mpileaks")
lines = [line.strip("\n") for line in sh_out.split(commandsep)]
assert any(re.match(set_command % ("MANPATH", ".*" + os.pathsep), ln) for ln in lines)
os.environ["MANPATH"] = "/tmp/man" + os.pathsep
os.environ["MANPATH"] = "/tmp/man:"
sh_out = load("--sh", "mpileaks")
lines = sh_out.split("\n")
assert any(re.match(r"export MANPATH=.*:/tmp/man:;", ln) for ln in lines)
sh_out = load(shell, "mpileaks")
lines = [line.strip("\n") for line in sh_out.split(commandsep)]
assert any(
re.match(set_command % ("MANPATH", ".*" + os.pathsep + "/tmp/man" + os.pathsep), ln)
for ln in lines
)
def test_load_recursive(install_mockery, mock_fetch, mock_archive, mock_packages, working_env):
"""Test that `spack load` applies prefix inspections of its required runtime deps in
topo-order"""
install("mpileaks")
mpileaks_spec = spack.spec.Spec("mpileaks").concretized()
def test_load_shell(shell, set_command):
"""Test that `spack load` applies prefix inspections of its required runtime deps in
topo-order"""
install("mpileaks")
mpileaks_spec = spack.spec.Spec("mpileaks").concretized()
# Ensure our reference variable is cleed.
os.environ["CMAKE_PREFIX_PATH"] = "/hello:/world"
# Ensure our reference variable is clean.
os.environ["CMAKE_PREFIX_PATH"] = "/hello" + os.pathsep + "/world"
sh_out = load("--sh", "mpileaks")
csh_out = load("--csh", "mpileaks")
shell_out = load(shell, "mpileaks")
def extract_cmake_prefix_path(output, prefix):
return next(cmd for cmd in output.split(";") if cmd.startswith(prefix))[
len(prefix) :
].split(":")
def extract_value(output, variable):
match = re.search(set_command % variable, output, flags=re.MULTILINE)
value = match.group(1)
return value.split(os.pathsep)
# Map a prefix found in CMAKE_PREFIX_PATH back to a package name in mpileaks' DAG.
prefix_to_pkg = lambda prefix: next(
s.name for s in mpileaks_spec.traverse() if s.prefix == prefix
)
# Map a prefix found in CMAKE_PREFIX_PATH back to a package name in mpileaks' DAG.
prefix_to_pkg = lambda prefix: next(
s.name for s in mpileaks_spec.traverse() if s.prefix == prefix
)
paths_sh = extract_cmake_prefix_path(sh_out, prefix="export CMAKE_PREFIX_PATH=")
paths_csh = extract_cmake_prefix_path(csh_out, prefix="setenv CMAKE_PREFIX_PATH ")
paths_shell = extract_value(shell_out, "CMAKE_PREFIX_PATH")
# Shouldn't be a difference between loading csh / sh, so check they're the same.
assert paths_sh == paths_csh
# We should've prepended new paths, and keep old ones.
assert paths_shell[-2:] == ["/hello", "/world"]
# We should've prepended new paths, and keep old ones.
assert paths_sh[-2:] == ["/hello", "/world"]
# All but the last two paths are added by spack load; lookup what packages they're from.
pkgs = [prefix_to_pkg(p) for p in paths_shell[:-2]]
# All but the last two paths are added by spack load; lookup what packages they're from.
pkgs = [prefix_to_pkg(p) for p in paths_sh[:-2]]
# Do we have all the runtime packages?
assert set(pkgs) == set(
s.name for s in mpileaks_spec.traverse(deptype=("link", "run"), root=True)
)
# Do we have all the runtime packages?
assert set(pkgs) == set(
s.name for s in mpileaks_spec.traverse(deptype=("link", "run"), root=True)
)
# Finally, do we list them in topo order?
for i, pkg in enumerate(pkgs):
set(s.name for s in mpileaks_spec[pkg].traverse(direction="parents")) in set(pkgs[:i])
# Finally, do we list them in topo order?
for i, pkg in enumerate(pkgs):
set(s.name for s in mpileaks_spec[pkg].traverse(direction="parents")) in set(pkgs[:i])
# Lastly, do we keep track that mpileaks was loaded?
assert (
extract_value(shell_out, uenv.spack_loaded_hashes_var)[0] == mpileaks_spec.dag_hash()
)
return paths_shell
# Lastly, do we keep track that mpileaks was loaded?
assert f"export {uenv.spack_loaded_hashes_var}={mpileaks_spec.dag_hash()}" in sh_out
assert f"setenv {uenv.spack_loaded_hashes_var} {mpileaks_spec.dag_hash()}" in csh_out
if sys.platform == "win32":
shell, set_command = ("--bat", r'set "%s=(.*)"')
test_load_shell(shell, set_command)
else:
params = [("--sh", r"export %s=([^;]*)"), ("--csh", r"setenv %s ([^;]*)")]
shell, set_command = params[0]
paths_sh = test_load_shell(shell, set_command)
shell, set_command = params[1]
paths_csh = test_load_shell(shell, set_command)
assert paths_sh == paths_csh
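The set_command strings above do double duty: formatted with a variable name, they become regexes that extract that variable's value from the generated shell code. A hedged sketch of the mechanics:

import re

set_command = r"export %s=([^;]*)"  # the POSIX-shell pattern from this test
output = "export CMAKE_PREFIX_PATH=/spack/prefix:/hello:/world;"
match = re.search(set_command % "CMAKE_PREFIX_PATH", output, flags=re.MULTILINE)
print(match.group(1).split(":"))  # ['/spack/prefix', '/hello', '/world']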
def test_load_includes_run_env(install_mockery, mock_fetch, mock_archive, mock_packages):
@pytest.mark.parametrize(
"shell,set_command",
(
[("--bat", 'set "%s=%s"')]
if sys.platform == "win32"
else [("--sh", "export %s=%s"), ("--csh", "setenv %s %s")]
),
)
def test_load_includes_run_env(
shell, set_command, install_mockery, mock_fetch, mock_archive, mock_packages
):
"""Tests that environment changes from the package's
`setup_run_environment` method are added to the user environment in
addition to the prefix inspections"""
install("mpileaks")
sh_out = load("--sh", "mpileaks")
csh_out = load("--csh", "mpileaks")
shell_out = load(shell, "mpileaks")
assert "export FOOBAR=mpileaks" in sh_out
assert "setenv FOOBAR mpileaks" in csh_out
assert set_command % ("FOOBAR", "mpileaks") in shell_out
def test_load_first(install_mockery, mock_fetch, mock_archive, mock_packages):
"""Test with and without the --first option"""
shell = "--bat" if sys.platform == "win32" else "--sh"
install("libelf@0.8.12")
install("libelf@0.8.13")
# Now there are two versions of libelf, which should cause an error
out = load("--sh", "libelf", fail_on_error=False)
out = load(shell, "libelf", fail_on_error=False)
assert "matches multiple packages" in out
assert "Use a more specific spec" in out
# Using --first should avoid the error condition
load("--sh", "--first", "libelf")
load(shell, "--first", "libelf")
def test_load_fails_no_shell(install_mockery, mock_fetch, mock_archive, mock_packages):
@@ -122,7 +147,24 @@ def test_load_fails_no_shell(install_mockery, mock_fetch, mock_archive, mock_pac
assert "To set up shell support" in out
def test_unload(install_mockery, mock_fetch, mock_archive, mock_packages, working_env):
@pytest.mark.parametrize(
"shell,set_command,unset_command",
(
[("--bat", 'set "%s=%s"', 'set "%s="')]
if sys.platform == "win32"
else [("--sh", "export %s=%s", "unset %s"), ("--csh", "setenv %s %s", "unsetenv %s")]
),
)
def test_unload(
shell,
set_command,
unset_command,
install_mockery,
mock_fetch,
mock_archive,
mock_packages,
working_env,
):
"""Tests that any variables set in the user environment are undone by the
unload command"""
install("mpileaks")
@@ -130,16 +172,16 @@ def test_unload(install_mockery, mock_fetch, mock_archive, mock_packages, workin
# Set so unload has something to do
os.environ["FOOBAR"] = "mpileaks"
os.environ[uenv.spack_loaded_hashes_var] = "%s:%s" % (mpileaks_spec.dag_hash(), "garbage")
os.environ[uenv.spack_loaded_hashes_var] = ("%s" + os.pathsep + "%s") % (
mpileaks_spec.dag_hash(),
"garbage",
)
sh_out = unload("--sh", "mpileaks")
csh_out = unload("--csh", "mpileaks")
shell_out = unload(shell, "mpileaks")
assert "unset FOOBAR" in sh_out
assert "unsetenv FOOBAR" in csh_out
assert (unset_command % "FOOBAR") in shell_out
assert "export %s=garbage" % uenv.spack_loaded_hashes_var in sh_out
assert "setenv %s garbage" % uenv.spack_loaded_hashes_var in csh_out
assert set_command % (uenv.spack_loaded_hashes_var, "garbage") in shell_out
def test_unload_fails_no_shell(

View File

@@ -13,6 +13,7 @@
import spack.cmd.pkg
import spack.main
import spack.repo
import spack.util.file_cache
#: new fake package template
pkg_template = """\
@@ -34,13 +35,14 @@ def install(self, spec, prefix):
# Force all tests to use a git repository *in* the mock packages repo.
@pytest.fixture(scope="module")
def mock_pkg_git_repo(git, tmpdir_factory):
def mock_pkg_git_repo(git, tmp_path_factory):
"""Copy the builtin.mock repo and make a mutable git repo inside it."""
tmproot = tmpdir_factory.mktemp("mock_pkg_git_repo")
repo_path = tmproot.join("builtin.mock")
root_dir = tmp_path_factory.mktemp("mock_pkg_git_repo")
repo_dir = root_dir / "builtin.mock"
shutil.copytree(spack.paths.mock_packages_path, str(repo_dir))
shutil.copytree(spack.paths.mock_packages_path, str(repo_path))
mock_repo = spack.repo.RepoPath(str(repo_path))
repo_cache = spack.util.file_cache.FileCache(str(root_dir / "cache"))
mock_repo = spack.repo.RepoPath(str(repo_dir), cache=repo_cache)
mock_repo_packages = mock_repo.repos[0].packages_path
with working_dir(mock_repo_packages):
@@ -75,7 +77,7 @@ def mock_pkg_git_repo(git, tmpdir_factory):
git("rm", "-rf", "pkg-c")
git("-c", "commit.gpgsign=false", "commit", "-m", "change pkg-b, remove pkg-c, add pkg-d")
with spack.repo.use_repositories(str(repo_path)):
with spack.repo.use_repositories(str(repo_dir)):
yield mock_repo_packages

View File

@@ -38,7 +38,7 @@ def flake8_package(tmpdir):
change to the ``flake8`` mock package, yields the filename, then undoes the
change on cleanup.
"""
repo = spack.repo.Repo(spack.paths.mock_packages_path)
repo = spack.repo.from_path(spack.paths.mock_packages_path)
filename = repo.filename_for_package_name("flake8")
rel_path = os.path.dirname(os.path.relpath(filename, spack.paths.prefix))
tmp = tmpdir / rel_path / "flake8-ci-package.py"
@@ -54,7 +54,7 @@ def flake8_package(tmpdir):
@pytest.fixture
def flake8_package_with_errors(scope="function"):
"""A flake8 package with errors."""
repo = spack.repo.Repo(spack.paths.mock_packages_path)
repo = spack.repo.from_path(spack.paths.mock_packages_path)
filename = repo.filename_for_package_name("flake8")
tmp = filename + ".tmp"
@@ -130,7 +130,7 @@ def test_changed_files_all_files():
assert os.path.join(spack.paths.module_path, "spec.py") in files
# a mock package
repo = spack.repo.Repo(spack.paths.mock_packages_path)
repo = spack.repo.from_path(spack.paths.mock_packages_path)
filename = repo.filename_for_package_name("flake8")
assert filename in files

View File
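These fixtures track the repo API change running through this diff: Repo now takes an explicit file cache, and from_path is the convenience constructor that builds one. A hedged sketch (paths illustrative):

import spack.repo
import spack.util.file_cache

repo = spack.repo.from_path("/path/to/repo")             # builds its own cache
cache = spack.util.file_cache.FileCache("/path/to/cache")
repo2 = spack.repo.Repo("/path/to/repo", cache=cache)    # explicit cache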

@@ -3,12 +3,8 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Test detection of compiler version"""
import os
import pytest
import llnl.util.filesystem as fs
import spack.compilers.aocc
import spack.compilers.arm
import spack.compilers.cce
@@ -23,7 +19,6 @@
import spack.compilers.xl
import spack.compilers.xl_r
import spack.util.module_cmd
from spack.operating_systems.cray_frontend import CrayFrontend
@pytest.mark.parametrize(
@@ -413,48 +408,6 @@ def test_xl_version_detection(version_str, expected_version):
assert version == expected_version
@pytest.mark.not_on_windows("Not supported on Windows (yet)")
@pytest.mark.parametrize(
"compiler,version",
[
("gcc", "8.1.0"),
("gcc", "1.0.0-foo"),
("pgi", "19.1"),
("pgi", "19.1a"),
("intel", "9.0.0"),
("intel", "0.0.0-foobar"),
# ('oneapi', '2021.1'),
# ('oneapi', '2021.1-foobar')
],
)
def test_cray_frontend_compiler_detection(compiler, version, tmpdir, monkeypatch, working_env):
"""Test that the Cray frontend properly finds compilers form modules"""
# setup the fake compiler directory
compiler_dir = tmpdir.join(compiler)
compiler_exe = compiler_dir.join("cc").ensure()
fs.set_executable(str(compiler_exe))
# mock modules
def _module(cmd, *args):
module_name = "%s/%s" % (compiler, version)
module_contents = "prepend-path PATH %s" % compiler_dir
if cmd == "avail":
return module_name if compiler in args[0] else ""
if cmd == "show":
return module_contents if module_name in args else ""
monkeypatch.setattr(spack.operating_systems.cray_frontend, "module", _module)
# remove PATH variable
os.environ.pop("PATH", None)
# get a CrayFrontend object
cray_fe_os = CrayFrontend()
paths = cray_fe_os.compiler_search_paths
assert paths == [str(compiler_dir)]
@pytest.mark.parametrize(
"version_str,expected_version",
[

View File

@@ -24,11 +24,12 @@
import spack.platforms
import spack.repo
import spack.solver.asp
import spack.util.file_cache
import spack.util.libc
import spack.variant as vt
from spack.concretize import find_spec
from spack.spec import CompilerSpec, Spec
from spack.version import Version, ver
from spack.version import Version, VersionList, ver
def check_spec(abstract, concrete):
@@ -168,19 +169,18 @@ def reverser(pkg_name):
@pytest.fixture()
def repo_with_changing_recipe(tmpdir_factory, mutable_mock_repo):
def repo_with_changing_recipe(tmp_path_factory, mutable_mock_repo):
repo_namespace = "changing"
repo_dir = tmpdir_factory.mktemp(repo_namespace)
repo_dir = tmp_path_factory.mktemp(repo_namespace)
repo_dir.join("repo.yaml").write(
(repo_dir / "repo.yaml").write_text(
"""
repo:
namespace: changing
""",
ensure=True,
"""
)
packages_dir = repo_dir.ensure("packages", dir=True)
packages_dir = repo_dir / "packages"
root_pkg_str = """
class Root(Package):
homepage = "http://www.example.com"
@@ -191,7 +191,9 @@ class Root(Package):
conflicts("^changing~foo")
"""
packages_dir.join("root", "package.py").write(root_pkg_str, ensure=True)
package_py = packages_dir / "root" / "package.py"
package_py.parent.mkdir(parents=True)
package_py.write_text(root_pkg_str)
changing_template = """
class Changing(Package):
@@ -225,7 +227,9 @@ class _ChangingPackage:
def __init__(self, repo_directory):
self.repo_dir = repo_directory
self.repo = spack.repo.Repo(str(repo_directory))
cache_dir = tmp_path_factory.mktemp("cache")
self.repo_cache = spack.util.file_cache.FileCache(str(cache_dir))
self.repo = spack.repo.Repo(str(repo_directory), cache=self.repo_cache)
def change(self, changes=None):
changes = changes or {}
@@ -246,10 +250,12 @@ def change(self, changes=None):
# Change the recipe
t = jinja2.Template(changing_template)
changing_pkg_str = t.render(**context)
packages_dir.join("changing", "package.py").write(changing_pkg_str, ensure=True)
package_py = packages_dir / "changing" / "package.py"
package_py.parent.mkdir(parents=True, exist_ok=True)
package_py.write_text(changing_pkg_str)
# Re-add the repository
self.repo = spack.repo.Repo(str(self.repo_dir))
self.repo = spack.repo.Repo(str(self.repo_dir), cache=self.repo_cache)
repository.put_first(self.repo)
_changing_pkg = _ChangingPackage(repo_dir)
@@ -421,30 +427,38 @@ def test_compiler_flags_differ_identical_compilers(self, mutable_config, clang12
@pytest.mark.only_clingo(
"Optional compiler propagation isn't deprecated for original concretizer"
)
def test_concretize_compiler_flag_propagate(self):
spec = Spec("hypre cflags=='-g' ^openblas")
spec.concretize()
assert spec.satisfies("^openblas cflags='-g'")
@pytest.mark.only_clingo(
"Optional compiler propagation isn't deprecated for original concretizer"
@pytest.mark.parametrize(
"spec_str,expected,not_expected",
[
# Simple flag propagation from the root
("hypre cflags=='-g' ^openblas", ["hypre cflags='-g'", "^openblas cflags='-g'"], []),
(
"hypre cflags='-g' ^openblas",
["hypre cflags='-g'", "^openblas"],
["^openblas cflags='-g'"],
),
# Setting a flag overrides propagation
(
"hypre cflags=='-g' ^openblas cflags='-O3'",
["hypre cflags='-g'", "^openblas cflags='-O3'"],
["^openblas cflags='-g'"],
),
# Propagation doesn't go across build dependencies
(
"cmake-client cflags=='-O2 -g'",
["cmake-client cflags=='-O2 -g'", "^cmake"],
["cmake cflags=='-O2 -g'"],
),
],
)
def test_concretize_compiler_flag_does_not_propagate(self):
spec = Spec("hypre cflags='-g' ^openblas")
spec.concretize()
def test_compiler_flag_propagation(self, spec_str, expected, not_expected):
root = Spec(spec_str).concretized()
assert not spec.satisfies("^openblas cflags='-g'")
for constraint in expected:
assert root.satisfies(constraint)
@pytest.mark.only_clingo(
"Optional compiler propagation isn't deprecated for original concretizer"
)
def test_concretize_propagate_compiler_flag_not_passed_to_dependent(self):
spec = Spec("hypre cflags=='-g' ^openblas cflags='-O3'")
spec.concretize()
assert set(spec.compiler_flags["cflags"]) == set(["-g"])
assert spec.satisfies("^openblas cflags='-O3'")
for constraint in not_expected:
assert not root.satisfies(constraint)
def test_mixing_compilers_only_affects_subdag(self):
spack.config.set("packages:all:compiler", ["clang", "gcc"])
@@ -1767,21 +1781,21 @@ def test_reuse_with_unknown_package_dont_raise(self, tmpdir, temporary_store, mo
assert s.namespace == "builtin.mock"
@pytest.mark.parametrize(
"specs,expected",
"specs,expected,libc_offset",
[
(["libelf", "libelf@0.8.10"], 1),
(["libdwarf%gcc", "libelf%clang"], 2),
(["libdwarf%gcc", "libdwarf%clang"], 3),
(["libdwarf^libelf@0.8.12", "libdwarf^libelf@0.8.13"], 4),
(["hdf5", "zmpi"], 3),
(["hdf5", "mpich"], 2),
(["hdf5^zmpi", "mpich"], 4),
(["mpi", "zmpi"], 2),
(["mpi", "mpich"], 1),
(["libelf", "libelf@0.8.10"], 1, 1),
(["libdwarf%gcc", "libelf%clang"], 2, 1),
(["libdwarf%gcc", "libdwarf%clang"], 3, 2),
(["libdwarf^libelf@0.8.12", "libdwarf^libelf@0.8.13"], 4, 1),
(["hdf5", "zmpi"], 3, 1),
(["hdf5", "mpich"], 2, 1),
(["hdf5^zmpi", "mpich"], 4, 1),
(["mpi", "zmpi"], 2, 1),
(["mpi", "mpich"], 1, 1),
],
)
@pytest.mark.only_clingo("Original concretizer cannot concretize in rounds")
def test_best_effort_coconcretize(self, specs, expected):
def test_best_effort_coconcretize(self, specs, expected, libc_offset):
specs = [Spec(s) for s in specs]
solver = spack.solver.asp.Solver()
solver.reuse = False
@@ -1790,7 +1804,9 @@ def test_best_effort_coconcretize(self, specs, expected):
for s in result.specs:
concrete_specs.update(s.traverse())
libc_offset = 1 if spack.solver.asp.using_libc_compatibility() else 0
if not spack.solver.asp.using_libc_compatibility():
libc_offset = 0
assert len(concrete_specs) == expected + libc_offset
@pytest.mark.parametrize(
@@ -2546,6 +2562,53 @@ def test_include_specs_from_externals_and_libcs(
assert result["deprecated-versions"].satisfies("@1.0.0")
@pytest.mark.regression("44085")
@pytest.mark.only_clingo("Use case not supported by the original concretizer")
def test_can_reuse_concrete_externals_for_dependents(self, mutable_config, tmp_path):
"""Test that external specs that are in the DB can be reused. This means they are
preferred to concretizing another external from packages.yaml
"""
packages_yaml = {
"externaltool": {"externals": [{"spec": "externaltool@2.0", "prefix": "/fake/path"}]}
}
mutable_config.set("packages", packages_yaml)
# Concretize with gcc@9 to get a suboptimal spec, since we have gcc@10 available
external_spec = Spec("externaltool@2 %gcc@9").concretized()
assert external_spec.external
root_specs = [Spec("sombrero")]
with spack.config.override("concretizer:reuse", True):
solver = spack.solver.asp.Solver()
setup = spack.solver.asp.SpackSolverSetup()
result, _, _ = solver.driver.solve(setup, root_specs, reuse=[external_spec])
assert len(result.specs) == 1
sombrero = result.specs[0]
assert sombrero["externaltool"].dag_hash() == external_spec.dag_hash()
@pytest.mark.only_clingo("Original concretizer cannot reuse")
def test_cannot_reuse_host_incompatible_libc(self):
"""Test whether reuse concretization correctly fails to reuse a spec with a host
incompatible libc."""
if not spack.solver.asp.using_libc_compatibility():
pytest.skip("This test requires libc nodes")
# We install b@1 ^glibc@2.30, and b@0 ^glibc@2.28. The former is not host compatible, the
# latter is.
fst = Spec("b@1").concretized()
fst._mark_concrete(False)
fst.dependencies("glibc")[0].versions = VersionList(["=2.30"])
fst._mark_concrete(True)
snd = Spec("b@0").concretized()
# The spec b@1 ^glibc@2.30 is "more optimal" than b@0 ^glibc@2.28, but due to glibc
# incompatibility, it should not be reused.
solver = spack.solver.asp.Solver()
setup = spack.solver.asp.SpackSolverSetup()
result, _, _ = solver.driver.solve(setup, [Spec("b")], reuse=[fst, snd])
assert len(result.specs) == 1
assert result.specs[0] == snd
@pytest.fixture()
def duplicates_test_repository():
@@ -2996,3 +3059,45 @@ def test_spec_filters(specs, include, exclude, expected):
factory=lambda: specs, is_usable=lambda x: True, include=include, exclude=exclude
)
assert f.selected_specs() == expected
@pytest.mark.only_clingo("clingo only reuse feature being tested")
@pytest.mark.regression("38484")
def test_git_ref_version_can_be_reused(
install_mockery_mutable_config, do_not_check_runtimes_on_reuse
):
first_spec = spack.spec.Spec("git-ref-package@git.2.1.5=2.1.5~opt").concretized()
first_spec.package.do_install(fake=True, explicit=True)
with spack.config.override("concretizer:reuse", True):
# Reproducer for the issue: spack must solve anew when there is a change to the base spec
second_spec = spack.spec.Spec("git-ref-package@git.2.1.5=2.1.5+opt").concretized()
assert second_spec.dag_hash() != first_spec.dag_hash()
# We also want to confirm that reuse actually works, so leave the variant off
# to let the solver reuse
third_spec = spack.spec.Spec("git-ref-package@git.2.1.5=2.1.5")
assert first_spec.satisfies(third_spec)
third_spec.concretize()
assert third_spec.dag_hash() == first_spec.dag_hash()
@pytest.mark.only_clingo("clingo only reuse feature being tested")
@pytest.mark.parametrize("standard_version", ["2.0.0", "2.1.5", "2.1.6"])
def test_reuse_prefers_standard_over_git_versions(
standard_version, install_mockery_mutable_config, do_not_check_runtimes_on_reuse
):
"""
Order matters in this test. Typically reuse would pick the highest-versioned installed
match, but we want to prefer the standard version over git-ref-based versions, so we
install the git ref last and ensure it is not picked up by reuse.
"""
standard_spec = spack.spec.Spec(f"git-ref-package@{standard_version}").concretized()
standard_spec.package.do_install(fake=True, explicit=True)
git_spec = spack.spec.Spec("git-ref-package@git.2.1.5=2.1.5").concretized()
git_spec.package.do_install(fake=True, explicit=True)
with spack.config.override("concretizer:reuse", True):
test_spec = spack.spec.Spec("git-ref-package@2").concretized()
assert git_spec.dag_hash() != test_spec.dag_hash()
assert standard_spec.dag_hash() == test_spec.dag_hash()

View File

@@ -79,13 +79,13 @@ def test_external_nodes_do_not_have_runtimes(runtime_repo, mutable_config, tmp_p
[
# The reused runtime is older than we need, thus we'll add a more recent one for a
("a%gcc@10.2.1", "b%gcc@9.4.0", {"a": "gcc-runtime@10.2.1", "b": "gcc-runtime@9.4.0"}, 2),
# The root is compiled with an older compiler, thus we'll reuse the runtime from b
("a%gcc@9.4.0", "b%gcc@10.2.1", {"a": "gcc-runtime@10.2.1", "b": "gcc-runtime@10.2.1"}, 1),
# The root is compiled with an older compiler, thus we'll NOT reuse the runtime from b
("a%gcc@9.4.0", "b%gcc@10.2.1", {"a": "gcc-runtime@9.4.0", "b": "gcc-runtime@9.4.0"}, 1),
# Same as before, but tests that we can reuse from a more generic target
pytest.param(
"a%gcc@9.4.0",
"b%gcc@10.2.1 target=x86_64",
{"a": "gcc-runtime@10.2.1 target=x86_64", "b": "gcc-runtime@10.2.1 target=x86_64"},
{"a": "gcc-runtime@9.4.0", "b": "gcc-runtime@9.4.0"},
1,
marks=pytest.mark.skipif(
str(archspec.cpu.host().family) != "x86_64", reason="test data is x86_64 specific"
@@ -102,13 +102,15 @@ def test_external_nodes_do_not_have_runtimes(runtime_repo, mutable_config, tmp_p
),
],
)
@pytest.mark.regression("44444")
def test_reusing_specs_with_gcc_runtime(root_str, reused_str, expected, nruntime, runtime_repo):
"""Tests that we can reuse specs with a "gcc-runtime" leaf node. In particular, checks
that the version semantics for gcc-runtime account for reused packages too.
Reusable runtime versions should be lower than, or equal to, those of parent nodes.
"""
root, reused_spec = _concretize_with_reuse(root_str=root_str, reused_str=reused_str)
assert f"{expected['b']}" in reused_spec
runtime_a = root.dependencies("gcc-runtime")[0]
assert runtime_a.satisfies(expected["a"])
runtime_b = root["b"].dependencies("gcc-runtime")[0]
@@ -123,8 +125,7 @@ def test_reusing_specs_with_gcc_runtime(root_str, reused_str, expected, nruntime
[
# Ensure that, whether we have multiple runtimes in the DAG or not,
# we always link only the latest version
("a%gcc@10.2.1", "b%gcc@9.4.0", ["gcc-runtime@10.2.1"], ["gcc-runtime@9.4.0"]),
("a%gcc@9.4.0", "b%gcc@10.2.1", ["gcc-runtime@10.2.1"], ["gcc-runtime@9.4.0"]),
("a%gcc@10.2.1", "b%gcc@9.4.0", ["gcc-runtime@10.2.1"], ["gcc-runtime@9.4.0"])
],
)
def test_views_can_handle_duplicate_runtime_nodes(

Some files were not shown because too many files have changed in this diff.