Compare commits

346 Commits

Todd Gamblin
0d0ff44e3e Spec: Remove _normal attribute and unused constructor arguments (#48119)
The `_normal` attribute on specs is no longer used and has no meaning.
It's left over from part of the original concretizer.

The `concrete` constructor argument is also not used by any part of core.

- [x] remove `_normal` attribute from `Spec`
- [x] remove `concrete` argument from `Spec.__init__`
- [x] remove unused `check_diamond_normalized_dag` function in tests
- [x] simplify `Spec` constructor and docstrings

I tried to add typing to `Spec` here, but it creates a huge number of type issues
because *most* things on `Spec` are optional. We probably need separate `Spec` and
`ConcreteSpec` classes before attempting that.
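
For illustration, a minimal sketch of the typing problem (hypothetical class shapes, not Spack's actual `Spec`): with a single class whose attributes are `Optional`, every access needs a `None` check, whereas a separate concrete class can promise the fields are set.

```python
from typing import Optional


class Spec:
    """Sketch: most fields may be unset until concretization."""

    def __init__(self) -> None:
        self.version: Optional[str] = None
        self.compiler: Optional[str] = None


class ConcreteSpec(Spec):
    """Sketch: a concrete spec guarantees every field is filled in."""

    def __init__(self, version: str, compiler: str) -> None:
        super().__init__()
        self.version = version
        self.compiler = compiler


def describe(spec: Spec) -> str:
    # With only the base class, mypy forces a None check on every access.
    if spec.version is None:
        raise ValueError("spec is not concrete")
    return spec.version
```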

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-12-14 15:24:22 +01:00
Wouter Deconinck
f4bfeb7ed8 sundials: fix missing comma in list (#48106) 2024-12-14 06:38:01 -07:00
Zack Galbreath
a16350df69 Fix typo in ci/README.md (#48114) 2024-12-14 06:22:46 -07:00
John W. Parent
a2981cff1f New Package: Trame (#47920)
* Add trame
2024-12-13 17:16:31 -06:00
Richard Berger
d2372f8eee Add libcxi and its dependencies (#47705) 2024-12-13 16:09:17 -07:00
Buldram
c310c2911a nim: fix deps, deprecate and patch old versions (#47934)
* nim: fix deps, deprecate and patch old versions
* Fix runtime dependencies for produced binaries:
  - Add -rpaths pointing at dependencies to std wrapper modules
  - Set version constraints for OpenSSL
  - Added SQLite3 variant for <2.0
* Parallelize build with make
* Deprecate 1.0.10 due to CVEs
* Deprecate 1.9.3 as it's an old development version
* Backport patch for CVE-2021-21372 to <1.2.10/<1.4.4,
  CVE-2021-21374 to 1.4.2 and CVE-2021-46872 to 1.4.*
* Avoid empty low ranges that include devel
* Add previously missing CVE comment
* Keep "link" type for dynamic libraries for MSVC
* Omit "run" type for library dependencies
* Disable SQLite variant by default
* Fix version ranges
   Had assumed they were exclusive, but they're inclusive
* Correct version range for sqlite variant
   Difference doesn't matter outside of development versions
* Move patches to use GitHub URLs instead of files
* Retry CI
* append ?full_index=1
2024-12-13 15:29:18 -07:00
Rémi Lacroix
d68747912d suite-sparse: Add version 7.8.3. (#48101) 2024-12-13 14:53:38 -07:00
Harmen Stoppels
107e4515bd tests: fix a few open(...) calls (#48113) 2024-12-13 21:38:48 +01:00
Massimiliano Culpo
af6526bb82 ga: add a pylint check to avoid adding open calls without encoding= (#48099) 2024-12-13 21:21:26 +01:00
Raffaele Solcà
dd8dff7872 add dla-future v0.7.0 (#48098) 2024-12-13 12:28:21 -07:00
Adam J. Stewart
82d4b391bf py-matplotlib: add v3.9.3, v3.9.4 (#48107)
* py-matplotlib: add v3.9.3, v3.9.4
* Add upper bound on meson
2024-12-13 11:54:19 -07:00
Harmen Stoppels
a07e372770 filter_file: make tempfile later (#48108)
* filter_file: make tempfile later

* also add a `.` after the filename
2024-12-13 11:49:32 -07:00
kwryankrattiger
d35202d83e VisIt: Patch to fix python module deps (#48097)
Previously the pip setup would delete the visitmodule during the install
step. This was fixed by forcing the pip setup to only run once before
the dependents are created.
2024-12-13 11:33:07 -07:00
Tamara Dahlgren
1c1d439a01 Circular import fix: spack.config -> spack.environment (#48057)
Fix by moving setup to spack.main
2024-12-13 18:44:08 +01:00
Massimiliano Culpo
d52be82c06 netcdf-cxx4: use https instead of ftp (#47875)
* netcdf-cxx4: use https instead of ftp
* Update url for v4.3.0
2024-12-13 09:00:40 -08:00
Dominic Hofer
2a0fc464c9 Remove maintainer (#48105) 2024-12-13 17:17:50 +01:00
Massimiliano Culpo
cd26331b19 superlu: add v7.0, add metis as a dependency (#48061) 2024-12-13 11:26:27 +01:00
Kevin Huck
f5934db96b Updating APEX package (#48017)
* Updating APEX package
Adding version 2.7.1 and adding OpenCL support flags

* Update var/spack/repos/builtin/packages/apex/package.py

Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>

* Update var/spack/repos/builtin/packages/apex/package.py

Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>

* Fixing recommended change typo

* Removing superfluous conflict

---------

Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>
2024-12-13 10:20:30 +01:00
Harmen Stoppels
11b86ca75c package_base.py: remove deprecated props (#48066) 2024-12-13 10:19:16 +01:00
kwryankrattiger
0c2b546825 Remove extraneous newline from reproducer output (#48076)
* Remove extraneous newline from reproducer output

* Convert print -> tty.info
2024-12-13 09:38:43 +01:00
kwryankrattiger
ef615bcc7e CI: Move the build stage to the project root instead of tmp (#47996)
On macOS the tmpdir fills up over time and isn't cleared properly.
Move the build stage to a location where it is cleared after each build.
2024-12-13 06:55:01 +00:00
Wouter Deconinck
0f21f24356 cmake: ~ownlibs ^libarchive compression+=bz2lib,lzma,zstd (#48079) 2024-12-12 23:47:29 -07:00
Brian Vanderwende
e59ee0768f grads: Support newer hdf5 versions for grads (#48058)
* Support newer hdf5 versions for grads

* Cover netcdf dependency too

* Adjusted placement of two comments
2024-12-13 00:33:04 -06:00
Chris Marsh
6cbd9dcf13 Add +pic variant by default such that consumers of the static version of pcre2 can use it in a shared library. Fixes #47614 (#48071) 2024-12-12 19:38:10 -06:00
kwryankrattiger
92dbb55703 Make Autotools build_system point at correct build_directory (#48072) 2024-12-12 15:58:18 -07:00
Marc T. Henry de Frahan
e84631473c Add openfast FPE trapping variant (#48042) 2024-12-12 15:28:04 -07:00
Satish Balay
c213a8c2a7 camp: restrict cub dependency to cuda versions older than 10 - as newer cuda versions already include cub (#48008)
Adding an additional dependency on cub pulls in a version of cub that is incompatible with the one provided by cuda.
2024-12-12 13:17:51 -08:00
Harmen Stoppels
526af1cbe7 Sprinkle open(..., encoding=utf-8) (#48006)
Add missing encoding=utf-8 to various open calls. This makes
files like spec.json, spack.yaml, spack.lock, config.yaml etc. locale-independent
w.r.t. text encoding. In practice this is not often an
issue since Python 3.7, where the C locale is promoted to
C.UTF-8. But it's better to enforce UTF-8 explicitly, since there is
no guarantee text files are written in the right encoding.

Also avoid opening in text mode if it can be avoided.
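
A minimal generic illustration of the pattern described above (not the actual diff; assumes a `spack.lock` file exists):

```python
import json

# Locale-dependent: the text encoding is whatever the active locale says.
with open("spack.lock") as f:
    data = json.load(f)

# Locale-independent: pin the encoding explicitly.
with open("spack.lock", encoding="utf-8") as f:
    data = json.load(f)

# Or avoid text mode entirely: json.load accepts a binary file and
# detects UTF-8/16/32 on its own.
with open("spack.lock", "rb") as f:
    data = json.load(f)
```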
2024-12-12 21:46:08 +01:00
Massimiliano Culpo
334a8b0991 ci: remove a custom implementation of a stdlib functionality (#48068) 2024-12-12 12:09:35 -08:00
Dom Heinzeller
1581922c9e cprnc: install rpath patch for v1.0.8 (#47913) 2024-12-12 20:31:40 +01:00
Harmen Stoppels
9cd2f0a536 filter_file: fix various bugs (#48038)
* `f.tell` on a `TextIOWrapper` does not return the offset in bytes, but
  an opaque integer that can only be used for `f.seek` on the same
  object. Spack assumed it was a byte offset (see the sketch after this list).
* Do not open in a locale dependent way, but assume utf-8 (and allow
  users to override that)
* Use tempfile to generate a backup/temporary file in a safe way
* Comparison between None and str is valid and on purpose.
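
A small standalone illustration of the first point (generic Python, not Spack's `filter_file` itself):

```python
import tempfile

with tempfile.NamedTemporaryFile("w", encoding="utf-8", delete=False) as f:
    f.write("héllo\nworld\n")
    path = f.name

# Text mode: tell() returns an opaque cookie only meant for seek() on the
# same object; it is not guaranteed to be a byte count.
with open(path, encoding="utf-8") as f:
    f.readline()
    cookie = f.tell()

# Binary mode: tell() is a real byte offset, safe for arithmetic.
with open(path, "rb") as f:
    f.readline()
    byte_offset = f.tell()

print(cookie, byte_offset)
```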
2024-12-12 20:07:39 +01:00
Harmen Stoppels
687766b8ab spec.parser / spec.token: improvements (#48063)
Follow-up to #47956 

* Rename `token.py` -> `tokenize.py`
* Rename `parser.py` -> `spec_parser.py`
* Move common code related to iterating over tokens into `tokenize.py`
* Add "unexpected character token" (i.e. `.`) to `SpecTokens` by default instead of having a separate tokenizer / regex.
2024-12-12 17:08:20 +01:00
Adam J. Stewart
396a701860 Python: deprecate 3.8 (#46913)
* Python: deprecate 3.8

* Remove preference for EOL Python versions

* Explicitly deprecate things requiring EOL Python

* More deprecations

* deprecate old versions of slepc, py-petsc4py, py-slepc4py in sync with old versions of petsc

---------

Co-authored-by: Satish Balay <balay@mcs.anl.gov>
2024-12-12 15:22:06 +01:00
Massimiliano Culpo
7105cc8c01 Make use of ^ in 'depends_on' an error (#48062)
The use of `^` in `depends_on` directives has never been allowed, since
the dawn of Spack.

Up to now, we had an audit to catch this kind of issue, mainly because
that way we could easily collect all issues and report them to
packagers at once.

Due to implementation details, this audit doesn't work if a dependency
without a `^` is followed by the same dependency with a `^`.

This PR makes this pattern an error, which will be reported eagerly, and
removes the corresponding audit. It also fixes a package using the wrong
idiom.
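
For illustration only (a hypothetical recipe, not the package fixed in this PR), the disallowed pattern next to one accepted way to express the same intent:

```python
from spack.package import *


class Example(Package):
    """Hypothetical package illustrating the rule."""

    homepage = "https://example.com"
    url = "https://example.com/example-1.0.tar.gz"

    version("1.0", sha256="0" * 64)

    # Disallowed: '^' may not appear inside a depends_on directive.
    # depends_on("hdf5 ^mpich")

    # Accepted: declare each dependency and its constraints directly.
    depends_on("hdf5+mpi")
    depends_on("mpich")
```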

Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2024-12-12 14:19:05 +01:00
Wouter Deconinck
0ce38ed109 rivet: patch to fix missing headers (#48049) 2024-12-12 11:14:15 +01:00
Tamara Dahlgren
c548bcc9ef Environment: remove self import (#48056) 2024-12-12 10:30:19 +01:00
Tamara Dahlgren
f018e0fe42 Circular import fix: spack.schema.config -> spack.config (#48059)
fix by moving `merge_yaml` from `config.py` to `schema/__init__.py`
2024-12-12 10:07:53 +01:00
Tamara Dahlgren
9aefbb0e96 Circular import fix: spack.oci.opener -> spack.parser (#47956)
by splitting spack.parser into two modules
2024-12-12 10:02:07 +01:00
Marcel Koch
9265991767 ginkgo: add v1.9.0 (#47987)
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-12-11 16:53:01 -07:00
Dom Heinzeller
25cfea48f3 ESMF package: support clang with flang
Until now, the ESMF package recipe in Spack assumed that the Spack
compilers clang and apple-clang use gfortran as the Fortran
compiler. With the latest improvements to the LLVM compilers,
we also need to support clang with flang.
2024-12-11 16:47:52 -07:00
Andy Porter
fc4316cafa py-psyclone: add v3.0.0 (#47964)
* Update py-psyclone package to version 3.0.0
* Add Sergi and myself as maintainers
* correct case of url and add url_for_version() method
2024-12-11 11:51:14 -08:00
Scott Wittenburg
de1416b3de ci: Refactor pipeline generation (#47459)
Reorganize the pipeline generation aspect of the ci module,
mostly to separate the representation, generation, and
pruning of pipeline graphs from platform-specific output
formatting.

Introduce a pipeline generation registry to support generating
pipelines for other platforms, though gitlab is still the only
supported format currently.

Fix a long-standing bug in pipeline pruning where only direct
dependencies were added to any node's dependency list.
2024-12-11 19:23:37 +00:00
Harmen Stoppels
ba52c4f05d gha: fix git safe.directory config (#48041) 2024-12-11 19:11:44 +01:00
Rocco Meli
501ee68606 dbcsr: add v2.8.0 (#48035)
* dbcsr: add v2.8.0

* add myself as maintainer
2024-12-11 10:18:38 -07:00
Jon Rood
283eaaf323 amr-wind: remove unused cmake option (#48009)
* amr-wind: remove unused cmake option

* Style.
2024-12-11 09:28:31 -07:00
Wouter Deconinck
a3543008d9 qt-base: fix rpath for dependents (#47424)
ensure that CMAKE_INSTALL_RPATH_USE_LINK_PATH=ON works in qt packages.
2024-12-11 16:59:47 +01:00
Robert Cohn
f760e16688 umf only available with 2025 (#48027) 2024-12-11 16:33:56 +01:00
Harmen Stoppels
e9d2732e00 log.py: improve utf-8 handling, and non-utf-8 output (#48005) 2024-12-11 10:54:17 +01:00
Harmen Stoppels
03525528d6 llnl.path: make system_path_filter a noop on non-win32 (#48032) 2024-12-11 10:51:06 +01:00
Scott Wittenburg
a3985e7538 Revert "Set the "build_jobs" on concretization/generate for CI (#47660)" (#48028)
This reverts commit 316dcc1609.
2024-12-11 07:56:36 +01:00
Robert Cohn
ae28528ec7 sycl runtime needs umf (#48011) 2024-12-10 14:34:35 -08:00
Paul Kuberry
cb8880b388 Update compadre and py-pycompadre to v1.6.0 (#47948)
* compadre: add version 1.6.0
* py-pycompadre: add version 1.6.0
2024-12-10 13:29:31 -08:00
kwryankrattiger
316dcc1609 Set the "build_jobs" on concretization/generate for CI (#47660)
* Set the "build_jobs" on concretization/generate for CI

build_jobs also controls the concretization pool size. Set this
in the config section for CI generate.

This config is overwritten by the CI build jobs using the SPACK_BUILD_JOBS
environment variable. This will implicitly drop the default build
CPU request on all "default" grouped build jobs from (max) 16 to 8.

* Add default allocations for build jobs

* Add common jobs and concretize args to ci generate and rebuild

* CI: Specify parallel concretize and build jobs via argument

* Increase power and cray concretization limits

Lowering limits for these stacks creates timeouts
2024-12-10 14:13:23 -07:00
rfbgo
84ea7dbddf hp2p: new package (#47950)
* Add hp2p app definition
* update homepage
   Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
* update to fstring

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-12-10 13:03:54 -08:00
dmagdavector
b6e4ff0242 py-nbclassic: add v1.1.0 (#47946)
* py-nbclassic: add v1.1
* py-nbclassic: reduce explicit dependencies for v1.1.0
  Having all the 'excess' packages listed did not break anything, as
  they were needed for `py-jupyter-server` (pulled in via `py-notebook-shim`)
  anyway, but the change makes it clearer why things are being pulled in.
2024-12-10 13:45:44 -07:00
Xuefeng Ding
c23ffbbd7a geant4: patch typo in wroot (#47955)
* bug fix

* not just 10.4, all versions

* geant4: comment; close patch when range

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-12-10 14:36:01 -06:00
Luc Grosheintz
accd3ca860 highfive: add v2.10.1 (#47914)
* highfive: release 2.10.1
* Use sha256 of '.tar.gz'.

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-12-10 12:29:17 -08:00
dmagdavector
d47629a521 py-jupyterlab-server: add v2.23 to 2.27 (#47969)
* py-jupyterlab-server: add v2.23 to 2.27
* py-jupyterlab-server: style fixes
2024-12-10 12:06:25 -08:00
Lehman Garrison
7bb6c9b828 py-disbatch: add new package at version 3.0 (#47988) 2024-12-10 10:13:56 -08:00
Wouter Deconinck
7e5b5f8c57 veccore: add v0.8.2 (#47855)
* veccore: add v0.8.2

* Update cmake requirement

---------

Co-authored-by: Seth R Johnson <johnsonsr@ornl.gov>
2024-12-10 08:38:43 -07:00
Paul R. C. Kent
3a1c0f5c5f llvm: add v19.1.5 (#47897) 2024-12-10 15:19:10 +01:00
Massimiliano Culpo
b50dbb8604 pipelines: simplify and lint aws-pcluster-* (#47989) 2024-12-10 12:16:51 +01:00
Harmen Stoppels
30c00353d4 make level_zero variant consistent, add missing instances (#47985) 2024-12-10 09:25:30 +01:00
Tamara Dahlgren
466c3abaeb Remove remaining use of deprecated test callback (#47995) 2024-12-10 08:19:56 +01:00
Adam J. Stewart
478647f873 py-numpy: add v2.2.0 (#47999) 2024-12-09 23:39:00 -07:00
Adam J. Stewart
15f3851a92 py-scikit-learn: add v1.6.0 (#47998) 2024-12-09 22:59:24 -07:00
Laura Weber
5232ee1ed1 tecplot: updated hash for 2024r1m1 (#47886) 2024-12-09 18:31:35 -08:00
teddy
855943ff29 py-mgmetis: remove constraints 3.X for mpi4py & 1.X for numpy dependencies (#47890)
Co-authored-by: t. chantrait <teddy.chantrait@cea.fr>
2024-12-09 18:29:59 -08:00
Tobias Ribizel
449a462cde gurobi: add versions 11 and 12 (#47889)
This also means removing the Python support for these versions,
as the installation method was deprecated in favor of pip/conda
2024-12-09 18:27:30 -08:00
François Trahay
f3c6f00cc1 eztrace: new version for building from the dev branch of the git repository (#47891) 2024-12-09 18:25:54 -08:00
jgraciahlrs
42333ad66e extrae: relax requirements on binutils (#47893) 2024-12-09 18:24:15 -08:00
Luc Grosheintz
36f3566257 highfive: update maintainers. (#47896)
* highfive: update maintainers.
* switch maintainers.

---------

Co-authored-by: Nicolas Cornu <me+github@alkino.fr>
2024-12-09 18:22:10 -08:00
Adam J. Stewart
24fc720c0b py-twine: add v6.0.1 (#47899) 2024-12-09 18:18:19 -08:00
Andrey Alekseenko
fe0f4c1815 gromacs: support version 2024.4 (#47900)
And fix help formatting
2024-12-09 18:16:42 -08:00
Alberto Sartori
d68462ae8e justbuild: add version 1.4.1 (#47902) 2024-12-09 18:15:06 -08:00
Victor Lopez Herrero
0189e92329 dlb: add v3.5.0 (#47916) 2024-12-09 18:06:58 -08:00
Mark Abraham
8d83baa35e gromacs: conflict %apple-clang and +openmp (#47935) 2024-12-09 17:39:32 -08:00
Wouter Deconinck
12dd1208f3 geant4: add v11.3.0 (#47961)
* geant4: add v11.3.0
* geant4: rm deprecated 11.3.0.beta
* geant4: add 11.3.0 and associated data library versions
   - Data library versions taken from:
     - https://gitlab.cern.ch/geant4/geant4/-/blob/v11.3.0/cmake/Modules/G4DatasetDefinitions.cmake?ref_type=tags
   - Variants etc otherwise unchanged.
   - 11.3.0-beta version removed, release version marked as preferred.
* g4channeling: f-strings

---------

Co-authored-by: Ben Morgan <ben.morgan@warwick.ac.uk>
Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>
2024-12-09 15:01:45 -08:00
David Gardner
728c5e0e9d add main branch (#47952) 2024-12-09 14:54:38 -08:00
dmagdavector
c3e92a3d01 py-httpx: add v0.28, v0.28.1 (#47970)
* py-httpx: add v0.28, v0.28.1
* py-httpx: py-sniffio dependency only needed up to 0.27
2024-12-09 14:44:29 -08:00
Stephen Nicholas Swatman
49efa711d0 acts dependencies: new versions as of 2024/12/08 (#47981)
* acts dependencies: new versions as of 2024/12/08
 This commit includes a new version of ACTS, as well as new versions of
  the ACTS algebra plugins, covfie, detray, and geomodel.
* Fixes
* covfie: depends_on cmake@3.21: when @0.11:

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-12-09 15:08:47 -07:00
Wouter Deconinck
ab4a645cbe various pkgs: use https homepage when http redirects (github.io) (#47974)
* various pkgs: use https homepage when http redirects (github.io)
* py-owslib: fix homepage
* py-owslib: fix homepage to rtd
* h5io: fix homepage
2024-12-09 14:48:33 -07:00
Wouter Deconinck
7c74247f23 py-greenlet: remove preference for v2.0.2 (#47962)
* py-greenlet: remove preference for 2.0.2
* py-greenlet: conflicts with gcc@:7 when @3.1.1:
2024-12-09 12:26:16 -08:00
Matt Thompson
728f13d4b2 mapl: add v2.51.0 (#47968) 2024-12-09 12:21:05 -08:00
Wouter Deconinck
4d6347c99c node-js: patch for %gcc@12.[1-2] when @22.2:22.5 (#47979)
* node-js: patch for %gcc@12.[1-2] when @22.2:22
* node-js: avoid url patch (serial in common.gypi)
2024-12-09 12:19:09 -08:00
Wouter Deconinck
b2a86fcaba py-plac: add v1.4.3; restrict to python@:3.11 for older (#47982) 2024-12-09 11:24:51 -08:00
Kin Fai Tse
da83ab35e8 add soci 4.0.3 (#47983) 2024-12-09 11:17:36 -08:00
Alberto Invernizzi
9cb2070eeb gh: add v2.59.0 -> v2.63.2 (#47958)
* gh: bump versions

* update go requirement (good catch @alecbcs!)
see 8446079656
2024-12-09 09:43:10 -08:00
Wouter Deconinck
a72490fc91 coverage.yml: set fail_ci_if_error = false again (#47986) 2024-12-09 16:59:44 +01:00
Mikael Simberg
f15e5f7163 mold: Add 2.35.0 (#47984) 2024-12-09 04:02:51 -07:00
dependabot[bot]
fc105a1a26 build(deps): bump types-six in /.github/workflows/requirements/style (#47954)
Bumps [types-six](https://github.com/python/typeshed) from 1.16.21.20241105 to 1.17.0.20241205.
- [Commits](https://github.com/python/typeshed/commits)

---
updated-dependencies:
- dependency-name: types-six
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-12-08 19:34:20 -06:00
Stephen Sachs
8a9e16dc3b aws-pcluster stacks: static spack.yaml (#47918) 2024-12-08 20:26:08 +01:00
eugeneswalker
0b7fc360fa e4s ci: add lammps +rocm (#47929)
* e4s ci: add lammps +rocm

* e4s rocm external stack: add def for rocm-openmp-extras external

* lammps +rocm external rocm has errors, comment out
2024-12-08 10:21:28 -08:00
Wouter Deconinck
79d79969bb celeritas: patch 0.5.0 for geant4@11.3.0: (#47976) 2024-12-08 09:12:14 -05:00
Harmen Stoppels
422f829e4e mirrors: add missing init file (#47977) 2024-12-08 09:31:22 +01:00
Alec Scott
f54c101b44 py-jedi: add v0.19.2 (#47569) 2024-12-07 16:26:31 +01:00
Harmen Stoppels
05acd29f38 extensions.py: remove import of spack.cmd (#47963) 2024-12-07 10:08:04 +01:00
Wouter Deconinck
77e2187e13 coverage.yml: fail_ci_if_error = true (#47731) 2024-12-06 11:01:10 -08:00
Harmen Stoppels
5c88e035f2 directives.py: remove redundant import (#47965) 2024-12-06 19:18:12 +01:00
Harmen Stoppels
94bd7b9afb build_environment: drop off by one fix (#47960) 2024-12-06 17:01:46 +01:00
Stephen Herbener
f181ac199a Upgraded version specs for ECMWF packages: eckit, atlas, ectrans, fckit, fiat (#47749) 2024-12-05 18:46:56 -08:00
Sreenivasa Murthy Kolam
a8da7993ad Bump up the version for rocm-6.2.4 release (#47707)
* Bump up the version for rocm-6.2.4 release
2024-12-05 18:41:02 -08:00
Dom Heinzeller
b808338792 py-uxarray: new package plus dependencies (#47573)
* Add py-param@2.1.1
* Add py-panel@1.5.2
* Add py-bokeh@3.5.2
* New package py-datashader
* New package py-geoviews
* New package py-holoviews
* WIP: new package py-uxarray
* New package py-antimeridian
* New package py-dask-expr
* New package py-spatialpandas
* New package py-hvplot
* Add dependency on py-dask-expr for 'py-dask@2024.3: +dataframe'
* Added all dependencies for py-uxarray; still having problems with py-dask +dataframe / py-dask-expr
* Fix style errors in many packages
* Clean up comments and fix style errors in var/spack/repos/builtin/packages/py-dask-expr/package.py
* In var/spack/repos/builtin/packages/py-dask/package.py: since 2023.8, the dataframe variant requires the array variant
* Fix style errors in py-uxarray package
2024-12-05 18:20:55 -08:00
Massimiliano Culpo
112e47cc23 Don't inject import statements in package recipes
Remove a hack done by RepoLoader, which was injecting an extra
```
from spack.package import *
```
at the beginning of each package.py
2024-12-05 12:48:00 -08:00
Dom Heinzeller
901cea7a54 Add conflict for pixman with Intel Classic (#47922) 2024-12-05 18:14:57 +01:00
Massimiliano Culpo
c1b2ac549d solver: partition classes related to requirement parsing into their own file (#47915) 2024-12-05 18:10:06 +01:00
Harmen Stoppels
4693b323ac spack.mirror: split into submodules (#47936) 2024-12-05 18:09:08 +01:00
Kin Fai Tse
1f2a68f2b6 tar: conditionally link iconv (#47933)
* fix broken packages requiring iconv

* tar: -liconv only when libiconv

* Revert "fix broken packages requiring iconv"

This reverts commit 5fa426b52f.

---------

Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2024-12-05 10:09:18 -06:00
Juan Miguel Carceller
3fcc38ef04 pandoramonitoring,pandorasdk: change docstrings that are wrong (#47937)
and are copied from the pandorapfa package

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-12-05 08:53:09 -07:00
Harmen Stoppels
22d104d7a9 ci: add bootstrap stack for python@3.6:3.13 (#47719)
Resurrect latest Python 3.6
Add clingo-bootstrap to Gitlab CI.
2024-12-05 10:07:24 +01:00
Todd Gamblin
8b1009a4a0 resource: clean up arguments and typing
- [x] Clean up arguments on the `resource` directive.
- [x] Add type annotations
- [x] Add `resource` to type annotations on `PackageBase`
- [x] Fix up `resource` docstrings

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-12-04 22:49:18 -08:00
Todd Gamblin
f54526957a directives: add type annotations to DirectiveMeta class
Some of the class-level annotations were wrong, and some were missing. Annotate all the
functions here and fix the class properties to match what's actually happening.

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-12-04 22:49:18 -08:00
Todd Gamblin
175a4bf101 directives: use Type[PackageBase] instead of PackageBase
The first argument to each Spack directive is not a `PackageBase` instance but a
`PackageBase` class object, so fix the type annotations to reflect this.

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-12-04 22:49:18 -08:00
Todd Gamblin
aa81d59958 directives: don't include Optional in PatchesType
`Optional` shouldn't be part of `PatchesType` -- it's clearer to specify `Optional`
in the methods that need their arguments to be optional.

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-12-04 22:49:18 -08:00
James Taliaferro
6aafefd43d package version: Neovim 0.10.2 (#47925) 2024-12-04 23:17:55 +01:00
Satish Balay
ac82f344bd trilinos@develop: update kokkos dependency (#47838) 2024-12-04 19:53:38 +01:00
Harmen Stoppels
16fd77f9da rust-bootstrap: fix zlib dependency (#47894)
2024-12-04 02:28:19 -08:00
Harmen Stoppels
f82554a39b stage.py: improve path to url (#47898) 2024-12-04 09:41:38 +01:00
Massimiliano Culpo
2aaf50b8f7 eigen: remove unnecessary dependency on fortran (#47866) 2024-12-04 08:18:40 +01:00
Mathew Cleveland
b0b9cf15f7 add a '+no_warning' variant to METIS to prevent pervasive warning (#47452)
* add a '+no_warning' variant to metis to prevent pervasive warning
* fix formatting

---------

Co-authored-by: Cleveland <cleveland@lanl.gov>
Co-authored-by: mcourtois <mathieu.courtois@gmail.com>
2024-12-03 17:02:36 -08:00
v
8898e14e69 update py-numl and py-nugraph recipes (#47680)
* update py-numl and py-nugraph recipes

This commit adds the develop branch as a valid option for each of these two packages. In order to enable this, package tarballs are now retrieved from the GitHub source repository instead of PyPI, and their checksums and the build system have been updated accordingly.

* rename versions "develop" -> "main" to be consistent with branch name
2024-12-03 16:59:33 -08:00
Buldram
63c72634ea nim: add latest versions (#47844)
* nim: add latest versions
  In addition:
  - Create separate build and install phases.
  - Remove koch nimble call as it's redundant with koch tools.
  - Install all additional tools bundled with Nim instead of only Nimble.
* Fix 1.6 version
* nim: add devel
  In addition:
  - Fix build accessing user config/cache
2024-12-03 16:57:59 -08:00
Carson Woods
a7eacd77e3 bug fix: updated warning message to reflect impending v1.0 release (#47887) 2024-12-03 17:16:36 +01:00
Cédric Chevalier
09b7ea0400 Bump Kokkos and Kokkos-kernels to 4.5.00 (#47809)
* Bump Kokkos and Kokkos-kernels to 4.5.00

* petsc@:3.22 add a conflict with this new version of kokkos

* Update kokkos/kokkos-kernel dependency

---------

Co-authored-by: Satish Balay <balay@mcs.anl.gov>
2024-12-03 09:09:25 -07:00
Harmen Stoppels
b31dd46ab8 style.py: do not remove import spack in packages (#47895) 2024-12-03 16:04:18 +01:00
Harmen Stoppels
ad7417dee9 nwchem: add resource, remove patch (#47892)
fixes a build failure due to broken URL and improves nwchem build without internet
2024-12-03 14:09:05 +01:00
Wouter Deconinck
c3de3b0b6f tar: add v1.35 (fix CVEs) (#47426) 2024-12-03 13:26:04 +01:00
Harmen Stoppels
6da9bf226a python: drop nis module also for < 3.13 (#47862)
The nis module was removed in Python 3.13, we had it default to ~nis,
no package requires +nis, and the required dependencies for +nis were
missing.

So it is better to remove the nis module entirely.
2024-12-03 13:01:08 +01:00
Auriane R.
b3ee954e5b Remove duplicate version (#47880) 2024-12-03 10:14:47 +01:00
napulath
db090b0cad Update package.py (#47885) 2024-12-03 08:24:28 +01:00
Massimiliano Culpo
3a6c361a85 cgns: make fortran dependency optional (#47867) 2024-12-03 06:18:37 +01:00
Adam J. Stewart
bb5bd030d4 py-rasterio: add v1.4.3 (#47881) 2024-12-03 06:10:20 +01:00
dependabot[bot]
b9c60f96ea build(deps): bump pytest from 8.3.3 to 8.3.4 in /lib/spack/docs (#47882)
Bumps [pytest](https://github.com/pytest-dev/pytest) from 8.3.3 to 8.3.4.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/main/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/8.3.3...8.3.4)

---
updated-dependencies:
- dependency-name: pytest
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-12-03 06:07:27 +01:00
Stephen Nicholas Swatman
6b16c64c0e acts dependencies: new versions as of 2024/12/02 (#47787)
* acts dependencies: new versions as of 2024/11/25

This commit adds a new version of detray and two new versions of vecmem.

* acts dependencies: new versions as of 2024/12/02

This commit adds version 38 of ACTS and a new version of detray.
2024-12-02 19:50:25 -06:00
Andrey Perestoronin
3ea970746d add compilers packages (#47877) 2024-12-02 15:53:56 -07:00
Satish Balay
d8f2e080e6 petsc, py-petsc4py: add v3.22.2 (#47845) 2024-12-02 14:21:31 -08:00
Harmen Stoppels
ecb8a48376 libseccomp: python forward compat bound (#47876)
* libseccomp: python forward compat bound

* include 2.5.5

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-12-02 14:59:40 -07:00
Massimiliano Culpo
30176582e4 py-torchvision: add dependency on c (#47873) 2024-12-02 22:23:58 +01:00
Massimiliano Culpo
ac17e8bea4 utf8cpp: move to GitHub, make it a CMake package (#47870) 2024-12-02 14:14:24 -07:00
Massimiliano Culpo
c30c85a99c seacas: add a conditional dependency on fortran (#47871)
* seacas: remove unnecessary dependency on fortran
* seacas: add a conditional dependency on fortran
2024-12-02 13:13:14 -08:00
Michael Schlottke-Lakemper
2ae8eb6686 Update HOHQmesh package with newer versions (#47861) 2024-12-02 12:29:45 -08:00
Jose E. Roman
b5cc5b701c New patch release SLEPc 3.22.2 (#47859) 2024-12-02 12:06:52 -08:00
Wouter Deconinck
8e7641e584 onnx: set CMAKE_CXX_STANDARD to abseil-cpp cxxstd value (#47858) 2024-12-02 11:56:33 -08:00
Weiqun Zhang
e692d401eb amrex: add v24.12 (#47857) 2024-12-02 11:55:08 -08:00
Massimiliano Culpo
99319b1d91 oneapi-level-zero: add dependency on c (#47874) 2024-12-02 12:48:49 -07:00
Satish Balay
839ed9447c trilinos@14.4.0 revert kokkos-kernel dependency - as this breaks builds (#47852) 2024-12-02 11:44:37 -08:00
afzpatel
8e5a040985 ucc: add ROCm and rccl support (#46580) 2024-12-02 20:43:53 +01:00
Stephen Nicholas Swatman
5ddbb1566d benchmark: add version 1.9.1 (#47860)
This commit adds version 1.9.1 of Google Benchmark.
2024-12-02 11:42:38 -08:00
Massimiliano Culpo
eb17680d28 double-conversion: add dependency on c, and c++ (#47869) 2024-12-02 12:38:16 -07:00
Massimiliano Culpo
f4d81be9cf py-torch-nvidia-apex: add dependency on C (#47868) 2024-12-02 20:37:33 +01:00
Massimiliano Culpo
ea5ffe35f5 configuration: set egl as buildable:false (#47865) 2024-12-02 11:33:01 -08:00
Wouter Deconinck
1e37a77e72 mlpack: depends_on py-setuptools (#47828) 2024-12-02 12:04:53 +01:00
Todd Gamblin
29427d3e9e ruff: add v0.8.1 (#47851)
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-11-30 10:49:47 +01:00
Todd Gamblin
2a2d1989c1 version_types: clean up type hierarchy and add annotations (#47781)
In preparation for adding `when=` to `version()`, I'm cleaning up the types in
`version_types` and making sure the methods here pass `mypy` checks. This started as an
attempt to use `ConcreteVersion` outside of `spack.version` and grew into a larger type
refactor.

The hierarchy now looks like this:

* `VersionType`
  * `ConcreteVersion`
    * `StandardVersion`
    * `GitVersion`
  * `ClosedOpenRange`
  * `VersionList`
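
Sketched as bare Python classes (names taken from the list above; method details omitted):

```python
class VersionType:
    """Base interface: common methods and rich comparisons."""


class ConcreteVersion(VersionType):
    """A single, fully specified version."""


class StandardVersion(ConcreteVersion):
    """An ordinary dotted version such as 1.2.3."""


class GitVersion(ConcreteVersion):
    """A version that refers to a git commit, branch, or tag."""


class ClosedOpenRange(VersionType):
    """A contiguous range of versions."""


class VersionList(VersionType):
    """An ordered collection of versions and ranges."""
```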

Note that the top-level thing can't easily be `Version` as that is a method and it
returns only `ConcreteVersion` right now. I *could* do something fancy with `__new__` to
make `Version` a synonym for the `ConcreteVersion` constructor, which would allow it to
be used as a type. I could also do something similar with `VersionRange` but not sure if
it's worth it just to make these into types.

There are still some places where I think `GitVersion` might not be handled properly,
but I have not attempted to fix those here.

- [x] Add a top-level `VersionType` class that all version types extend from
- [x] Define and document common methods and rich comparisons on `VersionType`
- [x] Replace complicated `Union` types with `VersionType` and `ConcreteVersion` as needed
- [x] Annotate most methods (skipping `__getitem__` and friends as the typing is a pain)
- [x] Fix up the `VersionList` constructor a bit
- [x] Add cases to methods that weren't handling all `VersionType`s
- [x] Rework some places to clarify typing for `mypy`
- [x] Simplify / optimize _next_version
- [x] Make StandardVersion.string a property to enable lazy comparison

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-11-30 08:21:07 +01:00
Wouter Deconinck
c6e292f55f py-nbdime: add v3.2.1 (#47445) 2024-11-29 15:59:11 -07:00
teddy
bf5e6b4aaf py-mpi4py: create mpi.cfg file; this file was removed in v4.0.0, but the API is retained #47584
Co-authored-by: t. chantrait <teddy.chantrait@cea.fr>
2024-11-29 13:28:21 -06:00
Adam J. Stewart
9760089089 VTK: mark Python version compatibility (#47814)
* VTK: mark Python version compatibility

* VTK 8.2.0 also only supports Python 3.7
2024-11-29 13:04:56 -06:00
dmagdavector
da7c5c551d py-pip: add v23.2.1 -> v24.3.1 (#47753)
* py-pip: update to latest version 24.3.1 (plus some others)

* py-pip: note Python version dependency for new PIP versions
2024-11-29 17:18:19 +01:00
Harmen Stoppels
a575fa8529 gcc: add missing patches from Iain Sandoe's branch (#47843) 2024-11-29 08:10:04 +01:00
Massimiliano Culpo
39a65d88f6 fpm: add a dependency on c, and fortran (#47839)
Extracted from #45189

Build failure: https://gitlab.spack.io/spack/spack/-/jobs/13871774

Co-authored-by: Sebastian Ehlert <28669218+awvwgk@users.noreply.github.com>
2024-11-29 08:07:50 +01:00
Massimiliano Culpo
06ff8c88ac py-torch-sparse: add a dependency on c (#47841)
Extracted from #45189

Build failure: https://gitlab.spack.io/spack/spack/-/jobs/13870876
2024-11-29 08:06:46 +01:00
Massimiliano Culpo
a96b67ce3d miopen-hip: add a dependency on c (#47842)
Extracted from #45189

Build failure: https://gitlab.spack.io/spack/spack/-/jobs/13870957
2024-11-29 07:25:43 +01:00
Harmen Stoppels
67d494fa0b filesystem.py: remove unused md5sum (#47832) 2024-11-28 18:43:21 +01:00
Harmen Stoppels
e37e53cfe8 traverse: add MixedDepthVisitor, use in cmake (#47750)
This visitor accepts the sub-dag of all nodes and unique edges that have
deptype X directly from given roots, or deptype Y transitively for any
of the roots.
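
A generic sketch of that idea (plain Python over an adjacency map, not Spack's actual visitor API): deptype X edges are taken only directly from the roots, while deptype Y edges are followed transitively.

```python
def mixed_depth_nodes(roots, edges, direct_type="build", transitive_type="link"):
    """Sketch: `edges` maps node -> list of (child, deptype) pairs."""
    selected = set()
    frontier = []

    # Depth one: children of the roots reached via the "direct" deptype.
    for root in roots:
        for child, deptype in edges.get(root, []):
            if deptype == direct_type:
                selected.add(child)
            if deptype == transitive_type:
                frontier.append(child)

    # Any depth: keep following edges of the "transitive" deptype.
    expanded = set()
    while frontier:
        node = frontier.pop()
        if node in expanded:
            continue
        expanded.add(node)
        selected.add(node)
        for child, deptype in edges.get(node, []):
            if deptype == transitive_type:
                frontier.append(child)

    return selected
```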
2024-11-28 17:48:48 +01:00
Andrey Perestoronin
cf31d20d4c add new packages (#47817) 2024-11-28 09:49:52 -05:00
Harmen Stoppels
b74db341c8 darwin: preserve hardlinks on codesign/install_name_tool (#47808) 2024-11-28 14:57:28 +01:00
Daryl W. Grunau
e88a3f6f85 eospac: version 6.5.12 (#47826)
Co-authored-by: Daryl W. Grunau <dwg@lanl.gov>
2024-11-28 12:32:35 +01:00
Massimiliano Culpo
9bd7483e73 Add further C and C++ dependencies to packages (#47821) 2024-11-28 10:50:35 +01:00
Harmen Stoppels
04c76fab63 hip: hints for find_package llvm/clang (#47788)
LLVM can be a transitive link dependency of hip through gl's dependency mesa, which uses it for software rendering.

In this case make sure llvm-amdgpu is found with find_package(LLVM) and
find_package(Clang) by setting LLVM_ROOT and Clang_ROOT.

That makes the patch of find_package's HINTS redundant, so remove it.
It did not work anyway, because CMAKE_PREFIX_PATH has higher precedence
than HINTS.
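
A hedged sketch of how such hints are typically passed from a Spack CMake-based recipe (illustrative only; the actual change in the hip package may differ):

```python
from spack.package import *


class Hip(CMakePackage):
    # ... versions, variants, and dependencies elided ...

    def cmake_args(self):
        args = []
        # <Pkg>_ROOT is honored by find_package(LLVM)/find_package(Clang)
        # with higher priority than HINTS, which loses to CMAKE_PREFIX_PATH.
        if "llvm-amdgpu" in self.spec:
            llvm_prefix = self.spec["llvm-amdgpu"].prefix
            args.append(self.define("LLVM_ROOT", llvm_prefix))
            args.append(self.define("Clang_ROOT", llvm_prefix))
        return args
```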
2024-11-28 10:23:09 +01:00
Adam J. Stewart
ecbf9fcacf py-scooby: add v0.10.0 (#47790) 2024-11-28 10:21:36 +01:00
Victor A. P. Magri
69fb594699 hypre: add a variant to allow using internal lapack functions (#47780) 2024-11-28 10:15:12 +01:00
Howard Pritchard
d28614151f nghttp2: add v1.64.0 (#47800)
Signed-off-by: Howard Pritchard <hppritcha@gmail.com>
2024-11-28 10:12:41 +01:00
etiennemlb
f1d6af6c94 netlib-scalapack: fix for some clang derivative (cce/rocmcc) (#45434) 2024-11-28 09:25:33 +01:00
Adam J. Stewart
192821f361 py-river: mark numpy 2 compatibility (#47813) 2024-11-28 09:24:21 +01:00
Adam J. Stewart
18790ca397 py-pyvista: VTK 9.4 not yet supported (#47815) 2024-11-28 09:23:41 +01:00
BOUDAOUD34
c22d77a38e dbcsr: patch for resolving .mod file conflicts in ROCm by implementing USE, INTRINSIC (#46181)
Co-authored-by: U-PALLAS\boudaoud <boudaoud@pc44.pallas.cines.fr>
2024-11-28 09:20:48 +01:00
Tom Payerle
d82bdb3bf7 seacas: update recipe to find faodel (#40239)
Explicitly sets the CMake variables Faodel_INCLUDE_DIRS and Faodel_LIBRARY_DIRS when +faodel.
This seems to be needed for recent versions of seacas (seacas@2021-04-02:), but should be safe
to do for all versions.

For Faodel_INCLUDE_DIRS, it looks like Faodel has header files under $(Faodel_Prefix)/include/faodel,
but seacas is not including the "faodel" part in #includes. So add both $(Faodel_Prefix)/include
and $(Faodel_Prefix)/include/faodel
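
A hedged sketch of what this can look like in the seacas recipe (illustrative; variable names and paths are taken from the description above):

```python
from spack.package import *


class Seacas(CMakePackage):
    # ... versions, variants, and dependencies elided ...

    def cmake_args(self):
        args = []
        if "+faodel" in self.spec:
            faodel = self.spec["faodel"].prefix
            # seacas #includes Faodel headers without the "faodel/" prefix,
            # so pass both include directories explicitly.
            args.append(
                self.define(
                    "Faodel_INCLUDE_DIRS",
                    ";".join([faodel.include, faodel.include.faodel]),
                )
            )
            args.append(self.define("Faodel_LIBRARY_DIRS", faodel.lib))
        return args
```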

Co-authored-by: payerle <payerle@users.noreply.github.com>
2024-11-28 09:17:44 +01:00
Matt Thompson
a042bdfe0b mapl: add hpcx-mpi (#47793) 2024-11-28 09:15:25 +01:00
Adam J. Stewart
60e3e645e8 py-joblib: add v1.4.2 (#47789) 2024-11-28 08:28:44 +01:00
Chris Marsh
51785437bc Patch to fix building gcc@14.2 on darwin. Fixes #45628 (#47830) 2024-11-27 20:58:18 -07:00
dependabot[bot]
2e8db0815d build(deps): bump docker/build-push-action from 6.9.0 to 6.10.0 (#47819)
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 6.9.0 to 6.10.0.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](4f58ea7922...48aba3b46d)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-11-27 16:29:41 -07:00
George Malerbo
8a6428746f raylib: add v5.5 (#47708)
* Add version 5.5 in package.py

* Update package.py
2024-11-27 16:25:22 -07:00
Adam J. Stewart
6b9c099af8 py-keras: add v3.7.0 (#47816) 2024-11-27 16:12:47 -07:00
Derek Ryan Strong
30814fb4e0 Deprecate rsync releases before v3.2.5 (#47820) 2024-11-27 16:14:34 -06:00
Harmen Stoppels
3194be2e92 gcc-runtime: remove libz.so from libgfortran.so if present (#47812) 2024-11-27 22:32:37 +01:00
snehring
41be2f5899 ltr-retriever: changing directory layout (#38513) 2024-11-27 14:16:57 -07:00
kwryankrattiger
02af41ebb3 gdk-pixbuf: Point at gitlab instead of broken mirror (#47825) 2024-11-27 15:13:55 -06:00
snehring
9d33c89030 r-rsamtools: add -lz to Makevars (#38649) 2024-11-27 13:44:48 -07:00
Erik Heeren
51ab7bad3b julia: conflict for %gcc@12: support (#35931) 2024-11-27 04:31:44 -07:00
kwryankrattiger
0b094f2473 Docs: Reference 7z requirement on Windows (#35943) 2024-11-26 17:11:12 -05:00
Christoph Junghans
cd306d0bc6 all-library: add voronoi support and git version (#47798)
* all-library: add voronoi support and git version

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-11-26 14:56:22 -07:00
Dom Heinzeller
fdb9cf2412 Intel/oneapi compilers: correct version ranges for diab-disable flag (#47428)
* c/c++ flags should have been modified for all 2023.x.y versions, but
  upper bound was too low
* Fortran flags should have been modified for all 2024.x.y versions, but
  likewise the upper bound was too low
2024-11-26 12:34:37 -07:00
etiennemlb
a546441d2e siesta: remove link args on a non-declared dependency (#46080) 2024-11-26 20:25:04 +01:00
IHuismann
141cdb6810 adol-c: fix libs property (#36614) 2024-11-26 17:01:18 +01:00
Brian Van Essen
f2ab74efe5 cray: add further versions of Cray packages. (#37733) 2024-11-26 16:59:53 +01:00
Martin Aumüller
38b838e405 openscenegraph: remove X11 dependencies for macos (#39037) 2024-11-26 16:59:10 +01:00
Mark Abraham
c037188b59 gromacs: announce deprecation policy and start to implement (#47804)
* gromacs: announce deprecation policy and start to implement

* Style it up

* [@spackbot] updating style on behalf of mabraham

* Bump versions used in CI

---------

Co-authored-by: mabraham <mabraham@users.noreply.github.com>
2024-11-26 05:54:07 -07:00
Mark Abraham
0835a3c5f2 gromacs: obtain SYCL from either ACpp or intel-oneapi-runtime (#47806) 2024-11-26 05:51:54 -07:00
Mark Abraham
38a2f9c2f2 gromacs: Improve HeFFTe dependency (#47805)
GROMACS supports HeFFTe with either SYCL or CUDA build and requires
a matching HeFFTe build
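
A hedged sketch of how a "matching build" requirement like this is commonly expressed in a Spack recipe (the variant names here are assumptions, not necessarily the actual gromacs package):

```python
from spack.package import *


class Example(CMakePackage):
    """Hypothetical recipe showing a matching-backend constraint."""

    variant("heffte", default=False, description="Use HeFFTe for FFTs")
    variant("cuda", default=False, description="CUDA backend")
    variant("sycl", default=False, description="SYCL backend")

    # Require a HeFFTe built with the same GPU backend as this package.
    depends_on("heffte+cuda", when="+heffte+cuda")
    depends_on("heffte+sycl", when="+heffte+sycl")
```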
2024-11-26 05:50:41 -07:00
Massimiliano Culpo
eecd4afe58 gromacs: fix the value used for the ITT directory (#47795) 2024-11-26 08:14:45 +01:00
Seth R. Johnson
83624551e0 ROOT: default to +aqua~x on macOS (#47792) 2024-11-25 14:27:38 -06:00
Victor A. P. Magri
741652caa1 caliper: add "tools" variant (#47779) 2024-11-25 18:26:53 +01:00
Mark Abraham
8e914308f0 gromacs: add itt variant (#47764)
Permit configuring GROMACS with support for mdrun to trace its timing
regions by calling the ITT API. This permits tools like VTune and
unitrace to augment their analysis with GROMACS-specific annotation.
2024-11-25 16:12:55 +01:00
Mikael Simberg
3c220d0989 apex: add 2.7.0 (#47736) 2024-11-25 13:22:16 +01:00
Wouter Deconinck
8094fa1e2f py-gradio: add v5.1.0 (and add/update dependencies) (fix CVEs) (#47504)
* py-pdm-backend: add v2.4.3
* py-starlette: add v0.28.0, v0.32.0, v0.35.1, v0.36.3, v0.37.2, v0.41.2
* py-fastapi: add v0.110.2, v0.115.4
* py-pydantic-extra-types: add v2.10.0
* py-pydantic-settings: add v2.6.1
* py-python-multipart: add v0.0.17
* py-email-validator: add v2.2.0
2024-11-25 13:07:56 +01:00
Massimiliano Culpo
5c67051980 Add missing C/C++ dependencies (#47782) 2024-11-25 12:56:39 +01:00
John W. Parent
c01fb9a6d2 Add CMake 3.31 minor release (#47676) 2024-11-25 04:32:57 -07:00
Harmen Stoppels
bf12bb57e7 install_test: first look at builder, then package (#47735) 2024-11-25 11:53:28 +01:00
Wouter Deconinck
406c73ae11 py-boto*: add v1.34.162 (#47528)
Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-11-25 11:39:09 +01:00
Wouter Deconinck
3f50ccfcdd py-azure-*: updated versions (#47525) 2024-11-25 11:38:49 +01:00
Wouter Deconinck
9883a2144d py-quart: add v0.19.8 (#47508)
Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-11-25 11:38:22 +01:00
Wouter Deconinck
94815d2227 py-netifaces: add v0.10.9, v0.11.0 (#47451) 2024-11-25 11:37:41 +01:00
Wouter Deconinck
a15563f890 py-werkzeug: add v3.1.3 (and deprecate older versions) (#47507)
Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-11-25 11:28:01 +01:00
Wouter Deconinck
ac2ede8d2f py-pyzmq: add v25.1.2, v26.0.3, v26.1.1, v26.2.0 (switch to py-scikit-build-core) (#44493) 2024-11-25 11:00:00 +01:00
david-edwards-linaro
b256a7c50d linaro-forge: remove v21.1.3 (#47688) 2024-11-25 10:53:27 +01:00
Szabolcs Horvát
21e10d6d98 igraph: add v0.10.15 (#47692) 2024-11-25 10:51:24 +01:00
afzpatel
ed39967848 rocm-tensile: add 6.2.1 (#47702) 2024-11-25 10:40:21 +01:00
Alex Richert
eda0c6888e ip: add cmake version requirement for @5.1: (#47754) 2024-11-25 02:38:08 -07:00
pauleonix
66055f903c cuda: Add v12.6.3 (#47721) 2024-11-25 10:36:11 +01:00
Dave Keeshan
a1c57d86c3 libusb: add v1.0.23:1.0.27 (#47727) 2024-11-25 10:33:08 +01:00
Dave Keeshan
9da8dcae97 verible: add v0.0.3841 (#47729) 2024-11-25 10:30:48 +01:00
jflrichard
c93f223a73 postgis: add version 3.1.2 (#47743) 2024-11-25 10:24:03 +01:00
Wouter Deconinck
f1faf31735 build-containers: determine latest release tag and push that as latest (#47742) 2024-11-25 10:20:58 +01:00
Stephen Herbener
8957ef0df5 Updated version specs for bufr-query package. (#47752) 2024-11-25 10:14:16 +01:00
Veselin Dobrev
347ec87fc5 mfem: add logic for the C++ standard level when using rocPRIM (#47751) 2024-11-25 10:13:22 +01:00
Adam J. Stewart
cd8c46e54e py-ruff: add v0.8.0 (#47758) 2024-11-25 10:02:31 +01:00
Adam J. Stewart
75b03bc12f glib: add v2.82.2 (#47766) 2024-11-24 20:55:18 +01:00
Adam J. Stewart
58511a3352 py-pandas: correct Python version constraint (#47770) 2024-11-24 17:48:16 +01:00
Adam J. Stewart
325873a4c7 py-fsspec: add v2024.10.0 (#47778) 2024-11-24 15:42:30 +01:00
Adam J. Stewart
9156e4be04 awscli-v2: add v2.22.4 (#47765) 2024-11-24 15:42:06 +01:00
Adam J. Stewart
12d3abc736 py-pytz: add v2024.2 (#47777) 2024-11-24 15:40:45 +01:00
Adam J. Stewart
4208aa6291 py-torchvision: add Python 3.13 support (#47776) 2024-11-24 15:40:11 +01:00
Adam J. Stewart
0bad754e23 py-scikit-learn: add Python 3.13 support (#47775) 2024-11-24 15:39:36 +01:00
Adam J. Stewart
cde2620f41 py-safetensors: add v0.4.5 (#47774) 2024-11-24 15:38:05 +01:00
Adam J. Stewart
a35aa038b0 py-pystac: add support for Python 3.12+ (#47772) 2024-11-24 15:37:43 +01:00
Adam J. Stewart
150416919e py-pydantic-core: add v2.27.1 (#47771) 2024-11-24 15:37:06 +01:00
Adam J. Stewart
281c274e0b py-jupyter-packaging: add Python 3.13 support (#47769) 2024-11-24 15:31:31 +01:00
Adam J. Stewart
16e130ece1 py-cryptography: mark Python 3.13 support (#47768) 2024-11-24 15:31:08 +01:00
Adam J. Stewart
7586303fba py-ruamel-yaml-clib: add Python compatibility bounds (#47773) 2024-11-24 15:28:45 +01:00
Harmen Stoppels
6501880fbf py-node-env: add v1.9.1 (#47762) 2024-11-24 15:27:16 +01:00
Harmen Stoppels
c76098038c py-ipykernel: split forward and backward compat bounds (#47763) 2024-11-24 15:26:15 +01:00
Harmen Stoppels
124b616b27 add a few forward compat bounds with python (#47761) 2024-11-24 15:23:11 +01:00
Adam J. Stewart
1148c8f195 gobject-introspection: Python 3.12 still not supported (#47767) 2024-11-24 03:53:32 -07:00
Adam J. Stewart
c57452dd08 py-cffi: support Python 3.12+ (#47713) 2024-11-24 08:41:29 +01:00
Harmen Stoppels
a7e57c9a14 py-opt-einsum: add v3.4.0 (#47759) 2024-11-24 08:40:29 +01:00
Teague Sterling
85d83f9c26 duckdb: add v1.1.3, deprecate <v1.1.0 (#47653)
* duckdb: add v1.0.0, v0.10.3

* Adding issue reference

* Adding issue reference

* duckdb: add v1.1.0

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* Fixing styles

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* duckdb: add v1.1.1

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* duckdb: Fix missing depends_on(unixodbc, when=+odbc)

* Adding duckdb variants, removing old variants, removing deprecated versions

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* duckdb+static_openssl: Add pkgconfig and zlib-api to link zlib when needed

* duckdb: add v1.1.3

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* Update package.py for CVE-2024-41672 as suggested

* [@spackbot] updating style on behalf of teaguesterling

* duckdb: add CVE comment before deprecated versions

---------

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-11-23 13:13:40 -07:00
finkandreas
39a081d7fd Kokkos complex_align variant, Trilinos+PETSc enforcement for Kokkos~complex_align (#47686) 2024-11-23 07:45:22 -07:00
Harmen Stoppels
71b65bb424 py-opt-einsum: missing forward compat bound for python (#47757) 2024-11-23 10:48:07 +01:00
Adam J. Stewart
3dcbd118df py-cython: support Python 3.12+ (#47714)
and add various other compat bounds on dependents
2024-11-22 22:20:41 +01:00
Harmen Stoppels
5dacb774f6 itk: use vendored googletest (#47687)
External googletest breaks dependents because they end up with
ITK_LIBRARIES set to `GTest::GTest;GTest::Main`, which then ends up
literally in a nonsensical link line `-lGTest::GtTest`.

The vendored googletest produces a cmake config file where
`ITKGoogleTest_LIBRARIES` is empty.
2024-11-22 18:41:23 +01:00
Harmen Stoppels
cb3d6549c9 traverse.py: ensure topo order is bfs for trees (#47720) 2024-11-22 15:04:19 +01:00
Mark Abraham
559c2f1eb9 gromacs: oneapi does not always require gcc (#47679)
* gromacs: oneapi does not always require gcc

* Support intel_provided_gcc only with %intel classic compiler

Require gcc only when needed with %intel

* New approach depending on gcc-runtime directly

* Update var/spack/repos/builtin/packages/gromacs/package.py

Co-authored-by: Christoph Junghans <christoph.junghans@gmail.com>

---------

Co-authored-by: Christoph Junghans <christoph.junghans@gmail.com>
2024-11-22 06:33:30 -07:00
Harmen Stoppels
ed1dbea77b eigen: self.builder.build_directory -> self.build_directory (#47728) 2024-11-22 07:20:38 +01:00
Seth R. Johnson
6ebafe4631 vecgeom: add v1.2.10 and delete unused, deprecated versions (#47725)
* vecgeom: add v1.2.10

* Remove unused+deprecated versions of vecgeom

* Deprecate older v1.2.x  versions

* [@spackbot] updating style on behalf of sethrj
2024-11-21 17:03:09 -07:00
Harmen Stoppels
7f0bb7147d README.md update old tutorial URL (#47718) 2024-11-21 16:46:46 +01:00
Satish Balay
f41b38e93d xsdk: add v1.1.0 (#47635)
xsdk: exclude pflotran, alquimia, exago

heffte: ~fftw when=+hip

dealii: ~sundials ~opencascade ~vtk ~taskflow
2024-11-21 08:08:27 -06:00
Massimiliano Culpo
5fd12b7bea Add further missing C, C++ dependencies to packages (#47662) 2024-11-21 14:49:12 +01:00
Mikael Simberg
fe746bdebb aws-ofi-nccl: Add 1.8.1 to 1.13.0 (#47717) 2024-11-21 05:37:57 -07:00
eugeneswalker
453af4b9f7 hdf5-vol-cache %cce: add -Wno-error=incompatible-function-pointer-types (#47698) 2024-11-20 14:56:19 -08:00
eugeneswalker
29cf1559cc netlib-scalapack %cce: add cflags -Wno-error=implicit-function-declaration (#47701) 2024-11-20 15:09:14 -07:00
eugeneswalker
a9b3e1670b mpifileutils%cce: append cflags -Wno-error=implicit-function-declaration (#47700) 2024-11-20 14:19:05 -07:00
kwryankrattiger
4f9aa6004b visit: add v3.4.0, v3.4.1 (#47161)
* Visit: Add new versions 3.4.0 and 3.4.1

* Adios2: Restrict python; 3.11 does not work for older Adios2

* VisIt: Set the VTK_VERSION for @3.4:

Older versions of VTK used the VTK_{MAJOR, MINOR}_VERSION variables for
VTK detection. VisIt >= 3.4 uses the full string VTK_VERSION.

* CI: Don't build llvm-amdgpu for non-HIP stack

* VisIt: v3.4.1 handles newer Adios2 correctly

* Visit: Add missing links in HDF5, set correct VTK version configuration parameter

* VisIt: Add py-pip requirement and patch visit with configuration changes

* HDF5 symlinks move when inside of callback

* VisIt ninja install fails with python module. Using make does not

* VisIt 3.4 has a high minimum cmake requirement

* HDF5: Early return when not mpi for mpi symlinks

* HDF5: Use platform agnostic method for creating legacy compatible MPI symlinks

* Fix VISIT_VTK_VERSION handling for 8.2.1a hack
2024-11-20 18:37:56 +01:00
Harmen Stoppels
aa2c18e4df spack style: import-check -> import, fix bugs, exclude spack.pkg (#47690) 2024-11-20 16:15:28 +01:00
dependabot[bot]
0ff3e86315 build(deps): bump codecov/codecov-action from 5.0.2 to 5.0.3 (#47683)
Bumps [codecov/codecov-action](https://github.com/codecov/codecov-action) from 5.0.2 to 5.0.3.
- [Release notes](https://github.com/codecov/codecov-action/releases)
- [Changelog](https://github.com/codecov/codecov-action/blob/main/CHANGELOG.md)
- [Commits](5c47607acb...05f5a9cfad)

---
updated-dependencies:
- dependency-name: codecov/codecov-action
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-11-19 20:40:01 -06:00
dependabot[bot]
df208c1095 build(deps): bump docker/metadata-action from 5.5.1 to 5.6.1 (#47682)
Bumps [docker/metadata-action](https://github.com/docker/metadata-action) from 5.5.1 to 5.6.1.
- [Release notes](https://github.com/docker/metadata-action/releases)
- [Commits](8e5442c4ef...369eb591f4)

---
updated-dependencies:
- dependency-name: docker/metadata-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-11-19 20:39:45 -06:00
Chris Marsh
853f70edc8 cgal: update depends versions for 6.0.1 (#47516)
* This extends PR #47285 to properly include some of the required version constraints of cgal 6, including the C++ standard. It also adds the new no-gmp backend as a variant.

* fix style

* disable cgal@6 +demo variant as the demos require qt6 which is not in spack

* disable the gmp variant until clarity on how it's supposed to work is provided. Bound shared and header_only variants to relevant versions

* Fix missing msvc compiler limit, fix variant left in

* Add more comments. Better describe the gmp variant. Remove testing code

* fix style
2024-11-19 16:43:21 -07:00
Paul R. C. Kent
50970f866e Add llvm v19.1.4 (#47681) 2024-11-19 16:03:28 -07:00
Wouter Deconinck
8821300985 py-gevent: add v24.2.1, v24.10.3, v24.11.1 (#47646)
* py-gevent: add v24.2.1, v24.10.3
* py-gevent: add v24.11.1
2024-11-19 12:14:52 -08:00
AMD Toolchain Support
adc8e1d996 Restrict disabling dynamic thread scaling to version 3.1 only (#47673)
Co-authored-by: vijay kallesh <Vijay-teekinavar.Kallesh@amd.com>
2024-11-19 12:12:21 -08:00
Andrey Perestoronin
1e0aac6ac3 Add new 2025.0.1 Oneapi patch packages (#47678) 2024-11-19 11:38:42 -07:00
Harmen Stoppels
99e2313d81 openturns: fix deps (#47669) 2024-11-19 18:13:47 +01:00
Mark Abraham
22690a7576 Make oneAPI library-with-sdk specialize library class (#47632) 2024-11-19 12:12:10 -05:00
Harmen Stoppels
5325cfe865 systemd: symlink the internal libraries so they are found in rpath (#47667) 2024-11-19 15:28:49 +01:00
Harmen Stoppels
5333925dd7 sensei: fix install rpath for python extension (#47670) 2024-11-19 15:23:54 +01:00
Massimiliano Culpo
2db99e1ff6 gmp: fix cxx dependency, remove dependency on fortran (#47671) 2024-11-19 15:19:08 +01:00
Massimiliano Culpo
68aa712a3e solver: add a timeout handle for users (#47661)
This PR adds a configuration setting to allow setting time limits for concretization.

For backward compatibility, the default is to set no time limit.
2024-11-19 15:00:26 +01:00
Mikael Simberg
2e71bc640c pika: Add 0.30.1 (#47666) 2024-11-19 05:44:41 -07:00
Dom Heinzeller
661f3621a7 netcdf-cxx: add a maintainer (#47665) 2024-11-19 05:28:38 -07:00
Massimiliano Culpo
f182032337 Restore message when concretizing in parallel (#47663)
It was lost in #44843
2024-11-19 12:28:14 +00:00
teddy
066666b7b1 py-non-regression-test-tools: add v1.1.6 & remove v1.1.2 (tag removed) (#47622)
* py-non-regression-test-tools: add v1.1.6  & remove v1.1.2 (tag removed)
* Update var/spack/repos/builtin/packages/py-non-regression-test-tools/package.py

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>

---------

Co-authored-by: t. chantrait <teddy.chantrait@cea.fr>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-11-19 04:38:33 -07:00
Richard Berger
73316c3e28 cached_cmake: mpifc is not always defined (#46861)
* cached_cmake: mpifc is not always defined
* mpich: only depend on fortran when +fortran
2024-11-18 14:49:56 -08:00
afzpatel
c8e4ae08da eigen: enable ROCm support and add master version (#47332)
* eigen: enable ROCm support and add master version
* change boost dependency to only for tests
2024-11-18 14:41:02 -08:00
Tom Scogland
44225caade llvm: fix sysroot and build on darwin (#47336)
The default build of clang on darwin couldn't actually build anything
because it lacked a built-in sysroot. Also, several compilation
errors finding the system libc++ cropped up, much like those in GCC, and
have been fixed.
2024-11-18 14:39:16 -08:00
Lydéric Debusschère
8d325d3e30 yambo: add v5.2.3, v5.2.4 (#47350)
* packages: Update 'yambo'
* add call to 'resource' method to download Ydriver and iotk during fetch instead of during build
* air-gapped installation could be performed since version 5.2.1
* add versions 5.2.3 and 5.2.4
* remove some nonexistent configure options for versions "@5:"
* add a sanity_check on 'bin/yambo'
2024-11-18 14:36:16 -08:00
Mosè Giordano
d0fd112006 grep: add executables attribute and determine_version method (#47438) 2024-11-18 14:30:46 -08:00
Wouter Deconinck
50f43ca71d py-gitpython: add v3.1.43 (fix CVEs) (#47444)
* py-gitpython: add v3.1.43
2024-11-18 14:24:20 -08:00
Vanessasaurus
2546fb6afa Automated deployment to update package flux-sched 2024-11-05 (#47449)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2024-11-18 14:23:35 -08:00
Vanessasaurus
10f6863d91 Automated deployment to update package flux-security 2024-11-05 (#47450)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2024-11-18 14:21:42 -08:00
afzpatel
63ea528606 mivisionx, migraphx and rocal: add and fix tests (#47453)
* fix mivisionx test, add migraphx build-time test and add rocal unit test
* fix miopen-hip linking issue
* remove setup_run_environment from roctracer-dev
* change patch file
2024-11-18 14:09:18 -08:00
snehring
89d2b9553d tb-lmto: new package (#47482)
* tb-lmto: adding new package tb-lmto
* tb-lmto: update copyright year
2024-11-18 14:01:59 -08:00
Dom Heinzeller
278326b4d9 Add py-pyhdf@0.11.4 and add conflict for py-pyhdf@0.10.4 with numpy@1.25: (#47656) 2024-11-18 13:53:38 -08:00
alvaro-sch
43c1a5e0ec orca: add v6.0.1, avx2-6.0.1 (#47489)
* orca: add 6.0.1 versions
* orca: checksum fix for avx2-6.0.1
* orca: fix version string for avx2-6.0.1
* orca: checksum fix for 6.0.1
2024-11-18 13:34:38 -08:00
Paul Gessinger
8feb506b3a geomodel: Fix dependencies (#47437)
* geomodel: Add dependency on `hdf5` for `+pythia`, require `hdf5+cxx`

* fix visualization dependencies

* geomodel: Add soqt dependency

* update dependency on soqt to drop explicit qt variant
2024-11-18 15:31:53 -06:00
Wouter Deconinck
627544191a py-pymongo: add v4.10.1 (fix CVE) (#47501)
* py-pymongo: add v4.10.1
* py-pymongo: fix copyright header spacing
* py-hatch-requirements-txt: add v0.4.1

---------

Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-11-18 12:53:32 -08:00
Wouter Deconinck
cf672ea8af py-waitress: add v3.0.1 (#47509) 2024-11-18 12:50:49 -08:00
green-br
2c4ac02adf Add option to not build GUI for WxWidgets. (#47526) 2024-11-18 12:43:23 -08:00
Niclas Jansson
7f76490b31 neko: add v0.8.1, v0.9.0 and fix package (#47558) 2024-11-18 12:38:08 -08:00
Thomas-Ulrich
46e4c1fd30 seissol: add versions, conflict (#47562)
* add a couple of seissol versions
2024-11-18 12:34:59 -08:00
Rémi Lacroix
85c5533e62 hpctoolkit: Update the minimum version for Python dependency (#47564)
New versions of HPCToolKit support Python 3.8 and newer.
2024-11-18 12:31:20 -08:00
Vicente Bolea
c47cafd11a diy: apply smoke_test patch in 3.6 only (#47572) 2024-11-18 12:26:45 -08:00
jmuddnv
8e33cc158b Changes for NVIDIA HPC SDK 24.11 (#47592) 2024-11-18 12:24:31 -08:00
Adam J. Stewart
f07173e5ee py-torchmetrics: add v1.6.0 (#47580) 2024-11-18 12:22:57 -08:00
Ken Raffenetti
118f5d2683 mpich: Remove incorrect dependency (#47586)
The gni libfabric provider works on some Cray systems, but not all. For
example, Slingshot-based machines use a different libfabric provider
(cxi). Therefore libfabric/gni should not be a dependency when using
Cray PMI.
2024-11-18 12:20:10 -08:00
Louise Spellacy
8fb2abc3cd linaro-forge: added version 24.1 (#47594) 2024-11-18 12:16:12 -08:00
afzpatel
3bcb8a9236 rocm-dbgapi: add pciutils dependency (#47605)
* add pciutils dependency
* add maintainer
2024-11-18 12:12:55 -08:00
Adam J. Stewart
a6fdd7608f py-huggingface-hub: add v0.26.2, hf_transfer (#47600) 2024-11-18 11:51:16 -08:00
Cody Balos
1ffd7125a6 sundials: set CUDAToolkit_ROOT (#47601)
* sundials: specify CUDAToolkit_ROOT
2024-11-18 11:48:16 -08:00
Jerome Soumagne
d1166fd316 mercury: add v2.4.0 (#47606) 2024-11-18 11:36:23 -08:00
Weiqun Zhang
b8eba1c677 amrex: add new variant fft for >= 24.11 (#47611) 2024-11-18 11:35:00 -08:00
Pranav Sivaraman
e3c0515076 tinycbor: new package (#47623) 2024-11-18 11:19:14 -08:00
Pranav Sivaraman
97406f241c tomlplusplus: new package (#47624)
* tomlplusplus: new package
* tomlplusplus: remove period
2024-11-18 11:15:30 -08:00
Wouter Deconinck
e1dfbbf611 py-greenlet: add v3.0.3, v3.1.1 (#47647)
* py-greenlet: add v3.0.3, v3.1.1
* py-greenlet: depends_on py-setuptools@40.8.0:
2024-11-18 12:13:41 -07:00
Paul
52147348c7 Added Go v1.23.3 (#47633) 2024-11-18 10:52:17 -08:00
Wouter Deconinck
aeb0ab6acf pocl: add v3.1, v4.0, v5.0, v6.0 (#47643) 2024-11-18 10:43:26 -08:00
Wouter Deconinck
6cd26b7603 clinfo: add v3.0.23.01.25 (#47644) 2024-11-18 10:41:32 -08:00
wspear
1c75d07f05 Fix self.spec reference (#47610)
@teaguesterling
2024-11-18 10:37:53 -08:00
Wouter Deconinck
15b3ff2a0a harfbuzz: add v10.1.0 (#47645) 2024-11-18 10:29:29 -08:00
Teague Sterling
e9f94d9bf2 ollama: add v0.4.2 (#47654)
Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
2024-11-18 10:14:16 -08:00
Wouter Deconinck
299324c7ca dbus: add v1.15.12 (#47655) 2024-11-18 10:10:14 -08:00
Stephen Nicholas Swatman
dfab174f31 benchmark: add version 1.9.0 (#47658)
This commit adds Google Benchmark v1.9.0.
2024-11-18 07:04:52 -06:00
Stephen Nicholas Swatman
a86953fcb1 acts: add version 37.4.0 (#47657)
This commit adds v37.4.0 of the ACTS project; no new versions of the
dependencies were released.
2024-11-18 06:58:59 -06:00
Massimiliano Culpo
5f262eb5d3 Add further missing C, C++ dependencies to packages (#47649) 2024-11-18 09:39:47 +01:00
Wouter Deconinck
00f179ee6d root: add v6.32.08 (#47641) 2024-11-17 18:39:17 -06:00
Massimiliano Culpo
da4f7c2952 Add an audit to prevent using the name "all" in packages (#47651)
Packages cannot be named like that, since we use "all" to indicate
default settings under the "packages" section of the configuration.
2024-11-17 13:32:24 -07:00
Harmen Stoppels
fdedb6f95d style.py: add import-check for missing & redundant imports (#47619) 2024-11-17 09:18:48 +01:00
Massimiliano Culpo
067fefc46a Set "generic" uarch granularity for a few pipelines (#47640) 2024-11-17 09:07:42 +01:00
Massimiliano Culpo
42c9961bbe Added a few missing language deps to packages (#47639) 2024-11-17 09:07:08 +01:00
Wouter Deconinck
fe2bf4c0f9 pixman: add missing MesonPackage (#47607) 2024-11-17 09:03:15 +01:00
Harmen Stoppels
4d3b85c4d4 spack.package / builtin repo: fix exports/imports (#47617)
Add various missing imports in packages.
Remove redundant imports.
Export NoLibrariesError, NoHeadersError, which_string in spack.package.
2024-11-17 09:02:04 +01:00
Satish Balay
f05cbfbf44 xsdk: dealii has changes to variant defaults, update xsdk accordingly (#47602) 2024-11-16 16:42:16 -06:00
Wouter Deconinck
448049ccfc qt-tools: new package (#45602)
* qt-tools: new pkg with +designer to build Qt Designer for QWT

* qt-tools: fix style

* qt-tools: fix unused variable

* qt-tools: rm setup_run_environments (now in qt-base)

* qt-tools: add myself as maintainer

* qt-tools: add variant assistant; use commits with submodule

* qt-base: define QtPackage.get_git
2024-11-16 09:09:41 -06:00
etiennemlb
e56057fd79 gobject-introspection: Do not write to user home (#47621) 2024-11-16 11:11:52 +01:00
Harmen Stoppels
26d80e7bc5 py-blosc2: use external libblosc2 (#47566) 2024-11-16 09:43:54 +01:00
Dom Heinzeller
60eb0e9c80 Bug fix in py-scipy for versions 1.8.0 to 1.14.0 that surfaces with the latest Clang and Intel LLVM compilers (#47620) 2024-11-16 06:56:25 +01:00
Thomas Bouvier
7443a3b572 py-wandb: add v0.16.6 (#43891)
* py-wandb: add version v0.16.6

* fix: typo

* py-wandb: py-click when @0.15.5:, py-pathtools when @:0.15

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-11-15 21:17:51 -07:00
dependabot[bot]
a5ba4f8d91 build(deps): bump codecov/codecov-action from 4.6.0 to 5.0.2 (#47631)
Bumps [codecov/codecov-action](https://github.com/codecov/codecov-action) from 4.6.0 to 5.0.2.
- [Release notes](https://github.com/codecov/codecov-action/releases)
- [Changelog](https://github.com/codecov/codecov-action/blob/main/CHANGELOG.md)
- [Commits](b9fd7d16f6...5c47607acb)

---
updated-dependencies:
- dependency-name: codecov/codecov-action
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-11-15 21:41:14 -06:00
Matthias Wolf
6ef0f495a9 py-libsonata: add v0.1.29 (#47466)
* py-libsonata: new version.

* Fix Python version dependency.

* py-libsonata: fix typo

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-11-15 20:27:41 -07:00
Matthias Wolf
e91b8c291a py-numpy-quaternion: add v2024.0.3 (#47469)
* py-numpy-quaternion: add new version.

* Update dependency version bounds

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* Fix build dependencies.

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-11-15 21:16:16 -06:00
Wouter Deconinck
6662046aca armadillo: add v14.0.3 (#47634) 2024-11-15 19:57:48 -07:00
Matthieu Dorier
db83c62fb1 arrow: add v18.0.0 (#47494)
* arrow: added version 18.0.0

This PR adds version 18.0.0 to the arrow package.

* arrow: updated dependency on llvm
2024-11-15 20:54:43 -06:00
teddy
d4adfda385 costo: add v0.0.8 (#47625)
Co-authored-by: t. chantrait <teddy.chantrait@cea.fr>
2024-11-15 18:10:52 -08:00
Matt Thompson
e8a8e2d98b mapl: add 2.40.3.1 (#47627)
* mapl: add 2.40.3.1
* Relax ESMF requirement
2024-11-15 18:09:22 -08:00
Paolo
55c770c556 Add ACfL 24.10.1 (#47616) 2024-11-15 18:05:38 -08:00
Thomas Gruber
33a796801c Likwid: add version 5.4.0 (#47630) 2024-11-15 18:04:21 -08:00
Seth R. Johnson
b90ac6441c celeritas: remove ancient versions and add CUDA package dependency (#47629)
* celeritas: remove deprecated versions through 0.3

* celeritas: deprecate old versions

* celeritas: add c++20 option

* Propagate vecgeom CUDA requirements

* Remove outdated conflicts and format it
2024-11-15 17:27:22 -07:00
dependabot[bot]
68b69aa9e3 build(deps): bump sphinx-rtd-theme in /lib/spack/docs (#47588)
Bumps [sphinx-rtd-theme](https://github.com/readthedocs/sphinx_rtd_theme) from 3.0.1 to 3.0.2.
- [Changelog](https://github.com/readthedocs/sphinx_rtd_theme/blob/master/docs/changelog.rst)
- [Commits](https://github.com/readthedocs/sphinx_rtd_theme/compare/3.0.1...3.0.2)

---
updated-dependencies:
- dependency-name: sphinx-rtd-theme
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-11-15 17:21:42 -06:00
792 changed files with 12987 additions and 7554 deletions

View File

@@ -57,7 +57,13 @@ jobs:
- name: Checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
- uses: docker/metadata-action@8e5442c4ef9f78752691e2d8f8d19755c6f78e81
- name: Determine latest release tag
id: latest
run: |
git fetch --quiet --tags
echo "tag=$(git tag --list --sort=-v:refname | grep -E '^v[0-9]+\.[0-9]+\.[0-9]+$' | head -n 1)" | tee -a $GITHUB_OUTPUT
- uses: docker/metadata-action@369eb591f429131d6889c46b94e711f089e6ca96
id: docker_meta
with:
images: |
@@ -71,6 +77,7 @@ jobs:
type=semver,pattern={{major}}
type=ref,event=branch
type=ref,event=pr
type=raw,value=latest,enable=${{ github.ref == format('refs/tags/{0}', steps.latest.outputs.tag) }}
- name: Generate the Dockerfile
env:
@@ -113,7 +120,7 @@ jobs:
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Build & Deploy ${{ matrix.dockerfile[0] }}
uses: docker/build-push-action@4f58ea79222b3b9dc2c8bbdd6debcef730109a75
uses: docker/build-push-action@48aba3b46d1b1fec4febb7c5d0c644b249a11355
with:
context: dockerfiles/${{ matrix.dockerfile[0] }}
platforms: ${{ matrix.dockerfile[1] }}

View File

@@ -29,6 +29,7 @@ jobs:
- run: coverage xml
- name: "Upload coverage report to CodeCov"
uses: codecov/codecov-action@b9fd7d16f6d7d1b5d2bec1a2887e65ceed900238
uses: codecov/codecov-action@05f5a9cfad807516dbbef9929c4a42df3eb78766
with:
verbose: true
fail_ci_if_error: false

View File

@@ -3,5 +3,5 @@ clingo==5.7.1
flake8==7.1.1
isort==5.13.2
mypy==1.8.0
types-six==1.16.21.20241105
types-six==1.17.0.20241205
vermin==1.6.0

View File

@@ -134,7 +134,7 @@ jobs:
- name: Setup repo and non-root user
run: |
git --version
git config --global --add safe.directory /__w/spack/spack
git config --global --add safe.directory '*'
git fetch --unshallow
. .github/workflows/bin/setup_git.sh
useradd spack-test

View File

@@ -13,8 +13,7 @@ concurrency:
jobs:
# Validate that the code can be run on all the Python versions
# supported by Spack
# Validate that the code can be run on all the Python versions supported by Spack
validate:
runs-on: ubuntu-latest
steps:
@@ -74,7 +73,7 @@ jobs:
- name: Setup repo and non-root user
run: |
git --version
git config --global --add safe.directory /__w/spack/spack
git config --global --add safe.directory '*'
git fetch --unshallow
. .github/workflows/bin/setup_git.sh
useradd spack-test
@@ -87,6 +86,7 @@ jobs:
spack -d bootstrap now --dev
spack -d style -t black
spack unit-test -V
# Check we don't make the situation with circular imports worse
import-check:
runs-on: ubuntu-latest
steps:
@@ -146,3 +146,21 @@ jobs:
else
printf '\033[1;32mImport check passed: %s <= %s\033[0m\n' "$edges_after" "$edges_before"
fi
# Further style checks from pylint
pylint:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
fetch-depth: 0
- uses: actions/setup-python@0b93645e9fea7318ecaed2b359559ac225c90a2b
with:
python-version: '3.13'
cache: 'pip'
- name: Install Python packages
run: |
pip install --upgrade pip setuptools pylint
- name: Pylint (Spack Core)
run: |
pylint -j 4 --disable=all --enable=unspecified-encoding --ignore-paths=lib/spack/external lib

View File

@@ -70,7 +70,7 @@ Tutorial
----------------
We maintain a
[**hands-on tutorial**](https://spack.readthedocs.io/en/latest/tutorial.html).
[**hands-on tutorial**](https://spack-tutorial.readthedocs.io/).
It covers basic to advanced usage, packaging, developer features, and large HPC
deployments. You can do all of the exercises on your own laptop using a
Docker container.

View File

@@ -55,3 +55,11 @@ concretizer:
splice:
explicit: []
automatic: false
# Maximum time, in seconds, allowed for the 'solve' phase. If set to 0, there is no time limit.
timeout: 0
# If set to true, exceeding the timeout will always result in a concretization error. If false,
# the best (suboptimal) model computed before the timeout is used.
#
# Setting this to false yields unreproducible results, so we advise to use that value only
# for debugging purposes (e.g. check which constraints can help Spack concretize faster).
error_on_timeout: true
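A minimal sketch of reading these two new settings back from Python (assuming a Spack checkout on the Python path; the colon-separated path syntax below is the same one Spack uses for other configuration lookups):

import spack.config

# 0 means "no time limit"; any positive value is the solve-phase budget in seconds.
timeout = spack.config.get("concretizer:timeout", 0)

# True: exceeding the budget raises a concretization error.
# False: the best (possibly suboptimal) model found before the timeout is used.
error_on_timeout = spack.config.get("concretizer:error_on_timeout", True)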

View File

@@ -76,6 +76,8 @@ packages:
buildable: false
cray-mvapich2:
buildable: false
egl:
buildable: false
fujitsu-mpi:
buildable: false
hpcx-mpi:

View File

@@ -178,8 +178,8 @@ Spec-related modules
Contains :class:`~spack.spec.Spec`. Also implements most of the logic for concretization
of specs.
:mod:`spack.parser`
Contains :class:`~spack.parser.SpecParser` and functions related to parsing specs.
:mod:`spack.spec_parser`
Contains :class:`~spack.spec_parser.SpecParser` and functions related to parsing specs.
:mod:`spack.version`
Implements a simple :class:`~spack.version.Version` class with simple

View File

@@ -1326,6 +1326,7 @@ Required:
* Microsoft Visual Studio
* Python
* Git
* 7z
Optional:
* Intel Fortran (needed for some packages)
@@ -1391,6 +1392,13 @@ as the project providing Git support on Windows. This is additionally the recomm
for installing Git on Windows, a link to which can be found above. Spack requires the
utilities vendored by this project.
"""
7zip
"""
A tool for extracting ``.xz`` files is required for extracting source tarballs. The latest 7zip
can be located at https://sourceforge.net/projects/sevenzip/.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Step 2: Install and setup Spack
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

View File

@@ -5137,7 +5137,7 @@ other checks.
- Not applicable
* - :ref:`PythonPackage <pythonpackage>`
- Not applicable
- ``test`` (module imports)
- ``test_imports`` (module imports)
* - :ref:`QMakePackage <qmakepackage>`
- ``check`` (``make check``)
- Not applicable
@@ -5146,7 +5146,7 @@ other checks.
- Not applicable
* - :ref:`SIPPackage <sippackage>`
- Not applicable
- ``test`` (module imports)
- ``test_imports`` (module imports)
* - :ref:`WafPackage <wafpackage>`
- ``build_test`` (must be overridden)
- ``install_test`` (must be overridden)

View File

@@ -1,12 +1,12 @@
sphinx==8.1.3
sphinxcontrib-programoutput==0.17
sphinx_design==0.6.1
sphinx-rtd-theme==3.0.1
sphinx-rtd-theme==3.0.2
python-levenshtein==0.26.1
docutils==0.21.2
pygments==2.18.0
urllib3==2.2.3
pytest==8.3.3
pytest==8.3.4
isort==5.13.2
black==24.10.0
flake8==7.1.1

View File

@@ -66,7 +66,7 @@ def _is_url(path_or_url: str) -> bool:
return result
def system_path_filter(_func=None, arg_slice: Optional[slice] = None):
def _system_path_filter(_func=None, arg_slice: Optional[slice] = None):
"""Filters function arguments to account for platform path separators.
Optional slicing range can be specified to select specific arguments
@@ -100,6 +100,16 @@ def path_filter_caller(*args, **kwargs):
return holder_func
def _noop_decorator(_func=None, arg_slice: Optional[slice] = None):
return _func if _func else lambda x: x
if sys.platform == "win32":
system_path_filter = _system_path_filter
else:
system_path_filter = _noop_decorator
def sanitize_win_longpath(path: str) -> str:
"""Strip Windows extended path prefix from strings
Returns sanitized string.

View File

@@ -24,6 +24,7 @@
Callable,
Deque,
Dict,
Generator,
Iterable,
List,
Match,
@@ -300,35 +301,32 @@ def filter_file(
ignore_absent: bool = False,
start_at: Optional[str] = None,
stop_at: Optional[str] = None,
encoding: Optional[str] = "utf-8",
) -> None:
r"""Like sed, but uses python regular expressions.
Filters every line of each file through regex and replaces the file
with a filtered version. Preserves mode of filtered files.
Filters every line of each file through regex and replaces the file with a filtered version.
Preserves mode of filtered files.
As with re.sub, ``repl`` can be either a string or a callable.
If it is a callable, it is passed the match object and should
return a suitable replacement string. If it is a string, it
can contain ``\1``, ``\2``, etc. to represent back-substitution
as sed would allow.
As with re.sub, ``repl`` can be either a string or a callable. If it is a callable, it is
passed the match object and should return a suitable replacement string. If it is a string, it
can contain ``\1``, ``\2``, etc. to represent back-substitution as sed would allow.
Args:
regex (str): The regular expression to search for
repl (str): The string to replace matches with
*filenames: One or more files to search and replace
string (bool): Treat regex as a plain string. Default it False
backup (bool): Make backup file(s) suffixed with ``~``. Default is False
ignore_absent (bool): Ignore any files that don't exist.
Default is False
start_at (str): Marker used to start applying the replacements. If a
text line matches this marker filtering is started at the next line.
All contents before the marker and the marker itself are copied
verbatim. Default is to start filtering from the first line of the
file.
stop_at (str): Marker used to stop scanning the file further. If a text
line matches this marker filtering is stopped and the rest of the
file is copied verbatim. Default is to filter until the end of the
file.
regex: The regular expression to search for
repl: The string to replace matches with
*filenames: One or more files to search and replace
string: Treat regex as a plain string. Default it False
backup: Make backup file(s) suffixed with ``~``. Default is False
ignore_absent: Ignore any files that don't exist. Default is False
start_at: Marker used to start applying the replacements. If a text line matches this
marker filtering is started at the next line. All contents before the marker and the
marker itself are copied verbatim. Default is to start filtering from the first line of
the file.
stop_at: Marker used to stop scanning the file further. If a text line matches this marker
filtering is stopped and the rest of the file is copied verbatim. Default is to filter
until the end of the file.
encoding: The encoding to use when reading and writing the files. Default is None, which
uses the system's default encoding.
"""
# Allow strings to use \1, \2, etc. for replacement, like sed
if not callable(repl):
@@ -344,72 +342,56 @@ def groupid_to_group(x):
if string:
regex = re.escape(regex)
for filename in path_to_os_path(*filenames):
msg = 'FILTER FILE: {0} [replacing "{1}"]'
tty.debug(msg.format(filename, regex))
backup_filename = filename + "~"
tmp_filename = filename + ".spack~"
if ignore_absent and not os.path.exists(filename):
msg = 'FILTER FILE: file "{0}" not found. Skipping to next file.'
tty.debug(msg.format(filename))
regex_compiled = re.compile(regex)
for path in path_to_os_path(*filenames):
if ignore_absent and not os.path.exists(path):
tty.debug(f'FILTER FILE: file "{path}" not found. Skipping to next file.')
continue
else:
tty.debug(f'FILTER FILE: {path} [replacing "{regex}"]')
# Create backup file. Don't overwrite an existing backup
# file in case this file is being filtered multiple times.
if not os.path.exists(backup_filename):
shutil.copy(filename, backup_filename)
fd, temp_path = tempfile.mkstemp(
prefix=f"{os.path.basename(path)}.", dir=os.path.dirname(path)
)
os.close(fd)
# Create a temporary file to read from. We cannot use backup_filename
# in case filter_file is invoked multiple times on the same file.
shutil.copy(filename, tmp_filename)
shutil.copy(path, temp_path)
errored = False
try:
# Open as a text file and filter until the end of the file is
# reached, or we found a marker in the line if it was specified
#
# To avoid translating line endings (\n to \r\n and vice-versa)
# we force os.open to ignore translations and use the line endings
# the file comes with
with open(tmp_filename, mode="r", errors="surrogateescape", newline="") as input_file:
with open(filename, mode="w", errors="surrogateescape", newline="") as output_file:
do_filtering = start_at is None
# Using iter and readline is a workaround needed not to
# disable input_file.tell(), which will happen if we call
# input_file.next() implicitly via the for loop
for line in iter(input_file.readline, ""):
if stop_at is not None:
current_position = input_file.tell()
# Open as a text file and filter until the end of the file is reached, or we found a
# marker in the line if it was specified. To avoid translating line endings (\n to
# \r\n and vice-versa) use newline="".
with open(
temp_path, mode="r", errors="surrogateescape", newline="", encoding=encoding
) as input_file, open(
path, mode="w", errors="surrogateescape", newline="", encoding=encoding
) as output_file:
if start_at is None and stop_at is None: # common case, avoids branching in loop
for line in input_file:
output_file.write(re.sub(regex_compiled, repl, line))
else:
# state is -1 before start_at; 0 between; 1 after stop_at
state = 0 if start_at is None else -1
for line in input_file:
if state == 0:
if stop_at == line.strip():
output_file.write(line)
break
if do_filtering:
filtered_line = re.sub(regex, repl, line)
output_file.write(filtered_line)
else:
do_filtering = start_at == line.strip()
output_file.write(line)
else:
current_position = None
# If we stopped filtering at some point, reopen the file in
# binary mode and copy verbatim the remaining part
if current_position and stop_at:
with open(tmp_filename, mode="rb") as input_binary_buffer:
input_binary_buffer.seek(current_position)
with open(filename, mode="ab") as output_binary_buffer:
output_binary_buffer.writelines(input_binary_buffer.readlines())
state = 1
else:
line = re.sub(regex_compiled, repl, line)
elif state == -1 and start_at == line.strip():
state = 0
output_file.write(line)
except BaseException:
# clean up the original file on failure.
shutil.move(backup_filename, filename)
# restore the original file
os.rename(temp_path, path)
errored = True
raise
finally:
os.remove(tmp_filename)
if not backup and os.path.exists(backup_filename):
os.remove(backup_filename)
if not errored and not backup:
os.unlink(temp_path)
class FileFilter:
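A quick usage sketch of the rewritten filter_file above (the Makefile name, patterns, and marker lines are hypothetical; the new encoding keyword and the start_at/stop_at markers behave as documented in the docstring):

from llnl.util.filesystem import filter_file

# Replace -O2 with -O3 throughout a hypothetical Makefile, keeping a "Makefile~"
# backup and reading/writing the file as UTF-8 (the default for encoding).
filter_file(r"-O2", "-O3", "Makefile", backup=True, encoding="utf-8")

# Only filter the region between two marker lines; everything before the start
# marker and after the stop marker is copied verbatim.
filter_file(r"^CC\s*=.*$", "CC = cc", "Makefile",
            start_at="# begin generated", stop_at="# end generated")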
@@ -1114,12 +1096,12 @@ def hash_directory(directory, ignore=[]):
@contextmanager
@system_path_filter
def write_tmp_and_move(filename):
def write_tmp_and_move(filename: str, *, encoding: Optional[str] = None):
"""Write to a temporary file, then move into place."""
dirname = os.path.dirname(filename)
basename = os.path.basename(filename)
tmp = os.path.join(dirname, ".%s.tmp" % basename)
with open(tmp, "w") as f:
with open(tmp, "w", encoding=encoding) as f:
yield f
shutil.move(tmp, filename)
@@ -2772,22 +2754,6 @@ def prefixes(path):
return paths
@system_path_filter
def md5sum(file):
"""Compute the MD5 sum of a file.
Args:
file (str): file to be checksummed
Returns:
MD5 sum of the file's content
"""
md5 = hashlib.md5()
with open(file, "rb") as f:
md5.update(f.read())
return md5.digest()
@system_path_filter
def remove_directory_contents(dir):
"""Remove all contents of a directory."""
@@ -2838,6 +2804,25 @@ def temporary_dir(
remove_directory_contents(tmp_dir)
@contextmanager
def edit_in_place_through_temporary_file(file_path: str) -> Generator[str, None, None]:
"""Context manager for modifying ``file_path`` in place, preserving its inode and hardlinks,
for functions or external tools that do not support in-place editing. Notice that this function
is unsafe in that it works with paths instead of file descriptors, but this is by design,
since we assume the call site will create a new inode at the same path."""
tmp_fd, tmp_path = tempfile.mkstemp(
dir=os.path.dirname(file_path), prefix=f"{os.path.basename(file_path)}."
)
# windows cannot replace a file with open fds, so close since the call site needs to replace.
os.close(tmp_fd)
try:
shutil.copyfile(file_path, tmp_path, follow_symlinks=True)
yield tmp_path
shutil.copyfile(tmp_path, file_path, follow_symlinks=True)
finally:
os.unlink(tmp_path)
def filesummary(path, print_bytes=16) -> Tuple[int, bytes]:
"""Create a small summary of the given file. Does not error
when file does not exist.
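The new edit_in_place_through_temporary_file context manager above targets tools that cannot edit a file in place. A minimal sketch of a call site (the binary path is hypothetical; the codesign invocation mirrors the macOS re-signing step later in this diff, and the import assumes the helper lives alongside the other llnl.util.filesystem functions shown here):

import subprocess
from llnl.util.filesystem import edit_in_place_through_temporary_file

binary = "/path/to/installed/binary"  # hypothetical

# The external tool runs on a temporary copy; the result is copied back over
# the original path, so the original inode (and any hardlinks) is preserved.
with edit_in_place_through_temporary_file(binary) as tmp:
    subprocess.run(["codesign", "-fs-", tmp], check=True)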

View File

@@ -96,8 +96,8 @@ def get_fh(self, path: str) -> IO:
Arguments:
path: path to lock file we want a filehandle for
"""
# Open writable files as 'r+' so we can upgrade to write later
os_mode, fh_mode = (os.O_RDWR | os.O_CREAT), "r+"
# Open writable files as rb+ so we can upgrade to write later
os_mode, fh_mode = (os.O_RDWR | os.O_CREAT), "rb+"
pid = os.getpid()
open_file = None # OpenFile object, if there is one
@@ -124,7 +124,7 @@ def get_fh(self, path: str) -> IO:
# we know path exists but not if it's writable. If it's read-only,
# only open the file for reading (and fail if we're trying to get
# an exclusive (write) lock on it)
os_mode, fh_mode = os.O_RDONLY, "r"
os_mode, fh_mode = os.O_RDONLY, "rb"
fd = os.open(path, os_mode)
fh = os.fdopen(fd, fh_mode)
@@ -243,7 +243,7 @@ def __init__(
helpful for distinguishing between different Spack locks.
"""
self.path = path
self._file: Optional[IO] = None
self._file: Optional[IO[bytes]] = None
self._reads = 0
self._writes = 0
@@ -329,9 +329,9 @@ def _lock(self, op: int, timeout: Optional[float] = None) -> Tuple[float, int]:
self._ensure_parent_directory()
self._file = FILE_TRACKER.get_fh(self.path)
if LockType.to_module(op) == fcntl.LOCK_EX and self._file.mode == "r":
if LockType.to_module(op) == fcntl.LOCK_EX and self._file.mode == "rb":
# Attempt to upgrade to write lock w/a read-only file.
# If the file were writable, we'd have opened it 'r+'
# If the file were writable, we'd have opened it rb+
raise LockROFileError(self.path)
self._log_debug(
@@ -426,7 +426,7 @@ def _read_log_debug_data(self) -> None:
line = self._file.read()
if line:
pid, host = line.strip().split(",")
pid, host = line.decode("utf-8").strip().split(",")
_, _, pid = pid.rpartition("=")
_, _, self.host = host.rpartition("=")
self.pid = int(pid)
@@ -442,7 +442,7 @@ def _write_log_debug_data(self) -> None:
# write pid, host to disk to sync over FS
self._file.seek(0)
self._file.write("pid=%s,host=%s" % (self.pid, self.host))
self._file.write(f"pid={self.pid},host={self.host}".encode("utf-8"))
self._file.truncate()
self._file.flush()
os.fsync(self._file.fileno())

View File

@@ -161,7 +161,7 @@ def _err_check(result, func, args):
)
# Use conout$ here to handle a redirectired stdout/get active console associated
# with spack
with open(r"\\.\CONOUT$", "w") as conout:
with open(r"\\.\CONOUT$", "w", encoding="utf-8") as conout:
# Link above would use kernel32.GetStdHandle(-11) however this would not handle
# a redirected stdout appropriately, so we always refer to the current CONSOLE out
# which is defined as conout$ on Windows.

View File

@@ -21,7 +21,7 @@
from multiprocessing.connection import Connection
from threading import Thread
from types import ModuleType
from typing import Callable, Optional, Union
from typing import Callable, Optional
import llnl.util.tty as tty
@@ -762,7 +762,7 @@ def __enter__(self):
self.reader = open(self.logfile, mode="rb+")
# Dup stdout so we can still write to it after redirection
self.echo_writer = open(os.dup(sys.stdout.fileno()), "w")
self.echo_writer = open(os.dup(sys.stdout.fileno()), "w", encoding=sys.stdout.encoding)
# Redirect stdout and stderr to write to logfile
self.stderr.redirect_stream(self.writer.fileno())
self.stdout.redirect_stream(self.writer.fileno())
@@ -879,10 +879,13 @@ def _writer_daemon(
write_fd.close()
# 1. Use line buffering (3rd param = 1) since Python 3 has a bug
# that prevents unbuffered text I/O.
# 2. Python 3.x before 3.7 does not open with UTF-8 encoding by default
# that prevents unbuffered text I/O. [needs citation]
# 2. Enforce a UTF-8 interpretation of build process output with errors replaced by '?'.
# The downside is that the log file will not contain the exact output of the build process.
# 3. closefd=False because Connection has "ownership"
read_file = os.fdopen(read_fd.fileno(), "r", 1, encoding="utf-8", closefd=False)
read_file = os.fdopen(
read_fd.fileno(), "r", 1, encoding="utf-8", errors="replace", closefd=False
)
if stdin_fd:
stdin_file = os.fdopen(stdin_fd.fileno(), closefd=False)
@@ -928,11 +931,7 @@ def _writer_daemon(
try:
while line_count < 100:
# Handle output from the calling process.
try:
line = _retry(read_file.readline)()
except UnicodeDecodeError:
# installs like --test=root gpgme produce non-UTF8 logs
line = "<line lost: output was not encoded as UTF-8>\n"
line = _retry(read_file.readline)()
if not line:
return
@@ -946,6 +945,13 @@ def _writer_daemon(
output_line = clean_line
if filter_fn:
output_line = filter_fn(clean_line)
enc = sys.stdout.encoding
if enc != "utf-8":
# On Python 3.6 and 3.7-3.14 with non-{utf-8,C} locale stdout
# may not be able to handle utf-8 output. We do an inefficient
# dance of re-encoding with errors replaced, so stdout.write
# does not raise.
output_line = output_line.encode(enc, "replace").decode(enc)
sys.stdout.write(output_line)
# Stripped output to log file.
@@ -1022,22 +1028,3 @@ def wrapped(*args, **kwargs):
def _input_available(f):
return f in select.select([f], [], [], 0)[0]
LogType = Union[nixlog, winlog]
def print_message(logger: LogType, msg: str, verbose: bool = False):
"""Print the message to the log, optionally echoing.
Args:
logger: instance of the output logger (e.g. nixlog or winlog)
msg: message being output
verbose: ``True`` displays verbose output, ``False`` suppresses
it (``False`` is default)
"""
if verbose:
with logger.force_echo():
tty.info(msg, format="g")
else:
tty.info(msg, format="g")

View File

@@ -571,8 +571,13 @@ def _search_for_deprecated_package_methods(pkgs, error_cls):
@package_properties
def _ensure_all_package_names_are_lowercase(pkgs, error_cls):
"""Ensure package names are lowercase and consistent"""
reserved_names = ("all",)
badname_regex, errors = re.compile(r"[_A-Z]"), []
for pkg_name in pkgs:
if pkg_name in reserved_names:
error_msg = f"The name '{pkg_name}' is reserved, and cannot be used for packages"
errors.append(error_cls(error_msg, []))
if badname_regex.search(pkg_name):
error_msg = f"Package name '{pkg_name}' should be lowercase and must not contain '_'"
errors.append(error_cls(error_msg, []))
@@ -651,7 +656,7 @@ def _ensure_docstring_and_no_fixme(pkgs, error_cls):
for pkg_name in pkgs:
details = []
filename = spack.repo.PATH.filename_for_package_name(pkg_name)
with open(filename, "r") as package_file:
with open(filename, "r", encoding="utf-8") as package_file:
for i, line in enumerate(package_file):
pattern = next((r for r in fixme_regexes if r.search(line)), None)
if pattern:
@@ -688,19 +693,19 @@ def invalid_sha256_digest(fetcher):
return h, True
return None, False
error_msg = "Package '{}' does not use sha256 checksum".format(pkg_name)
error_msg = f"Package '{pkg_name}' does not use sha256 checksum"
details = []
for v, args in pkg.versions.items():
fetcher = spack.fetch_strategy.for_package_version(pkg, v)
digest, is_bad = invalid_sha256_digest(fetcher)
if is_bad:
details.append("{}@{} uses {}".format(pkg_name, v, digest))
details.append(f"{pkg_name}@{v} uses {digest}")
for _, resources in pkg.resources.items():
for resource in resources:
digest, is_bad = invalid_sha256_digest(resource.fetcher)
if is_bad:
details.append("Resource in '{}' uses {}".format(pkg_name, digest))
details.append(f"Resource in '{pkg_name}' uses {digest}")
if details:
errors.append(error_cls(error_msg, details))
@@ -804,7 +809,7 @@ def _uses_deprecated_globals(pkgs, error_cls):
continue
file = spack.repo.PATH.filename_for_package_name(pkg_name)
tree = ast.parse(open(file).read())
tree = ast.parse(open(file, "rb").read())
visitor = DeprecatedMagicGlobals(("std_cmake_args", "std_meson_args", "std_pip_args"))
visitor.visit(tree)
if visitor.references_to_globals:
@@ -1004,20 +1009,6 @@ def _issues_in_depends_on_directive(pkgs, error_cls):
for when, deps_by_name in pkg_cls.dependencies.items():
for dep_name, dep in deps_by_name.items():
# Check if there are nested dependencies declared. We don't want directives like:
#
# depends_on('foo+bar ^fee+baz')
#
# but we'd like to have two dependencies listed instead.
nested_dependencies = dep.spec.dependencies()
if nested_dependencies:
summary = f"{pkg_name}: nested dependency declaration '{dep.spec}'"
ndir = len(nested_dependencies) + 1
details = [
f"split depends_on('{dep.spec}', when='{when}') into {ndir} directives",
f"in {filename}",
]
errors.append(error_cls(summary=summary, details=details))
def check_virtual_with_variants(spec, msg):
if not spec.virtual or not spec.variants:

View File

@@ -40,7 +40,7 @@
import spack.hash_types as ht
import spack.hooks
import spack.hooks.sbang
import spack.mirror
import spack.mirrors.mirror
import spack.oci.image
import spack.oci.oci
import spack.oci.opener
@@ -69,10 +69,8 @@
Digest,
ImageReference,
default_config,
default_index_tag,
default_manifest,
default_tag,
tag_is_spec,
ensure_valid_tag,
)
from spack.oci.oci import (
copy_missing_layers_with_retry,
@@ -83,7 +81,6 @@
)
from spack.package_prefs import get_package_dir_permissions, get_package_group
from spack.relocate_text import utf8_paths_to_single_binary_regex
from spack.spec import Spec
from spack.stage import Stage
from spack.util.executable import which
@@ -369,7 +366,7 @@ def update(self, with_cooldown=False):
on disk under ``_index_cache_root``)."""
self._init_local_index_cache()
configured_mirror_urls = [
m.fetch_url for m in spack.mirror.MirrorCollection(binary=True).values()
m.fetch_url for m in spack.mirrors.mirror.MirrorCollection(binary=True).values()
]
items_to_remove = []
spec_cache_clear_needed = False
@@ -586,7 +583,7 @@ def buildinfo_file_name(prefix):
def read_buildinfo_file(prefix):
"""Read buildinfo file"""
with open(buildinfo_file_name(prefix), "r") as f:
with open(buildinfo_file_name(prefix), "r", encoding="utf-8") as f:
return syaml.load(f)
@@ -827,10 +824,10 @@ def _read_specs_and_push_index(
contents = read_method(file)
# Need full spec.json name or this gets confused with index.json.
if file.endswith(".json.sig"):
specfile_json = Spec.extract_json_from_clearsig(contents)
fetched_spec = Spec.from_dict(specfile_json)
specfile_json = spack.spec.Spec.extract_json_from_clearsig(contents)
fetched_spec = spack.spec.Spec.from_dict(specfile_json)
elif file.endswith(".json"):
fetched_spec = Spec.from_json(contents)
fetched_spec = spack.spec.Spec.from_json(contents)
else:
continue
@@ -840,17 +837,17 @@ def _read_specs_and_push_index(
# Now generate the index, compute its hash, and push the two files to
# the mirror.
index_json_path = os.path.join(temp_dir, "index.json")
with open(index_json_path, "w") as f:
with open(index_json_path, "w", encoding="utf-8") as f:
db._write_to_file(f)
# Read the index back in and compute its hash
with open(index_json_path) as f:
with open(index_json_path, encoding="utf-8") as f:
index_string = f.read()
index_hash = compute_hash(index_string)
# Write the hash out to a local file
index_hash_path = os.path.join(temp_dir, "index.json.hash")
with open(index_hash_path, "w") as f:
with open(index_hash_path, "w", encoding="utf-8") as f:
f.write(index_hash)
# Push the index itself
@@ -884,7 +881,7 @@ def _specs_from_cache_aws_cli(cache_prefix):
aws = which("aws")
def file_read_method(file_path):
with open(file_path) as fd:
with open(file_path, encoding="utf-8") as fd:
return fd.read()
tmpspecsdir = tempfile.mkdtemp()
@@ -1029,7 +1026,7 @@ def generate_key_index(key_prefix: str, tmpdir: str) -> None:
target = os.path.join(tmpdir, "index.json")
index = {"keys": dict((fingerprint, {}) for fingerprint in sorted(set(fingerprints)))}
with open(target, "w") as f:
with open(target, "w", encoding="utf-8") as f:
sjson.dump(index, f)
try:
@@ -1100,7 +1097,7 @@ class ExistsInBuildcache(NamedTuple):
class BuildcacheFiles:
def __init__(self, spec: Spec, local: str, remote: str):
def __init__(self, spec: spack.spec.Spec, local: str, remote: str):
"""
Args:
spec: The spec whose tarball and specfile are being managed.
@@ -1130,7 +1127,7 @@ def local_tarball(self) -> str:
return os.path.join(self.local, f"{self.spec.dag_hash()}.tar.gz")
def _exists_in_buildcache(spec: Spec, tmpdir: str, out_url: str) -> ExistsInBuildcache:
def _exists_in_buildcache(spec: spack.spec.Spec, tmpdir: str, out_url: str) -> ExistsInBuildcache:
"""returns a tuple of bools (signed, unsigned, tarball) indicating whether specfiles/tarballs
exist in the buildcache"""
files = BuildcacheFiles(spec, tmpdir, out_url)
@@ -1141,7 +1138,11 @@ def _exists_in_buildcache(spec: Spec, tmpdir: str, out_url: str) -> ExistsInBuil
def _url_upload_tarball_and_specfile(
spec: Spec, tmpdir: str, out_url: str, exists: ExistsInBuildcache, signing_key: Optional[str]
spec: spack.spec.Spec,
tmpdir: str,
out_url: str,
exists: ExistsInBuildcache,
signing_key: Optional[str],
):
files = BuildcacheFiles(spec, tmpdir, out_url)
tarball = files.local_tarball()
@@ -1159,7 +1160,7 @@ def _url_upload_tarball_and_specfile(
web_util.push_to_url(tarball, files.remote_tarball(), keep_original=False)
specfile = files.local_specfile()
with open(specfile, "w") as f:
with open(specfile, "w", encoding="utf-8") as f:
# Note: when using gpg clear sign, we need to avoid long lines (19995 chars).
# If lines are longer, they are truncated without error. Thanks GPG!
# So, here we still add newlines, but no indent, so save on file size and
@@ -1176,7 +1177,7 @@ def _url_upload_tarball_and_specfile(
class Uploader:
def __init__(self, mirror: spack.mirror.Mirror, force: bool, update_index: bool):
def __init__(self, mirror: spack.mirrors.mirror.Mirror, force: bool, update_index: bool):
self.mirror = mirror
self.force = force
self.update_index = update_index
@@ -1224,7 +1225,7 @@ def tag(self, tag: str, roots: List[spack.spec.Spec]):
class OCIUploader(Uploader):
def __init__(
self,
mirror: spack.mirror.Mirror,
mirror: spack.mirrors.mirror.Mirror,
force: bool,
update_index: bool,
base_image: Optional[str],
@@ -1273,7 +1274,7 @@ def tag(self, tag: str, roots: List[spack.spec.Spec]):
class URLUploader(Uploader):
def __init__(
self,
mirror: spack.mirror.Mirror,
mirror: spack.mirrors.mirror.Mirror,
force: bool,
update_index: bool,
signing_key: Optional[str],
@@ -1297,7 +1298,7 @@ def push(
def make_uploader(
mirror: spack.mirror.Mirror,
mirror: spack.mirrors.mirror.Mirror,
force: bool = False,
update_index: bool = False,
signing_key: Optional[str] = None,
@@ -1314,7 +1315,7 @@ def make_uploader(
)
def _format_spec(spec: Spec) -> str:
def _format_spec(spec: spack.spec.Spec) -> str:
return spec.cformat("{name}{@version}{/hash:7}")
@@ -1337,7 +1338,7 @@ def _progress(self):
return f"[{self.n:{digits}}/{self.total}] "
return ""
def start(self, spec: Spec, running: bool) -> None:
def start(self, spec: spack.spec.Spec, running: bool) -> None:
self.n += 1
self.running = running
self.pre = self._progress()
@@ -1356,18 +1357,18 @@ def fail(self) -> None:
def _url_push(
specs: List[Spec],
specs: List[spack.spec.Spec],
out_url: str,
signing_key: Optional[str],
force: bool,
update_index: bool,
tmpdir: str,
executor: concurrent.futures.Executor,
) -> Tuple[List[Spec], List[Tuple[Spec, BaseException]]]:
) -> Tuple[List[spack.spec.Spec], List[Tuple[spack.spec.Spec, BaseException]]]:
"""Pushes to the provided build cache, and returns a list of skipped specs that were already
present (when force=False), and a list of errors. Does not raise on error."""
skipped: List[Spec] = []
errors: List[Tuple[Spec, BaseException]] = []
skipped: List[spack.spec.Spec] = []
errors: List[Tuple[spack.spec.Spec, BaseException]] = []
exists_futures = [
executor.submit(_exists_in_buildcache, spec, tmpdir, out_url) for spec in specs
@@ -1440,7 +1441,7 @@ def _url_push(
return skipped, errors
def _oci_upload_success_msg(spec: Spec, digest: Digest, size: int, elapsed: float):
def _oci_upload_success_msg(spec: spack.spec.Spec, digest: Digest, size: int, elapsed: float):
elapsed = max(elapsed, 0.001) # guard against division by zero
return (
f"Pushed {_format_spec(spec)}: {digest} ({elapsed:.2f}s, "
@@ -1526,7 +1527,7 @@ def _oci_put_manifest(
):
architecture = _oci_archspec_to_gooarch(specs[0])
expected_blobs: List[Spec] = [
expected_blobs: List[spack.spec.Spec] = [
s
for s in traverse.traverse_nodes(specs, order="topo", deptype=("link", "run"), root=True)
if not s.external
@@ -1570,7 +1571,7 @@ def _oci_put_manifest(
config_file = os.path.join(tmpdir, f"{specs[0].dag_hash()}.config.json")
with open(config_file, "w") as f:
with open(config_file, "w", encoding="utf-8") as f:
json.dump(config, f, separators=(",", ":"))
config_file_checksum = Digest.from_sha256(
@@ -1640,19 +1641,33 @@ def _oci_update_base_images(
)
def _oci_default_tag(spec: spack.spec.Spec) -> str:
"""Return a valid, default image tag for a spec."""
return ensure_valid_tag(f"{spec.name}-{spec.version}-{spec.dag_hash()}.spack")
#: Default OCI index tag
default_index_tag = "index.spack"
def tag_is_spec(tag: str) -> bool:
"""Check if a tag is likely a Spec"""
return tag.endswith(".spack") and tag != default_index_tag
def _oci_push(
*,
target_image: ImageReference,
base_image: Optional[ImageReference],
installed_specs_with_deps: List[Spec],
installed_specs_with_deps: List[spack.spec.Spec],
tmpdir: str,
executor: concurrent.futures.Executor,
force: bool = False,
) -> Tuple[
List[Spec],
List[spack.spec.Spec],
Dict[str, Tuple[dict, dict]],
Dict[str, spack.oci.oci.Blob],
List[Tuple[Spec, BaseException]],
List[Tuple[spack.spec.Spec, BaseException]],
]:
# Spec dag hash -> blob
checksums: Dict[str, spack.oci.oci.Blob] = {}
@@ -1661,13 +1676,15 @@ def _oci_push(
base_images: Dict[str, Tuple[dict, dict]] = {}
# Specs not uploaded because they already exist
skipped: List[Spec] = []
skipped: List[spack.spec.Spec] = []
if not force:
tty.info("Checking for existing specs in the buildcache")
blobs_to_upload = []
tags_to_check = (target_image.with_tag(default_tag(s)) for s in installed_specs_with_deps)
tags_to_check = (
target_image.with_tag(_oci_default_tag(s)) for s in installed_specs_with_deps
)
available_blobs = executor.map(_oci_get_blob_info, tags_to_check)
for spec, maybe_blob in zip(installed_specs_with_deps, available_blobs):
@@ -1695,8 +1712,8 @@ def _oci_push(
executor.submit(_oci_push_pkg_blob, target_image, spec, tmpdir) for spec in blobs_to_upload
]
manifests_to_upload: List[Spec] = []
errors: List[Tuple[Spec, BaseException]] = []
manifests_to_upload: List[spack.spec.Spec] = []
errors: List[Tuple[spack.spec.Spec, BaseException]] = []
# And update the spec to blob mapping for successful uploads
for spec, blob_future in zip(blobs_to_upload, blob_futures):
@@ -1722,7 +1739,7 @@ def _oci_push(
base_image_cache=base_images,
)
def extra_config(spec: Spec):
def extra_config(spec: spack.spec.Spec):
spec_dict = spec.to_dict(hash=ht.dag_hash)
spec_dict["buildcache_layout_version"] = CURRENT_BUILD_CACHE_LAYOUT_VERSION
spec_dict["binary_cache_checksum"] = {
@@ -1738,7 +1755,7 @@ def extra_config(spec: Spec):
_oci_put_manifest,
base_images,
checksums,
target_image.with_tag(default_tag(spec)),
target_image.with_tag(_oci_default_tag(spec)),
tmpdir,
extra_config(spec),
{"org.opencontainers.image.description": spec.format()},
@@ -1755,7 +1772,7 @@ def extra_config(spec: Spec):
manifest_progress.start(spec, manifest_future.running())
if error is None:
manifest_progress.ok(
f"Tagged {_format_spec(spec)} as {target_image.with_tag(default_tag(spec))}"
f"Tagged {_format_spec(spec)} as {target_image.with_tag(_oci_default_tag(spec))}"
)
else:
manifest_progress.fail()
@@ -1790,13 +1807,13 @@ def _oci_update_index(
db = BuildCacheDatabase(db_root_dir)
for spec_dict in spec_dicts:
spec = Spec.from_dict(spec_dict)
spec = spack.spec.Spec.from_dict(spec_dict)
db.add(spec)
db.mark(spec, "in_buildcache", True)
# Create the index.json file
index_json_path = os.path.join(tmpdir, "index.json")
with open(index_json_path, "w") as f:
with open(index_json_path, "w", encoding="utf-8") as f:
db._write_to_file(f)
# Create an empty config.json file
@@ -1905,7 +1922,7 @@ def _get_valid_spec_file(path: str, max_supported_layout: int) -> Tuple[Dict, in
try:
as_string = binary_content.decode("utf-8")
if path.endswith(".json.sig"):
spec_dict = Spec.extract_json_from_clearsig(as_string)
spec_dict = spack.spec.Spec.extract_json_from_clearsig(as_string)
else:
spec_dict = json.loads(as_string)
except Exception as e:
@@ -1953,9 +1970,9 @@ def download_tarball(spec, unsigned: Optional[bool] = False, mirrors_for_spec=No
"signature_verified": "true-if-binary-pkg-was-already-verified"
}
"""
configured_mirrors: Iterable[spack.mirror.Mirror] = spack.mirror.MirrorCollection(
binary=True
).values()
configured_mirrors: Iterable[spack.mirrors.mirror.Mirror] = (
spack.mirrors.mirror.MirrorCollection(binary=True).values()
)
if not configured_mirrors:
tty.die("Please add a spack mirror to allow download of pre-compiled packages.")
@@ -1980,7 +1997,7 @@ def fetch_url_to_mirror(url):
for mirror in configured_mirrors:
if mirror.fetch_url == url:
return mirror
return spack.mirror.Mirror(url)
return spack.mirrors.mirror.Mirror(url)
mirrors = [fetch_url_to_mirror(url) for url in mirror_urls]
@@ -2001,7 +2018,7 @@ def fetch_url_to_mirror(url):
if fetch_url.startswith("oci://"):
ref = spack.oci.image.ImageReference.from_string(
fetch_url[len("oci://") :]
).with_tag(spack.oci.image.default_tag(spec))
).with_tag(_oci_default_tag(spec))
# Fetch the manifest
try:
@@ -2245,7 +2262,8 @@ def relocate_package(spec):
]
if analogs:
# Prefer same-name analogs and prefer higher versions
# This matches the preferences in Spec.splice, so we will find same node
# This matches the preferences in spack.spec.Spec.splice, so we
# will find same node
analog = max(analogs, key=lambda a: (a.name == s.name, a.version))
lookup_dag_hash = analog.dag_hash()
@@ -2334,7 +2352,9 @@ def is_backup_file(file):
if not codesign:
return
for binary in changed_files:
codesign("-fs-", binary)
# preserve the original inode by running codesign on a copy
with fsys.edit_in_place_through_temporary_file(binary) as tmp_binary:
codesign("-fs-", tmp_binary)
# If we are installing back to the same location
# relocate the sbang location if the spack directory changed
@@ -2648,7 +2668,7 @@ def try_direct_fetch(spec, mirrors=None):
specfile_is_signed = False
found_specs = []
binary_mirrors = spack.mirror.MirrorCollection(mirrors=mirrors, binary=True).values()
binary_mirrors = spack.mirrors.mirror.MirrorCollection(mirrors=mirrors, binary=True).values()
for mirror in binary_mirrors:
buildcache_fetch_url_json = url_util.join(
@@ -2679,10 +2699,10 @@ def try_direct_fetch(spec, mirrors=None):
# are concrete (as they are built) so we need to mark this spec
# concrete on read-in.
if specfile_is_signed:
specfile_json = Spec.extract_json_from_clearsig(specfile_contents)
fetched_spec = Spec.from_dict(specfile_json)
specfile_json = spack.spec.Spec.extract_json_from_clearsig(specfile_contents)
fetched_spec = spack.spec.Spec.from_dict(specfile_json)
else:
fetched_spec = Spec.from_json(specfile_contents)
fetched_spec = spack.spec.Spec.from_json(specfile_contents)
fetched_spec._mark_concrete()
found_specs.append({"mirror_url": mirror.fetch_url, "spec": fetched_spec})
@@ -2709,7 +2729,7 @@ def get_mirrors_for_spec(spec=None, mirrors_to_check=None, index_only=False):
if spec is None:
return []
if not spack.mirror.MirrorCollection(mirrors=mirrors_to_check, binary=True):
if not spack.mirrors.mirror.MirrorCollection(mirrors=mirrors_to_check, binary=True):
tty.debug("No Spack mirrors are currently configured")
return {}
@@ -2748,7 +2768,7 @@ def clear_spec_cache():
def get_keys(install=False, trust=False, force=False, mirrors=None):
"""Get pgp public keys available on mirror with suffix .pub"""
mirror_collection = mirrors or spack.mirror.MirrorCollection(binary=True)
mirror_collection = mirrors or spack.mirrors.mirror.MirrorCollection(binary=True)
if not mirror_collection:
tty.die("Please add a spack mirror to allow " + "download of build caches.")
@@ -2803,7 +2823,7 @@ def get_keys(install=False, trust=False, force=False, mirrors=None):
def _url_push_keys(
*mirrors: Union[spack.mirror.Mirror, str],
*mirrors: Union[spack.mirrors.mirror.Mirror, str],
keys: List[str],
tmpdir: str,
update_index: bool = False,
@@ -2870,7 +2890,7 @@ def check_specs_against_mirrors(mirrors, specs, output_file=None):
"""
rebuilds = {}
for mirror in spack.mirror.MirrorCollection(mirrors, binary=True).values():
for mirror in spack.mirrors.mirror.MirrorCollection(mirrors, binary=True).values():
tty.debug("Checking for built specs at {0}".format(mirror.fetch_url))
rebuild_list = []
@@ -2887,7 +2907,7 @@ def check_specs_against_mirrors(mirrors, specs, output_file=None):
}
if output_file:
with open(output_file, "w") as outf:
with open(output_file, "w", encoding="utf-8") as outf:
outf.write(json.dumps(rebuilds))
return 1 if rebuilds else 0
@@ -2914,7 +2934,7 @@ def _download_buildcache_entry(mirror_root, descriptions):
def download_buildcache_entry(file_descriptions, mirror_url=None):
if not mirror_url and not spack.mirror.MirrorCollection(binary=True):
if not mirror_url and not spack.mirrors.mirror.MirrorCollection(binary=True):
tty.die(
"Please provide or add a spack mirror to allow " + "download of buildcache entries."
)
@@ -2923,7 +2943,7 @@ def download_buildcache_entry(file_descriptions, mirror_url=None):
mirror_root = os.path.join(mirror_url, BUILD_CACHE_RELATIVE_PATH)
return _download_buildcache_entry(mirror_root, file_descriptions)
for mirror in spack.mirror.MirrorCollection(binary=True).values():
for mirror in spack.mirrors.mirror.MirrorCollection(binary=True).values():
mirror_root = os.path.join(mirror.fetch_url, BUILD_CACHE_RELATIVE_PATH)
if _download_buildcache_entry(mirror_root, file_descriptions):
@@ -2981,7 +3001,7 @@ def __init__(self, all_architectures):
self.possible_specs = specs
def __call__(self, spec: Spec, **kwargs):
def __call__(self, spec: spack.spec.Spec, **kwargs):
"""
Args:
spec: The spec being searched for
@@ -3119,7 +3139,7 @@ def __init__(self, url: str, local_hash, urlopen=None) -> None:
def conditional_fetch(self) -> FetchIndexResult:
"""Download an index from an OCI registry type mirror."""
url_manifest = self.ref.with_tag(spack.oci.image.default_index_tag).manifest_url()
url_manifest = self.ref.with_tag(default_index_tag).manifest_url()
try:
response = self.urlopen(
urllib.request.Request(

View File

@@ -37,7 +37,7 @@
import spack.binary_distribution
import spack.config
import spack.detection
import spack.mirror
import spack.mirrors.mirror
import spack.platforms
import spack.spec
import spack.store
@@ -91,7 +91,7 @@ def __init__(self, conf: ConfigDictionary) -> None:
self.metadata_dir = spack.util.path.canonicalize_path(conf["metadata"])
# Promote (relative) paths to file urls
self.url = spack.mirror.Mirror(conf["info"]["url"]).fetch_url
self.url = spack.mirrors.mirror.Mirror(conf["info"]["url"]).fetch_url
@property
def mirror_scope(self) -> spack.config.InternalConfigScope:

View File

@@ -882,6 +882,9 @@ def __init__(self, *roots: spack.spec.Spec, context: Context):
elif context == Context.RUN:
self.root_depflag = dt.RUN | dt.LINK
def accept(self, item):
return True
def neighbors(self, item):
spec = item.edge.spec
if spec.dag_hash() in self.root_hashes:
@@ -919,19 +922,19 @@ def effective_deptypes(
a flag specifying in what way they do so. The list is ordered topologically
from root to leaf, meaning that environment modifications should be applied
in reverse so that dependents override dependencies, not the other way around."""
visitor = traverse.TopoVisitor(
EnvironmentVisitor(*specs, context=context),
key=lambda x: x.dag_hash(),
topo_sorted_edges = traverse.traverse_topo_edges_generator(
traverse.with_artificial_edges(specs),
visitor=EnvironmentVisitor(*specs, context=context),
key=traverse.by_dag_hash,
root=True,
all_edges=True,
)
traverse.traverse_depth_first_with_visitor(traverse.with_artificial_edges(specs), visitor)
# Dictionary with "no mode" as default value, so it's easy to write modes[x] |= flag.
use_modes = defaultdict(lambda: UseMode(0))
nodes_with_type = []
for edge in visitor.edges:
for edge in topo_sorted_edges:
parent, child, depflag = edge.parent, edge.spec, edge.depflag
# Mark the starting point
@@ -1423,27 +1426,20 @@ def make_stack(tb, stack=None):
# We found obj, the Package implementation we care about.
# Point out the location in the install method where we failed.
filename = inspect.getfile(frame.f_code)
lineno = frame.f_lineno
if os.path.basename(filename) == "package.py":
# subtract 1 because we inject a magic import at the top of package files.
# TODO: get rid of the magic import.
lineno -= 1
lines = ["{0}:{1:d}, in {2}:".format(filename, lineno, frame.f_code.co_name)]
lines = [f"{filename}:{frame.f_lineno}, in {frame.f_code.co_name}:"]
# Build a message showing context in the install method.
sourcelines, start = inspect.getsourcelines(frame)
# Calculate lineno of the error relative to the start of the function.
fun_lineno = lineno - start
fun_lineno = frame.f_lineno - start
start_ctx = max(0, fun_lineno - context)
sourcelines = sourcelines[start_ctx : fun_lineno + context + 1]
for i, line in enumerate(sourcelines):
is_error = start_ctx + i == fun_lineno
mark = ">> " if is_error else " "
# Add start to get lineno relative to start of file, not function.
marked = " {0}{1:-6d}{2}".format(mark, start + start_ctx + i, line.rstrip())
marked = f" {'>> ' if is_error else ' '}{start + start_ctx + i:-6d}{line.rstrip()}"
if is_error:
marked = colorize("@R{%s}" % cescape(marked))
lines.append(marked)

View File

@@ -112,7 +112,7 @@ def execute_build_time_tests(builder: spack.builder.Builder):
if not builder.pkg.run_tests or not builder.build_time_test_callbacks:
return
builder.phase_tests("build", builder.build_time_test_callbacks)
builder.pkg.tester.phase_tests(builder, "build", builder.build_time_test_callbacks)
def execute_install_time_tests(builder: spack.builder.Builder):
@@ -125,7 +125,7 @@ def execute_install_time_tests(builder: spack.builder.Builder):
if not builder.pkg.run_tests or not builder.install_time_test_callbacks:
return
builder.phase_tests("install", builder.install_time_test_callbacks)
builder.pkg.tester.phase_tests(builder, "install", builder.install_time_test_callbacks)
class BuilderWithDefaults(spack.builder.Builder):

View File

@@ -182,10 +182,7 @@ def patch_config_files(self) -> bool:
@property
def _removed_la_files_log(self) -> str:
"""File containing the list of removed libtool archives"""
build_dir = self.build_directory
if not os.path.isabs(self.build_directory):
build_dir = os.path.join(self.pkg.stage.path, build_dir)
return os.path.join(build_dir, "removed_la_files.txt")
return os.path.join(self.build_directory, "removed_la_files.txt")
@property
def archive_files(self) -> List[str]:
@@ -523,7 +520,12 @@ def configure_abs_path(self) -> str:
@property
def build_directory(self) -> str:
"""Override to provide another place to build the package"""
return self.configure_directory
# Handle the case where the configure directory is set to a non-absolute path
# Non-absolute paths are always relative to the staging source path
build_dir = self.configure_directory
if not os.path.isabs(build_dir):
build_dir = os.path.join(self.pkg.stage.source_path, build_dir)
return build_dir
@spack.phase_callbacks.run_before("autoreconf")
def delete_configure_to_force_update(self) -> None:
@@ -836,7 +838,7 @@ def remove_libtool_archives(self) -> None:
libtool_files = fs.find(str(self.pkg.prefix), "*.la", recursive=True)
with fs.safe_remove(*libtool_files):
fs.mkdirp(os.path.dirname(self._removed_la_files_log))
with open(self._removed_la_files_log, mode="w") as f:
with open(self._removed_la_files_log, mode="w", encoding="utf-8") as f:
f.write("\n".join(libtool_files))
def setup_build_environment(self, env):

View File

@@ -192,7 +192,10 @@ def initconfig_mpi_entries(self):
entries.append(cmake_cache_path("MPI_C_COMPILER", spec["mpi"].mpicc))
entries.append(cmake_cache_path("MPI_CXX_COMPILER", spec["mpi"].mpicxx))
entries.append(cmake_cache_path("MPI_Fortran_COMPILER", spec["mpi"].mpifc))
# not all MPIs have Fortran wrappers
if hasattr(spec["mpi"], "mpifc"):
entries.append(cmake_cache_path("MPI_Fortran_COMPILER", spec["mpi"].mpifc))
# Check for slurm
using_slurm = False
@@ -321,7 +324,7 @@ def initconfig(self, pkg, spec, prefix):
+ self.initconfig_package_entries()
)
with open(self.cache_name, "w") as f:
with open(self.cache_name, "w", encoding="utf-8") as f:
for entry in cache_entries:
f.write("%s\n" % entry)
f.write("\n")

View File

@@ -9,7 +9,7 @@
import re
import sys
from itertools import chain
from typing import Any, List, Optional, Set, Tuple
from typing import Any, List, Optional, Tuple
import llnl.util.filesystem as fs
from llnl.util.lang import stable_partition
@@ -21,6 +21,7 @@
import spack.phase_callbacks
import spack.spec
import spack.util.prefix
from spack import traverse
from spack.directives import build_system, conflicts, depends_on, variant
from spack.multimethod import when
from spack.util.environment import filter_system_paths
@@ -166,15 +167,18 @@ def _values(x):
def get_cmake_prefix_path(pkg: spack.package_base.PackageBase) -> List[str]:
"""Obtain the CMAKE_PREFIX_PATH entries for a package, based on the cmake_prefix_path package
attribute of direct build/test and transitive link dependencies."""
# Add direct build/test deps
selected: Set[str] = {s.dag_hash() for s in pkg.spec.dependencies(deptype=dt.BUILD | dt.TEST)}
# Add transitive link deps
selected.update(s.dag_hash() for s in pkg.spec.traverse(root=False, deptype=dt.LINK))
# Separate out externals so they do not shadow Spack prefixes
externals, spack_built = stable_partition(
(s for s in pkg.spec.traverse(root=False, order="topo") if s.dag_hash() in selected),
lambda x: x.external,
edges = traverse.traverse_topo_edges_generator(
traverse.with_artificial_edges([pkg.spec]),
visitor=traverse.MixedDepthVisitor(
direct=dt.BUILD | dt.TEST, transitive=dt.LINK, key=traverse.by_dag_hash
),
key=traverse.by_dag_hash,
root=False,
all_edges=False, # cover all nodes, not all edges
)
ordered_specs = [edge.spec for edge in edges]
# Separate out externals so they do not shadow Spack prefixes
externals, spack_built = stable_partition((s for s in ordered_specs), lambda x: x.external)
return filter_system_paths(
path for spec in chain(spack_built, externals) for path in spec.package.cmake_prefix_paths

View File

@@ -1153,7 +1153,7 @@ def _determine_license_type(self):
# The file will have been created upon self.license_required AND
# self.license_files having been populated, so the "if" is usually
# true by the time the present function runs; ../hooks/licensing.py
with open(f) as fh:
with open(f, encoding="utf-8") as fh:
if re.search(r"^[ \t]*[^" + self.license_comment + "\n]", fh.read(), re.MULTILINE):
license_type = {
"ACTIVATION_TYPE": "license_file",
@@ -1185,7 +1185,7 @@ def configure(self):
# our configuration accordingly. We can do this because the tokens are
# quite long and specific.
validator_code = open("pset/check.awk", "r").read()
validator_code = open("pset/check.awk", "r", encoding="utf-8").read()
# Let's go a little further and distill the tokens (plus some noise).
tokenlike_words = set(re.findall(r"[A-Z_]{4,}", validator_code))
@@ -1222,7 +1222,7 @@ def configure(self):
config_draft.update(self._determine_license_type)
# Write sorted *by token* so the file looks less like a hash dump.
f = open("silent.cfg", "w")
f = open("silent.cfg", "w", encoding="utf-8")
for token, value in sorted(config_draft.items()):
if token in tokenlike_words:
f.write("%s=%s\n" % (token, value))
@@ -1273,7 +1273,7 @@ def configure_rpath(self):
raise InstallError("Cannot find compiler command to configure rpath:\n\t" + f)
compiler_cfg = os.path.abspath(f + ".cfg")
with open(compiler_cfg, "w") as fh:
with open(compiler_cfg, "w", encoding="utf-8") as fh:
fh.write("-Xlinker -rpath={0}\n".format(compilers_lib_dir))
@spack.phase_callbacks.run_after("install")
@@ -1297,7 +1297,7 @@ def configure_auto_dispatch(self):
ad.append(x)
compiler_cfg = os.path.abspath(f + ".cfg")
with open(compiler_cfg, "a") as fh:
with open(compiler_cfg, "a", encoding="utf-8") as fh:
fh.write("-ax{0}\n".format(",".join(ad)))
@spack.phase_callbacks.run_after("install")

View File

@@ -75,7 +75,7 @@ def generate_luarocks_config(self, pkg, spec, prefix):
table_entries.append(self._generate_tree_line(d.name, d.prefix))
path = self._luarocks_config_path()
with open(path, "w") as config:
with open(path, "w", encoding="utf-8") as config:
config.write(
"""
deps_mode="all"

View File

@@ -255,7 +255,7 @@ def libs(self):
return find_libraries("*", root=self.component_prefix.lib, recursive=not self.v2_layout)
class IntelOneApiLibraryPackageWithSdk(IntelOneApiPackage):
class IntelOneApiLibraryPackageWithSdk(IntelOneApiLibraryPackage):
"""Base class for Intel oneAPI library packages with SDK components.
Contains some convenient default implementations for libraries

View File

@@ -8,12 +8,7 @@
import functools
from typing import Dict, List, Optional, Tuple, Type
import llnl.util.tty as tty
import llnl.util.tty.log as log
import spack.config
import spack.error
import spack.install_test
import spack.multimethod
import spack.package_base
import spack.phase_callbacks
@@ -127,16 +122,11 @@ def __init__(self, wrapped_pkg_object, root_builder):
new_cls_name,
bases,
{
# boolean to indicate whether install-time tests are run
"run_tests": property(lambda x: x.wrapped_package_object.run_tests),
# boolean to indicate whether the package's stand-alone tests
# require a compiler
"test_requires_compiler": property(
lambda x: x.wrapped_package_object.test_requires_compiler
),
# TestSuite instance the spec is a part of
"test_suite": property(lambda x: x.wrapped_package_object.test_suite),
# PackageTest instance to manage the spec's testing
"tester": property(lambda x: x.wrapped_package_object.tester),
},
)
@@ -491,7 +481,7 @@ def __str__(self):
class Builder(BaseBuilder, collections.abc.Sequence):
"""A builder is a class that, given a package object (i.e. associated with concrete spec),
knows how to install it and perform install-time checks.
knows how to install it.
The builder behaves like a sequence, and when iterated over return the "phases" of the
installation in the correct order.
@@ -528,52 +518,3 @@ def __getitem__(self, idx):
def __len__(self):
return len(self.phases)
def phase_tests(self, phase_name: str, method_names: List[str]):
"""Execute the package's phase-time tests.
This process uses the same test setup and logging used for
stand-alone tests for consistency.
Args:
phase_name: the name of the build-time phase (e.g., ``build``, ``install``)
method_names: phase-specific callback method names
"""
verbose = tty.is_verbose()
fail_fast = spack.config.get("config:fail_fast", False)
tester = self.pkg.tester
with tester.test_logger(verbose=verbose, externals=False) as logger:
# Report running each of the methods in the build log
log.print_message(logger, f"Running {phase_name}-time tests", verbose)
tester.set_current_specs(self.pkg.spec, self.pkg.spec)
have_tests = any(name.startswith("test_") for name in method_names)
if have_tests:
spack.install_test.copy_test_files(self.pkg, self.pkg.spec)
for name in method_names:
try:
# Prefer the method in the package over the builder's.
# We need this primarily to pick up arbitrarily named test
# methods but also some build-time checks.
fn = getattr(self.pkg, name, getattr(self, name))
msg = f"RUN-TESTS: {phase_name}-time tests [{name}]"
log.print_message(logger, msg, verbose)
fn()
except AttributeError as e:
msg = f"RUN-TESTS: method not implemented [{name}]"
log.print_message(logger, msg, verbose)
tester.add_failure(e, msg)
if fail_fast:
break
if have_tests:
log.print_message(logger, "Completed testing", verbose)
# Raise exception if any failures encountered
tester.handle_failures()

File diff suppressed because it is too large

View File

@@ -0,0 +1,41 @@
# Spack CI generators
This document describes how the ci module can be extended to provide novel
ci generators. The module currently has only a single generator for gitlab.
The unit tests for the ci module define a small custom generator for testing
purposes as well.
The process of generating a pipeline involves creating a ci-enabled spack
environment, activating it, and running `spack ci generate`, possibly with
arguments describing things like where the output should be written.
Internally pipeline generation is broken into two components: general and
ci platform specific.
## General pipeline functionality
General pipeline functionality includes building a pipeline graph (really,
a forest), pruning it in a variety of ways, and gathering attributes for all
the generated spec build jobs from the spack configuration.
All of the above functionality is defined in the `__init__.py` of the top-level
ci module, and should be roughly the same for pipelines generated for any
platform.
## CI platform specific functionality
Functionality specific to CI platforms (e.g. gitlab, gha, etc.) should be
defined in a dedicated module. In order to define a generator for a new
platform, there are only a few requirements:
1. add a file under `ci` in which you define a generator method decorated with
the `@generator` decorator (a minimal sketch follows this list).
1. import it from `lib/spack/spack/ci/__init__.py`, so that your new generator
is registered.
1. the generator method must take as arguments PipelineDag, SpackCIConfig,
and PipelineOptions objects, in that order.
1. the generator method must produce an output file containing the
generated pipeline.
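
As a concrete illustration of these requirements, here is a minimal, hypothetical sketch of a generator module. The target name (`jsonlines`), module path, and output format are invented for illustration; only the `@generator` registration pattern, the argument order, and the `PipelineDag`/`SpackCIConfig`/`PipelineOptions` methods it calls come from this README and the ci module.

```python
# Hypothetical module: lib/spack/spack/ci/jsonlines.py (name and format are
# illustrative only; the registration pattern and argument order are real).
import json

from .generator_registry import generator


@generator("jsonlines")
def generate_jsonlines_pipeline(pipeline, spack_ci, options):
    """Write one JSON line per spec job, dependencies before dependents."""
    spack_ci_ir = spack_ci.generate_ir()  # per-job attributes from the spack config
    output_file = options.output_file or "pipeline.jsonl"
    with open(output_file, "w", encoding="utf-8") as f:
        # direction="parents" yields nodes topologically ordered from the leaves,
        # so each job's dependencies are written before the job itself.
        for _depth, node in pipeline.traverse_nodes(direction="parents"):
            needs = [dep.spec.name for dep in pipeline.get_dependencies(node)]
            attributes = spack_ci_ir["jobs"][node.spec.dag_hash()]["attributes"]
            record = {"spec": str(node.spec), "needs": needs, "attributes": attributes}
            f.write(json.dumps(record) + "\n")
```

Once such a module is imported from `lib/spack/spack/ci/__init__.py`, the generator is registered under its name and can be selected through the ci configuration's `target` setting.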

File diff suppressed because it is too large

View File

@@ -0,0 +1,825 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import codecs
import copy
import json
import os
import re
import ssl
import sys
import time
from collections import deque
from enum import Enum
from typing import Dict, Generator, List, Optional, Set, Tuple
from urllib.parse import quote, urlencode, urlparse
from urllib.request import HTTPHandler, HTTPSHandler, Request, build_opener
import llnl.util.filesystem as fs
import llnl.util.tty as tty
from llnl.util.lang import Singleton, memoized
import spack.binary_distribution as bindist
import spack.config as cfg
import spack.deptypes as dt
import spack.environment as ev
import spack.error
import spack.mirrors.mirror
import spack.schema
import spack.spec
import spack.util.spack_yaml as syaml
import spack.util.url as url_util
import spack.util.web as web_util
from spack import traverse
from spack.reporters import CDash, CDashConfiguration
from spack.reporters.cdash import SPACK_CDASH_TIMEOUT
from spack.reporters.cdash import build_stamp as cdash_build_stamp
def _urlopen():
error_handler = web_util.SpackHTTPDefaultErrorHandler()
# One opener with HTTPS ssl enabled
with_ssl = build_opener(
HTTPHandler(), HTTPSHandler(context=web_util.ssl_create_default_context()), error_handler
)
# One opener with HTTPS ssl disabled
without_ssl = build_opener(
HTTPHandler(), HTTPSHandler(context=ssl._create_unverified_context()), error_handler
)
# And dynamically dispatch based on the config:verify_ssl.
def dispatch_open(fullurl, data=None, timeout=None, verify_ssl=True):
opener = with_ssl if verify_ssl else without_ssl
timeout = timeout or cfg.get("config:connect_timeout", 1)
return opener.open(fullurl, data, timeout)
return dispatch_open
IS_WINDOWS = sys.platform == "win32"
SPACK_RESERVED_TAGS = ["public", "protected", "notary"]
_dyn_mapping_urlopener = Singleton(_urlopen)
def copy_files_to_artifacts(src, artifacts_dir):
"""
Copy file(s) to the given artifacts directory
Parameters:
src (str): the glob-friendly path expression for the file(s) to copy
artifacts_dir (str): the destination directory
"""
try:
fs.copy(src, artifacts_dir)
except Exception as err:
msg = (
f"Unable to copy files ({src}) to artifacts {artifacts_dir} due to "
f"exception: {str(err)}"
)
tty.warn(msg)
def win_quote(quote_str: str) -> str:
if IS_WINDOWS:
quote_str = f'"{quote_str}"'
return quote_str
def _spec_matches(spec, match_string):
return spec.intersects(match_string)
def _noop(x):
return x
def unpack_script(script_section, op=_noop):
script = []
for cmd in script_section:
if isinstance(cmd, list):
for subcmd in cmd:
script.append(op(subcmd))
else:
script.append(op(cmd))
return script
def ensure_expected_target_path(path: str) -> str:
"""Returns passed paths with all Windows path separators exchanged
for posix separators
TODO (johnwparent): Refactor config + cli read/write to deal only in posix style paths
"""
if path:
return path.replace("\\", "/")
return path
def update_env_scopes(
env: ev.Environment,
cli_scopes: List[str],
output_file: str,
transform_windows_paths: bool = False,
) -> None:
"""Add any config scopes from cli_scopes which aren't already included in the
environment, by reading the yaml, adding the missing includes, and writing the
updated yaml back to the same location.
"""
with open(env.manifest_path, "r", encoding="utf-8") as env_fd:
env_yaml_root = syaml.load(env_fd)
# Add config scopes to environment
env_includes = env_yaml_root["spack"].get("include", [])
include_scopes: List[str] = []
for scope in cli_scopes:
if scope not in include_scopes and scope not in env_includes:
include_scopes.insert(0, scope)
env_includes.extend(include_scopes)
env_yaml_root["spack"]["include"] = [
ensure_expected_target_path(i) if transform_windows_paths else i for i in env_includes
]
with open(output_file, "w", encoding="utf-8") as fd:
syaml.dump_config(env_yaml_root, fd, default_flow_style=False)
def write_pipeline_manifest(specs, src_prefix, dest_prefix, output_file):
"""Write out the file describing specs that should be copied"""
buildcache_copies = {}
for release_spec in specs:
release_spec_dag_hash = release_spec.dag_hash()
# TODO: This assumes signed version of the spec
buildcache_copies[release_spec_dag_hash] = [
{
"src": url_util.join(
src_prefix,
bindist.build_cache_relative_path(),
bindist.tarball_name(release_spec, ".spec.json.sig"),
),
"dest": url_util.join(
dest_prefix,
bindist.build_cache_relative_path(),
bindist.tarball_name(release_spec, ".spec.json.sig"),
),
},
{
"src": url_util.join(
src_prefix,
bindist.build_cache_relative_path(),
bindist.tarball_path_name(release_spec, ".spack"),
),
"dest": url_util.join(
dest_prefix,
bindist.build_cache_relative_path(),
bindist.tarball_path_name(release_spec, ".spack"),
),
},
]
target_dir = os.path.dirname(output_file)
if not os.path.exists(target_dir):
os.makedirs(target_dir)
with open(output_file, "w", encoding="utf-8") as fd:
fd.write(json.dumps(buildcache_copies))
class CDashHandler:
"""
Class for managing CDash data and processing.
"""
def __init__(self, ci_cdash):
# start with the gitlab ci configuration
self.url = ci_cdash.get("url")
self.build_group = ci_cdash.get("build-group")
self.project = ci_cdash.get("project")
self.site = ci_cdash.get("site")
# grab the authorization token when available
self.auth_token = os.environ.get("SPACK_CDASH_AUTH_TOKEN")
if self.auth_token:
tty.verbose("Using CDash auth token from environment")
# append runner description to the site if available
runner = os.environ.get("CI_RUNNER_DESCRIPTION")
if runner:
self.site += f" ({runner})"
def args(self):
return [
"--cdash-upload-url",
win_quote(self.upload_url),
"--cdash-build",
win_quote(self.build_name),
"--cdash-site",
win_quote(self.site),
"--cdash-buildstamp",
win_quote(self.build_stamp),
]
def build_name(self, spec: Optional[spack.spec.Spec] = None) -> Optional[str]:
"""Returns the CDash build name.
A name will be generated if the `spec` is provided,
otherwise, the value will be retrieved from the environment
through the `SPACK_CDASH_BUILD_NAME` variable.
Returns: (str) given spec's CDash build name."""
if spec:
build_name = f"{spec.name}@{spec.version}%{spec.compiler} \
hash={spec.dag_hash()} arch={spec.architecture} ({self.build_group})"
tty.debug(f"Generated CDash build name ({build_name}) from the {spec.name}")
return build_name
env_build_name = os.environ.get("SPACK_CDASH_BUILD_NAME")
tty.debug(f"Using CDash build name ({env_build_name}) from the environment")
return env_build_name
@property # type: ignore
def build_stamp(self):
"""Returns the CDash build stamp.
The one defined by SPACK_CDASH_BUILD_STAMP environment variable
is preferred due to the representation of timestamps; otherwise,
one will be built.
Returns: (str) current CDash build stamp"""
build_stamp = os.environ.get("SPACK_CDASH_BUILD_STAMP")
if build_stamp:
tty.debug(f"Using build stamp ({build_stamp}) from the environment")
return build_stamp
build_stamp = cdash_build_stamp(self.build_group, time.time())
tty.debug(f"Generated new build stamp ({build_stamp})")
return build_stamp
@property # type: ignore
@memoized
def project_enc(self):
tty.debug(f"Encoding project ({type(self.project)}): {self.project})")
encode = urlencode({"project": self.project})
index = encode.find("=") + 1
return encode[index:]
@property
def upload_url(self):
url_format = f"{self.url}/submit.php?project={self.project_enc}"
return url_format
def copy_test_results(self, source, dest):
"""Copy test results to artifacts directory."""
reports = fs.join_path(source, "*_Test*.xml")
copy_files_to_artifacts(reports, dest)
def create_buildgroup(self, opener, headers, url, group_name, group_type):
data = {"newbuildgroup": group_name, "project": self.project, "type": group_type}
enc_data = json.dumps(data).encode("utf-8")
request = Request(url, data=enc_data, headers=headers)
response = opener.open(request, timeout=SPACK_CDASH_TIMEOUT)
response_code = response.getcode()
if response_code not in [200, 201]:
msg = f"Creating buildgroup failed (response code = {response_code})"
tty.warn(msg)
return None
response_text = response.read()
response_json = json.loads(response_text)
build_group_id = response_json["id"]
return build_group_id
def populate_buildgroup(self, job_names):
url = f"{self.url}/api/v1/buildgroup.php"
headers = {
"Authorization": f"Bearer {self.auth_token}",
"Content-Type": "application/json",
}
opener = build_opener(HTTPHandler)
parent_group_id = self.create_buildgroup(opener, headers, url, self.build_group, "Daily")
group_id = self.create_buildgroup(
opener, headers, url, f"Latest {self.build_group}", "Latest"
)
if not parent_group_id or not group_id:
msg = f"Failed to create or retrieve buildgroups for {self.build_group}"
tty.warn(msg)
return
data = {
"dynamiclist": [
{"match": name, "parentgroupid": parent_group_id, "site": self.site}
for name in job_names
]
}
enc_data = json.dumps(data).encode("utf-8")
request = Request(url, data=enc_data, headers=headers)
request.get_method = lambda: "PUT"
response = opener.open(request, timeout=SPACK_CDASH_TIMEOUT)
response_code = response.getcode()
if response_code != 200:
msg = f"Error response code ({response_code}) in populate_buildgroup"
tty.warn(msg)
def report_skipped(self, spec: spack.spec.Spec, report_dir: str, reason: Optional[str]):
"""Explicitly report skipping testing of a spec (e.g., it's CI
configuration identifies it as known to have broken tests or
the CI installation failed).
Args:
spec: spec being tested
report_dir: directory where the report will be written
reason: reason the test is being skipped
"""
configuration = CDashConfiguration(
upload_url=self.upload_url,
packages=[spec.name],
build=self.build_name,
site=self.site,
buildstamp=self.build_stamp,
track=None,
)
reporter = CDash(configuration=configuration)
reporter.test_skipped_report(report_dir, spec, reason)
class PipelineType(Enum):
COPY_ONLY = 1
spack_copy_only = 1
PROTECTED_BRANCH = 2
spack_protected_branch = 2
PULL_REQUEST = 3
spack_pull_request = 3
class PipelineOptions:
"""A container for all pipeline options that can be specified (whether
via cli, config/yaml, or environment variables)"""
def __init__(
self,
env: ev.Environment,
buildcache_destination: spack.mirrors.mirror.Mirror,
artifacts_root: str = "jobs_scratch_dir",
print_summary: bool = True,
output_file: Optional[str] = None,
check_index_only: bool = False,
broken_specs_url: Optional[str] = None,
rebuild_index: bool = True,
untouched_pruning_dependent_depth: Optional[int] = None,
prune_untouched: bool = False,
prune_up_to_date: bool = True,
prune_external: bool = True,
stack_name: Optional[str] = None,
pipeline_type: Optional[PipelineType] = None,
require_signing: bool = False,
cdash_handler: Optional["CDashHandler"] = None,
):
"""
Args:
env: Active spack environment
buildcache_destination: The mirror where built binaries should be pushed
artifacts_root: Path to location where artifacts should be stored
print_summary: Print a summary of the scheduled pipeline
output_file: Path where output file should be written
check_index_only: Only fetch the index or fetch all spec files
broken_specs_url: URL where broken specs (on develop) should be reported
rebuild_index: Generate a job to rebuild mirror index after rebuilds
untouched_pruning_dependent_depth: How many parents to traverse from changed pkg specs
prune_untouched: Prune jobs for specs that were unchanged in git history
prune_up_to_date: Prune specs from pipeline if binary exists on the mirror
prune_external: Prune specs from pipeline if they are external
stack_name: Name of spack stack
pipeline_type: Type of pipeline running (optional)
require_signing: Require buildcache to be signed (fail w/out signing key)
cdash_handler: Object for communicating build information with CDash
"""
self.env = env
self.buildcache_destination = buildcache_destination
self.artifacts_root = artifacts_root
self.print_summary = print_summary
self.output_file = output_file
self.check_index_only = check_index_only
self.broken_specs_url = broken_specs_url
self.rebuild_index = rebuild_index
self.untouched_pruning_dependent_depth = untouched_pruning_dependent_depth
self.prune_untouched = prune_untouched
self.prune_up_to_date = prune_up_to_date
self.prune_external = prune_external
self.stack_name = stack_name
self.pipeline_type = pipeline_type
self.require_signing = require_signing
self.cdash_handler = cdash_handler
class PipelineNode:
spec: spack.spec.Spec
parents: Set[str]
children: Set[str]
def __init__(self, spec: spack.spec.Spec):
self.spec = spec
self.parents = set()
self.children = set()
@property
def key(self):
"""Return key of the stored spec"""
return PipelineDag.key(self.spec)
class PipelineDag:
"""Turn a list of specs into a simple directed graph, that doesn't keep track
of edge types."""
@classmethod
def key(cls, spec: spack.spec.Spec) -> str:
return spec.dag_hash()
def __init__(self, specs: List[spack.spec.Spec]) -> None:
# Build dictionary of nodes
self.nodes: Dict[str, PipelineNode] = {
PipelineDag.key(s): PipelineNode(s)
for s in traverse.traverse_nodes(specs, deptype=dt.ALL_TYPES, root=True)
}
# Create edges
for edge in traverse.traverse_edges(
specs, deptype=dt.ALL_TYPES, root=False, cover="edges"
):
parent_key = PipelineDag.key(edge.parent)
child_key = PipelineDag.key(edge.spec)
self.nodes[parent_key].children.add(child_key)
self.nodes[child_key].parents.add(parent_key)
def prune(self, node_key: str):
"""Remove a node from the graph, and reconnect its parents and children"""
node = self.nodes[node_key]
for parent in node.parents:
self.nodes[parent].children.remove(node_key)
self.nodes[parent].children |= node.children
for child in node.children:
self.nodes[child].parents.remove(node_key)
self.nodes[child].parents |= node.parents
del self.nodes[node_key]
def traverse_nodes(
self, direction: str = "children"
) -> Generator[Tuple[int, PipelineNode], None, None]:
"""Yields (depth, node) from the pipeline graph. Traversal is topologically
ordered from the roots if ``direction`` is ``children``, or from the leaves
if ``direction`` is ``parents``. The yielded depth is the length of the
longest path from the starting point to the yielded node."""
if direction == "children":
get_in_edges = lambda node: node.parents
get_out_edges = lambda node: node.children
else:
get_in_edges = lambda node: node.children
get_out_edges = lambda node: node.parents
sort_key = lambda k: self.nodes[k].spec.name
out_edges = {k: sorted(get_out_edges(n), key=sort_key) for k, n in self.nodes.items()}
num_in_edges = {k: len(get_in_edges(n)) for k, n in self.nodes.items()}
# Populate a queue with all the nodes that have no incoming edges
nodes = deque(
sorted(
[(0, key) for key in self.nodes.keys() if num_in_edges[key] == 0],
key=lambda item: item[1],
)
)
while nodes:
# Remove the next node, n, from the queue and yield it
depth, n_key = nodes.pop()
yield (depth, self.nodes[n_key])
# Remove an in-edge from every node, m, pointed to by an
# out-edge from n. If any of those nodes are left with
# 0 remaining in-edges, add them to the queue.
for m in out_edges[n_key]:
num_in_edges[m] -= 1
if num_in_edges[m] == 0:
nodes.appendleft((depth + 1, m))
def get_dependencies(self, node: PipelineNode) -> List[PipelineNode]:
"""Returns a list of nodes corresponding to the direct dependencies
of the given node."""
return [self.nodes[k] for k in node.children]
class SpackCIConfig:
"""Spack CI object used to generate intermediate representation
used by the CI generator(s).
"""
def __init__(self, ci_config):
"""Given the information from the ci section of the config
and the staged jobs, set up the metadata needed for generating Spack
CI IR.
"""
self.ci_config = ci_config
self.named_jobs = ["any", "build", "copy", "cleanup", "noop", "reindex", "signing"]
self.ir = {
"jobs": {},
"rebuild-index": self.ci_config.get("rebuild-index", True),
"broken-specs-url": self.ci_config.get("broken-specs-url", None),
"broken-tests-packages": self.ci_config.get("broken-tests-packages", []),
"target": self.ci_config.get("target", "gitlab"),
}
jobs = self.ir["jobs"]
for name in self.named_jobs:
# Skip the special named jobs
if name not in ["any", "build"]:
jobs[name] = self.__init_job("")
def __init_job(self, release_spec):
"""Initialize job object"""
job_object = {"spec": release_spec, "attributes": {}}
if release_spec:
job_vars = job_object["attributes"].setdefault("variables", {})
job_vars["SPACK_JOB_SPEC_DAG_HASH"] = release_spec.dag_hash()
job_vars["SPACK_JOB_SPEC_PKG_NAME"] = release_spec.name
job_vars["SPACK_JOB_SPEC_PKG_VERSION"] = release_spec.format("{version}")
job_vars["SPACK_JOB_SPEC_COMPILER_NAME"] = release_spec.format("{compiler.name}")
job_vars["SPACK_JOB_SPEC_COMPILER_VERSION"] = release_spec.format("{compiler.version}")
job_vars["SPACK_JOB_SPEC_ARCH"] = release_spec.format("{architecture}")
job_vars["SPACK_JOB_SPEC_VARIANTS"] = release_spec.format("{variants}")
return job_object
def __is_named(self, section):
"""Check if a pipeline-gen configuration section is for a named job,
and if so return the name, otherwise return None.
"""
for _name in self.named_jobs:
keys = [f"{_name}-job", f"{_name}-job-remove"]
if any([key for key in keys if key in section]):
return _name
return None
@staticmethod
def __job_name(name, suffix=""):
"""Compute the name of a named job with appropriate suffix.
Valid suffixes are either '-remove' or empty string or None
"""
assert isinstance(name, str)
jname = name
if suffix:
jname = f"{name}-job{suffix}"
else:
jname = f"{name}-job"
return jname
def __apply_submapping(self, dest, spec, section):
"""Apply submapping setion to the IR dict"""
matched = False
only_first = section.get("match_behavior", "first") == "first"
for match_attrs in reversed(section["submapping"]):
attrs = cfg.InternalConfigScope._process_dict_keyname_overrides(match_attrs)
for match_string in match_attrs["match"]:
if _spec_matches(spec, match_string):
matched = True
if "build-job-remove" in match_attrs:
spack.config.remove_yaml(dest, attrs["build-job-remove"])
if "build-job" in match_attrs:
spack.schema.merge_yaml(dest, attrs["build-job"])
break
if matched and only_first:
break
return dest
# Create jobs for all the pipeline specs
def init_pipeline_jobs(self, pipeline: PipelineDag):
for _, node in pipeline.traverse_nodes():
dag_hash = node.spec.dag_hash()
self.ir["jobs"][dag_hash] = self.__init_job(node.spec)
# Generate IR from the configs
def generate_ir(self):
"""Generate the IR from the Spack CI configurations."""
jobs = self.ir["jobs"]
# Implicit job defaults
defaults = [
{
"build-job": {
"script": [
"cd {env_dir}",
"spack env activate --without-view .",
"spack ci rebuild",
]
}
},
{"noop-job": {"script": ['echo "All specs already up to date, nothing to rebuild."']}},
]
# Job overrides
overrides = [
# Reindex script
{
"reindex-job": {
"script:": ["spack buildcache update-index --keys {index_target_mirror}"]
}
},
# Cleanup script
{
"cleanup-job": {
"script:": ["spack -d mirror destroy {mirror_prefix}/$CI_PIPELINE_ID"]
}
},
# Add signing job tags
{"signing-job": {"tags": ["aws", "protected", "notary"]}},
# Remove reserved tags
{"any-job-remove": {"tags": SPACK_RESERVED_TAGS}},
]
pipeline_gen = overrides + self.ci_config.get("pipeline-gen", []) + defaults
for section in reversed(pipeline_gen):
name = self.__is_named(section)
has_submapping = "submapping" in section
has_dynmapping = "dynamic-mapping" in section
section = cfg.InternalConfigScope._process_dict_keyname_overrides(section)
if name:
remove_job_name = self.__job_name(name, suffix="-remove")
merge_job_name = self.__job_name(name)
do_remove = remove_job_name in section
do_merge = merge_job_name in section
def _apply_section(dest, src):
if do_remove:
dest = spack.config.remove_yaml(dest, src[remove_job_name])
if do_merge:
dest = copy.copy(spack.schema.merge_yaml(dest, src[merge_job_name]))
if name == "build":
# Apply attributes to all build jobs
for _, job in jobs.items():
if job["spec"]:
_apply_section(job["attributes"], section)
elif name == "any":
# Apply section attributes to all jobs
for _, job in jobs.items():
_apply_section(job["attributes"], section)
else:
# Create a signing job if there is script and the job hasn't
# been initialized yet
if name == "signing" and name not in jobs:
if "signing-job" in section:
if "script" not in section["signing-job"]:
continue
else:
jobs[name] = self.__init_job("")
# Apply attributes to named job
_apply_section(jobs[name]["attributes"], section)
elif has_submapping:
# Apply section jobs with specs to match
for _, job in jobs.items():
if job["spec"]:
job["attributes"] = self.__apply_submapping(
job["attributes"], job["spec"], section
)
elif has_dynmapping:
mapping = section["dynamic-mapping"]
dynmap_name = mapping.get("name")
# Check if this section should be skipped
dynmap_skip = os.environ.get("SPACK_CI_SKIP_DYNAMIC_MAPPING")
if dynmap_name and dynmap_skip:
if re.match(dynmap_skip, dynmap_name):
continue
# Get the endpoint
endpoint = mapping["endpoint"]
endpoint_url = urlparse(endpoint)
# Configure the request header
header = {"User-Agent": web_util.SPACK_USER_AGENT}
header.update(mapping.get("header", {}))
# Expand header environment variables
# i.e. if tokens are passed
for value in header.values():
value = os.path.expandvars(value)
verify_ssl = mapping.get("verify_ssl", spack.config.get("config:verify_ssl", True))
timeout = mapping.get("timeout", spack.config.get("config:connect_timeout", 1))
required = mapping.get("require", [])
allowed = mapping.get("allow", [])
ignored = mapping.get("ignore", [])
# required keys are implicitly allowed
allowed = sorted(set(allowed + required))
ignored = sorted(set(ignored))
required = sorted(set(required))
# Make sure required things are not also ignored
assert not any([ikey in required for ikey in ignored])
def job_query(job):
job_vars = job["attributes"]["variables"]
query = (
"{SPACK_JOB_SPEC_PKG_NAME}@{SPACK_JOB_SPEC_PKG_VERSION}"
# The preceding spaces are required (ref. https://github.com/spack/spack-gantry/blob/develop/docs/api.md#allocation)
" {SPACK_JOB_SPEC_VARIANTS}"
" arch={SPACK_JOB_SPEC_ARCH}"
"%{SPACK_JOB_SPEC_COMPILER_NAME}@{SPACK_JOB_SPEC_COMPILER_VERSION}"
).format_map(job_vars)
return f"spec={quote(query)}"
for job in jobs.values():
if not job["spec"]:
continue
# Create request for this job
query = job_query(job)
request = Request(
endpoint_url._replace(query=query).geturl(), headers=header, method="GET"
)
try:
response = _dyn_mapping_urlopener(
request, verify_ssl=verify_ssl, timeout=timeout
)
except Exception as e:
# For now just ignore any errors from dynamic mapping and continue
# This is still experimental, and failures should not stop CI
# from running normally
tty.warn(f"Failed to fetch dynamic mapping for query:\n\t{query}")
tty.warn(f"{e}")
continue
config = json.load(codecs.getreader("utf-8")(response))
# Strip ignore keys
if ignored:
for key in ignored:
if key in config:
config.pop(key)
# Only keep allowed keys
clean_config = {}
if allowed:
for key in allowed:
if key in config:
clean_config[key] = config[key]
else:
clean_config = config
# Verify all of the required keys are present
if required:
missing_keys = []
for key in required:
if key not in clean_config.keys():
missing_keys.append(key)
if missing_keys:
tty.warn(f"Response missing required keys: {missing_keys}")
if clean_config:
job["attributes"] = spack.schema.merge_yaml(
job.get("attributes", {}), clean_config
)
for _, job in jobs.items():
if job["spec"]:
job["spec"] = job["spec"].name
return self.ir
class SpackCIError(spack.error.SpackError):
def __init__(self, msg):
super().__init__(msg)

View File

@@ -0,0 +1,36 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
# Holds all known formatters
"""Generators that support writing out pipelines for various CI platforms,
using a common pipeline graph definition.
"""
import spack.error
_generators = {}
def generator(name):
"""Decorator to register a pipeline generator method.
A generator method should take PipelineDag, SpackCIConfig, and
PipelineOptions arguments, and should produce a pipeline file.
"""
def _decorator(generate_method):
_generators[name] = generate_method
return generate_method
return _decorator
def get_generator(name):
try:
return _generators[name]
except KeyError:
raise UnknownGeneratorException(name)
class UnknownGeneratorException(spack.error.SpackError):
def __init__(self, generator_name):
super().__init__(f"No registered generator for {generator_name}")

View File

@@ -0,0 +1,416 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import copy
import os
import shutil
from typing import List, Optional
import ruamel.yaml
import llnl.util.tty as tty
import spack
import spack.binary_distribution as bindist
import spack.config as cfg
import spack.mirrors.mirror
import spack.schema
import spack.spec
import spack.util.spack_yaml as syaml
from .common import (
SPACK_RESERVED_TAGS,
PipelineDag,
PipelineOptions,
PipelineType,
SpackCIConfig,
SpackCIError,
ensure_expected_target_path,
unpack_script,
update_env_scopes,
write_pipeline_manifest,
)
from .generator_registry import generator
# See https://docs.gitlab.com/ee/ci/yaml/#retry for descriptions of conditions
JOB_RETRY_CONDITIONS = [
# "always",
"unknown_failure",
"script_failure",
"api_failure",
"stuck_or_timeout_failure",
"runner_system_failure",
"runner_unsupported",
"stale_schedule",
# "job_execution_timeout",
"archived_failure",
"unmet_prerequisites",
"scheduler_failure",
"data_integrity_failure",
]
JOB_NAME_FORMAT = "{name}{@version} {/hash}"
def _remove_reserved_tags(tags):
"""Convenience function to strip reserved tags from jobs"""
return [tag for tag in tags if tag not in SPACK_RESERVED_TAGS]
def get_job_name(spec: spack.spec.Spec, build_group: Optional[str] = None) -> str:
"""Given a spec and possibly a build group, return the job name. If the
resulting name is longer than 255 characters, it will be truncated.
Arguments:
spec: Spec job will build
build_group: Name of build group this job belongs to (a CDash notion)
Returns: The job name
"""
job_name = spec.format(JOB_NAME_FORMAT)
if build_group:
job_name = f"{job_name} {build_group}"
return job_name[:255]
def maybe_generate_manifest(pipeline: PipelineDag, options: PipelineOptions, manifest_path):
# TODO: Consider including only hashes of rebuilt specs in the manifest,
# instead of full source and destination urls. Also, consider renaming
# the variable that controls whether or not to write the manifest from
# "SPACK_COPY_BUILDCACHE" to "SPACK_WRITE_PIPELINE_MANIFEST" or similar.
spack_buildcache_copy = os.environ.get("SPACK_COPY_BUILDCACHE", None)
if spack_buildcache_copy:
buildcache_copy_src_prefix = options.buildcache_destination.fetch_url
buildcache_copy_dest_prefix = spack_buildcache_copy
if options.pipeline_type == PipelineType.COPY_ONLY:
manifest_specs = [s for s in options.env.all_specs() if not s.external]
else:
manifest_specs = [n.spec for _, n in pipeline.traverse_nodes(direction="children")]
write_pipeline_manifest(
manifest_specs, buildcache_copy_src_prefix, buildcache_copy_dest_prefix, manifest_path
)
@generator("gitlab")
def generate_gitlab_yaml(pipeline: PipelineDag, spack_ci: SpackCIConfig, options: PipelineOptions):
"""Given a pipeline graph, job attributes, and pipeline options,
write a pipeline that can be consumed by GitLab to the given output file.
Arguments:
pipeline: An already pruned graph of jobs representing all the specs to build
spack_ci: An object containing the configured attributes of all jobs in the pipeline
options: An object containing all the pipeline options gathered from yaml, env, etc...
"""
ci_project_dir = os.environ.get("CI_PROJECT_DIR") or os.getcwd()
generate_job_name = os.environ.get("CI_JOB_NAME", "job-does-not-exist")
generate_pipeline_id = os.environ.get("CI_PIPELINE_ID", "pipeline-does-not-exist")
artifacts_root = options.artifacts_root
if artifacts_root.startswith(ci_project_dir):
artifacts_root = os.path.relpath(artifacts_root, ci_project_dir)
pipeline_artifacts_dir = os.path.join(ci_project_dir, artifacts_root)
output_file = options.output_file
if not output_file:
output_file = os.path.abspath(".gitlab-ci.yml")
else:
output_file_path = os.path.abspath(output_file)
gen_ci_dir = os.path.dirname(output_file_path)
if not os.path.exists(gen_ci_dir):
os.makedirs(gen_ci_dir)
spack_ci_ir = spack_ci.generate_ir()
concrete_env_dir = os.path.join(pipeline_artifacts_dir, "concrete_environment")
# Now that we've added the mirrors we know about, they should be properly
# reflected in the environment manifest file, so copy that into the
# concrete environment directory, along with the spack.lock file.
if not os.path.exists(concrete_env_dir):
os.makedirs(concrete_env_dir)
shutil.copyfile(options.env.manifest_path, os.path.join(concrete_env_dir, "spack.yaml"))
shutil.copyfile(options.env.lock_path, os.path.join(concrete_env_dir, "spack.lock"))
update_env_scopes(
options.env,
[
os.path.relpath(s.path, concrete_env_dir)
for s in cfg.scopes().values()
if not s.writable
and isinstance(s, (cfg.DirectoryConfigScope))
and os.path.exists(s.path)
],
os.path.join(concrete_env_dir, "spack.yaml"),
# Here transforming windows paths is only required in the special case
# of copy_only_pipelines, a unique scenario where the generate job and
# child pipelines are run on different platforms. To make this compatible
# w/ Windows, we cannot write Windows style path separators that will be
# consumed by the Posix copy job runner.
#
# TODO (johnwparent): Refactor config + cli read/write to deal only in
# posix style paths
transform_windows_paths=(options.pipeline_type == PipelineType.COPY_ONLY),
)
job_log_dir = os.path.join(pipeline_artifacts_dir, "logs")
job_repro_dir = os.path.join(pipeline_artifacts_dir, "reproduction")
job_test_dir = os.path.join(pipeline_artifacts_dir, "tests")
user_artifacts_dir = os.path.join(pipeline_artifacts_dir, "user_data")
# We communicate relative paths to the downstream jobs to avoid issues in
# situations where the CI_PROJECT_DIR varies between the pipeline
# generation job and the rebuild jobs. This can happen when gitlab
# checks out the project into a runner-specific directory, for example,
# and different runners are picked for generate and rebuild jobs.
rel_concrete_env_dir = os.path.relpath(concrete_env_dir, ci_project_dir)
rel_job_log_dir = os.path.relpath(job_log_dir, ci_project_dir)
rel_job_repro_dir = os.path.relpath(job_repro_dir, ci_project_dir)
rel_job_test_dir = os.path.relpath(job_test_dir, ci_project_dir)
rel_user_artifacts_dir = os.path.relpath(user_artifacts_dir, ci_project_dir)
def main_script_replacements(cmd):
return cmd.replace("{env_dir}", rel_concrete_env_dir)
output_object = {}
job_id = 0
stage_id = 0
stages: List[List] = []
stage_names = []
max_length_needs = 0
max_needs_job = ""
if not options.pipeline_type == PipelineType.COPY_ONLY:
for level, node in pipeline.traverse_nodes(direction="parents"):
stage_id = level
if len(stages) == stage_id:
stages.append([])
stages[stage_id].append(node.spec)
stage_name = f"stage-{level}"
if stage_name not in stage_names:
stage_names.append(stage_name)
release_spec = node.spec
release_spec_dag_hash = release_spec.dag_hash()
job_object = spack_ci_ir["jobs"][release_spec_dag_hash]["attributes"]
if not job_object:
tty.warn(f"No match found for {release_spec}, skipping it")
continue
if options.pipeline_type is not None:
# For spack pipelines "public" and "protected" are reserved tags
job_object["tags"] = _remove_reserved_tags(job_object.get("tags", []))
if options.pipeline_type == PipelineType.PROTECTED_BRANCH:
job_object["tags"].extend(["protected"])
elif options.pipeline_type == PipelineType.PULL_REQUEST:
job_object["tags"].extend(["public"])
if "script" not in job_object:
raise AttributeError
job_object["script"] = unpack_script(job_object["script"], op=main_script_replacements)
if "before_script" in job_object:
job_object["before_script"] = unpack_script(job_object["before_script"])
if "after_script" in job_object:
job_object["after_script"] = unpack_script(job_object["after_script"])
build_group = options.cdash_handler.build_group if options.cdash_handler else None
job_name = get_job_name(release_spec, build_group)
dep_nodes = pipeline.get_dependencies(node)
job_object["needs"] = [
{"job": get_job_name(dep_node.spec, build_group), "artifacts": False}
for dep_node in dep_nodes
]
job_object["needs"].append(
{"job": generate_job_name, "pipeline": f"{generate_pipeline_id}"}
)
job_vars = job_object["variables"]
# Let downstream jobs know whether the spec needed rebuilding, regardless
# whether DAG pruning was enabled or not.
already_built = bindist.get_mirrors_for_spec(spec=release_spec, index_only=True)
job_vars["SPACK_SPEC_NEEDS_REBUILD"] = "False" if already_built else "True"
if options.cdash_handler:
build_name = options.cdash_handler.build_name(release_spec)
job_vars["SPACK_CDASH_BUILD_NAME"] = build_name
build_stamp = options.cdash_handler.build_stamp
job_vars["SPACK_CDASH_BUILD_STAMP"] = build_stamp
job_object["artifacts"] = spack.schema.merge_yaml(
job_object.get("artifacts", {}),
{
"when": "always",
"paths": [
rel_job_log_dir,
rel_job_repro_dir,
rel_job_test_dir,
rel_user_artifacts_dir,
],
},
)
job_object["stage"] = stage_name
job_object["retry"] = {"max": 2, "when": JOB_RETRY_CONDITIONS}
job_object["interruptible"] = True
length_needs = len(job_object["needs"])
if length_needs > max_length_needs:
max_length_needs = length_needs
max_needs_job = job_name
output_object[job_name] = job_object
job_id += 1
tty.debug(f"{job_id} build jobs generated in {stage_id} stages")
if job_id > 0:
tty.debug(f"The max_needs_job is {max_needs_job}, with {max_length_needs} needs")
service_job_retries = {
"max": 2,
"when": ["runner_system_failure", "stuck_or_timeout_failure", "script_failure"],
}
# In some cases, pipeline generation should write a manifest. Currently
# the only purpose is to specify a list of sources and destinations for
# everything that should be copied.
distinguish_stack = options.stack_name if options.stack_name else "rebuilt"
manifest_path = os.path.join(
pipeline_artifacts_dir, "specs_to_copy", f"copy_{distinguish_stack}_specs.json"
)
maybe_generate_manifest(pipeline, options, manifest_path)
if options.pipeline_type == PipelineType.COPY_ONLY:
stage_names.append("copy")
sync_job = copy.deepcopy(spack_ci_ir["jobs"]["copy"]["attributes"])
sync_job["stage"] = "copy"
sync_job["needs"] = [{"job": generate_job_name, "pipeline": f"{generate_pipeline_id}"}]
if "variables" not in sync_job:
sync_job["variables"] = {}
sync_job["variables"][
"SPACK_COPY_ONLY_DESTINATION"
] = options.buildcache_destination.fetch_url
pipeline_mirrors = spack.mirrors.mirror.MirrorCollection(binary=True)
if "buildcache-source" not in pipeline_mirrors:
raise SpackCIError("Copy-only pipelines require a mirror named 'buildcache-source'")
buildcache_source = pipeline_mirrors["buildcache-source"].fetch_url
sync_job["variables"]["SPACK_BUILDCACHE_SOURCE"] = buildcache_source
sync_job["dependencies"] = []
output_object["copy"] = sync_job
job_id += 1
if job_id > 0:
if (
"script" in spack_ci_ir["jobs"]["signing"]["attributes"]
and options.pipeline_type == PipelineType.PROTECTED_BRANCH
):
# External signing: generate a job to check and sign binary pkgs
stage_names.append("stage-sign-pkgs")
signing_job = spack_ci_ir["jobs"]["signing"]["attributes"]
signing_job["script"] = unpack_script(signing_job["script"])
signing_job["stage"] = "stage-sign-pkgs"
signing_job["when"] = "always"
signing_job["retry"] = {"max": 2, "when": ["always"]}
signing_job["interruptible"] = True
if "variables" not in signing_job:
signing_job["variables"] = {}
signing_job["variables"][
"SPACK_BUILDCACHE_DESTINATION"
] = options.buildcache_destination.push_url
signing_job["dependencies"] = []
output_object["sign-pkgs"] = signing_job
if options.rebuild_index:
# Add a final job to regenerate the index
stage_names.append("stage-rebuild-index")
final_job = spack_ci_ir["jobs"]["reindex"]["attributes"]
final_job["stage"] = "stage-rebuild-index"
target_mirror = options.buildcache_destination.push_url
final_job["script"] = unpack_script(
final_job["script"],
op=lambda cmd: cmd.replace("{index_target_mirror}", target_mirror),
)
final_job["when"] = "always"
final_job["retry"] = service_job_retries
final_job["interruptible"] = True
final_job["dependencies"] = []
output_object["rebuild-index"] = final_job
output_object["stages"] = stage_names
# Capture the version of Spack used to generate the pipeline, that can be
# passed to `git checkout` for version consistency. If we aren't in a Git
# repository, presume we are a Spack release and use the Git tag instead.
spack_version = spack.get_version()
version_to_clone = spack.get_spack_commit() or f"v{spack.spack_version}"
rebuild_everything = not options.prune_up_to_date and not options.prune_untouched
output_object["variables"] = {
"SPACK_ARTIFACTS_ROOT": artifacts_root,
"SPACK_CONCRETE_ENV_DIR": rel_concrete_env_dir,
"SPACK_VERSION": spack_version,
"SPACK_CHECKOUT_VERSION": version_to_clone,
"SPACK_JOB_LOG_DIR": rel_job_log_dir,
"SPACK_JOB_REPRO_DIR": rel_job_repro_dir,
"SPACK_JOB_TEST_DIR": rel_job_test_dir,
"SPACK_PIPELINE_TYPE": options.pipeline_type.name if options.pipeline_type else "None",
"SPACK_CI_STACK_NAME": os.environ.get("SPACK_CI_STACK_NAME", "None"),
"SPACK_REBUILD_CHECK_UP_TO_DATE": str(options.prune_up_to_date),
"SPACK_REBUILD_EVERYTHING": str(rebuild_everything),
"SPACK_REQUIRE_SIGNING": str(options.require_signing),
}
if options.stack_name:
output_object["variables"]["SPACK_CI_STACK_NAME"] = options.stack_name
output_vars = output_object["variables"]
for item, val in output_vars.items():
output_vars[item] = ensure_expected_target_path(val)
else:
# No jobs were generated
noop_job = spack_ci_ir["jobs"]["noop"]["attributes"]
# If this job fails ignore the status and carry on
noop_job["retry"] = 0
noop_job["allow_failure"] = True
tty.debug("No specs to rebuild, generating no-op job")
output_object = {"no-specs-to-rebuild": noop_job}
# Ensure the child pipeline always runs
output_object["workflow"] = {"rules": [{"when": "always"}]}
sorted_output = {}
for output_key, output_value in sorted(output_object.items()):
sorted_output[output_key] = output_value
# Minimize yaml output size through use of anchors
syaml.anchorify(sorted_output)
with open(output_file, "w", encoding="utf-8") as f:
ruamel.yaml.YAML().dump(sorted_output, f)

View File

@@ -4,6 +4,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import argparse
import difflib
import importlib
import os
import re
@@ -23,10 +24,10 @@
import spack.environment as ev
import spack.error
import spack.extensions
import spack.parser
import spack.paths
import spack.repo
import spack.spec
import spack.spec_parser
import spack.store
import spack.traverse as traverse
import spack.user_environment as uenv
@@ -125,6 +126,8 @@ def get_module(cmd_name):
tty.debug("Imported {0} from built-in commands".format(pname))
except ImportError:
module = spack.extensions.get_module(cmd_name)
if not module:
raise CommandNotFoundError(cmd_name)
attr_setdefault(module, SETUP_PARSER, lambda *args: None) # null-op
attr_setdefault(module, DESCRIPTION, "")
@@ -160,12 +163,12 @@ def quote_kvp(string: str) -> str:
or ``name==``, and we assume the rest of the argument is the value. This covers the
common cases of passing flags, e.g., ``cflags="-O2 -g"`` on the command line.
"""
match = spack.parser.SPLIT_KVP.match(string)
match = spack.spec_parser.SPLIT_KVP.match(string)
if not match:
return string
key, delim, value = match.groups()
return f"{key}{delim}{spack.parser.quote_if_needed(value)}"
return f"{key}{delim}{spack.spec_parser.quote_if_needed(value)}"
def parse_specs(
@@ -177,7 +180,7 @@ def parse_specs(
args = [args] if isinstance(args, str) else args
arg_string = " ".join([quote_kvp(arg) for arg in args])
specs = spack.parser.parse(arg_string)
specs = spack.spec_parser.parse(arg_string)
if not concretize:
return specs
@@ -691,3 +694,24 @@ def find_environment(args):
def first_line(docstring):
"""Return the first line of the docstring."""
return docstring.split("\n")[0]
class CommandNotFoundError(spack.error.SpackError):
"""Exception class thrown when a requested command is not recognized as
such.
"""
def __init__(self, cmd_name):
msg = (
f"{cmd_name} is not a recognized Spack command or extension command; "
"check with `spack commands`."
)
long_msg = None
similar = difflib.get_close_matches(cmd_name, all_commands())
if 1 <= len(similar) <= 5:
long_msg = "\nDid you mean one of the following commands?\n "
long_msg += "\n ".join(similar)
super().__init__(msg, long_msg)

View File

@@ -16,7 +16,7 @@
import spack.bootstrap.config
import spack.bootstrap.core
import spack.config
import spack.mirror
import spack.mirrors.utils
import spack.spec
import spack.stage
import spack.util.path
@@ -400,7 +400,7 @@ def _mirror(args):
llnl.util.tty.set_msg_enabled(False)
spec = spack.spec.Spec(spec_str).concretized()
for node in spec.traverse():
spack.mirror.create(mirror_dir, [node])
spack.mirrors.utils.create(mirror_dir, [node])
llnl.util.tty.set_msg_enabled(True)
if args.binary_packages:
@@ -419,7 +419,7 @@ def write_metadata(subdir, metadata):
metadata_rel_dir = os.path.join("metadata", subdir)
metadata_yaml = os.path.join(args.root_dir, metadata_rel_dir, "metadata.yaml")
llnl.util.filesystem.mkdirp(os.path.dirname(metadata_yaml))
with open(metadata_yaml, mode="w") as f:
with open(metadata_yaml, mode="w", encoding="utf-8") as f:
spack.util.spack_yaml.dump(metadata, stream=f)
return os.path.dirname(metadata_yaml), metadata_rel_dir

View File

@@ -21,7 +21,7 @@
import spack.deptypes as dt
import spack.environment as ev
import spack.error
import spack.mirror
import spack.mirrors.mirror
import spack.oci.oci
import spack.spec
import spack.stage
@@ -392,7 +392,7 @@ def push_fn(args):
roots = spack.cmd.require_active_env(cmd_name="buildcache push").concrete_roots()
mirror = args.mirror
assert isinstance(mirror, spack.mirror.Mirror)
assert isinstance(mirror, spack.mirrors.mirror.Mirror)
push_url = mirror.push_url
@@ -731,7 +731,7 @@ def manifest_copy(manifest_file_list, dest_mirror=None):
deduped_manifest = {}
for manifest_path in manifest_file_list:
with open(manifest_path) as fd:
with open(manifest_path, encoding="utf-8") as fd:
manifest = json.loads(fd.read())
for spec_hash, copy_list in manifest.items():
# Last duplicate hash wins
@@ -750,7 +750,7 @@ def manifest_copy(manifest_file_list, dest_mirror=None):
copy_buildcache_file(copy_file["src"], dest)
def update_index(mirror: spack.mirror.Mirror, update_keys=False):
def update_index(mirror: spack.mirrors.mirror.Mirror, update_keys=False):
# Special case OCI images for now.
try:
image_ref = spack.oci.oci.image_from_mirror(mirror)

View File

@@ -253,7 +253,7 @@ def add_versions_to_package(pkg: PackageBase, version_lines: str, is_batch: bool
if match:
new_versions.append((Version(match.group(1)), ver_line))
with open(filename, "r+") as f:
with open(filename, "r+", encoding="utf-8") as f:
contents = f.read()
split_contents = version_statement_re.split(contents)

View File

@@ -6,7 +6,6 @@
import json
import os
import shutil
import warnings
from urllib.parse import urlparse, urlunparse
import llnl.util.filesystem as fs
@@ -20,7 +19,7 @@
import spack.config as cfg
import spack.environment as ev
import spack.hash_types as ht
import spack.mirror
import spack.mirrors.mirror
import spack.util.gpg as gpg_util
import spack.util.timer as timer
import spack.util.url as url_util
@@ -62,22 +61,8 @@ def setup_parser(subparser):
"path to the file where generated jobs file should be written. "
"default is .gitlab-ci.yml in the root of the repository",
)
generate.add_argument(
"--optimize",
action="store_true",
default=False,
help="(DEPRECATED) optimize the gitlab yaml file for size\n\n"
"run the generated document through a series of optimization passes "
"designed to reduce the size of the generated file",
)
generate.add_argument(
"--dependencies",
action="store_true",
default=False,
help="(DEPRECATED) disable DAG scheduling (use 'plain' dependencies)",
)
prune_group = generate.add_mutually_exclusive_group()
prune_group.add_argument(
prune_dag_group = generate.add_mutually_exclusive_group()
prune_dag_group.add_argument(
"--prune-dag",
action="store_true",
dest="prune_dag",
@@ -85,7 +70,7 @@ def setup_parser(subparser):
help="skip up-to-date specs\n\n"
"do not generate jobs for specs that are up-to-date on the mirror",
)
prune_group.add_argument(
prune_dag_group.add_argument(
"--no-prune-dag",
action="store_false",
dest="prune_dag",
@@ -93,6 +78,23 @@ def setup_parser(subparser):
help="process up-to-date specs\n\n"
"generate jobs for specs even when they are up-to-date on the mirror",
)
prune_ext_group = generate.add_mutually_exclusive_group()
prune_ext_group.add_argument(
"--prune-externals",
action="store_true",
dest="prune_externals",
default=True,
help="skip external specs\n\n"
"do not generate jobs for specs that are marked as external",
)
prune_ext_group.add_argument(
"--no-prune-externals",
action="store_false",
dest="prune_externals",
default=True,
help="process external specs\n\n"
"generate jobs for specs even when they are marked as external",
)
generate.add_argument(
"--check-index-only",
action="store_true",
@@ -108,11 +110,12 @@ def setup_parser(subparser):
)
generate.add_argument(
"--artifacts-root",
default=None,
default="jobs_scratch_dir",
help="path to the root of the artifacts directory\n\n"
"if provided, concrete environment files (spack.yaml, spack.lock) will be generated under "
"this directory. their location will be passed to generated child jobs through the "
"SPACK_CONCRETE_ENVIRONMENT_PATH variable",
"The spack ci module assumes it will normally be run from within your project "
"directory, wherever that is checked out to run your ci. The artifacts root directory "
"should specifiy a name that can safely be used for artifacts within your project "
"directory.",
)
generate.set_defaults(func=ci_generate)
@@ -187,42 +190,8 @@ def ci_generate(args):
before invoking this command. the value must be the CDash authorization token needed to create
a build group and register all generated jobs under it
"""
if args.optimize:
warnings.warn(
"The --optimize option has been deprecated, and currently has no effect. "
"It will be removed in Spack v0.24."
)
if args.dependencies:
warnings.warn(
"The --dependencies option has been deprecated, and currently has no effect. "
"It will be removed in Spack v0.24."
)
env = spack.cmd.require_active_env(cmd_name="ci generate")
output_file = args.output_file
prune_dag = args.prune_dag
index_only = args.index_only
artifacts_root = args.artifacts_root
if not output_file:
output_file = os.path.abspath(".gitlab-ci.yml")
else:
output_file_path = os.path.abspath(output_file)
gen_ci_dir = os.path.dirname(output_file_path)
if not os.path.exists(gen_ci_dir):
os.makedirs(gen_ci_dir)
# Generate the jobs
spack_ci.generate_gitlab_ci_yaml(
env,
True,
output_file,
prune_dag=prune_dag,
check_index_only=index_only,
artifacts_root=artifacts_root,
)
spack_ci.generate_pipeline(env, args)
def ci_reindex(args):
@@ -240,7 +209,7 @@ def ci_reindex(args):
ci_mirrors = yaml_root["mirrors"]
mirror_urls = [url for url in ci_mirrors.values()]
remote_mirror_url = mirror_urls[0]
mirror = spack.mirror.Mirror(remote_mirror_url)
mirror = spack.mirrors.mirror.Mirror(remote_mirror_url)
buildcache.update_index(mirror, update_keys=True)
@@ -328,7 +297,7 @@ def ci_rebuild(args):
full_rebuild = True if rebuild_everything and rebuild_everything.lower() == "true" else False
pipeline_mirrors = spack.mirror.MirrorCollection(binary=True)
pipeline_mirrors = spack.mirrors.mirror.MirrorCollection(binary=True)
buildcache_destination = None
if "buildcache-destination" not in pipeline_mirrors:
tty.die("spack ci rebuild requires a mirror named 'buildcache-destination")
@@ -387,7 +356,7 @@ def ci_rebuild(args):
# Write this job's spec json into the reproduction directory, and it will
# also be used in the generated "spack install" command to install the spec
tty.debug("job concrete spec path: {0}".format(job_spec_json_path))
with open(job_spec_json_path, "w") as fd:
with open(job_spec_json_path, "w", encoding="utf-8") as fd:
fd.write(job_spec.to_json(hash=ht.dag_hash))
# Write some other details to aid in reproduction into an artifact
@@ -397,7 +366,7 @@ def ci_rebuild(args):
"job_spec_json": job_spec_json_file,
"ci_project_dir": ci_project_dir,
}
with open(repro_file, "w") as fd:
with open(repro_file, "w", encoding="utf-8") as fd:
fd.write(json.dumps(repro_details))
# Write information about spack into an artifact in the repro dir
@@ -433,14 +402,16 @@ def ci_rebuild(args):
if not config["verify_ssl"]:
spack_cmd.append("-k")
install_args = [f'--use-buildcache={spack_ci.win_quote("package:never,dependencies:only")}']
install_args = [
f'--use-buildcache={spack_ci.common.win_quote("package:never,dependencies:only")}'
]
can_verify = spack_ci.can_verify_binaries()
verify_binaries = can_verify and spack_is_pr_pipeline is False
if not verify_binaries:
install_args.append("--no-check-signature")
slash_hash = spack_ci.win_quote("/" + job_spec.dag_hash())
slash_hash = spack_ci.common.win_quote("/" + job_spec.dag_hash())
# Arguments when installing the root from sources
deps_install_args = install_args + ["--only=dependencies"]
@@ -605,7 +576,7 @@ def ci_rebuild(args):
rebuild_timer.stop()
try:
with open("install_timers.json", "w") as timelog:
with open("install_timers.json", "w", encoding="utf-8") as timelog:
extra_attributes = {"name": ".ci-rebuild"}
rebuild_timer.write_json(timelog, extra_attributes=extra_attributes)
except Exception as e:

View File

@@ -743,7 +743,7 @@ def rst(args: Namespace, out: IO) -> None:
# extract cross-refs of the form `_cmd-spack-<cmd>:` from rst files
documented_commands: Set[str] = set()
for filename in args.rst_files:
with open(filename) as f:
with open(filename, encoding="utf-8") as f:
for line in f:
match = re.match(r"\.\. _cmd-(spack-.*):", line)
if match:
@@ -815,7 +815,7 @@ def prepend_header(args: Namespace, out: IO) -> None:
if not args.header:
return
with open(args.header) as header:
with open(args.header, encoding="utf-8") as header:
out.write(header.read())
@@ -836,7 +836,7 @@ def _commands(parser: ArgumentParser, args: Namespace) -> None:
if args.update:
tty.msg(f"Updating file: {args.update}")
with open(args.update, "w") as f:
with open(args.update, "w", encoding="utf-8") as f:
prepend_header(args, f)
formatter(args, f)

View File

@@ -14,7 +14,8 @@
import spack.config
import spack.deptypes as dt
import spack.environment as ev
import spack.mirror
import spack.mirrors.mirror
import spack.mirrors.utils
import spack.reporters
import spack.spec
import spack.store
@@ -168,7 +169,7 @@ def installed_specs(args):
else:
packages = []
for file in args.specfiles:
with open(file, "r") as f:
with open(file, "r", encoding="utf-8") as f:
s = spack.spec.Spec.from_yaml(f)
packages.append(s.format())
return packages
@@ -689,31 +690,31 @@ def mirror_name_or_url(m):
# If there's a \ or / in the name, it's interpreted as a path or url.
if "/" in m or "\\" in m or m in (".", ".."):
return spack.mirror.Mirror(m)
return spack.mirrors.mirror.Mirror(m)
# Otherwise, the named mirror is required to exist.
try:
return spack.mirror.require_mirror_name(m)
return spack.mirrors.utils.require_mirror_name(m)
except ValueError as e:
raise argparse.ArgumentTypeError(f"{e}. Did you mean {os.path.join('.', m)}?") from e
def mirror_url(url):
try:
return spack.mirror.Mirror.from_url(url)
return spack.mirrors.mirror.Mirror.from_url(url)
except ValueError as e:
raise argparse.ArgumentTypeError(str(e)) from e
def mirror_directory(path):
try:
return spack.mirror.Mirror.from_local_path(path)
return spack.mirrors.mirror.Mirror.from_local_path(path)
except ValueError as e:
raise argparse.ArgumentTypeError(str(e)) from e
def mirror_name(name):
try:
return spack.mirror.require_mirror_name(name)
return spack.mirrors.utils.require_mirror_name(name)
except ValueError as e:
raise argparse.ArgumentTypeError(str(e)) from e

View File

@@ -14,6 +14,7 @@
import spack.config
import spack.environment as ev
import spack.error
import spack.schema
import spack.schema.env
import spack.spec
import spack.store
@@ -566,7 +567,7 @@ def config_prefer_upstream(args):
# Simply write the config to the specified file.
existing = spack.config.get("packages", scope=scope)
new = spack.config.merge_yaml(existing, pkgs)
new = spack.schema.merge_yaml(existing, pkgs)
spack.config.set("packages", new, scope)
config_file = spack.config.CONFIG.get_config_filename(scope, section)

View File

@@ -110,7 +110,7 @@ def write(self, pkg_path):
all_deps.append(self.dependencies)
# Write out a template for the file
with open(pkg_path, "w") as pkg_file:
with open(pkg_path, "w", encoding="utf-8") as pkg_file:
pkg_file.write(
package_template.format(
name=self.name,

View File

@@ -76,7 +76,7 @@ def locate_package(name: str, repo: spack.repo.Repo) -> str:
path = repo.filename_for_package_name(name)
try:
with open(path, "r"):
with open(path, "r", encoding="utf-8"):
return path
except OSError as e:
if e.errno == errno.ENOENT:
@@ -93,7 +93,7 @@ def locate_file(name: str, path: str) -> str:
# Try to open direct match.
try:
with open(file_path, "r"):
with open(file_path, "r", encoding="utf-8"):
return file_path
except OSError as e:
if e.errno != errno.ENOENT:

View File

@@ -865,7 +865,7 @@ def env_loads(args):
args.recurse_dependencies = False
loads_file = fs.join_path(env.path, "loads")
with open(loads_file, "w") as f:
with open(loads_file, "w", encoding="utf-8") as f:
specs = env._get_environment_specs(recurse_dependencies=recurse_dependencies)
spack.cmd.modules.loads(module_type, specs, args, f)
@@ -1053,7 +1053,7 @@ def env_depfile(args):
# Finally write to stdout/file.
if args.output:
with open(args.output, "w") as f:
with open(args.output, "w", encoding="utf-8") as f:
f.write(makefile)
else:
sys.stdout.write(makefile)

View File

@@ -8,7 +8,7 @@
import tempfile
import spack.binary_distribution
import spack.mirror
import spack.mirrors.mirror
import spack.paths
import spack.stage
import spack.util.gpg
@@ -217,11 +217,11 @@ def gpg_publish(args):
mirror = None
if args.directory:
url = spack.util.url.path_to_file_url(args.directory)
mirror = spack.mirror.Mirror(url, url)
mirror = spack.mirrors.mirror.Mirror(url, url)
elif args.mirror_name:
mirror = spack.mirror.MirrorCollection(binary=True).lookup(args.mirror_name)
mirror = spack.mirrors.mirror.MirrorCollection(binary=True).lookup(args.mirror_name)
elif args.mirror_url:
mirror = spack.mirror.Mirror(args.mirror_url, args.mirror_url)
mirror = spack.mirrors.mirror.Mirror(args.mirror_url, args.mirror_url)
with tempfile.TemporaryDirectory(dir=spack.stage.get_stage_root()) as tmpdir:
spack.binary_distribution._url_push_keys(

View File

@@ -291,7 +291,7 @@ def _dump_log_on_error(e: InstallError):
tty.error("'spack install' created no log.")
else:
sys.stderr.write("Full build log:\n")
with open(e.pkg.log_path, errors="replace") as log:
with open(e.pkg.log_path, errors="replace", encoding="utf-8") as log:
shutil.copyfileobj(log, sys.stderr)
@@ -445,7 +445,7 @@ def concrete_specs_from_file(args):
"""Return the list of concrete specs read from files."""
result = []
for file in args.specfiles:
with open(file, "r") as f:
with open(file, "r", encoding="utf-8") as f:
if file.endswith("yaml") or file.endswith("yml"):
s = spack.spec.Spec.from_yaml(f)
else:

View File

@@ -191,7 +191,7 @@ def verify(args):
for relpath in _licensed_files(args):
path = os.path.join(args.root, relpath)
with open(path) as f:
with open(path, encoding="utf-8") as f:
lines = [line for line in f][:license_lines]
error = _check_license(lines, path)

View File

@@ -340,7 +340,7 @@ def list(parser, args):
return
tty.msg("Updating file: %s" % args.update)
with open(args.update, "w") as f:
with open(args.update, "w", encoding="utf-8") as f:
formatter(sorted_packages, f)
elif args.count:

View File

@@ -31,7 +31,7 @@ def line_to_rtf(str):
return str.replace("\n", "\\par")
contents = ""
with open(file_path, "r+") as f:
with open(file_path, "r+", encoding="utf-8") as f:
for line in f.readlines():
contents += line_to_rtf(line)
return rtf_header.format(contents)
@@ -93,7 +93,7 @@ def make_installer(parser, args):
rtf_spack_license = txt_to_rtf(spack_license)
spack_license = posixpath.join(source_dir, "LICENSE.rtf")
with open(spack_license, "w") as rtf_license:
with open(spack_license, "w", encoding="utf-8") as rtf_license:
written = rtf_license.write(rtf_spack_license)
if written == 0:
raise RuntimeError("Failed to generate properly formatted license file")

View File

@@ -14,7 +14,8 @@
import spack.concretize
import spack.config
import spack.environment as ev
import spack.mirror
import spack.mirrors.mirror
import spack.mirrors.utils
import spack.repo
import spack.spec
import spack.util.web as web_util
@@ -365,15 +366,15 @@ def mirror_add(args):
connection["autopush"] = args.autopush
if args.signed is not None:
connection["signed"] = args.signed
mirror = spack.mirror.Mirror(connection, name=args.name)
mirror = spack.mirrors.mirror.Mirror(connection, name=args.name)
else:
mirror = spack.mirror.Mirror(args.url, name=args.name)
spack.mirror.add(mirror, args.scope)
mirror = spack.mirrors.mirror.Mirror(args.url, name=args.name)
spack.mirrors.utils.add(mirror, args.scope)
def mirror_remove(args):
"""remove a mirror by name"""
spack.mirror.remove(args.name, args.scope)
spack.mirrors.utils.remove(args.name, args.scope)
def _configure_mirror(args):
@@ -382,7 +383,7 @@ def _configure_mirror(args):
if args.name not in mirrors:
tty.die(f"No mirror found with name {args.name}.")
entry = spack.mirror.Mirror(mirrors[args.name], args.name)
entry = spack.mirrors.mirror.Mirror(mirrors[args.name], args.name)
direction = "fetch" if args.fetch else "push" if args.push else None
changes = {}
if args.url:
@@ -449,7 +450,7 @@ def mirror_set_url(args):
def mirror_list(args):
"""print out available mirrors to the console"""
mirrors = spack.mirror.MirrorCollection(scope=args.scope)
mirrors = spack.mirrors.mirror.MirrorCollection(scope=args.scope)
if not mirrors:
tty.msg("No mirrors configured.")
return
@@ -467,7 +468,7 @@ def specs_from_text_file(filename, concretize=False):
concretize (bool): if True concretize the specs before returning
the list.
"""
with open(filename, "r") as f:
with open(filename, "r", encoding="utf-8") as f:
specs_in_file = f.readlines()
specs_in_file = [s.strip() for s in specs_in_file]
return spack.cmd.parse_specs(" ".join(specs_in_file), concretize=concretize)
@@ -489,9 +490,9 @@ def concrete_specs_from_user(args):
def extend_with_additional_versions(specs, num_versions):
if num_versions == "all":
mirror_specs = spack.mirror.get_all_versions(specs)
mirror_specs = spack.mirrors.utils.get_all_versions(specs)
else:
mirror_specs = spack.mirror.get_matching_versions(specs, num_versions=num_versions)
mirror_specs = spack.mirrors.utils.get_matching_versions(specs, num_versions=num_versions)
mirror_specs = [x.concretized() for x in mirror_specs]
return mirror_specs
@@ -570,7 +571,7 @@ def concrete_specs_from_environment():
def all_specs_with_all_versions():
specs = [spack.spec.Spec(n) for n in spack.repo.all_package_names()]
mirror_specs = spack.mirror.get_all_versions(specs)
mirror_specs = spack.mirrors.utils.get_all_versions(specs)
mirror_specs.sort(key=lambda s: (s.name, s.version))
return mirror_specs
@@ -659,19 +660,21 @@ def _specs_and_action(args):
def create_mirror_for_all_specs(mirror_specs, path, skip_unstable_versions):
mirror_cache, mirror_stats = spack.mirror.mirror_cache_and_stats(
mirror_cache, mirror_stats = spack.mirrors.utils.mirror_cache_and_stats(
path, skip_unstable_versions=skip_unstable_versions
)
for candidate in mirror_specs:
pkg_cls = spack.repo.PATH.get_pkg_class(candidate.name)
pkg_obj = pkg_cls(spack.spec.Spec(candidate))
mirror_stats.next_spec(pkg_obj.spec)
spack.mirror.create_mirror_from_package_object(pkg_obj, mirror_cache, mirror_stats)
spack.mirrors.utils.create_mirror_from_package_object(pkg_obj, mirror_cache, mirror_stats)
process_mirror_stats(*mirror_stats.stats())
def create_mirror_for_individual_specs(mirror_specs, path, skip_unstable_versions):
present, mirrored, error = spack.mirror.create(path, mirror_specs, skip_unstable_versions)
present, mirrored, error = spack.mirrors.utils.create(
path, mirror_specs, skip_unstable_versions
)
tty.msg("Summary for mirror in {}".format(path))
process_mirror_stats(present, mirrored, error)
@@ -681,7 +684,7 @@ def mirror_destroy(args):
mirror_url = None
if args.mirror_name:
result = spack.mirror.MirrorCollection().lookup(args.mirror_name)
result = spack.mirrors.mirror.MirrorCollection().lookup(args.mirror_name)
mirror_url = result.push_url
elif args.mirror_url:
mirror_url = args.mirror_url

View File

@@ -8,6 +8,7 @@
import spack.cmd.common.arguments
import spack.cmd.modules
import spack.config
import spack.modules
import spack.modules.lmod

View File

@@ -7,6 +7,7 @@
import spack.cmd.common.arguments
import spack.cmd.modules
import spack.config
import spack.modules
import spack.modules.tcl

View File

@@ -150,7 +150,7 @@ def pkg_source(args):
content = ph.canonical_source(spec)
else:
message = "Source for %s:" % filename
with open(filename) as f:
with open(filename, encoding="utf-8") as f:
content = f.read()
if sys.stdout.isatty():

View File

@@ -94,7 +94,7 @@ def ipython_interpreter(args):
if "PYTHONSTARTUP" in os.environ:
startup_file = os.environ["PYTHONSTARTUP"]
if os.path.isfile(startup_file):
with open(startup_file) as startup:
with open(startup_file, encoding="utf-8") as startup:
exec(startup.read())
# IPython can also support running a script OR command, not both
@@ -126,7 +126,7 @@ def python_interpreter(args):
if "PYTHONSTARTUP" in os.environ:
startup_file = os.environ["PYTHONSTARTUP"]
if os.path.isfile(startup_file):
with open(startup_file) as startup:
with open(startup_file, encoding="utf-8") as startup:
console.runsource(startup.read(), startup_file, "exec")
if args.python_command:
propagate_exceptions_from(console)

View File

@@ -3,18 +3,21 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import argparse
import ast
import os
import re
import sys
from itertools import zip_longest
from typing import Dict, List, Optional
import llnl.util.tty as tty
import llnl.util.tty.color as color
from llnl.util.filesystem import working_dir
import spack.paths
import spack.repo
import spack.util.git
from spack.util.executable import which
from spack.util.executable import Executable, which
description = "runs source code style checks on spack"
section = "developer"
@@ -36,10 +39,7 @@ def grouper(iterable, n, fillvalue=None):
#: double-check the results of other tools (if, e.g., --fix was provided)
#: The list maps an executable name to a method to ensure the tool is
#: bootstrapped or present in the environment.
tool_names = ["isort", "black", "flake8", "mypy"]
#: tools we run in spack style
tools = {}
tool_names = ["import", "isort", "black", "flake8", "mypy"]
#: warnings to ignore in mypy
mypy_ignores = [
@@ -61,14 +61,28 @@ def is_package(f):
#: decorator for adding tools to the list
class tool:
def __init__(self, name, required=False):
def __init__(self, name: str, required: bool = False, external: bool = True) -> None:
self.name = name
self.external = external
self.required = required
def __call__(self, fun):
tools[self.name] = (fun, self.required)
self.fun = fun
tools[self.name] = self
return fun
@property
def installed(self) -> bool:
return bool(which(self.name)) if self.external else True
@property
def executable(self) -> Optional[Executable]:
return which(self.name) if self.external else None
#: tools we run in spack style
tools: Dict[str, tool] = {}
def changed_files(base="develop", untracked=True, all_files=False, root=None):
"""Get list of changed files in the Spack repository.
@@ -176,22 +190,22 @@ def setup_parser(subparser):
"-t",
"--tool",
action="append",
help="specify which tools to run (default: %s)" % ",".join(tool_names),
help="specify which tools to run (default: %s)" % ", ".join(tool_names),
)
tool_group.add_argument(
"-s",
"--skip",
metavar="TOOL",
action="append",
help="specify tools to skip (choose from %s)" % ",".join(tool_names),
help="specify tools to skip (choose from %s)" % ", ".join(tool_names),
)
subparser.add_argument("files", nargs=argparse.REMAINDER, help="specific files to check")
def cwd_relative(path, args):
def cwd_relative(path, root, initial_working_dir):
"""Translate prefix-relative path to current working directory-relative."""
return os.path.relpath(os.path.join(args.root, path), args.initial_working_dir)
return os.path.relpath(os.path.join(root, path), initial_working_dir)
def rewrite_and_print_output(
@@ -201,7 +215,10 @@ def rewrite_and_print_output(
# print results relative to current working directory
def translate(match):
return replacement.format(cwd_relative(match.group(1), args), *list(match.groups()[1:]))
return replacement.format(
cwd_relative(match.group(1), args.root, args.initial_working_dir),
*list(match.groups()[1:]),
)
for line in output.split("\n"):
if not line:
@@ -220,7 +237,7 @@ def print_style_header(file_list, args, tools_to_run):
# translate modified paths to cwd_relative if needed
paths = [filename.strip() for filename in file_list]
if not args.root_relative:
paths = [cwd_relative(filename, args) for filename in paths]
paths = [cwd_relative(filename, args.root, args.initial_working_dir) for filename in paths]
tty.msg("Modified files", *paths)
sys.stdout.flush()
@@ -306,8 +323,6 @@ def process_files(file_list, is_args):
rewrite_and_print_output(output, args, pat, replacement)
packages_isort_args = (
"--rm",
"spack",
"--rm",
"spack.pkgkit",
"--rm",
@@ -352,17 +367,137 @@ def run_black(black_cmd, file_list, args):
return returncode
def _module_part(root: str, expr: str):
parts = expr.split(".")
# spack.pkg is for repositories, don't try to resolve it here.
if ".".join(parts[:2]) == spack.repo.ROOT_PYTHON_NAMESPACE:
return None
while parts:
f1 = os.path.join(root, "lib", "spack", *parts) + ".py"
f2 = os.path.join(root, "lib", "spack", *parts, "__init__.py")
if (
os.path.exists(f1)
# ensure case sensitive match
and f"{parts[-1]}.py" in os.listdir(os.path.dirname(f1))
or os.path.exists(f2)
):
return ".".join(parts)
parts.pop()
return None
def _run_import_check(
file_list: List[str],
*,
fix: bool,
root_relative: bool,
root=spack.paths.prefix,
working_dir=spack.paths.prefix,
out=sys.stdout,
):
if sys.version_info < (3, 9):
print("import check requires Python 3.9 or later")
return 0
is_use = re.compile(r"(?<!from )(?<!import )(?:llnl|spack)\.[a-zA-Z0-9_\.]+")
# redundant imports followed by a `# comment` are ignored, because there can be a legitimate reason
# to import a module: execute module scope init code, or to deal with circular imports.
is_abs_import = re.compile(r"^import ((?:llnl|spack)\.[a-zA-Z0-9_\.]+)$", re.MULTILINE)
exit_code = 0
for file in file_list:
to_add = set()
to_remove = []
pretty_path = file if root_relative else cwd_relative(file, root, working_dir)
try:
with open(file, "r", encoding="utf-8") as f:
contents = f.read()
parsed = ast.parse(contents)
except Exception:
exit_code = 1
print(f"{pretty_path}: could not parse", file=out)
continue
for m in is_abs_import.finditer(contents):
if contents.count(m.group(1)) == 1:
to_remove.append(m.group(0))
exit_code = 1
print(f"{pretty_path}: redundant import: {m.group(1)}", file=out)
# Clear all strings to avoid matching comments/strings etc.
for node in ast.walk(parsed):
if isinstance(node, ast.Constant) and isinstance(node.value, str):
node.value = ""
filtered_contents = ast.unparse(parsed) # novermin
for m in is_use.finditer(filtered_contents):
module = _module_part(root, m.group(0))
if not module or module in to_add:
continue
if re.search(rf"import {re.escape(module)}\b(?!\.)", contents):
continue
to_add.add(module)
exit_code = 1
print(f"{pretty_path}: missing import: {module} ({m.group(0)})", file=out)
if not fix or not to_add and not to_remove:
continue
with open(file, "r", encoding="utf-8") as f:
lines = f.readlines()
if to_add:
# insert missing imports before the first import, delegate ordering to isort
for node in parsed.body:
if isinstance(node, (ast.Import, ast.ImportFrom)):
first_line = node.lineno
break
else:
print(f"{pretty_path}: could not fix", file=out)
continue
lines.insert(first_line, "\n".join(f"import {x}" for x in to_add) + "\n")
new_contents = "".join(lines)
# remove redundant imports
for statement in to_remove:
new_contents = new_contents.replace(f"{statement}\n", "")
with open(file, "w", encoding="utf-8") as f:
f.write(new_contents)
return exit_code
@tool("import", external=False)
def run_import_check(import_check_cmd, file_list, args):
exit_code = _run_import_check(
file_list,
fix=args.fix,
root_relative=args.root_relative,
root=args.root,
working_dir=args.initial_working_dir,
)
print_tool_result("import", exit_code)
return exit_code
def validate_toolset(arg_value):
"""Validate --tool and --skip arguments (sets of optionally comma-separated tools)."""
tools = set(",".join(arg_value).split(",")) # allow args like 'isort,flake8'
for tool in tools:
if tool not in tool_names:
tty.die("Invaild tool: '%s'" % tool, "Choose from: %s" % ", ".join(tool_names))
tty.die("Invalid tool: '%s'" % tool, "Choose from: %s" % ", ".join(tool_names))
return tools
def missing_tools(tools_to_run):
return [t for t in tools_to_run if which(t) is None]
def missing_tools(tools_to_run: List[str]) -> List[str]:
return [t for t in tools_to_run if not tools[t].installed]
def _bootstrap_dev_dependencies():
@@ -417,9 +552,9 @@ def prefix_relative(path):
print_style_header(file_list, args, tools_to_run)
for tool_name in tools_to_run:
run_function, required = tools[tool_name]
tool = tools[tool_name]
print_tool_header(tool_name)
return_code |= run_function(which(tool_name), file_list, args)
return_code |= tool.fun(tool.executable, file_list, args)
if return_code == 0:
tty.msg(color.colorize("@*{spack style checks were clean}"))
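
For illustration only, not part of the diff: a minimal sketch of the kind of file the new `import` check flags, assuming the rules encoded in `_run_import_check` above (the module uses and function name are hypothetical).

import spack.config
# the import above is flagged as redundant: the imported module name occurs nowhere
# else in the file (a trailing "# comment" on the import line would suppress the report)


def first_binary_mirror():
    # flagged as a missing import: the attribute chain below resolves to the module
    # spack.mirrors.mirror, which is never imported at the top of the file
    return spack.mirrors.mirror.MirrorCollection(binary=True).lookup("my-mirror")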

View File

@@ -346,7 +346,7 @@ def _report_suite_results(test_suite, args, constraints):
tty.msg("{0} for test suite '{1}'{2}:".format(results_desc, test_suite.name, matching))
results = {}
with open(test_suite.results_file, "r") as f:
with open(test_suite.results_file, "r", encoding="utf-8") as f:
for line in f:
pkg_id, status = line.split()
results[pkg_id] = status
@@ -371,7 +371,7 @@ def _report_suite_results(test_suite, args, constraints):
spec = test_specs[pkg_id]
log_file = test_suite.log_file_for_spec(spec)
if os.path.isfile(log_file):
with open(log_file, "r") as f:
with open(log_file, "r", encoding="utf-8") as f:
msg += "\n{0}".format("".join(f.readlines()))
tty.msg(msg)

View File

@@ -192,7 +192,7 @@ def view(parser, args):
if args.action in actions_link and args.projection_file:
# argparse confirms file exists
with open(args.projection_file, "r") as f:
with open(args.projection_file, "r", encoding="utf-8") as f:
projections_data = s_yaml.load(f)
validate(projections_data, spack.schema.projections.schema)
ordered_projections = projections_data["projections"]

View File

@@ -469,7 +469,7 @@ def _compile_dummy_c_source(self) -> Optional[str]:
fout = os.path.join(tmpdir, "output")
fin = os.path.join(tmpdir, f"main.{ext}")
with open(fin, "w") as csource:
with open(fin, "w", encoding="utf-8") as csource:
csource.write(
"int main(int argc, char* argv[]) { (void)argc; (void)argv; return 0; }\n"
)

View File

@@ -124,8 +124,8 @@ def setup_custom_environment(self, pkg, env):
# Edge cases for Intel's oneAPI compilers when using the legacy classic compilers:
# Always pass flags to disable deprecation warnings, since these warnings can
# confuse tools that parse the output of compiler commands (e.g. version checks).
if self.real_version >= Version("2021") and self.real_version <= Version("2023"):
if self.real_version >= Version("2021") and self.real_version < Version("2024"):
env.append_flags("SPACK_ALWAYS_CFLAGS", "-diag-disable=10441")
env.append_flags("SPACK_ALWAYS_CXXFLAGS", "-diag-disable=10441")
if self.real_version >= Version("2021") and self.real_version <= Version("2024"):
if self.real_version >= Version("2021") and self.real_version < Version("2025"):
env.append_flags("SPACK_ALWAYS_FFLAGS", "-diag-disable=10448")

View File

@@ -155,10 +155,10 @@ def setup_custom_environment(self, pkg, env):
# icx+icpx+ifx or icx+icpx+ifort. But to be on the safe side (some users may
# want to try to swap icpx against icpc, for example), and since the Intel LLVM
# compilers accept these diag-disable flags, we apply them for all compilers.
if self.real_version >= Version("2021") and self.real_version <= Version("2023"):
if self.real_version >= Version("2021") and self.real_version < Version("2024"):
env.append_flags("SPACK_ALWAYS_CFLAGS", "-diag-disable=10441")
env.append_flags("SPACK_ALWAYS_CXXFLAGS", "-diag-disable=10441")
if self.real_version >= Version("2021") and self.real_version <= Version("2024"):
if self.real_version >= Version("2021") and self.real_version < Version("2025"):
env.append_flags("SPACK_ALWAYS_FFLAGS", "-diag-disable=10448")
# 2024 release bumped the libsycl version because of an ABI

View File

@@ -160,6 +160,11 @@ def concretize_separately(
# TODO: support parallel concretization on macOS and Windows
num_procs = min(len(args), spack.config.determine_number_of_jobs(parallel=True))
msg = "Starting concretization"
if sys.platform not in ("darwin", "win32") and num_procs > 1:
msg += f" pool with {num_procs} processes"
tty.msg(msg)
for j, (i, concrete, duration) in enumerate(
spack.util.parallel.imap_unordered(
_concretize_task, args, processes=num_procs, debug=tty.is_debug(), maxtaskperchild=1

View File

@@ -179,7 +179,7 @@ def _write_section(self, section: str) -> None:
try:
filesystem.mkdirp(self.path)
with open(filename, "w") as f:
with open(filename, "w", encoding="utf-8") as f:
syaml.dump_config(data, stream=f, default_flow_style=False)
except (syaml.SpackYAMLError, OSError) as e:
raise ConfigFileError(f"cannot write to '{filename}'") from e
@@ -314,7 +314,7 @@ def _write_section(self, section: str) -> None:
filesystem.mkdirp(parent)
tmp = os.path.join(parent, f".{os.path.basename(self.path)}.tmp")
with open(tmp, "w") as f:
with open(tmp, "w", encoding="utf-8") as f:
syaml.dump_config(data_to_write, stream=f, default_flow_style=False)
filesystem.rename(tmp, self.path)
@@ -619,7 +619,7 @@ def _get_config_memoized(self, section: str, scope: Optional[str]) -> YamlConfig
if changed:
self.format_updates[section].append(scope)
merged_section = merge_yaml(merged_section, data)
merged_section = spack.schema.merge_yaml(merged_section, data)
# no config files -- empty config.
if section not in merged_section:
@@ -680,7 +680,7 @@ def set(self, path: str, value: Any, scope: Optional[str] = None) -> None:
while len(parts) > 1:
key = parts.pop(0)
if _override(key):
if spack.schema.override(key):
new = type(data[key])()
del data[key]
else:
@@ -693,7 +693,7 @@ def set(self, path: str, value: Any, scope: Optional[str] = None) -> None:
data[key] = new
data = new
if _override(parts[0]):
if spack.schema.override(parts[0]):
data.pop(parts[0], None)
# update new value
@@ -790,30 +790,6 @@ def config_paths_from_entry_points() -> List[Tuple[str, str]]:
return config_paths
def _add_command_line_scopes(cfg: Configuration, command_line_scopes: List[str]) -> None:
"""Add additional scopes from the --config-scope argument, either envs or dirs."""
import spack.environment.environment as env # circular import
for i, path in enumerate(command_line_scopes):
name = f"cmd_scope_{i}"
if env.exists(path): # managed environment
manifest = env.EnvironmentManifestFile(env.root(path))
elif env.is_env_dir(path): # anonymous environment
manifest = env.EnvironmentManifestFile(path)
elif os.path.isdir(path): # directory with config files
cfg.push_scope(DirectoryConfigScope(name, path, writable=False))
_add_platform_scope(cfg, name, path, writable=False)
continue
else:
raise spack.error.ConfigError(f"Invalid configuration scope: {path}")
for scope in manifest.env_config_scopes:
scope.name = f"{name}:{scope.name}"
scope.writable = False
cfg.push_scope(scope)
def create() -> Configuration:
"""Singleton Configuration instance.
@@ -894,7 +870,7 @@ def add_from_file(filename: str, scope: Optional[str] = None) -> None:
value = data[section]
existing = get(section, scope=scope)
new = merge_yaml(existing, value)
new = spack.schema.merge_yaml(existing, value)
# We cannot call config.set directly (set is a type)
CONFIG.set(section, new, scope)
@@ -946,7 +922,7 @@ def add(fullpath: str, scope: Optional[str] = None) -> None:
value: List[str] = [value] # type: ignore[no-redef]
# merge value into existing
new = merge_yaml(existing, value)
new = spack.schema.merge_yaml(existing, value)
CONFIG.set(path, new, scope)
@@ -1093,7 +1069,7 @@ def read_config_file(
# schema when it's not necessary) while allowing us to validate against a
# known schema when the top-level key could be incorrect.
try:
with open(path) as f:
with open(path, encoding="utf-8") as f:
tty.debug(f"Reading config from file {path}")
data = syaml.load_config(f)
@@ -1120,44 +1096,6 @@ def read_config_file(
raise ConfigFileError(str(e)) from e
def _override(string: str) -> bool:
"""Test if a spack YAML string is an override.
See ``spack_yaml`` for details. Keys in Spack YAML can end in `::`,
and if they do, their values completely replace lower-precedence
configs instead of merging into them.
"""
return hasattr(string, "override") and string.override
def _append(string: str) -> bool:
"""Test if a spack YAML string is an override.
See ``spack_yaml`` for details. Keys in Spack YAML can end in `+:`,
and if they do, their values append lower-precedence
configs.
str, str : concatenate strings.
[obj], [obj] : append lists.
"""
return getattr(string, "append", False)
def _prepend(string: str) -> bool:
"""Test if a spack YAML string is an override.
See ``spack_yaml`` for details. Keys in Spack YAML can end in `+:`,
and if they do, their values prepend lower-precedence
configs.
str, str : concatenate strings.
[obj], [obj] : prepend lists. (default behavior)
"""
return getattr(string, "prepend", False)
def _mark_internal(data, name):
"""Add a simple name mark to raw YAML/JSON data.
@@ -1260,7 +1198,7 @@ def they_are(t):
unmerge = sk in dest
old_dest_value = dest.pop(sk, None)
if unmerge and not _override(sk):
if unmerge and not spack.schema.override(sk):
dest[sk] = remove_yaml(old_dest_value, sv)
return dest
@@ -1270,81 +1208,6 @@ def they_are(t):
return dest
def merge_yaml(dest, source, prepend=False, append=False):
"""Merges source into dest; entries in source take precedence over dest.
This routine may modify dest and should be assigned to dest, in
case dest was None to begin with, e.g.:
dest = merge_yaml(dest, source)
In the result, elements from lists from ``source`` will appear before
elements of lists from ``dest``. Likewise, when iterating over keys
or items in merged ``OrderedDict`` objects, keys from ``source`` will
appear before keys from ``dest``.
Config file authors can optionally end any attribute in a dict
with `::` instead of `:`, and the key will override that of the
parent instead of merging.
`+:` will extend the default prepend merge strategy to include string concatenation
`-:` will change the merge strategy to append; it also includes string concatenation
"""
def they_are(t):
return isinstance(dest, t) and isinstance(source, t)
# If source is None, overwrite with source.
if source is None:
return None
# Source list is prepended (for precedence)
if they_are(list):
if append:
# Make sure to copy ruamel comments
dest[:] = [x for x in dest if x not in source] + source
else:
# Make sure to copy ruamel comments
dest[:] = source + [x for x in dest if x not in source]
return dest
# Source dict is merged into dest.
elif they_are(dict):
# save dest keys to reinsert later -- this ensures that source items
# come *before* dest in OrderedDicts
dest_keys = [dk for dk in dest.keys() if dk not in source]
for sk, sv in source.items():
# always remove the dest items. Python dicts do not overwrite
# keys on insert, so this ensures that source keys are copied
# into dest along with mark provenance (i.e., file/line info).
merge = sk in dest
old_dest_value = dest.pop(sk, None)
if merge and not _override(sk):
dest[sk] = merge_yaml(old_dest_value, sv, _prepend(sk), _append(sk))
else:
# if sk ended with ::, or if it's new, completely override
dest[sk] = copy.deepcopy(sv)
# reinsert dest keys so they are last in the result
for dk in dest_keys:
dest[dk] = dest.pop(dk)
return dest
elif they_are(str):
# Concatenate strings in prepend mode
if prepend:
return source + dest
elif append:
return dest + source
# If we reach here source and dest are either different types or are
# not both lists or dicts: replace with source.
return copy.copy(source)
class ConfigPath:
quoted_string = "(?:\"[^\"]+\")|(?:'[^']+')"
unquoted_string = "[^:'\"]+"

View File

@@ -33,7 +33,7 @@ def validate(configuration_file):
"""
import jsonschema
with open(configuration_file) as f:
with open(configuration_file, encoding="utf-8") as f:
config = syaml.load(f)
# Ensure we have a "container" attribute with sensible defaults set

View File

@@ -27,7 +27,7 @@ def data():
if not _data:
json_dir = os.path.abspath(os.path.dirname(__file__))
json_file = os.path.join(json_dir, "images.json")
with open(json_file) as f:
with open(json_file, encoding="utf-8") as f:
_data = json.load(f)
return _data

View File

@@ -211,7 +211,7 @@ def entries_to_specs(entries):
def read(path, apply_updates):
decode_exception_type = json.decoder.JSONDecodeError
try:
with open(path, "r") as json_file:
with open(path, "r", encoding="utf-8") as json_file:
json_data = json.load(json_file)
jsonschema.validate(json_data, manifest_schema)

View File

@@ -760,7 +760,7 @@ def _read_from_file(self, filename):
Does not do any locking.
"""
try:
with open(filename, "r") as f:
with open(filename, "r", encoding="utf-8") as f:
# In the future we may use a stream of JSON objects, hence `raw_decode` for compat.
fdata, _ = JSONDecoder().raw_decode(f.read())
except Exception as e:
@@ -1031,12 +1031,12 @@ def _write(self, type, value, traceback):
# Write a temporary database file, then move it into place
try:
with open(temp_file, "w") as f:
with open(temp_file, "w", encoding="utf-8") as f:
self._write_to_file(f)
fs.rename(temp_file, self._index_path)
if _use_uuid:
with open(self._verifier_path, "w") as f:
with open(self._verifier_path, "w", encoding="utf-8") as f:
new_verifier = str(uuid.uuid4())
f.write(new_verifier)
self.last_seen_verifier = new_verifier
@@ -1053,7 +1053,7 @@ def _read(self):
current_verifier = ""
if _use_uuid:
try:
with open(self._verifier_path, "r") as f:
with open(self._verifier_path, "r", encoding="utf-8") as f:
current_verifier = f.read()
except BaseException:
pass

View File

@@ -3,7 +3,7 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Data structures that represent Spack's dependency relationships."""
from typing import Dict, List
from typing import Dict, List, Type
import spack.deptypes as dt
import spack.spec
@@ -38,7 +38,7 @@ class Dependency:
def __init__(
self,
pkg: "spack.package_base.PackageBase",
pkg: Type["spack.package_base.PackageBase"],
spec: "spack.spec.Spec",
depflag: dt.DepFlag = dt.DEFAULT,
):

View File

@@ -27,6 +27,7 @@
import spack.config
import spack.error
import spack.operating_systems.windows_os as winOs
import spack.schema
import spack.spec
import spack.util.environment
import spack.util.spack_yaml
@@ -226,7 +227,7 @@ def update_configuration(
pkg_to_cfg[package_name] = pkg_config
pkgs_cfg = spack.config.get("packages", scope=scope)
pkgs_cfg = spack.config.merge_yaml(pkgs_cfg, pkg_to_cfg)
pkgs_cfg = spack.schema.merge_yaml(pkgs_cfg, pkg_to_cfg)
spack.config.set("packages", pkgs_cfg, scope=scope)
return all_new_specs
@@ -246,7 +247,7 @@ def set_virtuals_nonbuildable(virtuals: Set[str], scope: Optional[str] = None) -
# Update the provided scope
spack.config.set(
"packages",
spack.config.merge_yaml(spack.config.get("packages", scope=scope), new_config),
spack.schema.merge_yaml(spack.config.get("packages", scope=scope), new_config),
scope=scope,
)

View File

@@ -198,6 +198,6 @@ def _detection_tests_yaml(
) -> Tuple[pathlib.Path, Dict[str, Any]]:
pkg_dir = pathlib.Path(repository.filename_for_package_name(pkg_name)).parent
detection_tests_yaml = pkg_dir / "detection_test.yaml"
with open(str(detection_tests_yaml)) as f:
with open(str(detection_tests_yaml), encoding="utf-8") as f:
content = spack_yaml.load(f)
return detection_tests_yaml, content

View File

@@ -21,6 +21,7 @@ class OpenMpi(Package):
* ``conflicts``
* ``depends_on``
* ``extends``
* ``license``
* ``patch``
* ``provides``
* ``resource``
@@ -34,12 +35,12 @@ class OpenMpi(Package):
import collections.abc
import os.path
import re
from typing import Any, Callable, List, Optional, Tuple, Union
from typing import Any, Callable, List, Optional, Tuple, Type, Union
import llnl.util.lang
import llnl.util.tty.color
import spack.deptypes as dt
import spack.fetch_strategy
import spack.package_base
import spack.patch
import spack.spec
@@ -47,7 +48,6 @@ class OpenMpi(Package):
import spack.variant
from spack.dependency import Dependency
from spack.directives_meta import DirectiveError, DirectiveMeta
from spack.fetch_strategy import from_kwargs
from spack.resource import Resource
from spack.version import (
GitVersion,
@@ -82,8 +82,8 @@ class OpenMpi(Package):
SpecType = str
DepType = Union[Tuple[str, ...], str]
WhenType = Optional[Union[spack.spec.Spec, str, bool]]
Patcher = Callable[[Union[spack.package_base.PackageBase, Dependency]], None]
PatchesType = Optional[Union[Patcher, str, List[Union[Patcher, str]]]]
Patcher = Callable[[Union[Type[spack.package_base.PackageBase], Dependency]], None]
PatchesType = Union[Patcher, str, List[Union[Patcher, str]]]
SUPPORTED_LANGUAGES = ("fortran", "cxx", "c")
@@ -219,7 +219,7 @@ def version(
return lambda pkg: _execute_version(pkg, ver, **kwargs)
def _execute_version(pkg, ver, **kwargs):
def _execute_version(pkg: Type[spack.package_base.PackageBase], ver: Union[str, int], **kwargs):
if (
(any(s in kwargs for s in spack.util.crypto.hashes) or "checksum" in kwargs)
and hasattr(pkg, "has_code")
@@ -250,12 +250,12 @@ def _execute_version(pkg, ver, **kwargs):
def _depends_on(
pkg: spack.package_base.PackageBase,
pkg: Type[spack.package_base.PackageBase],
spec: spack.spec.Spec,
*,
when: WhenType = None,
type: DepType = dt.DEFAULT_TYPES,
patches: PatchesType = None,
patches: Optional[PatchesType] = None,
):
when_spec = _make_when_spec(when)
if not when_spec:
@@ -297,6 +297,13 @@ def _depends_on(
deps_by_name = pkg.dependencies.setdefault(when_spec, {})
dependency = deps_by_name.get(spec.name)
if spec.dependencies():
raise DirectiveError(
f"the '^' sigil cannot be used in 'depends_on' directives. Please reformulate "
f"the directive below as multiple directives:\n\n"
f'\tdepends_on("{spec}", when="{when_spec}")\n'
)
if not dependency:
dependency = Dependency(pkg, spec, depflag=depflag)
deps_by_name[spec.name] = dependency
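
For illustration with hypothetical spec names: the condition behind the new DirectiveError above.

import spack.spec

# an abstract spec written with the '^' sigil carries dependency edges of its own,
# which is exactly what the guard above rejects when passed to depends_on()
spec = spack.spec.Spec("foo ^bar+shared")
assert spec.dependencies()  # non-empty, so depends_on("foo ^bar+shared") now raises
# the constraint on "bar" must instead be expressed as a separate directive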
@@ -330,7 +337,7 @@ def conflicts(conflict_spec: SpecType, when: WhenType = None, msg: Optional[str]
msg (str): optional user defined message
"""
def _execute_conflicts(pkg: spack.package_base.PackageBase):
def _execute_conflicts(pkg: Type[spack.package_base.PackageBase]):
# If when is not specified the conflict always holds
when_spec = _make_when_spec(when)
if not when_spec:
@@ -349,7 +356,7 @@ def depends_on(
spec: SpecType,
when: WhenType = None,
type: DepType = dt.DEFAULT_TYPES,
patches: PatchesType = None,
patches: Optional[PatchesType] = None,
):
"""Creates a dict of deps with specs defining when they apply.
@@ -371,14 +378,16 @@ def depends_on(
assert type == "build", "languages must be of 'build' type"
return _language(lang_spec_str=spec, when=when)
def _execute_depends_on(pkg: spack.package_base.PackageBase):
def _execute_depends_on(pkg: Type[spack.package_base.PackageBase]):
_depends_on(pkg, dep_spec, when=when, type=type, patches=patches)
return _execute_depends_on
@directive("disable_redistribute")
def redistribute(source=None, binary=None, when: WhenType = None):
def redistribute(
source: Optional[bool] = None, binary: Optional[bool] = None, when: WhenType = None
):
"""Can be used inside a Package definition to declare that
the package source and/or compiled binaries should not be
redistributed.
@@ -393,7 +402,10 @@ def redistribute(source=None, binary=None, when: WhenType = None):
def _execute_redistribute(
pkg: spack.package_base.PackageBase, source=None, binary=None, when: WhenType = None
pkg: Type[spack.package_base.PackageBase],
source: Optional[bool],
binary: Optional[bool],
when: WhenType,
):
if source is None and binary is None:
return
@@ -469,9 +481,7 @@ def provides(*specs: SpecType, when: WhenType = None):
when: condition when this provides clause needs to be considered
"""
def _execute_provides(pkg: spack.package_base.PackageBase):
import spack.parser # Avoid circular dependency
def _execute_provides(pkg: Type[spack.package_base.PackageBase]):
when_spec = _make_when_spec(when)
if not when_spec:
return
@@ -517,7 +527,7 @@ def can_splice(
variants will be skipped by '*'.
"""
def _execute_can_splice(pkg: spack.package_base.PackageBase):
def _execute_can_splice(pkg: Type[spack.package_base.PackageBase]):
when_spec = _make_when_spec(when)
if isinstance(match_variants, str) and match_variants != "*":
raise ValueError(
@@ -558,10 +568,10 @@ def patch(
compressed URL patches)
"""
def _execute_patch(pkg_or_dep: Union[spack.package_base.PackageBase, Dependency]):
pkg = pkg_or_dep
if isinstance(pkg, Dependency):
pkg = pkg.pkg
def _execute_patch(
pkg_or_dep: Union[Type[spack.package_base.PackageBase], Dependency]
) -> None:
pkg = pkg_or_dep.pkg if isinstance(pkg_or_dep, Dependency) else pkg_or_dep
if hasattr(pkg, "has_code") and not pkg.has_code:
raise UnsupportedPackageDirective(
@@ -735,58 +745,55 @@ def _execute_variant(pkg):
@directive("resources")
def resource(**kwargs):
"""Define an external resource to be fetched and staged when building the
package. Based on the keywords present in the dictionary the appropriate
FetchStrategy will be used for the resource. Resources are fetched and
staged in their own folder inside spack stage area, and then moved into
the stage area of the package that needs them.
def resource(
*,
name: Optional[str] = None,
destination: str = "",
placement: Optional[str] = None,
when: WhenType = None,
# additional kwargs are as for `version()`
**kwargs,
):
"""Define an external resource to be fetched and staged when building the package.
Based on the keywords present in the dictionary the appropriate FetchStrategy will
be used for the resource. Resources are fetched and staged in their own folder
inside spack stage area, and then moved into the stage area of the package that
needs them.
List of recognized keywords:
Keyword Arguments:
name: name for the resource
when: condition defining when the resource is needed
destination: path, relative to the package stage area, to which resource should be moved
placement: optionally rename the expanded resource inside the destination directory
* 'when' : (optional) represents the condition upon which the resource is
needed
* 'destination' : (optional) path where to move the resource. This path
must be relative to the main package stage area.
* 'placement' : (optional) gives the possibility to fine tune how the
resource is moved into the main package stage area.
"""
def _execute_resource(pkg):
when = kwargs.get("when")
when_spec = _make_when_spec(when)
if not when_spec:
return
destination = kwargs.get("destination", "")
placement = kwargs.get("placement", None)
# Check if the path is relative
if os.path.isabs(destination):
message = (
"The destination keyword of a resource directive " "can't be an absolute path.\n"
)
message += "\tdestination : '{dest}\n'".format(dest=destination)
raise RuntimeError(message)
msg = "The destination keyword of a resource directive can't be an absolute path.\n"
msg += f"\tdestination : '{destination}\n'"
raise RuntimeError(msg)
# Check if the path falls within the main package stage area
test_path = "stage_folder_root"
normalized_destination = os.path.normpath(
os.path.join(test_path, destination)
) # Normalized absolute path
# Normalized absolute path
normalized_destination = os.path.normpath(os.path.join(test_path, destination))
if test_path not in normalized_destination:
message = (
"The destination folder of a resource must fall "
"within the main package stage directory.\n"
)
message += "\tdestination : '{dest}'\n".format(dest=destination)
raise RuntimeError(message)
msg = "Destination of a resource must be within the package stage directory.\n"
msg += f"\tdestination : '{destination}'\n"
raise RuntimeError(msg)
resources = pkg.resources.setdefault(when_spec, [])
name = kwargs.get("name")
fetcher = from_kwargs(**kwargs)
resources.append(Resource(name, fetcher, destination, placement))
resources.append(
Resource(name, spack.fetch_strategy.from_kwargs(**kwargs), destination, placement)
)
return _execute_resource
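
A hedged usage sketch of the directive with the keyword arguments documented above; the package, URL, checksum, and variant below are hypothetical.

from spack.package import *  # as in regular Spack package files


class Example(Package):
    """Hypothetical package showing the resource() keywords."""

    variant("helper", default=False, description="Also build the optional helper sources")

    # fetch an extra tarball, move it to <stage>/third_party, and rename the
    # expanded directory to "helper"; only needed when the variant is enabled
    resource(
        name="helper",
        url="https://example.com/helper-1.0.tar.gz",
        sha256="0" * 64,  # placeholder checksum
        destination="third_party",
        placement="helper",
        when="+helper",
    )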
@@ -818,7 +825,9 @@ def _execute_maintainer(pkg):
return _execute_maintainer
def _execute_license(pkg, license_identifier: str, when):
def _execute_license(
pkg: Type[spack.package_base.PackageBase], license_identifier: str, when: WhenType
):
# If when is not specified the license always holds
when_spec = _make_when_spec(when)
if not when_spec:
@@ -882,7 +891,7 @@ def requires(*requirement_specs: str, policy="one_of", when=None, msg=None):
msg: optional user defined message
"""
def _execute_requires(pkg: spack.package_base.PackageBase):
def _execute_requires(pkg: Type[spack.package_base.PackageBase]):
if policy not in ("one_of", "any_of"):
err_msg = (
f"the 'policy' argument of the 'requires' directive in {pkg.name} is set "
@@ -907,7 +916,7 @@ def _execute_requires(pkg: spack.package_base.PackageBase):
def _language(lang_spec_str: str, *, when: Optional[Union[str, bool]] = None):
"""Temporary implementation of language virtuals, until compilers are proper dependencies."""
def _execute_languages(pkg: spack.package_base.PackageBase):
def _execute_languages(pkg: Type[spack.package_base.PackageBase]):
when_spec = _make_when_spec(when)
if not when_spec:
return

View File

@@ -5,7 +5,7 @@
import collections.abc
import functools
from typing import List, Set
from typing import Any, Callable, Dict, List, Optional, Sequence, Set, Type, Union
import llnl.util.lang
@@ -25,11 +25,13 @@ class DirectiveMeta(type):
# Set of all known directives
_directive_dict_names: Set[str] = set()
_directives_to_be_executed: List[str] = []
_when_constraints_from_context: List[str] = []
_directives_to_be_executed: List[Callable] = []
_when_constraints_from_context: List[spack.spec.Spec] = []
_default_args: List[dict] = []
def __new__(cls, name, bases, attr_dict):
def __new__(
cls: Type["DirectiveMeta"], name: str, bases: tuple, attr_dict: dict
) -> "DirectiveMeta":
# Initialize the attribute containing the list of directives
# to be executed. Here we go reversed because we want to execute
# commands:
@@ -60,7 +62,7 @@ def __new__(cls, name, bases, attr_dict):
return super(DirectiveMeta, cls).__new__(cls, name, bases, attr_dict)
def __init__(cls, name, bases, attr_dict):
def __init__(cls: "DirectiveMeta", name: str, bases: tuple, attr_dict: dict):
# The instance is being initialized: if it is a package we must ensure
# that the directives are called to set it up.
@@ -81,27 +83,27 @@ def __init__(cls, name, bases, attr_dict):
super(DirectiveMeta, cls).__init__(name, bases, attr_dict)
@staticmethod
def push_to_context(when_spec):
def push_to_context(when_spec: spack.spec.Spec) -> None:
"""Add a spec to the context constraints."""
DirectiveMeta._when_constraints_from_context.append(when_spec)
@staticmethod
def pop_from_context():
def pop_from_context() -> spack.spec.Spec:
"""Pop the last constraint from the context"""
return DirectiveMeta._when_constraints_from_context.pop()
@staticmethod
def push_default_args(default_args):
def push_default_args(default_args: Dict[str, Any]) -> None:
"""Push default arguments"""
DirectiveMeta._default_args.append(default_args)
@staticmethod
def pop_default_args():
def pop_default_args() -> dict:
"""Pop default arguments"""
return DirectiveMeta._default_args.pop()
@staticmethod
def directive(dicts=None):
def directive(dicts: Optional[Union[Sequence[str], str]] = None) -> Callable:
"""Decorator for Spack directives.
Spack directives allow you to modify a package while it is being
@@ -156,7 +158,7 @@ class Foo(Package):
DirectiveMeta._directive_dict_names |= set(dicts)
# This decorator just returns the directive functions
def _decorator(decorated_function):
def _decorator(decorated_function: Callable) -> Callable:
directive_names.append(decorated_function.__name__)
@functools.wraps(decorated_function)

View File

@@ -141,7 +141,7 @@ def relative_path_for_spec(self, spec):
def write_spec(self, spec, path):
"""Write a spec out to a file."""
_check_concrete(spec)
with open(path, "w") as f:
with open(path, "w", encoding="utf-8") as f:
# The hash of the projection is the DAG hash which contains
# the full provenance, so it's available if we want it later
spec.to_json(f, hash=ht.dag_hash)
@@ -153,13 +153,13 @@ def write_host_environment(self, spec):
"""
env_file = self.env_metadata_path(spec)
environ = spack.spec.get_host_environment_metadata()
with open(env_file, "w") as fd:
with open(env_file, "w", encoding="utf-8") as fd:
sjson.dump(environ, fd)
def read_spec(self, path):
"""Read the contents of a file and parse them as a spec"""
try:
with open(path) as f:
with open(path, encoding="utf-8") as f:
extension = os.path.splitext(path)[-1].lower()
if extension == ".json":
spec = spack.spec.Spec.from_json(f)

View File

@@ -482,6 +482,7 @@
display_specs,
environment_dir_from_name,
environment_from_name_or_dir,
environment_path_scopes,
exists,
initialize_environment_dir,
installed_specs,
@@ -518,6 +519,7 @@
"display_specs",
"environment_dir_from_name",
"environment_from_name_or_dir",
"environment_path_scopes",
"exists",
"initialize_environment_dir",
"installed_specs",

View File

@@ -27,7 +27,6 @@
import spack.concretize
import spack.config
import spack.deptypes as dt
import spack.environment
import spack.error
import spack.filesystem_view as fsv
import spack.hash_types as ht
@@ -163,7 +162,7 @@ def installed_specs():
Returns the specs of packages installed in the active environment or None
if no packages are installed.
"""
env = spack.environment.active_environment()
env = active_environment()
hashes = env.all_hashes() if env else None
return spack.store.STORE.db.query(hashes=hashes)
@@ -972,7 +971,7 @@ def _read(self):
self._construct_state_from_manifest()
if os.path.exists(self.lock_path):
with open(self.lock_path) as f:
with open(self.lock_path, encoding="utf-8") as f:
read_lock_version = self._read_lockfile(f)["_meta"]["lockfile-version"]
if read_lock_version == 1:
@@ -1054,7 +1053,7 @@ def _process_concrete_includes(self):
if self.included_concrete_envs:
if os.path.exists(self.lock_path):
with open(self.lock_path) as f:
with open(self.lock_path, encoding="utf-8") as f:
data = self._read_lockfile(f)
if included_concrete_name in data:
@@ -2333,7 +2332,7 @@ def write(self, regenerate: bool = True) -> None:
self.new_specs.clear()
def update_lockfile(self) -> None:
with fs.write_tmp_and_move(self.lock_path) as f:
with fs.write_tmp_and_move(self.lock_path, encoding="utf-8") as f:
sjson.dump(self._to_lockfile_dict(), stream=f)
def ensure_env_directory_exists(self, dot_env: bool = False) -> None:
@@ -2508,7 +2507,7 @@ def update_yaml(manifest, backup_file):
AssertionError: in case anything goes wrong during the update
"""
# Check if the environment needs update
with open(manifest) as f:
with open(manifest, encoding="utf-8") as f:
data = syaml.load(f)
top_level_key = _top_level_key(data)
@@ -2526,7 +2525,7 @@ def update_yaml(manifest, backup_file):
assert not os.path.exists(backup_file), msg.format(backup_file)
shutil.copy(manifest, backup_file)
with open(manifest, "w") as f:
with open(manifest, "w", encoding="utf-8") as f:
syaml.dump_config(data, f)
return True
@@ -2554,7 +2553,7 @@ def is_latest_format(manifest):
manifest (str): manifest file to be analyzed
"""
try:
with open(manifest) as f:
with open(manifest, encoding="utf-8") as f:
data = syaml.load(f)
except (OSError, IOError):
return True
@@ -2656,7 +2655,7 @@ def from_lockfile(manifest_dir: Union[pathlib.Path, str]) -> "EnvironmentManifes
# TBD: Should this be the abspath?
manifest_dir = pathlib.Path(manifest_dir)
lockfile = manifest_dir / lockfile_name
with lockfile.open("r") as f:
with lockfile.open("r", encoding="utf-8") as f:
data = sjson.load(f)
user_specs = data["roots"]
@@ -2683,7 +2682,7 @@ def __init__(self, manifest_dir: Union[pathlib.Path, str], name: Optional[str] =
msg = f"cannot find '{manifest_name}' in {self.manifest_dir}"
raise SpackEnvironmentError(msg)
with self.manifest_file.open() as f:
with self.manifest_file.open(encoding="utf-8") as f:
self.yaml_content = _read_yaml(f)
self.changed = False
@@ -3059,6 +3058,29 @@ def use_config(self):
self.deactivate_config_scope()
def environment_path_scopes(name: str, path: str) -> Optional[List[spack.config.ConfigScope]]:
"""Retrieve the suitably named environment path scopes
Arguments:
name: configuration scope name
path: path to configuration file(s)
Returns: list of environment scopes, if any, or None
"""
if exists(path): # managed environment
manifest = EnvironmentManifestFile(root(path))
elif is_env_dir(path): # anonymous environment
manifest = EnvironmentManifestFile(path)
else:
return None
for scope in manifest.env_config_scopes:
scope.name = f"{name}:{scope.name}"
scope.writable = False
return manifest.env_config_scopes
class SpackEnvironmentError(spack.error.SpackError):
"""Superclass for all errors to do with Spack environments."""

View File

@@ -192,3 +192,10 @@ def __reduce__(self):
def _make_stop_phase(msg, long_msg):
return StopPhase(msg, long_msg)
class MirrorError(SpackError):
"""Superclass of all mirror-creation related errors."""
def __init__(self, msg, long_msg=None):
super().__init__(msg, long_msg)

View File

@@ -5,7 +5,6 @@
"""Service functions and classes to implement the hooks
for Spack's command extensions.
"""
import difflib
import glob
import importlib
import os
@@ -17,7 +16,6 @@
import llnl.util.lang
import spack.cmd
import spack.config
import spack.error
import spack.util.path
@@ -25,9 +23,6 @@
_extension_regexp = re.compile(r"spack-(\w[-\w]*)$")
# TODO: For consistency we should use spack.cmd.python_name(), but
# currently this would create a circular relationship between
# spack.cmd and spack.extensions.
def _python_name(cmd_name):
return cmd_name.replace("-", "_")
@@ -211,8 +206,7 @@ def get_module(cmd_name):
module = load_command_extension(cmd_name, folder)
if module:
return module
else:
raise CommandNotFoundError(cmd_name)
return None
def get_template_dirs():
@@ -224,27 +218,6 @@ def get_template_dirs():
return extensions
class CommandNotFoundError(spack.error.SpackError):
"""Exception class thrown when a requested command is not recognized as
such.
"""
def __init__(self, cmd_name):
msg = (
"{0} is not a recognized Spack command or extension command;"
" check with `spack commands`.".format(cmd_name)
)
long_msg = None
similar = difflib.get_close_matches(cmd_name, spack.cmd.all_commands())
if 1 <= len(similar) <= 5:
long_msg = "\nDid you mean one of the following commands?\n "
long_msg += "\n ".join(similar)
super().__init__(msg, long_msg)
class ExtensionNamingError(spack.error.SpackError):
"""Exception class thrown when a configured extension does not follow
the expected naming convention.

View File

@@ -326,12 +326,12 @@ def __init__(
def write_projections(self):
if self.projections:
mkdirp(os.path.dirname(self.projections_path))
with open(self.projections_path, "w") as f:
with open(self.projections_path, "w", encoding="utf-8") as f:
f.write(s_yaml.dump_config({"projections": self.projections}))
def read_projections(self):
if os.path.exists(self.projections_path):
with open(self.projections_path, "r") as f:
with open(self.projections_path, "r", encoding="utf-8") as f:
projections_data = s_yaml.load(f)
spack.config.validate(projections_data, spack.schema.projections.schema)
return projections_data["projections"]
@@ -429,7 +429,7 @@ def needs_file(spec, file):
self.get_path_meta_folder(spec), spack.store.STORE.layout.manifest_file_name
)
try:
with open(manifest_file, "r") as f:
with open(manifest_file, "r", encoding="utf-8") as f:
manifest = s_json.load(f)
except (OSError, IOError):
# if we can't load it, assume it doesn't know about the file.
@@ -833,7 +833,7 @@ def get_projection_for_spec(self, spec):
#####################
def get_spec_from_file(filename):
try:
with open(filename, "r") as f:
with open(filename, "r", encoding="utf-8") as f:
return spack.spec.Spec.from_yaml(f)
except IOError:
return None

View File

@@ -325,12 +325,7 @@ def write(self, spec, color=None, out=None):
self._out = llnl.util.tty.color.ColorStream(out, color=color)
# We'll traverse the spec in topological order as we graph it.
nodes_in_topological_order = [
edge.spec
for edge in spack.traverse.traverse_edges_topo(
[spec], direction="children", deptype=self.depflag
)
]
nodes_in_topological_order = list(spec.traverse(order="topo", deptype=self.depflag))
nodes_in_topological_order.reverse()
# Work on a copy to be nondestructive

View File

@@ -6,7 +6,7 @@
import llnl.util.tty as tty
import spack.binary_distribution as bindist
import spack.mirror
import spack.mirrors.mirror
def post_install(spec, explicit):
@@ -22,7 +22,7 @@ def post_install(spec, explicit):
return
# Push the package to all autopush mirrors
for mirror in spack.mirror.MirrorCollection(binary=True, autopush=True).values():
for mirror in spack.mirrors.mirror.MirrorCollection(binary=True, autopush=True).values():
signing_key = bindist.select_signing_key() if mirror.signed else None
with bindist.make_uploader(mirror=mirror, force=True, signing_key=signing_key) as uploader:
uploader.push_or_raise([spec])
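A hedged sketch of the mirror filter used by the autopush hook above; only binary mirrors with autopush enabled are selected:

```python
import spack.mirrors.mirror

# Inspect which mirrors the post_install hook would push to (read-only sketch).
for name, mirror in spack.mirrors.mirror.MirrorCollection(binary=True, autopush=True).items():
    print(name, mirror.push_url)
```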

View File

@@ -142,7 +142,7 @@ def write_license_file(pkg, license_path):
os.makedirs(os.path.dirname(license_path))
# Output
with open(license_path, "w") as f:
with open(license_path, "w", encoding="utf-8") as f:
for line in txt.splitlines():
f.write("{0}{1}\n".format(pkg.license_comment, line))
f.close()

View File

@@ -17,13 +17,12 @@
import llnl.util.filesystem as fs
import llnl.util.tty as tty
import llnl.util.tty.log as log
import llnl.util.tty.log
from llnl.string import plural
from llnl.util.lang import nullcontext
from llnl.util.tty.color import colorize
import spack.build_environment
import spack.compilers
import spack.config
import spack.error
import spack.package_base
@@ -51,6 +50,7 @@
ListOrStringType = Union[str, List[str]]
LogType = Union[llnl.util.tty.log.nixlog, llnl.util.tty.log.winlog]
Pb = TypeVar("Pb", bound="spack.package_base.PackageBase")
PackageObjectOrClass = Union[Pb, Type[Pb]]
@@ -81,7 +81,7 @@ def get_escaped_text_output(filename: str) -> List[str]:
Returns:
escaped text lines read from the file
"""
with open(filename) as f:
with open(filename, encoding="utf-8") as f:
# Ensure special characters are escaped as needed
expected = f.read()
@@ -207,6 +207,22 @@ def install_test_root(pkg: Pb):
return os.path.join(pkg.metadata_dir, "test")
def print_message(logger: LogType, msg: str, verbose: bool = False):
"""Print the message to the log, optionally echoing.
Args:
logger: instance of the output logger (e.g. nixlog or winlog)
msg: message being output
verbose: ``True`` displays verbose output, ``False`` suppresses
it (``False`` is default)
"""
if verbose:
with logger.force_echo():
tty.info(msg, format="g")
else:
tty.info(msg, format="g")
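A hedged sketch of how `print_message()` is driven, assuming `tester` is a `spack.install_test.PackageTest` instance for an installed package:

```python
verbose = False
# The logger comes from PackageTest.test_logger(); messages are echoed to the
# terminal only when verbose output was requested.
with tester.test_logger(verbose=verbose, externals=False) as logger:  # `tester` assumed above
    print_message(logger, "Running install-time tests", verbose)
```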
def overall_status(current_status: "TestStatus", substatuses: List["TestStatus"]) -> "TestStatus":
"""Determine the overall status based on the current and associated sub status values.
@@ -253,16 +269,15 @@ def __init__(self, pkg: Pb):
self.test_log_file: str
self.pkg_id: str
if self.pkg.test_suite is not None:
if pkg.test_suite:
# Running stand-alone tests
suite = self.pkg.test_suite
self.test_log_file = suite.log_file_for_spec(pkg.spec) # type: ignore[union-attr]
self.tested_file = suite.tested_file_for_spec(pkg.spec) # type: ignore[union-attr]
self.pkg_id = suite.test_pkg_id(pkg.spec) # type: ignore[union-attr]
self.test_log_file = pkg.test_suite.log_file_for_spec(pkg.spec)
self.tested_file = pkg.test_suite.tested_file_for_spec(pkg.spec)
self.pkg_id = pkg.test_suite.test_pkg_id(pkg.spec)
else:
# Running phase-time tests for a single package whose results are
# retained in the package's stage directory.
self.pkg.test_suite = TestSuite([pkg.spec])
pkg.test_suite = TestSuite([pkg.spec])
self.test_log_file = fs.join_path(pkg.stage.path, spack_install_test_log)
self.pkg_id = pkg.spec.format("{name}-{version}-{hash:7}")
@@ -270,10 +285,10 @@ def __init__(self, pkg: Pb):
self._logger = None
@property
def logger(self) -> Optional[log.LogType]:
def logger(self) -> Optional[LogType]:
"""The current logger or, if none, sets to one."""
if not self._logger:
self._logger = log.log_output(self.test_log_file)
self._logger = llnl.util.tty.log.log_output(self.test_log_file)
return self._logger
@@ -290,7 +305,7 @@ def test_logger(self, verbose: bool = False, externals: bool = False):
fs.touch(self.test_log_file) # Otherwise log_parse complains
fs.set_install_permissions(self.test_log_file)
with log.log_output(self.test_log_file, verbose) as self._logger:
with llnl.util.tty.log.log_output(self.test_log_file, verbose) as self._logger:
with self.logger.force_echo(): # type: ignore[union-attr]
tty.msg("Testing package " + colorize(r"@*g{" + self.pkg_id + r"}"))
@@ -316,13 +331,6 @@ def add_failure(self, exception: Exception, msg: str):
"""Add the failure details to the current list."""
self.test_failures.append((exception, msg))
def set_current_specs(self, base_spec: spack.spec.Spec, test_spec: spack.spec.Spec):
# Ignore union-attr check for test_suite since the constructor of this
# class ensures it is always not None.
test_suite = self.pkg.test_suite
test_suite.current_base_spec = base_spec # type: ignore[union-attr]
test_suite.current_test_spec = test_spec # type: ignore[union-attr]
def status(self, name: str, status: "TestStatus", msg: Optional[str] = None):
"""Track and print the test status for the test part name."""
part_name = f"{self.pkg.__class__.__name__}::{name}"
@@ -344,54 +352,56 @@ def status(self, name: str, status: "TestStatus", msg: Optional[str] = None):
self.test_parts[part_name] = status
self.counts[status] += 1
def handle_failures(self):
"""Raise exception if any failures were collected during testing
def phase_tests(self, builder, phase_name: str, method_names: List[str]):
"""Execute the builder's package phase-time tests.
Raises:
TestFailure: test failures were collected
Args:
builder: builder for package being tested
phase_name: the name of the build-time phase (e.g., ``build``, ``install``)
method_names: phase-specific callback method names
"""
if self.test_failures:
raise TestFailure(self.test_failures)
verbose = tty.is_verbose()
fail_fast = spack.config.get("config:fail_fast", False)
def stand_alone_tests(self, dirty=False, externals=False):
with self.test_logger(verbose=verbose, externals=False) as logger:
# Report running each of the methods in the build log
print_message(logger, f"Running {phase_name}-time tests", verbose)
builder.pkg.test_suite.current_test_spec = builder.pkg.spec
builder.pkg.test_suite.current_base_spec = builder.pkg.spec
have_tests = any(name.startswith("test_") for name in method_names)
if have_tests:
copy_test_files(builder.pkg, builder.pkg.spec)
for name in method_names:
try:
fn = getattr(builder, name, None) or getattr(builder.pkg, name)
except AttributeError as e:
print_message(logger, f"RUN-TESTS: method not implemented [{name}]", verbose)
self.add_failure(e, f"RUN-TESTS: method not implemented [{name}]")
if fail_fast:
break
continue
print_message(logger, f"RUN-TESTS: {phase_name}-time tests [{name}]", verbose)
fn()
if have_tests:
print_message(logger, "Completed testing", verbose)
# Raise any collected failures here
if self.test_failures:
raise TestFailure(self.test_failures)
def stand_alone_tests(self, kwargs):
"""Run the package's stand-alone tests.
Args:
kwargs (dict): arguments to be used by the test process
Raises:
AttributeError: required test_requires_compiler attribute is missing
"""
pkg = self.pkg
spec = pkg.spec
pkg_spec = spec.format("{name}-{version}-{hash:7}")
import spack.build_environment
if not hasattr(pkg, "test_requires_compiler"):
raise AttributeError(
f"Cannot run tests for {pkg_spec}: missing required "
"test_requires_compiler attribute"
)
if pkg.test_requires_compiler:
compilers = spack.compilers.compilers_for_spec(
spec.compiler, arch_spec=spec.architecture
)
if not compilers:
tty.error(
f"Skipping tests for package {pkg_spec}\n"
f"Package test requires missing compiler {spec.compiler}"
)
return
kwargs = {
"dirty": dirty,
"fake": False,
"context": "test",
"externals": externals,
"verbose": tty.is_verbose(),
}
spack.build_environment.start_build_process(pkg, test_process, kwargs)
spack.build_environment.start_build_process(self.pkg, test_process, kwargs)
def parts(self) -> int:
"""The total number of (checked) test parts."""
@@ -448,7 +458,7 @@ def write_tested_status(self):
elif self.counts[TestStatus.PASSED] > 0:
status = TestStatus.PASSED
with open(self.tested_file, "w") as f:
with open(self.tested_file, "w", encoding="utf-8") as f:
f.write(f"{status.value}\n")
@@ -492,7 +502,7 @@ def test_part(pkg: Pb, test_name: str, purpose: str, work_dir: str = ".", verbos
for i, entry in enumerate(stack):
filename, lineno, function, text = entry
if spack.repo.is_package_file(filename):
with open(filename) as f:
with open(filename, encoding="utf-8") as f:
lines = f.readlines()
new_lineno = lineno - 2
text = lines[new_lineno]
@@ -683,9 +693,10 @@ def process_test_parts(pkg: Pb, test_specs: List[spack.spec.Spec], verbose: bool
):
test_fn(pkg)
# If fail-fast was on, we errored out above
# If we collected errors, raise them in batch here
tester.handle_failures()
# If fail-fast was on, we error out above
# If we collect errors, raise them in batch here
if tester.test_failures:
raise TestFailure(tester.test_failures)
finally:
if tester.ran_tests():
@@ -711,12 +722,12 @@ def test_process(pkg: Pb, kwargs):
with pkg.tester.test_logger(verbose, externals) as logger:
if pkg.spec.external and not externals:
log.print_message(logger, "Skipped tests for external package", verbose)
print_message(logger, "Skipped tests for external package", verbose)
pkg.tester.status(pkg.spec.name, TestStatus.SKIPPED)
return
if not pkg.spec.installed:
log.print_message(logger, "Skipped not installed package", verbose)
print_message(logger, "Skipped not installed package", verbose)
pkg.tester.status(pkg.spec.name, TestStatus.SKIPPED)
return
@@ -811,7 +822,7 @@ def get_test_suite(name: str) -> Optional["TestSuite"]:
def write_test_suite_file(suite):
"""Write the test suite to its (JSON) lock file."""
with open(suite.stage.join(test_suite_filename), "w") as f:
with open(suite.stage.join(test_suite_filename), "w", encoding="utf-8") as f:
sjson.dump(suite.to_dict(), stream=f)
@@ -841,7 +852,7 @@ def __init__(self, specs, alias=None):
# even if they contain the same spec
self.specs = [spec.copy() for spec in specs]
self.current_test_spec = None # spec currently tested, can be virtual
self.current_base_spec = None # spec currently running tests
self.current_base_spec = None # spec currently running do_test
self.alias = alias
self._hash = None
@@ -865,10 +876,6 @@ def content_hash(self):
self._hash = b32_hash
return self._hash
def set_current_specs(self, base_spec: spack.spec.Spec, test_spec: spack.spec.Spec):
self.current_base_spec = base_spec
self.current_test_spec = test_spec
def __call__(self, *args, **kwargs):
self.write_reproducibility_data()
@@ -878,16 +885,18 @@ def __call__(self, *args, **kwargs):
externals = kwargs.get("externals", False)
for spec in self.specs:
pkg = spec.package
try:
if pkg.test_suite:
if spec.package.test_suite:
raise TestSuiteSpecError(
f"Package {pkg.name} cannot be run in two test suites at once"
"Package {} cannot be run in two test suites at once".format(
spec.package.name
)
)
# Set up the test suite to know which test is running
pkg.test_suite = self
self.set_current_specs(spec, spec)
spec.package.test_suite = self
self.current_base_spec = spec
self.current_test_spec = spec
# setup per-test directory in the stage dir
test_dir = self.test_dir_for_spec(spec)
@@ -896,7 +905,7 @@ def __call__(self, *args, **kwargs):
fs.mkdirp(test_dir)
# run the package tests
pkg.tester.stand_alone_tests(dirty=dirty, externals=externals)
spec.package.do_test(dirty=dirty, externals=externals)
# Clean up on success
if remove_directory:
@@ -930,7 +939,8 @@ def __call__(self, *args, **kwargs):
finally:
spec.package.test_suite = None
self.set_current_specs(None, None)
self.current_test_spec = None
self.current_base_spec = None
write_test_summary(self.counts)
@@ -967,7 +977,7 @@ def test_status(self, spec: spack.spec.Spec, externals: bool) -> Optional[TestSt
status = TestStatus.NO_TESTS
return status
with open(tests_status_file, "r") as f:
with open(tests_status_file, "r", encoding="utf-8") as f:
value = (f.read()).strip("\n")
return TestStatus(int(value)) if value else TestStatus.NO_TESTS
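A hedged, standalone sketch of the round trip between `write_tested_status()` and `test_status()` shown above, assuming `TestStatus` is the enum defined in this module; the path is illustrative:

```python
from spack.install_test import TestStatus

status_file = "/tmp/results.txt"  # illustrative path
# write_tested_status() stores the integer value of a TestStatus ...
with open(status_file, "w", encoding="utf-8") as f:
    f.write(f"{TestStatus.PASSED.value}\n")

# ... and test_status() parses it back, defaulting to NO_TESTS when empty.
with open(status_file, encoding="utf-8") as f:
    value = f.read().strip("\n")
print(TestStatus(int(value)) if value else TestStatus.NO_TESTS)
```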
@@ -1169,7 +1179,7 @@ def from_file(filename):
BaseException: sjson.SpackJSONError if problem parsing the file
"""
try:
with open(filename) as f:
with open(filename, encoding="utf-8") as f:
data = sjson.load(f)
test_suite = TestSuite.from_dict(data)
content_hash = os.path.basename(os.path.dirname(filename))
@@ -1186,7 +1196,7 @@ def _add_msg_to_file(filename, msg):
filename (str): path to the file
msg (str): message to be appended to the file
"""
with open(filename, "a+") as f:
with open(filename, "a+", encoding="utf-8") as f:
f.write(f"{msg}\n")

View File

@@ -56,7 +56,7 @@
import spack.deptypes as dt
import spack.error
import spack.hooks
import spack.mirror
import spack.mirrors.mirror
import spack.package_base
import spack.package_prefs as prefs
import spack.repo
@@ -105,7 +105,7 @@ def __str__(self):
def _write_timer_json(pkg, timer, cache):
extra_attributes = {"name": pkg.name, "cache": cache, "hash": pkg.spec.dag_hash()}
try:
with open(pkg.times_log_path, "w") as timelog:
with open(pkg.times_log_path, "w", encoding="utf-8") as timelog:
timer.write_json(timelog, extra_attributes=extra_attributes)
except Exception as e:
tty.debug(str(e))
@@ -491,7 +491,7 @@ def _try_install_from_binary_cache(
timer: timer to keep track of binary install phases.
"""
# Early exit if no binary mirrors are configured.
if not spack.mirror.MirrorCollection(binary=True):
if not spack.mirrors.mirror.MirrorCollection(binary=True):
return False
tty.debug(f"Searching for binary cache of {package_id(pkg.spec)}")
@@ -692,7 +692,7 @@ def log(pkg: "spack.package_base.PackageBase") -> None:
if errors.getvalue():
error_file = os.path.join(target_dir, "errors.txt")
fs.mkdirp(target_dir)
with open(error_file, "w") as err:
with open(error_file, "w", encoding="utf-8") as err:
err.write(errors.getvalue())
tty.warn(f"Errors occurred when archiving files.\n\tSee: {error_file}")
@@ -2405,7 +2405,7 @@ def _real_install(self) -> None:
# Save just the changes to the environment. This file can be
# safely installed, since it does not contain secret variables.
with open(pkg.env_mods_path, "w") as env_mods_file:
with open(pkg.env_mods_path, "w", encoding="utf-8") as env_mods_file:
mods = self.env_mods.shell_modifications(explicit=True, env=self.unmodified_env)
env_mods_file.write(mods)
@@ -2414,7 +2414,7 @@ def _real_install(self) -> None:
configure_args = getattr(pkg, attr)()
configure_args = " ".join(configure_args)
with open(pkg.configure_args_path, "w") as args_file:
with open(pkg.configure_args_path, "w", encoding="utf-8") as args_file:
args_file.write(configure_args)
break

View File

@@ -48,7 +48,6 @@
import spack.util.debug
import spack.util.environment
import spack.util.lock
from spack.error import SpackError
#: names of profile statistics
stat_names = pstats.Stats.sort_arg_dict_default
@@ -858,6 +857,33 @@ def resolve_alias(cmd_name: str, cmd: List[str]) -> Tuple[str, List[str]]:
return cmd_name, cmd
def add_command_line_scopes(
cfg: spack.config.Configuration, command_line_scopes: List[str]
) -> None:
"""Add additional scopes from the --config-scope argument, either envs or dirs.
Args:
cfg: configuration instance
command_line_scopes: list of configuration scope paths
Raises:
spack.error.ConfigError: if the path is an invalid configuration scope
"""
for i, path in enumerate(command_line_scopes):
name = f"cmd_scope_{i}"
scopes = ev.environment_path_scopes(name, path)
if scopes is None:
if os.path.isdir(path): # directory with config files
cfg.push_scope(spack.config.DirectoryConfigScope(name, path, writable=False))
spack.config._add_platform_scope(cfg, name, path, writable=False)
continue
else:
raise spack.error.ConfigError(f"Invalid configuration scope: {path}")
for scope in scopes:
cfg.push_scope(scope)
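A hedged call sketch for `add_command_line_scopes()` as defined above; the paths are illustrative stand-ins for repeated `--config-scope` values:

```python
import spack.config

# An environment path and a plain configuration directory can be mixed freely;
# invalid paths raise spack.error.ConfigError as documented above.
add_command_line_scopes(spack.config.CONFIG, ["/path/to/env", "/path/to/extra/configs"])
```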
def _main(argv=None):
"""Logic for the main entry point for the Spack command.
@@ -926,7 +952,7 @@ def _main(argv=None):
# Push scopes from the command line last
if args.config_scopes:
spack.config._add_command_line_scopes(spack.config.CONFIG, args.config_scopes)
add_command_line_scopes(spack.config.CONFIG, args.config_scopes)
spack.config.CONFIG.push_scope(spack.config.InternalConfigScope("command_line"))
setup_main_options(args)
@@ -1012,7 +1038,7 @@ def main(argv=None):
try:
return _main(argv)
except SpackError as e:
except spack.error.SpackError as e:
tty.debug(e)
e.die() # gracefully die on any SpackErrors

View File

@@ -0,0 +1,4 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

View File

@@ -0,0 +1,146 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import os.path
from typing import Optional
import llnl.url
import llnl.util.symlink
from llnl.util.filesystem import mkdirp
import spack.fetch_strategy
import spack.oci.image
import spack.repo
import spack.spec
from spack.error import MirrorError
class MirrorLayout:
"""A ``MirrorLayout`` object describes the relative path of a mirror entry."""
def __init__(self, path: str) -> None:
self.path = path
def __iter__(self):
"""Yield all paths including aliases where the resource can be found."""
yield self.path
def make_alias(self, root: str) -> None:
"""Make the entry ``root / self.path`` available under a human readable alias"""
pass
class DefaultLayout(MirrorLayout):
def __init__(self, alias_path: str, digest_path: Optional[str] = None) -> None:
# When we have a digest, it is used as the primary storage location. If not, then we use
# the human-readable alias. In case of mirrors of a VCS checkout, we currently do not have
# a digest, that's why an alias is required and a digest optional.
super().__init__(path=digest_path or alias_path)
self.alias = alias_path
self.digest_path = digest_path
def make_alias(self, root: str) -> None:
"""Symlink a human readible path in our mirror to the actual storage location."""
# We already use the human-readable path as the main storage location.
if not self.digest_path:
return
alias, digest = os.path.join(root, self.alias), os.path.join(root, self.digest_path)
alias_dir = os.path.dirname(alias)
relative_dst = os.path.relpath(digest, start=alias_dir)
mkdirp(alias_dir)
tmp = f"{alias}.tmp"
llnl.util.symlink.symlink(relative_dst, tmp)
try:
os.rename(tmp, alias)
except OSError:
# Clean up the temporary if possible
try:
os.unlink(tmp)
except OSError:
pass
raise
def __iter__(self):
if self.digest_path:
yield self.digest_path
yield self.alias
class OCILayout(MirrorLayout):
"""Follow the OCI Image Layout Specification to archive blobs where paths are of the form
``blobs/<algorithm>/<digest>``"""
def __init__(self, digest: spack.oci.image.Digest) -> None:
super().__init__(os.path.join("blobs", digest.algorithm, digest.digest))
def _determine_extension(fetcher):
if isinstance(fetcher, spack.fetch_strategy.URLFetchStrategy):
if fetcher.expand_archive:
# If we fetch with a URLFetchStrategy, use URL's archive type
ext = llnl.url.determine_url_file_extension(fetcher.url)
if ext:
# Remove any leading dots
ext = ext.lstrip(".")
else:
msg = """\
Unable to parse extension from {0}.
If this URL is for a tarball but does not include the file extension
in the name, you can explicitly declare it with the following syntax:
version('1.2.3', 'hash', extension='tar.gz')
If this URL is for a download like a .jar or .whl that does not need
to be expanded, or an uncompressed installation script, you can tell
Spack not to expand it with the following syntax:
version('1.2.3', 'hash', expand=False)
"""
raise MirrorError(msg.format(fetcher.url))
else:
# If the archive shouldn't be expanded, don't check extension.
ext = None
else:
# Otherwise we'll make a .tar.gz ourselves
ext = "tar.gz"
return ext
def default_mirror_layout(
fetcher: "spack.fetch_strategy.FetchStrategy",
per_package_ref: str,
spec: Optional["spack.spec.Spec"] = None,
) -> MirrorLayout:
"""Returns a ``MirrorReference`` object which keeps track of the relative
storage path of the resource associated with the specified ``fetcher``."""
ext = None
if spec:
pkg_cls = spack.repo.PATH.get_pkg_class(spec.name)
versions = pkg_cls.versions.get(spec.version, {})
ext = versions.get("extension", None)
# If the spec does not explicitly specify an extension (the default case),
# then try to determine it automatically. An extension can only be
# specified for the primary source of the package (e.g. the source code
# identified in the 'version' declaration). Resources/patches don't have
# an option to specify an extension, so it must be inferred for those.
ext = ext or _determine_extension(fetcher)
if ext:
per_package_ref += ".%s" % ext
global_ref = fetcher.mirror_id()
if global_ref:
global_ref = os.path.join("_source-cache", global_ref)
if global_ref and ext:
global_ref += ".%s" % ext
return DefaultLayout(per_package_ref, global_ref)
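A hedged sketch of how `DefaultLayout` (defined above) resolves storage paths; both paths are illustrative:

```python
# When a digest path is present it becomes the primary storage location and the
# human-readable alias is realized as a relative symlink to it.
layout = DefaultLayout(
    alias_path="zlib/zlib-1.3.tar.gz",
    digest_path="_source-cache/archive/ab/abcd1234.tar.gz",
)
print(layout.path)    # -> the digest path (primary storage)
print(list(layout))   # -> [digest path, alias], in lookup order
layout.make_alias("/path/to/mirror/root")  # creates alias -> digest under that root
```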

View File

@@ -2,42 +2,20 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""
This file contains code for creating spack mirror directories. A
mirror is an organized hierarchy containing specially named archive
files. This enables Spack to know where to find files in a mirror if
the main server for a particular package is down. Or, if the computer
where spack is run is not connected to the internet, it allows spack
to download packages directly from a mirror (e.g., on an intranet).
"""
import collections
import collections.abc
import operator
import os
import os.path
import sys
import traceback
import urllib.parse
from typing import Any, Dict, Optional, Tuple, Union
import llnl.url
import llnl.util.symlink
import llnl.util.tty as tty
from llnl.util.filesystem import mkdirp
import spack.caches
import spack.config
import spack.error
import spack.fetch_strategy
import spack.mirror
import spack.oci.image
import spack.repo
import spack.spec
import spack.util.path
import spack.util.spack_json as sjson
import spack.util.spack_yaml as syaml
import spack.util.url as url_util
import spack.version
from spack.error import MirrorError
#: What schemes do we support
supported_url_schemes = ("file", "http", "https", "sftp", "ftp", "s3", "gs", "oci")
@@ -490,380 +468,3 @@ def __iter__(self):
def __len__(self):
return len(self._mirrors)
def _determine_extension(fetcher):
if isinstance(fetcher, spack.fetch_strategy.URLFetchStrategy):
if fetcher.expand_archive:
# If we fetch with a URLFetchStrategy, use URL's archive type
ext = llnl.url.determine_url_file_extension(fetcher.url)
if ext:
# Remove any leading dots
ext = ext.lstrip(".")
else:
msg = """\
Unable to parse extension from {0}.
If this URL is for a tarball but does not include the file extension
in the name, you can explicitly declare it with the following syntax:
version('1.2.3', 'hash', extension='tar.gz')
If this URL is for a download like a .jar or .whl that does not need
to be expanded, or an uncompressed installation script, you can tell
Spack not to expand it with the following syntax:
version('1.2.3', 'hash', expand=False)
"""
raise MirrorError(msg.format(fetcher.url))
else:
# If the archive shouldn't be expanded, don't check extension.
ext = None
else:
# Otherwise we'll make a .tar.gz ourselves
ext = "tar.gz"
return ext
class MirrorLayout:
"""A ``MirrorLayout`` object describes the relative path of a mirror entry."""
def __init__(self, path: str) -> None:
self.path = path
def __iter__(self):
"""Yield all paths including aliases where the resource can be found."""
yield self.path
def make_alias(self, root: str) -> None:
"""Make the entry ``root / self.path`` available under a human readable alias"""
pass
class DefaultLayout(MirrorLayout):
def __init__(self, alias_path: str, digest_path: Optional[str] = None) -> None:
# When we have a digest, it is used as the primary storage location. If not, then we use
# the human-readable alias. In case of mirrors of a VCS checkout, we currently do not have
# a digest, that's why an alias is required and a digest optional.
super().__init__(path=digest_path or alias_path)
self.alias = alias_path
self.digest_path = digest_path
def make_alias(self, root: str) -> None:
"""Symlink a human readible path in our mirror to the actual storage location."""
# We already use the human-readable path as the main storage location.
if not self.digest_path:
return
alias, digest = os.path.join(root, self.alias), os.path.join(root, self.digest_path)
alias_dir = os.path.dirname(alias)
relative_dst = os.path.relpath(digest, start=alias_dir)
mkdirp(alias_dir)
tmp = f"{alias}.tmp"
llnl.util.symlink.symlink(relative_dst, tmp)
try:
os.rename(tmp, alias)
except OSError:
# Clean up the temporary if possible
try:
os.unlink(tmp)
except OSError:
pass
raise
def __iter__(self):
if self.digest_path:
yield self.digest_path
yield self.alias
class OCILayout(MirrorLayout):
"""Follow the OCI Image Layout Specification to archive blobs where paths are of the form
``blobs/<algorithm>/<digest>``"""
def __init__(self, digest: spack.oci.image.Digest) -> None:
super().__init__(os.path.join("blobs", digest.algorithm, digest.digest))
def default_mirror_layout(
fetcher: "spack.fetch_strategy.FetchStrategy",
per_package_ref: str,
spec: Optional["spack.spec.Spec"] = None,
) -> MirrorLayout:
"""Returns a ``MirrorReference`` object which keeps track of the relative
storage path of the resource associated with the specified ``fetcher``."""
ext = None
if spec:
pkg_cls = spack.repo.PATH.get_pkg_class(spec.name)
versions = pkg_cls.versions.get(spec.version, {})
ext = versions.get("extension", None)
# If the spec does not explicitly specify an extension (the default case),
# then try to determine it automatically. An extension can only be
# specified for the primary source of the package (e.g. the source code
# identified in the 'version' declaration). Resources/patches don't have
# an option to specify an extension, so it must be inferred for those.
ext = ext or _determine_extension(fetcher)
if ext:
per_package_ref += ".%s" % ext
global_ref = fetcher.mirror_id()
if global_ref:
global_ref = os.path.join("_source-cache", global_ref)
if global_ref and ext:
global_ref += ".%s" % ext
return DefaultLayout(per_package_ref, global_ref)
def get_all_versions(specs):
"""Given a set of initial specs, return a new set of specs that includes
each version of each package in the original set.
Note that if any spec in the original set specifies properties other than
version, this information will be omitted in the new set; for example, the
new set of specs will not include variant settings.
"""
version_specs = []
for spec in specs:
pkg_cls = spack.repo.PATH.get_pkg_class(spec.name)
# Skip any package that has no known versions.
if not pkg_cls.versions:
tty.msg("No safe (checksummed) versions for package %s" % pkg_cls.name)
continue
for version in pkg_cls.versions:
version_spec = spack.spec.Spec(pkg_cls.name)
version_spec.versions = spack.version.VersionList([version])
version_specs.append(version_spec)
return version_specs
def get_matching_versions(specs, num_versions=1):
"""Get a spec for EACH known version matching any spec in the list.
For concrete specs, this retrieves the concrete version and, if more
than one version per spec is requested, retrieves the latest versions
of the package.
"""
matching = []
for spec in specs:
pkg = spec.package
# Skip any package that has no known versions.
if not pkg.versions:
tty.msg("No safe (checksummed) versions for package %s" % pkg.name)
continue
pkg_versions = num_versions
version_order = list(reversed(sorted(pkg.versions)))
matching_spec = []
if spec.concrete:
matching_spec.append(spec)
pkg_versions -= 1
if spec.version in version_order:
version_order.remove(spec.version)
for v in version_order:
# Generate no more than num_versions versions for each spec.
if pkg_versions < 1:
break
# Generate only versions that satisfy the spec.
if spec.concrete or v.intersects(spec.versions):
s = spack.spec.Spec(pkg.name)
s.versions = spack.version.VersionList([v])
s.variants = spec.variants.copy()
# This is needed to avoid hanging references during the
# concretization phase
s.variants.spec = s
matching_spec.append(s)
pkg_versions -= 1
if not matching_spec:
tty.warn("No known version matches spec: %s" % spec)
matching.extend(matching_spec)
return matching
def create(path, specs, skip_unstable_versions=False):
"""Create a directory to be used as a spack mirror, and fill it with
package archives.
Arguments:
path: Path to create a mirror directory hierarchy in.
specs: Any package versions matching these specs will be added \
to the mirror.
skip_unstable_versions: if true, this skips adding resources when
they do not have a stable archive checksum (as determined by
``fetch_strategy.stable_target``)
Return Value:
Returns a tuple of lists: (present, mirrored, error)
* present: Package specs that were already present.
* mirrored: Package specs that were successfully mirrored.
* error: Package specs that failed to mirror due to some error.
"""
# automatically spec-ify anything in the specs array.
specs = [s if isinstance(s, spack.spec.Spec) else spack.spec.Spec(s) for s in specs]
mirror_cache, mirror_stats = mirror_cache_and_stats(path, skip_unstable_versions)
for spec in specs:
mirror_stats.next_spec(spec)
create_mirror_from_package_object(spec.package, mirror_cache, mirror_stats)
return mirror_stats.stats()
def mirror_cache_and_stats(path, skip_unstable_versions=False):
"""Return both a mirror cache and a mirror stats, starting from the path
where a mirror ought to be created.
Args:
path (str): path to create a mirror directory hierarchy in.
skip_unstable_versions: if true, this skips adding resources when
they do not have a stable archive checksum (as determined by
``fetch_strategy.stable_target``)
"""
# Get the absolute path of the root before we start jumping around.
if not os.path.isdir(path):
try:
mkdirp(path)
except OSError as e:
raise MirrorError("Cannot create directory '%s':" % path, str(e))
mirror_cache = spack.caches.MirrorCache(path, skip_unstable_versions=skip_unstable_versions)
mirror_stats = MirrorStats()
return mirror_cache, mirror_stats
def add(mirror: Mirror, scope=None):
"""Add a named mirror in the given scope"""
mirrors = spack.config.get("mirrors", scope=scope)
if not mirrors:
mirrors = syaml.syaml_dict()
if mirror.name in mirrors:
tty.die("Mirror with name {} already exists.".format(mirror.name))
items = [(n, u) for n, u in mirrors.items()]
items.insert(0, (mirror.name, mirror.to_dict()))
mirrors = syaml.syaml_dict(items)
spack.config.set("mirrors", mirrors, scope=scope)
def remove(name, scope):
"""Remove the named mirror in the given scope"""
mirrors = spack.config.get("mirrors", scope=scope)
if not mirrors:
mirrors = syaml.syaml_dict()
if name not in mirrors:
tty.die("No mirror with name %s" % name)
mirrors.pop(name)
spack.config.set("mirrors", mirrors, scope=scope)
tty.msg("Removed mirror %s." % name)
class MirrorStats:
def __init__(self):
self.present = {}
self.new = {}
self.errors = set()
self.current_spec = None
self.added_resources = set()
self.existing_resources = set()
def next_spec(self, spec):
self._tally_current_spec()
self.current_spec = spec
def _tally_current_spec(self):
if self.current_spec:
if self.added_resources:
self.new[self.current_spec] = len(self.added_resources)
if self.existing_resources:
self.present[self.current_spec] = len(self.existing_resources)
self.added_resources = set()
self.existing_resources = set()
self.current_spec = None
def stats(self):
self._tally_current_spec()
return list(self.present), list(self.new), list(self.errors)
def already_existed(self, resource):
# If an error occurred after caching a subset of a spec's
# resources, a secondary attempt may consider them already added
if resource not in self.added_resources:
self.existing_resources.add(resource)
def added(self, resource):
self.added_resources.add(resource)
def error(self):
self.errors.add(self.current_spec)
def create_mirror_from_package_object(pkg_obj, mirror_cache, mirror_stats):
"""Add a single package object to a mirror.
The package object is only required to have an associated spec
with a concrete version.
Args:
pkg_obj (spack.package_base.PackageBase): package object to be added.
mirror_cache (spack.caches.MirrorCache): mirror cache the spec will be added to.
mirror_stats (spack.mirror.MirrorStats): statistics on the current mirror
Return:
True if the spec was added successfully, False otherwise
"""
tty.msg("Adding package {} to mirror".format(pkg_obj.spec.format("{name}{@version}")))
num_retries = 3
while num_retries > 0:
try:
# Includes patches and resources
with pkg_obj.stage as pkg_stage:
pkg_stage.cache_mirror(mirror_cache, mirror_stats)
exception = None
break
except Exception as e:
exc_tuple = sys.exc_info()
exception = e
num_retries -= 1
if exception:
if spack.config.get("config:debug"):
traceback.print_exception(file=sys.stderr, *exc_tuple)
else:
tty.warn(
"Error while fetching %s" % pkg_obj.spec.cformat("{name}{@version}"),
getattr(exception, "message", exception),
)
mirror_stats.error()
return False
return True
def require_mirror_name(mirror_name):
"""Find a mirror by name and raise if it does not exist"""
mirror = MirrorCollection().get(mirror_name)
if not mirror:
raise ValueError(f'no mirror named "{mirror_name}"')
return mirror
class MirrorError(spack.error.SpackError):
"""Superclass of all mirror-creation related errors."""
def __init__(self, msg, long_msg=None):
super().__init__(msg, long_msg)

View File

@@ -0,0 +1,258 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import os.path
import traceback
import llnl.util.tty as tty
from llnl.util.filesystem import mkdirp
import spack.caches
import spack.config
import spack.error
import spack.repo
import spack.spec
import spack.util.spack_yaml as syaml
import spack.version
from spack.error import MirrorError
from spack.mirrors.mirror import Mirror, MirrorCollection
def get_all_versions(specs):
"""Given a set of initial specs, return a new set of specs that includes
each version of each package in the original set.
Note that if any spec in the original set specifies properties other than
version, this information will be omitted in the new set; for example, the
new set of specs will not include variant settings.
"""
version_specs = []
for spec in specs:
pkg_cls = spack.repo.PATH.get_pkg_class(spec.name)
# Skip any package that has no known versions.
if not pkg_cls.versions:
tty.msg("No safe (checksummed) versions for package %s" % pkg_cls.name)
continue
for version in pkg_cls.versions:
version_spec = spack.spec.Spec(pkg_cls.name)
version_spec.versions = spack.version.VersionList([version])
version_specs.append(version_spec)
return version_specs
def get_matching_versions(specs, num_versions=1):
"""Get a spec for EACH known version matching any spec in the list.
For concrete specs, this retrieves the concrete version and, if more
than one version per spec is requested, retrieves the latest versions
of the package.
"""
matching = []
for spec in specs:
pkg = spec.package
# Skip any package that has no known versions.
if not pkg.versions:
tty.msg("No safe (checksummed) versions for package %s" % pkg.name)
continue
pkg_versions = num_versions
version_order = list(reversed(sorted(pkg.versions)))
matching_spec = []
if spec.concrete:
matching_spec.append(spec)
pkg_versions -= 1
if spec.version in version_order:
version_order.remove(spec.version)
for v in version_order:
# Generate no more than num_versions versions for each spec.
if pkg_versions < 1:
break
# Generate only versions that satisfy the spec.
if spec.concrete or v.intersects(spec.versions):
s = spack.spec.Spec(pkg.name)
s.versions = spack.version.VersionList([v])
s.variants = spec.variants.copy()
# This is needed to avoid hanging references during the
# concretization phase
s.variants.spec = s
matching_spec.append(s)
pkg_versions -= 1
if not matching_spec:
tty.warn("No known version matches spec: %s" % spec)
matching.extend(matching_spec)
return matching
def create(path, specs, skip_unstable_versions=False):
"""Create a directory to be used as a spack mirror, and fill it with
package archives.
Arguments:
path: Path to create a mirror directory hierarchy in.
specs: Any package versions matching these specs will be added \
to the mirror.
skip_unstable_versions: if true, this skips adding resources when
they do not have a stable archive checksum (as determined by
``fetch_strategy.stable_target``)
Return Value:
Returns a tuple of lists: (present, mirrored, error)
* present: Package specs that were already present.
* mirrored: Package specs that were successfully mirrored.
* error: Package specs that failed to mirror due to some error.
"""
# automatically spec-ify anything in the specs array.
specs = [s if isinstance(s, spack.spec.Spec) else spack.spec.Spec(s) for s in specs]
mirror_cache, mirror_stats = mirror_cache_and_stats(path, skip_unstable_versions)
for spec in specs:
mirror_stats.next_spec(spec)
create_mirror_from_package_object(spec.package, mirror_cache, mirror_stats)
return mirror_stats.stats()
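A hedged sketch of driving `create()`; the spec strings and path are illustrative, and the call would actually fetch sources:

```python
present, mirrored, errors = create("/path/to/mirror", ["zlib", "cmake"])
print(f"{len(mirrored)} mirrored, {len(present)} already present, {len(errors)} errors")
```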
def mirror_cache_and_stats(path, skip_unstable_versions=False):
"""Return both a mirror cache and a mirror stats, starting from the path
where a mirror ought to be created.
Args:
path (str): path to create a mirror directory hierarchy in.
skip_unstable_versions: if true, this skips adding resources when
they do not have a stable archive checksum (as determined by
``fetch_strategy.stable_target``)
"""
# Get the absolute path of the root before we start jumping around.
if not os.path.isdir(path):
try:
mkdirp(path)
except OSError as e:
raise MirrorError("Cannot create directory '%s':" % path, str(e))
mirror_cache = spack.caches.MirrorCache(path, skip_unstable_versions=skip_unstable_versions)
mirror_stats = MirrorStats()
return mirror_cache, mirror_stats
def add(mirror: Mirror, scope=None):
"""Add a named mirror in the given scope"""
mirrors = spack.config.get("mirrors", scope=scope)
if not mirrors:
mirrors = syaml.syaml_dict()
if mirror.name in mirrors:
tty.die("Mirror with name {} already exists.".format(mirror.name))
items = [(n, u) for n, u in mirrors.items()]
items.insert(0, (mirror.name, mirror.to_dict()))
mirrors = syaml.syaml_dict(items)
spack.config.set("mirrors", mirrors, scope=scope)
def remove(name, scope):
"""Remove the named mirror in the given scope"""
mirrors = spack.config.get("mirrors", scope=scope)
if not mirrors:
mirrors = syaml.syaml_dict()
if name not in mirrors:
tty.die("No mirror with name %s" % name)
mirrors.pop(name)
spack.config.set("mirrors", mirrors, scope=scope)
tty.msg("Removed mirror %s." % name)
class MirrorStats:
def __init__(self):
self.present = {}
self.new = {}
self.errors = set()
self.current_spec = None
self.added_resources = set()
self.existing_resources = set()
def next_spec(self, spec):
self._tally_current_spec()
self.current_spec = spec
def _tally_current_spec(self):
if self.current_spec:
if self.added_resources:
self.new[self.current_spec] = len(self.added_resources)
if self.existing_resources:
self.present[self.current_spec] = len(self.existing_resources)
self.added_resources = set()
self.existing_resources = set()
self.current_spec = None
def stats(self):
self._tally_current_spec()
return list(self.present), list(self.new), list(self.errors)
def already_existed(self, resource):
# If an error occurred after caching a subset of a spec's
# resources, a secondary attempt may consider them already added
if resource not in self.added_resources:
self.existing_resources.add(resource)
def added(self, resource):
self.added_resources.add(resource)
def error(self):
self.errors.add(self.current_spec)
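A hedged, standalone sketch of how `MirrorStats` tallies per-spec results; plain strings stand in for Spec and resource objects:

```python
stats = MirrorStats()
stats.next_spec("zlib@1.3")             # start tracking the first spec
stats.added("archive/zlib-1.3.tar.gz")  # one resource newly cached
stats.next_spec("cmake@3.27")           # tallies zlib, then tracks cmake
stats.error()                           # cmake failed to mirror
present, new, errors = stats.stats()
print(present, new, errors)             # -> [] ['zlib@1.3'] ['cmake@3.27']
```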
def create_mirror_from_package_object(
pkg_obj, mirror_cache: "spack.caches.MirrorCache", mirror_stats: MirrorStats
) -> bool:
"""Add a single package object to a mirror.
The package object is only required to have an associated spec
with a concrete version.
Args:
pkg_obj (spack.package_base.PackageBase): package object to be added.
mirror_cache: mirror cache the spec will be added to.
mirror_stats: statistics on the current mirror
Return:
True if the spec was added successfully, False otherwise
"""
tty.msg("Adding package {} to mirror".format(pkg_obj.spec.format("{name}{@version}")))
max_retries = 3
for num_retries in range(max_retries):
try:
# Includes patches and resources
with pkg_obj.stage as pkg_stage:
pkg_stage.cache_mirror(mirror_cache, mirror_stats)
break
except Exception as e:
if num_retries + 1 == max_retries:
if spack.config.get("config:debug"):
traceback.print_exc()
else:
tty.warn(
"Error while fetching %s" % pkg_obj.spec.format("{name}{@version}"), str(e)
)
mirror_stats.error()
return False
return True
def require_mirror_name(mirror_name):
"""Find a mirror by name and raise if it does not exist"""
mirror = MirrorCollection().get(mirror_name)
if not mirror:
raise ValueError(f'no mirror named "{mirror_name}"')
return mirror

View File

@@ -48,6 +48,7 @@
import spack.error
import spack.paths
import spack.projections as proj
import spack.schema
import spack.schema.environment
import spack.spec
import spack.store
@@ -216,7 +217,7 @@ def root_path(name, module_set_name):
roots = spack.config.get(f"modules:{module_set_name}:roots", {})
# Merge config values into the defaults so we prefer configured values
roots = spack.config.merge_yaml(defaults, roots)
roots = spack.schema.merge_yaml(defaults, roots)
path = roots.get(name, os.path.join(spack.paths.share_path, name))
return spack.util.path.canonicalize_path(path)
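A hedged sketch of the `spack.schema.merge_yaml()` call introduced above: values from the second argument override the defaults (the paths are illustrative):

```python
import spack.schema

defaults = {"tcl": "$spack/share/spack/modules", "lmod": "$spack/share/spack/lmod"}
roots = spack.schema.merge_yaml(defaults, {"lmod": "/opt/modules/lmod"})
print(roots["lmod"])  # -> /opt/modules/lmod
```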
@@ -227,7 +228,7 @@ def generate_module_index(root, modules, overwrite=False):
if overwrite or not os.path.exists(index_path):
entries = syaml.syaml_dict()
else:
with open(index_path) as index_file:
with open(index_path, encoding="utf-8") as index_file:
yaml_content = syaml.load(index_file)
entries = yaml_content["module_index"]
@@ -236,7 +237,7 @@ def generate_module_index(root, modules, overwrite=False):
entries[m.spec.dag_hash()] = entry
index = {"module_index": entries}
llnl.util.filesystem.mkdirp(root)
with open(index_path, "w") as index_file:
with open(index_path, "w", encoding="utf-8") as index_file:
syaml.dump(index, default_flow_style=False, stream=index_file)
@@ -256,7 +257,7 @@ def read_module_index(root):
index_path = os.path.join(root, "module-index.yaml")
if not os.path.exists(index_path):
return {}
with open(index_path) as index_file:
with open(index_path, encoding="utf-8") as index_file:
return _read_module_index(index_file)
@@ -605,7 +606,7 @@ def configure_options(self):
return msg
if os.path.exists(pkg.install_configure_args_path):
with open(pkg.install_configure_args_path) as args_file:
with open(pkg.install_configure_args_path, encoding="utf-8") as args_file:
return spack.util.path.padding_filter(args_file.read())
# Returning a false-like value makes the default templates skip
@@ -624,10 +625,10 @@ def environment_modifications(self):
"""List of environment modifications to be processed."""
# Modifications guessed by inspecting the spec prefix
prefix_inspections = syaml.syaml_dict()
spack.config.merge_yaml(
spack.schema.merge_yaml(
prefix_inspections, spack.config.get("modules:prefix_inspections", {})
)
spack.config.merge_yaml(
spack.schema.merge_yaml(
prefix_inspections,
spack.config.get(f"modules:{self.conf.name}:prefix_inspections", {}),
)
@@ -900,7 +901,7 @@ def write(self, overwrite=False):
# Render the template
text = template.render(context)
# Write it to file
with open(self.layout.filename, "w") as f:
with open(self.layout.filename, "w", encoding="utf-8") as f:
f.write(text)
# Set the file permissions of the module to match that of the package
@@ -939,7 +940,7 @@ def update_module_hiddenness(self, remove=False):
if modulerc_exists:
# retrieve modulerc content
with open(modulerc_path) as f:
with open(modulerc_path, encoding="utf-8") as f:
content = f.readlines()
content = "".join(content).split("\n")
# remove last empty item if any
@@ -974,7 +975,7 @@ def update_module_hiddenness(self, remove=False):
elif content != self.modulerc_header:
# ensure file ends with a newline character
content.append("")
with open(modulerc_path, "w") as f:
with open(modulerc_path, "w", encoding="utf-8") as f:
f.write("\n".join(content))
def remove(self):

View File

@@ -7,8 +7,6 @@
import urllib.parse
from typing import Optional, Union
import spack.spec
# notice: Docker is more strict (no uppercase allowed). We parse image names *with* uppercase
# and normalize, so: example.com/Organization/Name -> example.com/organization/name. Tags are
# case sensitive though.
@@ -195,7 +193,7 @@ def __eq__(self, __value: object) -> bool:
)
def _ensure_valid_tag(tag: str) -> str:
def ensure_valid_tag(tag: str) -> str:
"""Ensure a tag is valid for an OCI registry."""
sanitized = re.sub(r"[^\w.-]", "_", tag)
if len(sanitized) > 128:
@@ -203,20 +201,6 @@ def _ensure_valid_tag(tag: str) -> str:
return sanitized
def default_tag(spec: "spack.spec.Spec") -> str:
"""Return a valid, default image tag for a spec."""
return _ensure_valid_tag(f"{spec.name}-{spec.version}-{spec.dag_hash()}.spack")
#: Default OCI index tag
default_index_tag = "index.spack"
def tag_is_spec(tag: str) -> bool:
"""Check if a tag is likely a Spec"""
return tag.endswith(".spack") and tag != default_index_tag
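A standalone sketch of the tag shape produced by `default_tag()` above; the handling of tags longer than 128 characters is elided by the diff, so it is omitted here and the values are illustrative:

```python
import re

def sanitize(tag: str) -> str:
    # Same substitution as ensure_valid_tag(): anything outside [\w.-] becomes "_".
    return re.sub(r"[^\w.-]", "_", tag)

name, version, dag_hash = "zlib", "1.3", "abcdef12"
print(sanitize(f"{name}-{version}-{dag_hash}.spack"))  # -> zlib-1.3-abcdef12.spack
```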
def default_config(architecture: str, os: str):
return {
"architecture": architecture,

View File

@@ -16,7 +16,8 @@
import llnl.util.tty as tty
import spack.fetch_strategy
import spack.mirror
import spack.mirrors.layout
import spack.mirrors.mirror
import spack.oci.opener
import spack.stage
import spack.util.url
@@ -213,7 +214,7 @@ def upload_manifest(
return digest, size
def image_from_mirror(mirror: spack.mirror.Mirror) -> ImageReference:
def image_from_mirror(mirror: spack.mirrors.mirror.Mirror) -> ImageReference:
"""Given an OCI based mirror, extract the URL and image name from it"""
url = mirror.push_url
if not url.startswith("oci://"):
@@ -385,5 +386,8 @@ def make_stage(
# is the `oci-layout` and `index.json` files, which are
# required by the spec.
return spack.stage.Stage(
fetch_strategy, mirror_paths=spack.mirror.OCILayout(digest), name=digest.digest, keep=keep
fetch_strategy,
mirror_paths=spack.mirrors.layout.OCILayout(digest),
name=digest.digest,
keep=keep,
)

View File

@@ -20,8 +20,8 @@
import llnl.util.lang
import spack.config
import spack.mirror
import spack.parser
import spack.mirrors.mirror
import spack.tokenize
import spack.util.web
from .image import ImageReference
@@ -57,7 +57,7 @@ def dispatch_open(fullurl, data=None, timeout=None):
quoted_string = rf'"(?:({qdtext}*)|{quoted_pair})*"'
class TokenType(spack.parser.TokenBase):
class WwwAuthenticateTokens(spack.tokenize.TokenBase):
AUTH_PARAM = rf"({token}){BWS}={BWS}({token}|{quoted_string})"
# TOKEN68 = r"([A-Za-z0-9\-._~+/]+=*)" # todo... support this?
TOKEN = rf"{tchar}+"
@@ -68,9 +68,7 @@ class TokenType(spack.parser.TokenBase):
ANY = r"."
TOKEN_REGEXES = [rf"(?P<{token}>{token.regex})" for token in TokenType]
ALL_TOKENS = re.compile("|".join(TOKEN_REGEXES))
WWW_AUTHENTICATE_TOKENIZER = spack.tokenize.Tokenizer(WwwAuthenticateTokens)
class State(Enum):
@@ -81,18 +79,6 @@ class State(Enum):
AUTH_PARAM_OR_SCHEME = auto()
def tokenize(input: str):
scanner = ALL_TOKENS.scanner(input) # type: ignore[attr-defined]
for match in iter(scanner.match, None): # type: ignore[var-annotated]
yield spack.parser.Token(
TokenType.__members__[match.lastgroup], # type: ignore[attr-defined]
match.group(), # type: ignore[attr-defined]
match.start(), # type: ignore[attr-defined]
match.end(), # type: ignore[attr-defined]
)
class Challenge:
__slots__ = ["scheme", "params"]
@@ -128,7 +114,7 @@ def parse_www_authenticate(input: str):
unquote = lambda s: _unquote(r"\1", s[1:-1])
mode: State = State.CHALLENGE
tokens = tokenize(input)
tokens = WWW_AUTHENTICATE_TOKENIZER.tokenize(input)
current_challenge = Challenge()
@@ -141,36 +127,36 @@ def extract_auth_param(input: str) -> Tuple[str, str]:
return key, value
while True:
token: spack.parser.Token = next(tokens)
token: spack.tokenize.Token = next(tokens)
if mode == State.CHALLENGE:
if token.kind == TokenType.EOF:
if token.kind == WwwAuthenticateTokens.EOF:
raise ValueError(token)
elif token.kind == TokenType.TOKEN:
elif token.kind == WwwAuthenticateTokens.TOKEN:
current_challenge.scheme = token.value
mode = State.AUTH_PARAM_LIST_START
else:
raise ValueError(token)
elif mode == State.AUTH_PARAM_LIST_START:
if token.kind == TokenType.EOF:
if token.kind == WwwAuthenticateTokens.EOF:
challenges.append(current_challenge)
break
elif token.kind == TokenType.COMMA:
elif token.kind == WwwAuthenticateTokens.COMMA:
# Challenge without param list, followed by another challenge.
challenges.append(current_challenge)
current_challenge = Challenge()
mode = State.CHALLENGE
elif token.kind == TokenType.SPACE:
elif token.kind == WwwAuthenticateTokens.SPACE:
# A space means it must be followed by param list
mode = State.AUTH_PARAM
else:
raise ValueError(token)
elif mode == State.AUTH_PARAM:
if token.kind == TokenType.EOF:
if token.kind == WwwAuthenticateTokens.EOF:
raise ValueError(token)
elif token.kind == TokenType.AUTH_PARAM:
elif token.kind == WwwAuthenticateTokens.AUTH_PARAM:
key, value = extract_auth_param(token.value)
current_challenge.params.append((key, value))
mode = State.NEXT_IN_LIST
@@ -178,22 +164,22 @@ def extract_auth_param(input: str) -> Tuple[str, str]:
raise ValueError(token)
elif mode == State.NEXT_IN_LIST:
if token.kind == TokenType.EOF:
if token.kind == WwwAuthenticateTokens.EOF:
challenges.append(current_challenge)
break
elif token.kind == TokenType.COMMA:
elif token.kind == WwwAuthenticateTokens.COMMA:
mode = State.AUTH_PARAM_OR_SCHEME
else:
raise ValueError(token)
elif mode == State.AUTH_PARAM_OR_SCHEME:
if token.kind == TokenType.EOF:
if token.kind == WwwAuthenticateTokens.EOF:
raise ValueError(token)
elif token.kind == TokenType.TOKEN:
elif token.kind == WwwAuthenticateTokens.TOKEN:
challenges.append(current_challenge)
current_challenge = Challenge(token.value)
mode = State.AUTH_PARAM_LIST_START
elif token.kind == TokenType.AUTH_PARAM:
elif token.kind == WwwAuthenticateTokens.AUTH_PARAM:
key, value = extract_auth_param(token.value)
current_challenge.params.append((key, value))
mode = State.NEXT_IN_LIST
@@ -367,11 +353,11 @@ def http_error_401(self, req: Request, fp, code, msg, headers):
def credentials_from_mirrors(
domain: str, *, mirrors: Optional[Iterable[spack.mirror.Mirror]] = None
domain: str, *, mirrors: Optional[Iterable[spack.mirrors.mirror.Mirror]] = None
) -> Optional[UsernamePassword]:
"""Filter out OCI registry credentials from a list of mirrors."""
mirrors = mirrors or spack.mirror.MirrorCollection().values()
mirrors = mirrors or spack.mirrors.mirror.MirrorCollection().values()
for mirror in mirrors:
# Prefer push credentials over fetch. Unlikely that those are different

View File

@@ -11,7 +11,7 @@
from os import chdir, environ, getcwd, makedirs, mkdir, remove, removedirs
from shutil import move, rmtree
from spack.error import InstallError
from spack.error import InstallError, NoHeadersError, NoLibrariesError
# Emulate some shell commands for convenience
env = environ

View File

@@ -24,7 +24,6 @@
import time
import traceback
import typing
import warnings
from typing import Any, Callable, Dict, Iterable, List, Optional, Set, Tuple, Type, TypeVar, Union
import llnl.util.filesystem as fsys
@@ -40,7 +39,8 @@
import spack.error
import spack.fetch_strategy as fs
import spack.hooks
import spack.mirror
import spack.mirrors.layout
import spack.mirrors.mirror
import spack.multimethod
import spack.patch
import spack.phase_callbacks
@@ -54,6 +54,7 @@
import spack.variant
from spack.error import InstallError, NoURLError, PackageError
from spack.filesystem_view import YamlFilesystemView
from spack.resource import Resource
from spack.solver.version_order import concretization_version_order
from spack.stage import DevelopStage, ResourceStage, Stage, StageComposite, compute_stage_name
from spack.util.package_hash import package_hash
@@ -585,6 +586,7 @@ class PackageBase(WindowsRPath, PackageViewMixin, metaclass=PackageMeta):
# Declare versions dictionary as placeholder for values.
# This allows analysis tools to correctly interpret the class attributes.
versions: dict
resources: Dict[spack.spec.Spec, List[Resource]]
dependencies: Dict[spack.spec.Spec, Dict[str, spack.dependency.Dependency]]
conflicts: Dict[spack.spec.Spec, List[Tuple[spack.spec.Spec, Optional[str]]]]
requirements: Dict[
@@ -595,6 +597,7 @@ class PackageBase(WindowsRPath, PackageViewMixin, metaclass=PackageMeta):
patches: Dict[spack.spec.Spec, List[spack.patch.Patch]]
variants: Dict[spack.spec.Spec, Dict[str, spack.variant.Variant]]
languages: Dict[spack.spec.Spec, Set[str]]
licenses: Dict[spack.spec.Spec, str]
splice_specs: Dict[spack.spec.Spec, Tuple[spack.spec.Spec, Union[None, str, List[str]]]]
#: Store whether a given Spec source/binary should not be redistributed.
@@ -712,7 +715,7 @@ class PackageBase(WindowsRPath, PackageViewMixin, metaclass=PackageMeta):
#: are available to build a custom test code.
test_requires_compiler: bool = False
#: The spec's TestSuite instance, which is used to manage its testing.
#: TestSuite instance used to manage stand-alone tests for 1+ specs.
test_suite: Optional[Any] = None
def __init__(self, spec):
@@ -1184,10 +1187,10 @@ def _make_resource_stage(self, root_stage, resource):
root=root_stage,
resource=resource,
name=self._resource_stage(resource),
mirror_paths=spack.mirror.default_mirror_layout(
mirror_paths=spack.mirrors.layout.default_mirror_layout(
resource.fetcher, os.path.join(self.name, pretty_resource_name)
),
mirrors=spack.mirror.MirrorCollection(source=True).values(),
mirrors=spack.mirrors.mirror.MirrorCollection(source=True).values(),
path=self.path,
)
@@ -1199,7 +1202,7 @@ def _make_root_stage(self, fetcher):
# Construct a mirror path (TODO: get this out of package.py)
format_string = "{name}-{version}"
pretty_name = self.spec.format_path(format_string)
mirror_paths = spack.mirror.default_mirror_layout(
mirror_paths = spack.mirrors.layout.default_mirror_layout(
fetcher, os.path.join(self.name, pretty_name), self.spec
)
# Construct a path where the stage should build..
@@ -1208,7 +1211,7 @@ def _make_root_stage(self, fetcher):
stage = Stage(
fetcher,
mirror_paths=mirror_paths,
mirrors=spack.mirror.MirrorCollection(source=True).values(),
mirrors=spack.mirrors.mirror.MirrorCollection(source=True).values(),
name=stage_name,
path=self.path,
search_fn=self._download_search,
@@ -1356,24 +1359,6 @@ def tester(self):
self._tester = spack.install_test.PackageTest(self)
return self._tester
@property
def installed(self):
msg = (
'the "PackageBase.installed" property is deprecated and will be '
'removed in Spack v0.19, use "Spec.installed" instead'
)
warnings.warn(msg)
return self.spec.installed
@property
def installed_upstream(self):
msg = (
'the "PackageBase.installed_upstream" property is deprecated and will '
'be removed in Spack v0.19, use "Spec.installed_upstream" instead'
)
warnings.warn(msg)
return self.spec.installed_upstream
@property
def fetcher(self):
if not self.spec.versions.concrete:
@@ -1751,7 +1736,7 @@ def all_patches(cls):
return patches
def content_hash(self, content=None):
def content_hash(self, content: Optional[bytes] = None) -> str:
"""Create a hash based on the artifacts and patches used to build this package.
This includes:
@@ -1943,6 +1928,29 @@ def _resource_stage(self, resource):
resource_stage_folder = "-".join(pieces)
return resource_stage_folder
def do_test(self, dirty=False, externals=False):
if self.test_requires_compiler:
compilers = spack.compilers.compilers_for_spec(
self.spec.compiler, arch_spec=self.spec.architecture
)
if not compilers:
tty.error(
"Skipping tests for package %s\n"
% self.spec.format("{name}-{version}-{hash:7}")
+ "Package test requires missing compiler %s" % self.spec.compiler
)
return
kwargs = {
"dirty": dirty,
"fake": False,
"context": "test",
"externals": externals,
"verbose": tty.is_verbose(),
}
self.tester.stand_alone_tests(kwargs)
def unit_test_check(self):
"""Hook for unit tests to assert things about package internals.

View File

@@ -40,7 +40,7 @@ def compare_output(current_output, blessed_output):
def compare_output_file(current_output, blessed_output_file):
"""Same as above, but when the blessed output is given as a file."""
with open(blessed_output_file, "r") as f:
with open(blessed_output_file, "r", encoding="utf-8") as f:
blessed_output = f.read()
compare_output(current_output, blessed_output)
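This hunk is one instance of a pattern applied throughout the diff: text-mode open() calls gain an explicit encoding so reads and writes stop depending on the platform default. A tiny stand-alone illustration (the file path is just an example):

import os
import tempfile

path = os.path.join(tempfile.gettempdir(), "blessed-output-example.txt")
# Write and read back with an explicit encoding rather than the locale default.
with open(path, "w", encoding="utf-8") as f:
    f.write("expected output\n")
with open(path, "r", encoding="utf-8") as f:
    blessed_output = f.read()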

View File

@@ -16,7 +16,8 @@
import spack
import spack.error
import spack.fetch_strategy
import spack.mirror
import spack.mirrors.layout
import spack.mirrors.mirror
import spack.repo
import spack.stage
import spack.util.spack_json as sjson
@@ -329,12 +330,12 @@ def stage(self) -> "spack.stage.Stage":
name = "{0}-{1}".format(os.path.basename(self.url), fetch_digest[:7])
per_package_ref = os.path.join(self.owner.split(".")[-1], name)
mirror_ref = spack.mirror.default_mirror_layout(fetcher, per_package_ref)
mirror_ref = spack.mirrors.layout.default_mirror_layout(fetcher, per_package_ref)
self._stage = spack.stage.Stage(
fetcher,
name=f"{spack.stage.stage_prefix}patch-{fetch_digest}",
mirror_paths=mirror_ref,
mirrors=spack.mirror.MirrorCollection(source=True).values(),
mirrors=spack.mirrors.mirror.MirrorCollection(source=True).values(),
)
return self._stage

View File

@@ -13,6 +13,7 @@
import macholib.mach_o
import macholib.MachO
import llnl.util.filesystem as fs
import llnl.util.lang
import llnl.util.tty as tty
from llnl.util.lang import memoized
@@ -275,10 +276,10 @@ def modify_macho_object(cur_path, rpaths, deps, idpath, paths_to_paths):
# Deduplicate and flatten
args = list(itertools.chain.from_iterable(llnl.util.lang.dedupe(args)))
install_name_tool = executable.Executable("install_name_tool")
if args:
args.append(str(cur_path))
install_name_tool = executable.Executable("install_name_tool")
install_name_tool(*args)
with fs.edit_in_place_through_temporary_file(cur_path) as temp_path:
install_name_tool(*args, temp_path)
def macholib_get_paths(cur_path):
@@ -717,8 +718,8 @@ def fixup_macos_rpath(root, filename):
# No fixes needed
return False
args.append(abspath)
executable.Executable("install_name_tool")(*args)
with fs.edit_in_place_through_temporary_file(abspath) as temp_path:
executable.Executable("install_name_tool")(*args, temp_path)
return True
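Both install_name_tool call sites now go through fs.edit_in_place_through_temporary_file, so the binary is rewritten on a scratch copy that only replaces the original once the tool has succeeded. The helper itself is not part of this diff; the following is merely a stdlib sketch of that general pattern, not Spack's implementation:

import contextlib
import os
import shutil
import tempfile


@contextlib.contextmanager
def edit_via_temporary_file(path):
    # Hand the caller a temporary copy of ``path`` and move it back over the
    # original only if the block completed without raising.
    directory = os.path.dirname(os.path.abspath(path))
    fd, tmp_path = tempfile.mkstemp(dir=directory)
    os.close(fd)
    shutil.copy2(path, tmp_path)
    try:
        yield tmp_path
        os.replace(tmp_path, path)
    except BaseException:
        os.unlink(tmp_path)
        raise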

View File

@@ -41,6 +41,7 @@
import spack.provider_index
import spack.spec
import spack.tag
import spack.tengine
import spack.util.file_cache
import spack.util.git
import spack.util.naming as nm
@@ -81,43 +82,6 @@ def namespace_from_fullname(fullname):
return namespace
class _PrependFileLoader(importlib.machinery.SourceFileLoader):
def __init__(self, fullname, path, prepend=None):
super(_PrependFileLoader, self).__init__(fullname, path)
self.prepend = prepend
def path_stats(self, path):
stats = super(_PrependFileLoader, self).path_stats(path)
if self.prepend:
stats["size"] += len(self.prepend) + 1
return stats
def get_data(self, path):
data = super(_PrependFileLoader, self).get_data(path)
if path != self.path or self.prepend is None:
return data
else:
return self.prepend.encode() + b"\n" + data
class RepoLoader(_PrependFileLoader):
"""Loads a Python module associated with a package in specific repository"""
#: Code in ``_package_prepend`` is prepended to imported packages.
#:
#: Spack packages are expected to call `from spack.package import *`
#: themselves, but we are allowing a deprecation period before breaking
#: external repos that don't do this yet.
_package_prepend = "from spack.package import *"
def __init__(self, fullname, repo, package_name):
self.repo = repo
self.package_name = package_name
self.package_py = repo.filename_for_package_name(package_name)
self.fullname = fullname
super().__init__(self.fullname, self.package_py, prepend=self._package_prepend)
class SpackNamespaceLoader:
def create_module(self, spec):
return SpackNamespace(spec.name)
@@ -187,7 +151,8 @@ def compute_loader(self, fullname):
# With 2 nested conditionals we can call "repo.real_name" only once
package_name = repo.real_name(module_name)
if package_name:
return RepoLoader(fullname, repo, package_name)
module_path = repo.filename_for_package_name(package_name)
return importlib.machinery.SourceFileLoader(fullname, module_path)
# We are importing a full namespace like 'spack.pkg.builtin'
if fullname == repo.full_namespace:
@@ -1066,7 +1031,7 @@ def is_prefix(self, fullname: str) -> bool:
def _read_config(self) -> Dict[str, str]:
"""Check for a YAML config file in this db's root directory."""
try:
with open(self.config_file) as reponame_file:
with open(self.config_file, encoding="utf-8") as reponame_file:
yaml_data = syaml.load(reponame_file)
if (
@@ -1400,7 +1365,7 @@ def create_repo(root, namespace=None, subdir=packages_dir_name):
packages_path = os.path.join(root, subdir)
fs.mkdirp(packages_path)
with open(config_path, "w") as config:
with open(config_path, "w", encoding="utf-8") as config:
config.write("repo:\n")
config.write(f" namespace: '{namespace}'\n")
if subdir != packages_dir_name:
@@ -1521,15 +1486,13 @@ def add_package(self, name, dependencies=None):
Both "dep_type" and "condition" can default to ``None`` in which case
``spack.dependency.default_deptype`` and ``spack.spec.Spec()`` are used.
"""
import spack.tengine # avoid circular import
dependencies = dependencies or []
context = {"cls_name": nm.mod_to_class(name), "dependencies": dependencies}
template = spack.tengine.make_environment().get_template("mock-repository/package.pyt")
text = template.render(context)
package_py = self.recipe_filename(name)
fs.mkdirp(os.path.dirname(package_py))
with open(package_py, "w") as f:
with open(package_py, "w", encoding="utf-8") as f:
f.write(text)
def remove(self, name):

View File

@@ -191,9 +191,9 @@ def on_success(self, pkg, kwargs, package_record):
def fetch_log(self, pkg):
try:
if os.path.exists(pkg.install_log_path):
stream = gzip.open(pkg.install_log_path, "rt")
stream = gzip.open(pkg.install_log_path, "rt", encoding="utf-8")
else:
stream = open(pkg.log_path)
stream = open(pkg.log_path, encoding="utf-8")
with stream as f:
return f.read()
except OSError:
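fetch_log above chooses between the gzip-compressed install log and the plain log, now reading both as UTF-8 text. A hypothetical stand-alone reader with the same shape (the function name and the ".gz" sibling convention are illustrative only):

import gzip
import os


def read_build_log(log_path):
    # Prefer the plain log, fall back to a gzip-compressed sibling; read both
    # with an explicit encoding instead of the locale default.
    if os.path.exists(log_path):
        with open(log_path, encoding="utf-8") as f:
            return f.read()
    gz_path = log_path + ".gz"
    if os.path.exists(gz_path):
        with gzip.open(gz_path, "rt", encoding="utf-8") as f:
            return f.read()
    return None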
@@ -204,7 +204,7 @@ def extract_package_from_signature(self, instance, *args, **kwargs):
class TestInfoCollector(InfoCollector):
"""Collect information for the PackageTest.stand_alone_tests method.
"""Collect information for the PackageBase.do_test method.
Args:
specs: specs whose install information will be recorded
@@ -214,7 +214,7 @@ class TestInfoCollector(InfoCollector):
dir: str
def __init__(self, specs: List[spack.spec.Spec], record_directory: str):
super().__init__(spack.install_test.PackageTest, "stand_alone_tests", specs)
super().__init__(spack.package_base.PackageBase, "do_test", specs)
self.dir = record_directory
def on_success(self, pkg, kwargs, package_record):
@@ -233,7 +233,7 @@ def fetch_log(self, pkg: spack.package_base.PackageBase):
return f"Cannot open log for {pkg.spec.cshort_spec}"
def extract_package_from_signature(self, instance, *args, **kwargs):
return instance.pkg
return instance
@contextlib.contextmanager
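The collector hunks retarget the wrapper from PackageTest.stand_alone_tests to PackageBase.do_test. Not the InfoCollector implementation itself, but a minimal sketch of the underlying wrap-a-method-and-record idea (the class patching helper and record fields are invented for illustration):

import functools


def record_calls(cls, method_name, records):
    # Replace cls.method_name with a wrapper that appends one record per call,
    # noting whether the original method succeeded.
    original = getattr(cls, method_name)

    @functools.wraps(original)
    def wrapper(self, *args, **kwargs):
        entry = {"target": getattr(self, "name", repr(self)), "ok": False}
        try:
            result = original(self, *args, **kwargs)
            entry["ok"] = True
            return result
        finally:
            records.append(entry)

    setattr(cls, method_name, wrapper)
    return original  # keep a handle so the patch can be undone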

Some files were not shown because too many files have changed in this diff.