Compare commits

..

156 Commits

Author SHA1 Message Date
Todd Gamblin
0150e965a0 Remove deprecated --spec-file arg from tests
The `spack ci` tests were using the deprecated `--spec-file` argument, which should be
replaced with `--spec`.
2023-12-11 12:33:22 -08:00
Brian Van Essen
5d999d0e4f Add logic to cache the RPATH variables in CachedCMakePackages. (#41417) 2023-12-08 09:27:44 -08:00
Seth R. Johnson
694a1ff340 celeritas: new version 0.4.1 (#41504)
* celeritas: new version 0.4.1

* Mark correct versions as deprecated
2023-12-08 09:56:18 +00:00
Richard Berger
4ec451cfed flecsi: remove ^legion network=gasnet restriction (#41494) 2023-12-07 19:03:04 -07:00
Julien Cortial
a77eca7f88 cdt: Add versions 1.3.6 and 1.4.0 (#41490) 2023-12-07 14:27:39 -07:00
Greg Becker
14ac2b063a cce compiler: remove vestigial compiler names (#41303) 2023-12-07 15:17:03 -06:00
Tamara Dahlgren
edf4d6659d add missing endtime property to CDash (#41498) 2023-12-07 22:06:46 +01:00
Lydéric Debusschère
6531fbf425 py-mpldock: new package (#41316)
* py-mpldock: new package

* py-mpldock: remove version constraint on python

---------

Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2023-12-07 21:24:05 +01:00
Victor Brunini
0a6045eadf Fix cdash reporter time stamps (#38825)
* Fix cdash reporter time stamps (#38818).
   The cdash reporter is created before packages are installed, so save the
   starttime then instead of the endtime.
* Use endtime instead of starttime for the endtime of update

---------

Co-authored-by: Tamara Dahlgren <dahlgren1@llnl.gov>
2023-12-07 19:32:10 +00:00
Todd Gamblin
5722a13af0 Spack mailing list is now announcement-only (#41496)
Participation in the venerable Spack google group has dwindled, though we still have
540+ subscribers there.  I've made the mailing list announcement-only, and I've given
a few maintainers posting privileges.

This PR adds some notes to the README indicating that the mailing list is only for
announcements.
2023-12-07 19:22:14 +00:00
Brian Van Essen
9f1223e7a3 Bugfix spectrum-mpi module generation (#41466)
* Ensure that additional environment variables are set when a module
file is generated.

* Fixed the detection of the opal_prefix / MPI_ROOT field to use ompi_info.
---------

Co-authored-by: Greg Becker <becker33@llnl.gov>
2023-12-07 11:06:07 -08:00
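
A minimal sketch, assuming Spack's standard environment hooks, of how extra variables end up in generated module files; this is not the actual spectrum-mpi recipe, and the values are illustrative (the real change derives `opal_prefix`/`MPI_ROOT` from `ompi_info`).

```python
from spack.package import *


class SpectrumMpiSketch(Package):
    """Hypothetical stand-in used only to illustrate module-file environment setup."""

    def setup_run_environment(self, env):
        # Variables set here are captured when Spack generates module files.
        # Assumption for illustration: the install prefix is the MPI root; the
        # real recipe queries ompi_info for opal_prefix instead.
        env.set("MPI_ROOT", self.prefix)
        env.set("OPAL_PREFIX", self.prefix)
```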
Vijay M
5beef28444 Update homepage URL, add 5.5.1 version, remove bad version hashes, and other minor changes (#41492) 2023-12-07 10:52:06 -08:00
snehring
e618a93f3d iqtree2: add new version 2.2.2.7 and new variant lsd2 (#41467)
* iqtree2: add new version 2.2.2.7 and new variant lsd2
* iqtree2: reorder variant and resource
2023-12-07 11:30:14 -07:00
Garth N. Wells
3f0ec5c580 Update UFCx for v0.7.0. (#41392) 2023-12-07 10:20:34 -08:00
Kyle Knoepfel
14392efc6d Permit shared-library for libbacktrace (#41454) 2023-12-07 10:17:31 -08:00
John W. Parent
d7406aaaa5 CMake: v3.26.6 (#41282) 2023-12-07 10:25:42 -07:00
Harmen Stoppels
5a7e691ae2 freebsd (#41480) 2023-12-07 09:19:55 -08:00
Dave Keeshan
b9f63ab40b opensta: add new package (#41484)
* Add opensta; it allows 2 variants, zlib and cudd, but they are both enabled by default
* Remove unused import, os
2023-12-07 09:18:58 -08:00
Auriane R
4417b1f9ee Update pika package to use f-strings (#41483) 2023-12-07 10:14:33 -07:00
Adam J. Stewart
04f14166cb py-keras: add v3.0.1 (#41486) 2023-12-07 09:05:42 -08:00
Robert Cohn
223a54098e [intel-mkl,intel-ipp,intel-daal]: deprecate packages (#41488) 2023-12-07 09:01:02 -08:00
Satish Balay
ea505e2d26 petsc: add variant +zoltan (#41472) 2023-12-07 10:51:00 -06:00
Ataf Fazledin Ahamed
e2b51e01be traverse.py: use > 0 instead of >= 0 (#41482)
Signed-off-by: fazledyn-or <ataf@openrefactory.com>
2023-12-07 09:38:54 -07:00
jmlapre
a04ee77f77 trilinos: replace pytrilinos2 variant with python (#41435)
* depend_on python

There is an ill-named variant "python" that enables the pytrilinos1
variant.  This made it through our testing but broke on our actual
CI test machines.

* adjust "python" variant based on Trilinos version

For Trilinos <= 14, enable PyTrilinos(1). For later versions
of Trilinos, enable PyTrilinos2.

We still support directly enabling PyTrilinos2 via the "pytrilinos2"
variant.

* remove pytrilinos2 variant

* correct depends_on constraints
2023-12-07 10:28:13 -05:00
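
A hedged sketch of the version-conditional variant wiring described above; the variant and CMake option names are assumptions for illustration, not the exact trilinos package code.

```python
from spack.package import *


class TrilinosSketch(CMakePackage):
    """Hypothetical stand-in illustrating a version-conditional python variant."""

    variant("python", default=False, description="Build PyTrilinos(1) or PyTrilinos2")

    depends_on("python", when="+python", type=("build", "run"))

    def cmake_args(self):
        args = []
        # For Trilinos <= 14, "+python" maps to PyTrilinos(1); for later
        # versions it maps to PyTrilinos2 (option names assumed).
        if self.spec.satisfies("@:14 +python"):
            args.append(self.define("Trilinos_ENABLE_PyTrilinos", True))
        elif self.spec.satisfies("@15: +python"):
            args.append(self.define("Trilinos_ENABLE_PyTrilinos2", True))
        return args
```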
Jordan Galby
bb03ce7281 Do not use depfile in bootstrap (#41458)
- we don't have a fallback if make is not installed
- we assume file system locking works
- we don't verify that make is gnu make (bootstrapping fails on FreeBSD as a result)
- there are some weird race conditions in writing spack.yaml on concurrent spack install
- the view is updated after every package install instead of post environment install.
2023-12-07 10:09:49 +00:00
Massimiliano Culpo
31640652c7 audit: forbid nested dependencies in depends_on declarations (#41428)
Forbid nested dependencies in depends_on declarations, by running an audit in CI.

Fix the packages not passing the new audit:
- amd-aocl
- exago
- palace
- shapemapper
- xsdk-examples

ginkgo: add a commit sha to v1.5.0.glu_experimental
2023-12-07 10:21:01 +01:00
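
For context, a hedged illustration of what a "nested dependency" in a depends_on declaration looks like, with one audit-clean alternative; the package names are placeholders, not taken from the packages fixed above.

```python
from spack.package import *


class NestedDepExample(Package):
    """Hypothetical package illustrating the pattern the new audit forbids."""

    # Forbidden: constraining a dependency's own dependency inside a single
    # depends_on spec.
    # depends_on("foo ^bar +shared")

    # One audit-clean alternative (assuming bar is genuinely needed directly):
    depends_on("foo")
    depends_on("bar +shared")
```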
Samuel Browne
0ff0e8944e trilinos: add v15.0.0 (#41465) 2023-12-07 10:07:02 +01:00
Jack Morrison
a877d812d0 rdma-core: Add new versions 41.5, 42.5, 43.4, 44.4, 45.3, 46.2, 47.1, 49.0 (#41473) 2023-12-07 09:58:06 +01:00
dependabot[bot]
24a59ffd36 build(deps): bump actions/setup-python from 4.8.0 to 5.0.0 (#41474)
Bumps [actions/setup-python](https://github.com/actions/setup-python) from 4.8.0 to 5.0.0.
- [Release notes](https://github.com/actions/setup-python/releases)
- [Commits](b64ffcaf5b...0a5c615913)

---
updated-dependencies:
- dependency-name: actions/setup-python
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-12-07 09:55:55 +01:00
Dave Keeshan
57f46f0375 cudd: add new package (#41476) 2023-12-07 09:37:22 +01:00
Chris Green
7d45e132a6 [root] New variants, checksum changes, sundry improvements (#41463)
* New variants:
  - `tmvz-cpu`
  - `tmvz-gpu`
  - `tmvz-pymva`
  - `tmvz-sofie`

* Improve X-related dependencies.

* Improve TMVA-related dependencies with more specificity.

* Patch possible missing standard header include in Eve7.

* Patch Protobuf handling to support new Protobuf-provided CMake config
  files required to handle transitive `abseil-cpp` dependence.

* Add missing terminal newline to `webgui` patch to remove patch
  warning.

* Handle deprecated/removed build options.

* Handle unwanted system paths in various `PATH`-like environment
  variables.
2023-12-06 18:39:51 -06:00
Vicente Bolea
e7ac676417 paraview: dropping patch since changes exists (#41462) 2023-12-06 16:18:46 -07:00
Adam J. Stewart
94ba152ef5 py-torchmetrics: add v1.2.1 (#41456) 2023-12-06 12:48:28 -07:00
Massimiliano Culpo
5404a5bb82 llvm: reformulate a when condition to avoid tautology (#41461)
The condition on swig can be interpreted as "true if true,
false if false" and gives clingo the option to add swig
or not.

If no other optimization criteria break the tie, then
the concretization is non-deterministic.
2023-12-06 19:12:42 +01:00
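
A hypothetical illustration (not the actual llvm recipe) of a tautological when-condition of the kind described above, plus a deterministic reformulation; the variant name is an assumption.

```python
from spack.package import *


class TautologySketch(Package):
    """Hypothetical package illustrating a tautological conditional dependency."""

    variant("python", default=False, description="Enable bindings that need swig")

    # Tautological: "depend on swig, but only when swig is already in the DAG".
    # The solver may satisfy this by adding swig or by leaving it out, and only
    # unrelated optimization criteria decide which, so concretization can flip.
    # depends_on("swig", when="^swig")

    # Deterministic: tie the dependency to the feature that actually needs it.
    depends_on("swig", when="+python", type="build")
```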
Dave Keeshan
b522d8f610 yosys: add new package (#41416)
* Add the EDA tool yosys to Spack
* Add maintainers
* Move from format to f-strings
2023-12-06 09:47:58 -08:00
Thomas Madlener
2a57c11d28 lcio: add version 2.20.2, sio: add version 0.2 (#41451) 2023-12-06 09:46:18 -08:00
Thomas Madlener
1aa3a641ee edm4hep: Update cmake version dependency for newer versions (#41450) 2023-12-06 09:43:24 -08:00
yizeyi18
6feba1590c nwchem: add libxc/elpa support (#41376)
* added external libxc/elpa choice
* fixed formatting issues and 1 unused variant found by reviewer
* try to fix a string formatting issue
* try to fix some other string formatting issues
* fixed 1 flake8 style issue
* use explicit fftw-api@3
2023-12-06 09:38:46 -08:00
Chris Green
58a7912435 [catch2] Sundry improvements including C++20 support (#41199)
* Add `url_list` to facilitate finding new versions.

* `cxxstd` is not meaningful when `@:2.99.99` as it was a header-only
  package before v3.

* Support C++20/23, remove C++14 support.

* Add @greenc-FNAL to maintainers.

* Add CMake arguments to support testing, build of extras and examples.
2023-12-06 11:07:27 -06:00
yizeyi18
03ae2eb223 dla-future: add a patch (#41409)
* using std::int64_t needs include cstdint in gcc-13

Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>
2023-12-06 16:55:40 +01:00
Julien Cortial
013f0d3a13 Only build tests for proj package if required (#41065)
* Only build tests for proj package if required

Even if tests are not explicitly required to be built, proj builds them
anyway and tries to download Google Test.

* proj: fix name of test activation flag

* proj: Always set test activation flag

* proj: Patch test activation logic for versions 5.x
2023-12-06 09:32:39 -06:00
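
A minimal sketch of the "always set the test toggle" approach described above, assuming proj honors the standard CMake `BUILD_TESTING` switch; the real recipe may use a different flag name plus a version-specific patch.

```python
from spack.package import *


class ProjSketch(CMakePackage):
    """Hypothetical stand-in illustrating explicit test activation."""

    def cmake_args(self):
        # Always pass the toggle: tests (and the Google Test download) are only
        # enabled when Spack itself runs the tests.
        return [self.define("BUILD_TESTING", self.run_tests)]
```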
Eric Berquist
3e68aa0b2f py-pre-commit: add 3.5.0 (#41438) 2023-12-06 09:11:32 -06:00
Gavin John
1da0d0342b Add new versions of py-quast (#40788)
* Add new versions of py-quast

* Update hashes

* Add joblib and simplejson

* Fat finger

* Update dependency type
2023-12-06 09:04:17 -06:00
Lydéric Debusschère
6f7d91aebf py-python-pptx: new package (#41315)
* py-python-pptx: new package

* py-python-pptx: use pil instead of pillow, remove version constraint on python

---------

Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2023-12-06 08:43:51 -06:00
Lydéric Debusschère
071c74d185 py-tldextract: new package (#41330)
* py-tldextract: new package

* py-tldextract: add version 5.1.1

* py-tldextract: fix version constraint on py-setuptools-scm

---------

Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2023-12-06 08:42:48 -06:00
Juan Miguel Carceller
51435d6d69 ruff: add version 0.1.6 (#41355)
* Add a new version of ruff

* Add a comment about where the dependency can be found

---------

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2023-12-06 08:41:20 -06:00
Jordan Galby
8ce110e069 bootstrap: Don't catch Ctrl-C (#41449) 2023-12-06 14:58:14 +01:00
Auriane R
90aee11c33 Add pika 0.21.0 release (#41446) 2023-12-06 03:48:26 -07:00
Harmen Stoppels
4fc73bd7f3 minimal support for freebsd (#41434) 2023-12-06 10:27:22 +00:00
Dom Heinzeller
f7fc4b201d Update py-werkzeug version dependency for py-graphene-tornado@2.6.1 (#41426)
* Update py-werkzeug version dependency for py-graphene-tornado@2.6.1

* Add note on diverging version requirements for py-werkzeug in py-graphene-tornado
2023-12-06 10:28:58 +01:00
Jim Edwards
84999b6996 mpiserial: rework installation (#40762) 2023-12-06 09:39:08 +01:00
Harmen Stoppels
b0f193071d bootstrap status: no bash (#41431) 2023-12-06 09:20:59 +01:00
dependabot[bot]
d1c3374ccb build(deps): bump actions/setup-python from 4.7.1 to 4.8.0 (#41441)
Bumps [actions/setup-python](https://github.com/actions/setup-python) from 4.7.1 to 4.8.0.
- [Release notes](https://github.com/actions/setup-python/releases)
- [Commits](65d7f2d534...b64ffcaf5b)

---
updated-dependencies:
- dependency-name: actions/setup-python
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-12-06 09:20:14 +01:00
Richard Berger
cba8ba0466 legion: correct cuda dependency for cr version (#41119)
* legion: correct cuda dependency for cr version
* Update var/spack/repos/builtin/packages/legion/package.py

---------

Co-authored-by: Davis Herring <herring@lanl.gov>
2023-12-05 18:36:50 -08:00
Wouter Deconinck
d50f8d7b19 root: sha256 change on latest versions (#41401)
* root: sha256 change on latest version
* root: sha256 change on 6.28.10
* root: replace 6.26.12 by 6.26.14
* root: hash for 6.26.14
2023-12-05 18:20:27 -08:00
kjrstory
969fbbfb5a foam-extend: add v5.0 (#40480)
* foam-extend: add new version
* deprecated versions
2023-12-05 18:11:59 -08:00
Robert Cohn
1cd5397b12 [intel-parallel-studio] Deprecate entire package (#41430) 2023-12-05 17:56:49 -08:00
psakievich
1829dbd7b6 CDash: Spack dumps stage errors to configure phase (#41436) 2023-12-05 22:05:39 +00:00
Philippe Virouleau
9e3b231e6f julia: fix LLVM patches hashes (#41410) 2023-12-05 22:53:40 +01:00
Alec Scott
5911a677d4 py-gidgethub: add new package (#41286)
* py-gidgethub: add new package

* Add main branch version and scope flit/flit-core dependency

* Update var/spack/repos/builtin/packages/py-gidgethub/package.py

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>

* Add optional dependencies as variants of package

* Add git url for main version

* Fix variant and dependency ordering

---------

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>
2023-12-05 21:26:42 +01:00
Alec Scott
bb60bb4f7a py-gidgetlab: add new package (#41338)
* gidgetlab: add new package

* Convert both cachetools and aiohttp to optional deps with variants

* Fix forgotten variant conditional on cachetools dependency

* Add git url and main version for dev workflows

* Fix variant and dependency ordering

* Remove cachetools variant and merge dependency with aiohttp variant
2023-12-05 21:24:38 +01:00
Victoria Cherkas
ddec75315e Add maintainers to fdb, eckit, ecbuild, metkit, eccodes (#41433)
* Add maintainers to fdb
* Add maintainers to eckit
* Add maintainers to metkit
* Add maintainers to eccodes
* Add maintainers to ecbuild
* Add climbfuji to eccodes maintainers
2023-12-05 13:18:41 -07:00
Veselin Dobrev
8bcb1f8766 MFEM: Add a patch to fix the +gslib+shared+miniapps build (#41399)
* [mfem] Add a patch to resolve issue #41382

* [mfem] Spack CI wants "patch URL must end with ?full_index=1"
2023-12-05 10:26:48 -08:00
Matthew Thompson
5a0ac4ba94 Update versions of GFE packages (#41429) 2023-12-05 10:24:51 -07:00
Lydéric Debusschère
673689d53b py-sphinx: add versions 7.2.4, 7.2.5 and 7.2.6 (#41411)
Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2023-12-05 15:07:22 +01:00
Brian Vanderwende
ace8e17f02 libdap4: add explicit RPC dependency (#40019) 2023-12-05 13:11:19 +01:00
Billae
eb9c63541a documentation: add instructions on how to use external opengl (#40987) 2023-12-05 12:59:41 +01:00
Kensuke WATANABE
b9f4d9f6fc sirius: fix build error with Fujitsu compiler (#41101) 2023-12-05 12:54:03 +01:00
Lydéric Debusschère
eda3522ce8 py-sphinxcontrib-moderncmakedomain: add new package (#41331)
Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2023-12-05 12:47:25 +01:00
Harmen Stoppels
3cefd73fcc spack buildcache check: use same interface as push (#41378) 2023-12-05 12:44:50 +01:00
Alberto Invernizzi
3547bcb517 openfoam-org: fix for being able to build manually checked out repository (#41000)
* grep WM_PROJECT_VERSION from etc/bashrc

This fixes the problem when building from a manually checked out repo
which might have a different version wrt the one defined in the spack
package (e.g. anything later than 5.0 is known as 5.x by the build
system)

* patch applies to just 5.0, in newer versions it is already addressed

In `5.20171030` there's a commit

c66fba323c

very similar (almost identical) to what the patch `50-etc.patch` does.

So the patch should not be applied to versions other than `5.0`, otherwise it errors.

References:
- https://github.com/OpenFOAM/OpenFOAM-5.x/commits/20171030/etc/bashrc
- 197d9d3bf2/etc/bashrc (L45-L47)
2023-12-05 12:37:57 +01:00
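
A hedged sketch of grepping `WM_PROJECT_VERSION` out of `etc/bashrc`; the helper name and regular expression are illustrative, not the exact openfoam-org package code.

```python
import re


def wm_project_version(bashrc_text: str):
    """Return the WM_PROJECT_VERSION exported by an OpenFOAM etc/bashrc, if any."""
    match = re.search(r"^\s*export\s+WM_PROJECT_VERSION=(\S+)", bashrc_text, re.MULTILINE)
    return match.group(1) if match else None


# A manually checked out repo may report "5.x" even though the Spack package
# calls the corresponding release "5.0".
print(wm_project_version("export WM_PROJECT_VERSION=5.x\n"))  # -> 5.x
```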
Todd Gamblin
53b528f649 bugfix: sort variants in spack info --variants-by-name (#41389)
This was missed while backporting the new `spack info` command from #40326.

Variants should be sorted by name when invoking `spack info --variants-by-name`.
2023-12-05 12:31:40 +01:00
Mark W. Krentel
798770f9e5 hpctoolkit: add conflict for recent intel-xed (#41413)
Intel made an incompatible change in XED in 2023.08.21 that breaks
hpctoolkit (at run time).  Hpctoolkit develop can adapt (soon will),
but older versions must use xed :2023.07.09.
2023-12-05 11:17:37 +01:00
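
A hedged sketch of how such a runtime incompatibility is usually expressed in a recipe; the version bounds are assumptions based on the message above, and the real hpctoolkit conflict may be phrased differently.

```python
from spack.package import *


class HpctoolkitSketch(Package):
    """Hypothetical stand-in illustrating the intel-xed compatibility constraint."""

    depends_on("intel-xed")

    # Older releases break at run time with XED 2023.08.21 or newer, so pin
    # them to xed@:2023.07.09 (bounds assumed from the commit message).
    conflicts("^intel-xed@2023.08.21:", when="@:2023.06")
```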
Jim Edwards
4a920243a0 cprnc: update sha256 for github artifacts (#41418) 2023-12-05 11:13:14 +01:00
Ben Wibking
8727195b84 openblas: fix macOS build when using XCode 15 or newer (#41420)
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2023-12-05 09:53:54 +01:00
Massimiliano Culpo
456f2ca40f extensions: improve docs, fix unit-tests (#41425) 2023-12-05 09:49:35 +01:00
dependabot[bot]
b4258aaa25 build(deps): bump docker/metadata-action from 5.2.0 to 5.3.0 (#41423)
Bumps [docker/metadata-action](https://github.com/docker/metadata-action) from 5.2.0 to 5.3.0.
- [Release notes](https://github.com/docker/metadata-action/releases)
- [Commits](e6428a5c4e...31cebacef4)

---
updated-dependencies:
- dependency-name: docker/metadata-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-12-05 09:48:29 +01:00
Mitch B
5d9647544a arrow: add versions up to v14.0.1 (#41424) 2023-12-05 09:48:03 +01:00
Brian Van Essen
1fdb6a3e7e Updating the LBANN, Hydrogen, and DiHydrogen recipes (#41390)
* Updating the LBANN, Hydrogen, and DiHydrogen recipes for both new
variants and to make sure that RPATHs are properly set up.

Co-authored-by: bvanessen <bvanessen@users.noreply.github.com>
2023-12-05 09:31:51 +01:00
Robert Cohn
7c77b3a4b2 [intel] deprecate all versions (#41412)
Deprecating the intel package, which contains the Intel classic compilers. This package has not been updated in 3 years; please use intel-oneapi-compilers instead.
2023-12-04 21:52:55 -07:00
Ye Luo
eb4b8292b6 A few changes to quantum-espresso (#41225)
* gipaw.x installed by cmake if version >= 5c4a4ce.
  gipaw.x will only be installed with cmake if the qe-gipaw version
  is >= 5c4a4ce. Currently, QE source uses the older f5823521 one.
  Here is a patch to the submodule_commit_hash_records to use a newer
  qe-gipaw version.
* Update package.py
* Delete var/spack/repos/builtin/packages/quantum-espresso/gipaw-eccee44.patch
* Update package.py
* Restoring gipaw-eccee44 patch
* Update package.py
* Add fox variant in quantum-espresso
* Fix an issue introduced in #36484. Patches are 7.1 only.
* Change plugin handling.
* formatting.
* Typo correction
* Refine conflict

---------

Co-authored-by: S. Alexis Paz <alexis.paz@gmail.com>
2023-12-04 11:37:05 -08:00
John Biddiscombe
16bc58ea49 Add EGL support to ParaView and Glew (#39800)
* Add EGL support to ParaView and Glew

add a package for egl that provides GL but also adds
EGL libs and headers for projects that need them

Fix a header problem with the opengl package

Format files using black

* better description for egl variant description

Co-authored-by: Vicente Bolea <vicente.bolea@gmail.com>

* better check/setup of non egl variant dependencies

Co-authored-by: Vicente Bolea <vicente.bolea@gmail.com>

* Add biddisco as maintainer

* Fix unused var style warning

* Add egl conflicts for other gl providers

---------

Co-authored-by: Vicente Bolea <vicente.bolea@gmail.com>
2023-12-04 10:53:06 -06:00
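
A minimal sketch, under stated assumptions, of a package that satisfies the `gl` virtual while also exposing EGL libraries, as the change above describes; the class body and library names are illustrative.

```python
from spack.package import *


class EglSketch(BundlePackage):
    """Hypothetical stand-in for a system EGL that also satisfies the gl virtual."""

    # Packages asking for the gl virtual can be satisfied by this package,
    # while packages that need EGL specifically can depend on it directly.
    provides("gl")

    @property
    def libs(self):
        # Assumption for illustration: EGL/GL libraries live under self.prefix.
        return find_libraries(["libEGL", "libOpenGL"], root=self.prefix, recursive=True)
```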
Alec Scott
6028ce8bc1 direnv: add v2.33.0 (#41397) 2023-12-04 14:12:13 +01:00
Harmen Stoppels
349e7e4c37 zlib-ng: add v2.1.5 (#41402) 2023-12-04 12:44:10 +01:00
Harmen Stoppels
a982118c1f ci.py: fix missing import (#41391) 2023-12-04 12:14:59 +01:00
Adam J. Stewart
40d12ed7e2 PythonPackage: type hints (#40539)
* PythonPackage: nested config_settings, type hints

* No need to quote PythonPackage

* Use narrower types for now until needed
2023-12-04 10:53:53 +00:00
James Smillie
9e0720207a Windows: fix kit base path and reference to windows registry key (#41388)
* Proper handling of argument passed as semicolon-separated str
* Fix reference to windows registry key in win-wdk
2023-12-03 15:35:13 -08:00
Jaelyn Litzinger
88e738c343 Allow exago to use hiop@develop past v1.0.1 (#41384) 2023-12-02 14:06:22 -06:00
Cameron Rutherford
8bbc2e2ade resolve: add package with cuda and rocm support (#40871) 2023-12-01 20:49:11 -06:00
Julien Cortial
1509e54435 Add MUMPS versions 5.6.0, 5.6.1 and 5.6.2 (#41386)
The patch for version 5.5.x still applies to 5.6.x.
2023-12-01 18:48:00 -07:00
Dom Heinzeller
ca164d6619 Fix curl install using Intel compilers (#41380)
When using Intel to build curl, add 'CFLAGS=-we147' to the configure
args to fix error 'compiler does not halt on function prototype
mismatch'
2023-12-01 17:24:05 -07:00
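
A hedged sketch of the workaround above, assuming an Autotools-style recipe; only the Intel-specific flag comes from the commit message.

```python
from spack.package import *


class CurlSketch(AutotoolsPackage):
    """Hypothetical stand-in illustrating the Intel-specific CFLAGS workaround."""

    def configure_args(self):
        args = []
        # With Intel classic compilers, -we147 turns the prototype-mismatch
        # remark into an error so configure detects the failure properly.
        if self.spec.satisfies("%intel"):
            args.append("CFLAGS=-we147")
        return args
```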
Felix Werner
a632576231 Add XCDF. (#41379) 2023-12-01 14:56:18 -08:00
Felix Werner
70b16cfb59 Add PhotoSpline. (#41374) 2023-12-01 14:44:53 -08:00
Brian Vanderwende
1d89d4dc13 MET fixes for 11.1 and HDF4 support (#41372)
* MET fixes for 11.1 and HDF4 support
* Fix zlib reference in MET
2023-12-01 14:42:36 -08:00
Dewi
bc8a0f56ed removed cmake build version pointing to fork (#41368) 2023-12-01 14:39:45 -08:00
Jack Morrison
4e09396f8a Libfabric: Introduce OPX provider conflict for v1.20.0 (#41343)
* Libfabric: Introduce OPX provider conflict for v1.20.0
* Add message to libfabric 1.20.0 opx provider conflict
2023-12-01 14:35:54 -08:00
Weiqun Zhang
0d488c6e4f amrex: add v23.12 (#41385) 2023-12-01 22:45:58 +01:00
Erik Heeren
50e76bc3d3 py-pyglet: version bump (#41082)
* py-pyglet: version bump

* py-pyglet: use zip instead of whl, update dependencies

* py-pyglet: 2.0.9 and 2.0.10 zips should be downloaded from github

* py-pyglet: style

* py-pyglet: use virtual packages in dependencies

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>

* py-pyglet: doesn't depend on py-future any more

* py-pyglet: remove glx dependency

* py-pyglet: back to the pypi zipfiles with patch instead

---------

Co-authored-by: Manuela Kuhn <36827019+manuelakuhn@users.noreply.github.com>
2023-12-01 21:44:30 +01:00
Dave Keeshan
a543fd79f1 verible: add new package (#41270)
* Add initial version of verible to spack
* Update to use an explicit url path for each release, as the release tag includes extra data; also added the lowest supported gcc version, gcc9
2023-12-01 10:48:52 -08:00
Adam J. Stewart
ab50aa61db py-keras: add v3.0.0 (#41356)
* py-keras: add v3.0.0

* Older keras actually requires protobuf

* Correct url_for_version

* Capitalization is important

* Keep pil and pydot deps
2023-12-01 18:23:58 +00:00
Richard Berger
d06a102e69 Various FleCSI updates (#41068)
* flecsi: remove deprecated versions
* flecsi: add explicit conflict for backend=hpx +hdf5
* flecsi: propagate +openmp to kokkos and legion
* flecsi: remove doc variant prior to @2.2
   It wouldn't do anything meaningful and won't install the documentation.

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2023-12-01 09:41:33 -08:00
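
A hedged sketch of propagating a variant to dependencies, as in the flecsi change above; the variant spellings on kokkos and legion are assumptions.

```python
from spack.package import *


class FlecsiSketch(CMakePackage):
    """Hypothetical stand-in illustrating variant propagation to backends."""

    variant("openmp", default=False, description="Enable OpenMP")

    # Propagate +openmp so the whole backend stack is built consistently.
    depends_on("kokkos +openmp", when="+openmp")
    depends_on("legion +openmp", when="+openmp")
```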
Rohit Goswami
b0f0d2f1fb dftbplus: Update and add upstream maintainer (#33243)
* dftbp: Update and add upstream maintainer
* dftbp: Trust in the hybrid cmake builds
* dftbp: Handle scalapack better
* dftbp: Refactor as per review
* dftbp: Build shared for python
* dftbp: Address review comments
* dftbp: Add another maintainer
* dftp: Fix typo
* dftbp: Arpack for serial builds only
* dftbp: Update option docs
* dftbp: Update documentation for elsi
* dftbp: Add comment for context
* dftbp: Tighter bounds on python
* dftbp: Add negf only when shared
* dftbp: Fix typo
* dftbp: Update sha256
* dftpb: Add when directive for cmake and ninja
* dftbp: Enforce comment

---------

Co-authored-by: Tamara Dahlgren <dahlgren1@llnl.gov>
Co-authored-by: awvwgk <awvwgk@users.noreply.github.com>
Co-authored-by: iamashwin99 <iamashwin99@users.noreply.github.com>
Co-authored-by: Ashwin Kumar Karnad <46030335+iamashwin99@users.noreply.github.com>
Co-authored-by: Sebastian Ehlert <28669218+awvwgk@users.noreply.github.com>
Co-authored-by: tldahlgren <tldahlgren@users.noreply.github.com>
2023-12-01 09:34:11 -08:00
Victoria Cherkas
6029b600f0 eccodes: add v2.32.0, v2.31.0 (#40770)
* eccodes new versions and dependencies
* Suggested changes for multiple variant defaults
* Update var/spack/repos/builtin/packages/eccodes/package.py

---------

Co-authored-by: Sergey Kosukhin <skosukhin@gmail.com>
2023-12-01 09:25:31 -08:00
dependabot[bot]
e6107e336c build(deps): bump docutils from 0.18.1 to 0.20.1 in /lib/spack/docs (#38174)
Bumps [docutils](https://docutils.sourceforge.io/) from 0.18.1 to 0.20.1.

---
updated-dependencies:
- dependency-name: docutils
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-12-01 17:16:34 +01:00
Lydéric Debusschère
9ef57c3c86 py-cma: new package (#41326)
* py-pycma: new package

* rename py-pycma in py-cma; py-cma: use pypi instead of github sources

---------

Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2023-12-01 08:54:02 -06:00
Lydéric Debusschère
3b021bb4ac py-mahotas: new package (#41329)
* py-mahotas: new package

* py-mahotas: relax version constraint on numpy

---------

Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2023-12-01 08:52:37 -06:00
Richard Berger
42bac83c2e LAMMPS updates (#40879)
* lammps: add new stable version 20230802.1

* lammps: add missing potential download for +mesont

* lammps: fix python package install

* Update var/spack/repos/builtin/packages/lammps/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* lammps: py-numpy and py-mpi4py should be build and run deps

* lammps: add new 20231121 release

- MPIIO package has been removed -> disable mpiio variant
- LAMMPS_EXCEPTIONS is now always on -> disable exceptions variant
- CMake 3.16+ is now required
- Kokkos 4.1.0 is now supported

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-12-01 07:13:56 -07:00
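
A hedged sketch of how the 20231121 bullets above can translate into recipe constraints; the version bounds and variant names follow the bullets but are not the exact lammps package code.

```python
from spack.package import *


class LammpsSketch(CMakePackage):
    """Hypothetical stand-in illustrating version-gated variants and requirements."""

    # The MPIIO package was removed and LAMMPS_EXCEPTIONS is always on as of
    # 20231121, so only offer those variants for earlier releases.
    variant("mpiio", default=True, description="Enable MPIIO", when="@:20230802.1")
    variant("exceptions", default=False, description="LAMMPS_EXCEPTIONS", when="@:20230802.1")

    # Newer releases require a newer CMake.
    depends_on("cmake@3.16:", when="@20231121:", type="build")
```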
Lydéric Debusschère
7e38e9e515 py-xmlplain: new package (#41324)
Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2023-12-01 08:02:24 -06:00
Lydéric Debusschère
cf5ffedc23 py-requests-file: new package (#41328)
Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2023-12-01 08:00:23 -06:00
Lydéric Debusschère
3a9aea753d [add] py-autodocsumm: new package (#41309)
Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2023-12-01 07:53:57 -06:00
Lydéric Debusschère
c61da8381c py-sphinx-jinja2-compat: new package (#41310)
* [add] py-sphinx-jinja2-compat: new package

* py-whey-pth: new package, dependence of py-sphinx-jinja2-compat

---------

Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2023-12-01 07:53:02 -06:00
Lydéric Debusschère
8de814eddf py-sphinx-removed-in: new package (#41325)
* py-sphinx-removed-in: new package

* py-sphinx-removed-in: fix dependence

---------

Co-authored-by: LydDeb <lyderic.debusschere@eolen.com>
2023-12-01 07:43:40 -06:00
Satish Balay
c9341a2532 petsc, py-petsc4py: add v3.20.2 (#41366) 2023-12-01 07:42:48 -06:00
Manuela Kuhn
8b202b3fb2 py-urllib3: add 2.1.0 and 2.0.7 (#41358) 2023-12-01 07:39:18 -06:00
Dom Heinzeller
2ecc260e0e Update py-cylc-flow (add version 8.2.3) (#41209)
* Add missing runtime dependency on py-colorama to py-ansimarkup

* Add py-metomi-isodatetime@3.1.0

* New package py-graphql-relay

* Update py-cylc-flow, add version 8.2.3

* Fix merge conflict

* Revert mistake in var/spack/repos/builtin/packages/py-cylc-flow/package.py

* Update py-metomi-isodatetime dependencies for py-cylc-flow

* Add 'climbfuji' to list of maintainers for py-cylc-flow
2023-12-01 07:33:34 -06:00
Christopher Christofi
6130fe8f57 py-beartype: new package with versions 0.15.0 and 0.16.2 (#39759)
* py-beartype: new package with version 0.15.0

* Update var/spack/repos/builtin/packages/py-beartype/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* py-beartype: depend on python 3.8 or higher

* py-beartype: add new version 0.16.2

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2023-12-01 07:29:30 -06:00
Adam J. Stewart
d3aa7a620e py-mpi4py: fix build with Apple Clang (#41362)
* py-mpi4py: fix build with Apple Clang

* [@spackbot] updating style on behalf of adamjstewart

---------

Co-authored-by: adamjstewart <adamjstewart@users.noreply.github.com>
2023-12-01 05:33:14 -06:00
dependabot[bot]
2794e14870 build(deps): bump pygments from 2.17.1 to 2.17.2 in /lib/spack/docs (#41212)
Bumps [pygments](https://github.com/pygments/pygments) from 2.17.1 to 2.17.2.
- [Release notes](https://github.com/pygments/pygments/releases)
- [Changelog](https://github.com/pygments/pygments/blob/2.17.2/CHANGES)
- [Commits](https://github.com/pygments/pygments/compare/2.17.1...2.17.2)

---
updated-dependencies:
- dependency-name: pygments
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-12-01 12:29:42 +01:00
dependabot[bot]
cc1e990c7e build(deps): bump sphinx-rtd-theme in /lib/spack/docs (#41305)
Bumps [sphinx-rtd-theme](https://github.com/readthedocs/sphinx_rtd_theme) from 1.3.0 to 2.0.0.
- [Changelog](https://github.com/readthedocs/sphinx_rtd_theme/blob/master/docs/changelog.rst)
- [Commits](https://github.com/readthedocs/sphinx_rtd_theme/compare/1.3.0...2.0.0)

---
updated-dependencies:
- dependency-name: sphinx-rtd-theme
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-12-01 12:29:22 +01:00
dependabot[bot]
59e6b0b100 build(deps): bump mypy from 1.7.0 to 1.7.1 in /lib/spack/docs (#41243)
Bumps [mypy](https://github.com/python/mypy) from 1.7.0 to 1.7.1.
- [Changelog](https://github.com/python/mypy/blob/master/CHANGELOG.md)
- [Commits](https://github.com/python/mypy/compare/v1.7.0...v1.7.1)

---
updated-dependencies:
- dependency-name: mypy
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-12-01 09:22:23 +01:00
dependabot[bot]
ec53d02814 build(deps): bump docker/metadata-action from 5.0.0 to 5.2.0 (#41371)
Bumps [docker/metadata-action](https://github.com/docker/metadata-action) from 5.0.0 to 5.2.0.
- [Release notes](https://github.com/docker/metadata-action/releases)
- [Commits](96383f4557...e6428a5c4e)

---
updated-dependencies:
- dependency-name: docker/metadata-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-12-01 09:21:56 +01:00
dependabot[bot]
389c77cf83 build(deps): bump mypy from 1.6.1 to 1.7.1 in /.github/workflows/style (#41242)
Bumps [mypy](https://github.com/python/mypy) from 1.6.1 to 1.7.1.
- [Changelog](https://github.com/python/mypy/blob/master/CHANGELOG.md)
- [Commits](https://github.com/python/mypy/compare/v1.6.1...v1.7.1)

---
updated-dependencies:
- dependency-name: mypy
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-12-01 09:20:56 +01:00
Vicente Bolea
17c87b4c29 vtk-m: bump vtk-m 2.1.0 (#41351)
* vtk-m: bump vtk-m 2.1.0

* Update package.py

* Update package.py
2023-11-30 20:18:15 -08:00
Thomas Helfer
91453c5ba0 Add support for new versions of TFEL and MGIS (#41357)
* Add new versions to TFEL and MGIS
2023-11-30 19:47:57 -07:00
Robert Cohn
a587a10c86 [intel-mpi] deprecation (#41322) 2023-11-30 19:52:42 -05:00
Derek Ryan Strong
cfe77fcd90 r: add license and missing versions and fix rmath build directory (#41260)
* Add R license and missing versions
* Fix rmath build directory
2023-11-30 16:18:02 -08:00
Matthieu Dorier
6cf36a1817 mochi-margo: added version 0.15.0 (#41319) 2023-11-30 15:32:24 -08:00
Matthieu Dorier
ee40cfa830 mochi-thallium: adding a few new versions (#41323)
* mochi-thallium: added a few newer versions
* mochi-thallium: added constraint on the version of margo required
* removed thallium 0.12 which needs updating
* mochi-thallium: fixed hash for version 0.12.0
* removed thallium 0.12 which needs updating (again)
2023-11-30 15:19:32 -08:00
Harmen Stoppels
04f64d4ac6 tests: use temporary_store (#41369) 2023-11-30 15:11:35 -08:00
jmlapre
779fef7d41 trilinos: new pytrilinos2 variant (#40615) 2023-11-30 14:08:58 -07:00
Richard Berger
5be3ca396b Singularity-EOS update (#41333)
* singularity-eos: deprecate v1.6 versions and remove unused code
* singularity-eos: add v1.8.0
2023-11-30 12:55:25 -08:00
Dave Keeshan
e420441bc2 Fix flex for build and link, limit gcc to 7 or greater (#41335) 2023-11-30 12:51:48 -08:00
Dave Keeshan
a039dc16fa Move compiler renaming to filter_compiler_wrappers (#41336) 2023-11-30 12:48:59 -08:00
Christopher Christofi
b31c89b110 perl-carp-assert: add new package with version 0.22 (#41347)
* perl-carp-assert: add new package with version 0.22
* fix style
2023-11-30 12:46:24 -08:00
Christopher Christofi
bc4f3d6cbd perl-parse-yapp: add new package with version 1.21 (#41348) 2023-11-30 12:37:26 -08:00
Matthias Wolf
40209506b7 acfl: truncate version number (#41354)
When using `spack external find acfl`, we get the full version string
with 4 components in `packages.yaml`. This PR truncates the version
number when finding the `armpl` component so it can run without
intervention.
2023-11-30 12:32:10 -08:00
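
A small sketch of the truncation described above using Spack's version API; the concrete version string is made up for illustration.

```python
from spack.version import Version

# `spack external find acfl` reports a 4-component version; matching the armpl
# component only needs the first three components.
full = Version("23.04.1.2")  # illustrative value
print(full.up_to(3))  # -> 23.04.1
```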
Massimiliano Culpo
6ff07c7753 Fix issue with latest mypy (#41363) 2023-11-30 20:31:03 +01:00
Adam J. Stewart
d874c6d79c py-tensorflow-estimator: add new versions (#41364) 2023-11-30 10:54:11 -08:00
Adam J. Stewart
927e739e0a GDAL: add v3.8.1 (#41365) 2023-11-30 10:51:42 -08:00
Tom Scogland
dd607d11d5 developer tools stack try 2 (#40921)
* developer tools stack try 2

This version is actually in use locally and has largely stabilized, at
least on x86.  Some packages are still a challenge on ppc64le, but maybe
worth keeping this working as a set.

* add packages, try to get container with newer gcc

* remove reuse: true

* try to get cmake to build on medium, 25 minutes is too long

* add lsd package and add to dev tools stack

* clean up fzf dependency and sorting

* Update share/spack/gitlab/cloud_pipelines/stacks/developer_tools/spack.yaml

* cuda: add 12.3.0 (#40827)

* Switch to dashes

* yet more underscores

---------

Co-authored-by: Paul R. C. Kent <kentpr@ornl.gov>
2023-11-30 18:32:21 +00:00
Harmen Stoppels
d436e97fc6 reuse concretization: allow externals from remote when locally configured (#35975)
This looks to me like the best compromise regarding externals in a
build cache. I wouldn't want `spack install` on my machine to install
specs that were marked external on another. At the same time there are
centers that control the target systems on which spack is used, and
would want to use externals in buildcaches.

As a solution, reuse concretization will now consider those externals
used in buildcaches that match a locally configured external in
packages.yaml.

So for example person A installs and pushes specs with this config:

```yaml
packages:
  ncurses:
    externals:
    - spec: ncurses@6.0.12345 +feature
      prefix: /usr
```

and person B concretizes and installs using that buildcache with the
following config:

```yaml
packages:
  ncurses:
    externals:
    - spec: ncurses@6
      prefix: /usr
```

the spec will be reused (or rather, will be considered for reuse...)
2023-11-30 09:38:05 -08:00
Harmen Stoppels
f3983d60c2 tests: add missing mutable db (#41359) 2023-11-30 18:37:35 +01:00
Harmen Stoppels
40e705d39e tests: fix side effects of default_config fixture (#41361)
* tests: default_config drop scope

* use default_config elsewhere

* use parse_install_tree for missing defaults in default config
2023-11-30 18:19:10 +01:00
Harmen Stoppels
d92457467a test_variant_propagation_with_unify_false: missing fixture (#41345) 2023-11-30 17:27:53 +01:00
Harmen Stoppels
4c2734fe14 --scope: lazy defaults (#41353) 2023-11-30 15:35:21 +01:00
Taillefumier Mathieu
34d791189d Update SIRIUS version for CP2K master (#41264)
* Update SIRIUS version for CP2K master

* Update var/spack/repos/builtin/packages/cp2k/package.py

Co-authored-by: Rocco Meli <r.meli@bluemail.ch>

---------

Co-authored-by: Rocco Meli <r.meli@bluemail.ch>
2023-11-30 04:08:22 -07:00
Vinícius
eec9eced1b simgrid: add v3.34 and v3.35 (#41340) 2023-11-30 11:23:57 +01:00
Christopher Christofi
3bc8a7aa5f use double quotes where spack style finds errors (#41349) 2023-11-30 09:46:02 +01:00
Massimiliano Culpo
3b045c289d Fix a typo in an integrity constraint (#41334) 2023-11-30 09:44:51 +01:00
Harmen Stoppels
4b93c57d44 argparse: make scope choices lazy s.t. validation in tests works (#41344) 2023-11-30 08:37:11 +01:00
Harmen Stoppels
377e7de0d2 tests: fix issue with os.environ binding (#41342) 2023-11-30 07:18:41 +01:00
Greg Sjaardema
0a9c84dd25 SEACAS: new release (#41273)
Replace the last release due to build issues on Windows. The tag was applied incorrectly.
2023-11-29 21:38:05 -07:00
Chris White
2ececcd03e MFEM: add mpi link dir (#41337)
Also fix netcdf-c zlib reference
2023-11-29 21:32:50 -07:00
Adam J. Stewart
f4f67adf49 py-numba: add v0.58.1 (#41262)
* py-numba: add v0.58.1

* Passing tests
2023-11-30 00:17:34 +01:00
Adam J. Stewart
220898b4de py-pygeos: add v0.14 (#41248)
* py-pygeos: add v0.14

* Python is also a link dep
2023-11-30 00:15:07 +01:00
Luc Berger
450f938056 kokkos: add v4.2.00 (#41203)
* Kokkos: adding version 4.2.00 to the package
* Kokkos: adding AMD GPU arch
* kokkos@4.2.00 +sycl: patch numeric traits unit test

---------

Co-authored-by: eugeneswalker <eugenesunsetwalker@gmail.com>
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2023-11-29 22:47:15 +00:00
270 changed files with 3260 additions and 1344 deletions

View File

@@ -23,7 +23,7 @@ jobs:
operating_system: ["ubuntu-latest", "macos-latest"]
steps:
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236 # @v2
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c # @v2
with:
python-version: ${{inputs.python_version}}
- name: Install Python packages

View File

@@ -159,7 +159,7 @@ jobs:
brew install cmake bison@2.7 tree
- name: Checkout
uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236 # @v2
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c # @v2
with:
python-version: "3.12"
- name: Bootstrap clingo

View File

@@ -57,7 +57,7 @@ jobs:
- name: Checkout
uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
- uses: docker/metadata-action@96383f45573cb7f253c731d3b3ab81c87ef81934
- uses: docker/metadata-action@31cebacef4805868f9ce9a0cb03ee36c32df2ac4
id: docker_meta
with:
images: |

View File

@@ -17,7 +17,7 @@ jobs:
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
with:
fetch-depth: 0
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c
with:
python-version: 3.9
- name: Install Python packages

View File

@@ -2,6 +2,6 @@ black==23.11.0
clingo==5.6.2
flake8==6.1.0
isort==5.12.0
mypy==1.6.1
mypy==1.7.1
types-six==1.16.21.9
vermin==1.6.0

View File

@@ -54,7 +54,7 @@ jobs:
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236 # @v2
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c # @v2
with:
python-version: ${{ matrix.python-version }}
- name: Install System packages
@@ -101,7 +101,7 @@ jobs:
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236 # @v2
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c # @v2
with:
python-version: '3.11'
- name: Install System packages
@@ -159,7 +159,7 @@ jobs:
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236 # @v2
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c # @v2
with:
python-version: '3.11'
- name: Install System packages
@@ -194,7 +194,7 @@ jobs:
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # @v2
with:
fetch-depth: 0
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236 # @v2
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c # @v2
with:
python-version: ${{ matrix.python-version }}
- name: Install Python packages

View File

@@ -19,7 +19,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c
with:
python-version: '3.11'
cache: 'pip'
@@ -38,7 +38,7 @@ jobs:
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
with:
fetch-depth: 0
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c
with:
python-version: '3.11'
cache: 'pip'

View File

@@ -18,7 +18,7 @@ jobs:
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
with:
fetch-depth: 0
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c
with:
python-version: 3.9
- name: Install Python packages
@@ -42,7 +42,7 @@ jobs:
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
with:
fetch-depth: 0
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c
with:
python-version: 3.9
- name: Install Python packages
@@ -66,7 +66,7 @@ jobs:
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11
with:
fetch-depth: 0
- uses: actions/setup-python@65d7f2d534ac1bc67fcd62888c5f4f3d2cb2b236
- uses: actions/setup-python@0a5c61591373683505ea898e09a3ea4f39ef2b9c
with:
python-version: 3.9
- name: Install Python packages

View File

@@ -66,10 +66,11 @@ Resources:
* **Matrix space**: [#spack-space:matrix.org](https://matrix.to/#/#spack-space:matrix.org):
[bridged](https://github.com/matrix-org/matrix-appservice-slack#matrix-appservice-slack) to Slack.
* [**Github Discussions**](https://github.com/spack/spack/discussions):
not just for discussions, but also Q&A.
* **Mailing list**: [groups.google.com/d/forum/spack](https://groups.google.com/d/forum/spack)
for Q&A and discussions. Note the pinned discussions for announcements.
* **Twitter**: [@spackpm](https://twitter.com/spackpm). Be sure to
`@mention` us!
* **Mailing list**: [groups.google.com/d/forum/spack](https://groups.google.com/d/forum/spack):
only for announcements. Please use other venues for discussions.
Contributing
------------------------

View File

@@ -82,7 +82,7 @@ class already contains:
.. code-block:: python
depends_on('cmake', type='build')
depends_on("cmake", type="build")
If you need to specify a particular version requirement, you can
@@ -90,7 +90,7 @@ override this in your package:
.. code-block:: python
depends_on('cmake@2.8.12:', type='build')
depends_on("cmake@2.8.12:", type="build")
^^^^^^^^^^^^^^^^^^^
@@ -137,10 +137,10 @@ and without the :meth:`~spack.build_systems.cmake.CMakeBuilder.define` and
def cmake_args(self):
args = [
'-DWHATEVER:STRING=somevalue',
self.define('ENABLE_BROKEN_FEATURE', False),
self.define_from_variant('DETECT_HDF5', 'hdf5'),
self.define_from_variant('THREADS'), # True if +threads
"-DWHATEVER:STRING=somevalue",
self.define("ENABLE_BROKEN_FEATURE", False),
self.define_from_variant("DETECT_HDF5", "hdf5"),
self.define_from_variant("THREADS"), # True if +threads
]
return args
@@ -151,10 +151,10 @@ and CMake simply ignores the empty command line argument. For example the follow
.. code-block:: python
variant('example', default=True, when='@2.0:')
variant("example", default=True, when="@2.0:")
def cmake_args(self):
return [self.define_from_variant('EXAMPLE', 'example')]
return [self.define_from_variant("EXAMPLE", "example")]
will generate ``'cmake' '-DEXAMPLE=ON' ...`` when `@2.0: +example` is met, but will
result in ``'cmake' '' ...`` when the spec version is below ``2.0``.
@@ -193,9 +193,9 @@ a variant to control this:
.. code-block:: python
variant('build_type', default='RelWithDebInfo',
description='CMake build type',
values=('Debug', 'Release', 'RelWithDebInfo', 'MinSizeRel'))
variant("build_type", default="RelWithDebInfo",
description="CMake build type",
values=("Debug", "Release", "RelWithDebInfo", "MinSizeRel"))
However, not every CMake package accepts all four of these options.
Grep the ``CMakeLists.txt`` file to see if the default values are
@@ -205,9 +205,9 @@ package overrides the default variant with:
.. code-block:: python
variant('build_type', default='DebugRelease',
description='The build type to build',
values=('Debug', 'Release', 'DebugRelease'))
variant("build_type", default="DebugRelease",
description="The build type to build",
values=("Debug", "Release", "DebugRelease"))
For more information on ``CMAKE_BUILD_TYPE``, see:
https://cmake.org/cmake/help/latest/variable/CMAKE_BUILD_TYPE.html
@@ -250,7 +250,7 @@ generator is Ninja. To switch to the Ninja generator, simply add:
.. code-block:: python
generator = 'Ninja'
generator = "Ninja"
``CMakePackage`` defaults to "Unix Makefiles". If you switch to the
@@ -258,7 +258,7 @@ Ninja generator, make sure to add:
.. code-block:: python
depends_on('ninja', type='build')
depends_on("ninja", type="build")
to the package as well. Aside from that, you shouldn't need to do
anything else. Spack will automatically detect that you are using
@@ -288,7 +288,7 @@ like so:
.. code-block:: python
root_cmakelists_dir = 'src'
root_cmakelists_dir = "src"
Note that this path is relative to the root of the extracted tarball,
@@ -304,7 +304,7 @@ different sub-directory, simply override ``build_directory`` like so:
.. code-block:: python
build_directory = 'my-build'
build_directory = "my-build"
^^^^^^^^^^^^^^^^^^^^^^^^^
Build and install targets
@@ -324,8 +324,8 @@ library or build the documentation, you can add these like so:
.. code-block:: python
build_targets = ['all', 'docs']
install_targets = ['install', 'docs']
build_targets = ["all", "docs"]
install_targets = ["install", "docs"]
^^^^^^^
Testing

View File

@@ -934,9 +934,9 @@ a *virtual* ``mkl`` package is declared in Spack.
.. code-block:: python
# Examples for absolute and conditional dependencies:
depends_on('mkl')
depends_on('mkl', when='+mkl')
depends_on('mkl', when='fftw=mkl')
depends_on("mkl")
depends_on("mkl", when="+mkl")
depends_on("mkl", when="fftw=mkl")
The ``MKLROOT`` environment variable (part of the documented API) will be set
during all stages of client package installation, and is available to both
@@ -972,8 +972,8 @@ a *virtual* ``mkl`` package is declared in Spack.
def configure_args(self):
args = []
...
args.append('--with-blas=%s' % self.spec['blas'].libs.ld_flags)
args.append('--with-lapack=%s' % self.spec['lapack'].libs.ld_flags)
args.append("--with-blas=%s" % self.spec["blas"].libs.ld_flags)
args.append("--with-lapack=%s" % self.spec["lapack"].libs.ld_flags)
...
.. tip::
@@ -989,13 +989,13 @@ a *virtual* ``mkl`` package is declared in Spack.
.. code-block:: python
self.spec['blas'].headers.include_flags
self.spec["blas"].headers.include_flags
and to generate linker options (``-L<dir> -llibname ...``), use the same as above,
.. code-block:: python
self.spec['blas'].libs.ld_flags
self.spec["blas"].libs.ld_flags
See
:ref:`MakefilePackage <makefilepackage>`

View File

@@ -88,7 +88,7 @@ override the ``luarocks_args`` method like so:
.. code-block:: python
def luarocks_args(self):
return ['flag1', 'flag2']
return ["flag1", "flag2"]
One common use of this is to override warnings or flags for newer compilers, as in:

View File

@@ -48,8 +48,8 @@ class automatically adds the following dependencies:
.. code-block:: python
depends_on('java', type=('build', 'run'))
depends_on('maven', type='build')
depends_on("java", type=("build", "run"))
depends_on("maven", type="build")
In the ``pom.xml`` file, you may see sections like:
@@ -72,8 +72,8 @@ should add:
.. code-block:: python
depends_on('java@7:', type='build')
depends_on('maven@3.5.4:', type='build')
depends_on("java@7:", type="build")
depends_on("maven@3.5.4:", type="build")
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -88,9 +88,9 @@ the build phase. For example:
def build_args(self):
return [
'-Pdist,native',
'-Dtar',
'-Dmaven.javadoc.skip=true'
"-Pdist,native",
"-Dtar",
"-Dmaven.javadoc.skip=true"
]

View File

@@ -86,8 +86,8 @@ the ``MesonPackage`` base class already contains:
.. code-block:: python
depends_on('meson', type='build')
depends_on('ninja', type='build')
depends_on("meson", type="build")
depends_on("ninja", type="build")
If you need to specify a particular version requirement, you can
@@ -95,8 +95,8 @@ override this in your package:
.. code-block:: python
depends_on('meson@0.43.0:', type='build')
depends_on('ninja', type='build')
depends_on("meson@0.43.0:", type="build")
depends_on("ninja", type="build")
^^^^^^^^^^^^^^^^^^^
@@ -121,7 +121,7 @@ override the ``meson_args`` method like so:
.. code-block:: python
def meson_args(self):
return ['--warnlevel=3']
return ["--warnlevel=3"]
This method can be used to pass flags as well as variables.

View File

@@ -118,7 +118,7 @@ so ``PerlPackage`` contains:
.. code-block:: python
extends('perl')
extends("perl")
If your package requires a specific version of Perl, you should
@@ -132,14 +132,14 @@ properly. If your package uses ``Makefile.PL`` to build, add:
.. code-block:: python
depends_on('perl-extutils-makemaker', type='build')
depends_on("perl-extutils-makemaker", type="build")
If your package uses ``Build.PL`` to build, add:
.. code-block:: python
depends_on('perl-module-build', type='build')
depends_on("perl-module-build", type="build")
^^^^^^^^^^^^^^^^^
@@ -165,11 +165,11 @@ arguments to ``Makefile.PL`` or ``Build.PL`` by overriding
.. code-block:: python
def configure_args(self):
expat = self.spec['expat'].prefix
expat = self.spec["expat"].prefix
return [
'EXPATLIBPATH={0}'.format(expat.lib),
'EXPATINCPATH={0}'.format(expat.include),
"EXPATLIBPATH={0}".format(expat.lib),
"EXPATINCPATH={0}".format(expat.include),
]

View File

@@ -83,7 +83,7 @@ base class already contains:
.. code-block:: python
depends_on('qt', type='build')
depends_on("qt", type="build")
If you want to specify a particular version requirement, or need to
@@ -91,7 +91,7 @@ link to the ``qt`` libraries, you can override this in your package:
.. code-block:: python
depends_on('qt@5.6.0:')
depends_on("qt@5.6.0:")
^^^^^^^^^^^^^^^^^^^^^^^^^^
Passing arguments to qmake
@@ -103,7 +103,7 @@ override the ``qmake_args`` method like so:
.. code-block:: python
def qmake_args(self):
return ['-recursive']
return ["-recursive"]
This method can be used to pass flags as well as variables.
@@ -118,7 +118,7 @@ sub-directory by adding the following to the package:
.. code-block:: python
build_directory = 'src'
build_directory = "src"
^^^^^^^^^^^^^^^^^^^^^^

View File

@@ -163,28 +163,28 @@ attributes that can be used to set ``homepage``, ``url``, ``list_url``, and
.. code-block:: python
cran = 'caret'
cran = "caret"
is equivalent to:
.. code-block:: python
homepage = 'https://cloud.r-project.org/package=caret'
url = 'https://cloud.r-project.org/src/contrib/caret_6.0-86.tar.gz'
list_url = 'https://cloud.r-project.org/src/contrib/Archive/caret'
homepage = "https://cloud.r-project.org/package=caret"
url = "https://cloud.r-project.org/src/contrib/caret_6.0-86.tar.gz"
list_url = "https://cloud.r-project.org/src/contrib/Archive/caret"
Likewise, the following ``bioc`` attribute:
.. code-block:: python
bioc = 'BiocVersion'
bioc = "BiocVersion"
is equivalent to:
.. code-block:: python
homepage = 'https://bioconductor.org/packages/BiocVersion/'
git = 'https://git.bioconductor.org/packages/BiocVersion'
homepage = "https://bioconductor.org/packages/BiocVersion/"
git = "https://git.bioconductor.org/packages/BiocVersion"
^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -200,7 +200,7 @@ base class contains:
.. code-block:: python
extends('r')
extends("r")
Take a close look at the homepage for ``caret``. If you look at the
@@ -209,7 +209,7 @@ You should add this to your package like so:
.. code-block:: python
depends_on('r@3.2.0:', type=('build', 'run'))
depends_on("r@3.2.0:", type=("build", "run"))
^^^^^^^^^^^^^^
@@ -227,7 +227,7 @@ and list all of their dependencies in the following sections:
* LinkingTo
As far as Spack is concerned, all 3 of these dependency types
correspond to ``type=('build', 'run')``, so you don't have to worry
correspond to ``type=("build", "run")``, so you don't have to worry
about the details. If you are curious what they mean,
https://github.com/spack/spack/issues/2951 has a pretty good summary:
@@ -330,7 +330,7 @@ the dependency:
.. code-block:: python
depends_on('r-lattice@0.20:', type=('build', 'run'))
depends_on("r-lattice@0.20:", type=("build", "run"))
^^^^^^^^^^^^^^^^^^
@@ -361,20 +361,20 @@ like so:
.. code-block:: python
def configure_args(self):
mpi_name = self.spec['mpi'].name
mpi_name = self.spec["mpi"].name
# The type of MPI. Supported values are:
# OPENMPI, LAM, MPICH, MPICH2, or CRAY
if mpi_name == 'openmpi':
Rmpi_type = 'OPENMPI'
elif mpi_name == 'mpich':
Rmpi_type = 'MPICH2'
if mpi_name == "openmpi":
Rmpi_type = "OPENMPI"
elif mpi_name == "mpich":
Rmpi_type = "MPICH2"
else:
raise InstallError('Unsupported MPI type')
raise InstallError("Unsupported MPI type")
return [
'--with-Rmpi-type={0}'.format(Rmpi_type),
'--with-mpi={0}'.format(spec['mpi'].prefix),
"--with-Rmpi-type={0}".format(Rmpi_type),
"--with-mpi={0}".format(spec["mpi"].prefix),
]

View File

@@ -84,8 +84,8 @@ The ``*.gemspec`` file may contain something like:
.. code-block:: ruby
summary = 'An implementation of the AsciiDoc text processor and publishing toolchain'
description = 'A fast, open source text processor and publishing toolchain for converting AsciiDoc content to HTML 5, DocBook 5, and other formats.'
summary = "An implementation of the AsciiDoc text processor and publishing toolchain"
description = "A fast, open source text processor and publishing toolchain for converting AsciiDoc content to HTML 5, DocBook 5, and other formats."
Either of these can be used for the description of the Spack package.
@@ -98,7 +98,7 @@ The ``*.gemspec`` file may contain something like:
.. code-block:: ruby
homepage = 'https://asciidoctor.org'
homepage = "https://asciidoctor.org"
This should be used as the official homepage of the Spack package.
@@ -112,21 +112,21 @@ the base class contains:
.. code-block:: python
extends('ruby')
extends("ruby")
The ``*.gemspec`` file may contain something like:
.. code-block:: ruby
required_ruby_version = '>= 2.3.0'
required_ruby_version = ">= 2.3.0"
This can be added to the Spack package using:
.. code-block:: python
depends_on('ruby@2.3.0:', type=('build', 'run'))
depends_on("ruby@2.3.0:", type=("build", "run"))
^^^^^^^^^^^^^^^^^

View File

@@ -124,7 +124,7 @@ are wrong, you can provide the names yourself by overriding
.. code-block:: python
import_modules = ['PyQt5']
import_modules = ["PyQt5"]
These tests often catch missing dependencies and non-RPATHed

View File

@@ -63,8 +63,8 @@ run package-specific unit tests.
.. code-block:: python
def installtest(self):
with working_dir('test'):
pytest = which('py.test')
with working_dir("test"):
pytest = which("py.test")
pytest()
@@ -93,7 +93,7 @@ the following dependency automatically:
.. code-block:: python
depends_on('python@2.5:', type='build')
depends_on("python@2.5:", type="build")
Waf only supports Python 2.5 and up.
@@ -113,7 +113,7 @@ phase, you can use:
args = []
if self.run_tests:
args.append('--test')
args.append("--test")
return args

View File

@@ -9,46 +9,42 @@
Custom Extensions
=================
*Spack extensions* permit you to extend Spack capabilities by deploying your
*Spack extensions* allow you to extend Spack capabilities by deploying your
own custom commands or logic in an arbitrary location on your filesystem.
This might be extremely useful e.g. to develop and maintain a command whose purpose is
too specific to be considered for reintegration into the mainline or to
evolve a command through its early stages before starting a discussion to merge
it upstream.
From Spack's point of view an extension is any path in your filesystem which
respects a prescribed naming and layout for files:
respects the following naming and layout for files:
.. code-block:: console
spack-scripting/ # The top level directory must match the format 'spack-{extension_name}'
├── pytest.ini # Optional file if the extension ships its own tests
├── scripting # Folder that may contain modules that are needed for the extension commands
│   └── cmd # Folder containing extension commands
│   └── filter.py # A new command that will be available
├── tests # Tests for this extension
│   ├── cmd # Folder containing extension commands
│   │   └── filter.py # A new command that will be available
│   └── functions.py # Module with internal details
└── tests # Tests for this extension
│   ├── conftest.py
│   └── test_filter.py
└── templates # Templates that may be needed by the extension
In the example above the extension named *scripting* adds an additional command (``filter``)
and unit tests to verify its behavior. The code for this example can be
obtained by cloning the corresponding git repository:
In the example above, the extension is named *scripting*. It adds an additional command
(``spack filter``) and unit tests to verify its behavior.
.. TODO: write an ad-hoc "hello world" extension and make it part of the spack organization
The extension can import any core Spack module in its implementation. When loaded by
the ``spack`` command, the extension itself is imported as a Python package in the
``spack.extensions`` namespace. In the example above, since the extension is named
"scripting", the corresponding Python module is ``spack.extensions.scripting``.
The code for this example extension can be obtained by cloning the corresponding git repository:
.. code-block:: console
$ cd ~/
$ mkdir tmp && cd tmp
$ git clone https://github.com/alalazo/spack-scripting.git
Cloning into 'spack-scripting'...
remote: Counting objects: 11, done.
remote: Compressing objects: 100% (7/7), done.
remote: Total 11 (delta 0), reused 11 (delta 0), pack-reused 0
Receiving objects: 100% (11/11), done.
As you can see by inspecting the sources, Python modules that are part of the extension
can import any core Spack module.
$ git -C /tmp clone https://github.com/spack/spack-scripting.git
---------------------------------
Configure Spack to Use Extensions
@@ -61,7 +57,7 @@ paths to ``config.yaml``. In the case of our example this means ensuring that:
config:
extensions:
- ~/tmp/spack-scripting
- /tmp/spack-scripting
is part of your configuration file. Once this is set up, any command that the extension provides
will be available from the command line:
@@ -86,37 +82,32 @@ will be available from the command line:
--implicit select specs that are not installed or were installed implicitly
--output OUTPUT where to dump the result
The corresponding unit tests can be run giving the appropriate options
to ``spack unit-test``:
The corresponding unit tests can be run giving the appropriate options to ``spack unit-test``:
.. code-block:: console
$ spack unit-test --extension=scripting
============================================================== test session starts ===============================================================
platform linux2 -- Python 2.7.15rc1, pytest-3.2.5, py-1.4.34, pluggy-0.4.0
rootdir: /home/mculpo/tmp/spack-scripting, inifile: pytest.ini
========================================== test session starts ===========================================
platform linux -- Python 3.11.5, pytest-7.4.3, pluggy-1.3.0
rootdir: /home/culpo/github/spack-scripting
configfile: pytest.ini
testpaths: tests
plugins: xdist-3.5.0
collected 5 items
tests/test_filter.py ...XX
============================================================ short test summary info =============================================================
XPASS tests/test_filter.py::test_filtering_specs[flags3-specs3-expected3]
XPASS tests/test_filter.py::test_filtering_specs[flags4-specs4-expected4]
tests/test_filter.py ..... [100%]
=========================================================== slowest 20 test durations ============================================================
3.74s setup tests/test_filter.py::test_filtering_specs[flags0-specs0-expected0]
0.17s call tests/test_filter.py::test_filtering_specs[flags3-specs3-expected3]
0.16s call tests/test_filter.py::test_filtering_specs[flags2-specs2-expected2]
0.15s call tests/test_filter.py::test_filtering_specs[flags1-specs1-expected1]
0.13s call tests/test_filter.py::test_filtering_specs[flags4-specs4-expected4]
0.08s call tests/test_filter.py::test_filtering_specs[flags0-specs0-expected0]
0.04s teardown tests/test_filter.py::test_filtering_specs[flags4-specs4-expected4]
0.00s setup tests/test_filter.py::test_filtering_specs[flags4-specs4-expected4]
0.00s setup tests/test_filter.py::test_filtering_specs[flags3-specs3-expected3]
0.00s setup tests/test_filter.py::test_filtering_specs[flags1-specs1-expected1]
0.00s setup tests/test_filter.py::test_filtering_specs[flags2-specs2-expected2]
0.00s teardown tests/test_filter.py::test_filtering_specs[flags2-specs2-expected2]
0.00s teardown tests/test_filter.py::test_filtering_specs[flags1-specs1-expected1]
0.00s teardown tests/test_filter.py::test_filtering_specs[flags0-specs0-expected0]
0.00s teardown tests/test_filter.py::test_filtering_specs[flags3-specs3-expected3]
====================================================== 3 passed, 2 xpassed in 4.51 seconds =======================================================
========================================== slowest 30 durations ==========================================
2.31s setup tests/test_filter.py::test_filtering_specs[kwargs0-specs0-expected0]
0.57s call tests/test_filter.py::test_filtering_specs[kwargs2-specs2-expected2]
0.56s call tests/test_filter.py::test_filtering_specs[kwargs4-specs4-expected4]
0.54s call tests/test_filter.py::test_filtering_specs[kwargs3-specs3-expected3]
0.54s call tests/test_filter.py::test_filtering_specs[kwargs1-specs1-expected1]
0.48s call tests/test_filter.py::test_filtering_specs[kwargs0-specs0-expected0]
0.01s setup tests/test_filter.py::test_filtering_specs[kwargs4-specs4-expected4]
0.01s setup tests/test_filter.py::test_filtering_specs[kwargs2-specs2-expected2]
0.01s setup tests/test_filter.py::test_filtering_specs[kwargs1-specs1-expected1]
0.01s setup tests/test_filter.py::test_filtering_specs[kwargs3-specs3-expected3]
(5 durations < 0.005s hidden. Use -vv to show these durations.)
=========================================== 5 passed in 5.06s ============================================

View File

@@ -250,10 +250,9 @@ Compiler configuration
Spack has the ability to build packages with multiple compilers and
compiler versions. Compilers can be made available to Spack by
specifying them manually in ``compilers.yaml`` or ``packages.yaml``,
or automatically by running ``spack compiler find``, but for
convenience Spack will automatically detect compilers the first time
it needs them.
specifying them manually in ``compilers.yaml``, or automatically by
running ``spack compiler find``, but for convenience Spack will
automatically detect compilers the first time it needs them.
.. _cmd-spack-compilers:
@@ -458,52 +457,6 @@ specification. The operations available to modify the environment are ``set``, `
prepend_path: # Similar for append|remove_path
LD_LIBRARY_PATH: /ld/paths/added/by/setvars/sh
.. note::
Spack is in the process of moving compilers from a separate
attribute to be handled like all other packages. As part of this
process, the ``compilers.yaml`` section will eventually be replaced
by configuration in the ``packages.yaml`` section. This new
configuration is now available, although it is not yet the default
behavior.
Compilers can also be configured as external packages in the
``packages.yaml`` config file. Any external package for a compiler
(e.g. ``gcc`` or ``llvm``) will be treated as a configured compiler
assuming the paths to the compiler executables are determinable from
the prefix.
If the paths to the compiler executables cannot be determined from the
prefix, you can add them to the ``extra_attributes`` field using the
``compilers`` key. The ``compilers`` key accepts entries for ``c``,
``cxx``, ``fortran``, and ``f77``.
Any other field from the ``compilers`` config can also be added to the
``extra_attributes`` field of an external representing a compiler. These
fields are used as-is in the internal representation of the compiler
config.
.. code-block:: yaml
packages:
gcc:
external:
- spec: gcc@12.2.0 arch=linux-rhel8-skylake
prefix: /usr
extra_attributes:
environment:
set:
GCC_ROOT: /usr
external:
- spec: llvm+clang@15.0.0 arch=linux-rhel8-skylake
prefix: /usr
extra_attributes:
compilers:
c: /usr/bin/clang-with-suffix
cxx: /usr/bin/clang++-with-extra-info
fortran: /usr/bin/gfortran
extra_rpaths:
- /usr/lib/llvm/
^^^^^^^^^^^^^^^^^^^^^^^
Build Your Own Compiler

View File

@@ -111,3 +111,28 @@ CUDA is split into fewer components and is simpler to specify:
prefix: /opt/cuda/cuda-11.0.2/
where ``/opt/cuda/cuda-11.0.2/lib/`` contains ``libcudart.so``.
-----------------------------------
Using an External OpenGL API
-----------------------------------
Depending on whether a graphics card is available, we may choose to use OSMesa or GLX to implement the OpenGL API.
If a graphics card is unavailable, OSMesa is recommended and can typically be built with Spack.
However, if we prefer to use the system GLX tailored to our graphics card, we need to declare it as an external. Here's how to do it:
.. code-block:: yaml
packages:
libglx:
require: [opengl]
opengl:
buildable: false
externals:
- prefix: /usr/
spec: opengl@4.6
Note that the prefix has to be the root of both the libraries and the headers, i.e. ``/usr``, not the path to the ``lib`` directory.
To find out which ``opengl`` spec is available, run ``cd /usr/include/GL && grep -Ri gl_version``.

View File

@@ -1,13 +1,13 @@
sphinx==7.2.6
sphinxcontrib-programoutput==0.17
sphinx_design==0.5.0
sphinx-rtd-theme==1.3.0
sphinx-rtd-theme==2.0.0
python-levenshtein==0.23.0
docutils==0.18.1
pygments==2.17.1
docutils==0.20.1
pygments==2.17.2
urllib3==2.1.0
pytest==7.4.3
isort==5.12.0
black==23.11.0
flake8==6.1.0
mypy==1.7.0
mypy==1.7.1

View File

@@ -1047,9 +1047,9 @@ def __bool__(self):
"""Whether any exceptions were handled."""
return bool(self.exceptions)
def forward(self, context: str) -> "GroupedExceptionForwarder":
def forward(self, context: str, base: type = BaseException) -> "GroupedExceptionForwarder":
"""Return a contextmanager which extracts tracebacks and prefixes a message."""
return GroupedExceptionForwarder(context, self)
return GroupedExceptionForwarder(context, self, base)
def _receive_forwarded(self, context: str, exc: Exception, tb: List[str]):
self.exceptions.append((context, exc, tb))
@@ -1072,15 +1072,18 @@ class GroupedExceptionForwarder:
"""A contextmanager to capture exceptions and forward them to a
GroupedExceptionHandler."""
def __init__(self, context: str, handler: GroupedExceptionHandler):
def __init__(self, context: str, handler: GroupedExceptionHandler, base: type):
self._context = context
self._handler = handler
self._base = base
def __enter__(self):
return None
def __exit__(self, exc_type, exc_value, tb):
if exc_value is not None:
if not issubclass(exc_type, self._base):
return False
self._handler._receive_forwarded(self._context, exc_value, traceback.format_tb(tb))
# Suppress any exception from being re-raised:

View File

@@ -726,13 +726,37 @@ def _unknown_variants_in_directives(pkgs, error_cls):
@package_directives
def _unknown_variants_in_dependencies(pkgs, error_cls):
"""Report unknown dependencies and wrong variants for dependencies"""
def _issues_in_depends_on_directive(pkgs, error_cls):
"""Reports issues with 'depends_on' directives.
Issues might be unknown dependencies, unknown variants or variant values, or declaration
of nested dependencies.
"""
errors = []
for pkg_name in pkgs:
pkg_cls = spack.repo.PATH.get_pkg_class(pkg_name)
filename = spack.repo.PATH.filename_for_package_name(pkg_name)
for dependency_name, dependency_data in pkg_cls.dependencies.items():
# Check if there are nested dependencies declared. We don't want directives like:
#
# depends_on('foo+bar ^fee+baz')
#
# but we'd like to have two dependencies listed instead.
for when, dependency_edge in dependency_data.items():
dependency_spec = dependency_edge.spec
nested_dependencies = dependency_spec.dependencies()
if nested_dependencies:
summary = (
f"{pkg_name}: invalid nested dependency "
f"declaration '{str(dependency_spec)}'"
)
details = [
f"split depends_on('{str(dependency_spec)}', when='{str(when)}') "
f"into {len(nested_dependencies) + 1} directives",
f"in {filename}",
]
errors.append(error_cls(summary=summary, details=details))
# No need to analyze virtual packages
if spack.repo.PATH.is_virtual(dependency_name):
continue
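# For illustration, the rewrite this audit asks for (package names are made up):
#
#     depends_on("foo+bar ^fee+baz")   # flagged: nested dependency in one directive
#
# should instead be written as one directive per dependency edge:
#
#     depends_on("foo+bar")
#     depends_on("fee+baz")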

View File

@@ -230,7 +230,11 @@ def _associate_built_specs_with_mirror(self, cache_key, mirror_url):
)
return
spec_list = db.query_local(installed=False, in_buildcache=True)
spec_list = [
s
for s in db.query_local(installed=any, in_buildcache=any)
if s.external or db.query_local_by_spec_hash(s.dag_hash()).in_buildcache
]
for indexed_spec in spec_list:
dag_hash = indexed_spec.dag_hash()

View File

@@ -16,6 +16,7 @@
import llnl.util.filesystem as fs
from llnl.util import tty
import spack.platforms
import spack.store
import spack.util.environment
import spack.util.executable
@@ -206,17 +207,19 @@ def _root_spec(spec_str: str) -> str:
"""Add a proper compiler and target to a spec used during bootstrapping.
Args:
spec_str (str): spec to be bootstrapped. Must be without compiler and target.
spec_str: spec to be bootstrapped. Must be without compiler and target.
"""
# Add a proper compiler hint to the root spec. We use GCC for
# everything but MacOS and Windows.
if str(spack.platforms.host()) == "darwin":
# Add a compiler requirement to the root spec.
platform = str(spack.platforms.host())
if platform == "darwin":
spec_str += " %apple-clang"
elif str(spack.platforms.host()) == "windows":
elif platform == "windows":
# TODO (johnwparent): Remove version constraint when clingo patch is up
spec_str += " %msvc@:19.37"
else:
elif platform == "linux":
spec_str += " %gcc"
elif platform == "freebsd":
spec_str += " %clang"
target = archspec.cpu.host().family
spec_str += f" target={target}"
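# For illustration only (values are host-dependent and the package name is made up):
# on a Linux x86_64 host, _root_spec("gnupg") returns something like
# "gnupg %gcc target=x86_64", while on FreeBSD the compiler hint would be %clang.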

View File

@@ -147,7 +147,7 @@ def _add_compilers_if_missing() -> None:
mixed_toolchain=sys.platform == "darwin"
)
if new_compilers:
spack.compilers.add_compilers_to_config(new_compilers)
spack.compilers.add_compilers_to_config(new_compilers, init_config=False)
@contextlib.contextmanager

View File

@@ -386,7 +386,7 @@ def ensure_module_importable_or_raise(module: str, abstract_spec: Optional[str]
exception_handler = GroupedExceptionHandler()
for current_config in bootstrapping_sources():
with exception_handler.forward(current_config["name"]):
with exception_handler.forward(current_config["name"], Exception):
source_is_enabled_or_raise(current_config)
current_bootstrapper = create_bootstrapper(current_config)
if current_bootstrapper.try_import(module, abstract_spec):
@@ -441,7 +441,7 @@ def ensure_executables_in_path_or_raise(
exception_handler = GroupedExceptionHandler()
for current_config in bootstrapping_sources():
with exception_handler.forward(current_config["name"]):
with exception_handler.forward(current_config["name"], Exception):
source_is_enabled_or_raise(current_config)
current_bootstrapper = create_bootstrapper(current_config)
if current_bootstrapper.try_search_path(executables, abstract_spec):

View File

@@ -19,7 +19,6 @@
import spack.tengine
import spack.util.cpus
import spack.util.executable
from spack.environment import depfile
from ._common import _root_spec
from .config import root_path, spec_for_current_python, store_path
@@ -86,12 +85,9 @@ def __init__(self) -> None:
super().__init__(self.environment_root())
def update_installations(self) -> None:
"""Update the installations of this environment.
The update is done using a depfile on Linux and macOS, and using the ``install_all``
method of environments on Windows.
"""
with tty.SuppressOutput(msg_enabled=False, warn_enabled=False):
"""Update the installations of this environment."""
log_enabled = tty.is_debug() or tty.is_verbose()
with tty.SuppressOutput(msg_enabled=log_enabled, warn_enabled=log_enabled):
specs = self.concretize()
if specs:
colorized_specs = [
@@ -100,11 +96,9 @@ def update_installations(self) -> None:
]
tty.msg(f"[BOOTSTRAPPING] Installing dependencies ({', '.join(colorized_specs)})")
self.write(regenerate=False)
if sys.platform == "win32":
with tty.SuppressOutput(msg_enabled=log_enabled, warn_enabled=log_enabled):
self.install_all()
else:
self._install_with_depfile()
self.write(regenerate=True)
self.write(regenerate=True)
def update_syspath_and_environ(self) -> None:
"""Update ``sys.path`` and the PATH, PYTHONPATH environment variables to point to
@@ -122,25 +116,6 @@ def update_syspath_and_environ(self) -> None:
+ [str(x) for x in self.pythonpaths()]
)
def _install_with_depfile(self) -> None:
model = depfile.MakefileModel.from_env(self)
template = spack.tengine.make_environment().get_template(
os.path.join("depfile", "Makefile")
)
makefile = self.environment_root() / "Makefile"
makefile.write_text(template.render(model.to_dict()))
make = spack.util.executable.which("make")
kwargs = {}
if not tty.is_debug():
kwargs = {"output": os.devnull, "error": os.devnull}
make(
"-C",
str(self.environment_root()),
"-j",
str(spack.util.cpus.determine_number_of_jobs(parallel=True)),
**kwargs,
)
def _write_spack_yaml_file(self) -> None:
tty.msg(
"[BOOTSTRAPPING] Spack has missing dependencies, creating a bootstrapping environment"

View File

@@ -66,7 +66,6 @@ def _core_requirements() -> List[RequiredResponseType]:
_core_system_exes = {
"make": _missing("make", "required to build software from sources"),
"patch": _missing("patch", "required to patch source code before building"),
"bash": _missing("bash", "required for Spack compiler wrapper"),
"tar": _missing("tar", "required to manage code archives"),
"gzip": _missing("gzip", "required to compress/decompress code archives"),
"unzip": _missing("unzip", "required to compress/decompress code archives"),

View File

@@ -9,6 +9,7 @@
import llnl.util.filesystem as fs
import llnl.util.tty as tty
import spack.build_environment
import spack.builder
from .cmake import CMakeBuilder, CMakePackage
@@ -285,6 +286,19 @@ def initconfig_hardware_entries(self):
def std_initconfig_entries(self):
cmake_prefix_path_env = os.environ["CMAKE_PREFIX_PATH"]
cmake_prefix_path = cmake_prefix_path_env.replace(os.pathsep, ";")
cmake_rpaths_env = spack.build_environment.get_rpaths(self.pkg)
cmake_rpaths_path = ";".join(cmake_rpaths_env)
complete_rpath_list = cmake_rpaths_path
if "SPACK_COMPILER_EXTRA_RPATHS" in os.environ:
spack_extra_rpaths_env = os.environ["SPACK_COMPILER_EXTRA_RPATHS"]
spack_extra_rpaths_path = spack_extra_rpaths_env.replace(os.pathsep, ";")
complete_rpath_list = "{0};{1}".format(complete_rpath_list, spack_extra_rpaths_path)
if "SPACK_COMPILER_IMPLICIT_RPATHS" in os.environ:
spack_implicit_rpaths_env = os.environ["SPACK_COMPILER_IMPLICIT_RPATHS"]
spack_implicit_rpaths_path = spack_implicit_rpaths_env.replace(os.pathsep, ";")
complete_rpath_list = "{0};{1}".format(complete_rpath_list, spack_implicit_rpaths_path)
return [
"#------------------{0}".format("-" * 60),
"# !!!! This is a generated file, edit at own risk !!!!",
@@ -292,6 +306,9 @@ def std_initconfig_entries(self):
"# CMake executable path: {0}".format(self.pkg.spec["cmake"].command.path),
"#------------------{0}\n".format("-" * 60),
cmake_cache_string("CMAKE_PREFIX_PATH", cmake_prefix_path),
cmake_cache_string("CMAKE_INSTALL_RPATH_USE_LINK_PATH", "ON"),
cmake_cache_string("CMAKE_BUILD_RPATH", complete_rpath_list),
cmake_cache_string("CMAKE_INSTALL_RPATH", complete_rpath_list),
self.define_cmake_cache_from_variant("CMAKE_BUILD_TYPE", "build_type"),
]

View File

@@ -6,13 +6,14 @@
import os
import re
import shutil
from typing import Optional
from typing import Iterable, List, Mapping, Optional
import archspec
import llnl.util.filesystem as fs
import llnl.util.lang as lang
import llnl.util.tty as tty
from llnl.util.filesystem import HeaderList, LibraryList
import spack.builder
import spack.config
@@ -25,14 +26,18 @@
from spack.directives import build_system, depends_on, extends, maintainers
from spack.error import NoHeadersError, NoLibrariesError
from spack.install_test import test_part
from spack.spec import Spec
from spack.util.prefix import Prefix
from ._checks import BaseBuilder, execute_install_time_tests
def _flatten_dict(dictionary):
def _flatten_dict(dictionary: Mapping[str, object]) -> Iterable[str]:
"""Iterable that yields KEY=VALUE paths through a dictionary.
Args:
dictionary: Possibly nested dictionary of arbitrary keys and values.
Yields:
A single path through the dictionary.
"""
@@ -50,7 +55,7 @@ class PythonExtension(spack.package_base.PackageBase):
maintainers("adamjstewart")
@property
def import_modules(self):
def import_modules(self) -> Iterable[str]:
"""Names of modules that the Python package provides.
These are used to test whether or not the installation succeeded.
@@ -65,7 +70,7 @@ def import_modules(self):
detected, this property can be overridden by the package.
Returns:
list: list of strings of module names
List of strings of module names.
"""
modules = []
pkg = self.spec["python"].package
@@ -102,14 +107,14 @@ def import_modules(self):
return modules
@property
def skip_modules(self):
def skip_modules(self) -> Iterable[str]:
"""Names of modules that should be skipped when running tests.
These are a subset of import_modules. If a module has submodules,
they are skipped as well (meaning a.b is skipped if a is contained).
Returns:
list: list of strings of module names
List of strings of module names.
"""
return []
@@ -185,12 +190,12 @@ def remove_files_from_view(self, view, merge_map):
view.remove_files(to_remove)
def test_imports(self):
def test_imports(self) -> None:
"""Attempts to import modules of the installed package."""
# Make sure we are importing the installed modules,
# not the ones in the source directory
python = inspect.getmodule(self).python
python = inspect.getmodule(self).python # type: ignore[union-attr]
for module in self.import_modules:
with test_part(
self,
@@ -315,24 +320,27 @@ class PythonPackage(PythonExtension):
py_namespace: Optional[str] = None
@lang.classproperty
def homepage(cls):
def homepage(cls) -> Optional[str]: # type: ignore[override]
if cls.pypi:
name = cls.pypi.split("/")[0]
return "https://pypi.org/project/" + name + "/"
return f"https://pypi.org/project/{name}/"
return None
@lang.classproperty
def url(cls):
def url(cls) -> Optional[str]:
if cls.pypi:
return "https://files.pythonhosted.org/packages/source/" + cls.pypi[0] + "/" + cls.pypi
return f"https://files.pythonhosted.org/packages/source/{cls.pypi[0]}/{cls.pypi}"
return None
@lang.classproperty
def list_url(cls):
def list_url(cls) -> Optional[str]: # type: ignore[override]
if cls.pypi:
name = cls.pypi.split("/")[0]
return "https://pypi.org/simple/" + name + "/"
return f"https://pypi.org/simple/{name}/"
return None
@property
def headers(self):
def headers(self) -> HeaderList:
"""Discover header files in platlib."""
# Remove py- prefix in package name
@@ -350,7 +358,7 @@ def headers(self):
raise NoHeadersError(msg.format(self.spec.name, include, platlib))
@property
def libs(self):
def libs(self) -> LibraryList:
"""Discover libraries in platlib."""
# Remove py- prefix in package name
@@ -384,7 +392,7 @@ class PythonPipBuilder(BaseBuilder):
install_time_test_callbacks = ["test"]
@staticmethod
def std_args(cls):
def std_args(cls) -> List[str]:
return [
# Verbose
"-vvv",
@@ -409,7 +417,7 @@ def std_args(cls):
]
@property
def build_directory(self):
def build_directory(self) -> str:
"""The root directory of the Python package.
This is usually the directory containing one of the following files:
@@ -420,51 +428,51 @@ def build_directory(self):
"""
return self.pkg.stage.source_path
def config_settings(self, spec, prefix):
def config_settings(self, spec: Spec, prefix: Prefix) -> Mapping[str, object]:
"""Configuration settings to be passed to the PEP 517 build backend.
Requires pip 22.1 or newer for keys that appear only a single time,
or pip 23.1 or newer if the same key appears multiple times.
Args:
spec (spack.spec.Spec): build spec
prefix (spack.util.prefix.Prefix): installation prefix
spec: Build spec.
prefix: Installation prefix.
Returns:
dict: Possibly nested dictionary of KEY, VALUE settings
Possibly nested dictionary of KEY, VALUE settings.
"""
return {}
def install_options(self, spec, prefix):
def install_options(self, spec: Spec, prefix: Prefix) -> Iterable[str]:
"""Extra arguments to be supplied to the setup.py install command.
Requires pip 23.0 or older.
Args:
spec (spack.spec.Spec): build spec
prefix (spack.util.prefix.Prefix): installation prefix
spec: Build spec.
prefix: Installation prefix.
Returns:
list: list of options
List of options.
"""
return []
def global_options(self, spec, prefix):
def global_options(self, spec: Spec, prefix: Prefix) -> Iterable[str]:
"""Extra global options to be supplied to the setup.py call before the install
or bdist_wheel command.
Deprecated in pip 23.1.
Args:
spec (spack.spec.Spec): build spec
prefix (spack.util.prefix.Prefix): installation prefix
spec: Build spec.
prefix: Installation prefix.
Returns:
list: list of options
List of options.
"""
return []
def install(self, pkg, spec, prefix):
def install(self, pkg: PythonPackage, spec: Spec, prefix: Prefix) -> None:
"""Install everything from build directory."""
args = PythonPipBuilder.std_args(pkg) + [f"--prefix={prefix}"]

View File

@@ -6,7 +6,7 @@
import llnl.util.tty as tty
import spack.cmd
import spack.cmd.common.arguments as arguments
from spack.cmd.common import arguments
description = "add a spec to an environment"
section = "environments"

View File

@@ -15,13 +15,13 @@
import spack.bootstrap
import spack.bootstrap.config
import spack.bootstrap.core
import spack.cmd.common.arguments
import spack.config
import spack.main
import spack.mirror
import spack.spec
import spack.stage
import spack.util.path
from spack.cmd.common import arguments
description = "manage bootstrap configuration"
section = "system"
@@ -68,12 +68,8 @@
def _add_scope_option(parser):
scopes = spack.config.scopes()
parser.add_argument(
"--scope",
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
help="configuration scope to read/modify",
"--scope", action=arguments.ConfigScope, help="configuration scope to read/modify"
)
@@ -106,7 +102,7 @@ def setup_parser(subparser):
disable.add_argument("name", help="name of the source to be disabled", nargs="?", default=None)
reset = sp.add_parser("reset", help="reset bootstrapping configuration to Spack defaults")
spack.cmd.common.arguments.add_common_arguments(reset, ["yes_to_all"])
arguments.add_common_arguments(reset, ["yes_to_all"])
root = sp.add_parser("root", help="get/set the root bootstrap directory")
_add_scope_option(root)

View File

@@ -21,7 +21,6 @@
import spack.binary_distribution as bindist
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.config
import spack.environment as ev
import spack.error
@@ -40,6 +39,7 @@
import spack.util.web as web_util
from spack.build_environment import determine_number_of_jobs
from spack.cmd import display_specs
from spack.cmd.common import arguments
from spack.oci.image import (
Digest,
ImageReference,
@@ -182,23 +182,22 @@ def setup_parser(subparser: argparse.ArgumentParser):
)
# used to construct scope arguments below
scopes = spack.config.scopes()
check.add_argument(
"--scope",
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_modify_scope(),
action=arguments.ConfigScope,
default=lambda: spack.config.default_modify_scope(),
help="configuration scope containing mirrors to check",
)
check_spec_or_specfile = check.add_mutually_exclusive_group(required=True)
check_spec_or_specfile.add_argument(
# Unfortunately there are 3 ways to do the same thing here:
check_specs = check.add_mutually_exclusive_group()
check_specs.add_argument(
"-s", "--spec", help="check single spec instead of release specs file"
)
check_spec_or_specfile.add_argument(
check_specs.add_argument(
"--spec-file",
help="check single spec from json or yaml file instead of release specs file",
)
arguments.add_common_arguments(check, ["specs"])
check.set_defaults(func=check_fn)
@@ -816,15 +815,24 @@ def check_fn(args: argparse.Namespace):
exit code is non-zero, then at least one of the indicated specs needs to be rebuilt
"""
if args.spec_file:
specs_arg = (
args.spec_file if os.path.sep in args.spec_file else os.path.join(".", args.spec_file)
)
tty.warn(
"The flag `--spec-file` is deprecated and will be removed in Spack 0.22. "
"Use --spec instead."
f"Use `spack buildcache check {specs_arg}` instead."
)
elif args.spec:
specs_arg = args.spec
tty.warn(
"The flag `--spec` is deprecated and will be removed in Spack 0.23. "
f"Use `spack buildcache check {specs_arg}` instead."
)
else:
specs_arg = args.specs
specs = spack.cmd.parse_specs(args.spec or args.spec_file)
if specs:
specs = _matching_specs(specs)
if specs_arg:
specs = _matching_specs(spack.cmd.parse_specs(specs_arg))
else:
specs = spack.cmd.require_active_env("buildcache check").all_specs()

View File

@@ -4,7 +4,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import spack.cmd
import spack.cmd.common.arguments as arguments
from spack.cmd.common import arguments
description = "change an existing spec in an environment"
section = "environments"

View File

@@ -16,6 +16,7 @@
import spack.cmd.buildcache as buildcache
import spack.config as cfg
import spack.environment as ev
import spack.environment.depfile
import spack.hash_types as ht
import spack.mirror
import spack.util.gpg as gpg_util
@@ -606,7 +607,9 @@ def ci_rebuild(args):
"SPACK_INSTALL_FLAGS={}".format(args_to_string(deps_install_args)),
"-j$(nproc)",
"install-deps/{}".format(
ev.depfile.MakefileSpec(job_spec).safe_format("{name}-{version}-{hash}")
spack.environment.depfile.MakefileSpec(job_spec).safe_format(
"{name}-{version}-{hash}"
)
),
],
spack_cmd + ["install"] + root_install_args,

View File

@@ -12,13 +12,13 @@
import spack.bootstrap
import spack.caches
import spack.cmd.common.arguments as arguments
import spack.cmd.test
import spack.config
import spack.repo
import spack.stage
import spack.store
import spack.util.path
from spack.cmd.common import arguments
from spack.paths import lib_path, var_path
description = "remove temporary build files and/or downloaded archives"

View File

@@ -124,6 +124,33 @@ def __call__(self, parser, namespace, values, option_string=None):
setattr(namespace, self.dest, deptype)
class ConfigScope(argparse.Action):
"""Pick the currently configured config scopes."""
def __init__(self, *args, **kwargs) -> None:
kwargs.setdefault("metavar", spack.config.SCOPES_METAVAR)
super().__init__(*args, **kwargs)
@property
def default(self):
return self._default() if callable(self._default) else self._default
@default.setter
def default(self, value):
self._default = value
@property
def choices(self):
return spack.config.scopes().keys()
@choices.setter
def choices(self, value):
pass
def __call__(self, parser, namespace, values, option_string=None):
setattr(namespace, self.dest, values)
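# A usage sketch matching the command modules updated below. Because ``default``
# accepts a callable, the scope lookup is deferred until argparse actually reads it:
#
#     subparser.add_argument(
#         "--scope",
#         action=arguments.ConfigScope,
#         default=lambda: spack.config.default_modify_scope(),
#         help="configuration scope to modify",
#     )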
def _cdash_reporter(namespace):
"""Helper function to create a CDash reporter. This function gets an early reference to the
argparse namespace under construction, so it can later use it to create the object.

View File

@@ -8,13 +8,13 @@
import llnl.util.tty as tty
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.deptypes as dt
import spack.error
import spack.paths
import spack.spec
import spack.store
from spack import build_environment, traverse
from spack.cmd.common import arguments
from spack.context import Context
from spack.util.environment import dump_environment, pickle_environment

View File

@@ -14,6 +14,7 @@
import spack.compilers
import spack.config
import spack.spec
from spack.cmd.common import arguments
description = "manage compilers"
section = "system"
@@ -23,8 +24,6 @@
def setup_parser(subparser):
sp = subparser.add_subparsers(metavar="SUBCOMMAND", dest="compiler_command")
scopes = spack.config.scopes()
# Find
find_parser = sp.add_parser(
"find",
@@ -47,9 +46,8 @@ def setup_parser(subparser):
find_parser.add_argument("add_paths", nargs=argparse.REMAINDER)
find_parser.add_argument(
"--scope",
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_modify_scope("compilers"),
action=arguments.ConfigScope,
default=lambda: spack.config.default_modify_scope("compilers"),
help="configuration scope to modify",
)
@@ -60,20 +58,15 @@ def setup_parser(subparser):
)
remove_parser.add_argument("compiler_spec")
remove_parser.add_argument(
"--scope",
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
default=None,
help="configuration scope to modify",
"--scope", action=arguments.ConfigScope, default=None, help="configuration scope to modify"
)
# List
list_parser = sp.add_parser("list", help="list available compilers")
list_parser.add_argument(
"--scope",
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_list_scope(),
action=arguments.ConfigScope,
default=lambda: spack.config.default_list_scope(),
help="configuration scope to read from",
)
@@ -82,9 +75,8 @@ def setup_parser(subparser):
info_parser.add_argument("compiler_spec")
info_parser.add_argument(
"--scope",
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_list_scope(),
action=arguments.ConfigScope,
default=lambda: spack.config.default_list_scope(),
help="configuration scope to read from",
)
@@ -103,7 +95,7 @@ def compiler_find(args):
paths, scope=None, mixed_toolchain=args.mixed_toolchain
)
if new_compilers:
spack.compilers.add_compilers_to_config(new_compilers, scope=args.scope)
spack.compilers.add_compilers_to_config(new_compilers, scope=args.scope, init_config=False)
n = len(new_compilers)
s = "s" if n > 1 else ""

View File

@@ -3,7 +3,7 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import spack.config
from spack.cmd.common import arguments
from spack.cmd.compiler import compiler_list
description = "list available compilers"
@@ -12,13 +12,8 @@
def setup_parser(subparser):
scopes = spack.config.scopes()
subparser.add_argument(
"--scope",
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
help="configuration scope to read/modify",
"--scope", action=arguments.ConfigScope, help="configuration scope to read/modify"
)

View File

@@ -10,7 +10,6 @@
import llnl.util.filesystem as fs
import llnl.util.tty as tty
import spack.cmd.common.arguments
import spack.config
import spack.environment as ev
import spack.repo
@@ -18,6 +17,7 @@
import spack.schema.packages
import spack.store
import spack.util.spack_yaml as syaml
from spack.cmd.common import arguments
from spack.util.editor import editor
description = "get and set configuration options"
@@ -26,14 +26,9 @@
def setup_parser(subparser):
scopes = spack.config.scopes()
# User can only choose one
subparser.add_argument(
"--scope",
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
help="configuration scope to read/modify",
"--scope", action=arguments.ConfigScope, help="configuration scope to read/modify"
)
sp = subparser.add_subparsers(metavar="SUBCOMMAND", dest="config_command")
@@ -101,13 +96,13 @@ def setup_parser(subparser):
setup_parser.add_parser = add_parser
update = sp.add_parser("update", help="update configuration files to the latest format")
spack.cmd.common.arguments.add_common_arguments(update, ["yes_to_all"])
arguments.add_common_arguments(update, ["yes_to_all"])
update.add_argument("section", help="section to update")
revert = sp.add_parser(
"revert", help="revert configuration files to their state before update"
)
spack.cmd.common.arguments.add_common_arguments(revert, ["yes_to_all"])
arguments.add_common_arguments(revert, ["yes_to_all"])
revert.add_argument("section", help="section to update")

View File

@@ -10,10 +10,10 @@
import llnl.util.tty as tty
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.cmd.common.confirmation as confirmation
import spack.environment as ev
import spack.spec
from spack.cmd.common import arguments
description = "remove specs from the concretized lockfile of an environment"
section = "environments"

View File

@@ -9,11 +9,11 @@
from llnl.util.tty.colify import colify
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.environment as ev
import spack.package_base
import spack.repo
import spack.store
from spack.cmd.common import arguments
description = "show dependencies of a package"
section = "basic"

View File

@@ -9,10 +9,10 @@
from llnl.util.tty.colify import colify
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.environment as ev
import spack.repo
import spack.store
from spack.cmd.common import arguments
description = "show packages that depend on another"
section = "basic"

View File

@@ -20,9 +20,9 @@
from llnl.util.symlink import symlink
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.environment as ev
import spack.store
from spack.cmd.common import arguments
from spack.database import InstallStatuses
from spack.error import SpackError

View File

@@ -9,9 +9,9 @@
import llnl.util.tty as tty
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.config
import spack.repo
from spack.cmd.common import arguments
description = "developer build: build from code in current working directory"
section = "build"

View File

@@ -8,10 +8,10 @@
import llnl.util.tty as tty
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.spec
import spack.util.path
import spack.version
from spack.cmd.common import arguments
from spack.error import SpackError
description = "add a spec to an environment's dev-build information"

View File

@@ -10,11 +10,11 @@
from llnl.util.tty.color import cprint, get_color_when
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.environment as ev
import spack.solver.asp as asp
import spack.util.environment
import spack.util.spack_json as sjson
from spack.cmd.common import arguments
description = "compare two specs"
section = "basic"

View File

@@ -20,7 +20,6 @@
import spack.cmd
import spack.cmd.common
import spack.cmd.common.arguments
import spack.cmd.common.arguments as arguments
import spack.cmd.install
import spack.cmd.modules
import spack.cmd.uninstall
@@ -31,6 +30,7 @@
import spack.schema.env
import spack.spec
import spack.tengine
from spack.cmd.common import arguments
from spack.util.environment import EnvironmentModifications
description = "manage virtual environments"

View File

@@ -10,10 +10,10 @@
from llnl.util.tty.colify import colify
import spack.cmd as cmd
import spack.cmd.common.arguments as arguments
import spack.environment as ev
import spack.repo
import spack.store
from spack.cmd.common import arguments
description = "list extensions for package"
section = "extensions"

View File

@@ -14,12 +14,12 @@
import spack
import spack.cmd
import spack.cmd.common.arguments
import spack.config
import spack.cray_manifest as cray_manifest
import spack.detection
import spack.error
import spack.util.environment
from spack.cmd.common import arguments
description = "manage external packages in Spack configuration"
section = "config"
@@ -29,8 +29,6 @@
def setup_parser(subparser):
sp = subparser.add_subparsers(metavar="SUBCOMMAND", dest="external_command")
scopes = spack.config.scopes()
find_parser = sp.add_parser("find", help="add external packages to packages.yaml")
find_parser.add_argument(
"--not-buildable",
@@ -48,15 +46,14 @@ def setup_parser(subparser):
)
find_parser.add_argument(
"--scope",
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_modify_scope("packages"),
action=arguments.ConfigScope,
default=lambda: spack.config.default_modify_scope("packages"),
help="configuration scope to modify",
)
find_parser.add_argument(
"--all", action="store_true", help="search for all packages that Spack knows about"
)
spack.cmd.common.arguments.add_common_arguments(find_parser, ["tags", "jobs"])
arguments.add_common_arguments(find_parser, ["tags", "jobs"])
find_parser.add_argument("packages", nargs=argparse.REMAINDER)
find_parser.epilog = (
'The search is by default on packages tagged with the "build-tools" or '

View File

@@ -6,11 +6,11 @@
import llnl.util.tty as tty
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.config
import spack.environment as ev
import spack.repo
import spack.traverse
from spack.cmd.common import arguments
description = "fetch archives for packages"
section = "build"

View File

@@ -12,9 +12,9 @@
import spack.bootstrap
import spack.cmd as cmd
import spack.cmd.common.arguments as arguments
import spack.environment as ev
import spack.repo
from spack.cmd.common import arguments
from spack.database import InstallStatuses
description = "list and search installed packages"

View File

@@ -7,11 +7,11 @@
import os
import spack.binary_distribution
import spack.cmd.common.arguments as arguments
import spack.mirror
import spack.paths
import spack.util.gpg
import spack.util.url
from spack.cmd.common import arguments
description = "handle GPG actions for spack"
section = "packaging"

View File

@@ -5,10 +5,10 @@
from llnl.util import tty
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.config
import spack.environment as ev
import spack.store
from spack.cmd.common import arguments
from spack.graph import DAGWithDependencyTypes, SimpleDAG, graph_ascii, graph_dot, static_graph_dot
description = "generate graphs of package dependency relationships"

View File

@@ -11,13 +11,13 @@
import llnl.util.tty.color as color
from llnl.util.tty.colify import colify
import spack.cmd.common.arguments as arguments
import spack.deptypes as dt
import spack.fetch_strategy as fs
import spack.install_test
import spack.repo
import spack.spec
import spack.version
from spack.cmd.common import arguments
from spack.package_base import preferred_version
description = "get detailed information on a particular package"
@@ -327,7 +327,7 @@ def _variants_by_name_when(pkg):
"""Adaptor to get variants keyed by { name: { when: { [Variant...] } }."""
# TODO: replace with pkg.variants_by_name(when=True) when unified directive dicts are merged.
variants = {}
for name, (variant, whens) in pkg.variants.items():
for name, (variant, whens) in sorted(pkg.variants.items()):
for when in whens:
variants.setdefault(name, {}).setdefault(when, []).append(variant)
return variants

View File

@@ -14,7 +14,6 @@
import spack.build_environment
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.config
import spack.environment as ev
import spack.fetch_strategy
@@ -23,6 +22,7 @@
import spack.report
import spack.spec
import spack.store
from spack.cmd.common import arguments
from spack.error import SpackError
from spack.installer import PackageInstaller

View File

@@ -15,9 +15,9 @@
import llnl.util.tty as tty
from llnl.util.tty.colify import colify
import spack.cmd.common.arguments as arguments
import spack.deptypes as dt
import spack.repo
from spack.cmd.common import arguments
from spack.version import VersionList
description = "list and search available packages"

View File

@@ -8,12 +8,12 @@
import llnl.util.tty as tty
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.cmd.find
import spack.environment as ev
import spack.store
import spack.user_environment as uenv
import spack.util.environment
from spack.cmd.common import arguments
description = "add package to the user environment"
section = "user environment"

View File

@@ -9,11 +9,11 @@
import spack.builder
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.environment as ev
import spack.paths
import spack.repo
import spack.stage
from spack.cmd.common import arguments
description = "print out locations of packages and spack directories"
section = "basic"

View File

@@ -8,11 +8,11 @@
from llnl.util import tty
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.error
import spack.package_base
import spack.repo
import spack.store
from spack.cmd.common import arguments
from spack.database import InstallStatuses
description = "mark packages as explicitly or implicitly installed"

View File

@@ -11,7 +11,6 @@
import spack.caches
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.concretize
import spack.config
import spack.environment as ev
@@ -20,6 +19,7 @@
import spack.spec
import spack.util.path
import spack.util.web as web_util
from spack.cmd.common import arguments
from spack.error import SpackError
description = "manage mirrors (source and binary)"
@@ -88,18 +88,14 @@ def setup_parser(subparser):
"--mirror-url", metavar="mirror_url", type=str, help="find mirror to destroy by url"
)
# used to construct scope arguments below
scopes = spack.config.scopes()
# Add
add_parser = sp.add_parser("add", help=mirror_add.__doc__)
add_parser.add_argument("name", help="mnemonic name for mirror", metavar="mirror")
add_parser.add_argument("url", help="url of mirror directory from 'spack mirror create'")
add_parser.add_argument(
"--scope",
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_modify_scope(),
action=arguments.ConfigScope,
default=lambda: spack.config.default_modify_scope(),
help="configuration scope to modify",
)
add_parser.add_argument(
@@ -117,9 +113,8 @@ def setup_parser(subparser):
remove_parser.add_argument("name", help="mnemonic name for mirror", metavar="mirror")
remove_parser.add_argument(
"--scope",
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_modify_scope(),
action=arguments.ConfigScope,
default=lambda: spack.config.default_modify_scope(),
help="configuration scope to modify",
)
@@ -136,9 +131,8 @@ def setup_parser(subparser):
)
set_url_parser.add_argument(
"--scope",
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_modify_scope(),
action=arguments.ConfigScope,
default=lambda: spack.config.default_modify_scope(),
help="configuration scope to modify",
)
arguments.add_connection_args(set_url_parser, False)
@@ -165,9 +159,8 @@ def setup_parser(subparser):
set_parser.add_argument("--url", help="url of mirror directory from 'spack mirror create'")
set_parser.add_argument(
"--scope",
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_modify_scope(),
action=arguments.ConfigScope,
default=lambda: spack.config.default_modify_scope(),
help="configuration scope to modify",
)
arguments.add_connection_args(set_parser, False)
@@ -176,9 +169,8 @@ def setup_parser(subparser):
list_parser = sp.add_parser("list", help=mirror_list.__doc__)
list_parser.add_argument(
"--scope",
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_list_scope(),
action=arguments.ConfigScope,
default=lambda: spack.config.default_list_scope(),
help="configuration scope to read from",
)

View File

@@ -14,11 +14,11 @@
from llnl.util.tty import color
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.config
import spack.modules
import spack.modules.common
import spack.repo
from spack.cmd.common import arguments
description = "manipulate module files"
section = "environment"

View File

@@ -6,12 +6,12 @@
import llnl.util.tty as tty
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.config
import spack.environment as ev
import spack.package_base
import spack.repo
import spack.traverse
from spack.cmd.common import arguments
description = "patch expanded archive sources in preparation for install"
section = "build"

View File

@@ -12,11 +12,11 @@
from llnl.util.tty.colify import colify
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.paths
import spack.repo
import spack.util.executable as exe
import spack.util.package_hash as ph
from spack.cmd.common import arguments
description = "query packages associated with particular git revisions"
section = "developer"

View File

@@ -6,7 +6,7 @@
import llnl.util.tty as tty
import spack.cmd
import spack.cmd.common.arguments as arguments
from spack.cmd.common import arguments
description = "remove specs from an environment"
section = "environments"

View File

@@ -11,6 +11,7 @@
import spack.config
import spack.repo
import spack.util.path
from spack.cmd.common import arguments
description = "manage package source repositories"
section = "config"
@@ -19,7 +20,6 @@
def setup_parser(subparser):
sp = subparser.add_subparsers(metavar="SUBCOMMAND", dest="repo_command")
scopes = spack.config.scopes()
# Create
create_parser = sp.add_parser("create", help=repo_create.__doc__)
@@ -43,9 +43,8 @@ def setup_parser(subparser):
list_parser = sp.add_parser("list", help=repo_list.__doc__)
list_parser.add_argument(
"--scope",
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_list_scope(),
action=arguments.ConfigScope,
default=lambda: spack.config.default_list_scope(),
help="configuration scope to read from",
)
@@ -54,9 +53,8 @@ def setup_parser(subparser):
add_parser.add_argument("path", help="path to a Spack package repository directory")
add_parser.add_argument(
"--scope",
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_modify_scope(),
action=arguments.ConfigScope,
default=lambda: spack.config.default_modify_scope(),
help="configuration scope to modify",
)
@@ -67,9 +65,8 @@ def setup_parser(subparser):
)
remove_parser.add_argument(
"--scope",
choices=scopes,
metavar=spack.config.SCOPES_METAVAR,
default=spack.config.default_modify_scope(),
action=arguments.ConfigScope,
default=lambda: spack.config.default_modify_scope(),
help="configuration scope to modify",
)

View File

@@ -6,8 +6,8 @@
import llnl.util.tty as tty
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.repo
from spack.cmd.common import arguments
description = "revert checked out package source code"
section = "build"

View File

@@ -12,12 +12,12 @@
import spack
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.config
import spack.environment
import spack.hash_types as ht
import spack.package_base
import spack.solver.asp as asp
from spack.cmd.common import arguments
description = "concretize a spec using an ASP solver"
section = "developer"

View File

@@ -10,11 +10,11 @@
import spack
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.environment as ev
import spack.hash_types as ht
import spack.spec
import spack.store
from spack.cmd.common import arguments
description = "show what would be installed, given a spec"
section = "build"

View File

@@ -8,13 +8,13 @@
import llnl.util.tty as tty
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.config
import spack.environment as ev
import spack.package_base
import spack.repo
import spack.stage
import spack.traverse
from spack.cmd.common import arguments
description = "expand downloaded archive in preparation for install"
section = "build"

View File

@@ -15,12 +15,12 @@
from llnl.util.tty import colify
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.environment as ev
import spack.install_test
import spack.package_base
import spack.repo
import spack.report
from spack.cmd.common import arguments
description = "run spack's tests for an install"
section = "admin"

View File

@@ -10,11 +10,11 @@
from llnl.util.filesystem import working_dir
import spack
import spack.cmd.common.arguments as arguments
import spack.config
import spack.paths
import spack.util.git
import spack.util.gpg
from spack.cmd.common import arguments
from spack.util.spack_yaml import syaml_dict
description = "set up spack for our tutorial (WARNING: modifies config!)"

View File

@@ -6,7 +6,7 @@
import llnl.util.tty as tty
import spack.cmd
import spack.cmd.common.arguments as arguments
from spack.cmd.common import arguments
description = "remove specs from an environment"
section = "environments"

View File

@@ -10,13 +10,13 @@
from llnl.util.tty.colify import colify
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.cmd.common.confirmation as confirmation
import spack.environment as ev
import spack.package_base
import spack.spec
import spack.store
import spack.traverse as traverse
from spack.cmd.common import arguments
from spack.database import InstallStatuses
description = "remove installed packages"

View File

@@ -227,9 +227,7 @@ def unit_test(parser, args, unknown_args):
# has been used, then test that extension.
pytest_root = spack.paths.spack_root
if args.extension:
target = args.extension
extensions = spack.extensions.get_extension_paths()
pytest_root = spack.extensions.path_for_extension(target, *extensions)
pytest_root = spack.extensions.load_extension(args.extension)
# pytest.ini lives in the root of the spack repository.
with llnl.util.filesystem.working_dir(pytest_root):

View File

@@ -7,10 +7,10 @@
import sys
import spack.cmd
import spack.cmd.common.arguments as arguments
import spack.error
import spack.user_environment as uenv
import spack.util.environment
from spack.cmd.common import arguments
description = "remove package from the user environment"
section = "user environment"

View File

@@ -8,9 +8,9 @@
import llnl.util.tty as tty
from llnl.util.tty.colify import colify
import spack.cmd.common.arguments as arguments
import spack.repo
import spack.spec
from spack.cmd.common import arguments
from spack.version import infinity_versions, ver
description = "list available versions of a package"

View File

@@ -334,40 +334,6 @@ def __init__(
# used for version checks for API, e.g. C++11 flag
self._real_version = None
def __eq__(self, other):
return (
self.cc == other.cc
and self.cxx == other.cxx
and self.fc == other.fc
and self.f77 == other.f77
and self.spec == other.spec
and self.operating_system == other.operating_system
and self.target == other.target
and self.flags == other.flags
and self.modules == other.modules
and self.environment == other.environment
and self.extra_rpaths == other.extra_rpaths
and self.enable_implicit_rpaths == other.enable_implicit_rpaths
)
def __hash__(self):
return hash(
(
self.cc,
self.cxx,
self.fc,
self.f77,
self.spec,
self.operating_system,
self.target,
str(self.flags),
str(self.modules),
str(self.environment),
str(self.extra_rpaths),
self.enable_implicit_rpaths,
)
)
def verify_executables(self):
"""Raise an error if any of the compiler executables is not valid.

View File

@@ -109,7 +109,7 @@ def _to_dict(compiler):
return {"compiler": d}
def get_compiler_config(scope=None, init_config=False):
def get_compiler_config(scope=None, init_config=True):
"""Return the compiler configuration for the specified architecture."""
config = spack.config.get("compilers", scope=scope) or []
@@ -118,8 +118,6 @@ def get_compiler_config(scope=None, init_config=False):
merged_config = spack.config.get("compilers")
if merged_config:
# Config is empty for this scope
# Do not init config because there is a non-empty scope
return config
_init_compiler_config(scope=scope)
@@ -127,105 +125,6 @@ def get_compiler_config(scope=None, init_config=False):
return config
def get_compiler_config_from_packages(scope=None):
"""Return the compiler configuration from packages.yaml"""
config = spack.config.get("packages", scope=scope)
if not config:
return []
packages = []
compiler_package_names = supported_compilers() + list(package_name_to_compiler_name.keys())
for name, entry in config.items():
if name not in compiler_package_names:
continue
externals_config = entry.get("externals", None)
if not externals_config:
continue
packages.extend(_compiler_config_from_package_config(externals_config))
return packages
def _compiler_config_from_package_config(config):
compilers = []
for entry in config:
compiler = _compiler_config_from_external(entry)
if compiler:
compilers.append(compiler)
return compilers
def _compiler_config_from_external(config):
spec = spack.spec.parse_with_version_concrete(config["spec"])
# use str(spec.versions) to allow `@x.y.z` instead of `@=x.y.z`
compiler_spec = spack.spec.CompilerSpec(
package_name_to_compiler_name.get(spec.name, spec.name), spec.version
)
extra_attributes = config.get("extra_attributes", {})
prefix = config.get("prefix", None)
compiler_class = class_for_compiler_name(compiler_spec.name)
paths = extra_attributes.get("compilers", {})
# compilers format has cc/fc/f77, externals format has "c/fortran"
if "c" in paths:
paths["cc"] = paths.pop("c")
if "fortran" in paths:
fc = paths.pop("fortran")
paths["fc"] = fc
if "f77" not in paths:
paths["f77"] = fc
compiler_langs = ["cc", "cxx", "fc", "f77"]
for lang in compiler_langs:
if paths.setdefault(lang, None):
continue
if not prefix:
continue
# Check for files that satisfy the naming scheme for this compiler
bindir = os.path.join(prefix, "bin")
for f, regex in itertools.product(os.listdir(bindir), compiler_class.search_regexps(lang)):
if regex.match(f):
paths[lang] = os.path.join(bindir, f)
if all(v is None for v in paths.values()):
return None
if not spec.architecture:
host_platform = spack.platforms.host()
operating_system = host_platform.operating_system("default_os")
target = host_platform.target("default_target").microarchitecture
else:
target = spec.target
if not target:
host_platform = spack.platforms.host()
target = host_platform.target("default_target").microarchitecture
operating_system = spec.os
if not operating_system:
host_platform = spack.platforms.host()
operating_system = host_platform.operating_system("default_os")
compiler_entry = {
"compiler": {
"spec": str(compiler_spec),
"paths": paths,
"flags": extra_attributes.get("flags", {}),
"operating_system": str(operating_system),
"target": str(target.family),
"modules": config.get("modules", []),
"environment": extra_attributes.get("environment", {}),
"extra_rpaths": extra_attributes.get("extra_rpaths", []),
"implicit_rpaths": extra_attributes.get("implicit_rpaths", None),
}
}
return compiler_entry
def _init_compiler_config(*, scope):
"""Compiler search used when Spack has no compilers."""
compilers = find_compilers()
@@ -243,20 +142,17 @@ def compiler_config_files():
compiler_config = config.get("compilers", scope=name)
if compiler_config:
config_files.append(config.get_config_filename(name, "compilers"))
compiler_config_from_packages = get_compiler_config_from_packages(scope=name)
if compiler_config_from_packages:
config_files.append(config.get_config_filename(name, "packages"))
return config_files
def add_compilers_to_config(compilers, scope=None):
def add_compilers_to_config(compilers, scope=None, init_config=True):
"""Add compilers to the config for the specified architecture.
Arguments:
compilers: a list of Compiler objects.
scope: configuration scope to modify.
"""
compiler_config = get_compiler_config(scope, init_config=False)
compiler_config = get_compiler_config(scope, init_config)
for compiler in compilers:
if not compiler.cc:
tty.debug(f"{compiler.spec} does not have a C compiler")
@@ -288,9 +184,6 @@ def remove_compiler_from_config(compiler_spec, scope=None):
for current_scope in candidate_scopes:
removal_happened |= _remove_compiler_from_scope(compiler_spec, scope=current_scope)
msg = "`spack compiler remove` will not remove compilers defined in packages.yaml"
msg += "\nTo remove these compilers, either edit the config or use `spack external remove`"
tty.debug(msg)
return removal_happened
@@ -305,7 +198,7 @@ def _remove_compiler_from_scope(compiler_spec, scope):
True if one or more compiler entries were actually removed, False otherwise
"""
assert scope is not None, "a specific scope is needed when calling this function"
compiler_config = get_compiler_config(scope, init_config=False)
compiler_config = get_compiler_config(scope)
filtered_compiler_config = [
compiler_entry
for compiler_entry in compiler_config
@@ -328,14 +221,7 @@ def all_compilers_config(scope=None, init_config=True):
"""Return a set of specs for all the compiler versions currently
available to build with. These are instances of CompilerSpec.
"""
from_packages_yaml = get_compiler_config_from_packages(scope)
if from_packages_yaml:
init_config = False
from_compilers_yaml = get_compiler_config(scope, init_config)
result = from_compilers_yaml + from_packages_yaml
key = lambda c: _compiler_from_config_entry(c["compiler"])
return list(llnl.util.lang.dedupe(result, key=key))
return get_compiler_config(scope, init_config)
def all_compiler_specs(scope=None, init_config=True):
@@ -502,7 +388,7 @@ def find_specs_by_arch(compiler_spec, arch_spec, scope=None, init_config=True):
def all_compilers(scope=None, init_config=True):
config = all_compilers_config(scope, init_config=init_config)
config = get_compiler_config(scope, init_config=init_config)
compilers = list()
for items in config:
items = items["compiler"]
@@ -517,7 +403,10 @@ def compilers_for_spec(
"""This gets all compilers that satisfy the supplied CompilerSpec.
Returns an empty list if none are found.
"""
config = all_compilers_config(scope, init_config)
if use_cache:
config = all_compilers_config(scope, init_config)
else:
config = get_compiler_config(scope, init_config)
matches = set(find(compiler_spec, scope, init_config))
compilers = []
@@ -693,7 +582,9 @@ def get_compiler_duplicates(compiler_spec, arch_spec):
scope_to_compilers = {}
for scope in config.scopes:
compilers = compilers_for_spec(compiler_spec, arch_spec=arch_spec, scope=scope)
compilers = compilers_for_spec(
compiler_spec, arch_spec=arch_spec, scope=scope, use_cache=False
)
if compilers:
scope_to_compilers[scope] = compilers

View File
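
The `_compiler_config_from_external` hunk above maps an `externals` entry from packages.yaml onto the cc/cxx/fc/f77 paths used in compilers.yaml. A minimal standalone sketch of just that key mapping, using a hypothetical GCC entry (the version and paths are made up):

    external = {
        "spec": "gcc@12.3.0",
        "prefix": "/usr",
        "extra_attributes": {
            "compilers": {"c": "/usr/bin/gcc", "cxx": "/usr/bin/g++", "fortran": "/usr/bin/gfortran"}
        },
    }

    paths = dict(external["extra_attributes"]["compilers"])
    if "c" in paths:
        paths["cc"] = paths.pop("c")
    if "fortran" in paths:
        fc = paths.pop("fortran")
        paths["fc"] = fc
        paths.setdefault("f77", fc)

    # paths -> {'cxx': '/usr/bin/g++', 'cc': '/usr/bin/gcc', 'fc': '/usr/bin/gfortran', 'f77': '/usr/bin/gfortran'}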

@@ -20,16 +20,16 @@ def __init__(self, *args, **kwargs):
self.version_argument = "-V"
# Subclasses use possible names of C compiler
cc_names = ["craycc", "cc"]
cc_names = ["craycc"]
# Subclasses use possible names of C++ compiler
cxx_names = ["crayCC", "CC"]
cxx_names = ["crayCC"]
# Subclasses use possible names of Fortran 77 compiler
f77_names = ["crayftn", "ftn"]
f77_names = ["crayftn"]
# Subclasses use possible names of Fortran 90 compiler
fc_names = ["crayftn", "ftn"]
fc_names = ["crayftn"]
# MacPorts builds gcc versions with prefixes and -mp-X.Y suffixes.
suffixes = [r"-mp-\d\.\d"]

View File

@@ -227,7 +227,7 @@ def read(path, apply_updates):
if apply_updates and compilers:
for compiler in compilers:
try:
spack.compilers.add_compilers_to_config([compiler])
spack.compilers.add_compilers_to_config([compiler], init_config=False)
except Exception:
warnings.warn(
f"Could not add compiler {str(compiler.spec)}: "

View File

@@ -309,10 +309,14 @@ def find_windows_kit_roots() -> List[str]:
return glob.glob(kit_base)
@staticmethod
def find_windows_kit_bin_paths(kit_base: Optional[str] = None) -> List[str]:
def find_windows_kit_bin_paths(
kit_base: Union[Optional[str], Optional[list]] = None
) -> List[str]:
"""Returns Windows kit bin directory per version"""
kit_base = WindowsKitExternalPaths.find_windows_kit_roots() if not kit_base else kit_base
assert kit_base, "Unexpectedly empty value for Windows kit base path"
if isinstance(kit_base, str):
kit_base = kit_base.split(";")
kit_paths = []
for kit in kit_base:
kit_bin = os.path.join(kit, "bin")
@@ -320,10 +324,14 @@ def find_windows_kit_bin_paths(kit_base: Optional[str] = None) -> List[str]:
return kit_paths
@staticmethod
def find_windows_kit_lib_paths(kit_base: Optional[str] = None) -> List[str]:
def find_windows_kit_lib_paths(
kit_base: Union[Optional[str], Optional[list]] = None
) -> List[str]:
"""Returns Windows kit lib directory per version"""
kit_base = WindowsKitExternalPaths.find_windows_kit_roots() if not kit_base else kit_base
assert kit_base, "Unexpectedly empty value for Windows kit base path"
if isinstance(kit_base, str):
kit_base = kit_base.split(";")
kit_paths = []
for kit in kit_base:
kit_lib = os.path.join(kit, "Lib")

View File
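
For illustration, the normalization added above lets a semicolon-separated string of Windows kit roots (hypothetical paths below) be handled the same way as a list before the bin/Lib subdirectories are collected:

    kit_base = r"C:\Kits\10;C:\Kits\8.1"  # made-up roots; the argument may also arrive as a list
    if isinstance(kit_base, str):
        kit_base = kit_base.split(";")
    # kit_base is now ['C:\\Kits\\10', 'C:\\Kits\\8.1']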

@@ -1554,7 +1554,7 @@ def _concretize_separately(self, tests=False):
# Ensure we have compilers in compilers.yaml so that concurrent
# processes do not try to write the config file in parallel

_ = spack.compilers.get_compiler_config(init_config=True)
_ = spack.compilers.get_compiler_config()
# Early return if there is nothing to do
if len(args) == 0:

View File

@@ -6,11 +6,13 @@
for Spack's command extensions.
"""
import difflib
import glob
import importlib
import os
import re
import sys
import types
from typing import List
import llnl.util.lang
@@ -75,6 +77,15 @@ def load_command_extension(command, path):
if not os.path.exists(cmd_path):
return None
ensure_extension_loaded(extension, path=path)
module = importlib.import_module(module_name)
sys.modules[module_name] = module
return module
def ensure_extension_loaded(extension, *, path):
def ensure_package_creation(name):
package_name = "{0}.{1}".format(__name__, name)
if package_name in sys.modules:
@@ -100,10 +111,22 @@ def ensure_package_creation(name):
ensure_package_creation(extension)
ensure_package_creation(extension + ".cmd")
module = importlib.import_module(module_name)
sys.modules[module_name] = module
return module
def load_extension(name: str) -> str:
"""Loads a single extension into the 'spack.extensions' package.
Args:
name: name of the extension
"""
extension_root = path_for_extension(name, paths=get_extension_paths())
ensure_extension_loaded(name, path=extension_root)
commands = glob.glob(
os.path.join(extension_root, extension_name(extension_root), "cmd", "*.py")
)
commands = [os.path.splitext(os.path.basename(x))[0] for x in commands]
for command in commands:
load_command_extension(command, extension_root)
return extension_root
def get_extension_paths():
@@ -125,7 +148,7 @@ def get_command_paths():
return command_paths
def path_for_extension(target_name, *paths):
def path_for_extension(target_name: str, *, paths: List[str]) -> str:
"""Return the test root dir for a given extension.
Args:

View File
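
A hypothetical usage sketch for the new `load_extension` entry point, assuming Spack is importable and an extension named "myext" is registered under the extensions configuration (the name is made up):

    import spack.extensions

    root = spack.extensions.load_extension("myext")  # imports the extension's command modules
    print(root)  # root path of the extension that was loaded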

@@ -357,7 +357,8 @@ def _print_installed_pkg(message: str) -> None:
Args:
message (str): message to be output
"""
print(colorize("@*g{[+]} ") + spack.util.path.debug_padded_filter(message))
if tty.msg_enabled():
print(colorize("@*g{[+]} ") + spack.util.path.debug_padded_filter(message))
def print_install_test_log(pkg: "spack.package_base.PackageBase") -> None:
@@ -2007,7 +2008,9 @@ def install(self) -> None:
# Only enable the terminal status line when we're in a tty without debug info
# enabled, so that the output does not get cluttered.
term_status = TermStatusLine(enabled=sys.stdout.isatty() and not tty.is_debug())
term_status = TermStatusLine(
enabled=sys.stdout.isatty() and tty.msg_enabled() and not tty.is_debug()
)
while self.build_pq:
task = self._pop_task()

View File

@@ -5,11 +5,20 @@
from ._operating_system import OperatingSystem
from .cray_backend import CrayBackend
from .cray_frontend import CrayFrontend
from .freebsd import FreeBSDOs
from .linux_distro import LinuxDistro
from .mac_os import MacOs
from .windows_os import WindowsOs
__all__ = ["OperatingSystem", "LinuxDistro", "MacOs", "CrayFrontend", "CrayBackend", "WindowsOs"]
__all__ = [
"OperatingSystem",
"LinuxDistro",
"MacOs",
"CrayFrontend",
"CrayBackend",
"WindowsOs",
"FreeBSDOs",
]
#: List of all the Operating Systems known to Spack
operating_systems = [LinuxDistro, MacOs, CrayFrontend, CrayBackend, WindowsOs]
operating_systems = [LinuxDistro, MacOs, CrayFrontend, CrayBackend, WindowsOs, FreeBSDOs]

View File

@@ -0,0 +1,15 @@
# Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import platform as py_platform
from spack.version import Version
from ._operating_system import OperatingSystem
class FreeBSDOs(OperatingSystem):
def __init__(self):
release = py_platform.release().split("-", 1)[0]
super().__init__("freebsd", Version(release))

View File
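
As a worked example of the version parsing in `FreeBSDOs.__init__`, a typical `platform.release()` string on FreeBSD looks like "13.2-RELEASE" (value assumed here), which reduces to the numeric part:

    release = "13.2-RELEASE".split("-", 1)[0]
    # release == "13.2", which becomes Version("13.2") for the "freebsd" operating system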

@@ -206,11 +206,15 @@ def tokenize(text: str) -> Iterator[Token]:
scanner = ALL_TOKENS.scanner(text) # type: ignore[attr-defined]
match: Optional[Match] = None
for match in iter(scanner.match, None):
# The following two assertions are to help mypy
msg = (
"unexpected value encountered during parsing. Please submit a bug report "
"at https://github.com/spack/spack/issues/new/choose"
)
assert match is not None, msg
assert match.lastgroup is not None, msg
yield Token(
TokenType.__members__[match.lastgroup], # type: ignore[attr-defined]
match.group(), # type: ignore[attr-defined]
match.start(), # type: ignore[attr-defined]
match.end(), # type: ignore[attr-defined]
TokenType.__members__[match.lastgroup], match.group(), match.start(), match.end()
)
if match is None and not text:

View File

@@ -8,6 +8,7 @@
from ._platform import Platform
from .cray import Cray
from .darwin import Darwin
from .freebsd import FreeBSD
from .linux import Linux
from .test import Test
from .windows import Windows
@@ -17,6 +18,7 @@
"Cray",
"Darwin",
"Linux",
"FreeBSD",
"Test",
"Windows",
"platforms",

View File

@@ -10,12 +10,13 @@
from .cray import Cray
from .darwin import Darwin
from .freebsd import FreeBSD
from .linux import Linux
from .test import Test
from .windows import Windows
#: List of all the platform classes known to Spack
platforms = [Cray, Darwin, Linux, Windows, Test]
platforms = [Cray, Darwin, Linux, Windows, FreeBSD, Test]
@llnl.util.lang.memoized

View File

@@ -0,0 +1,37 @@
# Copyright 2013-2023 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import platform
import archspec.cpu
import spack.target
from spack.operating_systems.freebsd import FreeBSDOs
from ._platform import Platform
class FreeBSD(Platform):
priority = 102
def __init__(self):
super().__init__("freebsd")
for name in archspec.cpu.TARGETS:
self.add_target(name, spack.target.Target(name))
# Get specific default
self.default = archspec.cpu.host().name
self.front_end = self.default
self.back_end = self.default
os = FreeBSDOs()
self.default_os = str(os)
self.front_os = self.default_os
self.back_os = self.default_os
self.add_operating_system(str(os), os)
@classmethod
def detect(cls):
return platform.system().lower() == "freebsd"

View File
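
The `detect` classmethod above keys off `platform.system()`; a quick standalone check of the same condition (the result depends on the host it runs on):

    import platform

    # Same condition used by FreeBSD.detect() above; True only on a FreeBSD host.
    if platform.system().lower() == "freebsd":
        print("FreeBSD platform would be selected")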

@@ -92,11 +92,12 @@ def __init__(self, configuration: CDashConfiguration):
self.osname = platform.system()
self.osrelease = platform.release()
self.target = spack.platforms.host().target("default_target")
self.endtime = int(time.time())
self.starttime = int(time.time())
self.endtime = self.starttime
self.buildstamp = (
configuration.buildstamp
if configuration.buildstamp
else build_stamp(configuration.track, self.endtime)
else build_stamp(configuration.track, self.starttime)
)
self.buildIds: Dict[str, str] = {}
self.revision = ""
@@ -125,7 +126,7 @@ def build_report_for_package(self, report_dir, package, duration):
report_data[phase] = {}
report_data[phase]["loglines"] = []
report_data[phase]["status"] = 0
report_data[phase]["endtime"] = self.endtime
report_data[phase]["starttime"] = self.starttime
# Track the phases we perform so we know what reports to create.
# We always report the update step because this is how we tell CDash
@@ -153,6 +154,25 @@ def build_report_for_package(self, report_dir, package, duration):
elif cdash_phase:
report_data[cdash_phase]["loglines"].append(xml.sax.saxutils.escape(line))
# Something went wrong before the CDash "configure" phase because we have an
# exception and only "update" was encountered.
# Dump the report into the configure phase so teams can see what the issue is
if len(phases_encountered) == 1 and package["exception"]:
# TODO: this mapping is not ideal since these are pre-configure errors;
# we need to determine whether a more appropriate CDash phase can be utilized.
# For now we add a message to the log explaining this
cdash_phase = "configure"
phases_encountered.append(cdash_phase)
log_message = (
"Pre-configure errors occured in Spack's process that terminated the "
"build process prematurely.\nSpack output::\n{0}".format(
xml.sax.saxutils.escape(package["exception"])
)
)
report_data[cdash_phase]["loglines"].append(log_message)
# Move the build phase to the front of the list if it occurred.
# This supports older versions of CDash that expect this phase
# to be reported before all others.
@@ -160,9 +180,9 @@ def build_report_for_package(self, report_dir, package, duration):
build_pos = phases_encountered.index("build")
phases_encountered.insert(0, phases_encountered.pop(build_pos))
self.starttime = self.endtime - duration
self.endtime = self.starttime + duration
for phase in phases_encountered:
report_data[phase]["starttime"] = self.starttime
report_data[phase]["endtime"] = self.endtime
report_data[phase]["log"] = "\n".join(report_data[phase]["loglines"])
errors, warnings = parse_log_events(report_data[phase]["loglines"])
@@ -309,7 +329,7 @@ def test_report_for_package(self, report_dir, package, duration):
self.buildname = "{0}-{1}".format(self.current_package_name, package["id"])
else:
self.buildname = self.report_build_name(self.current_package_name)
self.starttime = self.endtime - duration
self.endtime = self.starttime + duration
report_data = self.initialize_report(report_dir)
report_data["hostname"] = socket.gethostname()
@@ -354,7 +374,7 @@ def concretization_report(self, report_dir, msg):
self.buildname = self.base_buildname
report_data = self.initialize_report(report_dir)
report_data["update"] = {}
report_data["update"]["starttime"] = self.endtime
report_data["update"]["starttime"] = self.starttime
report_data["update"]["endtime"] = self.endtime
report_data["update"]["revision"] = self.revision
report_data["update"]["log"] = msg

View File
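
A small arithmetic sketch of the corrected timestamp bookkeeping (values are invented): the reporter records `starttime` once at construction and derives `endtime` from the measured duration, rather than back-dating `starttime` from an `endtime` captured too early:

    import time

    starttime = int(time.time())    # captured when the reporter is created, before installs run
    duration = 42                   # seconds spent on a package (hypothetical)
    endtime = starttime + duration  # what gets reported as the phase's end time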

@@ -3134,6 +3134,25 @@ def _develop_specs_from_env(spec, env):
spec.constrain(dev_info["spec"])
def _is_reusable_external(packages, spec: spack.spec.Spec) -> bool:
"""Returns true iff spec is an external that can be reused.
Arguments:
packages: the packages configuration
spec: the spec to check
"""
for name in {spec.name, *(p.name for p in spec.package.provided)}:
for entry in packages.get(name, {}).get("externals", []):
if (
spec.satisfies(entry["spec"])
and spec.external_path == entry.get("prefix")
and spec.external_modules == entry.get("modules")
):
return True
return False
class Solver:
"""This is the main external interface class for solving.
@@ -3181,8 +3200,18 @@ def _reusable_specs(self, specs):
# Specs from buildcaches
try:
index = spack.binary_distribution.update_cache_and_get_specs()
reusable_specs.extend(index)
# Specs in a build cache that depend on externals are reusable as long as local
# config has matching externals. This should guard against picking up binaries
# linked against externals not available locally, while still supporting the use
# case of distributing binaries across machines with similar externals.
packages = spack.config.get("packages")
reusable_specs.extend(
[
s
for s in spack.binary_distribution.update_cache_and_get_specs()
if not s.external or _is_reusable_external(packages, s)
]
)
except (spack.binary_distribution.FetchCacheError, IndexError):
# this is raised when no mirrors had indices.
# TODO: update mirror configuration so it can indicate that the

View File
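
A simplified, self-contained sketch of the reuse guard above; the packages dict mimics a packages.yaml `externals` entry, and exact string comparison stands in for `Spec.satisfies` (the package name and paths are made up):

    packages = {
        "openmpi": {"externals": [{"spec": "openmpi@4.1.5", "prefix": "/opt/openmpi"}]}
    }

    def is_reusable_external(packages, name, spec_str, prefix, modules=None):
        for entry in packages.get(name, {}).get("externals", []):
            if (
                spec_str == entry["spec"]          # crude stand-in for spec.satisfies(...)
                and prefix == entry.get("prefix")
                and modules == entry.get("modules")
            ):
                return True
        return False

    print(is_reusable_external(packages, "openmpi", "openmpi@4.1.5", "/opt/openmpi"))  # True
    print(is_reusable_external(packages, "openmpi", "openmpi@4.1.5", "/usr/local"))    # False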

@@ -20,7 +20,7 @@
% Integrity constraints on DAG nodes
:- attr("root", PackageNode), not attr("node", PackageNode).
:- attr("version", PackageNode, _), not attr("node", PackageNode), not attr("virtual_node", PackageNode).
:- attr("node_version_satisfies", PackageNode), not attr("node", PackageNode).
:- attr("node_version_satisfies", PackageNode, _), not attr("node", PackageNode), not attr("virtual_node", PackageNode).
:- attr("hash", PackageNode, _), not attr("node", PackageNode).
:- attr("node_platform", PackageNode, _), not attr("node", PackageNode).
:- attr("node_os", PackageNode, _), not attr("node", PackageNode).

View File

@@ -30,6 +30,8 @@ def current_host_platform():
current_platform = spack.platforms.Darwin()
elif "Windows" in platform.system():
current_platform = spack.platforms.Windows()
elif "FreeBSD" in platform.system():
current_platform = spack.platforms.FreeBSD()
return current_platform

Some files were not shown because too many files have changed in this diff.