Compare commits

..

156 Commits

Author SHA1 Message Date
Harmen Stoppels
4969fdf23a autotools.py: order props / methods / callbacks 2025-01-23 12:24:56 +01:00
Massimiliano Culpo
2db654bf5a Define the DB index file name in a single place (#48681) 2025-01-23 12:17:34 +01:00
Harmen Stoppels
9992b563db Improve spack.build_systems typing (#48590) 2025-01-23 12:15:43 +01:00
Harmen Stoppels
daba1a805e build_environment.py: clear CONFIG_SITE for configure scripts (#48690)
Setting CONFIG_SITE to /dev/null before running configure should be sufficient to
prevent any config site file from being loaded, which currently causes non-determinism in builds.

Setting the variable CONFIG_SITE prevents configure scripts from checking $prefix/share/config.site,
$prefix/etc/config.site, $ac_default_prefix/share/config.site, and $ac_default_prefix/etc/config.site,
so we don't have to patch a block of code.
2025-01-23 11:45:52 +01:00
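
A minimal sketch of the approach described above, using a plain dict to stand in for the build environment (Spack's real API for this differs):

```python
import os

def setup_configure_environment(env: dict) -> None:
    # With CONFIG_SITE set, configure skips the default config.site
    # locations entirely, so builds become deterministic.
    env["CONFIG_SITE"] = os.devnull  # "/dev/null" on POSIX
```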
Wouter Deconinck
832bf95aa4 fmt: add v11.1.{1,2}; spdlog: add v1.15.0 (#48402)
* fmt: add v11.1.1

* spdlog: add v1.15.0

* spdlog: terminate string

* fmt: add v11.1.2

* spdlog: apply fmt-11.1 support patch up to 1.15.0

* spdlog: add prereq patch to ensure intended patch applies cleanly

* spdlog: fix patch checksum
2025-01-23 09:34:20 +01:00
Wouter Deconinck
81e6dcd95c gaudi: add v39.2; CUDA support (#48557)
* gaudi: add v39.2; CUDA support
* gaudi: CUDAPackage -> CudaPackage
2025-01-23 08:41:38 +01:00
Tara Drwenski
518572e710 Use gcc toolchain when using hip and gcc host compiler (#48632) 2025-01-22 23:37:24 -08:00
Sinan
6f4ac31a67 alps: add new package (#48183) 2025-01-22 14:57:19 -07:00
Victor A. P. Magri
e291daaa17 Add OpenMP dependency to Kokkos and KokkosKernels (#48455)
* Add OpenMP dependency to Kokkos and KokkosKernels

* Add Chris' suggestions
2025-01-22 12:53:41 -07:00
Seth R. Johnson
58f1e791a0 Celeritas: new version 0.5.1 (#48678)
* Celeritas: add version 0.5

* Fix style

* Un-deprecate non-awful versions
2025-01-22 11:03:48 -07:00
jnhealy2
aba0a740c2 Address quoting issue impacting dev_paths containing @ symbols (#48555)
* Address quoting issue that causes dev_paths containing @ symbols to parse as versions

* Fix additional location where dev_path was improperly constructed

The dev_path must be quoted to avoid parsing issues when
paths contain '@' symbols. Also add tests to catch
regression of this behavior

* Format to fix line length

* fix failing tests

* Fix whitespace error

* Add binary_compatibility fixture to test

* Change string formatting to avoid multiline f string

* Update lib/spack/spack/test/concretization/core.py

* Add tmate debug session

* Revert "Add tmate debug session"

This reverts commit 24e2f77e3c.

* Move test and refactor to use env methods

* Update lib/spack/spack/test/cmd/develop.py

---------

Co-authored-by: psakievich <psakiev@sandia.gov>
2025-01-22 18:03:13 +01:00
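
A small illustration of the parsing hazard this commit fixes; the path is hypothetical:

```python
path = "/scratch/user/mypkg@2.0-work"  # hypothetical dev path containing '@'

bad = f"mypkg dev_path={path}"     # '@2.0-work' risks being read as a version
good = f'mypkg dev_path="{path}"'  # quoting keeps it a literal path
```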
Harmen Stoppels
0fe8e763c3 bootstrap: do not show disabled sources as errors if enabled sources fail (#48676) 2025-01-22 17:02:23 +01:00
Harmen Stoppels
0e2d261b7e bootstrap/status.py: remove make (#48677) 2025-01-22 17:00:25 +01:00
Matthieu Dorier
85cb234861 pmdk: add cmake dependency back (#48664) 2025-01-22 14:44:04 +01:00
Harmen Stoppels
87a83db623 autotools.py: set lt_cv_apple_cc_single_mod=yes (#48671)
Since macOS 15 `ld -single_module` warns with a deprecation message,
which makes configure scripts believe the flag is unsupported. That
in turn triggers a code path where `archive_cmds` is set to

```
$CC -r -keep_private_externs -nostdlib ... -dynamiclib
```

instead of just

```
$CC -dynamiclib ...
```

This code path was meant to trigger only on ancient macOS <= 14.4 where
libtool had to add `-single_module`, which is the default since macOS
14.4, and is now apparently deprecated because the flag has been a no-op for
more than 15 years.

The wrong `archive_cmds` causes actual problems combined with a bug in
OpenMPI's compiler wrapper (`CC=mpicc`), which appends `-rpath` flags,
which cause an error when combined with the `-r` flag added by the
autotools.

Spack's compiler wrapper doesn't do this, but it's likely there are
other compiler wrappers out there that are not aware that `-r` and
`-rpath` cannot be combined.

The fix is to change defaults: `lt_cv_apple_cc_single_mod=yes`.
2025-01-22 11:29:50 +01:00
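
Autoconf reads cache variables from the environment, so the fix amounts to presetting the variable before configure runs; a hedged sketch:

```python
import os
import subprocess

# Preseeding lt_cv_apple_cc_single_mod makes configure skip the
# `ld -single_module` probe that the macOS 15 deprecation warning broke.
env = dict(os.environ, lt_cv_apple_cc_single_mod="yes")
subprocess.run(["./configure"], env=env, check=True)
```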
Taillefumier Mathieu
e1e17786c5 sirius: libxc 7 forward compat (#48663) 2025-01-22 11:27:19 +01:00
eugeneswalker
68af5cc4c0 e4s ci stacks: add libceed (#48668) 2025-01-21 14:13:55 -08:00
Harmen Stoppels
70df460fa7 PackageBase.detect_dev_src_change: speed up (#48618) 2025-01-21 16:48:37 +01:00
Harmen Stoppels
31a1b2fd6c relocate.py, binary_distribution.py: cleanup (#48651) 2025-01-21 15:45:08 +01:00
moloney
f8fd51e12f BF+ENH: Add wxwidgets 3.2.6 and py-wxpython 4.2.2, missing pkgconfig dep (#48597)
* BF+ENH: Add wxwidgets 3.2.6 and py-wxpython 4.2.2

Improves compat with newer Python (>3.9) and numpy.

Fix error during configure step (even on at least some older
versions) by including 'pkgconfig' as a build dep so gtk+ is found.

* BF: Make wxpython use spack built wxwidgets instead of rebuilding

* Update var/spack/repos/builtin/packages/py-wxpython/package.py

Avoid using too new version of py-setuptools as license file format is currently not compatible.

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-01-20 21:01:47 -06:00
Mirko Rahn
12784594aa gpi-space: add v24.12 (#48361) 2025-01-20 11:47:56 -07:00
Massimiliano Culpo
e0eb0aba37 netlib-lapack: add v3.12.1 (#48318)
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2025-01-20 18:23:54 +01:00
Stephen Nicholas Swatman
f47bf5f6b8 indicators: new package (#47786)
* indicators: new package

Indicators is a new package which aims to add easy-to-use progress bars
to modern C++.

* Update header

* Update dependencies
2025-01-20 09:00:37 -06:00
Seth R. Johnson
9296527775 npm, node-js, typescript: add external find (#48587)
* npm: add external find

* node-js: add external find

* Add external for typescript

* Match full executable string

* Fix style
2025-01-20 07:26:01 -06:00
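
A hedged sketch of the external-detection hooks these commits add; the class body and regex are assumptions:

```python
from spack.package import *

class Npm(Package):
    executables = ["^npm$"]  # match the full executable name, per the commit above

    @classmethod
    def determine_version(cls, exe):
        # `npm --version` prints a bare version string such as "10.2.4"
        return Executable(exe)("--version", output=str, error=str).strip()
```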
Wouter Deconinck
08c53fa405 pythia8: correct with_or_without prefix for +openmpi (#48131)
* pythia8: correct with_or_without prefix for +openmpi

* pythia8: fix style

* pythia8: fix style
2025-01-20 06:50:29 -06:00
Harmen Stoppels
0c6f0c090d openldap: fix build (#48646) 2025-01-20 13:16:52 +01:00
Nathan Hanford
c623448f81 add affinity package (#48589) 2025-01-20 04:44:25 -07:00
Wouter Deconinck
df71341972 py-mplhep: add v0.3.55 (#48338) 2025-01-20 10:26:09 +01:00
Wouter Deconinck
75862c456d lhapdf: add v6.5.5 (#48336) 2025-01-20 10:25:38 +01:00
dmagdavector
e680a0c153 libslirp: add v4.8.0 (#48561) 2025-01-20 10:23:47 +01:00
Wouter Deconinck
9ad36080ca r: add v4.4.2 (#48329)
* r: add v4.4.2

* r-rcpp: add v1.0.13-1 spot release; conflict 1.0.13 with r@4.4.2
2025-01-20 10:23:23 +01:00
Harmen Stoppels
ecd14f0ad9 archive.py: fix include_parent_directories=True with path_to_name (#48570) 2025-01-20 10:21:32 +01:00
Rocco Meli
c44edf1e8d gnina: add v1.3 (#47711)
* gnina: add v1.3

* Update var/spack/repos/builtin/packages/gnina/package.py

Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>

* Update var/spack/repos/builtin/packages/gnina/package.py

Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>

* Update var/spack/repos/builtin/packages/gnina/package.py

Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>

* use github patch

* update

---------

Co-authored-by: Mikael Simberg <mikael.simberg@iki.fi>
2025-01-20 10:19:49 +01:00
Paul R. C. Kent
1eacdca5aa py-pyscf: add v2.8.0 (#48571) 2025-01-20 10:15:41 +01:00
Keita Iwabuchi
4a8f5efb38 Metall: add v0.29 and v0.30 (#48564)
Co-authored-by: Keita Iwabuchi <iwabuchi1@lln.gov>
2025-01-20 10:14:42 +01:00
Mickael PHILIT
2e753571bd cgns: add v4.5.0 (#48490) 2025-01-20 10:09:45 +01:00
Thomas Bouvier
da16336550 nvtx: fix missing import (#48580) 2025-01-20 10:08:47 +01:00
Alberto Sartori
1818e70e74 justbuild: add version 1.4.2 (#48581) 2025-01-20 10:07:55 +01:00
Chang Liu
1dde785e9a fusion-io: new package (#47699) 2025-01-20 10:06:13 +01:00
Alberto Invernizzi
a7af32c23b py-virtualenvwrapper: add v6.0.0, v6.1.0, v6.1.1 (#47785) 2025-01-20 10:04:40 +01:00
Simon Frasch
6c92ad439b spfft: add v1.1.1 (#48643) 2025-01-20 10:03:50 +01:00
snehring
93f555eb14 dorado: fixing typo on version restriction for hdf5 (#48591)
Signed-off-by: Shane Nehring <snehring@iastate.edu>
2025-01-20 10:01:29 +01:00
Vanessasaurus
fa3725e9de flux-pmix: add v0.6.0 (#48600)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2025-01-20 10:00:35 +01:00
Vanessasaurus
870dd6206f flux-sched: add v0.41.0 (#48599)
Co-authored-by: github-actions <github-actions@users.noreply.github.com>
2025-01-20 09:57:22 +01:00
Olivier Cessenat
b1d411ab06 octave: new version 9.3.0 (#48606) 2025-01-20 09:56:55 +01:00
Massimiliano Culpo
783eccfbd5 jsonschema: use draft7 validator and simplify schemas (#48621) 2025-01-20 09:51:29 +01:00
Wouter Deconinck
a842332b1b rsync: add v3.4.0, v3.4.1 (#48607) 2025-01-20 09:50:44 +01:00
Harmen Stoppels
7e41288ca6 spack_yaml.py / spec.py: use dict instead of OrderedDict (#48616)
* for config: let `syaml_dict` inherit from `dict` instead of `OrderedDict`. `syaml_dict` now only exists as a mutable wrapper for yaml related metadata.
* for spec serialization / hashing: use `dict` directly

This is possible since we only support cpython 3.6+ in which dicts are ordered.

This improves performance of hash computation a bit. For a larger spec I'm getting 9.22ms instead of 11.9ms, so 22.5% reduction in runtime.
2025-01-20 17:33:44 +09:00
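
The ordering guarantee this relies on, in two lines:

```python
d = {"b": 1, "a": 2}
d["c"] = 3
assert list(d) == ["b", "a", "c"]  # dicts preserve insertion order (CPython 3.6+)
```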
Tara Drwenski
3bb375a47f Change llvm-amdgpu to a build dependency of rocm (#48612) 2025-01-20 17:28:15 +09:00
Olivier Cessenat
478855728f ngspice: add v44 (#48611) 2025-01-20 09:27:51 +01:00
jgraciahlrs
5e3baeabfa scorep: update configure opts for libbfd (#48614) 2025-01-20 09:24:12 +01:00
Axel Huebl
58b9b54066 openPMD-api: add v0.16.1 (#48601)
* openpmd-api: add v0.16.1

Add the latest release of openPMD-api.

* TOML11: Latest Versions
2025-01-20 09:22:30 +01:00
Luca Heltai
3918deab74 dealii: add v9.6.1, and v9.6.2 (#48620) 2025-01-20 09:20:49 +01:00
Matt Thompson
ceb2ce352f mapl: add v2.52.0 (#48629) 2025-01-20 09:10:29 +01:00
Wouter Deconinck
7dc6bff7b1 root: depends_on r-rcpp@:1.0.12 when @:6.32.02 (#48625) 2025-01-20 09:05:55 +01:00
Adam J. Stewart
05fbbd7164 py-numpy: add v2.1.3, v2.2.1, v2.2.2 (#48640) 2025-01-20 08:48:56 +01:00
Juan Miguel Carceller
58421866c2 hepmc3: extend python when the python bindings are enabled (#47836)
Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2025-01-20 08:46:08 +01:00
Harmen Stoppels
962498095d compat with mypy 1.14 (#48626)
* OSError.errno apparently is int | None, but already part of __str__ anyway so drop

* fix disambiguate_spec_from_hashes

* List[Spec].sort() complains, ignore it

* fix typing of KnownCompiler, and use it in asp.py
2025-01-18 21:16:05 +01:00
Anna Rift
d0217cf04e Fix two instances of duplicate 'https://' in URLs (#48628) 2025-01-17 22:59:02 -07:00
Olivier Cessenat
65745fa0df poppler: many improvements (#48627)
* poppler: many improvements

* Strangely I did not have the proper comment header

* Add a maintainer
2025-01-17 22:42:50 -07:00
John Gouwar
9f7cff1780 Stable splice-topo order for resolving splicing (#48605)
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-01-17 16:27:16 +01:00
Harmen Stoppels
bb43fa5444 spec.py: make hashing of extra_attributes order independent (#48615) 2025-01-17 13:50:36 +01:00
Julien Loiseau
847f560a6e hard: new package (#48595) 2025-01-16 15:47:59 -07:00
Tamara Dahlgren
623ff835fc mypy: update python version to avoid error/warning (#48593)
* pyproject: remove the mypy python version option, since it defaults to the Python version used to run it (#48593)
2025-01-16 11:24:29 -07:00
Harmen Stoppels
ca19790ff2 py-executing: support python 3.13 (#48604) 2025-01-16 19:00:10 +01:00
acastanedam
f23366e4f8 Update elk to versions 8.8, 9.6, 10.2 (#48583) 2025-01-16 08:46:18 -08:00
Chris Marsh
42fb689501 py-pint-xarray: add version 0.4 (#47599) 2025-01-16 16:57:49 +01:00
arezaii
c33bbdb77d add Arkouda server and client packages (#48054)
Adds packages for the Arkouda server (`arkouda`) and Arkouda Python client (`py-arkouda`).

Arkouda server and client are divided into separate packages to allow them to be
installed independently of one another.

Future work remains to add a `+dev` variant to `py-arkouda`, but that will require additional
supporting packages made available through Spack (e.g. `py-pytest-env`).
2025-01-15 21:39:23 -08:00
Laurent Chardon
1af6aa22c1 py-basemap: add v1.4.1 (#48548)
* py-basemap: add v1.4.1
* py-basemap: remove broken v1.2.1
* py-basemap: add hires variant
2025-01-15 18:05:05 -08:00
Laurent Chardon
03d9373e5c py-pycuda: add version 2024.1.2 (#48547)
* py-pycuda: add version 2024.1.2
* py-pycuda: add version 2024.1.2
* py-pycuda: Improve dependencies versions
2025-01-15 17:33:14 -08:00
HELICS-bot
f469a3d6ab helics: Add version 3.6.0 (#48000)
* helics: Add version 3.6.0
* helics: Add constraints for new minimum CMake (3.22+), Boost (1.75+), GCC (11+), Clang (15+), and ICC (21+) versions in HELICS 3.6+

---------

Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
Co-authored-by: Ryan Mast <mast9@llnl.gov>
2025-01-15 17:32:27 -07:00
Xuefeng Ding
25f24d947a add root to rpath of python (#48491)
* add root to rpath of python?

* fix #48446

* [@spackbot] updating style on behalf of DingXuefeng

---------

Co-authored-by: DingXuefeng <DingXuefeng@users.noreply.github.com>
Co-authored-by: Patrick Gartung <gartung@fnal.gov>
2025-01-15 14:27:59 -07:00
Sergio Sánchez Ramírez
af25a84a56 julia: bump 1.11 and 1.10 patch versions (#48149) 2025-01-15 20:23:09 +01:00
Massimiliano Culpo
59a71959e7 Fix wrong hash in spliced specs, and improve unit-tests (#48574)
Modifications:
* Fix a severe bug where a spliced spec has the same hash as the original build spec
* Removed CacheManager in favor of install_mockery. The latter sets up a temporary store, and removes it at the end of the test.
* Deleted a couple of helper functions e.g. _has_dependencies
* Checked that SolverException is raised, rather than Exception (more specific)
* Extended, and renamed the splicing_setup fixture (now called install_specs)
* Added more specific assertions in each test

One test is currently flaky, due to some instability when sorting multiple specs; it is marked xfail.
2025-01-15 19:43:55 +01:00
Harmen Stoppels
00e804a94b package.py: expose types of common globals unconditionally (#48588) 2025-01-15 17:33:38 +01:00
Harmen Stoppels
db997229f2 julia: fix forward compat with curl (#48585) 2025-01-15 16:13:37 +01:00
Massimiliano Culpo
6fac041d40 Add type-hints to which and which_string (#48554) 2025-01-15 15:24:18 +01:00
Paul R. C. Kent
f8b2c65ddf llvm: add v19.1.7 (#48572) 2025-01-15 15:22:14 +01:00
Jordan Galby
c504304d39 Add version attributes up_to_1, up_to_2, up_to_3 (#38410)
Making them also available in spec format string
2025-01-15 13:07:17 +01:00
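
Assuming the new attributes mirror the existing `Version.up_to(n)` method, usage looks like:

```python
from spack.version import Version

v = Version("3.12.1")
assert str(v.up_to(2)) == "3.12"  # long-standing method
assert str(v.up_to_2) == "3.12"   # new attribute, also usable as
                                  # "{version.up_to_2}" in spec format strings
```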
Harmen Stoppels
976f1c2198 flag_mixing.py: add missing import (#48579) 2025-01-15 10:48:34 +01:00
Greg Becker
e7c591a8b8 Deprecate Spec.concretize/Spec.concretized in favor of spack.concretize.concretize_one (#47971)
The methods spack.spec.Spec.concretize and spack.spec.Spec.concretized
are deprecated in favor of spack.concretize.concretize_one.

This will resolve a circular dependency between the spack.spec and
spack.concretize in the next Spack release.
2025-01-15 10:13:19 +01:00
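
The migration, as also shown in the documentation changes further down this diff:

```python
# Deprecated:
from spack.spec import Spec
spec = Spec("zlib").concretized()

# Replacement:
from spack.concretize import concretize_one
spec = concretize_one("zlib")
```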
arezaii
f3522cba74 Chapel 2.3 (#48053)
* add python bindings variant to chapel
* fix chpl_home, update chapel frontend rpath in venv
* fix chpl frontend shared lib rpath hack
* patch chapel env var line length limit
* patch chapel python shared lib location by chapel version
* unhack chpl frontend lib rpath
* fix postinstall main version number file path
* update chapel version number to 2.3
* use chapel releases instead of source tarballs, remove dead code
* Chapel 2.3 adds support for LLVM 19
* Bundled LLVM always requires CMake 3.20:
* Apply 2.3 patch for LLVM include search path

Fixes build errors for `chapel@2.3+rocm` of the form:

```
compiler/llvm/llvmDebug.cpp:174:9: error: cannot convert 'const char*' to 'llvm::dwarf::MemorySpace'
compiler/llvm/llvmDebug.cpp:254:15: error: cannot convert 'const char*' to 'llvm::dwarf::MemorySpace'
```

* fix style

* Fix misreporting of test_hello failures

`test_part` was aliasing the enclosing test name, leading to confusing and incorrect reporting on test failure.

* Ensure `libxml2` optional dep of `hwloc` is also added to `PKG_CONFIG_PATH` in the run env

* patch chplCheck for GPU + multilocale configs, install docs

* Adjust patch for checkChplInstall

* Ensure `CHPL_DEVELOPER` is unset for `~developer`

---------

Co-authored-by: Dan Bonachea <dobonachea@lbl.gov>
2025-01-14 21:48:44 -08:00
Hubertus van Dam
0bd9c235a0 lammps: enable scafacos (#47638)
* Enable Scafacos in LAMMPS
* lammps: make scafacos work with +lib

---------

Co-authored-by: Richard Berger <rberger@lanl.gov>
2025-01-14 21:33:44 -07:00
jclause
335fca7049 Improve support using external modules with zsh (#48115)
* Improve support using external modules with zsh

Spack integrates with external modules by launching a python subprocess
to scrape the path from the module command. In zsh, subshells do not
inheret functions defined in the parent shell (only environment
variables). This breaks the module function in module_cmd.py.

As a workaround, source the module commands using $MODULESHOME prior to
running the module command.

* Fix formatting

* Fix flake error

* Add guard around sourcing module file

* Add improved unit testing to module src command

* Correct style

* Another attempt at style

* formatting again

* formatting again

---------

Co-authored-by: psakievich <psakiev@sandia.gov>
2025-01-14 21:29:15 -07:00
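
A hedged sketch of the workaround described above; the helper name and the exact init-file path are assumptions:

```python
import os
import subprocess

def module_output(*args: str) -> str:
    # zsh subshells do not inherit shell functions, so source the module
    # init script from $MODULESHOME before invoking `module`.
    init = os.path.join(os.environ.get("MODULESHOME", ""), "init", "zsh")
    prefix = f". {init} 2>/dev/null; " if os.path.isfile(init) else ""
    result = subprocess.run(
        ["zsh", "-c", prefix + "module " + " ".join(args)],
        capture_output=True, text=True,
    )
    return result.stdout
```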
Alex Richert
ce5ef14fdb crtm: add v3.1.1-build1 (#48451)
* add crtm@3.1.1-build1
* fix url_for_version's version check

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>

---------

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2025-01-14 19:03:50 -07:00
Greg Becker
2447d16e55 bugfix: associate nodes directly with specs for compiler flag reordering (#48496) 2025-01-14 10:15:57 -08:00
Wouter Deconinck
8196c68ff3 py-dask: fix py-versioneer version pin (#47980)
* py-dask: fix py-versioneer version pin
* py-dask: depends_on py-click@8.1 when @2023.11.0:
* py-dask: reorder dependency lines

Co-authored-by: Sergey Kosukhin <sergey.kosukhin@mpimet.mpg.de>

---------

Co-authored-by: Sergey Kosukhin <sergey.kosukhin@mpimet.mpg.de>
2025-01-14 09:44:24 -08:00
Taillefumier Mathieu
308f74fe8b trexio: add v2.2.3 -> master (#48543)
* Update trexio
   - update version
   - add cmake support

* Fix formatting

* hdf5+hl only needed when < 2.3.0

---------

Co-authored-by: Mathieu Taillefumier <mathieu.taillefumier@free.fr>
2025-01-14 09:39:43 -08:00
Ludovic Räss
864f09fef0 pism: add v2.0.7, v2.1.1 (#47921)
* Update recipe

* Update

* Update

* Cleanup

* Fixup
2025-01-14 09:20:16 -08:00
Juan Miguel Carceller
29b53581e2 bdsim: update to point to the new location in github and add 1.7.7 (#48458)
* bdsim: update to point to the new location in github and add 1.7.7

* Remove the C++ standard patch

* Remove the cmake_args and add a comment instead

---------

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2025-01-14 09:18:22 -08:00
Wouter Deconinck
e441e780b9 flatbuffers: add v24.12.23 (#48563)
* flatbuffers: add v24.12.23

* flatbuffers: fix style
2025-01-14 09:12:08 -08:00
Jean Luca Bez
4c642df5ae drishti: add v0.5, v0.6 (#39262)
* include new versions, update dependencies, add test
* Update package.py
* requested changes
* link dependency
* include commit
* fix style
2025-01-14 08:48:35 -07:00
Matt Thompson
217774c972 mepo: add v2.2.1, v2.3.0 (#48552) 2025-01-14 08:44:57 -07:00
Matthieu Dorier
8ec89fc54c pmdk: add gmake dependency, remove cmake dependency (#48544) 2025-01-14 08:44:43 -07:00
Marc T. Henry de Frahan
66ce93a2e3 Openfast version update (#48558) 2025-01-14 08:44:26 -07:00
AMD Toolchain Support
116ffe5809 ICON: Fix missing ocean variant logic handling (#48550)
Co-authored-by: Phil Tooley <phil.tooley@amd.com>
2025-01-14 08:44:01 -07:00
Hans Fangohr
6b0ea2db1d octopus: add v15.1 (#48514) 2025-01-14 08:43:45 -07:00
Harmen Stoppels
d72b371c8a relocate_text: fix return value (#48568) 2025-01-14 13:29:06 +01:00
Mikael Simberg
aa88ced154 Set CMake policy CMP0074 to the new behaviour in hip package (#48549) 2025-01-14 04:48:09 -07:00
Todd Gamblin
d89ae7bcde Database: make constructor safe for read-only filesystems (#48454)
`spack find` and other commands that read the DB will fail if the database is in a
read-only location, because `FailureTracker` and `Database` can try to create their
parent directories when they are constructed, even if the DB root is read-only.

Instead, we should only write to the filesystem when needed.

- [x] don't create parent directories in constructors
- [x] have write operations create parents if needed
- [x] in tests, `chmod` databases to be read-only unless absolutely needed.

---------

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2025-01-14 00:10:51 -08:00
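
A minimal sketch of the pattern, with hypothetical names; directories are created on first write rather than in the constructor:

```python
import os

class FailureTracker:
    def __init__(self, root: str) -> None:
        # No mkdir here: constructing the tracker must work on a
        # read-only filesystem.
        self.dir = os.path.join(root, ".spack-db", "failures")

    def mark_failed(self, name: str) -> None:
        os.makedirs(self.dir, exist_ok=True)  # create parents only when writing
        with open(os.path.join(self.dir, name), "w", encoding="utf-8") as f:
            f.write("failed\n")
```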
Harmen Stoppels
53d1665a8b traverse: use deque and popleft (#48353)
Avoids a quadratic time operation. In practice it is not really felt, because it has a tiny
constant, but in very large DAGs it may be.
2025-01-14 00:09:57 -08:00
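
The difference in a nutshell:

```python
from collections import deque

queue = deque(["root"])
while queue:
    node = queue.popleft()  # O(1); list.pop(0) shifts every element, O(n)
    # ... enqueue children of `node` here ...
```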
Harmen Stoppels
9fa1654102 builtin: add missing deps on gmake (#48545) 2025-01-13 22:51:20 -07:00
Julien Bigot
2c692a5755 doxygen: new versions 1.13 (#48541) 2025-01-13 21:29:40 -07:00
Hubertus van Dam
c0df012b18 py-mdi: new package (#47636)
* Adding MDI
* py-mdi: update package
* lammps: py-mdi needs to be usable at runtime
* py-mdi: update dependency constraints

---------

Co-authored-by: Richard Berger <rberger@lanl.gov>
2025-01-13 20:54:06 -07:00
Cody Balos
66e8523e14 add long_spec property to get fully enumerated spec string (#48389) 2025-01-13 16:43:39 -08:00
Paul R. C. Kent
3932299768 Add v0.3.29 (#48536) 2025-01-13 16:36:29 -08:00
Buldram
3eba6b8379 qemacs, texi2html: new packages (#48537)
* qemacs, texi2html: new packages
* Make plugin support variant
2025-01-13 16:23:40 -08:00
Wouter Deconinck
369928200a py-histoprint: add v2.5.0, v2.6.0 (switch to hatchling) (#48452)
* py-histoprint: add v2.5.0, v2.6.0 (switch to hatchling)
* py-histoprint: add tag hep
2025-01-13 13:44:39 -08:00
Olivier Cessenat
81ed0f8d87 visit: external find requires import re (#48469) 2025-01-13 13:27:31 -07:00
Adam J. Stewart
194b6311e9 GDAL: add v3.10.1 (#48551) 2025-01-13 12:04:47 -08:00
Vicente Bolea
8420898f79 vtk: add v9.4.1 (#48325) 2025-01-13 11:49:41 -08:00
Tamara Dahlgren
f556ba46d9 bugfix/r: git property needs to be classproperty (#48228) 2025-01-13 12:14:26 -06:00
Hubertus van Dam
ddaa9d5d81 pace: new package (#47561)
* Adding the PACE library to the LAMMPS build
* Adding the PACE library for ML-PACE in LAMMPS

---------

Co-authored-by: Richard Berger <rberger@lanl.gov>
2025-01-13 10:28:05 -07:00
Wouter Deconinck
b878fe5555 boost: add v1.87.0, update dependents' constraints (#48475)
* boost: add 1.87.0

* boost: revert addition of parser variant

* precice: forward compatibility boost@:1.86 when @:3.1.2

* faodel: forward compatibility boost@:1.86 when @:1.2108.1

* boost: conflicts ~python when +mpi @1.87.0:
2025-01-13 09:19:30 -08:00
afzpatel
b600bfc779 hipsparselt, rocalution and omniperf: update hipsparselt and rocalution to 6.3.0 and fix omniperf typo (#48374)
* hipsparselt and omniperf: update hipsparselt to 6.3.0 and fix typo in omniperf

* fix style

* update rocalution to 6.3.0
2025-01-13 09:18:28 -08:00
Wouter Deconinck
612c289c41 dbus: add v1.16.0 (#48531) 2025-01-13 16:41:16 +01:00
Harmen Stoppels
e42c76cccf py-ipython: relax py-jedi forward compat (#48542) 2025-01-13 15:12:24 +01:00
Wouter Deconinck
25013bacf2 hep stack: add pandora PFA suite (#48497) 2025-01-13 13:41:01 +01:00
Carlos Bederián
3d554db198 hpl: build v2.3 with timers (#48206)
Co-authored-by: zzzoom <zzzoom@users.noreply.github.com>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2025-01-13 13:04:49 +01:00
Wouter Deconinck
b6def50dcb py-webcolors: add v24.8.0, v24.11.1 (switch to pdm-backend) (#48398) 2025-01-13 13:03:56 +01:00
Wouter Deconinck
bf591c96bd py-llvmlite: add v0.42.0, v0.43.0; py-numba: add v0.59.1, v0.60.0 (#48399) 2025-01-13 13:03:33 +01:00
Wouter Deconinck
edf1d2ec40 py-pre-commit: add v3.6.2, v3.7.1, v3.8.0, v4.0.1 (#48400) 2025-01-13 13:03:06 +01:00
Adam J. Stewart
07f607ec9f PyTorch: remove +onnx_ml variant (#48406) 2025-01-13 13:02:21 +01:00
Olivier Cessenat
93747c5e24 gxsview: making sure -lstdc++fs is added (#47703)
Co-authored-by: Olivier Cessenat <cessenat@jliana.magic>
2025-01-13 13:01:03 +01:00
Thomas Helfer
b746d4596a tfel: update package to transition from boost/python to pybind11 (#48237) 2025-01-13 12:59:22 +01:00
G-Ragghianti
8814705936 papi: fix erroneous patch application (#48428) 2025-01-13 12:56:28 +01:00
Harmen Stoppels
c989541ebc lmod: fix build on macOS (#48438) 2025-01-13 12:53:11 +01:00
Krishna Chilleri
1759ce05dd py-graphviz: add v0.20.3 (#48280) 2025-01-13 12:49:57 +01:00
Wouter Deconinck
c0c1a4aea1 podio: add v1.2; conflicts +rntuple ^root@6.32: when @:0.99 (#47991)
Co-authored-by: Thomas Madlener <thomas.madlener@desy.de>
2025-01-13 12:48:25 +01:00
Wouter Deconinck
53353ae64e hep stack: add scikit-hep packages (#48412) 2025-01-13 12:40:29 +01:00
Wouter Deconinck
62f7a4c9b1 py-wxpython: depends_on wxwidgets +gui (#48499) 2025-01-13 12:31:13 +01:00
Massimiliano Culpo
39679d0882 otf2: add a dependency on Fortran (#48506)
See https://gitlab.spack.io/spack/spack/-/jobs/14423941

From https://gitlab.com/score-p/scorep/-/blob/master/INSTALL?ref_type=heads

> Score-P requires a full compiler suite with language support for C99,
> C++11 and optionally Fortran 77 and Fortran 90.
2025-01-13 12:28:14 +01:00
Ufuk Turunçoğlu
50e6bf9979 esmf: add v8.8.0 (#48495) 2025-01-13 12:25:04 +01:00
Olivier Cessenat
b874c31cc8 keepassxc: adding versions 2.7.8 and 2.7.9 (#48518) 2025-01-13 12:22:55 +01:00
Adam J. Stewart
04baad90f5 py-lightning-uq-box: add v0.2.0 (#48305) 2025-01-13 12:22:04 +01:00
Adam J. Stewart
1022527923 py-ruff: add v0.9.1 (#48519) 2025-01-13 12:20:53 +01:00
Alberto Invernizzi
7ef19ec1d8 gnupg: bump versions and update libassuan requirement (#48520) 2025-01-13 12:19:38 +01:00
Wouter Deconinck
6e45b51f27 cepgen: add v1.2.5 (#48524) 2025-01-13 12:16:31 +01:00
Adam J. Stewart
5f9cd0991b py-timm: add v1.0.13 (#48526) 2025-01-13 12:15:19 +01:00
Wouter Deconinck
98c44fc351 hep stack: root +postgres (#48498) 2025-01-13 12:11:19 +01:00
Adam J. Stewart
b99f850c8e py-scipy: add v1.15.1 (#48523) 2025-01-13 12:07:30 +01:00
Adam J. Stewart
cbbd68d16b py-scikit-learn: add v1.6.1 (#48525) 2025-01-13 12:07:06 +01:00
Wouter Deconinck
e4fbf99497 dcap: depends_on openssl and zlib-api (#48529) 2025-01-13 12:06:29 +01:00
Wouter Deconinck
6a225d5405 dpmjet: add v19.3.6, v19.3.7; add to hep stack (#48530) 2025-01-13 12:05:45 +01:00
Wouter Deconinck
af9fd82476 acts: conflict ~svg ~json when +traccc (#48534) 2025-01-13 11:51:05 +01:00
Carson Woods
29c1152484 py-rich-click: added new versions and dependencies (#48270) 2025-01-13 11:48:10 +01:00
Wouter Deconinck
d6a8af6a1d py-asdf: depends_on py-setuptools-scm@8: (#48538) 2025-01-13 11:44:44 +01:00
Adam J. Stewart
3c3dad0a7a py-kornia: add v0.8.0 (#48522) 2025-01-13 11:43:33 +01:00
Wouter Deconinck
109efdff88 vbfnlo: deprecated 3.0.0beta* since 3.0 < 3.0.0beta (#48516) 2025-01-13 11:35:38 +01:00
Wouter Deconinck
fa318e2c92 lcio: add v2.22.3 (#48532) 2025-01-13 10:45:28 +01:00
Harmen Stoppels
064e70990d binary_distribution.py: stop relocating old /bin/bash sbang shebang (#48502) 2025-01-13 09:30:18 +01:00
Michael Kuhn
c40139b7d6 builtin: fix pkgconfig dependencies (#48533)
`pkgconfig` is the virtual dependency, `pkg-config` and `pkgconf` are
implementations.
2025-01-13 08:40:49 +01:00
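
A sketch of the corrected dependency declaration in a hypothetical recipe:

```python
from spack.package import *

class Example(Package):
    """Hypothetical recipe: depend on the virtual, not an implementation."""

    depends_on("pkgconfig", type="build")  # satisfied by pkg-config or pkgconf
```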
Wouter Deconinck
c302e1a768 builtin: add undeclared gmake dependencies (#48535) 2025-01-13 08:38:37 +01:00
Harmen Stoppels
7171015f1c py-mypy: add v1.13.0 and v1.14.1 (#48508)
* py-setuptools: add v75.3.0, v75.8.0
* py-radical-utils: forward compat with setuptools
2025-01-12 17:18:09 +01:00
Massimiliano Culpo
8ab6f33eb6 Deprecate make, gmake and ninja globals in package API (#48450)
Instead, depends_on("gmake", type="build") should be used, which makes the global available through `setup_dependent_package`.
2025-01-11 22:43:22 +01:00
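
A sketch of the replacement pattern in a hypothetical recipe:

```python
from spack.package import *

class Example(Package):
    """Hypothetical package using the new pattern."""

    depends_on("gmake", type="build")  # exposes the `make` global to builds

    def install(self, spec, prefix):
        make()           # provided via gmake's setup_dependent_package
        make("install")
```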
John W. Parent
a66ab9cc6c CMake: add versions 3.31.3 and 3.31.4 (#48509) 2025-01-11 11:38:36 -07:00
759 changed files with 13681 additions and 9022 deletions

View File

@@ -61,7 +61,7 @@ jobs:
run: "brew install kcov"
- name: Install Python packages
run: |
pip install --upgrade pip setuptools pytest pytest-xdist pytest-cov clingo
pip install --upgrade pip setuptools pytest pytest-xdist pytest-cov
pip install --upgrade flake8 "isort>=4.3.5" "mypy>=0.900" "click" "black"
- name: Setup git configuration
run: |
@@ -185,6 +185,7 @@ jobs:
spack bootstrap disable github-actions-v0.5
spack bootstrap disable github-actions-v0.6
spack bootstrap status
spack solve zlib
spack unit-test --verbose --cov --cov-config=pyproject.toml --cov-report=xml:coverage.xml lib/spack/spack/test/concretization/core.py
- uses: actions/upload-artifact@6f51ac03b9356f520e9adb1b1b7802705f340c2b
with:
@@ -222,7 +223,7 @@ jobs:
. share/spack/setup-env.sh
$(which spack) bootstrap disable spack-install
$(which spack) solve zlib
common_args=(--dist loadfile --tx '4*popen//python=./bin/spack-tmpconfig python -u ./bin/spack python')
common_args=(--dist loadfile --tx '4*popen//python=./bin/spack-tmpconfig python -u ./bin/spack python' -x)
$(which spack) unit-test --verbose --cov --cov-config=pyproject.toml --cov-report=xml:coverage.xml "${common_args[@]}"
- uses: actions/upload-artifact@6f51ac03b9356f520e9adb1b1b7802705f340c2b
with:
@@ -253,7 +254,7 @@ jobs:
env:
COVERAGE_FILE: coverage/.coverage-windows
run: |
spack unit-test --verbose --cov --cov-config=pyproject.toml
spack unit-test -x --verbose --cov --cov-config=pyproject.toml
./share/spack/qa/validate_last_exit.ps1
- uses: actions/upload-artifact@6f51ac03b9356f520e9adb1b1b7802705f340c2b
with:

View File

@@ -19,7 +19,7 @@ config:
install_tree:
root: $spack/opt/spack
projections:
all: "{architecture.platform}-{architecture.target}/{name}-{version}-{hash}"
all: "{architecture}/{compiler.name}-{compiler.version}/{name}-{version}-{hash}"
# install_tree can include an optional padded length (int or boolean)
# default is False (do not pad)
# if padded_length is True, Spack will pad as close to the system max path

View File

@@ -15,11 +15,12 @@
# -------------------------------------------------------------------------
packages:
all:
compiler:
- apple-clang
- clang
- gcc
providers:
c: [apple-clang, llvm, gcc]
cxx: [apple-clang, llvm, gcc]
elf: [libelf]
fortran: [gcc]
fuse: [macfuse]
gl: [apple-gl]
glu: [apple-glu]

View File

@@ -15,18 +15,19 @@
# -------------------------------------------------------------------------
packages:
all:
compiler: [gcc, clang, oneapi, xl, nag, fj, aocc]
providers:
awk: [gawk]
armci: [armcimpi]
blas: [openblas, amdblis]
c: [gcc, llvm, intel-oneapi-compilers, xl, aocc]
cxx: [gcc, llvm, intel-oneapi-compilers, xl, aocc]
c: [gcc]
cxx: [gcc]
D: [ldc]
daal: [intel-oneapi-daal]
elf: [elfutils]
fftw-api: [fftw, amdfftw]
flame: [libflame, amdlibflame]
fortran: [gcc, llvm]
fortran: [gcc]
fortran-rt: [gcc-runtime, intel-oneapi-runtime]
fuse: [libfuse]
gl: [glx, osmesa]

View File

@@ -15,8 +15,8 @@
# -------------------------------------------------------------------------
packages:
all:
compiler:
- msvc
providers:
c : [msvc]
cxx: [msvc]
mpi: [msmpi]
gl: [wgl]

View File

@@ -56,13 +56,13 @@ If you look at the ``perl`` package, you'll see:
.. code-block:: python
phases = ["configure", "build", "install"]
phases = ("configure", "build", "install")
Similarly, ``cmake`` defines:
.. code-block:: python
phases = ["bootstrap", "build", "install"]
phases = ("bootstrap", "build", "install")
If we look at the ``cmake`` example, this tells Spack's ``PackageBase``
class to run the ``bootstrap``, ``build``, and ``install`` functions

View File

@@ -543,10 +543,10 @@ With either interpreter you can run a single command:
.. code-block:: console
$ spack python -c 'from spack.spec import Spec; Spec("python").concretized()'
$ spack python -c 'from spack.concretize import concretize_one; concretize_one("python")'
...
$ spack python -i ipython -c 'from spack.spec import Spec; Spec("python").concretized()'
$ spack python -i ipython -c 'from spack.concretize import concretize_one; concretize_one("python")'
Out[1]: ...
or a file:

View File

@@ -456,14 +456,13 @@ For instance, the following config options,
tcl:
all:
suffixes:
^python@3: 'python{^python.version}'
^python@3: 'python{^python.version.up_to_2}'
^openblas: 'openblas'
will add a ``python-3.12.1`` version string to any packages compiled with
Python matching the spec, ``python@3``. This is useful to know which
version of Python a set of Python extensions is associated with. Likewise, the
``openblas`` string is attached to any program that has openblas in the spec,
most likely via the ``+blas`` variant specification.
will add a ``python3.12`` to module names of packages compiled with Python 3.12, and similarly for
all specs depending on ``python@3``. This is useful to know which version of Python a set of Python
extensions is associated with. Likewise, the ``openblas`` string is attached to any program that
has openblas in the spec, most likely via the ``+blas`` variant specification.
The most heavyweight solution to module naming is to change the entire
naming convention for module files. This uses the projections format

View File

@@ -330,7 +330,7 @@ that ``--tests`` is passed to ``spack ci rebuild`` as part of the
- spack --version
- cd ${SPACK_CONCRETE_ENV_DIR}
- spack env activate --without-view .
- spack config add "config:install_tree:projections:${SPACK_JOB_SPEC_PKG_NAME}:'morepadding/{architecture.platform}-{architecture.target}/{name}-{version}-{hash}'"
- spack config add "config:install_tree:projections:${SPACK_JOB_SPEC_PKG_NAME}:'morepadding/{architecture}/{compiler.name}-{compiler.version}/{name}-{version}-{hash}'"
- mkdir -p ${SPACK_ARTIFACTS_ROOT}/user_data
- if [[ -r /mnt/key/intermediate_ci_signing_key.gpg ]]; then spack gpg trust /mnt/key/intermediate_ci_signing_key.gpg; fi
- if [[ -r /mnt/key/spack_public_key.gpg ]]; then spack gpg trust /mnt/key/spack_public_key.gpg; fi

1
lib/spack/env/aocc/clang vendored Symbolic link
View File

@@ -0,0 +1 @@
../cc

1
lib/spack/env/aocc/clang++ vendored Symbolic link
View File

@@ -0,0 +1 @@
../cpp

1
lib/spack/env/aocc/flang vendored Symbolic link
View File

@@ -0,0 +1 @@
../fc

1
lib/spack/env/arm/armclang vendored Symbolic link
View File

@@ -0,0 +1 @@
../cc

1
lib/spack/env/arm/armclang++ vendored Symbolic link
View File

@@ -0,0 +1 @@
../cc

1
lib/spack/env/arm/armflang vendored Symbolic link
View File

@@ -0,0 +1 @@
../cc

1
lib/spack/env/c++ vendored Symbolic link
View File

@@ -0,0 +1 @@
cc

1
lib/spack/env/c89 vendored Symbolic link
View File

@@ -0,0 +1 @@
cc

1
lib/spack/env/c99 vendored Symbolic link
View File

@@ -0,0 +1 @@
cc

1
lib/spack/env/case-insensitive/CC vendored Symbolic link
View File

@@ -0,0 +1 @@
../cc

View File

@@ -39,6 +39,12 @@ readonly params="\
SPACK_ENV_PATH
SPACK_DEBUG_LOG_DIR
SPACK_DEBUG_LOG_ID
SPACK_COMPILER_SPEC
SPACK_CC_RPATH_ARG
SPACK_CXX_RPATH_ARG
SPACK_F77_RPATH_ARG
SPACK_FC_RPATH_ARG
SPACK_LINKER_ARG
SPACK_SHORT_SPEC
SPACK_SYSTEM_DIRS
SPACK_MANAGED_DIRS"
@@ -339,9 +345,6 @@ case "$command" in
;;
ld|ld.gold|ld.lld)
mode=ld
if [ -z "$SPACK_CC_RPATH_ARG" ]; then
comp="CXX"
fi
;;
*)
die "Unknown compiler: $command"
@@ -396,12 +399,10 @@ fi
#
dtags_to_add="${SPACK_DTAGS_TO_ADD}"
dtags_to_strip="${SPACK_DTAGS_TO_STRIP}"
linker_arg="ERROR: LINKER ARG WAS NOT SET, MAYBE THE PACKAGE DOES NOT DEPEND ON ${comp}?"
eval "linker_arg=\${SPACK_${comp}_LINKER_ARG:?${linker_arg}}"
linker_arg="${SPACK_LINKER_ARG}"
# Set up rpath variable according to language.
rpath="ERROR: RPATH ARG WAS NOT SET, MAYBE THE PACKAGE DOES NOT DEPEND ON ${comp}?"
rpath="ERROR: RPATH ARG WAS NOT SET"
eval "rpath=\${SPACK_${comp}_RPATH_ARG:?${rpath}}"
# Dump the mode and exit if the command is dump-mode.
@@ -410,6 +411,13 @@ if [ "$SPACK_TEST_COMMAND" = "dump-mode" ]; then
exit
fi
# If, say, SPACK_CC is set but SPACK_FC is not, we want to know. Compilers do not
# *have* to set up Fortran executables, so we need to tell the user when a build is
# about to attempt to use them unsuccessfully.
if [ -z "$command" ]; then
die "Compiler '$SPACK_COMPILER_SPEC' does not have a $language compiler configured."
fi
#
# Filter '.' and Spack environment directories out of PATH so that
# this script doesn't just call itself
@@ -779,17 +787,15 @@ case "$mode" in
C)
extend spack_flags_list SPACK_ALWAYS_CFLAGS
extend spack_flags_list SPACK_CFLAGS
preextend flags_list SPACK_TARGET_ARGS_CC
;;
CXX)
extend spack_flags_list SPACK_ALWAYS_CXXFLAGS
extend spack_flags_list SPACK_CXXFLAGS
preextend flags_list SPACK_TARGET_ARGS_CXX
;;
F)
preextend flags_list SPACK_TARGET_ARGS_FORTRAN
;;
esac
# prepend target args
preextend flags_list SPACK_TARGET_ARGS
;;
esac

1
lib/spack/env/cce/case-insensitive/CC vendored Symbolic link
View File

@@ -0,0 +1 @@
../../cc

View File

@@ -0,0 +1 @@
../../cc

1
lib/spack/env/cce/cc vendored Symbolic link
View File

@@ -0,0 +1 @@
../cc

1
lib/spack/env/cce/craycc vendored Symbolic link
View File

@@ -0,0 +1 @@
../cc

1
lib/spack/env/cce/crayftn vendored Symbolic link
View File

@@ -0,0 +1 @@
../cc

1
lib/spack/env/cce/ftn vendored Symbolic link
View File

@@ -0,0 +1 @@
../cc

1
lib/spack/env/clang/clang vendored Symbolic link
View File

@@ -0,0 +1 @@
../cc

1
lib/spack/env/clang/clang++ vendored Symbolic link
View File

@@ -0,0 +1 @@
../cc

1
lib/spack/env/clang/flang vendored Symbolic link
View File

@@ -0,0 +1 @@
../cc

1
lib/spack/env/clang/gfortran vendored Symbolic link
View File

@@ -0,0 +1 @@
../cc

1
lib/spack/env/cpp vendored Symbolic link
View File

@@ -0,0 +1 @@
cc

1
lib/spack/env/f77 vendored Symbolic link
View File

@@ -0,0 +1 @@
cc

1
lib/spack/env/f90 vendored Symbolic link
View File

@@ -0,0 +1 @@
cc

1
lib/spack/env/f95 vendored Symbolic link
View File

@@ -0,0 +1 @@
cc

1
lib/spack/env/fc vendored Symbolic link
View File

@@ -0,0 +1 @@
cc

1
lib/spack/env/fj/case-insensitive/FCC vendored Symbolic link
View File

@@ -0,0 +1 @@
../../cc

1
lib/spack/env/fj/fcc vendored Symbolic link
View File

@@ -0,0 +1 @@
../cc

1
lib/spack/env/fj/frt vendored Symbolic link
View File

@@ -0,0 +1 @@
../cc

1
lib/spack/env/ftn vendored Symbolic link
View File

@@ -0,0 +1 @@
cc

1
lib/spack/env/gcc/g++ vendored Symbolic link
View File

@@ -0,0 +1 @@
../cc

1
lib/spack/env/gcc/gcc vendored Symbolic link
View File

@@ -0,0 +1 @@
../cc

1
lib/spack/env/gcc/gfortran vendored Symbolic link
View File

@@ -0,0 +1 @@
../cc

1
lib/spack/env/intel/icc vendored Symbolic link
View File

@@ -0,0 +1 @@
../cc

1
lib/spack/env/intel/icpc vendored Symbolic link
View File

@@ -0,0 +1 @@
../cc

1
lib/spack/env/intel/ifort vendored Symbolic link
View File

@@ -0,0 +1 @@
../cc

1
lib/spack/env/ld vendored Symbolic link
View File

@@ -0,0 +1 @@
cc

1
lib/spack/env/ld.gold vendored Symbolic link
View File

@@ -0,0 +1 @@
cc

1
lib/spack/env/ld.lld vendored Symbolic link
View File

@@ -0,0 +1 @@
cc

1
lib/spack/env/nag/nagfor vendored Symbolic link
View File

@@ -0,0 +1 @@
../cc

1
lib/spack/env/nvhpc/nvc vendored Symbolic link
View File

@@ -0,0 +1 @@
../cc

1
lib/spack/env/nvhpc/nvc++ vendored Symbolic link
View File

@@ -0,0 +1 @@
../cc

1
lib/spack/env/nvhpc/nvfortran vendored Symbolic link
View File

@@ -0,0 +1 @@
../cc

1
lib/spack/env/oneapi/dpcpp vendored Symbolic link
View File

@@ -0,0 +1 @@
../cc

1
lib/spack/env/oneapi/icpx vendored Symbolic link
View File

@@ -0,0 +1 @@
../cc

1
lib/spack/env/oneapi/icx vendored Symbolic link
View File

@@ -0,0 +1 @@
../cc

1
lib/spack/env/oneapi/ifx vendored Symbolic link
View File

@@ -0,0 +1 @@
../cc

1
lib/spack/env/pgi/pgc++ vendored Symbolic link
View File

@@ -0,0 +1 @@
../cc

1
lib/spack/env/pgi/pgcc vendored Symbolic link
View File

@@ -0,0 +1 @@
../cc

1
lib/spack/env/pgi/pgfortran vendored Symbolic link
View File

@@ -0,0 +1 @@
../cc

1
lib/spack/env/rocmcc/amdclang vendored Symbolic link
View File

@@ -0,0 +1 @@
../cc

1
lib/spack/env/rocmcc/amdclang++ vendored Symbolic link
View File

@@ -0,0 +1 @@
../cpp

1
lib/spack/env/rocmcc/amdflang vendored Symbolic link
View File

@@ -0,0 +1 @@
../fc

1
lib/spack/env/xl/xlc vendored Symbolic link
View File

@@ -0,0 +1 @@
../cc

1
lib/spack/env/xl/xlc++ vendored Symbolic link
View File

@@ -0,0 +1 @@
../cc

1
lib/spack/env/xl/xlf vendored Symbolic link
View File

@@ -0,0 +1 @@
../cc

1
lib/spack/env/xl/xlf90 vendored Symbolic link
View File

@@ -0,0 +1 @@
../cc

1
lib/spack/env/xl_r/xlc++_r vendored Symbolic link
View File

@@ -0,0 +1 @@
../cc

1
lib/spack/env/xl_r/xlc_r vendored Symbolic link
View File

@@ -0,0 +1 @@
../cc

1
lib/spack/env/xl_r/xlf90_r vendored Symbolic link
View File

@@ -0,0 +1 @@
../cc

1
lib/spack/env/xl_r/xlf_r vendored Symbolic link
View File

@@ -0,0 +1 @@
../cc

View File

@@ -75,7 +75,6 @@
"install_tree",
"is_exe",
"join_path",
"last_modification_time_recursive",
"library_extensions",
"mkdirp",
"partition_path",
@@ -1470,15 +1469,36 @@ def set_executable(path):
@system_path_filter
def last_modification_time_recursive(path):
path = os.path.abspath(path)
times = [os.stat(path).st_mtime]
times.extend(
os.lstat(os.path.join(root, name)).st_mtime
for root, dirs, files in os.walk(path)
for name in dirs + files
)
return max(times)
def recursive_mtime_greater_than(path: str, time: float) -> bool:
"""Returns true if any file or dir recursively under `path` has mtime greater than `time`."""
# use bfs order to increase likelihood of early return
queue: Deque[str] = collections.deque([path])
if os.stat(path).st_mtime > time:
return True
while queue:
current = queue.popleft()
try:
entries = os.scandir(current)
except OSError:
continue
with entries:
for entry in entries:
try:
st = entry.stat(follow_symlinks=False)
except OSError:
continue
if st.st_mtime > time:
return True
if entry.is_dir(follow_symlinks=False):
queue.append(entry.path)
return False
@system_path_filter
@@ -1740,8 +1760,7 @@ def find(
def _log_file_access_issue(e: OSError, path: str) -> None:
errno_name = errno.errorcode.get(e.errno, "UNKNOWN")
tty.debug(f"find must skip {path}: {errno_name} {e}")
tty.debug(f"find must skip {path}: {e}")
def _file_id(s: os.stat_result) -> Tuple[int, int]:

View File

@@ -72,7 +72,7 @@ def index_by(objects, *funcs):
if isinstance(f, str):
f = lambda x: getattr(x, funcs[0])
elif isinstance(f, tuple):
f = lambda x: tuple(getattr(x, p, None) for p in funcs[0])
f = lambda x: tuple(getattr(x, p) for p in funcs[0])
result = {}
for o in objects:
@@ -996,8 +996,11 @@ def _receive_forwarded(self, context: str, exc: Exception, tb: List[str]):
def grouped_message(self, with_tracebacks: bool = True) -> str:
"""Print out an error message coalescing all the forwarded errors."""
each_exception_message = [
"\n\t{0} raised {1}: {2}\n{3}".format(
context, exc.__class__.__name__, exc, f"\n{''.join(tb)}" if with_tracebacks else ""
"{0} raised {1}: {2}{3}".format(
context,
exc.__class__.__name__,
exc,
"\n{0}".format("".join(tb)) if with_tracebacks else "",
)
for context, exc, tb in self.exceptions
]

View File

@@ -10,7 +10,7 @@
import spack.util.git
#: PEP440 canonical <major>.<minor>.<micro>.<devN> string
__version__ = "1.0.0-alpha.3"
__version__ = "0.24.0.dev0"
spack_version = __version__

View File

@@ -1356,14 +1356,8 @@ def _test_detection_by_executable(pkgs, debug_log, error_cls):
def _compare_extra_attribute(_expected, _detected, *, _spec):
result = []
# Check items are of the same type
if not isinstance(_detected, type(_expected)):
_summary = f'{pkg_name}: error when trying to detect "{_expected}"'
_details = [f"{_detected} was detected instead"]
return [error_cls(summary=_summary, details=_details)]
# If they are string expected is a regex
if isinstance(_expected, str):
if isinstance(_expected, str) and isinstance(_detected, str):
try:
_regex = re.compile(_expected)
except re.error:
@@ -1379,7 +1373,7 @@ def _compare_extra_attribute(_expected, _detected, *, _spec):
_details = [f"{_detected} does not match the regex"]
return [error_cls(summary=_summary, details=_details)]
if isinstance(_expected, dict):
elif isinstance(_expected, dict) and isinstance(_detected, dict):
_not_detected = set(_expected.keys()) - set(_detected.keys())
if _not_detected:
_summary = f"{pkg_name}: cannot detect some attributes for spec {_spec}"
@@ -1394,6 +1388,10 @@ def _compare_extra_attribute(_expected, _detected, *, _spec):
result.extend(
_compare_extra_attribute(_expected[_key], _detected[_key], _spec=_spec)
)
else:
_summary = f'{pkg_name}: error when trying to detect "{_expected}"'
_details = [f"{_detected} was detected instead"]
return [error_cls(summary=_summary, details=_details)]
return result

View File

@@ -23,7 +23,7 @@
import urllib.request
import warnings
from contextlib import closing
from typing import IO, Dict, Iterable, List, NamedTuple, Optional, Set, Tuple, Union
from typing import IO, Callable, Dict, Iterable, List, NamedTuple, Optional, Set, Tuple, Union
import llnl.util.filesystem as fsys
import llnl.util.lang
@@ -91,6 +91,9 @@
CURRENT_BUILD_CACHE_LAYOUT_VERSION = 2
INDEX_HASH_FILE = "index.json.hash"
class BuildCacheDatabase(spack_db.Database):
"""A database for binary buildcaches.
@@ -502,7 +505,7 @@ def _fetch_and_cache_index(self, mirror_url, cache_entry={}):
scheme = urllib.parse.urlparse(mirror_url).scheme
if scheme != "oci" and not web_util.url_exists(
url_util.join(mirror_url, BUILD_CACHE_RELATIVE_PATH, "index.json")
url_util.join(mirror_url, BUILD_CACHE_RELATIVE_PATH, spack_db.INDEX_JSON_FILE)
):
return False
@@ -625,14 +628,7 @@ def tarball_directory_name(spec):
Return name of the tarball directory according to the convention
<os>-<architecture>/<compiler>/<package>-<version>/
"""
if spec.original_spec_format() < 5:
compiler = spec.annotations.compiler_node_attribute
assert compiler is not None, "a compiler spec is expected"
return spec.format_path(
f"{spec.architecture}/{compiler.name}-{compiler.version}/{spec.name}-{spec.version}"
)
return spec.format_path(f"{spec.architecture.platform}/{spec.name}-{spec.version}")
return spec.format_path("{architecture}/{compiler.name}-{compiler.version}/{name}-{version}")
def tarball_name(spec, ext):
@@ -640,17 +636,9 @@ def tarball_name(spec, ext):
Return the name of the tarfile according to the convention
<os>-<architecture>-<package>-<dag_hash><ext>
"""
if spec.original_spec_format() < 5:
compiler = spec.annotations.compiler_node_attribute
assert compiler is not None, "a compiler spec is expected"
spec_formatted = (
f"{spec.architecture}-{compiler.name}-{compiler.version}-{spec.name}"
f"-{spec.version}-{spec.dag_hash()}"
)
else:
spec_formatted = (
f"{spec.architecture.platform}-{spec.name}-{spec.version}-{spec.dag_hash()}"
)
spec_formatted = spec.format_path(
"{architecture}-{compiler.name}-{compiler.version}-{name}-{version}-{hash}"
)
return f"{spec_formatted}{ext}"
@@ -684,19 +672,24 @@ def sign_specfile(key: str, specfile_path: str) -> str:
def _read_specs_and_push_index(
file_list, read_method, cache_prefix, db: BuildCacheDatabase, temp_dir, concurrency
file_list: List[str],
read_method: Callable,
cache_prefix: str,
db: BuildCacheDatabase,
temp_dir: str,
concurrency: int,
):
"""Read all the specs listed in the provided list, using thread given thread parallelism,
generate the index, and push it to the mirror.
Args:
file_list (list(str)): List of urls or file paths pointing at spec files to read
file_list: List of urls or file paths pointing at spec files to read
read_method: A function taking a single argument, either a url or a file path,
and which reads the spec file at that location, and returns the spec.
cache_prefix (str): prefix of the build cache on s3 where index should be pushed.
cache_prefix: prefix of the build cache on s3 where index should be pushed.
db: A spack database used for adding specs and then writing the index.
temp_dir (str): Location to write index.json and hash for pushing
concurrency (int): Number of parallel processes to use when fetching
temp_dir: Location to write index.json and hash for pushing
concurrency: Number of parallel processes to use when fetching
"""
for file in file_list:
contents = read_method(file)
@@ -714,7 +707,7 @@ def _read_specs_and_push_index(
# Now generate the index, compute its hash, and push the two files to
# the mirror.
index_json_path = os.path.join(temp_dir, "index.json")
index_json_path = os.path.join(temp_dir, spack_db.INDEX_JSON_FILE)
with open(index_json_path, "w", encoding="utf-8") as f:
db._write_to_file(f)
@@ -724,14 +717,14 @@ def _read_specs_and_push_index(
index_hash = compute_hash(index_string)
# Write the hash out to a local file
index_hash_path = os.path.join(temp_dir, "index.json.hash")
index_hash_path = os.path.join(temp_dir, INDEX_HASH_FILE)
with open(index_hash_path, "w", encoding="utf-8") as f:
f.write(index_hash)
# Push the index itself
web_util.push_to_url(
index_json_path,
url_util.join(cache_prefix, "index.json"),
url_util.join(cache_prefix, spack_db.INDEX_JSON_FILE),
keep_original=False,
extra_args={"ContentType": "application/json", "CacheControl": "no-cache"},
)
@@ -739,7 +732,7 @@ def _read_specs_and_push_index(
# Push the hash
web_util.push_to_url(
index_hash_path,
url_util.join(cache_prefix, "index.json.hash"),
url_util.join(cache_prefix, INDEX_HASH_FILE),
keep_original=False,
extra_args={"ContentType": "text/plain", "CacheControl": "no-cache"},
)
@@ -876,9 +869,12 @@ def _url_generate_package_index(url: str, tmpdir: str, concurrency: int = 32):
tty.debug(f"Retrieving spec descriptor files from {url} to build index")
db = BuildCacheDatabase(tmpdir)
db._write()
try:
_read_specs_and_push_index(file_list, read_fn, url, db, db.database_directory, concurrency)
_read_specs_and_push_index(
file_list, read_fn, url, db, str(db.database_directory), concurrency
)
except Exception as e:
raise GenerateIndexError(f"Encountered problem pushing package index to {url}: {e}") from e
@@ -1792,7 +1788,7 @@ def _oci_update_index(
db.mark(spec, "in_buildcache", True)
# Create the index.json file
index_json_path = os.path.join(tmpdir, "index.json")
index_json_path = os.path.join(tmpdir, spack_db.INDEX_JSON_FILE)
with open(index_json_path, "w", encoding="utf-8") as f:
db._write_to_file(f)
@@ -2140,10 +2136,9 @@ def fetch_url_to_mirror(url):
def dedupe_hardlinks_if_necessary(root, buildinfo):
"""Updates a buildinfo dict for old archives that did
not dedupe hardlinks. De-duping hardlinks is necessary
when relocating files in parallel and in-place. This
means we must preserve inodes when relocating."""
"""Updates a buildinfo dict for old archives that did not dedupe hardlinks. De-duping hardlinks
is necessary when relocating files in parallel and in-place. This means we must preserve inodes
when relocating."""
# New archives don't need this.
if buildinfo.get("hardlinks_deduped", False):
@@ -2172,69 +2167,47 @@ def dedupe_hardlinks_if_necessary(root, buildinfo):
buildinfo[key] = new_list
def relocate_package(spec):
"""
Relocate the given package
"""
workdir = str(spec.prefix)
buildinfo = read_buildinfo_file(workdir)
new_layout_root = str(spack.store.STORE.layout.root)
new_prefix = str(spec.prefix)
new_rel_prefix = str(os.path.relpath(new_prefix, new_layout_root))
new_spack_prefix = str(spack.paths.prefix)
old_sbang_install_path = None
if "sbang_install_path" in buildinfo:
old_sbang_install_path = str(buildinfo["sbang_install_path"])
def relocate_package(spec: spack.spec.Spec) -> None:
"""Relocate binaries and text files in the given spec prefix, based on its buildinfo file."""
spec_prefix = str(spec.prefix)
buildinfo = read_buildinfo_file(spec_prefix)
old_layout_root = str(buildinfo["buildpath"])
old_spack_prefix = str(buildinfo.get("spackprefix"))
old_rel_prefix = buildinfo.get("relative_prefix")
old_prefix = os.path.join(old_layout_root, old_rel_prefix)
# Warn about old style tarballs created with the now removed --rel flag.
# Warn about old style tarballs created with the --rel flag (removed in Spack v0.20)
if buildinfo.get("relative_rpaths", False):
tty.warn(
f"Tarball for {spec} uses relative rpaths, " "which can cause library loading issues."
f"Tarball for {spec} uses relative rpaths, which can cause library loading issues."
)
# In the past prefix_to_hash was the default and externals were not dropped, so prefixes
# were not unique.
# In Spack 0.19 and older prefix_to_hash was the default and externals were not dropped, so
# prefixes were not unique.
if "hash_to_prefix" in buildinfo:
hash_to_old_prefix = buildinfo["hash_to_prefix"]
elif "prefix_to_hash" in buildinfo:
hash_to_old_prefix = dict((v, k) for (k, v) in buildinfo["prefix_to_hash"].items())
hash_to_old_prefix = {v: k for (k, v) in buildinfo["prefix_to_hash"].items()}
else:
hash_to_old_prefix = dict()
raise NewLayoutException(
"Package tarball was created from an install prefix with a different directory layout "
"and an older buildcache create implementation. It cannot be relocated."
)
if old_rel_prefix != new_rel_prefix and not hash_to_old_prefix:
msg = "Package tarball was created from an install "
msg += "prefix with a different directory layout and an older "
msg += "buildcache create implementation. It cannot be relocated."
raise NewLayoutException(msg)
prefix_to_prefix: Dict[str, str] = {}
# Spurious replacements (e.g. sbang) will cause issues with binaries
# For example, the new sbang can be longer than the old one.
# Hence 2 dictionaries are maintained here.
prefix_to_prefix_text = collections.OrderedDict()
prefix_to_prefix_bin = collections.OrderedDict()
if "sbang_install_path" in buildinfo:
old_sbang_install_path = str(buildinfo["sbang_install_path"])
prefix_to_prefix[old_sbang_install_path] = spack.hooks.sbang.sbang_install_path()
if old_sbang_install_path:
install_path = spack.hooks.sbang.sbang_install_path()
prefix_to_prefix_text[old_sbang_install_path] = install_path
# First match specific prefix paths. Possibly the *local* install prefix of some dependency is
# in an upstream, so we cannot assume the original spack store root can be mapped uniformly to
# the new spack store root.
# First match specific prefix paths. Possibly the *local* install prefix
# of some dependency is in an upstream, so we cannot assume the original
# spack store root can be mapped uniformly to the new spack store root.
#
# If the spec is spliced, we need to handle the simultaneous mapping
# from the old install_tree to the new install_tree and from the build_spec
# to the spliced spec.
# Because foo.build_spec is foo for any non-spliced spec, we can simplify
# by checking for spliced-in nodes by checking for nodes not in the build_spec
# without any explicit check for whether the spec is spliced.
# An analog in this algorithm is any spec that shares a name or provides the same virtuals
# in the context of the relevant root spec. This ensures that the analog for a spec s
# is the spec that s replaced when we spliced.
# If the spec is spliced, we need to handle the simultaneous mapping from the old install_tree
# to the new install_tree and from the build_spec to the spliced spec. Because foo.build_spec
# is foo for any non-spliced spec, we can simplify by checking for spliced-in nodes by checking
# for nodes not in the build_spec without any explicit check for whether the spec is spliced.
# An analog in this algorithm is any spec that shares a name or provides the same virtuals in
# the context of the relevant root spec. This ensures that the analog for a spec s is the spec
# that s replaced when we spliced.
relocation_specs = specs_to_relocate(spec)
build_spec_ids = set(id(s) for s in spec.build_spec.traverse(deptype=dt.ALL & ~dt.BUILD))
for s in relocation_specs:
@@ -2254,72 +2227,48 @@ def relocate_package(spec):
lookup_dag_hash = analog.dag_hash()
if lookup_dag_hash in hash_to_old_prefix:
old_dep_prefix = hash_to_old_prefix[lookup_dag_hash]
prefix_to_prefix_bin[old_dep_prefix] = str(s.prefix)
prefix_to_prefix_text[old_dep_prefix] = str(s.prefix)
prefix_to_prefix[old_dep_prefix] = str(s.prefix)
# Only then add the generic fallback of install prefix -> install prefix.
prefix_to_prefix_text[old_prefix] = new_prefix
prefix_to_prefix_bin[old_prefix] = new_prefix
prefix_to_prefix_text[old_layout_root] = new_layout_root
prefix_to_prefix_bin[old_layout_root] = new_layout_root
prefix_to_prefix[old_layout_root] = str(spack.store.STORE.layout.root)
# This is vestigial code for the *old* location of sbang. Previously,
# sbang was a bash script, and it lived in the spack prefix. It is
# now a POSIX script that lives in the install prefix. Old packages
# will have the old sbang location in their shebangs.
orig_sbang = "#!/bin/bash {0}/bin/sbang".format(old_spack_prefix)
new_sbang = spack.hooks.sbang.sbang_shebang_line()
prefix_to_prefix_text[orig_sbang] = new_sbang
# Delete identity mappings from prefix_to_prefix
prefix_to_prefix = {k: v for k, v in prefix_to_prefix.items() if k != v}
tty.debug("Relocating package from", "%s to %s." % (old_layout_root, new_layout_root))
# If there's nothing to relocate, we're done.
if not prefix_to_prefix:
return
for old, new in prefix_to_prefix.items():
tty.debug(f"Relocating: {old} => {new}.")
# Old archives may have hardlinks repeated.
dedupe_hardlinks_if_necessary(workdir, buildinfo)
dedupe_hardlinks_if_necessary(spec_prefix, buildinfo)
# Text files containing the prefix text
text_names = [os.path.join(workdir, f) for f in buildinfo["relocate_textfiles"]]
textfiles = [os.path.join(spec_prefix, f) for f in buildinfo["relocate_textfiles"]]
binaries = [os.path.join(spec_prefix, f) for f in buildinfo.get("relocate_binaries")]
links = [os.path.join(spec_prefix, f) for f in buildinfo.get("relocate_links", [])]
# If we are not installing back to the same install tree do the relocation
if old_prefix != new_prefix:
files_to_relocate = [
os.path.join(workdir, filename) for filename in buildinfo.get("relocate_binaries")
]
# If the buildcache was not created with relativized rpaths
# do the relocation of path in binaries
platform = spack.platforms.by_name(spec.platform)
if "macho" in platform.binary_formats:
relocate.relocate_macho_binaries(files_to_relocate, prefix_to_prefix_bin)
elif "elf" in platform.binary_formats:
# The new ELF dynamic section relocation logic only handles absolute to
# absolute relocation.
relocate.relocate_elf_binaries(files_to_relocate, prefix_to_prefix_bin)
platform = spack.platforms.by_name(spec.platform)
if "macho" in platform.binary_formats:
relocate.relocate_macho_binaries(binaries, prefix_to_prefix)
elif "elf" in platform.binary_formats:
relocate.relocate_elf_binaries(binaries, prefix_to_prefix)
# Relocate links to the new install prefix
links = [os.path.join(workdir, f) for f in buildinfo.get("relocate_links", [])]
relocate.relocate_links(links, prefix_to_prefix_bin)
relocate.relocate_links(links, prefix_to_prefix)
relocate.relocate_text(textfiles, prefix_to_prefix)
changed_files = relocate.relocate_text_bin(binaries, prefix_to_prefix)
# For all buildcaches
# relocate the install prefixes in text files including dependencies
relocate.relocate_text(text_names, prefix_to_prefix_text)
# relocate the install prefixes in binary files including dependencies
changed_files = relocate.relocate_text_bin(files_to_relocate, prefix_to_prefix_bin)
# Add ad-hoc signatures to patched macho files when on macOS.
if "macho" in platform.binary_formats and sys.platform == "darwin":
codesign = which("codesign")
if not codesign:
return
for binary in changed_files:
# preserve the original inode by running codesign on a copy
with fsys.edit_in_place_through_temporary_file(binary) as tmp_binary:
codesign("-fs-", tmp_binary)
# If we are installing back to the same location
# relocate the sbang location if the spack directory changed
else:
if old_spack_prefix != new_spack_prefix:
relocate.relocate_text(text_names, prefix_to_prefix_text)
# Add ad-hoc signatures to patched macho files when on macOS.
if "macho" in platform.binary_formats and sys.platform == "darwin":
codesign = which("codesign")
if not codesign:
return
for binary in changed_files:
# preserve the original inode by running codesign on a copy
with fsys.edit_in_place_through_temporary_file(binary) as tmp_binary:
codesign("-fs-", tmp_binary)
def _extract_inner_tarball(spec, filename, extract_to, signature_required: bool, remote_checksum):
@@ -2997,7 +2946,7 @@ def __init__(self, url, local_hash, urlopen=web_util.urlopen):
def get_remote_hash(self):
# Failure to fetch index.json.hash is not fatal
- url_index_hash = url_util.join(self.url, BUILD_CACHE_RELATIVE_PATH, "index.json.hash")
+ url_index_hash = url_util.join(self.url, BUILD_CACHE_RELATIVE_PATH, INDEX_HASH_FILE)
try:
response = self.urlopen(urllib.request.Request(url_index_hash, headers=self.headers))
except (TimeoutError, urllib.error.URLError):
@@ -3018,7 +2967,7 @@ def conditional_fetch(self) -> FetchIndexResult:
return FetchIndexResult(etag=None, hash=None, data=None, fresh=True)
# Otherwise, download index.json
- url_index = url_util.join(self.url, BUILD_CACHE_RELATIVE_PATH, "index.json")
+ url_index = url_util.join(self.url, BUILD_CACHE_RELATIVE_PATH, spack_db.INDEX_JSON_FILE)
try:
response = self.urlopen(urllib.request.Request(url_index, headers=self.headers))
@@ -3062,7 +3011,7 @@ def __init__(self, url, etag, urlopen=web_util.urlopen):
def conditional_fetch(self) -> FetchIndexResult:
# Just do a conditional fetch immediately
- url = url_util.join(self.url, BUILD_CACHE_RELATIVE_PATH, "index.json")
+ url = url_util.join(self.url, BUILD_CACHE_RELATIVE_PATH, spack_db.INDEX_JSON_FILE)
headers = {"User-Agent": web_util.SPACK_USER_AGENT, "If-None-Match": f'"{self.etag}"'}
try:

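Editor's note: the fetchers above follow the standard HTTP conditional-request pattern; a minimal sketch using only the standard library (URL and file names illustrative):
import urllib.error
import urllib.request
from typing import Optional

def conditional_fetch(url: str, etag: str) -> Optional[bytes]:
    # If-None-Match asks the server to answer 304 when our cached copy is current.
    request = urllib.request.Request(url, headers={"If-None-Match": f'"{etag}"'})
    try:
        response = urllib.request.urlopen(request)
    except urllib.error.HTTPError as e:
        if e.code == 304:
            return None  # cached index is still fresh
        raise
    return response.read()  # stale: new body; new ETag in response.headers["Etag"]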
View File

@@ -10,7 +10,9 @@
import sys
import sysconfig
import warnings
from typing import Dict, Optional, Sequence, Union
from typing import Optional, Sequence, Union
from typing_extensions import TypedDict
import archspec.cpu
@@ -18,13 +20,17 @@
from llnl.util import tty
import spack.platforms
import spack.spec
import spack.store
import spack.util.environment
import spack.util.executable
from .config import spec_for_current_python
QueryInfo = Dict[str, "spack.spec.Spec"]
class QueryInfo(TypedDict, total=False):
spec: spack.spec.Spec
command: spack.util.executable.Executable
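Editor's note: moving from a plain Dict alias to a TypedDict lets type checkers track per-key value types; a stand-alone illustration with stand-in value types (not the Spack classes):
from typing import TypedDict

class Info(TypedDict, total=False):
    spec: str      # stand-in for spack.spec.Spec
    command: int   # stand-in for spack.util.executable.Executable

info: Info = {}            # total=False makes every key optional
info["spec"] = "zlib@1.3"  # ok
# info["comand"] = 1       # mypy: TypedDict "Info" has no key "comand"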
def _python_import(module: str) -> bool:
@@ -211,7 +217,9 @@ def _executables_in_store(
):
spack.util.environment.path_put_first("PATH", [bin_dir])
if query_info is not None:
query_info["command"] = spack.util.executable.which(*executables, path=bin_dir)
query_info["command"] = spack.util.executable.which(
*executables, path=bin_dir, required=True
)
query_info["spec"] = concrete_spec
return True
return False
@@ -226,13 +234,12 @@ def _root_spec(spec_str: str) -> str:
# Add a compiler and platform requirement to the root spec.
platform = str(spack.platforms.host())
# FIXME (compiler as nodes): recover the compiler for source bootstrapping
# if platform == "darwin":
# spec_str += " %apple-clang"
if platform == "windows":
if platform == "darwin":
spec_str += " %apple-clang"
elif platform == "windows":
spec_str += " %msvc"
# elif platform == "linux":
# spec_str += " %gcc"
elif platform == "linux":
spec_str += " %gcc"
elif platform == "freebsd":
spec_str += " %clang"
spec_str += f" platform={platform}"

View File

@@ -15,13 +15,11 @@
import archspec.cpu
import spack.compilers.config
import spack.compilers.libraries
import spack.config
import spack.compiler
import spack.compilers
import spack.platforms
import spack.spec
import spack.traverse
import spack.version
from .config import spec_for_current_python
@@ -40,7 +38,7 @@ def __init__(self, configuration):
self.external_cmake, self.external_bison = self._externals_from_yaml(configuration)
def _valid_compiler_or_raise(self):
def _valid_compiler_or_raise(self) -> "spack.compiler.Compiler":
if str(self.host_platform) == "linux":
compiler_name = "gcc"
elif str(self.host_platform) == "darwin":
@@ -48,30 +46,17 @@ def _valid_compiler_or_raise(self):
elif str(self.host_platform) == "windows":
compiler_name = "msvc"
elif str(self.host_platform) == "freebsd":
compiler_name = "llvm"
compiler_name = "clang"
else:
raise RuntimeError(f"Cannot bootstrap clingo from sources on {self.host_platform}")
candidates = [
x
for x in spack.compilers.config.CompilerFactory.from_packages_yaml(spack.config.CONFIG)
if x.name == compiler_name
]
candidates = spack.compilers.compilers_for_spec(
compiler_name, arch_spec=self.host_architecture
)
if not candidates:
raise RuntimeError(
f"Cannot find any version of {compiler_name} to bootstrap clingo from sources"
)
candidates.sort(key=lambda x: x.version, reverse=True)
best = candidates[0]
# Get compilers for bootstrapping from the 'builtin' repository
best.namespace = "builtin"
# If the compiler does not support C++ 14, fail with a legible error message
try:
_ = best.package.standard_flag(language="cxx", standard="14")
except RuntimeError as e:
raise RuntimeError(
"cannot find a compiler supporting C++ 14 [needed to bootstrap clingo]"
) from e
candidates.sort(key=lambda x: x.spec.version, reverse=True)
return candidates[0]
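Editor's note: both variants above reduce to "newest matching compiler wins"; a tiny stand-alone analogue of that selection:
candidates = [("gcc", (9, 4, 0)), ("gcc", (14, 2, 0)), ("gcc", (12, 3, 0))]
candidates.sort(key=lambda c: c[1], reverse=True)
best = candidates[0]  # ('gcc', (14, 2, 0))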
def _externals_from_yaml(
@@ -90,6 +75,9 @@ def _externals_from_yaml(
if not s.satisfies(requirements[pkg_name]):
continue
if not s.intersects(f"%{self.host_compiler.spec}"):
continue
if not s.intersects(f"arch={self.host_architecture}"):
continue
@@ -122,14 +110,11 @@ def concretize(self) -> "spack.spec.Spec":
# Tweak it to conform to the host architecture
for node in s.traverse():
node.architecture.os = str(self.host_os)
node.compiler = self.host_compiler.spec
node.architecture = self.host_architecture
if node.name == "gcc-runtime":
node.versions = self.host_compiler.versions
# Can't use re2c@3.1 with Python 3.6
if self.host_python.satisfies("@3.6"):
s["re2c"].versions.versions = [spack.version.from_string("=2.2")]
node.versions = self.host_compiler.spec.versions
for edge in spack.traverse.traverse_edges([s], cover="edges"):
if edge.spec.name == "python":
@@ -141,9 +126,6 @@ def concretize(self) -> "spack.spec.Spec":
if edge.spec.name == "cmake" and self.external_cmake:
edge.spec = self.external_cmake
if edge.spec.name == self.host_compiler.name:
edge.spec = self.host_compiler
if "libc" in edge.virtuals:
edge.spec = self.host_libc
@@ -159,12 +141,12 @@ def python_external_spec(self) -> "spack.spec.Spec":
return self._external_spec(result)
def libc_external_spec(self) -> "spack.spec.Spec":
detector = spack.compilers.libraries.CompilerPropertyDetector(self.host_compiler)
result = detector.default_libc()
result = self.host_compiler.default_libc
return self._external_spec(result)
def _external_spec(self, initial_spec) -> "spack.spec.Spec":
initial_spec.namespace = "builtin"
initial_spec.compiler = self.host_compiler.spec
initial_spec.architecture = self.host_architecture
for flag_type in spack.spec.FlagMap.valid_compiler_flags():
initial_spec.compiler_flags[flag_type] = []

View File

@@ -10,7 +10,7 @@
from llnl.util import tty
import spack.compilers.config
import spack.compilers
import spack.config
import spack.environment
import spack.modules
@@ -142,8 +142,8 @@ def _bootstrap_config_scopes() -> Sequence["spack.config.ConfigScope"]:
def _add_compilers_if_missing() -> None:
arch = spack.spec.ArchSpec.frontend_arch()
if not spack.compilers.config.compilers_for_arch(arch):
spack.compilers.config.find_compilers()
if not spack.compilers.compilers_for_arch(arch):
spack.compilers.find_compilers()
@contextlib.contextmanager

View File

@@ -34,8 +34,10 @@
from llnl.util.lang import GroupedExceptionHandler
import spack.binary_distribution
import spack.concretize
import spack.config
import spack.detection
import spack.error
import spack.mirrors.mirror
import spack.platforms
import spack.spec
@@ -47,7 +49,13 @@
import spack.version
from spack.installer import PackageInstaller
from ._common import _executables_in_store, _python_import, _root_spec, _try_import_from_store
from ._common import (
QueryInfo,
_executables_in_store,
_python_import,
_root_spec,
_try_import_from_store,
)
from .clingo import ClingoBootstrapConcretizer
from .config import spack_python_interpreter, spec_for_current_python
@@ -134,7 +142,7 @@ class BuildcacheBootstrapper(Bootstrapper):
def __init__(self, conf) -> None:
super().__init__(conf)
self.last_search: Optional[ConfigDictionary] = None
self.last_search: Optional[QueryInfo] = None
self.config_scope_name = f"bootstrap_buildcache-{uuid.uuid4()}"
@staticmethod
@@ -211,14 +219,14 @@ def _install_and_test(
for _, pkg_hash, pkg_sha256 in item["binaries"]:
self._install_by_hash(pkg_hash, pkg_sha256, bincache_platform)
info: ConfigDictionary = {}
info: QueryInfo = {}
if test_fn(query_spec=abstract_spec, query_info=info):
self.last_search = info
return True
return False
def try_import(self, module: str, abstract_spec_str: str) -> bool:
info: ConfigDictionary
info: QueryInfo
test_fn, info = functools.partial(_try_import_from_store, module), {}
if test_fn(query_spec=abstract_spec_str, query_info=info):
return True
@@ -231,7 +239,7 @@ def try_import(self, module: str, abstract_spec_str: str) -> bool:
return self._install_and_test(abstract_spec, bincache_platform, data, test_fn)
def try_search_path(self, executables: Tuple[str], abstract_spec_str: str) -> bool:
info: ConfigDictionary
info: QueryInfo
test_fn, info = functools.partial(_executables_in_store, executables), {}
if test_fn(query_spec=abstract_spec_str, query_info=info):
self.last_search = info
@@ -249,11 +257,11 @@ class SourceBootstrapper(Bootstrapper):
def __init__(self, conf) -> None:
super().__init__(conf)
self.last_search: Optional[ConfigDictionary] = None
self.last_search: Optional[QueryInfo] = None
self.config_scope_name = f"bootstrap_source-{uuid.uuid4()}"
def try_import(self, module: str, abstract_spec_str: str) -> bool:
info: ConfigDictionary = {}
info: QueryInfo = {}
if _try_import_from_store(module, abstract_spec_str, query_info=info):
self.last_search = info
return True
@@ -270,22 +278,17 @@ def try_import(self, module: str, abstract_spec_str: str) -> bool:
bootstrapper = ClingoBootstrapConcretizer(configuration=spack.config.CONFIG)
concrete_spec = bootstrapper.concretize()
else:
concrete_spec = spack.spec.Spec(
abstract_spec = spack.spec.Spec(
abstract_spec_str + " ^" + spec_for_current_python()
)
concrete_spec.concretize()
concrete_spec = spack.concretize.concretize_one(abstract_spec)
msg = "[BOOTSTRAP MODULE {0}] Try installing '{1}' from sources"
tty.debug(msg.format(module, abstract_spec_str))
# Install the spec that should make the module importable
with spack.config.override(self.mirror_scope):
PackageInstaller(
[concrete_spec.package],
fail_fast=True,
package_use_cache=False,
dependencies_use_cache=False,
).install()
PackageInstaller([concrete_spec.package], fail_fast=True).install()
if _try_import_from_store(module, query_spec=concrete_spec, query_info=info):
self.last_search = info
@@ -293,7 +296,7 @@ def try_import(self, module: str, abstract_spec_str: str) -> bool:
return False
def try_search_path(self, executables: Tuple[str], abstract_spec_str: str) -> bool:
info: ConfigDictionary = {}
info: QueryInfo = {}
if _executables_in_store(executables, abstract_spec_str, query_info=info):
self.last_search = info
return True
@@ -304,7 +307,7 @@ def try_search_path(self, executables: Tuple[str], abstract_spec_str: str) -> bo
# might reduce compilation time by a fair amount
_add_externals_if_missing()
concrete_spec = spack.spec.Spec(abstract_spec_str).concretized()
concrete_spec = spack.concretize.concretize_one(abstract_spec_str)
msg = "[BOOTSTRAP] Try installing '{0}' from sources"
tty.debug(msg.format(abstract_spec_str))
with spack.config.override(self.mirror_scope):
@@ -321,10 +324,9 @@ def create_bootstrapper(conf: ConfigDictionary):
return _bootstrap_methods[btype](conf)
- def source_is_enabled(conf: ConfigDictionary):
-     """Raise ValueError if the source is not enabled for bootstrapping"""
-     trusted, name = spack.config.get("bootstrap:trusted"), conf["name"]
-     return trusted.get(name, False)
+ def source_is_enabled(conf: ConfigDictionary) -> bool:
+     """Returns True if the source is enabled for bootstrapping"""
+     return spack.config.get("bootstrap:trusted").get(conf["name"], False)
def ensure_module_importable_or_raise(module: str, abstract_spec: Optional[str] = None):
@@ -356,20 +358,21 @@ def ensure_module_importable_or_raise(module: str, abstract_spec: Optional[str]
for current_config in bootstrapping_sources():
if not source_is_enabled(current_config):
continue
with exception_handler.forward(current_config["name"], Exception):
- current_bootstrapper = create_bootstrapper(current_config)
- if current_bootstrapper.try_import(module, abstract_spec):
+ if create_bootstrapper(current_config).try_import(module, abstract_spec):
return
- assert exception_handler, (
-     f"expected at least one exception to have been raised at this point: "
-     f"while bootstrapping {module}"
- )
msg = f'cannot bootstrap the "{module}" Python module '
if abstract_spec:
    msg += f'from spec "{abstract_spec}" '
- msg += exception_handler.grouped_message(with_tracebacks=tty.is_debug())
+ if not exception_handler:
+     msg += ": no bootstrapping sources are enabled"
+ elif spack.error.debug or spack.error.SHOW_BACKTRACE:
+     msg += exception_handler.grouped_message(with_tracebacks=True)
+ else:
+     msg += exception_handler.grouped_message(with_tracebacks=False)
+     msg += "\nRun `spack --backtrace ...` for more detailed errors"
raise ImportError(msg)
@@ -417,6 +420,7 @@ def ensure_executables_in_path_or_raise(
current_bootstrapper.last_search["spec"],
current_bootstrapper.last_search["command"],
)
assert cmd is not None, "expected an Executable"
cmd.add_default_envmod(
spack.user_environment.environment_modifications_for_specs(
concrete_spec, set_package_py_globals=False
@@ -424,18 +428,17 @@ def ensure_executables_in_path_or_raise(
)
return cmd
- assert exception_handler, (
-     f"expected at least one exception to have been raised at this point: "
-     f"while bootstrapping {executables_str}"
- )
msg = f"cannot bootstrap any of the {executables_str} executables "
if abstract_spec:
    msg += f'from spec "{abstract_spec}" '
- if tty.is_debug():
+ if not exception_handler:
+     msg += ": no bootstrapping sources are enabled"
+ elif spack.error.debug or spack.error.SHOW_BACKTRACE:
      msg += exception_handler.grouped_message(with_tracebacks=True)
  else:
      msg += exception_handler.grouped_message(with_tracebacks=False)
-     msg += "\nRun `spack --debug ...` for more detailed errors"
+     msg += "\nRun `spack --backtrace ...` for more detailed errors"
raise RuntimeError(msg)

File diff suppressed because one or more lines are too long

View File

@@ -63,7 +63,6 @@ def _missing(name: str, purpose: str, system_only: bool = True) -> str:
def _core_requirements() -> List[RequiredResponseType]:
_core_system_exes = {
"make": _missing("make", "required to build software from sources"),
"patch": _missing("patch", "required to patch source code before building"),
"tar": _missing("tar", "required to manage code archives"),
"gzip": _missing("gzip", "required to compress/decompress code archives"),

View File

@@ -36,6 +36,7 @@
import multiprocessing
import os
import re
import stat
import sys
import traceback
import types
@@ -43,7 +44,19 @@
from enum import Flag, auto
from itertools import chain
from multiprocessing.connection import Connection
from typing import Callable, Dict, List, Optional, Set, Tuple
from typing import (
Callable,
Dict,
List,
Optional,
Sequence,
Set,
TextIO,
Tuple,
Type,
Union,
overload,
)
import archspec.cpu
@@ -58,7 +71,7 @@
import spack.build_systems.meson
import spack.build_systems.python
import spack.builder
import spack.compilers.libraries
import spack.compilers
import spack.config
import spack.deptypes as dt
import spack.error
@@ -72,6 +85,7 @@
import spack.store
import spack.subprocess_context
import spack.util.executable
import spack.util.libc
from spack import traverse
from spack.context import Context
from spack.error import InstallError, NoHeadersError, NoLibrariesError
@@ -79,7 +93,6 @@
from spack.util.environment import (
SYSTEM_DIR_CASE_ENTRY,
EnvironmentModifications,
PrependPath,
env_flag,
filter_system_paths,
get_path,
@@ -145,48 +158,128 @@ def get_effective_jobs(jobs, parallel=True, supports_jobserver=False):
class MakeExecutable(Executable):
"""Special callable executable object for make so the user can specify
parallelism options on a per-invocation basis. Specifying
'parallel' to the call will override whatever the package's
global setting is, so you can either default to true or false and
override particular calls. Specifying 'jobs_env' to a particular
call will name an environment variable which will be set to the
parallelism level (without affecting the normal invocation with
-j).
"""Special callable executable object for make so the user can specify parallelism options
on a per-invocation basis.
"""
def __init__(self, name, jobs, **kwargs):
supports_jobserver = kwargs.pop("supports_jobserver", True)
super().__init__(name, **kwargs)
def __init__(self, name: str, *, jobs: int, supports_jobserver: bool = True) -> None:
super().__init__(name)
self.supports_jobserver = supports_jobserver
self.jobs = jobs
def __call__(self, *args, **kwargs):
"""parallel, and jobs_env from kwargs are swallowed and used here;
remaining arguments are passed through to the superclass.
"""
parallel = kwargs.pop("parallel", True)
jobs_env = kwargs.pop("jobs_env", None)
jobs_env_supports_jobserver = kwargs.pop("jobs_env_supports_jobserver", False)
@overload
def __call__(
self,
*args: str,
parallel: bool = ...,
jobs_env: Optional[str] = ...,
jobs_env_supports_jobserver: bool = ...,
fail_on_error: bool = ...,
ignore_errors: Union[int, Sequence[int]] = ...,
ignore_quotes: Optional[bool] = ...,
timeout: Optional[int] = ...,
env: Optional[Union[Dict[str, str], EnvironmentModifications]] = ...,
extra_env: Optional[Union[Dict[str, str], EnvironmentModifications]] = ...,
input: Optional[TextIO] = ...,
output: Union[Optional[TextIO], str] = ...,
error: Union[Optional[TextIO], str] = ...,
_dump_env: Optional[Dict[str, str]] = ...,
) -> None: ...
@overload
def __call__(
self,
*args: str,
parallel: bool = ...,
jobs_env: Optional[str] = ...,
jobs_env_supports_jobserver: bool = ...,
fail_on_error: bool = ...,
ignore_errors: Union[int, Sequence[int]] = ...,
ignore_quotes: Optional[bool] = ...,
timeout: Optional[int] = ...,
env: Optional[Union[Dict[str, str], EnvironmentModifications]] = ...,
extra_env: Optional[Union[Dict[str, str], EnvironmentModifications]] = ...,
input: Optional[TextIO] = ...,
output: Union[Type[str], Callable] = ...,
error: Union[Optional[TextIO], str, Type[str], Callable] = ...,
_dump_env: Optional[Dict[str, str]] = ...,
) -> str: ...
@overload
def __call__(
self,
*args: str,
parallel: bool = ...,
jobs_env: Optional[str] = ...,
jobs_env_supports_jobserver: bool = ...,
fail_on_error: bool = ...,
ignore_errors: Union[int, Sequence[int]] = ...,
ignore_quotes: Optional[bool] = ...,
timeout: Optional[int] = ...,
env: Optional[Union[Dict[str, str], EnvironmentModifications]] = ...,
extra_env: Optional[Union[Dict[str, str], EnvironmentModifications]] = ...,
input: Optional[TextIO] = ...,
output: Union[Optional[TextIO], str, Type[str], Callable] = ...,
error: Union[Type[str], Callable] = ...,
_dump_env: Optional[Dict[str, str]] = ...,
) -> str: ...
def __call__(
self,
*args: str,
parallel: bool = True,
jobs_env: Optional[str] = None,
jobs_env_supports_jobserver: bool = False,
**kwargs,
) -> Optional[str]:
"""Runs this "make" executable in a subprocess.
Args:
parallel: if False, parallelism is disabled
jobs_env: environment variable that will be set to the current level of parallelism
jobs_env_supports_jobserver: whether the jobs env supports a job server
For all the other **kwargs, refer to the base class.
"""
jobs = get_effective_jobs(
self.jobs, parallel=parallel, supports_jobserver=self.supports_jobserver
)
if jobs is not None:
args = ("-j{0}".format(jobs),) + args
args = (f"-j{jobs}",) + args
if jobs_env:
# Caller wants us to set an environment variable to
# control the parallelism.
# Caller wants us to set an environment variable to control the parallelism
jobs_env_jobs = get_effective_jobs(
self.jobs, parallel=parallel, supports_jobserver=jobs_env_supports_jobserver
)
if jobs_env_jobs is not None:
kwargs["extra_env"] = {jobs_env: str(jobs_env_jobs)}
extra_env = kwargs.setdefault("extra_env", {})
extra_env.update({jobs_env: str(jobs_env_jobs)})
return super().__call__(*args, **kwargs)
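Editor's note: a usage sketch for the keyword-only signature above (targets and variable name illustrative):
make = MakeExecutable("make", jobs=8)
make("install", parallel=False)    # serial: no -j8 is added
make("all", jobs_env="MAKEFLAGS")  # runs `make -j8 all` with MAKEFLAGS=8 exported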
class UndeclaredDependencyError(spack.error.SpackError):
"""Raised if a dependency is invoking an executable through a module global, without
declaring a dependency on it.
"""
class DeprecatedExecutable:
def __init__(self, pkg: str, exe: str, exe_pkg: str) -> None:
self.pkg = pkg
self.exe = exe
self.exe_pkg = exe_pkg
def __call__(self, *args, **kwargs):
raise UndeclaredDependencyError(
f"{self.pkg} is using {self.exe} without declaring a dependency on {self.exe_pkg}"
)
def add_default_env(self, key: str, value: str):
self.__call__()
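Editor's note: calling one of these placeholders turns a missing declaration into an explicit failure; for example:
make = DeprecatedExecutable("foo", "make", "gmake")
make("all")
# raises UndeclaredDependencyError:
#   foo is using make without declaring a dependency on gmake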
def clean_environment():
# Stuff in here sanitizes the build environment to eliminate
# anything the user has set that may interfere. We apply it immediately
@@ -208,11 +301,13 @@ def clean_environment():
env.unset("CPLUS_INCLUDE_PATH")
env.unset("OBJC_INCLUDE_PATH")
+ # prevent configure scripts from sourcing variables from config site file (AC_SITE_LOAD).
+ env.set("CONFIG_SITE", os.devnull)
env.unset("CMAKE_PREFIX_PATH")
env.unset("PYTHONPATH")
env.unset("R_HOME")
env.unset("R_ENVIRON")
env.unset("LUA_PATH")
env.unset("LUA_CPATH")
@@ -295,10 +390,62 @@ def _add_werror_handling(keep_werror, env):
env.set("SPACK_COMPILER_FLAGS_REPLACE", " ".join(["|".join(item) for item in replace_flags]))
def set_wrapper_environment_variables_for_flags(pkg, env):
def set_compiler_environment_variables(pkg, env):
assert pkg.spec.concrete
compiler = pkg.compiler
spec = pkg.spec
# Make sure the executables for this compiler exist
compiler.verify_executables()
# Set compiler variables used by CMake and autotools
assert all(key in compiler.link_paths for key in ("cc", "cxx", "f77", "fc"))
# Populate an object with the list of environment modifications
# and return it
# TODO : add additional kwargs for better diagnostics, like requestor,
# ttyout, ttyerr, etc.
link_dir = spack.paths.build_env_path
# Set SPACK compiler variables so that our wrapper knows what to
# call. If there is no compiler configured then use a default
# wrapper which will emit an error if it is used.
if compiler.cc:
env.set("SPACK_CC", compiler.cc)
env.set("CC", os.path.join(link_dir, compiler.link_paths["cc"]))
else:
env.set("CC", os.path.join(link_dir, "cc"))
if compiler.cxx:
env.set("SPACK_CXX", compiler.cxx)
env.set("CXX", os.path.join(link_dir, compiler.link_paths["cxx"]))
else:
env.set("CC", os.path.join(link_dir, "c++"))
if compiler.f77:
env.set("SPACK_F77", compiler.f77)
env.set("F77", os.path.join(link_dir, compiler.link_paths["f77"]))
else:
env.set("F77", os.path.join(link_dir, "f77"))
if compiler.fc:
env.set("SPACK_FC", compiler.fc)
env.set("FC", os.path.join(link_dir, compiler.link_paths["fc"]))
else:
env.set("FC", os.path.join(link_dir, "fc"))
# Set SPACK compiler rpath flags so that our wrapper knows what to use
env.set("SPACK_CC_RPATH_ARG", compiler.cc_rpath_arg)
env.set("SPACK_CXX_RPATH_ARG", compiler.cxx_rpath_arg)
env.set("SPACK_F77_RPATH_ARG", compiler.f77_rpath_arg)
env.set("SPACK_FC_RPATH_ARG", compiler.fc_rpath_arg)
env.set("SPACK_LINKER_ARG", compiler.linker_arg)
# Check whether we want to force RPATH or RUNPATH
if spack.config.get("config:shared_linking:type") == "rpath":
env.set("SPACK_DTAGS_TO_STRIP", compiler.enable_new_dtags)
env.set("SPACK_DTAGS_TO_ADD", compiler.disable_new_dtags)
else:
env.set("SPACK_DTAGS_TO_STRIP", compiler.disable_new_dtags)
env.set("SPACK_DTAGS_TO_ADD", compiler.enable_new_dtags)
if pkg.keep_werror is not None:
keep_werror = pkg.keep_werror
else:
@@ -306,6 +453,10 @@ def set_wrapper_environment_variables_for_flags(pkg, env):
_add_werror_handling(keep_werror, env)
# Set the target parameters that the compiler will add
isa_arg = optimization_flags(compiler, spec.target)
env.set("SPACK_TARGET_ARGS", isa_arg)
# Trap spack-tracked compiler flags as appropriate.
# env_flags are easy to accidentally override.
inject_flags = {}
@@ -338,23 +489,75 @@ def set_wrapper_environment_variables_for_flags(pkg, env):
# implicit variables
env.set(flag.upper(), " ".join(f for f in env_flags[flag]))
pkg.flags_to_build_system_args(build_system_flags)
env.set("SPACK_COMPILER_SPEC", str(spec.compiler))
env.set("SPACK_SYSTEM_DIRS", SYSTEM_DIR_CASE_ENTRY)
compiler.setup_custom_environment(pkg, env)
return env
def optimization_flags(compiler, target):
if spack.compilers.is_mixed_toolchain(compiler):
msg = (
"microarchitecture specific optimizations are not "
"supported yet on mixed compiler toolchains [check"
f" {compiler.name}@{compiler.version} for further details]"
)
tty.debug(msg)
return ""
# Try to check if the current compiler comes with a version number or
# has an unexpected suffix. If so, treat it as a compiler with a
# custom spec.
version_number, _ = archspec.cpu.version_components(compiler.version.dotted_numeric_string)
compiler_version = compiler.version
version_number, suffix = archspec.cpu.version_components(compiler.version)
if not version_number or suffix:
try:
compiler_version = compiler.real_version
except spack.util.executable.ProcessError as e:
# log this and just return compiler.version instead
tty.debug(str(e))
try:
result = target.optimization_flags(compiler.name, version_number)
result = target.optimization_flags(compiler.name, compiler_version.dotted_numeric_string)
except (ValueError, archspec.cpu.UnsupportedMicroarchitecture):
result = ""
return result
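Editor's note: the flags themselves come from archspec; for instance (output as produced by archspec's gcc/haswell tables):
import archspec.cpu

target = archspec.cpu.TARGETS["haswell"]
print(target.optimization_flags("gcc", "12.3.0"))
# -march=haswell -mtune=haswell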
class FilterDefaultDynamicLinkerSearchPaths:
"""Remove rpaths to directories that are default search paths of the dynamic linker."""
def __init__(self, dynamic_linker: Optional[str]) -> None:
# Identify directories by (inode, device) tuple, which handles symlinks too.
self.default_path_identifiers: Set[Tuple[int, int]] = set()
if not dynamic_linker:
return
for path in spack.util.libc.default_search_paths_from_dynamic_linker(dynamic_linker):
try:
s = os.stat(path)
if stat.S_ISDIR(s.st_mode):
self.default_path_identifiers.add((s.st_ino, s.st_dev))
except OSError:
continue
def is_dynamic_loader_default_path(self, p: str) -> bool:
try:
s = os.stat(p)
return (s.st_ino, s.st_dev) in self.default_path_identifiers
except OSError:
return False
def __call__(self, dirs: List[str]) -> List[str]:
if not self.default_path_identifiers:
return dirs
return [p for p in dirs if not self.is_dynamic_loader_default_path(p)]
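Editor's note: stand-alone usage sketch (the dynamic linker path and directories are illustrative):
filt = FilterDefaultDynamicLinkerSearchPaths("/lib64/ld-linux-x86-64.so.2")
rpaths = filt(["/usr/lib64", "/opt/spack/store/zlib-1.3/lib"])
# keeps only entries the dynamic linker does not already search by default,
# e.g. ["/opt/spack/store/zlib-1.3/lib"]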
def set_wrapper_variables(pkg, env):
"""Set environment variables used by the Spack compiler wrapper (which have the prefix
`SPACK_`) and also add the compiler wrappers to PATH.
@@ -363,8 +566,39 @@ def set_wrapper_variables(pkg, env):
this function computes these options in a manner that is intended to match the DAG traversal
order in `SetupContext`. TODO: this is not the case yet, we're using post order, SetupContext
is using topo order."""
# Set compiler flags injected from the spec
set_wrapper_environment_variables_for_flags(pkg, env)
# Set environment variables if specified for
# the given compiler
compiler = pkg.compiler
env.extend(spack.schema.environment.parse(compiler.environment))
if compiler.extra_rpaths:
extra_rpaths = ":".join(compiler.extra_rpaths)
env.set("SPACK_COMPILER_EXTRA_RPATHS", extra_rpaths)
# Add spack build environment path with compiler wrappers first in
# the path. We add the compiler wrapper path, which includes default
# wrappers (cc, c++, f77, f90), AND a subdirectory containing
# compiler-specific symlinks. The latter ensures that builds that
# are sensitive to the *name* of the compiler see the right name when
# we're building with the wrappers.
#
# Conflicts on case-insensitive systems (like "CC" and "cc") are
# handled by putting one in the <build_env_path>/case-insensitive
# directory. Add that to the path too.
env_paths = []
compiler_specific = os.path.join(
spack.paths.build_env_path, os.path.dirname(pkg.compiler.link_paths["cc"])
)
for item in [spack.paths.build_env_path, compiler_specific]:
env_paths.append(item)
ci = os.path.join(item, "case-insensitive")
if os.path.isdir(ci):
env_paths.append(ci)
tty.debug("Adding compiler bin/ paths: " + " ".join(env_paths))
for item in env_paths:
env.prepend_path("PATH", item)
env.set_path(SPACK_ENV_PATH, env_paths)
# Working directory for the spack command itself, for debug logs.
if spack.config.get("config:debug"):
@@ -430,15 +664,22 @@ def set_wrapper_variables(pkg, env):
lib_path = os.path.join(pkg.prefix, libdir)
rpath_dirs.insert(0, lib_path)
filter_default_dynamic_linker_search_paths = FilterDefaultDynamicLinkerSearchPaths(
pkg.compiler.default_dynamic_linker
)
# TODO: filter_system_paths is again wrong (and probably unnecessary due to the is_system_path
# branch above). link_dirs should be filtered with entries from _parse_link_paths.
link_dirs = list(dedupe(filter_system_paths(link_dirs)))
include_dirs = list(dedupe(filter_system_paths(include_dirs)))
rpath_dirs = list(dedupe(filter_system_paths(rpath_dirs)))
rpath_dirs = filter_default_dynamic_linker_search_paths(rpath_dirs)
default_dynamic_linker_filter = spack.compilers.libraries.dynamic_linker_filter_for(pkg.spec)
if default_dynamic_linker_filter:
rpath_dirs = default_dynamic_linker_filter(rpath_dirs)
# TODO: implicit_rpaths is prefiltered by is_system_path, that should be removed in favor of
# just this filter.
implicit_rpaths = filter_default_dynamic_linker_search_paths(pkg.compiler.implicit_rpaths())
if implicit_rpaths:
env.set("SPACK_COMPILER_IMPLICIT_RPATHS", ":".join(implicit_rpaths))
# Spack managed directories include the stage, store and upstream stores. We extend this with
# their real paths to make it more robust (e.g. /tmp vs /private/tmp on macOS).
@@ -474,10 +715,9 @@ def set_package_py_globals(pkg, context: Context = Context.BUILD):
module.std_meson_args = spack.build_systems.meson.MesonBuilder.std_args(pkg)
module.std_pip_args = spack.build_systems.python.PythonPipBuilder.std_args(pkg)
# TODO: make these build deps that can be installed if not found.
module.make = MakeExecutable("make", jobs)
module.gmake = MakeExecutable("gmake", jobs)
module.ninja = MakeExecutable("ninja", jobs, supports_jobserver=False)
module.make = DeprecatedExecutable(pkg.name, "make", "gmake")
module.gmake = DeprecatedExecutable(pkg.name, "gmake", "gmake")
module.ninja = DeprecatedExecutable(pkg.name, "ninja", "ninja")
# TODO: johnwparent: add package or builder support to define these build tools
# for now there is no entrypoint for builders to define these on their
# own
@@ -491,6 +731,26 @@ def set_package_py_globals(pkg, context: Context = Context.BUILD):
# Don't use which for this; we want to find it in the current dir.
module.configure = Executable("./configure")
# Put spack compiler paths in module scope. (Some packages use it
# in setup_run_environment etc, so don't put it context == build)
link_dir = spack.paths.build_env_path
pkg_compiler = None
try:
pkg_compiler = pkg.compiler
except spack.compilers.NoCompilerForSpecError as e:
tty.debug(f"cannot set 'spack_cc': {str(e)}")
if pkg_compiler is not None:
module.spack_cc = os.path.join(link_dir, pkg_compiler.link_paths["cc"])
module.spack_cxx = os.path.join(link_dir, pkg_compiler.link_paths["cxx"])
module.spack_f77 = os.path.join(link_dir, pkg_compiler.link_paths["f77"])
module.spack_fc = os.path.join(link_dir, pkg_compiler.link_paths["fc"])
else:
module.spack_cc = None
module.spack_cxx = None
module.spack_f77 = None
module.spack_fc = None
# Useful directories within the prefix are encapsulated in
# a Prefix object.
module.prefix = pkg.prefix
@@ -656,6 +916,7 @@ def setup_package(pkg, dirty, context: Context = Context.BUILD):
context == Context.TEST and pkg.test_requires_compiler
)
if need_compiler:
set_compiler_environment_variables(pkg, env_mods)
set_wrapper_variables(pkg, env_mods)
# Platform specific setup goes before package specific setup. This is for setting
@@ -667,11 +928,6 @@ def setup_package(pkg, dirty, context: Context = Context.BUILD):
env_mods.extend(setup_context.get_env_modifications())
tty.debug("setup_package: collected all modifications from dependencies")
tty.debug("setup_package: adding compiler wrappers paths")
for x in env_mods.group_by_name()["SPACK_ENV_PATH"]:
assert isinstance(x, PrependPath), "unexpected setting used for SPACK_ENV_PATH"
env_mods.prepend_path("PATH", x.value)
if context == Context.TEST:
env_mods.prepend_path("PATH", ".")
elif context == Context.BUILD and not dirty and not env_mods.is_unset("CPATH"):
@@ -685,6 +941,11 @@ def setup_package(pkg, dirty, context: Context = Context.BUILD):
# Load modules on an already clean environment, just before applying Spack's
# own environment modifications. This ensures Spack controls CC/CXX/... variables.
if need_compiler:
tty.debug("setup_package: loading compiler modules")
for mod in pkg.compiler.modules:
load_module(mod)
load_external_modules(pkg)
# Make sure nothing's strange about the Spack environment.

View File

@@ -6,7 +6,9 @@
import llnl.util.filesystem as fs
import spack.directives
import spack.spec
import spack.util.executable
import spack.util.prefix
from .autotools import AutotoolsBuilder, AutotoolsPackage
@@ -17,19 +19,18 @@ class AspellBuilder(AutotoolsBuilder):
to the Aspell extensions.
"""
- def configure(self, pkg, spec, prefix):
+ def configure(
+     self,
+     pkg: "AspellDictPackage",  # type: ignore[override]
+     spec: spack.spec.Spec,
+     prefix: spack.util.prefix.Prefix,
+ ):
aspell = spec["aspell"].prefix.bin.aspell
prezip = spec["aspell"].prefix.bin.prezip
destdir = prefix
sh = spack.util.executable.which("sh")
sh(
"./configure",
"--vars",
"ASPELL={0}".format(aspell),
"PREZIP={0}".format(prezip),
"DESTDIR={0}".format(destdir),
)
sh = spack.util.executable.Executable("/bin/sh")
sh("./configure", "--vars", f"ASPELL={aspell}", f"PREZIP={prezip}", f"DESTDIR={destdir}")
# Aspell dictionaries install their bits into their prefix.lib

View File

@@ -12,7 +12,6 @@
import spack.build_environment
import spack.builder
import spack.compilers.libraries
import spack.error
import spack.package_base
import spack.phase_callbacks
@@ -192,6 +191,177 @@ def archive_files(self) -> List[str]:
files.append(self._removed_la_files_log)
return files
@property
def configure_directory(self) -> str:
"""Return the directory where 'configure' resides."""
return self.pkg.stage.source_path
@property
def configure_abs_path(self) -> str:
# Absolute path to configure
configure_abs_path = os.path.join(os.path.abspath(self.configure_directory), "configure")
return configure_abs_path
@property
def build_directory(self) -> str:
"""Override to provide another place to build the package"""
# Handle the case where the configure directory is set to a non-absolute path
# Non-absolute paths are always relative to the staging source path
build_dir = self.configure_directory
if not os.path.isabs(build_dir):
build_dir = os.path.join(self.pkg.stage.source_path, build_dir)
return build_dir
@property
def autoreconf_search_path_args(self) -> List[str]:
"""Search path includes for autoreconf. Add an -I flag for all `aclocal` dirs
of build deps, skips the default path of automake, move external include
flags to the back, since they might pull in unrelated m4 files shadowing
spack dependencies."""
return _autoreconf_search_path_args(self.spec)
def autoreconf(
self, pkg: AutotoolsPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Not needed usually, configure should be already there"""
# If configure exists nothing needs to be done
if os.path.exists(self.configure_abs_path):
return
# Else try to regenerate it, which requires a few build dependencies
ensure_build_dependencies_or_raise(
spec=spec,
dependencies=["autoconf", "automake", "libtool"],
error_msg="Cannot generate configure",
)
tty.msg("Configure script not found: trying to generate it")
tty.warn("*********************************************************")
tty.warn("* If the default procedure fails, consider implementing *")
tty.warn("* a custom AUTORECONF phase in the package *")
tty.warn("*********************************************************")
with fs.working_dir(self.configure_directory):
# This line is what is needed most of the time
# --install, --verbose, --force
autoreconf_args = ["-ivf"]
autoreconf_args += self.autoreconf_search_path_args
autoreconf_args += self.autoreconf_extra_args
self.pkg.module.autoreconf(*autoreconf_args)
def configure(
self, pkg: AutotoolsPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Run "configure", with the arguments specified by the builder and an
appropriately set prefix.
"""
options = getattr(self.pkg, "configure_flag_args", [])
options += ["--prefix={0}".format(prefix)]
options += self.configure_args()
with fs.working_dir(self.build_directory, create=True):
pkg.module.configure(*options)
def build(
self, pkg: AutotoolsPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Run "make" on the build targets specified by the builder."""
# See https://autotools.io/automake/silent.html
params = ["V=1"]
params += self.build_targets
with fs.working_dir(self.build_directory):
pkg.module.make(*params)
def install(
self, pkg: AutotoolsPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Run "make" on the install targets specified by the builder."""
with fs.working_dir(self.build_directory):
pkg.module.make(*self.install_targets)
def check(self) -> None:
"""Run "make" on the ``test`` and ``check`` targets, if found."""
with fs.working_dir(self.build_directory):
self.pkg._if_make_target_execute("test")
self.pkg._if_make_target_execute("check")
def installcheck(self) -> None:
"""Run "make" on the ``installcheck`` target, if found."""
with fs.working_dir(self.build_directory):
self.pkg._if_make_target_execute("installcheck")
def setup_build_environment(self, env):
if self.spec.platform == "darwin" and macos_version() >= Version("11"):
# Many configure files rely on matching '10.*' for macOS version
# detection and fail to add flags if it shows as version 11.
env.set("MACOSX_DEPLOYMENT_TARGET", "10.16")
def with_or_without(
self,
name: str,
activation_value: Optional[Union[Callable, str]] = None,
variant: Optional[str] = None,
) -> List[str]:
"""Inspects a variant and returns the arguments that activate
or deactivate the selected feature(s) for the configure options.
This function works on all type of variants. For bool-valued variants
it will return by default ``--with-{name}`` or ``--without-{name}``.
For other kinds of variants it will cycle over the allowed values and
return either ``--with-{value}`` or ``--without-{value}``.
If activation_value is given, then for each possible value of the
variant, the option ``--with-{value}=activation_value(value)`` or
``--without-{value}`` will be added depending on whether or not
``variant=value`` is in the spec.
Args:
name: name of a valid multi-valued variant
activation_value: callable that accepts a single value and returns the parameter to be
used leading to an entry of the type ``--with-{name}={parameter}``.
The special value "prefix" can also be assigned and will return
``spec[name].prefix`` as activation parameter.
Returns:
list of arguments to configure
"""
return self._activate_or_not(name, "with", "without", activation_value, variant)
def enable_or_disable(
self,
name: str,
activation_value: Optional[Union[Callable, str]] = None,
variant: Optional[str] = None,
) -> List[str]:
"""Same as
:meth:`~spack.build_systems.autotools.AutotoolsBuilder.with_or_without`
but substitute ``with`` with ``enable`` and ``without`` with ``disable``.
Args:
name: name of a valid multi-valued variant
activation_value: if present accepts a single value and returns the parameter to be
used leading to an entry of the type ``--enable-{name}={parameter}``
The special value "prefix" can also be assigned and will return
``spec[name].prefix`` as activation parameter.
Returns:
list of arguments to configure
"""
return self._activate_or_not(name, "enable", "disable", activation_value, variant)
def configure_args(self) -> List[str]:
"""Return the list of all the arguments that must be passed to configure,
except ``--prefix`` which will be pre-pended to the list.
"""
return []
@spack.phase_callbacks.run_before("autoreconf")
def _delete_configure_to_force_update(self) -> None:
if self.force_autoreconf:
fs.force_remove(self.configure_abs_path)
@spack.phase_callbacks.run_after("autoreconf")
def _do_patch_config_files(self) -> None:
"""Some packages ship with older config.guess/config.sub files and need to
@@ -304,6 +474,24 @@ def runs_ok(script_abs_path):
fs.copy(substitutes[name], abs_path)
os.chmod(abs_path, mode)
@spack.phase_callbacks.run_after("autoreconf")
def _set_configure_or_die(self) -> None:
"""Ensure the presence of a "configure" script, or raise. If the "configure"
is found, a module level attribute is set.
Raises:
RuntimeError: if the "configure" script is not found
"""
# Check if the "configure" script is there. If not raise a RuntimeError.
if not os.path.exists(self.configure_abs_path):
msg = "configure script not found in {0}"
raise RuntimeError(msg.format(self.configure_directory))
# Monkey-patch the configure script in the corresponding module
globals_for_pkg = spack.build_environment.ModuleChangePropagator(self.pkg)
globals_for_pkg.configure = Executable(self.configure_abs_path)
globals_for_pkg.propagate_changes_to_mro()
@spack.phase_callbacks.run_before("configure")
def _patch_usr_bin_file(self) -> None:
"""On NixOS file is not available in /usr/bin/file. Patch configure
@@ -357,6 +545,13 @@ def _do_patch_libtool_configure(self) -> None:
)
# Support Libtool 2.4.2 and older:
x.filter(regex=r'^(\s*test \$p = "-R")(; then\s*)$', repl=r'\1 || test x-l = x"$p"\2')
# Configure scripts generated with libtool < 2.5.4 have a faulty test for the
# -single_module linker flag. A deprecation warning makes it think the default is
# -multi_module, triggering it to use problematic linker flags (such as ld -r). The
# linker default is `-single_module` from (ancient) macOS 10.4, so override by setting
# `lt_cv_apple_cc_single_mod=yes`. See the fix in libtool commit
# 82f7f52123e4e7e50721049f7fa6f9b870e09c9d.
x.filter("lt_cv_apple_cc_single_mod=no", "lt_cv_apple_cc_single_mod=yes", string=True)
@spack.phase_callbacks.run_after("configure")
def _do_patch_libtool(self) -> None:
@@ -393,44 +588,33 @@ def _do_patch_libtool(self) -> None:
markers[tag] = "LIBTOOL TAG CONFIG: {0}".format(tag.upper())
# Replace empty linker flag prefixes:
if self.spec.satisfies("%nag"):
if self.pkg.compiler.name == "nag":
# Nag is mixed with gcc and g++, which are recognized correctly.
# Therefore, we change only Fortran values:
nag_pkg = self.spec["fortran"].package
for tag in ["fc", "f77"]:
marker = markers[tag]
x.filter(
regex='^wl=""$',
repl=f'wl="{nag_pkg.linker_arg}"',
start_at=f"# ### BEGIN {marker}",
stop_at=f"# ### END {marker}",
repl='wl="{0}"'.format(self.pkg.compiler.linker_arg),
start_at="# ### BEGIN {0}".format(marker),
stop_at="# ### END {0}".format(marker),
)
else:
compiler_spec = spack.compilers.libraries.compiler_spec(self.spec)
if compiler_spec:
x.filter(regex='^wl=""$', repl='wl="{0}"'.format(compiler_spec.package.linker_arg))
x.filter(regex='^wl=""$', repl='wl="{0}"'.format(self.pkg.compiler.linker_arg))
# Replace empty PIC flag values:
for compiler, marker in markers.items():
if compiler == "cc":
language = "c"
elif compiler == "cxx":
language = "cxx"
else:
language = "fortran"
if language not in self.spec:
continue
for cc, marker in markers.items():
x.filter(
regex='^pic_flag=""$',
repl=f'pic_flag="{self.spec[language].package.pic_flag}"',
start_at=f"# ### BEGIN {marker}",
stop_at=f"# ### END {marker}",
repl='pic_flag="{0}"'.format(
getattr(self.pkg.compiler, "{0}_pic_flag".format(cc))
),
start_at="# ### BEGIN {0}".format(marker),
stop_at="# ### END {0}".format(marker),
)
# Other compiler-specific patches:
if self.spec.satisfies("%fj"):
if self.pkg.compiler.name == "fj":
x.filter(regex="-nostdlib", repl="", string=True)
rehead = r"/\S*/"
for o in [
@@ -443,7 +627,7 @@ def _do_patch_libtool(self) -> None:
r"crtendS\.o",
]:
x.filter(regex=(rehead + o), repl="")
elif self.spec.satisfies("%nag"):
elif self.pkg.compiler.name == "nag":
for tag in ["fc", "f77"]:
marker = markers[tag]
start_at = "# ### BEGIN {0}".format(marker)
@@ -517,142 +701,27 @@ def _do_patch_libtool(self) -> None:
stop_at=stop_at,
)
spack.phase_callbacks.run_after("build")(execute_build_time_tests)
spack.phase_callbacks.run_after("install")(execute_install_time_tests)
@spack.phase_callbacks.run_after("autoreconf")
def set_configure_or_die(self) -> None:
"""Ensure the presence of a "configure" script, or raise. If the "configure"
is found, a module level attribute is set.
Raises:
RuntimeError: if the "configure" script is not found
@spack.phase_callbacks.run_after("install")
def _remove_libtool_archives(self) -> None:
"""Remove all .la files in prefix sub-folders if the package sets
``install_libtool_archives`` to be False.
"""
# Check if the "configure" script is there. If not raise a RuntimeError.
if not os.path.exists(self.configure_abs_path):
msg = "configure script not found in {0}"
raise RuntimeError(msg.format(self.configure_directory))
# Monkey-patch the configure script in the corresponding module
globals_for_pkg = spack.build_environment.ModuleChangePropagator(self.pkg)
globals_for_pkg.configure = Executable(self.configure_abs_path)
globals_for_pkg.propagate_changes_to_mro()
def configure_args(self) -> List[str]:
"""Return the list of all the arguments that must be passed to configure,
except ``--prefix`` which will be pre-pended to the list.
"""
return []
def autoreconf(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
) -> None:
"""Not needed usually, configure should be already there"""
# If configure exists nothing needs to be done
if os.path.exists(self.configure_abs_path):
# If .la files are to be installed there's nothing to do
if self.install_libtool_archives:
return
# Else try to regenerate it, which requires a few build dependencies
ensure_build_dependencies_or_raise(
spec=spec,
dependencies=["autoconf", "automake", "libtool"],
error_msg="Cannot generate configure",
)
# Remove the files and create a log of what was removed
libtool_files = fs.find(str(self.pkg.prefix), "*.la", recursive=True)
with fs.safe_remove(*libtool_files):
fs.mkdirp(os.path.dirname(self._removed_la_files_log))
with open(self._removed_la_files_log, mode="w", encoding="utf-8") as f:
f.write("\n".join(libtool_files))
tty.msg("Configure script not found: trying to generate it")
tty.warn("*********************************************************")
tty.warn("* If the default procedure fails, consider implementing *")
tty.warn("* a custom AUTORECONF phase in the package *")
tty.warn("*********************************************************")
with fs.working_dir(self.configure_directory):
# This line is what is needed most of the time
# --install, --verbose, --force
autoreconf_args = ["-ivf"]
autoreconf_args += self.autoreconf_search_path_args
autoreconf_args += self.autoreconf_extra_args
self.pkg.module.autoreconf(*autoreconf_args)
spack.phase_callbacks.run_after("build")(execute_build_time_tests)
def check(self) -> None:
"""Run "make" on the ``test`` and ``check`` targets, if found."""
with fs.working_dir(self.build_directory):
self.pkg._if_make_target_execute("test")
self.pkg._if_make_target_execute("check")
# On macOS, force rpaths for shared library IDs and remove duplicate rpaths
spack.phase_callbacks.run_after("install", when="platform=darwin")(apply_macos_rpath_fixups)
def _activate_or_not(
self,
@@ -774,93 +843,6 @@ def _default_generator(is_activated):
args.append(line_generator(activated))
return args
# On macOS, force rpaths for shared library IDs and remove duplicate rpaths
spack.phase_callbacks.run_after("install", when="platform=darwin")(apply_macos_rpath_fixups)
def _autoreconf_search_path_args(spec: spack.spec.Spec) -> List[str]:
dirs_seen: Set[Tuple[int, int]] = set()

View File: lib/spack/spack/build_systems/cached_cmake.py

@@ -10,7 +10,8 @@
import llnl.util.tty as tty
import spack.phase_callbacks
from spack.directives import depends_on
import spack.spec
import spack.util.prefix
from .cmake import CMakeBuilder, CMakePackage
@@ -68,7 +69,12 @@ class CachedCMakeBuilder(CMakeBuilder):
@property
def cache_name(self):
return f"{self.pkg.name}-{self.spec.architecture.platform}-{self.spec.dag_hash()}.cmake"
return "{0}-{1}-{2}@{3}.cmake".format(
self.pkg.name,
self.pkg.spec.architecture,
self.pkg.spec.compiler.name,
self.pkg.spec.compiler.version,
)
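For orientation, a hedged illustration (spec values invented) of what the two naming schemes above produce:

# New scheme (platform + DAG hash), e.g.:
#   zlib-linux-abcdef1234567890abcdef1234567890.cmake
# Old scheme (full architecture + compiler name@version), e.g.:
#   zlib-linux-ubuntu22.04-x86_64-gcc@12.3.0.cmake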
@property
def cache_path(self):
@@ -111,9 +117,7 @@ def initconfig_compiler_entries(self):
# Fortran compiler is optional
if "FC" in os.environ:
spack_fc_entry = cmake_cache_path("CMAKE_Fortran_COMPILER", os.environ["FC"])
system_fc_entry = cmake_cache_path(
"CMAKE_Fortran_COMPILER", self.spec["fortran"].package.fortran
)
system_fc_entry = cmake_cache_path("CMAKE_Fortran_COMPILER", self.pkg.compiler.fc)
else:
spack_fc_entry = "# No Fortran compiler defined in spec"
system_fc_entry = "# No Fortran compiler defined in spec"
@@ -129,8 +133,8 @@ def initconfig_compiler_entries(self):
" " + cmake_cache_path("CMAKE_CXX_COMPILER", os.environ["CXX"]),
" " + spack_fc_entry,
"else()\n",
" " + cmake_cache_path("CMAKE_C_COMPILER", self.spec["c"].package.cc),
" " + cmake_cache_path("CMAKE_CXX_COMPILER", self.spec["cxx"].package.cxx),
" " + cmake_cache_path("CMAKE_C_COMPILER", self.pkg.compiler.cc),
" " + cmake_cache_path("CMAKE_CXX_COMPILER", self.pkg.compiler.cxx),
" " + system_fc_entry,
"endif()\n",
]
@@ -291,6 +295,13 @@ def initconfig_hardware_entries(self):
entries.append(cmake_cache_string("AMDGPU_TARGETS", arch_str))
entries.append(cmake_cache_string("GPU_TARGETS", arch_str))
if spec.satisfies("%gcc"):
entries.append(
cmake_cache_string(
"CMAKE_HIP_FLAGS", f"--gcc-toolchain={self.pkg.compiler.prefix}"
)
)
return entries
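Assuming ``cmake_cache_string`` renders a conventional CMake cache ``set()`` entry (GCC prefix invented), the ``%gcc`` branch above would add a line of roughly this shape to the generated cache file:

# Hypothetical line in the generated initial-cache file:
#   set(CMAKE_HIP_FLAGS "--gcc-toolchain=/opt/spack/gcc-12.3.0" CACHE STRING "")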
def std_initconfig_entries(self):
@@ -321,7 +332,9 @@ def initconfig_package_entries(self):
"""This method is to be overwritten by the package"""
return []
def initconfig(self, pkg, spec, prefix):
def initconfig(
self, pkg: "CachedCMakePackage", spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
cache_entries = (
self.std_initconfig_entries()
+ self.initconfig_compiler_entries()
@@ -358,10 +371,6 @@ class CachedCMakePackage(CMakePackage):
CMakeBuilder = CachedCMakeBuilder
# These dependencies are assumed in the builder
depends_on("c", type="build")
depends_on("cxx", type="build")
def flag_handler(self, name, flags):
if name in ("cflags", "cxxflags", "cppflags", "fflags"):
return None, None, None # handled in the cmake cache

View File: lib/spack/spack/build_systems/cargo.py

@@ -7,6 +7,8 @@
import spack.builder
import spack.package_base
import spack.phase_callbacks
import spack.spec
import spack.util.prefix
from spack.directives import build_system, depends_on
from spack.multimethod import when
@@ -81,12 +83,16 @@ def check_args(self):
def setup_build_environment(self, env):
env.set("CARGO_HOME", self.stage.path)
def build(self, pkg, spec, prefix):
def build(
self, pkg: CargoPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Runs ``cargo install`` in the source directory"""
with fs.working_dir(self.build_directory):
pkg.module.cargo("install", "--root", "out", "--path", ".", *self.build_args)
def install(self, pkg, spec, prefix):
def install(
self, pkg: CargoPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Copy build files into package prefix."""
with fs.working_dir(self.build_directory):
fs.install_tree("out", prefix)

View File: lib/spack/spack/build_systems/cmake.py

@@ -454,10 +454,7 @@ def cmake_args(self) -> List[str]:
return []
def cmake(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
self, pkg: CMakePackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Runs ``cmake`` in the build directory"""
@@ -474,10 +471,7 @@ def cmake(
pkg.module.cmake(*options)
def build(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
self, pkg: CMakePackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Make the build targets"""
with fs.working_dir(self.build_directory):
@@ -488,10 +482,7 @@ def build(
pkg.module.ninja(*self.build_targets)
def install(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
self, pkg: CMakePackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Make the install targets"""
with fs.working_dir(self.build_directory):

View File: lib/spack/spack/build_systems/compiler.py

@@ -6,13 +6,12 @@
import pathlib
import re
import sys
from typing import Dict, List, Optional, Sequence, Tuple, Union
from typing import Dict, List, Sequence, Tuple, Union
import llnl.util.tty as tty
from llnl.util.lang import classproperty, memoized
from llnl.util.lang import classproperty
import spack
import spack.compilers.error
import spack.compiler
import spack.package_base
import spack.util.executable
@@ -44,9 +43,6 @@ class CompilerPackage(spack.package_base.PackageBase):
#: Static definition of languages supported by this class
compiler_languages: Sequence[str] = ["c", "cxx", "fortran"]
#: Relative path to compiler wrappers
link_paths: Dict[str, str] = {}
def __init__(self, spec: "spack.spec.Spec"):
super().__init__(spec)
msg = f"Supported languages for {spec} are not a subset of possible supported languages"
@@ -81,14 +77,14 @@ def executables(cls) -> Sequence[str]:
]
@classmethod
def determine_version(cls, exe: Path) -> str:
def determine_version(cls, exe: Path):
version_argument = cls.compiler_version_argument
if isinstance(version_argument, str):
version_argument = (version_argument,)
for va in version_argument:
try:
output = compiler_output(exe, version_argument=va)
output = spack.compiler.get_compiler_version_output(exe, va)
match = re.search(cls.compiler_version_regex, output)
if match:
return ".".join(match.groups())
@@ -99,11 +95,10 @@ def determine_version(cls, exe: Path) -> str:
f"[{__file__}] Cannot detect a valid version for the executable "
f"{str(exe)}, for package '{cls.name}': {e}"
)
return ""
@classmethod
def compiler_bindir(cls, prefix: Path) -> Path:
"""Overridable method for the location of the compiler bindir within the prefix"""
"""Overridable method for the location of the compiler bindir within the preifx"""
return os.path.join(prefix, "bin")
@classmethod
@@ -147,109 +142,3 @@ def determine_compiler_paths(cls, exes: Sequence[Path]) -> Dict[str, Path]:
def determine_variants(cls, exes: Sequence[Path], version_str: str) -> Tuple:
# path determination is separated so it can be reused in subclasses
return "", {"compilers": cls.determine_compiler_paths(exes=exes)}
#: Returns the argument needed to set the RPATH, or None if it does not exist
rpath_arg: Optional[str] = "-Wl,-rpath,"
#: Flag that needs to be used to pass an argument to the linker
linker_arg: str = "-Wl,"
#: Flag used to produce Position Independent Code
pic_flag: str = "-fPIC"
#: Flag used to get verbose output
verbose_flags: str = "-v"
#: Flag to activate OpenMP support
openmp_flag: str = "-fopenmp"
required_libs: List[str] = []
def standard_flag(self, *, language: str, standard: str) -> str:
"""Returns the flag used to enforce a given standard for a language"""
if language not in self.supported_languages:
raise spack.compilers.error.UnsupportedCompilerFlag(
f"{self.spec} does not provide the '{language}' language"
)
try:
return self._standard_flag(language=language, standard=standard)
except (KeyError, RuntimeError) as e:
raise spack.compilers.error.UnsupportedCompilerFlag(
f"{self.spec} does not provide the '{language}' standard {standard}"
) from e
def _standard_flag(self, *, language: str, standard: str) -> str:
raise NotImplementedError("Must be implemented by derived classes")
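A hedged sketch of the ``_standard_flag`` a derived class might provide (flag table invented, GCC-style). A plain ``KeyError`` suffices, since ``standard_flag`` above converts it into ``UnsupportedCompilerFlag``:

def _standard_flag(self, *, language: str, standard: str) -> str:
    # An unknown language/standard raises KeyError, handled by the caller.
    flags = {
        "c": {"99": "-std=c99", "11": "-std=c11"},
        "cxx": {"14": "-std=c++14", "17": "-std=c++17"},
    }
    return flags[language][standard]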
def archspec_name(self) -> str:
"""Name that archspec uses to refer to this compiler"""
return self.spec.name
@property
def cc(self) -> Optional[str]:
assert self.spec.concrete, "cannot retrieve C compiler, spec is not concrete"
if self.spec.external:
return self.spec.extra_attributes["compilers"].get("c", None)
return self._cc_path()
def _cc_path(self) -> Optional[str]:
"""Returns the path to the C compiler, if the package was installed by Spack"""
return None
@property
def cxx(self) -> Optional[str]:
assert self.spec.concrete, "cannot retrieve C++ compiler, spec is not concrete"
if self.spec.external:
return self.spec.extra_attributes["compilers"].get("cxx", None)
return self._cxx_path()
def _cxx_path(self) -> Optional[str]:
"""Returns the path to the C++ compiler, if the package was installed by Spack"""
return None
@property
def fortran(self):
assert self.spec.concrete, "cannot retrieve Fortran compiler, spec is not concrete"
if self.spec.external:
return self.spec.extra_attributes["compilers"].get("fortran", None)
return self._fortran_path()
def _fortran_path(self) -> Optional[str]:
"""Returns the path to the Fortran compiler, if the package was installed by Spack"""
return None
@memoized
def _compiler_output(
compiler_path: Path, *, version_argument: str, ignore_errors: Tuple[int, ...] = ()
) -> str:
"""Returns the output from the compiler invoked with the given version argument.
Args:
compiler_path: path of the compiler to be invoked
version_argument: the argument used to extract version information
ignore_errors: compiler exit codes to tolerate instead of treating as failures
"""
compiler = spack.util.executable.Executable(compiler_path)
if not version_argument:
return compiler(
output=str, error=str, ignore_errors=ignore_errors, timeout=120, fail_on_error=True
)
return compiler(
version_argument,
output=str,
error=str,
ignore_errors=ignore_errors,
timeout=120,
fail_on_error=True,
)
def compiler_output(
compiler_path: Path, *, version_argument: str, ignore_errors: Tuple[int, ...] = ()
) -> str:
"""Wrapper for _get_compiler_version_output()."""
# This ensures that we memoize compiler output by *absolute path*,
# not just executable name. If we don't do this, and the path changes
# (e.g., during testing), we can get incorrect results.
if not os.path.isabs(compiler_path):
compiler_path = spack.util.executable.which_string(compiler_path, required=True)
return _compiler_output(
compiler_path, version_argument=version_argument, ignore_errors=ignore_errors
)
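A usage sketch (assuming ``gcc`` is on PATH and resolves to ``/usr/bin/gcc``): because relative names are normalized first, both calls below share one memoized entry and the compiler runs only once:

out_by_name = compiler_output("gcc", version_argument="--version")
out_by_path = compiler_output("/usr/bin/gcc", version_argument="--version")
assert out_by_name == out_by_path  # memoized on the absolute path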

View File: lib/spack/spack/build_systems/generic.py

@@ -7,6 +7,8 @@
import spack.directives
import spack.package_base
import spack.phase_callbacks
import spack.spec
import spack.util.prefix
from ._checks import BuilderWithDefaults, apply_macos_rpath_fixups, execute_install_time_tests
@@ -48,3 +50,8 @@ class GenericBuilder(BuilderWithDefaults):
# unconditionally perform any post-install phase tests
spack.phase_callbacks.run_after("install")(execute_install_time_tests)
def install(
self, pkg: Package, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
raise NotImplementedError

View File

@@ -7,6 +7,8 @@
import spack.builder
import spack.package_base
import spack.phase_callbacks
import spack.spec
import spack.util.prefix
from spack.directives import build_system, extends
from spack.multimethod import when
@@ -88,12 +90,16 @@ def check_args(self):
"""Argument for ``go test`` during check phase"""
return []
def build(self, pkg, spec, prefix):
def build(
self, pkg: GoPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Runs ``go build`` in the source directory"""
with fs.working_dir(self.build_directory):
pkg.module.go("build", *self.build_args)
def install(self, pkg, spec, prefix):
def install(
self, pkg: GoPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Install built binaries into prefix bin."""
with fs.working_dir(self.build_directory):
fs.mkdirp(prefix.bin)

View File

@@ -7,7 +7,9 @@
import spack.builder
import spack.package_base
import spack.spec
import spack.util.executable
import spack.util.prefix
from spack.directives import build_system, depends_on, extends
from spack.multimethod import when
@@ -55,7 +57,9 @@ class LuaBuilder(spack.builder.Builder):
#: Names associated with package attributes in the old build-system format
legacy_attributes = ()
def unpack(self, pkg, spec, prefix):
def unpack(
self, pkg: LuaPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
if os.path.splitext(pkg.stage.archive_file)[1] == ".rock":
directory = pkg.luarocks("unpack", pkg.stage.archive_file, output=str)
dirlines = directory.split("\n")
@@ -66,15 +70,16 @@ def unpack(self, pkg, spec, prefix):
def _generate_tree_line(name, prefix):
return """{{ name = "{name}", root = "{prefix}" }};""".format(name=name, prefix=prefix)
def generate_luarocks_config(self, pkg, spec, prefix):
def generate_luarocks_config(
self, pkg: LuaPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
spec = self.pkg.spec
table_entries = []
for d in spec.traverse(deptype=("build", "run")):
if d.package.extends(self.pkg.extendee_spec):
table_entries.append(self._generate_tree_line(d.name, d.prefix))
path = self._luarocks_config_path()
with open(path, "w", encoding="utf-8") as config:
with open(self._luarocks_config_path(), "w", encoding="utf-8") as config:
config.write(
"""
deps_mode="all"
@@ -85,23 +90,26 @@ def generate_luarocks_config(self, pkg, spec, prefix):
"\n".join(table_entries)
)
)
return path
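For reference, a hedged sketch (dependency name and path invented; only the template fragments visible in this hunk are assumed) of the kind of content written to ``spack_luarocks.lua``:

# Hypothetical generated file (Lua syntax, shown as a comment):
#   deps_mode="all"
#   ...one tree line per build/run dependency that extends lua, e.g.:
#   { name = "lua-sol2", root = "/opt/spack/opt/lua-sol2-3.3.0-abcdef" };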
def preprocess(self, pkg, spec, prefix):
def preprocess(
self, pkg: LuaPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Override this to preprocess source before building with luarocks"""
pass
def luarocks_args(self):
return []
def install(self, pkg, spec, prefix):
def install(
self, pkg: LuaPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
rock = "."
specs = find(".", "*.rockspec", recursive=False)
if specs:
rock = specs[0]
rocks_args = self.luarocks_args()
rocks_args.append(rock)
self.pkg.luarocks("--tree=" + prefix, "make", *rocks_args)
pkg.luarocks("--tree=" + prefix, "make", *rocks_args)
def _luarocks_config_path(self):
return os.path.join(self.pkg.stage.source_path, "spack_luarocks.lua")

View File

@@ -98,29 +98,20 @@ def build_directory(self) -> str:
return self.pkg.stage.source_path
def edit(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
self, pkg: MakefilePackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Edit the Makefile before calling make. The default is a no-op."""
pass
def build(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
self, pkg: MakefilePackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Run "make" on the build targets specified by the builder."""
with fs.working_dir(self.build_directory):
pkg.module.make(*self.build_targets)
def install(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
self, pkg: MakefilePackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Run "make" on the install targets specified by the builder."""
with fs.working_dir(self.build_directory):

View File: lib/spack/spack/build_systems/maven.py

@@ -5,6 +5,8 @@
import spack.builder
import spack.package_base
import spack.spec
import spack.util.prefix
from spack.directives import build_system, depends_on
from spack.multimethod import when
from spack.util.executable import which
@@ -58,16 +60,20 @@ def build_args(self):
"""List of args to pass to build phase."""
return []
def build(self, pkg, spec, prefix):
def build(
self, pkg: MavenPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Compile code and package into a JAR file."""
with fs.working_dir(self.build_directory):
mvn = which("mvn")
mvn = which("mvn", required=True)
if self.pkg.run_tests:
mvn("verify", *self.build_args())
else:
mvn("package", "-DskipTests", *self.build_args())
def install(self, pkg, spec, prefix):
def install(
self, pkg: MavenPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Copy to installation prefix."""
with fs.working_dir(self.build_directory):
fs.install_tree(".", prefix)

View File: lib/spack/spack/build_systems/meson.py

@@ -188,10 +188,7 @@ def meson_args(self) -> List[str]:
return []
def meson(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
self, pkg: MesonPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Run ``meson`` in the build directory"""
options = []
@@ -204,10 +201,7 @@ def meson(
pkg.module.meson(*options)
def build(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
self, pkg: MesonPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Make the build targets"""
options = ["-v"]
@@ -216,10 +210,7 @@ def build(
pkg.module.ninja(*options)
def install(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
self, pkg: MesonPackage, spec: spack.spec.Spec, prefix: spack.util.prefix.Prefix
) -> None:
"""Make the install targets"""
with fs.working_dir(self.build_directory):

Some files were not shown because too many files have changed in this diff.