Compare commits

...

258 Commits

Author SHA1 Message Date
Massimiliano Culpo
4f0006a480 openfoam: fix oneapi support 2024-12-08 12:18:35 +01:00
Massimiliano Culpo
63e328645f gcc-runtime: simplify condition for providing libgfortran 2024-12-08 12:18:34 +01:00
Massimiliano Culpo
36c14561a6 fixup 2024-12-08 12:18:33 +01:00
Massimiliano Culpo
75f9940777 intel-oneapi-compilers: use the correct uarch options 2024-12-08 12:18:33 +01:00
Massimiliano Culpo
f6e9ec48c0 intel-oneapi-compilers: use the correct uarch options 2024-12-08 12:18:32 +01:00
Massimiliano Culpo
df92dad225 Raise UnsupportedCompilerFlag when a flag is not supported 2024-12-08 12:18:32 +01:00
Massimiliano Culpo
ecd13e2df8 Remove SPACK_COMPILER_SPEC from the environment 2024-12-08 12:18:31 +01:00
Massimiliano Culpo
3a8d573598 netcdf-cxx4: use https instead of ftp 2024-12-08 12:18:31 +01:00
Massimiliano Culpo
9700f1d716 (to be removed) Make spack unit test runnable 2024-12-08 12:18:30 +01:00
Massimiliano Culpo
9809c9e35c Update command completion
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-12-08 12:18:30 +01:00
Massimiliano Culpo
9f49c04dd6 pipelines: fix vtk version on windows 2024-12-08 12:18:30 +01:00
Massimiliano Culpo
d9f3942966 pipelines: relax ppc64le requirements 2024-12-08 12:18:29 +01:00
Massimiliano Culpo
5438dd4dc3 pipelines: relax rocm requirements 2024-12-08 12:18:29 +01:00
Massimiliano Culpo
c125f58284 pipelines: "tee" configuration, for better logging 2024-12-08 12:18:28 +01:00
Massimiliano Culpo
8b14500fdc Update pipeline configurations
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-12-08 12:18:28 +01:00
Massimiliano Culpo
66d7085567 Prepend compiler wrappers path last, so we don't risk finding externals 2024-12-08 12:18:27 +01:00
Massimiliano Culpo
6628c27e55 solver: temporarily enforce compilers to be externals 2024-12-08 12:18:27 +01:00
Massimiliano Culpo
0462fd4950 dyninst: add missing dependencies 2024-12-08 12:18:26 +01:00
Massimiliano Culpo
764b6fd084 builtin: minimal fix for _get_host_config_path 2024-12-08 12:18:26 +01:00
Massimiliano Culpo
cb405bcd78 builtin: fix for Windows pipelines 2024-12-08 12:18:25 +01:00
Massimiliano Culpo
e19274973c builtin: changes to packages 2024-12-08 12:18:25 +01:00
Massimiliano Culpo
4bedae3d3a fix: a compiler package sets dependent build environment only if used as such 2024-12-08 12:18:24 +01:00
Massimiliano Culpo
320fb7dde7 Allow different target flags for different compilers 2024-12-08 12:18:23 +01:00
Massimiliano Culpo
6a3f94a0bd Fix setting SPACK_TARGET_ARGS for concrete specs 2024-12-08 12:18:23 +01:00
Massimiliano Culpo
1f675bb742 Fix setting SPACK_TARGET_ARGS for concrete specs 2024-12-08 12:18:22 +01:00
Massimiliano Culpo
c92c603283 Fix concretization of julia
That package depends on llvm as a library, and the rule on compatible
targets for compilers was getting in the way.
2024-12-08 12:18:22 +01:00
Massimiliano Culpo
89fe2b8b46 Make Spec.compiler behavior stricter
Now the adaptor will raise if the Spec has no C, C++,
or Fortran compiler.
2024-12-08 12:18:19 +01:00
Massimiliano Culpo
998270b714 Make Spec.compiler behavior stricter
Now the adaptor will raise if the Spec has no C, C++,
or Fortran compiler.
2024-12-08 12:18:03 +01:00
Massimiliano Culpo
f764029b0a unit-tests: remove a few FIXMEs 2024-12-08 12:18:02 +01:00
Massimiliano Culpo
67011c8e88 Spec.__contains__: traverse only lin/run + direct build 2024-12-08 12:17:24 +01:00
Massimiliano Culpo
7ee73ed1b6 Spec.__contains__: traverse only lin/run + direct build 2024-12-08 12:17:24 +01:00
Massimiliano Culpo
3408f7ec56 Remove a test that should fail according to concretization rules 2024-12-08 12:17:23 +01:00
Massimiliano Culpo
95f8b335a2 Add a unit-test for satisfies and __getitem__ semantic 2024-12-08 12:17:23 +01:00
Massimiliano Culpo
ab4e8449a2 Add a unit-test for compiler self-dependencies 2024-12-08 12:17:22 +01:00
Massimiliano Culpo
0ea1ead751 Exempt "compilers" and "runtimes" from default requirements 2024-12-08 12:17:22 +01:00
Massimiliano Culpo
8ca15e25bf unit-tests: mark a few tests as xfail, or skip, for now
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-12-08 12:17:21 +01:00
Massimiliano Culpo
e6cd03711e unit-tests: fix most unit tests to account for the new model
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-12-08 12:17:21 +01:00
Massimiliano Culpo
fa0b10148f asp: fix intel-oneapi-compilers-classic 2024-12-08 12:17:21 +01:00
Massimiliano Culpo
fefdafea50 Exempt "compilers" and "runtimes" from default requirements 2024-12-08 12:17:20 +01:00
Massimiliano Culpo
67201b168c Allow self concretization to bootstrap compilers 2024-12-08 12:17:20 +01:00
Massimiliano Culpo
e9372fe24c Add more constraint to providers 2024-12-08 12:17:19 +01:00
Massimiliano Culpo
9b0f0a8cad Fix for duplicate glibc in concretization 2024-12-08 12:17:19 +01:00
Massimiliano Culpo
40e0d389ea Improve reporting when bootstrapping from source
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-12-08 12:17:18 +01:00
Massimiliano Culpo
1415f50a64 Improve error messages for statically checked specs
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-12-08 12:17:18 +01:00
Massimiliano Culpo
db2d1b40fc spec: implemented direct satisfy semantic
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-12-08 12:17:17 +01:00
Massimiliano Culpo
e4ace1a63a compilers_for_arch: improve implementation
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-12-08 12:17:17 +01:00
Massimiliano Culpo
0510c9fcde Fixup binary cache reuse
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-12-08 12:17:16 +01:00
Massimiliano Culpo
0036713b81 Write adaptors for CompilerSpec and Compiler
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-12-08 12:17:16 +01:00
Massimiliano Culpo
c55ebdb183 Make BaseConfiguration pickleable
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-12-08 12:17:16 +01:00
Massimiliano Culpo
4a4ffe4733 (WIP) Fix cray manifest
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-12-08 12:17:15 +01:00
Massimiliano Culpo
88cb090e00 (WIP) Fix LMod module generation
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-12-08 12:17:15 +01:00
Massimiliano Culpo
15d75caafe (WIP) Remove deprecated argument for Spec.format
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-12-08 12:17:14 +01:00
Massimiliano Culpo
9601f6a4c4 fixup: spec copies compiler annotation
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-12-08 12:17:14 +01:00
Massimiliano Culpo
30b9f6a1c1 Restore bootstrapping from binaries
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-12-08 12:17:13 +01:00
Massimiliano Culpo
0646759258 Restore bootstrapping from sources
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-12-08 12:17:13 +01:00
Massimiliano Culpo
255246bcf3 spec: change semantic of __getitem__
Now __getitem__ can pick items in the transitive link/run graph,
or from direct build dependencies.

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-12-08 12:17:12 +01:00
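A minimal sketch of the new lookup semantic, using the public `Spec` API (`__getitem__`, `dependencies`, `traverse`); the names and calls here are illustrative, not code from this commit:

```python
from spack.spec import Spec

# Illustrative: under the new semantic, spec["name"] resolves first in the
# transitive link/run graph, and can also pick direct build dependencies.
spec = Spec("py-numpy").concretized()

blas = spec["blas"]                                 # found via transitive link/run edges
direct_build = spec.dependencies(deptype="build")   # direct build deps remain reachable
link_run = list(spec.traverse(deptype=("link", "run")))  # the primary search space
```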
Massimiliano Culpo
c52ff9be8d spec: bump specfile format to v5
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-12-08 12:17:12 +01:00
Massimiliano Culpo
fa40e6d021 Overhaul of the spack.compilers package
Now the package contains modules that help with using, or
detecting, compiler packages.

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-12-08 12:17:11 +01:00
Massimiliano Culpo
5614e23f0b Remove spack.compilers Python modules
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-12-08 12:17:11 +01:00
Massimiliano Culpo
87de7dbbfd (WIP) Install mechanism 2024-12-08 12:17:11 +01:00
Massimiliano Culpo
a2e94365e2 (WIP) Recover bootstrapping from binaries on linux 2024-12-08 12:17:10 +01:00
Massimiliano Culpo
7a7556f154 unit-tests: fix concretization and spack compiler tests 2024-12-08 12:17:10 +01:00
Massimiliano Culpo
18152b9b0f builtin.mock et al.: changes to packages
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-12-08 12:17:09 +01:00
Massimiliano Culpo
8c1a4de9e2 solver: first working implementation of compiler as nodes
This commit changes the model to treat compilers as nodes, and
drops the concept of a "compiler" as a bundle of a C, C++, and
Fortran compiler.

Implementation does not rely on `Compiler` or `CompilerSpec`.
2024-12-08 12:17:09 +01:00
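In spec-string terms, the node model reads roughly as follows (a hedged sketch of the described behavior, not code from this commit):

```python
from spack.spec import Spec

# Sketch: '%gcc' now attaches gcc to hdf5 as a direct build-dependency node,
# rather than filling a separate CompilerSpec field on the hdf5 node.
s = Spec("hdf5 %gcc@13")
# After concretization, the compiler would appear among the build
# dependencies, e.g. in s.dependencies(deptype="build"), like any other
# package in the DAG.
```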
Massimiliano Culpo
a96acc548f Deprecate packages:all:compiler and update default configs
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-12-08 12:17:08 +01:00
Massimiliano Culpo
20b4acb8b8 directives: remove workaround for the c, cxx and fortran language
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-12-08 12:17:08 +01:00
Massimiliano Culpo
8a78eb8c28 spack audit: account for % new semantic
Since % means "direct build dependency", we need to exclude
it from the check in audits.
2024-12-08 12:17:07 +01:00
Massimiliano Culpo
a8806e494b Overhaul the spack compiler command
This reverts commit 2c47dddbc1.

Now, `spack compiler` writes to packages.yaml by default. Entries
in the old `compilers.yaml` are converted to external specs as a way to
support legacy configuration.

Since this operation is expensive, an environment variable can be
used to enforce the deprecation of `compilers.yaml`.

The --mixed-toolchain option has been deprecated, since it stops
making sense once compilers are treated as nodes.
2024-12-08 12:17:07 +01:00
Massimiliano Culpo
d72abc3dc7 Allow reading old JSON files 2024-12-08 12:17:06 +01:00
Massimiliano Culpo
4e69c7497e parse_with_version_concrete: remove compiler= switch 2024-12-08 12:17:06 +01:00
Massimiliano Culpo
0595d0a9d0 Make CompilerSpec raise on __init__ 2024-12-08 12:17:06 +01:00
Massimiliano Culpo
d5bee0d4b0 parser: parse compilers as direct build deps 2024-12-08 12:17:05 +01:00
Harmen Stoppels
422f829e4e mirrors: add missing init file (#47977) 2024-12-08 09:31:22 +01:00
Alec Scott
f54c101b44 py-jedi: add v0.19.2 (#47569) 2024-12-07 16:26:31 +01:00
Harmen Stoppels
05acd29f38 extensions.py: remove import of spack.cmd (#47963) 2024-12-07 10:08:04 +01:00
Wouter Deconinck
77e2187e13 coverage.yml: fail_ci_if_error = true (#47731) 2024-12-06 11:01:10 -08:00
Harmen Stoppels
5c88e035f2 directives.py: remove redundant import (#47965) 2024-12-06 19:18:12 +01:00
Harmen Stoppels
94bd7b9afb build_environment: drop off by one fix (#47960) 2024-12-06 17:01:46 +01:00
Stephen Herbener
f181ac199a Upgraded version specs for ECMWF packages: eckit, atlas, ectrans, fckit, fiat (#47749) 2024-12-05 18:46:56 -08:00
Sreenivasa Murthy Kolam
a8da7993ad Bump up the version for rocm-6.2.4 release (#47707)
* Bump up the version for rocm-6.2.4 release
2024-12-05 18:41:02 -08:00
Dom Heinzeller
b808338792 py-uxarray: new package plus dependencies (#47573)
* Add py-param@2.1.1
* Add py-panel@1.5.2
* Add py-bokeh@3.5.2
* New package py-datashader
* New package py-geoviews
* New package py-holoviews
* WIP: new package py-uxarray
* New package py-antimeridian
* New package py-dask-expr
* New package py-spatialpandas
* New package py-hvplot
* Add dependency on py-dask-expr for 'py-dask@2024.3: +dataframe'
* Added all dependencies for py-uxarray; still having problems with py-dask +dataframe / py-dask-expr
* Fix style errors in many packages
* Clean up comments and fix style errors in var/spack/repos/builtin/packages/py-dask-expr/package.py
* In var/spack/repos/builtin/packages/py-dask/package.py: since 2023.8, the dataframe variant requires the array variant
* Fix style errors in py-uxarray package
2024-12-05 18:20:55 -08:00
Massimiliano Culpo
112e47cc23 Don't inject import statements in package recipes
Remove a hack done by RepoLoader, which was injecting an extra
```
from spack.package import *
```
at the beginning of each package.py
2024-12-05 12:48:00 -08:00
Dom Heinzeller
901cea7a54 Add conflict for pixman with Intel Classic (#47922) 2024-12-05 18:14:57 +01:00
Massimiliano Culpo
c1b2ac549d solver: partition classes related to requirement parsing into their own file (#47915) 2024-12-05 18:10:06 +01:00
Harmen Stoppels
4693b323ac spack.mirror: split into submodules (#47936) 2024-12-05 18:09:08 +01:00
Kin Fai Tse
1f2a68f2b6 tar: conditionally link iconv (#47933)
* fix broken packages requiring iconv

* tar: -liconv only when libiconv

* Revert "fix broken packages requiring iconv"

This reverts commit 5fa426b52f.

---------

Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2024-12-05 10:09:18 -06:00
Juan Miguel Carceller
3fcc38ef04 pandoramonitoring,pandorasdk: change docstrings that are wrong (#47937)
and are copied from the pandorapfa package

Co-authored-by: jmcarcell <jmcarcell@users.noreply.github.com>
2024-12-05 08:53:09 -07:00
Harmen Stoppels
22d104d7a9 ci: add bootstrap stack for python@3.6:3.13 (#47719)
Resurrect latest Python 3.6
Add clingo-bootstrap to Gitlab CI.
2024-12-05 10:07:24 +01:00
Todd Gamblin
8b1009a4a0 resource: clean up arguments and typing
- [x] Clean up arguments on the `resource` directive.
- [x] Add type annotations
- [x] Add `resource` to type annotations on `PackageBase`
- [x] Fix up `resource` docstrings

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-12-04 22:49:18 -08:00
Todd Gamblin
f54526957a directives: add type annotations to DirectiveMeta class
Some of the class-level annotations were wrong, and some were missing. Annotate all the
functions here and fix the class properties to match what's actually happening.

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-12-04 22:49:18 -08:00
Todd Gamblin
175a4bf101 directives: use Type[PackageBase] instead of PackageBase
The first argument to each Spack directive is not a `PackageBase` instance but a
`PackageBase` class object, so fix the type annotations to reflect this.

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-12-04 22:49:18 -08:00
Todd Gamblin
aa81d59958 directives: don't include Optional in PatchesType
`Optional` shouldn't be part of `PatchesType` -- it's clearer to specify `Optional` it
in the methods that need their arguments to be optional.

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-12-04 22:49:18 -08:00
James Taliaferro
6aafefd43d package version: Neovim 0.10.2 (#47925) 2024-12-04 23:17:55 +01:00
Satish Balay
ac82f344bd trilinos@develop: update kokkos dependency (#47838) 2024-12-04 19:53:38 +01:00
Harmen Stoppels
16fd77f9da rust-bootstrap: fix zlib dependency (#47894)
x
2024-12-04 02:28:19 -08:00
Harmen Stoppels
f82554a39b stage.py: improve path to url (#47898) 2024-12-04 09:41:38 +01:00
Massimiliano Culpo
2aaf50b8f7 eigen: remove unnecessary dependency on fortran (#47866) 2024-12-04 08:18:40 +01:00
Mathew Cleveland
b0b9cf15f7 add a '+no_warning' variant to METIS to prevent pervasive warning (#47452)
* add a '+no_warning' variant to metis to prevent pervasive warning
* fix formatting

---------

Co-authored-by: Cleveland <cleveland@lanl.gov>
Co-authored-by: mcourtois <mathieu.courtois@gmail.com>
2024-12-03 17:02:36 -08:00
v
8898e14e69 update py-numl and py-nugraph recipes (#47680)
* update py-numl and py-nugraph recipes

This commit adds the develop branch as a valid option for each of these two packages. In order to enable this, package tarballs are now retrieved from the GitHub source repository instead of PyPI, and their checksums and the build system have been updated accordingly.

* rename versions "develop" -> "main" to be consistent with branch name
2024-12-03 16:59:33 -08:00
Buldram
63c72634ea nim: add latest versions (#47844)
* nim: add latest versions
  In addition:
  - Create separate build and install phases.
  - Remove koch nimble call as it's redundant with koch tools.
  - Install all additional tools bundled with Nim instead of only Nimble.
* Fix 1.6 version
* nim: add devel
  In addition:
  - Fix build accessing user config/cache
2024-12-03 16:57:59 -08:00
Carson Woods
a7eacd77e3 bug fix: updated warning message to reflect impending v1.0 release (#47887) 2024-12-03 17:16:36 +01:00
Cédric Chevalier
09b7ea0400 Bump Kokkos and Kokkos-kernels to 4.5.00 (#47809)
* Bump Kokkos and Kokkos-kernels to 4.5.00

* petsc@:3.22 add a conflict with this new version of kokkos

* Update kokkos/kokkos-kernel dependency

---------

Co-authored-by: Satish Balay <balay@mcs.anl.gov>
2024-12-03 09:09:25 -07:00
Harmen Stoppels
b31dd46ab8 style.py: do not remove import spack in packages (#47895) 2024-12-03 16:04:18 +01:00
Harmen Stoppels
ad7417dee9 nwchem: add resource, remove patch (#47892)
Fixes a build failure due to a broken URL and improves the nwchem build without internet access.
2024-12-03 14:09:05 +01:00
Wouter Deconinck
c3de3b0b6f tar: add v1.35 (fix CVEs) (#47426) 2024-12-03 13:26:04 +01:00
Harmen Stoppels
6da9bf226a python: drop nis module also for < 3.13 (#47862)
The nis module was removed in Python 3.13. We had it default to ~nis,
no package requires +nis, and the required dependencies for +nis were
missing, so it is better to remove the nis module entirely.
2024-12-03 13:01:08 +01:00
Auriane R.
b3ee954e5b Remove duplicate version (#47880) 2024-12-03 10:14:47 +01:00
napulath
db090b0cad Update package.py (#47885) 2024-12-03 08:24:28 +01:00
Massimiliano Culpo
3a6c361a85 cgns: make fortran dependency optional (#47867) 2024-12-03 06:18:37 +01:00
Adam J. Stewart
bb5bd030d4 py-rasterio: add v1.4.3 (#47881) 2024-12-03 06:10:20 +01:00
dependabot[bot]
b9c60f96ea build(deps): bump pytest from 8.3.3 to 8.3.4 in /lib/spack/docs (#47882)
Bumps [pytest](https://github.com/pytest-dev/pytest) from 8.3.3 to 8.3.4.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/main/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/8.3.3...8.3.4)

---
updated-dependencies:
- dependency-name: pytest
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-12-03 06:07:27 +01:00
Stephen Nicholas Swatman
6b16c64c0e acts dependencies: new versions as of 2024/12/02 (#47787)
* acts dependencies: new versions as of 2024/11/25

This commit adds a new version of detray and two new versions of vecmem.

* acts dependencies: new versions as of 2024/12/02

This commit adds version 38 of ACTS and a new version of detray.
2024-12-02 19:50:25 -06:00
Andrey Perestoronin
3ea970746d add compilers packages (#47877) 2024-12-02 15:53:56 -07:00
Satish Balay
d8f2e080e6 petsc, py-petsc4py: add v3.22.2 (#47845) 2024-12-02 14:21:31 -08:00
Harmen Stoppels
ecb8a48376 libseccomp: python forward compat bound (#47876)
* libseccomp: python forward compat bound

* include 2.5.5

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-12-02 14:59:40 -07:00
Massimiliano Culpo
30176582e4 py-torchvision: add dependency on c (#47873) 2024-12-02 22:23:58 +01:00
Massimiliano Culpo
ac17e8bea4 utf8cpp: move to GitHub, make it a CMake package (#47870) 2024-12-02 14:14:24 -07:00
Massimiliano Culpo
c30c85a99c seacas: add a conditional dependency on fortran (#47871)
* seacas: remove unnecessary dependency on fortran
* seacas: add a conditional dependency on fortran
2024-12-02 13:13:14 -08:00
Michael Schlottke-Lakemper
2ae8eb6686 Update HOHQmesh package with newer versions (#47861) 2024-12-02 12:29:45 -08:00
Jose E. Roman
b5cc5b701c New patch release SLEPc 3.22.2 (#47859) 2024-12-02 12:06:52 -08:00
Wouter Deconinck
8e7641e584 onnx: set CMAKE_CXX_STANDARD to abseil-cpp cxxstd value (#47858) 2024-12-02 11:56:33 -08:00
Weiqun Zhang
e692d401eb amrex: add v24.12 (#47857) 2024-12-02 11:55:08 -08:00
Massimiliano Culpo
99319b1d91 oneapi-level-zero: add dependency on c (#47874) 2024-12-02 12:48:49 -07:00
Satish Balay
839ed9447c trilinos@14.4.0 revert kokkos-kernel dependency - as this breaks builds (#47852) 2024-12-02 11:44:37 -08:00
afzpatel
8e5a040985 ucc: add ROCm and rccl support (#46580) 2024-12-02 20:43:53 +01:00
Stephen Nicholas Swatman
5ddbb1566d benchmark: add version 1.9.1 (#47860)
This commit adds version 1.9.1 of Google Benchmark.
2024-12-02 11:42:38 -08:00
Massimiliano Culpo
eb17680d28 double-conversion: add dependency on c, and c++ (#47869) 2024-12-02 12:38:16 -07:00
Massimiliano Culpo
f4d81be9cf py-torch-nvidia-apex: add dependency on C (#47868) 2024-12-02 20:37:33 +01:00
Massimiliano Culpo
ea5ffe35f5 configuration: set egl as buildable:false (#47865) 2024-12-02 11:33:01 -08:00
Wouter Deconinck
1e37a77e72 mlpack: depends_on py-setuptools (#47828) 2024-12-02 12:04:53 +01:00
Todd Gamblin
29427d3e9e ruff: add v0.8.1 (#47851)
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-11-30 10:49:47 +01:00
Todd Gamblin
2a2d1989c1 version_types: clean up type hierarchy and add annotations (#47781)
In preparation for adding `when=` to `version()`, I'm cleaning up the types in
`version_types` and making sure the methods here pass `mypy` checks. This started as an
attempt to use `ConcreteVersion` outside of `spack.version` and grew into a larger type
refactor.

The hierarchy now looks like this:

* `VersionType`
  * `ConcreteVersion`
    * `StandardVersion`
    * `GitVersion`
  * `ClosedOpenRange`
  * `VersionList`

Note that the top-level thing can't easily be `Version` as that is a method and it
returns only `ConcreteVersion` right now. I *could* do something fancy with `__new__` to
make `Version` a synonym for the `ConcreteVersion` constructor, which would allow it to
be used as a type. I could also do something similar with `VersionRange` but not sure if
it's worth it just to make these into types.

There are still some places where I think `GitVersion` might not be handled properly,
but I have not attempted to fix those here.

- [x] Add a top-level `VersionType` class that all version types extend from
- [x] Define and document common methods and rich comparisons on `VersionType`
- [x] Replace complicated `Union` types with `VersionType` and `ConcreteVersion` as needed
- [x] Annotate most methods (skipping `__getitem__` and friends as the typing is a pain)
- [x] Fix up the `VersionList` constructor a bit
- [x] Add cases to methods that weren't handling all `VersionType`s
- [x] Rework some places to clarify typing for `mypy`
- [x] Simplify / optimize _next_version
- [x] Make StandardVersion.string a property to enable lazy comparison

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-11-30 08:21:07 +01:00
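The hierarchy above corresponds to class definitions roughly like the following (names are from the commit message; bodies elided):

```python
class VersionType:
    """Top-level base: common methods and rich comparisons are defined here."""

class ConcreteVersion(VersionType):
    """A single, fully specified version."""

class StandardVersion(ConcreteVersion):
    """An ordinary dotted version, e.g. 1.2.3."""

class GitVersion(ConcreteVersion):
    """A version tied to a git commit or reference."""

class ClosedOpenRange(VersionType):
    """A [low, high) range of versions."""

class VersionList(VersionType):
    """An ordered collection of versions and ranges."""
```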
Wouter Deconinck
c6e292f55f py-nbdime: add v3.2.1 (#47445) 2024-11-29 15:59:11 -07:00
teddy
bf5e6b4aaf py-mpi4py: create mpi.cfg file; this file was removed in v4.0.0, but the API is retained #47584
Co-authored-by: t. chantrait <teddy.chantrait@cea.fr>
2024-11-29 13:28:21 -06:00
Adam J. Stewart
9760089089 VTK: mark Python version compatibility (#47814)
* VTK: mark Python version compatibility

* VTK 8.2.0 also only supports Python 3.7
2024-11-29 13:04:56 -06:00
dmagdavector
da7c5c551d py-pip: add v23.2.1 -> v24.3.1 (#47753)
* py-pip: update to latest version 24.3.1 (plus some others)

* py-pip: note Python version dependency for new PIP versions
2024-11-29 17:18:19 +01:00
Harmen Stoppels
a575fa8529 gcc: add missing patches from Iain Sandoe's branch (#47843) 2024-11-29 08:10:04 +01:00
Massimiliano Culpo
39a65d88f6 fpm: add a dependency on c, and fortran (#47839)
Extracted from #45189

Build failure: https://gitlab.spack.io/spack/spack/-/jobs/13871774

Co-authored-by: Sebastian Ehlert <28669218+awvwgk@users.noreply.github.com>
2024-11-29 08:07:50 +01:00
Massimiliano Culpo
06ff8c88ac py-torch-sparse: add a dependency on c (#47841)
Extracted from #45189

Build failure: https://gitlab.spack.io/spack/spack/-/jobs/13870876
2024-11-29 08:06:46 +01:00
Massimiliano Culpo
a96b67ce3d miopen-hip: add a dependency on c (#47842)
Extracted from #45189

Build failure: https://gitlab.spack.io/spack/spack/-/jobs/13870957
2024-11-29 07:25:43 +01:00
Harmen Stoppels
67d494fa0b filesystem.py: remove unused md5sum (#47832) 2024-11-28 18:43:21 +01:00
Harmen Stoppels
e37e53cfe8 traverse: add MixedDepthVisitor, use in cmake (#47750)
This visitor accepts the sub-dag of all nodes and unique edges that have
deptype X directly from given roots, or deptype Y transitively for any
of the roots.
2024-11-28 17:48:48 +01:00
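A toy sketch of the traversal idea (not Spack's actual `MixedDepthVisitor`): edges of deptype X are followed only when leaving a root, while edges of deptype Y are followed transitively.

```python
from typing import Dict, List, Set, Tuple

Graph = Dict[str, List[Tuple[str, str]]]  # node -> [(child, deptype)]

def mixed_depth_nodes(graph: Graph, roots: List[str], direct: str, transitive: str) -> Set[str]:
    """Nodes reachable via `direct` edges taken only from the roots, plus
    `transitive` edges followed from any visited node."""
    seen: Set[str] = set(roots)
    stack = list(roots)
    while stack:
        node = stack.pop()
        for child, deptype in graph.get(node, []):
            follow = deptype == transitive or (deptype == direct and node in roots)
            if follow and child not in seen:
                seen.add(child)
                stack.append(child)
    return seen

# Example: build deps only from the root, link deps transitively.
g = {"app": [("cmake", "build"), ("libfoo", "link")],
     "libfoo": [("ninja", "build"), ("libbar", "link")]}
assert mixed_depth_nodes(g, ["app"], "build", "link") == {"app", "cmake", "libfoo", "libbar"}
```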
Andrey Perestoronin
cf31d20d4c add new packages (#47817) 2024-11-28 09:49:52 -05:00
Harmen Stoppels
b74db341c8 darwin: preserve hardlinks on codesign/install_name_tool (#47808) 2024-11-28 14:57:28 +01:00
Daryl W. Grunau
e88a3f6f85 eospac: version 6.5.12 (#47826)
Co-authored-by: Daryl W. Grunau <dwg@lanl.gov>
2024-11-28 12:32:35 +01:00
Massimiliano Culpo
9bd7483e73 Add further C and C++ dependencies to packages (#47821) 2024-11-28 10:50:35 +01:00
Harmen Stoppels
04c76fab63 hip: hints for find_package llvm/clang (#47788)
LLVM can be a transitive link dependency of hip through gl's dependency mesa, which uses it for software rendering.

In this case make sure llvm-amdgpu is found with find_package(LLVM) and
find_package(Clang) by setting LLVM_ROOT and Clang_ROOT.

That makes the patch of find_package's HINTS redundant, so remove that.
It did not work anyway, because CMAKE_PREFIX_PATH has higher precedence
than HINTS.
2024-11-28 10:23:09 +01:00
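In package-recipe terms, the fix amounts to something like the following `cmake_args` fragment (a hedged sketch; the actual recipe change may differ):

```python
# Hypothetical fragment of a CMake-based package recipe.
def cmake_args(self):
    args = []
    # If llvm-amdgpu ended up in the DAG (e.g. via gl -> mesa), point
    # find_package(LLVM) and find_package(Clang) at it explicitly.
    if self.spec.satisfies("^llvm-amdgpu"):
        llvm_prefix = self.spec["llvm-amdgpu"].prefix
        args.append(self.define("LLVM_ROOT", llvm_prefix))
        args.append(self.define("Clang_ROOT", llvm_prefix))
    return args
```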
Adam J. Stewart
ecbf9fcacf py-scooby: add v0.10.0 (#47790) 2024-11-28 10:21:36 +01:00
Victor A. P. Magri
69fb594699 hypre: add a variant to allow using internal lapack functions (#47780) 2024-11-28 10:15:12 +01:00
Howard Pritchard
d28614151f nghttp2: add v1.64.0 (#47800)
Signed-off-by: Howard Pritchard <hppritcha@gmail.com>
2024-11-28 10:12:41 +01:00
etiennemlb
f1d6af6c94 netlib-scalapack: fix for some clang derivative (cce/rocmcc) (#45434) 2024-11-28 09:25:33 +01:00
Adam J. Stewart
192821f361 py-river: mark numpy 2 compatibility (#47813) 2024-11-28 09:24:21 +01:00
Adam J. Stewart
18790ca397 py-pyvista: VTK 9.4 not yet supported (#47815) 2024-11-28 09:23:41 +01:00
BOUDAOUD34
c22d77a38e dbcsr: patch for resolving .mod file conflicts in ROCm by implementing USE, INTRINSIC (#46181)
Co-authored-by: U-PALLAS\boudaoud <boudaoud@pc44.pallas.cines.fr>
2024-11-28 09:20:48 +01:00
Tom Payerle
d82bdb3bf7 seacas: update recipe to find faodel (#40239)
Explicitly sets the CMake variables Faodel_INCLUDE_DIRS and Faodel_LIBRARY_DIRS when +faodel.
This seems to be needed for recent versions of seacas (seacas@2021-04-02:), but should be safe
to do for all versions.

For Faodel_INCLUDE_DIRS, it looks like Faodel has header files under $(Faodel_Prefix)/include/faodel,
but seacas is not including the "faodel" part in #includes.  So add both $(Faodel_Prefix)/include
and $(Faodel_Prefix)/include/faodel

Co-authored-by: payerle <payerle@users.noreply.github.com>
2024-11-28 09:17:44 +01:00
Matt Thompson
a042bdfe0b mapl: add hpcx-mpi (#47793) 2024-11-28 09:15:25 +01:00
Adam J. Stewart
60e3e645e8 py-joblib: add v1.4.2 (#47789) 2024-11-28 08:28:44 +01:00
Chris Marsh
51785437bc Patch to fix building gcc@14.2 on darwin. Fixes #45628 (#47830) 2024-11-27 20:58:18 -07:00
dependabot[bot]
2e8db0815d build(deps): bump docker/build-push-action from 6.9.0 to 6.10.0 (#47819)
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 6.9.0 to 6.10.0.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](4f58ea7922...48aba3b46d)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-11-27 16:29:41 -07:00
George Malerbo
8a6428746f raylib: add v5.5 (#47708)
* Add version 5.5 in package.py

* Update package.py
2024-11-27 16:25:22 -07:00
Adam J. Stewart
6b9c099af8 py-keras: add v3.7.0 (#47816) 2024-11-27 16:12:47 -07:00
Derek Ryan Strong
30814fb4e0 Deprecate rsync releases before v3.2.5 (#47820) 2024-11-27 16:14:34 -06:00
Harmen Stoppels
3194be2e92 gcc-runtime: remove libz.so from libgfortran.so if present (#47812) 2024-11-27 22:32:37 +01:00
snehring
41be2f5899 ltr-retriever: changing directory layout (#38513) 2024-11-27 14:16:57 -07:00
kwryankrattiger
02af41ebb3 gdk-pixbuf: Point at gitlab instead of broken mirror (#47825) 2024-11-27 15:13:55 -06:00
snehring
9d33c89030 r-rsamtools: add -lz to Makevars (#38649) 2024-11-27 13:44:48 -07:00
Erik Heeren
51ab7bad3b julia: conflict for %gcc@12: support (#35931) 2024-11-27 04:31:44 -07:00
kwryankrattiger
0b094f2473 Docs: Reference 7z requirement on Windows (#35943) 2024-11-26 17:11:12 -05:00
Christoph Junghans
cd306d0bc6 all-library: add voronoi support and git version (#47798)
* all-library: add voronoi support and git version

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-11-26 14:56:22 -07:00
Dom Heinzeller
fdb9cf2412 Intel/oneapi compilers: correct version ranges for diab-disable flag (#47428)
* c/c++ flags should have been modified for all 2023.x.y versions, but
  upper bound was too low
* Fortran flags should have been modified for all 2024.x.y versions, but
  likewise the upper bound was too low
2024-11-26 12:34:37 -07:00
etiennemlb
a546441d2e siesta: remove link args on a non-declared dependency (#46080) 2024-11-26 20:25:04 +01:00
IHuismann
141cdb6810 adol-c: fix libs property (#36614) 2024-11-26 17:01:18 +01:00
Brian Van Essen
f2ab74efe5 cray: add further versions of Cray packages. (#37733) 2024-11-26 16:59:53 +01:00
Martin Aumüller
38b838e405 openscenegraph: remove X11 dependencies for macos (#39037) 2024-11-26 16:59:10 +01:00
Mark Abraham
c037188b59 gromacs: announce deprecation policy and start to implement (#47804)
* gromacs: announce deprecation policy and start to implement

* Style it up

* [@spackbot] updating style on behalf of mabraham

* Bump versions used in CI

---------

Co-authored-by: mabraham <mabraham@users.noreply.github.com>
2024-11-26 05:54:07 -07:00
Mark Abraham
0835a3c5f2 gromacs: obtain SYCL from either ACpp or intel-oneapi-runtime (#47806) 2024-11-26 05:51:54 -07:00
Mark Abraham
38a2f9c2f2 gromacs: Improve HeFFTe dependency (#47805)
GROMACS supports HeFFTe with either a SYCL or a CUDA build and requires
a matching HeFFTe build
2024-11-26 05:50:41 -07:00
Massimiliano Culpo
eecd4afe58 gromacs: fix the value used for the ITT directory (#47795) 2024-11-26 08:14:45 +01:00
Seth R. Johnson
83624551e0 ROOT: default to +aqua~x on macOS (#47792) 2024-11-25 14:27:38 -06:00
Victor A. P. Magri
741652caa1 caliper: add "tools" variant (#47779) 2024-11-25 18:26:53 +01:00
Mark Abraham
8e914308f0 gromacs: add itt variant (#47764)
Permit configuring GROMACS with support for mdrun to trace its timing
regions by calling the ITT API. This permits tools like VTune and
unitrace to augment their analysis with GROMACS-specific annotation.
2024-11-25 16:12:55 +01:00
Mikael Simberg
3c220d0989 apex: add 2.7.0 (#47736) 2024-11-25 13:22:16 +01:00
Wouter Deconinck
8094fa1e2f py-gradio: add v5.1.0 (and add/update dependencies) (fix CVEs) (#47504)
* py-pdm-backend: add v2.4.3
* py-starlette: add v0.28.0, v0.32.0, v0.35.1, v0.36.3, v0.37.2, v0.41.2
* py-fastapi: add v0.110.2, v0.115.4
* py-pydantic-extra-types: add v2.10.0
* py-pydantic-settings: add v2.6.1
* py-python-multipart: add v0.0.17
* py-email-validator: add v2.2.0
2024-11-25 13:07:56 +01:00
Massimiliano Culpo
5c67051980 Add missing C/C++ dependencies (#47782) 2024-11-25 12:56:39 +01:00
John W. Parent
c01fb9a6d2 Add CMake 3.31 minor release (#47676) 2024-11-25 04:32:57 -07:00
Harmen Stoppels
bf12bb57e7 install_test: first look at builder, then package (#47735) 2024-11-25 11:53:28 +01:00
Wouter Deconinck
406c73ae11 py-boto*: add v1.34.162 (#47528)
Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-11-25 11:39:09 +01:00
Wouter Deconinck
3f50ccfcdd py-azure-*: updated versions (#47525) 2024-11-25 11:38:49 +01:00
Wouter Deconinck
9883a2144d py-quart: add v0.19.8 (#47508)
Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-11-25 11:38:22 +01:00
Wouter Deconinck
94815d2227 py-netifaces: add v0.10.9, v0.11.0 (#47451) 2024-11-25 11:37:41 +01:00
Wouter Deconinck
a15563f890 py-werkzeug: add v3.1.3 (and deprecate older versions) (#47507)
Co-authored-by: wdconinc <wdconinc@users.noreply.github.com>
2024-11-25 11:28:01 +01:00
Wouter Deconinck
ac2ede8d2f py-pyzmq: add v25.1.2, v26.0.3, v26.1.1, v26.2.0 (switch to py-scikit-build-core) (#44493) 2024-11-25 11:00:00 +01:00
david-edwards-linaro
b256a7c50d linaro-forge: remove v21.1.3 (#47688) 2024-11-25 10:53:27 +01:00
Szabolcs Horvát
21e10d6d98 igraph: add v0.10.15 (#47692) 2024-11-25 10:51:24 +01:00
afzpatel
ed39967848 rocm-tensile: add 6.2.1 (#47702) 2024-11-25 10:40:21 +01:00
Alex Richert
eda0c6888e ip: add cmake version requirement for @5.1: (#47754) 2024-11-25 02:38:08 -07:00
pauleonix
66055f903c cuda: Add v12.6.3 (#47721) 2024-11-25 10:36:11 +01:00
Dave Keeshan
a1c57d86c3 libusb: add v1.0.23:1.0.27 (#47727) 2024-11-25 10:33:08 +01:00
Dave Keeshan
9da8dcae97 verible: add v0.0.3841 (#47729) 2024-11-25 10:30:48 +01:00
jflrichard
c93f223a73 postgis: add version 3.1.2 (#47743) 2024-11-25 10:24:03 +01:00
Wouter Deconinck
f1faf31735 build-containers: determine latest release tag and push that as latest (#47742) 2024-11-25 10:20:58 +01:00
Stephen Herbener
8957ef0df5 Updated version specs for bufr-query package. (#47752) 2024-11-25 10:14:16 +01:00
Veselin Dobrev
347ec87fc5 mfem: add logic for the C++ standard level when using rocPRIM (#47751) 2024-11-25 10:13:22 +01:00
Adam J. Stewart
cd8c46e54e py-ruff: add v0.8.0 (#47758) 2024-11-25 10:02:31 +01:00
Adam J. Stewart
75b03bc12f glib: add v2.82.2 (#47766) 2024-11-24 20:55:18 +01:00
Adam J. Stewart
58511a3352 py-pandas: correct Python version constraint (#47770) 2024-11-24 17:48:16 +01:00
Adam J. Stewart
325873a4c7 py-fsspec: add v2024.10.0 (#47778) 2024-11-24 15:42:30 +01:00
Adam J. Stewart
9156e4be04 awscli-v2: add v2.22.4 (#47765) 2024-11-24 15:42:06 +01:00
Adam J. Stewart
12d3abc736 py-pytz: add v2024.2 (#47777) 2024-11-24 15:40:45 +01:00
Adam J. Stewart
4208aa6291 py-torchvision: add Python 3.13 support (#47776) 2024-11-24 15:40:11 +01:00
Adam J. Stewart
0bad754e23 py-scikit-learn: add Python 3.13 support (#47775) 2024-11-24 15:39:36 +01:00
Adam J. Stewart
cde2620f41 py-safetensors: add v0.4.5 (#47774) 2024-11-24 15:38:05 +01:00
Adam J. Stewart
a35aa038b0 py-pystac: add support for Python 3.12+ (#47772) 2024-11-24 15:37:43 +01:00
Adam J. Stewart
150416919e py-pydantic-core: add v2.27.1 (#47771) 2024-11-24 15:37:06 +01:00
Adam J. Stewart
281c274e0b py-jupyter-packaging: add Python 3.13 support (#47769) 2024-11-24 15:31:31 +01:00
Adam J. Stewart
16e130ece1 py-cryptography: mark Python 3.13 support (#47768) 2024-11-24 15:31:08 +01:00
Adam J. Stewart
7586303fba py-ruamel-yaml-clib: add Python compatibility bounds (#47773) 2024-11-24 15:28:45 +01:00
Harmen Stoppels
6501880fbf py-node-env: add v1.9.1 (#47762) 2024-11-24 15:27:16 +01:00
Harmen Stoppels
c76098038c py-ipykernel: split forward and backward compat bounds (#47763) 2024-11-24 15:26:15 +01:00
Harmen Stoppels
124b616b27 add a few forward compat bounds with python (#47761) 2024-11-24 15:23:11 +01:00
Adam J. Stewart
1148c8f195 gobject-introspection: Python 3.12 still not supported (#47767) 2024-11-24 03:53:32 -07:00
Adam J. Stewart
c57452dd08 py-cffi: support Python 3.12+ (#47713) 2024-11-24 08:41:29 +01:00
Harmen Stoppels
a7e57c9a14 py-opt-einsum: add v3.4.0 (#47759) 2024-11-24 08:40:29 +01:00
Teague Sterling
85d83f9c26 duckdb: add v1.1.3, deprecate <v1.1.0 (#47653)
* duckdb: add v1.0.0, v0.10.3

* Adding issue reference

* Adding issue reference

* duckdb: add v1.1.0

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* Fixing styles

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* duckdb: add v1.1.1

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* duckdb: Fix missing depends_on(unixodbc, when=+odbc)

* Adding duckdb variants, removing old variants, removing deprecated versions

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* duckdb+static_openssl: Add pkgconfig and zlib-api to link zlib when needed

* duckdb: add v1.1.3

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>

* Update package.py for CVE-2024-41672 as suggested

* [@spackbot] updating style on behalf of teaguesterling

* duckdb: add CVE comment before deprecated versions

---------

Signed-off-by: Teague Sterling <teaguesterling@gmail.com>
Co-authored-by: Bernhard Kaindl <bernhardkaindl7@gmail.com>
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-11-23 13:13:40 -07:00
finkandreas
39a081d7fd Kokkos complex_align variant, Trilinos+PETSc enforcement for Kokkos~complex_align (#47686) 2024-11-23 07:45:22 -07:00
Harmen Stoppels
71b65bb424 py-opt-einsum: missing forward compat bound for python (#47757) 2024-11-23 10:48:07 +01:00
Adam J. Stewart
3dcbd118df py-cython: support Python 3.12+ (#47714)
and add various other compat bounds on dependents
2024-11-22 22:20:41 +01:00
Harmen Stoppels
5dacb774f6 itk: use vendored googletest (#47687)
External googletest breaks dependents because they end up with
ITK_LIBRARIES set to `GTest::GTest;GTest::Main`, which then ends up
literally in a nonsensical link line `-lGTest::GtTest`.

The vendored googletest produces a cmake config file where
`ITKGoogleTest_LIBRARIES` is empty.
2024-11-22 18:41:23 +01:00
Harmen Stoppels
cb3d6549c9 traverse.py: ensure topo order is bfs for trees (#47720) 2024-11-22 15:04:19 +01:00
Mark Abraham
559c2f1eb9 gromacs: oneapi does not always require gcc (#47679)
* gromacs: oneapi does not always require gcc

* Support intel_provided_gcc only with %intel classic compiler

Require gcc only when needed with %intel

* New approach depending on gcc-runtime directly

* Update var/spack/repos/builtin/packages/gromacs/package.py

Co-authored-by: Christoph Junghans <christoph.junghans@gmail.com>

---------

Co-authored-by: Christoph Junghans <christoph.junghans@gmail.com>
2024-11-22 06:33:30 -07:00
Harmen Stoppels
ed1dbea77b eigen: self.builder.build_directory -> self.build_directory (#47728) 2024-11-22 07:20:38 +01:00
Seth R. Johnson
6ebafe4631 vecgeom: add v1.2.10 and delete unused, deprecated versions (#47725)
* vecgeom: add v1.2.10

* Remove unused+deprecated versions of vecgeom

* Deprecate older v1.2.x  versions

* [@spackbot] updating style on behalf of sethrj
2024-11-21 17:03:09 -07:00
Harmen Stoppels
7f0bb7147d README.md update old tutorial URL (#47718) 2024-11-21 16:46:46 +01:00
Satish Balay
f41b38e93d xsdk: add v1.1.0 (#47635)
xsdk: exclude pflotran, alquimia, exago

heffte: ~fftw when=+hip

dealii: ~sundials ~opencascade ~vtk ~taskflow
2024-11-21 08:08:27 -06:00
Massimiliano Culpo
5fd12b7bea Add further missing C, C++ dependencies to packages (#47662) 2024-11-21 14:49:12 +01:00
Mikael Simberg
fe746bdebb aws-ofi-nccl: Add 1.8.1 to 1.13.0 (#47717) 2024-11-21 05:37:57 -07:00
eugeneswalker
453af4b9f7 hdf5-vol-cache %cce: add -Wno-error=incompatible-function-pointer-types (#47698) 2024-11-20 14:56:19 -08:00
eugeneswalker
29cf1559cc netlib-scalapack %cce: add cflags -Wno-error=implicit-function-declaration (#47701) 2024-11-20 15:09:14 -07:00
eugeneswalker
a9b3e1670b mpifileutils%cce: append cflags -Wno-error=implicit-function-declaration (#47700) 2024-11-20 14:19:05 -07:00
kwryankrattiger
4f9aa6004b visit: add v3.4.0, v3.4.1 (#47161)
* Visit: Add new versions 3.4.0 and 3.4.1

* Adios2: Restrict python; 3.11 does not work for older Adios2

* VisIt: Set the VTK_VERSION for @3.4:

Older versions of VTK used the VTK_{MAJOR, MINOR}_VERSION variables for
VTK detection. VisIt >= 3.4 uses the full string VTK_VERSION.

* CI: Don't build llvm-amdgpu for non-HIP stack

* VisIt: v3.4.1 handles newer Adios2 correctly

* Visit: Add missing links in HDF5, set correct VTK version configuration parameter

* VisIt: Add py-pip requirement and patch visit with configuration changes

* HDF5 symlinks move when inside of callback

* VisIt ninja install fails with python module. Using make does not

* VisIt 3.4 has a high minimum cmake requirement

* HDF5: Early return when not mpi for mpi symlinks

* HDF5: Use platform agnostic method for creating legacy compatible MPI symlinks

* Fix VISIT_VTK_VERSION handling for 8.2.1a hack
2024-11-20 18:37:56 +01:00
Harmen Stoppels
aa2c18e4df spack style: import-check -> import, fix bugs, exclude spack.pkg (#47690) 2024-11-20 16:15:28 +01:00
dependabot[bot]
0ff3e86315 build(deps): bump codecov/codecov-action from 5.0.2 to 5.0.3 (#47683)
Bumps [codecov/codecov-action](https://github.com/codecov/codecov-action) from 5.0.2 to 5.0.3.
- [Release notes](https://github.com/codecov/codecov-action/releases)
- [Changelog](https://github.com/codecov/codecov-action/blob/main/CHANGELOG.md)
- [Commits](5c47607acb...05f5a9cfad)

---
updated-dependencies:
- dependency-name: codecov/codecov-action
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-11-19 20:40:01 -06:00
dependabot[bot]
df208c1095 build(deps): bump docker/metadata-action from 5.5.1 to 5.6.1 (#47682)
Bumps [docker/metadata-action](https://github.com/docker/metadata-action) from 5.5.1 to 5.6.1.
- [Release notes](https://github.com/docker/metadata-action/releases)
- [Commits](8e5442c4ef...369eb591f4)

---
updated-dependencies:
- dependency-name: docker/metadata-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-11-19 20:39:45 -06:00
Chris Marsh
853f70edc8 cgal: update depends versions for 6.0.1 (#47516)
* This extends PR #47285 to properly include some of the required version constraints of cgal 6, including the C++ standard. It also adds the new no-gmp backend as a variant.

* fix style

* disable cgal@6 +demo variant as the demos require qt6 which is not in spack

* disable the gmp variant until clarity on how it's supposed to work is provided. Bound shared and header_only variants to relevant versions

* Fix missing msvc compiler limit, fix variant left in

* Add more comments. Better describe the gmp variant. Remove testing code

* fix style
2024-11-19 16:43:21 -07:00
Paul R. C. Kent
50970f866e Add llvm v19.1.4 (#47681) 2024-11-19 16:03:28 -07:00
Wouter Deconinck
8821300985 py-gevent: add v24.2.1, v24.10.3, v24.11.1 (#47646)
* py-gevent: add v24.2.1, v24.10.3
* py-gevent: add v24.11.1
2024-11-19 12:14:52 -08:00
AMD Toolchain Support
adc8e1d996 Restrict disabling dynamic thread scaling to version 3.1 only (#47673)
Co-authored-by: vijay kallesh <Vijay-teekinavar.Kallesh@amd.com>
2024-11-19 12:12:21 -08:00
Andrey Perestoronin
1e0aac6ac3 Add new 2025.0.1 Oneapi patch packages (#47678) 2024-11-19 11:38:42 -07:00
Harmen Stoppels
99e2313d81 openturns: fix deps (#47669) 2024-11-19 18:13:47 +01:00
Mark Abraham
22690a7576 Make oneAPI library-with-sdk specialize library class (#47632) 2024-11-19 12:12:10 -05:00
Harmen Stoppels
5325cfe865 systemd: symlink the internal libraries so they are found in rpath (#47667) 2024-11-19 15:28:49 +01:00
Harmen Stoppels
5333925dd7 sensei: fix install rpath for python extension (#47670) 2024-11-19 15:23:54 +01:00
Massimiliano Culpo
2db99e1ff6 gmp: fix cxx dependency, remove dependency on fortran (#47671) 2024-11-19 15:19:08 +01:00
Massimiliano Culpo
68aa712a3e solver: add a timeout handle for users (#47661)
This PR adds a configuration setting to allow setting time limits for concretization.

For backward compatibility, the default is to set no time limit.
2024-11-19 15:00:26 +01:00
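The new keys land in `concretizer.yaml` (visible in the configuration hunk further down in this diff); they can presumably also be set through Spack's Python config API, as in this sketch:

```python
import spack.config

# 0 (the default) means no time limit, per the new concretizer.yaml keys.
spack.config.set("concretizer:timeout", 60)               # give the solver 60 seconds
spack.config.set("concretizer:error_on_timeout", False)   # accept a suboptimal model
```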
Mikael Simberg
2e71bc640c pika: Add 0.30.1 (#47666) 2024-11-19 05:44:41 -07:00
Dom Heinzeller
661f3621a7 netcdf-cxx: add a maintainer (#47665) 2024-11-19 05:28:38 -07:00
Massimiliano Culpo
f182032337 Restore message when concretizing in parallel (#47663)
It was lost in #44843
2024-11-19 12:28:14 +00:00
teddy
066666b7b1 py-non-regression-test-tools: add v1.1.6 & remove v1.1.2 (tag removed) (#47622)
* py-non-regression-test-tools: add v1.1.6  & remove v1.1.2 (tag removed)
* Update var/spack/repos/builtin/packages/py-non-regression-test-tools/package.py

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>

---------

Co-authored-by: t. chantrait <teddy.chantrait@cea.fr>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
2024-11-19 04:38:33 -07:00
615 changed files with 10110 additions and 10490 deletions


@@ -57,7 +57,13 @@ jobs:
- name: Checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
- uses: docker/metadata-action@8e5442c4ef9f78752691e2d8f8d19755c6f78e81
- name: Determine latest release tag
id: latest
run: |
git fetch --quiet --tags
echo "tag=$(git tag --list --sort=-v:refname | grep -E '^v[0-9]+\.[0-9]+\.[0-9]+$' | head -n 1)" | tee -a $GITHUB_OUTPUT
- uses: docker/metadata-action@369eb591f429131d6889c46b94e711f089e6ca96
id: docker_meta
with:
images: |
@@ -71,6 +77,7 @@ jobs:
type=semver,pattern={{major}}
type=ref,event=branch
type=ref,event=pr
type=raw,value=latest,enable=${{ github.ref == format('refs/tags/{0}', steps.latest.outputs.tag) }}
- name: Generate the Dockerfile
env:
@@ -113,7 +120,7 @@ jobs:
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Build & Deploy ${{ matrix.dockerfile[0] }}
uses: docker/build-push-action@4f58ea79222b3b9dc2c8bbdd6debcef730109a75
uses: docker/build-push-action@48aba3b46d1b1fec4febb7c5d0c644b249a11355
with:
context: dockerfiles/${{ matrix.dockerfile[0] }}
platforms: ${{ matrix.dockerfile[1] }}


@@ -29,6 +29,7 @@ jobs:
- run: coverage xml
- name: "Upload coverage report to CodeCov"
uses: codecov/codecov-action@5c47607acb93fed5485fdbf7232e8a31425f672a
uses: codecov/codecov-action@05f5a9cfad807516dbbef9929c4a42df3eb78766
with:
verbose: true
fail_ci_if_error: true


@@ -55,7 +55,7 @@ jobs:
cmake bison libbison-dev kcov
- name: Install Python packages
run: |
pip install --upgrade pip setuptools pytest pytest-xdist pytest-cov
pip install --upgrade pip setuptools pytest pytest-xdist pytest-cov clingo
pip install --upgrade flake8 "isort>=4.3.5" "mypy>=0.900" "click" "black"
- name: Setup git configuration
run: |
@@ -173,7 +173,6 @@ jobs:
spack bootstrap disable github-actions-v0.5
spack bootstrap disable github-actions-v0.6
spack bootstrap status
spack solve zlib
spack unit-test --verbose --cov --cov-config=pyproject.toml --cov-report=xml:coverage.xml lib/spack/spack/test/concretization/core.py
- uses: actions/upload-artifact@b4b15b8c7c6ac21ea08fcf65892d2ee8f75cf882
with:
@@ -211,7 +210,7 @@ jobs:
. share/spack/setup-env.sh
$(which spack) bootstrap disable spack-install
$(which spack) solve zlib
common_args=(--dist loadfile --tx '4*popen//python=./bin/spack-tmpconfig python -u ./bin/spack python' -x)
common_args=(--dist loadfile --tx '4*popen//python=./bin/spack-tmpconfig python -u ./bin/spack python')
$(which spack) unit-test --verbose --cov --cov-config=pyproject.toml --cov-report=xml:coverage.xml "${common_args[@]}"
- uses: actions/upload-artifact@b4b15b8c7c6ac21ea08fcf65892d2ee8f75cf882
with:
@@ -242,7 +241,7 @@ jobs:
env:
COVERAGE_FILE: coverage/.coverage-windows
run: |
spack unit-test -x --verbose --cov --cov-config=pyproject.toml
spack unit-test --verbose --cov --cov-config=pyproject.toml
./share/spack/qa/validate_last_exit.ps1
- uses: actions/upload-artifact@b4b15b8c7c6ac21ea08fcf65892d2ee8f75cf882
with:


@@ -70,7 +70,7 @@ Tutorial
----------------
We maintain a
[**hands-on tutorial**](https://spack.readthedocs.io/en/latest/tutorial.html).
[**hands-on tutorial**](https://spack-tutorial.readthedocs.io/).
It covers basic to advanced usage, packaging, developer features, and large HPC
deployments. You can do all of the exercises on your own laptop using a
Docker container.


@@ -55,3 +55,11 @@ concretizer:
splice:
explicit: []
automatic: false
# Maximum time, in seconds, allowed for the 'solve' phase. If set to 0, there is no time limit.
timeout: 0
# If set to true, exceeding the timeout will always result in a concretization error. If false,
# the best (suboptimal) model computed before the timeout is used.
#
# Setting this to false yields unreproducible results, so we advise to use that value only
# for debugging purposes (e.g. check which constraints can help Spack concretize faster).
error_on_timeout: true


@@ -19,7 +19,7 @@ config:
install_tree:
root: $spack/opt/spack
projections:
all: "{architecture}/{compiler.name}-{compiler.version}/{name}-{version}-{hash}"
all: "{architecture.platform}/{architecture.target}/{name}-{version}-{hash}"
# install_tree can include an optional padded length (int or boolean)
# default is False (do not pad)
# if padded_length is True, Spack will pad as close to the system max path


@@ -15,12 +15,11 @@
# -------------------------------------------------------------------------
packages:
all:
compiler:
- apple-clang
- clang
- gcc
providers:
c: [apple-clang, llvm, gcc]
cxx: [apple-clang, llvm, gcc]
elf: [libelf]
fortran: [gcc]
fuse: [macfuse]
gl: [apple-gl]
glu: [apple-glu]


@@ -15,19 +15,18 @@
# -------------------------------------------------------------------------
packages:
all:
compiler: [gcc, clang, oneapi, xl, nag, fj, aocc]
providers:
awk: [gawk]
armci: [armcimpi]
blas: [openblas, amdblis]
c: [gcc]
cxx: [gcc]
c: [gcc, llvm, intel-oneapi-compilers, xl, aocc]
cxx: [gcc, llvm, intel-oneapi-compilers, xl, aocc]
D: [ldc]
daal: [intel-oneapi-daal]
elf: [elfutils]
fftw-api: [fftw, amdfftw]
flame: [libflame, amdlibflame]
fortran: [gcc]
fortran: [gcc, llvm]
fortran-rt: [gcc-runtime, intel-oneapi-runtime]
fuse: [libfuse]
gl: [glx, osmesa]
@@ -76,6 +75,8 @@ packages:
buildable: false
cray-mvapich2:
buildable: false
egl:
buildable: false
fujitsu-mpi:
buildable: false
hpcx-mpi:


@@ -15,8 +15,8 @@
# -------------------------------------------------------------------------
packages:
all:
compiler:
- msvc
providers:
c : [msvc]
cxx: [msvc]
mpi: [msmpi]
gl: [wgl]


@@ -1326,6 +1326,7 @@ Required:
* Microsoft Visual Studio
* Python
* Git
* 7z
Optional:
* Intel Fortran (needed for some packages)
@@ -1391,6 +1392,13 @@ as the project providing Git support on Windows. This is additionally the recomm
for installing Git on Windows, a link to which can be found above. Spack requires the
utilities vendored by this project.
"""
7zip
"""
A tool for extracting ``.xz`` files is required for extracting source tarballs. The latest 7zip
can be located at https://sourceforge.net/projects/sevenzip/.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Step 2: Install and setup Spack
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^


@@ -6,7 +6,7 @@ python-levenshtein==0.26.1
docutils==0.21.2
pygments==2.18.0
urllib3==2.2.3
pytest==8.3.3
pytest==8.3.4
isort==5.13.2
black==24.10.0
flake8==7.1.1

lib/spack/env/cc (vendored, 26 lines changed)

@@ -40,11 +40,6 @@ readonly params="\
SPACK_ENV_PATH
SPACK_DEBUG_LOG_DIR
SPACK_DEBUG_LOG_ID
SPACK_COMPILER_SPEC
SPACK_CC_RPATH_ARG
SPACK_CXX_RPATH_ARG
SPACK_F77_RPATH_ARG
SPACK_FC_RPATH_ARG
SPACK_LINKER_ARG
SPACK_SHORT_SPEC
SPACK_SYSTEM_DIRS
@@ -223,6 +218,7 @@ for param in $params; do
if eval "test -z \"\${${param}:-}\""; then
die "Spack compiler must be run from Spack! Input '$param' is missing."
fi
# FIXME (compiler as nodes) add checks on whether `SPACK_XX_RPATH` is set if `SPACK_XX` is set
done
# eval this because SPACK_MANAGED_DIRS and SPACK_SYSTEM_DIRS are inputs we don't wanna loop over.
@@ -346,6 +342,9 @@ case "$command" in
;;
ld|ld.gold|ld.lld)
mode=ld
if [ -z "$SPACK_CC_RPATH_ARG" ]; then
comp="CXX"
fi
;;
*)
die "Unknown compiler: $command"
@@ -403,7 +402,7 @@ dtags_to_strip="${SPACK_DTAGS_TO_STRIP}"
linker_arg="${SPACK_LINKER_ARG}"
# Set up rpath variable according to language.
rpath="ERROR: RPATH ARG WAS NOT SET"
rpath="ERROR: RPATH ARG WAS NOT SET, MAYBE THE PACKAGE DOES NOT DEPEND ON ${comp}?"
eval "rpath=\${SPACK_${comp}_RPATH_ARG:?${rpath}}"
# Dump the mode and exit if the command is dump-mode.
@@ -412,13 +411,6 @@ if [ "$SPACK_TEST_COMMAND" = "dump-mode" ]; then
exit
fi
# If, say, SPACK_CC is set but SPACK_FC is not, we want to know. Compilers do not
# *have* to set up Fortran executables, so we need to tell the user when a build is
# about to attempt to use them unsuccessfully.
if [ -z "$command" ]; then
die "Compiler '$SPACK_COMPILER_SPEC' does not have a $language compiler configured."
fi
#
# Filter '.' and Spack environment directories out of PATH so that
# this script doesn't just call itself
@@ -788,15 +780,17 @@ case "$mode" in
C)
extend spack_flags_list SPACK_ALWAYS_CFLAGS
extend spack_flags_list SPACK_CFLAGS
preextend flags_list SPACK_TARGET_ARGS_CC
;;
CXX)
extend spack_flags_list SPACK_ALWAYS_CXXFLAGS
extend spack_flags_list SPACK_CXXFLAGS
preextend flags_list SPACK_TARGET_ARGS_CXX
;;
F)
preextend flags_list SPACK_TARGET_ARGS_FORTRAN
;;
esac
# prepend target args
preextend flags_list SPACK_TARGET_ARGS
;;
esac


@@ -24,6 +24,7 @@
Callable,
Deque,
Dict,
Generator,
Iterable,
List,
Match,
@@ -2772,22 +2773,6 @@ def prefixes(path):
return paths
@system_path_filter
def md5sum(file):
"""Compute the MD5 sum of a file.
Args:
file (str): file to be checksummed
Returns:
MD5 sum of the file's content
"""
md5 = hashlib.md5()
with open(file, "rb") as f:
md5.update(f.read())
return md5.digest()
@system_path_filter
def remove_directory_contents(dir):
"""Remove all contents of a directory."""
@@ -2838,6 +2823,25 @@ def temporary_dir(
remove_directory_contents(tmp_dir)
@contextmanager
def edit_in_place_through_temporary_file(file_path: str) -> Generator[str, None, None]:
"""Context manager for modifying ``file_path`` in place, preserving its inode and hardlinks,
for functions or external tools that do not support in-place editing. Notice that this function
is unsafe in that it works with paths instead of a file descriptors, but this is by design,
since we assume the call site will create a new inode at the same path."""
tmp_fd, tmp_path = tempfile.mkstemp(
dir=os.path.dirname(file_path), prefix=f"{os.path.basename(file_path)}."
)
# windows cannot replace a file with open fds, so close since the call site needs to replace.
os.close(tmp_fd)
try:
shutil.copyfile(file_path, tmp_path, follow_symlinks=True)
yield tmp_path
shutil.copyfile(tmp_path, file_path, follow_symlinks=True)
finally:
os.unlink(tmp_path)
def filesummary(path, print_bytes=16) -> Tuple[int, bytes]:
"""Create a small summary of the given file. Does not error
when file does not exist.
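A usage sketch for the new `edit_in_place_through_temporary_file` context manager, assuming it is importable from `llnl.util.filesystem` (the module this hunk appears to belong to); the path and tool invocation are illustrative:

```python
import subprocess
from llnl.util.filesystem import edit_in_place_through_temporary_file

# Re-sign a Mach-O binary without breaking hardlinks: the external tool edits
# a temporary copy, which is then copied back over the original inode.
with edit_in_place_through_temporary_file("/opt/spack/bin/some-tool") as tmp:
    subprocess.run(["codesign", "-f", "-s", "-", tmp], check=True)
```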


@@ -73,7 +73,7 @@ def index_by(objects, *funcs):
if isinstance(f, str):
f = lambda x: getattr(x, funcs[0])
elif isinstance(f, tuple):
f = lambda x: tuple(getattr(x, p) for p in funcs[0])
f = lambda x: tuple(getattr(x, p, None) for p in funcs[0])
result = {}
for o in objects:
@@ -995,11 +995,8 @@ def _receive_forwarded(self, context: str, exc: Exception, tb: List[str]):
def grouped_message(self, with_tracebacks: bool = True) -> str:
"""Print out an error message coalescing all the forwarded errors."""
each_exception_message = [
"{0} raised {1}: {2}{3}".format(
context,
exc.__class__.__name__,
exc,
"\n{0}".format("".join(tb)) if with_tracebacks else "",
"\n\t{0} raised {1}: {2}\n{3}".format(
context, exc.__class__.__name__, exc, f"\n{''.join(tb)}" if with_tracebacks else ""
)
for context, exc, tb in self.exceptions
]


@@ -55,6 +55,7 @@ def _search_duplicate_compilers(error_cls):
import spack.builder
import spack.config
import spack.deptypes
import spack.fetch_strategy
import spack.patch
import spack.repo
@@ -693,19 +694,19 @@ def invalid_sha256_digest(fetcher):
return h, True
return None, False
error_msg = "Package '{}' does not use sha256 checksum".format(pkg_name)
error_msg = f"Package '{pkg_name}' does not use sha256 checksum"
details = []
for v, args in pkg.versions.items():
fetcher = spack.fetch_strategy.for_package_version(pkg, v)
digest, is_bad = invalid_sha256_digest(fetcher)
if is_bad:
details.append("{}@{} uses {}".format(pkg_name, v, digest))
details.append(f"{pkg_name}@{v} uses {digest}")
for _, resources in pkg.resources.items():
for resource in resources:
digest, is_bad = invalid_sha256_digest(resource.fetcher)
if is_bad:
details.append("Resource in '{}' uses {}".format(pkg_name, digest))
details.append(f"Resource in '{pkg_name}' uses {digest}")
if details:
errors.append(error_cls(error_msg, details))
@@ -1014,7 +1015,14 @@ def _issues_in_depends_on_directive(pkgs, error_cls):
# depends_on('foo+bar ^fee+baz')
#
# but we'd like to have two dependencies listed instead.
nested_dependencies = dep.spec.dependencies()
nested_dependencies = dep.spec.edges_to_dependencies()
# Filter out pure build dependencies, like:
#
# depends_on('foo+bar%gcc')
#
nested_dependencies = [
x for x in nested_dependencies if x.depflag != spack.deptypes.BUILD
]
if nested_dependencies:
summary = f"{pkg_name}: nested dependency declaration '{dep.spec}'"
ndir = len(nested_dependencies) + 1
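To illustrate the guidance in the comments above, the preferred form declares each dependency on its own (package names are hypothetical):

# Instead of nesting one dependency inside another's spec:
#     depends_on("foo+bar ^fee+baz")
# declare the two dependencies separately:
depends_on("foo+bar")
depends_on("fee+baz")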


@@ -40,7 +40,7 @@
import spack.hash_types as ht
import spack.hooks
import spack.hooks.sbang
import spack.mirror
import spack.mirrors.mirror
import spack.oci.image
import spack.oci.oci
import spack.oci.opener
@@ -369,7 +369,7 @@ def update(self, with_cooldown=False):
on disk under ``_index_cache_root``)."""
self._init_local_index_cache()
configured_mirror_urls = [
m.fetch_url for m in spack.mirror.MirrorCollection(binary=True).values()
m.fetch_url for m in spack.mirrors.mirror.MirrorCollection(binary=True).values()
]
items_to_remove = []
spec_cache_clear_needed = False
@@ -765,7 +765,14 @@ def tarball_directory_name(spec):
Return name of the tarball directory according to the convention
<os>-<architecture>/<compiler>/<package>-<version>/
"""
return spec.format_path("{architecture}/{compiler.name}-{compiler.version}/{name}-{version}")
if spec.original_spec_format() < 5:
compiler = spec.annotations.compiler_node_attribute
assert compiler is not None, "a compiler spec is expected"
return spec.format_path(
f"{spec.architecture}/{compiler.name}-{compiler.version}/{spec.name}-{spec.version}"
)
return spec.format_path(f"{spec.architecture.platform}/{spec.name}-{spec.version}")
def tarball_name(spec, ext):
@@ -773,9 +780,17 @@ def tarball_name(spec, ext):
Return the name of the tarfile according to the convention
<os>-<architecture>-<package>-<dag_hash><ext>
"""
spec_formatted = spec.format_path(
"{architecture}-{compiler.name}-{compiler.version}-{name}-{version}-{hash}"
)
if spec.original_spec_format() < 5:
compiler = spec.annotations.compiler_node_attribute
assert compiler is not None, "a compiler spec is expected"
spec_formatted = (
f"{spec.architecture}-{compiler.name}-{compiler.version}-{spec.name}"
f"-{spec.version}-{spec.dag_hash()}"
)
else:
spec_formatted = (
f"{spec.architecture.platform}-{spec.name}-{spec.version}-{spec.dag_hash()}"
)
return f"{spec_formatted}{ext}"
@@ -1176,7 +1191,7 @@ def _url_upload_tarball_and_specfile(
class Uploader:
def __init__(self, mirror: spack.mirror.Mirror, force: bool, update_index: bool):
def __init__(self, mirror: spack.mirrors.mirror.Mirror, force: bool, update_index: bool):
self.mirror = mirror
self.force = force
self.update_index = update_index
@@ -1224,7 +1239,7 @@ def tag(self, tag: str, roots: List[spack.spec.Spec]):
class OCIUploader(Uploader):
def __init__(
self,
mirror: spack.mirror.Mirror,
mirror: spack.mirrors.mirror.Mirror,
force: bool,
update_index: bool,
base_image: Optional[str],
@@ -1273,7 +1288,7 @@ def tag(self, tag: str, roots: List[spack.spec.Spec]):
class URLUploader(Uploader):
def __init__(
self,
mirror: spack.mirror.Mirror,
mirror: spack.mirrors.mirror.Mirror,
force: bool,
update_index: bool,
signing_key: Optional[str],
@@ -1297,7 +1312,7 @@ def push(
def make_uploader(
mirror: spack.mirror.Mirror,
mirror: spack.mirrors.mirror.Mirror,
force: bool = False,
update_index: bool = False,
signing_key: Optional[str] = None,
@@ -1953,9 +1968,9 @@ def download_tarball(spec, unsigned: Optional[bool] = False, mirrors_for_spec=No
"signature_verified": "true-if-binary-pkg-was-already-verified"
}
"""
configured_mirrors: Iterable[spack.mirror.Mirror] = spack.mirror.MirrorCollection(
binary=True
).values()
configured_mirrors: Iterable[spack.mirrors.mirror.Mirror] = (
spack.mirrors.mirror.MirrorCollection(binary=True).values()
)
if not configured_mirrors:
tty.die("Please add a spack mirror to allow download of pre-compiled packages.")
@@ -1980,7 +1995,7 @@ def fetch_url_to_mirror(url):
for mirror in configured_mirrors:
if mirror.fetch_url == url:
return mirror
return spack.mirror.Mirror(url)
return spack.mirrors.mirror.Mirror(url)
mirrors = [fetch_url_to_mirror(url) for url in mirror_urls]
@@ -2334,7 +2349,9 @@ def is_backup_file(file):
if not codesign:
return
for binary in changed_files:
codesign("-fs-", binary)
# preserve the original inode by running codesign on a copy
with fsys.edit_in_place_through_temporary_file(binary) as tmp_binary:
codesign("-fs-", tmp_binary)
# If we are installing back to the same location
# relocate the sbang location if the spack directory changed
@@ -2648,7 +2665,7 @@ def try_direct_fetch(spec, mirrors=None):
specfile_is_signed = False
found_specs = []
binary_mirrors = spack.mirror.MirrorCollection(mirrors=mirrors, binary=True).values()
binary_mirrors = spack.mirrors.mirror.MirrorCollection(mirrors=mirrors, binary=True).values()
for mirror in binary_mirrors:
buildcache_fetch_url_json = url_util.join(
@@ -2709,7 +2726,7 @@ def get_mirrors_for_spec(spec=None, mirrors_to_check=None, index_only=False):
if spec is None:
return []
if not spack.mirror.MirrorCollection(mirrors=mirrors_to_check, binary=True):
if not spack.mirrors.mirror.MirrorCollection(mirrors=mirrors_to_check, binary=True):
tty.debug("No Spack mirrors are currently configured")
return {}
@@ -2748,7 +2765,7 @@ def clear_spec_cache():
def get_keys(install=False, trust=False, force=False, mirrors=None):
"""Get pgp public keys available on mirror with suffix .pub"""
mirror_collection = mirrors or spack.mirror.MirrorCollection(binary=True)
mirror_collection = mirrors or spack.mirrors.mirror.MirrorCollection(binary=True)
if not mirror_collection:
tty.die("Please add a spack mirror to allow " + "download of build caches.")
@@ -2803,7 +2820,7 @@ def get_keys(install=False, trust=False, force=False, mirrors=None):
def _url_push_keys(
*mirrors: Union[spack.mirror.Mirror, str],
*mirrors: Union[spack.mirrors.mirror.Mirror, str],
keys: List[str],
tmpdir: str,
update_index: bool = False,
@@ -2870,7 +2887,7 @@ def check_specs_against_mirrors(mirrors, specs, output_file=None):
"""
rebuilds = {}
for mirror in spack.mirror.MirrorCollection(mirrors, binary=True).values():
for mirror in spack.mirrors.mirror.MirrorCollection(mirrors, binary=True).values():
tty.debug("Checking for built specs at {0}".format(mirror.fetch_url))
rebuild_list = []
@@ -2914,7 +2931,7 @@ def _download_buildcache_entry(mirror_root, descriptions):
def download_buildcache_entry(file_descriptions, mirror_url=None):
if not mirror_url and not spack.mirror.MirrorCollection(binary=True):
if not mirror_url and not spack.mirrors.mirror.MirrorCollection(binary=True):
tty.die(
"Please provide or add a spack mirror to allow " + "download of buildcache entries."
)
@@ -2923,7 +2940,7 @@ def download_buildcache_entry(file_descriptions, mirror_url=None):
mirror_root = os.path.join(mirror_url, BUILD_CACHE_RELATIVE_PATH)
return _download_buildcache_entry(mirror_root, file_descriptions)
for mirror in spack.mirror.MirrorCollection(binary=True).values():
for mirror in spack.mirrors.mirror.MirrorCollection(binary=True).values():
mirror_root = os.path.join(mirror.fetch_url, BUILD_CACHE_RELATIVE_PATH)
if _download_buildcache_entry(mirror_root, file_descriptions):


@@ -227,12 +227,13 @@ def _root_spec(spec_str: str) -> str:
# Add a compiler and platform requirement to the root spec.
platform = str(spack.platforms.host())
if platform == "darwin":
spec_str += " %apple-clang"
elif platform == "windows":
# FIXME (compiler as nodes): recover the compiler for source bootstrapping
# if platform == "darwin":
# spec_str += " %apple-clang"
if platform == "windows":
spec_str += " %msvc"
elif platform == "linux":
spec_str += " %gcc"
# elif platform == "linux":
# spec_str += " %gcc"
elif platform == "freebsd":
spec_str += " %clang"
spec_str += f" platform={platform}"


@@ -16,8 +16,9 @@
import archspec.cpu
import spack.compiler
import spack.compilers
import spack.compilers.config
import spack.compilers.libraries
import spack.config
import spack.platforms
import spack.spec
import spack.traverse
@@ -39,7 +40,7 @@ def __init__(self, configuration):
self.external_cmake, self.external_bison = self._externals_from_yaml(configuration)
def _valid_compiler_or_raise(self) -> "spack.compiler.Compiler":
def _valid_compiler_or_raise(self):
if str(self.host_platform) == "linux":
compiler_name = "gcc"
elif str(self.host_platform) == "darwin":
@@ -47,17 +48,30 @@ def _valid_compiler_or_raise(self) -> "spack.compiler.Compiler":
elif str(self.host_platform) == "windows":
compiler_name = "msvc"
elif str(self.host_platform) == "freebsd":
compiler_name = "clang"
compiler_name = "llvm"
else:
raise RuntimeError(f"Cannot bootstrap clingo from sources on {self.host_platform}")
candidates = spack.compilers.compilers_for_spec(
compiler_name, arch_spec=self.host_architecture
)
candidates = [
x
for x in spack.compilers.config.CompilerFactory.from_packages_yaml(spack.config.CONFIG)
if x.name == compiler_name
]
if not candidates:
raise RuntimeError(
f"Cannot find any version of {compiler_name} to bootstrap clingo from sources"
)
candidates.sort(key=lambda x: x.spec.version, reverse=True)
candidates.sort(key=lambda x: x.version, reverse=True)
best = candidates[0]
# Get compilers for bootstrapping from the 'builtin' repository
best.namespace = "builtin"
# If the compiler does not support C++ 14, fail with a legible error message
try:
_ = best.package.standard_flag(language="cxx", standard="14")
except RuntimeError as e:
raise RuntimeError(
"cannot find a compiler supporting C++ 14 [needed to bootstrap clingo]"
) from e
return candidates[0]
def _externals_from_yaml(
@@ -76,9 +90,6 @@ def _externals_from_yaml(
if not s.satisfies(requirements[pkg_name]):
continue
if not s.intersects(f"%{self.host_compiler.spec}"):
continue
if not s.intersects(f"arch={self.host_architecture}"):
continue
@@ -111,11 +122,10 @@ def concretize(self) -> "spack.spec.Spec":
# Tweak it to conform to the host architecture
for node in s.traverse():
node.architecture.os = str(self.host_os)
node.compiler = self.host_compiler.spec
node.architecture = self.host_architecture
if node.name == "gcc-runtime":
node.versions = self.host_compiler.spec.versions
node.versions = self.host_compiler.versions
for edge in spack.traverse.traverse_edges([s], cover="edges"):
if edge.spec.name == "python":
@@ -127,6 +137,9 @@ def concretize(self) -> "spack.spec.Spec":
if edge.spec.name == "cmake" and self.external_cmake:
edge.spec = self.external_cmake
if edge.spec.name == self.host_compiler.name:
edge.spec = self.host_compiler
if "libc" in edge.virtuals:
edge.spec = self.host_libc
@@ -142,12 +155,12 @@ def python_external_spec(self) -> "spack.spec.Spec":
return self._external_spec(result)
def libc_external_spec(self) -> "spack.spec.Spec":
result = self.host_compiler.default_libc
detector = spack.compilers.libraries.CompilerPropertyDetector(self.host_compiler)
result = detector.default_libc()
return self._external_spec(result)
def _external_spec(self, initial_spec) -> "spack.spec.Spec":
initial_spec.namespace = "builtin"
initial_spec.compiler = self.host_compiler.spec
initial_spec.architecture = self.host_architecture
for flag_type in spack.spec.FlagMap.valid_compiler_flags():
initial_spec.compiler_flags[flag_type] = []


@@ -11,7 +11,7 @@
from llnl.util import tty
import spack.compilers
import spack.compilers.config
import spack.config
import spack.environment
import spack.modules
@@ -143,8 +143,8 @@ def _bootstrap_config_scopes() -> Sequence["spack.config.ConfigScope"]:
def _add_compilers_if_missing() -> None:
arch = spack.spec.ArchSpec.frontend_arch()
if not spack.compilers.compilers_for_arch(arch):
spack.compilers.find_compilers()
if not spack.compilers.config.compilers_for_arch(arch):
spack.compilers.config.find_compilers()
@contextlib.contextmanager


@@ -37,7 +37,7 @@
import spack.binary_distribution
import spack.config
import spack.detection
import spack.mirror
import spack.mirrors.mirror
import spack.platforms
import spack.spec
import spack.store
@@ -91,7 +91,7 @@ def __init__(self, conf: ConfigDictionary) -> None:
self.metadata_dir = spack.util.path.canonicalize_path(conf["metadata"])
# Promote (relative) paths to file urls
self.url = spack.mirror.Mirror(conf["info"]["url"]).fetch_url
self.url = spack.mirrors.mirror.Mirror(conf["info"]["url"]).fetch_url
@property
def mirror_scope(self) -> spack.config.InternalConfigScope:
@@ -281,7 +281,12 @@ def try_import(self, module: str, abstract_spec_str: str) -> bool:
# Install the spec that should make the module importable
with spack.config.override(self.mirror_scope):
PackageInstaller([concrete_spec.package], fail_fast=True).install()
PackageInstaller(
[concrete_spec.package],
fail_fast=True,
package_use_cache=False,
dependencies_use_cache=False,
).install()
if _try_import_from_store(module, query_spec=concrete_spec, query_info=info):
self.last_search = info
@@ -317,11 +322,10 @@ def create_bootstrapper(conf: ConfigDictionary):
return _bootstrap_methods[btype](conf)
def source_is_enabled_or_raise(conf: ConfigDictionary):
def source_is_enabled(conf: ConfigDictionary):
"""Raise ValueError if the source is not enabled for bootstrapping"""
trusted, name = spack.config.get("bootstrap:trusted"), conf["name"]
if not trusted.get(name, False):
raise ValueError("source is not trusted")
return trusted.get(name, False)
def ensure_module_importable_or_raise(module: str, abstract_spec: Optional[str] = None):
@@ -351,8 +355,10 @@ def ensure_module_importable_or_raise(module: str, abstract_spec: Optional[str]
exception_handler = GroupedExceptionHandler()
for current_config in bootstrapping_sources():
if not source_is_enabled(current_config):
continue
with exception_handler.forward(current_config["name"], Exception):
source_is_enabled_or_raise(current_config)
current_bootstrapper = create_bootstrapper(current_config)
if current_bootstrapper.try_import(module, abstract_spec):
return
@@ -364,11 +370,7 @@ def ensure_module_importable_or_raise(module: str, abstract_spec: Optional[str]
msg = f'cannot bootstrap the "{module}" Python module '
if abstract_spec:
msg += f'from spec "{abstract_spec}" '
if tty.is_debug():
msg += exception_handler.grouped_message(with_tracebacks=True)
else:
msg += exception_handler.grouped_message(with_tracebacks=False)
msg += "\nRun `spack --debug ...` for more detailed errors"
msg += exception_handler.grouped_message(with_tracebacks=tty.is_debug())
raise ImportError(msg)
@@ -406,8 +408,9 @@ def ensure_executables_in_path_or_raise(
exception_handler = GroupedExceptionHandler()
for current_config in bootstrapping_sources():
if not source_is_enabled(current_config):
continue
with exception_handler.forward(current_config["name"], Exception):
source_is_enabled_or_raise(current_config)
current_bootstrapper = create_bootstrapper(current_config)
if current_bootstrapper.try_search_path(executables, abstract_spec):
# Additional environment variables needed

7 file diffs suppressed because one or more lines are too long


@@ -37,7 +37,6 @@
import multiprocessing
import os
import re
import stat
import sys
import traceback
import types
@@ -60,7 +59,7 @@
import spack.build_systems.meson
import spack.build_systems.python
import spack.builder
import spack.compilers
import spack.compilers.libraries
import spack.config
import spack.deptypes as dt
import spack.error
@@ -74,7 +73,6 @@
import spack.store
import spack.subprocess_context
import spack.util.executable
import spack.util.libc
from spack import traverse
from spack.context import Context
from spack.error import InstallError, NoHeadersError, NoLibrariesError
@@ -82,6 +80,7 @@
from spack.util.environment import (
SYSTEM_DIR_CASE_ENTRY,
EnvironmentModifications,
PrependPath,
env_flag,
filter_system_paths,
get_path,
@@ -297,62 +296,10 @@ def _add_werror_handling(keep_werror, env):
env.set("SPACK_COMPILER_FLAGS_REPLACE", " ".join(["|".join(item) for item in replace_flags]))
def set_compiler_environment_variables(pkg, env):
def set_wrapper_environment_variables_for_flags(pkg, env):
assert pkg.spec.concrete
compiler = pkg.compiler
spec = pkg.spec
# Make sure the executables for this compiler exist
compiler.verify_executables()
# Set compiler variables used by CMake and autotools
assert all(key in compiler.link_paths for key in ("cc", "cxx", "f77", "fc"))
# Populate an object with the list of environment modifications
# and return it
# TODO : add additional kwargs for better diagnostics, like requestor,
# ttyout, ttyerr, etc.
link_dir = spack.paths.build_env_path
# Set SPACK compiler variables so that our wrapper knows what to
# call. If there is no compiler configured then use a default
# wrapper which will emit an error if it is used.
if compiler.cc:
env.set("SPACK_CC", compiler.cc)
env.set("CC", os.path.join(link_dir, compiler.link_paths["cc"]))
else:
env.set("CC", os.path.join(link_dir, "cc"))
if compiler.cxx:
env.set("SPACK_CXX", compiler.cxx)
env.set("CXX", os.path.join(link_dir, compiler.link_paths["cxx"]))
else:
env.set("CC", os.path.join(link_dir, "c++"))
if compiler.f77:
env.set("SPACK_F77", compiler.f77)
env.set("F77", os.path.join(link_dir, compiler.link_paths["f77"]))
else:
env.set("F77", os.path.join(link_dir, "f77"))
if compiler.fc:
env.set("SPACK_FC", compiler.fc)
env.set("FC", os.path.join(link_dir, compiler.link_paths["fc"]))
else:
env.set("FC", os.path.join(link_dir, "fc"))
# Set SPACK compiler rpath flags so that our wrapper knows what to use
env.set("SPACK_CC_RPATH_ARG", compiler.cc_rpath_arg)
env.set("SPACK_CXX_RPATH_ARG", compiler.cxx_rpath_arg)
env.set("SPACK_F77_RPATH_ARG", compiler.f77_rpath_arg)
env.set("SPACK_FC_RPATH_ARG", compiler.fc_rpath_arg)
env.set("SPACK_LINKER_ARG", compiler.linker_arg)
# Check whether we want to force RPATH or RUNPATH
if spack.config.get("config:shared_linking:type") == "rpath":
env.set("SPACK_DTAGS_TO_STRIP", compiler.enable_new_dtags)
env.set("SPACK_DTAGS_TO_ADD", compiler.disable_new_dtags)
else:
env.set("SPACK_DTAGS_TO_STRIP", compiler.disable_new_dtags)
env.set("SPACK_DTAGS_TO_ADD", compiler.enable_new_dtags)
if pkg.keep_werror is not None:
keep_werror = pkg.keep_werror
else:
@@ -360,10 +307,6 @@ def set_compiler_environment_variables(pkg, env):
_add_werror_handling(keep_werror, env)
# Set the target parameters that the compiler will add
isa_arg = optimization_flags(compiler, spec.target)
env.set("SPACK_TARGET_ARGS", isa_arg)
# Trap spack-tracked compiler flags as appropriate.
# env_flags are easy to accidentally override.
inject_flags = {}
@@ -397,74 +340,27 @@ def set_compiler_environment_variables(pkg, env):
env.set(flag.upper(), " ".join(f for f in env_flags[flag]))
pkg.flags_to_build_system_args(build_system_flags)
env.set("SPACK_COMPILER_SPEC", str(spec.compiler))
env.set("SPACK_SYSTEM_DIRS", SYSTEM_DIR_CASE_ENTRY)
compiler.setup_custom_environment(pkg, env)
# FIXME (compiler as nodes): recover this one in the correct packages
# compiler.setup_custom_environment(pkg, env)
return env
def optimization_flags(compiler, target):
if spack.compilers.is_mixed_toolchain(compiler):
msg = (
"microarchitecture specific optimizations are not "
"supported yet on mixed compiler toolchains [check"
f" {compiler.name}@{compiler.version} for further details]"
)
tty.debug(msg)
return ""
# Try to check if the current compiler comes with a version number or
# has an unexpected suffix. If so, treat it as a compiler with a
# custom spec.
compiler_version = compiler.version
version_number, suffix = archspec.cpu.version_components(compiler.version)
if not version_number or suffix:
try:
compiler_version = compiler.real_version
except spack.util.executable.ProcessError as e:
# log this and just return compiler.version instead
tty.debug(str(e))
version_number, _ = archspec.cpu.version_components(compiler.version.dotted_numeric_string)
try:
result = target.optimization_flags(compiler.name, compiler_version.dotted_numeric_string)
result = target.optimization_flags(compiler.name, version_number)
except (ValueError, archspec.cpu.UnsupportedMicroarchitecture):
result = ""
return result
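A hedged usage sketch of the simplified helper above, assuming compiler is a compiler node with name and version and the target is an archspec target:

# Hypothetical usage: compute microarchitecture-specific flags.
flags = optimization_flags(compiler, spec.target)
# e.g. "-march=znver3 -mtune=znver3" for gcc on an AMD Zen 3 target;
# an empty string if archspec cannot map the (compiler, target) pair.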
class FilterDefaultDynamicLinkerSearchPaths:
"""Remove rpaths to directories that are default search paths of the dynamic linker."""
def __init__(self, dynamic_linker: Optional[str]) -> None:
# Identify directories by (inode, device) tuple, which handles symlinks too.
self.default_path_identifiers: Set[Tuple[int, int]] = set()
if not dynamic_linker:
return
for path in spack.util.libc.default_search_paths_from_dynamic_linker(dynamic_linker):
try:
s = os.stat(path)
if stat.S_ISDIR(s.st_mode):
self.default_path_identifiers.add((s.st_ino, s.st_dev))
except OSError:
continue
def is_dynamic_loader_default_path(self, p: str) -> bool:
try:
s = os.stat(p)
return (s.st_ino, s.st_dev) in self.default_path_identifiers
except OSError:
return False
def __call__(self, dirs: List[str]) -> List[str]:
if not self.default_path_identifiers:
return dirs
return [p for p in dirs if not self.is_dynamic_loader_default_path(p)]
def set_wrapper_variables(pkg, env):
"""Set environment variables used by the Spack compiler wrapper (which have the prefix
`SPACK_`) and also add the compiler wrappers to PATH.
@@ -473,39 +369,8 @@ def set_wrapper_variables(pkg, env):
this function computes these options in a manner that is intended to match the DAG traversal
order in `SetupContext`. TODO: this is not the case yet, we're using post order, SetupContext
is using topo order."""
# Set environment variables if specified for
# the given compiler
compiler = pkg.compiler
env.extend(spack.schema.environment.parse(compiler.environment))
if compiler.extra_rpaths:
extra_rpaths = ":".join(compiler.extra_rpaths)
env.set("SPACK_COMPILER_EXTRA_RPATHS", extra_rpaths)
# Add spack build environment path with compiler wrappers first in
# the path. We add the compiler wrapper path, which includes default
# wrappers (cc, c++, f77, f90), AND a subdirectory containing
# compiler-specific symlinks. The latter ensures that builds that
# are sensitive to the *name* of the compiler see the right name when
# we're building with the wrappers.
#
# Conflicts on case-insensitive systems (like "CC" and "cc") are
# handled by putting one in the <build_env_path>/case-insensitive
# directory. Add that to the path too.
env_paths = []
compiler_specific = os.path.join(
spack.paths.build_env_path, os.path.dirname(pkg.compiler.link_paths["cc"])
)
for item in [spack.paths.build_env_path, compiler_specific]:
env_paths.append(item)
ci = os.path.join(item, "case-insensitive")
if os.path.isdir(ci):
env_paths.append(ci)
tty.debug("Adding compiler bin/ paths: " + " ".join(env_paths))
for item in env_paths:
env.prepend_path("PATH", item)
env.set_path(SPACK_ENV_PATH, env_paths)
# Set compiler flags injected from the spec
set_wrapper_environment_variables_for_flags(pkg, env)
# Working directory for the spack command itself, for debug logs.
if spack.config.get("config:debug"):
@@ -571,22 +436,15 @@ def set_wrapper_variables(pkg, env):
lib_path = os.path.join(pkg.prefix, libdir)
rpath_dirs.insert(0, lib_path)
filter_default_dynamic_linker_search_paths = FilterDefaultDynamicLinkerSearchPaths(
pkg.compiler.default_dynamic_linker
)
# TODO: filter_system_paths is again wrong (and probably unnecessary due to the is_system_path
# branch above). link_dirs should be filtered with entries from _parse_link_paths.
link_dirs = list(dedupe(filter_system_paths(link_dirs)))
include_dirs = list(dedupe(filter_system_paths(include_dirs)))
rpath_dirs = list(dedupe(filter_system_paths(rpath_dirs)))
rpath_dirs = filter_default_dynamic_linker_search_paths(rpath_dirs)
# TODO: implicit_rpaths is prefiltered by is_system_path, that should be removed in favor of
# just this filter.
implicit_rpaths = filter_default_dynamic_linker_search_paths(pkg.compiler.implicit_rpaths())
if implicit_rpaths:
env.set("SPACK_COMPILER_IMPLICIT_RPATHS", ":".join(implicit_rpaths))
default_dynamic_linker_filter = spack.compilers.libraries.dynamic_linker_filter_for(pkg.spec)
if default_dynamic_linker_filter:
rpath_dirs = default_dynamic_linker_filter(rpath_dirs)
# Spack managed directories include the stage, store and upstream stores. We extend this with
# their real paths to make it more robust (e.g. /tmp vs /private/tmp on macOS).
@@ -642,22 +500,19 @@ def set_package_py_globals(pkg, context: Context = Context.BUILD):
# Put spack compiler paths in module scope. (Some packages use it
# in setup_run_environment etc., so don't restrict it to context == build)
link_dir = spack.paths.build_env_path
pkg_compiler = None
try:
pkg_compiler = pkg.compiler
except spack.compilers.NoCompilerForSpecError as e:
tty.debug(f"cannot set 'spack_cc': {str(e)}")
if pkg_compiler is not None:
module.spack_cc = os.path.join(link_dir, pkg_compiler.link_paths["cc"])
module.spack_cxx = os.path.join(link_dir, pkg_compiler.link_paths["cxx"])
module.spack_f77 = os.path.join(link_dir, pkg_compiler.link_paths["f77"])
module.spack_fc = os.path.join(link_dir, pkg_compiler.link_paths["fc"])
else:
module.spack_cc = None
module.spack_cxx = None
module.spack_f77 = None
module.spack_fc = None
# FIXME (compiler as nodes): make this more general, and not tied to three languages
# Maybe add a callback?
global_names = {
"c": ("spack_cc",),
"cxx": ("spack_cxx",),
"fortran": ("spack_fc", "spack_f77"),
}
for language in ("c", "cxx", "fortran"):
spec = pkg.spec.dependencies(virtuals=[language])
value = None if not spec else os.path.join(link_dir, spec[0].package.link_paths[language])
for name in global_names[language]:
setattr(module, name, value)
# Useful directories within the prefix are encapsulated in
# a Prefix object.
@@ -824,7 +679,6 @@ def setup_package(pkg, dirty, context: Context = Context.BUILD):
context == Context.TEST and pkg.test_requires_compiler
)
if need_compiler:
set_compiler_environment_variables(pkg, env_mods)
set_wrapper_variables(pkg, env_mods)
# Platform specific setup goes before package specific setup. This is for setting
@@ -836,6 +690,11 @@ def setup_package(pkg, dirty, context: Context = Context.BUILD):
env_mods.extend(setup_context.get_env_modifications())
tty.debug("setup_package: collected all modifications from dependencies")
tty.debug("setup_package: adding compiler wrappers paths")
for x in env_mods.group_by_name()["SPACK_ENV_PATH"]:
assert isinstance(x, PrependPath), "unexpected setting used for SPACK_ENV_PATH"
env_mods.prepend_path("PATH", x.value)
if context == Context.TEST:
env_mods.prepend_path("PATH", ".")
elif context == Context.BUILD and not dirty and not env_mods.is_unset("CPATH"):
@@ -849,11 +708,6 @@ def setup_package(pkg, dirty, context: Context = Context.BUILD):
# Load modules on an already clean environment, just before applying Spack's
# own environment modifications. This ensures Spack controls CC/CXX/... variables.
if need_compiler:
tty.debug("setup_package: loading compiler modules")
for mod in pkg.compiler.modules:
load_module(mod)
load_external_modules(pkg)
# Make sure nothing's strange about the Spack environment.
@@ -882,6 +736,9 @@ def __init__(self, *roots: spack.spec.Spec, context: Context):
elif context == Context.RUN:
self.root_depflag = dt.RUN | dt.LINK
def accept(self, item):
return True
def neighbors(self, item):
spec = item.edge.spec
if spec.dag_hash() in self.root_hashes:
@@ -919,19 +776,19 @@ def effective_deptypes(
a flag specifying in what way they do so. The list is ordered topologically
from root to leaf, meaning that environment modifications should be applied
in reverse so that dependents override dependencies, not the other way around."""
visitor = traverse.TopoVisitor(
EnvironmentVisitor(*specs, context=context),
key=lambda x: x.dag_hash(),
topo_sorted_edges = traverse.traverse_topo_edges_generator(
traverse.with_artificial_edges(specs),
visitor=EnvironmentVisitor(*specs, context=context),
key=traverse.by_dag_hash,
root=True,
all_edges=True,
)
traverse.traverse_depth_first_with_visitor(traverse.with_artificial_edges(specs), visitor)
# Dictionary with "no mode" as default value, so it's easy to write modes[x] |= flag.
use_modes = defaultdict(lambda: UseMode(0))
nodes_with_type = []
for edge in visitor.edges:
for edge in topo_sorted_edges:
parent, child, depflag = edge.parent, edge.spec, edge.depflag
# Mark the starting point
@@ -1423,27 +1280,20 @@ def make_stack(tb, stack=None):
# We found obj, the Package implementation we care about.
# Point out the location in the install method where we failed.
filename = inspect.getfile(frame.f_code)
lineno = frame.f_lineno
if os.path.basename(filename) == "package.py":
# subtract 1 because we inject a magic import at the top of package files.
# TODO: get rid of the magic import.
lineno -= 1
lines = ["{0}:{1:d}, in {2}:".format(filename, lineno, frame.f_code.co_name)]
lines = [f"{filename}:{frame.f_lineno}, in {frame.f_code.co_name}:"]
# Build a message showing context in the install method.
sourcelines, start = inspect.getsourcelines(frame)
# Calculate lineno of the error relative to the start of the function.
fun_lineno = lineno - start
fun_lineno = frame.f_lineno - start
start_ctx = max(0, fun_lineno - context)
sourcelines = sourcelines[start_ctx : fun_lineno + context + 1]
for i, line in enumerate(sourcelines):
is_error = start_ctx + i == fun_lineno
mark = ">> " if is_error else " "
# Add start to get lineno relative to start of file, not function.
marked = " {0}{1:-6d}{2}".format(mark, start + start_ctx + i, line.rstrip())
marked = f" {'>> ' if is_error else ' '}{start + start_ctx + i:-6d}{line.rstrip()}"
if is_error:
marked = colorize("@R{%s}" % cescape(marked))
lines.append(marked)


@@ -13,6 +13,7 @@
import spack.build_environment
import spack.builder
import spack.compilers.libraries
import spack.error
import spack.package_base
import spack.phase_callbacks
@@ -396,33 +397,44 @@ def _do_patch_libtool(self) -> None:
markers[tag] = "LIBTOOL TAG CONFIG: {0}".format(tag.upper())
# Replace empty linker flag prefixes:
if self.pkg.compiler.name == "nag":
if self.spec.satisfies("%nag"):
# Nag is mixed with gcc and g++, which are recognized correctly.
# Therefore, we change only Fortran values:
nag_pkg = self.spec["fortran"].package
for tag in ["fc", "f77"]:
marker = markers[tag]
x.filter(
regex='^wl=""$',
repl='wl="{0}"'.format(self.pkg.compiler.linker_arg),
start_at="# ### BEGIN {0}".format(marker),
stop_at="# ### END {0}".format(marker),
repl=f'wl="{nag_pkg.linker_arg}"',
start_at=f"# ### BEGIN {marker}",
stop_at=f"# ### END {marker}",
)
else:
x.filter(regex='^wl=""$', repl='wl="{0}"'.format(self.pkg.compiler.linker_arg))
compiler_spec = spack.compilers.libraries.compiler_spec(self.spec)
if compiler_spec:
x.filter(regex='^wl=""$', repl='wl="{0}"'.format(compiler_spec.package.linker_arg))
# Replace empty PIC flag values:
for cc, marker in markers.items():
for compiler, marker in markers.items():
if compiler == "cc":
language = "c"
elif compiler == "cxx":
language = "cxx"
else:
language = "fortran"
if language not in self.spec:
continue
x.filter(
regex='^pic_flag=""$',
repl='pic_flag="{0}"'.format(
getattr(self.pkg.compiler, "{0}_pic_flag".format(cc))
),
start_at="# ### BEGIN {0}".format(marker),
stop_at="# ### END {0}".format(marker),
repl=f'pic_flag="{self.spec[language].package.pic_flag}"',
start_at=f"# ### BEGIN {marker}",
stop_at=f"# ### END {marker}",
)
# Other compiler-specific patches:
if self.pkg.compiler.name == "fj":
if self.spec.satisfies("%fj"):
x.filter(regex="-nostdlib", repl="", string=True)
rehead = r"/\S*/"
for o in [
@@ -435,7 +447,7 @@ def _do_patch_libtool(self) -> None:
r"crtendS\.o",
]:
x.filter(regex=(rehead + o), repl="")
elif self.pkg.compiler.name == "nag":
elif self.spec.satisfies("%nag"):
for tag in ["fc", "f77"]:
marker = markers[tag]
start_at = "# ### BEGIN {0}".format(marker)


@@ -11,6 +11,7 @@
import llnl.util.tty as tty
import spack.phase_callbacks
from spack.directives import depends_on
from .cmake import CMakeBuilder, CMakePackage
@@ -68,12 +69,7 @@ class CachedCMakeBuilder(CMakeBuilder):
@property
def cache_name(self):
return "{0}-{1}-{2}@{3}.cmake".format(
self.pkg.name,
self.pkg.spec.architecture,
self.pkg.spec.compiler.name,
self.pkg.spec.compiler.version,
)
return f"{self.pkg.name}-{self.spec.architecture.platform}-{self.spec.dag_hash()}.cmake"
@property
def cache_path(self):
@@ -116,7 +112,9 @@ def initconfig_compiler_entries(self):
# Fortran compiler is optional
if "FC" in os.environ:
spack_fc_entry = cmake_cache_path("CMAKE_Fortran_COMPILER", os.environ["FC"])
system_fc_entry = cmake_cache_path("CMAKE_Fortran_COMPILER", self.pkg.compiler.fc)
system_fc_entry = cmake_cache_path(
"CMAKE_Fortran_COMPILER", self.spec["fortran"].package.fortran
)
else:
spack_fc_entry = "# No Fortran compiler defined in spec"
system_fc_entry = "# No Fortran compiler defined in spec"
@@ -132,8 +130,8 @@ def initconfig_compiler_entries(self):
" " + cmake_cache_path("CMAKE_CXX_COMPILER", os.environ["CXX"]),
" " + spack_fc_entry,
"else()\n",
" " + cmake_cache_path("CMAKE_C_COMPILER", self.pkg.compiler.cc),
" " + cmake_cache_path("CMAKE_CXX_COMPILER", self.pkg.compiler.cxx),
" " + cmake_cache_path("CMAKE_C_COMPILER", self.spec["c"].package.cc),
" " + cmake_cache_path("CMAKE_CXX_COMPILER", self.spec["cxx"].package.cxx),
" " + system_fc_entry,
"endif()\n",
]
@@ -352,6 +350,10 @@ class CachedCMakePackage(CMakePackage):
CMakeBuilder = CachedCMakeBuilder
# These dependencies are assumed in the builder
depends_on("c", type="build")
depends_on("cxx", type="build")
def flag_handler(self, name, flags):
if name in ("cflags", "cxxflags", "cppflags", "fflags"):
return None, None, None # handled in the cmake cache


@@ -9,7 +9,7 @@
import re
import sys
from itertools import chain
from typing import Any, List, Optional, Set, Tuple
from typing import Any, List, Optional, Tuple
import llnl.util.filesystem as fs
from llnl.util.lang import stable_partition
@@ -21,6 +21,7 @@
import spack.phase_callbacks
import spack.spec
import spack.util.prefix
from spack import traverse
from spack.directives import build_system, conflicts, depends_on, variant
from spack.multimethod import when
from spack.util.environment import filter_system_paths
@@ -166,15 +167,18 @@ def _values(x):
def get_cmake_prefix_path(pkg: spack.package_base.PackageBase) -> List[str]:
"""Obtain the CMAKE_PREFIX_PATH entries for a package, based on the cmake_prefix_path package
attribute of direct build/test and transitive link dependencies."""
# Add direct build/test deps
selected: Set[str] = {s.dag_hash() for s in pkg.spec.dependencies(deptype=dt.BUILD | dt.TEST)}
# Add transitive link deps
selected.update(s.dag_hash() for s in pkg.spec.traverse(root=False, deptype=dt.LINK))
# Separate out externals so they do not shadow Spack prefixes
externals, spack_built = stable_partition(
(s for s in pkg.spec.traverse(root=False, order="topo") if s.dag_hash() in selected),
lambda x: x.external,
edges = traverse.traverse_topo_edges_generator(
traverse.with_artificial_edges([pkg.spec]),
visitor=traverse.MixedDepthVisitor(
direct=dt.BUILD | dt.TEST, transitive=dt.LINK, key=traverse.by_dag_hash
),
key=traverse.by_dag_hash,
root=False,
all_edges=False, # cover all nodes, not all edges
)
ordered_specs = [edge.spec for edge in edges]
# Separate out externals so they do not shadow Spack prefixes
externals, spack_built = stable_partition((s for s in ordered_specs), lambda x: x.external)
return filter_system_paths(
path for spec in chain(spack_built, externals) for path in spec.package.cmake_prefix_paths


@@ -5,15 +5,22 @@
import itertools
import os
import pathlib
import platform
import re
import sys
from typing import Dict, List, Sequence, Tuple, Union
from typing import Dict, List, Optional, Sequence, Tuple, Union
import archspec.cpu
import llnl.util.tty as tty
from llnl.util.lang import classproperty
from llnl.util.lang import classproperty, memoized
import spack.compiler
import spack
import spack.compilers.error
import spack.compilers.libraries
import spack.config
import spack.package_base
import spack.paths
import spack.util.executable
# Local "type" for type hints
@@ -44,6 +51,9 @@ class CompilerPackage(spack.package_base.PackageBase):
#: Static definition of languages supported by this class
compiler_languages: Sequence[str] = ["c", "cxx", "fortran"]
#: Relative path to compiler wrappers
link_paths: Dict[str, str] = {}
def __init__(self, spec: "spack.spec.Spec"):
super().__init__(spec)
msg = f"Supported languages for {spec} are not a subset of possible supported languages"
@@ -78,14 +88,14 @@ def executables(cls) -> Sequence[str]:
]
@classmethod
def determine_version(cls, exe: Path):
def determine_version(cls, exe: Path) -> str:
version_argument = cls.compiler_version_argument
if isinstance(version_argument, str):
version_argument = (version_argument,)
for va in version_argument:
try:
output = spack.compiler.get_compiler_version_output(exe, va)
output = compiler_output(exe, version_argument=va)
match = re.search(cls.compiler_version_regex, output)
if match:
return ".".join(match.groups())
@@ -96,6 +106,7 @@ def determine_version(cls, exe: Path):
f"[{__file__}] Cannot detect a valid version for the executable "
f"{str(exe)}, for package '{cls.name}': {e}"
)
return ""
@classmethod
def compiler_bindir(cls, prefix: Path) -> Path:
@@ -143,3 +154,184 @@ def determine_compiler_paths(cls, exes: Sequence[Path]) -> Dict[str, Path]:
def determine_variants(cls, exes: Sequence[Path], version_str: str) -> Tuple:
# path determination is separated so it can be reused in subclasses
return "", {"compilers": cls.determine_compiler_paths(exes=exes)}
#: Returns the argument needed to set the RPATH, or None if it does not exist
rpath_arg: Optional[str] = "-Wl,-rpath,"
#: Flag that needs to be used to pass an argument to the linker
linker_arg: str = "-Wl,"
#: Flag used to produce Position Independent Code
pic_flag: str = "-fPIC"
#: Flag used to get verbose output
verbose_flags: str = "-v"
#: Flag to activate OpenMP support
openmp_flag: str = "-fopenmp"
def standard_flag(self, *, language: str, standard: str) -> str:
"""Returns the flag used to enforce a given standard for a language"""
if language not in self.supported_languages:
raise spack.compilers.error.UnsupportedCompilerFlag(
f"{self.spec} does not provide the '{language}' language"
)
try:
return self._standard_flag(language=language, standard=standard)
except (KeyError, RuntimeError) as e:
raise spack.compilers.error.UnsupportedCompilerFlag(
f"{self.spec} does not provide the '{language}' standard {standard}"
) from e
def _standard_flag(self, *, language: str, standard: str) -> str:
raise NotImplementedError("Must be implemented by derived classes")
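A minimal sketch of how a derived compiler package might implement the hook above; the class name and flag table are illustrative, not taken from a real package:

class MyCompilerPackage(CompilerPackage):
    # Hypothetical mapping from (language, standard) to a command-line
    # flag; a missing entry raises KeyError, which standard_flag() turns
    # into an UnsupportedCompilerFlag error.
    def _standard_flag(self, *, language: str, standard: str) -> str:
        flags = {
            ("c", "11"): "-std=c11",
            ("cxx", "14"): "-std=c++14",
            ("cxx", "17"): "-std=c++17",
        }
        return flags[(language, standard)]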
@property
def disable_new_dtags(self) -> str:
if platform.system() == "Darwin":
return ""
return "--disable-new-dtags"
@property
def enable_new_dtags(self) -> str:
if platform.system() == "Darwin":
return ""
return "--enable-new-dtags"
def setup_dependent_build_environment(self, env, dependent_spec):
# FIXME (compiler as nodes): check if this is good enough or should be made more general
# The package is not used as a compiler, so skip this setup
if not any(
lang in dependent_spec and dependent_spec[lang].name == self.spec.name
for lang in ("c", "cxx", "fortran")
):
return
# Populate an object with the list of environment modifications and return it
link_dir = pathlib.Path(spack.paths.build_env_path)
env_paths = []
for language, attr_name, wrapper_var_name, spack_var_name in [
("c", "cc", "CC", "SPACK_CC"),
("cxx", "cxx", "CXX", "SPACK_CXX"),
("fortran", "fortran", "F77", "SPACK_F77"),
("fortran", "fortran", "FC", "SPACK_FC"),
]:
if language not in dependent_spec or dependent_spec[language].name != self.spec.name:
continue
if not hasattr(self, attr_name):
continue
compiler = getattr(self, attr_name)
env.set(spack_var_name, compiler)
if language not in self.link_paths:
continue
wrapper_path = link_dir / self.link_paths.get(language)
env.set(wrapper_var_name, str(wrapper_path))
env.set(f"SPACK_{wrapper_var_name}_RPATH_ARG", self.rpath_arg)
uarch = dependent_spec.architecture.target
version_number, _ = archspec.cpu.version_components(
self.spec.version.dotted_numeric_string
)
try:
isa_arg = uarch.optimization_flags(self.archspec_name(), version_number)
except (ValueError, archspec.cpu.UnsupportedMicroarchitecture):
isa_arg = ""
if isa_arg:
env.set(f"SPACK_TARGET_ARGS_{attr_name.upper()}", isa_arg)
# Add spack build environment path with compiler wrappers first in
# the path. We add the compiler wrapper path, which includes default
# wrappers (cc, c++, f77, f90), AND a subdirectory containing
# compiler-specific symlinks. The latter ensures that builds that
# are sensitive to the *name* of the compiler see the right name when
# we're building with the wrappers.
#
# Conflicts on case-insensitive systems (like "CC" and "cc") are
# handled by putting one in the <build_env_path>/case-insensitive
# directory. Add that to the path too.
compiler_specific = os.path.join(
spack.paths.build_env_path, os.path.dirname(self.link_paths[language])
)
for item in [spack.paths.build_env_path, compiler_specific]:
env_paths.append(item)
ci = os.path.join(item, "case-insensitive")
if os.path.isdir(ci):
env_paths.append(ci)
# FIXME (compiler as nodes): make these paths language specific
env.set("SPACK_LINKER_ARG", self.linker_arg)
paths = _implicit_rpaths(pkg=self)
if paths:
env.set("SPACK_COMPILER_IMPLICIT_RPATHS", ":".join(paths))
# Check whether we want to force RPATH or RUNPATH
if spack.config.CONFIG.get("config:shared_linking:type") == "rpath":
env.set("SPACK_DTAGS_TO_STRIP", self.enable_new_dtags)
env.set("SPACK_DTAGS_TO_ADD", self.disable_new_dtags)
else:
env.set("SPACK_DTAGS_TO_STRIP", self.disable_new_dtags)
env.set("SPACK_DTAGS_TO_ADD", self.enable_new_dtags)
spec = self.spec
if spec.extra_attributes:
extra_rpaths = spec.extra_attributes.get("extra_rpaths")
if extra_rpaths:
extra_rpaths = ":".join(compiler.extra_rpaths)
env.append_path("SPACK_COMPILER_EXTRA_RPATHS", extra_rpaths)
for item in env_paths:
env.prepend_path("SPACK_ENV_PATH", item)
def archspec_name(self) -> str:
"""Name that archspec uses to refer to this compiler"""
return self.spec.name
def _implicit_rpaths(pkg: spack.package_base.PackageBase) -> List[str]:
detector = spack.compilers.libraries.CompilerPropertyDetector(pkg.spec)
paths = detector.implicit_rpaths()
return paths
@memoized
def _compiler_output(
compiler_path: Path, *, version_argument: str, ignore_errors: Tuple[int, ...] = ()
) -> str:
"""Returns the output from the compiler invoked with the given version argument.
Args:
compiler_path: path of the compiler to be invoked
version_argument: the argument used to extract version information
ignore_errors: compiler exit codes to ignore when capturing output
"""
compiler = spack.util.executable.Executable(compiler_path)
compiler_invocation_args = {
"output": str,
"error": str,
"ignore_errors": ignore_errors,
"timeout": 120,
"fail_on_error": True,
}
if version_argument:
output = compiler(version_argument, **compiler_invocation_args)
else:
output = compiler(**compiler_invocation_args)
return output
def compiler_output(
compiler_path: Path, *, version_argument: str, ignore_errors: Tuple[int, ...] = ()
) -> str:
"""Wrapper for _get_compiler_version_output()."""
# This ensures that we memoize compiler output by *absolute path*,
# not just executable name. If we don't do this, and the path changes
# (e.g., during testing), we can get incorrect results.
if not os.path.isabs(compiler_path):
compiler_path = spack.util.executable.which_string(compiler_path, required=True)
return _compiler_output(
compiler_path, version_argument=version_argument, ignore_errors=ignore_errors
)
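A hedged usage sketch of the wrapper above; the compiler path and version argument are illustrative:

# Hypothetical usage: capture version output from a compiler binary.
out = compiler_output("/usr/bin/gcc", version_argument="--version")
# A relative name would be resolved to an absolute path first, so the
# memoized cache keys on the real executable rather than on whatever
# happens to be first in PATH.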


@@ -75,7 +75,7 @@ def toolchain_version(self):
Override this method to select a specific version of the toolchain or change
selection heuristics.
Default is whatever version of msvc has been selected by concretization."""
return "v" + self.pkg.compiler.platform_toolset_ver
return "v" + self.spec["msvc"].package.platform_toolset_ver
@property
def std_msbuild_args(self):


@@ -140,7 +140,7 @@ def setup_run_environment(self, env):
$ source {prefix}/{component}/{version}/env/vars.sh
"""
# Only if environment modifications are desired (default is +envmods)
if "~envmods" not in self.spec:
if "+envmods" in self.spec:
env.extend(
EnvironmentModifications.from_sourcing_file(
self.component_prefix.env.join("vars.sh"), *self.env_script_args
@@ -255,7 +255,7 @@ def libs(self):
return find_libraries("*", root=self.component_prefix.lib, recursive=not self.v2_layout)
class IntelOneApiLibraryPackageWithSdk(IntelOneApiPackage):
class IntelOneApiLibraryPackageWithSdk(IntelOneApiLibraryPackage):
"""Base class for Intel oneAPI library packages with SDK components.
Contains some convenient default implementations for libraries


@@ -277,10 +277,6 @@ def update_external_dependencies(self, extendee_spec=None):
if not python.architecture.target:
python.architecture.target = archspec.cpu.host().family.name
# Ensure compiler information is present
if not python.compiler:
python.compiler = self.spec.compiler
python.external_path = self.spec.external_path
python._mark_concrete()
self.spec.add_dependency_edge(python, depflag=dt.BUILD | dt.LINK | dt.RUN, virtuals=())


@@ -37,7 +37,8 @@
import spack.config as cfg
import spack.error
import spack.main
import spack.mirror
import spack.mirrors.mirror
import spack.mirrors.utils
import spack.paths
import spack.repo
import spack.spec
@@ -204,7 +205,7 @@ def _print_staging_summary(spec_labels, stages, rebuild_decisions):
if not stages:
return
mirrors = spack.mirror.MirrorCollection(binary=True)
mirrors = spack.mirrors.mirror.MirrorCollection(binary=True)
tty.msg("Checked the following mirrors for binaries:")
for m in mirrors.values():
tty.msg(f" {m.fetch_url}")
@@ -797,7 +798,7 @@ def ensure_expected_target_path(path):
path = path.replace("\\", "/")
return path
pipeline_mirrors = spack.mirror.MirrorCollection(binary=True)
pipeline_mirrors = spack.mirrors.mirror.MirrorCollection(binary=True)
buildcache_destination = None
if "buildcache-destination" not in pipeline_mirrors:
raise SpackCIError("spack ci generate requires a mirror named 'buildcache-destination'")
@@ -1323,7 +1324,7 @@ def push_to_build_cache(spec: spack.spec.Spec, mirror_url: str, sign_binaries: b
"""
tty.debug(f"Pushing to build cache ({'signed' if sign_binaries else 'unsigned'})")
signing_key = bindist.select_signing_key() if sign_binaries else None
mirror = spack.mirror.Mirror.from_url(mirror_url)
mirror = spack.mirrors.mirror.Mirror.from_url(mirror_url)
try:
with bindist.make_uploader(mirror, signing_key=signing_key) as uploader:
uploader.push_or_raise([spec])
@@ -1343,7 +1344,7 @@ def remove_other_mirrors(mirrors_to_keep, scope=None):
mirrors_to_remove.append(name)
for mirror_name in mirrors_to_remove:
spack.mirror.remove(mirror_name, scope)
spack.mirrors.utils.remove(mirror_name, scope)
def copy_files_to_artifacts(src, artifacts_dir):
@@ -2137,7 +2138,7 @@ def build_name(self):
Returns: (str) current spec's CDash build name."""
spec = self.current_spec
if spec:
build_name = f"{spec.name}@{spec.version}%{spec.compiler} \
build_name = f"{spec.name}@{spec.version} \
hash={spec.dag_hash()} arch={spec.architecture} ({self.build_group})"
tty.debug(f"Generated CDash build name ({build_name}) from the {spec.name}")
return build_name


@@ -4,6 +4,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import argparse
import difflib
import importlib
import os
import re
@@ -125,6 +126,8 @@ def get_module(cmd_name):
tty.debug("Imported {0} from built-in commands".format(pname))
except ImportError:
module = spack.extensions.get_module(cmd_name)
if not module:
raise CommandNotFoundError(cmd_name)
attr_setdefault(module, SETUP_PARSER, lambda *args: None) # null-op
attr_setdefault(module, DESCRIPTION, "")
@@ -369,8 +372,13 @@ def iter_groups(specs, indent, all_headers):
index = index_by(specs, ("architecture", "compiler"))
ispace = indent * " "
def _key(item):
if item is None:
return ""
return str(item)
# Traverse the index and print out each package
for i, (architecture, compiler) in enumerate(sorted(index)):
for i, (architecture, compiler) in enumerate(sorted(index, key=_key)):
if i > 0:
print()
@@ -428,6 +436,7 @@ def display_specs(specs, args=None, **kwargs):
"""
# FIXME (compiler as nodes): remove the "show full compiler" arguments, and its use
def get_arg(name, default=None):
"""Prefer kwargs, then args, then default."""
if name in kwargs:
@@ -442,7 +451,6 @@ def get_arg(name, default=None):
hashes = get_arg("long", False)
namespaces = get_arg("namespaces", False)
flags = get_arg("show_flags", False)
full_compiler = get_arg("show_full_compiler", False)
variants = get_arg("variants", False)
groups = get_arg("groups", True)
all_headers = get_arg("all_headers", False)
@@ -464,10 +472,7 @@ def get_arg(name, default=None):
if format_string is None:
nfmt = "{fullname}" if namespaces else "{name}"
ffmt = ""
if full_compiler or flags:
ffmt += "{%compiler.name}"
if full_compiler:
ffmt += "{@compiler.version}"
if flags:
ffmt += " {compiler_flags}"
vfmt = "{variants}" if variants else ""
format_string = nfmt + "{@version}" + ffmt + vfmt
@@ -691,3 +696,24 @@ def find_environment(args):
def first_line(docstring):
"""Return the first line of the docstring."""
return docstring.split("\n")[0]
class CommandNotFoundError(spack.error.SpackError):
"""Exception class thrown when a requested command is not recognized as
such.
"""
def __init__(self, cmd_name):
msg = (
f"{cmd_name} is not a recognized Spack command or extension command; "
"check with `spack commands`."
)
long_msg = None
similar = difflib.get_close_matches(cmd_name, all_commands())
if 1 <= len(similar) <= 5:
long_msg = "\nDid you mean one of the following commands?\n "
long_msg += "\n ".join(similar)
super().__init__(msg, long_msg)


@@ -16,7 +16,7 @@
import spack.bootstrap.config
import spack.bootstrap.core
import spack.config
import spack.mirror
import spack.mirrors.utils
import spack.spec
import spack.stage
import spack.util.path
@@ -400,7 +400,7 @@ def _mirror(args):
llnl.util.tty.set_msg_enabled(False)
spec = spack.spec.Spec(spec_str).concretized()
for node in spec.traverse():
spack.mirror.create(mirror_dir, [node])
spack.mirrors.utils.create(mirror_dir, [node])
llnl.util.tty.set_msg_enabled(True)
if args.binary_packages:


@@ -21,7 +21,7 @@
import spack.deptypes as dt
import spack.environment as ev
import spack.error
import spack.mirror
import spack.mirrors.mirror
import spack.oci.oci
import spack.spec
import spack.stage
@@ -392,7 +392,7 @@ def push_fn(args):
roots = spack.cmd.require_active_env(cmd_name="buildcache push").concrete_roots()
mirror = args.mirror
assert isinstance(mirror, spack.mirror.Mirror)
assert isinstance(mirror, spack.mirrors.mirror.Mirror)
push_url = mirror.push_url
@@ -750,7 +750,7 @@ def manifest_copy(manifest_file_list, dest_mirror=None):
copy_buildcache_file(copy_file["src"], dest)
def update_index(mirror: spack.mirror.Mirror, update_keys=False):
def update_index(mirror: spack.mirrors.mirror.Mirror, update_keys=False):
# Special case OCI images for now.
try:
image_ref = spack.oci.oci.image_from_mirror(mirror)


@@ -20,7 +20,7 @@
import spack.config as cfg
import spack.environment as ev
import spack.hash_types as ht
import spack.mirror
import spack.mirrors.mirror
import spack.util.gpg as gpg_util
import spack.util.timer as timer
import spack.util.url as url_util
@@ -240,7 +240,7 @@ def ci_reindex(args):
ci_mirrors = yaml_root["mirrors"]
mirror_urls = [url for url in ci_mirrors.values()]
remote_mirror_url = mirror_urls[0]
mirror = spack.mirror.Mirror(remote_mirror_url)
mirror = spack.mirrors.mirror.Mirror(remote_mirror_url)
buildcache.update_index(mirror, update_keys=True)
@@ -328,7 +328,7 @@ def ci_rebuild(args):
full_rebuild = True if rebuild_everything and rebuild_everything.lower() == "true" else False
pipeline_mirrors = spack.mirror.MirrorCollection(binary=True)
pipeline_mirrors = spack.mirrors.mirror.MirrorCollection(binary=True)
buildcache_destination = None
if "buildcache-destination" not in pipeline_mirrors:
tty.die("spack ci rebuild requires a mirror named 'buildcache-destination")


@@ -14,7 +14,8 @@
import spack.config
import spack.deptypes as dt
import spack.environment as ev
import spack.mirror
import spack.mirrors.mirror
import spack.mirrors.utils
import spack.reporters
import spack.spec
import spack.store
@@ -689,31 +690,31 @@ def mirror_name_or_url(m):
# If there's a \ or / in the name, it's interpreted as a path or url.
if "/" in m or "\\" in m or m in (".", ".."):
return spack.mirror.Mirror(m)
return spack.mirrors.mirror.Mirror(m)
# Otherwise, the named mirror is required to exist.
try:
return spack.mirror.require_mirror_name(m)
return spack.mirrors.utils.require_mirror_name(m)
except ValueError as e:
raise argparse.ArgumentTypeError(f"{e}. Did you mean {os.path.join('.', m)}?") from e
def mirror_url(url):
try:
return spack.mirror.Mirror.from_url(url)
return spack.mirrors.mirror.Mirror.from_url(url)
except ValueError as e:
raise argparse.ArgumentTypeError(str(e)) from e
def mirror_directory(path):
try:
return spack.mirror.Mirror.from_local_path(path)
return spack.mirrors.mirror.Mirror.from_local_path(path)
except ValueError as e:
raise argparse.ArgumentTypeError(str(e)) from e
def mirror_name(name):
try:
return spack.mirror.require_mirror_name(name)
return spack.mirrors.utils.require_mirror_name(name)
except ValueError as e:
raise argparse.ArgumentTypeError(str(e)) from e


@@ -5,13 +5,14 @@
import argparse
import sys
import warnings
import llnl.util.tty as tty
from llnl.util.lang import index_by
from llnl.util.tty.colify import colify
from llnl.util.tty.color import colorize
import spack.compilers
import spack.compilers.config
import spack.config
import spack.spec
from spack.cmd.common import arguments
@@ -35,13 +36,13 @@ def setup_parser(subparser):
"--mixed-toolchain",
action="store_true",
default=sys.platform == "darwin",
help="Allow mixed toolchains (for example: clang, clang++, gfortran)",
help="(DEPRECATED) Allow mixed toolchains (for example: clang, clang++, gfortran)",
)
mixed_toolchain_group.add_argument(
"--no-mixed-toolchain",
action="store_false",
dest="mixed_toolchain",
help="Do not allow mixed toolchains (for example: clang, clang++, gfortran)",
help="(DEPRECATED) Do not allow mixed toolchains (for example: clang, clang++, gfortran)",
)
find_parser.add_argument("add_paths", nargs=argparse.REMAINDER)
find_parser.add_argument(
@@ -80,77 +81,97 @@ def compiler_find(args):
"""Search either $PATH or a list of paths OR MODULES for compilers and
add them to Spack's configuration.
"""
if args.mixed_toolchain:
warnings.warn(
"The '--mixed-toolchain' option has been deprecated in Spack v0.23, and currently "
"has no effect. The option will be removed in Spack v0.25"
)
paths = args.add_paths or None
new_compilers = spack.compilers.find_compilers(
path_hints=paths,
scope=args.scope,
mixed_toolchain=args.mixed_toolchain,
max_workers=args.jobs,
new_compilers = spack.compilers.config.find_compilers(
path_hints=paths, scope=args.scope, max_workers=args.jobs
)
if new_compilers:
n = len(new_compilers)
s = "s" if n > 1 else ""
filename = spack.config.CONFIG.get_config_filename(args.scope, "compilers")
filename = spack.config.CONFIG.get_config_filename(args.scope, "packages")
tty.msg(f"Added {n:d} new compiler{s} to {filename}")
compiler_strs = sorted(f"{c.spec.name}@{c.spec.version}" for c in new_compilers)
compiler_strs = sorted(f"{spec.name}@{spec.versions}" for spec in new_compilers)
colify(reversed(compiler_strs), indent=4)
else:
tty.msg("Found no new compilers")
tty.msg("Compilers are defined in the following files:")
colify(spack.compilers.compiler_config_files(), indent=4)
colify(spack.compilers.config.compiler_config_files(), indent=4)
def compiler_remove(args):
compiler_spec = spack.spec.CompilerSpec(args.compiler_spec)
candidate_compilers = spack.compilers.compilers_for_spec(compiler_spec, scope=args.scope)
remover = spack.compilers.config.CompilerRemover(spack.config.CONFIG)
candidates = remover.mark_compilers(match=args.compiler_spec, scope=args.scope)
if not candidates:
tty.die(f"No compiler matches '{args.compiler_spec}'")
if not candidate_compilers:
tty.die("No compilers match spec %s" % compiler_spec)
compiler_strs = reversed(sorted(f"{spec.name}@{spec.versions}" for spec in candidates))
if not args.all and len(candidate_compilers) > 1:
tty.error(f"Multiple compilers match spec {compiler_spec}. Choose one:")
colify(reversed(sorted([c.spec.display_str for c in candidate_compilers])), indent=4)
tty.msg("Or, use `spack compiler remove -a` to remove all of them.")
if not args.all and len(candidates) > 1:
tty.error(f"multiple compilers match the spec '{args.compiler_spec}':")
print()
colify(compiler_strs, indent=4)
print()
print(
"Either use a stricter spec to select only one, or use `spack compiler remove -a`"
" to remove all of them."
)
sys.exit(1)
for current_compiler in candidate_compilers:
spack.compilers.remove_compiler_from_config(current_compiler.spec, scope=args.scope)
tty.msg(f"{current_compiler.spec.display_str} has been removed")
remover.flush()
tty.msg("The following compilers have been removed:")
print()
colify(compiler_strs, indent=4)
print()
def compiler_info(args):
"""Print info about all compilers matching a spec."""
cspec = spack.spec.CompilerSpec(args.compiler_spec)
compilers = spack.compilers.compilers_for_spec(cspec, scope=args.scope)
query = spack.spec.Spec(args.compiler_spec)
all_compilers = spack.compilers.config.all_compilers(scope=args.scope, init_config=False)
compilers = [x for x in all_compilers if x.satisfies(query)]
if not compilers:
tty.die("No compilers match spec %s" % cspec)
tty.die(f"No compilers match spec {query.cformat()}")
else:
for c in compilers:
print(c.spec.display_str + ":")
print("\tpaths:")
for cpath in ["cc", "cxx", "f77", "fc"]:
print("\t\t%s = %s" % (cpath, getattr(c, cpath, None)))
if c.flags:
print("\tflags:")
for flag, flag_value in c.flags.items():
print("\t\t%s = %s" % (flag, flag_value))
if len(c.environment) != 0:
if len(c.environment.get("set", {})) != 0:
print("\tenvironment:")
print("\t set:")
for key, value in c.environment["set"].items():
print("\t %s = %s" % (key, value))
if c.extra_rpaths:
print("\tExtra rpaths:")
for extra_rpath in c.extra_rpaths:
print("\t\t%s" % extra_rpath)
print("\tmodules = %s" % c.modules)
print("\toperating system = %s" % c.operating_system)
print(f"{c.cformat()}:")
print(f" prefix: {c.external_path}")
extra_attributes = getattr(c, "extra_attributes", {})
if "compilers" in extra_attributes:
print(" compilers:")
for language, exe in extra_attributes.get("compilers", {}).items():
print(f" {language}: {exe}")
if "flags" in extra_attributes:
print(" flags:")
for flag, flag_value in extra_attributes["flags"].items():
print(f" {flag} = {flag_value}")
# FIXME (compiler as nodes): recover this printing
# if "environment" in extra_attributes:
# if len(c.environment.get("set", {})) != 0:
# print("\tenvironment:")
# print("\t set:")
# for key, value in c.environment["set"].items():
# print("\t %s = %s" % (key, value))
if "extra_rpaths" in extra_attributes:
print(" extra rpaths:")
for extra_rpath in extra_attributes["extra_rpaths"]:
print(f" {extra_rpath}")
if getattr(c, "external_modules", []):
print(" modules: ")
for module in c.external_modules:
print(f" {module}")
print()
def compiler_list(args):
compilers = spack.compilers.all_compilers(scope=args.scope, init_config=False)
compilers = spack.compilers.config.all_compilers(scope=args.scope, init_config=False)
# If there are no compilers in any scope, and we're outputting to a tty, give a
# hint to the user.
@@ -163,7 +184,7 @@ def compiler_list(args):
tty.msg(msg)
return
index = index_by(compilers, lambda c: (c.spec.name, c.operating_system, c.target))
index = index_by(compilers, spack.compilers.config.name_os_target)
tty.msg("Available compilers")
@@ -182,10 +203,10 @@ def compiler_list(args):
name, os, target = key
os_str = os
if target:
os_str += "-%s" % target
cname = "%s{%s} %s" % (spack.spec.COMPILER_COLOR, name, os_str)
os_str += f"-{target}"
cname = f"{spack.spec.COMPILER_COLOR}{{{name}}} {os_str}"
tty.hline(colorize(cname), char="-")
colify(reversed(sorted(c.spec.display_str for c in compilers)))
colify(reversed(sorted(c.format("{name}@{version}") for c in compilers)))
def compiler(parser, args):

View File

@@ -518,8 +518,6 @@ def config_prefer_upstream(args):
for spec in pref_specs:
# Collect all the upstream compilers and versions for this package.
pkg = pkgs.get(spec.name, {"version": []})
all = pkgs.get("all", {"compiler": []})
pkgs["all"] = all
pkgs[spec.name] = pkg
# We have no existing variant if this is our first added version.
@@ -529,10 +527,6 @@ def config_prefer_upstream(args):
if version not in pkg["version"]:
pkg["version"].append(version)
compiler = str(spec.compiler)
if compiler not in all["compiler"]:
all["compiler"].append(compiler)
# Get and list all the variants that differ from the default.
variants = []
for var_name, variant in spec.variants.items():

View File

@@ -99,7 +99,7 @@ def setup_parser(subparser):
"--show-full-compiler",
action="store_true",
dest="show_full_compiler",
help="show full compiler specs",
help="(DEPRECATED) show full compiler specs. Currently it's a no-op",
)
implicit_explicit = subparser.add_mutually_exclusive_group()
implicit_explicit.add_argument(
@@ -279,7 +279,6 @@ def root_decorator(spec, string):
# these enforce details in the root specs to show what the user asked for
namespaces=True,
show_flags=True,
show_full_compiler=True,
decorator=root_decorator,
variants=True,
)
@@ -302,7 +301,6 @@ def root_decorator(spec, string):
decorator=lambda s, f: color.colorize("@*{%s}" % f),
namespace=True,
show_flags=True,
show_full_compiler=True,
variants=True,
)
print()

View File

@@ -8,7 +8,7 @@
import tempfile
import spack.binary_distribution
import spack.mirror
import spack.mirrors.mirror
import spack.paths
import spack.stage
import spack.util.gpg
@@ -217,11 +217,11 @@ def gpg_publish(args):
mirror = None
if args.directory:
url = spack.util.url.path_to_file_url(args.directory)
mirror = spack.mirror.Mirror(url, url)
mirror = spack.mirrors.mirror.Mirror(url, url)
elif args.mirror_name:
mirror = spack.mirror.MirrorCollection(binary=True).lookup(args.mirror_name)
mirror = spack.mirrors.mirror.MirrorCollection(binary=True).lookup(args.mirror_name)
elif args.mirror_url:
mirror = spack.mirror.Mirror(args.mirror_url, args.mirror_url)
mirror = spack.mirrors.mirror.Mirror(args.mirror_url, args.mirror_url)
with tempfile.TemporaryDirectory(dir=spack.stage.get_stage_root()) as tmpdir:
spack.binary_distribution._url_push_keys(

View File

@@ -14,7 +14,8 @@
import spack.concretize
import spack.config
import spack.environment as ev
import spack.mirror
import spack.mirrors.mirror
import spack.mirrors.utils
import spack.repo
import spack.spec
import spack.util.web as web_util
@@ -365,15 +366,15 @@ def mirror_add(args):
connection["autopush"] = args.autopush
if args.signed is not None:
connection["signed"] = args.signed
mirror = spack.mirror.Mirror(connection, name=args.name)
mirror = spack.mirrors.mirror.Mirror(connection, name=args.name)
else:
mirror = spack.mirror.Mirror(args.url, name=args.name)
spack.mirror.add(mirror, args.scope)
mirror = spack.mirrors.mirror.Mirror(args.url, name=args.name)
spack.mirrors.utils.add(mirror, args.scope)
def mirror_remove(args):
"""remove a mirror by name"""
spack.mirror.remove(args.name, args.scope)
spack.mirrors.utils.remove(args.name, args.scope)
def _configure_mirror(args):
@@ -382,7 +383,7 @@ def _configure_mirror(args):
if args.name not in mirrors:
tty.die(f"No mirror found with name {args.name}.")
entry = spack.mirror.Mirror(mirrors[args.name], args.name)
entry = spack.mirrors.mirror.Mirror(mirrors[args.name], args.name)
direction = "fetch" if args.fetch else "push" if args.push else None
changes = {}
if args.url:
@@ -449,7 +450,7 @@ def mirror_set_url(args):
def mirror_list(args):
"""print out available mirrors to the console"""
mirrors = spack.mirror.MirrorCollection(scope=args.scope)
mirrors = spack.mirrors.mirror.MirrorCollection(scope=args.scope)
if not mirrors:
tty.msg("No mirrors configured.")
return
@@ -489,9 +490,9 @@ def concrete_specs_from_user(args):
def extend_with_additional_versions(specs, num_versions):
if num_versions == "all":
mirror_specs = spack.mirror.get_all_versions(specs)
mirror_specs = spack.mirrors.utils.get_all_versions(specs)
else:
mirror_specs = spack.mirror.get_matching_versions(specs, num_versions=num_versions)
mirror_specs = spack.mirrors.utils.get_matching_versions(specs, num_versions=num_versions)
mirror_specs = [x.concretized() for x in mirror_specs]
return mirror_specs
@@ -570,7 +571,7 @@ def concrete_specs_from_environment():
def all_specs_with_all_versions():
specs = [spack.spec.Spec(n) for n in spack.repo.all_package_names()]
mirror_specs = spack.mirror.get_all_versions(specs)
mirror_specs = spack.mirrors.utils.get_all_versions(specs)
mirror_specs.sort(key=lambda s: (s.name, s.version))
return mirror_specs
@@ -659,19 +660,21 @@ def _specs_and_action(args):
def create_mirror_for_all_specs(mirror_specs, path, skip_unstable_versions):
mirror_cache, mirror_stats = spack.mirror.mirror_cache_and_stats(
mirror_cache, mirror_stats = spack.mirrors.utils.mirror_cache_and_stats(
path, skip_unstable_versions=skip_unstable_versions
)
for candidate in mirror_specs:
pkg_cls = spack.repo.PATH.get_pkg_class(candidate.name)
pkg_obj = pkg_cls(spack.spec.Spec(candidate))
mirror_stats.next_spec(pkg_obj.spec)
spack.mirror.create_mirror_from_package_object(pkg_obj, mirror_cache, mirror_stats)
spack.mirrors.utils.create_mirror_from_package_object(pkg_obj, mirror_cache, mirror_stats)
process_mirror_stats(*mirror_stats.stats())
def create_mirror_for_individual_specs(mirror_specs, path, skip_unstable_versions):
present, mirrored, error = spack.mirror.create(path, mirror_specs, skip_unstable_versions)
present, mirrored, error = spack.mirrors.utils.create(
path, mirror_specs, skip_unstable_versions
)
tty.msg("Summary for mirror in {}".format(path))
process_mirror_stats(present, mirrored, error)
@@ -681,7 +684,7 @@ def mirror_destroy(args):
mirror_url = None
if args.mirror_name:
result = spack.mirror.MirrorCollection().lookup(args.mirror_name)
result = spack.mirrors.mirror.MirrorCollection().lookup(args.mirror_name)
mirror_url = result.push_url
elif args.mirror_url:
mirror_url = args.mirror_url

View File

@@ -8,6 +8,7 @@
import spack.cmd.common.arguments
import spack.cmd.modules
import spack.config
import spack.modules
import spack.modules.lmod

View File

@@ -7,6 +7,7 @@
import spack.cmd.common.arguments
import spack.cmd.modules
import spack.config
import spack.modules
import spack.modules.tcl

View File

@@ -15,6 +15,7 @@
from llnl.util.filesystem import working_dir
import spack.paths
import spack.repo
import spack.util.git
from spack.util.executable import Executable, which
@@ -38,7 +39,7 @@ def grouper(iterable, n, fillvalue=None):
#: double-check the results of other tools (if, e.g., --fix was provided)
#: The list maps an executable name to a method to ensure the tool is
#: bootstrapped or present in the environment.
tool_names = ["import-check", "isort", "black", "flake8", "mypy"]
tool_names = ["import", "isort", "black", "flake8", "mypy"]
#: warnings to ignore in mypy
mypy_ignores = [
@@ -322,8 +323,6 @@ def process_files(file_list, is_args):
rewrite_and_print_output(output, args, pat, replacement)
packages_isort_args = (
"--rm",
"spack",
"--rm",
"spack.pkgkit",
"--rm",
@@ -370,10 +369,19 @@ def run_black(black_cmd, file_list, args):
def _module_part(root: str, expr: str):
parts = expr.split(".")
# spack.pkg is for repositories, don't try to resolve it here.
if ".".join(parts[:2]) == spack.repo.ROOT_PYTHON_NAMESPACE:
return None
while parts:
f1 = os.path.join(root, "lib", "spack", *parts) + ".py"
f2 = os.path.join(root, "lib", "spack", *parts, "__init__.py")
if os.path.exists(f1) or os.path.exists(f2):
if (
os.path.exists(f1)
# ensure case sensitive match
and f"{parts[-1]}.py" in os.listdir(os.path.dirname(f1))
or os.path.exists(f2)
):
return ".".join(parts)
parts.pop()
return None
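The walk above trims the dotted expression from the right until a source file or a package __init__.py exists under lib/spack; a minimal sketch against an in-memory file set (hypothetical paths, the case-sensitivity check elided) shows the same resolution:

EXISTING = {
    "lib/spack/spack/util/executable.py",
    "lib/spack/spack/util/__init__.py",
}

def module_part(expr: str):
    parts = expr.split(".")
    while parts:
        f1 = "/".join(["lib", "spack", *parts]) + ".py"
        f2 = "/".join(["lib", "spack", *parts, "__init__.py"])
        if f1 in EXISTING or f2 in EXISTING:
            return ".".join(parts)
        parts.pop()
    return None

assert module_part("spack.util.executable.which") == "spack.util.executable"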
@@ -389,7 +397,7 @@ def _run_import_check(
out=sys.stdout,
):
if sys.version_info < (3, 9):
print("import-check requires Python 3.9 or later")
print("import check requires Python 3.9 or later")
return 0
is_use = re.compile(r"(?<!from )(?<!import )(?:llnl|spack)\.[a-zA-Z0-9_\.]+")
@@ -431,10 +439,11 @@ def _run_import_check(
module = _module_part(root, m.group(0))
if not module or module in to_add:
continue
if f"import {module}" not in filtered_contents:
to_add.add(module)
exit_code = 1
print(f"{pretty_path}: missing import: {module}", file=out)
if re.search(rf"import {re.escape(module)}\b(?!\.)", contents):
continue
to_add.add(module)
exit_code = 1
print(f"{pretty_path}: missing import: {module} ({m.group(0)})", file=out)
if not fix or not to_add and not to_remove:
continue
@@ -465,7 +474,7 @@ def _run_import_check(
return exit_code
@tool("import-check", external=False)
@tool("import", external=False)
def run_import_check(import_check_cmd, file_list, args):
exit_code = _run_import_check(
file_list,
@@ -474,7 +483,7 @@ def run_import_check(import_check_cmd, file_list, args):
root=args.root,
working_dir=args.initial_working_dir,
)
print_tool_result("import-check", exit_code)
print_tool_result("import", exit_code)
return exit_code

View File

@@ -217,7 +217,7 @@ def unit_test(parser, args, unknown_args):
# Ensure clingo is available before switching to the
# mock configuration used by unit tests
with spack.bootstrap.ensure_bootstrap_configuration():
spack.bootstrap.ensure_core_dependencies()
spack.bootstrap.ensure_clingo_importable_or_raise()
if pytest is None:
spack.bootstrap.ensure_environment_dependencies()
import pytest

View File

@@ -1,850 +0,0 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import contextlib
import hashlib
import itertools
import json
import os
import platform
import re
import shutil
import sys
import tempfile
from typing import Dict, List, Optional, Sequence
import llnl.path
import llnl.util.lang
import llnl.util.tty as tty
from llnl.util.filesystem import path_contains_subdirectory, paths_containing_libs
import spack.caches
import spack.error
import spack.schema.environment
import spack.spec
import spack.util.executable
import spack.util.libc
import spack.util.module_cmd
import spack.version
from spack.util.environment import filter_system_paths
from spack.util.file_cache import FileCache
__all__ = ["Compiler"]
PATH_INSTANCE_VARS = ["cc", "cxx", "f77", "fc"]
FLAG_INSTANCE_VARS = ["cflags", "cppflags", "cxxflags", "fflags"]
@llnl.util.lang.memoized
def _get_compiler_version_output(compiler_path, version_arg, ignore_errors=()) -> str:
"""Invokes the compiler at a given path passing a single
version argument and returns the output.
Args:
compiler_path (path): path of the compiler to be invoked
version_arg (str): the argument used to extract version information
"""
compiler = spack.util.executable.Executable(compiler_path)
compiler_invocation_args = {
"output": str,
"error": str,
"ignore_errors": ignore_errors,
"timeout": 120,
"fail_on_error": True,
}
if version_arg:
output = compiler(version_arg, **compiler_invocation_args)
else:
output = compiler(**compiler_invocation_args)
return output
def get_compiler_version_output(compiler_path, *args, **kwargs) -> str:
"""Wrapper for _get_compiler_version_output()."""
# This ensures that we memoize compiler output by *absolute path*,
# not just executable name. If we don't do this, and the path changes
# (e.g., during testing), we can get incorrect results.
if not os.path.isabs(compiler_path):
compiler_path = spack.util.executable.which_string(compiler_path, required=True)
return _get_compiler_version_output(compiler_path, *args, **kwargs)
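The comment above is the key detail: memoize on the resolved absolute path, not the bare executable name, otherwise a PATH change between calls would silently return stale output. A self-contained sketch of the same idiom using functools and shutil.which in place of spack's which_string (names here are illustrative):

import functools
import os
import shutil
import subprocess

@functools.lru_cache(maxsize=None)
def _version_output(abs_path: str, version_arg: str) -> str:
    # cached per absolute path, so two compilers with the same basename
    # in different directories get separate cache entries
    proc = subprocess.run(
        [abs_path, version_arg], capture_output=True, text=True, timeout=120
    )
    return proc.stdout

def version_output(path: str, version_arg: str = "-dumpversion") -> str:
    if not os.path.isabs(path):
        path = shutil.which(path) or path  # resolve before hitting the cache
    return _version_output(path, version_arg)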
def tokenize_flags(flags_values, propagate=False):
"""Given a compiler flag specification as a string, this returns a list
where the entries are the flags. For compiler options which set values
using the syntax "-flag value", this function groups flags and their
values together. Any token not preceded by a "-" is considered the
value of a prior flag."""
tokens = flags_values.split()
if not tokens:
return []
flag = tokens[0]
flags_with_propagation = []
for token in tokens[1:]:
if not token.startswith("-"):
flag += " " + token
else:
flags_with_propagation.append((flag, propagate))
flag = token
flags_with_propagation.append((flag, propagate))
return flags_with_propagation
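Worked example of the grouping rule above: any token not starting with "-" is glued onto the preceding flag. A standalone restatement (the propagation flags are elided for brevity):

def tokenize(flags: str):
    tokens = flags.split()
    if not tokens:
        return []
    grouped, flag = [], tokens[0]
    for token in tokens[1:]:
        if token.startswith("-"):
            grouped.append(flag)
            flag = token
        else:
            flag += " " + token  # value attaches to the prior flag
    grouped.append(flag)
    return grouped

assert tokenize("-O2 -framework Accelerate -g") == ["-O2", "-framework Accelerate", "-g"]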
#: regex for parsing linker lines
_LINKER_LINE = re.compile(r"^( *|.*[/\\])" r"(link|ld|([^/\\]+-)?ld|collect2)" r"[^/\\]*( |$)")
#: components of linker lines to ignore
_LINKER_LINE_IGNORE = re.compile(r"(collect2 version|^[A-Za-z0-9_]+=|/ldfe )")
#: regex to match linker search paths
_LINK_DIR_ARG = re.compile(r"^-L(.:)?(?P<dir>[/\\].*)")
#: regex to match linker library path arguments
_LIBPATH_ARG = re.compile(r"^[-/](LIBPATH|libpath):(?P<dir>.*)")
def _parse_link_paths(string):
"""Parse implicit link paths from compiler debug output.
This gives the compiler runtime library paths that we need to add to
the RPATH of generated binaries and libraries. It allows us to
ensure, e.g., that codes load the right libstdc++ for their compiler.
"""
lib_search_paths = False
raw_link_dirs = []
for line in string.splitlines():
if lib_search_paths:
if line.startswith("\t"):
raw_link_dirs.append(line[1:])
continue
else:
lib_search_paths = False
elif line.startswith("Library search paths:"):
lib_search_paths = True
if not _LINKER_LINE.match(line):
continue
if _LINKER_LINE_IGNORE.match(line):
continue
tty.debug(f"implicit link dirs: link line: {line}")
next_arg = False
for arg in line.split():
if arg in ("-L", "-Y"):
next_arg = True
continue
if next_arg:
raw_link_dirs.append(arg)
next_arg = False
continue
link_dir_arg = _LINK_DIR_ARG.match(arg)
if link_dir_arg:
link_dir = link_dir_arg.group("dir")
raw_link_dirs.append(link_dir)
link_dir_arg = _LIBPATH_ARG.match(arg)
if link_dir_arg:
link_dir = link_dir_arg.group("dir")
raw_link_dirs.append(link_dir)
implicit_link_dirs = list()
visited = set()
for link_dir in raw_link_dirs:
normalized_path = os.path.abspath(link_dir)
if normalized_path not in visited:
implicit_link_dirs.append(normalized_path)
visited.add(normalized_path)
tty.debug(f"implicit link dirs: result: {', '.join(implicit_link_dirs)}")
return implicit_link_dirs
@llnl.path.system_path_filter
def _parse_non_system_link_dirs(string: str) -> List[str]:
"""Parses link paths out of compiler debug output.
Args:
string: compiler debug output as a string
Returns:
Implicit link paths parsed from the compiler output
"""
link_dirs = _parse_link_paths(string)
# Remove directories that do not exist. Some versions of the Cray compiler
# report nonexistent directories
link_dirs = [d for d in link_dirs if os.path.isdir(d)]
# Return set of directories containing needed compiler libs, minus
# system paths. Note that 'filter_system_paths' only checks for an
# exact match, while 'in_system_subdirectory' checks if a path contains
# a system directory as a subdirectory
link_dirs = filter_system_paths(link_dirs)
return list(p for p in link_dirs if not in_system_subdirectory(p))
def in_system_subdirectory(path):
system_dirs = [
"/lib/",
"/lib64/",
"/usr/lib/",
"/usr/lib64/",
"/usr/local/lib/",
"/usr/local/lib64/",
]
return any(path_contains_subdirectory(path, x) for x in system_dirs)
class Compiler:
"""This class encapsulates a Spack "compiler", which includes C,
C++, and Fortran compilers. Subclasses should implement
support for specific compilers, their possible names, arguments,
and how to identify the particular type of compiler."""
# Optional prefix regexes for searching for this type of compiler.
# Prefixes are sometimes used for toolchains
prefixes: List[str] = []
# Optional suffix regexes for searching for this type of compiler.
# Suffixes are used by some frameworks, e.g. macports uses an '-mp-X.Y'
# version suffix for gcc.
suffixes = [r"-.*"]
#: Compiler argument that produces version information
version_argument = "-dumpversion"
#: Return values to ignore when invoking the compiler to get its version
ignore_version_errors: Sequence[int] = ()
#: Regex used to extract version from compiler's output
version_regex = "(.*)"
# These libraries are anticipated to be required by all executables built
# by any compiler
_all_compiler_rpath_libraries = ["libc", "libc++", "libstdc++"]
#: Platform matcher for Platform objects supported by compiler
is_supported_on_platform = lambda x: True
# Default flags used by a compiler to set an rpath
@property
def cc_rpath_arg(self):
return "-Wl,-rpath,"
@property
def cxx_rpath_arg(self):
return "-Wl,-rpath,"
@property
def f77_rpath_arg(self):
return "-Wl,-rpath,"
@property
def fc_rpath_arg(self):
return "-Wl,-rpath,"
@property
def linker_arg(self):
"""Flag that need to be used to pass an argument to the linker."""
return "-Wl,"
@property
def disable_new_dtags(self):
if platform.system() == "Darwin":
return ""
return "--disable-new-dtags"
@property
def enable_new_dtags(self):
if platform.system() == "Darwin":
return ""
return "--enable-new-dtags"
@property
def debug_flags(self):
return ["-g"]
@property
def opt_flags(self):
return ["-O", "-O0", "-O1", "-O2", "-O3"]
def __init__(
self,
cspec,
operating_system,
target,
paths,
modules: Optional[List[str]] = None,
alias=None,
environment=None,
extra_rpaths=None,
enable_implicit_rpaths=None,
**kwargs,
):
self.spec = cspec
self.operating_system = str(operating_system)
self.target = target
self.modules = modules or []
self.alias = alias
self.environment = environment or {}
self.extra_rpaths = extra_rpaths or []
self.enable_implicit_rpaths = enable_implicit_rpaths
self.cache = COMPILER_CACHE
self.cc = paths[0]
self.cxx = paths[1]
self.f77 = None
self.fc = None
if len(paths) > 2:
self.f77 = paths[2]
if len(paths) == 3:
self.fc = self.f77
else:
self.fc = paths[3]
# Unfortunately, we have to make sure these params are accepted
# in the same order they are returned by sorted(flags)
# in compilers/__init__.py
self.flags = spack.spec.FlagMap(self.spec)
for flag in self.flags.valid_compiler_flags():
value = kwargs.get(flag, None)
if value is not None:
values_with_propagation = tokenize_flags(value, False)
for value, propagation in values_with_propagation:
self.flags.add_flag(flag, value, propagation)
# caching value for compiler reported version
# used for version checks for API, e.g. C++11 flag
self._real_version = None
def __eq__(self, other):
return (
self.cc == other.cc
and self.cxx == other.cxx
and self.fc == other.fc
and self.f77 == other.f77
and self.spec == other.spec
and self.operating_system == other.operating_system
and self.target == other.target
and self.flags == other.flags
and self.modules == other.modules
and self.environment == other.environment
and self.extra_rpaths == other.extra_rpaths
and self.enable_implicit_rpaths == other.enable_implicit_rpaths
)
def __hash__(self):
return hash(
(
self.cc,
self.cxx,
self.fc,
self.f77,
self.spec,
self.operating_system,
self.target,
str(self.flags),
str(self.modules),
str(self.environment),
str(self.extra_rpaths),
self.enable_implicit_rpaths,
)
)
def verify_executables(self):
"""Raise an error if any of the compiler executables is not valid.
This method confirms that for all of the compilers (cc, cxx, f77, fc)
that have paths, those paths exist and are executable by the current
user.
Raises a CompilerAccessError if any of the non-null paths for the
compiler are not accessible.
"""
def accessible_exe(exe):
# compilers may contain executable names (on Cray or user edited)
if not os.path.isabs(exe):
exe = spack.util.executable.which_string(exe)
if not exe:
return False
return os.path.isfile(exe) and os.access(exe, os.X_OK)
# setup environment before verifying in case we have executable names
# instead of absolute paths
with self.compiler_environment():
missing = [
cmp
for cmp in (self.cc, self.cxx, self.f77, self.fc)
if cmp and not accessible_exe(cmp)
]
if missing:
raise CompilerAccessError(self, missing)
@property
def version(self):
return self.spec.version
@property
def real_version(self):
"""Executable reported compiler version used for API-determinations
E.g. C++11 flag checks.
"""
real_version_str = self.cache.get(self).real_version
if not real_version_str or real_version_str == "unknown":
return self.version
return spack.version.StandardVersion.from_string(real_version_str)
def implicit_rpaths(self) -> List[str]:
if self.enable_implicit_rpaths is False:
return []
output = self.compiler_verbose_output
if not output:
return []
link_dirs = _parse_non_system_link_dirs(output)
all_required_libs = list(self.required_libs) + Compiler._all_compiler_rpath_libraries
return list(paths_containing_libs(link_dirs, all_required_libs))
@property
def default_dynamic_linker(self) -> Optional[str]:
"""Determine default dynamic linker from compiler link line"""
output = self.compiler_verbose_output
if not output:
return None
return spack.util.libc.parse_dynamic_linker(output)
@property
def default_libc(self) -> Optional["spack.spec.Spec"]:
"""Determine libc targeted by the compiler from link line"""
# technically this should be testing the target platform of the compiler, but we don't have
# that, so stick to host platform for now.
if sys.platform in ("darwin", "win32"):
return None
dynamic_linker = self.default_dynamic_linker
if not dynamic_linker:
return None
return spack.util.libc.libc_from_dynamic_linker(dynamic_linker)
@property
def required_libs(self):
"""For executables created with this compiler, the compiler libraries
that would be generally required to run it.
"""
# By default every compiler returns the empty list
return []
@property
def compiler_verbose_output(self) -> Optional[str]:
"""Verbose output from compiling a dummy C source file. Output is cached."""
return self.cache.get(self).c_compiler_output
def _compile_dummy_c_source(self) -> Optional[str]:
if self.cc:
cc = self.cc
ext = "c"
else:
cc = self.cxx
ext = "cc"
if not cc or not self.verbose_flag:
return None
try:
tmpdir = tempfile.mkdtemp(prefix="spack-implicit-link-info")
fout = os.path.join(tmpdir, "output")
fin = os.path.join(tmpdir, f"main.{ext}")
with open(fin, "w") as csource:
csource.write(
"int main(int argc, char* argv[]) { (void)argc; (void)argv; return 0; }\n"
)
cc_exe = spack.util.executable.Executable(cc)
for flag_type in ["cflags" if cc == self.cc else "cxxflags", "cppflags", "ldflags"]:
cc_exe.add_default_arg(*self.flags.get(flag_type, []))
with self.compiler_environment():
return cc_exe(self.verbose_flag, fin, "-o", fout, output=str, error=str)
except spack.util.executable.ProcessError as pe:
tty.debug("ProcessError: Command exited with non-zero status: " + pe.long_message)
return None
finally:
shutil.rmtree(tmpdir, ignore_errors=True)
@property
def verbose_flag(self) -> Optional[str]:
"""
This property should be overridden in the compiler subclass if a
verbose flag is available.
If it is not overridden, it is assumed to not be supported.
"""
# This property should be overridden in the compiler subclass if
# OpenMP is supported by that compiler
@property
def openmp_flag(self):
# If it is not overridden, assume it is not supported and warn the user
raise UnsupportedCompilerFlag(self, "OpenMP", "openmp_flag")
# This property should be overridden in the compiler subclass if
# C++98 is not the default standard for that compiler
@property
def cxx98_flag(self):
return ""
# This property should be overridden in the compiler subclass if
# C++11 is supported by that compiler
@property
def cxx11_flag(self):
# If it is not overridden, assume it is not supported and warn the user
raise UnsupportedCompilerFlag(self, "the C++11 standard", "cxx11_flag")
# This property should be overridden in the compiler subclass if
# C++14 is supported by that compiler
@property
def cxx14_flag(self):
# If it is not overridden, assume it is not supported and warn the user
raise UnsupportedCompilerFlag(self, "the C++14 standard", "cxx14_flag")
# This property should be overridden in the compiler subclass if
# C++17 is supported by that compiler
@property
def cxx17_flag(self):
# If it is not overridden, assume it is not supported and warn the user
raise UnsupportedCompilerFlag(self, "the C++17 standard", "cxx17_flag")
# This property should be overridden in the compiler subclass if
# C99 is supported by that compiler
@property
def c99_flag(self):
# If it is not overridden, assume it is not supported and warn the user
raise UnsupportedCompilerFlag(self, "the C99 standard", "c99_flag")
# This property should be overridden in the compiler subclass if
# C11 is supported by that compiler
@property
def c11_flag(self):
# If it is not overridden, assume it is not supported and warn the user
raise UnsupportedCompilerFlag(self, "the C11 standard", "c11_flag")
@property
def cc_pic_flag(self):
"""Returns the flag used by the C compiler to produce
Position Independent Code (PIC)."""
return "-fPIC"
@property
def cxx_pic_flag(self):
"""Returns the flag used by the C++ compiler to produce
Position Independent Code (PIC)."""
return "-fPIC"
@property
def f77_pic_flag(self):
"""Returns the flag used by the F77 compiler to produce
Position Independent Code (PIC)."""
return "-fPIC"
@property
def fc_pic_flag(self):
"""Returns the flag used by the FC compiler to produce
Position Independent Code (PIC)."""
return "-fPIC"
# Note: This is not a class method. The class methods are used to detect
# compilers on PATH based systems, and do not set up the run environment of
# the compiler. This method can be called on `module` based systems as well
def get_real_version(self) -> str:
"""Query the compiler for its version.
This is the "real" compiler version, regardless of what is in the
compilers.yaml file, which the user can change to name their compiler.
Use the runtime environment of the compiler (modules and environment
modifications) to enable the compiler to run properly on any platform.
"""
cc = spack.util.executable.Executable(self.cc)
try:
with self.compiler_environment():
output = cc(
self.version_argument,
output=str,
error=str,
ignore_errors=tuple(self.ignore_version_errors),
)
return self.extract_version_from_output(output)
except spack.util.executable.ProcessError:
return "unknown"
@property
def prefix(self):
"""Query the compiler for its install prefix. This is the install
path as reported by the compiler. Note that paths for cc, cxx, etc
are not enough to find the install prefix of the compiler, since
they can be symlinks, wrappers, or filenames instead of absolute paths."""
raise NotImplementedError("prefix is not implemented for this compiler")
#
# Compiler classes have methods for querying the version of
# specific compiler executables. This is used when discovering compilers.
#
# Compiler *instances* are just data objects, and can only be
# constructed from an actual set of executables.
#
@classmethod
def default_version(cls, cc):
"""Override just this to override all compiler version functions."""
output = get_compiler_version_output(
cc, cls.version_argument, tuple(cls.ignore_version_errors)
)
return cls.extract_version_from_output(output)
@classmethod
@llnl.util.lang.memoized
def extract_version_from_output(cls, output: str) -> str:
"""Extracts the version from compiler's output."""
match = re.search(cls.version_regex, output)
return match.group(1) if match else "unknown"
@classmethod
def cc_version(cls, cc):
return cls.default_version(cc)
@classmethod
def search_regexps(cls, language):
# Compile all the regular expressions used for files beforehand.
# This searches for any combination of <prefix><name><suffix>
# defined for the compiler
compiler_names = getattr(cls, "{0}_names".format(language))
prefixes = [""] + cls.prefixes
suffixes = [""]
if sys.platform == "win32":
ext = r"\.(?:exe|bat)"
cls_suf = [suf + ext for suf in cls.suffixes]
ext_suf = [ext]
suffixes = suffixes + cls.suffixes + cls_suf + ext_suf
else:
suffixes = suffixes + cls.suffixes
regexp_fmt = r"^({0}){1}({2})$"
return [
re.compile(regexp_fmt.format(prefix, re.escape(name), suffix))
for prefix, name, suffix in itertools.product(prefixes, compiler_names, suffixes)
]
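The patterns built above have the shape ^(prefix)name(suffix)$ over the cross product of prefixes, names, and suffixes; a quick standalone check with gcc-like names (the "-.*" suffix mirrors the class default, covering e.g. macports-style gcc-12):

import itertools
import re

prefixes, names, suffixes = [""], ["gcc"], ["", r"-.*"]
regexes = [
    re.compile(rf"^({p}){re.escape(n)}({s})$")
    for p, n, s in itertools.product(prefixes, names, suffixes)
]

for exe in ("gcc", "gcc-12", "g++", "gcc-mp-13"):
    print(exe, any(r.match(exe) for r in regexes))
# gcc True, gcc-12 True, g++ False, gcc-mp-13 True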
def setup_custom_environment(self, pkg, env):
"""Set any environment variables necessary to use the compiler."""
pass
def __repr__(self):
"""Return a string representation of the compiler toolchain."""
return self.__str__()
def __str__(self):
"""Return a string representation of the compiler toolchain."""
return "%s(%s)" % (
self.name,
"\n ".join(
(
str(s)
for s in (
self.cc,
self.cxx,
self.f77,
self.fc,
self.modules,
str(self.operating_system),
)
)
),
)
@contextlib.contextmanager
def compiler_environment(self):
# Avoid modifying os.environ if possible.
if not self.modules and not self.environment:
yield
return
# store environment to replace later
backup_env = os.environ.copy()
try:
# load modules and set env variables
for module in self.modules:
spack.util.module_cmd.load_module(module)
# apply other compiler environment changes
spack.schema.environment.parse(self.environment).apply_modifications()
yield
finally:
# Restore environment regardless of whether inner code succeeded
os.environ.clear()
os.environ.update(backup_env)
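compiler_environment above follows the standard backup/mutate/restore shape for process environments; a stripped-down, self-contained version of the same context manager (module loading omitted, plain env overrides only):

import contextlib
import os

@contextlib.contextmanager
def scoped_environment(**overrides):
    backup = os.environ.copy()
    try:
        os.environ.update(overrides)
        yield
    finally:
        # restore even if the body raised
        os.environ.clear()
        os.environ.update(backup)

with scoped_environment(CC="/opt/gcc/bin/gcc"):
    assert os.environ["CC"] == "/opt/gcc/bin/gcc"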
def to_dict(self):
flags_dict = {fname: " ".join(fvals) for fname, fvals in self.flags.items()}
flags_dict.update(
{attr: getattr(self, attr, None) for attr in FLAG_INSTANCE_VARS if hasattr(self, attr)}
)
result = {
"spec": str(self.spec),
"paths": {attr: getattr(self, attr, None) for attr in PATH_INSTANCE_VARS},
"flags": flags_dict,
"operating_system": str(self.operating_system),
"target": str(self.target),
"modules": self.modules or [],
"environment": self.environment or {},
"extra_rpaths": self.extra_rpaths or [],
}
if self.enable_implicit_rpaths is not None:
result["implicit_rpaths"] = self.enable_implicit_rpaths
if self.alias:
result["alias"] = self.alias
return result
class CompilerAccessError(spack.error.SpackError):
def __init__(self, compiler, paths):
msg = "Compiler '%s' has executables that are missing" % compiler.spec
msg += " or are not executable: %s" % paths
super().__init__(msg)
class InvalidCompilerError(spack.error.SpackError):
def __init__(self):
super().__init__("Compiler has no executables.")
class UnsupportedCompilerFlag(spack.error.SpackError):
def __init__(self, compiler, feature, flag_name, ver_string=None):
super().__init__(
"{0} ({1}) does not support {2} (as compiler.{3}).".format(
compiler.name, ver_string if ver_string else compiler.version, feature, flag_name
),
"If you think it should, please edit the compiler.{0} subclass to".format(
compiler.name
)
+ " implement the {0} property and submit a pull request or issue.".format(flag_name),
)
class CompilerCacheEntry:
"""Deserialized cache entry for a compiler"""
__slots__ = ["c_compiler_output", "real_version"]
def __init__(self, c_compiler_output: Optional[str], real_version: str):
self.c_compiler_output = c_compiler_output
self.real_version = real_version
@classmethod
def from_dict(cls, data: Dict[str, Optional[str]]):
if not isinstance(data, dict):
raise ValueError(f"Invalid {cls.__name__} data")
c_compiler_output = data.get("c_compiler_output")
real_version = data.get("real_version")
if not isinstance(real_version, str) or not isinstance(
c_compiler_output, (str, type(None))
):
raise ValueError(f"Invalid {cls.__name__} data")
return cls(c_compiler_output, real_version)
class CompilerCache:
"""Base class for compiler output cache. Default implementation does not cache anything."""
def value(self, compiler: Compiler) -> Dict[str, Optional[str]]:
return {
"c_compiler_output": compiler._compile_dummy_c_source(),
"real_version": compiler.get_real_version(),
}
def get(self, compiler: Compiler) -> CompilerCacheEntry:
return CompilerCacheEntry.from_dict(self.value(compiler))
class FileCompilerCache(CompilerCache):
"""Cache for compiler output, which is used to determine implicit link paths, the default libc
version, and the compiler version."""
name = os.path.join("compilers", "compilers.json")
def __init__(self, cache: "FileCache") -> None:
self.cache = cache
self.cache.init_entry(self.name)
self._data: Dict[str, Dict[str, Optional[str]]] = {}
def _get_entry(self, key: str) -> Optional[CompilerCacheEntry]:
try:
return CompilerCacheEntry.from_dict(self._data[key])
except ValueError:
del self._data[key]
except KeyError:
pass
return None
def get(self, compiler: Compiler) -> CompilerCacheEntry:
# Cache hit
try:
with self.cache.read_transaction(self.name) as f:
assert f is not None
self._data = json.loads(f.read())
assert isinstance(self._data, dict)
except (json.JSONDecodeError, AssertionError):
self._data = {}
key = self._key(compiler)
value = self._get_entry(key)
if value is not None:
return value
# Cache miss
with self.cache.write_transaction(self.name) as (old, new):
try:
assert old is not None
self._data = json.loads(old.read())
assert isinstance(self._data, dict)
except (json.JSONDecodeError, AssertionError):
self._data = {}
# Use cache entry that may have been created by another process in the meantime.
entry = self._get_entry(key)
# Finally compute the cache entry
if entry is None:
self._data[key] = self.value(compiler)
entry = CompilerCacheEntry.from_dict(self._data[key])
new.write(json.dumps(self._data, separators=(",", ":")))
return entry
def _key(self, compiler: Compiler) -> str:
as_bytes = json.dumps(compiler.to_dict(), separators=(",", ":")).encode("utf-8")
return hashlib.sha256(as_bytes).hexdigest()
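_key hashes the compiler's canonical JSON serialization, so any change to paths, flags, modules, or environment produces a fresh cache entry. The same recipe stands alone (sort_keys is added here for illustration, to make the key independent of dict insertion order):

import hashlib
import json

def cache_key(record: dict) -> str:
    # compact separators + sorted keys give a canonical byte string
    as_bytes = json.dumps(record, separators=(",", ":"), sort_keys=True).encode("utf-8")
    return hashlib.sha256(as_bytes).hexdigest()

a = cache_key({"spec": "gcc@12.3.0", "paths": {"cc": "/usr/bin/gcc"}})
b = cache_key({"spec": "gcc@12.3.0", "paths": {"cc": "/opt/gcc/bin/gcc"}})
assert a != b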
def _make_compiler_cache():
return FileCompilerCache(spack.caches.MISC_CACHE)
COMPILER_CACHE: CompilerCache = llnl.util.lang.Singleton(_make_compiler_cache) # type: ignore

View File

@@ -2,836 +2,3 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""This module contains functions related to finding compilers on the
system and configuring Spack to use multiple compilers.
"""
import importlib
import os
import re
import sys
import warnings
from typing import Dict, List, Optional
import archspec.cpu
import llnl.util.filesystem as fs
import llnl.util.lang
import llnl.util.tty as tty
import spack.compiler
import spack.config
import spack.error
import spack.paths
import spack.platforms
import spack.repo
import spack.spec
from spack.operating_systems import windows_os
from spack.util.environment import get_path
from spack.util.naming import mod_to_class
_other_instance_vars = [
"modules",
"operating_system",
"environment",
"implicit_rpaths",
"extra_rpaths",
]
# TODO: Caches at module level make it difficult to mock configurations in
# TODO: unit tests. It might be worth reworking their implementation.
#: cache of compilers constructed from config data, keyed by config entry id.
_compiler_cache: Dict[str, "spack.compiler.Compiler"] = {}
_compiler_to_pkg = {
"clang": "llvm+clang",
"oneapi": "intel-oneapi-compilers",
"rocmcc": "llvm-amdgpu",
"intel@2020:": "intel-oneapi-compilers-classic",
"arm": "acfl",
}
# TODO: generating this from the previous dict causes docs errors
package_name_to_compiler_name = {
"llvm": "clang",
"intel-oneapi-compilers": "oneapi",
"llvm-amdgpu": "rocmcc",
"intel-oneapi-compilers-classic": "intel",
"acfl": "arm",
}
#: Tag used to identify packages providing a compiler
COMPILER_TAG = "compiler"
def pkg_spec_for_compiler(cspec):
"""Return the spec of the package that provides the compiler."""
for spec, package in _compiler_to_pkg.items():
if cspec.satisfies(spec):
spec_str = "%s@%s" % (package, cspec.versions)
break
else:
spec_str = str(cspec)
return spack.spec.parse_with_version_concrete(spec_str)
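pkg_spec_for_compiler rewrites a compiler spec into the spec of the package providing it, using the _compiler_to_pkg table above. A simplified sketch that matches on name only, ignoring the version-range matching that satisfies() performs for entries like "intel@2020:":

COMPILER_TO_PKG = {
    "clang": "llvm+clang",
    "oneapi": "intel-oneapi-compilers",
    "rocmcc": "llvm-amdgpu",
    "arm": "acfl",
}

def pkg_spec_str(name: str, version: str) -> str:
    # illustration only: the real code matches with CompilerSpec.satisfies
    package = COMPILER_TO_PKG.get(name, name)
    return f"{package}@{version}"

assert pkg_spec_str("clang", "15.0.0") == "llvm+clang@15.0.0"
assert pkg_spec_str("gcc", "12.3.0") == "gcc@12.3.0"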
def _auto_compiler_spec(function):
def converter(cspec_like, *args, **kwargs):
if not isinstance(cspec_like, spack.spec.CompilerSpec):
cspec_like = spack.spec.CompilerSpec(cspec_like)
return function(cspec_like, *args, **kwargs)
return converter
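_auto_compiler_spec is a plain argument-coercion decorator: callers may pass either a string or an already-built CompilerSpec. The same shape with a toy target type, made self-contained:

import functools

def auto_int(function):
    @functools.wraps(function)
    def converter(value, *args, **kwargs):
        if not isinstance(value, int):
            value = int(value)  # coerce string input up front, as the spec decorator does
        return function(value, *args, **kwargs)
    return converter

@auto_int
def double(n: int) -> int:
    return 2 * n

assert double("21") == 42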
def _to_dict(compiler):
"""Return a dict version of compiler suitable to insert in YAML."""
return {"compiler": compiler.to_dict()}
def get_compiler_config(
configuration: "spack.config.Configuration",
*,
scope: Optional[str] = None,
init_config: bool = False,
) -> List[Dict]:
"""Return the compiler configuration for the specified architecture."""
config = configuration.get("compilers", scope=scope) or []
if config or not init_config:
return config
merged_config = configuration.get("compilers")
if merged_config:
# Config is empty for this scope
# Do not init config because there is a non-empty scope
return config
find_compilers(scope=scope)
config = configuration.get("compilers", scope=scope)
return config
def get_compiler_config_from_packages(
configuration: "spack.config.Configuration", *, scope: Optional[str] = None
) -> List[Dict]:
"""Return the compiler configuration from packages.yaml"""
packages_yaml = configuration.get("packages", scope=scope)
return CompilerConfigFactory.from_packages_yaml(packages_yaml)
def compiler_config_files():
config_files = list()
config = spack.config.CONFIG
for scope in config.writable_scopes:
name = scope.name
compiler_config = config.get("compilers", scope=name)
if compiler_config:
config_files.append(config.get_config_filename(name, "compilers"))
compiler_config_from_packages = get_compiler_config_from_packages(config, scope=name)
if compiler_config_from_packages:
config_files.append(config.get_config_filename(name, "packages"))
return config_files
def add_compilers_to_config(compilers, scope=None):
"""Add compilers to the config for the specified architecture.
Arguments:
compilers: a list of Compiler objects.
scope: configuration scope to modify.
"""
compiler_config = get_compiler_config(configuration=spack.config.CONFIG, scope=scope)
for compiler in compilers:
if not compiler.cc:
tty.debug(f"{compiler.spec} does not have a C compiler")
if not compiler.cxx:
tty.debug(f"{compiler.spec} does not have a C++ compiler")
if not compiler.f77:
tty.debug(f"{compiler.spec} does not have a Fortran77 compiler")
if not compiler.fc:
tty.debug(f"{compiler.spec} does not have a Fortran compiler")
compiler_config.append(_to_dict(compiler))
spack.config.set("compilers", compiler_config, scope=scope)
@_auto_compiler_spec
def remove_compiler_from_config(compiler_spec, scope=None):
"""Remove compilers from configuration by spec.
If scope is None, all the scopes are searched for removal.
Arguments:
compiler_spec: compiler to be removed
scope: configuration scope to modify
"""
candidate_scopes = [scope]
if scope is None:
candidate_scopes = spack.config.CONFIG.scopes.keys()
removal_happened = False
for current_scope in candidate_scopes:
removal_happened |= _remove_compiler_from_scope(compiler_spec, scope=current_scope)
msg = "`spack compiler remove` will not remove compilers defined in packages.yaml"
msg += "\nTo remove these compilers, either edit the config or use `spack external remove`"
tty.debug(msg)
return removal_happened
def _remove_compiler_from_scope(compiler_spec, scope):
"""Removes a compiler from a specific configuration scope.
Args:
compiler_spec: compiler to be removed
scope: configuration scope under consideration
Returns:
True if one or more compiler entries were actually removed, False otherwise
"""
assert scope is not None, "a specific scope is needed when calling this function"
compiler_config = get_compiler_config(configuration=spack.config.CONFIG, scope=scope)
filtered_compiler_config = [
compiler_entry
for compiler_entry in compiler_config
if not spack.spec.parse_with_version_concrete(
compiler_entry["compiler"]["spec"], compiler=True
).satisfies(compiler_spec)
]
if len(filtered_compiler_config) == len(compiler_config):
return False
# We need to preserve the YAML type for comments, hence we are copying the
# items in the list that has just been retrieved
compiler_config[:] = filtered_compiler_config
spack.config.CONFIG.set("compilers", compiler_config, scope=scope)
return True
def all_compilers_config(
configuration: "spack.config.Configuration",
*,
scope: Optional[str] = None,
init_config: bool = True,
) -> List["spack.compiler.Compiler"]:
"""Return a set of specs for all the compiler versions currently
available to build with. These are instances of CompilerSpec.
"""
from_packages_yaml = get_compiler_config_from_packages(configuration, scope=scope)
if from_packages_yaml:
init_config = False
from_compilers_yaml = get_compiler_config(configuration, scope=scope, init_config=init_config)
result = from_compilers_yaml + from_packages_yaml
# Dedupe entries by the compiler they represent
# If the entry is invalid, treat it as unique for deduplication
key = lambda c: _compiler_from_config_entry(c["compiler"]) or id(c)
return list(llnl.util.lang.dedupe(result, key=key))
def all_compiler_specs(scope=None, init_config=True):
# Return compiler specs from the merged config.
return [
spack.spec.parse_with_version_concrete(s["compiler"]["spec"], compiler=True)
for s in all_compilers_config(spack.config.CONFIG, scope=scope, init_config=init_config)
]
def find_compilers(
path_hints: Optional[List[str]] = None,
*,
scope: Optional[str] = None,
mixed_toolchain: bool = False,
max_workers: Optional[int] = None,
) -> List["spack.compiler.Compiler"]:
"""Searches for compiler in the paths given as argument. If any new compiler is found, the
configuration is updated, and the list of new compiler objects is returned.
Args:
path_hints: list of path hints where to look for. A sensible default based on the ``PATH``
environment variable will be used if the value is None
scope: configuration scope to modify
mixed_toolchain: allow mixing compilers from different toolchains if otherwise missing for
a certain language
max_workers: number of processes used to search for compilers
"""
import spack.detection
known_compilers = set(all_compilers(init_config=False))
if path_hints is None:
path_hints = get_path("PATH")
default_paths = fs.search_paths_for_executables(*path_hints)
if sys.platform == "win32":
default_paths.extend(windows_os.WindowsOs().compiler_search_paths)
compiler_pkgs = spack.repo.PATH.packages_with_tags(COMPILER_TAG, full=True)
detected_packages = spack.detection.by_path(
compiler_pkgs, path_hints=default_paths, max_workers=max_workers
)
valid_compilers = {}
for name, detected in detected_packages.items():
compilers = [x for x in detected if CompilerConfigFactory.from_external_spec(x)]
if not compilers:
continue
valid_compilers[name] = compilers
def _has_fortran_compilers(x):
if "compilers" not in x.extra_attributes:
return False
return "fortran" in x.extra_attributes["compilers"]
if mixed_toolchain:
gccs = [x for x in valid_compilers.get("gcc", []) if _has_fortran_compilers(x)]
if gccs:
best_gcc = sorted(
gccs, key=lambda x: spack.spec.parse_with_version_concrete(x).version
)[-1]
gfortran = best_gcc.extra_attributes["compilers"]["fortran"]
for name in ("llvm", "apple-clang"):
if name not in valid_compilers:
continue
candidates = valid_compilers[name]
for candidate in candidates:
if _has_fortran_compilers(candidate):
continue
candidate.extra_attributes["compilers"]["fortran"] = gfortran
new_compilers = []
for name, detected in valid_compilers.items():
for config in CompilerConfigFactory.from_specs(detected):
c = _compiler_from_config_entry(config["compiler"])
if c in known_compilers:
continue
new_compilers.append(c)
add_compilers_to_config(new_compilers, scope=scope)
return new_compilers
def select_new_compilers(compilers, scope=None):
"""Given a list of compilers, remove those that are already defined in
the configuration.
"""
compilers_not_in_config = []
for c in compilers:
arch_spec = spack.spec.ArchSpec((None, c.operating_system, c.target))
same_specs = compilers_for_spec(
c.spec, arch_spec=arch_spec, scope=scope, init_config=False
)
if not same_specs:
compilers_not_in_config.append(c)
return compilers_not_in_config
def supported_compilers() -> List[str]:
"""Return a set of names of compilers supported by Spack.
See available_compilers() to get a list of all the available
versions of supported compilers.
"""
# Hack to be able to call the compiler `apple-clang` while still
# using a valid python name for the module
return sorted(all_compiler_names())
def supported_compilers_for_host_platform() -> List[str]:
"""Return a set of compiler class objects supported by Spack
that are also supported by the current host platform
"""
host_plat = spack.platforms.real_host()
return supported_compilers_for_platform(host_plat)
def supported_compilers_for_platform(platform: "spack.platforms.Platform") -> List[str]:
"""Return a set of compiler class objects supported by Spack
that are also supported by the provided platform
Args:
platform (str): string representation of platform
for which compiler compatability should be determined
"""
return [
name
for name in supported_compilers()
if class_for_compiler_name(name).is_supported_on_platform(platform)
]
def all_compiler_names() -> List[str]:
def replace_apple_clang(name):
return name if name != "apple_clang" else "apple-clang"
return [replace_apple_clang(name) for name in all_compiler_module_names()]
@llnl.util.lang.memoized
def all_compiler_module_names() -> List[str]:
return list(llnl.util.lang.list_modules(spack.paths.compilers_path))
@_auto_compiler_spec
def supported(compiler_spec):
"""Test if a particular compiler is supported."""
return compiler_spec.name in supported_compilers()
@_auto_compiler_spec
def find(compiler_spec, scope=None, init_config=True):
"""Return specs of available compilers that match the supplied
compiler spec. Return an empty list if nothing found."""
return [c for c in all_compiler_specs(scope, init_config) if c.satisfies(compiler_spec)]
@_auto_compiler_spec
def find_specs_by_arch(compiler_spec, arch_spec, scope=None, init_config=True):
"""Return specs of available compilers that match the supplied
compiler spec. Return an empty list if nothing found."""
return [
c.spec
for c in compilers_for_spec(
compiler_spec, arch_spec=arch_spec, scope=scope, init_config=init_config
)
]
def all_compilers(scope=None, init_config=True):
return all_compilers_from(
configuration=spack.config.CONFIG, scope=scope, init_config=init_config
)
def all_compilers_from(configuration, scope=None, init_config=True):
compilers = []
for items in all_compilers_config(
configuration=configuration, scope=scope, init_config=init_config
):
items = items["compiler"]
compiler = _compiler_from_config_entry(items) # can be None in error case
if compiler:
compilers.append(compiler)
return compilers
@_auto_compiler_spec
def compilers_for_spec(compiler_spec, *, arch_spec=None, scope=None, init_config=True):
"""This gets all compilers that satisfy the supplied CompilerSpec.
Returns an empty list if none are found.
"""
config = all_compilers_config(spack.config.CONFIG, scope=scope, init_config=init_config)
matches = set(find(compiler_spec, scope, init_config))
compilers = []
for cspec in matches:
compilers.extend(get_compilers(config, cspec, arch_spec))
return compilers
def compilers_for_arch(arch_spec, scope=None):
config = all_compilers_config(spack.config.CONFIG, scope=scope, init_config=False)
return list(get_compilers(config, arch_spec=arch_spec))
def compiler_specs_for_arch(arch_spec, scope=None):
return [c.spec for c in compilers_for_arch(arch_spec, scope)]
class CacheReference:
"""This acts as a hashable reference to any object (regardless of whether
the object itself is hashable) and also prevents the object from being
garbage-collected (so if two CacheReference objects are equal, they
will refer to the same object, since it will not have been gc'ed since
the creation of the first CacheReference).
"""
def __init__(self, val):
self.val = val
self.id = id(val)
def __hash__(self):
return self.id
def __eq__(self, other):
return isinstance(other, CacheReference) and self.id == other.id
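CacheReference gives identity-based hashing to otherwise-unhashable config dicts, while holding a reference so the id() value cannot be recycled by the garbage collector. Repeating the class here makes the check below self-contained:

class CacheReference:
    def __init__(self, val):
        self.val = val          # keeps val alive, pinning its id()
        self.id = id(val)

    def __hash__(self):
        return self.id

    def __eq__(self, other):
        return isinstance(other, CacheReference) and self.id == other.id

entry = {"spec": "gcc@12.3.0"}  # dicts themselves cannot be dict keys
cache = {CacheReference(entry): "compiler-object"}
assert cache[CacheReference(entry)] == "compiler-object"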
def compiler_from_dict(items):
cspec = spack.spec.parse_with_version_concrete(items["spec"], compiler=True)
os = items.get("operating_system", None)
target = items.get("target", None)
if not (
"paths" in items and all(n in items["paths"] for n in spack.compiler.PATH_INSTANCE_VARS)
):
raise InvalidCompilerConfigurationError(cspec)
cls = class_for_compiler_name(cspec.name)
compiler_paths = []
for c in spack.compiler.PATH_INSTANCE_VARS:
compiler_path = items["paths"][c]
if compiler_path != "None":
compiler_paths.append(compiler_path)
else:
compiler_paths.append(None)
mods = items.get("modules")
if mods == "None":
mods = []
alias = items.get("alias", None)
compiler_flags = items.get("flags", {})
environment = items.get("environment", {})
extra_rpaths = items.get("extra_rpaths", [])
implicit_rpaths = items.get("implicit_rpaths", None)
# Starting with c22a145, 'implicit_rpaths' was a list. Now it is a
# boolean which can be set by the user to disable all automatic
# RPATH insertion of compiler libraries
if implicit_rpaths is not None and not isinstance(implicit_rpaths, bool):
implicit_rpaths = None
return cls(
cspec,
os,
target,
compiler_paths,
mods,
alias,
environment,
extra_rpaths,
enable_implicit_rpaths=implicit_rpaths,
**compiler_flags,
)
def _compiler_from_config_entry(items):
"""Note this is intended for internal use only. To avoid re-parsing
the same config dictionary this keeps track of its location in
memory. If you provide the same dictionary twice it will return
the same Compiler object (regardless of whether the dictionary
entries have changed).
"""
config_id = CacheReference(items)
compiler = _compiler_cache.get(config_id, None)
if compiler is None:
try:
compiler = compiler_from_dict(items)
except UnknownCompilerError as e:
warnings.warn(e.message)
_compiler_cache[config_id] = compiler
return compiler
def get_compilers(config, cspec=None, arch_spec=None):
compilers = []
for items in config:
items = items["compiler"]
# We might use equality here.
if cspec and not spack.spec.parse_with_version_concrete(
items["spec"], compiler=True
).satisfies(cspec):
continue
# If an arch spec is given, confirm that this compiler
# is for the given operating system
os = items.get("operating_system", None)
if arch_spec and os != arch_spec.os:
continue
# If an arch spec is given, confirm that this compiler
# is for the given target. If the target is 'any', match
# any given arch spec. If the compiler has no assigned
# target this is an old compiler config file, skip this logic.
target = items.get("target", None)
try:
current_target = archspec.cpu.TARGETS[str(arch_spec.target)]
family = str(current_target.family)
except KeyError:
# TODO: Check if this exception handling makes sense, or if we
# TODO: need to change / refactor tests
family = str(arch_spec.target)
except AttributeError:
assert arch_spec is None
if arch_spec and target and (target != family and target != "any"):
# If the family of the target is the family we are seeking,
# there's an error in the underlying configuration
if archspec.cpu.TARGETS[target].family == family:
msg = (
'the "target" field in compilers.yaml accepts only '
'target families [replace "{0}" with "{1}"'
' in "{2}" specification]'
)
msg = msg.format(str(target), family, items.get("spec", "??"))
raise ValueError(msg)
continue
compiler = _compiler_from_config_entry(items)
if compiler:
compilers.append(compiler)
return compilers
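The target logic above compares a compiler entry's "target" field against the family of the requested microarchitecture; both lookups come from archspec, e.g.:

import archspec.cpu

target = archspec.cpu.TARGETS["icelake"]
print(target.family)  # x86_64 -- compilers.yaml targets are families, not uarchs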
@_auto_compiler_spec
def compiler_for_spec(compiler_spec, arch_spec):
"""Get the compiler that satisfies compiler_spec. compiler_spec must
be concrete."""
assert compiler_spec.concrete
assert arch_spec.concrete
compilers = compilers_for_spec(compiler_spec, arch_spec=arch_spec)
if len(compilers) < 1:
raise NoCompilerForSpecError(compiler_spec, arch_spec.os)
if len(compilers) > 1:
msg = "Multiple definitions of compiler %s " % compiler_spec
msg += "for architecture %s:\n %s" % (arch_spec, compilers)
tty.debug(msg)
return compilers[0]
@llnl.util.lang.memoized
def class_for_compiler_name(compiler_name):
"""Given a compiler module name, get the corresponding Compiler class."""
if not supported(compiler_name):
raise UnknownCompilerError(compiler_name)
# Hack to be able to call the compiler `apple-clang` while still
# using a valid python name for the module
submodule_name = compiler_name
if compiler_name == "apple-clang":
submodule_name = compiler_name.replace("-", "_")
module_name = ".".join(["spack", "compilers", submodule_name])
module_obj = importlib.import_module(module_name)
cls = getattr(module_obj, mod_to_class(compiler_name))
# make a note of the name in the module so we can get to it easily.
cls.name = compiler_name
return cls
def all_compiler_types():
return [class_for_compiler_name(c) for c in supported_compilers()]
def is_mixed_toolchain(compiler):
"""Returns True if the current compiler is a mixed toolchain,
False otherwise.
Args:
compiler (spack.compiler.Compiler): a valid compiler object
"""
import spack.detection.path
executables = [
os.path.basename(compiler.cc or ""),
os.path.basename(compiler.cxx or ""),
os.path.basename(compiler.f77 or ""),
os.path.basename(compiler.fc or ""),
]
toolchains = set()
finder = spack.detection.path.ExecutablesFinder()
for pkg_name in spack.repo.PATH.packages_with_tags(COMPILER_TAG):
pkg_cls = spack.repo.PATH.get_pkg_class(pkg_name)
patterns = finder.search_patterns(pkg=pkg_cls)
if not patterns:
continue
joined_pattern = re.compile(r"|".join(patterns))
if any(joined_pattern.search(exe) for exe in executables):
tty.debug(f"[TOOLCHAIN] MATCH {pkg_name}")
toolchains.add(pkg_name)
if len(toolchains) > 1:
if (
toolchains == {"llvm", "apple-clang", "aocc"}
# Msvc toolchain uses Intel ifx
or toolchains == {"msvc", "intel-oneapi-compilers"}
):
return False
tty.debug("[TOOLCHAINS] {0}".format(toolchains))
return True
return False
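# Illustrative example (assumed search patterns, not in the original source):
# a compiler with cc="gcc" and fc="flang" would typically match the patterns
# of both the "gcc" and "llvm" packages, so toolchains becomes
# {"gcc", "llvm"} and the function returns True.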
_EXTRA_ATTRIBUTES_KEY = "extra_attributes"
_COMPILERS_KEY = "compilers"
_C_KEY = "c"
_CXX_KEY, _FORTRAN_KEY = "cxx", "fortran"
class CompilerConfigFactory:
"""Class aggregating all ways of constructing a list of compiler config entries."""
@staticmethod
def from_specs(specs: List["spack.spec.Spec"]) -> List[dict]:
result = []
compiler_package_names = supported_compilers() + list(package_name_to_compiler_name.keys())
for s in specs:
if s.name not in compiler_package_names:
continue
candidate = CompilerConfigFactory.from_external_spec(s)
if candidate is None:
continue
result.append(candidate)
return result
@staticmethod
def from_packages_yaml(packages_yaml) -> List[dict]:
compiler_specs = []
compiler_package_names = supported_compilers() + list(package_name_to_compiler_name.keys())
for name, entry in packages_yaml.items():
if name not in compiler_package_names:
continue
externals_config = entry.get("externals", None)
if not externals_config:
continue
current_specs = []
for current_external in externals_config:
compiler = CompilerConfigFactory._spec_from_external_config(current_external)
if compiler:
current_specs.append(compiler)
compiler_specs.extend(current_specs)
return CompilerConfigFactory.from_specs(compiler_specs)
@staticmethod
def _spec_from_external_config(config):
# Allow `@x.y.z` instead of `@=x.y.z`
err_header = f"The external spec '{config['spec']}' cannot be used as a compiler"
# If 'extra_attributes' is missing, this entry may not be meant to be used as
# a compiler, so just leave a debug message rather than a loud warning.
if _EXTRA_ATTRIBUTES_KEY not in config:
tty.debug(f"[{__file__}] {err_header}: missing the '{_EXTRA_ATTRIBUTES_KEY}' key")
return None
extra_attributes = config[_EXTRA_ATTRIBUTES_KEY]
result = spack.spec.Spec(
str(spack.spec.parse_with_version_concrete(config["spec"])),
external_modules=config.get("modules"),
)
result.extra_attributes = extra_attributes
return result
@staticmethod
def from_external_spec(spec: "spack.spec.Spec") -> Optional[dict]:
spec = spack.spec.parse_with_version_concrete(spec)
extra_attributes = getattr(spec, _EXTRA_ATTRIBUTES_KEY, None)
if extra_attributes is None:
return None
paths = CompilerConfigFactory._extract_compiler_paths(spec)
if paths is None:
return None
compiler_spec = spack.spec.CompilerSpec(
package_name_to_compiler_name.get(spec.name, spec.name), spec.version
)
operating_system, target = CompilerConfigFactory._extract_os_and_target(spec)
compiler_entry = {
"compiler": {
"spec": str(compiler_spec),
"paths": paths,
"flags": extra_attributes.get("flags", {}),
"operating_system": str(operating_system),
"target": str(target.family),
"modules": getattr(spec, "external_modules", []),
"environment": extra_attributes.get("environment", {}),
"extra_rpaths": extra_attributes.get("extra_rpaths", []),
"implicit_rpaths": extra_attributes.get("implicit_rpaths", None),
}
}
return compiler_entry
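# Sketch of the entry produced for a hypothetical external gcc (all values
# below are illustrative, not from the original source):
# {"compiler": {"spec": "gcc@=12.3.0",
#               "paths": {"cc": "/usr/bin/gcc-12", "cxx": "/usr/bin/g++-12",
#                         "fc": "/usr/bin/gfortran-12", "f77": "/usr/bin/gfortran-12"},
#               "operating_system": "ubuntu22.04", "target": "x86_64", ...}}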
@staticmethod
def _extract_compiler_paths(spec: "spack.spec.Spec") -> Optional[Dict[str, str]]:
err_header = f"The external spec '{spec}' cannot be used as a compiler"
extra_attributes = spec.extra_attributes
# If 'extra_attributes' is present, warn when the 'compilers' key is missing
# or when no C compiler is given
if _COMPILERS_KEY not in extra_attributes:
warnings.warn(
f"{err_header}: missing the '{_COMPILERS_KEY}' key under '{_EXTRA_ATTRIBUTES_KEY}'"
)
return None
attribute_compilers = extra_attributes[_COMPILERS_KEY]
if _C_KEY not in attribute_compilers:
warnings.warn(
f"{err_header}: missing the C compiler path under "
f"'{_EXTRA_ATTRIBUTES_KEY}:{_COMPILERS_KEY}'"
)
return None
c_compiler = attribute_compilers[_C_KEY]
# C++ and Fortran compilers are not mandatory, so let's just leave a debug trace
if _CXX_KEY not in attribute_compilers:
tty.debug(f"[{__file__}] The external spec {spec} does not have a C++ compiler")
if _FORTRAN_KEY not in attribute_compilers:
tty.debug(f"[{__file__}] The external spec {spec} does not have a Fortran compiler")
# The compilers format uses cc/cxx/fc/f77, while the externals format uses "c"/"cxx"/"fortran"
return {
"cc": c_compiler,
"cxx": attribute_compilers.get(_CXX_KEY, None),
"fc": attribute_compilers.get(_FORTRAN_KEY, None),
"f77": attribute_compilers.get(_FORTRAN_KEY, None),
}
@staticmethod
def _extract_os_and_target(spec: "spack.spec.Spec"):
if not spec.architecture:
host_platform = spack.platforms.host()
operating_system = host_platform.operating_system("default_os")
target = host_platform.target("default_target")
else:
target = spec.architecture.target
if not target:
target = spack.platforms.host().target("default_target")
operating_system = spec.os
if not operating_system:
host_platform = spack.platforms.host()
operating_system = host_platform.operating_system("default_os")
return operating_system, target
class InvalidCompilerConfigurationError(spack.error.SpackError):
def __init__(self, compiler_spec):
super().__init__(
f'Invalid configuration for [compiler "{compiler_spec}"]: ',
f"Compiler configuration must contain entries for "
f"all compilers: {spack.compiler.PATH_INSTANCE_VARS}",
)
class UnknownCompilerError(spack.error.SpackError):
def __init__(self, compiler_name):
super().__init__("Spack doesn't support the requested compiler: {0}".format(compiler_name))
class NoCompilerForSpecError(spack.error.SpackError):
def __init__(self, compiler_spec, target):
super().__init__(
"No compilers for operating system %s satisfy spec %s" % (target, compiler_spec)
)


@@ -0,0 +1,212 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import enum
import typing
from typing import Dict, List
from llnl.util import lang
from .libraries import CompilerPropertyDetector
if typing.TYPE_CHECKING:
import spack.spec
class Languages(enum.Enum):
C = "c"
CXX = "cxx"
FORTRAN = "fortran"
class CompilerAdaptor:
def __init__(
self, compiled_spec: "spack.spec.Spec", compilers: Dict[Languages, "spack.spec.Spec"]
) -> None:
if not compilers:
raise AttributeError(f"{compiled_spec} has no 'compiler' attribute")
self.compilers = compilers
self.compiled_spec = compiled_spec
def _lang_exists_or_raise(self, name: str, *, lang: Languages) -> None:
if lang not in self.compilers:
raise AttributeError(
f"'{self.compiled_spec}' has no {lang.value} compiler, so the "
f"'{name}' property cannot be retrieved"
)
def _maybe_return_attribute(self, name: str, *, lang: Languages) -> str:
self._lang_exists_or_raise(name, lang=lang)
return getattr(self.compilers[lang].package, name)
@property
def cc_rpath_arg(self) -> str:
self._lang_exists_or_raise("cc_rpath_arg", lang=Languages.C)
return self.compilers[Languages.C].package.rpath_arg
@property
def cxx_rpath_arg(self) -> str:
self._lang_exists_or_raise("cxx_rpath_arg", lang=Languages.CXX)
return self.compilers[Languages.CXX].package.rpath_arg
@property
def fc_rpath_arg(self) -> str:
self._lang_exists_or_raise("fc_rpath_arg", lang=Languages.FORTRAN)
return self.compilers[Languages.FORTRAN].package.rpath_arg
@property
def f77_rpath_arg(self) -> str:
self._lang_exists_or_raise("f77_rpath_arg", lang=Languages.FORTRAN)
return self.compilers[Languages.FORTRAN].package.rpath_arg
@property
def linker_arg(self) -> str:
return self._maybe_return_attribute("linker_arg", lang=Languages.C)
@property
def name(self):
return next(iter(self.compilers.values())).name
@property
def version(self):
return next(iter(self.compilers.values())).version
def implicit_rpaths(self) -> List[str]:
result, seen = [], set()
for compiler in self.compilers.values():
if compiler in seen:
continue
seen.add(compiler)
result.extend(CompilerPropertyDetector(compiler).implicit_rpaths())
return result
@property
def openmp_flag(self) -> str:
return next(iter(self.compilers.values())).package.openmp_flag
@property
def cxx98_flag(self) -> str:
return self.compilers[Languages.CXX].package.standard_flag(
language=Languages.CXX.value, standard="98"
)
@property
def cxx11_flag(self) -> str:
return self.compilers[Languages.CXX].package.standard_flag(
language=Languages.CXX.value, standard="11"
)
@property
def cxx14_flag(self) -> str:
return self.compilers[Languages.CXX].package.standard_flag(
language=Languages.CXX.value, standard="14"
)
@property
def cxx17_flag(self) -> str:
return self.compilers[Languages.CXX].package.standard_flag(
language=Languages.CXX.value, standard="17"
)
@property
def cxx20_flag(self) -> str:
return self.compilers[Languages.CXX].package.standard_flag(
language=Languages.CXX.value, standard="20"
)
@property
def cxx23_flag(self) -> str:
return self.compilers[Languages.CXX].package.standard_flag(
language=Languages.CXX.value, standard="23"
)
@property
def c99_flag(self) -> str:
return self.compilers[Languages.C].package.standard_flag(
language=Languages.C.value, standard="99"
)
@property
def c11_flag(self) -> str:
return self.compilers[Languages.C].package.standard_flag(
language=Languages.C.value, standard="11"
)
@property
def c17_flag(self) -> str:
return self.compilers[Languages.C].package.standard_flag(
language=Languages.C.value, standard="17"
)
@property
def c23_flag(self) -> str:
return self.compilers[Languages.C].package.standard_flag(
language=Languages.C.value, standard="23"
)
@property
def cc_pic_flag(self) -> str:
self._lang_exists_or_raise("cc_pic_flag", lang=Languages.C)
return self.compilers[Languages.C].package.pic_flag
@property
def cxx_pic_flag(self) -> str:
self._lang_exists_or_raise("cxx_pic_flag", lang=Languages.CXX)
return self.compilers[Languages.CXX].package.pic_flag
@property
def fc_pic_flag(self) -> str:
self._lang_exists_or_raise("fc_pic_flag", lang=Languages.FORTRAN)
return self.compilers[Languages.FORTRAN].package.pic_flag
@property
def f77_pic_flag(self) -> str:
self._lang_exists_or_raise("f77_pic_flag", lang=Languages.FORTRAN)
return self.compilers[Languages.FORTRAN].package.pic_flag
@property
def prefix(self) -> str:
return next(iter(self.compilers.values())).prefix
@property
def extra_rpaths(self) -> List[str]:
compiler = next(iter(self.compilers.values()))
return getattr(compiler, "extra_attributes", {}).get("extra_rpaths", [])
@property
def cc(self):
return self._maybe_return_attribute("cc", lang=Languages.C)
@property
def cxx(self):
return self._maybe_return_attribute("cxx", lang=Languages.CXX)
@property
def fc(self):
self._lang_exists_or_raise("fc", lang=Languages.FORTRAN)
return self.compilers[Languages.FORTRAN].package.fortran
@property
def f77(self):
self._lang_exists_or_raise("f77", lang=Languages.FORTRAN)
return self.compilers[Languages.FORTRAN].package.fortran
class DeprecatedCompiler(lang.DeprecatedProperty):
def __init__(self) -> None:
super().__init__(name="compiler")
def factory(self, instance, owner) -> CompilerAdaptor:
spec = instance.spec
if not spec.concrete:
raise ValueError("Can only get a compiler for a concrete package.")
compilers = {}
for language in Languages:
deps = spec.dependencies(virtuals=[language.value])
if deps:
compilers[language] = deps[0]
return CompilerAdaptor(instance, compilers)
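# Illustrative usage (assumption, not in the original source): for a concrete
# spec whose "c" virtual is provided by a gcc dependency,
#     pkg.compiler.cc
# resolves to the C compiler path exposed by that dependency's package, while
# accessing e.g. pkg.compiler.fc without a Fortran provider raises AttributeError.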


@@ -1,120 +0,0 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import re
import llnl.util.lang
from spack.compiler import Compiler
from spack.version import ver
class Aocc(Compiler):
version_argument = "--version"
@property
def debug_flags(self):
return [
"-gcodeview",
"-gdwarf-2",
"-gdwarf-3",
"-gdwarf-4",
"-gdwarf-5",
"-gline-tables-only",
"-gmodules",
"-g",
]
@property
def opt_flags(self):
return ["-O0", "-O1", "-O2", "-O3", "-Ofast", "-Os", "-Oz", "-Og", "-O", "-O4"]
@property
def link_paths(self):
link_paths = {
"cc": os.path.join("aocc", "clang"),
"cxx": os.path.join("aocc", "clang++"),
"f77": os.path.join("aocc", "flang"),
"fc": os.path.join("aocc", "flang"),
}
return link_paths
@property
def verbose_flag(self):
return "-v"
@property
def openmp_flag(self):
return "-fopenmp"
@property
def cxx11_flag(self):
return "-std=c++11"
@property
def cxx14_flag(self):
return "-std=c++14"
@property
def cxx17_flag(self):
return "-std=c++17"
@property
def c99_flag(self):
return "-std=c99"
@property
def c11_flag(self):
return "-std=c11"
@property
def cc_pic_flag(self):
return "-fPIC"
@property
def cxx_pic_flag(self):
return "-fPIC"
@property
def f77_pic_flag(self):
return "-fPIC"
@property
def fc_pic_flag(self):
return "-fPIC"
required_libs = ["libclang"]
@classmethod
@llnl.util.lang.memoized
def extract_version_from_output(cls, output):
match = re.search(r"AOCC_(\d+)[._](\d+)[._](\d+)", output)
if match:
return ".".join(match.groups())
return "unknown"
@property
def stdcxx_libs(self):
return ("-lstdc++",)
@property
def cflags(self):
return self._handle_default_flag_additions()
@property
def cxxflags(self):
return self._handle_default_flag_additions()
@property
def fflags(self):
return self._handle_default_flag_additions()
def _handle_default_flag_additions(self):
# This is a known issue for AOCC 3.0; see:
# https://developer.amd.com/wp-content/resources/AOCC-3.0-Install-Guide.pdf
if self.version.satisfies(ver("3.0.0")):
return "-Wno-unused-command-line-argument -mllvm -eliminate-similar-expr=false"


@@ -1,116 +0,0 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import re
import llnl.util.lang
import spack.compiler
import spack.compilers.clang
from spack.version import Version
class AppleClang(spack.compilers.clang.Clang):
openmp_flag = "-Xpreprocessor -fopenmp"
@classmethod
@llnl.util.lang.memoized
def extract_version_from_output(cls, output):
ver = "unknown"
match = re.search(
# Apple's LLVM compiler has its own versions, so suffix them.
r"^Apple (?:LLVM|clang) version ([^ )]+)",
output,
# Multi-line, since 'Apple clang' may not be on the first line
# in particular, when run as gcc, it seems to output
# "Configured with: --prefix=..." as the first line
re.M,
)
if match:
ver = match.group(match.lastindex)
return ver
# C++ flags based on CMake Modules/Compiler/AppleClang-CXX.cmake
@property
def cxx11_flag(self):
# Spack's AppleClang detection only valid from Xcode >= 4.6
if self.real_version < Version("4.0"):
raise spack.compiler.UnsupportedCompilerFlag(
self, "the C++11 standard", "cxx11_flag", "Xcode < 4.0"
)
return "-std=c++11"
@property
def cxx14_flag(self):
if self.real_version < Version("5.1"):
raise spack.compiler.UnsupportedCompilerFlag(
self, "the C++14 standard", "cxx14_flag", "Xcode < 5.1"
)
elif self.real_version < Version("6.1"):
return "-std=c++1y"
return "-std=c++14"
@property
def cxx17_flag(self):
if self.real_version < Version("6.1"):
raise spack.compiler.UnsupportedCompilerFlag(
self, "the C++17 standard", "cxx17_flag", "Xcode < 6.1"
)
elif self.real_version < Version("10.0"):
return "-std=c++1z"
return "-std=c++17"
@property
def cxx20_flag(self):
if self.real_version < Version("10.0"):
raise spack.compiler.UnsupportedCompilerFlag(
self, "the C++20 standard", "cxx20_flag", "Xcode < 10.0"
)
elif self.real_version < Version("13.0"):
return "-std=c++2a"
return "-std=c++20"
@property
def cxx23_flag(self):
if self.real_version < Version("13.0"):
raise spack.compiler.UnsupportedCompilerFlag(
self, "the C++23 standard", "cxx23_flag", "Xcode < 13.0"
)
return "-std=c++2b"
# C flags based on CMake Modules/Compiler/AppleClang-C.cmake
@property
def c99_flag(self):
if self.real_version < Version("4.0"):
raise spack.compiler.UnsupportedCompilerFlag(
self, "the C99 standard", "c99_flag", "< 4.0"
)
return "-std=c99"
@property
def c11_flag(self):
if self.real_version < Version("4.0"):
raise spack.compiler.UnsupportedCompilerFlag(
self, "the C11 standard", "c11_flag", "< 4.0"
)
return "-std=c11"
@property
def c17_flag(self):
if self.real_version < Version("11.0"):
raise spack.compiler.UnsupportedCompilerFlag(
self, "the C17 standard", "c17_flag", "< 11.0"
)
return "-std=c17"
@property
def c23_flag(self):
if self.real_version < Version("11.0.3"):
raise spack.compiler.UnsupportedCompilerFlag(
self, "the C23 standard", "c23_flag", "< 11.0.3"
)
return "-std=c2x"


@@ -1,80 +0,0 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import spack.compiler
class Arm(spack.compiler.Compiler):
# Named wrapper links within lib/spack/env
link_paths = {
"cc": os.path.join("arm", "armclang"),
"cxx": os.path.join("arm", "armclang++"),
"f77": os.path.join("arm", "armflang"),
"fc": os.path.join("arm", "armflang"),
}
# The ``--version`` option seems to be the most consistent one for
# arm compilers. Output looks like this:
#
# $ arm<c/f>lang --version
# Arm C/C++/Fortran Compiler version 19.0 (build number 73) (based on LLVM 7.0.2)
# Target: aarch64--linux-gnu
# Thread model: posix
# InstalledDir:
# /opt/arm/arm-hpc-compiler-19.0_Generic-AArch64_RHEL-7_aarch64-linux/bin
version_argument = "--version"
version_regex = r"Arm C\/C\+\+\/Fortran Compiler version ([\d\.]+) "
@property
def verbose_flag(self):
return "-v"
@property
def opt_flags(self):
return ["-O", "-O0", "-O1", "-O2", "-O3", "-Ofast"]
@property
def openmp_flag(self):
return "-fopenmp"
@property
def cxx11_flag(self):
return "-std=c++11"
@property
def cxx14_flag(self):
return "-std=c++14"
@property
def cxx17_flag(self):
return "-std=c++1z"
@property
def c99_flag(self):
return "-std=c99"
@property
def c11_flag(self):
return "-std=c11"
@property
def cc_pic_flag(self):
return "-fPIC"
@property
def cxx_pic_flag(self):
return "-fPIC"
@property
def f77_pic_flag(self):
return "-fPIC"
@property
def fc_pic_flag(self):
return "-fPIC"
required_libs = ["libclang", "libflang"]


@@ -1,128 +0,0 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
from spack.compiler import Compiler, UnsupportedCompilerFlag
from spack.version import Version
class Cce(Compiler):
"""Cray compiler environment compiler."""
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
# For old cray compilers on module based systems we replace
# ``version_argument`` with the old value. Cannot be a property
# as the new value is used in classmethods for path-based detection
if not self.is_clang_based:
self.version_argument = "-V"
# MacPorts builds gcc versions with prefixes and -mp-X.Y suffixes.
suffixes = [r"-mp-\d\.\d"]
@property
def link_paths(self):
if any("PrgEnv-cray" in m for m in self.modules):
# Old module-based interface to cray compilers
return {
"cc": os.path.join("cce", "cc"),
"cxx": os.path.join("case-insensitive", "CC"),
"f77": os.path.join("cce", "ftn"),
"fc": os.path.join("cce", "ftn"),
}
return {
"cc": os.path.join("cce", "craycc"),
"cxx": os.path.join("cce", "case-insensitive", "crayCC"),
"f77": os.path.join("cce", "crayftn"),
"fc": os.path.join("cce", "crayftn"),
}
@property
def is_clang_based(self):
version = self._real_version or self.version
return version >= Version("9.0") and "classic" not in str(version)
version_argument = "--version"
version_regex = r"[Cc]ray (?:clang|C :|C\+\+ :|Fortran :) [Vv]ersion.*?(\d+(\.\d+)+)"
@property
def verbose_flag(self):
return "-v"
@property
def debug_flags(self):
return ["-g", "-G0", "-G1", "-G2", "-Gfast"]
@property
def openmp_flag(self):
if self.is_clang_based:
return "-fopenmp"
return "-h omp"
@property
def cxx11_flag(self):
if self.is_clang_based:
return "-std=c++11"
return "-h std=c++11"
@property
def cxx14_flag(self):
if self.is_clang_based:
return "-std=c++14"
return "-h std=c++14"
@property
def cxx17_flag(self):
if self.is_clang_based:
return "-std=c++17"
@property
def c99_flag(self):
if self.is_clang_based:
return "-std=c99"
elif self.real_version >= Version("8.4"):
return "-h std=c99,noconform,gnu"
elif self.real_version >= Version("8.1"):
return "-h c99,noconform,gnu"
raise UnsupportedCompilerFlag(self, "the C99 standard", "c99_flag", "< 8.1")
@property
def c11_flag(self):
if self.is_clang_based:
return "-std=c11"
elif self.real_version >= Version("8.5"):
return "-h std=c11,noconform,gnu"
raise UnsupportedCompilerFlag(self, "the C11 standard", "c11_flag", "< 8.5")
@property
def cc_pic_flag(self):
if self.is_clang_based:
return "-fPIC"
return "-h PIC"
@property
def cxx_pic_flag(self):
if self.is_clang_based:
return "-fPIC"
return "-h PIC"
@property
def f77_pic_flag(self):
if self.is_clang_based:
return "-fPIC"
return "-h PIC"
@property
def fc_pic_flag(self):
if self.is_clang_based:
return "-fPIC"
return "-h PIC"
@property
def stdcxx_libs(self):
# Cray compiler wrappers link to the standard C++ library
# without additional flags.
return ()


@@ -1,192 +0,0 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import re
import llnl.util.lang
from spack.compiler import Compiler, UnsupportedCompilerFlag
from spack.version import Version
#: compiler symlink mappings for mixed f77 compilers
f77_mapping = [
("gfortran", os.path.join("clang", "gfortran")),
("xlf_r", os.path.join("xl_r", "xlf_r")),
("xlf", os.path.join("xl", "xlf")),
("ifort", os.path.join("intel", "ifort")),
]
#: compiler symlink mappings for mixed f90/fc compilers
fc_mapping = [
("gfortran", os.path.join("clang", "gfortran")),
("xlf90_r", os.path.join("xl_r", "xlf90_r")),
("xlf90", os.path.join("xl", "xlf90")),
("ifort", os.path.join("intel", "ifort")),
]
class Clang(Compiler):
version_argument = "--version"
@property
def debug_flags(self):
return [
"-gcodeview",
"-gdwarf-2",
"-gdwarf-3",
"-gdwarf-4",
"-gdwarf-5",
"-gline-tables-only",
"-gmodules",
"-g",
]
@property
def opt_flags(self):
return ["-O0", "-O1", "-O2", "-O3", "-Ofast", "-Os", "-Oz", "-Og", "-O", "-O4"]
# Clang has support for using different fortran compilers with the
# clang executable.
@property
def link_paths(self):
# clang links are always the same
link_paths = {
"cc": os.path.join("clang", "clang"),
"cxx": os.path.join("clang", "clang++"),
}
# fortran links need to look at the actual compiler names from
# compilers.yaml to figure out which named symlink to use
for compiler_name, link_path in f77_mapping:
if self.f77 and compiler_name in self.f77:
link_paths["f77"] = link_path
break
else:
link_paths["f77"] = os.path.join("clang", "flang")
for compiler_name, link_path in fc_mapping:
if self.fc and compiler_name in self.fc:
link_paths["fc"] = link_path
break
else:
link_paths["fc"] = os.path.join("clang", "flang")
return link_paths
@property
def verbose_flag(self):
return "-v"
openmp_flag = "-fopenmp"
# C++ flags based on CMake Modules/Compiler/Clang.cmake
@property
def cxx11_flag(self):
if self.real_version < Version("3.3"):
raise UnsupportedCompilerFlag(self, "the C++11 standard", "cxx11_flag", "< 3.3")
return "-std=c++11"
@property
def cxx14_flag(self):
if self.real_version < Version("3.4"):
raise UnsupportedCompilerFlag(self, "the C++14 standard", "cxx14_flag", "< 3.5")
elif self.real_version < Version("3.5"):
return "-std=c++1y"
return "-std=c++14"
@property
def cxx17_flag(self):
if self.real_version < Version("3.5"):
raise UnsupportedCompilerFlag(self, "the C++17 standard", "cxx17_flag", "< 3.5")
elif self.real_version < Version("5.0"):
return "-std=c++1z"
return "-std=c++17"
@property
def cxx20_flag(self):
if self.real_version < Version("5.0"):
raise UnsupportedCompilerFlag(self, "the C++20 standard", "cxx20_flag", "< 5.0")
elif self.real_version < Version("11.0"):
return "-std=c++2a"
else:
return "-std=c++20"
@property
def cxx23_flag(self):
if self.real_version < Version("12.0"):
raise UnsupportedCompilerFlag(self, "the C++23 standard", "cxx23_flag", "< 12.0")
elif self.real_version < Version("17.0"):
return "-std=c++2b"
else:
return "-std=c++23"
@property
def c99_flag(self):
return "-std=c99"
@property
def c11_flag(self):
if self.real_version < Version("3.0"):
raise UnsupportedCompilerFlag(self, "the C11 standard", "c11_flag", "< 3.0")
if self.real_version < Version("3.1"):
return "-std=c1x"
return "-std=c11"
@property
def c17_flag(self):
if self.real_version < Version("6.0"):
raise UnsupportedCompilerFlag(self, "the C17 standard", "c17_flag", "< 6.0")
return "-std=c17"
@property
def c23_flag(self):
if self.real_version < Version("9.0"):
raise UnsupportedCompilerFlag(self, "the C23 standard", "c23_flag", "< 9.0")
elif self.real_version < Version("18.0"):
return "-std=c2x"
else:
return "-std=c23"
@property
def cc_pic_flag(self):
return "-fPIC"
@property
def cxx_pic_flag(self):
return "-fPIC"
@property
def f77_pic_flag(self):
return "-fPIC"
@property
def fc_pic_flag(self):
return "-fPIC"
required_libs = ["libclang"]
@classmethod
@llnl.util.lang.memoized
def extract_version_from_output(cls, output):
ver = "unknown"
if ("Apple" in output) or ("AMD" in output):
return ver
match = re.search(
# Normal clang compiler versions are left as-is
r"(?:clang|flang-new) version ([^ )\n]+)-svn[~.\w\d-]*|"
# Don't include hyphenated patch numbers in the version
# (see https://github.com/spack/spack/pull/14365 for details)
r"(?:clang|flang-new) version ([^ )\n]+?)-[~.\w\d-]*|"
r"(?:clang|flang-new) version ([^ )\n]+)",
output,
)
if match:
ver = match.group(match.lastindex)
return ver


@@ -0,0 +1,427 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""This module contains functions related to finding compilers on the system,
and configuring Spack to use multiple compilers.
"""
import os
import re
import sys
import warnings
from typing import Any, Dict, List, Optional, Tuple
import archspec.cpu
import llnl.util.filesystem as fs
import llnl.util.lang
import llnl.util.tty as tty
import spack.config
import spack.detection
import spack.error
import spack.platforms
import spack.repo
import spack.spec
from spack.operating_systems import windows_os
from spack.util.environment import get_path
package_name_to_compiler_name = {
"llvm": "clang",
"intel-oneapi-compilers": "oneapi",
"llvm-amdgpu": "rocmcc",
"intel-oneapi-compilers-classic": "intel",
"acfl": "arm",
}
#: Tag used to identify packages providing a compiler
COMPILER_TAG = "compiler"
def compiler_config_files():
config_files = []
configuration = spack.config.CONFIG
for scope in configuration.writable_scopes:
name = scope.name
from_packages_yaml = CompilerFactory.from_packages_yaml(configuration, scope=name)
if from_packages_yaml:
config_files.append(configuration.get_config_filename(name, "packages"))
compiler_config = configuration.get("compilers", scope=name)
if compiler_config:
config_files.append(configuration.get_config_filename(name, "compilers"))
return config_files
def add_compiler_to_config(new_compilers, *, scope=None) -> None:
"""Add a Compiler object to the configuration, at the required scope."""
# FIXME (compiler as nodes): still needed to read Cray manifest
by_name: Dict[str, List["spack.spec.Spec"]] = {}
for x in new_compilers:
by_name.setdefault(x.name, []).append(x)
spack.detection.update_configuration(by_name, buildable=True, scope=scope)
def find_compilers(
path_hints: Optional[List[str]] = None,
*,
scope: Optional[str] = None,
max_workers: Optional[int] = None,
) -> List["spack.spec.Spec"]:
"""Searches for compiler in the paths given as argument. If any new compiler is found, the
configuration is updated, and the list of new compiler objects is returned.
Args:
path_hints: list of path hints where to look for. A sensible default based on the ``PATH``
environment variable will be used if the value is None
scope: configuration scope to modify
max_workers: number of processes used to search for compilers
"""
if path_hints is None:
path_hints = get_path("PATH")
default_paths = fs.search_paths_for_executables(*path_hints)
if sys.platform == "win32":
default_paths.extend(windows_os.WindowsOs().compiler_search_paths)
compiler_pkgs = spack.repo.PATH.packages_with_tags(COMPILER_TAG, full=True)
detected_packages = spack.detection.by_path(
compiler_pkgs, path_hints=default_paths, max_workers=max_workers
)
new_compilers = spack.detection.update_configuration(
detected_packages, buildable=True, scope=scope
)
return new_compilers
def select_new_compilers(
candidates: List["spack.spec.Spec"], *, scope: Optional[str] = None
) -> List["spack.spec.Spec"]:
"""Given a list of compilers, remove those that are already defined in
the configuration.
"""
compilers_in_config = all_compilers_from(configuration=spack.config.CONFIG, scope=scope)
return [c for c in candidates if c not in compilers_in_config]
def supported_compilers() -> List[str]:
"""Returns all the currently supported compiler packages"""
return sorted(spack.repo.PATH.packages_with_tags(COMPILER_TAG))
def all_compilers(
scope: Optional[str] = None, init_config: bool = True
) -> List["spack.spec.Spec"]:
"""Returns all the compilers from the current global configuration.
Args:
scope: configuration scope from which to extract the compilers. If None, the merged
configuration is used.
init_config: if True, search for compilers if none are found in the configuration.
"""
compilers = all_compilers_from(configuration=spack.config.CONFIG, scope=scope)
if not compilers and init_config:
find_compilers(scope=scope)
compilers = all_compilers_from(configuration=spack.config.CONFIG, scope=scope)
return compilers
def all_compilers_from(
configuration: "spack.config.ConfigurationType", scope: Optional[str] = None
) -> List["spack.spec.Spec"]:
"""Returns all the compilers from the current global configuration.
Args:
configuration: configuration to be queried
scope: configuration scope from which to extract the compilers. If None, the merged
configuration is used.
"""
compilers = CompilerFactory.from_packages_yaml(configuration, scope=scope)
if os.environ.get("SPACK_EXPERIMENTAL_DEPRECATE_COMPILERS_YAML") != "1":
legacy_compilers = CompilerFactory.from_compilers_yaml(configuration, scope=scope)
if legacy_compilers:
# FIXME (compiler as nodes): write how to update the file. Maybe an ad-hoc command
warnings.warn(
"Some compilers are still defined in 'compilers.yaml', which has been deprecated "
"in v0.23. Those configuration files will be ignored from Spack v0.25.\n"
)
for legacy in legacy_compilers:
if not any(c.satisfies(f"{legacy.name}@{legacy.versions}") for c in compilers):
compilers.append(legacy)
return compilers
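# Illustrative note (not in the original source): exporting
#     SPACK_EXPERIMENTAL_DEPRECATE_COMPILERS_YAML=1
# skips the legacy compilers.yaml entries entirely, so only the externals
# declared in packages.yaml are returned.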
class CompilerRemover:
"""Removes compiler from configuration."""
def __init__(self, configuration: "spack.config.ConfigurationType") -> None:
self.configuration = configuration
self.marked_packages_yaml: List[Tuple[str, Any]] = []
self.marked_compilers_yaml: List[Tuple[str, Any]] = []
def mark_compilers(
self, *, match: str, scope: Optional[str] = None
) -> List["spack.spec.Spec"]:
"""Marks compilers to be removed in configuration, and returns a corresponding list
of specs.
Args:
match: constraint that the compiler must match to be removed.
scope: scope from which to remove the compiler. If None, all writable scopes are checked.
"""
self.marked_packages_yaml = []
self.marked_compilers_yaml = []
candidate_scopes = [scope]
if scope is None:
candidate_scopes = [x.name for x in self.configuration.writable_scopes]
all_removals = self._mark_in_packages_yaml(match, candidate_scopes)
all_removals.extend(self._mark_in_compilers_yaml(match, candidate_scopes))
return all_removals
def _mark_in_packages_yaml(self, match, candidate_scopes):
compiler_package_names = supported_compilers()
all_removals = []
for current_scope in candidate_scopes:
packages_yaml = self.configuration.get("packages", scope=current_scope)
if not packages_yaml:
continue
removed_from_scope = []
for name, entry in packages_yaml.items():
if name not in compiler_package_names:
continue
externals_config = entry.get("externals", None)
if not externals_config:
continue
def _partition_match(external_yaml):
s = CompilerFactory.from_external_yaml(external_yaml)
return not s.satisfies(match)
to_keep, to_remove = llnl.util.lang.stable_partition(
externals_config, _partition_match
)
if not to_remove:
continue
removed_from_scope.extend(to_remove)
entry["externals"] = to_keep
if not removed_from_scope:
continue
self.marked_packages_yaml.append((current_scope, packages_yaml))
all_removals.extend(
[CompilerFactory.from_external_yaml(x) for x in removed_from_scope]
)
return all_removals
def _mark_in_compilers_yaml(self, match, candidate_scopes):
if os.environ.get("SPACK_EXPERIMENTAL_DEPRECATE_COMPILERS_YAML") == "1":
return []
all_removals = []
for current_scope in candidate_scopes:
compilers_yaml = self.configuration.get("compilers", scope=current_scope)
if not compilers_yaml:
continue
def _partition_match(entry):
external_specs = CompilerFactory.from_legacy_yaml(entry["compiler"])
return not any(x.satisfies(match) for x in external_specs)
to_keep, to_remove = llnl.util.lang.stable_partition(compilers_yaml, _partition_match)
if not to_remove:
continue
compilers_yaml[:] = to_keep
self.marked_compilers_yaml.append((current_scope, compilers_yaml))
for entry in to_remove:
all_removals.extend(CompilerFactory.from_legacy_yaml(entry["compiler"]))
return all_removals
def flush(self):
"""Removes from configuration the specs that have been marked by the previous call
of ``remove_compilers``.
"""
for scope, packages_yaml in self.marked_packages_yaml:
self.configuration.set("packages", packages_yaml, scope=scope)
for scope, compilers_yaml in self.marked_compilers_yaml:
self.configuration.set("compilers", compilers_yaml, scope=scope)
def compilers_for_spec(compiler_spec, *, arch_spec=None, scope=None, init_config=True):
"""This gets all compilers that satisfy the supplied CompilerSpec.
Returns an empty list if none are found.
"""
# FIXME (compiler as nodes): to be removed, or reimplemented
raise NotImplementedError("still to be implemented")
def compilers_for_arch(
arch_spec: "spack.spec.ArchSpec", *, scope: Optional[str] = None
) -> List["spack.spec.Spec"]:
"""Returns the compilers that can be used on the input architecture"""
compilers = all_compilers_from(spack.config.CONFIG, scope=scope)
query = f"platform={arch_spec.platform} target=:{arch_spec.target}"
return [x for x in compilers if x.satisfies(query)]
def class_for_compiler_name(compiler_name):
"""Given a compiler module name, get the corresponding Compiler class."""
# FIXME (compiler as nodes): to be removed, or reimplemented
raise NotImplementedError("still to be implemented")
_EXTRA_ATTRIBUTES_KEY = "extra_attributes"
_COMPILERS_KEY = "compilers"
_C_KEY = "c"
_CXX_KEY, _FORTRAN_KEY = "cxx", "fortran"
def name_os_target(spec: "spack.spec.Spec") -> Tuple[str, str, str]:
if not spec.architecture:
host_platform = spack.platforms.host()
operating_system = host_platform.operating_system("default_os")
target = host_platform.target("default_target")
else:
target = spec.architecture.target
if not target:
target = spack.platforms.host().target("default_target")
operating_system = spec.os
if not operating_system:
host_platform = spack.platforms.host()
operating_system = host_platform.operating_system("default_os")
return spec.name, str(operating_system), str(target)
class CompilerFactory:
"""Class aggregating all ways of constructing a list of compiler specs from config entries."""
_PACKAGES_YAML_CACHE: Dict[str, Optional["spack.spec.Spec"]] = {}
_COMPILERS_YAML_CACHE: Dict[str, List["spack.spec.Spec"]] = {}
_GENERIC_TARGET = None
@staticmethod
def from_packages_yaml(
configuration: "spack.config.ConfigurationType", *, scope: Optional[str] = None
) -> List["spack.spec.Spec"]:
"""Returns the compiler specs defined in the "packages" section of the configuration"""
compilers = []
compiler_package_names = supported_compilers()
packages_yaml = configuration.get("packages", scope=scope)
for name, entry in packages_yaml.items():
if name not in compiler_package_names:
continue
externals_config = entry.get("externals", None)
if not externals_config:
continue
compiler_specs = []
for current_external in externals_config:
key = str(current_external)
if key not in CompilerFactory._PACKAGES_YAML_CACHE:
CompilerFactory._PACKAGES_YAML_CACHE[key] = CompilerFactory.from_external_yaml(
current_external
)
compiler = CompilerFactory._PACKAGES_YAML_CACHE[key]
if compiler:
compiler_specs.append(compiler)
compilers.extend(compiler_specs)
return compilers
@staticmethod
def from_external_yaml(config: Dict[str, Any]) -> Optional["spack.spec.Spec"]:
"""Returns a compiler spec from an external definition from packages.yaml."""
# Allow `@x.y.z` instead of `@=x.y.z`
err_header = f"The external spec '{config['spec']}' cannot be used as a compiler"
# If 'extra_attributes' is missing, this entry may not be meant to be used as
# a compiler, so just leave a debug message rather than a loud warning.
if _EXTRA_ATTRIBUTES_KEY not in config:
tty.debug(f"[{__file__}] {err_header}: missing the '{_EXTRA_ATTRIBUTES_KEY}' key")
return None
extra_attributes = config[_EXTRA_ATTRIBUTES_KEY]
result = spack.spec.Spec(
str(spack.spec.parse_with_version_concrete(config["spec"])),
external_path=config.get("prefix"),
external_modules=config.get("modules"),
)
result.extra_attributes = extra_attributes
CompilerFactory._finalize_external_concretization(result)
return result
@staticmethod
def _finalize_external_concretization(abstract_spec):
if CompilerFactory._GENERIC_TARGET is None:
CompilerFactory._GENERIC_TARGET = archspec.cpu.host().family
if abstract_spec.architecture:
abstract_spec.architecture.complete_with_defaults()
else:
abstract_spec.constrain(spack.spec.Spec.default_arch())
abstract_spec.architecture.target = CompilerFactory._GENERIC_TARGET
abstract_spec._finalize_concretization()
@staticmethod
def from_legacy_yaml(compiler_dict: Dict[str, Any]) -> List["spack.spec.Spec"]:
"""Returns a list of external specs, corresponding to a compiler entry
from compilers.yaml.
"""
from spack.detection.path import ExecutablesFinder
# FIXME (compiler as nodes): should we look at targets too?
result = []
candidate_paths = [x for x in compiler_dict["paths"].values() if x is not None]
finder = ExecutablesFinder()
for pkg_name in spack.repo.PATH.packages_with_tags("compiler"):
pkg_cls = spack.repo.PATH.get_pkg_class(pkg_name)
pattern = re.compile(r"|".join(finder.search_patterns(pkg=pkg_cls)))
filtered_paths = [x for x in candidate_paths if pattern.search(os.path.basename(x))]
detected = finder.detect_specs(pkg=pkg_cls, paths=filtered_paths)
result.extend(detected)
for item in result:
CompilerFactory._finalize_external_concretization(item)
return result
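# Hypothetical example (not in the original source): a legacy entry with
#     paths: {"cc": "/usr/bin/gcc-12", "cxx": "/usr/bin/g++-12", "f77": None, "fc": None}
# keeps only the non-None paths, filters them against each compiler package's
# executable patterns, and re-detects them as external specs (here, a gcc spec).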
@staticmethod
def from_compilers_yaml(
configuration: "spack.config.ConfigurationType", *, scope: Optional[str] = None
) -> List["spack.spec.Spec"]:
"""Returns the compiler specs defined in the "compilers" section of the configuration"""
result: List["spack.spec.Spec"] = []
for item in configuration.get("compilers", scope=scope):
key = str(item)
if key not in CompilerFactory._COMPILERS_YAML_CACHE:
CompilerFactory._COMPILERS_YAML_CACHE[key] = CompilerFactory.from_legacy_yaml(
item["compiler"]
)
result.extend(CompilerFactory._COMPILERS_YAML_CACHE[key])
return result
class UnknownCompilerError(spack.error.SpackError):
def __init__(self, compiler_name):
super().__init__(f"Spack doesn't support the requested compiler: {compiler_name}")


@@ -0,0 +1,19 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from ..error import SpackError
class CompilerAccessError(SpackError):
def __init__(self, compiler, paths):
super().__init__(
f"Compiler '{compiler.spec}' has executables that are missing"
f" or are not executable: {paths}"
)
class UnsupportedCompilerFlag(SpackError):
"""Raised when a compiler does not support a flag type (e.g. a flag to enforce a
language standard).
"""


@@ -1,79 +0,0 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import spack.compiler
class Fj(spack.compiler.Compiler):
# Named wrapper links within build_env_path
link_paths = {
"cc": os.path.join("fj", "fcc"),
"cxx": os.path.join("fj", "case-insensitive", "FCC"),
"f77": os.path.join("fj", "frt"),
"fc": os.path.join("fj", "frt"),
}
version_argument = "--version"
version_regex = r"\((?:FCC|FRT)\) ([a-z\d.]+)"
required_libs = ["libfj90i", "libfj90f", "libfjsrcinfo"]
@property
def verbose_flag(self):
return "-v"
@property
def debug_flags(self):
return "-g"
@property
def opt_flags(self):
return ["-O0", "-O1", "-O2", "-O3", "-Ofast"]
@property
def openmp_flag(self):
return "-Kopenmp"
@property
def cxx98_flag(self):
return "-std=c++98"
@property
def cxx11_flag(self):
return "-std=c++11"
@property
def cxx14_flag(self):
return "-std=c++14"
@property
def cxx17_flag(self):
return "-std=c++17"
@property
def c99_flag(self):
return "-std=c99"
@property
def c11_flag(self):
return "-std=c11"
@property
def cc_pic_flag(self):
return "-KPIC"
@property
def cxx_pic_flag(self):
return "-KPIC"
@property
def f77_pic_flag(self):
return "-KPIC"
@property
def fc_pic_flag(self):
return "-KPIC"


@@ -0,0 +1,26 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from typing import List, Tuple
def tokenize_flags(flags_values: str, propagate: bool = False) -> List[Tuple[str, bool]]:
"""Given a compiler flag specification as a string, this returns a list
where the entries are the flags. For compiler options which set values
using the syntax "-flag value", this function groups flags and their
values together. Any token not preceded by a "-" is considered the
value of a prior flag."""
tokens = flags_values.split()
if not tokens:
return []
flag = tokens[0]
flags_with_propagation = []
for token in tokens[1:]:
if not token.startswith("-"):
flag += " " + token
else:
flags_with_propagation.append((flag, propagate))
flag = token
flags_with_propagation.append((flag, propagate))
return flags_with_propagation
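# Example (illustrative, not in the original source):
#     tokenize_flags("-O2 -I /usr/include -fopenmp")
# groups "-I" with its value and yields
#     [("-O2", False), ("-I /usr/include", False), ("-fopenmp", False)]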


@@ -1,191 +0,0 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
from llnl.util.filesystem import ancestor
import spack.compiler
import spack.compilers.apple_clang as apple_clang
import spack.util.executable
from spack.version import Version
class Gcc(spack.compiler.Compiler):
# MacPorts builds gcc versions with prefixes and -mp-X or -mp-X.Y suffixes.
# Homebrew and Linuxbrew may build gcc with -X, -X.Y suffixes.
# Old compatibility versions may contain XY suffixes.
suffixes = [r"-mp-\d+(?:\.\d+)?", r"-\d+(?:\.\d+)?", r"\d\d"]
# Named wrapper links within build_env_path
link_paths = {
"cc": os.path.join("gcc", "gcc"),
"cxx": os.path.join("gcc", "g++"),
"f77": os.path.join("gcc", "gfortran"),
"fc": os.path.join("gcc", "gfortran"),
}
@property
def verbose_flag(self):
return "-v"
@property
def debug_flags(self):
return ["-g", "-gstabs+", "-gstabs", "-gxcoff+", "-gxcoff", "-gvms"]
@property
def opt_flags(self):
return ["-O", "-O0", "-O1", "-O2", "-O3", "-Os", "-Ofast", "-Og"]
@property
def openmp_flag(self):
return "-fopenmp"
@property
def cxx98_flag(self):
if self.real_version < Version("6.0"):
return ""
else:
return "-std=c++98"
@property
def cxx11_flag(self):
if self.real_version < Version("4.3"):
raise spack.compiler.UnsupportedCompilerFlag(
self, "the C++11 standard", "cxx11_flag", " < 4.3"
)
elif self.real_version < Version("4.7"):
return "-std=c++0x"
else:
return "-std=c++11"
@property
def cxx14_flag(self):
if self.real_version < Version("4.8"):
raise spack.compiler.UnsupportedCompilerFlag(
self, "the C++14 standard", "cxx14_flag", "< 4.8"
)
elif self.real_version < Version("4.9"):
return "-std=c++1y"
else:
return "-std=c++14"
@property
def cxx17_flag(self):
if self.real_version < Version("5.0"):
raise spack.compiler.UnsupportedCompilerFlag(
self, "the C++17 standard", "cxx17_flag", "< 5.0"
)
elif self.real_version < Version("6.0"):
return "-std=c++1z"
else:
return "-std=c++17"
@property
def cxx20_flag(self):
if self.real_version < Version("8.0"):
raise spack.compiler.UnsupportedCompilerFlag(
self, "the C++20 standard", "cxx20_flag", "< 8.0"
)
elif self.real_version < Version("11.0"):
return "-std=c++2a"
else:
return "-std=c++20"
@property
def cxx23_flag(self):
if self.real_version < Version("11.0"):
raise spack.compiler.UnsupportedCompilerFlag(
self, "the C++23 standard", "cxx23_flag", "< 11.0"
)
elif self.real_version < Version("14.0"):
return "-std=c++2b"
else:
return "-std=c++23"
@property
def c99_flag(self):
if self.real_version < Version("4.5"):
raise spack.compiler.UnsupportedCompilerFlag(
self, "the C99 standard", "c99_flag", "< 4.5"
)
return "-std=c99"
@property
def c11_flag(self):
if self.real_version < Version("4.7"):
raise spack.compiler.UnsupportedCompilerFlag(
self, "the C11 standard", "c11_flag", "< 4.7"
)
return "-std=c11"
@property
def cc_pic_flag(self):
return "-fPIC"
@property
def cxx_pic_flag(self):
return "-fPIC"
@property
def f77_pic_flag(self):
return "-fPIC"
@property
def fc_pic_flag(self):
return "-fPIC"
required_libs = ["libgcc", "libgfortran"]
@classmethod
def default_version(cls, cc):
"""Older versions of gcc use the ``-dumpversion`` option.
Output looks like this::
4.4.7
In GCC 7, this option was changed to only return the major
version of the compiler::
7
A new ``-dumpfullversion`` option was added that gives us
what we want::
7.2.0
"""
# Apple's gcc is actually apple clang, so skip it. Returning
# "unknown" ensures this compiler is not detected by default.
# Users can add it manually to compilers.yaml at their own risk.
if apple_clang.AppleClang.default_version(cc) != "unknown":
return "unknown"
version = super(Gcc, cls).default_version(cc)
if Version(version) >= Version("7"):
output = spack.compiler.get_compiler_version_output(cc, "-dumpfullversion")
version = cls.extract_version_from_output(output)
return version
@property
def stdcxx_libs(self):
return ("-lstdc++",)
@property
def prefix(self):
# GCC reports its install prefix when running ``-print-search-dirs``
# on the first line ``install: <prefix>``.
cc = spack.util.executable.Executable(self.cc)
with self.compiler_environment():
gcc_output = cc("-print-search-dirs", output=str, error=str)
for line in gcc_output.splitlines():
if line.startswith("install:"):
gcc_prefix = line.split(":")[1].strip()
# Go from <prefix>/lib/gcc/<triplet>/<version>/ to <prefix>
return ancestor(gcc_prefix, 4)
raise RuntimeError(
"could not find install prefix of GCC from output:\n\t{}".format(gcc_output)
)


@@ -1,131 +0,0 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import sys
from spack.compiler import Compiler, UnsupportedCompilerFlag
from spack.version import Version
class Intel(Compiler):
# Named wrapper links within build_env_path
link_paths = {
"cc": os.path.join("intel", "icc"),
"cxx": os.path.join("intel", "icpc"),
"f77": os.path.join("intel", "ifort"),
"fc": os.path.join("intel", "ifort"),
}
if sys.platform == "win32":
version_argument = "/QV"
else:
version_argument = "--version"
if sys.platform == "win32":
version_regex = r"([1-9][0-9]*\.[0-9]*\.[0-9]*)"
else:
version_regex = r"\((?:IFORT|ICC)\) ([^ ]+)"
@property
def verbose_flag(self):
return "-v"
required_libs = ["libirc", "libifcore", "libifcoremt", "libirng"]
@property
def debug_flags(self):
return ["-debug", "-g", "-g0", "-g1", "-g2", "-g3"]
@property
def opt_flags(self):
return ["-O", "-O0", "-O1", "-O2", "-O3", "-Ofast", "-Os"]
@property
def openmp_flag(self):
if self.real_version < Version("16.0"):
return "-openmp"
else:
return "-qopenmp"
@property
def cxx11_flag(self):
if self.real_version < Version("11.1"):
raise UnsupportedCompilerFlag(self, "the C++11 standard", "cxx11_flag", "< 11.1")
elif self.real_version < Version("13"):
return "-std=c++0x"
else:
return "-std=c++11"
@property
def cxx14_flag(self):
# Adapted from CMake's Intel-CXX rules.
if self.real_version < Version("15"):
raise UnsupportedCompilerFlag(self, "the C++14 standard", "cxx14_flag", "< 15")
elif self.real_version < Version("15.0.2"):
return "-std=c++1y"
else:
return "-std=c++14"
@property
def cxx17_flag(self):
# https://www.intel.com/content/www/us/en/developer/articles/news/c17-features-supported-by-c-compiler.html
if self.real_version < Version("19"):
raise UnsupportedCompilerFlag(self, "the C++17 standard", "cxx17_flag", "< 19")
else:
return "-std=c++17"
@property
def c99_flag(self):
if self.real_version < Version("12"):
raise UnsupportedCompilerFlag(self, "the C99 standard", "c99_flag", "< 12")
else:
return "-std=c99"
@property
def c11_flag(self):
if self.real_version < Version("16"):
raise UnsupportedCompilerFlag(self, "the C11 standard", "c11_flag", "< 16")
else:
return "-std=c1x"
@property
def c18_flag(self):
# c18 supported since oneapi 2022, which is classic version 2021.5.0
if self.real_version < Version("21.5.0"):
raise UnsupportedCompilerFlag(self, "the C18 standard", "c18_flag", "< 21.5.0")
else:
return "-std=c18"
@property
def cc_pic_flag(self):
return "-fPIC"
@property
def cxx_pic_flag(self):
return "-fPIC"
@property
def f77_pic_flag(self):
return "-fPIC"
@property
def fc_pic_flag(self):
return "-fPIC"
@property
def stdcxx_libs(self):
return ("-cxxlib",)
def setup_custom_environment(self, pkg, env):
# Edge cases for Intel's oneAPI compilers when using the legacy classic compilers:
# Always pass flags to disable deprecation warnings, since these warnings can
# confuse tools that parse the output of compiler commands (e.g. version checks).
if self.real_version >= Version("2021") and self.real_version <= Version("2023"):
env.append_flags("SPACK_ALWAYS_CFLAGS", "-diag-disable=10441")
env.append_flags("SPACK_ALWAYS_CXXFLAGS", "-diag-disable=10441")
if self.real_version >= Version("2021") and self.real_version <= Version("2024"):
env.append_flags("SPACK_ALWAYS_FFLAGS", "-diag-disable=10448")


@@ -0,0 +1,426 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import contextlib
import hashlib
import json
import os
import re
import shutil
import stat
import sys
import tempfile
import typing
from typing import Dict, List, Optional, Set, Tuple
import llnl.path
import llnl.util.lang
from llnl.util import tty
from llnl.util.filesystem import path_contains_subdirectory, paths_containing_libs
import spack.caches
import spack.util.libc
from spack.util.environment import filter_system_paths
from spack.util.file_cache import FileCache
if typing.TYPE_CHECKING:
import spack.spec
#: regex for parsing linker lines
_LINKER_LINE = re.compile(r"^( *|.*[/\\])" r"(link|ld|([^/\\]+-)?ld|collect2)" r"[^/\\]*( |$)")
#: components of linker lines to ignore
_LINKER_LINE_IGNORE = re.compile(r"(collect2 version|^[A-Za-z0-9_]+=|/ldfe )")
#: regex to match linker search paths
_LINK_DIR_ARG = re.compile(r"^-L(.:)?(?P<dir>[/\\].*)")
#: regex to match linker library path arguments
_LIBPATH_ARG = re.compile(r"^[-/](LIBPATH|libpath):(?P<dir>.*)")
@llnl.path.system_path_filter
def parse_non_system_link_dirs(compiler_debug_output: str) -> List[str]:
"""Parses link paths out of compiler debug output.
Args:
compiler_debug_output: compiler debug output as a string
Returns:
Implicit link paths parsed from the compiler output
"""
link_dirs = _parse_link_paths(compiler_debug_output)
# Remove directories that do not exist. Some versions of the Cray compiler
# report nonexistent directories
link_dirs = filter_non_existing_dirs(link_dirs)
# Return set of directories containing needed compiler libs, minus
# system paths. Note that 'filter_system_paths' only checks for an
# exact match, while 'in_system_subdirectory' checks if a path contains
# a system directory as a subdirectory
link_dirs = filter_system_paths(link_dirs)
return list(p for p in link_dirs if not in_system_subdirectory(p))
def filter_non_existing_dirs(dirs):
return [d for d in dirs if os.path.isdir(d)]
def in_system_subdirectory(path):
system_dirs = [
"/lib/",
"/lib64/",
"/usr/lib/",
"/usr/lib64/",
"/usr/local/lib/",
"/usr/local/lib64/",
]
return any(path_contains_subdirectory(path, x) for x in system_dirs)
def _parse_link_paths(string):
"""Parse implicit link paths from compiler debug output.
This gives the compiler runtime library paths that we need to add to
the RPATH of generated binaries and libraries. It allows us to
ensure, e.g., that codes load the right libstdc++ for their compiler.
"""
lib_search_paths = False
raw_link_dirs = []
for line in string.splitlines():
if lib_search_paths:
if line.startswith("\t"):
raw_link_dirs.append(line[1:])
continue
else:
lib_search_paths = False
elif line.startswith("Library search paths:"):
lib_search_paths = True
if not _LINKER_LINE.match(line):
continue
if _LINKER_LINE_IGNORE.match(line):
continue
tty.debug(f"implicit link dirs: link line: {line}")
next_arg = False
for arg in line.split():
if arg in ("-L", "-Y"):
next_arg = True
continue
if next_arg:
raw_link_dirs.append(arg)
next_arg = False
continue
link_dir_arg = _LINK_DIR_ARG.match(arg)
if link_dir_arg:
link_dir = link_dir_arg.group("dir")
raw_link_dirs.append(link_dir)
link_dir_arg = _LIBPATH_ARG.match(arg)
if link_dir_arg:
link_dir = link_dir_arg.group("dir")
raw_link_dirs.append(link_dir)
implicit_link_dirs = list()
visited = set()
for link_dir in raw_link_dirs:
normalized_path = os.path.abspath(link_dir)
if normalized_path not in visited:
implicit_link_dirs.append(normalized_path)
visited.add(normalized_path)
tty.debug(f"implicit link dirs: result: {', '.join(implicit_link_dirs)}")
return implicit_link_dirs
class CompilerPropertyDetector:
def __init__(self, compiler_spec: "spack.spec.Spec"):
assert compiler_spec.external, "only external compiler specs are allowed, so far"
assert compiler_spec.concrete, "only concrete compiler specs are allowed, so far"
self.spec = compiler_spec
self.cache = COMPILER_CACHE
@contextlib.contextmanager
def compiler_environment(self):
"""Sets the environment to run this compiler"""
import spack.schema.environment
import spack.util.module_cmd
# Avoid modifying os.environ if possible.
environment = self.spec.extra_attributes.get("environment", {})
modules = self.spec.external_modules or []
if not self.spec.external_modules and not environment:
yield
return
# store environment to replace later
backup_env = os.environ.copy()
try:
# load modules and set env variables
for module in modules:
spack.util.module_cmd.load_module(module)
# apply other compiler environment changes
spack.schema.environment.parse(environment).apply_modifications()
yield
finally:
# Restore environment regardless of whether inner code succeeded
os.environ.clear()
os.environ.update(backup_env)
def _compile_dummy_c_source(self) -> Optional[str]:
import spack.util.executable
assert self.spec.external, "only external compiler specs are allowed, so far"
compiler_pkg = self.spec.package
if getattr(compiler_pkg, "cc"):
cc = compiler_pkg.cc
ext = "c"
else:
cc = compiler_pkg.cxx
ext = "cc"
if not cc or not self.spec.package.verbose_flags:
return None
tmpdir = tempfile.mkdtemp(prefix="spack-implicit-link-info")
try:
fout = os.path.join(tmpdir, "output")
fin = os.path.join(tmpdir, f"main.{ext}")
with open(fin, "w") as csource:
csource.write(
"int main(int argc, char* argv[]) { (void)argc; (void)argv; return 0; }\n"
)
cc_exe = spack.util.executable.Executable(cc)
# FIXME (compiler as nodes): this operation should be encapsulated somewhere else
compiler_flags = self.spec.extra_attributes.get("flags", {})
for flag_type in [
"cflags" if cc == compiler_pkg.cc else "cxxflags",
"cppflags",
"ldflags",
]:
current_flags = compiler_flags.get(flag_type, "").strip()
if current_flags:
cc_exe.add_default_arg(*current_flags.split(" "))
with self.compiler_environment():
return cc_exe("-v", fin, "-o", fout, output=str, error=str)
except spack.util.executable.ProcessError as pe:
tty.debug(f"ProcessError: Command exited with non-zero status: {pe.long_message}")
return None
finally:
shutil.rmtree(tmpdir, ignore_errors=True)
def compiler_verbose_output(self) -> Optional[str]:
return self.cache.get(self.spec).c_compiler_output
def default_dynamic_linker(self) -> Optional[str]:
output = self.compiler_verbose_output()
if not output:
return None
return spack.util.libc.parse_dynamic_linker(output)
def default_libc(self) -> Optional["spack.spec.Spec"]:
"""Determine libc targeted by the compiler from link line"""
# technically this should be testing the target platform of the compiler, but we don't have
# that, so stick to host platform for now.
if sys.platform in ("darwin", "win32"):
return None
dynamic_linker = self.default_dynamic_linker()
if dynamic_linker is None:
return None
return spack.util.libc.libc_from_dynamic_linker(dynamic_linker)
def implicit_rpaths(self) -> List[str]:
output = self.compiler_verbose_output()
if output is None:
return []
link_dirs = parse_non_system_link_dirs(output)
all_required_libs = list(self.spec.package.required_libs) + ["libc", "libc++", "libstdc++"]
dynamic_linker = self.default_dynamic_linker()
# FIXME (compiler as nodes): is this needed ?
# if dynamic_linker is None:
# return []
result = DefaultDynamicLinkerFilter(dynamic_linker)(
paths_containing_libs(link_dirs, all_required_libs)
)
return list(result)
class DefaultDynamicLinkerFilter:
"""Remove rpaths to directories that are default search paths of the dynamic linker."""
_CACHE: Dict[Optional[str], Set[Tuple[int, int]]] = {}
def __init__(self, dynamic_linker: Optional[str]) -> None:
if dynamic_linker not in DefaultDynamicLinkerFilter._CACHE:
# Identify directories by (inode, device) tuple, which handles symlinks too.
default_path_identifiers: Set[Tuple[int, int]] = set()
if not dynamic_linker:
self.default_path_identifiers = None
return
for path in spack.util.libc.default_search_paths_from_dynamic_linker(dynamic_linker):
try:
s = os.stat(path)
if stat.S_ISDIR(s.st_mode):
default_path_identifiers.add((s.st_ino, s.st_dev))
except OSError:
continue
DefaultDynamicLinkerFilter._CACHE[dynamic_linker] = default_path_identifiers
self.default_path_identifiers = DefaultDynamicLinkerFilter._CACHE[dynamic_linker]
def is_dynamic_loader_default_path(self, p: str) -> bool:
if self.default_path_identifiers is None:
return False
try:
s = os.stat(p)
return (s.st_ino, s.st_dev) in self.default_path_identifiers
except OSError:
return False
def __call__(self, dirs: List[str]) -> List[str]:
if not self.default_path_identifiers:
return dirs
return [p for p in dirs if not self.is_dynamic_loader_default_path(p)]
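# Usage sketch with hypothetical paths: a filter built from
# "/lib64/ld-linux-x86-64.so.2" would typically drop default directories,
# e.g. filter(["/opt/gcc/lib64", "/usr/lib64"]) -> ["/opt/gcc/lib64"].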
def dynamic_linker_filter_for(node: "spack.spec.Spec") -> Optional[DefaultDynamicLinkerFilter]:
compiler = compiler_spec(node)
if compiler is None:
return None
detector = CompilerPropertyDetector(compiler)
dynamic_linker = detector.default_dynamic_linker()
if dynamic_linker is None:
return None
return DefaultDynamicLinkerFilter(dynamic_linker)
def compiler_spec(node: "spack.spec.Spec") -> Optional["spack.spec.Spec"]:
"""Returns the compiler spec associated with the node passed as argument.
The function looks for a "c", "cxx", and "fortran" compiler in that order,
and returns the first found. If none is found, returns None.
"""
for language in ("c", "cxx", "fortran"):
candidates = node.dependencies(virtuals=[language])
if candidates:
break
else:
return None
return candidates[0]
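# For example, for a node built with a GCC external, the "c" query would
# typically match the gcc spec, which is then returned as the compiler.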
class CompilerCacheEntry:
"""Deserialized cache entry for a compiler"""
__slots__ = ["c_compiler_output"]
def __init__(self, c_compiler_output: Optional[str]):
self.c_compiler_output = c_compiler_output
@classmethod
def from_dict(cls, data: Dict[str, Optional[str]]):
if not isinstance(data, dict):
raise ValueError(f"Invalid {cls.__name__} data")
c_compiler_output = data.get("c_compiler_output")
if not isinstance(c_compiler_output, (str, type(None))):
raise ValueError(f"Invalid {cls.__name__} data")
return cls(c_compiler_output)
class CompilerCache:
"""Base class for compiler output cache. Default implementation does not cache anything."""
def value(self, compiler: "spack.spec.Spec") -> Dict[str, Optional[str]]:
return {"c_compiler_output": CompilerPropertyDetector(compiler)._compile_dummy_c_source()}
def get(self, compiler: "spack.spec.Spec") -> CompilerCacheEntry:
return CompilerCacheEntry.from_dict(self.value(compiler))
class FileCompilerCache(CompilerCache):
"""Cache for compiler output, which is used to determine implicit link paths, the default libc
version, and the compiler version."""
name = os.path.join("compilers", "compilers.json")
def __init__(self, cache: "FileCache") -> None:
self.cache = cache
self.cache.init_entry(self.name)
self._data: Dict[str, Dict[str, Optional[str]]] = {}
def _get_entry(self, key: str) -> Optional[CompilerCacheEntry]:
try:
return CompilerCacheEntry.from_dict(self._data[key])
except ValueError:
del self._data[key]
except KeyError:
pass
return None
def get(self, compiler: "spack.spec.Spec") -> CompilerCacheEntry:
# Cache hit
try:
with self.cache.read_transaction(self.name) as f:
assert f is not None
self._data = json.loads(f.read())
assert isinstance(self._data, dict)
except (json.JSONDecodeError, AssertionError):
self._data = {}
key = self._key(compiler)
value = self._get_entry(key)
if value is not None:
return value
# Cache miss
with self.cache.write_transaction(self.name) as (old, new):
try:
assert old is not None
self._data = json.loads(old.read())
assert isinstance(self._data, dict)
except (json.JSONDecodeError, AssertionError):
self._data = {}
# Use cache entry that may have been created by another process in the meantime.
entry = self._get_entry(key)
# Finally compute the cache entry
if entry is None:
self._data[key] = self.value(compiler)
entry = CompilerCacheEntry.from_dict(self._data[key])
new.write(json.dumps(self._data, separators=(",", ":")))
return entry
def _key(self, compiler: "spack.spec.Spec") -> str:
as_bytes = json.dumps(compiler.to_dict(), separators=(",", ":")).encode("utf-8")
return hashlib.sha256(as_bytes).hexdigest()
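# The key is a content hash of the serialized spec (a 64-character hex digest),
# so any change to the compiler spec, e.g. version or prefix, yields a new entry.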
def _make_compiler_cache():
return FileCompilerCache(spack.caches.MISC_CACHE)
COMPILER_CACHE: CompilerCache = llnl.util.lang.Singleton(_make_compiler_cache) # type: ignore


@@ -1,394 +0,0 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import re
import subprocess
import sys
import tempfile
from typing import Dict
import archspec.cpu
import spack.compiler
import spack.operating_systems.windows_os
import spack.platforms
import spack.util.executable
from spack.compiler import Compiler
from spack.error import SpackError
from spack.version import Version, VersionRange
FC_PATH: Dict[str, str] = dict()
class CmdCall:
"""Compose a call to `cmd` for an ordered series of cmd commands/scripts"""
def __init__(self, *cmds):
if not cmds:
raise RuntimeError(
"""Attempting to run commands from CMD without specifying commands.
Please add commands to be run."""
)
self._cmds = cmds
def __call__(self):
out = subprocess.check_output(self.cmd_line, stderr=subprocess.STDOUT) # novermin
return out.decode("utf-16le", errors="replace") # novermin
@property
def cmd_line(self):
base_call = "cmd /u /c "
commands = " && ".join([x.command_str() for x in self._cmds])
# If multiple commands are being invoked by a single subshell
# they must be encapsulated by a double quote. Always double
# quote to be sure of proper handling
# cmd will properly resolve nested double quotes as needed
#
# ``set`` writes out the active env to the subshell stdout,
# and in this context we are always trying to obtain env
# state so it should always be appended
return base_call + f'"{commands} && set"'
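# For example, two hypothetical scripts would compose to:
#   cmd /u /c ""C:\vcvars64.bat" && "C:\setvars.bat" && set"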
class VarsInvocation:
def __init__(self, script):
self._script = script
def command_str(self):
return f'"{self._script}"'
@property
def script(self):
return self._script
class VCVarsInvocation(VarsInvocation):
def __init__(self, script, arch, msvc_version):
super(VCVarsInvocation, self).__init__(script)
self._arch = arch
self._msvc_version = msvc_version
@property
def sdk_ver(self):
"""Accessor for Windows SDK version property
Note: This property may not be set by
the calling context and as such this property will
return an empty string
This property will ONLY be set if the SDK package
is a dependency somewhere in the Spack DAG of the package
for which we are constructing an MSVC compiler env.
Otherwise this property should be unset to allow the VCVARS
script to use its internal heuristics to determine appropriate
SDK version
"""
if getattr(self, "_sdk_ver", None):
return self._sdk_ver + ".0"
return ""
@sdk_ver.setter
def sdk_ver(self, val):
self._sdk_ver = val
@property
def arch(self):
return self._arch
@property
def vcvars_ver(self):
return f"-vcvars_ver={self._msvc_version}"
def command_str(self):
script = super(VCVarsInvocation, self).command_str()
return f"{script} {self.arch} {self.sdk_ver} {self.vcvars_ver}"
def get_valid_fortran_pth():
"""Assign maximum available fortran compiler version"""
# TODO (johnwparent): validate compatibility w/ try compiler
# functionality when added
sort_fn = lambda fc_ver: Version(fc_ver)
sort_fc_ver = sorted(list(FC_PATH.keys()), key=sort_fn)
return FC_PATH[sort_fc_ver[-1]] if sort_fc_ver else None
class Msvc(Compiler):
# Named wrapper links within build_env_path
# Due to the challenges of supporting compiler wrappers
# on Windows, we leave these blank, and dynamically compute
# them based on the proper versions of MSVC from there,
# pending acceptance of #28117 for full support using
# compiler wrappers
link_paths = {"cc": "", "cxx": "", "f77": "", "fc": ""}
#: Compiler argument that produces version information
version_argument = ""
# For getting ifx's version, call it with version_argument
# and ignore the error code
ignore_version_errors = [1]
#: Regex used to extract version from compiler's output
version_regex = r"([1-9][0-9]*\.[0-9]*\.[0-9]*)"
# The MSVC compiler class overrides this to prevent instances
# of erroneous matching on executable names that cannot be msvc
# compilers
suffixes = []
is_supported_on_platform = lambda x: isinstance(x, spack.platforms.Windows)
def __init__(self, *args, **kwargs):
# This positional argument "paths" is later parsed and processed by the base
# class via the call to `super` later in this method
paths = args[3]
latest_fc = get_valid_fortran_pth()
new_pth = [pth if pth else latest_fc for pth in paths[2:]]
paths[2:] = new_pth
# Initialize, deferring to the base class but then adding the vcvarsall
# file based on the compiler executable path.
super().__init__(*args, **kwargs)
# To use the MSVC compilers, VCVARS must be invoked.
# VCVARS is located at a fixed location, referenceable
# idiomatically by the following relative path from the
# compiler.
# Spack first finds the compilers via VSWHERE
# and stores their paths, but their respective VCVARS
# files must be invoked before use.
env_cmds = []
compiler_root = os.path.join(os.path.dirname(self.cc), "../../../../../..")
vcvars_script_path = os.path.join(compiler_root, "Auxiliary", "Build", "vcvars64.bat")
# get current platform architecture and format for vcvars argument
arch = spack.platforms.real_host().default.lower()
arch = arch.replace("-", "_")
if str(archspec.cpu.host().family) == "x86_64":
arch = "amd64"
self.vcvars_call = VCVarsInvocation(vcvars_script_path, arch, self.msvc_version)
env_cmds.append(self.vcvars_call)
# Below is a check for a valid fortran path.
# `paths` holds the c, cxx, fc, and f77 paths, in that order;
# paths[2] refers to the fc path and serves as a generic check
# for a fortran compiler.
if paths[2]:
def get_oneapi_root(pth: str):
"""From within a prefix known to be a oneAPI path
determine the oneAPI root path from arbitrary point
under root
Args:
pth: path prefixed within oneAPI root
"""
if not pth:
return ""
while os.path.basename(pth) and os.path.basename(pth) != "oneAPI":
pth = os.path.dirname(pth)
return pth
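# e.g. a hypothetical path like C:\...\oneAPI\compiler\latest\bin\ifx.exe
# walks up until the basename is "oneAPI", returning C:\...\oneAPI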
# If this is found, it sets all the vars
oneapi_root = get_oneapi_root(self.fc)
if not oneapi_root:
raise RuntimeError(f"Non-oneAPI Fortran compiler {self.fc} assigned to MSVC")
oneapi_root_setvars = os.path.join(oneapi_root, "setvars.bat")
# some oneAPI exes report a version more precise than their
# install paths specify, so we determine the version from
# the install path rather than from the fc executable itself
numver = r"\d+\.\d+(?:\.\d+)?"
pattern = f"((?:{numver})|(?:latest))"
version_from_path = re.search(pattern, self.fc).group(1)
oneapi_version_setvars = os.path.join(
oneapi_root, "compiler", version_from_path, "env", "vars.bat"
)
# order matters here: the specific version env must be invoked first,
# otherwise it will be ignored if the root setvars sets up the oneapi
# env first
env_cmds.extend(
[VarsInvocation(oneapi_version_setvars), VarsInvocation(oneapi_root_setvars)]
)
self.msvc_compiler_environment = CmdCall(*env_cmds)
@property
def cxx11_flag(self):
return "/std:c++11"
@property
def cxx14_flag(self):
return "/std:c++14"
@property
def cxx17_flag(self):
return "/std:c++17"
@property
def cxx20_flag(self):
return "/std:c++20"
@property
def c11_flag(self):
return "/std:c11"
@property
def c17_flag(self):
return "/std:c17"
@property
def msvc_version(self):
"""This is the VCToolset version *NOT* the actual version of the cl compiler
For CL version, query `Msvc.cl_version`"""
return Version(re.search(Msvc.version_regex, self.cc).group(1))
@property
def short_msvc_version(self):
"""This is the shorthand VCToolset version of form
MSVC<short-ver>
"""
return "MSVC" + self.vc_toolset_ver
@property
def vc_toolset_ver(self):
"""
The toolset version is the version of the combined set of cl and link.
It typically relates directly to the VS version, i.e. VS 2022 is v143,
VS 2019 is v142, etc.
This value is defined by the first three digits of the major + minor
version of the VS toolset (143 for 14.3x.bbbbb). Traditionally the
minor version has remained a static two-digit number for a VS release
series; however, as of VS22, this is no longer true: both
14.4x.bbbbb and 14.3x.bbbbb are considered valid VS22 VC toolset
versions due to a change in the toolset minor versioning scheme.
This is *NOT* the full version; for that see
Msvc.msvc_version, or see Msvc.platform_toolset_ver for the
raw platform toolset version.
"""
ver = self.msvc_version[:2].joined.string[:3]
return ver
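# e.g. a hypothetical msvc_version of 14.29.30133 yields "142",
# while 14.40.x yields "144"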
@property
def platform_toolset_ver(self):
"""
This is the platform toolset version of the current MSVC compiler,
i.e. 142. The platform toolset is the MSVC library/compiler
versions targeted by compilation (this is different from the VC toolset).
It differs from the VC toolset version as established
by `short_msvc_version`, but the two are typically represented by the
same three-digit value.
"""
# Typically VS toolset version and platform toolset versions match
# VS22 introduces the first divergence of VS toolset version
# (144 for "recent" releases) and platform toolset version (143)
# so it needs additional handling until MS releases v144
# (assuming v144 is also for VS22)
# or adds better support for detection
# TODO: (johnwparent) Update this logic for the next platform toolset
# or VC toolset version update
toolset_ver = self.vc_toolset_ver
vs22_toolset = Version(toolset_ver) > Version("142")
return toolset_ver if not vs22_toolset else "143"
@property
def visual_studio_version(self):
"""The four digit Visual Studio version (i.e. 2019 or 2022)
Note: This differs from the msvc version or toolset version as
those properties track the compiler and build tools version
respectively, whereas this tracks the VS release associated
with a given MSVC compiler.
"""
return re.search(r"[0-9]{4}", self.cc).group(0)
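# e.g. a cc path containing a "2022" component (hypothetical install
# layout) yields "2022"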
def _compiler_version(self, compiler):
"""Returns version object for given compiler"""
# ignore_errors below is true here due to ifx's
# non-zero return code if it is not provided
# an input file
return Version(
re.search(
Msvc.version_regex,
spack.compiler.get_compiler_version_output(
compiler, version_arg=None, ignore_errors=True
),
).group(1)
)
@property
def cl_version(self):
"""Cl toolset version"""
return self._compiler_version(self.cc)
@property
def ifx_version(self):
"""Ifx compiler version associated with this version of MSVC"""
return self._compiler_version(self.fc)
@property
def vs_root(self):
# The MSVC install root is located at a fixed level above the compiler
# and is referenceable idiomatically via the pattern below;
# this should be consistent across versions
return os.path.abspath(os.path.join(self.cc, "../../../../../../../.."))
def setup_custom_environment(self, pkg, env):
"""Set environment variables for MSVC using the
Microsoft-provided script."""
# Set the build environment variables for spack. Just using
# subprocess.call() doesn't work since that operates in its own
# environment which is destroyed (along with the adjusted variables)
# once the process terminates. So go the long way around: examine
# output, sort into dictionary, use that to make the build
# environment.
# vcvars can target specific sdk versions; force it to pick up the
# concretized sdk version, if one is needed by the spec
if pkg.name != "win-sdk" and "win-sdk" in pkg.spec:
self.vcvars_call.sdk_ver = pkg.spec["win-sdk"].version.string
out = self.msvc_compiler_environment()
int_env = dict(
(key, value)
for key, _, value in (line.partition("=") for line in out.splitlines())
if key and value
)
for env_var in int_env:
if os.pathsep not in int_env[env_var]:
env.set(env_var, int_env[env_var])
else:
env.set_path(env_var, int_env[env_var].split(os.pathsep))
# certain versions of ifx (2021.3.0:2023.1.0) do not play well with env:TMP
# that has a "." character in the path
# Work around by pointing tmp to the stage for the duration of the build
if self.fc and Version(self.fc_version(self.fc)).satisfies(
VersionRange("2021.3.0", "2023.1.0")
):
new_tmp = tempfile.mkdtemp(dir=pkg.stage.path)
env.set("TMP", new_tmp)
env.set("CC", self.cc)
env.set("CXX", self.cxx)
env.set("FC", self.fc)
env.set("F77", self.f77)
@classmethod
def fc_version(cls, fc):
if not sys.platform == "win32":
return "unknown"
fc_ver = cls.default_version(fc)
FC_PATH[fc_ver] = fc
try:
sps = spack.operating_systems.windows_os.WindowsOs().compiler_search_paths
except AttributeError:
raise SpackError(
"Windows compiler search paths not established, "
"please report this behavior to github.com/spack/spack"
)
clp = spack.util.executable.which_string("cl", path=sps)
return cls.default_version(clp) if clp else fc_ver


@@ -1,112 +0,0 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import re
import llnl.util.lang
import spack.compiler
class Nag(spack.compiler.Compiler):
# Named wrapper links within build_env_path
# Use default wrappers for C and C++, in case provided in compilers.yaml
link_paths = {
"cc": "cc",
"cxx": "c++",
"f77": os.path.join("nag", "nagfor"),
"fc": os.path.join("nag", "nagfor"),
}
version_argument = "-V"
@classmethod
@llnl.util.lang.memoized
def extract_version_from_output(cls, output):
match = re.search(r"NAG Fortran Compiler Release (\d+).(\d+)\(.*\) Build (\d+)", output)
if match:
return ".".join(match.groups())
@property
def verbose_flag(self):
# NAG does not support a flag that would enable verbose output and
# compilation/linking at the same time (with either '-#' or '-dryrun'
# the compiler only prints the commands but does not run them).
# Therefore, the only thing we can do is to pass the '-v' argument to
# the underlying GCC. In order to get verbose output from the latter
# at both compile and linking stages, we need to call NAG with two
# additional flags: '-Wc,-v' and '-Wl,-v'. However, we return only
# '-Wl,-v' for the following reasons:
# 1) the interface of this method does not support multiple flags in
# the return value and, at least currently, verbose output at the
# linking stage has a higher priority for us;
# 2) NAG is usually mixed with GCC compiler, which also accepts
# '-Wl,-v' and produces meaningful result with it: '-v' is passed
# to the linker and the latter produces verbose output for the
# linking stage ('-Wc,-v', however, would break the compilation
# with a message from GCC that the flag is not recognized).
#
# This way, we at least enable the implicit rpath detection, which is
# based on compilation of a C file (see method
# spack.compiler._compile_dummy_c_source): in the case of a mixed
# NAG/GCC toolchain, the flag will be passed to g++ (e.g.
# 'g++ -Wl,-v ./main.c'), otherwise, the flag will be passed to nagfor
# (e.g. 'nagfor -Wl,-v ./main.c' - note that nagfor recognizes '.c'
# extension and treats the file accordingly). The list of detected
# rpaths will contain only GCC-related directories and rpaths to
# NAG-related directories are injected by nagfor anyway.
return "-Wl,-v"
@property
def openmp_flag(self):
return "-openmp"
@property
def debug_flags(self):
return ["-g", "-gline", "-g90"]
@property
def opt_flags(self):
return ["-O", "-O0", "-O1", "-O2", "-O3", "-O4"]
@property
def cxx11_flag(self):
# NAG does not have a C++ compiler
# However, it can be mixed with a compiler that does support it
return "-std=c++11"
@property
def f77_pic_flag(self):
return "-PIC"
@property
def fc_pic_flag(self):
return "-PIC"
# Unlike other compilers, the NAG compiler passes options to GCC, which
# then passes them to the linker. Therefore, we need to doubly wrap the
# options with '-Wl,-Wl,,'
@property
def f77_rpath_arg(self):
return "-Wl,-Wl,,-rpath,,"
@property
def fc_rpath_arg(self):
return "-Wl,-Wl,,-rpath,,"
@property
def linker_arg(self):
return "-Wl,-Wl,,"
@property
def disable_new_dtags(self):
# Disable RPATH/RUNPATH forcing for NAG/GCC mixed toolchains:
return ""
@property
def enable_new_dtags(self):
# Disable RPATH/RUNPATH forcing for NAG/GCC mixed toolchains:
return ""


@@ -1,79 +0,0 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
from spack.compiler import Compiler
class Nvhpc(Compiler):
# Named wrapper links within build_env_path
link_paths = {
"cc": os.path.join("nvhpc", "nvc"),
"cxx": os.path.join("nvhpc", "nvc++"),
"f77": os.path.join("nvhpc", "nvfortran"),
"fc": os.path.join("nvhpc", "nvfortran"),
}
version_argument = "--version"
version_regex = r"nv[^ ]* (?:[^ ]+ Dev-r)?([0-9.]+)(?:-[0-9]+)?"
@property
def verbose_flag(self):
return "-v"
@property
def debug_flags(self):
return ["-g", "-gopt"]
@property
def opt_flags(self):
return ["-O", "-O0", "-O1", "-O2", "-O3", "-O4"]
@property
def openmp_flag(self):
return "-mp"
@property
def cc_pic_flag(self):
return "-fpic"
@property
def cxx_pic_flag(self):
return "-fpic"
@property
def f77_pic_flag(self):
return "-fpic"
@property
def fc_pic_flag(self):
return "-fpic"
@property
def c99_flag(self):
return "-c99"
@property
def c11_flag(self):
return "-c11"
@property
def cxx11_flag(self):
return "--c++11"
@property
def cxx14_flag(self):
return "--c++14"
@property
def cxx17_flag(self):
return "--c++17"
@property
def stdcxx_libs(self):
return ("-c++libs",)
required_libs = ["libnvc", "libnvf"]


@@ -1,172 +0,0 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
from os.path import dirname, join
from llnl.util import tty
from llnl.util.filesystem import ancestor
import spack.util.executable
from spack.compiler import Compiler
from spack.version import Version
class Oneapi(Compiler):
# Named wrapper links within build_env_path
link_paths = {
"cc": os.path.join("oneapi", "icx"),
"cxx": os.path.join("oneapi", "icpx"),
"f77": os.path.join("oneapi", "ifx"),
"fc": os.path.join("oneapi", "ifx"),
}
version_argument = "--version"
version_regex = r"(?:(?:oneAPI DPC\+\+(?:\/C\+\+)? Compiler)|(?:\(IFORT\))|(?:\(IFX\))) (\S+)"
@property
def verbose_flag(self):
return "-v"
required_libs = [
"libirc",
"libifcore",
"libifcoremt",
"libirng",
"libsvml",
"libintlc",
"libimf",
"libsycl",
"libOpenCL",
]
@property
def debug_flags(self):
return ["-debug", "-g", "-g0", "-g1", "-g2", "-g3"]
@property
def opt_flags(self):
return ["-O", "-O0", "-O1", "-O2", "-O3", "-Ofast", "-Os"]
@property
def openmp_flag(self):
return "-fiopenmp"
# There may be some additional options here for offload, e.g. :
# -fopenmp-simd Emit OpenMP code only for SIMD-based constructs.
# -fopenmp-targets=<value>
# -fopenmp-version=<value>
# -fopenmp Parse OpenMP pragmas and generate parallel code.
# -qno-openmp Disable OpenMP support
# -qopenmp-link=<value> Choose whether to link with the static or
# dynamic OpenMP libraries. Default is dynamic.
# -qopenmp-simd Emit OpenMP code only for SIMD-based constructs.
# -qopenmp-stubs enables the user to compile OpenMP programs in
# sequential mode. The OpenMP directives are
# ignored and a stub OpenMP library is linked.
# -qopenmp-threadprivate=<value>
# -qopenmp Parse OpenMP pragmas and generate parallel code.
# -static-openmp Use the static host OpenMP runtime while
# linking.
# -Xopenmp-target=<triple> <arg>
# -Xopenmp-target <arg> Pass <arg> to the target offloading toolchain.
# Source: icx --help output
@property
def cxx11_flag(self):
return "-std=c++11"
@property
def cxx14_flag(self):
return "-std=c++14"
@property
def cxx17_flag(self):
return "-std=c++17"
@property
def cxx20_flag(self):
return "-std=c++20"
@property
def c99_flag(self):
return "-std=c99"
@property
def c11_flag(self):
return "-std=c1x"
@property
def cc_pic_flag(self):
return "-fPIC"
@property
def cxx_pic_flag(self):
return "-fPIC"
@property
def f77_pic_flag(self):
return "-fPIC"
@property
def fc_pic_flag(self):
return "-fPIC"
@property
def stdcxx_libs(self):
return ("-cxxlib",)
@property
def prefix(self):
# OneAPI reports its install prefix when running ``--version``
# on the line ``InstalledDir: <prefix>/bin/compiler``.
cc = spack.util.executable.Executable(self.cc)
with self.compiler_environment():
oneapi_output = cc("--version", output=str, error=str)
for line in oneapi_output.splitlines():
if line.startswith("InstalledDir:"):
oneapi_prefix = line.split(":")[1].strip()
# Go from <prefix>/bin/compiler to <prefix>
return ancestor(oneapi_prefix, 2)
raise RuntimeError(
"could not find install prefix of OneAPI from output:\n\t{}".format(oneapi_output)
)
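# e.g. a hypothetical line "InstalledDir: /opt/intel/oneapi/compiler/2024.0/bin/compiler"
# yields the prefix /opt/intel/oneapi/compiler/2024.0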
def setup_custom_environment(self, pkg, env):
# workaround bug in icpx driver where it requires sycl-post-link is on the PATH
# It is located in the same directory as the driver. Error message:
# clang++: error: unable to execute command:
# Executable "sycl-post-link" doesn't exist!
# also ensures that shared objects and libraries required by the compiler,
# e.g. libonnx, can be found successfully
# due to a fix, this is no longer required for OneAPI versions >= 2024.2
if self.cxx and pkg.spec.satisfies("%oneapi@:2024.1"):
env.prepend_path("PATH", dirname(self.cxx))
env.prepend_path("LD_LIBRARY_PATH", join(dirname(dirname(self.cxx)), "lib"))
# Edge cases for Intel's oneAPI compilers when using the legacy classic compilers:
# Always pass flags to disable deprecation warnings, since these warnings can
# confuse tools that parse the output of compiler commands (e.g. version checks).
# This is really only needed for Fortran, since oneapi@ should be using either
# icx+icpx+ifx or icx+icpx+ifort. But to be on the safe side (some users may
# want to try to swap icpx against icpc, for example), and since the Intel LLVM
# compilers accept these diag-disable flags, we apply them for all compilers.
if self.real_version >= Version("2021") and self.real_version <= Version("2023"):
env.append_flags("SPACK_ALWAYS_CFLAGS", "-diag-disable=10441")
env.append_flags("SPACK_ALWAYS_CXXFLAGS", "-diag-disable=10441")
if self.real_version >= Version("2021") and self.real_version <= Version("2024"):
env.append_flags("SPACK_ALWAYS_FFLAGS", "-diag-disable=10448")
# The 2024 release bumped the libsycl version because of an ABI
# change, so 2024 compilers are required. Otherwise you will see this
# error:
#
# /usr/bin/ld: warning: libsycl.so.7, needed by ...., not found
if pkg.spec.satisfies("%oneapi@:2023"):
for c in ["dnn"]:
if pkg.spec.satisfies(f"^intel-oneapi-{c}@2024:"):
tty.warn(f"intel-oneapi-{c}@2024 SYCL APIs requires %oneapi@2024:")


@@ -1,54 +0,0 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import re
import llnl.util.lang
import spack.compilers.clang
class Rocmcc(spack.compilers.clang.Clang):
@property
def link_paths(self):
link_paths = {
"cc": "rocmcc/amdclang",
"cxx": "rocmcc/amdclang++",
"f77": "rocmcc/amdflang",
"fc": "rocmcc/amdflang",
}
return link_paths
@property
def cxx11_flag(self):
return "-std=c++11"
@property
def cxx14_flag(self):
return "-std=c++14"
@property
def cxx17_flag(self):
return "-std=c++17"
@property
def c99_flag(self):
return "-std=c99"
@property
def c11_flag(self):
return "-std=c11"
@classmethod
@llnl.util.lang.memoized
def extract_version_from_output(cls, output):
match = re.search(r"llvm-project roc-(\d+)[._](\d+)[._](\d+)", output)
if match:
return ".".join(match.groups())
@property
def stdcxx_libs(self):
return ("-lstdc++",)


@@ -1,93 +0,0 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
from spack.compiler import Compiler, UnsupportedCompilerFlag
from spack.version import Version
class Xl(Compiler):
# Named wrapper links within build_env_path
link_paths = {
"cc": os.path.join("xl", "xlc"),
"cxx": os.path.join("xl", "xlc++"),
"f77": os.path.join("xl", "xlf"),
"fc": os.path.join("xl", "xlf90"),
}
version_argument = "-qversion"
version_regex = r"([0-9]?[0-9]\.[0-9])"
@property
def verbose_flag(self):
return "-V"
@property
def debug_flags(self):
return ["-g", "-g0", "-g1", "-g2", "-g8", "-g9"]
@property
def opt_flags(self):
return ["-O", "-O0", "-O1", "-O2", "-O3", "-O4", "-O5", "-Ofast"]
@property
def openmp_flag(self):
return "-qsmp=omp"
@property
def cxx11_flag(self):
if self.real_version < Version("13.1"):
raise UnsupportedCompilerFlag(self, "the C++11 standard", "cxx11_flag", "< 13.1")
else:
return "-qlanglvl=extended0x"
@property
def c99_flag(self):
if self.real_version >= Version("13.1.1"):
return "-std=gnu99"
if self.real_version >= Version("10.1"):
return "-qlanglvl=extc99"
raise UnsupportedCompilerFlag(self, "the C99 standard", "c99_flag", "< 10.1")
@property
def c11_flag(self):
if self.real_version >= Version("13.1.2"):
return "-std=gnu11"
if self.real_version >= Version("12.1"):
return "-qlanglvl=extc1x"
raise UnsupportedCompilerFlag(self, "the C11 standard", "c11_flag", "< 12.1")
@property
def cxx14_flag(self):
# .real_version does not have the "y.z" component of "w.x.y.z", which
# is required to distinguish whether support is available
if self.version >= Version("16.1.1.8"):
return "-std=c++14"
raise UnsupportedCompilerFlag(self, "the C++14 standard", "cxx14_flag", "< 16.1.1.8")
@property
def cc_pic_flag(self):
return "-qpic"
@property
def cxx_pic_flag(self):
return "-qpic"
@property
def f77_pic_flag(self):
return "-qpic"
@property
def fc_pic_flag(self):
return "-qpic"
@property
def fflags(self):
# The -qzerosize flag is effective only for the Fortran 77
# compilers and allows the use of zero size objects.
# For Fortran 90 and beyond, it is set by default and has no impact.
# Its use has no negative side effects.
return "-qzerosize"


@@ -1,18 +0,0 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import spack.compilers.xl
class XlR(spack.compilers.xl.Xl):
# Named wrapper links within build_env_path
link_paths = {
"cc": os.path.join("xl_r", "xlc_r"),
"cxx": os.path.join("xl_r", "xlc++_r"),
"f77": os.path.join("xl_r", "xlf_r"),
"fc": os.path.join("xl_r", "xlf90_r"),
}


@@ -11,6 +11,7 @@
import llnl.util.tty as tty
import spack.compilers
import spack.compilers.config
import spack.config
import spack.error
import spack.repo
@@ -146,7 +147,7 @@ def concretize_separately(
# Ensure we have compilers in compilers.yaml to avoid that
# processes try to write the config file in parallel
_ = spack.compilers.all_compilers_config(spack.config.CONFIG)
_ = spack.compilers.config.all_compilers_from(spack.config.CONFIG)
# Early return if there is nothing to do
if len(args) == 0:
@@ -160,6 +161,11 @@ def concretize_separately(
# TODO: support parallel concretization on macOS and Windows
num_procs = min(len(args), spack.config.determine_number_of_jobs(parallel=True))
msg = "Starting concretization"
if sys.platform not in ("darwin", "win32") and num_procs > 1:
msg += f" pool with {num_procs} processes"
tty.msg(msg)
for j, (i, concrete, duration) in enumerate(
spack.util.parallel.imap_unordered(
_concretize_task, args, processes=num_procs, debug=tty.is_debug(), maxtaskperchild=1


@@ -715,6 +715,9 @@ def print_section(self, section: str, blame: bool = False, *, scope=None) -> Non
raise spack.error.ConfigError(f"cannot read '{section}' configuration") from e
ConfigurationType = Union[Configuration, lang.Singleton]
@contextlib.contextmanager
def override(
path_or_scope: Union[ConfigScope, str], value: Optional[Any] = None


@@ -14,7 +14,7 @@
import llnl.util.tty as tty
import spack.cmd
import spack.compilers
import spack.compilers.config
import spack.deptypes as dt
import spack.error
import spack.hash_types as hash_types
@@ -28,13 +28,13 @@
#: packages here.
default_path = "/opt/cray/pe/cpe-descriptive-manifest/"
compiler_name_translation = {"nvidia": "nvhpc", "rocm": "rocmcc"}
compiler_name_translation = {"nvidia": "nvhpc", "rocm": "rocmcc", "clang": "llvm"}
def translated_compiler_name(manifest_compiler_name):
"""
When creating a Compiler object, Spack expects a name matching
one of the classes in `spack.compilers`. Names in the Cray manifest
one of the classes in `spack.compilers.config`. Names in the Cray manifest
may differ; for cases where we know the name refers to a compiler in
Spack, this function translates it automatically.
@@ -43,25 +43,25 @@ def translated_compiler_name(manifest_compiler_name):
"""
if manifest_compiler_name in compiler_name_translation:
return compiler_name_translation[manifest_compiler_name]
elif manifest_compiler_name in spack.compilers.supported_compilers():
elif manifest_compiler_name in spack.compilers.config.supported_compilers():
return manifest_compiler_name
else:
raise spack.compilers.UnknownCompilerError(
raise spack.compilers.config.UnknownCompilerError(
"Manifest parsing - unknown compiler: {0}".format(manifest_compiler_name)
)
def compiler_from_entry(entry: dict, manifest_path: str):
def compiler_from_entry(entry: dict, *, manifest_path: str) -> "spack.spec.Spec":
# Note that manifest_path is only passed here to compose a
# useful warning message when paths appear to be missing.
compiler_name = translated_compiler_name(entry["name"])
prefix = None
if "prefix" in entry:
prefix = entry["prefix"]
paths = dict(
(lang, os.path.join(prefix, relpath))
for (lang, relpath) in entry["executables"].items()
)
paths = {
lang: os.path.join(prefix, relpath) for lang, relpath in entry["executables"].items()
}
else:
paths = entry["executables"]
@@ -75,25 +75,38 @@ def compiler_from_entry(entry: dict, manifest_path: str):
missing_paths.append(path)
# to instantiate a compiler class we may need a concrete version:
version = "={}".format(entry["version"])
arch = entry["arch"]
operating_system = arch["os"]
target = arch["target"]
spec_str = f"{compiler_name}@={entry['version']} os={operating_system} target={target}"
compiler_cls = spack.compilers.class_for_compiler_name(compiler_name)
spec = spack.spec.CompilerSpec(compiler_cls.name, version)
path_list = [paths.get(x, None) for x in ("cc", "cxx", "f77", "fc")]
compilers = {}
for x in ("cc", "cxx", "fc"):
language = {"cc": "c", "fc": "fortran"}
if x not in paths:
continue
if prefix is None:
prefix = os.path.dirname(paths[x])
compilers[language.get(x, x)] = paths[x]
if missing_paths:
warnings.warn(
"Manifest entry refers to nonexistent paths:\n\t"
+ "\n\t".join(missing_paths)
+ f"\nfor {str(spec)}"
+ f"\nfor {spec_str}"
+ f"\nin {manifest_path}"
+ "\nPlease report this issue"
)
return compiler_cls(spec, operating_system, target, path_list)
assert prefix is not None, "compiler prefix must be set"
result = spack.spec.Spec(
str(spack.spec.parse_with_version_concrete(spec_str)), external_path=prefix
)
result.extra_attributes = {"compilers": compilers}
result._finalize_concretization()
return result
def spec_from_entry(entry):
@@ -121,7 +134,7 @@ def spec_from_entry(entry):
version=entry["compiler"]["version"],
)
spec_format = "{name}@={version} {compiler} {arch}"
spec_format = "{name}@={version} {arch}"
spec_str = spec_format.format(
name=entry["name"], version=entry["version"], compiler=compiler_str, arch=arch_str
)
@@ -182,6 +195,7 @@ def entries_to_specs(entries):
for entry in entries:
try:
spec = spec_from_entry(entry)
assert spec.concrete, f"{spec} is not concrete"
spec_dict[spec._hash] = spec
except spack.repo.UnknownPackageError:
tty.debug("Omitting package {0}: no corresponding repo package".format(entry["name"]))
@@ -222,23 +236,24 @@ def read(path, apply_updates):
tty.debug("{0}: {1} specs read from manifest".format(path, str(len(specs))))
compilers = list()
if "compilers" in json_data:
compilers.extend(compiler_from_entry(x, path) for x in json_data["compilers"])
tty.debug("{0}: {1} compilers read from manifest".format(path, str(len(compilers))))
compilers.extend(
compiler_from_entry(x, manifest_path=path) for x in json_data["compilers"]
)
tty.debug(f"{path}: {str(len(compilers))} compilers read from manifest")
# Filter out the compilers that already appear in the configuration
compilers = spack.compilers.select_new_compilers(compilers)
compilers = spack.compilers.config.select_new_compilers(compilers)
if apply_updates and compilers:
for compiler in compilers:
try:
spack.compilers.add_compilers_to_config([compiler])
except Exception:
warnings.warn(
f"Could not add compiler {str(compiler.spec)}: "
f"\n\tfrom manifest: {path}"
"\nPlease reexecute with 'spack -d' and include the stack trace"
)
tty.debug(f"Include this\n{traceback.format_exc()}")
try:
spack.compilers.config.add_compiler_to_config(compilers)
except Exception:
warnings.warn(
f"Could not add compilers from manifest: {path}"
"\nPlease reexecute with 'spack -d' and include the stack trace"
)
tty.debug(f"Include this\n{traceback.format_exc()}")
if apply_updates:
for spec in specs.values():
assert spec.concrete, f"{spec} is not concrete"
spack.store.STORE.db.add(spec)


@@ -80,7 +80,7 @@
#: DB version. This is stuck in the DB file to track changes in format.
#: Increment by one when the database format changes.
#: Versions before 5 were not integers.
_DB_VERSION = vn.Version("7")
_DB_VERSION = vn.Version("8")
#: For any version combinations here, skip reindex when upgrading.
#: Reindexing can take considerable time and is not always necessary.
@@ -93,6 +93,8 @@
(vn.Version("0.9.3"), vn.Version("5")),
(vn.Version("5"), vn.Version("6")),
(vn.Version("6"), vn.Version("7")),
(vn.Version("6"), vn.Version("8")),
(vn.Version("7"), vn.Version("8")),
]
#: Default timeout for spack database locks in seconds or None (no timeout).
@@ -141,6 +143,7 @@ def reader(version: vn.StandardVersion) -> Type["spack.spec.SpecfileReaderBase"]
vn.Version("5"): spack.spec.SpecfileV1,
vn.Version("6"): spack.spec.SpecfileV3,
vn.Version("7"): spack.spec.SpecfileV4,
vn.Version("8"): spack.spec.SpecfileV5,
}
return reader_cls[version]


@@ -3,7 +3,7 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Data structures that represent Spack's dependency relationships."""
from typing import Dict, List
from typing import Dict, List, Type
import spack.deptypes as dt
import spack.spec
@@ -38,7 +38,7 @@ class Dependency:
def __init__(
self,
pkg: "spack.package_base.PackageBase",
pkg: Type["spack.package_base.PackageBase"],
spec: "spack.spec.Spec",
depflag: dt.DepFlag = dt.DEFAULT,
):


@@ -21,6 +21,7 @@ class OpenMpi(Package):
* ``conflicts``
* ``depends_on``
* ``extends``
* ``license``
* ``patch``
* ``provides``
* ``resource``
@@ -34,12 +35,12 @@ class OpenMpi(Package):
import collections.abc
import os.path
import re
from typing import Any, Callable, List, Optional, Tuple, Union
from typing import Any, Callable, List, Optional, Tuple, Type, Union
import llnl.util.lang
import llnl.util.tty.color
import spack.deptypes as dt
import spack.fetch_strategy
import spack.package_base
import spack.patch
import spack.spec
@@ -47,7 +48,6 @@ class OpenMpi(Package):
import spack.variant
from spack.dependency import Dependency
from spack.directives_meta import DirectiveError, DirectiveMeta
from spack.fetch_strategy import from_kwargs
from spack.resource import Resource
from spack.version import (
GitVersion,
@@ -82,11 +82,8 @@ class OpenMpi(Package):
SpecType = str
DepType = Union[Tuple[str, ...], str]
WhenType = Optional[Union[spack.spec.Spec, str, bool]]
Patcher = Callable[[Union[spack.package_base.PackageBase, Dependency]], None]
PatchesType = Optional[Union[Patcher, str, List[Union[Patcher, str]]]]
SUPPORTED_LANGUAGES = ("fortran", "cxx", "c")
Patcher = Callable[[Union[Type[spack.package_base.PackageBase], Dependency]], None]
PatchesType = Union[Patcher, str, List[Union[Patcher, str]]]
def _make_when_spec(value: WhenType) -> Optional[spack.spec.Spec]:
@@ -219,7 +216,7 @@ def version(
return lambda pkg: _execute_version(pkg, ver, **kwargs)
def _execute_version(pkg, ver, **kwargs):
def _execute_version(pkg: Type[spack.package_base.PackageBase], ver: Union[str, int], **kwargs):
if (
(any(s in kwargs for s in spack.util.crypto.hashes) or "checksum" in kwargs)
and hasattr(pkg, "has_code")
@@ -250,12 +247,12 @@ def _execute_version(pkg, ver, **kwargs):
def _depends_on(
pkg: spack.package_base.PackageBase,
pkg: Type[spack.package_base.PackageBase],
spec: spack.spec.Spec,
*,
when: WhenType = None,
type: DepType = dt.DEFAULT_TYPES,
patches: PatchesType = None,
patches: Optional[PatchesType] = None,
):
when_spec = _make_when_spec(when)
if not when_spec:
@@ -330,7 +327,7 @@ def conflicts(conflict_spec: SpecType, when: WhenType = None, msg: Optional[str]
msg (str): optional user defined message
"""
def _execute_conflicts(pkg: spack.package_base.PackageBase):
def _execute_conflicts(pkg: Type[spack.package_base.PackageBase]):
# If when is not specified the conflict always holds
when_spec = _make_when_spec(when)
if not when_spec:
@@ -349,7 +346,7 @@ def depends_on(
spec: SpecType,
when: WhenType = None,
type: DepType = dt.DEFAULT_TYPES,
patches: PatchesType = None,
patches: Optional[PatchesType] = None,
):
"""Creates a dict of deps with specs defining when they apply.
@@ -367,18 +364,17 @@ def depends_on(
"""
dep_spec = spack.spec.Spec(spec)
if dep_spec.name in SUPPORTED_LANGUAGES:
assert type == "build", "languages must be of 'build' type"
return _language(lang_spec_str=spec, when=when)
def _execute_depends_on(pkg: spack.package_base.PackageBase):
def _execute_depends_on(pkg: Type[spack.package_base.PackageBase]):
_depends_on(pkg, dep_spec, when=when, type=type, patches=patches)
return _execute_depends_on
@directive("disable_redistribute")
def redistribute(source=None, binary=None, when: WhenType = None):
def redistribute(
source: Optional[bool] = None, binary: Optional[bool] = None, when: WhenType = None
):
"""Can be used inside a Package definition to declare that
the package source and/or compiled binaries should not be
redistributed.
@@ -393,7 +389,10 @@ def redistribute(source=None, binary=None, when: WhenType = None):
def _execute_redistribute(
pkg: spack.package_base.PackageBase, source=None, binary=None, when: WhenType = None
pkg: Type[spack.package_base.PackageBase],
source: Optional[bool],
binary: Optional[bool],
when: WhenType,
):
if source is None and binary is None:
return
@@ -469,9 +468,7 @@ def provides(*specs: SpecType, when: WhenType = None):
when: condition when this provides clause needs to be considered
"""
def _execute_provides(pkg: spack.package_base.PackageBase):
import spack.parser # Avoid circular dependency
def _execute_provides(pkg: Type[spack.package_base.PackageBase]):
when_spec = _make_when_spec(when)
if not when_spec:
return
@@ -517,7 +514,7 @@ def can_splice(
variants will be skipped by '*'.
"""
def _execute_can_splice(pkg: spack.package_base.PackageBase):
def _execute_can_splice(pkg: Type[spack.package_base.PackageBase]):
when_spec = _make_when_spec(when)
if isinstance(match_variants, str) and match_variants != "*":
raise ValueError(
@@ -558,10 +555,10 @@ def patch(
compressed URL patches)
"""
def _execute_patch(pkg_or_dep: Union[spack.package_base.PackageBase, Dependency]):
pkg = pkg_or_dep
if isinstance(pkg, Dependency):
pkg = pkg.pkg
def _execute_patch(
pkg_or_dep: Union[Type[spack.package_base.PackageBase], Dependency]
) -> None:
pkg = pkg_or_dep.pkg if isinstance(pkg_or_dep, Dependency) else pkg_or_dep
if hasattr(pkg, "has_code") and not pkg.has_code:
raise UnsupportedPackageDirective(
@@ -735,58 +732,55 @@ def _execute_variant(pkg):
@directive("resources")
def resource(**kwargs):
"""Define an external resource to be fetched and staged when building the
package. Based on the keywords present in the dictionary the appropriate
FetchStrategy will be used for the resource. Resources are fetched and
staged in their own folder inside spack stage area, and then moved into
the stage area of the package that needs them.
def resource(
*,
name: Optional[str] = None,
destination: str = "",
placement: Optional[str] = None,
when: WhenType = None,
# additional kwargs are as for `version()`
**kwargs,
):
"""Define an external resource to be fetched and staged when building the package.
Based on the keywords present in the dictionary the appropriate FetchStrategy will
be used for the resource. Resources are fetched and staged in their own folder
inside spack stage area, and then moved into the stage area of the package that
needs them.
List of recognized keywords:
Keyword Arguments:
name: name for the resource
when: condition defining when the resource is needed
destination: path, relative to the package stage area, to which resource should be moved
placement: optionally rename the expanded resource inside the destination directory
* 'when' : (optional) represents the condition upon which the resource is
needed
* 'destination' : (optional) path where to move the resource. This path
must be relative to the main package stage area.
* 'placement' : (optional) gives the possibility to fine tune how the
resource is moved into the main package stage area.
"""
def _execute_resource(pkg):
when = kwargs.get("when")
when_spec = _make_when_spec(when)
if not when_spec:
return
destination = kwargs.get("destination", "")
placement = kwargs.get("placement", None)
# Check if the path is relative
if os.path.isabs(destination):
message = (
"The destination keyword of a resource directive " "can't be an absolute path.\n"
)
message += "\tdestination : '{dest}\n'".format(dest=destination)
raise RuntimeError(message)
msg = "The destination keyword of a resource directive can't be an absolute path.\n"
msg += f"\tdestination : '{destination}\n'"
raise RuntimeError(msg)
# Check if the path falls within the main package stage area
test_path = "stage_folder_root"
normalized_destination = os.path.normpath(
os.path.join(test_path, destination)
) # Normalized absolute path
# Normalized absolute path
normalized_destination = os.path.normpath(os.path.join(test_path, destination))
if test_path not in normalized_destination:
message = (
"The destination folder of a resource must fall "
"within the main package stage directory.\n"
)
message += "\tdestination : '{dest}'\n".format(dest=destination)
raise RuntimeError(message)
msg = "Destination of a resource must be within the package stage directory.\n"
msg += f"\tdestination : '{destination}'\n"
raise RuntimeError(msg)
resources = pkg.resources.setdefault(when_spec, [])
name = kwargs.get("name")
fetcher = from_kwargs(**kwargs)
resources.append(Resource(name, fetcher, destination, placement))
resources.append(
Resource(name, spack.fetch_strategy.from_kwargs(**kwargs), destination, placement)
)
return _execute_resource
@@ -818,7 +812,9 @@ def _execute_maintainer(pkg):
return _execute_maintainer
def _execute_license(pkg, license_identifier: str, when):
def _execute_license(
pkg: Type[spack.package_base.PackageBase], license_identifier: str, when: WhenType
):
# If when is not specified the license always holds
when_spec = _make_when_spec(when)
if not when_spec:
@@ -882,7 +878,7 @@ def requires(*requirement_specs: str, policy="one_of", when=None, msg=None):
msg: optional user defined message
"""
def _execute_requires(pkg: spack.package_base.PackageBase):
def _execute_requires(pkg: Type[spack.package_base.PackageBase]):
if policy not in ("one_of", "any_of"):
err_msg = (
f"the 'policy' argument of the 'requires' directive in {pkg.name} is set "
@@ -903,21 +899,6 @@ def _execute_requires(pkg: spack.package_base.PackageBase):
return _execute_requires
@directive("languages")
def _language(lang_spec_str: str, *, when: Optional[Union[str, bool]] = None):
"""Temporary implementation of language virtuals, until compilers are proper dependencies."""
def _execute_languages(pkg: spack.package_base.PackageBase):
when_spec = _make_when_spec(when)
if not when_spec:
return
languages = pkg.languages.setdefault(when_spec, set())
languages.add(lang_spec_str)
return _execute_languages
class DependencyError(DirectiveError):
"""This is raised when a dependency specification is invalid."""


@@ -5,7 +5,7 @@
import collections.abc
import functools
from typing import List, Set
from typing import Any, Callable, Dict, List, Optional, Sequence, Set, Type, Union
import llnl.util.lang
@@ -25,11 +25,13 @@ class DirectiveMeta(type):
# Set of all known directives
_directive_dict_names: Set[str] = set()
_directives_to_be_executed: List[str] = []
_when_constraints_from_context: List[str] = []
_directives_to_be_executed: List[Callable] = []
_when_constraints_from_context: List[spack.spec.Spec] = []
_default_args: List[dict] = []
def __new__(cls, name, bases, attr_dict):
def __new__(
cls: Type["DirectiveMeta"], name: str, bases: tuple, attr_dict: dict
) -> "DirectiveMeta":
# Initialize the attribute containing the list of directives
# to be executed. Here we go reversed because we want to execute
# commands:
@@ -60,7 +62,7 @@ def __new__(cls, name, bases, attr_dict):
return super(DirectiveMeta, cls).__new__(cls, name, bases, attr_dict)
def __init__(cls, name, bases, attr_dict):
def __init__(cls: "DirectiveMeta", name: str, bases: tuple, attr_dict: dict):
# The instance is being initialized: if it is a package we must ensure
# that the directives are called to set it up.
@@ -81,27 +83,27 @@ def __init__(cls, name, bases, attr_dict):
super(DirectiveMeta, cls).__init__(name, bases, attr_dict)
@staticmethod
def push_to_context(when_spec):
def push_to_context(when_spec: spack.spec.Spec) -> None:
"""Add a spec to the context constraints."""
DirectiveMeta._when_constraints_from_context.append(when_spec)
@staticmethod
def pop_from_context():
def pop_from_context() -> spack.spec.Spec:
"""Pop the last constraint from the context"""
return DirectiveMeta._when_constraints_from_context.pop()
@staticmethod
def push_default_args(default_args):
def push_default_args(default_args: Dict[str, Any]) -> None:
"""Push default arguments"""
DirectiveMeta._default_args.append(default_args)
@staticmethod
def pop_default_args():
def pop_default_args() -> dict:
"""Pop default arguments"""
return DirectiveMeta._default_args.pop()
@staticmethod
def directive(dicts=None):
def directive(dicts: Optional[Union[Sequence[str], str]] = None) -> Callable:
"""Decorator for Spack directives.
Spack directives allow you to modify a package while it is being
@@ -156,7 +158,7 @@ class Foo(Package):
DirectiveMeta._directive_dict_names |= set(dicts)
# This decorator just returns the directive functions
def _decorator(decorated_function):
def _decorator(decorated_function: Callable) -> Callable:
directive_names.append(decorated_function.__name__)
@functools.wraps(decorated_function)


@@ -23,7 +23,7 @@
from spack.error import SpackError
default_projections = {
"all": "{architecture}/{compiler.name}-{compiler.version}/{name}-{version}-{hash}"
"all": "{architecture.platform}/{architecture.target}/{name}-{version}-{hash}"
}

View File

@@ -166,9 +166,7 @@ def __init__(
item.target.safe_name(),
" ".join(self._install_target(s.safe_name()) for s in item.prereqs),
item.target.spec_hash(),
item.target.unsafe_format(
"{name}{@version}{%compiler}{variants}{arch=architecture}"
),
item.target.unsafe_format("{name}{@version}{variants}{ arch=architecture}"),
item.buildcache_flag,
)
for item in adjacency_list


@@ -135,7 +135,7 @@ def default_manifest_yaml():
valid_environment_name_re = r"^\w[\w-]*$"
#: version of the lockfile format. Must increase monotonically.
lockfile_format_version = 5
lockfile_format_version = 6
READER_CLS = {
@@ -144,6 +144,7 @@ def default_manifest_yaml():
3: spack.spec.SpecfileV2,
4: spack.spec.SpecfileV3,
5: spack.spec.SpecfileV4,
6: spack.spec.SpecfileV5,
}


@@ -192,3 +192,10 @@ def __reduce__(self):
def _make_stop_phase(msg, long_msg):
return StopPhase(msg, long_msg)
class MirrorError(SpackError):
"""Superclass of all mirror-creation related errors."""
def __init__(self, msg, long_msg=None):
super().__init__(msg, long_msg)


@@ -5,7 +5,6 @@
"""Service functions and classes to implement the hooks
for Spack's command extensions.
"""
import difflib
import glob
import importlib
import os
@@ -17,7 +16,6 @@
import llnl.util.lang
import spack.cmd
import spack.config
import spack.error
import spack.util.path
@@ -25,9 +23,6 @@
_extension_regexp = re.compile(r"spack-(\w[-\w]*)$")
# TODO: For consistency we should use spack.cmd.python_name(), but
# currently this would create a circular relationship between
# spack.cmd and spack.extensions.
def _python_name(cmd_name):
return cmd_name.replace("-", "_")
@@ -211,8 +206,7 @@ def get_module(cmd_name):
module = load_command_extension(cmd_name, folder)
if module:
return module
else:
raise CommandNotFoundError(cmd_name)
return None
def get_template_dirs():
@@ -224,27 +218,6 @@ def get_template_dirs():
return extensions
class CommandNotFoundError(spack.error.SpackError):
"""Exception class thrown when a requested command is not recognized as
such.
"""
def __init__(self, cmd_name):
msg = (
"{0} is not a recognized Spack command or extension command;"
" check with `spack commands`.".format(cmd_name)
)
long_msg = None
similar = difflib.get_close_matches(cmd_name, spack.cmd.all_commands())
if 1 <= len(similar) <= 5:
long_msg = "\nDid you mean one of the following commands?\n "
long_msg += "\n ".join(similar)
super().__init__(msg, long_msg)
class ExtensionNamingError(spack.error.SpackError):
"""Exception class thrown when a configured extension does not follow
the expected naming convention.


@@ -325,12 +325,7 @@ def write(self, spec, color=None, out=None):
self._out = llnl.util.tty.color.ColorStream(out, color=color)
# We'll traverse the spec in topological order as we graph it.
nodes_in_topological_order = [
edge.spec
for edge in spack.traverse.traverse_edges_topo(
[spec], direction="children", deptype=self.depflag
)
]
nodes_in_topological_order = list(spec.traverse(order="topo", deptype=self.depflag))
nodes_in_topological_order.reverse()
# Work on a copy to be nondestructive


@@ -6,7 +6,7 @@
import llnl.util.tty as tty
import spack.binary_distribution as bindist
import spack.mirror
import spack.mirrors.mirror
def post_install(spec, explicit):
@@ -22,7 +22,7 @@ def post_install(spec, explicit):
return
# Push the package to all autopush mirrors
for mirror in spack.mirror.MirrorCollection(binary=True, autopush=True).values():
for mirror in spack.mirrors.mirror.MirrorCollection(binary=True, autopush=True).values():
signing_key = bindist.select_signing_key() if mirror.signed else None
with bindist.make_uploader(mirror=mirror, force=True, signing_key=signing_key) as uploader:
uploader.push_or_raise([spec])

View File

@@ -375,23 +375,16 @@ def phase_tests(self, builder, phase_name: str, method_names: List[str]):
for name in method_names:
try:
# Prefer the method in the package over the builder's.
# We need this primarily to pick up arbitrarily named test
# methods but also some build-time checks.
fn = getattr(builder.pkg, name, getattr(builder, name))
msg = f"RUN-TESTS: {phase_name}-time tests [{name}]"
print_message(logger, msg, verbose)
fn()
fn = getattr(builder, name, None) or getattr(builder.pkg, name)
except AttributeError as e:
msg = f"RUN-TESTS: method not implemented [{name}]"
print_message(logger, msg, verbose)
self.add_failure(e, msg)
print_message(logger, f"RUN-TESTS: method not implemented [{name}]", verbose)
self.add_failure(e, f"RUN-TESTS: method not implemented [{name}]")
if fail_fast:
break
continue
print_message(logger, f"RUN-TESTS: {phase_name}-time tests [{name}]", verbose)
fn()
if have_tests:
print_message(logger, "Completed testing", verbose)

View File

@@ -56,7 +56,7 @@
import spack.deptypes as dt
import spack.error
import spack.hooks
import spack.mirror
import spack.mirrors.mirror
import spack.package_base
import spack.package_prefs as prefs
import spack.repo
@@ -491,7 +491,7 @@ def _try_install_from_binary_cache(
timer: timer to keep track of binary install phases.
"""
# Early exit if no binary mirrors are configured.
if not spack.mirror.MirrorCollection(binary=True):
if not spack.mirrors.mirror.MirrorCollection(binary=True):
return False
tty.debug(f"Searching for binary cache of {package_id(pkg.spec)}")

View File

@@ -0,0 +1,146 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import os.path
from typing import Optional
import llnl.url
import llnl.util.symlink
from llnl.util.filesystem import mkdirp
import spack.fetch_strategy
import spack.oci.image
import spack.repo
import spack.spec
from spack.error import MirrorError
class MirrorLayout:
"""A ``MirrorLayout`` object describes the relative path of a mirror entry."""
def __init__(self, path: str) -> None:
self.path = path
def __iter__(self):
"""Yield all paths including aliases where the resource can be found."""
yield self.path
def make_alias(self, root: str) -> None:
"""Make the entry ``root / self.path`` available under a human readable alias"""
pass
class DefaultLayout(MirrorLayout):
def __init__(self, alias_path: str, digest_path: Optional[str] = None) -> None:
# When we have a digest, it is used as the primary storage location. If not, then we use
# the human-readable alias. In case of mirrors of a VCS checkout, we currently do not have
# a digest, that's why an alias is required and a digest optional.
super().__init__(path=digest_path or alias_path)
self.alias = alias_path
self.digest_path = digest_path
def make_alias(self, root: str) -> None:
"""Symlink a human readible path in our mirror to the actual storage location."""
# We already use the human-readable path as the main storage location.
if not self.digest_path:
return
alias, digest = os.path.join(root, self.alias), os.path.join(root, self.digest_path)
alias_dir = os.path.dirname(alias)
relative_dst = os.path.relpath(digest, start=alias_dir)
mkdirp(alias_dir)
tmp = f"{alias}.tmp"
llnl.util.symlink.symlink(relative_dst, tmp)
try:
os.rename(tmp, alias)
except OSError:
# Clean up the temporary if possible
try:
os.unlink(tmp)
except OSError:
pass
raise
def __iter__(self):
if self.digest_path:
yield self.digest_path
yield self.alias
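A hypothetical use of the DefaultLayout above (paths invented; assumes the class is importable and that dangling symlinks are permitted, as on POSIX):

import tempfile

root = tempfile.mkdtemp()
layout = DefaultLayout(
    alias_path="zlib/zlib-1.3.tar.gz",
    digest_path="_source-cache/archive/ab/abcd1234.tar.gz",
)
layout.make_alias(root)  # relative symlink: alias -> digest location
print(list(layout))      # digest path first, then the human-readable alias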
class OCILayout(MirrorLayout):
"""Follow the OCI Image Layout Specification to archive blobs where paths are of the form
``blobs/<algorithm>/<digest>``"""
def __init__(self, digest: spack.oci.image.Digest) -> None:
super().__init__(os.path.join("blobs", digest.algorithm, digest.digest))
def _determine_extension(fetcher):
if isinstance(fetcher, spack.fetch_strategy.URLFetchStrategy):
if fetcher.expand_archive:
# If we fetch with a URLFetchStrategy, use URL's archive type
ext = llnl.url.determine_url_file_extension(fetcher.url)
if ext:
# Remove any leading dots
ext = ext.lstrip(".")
else:
msg = """\
Unable to parse extension from {0}.
If this URL is for a tarball but does not include the file extension
in the name, you can explicitly declare it with the following syntax:
version('1.2.3', 'hash', extension='tar.gz')
If this URL is for a download like a .jar or .whl that does not need
to be expanded, or an uncompressed installation script, you can tell
Spack not to expand it with the following syntax:
version('1.2.3', 'hash', expand=False)
"""
raise MirrorError(msg.format(fetcher.url))
else:
# If the archive shouldn't be expanded, don't check extension.
ext = None
else:
# Otherwise we'll make a .tar.gz ourselves
ext = "tar.gz"
return ext
def default_mirror_layout(
fetcher: "spack.fetch_strategy.FetchStrategy",
per_package_ref: str,
spec: Optional["spack.spec.Spec"] = None,
) -> MirrorLayout:
"""Returns a ``MirrorReference`` object which keeps track of the relative
storage path of the resource associated with the specified ``fetcher``."""
ext = None
if spec:
pkg_cls = spack.repo.PATH.get_pkg_class(spec.name)
versions = pkg_cls.versions.get(spec.version, {})
ext = versions.get("extension", None)
# If the spec does not explicitly specify an extension (the default case),
# then try to determine it automatically. An extension can only be
# specified for the primary source of the package (e.g. the source code
# identified in the 'version' declaration). Resources/patches don't have
# an option to specify an extension, so it must be inferred for those.
ext = ext or _determine_extension(fetcher)
if ext:
per_package_ref += ".%s" % ext
global_ref = fetcher.mirror_id()
if global_ref:
global_ref = os.path.join("_source-cache", global_ref)
if global_ref and ext:
global_ref += ".%s" % ext
return DefaultLayout(per_package_ref, global_ref)

View File

@@ -2,42 +2,20 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""
This file contains code for creating spack mirror directories. A
mirror is an organized hierarchy containing specially named archive
files. This enables spack to know where to find files in a mirror if
the main server for a particular package is down. Or, if the computer
where spack is run is not connected to the internet, it allows spack
to download packages directly from a mirror (e.g., on an intranet).
"""
import collections
import collections.abc
import operator
import os
import os.path
import sys
import traceback
import urllib.parse
from typing import Any, Dict, Optional, Tuple, Union
import llnl.url
import llnl.util.symlink
import llnl.util.tty as tty
from llnl.util.filesystem import mkdirp
import spack.caches
import spack.config
import spack.error
import spack.fetch_strategy
import spack.mirror
import spack.oci.image
import spack.repo
import spack.spec
import spack.util.path
import spack.util.spack_json as sjson
import spack.util.spack_yaml as syaml
import spack.util.url as url_util
import spack.version
from spack.error import MirrorError
#: What schemes do we support
supported_url_schemes = ("file", "http", "https", "sftp", "ftp", "s3", "gs", "oci")
@@ -490,380 +468,3 @@ def __iter__(self):
def __len__(self):
return len(self._mirrors)
def _determine_extension(fetcher):
if isinstance(fetcher, spack.fetch_strategy.URLFetchStrategy):
if fetcher.expand_archive:
# If we fetch with a URLFetchStrategy, use URL's archive type
ext = llnl.url.determine_url_file_extension(fetcher.url)
if ext:
# Remove any leading dots
ext = ext.lstrip(".")
else:
msg = """\
Unable to parse extension from {0}.
If this URL is for a tarball but does not include the file extension
in the name, you can explicitly declare it with the following syntax:
version('1.2.3', 'hash', extension='tar.gz')
If this URL is for a download like a .jar or .whl that does not need
to be expanded, or an uncompressed installation script, you can tell
Spack not to expand it with the following syntax:
version('1.2.3', 'hash', expand=False)
"""
raise MirrorError(msg.format(fetcher.url))
else:
# If the archive shouldn't be expanded, don't check extension.
ext = None
else:
# Otherwise we'll make a .tar.gz ourselves
ext = "tar.gz"
return ext
class MirrorLayout:
"""A ``MirrorLayout`` object describes the relative path of a mirror entry."""
def __init__(self, path: str) -> None:
self.path = path
def __iter__(self):
"""Yield all paths including aliases where the resource can be found."""
yield self.path
def make_alias(self, root: str) -> None:
"""Make the entry ``root / self.path`` available under a human readable alias"""
pass
class DefaultLayout(MirrorLayout):
def __init__(self, alias_path: str, digest_path: Optional[str] = None) -> None:
# When we have a digest, it is used as the primary storage location. If not, then we use
# the human-readable alias. In case of mirrors of a VCS checkout, we currently do not have
# a digest, that's why an alias is required and a digest optional.
super().__init__(path=digest_path or alias_path)
self.alias = alias_path
self.digest_path = digest_path
def make_alias(self, root: str) -> None:
"""Symlink a human readible path in our mirror to the actual storage location."""
# We already use the human-readable path as the main storage location.
if not self.digest_path:
return
alias, digest = os.path.join(root, self.alias), os.path.join(root, self.digest_path)
alias_dir = os.path.dirname(alias)
relative_dst = os.path.relpath(digest, start=alias_dir)
mkdirp(alias_dir)
tmp = f"{alias}.tmp"
llnl.util.symlink.symlink(relative_dst, tmp)
try:
os.rename(tmp, alias)
except OSError:
# Clean up the temporary if possible
try:
os.unlink(tmp)
except OSError:
pass
raise
def __iter__(self):
if self.digest_path:
yield self.digest_path
yield self.alias
class OCILayout(MirrorLayout):
"""Follow the OCI Image Layout Specification to archive blobs where paths are of the form
``blobs/<algorithm>/<digest>``"""
def __init__(self, digest: spack.oci.image.Digest) -> None:
super().__init__(os.path.join("blobs", digest.algorithm, digest.digest))
def default_mirror_layout(
fetcher: "spack.fetch_strategy.FetchStrategy",
per_package_ref: str,
spec: Optional["spack.spec.Spec"] = None,
) -> MirrorLayout:
"""Returns a ``MirrorReference`` object which keeps track of the relative
storage path of the resource associated with the specified ``fetcher``."""
ext = None
if spec:
pkg_cls = spack.repo.PATH.get_pkg_class(spec.name)
versions = pkg_cls.versions.get(spec.version, {})
ext = versions.get("extension", None)
# If the spec does not explicitly specify an extension (the default case),
# then try to determine it automatically. An extension can only be
# specified for the primary source of the package (e.g. the source code
# identified in the 'version' declaration). Resources/patches don't have
# an option to specify an extension, so it must be inferred for those.
ext = ext or _determine_extension(fetcher)
if ext:
per_package_ref += ".%s" % ext
global_ref = fetcher.mirror_id()
if global_ref:
global_ref = os.path.join("_source-cache", global_ref)
if global_ref and ext:
global_ref += ".%s" % ext
return DefaultLayout(per_package_ref, global_ref)
def get_all_versions(specs):
"""Given a set of initial specs, return a new set of specs that includes
each version of each package in the original set.
Note that if any spec in the original set specifies properties other than
version, this information will be omitted in the new set; for example, the
new set of specs will not include variant settings.
"""
version_specs = []
for spec in specs:
pkg_cls = spack.repo.PATH.get_pkg_class(spec.name)
# Skip any package that has no known versions.
if not pkg_cls.versions:
tty.msg("No safe (checksummed) versions for package %s" % pkg_cls.name)
continue
for version in pkg_cls.versions:
version_spec = spack.spec.Spec(pkg_cls.name)
version_spec.versions = spack.version.VersionList([version])
version_specs.append(version_spec)
return version_specs
def get_matching_versions(specs, num_versions=1):
"""Get a spec for EACH known version matching any spec in the list.
For concrete specs, this retrieves the concrete version and, if more
than one version per spec is requested, retrieves the latest versions
of the package.
"""
matching = []
for spec in specs:
pkg = spec.package
# Skip any package that has no known versions.
if not pkg.versions:
tty.msg("No safe (checksummed) versions for package %s" % pkg.name)
continue
pkg_versions = num_versions
version_order = list(reversed(sorted(pkg.versions)))
matching_spec = []
if spec.concrete:
matching_spec.append(spec)
pkg_versions -= 1
if spec.version in version_order:
version_order.remove(spec.version)
for v in version_order:
# Generate no more than num_versions versions for each spec.
if pkg_versions < 1:
break
# Generate only versions that satisfy the spec.
if spec.concrete or v.intersects(spec.versions):
s = spack.spec.Spec(pkg.name)
s.versions = spack.version.VersionList([v])
s.variants = spec.variants.copy()
# This is needed to avoid hanging references during the
# concretization phase
s.variants.spec = s
matching_spec.append(s)
pkg_versions -= 1
if not matching_spec:
tty.warn("No known version matches spec: %s" % spec)
matching.extend(matching_spec)
return matching
def create(path, specs, skip_unstable_versions=False):
"""Create a directory to be used as a spack mirror, and fill it with
package archives.
Arguments:
path: Path to create a mirror directory hierarchy in.
specs: Any package versions matching these specs will be added \
to the mirror.
skip_unstable_versions: if true, this skips adding resources when
they do not have a stable archive checksum (as determined by
``fetch_strategy.stable_target``)
Return Value:
Returns a tuple of lists: (present, mirrored, error)
* present: Package specs that were already present.
* mirrored: Package specs that were successfully mirrored.
* error: Package specs that failed to mirror due to some error.
"""
# automatically spec-ify anything in the specs array.
specs = [s if isinstance(s, spack.spec.Spec) else spack.spec.Spec(s) for s in specs]
mirror_cache, mirror_stats = mirror_cache_and_stats(path, skip_unstable_versions)
for spec in specs:
mirror_stats.next_spec(spec)
create_mirror_from_package_object(spec.package, mirror_cache, mirror_stats)
return mirror_stats.stats()
def mirror_cache_and_stats(path, skip_unstable_versions=False):
"""Return both a mirror cache and a mirror stats, starting from the path
where a mirror ought to be created.
Args:
path (str): path to create a mirror directory hierarchy in.
skip_unstable_versions: if true, this skips adding resources when
they do not have a stable archive checksum (as determined by
``fetch_strategy.stable_target``)
"""
# Get the absolute path of the root before we start jumping around.
if not os.path.isdir(path):
try:
mkdirp(path)
except OSError as e:
raise MirrorError("Cannot create directory '%s':" % path, str(e))
mirror_cache = spack.caches.MirrorCache(path, skip_unstable_versions=skip_unstable_versions)
mirror_stats = MirrorStats()
return mirror_cache, mirror_stats
def add(mirror: Mirror, scope=None):
"""Add a named mirror in the given scope"""
mirrors = spack.config.get("mirrors", scope=scope)
if not mirrors:
mirrors = syaml.syaml_dict()
if mirror.name in mirrors:
tty.die("Mirror with name {} already exists.".format(mirror.name))
items = [(n, u) for n, u in mirrors.items()]
items.insert(0, (mirror.name, mirror.to_dict()))
mirrors = syaml.syaml_dict(items)
spack.config.set("mirrors", mirrors, scope=scope)
def remove(name, scope):
"""Remove the named mirror in the given scope"""
mirrors = spack.config.get("mirrors", scope=scope)
if not mirrors:
mirrors = syaml.syaml_dict()
if name not in mirrors:
tty.die("No mirror with name %s" % name)
mirrors.pop(name)
spack.config.set("mirrors", mirrors, scope=scope)
tty.msg("Removed mirror %s." % name)
class MirrorStats:
def __init__(self):
self.present = {}
self.new = {}
self.errors = set()
self.current_spec = None
self.added_resources = set()
self.existing_resources = set()
def next_spec(self, spec):
self._tally_current_spec()
self.current_spec = spec
def _tally_current_spec(self):
if self.current_spec:
if self.added_resources:
self.new[self.current_spec] = len(self.added_resources)
if self.existing_resources:
self.present[self.current_spec] = len(self.existing_resources)
self.added_resources = set()
self.existing_resources = set()
self.current_spec = None
def stats(self):
self._tally_current_spec()
return list(self.present), list(self.new), list(self.errors)
def already_existed(self, resource):
# If an error occurred after caching a subset of a spec's
# resources, a secondary attempt may consider them already added
if resource not in self.added_resources:
self.existing_resources.add(resource)
def added(self, resource):
self.added_resources.add(resource)
def error(self):
self.errors.add(self.current_spec)
def create_mirror_from_package_object(pkg_obj, mirror_cache, mirror_stats):
"""Add a single package object to a mirror.
The package object is only required to have an associated spec
with a concrete version.
Args:
pkg_obj (spack.package_base.PackageBase): package object to be added.
mirror_cache (spack.caches.MirrorCache): mirror where to add the spec.
mirror_stats (spack.mirror.MirrorStats): statistics on the current mirror
Return:
True if the spec was added successfully, False otherwise
"""
tty.msg("Adding package {} to mirror".format(pkg_obj.spec.format("{name}{@version}")))
num_retries = 3
while num_retries > 0:
try:
# Includes patches and resources
with pkg_obj.stage as pkg_stage:
pkg_stage.cache_mirror(mirror_cache, mirror_stats)
exception = None
break
except Exception as e:
exc_tuple = sys.exc_info()
exception = e
num_retries -= 1
if exception:
if spack.config.get("config:debug"):
traceback.print_exception(file=sys.stderr, *exc_tuple)
else:
tty.warn(
"Error while fetching %s" % pkg_obj.spec.cformat("{name}{@version}"),
getattr(exception, "message", exception),
)
mirror_stats.error()
return False
return True
def require_mirror_name(mirror_name):
"""Find a mirror by name and raise if it does not exist"""
mirror = MirrorCollection().get(mirror_name)
if not mirror:
raise ValueError(f'no mirror named "{mirror_name}"')
return mirror
class MirrorError(spack.error.SpackError):
"""Superclass of all mirror-creation related errors."""
def __init__(self, msg, long_msg=None):
super().__init__(msg, long_msg)

View File

@@ -0,0 +1,258 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import os.path
import traceback
import llnl.util.tty as tty
from llnl.util.filesystem import mkdirp
import spack.caches
import spack.config
import spack.error
import spack.repo
import spack.spec
import spack.util.spack_yaml as syaml
import spack.version
from spack.error import MirrorError
from spack.mirrors.mirror import Mirror, MirrorCollection
def get_all_versions(specs):
"""Given a set of initial specs, return a new set of specs that includes
each version of each package in the original set.
Note that if any spec in the original set specifies properties other than
version, this information will be omitted in the new set; for example, the
new set of specs will not include variant settings.
"""
version_specs = []
for spec in specs:
pkg_cls = spack.repo.PATH.get_pkg_class(spec.name)
# Skip any package that has no known versions.
if not pkg_cls.versions:
tty.msg("No safe (checksummed) versions for package %s" % pkg_cls.name)
continue
for version in pkg_cls.versions:
version_spec = spack.spec.Spec(pkg_cls.name)
version_spec.versions = spack.version.VersionList([version])
version_specs.append(version_spec)
return version_specs
def get_matching_versions(specs, num_versions=1):
"""Get a spec for EACH known version matching any spec in the list.
For concrete specs, this retrieves the concrete version and, if more
than one version per spec is requested, retrieves the latest versions
of the package.
"""
matching = []
for spec in specs:
pkg = spec.package
# Skip any package that has no known versions.
if not pkg.versions:
tty.msg("No safe (checksummed) versions for package %s" % pkg.name)
continue
pkg_versions = num_versions
version_order = list(reversed(sorted(pkg.versions)))
matching_spec = []
if spec.concrete:
matching_spec.append(spec)
pkg_versions -= 1
if spec.version in version_order:
version_order.remove(spec.version)
for v in version_order:
# Generate no more than num_versions versions for each spec.
if pkg_versions < 1:
break
# Generate only versions that satisfy the spec.
if spec.concrete or v.intersects(spec.versions):
s = spack.spec.Spec(pkg.name)
s.versions = spack.version.VersionList([v])
s.variants = spec.variants.copy()
# This is needed to avoid hanging references during the
# concretization phase
s.variants.spec = s
matching_spec.append(s)
pkg_versions -= 1
if not matching_spec:
tty.warn("No known version matches spec: %s" % spec)
matching.extend(matching_spec)
return matching
def create(path, specs, skip_unstable_versions=False):
"""Create a directory to be used as a spack mirror, and fill it with
package archives.
Arguments:
path: Path to create a mirror directory hierarchy in.
specs: Any package versions matching these specs will be added \
to the mirror.
skip_unstable_versions: if true, this skips adding resources when
they do not have a stable archive checksum (as determined by
``fetch_strategy.stable_target``)
Return Value:
Returns a tuple of lists: (present, mirrored, error)
* present: Package specs that were already present.
* mirrored: Package specs that were successfully mirrored.
* error: Package specs that failed to mirror due to some error.
"""
# automatically spec-ify anything in the specs array.
specs = [s if isinstance(s, spack.spec.Spec) else spack.spec.Spec(s) for s in specs]
mirror_cache, mirror_stats = mirror_cache_and_stats(path, skip_unstable_versions)
for spec in specs:
mirror_stats.next_spec(spec)
create_mirror_from_package_object(spec.package, mirror_cache, mirror_stats)
return mirror_stats.stats()
def mirror_cache_and_stats(path, skip_unstable_versions=False):
"""Return both a mirror cache and a mirror stats, starting from the path
where a mirror ought to be created.
Args:
path (str): path to create a mirror directory hierarchy in.
skip_unstable_versions: if true, this skips adding resources when
they do not have a stable archive checksum (as determined by
``fetch_strategy.stable_target``)
"""
# Get the absolute path of the root before we start jumping around.
if not os.path.isdir(path):
try:
mkdirp(path)
except OSError as e:
raise MirrorError("Cannot create directory '%s':" % path, str(e))
mirror_cache = spack.caches.MirrorCache(path, skip_unstable_versions=skip_unstable_versions)
mirror_stats = MirrorStats()
return mirror_cache, mirror_stats
def add(mirror: Mirror, scope=None):
"""Add a named mirror in the given scope"""
mirrors = spack.config.get("mirrors", scope=scope)
if not mirrors:
mirrors = syaml.syaml_dict()
if mirror.name in mirrors:
tty.die("Mirror with name {} already exists.".format(mirror.name))
items = [(n, u) for n, u in mirrors.items()]
items.insert(0, (mirror.name, mirror.to_dict()))
mirrors = syaml.syaml_dict(items)
spack.config.set("mirrors", mirrors, scope=scope)
def remove(name, scope):
"""Remove the named mirror in the given scope"""
mirrors = spack.config.get("mirrors", scope=scope)
if not mirrors:
mirrors = syaml.syaml_dict()
if name not in mirrors:
tty.die("No mirror with name %s" % name)
mirrors.pop(name)
spack.config.set("mirrors", mirrors, scope=scope)
tty.msg("Removed mirror %s." % name)
class MirrorStats:
def __init__(self):
self.present = {}
self.new = {}
self.errors = set()
self.current_spec = None
self.added_resources = set()
self.existing_resources = set()
def next_spec(self, spec):
self._tally_current_spec()
self.current_spec = spec
def _tally_current_spec(self):
if self.current_spec:
if self.added_resources:
self.new[self.current_spec] = len(self.added_resources)
if self.existing_resources:
self.present[self.current_spec] = len(self.existing_resources)
self.added_resources = set()
self.existing_resources = set()
self.current_spec = None
def stats(self):
self._tally_current_spec()
return list(self.present), list(self.new), list(self.errors)
def already_existed(self, resource):
# If an error occurred after caching a subset of a spec's
# resources, a secondary attempt may consider them already added
if resource not in self.added_resources:
self.existing_resources.add(resource)
def added(self, resource):
self.added_resources.add(resource)
def error(self):
self.errors.add(self.current_spec)
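The intended MirrorStats lifecycle, sketched with placeholder spec and resource names:

stats = MirrorStats()
stats.next_spec("zlib@1.3")                 # begin tallying a spec
stats.added("zlib-1.3.tar.gz")              # one resource newly cached
stats.next_spec("cmake@3.30")               # rolls the zlib tallies into the totals
stats.already_existed("cmake-3.30.tar.gz")  # one resource found in place
present, new, errors = stats.stats()        # specs already present / newly mirrored / errored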
def create_mirror_from_package_object(
pkg_obj, mirror_cache: "spack.caches.MirrorCache", mirror_stats: MirrorStats
) -> bool:
"""Add a single package object to a mirror.
The package object is only required to have an associated spec
with a concrete version.
Args:
pkg_obj (spack.package_base.PackageBase): package object to be added.
mirror_cache: mirror where to add the spec.
mirror_stats: statistics on the current mirror
Return:
True if the spec was added successfully, False otherwise
"""
tty.msg("Adding package {} to mirror".format(pkg_obj.spec.format("{name}{@version}")))
max_retries = 3
for num_retries in range(max_retries):
try:
# Includes patches and resources
with pkg_obj.stage as pkg_stage:
pkg_stage.cache_mirror(mirror_cache, mirror_stats)
break
except Exception as e:
if num_retries + 1 == max_retries:
if spack.config.get("config:debug"):
traceback.print_exc()
else:
tty.warn(
"Error while fetching %s" % pkg_obj.spec.format("{name}{@version}"), str(e)
)
mirror_stats.error()
return False
return True
def require_mirror_name(mirror_name):
"""Find a mirror by name and raise if it does not exist"""
mirror = MirrorCollection().get(mirror_name)
if not mirror:
raise ValueError(f'no mirror named "{mirror_name}"')
return mirror

View File

@@ -71,12 +71,16 @@ def _filter_compiler_wrappers_impl(pkg_or_builder):
x = llnl.util.filesystem.FileFilter(*abs_files)
compiler_vars = [
("CC", pkg.compiler.cc),
("CXX", pkg.compiler.cxx),
("F77", pkg.compiler.f77),
("FC", pkg.compiler.fc),
]
compiler_vars = []
if "c" in pkg.spec:
compiler_vars.append(("CC", pkg.spec["c"].package.cc))
if "cxx" in pkg.spec:
compiler_vars.append(("CXX", pkg.spec["cxx"].package.cxx))
if "fortran" in pkg.spec:
compiler_vars.append(("FC", pkg.spec["fortran"].package.fortran))
compiler_vars.append(("F77", pkg.spec["fortran"].package.fortran))
# Some paths to the compiler wrappers might be substrings of the others.
# For example:
@@ -104,7 +108,11 @@ def _filter_compiler_wrappers_impl(pkg_or_builder):
x.filter(wrapper_path, compiler_path, **filter_kwargs)
# Remove this linking flag if present (it turns RPATH into RUNPATH)
x.filter("{0}--enable-new-dtags".format(pkg.compiler.linker_arg), "", **filter_kwargs)
for compiler_lang in ("c", "cxx", "fortran"):
if compiler_lang not in pkg.spec:
continue
compiler_pkg = pkg.spec[compiler_lang].package
x.filter(f"{compiler_pkg.linker_arg}--enable-new-dtags", "", **filter_kwargs)
# NAG compiler is usually mixed with GCC, which has a different
# prefix for linker arguments.
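The substring hazard noted above, in miniature (plain str.replace stands in for Spack's FileFilter; paths are invented). Substituting the longest wrapper path first keeps a path that is a prefix of another from being clobbered:

text = "CC=/wrappers/cc\nCXX=/wrappers/cc-wrapper\n"
replacements = {"/wrappers/cc": "/usr/bin/gcc", "/wrappers/cc-wrapper": "/usr/bin/g++"}
# Longest first, so "/wrappers/cc" cannot eat the front of "/wrappers/cc-wrapper"
for old in sorted(replacements, key=len, reverse=True):
    text = text.replace(old, replacements[old])
print(text)  # CC=/usr/bin/gcc, CXX=/usr/bin/g++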

View File

@@ -330,18 +330,17 @@ class BaseConfiguration:
default_projections = {"all": "{name}/{version}-{compiler.name}-{compiler.version}"}
def __init__(self, spec: spack.spec.Spec, module_set_name: str, explicit: bool) -> None:
# Module where type(self) is defined
m = inspect.getmodule(self)
assert m is not None # make mypy happy
self.module = m
# Spec for which we want to generate a module file
self.spec = spec
self.name = module_set_name
self.explicit = explicit
# Dictionary of configuration options that should be applied
# to the spec
# Dictionary of configuration options that should be applied to the spec
self.conf = merge_config_rules(self.module.configuration(self.name), self.spec)
@property
def module(self):
return inspect.getmodule(self)
@property
def projections(self):
"""Projection from specs to module names"""

View File

@@ -6,12 +6,13 @@
import collections
import itertools
import os.path
import pathlib
from typing import Dict, List, Optional, Tuple
import llnl.util.filesystem as fs
import llnl.util.lang as lang
import spack.compilers
import spack.compilers.config
import spack.config
import spack.error
import spack.repo
@@ -59,7 +60,7 @@ def make_context(
return LmodContext(make_configuration(spec, module_set_name, explicit))
def guess_core_compilers(name, store=False) -> List[spack.spec.CompilerSpec]:
def guess_core_compilers(name, store=False) -> List[spack.spec.Spec]:
"""Guesses the list of core compilers installed in the system.
Args:
@@ -70,16 +71,12 @@ def guess_core_compilers(name, store=False) -> List[spack.spec.CompilerSpec]:
List of found core compilers
"""
core_compilers = []
for compiler in spack.compilers.all_compilers():
for compiler in spack.compilers.config.all_compilers():
try:
# A compiler is considered to be a core compiler if any of the
# C, C++ or Fortran compilers reside in a system directory
is_system_compiler = any(
os.path.dirname(getattr(compiler, x, "")) in spack.util.environment.SYSTEM_DIRS
for x in ("cc", "cxx", "f77", "fc")
)
cc_dir = pathlib.Path(compiler.package.cc).parent
is_system_compiler = str(cc_dir) in spack.util.environment.SYSTEM_DIRS
if is_system_compiler:
core_compilers.append(compiler.spec)
core_compilers.append(compiler)
except (KeyError, TypeError, AttributeError):
continue
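The new heuristic in isolation (SYSTEM_DIRS reduced to a sample; the compiler path is invented):

import pathlib

SYSTEM_DIRS = ["/usr/bin", "/bin"]  # stand-in for spack.util.environment.SYSTEM_DIRS
cc = "/usr/bin/gcc"
is_system_compiler = str(pathlib.Path(cc).parent) in SYSTEM_DIRS  # -> True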
@@ -101,18 +98,32 @@ class LmodConfiguration(BaseConfiguration):
default_projections = {"all": "{name}/{version}"}
def __init__(self, spec: spack.spec.Spec, module_set_name: str, explicit: bool) -> None:
super().__init__(spec, module_set_name, explicit)
# FIXME (compiler as nodes): make this a bit more robust
candidates = collections.defaultdict(list)
for node in spec.traverse(deptype=("link", "run")):
candidates["c"].extend(node.dependencies(virtuals="c"))
candidates["cxx"].extend(node.dependencies(virtuals="c"))
# FIXME (compiler as nodes): decide what to do when we have more than one C compiler
if candidates["c"] and len(set(candidates["c"])) == 1:
self.compiler = candidates["c"][0]
elif not candidates["c"]:
self.compiler = None
@property
def core_compilers(self) -> List[spack.spec.CompilerSpec]:
def core_compilers(self) -> List[spack.spec.Spec]:
"""Returns the list of "Core" compilers
Raises:
CoreCompilersNotFoundError: if the key was not
specified in the configuration file or the sequence
is empty
CoreCompilersNotFoundError: if the key was not specified in the configuration file or
the sequence is empty
"""
compilers = [
spack.spec.CompilerSpec(c) for c in configuration(self.name).get("core_compilers", [])
]
compilers = []
for c in configuration(self.name).get("core_compilers", []):
compilers.extend(spack.spec.Spec(f"%{c}").dependencies())
if not compilers:
compilers = guess_core_compilers(self.name, store=True)
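What the new lookup does per configured name, assuming the compilers-as-nodes parsing shown later in this diff (where "%gcc@12" attaches gcc as a direct build dependency of an anonymous spec):

import spack.spec

c = "gcc@12"  # a value from core_compilers in the module configuration
compilers = spack.spec.Spec(f"%{c}").dependencies()
# expected: [Spec("gcc@12")] -- the dependency node itself, not a CompilerSpec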
@@ -161,12 +172,15 @@ def hierarchy_tokens(self):
@property
@lang.memoized
def requires(self):
"""Returns a dictionary mapping all the requirements of this spec
to the actual provider. 'compiler' is always present among the
requirements.
"""Returns a dictionary mapping all the requirements of this spec to the actual provider.
The 'compiler' key is always present among the requirements.
"""
# If it's a core_spec, lie and say it requires a core compiler
if any(self.spec.satisfies(core_spec) for core_spec in self.core_specs):
if (
any(self.spec.satisfies(core_spec) for core_spec in self.core_specs)
or self.compiler is None
):
return {"compiler": self.core_compilers[0]}
hierarchy_filter_list = []
@@ -177,7 +191,8 @@ def requires(self):
# Keep track of the requirements that this package has in terms
# of virtual packages that participate in the hierarchical structure
requirements = {"compiler": self.spec.compiler}
requirements = {"compiler": self.compiler}
# For each virtual dependency in the hierarchy
for x in self.hierarchy_tokens:
# Skip anything filtered for this spec
@@ -200,17 +215,17 @@ def provides(self):
# virtual dependencies in spack
# If it is in the list of supported compilers family -> compiler
if self.spec.name in spack.compilers.supported_compilers():
provides["compiler"] = spack.spec.CompilerSpec(self.spec.format("{name}{@versions}"))
elif self.spec.name in spack.compilers.package_name_to_compiler_name:
if self.spec.name in spack.compilers.config.supported_compilers():
provides["compiler"] = spack.spec.Spec(self.spec.format("{name}{@versions}"))
elif self.spec.name in spack.compilers.config.package_name_to_compiler_name:
# If it is the package for a supported compiler, but of a different name
cname = spack.compilers.package_name_to_compiler_name[self.spec.name]
provides["compiler"] = spack.spec.CompilerSpec(cname, self.spec.versions)
cname = spack.compilers.config.package_name_to_compiler_name[self.spec.name]
provides["compiler"] = spack.spec.Spec(cname, self.spec.versions)
# All the other tokens in the hierarchy must be virtual dependencies
for x in self.hierarchy_tokens:
if self.spec.package.provides(x):
provides[x] = self.spec[x]
provides[x] = self.spec
return provides
@property
@@ -301,12 +316,10 @@ def path_part_fmt(token):
# If we are dealing with a core compiler, return 'Core'
core_compilers = self.conf.core_compilers
if name == "compiler" and any(
spack.spec.CompilerSpec(value).satisfies(c) for c in core_compilers
):
if name == "compiler" and any(spack.spec.Spec(value).satisfies(c) for c in core_compilers):
return "Core"
# CompilerSpec does not have a hash, as we are not allowed to
# Spec does not have a hash, as we are not allowed to
# use different flavors of the same compiler
if name == "compiler":
return path_part_fmt(token=value)

View File

@@ -16,7 +16,8 @@
import llnl.util.tty as tty
import spack.fetch_strategy
import spack.mirror
import spack.mirrors.layout
import spack.mirrors.mirror
import spack.oci.opener
import spack.stage
import spack.util.url
@@ -213,7 +214,7 @@ def upload_manifest(
return digest, size
def image_from_mirror(mirror: spack.mirror.Mirror) -> ImageReference:
def image_from_mirror(mirror: spack.mirrors.mirror.Mirror) -> ImageReference:
"""Given an OCI based mirror, extract the URL and image name from it"""
url = mirror.push_url
if not url.startswith("oci://"):
@@ -385,5 +386,8 @@ def make_stage(
# is the `oci-layout` and `index.json` files, which are
# required by the spec.
return spack.stage.Stage(
fetch_strategy, mirror_paths=spack.mirror.OCILayout(digest), name=digest.digest, keep=keep
fetch_strategy,
mirror_paths=spack.mirrors.layout.OCILayout(digest),
name=digest.digest,
keep=keep,
)

View File

@@ -20,7 +20,7 @@
import llnl.util.lang
import spack.config
import spack.mirror
import spack.mirrors.mirror
import spack.parser
import spack.util.web
@@ -367,11 +367,11 @@ def http_error_401(self, req: Request, fp, code, msg, headers):
def credentials_from_mirrors(
domain: str, *, mirrors: Optional[Iterable[spack.mirror.Mirror]] = None
domain: str, *, mirrors: Optional[Iterable[spack.mirrors.mirror.Mirror]] = None
) -> Optional[UsernamePassword]:
"""Filter out OCI registry credentials from a list of mirrors."""
mirrors = mirrors or spack.mirror.MirrorCollection().values()
mirrors = mirrors or spack.mirrors.mirror.MirrorCollection().values()
for mirror in mirrors:
# Prefer push credentials over fetch. Unlikely that those are different

View File

@@ -32,7 +32,6 @@
from llnl.util.lang import classproperty, memoized
from llnl.util.link_tree import LinkTree
import spack.compilers
import spack.config
import spack.dependency
import spack.deptypes as dt
@@ -40,7 +39,8 @@
import spack.error
import spack.fetch_strategy as fs
import spack.hooks
import spack.mirror
import spack.mirrors.layout
import spack.mirrors.mirror
import spack.multimethod
import spack.patch
import spack.phase_callbacks
@@ -52,8 +52,10 @@
import spack.util.path
import spack.util.web
import spack.variant
from spack.compilers.adaptor import DeprecatedCompiler
from spack.error import InstallError, NoURLError, PackageError
from spack.filesystem_view import YamlFilesystemView
from spack.resource import Resource
from spack.solver.version_order import concretization_version_order
from spack.stage import DevelopStage, ResourceStage, Stage, StageComposite, compute_stage_name
from spack.util.package_hash import package_hash
@@ -64,10 +66,9 @@
]
FLAG_HANDLER_TYPE = Callable[[str, Iterable[str]], FLAG_HANDLER_RETURN_TYPE]
"""Allowed URL schemes for spack packages."""
#: Allowed URL schemes for spack packages
_ALLOWED_URL_SCHEMES = ["http", "https", "ftp", "file", "git"]
#: Filename for the Spack build/install log.
_spack_build_logfile = "spack-build-out.txt"
@@ -578,6 +579,8 @@ class PackageBase(WindowsRPath, PackageViewMixin, metaclass=PackageMeta):
"""
compiler = DeprecatedCompiler()
#
# These are default values for instance variables.
#
@@ -585,6 +588,7 @@ class PackageBase(WindowsRPath, PackageViewMixin, metaclass=PackageMeta):
# Declare versions dictionary as placeholder for values.
# This allows analysis tools to correctly interpret the class attributes.
versions: dict
resources: Dict[spack.spec.Spec, List[Resource]]
dependencies: Dict[spack.spec.Spec, Dict[str, spack.dependency.Dependency]]
conflicts: Dict[spack.spec.Spec, List[Tuple[spack.spec.Spec, Optional[str]]]]
requirements: Dict[
@@ -595,6 +599,7 @@ class PackageBase(WindowsRPath, PackageViewMixin, metaclass=PackageMeta):
patches: Dict[spack.spec.Spec, List[spack.patch.Patch]]
variants: Dict[spack.spec.Spec, Dict[str, spack.variant.Variant]]
languages: Dict[spack.spec.Spec, Set[str]]
licenses: Dict[spack.spec.Spec, str]
splice_specs: Dict[spack.spec.Spec, Tuple[spack.spec.Spec, Union[None, str, List[str]]]]
#: Store whether a given Spec source/binary should not be redistributed.
@@ -1184,10 +1189,10 @@ def _make_resource_stage(self, root_stage, resource):
root=root_stage,
resource=resource,
name=self._resource_stage(resource),
mirror_paths=spack.mirror.default_mirror_layout(
mirror_paths=spack.mirrors.layout.default_mirror_layout(
resource.fetcher, os.path.join(self.name, pretty_resource_name)
),
mirrors=spack.mirror.MirrorCollection(source=True).values(),
mirrors=spack.mirrors.mirror.MirrorCollection(source=True).values(),
path=self.path,
)
@@ -1199,7 +1204,7 @@ def _make_root_stage(self, fetcher):
# Construct a mirror path (TODO: get this out of package.py)
format_string = "{name}-{version}"
pretty_name = self.spec.format_path(format_string)
mirror_paths = spack.mirror.default_mirror_layout(
mirror_paths = spack.mirrors.layout.default_mirror_layout(
fetcher, os.path.join(self.name, pretty_name), self.spec
)
# Construct a path where the stage should build.
@@ -1208,7 +1213,7 @@ def _make_root_stage(self, fetcher):
stage = Stage(
fetcher,
mirror_paths=mirror_paths,
mirrors=spack.mirror.MirrorCollection(source=True).values(),
mirrors=spack.mirrors.mirror.MirrorCollection(source=True).values(),
name=stage_name,
path=self.path,
search_fn=self._download_search,
@@ -1494,15 +1499,6 @@ def prefix(self):
def home(self):
return self.prefix
@property # type: ignore[misc]
@memoized
def compiler(self):
"""Get the spack.compiler.Compiler object used to build this package"""
if not self.spec.concrete:
raise ValueError("Can only get a compiler for a concrete package.")
return spack.compilers.compiler_for_spec(self.spec.compiler, self.spec.architecture)
def url_version(self, version):
"""
Given a version, this returns a string that should be substituted
@@ -1614,7 +1610,7 @@ def do_stage(self, mirror_only=False):
self.stage.create()
# Fetch/expand any associated code.
if self.has_code:
if self.has_code and not self.spec.external:
self.do_fetch(mirror_only)
self.stage.expand_archive()
else:
@@ -1944,17 +1940,14 @@ def _resource_stage(self, resource):
return resource_stage_folder
def do_test(self, dirty=False, externals=False):
if self.test_requires_compiler:
compilers = spack.compilers.compilers_for_spec(
self.spec.compiler, arch_spec=self.spec.architecture
if self.test_requires_compiler and not any(
lang in self.spec for lang in ("c", "cxx", "fortran")
):
tty.error(
f"Skipping tests for package {self.spec}, since a compiler is required, "
f"but not available"
)
if not compilers:
tty.error(
"Skipping tests for package %s\n"
% self.spec.format("{name}-{version}-{hash:7}")
+ "Package test requires missing compiler %s" % self.spec.compiler
)
return
return
kwargs = {
"dirty": dirty,

View File

@@ -375,12 +375,11 @@ def all_specs(self) -> List["spack.spec.Spec"]:
class SpecNodeParser:
"""Parse a single spec node from a stream of tokens"""
__slots__ = "ctx", "has_compiler", "has_version", "literal_str"
__slots__ = "ctx", "has_version", "literal_str"
def __init__(self, ctx, literal_str):
self.ctx = ctx
self.literal_str = literal_str
self.has_compiler = False
self.has_version = False
def parse(
@@ -427,23 +426,24 @@ def add_flag(name: str, value: str, propagate: bool):
raise_parsing_error(str(e), e)
while True:
if self.ctx.accept(TokenType.COMPILER):
if self.has_compiler:
raise_parsing_error("Spec cannot have multiple compilers")
if self.ctx.accept(TokenType.COMPILER) or self.ctx.accept(
TokenType.COMPILER_AND_VERSION
):
build_dependency = spack.spec.Spec(self.ctx.current_token.value[1:])
name_conversion = {
"clang": "llvm",
"oneapi": "intel-oneapi-compilers",
"rocmcc": "llvm-amdgpu",
"intel": "intel-oneapi-compiler-classic",
"arm": "acfl",
}
compiler_name = self.ctx.current_token.value[1:]
initial_spec.compiler = spack.spec.CompilerSpec(compiler_name.strip(), ":")
self.has_compiler = True
if build_dependency.name in name_conversion:
build_dependency.name = name_conversion[build_dependency.name]
elif self.ctx.accept(TokenType.COMPILER_AND_VERSION):
if self.has_compiler:
raise_parsing_error("Spec cannot have multiple compilers")
compiler_name, compiler_version = self.ctx.current_token.value[1:].split("@")
initial_spec.compiler = spack.spec.CompilerSpec(
compiler_name.strip(), compiler_version
initial_spec._add_dependency(
build_dependency, depflag=spack.deptypes.BUILD, virtuals=(), direct=True
)
self.has_compiler = True
elif (
self.ctx.accept(TokenType.VERSION_HASH_PAIR)
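The renaming step in isolation (a toy sketch: just the mapping and string handling, none of the parser machinery):

name_conversion = {
    "clang": "llvm",
    "oneapi": "intel-oneapi-compilers",
    "rocmcc": "llvm-amdgpu",
    "intel": "intel-oneapi-compiler-classic",
    "arm": "acfl",
}
token = "%clang@17"
name = token[1:].split("@")[0]           # strip '%', drop any version
print(name_conversion.get(name, name))   # -> "llvm"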

View File

@@ -16,7 +16,8 @@
import spack
import spack.error
import spack.fetch_strategy
import spack.mirror
import spack.mirrors.layout
import spack.mirrors.mirror
import spack.repo
import spack.stage
import spack.util.spack_json as sjson
@@ -329,12 +330,12 @@ def stage(self) -> "spack.stage.Stage":
name = "{0}-{1}".format(os.path.basename(self.url), fetch_digest[:7])
per_package_ref = os.path.join(self.owner.split(".")[-1], name)
mirror_ref = spack.mirror.default_mirror_layout(fetcher, per_package_ref)
mirror_ref = spack.mirrors.layout.default_mirror_layout(fetcher, per_package_ref)
self._stage = spack.stage.Stage(
fetcher,
name=f"{spack.stage.stage_prefix}patch-{fetch_digest}",
mirror_paths=mirror_ref,
mirrors=spack.mirror.MirrorCollection(source=True).values(),
mirrors=spack.mirrors.mirror.MirrorCollection(source=True).values(),
)
return self._stage

Some files were not shown because too many files have changed in this diff.