Compare commits


72 Commits

Author SHA1 Message Date
Harmen Stoppels
a3cbafd87c PackageBase: revamp libs and headers API
Add `find_libs`, `find_headers`, `query_libs`, and `query_headers` to
`PackageBase` and implement `SpecBuildInterface` in terms of those.

The old-style `libs` and `headers` properties are deprecated but take
priority over the new-style `find_libs` and `find_headers` methods.
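
A rough sketch of how a package might use the new hooks (the method names are from
this commit; the exact signatures, and the use of the `find_libraries`/`find_headers`
helpers, are assumptions):

```python
# Hedged sketch -- method names from this commit, signatures assumed.
from spack.package import *


class Foo(Package):
    """Example package opting into the new-style libs/headers API."""

    def find_libs(self):
        # Return this package's libraries explicitly (a LibraryList).
        return find_libraries("libfoo", root=self.prefix, recursive=True)

    def find_headers(self):
        # Return this package's headers explicitly (a HeaderList). Note this
        # calls the module-level find_headers helper, not the method itself.
        return find_headers("foo", root=self.prefix.include)
```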
2025-03-27 10:22:11 +01:00
Todd Gamblin
199133fca4 bugfix: concretization shouldn't require concrete packages to be known (#49706)
Concretization setup was checking whether any input spec has a dependency
that's *not* in the set of possible dependencies for all roots in the solve.
There are two reasons to check this:

1. The user could be asking for a dependency that none of the roots has, or
2. The user could be asking for a dependency that doesn't exist.

For abstract roots, (2) implies (1), and the check makes sense.  For concrete
roots, we don't care, because the spec has already been built. If a `package.py`
no longer depends on something it did before, it doesn't matter -- it's already
built. If the dependency no longer exists, we also do not care -- we already
built it and there's an installation for it somewhere. 

When you concretize an environment with a lockfile, *many* of the input specs
are concrete, and we don't need to build them. If a package changes its
dependencies, or if a `package.py` is removed for a concrete input spec, that
shouldn't cause an already-built environment to fail concretization.

A user reported that this was happening with an error like:

```console
spack concretize
==> Error: Package chapel does not depend on py-protobuf@5.28.2/a4rf4glr2tntfwsz6myzwmlk5iu25t74
```

Or, with traceback:
```console
  File "/apps/other/spack-devel/lib/spack/spack/solver/asp.py", line 3014, in setup
    raise spack.spec.InvalidDependencyError(spec.name, missing_deps)
spack.spec.InvalidDependencyError: Package chapel does not depend on py-protobuf@5.28.2/a4rf4glr2tntfwsz6myzwmlk5iu25t74
```

Fix this by skipping the check for concrete input specs. We already ignore conflicts,
etc. for concrete/external specs, and we do not need metadata in the solve for
concrete dependencies because they're imposed by hash constraints.

- [x] Ignore the package existence check for concrete input specs.
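
The shape of the fix, as a hedged sketch (the real check lives in
`lib/spack/spack/solver/asp.py`; the names below are illustrative):

```python
# Illustrative sketch of the fix, not the literal asp.py code.
import spack.spec


def check_input_specs(input_specs, possible_deps):
    """Raise only if an *abstract* input spec asks for an impossible dependency."""
    for spec in input_specs:
        if spec.concrete:
            # Already built: its dependencies are pinned by hash constraints,
            # so we need no package metadata (or even a package.py) to reuse it.
            continue
        missing = [d.name for d in spec.traverse() if d.name not in possible_deps]
        if missing:
            raise spack.spec.InvalidDependencyError(spec.name, missing)
```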

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2025-03-26 23:58:12 +00:00
Alec Scott
ea3a3b51a0 ci: Skip removed packages during automatic checksum verification (#49681)
* Skip removed packages during automatic checksum verification

* Unify finding modified or added packages with spack.repo logic

* Remove unused imports

* Fix unit-tests using shared modified function

* Update last remaining unit test to new format
2025-03-26 16:47:11 -07:00
Robert Maaskant
23bd3e6104 glab: add v1.54.0 and v1.55.0 (#49689)
* glab: add v1.54.0 and v1.55.0

* glab: uniform go deps

Co-authored-by: Alec Scott <hi@alecbcs.com>

---------

Co-authored-by: Alec Scott <hi@alecbcs.com>
2025-03-26 14:48:18 -06:00
Robert Maaskant
c72477e67a gtk-doc: use download.gnome.org for downloads (#49660) 2025-03-26 12:03:41 -06:00
Todd Gamblin
2d2a4d1908 bugfix: display old-style compilers without @= in spack find output (#49693)
* bugfix: display old-style compilers without `@=` in `spack find` output

Fix `display_str` attribute of `DeprecatedCompiler` so that `spack find` displays
compilers without `@=` for the version (as expected).

- [x] Use `spec.format("{@version}")` to do this

before:
```
> spack find
-- darwin-sequoia-m1 / apple-clang@=16.0.0 -----------------------
abseil-cpp@20240722.0               py-graphviz@0.13.2
apple-libuuid@1353.100.2            py-hatch-fancy-pypi-readme@23.1.0
```

after:
```
> spack find
-- darwin-sequoia-m1 / apple-clang@16.0.0 -----------------------
abseil-cpp@20240722.0               py-graphviz@0.13.2
apple-libuuid@1353.100.2            py-hatch-fancy-pypi-readme@23.1.0
```
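
A hedged sketch of what the fixed property boils down to (the real code is on
`DeprecatedCompiler`; this stand-alone class is only illustrative):

```python
class CompilerDisplay:
    """Illustrative stand-in for the DeprecatedCompiler.display_str fix."""

    def __init__(self, spec):
        self.spec = spec

    @property
    def display_str(self) -> str:
        # spec.format("{@version}") renders "@16.0.0", not "@=16.0.0".
        return f"{self.spec.name}{self.spec.format('{@version}')}"
```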

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>

* [@spackbot] updating style on behalf of tgamblin

---------

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2025-03-26 10:34:05 -07:00
Afzal Patel
2cd773aea4 py-jaxlib: add spack-built ROCm support (#49611)
* py-jaxlib: add spack-built ROCm support

* fix style

* py-jaxlib 0.4.38 rocm support

* py-jaxlib 0.4.38 rocm support

* add comgr dependency

* changes for ROCm external and enable up to 0.4.38

* enable version of py-jax

* add jax+rocm to ci

* add conflict for cuda and remove py-jaxlib from aarch64 pipeline

* Update var/spack/repos/builtin/packages/py-jaxlib/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* add conflict for aarch64

---------

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2025-03-26 09:23:52 -06:00
Mikael Simberg
145b0667cc asio: add 1.34.0 (#49529) 2025-03-26 14:01:01 +01:00
Massimiliano Culpo
5b3942a489 Turn compilers into nodes (#45189)
## Summary

Compilers stop being a *node attribute* and become a *build-only* dependency.

Packages may declare a dependency on the `c`, `cxx`, or `fortran` languages, which
are now treated as virtuals; compilers are *providers* of one or more of those
languages. Compilers can also inject a runtime dependency on the node being
compiled. An example graph for something as simple as `zlib-ng` is the following:

<p align="center">
<img src="https://github.com/user-attachments/assets/ee6471cb-09fd-4127-9f16-b9fe6d1338ac" alt="zlib-ng DAG" width="80%" height="auto">
</p>

Here `gcc` is used for both the `c`, and `cxx` languages. Edges are annotated with
the virtuals they satisfy (`c`, `cxx`, `libc`). `gcc` injects `gcc-runtime` on the nodes
being compiled. `glibc` is also injected for packages that require `c`. The
`compiler-wrapper` is explicitly represented as a node in the DAG, and is included in
the hash.

This change in the model has implications on the semantics of the `%` sigil, as
discussed in #44379, and requires a version bump for our `Specfile`, `Database`,
and `Lockfile` formats.

## Breaking changes

Breaking changes below may impact users of this branch.

### 1. Custom, non-numeric versions of compilers are not supported

Currently, users can assign to compilers any custom version they want, and Spack
will try to recover the "real version" whenever the custom version fails some operation.
To deduce the "real version" Spack must run the compiler, which can add needless
overhead to common operations.

Since any information that a version like `gcc@foo` might give the user can also be
expressed as a suffix on the correct numeric version, e.g. `gcc@10.5.0-foo`, Spack
will **no longer try** to deduce real versions for compilers.

In other words, users should not expect `gcc@foo` to behave as `gcc@X.Y.Z` internally.

### 2. The `%` sigil in the spec syntax means "direct build dependency"

The `%` sigil in the spec syntax means *"direct build dependency"*, and is not a node
attribute anymore. This means that:

```python
node.satisfies("%gcc")
``` 
is true only if `gcc` is a direct build dependency of the node. *Nodes without a compiler
dependency are allowed.*

### 3. `parent["child"]` and `node in spec` will now only inspect the link/run sub-DAG and direct build dependencies

The subscript notation for `Spec`:

```python
parent["child"]
```

will look for a `child` node only in the link/run transitive graph of `parent`, and in its
direct build dependencies. This means that to reach a transitive build dependency,
we must first pass through the node it is associated with. 

Assuming `parent` does not depend on `cmake`, but depends on a `CMakePackage`,
e.g. `hdf5`, then we have the following situation:

```python
# This one raises an Exception, since "parent" does not depend on cmake
parent["cmake"]
# This one is ok
cmake = parent["hdf5"]["cmake"]
```

### 4. Externals differing by just the compiler attribute

Externals are nodes where dependencies are trimmed, and that _is not planned to
change_ in this branch. Currently, on `develop` it is ok to write:

```yaml
packages:
  hdf5:
    externals:
    - spec: hdf5@1.12 %gcc
      prefix: /prefix/gcc
    - spec: hdf5@1.12 %clang
      prefix: /prefix/clang
```
and Spack will account for the compiler node attribute when computing the optimal
spec. In this branch, using externals with a compiler specified is allowed only if a
compiler in the DAG matches the constraints specified on the external. _The external
will still be represented as a single node without dependencies_.

### 5. Spec matrices enforcing a compiler

Currently we can have matrices of the form:

```yaml
matrix:
- [x, y, z]
- [%gcc, %clang]
```
to get the cross-product of specs and compilers. Currently we can disregard the
nature of the packages in the first row, since the compiler is a node attribute
required on each node.

In this branch, instead, we require a spec to depend on `c`, `cxx`, or `fortran` for
the `%` to have any meaning. If any spec in the first row doesn't depend on these
languages, concretization will fail with an error.

## Deprecations

* The entire `compilers` section in the configuration (i.e., `compilers.yaml`) has been
  deprecated, and current entries will be removed in v1.2.0. For the time being, if Spack
  finds any `compilers` configuration, it will try to convert it automatically to a set of
  external packages.
* The `packages:compiler` soft-preference has been deprecated. It will be removed
  in v1.1.0.

## Other notable changes

* The tokens `{compiler}`, `{compiler.version}`, and `{compiler.name}` in `Spec.format`
  expand to `"none"` if a Spec does not depend on C, C++, or Fortran.
* The default install tree layout is now
  `"{architecture.platform}-{architecture.target}/{name}-{version}-{hash}"`

## Known limitations

The major known limitation of this branch that we intend to fix before v1.0 is that
compilers cannot be bootstrapped directly.

In this branch we can build a new compiler using an existing external compiler, for instance:

```
$ spack install gcc@14 %gcc@10.5.0
```

where `gcc@10.5.0` is external, and `gcc@14` is to be built.

What we can't do at the moment is use a yet-to-be-built compiler and expect it to be
bootstrapped, e.g.:

```
spack install hdf5 %gcc@14
```

We plan to tackle this issue in a follow-up PR.

---------

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
Signed-off-by: Harmen Stoppels <me@harmenstoppels.nl>
Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2025-03-25 22:32:49 -06:00
Todd Gamblin
a9c879d53e gdk-pixbuf: Use the official GNOME mirror. (#49690)
The `umea.se` mirror seems to have gone down (or at least is forbidden for now).

Revert the checksum changes in #47825; this points at the official GNOME mirror
instead of the prior two places we were getting `gdk-pixbuf` from.

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2025-03-25 18:57:53 -06:00
Todd Gamblin
f42f59c84b concretizer: don't use clingo.Symbol for setup (#49650)
Since we moved from creating clingo symbols directly to constructing a pure string
representation of the program, we don't need to make `AspFunctions` into symbols before
turning them into strings. We can just write strings like clingo would.

This cuts about 25% off the setup time by avoiding an unnecessary round trip.

- [x] create strings directly from `AspFunctions`
- [x] remove unused `symbol()` method on `AspFunction`
- [x] setup no longer tries to call `symbol()`
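
Conceptually, the change amounts to printing facts the way clingo would, without
constructing `clingo.Symbol` objects first. A self-contained, hedged illustration
(not the actual `AspFunction` code):

```python
# Illustrative: emit ASP facts as strings, formatted as clingo would print them.
def asp_fact(name: str, *args) -> str:
    def quote(a):
        return f'"{a}"' if isinstance(a, str) else str(a)

    return f"{name}({','.join(quote(a) for a in args)})."


print(asp_fact("attr", "node", "zlib"))  # -> attr("node","zlib").
```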

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
Signed-off-by: Greg Becker <becker33@llnl.gov>

---------

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
Co-authored-by: Greg Becker <becker33@llnl.gov>
2025-03-25 17:55:27 -07:00
Jon Rood
313b7d4cdb nalu-wind: add version 2.2.2. (#49685) 2025-03-25 15:45:23 -07:00
Melven Roehrig-Zoellner
bd41863797 scorep: ensure gcc-plugin is built, patch gcc@14 (#49257)
* scorep: ensure gcc-plugin is built, patch gcc@14
* scorep: patch only to non-deprecated versions
2025-03-25 14:45:53 -07:00
Robert Maaskant
b0dba4ff5a yarn: add v4.6.0, v4.7.0 (#49177)
* yarn: v4.6.0
* py-ipympl: pin yarn to v1
* rstudio: pin yarn to v1
* yarn: add v4.7.0
2025-03-25 14:37:44 -07:00
Alec Scott
4ff43d7fa9 ci: future-proof for enabling GitHub merge queues later (#49665) 2025-03-25 10:07:37 -07:00
Jon Rood
c1df1c7ee5 trilinos: fix kokkos constraints for version 16 (#49643)
* trilinos: add equals sign to kokkos dependencies.

* Fix some license headers to pass style check.

* Generalize a bit.

* Generalize a bit more.

* datatransferkit: constrain to a maximum of trilinos@16.0.
2025-03-25 10:43:13 -06:00
arezaii
9ac6ecd5ba Chapel 2.4 (#49662)
* limit some patches by chapel version

* fix short output version if building main

* update patches, remove unneeded 'self' refs

* fix spack style

* update patches with changes from PR

* change py-protobuf to just protobuf dep

* add PR numbers for patches

* fix spack style

* update 2.4 sha256
2025-03-25 09:01:58 -07:00
Todd Gamblin
20ddb85020 setup-env.csh: Harden for people who like aliases (#49670)
A user had `grep` aliased to `grep -n`, which was causing `csh` setup to
fail due to number prefixes in `SPACK_ROOT`.

- [x] Prefix invocations of `grep` and `sed` (which are not builtins) with `\`
      to avoid any aliases.
- [x] Avoid using `dirname` altogether -- use csh's `:h` modifier (which does
      the same thing) instead.

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2025-03-25 09:01:28 -07:00
Nicholas Sly
2ced87297d Add dbus patch for container builds. (#49402) 2025-03-24 19:00:39 -06:00
Piotr Sacharuk
aa00c3fe1f trilinos: Apply workaround for oneAPI compiler for problems with build (#49636)
* Fix problem at least with datatransferkit

* Include patch 11676 from trilinos

* Add patches for trilinos 13.4.1

* style check failed

* Update links for patches

* additional style check failed
2025-03-24 17:05:43 -07:00
psakievich
0158fc46aa Add recursive argument to spack develop (#46885)
* Add recursive argument to spack develop

This effort allows for a recursive develop call
which will traverse from the develop spec given back to the root(s)
and mark all packages along the path as develop.

If people are doing development across the graph, then paying
fetch and full-rebuild costs every time `spack develop` is called
is unnecessary and expensive.

Also remove the constraint for concrete specs and simply take the
max(version) if a version is not given. This should default to the
highest infinity version, which is also the logical best guess for
doing development.
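
Usage would look roughly like this (hypothetical environment and package names;
per the docs change later in this diff, `--recursive` requires an
already-concretized environment):

```console
$ spack -e myenv concretize
$ spack -e myenv develop --recursive hypre
```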
2025-03-24 16:50:16 -07:00
Richard Berger
8ac826cca8 hip: add missing HIPCC_LINK_FLAGS_APPEND (#49436)
* hip: add missing HIPCC_LINK_FLAGS_APPEND

---------

Co-authored-by: rbberger <rbberger@users.noreply.github.com>
2025-03-24 13:58:47 -07:00
Teague Sterling
1b829a4a28 kentutils: add v478 (#49521) 2025-03-24 13:33:41 -07:00
Robert Maaskant
e2ed1c2308 py-pymoo: add v0.6.1.3 (#49603)
* py-pymoo: add v0.6.1.3
* py-pymoo: use a when context
* py-pymoo: group build only dependencies
2025-03-24 13:29:05 -07:00
Robert Maaskant
94b828add1 prometheus: improve dependency specs (#49175)
* prometheus: improve dependency specs
* fixup! prometheus: improve dependency specs
* prometheus: fix typo in nodejs dep
* prometheus: fix checksums
2025-03-24 13:27:10 -07:00
Eric Berquist
fd7dcf3a3f sst-core: fix linkage against ncurses, zlib, and HDF5 (#49152)
* sst-core: fix for > 14.0.0 requiring ncurses

* sst-core: backport fix for curses detection

* sst-core: ensure HDF5 is ignored if not specified

* sst-core: HDF5 integration is via C++

* sst-core: switch to with_or_without for configure

* sst-core: switch to enable_or_disable for configure

* sst-core: control memory pools and debug output with variants
2025-03-24 12:45:12 -07:00
Alec Scott
e3bb0d77bc hugo: add v0.145.0 (#49576) 2025-03-24 13:23:21 -06:00
Jon Rood
25761b13e5 kokkos-kernels: rewrite package to fix errors (#49598)
* kokkos-kernels: fix eti dependency statements.

* kokkos-kernels: rewrite package.

* Fix errors.

* Style.

* Style.

* Cleanup.
2025-03-24 12:25:20 -06:00
Stephen Nicholas Swatman
ae48faa83a detray: add v0.90.0-v0.93.0 (#49658)
This commit adds detray versions 0.90.0, 0.91.0, 0.92.0, and 0.93.0.
2025-03-24 10:13:03 -07:00
Afzal Patel
e15a3b0717 hip: fix hip-tests error (#49563) 2025-03-24 10:04:19 -07:00
Andrey Perestoronin
2c8afc5443 Add new 2025.1.0 release for intel-oneapi products (#49642)
* Add new versions of intel-oneapi products

* restore advisor 2025.0.0 release

* fix styling
2025-03-24 11:02:55 -06:00
Sreenivasa Murthy Kolam
99479b7e77 rocprofiler-sdk: new package (#49406)
* rocprofiler-sdk new package
* add license, rocm tag
2025-03-24 09:57:39 -07:00
psakievich
5d0b5ed73c EnvironmentModifications: fix reverse prepend/append (#49645)
Pop a single item from the front/back, respectively, instead of removing all instances.
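
A hedged sketch of the corrected reversal semantics (the real logic lives in
`spack.util.environment`; this free function is only illustrative):

```python
# Illustrative: undoing a prepend pops one leading occurrence of the value,
# instead of stripping every occurrence from the variable.
def reverse_prepend(path_value: str, value: str, sep: str = ":") -> str:
    parts = path_value.split(sep)
    if parts and parts[0] == value:
        parts.pop(0)  # pop a single item from the front
    return sep.join(parts)


assert reverse_prepend("/a:/b:/a", "/a") == "/b:/a"  # the other copy survives
```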
2025-03-24 17:29:27 +01:00
Ryan Krattiger
151af13be2 Unit tests: error message when running parallel without xdist (#49632) 2025-03-24 09:25:45 -07:00
Alec Scott
93ea3f51e7 zig: add v0.14.0 (#49629)
* zig: add v0.14.0

* Fix commit hash

* Fix tag for v0.14.0
2025-03-24 08:07:34 -07:00
Alec Scott
a3abc1c492 Fix ci failures after merge of mock tests created before license transition (#49638) 2025-03-21 21:17:56 -06:00
Simon Pintarelli
401484ddf4 remove version prior 7.3 from SIRIUS (#49584) 2025-03-21 20:46:39 +01:00
Robert Maaskant
fc4e76e6fe py-setuptools-scm: fix deps (#49609) 2025-03-21 11:18:11 -07:00
Alec Scott
0853f42723 smee-client: add v3.1.1 (#49578) 2025-03-21 11:56:37 -06:00
Alec Scott
19ca69d0d8 typos: add v1.30.2 (#49577)
* typos: add v1.30.2

* Add rust dependency constraint
2025-03-21 11:56:00 -06:00
Alec Scott
036794725f bfs: add v4.0.6 (#49575) 2025-03-21 11:55:16 -06:00
Alec Scott
e5a2c9aee3 emacs: add v30.1 (#49574) 2025-03-21 11:54:31 -06:00
Alec Scott
5364b88777 fzf: add v0.60.3 (#49573) 2025-03-21 11:43:26 -06:00
Alec Scott
7d1b6324e1 npm: add v11.2.0 (#49572) 2025-03-21 11:42:45 -06:00
Alexandre DENIS
3d0263755e mpibenchmark: add v0.6 (#49612)
* mpibenchmark: add version 0.6
* mpibenchmark: fix syntax
* mpibenchmark: improve package description
2025-03-21 09:54:56 -07:00
Jon Rood
54ad5dca45 exawind: add versions and commits to tags (#49615)
* exawind: add versions and commits to tags.
* Add new version of TIOGA.
* openfast: add commits to tags.
* amr-wind: add dependencies.
* amr-wind: add more settings.

---------

Co-authored-by: jrood-nrel <jrood-nrel@users.noreply.github.com>
2025-03-21 09:49:37 -07:00
Lehman Garrison
ee206952c9 py-uv: add v0.6.8 (#49616) 2025-03-21 09:38:17 -07:00
Ryan Krattiger
4ccef372e8 E4S: Allow building newer ParaView for Linux CI (#47823)
5.11 was locked at a time when master was building by default. Allow
building a newer ParaView in CI.
2025-03-21 09:07:37 -07:00
Jon Rood
ac6e534806 openfast: patch versions to fix openmp bug. (#49631) 2025-03-21 09:06:56 -07:00
Greg Becker
5983f72439 fix extendee_spec for transitive dependencies on potential extendees (#48025)
* fix extendee_spec for transitive dependencies on potential extendees

* regression test

* resolve conditional extensions on direct deps

* remove outdated comment

Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>

---------

Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2025-03-21 08:27:51 -07:00
Stephen Sachs
6e10fac7ae openfoam: restrict the CGAL version compatible with C++14 (#47689)
* openfoam: restrict the CGAL version compatible with C++14

CGAL throws an
[error](50219fc33b/Installation/include/CGAL/config.h (L147))
if a C++ standard lower than 17 is used, while OpenFOAM [forces
C++14](https://develop.openfoam.com/Development/openfoam/-/blob/develop/wmake/rules/General/Gcc/c++?ref_type=heads#L9).
This hard C++17 requirement was
[introduced](e54408370b)
in CGAL version 6.

* Add upper bound since openfoam now uses c++17

44f7a7268a
2025-03-21 08:23:33 -07:00
Derek Ryan Strong
ee6ea5155c Add libjpeg-turbo v3.0.4 (#48030) 2025-03-21 08:22:01 -07:00
Cyrus Harrison
48258e8ddc conduit: add v0.9.3 (#48736)
* add 0.9.3 release, fix license listed
* fix sha
2025-03-21 08:20:50 -07:00
Robert Maaskant
429b0375ed yarn: v1.22.22 (#49171) 2025-03-21 08:12:13 -07:00
Robert Maaskant
c6925ab83f new package: py-loky (#49602) 2025-03-21 08:06:36 -07:00
Wouter Deconinck
00d78dfa0c pythia8: add v8.313 (#49045)
* pythia8: add v8.313

* pythia8: conflicts ~yoda +rivet only when @8.313:
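
A hedged reading of that bullet in package-DSL terms (illustrative, not the literal diff):

```python
# The ~yoda / +rivet conflict now applies only to 8.313 and newer:
conflicts("~yoda", when="+rivet @8.313:")
```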
2025-03-21 07:59:55 -07:00
Wouter Deconinck
e072a91572 libx11: add v1.8.11 (#48863) 2025-03-21 07:58:12 -07:00
Wouter Deconinck
b7eb0308d4 node-js: run tests with target test-only (#49516) 2025-03-21 07:52:26 -07:00
Wouter Deconinck
c98ee6d8ac eigen: build test executables when self.run_tests (#49540) 2025-03-21 07:50:54 -07:00
Wouter Deconinck
b343ebb64e qt-base: pass SBOM PATH from cmake_args (#49596)
* qt-base: pass SBOM PATH from cmake_args

* qt-base: self.define from list

Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>

---------

Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>
2025-03-21 07:50:09 -07:00
Adam J. Stewart
e178d2c75d py-torchmetrics: add v1.7.0 (#49633) 2025-03-21 07:44:47 -07:00
Matt Thompson
9b64560ae6 mapl: add v2.53.3, v2.54.2 (#49610) 2025-03-21 07:18:22 -07:00
David Ozog
ca226f3506 sos: (and tests-sos:) update to v1.5.3, add main branch (#49613)
* sos/tests-sos: update to v1.5.3 & add main branch

* [@spackbot] updating style on behalf of davidozog

* sos: cleanup try/except around cloning tests

---------

Co-authored-by: davidozog <davidozog@users.noreply.github.com>
2025-03-21 09:28:16 -04:00
Caetano Melone
8569e04fea py-ruff: add v0.11.1 (#49617)
* py-ruff: add v0.11.1

Add latest version and update minimum supported rust version for 0.9.8
and up.

[before](https://github.com/astral-sh/ruff/blob/0.9.7/Cargo.toml#L7) and
[after](https://github.com/astral-sh/ruff/blob/0.9.8/Cargo.toml#L7)

* minimum rust version

Co-authored-by: Alec Scott <hi@alecbcs.com>

---------

Co-authored-by: Alec Scott <hi@alecbcs.com>
2025-03-21 09:14:13 +01:00
Robert Maaskant
32213d5e6b fix: spack audit issues (#49557) 2025-03-20 22:41:15 -07:00
Paul R. C. Kent
4891f3dbc9 rmgdft: add develop version (#49558) 2025-03-20 22:39:26 -07:00
Anderson Chauphan
2b5959c3dd trilinos: add v16.1.0 (#49628)
Signed-off-by: Anderson Chauphan <achauph@sandia.gov>
2025-03-20 22:37:43 -07:00
Suzanne Prentice
353db6752a ruby: add v3.2.5 (#49537) 2025-03-20 22:34:45 -07:00
Adam J. Stewart
bf24b8e82c py-lightning: add v2.5.1 (#49600) 2025-03-21 01:31:17 -04:00
psakievich
f2d830cd4c Get env_var mods from config (#49626) 2025-03-20 21:48:50 -05:00
brian-kelley
070bfa1ed7 KokkosKernels: apply PR 2296 as patch (#49627)
Applies this fix to all affected versions (4.0.00:4.4.00).
Fixes issue #49622.
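
In package-DSL terms, applying an upstream PR across a version range looks like
this (the URL form and checksum are placeholders, not taken from the commit):

```python
# Hypothetical sketch -- URL and sha256 are placeholders:
patch(
    "https://github.com/kokkos/kokkos-kernels/pull/2296.patch?full_index=1",
    sha256="<sha256 of the patch file>",
    when="@4.0.00:4.4.00",
)
```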

Signed-off-by: Brian Kelley <bmkelle@sandia.gov>
2025-03-20 20:13:00 -06:00
Alec Scott
c79b6207e8 ci: add automatic checksum verification check (#45063)
Add a CI check to automatically verify the checksums of newly added
package versions:
    - [x] a new command, `spack ci verify-versions`
    - [x] a GitHub actions check to run the command
    - [x] tests for the new command

This also eliminates the suggestion for maintainers to manually verify added
checksums in the case of accidental version <--> checksum mismatches.
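
The same check the workflow runs can be reproduced locally against a commit range
(this invocation is taken from the prechecks workflow diff below):

```console
$ spack ci verify-versions HEAD^1 HEAD
```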

----

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2025-03-20 22:58:14 +01:00
465 changed files with 9096 additions and 9267 deletions


@@ -9,6 +9,7 @@ on:
branches:
- develop
- releases/**
merge_group:
concurrency:
group: ci-${{github.ref}}-${{github.event.pull_request.number || github.run_number}}
@@ -25,13 +26,17 @@ jobs:
packages: ${{ steps.filter.outputs.packages }}
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
if: ${{ github.event_name == 'push' }}
if: ${{ github.event_name == 'push' || github.event_name == 'merge_group' }}
with:
fetch-depth: 0
# For pull requests it's not necessary to checkout the code
- uses: dorny/paths-filter@de90cc6fb38fc0963ad72b210f1f284cd68cea36
id: filter
with:
# For merge group events, compare against the target branch (main)
base: ${{ github.event_name == 'merge_group' && github.event.merge_group.base_ref || '' }}
# For merge group events, use the merge group head ref
ref: ${{ github.event_name == 'merge_group' && github.event.merge_group.head_sha || github.ref }}
# See https://github.com/dorny/paths-filter/issues/56 for the syntax used below
# Don't run if we only modified packages in the
# built-in repository or documentation
@@ -76,10 +81,11 @@ jobs:
prechecks:
needs: [ changes ]
uses: ./.github/workflows/valid-style.yml
uses: ./.github/workflows/prechecks.yml
secrets: inherit
with:
with_coverage: ${{ needs.changes.outputs.core }}
with_packages: ${{ needs.changes.outputs.packages }}
import-check:
needs: [ changes ]
@@ -93,7 +99,7 @@ jobs:
- name: Success
run: |
if [ "${{ needs.prechecks.result }}" == "failure" ] || [ "${{ needs.prechecks.result }}" == "canceled" ]; then
echo "Unit tests failed."
echo "Unit tests failed."
exit 1
else
exit 0
@@ -101,6 +107,7 @@ jobs:
coverage:
needs: [ unit-tests, prechecks ]
if: ${{ needs.changes.outputs.core }}
uses: ./.github/workflows/coverage.yml
secrets: inherit
@@ -113,10 +120,10 @@ jobs:
- name: Status summary
run: |
if [ "${{ needs.unit-tests.result }}" == "failure" ] || [ "${{ needs.unit-tests.result }}" == "canceled" ]; then
echo "Unit tests failed."
echo "Unit tests failed."
exit 1
elif [ "${{ needs.bootstrap.result }}" == "failure" ] || [ "${{ needs.bootstrap.result }}" == "canceled" ]; then
echo "Bootstrap tests failed."
echo "Bootstrap tests failed."
exit 1
else
exit 0


@@ -1,4 +1,4 @@
name: style
name: prechecks
on:
workflow_call:
@@ -6,6 +6,9 @@ on:
with_coverage:
required: true
type: string
with_packages:
required: true
type: string
concurrency:
group: style-${{github.ref}}-${{github.event.pull_request.number || github.run_number}}
@@ -30,6 +33,7 @@ jobs:
run: vermin --backport importlib --backport argparse --violations --backport typing -t=3.6- -vvv lib/spack/spack/ lib/spack/llnl/ bin/
- name: vermin (Repositories)
run: vermin --backport importlib --backport argparse --violations --backport typing -t=3.6- -vvv var/spack/repos
# Run style checks on the files that have been changed
style:
runs-on: ubuntu-latest
@@ -53,12 +57,25 @@ jobs:
- name: Run style tests
run: |
share/spack/qa/run-style-tests
audit:
uses: ./.github/workflows/audit.yaml
secrets: inherit
with:
with_coverage: ${{ inputs.with_coverage }}
python_version: '3.13'
verify-checksums:
if: ${{ inputs.with_packages == 'true' }}
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 2
- name: Verify Added Checksums
run: |
bin/spack ci verify-versions HEAD^1 HEAD
# Check that spack can bootstrap the development environment on Python 3.6 - RHEL8
bootstrap-dev-rhel8:
runs-on: ubuntu-latest


@@ -19,7 +19,7 @@ config:
install_tree:
root: $spack/opt/spack
projections:
all: "{architecture}/{compiler.name}-{compiler.version}/{name}-{version}-{hash}"
all: "{architecture.platform}-{architecture.target}/{name}-{version}-{hash}"
# install_tree can include an optional padded length (int or boolean)
# default is False (do not pad)
# if padded_length is True, Spack will pad as close to the system max path


@@ -15,12 +15,11 @@
# -------------------------------------------------------------------------
packages:
all:
compiler:
- apple-clang
- clang
- gcc
providers:
c: [apple-clang, llvm, gcc]
cxx: [apple-clang, llvm, gcc]
elf: [libelf]
fortran: [gcc]
fuse: [macfuse]
gl: [apple-gl]
glu: [apple-glu]
@@ -50,3 +49,12 @@ packages:
# although the version number used here isn't critical
- spec: apple-libuuid@1353.100.2
prefix: /Library/Developer/CommandLineTools/SDKs/MacOSX.sdk
c:
prefer:
- apple-clang
cxx:
prefer:
- apple-clang
fortran:
prefer:
- gcc


@@ -15,19 +15,18 @@
# -------------------------------------------------------------------------
packages:
all:
compiler: [gcc, clang, oneapi, xl, nag, fj, aocc]
providers:
awk: [gawk]
armci: [armcimpi]
blas: [openblas, amdblis]
c: [gcc]
cxx: [gcc]
c: [gcc, llvm, intel-oneapi-compilers]
cxx: [gcc, llvm, intel-oneapi-compilers]
D: [ldc]
daal: [intel-oneapi-daal]
elf: [elfutils]
fftw-api: [fftw, amdfftw]
flame: [libflame, amdlibflame]
fortran: [gcc]
fortran: [gcc, llvm, intel-oneapi-compilers]
fortran-rt: [gcc-runtime, intel-oneapi-runtime]
fuse: [libfuse]
gl: [glx, osmesa]


@@ -15,8 +15,8 @@
# -------------------------------------------------------------------------
packages:
all:
compiler:
- msvc
providers:
c : [msvc]
cxx: [msvc]
mpi: [msmpi]
gl: [wgl]


@@ -457,6 +457,13 @@ developed package in the environment are concretized to match the
version (and other constraints) passed as the spec argument to the
``spack develop`` command.
When working deep in the graph it is often desirable to have multiple specs marked
as ``develop`` so you don't have to restage and/or do full rebuilds each time you
call ``spack install``. The ``--recursive`` flag can be used in these scenarios
to ensure that all the dependents of the initial spec you provide are also marked
as develop specs. The ``--recursive`` flag requires a pre-concretized environment
so the graph can be traversed from the supplied spec all the way to the root specs.
For packages with ``git`` attributes, git branches, tags, and commits can
also be used as valid concrete versions (see :ref:`version-specifier`).
This means that for a package ``foo``, ``spack develop foo@git.main`` will clone


@@ -330,7 +330,7 @@ that ``--tests`` is passed to ``spack ci rebuild`` as part of the
- spack --version
- cd ${SPACK_CONCRETE_ENV_DIR}
- spack env activate --without-view .
- spack config add "config:install_tree:projections:${SPACK_JOB_SPEC_PKG_NAME}:'morepadding/{architecture}/{compiler.name}-{compiler.version}/{name}-{version}-{hash}'"
- spack config add "config:install_tree:projections:${SPACK_JOB_SPEC_PKG_NAME}:'morepadding/{architecture.platform}-{architecture.target}/{name}-{version}-{hash}'"
- mkdir -p ${SPACK_ARTIFACTS_ROOT}/user_data
- if [[ -r /mnt/key/intermediate_ci_signing_key.gpg ]]; then spack gpg trust /mnt/key/intermediate_ci_signing_key.gpg; fi
- if [[ -r /mnt/key/spack_public_key.gpg ]]; then spack gpg trust /mnt/key/spack_public_key.gpg; fi

Several dozen further changes delete the vendored compiler-wrapper symlinks under
`lib/spack/env/` (among the named ones: `c++`, `c89`, `c99`, `cpp`, `f77`, `f90`,
`f95`, `fc`, `ftn`, `ld`); each deleted file was a one-line symlink pointing at `cc`
(or `../cc`/`../../cc` from per-compiler subdirectories, and in a few cases `../cpp`
or `../fc`).


@@ -73,7 +73,7 @@ def index_by(objects, *funcs):
if isinstance(f, str):
f = lambda x: getattr(x, funcs[0])
elif isinstance(f, tuple):
f = lambda x: tuple(getattr(x, p) for p in funcs[0])
f = lambda x: tuple(getattr(x, p, None) for p in funcs[0])
result = {}
for o in objects:
@@ -1016,11 +1016,8 @@ def _receive_forwarded(self, context: str, exc: Exception, tb: List[str]):
def grouped_message(self, with_tracebacks: bool = True) -> str:
"""Print out an error message coalescing all the forwarded errors."""
each_exception_message = [
"{0} raised {1}: {2}{3}".format(
context,
exc.__class__.__name__,
exc,
"\n{0}".format("".join(tb)) if with_tracebacks else "",
"\n\t{0} raised {1}: {2}\n{3}".format(
context, exc.__class__.__name__, exc, f"\n{''.join(tb)}" if with_tracebacks else ""
)
for context, exc, tb in self.exceptions
]


@@ -0,0 +1,20 @@
# Copyright Spack Project Developers. See COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Alias names to convert legacy compilers to builtin packages and vice-versa"""
BUILTIN_TO_LEGACY_COMPILER = {
"llvm": "clang",
"intel-oneapi-compilers": "oneapi",
"llvm-amdgpu": "rocmcc",
"intel-oneapi-compiler-classic": "intel",
"acfl": "arm",
}
LEGACY_COMPILER_TO_BUILTIN = {
"clang": "llvm",
"oneapi": "intel-oneapi-compilers",
"rocmcc": "llvm-amdgpu",
"intel": "intel-oneapi-compiler-classic",
"arm": "acfl",
}


@@ -110,6 +110,13 @@ def __init__(self, root):
self._write_transaction_impl = llnl.util.lang.nullcontext
self._read_transaction_impl = llnl.util.lang.nullcontext
def _handle_old_db_versions_read(self, check, db, *, reindex: bool):
if not self.is_readable():
raise spack_db.DatabaseNotReadableError(
f"cannot read buildcache v{self.db_version} at {self.root}"
)
return self._handle_current_version_read(check, db)
class FetchCacheError(Exception):
"""Error thrown when fetching the cache failed, usually a composite error list."""
@@ -242,7 +249,7 @@ def _associate_built_specs_with_mirror(self, cache_key, mirror_url):
self._index_file_cache.init_entry(cache_key)
cache_path = self._index_file_cache.cache_path(cache_key)
with self._index_file_cache.read_transaction(cache_key):
db._read_from_file(cache_path)
db._read_from_file(pathlib.Path(cache_path))
except spack_db.InvalidDatabaseVersionError as e:
tty.warn(
f"you need a newer Spack version to read the buildcache index for the "


@@ -234,14 +234,6 @@ def _root_spec(spec_str: str) -> str:
# Add a compiler and platform requirement to the root spec.
platform = str(spack.platforms.host())
if platform == "darwin":
spec_str += " %apple-clang"
elif platform == "windows":
spec_str += " %msvc"
elif platform == "linux":
spec_str += " %gcc"
elif platform == "freebsd":
spec_str += " %clang"
spec_str += f" platform={platform}"
target = archspec.cpu.host().family
spec_str += f" target={target}"


@@ -15,11 +15,13 @@
import archspec.cpu
import spack.compiler
import spack.compilers
import spack.compilers.config
import spack.compilers.libraries
import spack.config
import spack.platforms
import spack.spec
import spack.traverse
import spack.version
from .config import spec_for_current_python
@@ -38,7 +40,7 @@ def __init__(self, configuration):
self.external_cmake, self.external_bison = self._externals_from_yaml(configuration)
def _valid_compiler_or_raise(self) -> "spack.compiler.Compiler":
def _valid_compiler_or_raise(self):
if str(self.host_platform) == "linux":
compiler_name = "gcc"
elif str(self.host_platform) == "darwin":
@@ -46,17 +48,30 @@ def _valid_compiler_or_raise(self) -> "spack.compiler.Compiler":
elif str(self.host_platform) == "windows":
compiler_name = "msvc"
elif str(self.host_platform) == "freebsd":
compiler_name = "clang"
compiler_name = "llvm"
else:
raise RuntimeError(f"Cannot bootstrap clingo from sources on {self.host_platform}")
candidates = spack.compilers.compilers_for_spec(
compiler_name, arch_spec=self.host_architecture
)
candidates = [
x
for x in spack.compilers.config.CompilerFactory.from_packages_yaml(spack.config.CONFIG)
if x.name == compiler_name
]
if not candidates:
raise RuntimeError(
f"Cannot find any version of {compiler_name} to bootstrap clingo from sources"
)
candidates.sort(key=lambda x: x.spec.version, reverse=True)
candidates.sort(key=lambda x: x.version, reverse=True)
best = candidates[0]
# Get compilers for bootstrapping from the 'builtin' repository
best.namespace = "builtin"
# If the compiler does not support C++ 14, fail with a legible error message
try:
_ = best.package.standard_flag(language="cxx", standard="14")
except RuntimeError as e:
raise RuntimeError(
"cannot find a compiler supporting C++ 14 [needed to bootstrap clingo]"
) from e
return candidates[0]
def _externals_from_yaml(
@@ -75,9 +90,6 @@ def _externals_from_yaml(
if not s.satisfies(requirements[pkg_name]):
continue
if not s.intersects(f"%{self.host_compiler.spec}"):
continue
if not s.intersects(f"arch={self.host_architecture}"):
continue
@@ -110,11 +122,14 @@ def concretize(self) -> "spack.spec.Spec":
# Tweak it to conform to the host architecture
for node in s.traverse():
node.architecture.os = str(self.host_os)
node.compiler = self.host_compiler.spec
node.architecture = self.host_architecture
if node.name == "gcc-runtime":
node.versions = self.host_compiler.spec.versions
node.versions = self.host_compiler.versions
# Can't use re2c@3.1 with Python 3.6
if self.host_python.satisfies("@3.6"):
s["re2c"].versions.versions = [spack.version.from_string("=2.2")]
for edge in spack.traverse.traverse_edges([s], cover="edges"):
if edge.spec.name == "python":
@@ -126,6 +141,9 @@ def concretize(self) -> "spack.spec.Spec":
if edge.spec.name == "cmake" and self.external_cmake:
edge.spec = self.external_cmake
if edge.spec.name == self.host_compiler.name:
edge.spec = self.host_compiler
if "libc" in edge.virtuals:
edge.spec = self.host_libc
@@ -141,12 +159,12 @@ def python_external_spec(self) -> "spack.spec.Spec":
return self._external_spec(result)
def libc_external_spec(self) -> "spack.spec.Spec":
result = self.host_compiler.default_libc
detector = spack.compilers.libraries.CompilerPropertyDetector(self.host_compiler)
result = detector.default_libc()
return self._external_spec(result)
def _external_spec(self, initial_spec) -> "spack.spec.Spec":
initial_spec.namespace = "builtin"
initial_spec.compiler = self.host_compiler.spec
initial_spec.architecture = self.host_architecture
for flag_type in spack.spec.FlagMap.valid_compiler_flags():
initial_spec.compiler_flags[flag_type] = []


@@ -10,7 +10,7 @@
from llnl.util import tty
import spack.compilers
import spack.compilers.config
import spack.config
import spack.environment
import spack.modules
@@ -142,8 +142,8 @@ def _bootstrap_config_scopes() -> Sequence["spack.config.ConfigScope"]:
def _add_compilers_if_missing() -> None:
arch = spack.spec.ArchSpec.default_arch()
if not spack.compilers.compilers_for_arch(arch):
spack.compilers.find_compilers()
if not spack.compilers.config.compilers_for_arch(arch):
spack.compilers.config.find_compilers()
@contextlib.contextmanager

File diff suppressed because one or more lines are too long (7 files)


@@ -36,7 +36,6 @@
import multiprocessing
import os
import re
import stat
import sys
import traceback
import types
@@ -71,7 +70,7 @@
import spack.build_systems.meson
import spack.build_systems.python
import spack.builder
import spack.compilers
import spack.compilers.libraries
import spack.config
import spack.deptypes as dt
import spack.error
@@ -85,7 +84,6 @@
import spack.store
import spack.subprocess_context
import spack.util.executable
import spack.util.libc
from spack import traverse
from spack.context import Context
from spack.error import InstallError, NoHeadersError, NoLibrariesError
@@ -93,6 +91,8 @@
from spack.util.environment import (
SYSTEM_DIR_CASE_ENTRY,
EnvironmentModifications,
ModificationList,
PrependPath,
env_flag,
filter_system_paths,
get_path,
@@ -113,7 +113,7 @@
# set_wrapper_variables and used to pass parameters to
# Spack's compiler wrappers.
#
SPACK_ENV_PATH = "SPACK_ENV_PATH"
SPACK_COMPILER_WRAPPER_PATH = "SPACK_COMPILER_WRAPPER_PATH"
SPACK_MANAGED_DIRS = "SPACK_MANAGED_DIRS"
SPACK_INCLUDE_DIRS = "SPACK_INCLUDE_DIRS"
SPACK_LINK_DIRS = "SPACK_LINK_DIRS"
@@ -390,62 +390,10 @@ def _add_werror_handling(keep_werror, env):
env.set("SPACK_COMPILER_FLAGS_REPLACE", " ".join(["|".join(item) for item in replace_flags]))
def set_compiler_environment_variables(pkg, env):
def set_wrapper_environment_variables_for_flags(pkg, env):
assert pkg.spec.concrete
compiler = pkg.compiler
spec = pkg.spec
# Make sure the executables for this compiler exist
compiler.verify_executables()
# Set compiler variables used by CMake and autotools
assert all(key in compiler.link_paths for key in ("cc", "cxx", "f77", "fc"))
# Populate an object with the list of environment modifications
# and return it
# TODO : add additional kwargs for better diagnostics, like requestor,
# ttyout, ttyerr, etc.
link_dir = spack.paths.build_env_path
# Set SPACK compiler variables so that our wrapper knows what to
# call. If there is no compiler configured then use a default
# wrapper which will emit an error if it is used.
if compiler.cc:
env.set("SPACK_CC", compiler.cc)
env.set("CC", os.path.join(link_dir, compiler.link_paths["cc"]))
else:
env.set("CC", os.path.join(link_dir, "cc"))
if compiler.cxx:
env.set("SPACK_CXX", compiler.cxx)
env.set("CXX", os.path.join(link_dir, compiler.link_paths["cxx"]))
else:
env.set("CC", os.path.join(link_dir, "c++"))
if compiler.f77:
env.set("SPACK_F77", compiler.f77)
env.set("F77", os.path.join(link_dir, compiler.link_paths["f77"]))
else:
env.set("F77", os.path.join(link_dir, "f77"))
if compiler.fc:
env.set("SPACK_FC", compiler.fc)
env.set("FC", os.path.join(link_dir, compiler.link_paths["fc"]))
else:
env.set("FC", os.path.join(link_dir, "fc"))
# Set SPACK compiler rpath flags so that our wrapper knows what to use
env.set("SPACK_CC_RPATH_ARG", compiler.cc_rpath_arg)
env.set("SPACK_CXX_RPATH_ARG", compiler.cxx_rpath_arg)
env.set("SPACK_F77_RPATH_ARG", compiler.f77_rpath_arg)
env.set("SPACK_FC_RPATH_ARG", compiler.fc_rpath_arg)
env.set("SPACK_LINKER_ARG", compiler.linker_arg)
# Check whether we want to force RPATH or RUNPATH
if spack.config.get("config:shared_linking:type") == "rpath":
env.set("SPACK_DTAGS_TO_STRIP", compiler.enable_new_dtags)
env.set("SPACK_DTAGS_TO_ADD", compiler.disable_new_dtags)
else:
env.set("SPACK_DTAGS_TO_STRIP", compiler.disable_new_dtags)
env.set("SPACK_DTAGS_TO_ADD", compiler.enable_new_dtags)
if pkg.keep_werror is not None:
keep_werror = pkg.keep_werror
else:
@@ -453,10 +401,6 @@ def set_compiler_environment_variables(pkg, env):
_add_werror_handling(keep_werror, env)
# Set the target parameters that the compiler will add
isa_arg = optimization_flags(compiler, spec.target)
env.set("SPACK_TARGET_ARGS", isa_arg)
# Trap spack-tracked compiler flags as appropriate.
# env_flags are easy to accidentally override.
inject_flags = {}
@@ -489,75 +433,23 @@ def set_compiler_environment_variables(pkg, env):
# implicit variables
env.set(flag.upper(), " ".join(f for f in env_flags[flag]))
pkg.flags_to_build_system_args(build_system_flags)
env.set("SPACK_COMPILER_SPEC", str(spec.compiler))
env.set("SPACK_SYSTEM_DIRS", SYSTEM_DIR_CASE_ENTRY)
compiler.setup_custom_environment(pkg, env)
return env
def optimization_flags(compiler, target):
if spack.compilers.is_mixed_toolchain(compiler):
msg = (
"microarchitecture specific optimizations are not "
"supported yet on mixed compiler toolchains [check"
f" {compiler.name}@{compiler.version} for further details]"
)
tty.debug(msg)
return ""
# Try to check if the current compiler comes with a version number or
# has an unexpected suffix. If so, treat it as a compiler with a
# custom spec.
compiler_version = compiler.version
version_number, suffix = archspec.cpu.version_components(compiler.version)
if not version_number or suffix:
try:
compiler_version = compiler.real_version
except spack.util.executable.ProcessError as e:
# log this and just return compiler.version instead
tty.debug(str(e))
version_number, _ = archspec.cpu.version_components(compiler.version.dotted_numeric_string)
try:
result = target.optimization_flags(compiler.name, compiler_version.dotted_numeric_string)
result = target.optimization_flags(compiler.name, version_number)
except (ValueError, archspec.cpu.UnsupportedMicroarchitecture):
result = ""
return result
class FilterDefaultDynamicLinkerSearchPaths:
"""Remove rpaths to directories that are default search paths of the dynamic linker."""
def __init__(self, dynamic_linker: Optional[str]) -> None:
# Identify directories by (inode, device) tuple, which handles symlinks too.
self.default_path_identifiers: Set[Tuple[int, int]] = set()
if not dynamic_linker:
return
for path in spack.util.libc.default_search_paths_from_dynamic_linker(dynamic_linker):
try:
s = os.stat(path)
if stat.S_ISDIR(s.st_mode):
self.default_path_identifiers.add((s.st_ino, s.st_dev))
except OSError:
continue
def is_dynamic_loader_default_path(self, p: str) -> bool:
try:
s = os.stat(p)
return (s.st_ino, s.st_dev) in self.default_path_identifiers
except OSError:
return False
def __call__(self, dirs: List[str]) -> List[str]:
if not self.default_path_identifiers:
return dirs
return [p for p in dirs if not self.is_dynamic_loader_default_path(p)]
def set_wrapper_variables(pkg, env):
"""Set environment variables used by the Spack compiler wrapper (which have the prefix
`SPACK_`) and also add the compiler wrappers to PATH.
@@ -566,39 +458,8 @@ def set_wrapper_variables(pkg, env):
this function computes these options in a manner that is intended to match the DAG traversal
order in `SetupContext`. TODO: this is not the case yet, we're using post order, SetupContext
is using topo order."""
# Set environment variables if specified for
# the given compiler
compiler = pkg.compiler
env.extend(spack.schema.environment.parse(compiler.environment))
if compiler.extra_rpaths:
extra_rpaths = ":".join(compiler.extra_rpaths)
env.set("SPACK_COMPILER_EXTRA_RPATHS", extra_rpaths)
# Add spack build environment path with compiler wrappers first in
# the path. We add the compiler wrapper path, which includes default
# wrappers (cc, c++, f77, f90), AND a subdirectory containing
# compiler-specific symlinks. The latter ensures that builds that
# are sensitive to the *name* of the compiler see the right name when
# we're building with the wrappers.
#
# Conflicts on case-insensitive systems (like "CC" and "cc") are
# handled by putting one in the <build_env_path>/case-insensitive
# directory. Add that to the path too.
env_paths = []
compiler_specific = os.path.join(
spack.paths.build_env_path, os.path.dirname(pkg.compiler.link_paths["cc"])
)
for item in [spack.paths.build_env_path, compiler_specific]:
env_paths.append(item)
ci = os.path.join(item, "case-insensitive")
if os.path.isdir(ci):
env_paths.append(ci)
tty.debug("Adding compiler bin/ paths: " + " ".join(env_paths))
for item in env_paths:
env.prepend_path("PATH", item)
env.set_path(SPACK_ENV_PATH, env_paths)
# Set compiler flags injected from the spec
set_wrapper_environment_variables_for_flags(pkg, env)
# Working directory for the spack command itself, for debug logs.
if spack.config.get("config:debug"):
@@ -664,22 +525,15 @@ def set_wrapper_variables(pkg, env):
lib_path = os.path.join(pkg.prefix, libdir)
rpath_dirs.insert(0, lib_path)
filter_default_dynamic_linker_search_paths = FilterDefaultDynamicLinkerSearchPaths(
pkg.compiler.default_dynamic_linker
)
# TODO: filter_system_paths is again wrong (and probably unnecessary due to the is_system_path
# branch above). link_dirs should be filtered with entries from _parse_link_paths.
link_dirs = list(dedupe(filter_system_paths(link_dirs)))
include_dirs = list(dedupe(filter_system_paths(include_dirs)))
rpath_dirs = list(dedupe(filter_system_paths(rpath_dirs)))
rpath_dirs = filter_default_dynamic_linker_search_paths(rpath_dirs)
# TODO: implicit_rpaths is prefiltered by is_system_path, that should be removed in favor of
# just this filter.
implicit_rpaths = filter_default_dynamic_linker_search_paths(pkg.compiler.implicit_rpaths())
if implicit_rpaths:
env.set("SPACK_COMPILER_IMPLICIT_RPATHS", ":".join(implicit_rpaths))
default_dynamic_linker_filter = spack.compilers.libraries.dynamic_linker_filter_for(pkg.spec)
if default_dynamic_linker_filter:
rpath_dirs = default_dynamic_linker_filter(rpath_dirs)
# Spack managed directories include the stage, store and upstream stores. We extend this with
# their real paths to make it more robust (e.g. /tmp vs /private/tmp on macOS).
@@ -731,26 +585,6 @@ def set_package_py_globals(pkg, context: Context = Context.BUILD):
# Don't use which for this; we want to find it in the current dir.
module.configure = Executable("./configure")
# Put spack compiler paths in module scope. (Some packages use it
# in setup_run_environment etc, so don't put it context == build)
link_dir = spack.paths.build_env_path
pkg_compiler = None
try:
pkg_compiler = pkg.compiler
except spack.compilers.NoCompilerForSpecError as e:
tty.debug(f"cannot set 'spack_cc': {str(e)}")
if pkg_compiler is not None:
module.spack_cc = os.path.join(link_dir, pkg_compiler.link_paths["cc"])
module.spack_cxx = os.path.join(link_dir, pkg_compiler.link_paths["cxx"])
module.spack_f77 = os.path.join(link_dir, pkg_compiler.link_paths["f77"])
module.spack_fc = os.path.join(link_dir, pkg_compiler.link_paths["fc"])
else:
module.spack_cc = None
module.spack_cxx = None
module.spack_f77 = None
module.spack_fc = None
# Useful directories within the prefix are encapsulated in
# a Prefix object.
module.prefix = pkg.prefix
@@ -901,7 +735,6 @@ def setup_package(pkg, dirty, context: Context = Context.BUILD):
context == Context.TEST and pkg.test_requires_compiler
)
if need_compiler:
set_compiler_environment_variables(pkg, env_mods)
set_wrapper_variables(pkg, env_mods)
# Platform specific setup goes before package specific setup. This is for setting
@@ -913,6 +746,26 @@ def setup_package(pkg, dirty, context: Context = Context.BUILD):
env_mods.extend(setup_context.get_env_modifications())
tty.debug("setup_package: collected all modifications from dependencies")
tty.debug("setup_package: adding compiler wrappers paths")
env_by_name = env_mods.group_by_name()
for x in env_by_name["SPACK_COMPILER_WRAPPER_PATH"]:
assert isinstance(
x, PrependPath
), "unexpected setting used for SPACK_COMPILER_WRAPPER_PATH"
env_mods.prepend_path("PATH", x.value)
# Check whether we want to force RPATH or RUNPATH
enable_var_name, disable_var_name = "SPACK_ENABLE_NEW_DTAGS", "SPACK_DISABLE_NEW_DTAGS"
if enable_var_name in env_by_name and disable_var_name in env_by_name:
enable_new_dtags = _extract_dtags_arg(env_by_name, var_name=enable_var_name)
disable_new_dtags = _extract_dtags_arg(env_by_name, var_name=disable_var_name)
if spack.config.CONFIG.get("config:shared_linking:type") == "rpath":
env_mods.set("SPACK_DTAGS_TO_STRIP", enable_new_dtags)
env_mods.set("SPACK_DTAGS_TO_ADD", disable_new_dtags)
else:
env_mods.set("SPACK_DTAGS_TO_STRIP", disable_new_dtags)
env_mods.set("SPACK_DTAGS_TO_ADD", enable_new_dtags)
if context == Context.TEST:
env_mods.prepend_path("PATH", ".")
elif context == Context.BUILD and not dirty and not env_mods.is_unset("CPATH"):
@@ -926,11 +779,6 @@ def setup_package(pkg, dirty, context: Context = Context.BUILD):
# Load modules on an already clean environment, just before applying Spack's
# own environment modifications. This ensures Spack controls CC/CXX/... variables.
if need_compiler:
tty.debug("setup_package: loading compiler modules")
for mod in pkg.compiler.modules:
load_module(mod)
load_external_modules(setup_context)
# Make sure nothing's strange about the Spack environment.
@@ -942,6 +790,14 @@ def setup_package(pkg, dirty, context: Context = Context.BUILD):
return env_base
def _extract_dtags_arg(env_by_name: Dict[str, ModificationList], *, var_name: str) -> str:
try:
enable_new_dtags = env_by_name[var_name][0].value # type: ignore[union-attr]
except (KeyError, IndexError, AttributeError):
enable_new_dtags = ""
return enable_new_dtags
class EnvironmentVisitor:
def __init__(self, *roots: spack.spec.Spec, context: Context):
# For the roots (well, marked specs) we follow different edges


@@ -11,6 +11,7 @@
import spack.build_environment
import spack.builder
import spack.compilers.libraries
import spack.error
import spack.package_base
import spack.phase_callbacks
@@ -398,33 +399,44 @@ def _do_patch_libtool(self) -> None:
markers[tag] = "LIBTOOL TAG CONFIG: {0}".format(tag.upper())
# Replace empty linker flag prefixes:
if self.pkg.compiler.name == "nag":
if self.spec.satisfies("%nag"):
# Nag is mixed with gcc and g++, which are recognized correctly.
# Therefore, we change only Fortran values:
nag_pkg = self.spec["fortran"].package
for tag in ["fc", "f77"]:
marker = markers[tag]
x.filter(
regex='^wl=""$',
repl='wl="{0}"'.format(self.pkg.compiler.linker_arg),
start_at="# ### BEGIN {0}".format(marker),
stop_at="# ### END {0}".format(marker),
repl=f'wl="{nag_pkg.linker_arg}"',
start_at=f"# ### BEGIN {marker}",
stop_at=f"# ### END {marker}",
)
else:
x.filter(regex='^wl=""$', repl='wl="{0}"'.format(self.pkg.compiler.linker_arg))
compiler_spec = spack.compilers.libraries.compiler_spec(self.spec)
if compiler_spec:
x.filter(regex='^wl=""$', repl='wl="{0}"'.format(compiler_spec.package.linker_arg))
# Replace empty PIC flag values:
for cc, marker in markers.items():
for compiler, marker in markers.items():
if compiler == "cc":
language = "c"
elif compiler == "cxx":
language = "cxx"
else:
language = "fortran"
if language not in self.spec:
continue
x.filter(
regex='^pic_flag=""$',
repl='pic_flag="{0}"'.format(
getattr(self.pkg.compiler, "{0}_pic_flag".format(cc))
),
start_at="# ### BEGIN {0}".format(marker),
stop_at="# ### END {0}".format(marker),
repl=f'pic_flag="{self.spec[language].package.pic_flag}"',
start_at=f"# ### BEGIN {marker}",
stop_at=f"# ### END {marker}",
)
# Other compiler-specific patches:
if self.pkg.compiler.name == "fj":
if self.spec.satisfies("%fj"):
x.filter(regex="-nostdlib", repl="", string=True)
rehead = r"/\S*/"
for o in [
@@ -437,7 +449,7 @@ def _do_patch_libtool(self) -> None:
r"crtendS\.o",
]:
x.filter(regex=(rehead + o), repl="")
elif self.pkg.compiler.name == "nag":
elif self.spec.satisfies("%nag"):
for tag in ["fc", "f77"]:
marker = markers[tag]
start_at = "# ### BEGIN {0}".format(marker)


@@ -70,12 +70,8 @@ class CachedCMakeBuilder(CMakeBuilder):
@property
def cache_name(self):
return "{0}-{1}-{2}@{3}.cmake".format(
self.pkg.name,
self.pkg.spec.architecture,
self.pkg.spec.compiler.name,
self.pkg.spec.compiler.version,
)
compiler_str = f"{self.spec['c'].name}-{self.spec['c'].version}"
return f"{self.pkg.name}-{self.spec.architecture.platform}-{compiler_str}.cmake"
@property
def cache_path(self):
@@ -118,7 +114,9 @@ def initconfig_compiler_entries(self):
# Fortran compiler is optional
if "FC" in os.environ:
spack_fc_entry = cmake_cache_path("CMAKE_Fortran_COMPILER", os.environ["FC"])
system_fc_entry = cmake_cache_path("CMAKE_Fortran_COMPILER", self.pkg.compiler.fc)
system_fc_entry = cmake_cache_path(
"CMAKE_Fortran_COMPILER", self.spec["fortran"].package.fortran
)
else:
spack_fc_entry = "# No Fortran compiler defined in spec"
system_fc_entry = "# No Fortran compiler defined in spec"
@@ -134,8 +132,8 @@ def initconfig_compiler_entries(self):
" " + cmake_cache_path("CMAKE_CXX_COMPILER", os.environ["CXX"]),
" " + spack_fc_entry,
"else()\n",
" " + cmake_cache_path("CMAKE_C_COMPILER", self.pkg.compiler.cc),
" " + cmake_cache_path("CMAKE_CXX_COMPILER", self.pkg.compiler.cxx),
" " + cmake_cache_path("CMAKE_C_COMPILER", self.spec["c"].package.cc),
" " + cmake_cache_path("CMAKE_CXX_COMPILER", self.spec["cxx"].package.cxx),
" " + system_fc_entry,
"endif()\n",
]


@@ -6,12 +6,13 @@
import pathlib
import re
import sys
from typing import Dict, List, Sequence, Tuple, Union
from typing import Dict, List, Optional, Sequence, Tuple, Union
import llnl.util.tty as tty
from llnl.util.lang import classproperty
from llnl.util.lang import classproperty, memoized
import spack.compiler
import spack
import spack.compilers.error
import spack.package_base
import spack.util.executable
@@ -43,6 +44,9 @@ class CompilerPackage(spack.package_base.PackageBase):
#: Static definition of languages supported by this class
compiler_languages: Sequence[str] = ["c", "cxx", "fortran"]
#: Relative path to compiler wrappers
compiler_wrapper_link_paths: Dict[str, str] = {}
def __init__(self, spec: "spack.spec.Spec"):
super().__init__(spec)
msg = f"Supported languages for {spec} are not a subset of possible supported languages"
@@ -77,14 +81,14 @@ def executables(cls) -> Sequence[str]:
]
@classmethod
def determine_version(cls, exe: Path):
def determine_version(cls, exe: Path) -> str:
version_argument = cls.compiler_version_argument
if isinstance(version_argument, str):
version_argument = (version_argument,)
for va in version_argument:
try:
output = spack.compiler.get_compiler_version_output(exe, va)
output = compiler_output(exe, version_argument=va)
match = re.search(cls.compiler_version_regex, output)
if match:
return ".".join(match.groups())
@@ -95,10 +99,11 @@ def determine_version(cls, exe: Path):
f"[{__file__}] Cannot detect a valid version for the executable "
f"{str(exe)}, for package '{cls.name}': {e}"
)
return ""
@classmethod
def compiler_bindir(cls, prefix: Path) -> Path:
"""Overridable method for the location of the compiler bindir within the preifx"""
"""Overridable method for the location of the compiler bindir within the prefix"""
return os.path.join(prefix, "bin")
@classmethod
@@ -142,3 +147,109 @@ def determine_compiler_paths(cls, exes: Sequence[Path]) -> Dict[str, Path]:
def determine_variants(cls, exes: Sequence[Path], version_str: str) -> Tuple:
# path determination is separated so it can be reused in subclasses
return "", {"compilers": cls.determine_compiler_paths(exes=exes)}
#: Returns the argument needed to set the RPATH, or None if it does not exist
rpath_arg: Optional[str] = "-Wl,-rpath,"
#: Flag that needs to be used to pass an argument to the linker
linker_arg: str = "-Wl,"
#: Flag used to produce Position Independent Code
pic_flag: str = "-fPIC"
#: Flag used to get verbose output
verbose_flags: str = "-v"
#: Flag to activate OpenMP support
openmp_flag: str = "-fopenmp"
implicit_rpath_libs: List[str] = []
def standard_flag(self, *, language: str, standard: str) -> str:
"""Returns the flag used to enforce a given standard for a language"""
if language not in self.supported_languages:
raise spack.compilers.error.UnsupportedCompilerFlag(
f"{self.spec} does not provide the '{language}' language"
)
try:
return self._standard_flag(language=language, standard=standard)
except (KeyError, RuntimeError) as e:
raise spack.compilers.error.UnsupportedCompilerFlag(
f"{self.spec} does not provide the '{language}' standard {standard}"
) from e
def _standard_flag(self, *, language: str, standard: str) -> str:
raise NotImplementedError("Must be implemented by derived classes")
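For illustration, a derived compiler package might implement `_standard_flag` along these lines (the class name and flag table are hypothetical, not an authoritative mapping for any real compiler):

```python
# Hypothetical derived compiler package; the flag table is illustrative only.
class HypotheticalGccLike(CompilerPackage):
    _flags = {
        "c": {"99": "-std=c99", "11": "-std=c11"},
        "cxx": {"11": "-std=c++11", "17": "-std=c++17"},
    }

    def _standard_flag(self, *, language: str, standard: str) -> str:
        # A missing entry raises KeyError, which standard_flag() above
        # converts into UnsupportedCompilerFlag.
        return self._flags[language][standard]
```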
def archspec_name(self) -> str:
"""Name that archspec uses to refer to this compiler"""
return self.spec.name
@property
def cc(self) -> Optional[str]:
assert self.spec.concrete, "cannot retrieve C compiler, spec is not concrete"
if self.spec.external:
return self.spec.extra_attributes["compilers"].get("c", None)
return self._cc_path()
def _cc_path(self) -> Optional[str]:
"""Returns the path to the C compiler, if the package was installed by Spack"""
return None
@property
def cxx(self) -> Optional[str]:
assert self.spec.concrete, "cannot retrieve C++ compiler, spec is not concrete"
if self.spec.external:
return self.spec.extra_attributes["compilers"].get("cxx", None)
return self._cxx_path()
def _cxx_path(self) -> Optional[str]:
"""Returns the path to the C++ compiler, if the package was installed by Spack"""
return None
@property
def fortran(self):
assert self.spec.concrete, "cannot retrieve Fortran compiler, spec is not concrete"
if self.spec.external:
return self.spec.extra_attributes["compilers"].get("fortran", None)
return self._fortran_path()
def _fortran_path(self) -> Optional[str]:
"""Returns the path to the Fortran compiler, if the package was installed by Spack"""
return None
@memoized
def _compiler_output(
compiler_path: Path, *, version_argument: str, ignore_errors: Tuple[int, ...] = ()
) -> str:
"""Returns the output from the compiler invoked with the given version argument.
Args:
compiler_path: path of the compiler to be invoked
version_argument: the argument used to extract version information
"""
compiler = spack.util.executable.Executable(compiler_path)
if not version_argument:
return compiler(
output=str, error=str, ignore_errors=ignore_errors, timeout=120, fail_on_error=True
)
return compiler(
version_argument,
output=str,
error=str,
ignore_errors=ignore_errors,
timeout=120,
fail_on_error=True,
)
def compiler_output(
compiler_path: Path, *, version_argument: str, ignore_errors: Tuple[int, ...] = ()
) -> str:
"""Wrapper for _get_compiler_version_output()."""
# This ensures that we memoize compiler output by *absolute path*,
# not just executable name. If we don't do this, and the path changes
# (e.g., during testing), we can get incorrect results.
if not os.path.isabs(compiler_path):
compiler_path = spack.util.executable.which_string(str(compiler_path), required=True)
return _compiler_output(
compiler_path, version_argument=version_argument, ignore_errors=ignore_errors
)
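The pattern here, memoizing on a normalized absolute path rather than on whatever string the caller passed, can be sketched with the standard library (a toy stand-in using `functools.lru_cache` instead of llnl's `@memoized`; not Spack's implementation):

```python
import functools
import os
import shutil
import subprocess

@functools.lru_cache(maxsize=None)
def _cached_version_output(abs_path: str, version_argument: str) -> str:
    # One compiler invocation per (absolute path, argument) pair per process.
    return subprocess.run(
        [abs_path, version_argument], capture_output=True, text=True
    ).stdout

def version_output(path: str, version_argument: str = "--version") -> str:
    # Normalize to an absolute path *before* hitting the cache, so that
    # "gcc" and "/usr/bin/gcc" share one entry and a PATH change between
    # calls cannot return stale output for a different executable.
    abs_path = path if os.path.isabs(path) else shutil.which(path)
    return _cached_version_output(abs_path, version_argument)
```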


@@ -76,7 +76,7 @@ def toolchain_version(self):
Override this method to select a specific version of the toolchain or change
selection heuristics.
Default is whatever version of msvc has been selected by concretization"""
return "v" + self.pkg.compiler.platform_toolset_ver
return "v" + self.spec["msvc"].package.platform_toolset_ver
@property
def std_msbuild_args(self):


@@ -278,10 +278,6 @@ def update_external_dependencies(self, extendee_spec=None):
if not python.architecture.target:
python.architecture.target = archspec.cpu.host().family.name
# Ensure compiler information is present
if not python.compiler:
python.compiler = self.spec.compiler
python.external_path = self.spec.external_path
python._mark_concrete()
self.spec.add_dependency_edge(python, depflag=dt.BUILD | dt.LINK | dt.RUN, virtuals=())


@@ -14,7 +14,7 @@
import tempfile
import zipfile
from collections import namedtuple
from typing import Callable, Dict, List, Set
from typing import Callable, Dict, List, Set, Union
from urllib.request import Request
import llnl.path
@@ -24,7 +24,6 @@
import spack
import spack.binary_distribution as bindist
import spack.concretize
import spack.config as cfg
import spack.environment as ev
import spack.error
@@ -42,6 +41,7 @@
from spack import traverse
from spack.error import SpackError
from spack.reporters.cdash import SPACK_CDASH_TIMEOUT
from spack.version import GitVersion, StandardVersion
from .common import (
IS_WINDOWS,
@@ -80,6 +80,45 @@ def get_change_revisions():
return None, None
def get_added_versions(
checksums_version_dict: Dict[str, Union[StandardVersion, GitVersion]],
path: str,
from_ref: str = "HEAD~1",
to_ref: str = "HEAD",
) -> List[Union[StandardVersion, GitVersion]]:
"""Get a list of the versions added between `from_ref` and `to_ref`.
Args:
checksums_version_dict (Dict): all package versions keyed by known checksums.
path (str): path to the package.py
from_ref (str): oldest git ref, defaults to `HEAD~1`
to_ref (str): newer git ref, defaults to `HEAD`
Returns: list of versions added between refs
"""
git_exe = spack.util.git.git(required=True)
# Gather git diff
diff_lines = git_exe("diff", from_ref, to_ref, "--", path, output=str).split("\n")
# Store added and removed versions
# Removed versions are tracked here to determine when versions are moved in a file
# and show up as both added and removed in a git diff.
added_checksums = set()
removed_checksums = set()
# Scrape diff for modified versions and prune added versions if they show up
# as also removed (which means they've actually just moved in the file and
# we shouldn't need to rechecksum them)
for checksum in checksums_version_dict.keys():
for line in diff_lines:
if checksum in line:
if line.startswith("+"):
added_checksums.add(checksum)
if line.startswith("-"):
removed_checksums.add(checksum)
return [checksums_version_dict[c] for c in added_checksums - removed_checksums]
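A toy illustration of the added-minus-removed pruning (checksums and diff lines are made up): a checksum that appears on both a `+` and a `-` line only moved within the file and is not reported as new.

```python
versions_by_checksum = {"aaa": "1.2.0", "bbb": "1.1.0"}
diff_lines = ['+    version("1.2.0", sha256="aaa")',
              '+    version("1.1.0", sha256="bbb")',
              '-    version("1.1.0", sha256="bbb")']
added = {c for c in versions_by_checksum
         if any(c in l and l.startswith("+") for l in diff_lines)}
removed = {c for c in versions_by_checksum
           if any(c in l and l.startswith("-") for l in diff_lines)}
print([versions_by_checksum[c] for c in added - removed])  # ['1.2.0']
```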
def get_stack_changed(env_path, rev1="HEAD^", rev2="HEAD"):
"""Given an environment manifest path and two revisions to compare, return
whether or not the stack was changed. Returns True if the environment
@@ -381,10 +420,9 @@ def generate_pipeline(env: ev.Environment, args) -> None:
args: (spack.main.SpackArgumentParser): Parsed arguments from the command
line.
"""
with spack.concretize.disable_compiler_existence_check():
with env.write_transaction():
env.concretize()
env.write()
with env.write_transaction():
env.concretize()
env.write()
options = collect_pipeline_options(env, args)


@@ -209,10 +209,8 @@ def build_name(self, spec: Optional[spack.spec.Spec] = None) -> Optional[str]:
Returns: (str) given spec's CDash build name."""
if spec:
build_name = (
f"{spec.name}@{spec.version}%{spec.compiler} "
f"hash={spec.dag_hash()} arch={spec.architecture} ({self.build_group})"
)
spec_str = spec.format("{name}{@version}{%compiler} hash={hash} arch={architecture}")
build_name = f"{spec_str} ({self.build_group})"
tty.debug(f"Generated CDash build name ({build_name}) from the {spec.name}")
return build_name


@@ -375,8 +375,13 @@ def iter_groups(specs, indent, all_headers):
index = index_by(specs, ("architecture", "compiler"))
ispace = indent * " "
def _key(item):
if item is None:
return ""
return str(item)
# Traverse the index and print out each package
for i, (architecture, compiler) in enumerate(sorted(index)):
for i, (architecture, compiler) in enumerate(sorted(index, key=_key)):
if i > 0:
print()
@@ -448,7 +453,6 @@ def get_arg(name, default=None):
hashes = get_arg("long", False)
namespaces = get_arg("namespaces", False)
flags = get_arg("show_flags", False)
full_compiler = get_arg("show_full_compiler", False)
variants = get_arg("variants", False)
groups = get_arg("groups", True)
all_headers = get_arg("all_headers", False)
@@ -470,10 +474,8 @@ def get_arg(name, default=None):
if format_string is None:
nfmt = "{fullname}" if namespaces else "{name}"
ffmt = ""
if full_compiler or flags:
ffmt += "{compiler_flags} {%compiler.name}"
if full_compiler:
ffmt += "{@compiler.version}"
if flags:
ffmt += " {compiler_flags}"
vfmt = "{variants}" if variants else ""
format_string = nfmt + "{@version}" + vfmt + ffmt


@@ -5,11 +5,13 @@
import json
import os
import shutil
import sys
from typing import Dict
from urllib.parse import urlparse, urlunparse
import llnl.util.filesystem as fs
import llnl.util.tty as tty
import llnl.util.tty.color as clr
from llnl.util import tty
import spack.binary_distribution as bindist
import spack.ci as spack_ci
@@ -18,12 +20,20 @@
import spack.cmd.common.arguments
import spack.config as cfg
import spack.environment as ev
import spack.error
import spack.fetch_strategy
import spack.hash_types as ht
import spack.mirrors.mirror
import spack.package_base
import spack.repo
import spack.spec
import spack.stage
import spack.util.executable
import spack.util.gpg as gpg_util
import spack.util.timer as timer
import spack.util.url as url_util
import spack.util.web as web_util
import spack.version
description = "manage continuous integration pipelines"
section = "build"
@@ -191,6 +201,16 @@ def setup_parser(subparser):
reproduce.set_defaults(func=ci_reproduce)
# Verify checksums inside of ci workflows
verify_versions = subparsers.add_parser(
"verify-versions",
description=deindent(ci_verify_versions.__doc__),
help=spack.cmd.first_line(ci_verify_versions.__doc__),
)
verify_versions.add_argument("from_ref", help="git ref from which to start looking at changes")
verify_versions.add_argument("to_ref", help="git ref at which to stop looking at changes")
verify_versions.set_defaults(func=ci_verify_versions)
def ci_generate(args):
"""generate jobs file from a CI-aware spack file
@@ -659,6 +679,156 @@ def _gitlab_artifacts_url(url: str) -> str:
return urlunparse(parsed._replace(path="/".join(parts), fragment="", query=""))
def validate_standard_versions(
pkg: spack.package_base.PackageBase, versions: spack.version.VersionList
) -> bool:
"""Get and test the checksum of a package version based on a tarball.
Args:
pkg spack.package_base.PackageBase: Spack package for which to validate a version checksum
versions spack.version.VersionList: list of package versions to validate
Returns: bool: result of the validation. True is valid and false is failed.
"""
url_dict: Dict[spack.version.StandardVersion, str] = {}
for version in versions:
url = pkg.find_valid_url_for_version(version)
url_dict[version] = url
version_hashes = spack.stage.get_checksums_for_versions(
url_dict, pkg.name, fetch_options=pkg.fetch_options
)
valid_checksums = True
for version, sha in version_hashes.items():
if sha != pkg.versions[version]["sha256"]:
tty.error(
f"Invalid checksum found {pkg.name}@{version}\n"
f" [package.py] {pkg.versions[version]['sha256']}\n"
f" [Downloaded] {sha}"
)
valid_checksums = False
continue
tty.info(f"Validated {pkg.name}@{version} --> {sha}")
return valid_checksums
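Underneath, this amounts to comparing a freshly computed sha256 against the one recorded in `package.py`. A minimal self-contained sketch of that comparison (the file path and expected digest below are hypothetical):

```python
import hashlib

def sha256_of(path: str, chunk: int = 1 << 20) -> str:
    # Hash the file incrementally so large tarballs don't load into memory.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

expected = "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
if sha256_of("/tmp/pkg-1.2.0.tar.gz") != expected:
    print("Invalid checksum found pkg@1.2.0")
```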
def validate_git_versions(
pkg: spack.package_base.PackageBase, versions: spack.version.VersionList
) -> bool:
"""Get and test the commit and tag of a package version based on a git repository.
Args:
pkg spack.package_base.PackageBase: Spack package for which to validate a version
versions spack.version.VersionList: list of package versions to validate
Returns: bool: result of the validation. True is valid and false is failed.
"""
valid_commit = True
for version in versions:
fetcher = spack.fetch_strategy.for_package_version(pkg, version)
with spack.stage.Stage(fetcher) as stage:
known_commit = pkg.versions[version]["commit"]
try:
stage.fetch()
except spack.error.FetchError:
tty.error(
f"Invalid commit for {pkg.name}@{version}\n"
f" {known_commit} could not be checked out in the git repository."
)
valid_commit = False
continue
# Test if the specified tag matches the commit in the package.py
# We retrieve the commit associated with a tag and compare it to the
# commit that is located in the package.py file.
if "tag" in pkg.versions[version]:
tag = pkg.versions[version]["tag"]
try:
with fs.working_dir(stage.source_path):
found_commit = fetcher.git(
"rev-list", "-n", "1", tag, output=str, error=str
).strip()
except spack.util.executable.ProcessError:
tty.error(
f"Invalid tag for {pkg.name}@{version}\n"
f" {tag} could not be found in the git repository."
)
valid_commit = False
continue
if found_commit != known_commit:
tty.error(
f"Mismatched tag <-> commit found for {pkg.name}@{version}\n"
f" [package.py] {known_commit}\n"
f" [Downloaded] {found_commit}"
)
valid_commit = False
continue
# If we have downloaded the repository, found the commit, and compared
# the tag (if specified) we can conclude that the version is pointing
# at what we would expect.
tty.info(f"Validated {pkg.name}@{version} --> {known_commit}")
return valid_commit
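The tag-to-commit comparison boils down to `git rev-list -n 1 <tag>`. A sketch using plain `subprocess` in place of Spack's fetcher and stage machinery (repo path, tag, and commit are hypothetical):

```python
import subprocess

def commit_for_tag(repo_dir: str, tag: str) -> str:
    # Resolve an annotated or lightweight tag to the commit it points at.
    return subprocess.run(
        ["git", "rev-list", "-n", "1", tag],
        cwd=repo_dir, capture_output=True, text=True, check=True,
    ).stdout.strip()

known_commit = "0123456789abcdef0123456789abcdef01234567"
if commit_for_tag("/tmp/pkg-repo", "v1.2.0") != known_commit:
    print("Mismatched tag <-> commit found for pkg@1.2.0")
```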
def ci_verify_versions(args):
"""validate version checksum & commits between git refs
This command takes from_ref and to_ref arguments, parses the git diff
between the two refs to determine which packages have been modified or
added, and verifies the new checksums inside of them.
"""
# Get a list of all packages that have been changed or added
# between from_ref and to_ref
pkgs = spack.repo.get_all_package_diffs("AC", args.from_ref, args.to_ref)
failed_version = False
for pkg_name in pkgs:
spec = spack.spec.Spec(pkg_name)
pkg = spack.repo.PATH.get_pkg_class(spec.name)(spec)
path = spack.repo.PATH.package_path(pkg_name)
# Skip checking manual download packages and trust the maintainers
if pkg.manual_download:
tty.warn(f"Skipping manual download package: {pkg_name}")
continue
# Store versions checksums / commits for future loop
checksums_version_dict = {}
commits_version_dict = {}
for version in pkg.versions:
# If the package version defines a sha256 we'll use that as the high entropy
# string to detect which versions have been added between from_ref and to_ref
if "sha256" in pkg.versions[version]:
checksums_version_dict[pkg.versions[version]["sha256"]] = version
# If a package version instead defines a commit we'll use that as a
# high entropy string to detect new versions.
elif "commit" in pkg.versions[version]:
commits_version_dict[pkg.versions[version]["commit"]] = version
# TODO: enforce that every version has a commit or a sha256 defined if it is not
# an infinite version (there are a lot of packages where this doesn't work yet)
with fs.working_dir(os.path.dirname(path)):
added_checksums = spack_ci.get_added_versions(
checksums_version_dict, path, from_ref=args.from_ref, to_ref=args.to_ref
)
added_commits = spack_ci.get_added_versions(
commits_version_dict, path, from_ref=args.from_ref, to_ref=args.to_ref
)
if added_checksums:
failed_version = not validate_standard_versions(pkg, added_checksums) or failed_version
if added_commits:
failed_version = not validate_git_versions(pkg, added_commits) or failed_version
if failed_version:
sys.exit(1)
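For example, a pipeline might run the new subcommand against the most recent commit (hypothetical invocation; output omitted):

```console
spack ci verify-versions HEAD~1 HEAD
```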
def ci(parser, args):
if args.func:
return args.func(args)


@@ -4,13 +4,14 @@
import argparse
import sys
import warnings
import llnl.util.tty as tty
from llnl.util.lang import index_by
from llnl.util.tty.colify import colify
from llnl.util.tty.color import colorize
import spack.compilers
import spack.compilers.config
import spack.config
import spack.spec
from spack.cmd.common import arguments
@@ -33,20 +34,20 @@ def setup_parser(subparser):
mixed_toolchain_group.add_argument(
"--mixed-toolchain",
action="store_true",
default=sys.platform == "darwin",
help="Allow mixed toolchains (for example: clang, clang++, gfortran)",
default=False,
help="(DEPRECATED) Allow mixed toolchains (for example: clang, clang++, gfortran)",
)
mixed_toolchain_group.add_argument(
"--no-mixed-toolchain",
action="store_false",
dest="mixed_toolchain",
help="Do not allow mixed toolchains (for example: clang, clang++, gfortran)",
help="(DEPRECATED) Do not allow mixed toolchains (for example: clang, clang++, gfortran)",
)
find_parser.add_argument("add_paths", nargs=argparse.REMAINDER)
find_parser.add_argument(
"--scope",
action=arguments.ConfigScope,
default=lambda: spack.config.default_modify_scope("compilers"),
default=lambda: spack.config.default_modify_scope("packages"),
help="configuration scope to modify",
)
arguments.add_common_arguments(find_parser, ["jobs"])
@@ -79,77 +80,97 @@ def compiler_find(args):
"""Search either $PATH or a list of paths OR MODULES for compilers and
add them to Spack's configuration.
"""
if args.mixed_toolchain:
warnings.warn(
"The '--mixed-toolchain' option has been deprecated in Spack v0.23, and currently "
"has no effect. The option will be removed in Spack v1.1"
)
paths = args.add_paths or None
new_compilers = spack.compilers.find_compilers(
path_hints=paths,
scope=args.scope,
mixed_toolchain=args.mixed_toolchain,
max_workers=args.jobs,
new_compilers = spack.compilers.config.find_compilers(
path_hints=paths, scope=args.scope, max_workers=args.jobs
)
if new_compilers:
n = len(new_compilers)
s = "s" if n > 1 else ""
filename = spack.config.CONFIG.get_config_filename(args.scope, "compilers")
filename = spack.config.CONFIG.get_config_filename(args.scope, "packages")
tty.msg(f"Added {n:d} new compiler{s} to {filename}")
compiler_strs = sorted(f"{c.spec.name}@{c.spec.version}" for c in new_compilers)
compiler_strs = sorted(f"{spec.name}@{spec.versions}" for spec in new_compilers)
colify(reversed(compiler_strs), indent=4)
else:
tty.msg("Found no new compilers")
tty.msg("Compilers are defined in the following files:")
colify(spack.compilers.compiler_config_files(), indent=4)
colify(spack.compilers.config.compiler_config_files(), indent=4)
def compiler_remove(args):
compiler_spec = spack.spec.CompilerSpec(args.compiler_spec)
candidate_compilers = spack.compilers.compilers_for_spec(compiler_spec, scope=args.scope)
remover = spack.compilers.config.CompilerRemover(spack.config.CONFIG)
candidates = remover.mark_compilers(match=args.compiler_spec, scope=args.scope)
if not candidates:
tty.die(f"No compiler matches '{args.compiler_spec}'")
if not candidate_compilers:
tty.die("No compilers match spec %s" % compiler_spec)
compiler_strs = reversed(sorted(f"{spec.name}@{spec.versions}" for spec in candidates))
if not args.all and len(candidate_compilers) > 1:
tty.error(f"Multiple compilers match spec {compiler_spec}. Choose one:")
colify(reversed(sorted([c.spec.display_str for c in candidate_compilers])), indent=4)
tty.msg("Or, use `spack compiler remove -a` to remove all of them.")
if not args.all and len(candidates) > 1:
tty.error(f"multiple compilers match the spec '{args.compiler_spec}':")
print()
colify(compiler_strs, indent=4)
print()
print(
"Either use a stricter spec to select only one, or use `spack compiler remove -a`"
" to remove all of them."
)
sys.exit(1)
for current_compiler in candidate_compilers:
spack.compilers.remove_compiler_from_config(current_compiler.spec, scope=args.scope)
tty.msg(f"{current_compiler.spec.display_str} has been removed")
remover.flush()
tty.msg("The following compilers have been removed:")
print()
colify(compiler_strs, indent=4)
print()
def compiler_info(args):
"""Print info about all compilers matching a spec."""
cspec = spack.spec.CompilerSpec(args.compiler_spec)
compilers = spack.compilers.compilers_for_spec(cspec, scope=args.scope)
query = spack.spec.Spec(args.compiler_spec)
all_compilers = spack.compilers.config.all_compilers(scope=args.scope, init_config=False)
compilers = [x for x in all_compilers if x.satisfies(query)]
if not compilers:
tty.die("No compilers match spec %s" % cspec)
tty.die(f"No compilers match spec {query.cformat()}")
else:
for c in compilers:
print(c.spec.display_str + ":")
print("\tpaths:")
for cpath in ["cc", "cxx", "f77", "fc"]:
print("\t\t%s = %s" % (cpath, getattr(c, cpath, None)))
if c.flags:
print("\tflags:")
for flag, flag_value in c.flags.items():
print("\t\t%s = %s" % (flag, flag_value))
if len(c.environment) != 0:
if len(c.environment.get("set", {})) != 0:
print(f"{c.cformat()}:")
print(f" prefix: {c.external_path}")
extra_attributes = getattr(c, "extra_attributes", {})
if "compilers" in extra_attributes:
print(" compilers:")
for language, exe in extra_attributes.get("compilers", {}).items():
print(f" {language}: {exe}")
if "flags" in extra_attributes:
print(" flags:")
for flag, flag_value in extra_attributes["flags"].items():
print(f" {flag} = {flag_value}")
if "environment" in extra_attributes:
environment = extra_attributes["environment"]
if len(environment.get("set", {})) != 0:
print("\tenvironment:")
print("\t set:")
for key, value in c.environment["set"].items():
print("\t %s = %s" % (key, value))
if c.extra_rpaths:
print("\tExtra rpaths:")
for extra_rpath in c.extra_rpaths:
print("\t\t%s" % extra_rpath)
print("\tmodules = %s" % c.modules)
print("\toperating system = %s" % c.operating_system)
for key, value in environment["set"].items():
print(f"\t {key} = {value}")
if "extra_rpaths" in extra_attributes:
print(" extra rpaths:")
for extra_rpath in extra_attributes["extra_rpaths"]:
print(f" {extra_rpath}")
if getattr(c, "external_modules", []):
print(" modules: ")
for module in c.external_modules:
print(f" {module}")
print()
def compiler_list(args):
compilers = spack.compilers.all_compilers(scope=args.scope, init_config=False)
compilers = spack.compilers.config.all_compilers(scope=args.scope, init_config=False)
# If there are no compilers in any scope, and we're outputting to a tty, give a
# hint to the user.
@@ -162,7 +183,7 @@ def compiler_list(args):
tty.msg(msg)
return
index = index_by(compilers, lambda c: (c.spec.name, c.operating_system, c.target))
index = index_by(compilers, spack.compilers.config.name_os_target)
tty.msg("Available compilers")
@@ -181,10 +202,10 @@ def compiler_list(args):
name, os, target = key
os_str = os
if target:
os_str += "-%s" % target
cname = "%s{%s} %s" % (spack.spec.COMPILER_COLOR, name, os_str)
os_str += f"-{target}"
cname = f"{spack.spec.COMPILER_COLOR}{{{name}}} {os_str}"
tty.hline(colorize(cname), char="-")
colify(reversed(sorted(c.spec.display_str for c in compilers)))
colify(reversed(sorted(c.format("{name}@{version}") for c in compilers)))
def compiler(parser, args):


@@ -521,8 +521,6 @@ def config_prefer_upstream(args):
for spec in pref_specs:
# Collect all the upstream compilers and versions for this package.
pkg = pkgs.get(spec.name, {"version": []})
all = pkgs.get("all", {"compiler": []})
pkgs["all"] = all
pkgs[spec.name] = pkg
# We have no existing variant if this is our first added version.
@@ -532,10 +530,6 @@ def config_prefer_upstream(args):
if version not in pkg["version"]:
pkg["version"].append(version)
compiler = str(spec.compiler)
if compiler not in all["compiler"]:
all["compiler"].append(compiler)
# Get and list all the variants that differ from the default.
variants = []
for var_name, variant in spec.variants.items():


@@ -3,11 +3,13 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import shutil
from typing import Optional
import llnl.util.tty as tty
import spack.cmd
import spack.config
import spack.environment
import spack.fetch_strategy
import spack.repo
import spack.spec
@@ -31,37 +33,33 @@ def setup_parser(subparser):
"--no-clone",
action="store_false",
dest="clone",
default=None,
help="do not clone, the package already exists at the source path",
)
clone_group.add_argument(
"--clone",
action="store_true",
dest="clone",
default=None,
help="clone the package even if the path already exists",
default=True,
help=(
"(default) clone the package unless the path already exists, "
"use --force to overwrite"
),
)
subparser.add_argument(
"-f", "--force", help="remove any files or directories that block cloning source code"
)
subparser.add_argument(
"-r",
"--recursive",
action="store_true",
help="traverse nodes of the graph to mark everything up to the root as a develop spec",
)
arguments.add_common_arguments(subparser, ["spec"])
def _update_config(spec, path):
find_fn = lambda section: spec.name in section
entry = {"spec": str(spec)}
if path != spec.name:
entry["path"] = path
def change_fn(section):
section[spec.name] = entry
spack.config.change_or_add("develop", find_fn, change_fn)
def _retrieve_develop_source(spec: spack.spec.Spec, abspath: str) -> None:
# "steal" the source code via staging API. We ask for a stage
# to be created, then copy it afterwards somewhere else. It would be
@@ -83,44 +81,43 @@ def _retrieve_develop_source(spec: spack.spec.Spec, abspath: str) -> None:
package.stage.steal_source(abspath)
def develop(parser, args):
# Note: we could put develop specs in any scope, but I assume
# users would only ever want to do this for either (a) an active
# env or (b) a specified config file (e.g. that is included by
# an environment)
# TODO: when https://github.com/spack/spack/pull/35307 is merged,
# an active env is not required if a scope is specified
env = spack.cmd.require_active_env(cmd_name="develop")
if not args.spec:
if args.clone is False:
raise SpackError("No spec provided to spack develop command")
def assure_concrete_spec(env: spack.environment.Environment, spec: spack.spec.Spec):
version = spec.versions.concrete_range_as_version
if not version:
# first check environment for a matching concrete spec
matching_specs = env.all_matching_specs(spec)
if matching_specs:
version = matching_specs[0].version
test_spec = spack.spec.Spec(f"{spec}@{version}")
for m_spec in matching_specs:
if not m_spec.satisfies(test_spec):
raise SpackError(
f"{spec.name}: has multiple concrete instances in the graph that can't be"
" satisified by a single develop spec. To use `spack develop` ensure one"
" of the following:"
f"\n a) {spec.name} nodes can satisfy the same develop spec (minimally "
"this means they all share the same version)"
f"\n b) Provide a concrete develop spec ({spec.name}@[version]) to clearly"
" indicate what should be developed"
)
else:
# look up the maximum version so infinity versions are preferred for develop
version = max(spec.package_class.versions.keys())
tty.msg(f"Defaulting to highest version: {spec.name}@{version}")
spec.versions = spack.version.VersionList([version])
# download all dev specs
for name, entry in env.dev_specs.items():
path = entry.get("path", name)
abspath = spack.util.path.canonicalize_path(path, default_wd=env.path)
if os.path.exists(abspath):
msg = "Skipping developer download of %s" % entry["spec"]
msg += " because its path already exists."
tty.msg(msg)
continue
def setup_src_code(spec: spack.spec.Spec, src_path: str, clone: bool = True, force: bool = False):
"""
Handle checking, cloning or overwriting source code
"""
assert spec.versions
# Both old syntax `spack develop pkg@x` and new syntax `spack develop pkg@=x`
# are currently supported.
spec = spack.spec.parse_with_version_concrete(entry["spec"])
_retrieve_develop_source(spec, abspath)
if clone:
_clone(spec, src_path, force)
if not env.dev_specs:
tty.warn("No develop specs to download")
return
specs = spack.cmd.parse_specs(args.spec)
if len(specs) > 1:
raise SpackError("spack develop requires at most one named spec")
spec = specs[0]
if not clone and not os.path.exists(src_path):
raise SpackError(f"Provided path {src_path} does not exist")
version = spec.versions.concrete_range_as_version
if not version:
@@ -129,40 +126,114 @@ def develop(parser, args):
tty.msg(f"Defaulting to highest version: {spec.name}@{version}")
spec.versions = spack.version.VersionList([version])
# If user does not specify --path, we choose to create a directory in the
# active environment's directory, named after the spec
path = args.path or spec.name
if not os.path.isabs(path):
abspath = spack.util.path.canonicalize_path(path, default_wd=env.path)
else:
abspath = path
# clone default: only if the path doesn't exist
clone = args.clone
if clone is None:
clone = not os.path.exists(abspath)
def _update_config(spec, path):
find_fn = lambda section: spec.name in section
if not clone and not os.path.exists(abspath):
raise SpackError("Provided path %s does not exist" % abspath)
entry = {"spec": str(spec)}
if path and path != spec.name:
entry["path"] = path
if clone:
if os.path.exists(abspath):
if args.force:
shutil.rmtree(abspath)
else:
msg = "Path %s already exists and cannot be cloned to." % abspath
msg += " Use `spack develop -f` to overwrite."
raise SpackError(msg)
def change_fn(section):
section[spec.name] = entry
_retrieve_develop_source(spec, abspath)
spack.config.change_or_add("develop", find_fn, change_fn)
def update_env(
env: spack.environment.Environment,
spec: spack.spec.Spec,
specified_path: Optional[str] = None,
build_dir: Optional[str] = None,
):
"""
Update the spack.yaml file with additions or changes from a develop call
"""
tty.debug(f"Updating develop config for {env.name} transactionally")
if not specified_path:
dev_entry = env.dev_specs.get(spec.name)
if dev_entry:
specified_path = dev_entry.get("path", None)
tty.debug("Updating develop config for {0} transactionally".format(env.name))
with env.write_transaction():
if args.build_directory is not None:
if build_dir is not None:
spack.config.add(
"packages:{}:package_attributes:build_directory:{}".format(
spec.name, args.build_directory
),
f"packages:{spec.name}:package_attributes:build_directory:{build_dir}",
env.scope_name,
)
_update_config(spec, path)
# add develop spec and update path
_update_config(spec, specified_path)
def _clone(spec: spack.spec.Spec, abspath: str, force: bool = False):
if os.path.exists(abspath):
if force:
shutil.rmtree(abspath)
else:
msg = f"Skipping developer download of {spec.name}"
msg += f" because its path {abspath} already exists."
tty.msg(msg)
return
# cloning can take a while and it's nice to get a message for the longer clones
tty.msg(f"Cloning source code for {spec}")
_retrieve_develop_source(spec, abspath)
def _abs_code_path(
env: spack.environment.Environment, spec: spack.spec.Spec, path: Optional[str] = None
):
src_path = path if path else spec.name
return spack.util.path.canonicalize_path(src_path, default_wd=env.path)
def _dev_spec_generator(args, env):
"""
Generator function to loop over all the develop specs based on how the command is called.
If no specs are supplied then loop over the develop specs listed in the environment.
"""
if not args.spec:
if args.clone is False:
raise SpackError("No spec provided to spack develop command")
for name, entry in env.dev_specs.items():
path = entry.get("path", name)
abspath = spack.util.path.canonicalize_path(path, default_wd=env.path)
# Both old syntax `spack develop pkg@x` and new syntax `spack develop pkg@=x`
# are currently supported.
spec = spack.spec.parse_with_version_concrete(entry["spec"])
yield spec, abspath
else:
specs = spack.cmd.parse_specs(args.spec)
if (args.path or args.build_directory) and len(specs) > 1:
raise SpackError(
"spack develop requires at most one named spec when using the --path or"
" --build-directory arguments"
)
for spec in specs:
if args.recursive:
concrete_specs = env.all_matching_specs(spec)
if not concrete_specs:
tty.warn(
f"{spec.name} has no matching concrete specs in the environment and "
"will be skipped. `spack develop --recursive` requires a concretized"
" environment"
)
else:
for s in concrete_specs:
for node_spec in s.traverse(direction="parents", root=True):
tty.debug(f"Recursive develop for {node_spec.name}")
yield node_spec, _abs_code_path(env, node_spec, args.path)
else:
yield spec, _abs_code_path(env, spec, args.path)
def develop(parser, args):
env = spack.cmd.require_active_env(cmd_name="develop")
for spec, abspath in _dev_spec_generator(args, env):
assure_concrete_spec(env, spec)
setup_src_code(spec, abspath, clone=args.clone, force=args.force)
update_env(env, spec, args.path, args.build_directory)
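A hypothetical session with the reworked command, marking a package and everything up to the environment roots for development (`mypkg` is a placeholder; an active, concretized environment is assumed):

```console
spack develop --recursive mypkg
```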


@@ -98,7 +98,7 @@ def setup_parser(subparser):
"--show-full-compiler",
action="store_true",
dest="show_full_compiler",
help="show full compiler specs",
help="(DEPRECATED) show full compiler specs. Currently it's a no-op",
)
implicit_explicit = subparser.add_mutually_exclusive_group()
implicit_explicit.add_argument(
@@ -278,7 +278,6 @@ def root_decorator(spec, string):
# these enforce details in the root specs to show what the user asked for
namespaces=True,
show_flags=True,
show_full_compiler=True,
decorator=root_decorator,
variants=True,
)
@@ -301,7 +300,6 @@ def root_decorator(spec, string):
decorator=lambda s, f: color.colorize("@*{%s}" % f),
namespace=True,
show_flags=True,
show_full_compiler=True,
variants=True,
)
print()


@@ -38,7 +38,6 @@
r"^lib/spack/spack/.*\.sh$",
r"^lib/spack/spack/.*\.lp$",
r"^lib/spack/llnl/.*\.py$",
r"^lib/spack/env/cc$",
# special case some test data files that have license headers
r"^lib/spack/spack/test/data/style/broken.dummy",
r"^lib/spack/spack/test/data/unparse/.*\.txt",


@@ -515,16 +515,15 @@ def extend_with_dependencies(specs):
def concrete_specs_from_cli_or_file(args):
tty.msg("Concretizing input specs")
with spack.concretize.disable_compiler_existence_check():
if args.specs:
specs = spack.cmd.parse_specs(args.specs, concretize=True)
if not specs:
raise SpackError("unable to parse specs from command line")
if args.specs:
specs = spack.cmd.parse_specs(args.specs, concretize=True)
if not specs:
raise SpackError("unable to parse specs from command line")
if args.file:
specs = specs_from_text_file(args.file, concretize=True)
if not specs:
raise SpackError("unable to parse specs from file '{}'".format(args.file))
if args.file:
specs = specs_from_text_file(args.file, concretize=True)
if not specs:
raise SpackError("unable to parse specs from file '{}'".format(args.file))
return specs


@@ -1,7 +1,12 @@
# Copyright Spack Project Developers. See COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import shutil
from llnl.util import tty
import spack.database
import spack.store
description = "rebuild Spack's package database"
@@ -10,4 +15,11 @@
def reindex(parser, args):
current_index = spack.store.STORE.db._index_path
if os.path.isfile(current_index):
backup = f"{current_index}.bkp"
shutil.copy(current_index, backup)
tty.msg(f"Created a back-up copy of the DB at {backup}")
spack.store.STORE.reindex()
tty.msg(f"The DB at {current_index} has been reindex to v{spack.database._DB_VERSION}")


@@ -17,6 +17,7 @@
pytest = None # type: ignore
import llnl.util.filesystem
import llnl.util.tty as tty
import llnl.util.tty.color as color
from llnl.util.tty.colify import colify
@@ -236,6 +237,12 @@ def unit_test(parser, args, unknown_args):
pytest_root = spack.extensions.load_extension(args.extension)
if args.numprocesses is not None and args.numprocesses > 1:
try:
import xdist # noqa: F401
except ImportError:
tty.error("parallel unit-test requires pytest-xdist module")
return 1
pytest_args.extend(
[
"--dist",


@@ -1,856 +0,0 @@
# Copyright Spack Project Developers. See COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import contextlib
import hashlib
import itertools
import json
import os
import platform
import re
import shutil
import sys
import tempfile
from typing import Dict, List, Optional, Sequence
import llnl.path
import llnl.util.lang
import llnl.util.tty as tty
from llnl.util.filesystem import path_contains_subdirectory, paths_containing_libs
import spack.caches
import spack.error
import spack.schema.environment
import spack.spec
import spack.util.executable
import spack.util.libc
import spack.util.module_cmd
import spack.version
from spack.util.environment import filter_system_paths
from spack.util.file_cache import FileCache
__all__ = ["Compiler"]
PATH_INSTANCE_VARS = ["cc", "cxx", "f77", "fc"]
FLAG_INSTANCE_VARS = ["cflags", "cppflags", "cxxflags", "fflags"]
@llnl.util.lang.memoized
def _get_compiler_version_output(compiler_path, version_arg, ignore_errors=()) -> str:
"""Invokes the compiler at a given path passing a single
version argument and returns the output.
Args:
compiler_path (path): path of the compiler to be invoked
version_arg (str): the argument used to extract version information
"""
compiler = spack.util.executable.Executable(compiler_path)
compiler_invocation_args = {
"output": str,
"error": str,
"ignore_errors": ignore_errors,
"timeout": 120,
"fail_on_error": True,
}
if version_arg:
output = compiler(version_arg, **compiler_invocation_args)
else:
output = compiler(**compiler_invocation_args)
return output
def get_compiler_version_output(compiler_path, *args, **kwargs) -> str:
"""Wrapper for _get_compiler_version_output()."""
# This ensures that we memoize compiler output by *absolute path*,
# not just executable name. If we don't do this, and the path changes
# (e.g., during testing), we can get incorrect results.
if not os.path.isabs(compiler_path):
compiler_path = spack.util.executable.which_string(compiler_path, required=True)
return _get_compiler_version_output(compiler_path, *args, **kwargs)
def tokenize_flags(flags_values, propagate=False):
"""Given a compiler flag specification as a string, this returns a list
where the entries are the flags. For compiler options which set values
using the syntax "-flag value", this function groups flags and their
values together. Any token not preceded by a "-" is considered the
value of a prior flag."""
tokens = flags_values.split()
if not tokens:
return []
flag = tokens[0]
flags_with_propagation = []
for token in tokens[1:]:
if not token.startswith("-"):
flag += " " + token
else:
flags_with_propagation.append((flag, propagate))
flag = token
flags_with_propagation.append((flag, propagate))
return flags_with_propagation
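Illustrative input and output for `tokenize_flags` (assuming the function above): a token without a leading `-` is folded into the preceding flag, so `-flag value` pairs stay together.

```python
print(tokenize_flags("-O2 -arch x86_64"))
# -> [('-O2', False), ('-arch x86_64', False)]
```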
#: regex for parsing linker lines
_LINKER_LINE = re.compile(r"^( *|.*[/\\])" r"(link|ld|([^/\\]+-)?ld|collect2)" r"[^/\\]*( |$)")
#: components of linker lines to ignore
_LINKER_LINE_IGNORE = re.compile(r"(collect2 version|^[A-Za-z0-9_]+=|/ldfe )")
#: regex to match linker search paths
_LINK_DIR_ARG = re.compile(r"^-L(.:)?(?P<dir>[/\\].*)")
#: regex to match linker library path arguments
_LIBPATH_ARG = re.compile(r"^[-/](LIBPATH|libpath):(?P<dir>.*)")
def _parse_link_paths(string):
"""Parse implicit link paths from compiler debug output.
This gives the compiler runtime library paths that we need to add to
the RPATH of generated binaries and libraries. It allows us to
ensure, e.g., that codes load the right libstdc++ for their compiler.
"""
lib_search_paths = False
raw_link_dirs = []
for line in string.splitlines():
if lib_search_paths:
if line.startswith("\t"):
raw_link_dirs.append(line[1:])
continue
else:
lib_search_paths = False
elif line.startswith("Library search paths:"):
lib_search_paths = True
if not _LINKER_LINE.match(line):
continue
if _LINKER_LINE_IGNORE.match(line):
continue
tty.debug(f"implicit link dirs: link line: {line}")
next_arg = False
for arg in line.split():
if arg in ("-L", "-Y"):
next_arg = True
continue
if next_arg:
raw_link_dirs.append(arg)
next_arg = False
continue
link_dir_arg = _LINK_DIR_ARG.match(arg)
if link_dir_arg:
link_dir = link_dir_arg.group("dir")
raw_link_dirs.append(link_dir)
link_dir_arg = _LIBPATH_ARG.match(arg)
if link_dir_arg:
link_dir = link_dir_arg.group("dir")
raw_link_dirs.append(link_dir)
implicit_link_dirs = list()
visited = set()
for link_dir in raw_link_dirs:
normalized_path = os.path.abspath(link_dir)
if normalized_path not in visited:
implicit_link_dirs.append(normalized_path)
visited.add(normalized_path)
tty.debug(f"implicit link dirs: result: {', '.join(implicit_link_dirs)}")
return implicit_link_dirs
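A toy invocation (compiler output and paths are made up) exercising both recognized shapes, `-L` arguments on a linker line and an indented `Library search paths:` block:

```python
sample_output = """collect2 version 12
/usr/libexec/ld -L/opt/gcc/lib64 -L. main.o
Library search paths:
\t/opt/gcc/lib64
"""
print(_parse_link_paths(sample_output))  # ['/opt/gcc/lib64'] (deduplicated)
```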
@llnl.path.system_path_filter
def _parse_non_system_link_dirs(string: str) -> List[str]:
"""Parses link paths out of compiler debug output.
Args:
string: compiler debug output as a string
Returns:
Implicit link paths parsed from the compiler output
"""
link_dirs = _parse_link_paths(string)
# Remove directories that do not exist. Some versions of the Cray compiler
# report nonexistent directories
link_dirs = [d for d in link_dirs if os.path.isdir(d)]
# Return set of directories containing needed compiler libs, minus
# system paths. Note that 'filter_system_paths' only checks for an
# exact match, while 'in_system_subdirectory' checks if a path contains
# a system directory as a subdirectory
link_dirs = filter_system_paths(link_dirs)
return list(p for p in link_dirs if not in_system_subdirectory(p))
def in_system_subdirectory(path):
system_dirs = [
"/lib/",
"/lib64/",
"/usr/lib/",
"/usr/lib64/",
"/usr/local/lib/",
"/usr/local/lib64/",
]
return any(path_contains_subdirectory(path, x) for x in system_dirs)
class Compiler:
"""This class encapsulates a Spack "compiler", which includes C,
C++, and Fortran compilers. Subclasses should implement
support for specific compilers, their possible names, arguments,
and how to identify the particular type of compiler."""
# Optional prefix regexes for searching for this type of compiler.
# Prefixes are sometimes used for toolchains
prefixes: List[str] = []
# Optional suffix regexes for searching for this type of compiler.
# Suffixes are used by some frameworks, e.g. macports uses an '-mp-X.Y'
# version suffix for gcc.
suffixes = [r"-.*"]
#: Compiler argument that produces version information
version_argument = "-dumpversion"
#: Return values to ignore when invoking the compiler to get its version
ignore_version_errors: Sequence[int] = ()
#: Regex used to extract version from compiler's output
version_regex = "(.*)"
# These libraries are anticipated to be required by all executables built
# by any compiler
_all_compiler_rpath_libraries = ["libc", "libc++", "libstdc++"]
#: Platform matcher for Platform objects supported by compiler
is_supported_on_platform = lambda x: True
# Default flags used by a compiler to set an rpath
@property
def cc_rpath_arg(self):
return "-Wl,-rpath,"
@property
def cxx_rpath_arg(self):
return "-Wl,-rpath,"
@property
def f77_rpath_arg(self):
return "-Wl,-rpath,"
@property
def fc_rpath_arg(self):
return "-Wl,-rpath,"
@property
def linker_arg(self):
"""Flag that need to be used to pass an argument to the linker."""
return "-Wl,"
@property
def disable_new_dtags(self):
if platform.system() == "Darwin":
return ""
return "--disable-new-dtags"
@property
def enable_new_dtags(self):
if platform.system() == "Darwin":
return ""
return "--enable-new-dtags"
@property
def debug_flags(self):
return ["-g"]
@property
def opt_flags(self):
return ["-O", "-O0", "-O1", "-O2", "-O3"]
def __init__(
self,
cspec,
operating_system,
target,
paths,
modules: Optional[List[str]] = None,
alias=None,
environment=None,
extra_rpaths=None,
enable_implicit_rpaths=None,
**kwargs,
):
self.spec = cspec
self.operating_system = str(operating_system)
self.target = target
self.modules = modules or []
self.alias = alias
self.environment = environment or {}
self.extra_rpaths = extra_rpaths or []
self.enable_implicit_rpaths = enable_implicit_rpaths
self.cache = COMPILER_CACHE
self.cc = paths[0]
self.cxx = paths[1]
self.f77 = None
self.fc = None
if len(paths) > 2:
self.f77 = paths[2]
if len(paths) == 3:
self.fc = self.f77
else:
self.fc = paths[3]
# Unfortunately have to make sure these params are accepted
# in the same order they are returned by sorted(flags)
# in compilers/__init__.py
self.flags = spack.spec.FlagMap(self.spec)
for flag in self.flags.valid_compiler_flags():
value = kwargs.get(flag, None)
if value is not None:
values_with_propagation = tokenize_flags(value, False)
for value, propagation in values_with_propagation:
self.flags.add_flag(flag, value, propagation)
# caching value for compiler reported version
# used for version checks for API, e.g. C++11 flag
self._real_version = None
def __eq__(self, other):
return (
self.cc == other.cc
and self.cxx == other.cxx
and self.fc == other.fc
and self.f77 == other.f77
and self.spec == other.spec
and self.operating_system == other.operating_system
and self.target == other.target
and self.flags == other.flags
and self.modules == other.modules
and self.environment == other.environment
and self.extra_rpaths == other.extra_rpaths
and self.enable_implicit_rpaths == other.enable_implicit_rpaths
)
def __hash__(self):
return hash(
(
self.cc,
self.cxx,
self.fc,
self.f77,
self.spec,
self.operating_system,
self.target,
str(self.flags),
str(self.modules),
str(self.environment),
str(self.extra_rpaths),
self.enable_implicit_rpaths,
)
)
def verify_executables(self):
"""Raise an error if any of the compiler executables is not valid.
This method confirms that for all of the compilers (cc, cxx, f77, fc)
that have paths, those paths exist and are executable by the current
user.
Raises a CompilerAccessError if any of the non-null paths for the
compiler are not accessible.
"""
def accessible_exe(exe):
# compilers may contain executable names (on Cray or user edited)
if not os.path.isabs(exe):
exe = spack.util.executable.which_string(exe)
if not exe:
return False
return os.path.isfile(exe) and os.access(exe, os.X_OK)
# setup environment before verifying in case we have executable names
# instead of absolute paths
with self.compiler_environment():
missing = [
cmp
for cmp in (self.cc, self.cxx, self.f77, self.fc)
if cmp and not accessible_exe(cmp)
]
if missing:
raise CompilerAccessError(self, missing)
@property
def version(self):
return self.spec.version
@property
def real_version(self):
"""Executable reported compiler version used for API-determinations
E.g. C++11 flag checks.
"""
real_version_str = self.cache.get(self).real_version
if not real_version_str or real_version_str == "unknown":
return self.version
return spack.version.StandardVersion.from_string(real_version_str)
def implicit_rpaths(self) -> List[str]:
if self.enable_implicit_rpaths is False:
return []
output = self.compiler_verbose_output
if not output:
return []
link_dirs = _parse_non_system_link_dirs(output)
all_required_libs = list(self.required_libs) + Compiler._all_compiler_rpath_libraries
return list(paths_containing_libs(link_dirs, all_required_libs))
@property
def default_dynamic_linker(self) -> Optional[str]:
"""Determine default dynamic linker from compiler link line"""
output = self.compiler_verbose_output
if not output:
return None
return spack.util.libc.parse_dynamic_linker(output)
@property
def default_libc(self) -> Optional["spack.spec.Spec"]:
"""Determine libc targeted by the compiler from link line"""
# technically this should be testing the target platform of the compiler, but we don't have
# that, so stick to host platform for now.
if sys.platform in ("darwin", "win32"):
return None
dynamic_linker = self.default_dynamic_linker
if not dynamic_linker:
return None
return spack.util.libc.libc_from_dynamic_linker(dynamic_linker)
@property
def required_libs(self):
"""For executables created with this compiler, the compiler libraries
that would be generally required to run it.
"""
# By default every compiler returns the empty list
return []
@property
def compiler_verbose_output(self) -> Optional[str]:
"""Verbose output from compiling a dummy C source file. Output is cached."""
return self.cache.get(self).c_compiler_output
def _compile_dummy_c_source(self) -> Optional[str]:
if self.cc:
cc = self.cc
ext = "c"
else:
cc = self.cxx
ext = "cc"
if not cc or not self.verbose_flag:
return None
try:
tmpdir = tempfile.mkdtemp(prefix="spack-implicit-link-info")
fout = os.path.join(tmpdir, "output")
fin = os.path.join(tmpdir, f"main.{ext}")
with open(fin, "w", encoding="utf-8") as csource:
csource.write(
"int main(int argc, char* argv[]) { (void)argc; (void)argv; return 0; }\n"
)
cc_exe = spack.util.executable.Executable(cc)
for flag_type in ["cflags" if cc == self.cc else "cxxflags", "cppflags", "ldflags"]:
cc_exe.add_default_arg(*self.flags.get(flag_type, []))
with self.compiler_environment():
return cc_exe(self.verbose_flag, fin, "-o", fout, output=str, error=str)
except spack.util.executable.ProcessError as pe:
tty.debug("ProcessError: Command exited with non-zero status: " + pe.long_message)
return None
finally:
shutil.rmtree(tmpdir, ignore_errors=True)
@property
def verbose_flag(self) -> Optional[str]:
"""
This property should be overridden in the compiler subclass if a
verbose flag is available.
If it is not overridden, it is assumed to not be supported.
"""
# This property should be overridden in the compiler subclass if
# OpenMP is supported by that compiler
@property
def openmp_flag(self):
# If it is not overridden, assume it is not supported and warn the user
raise UnsupportedCompilerFlag(self, "OpenMP", "openmp_flag")
# This property should be overridden in the compiler subclass if
# C++98 is not the default standard for that compiler
@property
def cxx98_flag(self):
return ""
# This property should be overridden in the compiler subclass if
# C++11 is supported by that compiler
@property
def cxx11_flag(self):
# If it is not overridden, assume it is not supported and warn the user
raise UnsupportedCompilerFlag(self, "the C++11 standard", "cxx11_flag")
# This property should be overridden in the compiler subclass if
# C++14 is supported by that compiler
@property
def cxx14_flag(self):
# If it is not overridden, assume it is not supported and warn the user
raise UnsupportedCompilerFlag(self, "the C++14 standard", "cxx14_flag")
# This property should be overridden in the compiler subclass if
# C++17 is supported by that compiler
@property
def cxx17_flag(self):
# If it is not overridden, assume it is not supported and warn the user
raise UnsupportedCompilerFlag(self, "the C++17 standard", "cxx17_flag")
# This property should be overridden in the compiler subclass if
# C99 is supported by that compiler
@property
def c99_flag(self):
# If it is not overridden, assume it is not supported and warn the user
raise UnsupportedCompilerFlag(self, "the C99 standard", "c99_flag")
# This property should be overridden in the compiler subclass if
# C11 is supported by that compiler
@property
def c11_flag(self):
# If it is not overridden, assume it is not supported and warn the user
raise UnsupportedCompilerFlag(self, "the C11 standard", "c11_flag")
@property
def cc_pic_flag(self):
"""Returns the flag used by the C compiler to produce
Position Independent Code (PIC)."""
return "-fPIC"
@property
def cxx_pic_flag(self):
"""Returns the flag used by the C++ compiler to produce
Position Independent Code (PIC)."""
return "-fPIC"
@property
def f77_pic_flag(self):
"""Returns the flag used by the F77 compiler to produce
Position Independent Code (PIC)."""
return "-fPIC"
@property
def fc_pic_flag(self):
"""Returns the flag used by the FC compiler to produce
Position Independent Code (PIC)."""
return "-fPIC"
# Note: This is not a class method. The class methods are used to detect
# compilers on PATH based systems, and do not set up the run environment of
# the compiler. This method can be called on `module` based systems as well
def get_real_version(self) -> str:
"""Query the compiler for its version.
This is the "real" compiler version, regardless of what is in the
compilers.yaml file, which the user can change to name their compiler.
Use the runtime environment of the compiler (modules and environment
modifications) to enable the compiler to run properly on any platform.
"""
cc = spack.util.executable.Executable(self.cc)
try:
with self.compiler_environment():
output = cc(
self.version_argument,
output=str,
error=str,
ignore_errors=tuple(self.ignore_version_errors),
)
return self.extract_version_from_output(output)
except spack.util.executable.ProcessError:
return "unknown"
@property
def prefix(self):
"""Query the compiler for its install prefix. This is the install
path as reported by the compiler. Note that paths for cc, cxx, etc
are not enough to find the install prefix of the compiler, since
they can be symlinks, wrappers, or filenames instead of absolute paths."""
raise NotImplementedError("prefix is not implemented for this compiler")
#
# Compiler classes have methods for querying the version of
# specific compiler executables. This is used when discovering compilers.
#
# Compiler *instances* are just data objects, and can only be
# constructed from an actual set of executables.
#
@classmethod
def default_version(cls, cc):
"""Override just this to override all compiler version functions."""
output = get_compiler_version_output(
cc, cls.version_argument, tuple(cls.ignore_version_errors)
)
return cls.extract_version_from_output(output)
@classmethod
@llnl.util.lang.memoized
def extract_version_from_output(cls, output: str) -> str:
"""Extracts the version from compiler's output."""
match = re.search(cls.version_regex, output)
return match.group(1) if match else "unknown"
@classmethod
def cc_version(cls, cc):
return cls.default_version(cc)
@classmethod
def search_regexps(cls, language):
# Compile all the regular expressions used for files beforehand.
# This searches for any combination of <prefix><name><suffix>
# defined for the compiler
compiler_names = getattr(cls, "{0}_names".format(language))
prefixes = [""] + cls.prefixes
suffixes = [""]
if sys.platform == "win32":
ext = r"\.(?:exe|bat)"
cls_suf = [suf + ext for suf in cls.suffixes]
ext_suf = [ext]
suffixes = suffixes + cls.suffixes + cls_suf + ext_suf
else:
suffixes = suffixes + cls.suffixes
regexp_fmt = r"^({0}){1}({2})$"
return [
re.compile(regexp_fmt.format(prefix, re.escape(name), suffix))
for prefix, name, suffix in itertools.product(prefixes, compiler_names, suffixes)
]
def setup_custom_environment(self, pkg, env):
"""Set any environment variables necessary to use the compiler."""
pass
def __repr__(self):
"""Return a string representation of the compiler toolchain."""
return self.__str__()
def __str__(self):
"""Return a string representation of the compiler toolchain."""
return "%s(%s)" % (
self.name,
"\n ".join(
(
str(s)
for s in (
self.cc,
self.cxx,
self.f77,
self.fc,
self.modules,
str(self.operating_system),
)
)
),
)
@contextlib.contextmanager
def compiler_environment(self):
# Avoid modifying os.environ if possible.
if not self.modules and not self.environment:
yield
return
# store environment to replace later
backup_env = os.environ.copy()
try:
# load modules and set env variables
for module in self.modules:
spack.util.module_cmd.load_module(module)
# apply other compiler environment changes
spack.schema.environment.parse(self.environment).apply_modifications()
yield
finally:
# Restore environment regardless of whether inner code succeeded
os.environ.clear()
os.environ.update(backup_env)
    def to_dict(self):
        flags_dict = {fname: " ".join(fvals) for fname, fvals in self.flags.items()}
        flags_dict.update(
            {attr: getattr(self, attr, None) for attr in FLAG_INSTANCE_VARS if hasattr(self, attr)}
        )
        result = {
            "spec": str(self.spec),
            "paths": {attr: getattr(self, attr, None) for attr in PATH_INSTANCE_VARS},
            "flags": flags_dict,
            "operating_system": str(self.operating_system),
            "target": str(self.target),
            "modules": self.modules or [],
            "environment": self.environment or {},
            "extra_rpaths": self.extra_rpaths or [],
        }
        if self.enable_implicit_rpaths is not None:
            result["implicit_rpaths"] = self.enable_implicit_rpaths
        if self.alias:
            result["alias"] = self.alias

        return result
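# A ``to_dict`` result is a JSON-serializable mapping, roughly shaped like the
# sketch below (values are illustrative only, and ``paths`` holds whatever
# attributes PATH_INSTANCE_VARS names):
#
#     {
#         "spec": "gcc@12.2.0",
#         "paths": {"cc": "/usr/bin/gcc", "cxx": "/usr/bin/g++", ...},
#         "flags": {"cflags": "-O2"},
#         "operating_system": "ubuntu22.04",
#         "target": "x86_64",
#         "modules": [],
#         "environment": {},
#         "extra_rpaths": [],
#     }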
class CompilerAccessError(spack.error.SpackError):
    def __init__(self, compiler, paths):
        msg = "Compiler '%s' has executables that are missing" % compiler.spec
        msg += " or are not executable: %s" % paths
        super().__init__(msg)


class InvalidCompilerError(spack.error.SpackError):
    def __init__(self):
        super().__init__("Compiler has no executables.")
class UnsupportedCompilerFlag(spack.error.SpackError):
    def __init__(self, compiler, feature, flag_name, ver_string=None):
        super().__init__(
            "{0} ({1}) does not support {2} (as compiler.{3}).".format(
                compiler.name, ver_string if ver_string else compiler.version, feature, flag_name
            ),
            "If you think it should, please edit the compiler.{0} subclass to".format(
                compiler.name
            )
            + " implement the {0} property and submit a pull request or issue.".format(flag_name),
        )
class CompilerCacheEntry:
    """Deserialized cache entry for a compiler"""

    __slots__ = ("c_compiler_output", "real_version")

    def __init__(self, c_compiler_output: Optional[str], real_version: str):
        self.c_compiler_output = c_compiler_output
        self.real_version = real_version

    @property
    def empty(self) -> bool:
        """Sometimes the compiler is temporarily broken, preventing us from getting output. The
        call site determines if that is a problem."""
        return self.c_compiler_output is None

    @classmethod
    def from_dict(cls, data: Dict[str, Optional[str]]):
        if not isinstance(data, dict):
            raise ValueError(f"Invalid {cls.__name__} data")
        c_compiler_output = data.get("c_compiler_output")
        real_version = data.get("real_version")
        if not isinstance(real_version, str) or not isinstance(
            c_compiler_output, (str, type(None))
        ):
            raise ValueError(f"Invalid {cls.__name__} data")
        return cls(c_compiler_output, real_version)
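# Round-trip sketch (the data is illustrative; the keys mirror those produced
# by ``CompilerCache.value`` below):
#
#     entry = CompilerCacheEntry.from_dict(
#         {"c_compiler_output": None, "real_version": "12.2.0"}
#     )
#     entry.empty  # -> True: the compiler produced no output this time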
class CompilerCache:
    """Base class for compiler output cache. Default implementation does not cache anything."""

    def value(self, compiler: Compiler) -> Dict[str, Optional[str]]:
        return {
            "c_compiler_output": compiler._compile_dummy_c_source(),
            "real_version": compiler.get_real_version(),
        }

    def get(self, compiler: Compiler) -> CompilerCacheEntry:
        return CompilerCacheEntry.from_dict(self.value(compiler))
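# Note that this base class behaves as a null object: ``get`` recomputes
# ``_compile_dummy_c_source`` and ``get_real_version`` on every call, so
# substituting it for ``FileCompilerCache`` disables caching without changing
# any call site.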
class FileCompilerCache(CompilerCache):
    """Cache for compiler output, which is used to determine implicit link paths, the default libc
    version, and the compiler version."""

    name = os.path.join("compilers", "compilers.json")

    def __init__(self, cache: "FileCache") -> None:
        self.cache = cache
        self.cache.init_entry(self.name)
        self._data: Dict[str, Dict[str, Optional[str]]] = {}

    def _get_entry(self, key: str, *, allow_empty: bool) -> Optional[CompilerCacheEntry]:
        try:
            entry = CompilerCacheEntry.from_dict(self._data[key])
            return entry if allow_empty or not entry.empty else None
        except ValueError:
            del self._data[key]
        except KeyError:
            pass
        return None
    def get(self, compiler: Compiler) -> CompilerCacheEntry:
        # Cache hit
        try:
            with self.cache.read_transaction(self.name) as f:
                assert f is not None
                self._data = json.loads(f.read())
                assert isinstance(self._data, dict)
        except (json.JSONDecodeError, AssertionError):
            self._data = {}

        key = self._key(compiler)
        value = self._get_entry(key, allow_empty=False)
        if value is not None:
            return value

        # Cache miss
        with self.cache.write_transaction(self.name) as (old, new):
            try:
                assert old is not None
                self._data = json.loads(old.read())
                assert isinstance(self._data, dict)
            except (json.JSONDecodeError, AssertionError):
                self._data = {}

            # Use a cache entry that may have been created by another process in the meantime.
            entry = self._get_entry(key, allow_empty=True)

            # Finally, compute the cache entry.
            if entry is None:
                self._data[key] = self.value(compiler)
                entry = CompilerCacheEntry.from_dict(self._data[key])

            new.write(json.dumps(self._data, separators=(",", ":")))

            return entry
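    # ``get`` is effectively double-checked: a cheap read transaction serves
    # the common hit, and on a miss the write transaction re-reads the file
    # before computing, so an entry written by a concurrent process in the
    # meantime is reused rather than recomputed.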
    def _key(self, compiler: Compiler) -> str:
        as_bytes = json.dumps(compiler.to_dict(), separators=(",", ":")).encode("utf-8")
        return hashlib.sha256(as_bytes).hexdigest()
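    # Keying on the SHA-256 of the canonical JSON form means two compilers
    # with byte-identical ``to_dict()`` output share one cache entry, while
    # any change to paths, flags, modules, etc. naturally invalidates it.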
def _make_compiler_cache():
    return FileCompilerCache(spack.caches.MISC_CACHE)


COMPILER_CACHE: CompilerCache = llnl.util.lang.Singleton(_make_compiler_cache)  # type: ignore
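# Usage sketch (``compiler`` stands for any concrete Compiler instance, and
# we assume the ``Singleton`` wrapper instantiates the cache lazily on first
# use):
#
#     entry = COMPILER_CACHE.get(compiler)
#     if not entry.empty:
#         output = entry.c_compiler_output  # reused across processes
#     real_version = entry.real_version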
