Compare commits

...

67 Commits

Author SHA1 Message Date
Todd Gamblin
690fad1182 WIP 2025-03-25 22:34:39 -07:00
Todd Gamblin
13446994ab WIP 2025-03-25 22:34:39 -07:00
Todd Gamblin
327462e8e2 info: generify when-grouping code
We want to show dependencies grouped by conditions, as we already do
with variants. This takes the first step and generifies the variant
display code.
2025-03-25 22:34:39 -07:00
Massimiliano Culpo
5b3942a489 Turn compilers into nodes (#45189)
## Summary

Compilers stop being a *node attribute* and become a *build-only* dependency.

Packages may declare a dependency on the `c`, `cxx`, or `fortran` languages, which
are now treated as virtuals; compilers are *providers* of one or more of those
languages. Compilers can also inject runtime dependencies on the nodes being
compiled. An example graph for something as simple as `zlib-ng` is the following:

<p align="center">
<img src="https://github.com/user-attachments/assets/ee6471cb-09fd-4127-9f16-b9fe6d1338ac" alt="zlib-ng DAG" width="80%" height="auto">
</p>

Here `gcc` is used for both the `c` and `cxx` languages. Edges are annotated with
the virtuals they satisfy (`c`, `cxx`, `libc`). `gcc` injects `gcc-runtime` on the nodes
being compiled. `glibc` is also injected for packages that require `c`. The
`compiler-wrapper` is explicitly represented as a node in the DAG, and is included in
the hash.

This change in the model has implications for the semantics of the `%` sigil, as
discussed in #44379, and requires a version bump for our `Specfile`, `Database`,
and `Lockfile` formats.

## Breaking changes

Breaking changes below may impact users of this branch.

### 1. Custom, non-numeric versions of compilers are not supported

Currently, users can assign any custom version they want to a compiler, and Spack
will try to recover the "real version" whenever the custom version fails some operation.
Deducing the "real version" requires running the compiler, which can add needless
overhead to common operations.

Since any information that a version like `gcc@foo` might convey can also be
expressed as a suffix on the correct numeric version, e.g. `gcc@10.5.0-foo`, Spack
will **no longer try** to deduce real versions for compilers.

In other words, users should not expect `gcc@foo` to behave like `gcc@X.Y.Z`
internally.
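
For illustration, here is a minimal sketch of why the suffix form loses nothing,
assuming Spack's usual version-range semantics:

```python
from spack.spec import Spec

# A suffixed numeric version still exposes its real version to range
# constraints; a purely symbolic version like gcc@foo would not.
s = Spec("gcc@10.5.0-foo")
assert s.satisfies("@10:")  # the numeric prefix keeps participating in ranges
```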

### 2. The `%` sigil in the spec syntax means "direct build dependency"

The `%` sigil in the spec syntax means *"direct build dependency"*, and is no longer
a node attribute. This means that:

```python
node.satisfies("%gcc")
``` 
is true only if `gcc` is a direct build dependency of the node. *Nodes without a compiler
dependency are allowed.*
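
A hedged sketch of the new semantics (package names are illustrative):

```python
from spack.spec import Spec

# "%" now attaches a direct build dependency to the node it follows,
# instead of setting a DAG-wide node attribute.
s = Spec("mpileaks %gcc ^mpich %clang")
# mpileaks must have gcc as a direct build dependency, and mpich must
# have clang; neither constraint propagates to other nodes.
```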

### 3. `parent["child"]` and `node in spec` now only inspect the link/run sub-DAG and direct build dependencies

The subscript notation for `Spec`:

```python
parent["child"]
```

will look for a `child` node only in the link/run transitive graph of `parent`, and in its
direct build dependencies. This means that to reach a transitive build dependency,
we must first pass through the node it is associated with. 

Assuming `parent` does not depend on `cmake` directly, but depends on a
`CMakePackage` such as `hdf5`, we have the following situation:

```python
# This one raises an Exception, since "parent" does not depend on cmake
parent["cmake"]
# This one is ok
cmake = parent["hdf5"]["cmake"]
```
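
A small, hedged sketch of how to guard such lookups (`__contains__` follows the
same rule as the subscript):

```python
# Reach cmake through the node it is associated with, and guard the access,
# since transitive build dependencies are no longer reachable from the root.
hdf5 = parent["hdf5"]
cmake = hdf5["cmake"] if "cmake" in hdf5 else None
```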

### 4. Externals differing by just the compiler attribute

Externals are nodes whose dependencies are trimmed, and that _is not planned to
change_ in this branch. Currently, on `develop`, it is ok to write:

```yaml
packages:
  hdf5:
    externals:
    - spec: hdf5@1.12 %gcc
      prefix: /prefix/gcc
    - spec: hdf5@1.12 %clang
      prefix: /prefix/clang
```
and Spack will account for the compiler node attribute when computing the optimal
spec. In this branch, using externals with a compiler specified is allowed only if a
compiler in the DAG matches the constraints specified on the external. _The external
will still be represented as a single node without dependencies_.
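
A minimal sketch of that invariant (the prefix path is illustrative):

```python
from spack.spec import Spec

# An external remains a single, dependency-free node; its "%gcc" annotation
# is only a constraint matched against compilers elsewhere in the DAG.
ext = Spec("hdf5@1.12")
ext.external_path = "/prefix/gcc"
assert ext.external and not ext.dependencies()
```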

### 5. Spec matrices enforcing a compiler

Currently we can have matrices of the form:

```yaml
matrix:
- [x, y, z]
- [%gcc, %clang]
```
to get the cross-product of specs and compilers. We can disregard the nature of the
packages in the first row, since the compiler is a node attribute required on each node.

In this branch, instead, we require a spec to depend on `c`, `cxx`, or `fortran` for the
`%` to have any meaning. If any spec in the first row does not depend on these
languages, there will be a concretization error.

## Deprecations

* The entire `compilers` section in the configuration (i.e., `compilers.yaml`) has been
  deprecated, and current entries will be removed in v1.2.0. For the time being, if Spack
  finds any `compilers` configuration, it will try to convert it automatically to a set of
  external packages.
* The `packages:compiler` soft-preference has been deprecated. It will be removed
  in v1.1.0.

## Other notable changes

* The tokens `{compiler}`, `{compiler.version}`, and `{compiler.name}` in `Spec.format`
  expand to `"none"` if a Spec does not depend on C, C++, or Fortran.
* The default install tree layout is now
  `"{architecture.platform}-{architecture.target}/{name}-{version}-{hash}"`
  (see the sketch below).
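
Both bullets above are `Spec.format` behavior, so the new layout can be previewed
on any concrete spec (output is illustrative):

```python
from spack.spec import Spec

spec = Spec("zlib-ng").concretized()
layout = "{architecture.platform}-{architecture.target}/{name}-{version}-{hash}"
print(spec.format(layout))        # e.g. linux-zen2/zlib-ng-2.2.1-abc1234
print(spec.format("{compiler}"))  # "none" if no dependency on c/cxx/fortran
```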

## Known limitations

The major known limitation of this branch that we intend to fix before v1.0 is that
compilers cannot be bootstrapped directly.

In this branch we can build a new compiler using an existing external compiler, for
instance:
```
$ spack install gcc@14 %gcc@10.5.0
```

where `gcc@10.5.0` is external, and `gcc@14` is to be built.

What we can't do at the moment is use a yet-to-be-built compiler and expect it to
be bootstrapped, e.g.:

```
$ spack install hdf5 %gcc@14
```

We plan to tackle this issue in a follow-up PR.

---------

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
Signed-off-by: Harmen Stoppels <me@harmenstoppels.nl>
Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2025-03-25 22:32:49 -06:00
Todd Gamblin
a9c879d53e gdk-pixbuf: Use the official GNOME mirror. (#49690)
The `umea.se` mirror seems to have gone down (or at least is forbidden for now).

Reverts the checksum changes in #47825; points at the official GNOME mirror
instead of the prior two places we were getting `gdk-pixbuf` from.

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2025-03-25 18:57:53 -06:00
Todd Gamblin
f42f59c84b concretizer: don't use clingo.Symbol for setup (#49650)
Since we moved from creating clingo symbols directly to constructing a pure string
representation of the program, we don't need to make `AspFunctions` into symbols before
turning them into strings. We can just write strings like clingo would.

This cuts about 25% off the setup time by avoiding an unnecessary round trip.
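
A hedged illustration of the idea (names below are hypothetical, not the actual
`AspFunction` API):

```python
# Emit an ASP fact directly as a string, skipping the
# AspFunction -> clingo.Symbol -> string round trip.
def asp_fact(name: str, *args) -> str:
    quoted = ", ".join(f'"{a}"' if isinstance(a, str) else str(a) for a in args)
    return f"{name}({quoted})."

print(asp_fact("version_declared", "zlib-ng", "2.2.1", 0))
# -> version_declared("zlib-ng", "2.2.1", 0).
```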

- [x] create strings directly from `AspFunctions`
- [x] remove unused `symbol()` method on `AspFunction`
- [x] setup no longer tries to call `symbol()`

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
Signed-off-by: Greg Becker <becker33@llnl.gov>

---------

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
Co-authored-by: Greg Becker <becker33@llnl.gov>
2025-03-25 17:55:27 -07:00
Jon Rood
313b7d4cdb nalu-wind: add version 2.2.2. (#49685) 2025-03-25 15:45:23 -07:00
Melven Roehrig-Zoellner
bd41863797 scorep: ensure gcc-plugin is built, patch gcc@14 (#49257)
* scorep: ensure gcc-plugin is built, patch gcc@14
* scorep: patch only to non-deprecated versions
2025-03-25 14:45:53 -07:00
Robert Maaskant
b0dba4ff5a yarn: add v4.6.0, v4.7.0 (#49177)
* yarn: v4.6.0
* py-ipympl: pin yarn to v1
* rstudio: pin yarn to v1
* yarn: add v4.7.0
2025-03-25 14:37:44 -07:00
Alec Scott
4ff43d7fa9 ci: future-proof for enabling GitHub merge queues later (#49665) 2025-03-25 10:07:37 -07:00
Jon Rood
c1df1c7ee5 trilinos: fix kokkos constraints for version 16 (#49643)
* trilinos: add equals sign to kokkos dependencies.

* Fix some license headers to pass style check.

* Generalize a bit.

* Generalize a bit more.

* datatransferkit: constrain to maximum of trilinos@16.0.
2025-03-25 10:43:13 -06:00
arezaii
9ac6ecd5ba Chapel 2.4 (#49662)
* limit some patches by chapel version

* fix short output version if building main

* update patches, remove unneeded 'self' refs

* fix spack style

* update patches with changes from PR

* change py-protobuf to just protobuf dep

* add PR numbers for patches

* fix spack style

* update 2.4 sha256
2025-03-25 09:01:58 -07:00
Todd Gamblin
20ddb85020 setup-env.csh: Harden for people who like aliases (#49670)
A user had `grep` aliased to `grep -n`, which was causing `csh` setup to
fail due to number prefixes in `SPACK_ROOT`.

- [x] Prefix invocations of `grep` and `sed` (which are not builtin) with `\`
      to avoid any aliases.
- [x] Avoid using `dirname` altogether -- use csh's `:h` modifier (which does
      the same thing) instead.

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2025-03-25 09:01:28 -07:00
Nicholas Sly
2ced87297d Add dbus patch for container builds. (#49402) 2025-03-24 19:00:39 -06:00
Piotr Sacharuk
aa00c3fe1f trilinos: Apply workaround for oneAPI compiler for problems with build (#49636)
* Fix problem at least with datatransferkit

* Include patch 11676 from trilinos

* Add patches for trilinos 13.4.1

* style check failed

* Update links for patches

* additional style check failed
2025-03-24 17:05:43 -07:00
psakievich
0158fc46aa Add recursive argument to spack develop (#46885)
* Add recursive argument to spack develop

This effort allows for a recursive develop call,
which traverses from the given develop spec back to the root(s)
and marks all packages along the path as develop.

If people are doing development across the graph, then paying
fetch and full-rebuild costs every time `spack develop` is called
is unnecessary and expensive.

Also remove the constraint that specs be concrete, and simply take the
max(version) if a version is not given. This defaults to the
highest infinity version, which is also the logical best guess for
doing development.
2025-03-24 16:50:16 -07:00
Richard Berger
8ac826cca8 hip: add missing HIPCC_LINK_FLAGS_APPEND (#49436)
* hip: add missing HIPCC_LINK_FLAGS_APPEND

---------

Co-authored-by: rbberger <rbberger@users.noreply.github.com>
2025-03-24 13:58:47 -07:00
Teague Sterling
1b829a4a28 kentutils: add v478 (#49521) 2025-03-24 13:33:41 -07:00
Robert Maaskant
e2ed1c2308 py-pymoo: add v0.6.1.3 (#49603)
* py-pymoo: add v0.6.1.3
* py-pymoo: use a when context
* py-pymoo: group build only dependencies
2025-03-24 13:29:05 -07:00
Robert Maaskant
94b828add1 prometheus: improve dependency specs (#49175)
* prometheus: improve dependency specs
* fixup! prometheus: improve dependency specs
* prometheus: fix typo in nodejs dep
* prometheus: fix checksums
2025-03-24 13:27:10 -07:00
Eric Berquist
fd7dcf3a3f sst-core: fix linkage against ncurses, zlib, and HDF5 (#49152)
* sst-core: fix for > 14.0.0 requiring ncurses

* sst-core: backport fix for curses detection

* sst-core: ensure HDF5 is ignored if not specified

* sst-core: HDF5 integration is via C++

* sst-core: switch to with_or_without for configure

* sst-core: switch to enable_or_disable for configure

* sst-core: control memory pools and debug output with variants
2025-03-24 12:45:12 -07:00
Alec Scott
e3bb0d77bc hugo: add v0.145.0 (#49576) 2025-03-24 13:23:21 -06:00
Jon Rood
25761b13e5 kokkos-kernels: rewrite package to fix errors (#49598)
* kokkos-kernels: fix eti dependency statements.

* kokkos-kernels: rewrite package.

* Fix errors.

* Style.

* Style.

* Cleanup.
2025-03-24 12:25:20 -06:00
Stephen Nicholas Swatman
ae48faa83a detray: add v0.90.0-v0.93.0 (#49658)
This commit adds detray versions 0.90.0, 0.91.0, 0.92.0, and 0.93.0.
2025-03-24 10:13:03 -07:00
Afzal Patel
e15a3b0717 hip: fix hip-tests error (#49563) 2025-03-24 10:04:19 -07:00
Andrey Perestoronin
2c8afc5443 Add new 2025.1.0 release for intel-oneapi products (#49642)
* Add new versions of intel-oneapi products

* restore advisor 2025.0.0 release

* fix styling
2025-03-24 11:02:55 -06:00
Sreenivasa Murthy Kolam
99479b7e77 rocprofiler-sdk: new package (#49406)
* rocprofiler-sdk new package
* add license, rocm tag
2025-03-24 09:57:39 -07:00
psakievich
5d0b5ed73c EnvironmentModifications: fix reverse prepend/append (#49645)
Pop a single item from the front/back, respectively, instead of removing all instances.
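
A hedged sketch of the fixed semantics, using plain lists rather than Spack's actual
classes:

```python
# Reversing a single prepend pops one matching entry from the front,
# instead of removing every instance of the value.
path = ["/opt/new", "/usr/bin", "/opt/new"]
if path and path[0] == "/opt/new":
    path.pop(0)
assert path == ["/usr/bin", "/opt/new"]
```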
2025-03-24 17:29:27 +01:00
Ryan Krattiger
151af13be2 Unit tests: error message when running parallel without xdist (#49632) 2025-03-24 09:25:45 -07:00
Alec Scott
93ea3f51e7 zig: add v0.14.0 (#49629)
* zig: add v0.14.0

* Fix commit hash

* Fix tag for v0.14.0
2025-03-24 08:07:34 -07:00
Alec Scott
a3abc1c492 Fix ci failures after merge of mock tests created before license transition (#49638) 2025-03-21 21:17:56 -06:00
Simon Pintarelli
401484ddf4 remove version prior 7.3 from SIRIUS (#49584) 2025-03-21 20:46:39 +01:00
Robert Maaskant
fc4e76e6fe py-setuptools-scm: fix deps (#49609) 2025-03-21 11:18:11 -07:00
Alec Scott
0853f42723 smee-client: add v3.1.1 (#49578) 2025-03-21 11:56:37 -06:00
Alec Scott
19ca69d0d8 typos: add v1.30.2 (#49577)
* typos: add v1.30.2

* Add rust dependency constraint
2025-03-21 11:56:00 -06:00
Alec Scott
036794725f bfs: add v4.0.6 (#49575) 2025-03-21 11:55:16 -06:00
Alec Scott
e5a2c9aee3 emacs: add v30.1 (#49574) 2025-03-21 11:54:31 -06:00
Alec Scott
5364b88777 fzf: add v0.60.3 (#49573) 2025-03-21 11:43:26 -06:00
Alec Scott
7d1b6324e1 npm: add v11.2.0 (#49572) 2025-03-21 11:42:45 -06:00
Alexandre DENIS
3d0263755e mpibenchmark: add v0.6 (#49612)
* mpibenchmark: add version 0.6
* mpibenchmark: fix syntax
* mpibenchmark: improve package description
2025-03-21 09:54:56 -07:00
Jon Rood
54ad5dca45 exawind: add versions and commits to tags (#49615)
* exawind: add versions and commits to tags.
* Add new version of TIOGA.
* openfast: add commits to tags.
* amr-wind: add dependencies.
* amr-wind: add more settings.

---------

Co-authored-by: jrood-nrel <jrood-nrel@users.noreply.github.com>
2025-03-21 09:49:37 -07:00
Lehman Garrison
ee206952c9 py-uv: add v0.6.8 (#49616) 2025-03-21 09:38:17 -07:00
Ryan Krattiger
4ccef372e8 E4S: Allow building newer ParaView for Linux CI (#47823)
5.11 was locked at a time when master was built by default. Allow building newer
ParaView in CI.
2025-03-21 09:07:37 -07:00
Jon Rood
ac6e534806 openfast: patch versions to fix openmp bug. (#49631) 2025-03-21 09:06:56 -07:00
Greg Becker
5983f72439 fix extendee_spec for transitive dependencies on potential extendees (#48025)
* fix extendee_spec for transitive dependencies on potential extendees

* regression test

* resolve conditional extensions on direct deps

* remove outdated comment

Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>

---------

Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2025-03-21 08:27:51 -07:00
Stephen Sachs
6e10fac7ae openfoam: restrict the CGAL version compatible with C++14 (#47689)
* openfoam: restrict the CGAL version compatible with C++14

CGAL throws an
[error](50219fc33b/Installation/include/CGAL/config.h (L147))
if a C++ standard lower than 17 is used, while OpenFOAM [forces
C++14](https://develop.openfoam.com/Development/openfoam/-/blob/develop/wmake/rules/General/Gcc/c++?ref_type=heads#L9).
This hard C++17 dependency was
[introduced](e54408370b)
in CGAL version 6.

* Add upper bound since openfoam now uses c++17

44f7a7268a
2025-03-21 08:23:33 -07:00
Derek Ryan Strong
ee6ea5155c Add libjpeg-turbo v3.0.4 (#48030) 2025-03-21 08:22:01 -07:00
Cyrus Harrison
48258e8ddc conduit: add v0.9.3 (#48736)
* add 0.9.3 release, fix license listed
* fix sha
2025-03-21 08:20:50 -07:00
Robert Maaskant
429b0375ed yarn: v1.22.22 (#49171) 2025-03-21 08:12:13 -07:00
Robert Maaskant
c6925ab83f new package: py-loky (#49602) 2025-03-21 08:06:36 -07:00
Wouter Deconinck
00d78dfa0c pythia8: add v8.313 (#49045)
* pythia8: add v8.313

* pythia8: conflicts ~yoda +rivet only when @8.313:
2025-03-21 07:59:55 -07:00
Wouter Deconinck
e072a91572 libx11: add v1.8.11 (#48863) 2025-03-21 07:58:12 -07:00
Wouter Deconinck
b7eb0308d4 node-js: run tests with target test-only (#49516) 2025-03-21 07:52:26 -07:00
Wouter Deconinck
c98ee6d8ac eigen: build test executables when self.run_tests (#49540) 2025-03-21 07:50:54 -07:00
Wouter Deconinck
b343ebb64e qt-base: pass SBOM PATH from cmake_args (#49596)
* qt-base: pass SBOM PATH from cmake_args

* qt-base: self.define from list

Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>

---------

Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>
2025-03-21 07:50:09 -07:00
Adam J. Stewart
e178d2c75d py-torchmetrics: add v1.7.0 (#49633) 2025-03-21 07:44:47 -07:00
Matt Thompson
9b64560ae6 mapl: add v2.53.3, v2.54.2 (#49610) 2025-03-21 07:18:22 -07:00
David Ozog
ca226f3506 sos: (and tests-sos:) update to v1.5.3, add main branch (#49613)
* sos/tests-sos: update to v1.5.3 & add main branch

* [@spackbot] updating style on behalf of davidozog

* sos: cleanup try/except around cloning tests

---------

Co-authored-by: davidozog <davidozog@users.noreply.github.com>
2025-03-21 09:28:16 -04:00
Caetano Melone
8569e04fea py-ruff: add v0.11.1 (#49617)
* py-ruff: add v0.11.1

Add latest version and update minimum supported rust version for 0.9.8
and up.

[before](https://github.com/astral-sh/ruff/blob/0.9.7/Cargo.toml#L7) and
[after](https://github.com/astral-sh/ruff/blob/0.9.8/Cargo.toml#L7)

* minimum rust version

Co-authored-by: Alec Scott <hi@alecbcs.com>

---------

Co-authored-by: Alec Scott <hi@alecbcs.com>
2025-03-21 09:14:13 +01:00
Robert Maaskant
32213d5e6b fix: spack audit issues (#49557) 2025-03-20 22:41:15 -07:00
Paul R. C. Kent
4891f3dbc9 rmgdft: add develop version (#49558) 2025-03-20 22:39:26 -07:00
Anderson Chauphan
2b5959c3dd trilinos: add v16.1.0 (#49628)
Signed-off-by: Anderson Chauphan <achauph@sandia.gov>
2025-03-20 22:37:43 -07:00
Suzanne Prentice
353db6752a ruby: add v3.2.5 (#49537) 2025-03-20 22:34:45 -07:00
Adam J. Stewart
bf24b8e82c py-lightning: add v2.5.1 (#49600) 2025-03-21 01:31:17 -04:00
psakievich
f2d830cd4c Get env_var mods from config (#49626) 2025-03-20 21:48:50 -05:00
brian-kelley
070bfa1ed7 KokkosKernels: apply PR 2296 as patch (#49627)
Applies this fix to all affected versions (4.0.00:4.4.00).
Fixes issue #49622.

Signed-off-by: Brian Kelley <bmkelle@sandia.gov>
2025-03-20 20:13:00 -06:00
Alec Scott
c79b6207e8 ci: add automatic checksum verification check (#45063)
Add a CI check to automatically verify the checksums of newly added
package versions:
    - [x] a new command, `spack ci verify-versions`
    - [x] a GitHub actions check to run the command
    - [x] tests for the new command

This also eliminates the suggestion for maintainers to manually verify added
checksums in the case of accidental version <--> checksum mismatches.

----

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2025-03-20 22:58:14 +01:00
459 changed files with 9012 additions and 9101 deletions

View File

@@ -9,6 +9,7 @@ on:
branches:
- develop
- releases/**
merge_group:
concurrency:
group: ci-${{github.ref}}-${{github.event.pull_request.number || github.run_number}}
@@ -25,13 +26,17 @@ jobs:
packages: ${{ steps.filter.outputs.packages }}
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
if: ${{ github.event_name == 'push' }}
if: ${{ github.event_name == 'push' || github.event_name == 'merge_group' }}
with:
fetch-depth: 0
# For pull requests it's not necessary to checkout the code
- uses: dorny/paths-filter@de90cc6fb38fc0963ad72b210f1f284cd68cea36
id: filter
with:
# For merge group events, compare against the target branch (main)
base: ${{ github.event_name == 'merge_group' && github.event.merge_group.base_ref || '' }}
# For merge group events, use the merge group head ref
ref: ${{ github.event_name == 'merge_group' && github.event.merge_group.head_sha || github.ref }}
# See https://github.com/dorny/paths-filter/issues/56 for the syntax used below
# Don't run if we only modified packages in the
# built-in repository or documentation
@@ -76,10 +81,11 @@ jobs:
prechecks:
needs: [ changes ]
uses: ./.github/workflows/valid-style.yml
uses: ./.github/workflows/prechecks.yml
secrets: inherit
with:
with_coverage: ${{ needs.changes.outputs.core }}
with_packages: ${{ needs.changes.outputs.packages }}
import-check:
needs: [ changes ]
@@ -93,7 +99,7 @@ jobs:
- name: Success
run: |
if [ "${{ needs.prechecks.result }}" == "failure" ] || [ "${{ needs.prechecks.result }}" == "canceled" ]; then
echo "Unit tests failed."
exit 1
else
exit 0
@@ -101,6 +107,7 @@ jobs:
coverage:
needs: [ unit-tests, prechecks ]
if: ${{ needs.changes.outputs.core }}
uses: ./.github/workflows/coverage.yml
secrets: inherit
@@ -113,10 +120,10 @@ jobs:
- name: Status summary
run: |
if [ "${{ needs.unit-tests.result }}" == "failure" ] || [ "${{ needs.unit-tests.result }}" == "canceled" ]; then
echo "Unit tests failed."
exit 1
elif [ "${{ needs.bootstrap.result }}" == "failure" ] || [ "${{ needs.bootstrap.result }}" == "canceled" ]; then
echo "Bootstrap tests failed."
exit 1
else
exit 0

View File

@@ -1,4 +1,4 @@
name: style
name: prechecks
on:
workflow_call:
@@ -6,6 +6,9 @@ on:
with_coverage:
required: true
type: string
with_packages:
required: true
type: string
concurrency:
group: style-${{github.ref}}-${{github.event.pull_request.number || github.run_number}}
@@ -30,6 +33,7 @@ jobs:
run: vermin --backport importlib --backport argparse --violations --backport typing -t=3.6- -vvv lib/spack/spack/ lib/spack/llnl/ bin/
- name: vermin (Repositories)
run: vermin --backport importlib --backport argparse --violations --backport typing -t=3.6- -vvv var/spack/repos
# Run style checks on the files that have been changed
style:
runs-on: ubuntu-latest
@@ -53,12 +57,25 @@ jobs:
- name: Run style tests
run: |
share/spack/qa/run-style-tests
audit:
uses: ./.github/workflows/audit.yaml
secrets: inherit
with:
with_coverage: ${{ inputs.with_coverage }}
python_version: '3.13'
verify-checksums:
if: ${{ inputs.with_packages == 'true' }}
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 2
- name: Verify Added Checksums
run: |
bin/spack ci verify-versions HEAD^1 HEAD
# Check that spack can bootstrap the development environment on Python 3.6 - RHEL8
bootstrap-dev-rhel8:
runs-on: ubuntu-latest

View File

@@ -19,7 +19,7 @@ config:
install_tree:
root: $spack/opt/spack
projections:
all: "{architecture}/{compiler.name}-{compiler.version}/{name}-{version}-{hash}"
all: "{architecture.platform}-{architecture.target}/{name}-{version}-{hash}"
# install_tree can include an optional padded length (int or boolean)
# default is False (do not pad)
# if padded_length is True, Spack will pad as close to the system max path

View File

@@ -15,12 +15,11 @@
# -------------------------------------------------------------------------
packages:
all:
compiler:
- apple-clang
- clang
- gcc
providers:
c: [apple-clang, llvm, gcc]
cxx: [apple-clang, llvm, gcc]
elf: [libelf]
fortran: [gcc]
fuse: [macfuse]
gl: [apple-gl]
glu: [apple-glu]
@@ -50,3 +49,12 @@ packages:
# although the version number used here isn't critical
- spec: apple-libuuid@1353.100.2
prefix: /Library/Developer/CommandLineTools/SDKs/MacOSX.sdk
c:
prefer:
- apple-clang
cxx:
prefer:
- apple-clang
fortran:
prefer:
- gcc

View File

@@ -15,19 +15,18 @@
# -------------------------------------------------------------------------
packages:
all:
compiler: [gcc, clang, oneapi, xl, nag, fj, aocc]
providers:
awk: [gawk]
armci: [armcimpi]
blas: [openblas, amdblis]
c: [gcc]
cxx: [gcc]
c: [gcc, llvm, intel-oneapi-compilers]
cxx: [gcc, llvm, intel-oneapi-compilers]
D: [ldc]
daal: [intel-oneapi-daal]
elf: [elfutils]
fftw-api: [fftw, amdfftw]
flame: [libflame, amdlibflame]
fortran: [gcc]
fortran: [gcc, llvm, intel-oneapi-compilers]
fortran-rt: [gcc-runtime, intel-oneapi-runtime]
fuse: [libfuse]
gl: [glx, osmesa]

View File

@@ -15,8 +15,8 @@
# -------------------------------------------------------------------------
packages:
all:
compiler:
- msvc
providers:
c : [msvc]
cxx: [msvc]
mpi: [msmpi]
gl: [wgl]

View File

@@ -457,6 +457,13 @@ developed package in the environment are concretized to match the
version (and other constraints) passed as the spec argument to the
``spack develop`` command.
When working deep in the graph it is often desirable to have multiple specs marked
as ``develop`` so you don't have to restage and/or do full rebuilds each time you
call ``spack install``. The ``--recursive`` flag can be used in these scenarios
to ensure that all the dependents of the initial spec you provide are also marked
as develop specs. The ``--recursive`` flag requires a pre-concretized environment
so the graph can be traversed from the supplied spec all the way to the root specs.
For packages with ``git`` attributes, git branches, tags, and commits can
also be used as valid concrete versions (see :ref:`version-specifier`).
This means that for a package ``foo``, ``spack develop foo@git.main`` will clone

View File

@@ -330,7 +330,7 @@ that ``--tests`` is passed to ``spack ci rebuild`` as part of the
- spack --version
- cd ${SPACK_CONCRETE_ENV_DIR}
- spack env activate --without-view .
- spack config add "config:install_tree:projections:${SPACK_JOB_SPEC_PKG_NAME}:'morepadding/{architecture}/{compiler.name}-{compiler.version}/{name}-{version}-{hash}'"
- spack config add "config:install_tree:projections:${SPACK_JOB_SPEC_PKG_NAME}:'morepadding/{architecture.platform}-{architecture.target}/{name}-{version}-{hash}'"
- mkdir -p ${SPACK_ARTIFACTS_ROOT}/user_data
- if [[ -r /mnt/key/intermediate_ci_signing_key.gpg ]]; then spack gpg trust /mnt/key/intermediate_ci_signing_key.gpg; fi
- if [[ -r /mnt/key/spack_public_key.gpg ]]; then spack gpg trust /mnt/key/spack_public_key.gpg; fi

View File

[Dozens of one-line file diffs collapsed: the compiler-wrapper symlinks under
`lib/spack/env` are all deleted. These include the vendored aliases `c++`, `c89`,
`c99`, `cpp`, `f77`, `f90`, `f95`, `fc`, `ftn`, and `ld` pointing at `cc`, and many
per-compiler subdirectory links pointing at `../cc`, `../cpp`, and `../fc`.]
View File

@@ -73,7 +73,7 @@ def index_by(objects, *funcs):
if isinstance(f, str):
f = lambda x: getattr(x, funcs[0])
elif isinstance(f, tuple):
f = lambda x: tuple(getattr(x, p) for p in funcs[0])
f = lambda x: tuple(getattr(x, p, None) for p in funcs[0])
result = {}
for o in objects:
@@ -1016,11 +1016,8 @@ def _receive_forwarded(self, context: str, exc: Exception, tb: List[str]):
def grouped_message(self, with_tracebacks: bool = True) -> str:
"""Print out an error message coalescing all the forwarded errors."""
each_exception_message = [
"{0} raised {1}: {2}{3}".format(
context,
exc.__class__.__name__,
exc,
"\n{0}".format("".join(tb)) if with_tracebacks else "",
"\n\t{0} raised {1}: {2}\n{3}".format(
context, exc.__class__.__name__, exc, f"\n{''.join(tb)}" if with_tracebacks else ""
)
for context, exc, tb in self.exceptions
]

View File

@@ -0,0 +1,20 @@
# Copyright Spack Project Developers. See COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Alias names to convert legacy compilers to builtin packages and vice-versa"""
BUILTIN_TO_LEGACY_COMPILER = {
"llvm": "clang",
"intel-oneapi-compilers": "oneapi",
"llvm-amdgpu": "rocmcc",
"intel-oneapi-compiler-classic": "intel",
"acfl": "arm",
}
LEGACY_COMPILER_TO_BUILTIN = {
"clang": "llvm",
"oneapi": "intel-oneapi-compilers",
"rocmcc": "llvm-amdgpu",
"intel": "intel-oneapi-compiler-classic",
"arm": "acfl",
}

View File

@@ -110,6 +110,13 @@ def __init__(self, root):
self._write_transaction_impl = llnl.util.lang.nullcontext
self._read_transaction_impl = llnl.util.lang.nullcontext
def _handle_old_db_versions_read(self, check, db, *, reindex: bool):
if not self.is_readable():
raise spack_db.DatabaseNotReadableError(
f"cannot read buildcache v{self.db_version} at {self.root}"
)
return self._handle_current_version_read(check, db)
class FetchCacheError(Exception):
"""Error thrown when fetching the cache failed, usually a composite error list."""
@@ -242,7 +249,7 @@ def _associate_built_specs_with_mirror(self, cache_key, mirror_url):
self._index_file_cache.init_entry(cache_key)
cache_path = self._index_file_cache.cache_path(cache_key)
with self._index_file_cache.read_transaction(cache_key):
db._read_from_file(cache_path)
db._read_from_file(pathlib.Path(cache_path))
except spack_db.InvalidDatabaseVersionError as e:
tty.warn(
f"you need a newer Spack version to read the buildcache index for the "

View File

@@ -234,14 +234,6 @@ def _root_spec(spec_str: str) -> str:
# Add a compiler and platform requirement to the root spec.
platform = str(spack.platforms.host())
if platform == "darwin":
spec_str += " %apple-clang"
elif platform == "windows":
spec_str += " %msvc"
elif platform == "linux":
spec_str += " %gcc"
elif platform == "freebsd":
spec_str += " %clang"
spec_str += f" platform={platform}"
target = archspec.cpu.host().family
spec_str += f" target={target}"

View File

@@ -15,11 +15,13 @@
import archspec.cpu
import spack.compiler
import spack.compilers
import spack.compilers.config
import spack.compilers.libraries
import spack.config
import spack.platforms
import spack.spec
import spack.traverse
import spack.version
from .config import spec_for_current_python
@@ -38,7 +40,7 @@ def __init__(self, configuration):
self.external_cmake, self.external_bison = self._externals_from_yaml(configuration)
def _valid_compiler_or_raise(self) -> "spack.compiler.Compiler":
def _valid_compiler_or_raise(self):
if str(self.host_platform) == "linux":
compiler_name = "gcc"
elif str(self.host_platform) == "darwin":
@@ -46,17 +48,30 @@ def _valid_compiler_or_raise(self) -> "spack.compiler.Compiler":
elif str(self.host_platform) == "windows":
compiler_name = "msvc"
elif str(self.host_platform) == "freebsd":
compiler_name = "clang"
compiler_name = "llvm"
else:
raise RuntimeError(f"Cannot bootstrap clingo from sources on {self.host_platform}")
candidates = spack.compilers.compilers_for_spec(
compiler_name, arch_spec=self.host_architecture
)
candidates = [
x
for x in spack.compilers.config.CompilerFactory.from_packages_yaml(spack.config.CONFIG)
if x.name == compiler_name
]
if not candidates:
raise RuntimeError(
f"Cannot find any version of {compiler_name} to bootstrap clingo from sources"
)
candidates.sort(key=lambda x: x.spec.version, reverse=True)
candidates.sort(key=lambda x: x.version, reverse=True)
best = candidates[0]
# Get compilers for bootstrapping from the 'builtin' repository
best.namespace = "builtin"
# If the compiler does not support C++ 14, fail with a legible error message
try:
_ = best.package.standard_flag(language="cxx", standard="14")
except RuntimeError as e:
raise RuntimeError(
"cannot find a compiler supporting C++ 14 [needed to bootstrap clingo]"
) from e
return candidates[0]
def _externals_from_yaml(
@@ -75,9 +90,6 @@ def _externals_from_yaml(
if not s.satisfies(requirements[pkg_name]):
continue
if not s.intersects(f"%{self.host_compiler.spec}"):
continue
if not s.intersects(f"arch={self.host_architecture}"):
continue
@@ -110,11 +122,14 @@ def concretize(self) -> "spack.spec.Spec":
# Tweak it to conform to the host architecture
for node in s.traverse():
node.architecture.os = str(self.host_os)
node.compiler = self.host_compiler.spec
node.architecture = self.host_architecture
if node.name == "gcc-runtime":
node.versions = self.host_compiler.spec.versions
node.versions = self.host_compiler.versions
# Can't use re2c@3.1 with Python 3.6
if self.host_python.satisfies("@3.6"):
s["re2c"].versions.versions = [spack.version.from_string("=2.2")]
for edge in spack.traverse.traverse_edges([s], cover="edges"):
if edge.spec.name == "python":
@@ -126,6 +141,9 @@ def concretize(self) -> "spack.spec.Spec":
if edge.spec.name == "cmake" and self.external_cmake:
edge.spec = self.external_cmake
if edge.spec.name == self.host_compiler.name:
edge.spec = self.host_compiler
if "libc" in edge.virtuals:
edge.spec = self.host_libc
@@ -141,12 +159,12 @@ def python_external_spec(self) -> "spack.spec.Spec":
return self._external_spec(result)
def libc_external_spec(self) -> "spack.spec.Spec":
result = self.host_compiler.default_libc
detector = spack.compilers.libraries.CompilerPropertyDetector(self.host_compiler)
result = detector.default_libc()
return self._external_spec(result)
def _external_spec(self, initial_spec) -> "spack.spec.Spec":
initial_spec.namespace = "builtin"
initial_spec.compiler = self.host_compiler.spec
initial_spec.architecture = self.host_architecture
for flag_type in spack.spec.FlagMap.valid_compiler_flags():
initial_spec.compiler_flags[flag_type] = []

View File

@@ -10,7 +10,7 @@
from llnl.util import tty
import spack.compilers
import spack.compilers.config
import spack.config
import spack.environment
import spack.modules
@@ -142,8 +142,8 @@ def _bootstrap_config_scopes() -> Sequence["spack.config.ConfigScope"]:
def _add_compilers_if_missing() -> None:
arch = spack.spec.ArchSpec.default_arch()
if not spack.compilers.compilers_for_arch(arch):
spack.compilers.find_compilers()
if not spack.compilers.config.compilers_for_arch(arch):
spack.compilers.config.find_compilers()
@contextlib.contextmanager

File diff suppressed because one or more lines are too long (7 files)

View File

@@ -36,7 +36,6 @@
import multiprocessing
import os
import re
import stat
import sys
import traceback
import types
@@ -71,7 +70,7 @@
import spack.build_systems.meson
import spack.build_systems.python
import spack.builder
import spack.compilers
import spack.compilers.libraries
import spack.config
import spack.deptypes as dt
import spack.error
@@ -85,7 +84,6 @@
import spack.store
import spack.subprocess_context
import spack.util.executable
import spack.util.libc
from spack import traverse
from spack.context import Context
from spack.error import InstallError, NoHeadersError, NoLibrariesError
@@ -93,6 +91,8 @@
from spack.util.environment import (
SYSTEM_DIR_CASE_ENTRY,
EnvironmentModifications,
ModificationList,
PrependPath,
env_flag,
filter_system_paths,
get_path,
@@ -113,7 +113,7 @@
# set_wrapper_variables and used to pass parameters to
# Spack's compiler wrappers.
#
SPACK_ENV_PATH = "SPACK_ENV_PATH"
SPACK_COMPILER_WRAPPER_PATH = "SPACK_COMPILER_WRAPPER_PATH"
SPACK_MANAGED_DIRS = "SPACK_MANAGED_DIRS"
SPACK_INCLUDE_DIRS = "SPACK_INCLUDE_DIRS"
SPACK_LINK_DIRS = "SPACK_LINK_DIRS"
@@ -390,62 +390,10 @@ def _add_werror_handling(keep_werror, env):
env.set("SPACK_COMPILER_FLAGS_REPLACE", " ".join(["|".join(item) for item in replace_flags]))
def set_compiler_environment_variables(pkg, env):
def set_wrapper_environment_variables_for_flags(pkg, env):
assert pkg.spec.concrete
compiler = pkg.compiler
spec = pkg.spec
# Make sure the executables for this compiler exist
compiler.verify_executables()
# Set compiler variables used by CMake and autotools
assert all(key in compiler.link_paths for key in ("cc", "cxx", "f77", "fc"))
# Populate an object with the list of environment modifications
# and return it
# TODO : add additional kwargs for better diagnostics, like requestor,
# ttyout, ttyerr, etc.
link_dir = spack.paths.build_env_path
# Set SPACK compiler variables so that our wrapper knows what to
# call. If there is no compiler configured then use a default
# wrapper which will emit an error if it is used.
if compiler.cc:
env.set("SPACK_CC", compiler.cc)
env.set("CC", os.path.join(link_dir, compiler.link_paths["cc"]))
else:
env.set("CC", os.path.join(link_dir, "cc"))
if compiler.cxx:
env.set("SPACK_CXX", compiler.cxx)
env.set("CXX", os.path.join(link_dir, compiler.link_paths["cxx"]))
else:
env.set("CC", os.path.join(link_dir, "c++"))
if compiler.f77:
env.set("SPACK_F77", compiler.f77)
env.set("F77", os.path.join(link_dir, compiler.link_paths["f77"]))
else:
env.set("F77", os.path.join(link_dir, "f77"))
if compiler.fc:
env.set("SPACK_FC", compiler.fc)
env.set("FC", os.path.join(link_dir, compiler.link_paths["fc"]))
else:
env.set("FC", os.path.join(link_dir, "fc"))
# Set SPACK compiler rpath flags so that our wrapper knows what to use
env.set("SPACK_CC_RPATH_ARG", compiler.cc_rpath_arg)
env.set("SPACK_CXX_RPATH_ARG", compiler.cxx_rpath_arg)
env.set("SPACK_F77_RPATH_ARG", compiler.f77_rpath_arg)
env.set("SPACK_FC_RPATH_ARG", compiler.fc_rpath_arg)
env.set("SPACK_LINKER_ARG", compiler.linker_arg)
# Check whether we want to force RPATH or RUNPATH
if spack.config.get("config:shared_linking:type") == "rpath":
env.set("SPACK_DTAGS_TO_STRIP", compiler.enable_new_dtags)
env.set("SPACK_DTAGS_TO_ADD", compiler.disable_new_dtags)
else:
env.set("SPACK_DTAGS_TO_STRIP", compiler.disable_new_dtags)
env.set("SPACK_DTAGS_TO_ADD", compiler.enable_new_dtags)
if pkg.keep_werror is not None:
keep_werror = pkg.keep_werror
else:
@@ -453,10 +401,6 @@ def set_compiler_environment_variables(pkg, env):
_add_werror_handling(keep_werror, env)
# Set the target parameters that the compiler will add
isa_arg = optimization_flags(compiler, spec.target)
env.set("SPACK_TARGET_ARGS", isa_arg)
# Trap spack-tracked compiler flags as appropriate.
# env_flags are easy to accidentally override.
inject_flags = {}
@@ -489,75 +433,23 @@ def set_compiler_environment_variables(pkg, env):
# implicit variables
env.set(flag.upper(), " ".join(f for f in env_flags[flag]))
pkg.flags_to_build_system_args(build_system_flags)
env.set("SPACK_COMPILER_SPEC", str(spec.compiler))
env.set("SPACK_SYSTEM_DIRS", SYSTEM_DIR_CASE_ENTRY)
compiler.setup_custom_environment(pkg, env)
return env
def optimization_flags(compiler, target):
if spack.compilers.is_mixed_toolchain(compiler):
msg = (
"microarchitecture specific optimizations are not "
"supported yet on mixed compiler toolchains [check"
f" {compiler.name}@{compiler.version} for further details]"
)
tty.debug(msg)
return ""
# Try to check if the current compiler comes with a version number or
# has an unexpected suffix. If so, treat it as a compiler with a
# custom spec.
compiler_version = compiler.version
version_number, suffix = archspec.cpu.version_components(compiler.version)
if not version_number or suffix:
try:
compiler_version = compiler.real_version
except spack.util.executable.ProcessError as e:
# log this and just return compiler.version instead
tty.debug(str(e))
version_number, _ = archspec.cpu.version_components(compiler.version.dotted_numeric_string)
try:
result = target.optimization_flags(compiler.name, compiler_version.dotted_numeric_string)
result = target.optimization_flags(compiler.name, version_number)
except (ValueError, archspec.cpu.UnsupportedMicroarchitecture):
result = ""
return result
class FilterDefaultDynamicLinkerSearchPaths:
"""Remove rpaths to directories that are default search paths of the dynamic linker."""
def __init__(self, dynamic_linker: Optional[str]) -> None:
# Identify directories by (inode, device) tuple, which handles symlinks too.
self.default_path_identifiers: Set[Tuple[int, int]] = set()
if not dynamic_linker:
return
for path in spack.util.libc.default_search_paths_from_dynamic_linker(dynamic_linker):
try:
s = os.stat(path)
if stat.S_ISDIR(s.st_mode):
self.default_path_identifiers.add((s.st_ino, s.st_dev))
except OSError:
continue
def is_dynamic_loader_default_path(self, p: str) -> bool:
try:
s = os.stat(p)
return (s.st_ino, s.st_dev) in self.default_path_identifiers
except OSError:
return False
def __call__(self, dirs: List[str]) -> List[str]:
if not self.default_path_identifiers:
return dirs
return [p for p in dirs if not self.is_dynamic_loader_default_path(p)]
def set_wrapper_variables(pkg, env):
"""Set environment variables used by the Spack compiler wrapper (which have the prefix
`SPACK_`) and also add the compiler wrappers to PATH.
@@ -566,39 +458,8 @@ def set_wrapper_variables(pkg, env):
this function computes these options in a manner that is intended to match the DAG traversal
order in `SetupContext`. TODO: this is not the case yet, we're using post order, SetupContext
is using topo order."""
# Set environment variables if specified for
# the given compiler
compiler = pkg.compiler
env.extend(spack.schema.environment.parse(compiler.environment))
if compiler.extra_rpaths:
extra_rpaths = ":".join(compiler.extra_rpaths)
env.set("SPACK_COMPILER_EXTRA_RPATHS", extra_rpaths)
# Add spack build environment path with compiler wrappers first in
# the path. We add the compiler wrapper path, which includes default
# wrappers (cc, c++, f77, f90), AND a subdirectory containing
# compiler-specific symlinks. The latter ensures that builds that
# are sensitive to the *name* of the compiler see the right name when
# we're building with the wrappers.
#
# Conflicts on case-insensitive systems (like "CC" and "cc") are
# handled by putting one in the <build_env_path>/case-insensitive
# directory. Add that to the path too.
env_paths = []
compiler_specific = os.path.join(
spack.paths.build_env_path, os.path.dirname(pkg.compiler.link_paths["cc"])
)
for item in [spack.paths.build_env_path, compiler_specific]:
env_paths.append(item)
ci = os.path.join(item, "case-insensitive")
if os.path.isdir(ci):
env_paths.append(ci)
tty.debug("Adding compiler bin/ paths: " + " ".join(env_paths))
for item in env_paths:
env.prepend_path("PATH", item)
env.set_path(SPACK_ENV_PATH, env_paths)
# Set compiler flags injected from the spec
set_wrapper_environment_variables_for_flags(pkg, env)
# Working directory for the spack command itself, for debug logs.
if spack.config.get("config:debug"):
@@ -664,22 +525,15 @@ def set_wrapper_variables(pkg, env):
lib_path = os.path.join(pkg.prefix, libdir)
rpath_dirs.insert(0, lib_path)
filter_default_dynamic_linker_search_paths = FilterDefaultDynamicLinkerSearchPaths(
pkg.compiler.default_dynamic_linker
)
# TODO: filter_system_paths is again wrong (and probably unnecessary due to the is_system_path
# branch above). link_dirs should be filtered with entries from _parse_link_paths.
link_dirs = list(dedupe(filter_system_paths(link_dirs)))
include_dirs = list(dedupe(filter_system_paths(include_dirs)))
rpath_dirs = list(dedupe(filter_system_paths(rpath_dirs)))
rpath_dirs = filter_default_dynamic_linker_search_paths(rpath_dirs)
# TODO: implicit_rpaths is prefiltered by is_system_path, that should be removed in favor of
# just this filter.
implicit_rpaths = filter_default_dynamic_linker_search_paths(pkg.compiler.implicit_rpaths())
if implicit_rpaths:
env.set("SPACK_COMPILER_IMPLICIT_RPATHS", ":".join(implicit_rpaths))
default_dynamic_linker_filter = spack.compilers.libraries.dynamic_linker_filter_for(pkg.spec)
if default_dynamic_linker_filter:
rpath_dirs = default_dynamic_linker_filter(rpath_dirs)
# Spack managed directories include the stage, store and upstream stores. We extend this with
# their real paths to make it more robust (e.g. /tmp vs /private/tmp on macOS).
@@ -731,26 +585,6 @@ def set_package_py_globals(pkg, context: Context = Context.BUILD):
# Don't use which for this; we want to find it in the current dir.
module.configure = Executable("./configure")
# Put spack compiler paths in module scope. (Some packages use it
# in setup_run_environment etc, so don't put it context == build)
link_dir = spack.paths.build_env_path
pkg_compiler = None
try:
pkg_compiler = pkg.compiler
except spack.compilers.NoCompilerForSpecError as e:
tty.debug(f"cannot set 'spack_cc': {str(e)}")
if pkg_compiler is not None:
module.spack_cc = os.path.join(link_dir, pkg_compiler.link_paths["cc"])
module.spack_cxx = os.path.join(link_dir, pkg_compiler.link_paths["cxx"])
module.spack_f77 = os.path.join(link_dir, pkg_compiler.link_paths["f77"])
module.spack_fc = os.path.join(link_dir, pkg_compiler.link_paths["fc"])
else:
module.spack_cc = None
module.spack_cxx = None
module.spack_f77 = None
module.spack_fc = None
# Useful directories within the prefix are encapsulated in
# a Prefix object.
module.prefix = pkg.prefix
@@ -901,7 +735,6 @@ def setup_package(pkg, dirty, context: Context = Context.BUILD):
context == Context.TEST and pkg.test_requires_compiler
)
if need_compiler:
set_compiler_environment_variables(pkg, env_mods)
set_wrapper_variables(pkg, env_mods)
# Platform specific setup goes before package specific setup. This is for setting
@@ -913,6 +746,26 @@ def setup_package(pkg, dirty, context: Context = Context.BUILD):
env_mods.extend(setup_context.get_env_modifications())
tty.debug("setup_package: collected all modifications from dependencies")
tty.debug("setup_package: adding compiler wrappers paths")
env_by_name = env_mods.group_by_name()
for x in env_by_name["SPACK_COMPILER_WRAPPER_PATH"]:
assert isinstance(
x, PrependPath
), "unexpected setting used for SPACK_COMPILER_WRAPPER_PATH"
env_mods.prepend_path("PATH", x.value)
# Check whether we want to force RPATH or RUNPATH
enable_var_name, disable_var_name = "SPACK_ENABLE_NEW_DTAGS", "SPACK_DISABLE_NEW_DTAGS"
if enable_var_name in env_by_name and disable_var_name in env_by_name:
enable_new_dtags = _extract_dtags_arg(env_by_name, var_name=enable_var_name)
disable_new_dtags = _extract_dtags_arg(env_by_name, var_name=disable_var_name)
if spack.config.CONFIG.get("config:shared_linking:type") == "rpath":
env_mods.set("SPACK_DTAGS_TO_STRIP", enable_new_dtags)
env_mods.set("SPACK_DTAGS_TO_ADD", disable_new_dtags)
else:
env_mods.set("SPACK_DTAGS_TO_STRIP", disable_new_dtags)
env_mods.set("SPACK_DTAGS_TO_ADD", enable_new_dtags)
if context == Context.TEST:
env_mods.prepend_path("PATH", ".")
elif context == Context.BUILD and not dirty and not env_mods.is_unset("CPATH"):
@@ -926,11 +779,6 @@ def setup_package(pkg, dirty, context: Context = Context.BUILD):
# Load modules on an already clean environment, just before applying Spack's
# own environment modifications. This ensures Spack controls CC/CXX/... variables.
if need_compiler:
tty.debug("setup_package: loading compiler modules")
for mod in pkg.compiler.modules:
load_module(mod)
load_external_modules(setup_context)
# Make sure nothing's strange about the Spack environment.
@@ -942,6 +790,14 @@ def setup_package(pkg, dirty, context: Context = Context.BUILD):
return env_base
def _extract_dtags_arg(env_by_name: Dict[str, ModificationList], *, var_name: str) -> str:
try:
enable_new_dtags = env_by_name[var_name][0].value # type: ignore[union-attr]
except (KeyError, IndexError, AttributeError):
enable_new_dtags = ""
return enable_new_dtags
class EnvironmentVisitor:
def __init__(self, *roots: spack.spec.Spec, context: Context):
# For the roots (well, marked specs) we follow different edges

View File

@@ -11,6 +11,7 @@
import spack.build_environment
import spack.builder
import spack.compilers.libraries
import spack.error
import spack.package_base
import spack.phase_callbacks
@@ -398,33 +399,44 @@ def _do_patch_libtool(self) -> None:
markers[tag] = "LIBTOOL TAG CONFIG: {0}".format(tag.upper())
# Replace empty linker flag prefixes:
if self.pkg.compiler.name == "nag":
if self.spec.satisfies("%nag"):
# Nag is mixed with gcc and g++, which are recognized correctly.
# Therefore, we change only Fortran values:
nag_pkg = self.spec["fortran"].package
for tag in ["fc", "f77"]:
marker = markers[tag]
x.filter(
regex='^wl=""$',
repl='wl="{0}"'.format(self.pkg.compiler.linker_arg),
start_at="# ### BEGIN {0}".format(marker),
stop_at="# ### END {0}".format(marker),
repl=f'wl="{nag_pkg.linker_arg}"',
start_at=f"# ### BEGIN {marker}",
stop_at=f"# ### END {marker}",
)
else:
x.filter(regex='^wl=""$', repl='wl="{0}"'.format(self.pkg.compiler.linker_arg))
compiler_spec = spack.compilers.libraries.compiler_spec(self.spec)
if compiler_spec:
x.filter(regex='^wl=""$', repl='wl="{0}"'.format(compiler_spec.package.linker_arg))
# Replace empty PIC flag values:
for cc, marker in markers.items():
for compiler, marker in markers.items():
if compiler == "cc":
language = "c"
elif compiler == "cxx":
language = "cxx"
else:
language = "fortran"
if language not in self.spec:
continue
x.filter(
regex='^pic_flag=""$',
repl='pic_flag="{0}"'.format(
getattr(self.pkg.compiler, "{0}_pic_flag".format(cc))
),
start_at="# ### BEGIN {0}".format(marker),
stop_at="# ### END {0}".format(marker),
repl=f'pic_flag="{self.spec[language].package.pic_flag}"',
start_at=f"# ### BEGIN {marker}",
stop_at=f"# ### END {marker}",
)
# Other compiler-specific patches:
if self.pkg.compiler.name == "fj":
if self.spec.satisfies("%fj"):
x.filter(regex="-nostdlib", repl="", string=True)
rehead = r"/\S*/"
for o in [
@@ -437,7 +449,7 @@ def _do_patch_libtool(self) -> None:
r"crtendS\.o",
]:
x.filter(regex=(rehead + o), repl="")
elif self.pkg.compiler.name == "nag":
elif self.spec.satisfies("%nag"):
for tag in ["fc", "f77"]:
marker = markers[tag]
start_at = "# ### BEGIN {0}".format(marker)

View File

@@ -70,12 +70,8 @@ class CachedCMakeBuilder(CMakeBuilder):
@property
def cache_name(self):
return "{0}-{1}-{2}@{3}.cmake".format(
self.pkg.name,
self.pkg.spec.architecture,
self.pkg.spec.compiler.name,
self.pkg.spec.compiler.version,
)
compiler_str = f"{self.spec['c'].name}-{self.spec['c'].version}"
return f"{self.pkg.name}-{self.spec.architecture.platform}-{compiler_str}.cmake"
@property
def cache_path(self):
@@ -118,7 +114,9 @@ def initconfig_compiler_entries(self):
# Fortran compiler is optional
if "FC" in os.environ:
spack_fc_entry = cmake_cache_path("CMAKE_Fortran_COMPILER", os.environ["FC"])
system_fc_entry = cmake_cache_path("CMAKE_Fortran_COMPILER", self.pkg.compiler.fc)
system_fc_entry = cmake_cache_path(
"CMAKE_Fortran_COMPILER", self.spec["fortran"].package.fortran
)
else:
spack_fc_entry = "# No Fortran compiler defined in spec"
system_fc_entry = "# No Fortran compiler defined in spec"
@@ -134,8 +132,8 @@ def initconfig_compiler_entries(self):
" " + cmake_cache_path("CMAKE_CXX_COMPILER", os.environ["CXX"]),
" " + spack_fc_entry,
"else()\n",
" " + cmake_cache_path("CMAKE_C_COMPILER", self.pkg.compiler.cc),
" " + cmake_cache_path("CMAKE_CXX_COMPILER", self.pkg.compiler.cxx),
" " + cmake_cache_path("CMAKE_C_COMPILER", self.spec["c"].package.cc),
" " + cmake_cache_path("CMAKE_CXX_COMPILER", self.spec["cxx"].package.cxx),
" " + system_fc_entry,
"endif()\n",
]

View File

@@ -6,12 +6,13 @@
import pathlib
import re
import sys
from typing import Dict, List, Sequence, Tuple, Union
from typing import Dict, List, Optional, Sequence, Tuple, Union
import llnl.util.tty as tty
from llnl.util.lang import classproperty
from llnl.util.lang import classproperty, memoized
import spack.compiler
import spack
import spack.compilers.error
import spack.package_base
import spack.util.executable
@@ -43,6 +44,9 @@ class CompilerPackage(spack.package_base.PackageBase):
#: Static definition of languages supported by this class
compiler_languages: Sequence[str] = ["c", "cxx", "fortran"]
#: Relative path to compiler wrappers
compiler_wrapper_link_paths: Dict[str, str] = {}
def __init__(self, spec: "spack.spec.Spec"):
super().__init__(spec)
msg = f"Supported languages for {spec} are not a subset of possible supported languages"
@@ -77,14 +81,14 @@ def executables(cls) -> Sequence[str]:
]
@classmethod
def determine_version(cls, exe: Path):
def determine_version(cls, exe: Path) -> str:
version_argument = cls.compiler_version_argument
if isinstance(version_argument, str):
version_argument = (version_argument,)
for va in version_argument:
try:
output = spack.compiler.get_compiler_version_output(exe, va)
output = compiler_output(exe, version_argument=va)
match = re.search(cls.compiler_version_regex, output)
if match:
return ".".join(match.groups())
@@ -95,10 +99,11 @@ def determine_version(cls, exe: Path):
f"[{__file__}] Cannot detect a valid version for the executable "
f"{str(exe)}, for package '{cls.name}': {e}"
)
return ""
@classmethod
def compiler_bindir(cls, prefix: Path) -> Path:
"""Overridable method for the location of the compiler bindir within the preifx"""
"""Overridable method for the location of the compiler bindir within the prefix"""
return os.path.join(prefix, "bin")
@classmethod
@@ -142,3 +147,109 @@ def determine_compiler_paths(cls, exes: Sequence[Path]) -> Dict[str, Path]:
def determine_variants(cls, exes: Sequence[Path], version_str: str) -> Tuple:
# path determination is separated so it can be reused in subclasses
return "", {"compilers": cls.determine_compiler_paths(exes=exes)}
#: Returns the argument needed to set the RPATH, or None if it does not exist
rpath_arg: Optional[str] = "-Wl,-rpath,"
#: Flag that needs to be used to pass an argument to the linker
linker_arg: str = "-Wl,"
#: Flag used to produce Position Independent Code
pic_flag: str = "-fPIC"
#: Flag used to get verbose output
verbose_flags: str = "-v"
#: Flag to activate OpenMP support
openmp_flag: str = "-fopenmp"
implicit_rpath_libs: List[str] = []
def standard_flag(self, *, language: str, standard: str) -> str:
"""Returns the flag used to enforce a given standard for a language"""
if language not in self.supported_languages:
raise spack.compilers.error.UnsupportedCompilerFlag(
f"{self.spec} does not provide the '{language}' language"
)
try:
return self._standard_flag(language=language, standard=standard)
except (KeyError, RuntimeError) as e:
raise spack.compilers.error.UnsupportedCompilerFlag(
f"{self.spec} does not provide the '{language}' standard {standard}"
) from e
def _standard_flag(self, *, language: str, standard: str) -> str:
raise NotImplementedError("Must be implemented by derived classes")
def archspec_name(self) -> str:
"""Name that archspec uses to refer to this compiler"""
return self.spec.name
@property
def cc(self) -> Optional[str]:
assert self.spec.concrete, "cannot retrieve C compiler, spec is not concrete"
if self.spec.external:
return self.spec.extra_attributes["compilers"].get("c", None)
return self._cc_path()
def _cc_path(self) -> Optional[str]:
"""Returns the path to the C compiler, if the package was installed by Spack"""
return None
@property
def cxx(self) -> Optional[str]:
assert self.spec.concrete, "cannot retrieve C++ compiler, spec is not concrete"
if self.spec.external:
return self.spec.extra_attributes["compilers"].get("cxx", None)
return self._cxx_path()
def _cxx_path(self) -> Optional[str]:
"""Returns the path to the C++ compiler, if the package was installed by Spack"""
return None
@property
def fortran(self):
assert self.spec.concrete, "cannot retrieve Fortran compiler, spec is not concrete"
if self.spec.external:
return self.spec.extra_attributes["compilers"].get("fortran", None)
return self._fortran_path()
def _fortran_path(self) -> Optional[str]:
"""Returns the path to the Fortran compiler, if the package was installed by Spack"""
return None
@memoized
def _compiler_output(
compiler_path: Path, *, version_argument: str, ignore_errors: Tuple[int, ...] = ()
) -> str:
"""Returns the output from the compiler invoked with the given version argument.
Args:
compiler_path: path of the compiler to be invoked
version_argument: the argument used to extract version information
ignore_errors: return codes that should not be treated as errors
"""
compiler = spack.util.executable.Executable(compiler_path)
if not version_argument:
return compiler(
output=str, error=str, ignore_errors=ignore_errors, timeout=120, fail_on_error=True
)
return compiler(
version_argument,
output=str,
error=str,
ignore_errors=ignore_errors,
timeout=120,
fail_on_error=True,
)
def compiler_output(
compiler_path: Path, *, version_argument: str, ignore_errors: Tuple[int, ...] = ()
) -> str:
"""Wrapper for _get_compiler_version_output()."""
# This ensures that we memoize compiler output by *absolute path*,
# not just executable name. If we don't do this, and the path changes
# (e.g., during testing), we can get incorrect results.
if not os.path.isabs(compiler_path):
compiler_path = spack.util.executable.which_string(str(compiler_path), required=True)
return _compiler_output(
compiler_path, version_argument=version_argument, ignore_errors=ignore_errors
)
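A small usage sketch of the wrapper above (paths are illustrative, assuming `/usr/bin/gcc` is the first `gcc` on `PATH`): because `_compiler_output` is memoized on its arguments, normalizing to an absolute path first means a bare name and its resolved path share a single cache entry.

```python
out_abs = compiler_output("/usr/bin/gcc", version_argument="--version")
out_rel = compiler_output("gcc", version_argument="--version")  # resolved via which_string
assert out_abs == out_rel  # same absolute path -> one memoized invocation
```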

View File

@@ -76,7 +76,7 @@ def toolchain_version(self):
Override this method to select a specific version of the toolchain or change
selection heuristics.
Default is whatever version of msvc has been selected by concretization"""
return "v" + self.pkg.compiler.platform_toolset_ver
return "v" + self.spec["msvc"].package.platform_toolset_ver
@property
def std_msbuild_args(self):

View File

@@ -278,10 +278,6 @@ def update_external_dependencies(self, extendee_spec=None):
if not python.architecture.target:
python.architecture.target = archspec.cpu.host().family.name
python.external_path = self.spec.external_path
python._mark_concrete()
self.spec.add_dependency_edge(python, depflag=dt.BUILD | dt.LINK | dt.RUN, virtuals=())

View File

@@ -14,7 +14,7 @@
import tempfile
import zipfile
from collections import namedtuple
from typing import Callable, Dict, List, Set, Union
from urllib.request import Request
import llnl.path
@@ -24,7 +24,6 @@
import spack
import spack.binary_distribution as bindist
import spack.config as cfg
import spack.environment as ev
import spack.error
@@ -42,6 +41,7 @@
from spack import traverse
from spack.error import SpackError
from spack.reporters.cdash import SPACK_CDASH_TIMEOUT
from spack.version import GitVersion, StandardVersion
from .common import (
IS_WINDOWS,
@@ -80,6 +80,45 @@ def get_change_revisions():
return None, None
def get_added_versions(
checksums_version_dict: Dict[str, Union[StandardVersion, GitVersion]],
path: str,
from_ref: str = "HEAD~1",
to_ref: str = "HEAD",
) -> List[Union[StandardVersion, GitVersion]]:
"""Get a list of the versions added between `from_ref` and `to_ref`.
Args:
checksums_version_dict (Dict): all package versions keyed by known checksums.
path (str): path to the package.py
from_ref (str): oldest git ref, defaults to `HEAD~1`
to_ref (str): newer git ref, defaults to `HEAD`
Returns: list of versions added between refs
"""
git_exe = spack.util.git.git(required=True)
# Gather git diff
diff_lines = git_exe("diff", from_ref, to_ref, "--", path, output=str).split("\n")
# Store added and removed versions
# Removed versions are tracked here to determine when versions are moved in a file
# and show up as both added and removed in a git diff.
added_checksums = set()
removed_checksums = set()
# Scrape diff for modified versions and prune added versions if they show up
# as also removed (which means they've actually just moved in the file and
# we shouldn't need to rechecksum them)
for checksum in checksums_version_dict.keys():
for line in diff_lines:
if checksum in line:
if line.startswith("+"):
added_checksums.add(checksum)
if line.startswith("-"):
removed_checksums.add(checksum)
return [checksums_version_dict[c] for c in added_checksums - removed_checksums]
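A hypothetical call to the helper above; the checksums and the package path are placeholders:

```python
from spack.version import StandardVersion

# Placeholder sha256 -> version map, as built from a package's `versions` dict.
checksums = {
    "aa" * 32: StandardVersion.from_string("1.2.3"),
    "bb" * 32: StandardVersion.from_string("1.2.4"),
}
new_versions = get_added_versions(
    checksums,
    "var/spack/repos/builtin/packages/zlib-ng/package.py",
    from_ref="HEAD~1",
    to_ref="HEAD",
)
# Only versions whose checksum line was added (and not merely moved) are returned.
```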
def get_stack_changed(env_path, rev1="HEAD^", rev2="HEAD"):
"""Given an environment manifest path and two revisions to compare, return
whether or not the stack was changed. Returns True if the environment
@@ -381,10 +420,9 @@ def generate_pipeline(env: ev.Environment, args) -> None:
args: (spack.main.SpackArgumentParser): Parsed arguments from the command
line.
"""
with env.write_transaction():
env.concretize()
env.write()
options = collect_pipeline_options(env, args)

View File

@@ -209,10 +209,8 @@ def build_name(self, spec: Optional[spack.spec.Spec] = None) -> Optional[str]:
Returns: (str) given spec's CDash build name."""
if spec:
spec_str = spec.format("{name}{@version}{%compiler} hash={hash} arch={architecture}")
build_name = f"{spec_str} ({self.build_group})"
tty.debug(f"Generated CDash build name ({build_name}) from the {spec.name}")
return build_name
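For reference, a sketch of how the format string above renders; the output line is illustrative, and how `{%compiler}` renders depends on the new dependency-based model:

```python
import spack.spec

spec = spack.spec.Spec("zlib-ng").concretized()
print(spec.format("{name}{@version}{%compiler} hash={hash} arch={architecture}"))
# e.g.: zlib-ng@2.2.1%gcc@12.3.0 hash=abc123... arch=linux-ubuntu22.04-zen2
```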

View File

@@ -375,8 +375,13 @@ def iter_groups(specs, indent, all_headers):
index = index_by(specs, ("architecture", "compiler"))
ispace = indent * " "
def _key(item):
if item is None:
return ""
return str(item)
# Traverse the index and print out each package
for i, (architecture, compiler) in enumerate(sorted(index, key=_key)):
if i > 0:
print()
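The `_key` helper exists because the index may now contain `None` where a spec has no compiler entry, and `None` does not order against tuples in Python 3; a quick illustration:

```python
items = [("linux", "gcc"), None, ("linux", "clang")]
# sorted(items) would raise:
#   TypeError: '<' not supported between instances of 'NoneType' and 'tuple'
print(sorted(items, key=lambda i: "" if i is None else str(i)))
# [None, ('linux', 'clang'), ('linux', 'gcc')]
```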
@@ -448,7 +453,6 @@ def get_arg(name, default=None):
hashes = get_arg("long", False)
namespaces = get_arg("namespaces", False)
flags = get_arg("show_flags", False)
full_compiler = get_arg("show_full_compiler", False)
variants = get_arg("variants", False)
groups = get_arg("groups", True)
all_headers = get_arg("all_headers", False)
@@ -470,10 +474,8 @@ def get_arg(name, default=None):
if format_string is None:
nfmt = "{fullname}" if namespaces else "{name}"
ffmt = ""
if flags:
ffmt += " {compiler_flags}"
vfmt = "{variants}" if variants else ""
format_string = nfmt + "{@version}" + vfmt + ffmt

View File

@@ -4,12 +4,15 @@
import json
import os
import re
import shutil
import sys
from typing import Dict
from urllib.parse import urlparse, urlunparse
import llnl.util.filesystem as fs
from llnl.util import tty
import spack.binary_distribution as bindist
import spack.ci as spack_ci
@@ -18,12 +21,22 @@
import spack.cmd.common.arguments
import spack.config as cfg
import spack.environment as ev
import spack.error
import spack.fetch_strategy
import spack.hash_types as ht
import spack.mirrors.mirror
import spack.package_base
import spack.paths
import spack.repo
import spack.spec
import spack.stage
import spack.util.executable
import spack.util.git
import spack.util.gpg as gpg_util
import spack.util.timer as timer
import spack.util.url as url_util
import spack.util.web as web_util
import spack.version
description = "manage continuous integration pipelines"
section = "build"
@@ -32,6 +45,7 @@
SPACK_COMMAND = "spack"
INSTALL_FAIL_CODE = 1
FAILED_CREATE_BUILDCACHE_CODE = 100
BUILTIN = re.compile(r"var\/spack\/repos\/builtin\/packages\/([^\/]+)\/package\.py")
def deindent(desc):
@@ -191,6 +205,16 @@ def setup_parser(subparser):
reproduce.set_defaults(func=ci_reproduce)
# Verify checksums inside of ci workflows
verify_versions = subparsers.add_parser(
"verify-versions",
description=deindent(ci_verify_versions.__doc__),
help=spack.cmd.first_line(ci_verify_versions.__doc__),
)
verify_versions.add_argument("from_ref", help="git ref from which start looking at changes")
verify_versions.add_argument("to_ref", help="git ref to end looking at changes")
verify_versions.set_defaults(func=ci_verify_versions)
def ci_generate(args):
"""generate jobs file from a CI-aware spack file
@@ -659,6 +683,159 @@ def _gitlab_artifacts_url(url: str) -> str:
return urlunparse(parsed._replace(path="/".join(parts), fragment="", query=""))
def validate_standard_versions(
pkg: spack.package_base.PackageBase, versions: spack.version.VersionList
) -> bool:
"""Get and test the checksum of a package version based on a tarball.
Args:
pkg (spack.package_base.PackageBase): Spack package for which to validate version checksums
versions (spack.version.VersionList): list of package versions to validate
Returns: bool: True if all checksums are valid, False otherwise.
"""
url_dict: Dict[spack.version.StandardVersion, str] = {}
for version in versions:
url = pkg.find_valid_url_for_version(version)
url_dict[version] = url
version_hashes = spack.stage.get_checksums_for_versions(
url_dict, pkg.name, fetch_options=pkg.fetch_options
)
valid_checksums = True
for version, sha in version_hashes.items():
if sha != pkg.versions[version]["sha256"]:
tty.error(
f"Invalid checksum found {pkg.name}@{version}\n"
f" [package.py] {pkg.versions[version]['sha256']}\n"
f" [Downloaded] {sha}"
)
valid_checksums = False
continue
tty.info(f"Validated {pkg.name}@{version} --> {sha}")
return valid_checksums
def validate_git_versions(
pkg: spack.package_base.PackageBase, versions: spack.version.VersionList
) -> bool:
"""Get and test the commit and tag of a package version based on a git repository.
Args:
pkg (spack.package_base.PackageBase): Spack package for which to validate a version
versions (spack.version.VersionList): list of package versions to validate
Returns: bool: True if all commits and tags are valid, False otherwise.
"""
valid_commit = True
for version in versions:
fetcher = spack.fetch_strategy.for_package_version(pkg, version)
with spack.stage.Stage(fetcher) as stage:
known_commit = pkg.versions[version]["commit"]
try:
stage.fetch()
except spack.error.FetchError:
tty.error(
f"Invalid commit for {pkg.name}@{version}\n"
f" {known_commit} could not be checked out in the git repository."
)
valid_commit = False
continue
# Test if the specified tag matches the commit in the package.py
# We retrieve the commit associated with a tag and compare it to the
# commit that is located in the package.py file.
if "tag" in pkg.versions[version]:
tag = pkg.versions[version]["tag"]
try:
with fs.working_dir(stage.source_path):
found_commit = fetcher.git(
"rev-list", "-n", "1", tag, output=str, error=str
).strip()
except spack.util.executable.ProcessError:
tty.error(
f"Invalid tag for {pkg.name}@{version}\n"
f" {tag} could not be found in the git repository."
)
valid_commit = False
continue
if found_commit != known_commit:
tty.error(
f"Mismatched tag <-> commit found for {pkg.name}@{version}\n"
f" [package.py] {known_commit}\n"
f" [Downloaded] {found_commit}"
)
valid_commit = False
continue
# If we have downloaded the repository, found the commit, and compared
# the tag (if specified) we can conclude that the version is pointing
# at what we would expect.
tty.info(f"Validated {pkg.name}@{version} --> {known_commit}")
return valid_commit
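The two validators above mirror the two ways a package can pin a version. Schematically, in a package.py (values are placeholders):

```python
# Checked by validate_standard_versions: a tarball checksum.
version("1.2.4", sha256="<64-hex-digit sha256 of the release tarball>")

# Checked by validate_git_versions: a commit, optionally cross-checked with a tag.
version("1.2.5", tag="v1.2.5", commit="<40-hex commit sha>")
```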
def ci_verify_versions(args):
"""validate version checksum & commits between git refs
This command takes from_ref and to_ref arguments, parses the git diff between
the two to determine which packages have been modified, and verifies the new
checksums inside of them.
"""
with fs.working_dir(spack.paths.prefix):
# We use HEAD^1 explicitly on the merge commit created by
# GitHub Actions. However HEAD~1 is a safer default for the helper function.
files = spack.util.git.get_modified_files(from_ref=args.from_ref, to_ref=args.to_ref)
# Get a list of package names from the modified files.
pkgs = [(m.group(1), p) for p in files for m in [BUILTIN.search(p)] if m]
failed_version = False
for pkg_name, path in pkgs:
spec = spack.spec.Spec(pkg_name)
pkg = spack.repo.PATH.get_pkg_class(spec.name)(spec)
# Skip checking manual download packages and trust the maintainers
if pkg.manual_download:
tty.warn(f"Skipping manual download package: {pkg_name}")
continue
# Store versions checksums / commits for future loop
checksums_version_dict = {}
commits_version_dict = {}
for version in pkg.versions:
# If the package version defines a sha256 we'll use that as the high entropy
# string to detect which versions have been added between from_ref and to_ref
if "sha256" in pkg.versions[version]:
checksums_version_dict[pkg.versions[version]["sha256"]] = version
# If a package version instead defines a commit we'll use that as a
# high entropy string to detect new versions.
elif "commit" in pkg.versions[version]:
commits_version_dict[pkg.versions[version]["commit"]] = version
# TODO: enforce that every version has a commit or a sha256 defined if it is not
# an infinite version (there are a lot of packages where this doesn't work yet.)
with fs.working_dir(spack.paths.prefix):
added_checksums = spack_ci.get_added_versions(
checksums_version_dict, path, from_ref=args.from_ref, to_ref=args.to_ref
)
added_commits = spack_ci.get_added_versions(
commits_version_dict, path, from_ref=args.from_ref, to_ref=args.to_ref
)
if added_checksums:
failed_version = not validate_standard_versions(pkg, added_checksums) or failed_version
if added_commits:
failed_version = not validate_git_versions(pkg, added_commits) or failed_version
if failed_version:
sys.exit(1)
def ci(parser, args):
if args.func:
return args.func(args)

View File

@@ -4,13 +4,14 @@
import argparse
import sys
import warnings
import llnl.util.tty as tty
from llnl.util.lang import index_by
from llnl.util.tty.colify import colify
from llnl.util.tty.color import colorize
import spack.compilers.config
import spack.config
import spack.spec
from spack.cmd.common import arguments
@@ -33,20 +34,20 @@ def setup_parser(subparser):
mixed_toolchain_group.add_argument(
"--mixed-toolchain",
action="store_true",
default=sys.platform == "darwin",
help="Allow mixed toolchains (for example: clang, clang++, gfortran)",
default=False,
help="(DEPRECATED) Allow mixed toolchains (for example: clang, clang++, gfortran)",
)
mixed_toolchain_group.add_argument(
"--no-mixed-toolchain",
action="store_false",
dest="mixed_toolchain",
help="Do not allow mixed toolchains (for example: clang, clang++, gfortran)",
help="(DEPRECATED) Do not allow mixed toolchains (for example: clang, clang++, gfortran)",
)
find_parser.add_argument("add_paths", nargs=argparse.REMAINDER)
find_parser.add_argument(
"--scope",
action=arguments.ConfigScope,
default=lambda: spack.config.default_modify_scope("compilers"),
default=lambda: spack.config.default_modify_scope("packages"),
help="configuration scope to modify",
)
arguments.add_common_arguments(find_parser, ["jobs"])
@@ -79,77 +80,97 @@ def compiler_find(args):
"""Search either $PATH or a list of paths OR MODULES for compilers and
add them to Spack's configuration.
"""
if args.mixed_toolchain:
warnings.warn(
"The '--mixed-toolchain' option has been deprecated in Spack v0.23, and currently "
"has no effect. The option will be removed in Spack v1.1"
)
paths = args.add_paths or None
new_compilers = spack.compilers.config.find_compilers(
path_hints=paths, scope=args.scope, max_workers=args.jobs
)
if new_compilers:
n = len(new_compilers)
s = "s" if n > 1 else ""
filename = spack.config.CONFIG.get_config_filename(args.scope, "compilers")
filename = spack.config.CONFIG.get_config_filename(args.scope, "packages")
tty.msg(f"Added {n:d} new compiler{s} to {filename}")
compiler_strs = sorted(f"{c.spec.name}@{c.spec.version}" for c in new_compilers)
compiler_strs = sorted(f"{spec.name}@{spec.versions}" for spec in new_compilers)
colify(reversed(compiler_strs), indent=4)
else:
tty.msg("Found no new compilers")
tty.msg("Compilers are defined in the following files:")
colify(spack.compilers.config.compiler_config_files(), indent=4)
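With compilers stored as external packages, `spack compiler find` now writes to `packages.yaml` rather than `compilers.yaml`. Roughly, an entry has the following shape (shown here as a Python dict; paths and versions are placeholders, not real command output):

```python
entry = {
    "gcc": {
        "externals": [
            {
                "spec": "gcc@12.3.0 languages='c,c++,fortran'",
                "prefix": "/usr",
                "extra_attributes": {
                    "compilers": {
                        "c": "/usr/bin/gcc",
                        "cxx": "/usr/bin/g++",
                        "fortran": "/usr/bin/gfortran",
                    }
                },
            }
        ]
    }
}
```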
def compiler_remove(args):
remover = spack.compilers.config.CompilerRemover(spack.config.CONFIG)
candidates = remover.mark_compilers(match=args.compiler_spec, scope=args.scope)
if not candidates:
tty.die(f"No compiler matches '{args.compiler_spec}'")
compiler_strs = reversed(sorted(f"{spec.name}@{spec.versions}" for spec in candidates))
if not args.all and len(candidates) > 1:
tty.error(f"multiple compilers match the spec '{args.compiler_spec}':")
print()
colify(compiler_strs, indent=4)
print()
print(
"Either use a stricter spec to select only one, or use `spack compiler remove -a`"
" to remove all of them."
)
sys.exit(1)
remover.flush()
tty.msg("The following compilers have been removed:")
print()
colify(compiler_strs, indent=4)
print()
def compiler_info(args):
"""Print info about all compilers matching a spec."""
query = spack.spec.Spec(args.compiler_spec)
all_compilers = spack.compilers.config.all_compilers(scope=args.scope, init_config=False)
compilers = [x for x in all_compilers if x.satisfies(query)]
if not compilers:
tty.die("No compilers match spec %s" % cspec)
tty.die(f"No compilers match spec {query.cformat()}")
else:
for c in compilers:
print(c.spec.display_str + ":")
print("\tpaths:")
for cpath in ["cc", "cxx", "f77", "fc"]:
print("\t\t%s = %s" % (cpath, getattr(c, cpath, None)))
if c.flags:
print("\tflags:")
for flag, flag_value in c.flags.items():
print("\t\t%s = %s" % (flag, flag_value))
if len(c.environment) != 0:
if len(c.environment.get("set", {})) != 0:
print(f"{c.cformat()}:")
print(f" prefix: {c.external_path}")
extra_attributes = getattr(c, "extra_attributes", {})
if "compilers" in extra_attributes:
print(" compilers:")
for language, exe in extra_attributes.get("compilers", {}).items():
print(f" {language}: {exe}")
if "flags" in extra_attributes:
print(" flags:")
for flag, flag_value in extra_attributes["flags"].items():
print(f" {flag} = {flag_value}")
if "environment" in extra_attributes:
environment = extra_attributes["environment"]
if len(environment.get("set", {})) != 0:
print("\tenvironment:")
print("\t set:")
for key, value in c.environment["set"].items():
print("\t %s = %s" % (key, value))
if c.extra_rpaths:
print("\tExtra rpaths:")
for extra_rpath in c.extra_rpaths:
print("\t\t%s" % extra_rpath)
print("\tmodules = %s" % c.modules)
print("\toperating system = %s" % c.operating_system)
for key, value in environment["set"].items():
print(f"\t {key} = {value}")
if "extra_rpaths" in extra_attributes:
print(" extra rpaths:")
for extra_rpath in extra_attributes["extra_rpaths"]:
print(f" {extra_rpath}")
if getattr(c, "external_modules", []):
print(" modules: ")
for module in c.external_modules:
print(f" {module}")
print()
def compiler_list(args):
compilers = spack.compilers.config.all_compilers(scope=args.scope, init_config=False)
# If there are no compilers in any scope, and we're outputting to a tty, give a
# hint to the user.
@@ -162,7 +183,7 @@ def compiler_list(args):
tty.msg(msg)
return
index = index_by(compilers, spack.compilers.config.name_os_target)
tty.msg("Available compilers")
@@ -181,10 +202,10 @@ def compiler_list(args):
name, os, target = key
os_str = os
if target:
os_str += "-%s" % target
cname = "%s{%s} %s" % (spack.spec.COMPILER_COLOR, name, os_str)
os_str += f"-{target}"
cname = f"{spack.spec.COMPILER_COLOR}{{{name}}} {os_str}"
tty.hline(colorize(cname), char="-")
colify(reversed(sorted(c.format("{name}@{version}") for c in compilers)))
def compiler(parser, args):

View File

@@ -521,8 +521,6 @@ def config_prefer_upstream(args):
for spec in pref_specs:
# Collect all the upstream compilers and versions for this package.
pkg = pkgs.get(spec.name, {"version": []})
all = pkgs.get("all", {"compiler": []})
pkgs["all"] = all
pkgs[spec.name] = pkg
# We have no existing variant if this is our first added version.
@@ -532,10 +530,6 @@ def config_prefer_upstream(args):
if version not in pkg["version"]:
pkg["version"].append(version)
# Get and list all the variants that differ from the default.
variants = []
for var_name, variant in spec.variants.items():

View File

@@ -3,11 +3,13 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import shutil
from typing import Optional
import llnl.util.tty as tty
import spack.cmd
import spack.config
import spack.environment
import spack.fetch_strategy
import spack.repo
import spack.spec
@@ -31,37 +33,33 @@ def setup_parser(subparser):
"--no-clone",
action="store_false",
dest="clone",
help="do not clone, the package already exists at the source path",
)
clone_group.add_argument(
"--clone",
action="store_true",
dest="clone",
default=True,
help=(
"(default) clone the package unless the path already exists, "
"use --force to overwrite"
),
)
subparser.add_argument(
"-f", "--force", help="remove any files or directories that block cloning source code"
)
subparser.add_argument(
"-r",
"--recursive",
action="store_true",
help="traverse nodes of the graph to mark everything up to the root as a develop spec",
)
arguments.add_common_arguments(subparser, ["spec"])
def _retrieve_develop_source(spec: spack.spec.Spec, abspath: str) -> None:
# "steal" the source code via staging API. We ask for a stage
# to be created, then copy it afterwards somewhere else. It would be
@@ -83,44 +81,43 @@ def _retrieve_develop_source(spec: spack.spec.Spec, abspath: str) -> None:
package.stage.steal_source(abspath)
def assure_concrete_spec(env: spack.environment.Environment, spec: spack.spec.Spec):
version = spec.versions.concrete_range_as_version
if not version:
# first check environment for a matching concrete spec
matching_specs = env.all_matching_specs(spec)
if matching_specs:
version = matching_specs[0].version
test_spec = spack.spec.Spec(f"{spec}@{version}")
for m_spec in matching_specs:
if not m_spec.satisfies(test_spec):
raise SpackError(
f"{spec.name}: has multiple concrete instances in the graph that can't be"
" satisified by a single develop spec. To use `spack develop` ensure one"
" of the following:"
f"\n a) {spec.name} nodes can satisfy the same develop spec (minimally "
"this means they all share the same version)"
f"\n b) Provide a concrete develop spec ({spec.name}@[version]) to clearly"
" indicate what should be developed"
)
else:
# look up the maximum version so infinity versions are preferred for develop
version = max(spec.package_class.versions.keys())
tty.msg(f"Defaulting to highest version: {spec.name}@{version}")
spec.versions = spack.version.VersionList([version])
def setup_src_code(spec: spack.spec.Spec, src_path: str, clone: bool = True, force: bool = False):
"""
Handle checking, cloning or overwriting source code
"""
assert spec.versions
if clone:
_clone(spec, src_path, force)
if not clone and not os.path.exists(src_path):
raise SpackError(f"Provided path {src_path} does not exist")
@@ -129,40 +126,114 @@ def develop(parser, args):
def _update_config(spec, path):
find_fn = lambda section: spec.name in section
entry = {"spec": str(spec)}
if path and path != spec.name:
entry["path"] = path
def change_fn(section):
section[spec.name] = entry
spack.config.change_or_add("develop", find_fn, change_fn)
def update_env(
env: spack.environment.Environment,
spec: spack.spec.Spec,
specified_path: Optional[str] = None,
build_dir: Optional[str] = None,
):
"""
Update the spack.yaml file with additions or changes from a develop call
"""
tty.debug(f"Updating develop config for {env.name} transactionally")
if not specified_path:
dev_entry = env.dev_specs.get(spec.name)
if dev_entry:
specified_path = dev_entry.get("path", None)
tty.debug("Updating develop config for {0} transactionally".format(env.name))
with env.write_transaction():
if build_dir is not None:
spack.config.add(
"packages:{}:package_attributes:build_directory:{}".format(
spec.name, args.build_directory
),
f"packages:{spec.name}:package_attributes:build_directory:{build_dir}",
env.scope_name,
)
# add develop spec and update path
_update_config(spec, specified_path)
def _clone(spec: spack.spec.Spec, abspath: str, force: bool = False):
if os.path.exists(abspath):
if force:
shutil.rmtree(abspath)
else:
msg = f"Skipping developer download of {spec.name}"
msg += f" because its path {abspath} already exists."
tty.msg(msg)
return
# cloning can take a while and it's nice to get a message for the longer clones
tty.msg(f"Cloning source code for {spec}")
_retrieve_develop_source(spec, abspath)
def _abs_code_path(
env: spack.environment.Environment, spec: spack.spec.Spec, path: Optional[str] = None
):
src_path = path if path else spec.name
return spack.util.path.canonicalize_path(src_path, default_wd=env.path)
def _dev_spec_generator(args, env):
"""
Generator function to loop over all the develop specs based on how the command is called.
If no specs are supplied, loop over the develop specs listed in the environment.
"""
if not args.spec:
if args.clone is False:
raise SpackError("No spec provided to spack develop command")
for name, entry in env.dev_specs.items():
path = entry.get("path", name)
abspath = spack.util.path.canonicalize_path(path, default_wd=env.path)
# Both old syntax `spack develop pkg@x` and new syntax `spack develop pkg@=x`
# are currently supported.
spec = spack.spec.parse_with_version_concrete(entry["spec"])
yield spec, abspath
else:
specs = spack.cmd.parse_specs(args.spec)
if (args.path or args.build_directory) and len(specs) > 1:
raise SpackError(
"spack develop requires at most one named spec when using the --path or"
" --build-directory arguments"
)
for spec in specs:
if args.recursive:
concrete_specs = env.all_matching_specs(spec)
if not concrete_specs:
tty.warn(
f"{spec.name} has no matching concrete specs in the environment and "
"will be skipped. `spack develop --recursive` requires a concretized"
" environment"
)
else:
for s in concrete_specs:
for node_spec in s.traverse(direction="parents", root=True):
tty.debug(f"Recursive develop for {node_spec.name}")
yield node_spec, _abs_code_path(env, node_spec, args.path)
else:
yield spec, _abs_code_path(env, spec, args.path)
def develop(parser, args):
env = spack.cmd.require_active_env(cmd_name="develop")
for spec, abspath in _dev_spec_generator(args, env):
assure_concrete_spec(env, spec)
setup_src_code(spec, abspath, clone=args.clone, force=args.force)
update_env(env, spec, args.path, args.build_directory)
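Putting the refactored pieces together, the per-spec flow of the new `develop` is roughly the following (a sketch, not a supported API; the spec name is arbitrary):

```python
import spack.cmd
import spack.spec

env = spack.cmd.require_active_env(cmd_name="develop")
spec = spack.spec.Spec("zlib-ng@2.2.1")

assure_concrete_spec(env, spec)            # pin exactly one version to develop
abspath = _abs_code_path(env, spec)        # default: <env dir>/<spec name>
setup_src_code(spec, abspath, clone=True)  # clone unless the path already exists
update_env(env, spec)                      # record the dev entry in spack.yaml
```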

View File

@@ -98,7 +98,7 @@ def setup_parser(subparser):
"--show-full-compiler",
action="store_true",
dest="show_full_compiler",
help="show full compiler specs",
help="(DEPRECATED) show full compiler specs. Currently it's a no-op",
)
implicit_explicit = subparser.add_mutually_exclusive_group()
implicit_explicit.add_argument(
@@ -278,7 +278,6 @@ def root_decorator(spec, string):
# these enforce details in the root specs to show what the user asked for
namespaces=True,
show_flags=True,
decorator=root_decorator,
variants=True,
)
@@ -301,7 +300,6 @@ def root_decorator(spec, string):
decorator=lambda s, f: color.colorize("@*{%s}" % f),
namespace=True,
show_flags=True,
variants=True,
)
print()

View File

@@ -5,6 +5,7 @@
import sys
import textwrap
from itertools import zip_longest
from typing import Callable, Dict, TypeVar
import llnl.util.tty as tty
import llnl.util.tty.color as color
@@ -14,11 +15,12 @@
import spack.deptypes as dt
import spack.fetch_strategy as fs
import spack.install_test
import spack.package_base
import spack.repo
import spack.spec
import spack.variant
from spack.cmd.common import arguments
from spack.util.typing import SupportsRichComparison
description = "get detailed information on a particular package"
section = "basic"
@@ -28,6 +30,44 @@
plain_format = "@."
class Formatter:
"""Generic formatter for elements displayed by `spack info`.
Elements have four parts: name, values, when condition, and description. They can
be formatted two ways (shown here for variants)::
Grouped by when (default)::
when +cuda
cuda_arch [none] none, 10, 100, 100a, 101,
101a, 11, 12, 120, 120a, 13
CUDA architecture
Or, by name (each name has a when nested under it)::
cuda_arch [none] none, 10, 100, 100a, 101,
101a, 11, 12, 120, 120a, 13
when +cuda
CUDA architecture
The values and description will be wrapped if needed. The name (and any additional info)
will not be (so it should be kept short).
Subclasses are responsible for generating colorized text, but not wrapping,
indentation, or other formatting, for the name, values, and description.
"""
def format_name(self, element) -> str:
return ""
def format_values(self, element) -> str:
return ""
def format_description(self, element) -> str:
return ""
def padder(str_list, extra=0):
"""Return a function to pad elements of a list."""
length = max(len(str(s)) for s in str_list) + extra
@@ -140,17 +180,19 @@ def lines(self):
yield " " + self.fmt % t
class DependencyFormatter(Formatter):
def format_name(self, dep) -> str:
return str(dep.spec)
def format_values(self, dep) -> str:
return str(dt.flag_to_tuple(dep.depflag))
def print_dependencies(pkg, args):
"""output build, link, and run package dependencies"""
for deptype in ("build", "link", "run"):
color.cprint("")
color.cprint(section_title("%s Dependencies:" % deptype.capitalize()))
deps = sorted(pkg.dependencies_of_type(dt.flag_from_string(deptype)))
if deps:
colify(deps, indent=4)
else:
color.cprint(" None")
print_fn = print_by_name if args.variants_by_name else print_grouped_by_when
print_fn("Dependencies", pkg.dependencies, DependencyFormatter())
def print_detectable(pkg, args):
@@ -263,66 +305,70 @@ def print_tests(pkg, args):
color.cprint(" None")
def _fmt_when(when: "spack.spec.Spec", indent: int):
return color.colorize(f"{indent * ' '}@B{{when}} {color.cescape(str(when))}")
def _fmt_variant_value(v):
return str(v).lower() if v is None or isinstance(v, bool) else str(v)
class VariantFormatter(Formatter):
def format_name(self, variant) -> str:
return color.colorize(
f"@c{{{variant.name}}} @C{{[{_fmt_variant_value(variant.default)}]}}"
)
for line in variant.description.split("\n")
)
def format_values(self, variant) -> str:
values = variant.values
if not isinstance(variant.values, (tuple, list, spack.variant.DisjointSetsOfValues)):
values = [variant.values]
# put 'none' first, sort the rest by value
sorted_values = sorted(values, key=lambda v: (v != "none", v))
return color.colorize(f"@c{{{', '.join(_fmt_variant_value(v) for v in sorted_values)}}}")
def format_description(self, variant) -> str:
return variant.description
def _fmt_definition(
name_field, values_field, description, max_name_len, indent, when=None, out=None
):
"""Format a definition entry in `spack info` output.
Arguments:
name_field: name and optional info, e.g. a default; should be short.
values_field: possible values for the entry; wrapped if long.
description: description of the field (wrapped if overly long)
max_name_len: width of the widest name field, used to align values
indent: size of leading indent for entry
when: optional when condition
out: stream to print to
"""
out = out or sys.stdout
_, cols = tty.terminal_size()
name_len = color.clen(name_field)
pad = 4 # min padding between name and values
value_indent = (indent + max_name_len + pad) * " " # left edge of values
if values_field:
formatted_values = "\n".join(
textwrap.wrap(
values_field,
width=cols - 2,
initial_indent=value_indent,
subsequent_indent=value_indent,
)
)
# trim initial indentation
formatted_values = formatted_values[indent + name_len + pad :]
# name [default] value1, value2, value3, ...
out.write(f"{indent * ' '}{name_field}{pad * ' '}{formatted_values}\n")
# when <spec>
description_indent = indent + 4
@@ -330,38 +376,65 @@ def _fmt_variant(variant, max_name_default_len, indent, when=None, out=None):
out.write(_fmt_when(when, description_indent - 2))
out.write("\n")
# description, preserving explicit line breaks from the way it's written in the
# package file, but still wrapping long lines for small terminals. This allows
# packages to provide detailed help in their descriptions (see, e.g., gasnet's variants).
if description:
formatted_description = "\n".join(
textwrap.fill(
line,
width=cols - 2,
initial_indent=description_indent * " ",
subsequent_indent=description_indent * " ",
)
for line in description.split("\n")
)
out.write(formatted_description)
out.write("\n")
K = TypeVar("K", bound=SupportsRichComparison)
V = TypeVar("V")
def print_header(header: str, when_indexed_dictionary: Dict, formatter: Formatter):
color.cprint("")
color.cprint(section_title("Variants:"))
color.cprint(section_title(f"{header}:"))
# Calculate the max length of the "name [default]" part of the variant display
# This lets us know where to print variant values.
max_name_default_len = max(
color.clen(_fmt_name_and_default(variant))
for name in pkg.variant_names()
for _, variant in pkg.variant_definitions(name)
if not when_indexed_dictionary:
print(" None")
def max_name_length(when_indexed_dictionary: Dict, formatter: Formatter) -> int:
# Calculate the max length of the first field of the definition. Lets us know how
# much to pad other fields on the first line.
return max(
color.clen(formatter.format_name(definition))
for subkey in spack.package_base._subkeys(when_indexed_dictionary)
for _, definition in spack.package_base._definitions(when_indexed_dictionary, subkey)
)
def print_grouped_by_when(header: str, when_indexed_dictionary: Dict, formatter: Formatter):
"""Generic method to print metadata grouped by when conditions."""
print_header(header, when_indexed_dictionary, formatter)
if not when_indexed_dictionary:
return
max_name_len = max_name_length(when_indexed_dictionary, formatter)
indent = 4
for when, by_name in when_indexed_dictionary.items():
padded_values = max_name_len + 4
start_indent = indent
if when != spack.spec.Spec():
@@ -373,27 +446,46 @@ def print_variants_grouped_by_when(pkg):
padded_values -= 2
start_indent += 2
for subkey, definition in sorted(by_name.items()):
_fmt_definition(
formatter.format_name(definition),
formatter.format_values(definition),
formatter.format_description(definition),
max_name_len,
start_indent,
when=None,
out=sys.stdout,
)
def print_by_name(header: str, when_indexed_dictionary: Dict, formatter: Formatter):
print_header(header, when_indexed_dictionary, formatter)
if not when_indexed_dictionary:
return
max_name_len = max_name_length(when_indexed_dictionary, formatter)
max_name_len += 4
indent = 4
for subkey in spack.package_base._subkeys(when_indexed_dictionary):
for when, definition in spack.package_base._definitions(when_indexed_dictionary, subkey):
_fmt_definition(
formatter.format_name(definition),
formatter.format_values(definition),
formatter.format_description(definition),
max_name_len,
indent,
when=when,
out=sys.stdout,
)
sys.stdout.write("\n")
def print_variants(pkg, args):
"""output variants"""
print_fn = print_by_name if args.variants_by_name else print_grouped_by_when
print_fn("Variants", pkg.variants, VariantFormatter())
def print_versions(pkg, args):
@@ -413,7 +505,7 @@ def print_versions(pkg, args):
else:
pad = padder(pkg.versions, 4)
preferred = spack.package_base.preferred_version(pkg)
def get_url(version):
try:

View File

@@ -38,7 +38,6 @@
r"^lib/spack/spack/.*\.sh$",
r"^lib/spack/spack/.*\.lp$",
r"^lib/spack/llnl/.*\.py$",
r"^lib/spack/env/cc$",
# special case some test data files that have license headers
r"^lib/spack/spack/test/data/style/broken.dummy",
r"^lib/spack/spack/test/data/unparse/.*\.txt",

View File

@@ -515,16 +515,15 @@ def extend_with_dependencies(specs):
def concrete_specs_from_cli_or_file(args):
tty.msg("Concretizing input specs")
if args.specs:
specs = spack.cmd.parse_specs(args.specs, concretize=True)
if not specs:
raise SpackError("unable to parse specs from command line")
if args.file:
specs = specs_from_text_file(args.file, concretize=True)
if not specs:
raise SpackError("unable to parse specs from file '{}'".format(args.file))
return specs

View File

@@ -1,7 +1,12 @@
# Copyright Spack Project Developers. See COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import shutil
from llnl.util import tty
import spack.database
import spack.store
description = "rebuild Spack's package database"
@@ -10,4 +15,11 @@
def reindex(parser, args):
current_index = spack.store.STORE.db._index_path
if os.path.isfile(current_index):
backup = f"{current_index}.bkp"
shutil.copy(current_index, backup)
tty.msg(f"Created a back-up copy of the DB at {backup}")
spack.store.STORE.reindex()
tty.msg(f"The DB at {current_index} has been reindex to v{spack.database._DB_VERSION}")

View File

@@ -17,6 +17,7 @@
pytest = None # type: ignore
import llnl.util.filesystem
import llnl.util.tty as tty
import llnl.util.tty.color as color
from llnl.util.tty.colify import colify
@@ -236,6 +237,12 @@ def unit_test(parser, args, unknown_args):
pytest_root = spack.extensions.load_extension(args.extension)
if args.numprocesses is not None and args.numprocesses > 1:
try:
import xdist # noqa: F401
except ImportError:
tty.error("parallel unit-test requires pytest-xdist module")
return 1
pytest_args.extend(
[
"--dist",

Some files were not shown because too many files have changed in this diff.