Compare commits

..

7 Commits

Author SHA1 Message Date
Harmen Stoppels
05fabbbab2 fix year dependent license verification check 2025-02-03 18:52:37 +01:00
Gregory Becker
c6d4037758 update version number to 0.23.0 2024-11-17 00:59:27 -05:00
Gregory Becker
08f1cf9ae2 Update CHANGELOG.md for v0.23.0 2024-11-17 00:59:27 -05:00
Todd Gamblin
48dfa3c95e Spec: prefer a splice-specific method to __len__ (#47585)
Automatic splicing saw `Spec` grow a `__len__` method, but it's only used
in one place, and it's not clear the semantics are useful elsewhere. It also
runs the risk of Specs one day being confused for other types of containers.

Rather than introduce a new function for one algorithm, let's use a more
specific method in the splice code.

- [x] Use topological ordering in `_resolve_automatic_splices` instead of 
      sorting by node count
- [x] delete `Spec.__len__()` and `Spec.__bool__()`
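
For context on the first item, a topological ordering processes each node only after
all of its dependencies. Below is a minimal, generic sketch using Kahn's algorithm;
it is illustrative only, not Spack's actual `_resolve_automatic_splices` code, and the
`deps` mapping is a hypothetical stand-in for a concrete Spec DAG:

```python
from collections import deque

def topological_order(deps):
    """Return nodes ordered so that each appears after all of its dependencies.

    ``deps`` maps a node to the set of nodes it depends on (hypothetical input;
    Spack's real code walks edges of a concrete Spec DAG instead).
    """
    # How many unprocessed dependencies each node still has.
    remaining = {node: len(node_deps) for node, node_deps in deps.items()}
    # Reverse mapping: dependency -> nodes that depend on it.
    dependents = {node: [] for node in deps}
    for node, node_deps in deps.items():
        for dep in node_deps:
            dependents[dep].append(node)

    ready = deque(node for node, count in remaining.items() if count == 0)
    order = []
    while ready:
        node = ready.popleft()
        order.append(node)
        for dependent in dependents[node]:
            remaining[dependent] -= 1
            if remaining[dependent] == 0:
                ready.append(dependent)
    return order  # dependencies come before their dependents

# Example DAG: app -> hdf5 -> mpich
print(topological_order({"app": {"hdf5"}, "hdf5": {"mpich"}, "mpich": set()}))
# ['mpich', 'hdf5', 'app']
```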

---------

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
Co-authored-by: Greg Becker <becker33@llnl.gov>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-11-14 08:24:06 +01:00
Todd Gamblin
e5c411d8f0 spack spec: simplify and unify output (#47574)
`spack spec` output has looked like this for a while:

```console
> spack spec /v5fn6xo /wd2p2v7
Input spec
--------------------------------
 -   /v5fn6xo

Concretized
--------------------------------
[+]  openssl@3.3.1%apple-clang@16.0.0~docs+shared build_system=generic certs=mozilla arch=darwin-sequoia-m1
[+]      ^ca-certificates-mozilla@2023-05-30%apple-clang@16.0.0 build_system=generic arch=darwin-sequoia-m1
...

Input spec
--------------------------------
 -   /wd2p2v7

Concretized
--------------------------------
[+]  py-six@1.16.0%apple-clang@16.0.0 build_system=python_pip arch=darwin-sequoia-m1
[+]      ^py-pip@23.1.2%apple-clang@16.0.0 build_system=generic arch=darwin-sequoia-m1
```

But the input spec is right there on the CLI, and it doesn't add anything to the output.
Also, since #44843, specs concretized in the CLI line can be unified, so it makes sense
to display them as we did in #44489 -- as one multi-root tree instead of as multiple
single-root trees.

With this PR, concretize output now looks like this:

```console
> spack spec /v5fn6xo /wd2p2v7
[+]  openssl@3.3.1%apple-clang@16.0.0~docs+shared build_system=generic certs=mozilla arch=darwin-sequoia-m1
[+]      ^ca-certificates-mozilla@2023-05-30%apple-clang@16.0.0 build_system=generic arch=darwin-sequoia-m1
[+]      ^gmake@4.4.1%apple-clang@16.0.0~guile build_system=generic arch=darwin-sequoia-m1
[+]      ^perl@5.40.0%apple-clang@16.0.0+cpanm+opcode+open+shared+threads build_system=generic arch=darwin-sequoia-m1
[+]          ^berkeley-db@18.1.40%apple-clang@16.0.0+cxx~docs+stl build_system=autotools patches=26090f4,b231fcc arch=darwin-sequoia-m1
[+]          ^bzip2@1.0.8%apple-clang@16.0.0~debug~pic+shared build_system=generic arch=darwin-sequoia-m1
[+]              ^diffutils@3.10%apple-clang@16.0.0 build_system=autotools arch=darwin-sequoia-m1
[+]                  ^libiconv@1.17%apple-clang@16.0.0 build_system=autotools libs=shared,static arch=darwin-sequoia-m1
[+]          ^gdbm@1.23%apple-clang@16.0.0 build_system=autotools arch=darwin-sequoia-m1
[+]              ^readline@8.2%apple-clang@16.0.0 build_system=autotools patches=bbf97f1 arch=darwin-sequoia-m1
[+]                  ^ncurses@6.5%apple-clang@16.0.0~symlinks+termlib abi=none build_system=autotools patches=7a351bc arch=darwin-sequoia-m1
[+]                      ^pkgconf@2.2.0%apple-clang@16.0.0 build_system=autotools arch=darwin-sequoia-m1
[+]      ^zlib-ng@2.2.1%apple-clang@16.0.0+compat+new_strategies+opt+pic+shared build_system=autotools arch=darwin-sequoia-m1
[+]          ^gnuconfig@2022-09-17%apple-clang@16.0.0 build_system=generic arch=darwin-sequoia-m1
[+]  py-six@1.16.0%apple-clang@16.0.0 build_system=python_pip arch=darwin-sequoia-m1
[+]      ^py-pip@23.1.2%apple-clang@16.0.0 build_system=generic arch=darwin-sequoia-m1
[+]      ^py-setuptools@69.2.0%apple-clang@16.0.0 build_system=generic arch=darwin-sequoia-m1
[-]      ^py-wheel@0.41.2%apple-clang@16.0.0 build_system=generic arch=darwin-sequoia-m1
...
```

No input spec is displayed -- just the concretization output, shown as one
consolidated tree with multiple roots.

- [x] remove "Input Spec" section and "Concretized" header from `spack spec` output
- [x] print concretized specs as one BFS tree instead of multiple

---------

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
2024-11-14 08:16:16 +01:00
psakievich
020e30f3e6 Update tutorial version (#47593) 2024-11-14 08:15:37 +01:00
Harmen Stoppels
181c404af5 missing and redundant imports (#47577) 2024-11-13 13:03:29 +01:00
670 changed files with 6405 additions and 8342 deletions

View File

@@ -57,13 +57,7 @@ jobs:
- name: Checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
- name: Determine latest release tag
id: latest
run: |
git fetch --quiet --tags
echo "tag=$(git tag --list --sort=-v:refname | grep -E '^v[0-9]+\.[0-9]+\.[0-9]+$' | head -n 1)" | tee -a $GITHUB_OUTPUT
- uses: docker/metadata-action@369eb591f429131d6889c46b94e711f089e6ca96
- uses: docker/metadata-action@8e5442c4ef9f78752691e2d8f8d19755c6f78e81
id: docker_meta
with:
images: |
@@ -77,7 +71,6 @@ jobs:
type=semver,pattern={{major}}
type=ref,event=branch
type=ref,event=pr
type=raw,value=latest,enable=${{ github.ref == format('refs/tags/{0}', steps.latest.outputs.tag) }}
- name: Generate the Dockerfile
env:
@@ -120,7 +113,7 @@ jobs:
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Build & Deploy ${{ matrix.dockerfile[0] }}
uses: docker/build-push-action@48aba3b46d1b1fec4febb7c5d0c644b249a11355
uses: docker/build-push-action@4f58ea79222b3b9dc2c8bbdd6debcef730109a75
with:
context: dockerfiles/${{ matrix.dockerfile[0] }}
platforms: ${{ matrix.dockerfile[1] }}

View File

@@ -29,7 +29,6 @@ jobs:
- run: coverage xml
- name: "Upload coverage report to CodeCov"
uses: codecov/codecov-action@05f5a9cfad807516dbbef9929c4a42df3eb78766
uses: codecov/codecov-action@b9fd7d16f6d7d1b5d2bec1a2887e65ceed900238
with:
verbose: true
fail_ci_if_error: true

View File

@@ -1,3 +1,365 @@
# v0.23.0 (2024-11-13)
`v0.23.0` is a major feature release.
We are planning to make this the last major release before Spack `v1.0`
in June 2025. Alongside `v0.23`, we will be making pre-releases (alpha,
beta, etc.) of `v1.0`, and we encourage users to try them and send us
feedback, either on GitHub or on Slack. You can track the road to
`v1.0` here:
* https://github.com/spack/spack/releases
* https://github.com/spack/spack/discussions/30634
## Features in this Release
1. **Language virtuals**
Your packages can now explicitly depend on the languages they require.
Historically, Spack has considered C, C++, and Fortran compiler
dependencies to be implicit. In `v0.23`, you should ensure that
new packages add relevant C, C++, and Fortran dependencies like this:
```python
depends_on("c", type="build")
depends_on("cxx", type="build")
depends_on("fortran", type="build")
```
We encourage you to add these annotations to your packages now, to prepare
for Spack `v1.0.0`. In `v1.0.0`, these annotations will be necessary for
your package to use C, C++, and Fortran compilers. Note that you should
*not* add language dependencies to packages that don't need them, e.g.,
pure python packages.
We have already auto-generated these dependencies for packages in the
`builtin` repository (see #45217), based on the types of source files
present in each package's source code. We *may* have added too many or too
few language dependencies, so please submit pull requests to correct
packages if you find that the language dependencies are incorrect.
Note that we have also backported support for these dependencies to
`v0.21.3` and `v0.22.2`, to make all of them forward-compatible with
`v0.23`. This should allow you to move easily between older and newer Spack
releases without breaking your packages.
2. **Spec splicing**
We are working to make binary installation more seamless in Spack. `v0.23`
introduces "splicing", which allows users to deploy binaries using local,
optimized versions of a binary interface, even if they were not built with
that interface. For example, this would allow you to build binaries in the
cloud using `mpich` and install them on a system using a local, optimized
version of `mvapich2` *without rebuilding*. Spack preserves full provenance
for the installed packages and knows that they were built one way but
deployed another.
Our intent is to leverage this across many key HPC binary packages,
e.g. MPI, CUDA, ROCm, and libfabric.
Fundamentally, splicing allows Spack to redeploy an existing spec with
different dependencies than how it was built. There are two interfaces to
splicing.
a. Explicit Splicing
#39136 introduced the explicit splicing interface. In the
concretizer config, you can specify a target spec and a replacement
by hash.
```yaml
concretizer:
splice:
explicit:
- target: mpi
replacement: mpich/abcdef
```
Here, every installation that would normally use the target spec will
instead use its replacement. Above, any spec using *any* `mpi` will be
spliced to depend on the specific `mpich` installation requested. This
*can* go wrong if you try to replace something built with, e.g.,
`openmpi` with `mpich`, and it is on the user to ensure ABI
compatibility between target and replacement specs. This currently
requires some expertise to use, but it will allow users to reuse the
binaries they create across more machines and environments.
b. Automatic Splicing (experimental)
#46729 introduced automatic splicing. In the concretizer config, enable
automatic splicing:
```yaml
concretizer:
splice:
automatic: true
```
or run:
```console
spack config add concretizer:splice:automatic:true
```
The concretizer will select splices for ABI compatibility to maximize
package reuse. Packages can denote ABI compatibility using the
`can_splice` directive. No packages in Spack yet use this directive, so
if you want to use this feature you will need to add `can_splice`
annotations to your packages. We are working on ways to add more ABI
compatibility information to the Spack package repository, and this
directive may change in the future.
See the documentation for more details:
* https://spack.readthedocs.io/en/latest/build_settings.html#splicing
* https://spack.readthedocs.io/en/latest/packaging_guide.html#specifying-abi-compatibility
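As a sketch of the `can_splice` annotation mentioned above, a package recipe might
declare that a newer version is ABI-compatible with an older one like this. The
package `foo` and its versions are hypothetical; see the packaging guide linked
above for the authoritative directive syntax:
```python
from spack.package import *

class Foo(Package):
    """Hypothetical package declaring ABI compatibility between its versions."""

    homepage = "https://example.com/foo"

    version("1.1")
    version("1.0")

    # Tell the concretizer that foo@1.1 may be spliced in place of foo@1.0,
    # so binaries built against foo@1.0 can reuse an installed foo@1.1.
    # Only add this after verifying real ABI compatibility.
    can_splice("foo@1.0", when="@1.1")
```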
3. **Broader variant propagation**
Since #42931, you can specify propagated variants like `hdf5
build_type==RelWithDebInfo` or `trilinos ++openmp` to propagate a variant
to all dependencies for which it is relevant. This is valid *even* if the
variant does not exist on the package or its dependencies.
See https://spack.readthedocs.io/en/latest/basic_usage.html#variants.
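For example, the propagated-variant syntax above can be used directly on the
command line (illustrative commands, not captured output):
```console
# Propagate the build_type variant to every dependency that defines it
$ spack spec hdf5 build_type==RelWithDebInfo

# Propagate +openmp to all dependencies for which it is relevant
$ spack install trilinos ++openmp
```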
4. **Query specs by namespace**
#45416 allows a package's namespace (indicating the repository it came from)
to be treated like a variant. You can request packages from particular repos
like this:
```console
spack find zlib namespace=builtin
spack find zlib namespace=myrepo
```
Previously, the spec syntax only allowed namespaces to be prefixes of spec
names, e.g. `builtin.zlib`. The previous syntax still works.
5. **`spack spec` respects environment settings and `unify:true`**
`spack spec` did not previously respect environment lockfiles or
unification settings, which made it difficult to see exactly how a spec
would concretize within an environment. Now it does, so the output you get
with `spack spec` will be *the same* as what your environment will
concretize to when you run `spack concretize`. Similarly, if you provide
multiple specs on the command line with `spack spec`, it will concretize
them together if `unify:true` is set.
See #47556 and #44843.
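For instance, in a hypothetical environment with `unify: true`, both specs below
are concretized together, matching what `spack concretize` would produce:
```console
$ spack env activate myenv
$ spack spec hdf5+mpi py-numpy
```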
6. **Less noisy `spack spec` output**
`spack spec` previously showed output like this:
```console
> spack spec /v5fn6xo
Input spec
--------------------------------
- /v5fn6xo
Concretized
--------------------------------
[+] openssl@3.3.1%apple-clang@16.0.0~docs+shared arch=darwin-sequoia-m1
...
```
But the input spec is redundant, and we know we run `spack spec` to concretize
the input spec. `spack spec` now *only* shows the concretized spec. See #47574.
7. **Better output for `spack find -c`**
In an environment, `spack find -c` lets you search the concretized, but not
yet installed, specs, just as you would the installed ones. As with `spack
spec`, this should make it easier for you to see what *will* be built
before building and installing it. See #44713.
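For example, in a hypothetical environment that has been concretized but not yet
installed:
```console
# List the environment's concretized-but-not-yet-installed specs
$ spack -e myenv find -c
```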
8. **`spack -C <env>`: use an environment's configuration without activation**
Spack environments allow you to associate:
1. a set of (possibly concretized) specs, and
2. configuration
When you activate an environment, you're using both of these. Previously, we
supported:
* `spack -e <env>` to run spack in the context of a specific environment, and
* `spack -C <directory>` to run spack using a directory with configuration files.
You can now also pass an environment to `spack -C` to use *only* the environment's
configuration, but not the specs or lockfile. See #45046.
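A brief sketch of this usage (the environment path and package are placeholders):
```console
# Use only the environment's configuration (not its specs or lockfile)
$ spack -C /path/to/my-env install zlib-ng
```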
## New commands, options, and directives
* The new `spack env track` command (#41897) takes a non-managed Spack
environment and adds a symlink to it in Spack's `$environments_root` directory,
so that it is included in reference counting for commands like `spack
uninstall` and `spack gc`. If you use free-standing directory environments,
this is useful for preventing Spack from removing things required by your
environments. You can undo this tracking with the `spack env untrack`
command.
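For example (the directory path is hypothetical):
```console
# Register an existing, free-standing environment directory with Spack
$ spack env track /projects/shared/my-env

# Undo the tracking later (argument form shown is illustrative)
$ spack env untrack my-env
```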
* Add `-t` short option for `spack --backtrace` (#47227)
`spack -d / --debug` enables backtraces on error, but it can be very
verbose, and sometimes you just want the backtrace. `spack -t / --backtrace`
provides that option.
* `gc`: restrict to specific specs (#46790)
If you only want to garbage-collect specific packages, you can now provide
them on the command line. This gives users finer-grained control over what
is uninstalled.
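For example (package names are placeholders):
```console
# Restrict garbage collection to the listed packages instead of the whole store
$ spack gc zlib-ng openssl
```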
* OCI buildcaches now support `--only=package`. You can now push *just* a
package and not its dependencies to an OCI registry. This allows dependents
of non-redistributable specs to be stored in OCI registries without an
error. See #45775.
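A sketch of the new option (mirror name and package are placeholders):
```console
# Push only the package itself, not its dependencies, to an OCI mirror
$ spack buildcache push --only=package my-oci-mirror some-package
```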
## Notable refactors
* Variants are now fully conditional
The `variants` dictionary on packages was previously keyed by variant name,
and allowed only one definition of any given variant. Spack is now smart
enough to understand that variants may have different values and defaults
for different versions. For example, `warpx` prior to `23.06` only supported
builds for one dimensionality, and newer `warpx` versions could be built
with support for many different dimensions:
```python
variant(
"dims",
default="3",
values=("1", "2", "3", "rz"),
multi=False,
description="Number of spatial dimensions",
when="@:23.05",
)
variant(
"dims",
default="1,2,rz,3",
values=("1", "2", "3", "rz"),
multi=True,
description="Number of spatial dimensions",
when="@23.06:",
)
```
Previously, the default for the old version of `warpx` was not respected and
had to be specified manually. Now, Spack will select the right variant
definition for each version at concretization time. This allows variants to
evolve more smoothly over time. See #44425 for details.
## Highlighted bugfixes
1. Externals no longer override the preferred provider (#45025).
External definitions could interfere with package preferences. Now, if
`openmpi` is the preferred `mpi`, and an external `mpich` is defined, a new
`openmpi` *will* be built if building it is possible. Previously we would
prefer `mpich` despite the preference.
2. Composable `cflags` (#41049).
This release fixes a longstanding bug where concretization would fail if
different `cflags` were specified in `packages.yaml`,
`compilers.yaml`, or on the CLI. Flags and their ordering are now tracked
in the concretizer and flags from multiple sources will be merged.
3. Fix concretizer unification for included environments (#45139).
## Deprecations, removals, and syntax changes
1. The old concretizer has been removed from Spack, along with the
`config:concretizer` config option. Spack will emit a warning if the option
is present in user configuration, since it now has no effect. Spack now
uses a simpler bootstrapping mechanism, where a JSON prototype is tweaked
slightly to get an initial concrete spec to download. See #45215.
2. Best-effort expansion of spec matrices has been removed. This feature did
not work with the "new" ASP-based concretizer, and did not work with
`unify: True` or `unify: when_possible`. Use the
[exclude key](https://spack.readthedocs.io/en/latest/environments.html#spec-matrices)
for the environment to exclude invalid components, or use multiple spec
matrices to combine the list of specs for which the constraint is valid and
the list of specs for which it is not. See #40792.
3. The old Cray `platform` (based on Cray PE modules) has been removed, and
`platform=cray` is no longer supported. Since `v0.19`, Spack has handled
Cray machines like Linux clusters with extra packages, and we have
encouraged using this option to support Cray. The new approach allows us to
correctly handle Cray machines with non-SLES operating systems, and it is
much more reliable than making assumptions about Cray modules. See the
`v0.19` release notes and #43796 for more details.
4. The `config:install_missing_compilers` config option has been deprecated,
and it is a no-op when set in `v0.23`. Our new compiler dependency model
will replace it with a much more reliable and robust mechanism in `v1.0`.
See #46237.
5. Config options that were deprecated in `v0.21` have been removed in `v0.23`. You
can now only specify preferences for `compilers`, `targets`, and
`providers` globally via the `packages:all:` section. Similarly, you can
only specify `versions:` locally for a specific package. See #44061 and
#31261 for details.
6. Spack's old test interface has been removed (#45752), having been
deprecated in `v0.22.0` (#34236). All `builtin` packages have been updated
to use the new interface. See the [stand-alone test documentation](
https://spack.readthedocs.io/en/latest/packaging_guide.html#stand-alone-tests).
7. The `spack versions --safe-only` option, deprecated since `v0.21.0`, has
been removed. See #45765.
8. The `--dependencies` and `--optimize` arguments to `spack ci` have been
deprecated. See #45005.
## Binary caches
1. Public binary caches now include an ML stack for Linux/aarch64 (#39666). We
now build an ML stack for Linux/aarch64 for all pull requests and on
develop. The ML stack includes both CPU-only and CUDA builds for Horovod,
Hugging Face, JAX, Keras, PyTorch, scikit-learn, TensorBoard, TensorFlow,
and related packages. The CPU-only stack also includes XGBoost.
See https://cache.spack.io/tag/develop/?stack=ml-linux-aarch64-cuda.
2. There is also now a stack of developer tools for macOS (#46910), which is
analogous to the Linux devtools stack. You can use this to avoid building
many common build dependencies. See
https://cache.spack.io/tag/develop/?stack=developer-tools-darwin.
## Architecture support
* archspec has been updated to `v0.2.5`, with support for `zen5`
* Spack's CUDA package now supports the Grace Hopper `9.0a` compute capability (#45540)
## Windows
* Windows bootstrapping: `file` and `gpg` (#41810)
* `scripts` directory added to PATH on Windows for python extensions (#45427)
* Fix `spack load --list` and `spack unload` on Windows (#35720)
## Other notable changes
* Bugfix: `spack find -x` in environments (#46798)
* Spec splices are now robust to duplicate nodes with the same name in a spec (#46382)
* Cache per-compiler libc calculations for performance (#47213)
* Fixed a bug in external detection for openmpi (#47541)
* Mirror configuration allows username/password as environment variables (#46549)
* Default library search caps maximum depth (#41945)
* Unify interface for `spack spec` and `spack solve` commands (#47182)
* Spack no longer RPATHs directories in the default library search path (#44686)
* Improved performance of Spack database (#46554)
* Enable package reuse for packages with versions from git refs (#43859)
* Improved handling for `uuid` virtual on macOS (#43002)
* Improved tracking of task queueing/requeueing in the installer (#46293)
## Spack community stats
* Over 2,000 pull requests updated package recipes
* 8,307 total packages, 329 new since `v0.22.0`
* 140 new Python packages
* 14 new R packages
* 373 people contributed to this release
* 357 committers to packages
* 60 committers to core
# v0.22.2 (2024-09-21)
## Bugfixes
@@ -419,7 +781,7 @@
- spack graph: fix coloring with environments (#41240)
- spack info: sort variants in --variants-by-name (#41389)
- Spec.format: error on old style format strings (#41934)
- ASP-based solver:
- ASP-based solver:
- fix infinite recursion when computing concretization errors (#41061)
- don't error for type mismatch on preferences (#41138)
- don't emit spurious debug output (#41218)

View File

@@ -70,7 +70,7 @@ Tutorial
----------------
We maintain a
[**hands-on tutorial**](https://spack-tutorial.readthedocs.io/).
[**hands-on tutorial**](https://spack.readthedocs.io/en/latest/tutorial.html).
It covers basic to advanced usage, packaging, developer features, and large HPC
deployments. You can do all of the exercises on your own laptop using a
Docker container.

View File

@@ -55,11 +55,3 @@ concretizer:
splice:
explicit: []
automatic: false
# Maximum time, in seconds, allowed for the 'solve' phase. If set to 0, there is no time limit.
timeout: 0
# If set to true, exceeding the timeout will always result in a concretization error. If false,
# the best (suboptimal) model computed before the timeout is used.
#
# Setting this to false yields unreproducible results, so we advise to use that value only
# for debugging purposes (e.g. check which constraints can help Spack concretize faster).
error_on_timeout: true

View File

@@ -76,8 +76,6 @@ packages:
buildable: false
cray-mvapich2:
buildable: false
egl:
buildable: false
fujitsu-mpi:
buildable: false
hpcx-mpi:

View File

@@ -210,7 +210,7 @@ def setup(sphinx):
# Spack classes that are private and we don't want to expose
("py:class", "spack.provider_index._IndexBase"),
("py:class", "spack.repo._PrependFileLoader"),
("py:class", "spack.build_systems._checks.BuilderWithDefaults"),
("py:class", "spack.build_systems._checks.BaseBuilder"),
# Spack classes that intersphinx is unable to resolve
("py:class", "spack.version.StandardVersion"),
("py:class", "spack.spec.DependencySpec"),
@@ -222,9 +222,6 @@ def setup(sphinx):
("py:class", "spack.traverse.EdgeAndDepth"),
("py:class", "archspec.cpu.microarchitecture.Microarchitecture"),
("py:class", "spack.compiler.CompilerCache"),
("py:class", "spack.mirrors.mirror.Mirror"),
("py:class", "spack.mirrors.layout.MirrorLayout"),
("py:class", "spack.mirrors.utils.MirrorStats"),
# TypeVar that is not handled correctly
("py:class", "llnl.util.lang.T"),
]

View File

@@ -1042,7 +1042,7 @@ file snippet we define a view named ``mpis``, rooted at
``/path/to/view`` in which all projections use the package name,
version, and compiler name to determine the path for a given
package. This view selects all packages that depend on MPI, and
excludes those built with the GCC compiler at version 18.5.
excludes those built with the PGI compiler at version 18.5.
The root specs with their (transitive) link and run type dependencies
will be put in the view due to the ``link: all`` option,
and the files in the view will be symlinks to the spack install
@@ -1056,7 +1056,7 @@ directories.
mpis:
root: /path/to/view
select: [^mpi]
exclude: ['%gcc@18.5']
exclude: ['%pgi@18.5']
projections:
all: '{name}/{version}-{compiler.name}'
link: all

View File

@@ -283,6 +283,10 @@ compilers`` or ``spack compiler list``:
intel@14.0.1 intel@13.0.1 intel@12.1.2 intel@10.1
-- clang -------------------------------------------------------
clang@3.4 clang@3.3 clang@3.2 clang@3.1
-- pgi ---------------------------------------------------------
pgi@14.3-0 pgi@13.2-0 pgi@12.1-0 pgi@10.9-0 pgi@8.0-1
pgi@13.10-0 pgi@13.1-1 pgi@11.10-0 pgi@10.2-0 pgi@7.1-3
pgi@13.6-0 pgi@12.8-0 pgi@11.1-0 pgi@9.0-4 pgi@7.0-6
Any of these compilers can be used to build Spack packages. More on
how this is done is in :ref:`sec-specs`.
@@ -802,6 +806,65 @@ flags to the ``icc`` command:
spec: intel@15.0.24.4.9.3
^^^
PGI
^^^
PGI comes with two sets of compilers for C++ and Fortran,
distinguishable by their names. "Old" compilers:
.. code-block:: yaml
cc: /soft/pgi/15.10/linux86-64/15.10/bin/pgcc
cxx: /soft/pgi/15.10/linux86-64/15.10/bin/pgCC
f77: /soft/pgi/15.10/linux86-64/15.10/bin/pgf77
fc: /soft/pgi/15.10/linux86-64/15.10/bin/pgf90
"New" compilers:
.. code-block:: yaml
cc: /soft/pgi/15.10/linux86-64/15.10/bin/pgcc
cxx: /soft/pgi/15.10/linux86-64/15.10/bin/pgc++
f77: /soft/pgi/15.10/linux86-64/15.10/bin/pgfortran
fc: /soft/pgi/15.10/linux86-64/15.10/bin/pgfortran
Older installations of PGI contains just the old compilers; whereas
newer installations contain the old and the new. The new compiler is
considered preferable, as some packages
(``hdf``) will not build with the old compiler.
When auto-detecting a PGI compiler, there are cases where Spack will
find the old compilers, when you really want it to find the new
compilers. It is best to check this ``compilers.yaml``; and if the old
compilers are being used, change ``pgf77`` and ``pgf90`` to
``pgfortran``.
Other issues:
* There are reports that some packages will not build with PGI,
including ``libpciaccess`` and ``openssl``. A workaround is to
build these packages with another compiler and then use them as
dependencies for PGI-build packages. For example:
.. code-block:: console
$ spack install openmpi%pgi ^libpciaccess%gcc
* PGI requires a license to use; see :ref:`licensed-compilers` for more
information on installation.
.. note::
It is believed the problem with HDF 4 is that everything is
compiled with the ``F77`` compiler, but at some point some Fortran
90 code slipped in there. So compilers that can handle both FORTRAN
77 and Fortran 90 (``gfortran``, ``pgfortran``, etc) are fine. But
compilers specific to one or the other (``pgf77``, ``pgf90``) won't
work.
^^^
NAG
^^^
@@ -1326,7 +1389,6 @@ Required:
* Microsoft Visual Studio
* Python
* Git
* 7z
Optional:
* Intel Fortran (needed for some packages)
@@ -1392,13 +1454,6 @@ as the project providing Git support on Windows. This is additionally the recomm
for installing Git on Windows, a link to which can be found above. Spack requires the
utilities vendored by this project.
"""
7zip
"""
A tool for extracting ``.xz`` files is required for extracting source tarballs. The latest 7zip
can be located at https://sourceforge.net/projects/sevenzip/.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Step 2: Install and setup Spack
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

View File

@@ -1928,29 +1928,71 @@ to the empty list.
String. A URL pointing to license setup instructions for the software.
Defaults to the empty string.
For example, let's take a look at the Arm Forge package.
For example, let's take a look at the package for the PGI compilers.
.. code-block:: python
# Licensing
license_required = True
license_comment = "#"
license_files = ["licences/Licence"]
license_vars = [
"ALLINEA_LICENSE_DIR",
"ALLINEA_LICENCE_DIR",
"ALLINEA_LICENSE_FILE",
"ALLINEA_LICENCE_FILE",
]
license_url = "https://developer.arm.com/documentation/101169/latest/Use-Arm-Licence-Server"
license_comment = "#"
license_files = ["license.dat"]
license_vars = ["PGROUPD_LICENSE_FILE", "LM_LICENSE_FILE"]
license_url = "http://www.pgroup.com/doc/pgiinstall.pdf"
Arm Forge requires a license. Its license manager uses the ``#`` symbol to denote a comment.
It expects the license file to be named ``License`` and to be located in a ``licenses`` directory
in the installation prefix.
As you can see, PGI requires a license. Its license manager, FlexNet, uses
the ``#`` symbol to denote a comment. It expects the license file to be
named ``license.dat`` and to be located directly in the installation prefix.
If you would like the installation file to be located elsewhere, simply set
``PGROUPD_LICENSE_FILE`` or ``LM_LICENSE_FILE`` after installation. For
further instructions on installation and licensing, see the URL provided.
If you would like the installation file to be located elsewhere, simply set ``ALLINEA_LICENSE_DIR`` or
one of the other license variables after installation. For further instructions on installation and
licensing, see the URL provided.
Let's walk through a sample PGI installation to see exactly what Spack is
and isn't capable of. Since PGI does not provide a download URL, it must
be downloaded manually. It can either be added to a mirror or located in
the current directory when ``spack install pgi`` is run. See :ref:`mirrors`
for instructions on setting up a mirror.
After running ``spack install pgi``, the first thing that will happen is
Spack will create a global license file located at
``$SPACK_ROOT/etc/spack/licenses/pgi/license.dat``. It will then open up the
file using :ref:`your favorite editor <controlling-the-editor>`. It will look like
this:
.. code-block:: sh
# A license is required to use pgi.
#
# The recommended solution is to store your license key in this global
# license file. After installation, the following symlink(s) will be
# added to point to this file (relative to the installation prefix):
#
# license.dat
#
# Alternatively, use one of the following environment variable(s):
#
# PGROUPD_LICENSE_FILE
# LM_LICENSE_FILE
#
# If you choose to store your license in a non-standard location, you may
# set one of these variable(s) to the full pathname to the license file, or
# port@host if you store your license keys on a dedicated license server.
# You will likely want to set this variable in a module file so that it
# gets loaded every time someone tries to use pgi.
#
# For further information on how to acquire a license, please refer to:
#
# http://www.pgroup.com/doc/pgiinstall.pdf
#
# You may enter your license below.
You can add your license directly to this file, or tell FlexNet to use a
license stored on a separate license server. Here is an example that
points to a license server called licman1:
.. code-block:: none
SERVER licman1.mcs.anl.gov 00163eb7fba5 27200
USE_SERVER
If your package requires the license to install, you can reference the
location of this global license using ``self.global_license_file``.
@@ -2925,9 +2967,9 @@ make sense during the build phase may not be needed at runtime, and vice versa.
it makes sense to let a dependency set the environment variables for its dependents. To allow all
this, Spack provides four different methods that can be overridden in a package:
1. :meth:`setup_build_environment <spack.builder.BaseBuilder.setup_build_environment>`
1. :meth:`setup_build_environment <spack.builder.Builder.setup_build_environment>`
2. :meth:`setup_run_environment <spack.package_base.PackageBase.setup_run_environment>`
3. :meth:`setup_dependent_build_environment <spack.builder.BaseBuilder.setup_dependent_build_environment>`
3. :meth:`setup_dependent_build_environment <spack.builder.Builder.setup_dependent_build_environment>`
4. :meth:`setup_dependent_run_environment <spack.package_base.PackageBase.setup_dependent_run_environment>`
The Qt package, for instance, uses this call:

View File

@@ -1,12 +1,12 @@
sphinx==8.1.3
sphinxcontrib-programoutput==0.17
sphinx_design==0.6.1
sphinx-rtd-theme==3.0.2
sphinx-rtd-theme==3.0.1
python-levenshtein==0.26.1
docutils==0.21.2
pygments==2.18.0
urllib3==2.2.3
pytest==8.3.4
pytest==8.3.3
isort==5.13.2
black==24.10.0
flake8==7.1.1

View File

@@ -24,7 +24,6 @@
Callable,
Deque,
Dict,
Generator,
Iterable,
List,
Match,
@@ -2773,6 +2772,22 @@ def prefixes(path):
return paths
@system_path_filter
def md5sum(file):
"""Compute the MD5 sum of a file.
Args:
file (str): file to be checksummed
Returns:
MD5 sum of the file's content
"""
md5 = hashlib.md5()
with open(file, "rb") as f:
md5.update(f.read())
return md5.digest()
@system_path_filter
def remove_directory_contents(dir):
"""Remove all contents of a directory."""
@@ -2823,25 +2838,6 @@ def temporary_dir(
remove_directory_contents(tmp_dir)
@contextmanager
def edit_in_place_through_temporary_file(file_path: str) -> Generator[str, None, None]:
"""Context manager for modifying ``file_path`` in place, preserving its inode and hardlinks,
for functions or external tools that do not support in-place editing. Notice that this function
is unsafe in that it works with paths instead of a file descriptors, but this is by design,
since we assume the call site will create a new inode at the same path."""
tmp_fd, tmp_path = tempfile.mkstemp(
dir=os.path.dirname(file_path), prefix=f"{os.path.basename(file_path)}."
)
# windows cannot replace a file with open fds, so close since the call site needs to replace.
os.close(tmp_fd)
try:
shutil.copyfile(file_path, tmp_path, follow_symlinks=True)
yield tmp_path
shutil.copyfile(tmp_path, file_path, follow_symlinks=True)
finally:
os.unlink(tmp_path)
def filesummary(path, print_bytes=16) -> Tuple[int, bytes]:
"""Create a small summary of the given file. Does not error
when file does not exist.

View File

@@ -11,7 +11,7 @@
import spack.util.git
#: PEP440 canonical <major>.<minor>.<micro>.<devN> string
__version__ = "0.24.0.dev0"
__version__ = "0.23.0"
spack_version = __version__

View File

@@ -571,13 +571,8 @@ def _search_for_deprecated_package_methods(pkgs, error_cls):
@package_properties
def _ensure_all_package_names_are_lowercase(pkgs, error_cls):
"""Ensure package names are lowercase and consistent"""
reserved_names = ("all",)
badname_regex, errors = re.compile(r"[_A-Z]"), []
for pkg_name in pkgs:
if pkg_name in reserved_names:
error_msg = f"The name '{pkg_name}' is reserved, and cannot be used for packages"
errors.append(error_cls(error_msg, []))
if badname_regex.search(pkg_name):
error_msg = f"Package name '{pkg_name}' should be lowercase and must not contain '_'"
errors.append(error_cls(error_msg, []))
@@ -693,19 +688,19 @@ def invalid_sha256_digest(fetcher):
return h, True
return None, False
error_msg = f"Package '{pkg_name}' does not use sha256 checksum"
error_msg = "Package '{}' does not use sha256 checksum".format(pkg_name)
details = []
for v, args in pkg.versions.items():
fetcher = spack.fetch_strategy.for_package_version(pkg, v)
digest, is_bad = invalid_sha256_digest(fetcher)
if is_bad:
details.append(f"{pkg_name}@{v} uses {digest}")
details.append("{}@{} uses {}".format(pkg_name, v, digest))
for _, resources in pkg.resources.items():
for resource in resources:
digest, is_bad = invalid_sha256_digest(resource.fetcher)
if is_bad:
details.append(f"Resource in '{pkg_name}' uses {digest}")
details.append("Resource in '{}' uses {}".format(pkg_name, digest))
if details:
errors.append(error_cls(error_msg, details))

View File

@@ -40,7 +40,7 @@
import spack.hash_types as ht
import spack.hooks
import spack.hooks.sbang
import spack.mirrors.mirror
import spack.mirror
import spack.oci.image
import spack.oci.oci
import spack.oci.opener
@@ -87,8 +87,6 @@
from spack.stage import Stage
from spack.util.executable import which
from .enums import InstallRecordStatus
BUILD_CACHE_RELATIVE_PATH = "build_cache"
BUILD_CACHE_KEYS_RELATIVE_PATH = "_pgp"
@@ -254,7 +252,7 @@ def _associate_built_specs_with_mirror(self, cache_key, mirror_url):
spec_list = [
s
for s in db.query_local(installed=InstallRecordStatus.ANY)
for s in db.query_local(installed=any)
if s.external or db.query_local_by_spec_hash(s.dag_hash()).in_buildcache
]
@@ -369,7 +367,7 @@ def update(self, with_cooldown=False):
on disk under ``_index_cache_root``)."""
self._init_local_index_cache()
configured_mirror_urls = [
m.fetch_url for m in spack.mirrors.mirror.MirrorCollection(binary=True).values()
m.fetch_url for m in spack.mirror.MirrorCollection(binary=True).values()
]
items_to_remove = []
spec_cache_clear_needed = False
@@ -1176,7 +1174,7 @@ def _url_upload_tarball_and_specfile(
class Uploader:
def __init__(self, mirror: spack.mirrors.mirror.Mirror, force: bool, update_index: bool):
def __init__(self, mirror: spack.mirror.Mirror, force: bool, update_index: bool):
self.mirror = mirror
self.force = force
self.update_index = update_index
@@ -1224,7 +1222,7 @@ def tag(self, tag: str, roots: List[spack.spec.Spec]):
class OCIUploader(Uploader):
def __init__(
self,
mirror: spack.mirrors.mirror.Mirror,
mirror: spack.mirror.Mirror,
force: bool,
update_index: bool,
base_image: Optional[str],
@@ -1273,7 +1271,7 @@ def tag(self, tag: str, roots: List[spack.spec.Spec]):
class URLUploader(Uploader):
def __init__(
self,
mirror: spack.mirrors.mirror.Mirror,
mirror: spack.mirror.Mirror,
force: bool,
update_index: bool,
signing_key: Optional[str],
@@ -1297,7 +1295,7 @@ def push(
def make_uploader(
mirror: spack.mirrors.mirror.Mirror,
mirror: spack.mirror.Mirror,
force: bool = False,
update_index: bool = False,
signing_key: Optional[str] = None,
@@ -1953,9 +1951,9 @@ def download_tarball(spec, unsigned: Optional[bool] = False, mirrors_for_spec=No
"signature_verified": "true-if-binary-pkg-was-already-verified"
}
"""
configured_mirrors: Iterable[spack.mirrors.mirror.Mirror] = (
spack.mirrors.mirror.MirrorCollection(binary=True).values()
)
configured_mirrors: Iterable[spack.mirror.Mirror] = spack.mirror.MirrorCollection(
binary=True
).values()
if not configured_mirrors:
tty.die("Please add a spack mirror to allow download of pre-compiled packages.")
@@ -1980,7 +1978,7 @@ def fetch_url_to_mirror(url):
for mirror in configured_mirrors:
if mirror.fetch_url == url:
return mirror
return spack.mirrors.mirror.Mirror(url)
return spack.mirror.Mirror(url)
mirrors = [fetch_url_to_mirror(url) for url in mirror_urls]
@@ -2334,9 +2332,7 @@ def is_backup_file(file):
if not codesign:
return
for binary in changed_files:
# preserve the original inode by running codesign on a copy
with fsys.edit_in_place_through_temporary_file(binary) as tmp_binary:
codesign("-fs-", tmp_binary)
codesign("-fs-", binary)
# If we are installing back to the same location
# relocate the sbang location if the spack directory changed
@@ -2650,7 +2646,7 @@ def try_direct_fetch(spec, mirrors=None):
specfile_is_signed = False
found_specs = []
binary_mirrors = spack.mirrors.mirror.MirrorCollection(mirrors=mirrors, binary=True).values()
binary_mirrors = spack.mirror.MirrorCollection(mirrors=mirrors, binary=True).values()
for mirror in binary_mirrors:
buildcache_fetch_url_json = url_util.join(
@@ -2711,7 +2707,7 @@ def get_mirrors_for_spec(spec=None, mirrors_to_check=None, index_only=False):
if spec is None:
return []
if not spack.mirrors.mirror.MirrorCollection(mirrors=mirrors_to_check, binary=True):
if not spack.mirror.MirrorCollection(mirrors=mirrors_to_check, binary=True):
tty.debug("No Spack mirrors are currently configured")
return {}
@@ -2750,7 +2746,7 @@ def clear_spec_cache():
def get_keys(install=False, trust=False, force=False, mirrors=None):
"""Get pgp public keys available on mirror with suffix .pub"""
mirror_collection = mirrors or spack.mirrors.mirror.MirrorCollection(binary=True)
mirror_collection = mirrors or spack.mirror.MirrorCollection(binary=True)
if not mirror_collection:
tty.die("Please add a spack mirror to allow " + "download of build caches.")
@@ -2805,7 +2801,7 @@ def get_keys(install=False, trust=False, force=False, mirrors=None):
def _url_push_keys(
*mirrors: Union[spack.mirrors.mirror.Mirror, str],
*mirrors: Union[spack.mirror.Mirror, str],
keys: List[str],
tmpdir: str,
update_index: bool = False,
@@ -2872,7 +2868,7 @@ def check_specs_against_mirrors(mirrors, specs, output_file=None):
"""
rebuilds = {}
for mirror in spack.mirrors.mirror.MirrorCollection(mirrors, binary=True).values():
for mirror in spack.mirror.MirrorCollection(mirrors, binary=True).values():
tty.debug("Checking for built specs at {0}".format(mirror.fetch_url))
rebuild_list = []
@@ -2916,7 +2912,7 @@ def _download_buildcache_entry(mirror_root, descriptions):
def download_buildcache_entry(file_descriptions, mirror_url=None):
if not mirror_url and not spack.mirrors.mirror.MirrorCollection(binary=True):
if not mirror_url and not spack.mirror.MirrorCollection(binary=True):
tty.die(
"Please provide or add a spack mirror to allow " + "download of buildcache entries."
)
@@ -2925,7 +2921,7 @@ def download_buildcache_entry(file_descriptions, mirror_url=None):
mirror_root = os.path.join(mirror_url, BUILD_CACHE_RELATIVE_PATH)
return _download_buildcache_entry(mirror_root, file_descriptions)
for mirror in spack.mirrors.mirror.MirrorCollection(binary=True).values():
for mirror in spack.mirror.MirrorCollection(binary=True).values():
mirror_root = os.path.join(mirror.fetch_url, BUILD_CACHE_RELATIVE_PATH)
if _download_buildcache_entry(mirror_root, file_descriptions):

View File

@@ -37,7 +37,7 @@
import spack.binary_distribution
import spack.config
import spack.detection
import spack.mirrors.mirror
import spack.mirror
import spack.platforms
import spack.spec
import spack.store
@@ -91,7 +91,7 @@ def __init__(self, conf: ConfigDictionary) -> None:
self.metadata_dir = spack.util.path.canonicalize_path(conf["metadata"])
# Promote (relative) paths to file urls
self.url = spack.mirrors.mirror.Mirror(conf["info"]["url"]).fetch_url
self.url = spack.mirror.Mirror(conf["info"]["url"]).fetch_url
@property
def mirror_scope(self) -> spack.config.InternalConfigScope:

View File

@@ -56,6 +56,7 @@
from llnl.util.symlink import symlink
from llnl.util.tty.color import cescape, colorize
import spack.build_systems._checks
import spack.build_systems.cmake
import spack.build_systems.meson
import spack.build_systems.python
@@ -882,9 +883,6 @@ def __init__(self, *roots: spack.spec.Spec, context: Context):
elif context == Context.RUN:
self.root_depflag = dt.RUN | dt.LINK
def accept(self, item):
return True
def neighbors(self, item):
spec = item.edge.spec
if spec.dag_hash() in self.root_hashes:
@@ -922,19 +920,19 @@ def effective_deptypes(
a flag specifying in what way they do so. The list is ordered topologically
from root to leaf, meaning that environment modifications should be applied
in reverse so that dependents override dependencies, not the other way around."""
topo_sorted_edges = traverse.traverse_topo_edges_generator(
traverse.with_artificial_edges(specs),
visitor=EnvironmentVisitor(*specs, context=context),
key=traverse.by_dag_hash,
visitor = traverse.TopoVisitor(
EnvironmentVisitor(*specs, context=context),
key=lambda x: x.dag_hash(),
root=True,
all_edges=True,
)
traverse.traverse_depth_first_with_visitor(traverse.with_artificial_edges(specs), visitor)
# Dictionary with "no mode" as default value, so it's easy to write modes[x] |= flag.
use_modes = defaultdict(lambda: UseMode(0))
nodes_with_type = []
for edge in topo_sorted_edges:
for edge in visitor.edges:
parent, child, depflag = edge.parent, edge.spec, edge.depflag
# Mark the starting point
@@ -1377,7 +1375,7 @@ def exitcode_msg(p):
return child_result
CONTEXT_BASES = (spack.package_base.PackageBase, spack.builder.Builder)
CONTEXT_BASES = (spack.package_base.PackageBase, spack.build_systems._checks.BaseBuilder)
def get_package_context(traceback, context=3):
@@ -1426,20 +1424,27 @@ def make_stack(tb, stack=None):
# We found obj, the Package implementation we care about.
# Point out the location in the install method where we failed.
filename = inspect.getfile(frame.f_code)
lines = [f"{filename}:{frame.f_lineno}, in {frame.f_code.co_name}:"]
lineno = frame.f_lineno
if os.path.basename(filename) == "package.py":
# subtract 1 because we inject a magic import at the top of package files.
# TODO: get rid of the magic import.
lineno -= 1
lines = ["{0}:{1:d}, in {2}:".format(filename, lineno, frame.f_code.co_name)]
# Build a message showing context in the install method.
sourcelines, start = inspect.getsourcelines(frame)
# Calculate lineno of the error relative to the start of the function.
fun_lineno = frame.f_lineno - start
fun_lineno = lineno - start
start_ctx = max(0, fun_lineno - context)
sourcelines = sourcelines[start_ctx : fun_lineno + context + 1]
for i, line in enumerate(sourcelines):
is_error = start_ctx + i == fun_lineno
mark = ">> " if is_error else " "
# Add start to get lineno relative to start of file, not function.
marked = f" {'>> ' if is_error else ' '}{start + start_ctx + i:-6d}{line.rstrip()}"
marked = " {0}{1:-6d}{2}".format(mark, start + start_ctx + i, line.rstrip())
if is_error:
marked = colorize("@R{%s}" % cescape(marked))
lines.append(marked)

View File

@@ -9,7 +9,6 @@
import spack.builder
import spack.error
import spack.phase_callbacks
import spack.relocate
import spack.spec
import spack.store
@@ -64,7 +63,7 @@ def apply_macos_rpath_fixups(builder: spack.builder.Builder):
def ensure_build_dependencies_or_raise(
spec: spack.spec.Spec, dependencies: List[str], error_msg: str
spec: spack.spec.Spec, dependencies: List[spack.spec.Spec], error_msg: str
):
"""Ensure that some build dependencies are present in the concrete spec.
@@ -72,7 +71,7 @@ def ensure_build_dependencies_or_raise(
Args:
spec: concrete spec to be checked.
dependencies: list of package names of required build dependencies
dependencies: list of abstract specs to be satisfied
error_msg: brief error message to be prepended to a longer description
Raises:
@@ -128,8 +127,8 @@ def execute_install_time_tests(builder: spack.builder.Builder):
builder.pkg.tester.phase_tests(builder, "install", builder.install_time_test_callbacks)
class BuilderWithDefaults(spack.builder.Builder):
"""Base class for all specific builders with common callbacks registered."""
class BaseBuilder(spack.builder.Builder):
"""Base class for builders to register common checks"""
# Check that self.prefix is there after installation
spack.phase_callbacks.run_after("install")(sanity_check_prefix)
spack.builder.run_after("install")(sanity_check_prefix)

View File

@@ -6,7 +6,7 @@
import os.path
import stat
import subprocess
from typing import Callable, List, Optional, Set, Tuple, Union
from typing import List
import llnl.util.filesystem as fs
import llnl.util.tty as tty
@@ -15,9 +15,6 @@
import spack.builder
import spack.error
import spack.package_base
import spack.phase_callbacks
import spack.spec
import spack.util.prefix
from spack.directives import build_system, conflicts, depends_on
from spack.multimethod import when
from spack.operating_systems.mac_os import macos_version
@@ -25,7 +22,7 @@
from spack.version import Version
from ._checks import (
BuilderWithDefaults,
BaseBuilder,
apply_macos_rpath_fixups,
ensure_build_dependencies_or_raise,
execute_build_time_tests,
@@ -72,14 +69,14 @@ def flags_to_build_system_args(self, flags):
# Legacy methods (used by too many packages to change them,
# need to forward to the builder)
def enable_or_disable(self, *args, **kwargs):
return spack.builder.create(self).enable_or_disable(*args, **kwargs)
return self.builder.enable_or_disable(*args, **kwargs)
def with_or_without(self, *args, **kwargs):
return spack.builder.create(self).with_or_without(*args, **kwargs)
return self.builder.with_or_without(*args, **kwargs)
@spack.builder.builder("autotools")
class AutotoolsBuilder(BuilderWithDefaults):
class AutotoolsBuilder(BaseBuilder):
"""The autotools builder encodes the default way of installing software built
with autotools. It has four phases that can be overridden, if need be:
@@ -160,7 +157,7 @@ class AutotoolsBuilder(BuilderWithDefaults):
install_libtool_archives = False
@property
def patch_config_files(self) -> bool:
def patch_config_files(self):
"""Whether to update old ``config.guess`` and ``config.sub`` files
distributed with the tarball.
@@ -180,7 +177,7 @@ def patch_config_files(self) -> bool:
)
@property
def _removed_la_files_log(self) -> str:
def _removed_la_files_log(self):
"""File containing the list of removed libtool archives"""
build_dir = self.build_directory
if not os.path.isabs(self.build_directory):
@@ -188,15 +185,15 @@ def _removed_la_files_log(self) -> str:
return os.path.join(build_dir, "removed_la_files.txt")
@property
def archive_files(self) -> List[str]:
def archive_files(self):
"""Files to archive for packages based on autotools"""
files = [os.path.join(self.build_directory, "config.log")]
if not self.install_libtool_archives:
files.append(self._removed_la_files_log)
return files
@spack.phase_callbacks.run_after("autoreconf")
def _do_patch_config_files(self) -> None:
@spack.builder.run_after("autoreconf")
def _do_patch_config_files(self):
"""Some packages ship with older config.guess/config.sub files and need to
have these updated when installed on a newer architecture.
@@ -297,7 +294,7 @@ def runs_ok(script_abs_path):
and set the prefix to the directory containing the `config.guess` and
`config.sub` files.
"""
raise spack.error.InstallError(msg.format(", ".join(to_be_found), self.pkg.name))
raise spack.error.InstallError(msg.format(", ".join(to_be_found), self.name))
# Copy the good files over the bad ones
for abs_path in to_be_patched:
@@ -307,8 +304,8 @@ def runs_ok(script_abs_path):
fs.copy(substitutes[name], abs_path)
os.chmod(abs_path, mode)
@spack.phase_callbacks.run_before("configure")
def _patch_usr_bin_file(self) -> None:
@spack.builder.run_before("configure")
def _patch_usr_bin_file(self):
"""On NixOS file is not available in /usr/bin/file. Patch configure
scripts to use file from path."""
@@ -319,8 +316,8 @@ def _patch_usr_bin_file(self) -> None:
with fs.keep_modification_time(*x.filenames):
x.filter(regex="/usr/bin/file", repl="file", string=True)
@spack.phase_callbacks.run_before("configure")
def _set_autotools_environment_variables(self) -> None:
@spack.builder.run_before("configure")
def _set_autotools_environment_variables(self):
"""Many autotools builds use a version of mknod.m4 that fails when
running as root unless FORCE_UNSAFE_CONFIGURE is set to 1.
@@ -333,8 +330,8 @@ def _set_autotools_environment_variables(self) -> None:
"""
os.environ["FORCE_UNSAFE_CONFIGURE"] = "1"
@spack.phase_callbacks.run_before("configure")
def _do_patch_libtool_configure(self) -> None:
@spack.builder.run_before("configure")
def _do_patch_libtool_configure(self):
"""Patch bugs that propagate from libtool macros into "configure" and
further into "libtool". Note that patches that can be fixed by patching
"libtool" directly should be implemented in the _do_patch_libtool method
@@ -361,8 +358,8 @@ def _do_patch_libtool_configure(self) -> None:
# Support Libtool 2.4.2 and older:
x.filter(regex=r'^(\s*test \$p = "-R")(; then\s*)$', repl=r'\1 || test x-l = x"$p"\2')
@spack.phase_callbacks.run_after("configure")
def _do_patch_libtool(self) -> None:
@spack.builder.run_after("configure")
def _do_patch_libtool(self):
"""If configure generates a "libtool" script that does not correctly
detect the compiler (and patch_libtool is set), patch in the correct
values for libtool variables.
@@ -510,64 +507,27 @@ def _do_patch_libtool(self) -> None:
)
@property
def configure_directory(self) -> str:
def configure_directory(self):
"""Return the directory where 'configure' resides."""
return self.pkg.stage.source_path
@property
def configure_abs_path(self) -> str:
def configure_abs_path(self):
# Absolute path to configure
configure_abs_path = os.path.join(os.path.abspath(self.configure_directory), "configure")
return configure_abs_path
@property
def build_directory(self) -> str:
def build_directory(self):
"""Override to provide another place to build the package"""
return self.configure_directory
@spack.phase_callbacks.run_before("autoreconf")
def delete_configure_to_force_update(self) -> None:
@spack.builder.run_before("autoreconf")
def delete_configure_to_force_update(self):
if self.force_autoreconf:
fs.force_remove(self.configure_abs_path)
@property
def autoreconf_search_path_args(self) -> List[str]:
"""Search path includes for autoreconf. Add an -I flag for all `aclocal` dirs
of build deps, skips the default path of automake, move external include
flags to the back, since they might pull in unrelated m4 files shadowing
spack dependencies."""
return _autoreconf_search_path_args(self.spec)
@spack.phase_callbacks.run_after("autoreconf")
def set_configure_or_die(self) -> None:
"""Ensure the presence of a "configure" script, or raise. If the "configure"
is found, a module level attribute is set.
Raises:
RuntimeError: if the "configure" script is not found
"""
# Check if the "configure" script is there. If not raise a RuntimeError.
if not os.path.exists(self.configure_abs_path):
msg = "configure script not found in {0}"
raise RuntimeError(msg.format(self.configure_directory))
# Monkey-patch the configure script in the corresponding module
globals_for_pkg = spack.build_environment.ModuleChangePropagator(self.pkg)
globals_for_pkg.configure = Executable(self.configure_abs_path)
globals_for_pkg.propagate_changes_to_mro()
def configure_args(self) -> List[str]:
"""Return the list of all the arguments that must be passed to configure,
except ``--prefix`` which will be pre-pended to the list.
"""
return []
def autoreconf(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
) -> None:
def autoreconf(self, pkg, spec, prefix):
"""Not needed usually, configure should be already there"""
# If configure exists nothing needs to be done
@@ -594,12 +554,39 @@ def autoreconf(
autoreconf_args += self.autoreconf_extra_args
self.pkg.module.autoreconf(*autoreconf_args)
def configure(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
) -> None:
@property
def autoreconf_search_path_args(self):
"""Search path includes for autoreconf. Add an -I flag for all `aclocal` dirs
of build deps, skips the default path of automake, move external include
flags to the back, since they might pull in unrelated m4 files shadowing
spack dependencies."""
return _autoreconf_search_path_args(self.spec)
@spack.builder.run_after("autoreconf")
def set_configure_or_die(self):
"""Ensure the presence of a "configure" script, or raise. If the "configure"
is found, a module level attribute is set.
Raises:
RuntimeError: if the "configure" script is not found
"""
# Check if the "configure" script is there. If not raise a RuntimeError.
if not os.path.exists(self.configure_abs_path):
msg = "configure script not found in {0}"
raise RuntimeError(msg.format(self.configure_directory))
# Monkey-patch the configure script in the corresponding module
globals_for_pkg = spack.build_environment.ModuleChangePropagator(self.pkg)
globals_for_pkg.configure = Executable(self.configure_abs_path)
globals_for_pkg.propagate_changes_to_mro()
def configure_args(self):
"""Return the list of all the arguments that must be passed to configure,
except ``--prefix`` which will be pre-pended to the list.
"""
return []
def configure(self, pkg, spec, prefix):
"""Run "configure", with the arguments specified by the builder and an
appropriately set prefix.
"""
@@ -610,12 +597,7 @@ def configure(
with fs.working_dir(self.build_directory, create=True):
pkg.module.configure(*options)
def build(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
) -> None:
def build(self, pkg, spec, prefix):
"""Run "make" on the build targets specified by the builder."""
# See https://autotools.io/automake/silent.html
params = ["V=1"]
@@ -623,49 +605,41 @@ def build(
with fs.working_dir(self.build_directory):
pkg.module.make(*params)
def install(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
) -> None:
def install(self, pkg, spec, prefix):
"""Run "make" on the install targets specified by the builder."""
with fs.working_dir(self.build_directory):
pkg.module.make(*self.install_targets)
spack.phase_callbacks.run_after("build")(execute_build_time_tests)
spack.builder.run_after("build")(execute_build_time_tests)
def check(self) -> None:
def check(self):
"""Run "make" on the ``test`` and ``check`` targets, if found."""
with fs.working_dir(self.build_directory):
self.pkg._if_make_target_execute("test")
self.pkg._if_make_target_execute("check")
def _activate_or_not(
self,
name: str,
activation_word: str,
deactivation_word: str,
activation_value: Optional[Union[Callable, str]] = None,
variant=None,
) -> List[str]:
self, name, activation_word, deactivation_word, activation_value=None, variant=None
):
"""This function contain the current implementation details of
:meth:`~spack.build_systems.autotools.AutotoolsBuilder.with_or_without` and
:meth:`~spack.build_systems.autotools.AutotoolsBuilder.enable_or_disable`.
Args:
name: name of the option that is being activated or not
activation_word: the default activation word ('with' in the case of
``with_or_without``)
deactivation_word: the default deactivation word ('without' in the case of
``with_or_without``)
activation_value: callable that accepts a single value. This value is either one of the
allowed values for a multi-valued variant or the name of a bool-valued variant.
name (str): name of the option that is being activated or not
activation_word (str): the default activation word ('with' in the
case of ``with_or_without``)
deactivation_word (str): the default deactivation word ('without'
in the case of ``with_or_without``)
activation_value (typing.Callable): callable that accepts a single
value. This value is either one of the allowed values for a
multi-valued variant or the name of a bool-valued variant.
Returns the parameter to be used when the value is activated.
The special value "prefix" can also be assigned and will return
The special value 'prefix' can also be assigned and will return
``spec[name].prefix`` as activation parameter.
variant: name of the variant that is being processed (if different from option name)
variant (str): name of the variant that is being processed
(if different from option name)
Examples:
@@ -673,19 +647,19 @@ def _activate_or_not(
.. code-block:: python
variant("foo", values=("x", "y"), description="")
variant("bar", default=True, description="")
variant("ba_z", default=True, description="")
variant('foo', values=('x', 'y'), description='')
variant('bar', default=True, description='')
variant('ba_z', default=True, description='')
calling this function like:
.. code-block:: python
_activate_or_not(
"foo", "with", "without", activation_value="prefix"
'foo', 'with', 'without', activation_value='prefix'
)
_activate_or_not("bar", "with", "without")
_activate_or_not("ba-z", "with", "without", variant="ba_z")
_activate_or_not('bar', 'with', 'without')
_activate_or_not('ba-z', 'with', 'without', variant='ba_z')
will generate the following configuration options:
@@ -705,8 +679,8 @@ def _activate_or_not(
Raises:
KeyError: if name is not among known variants
"""
spec: spack.spec.Spec = self.pkg.spec
args: List[str] = []
spec = self.pkg.spec
args = []
if activation_value == "prefix":
activation_value = lambda x: spec[x].prefix
@@ -724,7 +698,7 @@ def _activate_or_not(
# Create a list of pairs. Each pair includes a configuration
# option and whether or not that option is activated
vdef = self.pkg.get_variant(variant)
if set(vdef.values) == set((True, False)): # type: ignore
if set(vdef.values) == set((True, False)):
# BoolValuedVariant carry information about a single option.
# Nonetheless, for uniformity of treatment we'll package them
# in an iterable of one element.
@@ -735,12 +709,14 @@ def _activate_or_not(
# package's build system. It excludes values which have special
# meanings and do not correspond to features (e.g. "none")
feature_values = getattr(vdef.values, "feature_values", None) or vdef.values
options = [(v, f"{variant}={v}" in spec) for v in feature_values] # type: ignore
options = [(value, f"{variant}={value}" in spec) for value in feature_values]
# For each allowed value in the list of values
for option_value, activated in options:
# Search for an override in the package for this value
override_name = f"{activation_word}_or_{deactivation_word}_{option_value}"
override_name = "{0}_or_{1}_{2}".format(
activation_word, deactivation_word, option_value
)
line_generator = getattr(self, override_name, None) or getattr(
self.pkg, override_name, None
)
@@ -749,24 +725,19 @@ def _activate_or_not(
def _default_generator(is_activated):
if is_activated:
line = f"--{activation_word}-{option_value}"
line = "--{0}-{1}".format(activation_word, option_value)
if activation_value is not None and activation_value(
option_value
): # NOQA=ignore=E501
line = f"{line}={activation_value(option_value)}"
line += "={0}".format(activation_value(option_value))
return line
return f"--{deactivation_word}-{option_value}"
return "--{0}-{1}".format(deactivation_word, option_value)
line_generator = _default_generator
args.append(line_generator(activated))
return args
def with_or_without(
self,
name: str,
activation_value: Optional[Union[Callable, str]] = None,
variant: Optional[str] = None,
) -> List[str]:
def with_or_without(self, name, activation_value=None, variant=None):
"""Inspects a variant and returns the arguments that activate
or deactivate the selected feature(s) for the configure options.
@@ -781,11 +752,12 @@ def with_or_without(
``variant=value`` is in the spec.
Args:
name: name of a valid multi-valued variant
activation_value: callable that accepts a single value and returns the parameter to be
used leading to an entry of the type ``--with-{name}={parameter}``.
name (str): name of a valid multi-valued variant
activation_value (typing.Callable): callable that accepts a single
value and returns the parameter to be used leading to an entry
of the type ``--with-{name}={parameter}``.
The special value "prefix" can also be assigned and will return
The special value 'prefix' can also be assigned and will return
``spec[name].prefix`` as activation parameter.
Returns:
@@ -793,22 +765,18 @@ def with_or_without(
"""
return self._activate_or_not(name, "with", "without", activation_value, variant)
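A hedged usage sketch (hypothetical multi-valued variant): recipes typically splice the result straight into configure_args, and with activation_value="prefix" each activated value expands to --with-<value>=<dependency prefix>:

# assumes the recipe declares: variant("databases", values=("mysql", "postgresql"), multi=True)
def configure_args(self):
    args = self.with_or_without("databases", activation_value="prefix")
    # e.g. ["--with-mysql=/path/to/mysql/prefix", "--without-postgresql"]
    return args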
def enable_or_disable(
self,
name: str,
activation_value: Optional[Union[Callable, str]] = None,
variant: Optional[str] = None,
) -> List[str]:
def enable_or_disable(self, name, activation_value=None, variant=None):
"""Same as
:meth:`~spack.build_systems.autotools.AutotoolsBuilder.with_or_without`
but substitute ``with`` with ``enable`` and ``without`` with ``disable``.
Args:
name: name of a valid multi-valued variant
activation_value: if present accepts a single value and returns the parameter to be
used leading to an entry of the type ``--enable-{name}={parameter}``
name (str): name of a valid multi-valued variant
activation_value (typing.Callable): if present accepts a single value
and returns the parameter to be used leading to an entry of the
type ``--enable-{name}={parameter}``
The special value "prefix" can also be assigned and will return
The special value 'prefix' can also be assigned and will return
``spec[name].prefix`` as activation parameter.
Returns:
@@ -816,15 +784,15 @@ def enable_or_disable(
"""
return self._activate_or_not(name, "enable", "disable", activation_value, variant)
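And the enable/disable flavor, sketched for a hypothetical boolean variant:

# assumes the recipe declares: variant("openmp", default=True, description="Enable OpenMP")
def configure_args(self):
    return self.enable_or_disable("openmp")
    # +openmp -> ["--enable-openmp"], ~openmp -> ["--disable-openmp"]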
spack.phase_callbacks.run_after("install")(execute_install_time_tests)
spack.builder.run_after("install")(execute_install_time_tests)
def installcheck(self) -> None:
def installcheck(self):
"""Run "make" on the ``installcheck`` target, if found."""
with fs.working_dir(self.build_directory):
self.pkg._if_make_target_execute("installcheck")
@spack.phase_callbacks.run_after("install")
def remove_libtool_archives(self) -> None:
@spack.builder.run_after("install")
def remove_libtool_archives(self):
"""Remove all .la files in prefix sub-folders if the package sets
``install_libtool_archives`` to be False.
"""
@@ -846,13 +814,12 @@ def setup_build_environment(self, env):
env.set("MACOSX_DEPLOYMENT_TARGET", "10.16")
# On macOS, force rpaths for shared library IDs and remove duplicate rpaths
spack.phase_callbacks.run_after("install", when="platform=darwin")(apply_macos_rpath_fixups)
spack.builder.run_after("install", when="platform=darwin")(apply_macos_rpath_fixups)
def _autoreconf_search_path_args(spec: spack.spec.Spec) -> List[str]:
dirs_seen: Set[Tuple[int, int]] = set()
flags_spack: List[str] = []
flags_external: List[str] = []
def _autoreconf_search_path_args(spec):
dirs_seen = set()
flags_spack, flags_external = [], []
# We don't want to add an include flag for automake's default search path.
for automake in spec.dependencies(name="automake", deptype="build"):


@@ -10,7 +10,7 @@
import llnl.util.filesystem as fs
import llnl.util.tty as tty
import spack.phase_callbacks
import spack.builder
from .cmake import CMakeBuilder, CMakePackage
@@ -192,10 +192,7 @@ def initconfig_mpi_entries(self):
entries.append(cmake_cache_path("MPI_C_COMPILER", spec["mpi"].mpicc))
entries.append(cmake_cache_path("MPI_CXX_COMPILER", spec["mpi"].mpicxx))
# not all MPIs have Fortran wrappers
if hasattr(spec["mpi"], "mpifc"):
entries.append(cmake_cache_path("MPI_Fortran_COMPILER", spec["mpi"].mpifc))
entries.append(cmake_cache_path("MPI_Fortran_COMPILER", spec["mpi"].mpifc))
# Check for slurm
using_slurm = False
@@ -335,7 +332,7 @@ def std_cmake_args(self):
args.extend(["-C", self.cache_path])
return args
@spack.phase_callbacks.run_after("install")
@spack.builder.run_after("install")
def install_cmake_cache(self):
fs.mkdirp(self.pkg.spec.prefix.share.cmake)
fs.install(self.cache_path, self.pkg.spec.prefix.share.cmake)


@@ -7,11 +7,10 @@
import spack.builder
import spack.package_base
import spack.phase_callbacks
from spack.directives import build_system, depends_on
from spack.multimethod import when
from ._checks import BuilderWithDefaults, execute_install_time_tests
from ._checks import BaseBuilder, execute_install_time_tests
class CargoPackage(spack.package_base.PackageBase):
@@ -28,7 +27,7 @@ class CargoPackage(spack.package_base.PackageBase):
@spack.builder.builder("cargo")
class CargoBuilder(BuilderWithDefaults):
class CargoBuilder(BaseBuilder):
"""The Cargo builder encodes the most common way of building software with
a Rust Cargo.toml file. It has two phases that can be overridden, if need be:
@@ -78,7 +77,7 @@ def install(self, pkg, spec, prefix):
with fs.working_dir(self.build_directory):
fs.install_tree("out", prefix)
spack.phase_callbacks.run_after("install")(execute_install_time_tests)
spack.builder.run_after("install")(execute_install_time_tests)
def check(self):
"""Run "cargo test"."""


@@ -9,7 +9,7 @@
import re
import sys
from itertools import chain
from typing import Any, List, Optional, Tuple
from typing import List, Optional, Set, Tuple
import llnl.util.filesystem as fs
from llnl.util.lang import stable_partition
@@ -18,15 +18,11 @@
import spack.deptypes as dt
import spack.error
import spack.package_base
import spack.phase_callbacks
import spack.spec
import spack.util.prefix
from spack import traverse
from spack.directives import build_system, conflicts, depends_on, variant
from spack.multimethod import when
from spack.util.environment import filter_system_paths
from ._checks import BuilderWithDefaults, execute_build_time_tests
from ._checks import BaseBuilder, execute_build_time_tests
# Regex to extract the primary generator from the CMake generator
# string.
@@ -52,9 +48,9 @@ def _maybe_set_python_hints(pkg: spack.package_base.PackageBase, args: List[str]
python_executable = pkg.spec["python"].command.path
args.extend(
[
define("PYTHON_EXECUTABLE", python_executable),
define("Python_EXECUTABLE", python_executable),
define("Python3_EXECUTABLE", python_executable),
CMakeBuilder.define("PYTHON_EXECUTABLE", python_executable),
CMakeBuilder.define("Python_EXECUTABLE", python_executable),
CMakeBuilder.define("Python3_EXECUTABLE", python_executable),
]
)
@@ -89,7 +85,7 @@ def _conditional_cmake_defaults(pkg: spack.package_base.PackageBase, args: List[
ipo = False
if cmake.satisfies("@3.9:"):
args.append(define("CMAKE_INTERPROCEDURAL_OPTIMIZATION", ipo))
args.append(CMakeBuilder.define("CMAKE_INTERPROCEDURAL_OPTIMIZATION", ipo))
# Disable Package Registry: export(PACKAGE) may put files in the user's home directory, and
# find_package may search there. This is not what we want.
@@ -97,36 +93,30 @@ def _conditional_cmake_defaults(pkg: spack.package_base.PackageBase, args: List[
# Do not populate CMake User Package Registry
if cmake.satisfies("@3.15:"):
# see https://cmake.org/cmake/help/latest/policy/CMP0090.html
args.append(define("CMAKE_POLICY_DEFAULT_CMP0090", "NEW"))
args.append(CMakeBuilder.define("CMAKE_POLICY_DEFAULT_CMP0090", "NEW"))
elif cmake.satisfies("@3.1:"):
# see https://cmake.org/cmake/help/latest/variable/CMAKE_EXPORT_NO_PACKAGE_REGISTRY.html
args.append(define("CMAKE_EXPORT_NO_PACKAGE_REGISTRY", True))
args.append(CMakeBuilder.define("CMAKE_EXPORT_NO_PACKAGE_REGISTRY", True))
# Do not use CMake User/System Package Registry
# https://cmake.org/cmake/help/latest/manual/cmake-packages.7.html#disabling-the-package-registry
if cmake.satisfies("@3.16:"):
args.append(define("CMAKE_FIND_USE_PACKAGE_REGISTRY", False))
args.append(CMakeBuilder.define("CMAKE_FIND_USE_PACKAGE_REGISTRY", False))
elif cmake.satisfies("@3.1:3.15"):
args.append(define("CMAKE_FIND_PACKAGE_NO_PACKAGE_REGISTRY", False))
args.append(define("CMAKE_FIND_PACKAGE_NO_SYSTEM_PACKAGE_REGISTRY", False))
args.append(CMakeBuilder.define("CMAKE_FIND_PACKAGE_NO_PACKAGE_REGISTRY", False))
args.append(CMakeBuilder.define("CMAKE_FIND_PACKAGE_NO_SYSTEM_PACKAGE_REGISTRY", False))
# Export a compilation database if supported.
if _supports_compilation_databases(pkg):
args.append(define("CMAKE_EXPORT_COMPILE_COMMANDS", True))
args.append(CMakeBuilder.define("CMAKE_EXPORT_COMPILE_COMMANDS", True))
# Enable MACOSX_RPATH by default when cmake_minimum_required < 3
# https://cmake.org/cmake/help/latest/policy/CMP0042.html
if pkg.spec.satisfies("platform=darwin") and cmake.satisfies("@3:"):
args.append(define("CMAKE_POLICY_DEFAULT_CMP0042", "NEW"))
# Disable find package's config mode for versions of Boost that
# didn't provide it. See https://github.com/spack/spack/issues/20169
# and https://cmake.org/cmake/help/latest/module/FindBoost.html
if pkg.spec.satisfies("^boost@:1.69.0"):
args.append(define("Boost_NO_BOOST_CMAKE", True))
args.append(CMakeBuilder.define("CMAKE_POLICY_DEFAULT_CMP0042", "NEW"))
def generator(*names: str, default: Optional[str] = None) -> None:
def generator(*names: str, default: Optional[str] = None):
"""The build system generator to use.
See ``cmake --help`` for a list of valid generators.
@@ -167,18 +157,15 @@ def _values(x):
def get_cmake_prefix_path(pkg: spack.package_base.PackageBase) -> List[str]:
"""Obtain the CMAKE_PREFIX_PATH entries for a package, based on the cmake_prefix_path package
attribute of direct build/test and transitive link dependencies."""
edges = traverse.traverse_topo_edges_generator(
traverse.with_artificial_edges([pkg.spec]),
visitor=traverse.MixedDepthVisitor(
direct=dt.BUILD | dt.TEST, transitive=dt.LINK, key=traverse.by_dag_hash
),
key=traverse.by_dag_hash,
root=False,
all_edges=False, # cover all nodes, not all edges
)
ordered_specs = [edge.spec for edge in edges]
# Add direct build/test deps
selected: Set[str] = {s.dag_hash() for s in pkg.spec.dependencies(deptype=dt.BUILD | dt.TEST)}
# Add transitive link deps
selected.update(s.dag_hash() for s in pkg.spec.traverse(root=False, deptype=dt.LINK))
# Separate out externals so they do not shadow Spack prefixes
externals, spack_built = stable_partition((s for s in ordered_specs), lambda x: x.external)
externals, spack_built = stable_partition(
(s for s in pkg.spec.traverse(root=False, order="topo") if s.dag_hash() in selected),
lambda x: x.external,
)
return filter_system_paths(
path for spec in chain(spack_built, externals) for path in spec.package.cmake_prefix_paths
@@ -276,15 +263,15 @@ def flags_to_build_system_args(self, flags):
# Legacy methods (used by too many packages to change them,
# need to forward to the builder)
def define(self, cmake_var: str, value: Any) -> str:
return define(cmake_var, value)
def define(self, *args, **kwargs):
return self.builder.define(*args, **kwargs)
def define_from_variant(self, cmake_var: str, variant: Optional[str] = None) -> str:
return define_from_variant(self, cmake_var, variant)
def define_from_variant(self, *args, **kwargs):
return self.builder.define_from_variant(*args, **kwargs)
@spack.builder.builder("cmake")
class CMakeBuilder(BuilderWithDefaults):
class CMakeBuilder(BaseBuilder):
"""The cmake builder encodes the default way of building software with CMake. It
has three phases that can be overridden:
@@ -334,15 +321,15 @@ class CMakeBuilder(BuilderWithDefaults):
build_time_test_callbacks = ["check"]
@property
def archive_files(self) -> List[str]:
def archive_files(self):
"""Files to archive for packages based on CMake"""
files = [os.path.join(self.build_directory, "CMakeCache.txt")]
if _supports_compilation_databases(self.pkg):
if _supports_compilation_databases(self):
files.append(os.path.join(self.build_directory, "compile_commands.json"))
return files
@property
def root_cmakelists_dir(self) -> str:
def root_cmakelists_dir(self):
"""The relative path to the directory containing CMakeLists.txt
This path is relative to the root of the extracted tarball,
@@ -351,17 +338,16 @@ def root_cmakelists_dir(self) -> str:
return self.pkg.stage.source_path
@property
def generator(self) -> str:
def generator(self):
if self.spec.satisfies("generator=make"):
return "Unix Makefiles"
if self.spec.satisfies("generator=ninja"):
return "Ninja"
raise ValueError(
f'{self.spec.format()} has an unsupported value for the "generator" variant'
)
msg = f'{self.spec.format()} has an unsupported value for the "generator" variant'
raise ValueError(msg)
@property
def std_cmake_args(self) -> List[str]:
def std_cmake_args(self):
"""Standard cmake arguments provided as a property for
convenience of package writers
"""
@@ -370,9 +356,7 @@ def std_cmake_args(self) -> List[str]:
return args
@staticmethod
def std_args(
pkg: spack.package_base.PackageBase, generator: Optional[str] = None
) -> List[str]:
def std_args(pkg, generator=None):
"""Computes the standard cmake arguments for a generic package"""
default_generator = "Ninja" if sys.platform == "win32" else "Unix Makefiles"
generator = generator or default_generator
@@ -389,6 +373,7 @@ def std_args(
except KeyError:
build_type = "RelWithDebInfo"
define = CMakeBuilder.define
args = [
"-G",
generator,
@@ -420,31 +405,152 @@ def std_args(
return args
@staticmethod
def define_cuda_architectures(pkg: spack.package_base.PackageBase) -> str:
return define_cuda_architectures(pkg)
def define_cuda_architectures(pkg):
"""Returns the str ``-DCMAKE_CUDA_ARCHITECTURES:STRING=(expanded cuda_arch)``.
``cuda_arch`` is a variant composed of a list of target CUDA architectures and
it is declared in the cuda package.
This method is a no-op for cmake<3.18 and when the ``cuda_arch`` variant is not set.
"""
cmake_flag = str()
if "cuda_arch" in pkg.spec.variants and pkg.spec.satisfies("^cmake@3.18:"):
cmake_flag = CMakeBuilder.define(
"CMAKE_CUDA_ARCHITECTURES", pkg.spec.variants["cuda_arch"].value
)
return cmake_flag
@staticmethod
def define_hip_architectures(pkg: spack.package_base.PackageBase) -> str:
return define_hip_architectures(pkg)
def define_hip_architectures(pkg):
"""Returns the str ``-DCMAKE_HIP_ARCHITECTURES:STRING=(expanded amdgpu_target)``.
``amdgpu_target`` is a variant composed of a list of the target HIP
architectures and it is declared in the rocm package.
This method is a no-op for cmake<3.21 and when the ``amdgpu_target`` variant is
not set.
"""
cmake_flag = str()
if "amdgpu_target" in pkg.spec.variants and pkg.spec.satisfies("^cmake@3.21:"):
cmake_flag = CMakeBuilder.define(
"CMAKE_HIP_ARCHITECTURES", pkg.spec.variants["amdgpu_target"].value
)
return cmake_flag
@staticmethod
def define(cmake_var: str, value: Any) -> str:
return define(cmake_var, value)
def define(cmake_var, value):
"""Return a CMake command line argument that defines a variable.
def define_from_variant(self, cmake_var: str, variant: Optional[str] = None) -> str:
return define_from_variant(self.pkg, cmake_var, variant)
The resulting argument will convert boolean values to OFF/ON
and lists/tuples to CMake semicolon-separated string lists. All other
values will be interpreted as strings.
Examples:
.. code-block:: python
[define('BUILD_SHARED_LIBS', True),
define('CMAKE_CXX_STANDARD', 14),
define('swr', ['avx', 'avx2'])]
will generate the following configuration options:
.. code-block:: console
["-DBUILD_SHARED_LIBS:BOOL=ON",
"-DCMAKE_CXX_STANDARD:STRING=14",
"-DSWR:STRING=avx;avx2"]
"""
# Create a list of pairs. Each pair includes a configuration
# option and whether or not that option is activated
if isinstance(value, bool):
kind = "BOOL"
value = "ON" if value else "OFF"
else:
kind = "STRING"
if isinstance(value, collections.abc.Sequence) and not isinstance(value, str):
value = ";".join(str(v) for v in value)
else:
value = str(value)
return "".join(["-D", cmake_var, ":", kind, "=", value])
def define_from_variant(self, cmake_var, variant=None):
"""Return a CMake command line argument from the given variant's value.
The optional ``variant`` argument defaults to the lower-case transform
of ``cmake_var``.
This utility function is similar to
:meth:`~spack.build_systems.autotools.AutotoolsBuilder.with_or_without`.
Examples:
Given a package with:
.. code-block:: python
variant('cxxstd', default='11', values=('11', '14'),
multi=False, description='')
variant('shared', default=True, description='')
variant('swr', values=any_combination_of('avx', 'avx2'),
description='')
calling this function like:
.. code-block:: python
[self.define_from_variant('BUILD_SHARED_LIBS', 'shared'),
self.define_from_variant('CMAKE_CXX_STANDARD', 'cxxstd'),
self.define_from_variant('SWR')]
will generate the following configuration options:
.. code-block:: console
["-DBUILD_SHARED_LIBS:BOOL=ON",
"-DCMAKE_CXX_STANDARD:STRING=14",
"-DSWR:STRING=avx;avx2"]
for ``<spec-name> cxxstd=14 +shared swr=avx,avx2``
Note: if the provided variant is conditional, and the condition is not met,
this function returns an empty string. CMake discards empty strings
provided on the command line.
"""
if variant is None:
variant = cmake_var.lower()
if not self.pkg.has_variant(variant):
raise KeyError('"{0}" is not a variant of "{1}"'.format(variant, self.pkg.name))
if variant not in self.pkg.spec.variants:
return ""
value = self.pkg.spec.variants[variant].value
if isinstance(value, (tuple, list)):
# Sort multi-valued variants for reproducibility
value = sorted(value)
return self.define(cmake_var, value)
@property
def build_dirname(self) -> str:
def build_dirname(self):
"""Directory name to use when building the package."""
return f"spack-build-{self.pkg.spec.dag_hash(7)}"
return "spack-build-%s" % self.pkg.spec.dag_hash(7)
@property
def build_directory(self) -> str:
def build_directory(self):
"""Full-path to the directory to use when building the package."""
return os.path.join(self.pkg.stage.path, self.build_dirname)
def cmake_args(self) -> List[str]:
def cmake_args(self):
"""List of all the arguments that must be passed to cmake, except:
* CMAKE_INSTALL_PREFIX
@@ -454,12 +560,7 @@ def cmake_args(self) -> List[str]:
"""
return []
def cmake(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
) -> None:
def cmake(self, pkg, spec, prefix):
"""Runs ``cmake`` in the build directory"""
# skip cmake phase if it is an incremental develop build
@@ -474,12 +575,7 @@ def cmake(
with fs.working_dir(self.build_directory, create=True):
pkg.module.cmake(*options)
def build(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
) -> None:
def build(self, pkg, spec, prefix):
"""Make the build targets"""
with fs.working_dir(self.build_directory):
if self.generator == "Unix Makefiles":
@@ -488,12 +584,7 @@ def build(
self.build_targets.append("-v")
pkg.module.ninja(*self.build_targets)
def install(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
) -> None:
def install(self, pkg, spec, prefix):
"""Make the install targets"""
with fs.working_dir(self.build_directory):
if self.generator == "Unix Makefiles":
@@ -501,9 +592,9 @@ def install(
elif self.generator == "Ninja":
pkg.module.ninja(*self.install_targets)
spack.phase_callbacks.run_after("build")(execute_build_time_tests)
spack.builder.run_after("build")(execute_build_time_tests)
def check(self) -> None:
def check(self):
"""Search the CMake-generated files for the targets ``test`` and ``check``,
and runs them if found.
"""
@@ -514,133 +605,3 @@ def check(self) -> None:
elif self.generator == "Ninja":
self.pkg._if_ninja_target_execute("test", jobs_env="CTEST_PARALLEL_LEVEL")
self.pkg._if_ninja_target_execute("check")
def define(cmake_var: str, value: Any) -> str:
"""Return a CMake command line argument that defines a variable.
The resulting argument will convert boolean values to OFF/ON and lists/tuples to CMake
semicolon-separated string lists. All other values will be interpreted as strings.
Examples:
.. code-block:: python
[define("BUILD_SHARED_LIBS", True),
define("CMAKE_CXX_STANDARD", 14),
define("swr", ["avx", "avx2"])]
will generate the following configuration options:
.. code-block:: console
["-DBUILD_SHARED_LIBS:BOOL=ON",
"-DCMAKE_CXX_STANDARD:STRING=14",
"-DSWR:STRING=avx;avx2"]
"""
# Create a list of pairs. Each pair includes a configuration
# option and whether or not that option is activated
if isinstance(value, bool):
kind = "BOOL"
value = "ON" if value else "OFF"
else:
kind = "STRING"
if isinstance(value, collections.abc.Sequence) and not isinstance(value, str):
value = ";".join(str(v) for v in value)
else:
value = str(value)
return "".join(["-D", cmake_var, ":", kind, "=", value])
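A standalone illustration of the conversion rules above (values chosen arbitrarily): booleans become OFF/ON, sequences become semicolon-separated CMake lists, and everything else is stringified:

assert define("BUILD_TESTING", False) == "-DBUILD_TESTING:BOOL=OFF"
assert define("CMAKE_CXX_STANDARD", 17) == "-DCMAKE_CXX_STANDARD:STRING=17"
assert define("CMAKE_PREFIX_PATH", ["/opt/a", "/opt/b"]) == "-DCMAKE_PREFIX_PATH:STRING=/opt/a;/opt/b"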
def define_from_variant(
pkg: spack.package_base.PackageBase, cmake_var: str, variant: Optional[str] = None
) -> str:
"""Return a CMake command line argument from the given variant's value.
The optional ``variant`` argument defaults to the lower-case transform
of ``cmake_var``.
Examples:
Given a package with:
.. code-block:: python
variant("cxxstd", default="11", values=("11", "14"),
multi=False, description="")
variant("shared", default=True, description="")
variant("swr", values=any_combination_of("avx", "avx2"),
description="")
calling this function like:
.. code-block:: python
[
self.define_from_variant("BUILD_SHARED_LIBS", "shared"),
self.define_from_variant("CMAKE_CXX_STANDARD", "cxxstd"),
self.define_from_variant("SWR"),
]
will generate the following configuration options:
.. code-block:: console
[
"-DBUILD_SHARED_LIBS:BOOL=ON",
"-DCMAKE_CXX_STANDARD:STRING=14",
"-DSWR:STRING=avx;avx2",
]
for ``<spec-name> cxxstd=14 +shared swr=avx,avx2``
Note: if the provided variant is conditional, and the condition is not met, this function
returns an empty string. CMake discards empty strings provided on the command line.
"""
if variant is None:
variant = cmake_var.lower()
if not pkg.has_variant(variant):
raise KeyError('"{0}" is not a variant of "{1}"'.format(variant, pkg.name))
if variant not in pkg.spec.variants:
return ""
value = pkg.spec.variants[variant].value
if isinstance(value, (tuple, list)):
# Sort multi-valued variants for reproducibility
value = sorted(value)
return define(cmake_var, value)
def define_hip_architectures(pkg: spack.package_base.PackageBase) -> str:
"""Returns the str ``-DCMAKE_HIP_ARCHITECTURES:STRING=(expanded amdgpu_target)``.
``amdgpu_target`` is a variant composed of a list of the target HIP
architectures and it is declared in the rocm package.
This method is a no-op for cmake<3.21 and when the ``amdgpu_target`` variant is
not set.
"""
if "amdgpu_target" in pkg.spec.variants and pkg.spec.satisfies("^cmake@3.21:"):
return define("CMAKE_HIP_ARCHITECTURES", pkg.spec.variants["amdgpu_target"].value)
return ""
def define_cuda_architectures(pkg: spack.package_base.PackageBase) -> str:
"""Returns the str ``-DCMAKE_CUDA_ARCHITECTURES:STRING=(expanded cuda_arch)``.
``cuda_arch`` is a variant composed of a list of target CUDA architectures and
it is declared in the cuda package.
This method is a no-op for cmake<3.18 and when the ``cuda_arch`` variant is not set.
"""
if "cuda_arch" in pkg.spec.variants and pkg.spec.satisfies("^cmake@3.18:"):
return define("CMAKE_CUDA_ARCHITECTURES", pkg.spec.variants["cuda_arch"].value)
return ""
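Putting these helpers together, a minimal sketch of a recipe's cmake_args; the package name and variants are hypothetical:

from spack.package import *

class Libbar(CMakePackage):
    variant("shared", default=True, description="Build shared libraries")
    variant("cxxstd", default="17", values=("14", "17"), multi=False, description="C++ standard")

    def cmake_args(self):
        return [
            self.define("CMAKE_POSITION_INDEPENDENT_CODE", True),
            self.define_from_variant("BUILD_SHARED_LIBS", "shared"),
            self.define_from_variant("CMAKE_CXX_STANDARD", "cxxstd"),
        ]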


@@ -180,6 +180,13 @@ def compute_capabilities(arch_list: Iterable[str]) -> List[str]:
conflicts("%gcc@7:", when="+cuda ^cuda@:9.1 target=x86_64:")
conflicts("%gcc@8:", when="+cuda ^cuda@:10.0.130 target=x86_64:")
conflicts("%gcc@9:", when="+cuda ^cuda@:10.2.89 target=x86_64:")
conflicts("%pgi@:14.8", when="+cuda ^cuda@:7.0.27 target=x86_64:")
conflicts("%pgi@:15.3,15.5:", when="+cuda ^cuda@7.5 target=x86_64:")
conflicts("%pgi@:16.2,16.0:16.3", when="+cuda ^cuda@8 target=x86_64:")
conflicts("%pgi@:15,18:", when="+cuda ^cuda@9.0:9.1 target=x86_64:")
conflicts("%pgi@:16,19:", when="+cuda ^cuda@9.2.88:10.0 target=x86_64:")
conflicts("%pgi@:17,20:", when="+cuda ^cuda@10.1.105:10.2.89 target=x86_64:")
conflicts("%pgi@:17,21:", when="+cuda ^cuda@11.0.2:11.1.0 target=x86_64:")
conflicts("%clang@:3.4", when="+cuda ^cuda@:7.5 target=x86_64:")
conflicts("%clang@:3.7,4:", when="+cuda ^cuda@8.0:9.0 target=x86_64:")
conflicts("%clang@:3.7,4.1:", when="+cuda ^cuda@9.1 target=x86_64:")
@@ -205,6 +212,9 @@ def compute_capabilities(arch_list: Iterable[str]) -> List[str]:
conflicts("%gcc@8:", when="+cuda ^cuda@:10.0.130 target=ppc64le:")
conflicts("%gcc@9:", when="+cuda ^cuda@:10.1.243 target=ppc64le:")
# officially, CUDA 11.0.2 only supports the system GCC 8.3 on ppc64le
conflicts("%pgi", when="+cuda ^cuda@:8 target=ppc64le:")
conflicts("%pgi@:16", when="+cuda ^cuda@:9.1.185 target=ppc64le:")
conflicts("%pgi@:17", when="+cuda ^cuda@:10 target=ppc64le:")
conflicts("%clang@4:", when="+cuda ^cuda@:9.0.176 target=ppc64le:")
conflicts("%clang@5:", when="+cuda ^cuda@:9.1 target=ppc64le:")
conflicts("%clang@6:", when="+cuda ^cuda@:9.2 target=ppc64le:")


@@ -7,9 +7,8 @@
import spack.builder
import spack.directives
import spack.package_base
import spack.phase_callbacks
from ._checks import BuilderWithDefaults, apply_macos_rpath_fixups, execute_install_time_tests
from ._checks import BaseBuilder, apply_macos_rpath_fixups, execute_install_time_tests
class Package(spack.package_base.PackageBase):
@@ -27,7 +26,7 @@ class Package(spack.package_base.PackageBase):
@spack.builder.builder("generic")
class GenericBuilder(BuilderWithDefaults):
class GenericBuilder(BaseBuilder):
"""A builder for a generic build system that requires packagers
to implement an "install" phase.
"""
@@ -45,7 +44,7 @@ class GenericBuilder(BuilderWithDefaults):
install_time_test_callbacks = []
# On macOS, force rpaths for shared library IDs and remove duplicate rpaths
spack.phase_callbacks.run_after("install", when="platform=darwin")(apply_macos_rpath_fixups)
spack.builder.run_after("install", when="platform=darwin")(apply_macos_rpath_fixups)
# unconditionally perform any post-install phase tests
spack.phase_callbacks.run_after("install")(execute_install_time_tests)
spack.builder.run_after("install")(execute_install_time_tests)


@@ -7,11 +7,10 @@
import spack.builder
import spack.package_base
import spack.phase_callbacks
from spack.directives import build_system, extends
from spack.multimethod import when
from ._checks import BuilderWithDefaults, execute_install_time_tests
from ._checks import BaseBuilder, execute_install_time_tests
class GoPackage(spack.package_base.PackageBase):
@@ -33,7 +32,7 @@ class GoPackage(spack.package_base.PackageBase):
@spack.builder.builder("go")
class GoBuilder(BuilderWithDefaults):
class GoBuilder(BaseBuilder):
"""The Go builder encodes the most common way of building software with
a Go go.mod file. It has two phases that can be overridden, if need be:
@@ -100,7 +99,7 @@ def install(self, pkg, spec, prefix):
fs.mkdirp(prefix.bin)
fs.install(pkg.name, prefix.bin)
spack.phase_callbacks.run_after("install")(execute_install_time_tests)
spack.builder.run_after("install")(execute_install_time_tests)
def check(self):
"""Run ``go test .`` in the source directory"""


@@ -22,8 +22,8 @@
install,
)
import spack.builder
import spack.error
import spack.phase_callbacks
from spack.build_environment import dso_suffix
from spack.error import InstallError
from spack.util.environment import EnvironmentModifications
@@ -1163,7 +1163,7 @@ def _determine_license_type(self):
debug_print(license_type)
return license_type
@spack.phase_callbacks.run_before("install")
@spack.builder.run_before("install")
def configure(self):
"""Generates the silent.cfg file to pass to installer.sh.
@@ -1250,7 +1250,7 @@ def install(self, spec, prefix):
for f in glob.glob("%s/intel*log" % tmpdir):
install(f, dst)
@spack.phase_callbacks.run_after("install")
@spack.builder.run_after("install")
def validate_install(self):
# Sometimes the installer exits with an error but doesn't pass a
# non-zero exit code to spack. Check for the existence of a 'bin'
@@ -1258,7 +1258,7 @@ def validate_install(self):
if not os.path.exists(self.prefix.bin):
raise InstallError("The installer has failed to install anything.")
@spack.phase_callbacks.run_after("install")
@spack.builder.run_after("install")
def configure_rpath(self):
if "+rpath" not in self.spec:
return
@@ -1276,7 +1276,7 @@ def configure_rpath(self):
with open(compiler_cfg, "w") as fh:
fh.write("-Xlinker -rpath={0}\n".format(compilers_lib_dir))
@spack.phase_callbacks.run_after("install")
@spack.builder.run_after("install")
def configure_auto_dispatch(self):
if self._has_compilers:
if "auto_dispatch=none" in self.spec:
@@ -1300,7 +1300,7 @@ def configure_auto_dispatch(self):
with open(compiler_cfg, "a") as fh:
fh.write("-ax{0}\n".format(",".join(ad)))
@spack.phase_callbacks.run_after("install")
@spack.builder.run_after("install")
def filter_compiler_wrappers(self):
if ("+mpi" in self.spec or self.provides("mpi")) and "~newdtags" in self.spec:
bin_dir = self.component_bin_dir("mpi")
@@ -1308,7 +1308,7 @@ def filter_compiler_wrappers(self):
f = os.path.join(bin_dir, f)
filter_file("-Xlinker --enable-new-dtags", " ", f, string=True)
@spack.phase_callbacks.run_after("install")
@spack.builder.run_after("install")
def uninstall_ism(self):
# The "Intel(R) Software Improvement Program" [ahem] gets installed,
# apparently regardless of PHONEHOME_SEND_USAGE_DATA.
@@ -1340,7 +1340,7 @@ def base_lib_dir(self):
debug_print(d)
return d
@spack.phase_callbacks.run_after("install")
@spack.builder.run_after("install")
def modify_LLVMgold_rpath(self):
"""Add libimf.so and other required libraries to the RUNPATH of LLVMgold.so.


@@ -8,14 +8,11 @@
import spack.builder
import spack.package_base
import spack.phase_callbacks
import spack.spec
import spack.util.prefix
from spack.directives import build_system, conflicts, depends_on
from spack.multimethod import when
from ._checks import (
BuilderWithDefaults,
BaseBuilder,
apply_macos_rpath_fixups,
execute_build_time_tests,
execute_install_time_tests,
@@ -39,7 +36,7 @@ class MakefilePackage(spack.package_base.PackageBase):
@spack.builder.builder("makefile")
class MakefileBuilder(BuilderWithDefaults):
class MakefileBuilder(BaseBuilder):
"""The Makefile builder encodes the most common way of building software with
Makefiles. It has three phases that can be overridden, if need be:
@@ -94,50 +91,35 @@ class MakefileBuilder(BuilderWithDefaults):
install_time_test_callbacks = ["installcheck"]
@property
def build_directory(self) -> str:
def build_directory(self):
"""Return the directory containing the main Makefile."""
return self.pkg.stage.source_path
def edit(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
) -> None:
def edit(self, pkg, spec, prefix):
"""Edit the Makefile before calling make. The default is a no-op."""
pass
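A minimal sketch of the usual package-level override (hypothetical package and Makefile variable): patch the install prefix into the shipped Makefile before make runs:

from spack.package import *

class Libbaz(MakefilePackage):
    def edit(self, spec, prefix):
        # rewrite the install prefix hard-coded in the shipped Makefile
        filter_file(r"^PREFIX\s*=.*", f"PREFIX = {prefix}", "Makefile")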
def build(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
) -> None:
def build(self, pkg, spec, prefix):
"""Run "make" on the build targets specified by the builder."""
with fs.working_dir(self.build_directory):
pkg.module.make(*self.build_targets)
def install(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
) -> None:
def install(self, pkg, spec, prefix):
"""Run "make" on the install targets specified by the builder."""
with fs.working_dir(self.build_directory):
pkg.module.make(*self.install_targets)
spack.phase_callbacks.run_after("build")(execute_build_time_tests)
spack.builder.run_after("build")(execute_build_time_tests)
def check(self) -> None:
def check(self):
"""Run "make" on the ``test`` and ``check`` targets, if found."""
with fs.working_dir(self.build_directory):
self.pkg._if_make_target_execute("test")
self.pkg._if_make_target_execute("check")
spack.phase_callbacks.run_after("install")(execute_install_time_tests)
spack.builder.run_after("install")(execute_install_time_tests)
def installcheck(self) -> None:
def installcheck(self):
"""Searches the Makefile for an ``installcheck`` target
and runs it if found.
"""
@@ -145,4 +127,4 @@ def installcheck(self) -> None:
self.pkg._if_make_target_execute("installcheck")
# On macOS, force rpaths for shared library IDs and remove duplicate rpaths
spack.phase_callbacks.run_after("install", when="platform=darwin")(apply_macos_rpath_fixups)
spack.builder.run_after("install", when="platform=darwin")(apply_macos_rpath_fixups)


@@ -10,7 +10,7 @@
from spack.multimethod import when
from spack.util.executable import which
from ._checks import BuilderWithDefaults
from ._checks import BaseBuilder
class MavenPackage(spack.package_base.PackageBase):
@@ -34,7 +34,7 @@ class MavenPackage(spack.package_base.PackageBase):
@spack.builder.builder("maven")
class MavenBuilder(BuilderWithDefaults):
class MavenBuilder(BaseBuilder):
"""The Maven builder encodes the default way to build software with Maven.
It has two phases that can be overridden, if need be:


@@ -9,13 +9,10 @@
import spack.builder
import spack.package_base
import spack.phase_callbacks
import spack.spec
import spack.util.prefix
from spack.directives import build_system, conflicts, depends_on, variant
from spack.multimethod import when
from ._checks import BuilderWithDefaults, execute_build_time_tests
from ._checks import BaseBuilder, execute_build_time_tests
class MesonPackage(spack.package_base.PackageBase):
@@ -65,7 +62,7 @@ def flags_to_build_system_args(self, flags):
@spack.builder.builder("meson")
class MesonBuilder(BuilderWithDefaults):
class MesonBuilder(BaseBuilder):
"""The Meson builder encodes the default way to build software with Meson.
The builder has three phases that can be overridden, if need be:
@@ -115,7 +112,7 @@ def archive_files(self):
return [os.path.join(self.build_directory, "meson-logs", "meson-log.txt")]
@property
def root_mesonlists_dir(self) -> str:
def root_mesonlists_dir(self):
"""Relative path to the directory containing meson.build
This path is relative to the root of the extracted tarball,
@@ -124,7 +121,7 @@ def root_mesonlists_dir(self) -> str:
return self.pkg.stage.source_path
@property
def std_meson_args(self) -> List[str]:
def std_meson_args(self):
"""Standard meson arguments provided as a property for convenience
of package writers.
"""
@@ -135,7 +132,7 @@ def std_meson_args(self) -> List[str]:
return std_meson_args
@staticmethod
def std_args(pkg) -> List[str]:
def std_args(pkg):
"""Standard meson arguments for a generic package."""
try:
build_type = pkg.spec.variants["buildtype"].value
@@ -175,7 +172,7 @@ def build_directory(self):
"""Directory to use when building the package."""
return os.path.join(self.pkg.stage.path, self.build_dirname)
def meson_args(self) -> List[str]:
def meson_args(self):
"""List of arguments that must be passed to meson, except:
* ``--prefix``
@@ -188,12 +185,7 @@ def meson_args(self) -> List[str]:
"""
return []
def meson(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
) -> None:
def meson(self, pkg, spec, prefix):
"""Run ``meson`` in the build directory"""
options = []
if self.spec["meson"].satisfies("@0.64:"):
@@ -204,31 +196,21 @@ def meson(
with fs.working_dir(self.build_directory, create=True):
pkg.module.meson(*options)
def build(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
) -> None:
def build(self, pkg, spec, prefix):
"""Make the build targets"""
options = ["-v"]
options += self.build_targets
with fs.working_dir(self.build_directory):
pkg.module.ninja(*options)
def install(
self,
pkg: spack.package_base.PackageBase,
spec: spack.spec.Spec,
prefix: spack.util.prefix.Prefix,
) -> None:
def install(self, pkg, spec, prefix):
"""Make the install targets"""
with fs.working_dir(self.build_directory):
pkg.module.ninja(*self.install_targets)
spack.phase_callbacks.run_after("build")(execute_build_time_tests)
spack.builder.run_after("build")(execute_build_time_tests)
def check(self) -> None:
def check(self):
"""Search Meson-generated files for the target ``test`` and run it if found."""
with fs.working_dir(self.build_directory):
self.pkg._if_ninja_target_execute("test")


@@ -10,7 +10,7 @@
import spack.package_base
from spack.directives import build_system, conflicts
from ._checks import BuilderWithDefaults
from ._checks import BaseBuilder
class MSBuildPackage(spack.package_base.PackageBase):
@@ -26,7 +26,7 @@ class MSBuildPackage(spack.package_base.PackageBase):
@spack.builder.builder("msbuild")
class MSBuildBuilder(BuilderWithDefaults):
class MSBuildBuilder(BaseBuilder):
"""The MSBuild builder encodes the most common way of building software with
Microsoft's MSBuild tool. It has two phases that can be overridden, if need be:


@@ -10,7 +10,7 @@
import spack.package_base
from spack.directives import build_system, conflicts
from ._checks import BuilderWithDefaults
from ._checks import BaseBuilder
class NMakePackage(spack.package_base.PackageBase):
@@ -26,7 +26,7 @@ class NMakePackage(spack.package_base.PackageBase):
@spack.builder.builder("nmake")
class NMakeBuilder(BuilderWithDefaults):
class NMakeBuilder(BaseBuilder):
"""The NMake builder encodes the most common way of building software with
Microsoft's NMake tool. It has two phases that can be overridden, if need be:


@@ -7,7 +7,7 @@
from spack.directives import build_system, extends
from spack.multimethod import when
from ._checks import BuilderWithDefaults
from ._checks import BaseBuilder
class OctavePackage(spack.package_base.PackageBase):
@@ -29,7 +29,7 @@ class OctavePackage(spack.package_base.PackageBase):
@spack.builder.builder("octave")
class OctaveBuilder(BuilderWithDefaults):
class OctaveBuilder(BaseBuilder):
"""The octave builder provides the following phases that can be overridden:
1. :py:meth:`~.OctaveBuilder.install`


@@ -255,7 +255,7 @@ def libs(self):
return find_libraries("*", root=self.component_prefix.lib, recursive=not self.v2_layout)
class IntelOneApiLibraryPackageWithSdk(IntelOneApiLibraryPackage):
class IntelOneApiLibraryPackageWithSdk(IntelOneApiPackage):
"""Base class for Intel oneAPI library packages with SDK components.
Contains some convenient default implementations for libraries


@@ -10,12 +10,11 @@
import spack.builder
import spack.package_base
import spack.phase_callbacks
from spack.directives import build_system, extends
from spack.install_test import SkipTest, test_part
from spack.util.executable import Executable
from ._checks import BuilderWithDefaults, execute_build_time_tests
from ._checks import BaseBuilder, execute_build_time_tests
class PerlPackage(spack.package_base.PackageBase):
@@ -85,7 +84,7 @@ def test_use(self):
@spack.builder.builder("perl")
class PerlBuilder(BuilderWithDefaults):
class PerlBuilder(BaseBuilder):
"""The perl builder provides four phases that can be overridden, if required:
1. :py:meth:`~.PerlBuilder.configure`
@@ -164,7 +163,7 @@ def configure(self, pkg, spec, prefix):
# Build.PL may be too long causing the build to fail. Patching the shebang
# does not happen until after install so set '/usr/bin/env perl' here in
# the Build script.
@spack.phase_callbacks.run_after("configure")
@spack.builder.run_after("configure")
def fix_shebang(self):
if self.build_method == "Build.PL":
pattern = "#!{0}".format(self.spec["perl"].command.path)
@@ -176,7 +175,7 @@ def build(self, pkg, spec, prefix):
self.build_executable()
# Ensure that tests run after build (if requested):
spack.phase_callbacks.run_after("build")(execute_build_time_tests)
spack.builder.run_after("build")(execute_build_time_tests)
def check(self):
"""Runs built-in tests of a Perl package."""


@@ -24,7 +24,6 @@
import spack.detection
import spack.multimethod
import spack.package_base
import spack.phase_callbacks
import spack.platforms
import spack.repo
import spack.spec
@@ -35,7 +34,7 @@
from spack.spec import Spec
from spack.util.prefix import Prefix
from ._checks import BuilderWithDefaults, execute_install_time_tests
from ._checks import BaseBuilder, execute_install_time_tests
def _flatten_dict(dictionary: Mapping[str, object]) -> Iterable[str]:
@@ -375,7 +374,7 @@ def list_url(cls) -> Optional[str]: # type: ignore[override]
return None
@property
def python_spec(self) -> Spec:
def python_spec(self):
"""Get python-venv if it exists or python otherwise."""
python, *_ = self.spec.dependencies("python-venv") or self.spec.dependencies("python")
return python
@@ -426,7 +425,7 @@ def libs(self) -> LibraryList:
@spack.builder.builder("python_pip")
class PythonPipBuilder(BuilderWithDefaults):
class PythonPipBuilder(BaseBuilder):
phases = ("install",)
#: Names associated with package methods in the old build-system format
@@ -544,4 +543,4 @@ def install(self, pkg: PythonPackage, spec: Spec, prefix: Prefix) -> None:
with fs.working_dir(self.build_directory):
pip(*args)
spack.phase_callbacks.run_after("install")(execute_install_time_tests)
spack.builder.run_after("install")(execute_install_time_tests)


@@ -6,10 +6,9 @@
import spack.builder
import spack.package_base
import spack.phase_callbacks
from spack.directives import build_system, depends_on
from ._checks import BuilderWithDefaults, execute_build_time_tests
from ._checks import BaseBuilder, execute_build_time_tests
class QMakePackage(spack.package_base.PackageBase):
@@ -31,7 +30,7 @@ class QMakePackage(spack.package_base.PackageBase):
@spack.builder.builder("qmake")
class QMakeBuilder(BuilderWithDefaults):
class QMakeBuilder(BaseBuilder):
"""The qmake builder provides three phases that can be overridden:
1. :py:meth:`~.QMakeBuilder.qmake`
@@ -82,4 +81,4 @@ def check(self):
with working_dir(self.build_directory):
self.pkg._if_make_target_execute("check")
spack.phase_callbacks.run_after("build")(execute_build_time_tests)
spack.builder.run_after("build")(execute_build_time_tests)


@@ -8,7 +8,7 @@
import spack.package_base
from spack.directives import build_system, extends, maintainers
from ._checks import BuilderWithDefaults
from ._checks import BaseBuilder
class RubyPackage(spack.package_base.PackageBase):
@@ -28,7 +28,7 @@ class RubyPackage(spack.package_base.PackageBase):
@spack.builder.builder("ruby")
class RubyBuilder(BuilderWithDefaults):
class RubyBuilder(BaseBuilder):
"""The Ruby builder provides two phases that can be overridden if required:
#. :py:meth:`~.RubyBuilder.build`


@@ -4,10 +4,9 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import spack.builder
import spack.package_base
import spack.phase_callbacks
from spack.directives import build_system, depends_on
from ._checks import BuilderWithDefaults, execute_build_time_tests
from ._checks import BaseBuilder, execute_build_time_tests
class SConsPackage(spack.package_base.PackageBase):
@@ -29,7 +28,7 @@ class SConsPackage(spack.package_base.PackageBase):
@spack.builder.builder("scons")
class SConsBuilder(BuilderWithDefaults):
class SConsBuilder(BaseBuilder):
"""The Scons builder provides the following phases that can be overridden:
1. :py:meth:`~.SConsBuilder.build`
@@ -80,4 +79,4 @@ def build_test(self):
"""
pass
spack.phase_callbacks.run_after("build")(execute_build_time_tests)
spack.builder.run_after("build")(execute_build_time_tests)


@@ -11,12 +11,11 @@
import spack.builder
import spack.install_test
import spack.package_base
import spack.phase_callbacks
from spack.directives import build_system, depends_on, extends
from spack.multimethod import when
from spack.util.executable import Executable
from ._checks import BuilderWithDefaults, execute_install_time_tests
from ._checks import BaseBuilder, execute_install_time_tests
class SIPPackage(spack.package_base.PackageBase):
@@ -104,7 +103,7 @@ def test_imports(self):
@spack.builder.builder("sip")
class SIPBuilder(BuilderWithDefaults):
class SIPBuilder(BaseBuilder):
"""The SIP builder provides the following phases that can be overridden:
* configure
@@ -171,4 +170,4 @@ def install_args(self):
"""Arguments to pass to install."""
return []
spack.phase_callbacks.run_after("install")(execute_install_time_tests)
spack.builder.run_after("install")(execute_install_time_tests)


@@ -6,10 +6,9 @@
import spack.builder
import spack.package_base
import spack.phase_callbacks
from spack.directives import build_system, depends_on
from ._checks import BuilderWithDefaults, execute_build_time_tests, execute_install_time_tests
from ._checks import BaseBuilder, execute_build_time_tests, execute_install_time_tests
class WafPackage(spack.package_base.PackageBase):
@@ -31,7 +30,7 @@ class WafPackage(spack.package_base.PackageBase):
@spack.builder.builder("waf")
class WafBuilder(BuilderWithDefaults):
class WafBuilder(BaseBuilder):
"""The WAF builder provides the following phases that can be overridden:
* configure
@@ -137,7 +136,7 @@ def build_test(self):
"""
pass
spack.phase_callbacks.run_after("build")(execute_build_time_tests)
spack.builder.run_after("build")(execute_build_time_tests)
def install_test(self):
"""Run unit tests after install.
@@ -147,4 +146,4 @@ def install_test(self):
"""
pass
spack.phase_callbacks.run_after("install")(execute_install_time_tests)
spack.builder.run_after("install")(execute_install_time_tests)


@@ -6,30 +6,44 @@
import collections.abc
import copy
import functools
from typing import Dict, List, Optional, Tuple, Type
from typing import List, Optional, Tuple
from llnl.util import lang
import spack.error
import spack.multimethod
import spack.package_base
import spack.phase_callbacks
import spack.repo
import spack.spec
import spack.util.environment
#: Builder classes, as registered by the "builder" decorator
BUILDER_CLS: Dict[str, Type["Builder"]] = {}
BUILDER_CLS = {}
#: An object of this kind is a shared global state used to collect callbacks during
#: class definition time, and is flushed when the class object is created at the end
#: of the class definition
#:
#: Args:
#: attribute_name (str): name of the attribute that will be attached to the builder
#: callbacks (list): container used to temporarily aggregate the callbacks
CallbackTemporaryStage = collections.namedtuple(
"CallbackTemporaryStage", ["attribute_name", "callbacks"]
)
#: Shared global state to aggregate "@run_before" callbacks
_RUN_BEFORE = CallbackTemporaryStage(attribute_name="run_before_callbacks", callbacks=[])
#: Shared global state to aggregate "@run_after" callbacks
_RUN_AFTER = CallbackTemporaryStage(attribute_name="run_after_callbacks", callbacks=[])
#: Map id(pkg) to a builder, to avoid creating multiple
#: builders for the same package object.
_BUILDERS: Dict[int, "Builder"] = {}
_BUILDERS = {}
def builder(build_system_name: str):
def builder(build_system_name):
"""Class decorator used to register the default builder
for a given build-system.
Args:
build_system_name: name of the build-system
build_system_name (str): name of the build-system
"""
def _decorator(cls):
@@ -40,9 +54,13 @@ def _decorator(cls):
return _decorator
def create(pkg: spack.package_base.PackageBase) -> "Builder":
"""Given a package object with an associated concrete spec, return the builder object that can
install it."""
def create(pkg):
"""Given a package object with an associated concrete spec,
return the builder object that can install it.
Args:
pkg (spack.package_base.PackageBase): package for which we want the builder
"""
if id(pkg) not in _BUILDERS:
_BUILDERS[id(pkg)] = _create(pkg)
return _BUILDERS[id(pkg)]
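A small sketch of the memoization this provides (pkg stands for any concretized package object): repeated lookups return the identical builder instance.

builder_a = create(pkg)
builder_b = create(pkg)
assert builder_a is builder_b  # cached in _BUILDERS, keyed by id(pkg)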
@@ -57,7 +75,7 @@ def __call__(self, spec, prefix):
return self.phase_fn(self.builder.pkg, spec, prefix)
def get_builder_class(pkg, name: str) -> Optional[Type["Builder"]]:
def get_builder_class(pkg, name: str) -> Optional[type]:
"""Return the builder class if a package module defines it."""
cls = getattr(pkg.module, name, None)
if cls and cls.__module__.startswith(spack.repo.ROOT_PYTHON_NAMESPACE):
@@ -65,7 +83,7 @@ def get_builder_class(pkg, name: str) -> Optional[Type["Builder"]]:
return None
def _create(pkg: spack.package_base.PackageBase) -> "Builder":
def _create(pkg):
"""Return a new builder object for the package object being passed as argument.
The function inspects the build-system used by the package object and tries to:
@@ -85,7 +103,7 @@ class hierarchy (look at AspellDictPackage for an example of that)
to look for build-related methods in the ``*Package``.
Args:
pkg: package object for which we need a builder
pkg (spack.package_base.PackageBase): package object for which we need a builder
"""
package_buildsystem = buildsystem_name(pkg)
default_builder_cls = BUILDER_CLS[package_buildsystem]
@@ -150,8 +168,8 @@ def __forward(self, *args, **kwargs):
# with the same name is defined in the Package, it will override this definition
# (when _ForwardToBaseBuilder is initialized)
for method_name in (
base_cls.phases # type: ignore
+ base_cls.legacy_methods # type: ignore
base_cls.phases
+ base_cls.legacy_methods
+ getattr(base_cls, "legacy_long_methods", tuple())
+ ("setup_build_environment", "setup_dependent_build_environment")
):
@@ -163,14 +181,14 @@ def __forward(self):
return __forward
for attribute_name in base_cls.legacy_attributes: # type: ignore
for attribute_name in base_cls.legacy_attributes:
setattr(
_ForwardToBaseBuilder,
attribute_name,
property(forward_property_to_getattr(attribute_name)),
)
class Adapter(base_cls, metaclass=_PackageAdapterMeta): # type: ignore
class Adapter(base_cls, metaclass=_PackageAdapterMeta):
def __init__(self, pkg):
# Deal with custom phases in packages here
if hasattr(pkg, "phases"):
@@ -195,18 +213,99 @@ def setup_dependent_build_environment(self, env, dependent_spec):
return Adapter(pkg)
def buildsystem_name(pkg: spack.package_base.PackageBase) -> str:
def buildsystem_name(pkg):
"""Given a package object with an associated concrete spec,
return the name of its build system."""
return the name of its build system.
Args:
pkg (spack.package_base.PackageBase): package for which we want
the build system name
"""
try:
return pkg.spec.variants["build_system"].value
except KeyError:
# We are reading an old spec without the build_system variant
return pkg.legacy_buildsystem # type: ignore
return pkg.legacy_buildsystem
class PhaseCallbacksMeta(type):
"""Permits registering arbitrary functions during class definition and running them
later, before or after a given install phase.
Each method decorated with ``run_before`` or ``run_after`` gets temporarily
stored in a global shared state when a class being defined is parsed by the Python
interpreter. At class definition time that temporary storage gets flushed and a list
of callbacks is attached to the class being defined.
"""
def __new__(mcs, name, bases, attr_dict):
for temporary_stage in (_RUN_BEFORE, _RUN_AFTER):
staged_callbacks = temporary_stage.callbacks
# Here we have an adapter from an old-style package. This means there is no
# hierarchy of builders, and every callback that had to be combined between
# *Package and *Builder has been combined already by _PackageAdapterMeta
if name == "Adapter":
continue
# If we are here we have callbacks. To get a complete list, we accumulate all the
# callbacks from base classes, we deduplicate them, then prepend what we have
# registered here.
#
# The order should be:
# 1. Callbacks are registered in order within the same class
# 2. Callbacks defined in derived classes precede those defined in base
# classes
callbacks_from_base = []
for base in bases:
current_callbacks = getattr(base, temporary_stage.attribute_name, None)
if not current_callbacks:
continue
callbacks_from_base.extend(current_callbacks)
callbacks_from_base = list(lang.dedupe(callbacks_from_base))
# Set the callbacks in this class and flush the temporary stage
attr_dict[temporary_stage.attribute_name] = staged_callbacks[:] + callbacks_from_base
del temporary_stage.callbacks[:]
return super(PhaseCallbacksMeta, mcs).__new__(mcs, name, bases, attr_dict)
@staticmethod
def run_after(phase, when=None):
"""Decorator to register a function for running after a given phase.
Args:
phase (str): phase after which the function must run.
when (str): condition under which the function is run (if None, it is always run).
"""
def _decorator(fn):
key = (phase, when)
item = (key, fn)
_RUN_AFTER.callbacks.append(item)
return fn
return _decorator
@staticmethod
def run_before(phase, when=None):
"""Decorator to register a function for running before a given phase.
Args:
phase (str): phase before which the function must run.
when (str): condition under which the function is run (if None, it is always run).
"""
def _decorator(fn):
key = (phase, when)
item = (key, fn)
_RUN_BEFORE.callbacks.append(item)
return fn
return _decorator
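As a hedged illustration (the builder class, phase names, and method bodies below are invented, not taken from this diff), these decorators are typically used inside a package or builder definition like so:

```python
# Hypothetical usage sketch of the run_before/run_after decorators defined above.
class ExampleBuilder(Builder):
    phases = ("configure", "build", "install")

    @run_before("install")
    def sanity_check(self):
        # queued in _RUN_BEFORE at class-definition time, executed just before "install"
        assert self.spec.concrete

    @run_after("install", when="+docs")
    def install_docs(self):
        # executed after "install", only when the spec satisfies +docs
        pass
```

At class-definition time, ``PhaseCallbacksMeta`` flushes these staged callbacks into class attributes, which ``InstallationPhase`` later collects via ``_make_callbacks``.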
class BuilderMeta(
spack.phase_callbacks.PhaseCallbacksMeta,
PhaseCallbacksMeta,
spack.multimethod.MultiMethodMeta,
type(collections.abc.Sequence), # type: ignore
):
@@ -301,12 +400,8 @@ def __new__(mcs, name, bases, attr_dict):
)
combine_callbacks = _PackageAdapterMeta.combine_callbacks
attr_dict[spack.phase_callbacks._RUN_BEFORE.attribute_name] = combine_callbacks(
spack.phase_callbacks._RUN_BEFORE.attribute_name
)
attr_dict[spack.phase_callbacks._RUN_AFTER.attribute_name] = combine_callbacks(
spack.phase_callbacks._RUN_AFTER.attribute_name
)
attr_dict[_RUN_BEFORE.attribute_name] = combine_callbacks(_RUN_BEFORE.attribute_name)
attr_dict[_RUN_AFTER.attribute_name] = combine_callbacks(_RUN_AFTER.attribute_name)
return super(_PackageAdapterMeta, mcs).__new__(mcs, name, bases, attr_dict)
@@ -326,8 +421,8 @@ def __init__(self, name, builder):
self.name = name
self.builder = builder
self.phase_fn = self._select_phase_fn()
self.run_before = self._make_callbacks(spack.phase_callbacks._RUN_BEFORE.attribute_name)
self.run_after = self._make_callbacks(spack.phase_callbacks._RUN_AFTER.attribute_name)
self.run_before = self._make_callbacks(_RUN_BEFORE.attribute_name)
self.run_after = self._make_callbacks(_RUN_AFTER.attribute_name)
def _make_callbacks(self, callbacks_attribute):
result = []
@@ -388,103 +483,15 @@ def copy(self):
return copy.deepcopy(self)
class BaseBuilder(metaclass=BuilderMeta):
"""An interface for builders, without any phases defined. This class is exposed in the package
API, so that packagers can create a single class to define ``setup_build_environment`` and
``@run_before`` and ``@run_after`` callbacks that can be shared among different builders.
class Builder(collections.abc.Sequence, metaclass=BuilderMeta):
"""A builder is a class that, given a package object (i.e. associated with
concrete spec), knows how to install it.
Example:
The builder behaves like a sequence and, when iterated over, returns the
"phases" of the installation in the correct order.
.. code-block:: python
class AnyBuilder(BaseBuilder):
@run_after("install")
def fixup_install(self):
# do something after the package is installed
pass
def setup_build_environment(self, env):
env.set("MY_ENV_VAR", "my_value")
class CMakeBuilder(cmake.CMakeBuilder, AnyBuilder):
pass
class AutotoolsBuilder(autotools.AutotoolsBuilder, AnyBuilder):
pass
"""
def __init__(self, pkg: spack.package_base.PackageBase) -> None:
self.pkg = pkg
@property
def spec(self) -> spack.spec.Spec:
return self.pkg.spec
@property
def stage(self):
return self.pkg.stage
@property
def prefix(self):
return self.pkg.prefix
def setup_build_environment(
self, env: spack.util.environment.EnvironmentModifications
) -> None:
"""Sets up the build environment for a package.
This method will be called before the current package prefix exists in
Spack's store.
Args:
env: environment modifications to be applied when the package is built. Package authors
can call methods on it to alter the build environment.
"""
if not hasattr(super(), "setup_build_environment"):
return
super().setup_build_environment(env) # type: ignore
def setup_dependent_build_environment(
self, env: spack.util.environment.EnvironmentModifications, dependent_spec: spack.spec.Spec
) -> None:
"""Sets up the build environment of a package that depends on this one.
This is similar to ``setup_build_environment``, but it is used to modify the build
environment of a package that *depends* on this one.
This gives packages the ability to set environment variables for the build of the
dependent, which can be useful to provide search hints for headers or libraries if they are
not in standard locations.
This method will be called before the dependent package prefix exists in Spack's store.
Args:
env: environment modifications to be applied when the dependent package is built.
Package authors can call methods on it to alter the build environment.
dependent_spec: the spec of the dependent package about to be built. This allows the
extendee (self) to query the dependent's state. Note that *this* package's spec is
available as ``self.spec``
"""
if not hasattr(super(), "setup_dependent_build_environment"):
return
super().setup_dependent_build_environment(env, dependent_spec) # type: ignore
def __repr__(self):
fmt = "{name}{/hash:7}"
return f"{self.__class__.__name__}({self.spec.format(fmt)})"
def __str__(self):
fmt = "{name}{/hash:7}"
return f'"{self.__class__.__name__}" builder for "{self.spec.format(fmt)}"'
class Builder(BaseBuilder, collections.abc.Sequence):
"""A builder is a class that, given a package object (i.e. associated with concrete spec),
knows how to install it.
The builder behaves like a sequence, and when iterated over return the "phases" of the
installation in the correct order.
Args:
pkg (spack.package_base.PackageBase): package object to be built
"""
#: Sequence of phases. Must be defined in derived classes
@@ -499,22 +506,95 @@ class Builder(BaseBuilder, collections.abc.Sequence):
build_time_test_callbacks: List[str]
install_time_test_callbacks: List[str]
#: List of glob expressions. Each expression must either be absolute or relative to the package
#: source path. Matching artifacts found at the end of the build process will be copied in the
#: same directory tree as _spack_build_logfile and _spack_build_envfile.
@property
def archive_files(self) -> List[str]:
return []
#: List of glob expressions. Each expression must either be
#: absolute or relative to the package source path.
#: Matching artifacts found at the end of the build process will be
#: copied in the same directory tree as _spack_build_logfile and
#: _spack_build_envfile.
archive_files: List[str] = []
def __init__(self, pkg: spack.package_base.PackageBase) -> None:
super().__init__(pkg)
def __init__(self, pkg):
self.pkg = pkg
self.callbacks = {}
for phase in self.phases:
self.callbacks[phase] = InstallationPhase(phase, self)
@property
def spec(self):
return self.pkg.spec
@property
def stage(self):
return self.pkg.stage
@property
def prefix(self):
return self.pkg.prefix
def setup_build_environment(self, env):
"""Sets up the build environment for a package.
This method will be called before the current package prefix exists in
Spack's store.
Args:
env (spack.util.environment.EnvironmentModifications): environment
modifications to be applied when the package is built. Package authors
can call methods on it to alter the build environment.
"""
if not hasattr(super(), "setup_build_environment"):
return
super().setup_build_environment(env)
def setup_dependent_build_environment(self, env, dependent_spec):
"""Sets up the build environment of packages that depend on this one.
This is similar to ``setup_build_environment``, but it is used to
modify the build environments of packages that *depend* on this one.
This gives packages like Python and others that follow the extension
model a way to implement common environment or compile-time settings
for dependencies.
This method will be called before the dependent package prefix exists
in Spack's store.
Examples:
1. Installing python modules generally requires ``PYTHONPATH``
to point to the ``lib/pythonX.Y/site-packages`` directory in the
module's install prefix. This method could be used to set that
variable.
Args:
env (spack.util.environment.EnvironmentModifications): environment
modifications to be applied when the dependent package is built.
Package authors can call methods on it to alter the build environment.
dependent_spec (spack.spec.Spec): the spec of the dependent package
about to be built. This allows the extendee (self) to query
the dependent's state. Note that *this* package's spec is
available as ``self.spec``
"""
if not hasattr(super(), "setup_dependent_build_environment"):
return
super().setup_dependent_build_environment(env, dependent_spec)
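A minimal sketch of the ``PYTHONPATH`` pattern described in example 1 above; the exact site-packages layout is illustrative, not taken from the diff:

```python
import os

def setup_dependent_build_environment(self, env, dependent_spec):
    # Point dependents at this package's site-packages directory (path layout is illustrative).
    site_packages = os.path.join(self.prefix, "lib", "pythonX.Y", "site-packages")
    env.prepend_path("PYTHONPATH", site_packages)
```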
def __getitem__(self, idx):
key = self.phases[idx]
return self.callbacks[key]
def __len__(self):
return len(self.phases)
def __repr__(self):
msg = "{0}({1})"
return msg.format(type(self).__name__, self.pkg.spec.format("{name}/{hash:7}"))
def __str__(self):
msg = '"{0}" builder for "{1}"'
return msg.format(type(self).build_system, self.pkg.spec.format("{name}/{hash:7}"))
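Because ``__getitem__`` and ``__len__`` above implement the sequence protocol, a builder can be iterated to obtain its phases in installation order; a rough sketch, assuming an existing concretized package object ``pkg``:

```python
# Assumes an existing, concretized package object `pkg`.
builder = spack.builder.create(pkg)
for installation_phase in builder:  # yields one InstallationPhase per entry in self.phases
    print(installation_phase.name)
```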
# Export these names as standalone to be used in packages
run_after = PhaseCallbacksMeta.run_after
run_before = PhaseCallbacksMeta.run_before

View File

@@ -32,13 +32,11 @@
import spack
import spack.binary_distribution as bindist
import spack.builder
import spack.concretize
import spack.config as cfg
import spack.error
import spack.main
import spack.mirrors.mirror
import spack.mirrors.utils
import spack.mirror
import spack.paths
import spack.repo
import spack.spec
@@ -205,7 +203,7 @@ def _print_staging_summary(spec_labels, stages, rebuild_decisions):
if not stages:
return
mirrors = spack.mirrors.mirror.MirrorCollection(binary=True)
mirrors = spack.mirror.MirrorCollection(binary=True)
tty.msg("Checked the following mirrors for binaries:")
for m in mirrors.values():
tty.msg(f" {m.fetch_url}")
@@ -798,7 +796,7 @@ def ensure_expected_target_path(path):
path = path.replace("\\", "/")
return path
pipeline_mirrors = spack.mirrors.mirror.MirrorCollection(binary=True)
pipeline_mirrors = spack.mirror.MirrorCollection(binary=True)
buildcache_destination = None
if "buildcache-destination" not in pipeline_mirrors:
raise SpackCIError("spack ci generate requires a mirror named 'buildcache-destination'")
@@ -1324,7 +1322,7 @@ def push_to_build_cache(spec: spack.spec.Spec, mirror_url: str, sign_binaries: b
"""
tty.debug(f"Pushing to build cache ({'signed' if sign_binaries else 'unsigned'})")
signing_key = bindist.select_signing_key() if sign_binaries else None
mirror = spack.mirrors.mirror.Mirror.from_url(mirror_url)
mirror = spack.mirror.Mirror.from_url(mirror_url)
try:
with bindist.make_uploader(mirror, signing_key=signing_key) as uploader:
uploader.push_or_raise([spec])
@@ -1344,7 +1342,7 @@ def remove_other_mirrors(mirrors_to_keep, scope=None):
mirrors_to_remove.append(name)
for mirror_name in mirrors_to_remove:
spack.mirrors.utils.remove(mirror_name, scope)
spack.mirror.remove(mirror_name, scope)
def copy_files_to_artifacts(src, artifacts_dir):
@@ -1389,11 +1387,7 @@ def copy_stage_logs_to_artifacts(job_spec: spack.spec.Spec, job_log_dir: str) ->
stage_dir = job_pkg.stage.path
tty.debug(f"stage dir: {stage_dir}")
for file in [
job_pkg.log_path,
job_pkg.env_mods_path,
*spack.builder.create(job_pkg).archive_files,
]:
for file in [job_pkg.log_path, job_pkg.env_mods_path, *job_pkg.builder.archive_files]:
copy_files_to_artifacts(file, job_log_dir)

View File

@@ -4,13 +4,12 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import argparse
import difflib
import importlib
import os
import re
import sys
from collections import Counter
from typing import List, Optional, Union
from typing import List, Union
import llnl.string
import llnl.util.tty as tty
@@ -34,8 +33,6 @@
import spack.util.spack_json as sjson
import spack.util.spack_yaml as syaml
from ..enums import InstallRecordStatus
# cmd has a submodule called "list" so preserve the python list module
python_list = list
@@ -126,8 +123,6 @@ def get_module(cmd_name):
tty.debug("Imported {0} from built-in commands".format(pname))
except ImportError:
module = spack.extensions.get_module(cmd_name)
if not module:
raise CommandNotFoundError(cmd_name)
attr_setdefault(module, SETUP_PARSER, lambda *args: None) # null-op
attr_setdefault(module, DESCRIPTION, "")
@@ -272,48 +267,39 @@ def matching_specs_from_env(specs):
return _concretize_spec_pairs(spec_pairs + additional_concrete_specs)[: len(spec_pairs)]
def disambiguate_spec(
spec: spack.spec.Spec,
env: Optional[ev.Environment],
local: bool = False,
installed: Union[bool, InstallRecordStatus] = True,
first: bool = False,
) -> spack.spec.Spec:
def disambiguate_spec(spec, env, local=False, installed=True, first=False):
"""Given a spec, figure out which installed package it refers to.
Args:
spec: a spec to disambiguate
env: a spack environment, if one is active, or None if no environment is active
local: do not search chained spack instances
installed: install status argument passed to database query.
first: returns the first matching spec, even if more than one match is found
Arguments:
spec (spack.spec.Spec): a spec to disambiguate
env (spack.environment.Environment): a spack environment,
if one is active, or None if no environment is active
local (bool): do not search chained spack instances
installed (bool or spack.database.InstallStatus or typing.Iterable):
install status argument passed to database query.
See ``spack.database.Database._query`` for details.
"""
hashes = env.all_hashes() if env else None
return disambiguate_spec_from_hashes(spec, hashes, local, installed, first)
def disambiguate_spec_from_hashes(
spec: spack.spec.Spec,
hashes: List[str],
local: bool = False,
installed: Union[bool, InstallRecordStatus] = True,
first: bool = False,
) -> spack.spec.Spec:
def disambiguate_spec_from_hashes(spec, hashes, local=False, installed=True, first=False):
"""Given a spec and a list of hashes, get concrete spec the spec refers to.
Arguments:
spec: a spec to disambiguate
hashes: a set of hashes of specs among which to disambiguate
local: if True, do not search chained spack instances
installed: install status argument passed to database query.
first: returns the first matching spec, even if more than one match is found
spec (spack.spec.Spec): a spec to disambiguate
hashes (typing.Iterable): a set of hashes of specs among which to disambiguate
local (bool): do not search chained spack instances
installed (bool or spack.database.InstallStatus or typing.Iterable):
install status argument passed to database query.
See ``spack.database.Database._query`` for details.
"""
if local:
matching_specs = spack.store.STORE.db.query_local(spec, hashes=hashes, installed=installed)
else:
matching_specs = spack.store.STORE.db.query(spec, hashes=hashes, installed=installed)
if not matching_specs:
tty.die(f"Spec '{spec}' matches no installed packages.")
tty.die("Spec '%s' matches no installed packages." % spec)
elif first:
return matching_specs[0]
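A hedged usage sketch of the function above (the spec string is illustrative):

```python
import spack.cmd
import spack.environment as ev
import spack.spec

# Resolve an abstract spec against installed packages, scoped to the active
# environment if one exists; errors out if nothing matches.
matching = spack.cmd.disambiguate_spec(spack.spec.Spec("zlib"), ev.active_environment())
```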
@@ -694,24 +680,3 @@ def find_environment(args):
def first_line(docstring):
"""Return the first line of the docstring."""
return docstring.split("\n")[0]
class CommandNotFoundError(spack.error.SpackError):
"""Exception class thrown when a requested command is not recognized as
such.
"""
def __init__(self, cmd_name):
msg = (
f"{cmd_name} is not a recognized Spack command or extension command; "
"check with `spack commands`."
)
long_msg = None
similar = difflib.get_close_matches(cmd_name, all_commands())
if 1 <= len(similar) <= 5:
long_msg = "\nDid you mean one of the following commands?\n "
long_msg += "\n ".join(similar)
super().__init__(msg, long_msg)
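The suggestion text relies on ``difflib.get_close_matches`` from the standard library; a standalone illustration:

```python
import difflib

# A typo like "instal" is close enough to "install" to be suggested.
difflib.get_close_matches("instal", ["install", "info", "list"])  # -> ['install']
```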

View File

@@ -16,7 +16,7 @@
import spack.bootstrap.config
import spack.bootstrap.core
import spack.config
import spack.mirrors.utils
import spack.mirror
import spack.spec
import spack.stage
import spack.util.path
@@ -400,7 +400,7 @@ def _mirror(args):
llnl.util.tty.set_msg_enabled(False)
spec = spack.spec.Spec(spec_str).concretized()
for node in spec.traverse():
spack.mirrors.utils.create(mirror_dir, [node])
spack.mirror.create(mirror_dir, [node])
llnl.util.tty.set_msg_enabled(True)
if args.binary_packages:

View File

@@ -21,7 +21,7 @@
import spack.deptypes as dt
import spack.environment as ev
import spack.error
import spack.mirrors.mirror
import spack.mirror
import spack.oci.oci
import spack.spec
import spack.stage
@@ -34,8 +34,6 @@
from spack.cmd.common import arguments
from spack.spec import Spec, save_dependency_specfiles
from ..enums import InstallRecordStatus
description = "create, download and install binary packages"
section = "packaging"
level = "long"
@@ -310,10 +308,7 @@ def setup_parser(subparser: argparse.ArgumentParser):
def _matching_specs(specs: List[Spec]) -> List[Spec]:
"""Disambiguate specs and return a list of matching specs"""
return [
spack.cmd.disambiguate_spec(s, ev.active_environment(), installed=InstallRecordStatus.ANY)
for s in specs
]
return [spack.cmd.disambiguate_spec(s, ev.active_environment(), installed=any) for s in specs]
def _format_spec(spec: Spec) -> str:
@@ -392,7 +387,7 @@ def push_fn(args):
roots = spack.cmd.require_active_env(cmd_name="buildcache push").concrete_roots()
mirror = args.mirror
assert isinstance(mirror, spack.mirrors.mirror.Mirror)
assert isinstance(mirror, spack.mirror.Mirror)
push_url = mirror.push_url
@@ -750,7 +745,7 @@ def manifest_copy(manifest_file_list, dest_mirror=None):
copy_buildcache_file(copy_file["src"], dest)
def update_index(mirror: spack.mirrors.mirror.Mirror, update_keys=False):
def update_index(mirror: spack.mirror.Mirror, update_keys=False):
# Special case OCI images for now.
try:
image_ref = spack.oci.oci.image_from_mirror(mirror)

View File

@@ -20,7 +20,7 @@
import spack.config as cfg
import spack.environment as ev
import spack.hash_types as ht
import spack.mirrors.mirror
import spack.mirror
import spack.util.gpg as gpg_util
import spack.util.timer as timer
import spack.util.url as url_util
@@ -240,7 +240,7 @@ def ci_reindex(args):
ci_mirrors = yaml_root["mirrors"]
mirror_urls = [url for url in ci_mirrors.values()]
remote_mirror_url = mirror_urls[0]
mirror = spack.mirrors.mirror.Mirror(remote_mirror_url)
mirror = spack.mirror.Mirror(remote_mirror_url)
buildcache.update_index(mirror, update_keys=True)
@@ -328,7 +328,7 @@ def ci_rebuild(args):
full_rebuild = True if rebuild_everything and rebuild_everything.lower() == "true" else False
pipeline_mirrors = spack.mirrors.mirror.MirrorCollection(binary=True)
pipeline_mirrors = spack.mirror.MirrorCollection(binary=True)
buildcache_destination = None
if "buildcache-destination" not in pipeline_mirrors:
tty.die("spack ci rebuild requires a mirror named 'buildcache-destination")

View File

@@ -14,8 +14,7 @@
import spack.config
import spack.deptypes as dt
import spack.environment as ev
import spack.mirrors.mirror
import spack.mirrors.utils
import spack.mirror
import spack.reporters
import spack.spec
import spack.store
@@ -690,31 +689,31 @@ def mirror_name_or_url(m):
# If there's a \ or / in the name, it's interpreted as a path or url.
if "/" in m or "\\" in m or m in (".", ".."):
return spack.mirrors.mirror.Mirror(m)
return spack.mirror.Mirror(m)
# Otherwise, the named mirror is required to exist.
try:
return spack.mirrors.utils.require_mirror_name(m)
return spack.mirror.require_mirror_name(m)
except ValueError as e:
raise argparse.ArgumentTypeError(f"{e}. Did you mean {os.path.join('.', m)}?") from e
def mirror_url(url):
try:
return spack.mirrors.mirror.Mirror.from_url(url)
return spack.mirror.Mirror.from_url(url)
except ValueError as e:
raise argparse.ArgumentTypeError(str(e)) from e
def mirror_directory(path):
try:
return spack.mirrors.mirror.Mirror.from_local_path(path)
return spack.mirror.Mirror.from_local_path(path)
except ValueError as e:
raise argparse.ArgumentTypeError(str(e)) from e
def mirror_name(name):
try:
return spack.mirrors.utils.require_mirror_name(name)
return spack.mirror.require_mirror_name(name)
except ValueError as e:
raise argparse.ArgumentTypeError(str(e)) from e

View File

@@ -23,10 +23,9 @@
import spack.installer
import spack.store
from spack.cmd.common import arguments
from spack.database import InstallStatuses
from spack.error import SpackError
from ..enums import InstallRecordStatus
description = "replace one package with another via symlinks"
section = "admin"
level = "long"
@@ -96,12 +95,8 @@ def deprecate(parser, args):
if len(specs) != 2:
raise SpackError("spack deprecate requires exactly two specs")
deprecate = spack.cmd.disambiguate_spec(
specs[0],
env,
local=True,
installed=(InstallRecordStatus.INSTALLED | InstallRecordStatus.DEPRECATED),
)
install_query = [InstallStatuses.INSTALLED, InstallStatuses.DEPRECATED]
deprecate = spack.cmd.disambiguate_spec(specs[0], env, local=True, installed=install_query)
if args.install:
deprecator = specs[1].concretized()

View File

@@ -17,8 +17,7 @@
import spack.spec
import spack.store
from spack.cmd.common import arguments
from ..enums import InstallRecordStatus
from spack.database import InstallStatuses
description = "list and search installed packages"
section = "basic"
@@ -138,22 +137,21 @@ def setup_parser(subparser):
subparser.add_argument(
"--loaded", action="store_true", help="show only packages loaded in the user environment"
)
only_missing_or_deprecated = subparser.add_mutually_exclusive_group()
only_missing_or_deprecated.add_argument(
subparser.add_argument(
"-M",
"--only-missing",
action="store_true",
dest="only_missing",
help="show only missing dependencies",
)
only_missing_or_deprecated.add_argument(
"--only-deprecated", action="store_true", help="show only deprecated packages"
)
subparser.add_argument(
"--deprecated",
action="store_true",
help="show deprecated packages as well as installed specs",
)
subparser.add_argument(
"--only-deprecated", action="store_true", help="show only deprecated packages"
)
subparser.add_argument(
"--install-tree",
action="store",
@@ -167,23 +165,14 @@ def setup_parser(subparser):
def query_arguments(args):
if args.only_missing and (args.deprecated or args.missing):
raise RuntimeError("cannot use --only-missing with --deprecated, or --missing")
if args.only_deprecated and (args.deprecated or args.missing):
raise RuntimeError("cannot use --only-deprecated with --deprecated, or --missing")
installed = InstallRecordStatus.INSTALLED
if args.only_missing:
installed = InstallRecordStatus.MISSING
elif args.only_deprecated:
installed = InstallRecordStatus.DEPRECATED
if args.missing:
installed |= InstallRecordStatus.MISSING
if args.deprecated:
installed |= InstallRecordStatus.DEPRECATED
# Set up query arguments.
installed = []
if not (args.only_missing or args.only_deprecated):
installed.append(InstallStatuses.INSTALLED)
if (args.deprecated or args.only_deprecated) and not args.only_missing:
installed.append(InstallStatuses.DEPRECATED)
if (args.missing or args.only_missing) and not args.only_deprecated:
installed.append(InstallStatuses.MISSING)
predicate_fn = None
if args.unknown:

View File

@@ -8,7 +8,7 @@
import tempfile
import spack.binary_distribution
import spack.mirrors.mirror
import spack.mirror
import spack.paths
import spack.stage
import spack.util.gpg
@@ -217,11 +217,11 @@ def gpg_publish(args):
mirror = None
if args.directory:
url = spack.util.url.path_to_file_url(args.directory)
mirror = spack.mirrors.mirror.Mirror(url, url)
mirror = spack.mirror.Mirror(url, url)
elif args.mirror_name:
mirror = spack.mirrors.mirror.MirrorCollection(binary=True).lookup(args.mirror_name)
mirror = spack.mirror.MirrorCollection(binary=True).lookup(args.mirror_name)
elif args.mirror_url:
mirror = spack.mirrors.mirror.Mirror(args.mirror_url, args.mirror_url)
mirror = spack.mirror.Mirror(args.mirror_url, args.mirror_url)
with tempfile.TemporaryDirectory(dir=spack.stage.get_stage_root()) as tmpdir:
spack.binary_distribution._url_push_keys(

View File

@@ -78,8 +78,8 @@
boxlib @B{dim=2} boxlib built for 2 dimensions
libdwarf @g{%intel} ^libelf@g{%gcc}
libdwarf, built with intel compiler, linked to libelf built with gcc
mvapich2 @g{%gcc} @B{fabrics=psm,mrail,sock}
mvapich2, built with gcc compiler, with support for multiple fabrics
mvapich2 @g{%pgi} @B{fabrics=psm,mrail,sock}
mvapich2, built with pgi compiler, with support for multiple fabrics
"""

View File

@@ -11,7 +11,6 @@
import llnl.util.tty.color as color
from llnl.util.tty.colify import colify
import spack.builder
import spack.deptypes as dt
import spack.fetch_strategy as fs
import spack.install_test
@@ -203,13 +202,11 @@ def print_namespace(pkg, args):
def print_phases(pkg, args):
"""output installation phases"""
builder = spack.builder.create(pkg)
if hasattr(builder, "phases") and builder.phases:
if hasattr(pkg.builder, "phases") and pkg.builder.phases:
color.cprint("")
color.cprint(section_title("Installation Phases:"))
phase_str = ""
for phase in builder.phases:
for phase in pkg.builder.phases:
phase_str += " {0}".format(phase)
color.cprint(phase_str)

View File

@@ -3,7 +3,6 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import datetime
import os
import re
from collections import defaultdict
@@ -97,7 +96,7 @@ def list_files(args):
OLD_LICENSE, SPDX_MISMATCH, GENERAL_MISMATCH = range(1, 4)
#: Latest year that copyright applies. UPDATE THIS when bumping copyright.
latest_year = datetime.date.today().year
latest_year = 2024 # year of 0.23 release
strict_date = r"Copyright 2013-%s" % latest_year
#: regexes for valid license lines at tops of files

View File

@@ -10,8 +10,7 @@
import spack.cmd
import spack.store
from spack.cmd.common import arguments
from ..enums import InstallRecordStatus
from spack.database import InstallStatuses
description = "mark packages as explicitly or implicitly installed"
section = "admin"
@@ -68,7 +67,8 @@ def find_matching_specs(specs, allow_multiple_matches=False):
has_errors = False
for spec in specs:
matching = spack.store.STORE.db.query_local(spec, installed=InstallRecordStatus.INSTALLED)
install_query = [InstallStatuses.INSTALLED]
matching = spack.store.STORE.db.query_local(spec, installed=install_query)
# For each spec provided, make sure it refers to only one package.
# Fail and ask user to be unambiguous if it doesn't
if not allow_multiple_matches and len(matching) > 1:

View File

@@ -14,8 +14,7 @@
import spack.concretize
import spack.config
import spack.environment as ev
import spack.mirrors.mirror
import spack.mirrors.utils
import spack.mirror
import spack.repo
import spack.spec
import spack.util.web as web_util
@@ -366,15 +365,15 @@ def mirror_add(args):
connection["autopush"] = args.autopush
if args.signed is not None:
connection["signed"] = args.signed
mirror = spack.mirrors.mirror.Mirror(connection, name=args.name)
mirror = spack.mirror.Mirror(connection, name=args.name)
else:
mirror = spack.mirrors.mirror.Mirror(args.url, name=args.name)
spack.mirrors.utils.add(mirror, args.scope)
mirror = spack.mirror.Mirror(args.url, name=args.name)
spack.mirror.add(mirror, args.scope)
def mirror_remove(args):
"""remove a mirror by name"""
spack.mirrors.utils.remove(args.name, args.scope)
spack.mirror.remove(args.name, args.scope)
def _configure_mirror(args):
@@ -383,7 +382,7 @@ def _configure_mirror(args):
if args.name not in mirrors:
tty.die(f"No mirror found with name {args.name}.")
entry = spack.mirrors.mirror.Mirror(mirrors[args.name], args.name)
entry = spack.mirror.Mirror(mirrors[args.name], args.name)
direction = "fetch" if args.fetch else "push" if args.push else None
changes = {}
if args.url:
@@ -450,7 +449,7 @@ def mirror_set_url(args):
def mirror_list(args):
"""print out available mirrors to the console"""
mirrors = spack.mirrors.mirror.MirrorCollection(scope=args.scope)
mirrors = spack.mirror.MirrorCollection(scope=args.scope)
if not mirrors:
tty.msg("No mirrors configured.")
return
@@ -490,9 +489,9 @@ def concrete_specs_from_user(args):
def extend_with_additional_versions(specs, num_versions):
if num_versions == "all":
mirror_specs = spack.mirrors.utils.get_all_versions(specs)
mirror_specs = spack.mirror.get_all_versions(specs)
else:
mirror_specs = spack.mirrors.utils.get_matching_versions(specs, num_versions=num_versions)
mirror_specs = spack.mirror.get_matching_versions(specs, num_versions=num_versions)
mirror_specs = [x.concretized() for x in mirror_specs]
return mirror_specs
@@ -571,7 +570,7 @@ def concrete_specs_from_environment():
def all_specs_with_all_versions():
specs = [spack.spec.Spec(n) for n in spack.repo.all_package_names()]
mirror_specs = spack.mirrors.utils.get_all_versions(specs)
mirror_specs = spack.mirror.get_all_versions(specs)
mirror_specs.sort(key=lambda s: (s.name, s.version))
return mirror_specs
@@ -660,21 +659,19 @@ def _specs_and_action(args):
def create_mirror_for_all_specs(mirror_specs, path, skip_unstable_versions):
mirror_cache, mirror_stats = spack.mirrors.utils.mirror_cache_and_stats(
mirror_cache, mirror_stats = spack.mirror.mirror_cache_and_stats(
path, skip_unstable_versions=skip_unstable_versions
)
for candidate in mirror_specs:
pkg_cls = spack.repo.PATH.get_pkg_class(candidate.name)
pkg_obj = pkg_cls(spack.spec.Spec(candidate))
mirror_stats.next_spec(pkg_obj.spec)
spack.mirrors.utils.create_mirror_from_package_object(pkg_obj, mirror_cache, mirror_stats)
spack.mirror.create_mirror_from_package_object(pkg_obj, mirror_cache, mirror_stats)
process_mirror_stats(*mirror_stats.stats())
def create_mirror_for_individual_specs(mirror_specs, path, skip_unstable_versions):
present, mirrored, error = spack.mirrors.utils.create(
path, mirror_specs, skip_unstable_versions
)
present, mirrored, error = spack.mirror.create(path, mirror_specs, skip_unstable_versions)
tty.msg("Summary for mirror in {}".format(path))
process_mirror_stats(present, mirrored, error)
@@ -684,7 +681,7 @@ def mirror_destroy(args):
mirror_url = None
if args.mirror_name:
result = spack.mirrors.mirror.MirrorCollection().lookup(args.mirror_name)
result = spack.mirror.MirrorCollection().lookup(args.mirror_name)
mirror_url = result.push_url
elif args.mirror_url:
mirror_url = args.mirror_url

View File

@@ -8,7 +8,6 @@
import spack.cmd.common.arguments
import spack.cmd.modules
import spack.config
import spack.modules
import spack.modules.lmod

View File

@@ -7,7 +7,6 @@
import spack.cmd.common.arguments
import spack.cmd.modules
import spack.config
import spack.modules
import spack.modules.tcl

View File

@@ -3,21 +3,18 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import argparse
import ast
import os
import re
import sys
from itertools import zip_longest
from typing import Dict, List, Optional
import llnl.util.tty as tty
import llnl.util.tty.color as color
from llnl.util.filesystem import working_dir
import spack.paths
import spack.repo
import spack.util.git
from spack.util.executable import Executable, which
from spack.util.executable import which
description = "runs source code style checks on spack"
section = "developer"
@@ -39,7 +36,10 @@ def grouper(iterable, n, fillvalue=None):
#: double-check the results of other tools (if, e.g., --fix was provided)
#: The list maps an executable name to a method to ensure the tool is
#: bootstrapped or present in the environment.
tool_names = ["import", "isort", "black", "flake8", "mypy"]
tool_names = ["isort", "black", "flake8", "mypy"]
#: tools we run in spack style
tools = {}
#: warnings to ignore in mypy
mypy_ignores = [
@@ -61,28 +61,14 @@ def is_package(f):
#: decorator for adding tools to the list
class tool:
def __init__(self, name: str, required: bool = False, external: bool = True) -> None:
def __init__(self, name, required=False):
self.name = name
self.external = external
self.required = required
def __call__(self, fun):
self.fun = fun
tools[self.name] = self
tools[self.name] = (fun, self.required)
return fun
@property
def installed(self) -> bool:
return bool(which(self.name)) if self.external else True
@property
def executable(self) -> Optional[Executable]:
return which(self.name) if self.external else None
#: tools we run in spack style
tools: Dict[str, tool] = {}
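For context, a rough sketch of how a style tool is registered with the ``tool`` decorator above (marking ``flake8`` as required here is an assumption; the signature mirrors the existing ``run_*`` helpers):

```python
@tool("flake8", required=True)
def run_flake8(flake8_cmd, file_list, args):
    # `flake8_cmd` is the Executable resolved via the `executable` property above.
    ...
```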
def changed_files(base="develop", untracked=True, all_files=False, root=None):
"""Get list of changed files in the Spack repository.
@@ -190,22 +176,22 @@ def setup_parser(subparser):
"-t",
"--tool",
action="append",
help="specify which tools to run (default: %s)" % ", ".join(tool_names),
help="specify which tools to run (default: %s)" % ",".join(tool_names),
)
tool_group.add_argument(
"-s",
"--skip",
metavar="TOOL",
action="append",
help="specify tools to skip (choose from %s)" % ", ".join(tool_names),
help="specify tools to skip (choose from %s)" % ",".join(tool_names),
)
subparser.add_argument("files", nargs=argparse.REMAINDER, help="specific files to check")
def cwd_relative(path, root, initial_working_dir):
def cwd_relative(path, args):
"""Translate prefix-relative path to current working directory-relative."""
return os.path.relpath(os.path.join(root, path), initial_working_dir)
return os.path.relpath(os.path.join(args.root, path), args.initial_working_dir)
def rewrite_and_print_output(
@@ -215,10 +201,7 @@ def rewrite_and_print_output(
# print results relative to current working directory
def translate(match):
return replacement.format(
cwd_relative(match.group(1), args.root, args.initial_working_dir),
*list(match.groups()[1:]),
)
return replacement.format(cwd_relative(match.group(1), args), *list(match.groups()[1:]))
for line in output.split("\n"):
if not line:
@@ -237,7 +220,7 @@ def print_style_header(file_list, args, tools_to_run):
# translate modified paths to cwd_relative if needed
paths = [filename.strip() for filename in file_list]
if not args.root_relative:
paths = [cwd_relative(filename, args.root, args.initial_working_dir) for filename in paths]
paths = [cwd_relative(filename, args) for filename in paths]
tty.msg("Modified files", *paths)
sys.stdout.flush()
@@ -323,6 +306,8 @@ def process_files(file_list, is_args):
rewrite_and_print_output(output, args, pat, replacement)
packages_isort_args = (
"--rm",
"spack",
"--rm",
"spack.pkgkit",
"--rm",
@@ -367,137 +352,17 @@ def run_black(black_cmd, file_list, args):
return returncode
def _module_part(root: str, expr: str):
parts = expr.split(".")
# spack.pkg is for repositories, don't try to resolve it here.
if ".".join(parts[:2]) == spack.repo.ROOT_PYTHON_NAMESPACE:
return None
while parts:
f1 = os.path.join(root, "lib", "spack", *parts) + ".py"
f2 = os.path.join(root, "lib", "spack", *parts, "__init__.py")
if (
os.path.exists(f1)
# ensure case sensitive match
and f"{parts[-1]}.py" in os.listdir(os.path.dirname(f1))
or os.path.exists(f2)
):
return ".".join(parts)
parts.pop()
return None
def _run_import_check(
file_list: List[str],
*,
fix: bool,
root_relative: bool,
root=spack.paths.prefix,
working_dir=spack.paths.prefix,
out=sys.stdout,
):
if sys.version_info < (3, 9):
print("import check requires Python 3.9 or later")
return 0
is_use = re.compile(r"(?<!from )(?<!import )(?:llnl|spack)\.[a-zA-Z0-9_\.]+")
# redundant imports followed by a `# comment` are ignored, because there can be a legitimate reason
# to import a module: execute module scope init code, or to deal with circular imports.
is_abs_import = re.compile(r"^import ((?:llnl|spack)\.[a-zA-Z0-9_\.]+)$", re.MULTILINE)
exit_code = 0
for file in file_list:
to_add = set()
to_remove = []
pretty_path = file if root_relative else cwd_relative(file, root, working_dir)
try:
with open(file, "r") as f:
contents = open(file, "r").read()
parsed = ast.parse(contents)
except Exception:
exit_code = 1
print(f"{pretty_path}: could not parse", file=out)
continue
for m in is_abs_import.finditer(contents):
if contents.count(m.group(1)) == 1:
to_remove.append(m.group(0))
exit_code = 1
print(f"{pretty_path}: redundant import: {m.group(1)}", file=out)
# Clear all strings to avoid matching comments/strings etc.
for node in ast.walk(parsed):
if isinstance(node, ast.Constant) and isinstance(node.value, str):
node.value = ""
filtered_contents = ast.unparse(parsed) # novermin
for m in is_use.finditer(filtered_contents):
module = _module_part(root, m.group(0))
if not module or module in to_add:
continue
if re.search(rf"import {re.escape(module)}\b(?!\.)", contents):
continue
to_add.add(module)
exit_code = 1
print(f"{pretty_path}: missing import: {module} ({m.group(0)})", file=out)
if not fix or not to_add and not to_remove:
continue
with open(file, "r") as f:
lines = f.readlines()
if to_add:
# insert missing imports before the first import, delegate ordering to isort
for node in parsed.body:
if isinstance(node, (ast.Import, ast.ImportFrom)):
first_line = node.lineno
break
else:
print(f"{pretty_path}: could not fix", file=out)
continue
lines.insert(first_line, "\n".join(f"import {x}" for x in to_add) + "\n")
new_contents = "".join(lines)
# remove redundant imports
for statement in to_remove:
new_contents = new_contents.replace(f"{statement}\n", "")
with open(file, "w") as f:
f.write(new_contents)
return exit_code
@tool("import", external=False)
def run_import_check(import_check_cmd, file_list, args):
exit_code = _run_import_check(
file_list,
fix=args.fix,
root_relative=args.root_relative,
root=args.root,
working_dir=args.initial_working_dir,
)
print_tool_result("import", exit_code)
return exit_code
def validate_toolset(arg_value):
"""Validate --tool and --skip arguments (sets of optionally comma-separated tools)."""
tools = set(",".join(arg_value).split(",")) # allow args like 'isort,flake8'
for tool in tools:
if tool not in tool_names:
tty.die("Invalid tool: '%s'" % tool, "Choose from: %s" % ", ".join(tool_names))
tty.die("Invaild tool: '%s'" % tool, "Choose from: %s" % ", ".join(tool_names))
return tools
def missing_tools(tools_to_run: List[str]) -> List[str]:
return [t for t in tools_to_run if not tools[t].installed]
def missing_tools(tools_to_run):
return [t for t in tools_to_run if which(t) is None]
def _bootstrap_dev_dependencies():
@@ -552,9 +417,9 @@ def prefix_relative(path):
print_style_header(file_list, args, tools_to_run)
for tool_name in tools_to_run:
tool = tools[tool_name]
run_function, required = tools[tool_name]
print_tool_header(tool_name)
return_code |= tool.fun(tool.executable, file_list, args)
return_code |= run_function(which(tool_name), file_list, args)
if return_code == 0:
tty.msg(color.colorize("@*{spack style checks were clean}"))

View File

@@ -17,8 +17,7 @@
import spack.store
import spack.traverse as traverse
from spack.cmd.common import arguments
from ..enums import InstallRecordStatus
from spack.database import InstallStatuses
description = "remove installed packages"
section = "build"
@@ -100,14 +99,12 @@ def find_matching_specs(
hashes = env.all_hashes() if env else None
# List of specs that match expressions given via command line
specs_from_cli: List[spack.spec.Spec] = []
specs_from_cli: List["spack.spec.Spec"] = []
has_errors = False
for spec in specs:
install_query = [InstallStatuses.INSTALLED, InstallStatuses.DEPRECATED]
matching = spack.store.STORE.db.query_local(
spec,
hashes=hashes,
installed=(InstallRecordStatus.INSTALLED | InstallRecordStatus.DEPRECATED),
origin=origin,
spec, hashes=hashes, installed=install_query, origin=origin
)
# For each spec provided, make sure it refers to only one package.
# Fail and ask user to be unambiguous if it doesn't

View File

@@ -16,6 +16,7 @@
("gfortran", os.path.join("clang", "gfortran")),
("xlf_r", os.path.join("xl_r", "xlf_r")),
("xlf", os.path.join("xl", "xlf")),
("pgfortran", os.path.join("pgi", "pgfortran")),
("ifort", os.path.join("intel", "ifort")),
]
@@ -24,6 +25,7 @@
("gfortran", os.path.join("clang", "gfortran")),
("xlf90_r", os.path.join("xl_r", "xlf90_r")),
("xlf90", os.path.join("xl", "xlf90")),
("pgfortran", os.path.join("pgi", "pgfortran")),
("ifort", os.path.join("intel", "ifort")),
]

View File

@@ -124,8 +124,8 @@ def setup_custom_environment(self, pkg, env):
# Edge cases for Intel's oneAPI compilers when using the legacy classic compilers:
# Always pass flags to disable deprecation warnings, since these warnings can
# confuse tools that parse the output of compiler commands (e.g. version checks).
if self.real_version >= Version("2021") and self.real_version < Version("2024"):
if self.real_version >= Version("2021") and self.real_version <= Version("2023"):
env.append_flags("SPACK_ALWAYS_CFLAGS", "-diag-disable=10441")
env.append_flags("SPACK_ALWAYS_CXXFLAGS", "-diag-disable=10441")
if self.real_version >= Version("2021") and self.real_version < Version("2025"):
if self.real_version >= Version("2021") and self.real_version <= Version("2024"):
env.append_flags("SPACK_ALWAYS_FFLAGS", "-diag-disable=10448")

View File

@@ -155,10 +155,10 @@ def setup_custom_environment(self, pkg, env):
# icx+icpx+ifx or icx+icpx+ifort. But to be on the safe side (some users may
# want to try to swap icpx against icpc, for example), and since the Intel LLVM
# compilers accept these diag-disable flags, we apply them for all compilers.
if self.real_version >= Version("2021") and self.real_version < Version("2024"):
if self.real_version >= Version("2021") and self.real_version <= Version("2023"):
env.append_flags("SPACK_ALWAYS_CFLAGS", "-diag-disable=10441")
env.append_flags("SPACK_ALWAYS_CXXFLAGS", "-diag-disable=10441")
if self.real_version >= Version("2021") and self.real_version < Version("2025"):
if self.real_version >= Version("2021") and self.real_version <= Version("2024"):
env.append_flags("SPACK_ALWAYS_FFLAGS", "-diag-disable=10448")
# 2024 release bumped the libsycl version because of an ABI

View File

@@ -0,0 +1,77 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
from spack.compiler import Compiler, UnsupportedCompilerFlag
from spack.version import Version
class Pgi(Compiler):
# Named wrapper links within build_env_path
link_paths = {
"cc": os.path.join("pgi", "pgcc"),
"cxx": os.path.join("pgi", "pgc++"),
"f77": os.path.join("pgi", "pgfortran"),
"fc": os.path.join("pgi", "pgfortran"),
}
version_argument = "-V"
ignore_version_errors = [2] # `pgcc -V` on PowerPC annoyingly returns 2
version_regex = r"pg[^ ]* ([0-9.]+)-[0-9]+ (LLVM )?[^ ]+ target on "
@property
def verbose_flag(self):
return "-v"
@property
def debug_flags(self):
return ["-g", "-gopt"]
@property
def opt_flags(self):
return ["-O", "-O0", "-O1", "-O2", "-O3", "-O4"]
@property
def openmp_flag(self):
return "-mp"
@property
def cxx11_flag(self):
return "-std=c++11"
@property
def cc_pic_flag(self):
return "-fpic"
@property
def cxx_pic_flag(self):
return "-fpic"
@property
def f77_pic_flag(self):
return "-fpic"
@property
def fc_pic_flag(self):
return "-fpic"
required_libs = ["libpgc", "libpgf90"]
@property
def c99_flag(self):
if self.real_version >= Version("12.10"):
return "-c99"
raise UnsupportedCompilerFlag(self, "the C99 standard", "c99_flag", "< 12.10")
@property
def c11_flag(self):
if self.real_version >= Version("15.3"):
return "-c11"
raise UnsupportedCompilerFlag(self, "the C11 standard", "c11_flag", "< 15.3")
@property
def stdcxx_libs(self):
return ("-pgc++libs",)

View File

@@ -160,11 +160,6 @@ def concretize_separately(
# TODO: support parallel concretization on macOS and Windows
num_procs = min(len(args), spack.config.determine_number_of_jobs(parallel=True))
msg = "Starting concretization"
if sys.platform not in ("darwin", "win32") and num_procs > 1:
msg += f" pool with {num_procs} processes"
tty.msg(msg)
for j, (i, concrete, duration) in enumerate(
spack.util.parallel.imap_unordered(
_concretize_task, args, processes=num_procs, debug=tty.is_debug(), maxtaskperchild=1

View File

@@ -69,8 +69,6 @@
from spack.error import SpackError
from spack.util.crypto import bit_length
from .enums import InstallRecordStatus
# TODO: Provide an API for automatically retrying a build after detecting and
# TODO: clearing a failure.
@@ -162,12 +160,36 @@ def converter(self, spec_like, *args, **kwargs):
return converter
def normalize_query(installed: Union[bool, InstallRecordStatus]) -> InstallRecordStatus:
if installed is True:
installed = InstallRecordStatus.INSTALLED
elif installed is False:
installed = InstallRecordStatus.MISSING
return installed
class InstallStatus(str):
pass
class InstallStatuses:
INSTALLED = InstallStatus("installed")
DEPRECATED = InstallStatus("deprecated")
MISSING = InstallStatus("missing")
@classmethod
def canonicalize(cls, query_arg):
if query_arg is True:
return [cls.INSTALLED]
if query_arg is False:
return [cls.MISSING]
if query_arg is any:
return [cls.INSTALLED, cls.DEPRECATED, cls.MISSING]
if isinstance(query_arg, InstallStatus):
return [query_arg]
try:
statuses = list(query_arg)
if all(isinstance(x, InstallStatus) for x in statuses):
return statuses
except TypeError:
pass
raise TypeError(
"installation query must be `any`, boolean, "
"InstallStatus, or iterable of InstallStatus"
)
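Concretely, the legacy helper above maps query arguments as follows (an illustration of the code shown, not an addition to it):

```python
InstallStatuses.canonicalize(True)   # -> [InstallStatuses.INSTALLED]
InstallStatuses.canonicalize(False)  # -> [InstallStatuses.MISSING]
InstallStatuses.canonicalize(any)    # -> [INSTALLED, DEPRECATED, MISSING]
```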
class InstallRecord:
@@ -205,8 +227,8 @@ def __init__(
installation_time: Optional[float] = None,
deprecated_for: Optional[str] = None,
in_buildcache: bool = False,
origin: Optional[str] = None,
) -> None:
origin=None,
):
self.spec = spec
self.path = str(path) if path else None
self.installed = bool(installed)
@@ -217,12 +239,14 @@ def __init__(
self.in_buildcache = in_buildcache
self.origin = origin
def install_type_matches(self, installed: InstallRecordStatus) -> bool:
def install_type_matches(self, installed):
installed = InstallStatuses.canonicalize(installed)
if self.installed:
return InstallRecordStatus.INSTALLED in installed
return InstallStatuses.INSTALLED in installed
elif self.deprecated_for:
return InstallRecordStatus.DEPRECATED in installed
return InstallRecordStatus.MISSING in installed
return InstallStatuses.DEPRECATED in installed
else:
return InstallStatuses.MISSING in installed
def to_dict(self, include_fields=DEFAULT_INSTALL_RECORD_FIELDS):
rec_dict = {}
@@ -1372,13 +1396,7 @@ def installed_extensions_for(self, extendee_spec: "spack.spec.Spec"):
if spec.package.extends(extendee_spec):
yield spec.package
def _get_by_hash_local(
self,
dag_hash: str,
default: Optional[List["spack.spec.Spec"]] = None,
installed: Union[bool, InstallRecordStatus] = InstallRecordStatus.ANY,
) -> Optional[List["spack.spec.Spec"]]:
installed = normalize_query(installed)
def _get_by_hash_local(self, dag_hash, default=None, installed=any):
# hash is a full hash and is in the data somewhere
if dag_hash in self._data:
rec = self._data[dag_hash]
@@ -1387,7 +1405,8 @@ def _get_by_hash_local(
else:
return default
# check if hash is a prefix of some installed (or previously installed) spec.
# check if hash is a prefix of some installed (or previously
# installed) spec.
matches = [
record.spec
for h, record in self._data.items()
@@ -1399,43 +1418,52 @@ def _get_by_hash_local(
# nothing found
return default
def get_by_hash_local(
self,
dag_hash: str,
default: Optional[List["spack.spec.Spec"]] = None,
installed: Union[bool, InstallRecordStatus] = InstallRecordStatus.ANY,
) -> Optional[List["spack.spec.Spec"]]:
def get_by_hash_local(self, dag_hash, default=None, installed=any):
"""Look up a spec in *this DB* by DAG hash, or by a DAG hash prefix.
Args:
dag_hash: hash (or hash prefix) to look up
default: default value to return if dag_hash is not in the DB
installed: if ``True``, includes only installed specs in the search; if ``False``
only missing specs. Otherwise, an InstallRecordStatus flag.
Arguments:
dag_hash (str): hash (or hash prefix) to look up
default (object or None): default value to return if dag_hash is
not in the DB (default: None)
installed (bool or InstallStatus or typing.Iterable or None):
if ``True``, includes only installed
specs in the search; if ``False`` only missing specs, and if
``any``, all specs in database. If an InstallStatus or iterable
of InstallStatus, returns specs whose install status
(installed, deprecated, or missing) matches (one of) the
InstallStatus. (default: any)
``installed`` defaults to ``InstallRecordStatus.ANY`` so we can refer to any known hash.
``installed`` defaults to ``any`` so that we can refer to any
known hash. Note that ``query()`` and ``query_one()`` differ in
that they only return installed specs by default.
Returns:
(list): a list of specs matching the hash or hash prefix
``query()`` and ``query_one()`` differ in that they only return installed specs by default.
"""
with self.read_transaction():
return self._get_by_hash_local(dag_hash, default=default, installed=installed)
def get_by_hash(
self,
dag_hash: str,
default: Optional[List["spack.spec.Spec"]] = None,
installed: Union[bool, InstallRecordStatus] = InstallRecordStatus.ANY,
) -> Optional[List["spack.spec.Spec"]]:
def get_by_hash(self, dag_hash, default=None, installed=any):
"""Look up a spec by DAG hash, or by a DAG hash prefix.
Args:
dag_hash: hash (or hash prefix) to look up
default: default value to return if dag_hash is not in the DB
installed: if ``True``, includes only installed specs in the search; if ``False``
only missing specs. Otherwise, an InstallRecordStatus flag.
Arguments:
dag_hash (str): hash (or hash prefix) to look up
default (object or None): default value to return if dag_hash is
not in the DB (default: None)
installed (bool or InstallStatus or typing.Iterable or None):
if ``True``, includes only installed specs in the search; if ``False``
only missing specs, and if ``any``, all specs in database. If an
InstallStatus or iterable of InstallStatus, returns specs whose install
status (installed, deprecated, or missing) matches (one of) the
InstallStatus. (default: any)
``installed`` defaults to ``InstallRecordStatus.ANY`` so we can refer to any known hash.
``query()`` and ``query_one()`` differ in that they only return installed specs by default.
``installed`` defaults to ``any`` so that we can refer to any
known hash. Note that ``query()`` and ``query_one()`` differ in
that they only return installed specs by default.
Returns:
(list): a list of specs matching the hash or hash prefix
"""
@@ -1455,7 +1483,7 @@ def _query(
query_spec: Optional[Union[str, "spack.spec.Spec"]] = None,
*,
predicate_fn: Optional[SelectType] = None,
installed: Union[bool, InstallRecordStatus] = True,
installed: Union[bool, InstallStatus, List[InstallStatus]] = True,
explicit: Optional[bool] = None,
start_date: Optional[datetime.datetime] = None,
end_date: Optional[datetime.datetime] = None,
@@ -1463,7 +1491,6 @@ def _query(
in_buildcache: Optional[bool] = None,
origin: Optional[str] = None,
) -> List["spack.spec.Spec"]:
installed = normalize_query(installed)
# Restrict the set of records over which we iterate first
matching_hashes = self._data
@@ -1533,7 +1560,7 @@ def query_local(
query_spec: Optional[Union[str, "spack.spec.Spec"]] = None,
*,
predicate_fn: Optional[SelectType] = None,
installed: Union[bool, InstallRecordStatus] = True,
installed: Union[bool, InstallStatus, List[InstallStatus]] = True,
explicit: Optional[bool] = None,
start_date: Optional[datetime.datetime] = None,
end_date: Optional[datetime.datetime] = None,
@@ -1593,7 +1620,7 @@ def query(
query_spec: Optional[Union[str, "spack.spec.Spec"]] = None,
*,
predicate_fn: Optional[SelectType] = None,
installed: Union[bool, InstallRecordStatus] = True,
installed: Union[bool, InstallStatus, List[InstallStatus]] = True,
explicit: Optional[bool] = None,
start_date: Optional[datetime.datetime] = None,
end_date: Optional[datetime.datetime] = None,
@@ -1601,7 +1628,7 @@ def query(
hashes: Optional[List[str]] = None,
origin: Optional[str] = None,
install_tree: str = "all",
) -> List["spack.spec.Spec"]:
):
"""Queries the Spack database including all upstream databases.
Args:
@@ -1682,14 +1709,13 @@ def query(
)
results = list(local_results) + list(x for x in upstream_results if x not in local_results)
results.sort()
return results
return sorted(results)
def query_one(
self,
query_spec: Optional[Union[str, "spack.spec.Spec"]],
predicate_fn: Optional[SelectType] = None,
installed: Union[bool, InstallRecordStatus] = True,
installed: Union[bool, InstallStatus, List[InstallStatus]] = True,
) -> Optional["spack.spec.Spec"]:
"""Query for exactly one spec that matches the query spec.

View File

@@ -3,7 +3,7 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Data structures that represent Spack's dependency relationships."""
from typing import Dict, List, Type
from typing import Dict, List
import spack.deptypes as dt
import spack.spec
@@ -38,7 +38,7 @@ class Dependency:
def __init__(
self,
pkg: Type["spack.package_base.PackageBase"],
pkg: "spack.package_base.PackageBase",
spec: "spack.spec.Spec",
depflag: dt.DepFlag = dt.DEFAULT,
):

View File

@@ -21,7 +21,6 @@ class OpenMpi(Package):
* ``conflicts``
* ``depends_on``
* ``extends``
* ``license``
* ``patch``
* ``provides``
* ``resource``
@@ -35,19 +34,19 @@ class OpenMpi(Package):
import collections.abc
import os.path
import re
from typing import Any, Callable, List, Optional, Tuple, Type, Union
from typing import TYPE_CHECKING, Any, Callable, List, Optional, Tuple, Union
import llnl.util.lang
import llnl.util.tty.color
import spack.deptypes as dt
import spack.fetch_strategy
import spack.package_base
import spack.patch
import spack.spec
import spack.util.crypto
import spack.variant
from spack.dependency import Dependency
from spack.directives_meta import DirectiveError, DirectiveMeta
from spack.fetch_strategy import from_kwargs
from spack.resource import Resource
from spack.version import (
GitVersion,
@@ -57,8 +56,13 @@ class OpenMpi(Package):
VersionLookupError,
)
if TYPE_CHECKING:
import spack.package_base
__all__ = [
"DirectiveError",
"DirectiveMeta",
"DisableRedistribute",
"version",
"conditional",
"conflicts",
@@ -81,15 +85,15 @@ class OpenMpi(Package):
SpecType = str
DepType = Union[Tuple[str, ...], str]
WhenType = Optional[Union[spack.spec.Spec, str, bool]]
Patcher = Callable[[Union[Type[spack.package_base.PackageBase], Dependency]], None]
PatchesType = Union[Patcher, str, List[Union[Patcher, str]]]
WhenType = Optional[Union["spack.spec.Spec", str, bool]]
Patcher = Callable[[Union["spack.package_base.PackageBase", Dependency]], None]
PatchesType = Optional[Union[Patcher, str, List[Union[Patcher, str]]]]
SUPPORTED_LANGUAGES = ("fortran", "cxx", "c")
def _make_when_spec(value: WhenType) -> Optional[spack.spec.Spec]:
def _make_when_spec(value: WhenType) -> Optional["spack.spec.Spec"]:
"""Create a ``Spec`` that indicates when a directive should be applied.
Directives with ``when`` specs, e.g.:
@@ -134,7 +138,7 @@ def _make_when_spec(value: WhenType) -> Optional[spack.spec.Spec]:
return spack.spec.Spec(value)
SubmoduleCallback = Callable[[spack.package_base.PackageBase], Union[str, List[str], bool]]
SubmoduleCallback = Callable[["spack.package_base.PackageBase"], Union[str, List[str], bool]]
directive = DirectiveMeta.directive
@@ -219,7 +223,7 @@ def version(
return lambda pkg: _execute_version(pkg, ver, **kwargs)
def _execute_version(pkg: Type[spack.package_base.PackageBase], ver: Union[str, int], **kwargs):
def _execute_version(pkg, ver, **kwargs):
if (
(any(s in kwargs for s in spack.util.crypto.hashes) or "checksum" in kwargs)
and hasattr(pkg, "has_code")
@@ -250,12 +254,12 @@ def _execute_version(pkg: Type[spack.package_base.PackageBase], ver: Union[str,
def _depends_on(
pkg: Type[spack.package_base.PackageBase],
spec: spack.spec.Spec,
pkg: "spack.package_base.PackageBase",
spec: "spack.spec.Spec",
*,
when: WhenType = None,
type: DepType = dt.DEFAULT_TYPES,
patches: Optional[PatchesType] = None,
patches: PatchesType = None,
):
when_spec = _make_when_spec(when)
if not when_spec:
@@ -330,7 +334,7 @@ def conflicts(conflict_spec: SpecType, when: WhenType = None, msg: Optional[str]
msg (str): optional user defined message
"""
def _execute_conflicts(pkg: Type[spack.package_base.PackageBase]):
def _execute_conflicts(pkg: "spack.package_base.PackageBase"):
# If when is not specified the conflict always holds
when_spec = _make_when_spec(when)
if not when_spec:
@@ -349,7 +353,7 @@ def depends_on(
spec: SpecType,
when: WhenType = None,
type: DepType = dt.DEFAULT_TYPES,
patches: Optional[PatchesType] = None,
patches: PatchesType = None,
):
"""Creates a dict of deps with specs defining when they apply.
@@ -371,16 +375,21 @@ def depends_on(
assert type == "build", "languages must be of 'build' type"
return _language(lang_spec_str=spec, when=when)
def _execute_depends_on(pkg: Type[spack.package_base.PackageBase]):
def _execute_depends_on(pkg: "spack.package_base.PackageBase"):
_depends_on(pkg, dep_spec, when=when, type=type, patches=patches)
return _execute_depends_on
#: Store whether a given Spec source/binary should not be redistributed.
class DisableRedistribute:
def __init__(self, source, binary):
self.source = source
self.binary = binary
@directive("disable_redistribute")
def redistribute(
source: Optional[bool] = None, binary: Optional[bool] = None, when: WhenType = None
):
def redistribute(source=None, binary=None, when: WhenType = None):
"""Can be used inside a Package definition to declare that
the package source and/or compiled binaries should not be
redistributed.
@@ -395,10 +404,7 @@ def redistribute(
def _execute_redistribute(
pkg: Type[spack.package_base.PackageBase],
source: Optional[bool],
binary: Optional[bool],
when: WhenType,
pkg: "spack.package_base.PackageBase", source=None, binary=None, when: WhenType = None
):
if source is None and binary is None:
return
@@ -428,7 +434,7 @@ def _execute_redistribute(
if not binary:
disable.binary = True
else:
pkg.disable_redistribute[when_spec] = spack.package_base.DisableRedistribute(
pkg.disable_redistribute[when_spec] = DisableRedistribute(
source=not source, binary=not binary
)
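A hedged sketch of the directive in a package recipe (package and variant names are invented):

```python
class ExamplePackage(Package):
    """Illustrative only: keep sources and binaries of the hypothetical
    +proprietary variant out of mirrors and build caches."""

    redistribute(source=False, binary=False, when="+proprietary")
```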
@@ -474,7 +480,9 @@ def provides(*specs: SpecType, when: WhenType = None):
when: condition when this provides clause needs to be considered
"""
def _execute_provides(pkg: Type[spack.package_base.PackageBase]):
def _execute_provides(pkg: "spack.package_base.PackageBase"):
import spack.parser # Avoid circular dependency
when_spec = _make_when_spec(when)
if not when_spec:
return
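For context, a typical use of ``provides`` looks roughly like this (version numbers are illustrative):

```python
class Openblas(Package):
    provides("blas", "lapack")                     # satisfies both virtuals
    provides("lapack@3.9.1:", when="@0.3.9:")      # virtual version tied to the package version
```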
@@ -520,7 +528,7 @@ def can_splice(
variants will be skipped by '*'.
"""
def _execute_can_splice(pkg: Type[spack.package_base.PackageBase]):
def _execute_can_splice(pkg: "spack.package_base.PackageBase"):
when_spec = _make_when_spec(when)
if isinstance(match_variants, str) and match_variants != "*":
raise ValueError(
@@ -561,10 +569,10 @@ def patch(
compressed URL patches)
"""
def _execute_patch(
pkg_or_dep: Union[Type[spack.package_base.PackageBase], Dependency]
) -> None:
pkg = pkg_or_dep.pkg if isinstance(pkg_or_dep, Dependency) else pkg_or_dep
def _execute_patch(pkg_or_dep: Union["spack.package_base.PackageBase", Dependency]):
pkg = pkg_or_dep
if isinstance(pkg, Dependency):
pkg = pkg.pkg
if hasattr(pkg, "has_code") and not pkg.has_code:
raise UnsupportedPackageDirective(
@@ -738,55 +746,58 @@ def _execute_variant(pkg):
@directive("resources")
def resource(
*,
name: Optional[str] = None,
destination: str = "",
placement: Optional[str] = None,
when: WhenType = None,
# additional kwargs are as for `version()`
**kwargs,
):
"""Define an external resource to be fetched and staged when building the package.
Based on the keywords present in the dictionary the appropriate FetchStrategy will
be used for the resource. Resources are fetched and staged in their own folder
inside the Spack stage area, and then moved into the stage area of the package that
needs them.
def resource(**kwargs):
"""Define an external resource to be fetched and staged when building the
package. Based on the keywords present in the dictionary the appropriate
FetchStrategy will be used for the resource. Resources are fetched and
staged in their own folder inside the Spack stage area, and then moved into
the stage area of the package that needs them.
Keyword Arguments:
name: name for the resource
when: condition defining when the resource is needed
destination: path, relative to the package stage area, to which resource should be moved
placement: optionally rename the expanded resource inside the destination directory
List of recognized keywords:
* 'when' : (optional) represents the condition upon which the resource is
needed
* 'destination' : (optional) path where to move the resource. This path
must be relative to the main package stage area.
* 'placement' : (optional) gives the possibility to fine tune how the
resource is moved into the main package stage area.
"""
def _execute_resource(pkg):
when = kwargs.get("when")
when_spec = _make_when_spec(when)
if not when_spec:
return
destination = kwargs.get("destination", "")
placement = kwargs.get("placement", None)
# Check if the path is relative
if os.path.isabs(destination):
msg = "The destination keyword of a resource directive can't be an absolute path.\n"
msg += f"\tdestination : '{destination}\n'"
raise RuntimeError(msg)
message = (
"The destination keyword of a resource directive " "can't be an absolute path.\n"
)
message += "\tdestination : '{dest}\n'".format(dest=destination)
raise RuntimeError(message)
# Check if the path falls within the main package stage area
test_path = "stage_folder_root"
# Normalized absolute path
normalized_destination = os.path.normpath(os.path.join(test_path, destination))
normalized_destination = os.path.normpath(
os.path.join(test_path, destination)
) # Normalized absolute path
if test_path not in normalized_destination:
msg = "Destination of a resource must be within the package stage directory.\n"
msg += f"\tdestination : '{destination}'\n"
raise RuntimeError(msg)
message = (
"The destination folder of a resource must fall "
"within the main package stage directory.\n"
)
message += "\tdestination : '{dest}'\n".format(dest=destination)
raise RuntimeError(message)
resources = pkg.resources.setdefault(when_spec, [])
resources.append(
Resource(name, spack.fetch_strategy.from_kwargs(**kwargs), destination, placement)
)
name = kwargs.get("name")
fetcher = from_kwargs(**kwargs)
resources.append(Resource(name, fetcher, destination, placement))
return _execute_resource
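The containment check above relies on ``os.path.normpath`` collapsing ``..`` components before the prefix test; a small stand-alone sketch:

```python
import os

root = "stage_folder_root"
os.path.normpath(os.path.join(root, "data/sub"))      # 'stage_folder_root/data/sub' -> accepted
os.path.normpath(os.path.join(root, "../elsewhere"))  # 'elsewhere' -> no stage root prefix, rejected
```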
@@ -818,9 +829,7 @@ def _execute_maintainer(pkg):
return _execute_maintainer
def _execute_license(
pkg: Type[spack.package_base.PackageBase], license_identifier: str, when: WhenType
):
def _execute_license(pkg, license_identifier: str, when):
# If when is not specified the license always holds
when_spec = _make_when_spec(when)
if not when_spec:
@@ -884,7 +893,7 @@ def requires(*requirement_specs: str, policy="one_of", when=None, msg=None):
msg: optional user defined message
"""
def _execute_requires(pkg: Type[spack.package_base.PackageBase]):
def _execute_requires(pkg: "spack.package_base.PackageBase"):
if policy not in ("one_of", "any_of"):
err_msg = (
f"the 'policy' argument of the 'requires' directive in {pkg.name} is set "
@@ -909,7 +918,7 @@ def _execute_requires(pkg: Type[spack.package_base.PackageBase]):
def _language(lang_spec_str: str, *, when: Optional[Union[str, bool]] = None):
"""Temporary implementation of language virtuals, until compilers are proper dependencies."""
def _execute_languages(pkg: Type[spack.package_base.PackageBase]):
def _execute_languages(pkg: "spack.package_base.PackageBase"):
when_spec = _make_when_spec(when)
if not when_spec:
return

View File

@@ -5,7 +5,7 @@
import collections.abc
import functools
from typing import Any, Callable, Dict, List, Optional, Sequence, Set, Type, Union
from typing import List, Set
import llnl.util.lang
@@ -25,13 +25,11 @@ class DirectiveMeta(type):
# Set of all known directives
_directive_dict_names: Set[str] = set()
_directives_to_be_executed: List[Callable] = []
_when_constraints_from_context: List[spack.spec.Spec] = []
_directives_to_be_executed: List[str] = []
_when_constraints_from_context: List[str] = []
_default_args: List[dict] = []
def __new__(
cls: Type["DirectiveMeta"], name: str, bases: tuple, attr_dict: dict
) -> "DirectiveMeta":
def __new__(cls, name, bases, attr_dict):
# Initialize the attribute containing the list of directives
# to be executed. Here we go reversed because we want to execute
# commands:
@@ -62,7 +60,7 @@ def __new__(
return super(DirectiveMeta, cls).__new__(cls, name, bases, attr_dict)
def __init__(cls: "DirectiveMeta", name: str, bases: tuple, attr_dict: dict):
def __init__(cls, name, bases, attr_dict):
# The instance is being initialized: if it is a package we must ensure
# that the directives are called to set it up.
@@ -83,27 +81,27 @@ def __init__(cls: "DirectiveMeta", name: str, bases: tuple, attr_dict: dict):
super(DirectiveMeta, cls).__init__(name, bases, attr_dict)
@staticmethod
def push_to_context(when_spec: spack.spec.Spec) -> None:
def push_to_context(when_spec):
"""Add a spec to the context constraints."""
DirectiveMeta._when_constraints_from_context.append(when_spec)
@staticmethod
def pop_from_context() -> spack.spec.Spec:
def pop_from_context():
"""Pop the last constraint from the context"""
return DirectiveMeta._when_constraints_from_context.pop()
@staticmethod
def push_default_args(default_args: Dict[str, Any]) -> None:
def push_default_args(default_args):
"""Push default arguments"""
DirectiveMeta._default_args.append(default_args)
@staticmethod
def pop_default_args() -> dict:
def pop_default_args():
"""Pop default arguments"""
return DirectiveMeta._default_args.pop()
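A hedged, hand-written sketch of the stack discipline these helpers implement; in real recipes the pushes and pops are normally driven by the ``when``/``default_args`` context managers, and the exact folding of the pushed values into each directive is assumed here:

```python
class Libfoo(Package):
    # roughly equivalent to: with when("+mpi"), default_args(type="build"): ...
    DirectiveMeta.push_to_context(spack.spec.Spec("+mpi"))
    DirectiveMeta.push_default_args({"type": "build"})
    depends_on("cmake@3.20:")    # picks up type="build" and the +mpi when-constraint
    DirectiveMeta.pop_default_args()
    DirectiveMeta.pop_from_context()
```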
@staticmethod
def directive(dicts: Optional[Union[Sequence[str], str]] = None) -> Callable:
def directive(dicts=None):
"""Decorator for Spack directives.
Spack directives allow you to modify a package while it is being
@@ -158,7 +156,7 @@ class Foo(Package):
DirectiveMeta._directive_dict_names |= set(dicts)
# This decorator just returns the directive functions
def _decorator(decorated_function: Callable) -> Callable:
def _decorator(decorated_function):
directive_names.append(decorated_function.__name__)
@functools.wraps(decorated_function)

View File

@@ -1,15 +0,0 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Enumerations used throughout Spack"""
import enum
class InstallRecordStatus(enum.Flag):
"""Enum flag to facilitate querying status from the DB"""
INSTALLED = enum.auto()
DEPRECATED = enum.auto()
MISSING = enum.auto()
ANY = INSTALLED | DEPRECATED | MISSING
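Since this is an ``enum.Flag``, database queries can combine members with bitwise operators; for example:

```python
status = InstallRecordStatus.INSTALLED | InstallRecordStatus.DEPRECATED
assert status & InstallRecordStatus.INSTALLED            # included in the query
assert not (status & InstallRecordStatus.MISSING)        # not requested
assert InstallRecordStatus.ANY & InstallRecordStatus.MISSING
```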

View File

@@ -192,10 +192,3 @@ def __reduce__(self):
def _make_stop_phase(msg, long_msg):
return StopPhase(msg, long_msg)
class MirrorError(SpackError):
"""Superclass of all mirror-creation related errors."""
def __init__(self, msg, long_msg=None):
super().__init__(msg, long_msg)

View File

@@ -5,6 +5,7 @@
"""Service functions and classes to implement the hooks
for Spack's command extensions.
"""
import difflib
import glob
import importlib
import os
@@ -16,6 +17,7 @@
import llnl.util.lang
import spack.cmd
import spack.config
import spack.error
import spack.util.path
@@ -23,6 +25,9 @@
_extension_regexp = re.compile(r"spack-(\w[-\w]*)$")
# TODO: For consistency we should use spack.cmd.python_name(), but
# currently this would create a circular relationship between
# spack.cmd and spack.extensions.
def _python_name(cmd_name):
return cmd_name.replace("-", "_")
@@ -206,7 +211,8 @@ def get_module(cmd_name):
module = load_command_extension(cmd_name, folder)
if module:
return module
return None
else:
raise CommandNotFoundError(cmd_name)
def get_template_dirs():
@@ -218,6 +224,27 @@ def get_template_dirs():
return extensions
class CommandNotFoundError(spack.error.SpackError):
"""Exception class thrown when a requested command is not recognized as
such.
"""
def __init__(self, cmd_name):
msg = (
"{0} is not a recognized Spack command or extension command;"
" check with `spack commands`.".format(cmd_name)
)
long_msg = None
similar = difflib.get_close_matches(cmd_name, spack.cmd.all_commands())
if 1 <= len(similar) <= 5:
long_msg = "\nDid you mean one of the following commands?\n "
long_msg += "\n ".join(similar)
super().__init__(msg, long_msg)
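The suggestions come straight from ``difflib.get_close_matches``; with an abbreviated command list the behavior looks like this:

```python
import difflib

difflib.get_close_matches("instal", ["install", "info", "uninstall"])
# ['install', 'uninstall']  -> rendered as "Did you mean one of the following commands?"
```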
class ExtensionNamingError(spack.error.SpackError):
"""Exception class thrown when a configured extension does not follow
the expected naming convention.

View File

@@ -325,7 +325,12 @@ def write(self, spec, color=None, out=None):
self._out = llnl.util.tty.color.ColorStream(out, color=color)
# We'll traverse the spec in topological order as we graph it.
nodes_in_topological_order = list(spec.traverse(order="topo", deptype=self.depflag))
nodes_in_topological_order = [
edge.spec
for edge in spack.traverse.traverse_edges_topo(
[spec], direction="children", deptype=self.depflag
)
]
nodes_in_topological_order.reverse()
# Work on a copy to be nondestructive

View File

@@ -6,7 +6,7 @@
import llnl.util.tty as tty
import spack.binary_distribution as bindist
import spack.mirrors.mirror
import spack.mirror
def post_install(spec, explicit):
@@ -22,7 +22,7 @@ def post_install(spec, explicit):
return
# Push the package to all autopush mirrors
for mirror in spack.mirrors.mirror.MirrorCollection(binary=True, autopush=True).values():
for mirror in spack.mirror.MirrorCollection(binary=True, autopush=True).values():
signing_key = bindist.select_signing_key() if mirror.signed else None
with bindist.make_uploader(mirror=mirror, force=True, signing_key=signing_key) as uploader:
uploader.push_or_raise([spec])

View File

@@ -23,6 +23,7 @@
from llnl.util.tty.color import colorize
import spack.build_environment
import spack.builder
import spack.config
import spack.error
import spack.package_base
@@ -352,7 +353,9 @@ def status(self, name: str, status: "TestStatus", msg: Optional[str] = None):
self.test_parts[part_name] = status
self.counts[status] += 1
def phase_tests(self, builder, phase_name: str, method_names: List[str]):
def phase_tests(
self, builder: spack.builder.Builder, phase_name: str, method_names: List[str]
):
"""Execute the builder's package phase-time tests.
Args:
@@ -375,16 +378,23 @@ def phase_tests(self, builder, phase_name: str, method_names: List[str]):
for name in method_names:
try:
fn = getattr(builder, name, None) or getattr(builder.pkg, name)
# Prefer the method in the package over the builder's.
# We need this primarily to pick up arbitrarily named test
# methods but also some build-time checks.
fn = getattr(builder.pkg, name, getattr(builder, name))
msg = f"RUN-TESTS: {phase_name}-time tests [{name}]"
print_message(logger, msg, verbose)
fn()
except AttributeError as e:
print_message(logger, f"RUN-TESTS: method not implemented [{name}]", verbose)
self.add_failure(e, f"RUN-TESTS: method not implemented [{name}]")
msg = f"RUN-TESTS: method not implemented [{name}]"
print_message(logger, msg, verbose)
self.add_failure(e, msg)
if fail_fast:
break
continue
print_message(logger, f"RUN-TESTS: {phase_name}-time tests [{name}]", verbose)
fn()
if have_tests:
print_message(logger, "Completed testing", verbose)
@@ -754,7 +764,7 @@ def virtuals(pkg):
# hack for compilers that are not dependencies (yet)
# TODO: this all eventually goes away
c_names = ("gcc", "intel", "intel-parallel-studio")
c_names = ("gcc", "intel", "intel-parallel-studio", "pgi")
if pkg.name in c_names:
v_names.extend(["c", "cxx", "fortran"])
if pkg.spec.satisfies("llvm+clang"):

View File

@@ -50,13 +50,12 @@
import spack.binary_distribution as binary_distribution
import spack.build_environment
import spack.builder
import spack.config
import spack.database
import spack.deptypes as dt
import spack.error
import spack.hooks
import spack.mirrors.mirror
import spack.mirror
import spack.package_base
import spack.package_prefs as prefs
import spack.repo
@@ -213,7 +212,7 @@ def _check_last_phase(pkg: "spack.package_base.PackageBase") -> None:
Raises:
``BadInstallPhase`` if stop_before or last phase is invalid
"""
phases = spack.builder.create(pkg).phases # type: ignore[attr-defined]
phases = pkg.builder.phases # type: ignore[attr-defined]
if pkg.stop_before_phase and pkg.stop_before_phase not in phases: # type: ignore[attr-defined]
raise BadInstallPhase(pkg.name, pkg.stop_before_phase) # type: ignore[attr-defined]
@@ -491,7 +490,7 @@ def _try_install_from_binary_cache(
timer: timer to keep track of binary install phases.
"""
# Early exit if no binary mirrors are configured.
if not spack.mirrors.mirror.MirrorCollection(binary=True):
if not spack.mirror.MirrorCollection(binary=True):
return False
tty.debug(f"Searching for binary cache of {package_id(pkg.spec)}")
@@ -662,7 +661,7 @@ def log(pkg: "spack.package_base.PackageBase") -> None:
spack.store.STORE.layout.metadata_path(pkg.spec), "archived-files"
)
for glob_expr in spack.builder.create(pkg).archive_files:
for glob_expr in pkg.builder.archive_files:
# Check that we are trying to copy things that are
# in the stage tree (not arbitrary files)
abs_expr = os.path.realpath(glob_expr)
@@ -2395,6 +2394,7 @@ def _install_source(self) -> None:
fs.install_tree(pkg.stage.source_path, src_target)
def _real_install(self) -> None:
import spack.builder
pkg = self.pkg

View File

@@ -2,20 +2,41 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""
This file contains code for creating spack mirror directories. A
mirror is an organized hierarchy containing specially named archive
files. This enables Spack to know where to find files in a mirror if
the main server for a particular package is down. Or, if the computer
where Spack is run is not connected to the internet, it allows Spack
to download packages directly from a mirror (e.g., on an intranet).
"""
import collections
import collections.abc
import operator
import os
import os.path
import sys
import traceback
import urllib.parse
from typing import Any, Dict, Optional, Tuple, Union
import llnl.url
import llnl.util.symlink
import llnl.util.tty as tty
from llnl.util.filesystem import mkdirp
import spack.caches
import spack.config
import spack.error
import spack.fetch_strategy
import spack.oci.image
import spack.repo
import spack.spec
import spack.util.path
import spack.util.spack_json as sjson
import spack.util.spack_yaml as syaml
import spack.util.url as url_util
from spack.error import MirrorError
import spack.version
#: What schemes do we support
supported_url_schemes = ("file", "http", "https", "sftp", "ftp", "s3", "gs", "oci")
@@ -468,3 +489,380 @@ def __iter__(self):
def __len__(self):
return len(self._mirrors)
def _determine_extension(fetcher):
if isinstance(fetcher, spack.fetch_strategy.URLFetchStrategy):
if fetcher.expand_archive:
# If we fetch with a URLFetchStrategy, use URL's archive type
ext = llnl.url.determine_url_file_extension(fetcher.url)
if ext:
# Remove any leading dots
ext = ext.lstrip(".")
else:
msg = """\
Unable to parse extension from {0}.
If this URL is for a tarball but does not include the file extension
in the name, you can explicitly declare it with the following syntax:
version('1.2.3', 'hash', extension='tar.gz')
If this URL is for a download like a .jar or .whl that does not need
to be expanded, or an uncompressed installation script, you can tell
Spack not to expand it with the following syntax:
version('1.2.3', 'hash', expand=False)
"""
raise MirrorError(msg.format(fetcher.url))
else:
# If the archive shouldn't be expanded, don't check extension.
ext = None
else:
# Otherwise we'll make a .tar.gz ourselves
ext = "tar.gz"
return ext
class MirrorLayout:
"""A ``MirrorLayout`` object describes the relative path of a mirror entry."""
def __init__(self, path: str) -> None:
self.path = path
def __iter__(self):
"""Yield all paths including aliases where the resource can be found."""
yield self.path
def make_alias(self, root: str) -> None:
"""Make the entry ``root / self.path`` available under a human readable alias"""
pass
class DefaultLayout(MirrorLayout):
def __init__(self, alias_path: str, digest_path: Optional[str] = None) -> None:
# When we have a digest, it is used as the primary storage location. If not, then we use
# the human-readable alias. In case of mirrors of a VCS checkout, we currently do not have
a digest; that's why an alias is required and a digest optional.
super().__init__(path=digest_path or alias_path)
self.alias = alias_path
self.digest_path = digest_path
def make_alias(self, root: str) -> None:
"""Symlink a human readible path in our mirror to the actual storage location."""
# We already use the human-readable path as the main storage location.
if not self.digest_path:
return
alias, digest = os.path.join(root, self.alias), os.path.join(root, self.digest_path)
alias_dir = os.path.dirname(alias)
relative_dst = os.path.relpath(digest, start=alias_dir)
mkdirp(alias_dir)
tmp = f"{alias}.tmp"
llnl.util.symlink.symlink(relative_dst, tmp)
try:
os.rename(tmp, alias)
except OSError:
# Clean up the temporary if possible
try:
os.unlink(tmp)
except OSError:
pass
raise
def __iter__(self):
if self.digest_path:
yield self.digest_path
yield self.alias
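A small illustration of the two paths a ``DefaultLayout`` can expose (values are made up):

```python
layout = DefaultLayout(
    alias_path="zlib/zlib-1.3.1.tar.gz",                     # human-readable alias
    digest_path="_source-cache/archive/de/deadbeef.tar.gz",  # made-up digest path
)
layout.path    # the digest path is the primary storage location
list(layout)   # digest path first, then the alias
```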
class OCILayout(MirrorLayout):
"""Follow the OCI Image Layout Specification to archive blobs where paths are of the form
``blobs/<algorithm>/<digest>``"""
def __init__(self, digest: spack.oci.image.Digest) -> None:
super().__init__(os.path.join("blobs", digest.algorithm, digest.digest))
def default_mirror_layout(
fetcher: "spack.fetch_strategy.FetchStrategy",
per_package_ref: str,
spec: Optional["spack.spec.Spec"] = None,
) -> MirrorLayout:
"""Returns a ``MirrorReference`` object which keeps track of the relative
storage path of the resource associated with the specified ``fetcher``."""
ext = None
if spec:
pkg_cls = spack.repo.PATH.get_pkg_class(spec.name)
versions = pkg_cls.versions.get(spec.version, {})
ext = versions.get("extension", None)
# If the spec does not explicitly specify an extension (the default case),
# then try to determine it automatically. An extension can only be
# specified for the primary source of the package (e.g. the source code
# identified in the 'version' declaration). Resources/patches don't have
# an option to specify an extension, so it must be inferred for those.
ext = ext or _determine_extension(fetcher)
if ext:
per_package_ref += ".%s" % ext
global_ref = fetcher.mirror_id()
if global_ref:
global_ref = os.path.join("_source-cache", global_ref)
if global_ref and ext:
global_ref += ".%s" % ext
return DefaultLayout(per_package_ref, global_ref)
def get_all_versions(specs):
"""Given a set of initial specs, return a new set of specs that includes
each version of each package in the original set.
Note that if any spec in the original set specifies properties other than
version, this information will be omitted in the new set; for example, the
new set of specs will not include variant settings.
"""
version_specs = []
for spec in specs:
pkg_cls = spack.repo.PATH.get_pkg_class(spec.name)
# Skip any package that has no known versions.
if not pkg_cls.versions:
tty.msg("No safe (checksummed) versions for package %s" % pkg_cls.name)
continue
for version in pkg_cls.versions:
version_spec = spack.spec.Spec(pkg_cls.name)
version_spec.versions = spack.version.VersionList([version])
version_specs.append(version_spec)
return version_specs
def get_matching_versions(specs, num_versions=1):
"""Get a spec for EACH known version matching any spec in the list.
For concrete specs, this retrieves the concrete version and, if more
than one version per spec is requested, retrieves the latest versions
of the package.
"""
matching = []
for spec in specs:
pkg = spec.package
# Skip any package that has no known versions.
if not pkg.versions:
tty.msg("No safe (checksummed) versions for package %s" % pkg.name)
continue
pkg_versions = num_versions
version_order = list(reversed(sorted(pkg.versions)))
matching_spec = []
if spec.concrete:
matching_spec.append(spec)
pkg_versions -= 1
if spec.version in version_order:
version_order.remove(spec.version)
for v in version_order:
# Generate no more than num_versions versions for each spec.
if pkg_versions < 1:
break
# Generate only versions that satisfy the spec.
if spec.concrete or v.intersects(spec.versions):
s = spack.spec.Spec(pkg.name)
s.versions = spack.version.VersionList([v])
s.variants = spec.variants.copy()
# This is needed to avoid hanging references during the
# concretization phase
s.variants.spec = s
matching_spec.append(s)
pkg_versions -= 1
if not matching_spec:
tty.warn("No known version matches spec: %s" % spec)
matching.extend(matching_spec)
return matching
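Illustrative behavior, assuming a hypothetical package ``foo`` whose known versions are 1.0, 1.1 and 1.2:

```python
get_matching_versions([spack.spec.Spec("foo@1.1:")], num_versions=2)
# -> [Spec("foo@1.2"), Spec("foo@1.1")]   newest first, only versions satisfying @1.1:
```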
def create(path, specs, skip_unstable_versions=False):
"""Create a directory to be used as a spack mirror, and fill it with
package archives.
Arguments:
path: Path to create a mirror directory hierarchy in.
specs: Any package versions matching these specs will be added \
to the mirror.
skip_unstable_versions: if true, this skips adding resources when
they do not have a stable archive checksum (as determined by
``fetch_strategy.stable_target``)
Return Value:
Returns a tuple of lists: (present, mirrored, error)
* present: Package specs that were already present.
* mirrored: Package specs that were successfully mirrored.
* error: Package specs that failed to mirror due to some error.
"""
# automatically spec-ify anything in the specs array.
specs = [s if isinstance(s, spack.spec.Spec) else spack.spec.Spec(s) for s in specs]
mirror_cache, mirror_stats = mirror_cache_and_stats(path, skip_unstable_versions)
for spec in specs:
mirror_stats.next_spec(spec)
create_mirror_from_package_object(spec.package, mirror_cache, mirror_stats)
return mirror_stats.stats()
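A hedged usage sketch (the mirror path and package names are examples):

```python
import spack.mirror
import spack.spec

present, mirrored, errors = spack.mirror.create(
    "/tmp/my-mirror", [spack.spec.Spec("zlib"), spack.spec.Spec("bzip2@1.0.8")]
)
```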
def mirror_cache_and_stats(path, skip_unstable_versions=False):
"""Return both a mirror cache and a mirror stats, starting from the path
where a mirror ought to be created.
Args:
path (str): path to create a mirror directory hierarchy in.
skip_unstable_versions: if true, this skips adding resources when
they do not have a stable archive checksum (as determined by
``fetch_strategy.stable_target``)
"""
# Get the absolute path of the root before we start jumping around.
if not os.path.isdir(path):
try:
mkdirp(path)
except OSError as e:
raise MirrorError("Cannot create directory '%s':" % path, str(e))
mirror_cache = spack.caches.MirrorCache(path, skip_unstable_versions=skip_unstable_versions)
mirror_stats = MirrorStats()
return mirror_cache, mirror_stats
def add(mirror: Mirror, scope=None):
"""Add a named mirror in the given scope"""
mirrors = spack.config.get("mirrors", scope=scope)
if not mirrors:
mirrors = syaml.syaml_dict()
if mirror.name in mirrors:
tty.die("Mirror with name {} already exists.".format(mirror.name))
items = [(n, u) for n, u in mirrors.items()]
items.insert(0, (mirror.name, mirror.to_dict()))
mirrors = syaml.syaml_dict(items)
spack.config.set("mirrors", mirrors, scope=scope)
def remove(name, scope):
"""Remove the named mirror in the given scope"""
mirrors = spack.config.get("mirrors", scope=scope)
if not mirrors:
mirrors = syaml.syaml_dict()
if name not in mirrors:
tty.die("No mirror with name %s" % name)
mirrors.pop(name)
spack.config.set("mirrors", mirrors, scope=scope)
tty.msg("Removed mirror %s." % name)
class MirrorStats:
def __init__(self):
self.present = {}
self.new = {}
self.errors = set()
self.current_spec = None
self.added_resources = set()
self.existing_resources = set()
def next_spec(self, spec):
self._tally_current_spec()
self.current_spec = spec
def _tally_current_spec(self):
if self.current_spec:
if self.added_resources:
self.new[self.current_spec] = len(self.added_resources)
if self.existing_resources:
self.present[self.current_spec] = len(self.existing_resources)
self.added_resources = set()
self.existing_resources = set()
self.current_spec = None
def stats(self):
self._tally_current_spec()
return list(self.present), list(self.new), list(self.errors)
def already_existed(self, resource):
# If an error occurred after caching a subset of a spec's
# resources, a secondary attempt may consider them already added
if resource not in self.added_resources:
self.existing_resources.add(resource)
def added(self, resource):
self.added_resources.add(resource)
def error(self):
self.errors.add(self.current_spec)
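Roughly how the caching code drives this class (simplified; ``cache_all_resources`` is a hypothetical helper standing in for the stage's ``cache_mirror`` machinery):

```python
stats = MirrorStats()
for spec in specs:                         # iterable of specs to mirror (assumed)
    stats.next_spec(spec)
    try:
        cache_all_resources(spec, stats)   # calls stats.added()/stats.already_existed() per resource
    except Exception:
        stats.error()
present, new, errors = stats.stats()
```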
def create_mirror_from_package_object(pkg_obj, mirror_cache, mirror_stats):
"""Add a single package object to a mirror.
The package object is only required to have an associated spec
with a concrete version.
Args:
pkg_obj (spack.package_base.PackageBase): package object to be added.
mirror_cache (spack.caches.MirrorCache): mirror where to add the spec.
mirror_stats (spack.mirror.MirrorStats): statistics on the current mirror
Return:
True if the spec was added successfully, False otherwise
"""
tty.msg("Adding package {} to mirror".format(pkg_obj.spec.format("{name}{@version}")))
num_retries = 3
while num_retries > 0:
try:
# Includes patches and resources
with pkg_obj.stage as pkg_stage:
pkg_stage.cache_mirror(mirror_cache, mirror_stats)
exception = None
break
except Exception as e:
exc_tuple = sys.exc_info()
exception = e
num_retries -= 1
if exception:
if spack.config.get("config:debug"):
traceback.print_exception(file=sys.stderr, *exc_tuple)
else:
tty.warn(
"Error while fetching %s" % pkg_obj.spec.cformat("{name}{@version}"),
getattr(exception, "message", exception),
)
mirror_stats.error()
return False
return True
def require_mirror_name(mirror_name):
"""Find a mirror by name and raise if it does not exist"""
mirror = MirrorCollection().get(mirror_name)
if not mirror:
raise ValueError(f'no mirror named "{mirror_name}"')
return mirror
class MirrorError(spack.error.SpackError):
"""Superclass of all mirror-creation related errors."""
def __init__(self, msg, long_msg=None):
super().__init__(msg, long_msg)

View File

@@ -1,146 +0,0 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import os.path
from typing import Optional
import llnl.url
import llnl.util.symlink
from llnl.util.filesystem import mkdirp
import spack.fetch_strategy
import spack.oci.image
import spack.repo
import spack.spec
from spack.error import MirrorError
class MirrorLayout:
"""A ``MirrorLayout`` object describes the relative path of a mirror entry."""
def __init__(self, path: str) -> None:
self.path = path
def __iter__(self):
"""Yield all paths including aliases where the resource can be found."""
yield self.path
def make_alias(self, root: str) -> None:
"""Make the entry ``root / self.path`` available under a human readable alias"""
pass
class DefaultLayout(MirrorLayout):
def __init__(self, alias_path: str, digest_path: Optional[str] = None) -> None:
# When we have a digest, it is used as the primary storage location. If not, then we use
# the human-readable alias. In case of mirrors of a VCS checkout, we currently do not have
a digest; that's why an alias is required and a digest optional.
super().__init__(path=digest_path or alias_path)
self.alias = alias_path
self.digest_path = digest_path
def make_alias(self, root: str) -> None:
"""Symlink a human readible path in our mirror to the actual storage location."""
# We already use the human-readable path as the main storage location.
if not self.digest_path:
return
alias, digest = os.path.join(root, self.alias), os.path.join(root, self.digest_path)
alias_dir = os.path.dirname(alias)
relative_dst = os.path.relpath(digest, start=alias_dir)
mkdirp(alias_dir)
tmp = f"{alias}.tmp"
llnl.util.symlink.symlink(relative_dst, tmp)
try:
os.rename(tmp, alias)
except OSError:
# Clean up the temporary if possible
try:
os.unlink(tmp)
except OSError:
pass
raise
def __iter__(self):
if self.digest_path:
yield self.digest_path
yield self.alias
class OCILayout(MirrorLayout):
"""Follow the OCI Image Layout Specification to archive blobs where paths are of the form
``blobs/<algorithm>/<digest>``"""
def __init__(self, digest: spack.oci.image.Digest) -> None:
super().__init__(os.path.join("blobs", digest.algorithm, digest.digest))
def _determine_extension(fetcher):
if isinstance(fetcher, spack.fetch_strategy.URLFetchStrategy):
if fetcher.expand_archive:
# If we fetch with a URLFetchStrategy, use URL's archive type
ext = llnl.url.determine_url_file_extension(fetcher.url)
if ext:
# Remove any leading dots
ext = ext.lstrip(".")
else:
msg = """\
Unable to parse extension from {0}.
If this URL is for a tarball but does not include the file extension
in the name, you can explicitly declare it with the following syntax:
version('1.2.3', 'hash', extension='tar.gz')
If this URL is for a download like a .jar or .whl that does not need
to be expanded, or an uncompressed installation script, you can tell
Spack not to expand it with the following syntax:
version('1.2.3', 'hash', expand=False)
"""
raise MirrorError(msg.format(fetcher.url))
else:
# If the archive shouldn't be expanded, don't check extension.
ext = None
else:
# Otherwise we'll make a .tar.gz ourselves
ext = "tar.gz"
return ext
def default_mirror_layout(
fetcher: "spack.fetch_strategy.FetchStrategy",
per_package_ref: str,
spec: Optional["spack.spec.Spec"] = None,
) -> MirrorLayout:
"""Returns a ``MirrorReference`` object which keeps track of the relative
storage path of the resource associated with the specified ``fetcher``."""
ext = None
if spec:
pkg_cls = spack.repo.PATH.get_pkg_class(spec.name)
versions = pkg_cls.versions.get(spec.version, {})
ext = versions.get("extension", None)
# If the spec does not explicitly specify an extension (the default case),
# then try to determine it automatically. An extension can only be
# specified for the primary source of the package (e.g. the source code
# identified in the 'version' declaration). Resources/patches don't have
# an option to specify an extension, so it must be inferred for those.
ext = ext or _determine_extension(fetcher)
if ext:
per_package_ref += ".%s" % ext
global_ref = fetcher.mirror_id()
if global_ref:
global_ref = os.path.join("_source-cache", global_ref)
if global_ref and ext:
global_ref += ".%s" % ext
return DefaultLayout(per_package_ref, global_ref)

View File

@@ -1,262 +0,0 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import os.path
import sys
import traceback
import llnl.util.tty as tty
from llnl.util.filesystem import mkdirp
import spack.caches
import spack.config
import spack.error
import spack.repo
import spack.spec
import spack.util.spack_yaml as syaml
import spack.version
from spack.error import MirrorError
from spack.mirrors.mirror import Mirror, MirrorCollection
def get_all_versions(specs):
"""Given a set of initial specs, return a new set of specs that includes
each version of each package in the original set.
Note that if any spec in the original set specifies properties other than
version, this information will be omitted in the new set; for example, the
new set of specs will not include variant settings.
"""
version_specs = []
for spec in specs:
pkg_cls = spack.repo.PATH.get_pkg_class(spec.name)
# Skip any package that has no known versions.
if not pkg_cls.versions:
tty.msg("No safe (checksummed) versions for package %s" % pkg_cls.name)
continue
for version in pkg_cls.versions:
version_spec = spack.spec.Spec(pkg_cls.name)
version_spec.versions = spack.version.VersionList([version])
version_specs.append(version_spec)
return version_specs
def get_matching_versions(specs, num_versions=1):
"""Get a spec for EACH known version matching any spec in the list.
For concrete specs, this retrieves the concrete version and, if more
than one version per spec is requested, retrieves the latest versions
of the package.
"""
matching = []
for spec in specs:
pkg = spec.package
# Skip any package that has no known versions.
if not pkg.versions:
tty.msg("No safe (checksummed) versions for package %s" % pkg.name)
continue
pkg_versions = num_versions
version_order = list(reversed(sorted(pkg.versions)))
matching_spec = []
if spec.concrete:
matching_spec.append(spec)
pkg_versions -= 1
if spec.version in version_order:
version_order.remove(spec.version)
for v in version_order:
# Generate no more than num_versions versions for each spec.
if pkg_versions < 1:
break
# Generate only versions that satisfy the spec.
if spec.concrete or v.intersects(spec.versions):
s = spack.spec.Spec(pkg.name)
s.versions = spack.version.VersionList([v])
s.variants = spec.variants.copy()
# This is needed to avoid hanging references during the
# concretization phase
s.variants.spec = s
matching_spec.append(s)
pkg_versions -= 1
if not matching_spec:
tty.warn("No known version matches spec: %s" % spec)
matching.extend(matching_spec)
return matching
def create(path, specs, skip_unstable_versions=False):
"""Create a directory to be used as a spack mirror, and fill it with
package archives.
Arguments:
path: Path to create a mirror directory hierarchy in.
specs: Any package versions matching these specs will be added \
to the mirror.
skip_unstable_versions: if true, this skips adding resources when
they do not have a stable archive checksum (as determined by
``fetch_strategy.stable_target``)
Return Value:
Returns a tuple of lists: (present, mirrored, error)
* present: Package specs that were already present.
* mirrored: Package specs that were successfully mirrored.
* error: Package specs that failed to mirror due to some error.
"""
# automatically spec-ify anything in the specs array.
specs = [s if isinstance(s, spack.spec.Spec) else spack.spec.Spec(s) for s in specs]
mirror_cache, mirror_stats = mirror_cache_and_stats(path, skip_unstable_versions)
for spec in specs:
mirror_stats.next_spec(spec)
create_mirror_from_package_object(spec.package, mirror_cache, mirror_stats)
return mirror_stats.stats()
def mirror_cache_and_stats(path, skip_unstable_versions=False):
"""Return both a mirror cache and a mirror stats, starting from the path
where a mirror ought to be created.
Args:
path (str): path to create a mirror directory hierarchy in.
skip_unstable_versions: if true, this skips adding resources when
they do not have a stable archive checksum (as determined by
``fetch_strategy.stable_target``)
"""
# Get the absolute path of the root before we start jumping around.
if not os.path.isdir(path):
try:
mkdirp(path)
except OSError as e:
raise MirrorError("Cannot create directory '%s':" % path, str(e))
mirror_cache = spack.caches.MirrorCache(path, skip_unstable_versions=skip_unstable_versions)
mirror_stats = MirrorStats()
return mirror_cache, mirror_stats
def add(mirror: Mirror, scope=None):
"""Add a named mirror in the given scope"""
mirrors = spack.config.get("mirrors", scope=scope)
if not mirrors:
mirrors = syaml.syaml_dict()
if mirror.name in mirrors:
tty.die("Mirror with name {} already exists.".format(mirror.name))
items = [(n, u) for n, u in mirrors.items()]
items.insert(0, (mirror.name, mirror.to_dict()))
mirrors = syaml.syaml_dict(items)
spack.config.set("mirrors", mirrors, scope=scope)
def remove(name, scope):
"""Remove the named mirror in the given scope"""
mirrors = spack.config.get("mirrors", scope=scope)
if not mirrors:
mirrors = syaml.syaml_dict()
if name not in mirrors:
tty.die("No mirror with name %s" % name)
mirrors.pop(name)
spack.config.set("mirrors", mirrors, scope=scope)
tty.msg("Removed mirror %s." % name)
class MirrorStats:
def __init__(self):
self.present = {}
self.new = {}
self.errors = set()
self.current_spec = None
self.added_resources = set()
self.existing_resources = set()
def next_spec(self, spec):
self._tally_current_spec()
self.current_spec = spec
def _tally_current_spec(self):
if self.current_spec:
if self.added_resources:
self.new[self.current_spec] = len(self.added_resources)
if self.existing_resources:
self.present[self.current_spec] = len(self.existing_resources)
self.added_resources = set()
self.existing_resources = set()
self.current_spec = None
def stats(self):
self._tally_current_spec()
return list(self.present), list(self.new), list(self.errors)
def already_existed(self, resource):
# If an error occurred after caching a subset of a spec's
# resources, a secondary attempt may consider them already added
if resource not in self.added_resources:
self.existing_resources.add(resource)
def added(self, resource):
self.added_resources.add(resource)
def error(self):
self.errors.add(self.current_spec)
def create_mirror_from_package_object(pkg_obj, mirror_cache, mirror_stats):
"""Add a single package object to a mirror.
The package object is only required to have an associated spec
with a concrete version.
Args:
pkg_obj (spack.package_base.PackageBase): package object to be added.
mirror_cache (spack.caches.MirrorCache): mirror where to add the spec.
mirror_stats (spack.mirror.MirrorStats): statistics on the current mirror
Return:
True if the spec was added successfully, False otherwise
"""
tty.msg("Adding package {} to mirror".format(pkg_obj.spec.format("{name}{@version}")))
num_retries = 3
while num_retries > 0:
try:
# Includes patches and resources
with pkg_obj.stage as pkg_stage:
pkg_stage.cache_mirror(mirror_cache, mirror_stats)
exception = None
break
except Exception as e:
exc_tuple = sys.exc_info()
exception = e
num_retries -= 1
if exception:
if spack.config.get("config:debug"):
traceback.print_exception(file=sys.stderr, *exc_tuple)
else:
tty.warn(
"Error while fetching %s" % pkg_obj.spec.cformat("{name}{@version}"),
getattr(exception, "message", exception),
)
mirror_stats.error()
return False
return True
def require_mirror_name(mirror_name):
"""Find a mirror by name and raise if it does not exist"""
mirror = MirrorCollection().get(mirror_name)
if not mirror:
raise ValueError(f'no mirror named "{mirror_name}"')
return mirror

View File

@@ -10,7 +10,7 @@
import llnl.util.filesystem
import spack.phase_callbacks
import spack.builder
def filter_compiler_wrappers(*files, **kwargs):
@@ -111,4 +111,4 @@ def _filter_compiler_wrappers_impl(pkg_or_builder):
if pkg.compiler.name == "nag":
x.filter("-Wl,--enable-new-dtags", "", **filter_kwargs)
spack.phase_callbacks.run_after(after)(_filter_compiler_wrappers_impl)
spack.builder.run_after(after)(_filter_compiler_wrappers_impl)

View File

@@ -39,7 +39,7 @@
import llnl.util.filesystem
import llnl.util.tty as tty
from llnl.util.lang import Singleton, dedupe, memoized
from llnl.util.lang import dedupe, memoized
import spack.build_environment
import spack.config
@@ -246,7 +246,7 @@ def _generate_upstream_module_index():
return UpstreamModuleIndex(spack.store.STORE.db, module_indices)
upstream_module_index = Singleton(_generate_upstream_module_index)
upstream_module_index = llnl.util.lang.Singleton(_generate_upstream_module_index)
ModuleIndexEntry = collections.namedtuple("ModuleIndexEntry", ["path", "use_name"])

View File

@@ -16,8 +16,7 @@
import llnl.util.tty as tty
import spack.fetch_strategy
import spack.mirrors.layout
import spack.mirrors.mirror
import spack.mirror
import spack.oci.opener
import spack.stage
import spack.util.url
@@ -214,7 +213,7 @@ def upload_manifest(
return digest, size
def image_from_mirror(mirror: spack.mirrors.mirror.Mirror) -> ImageReference:
def image_from_mirror(mirror: spack.mirror.Mirror) -> ImageReference:
"""Given an OCI based mirror, extract the URL and image name from it"""
url = mirror.push_url
if not url.startswith("oci://"):
@@ -386,8 +385,5 @@ def make_stage(
# is the `oci-layout` and `index.json` files, which are
# required by the spec.
return spack.stage.Stage(
fetch_strategy,
mirror_paths=spack.mirrors.layout.OCILayout(digest),
name=digest.digest,
keep=keep,
fetch_strategy, mirror_paths=spack.mirror.OCILayout(digest), name=digest.digest, keep=keep
)

View File

@@ -20,7 +20,7 @@
import llnl.util.lang
import spack.config
import spack.mirrors.mirror
import spack.mirror
import spack.parser
import spack.util.web
@@ -367,11 +367,11 @@ def http_error_401(self, req: Request, fp, code, msg, headers):
def credentials_from_mirrors(
domain: str, *, mirrors: Optional[Iterable[spack.mirrors.mirror.Mirror]] = None
domain: str, *, mirrors: Optional[Iterable[spack.mirror.Mirror]] = None
) -> Optional[UsernamePassword]:
"""Filter out OCI registry credentials from a list of mirrors."""
mirrors = mirrors or spack.mirrors.mirror.MirrorCollection().values()
mirrors = mirrors or spack.mirror.MirrorCollection().values()
for mirror in mirrors:
# Prefer push credentials over fetch. Unlikely that those are different

View File

@@ -11,7 +11,7 @@
from os import chdir, environ, getcwd, makedirs, mkdir, remove, removedirs
from shutil import move, rmtree
from spack.error import InstallError, NoHeadersError, NoLibrariesError
from spack.error import InstallError
# Emulate some shell commands for convenience
env = environ
@@ -74,7 +74,7 @@
from spack.build_systems.sourceware import SourcewarePackage
from spack.build_systems.waf import WafPackage
from spack.build_systems.xorg import XorgPackage
from spack.builder import BaseBuilder
from spack.builder import run_after, run_before
from spack.config import determine_number_of_jobs
from spack.deptypes import ALL_TYPES as all_deptypes
from spack.directives import *
@@ -100,7 +100,6 @@
on_package_attributes,
)
from spack.package_completions import *
from spack.phase_callbacks import run_after, run_before
from spack.spec import InvalidSpecDetected, Spec
from spack.util.executable import *
from spack.util.filesystem import file_command, fix_darwin_install_name, mime_type

View File

@@ -32,19 +32,18 @@
from llnl.util.lang import classproperty, memoized
from llnl.util.link_tree import LinkTree
import spack.builder
import spack.compilers
import spack.config
import spack.dependency
import spack.deptypes as dt
import spack.directives_meta
import spack.directives
import spack.error
import spack.fetch_strategy as fs
import spack.hooks
import spack.mirrors.layout
import spack.mirrors.mirror
import spack.mirror
import spack.multimethod
import spack.patch
import spack.phase_callbacks
import spack.repo
import spack.spec
import spack.store
@@ -52,10 +51,9 @@
import spack.util.environment
import spack.util.path
import spack.util.web
import spack.variant
from spack.error import InstallError, NoURLError, PackageError
from spack.filesystem_view import YamlFilesystemView
from spack.resource import Resource
from spack.install_test import PackageTest, TestSuite
from spack.solver.version_order import concretization_version_order
from spack.stage import DevelopStage, ResourceStage, Stage, StageComposite, compute_stage_name
from spack.util.package_hash import package_hash
@@ -301,9 +299,9 @@ def determine_variants(cls, objs, version_str):
class PackageMeta(
spack.phase_callbacks.PhaseCallbacksMeta,
spack.builder.PhaseCallbacksMeta,
DetectablePackageMeta,
spack.directives_meta.DirectiveMeta,
spack.directives.DirectiveMeta,
spack.multimethod.MultiMethodMeta,
):
"""
@@ -455,7 +453,7 @@ def _names(when_indexed_dictionary: WhenDict) -> List[str]:
return sorted(all_names)
WhenVariantList = List[Tuple[spack.spec.Spec, spack.variant.Variant]]
WhenVariantList = List[Tuple["spack.spec.Spec", "spack.variant.Variant"]]
def _remove_overridden_vdefs(variant_defs: WhenVariantList) -> None:
@@ -494,14 +492,41 @@ class Hipblas:
i += 1
#: Store whether a given Spec source/binary should not be redistributed.
class DisableRedistribute:
def __init__(self, source, binary):
self.source = source
self.binary = binary
class RedistributionMixin:
"""Logic for determining whether a Package is source/binary
redistributable.
"""
#: Store whether a given Spec source/binary should not be
#: redistributed.
disable_redistribute: Dict["spack.spec.Spec", "spack.directives.DisableRedistribute"]
# Source redistribution must be determined before concretization
# (because source mirrors work with un-concretized Specs).
@classmethod
def redistribute_source(cls, spec):
"""Whether it should be possible to add the source of this
package to a Spack mirror.
"""
for when_spec, disable_redistribute in cls.disable_redistribute.items():
if disable_redistribute.source and spec.satisfies(when_spec):
return False
return True
@property
def redistribute_binary(self):
"""Whether it should be possible to create a binary out of an
installed instance of this package.
"""
for when_spec, disable_redistribute in self.__class__.disable_redistribute.items():
if disable_redistribute.binary and self.spec.satisfies(when_spec):
return False
return True
class PackageBase(WindowsRPath, PackageViewMixin, metaclass=PackageMeta):
class PackageBase(WindowsRPath, PackageViewMixin, RedistributionMixin, metaclass=PackageMeta):
"""This is the superclass for all spack packages.
***The Package class***
@@ -587,22 +612,17 @@ class PackageBase(WindowsRPath, PackageViewMixin, metaclass=PackageMeta):
# Declare versions dictionary as placeholder for values.
# This allows analysis tools to correctly interpret the class attributes.
versions: dict
resources: Dict[spack.spec.Spec, List[Resource]]
dependencies: Dict[spack.spec.Spec, Dict[str, spack.dependency.Dependency]]
conflicts: Dict[spack.spec.Spec, List[Tuple[spack.spec.Spec, Optional[str]]]]
dependencies: Dict["spack.spec.Spec", Dict[str, "spack.dependency.Dependency"]]
conflicts: Dict["spack.spec.Spec", List[Tuple["spack.spec.Spec", Optional[str]]]]
requirements: Dict[
spack.spec.Spec, List[Tuple[Tuple[spack.spec.Spec, ...], str, Optional[str]]]
"spack.spec.Spec", List[Tuple[Tuple["spack.spec.Spec", ...], str, Optional[str]]]
]
provided: Dict[spack.spec.Spec, Set[spack.spec.Spec]]
provided_together: Dict[spack.spec.Spec, List[Set[str]]]
patches: Dict[spack.spec.Spec, List[spack.patch.Patch]]
variants: Dict[spack.spec.Spec, Dict[str, spack.variant.Variant]]
languages: Dict[spack.spec.Spec, Set[str]]
licenses: Dict[spack.spec.Spec, str]
splice_specs: Dict[spack.spec.Spec, Tuple[spack.spec.Spec, Union[None, str, List[str]]]]
#: Store whether a given Spec source/binary should not be redistributed.
disable_redistribute: Dict[spack.spec.Spec, DisableRedistribute]
provided: Dict["spack.spec.Spec", Set["spack.spec.Spec"]]
provided_together: Dict["spack.spec.Spec", List[Set[str]]]
patches: Dict["spack.spec.Spec", List["spack.patch.Patch"]]
variants: Dict["spack.spec.Spec", Dict[str, "spack.variant.Variant"]]
languages: Dict["spack.spec.Spec", Set[str]]
splice_specs: Dict["spack.spec.Spec", Tuple["spack.spec.Spec", Union[None, str, List[str]]]]
#: By default, packages are not virtual
#: Virtual packages override this attribute
@@ -717,11 +737,11 @@ class PackageBase(WindowsRPath, PackageViewMixin, metaclass=PackageMeta):
test_requires_compiler: bool = False
#: TestSuite instance used to manage stand-alone tests for 1+ specs.
test_suite: Optional[Any] = None
test_suite: Optional["TestSuite"] = None
def __init__(self, spec):
# this determines how the package should be built.
self.spec: spack.spec.Spec = spec
self.spec: "spack.spec.Spec" = spec
# Allow custom staging paths for packages
self.path = None
@@ -739,7 +759,7 @@ def __init__(self, spec):
# init internal variables
self._stage: Optional[StageComposite] = None
self._fetcher = None
self._tester: Optional[Any] = None
self._tester: Optional["PackageTest"] = None
# Set up timing variables
self._fetch_time = 0.0
@@ -789,7 +809,9 @@ def variant_definitions(cls, name: str) -> WhenVariantList:
return defs
@classmethod
def variant_items(cls) -> Iterable[Tuple[spack.spec.Spec, Dict[str, spack.variant.Variant]]]:
def variant_items(
cls,
) -> Iterable[Tuple["spack.spec.Spec", Dict[str, "spack.variant.Variant"]]]:
"""Iterate over ``cls.variants.items()`` with overridden definitions removed."""
# Note: This is quadratic in the average number of variant definitions per name.
# That is likely close to linear in practice, as there are few variants with
@@ -807,7 +829,7 @@ def variant_items(cls) -> Iterable[Tuple[spack.spec.Spec, Dict[str, spack.varian
if filtered_variants_by_name:
yield when, filtered_variants_by_name
def get_variant(self, name: str) -> spack.variant.Variant:
def get_variant(self, name: str) -> "spack.variant.Variant":
"""Get the highest precedence variant definition matching this package's spec.
Arguments:
@@ -982,26 +1004,6 @@ def global_license_file(self):
self.global_license_dir, self.name, os.path.basename(self.license_files[0])
)
# Source redistribution must be determined before concretization (because source mirrors work
# with abstract specs).
@classmethod
def redistribute_source(cls, spec):
"""Whether it should be possible to add the source of this
package to a Spack mirror."""
for when_spec, disable_redistribute in cls.disable_redistribute.items():
if disable_redistribute.source and spec.satisfies(when_spec):
return False
return True
@property
def redistribute_binary(self):
"""Whether it should be possible to create a binary out of an installed instance of this
package."""
for when_spec, disable_redistribute in self.disable_redistribute.items():
if disable_redistribute.binary and self.spec.satisfies(when_spec):
return False
return True
# NOTE: return type should be Optional[Literal['all', 'specific', 'none']] in
# Python 3.8+, but we still support 3.6.
@property
@@ -1014,7 +1016,7 @@ def keep_werror(self) -> Optional[str]:
* ``"none"``: filter out all ``-Werror*`` flags.
* ``None``: respect the user's configuration (``"none"`` by default).
"""
if self.spec.satisfies("%nvhpc@:23.3"):
if self.spec.satisfies("%nvhpc@:23.3") or self.spec.satisfies("%pgi"):
# Filtering works by replacing -Werror with -Wno-error, but older nvhpc and
# PGI do not understand -Wno-error, so we disable filtering.
return "all"
@@ -1188,10 +1190,10 @@ def _make_resource_stage(self, root_stage, resource):
root=root_stage,
resource=resource,
name=self._resource_stage(resource),
mirror_paths=spack.mirrors.layout.default_mirror_layout(
mirror_paths=spack.mirror.default_mirror_layout(
resource.fetcher, os.path.join(self.name, pretty_resource_name)
),
mirrors=spack.mirrors.mirror.MirrorCollection(source=True).values(),
mirrors=spack.mirror.MirrorCollection(source=True).values(),
path=self.path,
)
@@ -1203,7 +1205,7 @@ def _make_root_stage(self, fetcher):
# Construct a mirror path (TODO: get this out of package.py)
format_string = "{name}-{version}"
pretty_name = self.spec.format_path(format_string)
mirror_paths = spack.mirrors.layout.default_mirror_layout(
mirror_paths = spack.mirror.default_mirror_layout(
fetcher, os.path.join(self.name, pretty_name), self.spec
)
# Construct a path where the stage should build..
@@ -1212,7 +1214,7 @@ def _make_root_stage(self, fetcher):
stage = Stage(
fetcher,
mirror_paths=mirror_paths,
mirrors=spack.mirrors.mirror.MirrorCollection(source=True).values(),
mirrors=spack.mirror.MirrorCollection(source=True).values(),
name=stage_name,
path=self.path,
search_fn=self._download_search,
@@ -1351,13 +1353,11 @@ def archive_install_test_log(self):
@property
def tester(self):
import spack.install_test
if not self.spec.versions.concrete:
raise ValueError("Cannot retrieve tester for package without concrete version.")
if not self._tester:
self._tester = spack.install_test.PackageTest(self)
self._tester = PackageTest(self)
return self._tester
@property
@@ -2014,58 +2014,72 @@ def build_system_flags(
"""
return None, None, flags
def setup_run_environment(self, env: spack.util.environment.EnvironmentModifications) -> None:
def setup_run_environment(self, env):
"""Sets up the run environment for a package.
Args:
env: environment modifications to be applied when the package is run. Package authors
env (spack.util.environment.EnvironmentModifications): environment
modifications to be applied when the package is run. Package authors
can call methods on it to alter the run environment.
"""
pass
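As an illustration, a package override might look like this (the environment variable name is made up):

```python
def setup_run_environment(self, env):
    # ``env`` records modifications; they are applied later, not here.
    env.set("LIBFOO_ROOT", self.prefix)
    env.prepend_path("PATH", self.prefix.bin)
```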
def setup_dependent_run_environment(
self, env: spack.util.environment.EnvironmentModifications, dependent_spec: spack.spec.Spec
) -> None:
def setup_dependent_run_environment(self, env, dependent_spec):
"""Sets up the run environment of packages that depend on this one.
This is similar to ``setup_run_environment``, but it is used to modify the run environment
of a package that *depends* on this one.
This is similar to ``setup_run_environment``, but it is used to
modify the run environments of packages that *depend* on this one.
This gives packages like Python and others that follow the extension model a way to
implement common environment or run-time settings for dependencies.
This gives packages like Python and others that follow the extension
model a way to implement common environment or run-time settings
for dependencies.
Args:
env: environment modifications to be applied when the dependent package is run. Package
authors can call methods on it to alter the build environment.
env (spack.util.environment.EnvironmentModifications): environment
modifications to be applied when the dependent package is run.
Package authors can call methods on it to alter the build environment.
dependent_spec: The spec of the dependent package about to be run. This allows the
extendee (self) to query the dependent's state. Note that *this* package's spec is
dependent_spec (spack.spec.Spec): The spec of the dependent package
about to be run. This allows the extendee (self) to query
the dependent's state. Note that *this* package's spec is
available as ``self.spec``
"""
pass
def setup_dependent_package(self, module, dependent_spec: spack.spec.Spec) -> None:
"""Set up module-scope global variables for dependent packages.
def setup_dependent_package(self, module, dependent_spec):
"""Set up Python module-scope variables for dependent packages.
This function is called when setting up the build and run environments of a DAG.
Called before the install() method of dependents.
Default implementation does nothing, but this can be
overridden by an extendable package to set up the module of
its extensions. This is useful if there are some common steps
to installing all extensions for a certain package.
Examples:
1. Extensions often need to invoke the ``python`` interpreter from the Python installation
being extended. This routine can put a ``python`` Executable as a global in the module
scope for the extension package to simplify extension installs.
1. Extensions often need to invoke the ``python`` interpreter
from the Python installation being extended. This routine
can put a ``python()`` Executable object in the module scope
for the extension package to simplify extension installs.
2. MPI compilers could set some variables in the dependent's scope that point to ``mpicc``,
``mpicxx``, etc., allowing them to be called by common name regardless of which MPI is
used.
2. MPI compilers could set some variables in the dependent's
scope that point to ``mpicc``, ``mpicxx``, etc., allowing
them to be called by common name regardless of which MPI is used.
3. BLAS/LAPACK implementations can set some variables
indicating the path to their libraries, since these
paths differ by BLAS/LAPACK implementation.
Args:
module: The Python ``module`` object of the dependent package. Packages can use this to
set module-scope variables for the dependent to use.
module (spack.package_base.PackageBase.module): The Python ``module``
object of the dependent package. Packages can use this to set
module-scope variables for the dependent to use.
dependent_spec: The spec of the dependent package about to be built. This allows the
extendee (self) to query the dependent's state. Note that *this* package's spec is
available as ``self.spec``.
dependent_spec (spack.spec.Spec): The spec of the dependent package
about to be built. This allows the extendee (self) to
query the dependent's state. Note that *this*
package's spec is available as ``self.spec``.
"""
pass
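By way of example, an extendable package could populate the dependent's module scope like this (a sketch only; the `python` attribute set on `module` is illustrative, not the exact code of Spack's real `python` package):

```python
import os

from spack.package import *


class Python(Package):
    """Hypothetical extendable package setting module-scope globals."""

    extendable = True

    def setup_dependent_package(self, module, dependent_spec):
        # Give extensions a ready-to-use `python` Executable so their
        # install() can simply call python("-m", "pip", "install", ...).
        module.python = Executable(os.path.join(self.prefix.bin, "python"))
```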
@@ -2092,7 +2106,7 @@ def flag_handler(self, var: FLAG_HANDLER_TYPE) -> None:
# arguments. This is implemented for build system classes where
# appropriate and will otherwise raise a NotImplementedError.
def flags_to_build_system_args(self, flags: Dict[str, List[str]]) -> None:
def flags_to_build_system_args(self, flags):
# Takes flags as a dict name: list of values
if any(v for v in flags.values()):
msg = "The {0} build system".format(self.__class__.__name__)
@@ -2295,6 +2309,10 @@ def rpath_args(self):
"""
return " ".join("-Wl,-rpath,%s" % p for p in self.rpath)
@property
def builder(self):
return spack.builder.create(self)
inject_flags = PackageBase.inject_flags
env_flags = PackageBase.env_flags

View File

@@ -16,8 +16,7 @@
import spack
import spack.error
import spack.fetch_strategy
import spack.mirrors.layout
import spack.mirrors.mirror
import spack.mirror
import spack.repo
import spack.stage
import spack.util.spack_json as sjson
@@ -330,12 +329,12 @@ def stage(self) -> "spack.stage.Stage":
name = "{0}-{1}".format(os.path.basename(self.url), fetch_digest[:7])
per_package_ref = os.path.join(self.owner.split(".")[-1], name)
mirror_ref = spack.mirrors.layout.default_mirror_layout(fetcher, per_package_ref)
mirror_ref = spack.mirror.default_mirror_layout(fetcher, per_package_ref)
self._stage = spack.stage.Stage(
fetcher,
name=f"{spack.stage.stage_prefix}patch-{fetch_digest}",
mirror_paths=mirror_ref,
mirrors=spack.mirrors.mirror.MirrorCollection(source=True).values(),
mirrors=spack.mirror.MirrorCollection(source=True).values(),
)
return self._stage

View File

@@ -1,105 +0,0 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import collections
import llnl.util.lang as lang
#: An object of this kind is a shared global state used to collect callbacks during
#: class definition time, and is flushed when the class object is created at the end
#: of the class definition
#:
#: Args:
#: attribute_name (str): name of the attribute that will be attached to the builder
#: callbacks (list): container used to temporarily aggregate the callbacks
CallbackTemporaryStage = collections.namedtuple(
"CallbackTemporaryStage", ["attribute_name", "callbacks"]
)
#: Shared global state to aggregate "@run_before" callbacks
_RUN_BEFORE = CallbackTemporaryStage(attribute_name="run_before_callbacks", callbacks=[])
#: Shared global state to aggregate "@run_after" callbacks
_RUN_AFTER = CallbackTemporaryStage(attribute_name="run_after_callbacks", callbacks=[])
class PhaseCallbacksMeta(type):
"""Permit to register arbitrary functions during class definition and run them
later, before or after a given install phase.
Each method decorated with ``run_before`` or ``run_after`` gets temporarily
stored in a global shared state when a class being defined is parsed by the Python
interpreter. At class definition time that temporary storage gets flushed and a list
of callbacks is attached to the class being defined.
"""
def __new__(mcs, name, bases, attr_dict):
for temporary_stage in (_RUN_BEFORE, _RUN_AFTER):
staged_callbacks = temporary_stage.callbacks
# Here we have an adapter from an old-style package. This means there is no
# hierarchy of builders, and every callback that had to be combined between
# *Package and *Builder has been combined already by _PackageAdapterMeta
if name == "Adapter":
continue
# If we are here we have callbacks. To get a complete list, we accumulate all the
# callbacks from base classes, we deduplicate them, then prepend what we have
# registered here.
#
# The order should be:
# 1. Callbacks are registered in order within the same class
# 2. Callbacks defined in derived classes precede those defined in base
# classes
callbacks_from_base = []
for base in bases:
current_callbacks = getattr(base, temporary_stage.attribute_name, None)
if not current_callbacks:
continue
callbacks_from_base.extend(current_callbacks)
callbacks_from_base = list(lang.dedupe(callbacks_from_base))
# Set the callbacks in this class and flush the temporary stage
attr_dict[temporary_stage.attribute_name] = staged_callbacks[:] + callbacks_from_base
del temporary_stage.callbacks[:]
return super(PhaseCallbacksMeta, mcs).__new__(mcs, name, bases, attr_dict)
@staticmethod
def run_after(phase, when=None):
"""Decorator to register a function for running after a given phase.
Args:
phase (str): phase after which the function must run.
when (str): condition under which the function is run (if None, it is always run).
"""
def _decorator(fn):
key = (phase, when)
item = (key, fn)
_RUN_AFTER.callbacks.append(item)
return fn
return _decorator
@staticmethod
def run_before(phase, when=None):
"""Decorator to register a function for running before a given phase.
Args:
phase (str): phase before which the function must run.
when (str): condition under which the function is run (if None, it is always run).
"""
def _decorator(fn):
key = (phase, when)
item = (key, fn)
_RUN_BEFORE.callbacks.append(item)
return fn
return _decorator
# Export these names as standalone to be used in packages
run_after = PhaseCallbacksMeta.run_after
run_before = PhaseCallbacksMeta.run_before
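Because `run_after` and `run_before` are exported as standalone names, a package can register phase callbacks directly (a minimal sketch; the package, variant and checks are hypothetical):

```python
import os

from spack.package import *


class HypotheticalPkg(AutotoolsPackage):
    """Sketch of registering callbacks around install phases."""

    variant("checks", default=False, description="run extra pre-build checks")

    @run_before("build", when="+checks")
    def pre_build_check(self):
        # Runs before the `build` phase, only when the spec has +checks.
        assert os.path.exists(os.path.join(self.stage.source_path, "configure"))

    @run_after("install")
    def verify_install(self):
        # Runs after the `install` phase completes.
        assert os.path.isdir(self.prefix.bin)
```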

View File

@@ -13,7 +13,6 @@
import macholib.mach_o
import macholib.MachO
import llnl.util.filesystem as fs
import llnl.util.lang
import llnl.util.tty as tty
from llnl.util.lang import memoized
@@ -276,10 +275,10 @@ def modify_macho_object(cur_path, rpaths, deps, idpath, paths_to_paths):
# Deduplicate and flatten
args = list(itertools.chain.from_iterable(llnl.util.lang.dedupe(args)))
install_name_tool = executable.Executable("install_name_tool")
if args:
with fs.edit_in_place_through_temporary_file(cur_path) as temp_path:
install_name_tool(*args, temp_path)
args.append(str(cur_path))
install_name_tool = executable.Executable("install_name_tool")
install_name_tool(*args)
def macholib_get_paths(cur_path):
@@ -718,8 +717,8 @@ def fixup_macos_rpath(root, filename):
# No fixes needed
return False
with fs.edit_in_place_through_temporary_file(abspath) as temp_path:
executable.Executable("install_name_tool")(*args, temp_path)
args.append(abspath)
executable.Executable("install_name_tool")(*args)
return True

View File

@@ -41,7 +41,6 @@
import spack.provider_index
import spack.spec
import spack.tag
import spack.tengine
import spack.util.file_cache
import spack.util.git
import spack.util.naming as nm
@@ -82,6 +81,43 @@ def namespace_from_fullname(fullname):
return namespace
class _PrependFileLoader(importlib.machinery.SourceFileLoader):
def __init__(self, fullname, path, prepend=None):
super(_PrependFileLoader, self).__init__(fullname, path)
self.prepend = prepend
def path_stats(self, path):
stats = super(_PrependFileLoader, self).path_stats(path)
if self.prepend:
stats["size"] += len(self.prepend) + 1
return stats
def get_data(self, path):
data = super(_PrependFileLoader, self).get_data(path)
if path != self.path or self.prepend is None:
return data
else:
return self.prepend.encode() + b"\n" + data
class RepoLoader(_PrependFileLoader):
"""Loads a Python module associated with a package in specific repository"""
#: Code in ``_package_prepend`` is prepended to imported packages.
#:
#: Spack packages are expected to call `from spack.package import *`
#: themselves, but we are allowing a deprecation period before breaking
#: external repos that don't do this yet.
_package_prepend = "from spack.package import *"
def __init__(self, fullname, repo, package_name):
self.repo = repo
self.package_name = package_name
self.package_py = repo.filename_for_package_name(package_name)
self.fullname = fullname
super().__init__(self.fullname, self.package_py, prepend=self._package_prepend)
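In other words, an external repository's `package.py` keeps working without the import for now, but should eventually declare it explicitly, e.g. (hypothetical package, placeholder URL and checksum):

```python
# package.py in an external repo: this import is exactly what RepoLoader
# would otherwise prepend during the deprecation period.
from spack.package import *


class Mylib(AutotoolsPackage):
    """Hypothetical minimal package."""

    homepage = "https://example.com/mylib"
    url = "https://example.com/mylib-1.0.tar.gz"

    version("1.0", sha256="0" * 64)  # placeholder checksum for the sketch
```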
class SpackNamespaceLoader:
def create_module(self, spec):
return SpackNamespace(spec.name)
@@ -151,8 +187,7 @@ def compute_loader(self, fullname):
# With 2 nested conditionals we can call "repo.real_name" only once
package_name = repo.real_name(module_name)
if package_name:
module_path = repo.filename_for_package_name(package_name)
return importlib.machinery.SourceFileLoader(fullname, module_path)
return RepoLoader(fullname, repo, package_name)
# We are importing a full namespace like 'spack.pkg.builtin'
if fullname == repo.full_namespace:
@@ -1486,6 +1521,8 @@ def add_package(self, name, dependencies=None):
Both "dep_type" and "condition" can default to ``None`` in which case
``spack.dependency.default_deptype`` and ``spack.spec.Spec()`` are used.
"""
import spack.tengine # avoid circular import
dependencies = dependencies or []
context = {"cls_name": nm.mod_to_class(name), "dependencies": dependencies}
template = spack.tengine.make_environment().get_template("mock-repository/package.pyt")

View File

@@ -12,10 +12,7 @@
class Resource:
"""Represents any resource to be fetched by a package.
This includes the main tarball or source archive, as well as extra archives defined
by the resource() directive.
"""Represents an optional resource to be fetched by a package.
Aggregates a name, a fetcher, a destination and a placement.
"""

View File

@@ -88,8 +88,6 @@
"strategy": {"type": "string", "enum": ["none", "minimal", "full"]}
},
},
"timeout": {"type": "integer", "minimum": 0},
"error_on_timeout": {"type": "boolean"},
"os_compatible": {"type": "object", "additionalProperties": {"type": "array"}},
},
}

View File

@@ -106,8 +106,8 @@
{
"names": ["install_missing_compilers"],
"message": "The config:install_missing_compilers option has been deprecated in "
"Spack v0.23, and is currently ignored. It will be removed from config after "
"Spack v1.0.",
"Spack v0.23, and is currently ignored. It will be removed from config in "
"Spack v0.25.",
"error": False,
},
],

View File

@@ -27,7 +27,6 @@
import spack
import spack.binary_distribution
import spack.compiler
import spack.compilers
import spack.concretize
import spack.config
@@ -48,6 +47,8 @@
import spack.version as vn
import spack.version.git_ref_lookup
from spack import traverse
from spack.config import get_mark_from_yaml_data
from spack.error import SpecSyntaxError
from .core import (
AspFunction,
@@ -63,7 +64,6 @@
parse_term,
)
from .counter import FullDuplicatesCounter, MinimalDuplicatesCounter, NoDuplicatesCounter
from .requirements import RequirementKind, RequirementParser, RequirementRule
from .version_order import concretization_version_order
GitOrStandardVersion = Union[spack.version.GitVersion, spack.version.StandardVersion]
@@ -143,6 +143,17 @@ def named_spec(
spec.name = old_name
class RequirementKind(enum.Enum):
"""Purpose / provenance of a requirement"""
#: Default requirement expressed under the 'all' attribute of packages.yaml
DEFAULT = enum.auto()
#: Requirement expressed on a virtual package
VIRTUAL = enum.auto()
#: Requirement expressed on a specific package
PACKAGE = enum.auto()
class DeclaredVersion(NamedTuple):
"""Data class to contain information on declared versions used in the solve"""
@@ -745,6 +756,17 @@ def on_model(model):
raise UnsatisfiableSpecError(msg)
class RequirementRule(NamedTuple):
"""Data class to collect information on a requirement"""
pkg_name: str
policy: str
requirements: List["spack.spec.Spec"]
condition: "spack.spec.Spec"
kind: RequirementKind
message: Optional[str]
class KnownCompiler(NamedTuple):
"""Data class to collect information on compilers"""
@@ -863,22 +885,7 @@ def on_model(model):
solve_kwargs["on_unsat"] = cores.append
timer.start("solve")
time_limit = spack.config.CONFIG.get("concretizer:timeout", -1)
error_on_timeout = spack.config.CONFIG.get("concretizer:error_on_timeout", True)
# Spack uses 0 to set no time limit, clingo API uses -1
if time_limit == 0:
time_limit = -1
with self.control.solve(**solve_kwargs, async_=True) as handle:
finished = handle.wait(time_limit)
if not finished:
specs_str = ", ".join(llnl.util.lang.elide_list([str(s) for s in specs], 4))
header = f"Spack is taking more than {time_limit} seconds to solve for {specs_str}"
if error_on_timeout:
raise UnsatisfiableSpecError(f"{header}, stopping concretization")
warnings.warn(f"{header}, using the best configuration found so far")
handle.cancel()
solve_result = handle.get()
solve_result = self.control.solve(**solve_kwargs)
timer.stop("solve")
# once done, construct the solve result
@@ -1123,7 +1130,6 @@ class SpackSolverSetup:
def __init__(self, tests: bool = False):
# these are all initialized in setup()
self.gen: "ProblemInstanceBuilder" = ProblemInstanceBuilder()
self.requirement_parser = RequirementParser(spack.config.CONFIG)
self.possible_virtuals: Set[str] = set()
self.assumptions: List[Tuple["clingo.Symbol", bool]] = [] # type: ignore[name-defined]
@@ -1310,7 +1316,8 @@ def compiler_facts(self):
self.gen.newline()
def package_requirement_rules(self, pkg):
self.emit_facts_from_requirement_rules(self.requirement_parser.rules(pkg))
parser = RequirementParser(spack.config.CONFIG)
self.emit_facts_from_requirement_rules(parser.rules(pkg))
def pkg_rules(self, pkg, tests):
pkg = self.pkg_class(pkg)
@@ -1788,8 +1795,9 @@ def provider_defaults(self):
def provider_requirements(self):
self.gen.h2("Requirements on virtual providers")
parser = RequirementParser(spack.config.CONFIG)
for virtual_str in sorted(self.possible_virtuals):
rules = self.requirement_parser.rules_from_virtual(virtual_str)
rules = parser.rules_from_virtual(virtual_str)
if rules:
self.emit_facts_from_requirement_rules(rules)
self.trigger_rules()
@@ -3064,6 +3072,202 @@ def value(self) -> str:
return "".join(self.asp_problem)
def parse_spec_from_yaml_string(string: str) -> "spack.spec.Spec":
"""Parse a spec from YAML and add file/line info to errors, if it's available.
Parse a ``Spec`` from the supplied string, but also intercept any syntax errors and
add file/line information for debugging using file/line annotations from the string.
Arguments:
string: a string representing a ``Spec`` from config YAML.
"""
try:
return spack.spec.Spec(string)
except SpecSyntaxError as e:
mark = get_mark_from_yaml_data(string)
if mark:
msg = f"{mark.name}:{mark.line + 1}: {str(e)}"
raise SpecSyntaxError(msg) from e
raise e
class RequirementParser:
"""Parses requirements from package.py files and configuration, and returns rules."""
def __init__(self, configuration):
self.config = configuration
def rules(self, pkg: "spack.package_base.PackageBase") -> List[RequirementRule]:
result = []
result.extend(self.rules_from_package_py(pkg))
result.extend(self.rules_from_require(pkg))
result.extend(self.rules_from_prefer(pkg))
result.extend(self.rules_from_conflict(pkg))
return result
def rules_from_package_py(self, pkg) -> List[RequirementRule]:
rules = []
for when_spec, requirement_list in pkg.requirements.items():
for requirements, policy, message in requirement_list:
rules.append(
RequirementRule(
pkg_name=pkg.name,
policy=policy,
requirements=requirements,
kind=RequirementKind.PACKAGE,
condition=when_spec,
message=message,
)
)
return rules
def rules_from_virtual(self, virtual_str: str) -> List[RequirementRule]:
requirements = self.config.get("packages", {}).get(virtual_str, {}).get("require", [])
return self._rules_from_requirements(
virtual_str, requirements, kind=RequirementKind.VIRTUAL
)
def rules_from_require(self, pkg: "spack.package_base.PackageBase") -> List[RequirementRule]:
kind, requirements = self._raw_yaml_data(pkg, section="require")
return self._rules_from_requirements(pkg.name, requirements, kind=kind)
def rules_from_prefer(self, pkg: "spack.package_base.PackageBase") -> List[RequirementRule]:
result = []
kind, preferences = self._raw_yaml_data(pkg, section="prefer")
for item in preferences:
spec, condition, message = self._parse_prefer_conflict_item(item)
result.append(
# A strong preference is defined as:
#
# require:
# - any_of: [spec_str, "@:"]
RequirementRule(
pkg_name=pkg.name,
policy="any_of",
requirements=[spec, spack.spec.Spec("@:")],
kind=kind,
message=message,
condition=condition,
)
)
return result
def rules_from_conflict(self, pkg: "spack.package_base.PackageBase") -> List[RequirementRule]:
result = []
kind, conflicts = self._raw_yaml_data(pkg, section="conflict")
for item in conflicts:
spec, condition, message = self._parse_prefer_conflict_item(item)
result.append(
# A conflict is defined as:
#
# require:
# - one_of: [spec_str, "@:"]
RequirementRule(
pkg_name=pkg.name,
policy="one_of",
requirements=[spec, spack.spec.Spec("@:")],
kind=kind,
message=message,
condition=condition,
)
)
return result
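The `"@:"` trick works because that spec (an open version range with no other constraints) is satisfied by everything: `any_of: [spec, "@:"]` is therefore always satisfiable and only nudges the solver toward `spec`, while `one_of: [spec, "@:"]` requires exactly one branch to hold, and since `"@:"` always holds, `spec` must not, i.e. it is excluded. A quick way to check the premise outside the solver (sketch):

```python
import spack.spec

# "@:" is the open version range; any concrete spec satisfies it.
assert spack.spec.Spec("zlib@1.3.1").satisfies("@:")
```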
def _parse_prefer_conflict_item(self, item):
# The item is either a string or an object with at least a "spec" attribute
if isinstance(item, str):
spec = parse_spec_from_yaml_string(item)
condition = spack.spec.Spec()
message = None
else:
spec = parse_spec_from_yaml_string(item["spec"])
condition = spack.spec.Spec(item.get("when"))
message = item.get("message")
return spec, condition, message
def _raw_yaml_data(self, pkg: "spack.package_base.PackageBase", *, section: str):
config = self.config.get("packages")
data = config.get(pkg.name, {}).get(section, [])
kind = RequirementKind.PACKAGE
if not data:
data = config.get("all", {}).get(section, [])
kind = RequirementKind.DEFAULT
return kind, data
def _rules_from_requirements(
self, pkg_name: str, requirements, *, kind: RequirementKind
) -> List[RequirementRule]:
"""Manipulate requirements from packages.yaml, and return a list of tuples
with a uniform structure (name, policy, requirements).
"""
if isinstance(requirements, str):
requirements = [requirements]
rules = []
for requirement in requirements:
# A string is equivalent to a one_of group with a single element
if isinstance(requirement, str):
requirement = {"one_of": [requirement]}
for policy in ("spec", "one_of", "any_of"):
if policy not in requirement:
continue
constraints = requirement[policy]
# "spec" is for specifying a single spec
if policy == "spec":
constraints = [constraints]
policy = "one_of"
# validate specs from YAML first, and fail with line numbers if parsing fails.
constraints = [
parse_spec_from_yaml_string(constraint) for constraint in constraints
]
when_str = requirement.get("when")
when = parse_spec_from_yaml_string(when_str) if when_str else spack.spec.Spec()
constraints = [
x
for x in constraints
if not self.reject_requirement_constraint(pkg_name, constraint=x, kind=kind)
]
if not constraints:
continue
rules.append(
RequirementRule(
pkg_name=pkg_name,
policy=policy,
requirements=constraints,
kind=kind,
message=requirement.get("message"),
condition=when,
)
)
return rules
def reject_requirement_constraint(
self, pkg_name: str, *, constraint: spack.spec.Spec, kind: RequirementKind
) -> bool:
"""Returns True if a requirement constraint should be rejected"""
if kind == RequirementKind.DEFAULT:
# Requirements under all: are applied only if they are satisfiable considering only
# package rules, so e.g. variants must exist etc. Otherwise, they are rejected.
try:
s = spack.spec.Spec(pkg_name)
s.constrain(constraint)
s.validate_or_raise()
except spack.error.SpackError as e:
tty.debug(
f"[SETUP] Rejecting the default '{constraint}' requirement "
f"on '{pkg_name}': {str(e)}",
level=2,
)
return True
return False
class CompilerParser:
"""Parses configuration files, and builds a list of possible compilers for the solve."""

View File

@@ -1003,8 +1003,6 @@ variant_default_not_used(node(ID, Package), Variant, Value)
node_has_variant(node(ID, Package), Variant, _),
not attr("variant_value", node(ID, Package), Variant, Value),
not propagate(node(ID, Package), variant_value(Variant, _, _)),
% variant set explicitly don't count for this metric
not attr("variant_set", node(ID, Package), Variant, _),
attr("node", node(ID, Package)).
% The variant is set in an external spec

View File

@@ -1,232 +0,0 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import enum
from typing import List, NamedTuple, Optional, Sequence
from llnl.util import tty
import spack.config
import spack.error
import spack.package_base
import spack.spec
from spack.config import get_mark_from_yaml_data
class RequirementKind(enum.Enum):
"""Purpose / provenance of a requirement"""
#: Default requirement expressed under the 'all' attribute of packages.yaml
DEFAULT = enum.auto()
#: Requirement expressed on a virtual package
VIRTUAL = enum.auto()
#: Requirement expressed on a specific package
PACKAGE = enum.auto()
class RequirementRule(NamedTuple):
"""Data class to collect information on a requirement"""
pkg_name: str
policy: str
requirements: Sequence[spack.spec.Spec]
condition: spack.spec.Spec
kind: RequirementKind
message: Optional[str]
class RequirementParser:
"""Parses requirements from package.py files and configuration, and returns rules."""
def __init__(self, configuration: spack.config.Configuration):
self.config = configuration
def rules(self, pkg: spack.package_base.PackageBase) -> List[RequirementRule]:
result = []
result.extend(self.rules_from_package_py(pkg))
result.extend(self.rules_from_require(pkg))
result.extend(self.rules_from_prefer(pkg))
result.extend(self.rules_from_conflict(pkg))
return result
def rules_from_package_py(self, pkg: spack.package_base.PackageBase) -> List[RequirementRule]:
rules = []
for when_spec, requirement_list in pkg.requirements.items():
for requirements, policy, message in requirement_list:
rules.append(
RequirementRule(
pkg_name=pkg.name,
policy=policy,
requirements=requirements,
kind=RequirementKind.PACKAGE,
condition=when_spec,
message=message,
)
)
return rules
def rules_from_virtual(self, virtual_str: str) -> List[RequirementRule]:
requirements = self.config.get("packages", {}).get(virtual_str, {}).get("require", [])
return self._rules_from_requirements(
virtual_str, requirements, kind=RequirementKind.VIRTUAL
)
def rules_from_require(self, pkg: spack.package_base.PackageBase) -> List[RequirementRule]:
kind, requirements = self._raw_yaml_data(pkg, section="require")
return self._rules_from_requirements(pkg.name, requirements, kind=kind)
def rules_from_prefer(self, pkg: spack.package_base.PackageBase) -> List[RequirementRule]:
result = []
kind, preferences = self._raw_yaml_data(pkg, section="prefer")
for item in preferences:
spec, condition, message = self._parse_prefer_conflict_item(item)
result.append(
# A strong preference is defined as:
#
# require:
# - any_of: [spec_str, "@:"]
RequirementRule(
pkg_name=pkg.name,
policy="any_of",
requirements=[spec, spack.spec.Spec("@:")],
kind=kind,
message=message,
condition=condition,
)
)
return result
def rules_from_conflict(self, pkg: spack.package_base.PackageBase) -> List[RequirementRule]:
result = []
kind, conflicts = self._raw_yaml_data(pkg, section="conflict")
for item in conflicts:
spec, condition, message = self._parse_prefer_conflict_item(item)
result.append(
# A conflict is defined as:
#
# require:
# - one_of: [spec_str, "@:"]
RequirementRule(
pkg_name=pkg.name,
policy="one_of",
requirements=[spec, spack.spec.Spec("@:")],
kind=kind,
message=message,
condition=condition,
)
)
return result
def _parse_prefer_conflict_item(self, item):
# The item is either a string or an object with at least a "spec" attribute
if isinstance(item, str):
spec = parse_spec_from_yaml_string(item)
condition = spack.spec.Spec()
message = None
else:
spec = parse_spec_from_yaml_string(item["spec"])
condition = spack.spec.Spec(item.get("when"))
message = item.get("message")
return spec, condition, message
def _raw_yaml_data(self, pkg: spack.package_base.PackageBase, *, section: str):
config = self.config.get("packages")
data = config.get(pkg.name, {}).get(section, [])
kind = RequirementKind.PACKAGE
if not data:
data = config.get("all", {}).get(section, [])
kind = RequirementKind.DEFAULT
return kind, data
def _rules_from_requirements(
self, pkg_name: str, requirements, *, kind: RequirementKind
) -> List[RequirementRule]:
"""Manipulate requirements from packages.yaml, and return a list of tuples
with a uniform structure (name, policy, requirements).
"""
if isinstance(requirements, str):
requirements = [requirements]
rules = []
for requirement in requirements:
# A string is equivalent to a one_of group with a single element
if isinstance(requirement, str):
requirement = {"one_of": [requirement]}
for policy in ("spec", "one_of", "any_of"):
if policy not in requirement:
continue
constraints = requirement[policy]
# "spec" is for specifying a single spec
if policy == "spec":
constraints = [constraints]
policy = "one_of"
# validate specs from YAML first, and fail with line numbers if parsing fails.
constraints = [
parse_spec_from_yaml_string(constraint) for constraint in constraints
]
when_str = requirement.get("when")
when = parse_spec_from_yaml_string(when_str) if when_str else spack.spec.Spec()
constraints = [
x
for x in constraints
if not self.reject_requirement_constraint(pkg_name, constraint=x, kind=kind)
]
if not constraints:
continue
rules.append(
RequirementRule(
pkg_name=pkg_name,
policy=policy,
requirements=constraints,
kind=kind,
message=requirement.get("message"),
condition=when,
)
)
return rules
def reject_requirement_constraint(
self, pkg_name: str, *, constraint: spack.spec.Spec, kind: RequirementKind
) -> bool:
"""Returns True if a requirement constraint should be rejected"""
if kind == RequirementKind.DEFAULT:
# Requirements under all: are applied only if they are satisfiable considering only
# package rules, so e.g. variants must exist etc. Otherwise, they are rejected.
try:
s = spack.spec.Spec(pkg_name)
s.constrain(constraint)
s.validate_or_raise()
except spack.error.SpackError as e:
tty.debug(
f"[SETUP] Rejecting the default '{constraint}' requirement "
f"on '{pkg_name}': {str(e)}",
level=2,
)
return True
return False
def parse_spec_from_yaml_string(string: str) -> spack.spec.Spec:
"""Parse a spec from YAML and add file/line info to errors, if it's available.
Parse a ``Spec`` from the supplied string, but also intercept any syntax errors and
add file/line information for debugging using file/line annotations from the string.
Arguments:
string: a string representing a ``Spec`` from config YAML.
"""
try:
return spack.spec.Spec(string)
except spack.error.SpecSyntaxError as e:
mark = get_mark_from_yaml_data(string)
if mark:
msg = f"{mark.name}:{mark.line + 1}: {str(e)}"
raise spack.error.SpecSyntaxError(msg) from e
raise e

View File

@@ -95,8 +95,6 @@
import spack.version as vn
import spack.version.git_ref_lookup
from .enums import InstallRecordStatus
__all__ = [
"CompilerSpec",
"Spec",
@@ -2073,7 +2071,7 @@ def _lookup_hash(self):
# First env, then store, then binary cache
matches = (
(active_env.all_matching_specs(self) if active_env else [])
or spack.store.STORE.db.query(self, installed=InstallRecordStatus.ANY)
or spack.store.STORE.db.query(self, installed=any)
or spack.binary_distribution.BinaryCacheQuery(True)(self)
)

View File

@@ -16,7 +16,6 @@
import llnl.string
import llnl.util.lang
import llnl.util.symlink
import llnl.util.tty as tty
from llnl.util.filesystem import (
can_access,
@@ -34,8 +33,7 @@
import spack.caches
import spack.config
import spack.error
import spack.mirrors.layout
import spack.mirrors.utils
import spack.mirror
import spack.resource
import spack.spec
import spack.util.crypto
@@ -354,8 +352,8 @@ def __init__(
url_or_fetch_strategy,
*,
name=None,
mirror_paths: Optional["spack.mirrors.layout.MirrorLayout"] = None,
mirrors: Optional[Iterable["spack.mirrors.mirror.Mirror"]] = None,
mirror_paths: Optional["spack.mirror.MirrorLayout"] = None,
mirrors: Optional[Iterable["spack.mirror.Mirror"]] = None,
keep=False,
path=None,
lock=True,
@@ -489,7 +487,7 @@ def _generate_fetchers(self, mirror_only=False) -> Generator["fs.FetchStrategy",
# Insert fetchers in the order that the URLs are provided.
fetchers[:0] = (
fs.from_url_scheme(
url_util.join(mirror.fetch_url, *self.mirror_layout.path.split(os.sep)),
url_util.join(mirror.fetch_url, self.mirror_layout.path),
checksum=digest,
expand=expand,
extension=extension,
@@ -602,7 +600,7 @@ def cache_local(self):
spack.caches.FETCH_CACHE.store(self.fetcher, self.mirror_layout.path)
def cache_mirror(
self, mirror: "spack.caches.MirrorCache", stats: "spack.mirrors.utils.MirrorStats"
self, mirror: "spack.caches.MirrorCache", stats: "spack.mirror.MirrorStats"
) -> None:
"""Perform a fetch if the resource is not already cached

View File

@@ -32,7 +32,7 @@
import spack.fetch_strategy
import spack.hooks.sbang as sbang
import spack.main
import spack.mirrors.mirror
import spack.mirror
import spack.paths
import spack.spec
import spack.stage
@@ -324,8 +324,8 @@ def test_push_and_fetch_keys(mock_gnupghome, tmp_path):
mirror = os.path.join(testpath, "mirror")
mirrors = {"test-mirror": url_util.path_to_file_url(mirror)}
mirrors = spack.mirrors.mirror.MirrorCollection(mirrors)
mirror = spack.mirrors.mirror.Mirror(url_util.path_to_file_url(mirror))
mirrors = spack.mirror.MirrorCollection(mirrors)
mirror = spack.mirror.Mirror(url_util.path_to_file_url(mirror))
gpg_dir1 = os.path.join(testpath, "gpg1")
gpg_dir2 = os.path.join(testpath, "gpg2")

View File

@@ -9,7 +9,7 @@
import pytest
import spack.binary_distribution as bd
import spack.mirrors.mirror
import spack.mirror
import spack.spec
from spack.installer import PackageInstaller
@@ -23,7 +23,7 @@ def test_build_tarball_overwrite(install_mockery, mock_fetch, monkeypatch, tmp_p
specs = [spec]
# populate cache, everything is new
mirror = spack.mirrors.mirror.Mirror.from_local_path(str(tmp_path))
mirror = spack.mirror.Mirror.from_local_path(str(tmp_path))
with bd.make_uploader(mirror) as uploader:
skipped = uploader.push_or_raise(specs)
assert not skipped

View File

@@ -15,7 +15,6 @@
import spack.build_systems.autotools
import spack.build_systems.cmake
import spack.builder
import spack.environment
import spack.error
import spack.paths
@@ -150,7 +149,7 @@ def test_libtool_archive_files_are_deleted_by_default(self, mutable_database):
# Assert the libtool archive is not there and we have
# a log of removed files
assert not os.path.exists(spack.builder.create(s.package).libtool_archive_file)
assert not os.path.exists(s.package.builder.libtool_archive_file)
search_directory = os.path.join(s.prefix, ".spack")
libtool_deletion_log = fs.find(search_directory, "removed_la_files.txt", recursive=True)
assert libtool_deletion_log
@@ -161,13 +160,11 @@ def test_libtool_archive_files_might_be_installed_on_demand(
# Install a package that creates a mock libtool archive,
# patch its package to preserve the installation
s = Spec("libtool-deletion").concretized()
monkeypatch.setattr(
type(spack.builder.create(s.package)), "install_libtool_archives", True
)
monkeypatch.setattr(type(s.package.builder), "install_libtool_archives", True)
PackageInstaller([s.package], explicit=True).install()
# Assert libtool archives are installed
assert os.path.exists(spack.builder.create(s.package).libtool_archive_file)
assert os.path.exists(s.package.builder.libtool_archive_file)
def test_autotools_gnuconfig_replacement(self, mutable_database):
"""
@@ -264,7 +261,7 @@ def test_cmake_std_args(self, default_mock_concretization):
# Call the function on a CMakePackage instance
s = default_mock_concretization("cmake-client")
expected = spack.build_systems.cmake.CMakeBuilder.std_args(s.package)
assert spack.builder.create(s.package).std_cmake_args == expected
assert s.package.builder.std_cmake_args == expected
# Call it on another kind of package
s = default_mock_concretization("mpich")
@@ -384,9 +381,7 @@ def test_autotools_args_from_conditional_variant(default_mock_concretization):
is not met. When this is the case, the variant is not set in the spec."""
s = default_mock_concretization("autotools-conditional-variants-test")
assert "example" not in s.variants
assert (
len(spack.builder.create(s.package)._activate_or_not("example", "enable", "disable")) == 0
)
assert len(s.package.builder._activate_or_not("example", "enable", "disable")) == 0
def test_autoreconf_search_path_args_multiple(default_mock_concretization, tmpdir):

View File

@@ -14,7 +14,7 @@
import spack.config
import spack.environment as ev
import spack.main
import spack.mirrors.utils
import spack.mirror
import spack.spec
_bootstrap = spack.main.SpackCommand("bootstrap")
@@ -182,8 +182,8 @@ def test_bootstrap_mirror_metadata(mutable_config, linux_os, monkeypatch, tmpdir
`spack bootstrap add`. Here we don't download data, since that would be an
expensive operation for a unit test.
"""
old_create = spack.mirrors.utils.create
monkeypatch.setattr(spack.mirrors.utils, "create", lambda p, s: old_create(p, []))
old_create = spack.mirror.create
monkeypatch.setattr(spack.mirror, "create", lambda p, s: old_create(p, []))
monkeypatch.setattr(spack.spec.Spec, "concretized", lambda p: p)
# Create the mirror in a temporary folder

Some files were not shown because too many files have changed in this diff.