Compare commits

..

85 Commits

Author SHA1 Message Date
Harmen Stoppels
594a376c52 Set version to v0.22.2 2024-09-21 12:39:55 +02:00
Harmen Stoppels
1538c48616 run-unit-tests: no xdist if coverage (#46480)
xdist only slows down unit tests under coverage
2024-09-21 12:39:55 +02:00
Massimiliano Culpo
683e50b8d9 Run unit test in parallel again in CI (#45793)
The --trace-config option was failing for Linux unit tests,
so we were running them serially.
2024-09-21 12:39:55 +02:00
Harmen Stoppels
9b32fb0beb Revert "Change environment modifications to escape with double quotes (#36789)" (#42780)
This reverts commit 690394fabc, as it causes arbitrary code execution.
2024-09-21 12:39:55 +02:00
Harmen Stoppels
2c6df0d491 deal with TimeoutError from ssl.py (#45683) 2024-09-21 12:39:55 +02:00
Harmen Stoppels
ce7218acae buildcache: fix hard-coded, outdated layout version (#45645) 2024-09-21 12:39:55 +02:00
Dominic Hofer
246eeb2b69 Remove execution permission from setup-env.sh (#45641)
`setup-env.sh` is meant to be sourced, not executed directly.
By revoking execution permissions, users who accidentally execute
the script will receive an error instead of seeing no effect.

* Remove execution permission from `setup-env.sh` and friends
* Don't make output file executable in `spack commands --update-completion`

---------

Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-09-21 12:39:55 +02:00
Harmen Stoppels
cc47ee3984 unparser.py: remove print statements (#45235) 2024-09-21 12:39:55 +02:00
Harmen Stoppels
7b644719c1 Avoid duplicate detectable tag (#45160)
In case of inheritance, the static tags prop may be updated multiple
times; it turns out builder classes magically inherit from
traditional package classes.
2024-09-21 12:39:55 +02:00
Harmen Stoppels
d8a6aa551e build_environment: explicitly disable ccache if disabled (#45275) 2024-09-21 12:39:55 +02:00
Massimiliano Culpo
ac7b18483a Bump archspec to latest commit (#46445)
This should fix an issue with Neoverse XX detection

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-09-21 12:39:55 +02:00
Massimiliano Culpo
39f37de4ce Update archspec to v0.2.5-dev (7e6740012b897ae4a950f0bba7e9726b767e921f) (#45721) 2024-09-21 12:39:55 +02:00
Harmen Stoppels
703e153404 require spec in develop entry (#46485) 2024-09-21 12:39:55 +02:00
Harmen Stoppels
aa013611bc url join: fix oci scheme (#46483)
* url.py: also special case oci scheme in join

* avoid fetching keys from oci mirror
2024-09-21 12:39:55 +02:00
Harmen Stoppels
6a7ccd4e46 docs: refer to upstreams.yaml in chain.rst title (#46475) 2024-09-21 12:39:55 +02:00
Harmen Stoppels
1c6c4b4690 spack.util.url: fix join breakage in python 3.12.6 (#46453) 2024-09-21 12:39:55 +02:00
arezaii
68558b3dd0 Chapel package: updates post release (#45304)
* Fix +rocm variant, to ensure correct dependencies on ROCm packages
  and use of AMD LLVM
* Add a +pshm variant for comm=gasnet to enable fast shared-memory
  comms between co-locales
* Add logic to ensure we get the native CXI libfabric network provider
  on Cray EX
* Expand dependency type for package modules to encompass runtime
  dependencies
* Factor logic for setting (LD_)LIBRARY_PATH and PKG_CONFIG_PATH of
  runtime dependencies
* Workaround issue #44746 that causes a transitive dependency on lua
  to break SLURM
* Disable nonfunctional checkChplDoc test
* Annotate some variants as conditional, to improve spack info output
  and reduce confusion

---------

Co-authored-by: Dan Bonachea <dobonachea@lbl.gov>
2024-09-21 12:39:55 +02:00
arezaii
5440fe09cd update chapel package for v2.1 (#44931) 2024-09-21 12:39:55 +02:00
arezaii
03c22f403f Chapel package: major update (#42197)
* add cray detection taken from upcxx
* add CUDA/ROCm support
* add numerous pass-through options to Chapel build,
  like gpu_mem_strategy, comm_substrate, etc.; all variants are
  translated to analogous CHPL_* environment variables. As a side
  effect, this defines a number of environment variables that are
  not actually used by Chapel.
* Define LD_LIBRARY_PATH, LIBRARY_PATH, and PKG_CONFIG_PATH to
  help programs built with Chapel properly locate needed runtime
  dependencies

---------

Co-authored-by: bonachea <dobonachea@lbl.gov>
2024-09-21 12:39:55 +02:00
Greg Becker
f339225d22 include_concrete: read from older env formats properly (#45766)
* include_concrete: read from older env formats properly
* spack env rm: fix logic for checking env includes
* regression test

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-09-21 12:39:55 +02:00
Massimiliano Culpo
22c815f3d4 Do not halt concretization on unknown variants in externals (#45326)
* Do not halt concretization on unknown variants in externals
2024-09-21 12:39:55 +02:00
Massimiliano Culpo
4354288e44 Run minimization of weights only on known targets (#45269)
This prevents excessive output from clingo of the kind:

.../spack/lib/spack/spack/solver/concretize.lp:1640:5-11: info: tuple ignored:
  #sup@2
2024-09-21 12:39:55 +02:00
Massimiliano Culpo
ea2d43b4a6 Do not initialize previous store state in "use_store" (#45268)
The "use_store" context manager is used to swap the value
of a global variable (spack.store.STORE), while keeping
another global variable consistent (spack.config.CONFIG).

When doing that it tries to evaluate the previous value
of the store, if that was not done already. This is wrong,
since the configuration might be in an "intermediate" state
that was never meant to trigger side effects.

Remove that operation, and add a unit test to
prevent regressions.
2024-09-21 12:39:55 +02:00
Massimiliano Culpo
85e67d60a0 Add compatibility of sequoia with previous macOS versions (#45127)
* Add compatibility of sequoia with previous macOS versions

* Add compatibility of sequoia with previous macOS versions
2024-09-21 12:39:55 +02:00
Adam J. Stewart
bf6a9ff5ed Add support for macOS Sequoia (#45018) 2024-09-21 12:39:55 +02:00
Jordan Galby
1bdc30979d Fix regression in spec format string for indiviual variants (#46206)
Fix a regression in {variants.X} and {variants.X.value} spec format strings.
2024-09-21 12:39:55 +02:00
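The pieces involved can be exercised from `spack python` (a hedged sketch; `zlib+shared` is just an assumed spec, and the printed values are illustrative):

```python
# Run inside `spack python`, which puts Spack's modules on sys.path.
import spack.spec

spec = spack.spec.Spec("zlib+shared").concretized()
print(spec.format("{name} {variants.shared}"))        # e.g. "zlib +shared"
print(spec.format("{name} {variants.shared.value}"))  # e.g. "zlib True"
```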
Harmen Stoppels
ef1eabe5b3 Add c to the list of languages (#45191) 2024-09-21 12:39:55 +02:00
Harmen Stoppels
43d673f915 Add pkg- prefix to builtin.mock a b c d ... (#45205) 2024-09-21 12:39:55 +02:00
Harmen Stoppels
8a9c501030 spec.py: fix __getitem__ looking outside of dag (#45090)
`Spec.__getitem__` queries dependent edges, which almost always point to
nodes outside the sub-dag considered. It should only ever look at edges
being traversed.
2024-09-21 12:39:55 +02:00
Harmen Stoppels
9f035ca030 Set version to v0.22.2.dev0 2024-09-21 12:39:55 +02:00
Harmen Stoppels
d66dce2d66 Set version to v0.22.1 2024-07-04 15:14:09 +02:00
Jordan Galby
ef2aa2f5f5 spack audit packages: Fix message (#45045)
Fix message formatting of the "virtual dependency cannot have variants" error.
2024-07-04 15:13:31 +02:00
Harmen Stoppels
41f5f6eaab iconv: require libiconv on linux (#45026)
otherwise it is still picked up from glibc as it is external
2024-07-04 15:07:05 +02:00
Massimiliano Culpo
cba347e0b7 Heuristic decays to default over time (#45023)
This modifies the heuristic to decay to clingo's default
over time. The hope is that this helps with specs
that have an optimal solution with a high penalty.

Let the target and compiler heuristics decay too; do not
guess the compiler.
2024-07-04 15:07:05 +02:00
Harmen Stoppels
a3cef0f02e netlib-lapack: provide blas and lapack together (#44981)
If netlib-lapack is built with ~external-blas, it internally links
liblapack.so with libblas.so, meaning that whenever netlib-lapack is
used as a lapack provider, the package must also be a blas provider.

Conversely, using netlib-lapack as a blas provider does not imply that it
must also provide lapack, but nothing is lost by disallowing that...
2024-07-01 16:56:31 +02:00
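A hedged package-recipe sketch of that constraint (directives abridged, not the full netlib-lapack recipe):

```python
from spack.package import *


class NetlibLapack(Package):
    variant("external-blas", default=False, description="link against an external BLAS")

    # With the internal BLAS, liblapack.so links its own libblas.so, so any
    # consumer that takes this node for lapack must take it for blas as
    # well: both virtuals are provided together.
    provides("blas", "lapack", when="~external-blas")
    provides("lapack", when="+external-blas")
```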
Harmen Stoppels
45fca040c3 Use composite stage also for develop specs (#44950) 2024-07-01 16:56:31 +02:00
Harmen Stoppels
eb2b5739b2 Remove DIYStage (#44949) 2024-07-01 16:56:31 +02:00
Massimiliano Culpo
d299e17d43 neoverse-v1: restore py-cinemasci (#44976)
Use a different tactic for determining conflicts.

Give more priority to setting very old versions to False.
2024-07-01 16:56:31 +02:00
Massimiliano Culpo
d883883be0 Ensure parent runtime version >= child (#44834)
Fixes a bug where old gcc-runtime libraries would be loaded at runtime, but newer are required by dependencies, breaking the binaries.
2024-07-01 16:56:31 +02:00
Massimiliano Culpo
249dcb49e2 ASP-based solver: add a generic rule for propagation (#44870)
This adds a generic propagate/2 rule to propagate any
fact to children in the DAG.
2024-07-01 16:56:31 +02:00
Massimiliano Culpo
8628add66b Simplify and improve solver heuristic (#44893)
When we changed how to deal with errors in November,
we didn't realize that for an unconstrained choice
rule it is more important in the heuristic to guess
what is NOT in the answer set, since it will be the
majority of options.

Previously this was following automatically from what
was in the answer set, via `1 { ... } 1` cardinality
constraints.

Here we improve the heuristic and the solve time for specs.
2024-07-01 16:56:31 +02:00
Harmen Stoppels
aeccba8bc0 build_environment: fix ccache error handling (#44740) 2024-07-01 16:56:31 +02:00
Todd Gamblin
d94e8ab36f python: make every view a venv (#44382)
#40773 introduced python-venv, which improved build isolation and avoids issues with,
e.g., `ubuntu`'s system python modifying `sysconfig` to include a (very unwanted)
`local` directory within the default install layout.

This addresses a few cases where #40773 removed functionality, without harming the
default cases where we use `python-venv`.

Traditionally, *every* view with `python` in it was essentially a virtual environment,
because we would copy the `python` interpreter and `os.py` into every view when linking.
We now rely on `python-venv` to do that, but only when it's used (i.e. new builds) and
only for packages that have an `extends("python")` directive.

This again makes every view with `python` in it a virtual environment, but only
if we're not already using a package like `python-venv`. This uses a different
mechanism from before -- instead of using the `virtualenv` trick of copying `python`
into the prefix, we instead create a `pyvenv.cfg` like `venv` (the more modern way
to do it).

This fixes two things:
1. If you already had an environment before Spack `v0.22` that worked, it would
   stop working without a reconcretize and rebuild in `v0.22`, because we no longer
   copy the python interpreter on link. Adding `pyvenv.cfg` fixes this in a more
   modern way, so old views will keep working.

2. If you have an env that only includes python packages that use `depends_on("python")`
   instead of `extends("python")`, those packages will now be importable as before,
   though they won't have the same level of build isolation you'd get with `extends`
   and `python-venv`.

* views: avoid making client code deal with link functions

Users of views and ViewDescriptors shouldn't have to deal with link functions -- they
should just say what type of linking they want.

- [x] views take a link_type, not a link function
- [x] views work out the link function from the link type
- [x] view descriptors and commands now just tell the view what they want.

* python: simplify logic for avoiding pyvenv.cfg in copy views

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-07-01 16:56:31 +02:00
Massimiliano Culpo
e66c26871f Move unit tests into the same file, simplify main workflow 2024-07-01 16:56:31 +02:00
kwryankrattiger
2db4ff7061 Generate jobs should use x86_64_v3 runners only (#44582) 2024-07-01 16:56:31 +02:00
Tom Bradford
c248932a94 protobuf: fix 3.4:3.21 patch checksum (#44443) 2024-07-01 16:56:31 +02:00
dmagdavector
f15d302fc7 protobuf: update hash for patch needed when="@3.4:3.21" (#44210)
* protobuf: update hash for patch needed when="@3.4:3.21"

* Update var/spack/repos/builtin/packages/protobuf/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* Update var/spack/repos/builtin/packages/protobuf/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-07-01 16:56:31 +02:00
John W. Parent
74ef630241 Windows: Non config changes to support Gitlab CI (#43965)
* Quote python for shlex

* Remove python path quoting patch

* spack env: Allow `C` "protocol" for config_path

When running spack on windows, a path beginning with `C://...` is a valid path.

* Remove makefile from ci rebuild

* GPG use llnl.util.filesystem.getuid

* Cleanup process_command

* Remove unused lines

* Fix typo in encode_path

* Double quote arguments

* Cleanup process_command

* Pass cdash args with =

* Escape parens in CMD script

* escape parens doesn't only apply to paths

* Install deps

* sfn prefix

* use sfn with libxml2

* Add hash to dep install

* WIP

* Review

* Changes missed in prior review commit

* Style

* Ensure we handle Windows paths with config scopes

* clarify docstring

* No more MAKE_COMMAND

* syntax cleanup

* Actually correct is_path_url

* Correct call

* raise on other errors

* url2path behaves differently on unix

* Ensure proper quoting

* actually prepend slash in slash_hash

---------

Co-authored-by: Ryan Krattiger <ryan.krattiger@kitware.com>
Co-authored-by: Mike VanDenburgh <michael.vandenburgh@kitware.com>
2024-07-01 16:56:31 +02:00
John W. Parent
a70ea11e69 Gitlab CI: Windows Configs (#43967)
Add support for Gitlab CI on Windows

This PR adds the config changes required to configure and execute
Gitlab pipelines running Windows builds on Windows runners using
the existing Gitlab CI infrastructure (and newly added Windows 
infrastructure).

* Adds support for generating child pipelines dispatched to Windows runners
* Refactors the relevant pre-scripts, scripts, and post scripts to be compatible with Windows
* Adds Windows config section describing Windows jobs
* Adds VTK as Windows build stack (to be expanded later)
* Modifies proj to build on Windows
* Refactors Windows rpath symlinking to avoid system libs and externals

---------

Co-authored-by: Ryan Krattiger <ryan.krattiger@kitware.com>
Co-authored-by: Mike VanDenburgh <michael.vandenburgh@kitware.com>
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
Co-authored-by: Scott Wittenburg <scott.wittenburg@kitware.com>
2024-07-01 16:56:31 +02:00
John W. Parent
a79b1bd9af Buildcache/ensure symlinks proper prefix (#43851)
* archive: relative links only

Ensure all links written into tarfiles generated from Spack prefixes do not contain symlinks pointing outside the prefix

* binary_distribution: limit extraction to prefix

Ensure files extracted from spackballs are not links pointing outside of the prefix

* Ensure rpaths are properly set on Windows

* hard error on extraction of absolute links

* refactor for non link-modifying approach

* Restore tarball extraction to original impl

* use custom readlink

* cleanup symlink module

* make lstrip
2024-07-01 16:56:31 +02:00
John W. Parent
ac5d5485b9 Cdash reporting timeout (#44213)
* Add timeout to cdash reporter PUT request

Add cdash timeout everywhere
Correct mock responder api

* Style

* brief doc
2024-07-01 16:56:31 +02:00
John W. Parent
04258f9cce Prefer llnl.util.symlink.readlink to os.readlink (#44126)
Symlinks on Windows can use longpath prefixes (\\?\); these are fine
in the context of win32 API interactions but break numerous facets of
Spack behavior that rely on string parsing/matching (archiving,
binary distributions, tarball extraction, view regen, etc).

Spack's internal readlink method (llnl.util.symlink.readlink)
gracefully handles this by removing the prefix and otherwise behaving
exactly as os.readlink does, so we should prefer that in all cases.
2024-07-01 16:56:31 +02:00
Scott Wittenburg
1b14170bd1 gitlab ci: fix untouched spec pruning on windows (#44279)
Use correct path separator in get_all_package_diffs for all platforms.
Ensures correct package change computation on Windows when pruning unchanged specs in Gitlab CI
2024-07-01 16:56:31 +02:00
Massimiliano Culpo
a3bc9dbfe8 Make strong preferences even stronger (#44373)
Before this PR, if Spack could see a possibility to reuse a spec that
doesn't match a strong preference, it would do so. After the PR, a
strong preference would take precedence.
2024-07-01 16:56:31 +02:00
Greg Becker
e7c86259bd bugfix: external detection for compilers with os but not target (#44156)
avoid calling `spec.target` when None.

When an external compiler package has an `os` set but no `target` set, Spack
currently falls into a codepath that calls `spec.target` (which itself calls
`spec.architecture.target.Microarchitecture`) when `spec.architecture.target`
is None, throwing an error.

e.g.

```
packages:
  gcc:
    externals:
    - spec: gcc@12.3.1 os=rhel7
      prefix: /usr
```

---------

Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2024-07-01 16:56:31 +02:00
Massimiliano Culpo
2605aeb072 ASP-based solver: fix reusing externals on linux (#44316)
We need to tell clingo the libc compatibility of external nodes
in buildcaches or stores, to allow reuse.
2024-07-01 16:56:31 +02:00
Massimiliano Culpo
94536d2b66 Enforce consistency of gl providers (#44307)
* glew: rework dependency on gl

This simplifies the package and ensures a single gl implementation is
pulled in. Before we were adding direct dependencies, and those are
not unified through the virtual.

* mesa-demos: rework dependency on gl

This simplifies the package and ensures a single gl implementation is
pulled in. Before we were adding direct dependencies, and those are
not unified through the virtual.

* mesa-glu: rework dependency on gl

This simplifies the package and ensures a single gl implementation is
pulled in. Before we were adding direct dependencies, and those are
not unified through the virtual.

* paraview: fix dependency on glew

* mesa: group dependency on when("+glx")

* Add missing dependency on libxml2

* paraview: remove the "osmesa" and "egl" variant

Instead, enforce consistency using the "gl" virtual that allows
only one provider.

* visit: remove osmesa variant

* Disable paraview in the aws-isc stacks

* data-vis-sdk: rework constrains to enforce front-ends

* e4s-power: remove redundant paraview

* Pipelines: update osmesa variants

* trilinos-catalyst-ioss-adapter: make gl a run dependency
2024-07-01 16:56:31 +02:00
Massimiliano Culpo
5e580fc82e Remove mesa18 and libosmesa (#44264)
* Remove mesa18 and libosmesa

mesa18 was introduced in #19528 as a way to maintain the old
autotools build of mesa separate from the new meson build.

We could add a second build system to mesa, but since mesa18 has
been deprecated for a long time, we'll just remove it.

libosmesa was used to multiplex the gl provider between mesa18
and mesa, and is thus unnecessary. Remove it to reduce complexity
in the graphical stack.

* Remove references to mesa18 and libosmesa

* vtk: rework dependency on gl and osmesa

* memsurfer: rework dependency on vtk

* visit: minimal fix to avoid having both osmesa and glx
2024-07-01 16:56:31 +02:00
Harmen Stoppels
195bad8675 Prefer libiconv for iconv (#44335)
`glibc` and `musl` provide a basic implementation of `iconv` (`iconv`,
`iconv_open`, `iconv_close`), but in practice the installation may be
missing the character encoding methods to make them usable. On Fedora
for example, users need to

```yum install glibc-gconv-extra```

to get the character encodings that `gettext` requires during configure,
namely EUC-JP. Users may not have permissions to install the missing
parts of glibc.

Since Spack can install `libiconv`, it is simpler to use that by
default.
2024-07-01 16:56:31 +02:00
Harmen Stoppels
bd9f3f100a gcc: use -rpath {rpath_dir} not -rpath={rpath dir} (#44315)
to make macOS's linker happy.
2024-07-01 16:56:31 +02:00
Mosè Giordano
b5962613a0 suite-sparse: improve setting of the libs property (#44214)
on some distros it is in lib64/
2024-07-01 16:56:31 +02:00
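A hedged sketch of such a libs property (find_libraries with recursive=True searches lib/, lib64/ and other subdirectories under the prefix):

```python
from spack.package import *


class SuiteSparse(Package):
    @property
    def libs(self):
        # recursive search covers lib/ as well as lib64/
        return find_libraries(
            "libsuitesparseconfig", root=self.prefix, shared=True, recursive=True
        )
```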
Massimiliano Culpo
cbcfc7e10a Demote a warning to debug message, if C compiler is not there (#44182) 2024-07-01 16:56:31 +02:00
Massimiliano Culpo
579fadacd0 ASP-based solver: fix version optimization for roots (#44272)
This fixes a bug occurring when two root specs need to select
old versions, and these versions have the same penalty in the
optimization. This sometimes caused an older version to be
preferred to a more recent one.

The issue was the omission of `PackageNode` in the optimization
tuple.
2024-07-01 16:56:31 +02:00
Chris Green
b86d08b022 git: bump v2.39 to 2.45; deprecate unsafe versions (#44248) 2024-07-01 16:56:31 +02:00
Scott Wittenburg
02d62cf40f oci buildcache: handle pagination of tags (#43136)
This fixes an issue where ghcr, gitlab and possibly other container registries paginate tags by default, which violates the OCI spec v1.0, but is common practice (the spec was broken itself). After this commit, you can create build cache indices of > 100 specs on ghcr.

Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2024-07-01 16:56:31 +02:00
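A hedged sketch of the pagination loop (hypothetical helper; the real code also handles authentication): registries return at most n tags per response and advertise the next page in a Link header with rel="next".

```python
import json
import urllib.request


def list_all_tags(registry: str, repository: str) -> list:
    tags, url = [], f"https://{registry}/v2/{repository}/tags/list?n=100"
    while url:
        with urllib.request.urlopen(url) as response:
            tags.extend(json.loads(response.read())["tags"])
            link = response.headers.get("Link", "")
        # Link: </v2/...?last=foo&n=100>; rel="next"
        url = None
        if 'rel="next"' in link:
            next_path = link[link.index("<") + 1 : link.index(">")]
            url = f"https://{registry}{next_path}"
    return tags
```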
Harmen Stoppels
97369776f0 build_environment.py: deal with rpathing identical packages (#44219)
When multiple gcc-runtime packages exist in the same link sub-dag, only rpath
the latest.
2024-07-01 16:56:31 +02:00
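A hedged sketch of the dedup (hypothetical function over spec-like objects with .name and .version): keep only the newest gcc-runtime so an old libstdc++ can never shadow the one the binaries need.

```python
def rpath_deps(link_deps: list) -> list:
    runtimes = [d for d in link_deps if d.name == "gcc-runtime"]
    others = [d for d in link_deps if d.name != "gcc-runtime"]
    if runtimes:
        others.append(max(runtimes, key=lambda d: d.version))
    return others
```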
Howard Pritchard
47af0159dc py-matplotlib: qualify when to do a post install (#44191)
* py-matplotlib: qualify when to do a post install

Older versions of py-matplotlib don't seem to have some of the
files that the post install step is trying to install.
Looks like the files first appeared in 3.6.0 and later.

Signed-off-by: Howard Pritchard <hppritcha@gmail.com>

* Change install paths for older matplotlib

---------

Signed-off-by: Howard Pritchard <hppritcha@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-07-01 16:56:31 +02:00
Alec Scott
db6ead6fc1 rust: fix v1.78.0 instructions (#44127) 2024-07-01 16:56:31 +02:00
Harmen Stoppels
b4aa2c3cab glibc: detect from "Free Software Foundation" not "gnu" (#44154)
which should be more generic
2024-07-01 16:56:31 +02:00
Harmen Stoppels
4108de1ce4 Set version to v0.22.1.dev0 2024-07-01 16:56:31 +02:00
Todd Gamblin
5fe93fee1e Update CHANGELOG.md for v0.22.0 2024-05-12 02:06:28 +02:00
Todd Gamblin
8207f11333 Bump version to v0.22 2024-05-11 17:54:12 +02:00
Todd Gamblin
5bb5d2696f changelog: add changes from 0.21.1 and 0.21.2 (#44136)
These changes were added to the release branch but did not make it onto `develop`.
2024-05-11 17:48:27 +02:00
Harmen Stoppels
55f37dffe5 oci: improve default_retry (#44132)
Apparently urllib can throw a range of different exceptions:

1. HTTPError
2. URLError with e.reason set to the actual exception
3. TimeoutError from getresponse, which is not wrapped
2024-05-11 15:44:40 +02:00
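A minimal sketch of handling all three cases (hypothetical fetch helper, not Spack's retry code):

```python
import urllib.error
import urllib.request


def fetch(url: str, timeout: float = 10.0) -> bytes:
    try:
        with urllib.request.urlopen(url, timeout=timeout) as response:
            return response.read()
    except urllib.error.HTTPError as e:
        raise RuntimeError(f"server returned {e.code} for {url}") from e
    except urllib.error.URLError as e:
        # e.reason holds the actual exception (socket, SSL, timeout, ...)
        raise RuntimeError(f"cannot reach {url}: {e.reason}") from e
    except TimeoutError as e:
        # raised bare from getresponse, not wrapped in URLError
        raise RuntimeError(f"timed out fetching {url}") from e
```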
Harmen Stoppels
252a5bd71b PythonExtension: fix issue where package does not extend python (#44109) 2024-05-10 10:48:06 +02:00
Massimiliano Culpo
f55224f161 Fix filtering external specs (#44093)
When an include filter on externals is present, implicitly
include libcs.

Also, do not penalize deprecated versions if they come
from externals.
2024-05-09 20:48:43 +02:00
Massimiliano Culpo
189ae4b06e CI/Update macOS runners: macos-latest switched to macos-14 (#44094)
macos-latest switched to macos-14, so now we are running
two identical jobs.
2024-05-09 20:48:43 +02:00
Harmen Stoppels
5e9c702fa7 gcc: use -idirafter for libc headers (#44081)
GCC C++ headers like cstdlib use `#include_next <stdlib.h>` to wrap libc
headers. We're using `-isystem` for libc, which puts those headers too
early in the search path. `-idirafter` fixes this so `include_next`
works.
2024-05-08 20:45:39 +02:00
Harmen Stoppels
965bb4d3c0 gitlab ci: tutorial: add julia and vim (#44073) 2024-05-08 14:19:22 +02:00
Todd Gamblin
354f98c94a r: patch R-CVE-2024-27322 for r@3.5:4.3.3 (#44050)
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-05-08 14:19:22 +02:00
Tamara Dahlgren
5dce480154 Remove dead environment creation code (#44065) 2024-05-08 14:19:22 +02:00
Richarda Butler
f634d48b7c Include concrete environments with include_concrete (#33768)
Add the ability to include any number of (potentially nested) concrete environments, e.g.:

```yaml
   spack:
     specs: []
     concretizer:
         unify: true
     include_concrete:
     - /path/to/environment1
     - /path/to/environment2
```

or, from the CLI:

```console
   $ spack env create myenv
   $ spack -e myenv add python
   $ spack -e myenv concretize
   $ spack env create --include-concrete myenv included_env
```

The contents of included concrete environments' spack.lock files are
included in the environment's lock file at creation time. Any changes
to included concrete environments are only reflected after the environment
is re-concretized from the re-concretized included environments.

- [x] Concretize included envs
- [x] Save concrete specs in memory by hash
- [x] Add included envs to combined env's lock file
- [x] Add test
- [x] Update documentation

    Co-authored-by: Kayla Butler <butler59@llnl.gov>
    Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
    Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
    Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-05-08 14:19:22 +02:00
Massimiliano Culpo
4daee565ae Update the tutorial command to point to releases/v0.22 (#44056) 2024-05-08 14:19:22 +02:00
Massimiliano Culpo
8e4dbdc2d7 Bump removal version in deprecation messages (#44064) 2024-05-08 14:19:22 +02:00
Harmen Stoppels
4f6adc03cd gitlab: dont build paraview for neoverse v2 (#44060) 2024-05-08 14:19:22 +02:00
4939 changed files with 33355 additions and 55372 deletions


@@ -1,5 +1,4 @@
 {
-  "name": "Ubuntu 20.04",
   "image": "ghcr.io/spack/ubuntu20.04-runner-amd64-gcc-11.4:2023.08.01",
   "postCreateCommand": "./.devcontainer/postCreateCommand.sh"
 }


@@ -1,5 +0,0 @@
-{
-  "name": "Ubuntu 22.04",
-  "image": "ghcr.io/spack/ubuntu-22.04:v2024-05-07",
-  "postCreateCommand": "./.devcontainer/postCreateCommand.sh"
-}


@@ -5,10 +5,13 @@ updates:
     directory: "/"
     schedule:
       interval: "daily"
-  # Requirements to run style checks and build documentation
+  # Requirements to build documentation
   - package-ecosystem: "pip"
-    directories:
-    - "/.github/workflows/requirements/style/*"
-    - "/lib/spack/docs"
+    directory: "/lib/spack/docs"
+    schedule:
+      interval: "daily"
+  # Requirements to run style checks
+  - package-ecosystem: "pip"
+    directory: "/.github/workflows/style"
     schedule:
       interval: "daily"


@@ -28,8 +28,8 @@ jobs:
     run:
       shell: ${{ matrix.system.shell }}
     steps:
-    - uses: actions/checkout@d632683dd7b4114ad314bca15554477dd762a938
-    - uses: actions/setup-python@f677139bbe7f9c59b41e40162b753c062f5d49a3
+    - uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
+    - uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
       with:
         python-version: ${{inputs.python_version}}
     - name: Install Python packages
@@ -40,35 +40,30 @@ jobs:
       run: |
         python -m pip install --upgrade pywin32
     - name: Package audits (with coverage)
-      env:
-        COVERAGE_FILE: coverage/.coverage-audits-${{ matrix.system.os }}
       if: ${{ inputs.with_coverage == 'true' && runner.os != 'Windows' }}
       run: |
         . share/spack/setup-env.sh
         coverage run $(which spack) audit packages
-        coverage run $(which spack) audit configs
         coverage run $(which spack) -d audit externals
         coverage combine
+        coverage xml
     - name: Package audits (without coverage)
       if: ${{ inputs.with_coverage == 'false' && runner.os != 'Windows' }}
       run: |
         . share/spack/setup-env.sh
         spack -d audit packages
-        spack -d audit configs
         spack -d audit externals
     - name: Package audits (without coverage)
       if: ${{ runner.os == 'Windows' }}
       run: |
         . share/spack/setup-env.sh
         spack -d audit packages
         ./share/spack/qa/validate_last_exit.ps1
-        spack -d audit configs
-        ./share/spack/qa/validate_last_exit.ps1
         spack -d audit externals
         ./share/spack/qa/validate_last_exit.ps1
-    - uses: actions/upload-artifact@50769540e7f4bd5e21e526ee35c689e35e0d6874
-      if: ${{ inputs.with_coverage == 'true' && runner.os != 'Windows' }}
+    - uses: codecov/codecov-action@5ecb98a3c6b747ed38dc09f787459979aebb39be
+      if: ${{ inputs.with_coverage == 'true' }}
       with:
-        name: coverage-audits-${{ matrix.system.os }}
-        path: coverage
-        include-hidden-files: true
+        flags: unittests,audits
+        token: ${{ secrets.CODECOV_TOKEN }}
+        verbose: true


@@ -37,7 +37,7 @@ jobs:
           make patch unzip which xz python3 python3-devel tree \
           cmake bison
     - name: Checkout
-      uses: actions/checkout@d632683dd7b4114ad314bca15554477dd762a938
+      uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
       with:
         fetch-depth: 0
     - name: Bootstrap clingo
@@ -53,33 +53,27 @@ jobs:
     runs-on: ${{ matrix.runner }}
     strategy:
       matrix:
-        runner: ['macos-13', 'macos-14', "ubuntu-latest", "windows-latest"]
+        runner: ['macos-13', 'macos-14', "ubuntu-latest"]
     steps:
     - name: Setup macOS
-      if: ${{ matrix.runner != 'ubuntu-latest' && matrix.runner != 'windows-latest' }}
+      if: ${{ matrix.runner != 'ubuntu-latest' }}
       run: |
         brew install cmake bison tree
     - name: Checkout
-      uses: actions/checkout@d632683dd7b4114ad314bca15554477dd762a938
+      uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
       with:
         fetch-depth: 0
-    - uses: actions/setup-python@f677139bbe7f9c59b41e40162b753c062f5d49a3
+    - uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
       with:
         python-version: "3.12"
     - name: Bootstrap clingo
-      env:
-        SETUP_SCRIPT_EXT: ${{ matrix.runner == 'windows-latest' && 'ps1' || 'sh' }}
-        SETUP_SCRIPT_SOURCE: ${{ matrix.runner == 'windows-latest' && './' || 'source ' }}
-        USER_SCOPE_PARENT_DIR: ${{ matrix.runner == 'windows-latest' && '$env:userprofile' || '$HOME' }}
-        VALIDATE_LAST_EXIT: ${{ matrix.runner == 'windows-latest' && './share/spack/qa/validate_last_exit.ps1' || '' }}
       run: |
-        ${{ env.SETUP_SCRIPT_SOURCE }}share/spack/setup-env.${{ env.SETUP_SCRIPT_EXT }}
+        source share/spack/setup-env.sh
         spack bootstrap disable github-actions-v0.5
         spack bootstrap disable github-actions-v0.4
         spack external find --not-buildable cmake bison
         spack -d solve zlib
-        ${{ env.VALIDATE_LAST_EXIT }}
-        tree ${{ env.USER_SCOPE_PARENT_DIR }}/.spack/bootstrap/store/
+        tree ~/.spack/bootstrap/store/
   gnupg-sources:
     runs-on: ${{ matrix.runner }}
@@ -90,13 +84,15 @@ jobs:
     - name: Setup macOS
       if: ${{ matrix.runner != 'ubuntu-latest' }}
       run: |
-        brew install tree gawk
-        sudo rm -rf $(command -v gpg gpg2)
+        brew install tree
+        # Remove GnuPG since we want to bootstrap it
+        sudo rm -rf /usr/local/bin/gpg
     - name: Setup Ubuntu
       if: ${{ matrix.runner == 'ubuntu-latest' }}
-      run: sudo rm -rf $(command -v gpg gpg2 patchelf)
+      run: |
+        sudo rm -rf $(which gpg) $(which gpg2) $(which patchelf)
     - name: Checkout
-      uses: actions/checkout@d632683dd7b4114ad314bca15554477dd762a938
+      uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
       with:
         fetch-depth: 0
     - name: Bootstrap GnuPG
@@ -112,10 +108,10 @@ jobs:
     runs-on: ${{ matrix.runner }}
     strategy:
       matrix:
-        runner: ['macos-13', 'macos-14', "ubuntu-latest", "windows-latest"]
+        runner: ['macos-13', 'macos-14', "ubuntu-latest"]
     steps:
     - name: Setup macOS
-      if: ${{ matrix.runner != 'ubuntu-latest' && matrix.runner != 'windows-latest'}}
+      if: ${{ matrix.runner != 'ubuntu-latest' }}
       run: |
         brew install tree
         # Remove GnuPG since we want to bootstrap it
@@ -124,16 +120,11 @@ jobs:
       if: ${{ matrix.runner == 'ubuntu-latest' }}
       run: |
         sudo rm -rf $(which gpg) $(which gpg2) $(which patchelf)
-    - name: Setup Windows
-      if: ${{ matrix.runner == 'windows-latest' }}
-      run: |
-        Remove-Item -Path (Get-Command gpg).Path
-        Remove-Item -Path (Get-Command file).Path
     - name: Checkout
-      uses: actions/checkout@d632683dd7b4114ad314bca15554477dd762a938
+      uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
       with:
         fetch-depth: 0
-    - uses: actions/setup-python@f677139bbe7f9c59b41e40162b753c062f5d49a3
+    - uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
       with:
         python-version: |
           3.8
@@ -142,20 +133,11 @@ jobs:
           3.11
           3.12
     - name: Set bootstrap sources
-      env:
-        SETUP_SCRIPT_EXT: ${{ matrix.runner == 'windows-latest' && 'ps1' || 'sh' }}
-        SETUP_SCRIPT_SOURCE: ${{ matrix.runner == 'windows-latest' && './' || 'source ' }}
-      run: |
-        ${{ env.SETUP_SCRIPT_SOURCE }}share/spack/setup-env.${{ env.SETUP_SCRIPT_EXT }}
-        spack bootstrap disable github-actions-v0.4
-    - name: Disable from source bootstrap
-      if: ${{ matrix.runner != 'windows-latest' }}
       run: |
         source share/spack/setup-env.sh
+        spack bootstrap disable github-actions-v0.4
         spack bootstrap disable spack-install
     - name: Bootstrap clingo
-      # No binary clingo on Windows yet
-      if: ${{ matrix.runner != 'windows-latest' }}
       run: |
         set -e
         for ver in '3.8' '3.9' '3.10' '3.11' '3.12' ; do
@@ -168,7 +150,7 @@ jobs:
               not_found=0
               old_path="$PATH"
               export PATH="$ver_dir:$PATH"
-              ./bin/spack-tmpconfig -b ./.github/workflows/bin/bootstrap-test.sh
+              ./bin/spack-tmpconfig -b ./.github/workflows/bootstrap-test.sh
               export PATH="$old_path"
             fi
           fi
@@ -178,24 +160,8 @@ jobs:
           fi
         done
     - name: Bootstrap GnuPG
-      env:
-        SETUP_SCRIPT_EXT: ${{ matrix.runner == 'windows-latest' && 'ps1' || 'sh' }}
-        SETUP_SCRIPT_SOURCE: ${{ matrix.runner == 'windows-latest' && './' || 'source ' }}
-        USER_SCOPE_PARENT_DIR: ${{ matrix.runner == 'windows-latest' && '$env:userprofile' || '$HOME' }}
-        VALIDATE_LAST_EXIT: ${{ matrix.runner == 'windows-latest' && './share/spack/qa/validate_last_exit.ps1' || '' }}
       run: |
-        ${{ env.SETUP_SCRIPT_SOURCE }}share/spack/setup-env.${{ env.SETUP_SCRIPT_EXT }}
+        source share/spack/setup-env.sh
         spack -d gpg list
-        ${{ env.VALIDATE_LAST_EXIT }}
-        tree ${{ env.USER_SCOPE_PARENT_DIR }}/.spack/bootstrap/store/
-    - name: Bootstrap File
-      env:
-        SETUP_SCRIPT_EXT: ${{ matrix.runner == 'windows-latest' && 'ps1' || 'sh' }}
-        SETUP_SCRIPT_SOURCE: ${{ matrix.runner == 'windows-latest' && './' || 'source ' }}
-        USER_SCOPE_PARENT_DIR: ${{ matrix.runner == 'windows-latest' && '$env:userprofile' || '$HOME' }}
-        VALIDATE_LAST_EXIT: ${{ matrix.runner == 'windows-latest' && './share/spack/qa/validate_last_exit.ps1' || '' }}
-      run: |
-        ${{ env.SETUP_SCRIPT_SOURCE }}share/spack/setup-env.${{ env.SETUP_SCRIPT_EXT }}
-        spack -d python share/spack/qa/bootstrap-file.py
-        ${{ env.VALIDATE_LAST_EXIT }}
-        tree ${{ env.USER_SCOPE_PARENT_DIR }}/.spack/bootstrap/store/
+        tree ~/.spack/bootstrap/store/


@@ -40,7 +40,8 @@ jobs:
           # 1: Platforms to build for
           # 2: Base image (e.g. ubuntu:22.04)
           dockerfile: [[amazon-linux, 'linux/amd64,linux/arm64', 'amazonlinux:2'],
-                       [centos-stream9, 'linux/amd64,linux/arm64,linux/ppc64le', 'centos:stream9'],
+                       [centos7, 'linux/amd64,linux/arm64,linux/ppc64le', 'centos:7'],
+                       [centos-stream, 'linux/amd64,linux/arm64,linux/ppc64le', 'centos:stream'],
                        [leap15, 'linux/amd64,linux/arm64,linux/ppc64le', 'opensuse/leap:15'],
                        [ubuntu-focal, 'linux/amd64,linux/arm64,linux/ppc64le', 'ubuntu:20.04'],
                        [ubuntu-jammy, 'linux/amd64,linux/arm64,linux/ppc64le', 'ubuntu:22.04'],
@@ -55,7 +56,7 @@ jobs:
     if: github.repository == 'spack/spack'
     steps:
       - name: Checkout
-        uses: actions/checkout@d632683dd7b4114ad314bca15554477dd762a938
+        uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
       - uses: docker/metadata-action@8e5442c4ef9f78752691e2d8f8d19755c6f78e81
         id: docker_meta
@@ -76,7 +77,7 @@ jobs:
         env:
           SPACK_YAML_OS: "${{ matrix.dockerfile[2] }}"
         run: |
-          .github/workflows/bin/generate_spack_yaml_containerize.sh
+          .github/workflows/generate_spack_yaml_containerize.sh
           . share/spack/setup-env.sh
           mkdir -p dockerfiles/${{ matrix.dockerfile[0] }}
           spack containerize --last-stage=bootstrap | tee dockerfiles/${{ matrix.dockerfile[0] }}/Dockerfile
@@ -87,19 +88,19 @@ jobs:
           fi
       - name: Upload Dockerfile
-        uses: actions/upload-artifact@50769540e7f4bd5e21e526ee35c689e35e0d6874
+        uses: actions/upload-artifact@65462800fd760344b1a7b4382951275a0abb4808
         with:
           name: dockerfiles_${{ matrix.dockerfile[0] }}
           path: dockerfiles
       - name: Set up QEMU
-        uses: docker/setup-qemu-action@49b3bc8e6bdd4a60e6116a5414239cba5943d3cf
+        uses: docker/setup-qemu-action@68827325e0b33c7199eb31dd4e31fbe9023e06e3
       - name: Set up Docker Buildx
-        uses: docker/setup-buildx-action@988b5a0280414f521da01fcc63a27aeeb4b104db
+        uses: docker/setup-buildx-action@d70bba72b1f3fd22344832f00baa16ece964efeb
       - name: Log in to GitHub Container Registry
-        uses: docker/login-action@9780b0c442fbb1117ed29e0efdff1e18412f7567
+        uses: docker/login-action@e92390c5fb421da1463c202d546fed0ec5c39f20
         with:
           registry: ghcr.io
           username: ${{ github.actor }}
@@ -107,13 +108,13 @@ jobs:
       - name: Log in to DockerHub
         if: github.event_name != 'pull_request'
-        uses: docker/login-action@9780b0c442fbb1117ed29e0efdff1e18412f7567
+        uses: docker/login-action@e92390c5fb421da1463c202d546fed0ec5c39f20
         with:
           username: ${{ secrets.DOCKERHUB_USERNAME }}
           password: ${{ secrets.DOCKERHUB_TOKEN }}
       - name: Build & Deploy ${{ matrix.dockerfile[0] }}
-        uses: docker/build-push-action@5cd11c3a4ced054e52742c5fd54dca954e0edd85
+        uses: docker/build-push-action@2cdde995de11925a030ce8070c3d77a52ffcf1c0
         with:
           context: dockerfiles/${{ matrix.dockerfile[0] }}
           platforms: ${{ matrix.dockerfile[1] }}
@@ -126,7 +127,7 @@ jobs:
     needs: deploy-images
     steps:
       - name: Merge Artifacts
-        uses: actions/upload-artifact/merge@50769540e7f4bd5e21e526ee35c689e35e0d6874
+        uses: actions/upload-artifact/merge@65462800fd760344b1a7b4382951275a0abb4808
         with:
           name: dockerfiles
           pattern: dockerfiles_*


@@ -36,7 +36,7 @@ jobs:
       core: ${{ steps.filter.outputs.core }}
       packages: ${{ steps.filter.outputs.packages }}
     steps:
-    - uses: actions/checkout@d632683dd7b4114ad314bca15554477dd762a938
+    - uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
       if: ${{ github.event_name == 'push' }}
       with:
         fetch-depth: 0
@@ -53,13 +53,6 @@ jobs:
         - 'var/spack/repos/builtin/packages/clingo/**'
         - 'var/spack/repos/builtin/packages/python/**'
         - 'var/spack/repos/builtin/packages/re2c/**'
-        - 'var/spack/repos/builtin/packages/gnupg/**'
-        - 'var/spack/repos/builtin/packages/libassuan/**'
-        - 'var/spack/repos/builtin/packages/libgcrypt/**'
-        - 'var/spack/repos/builtin/packages/libgpg-error/**'
-        - 'var/spack/repos/builtin/packages/libksba/**'
-        - 'var/spack/repos/builtin/packages/npth/**'
-        - 'var/spack/repos/builtin/packages/pinentry/**'
         - 'lib/spack/**'
         - 'share/spack/**'
         - '.github/workflows/bootstrap.yml'
@@ -84,30 +77,8 @@ jobs:
     needs: [ prechecks, changes ]
     uses: ./.github/workflows/unit_tests.yaml
     secrets: inherit
-  upload-coverage:
-    needs: [ unit-tests, prechecks ]
-    runs-on: ubuntu-latest
-    steps:
-      - uses: actions/checkout@d632683dd7b4114ad314bca15554477dd762a938
-        with:
-          fetch-depth: 0
-      - name: Download coverage files
-        uses: actions/download-artifact@fa0a91b85d4f404e444e00e005971372dc801d16
-        with:
-          pattern: coverage-*
-          path: coverage
-          merge-multiple: true
-      - run: pip install --upgrade coverage
-      - run: ls -la coverage
-      - run: coverage combine -a coverage/.coverage*
-      - run: coverage xml
-      - name: "Upload coverage"
-        uses: codecov/codecov-action@e28ff129e5465c2c0dcc6f003fc735cb6ae0c673
-        with:
-          token: ${{ secrets.CODECOV_TOKEN }}
-          verbose: true
   all:
-    needs: [ upload-coverage, bootstrap ]
+    needs: [ unit-tests, bootstrap ]
     runs-on: ubuntu-latest
     steps:
     - name: Success

.github/workflows/install_spack.sh (new executable file)

@@ -0,0 +1,8 @@
+#!/usr/bin/env sh
+. share/spack/setup-env.sh
+echo -e "config:\n build_jobs: 2" > etc/spack/config.yaml
+spack config add "packages:all:target:[x86_64]"
+spack compiler find
+spack compiler info apple-clang
+spack debug report
+spack solve zlib


@@ -14,10 +14,10 @@ jobs:
   build-paraview-deps:
     runs-on: windows-latest
     steps:
-    - uses: actions/checkout@d632683dd7b4114ad314bca15554477dd762a938
+    - uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
       with:
         fetch-depth: 0
-    - uses: actions/setup-python@f677139bbe7f9c59b41e40162b753c062f5d49a3
+    - uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
       with:
         python-version: 3.9
     - name: Install Python packages


@@ -1,7 +0,0 @@
black==24.8.0
clingo==5.7.1
flake8==7.1.1
isort==5.13.2
mypy==1.8.0
types-six==1.16.21.20240513
vermin==1.6.0


@@ -0,0 +1,7 @@
+black==24.4.2
+clingo==5.7.1
+flake8==7.0.0
+isort==5.13.2
+mypy==1.8.0
+types-six==1.16.21.9
+vermin==1.6.0


@@ -16,34 +16,45 @@ jobs:
       matrix:
         os: [ubuntu-latest]
         python-version: ['3.7', '3.8', '3.9', '3.10', '3.11', '3.12']
+        concretizer: ['clingo']
         on_develop:
         - ${{ github.ref == 'refs/heads/develop' }}
         include:
+        - python-version: '3.11'
+          os: ubuntu-latest
+          concretizer: original
+          on_develop: ${{ github.ref == 'refs/heads/develop' }}
         - python-version: '3.6'
           os: ubuntu-20.04
+          concretizer: clingo
           on_develop: ${{ github.ref == 'refs/heads/develop' }}
         exclude:
         - python-version: '3.7'
           os: ubuntu-latest
+          concretizer: 'clingo'
           on_develop: false
         - python-version: '3.8'
           os: ubuntu-latest
+          concretizer: 'clingo'
           on_develop: false
         - python-version: '3.9'
           os: ubuntu-latest
+          concretizer: 'clingo'
           on_develop: false
         - python-version: '3.10'
           os: ubuntu-latest
+          concretizer: 'clingo'
           on_develop: false
         - python-version: '3.11'
           os: ubuntu-latest
+          concretizer: 'clingo'
           on_develop: false
     steps:
-    - uses: actions/checkout@d632683dd7b4114ad314bca15554477dd762a938
+    - uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
       with:
         fetch-depth: 0
-    - uses: actions/setup-python@f677139bbe7f9c59b41e40162b753c062f5d49a3
+    - uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
       with:
         python-version: ${{ matrix.python-version }}
     - name: Install System packages
@@ -61,7 +72,7 @@ jobs:
       run: |
         # Need this for the git tests to succeed.
         git --version
-        . .github/workflows/bin/setup_git.sh
+        . .github/workflows/setup_git.sh
     - name: Bootstrap clingo
       if: ${{ matrix.concretizer == 'clingo' }}
       env:
@@ -74,25 +85,25 @@ jobs:
     - name: Run unit tests
       env:
         SPACK_PYTHON: python
+        SPACK_TEST_SOLVER: ${{ matrix.concretizer }}
         SPACK_TEST_PARALLEL: 2
         COVERAGE: true
-        COVERAGE_FILE: coverage/.coverage-${{ matrix.os }}-python${{ matrix.python-version }}
         UNIT_TEST_COVERAGE: ${{ matrix.python-version == '3.11' }}
       run: |
         share/spack/qa/run-unit-tests
-    - uses: actions/upload-artifact@50769540e7f4bd5e21e526ee35c689e35e0d6874
+    - uses: codecov/codecov-action@5ecb98a3c6b747ed38dc09f787459979aebb39be
       with:
-        name: coverage-${{ matrix.os }}-python${{ matrix.python-version }}
-        path: coverage
-        include-hidden-files: true
+        flags: unittests,linux,${{ matrix.concretizer }}
+        token: ${{ secrets.CODECOV_TOKEN }}
+        verbose: true
   # Test shell integration
   shell:
     runs-on: ubuntu-latest
     steps:
-    - uses: actions/checkout@d632683dd7b4114ad314bca15554477dd762a938
+    - uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
       with:
         fetch-depth: 0
-    - uses: actions/setup-python@f677139bbe7f9c59b41e40162b753c062f5d49a3
+    - uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
       with:
         python-version: '3.11'
     - name: Install System packages
@@ -107,17 +118,17 @@ jobs:
       run: |
         # Need this for the git tests to succeed.
         git --version
-        . .github/workflows/bin/setup_git.sh
+        . .github/workflows/setup_git.sh
     - name: Run shell tests
       env:
         COVERAGE: true
       run: |
         share/spack/qa/run-shell-tests
-    - uses: actions/upload-artifact@50769540e7f4bd5e21e526ee35c689e35e0d6874
+    - uses: codecov/codecov-action@5ecb98a3c6b747ed38dc09f787459979aebb39be
       with:
-        name: coverage-shell
-        path: coverage
-        include-hidden-files: true
+        flags: shelltests,linux
+        token: ${{ secrets.CODECOV_TOKEN }}
+        verbose: true
   # Test RHEL8 UBI with platform Python. This job is run
   # only on PRs modifying core Spack
@@ -130,13 +141,13 @@ jobs:
         dnf install -y \
             bzip2 curl file gcc-c++ gcc gcc-gfortran git gnupg2 gzip \
             make patch tcl unzip which xz
-    - uses: actions/checkout@d632683dd7b4114ad314bca15554477dd762a938
+    - uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
     - name: Setup repo and non-root user
       run: |
         git --version
         git config --global --add safe.directory /__w/spack/spack
         git fetch --unshallow
-        . .github/workflows/bin/setup_git.sh
+        . .github/workflows/setup_git.sh
         useradd spack-test
         chown -R spack-test .
     - name: Run unit tests
@@ -149,10 +160,10 @@ jobs:
   clingo-cffi:
     runs-on: ubuntu-latest
     steps:
-    - uses: actions/checkout@d632683dd7b4114ad314bca15554477dd762a938
+    - uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
       with:
         fetch-depth: 0
-    - uses: actions/setup-python@f677139bbe7f9c59b41e40162b753c062f5d49a3
+    - uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
       with:
         python-version: '3.11'
     - name: Install System packages
@@ -167,18 +178,18 @@ jobs:
       run: |
         # Need this for the git tests to succeed.
         git --version
-        . .github/workflows/bin/setup_git.sh
+        . .github/workflows/setup_git.sh
     - name: Run unit tests (full suite with coverage)
       env:
         COVERAGE: true
-        COVERAGE_FILE: coverage/.coverage-clingo-cffi
+        SPACK_TEST_SOLVER: clingo
       run: |
         share/spack/qa/run-unit-tests
-    - uses: actions/upload-artifact@50769540e7f4bd5e21e526ee35c689e35e0d6874
+    - uses: codecov/codecov-action@5ecb98a3c6b747ed38dc09f787459979aebb39be
       with:
-        name: coverage-clingo-cffi
-        path: coverage
-        include-hidden-files: true
+        flags: unittests,linux,clingo
+        token: ${{ secrets.CODECOV_TOKEN }}
+        verbose: true
   # Run unit tests on MacOS
   macos:
     runs-on: ${{ matrix.os }}
@@ -187,10 +198,10 @@ jobs:
         os: [macos-13, macos-14]
         python-version: ["3.11"]
     steps:
-    - uses: actions/checkout@d632683dd7b4114ad314bca15554477dd762a938
+    - uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
       with:
         fetch-depth: 0
-    - uses: actions/setup-python@f677139bbe7f9c59b41e40162b753c062f5d49a3
+    - uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
       with:
         python-version: ${{ matrix.python-version }}
     - name: Install Python packages
@@ -202,21 +213,21 @@ jobs:
         brew install dash fish gcc gnupg2 kcov
     - name: Run unit tests
       env:
+        SPACK_TEST_SOLVER: clingo
         SPACK_TEST_PARALLEL: 4
-        COVERAGE_FILE: coverage/.coverage-${{ matrix.os }}-python${{ matrix.python-version }}
       run: |
         git --version
-        . .github/workflows/bin/setup_git.sh
+        . .github/workflows/setup_git.sh
         . share/spack/setup-env.sh
         $(which spack) bootstrap disable spack-install
         $(which spack) solve zlib
         common_args=(--dist loadfile --tx '4*popen//python=./bin/spack-tmpconfig python -u ./bin/spack python' -x)
         $(which spack) unit-test --verbose --cov --cov-config=pyproject.toml --cov-report=xml:coverage.xml "${common_args[@]}"
-    - uses: actions/upload-artifact@50769540e7f4bd5e21e526ee35c689e35e0d6874
+    - uses: codecov/codecov-action@5ecb98a3c6b747ed38dc09f787459979aebb39be
       with:
-        name: coverage-${{ matrix.os }}-python${{ matrix.python-version }}
-        path: coverage
-        include-hidden-files: true
+        flags: unittests,macos
+        token: ${{ secrets.CODECOV_TOKEN }}
+        verbose: true
   # Run unit tests on Windows
   windows:
     defaults:
@@ -225,10 +236,10 @@ jobs:
         powershell Invoke-Expression -Command "./share/spack/qa/windows_test_setup.ps1"; {0}
     runs-on: windows-latest
     steps:
-    - uses: actions/checkout@d632683dd7b4114ad314bca15554477dd762a938
+    - uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
       with:
         fetch-depth: 0
-    - uses: actions/setup-python@f677139bbe7f9c59b41e40162b753c062f5d49a3
+    - uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
       with:
         python-version: 3.9
     - name: Install Python packages
@@ -236,15 +247,15 @@ jobs:
         python -m pip install --upgrade pip pywin32 setuptools pytest-cov clingo
     - name: Create local develop
       run: |
-        ./.github/workflows/bin/setup_git.ps1
+        ./.github/workflows/setup_git.ps1
     - name: Unit Test
-      env:
-        COVERAGE_FILE: coverage/.coverage-windows
       run: |
         spack unit-test -x --verbose --cov --cov-config=pyproject.toml
         ./share/spack/qa/validate_last_exit.ps1
-    - uses: actions/upload-artifact@50769540e7f4bd5e21e526ee35c689e35e0d6874
+        coverage combine -a
+        coverage xml
+    - uses: codecov/codecov-action@125fc84a9a348dbcf27191600683ec096ec9021c
       with:
-        name: coverage-windows
-        path: coverage
-        include-hidden-files: true
+        flags: unittests,windows
+        token: ${{ secrets.CODECOV_TOKEN }}
+        verbose: true


@@ -18,15 +18,15 @@ jobs:
   validate:
     runs-on: ubuntu-latest
     steps:
-    - uses: actions/checkout@d632683dd7b4114ad314bca15554477dd762a938
-    - uses: actions/setup-python@f677139bbe7f9c59b41e40162b753c062f5d49a3
+    - uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
+    - uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
       with:
         python-version: '3.11'
         cache: 'pip'
     - name: Install Python Packages
       run: |
         pip install --upgrade pip setuptools
-        pip install -r .github/workflows/requirements/style/requirements.txt
+        pip install -r .github/workflows/style/requirements.txt
     - name: vermin (Spack's Core)
       run: vermin --backport importlib --backport argparse --violations --backport typing -t=3.6- -vvv lib/spack/spack/ lib/spack/llnl/ bin/
     - name: vermin (Repositories)
@@ -35,22 +35,22 @@ jobs:
   style:
     runs-on: ubuntu-latest
     steps:
-    - uses: actions/checkout@d632683dd7b4114ad314bca15554477dd762a938
+    - uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
       with:
         fetch-depth: 0
-    - uses: actions/setup-python@f677139bbe7f9c59b41e40162b753c062f5d49a3
+    - uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
       with:
         python-version: '3.11'
         cache: 'pip'
     - name: Install Python packages
       run: |
         pip install --upgrade pip setuptools
-        pip install -r .github/workflows/requirements/style/requirements.txt
+        pip install -r .github/workflows/style/requirements.txt
     - name: Setup git configuration
       run: |
         # Need this for the git tests to succeed.
         git --version
-        . .github/workflows/bin/setup_git.sh
+        . .github/workflows/setup_git.sh
     - name: Run style tests
       run: |
         share/spack/qa/run-style-tests
@@ -70,13 +70,13 @@ jobs:
         dnf install -y \
             bzip2 curl file gcc-c++ gcc gcc-gfortran git gnupg2 gzip \
             make patch tcl unzip which xz
-    - uses: actions/checkout@d632683dd7b4114ad314bca15554477dd762a938
+    - uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
     - name: Setup repo and non-root user
       run: |
         git --version
         git config --global --add safe.directory /__w/spack/spack
         git fetch --unshallow
-        . .github/workflows/bin/setup_git.sh
+        . .github/workflows/setup_git.sh
         useradd spack-test
         chown -R spack-test .
     - name: Bootstrap Spack development environment
@@ -87,62 +87,3 @@ jobs:
         spack -d bootstrap now --dev
         spack style -t black
         spack unit-test -V
-  import-check:
-    runs-on: ubuntu-latest
-    steps:
-    - uses: julia-actions/setup-julia@v2
-      with:
-        version: '1.10'
-    - uses: julia-actions/cache@v2
-    # PR: use the base of the PR as the old commit
-    - name: Checkout PR base commit
-      if: github.event_name == 'pull_request'
-      uses: actions/checkout@d632683dd7b4114ad314bca15554477dd762a938
-      with:
-        ref: ${{ github.event.pull_request.base.sha }}
-        path: old
-    # not a PR: use the previous commit as the old commit
-    - name: Checkout previous commit
-      if: github.event_name != 'pull_request'
-      uses: actions/checkout@d632683dd7b4114ad314bca15554477dd762a938
-      with:
-        fetch-depth: 2
-        path: old
-    - name: Checkout previous commit
-      if: github.event_name != 'pull_request'
-      run: git -C old reset --hard HEAD^
-    - name: Checkout new commit
-      uses: actions/checkout@d632683dd7b4114ad314bca15554477dd762a938
-      with:
-        path: new
-    - name: Install circular import checker
-      uses: actions/checkout@d632683dd7b4114ad314bca15554477dd762a938
-      with:
-        repository: haampie/circular-import-fighter
-        ref: 555519c6fd5564fd2eb844e7b87e84f4d12602e2
-        path: circular-import-fighter
-    - name: Install dependencies
-      working-directory: circular-import-fighter
-      run: make -j dependencies
-    - name: Import cycles before
-      working-directory: circular-import-fighter
-      run: make SPACK_ROOT=../old && cp solution solution.old
-    - name: Import cycles after
-      working-directory: circular-import-fighter
-      run: make clean-graph && make SPACK_ROOT=../new && cp solution solution.new
-    - name: Compare import cycles
-      working-directory: circular-import-fighter
-      run: |
-        edges_before="$(grep -oP 'edges to delete: \K\d+' solution.old)"
-        edges_after="$(grep -oP 'edges to delete: \K\d+' solution.new)"
-        if [ "$edges_after" -gt "$edges_before" ]; then
-          printf '\033[1;31mImport check failed: %s imports need to be deleted, ' "$edges_after"
-          printf 'previously this was %s\033[0m\n' "$edges_before"
printf 'Compare \033[1;97m"Import cycles before"\033[0m and '
printf '\033[1;97m"Import cycles after"\033[0m to see problematic imports.\n'
exit 1
else
printf '\033[1;32mImport check passed: %s <= %s\033[0m\n' "$edges_after" "$edges_before"
fi


@@ -1,3 +1,65 @@
# v0.22.2 (2024-09-21)
## Bugfixes
- Forward compatibility with Spack 0.23 packages with language dependencies (#45205, #45191)
- Forward compatibility with `urllib` from Python 3.12.6+ (#46453, #46483)
- Bump vendored `archspec` for better aarch64 support (#45721, #46445)
- Support macOS Sequoia (#45018, #45127)
- Fix regression in `{variants.X}` and `{variants.X.value}` format strings (#46206)
- Ensure shell escaping of environment variable values in load and activate commands (#42780)
- Fix an issue where `spec[pkg]` considers specs outside the current DAG (#45090)
- Do not halt concretization on unknown variants in externals (#45326)
- Improve validation of `develop` config section (#46485)
- Explicitly disable `ccache` if turned off in config, to avoid cache pollution (#45275)
- Improve backwards compatibility in `include_concrete` (#45766)
- Fix issue where package tags were sometimes repeated (#45160)
- Make `setup-env.sh` "sourced only" by dropping execution bits (#45641)
- Make certain source/binary fetch errors recoverable instead of a hard error (#45683)
- Remove debug statements in package hash computation (#45235)
- Remove redundant clingo warnings (#45269)
- Remove hard-coded layout version (#45645)
- Do not initialize previous store state in `use_store` (#45268)
- Docs improvements (#46475)
## Package updates
- `chapel` major update (#42197, #44931, #45304)
# v0.22.1 (2024-07-04)
## Bugfixes
- Fix reuse of externals on Linux (#44316)
- Ensure parent gcc-runtime version >= child (#44834, #44870)
- Ensure the latest gcc-runtime is rpath'ed when multiple exist among link deps (#44219)
- Improve version detection of glibc (#44154)
- Improve heuristics for solver (#44893, #44976, #45023)
- Make strong preferences override reuse (#44373)
- Reduce verbosity when C compiler is missing (#44182)
- Make missing ccache executable an error when required (#44740)
- Make every environment view containing `python` a `venv` (#44382)
- Fix external detection for compilers with os but no target (#44156)
- Fix version optimization for roots (#44272)
- Handle common implementations of pagination of tags in OCI build caches (#43136)
- Apply fetched patches to develop specs (#44950)
- Avoid Windows wrappers for filesystem utilities on non-Windows (#44126)
- Fix issue with long filenames in build caches on Windows (#43851)
- Fix formatting issue in `spack audit` (#45045)
- CI fixes (#44582, #43965, #43967, #44279, #44213)
## Package updates
- protobuf: fix 3.4:3.21 patch checksum (#44443)
- protobuf: update hash for patch needed when="@3.4:3.21" (#44210)
- git: bump v2.39 to 2.45; deprecate unsafe versions (#44248)
- gcc: use -rpath {rpath_dir} not -rpath={rpath dir} (#44315)
- Remove mesa18 and libosmesa (#44264)
- Enforce consistency of `gl` providers (#44307)
- Require libiconv for iconv (#44335, #45026).
Notice that glibc/musl also provide iconv, but are not guaranteed to be
complete. Set `packages:iconv:require:[glibc]` to restore the old behavior.
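  For example, a minimal sketch of that override in any `packages.yaml` scope:
  ```yaml
  packages:
    iconv:
      require: [glibc]
  ```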
- py-matplotlib: qualify when to do a post install (#44191)
- rust: fix v1.78.0 instructions (#44127)
- suite-sparse: improve setting of the `libs` property (#44214)
- netlib-lapack: provide blas and lapack together (#44981)
# v0.22.0 (2024-05-12) # v0.22.0 (2024-05-12)
@@ -319,6 +381,7 @@
* 344 committers to packages * 344 committers to packages
* 45 committers to core * 45 committers to core
# v0.21.2 (2024-03-01) # v0.21.2 (2024-03-01)
## Bugfixes ## Bugfixes
@@ -348,7 +411,7 @@
- spack graph: fix coloring with environments (#41240) - spack graph: fix coloring with environments (#41240)
- spack info: sort variants in --variants-by-name (#41389) - spack info: sort variants in --variants-by-name (#41389)
- Spec.format: error on old style format strings (#41934) - Spec.format: error on old style format strings (#41934)
- ASP-based solver: - ASP-based solver:
- fix infinite recursion when computing concretization errors (#41061) - fix infinite recursion when computing concretization errors (#41061)
- don't error for type mismatch on preferences (#41138) - don't error for type mismatch on preferences (#41138)
- don't emit spurious debug output (#41218) - don't emit spurious debug output (#41218)


@@ -32,7 +32,7 @@
Spack is a multi-platform package manager that builds and installs Spack is a multi-platform package manager that builds and installs
multiple versions and configurations of software. It works on Linux, multiple versions and configurations of software. It works on Linux,
macOS, Windows, and many supercomputers. Spack is non-destructive: installing a macOS, and many supercomputers. Spack is non-destructive: installing a
new version of a package does not break existing installations, so many new version of a package does not break existing installations, so many
configurations of the same package can coexist. configurations of the same package can coexist.
@@ -46,18 +46,13 @@ See the
[Feature Overview](https://spack.readthedocs.io/en/latest/features.html) [Feature Overview](https://spack.readthedocs.io/en/latest/features.html)
for examples and highlights. for examples and highlights.
To install spack and your first package, make sure you have Python & Git. To install spack and your first package, make sure you have Python.
Then: Then:
$ git clone -c feature.manyFiles=true --depth=2 https://github.com/spack/spack.git $ git clone -c feature.manyFiles=true https://github.com/spack/spack.git
$ cd spack/bin $ cd spack/bin
$ ./spack install zlib $ ./spack install zlib
> [!TIP]
> `-c feature.manyFiles=true` improves git's performance on repositories with 1,000+ files.
>
> `--depth=2` prunes the git history to reduce the size of the Spack installation.
Documentation Documentation
---------------- ----------------


@@ -22,4 +22,4 @@
# #
# This is compatible across platforms. # This is compatible across platforms.
# #
exec spack python "$@" exec /usr/bin/env spack python "$@"


@@ -188,27 +188,25 @@ if NOT "%_sp_args%"=="%_sp_args:--help=%" (
goto :end_switch goto :end_switch
:case_load :case_load
if NOT defined _sp_args ( :: If args contain --sh, --csh, or -h/--help: just execute.
exit /B 0 if defined _sp_args (
) if NOT "%_sp_args%"=="%_sp_args:--help=%" (
goto :default_case
:: If args contain --bat, or -h/--help: just execute. ) else if NOT "%_sp_args%"=="%_sp_args:-h=%" (
if NOT "%_sp_args%"=="%_sp_args:--help=%" ( goto :default_case
goto :default_case ) else if NOT "%_sp_args%"=="%_sp_args:--bat=%" (
) else if NOT "%_sp_args%"=="%_sp_args:-h=%" ( goto :default_case
goto :default_case )
) else if NOT "%_sp_args%"=="%_sp_args:--bat=%" (
goto :default_case
) else if NOT "%_sp_args%"=="%_sp_args:--list=%" (
goto :default_case
) )
for /f "tokens=* USEBACKQ" %%I in ( for /f "tokens=* USEBACKQ" %%I in (
`python "%spack%" %_sp_flags% %_sp_subcommand% --bat %_sp_args%` `python "%spack%" %_sp_flags% %_sp_subcommand% --bat %_sp_args%`) do %%I
) do %%I
goto :end_switch goto :end_switch
:case_unload
goto :case_load
:default_case :default_case
python "%spack%" %_sp_flags% %_sp_subcommand% %_sp_args% python "%spack%" %_sp_flags% %_sp_subcommand% %_sp_args%
goto :end_switch goto :end_switch


@@ -115,6 +115,12 @@ config:
suppress_gpg_warnings: false suppress_gpg_warnings: false
# If set to true, Spack will attempt to build any compiler on the spec
# that is not already available. If set to false, Spack will only use
# compilers already configured in compilers.yaml
install_missing_compilers: false
# If set to true, Spack will always check checksums after downloading # If set to true, Spack will always check checksums after downloading
# archives. If false, Spack skips the checksum step. # archives. If false, Spack skips the checksum step.
checksum: true checksum: true
@@ -164,6 +170,23 @@ config:
# If set to true, Spack will use ccache to cache C compiles. # If set to true, Spack will use ccache to cache C compiles.
ccache: false ccache: false
# The concretization algorithm to use in Spack. Options are:
#
# 'clingo': Uses a logic solver under the hood to solve DAGs with full
# backtracking and optimization for user preferences. Spack will
# try to bootstrap the logic solver, if not already available.
#
# 'original': Spack's original greedy, fixed-point concretizer. This
# algorithm can make decisions too early and will not backtrack
# sufficiently for many specs. This will soon be deprecated in
# favor of clingo.
#
# See `concretizer.yaml` for more settings you can fine-tune when
# using clingo.
concretizer: clingo
# How long to wait to lock the Spack installation database. This lock is used # How long to wait to lock the Spack installation database. This lock is used
# when Spack needs to manage its own package metadata and all operations are # when Spack needs to manage its own package metadata and all operations are
# expected to complete within the default time limit. The timeout should # expected to complete within the default time limit. The timeout should


@@ -0,0 +1,16 @@
# -------------------------------------------------------------------------
# This is the default configuration for Spack's module file generation.
#
# Settings here are versioned with Spack and are intended to provide
# sensible defaults out of the box. Spack maintainers should edit this
# file to keep it current.
#
# Users can override these settings by editing the following files.
#
# Per-spack-instance settings (overrides defaults):
# $SPACK_ROOT/etc/spack/modules.yaml
#
# Per-user settings (overrides default and site settings):
# ~/.spack/modules.yaml
# -------------------------------------------------------------------------
modules: {}
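# As an illustrative sketch (not part of the shipped defaults), a user
# override in ~/.spack/modules.yaml might look like:
#
#   modules:
#     default:
#       tcl:
#         all:
#           autoload: direct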


@@ -0,0 +1,3 @@
packages:
iconv:
require: [libiconv]


@@ -20,14 +20,11 @@ packages:
awk: [gawk] awk: [gawk]
armci: [armcimpi] armci: [armcimpi]
blas: [openblas, amdblis] blas: [openblas, amdblis]
c: [gcc]
cxx: [gcc]
D: [ldc] D: [ldc]
daal: [intel-oneapi-daal] daal: [intel-oneapi-daal]
elf: [elfutils] elf: [elfutils]
fftw-api: [fftw, amdfftw] fftw-api: [fftw, amdfftw]
flame: [libflame, amdlibflame] flame: [libflame, amdlibflame]
fortran: [gcc]
fortran-rt: [gcc-runtime, intel-oneapi-runtime] fortran-rt: [gcc-runtime, intel-oneapi-runtime]
fuse: [libfuse] fuse: [libfuse]
gl: [glx, osmesa] gl: [glx, osmesa]
@@ -64,7 +61,6 @@ packages:
tbb: [intel-tbb] tbb: [intel-tbb]
unwind: [libunwind] unwind: [libunwind]
uuid: [util-linux-uuid, libuuid] uuid: [util-linux-uuid, libuuid]
wasi-sdk: [wasi-sdk-prebuilt]
xxd: [xxd-standalone, vim] xxd: [xxd-standalone, vim]
yacc: [bison, byacc] yacc: [bison, byacc]
ziglang: [zig] ziglang: [zig]
@@ -72,13 +68,3 @@ packages:
permissions: permissions:
read: world read: world
write: user write: user
cray-mpich:
buildable: false
cray-mvapich2:
buildable: false
fujitsu-mpi:
buildable: false
hpcx-mpi:
buildable: false
spectrum-mpi:
buildable: false


@@ -1,5 +1,6 @@
config: config:
locks: false locks: false
concretizer: clingo
build_stage:: build_stage::
- '$spack/.staging' - '$spack/.staging'
stage_name: '{name}-{version}-{hash:7}' stage_name: '{name}-{version}-{hash:7}'


@@ -1175,17 +1175,6 @@ unspecified version, but packages can depend on other packages with
could depend on ``mpich@1.2:`` if it can only build with version could depend on ``mpich@1.2:`` if it can only build with version
``1.2`` or higher of ``mpich``. ``1.2`` or higher of ``mpich``.
.. note:: Windows Spec Syntax Caveats
Windows has a few idiosyncrasies when it comes to the Spack spec syntax and the use of certain shells.
Spack's spec dependency syntax uses the caret (``^``) character; however, this is an escape character in CMD,
so it must be escaped with an additional caret (i.e. ``^^``).
CMD will also attempt to interpret strings with ``=`` characters in them. Any spec including this symbol
must be double quoted.
Note: all of these issues are unique to CMD; they can be avoided by using Powershell.
For more context on these caveats see the related issues: `caret <https://github.com/spack/spack/issues/42833>`_ and `equals <https://github.com/spack/spack/issues/43348>`_
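For example, both caveats can be handled in a CMD session as follows (the specs shown are placeholders):
.. code-block:: console
   C:\> spack install hdf5 ^^mpich
   C:\> spack install "hdf5 cflags=-O2"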
Below are more details about the specifiers that you can add to specs. Below are more details about the specifiers that you can add to specs.
.. _version-specifier: .. _version-specifier:
@@ -1444,12 +1433,22 @@ the reserved keywords ``platform``, ``os`` and ``target``:
$ spack install libelf os=ubuntu18.04 $ spack install libelf os=ubuntu18.04
$ spack install libelf target=broadwell $ spack install libelf target=broadwell
or together by using the reserved keyword ``arch``:
.. code-block:: console
$ spack install libelf arch=cray-CNL10-haswell
Normally users don't have to bother specifying the architecture if they Normally users don't have to bother specifying the architecture if they
are installing software for their current host, as in that case the are installing software for their current host, as in that case the
values will be detected automatically. If you need fine-grained control values will be detected automatically. If you need fine-grained control
over which packages use which targets (or over *all* packages' default over which packages use which targets (or over *all* packages' default
target), see :ref:`package-preferences`. target), see :ref:`package-preferences`.
.. admonition:: Cray machines
The situation is a little bit different for Cray machines and a detailed
explanation on how the architecture can be set on them can be found at :ref:`cray-support`
.. _support-for-microarchitectures: .. _support-for-microarchitectures:


@@ -147,15 +147,6 @@ example, the ``bash`` shell is used to run the ``autogen.sh`` script.
def autoreconf(self, spec, prefix): def autoreconf(self, spec, prefix):
which("bash")("autogen.sh") which("bash")("autogen.sh")
If the ``package.py`` has build instructions in a separate
:ref:`builder class <multiple_build_systems>`, the signature for a phase changes slightly:
.. code-block:: python
class AutotoolsBuilder(AutotoolsBuilder):
    # Note the extra ``pkg`` argument compared to the package-level phase.
    def autoreconf(self, pkg, spec, prefix):
        which("bash")("autogen.sh")
""""""""""""""""""""""""""""""""""""""" """""""""""""""""""""""""""""""""""""""
patching configure or Makefile.in files patching configure or Makefile.in files
""""""""""""""""""""""""""""""""""""""" """""""""""""""""""""""""""""""""""""""


@@ -25,7 +25,7 @@ use Spack to build packages with the tools.
The Spack Python class ``IntelOneapiPackage`` is a base class that is The Spack Python class ``IntelOneapiPackage`` is a base class that is
used by ``IntelOneapiCompilers``, ``IntelOneapiMkl``, used by ``IntelOneapiCompilers``, ``IntelOneapiMkl``,
``IntelOneapiTbb`` and other classes to implement the oneAPI ``IntelOneapiTbb`` and other classes to implement the oneAPI
packages. Search for ``oneAPI`` at `packages.spack.io <https://packages.spack.io>`_ for the full packages. Search for ``oneAPI`` at `<packages.spack.io>`_ for the full
list of available oneAPI packages, or use:: list of available oneAPI packages, or use::
spack list -d oneAPI spack list -d oneAPI


@@ -11,8 +11,7 @@ Chaining Spack Installations (upstreams.yaml)
You can point your Spack installation to another installation to use any You can point your Spack installation to another installation to use any
packages that are installed there. To register the other Spack instance, packages that are installed there. To register the other Spack instance,
you can add it as an entry to ``upstreams.yaml`` at any of the you can add it as an entry to ``upstreams.yaml``:
:ref:`configuration-scopes`:
.. code-block:: yaml .. code-block:: yaml
@@ -23,8 +22,7 @@ you can add it as an entry to ``upstreams.yaml`` at any of the
install_tree: /path/to/another/spack/opt/spack install_tree: /path/to/another/spack/opt/spack
``install_tree`` must point to the ``opt/spack`` directory inside of the ``install_tree`` must point to the ``opt/spack`` directory inside of the
Spack base directory, or the location of the ``install_tree`` defined Spack base directory.
in :ref:`config.yaml <config-yaml>`.
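Put together, a minimal ``upstreams.yaml`` entry looks like this (the instance name is just a placeholder):
.. code-block:: yaml
   upstreams:
     spack-instance-1:
       install_tree: /path/to/another/spack/opt/spack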
Once the upstream Spack instance has been added, ``spack find`` will Once the upstream Spack instance has been added, ``spack find`` will
automatically check the upstream instance when querying installed packages, automatically check the upstream instance when querying installed packages,


@@ -206,7 +206,6 @@ def setup(sphinx):
("py:class", "six.moves.urllib.parse.ParseResult"), ("py:class", "six.moves.urllib.parse.ParseResult"),
("py:class", "TextIO"), ("py:class", "TextIO"),
("py:class", "hashlib._Hash"), ("py:class", "hashlib._Hash"),
("py:class", "concurrent.futures._base.Executor"),
# Spack classes that are private and we don't want to expose # Spack classes that are private and we don't want to expose
("py:class", "spack.provider_index._IndexBase"), ("py:class", "spack.provider_index._IndexBase"),
("py:class", "spack.repo._PrependFileLoader"), ("py:class", "spack.repo._PrependFileLoader"),
@@ -218,8 +217,6 @@ def setup(sphinx):
("py:class", "spack.spec.SpecfileReaderBase"), ("py:class", "spack.spec.SpecfileReaderBase"),
("py:class", "spack.install_test.Pb"), ("py:class", "spack.install_test.Pb"),
("py:class", "spack.filesystem_view.SimpleFilesystemView"), ("py:class", "spack.filesystem_view.SimpleFilesystemView"),
("py:class", "spack.traverse.EdgeAndDepth"),
("py:class", "archspec.cpu.microarchitecture.Microarchitecture"),
] ]
# The reST default role (used for this markup: `text`) to use for all documents. # The reST default role (used for this markup: `text`) to use for all documents.


@@ -203,9 +203,12 @@ The OS that are currently supported are summarized in the table below:
* - Ubuntu 24.04 * - Ubuntu 24.04
- ``ubuntu:24.04`` - ``ubuntu:24.04``
- ``spack/ubuntu-noble`` - ``spack/ubuntu-noble``
* - CentOS Stream9 * - CentOS 7
- ``quay.io/centos/centos:stream9`` - ``centos:7``
- ``spack/centos-stream9`` - ``spack/centos7``
* - CentOS Stream
- ``quay.io/centos/centos:stream``
- ``spack/centos-stream``
* - openSUSE Leap * - openSUSE Leap
- ``opensuse/leap`` - ``opensuse/leap``
- ``spack/leap15`` - ``spack/leap15``


@@ -181,6 +181,10 @@ Spec-related modules
:mod:`spack.parser` :mod:`spack.parser`
Contains :class:`~spack.parser.SpecParser` and functions related to parsing specs. Contains :class:`~spack.parser.SpecParser` and functions related to parsing specs.
:mod:`spack.concretize`
Contains :class:`~spack.concretize.Concretizer` implementation,
which allows site administrators to change Spack's :ref:`concretization-policies`.
:mod:`spack.version` :mod:`spack.version`
Implements a simple :class:`~spack.version.Version` class with simple Implements a simple :class:`~spack.version.Version` class with simple
comparison semantics. Also implements :class:`~spack.version.VersionRange` comparison semantics. Also implements :class:`~spack.version.VersionRange`


@@ -414,13 +414,7 @@ default, it will also clone the package to a subdirectory in the
environment. This package will have a special variant ``dev_path`` environment. This package will have a special variant ``dev_path``
set, and Spack will ensure the package and its dependents are rebuilt set, and Spack will ensure the package and its dependents are rebuilt
any time the environment is installed if the package's local source any time the environment is installed if the package's local source
code has been modified. Spack's native check for modifications code has been modified. Spack ensures that all instances of a
is to test whether the source ``mtime`` is newer than the installation.
A custom check can be created by overriding the ``detect_dev_src_change`` method
in your package class. This is particularly useful for projects that use custom Spack repos
to drive development and want to optimize performance.
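A minimal sketch of such an override (the stamp-file logic and all names other than ``detect_dev_src_change`` are purely illustrative assumptions):
.. code-block:: python
   import os
   from spack.package import *
   class MyProject(Package):
       """Developed package with a custom source-change check (sketch)."""
       def detect_dev_src_change(self):
           # Illustrative assumption: rebuild only when a stamp file written
           # by the project's own tooling is absent from the source tree.
           stamp = os.path.join(self.stage.source_path, ".dev_build_stamp")
           return not os.path.exists(stamp)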
Spack ensures that all instances of a
developed package in the environment are concretized to match the developed package in the environment are concretized to match the
version (and other constraints) passed as the spec argument to the version (and other constraints) passed as the spec argument to the
``spack develop`` command. ``spack develop`` command.
@@ -869,7 +863,7 @@ named list ``compilers`` is ``['%gcc', '%clang', '%intel']`` on
spack: spack:
definitions: definitions:
- compilers: ['%gcc', '%clang'] - compilers: ['%gcc', '%clang']
- when: arch.satisfies('target=x86_64:') - when: arch.satisfies('x86_64:')
compilers: ['%intel'] compilers: ['%intel']
.. note:: .. note::
@@ -937,84 +931,32 @@ This allows for a much-needed reduction in redundancy between packages
and constraints. and constraints.
----------------- ----------------
Environment Views Filesystem Views
----------------- ----------------
Spack Environments can have an associated filesystem view, which is a directory Spack Environments can define filesystem views, which provide a direct access point
with a more traditional structure ``<view>/bin``, ``<view>/lib``, ``<view>/include`` for software similar to the directory hierarchy that might exist under ``/usr/local``.
in which all files of the installed packages are linked. Filesystem views are updated every time the environment is written out to the lock
file ``spack.lock``, so the concrete environment and the view are always compatible.
By default a view is created for each environment, thanks to the ``view: true`` The files of the view's installed packages are brought into the view by symbolic or
option in the ``spack.yaml`` manifest file: hard links, referencing the original Spack installation, or by copy.
.. code-block:: yaml
spack:
specs: [perl, python]
view: true
The view is created in a hidden directory ``.spack-env/view`` relative to the environment.
If you've used ``spack env activate``, you may have already interacted with this view. Spack
prepends its ``<view>/bin`` dir to ``PATH`` when the environment is activated, so that
you can directly run executables from all installed packages in the environment.
Views are highly customizable: you can control where they are put, modify their structure,
include and exclude specs, change how files are linked, and you can even generate multiple
views for a single environment.
.. _configuring_environment_views: .. _configuring_environment_views:
^^^^^^^^^^^^^^^^^^^^^^^^^^ ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Minimal view configuration Configuration in ``spack.yaml``
^^^^^^^^^^^^^^^^^^^^^^^^^^ ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
The minimal configuration The Spack Environment manifest file has a top-level keyword
``view``. Each entry under that heading is a **view descriptor**, headed
.. code-block:: yaml by a name. Any number of views may be defined under the ``view`` heading.
The view descriptor contains the root of the view, and
spack: optionally the projections for the view, ``select`` and
# ... ``exclude`` lists for the view and link information via ``link`` and
view: true
lets Spack generate a single view with default settings under the
``.spack-env/view`` directory of the environment.
Another short way to configure a view is to specify just where to put it:
.. code-block:: yaml
spack:
# ...
view: /path/to/view
Views can also be disabled by setting ``view: false``.
^^^^^^^^^^^^^^^^^^^^^^^^^^^
Advanced view configuration
^^^^^^^^^^^^^^^^^^^^^^^^^^^
One or more **view descriptors** can be defined under ``view``, keyed by a name.
The example from the previous section with ``view: /path/to/view`` is equivalent
to defining a view descriptor named ``default`` with a ``root`` attribute:
.. code-block:: yaml
spack:
# ...
view:
default: # name of the view
root: /path/to/view # view descriptor attribute
The ``default`` view descriptor name is special: when you ``spack env activate`` your
environment, this view will be used to update (among other things) your ``PATH``
variable.
View descriptors must contain the root of the view, and optionally projections,
``select`` and ``exclude`` lists and link information via ``link`` and
``link_type``. ``link_type``.
As a more advanced example, in the following manifest For example, in the following manifest
file snippet we define a view named ``mpis``, rooted at file snippet we define a view named ``mpis``, rooted at
``/path/to/view`` in which all projections use the package name, ``/path/to/view`` in which all projections use the package name,
version, and compiler name to determine the path for a given version, and compiler name to determine the path for a given
@@ -1059,10 +1001,59 @@ of ``hardlink`` or ``copy``.
when the environment is not activated, and linked libraries will be located when the environment is not activated, and linked libraries will be located
*outside* of the view thanks to rpaths. *outside* of the view thanks to rpaths.
There are two shorthands for environments with a single view. If the
environment at ``/path/to/env`` has a single view, with a root at
``/path/to/env/.spack-env/view``, with default selection and exclusion
and the default projection, we can put ``view: True`` in the
environment manifest. Similarly, if the environment has a view with a
different root, but default selection, exclusion, and projections, the
manifest can say ``view: /path/to/view``. These views are
automatically named ``default``, so that
.. code-block:: yaml
spack:
# ...
view: True
is equivalent to
.. code-block:: yaml
spack:
# ...
view:
default:
root: .spack-env/view
and
.. code-block:: yaml
spack:
# ...
view: /path/to/view
is equivalent to
.. code-block:: yaml
spack:
# ...
view:
default:
root: /path/to/view
By default, Spack environments are configured with ``view: True`` in
the manifest. Environments can be configured without views using
``view: False``. For backwards compatibility reasons, environments
with no ``view`` key are treated the same as ``view: True``.
From the command line, the ``spack env create`` command takes an From the command line, the ``spack env create`` command takes an
argument ``--with-view [PATH]`` that sets the path for a single, default argument ``--with-view [PATH]`` that sets the path for a single, default
view. If no path is specified, the default path is used (``view: view. If no path is specified, the default path is used (``view:
true``). The argument ``--without-view`` can be used to create an True``). The argument ``--without-view`` can be used to create an
environment without any view configured. environment without any view configured.
The ``spack env view`` command can be used to manage the views The ``spack env view`` command can be used to manage the views
@@ -1128,18 +1119,11 @@ the projection under ``all`` before reaching those entries.
Activating environment views Activating environment views
^^^^^^^^^^^^^^^^^^^^^^^^^^^^ ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
The ``spack env activate <env>`` has two effects: The ``spack env activate`` command will put the default view for the
environment into the user's path, in addition to activating the
1. It activates the environment so that further Spack commands such environment for Spack commands. The arguments ``-v,--with-view`` and
as ``spack install`` will run in the context of the environment. ``-V,--without-view`` can be used to tune this behavior. The default
2. It activates the view so that environment variables such as behavior is to activate with the environment view if there is one.
``PATH`` are updated to include the view.
Without further arguments, the ``default`` view of the environment is
activated. If a view with a different name has to be activated,
``spack env activate --with-view <name> <env>`` can be
used instead. You can also activate the environment without modifying
further environment variables using ``--without-view``.
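For example (the environment name ``myenv`` and view name ``mpis`` are placeholders):
.. code-block:: console
   $ spack env activate --with-view mpis myenv
   $ spack env activate --without-view myenv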
The environment variables affected by the ``spack env activate`` The environment variables affected by the ``spack env activate``
command and the paths that are used to update them are determined by command and the paths that are used to update them are determined by
@@ -1162,8 +1146,8 @@ relevant variable if the path exists. For this reason, it is not
recommended to use non-default projections with the default view of an recommended to use non-default projections with the default view of an
environment. environment.
The ``spack env deactivate`` command will remove the active view of The ``spack env deactivate`` command will remove the default view of
the Spack environment from the user's environment variables. the environment from the user's path.
.. _env-generate-depfile: .. _env-generate-depfile:
@@ -1322,7 +1306,7 @@ index once every package is pushed. Note how this target uses the generated
example/push/%: example/install/% example/push/%: example/install/%
@mkdir -p $(dir $@) @mkdir -p $(dir $@)
$(info About to push $(SPEC) to a buildcache) $(info About to push $(SPEC) to a buildcache)
$(SPACK) -e . buildcache push --only=package $(BUILDCACHE_DIR) /$(HASH) $(SPACK) -e . buildcache push --allow-root --only=package $(BUILDCACHE_DIR) /$(HASH)
@touch $@ @touch $@
push: $(addprefix example/push/,$(example/SPACK_PACKAGE_IDS)) push: $(addprefix example/push/,$(example/SPACK_PACKAGE_IDS))


@@ -61,15 +61,10 @@ Getting Spack is easy. You can clone it from the `github repository
.. code-block:: console .. code-block:: console
$ git clone -c feature.manyFiles=true --depth=2 https://github.com/spack/spack.git $ git clone -c feature.manyFiles=true https://github.com/spack/spack.git
This will create a directory called ``spack``. This will create a directory called ``spack``.
.. note::
``-c feature.manyFiles=true`` improves git's performance on repositories with 1,000+ files.
``--depth=2`` prunes the git history to reduce the size of the Spack installation.
.. _shell-support: .. _shell-support:
^^^^^^^^^^^^^ ^^^^^^^^^^^^^
@@ -1369,6 +1364,187 @@ This will write the private key to the file `dinosaur.priv`.
or for help on an issue or the Spack slack. or for help on an issue or the Spack slack.
.. _cray-support:
-------------
Spack on Cray
-------------
Spack differs slightly when used on a Cray system. The architecture spec
can differentiate between the front-end and back-end processor and operating system.
For example, on Edison at NERSC, the back-end target processor
is "Ivy Bridge", so you can specify to use the back-end this way:
.. code-block:: console
$ spack install zlib target=ivybridge
You can also use the operating system to build against the back-end:
.. code-block:: console
$ spack install zlib os=CNL10
Notice that the name includes both the operating system name and the major
version number concatenated together.
Alternatively, if you want to build something for the front-end,
you can specify the front-end target processor. The processor for a login node
on Edison is "Sandy bridge" so we specify on the command line like so:
.. code-block:: console
$ spack install zlib target=sandybridge
And the front-end operating system is:
.. code-block:: console
$ spack install zlib os=SuSE11
^^^^^^^^^^^^^^^^^^^^^^^
Cray compiler detection
^^^^^^^^^^^^^^^^^^^^^^^
Spack can detect compilers using two methods. For the front-end, we treat
everything the same. The difference lies in back-end compiler detection.
Back-end compiler detection is made via the Tcl module avail command.
Once it detects the compiler, it writes the appropriate PrgEnv and compiler
module name to compilers.yaml and sets the paths to each compiler with Cray's
compiler wrapper names (i.e. cc, CC, ftn). During build time, Spack will load
the correct PrgEnv and compiler module and will call the appropriate wrapper.
The compilers.yaml config file will also differ. There is a
modules section that is filled with the compiler's Programming Environment
and module name. On other systems, this field is empty []:
.. code-block:: yaml
- compiler:
modules:
- PrgEnv-intel
- intel/15.0.109
As mentioned earlier, the compiler paths will look different on a Cray system.
Since most compilers are invoked using cc, CC and ftn, the paths for each
compiler are replaced with their respective Cray compiler wrapper names:
.. code-block:: yaml
paths:
cc: cc
cxx: CC
f77: ftn
fc: ftn
These are used in place of an explicit path to the compiler executable, which allows Spack
to call the Cray compiler wrappers during build time.
For more on compiler configuration, check out :ref:`compiler-config`.
Spack sets the default Cray link type to dynamic, to better match
other platforms. Individual packages can enable static linking (which is the
default outside of Spack on Cray systems) using the ``-static`` flag.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Setting defaults and using Cray modules
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
If you want to use default compilers for each PrgEnv and also be able
to load cray external modules, you will need to set up a ``packages.yaml``.
Here's an example of an external configuration for cray modules:
.. code-block:: yaml
packages:
mpich:
externals:
- spec: "mpich@7.3.1%gcc@5.2.0 arch=cray_xc-haswell-CNL10"
modules:
- cray-mpich
- spec: "mpich@7.3.1%intel@16.0.0.109 arch=cray_xc-haswell-CNL10"
modules:
- cray-mpich
all:
providers:
mpi: [mpich]
This tells Spack to load the cray-mpich module into the environment for any
package that depends on mpi. You can then use whatever
environment variables, libraries, etc., are brought into the environment
via module load.
.. note::
For Cray-provided packages, it is best to use ``modules:`` instead of ``prefix:``
in ``packages.yaml``, because the Cray Programming Environment heavily relies on
modules (e.g., loading the ``cray-mpich`` module adds MPI libraries to the
compiler wrapper link line).
You can set the default compiler that Spack can use for each compiler type.
If you want to use the Cray defaults, then set them under ``all:`` in packages.yaml.
In the compiler field, set the compiler specs in your order of preference.
Whenever you build with that compiler type, Spack will concretize to that version.
Here is an example of a full packages.yaml used at NERSC:
.. code-block:: yaml
packages:
mpich:
externals:
- spec: "mpich@7.3.1%gcc@5.2.0 arch=cray_xc-CNL10-ivybridge"
modules:
- cray-mpich
- spec: "mpich@7.3.1%intel@16.0.0.109 arch=cray_xc-SuSE11-ivybridge"
modules:
- cray-mpich
buildable: False
netcdf:
externals:
- spec: "netcdf@4.3.3.1%gcc@5.2.0 arch=cray_xc-CNL10-ivybridge"
modules:
- cray-netcdf
- spec: "netcdf@4.3.3.1%intel@16.0.0.109 arch=cray_xc-CNL10-ivybridge"
modules:
- cray-netcdf
buildable: False
hdf5:
externals:
- spec: "hdf5@1.8.14%gcc@5.2.0 arch=cray_xc-CNL10-ivybridge"
modules:
- cray-hdf5
- spec: "hdf5@1.8.14%intel@16.0.0.109 arch=cray_xc-CNL10-ivybridge"
modules:
- cray-hdf5
buildable: False
all:
compiler: [gcc@5.2.0, intel@16.0.0.109]
providers:
mpi: [mpich]
Here we tell Spack to use gcc version 5.2.0 whenever we build with gcc, and
intel version 16.0.0.109 whenever we build with intel compilers. We add a spec
for each compiler type for each of the Cray modules. This ensures that for each
compiler on our system we can use that external module.
For more on external packages check out the section :ref:`sec-external-packages`.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Using Linux containers on Cray machines
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Spack uses environment variables particular to the Cray programming
environment to determine which systems are Cray platforms. These
environment variables may be propagated into containers that are not
using the Cray programming environment.
To ensure that Spack does not autodetect the Cray programming
environment, unset the environment variable ``MODULEPATH``. This
will cause Spack to treat a linux container on a Cray system as a base
linux distro.
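Concretely, one might run something like the following before using Spack inside the container (a minimal sketch; ``zlib`` is just an example spec):
.. code-block:: console
   $ unset MODULEPATH
   $ spack spec zlib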
.. _windows_support: .. _windows_support:
---------------- ----------------
@@ -1480,14 +1656,16 @@ in a Windows CMD prompt.
Step 3: Run and configure Spack Step 3: Run and configure Spack
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
On Windows, Spack supports both primary native shells, Powershell and the traditional command prompt. To use Spack, run ``bin\spack_cmd.bat`` (you may need to Run as Administrator) from the top-level spack
To use Spack, pick your favorite shell, and run ``bin\spack_cmd.bat`` or ``share/spack/setup-env.ps1`` directory. This will provide a Windows command prompt with an environment properly set up with Spack
(you may need to Run as Administrator) from the top-level spack and its prerequisites. If you receive a warning message that Python is not in your ``PATH``
directory. This will provide a Spack enabled shell. If you receive a warning message that Python is not in your ``PATH``
(which may happen if you installed Python from the website and not the Windows Store) add the location (which may happen if you installed Python from the website and not the Windows Store) add the location
of the Python executable to your ``PATH`` now. You can permanently add Python to your ``PATH`` variable of the Python executable to your ``PATH`` now. You can permanently add Python to your ``PATH`` variable
by using the ``Edit the system environment variables`` utility in Windows Control Panel. by using the ``Edit the system environment variables`` utility in Windows Control Panel.
.. note::
Alternatively, Powershell can be used in place of CMD
To configure Spack, first run the following command inside the Spack console: To configure Spack, first run the following command inside the Spack console:
.. code-block:: console .. code-block:: console
@@ -1552,7 +1730,7 @@ and not tabs, so ensure that this is the case when editing one directly.
.. note:: Cygwin .. note:: Cygwin
The use of Cygwin is not officially supported by Spack and is not tested. The use of Cygwin is not officially supported by Spack and is not tested.
However, Spack will not prevent this, so if you choose to use Spack However, Spack will not throw an error, so if you choose to use Spack
with Cygwin, know that no functionality is guaranteed. with Cygwin, know that no functionality is guaranteed.
^^^^^^^^^^^^^^^^^ ^^^^^^^^^^^^^^^^^
@@ -1566,12 +1744,21 @@ Spack console via:
spack install cpuinfo spack install cpuinfo
If in the previous step, you did not have CMake or Ninja installed, running the command above should install both packages If in the previous step, you did not have CMake or Ninja installed, running the command above should bootstrap both packages
.. note:: Spec Syntax Caveats """""""""""""""""""""""""""
Windows has a few idiosyncrasies when it comes to the Spack spec syntax and the use of certain shells Windows Compatible Packages
See the Spack spec syntax doc for more information """""""""""""""""""""""""""
Not all Spack packages currently have Windows support. Some are inherently incompatible with the
platform, and others simply have yet to be ported. To view the current set of packages with Windows
support, use ``spack list -t windows``. If there is a package you would like
to install on Windows that is not in that list, feel free to reach out to request the port or contribute
the port yourself.
.. note::
This is by no means a comprehensive list, some packages may have ports that were not tagged
while others may just work out of the box on Windows and have not been tagged as such.
^^^^^^^^^^^^^^ ^^^^^^^^^^^^^^
For developers For developers
@@ -1581,3 +1768,6 @@ The intent is to provide a Windows installer that will automatically set up
Python, Git, and Spack, instead of requiring the user to do so manually. Python, Git, and Spack, instead of requiring the user to do so manually.
Instructions for creating the installer are at Instructions for creating the installer are at
https://github.com/spack/spack/blob/develop/lib/spack/spack/cmd/installer/README.md https://github.com/spack/spack/blob/develop/lib/spack/spack/cmd/installer/README.md
Alternatively, a pre-built copy of the Windows installer is available as an artifact of Spack's Windows CI,
produced by each run of the CI on develop or any PR.


@@ -39,15 +39,10 @@ package:
.. code-block:: console .. code-block:: console
$ git clone -c feature.manyFiles=true --depth=2 https://github.com/spack/spack.git $ git clone -c feature.manyFiles=true https://github.com/spack/spack.git
$ cd spack/bin $ cd spack/bin
$ ./spack install libelf $ ./spack install libelf
.. note::
``-c feature.manyFiles=true`` improves git's performance on repositories with 1,000+ files.
``--depth=2`` prunes the git history to reduce the size of the Spack installation.
If you're new to spack and want to start using it, see :doc:`getting_started`, If you're new to spack and want to start using it, see :doc:`getting_started`,
or refer to the full manual below. or refer to the full manual below.


@@ -1263,11 +1263,6 @@ Git fetching supports the following parameters to ``version``:
option ``--depth 1`` will be used if the version of git and the specified option ``--depth 1`` will be used if the version of git and the specified
transport protocol support it, and ``--single-branch`` will be used if the transport protocol support it, and ``--single-branch`` will be used if the
version of git supports it. version of git supports it.
* ``git_sparse_paths``: Use ``sparse-checkout`` to only clone these relative paths.
This feature requires ``git`` to be version ``2.25.0`` or later but is useful for
large repositories that have separate portions that can be built independently.
If the paths provided are directories, then all of their subdirectories and associated files
will also be cloned.
Only one of ``tag``, ``branch``, or ``commit`` can be used at a time. Only one of ``tag``, ``branch``, or ``commit`` can be used at a time.
@@ -1366,41 +1361,6 @@ Submodules
For more information about git submodules see the manpage of git: ``man For more information about git submodules see the manpage of git: ``man
git-submodule``. git-submodule``.
Sparse-Checkout
You can supply ``git_sparse_paths`` at the package or version level to utilize git's
sparse-checkout feature. This will only clone the paths that are specified in the
``git_sparse_paths`` attribute for the package along with the files in the top level directory.
This feature allows you to only clone what you need from a large repository.
Note that this is a newer feature in git and requires git ``2.25.0`` or greater.
If ``git_sparse_paths`` is supplied and the git version is too old,
then a warning will be issued and that package will use the standard cloning operations instead.
``git_sparse_paths`` should be supplied as a list of paths, a callable function for versions,
or a more complex package attribute using the ``@property`` decorator. The return value should be
a list for a callable implementation of ``git_sparse_paths``.
.. code-block:: python

    def sparse_path_function(package):
        """a callable function that can be used inside a version"""
        # paths can be directories or file paths; for directories, all
        # subdirectories and associated files are included
        paths = ["doe", "rae", "me/file.cpp"]
        if package.spec.version > Version("1.2.0"):
            paths.extend(["fae"])
        return paths

    class MyPackage(Package):
        # can also be a package attribute that will be used if not specified in versions
        git_sparse_paths = ["doe", "rae"]

        # use the package attribute
        version("1.0.0")
        version("1.1.0")
        # use the function
        version("1.1.5", git_sparse_paths=sparse_path_function)
        version("1.2.0", git_sparse_paths=sparse_path_function)
        version("1.2.5", git_sparse_paths=sparse_path_function)
.. _github-fetch: .. _github-fetch:
^^^^^^ ^^^^^^
@@ -2384,27 +2344,6 @@ you set ``parallel`` to ``False`` at the package level, then each call
to ``make()`` will be sequential by default, but packagers can call to ``make()`` will be sequential by default, but packagers can call
``make(parallel=True)`` to override it. ``make(parallel=True)`` to override it.
Note that the ``--jobs`` option works out of the box for all standard
build systems. If you are using a non-standard build system instead, you
can use the variable ``make_jobs`` to extract the number of jobs specified
by the ``--jobs`` option:
.. code-block:: python
:emphasize-lines: 7, 11
:linenos:
class Xios(Package):
    ...
    def install(self, spec, prefix):
        ...
        options = [
            ...
            '--jobs', str(make_jobs),
        ]
        ...
        make_xios = Executable("./make_xios")
        make_xios(*options)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Install-level build parallelism Install-level build parallelism
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -5234,6 +5173,12 @@ installed executable. The check is implemented as follows:
reframe = Executable(self.prefix.bin.reframe) reframe = Executable(self.prefix.bin.reframe)
reframe("-l") reframe("-l")
.. warning::
The API for adding tests is not yet considered stable and may change
in future releases.
"""""""""""""""""""""""""""""""" """"""""""""""""""""""""""""""""
Checking build-time test results Checking build-time test results
"""""""""""""""""""""""""""""""" """"""""""""""""""""""""""""""""
@@ -5271,42 +5216,38 @@ be left in the build stage directory as illustrated below:
Stand-alone tests Stand-alone tests
^^^^^^^^^^^^^^^^^ ^^^^^^^^^^^^^^^^^
While build-time tests are integrated with the installation process, stand-alone While build-time tests are integrated with the build process, stand-alone
tests are expected to run days, weeks, even months after the software is tests are expected to run days, weeks, even months after the software is
installed. The goal is to provide a mechanism for gaining confidence that installed. The goal is to provide a mechanism for gaining confidence that
packages work as installed **and** *continue* to work as the underlying packages work as installed **and** *continue* to work as the underlying
software evolves. Packages can add and inherit stand-alone tests. The software evolves. Packages can add and inherit stand-alone tests. The
``spack test`` command is used for stand-alone testing. ``spack test`` command is used to manage stand-alone testing.
.. admonition:: Stand-alone test methods should complete within a few minutes. .. note::
Execution speed is important since these tests are intended to quickly Execution speed is important since these tests are intended to quickly
assess whether installed specs work on the system. Spack cannot spare assess whether installed specs work on the system. Consequently, they
resources for more extensive testing of packages included in CI stacks. should run relatively quickly -- as in on the order of at most a few
minutes -- while ideally executing all, or at least key aspects of the
installed software.
Consequently, stand-alone tests should run relatively quickly -- as in .. note::
on the order of at most a few minutes -- while testing at least key aspects
of the installed software. Save more extensive testing for other tools. Failing stand-alone tests indicate problems with the installation and,
therefore, there is no reason to proceed with more resource-intensive
tests until those have been investigated.
Passing stand-alone tests indicate that more thorough testing, such
as running extensive unit or regression tests, or tests that run at
scale can proceed without wasting resources on a problematic installation.
Tests are defined in the package using methods with names beginning ``test_``. Tests are defined in the package using methods with names beginning ``test_``.
This allows Spack to support multiple independent checks, or parts. Files This allows Spack to support multiple independent checks, or parts. Files
needed for testing, such as source, data, and expected outputs, may be saved needed for testing, such as source, data, and expected outputs, may be saved
from the build and/or stored with the package in the repository. Regardless from the build and/or stored with the package in the repository. Regardless
of origin, these files are automatically copied to the spec's test stage of origin, these files are automatically copied to the spec's test stage
directory prior to execution of the test method(s). Spack also provides helper directory prior to execution of the test method(s). Spack also provides some
functions to facilitate common processing. helper functions to facilitate processing.
.. tip::
**The status of stand-alone tests can be used to guide follow-up testing efforts.**
Passing stand-alone tests justify performing more thorough testing, such
as running extensive unit or regression tests or tests that run at scale,
when available. These tests are outside of the scope of Spack packaging.
Failing stand-alone tests indicate problems with the installation and,
therefore, no reason to proceed with more resource-intensive tests until
the failures have been investigated.
.. _configure-test-stage: .. _configure-test-stage:
@@ -5314,26 +5255,30 @@ functions to facilitate common processing.
Configuring the test stage directory Configuring the test stage directory
"""""""""""""""""""""""""""""""""""" """"""""""""""""""""""""""""""""""""
Stand-alone tests utilize a test stage directory to build, run, and track Stand-alone tests utilize a test stage directory for building, running,
tests in the same way Spack uses a build stage directory to install software. and tracking results in the same way Spack uses a build stage directory.
The default test stage root directory, ``$HOME/.spack/test``, is defined in The default test stage root directory, ``~/.spack/test``, is defined in
:ref:`config.yaml <config-yaml>`. This location is customizable by adding or :ref:`etc/spack/defaults/config.yaml <config-yaml>`. This location is
changing the ``test_stage`` path such that: customizable by adding or changing the ``test_stage`` path in the high-level
``config`` of the appropriate ``config.yaml`` file such that:
.. code-block:: yaml .. code-block:: yaml
config: config:
test_stage: /path/to/test/stage test_stage: /path/to/test/stage
Packages can use the ``self.test_suite.stage`` property to access the path. Packages can use the ``self.test_suite.stage`` property to access this setting.
Other package properties that provide access to spec-specific subdirectories
and files are described in :ref:`accessing staged files <accessing-files>`.
.. admonition:: Each spec being tested has its own test stage directory. .. note::
The ``config:test_stage`` option is the path to the root of a The test stage path is the root directory for the **entire suite**.
**test suite**'s stage directories. In other words, it is the root directory for **all specs** being
tested by the ``spack test run`` command. Each spec gets its own
stage subdirectory. Use ``self.test_suite.test_dir_for_spec(self.spec)``
to access the spec-specific test stage directory.
Other package properties that provide paths to spec-specific subdirectories
and files are described in :ref:`accessing-files`.
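As a hedged sketch of how those two properties relate (the method body and its assertion are illustrative assumptions, not Spack API guarantees):
.. code-block:: python
   class MyPackage(Package):
       ...
       def test_stage_layout(self):
           """confirm this spec's test dir lives under the suite root (sketch)"""
           # Root of the whole test suite's stage:
           suite_root = self.test_suite.stage
           # This spec's own stage subdirectory within that root:
           spec_dir = self.test_suite.test_dir_for_spec(self.spec)
           assert str(spec_dir).startswith(str(suite_root))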
.. _adding-standalone-tests: .. _adding-standalone-tests:
@@ -5346,144 +5291,61 @@ Test recipes are defined in the package using methods with names beginning
Each method has access to the information Spack tracks on the package, such Each method has access to the information Spack tracks on the package, such
as options, compilers, and dependencies, supporting the customization of tests as options, compilers, and dependencies, supporting the customization of tests
to the build. Standard python ``assert`` statements and other error reporting to the build. Standard python ``assert`` statements and other error reporting
mechanisms can be used. These exceptions are automatically caught and reported mechanisms are available. Such exceptions are automatically caught and reported
as test failures. as test failures.
Each test method is an *implicit test part* named by the method. Its purpose Each test method is an implicit test part named by the method and whose
is the method's docstring. Providing a meaningful purpose for the test gives purpose is the method's docstring. Providing a purpose gives context for
context that can aid debugging. Spack outputs both the name and purpose at the aiding debugging. A test method may contain embedded test parts. Spack
start of test execution so it's also important that the docstring/purpose be outputs the test name and purpose prior to running each test method and
brief. any embedded test parts. For example, ``MyPackage`` below provides two basic
examples of installation tests: ``test_always_fails`` and ``test_example``.
.. tip:: As the name indicates, the first always fails. The second simply runs the
installed example.
We recommend naming test methods so it is clear *what* is being tested.
For example, if a test method is building and/or running an executable
called ``example``, then call the method ``test_example``. This, together
with a similarly meaningful test purpose, will aid test comprehension,
debugging, and maintainability.
Stand-alone tests run in an environment that provides access to information
on the installed software, such as build options, dependencies, and compilers.
Build options and dependencies are accessed using the same spec checks used
by build recipes. Examples of checking :ref:`variant settings <variants>` and
:ref:`spec constraints <testing-specs>` can be found at the provided links.
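
For instance, the following sketch checks a variant and accesses a dependency
the same way a build recipe would (the ``+mpi`` variant, ``mpi`` dependency,
and ``example`` executable are assumptions for illustration):

.. code-block:: python

   class MyPackage(Package):
       ...

       def test_mpi_example(self):
           """run example under mpirun when built with +mpi"""
           if "+mpi" not in self.spec:
               raise SkipTest("Test requires the +mpi variant")

           # Dependencies are reachable through the spec, as in build recipes.
           mpirun = which(self.spec["mpi"].prefix.bin.mpirun)
           mpirun("-np", "1", self.prefix.bin.example)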
.. admonition:: Spack automatically sets up the test stage directory and environment.

   Spack automatically creates the test stage directory and copies
   relevant files *prior to* running tests. It can also ensure build
   dependencies are available **if** necessary.

   The path to the test stage is configurable (see :ref:`configure-test-stage`).

   Files that Spack knows to copy are those saved from the build (see
   :ref:`cache_extra_test_sources`) and those added to the package repository
   (see :ref:`cache_custom_files`).

   Spack will use the value of the ``test_requires_compiler`` property to
   determine whether it needs to also set up build dependencies (see
   :ref:`test-build-tests`).

The ``MyPackage`` package below provides two basic test examples:
``test_example`` and ``test_example2``. The first runs the installed
``example`` and ensures its output contains an expected string. The second
runs ``example2`` without checking its output, so it is only concerned with
confirming that the executable runs successfully. If the installed spec is not
expected to have ``example2``, then the check at the top of the method will
raise a special ``SkipTest`` exception, which is captured to facilitate
reporting skipped test parts to tools like CDash.
.. code-block:: python

   class MyPackage(Package):
       ...

       def test_example(self):
           """ensure installed example works"""
           expected = "Done."
           example = which(self.prefix.bin.example)

           # Capture stdout and stderr from running the Executable
           # and check that the expected output was produced.
           out = example(output=str.split, error=str.split)
           assert expected in out, f"Expected '{expected}' in the output"

       def test_example2(self):
           """run installed example2"""
           if self.spec.satisfies("@:1.0"):
               # Raise SkipTest to ensure flagging the test as skipped for
               # test reporting purposes.
               raise SkipTest("Test is only available for v1.1 on")

           example2 = which(self.prefix.bin.example2)
           example2()
Output showing the identification of each test part after running the tests
is illustrated below.

.. code-block:: console

   $ spack test run --alias mypackage mypackage@2.0
   ==> Spack test mypackage
   ...
   $ spack test results -l mypackage
   ==> Results for test suite 'mypackage':
   ...
   ==> [2024-03-10-16:03:56.625439] test: test_example: ensure installed example works
   ...
   PASSED: MyPackage::test_example
   ==> [2024-03-10-16:03:56.625439] test: test_example2: run installed example2
   ...
   PASSED: MyPackage::test_example2
.. admonition:: Do NOT implement tests that must run in the installation prefix.

   Use of the package spec's installation prefix for building and running
   tests is **strongly discouraged**. Doing so causes permission errors for
   shared spack instances *and* facilities that install the software in
   read-only file systems or directories.

   Instead, start these test methods by explicitly copying the needed files
   from the installation prefix to the test stage directory. Note the test
   stage directory is the current directory when the test is executed with
   the ``spack test run`` command.

.. admonition:: Test methods for library packages should build test executables.

   Stand-alone tests for library packages *should* build test executables
   that utilize the *installed* library. Doing so ensures the tests follow
   a build process similar to the one users of the library would follow.

   For more information on how to do this, see :ref:`test-build-tests`.

.. tip::

   If you want to see more examples from packages with stand-alone tests, run
   ``spack pkg grep "def\stest" | sed "s/\/package.py.*//g" | sort -u``
   from the command line to get a list of the packages.
.. _adding-standalone-test-parts:

"""""""""""""""""""""""""""""
Adding stand-alone test parts
"""""""""""""""""""""""""""""

Sometimes dependencies between steps of a test lend themselves to being
broken into parts. Tracking the pass/fail status of each part may aid
debugging. Spack provides a ``test_part`` context manager for use within
test methods.

Each test part is independently run, tracked, and reported. Test parts are
executed in the order they appear. If one fails, subsequent test parts are
still performed, even if they would also fail. This allows tools like CDash
to track and report the status of test parts across runs. The pass/fail status
of the enclosing test is derived from the statuses of the embedded test parts.

.. admonition:: Test method and test part names **must** be unique.

   Test results reporting requires that test methods and embedded test parts
   within a package have unique names.
The signature for ``test_part`` is:

@@ -5505,68 +5367,40 @@ where each argument has the following meaning:
* ``work_dir`` is the path to the directory in which the test will run.
  The default of ``None``, or ``"."``, corresponds to the spec's test
  stage (i.e., ``self.test_suite.test_dir_for_spec(self.spec)``).
.. admonition:: Start test part names with the name of the enclosing test.

   We **highly recommend** starting the names of test parts with the name
   of the enclosing test. Doing so helps with the comprehension, readability
   and debugging of test results.

Suppose ``MyPackage`` installs multiple executables that need to run in a
specific order since the outputs from one are inputs of others. Further suppose
we want to add an integration test that runs the executables in order. We can
accomplish this goal by implementing a stand-alone test method consisting of
test parts for each executable as follows:
.. code-block:: python

   class MyPackage(Package):
       ...

       def test_series(self):
           """run setup, perform, and report"""

           with test_part(self, "test_series_setup", purpose="setup operation"):
               exe = which(self.prefix.bin.setup)
               exe()

           with test_part(self, "test_series_run", purpose="perform operation"):
               exe = which(self.prefix.bin.run)
               exe()

           with test_part(self, "test_series_report", purpose="generate report"):
               exe = which(self.prefix.bin.report)
               exe()
The result is that ``test_series`` runs the following executables in order:
``setup``, ``run``, and ``report``. In this case no options are passed to any
of the executables and no outputs from running them are checked. Consequently,
the implementation could be simplified with a for-loop as follows:

.. code-block:: python

   class MyPackage(Package):
       ...

       def test_series(self):
           """execute series setup, run, and report"""
           for exe, reason in [
               ("setup", "setup operation"),
               ("run", "perform operation"),
               ("report", "generate report")
           ]:
               with test_part(self, f"test_series_{exe}", purpose=reason):
                   exe = which(self.prefix.bin.join(exe))
                   exe()

In both cases, since we're using a context manager, each test part in
``test_series`` will execute regardless of the status of the other test
parts.
Now let's look at the output from running the stand-alone tests where
the second test part, ``test_series_run``, fails.

.. code-block:: console

@@ -5576,68 +5410,50 @@ the second test part, ``test_series_run``, fails.

   $ spack test results -l mypackage
   ==> Results for test suite 'mypackage':
   ...
   ==> [2024-03-10-16:03:56.625204] test: test_series: execute series setup, run, and report
   ==> [2024-03-10-16:03:56.625439] test: test_series_setup: setup operation
   ...
   PASSED: MyPackage::test_series_setup
   ==> [2024-03-10-16:03:56.625555] test: test_series_run: perform operation
   ...
   FAILED: MyPackage::test_series_run
   ==> [2024-03-10-16:03:57.003456] test: test_series_report: generate report
   ...
   FAILED: MyPackage::test_series_report
   FAILED: MyPackage::test_series
   ...
Since test parts depended on the success of previous parts, we see that the
failure of one results in the failure of subsequent checks, and the overall
result of the test method, ``test_series``, is failure.

.. tip::

   If you want to see more examples from packages using ``test_part``, run
   ``spack pkg grep "test_part(" | sed "s/\/package.py.*//g" | sort -u``
   from the command line to get a list of the packages.
.. _test-build-tests:
""""""""""""""""""""""""""""""""""""" .. _test-compilation:
Building and running test executables
"""""""""""""""""""""""""""""""""""""
.. admonition:: Re-use build-time sources and (small) input data sets when possible. """""""""""""""""""""""""
Enabling test compilation
"""""""""""""""""""""""""
We **highly recommend** re-using build-time test sources and pared down If you want to build and run binaries in tests, then you'll need to tell
input files for testing installed software. These files are easier Spack to load the package's compiler configuration. This is accomplished
to keep synchronized with software capabilities when they reside by setting the package's ``test_requires_compiler`` property to ``True``.
within the software's repository. More information on saving files from
the installation process can be found at :ref:`cache_extra_test_sources`.
If that is not possible, you can add test-related files to the package Setting the property to ``True`` ensures access to the compiler through
repository (see :ref:`cache_custom_files`). It will be important to canonical environment variables (e.g., ``CC``, ``CXX``, ``FC``, ``F77``).
remember to maintain them so they work across listed or supported versions It also gives access to build dependencies like ``cmake`` through their
of the package. ``spec objects`` (e.g., ``self.spec["cmake"].prefix.bin.cmake``).
Packages that build libraries are good examples of cases where you'll want .. note::
to build test executables from the installed software before running them.
Doing so requires you to let Spack know it needs to load the package's
compiler configuration. This is accomplished by setting the package's
``test_requires_compiler`` property to ``True``.
.. admonition:: ``test_requires_compiler = True`` is required to build test executables. The ``test_requires_compiler`` property should be added at the top of
the package near other attributes, such as the ``homepage`` and ``url``.
Setting the property to ``True`` ensures access to the compiler through Below illustrates using this feature to compile an example.
canonical environment variables (e.g., ``CC``, ``CXX``, ``FC``, ``F77``).
It also gives access to build dependencies like ``cmake`` through their
``spec objects`` (e.g., ``self.spec["cmake"].prefix.bin.cmake`` for the
path or ``self.spec["cmake"].command`` for the ``Executable`` instance).
Be sure to add the property at the top of the package class under other
properties like the ``homepage``.
The example below, which ignores how ``cxx-example.cpp`` is acquired,
illustrates the basic process of compiling a test executable using the
installed library before running it.
.. code-block:: python

@@ -5661,22 +5477,28 @@ installed library before running it.

           cxx_example = which(exe)
           cxx_example()
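
A minimal sketch of the complete pattern, assuming the staged source is
``cxx-example.cpp`` and the installed library is called ``mylibrary``
(``CXX`` is available because ``test_requires_compiler`` is ``True``):

.. code-block:: python

   class MyLibrary(Package):
       ...

       test_requires_compiler = True

       def test_cxx_example(self):
           """build and run cxx-example"""
           exe = "cxx-example"

           # Compile against the installed headers and library; the
           # -lmylibrary link flag is an assumption for illustration.
           cxx = which(os.environ["CXX"])
           cxx(
               f"-L{self.prefix.lib}",
               f"-I{self.prefix.include}",
               f"{exe}.cpp",
               "-o", exe,
               "-lmylibrary",
           )

           cxx_example = which(exe)
           cxx_example()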
Typically the files used to build and/or run test executables are either
cached from the installation (see :ref:`cache_extra_test_sources`) or added
to the package repository (see :ref:`cache_custom_files`). There is nothing
preventing the use of both.
.. _cache_extra_test_sources:

""""""""""""""""""""""""""""""""""""
Saving build- and install-time files
""""""""""""""""""""""""""""""""""""

You can use the ``cache_extra_test_sources`` helper routine to copy
directories and/or files from the source build stage directory to the
package's installation directory. Spack will automatically copy these
files for you when it sets up the test stage directory and before it
begins running the tests.
The signature for ``cache_extra_test_sources`` is:

@@ -5691,69 +5513,46 @@ where each argument has the following meaning:

* ``srcs`` is a string *or* a list of strings corresponding to the
  paths of subdirectories and/or files needed for stand-alone testing.
.. warning::

   Paths provided in the ``srcs`` argument **must be relative** to the
   staged source directory. They will be copied to the equivalent relative
   location under the test stage directory prior to test execution.

Contents of subdirectories and files are copied to a special test cache
subdirectory of the installation prefix. They are automatically copied to
the appropriate relative paths under the test stage directory prior to
executing stand-alone tests.

.. tip::

   *Perform test-related conversions once when copying files.*

   If one or more of the copied files needs to be modified to reference
   the installed software, it is recommended that those changes be made
   to the cached files **once** in the post-``install`` copy method
   **after** the call to ``cache_extra_test_sources``. This will reduce
   the amount of unnecessary work in the test method **and** avoid problems
   running stand-alone tests in shared instances and facility deployments.

   The ``filter_file`` function can be quite useful for such changes
   (see :ref:`file-filtering`).
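
For instance, a sketch of performing such a conversion once at install time,
assuming the cached ``examples/Makefile`` defines a ``PREFIX`` variable and
that the cache location is given by the ``install_test_root`` helper:

.. code-block:: python

   @run_after("install")
   def copy_test_files(self):
       cache_extra_test_sources(self, "examples")

       # Perform the conversion once, on the cached copy, so every
       # subsequent `spack test run` can use the file as-is.
       makefile = join_path(install_test_root(self), "examples", "Makefile")
       filter_file(r"^PREFIX\s*=.*", f"PREFIX = {self.prefix}", makefile)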
Below is a basic example of a test that relies on files from the installation.
This package method re-uses the contents of the ``examples`` subdirectory,
which is assumed to have all of the files implemented to allow ``make`` to
compile and link ``foo.c`` and ``bar.c`` against the package's installed
library.
.. code-block:: python

   class MyLibPackage(MakefilePackage):
       ...

       @run_after("install")
       def copy_test_files(self):
           cache_extra_test_sources(self, "examples")

       def test_example(self):
           """build and run the examples"""
           examples_dir = self.test_suite.current_test_cache_dir.examples
           with working_dir(examples_dir):
               make = which("make")
               make()

               for program in ["foo", "bar"]:
                   with test_part(
                       self,
                       f"test_example_{program}",
                       purpose=f"ensure {program} runs"
                   ):
                       exe = Executable(program)
                       exe()

In this case, ``copy_test_files`` copies the associated files from the
build stage to the package's test cache directory under the installation
prefix. Running ``spack test run`` for the package results in Spack copying
the directory and its contents to the test stage directory. The
``working_dir`` context manager ensures the commands within it are executed
from the ``examples_dir``. The test builds the software using ``make`` before
running each executable, ``foo`` and ``bar``, as independent test parts.
.. note::
@@ -5762,18 +5561,43 @@ running each executable, ``foo`` and ``bar``, as independent test parts.
   The key to copying files for stand-alone testing at build time is use
   of the ``run_after`` directive, which ensures the associated files are
   copied **after** the provided build stage (``install``) when the installation
   prefix **and** files are available.

The test method uses the path contained in the package's
``self.test_suite.current_test_cache_dir`` property for the root directory
of the copied files. In this case, that's the ``examples`` subdirectory.

.. note::

   While source and input files are generally recommended, binaries
   **may** also be cached by the build process. Only you, as the package
   writer or maintainer, know whether these files would be appropriate
   for testing the installed software weeks to months later.

.. tip::

   If you want to see more examples from packages that cache build files, run
   ``spack pkg grep cache_extra_test_sources | sed "s/\/package.py.*//g" | sort -u``
   from the command line to get a list of the packages.
.. _cache_custom_files: .. _cache_custom_files:
@@ -5781,9 +5605,8 @@ running each executable, ``foo`` and ``bar``, as independent test parts.
Adding custom files
"""""""""""""""""""
Sometimes it is helpful or necessary to include custom files for building
and/or checking the results of tests as part of the package. Examples of the
types of files that might be useful are:

- test source files
- test input files
@@ -5791,15 +5614,17 @@ of files that might be useful are:
- expected test outputs
While obtaining such files from the software repository is preferred (see
:ref:`cache_extra_test_sources`), there are circumstances where doing so is not
feasible, such as when the software is not being actively maintained. When test
files cannot be obtained from the repository, or there is a need to supplement
files that can, Spack supports the inclusion of additional files under the
``test`` subdirectory of the package in the Spack repository.

The following example assumes a ``custom-example.cpp`` is saved in the
``MyLibrary`` package's ``test`` subdirectory. It also assumes the program
simply needs to be compiled and linked against the installed ``MyLibrary``
software.
.. code-block:: python

@@ -5809,29 +5634,17 @@ be compiled and linked against the installed ``MyLibrary`` software.

       test_requires_compiler = True
       ...

       def test_custom_example(self):
           """build and run custom-example"""
           src_dir = self.test_suite.current_test_data_dir
           exe = "custom-example"

           with working_dir(src_dir):
               cc = which(os.environ["CC"])
               cc(
                   f"-L{self.prefix.lib}",
                   f"-I{self.prefix.include}",
                   f"{exe}.cpp",
                   "-o", exe
               )

               custom_example = Executable(exe)
               custom_example()
In this case, ``spack test run`` for the package results in Spack copying
the contents of the ``test`` subdirectory to the test stage directory path
in ``self.test_suite.current_test_data_dir`` before calling
``test_custom_example``. Use of the ``working_dir`` context manager
ensures the commands to build and run the program are performed from
within the appropriate subdirectory of the test stage.
.. _expected_test_output_from_file: .. _expected_test_output_from_file:
@@ -5840,8 +5653,9 @@ Reading expected output from a file
""""""""""""""""""""""""""""""""""" """""""""""""""""""""""""""""""""""
The helper function ``get_escaped_text_output`` is available for packages The helper function ``get_escaped_text_output`` is available for packages
to retrieve properly formatted text from a file potentially containing to retrieve and properly format the text from a file that contains the
special characters. expected output from running an executable that may contain special
characters.
The signature for ``get_escaped_text_output`` is:

@@ -5851,13 +5665,10 @@ The signature for ``get_escaped_text_output`` is:

where ``filename`` is the path to the file containing the expected output.
The path provided to ``filename`` for one of the copied custom files
(:ref:`custom file <cache_custom_files>`) is rooted at
``self.test_suite.current_test_data_dir``.

The example below shows how to reference both the custom database
(``packages.db``) and expected output (``dump.out``) files Spack copies
to the test stage:
.. code-block:: python

@@ -5879,9 +5690,8 @@ to the test stage:

       for exp in expected:
           assert re.search(exp, out), f"Expected '{exp}' in output"
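
A fuller sketch of such a test method follows, where the installed
``dump-db`` executable is an assumption for illustration (``re`` is imported
at the top of ``package.py``):

.. code-block:: python

   def test_example(self):
       """check example table dump"""
       test_data_dir = self.test_suite.current_test_data_dir
       db_filename = test_data_dir.join("packages.db")

       # Run the hypothetical installed ``dump-db`` program against the
       # custom database copied to the test stage.
       dump_db = which(self.prefix.bin.join("dump-db"))
       out = dump_db(db_filename, output=str.split, error=str.split)

       # Each expected line is escaped for use as a regular expression.
       expected = get_escaped_text_output(test_data_dir.join("dump.out"))
       for exp in expected:
           assert re.search(exp, out), f"Expected '{exp}' in output"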
If the files were instead cached from installing the software, the paths to the
two files would be found under the ``self.test_suite.current_test_cache_dir``
directory as shown below:
.. code-block:: python

@@ -5889,24 +5699,17 @@ directory as shown below:

       """check example table dump"""
       test_cache_dir = self.test_suite.current_test_cache_dir
       db_filename = test_cache_dir.join("packages.db")
       ...
       expected = get_escaped_text_output(test_cache_dir.join("dump.out"))
       ...
Alternatively, if both files had been installed by the software into the
``share/tests`` subdirectory of the installation prefix, the paths to the
two files would be referenced as follows:
.. code-block:: python

   def test_example(self):
       """check example table dump"""
       db_filename = self.prefix.share.tests.join("packages.db")
       ...
       expected = get_escaped_text_output(
           self.prefix.share.tests.join("dump.out")
       )
       ...
.. _check_outputs: .. _check_outputs:
@@ -5914,9 +5717,9 @@ two files would be referenced as follows:
Comparing expected to actual outputs
""""""""""""""""""""""""""""""""""""
The ``check_outputs`` helper routine is available for packages to ensure
multiple expected outputs from running an executable are contained within
the actual outputs.
The signature for ``check_outputs`` is:

@@ -5942,17 +5745,11 @@ Invoking the method is the equivalent of:

   if errors:
       raise RuntimeError("\n ".join(errors))
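
For example, a minimal sketch of its use, where the ``example`` executable,
its ``--version`` option, and the expected patterns are assumptions:

.. code-block:: python

   def test_version(self):
       """ensure version reporting works"""
       example = which(self.prefix.bin.example)
       out = example("--version", output=str.split, error=str.split)

       # Each entry is treated as a regular expression that must match
       # somewhere in the captured output.
       check_outputs([r"example \d+\.\d+", "Copyright"], out)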
.. tip::

   If you want to see more examples from packages that use this helper, run
   ``spack pkg grep check_outputs | sed "s/\/package.py.*//g" | sort -u``
   from the command line to get a list of the packages.
.. _accessing-files: .. _accessing-files:
""""""""""""""""""""""""""""""""""""""""" """""""""""""""""""""""""""""""""""""""""
Finding package- and test-related files Accessing package- and test-related files
""""""""""""""""""""""""""""""""""""""""" """""""""""""""""""""""""""""""""""""""""
You may need to access files from one or more locations when writing You may need to access files from one or more locations when writing
@@ -5961,7 +5758,8 @@ include test source files or includes them but has no way to build the
executables using the installed headers and libraries. In these cases
you may need to reference the files relative to one or more root directories.
The table below lists relevant path properties and provides additional
examples of their use. See :ref:`expected_test_output_from_file` for
examples of accessing files saved from the software repository, package
repository, and installation.
@@ -5990,6 +5788,7 @@ repository, and installation.
     - ``self.test_suite.current_test_data_dir``
     - ``join_path(self.test_suite.current_test_data_dir, "hello.f90")``
.. _inheriting-tests: .. _inheriting-tests:
"""""""""""""""""""""""""""" """"""""""""""""""""""""""""
@@ -6032,7 +5831,7 @@ maintainers provide additional stand-alone tests customized to the package.
.. warning::

   Any package that implements a test method with the same name as an
   inherited method will override the inherited method. If that is not the
   goal, and you are not explicitly calling and adding functionality to
   the inherited method for the test, then make sure that all test methods
   and embedded test parts have unique test names.
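
For example, a package extending ``PythonPackage`` retains the inherited
``test_imports`` method by giving its own test a distinct name (the module
invocation below is hypothetical):

.. code-block:: python

   class PyMyPackage(PythonPackage):
       ...

       # Uniquely named, so the inherited test_imports method still runs.
       def test_cli_help(self):
           """ensure the command line entry point runs"""
           python = self.spec["python"].command
           python("-m", "my_package", "--help")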
@@ -6197,8 +5996,6 @@ running:
This is already part of the boilerplate for packages created with
``spack create``.
.. _file-filtering:
^^^^^^^^^^^^^^^^^^^
Filtering functions
^^^^^^^^^^^^^^^^^^^
@@ -253,6 +253,17 @@ can easily happen if it is not updated frequently, this behavior ensures that
spack has a way to know for certain about the status of any concrete spec on
the remote mirror, but can slow down pipeline generation significantly.
The ``--optimize`` argument is experimental and runs the generated pipeline
document through a series of optimization passes designed to reduce the size
of the generated file.
The ``--dependencies`` option is also experimental and disables what in Gitlab is
referred to as DAG scheduling, internally using the ``dependencies`` keyword
rather than ``needs`` to list dependency jobs. The drawback of using this option
is that before any job can begin, all jobs in previous stages must first
complete. The benefit is that Gitlab allows more dependencies to be listed
when using ``dependencies`` instead of ``needs``.
The optional ``--output-file`` argument should be an absolute path (including
file name) to the generated pipeline, and if not given, the default is
``./.gitlab-ci.yml``.
@@ -663,7 +674,11 @@ build the package.
When including a bootstrapping phase as in the example above, the result is that
the bootstrapped compiler packages will be pushed to the binary mirror (and the
local artifacts mirror) before the actual release specs are built. In this case,
the jobs corresponding to subsequent release specs are configured to
``install_missing_compilers``, so that if spack is asked to install a package
with a compiler it doesn't know about, it can be quickly installed from the
binary mirror first.
Since bootstrapping compilers is optional, those items can be left out of the
environment/stack file, and in that case no bootstrapping will be done (only the
@@ -476,3 +476,9 @@ implemented using Python's built-in `sys.path
:py:mod:`spack.repo` module implements a custom `Python importer
<https://docs.python.org/2/library/imp.html>`_.
.. warning::

   The mechanism for extending packages is not yet extensively tested,
   and extending packages across repositories imposes inter-repo
   dependencies, which may be hard to manage. Use this feature at your
   own risk, but let us know if you have a use case for it.
@@ -1,13 +1,13 @@
sphinx==7.4.7
sphinxcontrib-programoutput==0.17
sphinx_design==0.6.1
sphinx-rtd-theme==2.0.0
python-levenshtein==0.25.1
docutils==0.20.1
pygments==2.18.0
urllib3==2.2.3
pytest==8.3.3
isort==5.13.2
black==24.8.0
flake8==7.1.1
mypy==1.11.1
lib/spack/env/cc
@@ -101,9 +101,10 @@ setsep() {
    esac
}
# prepend LISTNAME ELEMENT
#
# Prepend ELEMENT to the list stored in the variable LISTNAME.
# Handles empty lists and single-element lists.
prepend() {
    varname="$1"

@@ -118,39 +119,18 @@ prepend() {
    fi
}
# contains LISTNAME ELEMENT
#
# Test whether LISTNAME contains ELEMENT.
# Set $? to 1 if LISTNAME does not contain ELEMENT.
# Set $? to 0 if LISTNAME does contain ELEMENT.
contains() {
    varname="$1"
    elt="$2"

    setsep "$varname"

    # the list may: 1) only contain the element, 2) start with the element,
    # 3) contain the element in the middle, or 4) end with the element.
    eval "[ \"\${$varname}\" = \"$elt\" ]" \
        || eval "[ \"\${$varname#${elt}${sep}}\" != \"\${$varname}\" ]" \
        || eval "[ \"\${$varname#*${sep}${elt}${sep}}\" != \"\${$varname}\" ]" \
        || eval "[ \"\${$varname%${sep}${elt}}\" != \"\${$varname}\" ]"
}

# append LISTNAME ELEMENT [unique]
#
# Append ELEMENT to the list stored in the variable LISTNAME.
# Handles empty lists and single-element lists.
#
# If the third argument is provided and if it is the string 'unique',
# this will not append if ELEMENT is already in the list LISTNAME.
append() {
    varname="$1"
    elt="$2"

    if empty "$varname"; then
        eval "$varname=\"\${elt}\""
    elif [ "$3" != "unique" ] || ! contains "$varname" "$elt" ; then
        # Get the appropriate separator for the list we're appending to.
        setsep "$varname"
        eval "$varname=\"\${$varname}${sep}\${elt}\""
@@ -168,21 +148,10 @@ extend() {
    if [ "$sep" != " " ]; then
        IFS="$sep"
    fi

    eval "for elt in \${$2}; do append $1 \"$3\${elt}\" ${_append_args}; done"
    unset IFS
}
# extend_unique LISTNAME1 LISTNAME2 [PREFIX]
#
# Append the elements stored in the variable LISTNAME2 to the list
# stored in LISTNAME1, if they are not already present.
# If PREFIX is provided, prepend it to each element.
extend_unique() {
    _append_args="unique"
    extend "$@"
    unset _append_args
}
# preextend LISTNAME1 LISTNAME2 [PREFIX]
#
# Prepend the elements stored in the list at LISTNAME2

@@ -205,46 +174,6 @@ preextend() {
    unset IFS
}
execute() {
    # dump the full command if the caller supplies SPACK_TEST_COMMAND=dump-args
    if [ -n "${SPACK_TEST_COMMAND=}" ]; then
        case "$SPACK_TEST_COMMAND" in
            dump-args)
                IFS="$lsep"
                for arg in $full_command_list; do
                    echo "$arg"
                done
                unset IFS
                exit
                ;;
            dump-env-*)
                var=${SPACK_TEST_COMMAND#dump-env-}
                eval "printf '%s\n' \"\$0: \$var: \$$var\""
                ;;
            *)
                die "Unknown test command: '$SPACK_TEST_COMMAND'"
                ;;
        esac
    fi

    #
    # Write the input and output commands to debug logs if it's asked for.
    #
    if [ "$SPACK_DEBUG" = TRUE ]; then
        input_log="$SPACK_DEBUG_LOG_DIR/spack-cc-$SPACK_DEBUG_LOG_ID.in.log"
        output_log="$SPACK_DEBUG_LOG_DIR/spack-cc-$SPACK_DEBUG_LOG_ID.out.log"
        echo "[$mode] $command $input_command" >> "$input_log"
        IFS="$lsep"
        echo "[$mode] "$full_command_list >> "$output_log"
        unset IFS
    fi

    # Execute the full command, preserving spaces with IFS set
    # to the alarm bell separator.
    IFS="$lsep"; exec $full_command_list

    exit
}
# Fail with a clear message if the input contains any bell characters.
if eval "[ \"\${*#*${lsep}}\" != \"\$*\" ]"; then
    die "Compiler command line contains our separator ('${lsep}'). Cannot parse."

@@ -269,36 +198,6 @@ esac
}
"
# path_list functions. Path_lists have 3 parts: spack_store_<list>, <list> and system_<list>,
# which are used to prioritize paths when assembling the final command line.

# init_path_lists LISTNAME
# Set <LISTNAME>, spack_store_<LISTNAME>, and system_<LISTNAME> to "".
init_path_lists() {
    eval "spack_store_$1=\"\""
    eval "$1=\"\""
    eval "system_$1=\"\""
}

# assign_path_lists LISTNAME1 LISTNAME2
# Copy contents of LISTNAME2 into LISTNAME1, for each path_list prefix.
assign_path_lists() {
    eval "spack_store_$1=\"\${spack_store_$2}\""
    eval "$1=\"\${$2}\""
    eval "system_$1=\"\${system_$2}\""
}

# append_path_lists LISTNAME ELT
# Append the provided ELT to the appropriate list, based on the result of path_order().
append_path_lists() {
    path_order "$2"
    case $? in
        0) eval "append spack_store_$1 \"\$2\"" ;;
        1) eval "append $1 \"\$2\"" ;;
        2) eval "append system_$1 \"\$2\"" ;;
    esac
}
# Check if optional parameters are defined
# If we aren't asking for debug flags, don't add them
if [ -z "${SPACK_ADD_DEBUG_FLAGS:-}" ]; then
@@ -332,17 +231,12 @@ fi
#   ld    link
#   ccld  compile & link

# Note. SPACK_ALWAYS_XFLAGS are applied for all compiler invocations,
# including version checks (SPACK_XFLAGS variants are not applied
# for version checks).
command="${0##*/}"
comp="CC"
vcheck_flags=""
case "$command" in case "$command" in
cpp) cpp)
mode=cpp mode=cpp
debug_flags="-g" debug_flags="-g"
vcheck_flags="${SPACK_ALWAYS_CPPFLAGS}"
;; ;;
cc|c89|c99|gcc|clang|armclang|icc|icx|pgcc|nvc|xlc|xlc_r|fcc|amdclang|cl.exe|craycc) cc|c89|c99|gcc|clang|armclang|icc|icx|pgcc|nvc|xlc|xlc_r|fcc|amdclang|cl.exe|craycc)
command="$SPACK_CC" command="$SPACK_CC"
@@ -350,7 +244,6 @@ case "$command" in
comp="CC" comp="CC"
lang_flags=C lang_flags=C
debug_flags="-g" debug_flags="-g"
vcheck_flags="${SPACK_ALWAYS_CFLAGS}"
;; ;;
c++|CC|g++|clang++|armclang++|icpc|icpx|pgc++|nvc++|xlc++|xlc++_r|FCC|amdclang++|crayCC) c++|CC|g++|clang++|armclang++|icpc|icpx|pgc++|nvc++|xlc++|xlc++_r|FCC|amdclang++|crayCC)
command="$SPACK_CXX" command="$SPACK_CXX"
@@ -358,7 +251,6 @@ case "$command" in
comp="CXX" comp="CXX"
lang_flags=CXX lang_flags=CXX
debug_flags="-g" debug_flags="-g"
vcheck_flags="${SPACK_ALWAYS_CXXFLAGS}"
;; ;;
ftn|f90|fc|f95|gfortran|flang|armflang|ifort|ifx|pgfortran|nvfortran|xlf90|xlf90_r|nagfor|frt|amdflang|crayftn) ftn|f90|fc|f95|gfortran|flang|armflang|ifort|ifx|pgfortran|nvfortran|xlf90|xlf90_r|nagfor|frt|amdflang|crayftn)
command="$SPACK_FC" command="$SPACK_FC"
@@ -366,7 +258,6 @@ case "$command" in
comp="FC" comp="FC"
lang_flags=F lang_flags=F
debug_flags="-g" debug_flags="-g"
vcheck_flags="${SPACK_ALWAYS_FFLAGS}"
;; ;;
f77|xlf|xlf_r|pgf77) f77|xlf|xlf_r|pgf77)
command="$SPACK_F77" command="$SPACK_F77"
@@ -374,7 +265,6 @@ case "$command" in
comp="F77" comp="F77"
lang_flags=F lang_flags=F
debug_flags="-g" debug_flags="-g"
vcheck_flags="${SPACK_ALWAYS_FFLAGS}"
;; ;;
ld|ld.gold|ld.lld) ld|ld.gold|ld.lld)
mode=ld mode=ld
@@ -475,11 +365,27 @@ unset IFS
export PATH="$new_dirs" export PATH="$new_dirs"
if [ "$mode" = vcheck ]; then if [ "$mode" = vcheck ]; then
full_command_list="$command" exec "${command}" "$@"
args="$@" fi
extend full_command_list vcheck_flags
extend full_command_list args # Darwin's linker has a -r argument that merges object files together.
execute # It doesn't work with -rpath.
# This variable controls whether they are added.
add_rpaths=true
if [ "$mode" = ld ] || [ "$mode" = ccld ]; then
if [ "${SPACK_SHORT_SPEC#*darwin}" != "${SPACK_SHORT_SPEC}" ]; then
for arg in "$@"; do
if [ "$arg" = "-r" ]; then
if [ "$mode" = ld ] || [ "$mode" = ccld ]; then
add_rpaths=false
break
fi
elif [ "$arg" = "-Wl,-r" ] && [ "$mode" = ccld ]; then
add_rpaths=false
break
fi
done
fi
fi fi
# Save original command for debug logging
@@ -511,7 +417,12 @@ input_command="$*"
parse_Wl() {
    while [ $# -ne 0 ]; do
        if [ "$wl_expect_rpath" = yes ]; then
            append_path_lists return_rpath_dirs_list "$1"
            wl_expect_rpath=no
        else
            case "$1" in

@@ -520,14 +431,24 @@ parse_Wl() {
                if [ -z "$arg" ]; then
                    shift; continue
                fi
                append_path_lists return_rpath_dirs_list "$arg"
                ;;
            --rpath=*)
                arg="${1#--rpath=}"
                if [ -z "$arg" ]; then
                    shift; continue
                fi
                append_path_lists return_rpath_dirs_list "$arg"
                ;;
            -rpath|--rpath)
                wl_expect_rpath=yes
                ;;

@@ -535,7 +456,8 @@ parse_Wl() {
            "$dtags_to_strip")
                ;;
            -Wl)
                # Nested -Wl,-Wl means we're in NAG compiler territory. We don't support it.
                return 1
                ;;
            *)
@@ -554,10 +476,21 @@ categorize_arguments() {
return_other_args_list="" return_other_args_list=""
return_isystem_was_used="" return_isystem_was_used=""
init_path_lists return_isystem_include_dirs_list return_isystem_spack_store_include_dirs_list=""
init_path_lists return_include_dirs_list return_isystem_system_include_dirs_list=""
init_path_lists return_lib_dirs_list return_isystem_include_dirs_list=""
init_path_lists return_rpath_dirs_list
return_spack_store_include_dirs_list=""
return_system_include_dirs_list=""
return_include_dirs_list=""
return_spack_store_lib_dirs_list=""
return_system_lib_dirs_list=""
return_lib_dirs_list=""
return_spack_store_rpath_dirs_list=""
return_system_rpath_dirs_list=""
return_rpath_dirs_list=""
# Global state for keeping track of -Wl,-rpath -Wl,/path # Global state for keeping track of -Wl,-rpath -Wl,/path
wl_expect_rpath=no wl_expect_rpath=no
@@ -623,17 +556,32 @@ categorize_arguments() {
arg="${1#-isystem}" arg="${1#-isystem}"
return_isystem_was_used=true return_isystem_was_used=true
if [ -z "$arg" ]; then shift; arg="$1"; fi if [ -z "$arg" ]; then shift; arg="$1"; fi
append_path_lists return_isystem_include_dirs_list "$arg" path_order "$arg"
case $? in
0) append return_isystem_spack_store_include_dirs_list "$arg" ;;
1) append return_isystem_include_dirs_list "$arg" ;;
2) append return_isystem_system_include_dirs_list "$arg" ;;
esac
;; ;;
-I*) -I*)
arg="${1#-I}" arg="${1#-I}"
if [ -z "$arg" ]; then shift; arg="$1"; fi if [ -z "$arg" ]; then shift; arg="$1"; fi
append_path_lists return_include_dirs_list "$arg" path_order "$arg"
case $? in
0) append return_spack_store_include_dirs_list "$arg" ;;
1) append return_include_dirs_list "$arg" ;;
2) append return_system_include_dirs_list "$arg" ;;
esac
;; ;;
-L*) -L*)
arg="${1#-L}" arg="${1#-L}"
if [ -z "$arg" ]; then shift; arg="$1"; fi if [ -z "$arg" ]; then shift; arg="$1"; fi
append_path_lists return_lib_dirs_list "$arg" path_order "$arg"
case $? in
0) append return_spack_store_lib_dirs_list "$arg" ;;
1) append return_lib_dirs_list "$arg" ;;
2) append return_system_lib_dirs_list "$arg" ;;
esac
;; ;;
-l*) -l*)
# -loopopt=0 is generated erroneously in autoconf <= 2.69, # -loopopt=0 is generated erroneously in autoconf <= 2.69,
@@ -666,17 +614,32 @@ categorize_arguments() {
                break
            elif [ "$xlinker_expect_rpath" = yes ]; then
                # Register the path of -Xlinker -rpath <other args> -Xlinker <path>
                append_path_lists return_rpath_dirs_list "$1"
                xlinker_expect_rpath=no
            else
                case "$1" in
                -rpath=*)
                    arg="${1#-rpath=}"
                    append_path_lists return_rpath_dirs_list "$arg"
                    ;;
                --rpath=*)
                    arg="${1#--rpath=}"
                    append_path_lists return_rpath_dirs_list "$arg"
                    ;;
                -rpath|--rpath)
                    xlinker_expect_rpath=yes
@@ -693,32 +656,7 @@ categorize_arguments() {
"$dtags_to_strip") "$dtags_to_strip")
;; ;;
*) *)
# if mode is not ld, we can just add to other args append return_other_args_list "$1"
if [ "$mode" != "ld" ]; then
append return_other_args_list "$1"
shift
continue
fi
# if we're in linker mode, we need to parse raw RPATH args
case "$1" in
-rpath=*)
arg="${1#-rpath=}"
append_path_lists return_rpath_dirs_list "$arg"
;;
--rpath=*)
arg="${1#--rpath=}"
append_path_lists return_rpath_dirs_list "$arg"
;;
-rpath|--rpath)
shift
[ $# -eq 0 ] && break # ignore -rpath without value
append_path_lists return_rpath_dirs_list "$1"
;;
*)
append return_other_args_list "$1"
;;
esac
;; ;;
esac esac
shift shift
@@ -740,10 +678,21 @@ categorize_arguments() {
categorize_arguments "$@" categorize_arguments "$@"
assign_path_lists isystem_include_dirs_list return_isystem_include_dirs_list spack_store_include_dirs_list="$return_spack_store_include_dirs_list"
assign_path_lists include_dirs_list return_include_dirs_list system_include_dirs_list="$return_system_include_dirs_list"
assign_path_lists lib_dirs_list return_lib_dirs_list include_dirs_list="$return_include_dirs_list"
assign_path_lists rpath_dirs_list return_rpath_dirs_list
spack_store_lib_dirs_list="$return_spack_store_lib_dirs_list"
system_lib_dirs_list="$return_system_lib_dirs_list"
lib_dirs_list="$return_lib_dirs_list"
spack_store_rpath_dirs_list="$return_spack_store_rpath_dirs_list"
system_rpath_dirs_list="$return_system_rpath_dirs_list"
rpath_dirs_list="$return_rpath_dirs_list"
isystem_spack_store_include_dirs_list="$return_isystem_spack_store_include_dirs_list"
isystem_system_include_dirs_list="$return_isystem_system_include_dirs_list"
isystem_include_dirs_list="$return_isystem_include_dirs_list"
isystem_was_used="$return_isystem_was_used" isystem_was_used="$return_isystem_was_used"
other_args_list="$return_other_args_list" other_args_list="$return_other_args_list"
@@ -773,7 +722,6 @@ case "$mode" in
    cc|ccld)
        case $lang_flags in
            F)
                extend spack_flags_list SPACK_ALWAYS_FFLAGS
                extend spack_flags_list SPACK_FFLAGS
                ;;
        esac
@@ -783,7 +731,6 @@ esac
# C preprocessor flags come before any C/CXX flags
case "$mode" in
    cpp|as|cc|ccld)
        extend spack_flags_list SPACK_ALWAYS_CPPFLAGS
        extend spack_flags_list SPACK_CPPFLAGS
        ;;
esac
@@ -794,11 +741,9 @@ case "$mode" in
    cc|ccld)
        case $lang_flags in
            C)
                extend spack_flags_list SPACK_ALWAYS_CFLAGS
                extend spack_flags_list SPACK_CFLAGS
                ;;
            CXX)
                extend spack_flags_list SPACK_ALWAYS_CXXFLAGS
                extend spack_flags_list SPACK_CXXFLAGS
                ;;
        esac
@@ -819,10 +764,21 @@ IFS="$lsep"
categorize_arguments $spack_flags_list
unset IFS

assign_path_lists spack_flags_isystem_include_dirs_list return_isystem_include_dirs_list
assign_path_lists spack_flags_include_dirs_list return_include_dirs_list
assign_path_lists spack_flags_lib_dirs_list return_lib_dirs_list
assign_path_lists spack_flags_rpath_dirs_list return_rpath_dirs_list

spack_flags_isystem_was_used="$return_isystem_was_used"
spack_flags_other_args_list="$return_other_args_list"
@@ -841,11 +797,13 @@ if [ "$mode" = ld ] || [ "$mode" = ccld ]; then
fi

if [ "$mode" = ccld ] || [ "$mode" = ld ]; then
    # Append RPATH directories. Note that in the case of the
    # top-level package these directories may not exist yet. For dependencies
    # it is assumed that paths have already been confirmed.
    extend spack_store_rpath_dirs_list SPACK_STORE_RPATH_DIRS
    extend rpath_dirs_list SPACK_RPATH_DIRS
fi

if [ "$mode" = ccld ] || [ "$mode" = ld ]; then
@@ -860,10 +818,14 @@ case "$mode" in
    ld|ccld)
        # Set extra RPATHs
        extend lib_dirs_list SPACK_COMPILER_EXTRA_RPATHS
        extend rpath_dirs_list SPACK_COMPILER_EXTRA_RPATHS

        # Set implicit RPATHs
        extend rpath_dirs_list SPACK_COMPILER_IMPLICIT_RPATHS

        # Add SPACK_LDLIBS to args
        for lib in $SPACK_LDLIBS; do
@@ -875,7 +837,7 @@ esac
case "$mode" in case "$mode" in
cpp|cc|as|ccld) cpp|cc|as|ccld)
if [ "$spack_flags_isystem_was_used" = "true" ] || [ "$isystem_was_used" = "true" ]; then if [ "$spack_flags_isystem_was_used" = "true" ] || [ "$isystem_was_used" = "true" ]; then
extend spack_store_isystem_include_dirs_list SPACK_STORE_INCLUDE_DIRS extend isystem_spack_store_include_dirs_list SPACK_STORE_INCLUDE_DIRS
extend isystem_include_dirs_list SPACK_INCLUDE_DIRS extend isystem_include_dirs_list SPACK_INCLUDE_DIRS
else else
extend spack_store_include_dirs_list SPACK_STORE_INCLUDE_DIRS extend spack_store_include_dirs_list SPACK_STORE_INCLUDE_DIRS
@@ -891,76 +853,64 @@ args_list="$flags_list"
# Include search paths partitioned by (in store, non-system, system)
# NOTE: adding ${lsep} to the prefix here turns every added element into two
extend args_list spack_store_spack_flags_include_dirs_list -I
extend args_list spack_store_include_dirs_list -I
extend args_list spack_flags_include_dirs_list -I
extend args_list include_dirs_list -I

extend args_list spack_store_spack_flags_isystem_include_dirs_list "-isystem${lsep}"
extend args_list spack_store_isystem_include_dirs_list "-isystem${lsep}"
extend args_list spack_flags_isystem_include_dirs_list "-isystem${lsep}"
extend args_list isystem_include_dirs_list "-isystem${lsep}"

extend args_list system_spack_flags_include_dirs_list -I
extend args_list system_include_dirs_list -I

extend args_list system_spack_flags_isystem_include_dirs_list "-isystem${lsep}"
extend args_list system_isystem_include_dirs_list "-isystem${lsep}"

# Library search paths partitioned by (in store, non-system, system)
extend args_list spack_store_spack_flags_lib_dirs_list "-L"
extend args_list spack_store_lib_dirs_list "-L"
extend args_list spack_flags_lib_dirs_list "-L"
extend args_list lib_dirs_list "-L"
extend args_list system_spack_flags_lib_dirs_list "-L"
extend args_list system_lib_dirs_list "-L"
# RPATH arguments # RPATHs arguments
rpath_prefix=""
case "$mode" in case "$mode" in
ccld) ccld)
if [ -n "$dtags_to_add" ] ; then if [ -n "$dtags_to_add" ] ; then
append args_list "$linker_arg$dtags_to_add" append args_list "$linker_arg$dtags_to_add"
fi fi
rpath_prefix="$rpath" extend args_list spack_flags_spack_store_rpath_dirs_list "$rpath"
extend args_list spack_store_rpath_dirs_list "$rpath"
extend args_list spack_flags_rpath_dirs_list "$rpath"
extend args_list rpath_dirs_list "$rpath"
extend args_list spack_flags_system_rpath_dirs_list "$rpath"
extend args_list system_rpath_dirs_list "$rpath"
;; ;;
ld) ld)
if [ -n "$dtags_to_add" ] ; then if [ -n "$dtags_to_add" ] ; then
append args_list "$dtags_to_add" append args_list "$dtags_to_add"
fi fi
rpath_prefix="-rpath${lsep}" extend args_list spack_flags_spack_store_rpath_dirs_list "-rpath${lsep}"
extend args_list spack_store_rpath_dirs_list "-rpath${lsep}"
extend args_list spack_flags_rpath_dirs_list "-rpath${lsep}"
extend args_list rpath_dirs_list "-rpath${lsep}"
extend args_list spack_flags_system_rpath_dirs_list "-rpath${lsep}"
extend args_list system_rpath_dirs_list "-rpath${lsep}"
;; ;;
esac esac
# Darwin's linker has a -r argument that merges object files together.
# It doesn't work with -rpath. add_rpaths controls whether RPATHs are added.
add_rpaths=true
if [ "$mode" = ld ] || [ "$mode" = ccld ]; then
if [ "${SPACK_SHORT_SPEC#*darwin}" != "${SPACK_SHORT_SPEC}" ]; then
args="$@"
if contains args "-r" || contains args "-Wl,-r"; then
add_rpaths=false
fi
fi
fi
# if mode is ccld or ld, extend RPATH lists, adding the prefix determined above
if [ "$add_rpaths" == "true" ] && [ -n "$rpath_prefix" ]; then
extend_unique args_list spack_store_spack_flags_rpath_dirs_list "$rpath_prefix"
extend_unique args_list spack_store_rpath_dirs_list "$rpath_prefix"
extend_unique args_list spack_flags_rpath_dirs_list "$rpath_prefix"
extend_unique args_list rpath_dirs_list "$rpath_prefix"
extend_unique args_list system_spack_flags_rpath_dirs_list "$rpath_prefix"
extend_unique args_list system_rpath_dirs_list "$rpath_prefix"
fi
# Other arguments from the input command # Other arguments from the input command
extend args_list other_args_list extend args_list other_args_list
extend args_list spack_flags_other_args_list extend args_list spack_flags_other_args_list
@@ -983,4 +933,39 @@ if [ -n "$SPACK_CCACHE_BINARY" ]; then
esac
fi

-execute
+# dump the full command if the caller supplies SPACK_TEST_COMMAND=dump-args
+if [ -n "${SPACK_TEST_COMMAND=}" ]; then
+    case "$SPACK_TEST_COMMAND" in
+        dump-args)
+            IFS="$lsep"
+            for arg in $full_command_list; do
+                echo "$arg"
+            done
+            unset IFS
+            exit
+            ;;
+        dump-env-*)
+            var=${SPACK_TEST_COMMAND#dump-env-}
+            eval "printf '%s\n' \"\$0: \$var: \$$var\""
+            ;;
+        *)
+            die "Unknown test command: '$SPACK_TEST_COMMAND'"
+            ;;
+    esac
+fi
+
+#
+# Write the input and output commands to debug logs if it's asked for.
+#
+if [ "$SPACK_DEBUG" = TRUE ]; then
+    input_log="$SPACK_DEBUG_LOG_DIR/spack-cc-$SPACK_DEBUG_LOG_ID.in.log"
+    output_log="$SPACK_DEBUG_LOG_DIR/spack-cc-$SPACK_DEBUG_LOG_ID.out.log"
+    echo "[$mode] $command $input_command" >> "$input_log"
+    IFS="$lsep"
+    echo "[$mode] "$full_command_list >> "$output_log"
+    unset IFS
+fi
+
+# Execute the full command, preserving spaces with IFS set
+# to the alarm bell separator.
+IFS="$lsep"; exec $full_command_list


@@ -18,7 +18,7 @@
* Homepage: https://pypi.python.org/pypi/archspec
* Usage: Labeling, comparison and detection of microarchitectures
-* Version: 0.2.5-dev (commit bceb39528ac49dd0c876b2e9bf3e7482e9c2be4a)
+* Version: 0.2.5-dev (commit cbb1fd5eb397a70d466e5160b393b87b0dbcc78f)

astunparse
----------------


@@ -1265,29 +1265,27 @@ def _distro_release_info(self) -> Dict[str, str]:
match = _DISTRO_RELEASE_BASENAME_PATTERN.match(basename)
else:
    try:
-        with os.scandir(self.etc_dir) as it:
-            etc_files = [
-                p.path for p in it
-                if p.is_file() and p.name not in _DISTRO_RELEASE_IGNORE_BASENAMES
-            ]
+        basenames = [
+            basename
+            for basename in os.listdir(self.etc_dir)
+            if basename not in _DISTRO_RELEASE_IGNORE_BASENAMES
+            and os.path.isfile(os.path.join(self.etc_dir, basename))
+        ]
        # We sort for repeatability in cases where there are multiple
        # distro specific files; e.g. CentOS, Oracle, Enterprise all
        # containing `redhat-release` on top of their own.
-        etc_files.sort()
+        basenames.sort()
    except OSError:
        # This may occur when /etc is not readable but we can't be
        # sure about the *-release files. Check common entries of
        # /etc for information. If they turn out to not be there the
        # error is handled in `_parse_distro_release_file()`.
-        etc_files = [
-            os.path.join(self.etc_dir, basename)
-            for basename in _DISTRO_RELEASE_BASENAMES
-        ]
-    for filepath in etc_files:
-        match = _DISTRO_RELEASE_BASENAME_PATTERN.match(os.path.basename(filepath))
+        basenames = _DISTRO_RELEASE_BASENAMES
+    for basename in basenames:
+        match = _DISTRO_RELEASE_BASENAME_PATTERN.match(basename)
        if match is None:
            continue
+        filepath = os.path.join(self.etc_dir, basename)
        distro_info = self._parse_distro_release_file(filepath)
        # The name is always present if the pattern matches.
        if "name" not in distro_info:


@@ -231,6 +231,96 @@ def is_host_name(instance):
return True

+try:
+    # The built-in `idna` codec only implements RFC 3890, so we go elsewhere.
+    import idna
+except ImportError:
+    pass
+else:
+    @_checks_drafts(draft7="idn-hostname", raises=idna.IDNAError)
+    def is_idn_host_name(instance):
+        if not isinstance(instance, str_types):
+            return True
+        idna.encode(instance)
+        return True
+
+try:
+    import rfc3987
+except ImportError:
+    try:
+        from rfc3986_validator import validate_rfc3986
+    except ImportError:
+        pass
+    else:
+        @_checks_drafts(name="uri")
+        def is_uri(instance):
+            if not isinstance(instance, str_types):
+                return True
+            return validate_rfc3986(instance, rule="URI")
+
+        @_checks_drafts(
+            draft6="uri-reference",
+            draft7="uri-reference",
+            raises=ValueError,
+        )
+        def is_uri_reference(instance):
+            if not isinstance(instance, str_types):
+                return True
+            return validate_rfc3986(instance, rule="URI_reference")
+
+else:
+    @_checks_drafts(draft7="iri", raises=ValueError)
+    def is_iri(instance):
+        if not isinstance(instance, str_types):
+            return True
+        return rfc3987.parse(instance, rule="IRI")
+
+    @_checks_drafts(draft7="iri-reference", raises=ValueError)
+    def is_iri_reference(instance):
+        if not isinstance(instance, str_types):
+            return True
+        return rfc3987.parse(instance, rule="IRI_reference")
+
+    @_checks_drafts(name="uri", raises=ValueError)
+    def is_uri(instance):
+        if not isinstance(instance, str_types):
+            return True
+        return rfc3987.parse(instance, rule="URI")
+
+    @_checks_drafts(
+        draft6="uri-reference",
+        draft7="uri-reference",
+        raises=ValueError,
+    )
+    def is_uri_reference(instance):
+        if not isinstance(instance, str_types):
+            return True
+        return rfc3987.parse(instance, rule="URI_reference")
+
+try:
+    from strict_rfc3339 import validate_rfc3339
+except ImportError:
+    try:
+        from rfc3339_validator import validate_rfc3339
+    except ImportError:
+        validate_rfc3339 = None
+
+if validate_rfc3339:
+    @_checks_drafts(name="date-time")
+    def is_datetime(instance):
+        if not isinstance(instance, str_types):
+            return True
+        return validate_rfc3339(instance)
+
+    @_checks_drafts(draft7="time")
+    def is_time(instance):
+        if not isinstance(instance, str_types):
+            return True
+        return is_datetime("1970-01-01T" + instance)
+
@_checks_drafts(name="regex", raises=re.error)
def is_regex(instance):
    if not isinstance(instance, str_types):
@@ -250,3 +340,86 @@ def is_draft3_time(instance):
    if not isinstance(instance, str_types):
        return True
    return datetime.datetime.strptime(instance, "%H:%M:%S")

+try:
+    import webcolors
+except ImportError:
+    pass
+else:
+    def is_css_color_code(instance):
+        return webcolors.normalize_hex(instance)
+
+    @_checks_drafts(draft3="color", raises=(ValueError, TypeError))
+    def is_css21_color(instance):
+        if (
+            not isinstance(instance, str_types) or
+            instance.lower() in webcolors.css21_names_to_hex
+        ):
+            return True
+        return is_css_color_code(instance)
+
+    def is_css3_color(instance):
+        if instance.lower() in webcolors.css3_names_to_hex:
+            return True
+        return is_css_color_code(instance)
+
+try:
+    import jsonpointer
+except ImportError:
+    pass
+else:
+    @_checks_drafts(
+        draft6="json-pointer",
+        draft7="json-pointer",
+        raises=jsonpointer.JsonPointerException,
+    )
+    def is_json_pointer(instance):
+        if not isinstance(instance, str_types):
+            return True
+        return jsonpointer.JsonPointer(instance)
+
+    # TODO: I don't want to maintain this, so it
+    #       needs to go either into jsonpointer (pending
+    #       https://github.com/stefankoegl/python-json-pointer/issues/34) or
+    #       into a new external library.
+    @_checks_drafts(
+        draft7="relative-json-pointer",
+        raises=jsonpointer.JsonPointerException,
+    )
+    def is_relative_json_pointer(instance):
+        # Definition taken from:
+        # https://tools.ietf.org/html/draft-handrews-relative-json-pointer-01#section-3
+        if not isinstance(instance, str_types):
+            return True
+        non_negative_integer, rest = [], ""
+        for i, character in enumerate(instance):
+            if character.isdigit():
+                non_negative_integer.append(character)
+                continue
+
+            if not non_negative_integer:
+                return False
+
+            rest = instance[i:]
+            break
+        return (rest == "#") or jsonpointer.JsonPointer(rest)
+
+try:
+    import uritemplate.exceptions
+except ImportError:
+    pass
+else:
+    @_checks_drafts(
+        draft6="uri-template",
+        draft7="uri-template",
+        raises=uritemplate.exceptions.InvalidTemplate,
+    )
+    def is_uri_template(
+        instance,
+        template_validator=uritemplate.Validator().force_balanced_braces(),
+    ):
+        template = uritemplate.URITemplate(instance)
+        return template_validator.validate(template)
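All of the blocks restored above follow one pattern: a format checker is registered only when its optional dependency imports, and otherwise that format simply goes unvalidated. A stripped-down sketch of the same pattern (the register decorator below is a stand-in for _checks_drafts, not the real API):

    checkers = {}

    def register(name):
        # Stand-in for jsonschema's _checks_drafts decorator factory.
        def wrap(fn):
            checkers[name] = fn
            return fn
        return wrap

    try:
        import jsonpointer  # optional dependency
    except ImportError:
        pass
    else:
        @register("json-pointer")
        def is_json_pointer(instance):
            if not isinstance(instance, str):
                return True  # non-strings are out of scope for this format
            return jsonpointer.JsonPointer(instance)

    # False when jsonpointer is absent; the format is simply not checked.
    print("json-pointer" in checkers)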


@@ -115,9 +115,6 @@ def __eq__(self, other):
    and self.cpu_part == other.cpu_part
)

-def __hash__(self):
-    return hash(self.name)
-
@coerce_target_names
def __ne__(self, other):
    return not self == other
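A note on the __hash__ line in this hunk: a Python class that defines __eq__ without defining __hash__ gets __hash__ = None (unless a parent class supplies one) and becomes unhashable, so it can no longer live in sets or serve as a dict key. A sketch of why a name-only hash is still legal next to a richer __eq__ (the Target class below is illustrative, not archspec's):

    class Target:
        def __init__(self, name, cpu_part=""):
            self.name, self.cpu_part = name, cpu_part

        def __eq__(self, other):
            return (self.name, self.cpu_part) == (other.name, other.cpu_part)

        # Coarser than __eq__, but valid: equal objects still hash equal.
        def __hash__(self):
            return hash(self.name)

    assert Target("neoverse_v2") in {Target("neoverse_v2")}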


@@ -1,45 +0,0 @@
diff --git a/lib/spack/external/_vendoring/distro/distro.py b/lib/spack/external/_vendoring/distro/distro.py
index 89e1868047..50c3b18d4d 100644
--- a/lib/spack/external/_vendoring/distro/distro.py
+++ b/lib/spack/external/_vendoring/distro/distro.py
@@ -1265,27 +1265,29 @@ def _distro_release_info(self) -> Dict[str, str]:
match = _DISTRO_RELEASE_BASENAME_PATTERN.match(basename)
else:
try:
- basenames = [
- basename
- for basename in os.listdir(self.etc_dir)
- if basename not in _DISTRO_RELEASE_IGNORE_BASENAMES
- and os.path.isfile(os.path.join(self.etc_dir, basename))
- ]
+ with os.scandir(self.etc_dir) as it:
+ etc_files = [
+ p.path for p in it
+ if p.is_file() and p.name not in _DISTRO_RELEASE_IGNORE_BASENAMES
+ ]
# We sort for repeatability in cases where there are multiple
# distro specific files; e.g. CentOS, Oracle, Enterprise all
# containing `redhat-release` on top of their own.
- basenames.sort()
+ etc_files.sort()
except OSError:
# This may occur when /etc is not readable but we can't be
# sure about the *-release files. Check common entries of
# /etc for information. If they turn out to not be there the
# error is handled in `_parse_distro_release_file()`.
- basenames = _DISTRO_RELEASE_BASENAMES
- for basename in basenames:
- match = _DISTRO_RELEASE_BASENAME_PATTERN.match(basename)
+ etc_files = [
+ os.path.join(self.etc_dir, basename)
+ for basename in _DISTRO_RELEASE_BASENAMES
+ ]
+
+ for filepath in etc_files:
+ match = _DISTRO_RELEASE_BASENAME_PATTERN.match(os.path.basename(filepath))
if match is None:
continue
- filepath = os.path.join(self.etc_dir, basename)
distro_info = self._parse_distro_release_file(filepath)
# The name is always present if the pattern matches.
if "name" not in distro_info:


@@ -13,191 +13,3 @@ index 6b630cdfbb..1791fe7fbf 100644
-__version__ = metadata.version("jsonschema")
+
+__version__ = "3.2.0"
diff --git a/lib/spack/external/_vendoring/jsonschema/_format.py b/lib/spack/external/_vendoring/jsonschema/_format.py
index 281a7cfcff..29061e3661 100644
--- a/lib/spack/external/_vendoring/jsonschema/_format.py
+++ b/lib/spack/external/_vendoring/jsonschema/_format.py
@@ -231,96 +231,6 @@ def is_host_name(instance):
return True
-try:
- # The built-in `idna` codec only implements RFC 3890, so we go elsewhere.
- import idna
-except ImportError:
- pass
-else:
- @_checks_drafts(draft7="idn-hostname", raises=idna.IDNAError)
- def is_idn_host_name(instance):
- if not isinstance(instance, str_types):
- return True
- idna.encode(instance)
- return True
-
-
-try:
- import rfc3987
-except ImportError:
- try:
- from rfc3986_validator import validate_rfc3986
- except ImportError:
- pass
- else:
- @_checks_drafts(name="uri")
- def is_uri(instance):
- if not isinstance(instance, str_types):
- return True
- return validate_rfc3986(instance, rule="URI")
-
- @_checks_drafts(
- draft6="uri-reference",
- draft7="uri-reference",
- raises=ValueError,
- )
- def is_uri_reference(instance):
- if not isinstance(instance, str_types):
- return True
- return validate_rfc3986(instance, rule="URI_reference")
-
-else:
- @_checks_drafts(draft7="iri", raises=ValueError)
- def is_iri(instance):
- if not isinstance(instance, str_types):
- return True
- return rfc3987.parse(instance, rule="IRI")
-
- @_checks_drafts(draft7="iri-reference", raises=ValueError)
- def is_iri_reference(instance):
- if not isinstance(instance, str_types):
- return True
- return rfc3987.parse(instance, rule="IRI_reference")
-
- @_checks_drafts(name="uri", raises=ValueError)
- def is_uri(instance):
- if not isinstance(instance, str_types):
- return True
- return rfc3987.parse(instance, rule="URI")
-
- @_checks_drafts(
- draft6="uri-reference",
- draft7="uri-reference",
- raises=ValueError,
- )
- def is_uri_reference(instance):
- if not isinstance(instance, str_types):
- return True
- return rfc3987.parse(instance, rule="URI_reference")
-
-
-try:
- from strict_rfc3339 import validate_rfc3339
-except ImportError:
- try:
- from rfc3339_validator import validate_rfc3339
- except ImportError:
- validate_rfc3339 = None
-
-if validate_rfc3339:
- @_checks_drafts(name="date-time")
- def is_datetime(instance):
- if not isinstance(instance, str_types):
- return True
- return validate_rfc3339(instance)
-
- @_checks_drafts(draft7="time")
- def is_time(instance):
- if not isinstance(instance, str_types):
- return True
- return is_datetime("1970-01-01T" + instance)
-
-
@_checks_drafts(name="regex", raises=re.error)
def is_regex(instance):
if not isinstance(instance, str_types):
@@ -340,86 +250,3 @@ def is_draft3_time(instance):
if not isinstance(instance, str_types):
return True
return datetime.datetime.strptime(instance, "%H:%M:%S")
-
-
-try:
- import webcolors
-except ImportError:
- pass
-else:
- def is_css_color_code(instance):
- return webcolors.normalize_hex(instance)
-
- @_checks_drafts(draft3="color", raises=(ValueError, TypeError))
- def is_css21_color(instance):
- if (
- not isinstance(instance, str_types) or
- instance.lower() in webcolors.css21_names_to_hex
- ):
- return True
- return is_css_color_code(instance)
-
- def is_css3_color(instance):
- if instance.lower() in webcolors.css3_names_to_hex:
- return True
- return is_css_color_code(instance)
-
-
-try:
- import jsonpointer
-except ImportError:
- pass
-else:
- @_checks_drafts(
- draft6="json-pointer",
- draft7="json-pointer",
- raises=jsonpointer.JsonPointerException,
- )
- def is_json_pointer(instance):
- if not isinstance(instance, str_types):
- return True
- return jsonpointer.JsonPointer(instance)
-
- # TODO: I don't want to maintain this, so it
- # needs to go either into jsonpointer (pending
- # https://github.com/stefankoegl/python-json-pointer/issues/34) or
- # into a new external library.
- @_checks_drafts(
- draft7="relative-json-pointer",
- raises=jsonpointer.JsonPointerException,
- )
- def is_relative_json_pointer(instance):
- # Definition taken from:
- # https://tools.ietf.org/html/draft-handrews-relative-json-pointer-01#section-3
- if not isinstance(instance, str_types):
- return True
- non_negative_integer, rest = [], ""
- for i, character in enumerate(instance):
- if character.isdigit():
- non_negative_integer.append(character)
- continue
-
- if not non_negative_integer:
- return False
-
- rest = instance[i:]
- break
- return (rest == "#") or jsonpointer.JsonPointer(rest)
-
-
-try:
- import uritemplate.exceptions
-except ImportError:
- pass
-else:
- @_checks_drafts(
- draft6="uri-template",
- draft7="uri-template",
- raises=uritemplate.exceptions.InvalidTemplate,
- )
- def is_uri_template(
- instance,
- template_validator=uritemplate.Validator().force_balanced_braces(),
- ):
- template = uritemplate.URITemplate(instance)
- return template_validator.validate(template)


@@ -27,6 +27,8 @@
from llnl.util.lang import dedupe, memoized
from llnl.util.symlink import islink, readlink, resolve_link_target_relative_to_the_link, symlink

+from spack.util.executable import Executable, which
+
from ..path import path_to_os_path, system_path_filter

if sys.platform != "win32":
@@ -51,6 +53,7 @@
"find_all_headers", "find_all_headers",
"find_libraries", "find_libraries",
"find_system_libraries", "find_system_libraries",
"fix_darwin_install_name",
"force_remove", "force_remove",
"force_symlink", "force_symlink",
"getuid", "getuid",
@@ -245,6 +248,42 @@ def path_contains_subdirectory(path, root):
    return norm_path.startswith(norm_root)

+@memoized
+def file_command(*args):
+    """Creates entry point to `file` system command with provided arguments"""
+    file_cmd = which("file", required=True)
+    for arg in args:
+        file_cmd.add_default_arg(arg)
+    return file_cmd
+
+@memoized
+def _get_mime_type():
+    """Generate method to call `file` system command to aquire mime type
+    for a specified path
+    """
+    if sys.platform == "win32":
+        # -h option (no-dereference) does not exist in Windows
+        return file_command("-b", "--mime-type")
+    else:
+        return file_command("-b", "-h", "--mime-type")
+
+def mime_type(filename):
+    """Returns the mime type and subtype of a file.
+
+    Args:
+        filename: file to be analyzed
+
+    Returns:
+        Tuple containing the MIME type and subtype
+    """
+    output = _get_mime_type()(filename, output=str, error=str).strip()
+    tty.debug("==> " + output)
+    type, _, subtype = output.partition("/")
+    return type, subtype
+
#: This generates the library filenames that may appear on any OS.
library_extensions = ["a", "la", "so", "tbd", "dylib"]
@@ -727,6 +766,7 @@
def copy_tree(
    src: str,
    dest: str,
    symlinks: bool = True,
+    allow_broken_symlinks: bool = sys.platform != "win32",
    ignore: Optional[Callable[[str], bool]] = None,
    _permissions: bool = False,
):
@@ -749,6 +789,8 @@ def copy_tree(
        src (str): the directory to copy
        dest (str): the destination directory
        symlinks (bool): whether or not to preserve symlinks
+        allow_broken_symlinks (bool): whether or not to allow broken (dangling) symlinks,
+            On Windows, setting this to True will raise an exception. Defaults to true on unix.
        ignore (typing.Callable): function indicating which files to ignore
        _permissions (bool): for internal use only
@@ -756,6 +798,8 @@ def copy_tree(
        IOError: if *src* does not match any files or directories
        ValueError: if *src* is a parent directory of *dest*
    """
+    if allow_broken_symlinks and sys.platform == "win32":
+        raise llnl.util.symlink.SymlinkError("Cannot allow broken symlinks on Windows!")
    if _permissions:
        tty.debug("Installing {0} to {1}".format(src, dest))
    else:
@@ -828,14 +872,16 @@ def escaped_path(path):
            copy_mode(s, d)

    for target, d, s in links:
-        symlink(target, d)
+        symlink(target, d, allow_broken_symlinks=allow_broken_symlinks)
        if _permissions:
            set_install_permissions(d)
            copy_mode(s, d)

@system_path_filter
-def install_tree(src, dest, symlinks=True, ignore=None):
+def install_tree(
+    src, dest, symlinks=True, ignore=None, allow_broken_symlinks=sys.platform != "win32"
+):
    """Recursively install an entire directory tree rooted at *src*.

    Same as :py:func:`copy_tree` with the addition of setting proper
@@ -846,12 +892,21 @@ def install_tree(src, dest, symlinks=True, ignore=None):
        dest (str): the destination directory
        symlinks (bool): whether or not to preserve symlinks
        ignore (typing.Callable): function indicating which files to ignore
+        allow_broken_symlinks (bool): whether or not to allow broken (dangling) symlinks,
+            On Windows, setting this to True will raise an exception.

    Raises:
        IOError: if *src* does not match any files or directories
        ValueError: if *src* is a parent directory of *dest*
    """
-    copy_tree(src, dest, symlinks=symlinks, ignore=ignore, _permissions=True)
+    copy_tree(
+        src,
+        dest,
+        symlinks=symlinks,
+        allow_broken_symlinks=allow_broken_symlinks,
+        ignore=ignore,
+        _permissions=True,
+    )

@system_path_filter
@@ -1585,12 +1640,6 @@ def remove_linked_tree(path):
        shutil.rmtree(os.path.realpath(path), **kwargs)
        os.unlink(path)
    else:
-        if sys.platform == "win32":
-            # Adding this prefix allows shutil to remove long paths on windows
-            # https://learn.microsoft.com/en-us/windows/win32/fileio/maximum-file-path-limitation?tabs=registry
-            long_path_pfx = "\\\\?\\"
-            if not path.startswith(long_path_pfx):
-                path = long_path_pfx + path
        shutil.rmtree(path, **kwargs)
@@ -1640,6 +1689,41 @@ def safe_remove(*files_or_dirs):
        raise

+@system_path_filter
+def fix_darwin_install_name(path):
+    """Fix install name of dynamic libraries on Darwin to have full path.
+
+    There are two parts of this task:
+
+    1. Use ``install_name('-id', ...)`` to change install name of a single lib
+    2. Use ``install_name('-change', ...)`` to change the cross linking between
+       libs. The function assumes that all libraries are in one folder and
+       currently won't follow subfolders.
+
+    Parameters:
+        path (str): directory in which .dylib files are located
+    """
+    libs = glob.glob(join_path(path, "*.dylib"))
+    for lib in libs:
+        # fix install name first:
+        install_name_tool = Executable("install_name_tool")
+        install_name_tool("-id", lib, lib)
+        otool = Executable("otool")
+        long_deps = otool("-L", lib, output=str).split("\n")
+        deps = [dep.partition(" ")[0][1::] for dep in long_deps[2:-1]]
+        # fix all dependencies:
+        for dep in deps:
+            for loc in libs:
+                # We really want to check for either
+                #     dep == os.path.basename(loc)   or
+                #     dep == join_path(builddir, os.path.basename(loc)),
+                # but we don't know builddir (nor how symbolic links look
+                # in builddir). We thus only compare the basenames.
+                if os.path.basename(dep) == os.path.basename(loc):
+                    install_name_tool("-change", dep, loc, lib)
+                    break
+
def find_first(root: str, files: Union[Iterable[str], str], bfs_depth: int = 2) -> Optional[str]:
    """Find the first file matching a pattern.


@@ -6,6 +6,7 @@
import collections.abc
import contextlib
import functools
+import inspect
import itertools
import os
import re
@@ -15,7 +16,7 @@
from typing import Any, Callable, Iterable, List, Tuple

# Ignore emacs backups when listing modules
-ignore_modules = r"^\.#|~$"
+ignore_modules = [r"^\.#", "~$"]

def index_by(objects, *funcs):
@@ -83,6 +84,20 @@ def index_by(objects, *funcs):
    return result

+def caller_locals():
+    """This will return the locals of the *parent* of the caller.
+    This allows a function to insert variables into its caller's
+    scope.  Yes, this is some black magic, and yes it's useful
+    for implementing things like depends_on and provides.
+    """
+    # Passing zero here skips line context for speed.
+    stack = inspect.stack(0)
+    try:
+        return stack[2][0].f_locals
+    finally:
+        del stack
+
def attr_setdefault(obj, name, value):
    """Like dict.setdefault, but for objects."""
    if not hasattr(obj, name):
@@ -90,6 +105,15 @@ def attr_setdefault(obj, name, value):
    return getattr(obj, name)

+def has_method(cls, name):
+    for base in inspect.getmro(cls):
+        if base is object:
+            continue
+        if name in base.__dict__:
+            return True
+    return False
+
def union_dicts(*dicts):
    """Use update() to combine all dicts into one.
@@ -154,22 +178,19 @@ def list_modules(directory, **kwargs):
order.""" order."""
list_directories = kwargs.setdefault("directories", True) list_directories = kwargs.setdefault("directories", True)
ignore = re.compile(ignore_modules) for name in os.listdir(directory):
if name == "__init__.py":
continue
with os.scandir(directory) as it: path = os.path.join(directory, name)
for entry in it: if list_directories and os.path.isdir(path):
if entry.name == "__init__.py" or entry.name == "__pycache__": init_py = os.path.join(path, "__init__.py")
continue if os.path.isfile(init_py):
yield name
if ( elif name.endswith(".py"):
list_directories if not any(re.search(pattern, name) for pattern in ignore_modules):
and entry.is_dir() yield re.sub(".py$", "", name)
and os.path.isfile(os.path.join(entry.path, "__init__.py"))
):
yield entry.name
elif entry.name.endswith(".py") and entry.is_file() and not ignore.search(entry.name):
yield entry.name[:-3] # strip .py
def decorator_with_or_without_args(decorator): def decorator_with_or_without_args(decorator):
@@ -216,8 +237,8 @@ def setter(name, value):
        value.__name__ = name
        setattr(cls, name, value)

-    if not hasattr(cls, "_cmp_key"):
-        raise TypeError(f"'{cls.__name__}' doesn't define _cmp_key().")
+    if not has_method(cls, "_cmp_key"):
+        raise TypeError("'%s' doesn't define _cmp_key()." % cls.__name__)

    setter("__eq__", lambda s, o: (s is o) or (o is not None and s._cmp_key() == o._cmp_key()))
    setter("__lt__", lambda s, o: o is not None and s._cmp_key() < o._cmp_key())
@@ -367,8 +388,8 @@ def cd_fun():
        TypeError: If the class does not have a ``_cmp_iter`` method
    """
-    if not hasattr(cls, "_cmp_iter"):
-        raise TypeError(f"'{cls.__name__}' doesn't define _cmp_iter().")
+    if not has_method(cls, "_cmp_iter"):
+        raise TypeError("'%s' doesn't define _cmp_iter()." % cls.__name__)

    # comparison operators are implemented in terms of lazy_eq and lazy_lt
    def eq(self, other):
@@ -843,19 +864,20 @@ def uniq(sequence):
    return uniq_list

-def elide_list(line_list: List[str], max_num: int = 10) -> List[str]:
+def elide_list(line_list, max_num=10):
    """Takes a long list and limits it to a smaller number of elements,
    replacing intervening elements with '...'.  For example::

-        elide_list(["1", "2", "3", "4", "5", "6"], 4)
+        elide_list([1,2,3,4,5,6], 4)

    gives::

-        ["1", "2", "3", "...", "6"]
+        [1, 2, 3, '...', 6]
    """
    if len(line_list) > max_num:
-        return [*line_list[: max_num - 1], "...", line_list[-1]]
-    return line_list
+        return line_list[: max_num - 1] + ["..."] + line_list[-1:]
+    else:
+        return line_list

@contextlib.contextmanager
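Both versions of elide_list keep the first max_num - 1 elements and the final one. A quick check of the v0.22 behavior shown on the plus side:

    def elide_list(line_list, max_num=10):
        if len(line_list) > max_num:
            return line_list[: max_num - 1] + ["..."] + line_list[-1:]
        else:
            return line_list

    assert elide_list([1, 2, 3, 4, 5, 6], 4) == [1, 2, 3, "...", 6]
    assert elide_list([1, 2, 3], 4) == [1, 2, 3]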


@@ -8,7 +8,6 @@
import subprocess
import sys
import tempfile
-from typing import Union

from llnl.util import lang, tty
@@ -17,66 +16,92 @@
if sys.platform == "win32": if sys.platform == "win32":
from win32file import CreateHardLink from win32file import CreateHardLink
is_windows = sys.platform == "win32"
def _windows_symlink(
src: str, dst: str, target_is_directory: bool = False, *, dir_fd: Union[int, None] = None
):
"""On Windows with System Administrator privileges this will be a normal symbolic link via
os.symlink. On Windows without privledges the link will be a junction for a directory and a
hardlink for a file. On Windows the various link types are:
Symbolic Link: A link to a file or directory on the same or different volume (drive letter) or def symlink(source_path: str, link_path: str, allow_broken_symlinks: bool = not is_windows):
even to a remote file or directory (using UNC in its path). Need System Administrator """
privileges to make these. Create a link.
Hard Link: A link to a file on the same volume (drive letter) only. Every file (file's data) On non-Windows and Windows with System Administrator
has at least 1 hard link (file's name). But when this method creates a new hard link there will privleges this will be a normal symbolic link via
be 2. Deleting all hard links effectively deletes the file. Don't need System Administrator os.symlink.
privileges.
Junction: A link to a directory on the same or different volume (drive letter) but not to a On Windows without privledges the link will be a
remote directory. Don't need System Administrator privileges.""" junction for a directory and a hardlink for a file.
source_path = os.path.normpath(src) On Windows the various link types are:
Symbolic Link: A link to a file or directory on the
same or different volume (drive letter) or even to
a remote file or directory (using UNC in its path).
Need System Administrator privileges to make these.
Hard Link: A link to a file on the same volume (drive
letter) only. Every file (file's data) has at least 1
hard link (file's name). But when this method creates
a new hard link there will be 2. Deleting all hard
links effectively deletes the file. Don't need System
Administrator privileges.
Junction: A link to a directory on the same or different
volume (drive letter) but not to a remote directory. Don't
need System Administrator privileges.
Parameters:
source_path (str): The real file or directory that the link points to.
Must be absolute OR relative to the link.
link_path (str): The path where the link will exist.
allow_broken_symlinks (bool): On Linux or Mac, don't raise an exception if the source_path
doesn't exist. This will still raise an exception on Windows.
"""
source_path = os.path.normpath(source_path)
win_source_path = source_path win_source_path = source_path
link_path = os.path.normpath(dst) link_path = os.path.normpath(link_path)
# Perform basic checks to make sure symlinking will succeed # Never allow broken links on Windows.
if os.path.lexists(link_path): if sys.platform == "win32" and allow_broken_symlinks:
raise AlreadyExistsError(f"Link path ({link_path}) already exists. Cannot create link.") raise ValueError("allow_broken_symlinks parameter cannot be True on Windows.")
if not os.path.exists(source_path): if not allow_broken_symlinks:
if os.path.isabs(source_path): # Perform basic checks to make sure symlinking will succeed
# An absolute source path that does not exist will result in a broken link. if os.path.lexists(link_path):
raise SymlinkError( raise AlreadyExistsError(
f"Source path ({source_path}) is absolute but does not exist. Resulting " f"Link path ({link_path}) already exists. Cannot create link."
f"link would be broken so not making link."
) )
else:
# os.symlink can create a link when the given source path is relative to if not os.path.exists(source_path):
# the link path. Emulate this behavior and check to see if the source exists if os.path.isabs(source_path) and not allow_broken_symlinks:
# relative to the link path ahead of link creation to prevent broken # An absolute source path that does not exist will result in a broken link.
# links from being made.
link_parent_dir = os.path.dirname(link_path)
relative_path = os.path.join(link_parent_dir, source_path)
if os.path.exists(relative_path):
# In order to work on windows, the source path needs to be modified to be
# relative because hardlink/junction dont resolve relative paths the same
# way as os.symlink. This is ignored on other operating systems.
win_source_path = relative_path
else:
raise SymlinkError( raise SymlinkError(
f"The source path ({source_path}) is not relative to the link path " f"Source path ({source_path}) is absolute but does not exist. Resulting "
f"({link_path}). Resulting link would be broken so not making link." f"link would be broken so not making link."
) )
else:
# os.symlink can create a link when the given source path is relative to
# the link path. Emulate this behavior and check to see if the source exists
# relative to the link path ahead of link creation to prevent broken
# links from being made.
link_parent_dir = os.path.dirname(link_path)
relative_path = os.path.join(link_parent_dir, source_path)
if os.path.exists(relative_path):
# In order to work on windows, the source path needs to be modified to be
# relative because hardlink/junction dont resolve relative paths the same
# way as os.symlink. This is ignored on other operating systems.
win_source_path = relative_path
elif not allow_broken_symlinks:
raise SymlinkError(
f"The source path ({source_path}) is not relative to the link path "
f"({link_path}). Resulting link would be broken so not making link."
)
# Create the symlink # Create the symlink
if not _windows_can_symlink(): if sys.platform == "win32" and not _windows_can_symlink():
_windows_create_link(win_source_path, link_path) _windows_create_link(win_source_path, link_path)
else: else:
os.symlink(source_path, link_path, target_is_directory=os.path.isdir(source_path)) os.symlink(source_path, link_path, target_is_directory=os.path.isdir(source_path))
-def _windows_islink(path: str) -> bool:
+def islink(path: str) -> bool:
    """Override os.islink to give correct answer for spack logic.

    For Non-Windows: a link can be determined with the os.path.islink method.
@@ -244,7 +269,7 @@ def _windows_create_hard_link(path: str, link: str):
    CreateHardLink(link, path)

-def _windows_readlink(path: str, *, dir_fd=None):
+def readlink(path: str, *, dir_fd=None):
    """Spack utility to override of os.readlink method to work cross platform"""
    if _windows_is_hardlink(path):
        return _windows_read_hard_link(path)
@@ -313,16 +338,6 @@ def resolve_link_target_relative_to_the_link(link):
    return os.path.join(link_dir, target)

-if sys.platform == "win32":
-    symlink = _windows_symlink
-    readlink = _windows_readlink
-    islink = _windows_islink
-else:
-    symlink = os.symlink
-    readlink = os.readlink
-    islink = os.path.islink
-
class SymlinkError(RuntimeError):
    """Exception class for errors raised while creating symlinks,
    junctions and hard links
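The broken-link detection in symlink() above reduces to one rule: an absolute source must exist, and a relative source must exist relative to the link's parent directory, mirroring how os.symlink resolves relative targets. A sketch of that rule in isolation (the helper name is ours, not Spack's):

    import os

    def link_would_be_broken(source_path, link_path):
        """True when creating link_path -> source_path would dangle."""
        if os.path.exists(source_path):
            return False
        if os.path.isabs(source_path):
            return True
        # Relative sources resolve against the link's directory, not the CWD.
        relative = os.path.join(os.path.dirname(link_path), source_path)
        return not os.path.exists(relative)

    # "../bin/sh" is resolved against /tmp, just as os.symlink would do.
    print(link_would_be_broken("../bin/sh", "/tmp/sh-link"))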


@@ -10,7 +10,6 @@
import errno
import io
import multiprocessing
-import multiprocessing.connection
import os
import re
import select
@@ -34,23 +33,8 @@
    pass

-esc, bell, lbracket, bslash, newline = r"\x1b", r"\x07", r"\[", r"\\", r"\n"
-
-# Ansi Control Sequence Introducers (CSI) are a well-defined format
-# Standard ECMA-48: Control Functions for Character-Imaging I/O Devices, section 5.4
-# https://www.ecma-international.org/wp-content/uploads/ECMA-48_5th_edition_june_1991.pdf
-csi_pre = f"{esc}{lbracket}"
-csi_param, csi_inter, csi_post = r"[0-?]", r"[ -/]", r"[@-~]"
-ansi_csi = f"{csi_pre}{csi_param}*{csi_inter}*{csi_post}"
-
-# General ansi escape sequences have well-defined prefixes,
-# but content and suffixes are less reliable.
-# Conservatively assume they end with either "<ESC>\" or "<BELL>",
-# with no intervening "<ESC>"/"<BELL>" keys or newlines
-esc_pre = f"{esc}[@-_]"
-esc_content = f"[^{esc}{bell}{newline}]"
-esc_post = f"(?:{esc}{bslash}|{bell})"
-ansi_esc = f"{esc_pre}{esc_content}*{esc_post}"
-
# Use this to strip escape sequences
-_escape = re.compile(f"{ansi_csi}|{ansi_esc}")
+_escape = re.compile(r"\x1b[^m]*m|\x1b\[?1034h|\x1b\][0-9]+;[^\x07]*\x07")

# control characters for enabling/disabling echo
#
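The two _escape patterns differ in scope: the removed one implements the ECMA-48 CSI grammar plus general escape sequences, while the v0.22.2 pattern targets color codes, the \x1b[?1034h mode report, and OSC (title-setting) sequences. A small check of the restored pattern:

    import re

    _escape = re.compile(r"\x1b[^m]*m|\x1b\[?1034h|\x1b\][0-9]+;[^\x07]*\x07")

    colored = "\x1b[1;31merror:\x1b[0m something bad"
    assert _escape.sub("", colored) == "error: something bad"

    titled = "\x1b]0;spack\x07plain"
    assert _escape.sub("", titled) == "plain"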


@@ -3,15 +3,8 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

-import os
-import re
-from typing import Optional
-
-import spack.paths
-import spack.util.git
-
#: PEP440 canonical <major>.<minor>.<micro>.<devN> string
-__version__ = "0.23.0.dev0"
+__version__ = "0.22.2"
spack_version = __version__
@@ -26,47 +19,4 @@ def __try_int(v):
spack_version_info = tuple([__try_int(v) for v in __version__.split(".")])

-def get_spack_commit() -> Optional[str]:
-    """Get the Spack git commit sha.
-
-    Returns:
-        (str or None) the commit sha if available, otherwise None
-    """
-    git_path = os.path.join(spack.paths.prefix, ".git")
-    if not os.path.exists(git_path):
-        return None
-
-    git = spack.util.git.git()
-    if not git:
-        return None
-
-    rev = git(
-        "-C",
-        spack.paths.prefix,
-        "rev-parse",
-        "HEAD",
-        output=str,
-        error=os.devnull,
-        fail_on_error=False,
-    )
-    if git.returncode != 0:
-        return None
-
-    match = re.match(r"[a-f\d]{7,}$", rev)
-    return match.group(0) if match else None
-
-def get_version() -> str:
-    """Get a descriptive version of this instance of Spack.
-
-    Outputs '<PEP440 version> (<git commit sha>)'.
-
-    The commit sha is only added when available.
-    """
-    commit = get_spack_commit()
-    if commit:
-        return f"{spack_version} ({commit})"
-
-    return spack_version
-
-__all__ = ["spack_version_info", "spack_version", "get_version", "get_spack_commit"]
+__all__ = ["spack_version_info", "spack_version"]

lib/spack/spack/abi.py (new file, 131 lines)

@@ -0,0 +1,131 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os

from llnl.util.lang import memoized

import spack.spec
import spack.version
from spack.compilers.clang import Clang
from spack.util.executable import Executable, ProcessError

class ABI:
    """This class provides methods to test ABI compatibility between specs.
    The current implementation is rather rough and could be improved."""

    def architecture_compatible(
        self, target: spack.spec.Spec, constraint: spack.spec.Spec
    ) -> bool:
        """Return true if architecture of target spec is ABI compatible
        to the architecture of constraint spec.  If either the target
        or constraint specs have no architecture, target is also defined
        as architecture ABI compatible to constraint."""
        return (
            not target.architecture
            or not constraint.architecture
            or target.architecture.intersects(constraint.architecture)
        )

    @memoized
    def _gcc_get_libstdcxx_version(self, version):
        """Returns gcc ABI compatibility info by getting the library version of
        a compiler's libstdc++ or libgcc_s"""
        from spack.build_environment import dso_suffix

        spec = spack.spec.CompilerSpec("gcc", version)
        compilers = spack.compilers.compilers_for_spec(spec)
        if not compilers:
            return None
        compiler = compilers[0]
        rungcc = None
        libname = None
        output = None
        if compiler.cxx:
            rungcc = Executable(compiler.cxx)
            libname = "libstdc++." + dso_suffix
        elif compiler.cc:
            rungcc = Executable(compiler.cc)
            libname = "libgcc_s." + dso_suffix
        else:
            return None
        try:
            # Some gcc's are actually clang and don't respond properly to
            # --print-file-name (they just print the filename, not the
            # full path).  Ignore these and expect them to be handled as clang.
            if Clang.default_version(rungcc.exe[0]) != "unknown":
                return None
            output = rungcc("--print-file-name=%s" % libname, output=str)
        except ProcessError:
            return None
        if not output:
            return None
        libpath = os.path.realpath(output.strip())
        if not libpath:
            return None
        return os.path.basename(libpath)

    @memoized
    def _gcc_compiler_compare(self, pversion, cversion):
        """Returns true iff the gcc version pversion and cversion
        are ABI compatible."""
        plib = self._gcc_get_libstdcxx_version(pversion)
        clib = self._gcc_get_libstdcxx_version(cversion)
        if not plib or not clib:
            return False
        return plib == clib

    def _intel_compiler_compare(
        self, pversion: spack.version.ClosedOpenRange, cversion: spack.version.ClosedOpenRange
    ) -> bool:
        """Returns true iff the intel version pversion and cversion
        are ABI compatible"""
        # Test major and minor versions.  Ignore build version.
        pv = pversion.lo
        cv = cversion.lo
        return pv.up_to(2) == cv.up_to(2)

    def compiler_compatible(
        self, parent: spack.spec.Spec, child: spack.spec.Spec, loose: bool = False
    ) -> bool:
        """Return true if compilers for parent and child are ABI compatible."""
        if not parent.compiler or not child.compiler:
            return True

        if parent.compiler.name != child.compiler.name:
            # Different compiler families are assumed ABI incompatible
            return False

        if loose:
            return True

        # TODO: Can we move the specialized ABI matching stuff
        # TODO: into compiler classes?
        for pversion in parent.compiler.versions:
            for cversion in child.compiler.versions:
                # For a few compilers use specialized comparisons.
                # Otherwise match on version match.
                if pversion.intersects(cversion):
                    return True
                elif parent.compiler.name == "gcc" and self._gcc_compiler_compare(
                    pversion, cversion
                ):
                    return True
                elif parent.compiler.name == "intel" and self._intel_compiler_compare(
                    pversion, cversion
                ):
                    return True
        return False

    def compatible(
        self, target: spack.spec.Spec, constraint: spack.spec.Spec, loose: bool = False
    ) -> bool:
        """Returns true if target spec is ABI compatible to constraint spec"""
        return self.architecture_compatible(target, constraint) and self.compiler_compatible(
            target, constraint, loose=loose
        )
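_intel_compiler_compare above only consults major and minor components. A standalone sketch of that rule, with versions simplified to plain tuples instead of spack.version ranges:

    def intel_abi_compatible(parent_version, child_version):
        # Intel compilers are treated as ABI compatible when their
        # major.minor versions match; the build component is ignored.
        return parent_version[:2] == child_version[:2]

    assert intel_abi_compatible((2021, 4, 0), (2021, 4, 311))
    assert not intel_abi_compatible((2021, 4, 0), (2022, 1, 0))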


@@ -39,21 +39,18 @@ def _search_duplicate_compilers(error_cls):
import collections
import collections.abc
import glob
+import inspect
import io
import itertools
-import os
import pathlib
import pickle
import re
import warnings
-from typing import Iterable, List, Set, Tuple
from urllib.request import urlopen

import llnl.util.lang

-import spack.builder
import spack.config
-import spack.fetch_strategy
import spack.patch
import spack.repo
import spack.spec
@@ -76,9 +73,7 @@ def __init__(self, summary, details):
        self.details = tuple(details)

    def __str__(self):
-        if self.details:
-            return f"{self.summary}\n" + "\n".join(f"    {detail}" for detail in self.details)
-        return self.summary
+        return self.summary + "\n" + "\n".join(["    " + detail for detail in self.details])

    def __eq__(self, other):
        if self.summary != other.summary or self.details != other.details:
@@ -215,11 +210,6 @@ def _search_duplicate_compilers(error_cls):
group="configs", tag="CFG-PACKAGES", description="Sanity checks on packages.yaml", kwargs=() group="configs", tag="CFG-PACKAGES", description="Sanity checks on packages.yaml", kwargs=()
) )
#: Sanity checks on packages.yaml
config_repos = AuditClass(
group="configs", tag="CFG-REPOS", description="Sanity checks on repositories", kwargs=()
)
@config_packages @config_packages
def _search_duplicate_specs_in_externals(error_cls): def _search_duplicate_specs_in_externals(error_cls):
@@ -262,6 +252,40 @@ def _search_duplicate_specs_in_externals(error_cls):
    return errors

+@config_packages
+def _deprecated_preferences(error_cls):
+    """Search package preferences deprecated in v0.21 (and slated for removal in v0.23)"""
+    # TODO (v0.23): remove this audit as the attributes will not be allowed in config
+    errors = []
+    packages_yaml = spack.config.CONFIG.get_config("packages")
+
+    def make_error(attribute_name, config_data, summary):
+        s = io.StringIO()
+        s.write("Occurring in the following file:\n")
+        dict_view = syaml.syaml_dict((k, v) for k, v in config_data.items() if k == attribute_name)
+        syaml.dump_config(dict_view, stream=s, blame=True)
+        return error_cls(summary=summary, details=[s.getvalue()])
+
+    if "all" in packages_yaml and "version" in packages_yaml["all"]:
+        summary = "Using the deprecated 'version' attribute under 'packages:all'"
+        errors.append(make_error("version", packages_yaml["all"], summary))
+
+    for package_name in packages_yaml:
+        if package_name == "all":
+            continue
+
+        package_conf = packages_yaml[package_name]
+        for attribute in ("compiler", "providers", "target"):
+            if attribute not in package_conf:
+                continue
+            summary = (
+                f"Using the deprecated '{attribute}' attribute " f"under 'packages:{package_name}'"
+            )
+            errors.append(make_error(attribute, package_conf, summary))
+    return errors
@config_packages
def _avoid_mismatched_variants(error_cls):
    """Warns if variant preferences have mismatched types or names."""
@@ -282,7 +306,7 @@ def _avoid_mismatched_variants(error_cls):
        pkg_cls = spack.repo.PATH.get_pkg_class(pkg_name)
        for variant in current_spec.variants.values():
            # Variant does not exist at all
-            if variant.name not in pkg_cls.variant_names():
+            if variant.name not in pkg_cls.variants:
                summary = (
                    f"Setting a preference for the '{pkg_name}' package to the "
                    f"non-existing variant '{variant.name}'"
@@ -291,8 +315,9 @@ def _avoid_mismatched_variants(error_cls):
                continue

            # Variant cannot accept this value
+            s = spack.spec.Spec(pkg_name)
            try:
-                spack.variant.prevalidate_variant_value(pkg_cls, variant, strict=True)
+                s.update_variant_validate(variant.name, variant.value)
            except Exception:
                summary = (
                    f"Setting the variant '{variant.name}' of the '{pkg_name}' package "
@@ -326,43 +351,6 @@ def _wrongly_named_spec(error_cls):
    return errors

-@config_packages
-def _ensure_all_virtual_packages_have_default_providers(error_cls):
-    """All virtual packages must have a default provider explicitly set."""
-    configuration = spack.config.create()
-    defaults = configuration.get("packages", scope="defaults")
-    default_providers = defaults["all"]["providers"]
-    virtuals = spack.repo.PATH.provider_index.providers
-    default_providers_filename = configuration.scopes["defaults"].get_section_filename("packages")
-
-    return [
-        error_cls(f"'{virtual}' must have a default provider in {default_providers_filename}", [])
-        for virtual in virtuals
-        if virtual not in default_providers
-    ]
-
-@config_repos
-def _ensure_no_folders_without_package_py(error_cls):
-    """Check that we don't leave any folder without a package.py in repos"""
-    errors = []
-    for repository in spack.repo.PATH.repos:
-        missing = []
-        for entry in os.scandir(repository.packages_path):
-            if not entry.is_dir():
-                continue
-            package_py = pathlib.Path(entry.path) / spack.repo.package_file_name
-            if not package_py.exists():
-                missing.append(entry.path)
-        if missing:
-            summary = (
-                f"The '{repository.namespace}' repository misses a package.py file"
-                f" in the following folders"
-            )
-            errors.append(error_cls(summary=summary, details=[f"{x}" for x in missing]))
-    return errors
-
def _make_config_error(config_data, summary, error_cls):
    s = io.StringIO()
    s.write("Occurring in the following file:\n")
@@ -433,10 +421,6 @@ def _check_patch_urls(pkgs, error_cls):
r"^https?://(?:patch-diff\.)?github(?:usercontent)?\.com/" r"^https?://(?:patch-diff\.)?github(?:usercontent)?\.com/"
r".+/.+/(?:commit|pull)/[a-fA-F0-9]+\.(?:patch|diff)" r".+/.+/(?:commit|pull)/[a-fA-F0-9]+\.(?:patch|diff)"
) )
github_pull_commits_re = (
r"^https?://(?:patch-diff\.)?github(?:usercontent)?\.com/"
r".+/.+/pull/\d+/commits/[a-fA-F0-9]+\.(?:patch|diff)"
)
# Only .diff URLs have stable/full hashes: # Only .diff URLs have stable/full hashes:
# https://forum.gitlab.com/t/patches-with-full-index/29313 # https://forum.gitlab.com/t/patches-with-full-index/29313
gitlab_patch_url_re = ( gitlab_patch_url_re = (
@@ -452,24 +436,14 @@ def _check_patch_urls(pkgs, error_cls):
            if not isinstance(patch, spack.patch.UrlPatch):
                continue

-            if re.match(github_pull_commits_re, patch.url):
-                url = re.sub(r"/pull/\d+/commits/", r"/commit/", patch.url)
-                url = re.sub(r"^(.*)(?<!full_index=1)$", r"\1?full_index=1", url)
-                errors.append(
-                    error_cls(
-                        f"patch URL in package {pkg_cls.name} "
-                        + "must not be a pull request commit; "
-                        + f"instead use {url}",
-                        [patch.url],
-                    )
-                )
-            elif re.match(github_patch_url_re, patch.url):
+            if re.match(github_patch_url_re, patch.url):
                full_index_arg = "?full_index=1"
                if not patch.url.endswith(full_index_arg):
                    errors.append(
                        error_cls(
-                            f"patch URL in package {pkg_cls.name} "
-                            + f"must end with {full_index_arg}",
+                            "patch URL in package {0} must end with {1}".format(
+                                pkg_cls.name, full_index_arg
+                            ),
                            [patch.url],
                        )
                    )
@@ -477,7 +451,9 @@ def _check_patch_urls(pkgs, error_cls):
if not patch.url.endswith(".diff"): if not patch.url.endswith(".diff"):
errors.append( errors.append(
error_cls( error_cls(
f"patch URL in package {pkg_cls.name} must end with .diff", "patch URL in package {0} must end with .diff".format(
pkg_cls.name
),
[patch.url], [patch.url],
) )
) )
@@ -494,7 +470,7 @@ def _search_for_reserved_attributes_names_in_packages(pkgs, error_cls):
        name_definitions = collections.defaultdict(list)
        pkg_cls = spack.repo.PATH.get_pkg_class(pkg_name)

-        for cls_item in pkg_cls.__mro__:
+        for cls_item in inspect.getmro(pkg_cls):
            for name in RESERVED_NAMES:
                current_value = cls_item.__dict__.get(name)
                if current_value is None:
@@ -523,7 +499,7 @@ def _ensure_all_package_names_are_lowercase(pkgs, error_cls):
    badname_regex, errors = re.compile(r"[_A-Z]"), []
    for pkg_name in pkgs:
        if badname_regex.search(pkg_name):
-            error_msg = f"Package name '{pkg_name}' should be lowercase and must not contain '_'"
+            error_msg = "Package name '{}' is either lowercase or conatine '_'".format(pkg_name)
            errors.append(error_cls(error_msg, []))
    return errors
@@ -662,15 +638,9 @@ def _ensure_env_methods_are_ported_to_builders(pkgs, error_cls):
    errors = []
    for pkg_name in pkgs:
        pkg_cls = spack.repo.PATH.get_pkg_class(pkg_name)
-        # values are either Value objects (for conditional values) or the values themselves
-        build_system_names = set(
-            v.value if isinstance(v, spack.variant.Value) else v
-            for _, variant in pkg_cls.variant_definitions("build_system")
-            for v in variant.values
-        )
-        builder_cls_names = [spack.builder.BUILDER_CLS[x].__name__ for x in build_system_names]
+        buildsystem_variant, _ = pkg_cls.variants["build_system"]
+        buildsystem_names = [getattr(x, "value", x) for x in buildsystem_variant.values]
+        builder_cls_names = [spack.builder.BUILDER_CLS[x].__name__ for x in buildsystem_names]
        module = pkg_cls.module
        has_builders_in_package_py = any(
            getattr(module, name, False) for name in builder_cls_names
@@ -689,88 +659,6 @@ def _ensure_env_methods_are_ported_to_builders(pkgs, error_cls):
return errors return errors
class DeprecatedMagicGlobals(ast.NodeVisitor):
def __init__(self, magic_globals: Iterable[str]):
super().__init__()
self.magic_globals: Set[str] = set(magic_globals)
# State to track whether we're in a class function
self.depth: int = 0
self.in_function: bool = False
self.path = (ast.Module, ast.ClassDef, ast.FunctionDef)
# Defined locals in the current function (heuristically at least)
self.locals: Set[str] = set()
# List of (name, lineno) tuples for references to magic globals
self.references_to_globals: List[Tuple[str, int]] = []
def descend_in_function_def(self, node: ast.AST) -> None:
if not isinstance(node, self.path[self.depth]):
return
self.depth += 1
if self.depth == len(self.path):
self.in_function = True
super().generic_visit(node)
if self.depth == len(self.path):
self.in_function = False
self.locals.clear()
self.depth -= 1
def generic_visit(self, node: ast.AST) -> None:
# Recurse into function definitions
if self.depth < len(self.path):
return self.descend_in_function_def(node)
elif not self.in_function:
return
elif isinstance(node, ast.Global):
for name in node.names:
if name in self.magic_globals:
self.references_to_globals.append((name, node.lineno))
elif isinstance(node, ast.Assign):
# visit the rhs before lhs
super().visit(node.value)
for target in node.targets:
super().visit(target)
elif isinstance(node, ast.Name) and node.id in self.magic_globals:
if isinstance(node.ctx, ast.Load) and node.id not in self.locals:
self.references_to_globals.append((node.id, node.lineno))
elif isinstance(node.ctx, ast.Store):
self.locals.add(node.id)
else:
super().generic_visit(node)
@package_properties
def _uses_deprecated_globals(pkgs, error_cls):
"""Ensure that packages do not use deprecated globals"""
errors = []
for pkg_name in pkgs:
# some packages scheduled to be removed in v0.23 are not worth fixing.
pkg_cls = spack.repo.PATH.get_pkg_class(pkg_name)
if all(v.get("deprecated", False) for v in pkg_cls.versions.values()):
continue
file = spack.repo.PATH.filename_for_package_name(pkg_name)
tree = ast.parse(open(file).read())
visitor = DeprecatedMagicGlobals(("std_cmake_args",))
visitor.visit(tree)
if visitor.references_to_globals:
errors.append(
error_cls(
f"Package '{pkg_name}' uses deprecated globals",
[
f"{file}:{line} references '{name}'"
for name, line in visitor.references_to_globals
],
)
)
return errors
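A quick sketch of how the removed checker was meant to be driven — assuming the DeprecatedMagicGlobals class above is in scope; the sample package body is hypothetical:

import ast

# Hypothetical package.py body that still reads the deprecated global.
sample = (
    "class Foo(CMakePackage):\n"
    "    def cmake_args(self):\n"
    "        return std_cmake_args + ['-DFOO=ON']\n"
)

visitor = DeprecatedMagicGlobals(("std_cmake_args",))
visitor.visit(ast.parse(sample))
# Each entry is a (name, line) pair, e.g. [("std_cmake_args", 3)]
print(visitor.references_to_globals)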
@package_https_directives
def _linting_package_file(pkgs, error_cls):
    """Check for correctness of links"""
@@ -937,22 +825,20 @@ def check_virtual_with_variants(spec, msg):
                # check variants
                dependency_variants = dep.spec.variants
-               for name, variant in dependency_variants.items():
+               for name, value in dependency_variants.items():
                    try:
-                       spack.variant.prevalidate_variant_value(
-                           dependency_pkg_cls, variant, dep.spec, strict=True
-                       )
+                       v, _ = dependency_pkg_cls.variants[name]
+                       v.validate_or_raise(value, pkg_cls=dependency_pkg_cls)
                    except Exception as e:
                        summary = (
                            f"{pkg_name}: wrong variant used for dependency in 'depends_on()'"
                        )
+                       error_msg = str(e)
                        if isinstance(e, KeyError):
                            error_msg = (
                                f"variant {str(e).strip()} does not exist in package {dep_name}"
-                               f" in package '{dep_name}'"
                            )
+                       error_msg += f" in package '{dep_name}'"
                        errors.append(
                            error_cls(summary=summary, details=[error_msg, f"in {filename}"])
@@ -964,38 +850,39 @@ def check_virtual_with_variants(spec, msg):
@package_directives
def _ensure_variant_defaults_are_parsable(pkgs, error_cls):
    """Ensures that variant defaults are present and parsable from cli"""

-   def check_variant(pkg_cls, variant, vname):
-       # bool is a subclass of int in python. Permitting a default that is an instance
-       # of 'int' means both foo=false and foo=0 are accepted. Other falsish values are
-       # not allowed, since they can't be parsed from CLI ('foo=')
-       default_is_parsable = isinstance(variant.default, int) or variant.default
-       if not default_is_parsable:
-           msg = f"Variant '{vname}' of package '{pkg_cls.name}' has an unparsable default value"
-           return [error_cls(msg, [])]
-       try:
-           vspec = variant.make_default()
-       except spack.variant.MultipleValuesInExclusiveVariantError:
-           msg = f"Can't create default value for variant '{vname}' in package '{pkg_cls.name}'"
-           return [error_cls(msg, [])]
-       try:
-           variant.validate_or_raise(vspec, pkg_cls.name)
-       except spack.variant.InvalidVariantValueError:
-           msg = "Default value of variant '{vname}' in package '{pkg.name}' is invalid"
-           question = "Is it among the allowed values?"
-           return [error_cls(msg, [question])]
-       return []

    errors = []
    for pkg_name in pkgs:
        pkg_cls = spack.repo.PATH.get_pkg_class(pkg_name)
-       for vname in pkg_cls.variant_names():
-           for _, variant_def in pkg_cls.variant_definitions(vname):
-               errors.extend(check_variant(pkg_cls, variant_def, vname))
+       for variant_name, entry in pkg_cls.variants.items():
+           variant, _ = entry
+           default_is_parsable = (
+               # Permitting a default that is an instance on 'int' permits
+               # to have foo=false or foo=0. Other falsish values are
+               # not allowed, since they can't be parsed from cli ('foo=')
+               isinstance(variant.default, int)
+               or variant.default
+           )
+           if not default_is_parsable:
+               error_msg = "Variant '{}' of package '{}' has a bad default value"
+               errors.append(error_cls(error_msg.format(variant_name, pkg_name), []))
+               continue
+           try:
+               vspec = variant.make_default()
+           except spack.variant.MultipleValuesInExclusiveVariantError:
+               error_msg = "Cannot create a default value for the variant '{}' in package '{}'"
+               errors.append(error_cls(error_msg.format(variant_name, pkg_name), []))
+               continue
+           try:
+               variant.validate_or_raise(vspec, pkg_cls=pkg_cls)
+           except spack.variant.InvalidVariantValueError:
+               error_msg = (
+                   "The default value of the variant '{}' in package '{}' failed validation"
+               )
+               question = "Is it among the allowed values?"
+               errors.append(error_cls(error_msg.format(variant_name, pkg_name), [question]))

    return errors
@@ -1005,11 +892,11 @@ def _ensure_variants_have_descriptions(pkgs, error_cls):
    errors = []
    for pkg_name in pkgs:
        pkg_cls = spack.repo.PATH.get_pkg_class(pkg_name)
-       for name in pkg_cls.variant_names():
-           for when, variant in pkg_cls.variant_definitions(name):
-               if not variant.description:
-                   msg = f"Variant '{name}' in package '{pkg_name}' is missing a description"
-                   errors.append(error_cls(msg, []))
+       for variant_name, entry in pkg_cls.variants.items():
+           variant, _ = entry
+           if not variant.description:
+               error_msg = "Variant '{}' in package '{}' is missing a description"
+               errors.append(error_cls(error_msg.format(variant_name, pkg_name), []))
    return errors
@@ -1066,26 +953,29 @@ def _version_constraints_are_satisfiable_by_some_version_in_repo(pkgs, error_cls
def _analyze_variants_in_directive(pkg, constraint, directive, error_cls):
+   variant_exceptions = (
+       spack.variant.InconsistentValidationError,
+       spack.variant.MultipleValuesInExclusiveVariantError,
+       spack.variant.InvalidVariantValueError,
+       KeyError,
+   )
    errors = []
-   variant_names = pkg.variant_names()
-   summary = f"{pkg.name}: wrong variant in '{directive}' directive"
-   filename = spack.repo.PATH.filename_for_package_name(pkg.name)
    for name, v in constraint.variants.items():
-       if name not in variant_names:
-           msg = f"variant {name} does not exist in {pkg.name}"
-           errors.append(error_cls(summary=summary, details=[msg, f"in {filename}"]))
-           continue
        try:
-           spack.variant.prevalidate_variant_value(pkg, v, constraint, strict=True)
-       except (
-           spack.variant.InconsistentValidationError,
-           spack.variant.MultipleValuesInExclusiveVariantError,
-           spack.variant.InvalidVariantValueError,
-       ) as e:
-           msg = str(e).strip()
-           errors.append(error_cls(summary=summary, details=[msg, f"in {filename}"]))
+           variant, _ = pkg.variants[name]
+           variant.validate_or_raise(v, pkg_cls=pkg)
+       except variant_exceptions as e:
+           summary = pkg.name + ': wrong variant in "{0}" directive'
+           summary = summary.format(directive)
+           filename = spack.repo.PATH.filename_for_package_name(pkg.name)
+           error_msg = str(e).strip()
+           if isinstance(e, KeyError):
+               error_msg = "the variant {0} does not exist".format(error_msg)
+           err = error_cls(summary=summary, details=[error_msg, "in " + filename])
+           errors.append(err)

    return errors
@@ -1123,10 +1013,9 @@ def _extracts_errors(triggers, summary):
                for dname in dnames
            )

-       for when, variants_by_name in pkg_cls.variants.items():
-           for vname, variant in variants_by_name.items():
-               summary = f"{pkg_name}: wrong 'when=' condition for the '{vname}' variant"
-               errors.extend(_extracts_errors([when], summary))
+       for vname, (variant, triggers) in pkg_cls.variants.items():
+           summary = f"{pkg_name}: wrong 'when=' condition for the '{vname}' variant"
+           errors.extend(_extracts_errors(triggers, summary))

        for when, providers, details in _error_items(pkg_cls.provided):
            errors.extend(

File diff suppressed because it is too large

@@ -9,7 +9,6 @@
    all_core_root_specs,
    ensure_clingo_importable_or_raise,
    ensure_core_dependencies,
-   ensure_file_in_path_or_raise,
    ensure_gpg_in_path_or_raise,
    ensure_patchelf_in_path_or_raise,
)
@@ -20,7 +19,6 @@
    "is_bootstrapping",
    "ensure_bootstrap_configuration",
    "ensure_core_dependencies",
-   "ensure_file_in_path_or_raise",
    "ensure_gpg_in_path_or_raise",
    "ensure_clingo_importable_or_raise",
    "ensure_patchelf_in_path_or_raise",


@@ -4,7 +4,6 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Common basic functions used through the spack.bootstrap package"""
import fnmatch
-import importlib
import os.path
import re
import sys
@@ -29,7 +28,7 @@
def _python_import(module: str) -> bool:
    try:
-       importlib.import_module(module)
+       __import__(module)
    except ImportError:
        return False
    return True
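Either spelling imports by dotted module name and reports importability; an illustrative check of the helper's contract:

assert _python_import("os") is True
assert _python_import("definitely_not_a_module") is False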
@@ -214,18 +213,15 @@ def _root_spec(spec_str: str) -> str:
    Args:
        spec_str: spec to be bootstrapped. Must be without compiler and target.
    """
-   # Add a compiler and platform requirement to the root spec.
+   # Add a compiler requirement to the root spec.
    platform = str(spack.platforms.host())
    if platform == "darwin":
        spec_str += " %apple-clang"
-   elif platform == "windows":
-       spec_str += " %msvc"
    elif platform == "linux":
        spec_str += " %gcc"
    elif platform == "freebsd":
        spec_str += " %clang"
-   spec_str += f" platform={platform}"
    target = archspec.cpu.host().family
    spec_str += f" target={target}"


@@ -1,154 +0,0 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Bootstrap concrete specs for clingo
Spack uses clingo to concretize specs. When clingo itself needs to be bootstrapped from sources,
we need to rely on another mechanism to get a concrete spec that fits the current host.
This module contains the logic to get a concrete spec for clingo, starting from a prototype
JSON file for a similar platform.
"""
import pathlib
import sys
from typing import Dict, Optional, Tuple
import archspec.cpu
import spack.compiler
import spack.compilers
import spack.platforms
import spack.spec
import spack.traverse
from .config import spec_for_current_python
class ClingoBootstrapConcretizer:
def __init__(self, configuration):
self.host_platform = spack.platforms.host()
self.host_os = self.host_platform.operating_system("frontend")
self.host_target = archspec.cpu.host().family
self.host_architecture = spack.spec.ArchSpec.frontend_arch()
self.host_architecture.target = str(self.host_target)
self.host_compiler = self._valid_compiler_or_raise()
self.host_python = self.python_external_spec()
if str(self.host_platform) == "linux":
self.host_libc = self.libc_external_spec()
self.external_cmake, self.external_bison = self._externals_from_yaml(configuration)
def _valid_compiler_or_raise(self) -> "spack.compiler.Compiler":
if str(self.host_platform) == "linux":
compiler_name = "gcc"
elif str(self.host_platform) == "darwin":
compiler_name = "apple-clang"
elif str(self.host_platform) == "windows":
compiler_name = "msvc"
elif str(self.host_platform) == "freebsd":
compiler_name = "clang"
else:
raise RuntimeError(f"Cannot bootstrap clingo from sources on {self.host_platform}")
candidates = spack.compilers.compilers_for_spec(
compiler_name, arch_spec=self.host_architecture
)
if not candidates:
raise RuntimeError(
f"Cannot find any version of {compiler_name} to bootstrap clingo from sources"
)
candidates.sort(key=lambda x: x.spec.version, reverse=True)
return candidates[0]
def _externals_from_yaml(
self, configuration: "spack.config.Configuration"
) -> Tuple[Optional["spack.spec.Spec"], Optional["spack.spec.Spec"]]:
packages_yaml = configuration.get("packages")
requirements = {"cmake": "@3.20:", "bison": "@2.5:"}
selected: Dict[str, Optional["spack.spec.Spec"]] = {"cmake": None, "bison": None}
for pkg_name in ["cmake", "bison"]:
if pkg_name not in packages_yaml:
continue
candidates = packages_yaml[pkg_name].get("externals", [])
for candidate in candidates:
s = spack.spec.Spec(candidate["spec"], external_path=candidate["prefix"])
if not s.satisfies(requirements[pkg_name]):
continue
if not s.intersects(f"%{self.host_compiler.spec}"):
continue
if not s.intersects(f"arch={self.host_architecture}"):
continue
selected[pkg_name] = self._external_spec(s)
break
return selected["cmake"], selected["bison"]
def prototype_path(self) -> pathlib.Path:
"""Path to a prototype concrete specfile for clingo"""
parent_dir = pathlib.Path(__file__).parent
result = parent_dir / "prototypes" / f"clingo-{self.host_platform}-{self.host_target}.json"
if str(self.host_platform) == "linux":
# Using aarch64 as a fallback, since it has gnuconfig (x86_64 doesn't have it)
if not result.exists():
result = parent_dir / "prototypes" / f"clingo-{self.host_platform}-aarch64.json"
elif str(self.host_platform) == "freebsd":
result = parent_dir / "prototypes" / f"clingo-{self.host_platform}-amd64.json"
elif not result.exists():
raise RuntimeError(f"Cannot bootstrap clingo from sources on {self.host_platform}")
return result
def concretize(self) -> "spack.spec.Spec":
# Read the prototype and mark it NOT concrete
s = spack.spec.Spec.from_specfile(str(self.prototype_path()))
s._mark_concrete(False)
# Tweak it to conform to the host architecture
for node in s.traverse():
node.architecture.os = str(self.host_os)
node.compiler = self.host_compiler.spec
node.architecture = self.host_architecture
if node.name == "gcc-runtime":
node.versions = self.host_compiler.spec.versions
for edge in spack.traverse.traverse_edges([s], cover="edges"):
if edge.spec.name == "python":
edge.spec = self.host_python
if edge.spec.name == "bison" and self.external_bison:
edge.spec = self.external_bison
if edge.spec.name == "cmake" and self.external_cmake:
edge.spec = self.external_cmake
if "libc" in edge.virtuals:
edge.spec = self.host_libc
s._finalize_concretization()
# Work around the fact that the installer calls Spec.dependents() and
# we modified edges inconsistently
return s.copy()
def python_external_spec(self) -> "spack.spec.Spec":
"""Python external spec corresponding to the current running interpreter"""
result = spack.spec.Spec(spec_for_current_python(), external_path=sys.exec_prefix)
return self._external_spec(result)
def libc_external_spec(self) -> "spack.spec.Spec":
result = self.host_compiler.default_libc
return self._external_spec(result)
def _external_spec(self, initial_spec) -> "spack.spec.Spec":
initial_spec.namespace = "builtin"
initial_spec.compiler = self.host_compiler.spec
initial_spec.architecture = self.host_architecture
for flag_type in spack.spec.FlagMap.valid_compiler_flags():
initial_spec.compiler_flags[flag_type] = []
return spack.spec.parse_with_version_concrete(initial_spec)
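For reference, the call site for this class is in core.py's try_import (visible in the hunk further down); a minimal sketch of that usage:

import spack.config

bootstrapper = ClingoBootstrapConcretizer(configuration=spack.config.CONFIG)
concrete_spec = bootstrapper.concretize()  # host-adjusted clingo spec, marked concrete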


@@ -14,7 +14,6 @@
import spack.compilers
import spack.config
import spack.environment
-import spack.modules
import spack.paths
import spack.platforms
import spack.repo
@@ -130,10 +129,10 @@ def _bootstrap_config_scopes() -> Sequence["spack.config.ConfigScope"]:
    configuration_paths = (spack.config.CONFIGURATION_DEFAULTS_PATH, ("bootstrap", _config_path()))
    for name, path in configuration_paths:
        platform = spack.platforms.host().name
-       platform_scope = spack.config.DirectoryConfigScope(
-           f"{name}/{platform}", os.path.join(path, platform)
+       platform_scope = spack.config.ConfigScope(
+           "/".join([name, platform]), os.path.join(path, platform)
        )
-       generic_scope = spack.config.DirectoryConfigScope(name, path)
+       generic_scope = spack.config.ConfigScope(name, path)
        config_scopes.extend([generic_scope, platform_scope])
        msg = "[BOOTSTRAP CONFIG SCOPE] name={0}, path={1}"
        tty.debug(msg.format(generic_scope.name, generic_scope.path))
@@ -144,7 +143,11 @@ def _bootstrap_config_scopes() -> Sequence["spack.config.ConfigScope"]:
def _add_compilers_if_missing() -> None:
    arch = spack.spec.ArchSpec.frontend_arch()
    if not spack.compilers.compilers_for_arch(arch):
-       spack.compilers.find_compilers()
+       new_compilers = spack.compilers.find_new_compilers(
+           mixed_toolchain=sys.platform == "darwin"
+       )
+       if new_compilers:
+           spack.compilers.add_compilers_to_config(new_compilers)

@contextlib.contextmanager
@@ -153,7 +156,7 @@ def _ensure_bootstrap_configuration() -> Generator:
    bootstrap_store_path = store_path()
    user_configuration = _read_and_sanitize_configuration()
    with spack.environment.no_active_environment():
-       with spack.platforms.use_platform(
+       with spack.platforms.prevent_cray_detection(), spack.platforms.use_platform(
            spack.platforms.real_host()
        ), spack.repo.use_repositories(spack.paths.packages_path):
            # Default configuration scopes excluding command line


@@ -37,19 +37,23 @@
import spack.binary_distribution
import spack.config
import spack.detection
+import spack.environment
+import spack.modules
+import spack.paths
import spack.platforms
+import spack.platforms.linux
+import spack.repo
import spack.spec
import spack.store
import spack.user_environment
+import spack.util.environment
import spack.util.executable
import spack.util.path
import spack.util.spack_yaml
import spack.util.url
import spack.version
-from spack.installer import PackageInstaller

from ._common import _executables_in_store, _python_import, _root_spec, _try_import_from_store
-from .clingo import ClingoBootstrapConcretizer
from .config import spack_python_interpreter, spec_for_current_python
#: Name of the file containing metadata about the bootstrapping source #: Name of the file containing metadata about the bootstrapping source
@@ -264,13 +268,15 @@ def try_import(self, module: str, abstract_spec_str: str) -> bool:
            # Try to build and install from sources
            with spack_python_interpreter():
+               # Add hint to use frontend operating system on Cray
+               concrete_spec = spack.spec.Spec(abstract_spec_str + " ^" + spec_for_current_python())
+
                if module == "clingo":
-                   bootstrapper = ClingoBootstrapConcretizer(configuration=spack.config.CONFIG)
-                   concrete_spec = bootstrapper.concretize()
-               else:
-                   concrete_spec = spack.spec.Spec(
-                       abstract_spec_str + " ^" + spec_for_current_python()
-                   )
+                   # TODO: remove when the old concretizer is deprecated  # pylint: disable=fixme
+                   concrete_spec._old_concretize(  # pylint: disable=protected-access
+                       deprecation_warning=False
+                   )
+               else:
                    concrete_spec.concretize()

            msg = "[BOOTSTRAP MODULE {0}] Try installing '{1}' from sources"
@@ -278,7 +284,7 @@ def try_import(self, module: str, abstract_spec_str: str) -> bool:
            # Install the spec that should make the module importable
            with spack.config.override(self.mirror_scope):
-               PackageInstaller([concrete_spec.package], fail_fast=True).install()
+               concrete_spec.package.do_install(fail_fast=True)

            if _try_import_from_store(module, query_spec=concrete_spec, query_info=info):
                self.last_search = info
@@ -297,11 +303,18 @@ def try_search_path(self, executables: Tuple[str], abstract_spec_str: str) -> bo
            # might reduce compilation time by a fair amount
            _add_externals_if_missing()

-           concrete_spec = spack.spec.Spec(abstract_spec_str).concretized()
+           concrete_spec = spack.spec.Spec(abstract_spec_str)
+           if concrete_spec.name == "patchelf":
+               concrete_spec._old_concretize(  # pylint: disable=protected-access
+                   deprecation_warning=False
+               )
+           else:
+               concrete_spec.concretize()
+
            msg = "[BOOTSTRAP] Try installing '{0}' from sources"
            tty.debug(msg.format(abstract_spec_str))
            with spack.config.override(self.mirror_scope):
-               PackageInstaller([concrete_spec.package], fail_fast=True).install()
+               concrete_spec.package.do_install()
            if _executables_in_store(executables, concrete_spec, query_info=info):
                self.last_search = info
                return True
@@ -467,8 +480,7 @@ def ensure_clingo_importable_or_raise() -> None:
def gnupg_root_spec() -> str:
    """Return the root spec used to bootstrap GnuPG"""
-   root_spec_name = "win-gpg" if IS_WINDOWS else "gnupg"
-   return _root_spec(f"{root_spec_name}@2.3:")
+   return _root_spec("gnupg@2.3:")

def ensure_gpg_in_path_or_raise() -> None:
@@ -478,19 +490,6 @@ def ensure_gpg_in_path_or_raise() -> None:
    )

-def file_root_spec() -> str:
-   """Return the root spec used to bootstrap file"""
-   root_spec_name = "win-file" if IS_WINDOWS else "file"
-   return _root_spec(root_spec_name)

-def ensure_file_in_path_or_raise() -> None:
-   """Ensure file is in the PATH or raise"""
-   return ensure_executables_in_path_or_raise(
-       executables=["file"], abstract_spec=file_root_spec()
-   )

def patchelf_root_spec() -> str:
    """Return the root spec used to bootstrap patchelf"""
    # 0.13.1 is the last version not to require C++17.
@@ -574,15 +573,14 @@ def ensure_core_dependencies() -> None:
    """Ensure the presence of all the core dependencies."""
    if sys.platform.lower() == "linux":
        ensure_patchelf_in_path_or_raise()
-   elif sys.platform == "win32":
-       ensure_file_in_path_or_raise()
-   ensure_gpg_in_path_or_raise()
+   if not IS_WINDOWS:
+       ensure_gpg_in_path_or_raise()
    ensure_clingo_importable_or_raise()

def all_core_root_specs() -> List[str]:
    """Return a list of all the core root specs that may be used to bootstrap Spack"""
-   return [clingo_root_spec(), gnupg_root_spec(), patchelf_root_spec(), file_root_spec()]
+   return [clingo_root_spec(), gnupg_root_spec(), patchelf_root_spec()]

def bootstrapping_sources(scope: Optional[str] = None):


@@ -14,9 +14,9 @@
from llnl.util import tty

import spack.environment
-import spack.spec
import spack.tengine
-import spack.util.path
+import spack.util.cpus
+import spack.util.executable

from ._common import _root_spec
from .config import root_path, spec_for_current_python, store_path

File diffs suppressed because one or more lines are too long (seven files)

@@ -88,7 +88,7 @@ def _core_requirements() -> List[RequiredResponseType]:
def _buildcache_requirements() -> List[RequiredResponseType]:
    _buildcache_exes = {
-       "file": _missing("file", "required to analyze files for buildcaches", system_only=False),
+       "file": _missing("file", "required to analyze files for buildcaches"),
        ("gpg2", "gpg"): _missing("gpg2", "required to sign/verify buildcaches", False),
    }
    if platform.system().lower() == "darwin":
@@ -124,7 +124,7 @@ def _development_requirements() -> List[RequiredResponseType]:
    # Ensure we trigger environment modifications if we have an environment
    if BootstrapEnvironment.spack_yaml().exists():
        with BootstrapEnvironment() as env:
-           env.load()
+           env.update_syspath_and_environ()
    return [
        _required_executable(

@@ -45,8 +45,6 @@
from itertools import chain
from typing import Dict, List, Set, Tuple

-import archspec.cpu

import llnl.util.tty as tty
from llnl.string import plural
from llnl.util.filesystem import join_path
@@ -55,7 +53,6 @@
from llnl.util.tty.color import cescape, colorize
from llnl.util.tty.log import MultiProcessFd

-import spack.build_systems._checks
import spack.build_systems.cmake
import spack.build_systems.meson
import spack.build_systems.python
@@ -64,20 +61,26 @@
import spack.config
import spack.deptypes as dt
import spack.error
-import spack.multimethod
+import spack.main
import spack.package_base
import spack.paths
import spack.platforms
+import spack.repo
import spack.schema.environment
import spack.spec
import spack.stage
import spack.store
import spack.subprocess_context
+import spack.user_environment
import spack.util.executable
+import spack.util.path
+import spack.util.pattern
from spack import traverse
from spack.context import Context
-from spack.error import InstallError, NoHeadersError, NoLibrariesError
+from spack.error import NoHeadersError, NoLibrariesError
from spack.install_test import spack_install_test_log
+from spack.installer import InstallError
+from spack.util.cpus import determine_number_of_jobs
from spack.util.environment import (
    SYSTEM_DIR_CASE_ENTRY,
    EnvironmentModifications,
@@ -89,7 +92,7 @@
)
from spack.util.executable import Executable
from spack.util.log_parse import make_log_context, parse_log_events
-from spack.util.module_cmd import load_module, path_from_modules
+from spack.util.module_cmd import load_module, module, path_from_modules

#
# This can be set by the user to globally disable parallel builds.
@@ -188,6 +191,14 @@ def __call__(self, *args, **kwargs):
        return super().__call__(*args, **kwargs)

+def _on_cray():
+   host_platform = spack.platforms.host()
+   host_os = host_platform.operating_system("default_os")
+   on_cray = str(host_platform) == "cray"
+   using_cnl = re.match(r"cnl\d+", str(host_os))
+   return on_cray, using_cnl

def clean_environment():
    # Stuff in here sanitizes the build environment to eliminate
    # anything the user has set that may interfere. We apply it immediately
@@ -231,6 +242,17 @@ def clean_environment():
        if varname.endswith("_ROOT") and varname != "SPACK_ROOT":
            env.unset(varname)

+   # On Cray "cluster" systems, unset CRAY_LD_LIBRARY_PATH to avoid
+   # interference with Spack dependencies.
+   # CNL requires these variables to be set (or at least some of them,
+   # depending on the CNL version).
+   on_cray, using_cnl = _on_cray()
+   if on_cray and not using_cnl:
+       env.unset("CRAY_LD_LIBRARY_PATH")
+       for varname in os.environ.keys():
+           if "PKGCONF" in varname:
+               env.unset(varname)

    # Unset the following variables because they can affect installation of
    # Autotools and CMake packages.
    build_system_vars = [
@@ -360,7 +382,11 @@ def set_compiler_environment_variables(pkg, env):
        _add_werror_handling(keep_werror, env)

    # Set the target parameters that the compiler will add
-   isa_arg = optimization_flags(compiler, spec.target)
+   # Don't set on cray platform because the targeting module handles this
+   if spec.satisfies("platform=cray"):
+       isa_arg = ""
+   else:
+       isa_arg = spec.architecture.target.optimization_flags(compiler)
    env.set("SPACK_TARGET_ARGS", isa_arg)

    # Trap spack-tracked compiler flags as appropriate.
@@ -405,36 +431,6 @@ def set_compiler_environment_variables(pkg, env):
    return env
def optimization_flags(compiler, target):
if spack.compilers.is_mixed_toolchain(compiler):
msg = (
"microarchitecture specific optimizations are not "
"supported yet on mixed compiler toolchains [check"
f" {compiler.name}@{compiler.version} for further details]"
)
tty.debug(msg)
return ""
# Try to check if the current compiler comes with a version number or
# has an unexpected suffix. If so, treat it as a compiler with a
# custom spec.
compiler_version = compiler.version
version_number, suffix = archspec.cpu.version_components(compiler.version)
if not version_number or suffix:
try:
compiler_version = compiler.real_version
except spack.util.executable.ProcessError as e:
# log this and just return compiler.version instead
tty.debug(str(e))
try:
result = target.optimization_flags(compiler.name, compiler_version.dotted_numeric_string)
except (ValueError, archspec.cpu.UnsupportedMicroarchitecture):
result = ""
return result
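A sketch of how the removed helper was consumed (matching the isa_arg hunk above); the example flag value is illustrative:

# e.g. inside set_compiler_environment_variables:
#   isa_arg = optimization_flags(compiler, spec.target)
# which may yield something like "-march=znver3" for GCC on a Zen 3 host,
# or "" for mixed toolchains and unsupported microarchitectures.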
def set_wrapper_variables(pkg, env):
    """Set environment variables used by the Spack compiler wrapper (which have the prefix
    `SPACK_`) and also add the compiler wrappers to PATH.
@@ -482,7 +478,7 @@ def set_wrapper_variables(pkg, env):
    env.set(SPACK_DEBUG, "TRUE")
    env.set(SPACK_SHORT_SPEC, pkg.spec.short_spec)
    env.set(SPACK_DEBUG_LOG_ID, pkg.spec.format("{name}-{hash:7}"))
-   env.set(SPACK_DEBUG_LOG_DIR, spack.paths.spack_working_dir)
+   env.set(SPACK_DEBUG_LOG_DIR, spack.main.spack_working_dir)

    if spack.config.get("config:ccache"):
        # Enable ccache in the compiler wrapper
@@ -589,7 +585,7 @@ def set_package_py_globals(pkg, context: Context = Context.BUILD):
    module.std_meson_args = spack.build_systems.meson.MesonBuilder.std_args(pkg)
    module.std_pip_args = spack.build_systems.python.PythonPipBuilder.std_args(pkg)

-   jobs = spack.config.determine_number_of_jobs(parallel=pkg.parallel)
+   jobs = determine_number_of_jobs(parallel=pkg.parallel)
    module.make_jobs = jobs

    # TODO: make these build deps that can be installed if not found.
@@ -768,9 +764,7 @@ def get_rpaths(pkg):
    # Second module is our compiler mod name. We use that to get rpaths from
    # module show output.
    if pkg.compiler.modules and len(pkg.compiler.modules) > 1:
-       mod_rpath = path_from_modules([pkg.compiler.modules[1]])
-       if mod_rpath:
-           rpaths.append(mod_rpath)
+       rpaths.append(path_from_modules([pkg.compiler.modules[1]]))
    return list(dedupe(filter_system_paths(rpaths)))
@@ -815,6 +809,7 @@ def setup_package(pkg, dirty, context: Context = Context.BUILD):
    # Platform specific setup goes before package specific setup. This is for setting
    # defaults like MACOSX_DEPLOYMENT_TARGET on macOS.
    platform = spack.platforms.by_name(pkg.spec.architecture.platform)
+   target = platform.target(pkg.spec.architecture.target)
    platform.setup_platform_environment(pkg, env_mods)

    tty.debug("setup_package: grabbing modifications from dependencies")
@@ -839,6 +834,17 @@ def setup_package(pkg, dirty, context: Context = Context.BUILD):
        for mod in pkg.compiler.modules:
            load_module(mod)

+   # kludge to handle cray mpich and libsci being automatically loaded by
+   # PrgEnv modules on cray platform. Module unload does no damage when
+   # unnecessary
+   on_cray, _ = _on_cray()
+   if on_cray and not dirty:
+       for mod in ["cray-mpich", "cray-libsci"]:
+           module("unload", mod)
+
+   if target and target.module_name:
+       load_module(target.module_name)

    load_external_modules(pkg)

    implicit_rpaths = pkg.compiler.implicit_rpaths()
@@ -1162,7 +1168,7 @@ def _setup_pkg_and_run(
            return_value = function(pkg, kwargs)
            write_pipe.send(return_value)
-   except spack.error.StopPhase as e:
+   except StopPhase as e:
        # Do not create a full ChildError from this, it's not an error
        # it's a control statement.
        write_pipe.send(e)
@@ -1323,7 +1329,7 @@ def exitcode_msg(p):
        p.join()

    # If returns a StopPhase, raise it
-   if isinstance(child_result, spack.error.StopPhase):
+   if isinstance(child_result, StopPhase):
        # do not print
        raise child_result
@@ -1499,7 +1505,7 @@ def long_message(self):
            out.write(" {0}\n".format(self.log_name))

        # Also output the test log path IF it exists
-       if self.context != "test" and have_log:
+       if self.context != "test":
            test_log = join_path(os.path.dirname(self.log_name), spack_install_test_log)
            if os.path.isfile(test_log):
                out.write("\nSee test log for details:\n")
@@ -1532,6 +1538,17 @@ def _make_child_error(msg, module, name, traceback, log, log_type, context):
    return ChildError(msg, module, name, traceback, log, log_type, context)

+class StopPhase(spack.error.SpackError):
+   """Pickle-able exception to control stopped builds."""
+
+   def __reduce__(self):
+       return _make_stop_phase, (self.message, self.long_message)
+
+def _make_stop_phase(msg, long_msg):
+   return StopPhase(msg, long_msg)

def write_log_summary(out, log_type, log, last=None):
    errors, warnings = parse_log_events(log)
    nerr = len(errors)
@@ -1565,21 +1582,21 @@ class ModuleChangePropagator:
    _PROTECTED_NAMES = ("package", "current_module", "modules_in_mro", "_set_attributes")

-   def __init__(self, package: spack.package_base.PackageBase) -> None:
+   def __init__(self, package):
        self._set_self_attributes("package", package)
        self._set_self_attributes("current_module", package.module)

        #: Modules for the classes in the MRO up to PackageBase
        modules_in_mro = []
-       for cls in package.__class__.__mro__:
-           module = getattr(cls, "module", None)
-           if module is None or module is spack.package_base:
-               break
-           if module is self.current_module:
+       for cls in inspect.getmro(type(package)):
+           module = cls.module
+           if module == self.current_module:
                continue
+           if module == spack.package_base:
+               break
            modules_in_mro.append(module)
        self._set_self_attributes("modules_in_mro", modules_in_mro)
        self._set_self_attributes("_set_attributes", {})


@@ -8,7 +8,7 @@
import llnl.util.lang

import spack.builder
-import spack.error
+import spack.installer
import spack.relocate
import spack.spec
import spack.store
@@ -34,7 +34,7 @@ def check_paths(path_list, filetype, predicate):
            if not predicate(abs_path):
                msg = "Install failed for {0}. No such {1} in prefix: {2}"
                msg = msg.format(pkg.name, filetype, path)
-               raise spack.error.InstallError(msg)
+               raise spack.installer.InstallError(msg)

    check_paths(pkg.sanity_check_is_file, "file", os.path.isfile)
    check_paths(pkg.sanity_check_is_dir, "directory", os.path.isdir)
@@ -42,7 +42,7 @@ def check_paths(path_list, filetype, predicate):
    ignore_file = llnl.util.lang.match_predicate(spack.store.STORE.layout.hidden_file_regexes)
    if all(map(ignore_file, os.listdir(pkg.prefix))):
        msg = "Install failed for {0}. Nothing was installed!"
-       raise spack.error.InstallError(msg.format(pkg.name))
+       raise spack.installer.InstallError(msg.format(pkg.name))

def apply_macos_rpath_fixups(builder: spack.builder.Builder):


@@ -2,8 +2,6 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
-import os

import llnl.util.filesystem as fs

import spack.directives
@@ -48,12 +46,18 @@ class AspellDictPackage(AutotoolsPackage):
    #: Override the default autotools builder
    AutotoolsBuilder = AspellBuilder

-   def patch(self):
+   def view_destination(self, view):
        aspell_spec = self.spec["aspell"]
+       if view.get_projection_for_spec(aspell_spec) != aspell_spec.prefix:
+           raise spack.package_base.ExtensionError(
+               "aspell does not support non-global extensions"
+           )
        aspell = aspell_spec.command
-       dictdir = aspell("dump", "config", "dict-dir", output=str).strip()
-       datadir = aspell("dump", "config", "data-dir", output=str).strip()
-       dictdir = os.path.relpath(dictdir, aspell_spec.prefix)
-       datadir = os.path.relpath(datadir, aspell_spec.prefix)
-       fs.filter_file(r"^dictdir=.*$", f"dictdir=/{dictdir}", "configure")
-       fs.filter_file(r"^datadir=.*$", f"datadir=/{datadir}", "configure")
+       return aspell("dump", "config", "dict-dir", output=str).strip()
+
+   def view_source(self):
+       return self.prefix.lib
+
+   def patch(self):
+       fs.filter_file(r"^dictdir=.*$", "dictdir=/lib", "configure")
+       fs.filter_file(r"^datadir=.*$", "datadir=/lib", "configure")


@@ -2,6 +2,7 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
+import inspect
import os
import os.path
import stat
@@ -13,7 +14,6 @@
import spack.build_environment
import spack.builder
-import spack.error
import spack.package_base
from spack.directives import build_system, conflicts, depends_on
from spack.multimethod import when
@@ -249,7 +249,7 @@ def runs_ok(script_abs_path):
        # An external gnuconfig may not not have a prefix.
        if gnuconfig_dir is None:
-           raise spack.error.InstallError(
+           raise spack.build_environment.InstallError(
                "Spack could not find substitutes for GNU config files because no "
                "prefix is available for the `gnuconfig` package. Make sure you set a "
                "prefix path instead of modules for external `gnuconfig`."
@@ -269,7 +269,7 @@ def runs_ok(script_abs_path):
            msg += (
                " or the `gnuconfig` package prefix is misconfigured as" " an external package"
            )
-           raise spack.error.InstallError(msg)
+           raise spack.build_environment.InstallError(msg)

        # Filter working substitutes
        candidates = [f for f in candidates if runs_ok(f)]
@@ -294,7 +294,9 @@ def runs_ok(script_abs_path):
        and set the prefix to the directory containing the `config.guess` and
        `config.sub` files.
        """
-       raise spack.error.InstallError(msg.format(", ".join(to_be_found), self.name))
+       raise spack.build_environment.InstallError(
+           msg.format(", ".join(to_be_found), self.name)
+       )

        # Copy the good files over the bad ones
        for abs_path in to_be_patched:
@@ -547,12 +549,13 @@ def autoreconf(self, pkg, spec, prefix):
        tty.warn("* a custom AUTORECONF phase in the package *")
        tty.warn("*********************************************************")
        with fs.working_dir(self.configure_directory):
+           m = inspect.getmodule(self.pkg)
            # This line is what is needed most of the time
            # --install, --verbose, --force
            autoreconf_args = ["-ivf"]
            autoreconf_args += self.autoreconf_search_path_args
            autoreconf_args += self.autoreconf_extra_args
-           self.pkg.module.autoreconf(*autoreconf_args)
+           m.autoreconf(*autoreconf_args)

    @property
    def autoreconf_search_path_args(self):
@@ -576,9 +579,7 @@ def set_configure_or_die(self):
            raise RuntimeError(msg.format(self.configure_directory))

        # Monkey-patch the configure script in the corresponding module
-       globals_for_pkg = spack.build_environment.ModuleChangePropagator(self.pkg)
-       globals_for_pkg.configure = Executable(self.configure_abs_path)
-       globals_for_pkg.propagate_changes_to_mro()
+       inspect.getmodule(self.pkg).configure = Executable(self.configure_abs_path)

    def configure_args(self):
        """Return the list of all the arguments that must be passed to configure,
@@ -595,7 +596,7 @@ def configure(self, pkg, spec, prefix):
        options += self.configure_args()

        with fs.working_dir(self.build_directory, create=True):
-           pkg.module.configure(*options)
+           inspect.getmodule(self.pkg).configure(*options)

    def build(self, pkg, spec, prefix):
        """Run "make" on the build targets specified by the builder."""
@@ -603,12 +604,12 @@ def build(self, pkg, spec, prefix):
        params = ["V=1"]
        params += self.build_targets
        with fs.working_dir(self.build_directory):
-           pkg.module.make(*params)
+           inspect.getmodule(self.pkg).make(*params)

    def install(self, pkg, spec, prefix):
        """Run "make" on the install targets specified by the builder."""
        with fs.working_dir(self.build_directory):
-           pkg.module.make(*self.install_targets)
+           inspect.getmodule(self.pkg).make(*self.install_targets)

spack.builder.run_after("build")(execute_build_time_tests)
@@ -687,8 +688,9 @@ def _activate_or_not(
        variant = variant or name

-       # Defensively look that the name passed as argument is among variants
-       if not self.pkg.has_variant(variant):
+       # Defensively look that the name passed as argument is among
+       # variants
+       if variant not in self.pkg.variants:
            msg = '"{0}" is not a variant of "{1}"'
            raise KeyError(msg.format(variant, self.pkg.name))
@@ -697,19 +699,27 @@ def _activate_or_not(
        # Create a list of pairs. Each pair includes a configuration
        # option and whether or not that option is activated
-       vdef = self.pkg.get_variant(variant)
-       if set(vdef.values) == set((True, False)):
+       variant_desc, _ = self.pkg.variants[variant]
+       if set(variant_desc.values) == set((True, False)):
            # BoolValuedVariant carry information about a single option.
            # Nonetheless, for uniformity of treatment we'll package them
            # in an iterable of one element.
-           options = [(name, f"+{variant}" in spec)]
+           condition = "+{name}".format(name=variant)
+           options = [(name, condition in spec)]
        else:
+           condition = "{variant}={value}"
            # "feature_values" is used to track values which correspond to
            # features which can be enabled or disabled as understood by the
            # package's build system. It excludes values which have special
            # meanings and do not correspond to features (e.g. "none")
-           feature_values = getattr(vdef.values, "feature_values", None) or vdef.values
-           options = [(value, f"{variant}={value}" in spec) for value in feature_values]
+           feature_values = (
+               getattr(variant_desc.values, "feature_values", None) or variant_desc.values
+           )
+           options = [
+               (value, condition.format(variant=variant, value=value) in spec)
+               for value in feature_values
+           ]

        # For each allowed value in the list of values
        for option_value, activated in options:
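_activate_or_not backs AutotoolsBuilder helpers such as with_or_without; a standalone sketch of the boolean-variant mapping (illustrative, not Spack's API):

def with_or_without_sketch(variant: str, active: bool) -> str:
    # mirrors the activation logic above for a boolean variant
    return f"--with-{variant}" if active else f"--without-{variant}"

assert with_or_without_sketch("shared", True) == "--with-shared"
assert with_or_without_sketch("shared", False) == "--without-shared"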


@@ -89,7 +89,7 @@ def define_cmake_cache_from_variant(self, cmake_var, variant=None, comment=""):
        if variant is None:
            variant = cmake_var.lower()

-       if not self.pkg.has_variant(variant):
+       if variant not in self.pkg.variants:
            raise KeyError('"{0}" is not a variant of "{1}"'.format(variant, self.pkg.name))

        if variant not in self.pkg.spec.variants:
@@ -162,9 +162,7 @@ def initconfig_compiler_entries(self):
            ld_flags = " ".join(flags["ldflags"])
            ld_format_string = "CMAKE_{0}_LINKER_FLAGS"
            # CMake has separate linker arguments for types of builds.
-           # 'ldflags' should not be used with CMAKE_STATIC_LINKER_FLAGS which
-           # is used by the archiver, so don't include "STATIC" in this loop:
-           for ld_type in ["EXE", "MODULE", "SHARED"]:
+           for ld_type in ["EXE", "MODULE", "SHARED", "STATIC"]:
                ld_string = ld_format_string.format(ld_type)
                entries.append(cmake_cache_string(ld_string, ld_flags))
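For context, a hedged sketch of what the cmake_cache_string helper used in this loop emits (signature assumed from the usage above):

def cmake_cache_string_sketch(name: str, value: str, comment: str = "") -> str:
    # illustrative mirror of the initial-cache formatting
    return f'set({name} "{value}" CACHE STRING "{comment}")\n'

# e.g. the linker-flags loop above contributes lines such as:
#   set(CMAKE_EXE_LINKER_FLAGS "-L... -Wl,-rpath,..." CACHE STRING "")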


@@ -3,6 +3,8 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
+import inspect
+
import llnl.util.filesystem as fs

import spack.builder
@@ -70,7 +72,9 @@ def check_args(self):
    def build(self, pkg, spec, prefix):
        """Runs ``cargo install`` in the source directory"""
        with fs.working_dir(self.build_directory):
-           pkg.module.cargo("install", "--root", "out", "--path", ".", *self.build_args)
+           inspect.getmodule(pkg).cargo(
+               "install", "--root", "out", "--path", ".", *self.build_args
+           )

    def install(self, pkg, spec, prefix):
        """Copy build files into package prefix."""
@@ -82,4 +86,4 @@ def install(self, pkg, spec, prefix):
    def check(self):
        """Run "cargo test"."""
        with fs.working_dir(self.build_directory):
-           self.pkg.module.cargo("test", *self.check_args)
+           inspect.getmodule(self.pkg).cargo("test", *self.check_args)


@@ -3,6 +3,7 @@
 #
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)
 import collections.abc
+import inspect
 import os
 import pathlib
 import platform

@@ -15,7 +16,6 @@
 import spack.build_environment
 import spack.builder
 import spack.deptypes as dt
-import spack.error
 import spack.package_base
 from spack.directives import build_system, conflicts, depends_on, variant
 from spack.multimethod import when

@@ -108,11 +108,6 @@ def _conditional_cmake_defaults(pkg: spack.package_base.PackageBase, args: List[
     if _supports_compilation_databases(pkg):
         args.append(CMakeBuilder.define("CMAKE_EXPORT_COMPILE_COMMANDS", True))

-    # Enable MACOSX_RPATH by default when cmake_minimum_required < 3
-    # https://cmake.org/cmake/help/latest/policy/CMP0042.html
-    if pkg.spec.satisfies("platform=darwin") and cmake.satisfies("@3:"):
-        args.append(CMakeBuilder.define("CMAKE_POLICY_DEFAULT_CMP0042", "NEW"))


 def generator(*names: str, default: Optional[str] = None):
     """The build system generator to use.

@@ -146,7 +141,6 @@ def _values(x):
         default=default,
         values=_values,
         description="the build system generator to use",
-        when="build_system=cmake",
     )
     for x in not_used:
         conflicts(f"generator={x}")

@@ -346,7 +340,7 @@ def std_args(pkg, generator=None):
            msg = "Invalid CMake generator: '{0}'\n".format(generator)
            msg += "CMakePackage currently supports the following "
            msg += "primary generators: '{0}'".format("', '".join(valid_primary_generators))
-           raise spack.error.InstallError(msg)
+           raise spack.package_base.InstallError(msg)

         try:
             build_type = pkg.spec.variants["build_type"].value

@@ -506,7 +500,7 @@ def define_from_variant(self, cmake_var, variant=None):
         if variant is None:
             variant = cmake_var.lower()

-        if not self.pkg.has_variant(variant):
+        if variant not in self.pkg.variants:
             raise KeyError('"{0}" is not a variant of "{1}"'.format(variant, self.pkg.name))

         if variant not in self.pkg.spec.variants:
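Editor's note: for readers unfamiliar with define_from_variant, it maps a package variant onto a CMake cache definition, which is why the guard above checks that the variant exists at all. A hedged usage sketch inside a hypothetical package's cmake_args (the variant names are assumptions):

    def cmake_args(self):
        # "+shared" renders as -DBUILD_SHARED_LIBS:BOOL=ON, "~shared" as OFF;
        # a multi-valued variant such as cxxstd=17 renders as
        # -DCMAKE_CXX_STANDARD:STRING=17.
        return [
            self.define_from_variant("BUILD_SHARED_LIBS", "shared"),
            self.define_from_variant("CMAKE_CXX_STANDARD", "cxxstd"),
        ]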
@@ -545,24 +539,24 @@ def cmake(self, pkg, spec, prefix):
         options += self.cmake_args()
         options.append(os.path.abspath(self.root_cmakelists_dir))
         with fs.working_dir(self.build_directory, create=True):
-            pkg.module.cmake(*options)
+            inspect.getmodule(self.pkg).cmake(*options)

     def build(self, pkg, spec, prefix):
         """Make the build targets"""
         with fs.working_dir(self.build_directory):
             if self.generator == "Unix Makefiles":
-                pkg.module.make(*self.build_targets)
+                inspect.getmodule(self.pkg).make(*self.build_targets)
             elif self.generator == "Ninja":
                 self.build_targets.append("-v")
-                pkg.module.ninja(*self.build_targets)
+                inspect.getmodule(self.pkg).ninja(*self.build_targets)

     def install(self, pkg, spec, prefix):
         """Make the install targets"""
         with fs.working_dir(self.build_directory):
             if self.generator == "Unix Makefiles":
-                pkg.module.make(*self.install_targets)
+                inspect.getmodule(self.pkg).make(*self.install_targets)
             elif self.generator == "Ninja":
-                pkg.module.ninja(*self.install_targets)
+                inspect.getmodule(self.pkg).ninja(*self.install_targets)


 spack.builder.run_after("build")(execute_build_time_tests)
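Editor's note: the trailing run_after("build")(execute_build_time_tests) line registers a phase callback without decorator syntax; the decorator form is the more common spelling. A minimal sketch, assuming an illustrative builder class:

    import spack.builder

    class ExampleBuilder:
        @spack.builder.run_after("install")
        def verify_install(self):
            # Invoked automatically once the install phase completes;
            # equivalent to spack.builder.run_after("install")(verify_install).
            pass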

View File

@@ -14,7 +14,6 @@
 import spack.compiler
 import spack.package_base
-import spack.util.executable

 # Local "type" for type hints
 Path = Union[str, pathlib.Path]

View File

@@ -3,9 +3,6 @@
 #
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)
-import re
-from typing import Iterable, List
-
 import spack.variant
 from spack.directives import conflicts, depends_on, variant
 from spack.multimethod import when

@@ -47,7 +44,6 @@ class CudaPackage(PackageBase):
         "87",
         "89",
         "90",
-        "90a",
     )

     # FIXME: keep cuda and cuda_arch separate to make usage easier until

@@ -74,27 +70,6 @@ def cuda_flags(arch_list):
             for s in arch_list
         ]

-    @staticmethod
-    def compute_capabilities(arch_list: Iterable[str]) -> List[str]:
-        """Adds a decimal place to each CUDA arch.
-
-        >>> compute_capabilities(['90', '90a'])
-        ['9.0', '9.0a']
-
-        Args:
-            arch_list: A list of integer strings, optionally followed by a suffix.
-
-        Returns:
-            A list of float strings, optionally followed by a suffix
-        """
-        pattern = re.compile(r"(\d+)")
-        capabilities = []
-        for arch in arch_list:
-            _, number, letter = re.split(pattern, arch)
-            number = "{0:.1f}".format(float(number) / 10.0)
-            capabilities.append(number + letter)
-        return capabilities
-
     depends_on("cuda", when="+cuda")

     # CUDA version vs Architecture

@@ -135,8 +110,9 @@ def compute_capabilities(arch_list: Iterable[str]) -> List[str]:
     # From the NVIDIA install guide we know of conflicts for particular
     # platforms (linux, darwin), architectures (x86, powerpc) and compilers
     # (gcc, clang). We don't restrict %gcc and %clang conflicts to
-    # platform=linux, since they may apply to platform=darwin. We currently
-    # do not provide conflicts for platform=darwin with %apple-clang.
+    # platform=linux, since they should also apply to platform=cray, and may
+    # apply to platform=darwin. We currently do not provide conflicts for
+    # platform=darwin with %apple-clang.

     # Linux x86_64 compiler conflicts from here:
     # https://gist.github.com/ax3l/9489132

@@ -149,8 +125,6 @@ def compute_capabilities(arch_list: Iterable[str]) -> List[str]:
     # minimum supported versions
     conflicts("%gcc@:4", when="+cuda ^cuda@11.0:")
     conflicts("%gcc@:5", when="+cuda ^cuda@11.4:")
-    conflicts("%gcc@:7.2", when="+cuda ^cuda@12.4:")
-    conflicts("%clang@:6", when="+cuda ^cuda@12.2:")

     # maximum supported version
     # NOTE:

@@ -163,15 +137,11 @@ def compute_capabilities(arch_list: Iterable[str]) -> List[str]:
     conflicts("%gcc@11.2:", when="+cuda ^cuda@:11.5")
     conflicts("%gcc@12:", when="+cuda ^cuda@:11.8")
     conflicts("%gcc@13:", when="+cuda ^cuda@:12.3")
-    conflicts("%gcc@14:", when="+cuda ^cuda@:12.6")
     conflicts("%clang@12:", when="+cuda ^cuda@:11.4.0")
     conflicts("%clang@13:", when="+cuda ^cuda@:11.5")
     conflicts("%clang@14:", when="+cuda ^cuda@:11.7")
     conflicts("%clang@15:", when="+cuda ^cuda@:12.0")
-    conflicts("%clang@16:", when="+cuda ^cuda@:12.1")
-    conflicts("%clang@17:", when="+cuda ^cuda@:12.3")
-    conflicts("%clang@18:", when="+cuda ^cuda@:12.5")
-    conflicts("%clang@19:", when="+cuda ^cuda@:12.6")
+    conflicts("%clang@16:", when="+cuda ^cuda@:12.3")

     # https://gist.github.com/ax3l/9489132#gistcomment-3860114
     conflicts("%gcc@10", when="+cuda ^cuda@:11.4.0")

@@ -239,16 +209,12 @@ def compute_capabilities(arch_list: Iterable[str]) -> List[str]:
     conflicts("%intel@19.0:", when="+cuda ^cuda@:10.0")
     conflicts("%intel@19.1:", when="+cuda ^cuda@:10.1")
     conflicts("%intel@19.2:", when="+cuda ^cuda@:11.1.0")
-    conflicts("%intel@2021:", when="+cuda ^cuda@:11.4.0")

     # XL is mostly relevant for ppc64le Linux
     conflicts("%xl@:12,14:", when="+cuda ^cuda@:9.1")
     conflicts("%xl@:12,14:15,17:", when="+cuda ^cuda@9.2")
     conflicts("%xl@:12,17:", when="+cuda ^cuda@:11.1.0")

-    # PowerPC.
-    conflicts("target=ppc64le", when="+cuda ^cuda@12.5:")
-
     # Darwin.
     # TODO: add missing conflicts for %apple-clang cuda@:10
-    conflicts("platform=darwin", when="+cuda ^cuda@11.0.2:")
+    conflicts("platform=darwin", when="+cuda ^cuda@11.0.2: ")

View File

@@ -3,6 +3,8 @@
 #
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)
+import inspect
+
 import llnl.util.filesystem as fs

 import spack.builder

@@ -70,7 +72,7 @@ def build_directory(self):
     def build_args(self):
         """Arguments for ``go build``."""
         # Pass ldflags -s = --strip-all and -w = --no-warnings by default
-        return ["-modcacherw", "-ldflags", "-s -w", "-o", f"{self.pkg.name}"]
+        return ["-ldflags", "-s -w", "-o", f"{self.pkg.name}"]

     @property
     def check_args(self):

@@ -80,7 +82,7 @@ def check_args(self):
     def build(self, pkg, spec, prefix):
         """Runs ``go build`` in the source directory"""
         with fs.working_dir(self.build_directory):
-            pkg.module.go("build", *self.build_args)
+            inspect.getmodule(pkg).go("build", *self.build_args)

     def install(self, pkg, spec, prefix):
         """Install built binaries into prefix bin."""

@@ -93,4 +95,4 @@ def install(self, pkg, spec, prefix):
     def check(self):
         """Run ``go test .`` in the source directory"""
         with fs.working_dir(self.build_directory):
-            self.pkg.module.go("test", *self.check_args)
+            inspect.getmodule(self.pkg).go("test", *self.check_args)
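Editor's note: the -modcacherw flag present only on the newer side of this hunk makes Go write its module cache with write permission so the stage can be cleaned; -s and -w are Go linker flags that omit the symbol table and DWARF data respectively. Customizing these defaults ties into the builder lookup shown later in the builder.py diff: a class with the default builder's name defined in a package module is picked up instead. A hedged sketch:

    from spack.build_systems.go import GoBuilder as DefaultGoBuilder

    class GoBuilder(DefaultGoBuilder):
        """Illustrative: defined inside a package.py and found by name."""

        @property
        def build_args(self):
            # Keep the stock flags and append one; "-trimpath" is an example.
            return super().build_args + ["-trimpath"]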

View File

@@ -22,10 +22,9 @@
     install,
 )

-import spack.builder
 import spack.error
 from spack.build_environment import dso_suffix
-from spack.error import InstallError
+from spack.package_base import InstallError
 from spack.util.environment import EnvironmentModifications
 from spack.util.executable import Executable
 from spack.util.prefix import Prefix

@@ -847,7 +846,6 @@ def scalapack_libs(self):
             "^mpich@2:" in spec_root
             or "^cray-mpich" in spec_root
             or "^mvapich2" in spec_root
-            or "^mvapich" in spec_root
             or "^intel-mpi" in spec_root
             or "^intel-oneapi-mpi" in spec_root
             or "^intel-parallel-studio" in spec_root

@@ -938,15 +936,32 @@ def mpi_setup_dependent_build_environment(self, env, dependent_spec, compilers_o
             "I_MPI_ROOT": self.normalize_path("mpi"),
         }

-        compiler_wrapper_commands = self.mpi_compiler_wrappers
-        wrapper_vars.update(
-            {
-                "MPICC": compiler_wrapper_commands["MPICC"],
-                "MPICXX": compiler_wrapper_commands["MPICXX"],
-                "MPIF77": compiler_wrapper_commands["MPIF77"],
-                "MPIF90": compiler_wrapper_commands["MPIF90"],
-            }
-        )
+        # CAUTION - SIMILAR code in:
+        #   var/spack/repos/builtin/packages/mpich/package.py
+        #   var/spack/repos/builtin/packages/openmpi/package.py
+        #   var/spack/repos/builtin/packages/mvapich2/package.py
+        #
+        # On Cray, the regular compiler wrappers *are* the MPI wrappers.
+        if "platform=cray" in self.spec:
+            # TODO: Confirm
+            wrapper_vars.update(
+                {
+                    "MPICC": compilers_of_client["CC"],
+                    "MPICXX": compilers_of_client["CXX"],
+                    "MPIF77": compilers_of_client["F77"],
+                    "MPIF90": compilers_of_client["F90"],
+                }
+            )
+        else:
+            compiler_wrapper_commands = self.mpi_compiler_wrappers
+            wrapper_vars.update(
+                {
+                    "MPICC": compiler_wrapper_commands["MPICC"],
+                    "MPICXX": compiler_wrapper_commands["MPICXX"],
+                    "MPIF77": compiler_wrapper_commands["MPIF77"],
+                    "MPIF90": compiler_wrapper_commands["MPIF90"],
+                }
+            )

         # Ensure that the directory containing the compiler wrappers is in the
         # PATH. Spack packages add `prefix.bin` to their dependents' paths,

View File

@@ -2,6 +2,7 @@
 # Spack Project Developers. See the top-level COPYRIGHT file for details.
 #
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)
+import inspect
 from typing import List

 import llnl.util.filesystem as fs

@@ -102,12 +103,12 @@ def edit(self, pkg, spec, prefix):
     def build(self, pkg, spec, prefix):
         """Run "make" on the build targets specified by the builder."""
         with fs.working_dir(self.build_directory):
-            pkg.module.make(*self.build_targets)
+            inspect.getmodule(self.pkg).make(*self.build_targets)

     def install(self, pkg, spec, prefix):
         """Run "make" on the install targets specified by the builder."""
         with fs.working_dir(self.build_directory):
-            pkg.module.make(*self.install_targets)
+            inspect.getmodule(self.pkg).make(*self.install_targets)


 spack.builder.run_after("build")(execute_build_time_tests)
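Editor's note: these two targets are the whole contract of the Makefile builder; packages typically only override edit() to point the stock Makefile at Spack's prefix before make runs. A hedged sketch (the package name and the PREFIX pattern are assumptions):

    from llnl.util.filesystem import FileFilter
    from spack.package import MakefilePackage

    class HypotheticalTool(MakefilePackage):
        def edit(self, spec, prefix):
            # Rewrites PREFIX before `make <build_targets>` and
            # `make <install_targets>` run in build_directory.
            FileFilter("Makefile").filter(r"^PREFIX\s*=.*", f"PREFIX = {prefix}")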

View File

@@ -2,6 +2,7 @@
 # Spack Project Developers. See the top-level COPYRIGHT file for details.
 #
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)
+import inspect
 import os
 from typing import List

@@ -194,19 +195,19 @@ def meson(self, pkg, spec, prefix):
         options += self.std_meson_args
         options += self.meson_args()
         with fs.working_dir(self.build_directory, create=True):
-            pkg.module.meson(*options)
+            inspect.getmodule(self.pkg).meson(*options)

     def build(self, pkg, spec, prefix):
         """Make the build targets"""
         options = ["-v"]
         options += self.build_targets
         with fs.working_dir(self.build_directory):
-            pkg.module.ninja(*options)
+            inspect.getmodule(self.pkg).ninja(*options)

     def install(self, pkg, spec, prefix):
         """Make the install targets"""
         with fs.working_dir(self.build_directory):
-            pkg.module.ninja(*self.install_targets)
+            inspect.getmodule(self.pkg).ninja(*self.install_targets)


 spack.builder.run_after("build")(execute_build_time_tests)

View File

@@ -2,6 +2,7 @@
 # Spack Project Developers. See the top-level COPYRIGHT file for details.
 #
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)
+import inspect
 from typing import List  # novm

 import llnl.util.filesystem as fs

@@ -23,6 +24,7 @@ class MSBuildPackage(spack.package_base.PackageBase):
     build_system("msbuild")
     conflicts("platform=linux", when="build_system=msbuild")
     conflicts("platform=darwin", when="build_system=msbuild")
+    conflicts("platform=cray", when="build_system=msbuild")

@@ -103,7 +105,7 @@ def msbuild_install_args(self):
     def build(self, pkg, spec, prefix):
         """Run "msbuild" on the build targets specified by the builder."""
         with fs.working_dir(self.build_directory):
-            pkg.module.msbuild(
+            inspect.getmodule(self.pkg).msbuild(
                 *self.std_msbuild_args,
                 *self.msbuild_args(),
                 self.define_targets(*self.build_targets),

@@ -113,6 +115,6 @@ def install(self, pkg, spec, prefix):
         """Run "msbuild" on the install targets specified by the builder.
         This is INSTALL by default"""
         with fs.working_dir(self.build_directory):
-            pkg.module.msbuild(
+            inspect.getmodule(self.pkg).msbuild(
                 *self.msbuild_install_args(), self.define_targets(*self.install_targets)
             )

View File

@@ -2,6 +2,7 @@
 # Spack Project Developers. See the top-level COPYRIGHT file for details.
 #
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)
+import inspect
 from typing import List  # novm

 import llnl.util.filesystem as fs

@@ -23,6 +24,7 @@ class NMakePackage(spack.package_base.PackageBase):
     build_system("nmake")
     conflicts("platform=linux", when="build_system=nmake")
     conflicts("platform=darwin", when="build_system=nmake")
+    conflicts("platform=cray", when="build_system=nmake")

@@ -131,7 +133,9 @@ def build(self, pkg, spec, prefix):
         if self.makefile_name:
             opts.append("/F{}".format(self.makefile_name))
         with fs.working_dir(self.build_directory):
-            pkg.module.nmake(*opts, *self.build_targets, ignore_quotes=self.ignore_quotes)
+            inspect.getmodule(self.pkg).nmake(
+                *opts, *self.build_targets, ignore_quotes=self.ignore_quotes
+            )

     def install(self, pkg, spec, prefix):
         """Run "nmake" on the install targets specified by the builder.

@@ -143,4 +147,6 @@ def install(self, pkg, spec, prefix):
             opts.append("/F{}".format(self.makefile_name))
         opts.append(self.define("PREFIX", fs.windows_sfn(prefix)))
         with fs.working_dir(self.build_directory):
-            pkg.module.nmake(*opts, *self.install_targets, ignore_quotes=self.ignore_quotes)
+            inspect.getmodule(self.pkg).nmake(
+                *opts, *self.install_targets, ignore_quotes=self.ignore_quotes
+            )

View File

@@ -2,6 +2,8 @@
 # Spack Project Developers. See the top-level COPYRIGHT file for details.
 #
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)
+import inspect
+
 import spack.builder
 import spack.package_base
 from spack.directives import build_system, extends

@@ -45,7 +47,7 @@ class OctaveBuilder(BaseBuilder):
     def install(self, pkg, spec, prefix):
         """Install the package from the archive file"""
-        pkg.module.octave(
+        inspect.getmodule(self.pkg).octave(
             "--quiet",
             "--norc",
             "--built-in-docstrings-file=/dev/null",

View File

@@ -3,6 +3,7 @@
 #
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)
 """Common utilities for managing intel oneapi packages."""
+import getpass
 import os
 import platform
 import shutil

@@ -12,10 +13,9 @@
 from llnl.util.filesystem import HeaderList, LibraryList, find_libraries, join_path, mkdirp
 from llnl.util.link_tree import LinkTree

-import spack.util.path
 from spack.build_environment import dso_suffix
 from spack.directives import conflicts, license, redistribute, variant
-from spack.error import InstallError
+from spack.package_base import InstallError
 from spack.util.environment import EnvironmentModifications
 from spack.util.executable import Executable

@@ -36,8 +36,9 @@ class IntelOneApiPackage(Package):
         "target=ppc64:",
         "target=ppc64le:",
         "target=aarch64:",
-        "platform=darwin",
-        "platform=windows",
+        "platform=darwin:",
+        "platform=cray:",
+        "platform=windows:",
     ]:
         conflicts(c, msg="This package in only available for x86_64 and Linux")

@@ -99,7 +100,7 @@ def install_component(self, installer_path):
         # with other install depends on the userid. For root, we
         # delete the installercache before and after install. For
         # non root we redefine the HOME environment variable.
-        if spack.util.path.get_user() == "root":
+        if getpass.getuser() == "root":
             shutil.rmtree("/var/intel/installercache", ignore_errors=True)

         bash = Executable("bash")

@@ -122,7 +123,7 @@ def install_component(self, installer_path):
             self.prefix,
         )

-        if spack.util.path.get_user() == "root":
+        if getpass.getuser() == "root":
             shutil.rmtree("/var/intel/installercache", ignore_errors=True)

         # Some installers have a bug and do not return an error code when failing

View File

@@ -2,6 +2,7 @@
 # Spack Project Developers. See the top-level COPYRIGHT file for details.
 #
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)
+import inspect
 import os
 from typing import Iterable

@@ -133,7 +134,7 @@ def build_method(self):
     def build_executable(self):
         """Returns the executable method to build the perl package"""
         if self.build_method == "Makefile.PL":
-            build_executable = self.pkg.module.make
+            build_executable = inspect.getmodule(self.pkg).make
         elif self.build_method == "Build.PL":
             build_executable = Executable(os.path.join(self.pkg.stage.source_path, "Build"))
         return build_executable

@@ -157,7 +158,7 @@ def configure(self, pkg, spec, prefix):
             options = ["Build.PL", "--install_base", prefix]
             options += self.configure_args()

-        pkg.module.perl(*options)
+        inspect.getmodule(self.pkg).perl(*options)

     # It is possible that the shebang in the Build script that is created from
     # Build.PL may be too long causing the build to fail. Patching the shebang

View File

@@ -4,6 +4,7 @@
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)
 import functools
+import inspect
 import operator
 import os
 import re

@@ -16,7 +17,7 @@
 import llnl.util.filesystem as fs
 import llnl.util.lang as lang
 import llnl.util.tty as tty
-from llnl.util.filesystem import HeaderList, LibraryList, join_path
+from llnl.util.filesystem import HeaderList, LibraryList

 import spack.builder
 import spack.config

@@ -24,8 +25,6 @@
 import spack.detection
 import spack.multimethod
 import spack.package_base
-import spack.platforms
-import spack.repo
 import spack.spec
 import spack.store
 from spack.directives import build_system, depends_on, extends

@@ -121,12 +120,6 @@ def skip_modules(self) -> Iterable[str]:
         """
         return []

-    @property
-    def bindir(self) -> str:
-        """Path to Python package's bindir, bin on unix like OS's Scripts on Windows"""
-        windows = self.spec.satisfies("platform=windows")
-        return join_path(self.spec.prefix, "Scripts" if windows else "bin")
-
     def view_file_conflicts(self, view, merge_map):
         """Report all file conflicts, excepting special cases for python.
         Specifically, this does not report errors for duplicate

@@ -229,7 +222,7 @@ def test_imports(self) -> None:
         # Make sure we are importing the installed modules,
         # not the ones in the source directory
-        python = self.module.python
+        python = inspect.getmodule(self).python  # type: ignore[union-attr]

         for module in self.import_modules:
             with test_part(
                 self,

@@ -316,9 +309,9 @@ def get_external_python_for_prefix(self):
         )

         python_externals_detected = [
-            spec
-            for spec in python_externals_detection.get("python", [])
-            if spec.external_path == self.spec.external_path
+            d.spec
+            for d in python_externals_detection.get("python", [])
+            if d.prefix == self.spec.external_path
         ]
         if python_externals_detected:
             return python_externals_detected[0]

View File

@@ -2,6 +2,8 @@
 # Spack Project Developers. See the top-level COPYRIGHT file for details.
 #
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)
+import inspect
+
 from llnl.util.filesystem import working_dir

 import spack.builder

@@ -64,17 +66,17 @@ def qmake_args(self):
     def qmake(self, pkg, spec, prefix):
         """Run ``qmake`` to configure the project and generate a Makefile."""
         with working_dir(self.build_directory):
-            pkg.module.qmake(*self.qmake_args())
+            inspect.getmodule(self.pkg).qmake(*self.qmake_args())

     def build(self, pkg, spec, prefix):
         """Make the build targets"""
         with working_dir(self.build_directory):
-            pkg.module.make()
+            inspect.getmodule(self.pkg).make()

     def install(self, pkg, spec, prefix):
         """Make the install targets"""
         with working_dir(self.build_directory):
-            pkg.module.make("install")
+            inspect.getmodule(self.pkg).make("install")

     def check(self):
         """Search the Makefile for a ``check:`` target and runs it if found."""
View File

@@ -2,10 +2,10 @@
 # Spack Project Developers. See the top-level COPYRIGHT file for details.
 #
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)
+import inspect
 from typing import Optional, Tuple

 import llnl.util.lang as lang
-from llnl.util.filesystem import mkdirp

 from spack.directives import extends

@@ -37,7 +37,6 @@ def configure_vars(self):
     def install(self, pkg, spec, prefix):
         """Installs an R package."""
-        mkdirp(pkg.module.r_lib_dir)

         config_args = self.configure_args()
         config_vars = self.configure_vars()

@@ -45,14 +44,14 @@ def install(self, pkg, spec, prefix):
         args = ["--vanilla", "CMD", "INSTALL"]

         if config_args:
-            args.append(f"--configure-args={' '.join(config_args)}")
+            args.append("--configure-args={0}".format(" ".join(config_args)))

         if config_vars:
-            args.append(f"--configure-vars={' '.join(config_vars)}")
+            args.append("--configure-vars={0}".format(" ".join(config_vars)))

-        args.extend([f"--library={pkg.module.r_lib_dir}", self.stage.source_path])
+        args.extend(["--library={0}".format(self.pkg.module.r_lib_dir), self.stage.source_path])

-        pkg.module.R(*args)
+        inspect.getmodule(self.pkg).R(*args)


 class RPackage(Package):

@@ -81,21 +80,27 @@ class RPackage(Package):
     @lang.classproperty
     def homepage(cls):
         if cls.cran:
-            return f"https://cloud.r-project.org/package={cls.cran}"
+            return "https://cloud.r-project.org/package=" + cls.cran
         elif cls.bioc:
-            return f"https://bioconductor.org/packages/{cls.bioc}"
+            return "https://bioconductor.org/packages/" + cls.bioc

     @lang.classproperty
     def url(cls):
         if cls.cran:
-            return f"https://cloud.r-project.org/src/contrib/{cls.cran}_{str(list(cls.versions)[0])}.tar.gz"
+            return (
+                "https://cloud.r-project.org/src/contrib/"
+                + cls.cran
+                + "_"
+                + str(list(cls.versions)[0])
+                + ".tar.gz"
+            )

     @lang.classproperty
     def list_url(cls):
         if cls.cran:
-            return f"https://cloud.r-project.org/src/contrib/Archive/{cls.cran}/"
+            return "https://cloud.r-project.org/src/contrib/Archive/" + cls.cran + "/"

     @property
     def git(self):
         if self.bioc:
-            return f"https://git.bioconductor.org/packages/{self.bioc}"
+            return "https://git.bioconductor.org/packages/" + self.bioc

View File

@@ -11,9 +11,9 @@
 import spack.builder
 from spack.build_environment import SPACK_NO_PARALLEL_MAKE
-from spack.config import determine_number_of_jobs
 from spack.directives import build_system, extends, maintainers
 from spack.package_base import PackageBase
+from spack.util.cpus import determine_number_of_jobs
 from spack.util.environment import env_flag
 from spack.util.executable import Executable, ProcessError

View File

@@ -3,6 +3,7 @@
 #
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)
 import glob
+import inspect

 import spack.builder
 import spack.package_base

@@ -51,10 +52,10 @@ def build(self, pkg, spec, prefix):
         gemspecs = glob.glob("*.gemspec")
         rakefiles = glob.glob("Rakefile")
         if gemspecs:
-            pkg.module.gem("build", "--norc", gemspecs[0])
+            inspect.getmodule(self.pkg).gem("build", "--norc", gemspecs[0])
         elif rakefiles:
-            jobs = pkg.module.make_jobs
-            pkg.module.rake("package", "-j{0}".format(jobs))
+            jobs = inspect.getmodule(self.pkg).make_jobs
+            inspect.getmodule(self.pkg).rake("package", "-j{0}".format(jobs))
         else:
             # Some Ruby packages only ship `*.gem` files, so nothing to build
             pass

@@ -69,6 +70,6 @@ def install(self, pkg, spec, prefix):
         # if --install-dir is not used, GEM_PATH is deleted from the
         # environement, and Gems required to build native extensions will
         # not be found. Those extensions are built during `gem install`.
-        pkg.module.gem(
+        inspect.getmodule(self.pkg).gem(
             "install", "--norc", "--ignore-dependencies", "--install-dir", prefix, gems[0]
         )

View File

@@ -2,6 +2,8 @@
 # Spack Project Developers. See the top-level COPYRIGHT file for details.
 #
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)
+import inspect
+
 import spack.builder
 import spack.package_base
 from spack.directives import build_system, depends_on

@@ -61,7 +63,8 @@ def build_args(self, spec, prefix):
     def build(self, pkg, spec, prefix):
         """Build the package."""
-        pkg.module.scons(*self.build_args(spec, prefix))
+        args = self.build_args(spec, prefix)
+        inspect.getmodule(self.pkg).scons(*args)

     def install_args(self, spec, prefix):
         """Arguments to pass to install."""

@@ -69,7 +72,9 @@ def install_args(self, spec, prefix):
     def install(self, pkg, spec, prefix):
         """Install the package."""
-        pkg.module.scons("install", *self.install_args(spec, prefix))
+        args = self.install_args(spec, prefix)
+
+        inspect.getmodule(self.pkg).scons("install", *args)

     def build_test(self):
         """Run unit tests after build.
View File

@@ -2,6 +2,7 @@
 # Spack Project Developers. See the top-level COPYRIGHT file for details.
 #
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)
+import inspect
 import os
 import re

@@ -85,13 +86,14 @@ def import_modules(self):
     def python(self, *args, **kwargs):
         """The python ``Executable``."""
-        self.pkg.module.python(*args, **kwargs)
+        inspect.getmodule(self).python(*args, **kwargs)

     def test_imports(self):
         """Attempts to import modules of the installed package."""
         # Make sure we are importing the installed modules,
         # not the ones in the source directory
+        python = inspect.getmodule(self).python
         for module in self.import_modules:
             with spack.install_test.test_part(
                 self,

@@ -99,7 +101,7 @@ def test_imports(self):
                 purpose="checking import of {0}".format(module),
                 work_dir="spack-test",
             ):
-                self.python("-c", "import {0}".format(module))
+                python("-c", "import {0}".format(module))


 @spack.builder.builder("sip")

@@ -134,13 +136,9 @@ def configure(self, pkg, spec, prefix):
         """Configure the package."""

         # https://www.riverbankcomputing.com/static/Docs/sip/command_line_tools.html
-        args = ["--verbose", "--target-dir", pkg.module.python_platlib]
+        args = ["--verbose", "--target-dir", inspect.getmodule(self.pkg).python_platlib]
         args.extend(self.configure_args())

-        # https://github.com/Python-SIP/sip/commit/cb0be6cb6e9b756b8b0db3136efb014f6fb9b766
-        if spec["py-sip"].satisfies("@6.1.0:"):
-            args.extend(["--scripts-dir", pkg.prefix.bin])
-
         sip_build = Executable(spec["py-sip"].prefix.bin.join("sip-build"))
         sip_build(*args)

@@ -153,7 +151,7 @@ def build(self, pkg, spec, prefix):
         args = self.build_args()

         with working_dir(self.build_directory):
-            pkg.module.make(*args)
+            inspect.getmodule(self.pkg).make(*args)

     def build_args(self):
         """Arguments to pass to build."""

@@ -164,7 +162,7 @@ def install(self, pkg, spec, prefix):
         args = self.install_args()

         with working_dir(self.build_directory):
-            pkg.module.make("install", *args)
+            inspect.getmodule(self.pkg).make("install", *args)

     def install_args(self):
         """Arguments to pass to install."""
View File

@@ -2,6 +2,8 @@
 # Spack Project Developers. See the top-level COPYRIGHT file for details.
 #
 # SPDX-License-Identifier: (Apache-2.0 OR MIT)
+import inspect
+
 from llnl.util.filesystem import working_dir

 import spack.builder

@@ -88,11 +90,11 @@ def build_directory(self):
     def python(self, *args, **kwargs):
         """The python ``Executable``."""
-        self.pkg.module.python(*args, **kwargs)
+        inspect.getmodule(self.pkg).python(*args, **kwargs)

     def waf(self, *args, **kwargs):
         """Runs the waf ``Executable``."""
-        jobs = self.pkg.module.make_jobs
+        jobs = inspect.getmodule(self.pkg).make_jobs
         with working_dir(self.build_directory):
             self.python("waf", "-j{0}".format(jobs), *args, **kwargs)

View File

@@ -6,12 +6,12 @@
 import collections.abc
 import copy
 import functools
+import inspect
 from typing import List, Optional, Tuple

 from llnl.util import lang

-import spack.error
-import spack.multimethod
+import spack.build_environment

 #: Builder classes, as registered by the "builder" decorator
 BUILDER_CLS = {}

@@ -96,10 +96,11 @@ class hierarchy (look at AspellDictPackage for an example of that)
     Args:
         pkg (spack.package_base.PackageBase): package object for which we need a builder
     """
+    package_module = inspect.getmodule(pkg)
     package_buildsystem = buildsystem_name(pkg)
     default_builder_cls = BUILDER_CLS[package_buildsystem]
     builder_cls_name = default_builder_cls.__name__
-    builder_cls = getattr(pkg.module, builder_cls_name, None)
+    builder_cls = getattr(package_module, builder_cls_name, None)
     if builder_cls:
         return builder_cls(pkg)

@@ -294,11 +295,7 @@ def _decorator(fn):
         return _decorator


-class BuilderMeta(
-    PhaseCallbacksMeta,
-    spack.multimethod.MultiMethodMeta,
-    type(collections.abc.Sequence),  # type: ignore
-):
+class BuilderMeta(PhaseCallbacksMeta, type(collections.abc.Sequence)):  # type: ignore
     pass

@@ -461,13 +458,15 @@ def _on_phase_start(self, instance):
         # If a phase has a matching stop_before_phase attribute,
         # stop the installation process raising a StopPhase
         if getattr(instance, "stop_before_phase", None) == self.name:
-            raise spack.error.StopPhase("Stopping before '{0}' phase".format(self.name))
+            raise spack.build_environment.StopPhase(
+                "Stopping before '{0}' phase".format(self.name)
+            )

     def _on_phase_exit(self, instance):
         # If a phase has a matching last_phase attribute,
         # stop the installation process raising a StopPhase
         if getattr(instance, "last_phase", None) == self.name:
-            raise spack.error.StopPhase("Stopping at '{0}' phase".format(self.name))
+            raise spack.build_environment.StopPhase("Stopping at '{0}' phase".format(self.name))

     def copy(self):
         return copy.deepcopy(self)
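Editor's note: StopPhase is what lets a user halt an installation at a chosen phase (spack install --until <phase>-style workflows set these attributes). A stand-alone sketch of the hook logic, with a plain exception standing in for StopPhase:

    class FakeInstance:
        """Illustrative: the attributes the phase hooks inspect."""

        stop_before_phase = "install"
        last_phase = None

    def on_phase_start(instance, phase_name):
        # Mirrors _on_phase_start above.
        if getattr(instance, "stop_before_phase", None) == phase_name:
            raise RuntimeError(f"Stopping before '{phase_name}' phase")

    on_phase_start(FakeInstance(), "build")    # no-op, the phase proceeds
    on_phase_start(FakeInstance(), "install")  # raises before the phase body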

View File

@@ -9,8 +9,10 @@
 import llnl.util.lang
 from llnl.util.filesystem import mkdirp
+from llnl.util.symlink import symlink

 import spack.config
+import spack.error
 import spack.fetch_strategy
 import spack.paths
 import spack.util.file_cache

@@ -32,8 +34,6 @@ def _misc_cache():
     return spack.util.file_cache.FileCache(path)

-FileCacheType = Union[spack.util.file_cache.FileCache, llnl.util.lang.Singleton]
-
 #: Spack's cache for small data
 MISC_CACHE: Union[spack.util.file_cache.FileCache, llnl.util.lang.Singleton] = (
     llnl.util.lang.Singleton(_misc_cache)

@@ -72,6 +72,23 @@ def store(self, fetcher, relative_dest):
         mkdirp(os.path.dirname(dst))
         fetcher.archive(dst)

+    def symlink(self, mirror_ref):
+        """Symlink a human readible path in our mirror to the actual
+        storage location."""
+
+        cosmetic_path = os.path.join(self.root, mirror_ref.cosmetic_path)
+        storage_path = os.path.join(self.root, mirror_ref.storage_path)
+        relative_dst = os.path.relpath(storage_path, start=os.path.dirname(cosmetic_path))
+
+        if not os.path.exists(cosmetic_path):
+            if os.path.lexists(cosmetic_path):
+                # In this case the link itself exists but it is broken: remove
+                # it and recreate it (in order to fix any symlinks broken prior
+                # to https://github.com/spack/spack/pull/13908)
+                os.unlink(cosmetic_path)
+            mkdirp(os.path.dirname(cosmetic_path))
+            symlink(relative_dst, cosmetic_path)

 #: Spack's local cache for downloaded source archives
 FETCH_CACHE: Union[spack.fetch_strategy.FsCache, llnl.util.lang.Singleton] = (
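Editor's note: a concrete trace of the relative_dst computation in the restored symlink method, with made-up mirror paths; expressing the target relative to the link's own directory keeps the mirror relocatable:

    import os

    root = "/mirror"
    cosmetic_path = os.path.join(root, "zlib/zlib-1.2.13.tar.gz")  # human-readable name
    storage_path = os.path.join(root, "_source-cache/archive/b3/b3a24de9.tar.gz")

    relative_dst = os.path.relpath(storage_path, start=os.path.dirname(cosmetic_path))
    print(relative_dst)  # ../_source-cache/archive/b3/b3a24de9.tar.gz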

View File

@@ -22,8 +22,6 @@
 from urllib.parse import urlencode
 from urllib.request import HTTPHandler, Request, build_opener

-import ruamel.yaml
-
 import llnl.util.filesystem as fs
 import llnl.util.tty as tty
 from llnl.util.lang import memoized

@@ -31,7 +29,6 @@
 import spack
 import spack.binary_distribution as bindist
-import spack.concretize
 import spack.config as cfg
 import spack.environment as ev
 import spack.main

@@ -72,7 +69,7 @@
 # TODO: Remove this in Spack 0.23
 SHARED_PR_MIRROR_URL = "s3://spack-binaries-prs/shared_pr_mirror"
 JOB_NAME_FORMAT = (
-    "{name}{@version} {/hash:7} {%compiler.name}{@compiler.version}{ arch=architecture}"
+    "{name}{@version} {/hash:7} {%compiler.name}{@compiler.version}{arch=architecture}"
 )
 IS_WINDOWS = sys.platform == "win32"
 spack_gpg = spack.main.SpackCommand("gpg")
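Editor's note: the only change to JOB_NAME_FORMAT is dropping the space inside "{ arch=architecture}". These tokens are Spec.format syntax, where the text inside the braces is emitted only when the attribute renders. A hedged sketch (the spec and the rendered output are illustrative, and formatting requires a concretized spec):

    import spack.spec

    fmt = "{name}{@version} {/hash:7} {%compiler.name}{@compiler.version}{arch=architecture}"
    spec = spack.spec.Spec("zlib@1.3 %gcc@12").concretized()
    print(spec.format(fmt))
    # e.g. "zlib@1.3 /abcdefg %gcc@12.1.0arch=linux-ubuntu22.04-x86_64"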
@@ -554,9 +551,10 @@ def generate_gitlab_ci_yaml(
     env,
     print_summary,
     output_file,
-    *,
     prune_dag=False,
     check_index_only=False,
+    run_optimizer=False,
+    use_dependencies=False,
     artifacts_root=None,
     remote_mirror_override=None,
 ):

@@ -577,6 +575,12 @@ def generate_gitlab_ci_yaml(
             this mode results in faster yaml generation time). Otherwise, also
             check each spec directly by url (useful if there is no index or it
             might be out of date).
+        run_optimizer (bool): If True, post-process the generated yaml to try
+            try to reduce the size (attempts to collect repeated configuration
+            and replace with definitions).)
+        use_dependencies (bool): If true, use "dependencies" rather than "needs"
+            ("needs" allows DAG scheduling). Useful if gitlab instance cannot
+            be configured to handle more than a few "needs" per job.
         artifacts_root (str): Path where artifacts like logs, environment
             files (spack.yaml, spack.lock), etc should be written. GitLab
             requires this to be within the project directory.

@@ -810,8 +814,7 @@ def ensure_expected_target_path(path):
     cli_scopes = [
         os.path.relpath(s.path, concrete_env_dir)
         for s in cfg.scopes().values()
-        if not s.writable
-        and isinstance(s, (cfg.DirectoryConfigScope))
+        if isinstance(s, cfg.ImmutableConfigScope)
         and s.path not in env_includes
         and os.path.exists(s.path)
     ]

@@ -1110,8 +1113,7 @@ def main_script_replacements(cmd):
                 cdash_handler.populate_buildgroup(all_job_names)
             except (SpackError, HTTPError, URLError, TimeoutError) as err:
                 tty.warn(f"Problem populating buildgroup: {err}")
-    elif cdash_config:
-        # warn only if there was actually a CDash configuration.
+    else:
         tty.warn("Unable to populate buildgroup without CDash credentials")

     service_job_retries = {

@@ -1219,8 +1221,8 @@ def main_script_replacements(cmd):
         # Capture the version of Spack used to generate the pipeline, that can be
         # passed to `git checkout` for version consistency. If we aren't in a Git
         # repository, presume we are a Spack release and use the Git tag instead.
-        spack_version = spack.get_version()
-        version_to_clone = spack.get_spack_commit() or f"v{spack.spack_version}"
+        spack_version = spack.main.get_version()
+        version_to_clone = spack.main.get_spack_commit() or f"v{spack.spack_version}"

         output_object["variables"] = {
             "SPACK_ARTIFACTS_ROOT": rel_artifacts_root,

@@ -1269,6 +1271,17 @@ def main_script_replacements(cmd):
             with open(copy_specs_file, "w") as fd:
                 fd.write(json.dumps(buildcache_copies))

+        # TODO(opadron): remove this or refactor
+        if run_optimizer:
+            import spack.ci_optimization as ci_opt
+
+            output_object = ci_opt.optimizer(output_object)
+
+        # TODO(opadron): remove this or refactor
+        if use_dependencies:
+            import spack.ci_needs_workaround as cinw
+
+            output_object = cinw.needs_to_dependencies(output_object)
     else:
         # No jobs were generated
         noop_job = spack_ci_ir["jobs"]["noop"]["attributes"]

@@ -1297,11 +1310,8 @@ def main_script_replacements(cmd):
     if not rebuild_everything:
         sys.exit(1)

-    # Minimize yaml output size through use of anchors
-    syaml.anchorify(sorted_output)
-
-    with open(output_file, "w") as f:
-        ruamel.yaml.YAML().dump(sorted_output, f)
+    with open(output_file, "w") as outf:
+        outf.write(syaml.dump(sorted_output, default_flow_style=True))


 def _url_encode_string(input_string):

@@ -1372,6 +1382,15 @@ def can_verify_binaries():
     return len(gpg_util.public_keys()) >= 1


+def _push_to_build_cache(spec: spack.spec.Spec, sign_binaries: bool, mirror_url: str) -> None:
+    """Unchecked version of the public API, for easier mocking"""
+    bindist.push_or_raise(
+        spec,
+        spack.mirror.Mirror.from_url(mirror_url).push_url,
+        bindist.PushOptions(force=True, unsigned=not sign_binaries),
+    )
+
+
 def push_to_build_cache(spec: spack.spec.Spec, mirror_url: str, sign_binaries: bool) -> bool:
     """Push one or more binary packages to the mirror.

@@ -1382,15 +1401,20 @@ def push_to_build_cache(spec: spack.spec.Spec, mirror_url: str, sign_binaries: b
         sign_binaries: If True, spack will attempt to sign binary package before pushing.
     """
     tty.debug(f"Pushing to build cache ({'signed' if sign_binaries else 'unsigned'})")
-    signing_key = bindist.select_signing_key() if sign_binaries else None
-    mirror = spack.mirror.Mirror.from_url(mirror_url)
     try:
-        with bindist.make_uploader(mirror, signing_key=signing_key) as uploader:
-            uploader.push_or_raise([spec])
+        _push_to_build_cache(spec, sign_binaries, mirror_url)
         return True
     except bindist.PushToBuildCacheError as e:
-        tty.error(f"Problem writing to {mirror_url}: {e}")
+        tty.error(str(e))
         return False
+    except Exception as e:
+        # TODO (zackgalbreath): write an adapter for boto3 exceptions so we can catch a specific
+        # exception instead of parsing str(e)...
+        msg = str(e)
+        if any(x in msg for x in ["Access Denied", "InvalidAccessKeyId"]):
+            tty.error(f"Permission problem writing to {mirror_url}: {msg}")
+            return False
+        raise


 def remove_other_mirrors(mirrors_to_keep, scope=None):

@@ -1436,6 +1460,10 @@ def copy_stage_logs_to_artifacts(job_spec: spack.spec.Spec, job_log_dir: str) ->
         job_log_dir: path into which build log should be copied
     """
     tty.debug(f"job spec: {job_spec}")
+    if not job_spec:
+        msg = f"Cannot copy stage logs: job spec ({job_spec}) is required"
+        tty.error(msg)
+        return

     try:
         pkg_cls = spack.repo.PATH.get_pkg_class(job_spec.name)

Some files were not shown because too many files have changed in this diff.