Compare commits


85 Commits

Author SHA1 Message Date
Harmen Stoppels
594a376c52 Set version to v0.22.2 2024-09-21 12:39:55 +02:00
Harmen Stoppels
1538c48616 run-unit-tests: no xdist if coverage (#46480)
xdist only slows down unit tests under coverage
2024-09-21 12:39:55 +02:00
Massimiliano Culpo
683e50b8d9 Run unit test in parallel again in CI (#45793)
The --trace-config option was failing for linux unit-tests,
so we were running them serially.
2024-09-21 12:39:55 +02:00
Harmen Stoppels
9b32fb0beb Revert "Change environment modifications to escape with double quotes (#36789)" (#42780)
This reverts commit 690394fabc, as it causes arbitrary code execution.
2024-09-21 12:39:55 +02:00
Harmen Stoppels
2c6df0d491 deal with TimeoutError from ssl.py (#45683) 2024-09-21 12:39:55 +02:00
Harmen Stoppels
ce7218acae buildcache: fix hard-coded, outdated layout version (#45645) 2024-09-21 12:39:55 +02:00
Dominic Hofer
246eeb2b69 Remove execution permission from setup-env.sh (#45641)
`setup-env.sh` is meant to be sourced, not executed directly.
By revoking execution permissions, users who accidentally execute
the script will receive an error instead of seeing no effect.

* Remove execution permission from `setup-env.sh` and friends
* Don't make output file executable in `spack commands --update-completion`

---------

Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-09-21 12:39:55 +02:00
Harmen Stoppels
cc47ee3984 unparser.py: remove print statements (#45235) 2024-09-21 12:39:55 +02:00
Harmen Stoppels
7b644719c1 Avoid duplicate detectable tag (#45160)
In case of inheritance, the static tags prop may be updated multiple
times, and it turns out builder classes magically inherit from
traditional package classes.
2024-09-21 12:39:55 +02:00
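
A minimal sketch of the pitfall: a mutable class attribute reached through inheritance is shared, so updating it once per class in the hierarchy duplicates entries (hypothetical classes, not Spack's real ones):

```python
# Hypothetical sketch: a class-level list shared through inheritance is
# mutated once per class, yielding duplicate tags.
class Package:
    tags = []

    @classmethod
    def mark_detectable(cls):
        # Appending to the inherited list mutates the shared object
        # instead of creating a per-class copy.
        cls.tags.append("detectable")

class Builder(Package):
    pass

Package.mark_detectable()
Builder.mark_detectable()
print(Package.tags)  # ['detectable', 'detectable'] -- duplicated
```

Guarding the update (e.g. `if "detectable" not in cls.tags`) is one way to avoid the duplication.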
Harmen Stoppels
d8a6aa551e build_environment: explicitly disable ccache if disabled (#45275) 2024-09-21 12:39:55 +02:00
Massimiliano Culpo
ac7b18483a Bump archspec to latest commit (#46445)
This should fix an issue with Neoverse XX detection

Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
2024-09-21 12:39:55 +02:00
Massimiliano Culpo
39f37de4ce Update archspec to v0.2.5-dev (7e6740012b897ae4a950f0bba7e9726b767e921f) (#45721) 2024-09-21 12:39:55 +02:00
Harmen Stoppels
703e153404 require spec in develop entry (#46485) 2024-09-21 12:39:55 +02:00
Harmen Stoppels
aa013611bc url join: fix oci scheme (#46483)
* url.py: also special case oci scheme in join

* avoid fetching keys from oci mirror
2024-09-21 12:39:55 +02:00
Harmen Stoppels
6a7ccd4e46 docs: refer to upstreams.yaml in chain.rst title (#46475) 2024-09-21 12:39:55 +02:00
Harmen Stoppels
1c6c4b4690 spack.util.url: fix join breakage in python 3.12.6 (#46453) 2024-09-21 12:39:55 +02:00
arezaii
68558b3dd0 Chapel package: updates post release (#45304)
* Fix +rocm variant, to ensure correct dependencies on ROCm packages
  and use of AMD LLVM
* Add a +pshm variant for comm=gasnet to enable fast shared-memory
  comms between co-locales
* Add logic to ensure we get the native CXI libfabric network provider
  on Cray EX
* Expand dependency type for package modules to encompass runtime
  dependencies
* Factor logic for setting (LD_)LIBRARY_PATH and PKG_CONFIG_PATH of
  runtime dependencies
* Workaround issue #44746 that causes a transitive dependency on lua
  to break SLURM
* Disable nonfunctional checkChplDoc test
* Annotate some variants as conditional, to improve spack info output
  and reduce confusion

---------

Co-authored-by: Dan Bonachea <dobonachea@lbl.gov>
2024-09-21 12:39:55 +02:00
arezaii
5440fe09cd update chapel package for v2.1 (#44931) 2024-09-21 12:39:55 +02:00
arezaii
03c22f403f Chapel package: major update (#42197)
* add cray detection taken from upcxx
* add CUDA/ROCm support
* add numerous pass-through options to Chapel build,
  like gpu_mem_strategy, comm_substrate, etc.; all variants are
  translated to analogous CHPL_* environment variables. As a side
  effect, this defines a number of environment variables that are
  not actually used by Chapel.
* Define LD_LIBRARY_PATH, LIBRARY_PATH, and PKG_CONFIG_PATH to
  help programs built with Chapel properly locate needed runtime
  dependencies

---------

Co-authored-by: bonachea <dobonachea@lbl.gov>
2024-09-21 12:39:55 +02:00
Greg Becker
f339225d22 include_concrete: read from older env formats properly (#45766)
* include_concrete: read from older env formats properly
* spack env rm: fix logic for checking env includes
* regression test

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-09-21 12:39:55 +02:00
Massimiliano Culpo
22c815f3d4 Do not halt concretization on unknown variants in externals (#45326)
* Do not halt concretization on unknown variants in externals
2024-09-21 12:39:55 +02:00
Massimiliano Culpo
4354288e44 Run minimization of weights only on known targets (#45269)
This prevents excessive output from clingo of the kind:

.../spack/lib/spack/spack/solver/concretize.lp:1640:5-11: info: tuple ignored:
  #sup@2
2024-09-21 12:39:55 +02:00
Massimiliano Culpo
ea2d43b4a6 Do not initialize previous store state in "use_store" (#45268)
The "use_store" context manager is used to swap the value
of a global variable (spack.store.STORE), while keeping
another global variable consistent (spack.config.CONFIG).

When doing that it tries to evaluate the previous value
of the store, if that was not done already. This is wrong,
since the configuration might be in an "intermediate" state
that was never meant to trigger side effects.

Remove that operation, and add a unit test to
prevent regressions.
2024-09-21 12:39:55 +02:00
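
A minimal sketch of a context manager that swaps a global without evaluating the previous value (a simplified stand-in for `use_store`, assuming nothing about Spack's real store type):

```python
import contextlib

STORE = None  # stand-in for spack.store.STORE (hypothetical simplification)

@contextlib.contextmanager
def use_store(new_store):
    """Swap the global store, restoring the old value on exit.

    The previous value is saved as-is and never evaluated, so an
    uninitialized or intermediate state triggers no side effects.
    """
    global STORE
    previous, STORE = STORE, new_store
    try:
        yield new_store
    finally:
        STORE = previous

with use_store("my-store"):
    assert STORE == "my-store"
assert STORE is None
```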
Massimiliano Culpo
85e67d60a0 Add compatibility of sequoia with previous macOS versions (#45127)
* Add compatibility of sequoia with previous macOS versions

* Add compatibility of sequoia with previous macOS versions
2024-09-21 12:39:55 +02:00
Adam J. Stewart
bf6a9ff5ed Add support for macOS Sequoia (#45018) 2024-09-21 12:39:55 +02:00
Jordan Galby
1bdc30979d Fix regression in spec format string for indiviual variants (#46206)
Fix a regression in {variants.X} and {variants.X.value} spec format strings.
2024-09-21 12:39:55 +02:00
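
A hedged example of the affected format strings, using Spack's Python API from `spack python`; the exact output may vary by version:

```python
from spack.spec import Spec

s = Spec("zlib-ng+shared")
print(s.format("{variants.shared}"))        # e.g. "+shared"
print(s.format("{variants.shared.value}"))  # e.g. "True"
```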
Harmen Stoppels
ef1eabe5b3 Add c to the list of languages (#45191) 2024-09-21 12:39:55 +02:00
Harmen Stoppels
43d673f915 Add pkg- prefix to builtin.mock a b c d ... (#45205) 2024-09-21 12:39:55 +02:00
Harmen Stoppels
8a9c501030 spec.py: fix __getitem__ looking outside of dag (#45090)
`Spec.__getitem__` queries dependent edges, which almost always point to
nodes outside the sub-dag considered. It should only ever look at edges
being traversed.
2024-09-21 12:39:55 +02:00
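
A minimal sketch of the corrected behavior: resolve `spec[name]` by walking child (dependency) edges only, never dependent edges, so the lookup stays inside the sub-dag (hypothetical `Node` class, not Spack's data structures):

```python
from collections import deque

class Node:
    def __init__(self, name, deps=()):
        self.name = name
        self.deps = list(deps)

def lookup(root, name):
    # Breadth-first search over dependency edges only; dependent edges
    # could escape the sub-dag rooted at `root`.
    seen, queue = set(), deque([root])
    while queue:
        node = queue.popleft()
        if node.name == name:
            return node
        for dep in node.deps:
            if id(dep) not in seen:
                seen.add(id(dep))
                queue.append(dep)
    raise KeyError(name)

zlib = Node("zlib")
cmake = Node("cmake", [zlib])
root = Node("root", [cmake])
assert lookup(root, "zlib") is zlib
```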
Harmen Stoppels
9f035ca030 Set version to v0.22.2.dev0 2024-09-21 12:39:55 +02:00
Harmen Stoppels
d66dce2d66 Set version to v0.22.1 2024-07-04 15:14:09 +02:00
Jordan Galby
ef2aa2f5f5 spack audit packages: Fix message (#45045)
Fix message formatting of the "virtual dependency cannot have variants" error.
2024-07-04 15:13:31 +02:00
Harmen Stoppels
41f5f6eaab iconv: require libiconv on linux (#45026)
otherwise it is still picked up from glibc as it is external
2024-07-04 15:07:05 +02:00
Massimiliano Culpo
cba347e0b7 Heuristic decays to default over time (#45023)
This modifies the heuristic to decay to clingo's default
over time. The hope is that this helps with specs
that have an optimal solution with a high penalty.

Let the target and compiler heuristics decay too, and do not
guess the compiler.
2024-07-04 15:07:05 +02:00
Harmen Stoppels
a3cef0f02e netlib-lapack: provide blas and lapack together (#44981)
If netlib-lapack is built with ~external-blas, it internally links
liblapack.so with libblas.so, meaning that whenever netlib-lapack is
used as a lapack provider, the package must also be a blas provider.

Conversely, using netlib-lapack as a blas provider does not imply that it
must also provide lapack, but nothing is lost by disallowing that...
2024-07-01 16:56:31 +02:00
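
In directive terms this amounts to providing both virtuals together whenever the internal BLAS is linked; a hedged sketch using Spack's `provides` directive, with illustrative `when` conditions rather than the package's exact ones:

```python
# Hedged sketch of the provider logic described above; the `when`
# conditions are illustrative, not the real package's exact ones.
from spack.package import *

class NetlibLapack(CMakePackage):
    variant("external-blas", default=False,
            description="Build with an external BLAS")

    # With the internal BLAS, liblapack.so links against libblas.so,
    # so blas and lapack must be provided together.
    provides("blas", "lapack", when="~external-blas")
    # With an external BLAS, only lapack is provided.
    provides("lapack", when="+external-blas")
```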
Harmen Stoppels
45fca040c3 Use composite stage also for develop specs (#44950) 2024-07-01 16:56:31 +02:00
Harmen Stoppels
eb2b5739b2 Remove DIYStage (#44949) 2024-07-01 16:56:31 +02:00
Massimiliano Culpo
d299e17d43 neoverse-v1: restore py-cinemasci (#44976)
Use a different tactic for determining conflicts.

Give more priority to setting very old versions to False.
2024-07-01 16:56:31 +02:00
Massimiliano Culpo
d883883be0 Ensure parent runtime version >= child (#44834)
Fixes a bug where old gcc-runtime libraries would be loaded at runtime, but newer are required by dependencies, breaking the binaries.
2024-07-01 16:56:31 +02:00
Massimiliano Culpo
249dcb49e2 ASP-based solver: add a generic rule for propagation (#44870)
This adds a generic propagate/2 rule to propagate any
fact to children in the DAG.
2024-07-01 16:56:31 +02:00
Massimiliano Culpo
8628add66b Simplify and improve solver heuristic (#44893)
When we changed how to deal with errors in November,
we didn't realize that for an unconstrained choice
rule it is more important in the heuristic to guess
what is NOT in the answer set, since it will be the
majority of options.

Previously this was following automatically from what
was in the answer set, via `1 { ... } 1` cardinality
constraints.

Here we improve the heuristic and the solve time for specs.
2024-07-01 16:56:31 +02:00
Harmen Stoppels
aeccba8bc0 build_environment: fix ccache error handling (#44740) 2024-07-01 16:56:31 +02:00
Todd Gamblin
d94e8ab36f python: make every view a venv (#44382)
#40773 introduced python-venv, which improved build isolation and avoids issues with,
e.g., `ubuntu`'s system python modifying `sysconfig` to include a (very unwanted)
`local` directory within the default install layout.

This addresses a few cases where #40773 removed functionality, without harming the
default cases where we use `python-venv`.

Traditionally, *every* view with `python` in it was essentially a virtual environment,
because we would copy the `python` interpreter and `os.py` into every view when linking.
We now rely on `python-venv` to do that, but only when it's used (i.e. new builds) and
only for packages that have an `extends("python")` directive.

This again makes every view with `python` in it a virtual environment, but only
if we're not already using a package like `python-venv`. This uses a different
mechanism from before -- instead of using the `virtualenv` trick of copying `python`
into the prefix, we instead create a `pyvenv.cfg` like `venv` (the more modern way
to do it).

This fixes two things:
1. If you already had an environment before Spack `v0.22` that worked, it would
   stop working without a reconcretize and rebuild in `v0.22`, because we no longer
   copy the python interpreter on link. Adding `pyvenv.cfg` fixes this in a more
   modern way, so old views will keep working.

2. If you have an env that only includes python packages that use `depends_on("python")`
   instead of `extends("python")`, those packages will now be importable as before,
   though they won't have the same level of build isolation you'd get with `extends`
   and `python-venv`.

* views: avoid making client code deal with link functions

Users of views and ViewDescriptors shouldn't have to deal with link functions -- they
should just say what type of linking they want.

- [x] views take a link_type, not a link function
- [x] views work out the link function from the link type
- [x] view descriptors and commands now just tell the view what they want.

* python: simplify logic for avoiding pyvenv.cfg in copy views

Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-07-01 16:56:31 +02:00
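
A minimal sketch of the mechanism (hypothetical helper and paths): a `pyvenv.cfg` written at the view root makes a `bin/python` under it behave like a venv interpreter, without copying `python` itself:

```python
import os

def write_pyvenv_cfg(view_root: str, real_python_home: str) -> None:
    # `home` points at the directory containing the real python
    # executable; the interpreter under view_root/bin then treats
    # view_root as a virtual environment prefix.
    cfg = os.path.join(view_root, "pyvenv.cfg")
    with open(cfg, "w") as f:
        f.write(f"home = {real_python_home}\n")
        f.write("include-system-site-packages = false\n")

# e.g. write_pyvenv_cfg("/path/to/view", "/path/to/python/bin")
```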
Massimiliano Culpo
e66c26871f Move unit tests into the same file, simplify main workflow 2024-07-01 16:56:31 +02:00
kwryankrattiger
2db4ff7061 Generate jobs should use x86_64_v3 runners only (#44582) 2024-07-01 16:56:31 +02:00
Tom Bradford
c248932a94 protobuf: fix 3.4:3.21 patch checksum (#44443) 2024-07-01 16:56:31 +02:00
dmagdavector
f15d302fc7 protobuf: update hash for patch needed when="@3.4:3.21" (#44210)
* protobuf: update hash for patch needed when="@3.4:3.21"

* Update var/spack/repos/builtin/packages/protobuf/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

* Update var/spack/repos/builtin/packages/protobuf/package.py

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>

---------

Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
2024-07-01 16:56:31 +02:00
John W. Parent
74ef630241 Windows: Non config changes to support Gitlab CI (#43965)
* Quote python for shlex

* Remove python path quoting patch

* spack env: Allow `C` "protocol" for config_path

When running spack on windows, a path beginning with `C://...` is a valid path.

* Remove makefile from ci rebuild

* GPG use llnl.util.filesystem.getuid

* Cleanup process_command

* Remove unused lines

* Fix typo in encode_path

* Double quote arguments

* Cleanup process_command

* Pass cdash args with =

* Escape parens in CMD script

* escape parens doesn't only apply to paths

* Install deps

* sfn prefix

* use sfn with libxml2

* Add hash to dep install

* WIP

* Review

* Changes missed in prior review commit

* Style

* Ensure we handle Windows paths with config scopes

* clarify docstring

* No more MAKE_COMMAND

* syntax cleanup

* Actually correct is_path_url

* Correct call

* raise on other errors

* url2path behaves differently on unix

* Ensure proper quoting

* actually prepend slash in slash_hash

---------

Co-authored-by: Ryan Krattiger <ryan.krattiger@kitware.com>
Co-authored-by: Mike VanDenburgh <michael.vandenburgh@kitware.com>
2024-07-01 16:56:31 +02:00
John W. Parent
a70ea11e69 Gitlab CI: Windows Configs (#43967)
Add support for Gitlab CI on Windows

This PR adds the config changes required to configure and execute
Gitlab pipelines running Windows builds on Windows runners using
the existing Gitlab CI infrastructure (and newly added Windows 
infrastructure).

* Adds support for generating child pipelines dispatched to Windows runners
* Refactors the relevant pre-scripts, scripts, and post scripts to be compatible with Windows
* Adds Windows config section describing Windows jobs
* Adds VTK as Windows build stack (to be expanded later)
* Modifies proj to build on Windows
* Refactors Windows rpath symlinking to avoid system libs and externals

---------

Co-authored-by: Ryan Krattiger <ryan.krattiger@kitware.com>
Co-authored-by: Mike VanDenburgh <michael.vandenburgh@kitware.com>
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
Co-authored-by: Scott Wittenburg <scott.wittenburg@kitware.com>
2024-07-01 16:56:31 +02:00
John W. Parent
a79b1bd9af Buildcache/ensure symlinks proper prefix (#43851)
* archive: relative links only

Ensure all links written into tarfiles generated from Spack prefixes do not contain symlinks pointing outside the prefix

* binary_distribution: limit extraction to prefix

Ensure files extracted from spackballs are not links pointing outside of the prefix

* Ensure rpaths are properly set on Windows

* hard error on extraction of absolute links

* refactor for non link-modifying approach

* Restore tarball extraction to original impl

* use custom readlink

* cleanup symlink module

* make lstrip
2024-07-01 16:56:31 +02:00
John W. Parent
ac5d5485b9 Cdash reporting timeout (#44213)
* Add timeout to cdash reporter PUT request

Add cdash timeout everywhere
Correct mock responder api

* Style

* brief doc
2024-07-01 16:56:31 +02:00
John W. Parent
04258f9cce Prefer llnl.util.symlink.readlink to os.readlink (#44126)
Symlinks on Windows can use longpath prefixes (\\?\); these are fine
in the context of win32 API interactions but break numerous facets of
Spack behavior that rely on string parsing/matching (archiving,
binary distributions, tarball extraction, view regen, etc).

Spack's internal readlink method (llnl.util.symlink.readlink)
gracefully handles this by removing the prefix and otherwise behaving
exactly as os.readlink does, so we should prefer that in all cases.
2024-07-01 16:56:31 +02:00
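
A minimal sketch of the wrapper's behavior (simplified, not Spack's exact implementation):

```python
import os

def readlink(path: str) -> str:
    target = os.readlink(path)
    # Strip the win32 longpath prefix (\\?\) so string-based consumers
    # (archiving, tarball extraction, view regeneration) keep working.
    if target.startswith("\\\\?\\"):
        target = target[4:]
    return target
```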
Scott Wittenburg
1b14170bd1 gitlab ci: fix untouched spec pruning on windows (#44279)
Use correct path separator in get_all_package_diffs for all platforms.
Ensures correct package change computation on Windows when pruning unchanged specs in Gitlab CI
2024-07-01 16:56:31 +02:00
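
A minimal sketch of the fix's idea: `git diff` output always uses forward slashes, so parse it with posix path semantics instead of `os.sep` (hypothetical helper):

```python
import pathlib

def package_name_from_diff_line(path):
    # Diff paths are posix-style on every platform, including Windows.
    parts = pathlib.PurePosixPath(path).parts
    if "packages" in parts:
        return parts[parts.index("packages") + 1]
    return None

assert package_name_from_diff_line(
    "var/spack/repos/builtin/packages/zlib/package.py") == "zlib"
```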
Massimiliano Culpo
a3bc9dbfe8 Make strong preferences even stronger (#44373)
Before this PR, if Spack could see a possibility to reuse a spec that
doesn't match a strong preference, it would do so. After the PR, a
strong preference would take precedence.
2024-07-01 16:56:31 +02:00
Greg Becker
e7c86259bd bugfix: external detection for compilers with os but not target (#44156)
avoid calling `spec.target` when None.

When an external compiler package has an `os` set but no `target` set, Spack
currently falls into a codepath that calls `spec.target` (which itself calls
`spec.architecture.target.Microarchitecture`) when `spec.architecture.target`
is None, throwing an error.

e.g.

```
packages:
  gcc:
    externals:
    - spec: gcc@12.3.1 os=rhel7
      prefix: /usr
```

---------

Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
2024-07-01 16:56:31 +02:00
Massimiliano Culpo
2605aeb072 ASP-based solver: fix reusing externals on linux (#44316)
We need to tell clingo the libc compatibility of external nodes
in buildcaches or stores, to allow reuse.
2024-07-01 16:56:31 +02:00
Massimiliano Culpo
94536d2b66 Enforce consistency of gl providers (#44307)
* glew: rework dependency on gl

This simplifies the package and ensures a single gl implementation is
pulled in. Before we were adding direct dependencies, and those are
not unified through the virtual.

* mesa-demos: rework dependency on gl

This simplifies the package and ensures a single gl implementation is
pulled in. Before we were adding direct dependencies, and those are
not unified through the virtual.

* mesa-glu: rework dependency on gl

This simplifies the package and ensures a single gl implementation is
pulled in. Before we were adding direct dependencies, and those are
not unified through the virtual.

* paraview: fix dependency on glew

* mesa: group dependency on when("+glx")

* Add missing dependency on libxml2

* paraview: remove the "osmesa" and "egl" variant

Instead, enforce consistency using the "gl" virtual that allows
only one provider.

* visit: remove osmesa variant

* Disable paraview in the aws-isc stacks

* data-vis-sdk: rework constraints to enforce front-ends

* e4s-power: remove redundant paraview

* Pipelines: update osmesa variants

* trilinos-catalyst-ioss-adapter: make gl a run dependency
2024-07-01 16:56:31 +02:00
Massimiliano Culpo
5e580fc82e Remove mesa18 and libosmesa (#44264)
* Remove mesa18 and libosmesa

mesa18 was introduced in #19528 as a way to maintain the old
autotools build of mesa separate from the new meson build.

We could add a second build system to mesa, but since mesa18 has
been deprecated for a long time, we'll just remove it.

libosmesa was used to multiplex the gl provider between mesa18
and mesa, and is thus unnecessary. Remove it to reduce complexity
in the graphical stack.

* Remove references to mesa18 and libosmesa

* vtk: rework dependency on gl and osmesa

* memsurfer: rework dependency on vtk

* visit: minimal fix to avoid having both osmesa and glx
2024-07-01 16:56:31 +02:00
Harmen Stoppels
195bad8675 Prefer libiconv for iconv (#44335)
`glibc` and `musl` provide a basic implementation of `iconv` (`iconv`,
`iconv_open`, `iconv_close`), but in practice the installation may be
missing the character encoding methods to make them usable. On Fedora
for example, users need to

```yum install glibc-gconv-extra```

to get the character encodings that `gettext` requires during configure,
namely EUC-JP. Users may not have permissions to install the missing
parts of glibc.

Since Spack can install `libiconv`, it is simpler to use that by
default.
2024-07-01 16:56:31 +02:00
Harmen Stoppels
bd9f3f100a gcc: use -rpath {rpath_dir} not -rpath={rpath dir} (#44315)
to make macOS's linker happy.
2024-07-01 16:56:31 +02:00
Mosè Giordano
b5962613a0 suite-sparse: improve setting of the libs property (#44214)
on some distros it is in lib64/
2024-07-01 16:56:31 +02:00
Massimiliano Culpo
cbcfc7e10a Demote a warning to debug message, if C compiler is not there (#44182) 2024-07-01 16:56:31 +02:00
Massimiliano Culpo
579fadacd0 ASP-based solver: fix version optimization for roots (#44272)
This fixes a bug occurring when two root specs need to select
old versions, and these versions have the same penalty in the
optimization. This sometimes caused an older version to be
preferred to a more recent one.

The issue was the omission of `PackageNode` in the optimization
tuple.
2024-07-01 16:56:31 +02:00
Chris Green
b86d08b022 git: bump v2.39 to 2.45; deprecate unsafe versions (#44248) 2024-07-01 16:56:31 +02:00
Scott Wittenburg
02d62cf40f oci buildcache: handle pagination of tags (#43136)
This fixes an issue where ghcr, gitlab and possibly other container registries paginate tags by default, which violates the OCI spec v1.0, but is common practice (the spec was broken itself). After this commit, you can create build cache indices of > 100 specs on ghcr.

Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
2024-07-01 16:56:31 +02:00
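
A hedged sketch of tag listing with the OCI distribution API's `n`/`last` pagination parameters (illustrative; some registries signal continuation via a `Link` header instead):

```python
import json
import urllib.request

def list_all_tags(registry, repo, page_size=100):
    tags, last = [], None
    while True:
        url = f"https://{registry}/v2/{repo}/tags/list?n={page_size}"
        if last:
            url += f"&last={last}"
        with urllib.request.urlopen(url) as resp:
            page = json.load(resp)["tags"] or []
        tags.extend(page)
        if len(page) < page_size:  # short page: no more results
            return tags
        last = page[-1]  # resume after the last tag seen
```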
Harmen Stoppels
97369776f0 build_environment.py: deal with rpathing identical packages (#44219)
When multiple gcc-runtime packages exist in the same link sub-dag, only rpath
the latest.
2024-07-01 16:56:31 +02:00
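
A minimal sketch of the deduplication described above (hypothetical data layout; Spack's real logic differs in detail):

```python
def latest_by_name(specs):
    # specs: iterable of (name, version_tuple, prefix); keep only the
    # highest version per name so a single runtime gets rpath'ed.
    best = {}
    for spec in specs:
        name, version, _ = spec
        if name not in best or version > best[name][1]:
            best[name] = spec
    return list(best.values())

runtimes = [("gcc-runtime", (12, 3), "/opt/gcc-runtime-12.3"),
            ("gcc-runtime", (13, 2), "/opt/gcc-runtime-13.2")]
assert latest_by_name(runtimes) == [
    ("gcc-runtime", (13, 2), "/opt/gcc-runtime-13.2")]
```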
Howard Pritchard
47af0159dc py-matplotlib: qualify when to do a post install (#44191)
* py-matplotlib: qualify when to do a post install

Older versions of py-matplotlib don't seem to have some of the
files that the post install step is trying to install.
Looks like the files first appeared in 3.6.0 and later.

Signed-off-by: Howard Pritchard <hppritcha@gmail.com>

* Change install paths for older matplotlib

---------

Signed-off-by: Howard Pritchard <hppritcha@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
2024-07-01 16:56:31 +02:00
Alec Scott
db6ead6fc1 rust: fix v1.78.0 instructions (#44127) 2024-07-01 16:56:31 +02:00
Harmen Stoppels
b4aa2c3cab glibc: detect from "Free Software Foundation" not "gnu" (#44154)
which should be more generic
2024-07-01 16:56:31 +02:00
Harmen Stoppels
4108de1ce4 Set version to v0.22.1.dev0 2024-07-01 16:56:31 +02:00
Todd Gamblin
5fe93fee1e Update CHANGELOG.md for v0.22.0 2024-05-12 02:06:28 +02:00
Todd Gamblin
8207f11333 Bump version to v0.22 2024-05-11 17:54:12 +02:00
Todd Gamblin
5bb5d2696f changelog: add changes from 0.21.1 and 0.21.2 (#44136)
These changes were added to the release branch but did not make it onto `develop`.
2024-05-11 17:48:27 +02:00
Harmen Stoppels
55f37dffe5 oci: improve default_retry (#44132)
Apparently urllib can throw a range of different exceptions:

1. HTTPError
2. URLError with e.reason set to the actual exception
3. TimeoutError from getresponse, which is not wrapped
2024-05-11 15:44:40 +02:00
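
A hedged sketch of a retry wrapper covering those three failure modes (hypothetical helper, not Spack's actual `default_retry`):

```python
import time
import urllib.error
import urllib.request

def fetch_with_retries(url, attempts=3, delay=1.0):
    for i in range(attempts):
        try:
            return urllib.request.urlopen(url, timeout=10)
        except urllib.error.HTTPError as e:  # subclass of URLError: catch first
            if e.code < 500 or i == attempts - 1:  # only retry server errors
                raise
        except urllib.error.URLError:  # e.reason holds the real exception
            if i == attempts - 1:
                raise
        except TimeoutError:  # raised unwrapped from getresponse
            if i == attempts - 1:
                raise
        time.sleep(delay * (2 ** i))  # exponential backoff between attempts
```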
Harmen Stoppels
252a5bd71b PythonExtension: fix issue where package does not extend python (#44109) 2024-05-10 10:48:06 +02:00
Massimiliano Culpo
f55224f161 Fix filtering external specs (#44093)
When an include filter on externals is present, implicitly
include libcs.

Also, do not penalize deprecated versions if they come
from externals.
2024-05-09 20:48:43 +02:00
Massimiliano Culpo
189ae4b06e CI/Update macOS runners: macos-latest switched to macos-14 (#44094)
macos-latest switched to macos-14, so now we are running
two identical jobs.
2024-05-09 20:48:43 +02:00
Harmen Stoppels
5e9c702fa7 gcc: use -idirafter for libc headers (#44081)
GCC C++ headers like cstdlib use `#include_next <stdlib.h>` to wrap libc
headers. We're using `-isystem` for libc, which puts those headers too
early in the search path. `-idirafter` fixes this so `include_next`
works.
2024-05-08 20:45:39 +02:00
Harmen Stoppels
965bb4d3c0 gitlab ci: tutorial: add julia and vim (#44073) 2024-05-08 14:19:22 +02:00
Todd Gamblin
354f98c94a r: patch R-CVE-2024-27322 for r@3.5:4.3.3 (#44050)
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-05-08 14:19:22 +02:00
Tamara Dahlgren
5dce480154 Remove dead environment creation code (#44065) 2024-05-08 14:19:22 +02:00
Richarda Butler
f634d48b7c Include concrete environments with include_concrete (#33768)
Add the ability to include any number of (potentially nested) concrete environments, e.g.:

```yaml
   spack:
     specs: []
     concretizer:
         unify: true
     include_concrete:
     - /path/to/environment1
     - /path/to/environment2
```

or, from the CLI:

```console
   $ spack env create myenv
   $ spack -e myenv add python
   $ spack -e myenv concretize
   $ spack env create --include-concrete myenv included_env
```

The contents of included concrete environments' spack.lock files are
included in the environment's lock file at creation time. Any changes
to included concrete environments are only reflected after the environment
is re-concretized from the re-concretized included environments.

- [x] Concretize included envs
- [x] Save concrete specs in memory by hash
- [x] Add included envs to combined env's lock file
- [x] Add test
- [x] Update documentation

    Co-authored-by: Kayla Butler <butler59@llnl.gov>
    Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
    Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
    Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
2024-05-08 14:19:22 +02:00
Massimiliano Culpo
4daee565ae Update the tutorial command to point to releases/v0.22 (#44056) 2024-05-08 14:19:22 +02:00
Massimiliano Culpo
8e4dbdc2d7 Bump removal version in deprecation messages (#44064) 2024-05-08 14:19:22 +02:00
Harmen Stoppels
4f6adc03cd gitlab: dont build paraview for neoverse v2 (#44060) 2024-05-08 14:19:22 +02:00
851 changed files with 7832 additions and 9725 deletions


@@ -28,7 +28,7 @@ jobs:
run:
shell: ${{ matrix.system.shell }}
steps:
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
with:
python-version: ${{inputs.python_version}}
@@ -61,7 +61,7 @@ jobs:
./share/spack/qa/validate_last_exit.ps1
spack -d audit externals
./share/spack/qa/validate_last_exit.ps1
- uses: codecov/codecov-action@125fc84a9a348dbcf27191600683ec096ec9021c
- uses: codecov/codecov-action@5ecb98a3c6b747ed38dc09f787459979aebb39be
if: ${{ inputs.with_coverage == 'true' }}
with:
flags: unittests,audits


@@ -37,7 +37,7 @@ jobs:
make patch unzip which xz python3 python3-devel tree \
cmake bison
- name: Checkout
uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
with:
fetch-depth: 0
- name: Bootstrap clingo
@@ -60,7 +60,7 @@ jobs:
run: |
brew install cmake bison tree
- name: Checkout
uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
@@ -84,13 +84,15 @@ jobs:
- name: Setup macOS
if: ${{ matrix.runner != 'ubuntu-latest' }}
run: |
brew install tree gawk
sudo rm -rf $(command -v gpg gpg2)
brew install tree
# Remove GnuPG since we want to bootstrap it
sudo rm -rf /usr/local/bin/gpg
- name: Setup Ubuntu
if: ${{ matrix.runner == 'ubuntu-latest' }}
run: sudo rm -rf $(command -v gpg gpg2 patchelf)
run: |
sudo rm -rf $(which gpg) $(which gpg2) $(which patchelf)
- name: Checkout
uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
with:
fetch-depth: 0
- name: Bootstrap GnuPG
@@ -119,7 +121,7 @@ jobs:
run: |
sudo rm -rf $(which gpg) $(which gpg2) $(which patchelf)
- name: Checkout
uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d


@@ -56,7 +56,7 @@ jobs:
if: github.repository == 'spack/spack'
steps:
- name: Checkout
uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
- uses: docker/metadata-action@8e5442c4ef9f78752691e2d8f8d19755c6f78e81
id: docker_meta


@@ -36,7 +36,7 @@ jobs:
core: ${{ steps.filter.outputs.core }}
packages: ${{ steps.filter.outputs.packages }}
steps:
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
if: ${{ github.event_name == 'push' }}
with:
fetch-depth: 0
@@ -77,13 +77,8 @@ jobs:
needs: [ prechecks, changes ]
uses: ./.github/workflows/unit_tests.yaml
secrets: inherit
windows:
if: ${{ github.repository == 'spack/spack' && needs.changes.outputs.core == 'true' }}
needs: [ prechecks ]
uses: ./.github/workflows/windows_python.yml
secrets: inherit
all:
needs: [ windows, unit-tests, bootstrap ]
needs: [ unit-tests, bootstrap ]
runs-on: ubuntu-latest
steps:
- name: Success


@@ -14,7 +14,7 @@ jobs:
build-paraview-deps:
runs-on: windows-latest
steps:
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d


@@ -3,5 +3,5 @@ clingo==5.7.1
flake8==7.0.0
isort==5.13.2
mypy==1.8.0
types-six==1.16.21.20240513
types-six==1.16.21.9
vermin==1.6.0


@@ -51,7 +51,7 @@ jobs:
on_develop: false
steps:
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
@@ -91,7 +91,7 @@ jobs:
UNIT_TEST_COVERAGE: ${{ matrix.python-version == '3.11' }}
run: |
share/spack/qa/run-unit-tests
- uses: codecov/codecov-action@125fc84a9a348dbcf27191600683ec096ec9021c
- uses: codecov/codecov-action@5ecb98a3c6b747ed38dc09f787459979aebb39be
with:
flags: unittests,linux,${{ matrix.concretizer }}
token: ${{ secrets.CODECOV_TOKEN }}
@@ -100,7 +100,7 @@ jobs:
shell:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
@@ -124,7 +124,7 @@ jobs:
COVERAGE: true
run: |
share/spack/qa/run-shell-tests
- uses: codecov/codecov-action@125fc84a9a348dbcf27191600683ec096ec9021c
- uses: codecov/codecov-action@5ecb98a3c6b747ed38dc09f787459979aebb39be
with:
flags: shelltests,linux
token: ${{ secrets.CODECOV_TOKEN }}
@@ -141,7 +141,7 @@ jobs:
dnf install -y \
bzip2 curl file gcc-c++ gcc gcc-gfortran git gnupg2 gzip \
make patch tcl unzip which xz
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
- name: Setup repo and non-root user
run: |
git --version
@@ -160,7 +160,7 @@ jobs:
clingo-cffi:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
@@ -185,7 +185,7 @@ jobs:
SPACK_TEST_SOLVER: clingo
run: |
share/spack/qa/run-unit-tests
- uses: codecov/codecov-action@125fc84a9a348dbcf27191600683ec096ec9021c
- uses: codecov/codecov-action@5ecb98a3c6b747ed38dc09f787459979aebb39be
with:
flags: unittests,linux,clingo
token: ${{ secrets.CODECOV_TOKEN }}
@@ -198,7 +198,7 @@ jobs:
os: [macos-13, macos-14]
python-version: ["3.11"]
steps:
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
@@ -223,8 +223,39 @@ jobs:
$(which spack) solve zlib
common_args=(--dist loadfile --tx '4*popen//python=./bin/spack-tmpconfig python -u ./bin/spack python' -x)
$(which spack) unit-test --verbose --cov --cov-config=pyproject.toml --cov-report=xml:coverage.xml "${common_args[@]}"
- uses: codecov/codecov-action@125fc84a9a348dbcf27191600683ec096ec9021c
- uses: codecov/codecov-action@5ecb98a3c6b747ed38dc09f787459979aebb39be
with:
flags: unittests,macos
token: ${{ secrets.CODECOV_TOKEN }}
verbose: true
# Run unit tests on Windows
windows:
defaults:
run:
shell:
powershell Invoke-Expression -Command "./share/spack/qa/windows_test_setup.ps1"; {0}
runs-on: windows-latest
steps:
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
with:
python-version: 3.9
- name: Install Python packages
run: |
python -m pip install --upgrade pip pywin32 setuptools pytest-cov clingo
- name: Create local develop
run: |
./.github/workflows/setup_git.ps1
- name: Unit Test
run: |
spack unit-test -x --verbose --cov --cov-config=pyproject.toml
./share/spack/qa/validate_last_exit.ps1
coverage combine -a
coverage xml
- uses: codecov/codecov-action@125fc84a9a348dbcf27191600683ec096ec9021c
with:
flags: unittests,windows
token: ${{ secrets.CODECOV_TOKEN }}
verbose: true


@@ -18,7 +18,7 @@ jobs:
validate:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
with:
python-version: '3.11'
@@ -35,7 +35,7 @@ jobs:
style:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
@@ -70,7 +70,7 @@ jobs:
dnf install -y \
bzip2 curl file gcc-c++ gcc gcc-gfortran git gnupg2 gzip \
make patch tcl unzip which xz
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
- uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b
- name: Setup repo and non-root user
run: |
git --version


@@ -1,83 +0,0 @@
name: windows
on:
workflow_call:
concurrency:
group: windows-${{github.ref}}-${{github.event.pull_request.number || github.run_number}}
cancel-in-progress: true
defaults:
run:
shell:
powershell Invoke-Expression -Command "./share/spack/qa/windows_test_setup.ps1"; {0}
jobs:
unit-tests:
runs-on: windows-latest
steps:
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
with:
python-version: 3.9
- name: Install Python packages
run: |
python -m pip install --upgrade pip pywin32 setuptools pytest-cov clingo
- name: Create local develop
run: |
./.github/workflows/setup_git.ps1
- name: Unit Test
run: |
spack unit-test -x --verbose --cov --cov-config=pyproject.toml --ignore=lib/spack/spack/test/cmd
./share/spack/qa/validate_last_exit.ps1
coverage combine -a
coverage xml
- uses: codecov/codecov-action@125fc84a9a348dbcf27191600683ec096ec9021c
with:
flags: unittests,windows
token: ${{ secrets.CODECOV_TOKEN }}
verbose: true
unit-tests-cmd:
runs-on: windows-latest
steps:
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
with:
python-version: 3.9
- name: Install Python packages
run: |
python -m pip install --upgrade pip pywin32 setuptools coverage pytest-cov clingo
- name: Create local develop
run: |
./.github/workflows/setup_git.ps1
- name: Command Unit Test
run: |
spack unit-test -x --verbose --cov --cov-config=pyproject.toml lib/spack/spack/test/cmd
./share/spack/qa/validate_last_exit.ps1
coverage combine -a
coverage xml
- uses: codecov/codecov-action@125fc84a9a348dbcf27191600683ec096ec9021c
with:
flags: unittests,windows
token: ${{ secrets.CODECOV_TOKEN }}
verbose: true
build-abseil:
runs-on: windows-latest
steps:
- uses: actions/checkout@a5ac7e51b41094c92402da3b24376905380afc29
with:
fetch-depth: 0
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d
with:
python-version: 3.9
- name: Install Python packages
run: |
python -m pip install --upgrade pip pywin32 setuptools coverage
- name: Build Test
run: |
spack compiler find
spack -d external find cmake ninja
spack -d install abseil-cpp


@@ -1,3 +1,387 @@
# v0.22.2 (2024-09-21)
## Bugfixes
- Forward compatibility with Spack 0.23 packages with language dependencies (#45205, #45191)
- Forward compatibility with `urllib` from Python 3.12.6+ (#46453, #46483)
- Bump vendored `archspec` for better aarch64 support (#45721, #46445)
- Support macOS Sequoia (#45018, #45127)
- Fix regression in `{variants.X}` and `{variants.X.value}` format strings (#46206)
- Ensure shell escaping of environment variable values in load and activate commands (#42780)
- Fix an issue where `spec[pkg]` considers specs outside the current DAG (#45090)
- Do not halt concretization on unknown variants in externals (#45326)
- Improve validation of `develop` config section (#46485)
- Explicitly disable `ccache` if turned off in config, to avoid cache pollution (#45275)
- Improve backwards compatibility in `include_concrete` (#45766)
- Fix issue where package tags were sometimes repeated (#45160)
- Make `setup-env.sh` "sourced only" by dropping execution bits (#45641)
- Make certain source/binary fetch errors recoverable instead of a hard error (#45683)
- Remove debug statements in package hash computation (#45235)
- Remove redundant clingo warnings (#45269)
- Remove hard-coded layout version (#45645)
- Do not initialize previous store state in `use_store` (#45268)
- Docs improvements (#46475)
## Package updates
- `chapel` major update (#42197, #44931, #45304)
# v0.22.1 (2024-07-04)
## Bugfixes
- Fix reuse of externals on Linux (#44316)
- Ensure parent gcc-runtime version >= child (#44834, #44870)
- Ensure the latest gcc-runtime is rpath'ed when multiple exist among link deps (#44219)
- Improve version detection of glibc (#44154)
- Improve heuristics for solver (#44893, #44976, #45023)
- Make strong preferences override reuse (#44373)
- Reduce verbosity when C compiler is missing (#44182)
- Make missing ccache executable an error when required (#44740)
- Make every environment view containing `python` a `venv` (#44382)
- Fix external detection for compilers with os but no target (#44156)
- Fix version optimization for roots (#44272)
- Handle common implementations of pagination of tags in OCI build caches (#43136)
- Apply fetched patches to develop specs (#44950)
- Avoid Windows wrappers for filesystem utilities on non-Windows (#44126)
- Fix issue with long filenames in build caches on Windows (#43851)
- Fix formatting issue in `spack audit` (#45045)
- CI fixes (#44582, #43965, #43967, #44279, #44213)
## Package updates
- protobuf: fix 3.4:3.21 patch checksum (#44443)
- protobuf: update hash for patch needed when="@3.4:3.21" (#44210)
- git: bump v2.39 to 2.45; deprecate unsafe versions (#44248)
- gcc: use -rpath {rpath_dir} not -rpath={rpath dir} (#44315)
- Remove mesa18 and libosmesa (#44264)
- Enforce consistency of `gl` providers (#44307)
- Require libiconv for iconv (#44335, #45026).
Notice that glibc/musl also provide iconv, but are not guaranteed to be
complete. Set `packages:iconv:require:[glibc]` to restore the old behavior.
- py-matplotlib: qualify when to do a post install (#44191)
- rust: fix v1.78.0 instructions (#44127)
- suite-sparse: improve setting of the `libs` property (#44214)
- netlib-lapack: provide blas and lapack together (#44981)
# v0.22.0 (2024-05-12)
`v0.22.0` is a major feature release.
## Features in this release
1. **Compiler dependencies**
We are in the process of making compilers proper dependencies in Spack, and a number
of changes in `v0.22` support that effort. You may notice nodes in your dependency
graphs for compiler runtime libraries like `gcc-runtime` or `libgfortran`, and you
may notice that Spack graphs now include `libc`. We've also begun moving compiler
configuration from `compilers.yaml` to `packages.yaml` to make it consistent with
other externals. We are trying to do this with the least disruption possible, so
your existing `compilers.yaml` files should still work. We expect to be done with
this transition by the `v0.23` release in November.
* #41104: Packages compiled with `%gcc` on Linux, macOS and FreeBSD now depend on a
new package `gcc-runtime`, which contains a copy of the shared compiler runtime
libraries. This enables gcc runtime libraries to be installed and relocated when
using a build cache. When building minimal Spack-generated container images it is
no longer necessary to install libgfortran, libgomp etc. using the system package
manager.
* #42062: Packages compiled with `%oneapi` now depend on a new package
`intel-oneapi-runtime`. This is similar to `gcc-runtime`, and the runtimes can
provide virtuals and compilers can inject dependencies on virtuals into compiled
packages. This allows us to model library soname compatibility and allows
compilers like `%oneapi` to provide virtuals like `sycl` (which can also be
provided by standalone libraries). Note that until we have an agreement in place
with intel, Intel packages are marked `redistribute(source=False, binary=False)`
and must be downloaded outside of Spack.
* #43272: changes to the optimization criteria of the solver improve the hit-rate of
buildcaches by a fair amount. The solver now uses more relaxed compatibility rules and will
not try to strictly match compilers or targets of reused specs. Users can still
enforce the previous strict behavior with `require:` sections in `packages.yaml`.
Note that to enforce correct linking, Spack will *not* reuse old `%gcc` and
`%oneapi` specs that do not have the runtime libraries as a dependency.
* #43539: Spack will reuse specs built with compilers that are *not* explicitly
configured in `compilers.yaml`. Because we can now keep runtime libraries in build
cache, we do not require you to also have a local configured compiler to *use* the
runtime libraries. This improves reuse in buildcaches and avoids conflicts with OS
updates that happen underneath Spack.
* #43190: binary compatibility on `linux` is now based on the `libc` version,
instead of on the `os` tag. Spack builds now detect the host `libc` (`glibc` or
`musl`) and add it as an implicit external node in the dependency graph. Binaries
with a `libc` with the same name and a version less than or equal to that of the
detected `libc` can be reused. This is only on `linux`, not `macos` or `Windows`.
* #43464: each package that can provide a compiler is now detectable using `spack
external find`. External packages defining compiler paths are effectively used as
compilers, and `spack external find -t compiler` can be used as a substitute for
`spack compiler find`. More details on this transition are in
[the docs](https://spack.readthedocs.io/en/latest/getting_started.html#manual-compiler-configuration)
2. **Improved `spack find` UI for Environments**
If you're working in an environment, you likely care about:
* What are the roots
* Which ones are installed / not installed
* What's been added that still needs to be concretized
We've tweaked `spack find` in environments to show this information much more
clearly. Installation status is shown next to each root, so you can see what is
installed. Roots are also shown in bold in the list of installed packages. There is
also a new option for `spack find -r` / `--only-roots` that will only show env
roots, if you don't want to look at all the installed specs.
More details in #42334.
3. **Improved command-line string quoting**
We are making some breaking changes to how Spack parses specs on the CLI in order to
respect shell quoting instead of trying to fight it. If you (sadly) had to write
something like this on the command line:
```
spack install zlib cflags=\"-O2 -g\"
```
That will now result in an error, but you can now write what you probably expected
to work in the first place:
```
spack install zlib cflags="-O2 -g"
```
Quoted strings can also now include special characters, so you can supply flags like:
```
spack install zlib ldflags='-Wl,-rpath=$ORIGIN/_libs'
```
To reduce ambiguity in parsing, we now require that you *not* put spaces around `=`
and `==` when specifying flags or variants. This would not have broken before but will now
result in an error:
```
spack install zlib cflags = "-O2 -g"
```
More details and discussion in #30634.
4. **Revert default `spack install` behavior to `--reuse`**
We changed the default concretizer behavior from `--reuse` to `--reuse-deps` in
#30990 (in `v0.20`), which meant that *every* `spack install` invocation would
attempt to build a new version of the requested package / any environment roots.
While this is a common ask for *upgrading* and for *developer* workflows, we don't
think it should be the default for a package manager.
We are going to try to stick to this policy:
1. Prioritize reuse and build as little as possible by default.
2. Only upgrade or install duplicates if they are explicitly asked for, or if there
is a known security issue that necessitates an upgrade.
With the install command you now have three options:
* `--reuse` (default): reuse as many existing installations as possible.
* `--reuse-deps` / `--fresh-roots`: upgrade (freshen) roots but reuse dependencies if possible.
* `--fresh`: install fresh versions of requested packages (roots) and their dependencies.
We've also introduced `--fresh-roots` as an alias for `--reuse-deps` to make it more clear
that it may give you fresh versions. More details in #41302 and #43988.
5. **More control over reused specs**
You can now control which packages to reuse and how. There is a new
`concretizer:reuse` config option, which accepts the following properties:
- `roots`: `true` to reuse roots, `false` to reuse just dependencies
- `exclude`: list of constraints used to select which specs *not* to reuse
- `include`: list of constraints used to select which specs *to* reuse
- `from`: list of sources for reused specs (some combination of `local`,
`buildcache`, or `external`)
For example, to reuse only specs compiled with GCC, you could write:
```yaml
concretizer:
reuse:
roots: true
include:
- "%gcc"
```
Or, if `openmpi` must be used from externals, and it must be the only external used:
```yaml
concretizer:
reuse:
roots: true
from:
- type: local
exclude: ["openmpi"]
- type: buildcache
exclude: ["openmpi"]
- type: external
include: ["openmpi"]
```
6. **New `redistribute()` directive**
Some packages can't be redistributed in source or binary form. We need an explicit
way to say that in a package.
Now there is a `redistribute()` directive so that package authors can write:
```python
class MyPackage(Package):
redistribute(source=False, binary=False)
```
Like other directives, this works with `when=`:
```python
class MyPackage(Package):
# 12.0 and higher are proprietary
redistribute(source=False, binary=False, when="@12.0:")
# can't redistribute when we depend on some proprietary dependency
redistribute(source=False, binary=False, when="^proprietary-dependency")
```
More in #20185.
7. **New `conflict:` and `prefer:` syntax for package preferences**
Previously, you could express conflicts and preferences in `packages.yaml` through
some contortions with `require:`:
```yaml
packages:
zlib-ng:
require:
- one_of: ["%clang", "@:"] # conflict on %clang
- any_of: ["+shared", "@:"] # strong preference for +shared
```
You can now use `require:` and `prefer:` for a much more readable configuration:
```yaml
packages:
zlib-ng:
conflict:
- "%clang"
prefer:
- "+shared"
```
See [the documentation](https://spack.readthedocs.io/en/latest/packages_yaml.html#conflicts-and-strong-preferences)
and #41832 for more details.
8. **`include_concrete` in environments**
You may want to build on the *concrete* contents of another environment without
changing that environment. You can now include the concrete specs from another
environment's `spack.lock` with `include_concrete`:
```yaml
spack:
specs: []
concretizer:
unify: true
include_concrete:
- /path/to/environment1
- /path/to/environment2
```
Now, when *this* environment is concretized, it will bring in the already concrete
specs from `environment1` and `environment2`, and build on top of them without
changing them. This is useful if you have phased deployments, where old deployments
should not be modified but you want to use as many of them as possible. More details
in #33768.
9. **`python-venv` isolation**
Spack has unique requirements for Python because it:
1. installs every package in its own independent directory, and
2. allows users to register *external* python installations.
External installations may contain their own installed packages that can interfere
with Spack installations, and some distributions (Debian and Ubuntu) even change the
`sysconfig` in ways that alter the installation layout of installed Python packages
(e.g., with the addition of a `/local` prefix on Debian or Ubuntu). To isolate Spack
from these and other issues, we now insert a small `python-venv` package in between
`python` and packages that need to install Python code. This isolates Spack's build
environment, isolates Spack from any issues with an external python, and resolves a
large number of issues we've had with Python installations.
See #40773 for further details.
## New commands, options, and directives
* Allow packages to be pushed to build cache after install from source (#42423)
* `spack develop`: stage build artifacts in same root as non-dev builds #41373
* Don't delete `spack develop` build artifacts after install (#43424)
* `spack find`: add options for local/upstream only (#42999)
* `spack logs`: print log files for packages (either partially built or installed) (#42202)
* `patch`: support reversing patches (#43040)
* `develop`: Add -b/--build-directory option to set build_directory package attribute (#39606)
* `spack list`: add `--namespace` / `--repo` option (#41948)
* directives: add `checked_by` field to `license()`, add some license checks
* `spack gc`: add options for environments and build dependencies (#41731)
* Add `--create` to `spack env activate` (#40896)
## Performance improvements
* environment.py: fix excessive re-reads (#43746)
* ruamel yaml: fix quadratic complexity bug (#43745)
* Refactor to improve `spec format` speed (#43712)
* Do not acquire a write lock on the env post install if no views (#43505)
* asp.py: fewer calls to `spec.copy()` (#43715)
* spec.py: early return in `__str__`
* avoid `jinja2` import at startup unless needed (#43237)
## Other new features of note
* `archspec`: update to `v0.2.4`: support for Windows, bugfixes for `neoverse-v1` and
`neoverse-v2` detection.
* `spack config get`/`blame`: with no args, show entire config
* `spack env create <env>`: dir if dir-like (#44024)
* ASP-based solver: update os compatibility for macOS (#43862)
* Add handling of custom ssl certs in urllib ops (#42953)
* Add ability to rename environments (#43296)
* Add config option and compiler support to reuse across OS's (#42693)
* Support for prereleases (#43140)
* Only reuse externals when configured (#41707)
* Environments: Add support for including views (#42250)
## Binary caches
* Build cache: make signed/unsigned a mirror property (#41507)
* tools stack
## Removals, deprecations, and syntax changes
* remove `dpcpp` compiler and package (#43418)
* spack load: remove --only argument (#42120)
## Notable Bugfixes
* repo.py: drop deleted packages from provider cache (#43779)
* Allow `+` in module file names (#41999)
* `cmd/python`: use runpy to allow multiprocessing in scripts (#41789)
* Show extension commands with spack -h (#41726)
* Support environment variable expansion inside module projections (#42917)
* Alert user to failed concretizations (#42655)
* shell: fix zsh color formatting for PS1 in environments (#39497)
* spack mirror create --all: include patches (#41579)
## Spack community stats
* 7,994 total packages; 525 since `v0.21.0`
* 178 new Python packages, 5 new R packages
* 358 people contributed to this release
* 344 committers to packages
* 45 committers to core
# v0.21.2 (2024-03-01)
## Bugfixes
@@ -27,7 +411,7 @@
- spack graph: fix coloring with environments (#41240)
- spack info: sort variants in --variants-by-name (#41389)
- Spec.format: error on old style format strings (#41934)
- ASP-based solver:
- fix infinite recursion when computing concretization errors (#41061)
- don't error for type mismatch on preferences (#41138)
- don't emit spurious debug output (#41218)


@@ -32,7 +32,7 @@
Spack is a multi-platform package manager that builds and installs
multiple versions and configurations of software. It works on Linux,
macOS, Windows, and many supercomputers. Spack is non-destructive: installing a
macOS, and many supercomputers. Spack is non-destructive: installing a
new version of a package does not break existing installations, so many
configurations of the same package can coexist.


@@ -0,0 +1,16 @@
# -------------------------------------------------------------------------
# This is the default configuration for Spack's module file generation.
#
# Settings here are versioned with Spack and are intended to provide
# sensible defaults out of the box. Spack maintainers should edit this
# file to keep it current.
#
# Users can override these settings by editing the following files.
#
# Per-spack-instance settings (overrides defaults):
# $SPACK_ROOT/etc/spack/modules.yaml
#
# Per-user settings (overrides default and site settings):
# ~/.spack/modules.yaml
# -------------------------------------------------------------------------
modules: {}


@@ -0,0 +1,3 @@
packages:
iconv:
require: [libiconv]


@@ -1433,12 +1433,22 @@ the reserved keywords ``platform``, ``os`` and ``target``:
$ spack install libelf os=ubuntu18.04
$ spack install libelf target=broadwell
or together by using the reserved keyword ``arch``:
.. code-block:: console
$ spack install libelf arch=cray-CNL10-haswell
Normally users don't have to bother specifying the architecture if they
are installing software for their current host, as in that case the
values will be detected automatically. If you need fine-grained control
over which packages use which targets (or over *all* packages' default
target), see :ref:`package-preferences`.
.. admonition:: Cray machines
The situation is a little bit different for Cray machines and a detailed
explanation on how the architecture can be set on them can be found at :ref:`cray-support`
.. _support-for-microarchitectures:


@@ -147,15 +147,6 @@ example, the ``bash`` shell is used to run the ``autogen.sh`` script.
def autoreconf(self, spec, prefix):
which("bash")("autogen.sh")
If the ``package.py`` has build instructions in a separate
:ref:`builder class <multiple_build_systems>`, the signature for a phase changes slightly:
.. code-block:: python
class AutotoolsBuilder(AutotoolsBuilder):
def autoreconf(self, pkg, spec, prefix):
which("bash")("autogen.sh")
"""""""""""""""""""""""""""""""""""""""
patching configure or Makefile.in files
"""""""""""""""""""""""""""""""""""""""


@@ -25,7 +25,7 @@ use Spack to build packages with the tools.
The Spack Python class ``IntelOneapiPackage`` is a base class that is
used by ``IntelOneapiCompilers``, ``IntelOneapiMkl``,
``IntelOneapiTbb`` and other classes to implement the oneAPI
packages. Search for ``oneAPI`` at `packages.spack.io <https://packages.spack.io>`_ for the full
packages. Search for ``oneAPI`` at `<packages.spack.io>`_ for the full
list of available oneAPI packages, or use::
spack list -d oneAPI


@@ -5,14 +5,13 @@
.. chain:
============================
Chaining Spack Installations
============================
=============================================
Chaining Spack Installations (upstreams.yaml)
=============================================
You can point your Spack installation to another installation to use any
packages that are installed there. To register the other Spack instance,
you can add it as an entry to ``upstreams.yaml`` at any of the
:ref:`configuration-scopes`:
you can add it as an entry to ``upstreams.yaml``:
.. code-block:: yaml
@@ -23,8 +22,7 @@ you can add it as an entry to ``upstreams.yaml`` at any of the
install_tree: /path/to/another/spack/opt/spack
``install_tree`` must point to the ``opt/spack`` directory inside of the
Spack base directory, or the location of the ``install_tree`` defined
in :ref:`config.yaml <config-yaml>`.
Spack base directory.
Once the upstream Spack instance has been added, ``spack find`` will
automatically check the upstream instance when querying installed packages,

View File

@@ -1364,6 +1364,187 @@ This will write the private key to the file `dinosaur.priv`.
or for help on an issue or the Spack slack.
.. _cray-support:
-------------
Spack on Cray
-------------
Spack differs slightly when used on a Cray system. The architecture spec
can differentiate between the front-end and back-end processor and operating system.
For example, on Edison at NERSC, the back-end target processor
is "Ivy Bridge", so you can target the back-end this way:
.. code-block:: console
$ spack install zlib target=ivybridge
You can also use the operating system to build against the back-end:
.. code-block:: console
$ spack install zlib os=CNL10
Notice that the name includes both the operating system name and the major
version number concatenated together.
Alternatively, if you want to build something for the front-end,
you can specify the front-end target processor. The processor for a login node
on Edison is "Sandy Bridge", so we specify it on the command line like so:
.. code-block:: console
$ spack install zlib target=sandybridge
And the front-end operating system is:
.. code-block:: console
$ spack install zlib os=SuSE11
^^^^^^^^^^^^^^^^^^^^^^^
Cray compiler detection
^^^^^^^^^^^^^^^^^^^^^^^
Spack can detect compilers using two methods. For the front-end, we treat
everything the same. The difference lies in back-end compiler detection.
Back-end compiler detection is done via the Tcl ``module avail`` command.
Once Spack detects a compiler, it writes the appropriate PrgEnv and compiler
module name to compilers.yaml and sets the paths to each compiler with Cray's
compiler wrapper names (i.e. cc, CC, ftn). At build time, Spack will load
the correct PrgEnv and compiler module and call the appropriate wrapper.
The compilers.yaml config file will also differ. It has a
modules section that is filled with the compiler's Programming Environment
and module name. On other systems, this field is an empty list []:
.. code-block:: yaml
- compiler:
modules:
- PrgEnv-intel
- intel/15.0.109
As mentioned earlier, the compiler paths will look different on a Cray system.
Since most compilers are invoked using cc, CC and ftn, the paths for each
compiler are replaced with their respective Cray compiler wrapper names:
.. code-block:: yaml
paths:
cc: cc
cxx: CC
f77: ftn
fc: ftn
These wrapper names are used in place of explicit paths to the compiler
executables, which allows Spack to call the Cray compiler wrappers at build time.
For more on compiler configuration, check out :ref:`compiler-config`.
Spack sets the default Cray link type to dynamic, to better match other
platforms. Individual packages can enable static linking (which is the
default outside of Spack on Cray systems) using the ``-static`` flag.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Setting defaults and using Cray modules
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
If you want to use the default compilers for each PrgEnv and also be able
to load Cray external modules, you will need to set up a ``packages.yaml``.
Here's an example of an external configuration for Cray modules:
.. code-block:: yaml
packages:
mpich:
externals:
- spec: "mpich@7.3.1%gcc@5.2.0 arch=cray_xc-haswell-CNL10"
modules:
- cray-mpich
- spec: "mpich@7.3.1%intel@16.0.0.109 arch=cray_xc-haswell-CNL10"
modules:
- cray-mpich
all:
providers:
mpi: [mpich]
This tells Spack to load the cray-mpich module into the environment of any
package that depends on mpi. You can then use whatever environment
variables, libraries, etc., are brought into the environment
via module load.
.. note::
For Cray-provided packages, it is best to use ``modules:`` instead of ``prefix:``
in ``packages.yaml``, because the Cray Programming Environment heavily relies on
modules (e.g., loading the ``cray-mpich`` module adds MPI libraries to the
compiler wrapper link line).
You can set the default compiler that Spack will use for each compiler type.
If you want to use the Cray defaults, set them under ``all:`` in packages.yaml.
In the compiler field, list the compiler specs in your order of preference.
Whenever you build with that compiler type, Spack will concretize to that version.
Here is an example of a full packages.yaml used at NERSC:
.. code-block:: yaml
packages:
mpich:
externals:
- spec: "mpich@7.3.1%gcc@5.2.0 arch=cray_xc-CNL10-ivybridge"
modules:
- cray-mpich
- spec: "mpich@7.3.1%intel@16.0.0.109 arch=cray_xc-SuSE11-ivybridge"
modules:
- cray-mpich
buildable: False
netcdf:
externals:
- spec: "netcdf@4.3.3.1%gcc@5.2.0 arch=cray_xc-CNL10-ivybridge"
modules:
- cray-netcdf
- spec: "netcdf@4.3.3.1%intel@16.0.0.109 arch=cray_xc-CNL10-ivybridge"
modules:
- cray-netcdf
buildable: False
hdf5:
externals:
- spec: "hdf5@1.8.14%gcc@5.2.0 arch=cray_xc-CNL10-ivybridge"
modules:
- cray-hdf5
- spec: "hdf5@1.8.14%intel@16.0.0.109 arch=cray_xc-CNL10-ivybridge"
modules:
- cray-hdf5
buildable: False
all:
compiler: [gcc@5.2.0, intel@16.0.0.109]
providers:
mpi: [mpich]
Here we tell Spack that whenever we build with gcc, it should use version 5.2.0, and
whenever we build with the Intel compilers, it should use version 16.0.0.109. We add a
spec for each compiler type for each of the Cray modules. This ensures that for each
compiler on our system we can use the corresponding external module.
For more on external packages check out the section :ref:`sec-external-packages`.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Using Linux containers on Cray machines
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Spack uses environment variables particular to the Cray programming
environment to determine which systems are Cray platforms. These
environment variables may be propagated into containers that are not
using the Cray programming environment.
To ensure that Spack does not autodetect the Cray programming
environment, unset the environment variable ``MODULEPATH``. This
will cause Spack to treat a Linux container on a Cray system as a base
Linux distribution.
.. _windows_support:
----------------

View File

@@ -476,3 +476,9 @@ implemented using Python's built-in `sys.path
:py:mod:`spack.repo` module implements a custom `Python importer
<https://docs.python.org/2/library/imp.html>`_.
.. warning::
The mechanism for extending packages is not yet extensively tested,
and extending packages across repositories imposes inter-repo
dependencies, which may be hard to manage. Use this feature at your
own risk, but let us know if you have a use case for it.

View File

@@ -4,9 +4,9 @@ sphinx_design==0.5.0
sphinx-rtd-theme==2.0.0
python-levenshtein==0.25.1
docutils==0.20.1
pygments==2.18.0
pygments==2.17.2
urllib3==2.2.1
pytest==8.2.1
pytest==8.2.0
isort==5.13.2
black==24.4.2
flake8==7.0.0

View File

@@ -18,7 +18,7 @@
* Homepage: https://pypi.python.org/pypi/archspec
* Usage: Labeling, comparison and detection of microarchitectures
* Version: 0.2.4 (commit 48b92512b9ce203ded0ebd1ac41b42593e931f7c)
* Version: 0.2.5-dev (commit cbb1fd5eb397a70d466e5160b393b87b0dbcc78f)
astunparse
----------------

View File

@@ -47,7 +47,11 @@ def decorator(factory):
def partial_uarch(
name: str = "", vendor: str = "", features: Optional[Set[str]] = None, generation: int = 0
name: str = "",
vendor: str = "",
features: Optional[Set[str]] = None,
generation: int = 0,
cpu_part: str = "",
) -> Microarchitecture:
"""Construct a partial microarchitecture, from information gathered during system scan."""
return Microarchitecture(
@@ -57,6 +61,7 @@ def partial_uarch(
features=features or set(),
compilers={},
generation=generation,
cpu_part=cpu_part,
)
@@ -90,6 +95,7 @@ def proc_cpuinfo() -> Microarchitecture:
return partial_uarch(
vendor=_canonicalize_aarch64_vendor(data),
features=_feature_set(data, key="Features"),
cpu_part=data.get("CPU part", ""),
)
if architecture in (PPC64LE, PPC64):
@@ -345,6 +351,10 @@ def sorting_fn(item):
generic_candidates = [c for c in candidates if c.vendor == "generic"]
best_generic = max(generic_candidates, key=sorting_fn)
# Relevant for AArch64. Filter on "cpu_part" if we have any match
if info.cpu_part != "" and any(c for c in candidates if info.cpu_part == c.cpu_part):
candidates = [c for c in candidates if info.cpu_part == c.cpu_part]
# Filter the candidates to be descendant of the best generic candidate.
# This is to avoid that the lack of a niche feature that can be disabled
# from e.g. BIOS prevents detection of a reasonably performant architecture
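A standalone sketch of the ``cpu_part`` filter above, with hypothetical candidates:
class _Candidate:
    def __init__(self, name, cpu_part):
        self.name, self.cpu_part = name, cpu_part

candidates = [_Candidate("neoverse_n1", "0xd0c"), _Candidate("neoverse_v2", "0xd4f")]
detected = "0xd4f"  # the "CPU part" field read from /proc/cpuinfo
if detected != "" and any(c for c in candidates if detected == c.cpu_part):
    candidates = [c for c in candidates if detected == c.cpu_part]
assert [c.name for c in candidates] == ["neoverse_v2"]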

View File

@@ -2,9 +2,7 @@
# Archspec Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Types and functions to manage information
on CPU microarchitectures.
"""
"""Types and functions to manage information on CPU microarchitectures."""
import functools
import platform
import re
@@ -65,21 +63,24 @@ class Microarchitecture:
passed in as argument above.
* versions: versions that support this micro-architecture.
generation (int): generation of the micro-architecture, if
relevant.
generation (int): generation of the micro-architecture, if relevant.
cpu_part (str): cpu part of the architecture, if relevant.
"""
# pylint: disable=too-many-arguments
# pylint: disable=too-many-arguments,too-many-instance-attributes
#: Aliases for micro-architecture's features
feature_aliases = FEATURE_ALIASES
def __init__(self, name, parents, vendor, features, compilers, generation=0):
def __init__(self, name, parents, vendor, features, compilers, generation=0, cpu_part=""):
self.name = name
self.parents = parents
self.vendor = vendor
self.features = features
self.compilers = compilers
# Only relevant for PowerPC
self.generation = generation
# Only relevant for AArch64
self.cpu_part = cpu_part
# Cache the ancestor computation
self._ancestors = None
@@ -111,6 +112,7 @@ def __eq__(self, other):
and self.parents == other.parents # avoid ancestors here
and self.compilers == other.compilers
and self.generation == other.generation
and self.cpu_part == other.cpu_part
)
@coerce_target_names
@@ -143,7 +145,8 @@ def __repr__(self):
cls_name = self.__class__.__name__
fmt = (
cls_name + "({0.name!r}, {0.parents!r}, {0.vendor!r}, "
"{0.features!r}, {0.compilers!r}, {0.generation!r})"
"{0.features!r}, {0.compilers!r}, generation={0.generation!r}, "
"cpu_part={0.cpu_part!r})"
)
return fmt.format(self)
@@ -190,6 +193,7 @@ def to_dict(self):
"generation": self.generation,
"parents": [str(x) for x in self.parents],
"compilers": self.compilers,
"cpupart": self.cpu_part,
}
@staticmethod
@@ -202,6 +206,7 @@ def from_dict(data) -> "Microarchitecture":
features=set(data["features"]),
compilers=data.get("compilers", {}),
generation=data.get("generation", 0),
cpu_part=data.get("cpupart", ""),
)
def optimization_flags(self, compiler, version):
@@ -360,8 +365,11 @@ def fill_target_from_dict(name, data, targets):
features = set(values["features"])
compilers = values.get("compilers", {})
generation = values.get("generation", 0)
cpu_part = values.get("cpupart", "")
targets[name] = Microarchitecture(name, parents, vendor, features, compilers, generation)
targets[name] = Microarchitecture(
name, parents, vendor, features, compilers, generation=generation, cpu_part=cpu_part
)
known_targets = {}
data = archspec.cpu.schema.TARGETS_JSON["microarchitectures"]
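A brief usage sketch, assuming the updated targets data shipped with this vendored archspec:
import archspec.cpu

uarch = archspec.cpu.TARGETS["neoverse_v2"]
print(uarch.cpu_part)              # "0xd4f" with the updated data
print(uarch.to_dict()["cpupart"])  # serialized under the "cpupart" key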

View File

@@ -2225,10 +2225,14 @@
],
"nvhpc": [
{
"versions": "21.11:",
"versions": "21.11:23.8",
"name": "zen3",
"flags": "-tp {name}",
"warnings": "zen4 is not fully supported by nvhpc yet, falling back to zen3"
"warnings": "zen4 is not fully supported by nvhpc versions < 23.9, falling back to zen3"
},
{
"versions": "23.9:",
"flags": "-tp {name}"
}
]
}
@@ -2711,7 +2715,8 @@
"flags": "-mcpu=thunderx2t99"
}
]
}
},
"cpupart": "0x0af"
},
"a64fx": {
"from": ["armv8.2a"],
@@ -2779,7 +2784,8 @@
"flags": "-march=armv8.2-a+crc+crypto+fp16+sve"
}
]
}
},
"cpupart": "0x001"
},
"cortex_a72": {
"from": ["aarch64"],
@@ -2816,7 +2822,8 @@
"flags" : "-mcpu=cortex-a72"
}
]
}
},
"cpupart": "0xd08"
},
"neoverse_n1": {
"from": ["cortex_a72", "armv8.2a"],
@@ -2837,8 +2844,7 @@
"asimdrdm",
"lrcpc",
"dcpop",
"asimddp",
"ssbs"
"asimddp"
],
"compilers" : {
"gcc": [
@@ -2902,7 +2908,8 @@
"flags": "-tp {name}"
}
]
}
},
"cpupart": "0xd0c"
},
"neoverse_v1": {
"from": ["neoverse_n1", "armv8.4a"],
@@ -2926,8 +2933,6 @@
"lrcpc",
"dcpop",
"sha3",
"sm3",
"sm4",
"asimddp",
"sha512",
"sve",
@@ -2936,7 +2941,6 @@
"uscat",
"ilrcpc",
"flagm",
"ssbs",
"dcpodp",
"svei8mm",
"svebf16",
@@ -3004,7 +3008,7 @@
},
{
"versions": "11:",
"flags" : "-march=armv8.4-a+sve+ssbs+fp16+bf16+crypto+i8mm+rng"
"flags" : "-march=armv8.4-a+sve+fp16+bf16+crypto+i8mm+rng"
},
{
"versions": "12:",
@@ -3028,7 +3032,8 @@
"flags": "-tp {name}"
}
]
}
},
"cpupart": "0xd40"
},
"neoverse_v2": {
"from": ["neoverse_n1", "armv9.0a"],
@@ -3052,32 +3057,22 @@
"lrcpc",
"dcpop",
"sha3",
"sm3",
"sm4",
"asimddp",
"sha512",
"sve",
"asimdfhm",
"dit",
"uscat",
"ilrcpc",
"flagm",
"ssbs",
"sb",
"dcpodp",
"sve2",
"sveaes",
"svepmull",
"svebitperm",
"svesha3",
"svesm4",
"flagm2",
"frint",
"svei8mm",
"svebf16",
"i8mm",
"bf16",
"dgh"
"bf16"
],
"compilers" : {
"gcc": [
@@ -3102,15 +3097,19 @@
"flags" : "-march=armv8.5-a+sve -mtune=cortex-a76"
},
{
"versions": "10.0:11.99",
"versions": "10.0:11.3.99",
"flags" : "-march=armv8.5-a+sve+sve2+i8mm+bf16 -mtune=cortex-a77"
},
{
"versions": "11.4:11.99",
"flags" : "-mcpu=neoverse-v2"
},
{
"versions": "12.0:12.99",
"versions": "12.0:12.2.99",
"flags" : "-march=armv9-a+i8mm+bf16 -mtune=cortex-a710"
},
{
"versions": "13.0:",
"versions": "12.3:",
"flags" : "-mcpu=neoverse-v2"
}
],
@@ -3145,7 +3144,112 @@
"flags": "-tp {name}"
}
]
}
},
"cpupart": "0xd4f"
},
"neoverse_n2": {
"from": ["neoverse_n1", "armv9.0a"],
"vendor": "ARM",
"features": [
"fp",
"asimd",
"evtstrm",
"aes",
"pmull",
"sha1",
"sha2",
"crc32",
"atomics",
"fphp",
"asimdhp",
"cpuid",
"asimdrdm",
"jscvt",
"fcma",
"lrcpc",
"dcpop",
"sha3",
"asimddp",
"sha512",
"sve",
"asimdfhm",
"uscat",
"ilrcpc",
"flagm",
"sb",
"dcpodp",
"sve2",
"flagm2",
"frint",
"svei8mm",
"svebf16",
"i8mm",
"bf16"
],
"compilers" : {
"gcc": [
{
"versions": "4.8:5.99",
"flags": "-march=armv8-a"
},
{
"versions": "6:6.99",
"flags" : "-march=armv8.1-a"
},
{
"versions": "7.0:7.99",
"flags" : "-march=armv8.2-a -mtune=cortex-a72"
},
{
"versions": "8.0:8.99",
"flags" : "-march=armv8.4-a+sve -mtune=cortex-a72"
},
{
"versions": "9.0:9.99",
"flags" : "-march=armv8.5-a+sve -mtune=cortex-a76"
},
{
"versions": "10.0:10.99",
"flags" : "-march=armv8.5-a+sve+sve2+i8mm+bf16 -mtune=cortex-a77"
},
{
"versions": "11.0:",
"flags" : "-mcpu=neoverse-n2"
}
],
"clang" : [
{
"versions": "9.0:10.99",
"flags" : "-march=armv8.5-a+sve"
},
{
"versions": "11.0:13.99",
"flags" : "-march=armv8.5-a+sve+sve2+i8mm+bf16"
},
{
"versions": "14.0:15.99",
"flags" : "-march=armv9-a+i8mm+bf16"
},
{
"versions": "16.0:",
"flags" : "-mcpu=neoverse-n2"
}
],
"arm" : [
{
"versions": "23.04.0:",
"flags" : "-mcpu=neoverse-n2"
}
],
"nvhpc" : [
{
"versions": "23.3:",
"name": "neoverse-n1",
"flags": "-tp {name}"
}
]
},
"cpupart": "0xd49"
},
"m1": {
"from": ["armv8.4a"],
@@ -3211,7 +3315,8 @@
"flags" : "-mcpu=apple-m1"
}
]
}
},
"cpupart": "0x022"
},
"m2": {
"from": ["m1", "armv8.5a"],
@@ -3289,7 +3394,8 @@
"flags" : "-mcpu=apple-m2"
}
]
}
},
"cpupart": "0x032"
},
"arm": {
"from": [],

View File

@@ -52,6 +52,9 @@
}
}
}
},
"cpupart": {
"type": "string"
}
},
"required": [
@@ -107,4 +110,4 @@
"additionalProperties": false
}
}
}
}

View File

@@ -766,6 +766,7 @@ def copy_tree(
src: str,
dest: str,
symlinks: bool = True,
allow_broken_symlinks: bool = sys.platform != "win32",
ignore: Optional[Callable[[str], bool]] = None,
_permissions: bool = False,
):
@@ -788,6 +789,8 @@ def copy_tree(
src (str): the directory to copy
dest (str): the destination directory
symlinks (bool): whether or not to preserve symlinks
allow_broken_symlinks (bool): whether or not to allow broken (dangling) symlinks.
On Windows, setting this to True will raise an exception. Defaults to True on non-Windows platforms.
ignore (typing.Callable): function indicating which files to ignore
_permissions (bool): for internal use only
@@ -795,6 +798,8 @@ def copy_tree(
IOError: if *src* does not match any files or directories
ValueError: if *src* is a parent directory of *dest*
"""
if allow_broken_symlinks and sys.platform == "win32":
raise llnl.util.symlink.SymlinkError("Cannot allow broken symlinks on Windows!")
if _permissions:
tty.debug("Installing {0} to {1}".format(src, dest))
else:
@@ -867,14 +872,16 @@ def escaped_path(path):
copy_mode(s, d)
for target, d, s in links:
symlink(target, d)
symlink(target, d, allow_broken_symlinks=allow_broken_symlinks)
if _permissions:
set_install_permissions(d)
copy_mode(s, d)
@system_path_filter
def install_tree(src, dest, symlinks=True, ignore=None):
def install_tree(
src, dest, symlinks=True, ignore=None, allow_broken_symlinks=sys.platform != "win32"
):
"""Recursively install an entire directory tree rooted at *src*.
Same as :py:func:`copy_tree` with the addition of setting proper
@@ -885,12 +892,21 @@ def install_tree(src, dest, symlinks=True, ignore=None):
dest (str): the destination directory
symlinks (bool): whether or not to preserve symlinks
ignore (typing.Callable): function indicating which files to ignore
allow_broken_symlinks (bool): whether or not to allow broken (dangling) symlinks.
On Windows, setting this to True will raise an exception.
Raises:
IOError: if *src* does not match any files or directories
ValueError: if *src* is a parent directory of *dest*
"""
copy_tree(src, dest, symlinks=symlinks, ignore=ignore, _permissions=True)
copy_tree(
src,
dest,
symlinks=symlinks,
allow_broken_symlinks=allow_broken_symlinks,
ignore=ignore,
_permissions=True,
)
@system_path_filter
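A hypothetical usage sketch of the new keyword (the paths here are made up):
import sys
from llnl.util.filesystem import install_tree

# Preserve dangling symlinks on POSIX; never allow them on Windows,
# where copy_tree raises a SymlinkError instead.
install_tree(
    "/tmp/stage/prefix",
    "/opt/spack/opt/pkg-1.0",
    allow_broken_symlinks=(sys.platform != "win32"),
)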

View File

@@ -8,7 +8,6 @@
import subprocess
import sys
import tempfile
from typing import Union
from llnl.util import lang, tty
@@ -17,66 +16,92 @@
if sys.platform == "win32":
from win32file import CreateHardLink
is_windows = sys.platform == "win32"
def _windows_symlink(
src: str, dst: str, target_is_directory: bool = False, *, dir_fd: Union[int, None] = None
):
"""On Windows with System Administrator privileges this will be a normal symbolic link via
os.symlink. On Windows without privledges the link will be a junction for a directory and a
hardlink for a file. On Windows the various link types are:
Symbolic Link: A link to a file or directory on the same or different volume (drive letter) or
even to a remote file or directory (using UNC in its path). Need System Administrator
privileges to make these.
def symlink(source_path: str, link_path: str, allow_broken_symlinks: bool = not is_windows):
"""
Create a link.
Hard Link: A link to a file on the same volume (drive letter) only. Every file (file's data)
has at least 1 hard link (file's name). But when this method creates a new hard link there will
be 2. Deleting all hard links effectively deletes the file. Don't need System Administrator
privileges.
On non-Windows and Windows with System Administrator
privileges this will be a normal symbolic link via
os.symlink.
Junction: A link to a directory on the same or different volume (drive letter) but not to a
remote directory. Don't need System Administrator privileges."""
source_path = os.path.normpath(src)
On Windows without privileges the link will be a
junction for a directory and a hardlink for a file.
On Windows the various link types are:
Symbolic Link: A link to a file or directory on the
same or different volume (drive letter) or even to
a remote file or directory (using UNC in its path).
Need System Administrator privileges to make these.
Hard Link: A link to a file on the same volume (drive
letter) only. Every file (file's data) has at least 1
hard link (file's name). But when this method creates
a new hard link there will be 2. Deleting all hard
links effectively deletes the file. Don't need System
Administrator privileges.
Junction: A link to a directory on the same or different
volume (drive letter) but not to a remote directory. Don't
need System Administrator privileges.
Parameters:
source_path (str): The real file or directory that the link points to.
Must be absolute OR relative to the link.
link_path (str): The path where the link will exist.
allow_broken_symlinks (bool): On Linux or Mac, don't raise an exception if the source_path
doesn't exist. This will still raise an exception on Windows.
"""
source_path = os.path.normpath(source_path)
win_source_path = source_path
link_path = os.path.normpath(dst)
link_path = os.path.normpath(link_path)
# Perform basic checks to make sure symlinking will succeed
if os.path.lexists(link_path):
raise AlreadyExistsError(f"Link path ({link_path}) already exists. Cannot create link.")
# Never allow broken links on Windows.
if sys.platform == "win32" and allow_broken_symlinks:
raise ValueError("allow_broken_symlinks parameter cannot be True on Windows.")
if not os.path.exists(source_path):
if os.path.isabs(source_path):
# An absolute source path that does not exist will result in a broken link.
raise SymlinkError(
f"Source path ({source_path}) is absolute but does not exist. Resulting "
f"link would be broken so not making link."
if not allow_broken_symlinks:
# Perform basic checks to make sure symlinking will succeed
if os.path.lexists(link_path):
raise AlreadyExistsError(
f"Link path ({link_path}) already exists. Cannot create link."
)
else:
# os.symlink can create a link when the given source path is relative to
# the link path. Emulate this behavior and check to see if the source exists
# relative to the link path ahead of link creation to prevent broken
# links from being made.
link_parent_dir = os.path.dirname(link_path)
relative_path = os.path.join(link_parent_dir, source_path)
if os.path.exists(relative_path):
# In order to work on Windows, the source path needs to be modified to be
# relative because hardlink/junction don't resolve relative paths the same
# way as os.symlink. This is ignored on other operating systems.
win_source_path = relative_path
else:
if not os.path.exists(source_path):
if os.path.isabs(source_path) and not allow_broken_symlinks:
# An absolute source path that does not exist will result in a broken link.
raise SymlinkError(
f"The source path ({source_path}) is not relative to the link path "
f"({link_path}). Resulting link would be broken so not making link."
f"Source path ({source_path}) is absolute but does not exist. Resulting "
f"link would be broken so not making link."
)
else:
# os.symlink can create a link when the given source path is relative to
# the link path. Emulate this behavior and check to see if the source exists
# relative to the link path ahead of link creation to prevent broken
# links from being made.
link_parent_dir = os.path.dirname(link_path)
relative_path = os.path.join(link_parent_dir, source_path)
if os.path.exists(relative_path):
# In order to work on Windows, the source path needs to be modified to be
# relative because hardlink/junction don't resolve relative paths the same
# way as os.symlink. This is ignored on other operating systems.
win_source_path = relative_path
elif not allow_broken_symlinks:
raise SymlinkError(
f"The source path ({source_path}) is not relative to the link path "
f"({link_path}). Resulting link would be broken so not making link."
)
# Create the symlink
if not _windows_can_symlink():
if sys.platform == "win32" and not _windows_can_symlink():
_windows_create_link(win_source_path, link_path)
else:
os.symlink(source_path, link_path, target_is_directory=os.path.isdir(source_path))
def _windows_islink(path: str) -> bool:
def islink(path: str) -> bool:
"""Override os.islink to give correct answer for spack logic.
For Non-Windows: a link can be determined with the os.path.islink method.
@@ -244,7 +269,7 @@ def _windows_create_hard_link(path: str, link: str):
CreateHardLink(link, path)
def _windows_readlink(path: str, *, dir_fd=None):
def readlink(path: str, *, dir_fd=None):
"""Spack utility to override of os.readlink method to work cross platform"""
if _windows_is_hardlink(path):
return _windows_read_hard_link(path)
@@ -313,16 +338,6 @@ def resolve_link_target_relative_to_the_link(link):
return os.path.join(link_dir, target)
if sys.platform == "win32":
symlink = _windows_symlink
readlink = _windows_readlink
islink = _windows_islink
else:
symlink = os.symlink
readlink = os.readlink
islink = os.path.islink
class SymlinkError(RuntimeError):
"""Exception class for errors raised while creating symlinks,
junctions and hard links
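A minimal usage sketch of the reworked ``symlink`` (paths are hypothetical):
from llnl.util.symlink import SymlinkError, symlink

try:
    # A relative source is resolved relative to the link's directory,
    # mirroring os.symlink semantics.
    symlink("real_file.txt", "/tmp/view/link.txt", allow_broken_symlinks=False)
except SymlinkError as e:
    print(f"refusing to create a broken link: {e}")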

View File

@@ -4,7 +4,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
#: PEP440 canonical <major>.<minor>.<micro>.<devN> string
__version__ = "0.23.0.dev0"
__version__ = "0.22.2"
spack_version = __version__

View File

@@ -421,10 +421,6 @@ def _check_patch_urls(pkgs, error_cls):
r"^https?://(?:patch-diff\.)?github(?:usercontent)?\.com/"
r".+/.+/(?:commit|pull)/[a-fA-F0-9]+\.(?:patch|diff)"
)
github_pull_commits_re = (
r"^https?://(?:patch-diff\.)?github(?:usercontent)?\.com/"
r".+/.+/pull/\d+/commits/[a-fA-F0-9]+\.(?:patch|diff)"
)
# Only .diff URLs have stable/full hashes:
# https://forum.gitlab.com/t/patches-with-full-index/29313
gitlab_patch_url_re = (
@@ -440,24 +436,14 @@ def _check_patch_urls(pkgs, error_cls):
if not isinstance(patch, spack.patch.UrlPatch):
continue
if re.match(github_pull_commits_re, patch.url):
url = re.sub(r"/pull/\d+/commits/", r"/commit/", patch.url)
url = re.sub(r"^(.*)(?<!full_index=1)$", r"\1?full_index=1", url)
errors.append(
error_cls(
f"patch URL in package {pkg_cls.name} "
+ "must not be a pull request commit; "
+ f"instead use {url}",
[patch.url],
)
)
elif re.match(github_patch_url_re, patch.url):
if re.match(github_patch_url_re, patch.url):
full_index_arg = "?full_index=1"
if not patch.url.endswith(full_index_arg):
errors.append(
error_cls(
f"patch URL in package {pkg_cls.name} "
+ f"must end with {full_index_arg}",
"patch URL in package {0} must end with {1}".format(
pkg_cls.name, full_index_arg
),
[patch.url],
)
)
@@ -465,7 +451,9 @@ def _check_patch_urls(pkgs, error_cls):
if not patch.url.endswith(".diff"):
errors.append(
error_cls(
f"patch URL in package {pkg_cls.name} must end with .diff",
"patch URL in package {0} must end with .diff".format(
pkg_cls.name
),
[patch.url],
)
)
@@ -791,7 +779,7 @@ def check_virtual_with_variants(spec, msg):
return
error = error_cls(
f"{pkg_name}: {msg}",
f"remove variants from '{spec}' in depends_on directive in {filename}",
[f"remove variants from '{spec}' in depends_on directive in {filename}"],
)
errors.append(error)

View File

@@ -23,7 +23,6 @@
import warnings
from contextlib import closing
from typing import Dict, Iterable, List, NamedTuple, Optional, Set, Tuple
from urllib.error import HTTPError, URLError
import llnl.util.filesystem as fsys
import llnl.util.lang
@@ -899,9 +898,8 @@ def url_read_method(url):
try:
_, _, spec_file = web_util.read_from_url(url)
contents = codecs.getreader("utf-8")(spec_file).read()
except (URLError, web_util.SpackWebError) as url_err:
tty.error("Error reading specfile: {0}".format(url))
tty.error(url_err)
except web_util.SpackWebError as e:
tty.error(f"Error reading specfile: {url}: {e}")
return contents
try:
@@ -2041,21 +2039,17 @@ def try_direct_fetch(spec, mirrors=None):
try:
_, _, fs = web_util.read_from_url(buildcache_fetch_url_signed_json)
specfile_is_signed = True
except (URLError, web_util.SpackWebError, HTTPError) as url_err:
except web_util.SpackWebError as e1:
try:
_, _, fs = web_util.read_from_url(buildcache_fetch_url_json)
except (URLError, web_util.SpackWebError, HTTPError) as url_err_x:
except web_util.SpackWebError as e2:
tty.debug(
"Did not find {0} on {1}".format(
specfile_name, buildcache_fetch_url_signed_json
),
url_err,
f"Did not find {specfile_name} on {buildcache_fetch_url_signed_json}",
e1,
level=2,
)
tty.debug(
"Did not find {0} on {1}".format(specfile_name, buildcache_fetch_url_json),
url_err_x,
level=2,
f"Did not find {specfile_name} on {buildcache_fetch_url_json}", e2, level=2
)
continue
specfile_contents = codecs.getreader("utf-8")(fs).read()
@@ -2140,6 +2134,9 @@ def get_keys(install=False, trust=False, force=False, mirrors=None):
for mirror in mirror_collection.values():
fetch_url = mirror.fetch_url
# TODO: oci:// does not support signing.
if fetch_url.startswith("oci://"):
continue
keys_url = url_util.join(
fetch_url, BUILD_CACHE_RELATIVE_PATH, BUILD_CACHE_KEYS_RELATIVE_PATH
)
@@ -2150,19 +2147,12 @@ def get_keys(install=False, trust=False, force=False, mirrors=None):
try:
_, _, json_file = web_util.read_from_url(keys_index)
json_index = sjson.load(codecs.getreader("utf-8")(json_file))
except (URLError, web_util.SpackWebError) as url_err:
except web_util.SpackWebError as url_err:
if web_util.url_exists(keys_index):
err_msg = [
"Unable to find public keys in {0},",
" caught exception attempting to read from {1}.",
]
tty.error(
"".join(err_msg).format(
url_util.format(fetch_url), url_util.format(keys_index)
)
f"Unable to find public keys in {url_util.format(fetch_url)},"
f" caught exception attempting to read from {url_util.format(keys_index)}."
)
tty.debug(url_err)
continue
@@ -2442,7 +2432,7 @@ def get_remote_hash(self):
url_index_hash = url_util.join(self.url, BUILD_CACHE_RELATIVE_PATH, "index.json.hash")
try:
response = self.urlopen(urllib.request.Request(url_index_hash, headers=self.headers))
except urllib.error.URLError:
except (TimeoutError, urllib.error.URLError):
return None
# Validate the hash
@@ -2464,7 +2454,7 @@ def conditional_fetch(self) -> FetchIndexResult:
try:
response = self.urlopen(urllib.request.Request(url_index, headers=self.headers))
except urllib.error.URLError as e:
except (TimeoutError, urllib.error.URLError) as e:
raise FetchIndexError("Could not fetch index from {}".format(url_index), e) from e
try:
@@ -2505,10 +2495,7 @@ def __init__(self, url, etag, urlopen=web_util.urlopen):
def conditional_fetch(self) -> FetchIndexResult:
# Just do a conditional fetch immediately
url = url_util.join(self.url, BUILD_CACHE_RELATIVE_PATH, "index.json")
headers = {
"User-Agent": web_util.SPACK_USER_AGENT,
"If-None-Match": '"{}"'.format(self.etag),
}
headers = {"User-Agent": web_util.SPACK_USER_AGENT, "If-None-Match": f'"{self.etag}"'}
try:
response = self.urlopen(urllib.request.Request(url, headers=headers))
@@ -2516,14 +2503,14 @@ def conditional_fetch(self) -> FetchIndexResult:
if e.getcode() == 304:
# Not modified; that means fresh.
return FetchIndexResult(etag=None, hash=None, data=None, fresh=True)
raise FetchIndexError("Could not fetch index {}".format(url), e) from e
except urllib.error.URLError as e:
raise FetchIndexError("Could not fetch index {}".format(url), e) from e
raise FetchIndexError(f"Could not fetch index {url}", e) from e
except (TimeoutError, urllib.error.URLError) as e:
raise FetchIndexError(f"Could not fetch index {url}", e) from e
try:
result = codecs.getreader("utf-8")(response).read()
except ValueError as e:
raise FetchIndexError("Remote index {} is invalid".format(url), e) from e
raise FetchIndexError(f"Remote index {url} is invalid", e) from e
headers = response.headers
etag_header_value = headers.get("Etag", None) or headers.get("etag", None)
@@ -2554,21 +2541,19 @@ def conditional_fetch(self) -> FetchIndexResult:
headers={"Accept": "application/vnd.oci.image.manifest.v1+json"},
)
)
except urllib.error.URLError as e:
raise FetchIndexError(
"Could not fetch manifest from {}".format(url_manifest), e
) from e
except (TimeoutError, urllib.error.URLError) as e:
raise FetchIndexError(f"Could not fetch manifest from {url_manifest}", e) from e
try:
manifest = json.loads(response.read())
except Exception as e:
raise FetchIndexError("Remote index {} is invalid".format(url_manifest), e) from e
raise FetchIndexError(f"Remote index {url_manifest} is invalid", e) from e
# Get first blob hash, which should be the index.json
try:
index_digest = spack.oci.image.Digest.from_string(manifest["layers"][0]["digest"])
except Exception as e:
raise FetchIndexError("Remote index {} is invalid".format(url_manifest), e) from e
raise FetchIndexError(f"Remote index {url_manifest} is invalid", e) from e
# Fresh?
if index_digest.digest == self.local_hash:
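For context on these ``except`` changes: since Python 3.10, ``socket.timeout`` is an alias of ``TimeoutError``, and a timeout raised from ``ssl.py`` while reading a response is not wrapped in ``URLError``, so it must be caught separately. A minimal sketch of the pattern (the ``fetch`` helper is hypothetical):
import urllib.error
import urllib.request

def fetch(url: str, timeout: float = 10.0) -> bytes:
    try:
        with urllib.request.urlopen(url, timeout=timeout) as response:
            return response.read()
    except (TimeoutError, urllib.error.URLError) as e:
        raise RuntimeError(f"could not fetch {url}") from e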

View File

@@ -213,18 +213,15 @@ def _root_spec(spec_str: str) -> str:
Args:
spec_str: spec to be bootstrapped. Must be without compiler and target.
"""
# Add a compiler and platform requirement to the root spec.
# Add a compiler requirement to the root spec.
platform = str(spack.platforms.host())
if platform == "darwin":
spec_str += " %apple-clang"
elif platform == "windows":
spec_str += " %msvc"
elif platform == "linux":
spec_str += " %gcc"
elif platform == "freebsd":
spec_str += " %clang"
spec_str += f" platform={platform}"
target = archspec.cpu.host().family
spec_str += f" target={target}"

View File

@@ -72,6 +72,7 @@
import spack.store
import spack.subprocess_context
import spack.user_environment
import spack.util.executable
import spack.util.path
import spack.util.pattern
from spack import traverse
@@ -91,7 +92,7 @@
)
from spack.util.executable import Executable
from spack.util.log_parse import make_log_context, parse_log_events
from spack.util.module_cmd import load_module, path_from_modules
from spack.util.module_cmd import load_module, module, path_from_modules
#
# This can be set by the user to globally disable parallel builds.
@@ -190,6 +191,14 @@ def __call__(self, *args, **kwargs):
return super().__call__(*args, **kwargs)
def _on_cray():
host_platform = spack.platforms.host()
host_os = host_platform.operating_system("default_os")
on_cray = str(host_platform) == "cray"
using_cnl = re.match(r"cnl\d+", str(host_os))
return on_cray, using_cnl
def clean_environment():
# Stuff in here sanitizes the build environment to eliminate
# anything the user has set that may interfere. We apply it immediately
@@ -233,6 +242,17 @@ def clean_environment():
if varname.endswith("_ROOT") and varname != "SPACK_ROOT":
env.unset(varname)
# On Cray "cluster" systems, unset CRAY_LD_LIBRARY_PATH to avoid
# interference with Spack dependencies.
# CNL requires these variables to be set (or at least some of them,
# depending on the CNL version).
on_cray, using_cnl = _on_cray()
if on_cray and not using_cnl:
env.unset("CRAY_LD_LIBRARY_PATH")
for varname in os.environ.keys():
if "PKGCONF" in varname:
env.unset(varname)
# Unset the following variables because they can affect installation of
# Autotools and CMake packages.
build_system_vars = [
@@ -362,7 +382,11 @@ def set_compiler_environment_variables(pkg, env):
_add_werror_handling(keep_werror, env)
# Set the target parameters that the compiler will add
isa_arg = spec.architecture.target.optimization_flags(compiler)
# Don't set on cray platform because the targeting module handles this
if spec.satisfies("platform=cray"):
isa_arg = ""
else:
isa_arg = spec.architecture.target.optimization_flags(compiler)
env.set("SPACK_TARGET_ARGS", isa_arg)
# Trap spack-tracked compiler flags as appropriate.
@@ -456,12 +480,12 @@ def set_wrapper_variables(pkg, env):
env.set(SPACK_DEBUG_LOG_ID, pkg.spec.format("{name}-{hash:7}"))
env.set(SPACK_DEBUG_LOG_DIR, spack.main.spack_working_dir)
# Find ccache binary and hand it to build environment
if spack.config.get("config:ccache"):
ccache = Executable("ccache")
if not ccache:
raise RuntimeError("No ccache binary found in PATH")
env.set(SPACK_CCACHE_BINARY, ccache)
# Enable ccache in the compiler wrapper
env.set(SPACK_CCACHE_BINARY, spack.util.executable.which_string("ccache", required=True))
else:
# Avoid cache pollution if a build system forces `ccache <compiler wrapper invocation>`.
env.set("CCACHE_DISABLE", "1")
# Gather information about various types of dependencies
link_deps = set(pkg.spec.traverse(root=False, deptype=("link")))
@@ -810,6 +834,14 @@ def setup_package(pkg, dirty, context: Context = Context.BUILD):
for mod in pkg.compiler.modules:
load_module(mod)
# kludge to handle cray mpich and libsci being automatically loaded by
# PrgEnv modules on cray platform. Module unload does no damage when
# unnecessary
on_cray, _ = _on_cray()
if on_cray and not dirty:
for mod in ["cray-mpich", "cray-libsci"]:
module("unload", mod)
if target and target.module_name:
load_module(target.module_name)
@@ -828,7 +860,7 @@ def setup_package(pkg, dirty, context: Context = Context.BUILD):
return env_base
class EnvironmentVisitor(traverse.AbstractVisitor):
class EnvironmentVisitor:
def __init__(self, *roots: spack.spec.Spec, context: Context):
# For the roots (well, marked specs) we follow different edges
# than for their deps, depending on the context.
@@ -846,7 +878,7 @@ def __init__(self, *roots: spack.spec.Spec, context: Context):
self.root_depflag = dt.RUN | dt.LINK
def neighbors(self, item):
spec = item[0].spec
spec = item.edge.spec
if spec.dag_hash() in self.root_hashes:
depflag = self.root_depflag
else:

View File

@@ -110,8 +110,9 @@ def cuda_flags(arch_list):
# From the NVIDIA install guide we know of conflicts for particular
# platforms (linux, darwin), architectures (x86, powerpc) and compilers
# (gcc, clang). We don't restrict %gcc and %clang conflicts to
# platform=linux, since they may apply to platform=darwin. We currently
# do not provide conflicts for platform=darwin with %apple-clang.
# platform=linux, since they should also apply to platform=cray, and may
# apply to platform=darwin. We currently do not provide conflicts for
# platform=darwin with %apple-clang.
# Linux x86_64 compiler conflicts from here:
# https://gist.github.com/ax3l/9489132
@@ -136,14 +137,11 @@ def cuda_flags(arch_list):
conflicts("%gcc@11.2:", when="+cuda ^cuda@:11.5")
conflicts("%gcc@12:", when="+cuda ^cuda@:11.8")
conflicts("%gcc@13:", when="+cuda ^cuda@:12.3")
conflicts("%gcc@14:", when="+cuda ^cuda@:12.4")
conflicts("%clang@12:", when="+cuda ^cuda@:11.4.0")
conflicts("%clang@13:", when="+cuda ^cuda@:11.5")
conflicts("%clang@14:", when="+cuda ^cuda@:11.7")
conflicts("%clang@15:", when="+cuda ^cuda@:12.0")
conflicts("%clang@16:", when="+cuda ^cuda@:12.1")
conflicts("%clang@17:", when="+cuda ^cuda@:12.3")
conflicts("%clang@18:", when="+cuda ^cuda@:12.4")
conflicts("%clang@16:", when="+cuda ^cuda@:12.3")
# https://gist.github.com/ax3l/9489132#gistcomment-3860114
conflicts("%gcc@10", when="+cuda ^cuda@:11.4.0")

View File

@@ -846,7 +846,6 @@ def scalapack_libs(self):
"^mpich@2:" in spec_root
or "^cray-mpich" in spec_root
or "^mvapich2" in spec_root
or "^mvapich" in spec_root
or "^intel-mpi" in spec_root
or "^intel-oneapi-mpi" in spec_root
or "^intel-parallel-studio" in spec_root
@@ -937,15 +936,32 @@ def mpi_setup_dependent_build_environment(self, env, dependent_spec, compilers_o
"I_MPI_ROOT": self.normalize_path("mpi"),
}
compiler_wrapper_commands = self.mpi_compiler_wrappers
wrapper_vars.update(
{
"MPICC": compiler_wrapper_commands["MPICC"],
"MPICXX": compiler_wrapper_commands["MPICXX"],
"MPIF77": compiler_wrapper_commands["MPIF77"],
"MPIF90": compiler_wrapper_commands["MPIF90"],
}
)
# CAUTION - SIMILAR code in:
# var/spack/repos/builtin/packages/mpich/package.py
# var/spack/repos/builtin/packages/openmpi/package.py
# var/spack/repos/builtin/packages/mvapich2/package.py
#
# On Cray, the regular compiler wrappers *are* the MPI wrappers.
if "platform=cray" in self.spec:
# TODO: Confirm
wrapper_vars.update(
{
"MPICC": compilers_of_client["CC"],
"MPICXX": compilers_of_client["CXX"],
"MPIF77": compilers_of_client["F77"],
"MPIF90": compilers_of_client["F90"],
}
)
else:
compiler_wrapper_commands = self.mpi_compiler_wrappers
wrapper_vars.update(
{
"MPICC": compiler_wrapper_commands["MPICC"],
"MPICXX": compiler_wrapper_commands["MPICXX"],
"MPIF77": compiler_wrapper_commands["MPIF77"],
"MPIF90": compiler_wrapper_commands["MPIF90"],
}
)
# Ensure that the directory containing the compiler wrappers is in the
# PATH. Spack packages add `prefix.bin` to their dependents' paths,

View File

@@ -24,6 +24,7 @@ class MSBuildPackage(spack.package_base.PackageBase):
build_system("msbuild")
conflicts("platform=linux", when="build_system=msbuild")
conflicts("platform=darwin", when="build_system=msbuild")
conflicts("platform=cray", when="build_system=msbuild")
@spack.builder.builder("msbuild")

View File

@@ -24,6 +24,7 @@ class NMakePackage(spack.package_base.PackageBase):
build_system("nmake")
conflicts("platform=linux", when="build_system=nmake")
conflicts("platform=darwin", when="build_system=nmake")
conflicts("platform=cray", when="build_system=nmake")
@spack.builder.builder("nmake")

View File

@@ -36,8 +36,9 @@ class IntelOneApiPackage(Package):
"target=ppc64:",
"target=ppc64le:",
"target=aarch64:",
"platform=darwin",
"platform=windows",
"platform=darwin:",
"platform=cray:",
"platform=windows:",
]:
conflicts(c, msg="This package in only available for x86_64 and Linux")

View File

@@ -1111,7 +1111,7 @@ def main_script_replacements(cmd):
if cdash_handler and cdash_handler.auth_token:
try:
cdash_handler.populate_buildgroup(all_job_names)
except (SpackError, HTTPError, URLError) as err:
except (SpackError, HTTPError, URLError, TimeoutError) as err:
tty.warn(f"Problem populating buildgroup: {err}")
else:
tty.warn("Unable to populate buildgroup without CDash credentials")
@@ -2095,7 +2095,7 @@ def read_broken_spec(broken_spec_url):
"""
try:
_, _, fs = web_util.read_from_url(broken_spec_url)
except (URLError, web_util.SpackWebError, HTTPError):
except web_util.SpackWebError:
tty.warn(f"Unable to read broken spec from {broken_spec_url}")
return None

View File

@@ -813,7 +813,7 @@ def _push_oci(
def extra_config(spec: Spec):
spec_dict = spec.to_dict(hash=ht.dag_hash)
spec_dict["buildcache_layout_version"] = 1
spec_dict["buildcache_layout_version"] = bindist.CURRENT_BUILD_CACHE_LAYOUT_VERSION
spec_dict["binary_cache_checksum"] = {
"hash_algorithm": "sha256",
"hash": checksums[spec.dag_hash()].compressed_digest.digest,

View File

@@ -106,8 +106,7 @@ def clean(parser, args):
# Then do the cleaning falling through the cases
if args.specs:
specs = spack.cmd.parse_specs(args.specs, concretize=False)
specs = list(spack.cmd.matching_spec_from_env(x) for x in specs)
specs = spack.cmd.parse_specs(args.specs, concretize=True)
for spec in specs:
msg = "Cleaning build stage [{0}]"
tty.msg(msg.format(spec.short_spec))

View File

@@ -11,7 +11,6 @@
from argparse import ArgumentParser, Namespace
from typing import IO, Any, Callable, Dict, Iterable, List, Optional, Sequence, Set, Tuple, Union
import llnl.util.filesystem as fs
import llnl.util.tty as tty
from llnl.util.argparsewriter import ArgparseRstWriter, ArgparseWriter, Command
from llnl.util.tty.colify import colify
@@ -867,9 +866,6 @@ def _commands(parser: ArgumentParser, args: Namespace) -> None:
prepend_header(args, f)
formatter(args, f)
if args.update_completion:
fs.set_executable(args.update)
else:
prepend_header(args, sys.stdout)
formatter(args, sys.stdout)

View File

@@ -41,7 +41,7 @@ def setup_parser(subparser):
)
class AreDepsInstalledVisitor(traverse.AbstractVisitor):
class AreDepsInstalledVisitor:
def __init__(self, context: Context = Context.BUILD):
if context == Context.BUILD:
# TODO: run deps shouldn't be required for build env.
@@ -53,27 +53,27 @@ def __init__(self, context: Context = Context.BUILD):
self.has_uninstalled_deps = False
def accept(self, item: traverse.EdgeAndDepth) -> bool:
def accept(self, item):
# The root may be installed or uninstalled.
if item[1] == 0:
if item.depth == 0:
return True
# Early exit after we've seen an uninstalled dep.
if self.has_uninstalled_deps:
return False
spec = item[0].spec
spec = item.edge.spec
if not spec.external and not spec.installed:
self.has_uninstalled_deps = True
return False
return True
def neighbors(self, item: traverse.EdgeAndDepth):
def neighbors(self, item):
# Direct deps: follow build & test edges.
# Transitive deps: follow link / run.
depflag = self.direct_deps if item[1] == 0 else dt.LINK | dt.RUN
return item[0].spec.edges_to_dependencies(depflag=depflag)
depflag = self.direct_deps if item.depth == 0 else dt.LINK | dt.RUN
return item.edge.spec.edges_to_dependencies(depflag=depflag)
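For reference, the visitor item behaves like an ``(edge, depth)`` pair; a hypothetical stand-in for ``traverse.EdgeAndDepth`` showing the equivalent accesses:
from typing import Any, NamedTuple

class EdgeAndDepth(NamedTuple):
    edge: Any  # a DependencySpec in Spack itself
    depth: int

item = EdgeAndDepth(edge=None, depth=0)
assert item.depth == item[1] and item.edge == item[0]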
def emulate_env_utility(cmd_name, context: Context, args):

View File

@@ -468,32 +468,30 @@ def env_remove(args):
This removes an environment managed by Spack. Directory environments
and manifests embedded in repositories should be removed manually.
"""
read_envs = []
remove_envs = []
valid_envs = []
bad_envs = []
invalid_envs = []
for env_name in ev.all_environment_names():
try:
env = ev.read(env_name)
valid_envs.append(env_name)
valid_envs.append(env)
if env_name in args.rm_env:
read_envs.append(env)
remove_envs.append(env)
except (spack.config.ConfigFormatError, ev.SpackEnvironmentConfigError):
invalid_envs.append(env_name)
if env_name in args.rm_env:
bad_envs.append(env_name)
# Check if env is linked to another before trying to remove
for name in valid_envs:
# Check if remove_env is included from another env before trying to remove
for env in valid_envs:
for remove_env in remove_envs:
# don't check if an environment is included in itself
if name == env_name:
if env.name == remove_env.name:
continue
environ = ev.Environment(ev.root(name))
if ev.root(env_name) in environ.included_concrete_envs:
msg = f'Environment "{env_name}" is being used by environment "{name}"'
if remove_env.path in env.included_concrete_envs:
msg = f'Environment "{remove_env.name}" is being used by environment "{env.name}"'
if args.force:
tty.warn(msg)
else:
@@ -506,7 +504,7 @@ def env_remove(args):
if not answer:
tty.die("Will not remove any environments")
for env in read_envs:
for env in remove_envs:
name = env.name
if env.active:
tty.die(f"Environment {name} can't be removed while activated.")

View File

@@ -50,7 +50,7 @@
@B{++}, @r{--}, @r{~~}, @B{==} propagate variants to package dependencies
architecture variants:
@m{platform=platform} linux, darwin, freebsd, windows
@m{platform=platform} linux, darwin, cray, etc.
@m{os=operating_system} specific <operating_system>
@m{target=target} specific <target> processor
@m{arch=platform-os-target} shortcut for all three above

View File

@@ -61,6 +61,7 @@ def install_kwargs_from_args(args):
"dependencies_use_cache": cache_opt(args.use_cache, dep_use_bc),
"dependencies_cache_only": cache_opt(args.cache_only, dep_use_bc),
"include_build_deps": args.include_build_deps,
"explicit": True, # Use true as a default for install command
"stop_at": args.until,
"unsigned": args.unsigned,
"install_deps": ("dependencies" in args.things_to_install),
@@ -472,7 +473,6 @@ def install_without_active_env(args, install_kwargs, reporter_factory):
require_user_confirmation_for_overwrite(concrete_specs, args)
install_kwargs["overwrite"] = [spec.dag_hash() for spec in concrete_specs]
installs = [s.package for s in concrete_specs]
install_kwargs["explicit"] = [s.dag_hash() for s in concrete_specs]
builder = PackageInstaller(installs, install_kwargs)
installs = [(s.package, install_kwargs) for s in concrete_specs]
builder = PackageInstaller(installs)
builder.install()

View File

@@ -151,8 +151,7 @@ def is_installed(spec):
key=lambda s: s.dag_hash(),
)
with spack.store.STORE.db.read_transaction():
return [spec for spec in specs if is_installed(spec)]
return [spec for spec in specs if is_installed(spec)]
def dependent_environments(
@@ -240,8 +239,6 @@ def get_uninstall_list(args, specs: List[spack.spec.Spec], env: Optional[ev.Envi
print()
tty.info("The following environments still reference these specs:")
colify([e.name for e in other_dependent_envs.keys()], indent=4)
if env:
msgs.append("use `spack remove` to remove the spec from the current environment")
msgs.append("use `spack env remove` to remove environments")
msgs.append("use `spack uninstall --force` to override")
print()

View File

@@ -38,10 +38,10 @@
import spack.cmd
import spack.environment as ev
import spack.filesystem_view as fsv
import spack.schema.projections
import spack.store
from spack.config import validate
from spack.filesystem_view import YamlFilesystemView, view_func_parser
from spack.util import spack_yaml as s_yaml
description = "project packages to a compact naming scheme on the filesystem"
@@ -193,17 +193,13 @@ def view(parser, args):
ordered_projections = {}
# What method are we using for this view
if args.action in actions_link:
link_fn = view_func_parser(args.action)
else:
link_fn = view_func_parser("symlink")
view = YamlFilesystemView(
link_type = args.action if args.action in actions_link else "symlink"
view = fsv.YamlFilesystemView(
path,
spack.store.STORE.layout,
projections=ordered_projections,
ignore_conflicts=getattr(args, "ignore_conflicts", False),
link=link_fn,
link_type=link_type,
verbose=args.verbose,
)

View File

@@ -695,6 +695,10 @@ def compiler_environment(self):
try:
# load modules and set env variables
for module in self.modules:
# On cray, mic-knl module cannot be loaded without cce module
# See: https://github.com/spack/spack/issues/3153
if os.environ.get("CRAY_CPU_TARGET") == "mic-knl":
spack.util.module_cmd.load_module("cce")
spack.util.module_cmd.load_module(module)
# apply other compiler environment changes

View File

@@ -96,8 +96,6 @@ def verbose_flag(self):
openmp_flag = "-fopenmp"
# C++ flags based on CMake Modules/Compiler/Clang.cmake
@property
def cxx11_flag(self):
if self.real_version < Version("3.3"):
@@ -122,24 +120,6 @@ def cxx17_flag(self):
return "-std=c++17"
@property
def cxx20_flag(self):
if self.real_version < Version("5.0"):
raise UnsupportedCompilerFlag(self, "the C++20 standard", "cxx20_flag", "< 5.0")
elif self.real_version < Version("11.0"):
return "-std=c++2a"
else:
return "-std=c++20"
@property
def cxx23_flag(self):
if self.real_version < Version("12.0"):
raise UnsupportedCompilerFlag(self, "the C++23 standard", "cxx23_flag", "< 12.0")
elif self.real_version < Version("17.0"):
return "-std=c++2b"
else:
return "-std=c++23"
@property
def c99_flag(self):
return "-std=c99"
@@ -162,10 +142,7 @@ def c17_flag(self):
def c23_flag(self):
if self.real_version < Version("9.0"):
raise UnsupportedCompilerFlag(self, "the C23 standard", "c23_flag", "< 9.0")
elif self.real_version < Version("18.0"):
return "-std=c2x"
else:
return "-std=c23"
return "-std=c2x"
@property
def cc_pic_flag(self):

View File

@@ -97,7 +97,7 @@ class OpenMpi(Package):
PatchesType = Optional[Union[Patcher, str, List[Union[Patcher, str]]]]
SUPPORTED_LANGUAGES = ("fortran", "cxx")
SUPPORTED_LANGUAGES = ("fortran", "cxx", "c")
def _make_when_spec(value: WhenType) -> Optional["spack.spec.Spec"]:

View File

@@ -59,7 +59,7 @@ def __init__(
self.buildcache_flag = ""
class DepfileSpecVisitor(traverse.AbstractVisitor):
class DepfileSpecVisitor:
"""This visitor produces an adjacency list of a (reduced) DAG, which
is used to generate depfile targets with their prerequisites. Currently
it only drops build deps when using buildcache only mode.
@@ -75,17 +75,17 @@ def __init__(self, pkg_buildcache: UseBuildCache, deps_buildcache: UseBuildCache
self.depflag_root = _deptypes(pkg_buildcache)
self.depflag_deps = _deptypes(deps_buildcache)
def neighbors(self, node: traverse.EdgeAndDepth) -> List[spack.spec.DependencySpec]:
def neighbors(self, node):
"""Produce a list of spec to follow from node"""
depflag = self.depflag_root if node[1] == 0 else self.depflag_deps
return traverse.sort_edges(node[0].spec.edges_to_dependencies(depflag=depflag))
depflag = self.depflag_root if node.depth == 0 else self.depflag_deps
return traverse.sort_edges(node.edge.spec.edges_to_dependencies(depflag=depflag))
def accept(self, node: traverse.EdgeAndDepth) -> bool:
def accept(self, node):
self.adjacency_list.append(
DepfileNode(
target=node[0].spec,
target=node.edge.spec,
prereqs=[edge.spec for edge in self.neighbors(node)],
buildcache=self.pkg_buildcache if node[1] == 0 else self.deps_buildcache,
buildcache=self.pkg_buildcache if node.depth == 0 else self.deps_buildcache,
)
)

View File

@@ -30,6 +30,7 @@
import spack.deptypes as dt
import spack.error
import spack.fetch_strategy
import spack.filesystem_view as fsv
import spack.hash_types as ht
import spack.hooks
import spack.main
@@ -52,7 +53,6 @@
import spack.util.url
import spack.version
from spack import traverse
from spack.filesystem_view import SimpleFilesystemView, inverse_view_func_parser, view_func_parser
from spack.installer import PackageInstaller
from spack.schema.env import TOP_LEVEL_KEY
from spack.spec import Spec
@@ -606,7 +606,7 @@ def __init__(
self.projections = projections
self.select = select
self.exclude = exclude
self.link_type = view_func_parser(link_type)
self.link_type = fsv.canonicalize_link_type(link_type)
self.link = link
def select_fn(self, spec):
@@ -640,7 +640,7 @@ def to_dict(self):
if self.exclude:
ret["exclude"] = self.exclude
if self.link_type:
ret["link_type"] = inverse_view_func_parser(self.link_type)
ret["link_type"] = self.link_type
if self.link != default_view_link:
ret["link"] = self.link
return ret
@@ -690,7 +690,7 @@ def get_projection_for_spec(self, spec):
to exist on the filesystem."""
return self._view(self.root).get_projection_for_spec(spec)
def view(self, new: Optional[str] = None) -> SimpleFilesystemView:
def view(self, new: Optional[str] = None) -> fsv.SimpleFilesystemView:
"""
Returns a view object for the *underlying* view directory. This means that the
self.root symlink is followed, and that the view has to exist on the filesystem
@@ -710,14 +710,14 @@ def view(self, new: Optional[str] = None) -> SimpleFilesystemView:
)
return self._view(path)
def _view(self, root: str) -> SimpleFilesystemView:
def _view(self, root: str) -> fsv.SimpleFilesystemView:
"""Returns a view object for a given root dir."""
return SimpleFilesystemView(
return fsv.SimpleFilesystemView(
root,
spack.store.STORE.layout,
ignore_conflicts=True,
projections=self.projections,
link=self.link_type,
link_type=self.link_type,
)
def __contains__(self, spec):
@@ -1190,7 +1190,6 @@ def scope_name(self):
def include_concrete_envs(self):
"""Copy and save the included envs' specs internally"""
lockfile_meta = None
root_hash_seen = set()
concrete_hash_seen = set()
self.included_concrete_spec_data = {}
@@ -1201,37 +1200,26 @@ def include_concrete_envs(self):
raise SpackEnvironmentError(f"Unable to find env at {env_path}")
env = Environment(env_path)
with open(env.lock_path) as f:
lockfile_as_dict = env._read_lockfile(f)
# Lockfile_meta must match each env and use at least format version 5
if lockfile_meta is None:
lockfile_meta = lockfile_as_dict["_meta"]
elif lockfile_meta != lockfile_as_dict["_meta"]:
raise SpackEnvironmentError("All lockfile _meta values must match")
elif lockfile_meta["lockfile-version"] < 5:
raise SpackEnvironmentError("The lockfile format must be at version 5 or higher")
self.included_concrete_spec_data[env_path] = {"roots": [], "concrete_specs": {}}
# Copy unique root specs from env
self.included_concrete_spec_data[env_path] = {"roots": []}
for root_dict in lockfile_as_dict["roots"]:
for root_dict in env._concrete_roots_dict():
if root_dict["hash"] not in root_hash_seen:
self.included_concrete_spec_data[env_path]["roots"].append(root_dict)
root_hash_seen.add(root_dict["hash"])
# Copy unique concrete specs from env
for concrete_spec in lockfile_as_dict["concrete_specs"]:
if concrete_spec not in concrete_hash_seen:
self.included_concrete_spec_data[env_path].update(
{"concrete_specs": lockfile_as_dict["concrete_specs"]}
for dag_hash, spec_details in env._concrete_specs_dict().items():
if dag_hash not in concrete_hash_seen:
self.included_concrete_spec_data[env_path]["concrete_specs"].update(
{dag_hash: spec_details}
)
concrete_hash_seen.add(concrete_spec)
concrete_hash_seen.add(dag_hash)
if "include_concrete" in lockfile_as_dict.keys():
self.included_concrete_spec_data[env_path]["include_concrete"] = lockfile_as_dict[
"include_concrete"
]
# Copy transitive include data
transitive = env.included_concrete_spec_data
if transitive:
self.included_concrete_spec_data[env_path]["include_concrete"] = transitive
self._read_lockfile_dict(self._to_lockfile_dict())
self.write()
@@ -1948,19 +1936,13 @@ def install_specs(self, specs: Optional[List[Spec]] = None, **install_args):
specs = specs if specs is not None else roots
# Extend the set of specs to overwrite with modified dev specs and their parents
overwrite: Set[str] = set()
overwrite.update(install_args.get("overwrite", []), self._dev_specs_that_need_overwrite())
install_args["overwrite"] = overwrite
explicit: Set[str] = set()
explicit.update(
install_args.get("explicit", []),
(s.dag_hash() for s in specs),
(s.dag_hash() for s in roots),
install_args["overwrite"] = (
install_args.get("overwrite", []) + self._dev_specs_that_need_overwrite()
)
install_args["explicit"] = explicit
PackageInstaller([spec.package for spec in specs], install_args).install()
installs = [(spec.package, {**install_args, "explicit": spec in roots}) for spec in specs]
PackageInstaller(installs).install()
def all_specs_generator(self) -> Iterable[Spec]:
"""Returns a generator for all concrete specs"""
@@ -2150,16 +2132,23 @@ def _get_environment_specs(self, recurse_dependencies=True):
return specs
def _to_lockfile_dict(self):
"""Create a dictionary to store a lockfile for this environment."""
def _concrete_specs_dict(self):
concrete_specs = {}
for s in traverse.traverse_nodes(self.specs_by_hash.values(), key=traverse.by_dag_hash):
spec_dict = s.node_dict_with_hashes(hash=ht.dag_hash)
# Assumes no legacy formats, since this was just created.
spec_dict[ht.dag_hash.name] = s.dag_hash()
concrete_specs[s.dag_hash()] = spec_dict
return concrete_specs
def _concrete_roots_dict(self):
hash_spec_list = zip(self.concretized_order, self.concretized_user_specs)
return [{"hash": h, "spec": str(s)} for h, s in hash_spec_list]
def _to_lockfile_dict(self):
"""Create a dictionary to store a lockfile for this environment."""
concrete_specs = self._concrete_specs_dict()
root_specs = self._concrete_roots_dict()
spack_dict = {"version": spack.spack_version}
spack_commit = spack.main.get_spack_commit()
@@ -2180,7 +2169,7 @@ def _to_lockfile_dict(self):
# spack version information
"spack": spack_dict,
# users specs + hashes are the 'roots' of the environment
"roots": [{"hash": h, "spec": str(s)} for h, s in hash_spec_list],
"roots": root_specs,
# Concrete specs by hash, including dependencies
"concrete_specs": concrete_specs,
}
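
For orientation, a minimal sketch of the lockfile dictionary assembled above; the hash and version strings are made up, and _meta is abbreviated to the one key checked by include_concrete_envs:

lockfile = {
    "_meta": {"lockfile-version": 5},  # must be >= 5 to be includable
    "spack": {"version": "0.22.2"},
    "roots": [{"hash": "abc123", "spec": "zlib@1.3"}],  # user specs + hashes
    "concrete_specs": {                                 # keyed by DAG hash
        "abc123": {"name": "zlib", "version": "1.3", "hash": "abc123"}
    },
}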

View File

@@ -554,7 +554,7 @@ def fetch(self):
try:
response = self._urlopen(self.url)
except urllib.error.URLError as e:
except (TimeoutError, urllib.error.URLError) as e:
# clean up archive on failure.
if self.archive_file:
os.remove(self.archive_file)
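
The added exception type matters because socket.timeout has been an alias of the built-in TimeoutError since Python 3.10, and a TLS read timeout can surface as a bare TimeoutError that urllib does not wrap in a URLError. A standalone sketch of the same pattern, with an illustrative timeout and error handling:

import urllib.error
import urllib.request

def fetch(url: str) -> bytes:
    try:
        with urllib.request.urlopen(url, timeout=10) as response:
            return response.read()
    except (TimeoutError, urllib.error.URLError) as e:
        # clean up partial downloads here, as the fetcher above does
        raise RuntimeError(f"fetch of {url} failed: {e}") from e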

View File

@@ -10,8 +10,9 @@
import shutil
import stat
import sys
from typing import Optional
from typing import Callable, Dict, Optional
from llnl.string import comma_or
from llnl.util import tty
from llnl.util.filesystem import (
mkdirp,
@@ -49,19 +50,20 @@
_projections_path = ".spack/projections.yaml"
def view_symlink(src, dst, **kwargs):
# keyword arguments are irrelevant
# here to fit required call signature
LinkCallbackType = Callable[[str, str, "FilesystemView", Optional["spack.spec.Spec"]], None]
def view_symlink(src: str, dst: str, *args, **kwargs) -> None:
symlink(src, dst)
def view_hardlink(src, dst, **kwargs):
# keyword arguments are irrelevant
# here to fit required call signature
def view_hardlink(src: str, dst: str, *args, **kwargs) -> None:
os.link(src, dst)
def view_copy(src: str, dst: str, view, spec: Optional[spack.spec.Spec] = None):
def view_copy(
src: str, dst: str, view: "FilesystemView", spec: Optional["spack.spec.Spec"] = None
) -> None:
"""
Copy a file from src to dst.
@@ -104,27 +106,40 @@ def view_copy(src: str, dst: str, view, spec: Optional[spack.spec.Spec] = None):
tty.debug(f"Can't change the permissions for {dst}")
def view_func_parser(parsed_name):
# What method are we using for this view
if parsed_name in ("hardlink", "hard"):
#: supported string values for `link_type` in an env, mapped to canonical values
_LINK_TYPES = {
"hardlink": "hardlink",
"hard": "hardlink",
"copy": "copy",
"relocate": "copy",
"add": "symlink",
"symlink": "symlink",
"soft": "symlink",
}
_VALID_LINK_TYPES = sorted(set(_LINK_TYPES.values()))
def canonicalize_link_type(link_type: str) -> str:
"""Return canonical"""
canonical = _LINK_TYPES.get(link_type)
if not canonical:
raise ValueError(
f"Invalid link type: '{link_type}. Must be one of {comma_or(_VALID_LINK_TYPES)}'"
)
return canonical
def function_for_link_type(link_type: str) -> LinkCallbackType:
link_type = canonicalize_link_type(link_type)
if link_type == "hardlink":
return view_hardlink
elif parsed_name in ("copy", "relocate"):
return view_copy
elif parsed_name in ("add", "symlink", "soft"):
elif link_type == "symlink":
return view_symlink
else:
raise ValueError(f"invalid link type for view: '{parsed_name}'")
elif link_type == "copy":
return view_copy
def inverse_view_func_parser(view_type):
# get string based on view type
if view_type is view_hardlink:
link_name = "hardlink"
elif view_type is view_copy:
link_name = "copy"
else:
link_name = "symlink"
return link_name
assert False, "invalid link type" # need mypy Literal values
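
A short usage sketch of the two helpers introduced above: user-facing aliases collapse to one of three canonical values, and the canonical value selects the link callback:

assert canonicalize_link_type("hard") == "hardlink"
assert canonicalize_link_type("relocate") == "copy"
assert canonicalize_link_type("soft") == "symlink"

link_func = function_for_link_type("hard")  # -> view_hardlink
# canonicalize_link_type("junction")        # would raise ValueError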
class FilesystemView:
@@ -140,7 +155,16 @@ class FilesystemView:
directory structure.
"""
def __init__(self, root, layout, **kwargs):
def __init__(
self,
root: str,
layout: "spack.directory_layout.DirectoryLayout",
*,
projections: Optional[Dict] = None,
ignore_conflicts: bool = False,
verbose: bool = False,
link_type: str = "symlink",
):
"""
Initialize a filesystem view under the given `root` directory with
corresponding directory `layout`.
@@ -149,15 +173,14 @@ def __init__(self, root, layout, **kwargs):
"""
self._root = root
self.layout = layout
self.projections = {} if projections is None else projections
self.projections = kwargs.get("projections", {})
self.ignore_conflicts = kwargs.get("ignore_conflicts", False)
self.verbose = kwargs.get("verbose", False)
self.ignore_conflicts = ignore_conflicts
self.verbose = verbose
# Setup link function to include view
link_func = kwargs.get("link", view_symlink)
self.link = ft.partial(link_func, view=self)
self.link_type = link_type
self.link = ft.partial(function_for_link_type(link_type), view=self)
def add_specs(self, *specs, **kwargs):
"""
@@ -255,8 +278,24 @@ class YamlFilesystemView(FilesystemView):
Filesystem view to work with a yaml based directory layout.
"""
def __init__(self, root, layout, **kwargs):
super().__init__(root, layout, **kwargs)
def __init__(
self,
root: str,
layout: "spack.directory_layout.DirectoryLayout",
*,
projections: Optional[Dict] = None,
ignore_conflicts: bool = False,
verbose: bool = False,
link_type: str = "symlink",
):
super().__init__(
root,
layout,
projections=projections,
ignore_conflicts=ignore_conflicts,
verbose=verbose,
link_type=link_type,
)
# The superclass gets projections from its constructor arguments;
# the YAML-specific code reads projections from the YAML file
@@ -638,9 +677,6 @@ class SimpleFilesystemView(FilesystemView):
"""A simple and partial implementation of FilesystemView focused on performance and immutable
views, where specs cannot be removed after they were added."""
def __init__(self, root, layout, **kwargs):
super().__init__(root, layout, **kwargs)
def _sanity_check_view_projection(self, specs):
"""A very common issue is that we end up with two specs of the same package, that project
to the same prefix. We want to catch that as early as possible and give a sensible error to

View File

@@ -13,6 +13,7 @@
import spack.config
import spack.relocate
from spack.util.elf import ElfParsingError, parse_elf
from spack.util.executable import Executable
def is_shared_library_elf(filepath):
@@ -140,7 +141,7 @@ def post_install(spec, explicit=None):
return
# Only enable on platforms using ELF.
if not spec.satisfies("platform=linux"):
if not spec.satisfies("platform=linux") and not spec.satisfies("platform=cray"):
return
# Disable this hook when bootstrapping, to avoid recursion.
@@ -148,9 +149,10 @@ def post_install(spec, explicit=None):
return
# Should failing to locate patchelf be a hard error?
patchelf = spack.relocate._patchelf()
if not patchelf:
patchelf_path = spack.relocate._patchelf()
if not patchelf_path:
return
patchelf = Executable(patchelf_path)
fixes = find_and_patch_sonames(spec.prefix, spec.package.non_bindable_shared_objects, patchelf)
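
The switch from a bare path to an Executable object gives the hook a callable wrapper around the binary. A minimal sketch of the pattern, with a hypothetical patchelf path and library:

from spack.util.executable import Executable

patchelf = Executable("/usr/bin/patchelf")  # hypothetical resolved path
# query the soname of a shared object (patchelf supports --print-soname)
soname = patchelf("--print-soname", "/opt/lib/libfoo.so", output=str).strip()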

View File

@@ -117,7 +117,7 @@ def post_install(spec, explicit=None):
return
# Only enable on platforms using ELF.
if not spec.satisfies("platform=linux"):
if not spec.satisfies("platform=linux") and not spec.satisfies("platform=cray"):
return
visit_directory_tree(spec.prefix, ElfFilesWithRPathVisitor())

View File

@@ -600,7 +600,9 @@ def dump_packages(spec: "spack.spec.Spec", path: str) -> None:
if node is spec:
spack.repo.PATH.dump_provenance(node, dest_pkg_dir)
elif source_pkg_dir:
fs.install_tree(source_pkg_dir, dest_pkg_dir)
fs.install_tree(
source_pkg_dir, dest_pkg_dir, allow_broken_symlinks=(sys.platform != "win32")
)
def get_dependent_ids(spec: "spack.spec.Spec") -> List[str]:
@@ -759,8 +761,12 @@ def __init__(self, pkg: "spack.package_base.PackageBase", install_args: dict):
if not self.pkg.spec.concrete:
raise ValueError(f"{self.pkg.name} must have a concrete spec")
self.pkg.stop_before_phase = install_args.get("stop_before") # type: ignore[attr-defined] # noqa: E501
self.pkg.last_phase = install_args.get("stop_at") # type: ignore[attr-defined]
# Cache the package phase options with the explicit package,
# popping the options to ensure installation of associated
# dependencies is NOT affected by these options.
self.pkg.stop_before_phase = install_args.pop("stop_before", None) # type: ignore[attr-defined] # noqa: E501
self.pkg.last_phase = install_args.pop("stop_at", None) # type: ignore[attr-defined]
# Cache the package id for convenience
self.pkg_id = package_id(pkg.spec)
@@ -1070,17 +1076,19 @@ def flag_installed(self, installed: List[str]) -> None:
@property
def explicit(self) -> bool:
return self.pkg.spec.dag_hash() in self.request.install_args.get("explicit", [])
"""The package was explicitly requested by the user."""
return self.is_root and self.request.install_args.get("explicit", True)
@property
def is_build_request(self) -> bool:
"""The package was requested directly"""
def is_root(self) -> bool:
"""The package was requested directly, but may or may not be explicit
in an environment."""
return self.pkg == self.request.pkg
@property
def use_cache(self) -> bool:
_use_cache = True
if self.is_build_request:
if self.is_root:
return self.request.install_args.get("package_use_cache", _use_cache)
else:
return self.request.install_args.get("dependencies_use_cache", _use_cache)
@@ -1088,7 +1096,7 @@ def use_cache(self) -> bool:
@property
def cache_only(self) -> bool:
_cache_only = False
if self.is_build_request:
if self.is_root:
return self.request.install_args.get("package_cache_only", _cache_only)
else:
return self.request.install_args.get("dependencies_cache_only", _cache_only)
@@ -1114,17 +1122,24 @@ def priority(self):
class PackageInstaller:
"""
Class for managing the install process for a Spack instance based on a bottom-up DAG approach.
Class for managing the install process for a Spack instance based on a
bottom-up DAG approach.
This installer can coordinate concurrent batch and interactive, local and distributed (on a
shared file system) builds for the same Spack instance.
This installer can coordinate concurrent batch and interactive, local
and distributed (on a shared file system) builds for the same Spack
instance.
"""
def __init__(
self, packages: List["spack.package_base.PackageBase"], install_args: dict
) -> None:
def __init__(self, installs: List[Tuple["spack.package_base.PackageBase", dict]] = []) -> None:
"""Initialize the installer.
Args:
installs (list): list of tuples, where each
tuple consists of a package (PackageBase) and its associated
install arguments (dict)
"""
# List of build requests
self.build_requests = [BuildRequest(pkg, install_args) for pkg in packages]
self.build_requests = [BuildRequest(pkg, install_args) for pkg, install_args in installs]
# Priority queue of build tasks
self.build_pq: List[Tuple[Tuple[int, int], BuildTask]] = []
@@ -1547,7 +1562,7 @@ def _add_tasks(self, request: BuildRequest, all_deps):
#
# External and upstream packages need to get flagged as installed to
# ensure proper status tracking for environment build.
explicit = request.pkg.spec.dag_hash() in request.install_args.get("explicit", [])
explicit = request.install_args.get("explicit", True)
not_local = _handle_external_and_upstream(request.pkg, explicit)
if not_local:
self._flag_installed(request.pkg)
@@ -1668,6 +1683,10 @@ def _install_task(self, task: BuildTask, install_status: InstallStatus) -> None:
if not pkg.unit_test_check():
return
# Inject information on whether this installation request is the root one,
# so that BuildProcessInstaller can determine whether the installation is explicit
install_args["is_root"] = task.is_root
try:
self._setup_install_dir(pkg)
@@ -1979,8 +1998,8 @@ def install(self) -> None:
self._init_queue()
fail_fast_err = "Terminating after first install failure"
single_requested_spec = len(self.build_requests) == 1
failed_build_requests = []
single_explicit_spec = len(self.build_requests) == 1
failed_explicits = []
install_status = InstallStatus(len(self.build_pq))
@@ -2178,11 +2197,14 @@ def install(self) -> None:
if self.fail_fast:
raise InstallError(f"{fail_fast_err}: {str(exc)}", pkg=pkg)
# Terminate when a single build request has failed, or summarize errors later.
if task.is_build_request:
if single_requested_spec:
raise
failed_build_requests.append((pkg, pkg_id, str(exc)))
# Terminate at this point if the single explicit spec has
# failed to install.
if single_explicit_spec and task.explicit:
raise
# Track explicit spec id and error to summarize when done
if task.explicit:
failed_explicits.append((pkg, pkg_id, str(exc)))
finally:
# Remove the install prefix if anything went wrong during
@@ -2205,16 +2227,16 @@ def install(self) -> None:
if request.install_args.get("install_package") and request.pkg_id not in self.installed
]
if failed_build_requests or missing:
for _, pkg_id, err in failed_build_requests:
if failed_explicits or missing:
for _, pkg_id, err in failed_explicits:
tty.error(f"{pkg_id}: {err}")
for _, pkg_id in missing:
tty.error(f"{pkg_id}: Package was not installed")
if len(failed_build_requests) > 0:
pkg = failed_build_requests[0][0]
ids = [pkg_id for _, pkg_id, _ in failed_build_requests]
if len(failed_explicits) > 0:
pkg = failed_explicits[0][0]
ids = [pkg_id for _, pkg_id, _ in failed_explicits]
tty.debug(
"Associating installation failure with first failed "
f"explicit package ({ids[0]}) from {', '.join(ids)}"
@@ -2273,7 +2295,7 @@ def __init__(self, pkg: "spack.package_base.PackageBase", install_args: dict):
self.verbose = bool(install_args.get("verbose", False))
# whether installation was explicitly requested by the user
self.explicit = pkg.spec.dag_hash() in install_args.get("explicit", [])
self.explicit = install_args.get("is_root", False) and install_args.get("explicit", True)
# env before starting installation
self.unmodified_env = install_args.get("unmodified_env", {})
@@ -2358,7 +2380,9 @@ def _install_source(self) -> None:
src_target = os.path.join(pkg.spec.prefix, "share", pkg.name, "src")
tty.debug(f"{self.pre} Copying source to {src_target}")
fs.install_tree(pkg.stage.source_path, src_target)
fs.install_tree(
pkg.stage.source_path, src_target, allow_broken_symlinks=(sys.platform != "win32")
)
def _real_install(self) -> None:
import spack.builder

View File

@@ -3,12 +3,22 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from ._operating_system import OperatingSystem
from .cray_backend import CrayBackend
from .cray_frontend import CrayFrontend
from .freebsd import FreeBSDOs
from .linux_distro import LinuxDistro
from .mac_os import MacOs
from .windows_os import WindowsOs
__all__ = ["OperatingSystem", "LinuxDistro", "MacOs", "WindowsOs", "FreeBSDOs"]
__all__ = [
"OperatingSystem",
"LinuxDistro",
"MacOs",
"CrayFrontend",
"CrayBackend",
"WindowsOs",
"FreeBSDOs",
]
#: List of all the Operating Systems known to Spack
operating_systems = [LinuxDistro, MacOs, WindowsOs, FreeBSDOs]
operating_systems = [LinuxDistro, MacOs, CrayFrontend, CrayBackend, WindowsOs, FreeBSDOs]

View File

@@ -0,0 +1,172 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import re
import llnl.util.tty as tty
import spack.error
import spack.version
from spack.util.module_cmd import module
from .linux_distro import LinuxDistro
#: Possible locations of the Cray CLE release file,
#: which we look at to get the CNL OS version.
_cle_release_file = "/etc/opt/cray/release/cle-release"
_clerelease_file = "/etc/opt/cray/release/clerelease"
def read_cle_release_file():
"""Read the CLE release file and return a dict with its attributes.
This file is present on newer versions of Cray.
The release file looks something like this::
RELEASE=6.0.UP07
BUILD=6.0.7424
...
The dictionary we produce looks like this::
{
"RELEASE": "6.0.UP07",
"BUILD": "6.0.7424",
...
}
Returns:
dict: dictionary of release attributes
"""
with open(_cle_release_file) as release_file:
result = {}
for line in release_file:
# use partition instead of split() to ensure we only split on
# the first '=' in the line.
key, _, value = line.partition("=")
result[key] = value.strip()
return result
def read_clerelease_file():
"""Read the CLE release file and return the Cray OS version.
This file is present on older versions of Cray.
The release file looks something like this::
5.2.UP04
Returns:
str: the Cray OS version
"""
with open(_clerelease_file) as release_file:
for line in release_file:
return line.strip()
class CrayBackend(LinuxDistro):
"""Compute Node Linux (CNL) is the operating system used for the Cray XC
series super computers. It is a very stripped down version of GNU/Linux.
Any compilers found through this operating system will be used with
modules. If updated, user must make sure that version and name are
updated to indicate that OS has been upgraded (or downgraded)
"""
def __init__(self):
name = "cnl"
version = self._detect_crayos_version()
if version:
# If we found a CrayOS version, we do not want the information
# from LinuxDistro. In order to skip the logic from
# distro.linux_distribution, while still calling __init__
# methods further up the MRO, we skip LinuxDistro in the MRO and
# call the OperatingSystem superclass __init__ method
super(LinuxDistro, self).__init__(name, version)
else:
super().__init__()
self.modulecmd = module
def __str__(self):
return self.name + str(self.version)
@classmethod
def _detect_crayos_version(cls):
if os.path.isfile(_cle_release_file):
release_attrs = read_cle_release_file()
if "RELEASE" not in release_attrs:
# This Cray system uses a base OS not CLE/CNL
return None
v = spack.version.Version(release_attrs["RELEASE"])
return v[0]
elif os.path.isfile(_clerelease_file):
v = read_clerelease_file()
return spack.version.Version(v)[0]
else:
# Not all Cray systems run CNL on the backend.
# Systems running in what Cray calls "cluster" mode run other
# linux OSs under the Cray PE.
# So if we don't detect any Cray OS version on the system,
# we return None. We can't ever be sure we will get a Cray OS
# version.
# Returning None allows the calling code to test for the value
# being "True-ish" rather than requiring a try/except block.
return None
def arguments_to_detect_version_fn(self, paths):
import spack.compilers
command_arguments = []
for compiler_name in spack.compilers.supported_compilers():
cmp_cls = spack.compilers.class_for_compiler_name(compiler_name)
# If the compiler doesn't have a corresponding
# Programming Environment, skip to the next
if cmp_cls.PrgEnv is None:
continue
if cmp_cls.PrgEnv_compiler is None:
tty.die("Must supply PrgEnv_compiler with PrgEnv")
compiler_id = spack.compilers.CompilerID(self, compiler_name, None)
detect_version_args = spack.compilers.DetectVersionArgs(
id=compiler_id, variation=(None, None), language="cc", path="cc"
)
command_arguments.append(detect_version_args)
return command_arguments
def detect_version(self, detect_version_args):
import spack.compilers
modulecmd = self.modulecmd
compiler_name = detect_version_args.id.compiler_name
compiler_cls = spack.compilers.class_for_compiler_name(compiler_name)
output = modulecmd("avail", compiler_cls.PrgEnv_compiler)
version_regex = r"({0})/([\d\.]+[\d]-?[\w]*)".format(compiler_cls.PrgEnv_compiler)
matches = re.findall(version_regex, output)
version = tuple(version for _, version in matches if "classic" not in version)
compiler_id = detect_version_args.id
value = detect_version_args._replace(id=compiler_id._replace(version=version))
return value, None
def make_compilers(self, compiler_id, paths):
import spack.spec
name = compiler_id.compiler_name
cmp_cls = spack.compilers.class_for_compiler_name(name)
compilers = []
for v in compiler_id.version:
comp = cmp_cls(
spack.spec.CompilerSpec(name + "@=" + v),
self,
"any",
["cc", "CC", "ftn"],
[cmp_cls.PrgEnv, name + "/" + v],
)
compilers.append(comp)
return compilers
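
A standalone sketch of the parsing done in read_cle_release_file(), using the sample contents from its docstring; str.partition() splits only on the first '=', so values containing '=' survive intact:

sample = "RELEASE=6.0.UP07\nBUILD=6.0.7424"
result = {}
for line in sample.splitlines():
    key, _, value = line.partition("=")
    result[key] = value.strip()
assert result == {"RELEASE": "6.0.UP07", "BUILD": "6.0.7424"}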

View File

@@ -0,0 +1,105 @@
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import contextlib
import os
import re
import llnl.util.filesystem as fs
import llnl.util.lang
import llnl.util.tty as tty
from spack.util.environment import get_path
from spack.util.module_cmd import module
from .linux_distro import LinuxDistro
@contextlib.contextmanager
def unload_programming_environment():
"""Context manager that unloads Cray Programming Environments."""
env_bu = None
# We rely on the fact that the PrgEnv-* modules set the PE_ENV
# environment variable.
if "PE_ENV" in os.environ:
# Copy environment variables to restore them after the compiler
# detection. We expect that the only thing PrgEnv-* modules do is
# the environment variables modifications.
env_bu = os.environ.copy()
# Get the name of the module from the environment variable.
prg_env = "PrgEnv-" + os.environ["PE_ENV"].lower()
# Unload the PrgEnv-* module. By doing this we intentionally
# provoke errors when the Cray's compiler wrappers are executed
# (Error: A PrgEnv-* modulefile must be loaded.) so they will not
# be detected as valid compilers by the overridden method. We also
# expect that the modules that add the actual compilers' binaries
# into the PATH environment variable (i.e. the following modules:
# 'intel', 'cce', 'gcc', etc.) will also be unloaded since they are
# specified as prerequisites in the PrgEnv-* modulefiles.
module("unload", prg_env)
yield
# Restore the environment.
if env_bu is not None:
os.environ.clear()
os.environ.update(env_bu)
class CrayFrontend(LinuxDistro):
"""Represents OS that runs on login and service nodes of the Cray platform.
It acts as a regular Linux without Cray-specific modules and compiler
wrappers."""
@property
def compiler_search_paths(self):
"""Calls the default function but unloads Cray's programming
environments first.
This prevents the Cray compiler wrappers from being detected, avoiding
possible false detections.
"""
import spack.compilers
with unload_programming_environment():
search_paths = get_path("PATH")
extract_path_re = re.compile(r"prepend-path[\s]*PATH[\s]*([/\w\.:-]*)")
for compiler_cls in spack.compilers.all_compiler_types():
# Check if the compiler class is supported on Cray
prg_env = getattr(compiler_cls, "PrgEnv", None)
compiler_module = getattr(compiler_cls, "PrgEnv_compiler", None)
if not (prg_env and compiler_module):
continue
# It is supported, check which versions are available
output = module("avail", compiler_cls.PrgEnv_compiler)
version_regex = r"({0})/([\d\.]+[\d]-?[\w]*)".format(compiler_cls.PrgEnv_compiler)
matches = re.findall(version_regex, output)
versions = tuple(version for _, version in matches if "classic" not in version)
# Now inspect the modules and add to paths
msg = "[CRAY FE] Detected FE compiler [name={0}, versions={1}]"
tty.debug(msg.format(compiler_module, versions))
for v in versions:
try:
current_module = compiler_module + "/" + v
out = module("show", current_module)
match = extract_path_re.search(out)
search_paths += match.group(1).split(":")
except Exception as e:
msg = (
"[CRAY FE] An unexpected error occurred while "
"detecting FE compiler [compiler={0}, "
" version={1}, error={2}]"
)
tty.debug(msg.format(compiler_cls.name, v, str(e)))
search_paths = list(llnl.util.lang.dedupe(search_paths))
return fs.search_paths_for_executables(*search_paths)
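
Usage sketch for the context manager defined above: the PrgEnv-* module is unloaded only for the duration of the block, and os.environ is put back afterwards:

with unload_programming_environment():
    # PATH no longer contains entries injected by the PrgEnv-* module and
    # its compiler prerequisites, so probing it cannot hit the Cray wrappers
    clean_path = get_path("PATH")
# the saved environment has been restored at this point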

View File

@@ -143,6 +143,7 @@ def __init__(self):
"12": "monterey",
"13": "ventura",
"14": "sonoma",
"15": "sequoia",
}
version = macos_version()

View File

@@ -199,10 +199,10 @@ def __init__(cls, name, bases, attr_dict):
# assumed to be detectable
if hasattr(cls, "executables") or hasattr(cls, "libraries"):
# Append a tag to each detectable package, so that finding them is faster
if hasattr(cls, "tags"):
getattr(cls, "tags").append(DetectablePackageMeta.TAG)
else:
if not hasattr(cls, "tags"):
setattr(cls, "tags", [DetectablePackageMeta.TAG])
elif DetectablePackageMeta.TAG not in cls.tags:
cls.tags.append(DetectablePackageMeta.TAG)
@classmethod
def platform_executables(cls):
@@ -1119,10 +1119,9 @@ def _make_stage(self):
if not link_format:
link_format = "build-{arch}-{hash:7}"
stage_link = self.spec.format_path(link_format)
return DevelopStage(compute_stage_name(self.spec), dev_path, stage_link)
# To fetch the current version
source_stage = self._make_root_stage(self.fetcher)
source_stage = DevelopStage(compute_stage_name(self.spec), dev_path, stage_link)
else:
source_stage = self._make_root_stage(self.fetcher)
# all_stages is source + resources + patches
all_stages = StageComposite()
@@ -1451,10 +1450,8 @@ def do_fetch(self, mirror_only=False):
return
checksum = spack.config.get("config:checksum")
fetch = self.stage.needs_fetching
if (
checksum
and fetch
and (self.version not in self.versions)
and (not isinstance(self.version, GitVersion))
):
@@ -1561,13 +1558,11 @@ def do_patch(self):
tty.debug("Patching failed last time. Restaging.")
self.stage.restage()
else:
# develop specs/ DIYStages may have patch failures but
# should never be restaged
msg = (
"A patch failure was detected in %s." % self.name
+ " Build errors may occur due to this."
# develop specs may have patch failures but should never be restaged
tty.warn(
f"A patch failure was detected in {self.name}."
" Build errors may occur due to this."
)
tty.warn(msg)
return
# If this file exists, then we already applied all the patches.
@@ -1881,10 +1876,7 @@ def do_install(self, **kwargs):
verbose (bool): Display verbose build output (by default,
suppresses it)
"""
explicit = kwargs.get("explicit", True)
if isinstance(explicit, bool):
kwargs["explicit"] = {self.spec.dag_hash()} if explicit else set()
PackageInstaller([self], kwargs).install()
PackageInstaller([(self, kwargs)]).install()
# TODO (post-34236): Update tests and all packages that use this as a
# TODO (post-34236): package method to the routine made available to

View File

@@ -6,6 +6,7 @@
from ._functions import _host, by_name, platforms, prevent_cray_detection, reset
from ._platform import Platform
from .cray import Cray
from .darwin import Darwin
from .freebsd import FreeBSD
from .linux import Linux
@@ -14,6 +15,7 @@
__all__ = [
"Platform",
"Cray",
"Darwin",
"Linux",
"FreeBSD",

View File

@@ -8,6 +8,7 @@
import spack.util.environment
from .cray import Cray
from .darwin import Darwin
from .freebsd import FreeBSD
from .linux import Linux
@@ -15,7 +16,7 @@
from .windows import Windows
#: List of all the platform classes known to Spack
platforms = [Darwin, Linux, Windows, FreeBSD, Test]
platforms = [Cray, Darwin, Linux, Windows, FreeBSD, Test]
@llnl.util.lang.memoized

View File

@@ -2,10 +2,254 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import os.path
import platform
import re
import archspec.cpu
import llnl.util.tty as tty
from llnl.util.symlink import readlink
import spack.target
import spack.version
from spack.operating_systems.cray_backend import CrayBackend
from spack.operating_systems.cray_frontend import CrayFrontend
from spack.paths import build_env_path
from spack.util.executable import Executable
from spack.util.module_cmd import module
from ._platform import NoPlatformError, Platform
_craype_name_to_target_name = {
"x86-cascadelake": "cascadelake",
"x86-naples": "zen",
"x86-rome": "zen2",
"x86-milan": "zen3",
"x86-skylake": "skylake_avx512",
"mic-knl": "mic_knl",
"interlagos": "bulldozer",
"abudhabi": "piledriver",
}
_ex_craype_dir = "/opt/cray/pe/cpe"
_xc_craype_dir = "/opt/cray/pe/cdt"
def slingshot_network():
return os.path.exists("/opt/cray/pe") and (
os.path.exists("/lib64/libcxi.so") or os.path.exists("/usr/lib64/libcxi.so")
)
def _target_name_from_craype_target_name(name):
return _craype_name_to_target_name.get(name, name)
class Cray(Platform):
priority = 10
def __init__(self):
"""Create a Cray system platform.
Target names should use craype target names but not include the
'craype-' prefix. Uses first viable target from:
self
envars [SPACK_FRONT_END, SPACK_BACK_END]
configuration file "targets.yaml" with keys 'front_end', 'back_end'
scanning /etc/bash/bashrc.local for back_end only
"""
super().__init__("cray")
# Make all craype targets available.
for target in self._avail_targets():
name = _target_name_from_craype_target_name(target)
self.add_target(name, spack.target.Target(name, "craype-%s" % target))
self.back_end = os.environ.get("SPACK_BACK_END", self._default_target_from_env())
self.default = self.back_end
if self.back_end not in self.targets:
# We didn't find a target module for the backend
raise NoPlatformError()
# Setup frontend targets
for name in archspec.cpu.TARGETS:
if name not in self.targets:
self.add_target(name, spack.target.Target(name))
self.front_end = os.environ.get("SPACK_FRONT_END", archspec.cpu.host().name)
if self.front_end not in self.targets:
self.add_target(self.front_end, spack.target.Target(self.front_end))
front_distro = CrayFrontend()
back_distro = CrayBackend()
self.default_os = str(back_distro)
self.back_os = self.default_os
self.front_os = str(front_distro)
self.add_operating_system(self.back_os, back_distro)
if self.front_os != self.back_os:
self.add_operating_system(self.front_os, front_distro)
def setup_platform_environment(self, pkg, env):
"""Change the linker to default dynamic to be more
similar to linux/standard linker behavior
"""
# Unload these modules to prevent any silent linking or unnecessary
# I/O profiling in the case of darshan.
modules_to_unload = ["cray-mpich", "darshan", "cray-libsci", "altd"]
for mod in modules_to_unload:
module("unload", mod)
env.set("CRAYPE_LINK_TYPE", "dynamic")
cray_wrapper_names = os.path.join(build_env_path, "cray")
if os.path.isdir(cray_wrapper_names):
env.prepend_path("PATH", cray_wrapper_names)
env.prepend_path("SPACK_ENV_PATH", cray_wrapper_names)
# Makes spack installed pkg-config work on Crays
env.append_path("PKG_CONFIG_PATH", "/usr/lib64/pkgconfig")
env.append_path("PKG_CONFIG_PATH", "/usr/local/lib64/pkgconfig")
# CRAY_LD_LIBRARY_PATH is used at build time by the cray compiler
# wrappers to augment LD_LIBRARY_PATH. This is to avoid long load
# times at runtime. This behavior is not always respected on cray
# "cluster" systems, so we reproduce it here.
if os.environ.get("CRAY_LD_LIBRARY_PATH"):
env.prepend_path("LD_LIBRARY_PATH", os.environ["CRAY_LD_LIBRARY_PATH"])
@classmethod
def craype_type_and_version(cls):
if os.path.isdir(_ex_craype_dir):
craype_dir = _ex_craype_dir
craype_type = "EX"
elif os.path.isdir(_xc_craype_dir):
craype_dir = _xc_craype_dir
craype_type = "XC"
else:
return (None, None)
# Take the default version from known symlink path
default_path = os.path.join(craype_dir, "default")
if os.path.islink(default_path):
version = spack.version.Version(readlink(default_path))
return (craype_type, version)
# If no default version, sort available versions and return latest
versions_available = [spack.version.Version(v) for v in os.listdir(craype_dir)]
versions_available.sort(reverse=True)
if not versions_available:
return (craype_type, None)
return (craype_type, versions_available[0])
@classmethod
def detect(cls):
"""
Detect whether this system requires CrayPE module support.
Systems with newer CrayPE (21.10 for EX systems, future work for CS and
XC systems) have compilers and MPI wrappers that can be used directly
by path. These systems are considered ``linux`` platforms.
For systems running an older CrayPE, we detect the Cray platform based
on the availability through `module` of the Cray programming
environment. If this environment is available, we can use it to find
compilers, target modules, etc. If the Cray programming environment is
not available via modules, then we will treat it as a standard linux
system, as the Cray compiler wrappers and other components of the Cray
programming environment are irrelevant without module support.
"""
if "opt/cray" not in os.environ.get("MODULEPATH", ""):
return False
craype_type, craype_version = cls.craype_type_and_version()
if craype_type == "XC":
return True
if craype_type == "EX" and craype_version < spack.version.Version("21.10"):
return True
return False
def _default_target_from_env(self):
"""Set and return the default CrayPE target loaded in a clean login
session.
A bash subshell is launched with a wiped environment and the list of
loaded modules is parsed for the first acceptable CrayPE target.
"""
# env -i /bin/bash -lc echo $CRAY_CPU_TARGET 2> /dev/null
if getattr(self, "default", None) is None:
bash = Executable("/bin/bash")
output = bash(
"--norc",
"--noprofile",
"-lc",
"echo $CRAY_CPU_TARGET",
env={"TERM": os.environ.get("TERM", "")},
output=str,
error=os.devnull,
)
default_from_module = "".join(output.split()) # rm all whitespace
if default_from_module:
tty.debug("Found default module:%s" % default_from_module)
return default_from_module
else:
front_end = archspec.cpu.host()
# Look for the frontend architecture or closest ancestor
# available in cray target modules
avail = [_target_name_from_craype_target_name(x) for x in self._avail_targets()]
for front_end_possibility in [front_end] + front_end.ancestors:
if front_end_possibility.name in avail:
tty.debug("using front-end architecture or available ancestor")
return front_end_possibility.name
else:
tty.debug("using platform.machine as default")
return platform.machine()
def _avail_targets(self):
"""Return a list of available CrayPE CPU targets."""
def modules_in_output(output):
"""Returns a list of valid modules parsed from modulecmd output"""
return [i for i in re.split(r"\s\s+|\n", output)]
def target_names_from_modules(modules):
# Craype- module prefixes that are not valid CPU targets.
targets = []
for mod in modules:
if "craype-" in mod:
name = mod[7:]
name = name.split()[0]
_n = name.replace("-", "_") # test for mic-knl/mic_knl
is_target_name = name in archspec.cpu.TARGETS or _n in archspec.cpu.TARGETS
is_cray_target_name = name in _craype_name_to_target_name
if is_target_name or is_cray_target_name:
targets.append(name)
return targets
def modules_from_listdir():
craype_default_path = "/opt/cray/pe/craype/default/modulefiles"
if os.path.isdir(craype_default_path):
return os.listdir(craype_default_path)
return []
if getattr(self, "_craype_targets", None) is None:
strategies = [
lambda: modules_in_output(module("avail", "-t", "craype-")),
modules_from_listdir,
]
for available_craype_modules in strategies:
craype_modules = available_craype_modules()
craype_targets = target_names_from_modules(craype_modules)
if craype_targets:
self._craype_targets = craype_targets
break
else:
# If nothing is found add platform.machine()
# to avoid Spack erroring out
self._craype_targets = [platform.machine()]
return self._craype_targets
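
Condensed, the decision implemented by craype_type_and_version() and detect() above reduces to a small predicate. A sketch, assuming craype_version is a spack.version.Version (or None when undetected):

import spack.version

def needs_cray_modules(modulepath: str, craype_type, craype_version) -> bool:
    if "opt/cray" not in modulepath:
        return False  # no Cray module tree at all: plain 'linux'
    if craype_type == "XC":
        return True   # XC systems always need CrayPE module support
    # EX systems only before CrayPE 21.10; newer ones count as 'linux'
    return (
        craype_type == "EX"
        and craype_version is not None
        and craype_version < spack.version.Version("21.10")
    )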

View File

@@ -13,6 +13,7 @@
r"\w[\w-]*": {
"type": "object",
"additionalProperties": False,
"required": ["spec"],
"properties": {"spec": {"type": "string"}, "path": {"type": "string"}},
}
},
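
The effect of the added "required" key can be shown with a plain jsonschema check; the package name and path are made up, and jsonschema is the third-party validator, used here only for illustration:

import jsonschema

entry_schema = {
    "type": "object",
    "additionalProperties": False,
    "required": ["spec"],
    "properties": {"spec": {"type": "string"}, "path": {"type": "string"}},
}
jsonschema.validate({"spec": "mypkg@main", "path": "/code/mypkg"}, entry_schema)
try:
    jsonschema.validate({"path": "/code/mypkg"}, entry_schema)  # "spec" missing
except jsonschema.ValidationError as err:
    print(err.message)  # 'spec' is a required property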

View File

@@ -844,8 +844,6 @@ def solve(self, setup, specs, reuse=None, output=None, control=None, allow_depre
parent_dir = os.path.dirname(__file__)
self.control.load(os.path.join(parent_dir, "concretize.lp"))
self.control.load(os.path.join(parent_dir, "heuristic.lp"))
if spack.config.CONFIG.get("concretizer:duplicates:strategy", "none") != "none":
self.control.load(os.path.join(parent_dir, "heuristic_separate.lp"))
self.control.load(os.path.join(parent_dir, "display.lp"))
if not setup.concretize_everything:
self.control.load(os.path.join(parent_dir, "when_possible.lp"))
@@ -1435,16 +1433,14 @@ def condition(
# caller, we won't emit partial facts.
condition_id = next(self._id_counter)
self.gen.fact(fn.pkg_fact(required_spec.name, fn.condition(condition_id)))
self.gen.fact(fn.condition_reason(condition_id, msg))
trigger_id = self._get_condition_id(
required_spec, cache=self._trigger_cache, body=True, transform=transform_required
)
self.gen.fact(fn.pkg_fact(required_spec.name, fn.condition(condition_id)))
self.gen.fact(fn.condition_reason(condition_id, msg))
self.gen.fact(
fn.pkg_fact(required_spec.name, fn.condition_trigger(condition_id, trigger_id))
)
if not imposed_spec:
return condition_id
@@ -1693,19 +1689,43 @@ def external_packages(self):
spack.spec.parse_with_version_concrete(x["spec"]) for x in externals
]
external_specs = []
selected_externals = set()
if spec_filters:
for current_filter in spec_filters:
current_filter.factory = lambda: candidate_specs
external_specs.extend(current_filter.selected_specs())
else:
external_specs.extend(candidate_specs)
selected_externals.update(current_filter.selected_specs())
# Emit facts for externals specs. Note that "local_idx" is the index of the spec
# in packages:<pkg_name>:externals. This means:
#
# packages:<pkg_name>:externals[local_idx].spec == spec
external_versions = []
for local_idx, spec in enumerate(candidate_specs):
msg = f"{spec.name} available as external when satisfying {spec}"
if spec_filters and spec not in selected_externals:
continue
if not spec.versions.concrete:
warnings.warn(f"cannot use the external spec {spec}: needs a concrete version")
continue
def external_imposition(input_spec, requirements):
return requirements + [
fn.attr("external_conditions_hold", input_spec.name, local_idx)
]
try:
self.condition(spec, spec, msg=msg, transform_imposed=external_imposition)
except (spack.error.SpecError, RuntimeError) as e:
warnings.warn(f"while setting up external spec {spec}: {e}")
continue
external_versions.append((spec.version, local_idx))
self.possible_versions[spec.name].add(spec.version)
self.gen.newline()
# Order the external versions to prefer more recent versions
# even if specs in packages.yaml are not ordered that way
external_versions = [
(x.version, external_id) for external_id, x in enumerate(external_specs)
]
external_versions = [
(v, idx, external_id)
for idx, (v, external_id) in enumerate(sorted(external_versions, reverse=True))
@@ -1715,19 +1735,6 @@ def external_packages(self):
DeclaredVersion(version=version, idx=idx, origin=Provenance.EXTERNAL)
)
# Declare external conditions with a local index into packages.yaml
for local_idx, spec in enumerate(external_specs):
msg = "%s available as external when satisfying %s" % (spec.name, spec)
def external_imposition(input_spec, requirements):
return requirements + [
fn.attr("external_conditions_hold", input_spec.name, local_idx)
]
self.condition(spec, spec, msg=msg, transform_imposed=external_imposition)
self.possible_versions[spec.name].add(spec.version)
self.gen.newline()
self.trigger_rules()
self.effect_rules()
@@ -1880,11 +1887,8 @@ def _spec_clauses(
)
clauses.append(f.variant_value(spec.name, vname, value))
if variant.propagate:
clauses.append(
f.variant_propagation_candidate(spec.name, vname, value, spec.name)
)
clauses.append(f.propagate(spec.name, fn.variant_value(vname, value)))
# Tell the concretizer that this is a possible value for the
# variant, to account for things like int/str values where we
@@ -2440,7 +2444,7 @@ def setup(
if using_libc_compatibility():
for libc in self.libcs:
self.gen.fact(fn.host_libc(libc.name, libc.version))
self.gen.fact(fn.allowed_libc(libc.name, libc.version))
if not allow_deprecated:
self.gen.fact(fn.deprecated_versions_not_allowed())
@@ -2739,7 +2743,7 @@ class _Head:
node_flag = fn.attr("node_flag_set")
node_flag_source = fn.attr("node_flag_source")
node_flag_propagate = fn.attr("node_flag_propagate")
variant_propagation_candidate = fn.attr("variant_propagation_candidate")
propagate = fn.attr("propagate")
class _Body:
@@ -2756,7 +2760,7 @@ class _Body:
node_flag = fn.attr("node_flag")
node_flag_source = fn.attr("node_flag_source")
node_flag_propagate = fn.attr("node_flag_propagate")
variant_propagation_candidate = fn.attr("variant_propagation_candidate")
propagate = fn.attr("propagate")
class ProblemInstanceBuilder:
@@ -3230,6 +3234,39 @@ def requires(self, impose: str, *, when: str):
self.runtime_conditions.add((imposed_spec, when_spec))
self.reset()
def propagate(self, constraint_str: str, *, when: str):
msg = "the 'propagate' method can be called only with pkg('*')"
assert self.current_package == "*", msg
when_spec = spack.spec.Spec(when)
assert when_spec.name is None, "only anonymous when specs are accepted"
placeholder = "XXX"
node_variable = "node(ID, Package)"
when_spec.name = placeholder
body_clauses = self._setup.spec_clauses(when_spec, body=True)
body_str = (
f" {f',{os.linesep} '.join(str(x) for x in body_clauses)},\n"
f" not external({node_variable}),\n"
f" not runtime(Package)"
).replace(f'"{placeholder}"', f"{node_variable}")
constraint_spec = spack.spec.Spec(constraint_str)
assert constraint_spec.name is None, "only anonymous constraint specs are accepted"
constraint_spec.name = placeholder
constraint_clauses = self._setup.spec_clauses(constraint_spec, body=False)
for clause in constraint_clauses:
if clause.args[0] == "node_compiler_version_satisfies":
self._setup.compiler_version_constraints.add(constraint_spec.compiler)
args = f'"{constraint_spec.compiler.name}", "{constraint_spec.compiler.versions}"'
head_str = f"propagate({node_variable}, node_compiler_version_satisfies({args}))"
rule = f"{head_str} :-\n{body_str}.\n\n"
self.rules.append(rule)
self.reset()
def consume_facts(self):
"""Consume the facts collected by this object, and emits rules and
facts for the runtimes.
@@ -3799,6 +3836,12 @@ class Solver:
def __init__(self):
self.driver = PyclingoDriver()
self.selector = ReusableSpecsSelector(configuration=spack.config.CONFIG)
if spack.platforms.host().name == "cray":
msg = (
"The Cray platform, i.e. 'platform=cray', will be removed in Spack v0.23. "
"All Cray machines will be then detected as 'platform=linux'."
)
warnings.warn(msg)
@staticmethod
def _check_input_and_extract_concrete_specs(specs):

View File

@@ -811,37 +811,6 @@ node_has_variant(node(ID, Package), Variant) :-
pkg_fact(Package, variant(Variant)),
attr("node", node(ID, Package)).
% Variant propagation is forwarded to dependencies
attr("variant_propagation_candidate", PackageNode, Variant, Value, Source) :-
attr("node", PackageNode),
depends_on(ParentNode, PackageNode),
attr("variant_value", node(_, Source), Variant, Value),
attr("variant_propagation_candidate", ParentNode, Variant, _, Source).
% If the node is a candidate, and it has the variant and value,
% then those variant and value should be propagated
attr("variant_propagate", node(ID, Package), Variant, Value, Source) :-
attr("variant_propagation_candidate", node(ID, Package), Variant, Value, Source),
node_has_variant(node(ID, Package), Variant),
pkg_fact(Package, variant_possible_value(Variant, Value)),
not attr("variant_set", node(ID, Package), Variant).
% Propagate the value, if there is the corresponding attribute
attr("variant_value", PackageNode, Variant, Value) :- attr("variant_propagate", PackageNode, Variant, Value, _).
% If a variant is propagated, we cannot have extraneous values (this is for multi valued variants)
variant_is_propagated(PackageNode, Variant) :- attr("variant_propagate", PackageNode, Variant, _, _).
:- variant_is_propagated(PackageNode, Variant),
attr("variant_value", PackageNode, Variant, Value),
not attr("variant_propagate", PackageNode, Variant, Value, _).
% Cannot receive different values from different sources on the same variant
error(100, "{0} and {1} cannot both propagate variant '{2}' to package {3} with values '{4}' and '{5}'", Source1, Source2, Variant, Package, Value1, Value2) :-
attr("variant_propagate", node(X, Package), Variant, Value1, Source1),
attr("variant_propagate", node(X, Package), Variant, Value2, Source2),
node_has_variant(node(X, Package), Variant),
Value1 < Value2, Source1 < Source2.
% a variant cannot be set if it is not a variant on the package
error(100, "Cannot set variant '{0}' for package '{1}' because the variant condition cannot be satisfied for the given spec", Variant, Package)
:- attr("variant_set", node(X, Package), Variant),
@@ -919,7 +888,7 @@ variant_not_default(node(ID, Package), Variant, Value)
% variants set explicitly on the CLI don't count as non-default
not attr("variant_set", node(ID, Package), Variant, Value),
% variant values forced by propagation don't count as non-default
not attr("variant_propagate", node(ID, Package), Variant, Value, _),
not propagate(node(ID, Package), variant_value(Variant, Value)),
% variants set on externals that we could use don't count as non-default
% this makes spack prefer to use an external over rebuilding with the
% default configuration
@@ -932,7 +901,7 @@ variant_default_not_used(node(ID, Package), Variant, Value)
:- variant_default_value(Package, Variant, Value),
node_has_variant(node(ID, Package), Variant),
not attr("variant_value", node(ID, Package), Variant, Value),
not attr("variant_propagate", node(ID, Package), Variant, _, _),
not propagate(node(ID, Package), variant_value(Variant, _)),
attr("node", node(ID, Package)).
% The variant is set in an external spec
@@ -989,6 +958,67 @@ pkg_fact(Package, variant_single_value("dev_path"))
#defined variant_default_value/3.
#defined variant_default_value_from_packages_yaml/3.
%-----------------------------------------------------------------------------
% Propagation semantics
%-----------------------------------------------------------------------------
% Propagation roots have a corresponding attr("propagate", ...)
propagate(RootNode, PropagatedAttribute) :- attr("propagate", RootNode, PropagatedAttribute).
% Propagate an attribute along edges to child nodes
propagate(ChildNode, PropagatedAttribute) :-
propagate(ParentNode, PropagatedAttribute),
depends_on(ParentNode, ChildNode).
%-----------------------------------------------------------------------------
% Activation of propagated values
%-----------------------------------------------------------------------------
%----
% Variants
%----
% If a variant is propagated, and can be accepted, set its value
attr("variant_value", node(ID, Package), Variant, Value) :-
propagate(node(ID, Package), variant_value(Variant, Value)),
node_has_variant(node(ID, Package), Variant),
pkg_fact(Package, variant_possible_value(Variant, Value)),
not attr("variant_set", node(ID, Package), Variant).
% If a variant is propagated, we cannot have extraneous values
variant_is_propagated(PackageNode, Variant) :-
attr("variant_value", PackageNode, Variant, Value),
propagate(PackageNode, variant_value(Variant, Value)),
not attr("variant_set", PackageNode, Variant).
:- variant_is_propagated(PackageNode, Variant),
attr("variant_value", PackageNode, Variant, Value),
not propagate(PackageNode, variant_value(Variant, Value)).
%----
% Compiler constraints
%----
attr("node_compiler_version_satisfies", node(ID, Package), Compiler, Version) :-
propagate(node(ID, Package), node_compiler_version_satisfies(Compiler, Version)),
node_compiler(node(ID, Package), CompilerID),
compiler_name(CompilerID, Compiler),
not runtime(Package),
not external(Package).
%-----------------------------------------------------------------------------
% Runtimes
%-----------------------------------------------------------------------------
% Check whether the DAG has any built package
has_built_packages() :- build(X), not external(X).
% If we build packages, the runtime nodes must use an available compiler
1 { node_compiler(PackageNode, CompilerID) : build(PackageNode), not external(PackageNode) } :-
has_built_packages(),
runtime(RuntimePackage),
node_compiler(node(_, RuntimePackage), CompilerID).
%-----------------------------------------------------------------------------
% Platform semantics
%-----------------------------------------------------------------------------
@@ -1090,10 +1120,18 @@ attr("node_target", PackageNode, Target)
:- attr("node", PackageNode), attr("node_target_set", PackageNode, Target).
% each node has the weight of its assigned target
node_target_weight(node(ID, Package), Weight)
:- attr("node", node(ID, Package)),
attr("node_target", node(ID, Package), Target),
target_weight(Target, Weight).
target_weight(Target, 0)
:- attr("node", PackageNode),
attr("node_target", PackageNode, Target),
attr("node_target_set", PackageNode, Target).
node_target_weight(PackageNode, MinWeight)
:- attr("node", PackageNode),
attr("node_target", PackageNode, Target),
target(Target),
MinWeight = #min { Weight : target_weight(Target, Weight) }.
:- attr("node_target", PackageNode, Target), not node_target_weight(PackageNode, _).
% compatibility rules for targets among nodes
node_target_match(ParentNode, DependencyNode)
@@ -1155,12 +1193,12 @@ error(10, "No valid compiler for {0} satisfies '%{1}'", Package, Compiler)
% If the compiler of a node must satisfy a constraint, then its version
% must be chosen among the ones that satisfy said constraint
error(100, "No valid version for '{0}' compiler '{1}' satisfies '@{2}'", Package, Compiler, Constraint)
error(100, "Package {0} cannot satisfy '%{1}@{2}'", Package, Compiler, Constraint)
:- attr("node", node(X, Package)),
attr("node_compiler_version_satisfies", node(X, Package), Compiler, Constraint),
not compiler_version_satisfies(Compiler, Constraint, _).
not compiler_version_satisfies(Compiler, Constraint, _).
error(100, "No valid version for '{0}' compiler '{1}' satisfies '@{2}'", Package, Compiler, Constraint)
error(100, "Package {0} cannot satisfy '%{1}@{2}'", Package, Compiler, Constraint)
:- attr("node", node(X, Package)),
attr("node_compiler_version_satisfies", node(X, Package), Compiler, Constraint),
not compiler_version_satisfies(Compiler, Constraint, ID),
@@ -1496,7 +1534,7 @@ opt_criterion(45, "preferred providers (non-roots)").
}.
% Try to minimize the number of compiler mismatches in the DAG.
opt_criterion(40, "compiler mismatches that are not from CLI").
opt_criterion(40, "compiler mismatches that are not required").
#minimize{ 0@240: #true }.
#minimize{ 0@40: #true }.
#minimize{
@@ -1506,7 +1544,7 @@ opt_criterion(40, "compiler mismatches that are not from CLI").
not runtime(Dependency)
}.
opt_criterion(39, "compiler mismatches that are not from CLI").
opt_criterion(39, "compiler mismatches that are required").
#minimize{ 0@239: #true }.
#minimize{ 0@39: #true }.
#minimize{

View File

@@ -4,21 +4,35 @@
% SPDX-License-Identifier: (Apache-2.0 OR MIT)
%=============================================================================
% Heuristic to speed-up solves (node with ID 0)
% Heuristic to speed up solves
%=============================================================================
% No duplicates by default (most of them will be true)
#heuristic attr("node", node(PackageID, Package)). [100, init]
#heuristic attr("node", node(PackageID, Package)). [ 2, factor]
#heuristic attr("virtual_node", node(VirtualID, Virtual)). [100, init]
#heuristic attr("node", node(1..X-1, Package)) : max_dupes(Package, X), not virtual(Package), X > 1. [-1, sign]
#heuristic attr("virtual_node", node(1..X-1, Package)) : max_dupes(Package, X), virtual(Package) , X > 1. [-1, sign]
%-----------------
% Domain heuristic
%-----------------
% Pick preferred version
#heuristic attr("version", node(PackageID, Package), Version) : pkg_fact(Package, version_declared(Version, Weight)), attr("node", node(PackageID, Package)). [40, init]
#heuristic version_weight(node(PackageID, Package), 0) : pkg_fact(Package, version_declared(Version, 0 )), attr("node", node(PackageID, Package)). [ 1, sign]
#heuristic attr("version", node(PackageID, Package), Version) : pkg_fact(Package, version_declared(Version, 0 )), attr("node", node(PackageID, Package)). [ 1, sign]
#heuristic attr("version", node(PackageID, Package), Version) : pkg_fact(Package, version_declared(Version, Weight)), attr("node", node(PackageID, Package)), Weight > 0. [-1, sign]
% Root node
#heuristic attr("version", node(0, Package), Version) : pkg_fact(Package, version_declared(Version, 0)), attr("root", node(0, Package)). [35, true]
#heuristic version_weight(node(0, Package), 0) : pkg_fact(Package, version_declared(Version, 0)), attr("root", node(0, Package)). [35, true]
#heuristic attr("variant_value", node(0, Package), Variant, Value) : variant_default_value(Package, Variant, Value), attr("root", node(0, Package)). [35, true]
#heuristic attr("node_target", node(0, Package), Target) : target_weight(Target, 0), attr("root", node(0, Package)). [35, true]
#heuristic node_target_weight(node(0, Package), 0) : attr("root", node(0, Package)). [35, true]
#heuristic node_compiler(node(0, Package), CompilerID) : compiler_weight(ID, 0), compiler_id(ID), attr("root", node(0, Package)). [35, true]
% Use default variants
#heuristic attr("variant_value", node(PackageID, Package), Variant, Value) : variant_default_value(Package, Variant, Value), attr("node", node(PackageID, Package)). [40, true]
#heuristic attr("variant_value", node(PackageID, Package), Variant, Value) : not variant_default_value(Package, Variant, Value), attr("node", node(PackageID, Package)). [40, false]
% Providers
#heuristic attr("node", node(0, Package)) : default_provider_preference(Virtual, Package, 0), possible_in_link_run(Package). [30, true]
% Use default operating system and platform
#heuristic attr("node_os", node(PackageID, Package), OS) : os(OS, 0), attr("root", node(PackageID, Package)). [40, true]
#heuristic attr("node_platform", node(PackageID, Package), Platform) : allowed_platform(Platform), attr("root", node(PackageID, Package)). [40, true]
% Use default targets
#heuristic attr("node_target", node(PackageID, Package), Target) : target_weight(Target, Weight), attr("node", node(PackageID, Package)). [30, init]
#heuristic attr("node_target", node(PackageID, Package), Target) : target_weight(Target, Weight), attr("node", node(PackageID, Package)). [ 2, factor]
#heuristic attr("node_target", node(PackageID, Package), Target) : target_weight(Target, 0), attr("node", node(PackageID, Package)). [ 1, sign]
#heuristic attr("node_target", node(PackageID, Package), Target) : target_weight(Target, Weight), attr("node", node(PackageID, Package)), Weight > 0. [-1, sign]
% Use the default compilers
#heuristic node_compiler(node(PackageID, Package), ID) : compiler_weight(ID, 0), compiler_id(ID), attr("node", node(PackageID, Package)). [30, init]

View File

@@ -1,24 +0,0 @@
% Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
% Spack Project Developers. See the top-level COPYRIGHT file for details.
%
% SPDX-License-Identifier: (Apache-2.0 OR MIT)
%=============================================================================
% Heuristic to speed-up solves (node with ID > 0)
%=============================================================================
% node(ID, _)
#heuristic attr("version", node(ID, Package), Version) : pkg_fact(Package, version_declared(Version, 0)), attr("node", node(ID, Package)), ID > 0. [25-5*ID, true]
#heuristic version_weight(node(ID, Package), 0) : pkg_fact(Package, version_declared(Version, 0)), attr("node", node(ID, Package)), ID > 0. [25-5*ID, true]
#heuristic attr("variant_value", node(ID, Package), Variant, Value) : variant_default_value(Package, Variant, Value), attr("node", node(ID, Package)), ID > 0. [25-5*ID, true]
#heuristic attr("node_target", node(ID, Package), Target) : pkg_fact(Package, target_weight(Target, 0)), attr("node", node(ID, Package)), ID > 0. [25-5*ID, true]
#heuristic node_target_weight(node(ID, Package), 0) : attr("node", node(ID, Package)), ID > 0. [25-5*ID, true]
#heuristic node_compiler(node(ID, Package), CompilerID) : compiler_weight(CompilerID, 0), compiler_id(CompilerID), attr("node", node(ID, Package)), ID > 0. [25-5*ID, true]
% node(ID, _), split build dependencies
#heuristic attr("version", node(ID, Package), Version) : pkg_fact(Package, version_declared(Version, 0)), attr("node", node(ID, Package)), multiple_unification_sets(Package), ID > 0. [25, true]
#heuristic version_weight(node(ID, Package), 0) : pkg_fact(Package, version_declared(Version, 0)), attr("node", node(ID, Package)), multiple_unification_sets(Package), ID > 0. [25, true]
#heuristic attr("variant_value", node(ID, Package), Variant, Value) : variant_default_value(Package, Variant, Value), attr("node", node(ID, Package)), multiple_unification_sets(Package), ID > 0. [25, true]
#heuristic attr("node_target", node(ID, Package), Target) : pkg_fact(Package, target_weight(Target, 0)), attr("node", node(ID, Package)), multiple_unification_sets(Package), ID > 0. [25, true]
#heuristic node_target_weight(node(ID, Package), 0) : attr("node", node(ID, Package)), multiple_unification_sets(Package), ID > 0. [25, true]
#heuristic node_compiler(node(ID, Package), CompilerID) : compiler_weight(CompilerID, 0), compiler_id(CompilerID), attr("node", node(ID, Package)), multiple_unification_sets(Package), ID > 0. [25, true]

View File

@@ -18,26 +18,15 @@ error(100, "Cannot reuse {0} since we cannot determine libc compatibility", Reus
ReusedPackage != LibcPackage,
not attr("compatible_libc", node(R, ReusedPackage), LibcPackage, LibcVersion).
% Check whether the DAG has any built package
has_built_packages() :- build(X), not external(X).
% A libc is needed in the DAG
:- has_built_packages(), not provider(_, node(0, "libc")).
% Non-libc reused specs must be host libc compatible. In case we build packages, we get a
% host compatible libc provider from other rules. If nothing is built, there is no libc provider,
% since it's pruned from reusable specs, meaning we have to explicitly impose reused specs are host
% compatible.
:- attr("hash", node(R, ReusedPackage), Hash),
not provider(node(R, ReusedPackage), node(0, "libc")),
not attr("compatible_libc", node(R, ReusedPackage), _, _).
% The libc provider must be one that a compiler can target
% The libc must be chosen among available ones
:- has_built_packages(),
provider(node(X, LibcPackage), node(0, "libc")),
attr("node", node(X, LibcPackage)),
attr("version", node(X, LibcPackage), LibcVersion),
not host_libc(LibcPackage, LibcVersion).
not allowed_libc(LibcPackage, LibcVersion).
% A built node must depend on libc
:- build(PackageNode),
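Read as prose, these rules say: whenever anything is built, the DAG must contain a libc provider chosen from the allowed ones, and a reused spec that is not itself a libc provider must be explicitly marked compatible with a host libc. A rough Python rendering of the reuse admissibility check (data layout and names are illustrative, not Spack's API):

```python
def reused_spec_admissible(spec, allowed_libcs):
    """spec: dict with a 'provides_libc' flag and 'compatible_libc' pairs."""
    if spec.get("provides_libc"):
        return True  # the spec is itself a libc provider
    # otherwise it must be compatible with some allowed host libc
    return any(pair in allowed_libcs for pair in spec.get("compatible_libc", []))

allowed = {("glibc", "2.35")}
print(reused_spec_admissible({"compatible_libc": [("glibc", "2.35")]}, allowed))  # True
print(reused_spec_admissible({"compatible_libc": []}, allowed))                   # False
```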

View File

@@ -12,6 +12,7 @@
%=============================================================================
% macOS
os_compatible("sequoia", "sonoma").
os_compatible("sonoma", "ventura").
os_compatible("ventura", "monterey").
os_compatible("monterey", "bigsur").

View File

@@ -1454,9 +1454,7 @@ def _get_dependency(self, name):
raise spack.error.SpecError(err_msg.format(name, len(deps)))
return deps[0]
def edges_from_dependents(
self, name=None, depflag: dt.DepFlag = dt.ALL
) -> List[DependencySpec]:
def edges_from_dependents(self, name=None, depflag: dt.DepFlag = dt.ALL):
"""Return a list of edges connecting this node in the DAG
to parents.
@@ -1466,9 +1464,7 @@ def edges_from_dependents(
"""
return [d for d in self._dependents.select(parent=name, depflag=depflag)]
def edges_to_dependencies(
self, name=None, depflag: dt.DepFlag = dt.ALL
) -> List[DependencySpec]:
def edges_to_dependencies(self, name=None, depflag: dt.DepFlag = dt.ALL):
"""Return a list of edges connecting this node in the DAG
to children.
@@ -2820,7 +2816,9 @@ def _old_concretize(self, tests=False, deprecation_warning=True):
# Check if we can produce an optimized binary (will throw if
# there are declared inconsistencies)
self.architecture.target.optimization_flags(self.compiler)
# No need on platform=cray because of the targeting modules
if not self.satisfies("platform=cray"):
self.architecture.target.optimization_flags(self.compiler)
def _patches_assigned(self):
"""Whether patches have been assigned to this spec by the concretizer."""
@@ -4166,29 +4164,21 @@ def __getitem__(self, name: str):
csv = query_parameters.pop().strip()
query_parameters = re.split(r"\s*,\s*", csv)
# In some cases a package appears multiple times in the same DAG for *distinct*
# specs. For example, a build-type dependency may itself depend on a package
# the current spec depends on, but their specs may differ. Therefore we iterate
# in an order here that prioritizes the build, test and runtime dependencies;
# only when we don't find the package do we consider the full DAG.
order = lambda: itertools.chain(
self.traverse(deptype="link"),
self.dependencies(deptype=dt.BUILD | dt.RUN | dt.TEST),
self.traverse(), # fall back to a full search
self.traverse_edges(deptype=dt.LINK, order="breadth", cover="edges"),
self.edges_to_dependencies(depflag=dt.BUILD | dt.RUN | dt.TEST),
self.traverse_edges(deptype=dt.ALL, order="breadth", cover="edges"),
)
# Consider runtime dependencies and direct build/test deps before transitive dependencies,
# and prefer matches closest to the root.
try:
child: Spec = next(
itertools.chain(
# Regular specs
(x for x in order() if x.name == name),
(
x
for x in order()
if (not x.virtual)
and any(name in edge.virtuals for edge in x.edges_from_dependents())
),
(x for x in order() if (not x.virtual) and x.package.provides(name)),
e.spec
for e in itertools.chain(
(e for e in order() if e.spec.name == name or name in e.virtuals),
# for historical reasons
(e for e in order() if e.spec.concrete and e.spec.package.provides(name)),
)
)
except StopIteration:
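The replacement search walks edges in a deliberate order: link-type edges breadth-first from this node, then its direct build/run/test edges, and only then the full DAG, so the match closest to the root wins. The pattern, stripped of Spack specifics (package names here are placeholders):

```python
import itertools

link_closure = ["zlib", "openssl"]            # closest: link-time closure
direct_other = ["cmake", "python"]            # then: direct build/run/test deps
full_dag = ["zlib", "openssl", "cmake", "python", "gmake"]  # fallback

order = lambda: itertools.chain(link_closure, direct_other, full_dag)
print(next(x for x in order() if x == "cmake"))  # found among the direct deps
```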
@@ -4430,9 +4420,12 @@ def format_attribute(match_object: Match) -> str:
if part.startswith("_"):
raise SpecFormatStringError("Attempted to format private attribute")
else:
if part == "variants" and isinstance(current, vt.VariantMap):
if isinstance(current, vt.VariantMap):
# subscript instead of getattr for variant names
current = current[part]
try:
current = current[part]
except KeyError:
raise SpecFormatStringError(f"Variant '{part}' does not exist")
else:
# aliases
if part == "arch":

View File

@@ -346,8 +346,6 @@ class Stage(LockableStagingDir):
similar, and are intended to persist for only one run of spack.
"""
#: Most staging is managed by Spack. DIYStage is one exception.
needs_fetching = True
requires_patch_success = True
def __init__(
@@ -772,8 +770,6 @@ def __init__(self):
"cache_mirror",
"steal_source",
"disable_mirrors",
"needs_fetching",
"requires_patch_success",
]
)
@@ -812,6 +808,10 @@ def path(self):
def archive_file(self):
return self[0].archive_file
@property
def requires_patch_success(self):
return self[0].requires_patch_success
@property
def keep(self):
return self[0].keep
@@ -822,64 +822,7 @@ def keep(self, value):
item.keep = value
class DIYStage:
"""
Simple class that allows any directory to be a spack stage. Consequently,
it does not expect or require that the source path adhere to the standard
directory naming convention.
"""
needs_fetching = False
requires_patch_success = False
def __init__(self, path):
if path is None:
raise ValueError("Cannot construct DIYStage without a path.")
elif not os.path.isdir(path):
raise StagePathError("The stage path directory does not exist:", path)
self.archive_file = None
self.path = path
self.source_path = path
self.created = True
# DIY stages do nothing as context managers.
def __enter__(self):
pass
def __exit__(self, exc_type, exc_val, exc_tb):
pass
def fetch(self, *args, **kwargs):
tty.debug("No need to fetch for DIY.")
def check(self):
tty.debug("No checksum needed for DIY.")
def expand_archive(self):
tty.debug("Using source directory: {0}".format(self.source_path))
@property
def expanded(self):
"""Returns True since the source_path must exist."""
return True
def restage(self):
raise RestageError("Cannot restage a DIY stage.")
def create(self):
self.created = True
def destroy(self):
# No need to destroy DIY stage.
pass
def cache_local(self):
tty.debug("Sources for DIY stages are not cached")
class DevelopStage(LockableStagingDir):
needs_fetching = False
requires_patch_success = False
def __init__(self, name, dev_path, reference_link):

View File

@@ -371,7 +371,6 @@ def use_store(
data.update(extra_data)
# Swap the store with the one just constructed and return it
ensure_singleton_created()
spack.config.CONFIG.push_scope(
spack.config.InternalConfigScope(name=scope_name, data={"config": {"install_tree": data}})
)

View File

@@ -2,12 +2,16 @@
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
import os
import platform
import sys
import pytest
import archspec.cpu
import llnl.util.filesystem as fs
import spack.compilers
import spack.concretize
import spack.operating_systems
@@ -21,8 +25,9 @@ def current_host_platform():
"""Return the platform of the current host as detected by the
'platform' stdlib package.
"""
current_platform = None
if "Linux" in platform.system():
if os.path.exists("/opt/cray/pe"):
current_platform = spack.platforms.Cray()
elif "Linux" in platform.system():
current_platform = spack.platforms.Linux()
elif "Darwin" in platform.system():
current_platform = spack.platforms.Darwin()
@@ -213,7 +218,34 @@ def test_satisfy_strict_constraint_when_not_concrete(architecture_tuple, constra
str(archspec.cpu.host().family) != "x86_64", reason="tests are for x86_64 uarch ranges"
)
def test_concretize_target_ranges(root_target_range, dep_target_range, result, monkeypatch):
spec = Spec(f"a %gcc@10 foobar=bar target={root_target_range} ^b target={dep_target_range}")
spec = Spec(
f"pkg-a %gcc@10 foobar=bar target={root_target_range} ^pkg-b target={dep_target_range}"
)
with spack.concretize.disable_compiler_existence_check():
spec.concretize()
assert spec.target == spec["b"].target == result
assert spec.target == spec["pkg-b"].target == result
@pytest.mark.parametrize(
"versions,default,expected",
[
(["21.11", "21.9"], "21.11", False),
(["21.11", "21.9"], "21.9", True),
(["21.11", "21.9"], None, False),
],
)
@pytest.mark.skipif(sys.platform == "win32", reason="Cray does not use windows")
def test_cray_platform_detection(versions, default, expected, tmpdir, monkeypatch, working_env):
ex_path = str(tmpdir.join("fake_craype_dir"))
fs.mkdirp(ex_path)
with fs.working_dir(ex_path):
for version in versions:
fs.touch(version)
if default:
os.symlink(default, "default")
monkeypatch.setattr(spack.platforms.cray, "_ex_craype_dir", ex_path)
os.environ["MODULEPATH"] = "/opt/cray/pe"
assert spack.platforms.cray.Cray.detect() == expected

View File

@@ -19,8 +19,6 @@
(["missing-dependency"], ["PKG-DIRECTIVES", "PKG-PROPERTIES"]),
# The package uses a non-existent variant in a depends_on directive

(["wrong-variant-in-depends-on"], ["PKG-DIRECTIVES", "PKG-PROPERTIES"]),
# This package has a GitHub pull request commit patch URL
(["invalid-github-pull-commits-patch-url"], ["PKG-DIRECTIVES", "PKG-PROPERTIES"]),
# This package has a GitHub patch URL without full_index=1
(["invalid-github-patch-url"], ["PKG-DIRECTIVES", "PKG-PROPERTIES"]),
# This package has invalid GitLab patch URLs

View File

@@ -228,3 +228,25 @@ def test_source_is_disabled(mutable_config):
spack.config.add("bootstrap:trusted:{0}:{1}".format(conf["name"], False))
with pytest.raises(ValueError):
spack.bootstrap.core.source_is_enabled_or_raise(conf)
@pytest.mark.regression("45247")
def test_use_store_does_not_try_writing_outside_root(tmp_path, monkeypatch, mutable_config):
"""Tests that when we use the 'use_store' context manager, there is no attempt at creating
a Store outside the given root.
"""
initial_store = mutable_config.get("config:install_tree:root")
user_store = tmp_path / "store"
fn = spack.store.Store.__init__
def _checked_init(self, root, *args, **kwargs):
fn(self, root, *args, **kwargs)
assert self.root == str(user_store)
monkeypatch.setattr(spack.store.Store, "__init__", _checked_init)
spack.store.reinitialize()
with spack.store.use_store(user_store):
assert spack.config.CONFIG.get("config:install_tree:root") == str(user_store)
assert spack.config.CONFIG.get("config:install_tree:root") == initial_store
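For context, a usage sketch of the context manager under test (the path is illustrative): inside the block the install-tree root is swapped via an internal config scope, and the previous root is restored on exit.

```python
import spack.config
import spack.store

with spack.store.use_store("/tmp/alt-store"):
    # operations here see the alternate install tree
    assert spack.config.CONFIG.get("config:install_tree:root") == "/tmp/alt-store"
# the prior root is back in effect here
```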

View File

@@ -457,14 +457,14 @@ def test_parallel_false_is_not_propagating(default_mock_concretization):
# a foobar=bar (parallel = False)
# |
# b (parallel =True)
s = default_mock_concretization("a foobar=bar")
s = default_mock_concretization("pkg-a foobar=bar")
spack.build_environment.set_package_py_globals(s.package, context=Context.BUILD)
assert s["a"].package.module.make_jobs == 1
assert s["pkg-a"].package.module.make_jobs == 1
spack.build_environment.set_package_py_globals(s["b"].package, context=Context.BUILD)
assert s["b"].package.module.make_jobs == spack.build_environment.determine_number_of_jobs(
parallel=s["b"].package.parallel
spack.build_environment.set_package_py_globals(s["pkg-b"].package, context=Context.BUILD)
assert s["pkg-b"].package.module.make_jobs == spack.build_environment.determine_number_of_jobs(
parallel=s["pkg-b"].package.parallel
)
@@ -556,6 +556,24 @@ def test_build_jobs_defaults():
)
def test_dirty_disable_module_unload(config, mock_packages, working_env, mock_module_cmd):
"""Test that on CRAY platform 'module unload' is not called if the 'dirty'
option is on.
"""
s = spack.spec.Spec("pkg-a").concretized()
# If called with "dirty" we don't unload modules, so no calls to the
# `module` function on Cray
spack.build_environment.setup_package(s.package, dirty=True)
assert not mock_module_cmd.calls
# If called without "dirty" we unload modules on Cray
spack.build_environment.setup_package(s.package, dirty=False)
assert mock_module_cmd.calls
assert any(("unload", "cray-libsci") == item[0] for item in mock_module_cmd.calls)
assert any(("unload", "cray-mpich") == item[0] for item in mock_module_cmd.calls)
class TestModuleMonkeyPatcher:
def test_getting_attributes(self, default_mock_concretization):
s = default_mock_concretization("libelf")

View File

@@ -97,7 +97,7 @@ def test_negative_ninja_check(self, input_dir, test_dir, concretize_and_setup):
@pytest.mark.usefixtures("config", "mock_packages")
class TestAutotoolsPackage:
def test_with_or_without(self, default_mock_concretization):
s = default_mock_concretization("a")
s = default_mock_concretization("pkg-a")
options = s.package.with_or_without("foo")
# Ensure that values that are not representing a feature
@@ -129,7 +129,7 @@ def activate(value):
assert "--without-lorem-ipsum" in options
def test_none_is_allowed(self, default_mock_concretization):
s = default_mock_concretization("a foo=none")
s = default_mock_concretization("pkg-a foo=none")
options = s.package.with_or_without("foo")
# Ensure that values that are not representing a feature

View File

@@ -12,21 +12,21 @@
def test_build_task_errors(install_mockery):
with pytest.raises(ValueError, match="must be a package"):
inst.BuildTask("abc", None, False, 0, 0, 0, set())
inst.BuildTask("abc", None, False, 0, 0, 0, [])
spec = spack.spec.Spec("trivial-install-test-package")
pkg_cls = spack.repo.PATH.get_pkg_class(spec.name)
with pytest.raises(ValueError, match="must have a concrete spec"):
inst.BuildTask(pkg_cls(spec), None, False, 0, 0, 0, set())
inst.BuildTask(pkg_cls(spec), None, False, 0, 0, 0, [])
spec.concretize()
assert spec.concrete
with pytest.raises(ValueError, match="must have a build request"):
inst.BuildTask(spec.package, None, False, 0, 0, 0, set())
inst.BuildTask(spec.package, None, False, 0, 0, 0, [])
request = inst.BuildRequest(spec.package, {})
with pytest.raises(inst.InstallError, match="Cannot create a build task"):
inst.BuildTask(spec.package, request, False, 0, 0, inst.STATUS_REMOVED, set())
inst.BuildTask(spec.package, request, False, 0, 0, inst.STATUS_REMOVED, [])
def test_build_task_basics(install_mockery):
@@ -36,8 +36,8 @@ def test_build_task_basics(install_mockery):
# Ensure key properties match expectations
request = inst.BuildRequest(spec.package, {})
task = inst.BuildTask(spec.package, request, False, 0, 0, inst.STATUS_ADDED, set())
assert not task.explicit
task = inst.BuildTask(spec.package, request, False, 0, 0, inst.STATUS_ADDED, [])
assert task.explicit # package was "explicitly" requested
assert task.priority == len(task.uninstalled_deps)
assert task.key == (task.priority, task.sequence)
@@ -58,7 +58,7 @@ def test_build_task_strings(install_mockery):
# Ensure key properties match expectations
request = inst.BuildRequest(spec.package, {})
task = inst.BuildTask(spec.package, request, False, 0, 0, inst.STATUS_ADDED, set())
task = inst.BuildTask(spec.package, request, False, 0, 0, inst.STATUS_ADDED, [])
# Cover __repr__
irep = task.__repr__()

View File

@@ -106,24 +106,24 @@ def test_specs_staging(config, tmpdir):
"""
builder = repo.MockRepositoryBuilder(tmpdir)
builder.add_package("g")
builder.add_package("f")
builder.add_package("e")
builder.add_package("d", dependencies=[("f", None, None), ("g", None, None)])
builder.add_package("c")
builder.add_package("b", dependencies=[("d", None, None), ("e", None, None)])
builder.add_package("a", dependencies=[("b", None, None), ("c", None, None)])
builder.add_package("pkg-g")
builder.add_package("pkg-f")
builder.add_package("pkg-e")
builder.add_package("pkg-d", dependencies=[("pkg-f", None, None), ("pkg-g", None, None)])
builder.add_package("pkg-c")
builder.add_package("pkg-b", dependencies=[("pkg-d", None, None), ("pkg-e", None, None)])
builder.add_package("pkg-a", dependencies=[("pkg-b", None, None), ("pkg-c", None, None)])
with repo.use_repositories(builder.root):
spec_a = Spec("a").concretized()
spec_a = Spec("pkg-a").concretized()
spec_a_label = ci._spec_ci_label(spec_a)
spec_b_label = ci._spec_ci_label(spec_a["b"])
spec_c_label = ci._spec_ci_label(spec_a["c"])
spec_d_label = ci._spec_ci_label(spec_a["d"])
spec_e_label = ci._spec_ci_label(spec_a["e"])
spec_f_label = ci._spec_ci_label(spec_a["f"])
spec_g_label = ci._spec_ci_label(spec_a["g"])
spec_b_label = ci._spec_ci_label(spec_a["pkg-b"])
spec_c_label = ci._spec_ci_label(spec_a["pkg-c"])
spec_d_label = ci._spec_ci_label(spec_a["pkg-d"])
spec_e_label = ci._spec_ci_label(spec_a["pkg-e"])
spec_f_label = ci._spec_ci_label(spec_a["pkg-f"])
spec_g_label = ci._spec_ci_label(spec_a["pkg-g"])
spec_labels, dependencies, stages = ci.stage_spec_jobs([spec_a])
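`stage_spec_jobs` groups specs so that each job's dependencies land in strictly earlier stages. A standalone sketch of that layering over the same mock DAG (Kahn-style level assignment; not Spack's implementation):

```python
deps = {"pkg-a": {"pkg-b", "pkg-c"}, "pkg-b": {"pkg-d", "pkg-e"},
        "pkg-d": {"pkg-f", "pkg-g"}, "pkg-c": set(), "pkg-e": set(),
        "pkg-f": set(), "pkg-g": set()}

stages, placed = [], set()
while len(placed) < len(deps):
    ready = {p for p, d in deps.items() if p not in placed and d <= placed}
    stages.append(sorted(ready))
    placed |= ready

print(stages)  # leaves first, pkg-a in the final stage
```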
@@ -1290,7 +1290,7 @@ def test_ci_generate_override_runner_attrs(
spack:
specs:
- flatten-deps
- a
- pkg-a
mirrors:
some-mirror: https://my.fake.mirror
ci:
@@ -1307,12 +1307,12 @@ def test_ci_generate_override_runner_attrs(
- match:
- dependency-install
- match:
- a
- pkg-a
build-job:
tags:
- specific-a-2
- match:
- a
- pkg-a
build-job-remove:
tags:
- toplevel2
@@ -1372,8 +1372,8 @@ def test_ci_generate_override_runner_attrs(
assert global_vars["SPACK_CHECKOUT_VERSION"] == git_version or "v0.20.0.test0"
for ci_key in yaml_contents.keys():
if ci_key.startswith("a"):
# Make sure a's attributes override variables, and all the
if ci_key.startswith("pkg-a"):
# Make sure pkg-a's attributes override variables, and all the
# scripts. Also, make sure the 'toplevel' tag doesn't
# appear twice, but that a's specific extra tag does appear
the_elt = yaml_contents[ci_key]
@@ -1830,7 +1830,7 @@ def test_ci_generate_read_broken_specs_url(
tmpdir, mutable_mock_env_path, install_mockery, mock_packages, monkeypatch, ci_base_environment
):
"""Verify that `broken-specs-url` works as intended"""
spec_a = Spec("a")
spec_a = Spec("pkg-a")
spec_a.concretize()
a_dag_hash = spec_a.dag_hash()
@@ -1856,7 +1856,7 @@ def test_ci_generate_read_broken_specs_url(
spack:
specs:
- flatten-deps
- a
- pkg-a
mirrors:
some-mirror: https://my.fake.mirror
ci:
@@ -1864,9 +1864,9 @@ def test_ci_generate_read_broken_specs_url(
pipeline-gen:
- submapping:
- match:
- a
- pkg-a
- flatten-deps
- b
- pkg-b
- dependency-install
build-job:
tags:

View File

@@ -11,7 +11,6 @@
import spack.caches
import spack.cmd.clean
import spack.environment as ev
import spack.main
import spack.package_base
import spack.stage
@@ -69,20 +68,6 @@ def test_function_calls(command_line, effects, mock_calls_for_clean):
assert mock_calls_for_clean[name] == (1 if name in effects else 0)
def test_env_aware_clean(mock_stage, install_mockery, mutable_mock_env_path, monkeypatch):
e = ev.create("test", with_view=False)
e.add("mpileaks")
e.concretize()
def fail(*args, **kwargs):
raise Exception("This should not have been called")
monkeypatch.setattr(spack.spec.Spec, "concretize", fail)
with e:
clean("mpileaks")
def test_remove_python_cache(tmpdir, monkeypatch):
cache_files = ["file1.pyo", "file2.pyc"]
source_file = "file1.py"

View File

@@ -81,14 +81,14 @@ def test_match_spec_env(mock_packages, mutable_mock_env_path):
"""
# Initial sanity check: we are planning on choosing a non-default
# value, so make sure that is in fact not the default.
check_defaults = spack.cmd.parse_specs(["a"], concretize=True)[0]
check_defaults = spack.cmd.parse_specs(["pkg-a"], concretize=True)[0]
assert not check_defaults.satisfies("foobar=baz")
e = ev.create("test")
e.add("a foobar=baz")
e.add("pkg-a foobar=baz")
e.concretize()
with e:
env_spec = spack.cmd.matching_spec_from_env(spack.cmd.parse_specs(["a"])[0])
env_spec = spack.cmd.matching_spec_from_env(spack.cmd.parse_specs(["pkg-a"])[0])
assert env_spec.satisfies("foobar=baz")
assert env_spec.concrete
@@ -96,12 +96,12 @@ def test_match_spec_env(mock_packages, mutable_mock_env_path):
@pytest.mark.usefixtures("config")
def test_multiple_env_match_raises_error(mock_packages, mutable_mock_env_path):
e = ev.create("test")
e.add("a foobar=baz")
e.add("a foobar=fee")
e.add("pkg-a foobar=baz")
e.add("pkg-a foobar=fee")
e.concretize()
with e:
with pytest.raises(ev.SpackEnvironmentError) as exc_info:
spack.cmd.matching_spec_from_env(spack.cmd.parse_specs(["a"])[0])
spack.cmd.matching_spec_from_env(spack.cmd.parse_specs(["pkg-a"])[0])
assert "matches multiple specs" in exc_info.value.message
@@ -109,16 +109,16 @@ def test_multiple_env_match_raises_error(mock_packages, mutable_mock_env_path):
@pytest.mark.usefixtures("config")
def test_root_and_dep_match_returns_root(mock_packages, mutable_mock_env_path):
e = ev.create("test")
e.add("b@0.9")
e.add("a foobar=bar") # Depends on b, should choose b@1.0
e.add("pkg-b@0.9")
e.add("pkg-a foobar=bar") # Depends on b, should choose b@1.0
e.concretize()
with e:
# This query matches the root b and b as a dependency of a. In that
# case the root instance should be preferred.
env_spec1 = spack.cmd.matching_spec_from_env(spack.cmd.parse_specs(["b"])[0])
env_spec1 = spack.cmd.matching_spec_from_env(spack.cmd.parse_specs(["pkg-b"])[0])
assert env_spec1.satisfies("@0.9")
env_spec2 = spack.cmd.matching_spec_from_env(spack.cmd.parse_specs(["b@1.0"])[0])
env_spec2 = spack.cmd.matching_spec_from_env(spack.cmd.parse_specs(["pkg-b@1.0"])[0])
assert env_spec2

View File

@@ -51,8 +51,8 @@ def test_concretize_root_test_dependencies_are_concretized(unify, mutable_mock_e
with ev.read("test") as e:
e.unify = unify
add("a")
add("b")
add("pkg-a")
add("pkg-b")
concretize("--test", "root")
assert e.matching_spec("test-dependency")

View File

@@ -15,26 +15,26 @@
def test_env(mutable_mock_env_path, config, mock_packages):
ev.create("test")
with ev.read("test") as e:
e.add("a@2.0 foobar=bar ^b@1.0")
e.add("a@1.0 foobar=bar ^b@0.9")
e.add("pkg-a@2.0 foobar=bar ^pkg-b@1.0")
e.add("pkg-a@1.0 foobar=bar ^pkg-b@0.9")
e.concretize()
e.write()
def test_deconcretize_dep(test_env):
with ev.read("test") as e:
deconcretize("-y", "b@1.0")
deconcretize("-y", "pkg-b@1.0")
specs = [s for s, _ in e.concretized_specs()]
assert len(specs) == 1
assert specs[0].satisfies("a@1.0")
assert specs[0].satisfies("pkg-a@1.0")
def test_deconcretize_all_dep(test_env):
with ev.read("test") as e:
with pytest.raises(SpackCommandError):
deconcretize("-y", "b")
deconcretize("-y", "--all", "b")
deconcretize("-y", "pkg-b")
deconcretize("-y", "--all", "pkg-b")
specs = [s for s, _ in e.concretized_specs()]
assert len(specs) == 0
@@ -42,27 +42,27 @@ def test_deconcretize_all_dep(test_env):
def test_deconcretize_root(test_env):
with ev.read("test") as e:
output = deconcretize("-y", "--root", "b@1.0")
output = deconcretize("-y", "--root", "pkg-b@1.0")
assert "No matching specs to deconcretize" in output
assert len(e.concretized_order) == 2
deconcretize("-y", "--root", "a@2.0")
deconcretize("-y", "--root", "pkg-a@2.0")
specs = [s for s, _ in e.concretized_specs()]
assert len(specs) == 1
assert specs[0].satisfies("a@1.0")
assert specs[0].satisfies("pkg-a@1.0")
def test_deconcretize_all_root(test_env):
with ev.read("test") as e:
with pytest.raises(SpackCommandError):
deconcretize("-y", "--root", "a")
deconcretize("-y", "--root", "pkg-a")
output = deconcretize("-y", "--root", "--all", "b")
output = deconcretize("-y", "--root", "--all", "pkg-b")
assert "No matching specs to deconcretize" in output
assert len(e.concretized_order) == 2
deconcretize("-y", "--root", "--all", "a")
deconcretize("-y", "--root", "--all", "pkg-a")
specs = [s for s, _ in e.concretized_specs()]
assert len(specs) == 0

View File

@@ -125,8 +125,18 @@ def print_spack_cc(*args):
print(os.environ.get("CC", ""))
# `module unload cray-libsci` in test environment causes failure
# It does not fail for actual installs
# build_environment.py imports module directly, so we monkeypatch it there
# rather than in module_cmd
def mock_module_noop(*args):
pass
def test_dev_build_drop_in(tmpdir, mock_packages, monkeypatch, install_mockery, working_env):
monkeypatch.setattr(os, "execvp", print_spack_cc)
monkeypatch.setattr(spack.build_environment, "module", mock_module_noop)
with tmpdir.as_cwd():
output = dev_build("-b", "edit", "--drop-in", "sh", "dev-build-test-install@0.0.0")
assert "lib/spack/env" in output

View File

@@ -28,7 +28,9 @@
import spack.package_base
import spack.paths
import spack.repo
import spack.store
import spack.util.spack_json as sjson
import spack.util.spack_yaml
from spack.cmd.env import _env_create
from spack.main import SpackCommand, SpackCommandError
from spack.spec import Spec
@@ -501,7 +503,7 @@ def test_env_install_two_specs_same_dep(install_mockery, mock_fetch, tmpdir, cap
"""\
spack:
specs:
- a
- pkg-a
- depb
"""
)
@@ -520,8 +522,8 @@ def test_env_install_two_specs_same_dep(install_mockery, mock_fetch, tmpdir, cap
depb = spack.store.STORE.db.query_one("depb", installed=True)
assert depb, "Expected depb to be installed"
a = spack.store.STORE.db.query_one("a", installed=True)
assert a, "Expected a to be installed"
a = spack.store.STORE.db.query_one("pkg-a", installed=True)
assert a, "Expected pkg-a to be installed"
def test_remove_after_concretize():
@@ -825,7 +827,7 @@ def test_env_view_external_prefix(tmp_path, mutable_database, mock_packages):
"""\
spack:
specs:
- a
- pkg-a
view: true
"""
)
@@ -833,9 +835,9 @@ def test_env_view_external_prefix(tmp_path, mutable_database, mock_packages):
external_config = io.StringIO(
"""\
packages:
a:
pkg-a:
externals:
- spec: a@2.0
- spec: pkg-a@2.0
prefix: {a_prefix}
buildable: false
""".format(
@@ -1737,6 +1739,17 @@ def test_env_include_concrete_env_yaml(env_name):
assert test.path in combined_yaml["include_concrete"]
@pytest.mark.regression("45766")
@pytest.mark.parametrize("format", ["v1", "v2", "v3"])
def test_env_include_concrete_old_env(format, tmpdir):
lockfile = os.path.join(spack.paths.test_path, "data", "legacy_env", f"{format}.lock")
# create an env from old .lock file -- this does not update the format
env("create", "old-env", lockfile)
env("create", "--include-concrete", "old-env", "test")
assert ev.read("old-env").all_specs() == ev.read("test").all_specs()
def test_env_bad_include_concrete_env():
with pytest.raises(ev.SpackEnvironmentError):
env("create", "--include-concrete", "nonexistant_env", "combined_env")

View File

@@ -89,7 +89,7 @@ def check(pkg):
assert pkg.run_tests
monkeypatch.setattr(spack.package_base.PackageBase, "unit_test_check", check)
install("--test=all", "a")
install("--test=all", "pkg-a")
def test_install_package_already_installed(
@@ -570,61 +570,58 @@ def test_cdash_upload_build_error(tmpdir, mock_fetch, install_mockery, capfd):
@pytest.mark.disable_clean_stage_check
def test_cdash_upload_clean_build(tmpdir, mock_fetch, install_mockery, capfd):
# capfd interferes with Spack's capturing of e.g., Build.xml output
with capfd.disabled():
with tmpdir.as_cwd():
install("--log-file=cdash_reports", "--log-format=cdash", "a")
report_dir = tmpdir.join("cdash_reports")
assert report_dir in tmpdir.listdir()
report_file = report_dir.join("a_Build.xml")
assert report_file in report_dir.listdir()
content = report_file.open().read()
assert "</Build>" in content
assert "<Text>" not in content
with capfd.disabled(), tmpdir.as_cwd():
install("--log-file=cdash_reports", "--log-format=cdash", "pkg-a")
report_dir = tmpdir.join("cdash_reports")
assert report_dir in tmpdir.listdir()
report_file = report_dir.join("pkg-a_Build.xml")
assert report_file in report_dir.listdir()
content = report_file.open().read()
assert "</Build>" in content
assert "<Text>" not in content
@pytest.mark.disable_clean_stage_check
def test_cdash_upload_extra_params(tmpdir, mock_fetch, install_mockery, capfd):
# capfd interferes with Spack's capture of e.g., Build.xml output
with capfd.disabled():
with tmpdir.as_cwd():
install(
"--log-file=cdash_reports",
"--log-format=cdash",
"--cdash-build=my_custom_build",
"--cdash-site=my_custom_site",
"--cdash-track=my_custom_track",
"a",
)
report_dir = tmpdir.join("cdash_reports")
assert report_dir in tmpdir.listdir()
report_file = report_dir.join("a_Build.xml")
assert report_file in report_dir.listdir()
content = report_file.open().read()
assert 'Site BuildName="my_custom_build - a"' in content
assert 'Name="my_custom_site"' in content
assert "-my_custom_track" in content
with capfd.disabled(), tmpdir.as_cwd():
install(
"--log-file=cdash_reports",
"--log-format=cdash",
"--cdash-build=my_custom_build",
"--cdash-site=my_custom_site",
"--cdash-track=my_custom_track",
"pkg-a",
)
report_dir = tmpdir.join("cdash_reports")
assert report_dir in tmpdir.listdir()
report_file = report_dir.join("pkg-a_Build.xml")
assert report_file in report_dir.listdir()
content = report_file.open().read()
assert 'Site BuildName="my_custom_build - pkg-a"' in content
assert 'Name="my_custom_site"' in content
assert "-my_custom_track" in content
@pytest.mark.disable_clean_stage_check
def test_cdash_buildstamp_param(tmpdir, mock_fetch, install_mockery, capfd):
# capfd interferes with Spack's capture of e.g., Build.xml output
with capfd.disabled():
with tmpdir.as_cwd():
cdash_track = "some_mocked_track"
buildstamp_format = "%Y%m%d-%H%M-{0}".format(cdash_track)
buildstamp = time.strftime(buildstamp_format, time.localtime(int(time.time())))
install(
"--log-file=cdash_reports",
"--log-format=cdash",
"--cdash-buildstamp={0}".format(buildstamp),
"a",
)
report_dir = tmpdir.join("cdash_reports")
assert report_dir in tmpdir.listdir()
report_file = report_dir.join("a_Build.xml")
assert report_file in report_dir.listdir()
content = report_file.open().read()
assert buildstamp in content
with capfd.disabled(), tmpdir.as_cwd():
cdash_track = "some_mocked_track"
buildstamp_format = "%Y%m%d-%H%M-{0}".format(cdash_track)
buildstamp = time.strftime(buildstamp_format, time.localtime(int(time.time())))
install(
"--log-file=cdash_reports",
"--log-format=cdash",
"--cdash-buildstamp={0}".format(buildstamp),
"pkg-a",
)
report_dir = tmpdir.join("cdash_reports")
assert report_dir in tmpdir.listdir()
report_file = report_dir.join("pkg-a_Build.xml")
assert report_file in report_dir.listdir()
content = report_file.open().read()
assert buildstamp in content
@pytest.mark.disable_clean_stage_check
@@ -632,38 +629,37 @@ def test_cdash_install_from_spec_json(
tmpdir, mock_fetch, install_mockery, capfd, mock_packages, mock_archive, config
):
# capfd interferes with Spack's capturing
with capfd.disabled():
with tmpdir.as_cwd():
spec_json_path = str(tmpdir.join("spec.json"))
with capfd.disabled(), tmpdir.as_cwd():
spec_json_path = str(tmpdir.join("spec.json"))
pkg_spec = Spec("a")
pkg_spec.concretize()
pkg_spec = Spec("pkg-a")
pkg_spec.concretize()
with open(spec_json_path, "w") as fd:
fd.write(pkg_spec.to_json(hash=ht.dag_hash))
with open(spec_json_path, "w") as fd:
fd.write(pkg_spec.to_json(hash=ht.dag_hash))
install(
"--log-format=cdash",
"--log-file=cdash_reports",
"--cdash-build=my_custom_build",
"--cdash-site=my_custom_site",
"--cdash-track=my_custom_track",
"-f",
spec_json_path,
)
install(
"--log-format=cdash",
"--log-file=cdash_reports",
"--cdash-build=my_custom_build",
"--cdash-site=my_custom_site",
"--cdash-track=my_custom_track",
"-f",
spec_json_path,
)
report_dir = tmpdir.join("cdash_reports")
assert report_dir in tmpdir.listdir()
report_file = report_dir.join("a_Configure.xml")
assert report_file in report_dir.listdir()
content = report_file.open().read()
install_command_regex = re.compile(
r"<ConfigureCommand>(.+)</ConfigureCommand>", re.MULTILINE | re.DOTALL
)
m = install_command_regex.search(content)
assert m
install_command = m.group(1)
assert "a@" in install_command
report_dir = tmpdir.join("cdash_reports")
assert report_dir in tmpdir.listdir()
report_file = report_dir.join("pkg-a_Configure.xml")
assert report_file in report_dir.listdir()
content = report_file.open().read()
install_command_regex = re.compile(
r"<ConfigureCommand>(.+)</ConfigureCommand>", re.MULTILINE | re.DOTALL
)
m = install_command_regex.search(content)
assert m
install_command = m.group(1)
assert "pkg-a@" in install_command
@pytest.mark.disable_clean_stage_check
@@ -795,15 +791,15 @@ def test_install_no_add_in_env(tmpdir, mock_fetch, install_mockery, mutable_mock
# ^libdwarf
# ^mpich
# libelf@0.8.10
# a~bvv
# ^b
# a
# ^b
# pkg-a~bvv
# ^pkg-b
# pkg-a
# ^pkg-b
e = ev.create("test", with_view=False)
e.add("mpileaks")
e.add("libelf@0.8.10") # so env has both root and dep libelf specs
e.add("a")
e.add("a ~bvv")
e.add("pkg-a")
e.add("pkg-a ~bvv")
e.concretize()
e.write()
env_specs = e.all_specs()
@@ -814,9 +810,9 @@ def test_install_no_add_in_env(tmpdir, mock_fetch, install_mockery, mutable_mock
# First find and remember some target concrete specs in the environment
for e_spec in env_specs:
if e_spec.satisfies(Spec("a ~bvv")):
if e_spec.satisfies(Spec("pkg-a ~bvv")):
a_spec = e_spec
elif e_spec.name == "b":
elif e_spec.name == "pkg-b":
b_spec = e_spec
elif e_spec.satisfies(Spec("mpi")):
mpi_spec = e_spec
@@ -839,8 +835,8 @@ def test_install_no_add_in_env(tmpdir, mock_fetch, install_mockery, mutable_mock
assert "You can add specs to the environment with 'spack add " in inst_out
# Without --add, ensure that two packages "a" get installed
inst_out = install("a", output=str)
assert len([x for x in e.all_specs() if x.installed and x.name == "a"]) == 2
inst_out = install("pkg-a", output=str)
assert len([x for x in e.all_specs() if x.installed and x.name == "pkg-a"]) == 2
# Install an unambiguous dependency spec (that already exists as a dep
# in the environment) and make sure it gets installed (w/ deps),
@@ -873,7 +869,7 @@ def test_install_no_add_in_env(tmpdir, mock_fetch, install_mockery, mutable_mock
# root of the environment as well as installed.
assert b_spec not in e.roots()
install("--add", "b")
install("--add", "pkg-b")
assert b_spec in e.roots()
assert b_spec not in e.uninstalled_specs()
@@ -908,7 +904,7 @@ def test_cdash_auth_token(tmpdir, mock_fetch, install_mockery, monkeypatch, capf
# capfd interferes with Spack's capturing
with tmpdir.as_cwd(), capfd.disabled():
monkeypatch.setenv("SPACK_CDASH_AUTH_TOKEN", "asdf")
out = install("-v", "--log-file=cdash_reports", "--log-format=cdash", "a")
out = install("-v", "--log-file=cdash_reports", "--log-format=cdash", "pkg-a")
assert "Using CDash auth token from environment" in out
@@ -916,26 +912,25 @@ def test_cdash_auth_token(tmpdir, mock_fetch, install_mockery, monkeypatch, capf
@pytest.mark.disable_clean_stage_check
def test_cdash_configure_warning(tmpdir, mock_fetch, install_mockery, capfd):
# capfd interferes with Spack's capturing of e.g., Build.xml output
with capfd.disabled():
with tmpdir.as_cwd():
# Test would fail if install raised an error.
with capfd.disabled(), tmpdir.as_cwd():
# Test would fail if install raised an error.
# Ensure that even on non-x86_64 architectures, there are no
# dependencies installed
spec = spack.spec.Spec("configure-warning").concretized()
spec.clear_dependencies()
specfile = "./spec.json"
with open(specfile, "w") as f:
f.write(spec.to_json())
# Ensure that even on non-x86_64 architectures, there are no
# dependencies installed
spec = Spec("configure-warning").concretized()
spec.clear_dependencies()
specfile = "./spec.json"
with open(specfile, "w") as f:
f.write(spec.to_json())
install("--log-file=cdash_reports", "--log-format=cdash", specfile)
# Verify Configure.xml exists with expected contents.
report_dir = tmpdir.join("cdash_reports")
assert report_dir in tmpdir.listdir()
report_file = report_dir.join("Configure.xml")
assert report_file in report_dir.listdir()
content = report_file.open().read()
assert "foo: No such file or directory" in content
install("--log-file=cdash_reports", "--log-format=cdash", specfile)
# Verify Configure.xml exists with expected contents.
report_dir = tmpdir.join("cdash_reports")
assert report_dir in tmpdir.listdir()
report_file = report_dir.join("Configure.xml")
assert report_file in report_dir.listdir()
content = report_file.open().read()
assert "foo: No such file or directory" in content
@pytest.mark.not_on_windows("ArchSpec gives test platform debian rather than windows")
@@ -952,7 +947,7 @@ def test_compiler_bootstrap(
assert CompilerSpec("gcc@=12.0") not in compilers.all_compiler_specs()
# Test succeeds if it does not raise an error
install("a%gcc@=12.0")
install("pkg-a%gcc@=12.0")
@pytest.mark.not_on_windows("Binary mirrors not supported on windows")
@@ -992,8 +987,8 @@ def test_compiler_bootstrap_from_binary_mirror(
# Now make sure that when the compiler is installed from binary mirror,
# it also gets configured as a compiler. Test succeeds if it does not
# raise an error
install("--no-check-signature", "--cache-only", "--only", "dependencies", "b%gcc@=10.2.0")
install("--no-cache", "--only", "package", "b%gcc@10.2.0")
install("--no-check-signature", "--cache-only", "--only", "dependencies", "pkg-b%gcc@=10.2.0")
install("--no-cache", "--only", "package", "pkg-b%gcc@10.2.0")
@pytest.mark.not_on_windows("ArchSpec gives test platform debian rather than windows")
@@ -1013,7 +1008,7 @@ def test_compiler_bootstrap_already_installed(
# Test succeeds if it does not raise an error
install("gcc@=12.0")
install("a%gcc@=12.0")
install("pkg-a%gcc@=12.0")
def test_install_fails_no_args(tmpdir):
@@ -1195,7 +1190,7 @@ def test_report_filename_for_cdash(install_mockery_mutable_config, mock_fetch):
parser = argparse.ArgumentParser()
spack.cmd.install.setup_parser(parser)
args = parser.parse_args(
["--cdash-upload-url", "https://blahblah/submit.php?project=debugging", "a"]
["--cdash-upload-url", "https://blahblah/submit.php?project=debugging", "pkg-a"]
)
specs = spack.cmd.install.concrete_specs_from_cli(args, {})
filename = spack.cmd.install.report_filename(args, specs)

View File

@@ -121,7 +121,7 @@ def test_maintainers_list_packages(mock_packages, capfd):
def test_maintainers_list_fails(mock_packages, capfd):
out = maintainers("a", fail_on_error=False)
out = maintainers("pkg-a", fail_on_error=False)
assert not out
assert maintainers.returncode == 1

View File

@@ -11,6 +11,7 @@
import spack.config
import spack.main
import spack.modules
import spack.spec
import spack.store
module = spack.main.SpackCommand("module")
@@ -178,8 +179,8 @@ def test_setdefault_command(mutable_database, mutable_config):
}
}
spack.config.set("modules", data)
# Install two different versions of a package
other_spec, preferred = "a@1.0", "a@2.0"
# Install two different versions of pkg-a
other_spec, preferred = "pkg-a@1.0", "pkg-a@2.0"
spack.spec.Spec(other_spec).concretized().package.do_install(fake=True)
spack.spec.Spec(preferred).concretized().package.do_install(fake=True)

View File

@@ -28,8 +28,8 @@ def install(self, spec, prefix):
pass
"""
abc = set(("pkg-a", "pkg-b", "pkg-c"))
abd = set(("pkg-a", "pkg-b", "pkg-d"))
abc = {"mockpkg-a", "mockpkg-b", "mockpkg-c"}
abd = {"mockpkg-a", "mockpkg-b", "mockpkg-d"}
# Force all tests to use a git repository *in* the mock packages repo.
@@ -53,27 +53,33 @@ def mock_pkg_git_repo(git, tmpdir_factory):
git("config", "user.name", "Spack Testing")
git("-c", "commit.gpgsign=false", "commit", "-m", "initial mock repo commit")
# add commit with pkg-a, pkg-b, pkg-c packages
mkdirp("pkg-a", "pkg-b", "pkg-c")
with open("pkg-a/package.py", "w") as f:
# add commit with mockpkg-a, mockpkg-b, mockpkg-c packages
mkdirp("mockpkg-a", "mockpkg-b", "mockpkg-c")
with open("mockpkg-a/package.py", "w") as f:
f.write(pkg_template.format(name="PkgA"))
with open("pkg-b/package.py", "w") as f:
with open("mockpkg-b/package.py", "w") as f:
f.write(pkg_template.format(name="PkgB"))
with open("pkg-c/package.py", "w") as f:
with open("mockpkg-c/package.py", "w") as f:
f.write(pkg_template.format(name="PkgC"))
git("add", "pkg-a", "pkg-b", "pkg-c")
git("-c", "commit.gpgsign=false", "commit", "-m", "add pkg-a, pkg-b, pkg-c")
git("add", "mockpkg-a", "mockpkg-b", "mockpkg-c")
git("-c", "commit.gpgsign=false", "commit", "-m", "add mockpkg-a, mockpkg-b, mockpkg-c")
# remove pkg-c, add pkg-d
with open("pkg-b/package.py", "a") as f:
f.write("\n# change pkg-b")
git("add", "pkg-b")
mkdirp("pkg-d")
with open("pkg-d/package.py", "w") as f:
# remove mockpkg-c, add mockpkg-d
with open("mockpkg-b/package.py", "a") as f:
f.write("\n# change mockpkg-b")
git("add", "mockpkg-b")
mkdirp("mockpkg-d")
with open("mockpkg-d/package.py", "w") as f:
f.write(pkg_template.format(name="PkgD"))
git("add", "pkg-d")
git("rm", "-rf", "pkg-c")
git("-c", "commit.gpgsign=false", "commit", "-m", "change pkg-b, remove pkg-c, add pkg-d")
git("add", "mockpkg-d")
git("rm", "-rf", "mockpkg-c")
git(
"-c",
"commit.gpgsign=false",
"commit",
"-m",
"change mockpkg-b, remove mockpkg-c, add mockpkg-d",
)
with spack.repo.use_repositories(str(repo_path)):
yield mock_repo_packages
@@ -86,12 +92,11 @@ def mock_pkg_names():
# Be sure to include virtual packages since packages with stand-alone
# tests may inherit additional tests from the virtuals they provide,
# such as packages that implement `mpi`.
names = set(
return {
name
for name in repo.all_package_names(include_virtuals=True)
if not name.startswith("pkg-")
)
return names
if not name.startswith("mockpkg-")
}
def split(output):
@@ -113,17 +118,17 @@ def test_mock_packages_path(mock_packages):
def test_pkg_add(git, mock_pkg_git_repo):
with working_dir(mock_pkg_git_repo):
mkdirp("pkg-e")
with open("pkg-e/package.py", "w") as f:
mkdirp("mockpkg-e")
with open("mockpkg-e/package.py", "w") as f:
f.write(pkg_template.format(name="PkgE"))
pkg("add", "pkg-e")
pkg("add", "mockpkg-e")
with working_dir(mock_pkg_git_repo):
try:
assert "A pkg-e/package.py" in git("status", "--short", output=str)
assert "A mockpkg-e/package.py" in git("status", "--short", output=str)
finally:
shutil.rmtree("pkg-e")
shutil.rmtree("mockpkg-e")
# Removing a package mid-run disrupts Spack's caching
if spack.repo.PATH.repos[0]._fast_package_checker:
spack.repo.PATH.repos[0]._fast_package_checker.invalidate()
@@ -138,10 +143,10 @@ def test_pkg_list(mock_pkg_git_repo, mock_pkg_names):
assert sorted(mock_pkg_names) == sorted(out)
out = split(pkg("list", "HEAD^"))
assert sorted(mock_pkg_names.union(["pkg-a", "pkg-b", "pkg-c"])) == sorted(out)
assert sorted(mock_pkg_names.union(["mockpkg-a", "mockpkg-b", "mockpkg-c"])) == sorted(out)
out = split(pkg("list", "HEAD"))
assert sorted(mock_pkg_names.union(["pkg-a", "pkg-b", "pkg-d"])) == sorted(out)
assert sorted(mock_pkg_names.union(["mockpkg-a", "mockpkg-b", "mockpkg-d"])) == sorted(out)
# test with three dots to make sure pkg calls `git merge-base`
out = split(pkg("list", "HEAD^^..."))
@@ -151,25 +156,25 @@ def test_pkg_list(mock_pkg_git_repo, mock_pkg_names):
@pytest.mark.not_on_windows("stdout format conflict")
def test_pkg_diff(mock_pkg_git_repo, mock_pkg_names):
out = split(pkg("diff", "HEAD^^", "HEAD^"))
assert out == ["HEAD^:", "pkg-a", "pkg-b", "pkg-c"]
assert out == ["HEAD^:", "mockpkg-a", "mockpkg-b", "mockpkg-c"]
out = split(pkg("diff", "HEAD^^", "HEAD"))
assert out == ["HEAD:", "pkg-a", "pkg-b", "pkg-d"]
assert out == ["HEAD:", "mockpkg-a", "mockpkg-b", "mockpkg-d"]
out = split(pkg("diff", "HEAD^", "HEAD"))
assert out == ["HEAD^:", "pkg-c", "HEAD:", "pkg-d"]
assert out == ["HEAD^:", "mockpkg-c", "HEAD:", "mockpkg-d"]
@pytest.mark.not_on_windows("stdout format conflict")
def test_pkg_added(mock_pkg_git_repo):
out = split(pkg("added", "HEAD^^", "HEAD^"))
assert ["pkg-a", "pkg-b", "pkg-c"] == out
assert ["mockpkg-a", "mockpkg-b", "mockpkg-c"] == out
out = split(pkg("added", "HEAD^^", "HEAD"))
assert ["pkg-a", "pkg-b", "pkg-d"] == out
assert ["mockpkg-a", "mockpkg-b", "mockpkg-d"] == out
out = split(pkg("added", "HEAD^", "HEAD"))
assert ["pkg-d"] == out
assert ["mockpkg-d"] == out
out = split(pkg("added", "HEAD", "HEAD"))
assert out == []
@@ -184,7 +189,7 @@ def test_pkg_removed(mock_pkg_git_repo):
assert out == []
out = split(pkg("removed", "HEAD^", "HEAD"))
assert out == ["pkg-c"]
assert out == ["mockpkg-c"]
@pytest.mark.not_on_windows("stdout format conflict")
@@ -196,34 +201,34 @@ def test_pkg_changed(mock_pkg_git_repo):
assert out == []
out = split(pkg("changed", "--type", "a", "HEAD^^", "HEAD^"))
assert out == ["pkg-a", "pkg-b", "pkg-c"]
assert out == ["mockpkg-a", "mockpkg-b", "mockpkg-c"]
out = split(pkg("changed", "--type", "r", "HEAD^^", "HEAD^"))
assert out == []
out = split(pkg("changed", "--type", "ar", "HEAD^^", "HEAD^"))
assert out == ["pkg-a", "pkg-b", "pkg-c"]
assert out == ["mockpkg-a", "mockpkg-b", "mockpkg-c"]
out = split(pkg("changed", "--type", "arc", "HEAD^^", "HEAD^"))
assert out == ["pkg-a", "pkg-b", "pkg-c"]
assert out == ["mockpkg-a", "mockpkg-b", "mockpkg-c"]
out = split(pkg("changed", "HEAD^", "HEAD"))
assert out == ["pkg-b"]
assert out == ["mockpkg-b"]
out = split(pkg("changed", "--type", "c", "HEAD^", "HEAD"))
assert out == ["pkg-b"]
assert out == ["mockpkg-b"]
out = split(pkg("changed", "--type", "a", "HEAD^", "HEAD"))
assert out == ["pkg-d"]
assert out == ["mockpkg-d"]
out = split(pkg("changed", "--type", "r", "HEAD^", "HEAD"))
assert out == ["pkg-c"]
assert out == ["mockpkg-c"]
out = split(pkg("changed", "--type", "ar", "HEAD^", "HEAD"))
assert out == ["pkg-c", "pkg-d"]
assert out == ["mockpkg-c", "mockpkg-d"]
out = split(pkg("changed", "--type", "arc", "HEAD^", "HEAD"))
assert out == ["pkg-b", "pkg-c", "pkg-d"]
assert out == ["mockpkg-b", "mockpkg-c", "mockpkg-d"]
# invalid type argument
with pytest.raises(spack.main.SpackCommandError):
@@ -289,7 +294,7 @@ def test_pkg_canonical_source(mock_packages):
def test_pkg_hash(mock_packages):
output = pkg("hash", "a", "b").strip().split()
output = pkg("hash", "pkg-a", "pkg-b").strip().split()
assert len(output) == 2 and all(len(elt) == 32 for elt in output)
output = pkg("hash", "multimethod").strip().split()

View File

@@ -58,7 +58,7 @@ def test_spec_concretizer_args(mutable_config, mutable_database, do_not_check_ru
def test_spec_parse_dependency_variant_value():
"""Verify that we can provide multiple key=value variants to multiple separate
packages within a spec string."""
output = spec("multivalue-variant fee=barbaz ^ a foobar=baz")
output = spec("multivalue-variant fee=barbaz ^ pkg-a foobar=baz")
assert "fee=barbaz" in output
assert "foobar=baz" in output

View File

@@ -10,10 +10,14 @@
from llnl.util.filesystem import copy_tree
import spack.cmd.common.arguments
import spack.cmd.install
import spack.cmd.test
import spack.config
import spack.install_test
import spack.package_base
import spack.paths
import spack.spec
import spack.store
from spack.install_test import TestStatus
from spack.main import SpackCommand

View File

@@ -384,18 +384,9 @@ def test_clang_flags():
unsupported_flag_test("cxx17_flag", "clang@3.4")
supported_flag_test("cxx17_flag", "-std=c++1z", "clang@3.5")
supported_flag_test("cxx17_flag", "-std=c++17", "clang@5.0")
unsupported_flag_test("cxx20_flag", "clang@4.0")
supported_flag_test("cxx20_flag", "-std=c++2a", "clang@5.0")
supported_flag_test("cxx20_flag", "-std=c++20", "clang@11.0")
unsupported_flag_test("cxx23_flag", "clang@11.0")
supported_flag_test("cxx23_flag", "-std=c++2b", "clang@12.0")
supported_flag_test("cxx23_flag", "-std=c++23", "clang@17.0")
supported_flag_test("c99_flag", "-std=c99", "clang@3.3")
unsupported_flag_test("c11_flag", "clang@2.0")
supported_flag_test("c11_flag", "-std=c11", "clang@6.1.0")
unsupported_flag_test("c23_flag", "clang@8.0")
supported_flag_test("c23_flag", "-std=c2x", "clang@9.0")
supported_flag_test("c23_flag", "-std=c23", "clang@18.0")
supported_flag_test("cc_pic_flag", "-fPIC", "clang@3.3")
supported_flag_test("cxx_pic_flag", "-fPIC", "clang@3.3")
supported_flag_test("f77_pic_flag", "-fPIC", "clang@3.3")

View File

@@ -3,8 +3,12 @@
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
"""Test detection of compiler version"""
import os
import pytest
import llnl.util.filesystem as fs
import spack.compilers.aocc
import spack.compilers.arm
import spack.compilers.cce
@@ -19,6 +23,7 @@
import spack.compilers.xl
import spack.compilers.xl_r
import spack.util.module_cmd
from spack.operating_systems.cray_frontend import CrayFrontend
@pytest.mark.parametrize(
@@ -408,6 +413,48 @@ def test_xl_version_detection(version_str, expected_version):
assert version == expected_version
@pytest.mark.not_on_windows("Not supported on Windows (yet)")
@pytest.mark.parametrize(
"compiler,version",
[
("gcc", "8.1.0"),
("gcc", "1.0.0-foo"),
("pgi", "19.1"),
("pgi", "19.1a"),
("intel", "9.0.0"),
("intel", "0.0.0-foobar"),
# ('oneapi', '2021.1'),
# ('oneapi', '2021.1-foobar')
],
)
def test_cray_frontend_compiler_detection(compiler, version, tmpdir, monkeypatch, working_env):
"""Test that the Cray frontend properly finds compilers form modules"""
# setup the fake compiler directory
compiler_dir = tmpdir.join(compiler)
compiler_exe = compiler_dir.join("cc").ensure()
fs.set_executable(str(compiler_exe))
# mock modules
def _module(cmd, *args):
module_name = "%s/%s" % (compiler, version)
module_contents = "prepend-path PATH %s" % compiler_dir
if cmd == "avail":
return module_name if compiler in args[0] else ""
if cmd == "show":
return module_contents if module_name in args else ""
monkeypatch.setattr(spack.operating_systems.cray_frontend, "module", _module)
# remove PATH variable
os.environ.pop("PATH", None)
# get a CrayFrontend object
cray_fe_os = CrayFrontend()
paths = cray_fe_os.compiler_search_paths
assert paths == [str(compiler_dir)]
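The `_module` mock answers `module show` with a single `prepend-path PATH <dir>` line, which the (now removed) Cray frontend parsed to locate compiler directories. A toy parser for that output shape (a simplification; real `module show` output is richer):

```python
def paths_from_module_show(text):
    """Collect directories from 'prepend-path PATH <dir>' lines."""
    paths = []
    for line in text.splitlines():
        parts = line.split()
        if len(parts) == 3 and parts[:2] == ["prepend-path", "PATH"]:
            paths.append(parts[2])
    return paths

print(paths_from_module_show("prepend-path PATH /opt/gcc/8.1.0/bin"))
# ['/opt/gcc/8.1.0/bin']
```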
@pytest.mark.parametrize(
"version_str,expected_version",
[

View File

@@ -24,11 +24,13 @@
import spack.platforms
import spack.repo
import spack.solver.asp
import spack.store
import spack.util.file_cache
import spack.util.libc
import spack.variant as vt
from spack.concretize import find_spec
from spack.spec import CompilerSpec, Spec
from spack.version import Version, VersionList, ver
from spack.version import Version, ver
def check_spec(abstract, concrete):
@@ -404,7 +406,7 @@ def test_compiler_flags_from_compiler_and_dependent(self):
def test_compiler_flags_differ_identical_compilers(self, mutable_config, clang12_with_flags):
mutable_config.set("compilers", [clang12_with_flags])
# Correct arch to use test compiler that has flags
spec = Spec("a %clang@12.2.0 platform=test os=fe target=fe")
spec = Spec("pkg-a %clang@12.2.0 platform=test os=fe target=fe")
# Get the compiler that matches the spec (
compiler = spack.compilers.compiler_for_spec("clang@=12.2.0", spec.architecture)
@@ -473,7 +475,7 @@ def test_architecture_deep_inheritance(self, mock_targets, compiler_factory):
assert s.architecture.target == spec.architecture.target
def test_compiler_flags_from_user_are_grouped(self):
spec = Spec('a%gcc cflags="-O -foo-flag foo-val" platform=test')
spec = Spec('pkg-a%gcc cflags="-O -foo-flag foo-val" platform=test')
spec.concretize()
cflags = spec.compiler_flags["cflags"]
assert any(x == "-foo-flag foo-val" for x in cflags)
@@ -581,20 +583,20 @@ def test_concretize_propagate_multivalue_variant(self):
spec = Spec("multivalue-variant foo==baz,fee")
spec.concretize()
assert spec.satisfies("^a foo=baz,fee")
assert spec.satisfies("^b foo=baz,fee")
assert not spec.satisfies("^a foo=bar")
assert not spec.satisfies("^b foo=bar")
assert spec.satisfies("^pkg-a foo=baz,fee")
assert spec.satisfies("^pkg-b foo=baz,fee")
assert not spec.satisfies("^pkg-a foo=bar")
assert not spec.satisfies("^pkg-b foo=bar")
def test_no_matching_compiler_specs(self, mock_low_high_config):
# only relevant when not building compilers as needed
with spack.concretize.enable_compiler_existence_check():
s = Spec("a %gcc@=0.0.0")
s = Spec("pkg-a %gcc@=0.0.0")
with pytest.raises(spack.concretize.UnavailableCompilerVersionError):
s.concretize()
def test_no_compilers_for_arch(self):
s = Spec("a arch=linux-rhel0-x86_64")
s = Spec("pkg-a arch=linux-rhel0-x86_64")
with pytest.raises(spack.error.SpackError):
s.concretize()
@@ -803,7 +805,7 @@ def test_regression_issue_7941(self):
# The string representation of a spec containing
# an explicit multi-valued variant and a dependency
# might be parsed differently than the originating spec
s = Spec("a foobar=bar ^b")
s = Spec("pkg-a foobar=bar ^pkg-b")
t = Spec(str(s))
s.concretize()
@@ -1183,14 +1185,14 @@ def test_conditional_provides_or_depends_on(self):
[
# Check that True is treated correctly and attaches test deps
# to all nodes in the DAG
("a", True, ["a"], []),
("a foobar=bar", True, ["a", "b"], []),
("pkg-a", True, ["pkg-a"], []),
("pkg-a foobar=bar", True, ["pkg-a", "pkg-b"], []),
# Check that a list of names activates the dependency only for
# packages in that list
("a foobar=bar", ["a"], ["a"], ["b"]),
("a foobar=bar", ["b"], ["b"], ["a"]),
("pkg-a foobar=bar", ["pkg-a"], ["pkg-a"], ["pkg-b"]),
("pkg-a foobar=bar", ["pkg-b"], ["pkg-b"], ["pkg-a"]),
# Check that False disregard test dependencies
("a foobar=bar", False, [], ["a", "b"]),
("pkg-a foobar=bar", False, [], ["pkg-a", "pkg-b"]),
],
)
def test_activating_test_dependencies(self, spec_str, tests_arg, with_dep, without_dep):
@@ -1249,7 +1251,7 @@ def test_custom_compiler_version(self, mutable_config, compiler_factory, monkeyp
"compilers", [compiler_factory(spec="gcc@10foo", operating_system="redhat6")]
)
monkeypatch.setattr(spack.compiler.Compiler, "real_version", "10.2.1")
s = Spec("a %gcc@10foo os=redhat6").concretized()
s = Spec("pkg-a %gcc@10foo os=redhat6").concretized()
assert "%gcc@10foo" in s
def test_all_patches_applied(self):
@@ -1393,10 +1395,10 @@ def test_no_reuse_when_variant_condition_does_not_hold(self, mutable_database, m
@pytest.mark.only_clingo("Use case not supported by the original concretizer")
def test_reuse_with_flags(self, mutable_database, mutable_config):
spack.config.set("concretizer:reuse", True)
spec = Spec("a cflags=-g cxxflags=-g").concretized()
spec = Spec("pkg-a cflags=-g cxxflags=-g").concretized()
spack.store.STORE.db.add(spec, None)
testspec = Spec("a cflags=-g")
testspec = Spec("pkg-a cflags=-g")
testspec.concretize()
assert testspec == spec
@@ -1739,49 +1741,49 @@ def test_reuse_with_unknown_namespace_dont_raise(
self, temporary_store, mock_custom_repository
):
with spack.repo.use_repositories(mock_custom_repository, override=False):
s = Spec("c").concretized()
s = Spec("pkg-c").concretized()
assert s.namespace != "builtin.mock"
s.package.do_install(fake=True, explicit=True)
with spack.config.override("concretizer:reuse", True):
s = Spec("c").concretized()
s = Spec("pkg-c").concretized()
assert s.namespace == "builtin.mock"
@pytest.mark.regression("28259")
def test_reuse_with_unknown_package_dont_raise(self, tmpdir, temporary_store, monkeypatch):
builder = spack.repo.MockRepositoryBuilder(tmpdir.mkdir("mock.repo"), namespace="myrepo")
builder.add_package("c")
builder.add_package("pkg-c")
with spack.repo.use_repositories(builder.root, override=False):
s = Spec("c").concretized()
s = Spec("pkg-c").concretized()
assert s.namespace == "myrepo"
s.package.do_install(fake=True, explicit=True)
del sys.modules["spack.pkg.myrepo.c"]
del sys.modules["spack.pkg.myrepo.pkg-c"]
del sys.modules["spack.pkg.myrepo"]
builder.remove("c")
builder.remove("pkg-c")
with spack.repo.use_repositories(builder.root, override=False) as repos:
# TODO (INJECT CONFIGURATION): unclear why the cache needs to be invalidated explicitly
repos.repos[0]._pkg_checker.invalidate()
with spack.config.override("concretizer:reuse", True):
s = Spec("c").concretized()
s = Spec("pkg-c").concretized()
assert s.namespace == "builtin.mock"
@pytest.mark.parametrize(
"specs,expected",
"specs,expected,libc_offset",
[
(["libelf", "libelf@0.8.10"], 1),
(["libdwarf%gcc", "libelf%clang"], 2),
(["libdwarf%gcc", "libdwarf%clang"], 3),
(["libdwarf^libelf@0.8.12", "libdwarf^libelf@0.8.13"], 4),
(["hdf5", "zmpi"], 3),
(["hdf5", "mpich"], 2),
(["hdf5^zmpi", "mpich"], 4),
(["mpi", "zmpi"], 2),
(["mpi", "mpich"], 1),
(["libelf", "libelf@0.8.10"], 1, 1),
(["libdwarf%gcc", "libelf%clang"], 2, 1),
(["libdwarf%gcc", "libdwarf%clang"], 3, 2),
(["libdwarf^libelf@0.8.12", "libdwarf^libelf@0.8.13"], 4, 1),
(["hdf5", "zmpi"], 3, 1),
(["hdf5", "mpich"], 2, 1),
(["hdf5^zmpi", "mpich"], 4, 1),
(["mpi", "zmpi"], 2, 1),
(["mpi", "mpich"], 1, 1),
],
)
@pytest.mark.only_clingo("Original concretizer cannot concretize in rounds")
def test_best_effort_coconcretize(self, specs, expected):
def test_best_effort_coconcretize(self, specs, expected, libc_offset):
specs = [Spec(s) for s in specs]
solver = spack.solver.asp.Solver()
solver.reuse = False
@@ -1790,7 +1792,9 @@ def test_best_effort_coconcretize(self, specs, expected):
for s in result.specs:
concrete_specs.update(s.traverse())
libc_offset = 1 if spack.solver.asp.using_libc_compatibility() else 0
if not spack.solver.asp.using_libc_compatibility():
libc_offset = 0
assert len(concrete_specs) == expected + libc_offset
@pytest.mark.parametrize(
@@ -1892,20 +1896,20 @@ def test_misleading_error_message_on_version(self, mutable_database):
     @pytest.mark.only_clingo("Use case not supported by the original concretizer")
     def test_version_weight_and_provenance(self):
         """Test package preferences during coconcretization."""
-        reusable_specs = [Spec(spec_str).concretized() for spec_str in ("b@0.9", "b@1.0")]
-        root_spec = Spec("a foobar=bar")
+        reusable_specs = [Spec(spec_str).concretized() for spec_str in ("pkg-b@0.9", "pkg-b@1.0")]
+        root_spec = Spec("pkg-a foobar=bar")

         with spack.config.override("concretizer:reuse", True):
             solver = spack.solver.asp.Solver()
             setup = spack.solver.asp.SpackSolverSetup()
             result, _, _ = solver.driver.solve(setup, [root_spec], reuse=reusable_specs)
-            # The result here should have a single spec to build ('a')
-            # and it should be using b@1.0 with a version badness of 2
+            # The result here should have a single spec to build ('pkg-a')
+            # and it should be using pkg-b@1.0 with a version badness of 2
             # The provenance is:
-            # version_declared("b","1.0",0,"package_py").
-            # version_declared("b","0.9",1,"package_py").
-            # version_declared("b","1.0",2,"installed").
-            # version_declared("b","0.9",3,"installed").
+            # version_declared("pkg-b","1.0",0,"package_py").
+            # version_declared("pkg-b","0.9",1,"package_py").
+            # version_declared("pkg-b","1.0",2,"installed").
+            # version_declared("pkg-b","0.9",3,"installed").
             #
             # Depending on the target, it may also use gnuconfig
             result_spec = result.specs[0]
@@ -1919,11 +1923,11 @@ def test_version_weight_and_provenance(self):
             for criterion in criteria:
                 assert criterion in result.criteria, criterion
-            assert result_spec.satisfies("^b@1.0")
+            assert result_spec.satisfies("^pkg-b@1.0")

     @pytest.mark.only_clingo("Use case not supported by the original concretizer")
     def test_reuse_succeeds_with_config_compatible_os(self):
-        root_spec = Spec("b")
+        root_spec = Spec("pkg-b")
         s = root_spec.concretized()
         other_os = s.copy()
         mock_os = "ubuntu2204"
@@ -2187,7 +2191,7 @@ def test_external_python_extension_find_unified_python(self):
"specs",
[
["mpileaks^ callpath ^dyninst@8.1.1:8 ^mpich2@1.3:1"],
["multivalue-variant ^a@2:2"],
["multivalue-variant ^pkg-a@2:2"],
["v1-consumer ^conditional-provider@1:1 +disable-v1"],
],
)
@@ -2226,9 +2230,9 @@ def test_unsolved_specs_raises_error(self, monkeypatch, mock_packages, config):
     def test_clear_error_when_unknown_compiler_requested(self, mock_packages, config):
         """Tests that the solver can report a case where the compiler cannot be set"""
         with pytest.raises(
-            spack.error.UnsatisfiableSpecError, match="Cannot set the required compiler: a%foo"
+            spack.error.UnsatisfiableSpecError, match="Cannot set the required compiler: pkg-a%foo"
         ):
-            Spec("a %foo").concretized()
+            Spec("pkg-a %foo").concretized()

     @pytest.mark.regression("36339")
     def test_compiler_match_constraints_when_selected(self):
@@ -2264,7 +2268,7 @@ def test_compiler_match_constraints_when_selected(self):
             },
         ]
         spack.config.set("compilers", compiler_configuration)
-        s = Spec("a %gcc@:11").concretized()
+        s = Spec("pkg-a %gcc@:11").concretized()
         assert s.compiler.version == ver("=11.1.0"), s

     @pytest.mark.regression("36339")
@@ -2285,7 +2289,7 @@ def test_compiler_with_custom_non_numeric_version(self, mock_executable):
             }
         ]
         spack.config.set("compilers", compiler_configuration)
-        s = Spec("a %gcc@foo").concretized()
+        s = Spec("pkg-a %gcc@foo").concretized()
         assert s.compiler.version == ver("=foo")

     @pytest.mark.regression("36628")
@@ -2311,7 +2315,7 @@ def test_concretization_with_compilers_supporting_target_any(self):
         ]
         with spack.config.override("compilers", compiler_configuration):
-            s = spack.spec.Spec("a").concretized()
+            s = Spec("pkg-a").concretized()
         assert s.satisfies("%gcc@12.1.0")

     @pytest.mark.parametrize("spec_str", ["mpileaks", "mpileaks ^mpich"])
@@ -2346,7 +2350,7 @@ def test_dont_define_new_version_from_input_if_checksum_required(self, working_e
         with pytest.raises(spack.error.UnsatisfiableSpecError):
             # normally spack concretizes to @=3.0 if it's not defined in package.py, except
             # when checksums are required
-            Spec("a@=3.0").concretized()
+            Spec("pkg-a@=3.0").concretized()

     @pytest.mark.regression("39570")
     @pytest.mark.db
@@ -2446,7 +2450,7 @@ def _default_libc(self):
             spack.util.libc, "libc_from_current_python_process", lambda: Spec("glibc@=2.28")
         )
         mutable_config.set("config:install_missing_compilers", True)
-        s = Spec("a %gcc@=13.2.0").concretized()
+        s = Spec("pkg-a %gcc@=13.2.0").concretized()
         assert s.satisfies("%gcc@13.2.0")

     @pytest.mark.regression("43267")
@@ -2570,28 +2574,54 @@ def test_can_reuse_concrete_externals_for_dependents(self, mutable_config, tmp_p
         sombrero = result.specs[0]
         assert sombrero["externaltool"].dag_hash() == external_spec.dag_hash()

-    @pytest.mark.only_clingo("Original concretizer cannot reuse")
-    def test_cannot_reuse_host_incompatible_libc(self):
-        """Test whether reuse concretization correctly fails to reuse a spec with a host
-        incompatible libc."""
-        if not spack.solver.asp.using_libc_compatibility():
-            pytest.skip("This test requires libc nodes")
-
-        # We install b@1 ^glibc@2.30, and b@0 ^glibc@2.28. The former is not host compatible, the
-        # latter is.
-        fst = Spec("b@1").concretized()
-        fst._mark_concrete(False)
-        fst.dependencies("glibc")[0].versions = VersionList(["=2.30"])
-        fst._mark_concrete(True)
-        snd = Spec("b@0").concretized()
-
-        # The spec b@1 ^glibc@2.30 is "more optimal" than b@0 ^glibc@2.28, but due to glibc
-        # incompatibility, it should not be reused.
-        solver = spack.solver.asp.Solver()
-        setup = spack.solver.asp.SpackSolverSetup()
-        result, _, _ = solver.driver.solve(setup, [Spec("b")], reuse=[fst, snd])
-        assert len(result.specs) == 1
-        assert result.specs[0] == snd
+    @pytest.mark.regression("45321")
+    @pytest.mark.parametrize(
+        "corrupted_str",
+        [
+            "cmake@3.4.3 foo=bar",  # cmake has no variant "foo"
+            "mvdefaults@1.0 foo=a,d",  # variant "foo" has no value "d"
+            "cmake %gcc",  # spec has no version
+        ],
+    )
+    def test_corrupted_external_does_not_halt_concretization(self, corrupted_str, mutable_config):
+        """Tests that having a wrong variant in an external spec doesn't stop concretization"""
+        corrupted_spec = Spec(corrupted_str)
+        packages_yaml = {
+            f"{corrupted_spec.name}": {
+                "externals": [{"spec": corrupted_str, "prefix": "/dev/null"}]
+            }
+        }
+        mutable_config.set("packages", packages_yaml)
+        # Assert we don't raise due to the corrupted external entry above
+        s = Spec("pkg-a").concretized()
+        assert s.concrete
+
+    @pytest.mark.regression("44828")
+    @pytest.mark.not_on_windows("Tests use linux paths")
+    def test_correct_external_is_selected_from_packages_yaml(self, mutable_config):
+        """Tests that when filtering external specs, the correct external is selected to
+        reconstruct the prefix, and other external attributes.
+        """
+        packages_yaml = {
+            "cmake": {
+                "externals": [
+                    {"spec": "cmake@3.23.1 %gcc", "prefix": "/tmp/prefix1"},
+                    {"spec": "cmake@3.23.1 %clang", "prefix": "/tmp/prefix2"},
+                ]
+            }
+        }
+        concretizer_yaml = {
+            "reuse": {"roots": True, "from": [{"type": "external", "exclude": ["%gcc"]}]}
+        }
+        mutable_config.set("packages", packages_yaml)
+        mutable_config.set("concretizer", concretizer_yaml)
+
+        s = Spec("cmake").concretized()
+        # Check that we got the properties from the right external
+        assert s.external
+        assert s.satisfies("%clang")
+        assert s.prefix == "/tmp/prefix2"

     @pytest.fixture()
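Note on test_correct_external_is_selected_from_packages_yaml above: the exclude list under reuse:from is meant to drop matching external candidates before the solve. A hedged sketch of that selection outcome (illustrative only; Spack's real matching goes through Spec.satisfies, not substring tests):

    externals = [
        {"spec": "cmake@3.23.1 %gcc", "prefix": "/tmp/prefix1"},
        {"spec": "cmake@3.23.1 %clang", "prefix": "/tmp/prefix2"},
    ]

    def usable_externals(entries, exclude=("%gcc",)):
        # Keep entries whose spec string matches no exclude pattern.
        return [e for e in entries if not any(pat in e["spec"] for pat in exclude)]

    # Only the %clang external survives, so its prefix is the one reconstructed.
    assert [e["prefix"] for e in usable_externals(externals)] == ["/tmp/prefix2"]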
@@ -2794,7 +2824,9 @@ def test_drop_moving_targets(v_str, v_opts, checksummed):
 class TestConcreteSpecsByHash:
     """Tests the container of concrete specs"""

-    @pytest.mark.parametrize("input_specs", [["a"], ["a foobar=bar", "b"], ["a foobar=baz", "b"]])
+    @pytest.mark.parametrize(
+        "input_specs", [["pkg-a"], ["pkg-a foobar=bar", "pkg-b"], ["pkg-a foobar=baz", "pkg-b"]]
+    )
     def test_adding_specs(self, input_specs, default_mock_concretization):
         """Tests that concrete specs in the container are equivalent, but stored as different
         objects in memory.
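Note on TestConcreteSpecsByHash above: its docstring promises a container whose entries compare equal to the inserted specs while being distinct objects. A rough sketch of a structure with that property (an assumed shape inferred from the docstring, not Spack's actual class):

    import copy

    class SpecsByHash:
        """Dict keyed by a stable hash; stored values are private copies."""

        def __init__(self):
            self._data = {}

        def add(self, spec) -> bool:
            key = spec["hash"]
            if key in self._data:
                return False  # already present under this hash
            self._data[key] = copy.deepcopy(spec)  # keep our own object
            return True

        def __getitem__(self, key):
            return self._data[key]

    container = SpecsByHash()
    spec = {"hash": "abc123", "name": "pkg-a"}
    assert container.add(spec)
    assert container["abc123"] == spec and container["abc123"] is not spec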

View File

@@ -9,6 +9,7 @@
 import archspec.cpu

+import spack.config
 import spack.paths
 import spack.repo
 import spack.solver.asp
@@ -47,8 +48,8 @@ def enable_runtimes():
 def test_correct_gcc_runtime_is_injected_as_dependency(runtime_repo):
-    s = spack.spec.Spec("a%gcc@10.2.1 ^b%gcc@9.4.0").concretized()
-    a, b = s["a"], s["b"]
+    s = spack.spec.Spec("pkg-a%gcc@10.2.1 ^pkg-b%gcc@9.4.0").concretized()
+    a, b = s["pkg-a"], s["pkg-b"]

     # Both a and b should depend on the same gcc-runtime directly
     assert a.dependencies("gcc-runtime") == b.dependencies("gcc-runtime")
@@ -61,16 +62,16 @@ def test_correct_gcc_runtime_is_injected_as_dependency(runtime_repo):
 def test_external_nodes_do_not_have_runtimes(runtime_repo, mutable_config, tmp_path):
     """Tests that external nodes don't have runtime dependencies."""
-    packages_yaml = {"b": {"externals": [{"spec": "b@1.0", "prefix": f"{str(tmp_path)}"}]}}
+    packages_yaml = {"pkg-b": {"externals": [{"spec": "pkg-b@1.0", "prefix": f"{str(tmp_path)}"}]}}
     spack.config.set("packages", packages_yaml)

-    s = spack.spec.Spec("a%gcc@10.2.1").concretized()
+    s = spack.spec.Spec("pkg-a%gcc@10.2.1").concretized()

-    a, b = s["a"], s["b"]
+    a, b = s["pkg-a"], s["pkg-b"]

     # Since b is an external, it doesn't depend on gcc-runtime
     assert a.dependencies("gcc-runtime")
-    assert a.dependencies("b")
+    assert a.dependencies("pkg-b")
     assert not b.dependencies("gcc-runtime")
@@ -78,23 +79,36 @@ def test_external_nodes_do_not_have_runtimes(runtime_repo, mutable_config, tmp_p
"root_str,reused_str,expected,nruntime",
[
# The reused runtime is older than we need, thus we'll add a more recent one for a
("a%gcc@10.2.1", "b%gcc@9.4.0", {"a": "gcc-runtime@10.2.1", "b": "gcc-runtime@9.4.0"}, 2),
# The root is compiled with an older compiler, thus we'll reuse the runtime from b
("a%gcc@9.4.0", "b%gcc@10.2.1", {"a": "gcc-runtime@10.2.1", "b": "gcc-runtime@10.2.1"}, 1),
(
"pkg-a%gcc@10.2.1",
"pkg-b%gcc@9.4.0",
{"pkg-a": "gcc-runtime@10.2.1", "pkg-b": "gcc-runtime@9.4.0"},
2,
),
# The root is compiled with an older compiler, thus we'll NOT reuse the runtime from b
(
"pkg-a%gcc@9.4.0",
"pkg-b%gcc@10.2.1",
{"pkg-a": "gcc-runtime@9.4.0", "pkg-b": "gcc-runtime@9.4.0"},
1,
),
# Same as before, but tests that we can reuse from a more generic target
pytest.param(
"a%gcc@9.4.0",
"b%gcc@10.2.1 target=x86_64",
{"a": "gcc-runtime@10.2.1 target=x86_64", "b": "gcc-runtime@10.2.1 target=x86_64"},
"pkg-a%gcc@9.4.0",
"pkg-b%gcc@10.2.1 target=x86_64",
{"pkg-a": "gcc-runtime@9.4.0", "pkg-b": "gcc-runtime@9.4.0"},
1,
marks=pytest.mark.skipif(
str(archspec.cpu.host().family) != "x86_64", reason="test data is x86_64 specific"
),
),
pytest.param(
"a%gcc@10.2.1",
"b%gcc@9.4.0 target=x86_64",
{"a": "gcc-runtime@10.2.1 target=x86_64", "b": "gcc-runtime@9.4.0 target=x86_64"},
"pkg-a%gcc@10.2.1",
"pkg-b%gcc@9.4.0 target=x86_64",
{
"pkg-a": "gcc-runtime@10.2.1 target=x86_64",
"pkg-b": "gcc-runtime@9.4.0 target=x86_64",
},
2,
marks=pytest.mark.skipif(
str(archspec.cpu.host().family) != "x86_64", reason="test data is x86_64 specific"
@@ -102,17 +116,19 @@ def test_external_nodes_do_not_have_runtimes(runtime_repo, mutable_config, tmp_p
             ),
         ),
     ],
 )
+@pytest.mark.regression("44444")
 def test_reusing_specs_with_gcc_runtime(root_str, reused_str, expected, nruntime, runtime_repo):
     """Tests that we can reuse specs with a "gcc-runtime" leaf node. In particular, checks
     that the semantic for gcc-runtimes versions accounts for reused packages too.
+
+    Reusable runtime versions should be lower, or equal, to that of parent nodes.
     """
     root, reused_spec = _concretize_with_reuse(root_str=root_str, reused_str=reused_str)
-    assert f"{expected['b']}" in reused_spec
     runtime_a = root.dependencies("gcc-runtime")[0]
-    assert runtime_a.satisfies(expected["a"])
-    runtime_b = root["b"].dependencies("gcc-runtime")[0]
-    assert runtime_b.satisfies(expected["b"])
+    assert runtime_a.satisfies(expected["pkg-a"])
+    runtime_b = root["pkg-b"].dependencies("gcc-runtime")[0]
+    assert runtime_b.satisfies(expected["pkg-b"])

     runtimes = [x for x in root.traverse() if x.name == "gcc-runtime"]
     assert len(runtimes) == nruntime
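Note on the docstring added above: it pins down the invariant behind the new expectations, i.e. a node may only link a gcc-runtime at or below its own compiler version. A small sketch of that rule (dotted-integer version parsing is a simplification, and the rule itself is read off the docstring, not Spack's code):

    def _v(version: str) -> tuple:
        return tuple(int(x) for x in version.split("."))

    def runtime_acceptable(node_gcc: str, runtime: str) -> bool:
        # A reused runtime must not be newer than the consuming node's gcc.
        return _v(runtime) <= _v(node_gcc)

    assert runtime_acceptable("10.2.1", "9.4.0")      # older runtime: fine
    assert not runtime_acceptable("9.4.0", "10.2.1")  # newer runtime: rebuilt instead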
@@ -123,8 +139,7 @@ def test_reusing_specs_with_gcc_runtime(root_str, reused_str, expected, nruntime
     [
         # Ensure that, whether we have multiple runtimes in the DAG or not,
         # we always link only the latest version
-        ("a%gcc@10.2.1", "b%gcc@9.4.0", ["gcc-runtime@10.2.1"], ["gcc-runtime@9.4.0"]),
-        ("a%gcc@9.4.0", "b%gcc@10.2.1", ["gcc-runtime@10.2.1"], ["gcc-runtime@9.4.0"]),
+        ("pkg-a%gcc@10.2.1", "pkg-b%gcc@9.4.0", ["gcc-runtime@10.2.1"], ["gcc-runtime@9.4.0"])
     ],
 )
 def test_views_can_handle_duplicate_runtime_nodes(

View File

@@ -512,5 +512,5 @@ def test_default_preference_variant_different_type_does_not_error(self):
         packages.yaml doesn't fail with an error.
         """
         with spack.config.override("packages:all", {"variants": "+foo"}):
-            s = Spec("a").concretized()
+            s = Spec("pkg-a").concretized()
         assert s.satisfies("foo=bar")

View File

@@ -103,23 +103,6 @@ def test_repo(_create_test_repo, monkeypatch, mock_stage):
     yield mock_repo_path


-class MakeStage:
-    def __init__(self, stage):
-        self.stage = stage
-
-    def __call__(self, *args, **kwargs):
-        return self.stage
-
-
-@pytest.fixture
-def fake_installs(monkeypatch, tmpdir):
-    stage_path = str(tmpdir.ensure("fake-stage", dir=True))
-    universal_unused_stage = spack.stage.DIYStage(stage_path)
-    monkeypatch.setattr(
-        spack.build_systems.generic.Package, "_make_stage", MakeStage(universal_unused_stage)
-    )
-
-
 def test_one_package_multiple_reqs(concretize_scope, test_repo):
     conf_str = """\
 packages:
@@ -514,7 +497,7 @@ def test_oneof_ordering(concretize_scope, test_repo):
     assert s2.satisfies("@2.5")


-def test_reuse_oneof(concretize_scope, _create_test_repo, mutable_database, fake_installs):
+def test_reuse_oneof(concretize_scope, _create_test_repo, mutable_database, mock_fetch):
     conf_str = """\
 packages:
   y:
@@ -944,9 +927,9 @@ def test_default_requirements_semantic(packages_yaml, concretize_scope, mock_pac
Spec("zlib ~shared").concretized()
# A spec without the shared variant still concretize
s = Spec("a").concretized()
assert not s.satisfies("a +shared")
assert not s.satisfies("a ~shared")
s = Spec("pkg-a").concretized()
assert not s.satisfies("pkg-a +shared")
assert not s.satisfies("pkg-a ~shared")
@pytest.mark.parametrize(
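Note on the last hunk: the expectation is that a default +shared preference simply does not attach to a package lacking a shared variant, instead of erroring out. A toy sketch of that "apply only where the variant exists" rule (hypothetical variant tables; not Spack's implementation):

    package_variants = {"zlib": {"shared"}, "pkg-a": set()}  # hypothetical tables

    def apply_default(pkg: str, variant: str) -> dict:
        if variant in package_variants[pkg]:
            return {variant: True}
        return {}  # pkg-a gets neither +shared nor ~shared

    assert apply_default("zlib", "shared") == {"shared": True}
    assert apply_default("pkg-a", "shared") == {}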

Some files were not shown because too many files have changed in this diff.