Pull a free-standing `find_python_in_prefix` function out of `python`'s `command()`
property.
This was originally done for #44382 but not ultimately used there. I still think it is a good
refactor, so I am submitting it as a separate pull request.
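For context, a free-standing helper along these lines might look as follows; the name matches the description above, but the signature and search order are assumptions for illustration, not the actual implementation:
```python
import os
from typing import Optional


def find_python_in_prefix(prefix: str) -> Optional[str]:
    """Return the path of a python executable under ``prefix/bin``, if any.

    The candidate names and their order are assumptions for this sketch.
    """
    bin_dir = os.path.join(prefix, "bin")
    for candidate in ("python3", "python", "python2"):
        path = os.path.join(bin_dir, candidate)
        if os.path.isfile(path) and os.access(path, os.X_OK):
            return path
    return None


if __name__ == "__main__":
    print(find_python_in_prefix("/usr"))
```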
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
* e4s oneapi: upgrade to latest compilers oneapi@2025.1
* update specs and package preferences
* enable some more dav packages
* enable additional specs
* e4s oneapi: packages: elfutils does not have bzip2 variant
* e4s oneapi: packages: elfutils does not have xz variant
* e4s oneapi: comment out heffte+sycl
* comment out e4s oneapi failures
* comment out more failures
* comment out failing spec
* Revert "paraview: add patch for Intel Classic compilers (#49116)"
This reverts commit 7a95e2beb5.
We'll mark Intel Classic compilers as conflicting with ParaView
versions 5.13.0-5.13.2 instead since 5.13.3 is available and can be
built with those compilers.
* Add conflict for Intel Classic compilers and ParaView 5.13.0-5.13.2.
* paraview: add new v5.13.3 release
This commit reorders ASP setup, so that rules from
possible compilers are collected first.
This allows us to know the dependencies that may be
injected before counting the possible dependencies,
so we can account for them too.
Proceeding this way makes it easier to inject
complex runtimes, like hip.
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Deal with the "issue" that passing a str instance does not cause a
type check failure, because str is a subset of Sequence[str] and
Iterable[str]. Instead fix it by special casing the str instance.
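A minimal sketch of the special case (the function name and signature are illustrative, not the actual `EdgeMap.select` code):
```python
from typing import Iterable, Union


def select_virtuals(virtuals: Union[str, Iterable[str]]) -> frozenset:
    """Normalize the ``virtuals`` argument, special-casing a bare str.

    A bare string like "cxx" is itself an Iterable[str] (of characters),
    so without the guard it would silently be treated as {"c", "x"}.
    """
    if isinstance(virtuals, str):
        virtuals = (virtuals,)
    return frozenset(virtuals)


assert select_virtuals("cxx") == frozenset({"cxx"})  # not {"c", "x"}
assert select_virtuals(["c", "cxx"]) == frozenset({"c", "cxx"})
```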
* fix: move depends_on(c,cxx,fortran) with other dependencies, after variants
* treewide style: move depends_on(c,cxx,fortran) with other dependencies, after variants
* treewide style: move depends_on(c,cxx,fortran) with other dependencies, after variants
---------
Co-authored-by: Alec Scott <hi@alecbcs.com>
CMake 4.0.0 breaks compatibility with CMake projects
requiring CMake < 3.5. However, many projects that
specify a minimum requirement for versions older
than 3.5 are actually compatible with newer CMake
and do not use CMake 3.4 or older features. This change
allows those projects to use a newer CMake.
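One way such a compatibility shim can be expressed is via CMake 4's `CMAKE_POLICY_VERSION_MINIMUM` cache variable; whether and how this change uses it is not stated above, so the sketch below is an assumption for illustration only:
```python
def compatibility_args(cmake_version: tuple, declared_minimum: tuple) -> list:
    """Extra cmake arguments so projects declaring an old minimum still configure.

    CMAKE_POLICY_VERSION_MINIMUM is CMake 4's escape hatch for projects whose
    cmake_minimum_required() is below 3.5; when Spack sets it is an assumption here.
    """
    if cmake_version >= (4, 0) and declared_minimum < (3, 5):
        return ["-DCMAKE_POLICY_VERSION_MINIMUM=3.5"]
    return []


print(compatibility_args((4, 0, 0), (3, 1)))   # ['-DCMAKE_POLICY_VERSION_MINIMUM=3.5']
print(compatibility_args((3, 28, 0), (3, 1)))  # []
```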
Co-authored-by: John W. Parent <45471568+johnwparent@users.noreply.github.com>
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
The requirements being removed are redundant, or
even outdated (%cray-prgenv-* is not a compiler in v0.23).
When compilers turned into nodes, these constraints,
with the "one_of" policy, started being unsat under
certain conditions, e.g. we can no longer compile
with GCC and depend on LLVM as a library.
Remove the requirements to make the recipe solvable
again.
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
* update packages to work with python 3.12+
* partd updates
* py-nc-time-axis updates for 3.12
* Add new py-cftime as required for py-nc-time-axis
* fix dropped python 3.9
* switch from when blocks to flat
* remove redundant requires
* protect version range for python@:3.11
* add new c-blosc to support newly added python 3.12 py-blosc version
* add scikit-build 0.18.1 for python 3.12 required for this set of commits
* add complete optional variant for py-partd to match pyproject.toml. Deprecate super old versions
* only set system blosc for the required case
* style
* Remove incorrect python bound
* improve python version reqs and move to more canonical depends_on
* move to depends from req
* add new python range limit, update comment
* remove @coreyjadams as maintainer as per their request https://github.com/spack/spack/pull/48830#issuecomment-2705062587
* Fix python bounds
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
---------
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
* Check for LSF, Flux, and Slurm when determining MPI exec
* Make scheduler/MPI exec helper functions methods of CachedCMakeBuilder
* Remove axom workaround for running mpi on machines with flux
* Clearly split old and new hip settings requirements
* Apply generic rocm handling to every project
* make default logic for hip support more robust
* GPU_TARGET is only necessary under certain project-specific conditions; it should not be necessary in general
* Update logic to find amdclang++
Fixes #49717
If no compiler is listed in the 'packages' section of
the configuration, Spack will currently try, in order, to:
1. Look for a legacy compilers.yaml to convert
2. Look for compilers in PATH
If an entry in compilers.yaml is corrupted, that should
not result in an obscure error. Instead, the corrupted
entry should just be skipped.
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
This module references `spack.error.Something` in the same file, which happens to
work but is incorrect. Use `spack.error` to prevent that in the future.
Adds Equinox, a JAX library. I've added the latest version 0.11.2, and also 0.11.0
which is compatible with older JAX versions.
I've built both versions and loading an example from their page seems to work OK.
* Add py-wadler-lindig
* Add py-equinox
I noticed that `abseil-cpp` was showing in `spack find` with "no compiler", and the only
difference between it and other nodes was that it *only* depends on `cxx` -- others
depend on `c` as well.
It turns out that the `select()` method on `EdgeMap` only takes `Sequence[str]` and doesn't
check whether they're actually just one `str`. So asking for, e.g., `cxx` is like asking for
`c` or `x` or `x`, as the `str` is treated like a sequence. This causes Spack to miss `cxx`
and `fortran` language virtuals in `DeprecatedCompilerSpec`.
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
Currently, externals show up in `spack find` and `spack spec` install status as a green
`[e]`, which is hard to distinguish from the green `[+]` used for installed packages.
- [x] Make externals magenta instead, so they stand out.
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
Concretization setup was checking whether any input spec has a dependency
that's *not* in the set of possible dependencies for all roots in the solve.
There are two reasons to check this:
1. The user could be asking for a dependency that none of the roots has, or
2. The user could be asking for a dependency that doesn't exist.
For abstract roots, (2) implies (1), and the check makes sense. For concrete
roots, we don't care, because the spec has already been built. If a `package.py`
no longer depends on something it did before, it doesn't matter -- it's already
built. If the dependency no longer exists, we also do not care -- we already
built it and there's an installation for it somewhere.
When you concretize an environment with a lockfile, *many* of the input specs
are concrete, and we don't need to build them. If a package changes its
dependencies, or if a `package.py` is removed for a concrete input spec, that
shouldn't cause an already-built environment to fail concretization.
A user reported that this was happening with an error like:
```console
spack concretize
==> Error: Package chapel does not depend on py-protobuf@5.28.2/a4rf4glr2tntfwsz6myzwmlk5iu25t74
```
Or, with traceback:
```console
File "/apps/other/spack-devel/lib/spack/spack/solver/asp.py", line 3014, in setup
raise spack.spec.InvalidDependencyError(spec.name, missing_deps)
spack.spec.InvalidDependencyError: Package chapel does not depend on py-protobuf@5.28.2/a4rf4glr2tntfwsz6myzwmlk5iu25t74
```
Fix this by skipping the check for concrete input specs. We already ignore conflicts,
etc. for concrete/external specs, and we do not need metadata in the solve for
concrete dependencies because they're imposed by hash constraints.
- [x] Ignore the package existence check for concrete input specs.
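The shape of the fix, on runnable toy stand-ins rather than the real `asp.py` setup code:
```python
from dataclasses import dataclass, field
from typing import List, Set


@dataclass
class FakeSpec:
    """Stand-in for a Spec: a name, a dependency list, and a concrete flag."""
    name: str
    deps: List[str] = field(default_factory=list)
    concrete: bool = False


class InvalidDependencyError(Exception):
    pass


def check_input_specs(specs: List[FakeSpec], possible: Set[str]) -> None:
    """Reject abstract inputs with unknown dependencies; skip concrete ones.

    Concrete specs are already built, so their dependencies are fixed by
    hash and need no package metadata in the solve.
    """
    for spec in specs:
        if spec.concrete:
            continue
        missing = [d for d in spec.deps if d not in possible]
        if missing:
            raise InvalidDependencyError(f"Package {spec.name} does not depend on {missing}")


# A concrete chapel with a dependency unknown to the current repo now passes:
check_input_specs([FakeSpec("chapel", ["py-protobuf"], concrete=True)], possible={"chapel"})
```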
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
* Skip packages removed for automatic checksum verification
* Unify finding modified or added packages with spack.repo logic
* Remove unused imports
* Fix unit-tests using shared modified function
* Update last remaining unit test to new format
* glab: add v1.54.0 and v1.55.0
* glab: uniform go deps
Co-authored-by: Alec Scott <hi@alecbcs.com>
---------
Co-authored-by: Alec Scott <hi@alecbcs.com>
* py-jaxlib: add spack-built ROCm support
* fix style
* py-jaxlib 0.4.38 rocm support
* py-jaxlib 0.4.38 rocm support
* add comgr dependency
* changes for ROCm external and enable till 0.4.38
* enable version of py-jax
* add jax+rocm to ci
* add conflict for cuda and remove py-jaxlib from aarch64 pipeline
* Update var/spack/repos/builtin/packages/py-jaxlib/package.py
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
* add conflict for aarch64
---------
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
## Summary
Compilers stop being a *node attribute*, and become a *build-only* dependency.
Packages may declare a dependency on the `c`, `cxx`, or `fortran` languages, which
are now treated as virtuals, and compilers are *providers* for one or more of
those languages. Compilers can also inject runtime dependencies on the node being
compiled. An example graph for something as simple as `zlib-ng` is the following:
<p align="center">
<img src="https://github.com/user-attachments/assets/ee6471cb-09fd-4127-9f16-b9fe6d1338ac" alt="zlib-ng DAG" width="80%" height="auto">
</p>
Here `gcc` is used for both the `c` and `cxx` languages. Edges are annotated with
the virtuals they satisfy (`c`, `cxx`, `libc`). `gcc` injects `gcc-runtime` on the nodes
being compiled. `glibc` is also injected for packages that require `c`. The
`compiler-wrapper` is explicitly represented as a node in the DAG, and is included in
the hash.
This change in the model has implications on the semantics of the `%` sigil, as
discussed in #44379, and requires a version bump for our `Specfile`, `Database`,
and `Lockfile` formats.
## Breaking changes
Breaking changes below may impact users of this branch.
### 1. Custom, non-numeric versions of compilers are not supported
Currently, users can assign to compilers any custom version they want, and Spack
will try to recover the "real version" whenever the custom version fails some operation.
To deduce the "real version" Spack must run the compiler, which can add needless
overhead to common operations.
Since any information that a version like `gcc@foo` might convey can also be expressed
as a suffix on the correct numeric version, e.g. `gcc@10.5.0-foo`, Spack
will **no longer try** to deduce real versions for compilers.
Said otherwise, users should have no expectation that `gcc@foo` behaves as
`gcc@X.Y.Z` internally.
### 2. The `%` sigil in the spec syntax means "direct build dependency"
The `%` sigil in the spec syntax means *"direct build dependency"*, and is not a node
attribute anymore. This means that:
```python
node.satisfies("%gcc")
```
is true only if `gcc` is a direct build dependency of the node. *Nodes without a compiler
dependency are allowed.*
### `parent["child"]`, and `node in spec`, will now only inspect the link/run sub-DAG
and direct build dependencies
The subscript notation for `Spec`:
```python
parent["child"]
```
will look for a `child` node only in the link/run transitive graph of `parent`, and in its
direct build dependencies. This means that to reach a transitive build dependency,
we must first pass through the node it is associated with.
Assuming `parent` does not depend on `cmake`, but depends on a `CMakePackage`,
e.g. `hdf5`, then we have the following situation:
```python
# This one raises an Exception, since "parent" does not depend on cmake
parent["cmake"]
# This one is ok
cmake = parent["hdf5"]["cmake"]
```
### 3. Externals differing by just the compiler attribute
Externals are nodes where dependencies are trimmed, and that _is not planned to
change_ in this branch. Currently, on `develop` it is ok to write:
```yaml
packages:
  hdf5:
    externals:
    - spec: hdf5@1.12 %gcc
      prefix: /prefix/gcc
    - spec: hdf5@1.12 %clang
      prefix: /prefix/clang
```
and Spack will account for the compiler node attribute when computing the optimal
spec. In this branch, using externals with a compiler specified is allowed only if a
compiler in the DAG matches the constraints specified on the external. _The external
will still be represented as a single node without dependencies_.
### 4. Spec matrices enforcing a compiler
Currently we can have matrices of the form:
```yaml
matrix:
- [x, y, z]
- [%gcc, %clang]
```
to get the cross-product of specs and compilers. We can disregard the nature of the
packages in the first row, since the compiler is a node attribute required on each node.
In this branch, instead, we require a spec to depend on `c`, `cxx`, or `fortran` for the
`%` to have any meaning. If any of the specs in the first row doesn't depend on these
languages, there will be a concretization error.
## Deprecations
* The entire `compilers` section in the configuration (i.e., `compilers.yaml`) has been
deprecated, and current entries will be removed in v1.2.0. For the time being, if Spack
finds any `compilers` configuration, it will try to convert it automatically to a set of
external packages.
* The `packages:compiler` soft-preference has been deprecated. It will be removed
in v1.1.0.
## Other notable changes
* The tokens `{compiler}`, `{compiler.version}`, and `{compiler.name}` in `Spec.format`
expand to `"none"` if a Spec does not depend on C, C++, or Fortran.
* The default install tree layout is now
`"{architecture.platform}-{architecture.target}/{name}-{version}-{hash}"`
## Known limitations
The major known limitation of this branch, which we intend to fix before v1.0, is that
compilers cannot be bootstrapped directly.
In this branch we can build a new compiler using an existing external compiler, for instance:
```
$ spack install gcc@14 %gcc@10.5.0
```
where `gcc@10.5.0` is external, and `gcc@14` is to be built.
What we can't do at the moment is use a yet-to-be-built compiler and expect it will be
bootstrapped, e.g.:
```
$ spack install hdf5 %gcc@14
```
We plan to tackle this issue in a following PR.
---------
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
Signed-off-by: Harmen Stoppels <me@harmenstoppels.nl>
Co-authored-by: Harmen Stoppels <me@harmenstoppels.nl>
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
The `umea.se` mirror seems to have gone down (or at least is forbidden for now).
Revert the checksum changes in #47825 and point at the official GNOME mirror
instead of the prior two places we were getting `gdk-pixbuf` from.
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
Since we moved from creating clingo symbols directly to constructing a pure string
representation of the program, we don't need to make `AspFunctions` into symbols before
turning them into strings. We can just write strings like clingo would.
This cuts about 25% off the setup time by avoiding an unnecessary round trip.
- [x] create strings directly from `AspFunctions`
- [x] remove unused `symbol()` method on `AspFunction`
- [x] setup no longer tries to call `symbol()`
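A minimal sketch of writing facts directly in clingo's text form (the function name and formatting details are illustrative, not the actual `AspFunction` code):
```python
def asp_fact(name: str, *args) -> str:
    """Render a fact the way clingo's text format expects it.

    Strings are quoted, integers are left bare; this mirrors the idea of
    skipping the symbol round trip, with names invented for illustration.
    """
    def render(a):
        return str(a) if isinstance(a, int) else f'"{a}"'

    return f"{name}({','.join(render(a) for a in args)})."


print(asp_fact("attr", "version", "zlib-ng", "2.1.6"))
# attr("version","zlib-ng","2.1.6").
print(asp_fact("opt_criterion", 73, "deprecated versions used"))
# opt_criterion(73,"deprecated versions used").
```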
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
Signed-off-by: Greg Becker <becker33@llnl.gov>
---------
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
Co-authored-by: Greg Becker <becker33@llnl.gov>
* trilinos: add equals sign to kokkos dependencies.
* Fix some license headers to pass style check.
* Generalize a bit.
* Generalize a bit more.
* datatransferkit: constrain to a maximum of trilinos@16.0.
* limit some patches by chapel version
* fix short output version if building main
* update patches, remove unneeded 'self' refs
* fix spack style
* update patches with changes from PR
* change py-protobuf to just protobuf dep
* add PR numbers for patches
* fix spack style
* update 2.4 sha256
A user had `grep` aliased to `grep -n`, which was causing `csh` setup to
fail due to number prefixes in `SPACK_ROOT`.
- [x] Prefix invocations of `grep` and `sed` (which are not builtin) with `\`
to avoid any aliases.
- [x] Avoid using `dirname` altogether -- use csh's `:h` modifier (which does
the same thing) instead.
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
* Fix problem at least with datatransferkit
* Include patch 11676 from trilinos
* Add patches for trilinos 13.4.1
* style check failed
* Update links for patches
* additional style check failed
* Add recursive argument to spack develop
This effort allows for a recursive develop call
which will traverse from the develop spec given back to the root(s)
and mark all packages along the path as develop.
If people are doing development across the graph, then paying
fetch and full rebuild costs every time `spack develop` is called
is unnecessary and expensive.
Also remove the constraint for concrete specs and simply take the
max(version) if a version is not given. This should default to the
highest infinity version which is also the logical best guess for
doing development.
* sst-core: fix for > 14.0.0 requiring ncurses
* sst-core: backport fix for curses detection
* sst-core: ensure HDF5 is ignored if not specified
* sst-core: HDF5 integration is via C++
* sst-core: switch to with_or_without for configure
* sst-core: switch to enable_or_disable for configure
* sst-core: control memory pools and debug output with variants
* exawind: add versions and commits to tags.
* Add new version of TIOGA.
* openfast: add commits to tags.
* amr-wind: add dependencies.
* amr-wind: add more settings.
---------
Co-authored-by: jrood-nrel <jrood-nrel@users.noreply.github.com>
* qt-base: pass SBOM PATH from cmake_args
* qt-base: self.define from list
Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>
---------
Co-authored-by: Seth R. Johnson <johnsonsr@ornl.gov>
Add a CI check to automatically verify the checksums of newly added
package versions:
- [x] a new command, `spack ci verify-versions`
- [x] a GitHub actions check to run the command
- [x] tests for the new command
This also eliminates the suggestion for maintainers to manually verify added
checksums in the case of accidental version <--> checksum mismatches.
----
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
* add new version
* add v8.8.20250205075315 to py-schema-salad
* Modify range to open ended
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
* Add open ended dependency version range
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
* fix flake8 error
---------
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
* kubectl: add all versions currently supported upstream
* kubectl: build same way as kubernetes
* kubectl: revert back to GoPackage
* kubectl: fix version command
* kubectl: add v1.30.11, v1.31.7, v1.32.3
* kubectl: remove new deprecated versions
* kubectl: refactor build deps
The package was added in 2017, and never updated
substantially. It requires users to log in to
a platform to download code.
Thus, instead of updating to new versions and adding
support for oneAPI, remove the package.
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Fixes #49403.
When one scope included another, we were appending to a list stored on the scope to
track what was included, and we would clear the list when the scope was removed.
This assumes that the scopes are always strictly pushed then popped, but the order can
be violated when serializing config scopes across processes (and then activating
environments in subprocesses), or if, e.g., instead of removing each scope we simply
cleared the list of config scopes. Removal can be skipped, which can cause the list of
includes on a cached scope (like the one we use for environments) to grow every time it
is pushed, and this triggers an assertion error.
There isn't actually a need to construct and destroy the include list. We can just
compute it once and cache it -- it's the same every time.
- [x] Cache included scope list on scope objects
- [x] Do not dynamically append/clear the included scope list
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
Right now the Spack %msvc compiler is inherently a hybrid compiler
that uses Intel's oneAPI Fortran compiler.
This was addressed in Spack's MSVC compiler class, but detection has
since stopped using the compiler class, so this PR moves the logic
into the `msvc` compiler package (does not delete the original code
because that is handled in #45189).
This includes a change to the general detection logic to deprioritize
paths that include a symlink anywhere in the path, in order to prefer
"2025.0/bin" over "latest/bin" for the oneAPI compiler.
* style.py: add spack style --spec-strings for compat with v1.0
* add --fix also, and avoid infinite recursion and too large files
* tests: check identify and check edit files
* samurai: new package
- Add samurai: an HPC library for mesh and physics
* Update var/spack/repos/builtin/packages/samurai/package.py
Co-authored-by: Alec Scott <hi@alecbcs.com>
* Update var/spack/repos/builtin/packages/samurai/package.py
Co-authored-by: Alec Scott <hi@alecbcs.com>
* Update var/spack/repos/builtin/packages/samurai/package.py
Co-authored-by: Alec Scott <hi@alecbcs.com>
* Update var/spack/repos/builtin/packages/samurai/package.py
Co-authored-by: Alec Scott <hi@alecbcs.com>
* Update var/spack/repos/builtin/packages/samurai/package.py
Co-authored-by: Alec Scott <hi@alecbcs.com>
* Remove Whitespace
- Remove whitespace for spack style check
* Update var/spack/repos/builtin/packages/samurai/package.py
Co-authored-by: Alec Scott <hi@alecbcs.com>
* Add tags
- Add tags for the latest versions of samurai
- All tags are tested and work properly
- Add maintainers ("gouarin" - the samurai project lead and "sbstndb" - me, working on samurai)
- Add license
---------
Co-authored-by: Alec Scott <hi@alecbcs.com>
* Trilinos launch blocking + maintainers
CUDA launch blocking is not needed and slows modern apps down.
Add more maintainers to spot issues like this.
---------
Co-authored-by: psakievich <psakievich@users.noreply.github.com>
Elpa's custom preprocessor creates temporary files for which it
assembles long filenames and then uses the last 250 characters. This
results in compilation errors when the first character happens to be a
dash.
* Slurm: extend spack external find support
On Debian, `srun --version` and `salloc --version` report 'slurm-wlm VERSION'. Check for both strings and return the first match (a regex sketch follows this list).
* non-capturing group for slurm determine_version
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
* slurm: add detection test
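A regex sketch of the idea, with an invented stand-in for the package's version detection; the exact pattern in the slurm recipe may differ:
```python
import re

# Matches both "slurm VERSION" and "slurm-wlm VERSION" output, using a
# non-capturing group for the optional "-wlm" suffix.
_VERSION_RE = re.compile(r"slurm(?:-wlm)?\s+([\d.]+)")


def determine_version(output: str):
    match = _VERSION_RE.search(output)
    return match.group(1) if match else None


assert determine_version("slurm 23.11.4") == "23.11.4"
assert determine_version("slurm-wlm 22.05.8") == "22.05.8"
```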
---------
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
This commit removes the +fortran variant when building HDF5 for WRF.
The variant seems unnecessary, and it prevents building WRF with some versions of
Intel MPI, as HDF5 doesn't appear to build with both Fortran support and
Intel MPI.
* add new versions up to 1.5 and new variants
variant vtk: make vtk optional
variant shared: build shared libs
added patch to fix parmmg cmake so that it can be used by other software with find_package
* use +private for mmg@5.8: and parmmg@1.5:
* fix style and constrain mmg version
* add a condition on patch, use private_headers from mmg PR feelpp/spack#14
* add MET v12.0.0 and METplus v6.0.0
* Set correct dependencies for metplus@6 in var/spack/repos/builtin/packages/metplus/package.py
* Add missing dependency on proj for met@12
* Add met@12.0.1
* Change @6.0.0 to @6: for requirements in var/spack/repos/builtin/packages/metplus/package.py
* Address reviewer comments for met and metplus
---------
Co-authored-by: Rick Grubin <Richard.Grubin@noaa.gov>
This should help resolve the "No binary found when cache-only was specified"
errors we've recently seen in our GitLab CI pipelines.
example failing job here:
https://gitlab.spack.io/spack/spack/-/jobs/15570931#L370
This error is caused when a generate job finds a spec in the local root
binary mirror, and that spec does not yet exist in the stack-specific mirror.
The fix here is to instead locally cache the stack-specific mirrors and reserve
the root-level mirror for public use.
Windows paths with drives were being interpreted as network protocols
in canonicalize_path (which was expanded to handle more general URLs
in #48784).
This fixes that and adds some tests for it.
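A sketch of the distinction being drawn (an illustration of the heuristic, not the actual `canonicalize_path` code):
```python
import re

# A single drive letter followed by ":" is a Windows drive, not a URL scheme.
# Real URL schemes ("file", "https", ...) are longer than one character.
_DRIVE_RE = re.compile(r"^[A-Za-z]:[\\/]")


def looks_like_url(path: str) -> bool:
    if _DRIVE_RE.match(path):
        return False
    return re.match(r"^[A-Za-z][A-Za-z0-9+.-]*://", path) is not None


assert not looks_like_url(r"C:\Users\me\spack")
assert looks_like_url("https://example.com/config.yaml")
```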
In Spack v1.0 we plan to parse caret ^ and percent % the same. Their meanings are transitive and direct dependency, respectively. This means that variants, versions, arch, platform, os, target, and DAG hash should go before the %, so that they apply to the dependent, not to the %dependency.
When requiring a constraint on a virtual package, it makes little
sense to use anonymous specs, and our documentation shows no example
of requirements on virtual packages starting with `^`.
Right now, due to how `^` is implemented in the solver, writing:
```yaml
mpi:
  require: "^openmpi"
```
is equivalent to the more correct form:
```yaml
mpi:
  require: "openmpi"
```
but the situation will change when `%` shifts its meaning to be a
direct dependency.
To avoid later errors that are both unclear and quite slow to surface to the user,
this commit makes anonymous specs under virtual requirements an error,
and shows a clear error message pointing to the file and line where the
spec needs to be changed.
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Without it, the build fails with errors like this:
```
Can't locate File/Compare.pm in @INC (you may need to install the File::Compare module) (@INC contains: ...) at ../../../src/backend/catalog/Catalog.pm line 19.
```
Having all variants conditional leaves a lot more degrees of freedom to clingo,
and slows down the search.
If variants have inconsistent defaults, we might end up with multiple, equally
sub-optimal solutions. Sometimes this creates a "plateau" in the search space.
Remove conditional boolean variants that can't be activated, since this just increases
the complexity of the model.
If 4 variants have to be all active or inactive together, it's better to use a single `requires`
than to explode it into multiple statements dealing with a single variant at a time.
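As an illustration, a toy recipe (not one of the packages touched here) with such a grouped `requires` might look like:
```python
from spack.package import *


class Toy(Package):
    """Illustrative only: four variants that must be toggled together."""

    homepage = "https://example.com"
    has_code = False

    variant("a", default=False, description="feature a")
    variant("b", default=False, description="feature b")
    variant("c", default=False, description="feature c")
    variant("d", default=False, description="feature d")

    # One grouped constraint instead of four per-variant conditionals.
    requires(
        "+a+b+c+d",
        "~a~b~c~d",
        policy="one_of",
        msg="a, b, c and d must be enabled or disabled together",
    )
```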
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
ICU4C's NMAKE seems to over-quote to the degree
that it passes paths like ""<path>"", which
confuses the Python command line in the subprocesses
the build starts.
If you used `spack config change` to modify a `require:` section that did
not exist before, Spack was inserting the merged configuration into the
highest modification scope (which, for example, would clutter the
environment's `spack.yaml` with a bunch of configuration details
from the defaults).
Supersedes #46792.
Closes #40018.
Closes #31026.
Closes #2700.
There were a number of feature requests for os-specific config. This enables os-specific
config without adding a lot of special sub-scopes.
Support `include:` as an independent configuration schema, allowing users to include
configuration scopes from files or directories. Includes can be:
* conditional (similar to definitions in environments), and/or
* optional (i.e., the include will be skipped if it does not exist).
Includes can be paths or URLs (`ftp`, `https`, `http` or `file`). Paths can be absolute or
relative. Environments can include configuration files using the same schema. Remote includes
must be checked by `sha256`.
Includes can also be recursive, and this modifies the config system accordingly so that
we push included configuration scopes on the stack *before* their including scopes, and
we remove configuration scopes from the stack when their including scopes are removed.
For example, you could have an `include.yaml` file (e.g., under `$HOME/.spack`) to specify
global includes:
```
include:
- ./enable_debug.yaml
- path: https://github.com/spack/spack-configs/blob/main/NREL/configs/mac/config.yaml
  sha256: 37f982915b03de18cc4e722c42c5267bf04e46b6a6d6e0ef3a67871fcb1d258b
```
Or an environment `spack.yaml`:
```
spack:
  include:
  - path: "/path/to/a/config-dir-or-file"
    when: os == "ventura"
  - ./path/relative/to/containing/file/that/is/required
  - path: "/path/with/spack/variables/$os/$target"
    optional: true
  - path: https://raw.githubusercontent.com/spack/spack-configs/refs/heads/main/path/to/required/raw/config.yaml
    sha256: 26e871804a92cd07bb3d611b31b4156ae93d35b6a6d6e0ef3a67871fcb1d258b
```
Updated TODO:
- [x] Get existing unit tests to pass with Todd's changes
- [x] Resolve new (or old) circular imports
- [x] Ensure remote includes (global) work
- [x] Ensure remote includes for environments work (note: caches remote
files under user cache root)
- [x] add sha256 field to include paths, validate, and require for remote includes
- [x] add sha256 remote file unit tests
- [x] revisit how diamond includes should work
- [x] support recursive includes
- [x] add recursive include unit tests
- [x] update docs and unit test to indicate ordering of recursive includes with
conflicting options is deferred to follow-on work
---------
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
Co-authored-by: Peter Scheibel <scheibel1@llnl.gov>
Co-authored-by: Todd Gamblin <tgamblin@llnl.gov>
The SPACK_USER_CACHE_PATH was being overwritten in the Windows CI
before_script. This should set the path for all systems unless
explicitly overridden.
* gasnet: deprecate old versions
GASNet versions more than 2 years old are not supported.
Update description text.
* gasnet: add 2025.2.0-snapshot version
Defines `spack.package_api_version` and `spack.min_package_api_version`
as tuples (major, minor).
These are, respectively, the current Package API version implemented by this version
of Spack and the minimal Package API version it is backwards compatible with.
Repositories can optionally define:
```yaml
repo:
  namespace: my_repo
  api: v1.2
```
which indicates they are compatible with versions of Spack that implement
Package API `>= 1.2` and `< 2.0`. When the `api` key is omitted, the default
`v1.0` is assumed.
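A sketch of the compatibility rule on plain tuples; the actual check in `spack.repo` may be organized differently:
```python
def repo_is_compatible(repo_api, current, minimum):
    """True if a repo declaring ``api: vX.Y`` can be read by this Spack.

    ``current`` and ``minimum`` stand in for spack.package_api_version and
    spack.min_package_api_version.
    """
    return minimum <= repo_api <= current


# A Spack implementing Package API 1.2, backwards compatible down to 1.0:
assert repo_is_compatible((1, 2), current=(1, 2), minimum=(1, 0))
assert repo_is_compatible((1, 0), current=(1, 2), minimum=(1, 0))      # "api" omitted -> v1.0
assert not repo_is_compatible((1, 3), current=(1, 2), minimum=(1, 0))  # repo is too new
```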
This package has not been maintained since 2016.
We maintain an active fork in the hydrogen
package, so remove this one.
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
* py-networkx: add new versions up to 3.4.2
* py-networkx: add more requirements
* py-networkx: fix typo
* py-networkx: fix python and py-setuptools dependencies
---------
Co-authored-by: Joseph C Wang <joequant@gmail.com>
* Update rpy2 to newest version and clean up package
* Add me as maintainer
* Update depends section as per review. Add ipython variant. Fix some ranges and add support for python 3.9. Deprecated outdated versions
* refine depends_on and remove redundant version info
* style
* Adding the ability for repo paths from a manifest file to be expanded when creating an environment.
A unit test was added to check that an environment variable will be expanded.
Also, a bug was fixed in the expansion of develop paths where, if an environment variable
in the path expanded to an absolute path, the path would not be extended.
* Fixing new unit test for env repo var substitution
* Messed up resolving last rebase
* qwt: support building against Qt6
* qwt: fix style
* qwt: depends_on qt-base+opengl+widgets when +opengl
* visit: patch for missing cmath include
---------
Co-authored-by: Bernhard Kaindl <contact@bernhard.kaindl.dev>
* pull in new changes from axom project
* add new versions
* convert more conditionals to spec.satisfies
-------------
Co-authored-by: white238 <white238@users.noreply.github.com>
* rct: update packages (RE, RG, RP, RS, RU) with new version 1.90
* radical: added `url_for_version` for older versions
* radical: set latest versions for `radical.pilot` and `radical.utils`
* radical: fixed `url_for_version` setup
* radical: set the latest version for `radical.entk`
* radical: fixed style for `url_for_version`
* Apply suggestions from code review (python version dependency)
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
---------
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
On Windows, libraries search their directory for dependencies, and
we help libraries in Spack-built packages locate their dependencies
by symlinking them into the dependent's directory (we refer to this
as simulated RPATHing).
We extend the convenience functionality here to support base library
directories outside of the package prefix: this is primarily for
running tests in the build directory (which is not located inside
of the final install prefix chosen by Spack).
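A rough sketch of the symlink-based approach (the function name and the `.dll` filter are illustrative; the real helper handles more cases, such as copies and pre-existing links):
```python
import os


def simulate_rpaths(target_dir: str, dependency_lib_dirs) -> None:
    """Symlink dependency libraries next to the libraries in ``target_dir``.

    Windows resolves a library's dependencies from its own directory, so
    linking them in stands in for RPATHs.
    """
    os.makedirs(target_dir, exist_ok=True)
    for lib_dir in dependency_lib_dirs:
        for name in os.listdir(lib_dir):
            if not name.lower().endswith(".dll"):
                continue
            link = os.path.join(target_dir, name)
            if not os.path.exists(link):
                os.symlink(os.path.join(lib_dir, name), link)
```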
This release contains modifications to most of the SEACAS applications to support ChangeSets to some degree.
See https://github.com/SandiaLabs/seacas/wiki/Dynamic_Topology for information about Change Sets, and
see https://github.com/SandiaLabs/seacas/wiki/Supporting-Change-Sets for information about how the various SEACAS applications support the use or creation of change sets.
The release also includes various other small changes, including formatting, portability, installation, TPL version updates, and spelling.
* add sendme package
* style fix
* add docstring for test function
* changed maintainer string, run test after install
* removed redundant test
* Follow the common package license header format
Co-authored-by: Alec Scott <hi@alecbcs.com>
---------
Co-authored-by: Alec Scott <hi@alecbcs.com>
* Split requirements to get better error messages in case of unsat solves.
* use list requirements instead of string
* activate static_analysis in a few pipelines
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
* ci: darwin stacks: update tags following system updates
* disable SPACK_CI_DISABLE_STACKS; only enable *darwin* stacks for testing
* manually chmod u+w tmp/ before cleanup due to issue #49147
* comment out failing specs for now
* re-enable logic for disabling stacks
* add explanatory comment for darwin after_script additions
* remove more darwin-only targetting
* restore build_stage to default location
* move build-job-remove out of individual darwin stacks into darwin top level config
* keep build_stage in $spack/tmp for now
Python was removed from being a build tool in #46980, due to issues
when reusing specs. This PR adds a new rule to match the interpreter
among different Python packages, in clingo.
It also adds a bunch of new "build-tools", so that specs like:
```
py-matplotlib backend=tkagg
```
can be concretized in one go.
Modifications:
- [x] Make `py-matplotlib backend=tkagg` concretizable
- [x] Add unit-tests to ensure situations like in #46980 do not happen
---------
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Currently, the custom config scopes are pushed at the top when constructing
configuration, and are demoted whenever a context manager activating an
environment is used - see #48414 for details. Workflows that rely on the order
in the [docs](https://spack.readthedocs.io/en/latest/configuration.html#custom-scopes)
are thus fragile, and may break.
This PR allows assigning priorities to scopes, and ensures that scopes of lower priority
are always "below" scopes of higher priority. When scopes have the same priority,
what matters is the insertion order.
Modifications:
- [x] Add a mapping that iterates over keys according to priorities set when
adding the key/value pair (a toy version is sketched after this list)
- [x] Use that mapping to allow assigning priorities to configuration scopes
- [x] Assign different priorities for different kind of scopes, to fix a bug, and
add a regression test
- [x] Simplify `Configuration` constructor
- [x] Remove `Configuration.pop_scope`
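A toy version of such a priority-ordered mapping (the real class in Spack has a richer API):
```python
import itertools


class PriorityOrderedMapping:
    """Mapping that iterates keys by ascending priority, then insertion order."""

    def __init__(self):
        self._counter = itertools.count()
        self._entries = {}  # key -> (priority, insertion_index, value)

    def add(self, key, value, priority=0):
        self._entries[key] = (priority, next(self._counter), value)

    def items(self):
        for key in sorted(self._entries, key=lambda k: self._entries[k][:2]):
            yield key, self._entries[key][2]


scopes = PriorityOrderedMapping()
scopes.add("command_line", "-c overrides", priority=100)
scopes.add("env", "spack.yaml scope", priority=50)
scopes.add("defaults", "etc/spack/defaults", priority=0)
print([name for name, _ in scopes.items()])  # ['defaults', 'env', 'command_line']
```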
---------
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
All the build jobs in pipelines are apparently relying on the bug that was fixed.
The issue was not caught in the PR because generation jobs were fine, and
there was nothing to rebuild.
Reverting to fix pipelines in a new PR.
This reverts commit 3ad99d75f9.
VariantMap.concrete is unused, and would be incorrect if it were used
due to conditional variants.
Just let the Spec dictate what is concrete and what is not.
Currently, environments can end up with higher priority than `-C` custom
config scopes and `-c` command line arguments sometimes. This shouldn't
happen -- those explicit CLI scopes should override active environments.
Up to now, configuration behaved like a stack, where scopes could only be
pushed at the top. This PR allows assigning priorities to scopes, and ensures
that scopes of lower priorities are always "below" scopes of higher priorities.
When scopes have the same priority, what matters is the insertion order.
Modifications:
- [x] Add a mapping that iterates over keys according to priorities set when
adding the key/value pair
- [x] Use that mapping to allow assigning priorities to configuration scopes
- [x] Assign different priorities for different kind of scopes, to fix a bug, and
add a regression test
- [x] Simplify `Configuration` constructor
- [x] Remove `Configuration.pop_scope`
- [x] Remove `unify:false` from custom `-C` scope in pipelines
On the last modification: on `develop`, pipelines are relying on the environment
being able to override `-C` scopes, which is a bug. After this fix, we need to be
explicit about the unification strategy in each stack, and remove the blanket
`unify:false` from the highest priority scope.
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
* Update survey package file with latest releases and python path settings for building with autoload none.
* Submitting reformatted file.
* update survey package file with libmonitor dependency changes, take out py-gpustat, and minor comment change.
* Trigger build.
Currently, we have `config:shared_linking:missing_library_policy` to error
or warn when shared libraries cannot be resolved upon install.
The new `spack verify libraries` command allows users to run this post
install hook at any point in time to check whether their current
installations can resolve shared libs in rpaths.
* remove preferred to allow seamless python@3.12 usage
* glib: remove deprecated versions
* glib: use extends because python-venv is pulled in from build deps and put into path
* don't patch patch versions, use new patch releases containing the fix instead
* restrict patch of shebangs, group relevant bits together
* simplify lowerbound
* fix pinned glib version
---------
Co-authored-by: Chris Marsh <chrismarsh.c2@gmail.com>
* Bug fix for compiling node-js@21: with gcc@11.2 (var/spack/repos/builtin/packages/node-js/package.py var/spack/repos/builtin/packages/node-js/wasm-compiler-gcc11p2.patch)
Since this bug fix is not sufficient, add a conflict for node-js@21: with gcc@11.2
* In var/spack/repos/builtin/packages/node-js/package.py, restrict patch wasm-compiler-gcc11p2.patch to versions 21:22 for gcc@11.2
* Revert "REVERTME: move celeritas changes to another branch"
This reverts commit a063e43aaf.
* Use predicted g4vg version
* Use
* fixup! Use predicted g4vg version
* Use spec for versions and improve dependency specification