Currently we inject runtimes when a package has a direct build dependency on a
compiler, but what matters is whether the package depends on a language.
That way we can avoid recursively injecting runtimes into runtimes without
needing an extra rule in the solver: runtimes don't depend on languages, they
just have a build dependency on the same compiler.
The relevant unit tests can be run with:
`spack unit-test lib/spack/spack/test/concretization`
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
* TestSuite: add type hints
* spack test run: add a --timeout argument
* pipelines: allow 2 minutes to run tests
* Fix docstrings; increase the maximum pipeline time for tests to 5 minutes
* Use SIGTERM first, then SIGKILL shortly after (see the sketch below)
* Add unit-tests for "start_build_process"
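A minimal sketch of the graceful-then-forceful termination named in the bullet above; it is illustrative rather than Spack's actual `start_build_process` code, and the grace period is an assumed value:

```python
import time
from multiprocessing import Process


def terminate_gracefully(proc: Process, grace_period: float = 2.0) -> None:
    """Send SIGTERM first; escalate to SIGKILL if the child does not exit.

    ``grace_period`` (seconds) is an illustrative value, not Spack's default.
    """
    if not proc.is_alive():
        return
    proc.terminate()         # SIGTERM: let the child clean up
    proc.join(grace_period)  # wait a short while
    if proc.is_alive():
        proc.kill()          # SIGKILL: the child ignored SIGTERM
        proc.join()


if __name__ == "__main__":
    p = Process(target=time.sleep, args=(3600,))
    p.start()
    terminate_gracefully(p)
```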
---------
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Multiple build systems have been part of Spack for a long
time now, and they are rarely the cause of a missing attribute.
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Similar to the range-or-specific-version ambiguity of `@1.2` in the past,
which was solved with `@1.2` vs `@=1.2`, we still have the ambiguity of
`name=a,b,c` in multi-valued variants: does it mean "at least a,b,c" or
"exactly a,b,c"?
This issue comes up, for example, with `gcc languages=c,cxx`: there is no
way to exclude `fortran`.
The ambiguity is resolved with the `:=` syntax, which distinguishes concrete
from abstract variants.
The following strings parse as **concrete** variants:
* `name:=a,b,c` => values exactly {a, b, c}
* `name:=a` => values exactly {a}
* `+name` => values exactly {True}
* `~name` => values exactly {False}
The following strings parse as **abstract** variants:
* `name=a,b,c` => values at least {a, b, c}
* `name=*` => special case for testing the existence of a variant; values are
at least the empty set {}
As a reminder:
* `satisfies(lhs, rhs)` means `concretizations(lhs)` ⊆ `concretizations(rhs)`
* `intersects(lhs, rhs)` means `concretizations(lhs)` ∩ `concretizations(rhs)` ≠ ∅
where `concretizations(...)` is the set of sets of variant values in this case.
The satisfies semantics are:
* rhs abstract: rhs values are a subset of lhs values (whether lhs is abstract or concrete)
* lhs concrete, rhs concrete: set equality
* lhs abstract, rhs concrete: false
and intersects should mean (both are sketched below):
* lhs and rhs abstract: true (the union is a valid concretization under both)
* exactly one of lhs and rhs abstract: true iff the abstract variant's values are a subset of the concrete one's
* lhs concrete, rhs concrete: set equality
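A minimal, self-contained sketch of these set semantics; the `Variant` container and function names below are illustrative, not Spack's implementation:

```python
from dataclasses import dataclass
from typing import FrozenSet


@dataclass(frozen=True)
class Variant:
    """A multi-valued variant: a set of values plus a concreteness flag."""
    values: FrozenSet[str]
    concrete: bool  # True for `name:=a,b,c`, False for `name=a,b,c`


def satisfies(lhs: Variant, rhs: Variant) -> bool:
    if not rhs.concrete:
        return rhs.values <= lhs.values  # rhs abstract: subset of lhs values
    if lhs.concrete:
        return lhs.values == rhs.values  # both concrete: set equality
    return False                         # lhs abstract, rhs concrete: never


def intersects(lhs: Variant, rhs: Variant) -> bool:
    if not lhs.concrete and not rhs.concrete:
        return True                      # the union concretizes both
    if lhs.concrete and rhs.concrete:
        return lhs.values == rhs.values
    abstract, concrete = (lhs, rhs) if not lhs.concrete else (rhs, lhs)
    return abstract.values <= concrete.values


# `gcc languages:=c,cxx,fortran` satisfies the abstract `gcc languages=c,cxx`,
# but the abstract spec never satisfies the concrete one.
assert satisfies(Variant(frozenset({"c", "cxx", "fortran"}), True),
                 Variant(frozenset({"c", "cxx"}), False))
assert not satisfies(Variant(frozenset({"c", "cxx"}), False),
                     Variant(frozenset({"c", "cxx", "fortran"}), True))
```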
Concrete specs with single-valued variants are printed `+foo`, `~foo` and `foo=bar`;
only multi-valued variants are printed with `foo:=bar,baz` to reduce the visual noise.
* Return a single scope from environment.env_config_scope
* Return a single scope from environment_path_scope
---------
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
This commit reorders ASP setup, so that rules from
possible compilers are collected first.
This lets us know which dependencies may be injected
before we count the possible dependencies,
so we can account for them too.
Proceeding this way makes it easier to inject
complex runtimes, like hip.
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Deal with the "issue" that passing a str instance does not cause a type check
failure, because a str is itself a valid Sequence[str] and Iterable[str].
Fix it by special-casing str instances (see the sketch below).
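A minimal sketch of the special-casing described above; the helper name is illustrative, not an actual Spack function:

```python
from typing import Iterable, Tuple, Union


def ensure_str_tuple(value: Union[str, Iterable[str]]) -> Tuple[str, ...]:
    """Normalize an argument that should be a collection of strings.

    A bare ``str`` type-checks as ``Sequence[str]``/``Iterable[str]``, but
    iterating over it yields single characters, which is almost never what
    the caller meant.
    """
    if isinstance(value, str):
        return (value,)
    return tuple(value)


assert ensure_str_tuple("cxx") == ("cxx",)            # not ('c', 'x', 'x')
assert ensure_str_tuple(["c", "cxx"]) == ("c", "cxx")
```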
* Check for LSF, Flux, and Slurm when determining the MPI exec (see the sketch after this list)
* Make scheduler/MPI exec helper functions methods of CachedCMakeBuilder
* Remove the axom workaround for running MPI on machines with Flux
* Clearly split old and new hip settings requirements
* Apply generic rocm handling to every project
* Make the default logic for hip support more robust
* GPU_TARGET is only necessary under certain project-specific conditions; it should not be necessary in general
* Update logic to find amdclang++
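A hedged sketch of the scheduler check mentioned in the first bullet; the launcher names and precedence are assumptions for illustration, not necessarily what CachedCMakeBuilder does:

```python
import shutil


def mpi_exec_for_scheduler() -> str:
    """Pick an MPI launcher based on which scheduler tools are on PATH.

    The precedence and launcher names below are illustrative assumptions.
    """
    if shutil.which("flux"):
        return "flux run"  # Flux
    if shutil.which("jsrun"):
        return "jsrun"     # LSF
    if shutil.which("srun"):
        return "srun"      # Slurm
    return "mpirun"        # fallback when no scheduler is detected
```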
fixes #49717
If no compiler is listed in the 'packages' section of
the configuration, Spack will currently try to:
1. Look for a legacy compilers.yaml to convert
2. Look for compilers in PATH
in that order. If an entry in compilers.yaml is
corrupted, that should not result in an obscure
error. Instead, it should just be skipped, as
sketched below.
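A minimal sketch of the skip-on-error behavior; `_convert_legacy_entry`, the field names, and the warning mechanism are hypothetical stand-ins, not Spack's actual conversion code:

```python
import warnings
from typing import Dict, List


def convert_legacy_compilers(entries: List[Dict]) -> List[Dict]:
    """Convert legacy compilers.yaml entries, skipping corrupted ones."""
    converted = []
    for entry in entries:
        try:
            converted.append(_convert_legacy_entry(entry))
        except (KeyError, TypeError, ValueError) as e:
            # A corrupted entry should not abort detection with an obscure error
            warnings.warn(f"skipping malformed compilers.yaml entry: {e}")
    return converted


def _convert_legacy_entry(entry: Dict) -> Dict:
    # Hypothetical conversion: require the fields a legacy entry would need
    compiler = entry["compiler"]
    return {"spec": compiler["spec"], "paths": compiler["paths"]}
```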
Signed-off-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
This module references `spack.error.Something` in the same file, which happens to
work but is incorrect. Use `spack.error` to prevent that in the future.
I noticed that `abseil-cpp` was showing in `spack find` with "no compiler", and the only
difference between it and other nodes was that it *only* depends on `cxx` -- others
depend on `c` as well.
It turns out that the `select()` method on `EdgeMap` only takes `Sequence[str]` and doesn't
check whether the argument is actually just a single `str`. So asking for, e.g., `cxx` is like
asking for `c`, `x`, or `x`, because the `str` is treated as a sequence of its characters (a
small reproduction follows). This causes Spack to miss the `cxx` and `fortran` language
virtuals in `DeprecatedCompilerSpec`.
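A small, self-contained reproduction of the pitfall; the `select`-like helper below is illustrative, not `EdgeMap.select` itself:

```python
from typing import List, Sequence


def select(edge_virtuals: List[str], virtuals: Sequence[str]) -> List[str]:
    """Keep the virtuals on an edge that the caller asked for.

    Buggy when ``virtuals`` is a bare str: iterating it yields characters.
    """
    wanted = set(virtuals)  # "cxx" becomes {'c', 'x'} instead of {'cxx'}
    return [v for v in edge_virtuals if v in wanted]


print(select(["cxx", "fortran"], "cxx"))    # [] -- the bug: nothing matches
print(select(["cxx", "fortran"], ["cxx"]))  # ['cxx'] -- what the caller meant
```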
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
Currently, externals show up in `spack find` and `spack spec` install status as a green
`[e]`, which is hard to distinguish from the green `[+]` used for installed packages.
- [x] Make externals magenta instead, so they stand out.
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>
Concretization setup was checking whether any input spec has a dependency
that's *not* in the set of possible dependencies for all roots in the solve.
There are two reasons to check this:
1. The user could be asking for a dependency that none of the roots has, or
2. The user could be asking for a dependency that doesn't exist.
For abstract roots, (2) implies (1), and the check makes sense. For concrete
roots, we don't care, because the spec has already been built. If a `package.py`
no longer depends on something it did before, it doesn't matter -- it's already
built. If the dependency no longer exists, we also do not care -- we already
built it and there's an installation for it somewhere.
When you concretize an environment with a lockfile, *many* of the input specs
are concrete, and we don't need to build them. If a package changes its
dependencies, or if a `package.py` is removed for a concrete input spec, that
shouldn't cause an already-built environment to fail concretization.
A user reported that this was happening with an error like:
```console
spack concretize
==> Error: Package chapel does not depend on py-protobuf@5.28.2/a4rf4glr2tntfwsz6myzwmlk5iu25t74
```
Or, with traceback:
```console
File "/apps/other/spack-devel/lib/spack/spack/solver/asp.py", line 3014, in setup
raise spack.spec.InvalidDependencyError(spec.name, missing_deps)
spack.spec.InvalidDependencyError: Package chapel does not depend on py-protobuf@5.28.2/a4rf4glr2tntfwsz6myzwmlk5iu25t74
```
Fix this by skipping the check for concrete input specs. We already ignore conflicts,
etc. for concrete/external specs, and we do not need metadata in the solve for
concrete dependencies because they're imposed by hash constraints.
- [x] Ignore the package existence check for concrete input specs (sketched below).
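A hedged, self-contained sketch of the guard; the function and error class below are stand-ins mirroring the logic, not the actual code in `asp.py`:

```python
from typing import Iterable, List, Set


class InvalidDependencyError(Exception):
    """Stand-in for spack.spec.InvalidDependencyError."""


def check_input_spec_deps(
    spec_name: str, concrete: bool, dep_names: Iterable[str], possible: Set[str]
) -> None:
    """Check that an input spec only asks for possible dependencies.

    Concrete input specs are skipped: they are already built, and their
    dependencies are imposed by hash constraints, so a package.py that dropped
    or renamed a dependency must not fail concretization.
    """
    if concrete:
        return

    missing: List[str] = [d for d in dep_names if d not in possible]
    if missing:
        raise InvalidDependencyError(
            f"Package {spec_name} does not depend on {', '.join(missing)}"
        )
```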
Signed-off-by: Todd Gamblin <tgamblin@llnl.gov>